How Fast Do Algorithms Improve? [pdf] (ieee.org)
54 points by djoldman on Sept 23, 2021 | hide | past | favorite | 7 comments


Progress in computation can be decomposed into three factors:

Perf = (perf / op) * (op / $) * $

(1) Perf / op is essentially software progress, the subject of this article. (And for AI/ML, you can further decompose it into algorithm progress and training data progress.)

(2) Op / $ is essentially Moore's law, which has been slowing down and can be further decomposed into: Ops / $ = (ops / transistor) * (transistor / area) * (area / $)

(3) $ is willingness to spend money. This has probably been the biggest source of progress for AI projects in the past 10 years, with DeepMind and OpenAI and Google Brain and Silicon Valley ad tech spending billions to build models more complex and capable than any that have come before. http://www.rossgritz.com/trends-in-deepmind-operating-costs/...
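The three-factor decomposition above can be sketched in a few lines of Python. The numbers here are purely hypothetical, just to show how the factors multiply:

```python
# A minimal sketch of Perf = (perf / op) * (op / $) * $.
# All constants below are made up for illustration, not measured values.

def total_perf(perf_per_op: float, ops_per_dollar: float, dollars: float) -> float:
    """Total performance as the product of the three factors."""
    return perf_per_op * ops_per_dollar * dollars

# Suppose software progress doubles perf/op, hardware doubles op/$,
# and the budget also doubles: total performance grows 2 * 2 * 2 = 8x.
baseline = total_perf(1.0, 1.0, 1.0)
improved = total_perf(2.0, 2.0, 2.0)
assert improved / baseline == 8.0
```

The point of the multiplicative form is that a slowdown in any one factor (e.g. Moore's law) can be masked for a while by growth in another (e.g. spending).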


Another interesting thing to study would be how fast programmers manage to trade constant factors for development speed (and other things) to keep perceived performance roughly constant (or even deteriorating) despite algorithmic and hardware improvements.


Roughly O(log n) by the looks of it.


Slow, I can tell you that! Unless it's an open problem that no one is sure about yet. And you get lots of optimizations for a certain class/subtype of that problem.

A lot of the groundwork has already been laid down. Also, most of the time when an algorithm with a better time complexity is found, it only applies to really large n. Sometimes that n is so large it's basically not even useful for actual computation; look up the term "galactic algorithm".
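The "only applies to really large n" point is easy to see numerically. Here's a toy sketch (the constant factors 1 and 10,000 are invented for illustration) comparing a plain O(n^2) method against an asymptotically better O(n log n) one burdened with a huge constant:

```python
import math

# Toy cost models; the constants are hypothetical, chosen only to show
# how a big constant factor pushes the crossover point out to large n.

def cost_simple(n: int) -> float:
    """A plain O(n^2) algorithm with constant factor 1."""
    return n * n

def cost_fancy(n: int) -> float:
    """An O(n log n) algorithm with a large constant factor."""
    return 10_000 * n * math.log2(n)

def crossover() -> int:
    """Smallest power-of-two n at which the 'fancy' algorithm wins."""
    n = 2
    while cost_fancy(n) >= cost_simple(n):
        n *= 2
    return n

# The asymptotically better algorithm only pulls ahead past n ~ 260k here.
print(crossover())  # -> 262144
```

A true "galactic" algorithm is the extreme case: the crossover n is so astronomically large that the asymptotic win never matters on real inputs.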


From the study:

> We find enormous heterogeneity in algorithmic progress, with nearly half of algorithm families experiencing virtually no progress, while 14% experienced improvements orders of magnitude larger than hardware improvement (including Moore’s law). Overall, we find that algorithmic progress for the median algorithm family increased substantially but by less than Moore’s law for moderate-sized problems and by more than Moore’s law for big data problems.


I did look at the paper too, so I'm not sure why you needed to re-post an excerpt; that is basically what I said. Like they said, nearly half of the algorithm families they looked at are basically at a standstill.


I mean, of course you're not going to do much better at sorting, for example. But things like machine learning and optimization are advancing at a rapid pace, and I'm not sure I would characterize them as "open problems that no one is sure on yet." Unless you just want to apply that characterization to anything that's still improving, which would be a bit of a tautology.



