
Fun example, thanks for pointing it out!

I actually took Professor Grötschel's Linear Optimization course at TU Berlin, and the practical optimization task/competition we did for that course very much illustrates the point made in the answer to the Stack Exchange question you posted.

Our team won the competition, beating not just the other student teams' programs but also the professor's assistants' program, by around an order of magnitude. How? By changing a single "<" (less than) to "<=" (less than or equal), which dramatically reduced the run-time on the dominant problem in the problem set.

This miffed the professor quite a bit, because we were just a bunch of dumb CS majors taking a class in the far superior math department, but he was a good sport about it and we got a nice little prize in addition to our grade.

It also helped that our program was still the fastest without that one change, though now only by a tiny margin.
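I don't have the original competition code, so this is a purely hypothetical sketch of the *kind* of effect a one-character "<" vs "<=" change can have in an optimization routine. Here the change sits in the prune test of a toy branch-and-bound knapsack solver (all names and data are invented): pruning only strict losers (`bound < best`) forces the search to explore every subtree that merely ties the incumbent, while pruning ties as well (`bound <= best`) cuts them all off. With many equal-valued items, common in real problem sets, the difference in nodes explored is dramatic even though both variants find the same optimum.

```python
def knapsack_nodes(values, weights, capacity, prune_on_tie):
    """Solve 0/1 knapsack by branch-and-bound; return (best value, nodes explored)."""
    n = len(values)
    best = 0
    nodes = 0

    def upper_bound(i, value, room):
        # Fractional (LP-relaxation) bound over the remaining items,
        # assuming they are already sorted by value density.
        for j in range(i, n):
            if weights[j] <= room:
                room -= weights[j]
                value += values[j]
            else:
                return value + values[j] * room / weights[j]
        return value

    def visit(i, value, room):
        nonlocal best, nodes
        nodes += 1
        best = max(best, value)
        if i == n:
            return
        bound = upper_bound(i, value, room)
        # The one-character change: prune ties (<=) or only strict losers (<).
        if (bound <= best) if prune_on_tie else (bound < best):
            return
        if weights[i] <= room:
            visit(i + 1, value + values[i], room - weights[i])  # take item i
        visit(i + 1, value, room)                               # skip item i

    visit(0, 0, capacity)
    return best, nodes

# 16 identical items create a huge number of tied subtrees.
vals = [5] * 16
wts = [3] * 16
best_le, nodes_le = knapsack_nodes(vals, wts, 24, prune_on_tie=True)
best_lt, nodes_lt = knapsack_nodes(vals, wts, 24, prune_on_tie=False)
print(best_le == best_lt)   # True: same optimum either way
print(nodes_le, nodes_lt)   # far fewer nodes with the tie-pruning (<=) test
```

Whether this resembles what happened in our case I can only guess at this distance, but it shows how a single comparison operator in the right (or wrong) place can dominate run-time without changing the answer.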

The point being that, as the post notes, this is a single problem in a single, very specialized discipline, and this example absolutely does not generalize.



I think that your original comment is the one that risks over-generalization. The "software gets slower" perception belongs to a very narrow technical niche: user interface/user experience for consumer applications. And it has a common cause: product managers and developers stuff as much into the user experience as they can until performance suffers. But even within that over-stuffing phenomenon you can see individual components that follow the same software improvement curve.

In any area where computing performance has been critical (optimization, data management, simulation, physical modeling, statistical analysis, weather forecasting, genomics, computational medicine, imagery, etc.), there have been many, many cases of software outpacing hardware in rate of improvement. Enough so that it is normal to expect it, and something to investigate for cause when it's not seen.


First, I think your estimate of which one is the niche, High Performance Computing or all of personal computing (including PCs, laptops, smartphones, and the web), is not quite correct.

The entire IT market is ~$3.5 trillion; HPC is ~$35 billion. Now that's nothing to sneeze at, but it's just 1% of the total. I doubt that all the areas I mentioned also account for just 1%; and if they did, what would make up the other 98%?

Second, there are actually many factors that contribute to software bloat and slowdown; the one you mention is just one of them. And many other kinds of software are getting slower, including compilers.

Third, while I believe you that many HPC fields see some algorithmic performance improvements, I just don't buy your assertion that these regularly exceed the gains from the massive increases in hardware capacity, and one singular example just doesn't cut it.



