Hacker News

It's interesting, when you think about it from that perspective.

On my system, maybe that 1% improvement doesn't mean very much.

But when you add up all the systems in the world running Linux, and consider how much electricity they use or how many personal experiences they mediate, it really adds up to something worthwhile.

The curious question is: at what point does it stop being worthwhile? 1% is maybe worthwhile. But 0.5%? 0.01%?



> at what point does it become not worthwhile

There is always someone who will want to do it if only to show they can. You only need to care about "worthwhile" if you're balancing it against other concerns.

Obviously people are generally going to be motivated to go after the bigger wins first. But Linux isn't run like a centralized project where developers are directed on what to prioritize.


If you've got 100 CPU cores at or near full utilization, a 1% improvement saves you a full core. Across 100 machines at full utilization, it's a whole machine.

The more cores or machines you have, the more those savings mean. The "is it worth it?" threshold comes down to whether the machines saved are worth the engineer's time spent on the optimization.
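The break-even reasoning above can be sketched numerically. All the figures here (engineer hours, hourly rate, per-core running cost) are hypothetical assumptions for illustration, not numbers from the thread:

```python
def cores_saved(total_cores: float, improvement: float) -> float:
    """Cores freed by a fractional performance improvement
    on a fleet running at full utilization."""
    return total_cores * improvement

def break_even_cores(engineer_hours: float, hourly_rate: float,
                     yearly_cost_per_core: float) -> float:
    """Core-years of savings needed to pay for the engineering time."""
    return (engineer_hours * hourly_rate) / yearly_cost_per_core

# A 1% win on 100 busy cores frees one core:
print(cores_saved(100, 0.01))          # 1.0

# Hypothetical: 80 engineer-hours at $150/h, a core costing $400/year.
# The optimization pays for itself within a year once it saves:
print(break_even_cores(80, 150, 400))  # 30.0 cores
```

At a 1% improvement, 30 saved cores corresponds to a 3,000-core fleet, which is why the same patch can be uneconomical for one machine yet obviously worthwhile across all of Linux.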


Presumably users will gladly accept any positive % as long as it works, so the question comes down to what are the motivations of those actually implementing (or green-lighting) the commits.


I would say that as long as issues have been identified that negatively affect performance, fixing whichever one has the biggest performance impact should always be considered worthwhile.



