The Illusion of Hardware Slowing Down

"Does your hardware get twice as fast every two years? It does not!"

2022-10-06

Does your hardware get twice as fast every two years? It does not! I believe that Moore's Law, while historically and industrially important as a metric, is of little use to everyday consumers. Hardware gets faster in periodic bursts as it is upgraded or replaced. Software inefficiency, on the other hand, keeps growing at a consistent rate. May's Law, a variation of Wirth's Law, is therefore incorrect. Growing software inefficiency does not merely compensate for Moore's Law; in the real world, excluding a brief period right after a hardware upgrade, it overshadows it! This phenomenon creates an illusion that devices get slower with time when they do not. Well, maybe. More on this later.

The phenomenon Wirth's Law describes is a necessity! To explain the reasoning behind this claim, I came up with two possible reasons why software gets more inefficient by the day:

1. More lenient coding practices
2. Abstractions

To understand both points better, please visualize a single-axis spectrum in your head with "ease of development" on the left side and "optimized code" on the right side.

More lenient coding practices are trade-offs: they sacrifice optimization for ease of development. How far they land from the right side of the spectrum varies from case to case.
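To make the trade-off concrete, here is a minimal, hypothetical Python sketch of my own (not from any particular codebase): both functions find duplicate entries and return the same result, but the first is written the "easy" way with a list scan, while the second spends a little more effort to use a set.

```python
def find_duplicates_easy(items):
    # Straightforward to write, but "x in seen" scans the whole list
    # every time, so the overall work grows quadratically.
    seen, dupes = [], []
    for x in items:
        if x in seen:
            dupes.append(x)
        else:
            seen.append(x)
    return dupes


def find_duplicates_optimized(items):
    # Same result, but a set gives constant-time average lookups,
    # so the overall work grows only linearly.
    seen, dupes = set(), []
    for x in items:
        if x in seen:
            dupes.append(x)
        else:
            seen.add(x)
    return dupes


if __name__ == "__main__":
    data = ["a", "b", "a", "c", "b"]
    assert find_duplicates_easy(data) == find_duplicates_optimized(data) == ["a", "b"]
```

Neither version is "wrong"; the first is simply what falls out of your fingers when you are optimizing for development speed rather than runtime speed.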

Abstractions, on the other hand, sit at the far left edge of the spectrum. That is not to say they are useless! Abstractions are necessary evils that come into being for valid reasons, such as maintaining backwards compatibility, easing cross-platform development to the point that it is now possible in feasible time frames, reducing repeated code and eliminating divided codebases. In fact, I'm actually quite fond of abstractions! Any developer who has ever worked on a cross-platform project will sing abstractions' praises, and having worked on projects like that, I too think they're invaluable. Of course, in an ideal world, I'd be singing Progressive Web Apps' praises, but that ship has already sailed. One can't always get what they want; the world is built on compromises, and this is the one we ended up with. The fact remains, however, that abstractions, no matter how necessary or useful they may be, add onto the already complex software stack that is all but required for feasible development today, resulting in inefficiency.
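As a rough sketch of why the trade is usually worth making, here is a hypothetical example (the class and platform names are placeholders of my own): one shared interface hides per-platform details, so the application code is written once, at the price of an extra layer of indirection on every call.

```python
import sys
from abc import ABC, abstractmethod


class Notifier(ABC):
    """Platform-agnostic interface the rest of the application codes against."""

    @abstractmethod
    def notify(self, message: str) -> None: ...


class LinuxNotifier(Notifier):
    def notify(self, message: str) -> None:
        print(f"[linux] {message}")    # stand-in for a desktop notification call


class WindowsNotifier(Notifier):
    def notify(self, message: str) -> None:
        print(f"[windows] {message}")  # stand-in for a toast notification call


def make_notifier() -> Notifier:
    # The only platform-specific branch; everything built on top of it is shared code.
    return WindowsNotifier() if sys.platform.startswith("win") else LinuxNotifier()


if __name__ == "__main__":
    make_notifier().notify("Build finished")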

The inefficiency caused by these two factors is expected to be negated by Moore's Law.

My friends complain about my blog posts being walls of text, so I made the following graph as a visual aid to explain my thinking better:

The very simplified graph above assumes the following:

- Hardware performance improves only in bursts, at the moments a device is upgraded or replaced.
- Software inefficiency keeps growing at a consistent rate in between.
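Since the graph itself can't be reproduced here in text, a rough numerical sketch with entirely made-up numbers can stand in for it: hardware capability doubles only at upgrade time (every four years in this toy model), while the software's demands grow a steady 20% per year, so the perceived speed jumps at each upgrade and then erodes in between.

```python
# Toy model of the graph: all three constants are illustrative assumptions,
# not measurements.
UPGRADE_EVERY = 4   # years between hardware upgrades
HW_GAIN = 2.0       # each upgrade doubles hardware capability
SW_GROWTH = 1.20    # software demands grow 20% per year

hardware, software = 1.0, 1.0
for year in range(13):
    if year > 0 and year % UPGRADE_EVERY == 0:
        hardware *= HW_GAIN          # performance arrives in a burst
    perceived = hardware / software  # how fast the device feels to the user
    print(f"year {year:2d}  hardware {hardware:5.2f}  demands {software:6.2f}  perceived {perceived:4.2f}")
    software *= SW_GROWTH            # inefficiency keeps growing regardless
```

Running it prints a sawtooth: perceived performance spikes above 1.0 right after an upgrade and then sinks steadily until the next one, which is exactly the illusion of a device "slowing down".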

I have spent a month asking around and browsing to find a name for this phenomenon, but my efforts were unfortunately in vain. If there is a name for it that I am unaware of, feel free to contact me using the e-mail address in this site's footer. Until then, I am taking this opportunity to not-so-humbly call it "Sateallia's Law".

It goes something like this: hardware relies on periodic upgrades to catch up with the growing needs of ever more inefficient software, resulting in an unstable performance timeline.