In his 1995 article “A Plea for Lean Software”, Niklaus Wirth wrote:

  • Software expands to fill the available memory
  • Software is getting slower more rapidly than hardware becomes faster

Wirth explained that the two main causes of software bloat are the rapid growth of hardware performance and customers' inability to distinguish features that are essential from those that are merely nice to have. Twenty-five years on, Wirth's Law still holds true. At a time when a software product's power is measured by the number of its features, the rampant incorporation of every feature users might conceivably want muddles the software design and pushes quality and performance into the backseat.

The pressure to deliver a product to market with as many features as possible, and before any competitor, harms not only the customers' experience but also the engineers' morale and skills. A market polluted with slow, bloated software has gradually lowered our standards, and the engineering skills required to design and build compact, efficient software are neglected, even discouraged.

I’m sure many developers remember the times they had to forgo refactoring because it was deemed unnecessary at the time, especially compared to the latest feature. Or the times they were prevented from optimizing the code because “developer time is more expensive than computation time”. And how many times were developers asked, or even allowed, to do that refactoring or optimization after they had shipped the latest feature? Not often, if ever.

The problem with neglecting performance is that if a product is expected to have thousands or millions of users, every inefficiency is multiplied a thousand- or million-fold. If some process takes 1 second longer to execute than its optimized equivalent, then for a million-strong user base a million seconds of productive time has been wasted, which translates to over 11.5 days. Spending 11.5 days optimizing that process would almost certainly have been possible. If an application requires 32GB of memory and cannot work with 8GB (for example, Adobe Premiere), then each user in a million-strong user base will need to buy 24GB of additional memory, which translates to a total of around $101,250,000 in additional hardware investments at today’s price.
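The arithmetic above can be checked in a few lines. A minimal sketch, using the figures from the text (1,000,000 users, 1 wasted second per run, 24GB of extra memory per user, and a per-GB price implied by the $101,250,000 total):

```python
# Back-of-the-envelope cost of inefficiency, using the article's figures.
users = 1_000_000

# Time cost: one wasted second per user.
wasted_seconds = users * 1
wasted_days = wasted_seconds / (60 * 60 * 24)
print(f"{wasted_days:.1f} days of productive time wasted")  # ~11.6 days

# Hardware cost: 24 GB of additional memory per user.
extra_gb_per_user = 24
price_per_gb = 101_250_000 / (users * extra_gb_per_user)  # implied ~$4.22/GB
total_cost = users * extra_gb_per_user * price_per_gb
print(f"${total_cost:,.0f} in additional hardware")  # $101,250,000
```

The per-GB price is back-derived from the article's total, not a quoted market figure.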

“The speed of software halves every 18 months.”

– Gates’s Law

Moore’s Law may be dead or dying but I think most agree that Gates’s Law is still alive. And that means we can’t expect our programs and apps to perform better when we buy newer computers and smartphones. The apparent performance will be about the same, if not worse, and the size will be bloated.
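The interaction of the two trends can be sketched with a simple model. Assuming (hypothetically) that hardware speed doubles every 24 months while, per Gates's Law, software speed halves every 18 months, the perceived speed after t months is the product of the two curves:

```python
# Sketch of Gates's Law outpacing hardware gains, under assumed periods:
# hardware doubles every 24 months, software halves every 18 months.
def perceived_speed(months: float) -> float:
    hardware_gain = 2 ** (months / 24)    # assumed hardware doubling period
    software_loss = 0.5 ** (months / 18)  # Gates's Law halving period
    return hardware_gain * software_loss

# After 6 years: 2**3 * 0.5**4 = 8 / 16 = 0.5 — half the original speed,
# despite three generations of faster hardware.
print(f"{perceived_speed(72):.2f}x")  # 0.50x
```

Under these assumptions the software trend wins: the user's experience gets slower even as the hardware gets faster, which is exactly the "about the same, if not worse" outcome described above.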

So what can we do about it? Here are some ideas at a high level:

  • architects and developers should choose better tools, designs, programming languages and frameworks when building software
  • developers should learn skills to build leaner software that is smaller and faster
  • engineering managers should consider the long-term impact of the tech debt that is code bloat
  • product managers should be concerned about the users' experience from a performance perspective

I plan to elaborate more in future blog posts.