The longer I write software, the more my sense of “impressive” changes. What actually amazes me these days isn’t modern technology, but older syste...
That is the negative consequence of over-provisioning.
Developers get fast machines and fast internet connections, and then they assume everyone else has access to the same resources.
The same thing is happening with hosting: even websites with a purely local audience want fast page loads on the other side of the world, and hyperscalers like AWS, Azure, and Google Cloud make that possible.
Don't get me started on AI: people are expected to run multi-gigabyte models on every device they own.
The tech industry needs to do less disrupting and add more value.
"Re-introduce constraints intentionally" is the key phrase. And you're spot on—the platforms that host this software share the responsibility. True innovation from hyperscalers wouldn't just be more power, but tooling that rewards leanness and helps us build software that's fast because it's efficient, not just because it's running on a beast!
Thanks for your post! I mostly agree, but it's funny that you cite Windows 95 as a good example: that unstable pre-NT desktop operating system, built on top of MS-DOS, that needed 9 floppy disks to install. Software like Windows 95 was one of the reasons the Linux community developed desktop environments — developers were craving stable alternatives to Microsoft, and Steve Jobs hadn't yet decided to bring his NeXTSTEP ideas back to Apple.
I appreciated memory savings and efficiency back then, and I still do now, but I'd rather point to 1980s home computer game development or the 1990s demo scene instead. Now you can retort and point out what idiosyncratic chaos Commodore assembly development was ;-)
Haha, excellent and fair critique! I was using Windows 95 as a shocking reference for scale, not stability — but your point stands! The real craft was indeed in the demo scene, in game development, and in the early Unix communities working under extreme limits. It’s precisely that mindset we’ve drifted from. And you nailed the irony: today's Linux, in many forms, consumes far beyond what its pioneers imagined. The spirit of minimalism they fought for has been overshadowed by convenience!
As an eight-year Linux user, the painful irony is watching the ecosystem recreate the very bloat it once opposed. Even after applying limits, browsers and CLI tools crash from pure memory bloat on an 8 GB machine. That much RAM was an entire server back in the day — no complex logic, just waste. That's the tangible price of the discipline we've lost.
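One concrete way to re-introduce that kind of limit deliberately on Linux is to give a process a hard memory budget before launching it. A minimal sketch, assuming bash on a glibc system (the 1 GiB figure is an arbitrary example, not a recommendation):

```shell
# Run a command inside a subshell with a capped virtual address
# space, so the limit does not leak into the interactive shell.
# 1048576 KiB = 1 GiB; a process that overruns the budget fails
# fast instead of dragging the whole machine into swap.
(
  ulimit -v 1048576
  python3 -c "print('started within a 1 GiB budget')"
)
```

On systemd-based distros, `systemd-run --user --scope -p MemoryMax=1G <command>` is a cgroup-enforced equivalent that also covers child processes.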
I think your post brings up a few interesting topics.
There is probably a balance to be struck between importing the world with little regard to consequences, and optimizing performance down to the last byte.
Different applications in different contexts will land on different points along this spectrum. For example, if we are building a prototype webapp to show a concept to a user, it would probably sit at the less optimised end, with less polished "manners".
Whereas an application whose concept is more concrete, and where the focus is on delivering performance that is good enough for now and the next couple of years, would fall closer to the other end, with better manners and more optimisation.
Sometimes different parts of the same application sit at different points on the spectrum. Say we are a company whose USP is being an insurance broker: we'd want to make sure the insurance-broker part of our application was put together very well, and spend most of our time there, while spending less time on supporting functions like an admin UI or reporting. We'd still build those to a good-enough quality, but probably with more off-the-shelf tooling, since that is not where the core of our business is.
Spot on—it's all about context and conscious choice on that spectrum.
My major concern is the loss of that awareness. We default to "import the world" even for critical production systems, forgetting to respect the user's resources. It’s not about optimizing every prototype; it’s about ensuring the broker's engine doesn’t fail because of bloat, forcing the user to upgrade hardware in a few years. That's the line between a strategic compromise and accidental bloat.
Totally agree with your pragmatic take.
This really resonates with me. I’ve been building software for a long time, and I recognise that shift you’re describing — from being careful and intentional, to assuming the machine (or the cloud) will just absorb whatever we throw at it.
A big reason I created myapi.rest was actually because of this. I kept seeing teams re-implementing the same small utilities over and over, often in rushed ways, with little thought for performance, limits, or long-term maintenance. It felt… a bit careless.
For me, "software manners" is about respect: for the system, for the next developer, and for the future version of yourself. Even when things are "cheap" or abstracted away, I still believe there's value in doing the small things thoughtfully.
Running 500K+ API calls daily taught me this the hard way. Switched to Cloudflare Workers because the platform forces discipline—tight CPU limits, pay-per-request billing. Can't be wasteful when the edge punishes bloat. Constraints aren't nostalgia, they're competitive advantage.
We loved your post so we shared it on social.
Keep up the great work!
I agree with your points. I would like to add another factor to the problems of modern software development: the dynamics of capitalism. I know it sounds like a bold statement, but in times of software abundance, the race to be "first" becomes overwhelming. I believe the flow of money is driven by this mechanism. The Windows system, in my opinion, corroborates this. As far as I know, Windows wasn't considered a high-quality product, but it became very popular for various reasons. This "popularity" made money flow to Microsoft, and this was an important factor in developing a better product. Not all software follows this logic (Linux, for example), but if we restrict ourselves to software in the context of companies and businesses, being first makes all the difference. Therefore, to me, it seems that software development faces a lot of pressure to be "ready" as quickly as possible, even at risk, and then we have to deal with the problems and technical debt. Unfortunately.