Reading an article in today’s Globe and Mail about where technology gadgets should go to die made me think about life spans vs useful life spans, and how technology vendors encourage a cycle of waste. Back when I was a software developer we used to joke about the tradeoffs between good, cheap, and quick — management could pick any two. If it was good and quick it was not going to be cheap; cheap and quick meant not good; and so on.
The useful life of computers is diverging from their actual life. Like any machine, computers have a limited lifetime. Semiconductors gradually degrade and change their properties, and other components age from the heating and cooling of repeated power cycles — solder joints eventually break, capacitors fail, and so forth. Hard disks develop wear spots that eventually stop reporting their data correctly, so logical errors accumulate until the disk fails even though mechanically it may still be fine. In my experience it takes roughly eight years for a computer to die of old age. But it may be put out to pasture (read: landfill or recycling depot) long before this, as a result of software-driven planned obsolescence.
When I started writing code, a typical development computer had 1 MB of memory, ran at perhaps 1 MHz, and would support perhaps 50 concurrent users, each with a 64 KB workspace. This was enough to do many jobs, although the tools we produced strained these limits. But most of the resources went to useful work, and relatively little to user-coddling interfaces.
Times have changed. Today a single user with a machine a thousand times larger is told that his machine is too small and too slow. Why? Because the overhead added to computing to make it prettier and more user-friendly, and to provide more background services, consumes huge amounts of resources. Years ago, just having color and reverse-video areas on the screen was a big deal. Now users expect to choose their screen colors and the decorative image trim (skins…) they get displayed. All of this flexibility adds substantial overhead.
Today I read a trade magazine suggesting that all future operating systems and applications from Microsoft will be 64-bit only. What does this mean to a small business? Simply that if your servers are not recent P4, Xeon, or AMD64-based machines, you will need to buy new hardware to run the new versions. Just as with Vista, the real requirement was to buy new computers, since nothing you already had could be upgraded.
What this means is that if the marketing for the new stuff succeeds, there will be a wave of discards going onto the scrap heap. I think this is deliberate. The real incentive for everyone in the computing food chain (except the end user — whose business benefit from computing really pays for it all) is to sell new stuff and push the last sale off the table and into the can as quickly as possible. It is great that computers have become so much faster, with much larger memory, at much lower cost (we think). And software developers use this to write code that is even more capable and convenient than the previous releases. But the path from one version to another is anything but smooth. And the more disruptive the transition, the more money the computing food chain makes — in consulting services and development.
But all this is expensive for the individuals and businesses at whom it is targeted. While it may be fashionable to have the latest car parked in the driveway as a symbol of how well we are keeping up with the Joneses, the same cachet does not attach to computing. The software and hardware that keep the orders flowing and the accounts balanced are largely invisible except to internal staff. So the expense of new hardware, software, and services every couple of years (to say nothing of retraining) just reduces the overall profitability of the company. This is rather like a slow version of taking all the golden eggs from the goose. In the end, if really successful, there is no more goose, and the eggs in hand are all you get.
So despite the hype, if sales of the new stuff are slowing, the vendors have only themselves to blame. As the economy slows and the tendency of business to defer expenses grows, the computing industry will feel the pinch. Eventually, it will be generally realized that the marketing pitches of ‘buy the new stuff and your business will flourish’ are just that — pitches. Paul Strassmann, in his book ‘The Squandered Computer’, showed that there was little relationship between how profitable a business was and how much money it spent on computing.
So when will computing consumers (and vendors) wake up? All this waste and churn just gets in the way of the optimized business processes that can only be built on stable, evolutionary frameworks. It limits the real value of computing and keeps it a business expense and irritation instead of a competitive and operational benefit.