Who Is Right… Some Thoughts on Planned Obsolescence


Watched a video about deploying Vista today. Over an hour of folks talking about all the great new features and about how large the computers they had to buy to run it properly turned out to be. And one near-rant about how unacceptable it is that folks always complain about new operating system releases but eventually accept them and then don't want to change again, and so on.

I have a problem with this. On the one hand, the vendors are right. They are not going to spend the time and money making software that runs well on current platforms, because it would make the software more expensive (remember Truman's triangle?), and the box that ships next week will be faster anyhow, so who cares? And all of the features they manage to pack in may well be for our own good, like letting PC users run unprivileged, a goal that has been desirable for almost as long as I have been a programmer, even if very few programs can actually be run that way, for a wide variety of reasons.

On the other hand, it is my money, and I expect to have to spend it only when it makes economic sense to me. Imagine what would happen if car makers were allowed to do what computer vendors consider their right and privilege: millions of cars piled beside the road because the manufacturer decided to use spark plugs with left-hand threads, or changed the voltages so none of the electrical parts would still work, or varied the fuel type. No spares after a very short time, so once it breaks you just walk away and buy another one. The humor pages have had plenty of better comparisons of what driving would be like if M*sft made cars, so there is no point in repeating them here.

I guess the point is that vendors in the IT industry have gotten rich by keeping product lifetimes short and pressuring users (us) into believing that it is in our interest to junk it all and buy new as frequently as possible. And we hear that computers are cheaper than ever (they are), so what is the problem? Well, there is the little detail of software, to say nothing of the time it takes to put it all back together and make it work again. If I buy a car, there is a familiarization exercise to be sure, but in general the gas and brake pedals are in the same place every year and one puts similar fluids in similar places; in other words, the user interface is standardized. Sure, each model has internal changes that may make it very different from the last one, but the user experience is pretty much the same.

But each generation of software is different, and there are very few upgrade pricing deals around, so essentially one has to relicense the entire pile of software again, then patiently relearn it and sort out all the (different) bugs. The total cost to a small business or individual home user can be substantial. And as in any technophile household, there is a pile of computers here doing various tasks that we (more likely I) deem essential. Having played with several generations of server operating systems, one learns that different versions do not play well together, regardless of what vendors may say: R2 server versions have problems talking to R1 Active Directory, file replication, and so forth. So there is no percentage in gradual change; it needs to be everything at once, like a large business rolling out new desktops. But that is a lot of money and time, so we bumble along on the old stuff, knowing that eventually the patches will stop coming and that when a hard drive or power supply flakes out, it's probably curtains for the whole lot, because of course there will be no spares available. (So I guess it all becomes part of another toxic waste shipment to some other part of the world…)

I guess what I am trying to say is that the technology firms would find their case a lot easier to make if they strove for changes that were not disruptive and tried to standardize what they offered, rather than differentiating it so the new stuff was instantly recognizable. This goes for repairable and upgradeable PCs as well. I know that on the software side it can be done: VMS used shareable libraries with transfer vectors to hide changes in the code from applications. If an application was linked against a shareable library and its calls went through the transfer vector, with the same names, arguments, and so on, the libraries and the OS could be upgraded with very little impact on the applications. I also know that Windows came from that heritage, but with no hardware support for memory isolation it was a lot tougher. And then there was the legacy of games that wrote directly to the hardware for better performance…
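For anybody who has not run into the idea, here is a rough sketch in C of what a transfer vector amounts to. This is not how VMS actually spells it (the real thing involves fixed offsets in the shareable image and linker support), and the names here (libmath_vector, lib_add, lib_scale) are invented purely for illustration. The point is just that callers only ever go through a table with a published layout, so the implementation behind it can be rewritten between releases without breaking them.

```c
/* Hypothetical sketch of the transfer-vector idea in plain C:
 * applications call library routines only through a table of
 * function pointers whose layout is the published interface. */
#include <stdio.h>

/* The published interface: slot positions in this table are the ABI.
 * Later versions may append new slots, but existing slots never move,
 * so old applications keep working against new library builds. */
struct libmath_vector {
    int (*lib_add)(int a, int b);
    int (*lib_scale)(int a, int factor);
    /* version N+1 would append new entries here */
};

/* "Internal" implementations; these can be rewritten, renamed, or moved
 * between releases because applications never call them directly. */
static int add_impl(int a, int b)        { return a + b; }
static int scale_impl(int a, int factor) { return a * factor; }

/* The transfer vector itself. In a real shareable library this would sit
 * at a fixed place in the image; here it is simply an exported global. */
const struct libmath_vector libmath = { add_impl, scale_impl };

int main(void)
{
    /* Application code relies only on the table layout. */
    printf("2 + 3 = %d\n", libmath.lib_add(2, 3));
    printf("4 * 5 = %d\n", libmath.lib_scale(4, 5));
    return 0;
}
```

The discipline that makes this work is the boring part: existing slots are never reordered or removed, only appended to, which is exactly what keeps yesterday's applications running on today's library.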

It just seems like the models being followed have gone entirely wrong: instead of emulating the electrical industry or the auto industry and making products that are reliable and repairable, they have gone the route of cheap fashion and throwaway luxury-priced products. I don't know about anybody else, but I am tired of it.
