Layering or Why Nothing Works


Having been involved with computers for 50 years now, I have certainly seen some changes. As machines have become faster and memory cheaper, increasingly powerful capabilities have become accessible to the average programmer. But the results are not always much of an improvement; in my jaundiced view, things are prettier but flakier than ever.

A couple of examples. In my retirement I dabble in stock trading as a way to earn scotch money… when there is a market stable enough to be worth trading in, anyhow. I have bought a couple of analytical tools to help me, tools that are fully buzzword-compliant. The problem is that funny things keep happening: printing goes all to hell and the formatting routines just ignore the settings, data paths change on their own, and complex rules about which stocks to target or dump trigger on symbols with no data (see 'path' above). The standard vendor response is to uninstall all of the pieces and reinstall, especially the .NET, .NET 2, and .NET 3 libraries and so forth. The vendor has a nice tool and mostly it is reliable, but just mostly. My suspicion is that the just-in-time compilation these libraries depend on is less deterministic than it should be; with no real memory isolation, the end results are probably influenced by factors outside the application. So 'reproducing' the problem is probably impossible.

Similar things happen with a systems management application I deployed on my server. It should make patch management easier, but no, I spend my time chasing internal problems in this incredibly narcissistic application; overnight it can log hundreds of errors as its pieces fail to work smoothly together. On the 'right' machine it is probably a nice tool, but in a small shop there is just no time to sort out its problems. An uninstall is running as I write this.

I guess I am getting cranky in my old age, finding bloated, flaky applications and tools ever harder to tolerate. I had the latest vendor OS and office suite on one machine and pulled both because too many things were harder to do than before. Glad I am technically capable of reconfiguring stuff when it doesn't behave. Pity it is getting to be a survival skill.

What I suspect is happening is that in too many circles the analytical technique of divide and conquer has been applied as a general design tool. Methodologies are slavishly applied by implementors who now say 'we have always done it this way'. Layers of complexity are built up that should work but don't, because of linkages and dependencies that the simplifying assumptions simply exclude [the terms of reference are drafted to exclude societal consequences]. Things are blithely used with no understanding at all of how they work, or how they fail. So applications don't work reliably, and financial markets suddenly break down when factors outside the scope of the carefully designed instrument change.

The root of the problem is that we have embraced complexity as an alternative to the hard work of understanding, in a general way, what we are doing. So unintended consequences become the norm and things just collapse. Too many semi-skilled specialists and nowhere near enough generalists. And the big picture is getting fuzzier.
