The recent major cyberattack on hospitals, including one here in Ontario, is continuing evidence that the short-sighted practices of the past are coming home to roost. And I would suggest that the root problems extend well beyond software, into every corner of the built environment.
Back in the 1970s, there was an idea floating around among the software developers I worked with called ‘Truman’s Triangle’. The concept was that there are tradeoffs among how quickly something can get done, how cheaply, and how good it is. Pick any two, they said; if you want it good, it will likely not be cheap or quick. The problem is that management always went for cheap and quick: issues could always be fixed in the next release (which frequently never happened).
As a developer, one common mistake when writing code is neglecting to check boundaries when writing into memory. If one reads 512 bytes from disk, there had better be a 512-byte buffer to accept it; otherwise the write clobbers whatever sits above it in memory. A typical neophyte error, and the root cause of the buffer overflow, one of the common pathways by which malware gains a toehold. And if developers were in a rush because the vendor had committed to a release date, as with the 2001 version of Windows (XP), there were probably a few such errors. Testing usually checks the expected behavior of a program, not what breaks otherwise. The software house where I worked had one fellow who torture-tested programs; if your stuff could get past Charlie Brown (his name, really) it was pretty bulletproof. I never saw this anywhere else.
In the real world this issue is compounded because the vendor (Microsoft) wants to sustain sales by forcing users to buy the latest version when it is released. Not only are there improvements that may be desirable (and hopefully a few bug fixes), but also changes that make the operating environment incompatible with the prior release. Hardware vendors facilitate this by writing their drivers only for the latest versions. And application vendors will put out a new version for the latest OS so that they get an upgrade sale as well. This is not always bad; sometimes the new version is better. And support has to be paid for somehow.
As bugs like the buffer overflow get discovered and complained about, the vendor pushes out patches to fix things. These have to be installed and tested; sometimes a patch breaks things elsewhere, so some care and thought are required. A ‘fools rush in’ area to be sure. This is why systems folk appreciate the value of patching but are sometimes reluctant to do it. And the thinner they are spread, the greater the chance that this is one item that languishes.
So Microsoft adds pressure by dropping support, including providing patches, for those versions of the OS that they want off the landscape. For isolated machines like single desktops or computers embedded in machinery this is probably not an issue, IMHO. But for anything that is network connected, especially to the Internet, it is a serious threat, almost extortion. It is worth noting that after formally dropping support for XP, Microsoft has just released a patch for XP for the current, very conspicuous, bug.
But in the real world of enterprise, the decisions around how to implement and maintain a computer application, a hospital medical records system for example, are made by people more interested in their balance sheet than in the esoteric possibility that they may be hacked. After all, that only happens to others; we are so much better, smarter, whatever… it will never happen to us. So remote access is simplified with more of a view to convenience than security, firewall rules are not maintained, logs not checked, backups not monitored and tested to ensure they are really recoverable. And remote staff are used for support because it’s cheaper to get someone out of bed than to have local staff, and they may need to get at everything remotely.
And finally we have a whole landscape of malefactors who want to steal, destroy or harass for a wide variety of reasons. And unlike conventional insurgency or warfare, this can be done from the comfort of a remote office or home, with some chance that the source will remain hidden. And if greed is the driver, ransomware [locking a user’s system until a ransom is paid] is seen as a whole lot better than just stealing the data and trying to sell it elsewhere.
So when one asks why we are not doing more to prevent these kinds of attacks, it is worth following the trail of breadcrumbs to realize that the issue exists because of deliberate decisions at many levels, most but not all of them made, I believe, in ignorance, possibly willful, of the long-term consequences of those choices. And until robustness and security are taken seriously at every point in the chain, the situation may change but will only appear to get better.
But there is a larger issue visible in this problem. While there may be funding for the original creation, adequate funding for maintenance is a different matter. So corners get cut, or keeping things running smoothly gets ignored completely. And when funds get tight, maintenance and support are the first things to go. Besides, every politician or business person likes the attention garnered for doing something new. Keeping things running smoothly and economically… not so much.
But it is not just software and computer systems where this is a problem. Look around… bridges are rusting, roads are crumbling, pipes are leaking, food safety is questionable. The list goes on and on: neglect, greed, short-sighted decision-making. Consider the problems with building on flood plains that no one really knew about, because the information was never collected or maintained. All because of that very human tendency to think that the consequences of taking the cheap/quick approach will not happen to them, because they are…
The true miracle is that anything works at all.