The latest strain of ransomware has been in the news, accompanied by somewhat sensationalist coverage. Yes, it’s a big deal, but not unexpected – ransomware is only going to get worse. Right now it’s focused on availability; next it’ll be integrity (more on that in the next post). One question that’s just starting to be asked is: whose fault is it? I’m looking beyond the cyber criminals who released it, and toward the IT ecosystem that enables this to happen.
The NSA is a target for a lot of pundits. From media reports, there was an internal debate about disclosing the vulnerability to Microsoft, but ultimately the agency decided against it. It’s easy to take an absolute position on this – we should hoard vulnerabilities for intelligence purposes, or we should always disclose. Unfortunately, we live in a grey world, and such black-and-white absolutes are hard to come by. Once the agency realized the tools had been exposed, it privately notified Microsoft, which quickly issued a patch – and the agency is to be commended for that. The NSA is in a tough position, and there’s no easy answer. I’ll have more thoughts on policy options in the future.
End users certainly share some of the responsibility. Patching is often cited as the first and most important defense against attacks; I’d argue that running a supported operating system is even more important. Folks still running XP need to either isolate the machine (physically offline) or upgrade. That may mean spending money to update industrial systems, or changing procedures to run them disconnected from other networks. There are no other viable options. Consumers need to turn on automatic updates on their personal machines and apply them regularly. What they don’t need is to spend more money on consumer antivirus.
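To make the “supported OS” point concrete, here’s a minimal sketch in Python. It’s purely illustrative – the list of unsupported releases is my own assumption, not an authoritative feed – but it’s the kind of check that matters more than any add-on security product:

```python
import platform

# Illustrative (not exhaustive) set of Windows releases that no longer
# receive security updates -- an assumption for this sketch, not a
# maintained support list.
UNSUPPORTED = {"XP", "Vista"}

def check_os_support() -> None:
    """Warn if this machine is running an out-of-support Windows release."""
    if platform.system() != "Windows":
        print("Not Windows; check your vendor's support policy instead.")
        return
    release = platform.release()  # e.g. "XP", "7", "10"
    if release in UNSUPPORTED:
        print(f"Windows {release} is out of support: isolate this machine "
              "(physically offline) or upgrade it.")
    else:
        print(f"Windows {release} is still supported; keep automatic "
              "updates turned on.")

if __name__ == "__main__":
    check_os_support()
```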
Businesses are in a tougher position – they may have thousands of machines that need updating, including both servers and laptops/desktops. There are tools available (for example, IBM BigFix) that make this straightforward, but often it’s not the actual patching that’s the issue; it’s compatibility with enterprise systems. Corporate development needs to remove as many platform dependencies as it can, to make applying patches less risky. But we can’t even get rid of Flash, Silverlight, and Java, so OS linkages are likely to take even longer to fix. Businesses need to build processes to test and apply security patches quickly – it’s just hygiene, but it needs a higher priority than it currently gets.
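What does a process to “test and apply security patches quickly” actually look like? Here’s a minimal ring-based rollout sketch in Python – every ring name, host list, and soak time is a made-up illustration, and the apply/healthy hooks stand in for whatever patch tool and monitoring you already run, not any particular product’s API:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Ring:
    name: str
    hosts: List[str]
    soak_hours: int  # how long to watch for regressions before promoting

# Hypothetical rollout plan: a few expendable canaries, a small pilot
# group, then everything else. Host names are placeholders.
ROLLOUT: List[Ring] = [
    Ring("canary", ["test-vm-01", "test-vm-02"], soak_hours=24),
    Ring("pilot",  ["desk-017", "desk-042", "srv-lab-3"], soak_hours=72),
    Ring("broad",  ["all-remaining-hosts"], soak_hours=0),
]

def deploy_patch(patch_id: str,
                 apply: Callable[[str, str], None],
                 healthy: Callable[[str], bool]) -> bool:
    """Apply a patch ring by ring, halting at the first sign of trouble."""
    for ring in ROLLOUT:
        for host in ring.hosts:
            apply(patch_id, host)  # push via your patch tool of choice
        if not all(healthy(h) for h in ring.hosts):
            print(f"{patch_id}: regression in ring '{ring.name}'; halting")
            return False
        print(f"{patch_id}: ring '{ring.name}' healthy "
              f"(soaked {ring.soak_hours}h); promoting")
    return True

if __name__ == "__main__":
    # Stub hooks so the sketch runs standalone.
    deploy_patch("MS17-010",
                 apply=lambda p, h: print(f"applying {p} to {h}"),
                 healthy=lambda h: True)
```

The design point is the halt-on-regression check: compatibility problems surface in the canary ring, on a handful of machines, instead of across the whole fleet.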
Which brings us to Microsoft. They have been making this harder by changing how patches are delivered: combining security and feature updates, and drastically reducing the information about what’s in a patch. Both of those need to change to make it easier to assess and test updates. On Windows 10, they also force-download and install patches – something that’s controversial. That’s hit me with high mobile data usage, but it probably keeps the vast majority of people far safer than the Windows 7 approach. Are they responsible for the bug? Sure, but I can’t beat them up over it – all software has bugs, and Windows 10 is a major improvement over previous editions. By comparison, their major competitor has a growing problem with defects.
So who’s to blame? At some level, we all are: security professionals who make easy-to-say statements like ‘upgrade and patch immediately’ without regard to system stability or upgrade cost; pundits who say ‘disclose all vulnerabilities’ without regard to legitimate national intelligence needs; vendors who focus on rapid release of features at the expense of system stability; businesses that fail to invest in keeping their IT infrastructure current; and end users who blindly assume that all the others will take care of it for them.
Not an easy fix.