You could call last week’s WannaCry ransomware attacks a perfect storm. But maybe we shouldn’t pin the label “perfect” on something that owed so much to a confluence of venality, incompetence, bad decision-making, and wrongheadedness.
Also known by other names such as “WannaCrypt,” these assaults on Windows PCs–which encrypted unsuspecting users’ data, then demanded a ransom payable in bitcoin to restore access–didn’t need to happen at all. And government agencies and tech companies could overhaul policies in ways that could prevent future outbreaks.
WannaCry relies on a Windows file-sharing exploit, allegedly discovered by the U.S. National Security Agency (NSA), that allows the code to spread across the internet from PC to PC like a worm. Once active on a machine, it delivers its payload to other computers on the same network and randomly probes the internet for additional vulnerable machines.
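That random-probing behavior is simpler than it might sound. Here is an illustrative Python sketch of how a worm might pick internet addresses to try, and nothing more; this is not WannaCry’s actual code, and the function name and filtering logic are my own:

```python
import ipaddress
import random

def random_probe_targets(n, seed=None):
    """Generate n random public IPv4 addresses, the way a worm
    might pick internet hosts to probe (illustrative only)."""
    rng = random.Random(seed)
    targets = []
    while len(targets) < n:
        addr = ipaddress.IPv4Address(rng.getrandbits(32))
        # Skip private, loopback, multicast, and other reserved space.
        if addr.is_global:
            targets.append(str(addr))
    return targets

print(random_probe_targets(3, seed=42))
```

A real worm would then attempt its exploit against each address that answers on the file-sharing port, which is why a single infected machine can seed infections far beyond its own network.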
The exploit was part of a set of Windows flaws that a group calling itself the Shadow Brokers says it stole from the NSA and then tried to sell. Failing to find a buyer, the Shadow Brokers released the flaws openly in mid-April. Microsoft had released comprehensive patches a month before that, having clearly been tipped off weeks or months in advance. But with the genie out of the bottle, criminal groups set out to use the flaws to attack unpatched Windows systems.
Plenty Of Blame
Where to start pointing fingers? Pundits, ostensibly neutral news stories, and security experts are blaming the individuals and organizations who didn’t install the patches. The National Health Service in the United Kingdom, for instance, runs outdated systems and was warned months ago about the extent of the problems it might face.
But that kind of blame obscures the structural culprits, making users the scapegoats as usual. The NSA, FBI, and other U.S. government agencies discover or pay to acquire zero-day exploits–attacks on flaws that haven’t yet been patched by companies such as Microsoft–for their arsenal of spy tools. Despite official policies, these agencies don’t alert the affected software and hardware companies, leaving software vulnerable rather than helping to fix it.
The fruits of that decision were laid bare here, because the NSA reportedly protected its tools so poorly that one NSA contractor apparently posted them insecurely on a server and an NSA employee allegedly copied them and brought them home.
This bolsters the argument that no government should be trusted to retain such secrets: either they leak, or malicious parties discover the same flaws independently–parties who could have been foiled had the agency worked with the companies whose products were vulnerable. Microsoft condemned the NSA’s practices on Sunday.
Back in 2016, one of Apple’s arguments against providing the FBI with a special version of iOS–one that would let the agency try to break through the passcode protection on an iPhone used by one of the San Bernardino shooters–was that despite the FBI’s promise to keep it secret, it would inevitably leak. Ironically, the FBI turned to an undisclosed third-party vendor that apparently had a hoarded iOS zero-day vulnerability to crack the phone.
But like plenty of malware before it, WannaCry relied on Microsoft’s long-running policy of shipping Windows with ports open and various software services running by default, making the operating system more vulnerable to attack. That wasn’t a great idea in 1995, and it has only gotten more dangerous in subsequent years. The company has become increasingly careful, and Windows 10 is considered one of its best-secured releases. (Windows 10 was unaffected by this particular exploit, and it’s robust enough that contemporary exploits largely target older versions of Windows such as Windows XP–Conficker remains one of the most active pieces of malware–or vulnerable software like Adobe Flash.)
That original sin on Microsoft’s part continues to pay dividends for criminals and intrusive government agencies, despite the many opportunities the company had to push out updates that would have shut down unused services or guided users through figuring out what they did and didn’t need to run. The fact that services by default were accessible over the internet–even when they were intended only for local networks–remains appalling. That was bad software design when Microsoft started doing it, and it’s unconscionable that it remains true in older versions of Windows while the company has focused on releasing newer versions of the operating system with better controls.
The Fact Of Life Known As Windows XP
Finally, the capper. Microsoft released patches for Windows in March, but the versions for Windows XP (and a few later releases) were only available to companies that had paid for special past end-of-life custom support. It took the spread of WannaCry for the company to drop that barrier and release them to all users on Friday.
Some companies, government agencies, individuals, and nonprofits lack the resources to upgrade from Windows XP to later versions, especially in the developing world, where XP was widely pirated. They may not have computers powerful enough to run more recent versions of Windows.
Even for groups like the UK’s NHS, among the world’s five biggest employers, installing patches is a disruptive process that has to be planned like a war effort to keep critical systems running, no matter how dire the risk. Security patches can break essential third-party software, so they have to be tested before being rolled out.
It doesn’t have to be like this. WannaCry could increase the volume of debate about prohibiting the NSA, FBI, and other agencies from keeping exploits secret. The government claims it discloses most, but the Shadow Brokers seem to have put the lie to that. Disclosing all of them is now more clearly in America’s national security and business interests.
Microsoft can’t change the past, but it can help with the future. It could release tools for outdated systems designed to shut down unneeded services. On affected versions of Windows, merely disabling a networking technology called SMB (a task beyond typical users’ abilities) would block WannaCry, for instance. On a broader scale, Microsoft could help halt the spread of worms by helping to secure the tens of millions of computers still running XP, and the hundreds of millions running other versions prior to Windows 10. Most people and groups need almost no services running.
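For administrators triaging a fleet, the first question is simply which machines expose SMB at all. A minimal reachability check in Python, assuming SMB is listening on its standard TCP port, 445 (the function name is mine, and an open port only means the service is exposed, not that the machine is vulnerable):

```python
import socket

def smb_port_open(host: str, port: int = 445, timeout: float = 2.0) -> bool:
    """Return True if the host accepts TCP connections on the SMB port.

    This is only a reachability check: it shows the service is exposed,
    not that it is exploitable.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        return s.connect_ex((host, port)) == 0

# Machines that answer here have SMB reachable, and are the ones to
# patch, firewall, or disable SMBv1 on first.
print(smb_port_open("127.0.0.1"))
```

A sweep like this won’t fix anything by itself, but it turns “patch everything, somehow” into a prioritized list.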
Operating-system makers such as Microsoft could also advance the fight against ransomware, which encrypts user documents and thus doesn’t have to insinuate itself very deep into an operating system to have a devastating effect. Years after ransomware began its rise, applications in most OSes can still read, write, and delete any file “owned” by the user. Specialized tools can restrict file access and require user permission and training, but they’re too complicated for most people. Some researchers have found effective ways to detect ransomware by identifying suspicious behavior, but these approaches aren’t yet integrated into anti-virus software, whether provided by an OS maker or third parties.
Blaming organizations and individuals for the combined results of NSA exploit policies, Microsoft’s multi-decade security choices, and the criminal deployment of leaked flaws seems rather rich. Changes to government and operating-system policy could have prevented WannaCry, or greatly minimized it. Maybe the damage it has already done will finally prompt change.