A nation must think before it acts.
In February 2016, we wrote about the dispute between the U.S. Federal Bureau of Investigation (FBI) and Apple, Inc. over the government’s demand that Apple intentionally create an insecure “back door” version of its iOS software that would permit the FBI to break encryption on the iPhone used by one of the San Bernardino terrorists, Syed Rizwan Farook. Apple and others claimed that once a “back door” into a system was created, it would inevitably become known and would be used by criminals and spies to steal personal information. Apple worried that the FBI would not be able to keep the back door safe from hackers, pointing to previously publicized hacks that had exposed a great many government secrets.
While the solution we favored, having Apple digitally “sign” software written by the FBI to access the iPhone, was never implemented (the Bureau instead purchased hacking software from an Israeli company for nearly $1 million and later reported that no useful information was recovered from the device), the larger issues surrounding encryption and system security went unresolved.
Beginning on May 12, 2017, hundreds of thousands of computer systems across the world were incapacitated by a ransomware attack dubbed “WannaCry.” Infected computers in Britain’s National Health Service hospitals forced facilities to close, surgeries to be postponed, and ambulances to be diverted from affected hospitals. Other critical systems were rendered unusable, and system operators scrambled to disable networks to keep the virus from spreading.
While such attacks are not new, WannaCry differs from earlier ones in ways that make it more virulent. Reportedly delivered by email, the ransomware infects Windows computers and encrypts most or even all of their files. It then demands a ransom, payable in Bitcoin, to decrypt them: $300 at the time of infection, doubling to $600 if the user does not pay within three days. After seven days without payment, WannaCry threatens to delete all of the encrypted files, leaving the computer completely useless.
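To make that schedule concrete, the short Python sketch below models the payment deadlines described above. It is purely illustrative: the function name and structure are our own and are not drawn from WannaCry’s actual code.

```python
from datetime import datetime, timedelta

# Illustrative model of the ransom schedule described above (not
# WannaCry's actual code): $300 at infection, doubling to $600 after
# three days, with encrypted files deleted after seven days.
def ransom_status(infected_at: datetime, now: datetime) -> str:
    elapsed = now - infected_at
    if elapsed < timedelta(days=3):
        return "Ransom demanded: $300"
    if elapsed < timedelta(days=7):
        return "Ransom doubled: $600"
    return "Deadline passed: encrypted files deleted"

# Example: four days after infection, the demand has doubled.
print(ransom_status(datetime(2017, 5, 12), datetime(2017, 5, 16)))
```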
This virus has been so dangerous because it spreads using an “exploit,” a tool that takes advantage of a security hole in Windows, developed by the U.S. National Security Agency (NSA). Called “EternalBlue,” the exploit targets a flaw in Windows’ file-sharing (SMB) service and was kept secret by the NSA for its own intelligence-gathering purposes. It became public last month when a group of hackers called the Shadow Brokers released the details of the exploit on the internet. Because the flaw exists in Windows XP, Windows 8, and Windows Server 2003, which Microsoft no longer updates with security patches unless an expensive custom support agreement has been purchased, as well as in unpatched newer versions of Windows, any system running these unpatched versions contained the seeds of its own destruction. (Last Friday, Microsoft finally issued a free update to patch the vulnerability on these older systems; anyone still running them should install it immediately.)
When the NSA’s EternalBlue code was combined with an aggressive internet-scanning routine and a run-of-the-mill ransomware program, WannaCry was born. We still do not know who is behind the attack, and it is still in progress: according to security researchers, a new, unpatched Windows PC placed on the internet will be attacked by the worm in less than 30 minutes. We should also expect modified and more destructive versions of this software to begin circulating quite soon.
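Because the worm spreads by scanning for exposed Windows file-sharing (SMB) services, one basic defensive check is whether TCP port 445, the port EternalBlue targets, is reachable on a given machine. The Python sketch below illustrates that check under stated assumptions: the address shown is a placeholder, and the test should only be run against machines you administer.

```python
import socket

# Minimal check of whether TCP port 445 (SMB, the service targeted by
# EternalBlue) is reachable on a host. Run only against machines you
# are authorized to test; the address below is a placeholder.
def smb_port_open(host: str, timeout: float = 3.0) -> bool:
    try:
        with socket.create_connection((host, 445), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    host = "192.0.2.10"  # placeholder: replace with a host you administer
    if smb_port_open(host):
        print(f"{host}: port 445 reachable; apply the Windows patch or disable SMBv1")
    else:
        print(f"{host}: port 445 not reachable from here")
```

If the port is reachable from an untrusted network, the prudent responses are the ones described above: apply Microsoft’s patch and, where possible, block or disable the legacy file-sharing service.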
The policy debate reignited by WannaCry is whether governments should develop and stockpile cyber “exploits” for use in espionage and cyberwar operations. Microsoft President Brad Smith stated in a blog post, “We have seen vulnerabilities stored by the CIA show up on WikiLeaks, and now this vulnerability stolen from the NSA has affected customers around the world. Repeatedly, exploits in the hands of governments have leaked into the public domain and caused widespread damage. An equivalent scenario with conventional weapons would be the U.S. military having some of its Tomahawk missiles stolen.”
In the cat-and-mouse game of cyber operations, intelligence agencies argue that they must stay one step ahead of adversaries in order to perform their missions. On the other hand, vulnerabilities that leak or are independently discovered can be turned against the U.S. government and the private sector, as well as our allies. Bills have been proposed in Congress to require confidential disclosure of such vulnerabilities to the companies whose products are affected, but debate over such measures has barely begun, and the fate of the legislation is uncertain.
Microsoft’s Smith went on, “The governments of the world should treat this attack as a wake-up call. They need to take a different approach and adhere in cyberspace to the same rules applied to weapons in the physical world. We need governments to consider the damage to civilians that comes from hoarding these vulnerabilities and the use of these exploits. This is one reason we called in February for a new ‘Digital Geneva Convention’ to govern these issues, including a new requirement for governments to report vulnerabilities to vendors, rather than stockpile, sell, or exploit them.”
We at FPRI look forward to the debate that WannaCry may instigate, and we pledge to provide our research and views to those engaged in it, given what we see as a pressing need to protect our “Digital Commons.”