
Government not ‘sitting on hundreds of zero days,’ former NSA official says

“We disclose something like 90 percent of the vulnerabilities we find,” said Richard Ledgett.

This story first appeared on CyberScoop.

Storm clouds are rising over the U.S. government’s policy on software flaw disclosure after the massive WannaCry infection spread using a cyberweapon developed by the NSA, and even former agency leaders say it might be time to take a fresh look at the Vulnerability Equities Process.

Under the VEP, U.S. officials weigh the benefits of disclosing a newly discovered flaw to the manufacturer — which can issue a patch to protect customers — or having the government retain it for spying on foreign adversaries who use the vulnerable software. The process has always had a bias toward disclosure, former federal officials said.

“We disclose something like 90 percent of the vulnerabilities we find,” said Richard Ledgett, who retired April 28 as the NSA’s deputy director. “There’s a narrative out there that we’re sitting on hundreds of zero days and that’s just not the case,” he told Georgetown University Law Center’s annual cybersecurity law institute.


On the contrary, he said, “the process, led by the [White House National Security Council], is very bureaucratic and slow and doesn’t have the throughput that it needs.” He said it was an issue NSA leaders had raised with both the previous administration and the Trump White House — and that current homeland security adviser Thomas Bossert had promised to fix.

A zero day vulnerability is a newly discovered software flaw — one the manufacturer has “zero days” to patch before it can be exploited. An exploit is a piece of code that uses a vulnerability to work mischief on a computer, for instance allowing a remote hacker to download software and seize control. “Not all zero days are created equal,” one of the architects of the VEP, former White House Cybersecurity Coordinator J. Michael Daniel, told CyberScoop recently.

Some exploits might require physical access, or need other exploits to be pre-positioned. Some might even rely on known but widely unpatched vulnerabilities, he said. One of the reasons WannaCry spread so fast — despite being relatively unsophisticated in design — is that it utilizes a very powerful NSA exploit called EternalBlue.

EternalBlue was one of a large cache of NSA hacking tools dumped on the web last month by an anonymous group calling itself the Shadow Brokers — an event that led to calls for the government to give up stockpiling vulnerabilities altogether.

That would be a mistake, Ledgett said, in part because even disclosed vulnerabilities can be exploited. Hackers can take apart the patch, reverse-engineer the vulnerability it fixes, and then weaponize it with an exploit. Even when a patch is available, Ledgett noted, “Many people don’t patch, for all sorts of reasons.” Large companies, for example, often run custom software that can break when an operating system is updated.


“The idea that if you disclose every vulnerability, everything would be hunky dory is just not true,” he said.

Besides, the NSA’s use of its cyber-exploit arsenal was “very tailored, very specific, very measured,” added Ledgett, agreeing that the VEP policy was “in about the right place.”

Indeed, he said, there “was an argument to be made” that Microsoft, which last weekend rushed out an unprecedented patch for discontinued but still widely used software like Windows XP, should bear some of the blame for not patching the discontinued products in March, when it patched its current products — apparently in response to an advance warning from the NSA.

Daniel revealed the VEP in 2014, in response to suspicions that the NSA had known about the huge Heartbleed vulnerability in a very widely used piece of open-source software — it hadn’t, he said. But the policy has been in place since 2010, according to documents declassified in response to a Freedom of Information Act request from the Electronic Frontier Foundation — an internet freedom advocacy group.

And Ledgett said the NSA had previously had a similar policy in place “for decades.” At the heart of the process, he said, is a balancing of how valuable the vulnerability in question is for the NSA’s foreign intelligence mission, versus how damaging it might be to U.S. companies or Americans generally, if it were discovered by an adversary or revealed before it could be patched.


Ledgett said the new process balanced largely the same factors in much the same way — although additional players, like the State and Commerce Departments, now sit at the table in the National Security Council-led VEP.

“The thing that’s new since 2014 is the risk of disclosure of a vulnerability,” he said.

But former NSA director and retired four-star Air Force Gen. Michael Hayden pointed out two other things that have changed — affecting “where NSA places the fulcrum in its balancing of offensive and defensive equities.”

“Far more often now the vulnerability in question is residing on a device that is in general use (including by Constitutionally protected US persons) than on an isolated adversary network,” he wrote in a blog post for the Chertoff Group, where he now works.

He said that a “comfort zone” the NSA had previously enjoyed had also narrowed “considerably.” The comfort zone was called NOBUS, short for “nobody but us.” In other words, “This vulnerability is so hard to detect and so hard to exploit that nobody but us (a massive, technologically powerful, resource-rich, nation state security service) could take advantage of it.


“That playing field is being leveled, not just by competing nation states but also by powerful private sector enterprises,” he concluded. “The NOBUS comfort zone is considerably smaller than it once was.”

This week, bipartisan bills in both chambers sought to give the VEP a basis in law. Sens. Brian Schatz, D-Hawaii, Ron Johnson, R-Wis., and Cory Gardner, R-Colo., and Reps. Ted Lieu, D-Calif., and Blake Farenthold, R-Texas, put forward the Protecting Our Ability to Counter Hacking Act, or PATCH Act.
