Written by Chris Bing
The disclosure process that governs how and when federal agencies should tell tech firms about flawed computer code is in no immediate danger of termination under the Trump administration, current and former U.S. officials said.
Flawed code by its very nature offers vulnerabilities that can be targeted by hackers. Knowledge of these vulnerabilities — especially those never publicly reported — is valuable to a wide array of actors, including law enforcement and intelligence services. In the past, the default has been to err on the side of disclosure, even by the super-secretive National Security Agency, according to a comprehensive research report conducted by Columbia University. For now, that isn’t likely to change, said Neil Jenkins, director of the Homeland Security Department’s Enterprise Performance Management Office, or EPMO.
“It is not within our national interest to build up a stockpile of vulnerabilities to hide behind and to use for intelligence or law enforcement purposes. We have to get those out to make sure that systems are secure,” Jenkins told a room full of cybersecurity industry insiders while speaking on a panel at the 2017 RSA conference. “But the process does recognize that there are some vulnerabilities that we need to keep, that we need to use for national security purposes.”
In most cases, though, the government has incentive to report flawed code, U.S. officials say.
“The process is still in use, it is in regular use, and we are having meetings about these things on a pretty regular basis. And I would say, as of right now, we are still in the mode of responsible disclosure under the current administration,” Jenkins said at RSA.
Known as the Vulnerabilities Equities Process, or VEP, the guidelines were only recently made public. The government first began releasing limited information about the process during the Obama administration.
“We made an agreement early on that we would err on the side of the defense of this nation … and if we didn’t put out the capability, if it were ever found then we would put it out there. And we stuck to that. It makes your job harder, but it was the right thing for the country,” former NSA Director Keith Alexander said in December during a University of Maryland event. “We put out more than 90 plus percent of those things that we saw. Some people criticized it [the VEP] but nobody changed.”
In the run-up to President Donald Trump’s inauguration, however, some feared that a new White House would reverse course and offer intelligence and law enforcement agencies greater leeway to keep vulnerabilities secret.
“Where we are in 2017, if you look at the sort of arc of cyber policy today, coming out of the Trump campaign and then out of the then president-elect’s office, it was very offense-oriented,” explained Rob Knake, a former director for cybersecurity on the National Security Council at the White House. “So I think there was this sense that the gloves were coming off, that the [VEP] would be thrown out the window … That was my fear.”
“But what we have seen since then I think is a growing recognition that we revived this policy, that this is a policy from the Bush administration, this started in 2008 and came out of the CNCI … and one which [current White House homeland security adviser] Tom Bossert had a heavy hand in,” said Knake.
Though the VEP in its current form offers renewed transparency into what has largely been a clandestine decision-making process, the panel of experts agreed that more can be done.
Greater coordination and shared oversight, Knake explained, should be instituted between the multiple federal agencies that share a vested interest in either disclosing software flaws or keeping them secret. To date, each federal agency has approached the VEP in a slightly different manner.
“We agree that it is time for this process to be codified in law just to make sure that it continues, that there are clear considerations around the risk and potential for operational use … and for regular review of what you’re not disclosing,” said Heather West, a senior policy manager for Mozilla.
“I think ultimately there could be a lot more transparency around [the VEP]. One of the things we noticed as we have gone through and researched this process is that it works reasonably well, and the government could build a lot of trust with industry by saying: this is what we have and are doing. And then we can have this collaborative relationship that we don’t normally have in the cybersecurity space,” West said.
At the moment, there are no penalties in place for agencies or U.S. officials that decide to keep software vulnerabilities out of the VEP process.