
The government has tons of cybersecurity data. Now what?

At the federal level, the reach of data has long exceeded the grasp of government. But recently, the government and private industry have developed the capacity to access and analyze that data effectively.

If only it were that easy.

“This data is a commodity,” said Curtis Levinson, the United States cyberdefense adviser to NATO. But “everybody is a consumer and no one wants to be a supplier” of data.


Speaking Wednesday at a FedInsider event, “The Role of Analytics in Cybersecurity,” Levinson and other cybersecurity experts described the struggle to get useful analysis from cybersecurity data. Agencies and industries hold unfathomable amounts of data but have no consistent method for releasing it, accessing it or presenting understandable analysis of it.

“What is new is that the technology has changed,” said Stephen Dennis, innovation director of the Homeland Security Advanced Research Projects Agency. The agency is “like Homeland Security’s version of DARPA,” he said, referencing the military’s advanced technology research division, “but we don’t have all the money that DARPA has.”

DHS, like many government agencies, has massive amounts of data but limited capacity for analysis. Each day, the department prescreens 1.8 million passengers at 448 airports, seizes more than $500,000 in counterfeit money, and patrols 3.4 million square miles of U.S. waterways. According to Dennis, the agency recently compiled a list of 43 needs around big data. The No. 1 requirement, easily, was data access, he said. Employees don’t want to deal with different systems hosting different databases.

“People wanted to get to their storage regardless of where it resides,” he said.

But easy access to data is hampered by a ubiquitous fear of exposing confidential information. It goes back to the problem Levinson highlighted: “Everybody wants [data], but nobody wants to offer it up.” Government fears the release of confidential information. Private industry fears the release of personally identifiable information.


In the end, no one gets anything.

“There are gaps in trust and there are gaps in who provides what,” Levinson said. “While we’re able to collect the data, there is no uniform, consistent way of scrubbing that data. So data that isn’t scrubbed isn’t readily provided.” It’s the “single most pressing need,” he said, for cybersecurity data analysis.

To help solve this problem, JR Reagan, Deloitte & Touche LLP’s federal chief innovation officer, encouraged government agencies to ask: What can we learn from industries that have solved similar problems? The financial services and gambling industries have both tackled big data analysis in recent years. The high-profile 2008 collapse of the financial sector, blamed partly on poor predictive models, makes the example “a little bit of a stretch, but there’s something to this,” Reagan said.

But the financial industry “has started to figure out how to manage large volumes of data, look at risk within that data,” Reagan said. “That’s a different metaphor that can possibly be applied to cyberanalytics. What we’re missing is a unit of measure, of course. We know what dollars and yen are, but we need to figure out what the value of data is.”

And Las Vegas is famous for its omnipresent monitoring systems. “They know how to protect their dollar really well,” Reagan said. “Those cameras really measure, look at you, and connect the non-obvious relationships. It is a really tightly wound process they’ve figured out.”


Beyond data analysis, the panelists said, agencies must also work on “visualization” — finding the best pictorial method or metaphor to present trends and patterns to those uninitiated in cybersecurity intelligence. Reagan called it a “next frontier” for cybersecurity analytics. “We can see pictures 60,000 times faster than we can recognize a piece of text,” he said.
