In federal cybersecurity, people remain the greatest vulnerability

Despite all the technological advances — the machine learning, sophisticated intrusion detection software and multifactor authentication — the fundamental problem in federal cybersecurity remains the human factor, and securing federal data will require fundamental behavioral changes in the people who use it.

“People are still our greatest vulnerability,” said Richard Young, chief information officer for the Department of Agriculture’s Foreign Agricultural Service. “We have to embed and engrave cybersecurity DNA into our people.”

Young made his remarks last month at the “Executive Leadership Forum: Mitigating Data Risks in a Virtual World” event, hosted by immixGroup, an Arrow company that helps technology companies do business with the government. He spoke on a panel with Dr. Joseph L. Ronzio, deputy chief health technology officer, Veterans Health Administration; Chad Tompkins, data section chief, Consumer Financial Protection Bureau; and Sonny Bhagowalia, CIO with the Department of the Treasury.

Personnel-oriented change means approaching systems in a new way. The panelists emphasized redefining high-value assets, or HVAs; better collaboration to change individual perspectives on what to secure and why; and abandoning outdated software development approaches.

High-value assets and the need to collaborate

Program and business staff have to do better in partnering with CIOs to help ensure system security, Tompkins said. He noted that if there are breaches, “it’s the CIO’s butt on the line.”

Collaboration requires taking a harder look at what constitutes an HVA. “There are too many” HVAs, Bhagowalia said. Business continuity should be the deciding factor, he argued, with that status assigned to “only those systems you need to constitute your mission very quickly.”

Treasury has 329 systems, Bhagowalia said, and they are all HVAs. “I’m a business continuity professional,” he said. “It should be 10 percent of that.”

A balance must be struck between limiting people’s access to high-value data and not preventing them from doing their jobs. This reflects the federal government’s changing attitude about security: it has moved from “protect everything, share what you must” to “share everything, protect what you must.”

Young was more direct. He said he tells program officials, “If you think this is an [HVA], then fight for the money to protect it.”

“I don’t care if you turn them [systems] all off; I just want to protect the data,” Young said. Faced with that decision, Young noted, financial and email systems roll to the top in terms of importance. Less significant, so-called “birdwatching” systems fall away quickly, he added.

The argument for openness in development

When asked by briefing moderator Tim Larkins, immixGroup’s Market Intelligence director, about new approaches to software development, the panel was in agreement about the growing need to adopt open source and open data.

As a relatively new organization, Tompkins said CFPB found it useful to “steal as much source code as possible from other agencies.” Understanding what other agencies have done avoids a “Tower of Babel,” Tompkins said, where “everyone’s talking” but there’s no agreement.

Open data is equally important, Tompkins said, although getting good results from open data means starting with “good questions, data, analysis and decision-making.”

Ronzio acknowledged that his organization has been “trying to modernize for years,” but that the VHA is still using “1968 software development” architecture. The volume of data from connected devices is forcing VHA to think differently about software. The devices are creating a waterfall of information that “we have to track for 125-plus years.”

“How would you even think of planning for that?” Ronzio asked. “Nobody has. If your security systems can’t grow and mature, you’re going to have a vulnerability in the future that will be very difficult to overcome.”

Young stressed that his teams perform data modeling to understand how data affects mission fulfillment, and to identify whether some data actually requires its current level of security.

Young conceded that this means more risk tolerance for federal CIOs, pointing out that alternative means of data storage can reduce operating costs if that data does not need to be rapidly accessed.

The role of policy in security planning

When asked about the importance of regulations and policies like the Federal Information Technology Acquisition Reform Act, known as FITARA, Bhagowalia acknowledged its benefit, but pointed to the effort required to examine data and security in all the contract clauses. “It’s a vast amount of work” to make the appropriate transitions in legacy systems.

Young noted that FITARA makes technology and program experts collaborate and work together, but stressed that policy with all its components doesn’t automatically work “just because you write it on a piece of paper.”

“Some of the challenges we have are not just collaboration, but also program areas within the agencies,” Young explained. Policy requirements such as FITARA can be used “as a tool to get the program areas to comply.”

Still, policy can be a double-edged sword when it comes to securing data against cyberattacks. As Bhagowalia explained, “We have to defend and be right 100 percent of the time, while the adversary only has to be right one percent of the time. We’re a nation of laws and they’re not.”

Overcoming entrenched thinking is at the heart of better success in cybersecurity. “I look at five things across a lifecycle: people, policy, process, technology and governance,” Bhagowalia said. “You have too many policies, too much culture in the governance side of it, and that’s why (securing) data is becoming a challenge.”

Lloyd McCoy Jr. is immixGroup’s Department of Defense consultant on the Market Intelligence team. Connect with him on LinkedIn at www.linkedin.com/in/lloydmccoy.
