
Inside the FCC’s risky IT overhaul

FCC is in the midst of a total IT overhaul after its legacy systems produced very public failures. Can David Bray lead the agency out of its antiquated past?

There are a number of ways you could describe what happened when John Oliver’s call to action over new net neutrality rules crashed the Federal Communications Commission’s website. A catastrophe. A disaster. A debacle.

FCC Chief Information Officer David Bray calls it a “distraction.”

Referring to the ordeal as a distraction may come off as presumptuous or aloof, but for Bray, it was a wrench thrown into plans already underway. June’s website crash laid bare to the public FCC’s badly aging IT systems just as Bray was overseeing the early stages of a sweeping plan to overhaul and modernize the commission’s computer systems.

While the June crash brought FCC’s dated systems to the forefront, Bray had been working to overhaul the commission’s IT infrastructure since he came into the CIO position in late 2013, mapping out the tangled system mess the agency had been using for nearly 20 years. After meeting with the different bureaus and offices, Bray found FCC was operating with 207 separate mission systems, some of which hadn’t been touched since the ’90s. Systems were still running on Sybase PowerBuilder and Adobe ColdFusion. Like the software, the data centers were extremely antiquated, with many lacking proper heating, ventilation and air conditioning systems.


A quick fix was not in the cards.

After getting the lay of the land, Bray reached out to people across the federal government and assembled a team that could perform what he calls the “lift and shift”: an assessment of which systems work, which can be eliminated and where commercially available technology could modernize the agency’s infrastructure. The plan, which will play out over the next three years, will eventually leave FCC’s systems sitting on top of a “data lake,” letting the agency avoid ever having to wait two weeks, let alone two decades, to adjust to a changing technology landscape.

A rational beginning

For Bray’s team, a key portion of 2014 was spent on a rationalization process in which those 207 systems were picked apart. Helping were two people Bray plucked from the Department of Homeland Security: Mary Ellen Seale and Chuck Aaron. Seale’s first task was to conduct several audits of FCC’s systems: a full sweep of the entire enterprise that included a “cyber hygiene” examination, a vulnerability assessment and a complete inventory of FCC’s existing hardware. Once every system was accounted for, Bray and his team put each one on the chopping block.

“We [decided that we] are going to go through a rationalization process that looks at all the systems and make a decision in concert with other offices and say, ‘Should we keep this, divest it, modernize it, move to a different version or completely re-engineer it?'” Seale, now FCC’s deputy CIO of resiliency, told FedScoop.


Bray said this process helped undo 20 years of requests for new systems from past administrations and Capitol Hill, which is how FCC stockpiled its bloated portfolio of outdated systems.

“The reason why we have so many systems in place at the FCC is generally when there is a new request from the administration or from [Congress], the response was, ‘Let’s build a new system to meet that request,'” Bray said. “Having this rationalization approach will let us be the most effective with the dollars we have.”

Over six months, FCC eliminated more than half of its systems, dropping the total from 207 to 94. And that was just the first step: Seale said the agency is now going through a further “business decomposition” with its offices across the country, figuring out how to move its business processes into reusable software components.

“One of the things we hope to do in the rationalization process is figure out which system gets us the most bang for the buck,” Seale said, adding that the team is also deciding what coding steps need to be done for each system so they’re eventually interchangeable.

One of the big things this process uncovered was a host of consumer-facing and back-office applications FCC could eliminate by moving to the cloud. During the summer and into October, FCC moved to Microsoft Office 365, which not only retired an outdated legacy system but also added a layer of security, since the old software’s vulnerabilities are no longer tied directly to the agency’s data centers.


Aaron said the move to Office 365 went against the grain of the federal government’s standard “fix-what-we-have-and-move-on” mantra, partly because the systems were considered state-of-the-art when they were first installed.

“When the systems were built originally by the FCC, they were built with some of the best equipment that could be purchased at the time,” Aaron, now FCC’s chief enterprise architect, told FedScoop. “The problem is that technology today has outstripped those [systems] to the point where the vulnerabilities on those things are a gaping hole. We are now forced to catch up to the technology today. The problem is because we are on old operating systems, it’s not as simple as to just copy it into a newer server that would provide some of the things that you would need, because now I can’t do that. I actually have to rework these things to make them go on,” he said. “We will move [new systems] very quickly to the cloud in order to use them right away. It’s the older things that we have to worry about.”

The ‘TurboTax’ approach

This line of thinking helped Bray establish a strategy for applying what his team learned from the rationalization process: adapt the systems already in-house; if that fails, find a commercial solution to integrate into the agency; and only if no off-the-shelf product fits, consider writing custom code.

While Bray warned people within the agency that this process was “going to be a huge jump,” any headaches were worth the cost savings: He estimates that moving software to the cloud saved the agency around $500,000, whereas building new infrastructure in-house could have cost close to $3 million.


“So then as a value proposition to the public and [Capitol Hill] is, one, moving to the cloud gets us better security than if we tried spending money to get it done ourselves, we would just not be able to provide it because we are a small agency,” Bray told FedScoop. “Two, it gives us agility in a sense that in the future when there is a new requirement from the administration or from the Hill to do a new system, we don’t actually [create the system], we stitch across the other parts of our quilt.”

Some of those new systems built on commercial software are already in place. Through the rationalization process, FCC started moving some of its consumer-facing applications onto software-as-a-service platforms like Zendesk and Appian, which eliminated more legacy systems and allowed the agency to become more agile. Seale said that with Zendesk, the team was able to reduce one 17-form process to a single user-friendly, straightforward sheet.

Aaron refers to these new applications as “TurboTax applications” because the agency can take its forms or other associated data, run it through these platforms and spool up new applications with relative ease.

“We do a lot of forms where someone who wants to come in and apply for a license for a radio or a television station, there’s 50 forms for that,” Aaron said. “But if you build a TurboTax, taxes are all the same. Licenses are all the same. It’s the questions you ask and how you lead them down and let the consumer know ‘What do I have to fill out?’ and we will pre-populate the form based on the questions they answer. So it’s the ease of use for the consumer, and the idea that we can use a platform, and take it into anything: auctions, regulations, or any of these [subjects], then build on top of it like a layer cake.”
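In rough code terms, that pattern looks something like the sketch below: one shared question flow pre-populating different license applications, the way Aaron describes leading a consumer through questions. Every form ID and field name here is hypothetical, invented purely for illustration; FCC’s actual platforms operate at far larger scale.

```python
# A minimal sketch of the "TurboTax" pattern Aaron describes: one shared
# question flow drives many different forms. Every form ID and field name
# below is hypothetical, invented purely for illustration.

FORM_TEMPLATES = {
    # Each template maps a license type onto the fields the final form needs.
    "radio": {"form_id": "LICENSE-R-01", "fields": ["call_sign", "community"]},
    "television": {"form_id": "LICENSE-T-01", "fields": ["call_sign", "community"]},
}

def prepopulate(answers: dict) -> dict:
    """Build a pre-filled application from the consumer's questionnaire answers."""
    template = FORM_TEMPLATES[answers["service_type"]]
    return {
        "form_id": template["form_id"],
        "fields": {name: answers[name] for name in template["fields"]},
    }

if __name__ == "__main__":
    answers = {"service_type": "radio", "call_sign": "WXYZ", "community": "Springfield"}
    print(prepopulate(answers))  # one flow of answers, one pre-filled form
```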

Data and more data


Whatever metaphor FCC wants to use (layer cake, quilt, data lake), what Bray’s team is describing is the end product of its modernization process: making FCC the guardian of a vast repository of data that sits behind applications built on cloud software, allowing the agency to move at the current speed of technology instead of being dragged down by its aging legacy systems.

Aaron said FCC will use the Atlas platform, an infrastructure system developed by the Defense Advanced Research Projects Agency and used at DHS, to collect all the data the agency keeps in on-premises applications, then use that archive to feed everything on the front end, be it internal business applications or consumer-facing ones. Aaron sees three advantages to this model: It will help the agency share data, leverage predictive analytics and feed other systems across applications so users no longer have to navigate a maze of different options.
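The shape of that model can be sketched in a few lines: one shared store, with every front end reduced to a thin query layer on top of it. In the hedged example below, SQLite stands in for the shared data platform, and both front ends are hypothetical; the details of FCC’s Atlas-based build are not public.

```python
# A hedged sketch of the "one data layer feeds every front end" model.
# SQLite stands in for the shared data platform; both front ends are
# hypothetical, written only to show the shape of the architecture.

import sqlite3

def build_store() -> sqlite3.Connection:
    """Stand-in for the shared data layer: one store, many consumers."""
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE licenses (call_sign TEXT, status TEXT)")
    db.executemany(
        "INSERT INTO licenses VALUES (?, ?)",
        [("WXYZ", "active"), ("KABC", "pending")],
    )
    return db

def internal_dashboard(db: sqlite3.Connection) -> list:
    """An internal business application reading from the shared layer."""
    return db.execute("SELECT call_sign, status FROM licenses").fetchall()

def consumer_lookup(db: sqlite3.Connection, call_sign: str) -> str:
    """A consumer-facing application reading from the same shared layer."""
    row = db.execute(
        "SELECT status FROM licenses WHERE call_sign = ?", (call_sign,)
    ).fetchone()
    return row[0] if row else "not found"

if __name__ == "__main__":
    db = build_store()
    print(internal_dashboard(db))       # feeds the back office
    print(consumer_lookup(db, "WXYZ"))  # feeds the public front end
```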

“We will really be an agile environment where all of our builds will be based around building that small front-end piece,” Aaron said. “We won’t have to go back and build this from the start. We are really approaching this holistic view of data as static, but we feed the front end, and then we move to that [agile] piece.”

“The nice thing about the common data platform that we’re working toward is that you can say this can easily be pulled by the public either through a user interface or through an API if they want to do bulk downloads,” Bray said. “That’s the model we want to get to, the FCC [being] really a trusted data broker on issues related to telecommunications.”
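A “trusted data broker” in that sense might expose its data roughly like the client below, which pages through a bulk-download endpoint instead of scraping a web interface. The URL, parameters and response shape are all hypothetical; no such FCC endpoint is described in this story.

```python
# Hedged sketch of the public side of Bray's model: the same data reachable
# through a UI or, for bulk users, an API. The endpoint, parameters and
# response shape below are hypothetical, not a real FCC API.

import json
import urllib.parse
import urllib.request

BASE_URL = "https://data.example.fcc.gov/comments"  # hypothetical endpoint

def bulk_download(proceeding: str, page_size: int = 1000):
    """Page through a proceeding's comments instead of scraping the UI."""
    offset = 0
    while True:
        query = urllib.parse.urlencode(
            {"proceeding": proceeding, "limit": page_size, "offset": offset}
        )
        with urllib.request.urlopen(f"{BASE_URL}?{query}") as resp:
            batch = json.load(resp)["comments"]
        if not batch:
            return
        yield from batch
        offset += page_size

# Usage, against a real endpoint with this shape:
#   for comment in bulk_download("14-28"):
#       process(comment)
```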

The “trusted data broker” title would be a change from how some view the current FCC system. Even as Bray’s team works through its overhaul plan, problems keep arising with the legacy systems still in place. Prior to the end of 2014, the chief technology officer of the open Internet advocacy group Fight for the Future claimed FCC ignored hundreds of thousands of pro-net-neutrality comments filed to its Electronic Comment Filing System (ECFS), a claim that fueled media reports that the broader public opposed net neutrality.


The agency responded that, due to an error with an Apache tool it used to upload the comments, the missing comments were never transferred to the XML files FCC publicly released. Yet in a blog post about the error, Bray noted that ECFS itself continues to operate only because it has been patched together “MacGyver-style.”

“We think it’s important that people understand that much of the confusion stems from the fact that the Commission has an 18-year-old Electronic Comment Filing system (ECFS), which was not built to handle this unprecedented volume of comments nor initially designed to export comments via XML,” Bray wrote in December.
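The failure mode Bray describes is a common one in batch exports. The agency has not published the exact bug, but the sketch below shows, in purely illustrative form, how an exporter that silently skips malformed records can drop comments without anyone noticing, and how reconciling input counts against output counts catches the loss.

```python
# An illustrative reconstruction, not the FCC's actual code: a batch XML
# exporter that skips malformed records can lose data silently, which is
# why reconciling input counts against output counts matters.

import xml.etree.ElementTree as ET

def export_comments(comments: list) -> tuple:
    """Export comments to XML, counting what was skipped along the way."""
    root = ET.Element("export")
    skipped = 0
    for comment in comments:
        try:
            node = ET.Element("comment", id=str(comment["id"]))
            node.text = comment["text"]  # KeyError if the record is malformed
        except (KeyError, TypeError):
            skipped += 1  # a bare `continue` here is how records vanish quietly
            continue
        root.append(node)
    return ET.tostring(root, encoding="unicode"), skipped

if __name__ == "__main__":
    comments = [
        {"id": 1, "text": "Preserve net neutrality."},
        {"id": 2},  # malformed record: missing its text field
    ]
    xml_out, skipped = export_comments(comments)
    written = xml_out.count("</comment>")
    assert written + skipped == len(comments)  # reconciliation catches silent loss
    print(xml_out, f"written={written} skipped={skipped}")
```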

Bray said as much during his interview with FedScoop for this story in November, claiming that ECFS was never built to handle the traffic it saw related to the commission’s open Internet proceedings. He estimates the 4 million comments tied to the proposed rule-making order represent 40 to 45 percent of the total number of comments ECFS has seen in its 18-year history.

Getting IT out of the way

The headaches from the public are not the only ones Bray deals with as the overhaul moves along. His team is still meeting with vendors about new cloud offerings, but it hasn’t been easy to find the right partner to help move legacy systems into the modern era. Aaron, Bray and Seale recounted a number of headaches with different companies, all of which came back to one key tenet: Whatever the company, it needs to understand just how outdated and convoluted FCC’s systems have become.


“I get probably 120 to 140 different emails a day from vendors, and it just goes to a folder, that’s that,” Bray said. “If you really want to build a relationship as a partnership, you need to take the time to understand the business, and if there is any message I can share with cloud vendors, it’s actually take the time to understand the business of the customer that you are actually working with.”

“Discovery is the long pole of the tent, to be quite frank,” Aaron said. “It’s like your house. If you lived in your house for 20 years, you don’t know half the things that are in your closet. A lot of data centers are that way. Getting somebody in that has the skill set to crawl through and understand how this works is the hard part. That’s been our real challenge, making sure we can find somebody that we are comfortable with, who can walk away and tell you ‘OK, we know what you have, and we can help you get to where you have to go.’ We’re just about there.”

Yet with net neutrality rules expected in February, along with various auctions and merger rulings to come, FCC is facing a spotlight it has never had to deal with before. Every outage of an old system quickly turns into a magnified distraction. Bray’s team knows this, which is why it is moving as fast as it can to get out of the way of the John Olivers of the world.

“We could sit by passively and say, ‘Well, we’re doing the best we can.’ But it’s not a good answer,” Aaron said. “The conversation should never be about the IT staff. We should never be the topic of conversation any more than the referees in a game should be the topic of conversation. It can’t be about the IT staff. It’s our job to get us out of the topic and get the real topic out in front.”

“There are risks to making this move,” Bray said. “At the same time, I think there’s even more risk to doing nothing. Part of being a CIO is being a chief risk officer. So while there are risks taking this path, I think we’re being much more deliberative about mitigating those risks. If we just sat on our laurels, that would be even riskier.”


Written by Greg Otto

Greg Otto is Editor-in-Chief of CyberScoop, overseeing all editorial content for the website. Greg has led cybersecurity coverage that has won various awards, including accolades from the Society of Professional Journalists and the American Society of Business Publication Editors. Prior to joining Scoop News Group, Greg worked for the Washington Business Journal, U.S. News & World Report and WTOP Radio. He has a degree in broadcast journalism from Temple University.
