Inside what made one NASA office undertake a massive data architecture overhaul


Las Vegas — In 2013, an astronaut’s helmet filled with water during a harrowing spacewalk, driving investigators to issue a series of recommendations, including a call for NASA’s data to be more readily available and easily searchable.

And so the saga of a watery spacesuit ultimately led to improvements in the space agency’s IT, too.

At the time of the incident, caused by a clogged filter, data that might have shed light on the problem was locked into proprietary systems, each with a separate login.

Cuong Nguyen, IT manager in NASA’s EVA office, led the effort to build a data integration platform with single sign-on, giving agency staff “enough data at their fingertips” to mitigate problems, or prevent them before they occur.

Last October, Nguyen’s office went live with phase one: an initial capability to mine data across the different systems. But the job isn’t done, Nguyen said in an interview with FedScoop at Amazon Web Services’ re:Invent conference.

Nguyen said he’s now working to migrate more datasets to the cloud, and build out applications. And this initial project raised another issue around contracting that he is trying to work out for the future: How can NASA procure services with open source data architectures, or with data it can completely own?

He told FedScoop that vendors “kind of hold us blackmail, I hate to use that word, but you kind of have to go to them for the data … instead of putting the data [somewhere] centrally located.”

Much of the design and certification data is proprietary and held by the respective contractors, Nguyen said.

For future procurement, then, Nguyen is focused on “What kind of platform do I need to have, what kind of requirements do I need to levy … so we don’t have to run into this similar problem in the future.”

Nguyen’s electronic data integration project is hosted in AWS GovCloud under NASA’s WESTPrime contract with provider InfoZen, an option he said he chose after evaluating several other solutions, including hosting on premises.

“After looking around it turned out that this is probably the best system for us to be on because of the ease of integrating with other systems,” Nguyen said.

Nguyen told FedScoop what the 2013 scenario might have looked like, had it happened now. He ran through the searches he would make: First, he would type in the crew member’s name and look for the actual vehicle he went out on. Then he would click on the equipment he actually went out with, to check for what parts might have water, and therefore cause a leak.

He also might then look for what subsequent failures might happen after that specific spacesuit pump fails, based on data contained in the once-siloed databases.

“All of that I can do fairly quickly, within seconds, minutes, depending on how much I traced down the serial numbers and so forth,” Nguyen said. “Before, I had to log into the suit system, the engineering side of it. I have to log into the safety system to look at the DRs. That’s assuming that I can tie all of that stuff together in my head. Right?”

He added: “And I don’t have to log into multiple places either, because some data I migrated into the cloud, some data are sitting at different data centers. With that example I accessed data in the cloud, I accessed safety data out at Ames, and I also accessed a local [Johnson Space Center] server, all three within that one query.”
