NOAA advances model to expand open data access

The National Oceanic and Atmospheric Administration is getting closer to solving a dilemma facing many federal agencies: how to find the resources to move more of its untapped open data into the public domain.

NOAA has a long history of making torrents of data available to private enterprises. New York University’s Governance Lab estimates that NOAA’s open, machine-readable data alone has spawned several billion-dollar industries and saves the U.S. more than $30 billion annually.

But NOAA “would do more if we were empowered (and had the resources) to do more,” said David McClure Jr., a data asset portfolio analyst for NOAA, speaking Thursday at a forum on managing enterprise data sponsored by the trade group AFCEA Bethesda.

McClure said NOAA collects between 20 and 40 terabytes of satellite and other Earth data every day, but it has the resources to release only about a tenth of it.

“We’re trying to focus on a positive mosaic effect – to discover relationships in the data and pull those together and extract economic value,” McClure said. “We know the data has value and the private sector thinks it has value, but there’s a cost barrier” that must be surmounted, he said.

A solution is beginning to take shape, however, in the form of a big data partnership between NOAA and the private sector. If completed, the arrangement would help finance the creation of a cloud-based repository of NOAA data that would be supported by, and accessible to the public through, access fees.

(Credit: NOAA Big Data Partnership Project Presentation)

McClure likened the fee to what the public pays to enter the country’s national parks. The fees would help defray cloud startup and certain operating costs. They would also help solve a ticklish financial equation for private sector firms that want access to the data, by creating a funding mechanism that preserves equal access for everyone.

The concept, which took wing last February when NOAA issued a request for information and attracted responses from 70 organizations, gained new traction at the agency’s Oct. 17 industry day and could be ready to roll out early next year.

“There is still a lot of friction” in ironing out operational details, McClure said. Beyond the financing terms, NOAA must also establish protocols to ensure its data sets are properly channeled and maintained in the proposed cloud repository. And because the repository would hold copies of NOAA data, methods for authenticating those copies must also be worked out, he said.
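To make that idea concrete, one generic approach to authenticating a mirrored data set is for the authoritative source to publish a cryptographic checksum for each file so that consumers can recompute and compare it against the cloud-hosted copy. The short Python sketch below illustrates that pattern; it is an assumption about one possible method, not anything NOAA has specified.

    import hashlib

    def sha256_of_file(path, chunk_size=1 << 20):
        """Compute a SHA-256 digest, streaming the file in 1 MB chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def verify_copy(path, published_digest):
        """Check a mirrored copy against the digest published by the source."""
        return sha256_of_file(path) == published_digest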

The funding model is just one piece in a larger puzzle for government officials trying to make enterprise data more useful and discoverable.

At the heart of those efforts are continuing moves to create a common architecture for data interoperability across the government. That includes developing a data reference model that reflects the way data is increasingly consumed, said Michael Kennedy, an executive in the Information Sharing Environment office, which was created in the wake of the September 11 terrorist attacks.

ISE’s Michael Kennedy, NOAA’s Dave E. McClure Jr. and DHS’s Michael Simcock speaking at the AFCEA Bethesda forum Nov. 13, 2014. (Credit: AFCEA Bethesda)

“We need to move from data ownership to data stewardship,” Kennedy said at the forum. The ISE’s efforts to support that mantra will be reflected in the latest edition of the data aggregation reference architecture, due out at the end of November, he said. The document provides detailed guidance on data interoperability standards.

While much of the data used across the Department of Homeland Security isn’t intended for public use, DHS has still learned lessons in handling data that can add value to internal operations, said Michael Simcock, chief data architect for DHS. He noted how FEMA developed an API during Hurricane Sandy to make heavily requested disaster data more readily available and accessible. Using the API saved FEMA from having to create and maintain “thousands of Web pages” with the information, and it helped FEMA workers and the public get information more quickly.
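For illustration, the short Python sketch below shows how a client might pull disaster declarations from FEMA’s public OpenFEMA JSON service. The endpoint, query parameters and field names are assumptions based on the current public API, not the specific Sandy-era API Simcock described.

    import json
    import urllib.parse
    import urllib.request

    # Illustrative endpoint: the public OpenFEMA disaster declarations data set.
    BASE_URL = "https://www.fema.gov/api/open/v2/DisasterDeclarationsSummaries"

    def fetch_declarations(state="NY", limit=5):
        # OData-style query parameters, URL-encoded so spaces and quotes are safe.
        params = urllib.parse.urlencode({
            "$filter": f"state eq '{state}'",
            "$top": limit,
        })
        with urllib.request.urlopen(f"{BASE_URL}?{params}", timeout=30) as resp:
            payload = json.load(resp)
        # The top-level key mirrors the data set name; field names are assumptions.
        return payload.get("DisasterDeclarationsSummaries", [])

    if __name__ == "__main__":
        for decl in fetch_declarations("NY", 5):
            print(decl.get("declarationDate"), decl.get("declarationTitle"))

A single machine-readable endpoint like this is what lets an agency avoid hand-building static pages for every combination of state, incident and date that the public asks for.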
