New NIST framework to address big data fundamentals

NIST's seven-volume Big Data Interoperability Framework aims to clarify the underlying concepts behind the architecture and technology that drive big data practices.

The National Institute of Standards and Technology is asking the public to help answer the fundamental questions surrounding big data.

NIST released the first draft of its Big Data Interoperability Framework earlier this week, a seven-volume series that looks to create a secure and effective roadmap for how the country moves forward with collecting, sharing and analyzing massive amounts of data.

Each framework volume tackles a separate topic, aiming to establish a common vocabulary and taxonomies, set baselines for security and privacy, and demonstrate the wide range of uses for big data.

“The availability of vast data resources carries the potential to answer questions previously out of reach,” the agency wrote in a release. “Questions such as: How do we reliably detect a potential pandemic early enough to intervene? Can we predict the properties of new materials even before they’ve been synthesized? How can we reverse the current advantage of the attacker over the defender in guarding against cybersecurity threats?”

The framework grew out of a NIST working group's outreach to the private sector, academia and government, which found that while big data holds vast potential, it is overwhelming traditional practices faster than enterprises can manage. The draft is a first step toward bringing order to both the public and private sectors; the federal government has already devoted $200 million to 80 projects, spread across six agencies, that aim to improve the tools needed to access and analyze huge volumes of digital data.

“One of NIST’s Big Data goals was to develop a reference architecture that is vendor-neutral, and technology- and infrastructure-agnostic to enable data scientists to perform analytics processing for their given data sources without worrying about the underlying computing environment,” said NIST’s Digital Data Adviser Wo Chang.
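Chang is describing a design principle rather than a specific implementation, but a minimal sketch can make the idea concrete. The Python below is purely illustrative and not drawn from NIST's draft: the `DataSource` interface, `InMemorySource` backend and `mean_of` function are hypothetical names, showing how analytics logic can be written against an abstraction so it never depends on the underlying computing environment.

```python
# Illustrative sketch only -- not NIST's reference architecture.
# Analytics code targets an abstract data-source interface, so the same
# logic runs whether the data lives in memory, files, or a cluster.
from abc import ABC, abstractmethod
from typing import Iterable


class DataSource(ABC):
    """Abstract data source; concrete backends implement one method."""

    @abstractmethod
    def records(self) -> Iterable[dict]:
        ...


class InMemorySource(DataSource):
    """Trivial backend used here so the sketch is self-contained."""

    def __init__(self, rows: list[dict]):
        self._rows = rows

    def records(self) -> Iterable[dict]:
        return iter(self._rows)


def mean_of(source: DataSource, field: str) -> float:
    """Analytics logic that never touches the storage layer directly."""
    values = [row[field] for row in source.records()]
    return sum(values) / len(values)


if __name__ == "__main__":
    source = InMemorySource([{"temp": 20.5}, {"temp": 22.1}, {"temp": 19.8}])
    print(mean_of(source, "temp"))  # prints 20.8
```

Swapping `InMemorySource` for a backend over a distributed filesystem or cloud object store would leave `mean_of` untouched, which is the vendor-neutral, infrastructure-agnostic property the quote describes.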

The framework will go through three stages; NIST is accepting public comment on the first stage until May 21.

The draft is available on the NIST Big Data Working Group's website.

Written by Greg Otto

Greg Otto is Editor-in-Chief of CyberScoop, overseeing all editorial content for the website. Greg has led cybersecurity coverage that has won various awards, including accolades from the Society of Professional Journalists and the American Society of Business Publication Editors. Prior to joining Scoop News Group, Greg worked for the Washington Business Journal, U.S. News & World Report and WTOP Radio. He has a degree in broadcast journalism from Temple University.
