NIST Seeks Feedback on the Big Data Framework Development
The National Institute of Standards and Technology (NIST) is seeking public comment on a draft publication of the NIST Big Data Interoperability Framework, part of a major collaborative effort to develop a standard framework that makes it easier to use “Big Data” sets for analytics.
“One of NIST’s Big Data goals was to develop a reference architecture that is vendor-neutral, and technology- and infrastructure-agnostic to enable data scientists to perform analytics processing for their given data sources without worrying about the underlying computing environment,” said NIST’s Digital Data Advisor Wo Chang.
The seven-volume publication of Big Data foundational documents, once completed, will serve as the U.S. input to the international standards community.
Big Data analytics has become a powerful tool for risk-based security efforts, and is crucial for next-generation live attack intelligence, which requires the processing of hundreds of terabytes of raw data daily.
“Big Data is the term used to describe the deluge of data in our networked, digitized, sensor-laden, information-driven world. Big Data collections are measured in trillions of bytes (terabytes) and thousands of trillions of bytes (petabytes),” NIST said.
“The data range from text, images and audio collected from social media to the output from physics experiments that deliver data points 40 million times a second. New technology is evolving to harness the rapid growth of data.”
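The quoted figures invite a quick back-of-envelope check of why such experiments land in Big Data territory. The sketch below uses the article's "40 million times a second" rate; the 100-byte record size is a hypothetical assumption for illustration only, since the article does not say how large each data point is.

```python
# Back-of-envelope scale estimate for the data rate quoted above.
# BYTES_PER_EVENT is a hypothetical assumption, not a figure from the article.

EVENTS_PER_SECOND = 40_000_000   # "data points 40 million times a second"
SECONDS_PER_DAY = 86_400
BYTES_PER_EVENT = 100            # hypothetical record size for illustration

events_per_day = EVENTS_PER_SECOND * SECONDS_PER_DAY
terabytes_per_day = events_per_day * BYTES_PER_EVENT / 10**12

print(f"{events_per_day:,} events/day")
print(f"~{terabytes_per_day:.0f} TB/day at {BYTES_PER_EVENT} bytes/event")
# → 3,456,000,000,000 events/day
# → ~346 TB/day at 100 bytes/event
```

Even under this modest assumed record size, a single such source lands in the hundreds-of-terabytes-per-day range the article associates with Big Data workloads.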
NIST says there is broad consensus among commercial, academic and government leaders that the effective use of Big Data has the potential to spark innovation, fuel commerce, and drive progress across multiple sectors. The availability of vast data resources also carries the potential to answer questions previously out of reach.