The federal government produces or collects petabytes of information daily, including mission and operational data, agency and employee records, social media content and basic business documents. Without strategic guidance, agencies tend to save everything, adding unnecessarily to storage, data management and protection costs. Compounding that challenge is the growing need to manage data in the cloud and across hyper-converged infrastructure environments.
As more and more information is generated, decisions must be made about where it goes, who gets access, whether it is formatted or unformatted, whether it has been cleaned, whether it should be encrypted at rest and in transit, and how to use it to make better decisions.
A new FedScoop Tech Brief explores the current challenges of data management and protection in a changing technology landscape, making key recommendations for what to look for in new data management tools that maximize the value of data while minimizing risk and cost.
New data management tools can help provide a real-time, 360-degree view of the data residing across an agency’s IT ecosystem, ultimately driving efficiencies, lowering costs and mitigating risks. This new generation of data management tools can help federal agencies automate the processes of classifying, archiving and discovering data, as well as ensuring that data is protected, regardless of where it resides or where it travels.
According to Veritas, which provides data and information management solutions for government, decisions about data management may be more difficult than many agency executives appreciate. Agencies are juggling federal mandates from both the current and former administrations regarding security, the transition to electronic records by 2019 and data center consolidation. Security is about more than guarding against intrusions — it is about protecting the integrity of the data and making sure it only gets into appropriate hands.
There is also the challenge of “dark data”: information that organizations collect but that does not contribute to business activities, and whose “owner” often cannot be determined. Veritas estimates that organizations spend 52 percent of their storage budget on dark data. Another aspect of dark data is duplication, the same information stored in multiple locations. Identifying which copy is the original and most current, and which are duplicates, also dictates what should be stored, what should be moved to the cloud and what should be cleaned.
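A common first pass at surfacing exact duplicates, sketched minimally here in Python (not a Veritas tool, just an illustration of the general technique), is to group files by a hash of their contents: files with the same digest are byte-for-byte copies, and file timestamps can then help identify which copy is most current.

```python
import hashlib
from collections import defaultdict
from pathlib import Path


def find_duplicates(root: str) -> dict[str, list[Path]]:
    """Group files under `root` by SHA-256 content hash.

    Any group with more than one path holds byte-for-byte duplicates.
    """
    groups: dict[str, list[Path]] = defaultdict(list)
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            groups[digest].append(path)
    # Keep only the groups that actually contain duplicates.
    return {h: paths for h, paths in groups.items() if len(paths) > 1}
```

Enterprise platforms do this at scale with incremental indexing rather than rehashing every file, but the underlying idea of content-based deduplication is the same.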
Data management platforms have evolved significantly beyond single business intelligence tools. The best-of-breed offerings today provide a combination of capabilities, delivered “as-a-service,” allowing agency CIOs to move away from capital investments to more flexible, controllable operating expenditures.
Download the Tech Brief for details on how data management platforms help agencies:
- Establish end-to-end visibility of data across hyper-converged infrastructure
- Automate the mapping of data and application interdependencies
- Enable controls for migrating data across multiple data centers and cloud platforms easily and efficiently
- Provide unified data protection, improving data portability and resiliency
- Generate more powerful insights and actionable intelligence through data mapping, archiving, e-discovery and other tools
The tech brief also explores how data management platforms tie back to three core elements of information governance: information availability, information protection and information insight.
Putting data to work effectively involves knowing what data agencies have, where it’s located, who’s using it and how it can drive better decision making. Having a 360-degree view of the scope, sprawl and condition of data can make moving data to the cloud significantly easier, and give agencies greater control.
Download the tech brief for more on what a 360-degree view of data management looks like, and how to improve data insights and security.
This article was produced by FedScoop for, and sponsored by, Veritas.