How agencies can master their data to deliver better services

Harnessing federal data for AI and faster analytics requires a foundation of enterprisewide data management technology — and good governance strategies.

Deadlines to meet the Federal Data Strategy 2020 Action Plan are approaching quickly. Federal agencies must prioritize actions to establish processes, build capacity and align efforts that leverage their data to better serve the public. The first step of the action plan requires federal leaders to identify the agency’s data needs by September 2020.

While many large agencies started down this path before the release of the mandate, small- to mid-sized agencies are struggling to meet the same requirements with less budget, talent or tools.

Sherry Bennett, Ph.D., Chief Data Scientist, DLT, and Michael Anderson, Chief Strategist, Public Sector, Informatica

As we talk to agency leaders and chief data officers (CDOs), it’s not that data management technology isn’t readily available. The problem CDOs and program leaders face is putting all the pieces in place to develop a coherent data strategy and having the means to fully assess the state of their data resources.

First steps for holistic data programs

Perhaps the most important action agencies can take right now is integrating their CDOs into the lines of business. This ensures the CDO understands what data is essential to the organization’s core functions and operational strategies — and can respond more effectively when tackling step one of the action plan: identifying what data the agency has across all its systems.

Putting CDOs at the table with program leaders will place them in a stronger position to implement the federal strategy over the next 10 years. Agencies are required to spell out specific strategies, governance and policies to manage their data more successfully. They also need to add iterative tasks to their processes, including inventorying and cataloging their data.

This means answering questions like: Where is the information located? Is it in systems that program leaders aren’t aware of, or don’t have access to? Most organizations we talk with face the same challenge — data is replicated across multiple systems, making it difficult to know which is the authoritative source.
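The inventory step described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration — the system and dataset names are invented, not drawn from any real agency — showing how a simple catalog of which datasets live in which systems immediately surfaces replication that needs an authoritative-source decision:

```python
from collections import defaultdict

# Hypothetical inventory entries: (system, dataset name, record count).
inventory = [
    ("grants_db", "vendor_contacts", 12_400),
    ("crm", "vendor_contacts", 11_900),
    ("finance_erp", "invoices", 88_000),
    ("crm", "case_notes", 54_000),
]

def find_replicated(entries):
    """Group datasets by name and flag any that appear in more than one system."""
    by_name = defaultdict(list)
    for system, name, _count in entries:
        by_name[name].append(system)
    return {name: systems for name, systems in by_name.items() if len(systems) > 1}

replicated = find_replicated(inventory)
# "vendor_contacts" appears in both grants_db and crm, so someone must decide
# which copy is authoritative before the data can be trusted for analytics.
```

Even this toy catalog makes the governance question concrete: every dataset that shows up in more than one system represents a decision the CDO and program leaders have to make together.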

Many CDOs and program leaders may not know where to start. Starting small, with a specific, high-value project that supports the agency’s strategic priorities, helps them demonstrate the value of data to the rest of the operation. The value of these projects compounds over time and secures more buy-in from leaders across the enterprise.

Imagine how much better leaders can perform when they have solid evidence based on clean data, collected at the level of detail they need — whether on policy, activities, actions, regulations or operations. And with so many new varieties of data — text, images, video and more — properly managed data can be leveraged to enhance decision-making.

Take the example of an agency considering a digital transformation project for its financial system. If the CDO and program manager collaborate early on the data needed from the financial system, and complementary information can be integrated from other sources, leaders will have a clearer view of operations to realize cost efficiencies and optimize resources across the agency.

Getting to the bigger picture

Assessing the data is only part of the task. The quickest way to deliver specific, high-value data projects is to help project owners demonstrate, with data they can trust, how they are meeting the agency’s critical core functions. So ensuring processes are in place to keep data clean and properly cataloged has never been more important.

A good place to start on an initial data project is the governance structure — making sure rules, workflows and a common lexicon are in place to manage data over its lifecycle. Even as projects begin to deliver improved results, a lot of tuning will still be required to refine guidelines for ETL (extract, transform, load) tools and decisions on how to integrate data from different systems. Governance takes on added importance in defining automation rules, establishing metadata standards and integrating the components of an enterprisewide data management system.
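One concrete form those governance rules can take is a set of declarative data-quality checks applied as records move through an ETL pipeline. The sketch below is a simplified, hypothetical example — the field names and rules are illustrative, not any particular agency's standard:

```python
import re

# Governance-defined quality rules, one validation function per field.
# These rules are illustrative assumptions, not a real agency standard.
RULES = {
    "email": lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "fiscal_year": lambda v: isinstance(v, int) and 2000 <= v <= 2100,
    "agency_code": lambda v: isinstance(v, str) and len(v) == 4,
}

def validate(record):
    """Return the list of fields in a record that fail their governance rule."""
    return [field for field, check in RULES.items()
            if field in record and not check(record[field])]

failures = validate({"email": "not-an-email",
                     "fiscal_year": 2021,
                     "agency_code": "TREA"})
# -> ["email"]
```

Keeping rules declarative like this — data separate from the pipeline code — is what lets a governance team refine them over time without rewriting the ETL jobs themselves.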

Governance will be a pivotal tool as agencies develop artificial intelligence capabilities. AI needs quality data at massive scale. Without a solid governance framework and data at scale, AI won’t deliver the desired outcomes — or potentially worse, it will deliver faulty outcomes.

A fully capable data management system should give agencies a platform to master their data. With a centralized solution, program leaders will have the ability to manage governance and cataloging rules, maintain data quality and effectively extract, integrate and analyze information across all systems. A master data program also brings data together into a single source of truth, allowing a trusted 360-degree view of a person, place or thing regardless of the number of sources and types of data.
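The "single source of truth" idea can be illustrated with a toy consolidation routine. This is a minimal sketch under a simple, assumed survivorship rule (the most recently updated non-empty value wins); real master data management platforms apply far richer matching and survivorship logic, and the source records here are invented:

```python
def golden_record(records):
    """Merge per-source records for one entity; newest non-empty value wins."""
    merged = {}
    # Process oldest first so later (newer) values overwrite earlier ones.
    for rec in sorted(records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field != "updated" and value:
                merged[field] = value
    return merged

# Hypothetical records for the same vendor from two different systems.
sources = [
    {"updated": "2019-05-01", "name": "Acme Corp",
     "phone": "555-0100", "address": ""},
    {"updated": "2020-02-14", "name": "Acme Corporation",
     "phone": "", "address": "1 Main St"},
]

view = golden_record(sources)
# -> {"name": "Acme Corporation", "phone": "555-0100", "address": "1 Main St"}
```

The merged view keeps the newer name and address while preserving the phone number that only the older record supplied — a miniature version of the trusted 360-degree view described above.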

Being able to master data gives agencies the potential to empower better decision-making across the agency, improve services for citizens and operate more efficiently at scale.

Michael Anderson is the Chief Strategist, Public Sector at Informatica. He has over 27 years of experience in executive leadership and strategy for federal government organizations.

Sherry Bennett is the Chief Data Scientist for DLT. She has more than 25 years of data science and IT experience, working with public sector organizations on emerging innovations, data analytics, AI and machine learning strategic initiatives.

Learn more about data architecture and management to deliver better insights into the mission.
