The Department of Health and Human Services’ 29 component agencies face a number of roadblocks to sharing valuable health data — chief among them the fact that there’s simply no standard process or protocol for such interactions.
This challenge, and four others, is documented in a new report out of the Office of the Chief Technology Officer at HHS. The report, written by the office’s “data initiative team,” surveys the current state of data sharing at HHS, with an eye to increasing what is shared.
“While the value proposition for open data has taken root in the marketplace, government agencies must likewise use its data as a strategic asset,” the report states. “Across the federal government, there is growing consensus in the value and promise of data governance to reduce inefficiencies and costs.”
Through conversations with senior leaders and interviews focused on various specific datasets, the data initiative team settled on five areas where HHS must improve on its way to becoming a more “data-driven” agency.
The first, simply, is process.
“HHS lacks consistent and standardized processes for one agency to request data from another agency,” the report states. Because there’s no standard process, agencies find themselves improvising — some use memorandums of understanding for the transaction, others interagency agreements. Furthermore, agencies face no consequences for failing to provide access to interagency data in an appropriate manner.
“The lack of standardization at the departmental level for data governance and sharing, the lack of accountability for timely response to requests, and the fact that data are largely kept in silos, often results in HHS agencies having no means to access interagency data in an efficient way,” the report finds.
Next is technology: While the “majority” of agency data comes in machine-readable formats, it is not the “widespread default” yet. Agencies could also do a better job collaborating on acquisitions of data processing software — “there are significant redundancies in the instances of technologies across the Department,” the report states.
Other challenges include existing privacy regulations, some of which, the report suggests, might need to be amended; disclosure risk management; and resource constraints — data sharing might not be seen as a core part of any one staff member’s job.
“We cannot advertise the availability of the data to other agencies because we don’t have enough expertise [or resources] to process the requests when they come in,” one agency employee told the report’s authors.
‘Think of ourselves as a single organization’
If much of this sounds familiar, it should. The challenges to data sharing within HHS have been a key talking point for Chief Data Officer Mona Siddiqui for some time. She and former CTO Bruce Greenstein discussed the “stumbling blocks” to data sharing on a panel at South by Southwest in March. She also discussed the issue with agency data science practitioners as part of a training program at the beginning of 2018.
“We need to start to think of ourselves as a single organization as we think about how to service the needs of Americans in the best way possible with the resources that we have,” Siddiqui told FedScoop in May. “We have to be able to connect the information that is residing in different parts of the department if we want to be able to use our resources to the best capacity possible.”
This report, however, puts formal research behind acknowledged weaknesses. So what happens next?
“Each of the areas highlighted in this report will need to be incrementally but persistently addressed,” the report concludes, stating that “evaluation” of potential next steps has already begun. “Ultimately, success will require a long-term investment, continued collaboration, and the iterative demonstration of value from data to drive the culture change essential to transforming HHS.”