There are a few key traits that successful federated data efforts — in which a specific kind of data is collected across disparate organizations — tend to have in common. The General Services Administration’s 18F and Office of Products and Platforms, through their U.S. Data Federation project, are trying to articulate exactly what those traits are.
The project, which grew out of an idea funded through GSA’s 10x program, is an attempt to create “a common language and framework” for understanding federated data efforts.
The goal? That future efforts will have resources, and perhaps even a playbook, to turn to.
“This type of [federated] data sharing effort is common in our distributed style of government, but we believe it has not been given the systemic investigation and infrastructural support it deserves — currently each such effort is treated as an isolated instance, with little sharing of tools or lessons from one effort to the next,” the project’s GitHub page reads.
The U.S. Data Federation aims to change that paradigm.
Drawing on interviews with experts and project leads on efforts like data.gov, Open311, opendataphilly.org and more, the team came up with nine “plays” to improve the chances of success of a federated data effort. These plays, published last year, include commandments like “Identify Use Cases With Demonstrated Demand,” “Allocate Proper Resources,” and “Nurture Early Adopters.”
But the project doesn’t end at giving this kind of academic advice. The team is also working to develop shared tools that can help decrease the startup cost involved in pursuing a federated data effort. The first of these tools was recently piloted in partnership with the U.S. Department of Agriculture’s Food and Nutrition Service (FNS).
In this case, the tool, “an open-source data aggregation and validation tool that can be accessed via a web interface or API,” was used to verify eligibility for free or reduced-price lunch. This is a typical federated situation — data is collected at the school level according to standards set by FNS, verified at the state level and finally sent to FNS. The 18F-developed tool, which was piloted in a few states, allows states to access one central system (instead of disparate spreadsheets) and automatically checks the uploaded data against FNS rules, highlighting errors. It smooths what can traditionally be a “lengthy and burdensome remediation process.”
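The core pattern described above — states upload data to one central system, which checks each record against centrally defined rules and highlights errors rather than silently rejecting the submission — can be sketched in a few lines. This is a minimal illustration only; the field names and rules below are hypothetical stand-ins, not FNS’s actual schema or the 18F tool’s implementation.

```python
# Sketch of rule-based validation in the style of the pilot tool:
# each uploaded row is checked against a shared set of rules, and
# violations are collected per row so the submitter can see every
# error at once. Fields and rules are illustrative, not real FNS rules.

def validate_rows(rows, rules):
    """Return a list of (row_index, field, message) for every violation."""
    errors = []
    for i, row in enumerate(rows):
        for field, check, message in rules:
            if not check(row.get(field)):
                errors.append((i, field, message))
    return errors

# Hypothetical rules a central agency might publish.
RULES = [
    ("school_id", lambda v: bool(v), "school_id is required"),
    ("enrolled", lambda v: isinstance(v, int) and v >= 0,
     "enrolled must be a non-negative integer"),
    ("eligible", lambda v: isinstance(v, int) and v >= 0,
     "eligible must be a non-negative integer"),
]

# Example upload: the first row is clean, the second has two problems.
rows = [
    {"school_id": "A-1", "enrolled": 500, "eligible": 320},
    {"school_id": "", "enrolled": -3, "eligible": 10},
]

for idx, field, msg in validate_rows(rows, RULES):
    print(f"row {idx}: {msg}")
```

The key design point is that validation reports all errors in one pass instead of stopping at the first failure — that is what shortens the “lengthy and burdensome remediation process” the article mentions.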
Moving forward, the Data Federation team is looking for more projects, similar to the FNS case, on which to test the collected techniques and tools. “We’re eager to talk about ways we may be able to help you and learn from your work,” a recent blog post reads.