As with private companies, data is at the heart of digital government. And just as private companies need to find efficient and innovative ways to leverage their data as a strategic asset to drive business decisions, reduce cycle time, and enhance user experiences across the board, so do federal agencies.
But siloed data is a challenge for agencies accustomed to spinning up a new data warehouse for each mission, program, or center need. This siloed approach to data management is inefficient, duplicates work, hampers innovation, and slows the pace of discovery. It might seem like the answer is simply to build an integrated data warehouse big enough to combine siloed programmatic data into a central repository, but that approach is neither scalable nor feasible in a large enterprise.
How do you serve the right data to the right users at the right time to improve service delivery and accelerate speed-to-insight? In our digital modernization work for federal agencies, we support the compilation, storage, and analytics of mission-critical program data to provide decision makers with actionable, data-driven insights. Here are three data analytics best practices we follow to help agencies increase their mission impact.
#1 Embrace domain ownership of data and analytics
Federal agency leaders face significant challenges in scaling their data and analytics capabilities while remaining agile enough to respond to rapidly evolving mission priorities. Centralized operational and architectural models for data governance and data warehouses have failed to address these challenges. A newer sociotechnical paradigm, known as Data Mesh, applies lessons learned from decades of managing software complexity during the mass digitization of large organizational processes. Just as organizations have learned to balance agility, interoperability, and security in delivering microservices and APIs by applying Domain-Driven Design and computational governance (e.g., through Zero Trust architecture), the same approaches can be applied to deliver composable data and analytics products at scale.
Two key principles of this paradigm are (1) Data-as-a-Product thinking and (2) self-service tools. The former invests mission teams with the responsibility (and autonomy) to deliver high-quality data products to their consumers and the enterprise, while the latter equips them with tooling that reduces the cost of delivering high-value data products that satisfy enterprise data governance constraints.
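To make Data-as-a-Product thinking concrete, here is a minimal sketch of what it means for a mission team to publish a data product with an explicit owner, declared schema, and quality check, instead of dumping raw tables into a shared warehouse. All names (`DataProduct`, `benefits_claims`, the fields) are hypothetical, illustrative only:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of Data-as-a-Product thinking: each mission team
# publishes a data product with an explicit owner, schema, and quality
# check, rather than handing off raw tables.

@dataclass
class DataProduct:
    name: str
    owner: str          # the mission team accountable for quality
    schema: dict        # column name -> expected Python type
    rows: list = field(default_factory=list)

    def publish(self, new_rows):
        """Accept only rows that satisfy the declared schema (a simple
        stand-in for enterprise data-governance constraints)."""
        for row in new_rows:
            for col, typ in self.schema.items():
                if not isinstance(row.get(col), typ):
                    raise ValueError(f"{self.name}: bad value for {col!r}")
        self.rows.extend(new_rows)

product = DataProduct(
    name="benefits_claims",
    owner="claims-mission-team",
    schema={"claim_id": str, "amount": float},
)
product.publish([{"claim_id": "C-1", "amount": 120.0}])
```

The point of the sketch is accountability: the schema and owner travel with the data, so consumers know what they are getting and who stands behind it.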
#2 Make your data and analytics open, accessible, and transparent
Agencies are required to make their data accessible and to support their policymaking with statistical evidence, yet many still rely on opaque analysis processes. They may be working with data that is stale or incomplete, and they may struggle to produce the data behind an analysis when pressed to do so. An opaque approach not only risks misunderstandings that lead to suboptimal decisions and poor data quality; it is also slow. When data is not open, accessible, and clearly versioned, agencies lose time reproducing work or delay decisions because they lack confidence in the underlying data.
By contrast, an open, accessible, and transparent approach to data and analytics lets everyone involved in the process see the data being used, reference the analysis behind a decision, and reuse that work as the starting point for the next analysis.
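One lightweight way to make an analysis traceable to its exact inputs is to record a content fingerprint of the dataset alongside the published result. The sketch below is illustrative (the function and field names are hypothetical), not a prescribed agency practice:

```python
import hashlib
import json

# Illustrative sketch: fingerprint the exact records used in an analysis
# so a decision can later be traced back to a specific, versioned input.

def dataset_version(records):
    """Return a stable content hash for a list of records."""
    canonical = json.dumps(records, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()[:12]

records = [
    {"region": "NE", "enrollees": 1042},
    {"region": "SW", "enrollees": 987},
]
version = dataset_version(records)
# Store `version` with the published analysis; re-hashing the archived
# data later confirms the analysis used exactly these records.
```

Because the hash is deterministic, anyone reviewing the decision can verify that the archived data matches what the analysts actually used, which is the "clearly versioned" property described above.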
And transparency builds trust. An agency's ability to defend a decision with quality data is essential: the public has grown accustomed to doing its own research and digging into the data, and data transparency helps agencies build and maintain trust with the people they serve.
#3 Prioritize user-friendly tools and upskilling as needed
Giving mission teams access to the data is only part of the solution; you must also provide them with everything they need to understand and use the data effectively. Adopting a Data-as-a-Product approach empowers agencies to package their data behind a well-defined interface rather than serving up raw datasets. This is key: mission employees and analysts have a range of data skills, so it's important to equip them with self-service tools to interrogate and interpret the data in a user-friendly way. Interactive BI dashboards and visual data manipulation tools can engage decision makers in the analytics process.
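A well-defined interface can be as simple as a small, documented helper that answers a common question, so analysts never have to touch raw rows or write SQL. This is a minimal self-service sketch; the function and fields are illustrative, not a real agency API:

```python
from collections import defaultdict

# Minimal self-service sketch: instead of handing analysts raw rows, a
# mission team exposes a small, documented helper that answers a common
# question. Names and fields are illustrative only.

def claims_by_region(rows):
    """Total claim amounts per region, ready to feed a BI dashboard."""
    totals = defaultdict(float)
    for row in rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)

rows = [
    {"region": "NE", "amount": 120.0},
    {"region": "NE", "amount": 80.0},
    {"region": "SW", "amount": 50.0},
]
print(claims_by_region(rows))  # {'NE': 200.0, 'SW': 50.0}
```

In practice this role is usually played by a curated dashboard or governed semantic layer rather than hand-written functions, but the principle is the same: meet users at their skill level with a ready-made answer.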
In addition to delivering user-friendly data experiences, agencies must match employee skills to the mission need the data serves. This might mean bringing in new people who can bridge the gap between policy and data, pairing high-level analytics skills with a policy background. Don't be afraid to embrace organizational change in pursuit of innovation; the two go together. For a program or center to reach its full data-driven potential, upskilling may also be required, an opportunity that moves existing employees to higher-value work while helping agencies attract new employees with data skills to the mission.
While arming mission teams with the data they need to make informed decisions is foundational to mission success, it's also key to start with the end in mind: What problem are you trying to solve, and how can a solution be designed with that desired outcome in mind? Learn how we used a federated data governance approach, along with data analytics, modeling, and rapid simulations, to help the Centers for Medicare & Medicaid Services (CMS) modernize their regulatory impact analysis process.