What if open data programs around the world were all optimized and giving us great insights? Most senior-level executives know that their data is an asset, but less than a quarter believe they are effective at using that data strategically. Anyone who works with data knows that the job is no longer done by simply disseminating data sets to fulfill a mandate or stay compliant. We have to take action to determine the impact our data has. With this post I begin another blog mini-series on data maturity, starting with the U.S. federal agency space since I work in that space and am most familiar with those efforts. Next week I will look at how the U.S. government compares to the European Union. Finally, I will end the series with a look at what other countries are doing with their open data efforts.
Data maturity is one way to determine impact. As reported in a FedScoop article, the first study to assess the maturity of big data and analytics tools at large federal agencies was done by IDC Government Insights in June 2014. IDC defines the stages as Ad Hoc, Opportunistic, Repeatable, Managed, and Optimized, as shown in the table below. I agree with their stage characteristics that people and process are just as important as the data and technology in maximizing data value.
The IDC study is only one data maturity model; there are others, such as those from the CMMI Institute, the DGI Framework, or Gartner, that I won't discuss in detail today. The study found that only 1 percent of those surveyed believed their agencies had "optimized" their analytics capabilities. The vast majority reported lacking a coherent data strategy. Adelaide O'Brien of IDC said that agencies that are high achievers on the data maturity model:
- Successfully recruit, train and reward statisticians, data scientists and others who are subject matter experts in the goals of the organization;
- Actively communicate and work with other agencies or groups on initiatives;
- Have senior level sponsorship and involvement;
- Use continuous process improvement, pilot initiatives and quantitative feedback;
- Use the data, analytics tools and automation for decision making.
In addition, according to a Government Executive article, a mature data program is one where data is handled with consistency, applications are well thought out, and employees know how to use all relevant tools.
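To make the criteria above concrete, here is a minimal sketch that treats them as a checklist and maps the number of practices an agency meets onto the five IDC stage names. The one-stage-per-practice mapping is my own simplifying assumption for illustration, not part of the IDC methodology:

```python
# Illustrative sketch only: the rule mapping "practices met" to an IDC
# stage is an assumption for this example, not the IDC study's method.

IDC_STAGES = ["Ad Hoc", "Opportunistic", "Repeatable", "Managed", "Optimized"]

PRACTICES = [
    "recruits, trains and rewards data scientists and subject matter experts",
    "actively collaborates with other agencies or groups on initiatives",
    "has senior-level sponsorship and involvement",
    "uses continuous process improvement, pilots and quantitative feedback",
    "uses data, analytics tools and automation for decision making",
]

def maturity_stage(practices_met):
    """Return an IDC stage name based on how many practices are met.

    Counts only practices from the checklist above; anything beyond
    the last stage is capped at "Optimized".
    """
    count = len(set(practices_met) & set(PRACTICES))
    return IDC_STAGES[min(count, len(IDC_STAGES) - 1)]

# Example: an agency with only senior-level sponsorship in place
print(maturity_stage({"has senior-level sponsorship and involvement"}))
```

An agency meeting none of the practices lands at "Ad Hoc," one practice at "Opportunistic," and all five at "Optimized" — a crude proxy, but it shows why the later discussion of agencies sitting between the first two stages is plausible.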
So are there any U.S. federal agencies with an "optimized" data program as of January 2016? The short answer, in my opinion, is probably not, but I wasn't able to find much "data" to answer this question. Data.gov has metrics on federal agency participation in posting data according to federal mandates, but these metrics only measure the quantity of datasets rather than their quality or the impact they are having. NYU's GovLab has recently taken a shot at measuring the impact of open data programs through 19 case studies, but only one is from a U.S. federal agency (https://data-alliance.noaa.gov/). There also doesn't appear to be any discussion of the role the data maturity model plays in the success stories.
Finally, the Office of Management and Budget (OMB) has taken a stab at measuring how federal agencies are performing on the Open Data Policy, which includes a "Use and Impact" indicator. Agencies that are doing well are marked green, while those that have no information in a category or are not doing well are marked red. This indicator is built from questions OMB asks through the Integrated Data Collection Tool. The "Use and Impact" indicator is a starting point for measuring impact, but it doesn't fully capture what the public thinks of the data sets or quantify how they use them.
Bottom line – I think that the majority of U.S. federal agencies are probably somewhere between “Ad Hoc” and “Opportunistic.”
Image by Informatica