Dennis D. McDonald (ddmcd@ddmcd.com) consults from Alexandria Virginia. His services include writing & research, proposal development, and project management.

Scoping Out the ‘Total Cost of Standardization’ in Federal Financial Reporting

By Dennis D. McDonald, Ph.D.


Introduction

The Digital Accountability and Transparency Act (DATA Act), now making its way through the US Congress, has several implications for Federal financial reporting:

  • People and systems that previously had difficulties communicating will be able to talk with each other.

  • Data currently locked in difficult-to-access paper and .pdf documents will be made machine-readable, accessible, and reusable.

  • Links and metadata tailored to specific organizations and industries will be standardized thereby improving contextual access and data filtering.

  • The ability to track and reconcile appropriations and disbursements will improve.
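As a concrete illustration of the first two points, once a disbursement record is expressed with shared, machine-readable field names, filtering and reuse become straightforward. The field names below are illustrative only, not the actual DATA Act data elements:

```python
import json

# A hypothetical disbursement record using standardized, machine-readable
# field names (illustrative only -- not actual DATA Act element names).
record = {
    "awarding_agency": "Department of Example",
    "recipient_name": "Example Corp",
    "award_amount_usd": 125000.00,
    "action_date": "2013-05-09",  # ISO 8601 date: one shared convention
}

# Once field names and formats are standardized, any system can filter
# or reconcile records without per-agency parsing logic.
large_awards = [r for r in [record] if r["award_amount_usd"] > 100000]
print(json.dumps(large_awards, indent=2))
```

The same filter would work unchanged on data from any agency that adopts the shared field names, which is the point of standardization.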

There seems to be general agreement that developing and implementing data and metadata standards will, in the long run, increase the efficiency and accuracy of working with financial data. This is the natural way we think about standards. It reflects, at least in part, the idea that it is often more expensive to build new rules and procedures for describing things digitally than to buy or adopt existing ones. Standards enable reusability, and this is certainly true of data.

At the same time, the costs and process complexity associated with implementing DATA Act standardization — what I call the “total cost of standardization” — may also be significant. These costs are related to:

  1. The number of different systems and communities that need to be involved in the process by which standards are developed, adopted, and used.

  2. The changes that need to be made both to systems and business processes to implement standardization.

Understanding the potential scope of the costs related to standardization and to the number and variety of affected systems, applications, and processes is the primary subject of this document.

Context

Consider the different situations in which standardized data are used and the processes and programs that are affected. Examples of different uses of standardized data include:

  • Determining eligibility for Federal benefits.

  • Calculating the dollar amount to be disbursed via check or transfer for such benefits.

  • Verifying identity of payment recipients.

  • Reconciling appropriations with actual expenditures.

  • Eliminating payments to unqualified recipients.

  • Reporting account status to governing and oversight bodies.

  • Planning and prioritizing actions required by legislation.

  • Uncovering potential fraud or misuse of funds.

The costs and benefits of standardization associated with systems highly dependent on standardized data may differ from those that have low or indirect dependence on standardized data. In environments that require sharing of data among different systems and applications, developing and implementing standards for data and metadata is one way to control costs and miscommunication errors.

Approaches

Data standardization comes with its own costs and management implications. Several approaches to generating the benefits of data standardization exist, including:

  1. One single system using one single database.

  2. Multiple systems and applications accessing the same database.

  3. Different systems and databases made interoperable through appropriate data conversion or transformation processes.

The third situation is often the norm, especially within the federal government, where multiple financial, HR, and other systems are involved in managing and reporting data related to appropriations and expenditures. How the data and metadata associated with the many people, places, things, and transactions involved are described by different communities (constituencies) inside and outside the government will need to be understood. Even when shared services are adopted to reduce the total number of infrastructure systems, those services may still have to interact with a variety of external systems both inside and outside the government.
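As a sketch of the third approach, suppose two systems record the same kinds of payments under different field names and formats; a small transformation layer can map both into one shared representation. All field names and values here are hypothetical:

```python
# Approach 3: each system keeps its own schema, and a transformation
# step maps records from both into one shared representation.
# Field names on both sides are invented for illustration.

def from_system_a(row: dict) -> dict:
    """System A stores amounts in cents and calls the recipient 'vendor'."""
    return {
        "recipient": row["vendor"],
        "amount_usd": row["amount_cents"] / 100,
    }

def from_system_b(row: dict) -> dict:
    """System B stores dollar strings and uses 'payee_name'."""
    return {
        "recipient": row["payee_name"],
        "amount_usd": float(row["amount_dollars"]),
    }

merged = [
    from_system_a({"vendor": "Acme Co", "amount_cents": 250000}),
    from_system_b({"payee_name": "Widget LLC", "amount_dollars": "1750.00"}),
]

# Both records now share one schema and can be reconciled together.
total = sum(r["amount_usd"] for r in merged)
print(total)  # → 4250.0
```

The transformation code itself is a recurring cost of this approach: every source system needs its own mapping, and each mapping must be maintained as schemas change.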

Another consideration is how much specialization or customization is needed to accommodate the needs of different communities, industries, or constituencies, each of which may have formally or informally adopted different data-related standards. At the local level, for example, how many different approaches are needed to describe local transit schedules or restaurant inspections? At the Federal level, how many numbering schemes for companies and other organizations are needed?
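One common way to reconcile multiple numbering schemes is a crosswalk table that maps each scheme's identifier to a single canonical ID. The scheme names and identifiers below are invented for illustration:

```python
# A crosswalk maps (scheme, identifier) pairs to one canonical
# organization ID. All schemes and IDs here are hypothetical.
CROSSWALK = {
    ("scheme_a", "12-3456789"): "ORG-0001",
    ("scheme_b", "987654321"): "ORG-0001",  # same organization, other scheme
    ("scheme_a", "98-7654321"): "ORG-0002",
}

def canonical_id(scheme, identifier):
    """Return the shared canonical ID, or None if the pair is unknown."""
    return CROSSWALK.get((scheme, identifier))

# Records keyed under different schemes resolve to the same organization.
print(canonical_id("scheme_a", "12-3456789"))  # → ORG-0001
print(canonical_id("scheme_b", "987654321"))   # → ORG-0001
```

Building and governing such a crosswalk is itself part of the "total cost of standardization": someone must maintain the mappings and resolve conflicts when schemes disagree.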

In such cases the cost of transitioning to a single standard to support a given use must be considered, especially if the transition will require significant changes to existing systems and processes. How such costs and benefits are defined will be important, especially when significant one-time costs have to be incurred at the front end of the standardization process.

Open Data and Standards

One important consideration is the possible tension between making data public now and waiting until standards are developed and adopted. Much is already underway to open up government data to public access:

  • The current Administration is pushing heavily for more openness in Federal data.

  • State and local governments themselves are making more efforts to open up access to government data files.

  • A variety of commercial and open source companies, API development tools, and cooperatives are emerging to facilitate rapid public access to and re-use of government-sourced data.

  • International organizations such as the World Bank have not only increased the amount of data they publish about development efforts but have also experimented with intensive volunteer analysis efforts focused on data analytics.

As I suggested in How Important Is ‘Total Cost of Standardization’ to the DATA Act?, in the long run it may be more difficult and costly to implement necessary data standards around financial information if the public and businesses have already come to depend on rapidly emerging sources of governmental data. It is difficult to say with certainty what the actual implications of this situation are, given the ease with which cloud-based storage, analysis, and visualization tools can be used with existing data files. Do the benefits of “immediate public access” outweigh the desire to implement standards first and then make data public? Without getting into specific use cases that is difficult to assess, but the costs, benefits, and risks need to be weighed for different uses such as those listed above under “Context.”

The Way Forward

In my opinion, the best way forward is to do both:

  1. In the interest of public transparency and accountability, Federal expenditure data should be made available now for access and use by the public while at the same time work “behind the scenes” proceeds to intelligently apply standards to data and metadata.

  2. Then, over time, bring all data “into the standards fold” through an organized and collaborative process so that, eventually, data elements and metadata can be interoperable no matter what level of government, country, industry, or community is involved in the exchange of data.

Priorities will have to be established for such a massive undertaking, but I think it is important not to let the “perfect be the enemy of the good” by delaying access to financial and other structured data because they have not yet gone through some kind of formal standardization process. In fact, by managing this process effectively and by aggressively pushing for as much public access to data as possible, policymakers can be guided in their prioritization efforts by feedback on the data already being made available.

Parts of this process are already underway, of course. Numerous standards bodies already exist throughout government and industry, and their efforts will continue. The difference now is that the systems for accessing and processing large and small amounts of quantitative data have become much more democratized than in the past. Cost and technical barriers to public access to data have fallen drastically in recent years.

Conclusion

Just as central IT departments have had to adjust to increasingly sophisticated and attractive publicly available hardware and software tools, those in government responsible for gathering, organizing, and distributing program-related data — especially financial data — need to intelligently coordinate both short term access and long term standards and system development efforts. Accomplishing this, of course, will require authority, governance, expertise, and resources.

Related reading:

  • Recommendations for Collaborative Management of Government Data Standardization Projects

  • How Important Is ‘Total Cost of Standardization’ to the DATA Act?

Copyright © 2013 by Dennis D. McDonald, Ph.D. Dr. McDonald is an independent project management consultant based in Alexandria, Virginia. He has worked throughout the U.S. and in Europe, Egypt, and China. His clients for project planning and project management have included the U.S. Department of Veterans Affairs, the Environmental Protection Agency, the World Bank, AIG, ASHP, and the National Library of Medicine. In addition to consulting company ownership and management, his experience includes database publishing and data transformation, integration of large systems, corporate technology strategy, social media adoption, statistical research, and IT cost analysis. His web site is located at www.ddmcd.com and his email address is ddmcd@yahoo.com. On Twitter he is @ddmcd.