Dennis D. McDonald (ddmcd@outlook.com) is an independent consultant located in Alexandria, Virginia. His services and capabilities are described here. Application areas include project, program, and data management; market assessment, digital strategy, and program planning; change and content management; social media; and technology adoption. Follow him on Google+. He also publishes on CTOvision.com and aNewDomain.

Can Government Regulations Drive Improvements in Enterprise Data Governance?

By Dennis D. McDonald

Thorhildur Jetzek’s premise in her excellent post

Can data protection requirements help us create more value from data?

is sound: government regulations DO have the potential to stimulate improvement in organizational data governance processes. The experience of heavily regulated industries (e.g., banking and pharmaceuticals) attests to that. Here’s a brief extract from her post:

Governments (at least in the EU) could very soon ask questions about storage of personal data, or customers could ask for a transferal of their data or for their personal data to be deleted. For data privacy purposes, a company needs to know where personal data is stored on their systems and who is responsible for these data. This can be especially difficult if the content is kept in unstructured formats in documents, presentations, and spreadsheets. To be able to trace and find sensitive data, data governance methods such as data classification can come in handy.
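To make the data classification idea Jetzek mentions concrete, here is a minimal sketch of what a first-pass scan for personal data in unstructured text files might look like. The regex patterns and folder layout are illustrative assumptions on my part, not her method; a real discovery program would rely on dedicated data discovery tools and far more robust detectors.

```python
import re
from pathlib import Path

# Illustrative patterns only -- real PII detection needs much more
# sophisticated rules and validation than these simple regexes.
PII_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "phone": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def classify_text(text):
    """Return the set of PII categories detected in a block of text."""
    return {label for label, pat in PII_PATTERNS.items() if pat.search(text)}

def classify_files(folder):
    """Map each .txt file under `folder` to the PII categories it contains."""
    results = {}
    for path in Path(folder).rglob("*.txt"):
        found = classify_text(path.read_text(errors="ignore"))
        if found:
            results[str(path)] = sorted(found)
    return results
```

Even a rough inventory like this gives an organization a starting point for answering the regulator's question "where is personal data stored, and who is responsible for it?"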

One challenge for those at the beginning of the data governance process is a lack of knowledge about what actually needs to be done to improve data governance (and what doing so will cost).

Data governance has a way of requiring one to cross traditional organizational and technological silos. It may also involve consideration of new software and data management tools with which the organization has little experience.

Here is the comment I left on Jetzek’s post:

One can only efficiently and effectively comply with regulations when effective governance processes are in place, starting with knowing which data reside in the organization. The trick is to ensure that the regulations don’t strangle innovation and impose unnecessary burdens on the marketplace. Here in the U.S., for example, some organizations have protested that Federal Government requirements that financial data reports for public organizations be tagged to facilitate digital manipulation impose unsustainable burdens on the submitting organizations. Exempting large numbers of organizations from having to submit structured data would, of course, cripple efforts to perform necessary financial oversight functions. The argument here, I believe, is not over regulation per se (in terms of making financial data more analyzable and accessible) but over how much it costs to establish the intelligent governance processes that you discuss (and of course who pays for such costs). The costs of intelligent data governance are real and the sooner we understand them the better.

Government regulators need to be sensitive to the costs of complying with and overseeing the regulations they impose on data management, as do the organizations that are regulated. Costs related to data oversight, quality control, standardization, security, and privacy all need to be weighed against the quantitative and qualitative benefits that will be generated. As I suggested in How Important Is "Perfect Data" To Your Data Analytics Program?,

1. Be honest about costs. They’ll bite you if you aren’t.

Copyright (c) 2016 by Dennis D. McDonald. To find more articles like this scroll down. To find out more about my consulting go here.

Learning From General Electric’s Big Data Challenges

How Important Is "Perfect Data" To Your Data Analytics Program?

How Important Is "Perfect Data" To Your Data Analytics Program?