
How To Combat “Data Trutherism”


By Dennis D. McDonald


The recent Washington Post editorial When the facts don’t matter, how can democracy survive? is a disturbing piece. In it, author Catherine Rampell documents recent research showing that many Americans doubt the veracity of data provided by the U.S. government.

Rampell discusses this lack of trust in the context of politics and a tendency on the part of many Americans to see conspiracies around every corner. Many people don't believe the numbers that government agencies use day in and day out to guide their legislatively mandated policies and programs. Rampell concludes:

"This is how a democracy crumbles: not with a bang, but with data trutherism."

While it is probably well known that modern media allow people to surround themselves with information content and messaging that reinforces their beliefs and prejudices, it is rather shocking (at least to me) to see this "filter bubble" extended to basic facts and figures.

But perhaps it's not surprising. People like to pick and choose their media. We probably shouldn't be surprised that people pick and choose their numeric facts as well. If you're surrounded by people who are unemployed, for example, is it surprising that you might have some doubts about, say, the improvement of national unemployment numbers since Obama took office?

The growing disconnect

One problem is that the greater the disconnect between government-generated statistics and people’s belief in those numbers, the greater the potential disconnect between how government services are managed and the people those services are meant to serve. Many federal agencies’ delivery of services is triggered by specific metrics. If the connection between the metrics and the services is not understood, then how the government operates is not understood. Such ignorance is bad for a democracy whose institutions rely on "the consent of the governed" and on the taxes the governed pay.

I wouldn't be the first to recognize the difference between ignorance about what numbers "mean" and belief that some sort of conspiracy exists to “spin” the numbers. Unfortunately, ignorance and willingness to believe in imaginary conspiracy theories reinforce each other, especially when leaders arise who fan the flames of bigotry and intolerance.

Today the belief might be that unemployment statistics are being "cooked" to make the current Administration look good. Tomorrow, reliance on the U.S. Census to apportion legislative representation as specified in the Constitution might be questioned. The day after that? Massive refusal to pay taxes because so much money is being spent on helping people “who don't live in my neighborhood”?

Mistrust of basic government statistics is a bad thing, and it can't be understood or addressed without knowing what other factors influence it. These factors may include a lack of basic numeric literacy, resentment about one's status, willingness to respond to the appeals of a would-be demagogue, and the tiresome politicians' strategy of "running against Washington" (until they get to Washington, of course).

What's the solution?

I'd like to think that at least part of the solution for “data trutherism" is education (of the electorate) and more transparency (on the part of the government).

Solution #1: Education

Regarding education, that's only part of the solution, given that one reason we have reached the point of widespread data trutherism is the failure of our educational system to adequately prepare a large segment of the population to objectively understand and evaluate basic demographic and population statistics. This failure enables the spread of striking errors like the AP’s report on Clinton Foundation donations and State Department access, in which a percentage computed from a sample was erroneously projected onto an entire population. Such errors can be widely disseminated in no time at all. Even when sponsors admit the errors, the damage is done: the erroneous data have already been disseminated, digested, and redistributed.

In the face of incompetence such as AP's, one defense is an educated populace that learns to ask basic questions like, "Where did those numbers come from?" This is not a perfect solution, as reading about the LA Times’ “tracking poll” and how its weighting factors can affect its results will attest. I understood that particular explanation because once upon a time I designed and managed surveys that involved segmented samples. Someone with less quantitative experience might find such explanations difficult to follow.

Solution #2: Government Transparency

This brings us to the need for greater government transparency so that more people understand the relationship between government services and how they are delivered. A step in this direction is the movement to provide more “open data” by many government agencies. Many open data programs now distribute data files along with analytical tools to help users interpret the constantly increasing amounts of data. That's a good thing.

Problem is, making data "open" and accessible is only the tip of the usage “iceberg,” even when analysis and visualization tools are also provided. People have to make sense of the data and interpret it in ways that make it meaningful to them. For most people that means that the data has to be delivered in an understandable fashion. If they don’t understand the basics of data analysis, making data “open” won’t necessarily help them.

My belief is that, in the face of so much “data illiteracy," government agencies must do a better job of making the data they provide about their operations and services more understandable. This means going beyond making data files and analysis tools available to providing consultative and educational services that help citizens understand the data and what the data mean to them personally.

One challenge is that such “outreach” programs are expensive. I would argue that outreach should be part of the standard description of a government agency's responsibilities, not something funded and managed separately.

Dangers

There are dangers to involving the government in interpreting its own data. One danger is that government agencies will themselves “spin” the numbers to make themselves look good. How to combat this possibility? The first approach of course is the oversight role played by the legislative branch of government. While this can obviously be politicized, a transparently performed oversight role can help bring context and meaning to more people even when hearings and investigations are highly partisan.

The second approach gets us back to education: the more people know and understand about how government operates, the harder it is to “spin” the numbers, no matter who is doing the "spinning."

Copyright © 2016 by Dennis D. McDonald
