Dennis D. McDonald (ddmcd@ddmcd.com) consults from Alexandria Virginia. His services include writing & research, proposal development, and project management.

Can Better Data Governance Improve Disaster Resilience?

By Dennis D. McDonald

Facilitating the flow of critical information to the public during an emergency or natural disaster is not a newly discovered priority. We already have purpose-built systems for gathering and disseminating information about earthquakes, hurricanes, and other natural disasters. Yet, according to a letter from a team of researchers from Mississippi State University and the University of California, Irvine published in the January 11 issue of Science (“Integrated data could augment resilience”), California’s recent wildfire experience was an example of information not getting out when it could have done more good:

Lack of an integrated framework for circulating information among decision-makers and passing it to residents exacerbated the devastating impact of the wildfire. Reports unanimously point to shortcomings in disseminating critical information to residents before and during the wildfire … The data crisis further increased in areas where cellular and telecommunication infrastructure was damaged, limiting internet access … .

The authors’ suggestions for how to address these recent shortcomings include a laundry list of logical actions — get all the relevant stakeholders involved in advance, make sure the right data are available in the right format when needed, and make sure the platforms for getting the right data to the public are resilient and effective even in the midst of chaos and disaster. The authors also recommend using artificial intelligence methods for timely analysis of multiple data sources to support effective decision-making.

All of these ideas are good but, based on my own work, they are not sufficient by themselves to overcome real world problems in effective disaster warning. These problems include:

  • An overabundance of data coupled with a lack of comprehensive data modeling and prediction methods to address all disaster scenarios.

  • A lack of a comprehensive management structure that can oversee (or require) cooperation by a host of (sometimes uncooperative) public and private sector organizations and systems.

  • A fragile communication infrastructure where over-reliance on vulnerable communication channels leads to last-mile failure in message delivery.

  • A public that cannot always be counted on to “do the right thing” even when information is provided.

Improving data governance in such a potentially chaotic situation as disaster prediction and public communication about what’s coming places a strain on traditional data management and governance processes. For example, a lack of data standards or organizational differences in terminology can stymie or delay data sharing and analysis.
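As a hypothetical illustration of that point (the agency names, field names, and units below are invented, not drawn from any real system), consider two agencies publishing the same wildfire sensor readings under different field names and units. Each new feed requires a hand-built mapping before the data can be combined, and any field without an agreed equivalent is silently lost:

```python
# Hypothetical sketch: two agencies report the same readings with
# different field names and units. Without an agreed data standard,
# each feed needs its own mapping layer before data can be merged.

FIELD_MAPPINGS = {
    "agency_a": {"temp_f": "temperature_c", "windspeed_mph": "wind_kph"},
    "agency_b": {"temperature": "temperature_c", "wind": "wind_kph"},
}

# Conversions needed only because the agencies never agreed on units.
UNIT_CONVERSIONS = {
    ("agency_a", "temp_f"): lambda v: round((v - 32) * 5 / 9, 1),
    ("agency_a", "windspeed_mph"): lambda v: round(v * 1.609, 1),
}

def normalize(agency: str, record: dict) -> dict:
    """Translate one agency's record into a shared vocabulary."""
    out = {}
    for field, value in record.items():
        target = FIELD_MAPPINGS[agency].get(field)
        if target is None:
            continue  # no agreed equivalent: the data point is dropped
        convert = UNIT_CONVERSIONS.get((agency, field), lambda v: v)
        out[target] = convert(value)
    return out

# Both records describe identical conditions, but only after mapping
# do they become comparable.
a = normalize("agency_a", {"temp_f": 95.0, "windspeed_mph": 40.0})
b = normalize("agency_b", {"temperature": 35.0, "wind": 64.4})
```

Every such ad hoc mapping is a point of delay and potential error during a crisis, which is exactly what shared data standards agreed in advance are meant to eliminate.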

One approach is to plan ahead by implementing reserved or dedicated communication channels so that data sharing does not break down. Problems can arise, unfortunately, when data, analytics, or decisions based on analysis must “jump the boundaries” of responding organizations to more mainline delivery channels such as text messaging, radio, television, telephone, and internet.

This is one reason why cellular companies have established quick-response methods for restoring cellphone service following natural disasters. Yet preserving a physical path between source and target won’t overcome problems or shortcomings in the data content that’s delivered. Delivery needs to happen via a trusted channel, the information and data as delivered need to be understandable, and they need to be “actionable” in a timely fashion. You can’t just throw charts and graphs at the public; you need to be explicit, in a language and format they understand, as in “Get out now!” and “Here’s where to go!”

The researchers in the Science letter do address some of these issues. Whether AI, big data, or machine learning methods can actually speed actionable data to the people who need them is one thing. Making sure the data that have been created, managed, and manipulated along the way are accurate and timely, and that they package the right message correctly, is another.

Perhaps modeling the entire disaster response data delivery cycle from a data program management perspective is worthy of further research.

Copyright (c) 2019 by Dennis D. McDonald
