Government regulators need to be sensitive to the costs of complying with and overseeing the regulations they impose on data management, as do the organizations that are regulated. Costs related to data oversight, quality control, standardization, security, and privacy all need to be considered in comparison with the quantitative and qualitative benefits that will be generated.
If you’re doing exploratory data analysis to help you decide how much data prep might be needed to make your data public, that’s one thing. On the other hand, if you are using your data to calculate input to an invoicing system, that’s another.
Given how widely most organizations vary in their understanding of the ins and outs of managing current data governance and analytics practices, it’s not surprising that bringing in potentially “disruptive” technologies will be even more of a challenge.
To appreciate some of the implications of the recent proposal by the International Committee of Medical Journal Editors (ICMJE), as reported by NPR, to make acceptance of journal articles contingent on the authors’ agreement to share de-identified clinical trial data with other researchers, some context is appropriate.
A good place to start will be to get a good handle on the costs and benefits of introducing new big data related initiatives into the organization; the more realistic these cost and benefit estimates are, the better.
There’s a lot more to collaboration than just providing a technology platform.
One of the benefits of focusing on behavioral outcomes as a way of assessing the effectiveness or usefulness of improved data analytics is that behavioral outcomes are potentially measurable.
I began researching “big data project management” when I started seeing publications and online discussions concerning big data project “failures” being attributed to the classic reasons for project failure such as scope creep, poor stakeholder engagement, and inadequately understood requirements.