The following are links to and excerpts from a selection of articles related to the topic "data program management." To see all posts in this category go here.
Don’t let tools drive your policy.
Have a plan and be prepared to change it.
Know your stakeholders.
Keep track of where you are and where you’re going.
Be honest about the costs.
Always be thinking about boundaries.
Understand where the data comes from and where it’s going.
Provides a detailed model outlining the elements needed to plan an effective program for making data open, accessible, and aligned with program goals.
Insights on how to effectively manage “big data” projects based on interviews with data scientists, project and program managers, and government officials:
Still confusion about what constitutes “big data.”
Hard to sell a “capability.”
Benefit of linking data analytics to existing initiatives.
Need to distinguish between project & program level planning, implementation, & oversight.
Front end assessment & planning are crucial.
Enterprise-level issues (e.g., data governance) must be addressed judiciously.
Hard to avoid organizational & political issues.
Need for speed, agility, & early delivery of value.
Need to avoid “cart before the horse” syndrome.
Planning an effective data analytics program requires a governance process that focuses on developing and supporting useful data-based services aligned with program goals.
Discusses both intermediate progress-reporting metrics and the governance challenges associated with data programs that incorporate partnerships with private sector organizations (such as the NOAA open data program).
Eventually, priorities must be set for making use of data. This article describes the factors to consider in project selection. Basic questions include:
Are we looking to make existing data more useful and understandable?
Are we attempting to better represent current operations or transactions through near-real-time data visualizations?
Are we building a predictive model based on past experience to help anticipate resource demands?
Are we trying to understand the relationship between passively collected remote sensing data and the actual behavior of our customers?
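The third question above — anticipating resource demands from past experience — can be illustrated with a minimal sketch. The demand figures, function names, and the choice of a simple linear trend are all hypothetical; a real program would select a model suited to its own data.

```python
# Minimal sketch: fitting a linear trend to past resource demand in order
# to anticipate future demand. All figures below are hypothetical.

def fit_linear_trend(values):
    """Least-squares slope and intercept for evenly spaced observations."""
    n = len(values)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(values) / n
    slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, values))
             / sum((x - x_mean) ** 2 for x in xs))
    intercept = y_mean - slope * x_mean
    return slope, intercept

def forecast(values, periods_ahead):
    """Extrapolate the fitted trend to a future period."""
    slope, intercept = fit_linear_trend(values)
    return intercept + slope * (len(values) - 1 + periods_ahead)

# Hypothetical monthly service-request counts:
history = [120, 135, 150, 160, 180, 195]
next_month = forecast(history, 1)  # projected demand one period out
```

Even a toy model like this forces the planning questions the article raises: which past data are trustworthy, and how far ahead an extrapolation remains credible.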
"Is it really true that 'Nearly two-thirds of big data projects will fail to get beyond the pilot and experimentation phase in the next two years, and will end up being abandoned,' as suggested by Steve Ranger last year? My take: to be successful you need a collaborative team with multiple skills, effective leadership, good communication -- and a plan. In other words, don't put the cart before the horse by starting with a technical solution before you understand what problems you'll be trying to solve."
“… we need to do a better job of making sure that people can understand and use their data. Not everyone is a data scientist or statistician. Even reasonably intelligent people can be flummoxed by the intricacies of even a moderately sophisticated spreadsheet. Plus, the details of an individual’s financial or health records may require expert knowledge to interpret.”
NOAA’s Big Data Project is an innovative effort to provide public access to large amounts of environmental and weather data through partnerships with private sector cloud vendors. This article is a progress report on the issues the program needs to address, including transparency, risk-averse management, and the measurement of success.
Reviews a case study of how an industrial giant takes advantage of big data, software as a service, and infrastructure development to build a new business that transforms an old business. Good view of the “industrial internet.”
Factors that drive the success of a data access program include:
The importance of stakeholder involvement
The impact of resource constraints
Controlling business process change
Importance of the data management lifecycle
The importance of use cases
The role of subject matter expertise
A snapshot view of how health related data are providing the foundation for a wide range of new data-reliant products and services:
“From a program design standpoint this means government health programs that generate useful data need to incorporate systems and processes not only to make sure program data are used internally in an intelligent and secure fashion to support planning and management but also to make sure appropriately anonymized data are discoverable by and accessible to innovators, developers, analysts, researchers, and the public.”
Lessons learned from energy utility “big data” projects include:
Business ownership and responsibility for each new data element must be established early on.
Stakeholder involvement must be ensured early on.
Dedicated funding must be available.
Marketing and communication may be necessary if the changes to how data are collected and analyzed involve personal or otherwise sensitive data.
The time and cost involved in changing business processes may exceed the costs associated with technology change.
Risk management and change management will be critical to the success of large, complex projects involving many “moving parts.”
Adopting big data tools and process changes may be associated with a range of organizational changes including:
The move to cloud services as a replacement for current infrastructure.
The need to learn new tools and techniques.
Resistance from business process owners who don’t want to change.
Overemphasis on tools and technologies while shortchanging business and strategy.
The need to align data services with core business needs, not just with “easy to do” and “low hanging fruit” initiatives.
“Yet, big data does offer challenges that many analysts and managers are going to have difficulty reconciling. Analysts really do need to understand more about business and business strategy than might have been the case in the more compartmentalized past. At the same time, managers who don’t understand and appreciate how data analysts work and how trends, modeling, and error are handled will be at a disadvantage. The two groups need to work together to make “big data” work.”
“Data governance organizational structures have to be sustainable. They must support and facilitate needed data related services. While short-term “skunk work” tactics might best be served by a separate organization, in the long run a more federated or collaborative approach empowered to work through existing lines of authority might also make sense.”
“Managing data and metadata at an enterprise level to facilitate efficient tool use can be a complex undertaking. This is especially true when corporate actions such as transitioning IT resources to the cloud, constantly upgrading technologies, and increasing attention to privacy and security must also be considered. Such complexity should not be a cause for discouragement but should help drive the organization to become more disciplined in how it generates value from its data.”
“The first challenge – knowing how to develop a convincing business case – is something that can be taught or purchased. The second challenge – knowing in advance what the outcome will be of a new data analysis or predictive modeling effort – is more difficult to address and may be especially acute where management is not analytically oriented. One way to address this second challenge is to start with something simple and not attempt a program- or enterprise-level change requiring modifications to the organization’s culture.”
Discusses organizational models when “data analytics” is the focus. Models include:
An IT department that supports various enterprise functions and systems (including the development and/or support for specialized tools).
A project management organization (PMO) that provides management and administration support to multiple projects at various points in their life cycles.
A customer service or call center that methodically receives, processes, and satisfies the data analytics needs of a diverse and changing user population.
A collaborative and potentially research-oriented organization that actively engages both in data analytics and supports the collaboration and sharing of data with other researchers.
“Perhaps the most problematic aspect of measuring the value of information is how we deal with uncertainty. In deciding how to plan an improvement in how an organization manages and analyzes its information assets, it's not unusual to have to answer the question, ‘Why should I spend money on this system/project/program/tool if I don't know with certainty how useful the results of this new analytical capability will be?’”
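One standard decision-analysis answer to the question quoted above is the expected value of perfect information (EVPI): the most a decision-maker should rationally pay to resolve the uncertainty before committing. The sketch below is a hedged illustration, not anything from the article; the payoffs, probabilities, and function names are all hypothetical.

```python
# Hedged illustration: expected value of perfect information (EVPI),
# one way to put an upper bound on what an uncertain analytics
# investment's "study phase" is worth. All numbers are hypothetical.

def expected_value(payoffs, probs):
    """Probability-weighted average payoff."""
    return sum(p * v for p, v in zip(probs, payoffs))

def evpi(payoff_table, probs):
    """payoff_table: rows are actions, columns are states of the world."""
    # Best achievable by committing to one action before uncertainty resolves:
    best_without = max(expected_value(row, probs) for row in payoff_table)
    # Best achievable if we could pick the best action in each state:
    best_with = expected_value([max(col) for col in zip(*payoff_table)], probs)
    return best_with - best_without

# Two actions (build the analytics capability / don't) and two states
# (the capability proves useful / it doesn't), hypothetical payoffs:
payoffs = [[100, -40],   # build
           [0,   0]]     # don't build
probs = [0.6, 0.4]
ceiling = evpi(payoffs, probs)  # most a perfect study could be worth here
```

Framing the question this way converts “I don’t know how useful it will be” from a reason to refuse funding into a bounded, arguable number.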