Dennis D. McDonald (ddmcd@outlook.com) is an independent consultant located in Alexandria, Virginia. His services and capabilities are described here. Application areas include project, program, and data management; market assessment, digital strategy, and program planning; change and content management; social media; and technology adoption. Follow him on Google+. He also publishes on CTOvision.com and aNewDomain.

Learning From General Electric’s Big Data Challenges


By Dennis D. McDonald

A follow-on article to this one is located here.

Introduction

A very interesting article is GE’s Big Bet on Data and Analytics (MIT Sloan Management Review, February 18, 2016). It discusses how industrial giant General Electric (GE) is approaching the use of big data, predictive analytics, the cloud, and “the internet of things.”

The article suggests that GE has long been thought of – somewhat negatively – as a “traditional” manufacturing company. Yet, as the article discusses, the data-related strategies it is now pursuing deserve attention from others looking for ways to use data and data analytics to their advantage. The article begins with this quote:

If software experts truly knew what Jeff Immelt and GE Digital were doing, there’s no other software company on the planet where they would rather be. – Bill Ruh, CEO of GE Digital and CDO for GE

I’ve never thought of GE in pejorative terms. As a fan of aeronautical history I’ve always known about GE’s involvement with jet engine technology; you can’t avoid seeing the GE logo on jet engines on airliners around the world. Also, years ago I was involved with GE Appliances and GE Medical Systems when my employer at the time was customizing its proprietary retrieval and networking software for various GE call-center, mobile service, and medical equipment repair system applications. I’ve always thought of GE as being at the forefront of applying new technology to its businesses.

What this particular article discusses parallels what I have found in my own modest big data project planning and management research:

  1. Organizations need to take a strategic view of how they manage and use their data.
  2. Organizations may need to rethink how existing operations are organized and managed.

The second point may not apply to organizations that are already “digitally native” or “data centric,” since their understanding of and reliance on data are already part of who they are and how they operate. For the rest of us, though, that’s not the case: changes are necessary in the face of big data’s “onslaught.”

This is what makes this article about GE so interesting. The author discusses issues that have arisen at GE as it adjusts not only to a constant flow of data from the industrial devices it sells but also to the way that managing this data effectively is fundamentally changing GE’s products and services.

Discussed below are some of the most important challenges raised in this article and how they may be relevant to other service organizations.

The Industrial Internet: how relevant?

The article focuses on managing data generated by and related to sensors attached to the industrial devices GE sells, hence the term “industrial Internet.” The data thus generated describe real physical processes that can (usually) be measured and managed in quantitative and relatively unambiguous terms.

Does this mean that the learning discussed in this article is only relevant to industrial companies such as GE?

Not necessarily. Today, for example, companies are developing and selling cloud-based analytical packages to generate predictive data around prescription drugs and their health impacts. The case being made there for the value of predictive analytics – relating the cost of specific actions to positive or negative outcomes – is similar to the case GE is making. Knowing in advance what will succeed or fail has obvious value in both cases.

Another use of predictive analytics based on data gathered via remote sensing would be prediction of faults in fast food preparation equipment. While many fast food operations at the retail level are small, a significant number of fast food restaurants are run by multi-restaurant franchisees where loss of production equipment in high-volume locations has an immediate dollar impact. Predictive analytics based on real-world and near-real-time equipment performance might be more cost-effective at predicting replacement requirements than maintenance tables based on machine averages.
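To make the contrast concrete, here is a minimal sketch of the per-machine approach described above. It is purely illustrative (the function name, the vibration readings, and the thresholds are all my own assumptions, not anything from GE or the article): instead of scheduling replacement from a fleet-average maintenance table, each machine is flagged for service when its latest sensor reading deviates sharply from its own recent baseline.

```python
# Hypothetical sketch: flag a machine for service based on its own recent
# sensor history rather than a fleet-average maintenance schedule.
# All names, readings, and thresholds here are illustrative assumptions.

from statistics import mean, stdev

def needs_service(readings, window=10, z_threshold=3.0):
    """Flag a machine when its latest reading deviates sharply from its
    own recent baseline (a simple rolling z-score check)."""
    if len(readings) < window + 1:
        return False  # not enough history to judge this machine
    baseline = readings[-(window + 1):-1]   # the `window` readings before the latest
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return False  # perfectly flat baseline; no basis for a z-score
    z = (readings[-1] - mu) / sigma
    return z > z_threshold

# A machine whose vibration level is stable, then spikes:
stable = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95, 1.0, 1.1, 0.9, 1.0]
print(needs_service(stable + [1.02]))  # latest reading within normal variation
print(needs_service(stable + [2.5]))   # latest reading far outside the baseline
```

A real deployment would of course use richer models and more sensor channels, but even this toy version shows the design point: the decision is driven by the individual machine’s near-real-time behavior, not by an average across all machines of its type.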

What business are you in?

GE has long been selling software to help control the devices that it sells. It’s now predicting that revenue from its cloud-based predictive platform (which sits “on top” of the devices it controls) will provide the majority of its software income in just a few years.

That is significant to the sales process. Where once you sold machinery, now you sell the means to control the machines. Plus, you can now bring in contextual and environmental information to augment and help manage not only the machines you sell but also the machines (potentially made by other companies) that interact with your machines.

It’s not unusual for organizations to factor contextual information into pricing and marketing. Energy utilities have always paid close attention to temperature fluctuations, given the impact of temperature on gas and electricity demand. The same goes for government agencies that manage large real estate and office holdings, where energy and environmental control of office space is a major expense. Predicting not only failures but also the changes in operational parameters needed to optimize performance and economics has definite benefits at scale.

What GE is exploring now – intelligently managing other industrial equipment as well as its own – is similar to what systems integrators have done for many years in the IT world: helping manage the multitude of interconnected hardware and software platforms that organizations have amassed over the years. The significant difference this article points out is the need for a more unified data management approach, given the need (a) for data interchange among companies that have traditionally competed and (b) for moving data to the cloud to take advantage of economies of scale and demand management.

Again, this challenging integration business is different from what GE has traditionally engaged in:

“When we think about the future, digitizing our customers’ businesses requires a technology shift, a business model shift, and a skill shift.”

Changing economies of scale

While the economies of scale that GE is pursuing in data-rich markets such as oil and gas may not be relevant to much smaller or less concentrated industries, we’ve all seen how the tech world has a way of driving prices down so that even small companies can afford new services. Also, GE’s experience in the oil and gas industry is an example of dealing with a fundamentally conservative industry, where peer pressure and “fear of being first” are important factors to address when introducing new services.

GE’s approach to addressing such challenges is relevant to the entry of data management and data analytics services into a variety of markets:

  • Focus on important problems with big payback, not just easy-to-handle, low-hanging fruit.
  • Convince customers that it is to their (and their stockholders’) advantage to standardize and share information.
  • Make new management tools “user-friendly” by using tools and techniques that are already familiar to customers.

All of this requires substantial domain expertise, data handling skills, and the ability to deliver both improvements to current process management and improvements to business planning based on better predictive analytics. In other words, to be successful with this type of business you need to address data, technology, and business in an integrated way; just delivering the tools will only take you so far.

An aging workforce

GE and its customers are losing engineering talent to death and retirement. This “double whammy” comes at a time when the availability of new data management, prediction, and application development technology is fundamentally changing the business and the available workforce.

Effective equipment maintenance at GE’s customers has benefited from staff with years of experience gained at a time when today’s sophisticated analytical tools were not available. When such talent leaves, there’s sometimes a gap that needs to be filled by bringing back retired workers as consultants or by seeking maintenance expertise from outside vendors such as GE.

The idea of outsourcing maintenance, repair, and support work is not new. Outsourcing has been a fact of life in many industries for many years, along with arguments about quality and economics. Now, replacing in-house expertise with external data management and algorithm-supported decision-making has the potential to be a major game changer, since it provides not only a way to address retirement-induced brain drain but also a way to take advantage of lessons learned from many different companies and systems.

New workers

One point made by the article is that younger workers tend to demand new technology when they join the oil and gas business. Such demand has been a fairly common theme in industries where younger workers and other early adopters, based on their experience using smartphones, tablets, and slick web-based social media, expect similar functionality from their corporate tools.

But when it comes to data and data analytics, is it really true that younger workers will necessarily be more receptive to tools that support a more modern approach to data manipulation and visualization than their elders?

I don’t know the answer. The article discusses young versus old engineers in the oil and gas industry, and engineers tend to be more quantitatively oriented than the general population. I can see how younger workers’ expectations for management software with certain analytical capabilities could, over time, percolate into management. But how relevant is this to other industries?

Conclusions

The article concludes with a discussion of what others can learn from GE’s experience. I don’t think the lessons are unique to companies of GE’s stature and age. Plus, companies can “buy their way” into an industry by targeting acquisitions based around a defined supply chain. The issues at play here concern factors that are relevant to many industries:

  1. Declining data collection, management, and manipulation costs.
  2. Increasing disruption of traditional organizational and departmental boundaries.
  3. Growing awareness that the advantages of data sharing and standardization need to be balanced against privacy and competitive concerns.

The two key lessons I get from this article are universally relevant:

  1. The importance of platform based services that can be tied to organizational goals and objectives, regardless of the technologies (and vendors) that need to be integrated.
  2. The importance of industry-specific business and domain expertise.

GE is selling a cloud-based platform that customers can use to manage their businesses. Data sharing and interoperability are prerequisites for this approach. GE is counting on these being delivered economically.

Knowing what to do and how to do it requires an understanding of the data, not just an understanding of the tools. Industry credibility is key. GE has industry credibility, and that’s good for them and potentially for their customers who are looking for ways to up their data analytics “game.”

Risks

Two risks are apparent for GE’s approach which may also be relevant to other industries where cloud-based analytical platforms are being developed and marketed. (I’m thinking here, for example, of medical devices, prescription medicine, and treatment costs.)

The first has to do with barriers to entry. If the costs of data collection, management, and access continue to drop, won’t anyone be able to enter the platform services market, not just behemoths like GE?

The second risk concerns short-term and long-term conversion costs. There’s usually a financial “hit” up front when switching platforms. Some of these costs are directly predictable data and software conversion costs that an honest preliminary assessment should be able to uncover – assuming such an assessment is actually conducted. Longer term, switching to a new platform – even an efficiently run analytics platform such as GE’s – tends to uncover business process change costs that only surface over time. This is why so many IT system consolidations following mergers and acquisitions turn out to be more expensive than anticipated.

Bottom line

The more you understand how to manage and analyze your company’s and your industry’s data, the better off you’re going to be. That way it’s more likely you’ll stay in control of your business, not an external platform vendor.

Copyright (c) 2016 by Dennis D. McDonald. A follow-on article to this one is located here. For more about Data Program Management go here. Contact Dennis by email at ddmcd@outlook.com or by phone at 703-402-7382. Check out his curated Managing Data collection on Google+.
