
Tech Trends 2017: A New Dawn for Analytics

10/24/2017

Historically, when you discussed “analytics,” the conversation centered on “reporting” or “data visualization.” Over the last five years, however, the technology has changed dramatically and, as a result, so has the definition of “analytics.”

Today, data comes in different forms — streams, lakes, clouds — each with a new set of analytics to improve business outcomes. These new forms are sandwiched between applications (enterprise resource planning, analytics platform systems, warehouse management systems, tag management systems, etc.) and workforce productivity technologies (Microsoft Office, internet search, etc.).

With these shifts in data volume, variety and velocity, consumer goods companies are coming to the stark realization that today’s analytics approaches are largely legacy. The technologies need to be updated, yes. But more importantly, business leaders need to reskill teams and update their own capabilities to understand and seize the opportunity within this new world of analytics.

The traditional approach to analytics — reporting on top of enterprise applications (for ERP, WMS, APS, supplier and customer relationship management) — is no longer equal to the task. These solutions automated functional processes within the enterprise. Sitting on top of relational database systems using visual analytics, the analytics focused on transactional data. 

The evolution of open source capabilities, which enable the use of unstructured data (text, social sentiment, warranty and quality data, weather data and images), is an opportunity for the process redefinition of social listening, quality sensing, supplier development and risk management. We’re in the middle of a process evolution to use structured and unstructured data together to drive new outcomes. The change is dramatic, and it’s happening quickly. 

Historically, business leaders invested in what we termed “alphabet-soup analytics.” This traditional approach automated pockets of the enterprise, but it didn’t align internal processes to make cross-functional decisions at the speed of business. The focus was on order-to-cash and procure-to-pay, along with transactional visibility. As a result, companies became rich in data, but low on insights.

The needs of today’s global businesses are much greater. Alphabet-soup analytics can’t handle the task of automating and supporting the evolution of digital, outside-in processes.

When company leaders imagine the future, it starts outside-in with the consumer, using channel data such as point of sale, warehouse withdrawal and even social sentiment. The goal is to sense market shifts, adapt to them and drive a profitable response. Accomplishing this vision requires multiple technologies. As shown in Figure 1, it involves a coalescence of analytics to redefine the future.

Figure 1: Analytics

So let’s examine the future:

Sensors, Real-Time Data, and the Internet of Things: Streaming data from the IoT doesn’t fit into today’s enterprise architectures, which are founded on batch-based processes. These processes respond; they cannot sense.

Changing this paradigm requires building streaming data architectures and defining new processes to use real-time data; until then, data from sensors goes largely unused. The potential process improvements are vast. Real-time data can redefine manufacturing: instead of taking equipment down for maintenance based on mean time to failure, sensors enable maintenance alerts to service and repair based on actual conditions. Replenishment can be driven by usage. Vehicle status is more accurate.
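As a hedged illustration of the shift from schedule-based to condition-based maintenance, the sketch below consumes a simulated stream of sensor readings and raises a maintenance alert when vibration or temperature thresholds are crossed. The readings, thresholds and machine names are hypothetical assumptions; a production version would read from a streaming platform such as Kafka rather than an in-memory generator.

```python
# Minimal sketch: condition-based maintenance alerts from a sensor stream.
# The readings, thresholds and machine IDs below are illustrative assumptions,
# not a real plant feed.
from dataclasses import dataclass
from typing import Iterator

@dataclass
class Reading:
    machine_id: str
    vibration_mm_s: float   # vibration velocity
    temperature_c: float    # bearing temperature

# Hypothetical alert thresholds; real limits come from the equipment vendor.
VIBRATION_LIMIT = 7.1
TEMPERATURE_LIMIT = 85.0

def sensor_stream() -> Iterator[Reading]:
    """Stand-in for a streaming source (e.g., a Kafka topic)."""
    sample = [
        Reading("press-01", 3.2, 62.0),
        Reading("press-01", 7.8, 71.5),   # vibration above limit -> alert
        Reading("mixer-04", 2.1, 88.3),   # temperature above limit -> alert
    ]
    yield from sample

def maintenance_alerts(readings: Iterator[Reading]) -> Iterator[str]:
    """Emit an alert for each reading that breaches a condition threshold."""
    for r in readings:
        if r.vibration_mm_s > VIBRATION_LIMIT or r.temperature_c > TEMPERATURE_LIMIT:
            yield f"ALERT {r.machine_id}: vibration={r.vibration_mm_s}, temp={r.temperature_c}"

if __name__ == "__main__":
    for alert in maintenance_alerts(sensor_stream()):
        print(alert)
```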

Cognitive Computing: Within five years, the applications of decision support — price management, trade promotion management, supply chain planning, and rules-based logic (inventory/order matching, available-to-promise and allocation) — will be replaced by cognitive computing, which has a well-defined sensing mechanism with inputs translated through an ontology into outputs. The data can be both structured and unstructured.
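To make the ontology idea concrete, here is a minimal, hypothetical sketch: unstructured text (a complaint) and structured data (an order record) are mapped through a small ontology of concepts to recommended actions. The concepts, keywords and actions are illustrative assumptions, not the actual model of any cognitive platform.

```python
# Minimal sketch of "inputs translated through an ontology into outputs".
# The ontology, keywords and actions below are illustrative assumptions.
ONTOLOGY = {
    "leak":        {"concept": "seal_failure",  "action": "open_quality_case"},
    "late":        {"concept": "service_miss",  "action": "expedite_replenishment"},
    "overcharged": {"concept": "pricing_error", "action": "route_to_deductions"},
}

def classify(text: str) -> list[dict]:
    """Map unstructured text onto ontology concepts via simple keyword matching."""
    text = text.lower()
    return [entry for keyword, entry in ONTOLOGY.items() if keyword in text]

def decide(order: dict, complaint_text: str) -> list[str]:
    """Combine structured (order) and unstructured (complaint) inputs into actions."""
    actions = [m["action"] for m in classify(complaint_text)]
    # Structured signal: a rules-style check layered alongside the ontology output.
    if order.get("days_late", 0) > 2:
        actions.append("expedite_replenishment")
    return sorted(set(actions))

if __name__ == "__main__":
    order = {"order_id": "SO-1001", "days_late": 3}
    print(decide(order, "Customer reports the pump has a leak and we were overcharged."))
    # -> ['expedite_replenishment', 'open_quality_case', 'route_to_deductions']
```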

Open Source Analytics: The use of open source analytics — like Hadoop, Kafka and Spark — reduces costs and improves capabilities. The use of Hadoop for data lakes and data mining, along with machine learning, improves master data management.
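As one hedged example of how Spark-style open source tooling can support master data management, the sketch below normalizes supplier names in a small DataFrame and collapses near-duplicate records onto a single key. It assumes a local PySpark installation; the column names and records are hypothetical.

```python
# Minimal PySpark sketch: cleaning supplier master data in a data lake.
# Assumes pyspark is installed; run with `python this_file.py` or spark-submit.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("mdm-dedup-sketch").getOrCreate()

# Hypothetical raw supplier records as they might land in a data lake.
raw = spark.createDataFrame(
    [
        (1, " Acme Industrial Co. "),
        (2, "ACME INDUSTRIAL CO"),
        (3, "Globex Packaging"),
    ],
    ["record_id", "supplier_name"],
)

# Normalize: trim, lowercase, strip punctuation so name variants share one key.
cleaned = raw.withColumn(
    "match_key",
    F.regexp_replace(F.lower(F.trim(F.col("supplier_name"))), r"[^a-z0-9 ]", ""),
)

# Collapse near-duplicates onto a single golden record per key.
golden = cleaned.dropDuplicates(["match_key"]).select("match_key", "supplier_name")
golden.show(truncate=False)

spark.stop()
```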

Blockchain/Hyperledger: While traditional investments focused on the enterprise, the automation of business networks represents the future. Blockchain — an immutable ledger — enables the rethinking of business-to-business analytics. While new and largely unproven, blockchain technology offers the possibility of track and trace to improve lineage, the redefinition of supply chain finance, and the ability to improve supplier onboarding.
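The sketch below is not Hyperledger; it is a hedged, stripped-down illustration of the underlying idea: an append-only chain in which each entry carries the hash of the previous one, so tampering with a lot's history is detectable. The event fields are hypothetical.

```python
# Minimal sketch of an immutable, hash-chained ledger for lot track-and-trace.
# This illustrates the principle only; real deployments use platforms such as
# Hyperledger Fabric. Event fields below are illustrative assumptions.
import hashlib
import json

def _hash(entry: dict) -> str:
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

class Ledger:
    def __init__(self) -> None:
        self.entries: list[dict] = []

    def append(self, event: dict) -> None:
        prev_hash = self.entries[-1]["hash"] if self.entries else "genesis"
        entry = {"event": event, "prev_hash": prev_hash}
        entry["hash"] = _hash({"event": event, "prev_hash": prev_hash})
        self.entries.append(entry)

    def verify(self) -> bool:
        """Recompute the chain; any edited entry breaks every later link."""
        prev_hash = "genesis"
        for e in self.entries:
            if e["prev_hash"] != prev_hash or e["hash"] != _hash(
                {"event": e["event"], "prev_hash": e["prev_hash"]}
            ):
                return False
            prev_hash = e["hash"]
        return True

if __name__ == "__main__":
    ledger = Ledger()
    ledger.append({"lot": "L-2041", "step": "produced", "site": "Plant 7"})
    ledger.append({"lot": "L-2041", "step": "shipped", "carrier": "XYZ"})
    print("chain valid:", ledger.verify())          # True
    ledger.entries[0]["event"]["site"] = "Plant 9"  # tamper with history
    print("chain valid:", ledger.verify())          # False
```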

Memory and Concurrent Optimization: Traditional applications were constrained by memory. This is no longer the case. Advances in in-memory computing give companies the ability to rethink planning and decision support. Traditional applications, as we know them, will shift toward new capabilities as more and more data can be managed in memory.

As companies look forward, the emphasis is on active data: actionable, real-time data made consumable as insights through new forms of analytics. There is a shift away from passive, latent data processed in batch.

New forms of analytics embrace structured and unstructured data and move at the speed of business. Advances in open source and concurrent optimization enable outside-in processes from the consumer back. This will redefine decision support applications like supply chain planning, price management, network design applications, transportation management/visibility, and trade promotion management. 

In addition, the rules of the supply chain are becoming more robust. While traditional rules map “single ifs” to “single thens,” the rise of cognitive computing enables “multiple ifs” to map to “multiple thens.” This enables adaptive rule sets for inventory/order matching, ATP, allocation, order/invoice matching and deduction logic, along with freight matching/routing.
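A hedged sketch of the difference: the first rule below is the traditional single-if/single-then form, while the second maps several conditions to several actions at once, the “multiple ifs” to “multiple thens” pattern described above. The conditions, actions and order fields are hypothetical.

```python
# Minimal sketch contrasting single if/then rules with multi-condition,
# multi-action rules. Fields, thresholds and actions are illustrative.
from typing import Callable

Rule = tuple[list[Callable[[dict], bool]], list[str]]

RULES: list[Rule] = [
    # Traditional: single if -> single then.
    ([lambda o: o["inventory"] < o["order_qty"]],
     ["backorder"]),
    # Cognitive-style: multiple ifs -> multiple thens.
    ([lambda o: o["inventory"] < o["order_qty"],
      lambda o: o["customer_tier"] == "A",
      lambda o: o["days_to_promise"] <= 2],
     ["allocate_from_alternate_dc", "expedite_freight", "notify_account_team"]),
]

def evaluate(order: dict) -> list[str]:
    """Fire every rule whose conditions all hold; collect the resulting actions."""
    actions: list[str] = []
    for conditions, thens in RULES:
        if all(cond(order) for cond in conditions):
            actions.extend(thens)
    return actions

if __name__ == "__main__":
    order = {"inventory": 40, "order_qty": 100,
             "customer_tier": "A", "days_to_promise": 1}
    print(evaluate(order))
    # -> ['backorder', 'allocate_from_alternate_dc', 'expedite_freight',
    #     'notify_account_team']
```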

The supply chain is moving to multi-tier processes that disintermediate banking and redefine supply chain finance. This, along with outside-in process flows, relegates the enterprise applications built over the last two decades on an ERP backbone to the role of system of record.

This evolution will not be a step change. It will happen incrementally over many years. The solutions will not come from one vendor, but from many. To seize the opportunity, companies should build a cross-functional business team to test and learn about new analytic techniques. With the best opportunities often coming from best-of-breed small vendors, there’s a need to learn to partner with innovators.

In addition, companies need to build a strategy to reinvent process capabilities based on new forms of analytics (see Figure 2). This requires “unlearning” and then “retooling” the organization. Companies will need to learn new concepts and build new capabilities.

Simultaneously, the traditional world of supply chain management, as we know it, will be redefined. We must first learn the practices of the past to then “unlearn” them in order to imagine the future. The technology is the easiest element. The hardest step is unlearning to enable the re-skilling.

Figure 2: Analytics
