The Overlooked Opportunities of Inventory & Capacity Based Supply Chain Analytics

Much of the analytics effort to date has focused on the demand side of the supply chain equation. Yet analytics can be applied just as effectively to the supply side to assure reliable and cost-effective inventory, capacity, and supplier capabilities. By Pierre Mitchell, December 02, 2013

As companies work to elevate the value and sophistication of their products and services, they must also do the same for their “information supply chains.”

This requires generating deeper insight and intelligence to uncover hidden risks and rewards from increasingly voluminous and fragmented data littered across internal and, increasingly, cloud-based databases. Supply chain analytics can deliver the needed intelligence and insight.

But to do so, analytics needs to go beyond the silos (such as focusing only on forensic spending analysis in procurement) to layer multi-dimensional data modeling, visualization, optimization, and other technologies on top of a harmonized view of this data.

Supply chain analytics is a broad topic; it is typically focused on the customer/demand side to tune the demand-driven supply chain.

However, the supply-driven side is just as important to assure the discovery and positioning of multi-tier supply, which requires reliable and cost-effective inventory, capacity, and supplier capabilities.

We define supply analytics simply as analyses that support improved design and management of the inbound value chain. But, a simple definition doesn’t mean that the analytics are simple.

True supply analytics goes well beyond supply planning against an S&OP (Sales & Operations Plan) and sharing it with suppliers.

As the old quote goes: “Ginger Rogers did everything Fred Astaire did, but backwards and in high heels.” So when a heel breaks on the supply side (for example, the Japanese tsunami, Thailand flooding, and so forth), bad things happen.

Those “bad things” represent supply chain risk potential, which has become a subject of great interest to supply chain professionals. Among the findings from The Hackett Group’s 2012 Procurement Key Issues research:

  • Supply chain agility (an antidote to supply risk) is the top-rated issue for supply chain executives.
  • Aside from spend savings/influence and innovation support, supply risk is the top issue for procurement executives.
  • Input price of raw materials and energy is the highest-rated volatility area for senior executives, who expect it to get even worse in a few years. (See Exhibit 1.)

Manage Not Just to Value, But to Variability

Understanding the impact of supply volatility and variability on business performance is critical. Yet it's often viewed as secondary in importance to customer/demand analytics.

To illustrate: on the demand side, Target Corp. uses customer data mining, content aggregation, and predictive analytics to predict customer pregnancies with roughly 90 percent accuracy.

But what about when organizations want similarly detailed insights on suppliers and supply markets? Where might supply costs go? How will this affect profits? What are the biggest supply risks?

While many procurement groups do forecasting to anticipate price hikes, only 39 percent mitigate price increases effectively and integrate them into cross-functional business/profit planning. (Exhibit 2 shows the results from the Hackett Group’s “Input-Cost Inflation Study.”)

Analyzing historic spending, while challenging, is much easier than predicting future spending confidently based on multiple levels of cost models, supply tiers, and input commodities. Yet tying these predicted economic costs (not sunk costs) to products, channels, customers, and profits is essential.

One way to do this is to drive the S&OP process deeper upstream into the inbound supply chain, to the end-commodity level, and then to incorporate both demand variability and supply variability into the models rather than just managing to point estimates.
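The difference between managing to point estimates and managing to variability can be sketched with a simple Monte Carlo simulation. All figures below (demand, input cost, and price parameters) are hypothetical, invented for illustration:

```python
import random

random.seed(42)

# Hypothetical planning inputs -- illustrative only
mean_demand, demand_sd = 100_000, 15_000   # units per quarter
mean_cost, cost_sd = 4.00, 0.60            # commodity input cost per unit ($)
price = 6.50                               # selling price per unit ($)

# Point-estimate plan: a single comfortable-looking margin number
point_margin = mean_demand * (price - mean_cost)

# Monte Carlo plan: sample demand and input cost together
margins = []
for _ in range(10_000):
    demand = max(0.0, random.gauss(mean_demand, demand_sd))
    cost = max(0.0, random.gauss(mean_cost, cost_sd))
    margins.append(demand * (price - cost))

margins.sort()
downside = margins[len(margins) // 20]  # 5th-percentile margin outcome

print(f"point-estimate margin: ${point_margin:,.0f}")
print(f"5th-percentile margin: ${downside:,.0f}")
```

The point estimate hides the downside scenario entirely; the simulated distribution makes it an explicit planning input, which is exactly what hedging and inventory-buffer decisions need.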

The following case study on StuffCo (not the company's real name) illustrates the point.

StuffCo, a multibillion-dollar consumer products manufacturer with razor-thin margins, was faced with highly volatile commodity prices on one side (material costs for paper, PET, and coatings made up nearly half of total landed product costs to customers) and unforgiving, price-sensitive retailers on the other.

Its response was to develop an innovative and integrated business planning process with world-class S&OP at its core. Importantly, StuffCo also linked the process to supply market realities by integrating supply market intelligence and purchase price forecasts formally into time-phased profit plans by product and customer.

StuffCo used this forward-looking supply intelligence to create robust scenarios, perform additional analyses where needed, and then optimize all its options—finding substitute inputs, changing specifications, reconfiguring the product mix to the customer, trying to increase customer price, hedging (physical or financial), changing the supply network, and adopting other strategies.

This effort helped minimize millions of dollars in product/customer profitability erosion while resulting in a more robust and predictable plan.

The StuffCo case study has at least four of the hallmarks of a good supply analytics implementation:

  1. A clear business problem and associated metrics. The focus was on improving margin, then drilling into lower levels of detail to support fact-based decisions and strategies.
  2. An end-to-end process that is chartered and championed by C-level management. The integrated S&OP planning team was led by the COO and supported by the CEO. They drove the team to make good decisions, resolve trade-offs, and remove roadblocks.
  3. Forward-looking scenarios and causal analysis used to understand variability and performance drivers, without getting lost in the data. The team constantly forced itself to make decisions based on the analysis. As Tom Stallkamp, the former CPO of Chrysler, once said: “If you’re waiting for more data, it’s probably too late.”
  4. Performed iteratively to demonstrate value and self-fund subsequent improvement opportunities. StuffCo started with spreadsheets as a proof of concept, but then implemented the supply analytics described above in a large business intelligence (BI) application environment to properly scale it. Starting small and demonstrating early wins also helped uncover future opportunities in such areas as data quality. Pragmatically, fixing upstream data problems also helped free managers’ time to focus on analyzing data rather than wrestling with it.

Consider the value added by the knowledge workers at advanced companies like Target and our StuffCo example. According to Hackett Group data, procurement staff at world-class companies spend less than 30 percent of their time compiling data, compared to 60 percent for the bottom quartile.(1)

In other words, while typical companies are still wrangling data, world-class procurement organizations are spending more time analyzing it and making informed recommendations.

Another side benefit of analytics is the ability to highlight opportunities to standardize data, improve data quality, consolidate systems, and change processes and policies.

All of these actions help trim the “information supply chain.” To quote Larry Kittelberger, former SVP/CIO at Honeywell: “Seventy-five percent of the effort and cost is process reengineering and data cleansing and creation, and 25 percent is the IT portion. When people say their systems didn’t deliver, chances are they missed the 75 percent they should have been working on.”(2)

Putting Analytics at the Center of Performance Management

Analytics can often seem like an overly complicated or abstract concept, but nearly all of us are analyzing problems all the time.

And while improving analytics capabilities can certainly get complicated in terms of technology and techniques, the end objective is fairly straightforward: To integrate analytics into broader supply chain and business processes so that we are analyzing the right things to answer the right questions and make the best decisions.

So, where to start? First, it’s important to recognize the broad scope of analytics:

  • From simple reporting and ad hoc analysis to multidimensional predictive analytics.
  • From extended supply network design to real-time troubleshooting.
  • From analyzing root causes of known problems to discovering entirely new insights and classes of problems.
  • From steady-state process execution to transformational process support.

This broad scope puts analytics (and supporting information management activities) at the top of most supply management organizations’ IT priority list. But analytics is not really an IT issue— it’s a business issue to make better business decisions.

Thus, the best way to frame analytics is in its rightful place at the heart of a performance management framework such as DMAIC (Define, Measure, Analyze, Improve, Control) with “Analyze” in the center. (See Exhibit 3.) Analytics commands the center position because it is critical to all facets of closed-loop lifecycle processes within the value chain.

And because this article is about supply analytics, we’ll frame it as the spend/supply management lifecycle(3) that is actually embedded within multiple business lifecycles—not just a sourcing process tied to a supplier contract cycle or a component in a product lifecycle.

Corporate procurement and legal groups are obviously biased toward the contract lifecycle. Finance and indirect procurement staff focus on the budget lifecycle. Engineering groups liaise with direct procurement staff regarding strategic supply choices made during product lifecycle management, asset design, project planning, and customer engagement.

Progressive supply organizations don’t operate these silos independently. Rather, they engage with stakeholders across all of these lifecycles to optimize supply/spend outcomes through a combination of processes such as category management or S&OP. The truly strategic aspects of supply chain analytics will occur not only within these lifecycles, but also on the edges where they intersect.

A few advanced companies even use DMAIC as a top-level business management framework, integrating other methodologies such as strategic sourcing and supplier management at a lower level.

Viewing analytics within the context of the Six Sigma framework in Exhibit 3 specifically enables companies to:

Define the value that spend owners must obtain from their spending and suppliers in relation to their goals and budgets. Value analysis/value engineering (VA/VE) can be used here, as can techniques like Hoshin planning and zero-based budgeting.

Measure the value currently being delivered—and the gaps—in terms of savings, supplier performance, supply risk mitigation, cycle time, and spend compliance.

This is the realm of performance analytics, scorecards/dashboards, and measurement of past and future spending levels (that is, “spend analysis” vs. spend planning). While seemingly straightforward in principle, obtaining historic, line-item level spend visibility by supplier, category/commodity, cost center/location, contract and transaction channel is not easy for most companies, especially on a recurring, automated basis.
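At its core, that line-item visibility is a matter of reliably rolling up cleansed transactions along several dimensions at once. A toy sketch over hypothetical AP/PO transactions (supplier names and amounts invented for illustration):

```python
from collections import defaultdict

# Hypothetical AP/PO transactions: (supplier, category, cost_center, amount)
txns = [
    ("Acme Corp", "Packaging", "Plant 1", 12_000),
    ("Acme Corp", "Packaging", "Plant 2", 8_000),
    ("Beta Inc",  "Logistics", "Plant 1", 5_500),
    ("Acme Corp", "Logistics", "Plant 1", 2_500),
]

def rollup(key):
    """Aggregate spend along any dimension chosen by the key function."""
    totals = defaultdict(float)
    for supplier, category, cost_center, amount in txns:
        totals[key(supplier, category, cost_center)] += amount
    return dict(totals)

by_supplier = rollup(lambda s, c, cc: s)
by_category = rollup(lambda s, c, cc: c)

print(by_supplier)  # {'Acme Corp': 22500.0, 'Beta Inc': 5500.0}
```

The hard part in practice is not the aggregation itself but the upstream classification and cleansing that make the supplier, category, and cost-center keys trustworthy on a recurring basis.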

Improve the spending/supply outcomes and document savings delivered. This stage of the supply/spend management lifecycle involves scaling the capabilities needed for more-sophisticated analytics, scorecarding, market intelligence, and master data management (MDM) to support more complex analyses. An example would be mapping payment terms and supplier financing costs into a supplier payment analysis.

Control the process to maintain improvements achieved and remain in compliance with contracts, suppliers, regulations, and channel requirements. This part of the lifecycle includes building supply market intelligence, risk-management alerts, and predictive analytics that can alert users to changes in conditions.

Different types of analytic approaches and tools are better suited for different parts of the lifecycle. Exhibit 4 depicts various examples of analytics across these phases and the types of business decisions that they help address.

The types of analytics most appropriate to these lifecycles will vary, as will the best way to buy or build them. The first step is to understand what is possible, and then pick the types of analytics that are best-suited to the situation.

It’s important here to consider such factors as size of the opportunity, appetite for implementation risk, technical skills available, and budget. Through the supply analytics process, it’s important to keep in mind three guiding principles, each of which is discussed below:

  • make variability your friend;
  • embed timely analytics into the business process;
  • and link company analytics to external content.

Make Variability Your Friend

Identifying variation is a simple way to find improvement opportunities through supply analytics.

The following actions can help in this regard:

Benchmark supply performance and capabilities within the company’s business units and functions against other companies.

For example, The Hackett Group has a cloud-based service called Performance Exchange that automatically extracts transactional information from ERP systems and compares a company’s performance against Hackett’s procurement benchmark database. Benchmarking analysis (e.g., price/cost benchmarking) is also popular for comparing capabilities and performance across suppliers.

Translate demand variation to supply variation during:

  • Supply planning. Demand variability must be translated to supply requirement variability so that upstream inventory planning, capacity planning, and commercial risk management decisions can be made intelligently. For example, Hewlett-Packard’s procurement risk management program uses advanced analytics to trade off uncertainty in demand, availability and price against risk tolerance—including the cost of risk mitigation—to make optimized contract decisions.
  • Sourcing.  A best practice is to present large market baskets to suppliers so that they can flexibly bid on the requirements that are best suited to their capabilities.

This allows both sides to win, rather than holding all variables constant except for price before running a price-compressing, winner-take-all reverse auction on predetermined lots.

The value of such “expressive bidding” is well proven, especially in transportation. (An example of this can be found in Procter & Gamble’s work with CombineNet.)

Manage cost/price variation. The lowest of the low-hanging fruit in spend analytics is finding lower prices paid for the same item. However, identifying price discrepancies for common materials used in different parts isn’t so easy.
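Even that lowest-hanging fruit is worth automating. A minimal sketch that flags price spread for the same item across hypothetical purchase-order lines (item IDs, suppliers, and prices are invented for illustration):

```python
from collections import defaultdict

# Hypothetical PO lines: (item_id, supplier, unit_price)
po_lines = [
    ("FASTENER-M8", "Supplier A", 0.12),
    ("FASTENER-M8", "Supplier B", 0.19),
    ("FASTENER-M8", "Supplier A", 0.13),
    ("RESIN-PET",   "Supplier C", 1.40),
    ("RESIN-PET",   "Supplier C", 1.40),
]

# Collect every price paid per item
prices = defaultdict(list)
for item, supplier, price in po_lines:
    prices[item].append(price)

# Flag items whose price spread exceeds 10% of the lowest price paid
for item, p in prices.items():
    spread = (max(p) - min(p)) / min(p)
    if spread > 0.10:
        print(f"{item}: paid {min(p):.2f}-{max(p):.2f} ({spread:.0%} spread)")
```

The harder problem the article notes, finding the same underlying material hidden in different part numbers, requires the kind of should-cost and attribute-matching analysis described next, not just a key match.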

Caterpillar Inc. has used a vendor called Akoya to analyze estimated “should-cost” variability based on CAD drawings, material libraries, and commodity price data across its component portfolio to identify material cost reduction opportunities.

Conduct multi-tier supply analysis. Caterpillar uses a similar “should-cost” analytic approach for multi-tier capacity planning in such activities as foundry analysis.

Multi-tier supply analysis also can be applied to:

  • Freight bid optimization to find optimal combinations of freight providers, including private fleet, core carriers, and 3PLs.
  • Working capital analysis to identify supply-chain finance opportunities to lower capital costs for suppliers (and presumably share in the gains).
  • Multi-echelon inventory planning to optimize the deployment of inventory, regardless of who owns it.
  • Extended supply network design analysis that goes upstream beyond internal manufacturing locations.
  • “Design for supply” analysis that focuses on supplier manufacturing capabilities, use of standardized/substitutable raw materials, and other factors that support total cost reduction or other business objectives. On the flip side, from a demand standpoint, it goes without saying that value analysis ensures that design requirements are pegged quantifiably back to true customer requirements.

Embed Timely Analytics into the Business Process

Analytics are best executed upstream: during the design phase, for example, before costs are locked in. This approach transforms compliance analysis from downstream forensic reporting to upstream process “failsafing.”

For example, rather than procurement chasing maverick spenders, AI-based auto-classification technology borrowed from spend analytics applications can be used in a Google-like manner to classify requisitioner search requests, guiding them to preferred suppliers before the spend occurs. (One major life sciences firm uses the vendor Zycus exactly for this purpose.)

Such spend planning also is key to supporting a zero-based budgeting approach rather than the dysfunctional “use it or lose it” budgeting process. The goal is to use analytics to apply the principle of “quality at the source” to information—leading to better quality of decisions.

Interestingly, analytics can actually help improve data quality by exposing “dirty” master data that feed the analysis—whether the source of that data be suppliers, items, contracts, cost centers, bill of materials, and so on.

Link Company Analytics to External Content

Analytics have limited effectiveness if applied only to internal systems, especially if procurement organizations are trying to analyze broader trends, make predictions, and measure themselves against external indices (for instance, using purchase-price benchmarking to determine true price performance against the market, rather than just the last price paid).

This type of price/cost benchmarking for all TCO elements can only be done with a total cost model that is linked to external commodity price indices. The case study on “EuroElectric” below illustrates the importance of linking analytics to external content.

Analyzing cost and volume variances relative to external indices obviously makes sense for direct materials commodities. But it can be applied to other spending as well.

For example, a major European electric utilities company, we’ll call it EuroElectric, sought to improve the management of its budgeted vs. actual spend for non-feedstock spending. It wanted a single system that separated consumption variation (within a contract and also across contracts), external market pricing variation, and procurement-led pricing impacts.

To accomplish this, EuroElectric used a third-party tool for spend analysis and procurement performance analysis (from Sievo) that was cross-referenced to a database with thousands of price indices (fed primarily from third-party content provider IHS).

Such transparency allowed the company to reduce the “noise” among finance, procurement, and budget owners. Further, it allowed for a more fact-based discussion focused on continuous improvement.
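The kind of separation EuroElectric needed can be sketched with a classic budget-variance decomposition, splitting the price effect into a market-driven piece (what the external index implies) and a procurement-driven piece (actual price versus the index-adjusted price). The quantities, prices, and index movement below are hypothetical:

```python
# Hypothetical budget vs. actual figures for one spend category
budget_qty, budget_price = 1_000, 50.00   # units, $/unit at budget time
actual_qty, actual_price = 1_100, 54.00   # what actually happened
index_change = 0.12                       # external price index rose 12%

# Volume effect: consumption variation, valued at the budget price
volume_var = (actual_qty - budget_qty) * budget_price

# Split the price effect into market-driven vs. procurement-driven
market_price = budget_price * (1 + index_change)              # index-implied price
market_var = (market_price - budget_price) * actual_qty       # market's doing
procurement_var = (actual_price - market_price) * actual_qty  # negative = saving

# The three effects reconcile exactly to the total spend variance
total_var = actual_qty * actual_price - budget_qty * budget_price
assert abs(total_var - (volume_var + market_var + procurement_var)) < 1e-6

print(f"volume: {volume_var:+,.0f}  market: {market_var:+,.0f}  "
      f"procurement: {procurement_var:+,.0f}")
```

Here total spend ran $9,400 over budget, but the decomposition shows procurement actually beat the market by $2,200; the overrun came from higher consumption and index inflation. That is precisely the “noise reduction” between finance, procurement, and budget owners described above.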

Granted, making such a system work requires substantial modeling expertise, specifically the ability to manage multiple cost models linked to the many indices. But analytics offers competitive advantage precisely because the competition cannot easily replicate such an advanced system.

The EuroElectric example confirms that analytics work best when integrated to external information. Analytics without external information integration (a “data warehouse without windows” if you will) is just navel-gazing.

On the other hand, external information without analytics to sift through massive volumes of “big data” is like drowning in a sea of data. As such, we’ve started seeing a trend toward pre-packaged hosted analytics applications (or intelligence services that use such analytics) designed to make sense of massive data sets.

The integration among external information, internal analytics, and strategy is nowhere more apparent than in managing supply risk: supplier risk, regulatory risk, competitive risk, IP risk, and so forth. All of these risks impact supply performance.

Managing these risks involves a comprehensive array of analytic techniques, including the following: scenario planning to identify all risk types; Monte Carlo simulation to quantify the probability/impact of adverse risk events; segmentation and visualization as in risk “heat maps”; and predictive analysis to identify, prioritize and mitigate the biggest risks for the least investment.
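The Monte Carlo step in that toolkit can be sketched as a simulation over a risk register. The events, probabilities, and impact ranges below are invented for illustration, not drawn from any real program:

```python
import random

random.seed(7)

# Hypothetical annual risk register: (event, probability, impact_low, impact_high)
risks = [
    ("sole-source supplier failure", 0.05, 2e6,   8e6),
    ("commodity price spike",        0.30, 0.5e6, 3e6),
    ("port/logistics disruption",    0.15, 0.2e6, 1.5e6),
]

def simulate_year():
    """One simulated year: each event may occur and, if so, draws an impact."""
    loss = 0.0
    for _, prob, low, high in risks:
        if random.random() < prob:
            loss += random.uniform(low, high)
    return loss

losses = sorted(simulate_year() for _ in range(20_000))
expected = sum(losses) / len(losses)
p95 = losses[int(0.95 * len(losses))]  # 95th-percentile annual loss

print(f"expected annual loss: ${expected:,.0f}")
print(f"95th-percentile loss: ${p95:,.0f}")
```

Comparing the expected loss against the tail (95th-percentile) loss is what lets a team prioritize mitigations by where a limited budget buys the most risk reduction.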

At present, no single services or applications provider incorporates all of these capabilities; some assembly will be required.

Making Progress Without the “Hard ROI”

While this might sound like heresy, the problem with determining the ROI for analytics is that it is all “option value.” That is, you don’t know exactly how you will benefit until you actually do it.

This makes obtaining funding for a supply analytics initiative very challenging when senior executives demand “hard ROI.” Below are a few tactics to advance the cause of analytics:

  • Identify and leverage internal champions in corporate strategy, IT, finance, enterprise risk, and corporate intelligence.
  • Set up an analytics Center of Excellence within the supply chain to manage reporting, benchmarking, KPIs, intelligence/knowledge, training, and interfacing with IT.
  • Benchmark externally and internally, and raise the visibility of strategic questions and associated KPIs that cannot be easily answered or measured. C-level executives do not like blind spots; when confronted with them, they will likely ask for more explanation and analysis. Analytics will shed light on these blind spots while invariably highlighting broader opportunities. For example, how many unique payment terms do you have? How many different prices and contract terms do you have for the same items? The answers might be quite shocking.
  • Start small and self-fund. Nothing breeds success like success. Use limited-scope projects to implement the previous recommendations.
  • Make the analytics broadly accessible, similar to what many companies have done with Lean and Six Sigma. It is amazing how much value can be drawn from analyzing spend/supplier data in new ways via analytical “workbenches” that don’t just drill down into traditional data hierarchies, but also “drill around” by understanding patterns and relationships across different types of data.
  • Work collaboratively with IT to create an analytic architecture that supports—but is not slowed down by—the corporate business intelligence (BI) platform. Use BI to aggregate data. But also use third parties for specialized data classification, enrichment, and pre-packaged analytical applications that knowledge workers can immediately use and begin to explore to uncover opportunities.
  • Extend the analytics already in place and use them in new ways. A large equipment manufacturer integrated its supply network model with a natural-hazards “hot spots” database from Columbia University to prioritize risk mitigations. Another company, a manufacturer of paper products, tweaked its supply network model to display CO2 as its unit of measure (rather than dollars) to get an immediate, detailed view of its transportation emissions footprint.
  • Tap supply markets and get the analytics as a service. The idea of augmenting what you already have extends to third parties. Some companies have packaged up and outsourced certain types of supply market intelligence activities to companies like Beroe, SmartCube, and other KPO firms. Others have used hosted analytics vendors, professional services providers, broader application providers (ERP, BI, best-of-breed suites), content/intelligence providers, benchmarking firms, and large professional service providers (BPO firms, management consultants) that use analytics to perform diagnostic services.

Looking Forward: The Great Analytics Mash-up

Many of the service providers we’ve mentioned are increasingly partnering with other firms to provide the benefits of broad and deep analytics integrated with external content, delivered via a cloud-based service.

The lines are increasingly blurring between providers of software, content, and professional services. To cite a few examples:

  • PowerAdvocate converges powerful supply intelligence content/analytics into its industry-specific application suite.
  • Resilinc brings together supply risk management, supply market intelligence, and supply network design down to the part level.
  • TriplePoint merges trading-desk analytics, contract management, and procure-to-pay (P2P) for managing highly volatile commodities.

Many large BPO firms offer outsourced supply chain analytics, but also use analytics as part of their core service offerings. Procurian, for example, has a strong contract/spend visibility and compliance solution it uses internally to verify (and get paid for) implemented savings.

The company presently is piloting this as a standalone dashboard solution. Similarly, IBM tapped its Watson Labs group to apply mathematical optimization to the problem of identifying and prioritizing supplier contract non-compliance opportunities. In fact, IBM is now implementing this work within its own business as well as with some early-adopter customers.

There is clearly plenty of white space in supply analytics that transcends basic spend analysis and supply planning. Through the lens of The Hackett Group’s benchmarking data, we see that supply analytics are a hallmark of world-class procurement and supply chain performance.

However, it should also be clear that broader supply analytics have the potential to create an even larger performance advantage within the extended supply chain.

Endnotes:
(1) Source: The Hackett Group procurement benchmark database.
(2) CIO Today.
(3) “Spend” and “supply” are two sides of the same coin. Spend management maximizes supply value from minimum spending (i.e., “bang for the buck”), measured across a “balanced scorecard of supply.” “Supply” refers to the inbound supply chain, e.g., supply base, spend category, individual supplier, supplier/item.

Pierre Mitchell is Research Director, Procurement & Supply Chain, The Hackett Group. He also serves as an adjunct business advisor within Hackett’s Procurement Executive Advisory Program. He can be reached at