According to Accenture research, the pressure on companies to rein in spending is pervasive: 73 percent of companies consider strategic cost reduction to be a very important part of their legacy business transformation over the next three years.1 But there’s a problem: Much of the low-hanging fruit has already been plucked, meaning it’s getting harder to find opportunities for more significant cost reductions.

Industrial equipment and product manufacturers, for instance, have reduced costs within individual functions. But they struggle to do more because of steep functional barriers: Each department has its own language, datasets, priorities and ways of analyzing its inefficiencies, which makes getting a comprehensive picture of costs a massive challenge. Complicating matters, industry trends such as shorter product life cycles, rising demand for customization and higher service expectations are working against them.

Here’s a common example: Suppliers’ parts often fail during assembly. The factory, which is focused on cycle time, keeps a large stock of these parts so it can replace a failed part as quickly as possible. Parts also fail in the field, requiring field engineers to fix the problem rapidly to minimize downtime. But these incidents are rarely, if ever, linked, because on its own the failed part simply isn’t a big issue for either manufacturing or field service. Viewed across the entire value chain, however, the same part can turn out to be a major cost for the larger enterprise.

Figure: A single customer touchpoint, supported by a collaborative ecosystem (Source: Accenture Strategy, 2018)

Zero-base the cost of quality

What companies today need is their version of the Rosetta Stone—a way to translate and harmonize every function’s view of the world so decision makers can understand their true costs across the enterprise. It would help them get a handle on the “cost of non-quality,” or the delta between their actual costs and what they’d be in a perfect world, and in turn, identify the biggest cost targets and put in place programs to address them.

Fortunately, this sort of Rosetta Stone actually exists. Accenture Strategy has developed a simple, effective approach, underpinned by powerful analytics tools, that enables any enterprise manufacturing high-tech, complex equipment and products to connect the dots between functions and, in turn, both quantify its true cost of non-quality and pinpoint the drivers of that cost. This zero-based approach can help product manufacturers reduce their cost of goods sold (COGS), by far their biggest expense, by more than 10 percent.2

Our recent research finds that only 20 percent of companies are applying zero-based principles to the supply chain, and of those, only 70 percent are addressing COGS.3 That leaves a huge opportunity. Capturing it starts with establishing a standard definition of the cost of non-quality across functions and identifying the lowest common denominator to which any cost can be attributed. For original equipment manufacturers (OEMs), that denominator is typically the part number, which makes sense because every department uses it, as do customers and suppliers. The part number makes it easy to follow a part across the value chain: If a part fails in the factory or at a customer’s site, the company can use the part number to trace every cost in every process triggered by the failure, as the sketch below illustrates.
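To make this concrete, here is a minimal sketch in Python (using pandas) of how the part number can serve as the common key across functions. The table names, column names and amounts are illustrative assumptions, not an actual OEM schema:

import pandas as pd

# Each function exports its own cost records, keyed by part number.
# (All names and amounts below are hypothetical.)
factory = pd.DataFrame({
    "part_number": ["P-1001", "P-1001", "P-2002"],
    "scrap_and_rework_cost": [1200.0, 800.0, 150.0],
})
field = pd.DataFrame({
    "part_number": ["P-1001", "P-3003"],
    "field_repair_cost": [9500.0, 400.0],
})
inventory = pd.DataFrame({
    "part_number": ["P-1001", "P-2002"],
    "safety_stock_carrying_cost": [2100.0, 300.0],
})

# Aggregate each function's view, then align all views on the shared key.
cost_views = [
    factory.groupby("part_number").sum(),
    field.groupby("part_number").sum(),
    inventory.groupby("part_number").sum(),
]
merged = pd.concat(cost_views, axis=1).fillna(0.0)
merged["total_cost_of_non_quality"] = merged.sum(axis=1)

# Rank parts by enterprise-wide cost of non-quality. A part that looks
# minor to any single function can surface at the top of this list.
print(merged.sort_values("total_cost_of_non_quality", ascending=False))

In this toy dataset, part P-1001 looks unremarkable to each function in isolation, but tops the list once its factory, field and inventory costs are linked.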



A standard definition of the cost of non-quality is a prerequisite for identifying the sources of data (for example, ERP systems and Excel spreadsheets) needed to quantify that cost. By exporting that data into a central data lake and using matching logic and algorithms to link records across sources, a company can use a data visualization tool to show what is contributing to the end-to-end cost of non-quality, and by how much. (In our experience at most OEMs, field and factory materials account for the vast majority of this cost: upward of 80 percent.) The company can then create driver trees, underpinned by data, that reveal what influences the cost. Such an exercise may reveal, for example, that an inexpensive component from a supplier causes many failures of complex, expensive equipment in the field. (A simple sketch of such a driver tree follows.)
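As a rough illustration of the driver-tree idea, here is a minimal sketch in Python. The branch names and amounts are hypothetical; the point is that every branch rolls up to the enterprise-wide total, so the largest drivers surface immediately:

# A driver tree over the linked data: each node's value is the sum of
# its children, so total cost of non-quality can be traced down to its
# drivers. All categories and figures below are hypothetical.
driver_tree = {
    "field_materials": {
        "supplier_defects": 3_100_000,
        "wear_out_failures": 1_400_000,
    },
    "factory_materials": {
        "assembly_scrap": 2_200_000,
        "incoming_inspection_rejects": 600_000,
    },
    "labor_and_logistics": 900_000,
}

def roll_up(node):
    # Recursively sum leaf values so every branch reports its total.
    if isinstance(node, dict):
        return sum(roll_up(child) for child in node.values())
    return node

total = roll_up(driver_tree)
for branch, subtree in driver_tree.items():
    print(f"{branch}: {roll_up(subtree) / total:.0%} of total cost of non-quality")

With these illustrative figures, field and factory materials together account for roughly 89 percent of the total, in line with the 80-plus percent we typically see at OEMs.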

One of the key features of the approach is its agility and speed. It’s designed to be executed in a lab environment via a series of short sprints in rapid, iterative cycles. This enables a company to rationalize definitions, collect the relevant data, and identify, test, and validate the cost-reduction opportunities in eight to twelve weeks. Yet despite being light on effort, the exercise can produce outsized savings: One company using the approach has identified roughly $500 million in potential savings on a cost-of-goods spend of $5 billion.4 And that’s in an enterprise that has been deploying cost-optimization programs for years.

Gaining a competitive edge

What’s the cost-of-non-quality potential in your operations? Three good places to look are the difference between theoretical bill-of-material acquisition costs and actual spend; the gap between actual and target factory cycle time; and unexpected service costs (both labor and parts), as sketched below. You’ll get a much better picture of your true costs and will likely find plenty of ways to reduce them. And that means more savings to fuel your growth engine.
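As a back-of-the-envelope illustration, here is how those three deltas might be tallied in Python. All figures, and the per-day cycle-time cost assumption, are hypothetical:

# 1. Theoretical bill-of-material acquisition cost vs. actual spend.
bom_theoretical_cost = 48_000_000
actual_materials_spend = 53_500_000
materials_delta = actual_materials_spend - bom_theoretical_cost

# 2. Gap between actual and target factory cycle time, converted to
#    cost via an assumed cost per day of factory throughput time.
target_cycle_days = 12
actual_cycle_days = 16
cost_per_cycle_day = 250_000  # hypothetical carrying/overhead cost per day
cycle_time_delta = (actual_cycle_days - target_cycle_days) * cost_per_cycle_day

# 3. Unexpected service costs (labor and parts beyond plan).
planned_service_cost = 9_000_000
actual_service_cost = 11_200_000
service_delta = actual_service_cost - planned_service_cost

potential = materials_delta + cycle_time_delta + service_delta
print(f"Indicative cost-of-non-quality potential: ${potential:,.0f}")

With these hypothetical inputs the indicative potential is $8.7 million; the value of the exercise lies in replacing each input with your own data.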

1 Accenture, “Make your wise pivot to the New,” 2018.

2 Accenture client experience.

3 Accenture Strategy, “Beyond ZBB” research, 2017.

4 Accenture client experience.

Steve Craen

Managing Director – Accenture Strategy, Supply Chain, Operations and Sustainability Strategy


Maarten van Bree

Principal Director – Accenture Strategy, Supply Chain, Operations and Sustainability Strategy
