You Don't Need to Finish an Initiative to Start Reaping the Benefits: Drive to Agile Value Capture

Implementation programs run over schedule and over budget.  Acquisitions fail to deliver promised revenue or cost synergies.  Restructuring initiatives deliver notional, nebulous benefits.

There are multiple causes:

  • Lack of long-term focus – By the time the benefits arrive, the leaders who signed up for them have moved on.
  • Poor governance – There are many claimants to the benefits (multiple “hands in the cookie jar”), and the organization can’t agree on what drove a particular benefit.  Was it this initiative or a different one?  Random chance?
  • Unclear accountability – Program leaders and senior executives sign organizational units up for benefits without a clear owner.  When programs finish, they don’t maintain staffing to drive value capture, and neither does the business.
  • Inadequate planning and budgeting – Budgets for value capture staffing are not planned.  Investments in data and analytics capabilities are underfunded or cut at the first budget challenge.
  • Politicization of metrics – Managers resist changing metrics or creating new ones because they’ve figured out how to game the current metrics to meet their agreed performance measures or defend their fiefdoms.  Redoing the metrics would create a great deal of new work for them, and they might not be able to control the outcome.
  • Deficient performance metrics and data management – Metrics are not defined, or don’t reflect what the business needs.  Metrics lack valid baselines against which to compare future results, and data debt and data complexity preclude timely, accurate measurement of results.

These major drivers of poor value capture performance are largely organizational and behavioral.  

Yet there’s another critical flaw—a methodological gap—in how we generally think of value capture:  it’s done on a “waterfall” basis.  Brief reminder:  a waterfall methodology lays out activities and phases sequentially, A to B to C, so that later activities and outcomes are more or less rigidly determined by their predecessors.  From a value capture and ROI standpoint, this means in simple terms that we have to complete the program to get the benefits.  Right?

Wrong.

Well, let’s say it’s wrong now.  It used to be largely true, though, and the lens of that history still distorts how we view value capture.

Businesses undertake large, complex programs to drive an outcome:  growing sales or margins, entering a new market, improving product or service quality, reducing working capital, and so on.  We measure these outcomes through metrics.  The metrics, in turn, depend on data.

Now, frequently the data is a mess and metrics and analytical capabilities are in disarray, so the default thought process is to fix all those problems first, then drive the value.  Makes sense, right?

But perhaps it doesn’t.  Instead, if we take advantage of current capabilities in data management and analytics, we can begin identifying value capture opportunities in parallel with execution, as soon as we collect the data in one place.  This applies even in the most complex programs.  We don’t need to finish the whole initiative to start identifying, monitoring, and measuring the benefits.

We can drive Agile Value Capture.  Brief reminder:  an Agile methodology breaks work down into smaller chunks that can be executed incrementally with greater parallelism and flexibility.

In an Agile Value Capture approach, once we begin to pool, cleanse, and harmonize the data, we can immediately start measuring performance against metrics.  Current software-enabled capabilities allow data to be virtualized and interrelated rapidly, giving us the power to drive our metrics and outcome-based ROI.

As a simple example, suppose that as part of an M&A program, we want to improve procurement risk controls and reduce cost of goods sold.  Even though the data to drive these metrics likely comes from multiple sources and systems, pooling (or ‘staging’) it together will give us an early view of how consolidated our procurement spend is and whether we’re getting the best volume pricing on our combined purchasing.
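
To make this tangible, here is a minimal sketch in Python with pandas; the systems, suppliers, and spend figures are all invented for illustration:

```python
import pandas as pd

# Hypothetical purchase-order extracts from each company's ERP system.
# Note "Acme Corp" vs "ACME Corporation": the data is not yet harmonized.
system_a = pd.DataFrame({
    "supplier": ["Acme Corp", "Globex", "Initech"],
    "material": ["resin", "resin", "fasteners"],
    "spend_usd": [1_200_000, 300_000, 150_000],
})
system_b = pd.DataFrame({
    "supplier": ["ACME Corporation", "Initech", "Umbrella"],
    "material": ["resin", "fasteners", "solvents"],
    "spend_usd": [900_000, 80_000, 400_000],
})

# Pool ("stage") the raw data in one place before it is fully harmonized.
staged = pd.concat([system_a.assign(source="A"), system_b.assign(source="B")])

# Early view of combined spend by supplier and material -- already enough
# to see where the merged company should be negotiating volume pricing.
combined = (staged.groupby(["supplier", "material"], as_index=False)["spend_usd"]
                  .sum()
                  .sort_values("spend_usd", ascending=False))
print(combined)
```

Even at this crude stage, Initech’s spend consolidates across the two sources, while the two spellings of Acme stay split until they are matched; that is exactly the kind of early signal the harmonization work can prioritize.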

The beauty of this approach is that even with incomplete or somewhat inconsistent data (say we’re not able to ‘harmonize’ 100% of the suppliers and the materials we’re buying), we can begin to create baseline metrics based on early mapping and matching of the highest-volume suppliers and materials.  The data doesn’t have to be perfect to identify major risks, like where we single-source a key material from one vendor, and major COGS savings, like where we’re not realizing price/volume discounts because the data was duplicated.  In fact, taking a Pareto-based approach to our program data under management is a valid and valuable strategy: the highest-volume elements (in this case suppliers and materials) account for a disproportionate share of the total dataset, allowing us to get to them early on; conversely, low-volume data is underrepresented and unlikely to drive significant business benefit.
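
A sketch of that Pareto cut and a simple single-sourcing check, reusing the same invented staged table as above:

```python
import pandas as pd

# Same invented staged spend table as in the previous sketch.
staged = pd.DataFrame({
    "supplier": ["Acme Corp", "ACME Corporation", "Globex",
                 "Initech", "Initech", "Umbrella"],
    "material": ["resin", "resin", "resin",
                 "fasteners", "fasteners", "solvents"],
    "spend_usd": [1_200_000, 900_000, 300_000, 150_000, 80_000, 400_000],
})

# Pareto cut: a handful of suppliers carries most of the spend, so mapping
# and matching them first captures most of the measurable benefit.
by_supplier = staged.groupby("supplier")["spend_usd"].sum().sort_values(ascending=False)
print(by_supplier.cumsum() / by_supplier.sum())  # cumulative share of spend

# Single-source risk: materials bought from only one (apparent) supplier.
# Duplicate supplier records cut both ways -- they can hide volume-discount
# opportunities and mask or exaggerate sourcing concentration.
suppliers_per_material = staged.groupby("material")["supplier"].nunique()
print(suppliers_per_material[suppliers_per_material == 1])
```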

Importantly, the same opportunity exists for revenue synergies in M&A, for data migrations in ERP and other large system integrations, and so on.  As soon as we stage the data in a migration platform, a data lake, or another robust technology repository, we can begin matching, mapping, and data rationalization processes using AI and machine learning.  We can create valid baselines, even if there were none previously, and we can start driving performance to metrics long before our large transformation program—which often takes several years to complete—is finished.
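
The matching itself can range from simple fuzzy string comparison to full ML-driven entity resolution.  As a toy stand-in using only the Python standard library (the names and the threshold are arbitrary):

```python
from difflib import SequenceMatcher

def likely_same_supplier(a: str, b: str, threshold: float = 0.7) -> bool:
    """Crude name-similarity test -- a stand-in for real entity resolution."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

# Flags the duplicated supplier from the earlier sketches.
print(likely_same_supplier("Acme Corp", "ACME Corporation"))  # True
print(likely_same_supplier("Acme Corp", "Globex"))            # False
```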

Moving from waterfall to Agile Value Capture has further benefits.  Most critically, it changes the funding curve and ROI of the program.  Because we start realizing benefits before the initiative ends, the program becomes largely self-funding, reducing capital or P&L funding requirements.
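
A stylized illustration of the funding-curve effect, with entirely invented numbers:

```python
# Stylized eight-quarter program costing 2.0 units per quarter (16.0 total).
COST_PER_QUARTER = 2.0

# Waterfall: no benefits land until after the program ends.
waterfall_benefits = [0.0] * 8
# Agile Value Capture: benefits start ramping once early data is staged.
agile_benefits = [0.0, 0.0, 1.0, 2.0, 3.0, 4.0, 4.0, 4.0]

def peak_funding(benefits_by_quarter):
    """Maximum cumulative cash outlay over the life of the program."""
    net, trough = 0.0, 0.0
    for benefit in benefits_by_quarter:
        net += benefit - COST_PER_QUARTER
        trough = min(trough, net)
    return -trough

print(peak_funding(waterfall_benefits))  # 16.0 -- full cost funded up front
print(peak_funding(agile_benefits))      # 5.0 -- early benefits offset spend
```

Under these invented assumptions, peak funding drops from 16.0 to 5.0 units and the program finishes cash-positive before the final quarter; the exact numbers will differ, but the shape of the curve is the point.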

And getting to business value early largely neutralizes the core issues we discussed that bedevil benefits realization:

  • Lack of long-term focus – Now we’re going fast, so we can maintain executive attention.
  • Poor governance – We can see the Program benefits immediately…there’s no doubt who gets credit.
  • Unclear accountability – When benefits appear, managers will fight to claim them…not run away to avoid having to find and drive them.
  • Inadequate planning and budgeting – We build analytics into the data transformation work; it’s not a significant additive cost.
  • Politicization of metrics – There’s no silver bullet for this, but early visibility makes “zombie metrics” more difficult to defend and maintain.
  • Deficient performance metrics and data management – We can define the metrics and baselines iteratively…it doesn’t all need to be done up front.  

Contemporary best-in-class Data Management software is clearly a key enabler of Agile Value Capture.  A clearly articulated Data Strategy that links data to business outcomes is also critical.

A well-thought-out Data Strategy links business value to performance metrics and performance metrics to the data that underlies and drives them.  Using data in an Agile way, rather than waiting for the end of an initiative to start measuring and realizing value, is itself a key component of Data Strategy; we’re empowering our data to work for us, rather than treating it as an afterthought.
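
One lightweight way to make that linkage concrete (every name here is illustrative) is to record the outcome-to-metric-to-data chain as data itself, so it can be versioned and reviewed alongside the pipelines:

```python
# Illustrative outcome -> metric -> data lineage map for the M&A example.
data_strategy = {
    "reduce cost of goods sold": {
        "metric": "combined spend per supplier-material pair",
        "baseline": "pooled spend at program start",
        "data_sources": ["system A purchase orders", "system B purchase orders"],
    },
    "improve procurement risk controls": {
        "metric": "count of single-sourced key materials",
        "baseline": "single-source count at program start",
        "data_sources": ["staged supplier master", "staged material master"],
    },
}

for outcome, links in data_strategy.items():
    print(f"{outcome}: {links['metric']} <- {', '.join(links['data_sources'])}")
```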

As business leaders, we should insist on identifying, monitoring and driving ROI through Agile Value Capture.  We don’t need to wait until the last chapter of the book to begin enjoying it.
