In previous blogs in this series, I analyzed the risks of adopting an ETL/Excel approach to enterprise data migration and recommended ways to mitigate them with more comprehensive tooling and methodologies. This time around, we’re going to focus on another key factor in enterprise data migration success: speed.
Naturally, in the fast-paced world of most industries, being able to enact change faster than the competition is a huge advantage. But does this desire for speed in digital transformations actually cause more headaches, and ultimately more delays?
The “lift and shift” approach is often viewed by newcomers as the “quick path” to a data migration. For most who venture down that path, however, it is likely not only to cause delays, as data-to-target incompatibility and a lack of project orchestration grind things to a halt, but also to hit the bottom line, as increased downtime and increased human remediation effort eat into the program budget.
Businesses focus hard on avoiding these negative financial impacts and take steps to mitigate the risks, but they can’t sacrifice speed in the process either. Neither the hare nor the tortoise approach is really an option. We’ve found customers want both: fast and successful enterprise data migrations. But how can you achieve both when there seems to be a tug-of-war between the two? To help, we’ve compiled three recommendations for how we at Syniti would deliver an enterprise data migration at speed, without introducing risk.
1. Rely on Proven Expertise
Data migrations can be daunting, particularly in complex enterprise IT and data landscapes. In such a high-risk project environment, it’s sensible to seek out internal and external resources who are familiar with the challenges and know which methodologies and technologies are fit for the job.
Avoid the pitfalls of going it alone
The concept of enterprise data migration is half a century old now, and whether the issue is data, custom code, or process, there are guaranteed to be experts who have seen your challenge before. It is a rare business that has deep data migration expertise in-house, so relying on data migration and industry-vertical experts to avoid mistakes can accelerate projects, lower risk, and be worth every dollar.
Integrate Technology Expertise
In addition to adding diverse brainpower to your project, introducing effective enterprise data migration technology should be a close second on your list. By leveraging a centralized data migration solution to orchestrate the entire end-to-end migration, you streamline workflows and automate a large proportion of the required code. What’s more, the best solutions in the marketplace will bake their learnings, experience, and best practices into the platform, giving you a significant head start.
Access Industry Expertise
Every industry has its own set of challenges, whether that be managing spare parts in a manufacturing company, keeping track of inventory in retail, or staying on top of consumer records in a services company. While some challenges may translate across industries, some might be unique. Look for technology that has industry-specific content and best practices built-in. When building out your project resourcing, ensure you’re incorporating proven expertise from both the system integrator community and data community as both will bring you the necessary insights to get your program running faster.
2. Leverage Technology that Accelerates Execution
In a high-tech, high-risk process like data migration, relying on low-tech, manual approaches for execution is both costly and risky. To help your scoping exercise, here are three components to look for in any prospective data migration technology.
Centralized platform to support all users and use cases
To accelerate data migration timelines, it’s important to make all project work cohesive and efficient. The optimal way to ensure this is to have one centralized, cloud-based solution where all stakeholders operate in the same system under the same rules, guidelines, and workflows. This not only encourages smooth migration processes but also provides project transparency, giving program managers and executives a single source of truth for all project operations and reports. It’s also important for any solution to deliver on the needs of all data migration stakeholders. With employees ranging from executives to data and IT teams to lines of business, each with their own technical capabilities and responsibilities, it’s vital that the centralized migration platform gives each stakeholder the visibility and capabilities they need to execute on their particular objectives quickly and efficiently.
Sandbox support for accurate migration sprints
Any technology brought into the migration must also support project personnel working, learning, and testing at speed. Agile migration sprints are a relatively new approach for enterprise data migration, but they are fast becoming essential for finding and fixing issues before they become major problems. Working in real time with other project personnel removes the sluggish step-by-step wait for tasks to complete and allows for the proactive testing and retesting that is so important in the Boring Go-Live playbook.
Smart, AI/ML-driven automation
Another big benefit of introducing fit-for-purpose technology in the migration effort is leveraging the best practices and automation contained within the platforms. Without smart automation, migrations can get messy, fast. With a whole collection of code required to move data from source to target, automating its creation to give programs a jumpstart really is another no-brainer. The more AI and ML you can integrate into the project, the less scope there is for manual error. While it is impossible to fully automate an enterprise data migration, it’s worth finding a solution that has intelligent technology built in wherever possible and that empowers users with low-code/no-code capabilities, so required customizations are easy and fast.
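To make “automating code creation” concrete, here is a minimal, hypothetical sketch in Python: instead of hand-writing every transformation statement, a declarative mapping spec is expanded into SQL automatically. The spec format, field names, and table name are illustrative assumptions only, not any particular platform’s API.

```python
# A minimal sketch of template-driven code generation: a declarative mapping
# spec (illustrative field names) is expanded into a SQL SELECT, rather than
# hand-coding each transformation. Real platforms generate far richer code.
mapping_spec = [
    {"source": "src.ITEM_NO", "target": "MATERIAL_NO"},
    {"source": "UPPER(src.ITEM_DESC)", "target": "MATERIAL_DESC"},
]

def generate_select(spec, source_table="legacy_item_master"):
    """Build a SQL transformation statement from the mapping spec."""
    columns = ",\n  ".join(f"{m['source']} AS {m['target']}" for m in spec)
    return f"SELECT\n  {columns}\nFROM {source_table} src;"

print(generate_select(mapping_spec))
```

The design point is simply that the transformation logic lives in data (the spec), so adding or changing a mapping never means hand-editing generated code.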
3. Elevate and Deliver Data Quality
Often, businesses that don’t make data quality a key objective of the migration effort fail to meet their business goals and their expectations for value delivered. Not only may the new system fail to work properly, but because the data is not optimized, you’ll only have successfully moved your old data and process problems to a new system. In addition to a potentially sluggish go-live, the essential data you rely on to make business-critical decisions just won’t be trustworthy. In the context of speeding up data migrations, quality plays an unheralded role. Here’s the simple truth: the quicker you resolve data quality issues, the faster you will experience real business value. Here are some recommendations to ensure data quality doesn’t delay your go-live.
Map data so it’s fit for target
Unless your migration involves the simplest of upgrades from one version of a manufacturer’s ERP to the next, you’ll need to navigate the challenge of source-to-target mapping. In simple terms, this means correlating what something meant in your old system (e.g. a part number) with the field the target system recognizes it to be. Mapping just one system to another can be a major effort; now make that 80 source systems, and you’ve got one mammoth undertaking. Pre-designed mappings, sometimes known in the industry as “content” or “accelerators”, are a massive help here: they take knowledge from other migrations, for example with Oracle JD Edwards as a source and SAP S/4HANA as a target, and automatically load the mappings for many of the common objects involved in that type of migration.
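To illustrate what a field-level mapping looks like in practice, here is a minimal Python sketch for a single item master record. The JD Edwards-style and SAP-style field names are assumptions for illustration; real accelerator content covers hundreds of objects and includes type conversions and value translations.

```python
# Illustrative source-to-target field mapping for one item master record.
# Field names are examples in the style of JD Edwards and SAP S/4HANA;
# they are assumptions for illustration, not a complete or official mapping.
FIELD_MAP = {
    "IMLITM": "MATNR",   # item number       -> material number
    "IMDSC1": "MAKTX",   # item description  -> material description
    "IMUOM1": "MEINS",   # unit of measure   -> base unit of measure
}

def map_record(source_row: dict) -> dict:
    """Translate one source record into the target system's field names."""
    return {tgt: source_row[src] for src, tgt in FIELD_MAP.items() if src in source_row}

# Example: a single JD Edwards-style row becomes an S/4HANA-style row
jde_row = {"IMLITM": "PART-0042", "IMDSC1": "HEX BOLT M8", "IMUOM1": "EA"}
print(map_record(jde_row))
# {'MATNR': 'PART-0042', 'MAKTX': 'HEX BOLT M8', 'MEINS': 'EA'}
```

Pre-built content essentially ships thousands of entries like `FIELD_MAP` so teams aren’t rediscovering them from scratch on every project.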
Deliver high quality only
As mentioned, many businesses have felt the limitations of “lifting and shifting” data from the old system to the new without improving it along the way. At some point, the lack of data quality results in a new system that isn’t delivering the business value that was promised, and data quality must finally be addressed. Executing a migration with a laser focus on delivering pristine data quality to the new target system can result not only in happier users and executives but, according to a report by IDC, a 95% or greater reduction in unplanned downtime. De-duplication of data is a great place to start, as is running dataset-wide quality tests against long-term governance rules to establish quality.
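As a rough illustration of those two starting points, here is a minimal pandas sketch that de-duplicates on a natural key and then applies two dataset-wide quality rules. The column names and rules are hypothetical; real governance rules would be defined with the business and applied consistently before and after go-live.

```python
# A minimal sketch of de-duplication plus rule-based quality checks.
# Column names and rules are hypothetical, for illustration only.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [101, 102, 102, 103],
    "email": ["a@example.com", "b@example.com", "b@example.com", None],
    "country": ["US", "DE", "DE", ""],
})

# 1. De-duplicate on the natural key before loading the target system
deduped = customers.drop_duplicates(subset=["customer_id"])

# 2. Dataset-wide quality rules: count the records failing each rule
rules = {
    "email_present": deduped["email"].notna(),
    "country_present": deduped["country"].astype(str).str.strip() != "",
}
failures = {name: int((~passed).sum()) for name, passed in rules.items()}
print(failures)  # {'email_present': 1, 'country_present': 1}
```

The same rule set can keep running after go-live, which is how one-off cleanup turns into long-term governance.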
Right size your data
In addition to high data quality, you should focus on the size of your data ecosystem too. Analyzing what data you truly need, whether transactional data for ongoing operations or historical data for audits and customer service, will help you understand which data you don’t need to take with you. You may wish to archive that data, move it to a data lake, or simply delete it. This process bears fruit later on, as streamlining your dataset streamlines your ultimate hosting costs too. What’s more, the environmental benefits of limiting overall storage and data center space are significant. With more corporate carbon commitments coming into force, the environmental impact of data is something all IT and data personnel will be pressured to optimize in the near future.
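As a simple sketch of that triage, the snippet below splits records into “migrate” and “archive” buckets by age. The seven-year cutoff is an illustrative assumption; your retention rules should come from legal, audit, and operational requirements.

```python
# A minimal sketch of right-sizing: recent transactional records migrate,
# older history goes to an archive or data lake. The 7-year cutoff is an
# illustrative assumption, not a recommended retention policy.
from datetime import date, timedelta

CUTOFF = date.today() - timedelta(days=7 * 365)

orders = [
    {"order_id": 1, "posted": date(2012, 3, 1)},
    {"order_id": 2, "posted": date(2024, 6, 15)},
]

migrate = [o for o in orders if o["posted"] >= CUTOFF]
archive = [o for o in orders if o["posted"] < CUTOFF]
print(f"migrate {len(migrate)} record(s), archive {len(archive)} record(s)")
```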
I hope you’ve found these recommendations for speeding up your data migration useful. Data migrations are not easy at the best of times, so trying to accelerate them while still ensuring success is an almost herculean task if you don’t have the expertise, the technology, and a laser focus on data quality. Don’t be the one to take that risk: make sure you have the correct toolset to run your enterprise data migration.
For more information on Syniti’s data migration solution, Syniti Migrate, visit our product page here. Stay tuned to blog.syniti.com for more insights from the world of data.