It's not often that organizations embark on major data migration projects. When they do, many assume that the vendor they choose will do much of the upfront planning, data clean-up and extracts. After all, that's the way it used to be.
But things have changed in the last few years. To reduce their quotes and shift risk, software implementers have moved much of the data migration burden to their customers. Vendors will provide the import file specifications and perform the physical data loads, but all mapping, transformation and cleansing are in your court. Because the actual data migration prep work is excluded from their proposals, vendors carry no accountability for the effort or the outcome. It's up to your organization to get the data out of your source systems, clean it up, and put it into their format.
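To make that "mapping, transformation and cleansing" work concrete, here is a minimal sketch of converting a legacy extract into a vendor's import format. The column names, sample records and country-code lookup are hypothetical, not any particular vendor's specification:

```python
import csv
import io

# Hypothetical vendor import spec: the vendor defines the target columns;
# mapping and cleanup of the source values are the customer's responsibility.
VENDOR_COLUMNS = ["customer_id", "customer_name", "country_code"]

# Sample source extract, standing in for a legacy-system export.
source_csv = """id,name,country
1001,Acme Corp,United States
1002,Globex,germany
"""

# Illustrative lookup to standardize free-text country values.
COUNTRY_CODES = {"united states": "US", "germany": "DE"}

def to_vendor_row(src):
    """Map one source record onto the vendor's import columns."""
    return {
        "customer_id": src["id"].strip(),
        "customer_name": src["name"].strip(),
        "country_code": COUNTRY_CODES.get(src["country"].strip().lower(), ""),
    }

reader = csv.DictReader(io.StringIO(source_csv))
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=VENDOR_COLUMNS)
writer.writeheader()
for row in reader:
    writer.writerow(to_vendor_row(row))

print(out.getvalue())
```

Even a toy example like this shows where the real effort hides: every source field needs a mapping decision, and every inconsistent value (like the lowercase "germany") needs a standardization rule before the vendor's load will succeed.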
This shift has created a significant workload that is easy to underestimate. The upfront efforts that have been shifted to the customer's organization are often 60 to 70 percent of the work involved in a system migration. This shift also means that the majority of the work has not been priced or resourced—a realization that catches many organizations by surprise. As a result, companies may be pressed to complete even the core data cleansing they have identified.
Point B’s Perspective
In any data migration, business continuity comes first. When an organization doesn't give itself enough time and resources to complete the important upfront work, the data optimization it hoped to complete falls by the wayside in the rush to get the system up and running. This is a serious sacrifice and a lost opportunity. If it happens, you jeopardize the ROI you'd hoped to achieve, along with many of the strategic goals for taking on the migration in the first place.
It's important for leadership to recognize that, once application design and configuration begin, your key business experts will be fully engaged in that work, on top of their day jobs. They won't have time to dig into existing data, clean it up, and optimize the structure for new business processes. This is one reason why, even with decades of industry experience to draw on, 75 percent of ERP implementations run 50 percent or more over budget. More than ever, it is critical to start early and put time on your side.
It's never too soon to start understanding your data and establishing future business requirements. In fact, the sooner, the better. You want to be sure you have time to achieve your goals for data optimization and ROI. Done well, a data migration is a chance not only to clean things up, but also to optimize and structure the data to meet future needs—for example, by organizing products and customer channels, creating new attributes to improve analysis, and establishing consistent data definitions.
You can begin preparing for a data migration as soon as you've decided to do it. Don't let vendor selection hold you back; much of the important work is independent of the tool you choose, so you can get started before your vendor selection is complete. Getting this work underway may even help you zero in on the right vendor.
Starting as soon as you can allows you to spread out the work, reducing peak workloads to manageable levels. It allows time to align the data structure with your business objectives. It also gives you time to do the resource planning needed to execute the work.
Where to Begin?
Master data optimization.
Start by taking a close look at the data migration in light of your future business operations. How do you want things to run in the future? How will you reorganize your data to support your goals? Where will you get your highest ROI on this migration? How can it advance your business? Answer these key business questions upfront. Maybe it's time to revisit your product master list, change the way you classify customers, or restructure data related to your sales channels. Now's the time to bring your migration strategy in line with your strategic business goals and make decisions that leverage your migration to maximize your ROI.
Historical archiving strategy.
A sound historical archiving strategy serves as a safety net that lets you be more judicious in choosing what data you migrate. Without such a strategy, people tend to want to migrate "the history of everything" into the new system. Because it's not feasible to clean up and reconcile it all, there's a danger of "pumping and dumping" the same old data into the new system on the off chance that someone will ask for a purchase order from 10 years ago. A robust data archive is a reassuring back-up, and it's typically much more efficient than expanding the migration scope to accommodate archaic data.
Data quality analysis.
It pays to look at data quality in detail. What needs to be fixed, filled in or standardized? Can it be automated or outsourced? Does it require business user involvement? Bear in mind that the older the data, the more work it takes to migrate. Be prudent about curating your data. Make the tough decisions now to separate the wheat from the chaff, and you'll be glad you did.
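A first pass at this analysis can often be automated. The sketch below profiles a handful of hypothetical customer records, counting missing values and distinct spellings per field; the record set and field names are illustrative only:

```python
from collections import Counter

# Hypothetical customer records extracted from a legacy system.
records = [
    {"name": "Acme Corp", "phone": "555-0100", "state": "WA"},
    {"name": "acme corp", "phone": "", "state": "Washington"},
    {"name": "Globex", "phone": None, "state": "wa"},
]

def profile(records, fields):
    """Count missing values and distinct spellings per field --
    a quick read on how much fixing and standardizing lies ahead."""
    report = {}
    for f in fields:
        values = [r.get(f) for r in records]
        missing = sum(1 for v in values if not v)
        variants = Counter(str(v).strip().lower() for v in values if v)
        report[f] = {"missing": missing, "distinct": len(variants)}
    return report

print(profile(records, ["name", "phone", "state"]))
```

A report like this helps answer the triage questions above: fields with many missing values may need business-user research, while fields with many spelling variants ("WA", "Washington", "wa") are usually candidates for automated standardization.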
Taking these steps will inform your data migration scope. Your historical archiving strategy provides a safety net, while the data quality analysis reveals the effort required for different choices. For example, while your initial impulse might be to keep seven years of all customer data, it may be worth taking a deeper dive to assess what you really need. Keep in mind that your effort grows exponentially with the amount of data you need to reconcile. Conversely, if your data is in good shape, you can afford a more expansive migration scope.
Master data cleanup.
Congratulations. At this point, you're ready to begin the actual data cleansing, focusing on the core data: customers, products, vendors, assets and locations. All the planning and decisions you've made to date will pay off in clean, accessible data that supports your business objectives.
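Much of master data cleanup comes down to finding and merging duplicates. As a minimal sketch (the customer names and the normalization rule are purely illustrative), records can be grouped by a normalized key so likely duplicates surface for review:

```python
def normalize(name):
    """Illustrative normalization key: lowercase, alphanumerics only."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

# Hypothetical customer master entries with near-duplicate spellings.
customers = ["Acme Corp", "ACME Corp.", "Globex", "Acme  Corp"]

# Group records sharing a normalized key as likely duplicates to merge.
groups = {}
for c in customers:
    groups.setdefault(normalize(c), []).append(c)

duplicates = {key: names for key, names in groups.items() if len(names) > 1}
print(duplicates)
```

Real cleanups typically need richer matching (addresses, tax IDs, fuzzy comparison) and a business owner to decide which record survives, but a simple key like this is often enough to size the problem.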
The Bottom Line
Taking the time to prepare for a data migration early and in depth is a sure way to stay in control of your application implementation. It helps ensure that you'll have the time and resources to actually optimize your data and achieve the ROI you're aiming for.