

Deliverables after Data Mapping Phase of a Migration Project

Posted on 2010-11-26
Medium Priority
Last Modified: 2012-05-10

I wanted to ask the experts whether they can suggest any other deliverables, apart from the ones listed below, that should come out of the data-mapping phase of a data migration project:

1) Field-Level Mapping Document (including gaps identified and decisions)
2) Migration Approach Document
3) FS-Staging Area Development

Also, please share input on how a field-level data mapping activity should kick off. What is the ideal step-by-step way to run this exercise?
Question by: suhinrasheed

Expert Comment

ID: 34220979
In my opinion, a very important deliverable is a document that matches every entity of the migrated and imported structures, stating which keys and rules (merging rules may apply) are used to match them.
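To make that concrete, here is a minimal sketch of key-based entity matching with a simple merge rule. All field names, sample records, and the specific merge rule (target value wins when present, legacy fills the gaps) are hypothetical illustrations, not anything prescribed by the thread:

```python
# Hypothetical sketch: match legacy and target records on a declared key
# (cust_id), applying one example merge rule when both sides hold a value.
legacy = [
    {"cust_id": 1, "email": "a@x.com", "phone": None},
    {"cust_id": 2, "email": "b@x.com", "phone": "555-0100"},
]
target = {
    1: {"cust_id": 1, "email": "", "phone": "555-0199"},
    2: {"cust_id": 2, "email": "", "phone": ""},
}

def merge(old, new):
    # Example merge rule: the target's value wins when present,
    # otherwise the legacy value fills the gap.
    return {k: (new.get(k) or old.get(k)) for k in old}

matched = {row["cust_id"]: merge(row, target[row["cust_id"]])
           for row in legacy if row["cust_id"] in target}
print(matched[1]["phone"])  # target value kept: 555-0199
print(matched[2]["email"])  # legacy value fills the gap: b@x.com
```

The deliverable itself would document, per entity, which key is used and which merge rule applies when both systems disagree.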

That said, most of the time I prefer a scenario where I switch from the old app to the new one gradually, using bi-directional interfaces that allow for an easy roll-out of the new app. I dislike the "big bang" approach, as the mandatory change management is very tough to handle. In fact, you cannot do a big bang for more than a few thousand users; otherwise, your first trained users will have been trained two years before go-live...

Accepted Solution

RemRemRem earned 2000 total points
ID: 34223115
I would want to include a testing model and a set of expectations and standards for the outcome. Customers often don't realize how much involvement THEY need to have to ensure their data is doing what they want - only their in-house, day-to-day users will really be able to catch certain types of flaws in the logic of a conversion.

This leads to the need for a time/resources needs document - how many people on the conversion team (and who), how many people needed from other departments to ensure success, and the expected points during conversion during which they'll be needed (and for how long).

If I had access to the raw data while developing my map, I'd also anticipate a list of data concerns - the biggest time and energy suck in a large conversion is handling the exceptions. The worst scenario is if a company doesn't realize how much bad data they already have (and yes, Virginia, there WILL be bad data). Setting expectations in advance can help.
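One way to build that list of data concerns is a simple profiling pass over the raw extract. This is a minimal sketch with made-up rows and rules; real profiling would cover every field in the mapping grid, and the specific checks (blank/short zip codes, malformed emails) are only illustrative:

```python
import re

# Hypothetical sketch: profile a legacy extract for the "bad data" the
# answer warns about - blanks, malformed values, out-of-range codes.
rows = [
    {"zip": "30301", "email": "a@x.com"},
    {"zip": "303",   "email": "not-an-email"},
    {"zip": "",      "email": "b@x.com"},
]

def profile(rows):
    issues = {"blank_zip": 0, "bad_zip": 0, "bad_email": 0}
    for r in rows:
        if not r["zip"]:
            issues["blank_zip"] += 1
        elif not re.fullmatch(r"\d{5}", r["zip"]):
            issues["bad_zip"] += 1
        if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", r["email"]):
            issues["bad_email"] += 1
    return issues

print(profile(rows))  # {'blank_zip': 1, 'bad_zip': 1, 'bad_email': 1}
```

Sharing counts like these with the client early is exactly the expectation-setting the answer recommends.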

If they haven't gotten it from the mapping material, they will need a reality check for budget planning, too. I always try to pre-warn my clients that their perfect project is going to be:
1/3 Development
1/3 Documentation/Training
1/3 Data Conversion

On the field-level data mapping - this takes your most data-aware developer(s) working in close proximity with the end users (ideally, a developer who is user-communication savvy, or this goes very slowly and very badly). I would go department by department, process by process, taking printed screen shots of every existing interface that has already been mapped to fields. With the end user, confirm the functional intent behind each field and determine if it is free form or has limitations in the form of table masters or range criteria. If the criteria are not enforced by the existing system, those fields need particular attention during the cleanup phase, as they'll have the most errors.
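The walkthrough above amounts to building a per-field grid. Here is one possible shape for a grid row, capturing intent, constraints, whether the old system enforced them, and cleanup notes. The class, field names, and sample values are all hypothetical:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of one row in a field-level mapping grid, recording
# what the department-by-department walkthrough collects per field.
@dataclass
class FieldMapping:
    screen: str                 # which interface the field appears on
    source_field: str
    functional_intent: str
    free_form: bool
    allowed_values: list = field(default_factory=list)  # table master / range
    enforced_by_system: bool = True
    cleanup_notes: str = ""

grid = [
    FieldMapping("Customer Entry", "CUST_STATE", "customer's state code",
                 free_form=False, allowed_values=["GA", "FL", "AL"],
                 enforced_by_system=False,
                 cleanup_notes="free-typed today; expect invalid codes"),
]

# Fields whose criteria were never enforced get flagged for the cleanup
# phase, since they'll have the most errors.
needs_cleanup = [m.source_field for m in grid
                 if m.allowed_values and not m.enforced_by_system]
print(needs_cleanup)  # ['CUST_STATE']
```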

From the human interface level, I'd then move to the tech side - if the original structure is one that can have logging take place, I'd consider logging each user's process to see what behind the scenes fields are being updated by their actions, and add those to my original field grid.

I'd then review the tables in raw form, determining if there are data laden fields that I haven't accounted for in the prior two steps. Much of the time, the "why" of these can be extrapolated. Product documentation can provide additional clues. It's also key to determine which fields don't exist in your grid thus far, and which of them are not needed at all.

The same is then done with the new system - review what data is expected where, mapping it to the field locations.

Do an initial "obvious" field match - "Customer Name" = "CustName" and so on.

Do a secondary match - likely values.
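One way to implement that secondary match is to compare the sets of values each field actually holds and pair fields with the greatest overlap. The column names and sample values below are invented, and Jaccard similarity is just one plausible overlap measure:

```python
# Hypothetical sketch of the "likely values" match: when names don't line
# up, compare the value sets each field holds and pair the best overlap.
old_cols = {"CST_TP": {"R", "W", "D"}, "RGN": {"N", "S", "E", "W"}}
new_cols = {"CustomerType": {"R", "W", "D", "X"}, "Region": {"N", "S", "E"}}

def jaccard(a, b):
    # Size of the intersection divided by size of the union.
    return len(a & b) / len(a | b)

pairs = {}
for old_name, old_vals in old_cols.items():
    best = max(new_cols, key=lambda n: jaccard(old_vals, new_cols[n]))
    pairs[old_name] = best
print(pairs)  # {'CST_TP': 'CustomerType', 'RGN': 'Region'}
```

Any pairing produced this way is only a candidate; a human still confirms the functional intent matches.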

Do a review of missing fields in the result list and determine if the values ever existed in the original (after all, the inability to store something may be part of why it's being upgraded).

Review for values that can be calculated or extrapolated from other fields - got the zip code but no county? Pick up a county table from the new product and assume a conversion by zip will be needed.
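The zip-to-county case above can be sketched as a straightforward lookup-table fill. The table contents and field names here are assumed; in practice the table would come from the new product:

```python
# Hypothetical sketch: derive a missing county from the zip code using a
# lookup table picked up from the new product (sample rows only).
zip_to_county = {"30301": "Fulton", "30144": "Cobb"}

def fill_county(record):
    # Only fill when county is blank and the zip is known to the table.
    if not record.get("county") and record.get("zip") in zip_to_county:
        record["county"] = zip_to_county[record["zip"]]
    return record

rec = fill_county({"name": "Acme", "zip": "30301", "county": ""})
print(rec["county"])  # Fulton
```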

VERY carefully review value concepts that are changing - is your existing system calculating dollars on a unit by unit basis and your new system is kitting things? If so, what original SKUs make up a new Kit? What do you do with orders that have incomplete kits?  Is your customer "box" moving from a person-per-address model to an address with multiple people model? ...and so on. Determine if any of your unfilled fields are resolved through a reshaping of the value concept.
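The unit-to-kit case can be made concrete with a small check against an assumed bill of materials. The kit definition, SKUs, and the "needs review" policy are all hypothetical; the real decision about incomplete kits belongs to the departmental owner:

```python
# Hypothetical sketch of a value-concept reshape: the old system priced
# unit SKUs, the new one sells kits. Flag orders whose lines don't add up
# to a complete kit so a human can decide what to do with them.
kit_bom = {"KIT-100": {"SKU-A": 2, "SKU-B": 1}}  # assumed bill of materials

def classify_order(lines, bom=kit_bom["KIT-100"]):
    # Complete if every component is present in at least the required qty.
    complete = all(lines.get(sku, 0) >= qty for sku, qty in bom.items())
    return "complete kit" if complete else "incomplete kit - needs review"

print(classify_order({"SKU-A": 2, "SKU-B": 1}))  # complete kit
print(classify_order({"SKU-A": 1}))              # incomplete kit - needs review
```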

Rinse, repeat, until you've itemized all the critical incoming data.

Once all possible new field structures are accounted for, be sure to go back to the old data and ensure that there's nothing that had been stored which is no longer accounted for; such leftover data may need to be reshaped into a custom field value in the new system.

Any questionable or calculated values need to be reviewed with the departmental "owner" of that element - finance department, fulfillment team, what have you - to ensure that anything not clearly a one to one conversion is being done in a manner that works for them.

Hope that helps...?
