Solved

Deliverables after Data Mapping Phase of a Migration Project

Posted on 2010-11-26
1,051 Views · Last Modified: 2012-05-10
Hi

I wanted to know from the experts whether they can suggest any other deliverables, apart from the ones listed below, that should come out of the Data Mapping phase of a data migration project:

1) Field Level Mapping Document (including gaps identified and decisions)
2) Migration Approach Document
3) FS-Staging Area Development

Also, please share your inputs on how a field-level data mapping activity should kick off. What is the ideal step-by-step way to run this exercise?
Question by:suhinrasheed
2 Comments
 
LVL 18

Expert Comment by: BigSchmuh
In my opinion, a very important deliverable is one that maps every entity of the migrated and imported structures, stating which keys and rules (merging rules may apply) are used to match them.
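To make that concrete, here is a minimal sketch of what such an entity-matching rule can look like in code. Everything here is hypothetical (the `email` natural key, the `match_customers` function, the record shapes); the point is only that the deliverable should state, per entity, which key links old records to new ones and what falls out unmatched.

```python
def match_customers(legacy_rows, target_rows):
    """Match legacy and target customer records on a shared natural key.

    Here the key is the lowercased email address; merging rules could be
    layered on top, e.g. collapsing duplicate legacy rows into one target row.
    Returns (matched_pairs, unmatched_legacy_rows).
    """
    target_by_key = {r["email"].lower(): r for r in target_rows}
    matched, unmatched = [], []
    for row in legacy_rows:
        key = row["email"].lower()
        if key in target_by_key:
            matched.append((row, target_by_key[key]))
        else:
            unmatched.append(row)
    return matched, unmatched
```

The unmatched list is exactly what the deliverable needs to document: which records fall through the key, and which merging rule (if any) resolves them.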

Also, most of the time I aim for a scenario where I switch from the old app to the new one gradually, using bi-directional interfaces that allow for an easy roll-out of the new app. I hate the "big bang" way, as it makes the mandatory change management very tough to handle. In fact, you cannot big-bang more than a few thousand users; otherwise, by the time you go live, your first trained users would have been trained two years earlier...
 
LVL 7

Accepted Solution by: RemRemRem (earned 500 total points)
I would want to include a testing model and a set of expectations and standards for the outcome. Customers often don't realize how much involvement THEY need to have to ensure their data is doing what they want - only their in-house day to day users will really be able to catch certain types of flaws in the logic of a conversion.

This leads to the need for a time/resources needs document - how many people on the conversion team (and who), how many people needed from other departments to ensure success, and the expected points during conversion during which they'll be needed (and for how long).

If I had access to the raw data while developing my map, I'd also anticipate a list of data concerns - the biggest time and energy suck in a large conversion is handling the exceptions. The worst scenario is if a company doesn't realize how much bad data they already have (and yes, Virginia, there WILL be bad data). Setting expectations in advance can help.
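One way to turn that "list of data concerns" into a deliverable, assuming you have raw data access, is a simple profiling pass that counts rule violations per field. This is a hedged sketch, not a real tool: the `profile_bad_data` function, the field names, and the rules are all made up for illustration.

```python
def profile_bad_data(rows, rules):
    """Count rule violations per field, as input to a data-concerns list.

    `rules` maps a field name to a predicate returning True for *valid*
    values. The result maps each field to the row indices that failed,
    which sizes the exception-handling effort up front.
    """
    problems = {}
    for i, row in enumerate(rows):
        for field, is_valid in rules.items():
            if not is_valid(row.get(field)):
                problems.setdefault(field, []).append(i)
    return problems
```

Running this early, per field, is what lets you set the bad-data expectations with the customer before conversion starts rather than mid-project.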

If they haven't gotten it from the mapping material, they will need a reality check for budget planning, too. I always try to pre-warn my clients that their perfect project is going to be:
1/3 Development
1/3 Documentation/Training
1/3 Data Conversion

On the field-level data mapping - this takes your most data-aware developer(s) working in close proximity with the end users (ideally a developer who is savvy at user communication, or this goes very slowly and very badly). I would go department by department, process by process, taking printed screenshots of every existing interface that has already been mapped to fields. With the end user, confirm the functional intent behind each field and determine whether it is free form or has limitations in the form of table masters or range criteria. If those criteria are not enforced by the existing system, the fields need particular attention during the cleanup phase, as they'll have the most errors.
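The grid that comes out of those sessions can be as simple as one record per field. A minimal sketch, with invented field names and a `FieldMapping` structure that is purely illustrative:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FieldMapping:
    """One row of the field-level mapping grid (all names are illustrative)."""
    source_field: str
    target_field: Optional[str]    # None marks a gap awaiting a decision
    constraint: str = "free form"  # or "table master: ..." / "range: ..."
    rule: str = "direct copy"
    notes: str = ""

grid = [
    FieldMapping("CUSTNAME", "customer_name"),
    FieldMapping("STATE_CD", "state", constraint="table master: US states"),
    FieldMapping("LEGACY_FLAG", None, notes="gap: no target field, decision pending"),
]

# The gaps-and-decisions section of the mapping document falls straight
# out of the grid: every row without a target field needs an owner.
gaps = [m.source_field for m in grid if m.target_field is None]
```

Recording the constraint per field is what lets you flag the unenforced ones for the cleanup phase later.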

From the human interface level, I'd then move to the tech side - if the original structure is one that can have logging take place, I'd consider logging each user's process to see what behind the scenes fields are being updated by their actions, and add those to my original field grid.

I'd then review the tables in raw form, determining if there are data laden fields that I haven't accounted for in the prior two steps. Much of the time, the "why" of these can be extrapolated. Product documentation can provide additional clues. It's also key to determine which fields don't exist in your grid thus far, and which of them are not needed at all.

The same is then done with the new system - review what data is expected where, mapping it to the field locations.

Do an initial "obvious" field match - "Customer Name" = "CustName" and so on.

Do a secondary match - likely values.
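The first, "obvious" pass can even be semi-automated by comparing field names for similarity, leaving the secondary value-based pass and manual review to pick up what it misses. A rough sketch using Python's standard `difflib`; the field names and the `suggest_matches` helper are assumptions, not anyone's real schema:

```python
import difflib

def suggest_matches(source_fields, target_fields, cutoff=0.6):
    """First-pass 'obvious' match: pair fields whose names look alike.

    Anything returned as None drops through to the secondary,
    value-based pass and then to manual review.
    """
    lowered = {t.lower(): t for t in target_fields}
    suggestions = {}
    for src in source_fields:
        hits = difflib.get_close_matches(
            src.lower(), list(lowered), n=1, cutoff=cutoff
        )
        suggestions[src] = lowered[hits[0]] if hits else None
    return suggestions
```

Treat the output as suggestions only; a human still confirms every pairing against the functional intent gathered earlier.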

Do a review of missing fields in the result list and determine if the values ever existed in the original (after all, the inability to store something may be part of why it's being upgraded).

Review for values that can be calculated or extrapolated from other fields - got the zip code but no county? Pick up a county table from the new product and assume a conversion by zip will be needed.
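The zip-to-county case above reduces to a lookup-table conversion. A minimal sketch, assuming a county table exists in the new product (the table contents and the `derive_county` helper here are invented):

```python
# Hypothetical lookup extracted from the new product's county table.
COUNTY_BY_ZIP = {
    "10001": "New York",
    "60601": "Cook",
}

def derive_county(zip_code, county_by_zip=COUNTY_BY_ZIP):
    """Fill a missing county by extrapolating from the 5-digit zip.

    Tolerates zip+4 values and missing input; None means the row
    goes on the exception list instead.
    """
    return county_by_zip.get((zip_code or "")[:5].strip())
```

Rows that come back None are themselves useful: they feed the data-concerns list rather than silently converting to a blank county.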

VERY carefully review value concepts that are changing - is your existing system calculating dollars on a unit by unit basis and your new system is kitting things? If so, what original SKUs make up a new Kit? What do you do with orders that have incomplete kits?  Is your customer "box" moving from a person-per-address model to an address with multiple people model? ...and so on. Determine if any of your unfilled fields are resolved through a reshaping of the value concept.
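The unit-to-kit reshaping is a good example of why these value-concept changes need careful review: the conversion has to decide what a complete kit is and what happens to leftovers. A hedged sketch under invented assumptions (the SKUs, the bill-of-materials shape, and `kits_from_units` are all hypothetical):

```python
def kits_from_units(order_lines, bom):
    """Reshape legacy unit-level order lines into kits against a BOM.

    `bom` maps component SKU -> quantity required per kit. Units whose
    SKU is not in the BOM are ignored here. Returns (complete_kits,
    leftover_units); non-empty leftovers flag an incomplete kit that
    needs a business decision before conversion.
    """
    have = {}
    for sku, qty in order_lines:
        have[sku] = have.get(sku, 0) + qty
    kits = min(have.get(sku, 0) // need for sku, need in bom.items())
    leftovers = {
        sku: have.get(sku, 0) - kits * need
        for sku, need in bom.items()
        if have.get(sku, 0) - kits * need
    }
    return kits, leftovers
```

The leftovers dictionary is precisely the question to take back to the business owner: scrap, back-order, or convert as loose units?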

Rinse, repeat, until you've itemized all the critical incoming data.

Once all possible new field structures are accounted for, be sure to remember to go back to the old data and ensure that there's nothing that had been stored which is no longer accounted for in the new...it may need to be reshaped into a custom field value in the new system.

Any questionable or calculated values need to be reviewed with the departmental "owner" of that element - finance department, fulfillment team, what have you - to ensure that anything not clearly a one to one conversion is being done in a manner that works for them.

Hope that helps...?
-Rachel
