
  • Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 172

Migrating Data - Tables with different structure

What is the easiest/best way to accomplish the following:

During the process of an upgrade we are migrating data, but the table structure is slightly different. For the most part, the new table has additional fields, but it is possible that it is missing fields.

What is the easiest way to get the matching fields carried over?
5 Solutions
My advice would be to create an INSERT script for each table.  Alternatively, you could delete the new tables, import the old ones, then run an ALTER script on each newly imported table that applies the new field definitions.  This will drop the fields (and their data) that shouldn't be there and add the extra fields you need.
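A minimal sketch of the first approach. The database, table, and column names here are hypothetical (a Users table where the new schema adds columns that take defaults and drops an old Fax column), and the cross-database syntax shown is SQL Server style:

```sql
-- Copy only the columns the two schemas share.  Columns that exist only
-- in the new table take their defaults; columns that exist only in the
-- old table (e.g. Fax) are simply not selected, so they are dropped.
INSERT INTO NewDB.dbo.Users (UserID, FirstName, LastName, Email)
SELECT UserID, FirstName, LastName, Email
FROM   OldDB.dbo.Users;
```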

The first solution is better for a recurring transfer.  The second may be easier for a one-time event.
We migrate in place. Write scripts to add new columns, with defaults if needed, and drop columns we no longer need.
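An in-place script of that kind might look like the following sketch. The table and column names are made up for illustration, and the syntax follows SQL Server 2005 (mentioned later in this thread):

```sql
-- Add a new column with a default so existing rows stay valid,
-- then drop a column the new schema no longer needs.
ALTER TABLE Users ADD Status VARCHAR(20) NOT NULL DEFAULT 'active';
ALTER TABLE Users DROP COLUMN Fax;
```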

Create a backup, and load it into a new instance, then test your update scripts. When they are correct, run them against production data.
With most of the migrations I've done, a few things seem to always be true...

1) You have to perform the migration more than once
    This is either because the data isn't quite perfect, or because you have to do it a few times for testing and then finally for go-live when everything is perfect.

2) You're not just adding or removing columns, you're also changing some of the values in the columns.

If these are true for you, then I go with biggstrc's first suggestion of creating a series of select/insert statements.  

Start by creating your database schema just the way it should be (if it doesn't already exist).   I always turn on my table constraints, including required fields, unique keys, and foreign keys.   That way you have a better chance that the new data going in will be clean.

Then create a series of insert statements that also massage the data to the new values.
Make sure to save your scripts and name them to show the order of execution.  I use names like CONTACTS_STEP1, CONTACTS_STEP2.  That way I know which to execute and in what order.  The scripts should also include a delete statement to clear the tables before inserting.
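A step script along those lines could look like this sketch (all names hypothetical, SQL Server style string concatenation):

```sql
-- CONTACTS_STEP1: clear the target so the script can be rerun,
-- then load the old data, massaging values on the way in.
DELETE FROM NewDB.dbo.Contacts;

INSERT INTO NewDB.dbo.Contacts (ContactID, FullName, Phone)
SELECT ContactID,
       FirstName + ' ' + LastName,     -- combine split name fields
       COALESCE(Phone, 'UNKNOWN')      -- fill in missing values
FROM   OldDB.dbo.Contacts;
```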

coldchillinAuthor Commented:
The User table has a ton of columns. I was hoping to avoid going through and coding for all of them...I guess that's not possible?
You only need to have statements for the columns that change, not all columns. Unless, of course, the majority were changed.

In my spare time, I have been working on a script generator to produce update scripts. I get tired of it too. I already generate scripts to build the tables, indexes, etc., and to populate a new instance with our default data. We have to support both Oracle and SS2005, so the generators are a big help. Keeping them in sync before was a chore.
If you use Jim's approach of making a copy of the database, then running scripts to alter it, you don't need to create insert statements.  As Jim says, only the columns that are changing require either Alter, Add or Drop statements, plus update statements to massage the data.

My approach of creating a new database and performing inserts would require you to write insert statements.   It's not really that hard to write a script that builds the insert statements for you.  Depending on your database, you select all the columns from the table and place them in a comma-delimited list:  INSERT INTO TABLE ( column list ) SELECT column list FROM oldTable.   You can do this in ColdFusion by using the MyQuery.columnList variable.
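On SQL Server, one way to get that column list without ColdFusion is to query the catalog views. This is a sketch against a hypothetical Users table; the output rows can be pasted into an INSERT INTO ... SELECT statement:

```sql
-- List the columns of a table, one per row, in table order,
-- ready to paste into an INSERT/SELECT column list.
SELECT COLUMN_NAME + ','
FROM   INFORMATION_SCHEMA.COLUMNS
WHERE  TABLE_NAME = 'Users'
ORDER  BY ORDINAL_POSITION;
```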

Which approach to use really depends on what you're doing.   In the past, I have liked to build my new database from scratch, ensuring that all tables in my new database are clean, normalized, and actually in use.   It's like doing spring cleaning in your closet by taking everything out and only putting back what you want.   If you're pretty sure of the contents of your "closet", then you can just pick and choose what to clean/modify.   That would make your life easier.   Do you need some cleaning?
If mine were only for internal use, I might do it differently - never thought about it. The DB I work with supports our primary product, which is a large MES system. So, when I do updates, after my unit testing, they get tested to death by our QA group before they are shipped to customers with the latest revision of the product.

Obviously, when they get to the customer site, they need to be pretty much foolproof. So each bit of script gets wrapped in a script that verifies that it has not been run once already, then runs the script, then records whether it succeeded. Then the whole thing gets wrapped in one script to synchronize the whole thing. It gets pretty complicated, but the customer's DBAs seem to appreciate it.
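That run-once wrapper could be sketched like this, assuming a hypothetical MigrationLog table that each script records itself in (SQL Server syntax):

```sql
-- Only run this step if it hasn't already been applied on this database.
IF NOT EXISTS (SELECT 1 FROM MigrationLog WHERE ScriptName = 'CONTACTS_STEP1')
BEGIN
    -- ... the migration work for this step goes here ...

    -- Record success so a rerun of the master script skips this step.
    INSERT INTO MigrationLog (ScriptName, RunDate)
    VALUES ('CONTACTS_STEP1', GETDATE());
END
```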
