Solved

DTS Pack to copy tables from SQL 2000 to SQL 2005

Posted on 2009-05-09
Last Modified: 2013-11-30
Hello all,

I have a database on a SQL Server 2000 machine with tables that need to be copied to a SQL Server 2005 database on a different machine. The servers are on separate networks, and our firewall is configured so that the 2000 server can see the 2005 server but not the other way around; so it looks like my only option is to initiate the transfer from the 2000 server using DTS rather than from the 2005 server using SSIS.

Every table in my 2000 database exists in the 2005 database, and both have the exact same schema, so there's no concern about crazy column mapping: it's just a straight column-to-column transfer. What I have done thus far is add a Transform Data Task for each table that needs to be copied, but as the number of tables grows or shrinks it has become bothersome to manage the DTS package.

My question: Is there some way I could dynamically specify which table is to be copied for a single Transform Data Task and then automatically reconfigure the column mappings and execute the task so that I won't have so many tasks to manage? I was hoping to just select a list of tables in an ActiveX Script Task then modify the settings for a single Transform Data Task that exists in the Package. Has anyone faced this issue before? If so, your thoughts?

I have attached a screenshot of my current DTS in which I have one Transform Data Task for each table to be copied.
Capture.JPG
Question by:tritonmgt
14 Comments
 
LVL 31

Expert Comment

by:James Murrell
Can't you do it as one DTS job?


Just right-click the database on the 2005 server, choose to import data, and then select all the tables from the 2000 server.
 
LVL 68

Expert Comment

by:Qlemo
Using the DTS Wizard is definitely the best option, as it puts everything needed into a single job.
 
LVL 26

Expert Comment

by:Zberteoc
You could also use the OPENROWSET or OPENDATASOURCE functions to do the job, and then all you need to do is write some INSERT statements:

http://jasonhaley.com/blog/post/2004/03/23/Using-OPENROWSET-for-moving-data-from-SQL-Server-to-SQL-Server.aspx
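For example, here is a rough sketch of what that could look like when run from the 2000 server (the side that can reach the 2005 box). The server name, database, table and login below are placeholders:

INSERT INTO OPENROWSET('SQLOLEDB',
        'Sql2005Server';'myuser';'mypassword',   -- remote 2005 server and login (placeholders)
        TargetDb.dbo.MyTable)                    -- destination table on the 2005 side
SELECT *
FROM dbo.MyTable                                 -- matching source table in the 2000 database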
 
LVL 75

Expert Comment

by:Anthony Perkins
And the big question: much as I like DTS and SSIS, why are you using them in this case? It seems like overkill. Why not just link the servers and do a regular INSERT? Or am I missing something?
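For illustration, a minimal sketch of that approach, run from the 2000 server since it is the side that can see the 2005 box; 'Sql2005Server', 'TargetDb', 'MyTable' and the login are placeholder names:

-- One-time setup: register the 2005 machine as a linked server on the 2000 server
EXEC sp_addlinkedserver @server = N'Sql2005Server', @srvproduct = N'SQL Server'
EXEC sp_addlinkedsrvlogin @rmtsrvname = N'Sql2005Server',
        @useself = 'false', @rmtuser = N'myuser', @rmtpassword = N'mypassword'

-- After that, each table is an ordinary four-part-name INSERT
INSERT INTO Sql2005Server.TargetDb.dbo.MyTable
SELECT * FROM dbo.MyTable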
 
LVL 68

Expert Comment

by:Qlemo
There is (nearly) no difference between an INSERT with a linked server and one with OPENROWSET in this case. However, both require you to loop through all the tables and insert their data. Since there is sp_msForEachTable, that is no big deal either, as shown here with a linked server (run on the 2000 server; the ? placeholder expands to the owner-qualified table name):

exec sp_msforeachtable "insert into rmtsrv.db.? select * from ?"

 
LVL 75

Expert Comment

by:Anthony Perkins
Sure.  But my point was that DTS was not the right approach.
 
LVL 68

Expert Comment

by:Qlemo
I see, acperkins, I just wanted to add that detail.
Nevertheless, DTS allows for an additional step: checking whether the table structure is sufficient, i.e. whether the columns match, even if their order is not the same. To get the same effect with the above script would be much more complex, as you would have to read the column list and build dynamic SQL.
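To give a feel for the difference, here is a rough sketch of that dynamic SQL for a single table, assuming a linked server named rmtsrv pointing at the 2005 box and a destination database called TargetDb (both placeholder names):

DECLARE @table sysname, @cols varchar(8000), @sql varchar(8000)
SET @table = 'MyTable'   -- placeholder table name

-- Keep only the columns that exist (by name) on both sides, whatever their order
SELECT @cols = ISNULL(@cols + ', ', '') + QUOTENAME(c.COLUMN_NAME)
FROM INFORMATION_SCHEMA.COLUMNS c
JOIN rmtsrv.TargetDb.INFORMATION_SCHEMA.COLUMNS r
     ON r.TABLE_NAME = c.TABLE_NAME AND r.COLUMN_NAME = c.COLUMN_NAME
WHERE c.TABLE_NAME = @table
ORDER BY c.ORDINAL_POSITION

-- Build and run a column-matched INSERT
SET @sql = 'INSERT INTO rmtsrv.TargetDb.dbo.' + QUOTENAME(@table) + ' (' + @cols + ') '
         + 'SELECT ' + @cols + ' FROM dbo.' + QUOTENAME(@table)
EXEC (@sql)

And that still has to be repeated for every table, which is the extra complexity mentioned above.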
 
LVL 38

Expert Comment

by:Jim P.
More complicated, but another idea: Microsoft SQL Server Database Publishing Wizard 1.1.

I believe it has a command-line mode you could use to automate the transfer.

http://www.microsoft.com/downloads/details.aspx?familyid=56E5B1C5-BF17-42E0-A410-371A838E570A&displaylang=en
 
LVL 1

Author Comment

by:tritonmgt
Wow! Thank you for all of the suggestions. I know that using DTS is a little much, and moreover I would rather update existing records and insert those that do not exist (based on the table's primary key). But transferring the data between the two servers is not my primary goal; I just need the data on the 2005 server for the sake of convenience. Some of the tables literally have over 100 fields: they are imports from our old FilePro system, and I am migrating the data to our new, better system (which will reside on the 2005 server). So even though a lot of data and many columns will be transferred, I do not wish to invest a lot of time and effort in the transfer mechanism.
I was, however, able to accomplish my goal with a single ActiveX Script Task containing the code below. It may not be very efficient (I'm not sure), but it gets the job done. If anyone has any suggestions for making the code faster, I would definitely appreciate it.

'**********************************************************************
'  Visual Basic ActiveX Script
'**********************************************************************

Function Main()

	Dim package, pump, task, step, trans
	Dim conTables, rsTables, rsColumns
	Dim dataSource, sqlTables, sqlColumns
	Dim iTables, iTasks, iSteps, iColumns
	Dim tableName, columnName, colSource, colDest
	Dim taskExists, stepExists

	dataSource = "Provider=SQLOLEDB.1;Data Source=myserver;Initial Catalog=Migration;User ID=myuser;Password=mypassword;"
	Set package = DTSGlobalVariables.Parent

	'ADO objects used to read the table and column lists
	Set conTables = CreateObject("ADODB.Connection")
	Set rsTables = CreateObject("ADODB.Recordset")
	sqlTables = "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE='BASE TABLE' AND TABLE_NAME NOT IN ('dtproperties') ORDER BY TABLE_NAME"

	conTables.Open dataSource
	rsTables.Open sqlTables, conTables, 3	'adOpenStatic, so RecordCount is populated

	For iTables = 1 To rsTables.RecordCount
		stepExists = False
		taskExists = False
		tableName = rsTables.Fields("TABLE_NAME").Value

		'Remove any task/step left over from a previous (possibly failed) run
		For iTasks = 1 To package.Tasks.Count
			If package.Tasks(iTasks).Name = "task_" & tableName Then
				taskExists = True
				Exit For
			End If
		Next
		For iSteps = 1 To package.Steps.Count
			If package.Steps(iSteps).Name = "step_" & tableName Then
				stepExists = True
				Exit For
			End If
		Next
		If stepExists Then package.Steps.Remove "step_" & tableName
		If taskExists Then package.Tasks.Remove "task_" & tableName

		'Create a step and a data pump task for this table
		Set step = package.Steps.New
		step.Name = "step_" & tableName
		step.TaskName = "task_" & tableName
		package.Steps.Add step

		Set pump = package.Tasks.New("DTSDataPumpTask")
		Set task = pump.CustomTask
		task.Name = "task_" & tableName
		package.Tasks.Add pump

		task.SourceConnectionID = 1
		task.DestinationConnectionID = 2
		task.SourceObjectName = tableName
		task.DestinationObjectName = tableName

		'Build one copy transformation per column
		Set rsColumns = CreateObject("ADODB.Recordset")
		sqlColumns = "SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME='" & tableName & "' ORDER BY ORDINAL_POSITION"
		rsColumns.Open sqlColumns, conTables, 3	'reuses the connection that is already open

		For iColumns = 1 To rsColumns.RecordCount
			columnName = rsColumns.Fields("COLUMN_NAME").Value

			Set trans = task.Transformations.New("DTS.DataPumpTransformCopy")
			trans.Name = "trans_" & tableName & "_" & columnName

			Set colSource = trans.SourceColumns.New(columnName, iColumns)
			colSource.Name = columnName
			colSource.Ordinal = iColumns
			trans.SourceColumns.Add colSource

			Set colDest = trans.DestinationColumns.New(columnName, iColumns)
			colDest.Name = columnName
			colDest.Ordinal = iColumns
			trans.DestinationColumns.Add colDest

			task.Transformations.Add trans

			rsColumns.MoveNext
		Next
		rsColumns.Close

		'Run the transfer for this table, then discard the dynamically built objects
		step.Execute
		package.Tasks.Remove task.Name
		package.Steps.Remove step.Name

		rsTables.MoveNext
	Next

	conTables.Close

	Main = DTSTaskExecResult_Success

End Function

 
LVL 38

Expert Comment

by:Jim P.
>> But transferring the data between the two servers is not my primary
>> goal; I just need the data on the 2005 server for the
>> sake of convenience.

Then a one-time run of the publishing wizard may be what you are looking for.
 
LVL 1

Author Comment

by:tritonmgt
Well, I suppose I should have worded that statement differently. As time goes on, more tables will be converted from the FilePro format to SQL tables that need to be moved to the 2005 server, so this package will be run several times over the course of the next few weeks, and the number of tables (and possibly their fields) can and will change.
 
LVL 68

Accepted Solution

by:
Qlemo earned 500 total points
That is a hard job, and I would rethink that procedure if I were in your shoes. You are asking for something that keeps track of structural and data changes and only applies the differences, and all that work "for the sake of convenience".
 
LVL 1

Author Comment

by:tritonmgt
You all make great points. I only meant to say that I would prefer to manage the transfer process in a more efficient manner, but I do not wish to spend a great deal of time on this, as it is not the most important part of my project; it is merely a means to an end. I can connect to the 2005 server from my laptop (at home) without having to RDC (remote desktop) to the machine, but in order to interact with the 2000 server I have to use RDC. Like I said, transferring the data is only for convenience, so that I can run my migration scripts directly from my laptop instead of over remote desktop.
I really appreciate all of your suggestions and input. I would like to revisit this package down the road to make it more efficient; when that time comes, I am sure your suggestions will prove very helpful.
 
LVL 68

Expert Comment

by:Qlemo
I reckon all has been said, so you should close this question now, awarding points as you see fit.
