tritonmgt (United States of America)

asked on

DTS Pack to copy tables from SQL 2000 to SQL 2005

Hello all,

I have a database on a SQL Server 2000 machine with tables that need to be copied to a SQL Server 2005 database on a different machine. The servers are on separate networks, and our firewall is configured so that the 2000 server can see the 2005 server but not the other way around; so it looks like my only option is to initiate the transfer from the 2000 server using DTS, rather than with SSIS from the 2005 server.

Every table in my 2000 database exists in the 2005 database, and both have exactly the same schema, so there is no concern about crazy column mapping: it's just a straight column-to-column transfer. What I have done thus far is add a Transform Data Task for each table that needs to be copied, but as the number of tables grows or shrinks, the DTS package has become bothersome to manage.

My question: Is there some way I could dynamically specify which table is to be copied for a single Transform Data Task, then automatically reconfigure the column mappings and execute the task, so that I won't have so many tasks to manage? I was hoping to just select a list of tables in an ActiveX Script Task and then modify the settings of a single Transform Data Task that exists in the package. Has anyone faced this issue before? If so, your thoughts?

I have attached a screenshot of my current DTS in which I have one Transform Data Task for each table to be copied.
Capture.JPG
James Murrell (United Kingdom)

Can you not do it as one DTS job?


Just right-click the database on the 2005 server, choose the import wizard, and then select all the tables from the 2000 server.
Using the DTS Wizard is definitely the best option, as it builds everything needed into a single job.
You could also use the OPENROWSET or OPENDATASOURCE functions to do the job, and then all you need to do is write some insert statements:

http://jasonhaley.com/blog/post/2004/03/23/Using-OPENROWSET-for-moving-data-from-SQL-Server-to-SQL-Server.aspx
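
For example, run from the 2000 server (which can reach the 2005 box), an OPENROWSET insert might look like this untested sketch; the server name, database, table, and credentials are all placeholders:

-- Run on the 2000 server; pushes local rows into the remote 2005 table.
-- 'server2005', 'myuser', 'mypassword', Migration, and MyTable are placeholders.
INSERT INTO OPENROWSET('SQLOLEDB',
    'Server=server2005;UID=myuser;PWD=mypassword',
    Migration.dbo.MyTable)
SELECT * FROM dbo.MyTable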
And the big question, much as I like DTS and SSIS: why are you using them in this case? It seems like overkill. Why not just link the servers and do a regular INSERT? Or am I missing something?
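
For reference, setting up such a linked server on the 2000 box might look like the following untested sketch; the name rmtsrv, the server address, and the credentials are all placeholders:

-- Register the 2005 server as a linked server and map a login
-- ('rmtsrv', 'server2005', 'myuser', 'mypassword' are placeholders)
EXEC sp_addlinkedserver @server = 'rmtsrv', @srvproduct = '',
    @provider = 'SQLOLEDB', @datasrc = 'server2005'
EXEC sp_addlinkedsrvlogin @rmtsrvname = 'rmtsrv', @useself = 'false',
    @rmtuser = 'myuser', @rmtpassword = 'mypassword'

-- After that, a plain four-part-name INSERT works
INSERT INTO rmtsrv.Migration.dbo.MyTable
SELECT * FROM dbo.MyTable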
There is (nearly) no difference between an INSERT with a linked server and one with OPENROWSET in this case. However, both require you to loop through all tables and insert their data. Since there is an sp_msForEachTable, this is no big deal either, as shown here with a linked server (run on the 2000 server to push into the linked 2005 server; note that the ? placeholder already expands to the quoted [owner].[table] name):

exec sp_msforeachtable "insert into rmtsrv.db.? select * from ?"


Sure.  But my point was that DTS was not the right approach.
I see, acperkins, I just wanted to add that detail.
Nevertheless, DTS allows for an additional check: whether the table structures are compatible, i.e. that the columns match even if their order differs. To get the same effect with the above script would be much more complex, as you would have to read the column names and build them into dynamic SQL.
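
To illustrate, for a single table such a dynamic statement might be built like this untested sketch (rmtsrv, db, and MyTable are placeholders); the explicit column list makes the insert independent of column order:

-- Collect the column names (here simply read from the local metadata)
DECLARE @cols varchar(8000), @sql varchar(8000)

SELECT @cols = ISNULL(@cols + ', ', '') + QUOTENAME(COLUMN_NAME)
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_NAME = 'MyTable'
ORDER BY ORDINAL_POSITION

-- Build and run an INSERT with an explicit, matching column list
SET @sql = 'INSERT INTO rmtsrv.db.dbo.MyTable (' + @cols + ') '
    + 'SELECT ' + @cols + ' FROM dbo.MyTable'
EXEC (@sql)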
More complicated, but another idea: Microsoft SQL Server Database Publishing Wizard 1.1.

I believe it has a command-line interface you could use to run it and automate the transfer.

http://www.microsoft.com/downloads/details.aspx?familyid=56E5B1C5-BF17-42E0-A410-371A838E570A&displaylang=en

ASKER

Wow! Thank you for all of the suggestions. I know that using DTS is a little much, and moreover I would rather update existing records and insert those that do not exist (based on the table's primary key). But transferring the data between the two servers is not my primary goal; I just need the data on the 2005 server for the sake of convenience. Some of the tables literally have over 100 fields: they are imports from our old FilePro system, and I am migrating the data to our new, better system (which will reside on the 2005 server). So even though a lot of data and a great many columns will be transferred, I do not wish to invest much time and effort in the transfer mechanism.
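
(Since SQL 2000 has no MERGE, that update-or-insert would be the classic two-statement pattern; an untested sketch, with placeholder table, key, and column names:)

-- Update rows that already exist on the remote side
UPDATE d
SET    d.Col1 = s.Col1
FROM   rmtsrv.Migration.dbo.MyTable d
JOIN   dbo.MyTable s ON s.KeyCol = d.KeyCol

-- Insert rows whose key is not there yet
INSERT INTO rmtsrv.Migration.dbo.MyTable (KeyCol, Col1)
SELECT s.KeyCol, s.Col1
FROM   dbo.MyTable s
WHERE  NOT EXISTS (SELECT 1
    FROM rmtsrv.Migration.dbo.MyTable d
    WHERE d.KeyCol = s.KeyCol)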
I was, however, able to accomplish my goal with a single ActiveX Script Task containing the code below. It may not be very efficient (I'm not sure), but it gets the job done. If anyone has suggestions for making the code faster, I would definitely appreciate them.

'**********************************************************************
'  Visual Basic ActiveX Script
'**********************************************************************
 
Function Main()
 
	Dim package, task, step, trans
	Dim dataSource
	Dim sqlTables
	Dim iTables
	
	'Connection string for the source database (server, catalog, and credentials are placeholders)
	dataSource = "Provider=SQLOLEDB.1;Data Source=myserver;Initial Catalog=Migration;User ID=myuser;Password=mypassword;"
	Set package = DTSGlobalVariables.Parent
 
	'ADO Objects
	Set conTables = CreateObject("ADODB.Connection")
	Set rsTables = CreateObject("ADODB.Recordset")
	sqlTables = "SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE='BASE TABLE' AND TABLE_NAME NOT IN ('dtproperties') ORDER BY TABLE_NAME"
 
	conTables.Open dataSource
	rsTables.Open sqlTables, conTables, 3	'3 = adOpenStatic, so RecordCount is populated
 
	For iTables = 1 to rsTables.RecordCount
		stepExists = False
		taskExists = False
		tableName = rsTables.Fields("TABLE_NAME").Value
 
		For iTasks = 1 to package.Tasks.Count
			If package.Tasks(iTasks).Name = "task_" & tableName Then
				taskExists = True
				Exit For
			End If
		Next
		For iSteps = 1 to package.Steps.Count
			If package.Steps(iSteps).Name = "step_" & tableName Then
				stepExists = True
				Exit For
			End If
		Next
		'Remove any step/task left over from a previous run before recreating them
		If stepExists Then package.Steps.Remove "step_" & tableName
		If taskExists Then package.Tasks.Remove "task_" & tableName
 
		Set step = package.Steps.New
		
		step.Name = "step_" & tableName
		step.TaskName = "task_" & tableName
		package.Steps.Add step
	
		Set pump = package.Tasks.New("DTSDataPumpTask")
		Set task = pump.CustomTask
		task.Name = "task_" & tableName
		package.Tasks.Add pump
		
		task.SourceConnectionID = 1		'ID of the package's source (2000) connection
		task.DestinationConnectionID = 2	'ID of the package's destination (2005) connection
		task.SourceObjectName = tableName
		task.DestinationObjectName = tableName
		
		Set rsColumns = CreateObject("ADODB.Recordset")
		sqlColumns = "SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME='" & tableName & "' ORDER BY ORDINAL_POSITION"
 
		rsColumns.Open sqlColumns, conTables, 3 'uses existing connection object that is already open
 
		For iColumns = 1 to rsColumns.RecordCount
			columnName = rsColumns.Fields("COLUMN_NAME").Value
			
			Set trans = task.Transformations.New("DTS.DataPumpTransformCopy")
			Set colSource = trans.SourceColumns.New(columnName, iColumns)
			Set colDest = trans.DestinationColumns.New(columnName, iColumns)
 
			trans.Name = "trans_" & tableName & "_" & columnName
 
			colSource.Name = columnName
			colSource.Ordinal = iColumns
			trans.SourceColumns.Add colSource
 
			colDest.Name = columnName
			colDest.Ordinal = iColumns
			trans.DestinationColumns.Add colDest
			
			task.Transformations.Add trans
			
			rsColumns.MoveNext
		Next
		rsColumns.Close
 
		'Run the copy for this table now, then discard the dynamic step and task
		step.Execute
 
		package.Tasks.Remove task.Name
		package.Steps.Remove step.Name
		rsTables.MoveNext
	Next
 
	conTables.Close
 
	Main = DTSTaskExecResult_Success
End Function


>> But transferring the data between the two servers is not my primary
>> goal; I just need the data on the 2005 server for the
>> sake of convenience.

Then a one-time run of the Publishing Wizard may be what you are looking for.
Well, I suppose I should have worded that statement differently. As time goes on, more tables will be converted from the FilePro format to SQL tables that need to be moved to the 2005 server, so this package will be run several times over the next few weeks, and the number of tables (and possibly their fields) can and will change.
ASKER CERTIFIED SOLUTION
Qlemo (Germany)

You all make great points. I only meant to say that I would prefer to manage the transfer process in a more efficient manner, but I do not wish to spend a great deal of time on this, as it is not the most important part of my project; it is merely a means to an end. I can connect to the 2005 server from my laptop (at home) without having to RDC (Remote Desktop) into the machine, but to interact with the 2000 server I have to use RDC. Like I said, transferring the data is only for convenience, so I can run my migration scripts directly from my laptop instead of over Remote Desktop.
I really appreciate all of your suggestions and input. I would like to revisit this package down the road to make it more efficient; when that time comes, your suggestions will prove very helpful, I am sure.
I reckon all is said, so you should close this question now, awarding points as you see fit.