
Solved

Copy 1 large sql 2008 table to another database on the same machine

Posted on 2013-06-03
Medium Priority
308 Views
Last Modified: 2016-02-11
I have a table with about 30 million records that I need to move to another database on the same machine. The table does have an identity column. I'm using T-SQL and it is very slow. Any suggestions?

The Code:

BEGIN DISTRIBUTED TRANSACTION

BEGIN TRY

      SET IDENTITY_INSERT [MasterWare_Archive].[dbo].[InputMasterWare_Archive] ON
      
    INSERT  INTO [MasterWare_Archive].[dbo].[InputMasterWare_Archive] ( ID
                         , Line
                         , FileDate
                         , ... )
            SELECT ID
                         , Line
                         , FileDate
                         , ...
            FROM
                [MasterWare].[dbo].[InputMasterWare_Archive]
            WHERE
                [DateInstalled] > 90;
               
      SET IDENTITY_INSERT [MasterWare_Archive].[dbo].[InputMasterWare_Archive] OFF

      -- Commit inside the TRY so a failure never reaches COMMIT with no open transaction
      COMMIT ;

END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK ;
    PRINT 'Failed to copy' ;
    PRINT ERROR_MESSAGE() ;
    PRINT ERROR_LINE() ;
END CATCH
Question by:wdbates
4 Comments
 
LVL 16

Accepted Solution

by: Surendra Nath (earned 150 total points)
ID: 39217203
Your transaction log might be building up a lot. You can insert into the table in batches instead; with batched inserts, a single batch might fail, but the previous batches will already be committed to the database.

DECLARE @count BIGINT, @count_max BIGINT, @batch BIGINT
SET @batch = 1000   -- rows per batch; tune for your environment

SELECT @count = MIN(ID) FROM [MasterWare].[dbo].[InputMasterWare_Archive]
SELECT @count_max = MAX(ID) FROM [MasterWare].[dbo].[InputMasterWare_Archive]

BEGIN TRY
    SET IDENTITY_INSERT [MasterWare_Archive].[dbo].[InputMasterWare_Archive] ON

    WHILE (@count <= @count_max)   -- <= so the rows in the final ID range are not skipped
    BEGIN
        INSERT  INTO [MasterWare_Archive].[dbo].[InputMasterWare_Archive] ( ID
                             , Line
                             , FileDate
                             , ... )
                SELECT ID
                             , Line
                             , FileDate
                             , ...
                FROM
                    [MasterWare].[dbo].[InputMasterWare_Archive]
                WHERE [DateInstalled] > 90
                AND ID >= @count AND ID < @count + @batch

        SET @count = @count + @batch
    END

    SET IDENTITY_INSERT [MasterWare_Archive].[dbo].[InputMasterWare_Archive] OFF
END TRY
BEGIN CATCH
    IF @@TRANCOUNT > 0
        ROLLBACK ;
    PRINT 'Failed to copy' ;
    PRINT ERROR_MESSAGE() ;
    PRINT ERROR_LINE() ;
END CATCH


 
LVL 40

Assisted Solution

by: lcohan (earned 150 total points)
ID: 39219475
The best option in my opinion is SSIS, or the SQL Server Import/Export Wizard directly if this is a one-time job:

http://www.mssqltips.com/sqlservertutorial/202/simple-way-to-export-data-from-sql-server/
http://beyondrelational.com/modules/12/tutorials/24/tutorials/9011/getting-started-with-ssis-part-6-import-export-wizard-part-1.aspx



Or use BCP OUT, then copy the file to the other SQL box if needed; otherwise just use BCP IN:

http://msdn.microsoft.com/en-us/library/aa337544(v=sql.90).aspx
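
For what it's worth, a minimal command-prompt sketch of that bcp route, assuming the archive table has the same column layout as the source and that a trusted connection to the local default instance works; the file path is a placeholder, -n keeps native format, and -E preserves the existing ID values on load. The [DateInstalled] > 90 filter would need queryout with the full column list instead, or could be applied after the load:

rem Export the source table to a native-format data file (hypothetical path).
bcp MasterWare.dbo.InputMasterWare_Archive out "C:\Temp\InputMasterWare_Archive.dat" -S . -T -n

rem Load the file into the archive table, keeping the original identity values (-E).
bcp MasterWare_Archive.dbo.InputMasterWare_Archive in "C:\Temp\InputMasterWare_Archive.dat" -S . -T -n -E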
 

Author Comment

by:wdbates
ID: 39220896
Hello All;

Sorry for the delay, but work has been a bitch. It is a weekly job, but the first time it runs will be a very large pull of data. I like the SSIS approach, but how can I commit or roll back in SSIS if the package fails? This is critical data, and I need to be sure that the records are loaded or, if the package fails, that the job is rolled back.

WB
 

Author Closing Comment

by:wdbates
ID: 39226690
Both answers were helpful, and I will be able to use each in different situations.

Thank you for your support.
