Moving 80 tables from one DB to another server/DB, but with different WHERE clauses

2 queries as below

SELECT * FROM   PAGORI.RASA.MARKING_TARGET   WHERE  PICK_CODE_DATE IN ('2011-07-31','2012-01-11')
SELECT * FROM   PAGORI.RASA.FUND_SOURCE WHERE PICK_CODE_DATE IN ('2011-07-31','2012-01-11')

--------------------------

SELECT * FROM PAGORI.RASA.SOURCE WHERE OFFICE_CODE IN ('3262','4373','4373','4377','8443','9843','4332')
SELECT * FROM PAGORI.RASA.PICTOUR WHERE OFFICE_CODE IN ('3262','4373','4373','4377','8443','9843','4332')
4 other such table transfers

--------------------------
 
SELECT * FROM   PAGORI.RASA.RESPROT WHERE OFFICE_CODE IN ('3262','4373','4373','4377','8443','9843','4332') AND PICK_CODE_DATE IN ('2011-07-31','2012-01-11')
SELECT * FROM   PAGORI.RASA.SCOREMNT WHERE OFFICE_CODE IN ('3262','4373','4373','4377','8443','9843','4332') AND PICK_CODE_DATE IN ('2011-07-31','2012-01-11')
70 more such table transfers

These 80 tables' data need to be transferred to another server (same schema on the destination database). The 80 tables need to be truncated and then loaded.

Is the attached SSIS package the best method to do it?
[Attachment: Untitled.png]
Asked by 25112
 
David Todd, Senior DBA, commented:
Hi,

This does not look like something that can easily be made data-driven, unless those conditions are the same for each and every table.

If that is the case, create a table that has the table names in it, and write a cursor or WHILE loop to work through the 80 tables one by one with some dynamic SQL.
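A rough sketch of that idea, run on the destination server. All object names here are hypothetical: a driver table dbo.TransferTables holding each table name and its filter, and a linked server SourceServer pointing at the source instance.

-- Hypothetical driver table: one row per table, carrying the WHERE clause it needs.
CREATE TABLE dbo.TransferTables
(
    TableName   sysname       NOT NULL,  -- schema-qualified name, e.g. 'RASA.MARKING_TARGET'
    WhereClause nvarchar(max) NOT NULL
);

INSERT INTO dbo.TransferTables (TableName, WhereClause)
VALUES ('RASA.MARKING_TARGET', 'PICK_CODE_DATE IN (''2011-07-31'',''2012-01-11'')'),
       ('RASA.SOURCE', 'OFFICE_CODE IN (''3262'',''4373'',''4373'',''4377'',''8443'',''9843'',''4332'')');
       -- ... one row for each of the 80 tables

DECLARE @TableName sysname, @WhereClause nvarchar(max), @sql nvarchar(max);

DECLARE table_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT TableName, WhereClause FROM dbo.TransferTables;

OPEN table_cur;
FETCH NEXT FROM table_cur INTO @TableName, @WhereClause;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Truncate the local copy, then reload the filtered rows from the source via the linked server.
    -- NB: sketch only -- no QUOTENAME/validation; SELECT * assumes identical column order,
    -- no identity columns, and no foreign keys blocking the TRUNCATE.
    SET @sql = N'TRUNCATE TABLE ' + @TableName + N'; '
             + N'INSERT INTO ' + @TableName
             + N' SELECT * FROM SourceServer.PAGORI.' + @TableName
             + N' WHERE ' + @WhereClause + N';';
    EXEC sys.sp_executesql @sql;

    FETCH NEXT FROM table_cur INTO @TableName, @WhereClause;
END

CLOSE table_cur;
DEALLOCATE table_cur;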

And if you need another 100 tables done in the future, then that is relatively easy.

HTH
  David

PS Dropping a table involves minimal logging (and locking), and if the restored database is initially in SIMPLE recovery mode while you drop the unwanted tables and delete the unwanted rows, then the logging shouldn't be too bad. Afterwards, change the recovery model to FULL and take a backup.
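For example, that sequence might look roughly like the following (database, table, and path names are hypothetical):

-- Assumes the restored copy is called PAGORI_Copy on the destination server.
ALTER DATABASE PAGORI_Copy SET RECOVERY SIMPLE;

USE PAGORI_Copy;
DROP TABLE RASA.SOME_UNWANTED_TABLE;   -- drop the tables you don't need at all

DELETE FROM RASA.PICTOUR               -- trim rows outside the required filter, table by table
WHERE OFFICE_CODE NOT IN ('3262','4373','4373','4377','8443','9843','4332');

ALTER DATABASE PAGORI_Copy SET RECOVERY FULL;
BACKUP DATABASE PAGORI_Copy TO DISK = N'D:\Backups\PAGORI_Copy_full.bak';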
 
David Todd, Senior DBA, commented:
Hi,

Depending on what the source database looks like, would it be easier to restore a backup of the current database on the destination server, and then filter off what isn't needed?
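A minimal sketch of that restore, with hypothetical backup and file paths:

-- Restore the latest full backup of the source database onto the destination server.
RESTORE DATABASE PAGORI_Copy
FROM DISK = N'\\FileShare\Backups\PAGORI_full.bak'
WITH MOVE 'PAGORI_Data' TO N'D:\SQLData\PAGORI_Copy.mdf',
     MOVE 'PAGORI_Log'  TO N'E:\SQLLogs\PAGORI_Copy.ldf',
     RECOVERY;

-- The logical file names ('PAGORI_Data', 'PAGORI_Log') would come from:
-- RESTORE FILELISTONLY FROM DISK = N'\\FileShare\Backups\PAGORI_full.bak';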

HTH
  David
 
25112 (Author) commented:
There are 500+ tables, David, and some of them are too big, so deletes will take a lot of log maintenance and time.

I am open to ideas. Thanks.
 
25112 (Author) commented:
With the SSIS option it seems like I have to hard-code 80 queries? That seems tedious, though the benefit is I can save the package and rerun it as needed. But if I need to do the same thing for another set of 100 tables, it becomes hard again.

Is there any way to automate this better by segments? 2 table transfers have one set of WHERE conditions, 8 have another set, and 70+ have yet another.
 
25112 (Author) commented:
Thank you, David.
Question has a verified solution.
