Solved

INSERT_UPDATE in SQL Server

Posted on 2006-10-30
Medium Priority
812 Views
Last Modified: 2008-01-09
Hi experts,

I'm looking for advice on the best way to achieve the following, with performance being the primary goal.

I have two tables called table_import and table_live.

table_import will have over 1 million rows of data imported into it each night. This is not appended to existing data but is a new set of data.
table_live holds the data that the live application will be referencing.

Each night I want to run a job that compares the data in table_import against the data in table_live: removing rows from table_live that are no longer in table_import, updating any rows in table_live that have been updated in table_import, and adding new rows from table_import that do not exist in table_live.

I have been told a function exists in DB2 called insert_update that achieves this using primary keys in both tables.

The issue I have is that I do not have primary keys available in either table, but I do have a unique column, a telephone number, in both tables.

Also, the table_live table will be in constant use so performance is a priority.

Can anyone please advise on the best method and provide example scripts to do this on SQL Server?

Thanks,
Paul
Question by:trancetopia

6 Comments
LVL 11

Expert Comment

by:rw3admin
ID: 17838173
>>Each night I want to run a job that compares the data in table_import against the data in table_live: removing rows from table_live that are no longer in table_import, updating any rows in table_live that have been updated in table_import, and adding new rows from table_import that do not exist in table_live.<<

Will your table_import table have a flag for 'new' or 'update' records?

If not, then why not truncate the table_live table and insert everything from table_import?
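A minimal sketch of that approach (assuming the two tables share the same layout; the column list shown here is only illustrative):

TRUNCATE TABLE table_live

INSERT INTO table_live (telephone, foo)
SELECT telephone, foo
FROM table_import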
rw3admin
 
LVL 10

Expert Comment

by:AaronAbend
ID: 17838333
Given the need to be up and running continuously, I suggest building the new table and renaming it...

-- build a copy of the live data, apply the import to the copy, then swap the names
select * into table_new from table_live
-- remove rows that will be replaced by the import
delete from table_new where phonekey in (select phonekey from table_import)
-- add the imported rows (replace "columns" with the real column list)
insert into table_new (columns) select columns from table_import

exec sp_rename 'table_live', 'table_old'
exec sp_rename 'table_new', 'table_live'

Now, if this will delete all of the rows, then rw3admin has the best solution.
 

Author Comment

by:trancetopia
ID: 17838338
No, there are no flags on the import data. I'm not able to truncate the live table, as the application will fail if there is no data there.

table_live is similar to table_import, but some of the data will be modified before going into the live table. The live table needs to be up 24/7.

I did intend to truncate the live table overnight, during an out-of-hours period, and transform the import data into the live table. However, I have now been told that the data needs to be available 100% of the time, so I cannot do this. The only thing I can think of is:

1. Find all new rows in the import table and insert into the live table.
2. Find all rows that exist in the live table and do not exist in the import table and delete them.
3. Find any changed data in the import table compared to live and update live with the modification.

Here are example queries I plan to use for the above:
1. INSERT INTO live (foo) SELECT foo FROM import i WHERE NOT EXISTS (SELECT telephone FROM live l WHERE i.telephone = l.telephone)
2. DELETE FROM live WHERE NOT EXISTS (SELECT telephone FROM import i WHERE i.telephone = live.telephone)

I'm not sure about the 3rd, but there should only be one updated column. Another note on my unique column is that it is not that unique: some customers will have multiple numbers, so there could be 3 rows per customer.
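For the 3rd, something like this might work (just a sketch, assuming foo is the one column that gets updated):

UPDATE l
SET    l.foo = i.foo
FROM   live l
JOIN   import i ON i.telephone = l.telephone
WHERE  l.foo <> i.foo    -- only touch rows where the value actually changed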

Author Comment

by:trancetopia
ID: 17838383
Please also let me know if there is a better way to do the above queries to improve performance.

I also want to say that I'm planning to put a clustered index on the telephone column (which is an integer datatype). Is a clustered index best for this column?
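Something like this is what I had in mind (the index name is just a placeholder I made up):

CREATE CLUSTERED INDEX IX_live_telephone ON live (telephone)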
 
LVL 11

Expert Comment

by:rw3admin
ID: 17838391
Any time you are doing any sort of DML on the live table, your table is locked for SELECT operations anyway. You can use SELECT with NOLOCK, but that would defeat the purpose, as your interface will report dirty records as well (records still in transactions).
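For example, a read with the hint would look something like this (table and column taken from your question; the value is just an example). It can return rows from transactions that are still in flight:

SELECT *
FROM live WITH (NOLOCK)
WHERE telephone = 5551234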

 
LVL 11

Accepted Solution

by:
rw3admin earned 2000 total points
ID: 17838430
Try these:

-- insert rows from import that are not yet in live
INSERT INTO live (foo)
SELECT   i.foo
FROM     import i
LEFT JOIN
         live l
ON       i.telephone = l.telephone
WHERE    l.telephone IS NULL

-- delete rows from live that are no longer in import
DELETE   l
FROM     live l
LEFT JOIN
         import i
ON       i.telephone = l.telephone
WHERE    i.telephone IS NULL
