Synchronizing two tables using a stored procedure, only updating and adding rows where values do not match

Hi All,

I've got two tables with identical structure. I'm trying to create a synchronization procedure that will be triggered by a scheduled event at a given time every day.
-- full refresh: empty the local copy, then reload every row from the remote table
truncate table [localserver].[DBName].dbo.linktest
go

insert into [localserver].[DBName].dbo.linktest
(
ID, Name
)
select ID, Name
from [remoteserver].DBName.dbo.linktest with (nolock)


Instead of inserting every single row on every execution, like in my script above, is it possible to insert only the new rows on the local server?

This would greatly reduce the execution time, since it wouldn't have to reinsert all 4,000 rows when perhaps only 20 are new.

Please advise how to modify the above script so that each execution appends only the new rows instead of reloading all 4,000.
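
For reference, a minimal sketch of how the script could be wrapped in a stored procedure that a SQL Server Agent job calls once a day (the procedure name dbo.usp_SyncLinkTest and creating it in DBName on the local server are assumptions for illustration only):

-- Sketch only: wrap the daily sync so a SQL Server Agent job can call it.
-- Assumes the procedure lives in DBName on the local server, so the local
-- table can be referenced with a three-part name.
CREATE PROCEDURE dbo.usp_SyncLinkTest
AS
BEGIN
    SET NOCOUNT ON;

    -- current approach: full refresh of the local copy
    TRUNCATE TABLE dbo.linktest;

    INSERT INTO dbo.linktest (ID, Name)
    SELECT ID, Name
    FROM [remoteserver].DBName.dbo.linktest WITH (NOLOCK);
END;
GO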
sqldba2013 asked:
 
sqldba2013 (Author) commented:
We don't have full access on the remote server; we have only read permissions there.

Can we perform the above steps on the local server instead of on the remote one?
 
Pratima Pharande commented:
insert into [localserver].[DBName].dbo.linktest
(
ID, Name
)
select ID, Name
from [remoteserver].DBName.dbo.linktest with (nolock)
-- append only the rows whose ID does not already exist locally
where ID not in ( select ID from [localserver].DBName.dbo.linktest )
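
If ID could ever be NULL on either side, NOT IN can silently match nothing; a NOT EXISTS version of the same insert avoids that (a sketch against the same tables):

INSERT INTO [localserver].[DBName].dbo.linktest (ID, Name)
SELECT r.ID, r.Name
FROM [remoteserver].DBName.dbo.linktest r WITH (NOLOCK)
WHERE NOT EXISTS
(
    -- row is new if no local row shares its ID
    SELECT 1
    FROM [localserver].DBName.dbo.linktest l
    WHERE l.ID = r.ID
);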
 
Pratima Pharande commented:
To update existing rows whose values differ:

Update L
Set L.Name = R.Name
From [localserver].[DBName].dbo.linktest L
inner join [remoteserver].DBName.dbo.linktest R on L.ID = R.ID
where L.Name <> R.Name
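
One caveat: if Name is nullable, the <> test skips rows where either side is NULL. A common T-SQL workaround (a sketch, same tables) compares the values through EXCEPT, which treats NULLs as comparable:

UPDATE L
SET L.Name = R.Name
FROM [localserver].[DBName].dbo.linktest L
INNER JOIN [remoteserver].DBName.dbo.linktest R ON L.ID = R.ID
-- EXCEPT returns a row only when R.Name is distinct from L.Name, NULLs included
WHERE EXISTS (SELECT R.Name EXCEPT SELECT L.Name);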
 
sqldba2013 (Author) commented:
From a performance perspective, which option is better: the update or the insert?
 
pven13 commented:
You can use the MERGE statement to handle both inserts and updates:

MERGE [localserver].[DBName].dbo.LinkTest AS L
USING [remoteserver].[DBName].dbo.LinkTest AS R
ON L.id = R.id
WHEN MATCHED THEN
	UPDATE SET L.name = R.name
WHEN NOT MATCHED THEN
	INSERT (id, name)
	VALUES (R.id, R.name);



If you don't need the updates, you can skip the "when matched" part.
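
If rows deleted on the remote side should also disappear locally, MERGE can cover that too. A sketch only; the extra change-detection test and the DELETE clause are optional additions beyond what was asked for:

MERGE [localserver].[DBName].dbo.LinkTest AS L
USING [remoteserver].[DBName].dbo.LinkTest AS R
ON L.id = R.id
WHEN MATCHED AND L.name <> R.name THEN   -- touch only rows that actually changed
	UPDATE SET L.name = R.name
WHEN NOT MATCHED BY TARGET THEN          -- new on the remote side: insert locally
	INSERT (id, name)
	VALUES (R.id, R.name)
WHEN NOT MATCHED BY SOURCE THEN          -- gone from the remote side: delete locally
	DELETE;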
 
Scott Pletcher (Senior DBA) commented:
For performance, you need to send the row ids/keys to a temporary table on the remote server, so that the join is completely on the remote server.  Otherwise, SQL is very likely to copy the entire remote table to the local server to do the join.

Therefore, create a table on [remoteserver] to hold the key values from [localserver], then DELETE and load that table (you can't TRUNCATE a remote table).

--run on [remoteserver] once; match the ID data type to linktest.ID (int assumed here)
--CREATE TABLE /*[remoteserver].*/DBName.dbo.linktest_keys ( ID int NOT NULL );

-- clear the remote key table (you can't TRUNCATE a remote table)
DELETE FROM [remoteserver].DBName.dbo.linktest_keys

-- push the local keys to the remote server so the join runs entirely there
INSERT INTO [remoteserver].DBName.dbo.linktest_keys ( ID )
SELECT ID
FROM [localserver].DBName.dbo.linktest WITH (NOLOCK)

-- pull back only the remote rows whose keys the local table doesn't have yet
INSERT INTO [localserver].DBName.dbo.linktest ( ID, Name )
SELECT l.ID, l.Name
FROM [remoteserver].DBName.dbo.linktest l WITH (NOLOCK)
LEFT OUTER JOIN [remoteserver].DBName.dbo.linktest_keys lk ON
    lk.ID = l.ID
WHERE
    lk.ID IS NULL


As for changes, I'd use some type of datetime column or other flag on the row so you could tell whether an UPDATE has been run on that row since the last time you synced the tables. Anything else would likely involve a lot more overhead.
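
A minimal sketch of that idea, assuming the remote table's owner could add a LastModified column (and keep it current on updates, e.g. via a trigger) and that the job records when it last ran:

-- Sketch only; assumes a LastModified column exists on the remote table and is
-- maintained on every insert/update (a default constraint alone only covers inserts).
DECLARE @LastSync datetime;
SET @LastSync = '20130101';   -- in practice, read this from a small sync-log table

UPDATE L
SET L.Name = R.Name
FROM [localserver].[DBName].dbo.linktest L
INNER JOIN [remoteserver].DBName.dbo.linktest R ON L.ID = R.ID
WHERE R.LastModified >= @LastSync;   -- only rows changed since the last sync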
 
Scott Pletcher (Senior DBA) commented:
You can do a local join to the remote server table, using LEFT JOIN as demonstrated above.

But performance will not improve, since all of the remote rows will still have to be copied to the local server to perform the join.
 
pven13 commented:
Why don't you run a simple test and see how well the different solutions perform? No one can predict that since we don't know the specs of your network connections.
 
Scott Pletcher (Senior DBA) commented:
Regardless of the current network connections and existing load (or lack of it), it's a poor practice to force SQL to copy huge tables to another server to do a join when you can copy a vastly smaller select list yourself to the other server.
 
pven13 commented:
I agree. But the author mentioned 4000 rows and 20 new rows. That's not what I call "huge tables"... He also wrote he only has read permission on the remote server.

Besides, I would like to know if sqldba2013 is going to do something with our suggestions.
 
sqldba2013 (Author) commented:
The above solutions resolved my issue.
Question has a verified solution.
