Solved

MSSQL Data Sync: Stage to Dev

Posted on 2011-03-24
Medium Priority
618 Views
Last Modified: 2012-05-11
I have two separate physical MS SQL 2008 servers that I would like to sync in a nightly job. The table structure will always be the same, but I would like the data from the stage server to overwrite, update, etc. the data on the dev server.

What is the best way to accomplish this?
Question by:jgoodale
7 Comments
 
LVL 40

Expert Comment

by:lcohan
ID: 35208227
The best way, in my opinion, would be to use log shipping - that way, both structure and data changes on one server.database would be carried forward to the other server.database.

http://msdn.microsoft.com/en-us/library/ms190640.aspx
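
For reference, this is roughly the cycle that log shipping automates (a minimal sketch - the database name, paths, and file name below are placeholders, and it assumes the dev copy was initially restored WITH NORECOVERY or STANDBY):

    -- On the stage (primary) server: back up the transaction log on a schedule.
    BACKUP LOG StageDB
        TO DISK = N'\\share\logship\StageDB_20110324.trn';

    -- On the dev (secondary) server: apply each log backup in sequence,
    -- leaving the database ready to accept the next restore.
    RESTORE LOG StageDB
        FROM DISK = N'D:\logship\StageDB_20110324.trn'
        WITH NORECOVERY;  -- or WITH STANDBY for read-only access between restores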
 

Author Comment

by:jgoodale
ID: 35208768
Unfortunately the servers are not on the same network, so a local share is not possible. The development server is behind a firewall on a dynamic IP, whereas the stage server has a static IP behind a different firewall. Exceptions can be created for the static IP, but it would be much harder to go the other way.
 
LVL 28

Assisted Solution

by:Ryan McCauley
Ryan McCauley earned 400 total points
ID: 35208923
If the databases aren't so large that a complete refresh is out of the question, I've written a job like this for some servers I use and it's totally automated:

http://trycatchfinally.net/2009/09/moving-a-sql-server-database-to-another-server-on-a-schedule-without-using-replication/

Basically, a scheduled SQL Agent job on the source server takes a backup, compresses it (with 7-Zip), and ships it to the standby server via FTP. The standby server has a job that's constantly checking for new backups; when it finds one (once the FTP upload is done), it unzips it, restores it, runs some indexing scripts, and then emails me that the backup has been restored.
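
The source-side job step looks roughly like this (a sketch rather than the exact code from the post - the database name, paths, 7-Zip location, and the upload.ftp script file are placeholders, and xp_cmdshell must be enabled):

    -- Take a dated backup of the source database.
    DECLARE @bak nvarchar(260) =
        N'D:\Backups\StageDB_' + CONVERT(nvarchar(8), GETDATE(), 112) + N'.bak';
    BACKUP DATABASE StageDB TO DISK = @bak WITH INIT;

    -- Compress the backup with 7-Zip.
    DECLARE @cmd nvarchar(4000) =
        N'"C:\Program Files\7-Zip\7z.exe" a "' + @bak + N'.7z" "' + @bak + N'"';
    EXEC master..xp_cmdshell @cmd;

    -- Ship it with the command-line FTP client, driven by a prepared script
    -- file containing the open/user/put commands.
    SET @cmd = N'ftp -s:D:\Backups\upload.ftp';
    EXEC master..xp_cmdshell @cmd;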

If you have any questions, please let me know.

 
LVL 40

Expert Comment

by:lcohan
ID: 35209303
If the servers are behind different firewalls and networks, can they talk to each other currently?
If they can't, then indeed you can't do log shipping, and you can't use Replication or SSIS to transfer all tables from one DB to another, but you could still take T-log backups on the staging server, then FTP (or rather SFTP) them to your SQL dev box and restore them over there. For that, all you need is both DBs in FULL recovery mode and an initial full backup/restore from staging to dev.
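
For example, the one-time seeding could look like this (names and paths are placeholders, and the MOVE clauses assume the default logical file names):

    -- On staging: make sure the T-log chain is preserved, then take the base backup.
    ALTER DATABASE StageDB SET RECOVERY FULL;
    BACKUP DATABASE StageDB TO DISK = N'D:\Backups\StageDB_full.bak' WITH INIT;

    -- On dev, after transferring the file over SFTP: restore WITH NORECOVERY so
    -- the subsequent T-log backups can be applied on top.
    RESTORE DATABASE StageDB
        FROM DISK = N'D:\Backups\StageDB_full.bak'
        WITH NORECOVERY,
             MOVE N'StageDB' TO N'D:\Data\StageDB.mdf',
             MOVE N'StageDB_log' TO N'D:\Data\StageDB_log.ldf';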
 

Author Comment

by:jgoodale
ID: 35209347
Let's set the firewalls aside: what is the easiest way to do a scheduled data copy? Ryan's method looks interesting but seems a bit overkill for copying a couple of tables with just a couple hundred rows of varchar(max) data. If no other way makes sense, that is the direction I will go.
 
LVL 40

Accepted Solution

by:lcohan
lcohan earned 1600 total points
ID: 35209512
If you need to only "copy a couple tables with just a couple hundred rows of varchar(max) data"
AND the servers can talk to each other, then why go that way? You can use SSIS to do it: simply right-click one server.database, select Tasks -> Export (or Import) Data depending on where you run it, and go from there. The wizard will create the SSIS package for you - just don't hit Finish; save it as a file as well for later use...
OR
Simply add a Linked Server on either one of them and use distributed queries to copy data from one server.database.table to another!

http://msdn.microsoft.com/en-us/library/ms188279.aspx
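
For example, run on the dev server (a sketch - the linked server name, IP, login, database, table, and column names are all made up; MERGE is available in SQL 2008 and covers the insert/update/delete cases in one statement):

    -- One-time setup: register the staging server by its static IP and map a login.
    EXEC sp_addlinkedserver
        @server     = N'STAGESQL',
        @srvproduct = N'',
        @provider   = N'SQLNCLI',
        @datasrc    = N'203.0.113.10';
    EXEC sp_addlinkedsrvlogin
        @rmtsrvname  = N'STAGESQL',
        @useself     = 'FALSE',
        @rmtuser     = N'sync_user',
        @rmtpassword = N'********';

    -- Nightly job step: overwrite/update the dev table from stage in one pass.
    MERGE AppDB.dbo.MyTable AS tgt
    USING STAGESQL.AppDB.dbo.MyTable AS src  -- four-part name goes through the linked server
        ON tgt.Id = src.Id
    WHEN MATCHED THEN
        UPDATE SET tgt.Payload = src.Payload
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (Id, Payload) VALUES (src.Id, src.Payload)
    WHEN NOT MATCHED BY SOURCE THEN
        DELETE;

Scheduled as a nightly SQL Agent job, that single statement gives the overwrite/update behaviour described above.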
 
LVL 28

Expert Comment

by:Ryan McCauley
ID: 35209626
Ah - my method is way overkill for a few hundred rows in a few tables, though it only needs FTP to be open and will do everything else automatically. If there's other data that you don't want to replicate, though, then it won't work well because it takes everything.

You could set up log shipping via FTP - that only requires the single port as well, and it could move the data over on whatever schedule you want (daily, hourly, constantly, etc). It might be a better option for you. I believe you can use an FTP destination with SQL Server log shipping, but if you can't, you can just ship the logs locally and have something like SyncBackSE (http://www.2brightsparks.com/syncback/sbse-features.html) ship them to your remote site to be applied. I've used that app (no affiliation with them) in a number of cases where I needed to FTP files as they appear, and it's done a great job.
