Solved

Help with SQLServer replication

Posted on 2012-08-26
Medium Priority
442 Views
Last Modified: 2012-10-08
We are currently looking for a resource (an individual or company) who is a database administration guru on SQL Server 2008 R2, with advanced skills not just in SQL Server administration but also in replication on this platform and version. The individual or company would need to be located in Melbourne, Australia, and we would like to engage these services initially on a 2-3 days a week basis.

Initially the resource would need to provide advice and recommendations on a currently pressing issue: possible techniques and approaches for replication and the handling of documents/files (attachments) submitted by the public.

We have a scenario where an internet site accepts details of a submission, together with attachments to support the submission.  These details are then saved into two SQL Server tables hosted with the web site (the attachments are currently stored in the database, not on the file system).  These tables are then replicated down to us using SQL Server merge replication over HTTP (IIS), where they are checked and processed (a flag indicating the successful processing of the submission is updated in the table, and that update is then replicated back up to the web site).

Occasionally the articles/schema may change, and of course the web developers then want to re-initialise the replication, which in turn leads to an issue with re-initialising the table holding the attachments.  This has been identified as being caused by the snapshot for the attachments table being too large to be transported from the web site to the organisation during replication, i.e. an IIS limitation.
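For what it's worth, when the failure really is an IIS/ASP.NET upload-size limit during web synchronisation, one commonly suggested mitigation (a sketch only, not verified against this environment; the values below are illustrative maximums, not recommendations) is to raise the request-size limits in the web.config of the site hosting the replication listener:

```xml
<configuration>
  <system.web>
    <!-- maxRequestLength is in KB; 2097151 KB is the ASP.NET ceiling (~2 GB) -->
    <httpRuntime maxRequestLength="2097151" executionTimeout="7200" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- maxAllowedContentLength is in bytes (IIS 7 and later) -->
        <requestLimits maxAllowedContentLength="2147483647" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```

Raising these limits only buys headroom, of course; it does not address the underlying snapshot size.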

Occasionally the same thing may happen if a large and unexpected volume of submissions with attachments is lodged between replications.

In the near future we are expecting a substantial increase in the volume of attachments lodged via the web site, and are not convinced that we are handling this in the best possible way.

Our web developers have tried using replication filtering in an attempt to alleviate this issue, but we have found that this approach may deliver us the parent record (the dispute) but not all of the child records (the attachments), or may hold the dispute at the web site for longer than anticipated before it is delivered via replication.
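(For reference, the parent-without-children behaviour described above is typically addressed in merge replication with an explicit join filter on the child article, so that child rows are delivered whenever their parent qualifies. A sketch, where the publication, article, and column names are hypothetical placeholders, not our actual schema:

```sql
-- Sketch only: 'SubmissionsPub', 'Dispute', 'Attachment' and 'DisputeID'
-- are placeholders.  A join filter ties the child article's rows to the
-- parent article's filtered rows.
EXEC sp_addmergefilter
    @publication       = N'SubmissionsPub',
    @article           = N'Attachment',        -- child article
    @filtername        = N'Attachment_Dispute',
    @join_articlename  = N'Dispute',           -- parent article
    @join_filterclause = N'[Dispute].[DisputeID] = [Attachment].[DisputeID]',
    @join_unique_key   = 1;  -- 1 = the join is on a unique key of the parent
```

Whether a join filter resolves the delay we have seen is exactly the kind of thing we need an experienced replication resource to confirm.)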

Between our external web developers and external database administrators, several approaches have been suggested, including:
•      replication filtering (as mentioned above)
•      replication over a dedicated VPN connection (which in itself may have security connotations)
•      making use of the FILESTREAM feature
•      storing the files on the file system and transferring via secure FTP

Unfortunately, with both replication filtering and the FILESTREAM feature, neither the external web developers nor the external DBAs are familiar enough with replication (and these features) to be confident that either is the best approach.
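(For anyone weighing the FILESTREAM option: it stores the varbinary data on the NTFS file system while keeping it queryable through the table. A minimal sketch of what an attachments table might look like; the names are illustrative, and it assumes a FILESTREAM filegroup has already been added to the database. Note that FILESTREAM requires a ROWGUIDCOL uniqueidentifier column:

```sql
-- Sketch: illustrative table definition, not our actual schema.
-- Requires the database to have a FILESTREAM filegroup.
CREATE TABLE dbo.Attachment
(
    AttachmentID UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE
                 DEFAULT NEWID(),             -- mandatory for FILESTREAM
    DisputeID    INT            NOT NULL,
    FileName     NVARCHAR(260)  NOT NULL,
    Content      VARBINARY(MAX) FILESTREAM NULL  -- bytes live on NTFS
);
```

How well FILESTREAM columns behave under merge replication over HTTPS is one of the open questions we would want the engaged resource to answer.)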

Basically, we are struggling to formulate the best approach to handling the increased volume in the transfer of the above attachments, together with the details of the submissions the attachments relate to.
Question by:MartinTay
2 Comments
 
Accepted Solution

by:
AlexPace earned 1000 total points
ID: 38336235
You could do offsite log shipping with Robo-FTP.  This would give you secure, encrypted file transmissions without using a VPN... like when you order something online with a credit card.

By restoring the transaction logs at an offsite server you keep the remote database up to date one small chunk at a time.  As a side benefit, you are also testing that your incremental transaction log backups can actually be used to restore your DB in the event of an emergency.
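The restore side of that loop is a single statement per shipped log file; a sketch with placeholder database name and path:

```sql
-- Sketch: the receiving server applies each shipped log backup in sequence.
-- 'YourDb' and the path are placeholders.
RESTORE LOG [YourDb]
    FROM DISK = N'C:\Inbound\YourDb_log.trn'
    WITH NORECOVERY;  -- stay in restoring state so the next log can be applied
                      -- (use STANDBY instead for read-only access between restores)
```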

Here is a sample server event script that shows using Robo-FTP Server as a secure site that automatically restores database backup files as they are uploaded:
http://kb.robo-ftp.com/script_library/show/60

Here is a sample Robo-FTP client script that runs as a Windows service and automatically creates and sends a transaction log backup every 15 minutes: http://kb.robo-ftp.com/script_library/show/61  It would be trivial to change that to run at whatever interval works for you.  It would also be possible to change it to poll the transaction log to see how full it is, and make and send a backup whenever it reaches, for example, 90% of capacity.
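That log-fullness check could be driven from T-SQL with DBCC SQLPERF; a sketch, with the database name and backup path as placeholders:

```sql
-- Sketch: back up the log once it passes a fullness threshold.
-- 'YourDb' and the disk path are placeholders.
DECLARE @logspace TABLE
(
    DatabaseName    SYSNAME,
    LogSizeMB       FLOAT,
    LogSpaceUsedPct FLOAT,
    Status          INT
);

INSERT INTO @logspace
EXEC ('DBCC SQLPERF(LOGSPACE) WITH NO_INFOMSGS');

IF EXISTS (SELECT 1 FROM @logspace
           WHERE DatabaseName = N'YourDb' AND LogSpaceUsedPct >= 90)
BEGIN
    BACKUP LOG [YourDb] TO DISK = N'C:\Backups\YourDb_log.trn';
END
```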
Author Closing Comment

by:MartinTay
ID: 38476256
Thanks Alex,

we did consider log shipping; however, it is unfortunately not an option our company can pursue, because log shipping requires downtime.
