Help with SQL Server replication

We are currently looking for a resource (an individual or company) who is a database administration guru under SQL Server 2008 R2, with advanced skills not just in SQL Server administration but also in replication on this platform and version.  This individual or company would need to be located in Melbourne, Australia, and we would like to engage these services initially on a 2-3 days a week basis.

Initially the resource would need to provide advice and recommendations on a currently pressing issue: possible techniques and approaches for replication and the handling of documents/files (attachments) submitted by the public.

We have a scenario where an internet site accepts the details of a submission, together with attachments to support it.  These details are saved into two SQL Server tables hosted with the web site (the attachments are currently stored in the database, not on the file system).  The tables are then replicated down to us using SQL Server merge replication over HTTP (IIS), where the submissions are checked and processed (a flag indicating successful processing is updated in the table and then replicated back up to the web site).
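To make the scenario concrete, a minimal sketch of what the two tables might look like (all table and column names here are assumptions, not the actual schema):

```sql
-- Hypothetical schema; the real table and column names will differ.
CREATE TABLE dbo.Submission (
    SubmissionId  UNIQUEIDENTIFIER NOT NULL
        CONSTRAINT PK_Submission PRIMARY KEY DEFAULT NEWSEQUENTIALID(),
    SubmittedAt   DATETIME NOT NULL DEFAULT GETUTCDATE(),
    Details       NVARCHAR(MAX) NULL,
    Processed     BIT NOT NULL DEFAULT 0   -- flag replicated back up once processed
);

CREATE TABLE dbo.Attachment (
    AttachmentId  UNIQUEIDENTIFIER NOT NULL
        CONSTRAINT PK_Attachment PRIMARY KEY DEFAULT NEWSEQUENTIALID(),
    SubmissionId  UNIQUEIDENTIFIER NOT NULL
        CONSTRAINT FK_Attachment_Submission REFERENCES dbo.Submission (SubmissionId),
    FileName      NVARCHAR(260) NOT NULL,
    Content       VARBINARY(MAX) NOT NULL  -- file bytes held in the database, not on disk
);
```

The `VARBINARY(MAX)` column is what makes the snapshot for the attachments table so large, since every file's bytes travel with it.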

Occasionally the articles/schema may change, and of course the web developers then want to re-initialise the replication, which in turn leads to an issue with re-initialising the table holding the attachments.  This has been identified as being caused by the snapshot for the attachments table being too large to be transported from the web site to the organisation during replication, i.e. an IIS limitation.
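For reference, the IIS-side size limits that typically cause this are configurable in the web synchronization site's web.config; the values below are illustrative only (roughly 100 MB), not a recommendation:

```xml
<configuration>
  <system.web>
    <!-- ASP.NET request limit, in KB -->
    <httpRuntime maxRequestLength="102400" executionTimeout="3600" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- IIS 7+ request limit, in bytes -->
        <requestLimits maxAllowedContentLength="104857600" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```

The merge agent's -UploadGenerationsPerBatch and -DownloadGenerationsPerBatch parameters can also be lowered so that each HTTP exchange carries less data, rather than raising the IIS limits ever higher.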

Occasionally the same thing happens when a large and unexpected volume of submissions with attachments is lodged between replications.

In the near future we are expecting a substantial increase in the volume of attachments lodged via the web site, and are not convinced that we are handling this in the best possible way.

Our web developers have tried replication filtering in an attempt to alleviate this issue, but we have found that this approach may deliver the parent record (the dispute) without all of its child records (the attachments), or may hold the dispute at the web site for longer than anticipated before it is delivered via replication.
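The parent-without-children behaviour described is what a merge join filter is intended to prevent: it ties the child article's rows to whichever parent rows the subscriber receives. A hedged sketch, using hypothetical publication, article, and column names:

```sql
-- Hypothetical names throughout; a join filter makes the Attachment article
-- follow the filtered Dispute article, so each dispute arrives with its files.
EXEC sp_addmergefilter
    @publication       = N'SubmissionsPub',
    @article           = N'Attachment',      -- child article
    @filtername        = N'Attachment_belongs_to_Dispute',
    @join_articlename  = N'Dispute',         -- filtered parent article
    @join_filterclause = N'[Dispute].[DisputeId] = [Attachment].[DisputeId]',
    @join_unique_key   = 1;                  -- join is on the parent key
```

Whether a join filter also resolves the delayed-delivery symptom depends on how the existing row filters are written, so this is worth verifying in a test topology first.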

Between our external web developers and external database administrators, several approaches have been suggested, including:
•      replication filtering (as mentioned above)
•      replication over a dedicated VPN connection (which in itself may have security implications)
•      making use of the FILESTREAM feature
•      storing the files on the file system and transferring via secure FTP
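Of these, FILESTREAM changes where the bytes live without changing the relational model. A rough sketch of what enabling it involves (database, filegroup, and path names are assumptions):

```sql
-- FILESTREAM must first be enabled at the instance level (sp_configure /
-- Configuration Manager); then the database needs a FILESTREAM filegroup.
ALTER DATABASE Submissions
    ADD FILEGROUP AttachmentsFS CONTAINS FILESTREAM;
ALTER DATABASE Submissions
    ADD FILE (NAME = AttachmentsFSFile, FILENAME = 'D:\Data\AttachmentsFS')
    TO FILEGROUP AttachmentsFS;

-- The attachment bytes then live on the NTFS file system, managed by SQL Server.
-- A FILESTREAM table requires a UNIQUE ROWGUIDCOL column.
CREATE TABLE dbo.Attachment (
    AttachmentId UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(),
    FileName     NVARCHAR(260) NOT NULL,
    Content      VARBINARY(MAX) FILESTREAM NULL
);
```

One caveat worth checking against the replication documentation: merge replication transmits FILESTREAM data as ordinary varbinary(max) on the wire, so FILESTREAM alone may not shrink what IIS has to carry during a re-initialisation.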

Unfortunately, with both replication filtering and the FILESTREAM feature, neither the external web developers nor the external DBAs are familiar enough with replication (and these features) to be confident that either is the best approach.

Basically, we are struggling to work out the best approach for handling the increased volume of attachments being transferred, together with the details of the submissions they relate to.
You could do offsite log shipping with Robo-FTP.  This would give you secure, encrypted file transmissions without using a VPN... like when you order something online with a credit card.

By restoring the transaction logs at an offsite server you keep the remote database up to date one small chunk at a time.  As a side benefit, you are also testing that your incremental transaction log backups can actually be used to restore your database in the event of an emergency.
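The restore side of that loop is plain T-SQL; database and file names below are placeholders:

```sql
-- Applied at the offsite server for each uploaded log backup, in sequence.
-- NORECOVERY leaves the database in a restoring state, ready for the next log.
RESTORE LOG Submissions
    FROM DISK = N'D:\Incoming\Submissions_20120101_1015.trn'
    WITH NORECOVERY;

-- When the database is actually needed (e.g. in a disaster), bring it online:
RESTORE DATABASE Submissions WITH RECOVERY;
```

Once RECOVERY is run, no further logs can be applied without starting the chain again from a full backup.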

Here is a sample server event script that shows Robo-FTP Server acting as a secure site that automatically restores database backup files as they are uploaded.

Here is a sample Robo-FTP client script that runs as a Windows service and automatically creates and sends a transaction log backup every 15 minutes.  It would be trivial to change it to run at whatever interval works for you, or to poll the transaction log and take a backup whenever it reaches, for example, 90% of capacity.
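The Robo-FTP scripts themselves aren't reproduced here, but the T-SQL each 15-minute cycle would drive is just an incremental log backup (database and path names are placeholders):

```sql
-- One incremental transaction log backup; the client script would then
-- upload the resulting .trn file over secure FTP.
DECLARE @file NVARCHAR(260) =
    N'D:\Outgoing\Submissions_'
    + CONVERT(NVARCHAR(8), GETDATE(), 112)                              -- yyyymmdd
    + N'_' + REPLACE(CONVERT(NVARCHAR(5), GETDATE(), 108), ':', '')     -- hhmm
    + N'.trn';

BACKUP LOG Submissions TO DISK = @file WITH INIT;
```

For the 90%-full variant, DBCC SQLPERF(LOGSPACE) reports per-database log space usage that a polling script could check before deciding to back up.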
MartinTay (Author) commented:
Thanks Alex,

We did consider log shipping; unfortunately, it is not an option our company can pursue, because log shipping requires downtime.
