Help with SQLServer replication

Posted on 2012-08-26
Last Modified: 2012-10-08
We are currently looking for a resource (an individual or company) who is a database administration guru under SQLServer 2008 R2, with advanced skills not just in SQLServer administration but also in replication under this platform (and version).  This individual or company would need to be located in Melbourne, Australia, and we would like to engage these services initially on a 2-3 days a week basis.

Initially the resource would need to provide advice/recommendations on a currently pressing issue: possible techniques and approaches for replication and the handling of documents/files (attachments) submitted by the public.

We have a scenario where we have an internet site that accepts details of a submission, together with attachments to support the submission.  These details are then saved into two SQLServer tables hosted with the web site (the attachments are currently stored in the database, not on the file system).  These tables are then replicated down to us using SQLServer merge replication over http (IIS), where they are checked and processed (a flag indicating the successful processing of the submission is updated in the table, which is then replicated back up to the web site).
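For reference, publishing two such tables for merge replication with web synchronization is typically done via the replication stored procedures.  The sketch below is illustrative only — the publication, database, and table names (SubmissionsPub, WebSubsDB, Submission, Attachment) are hypothetical, not taken from the actual system:

```sql
-- Illustrative only: all names here are hypothetical.
USE WebSubsDB;
GO
-- Enable the database for merge publication
EXEC sp_replicationdboption
    @dbname  = N'WebSubsDB',
    @optname = N'merge publish',
    @value   = N'true';

-- Create a merge publication that the internal subscriber pulls over HTTP(S)
EXEC sp_addmergepublication
    @publication               = N'SubmissionsPub',
    @sync_mode                 = N'native',
    @allow_web_synchronization = N'true';

-- Add the parent (submission) and child (attachment) tables as articles
EXEC sp_addmergearticle
    @publication   = N'SubmissionsPub',
    @article       = N'Submission',
    @source_object = N'Submission',
    @type          = N'table';

EXEC sp_addmergearticle
    @publication   = N'SubmissionsPub',
    @article       = N'Attachment',
    @source_object = N'Attachment',
    @type          = N'table';
```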

Occasionally the articles/schema may change, and of course the web developers then want to re-initialise the replication, which in turn leads to an issue with the re-initialisation of the table holding the attachments.  This has been identified as being caused by the snapshot for the attachments table being too large to be transported from the web site to the organisation during replication, i.e. an IIS limitation.
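For what it is worth, when web synchronization fails because the snapshot exceeds what IIS will accept, the request-size limits on the replication virtual directory are the usual suspects.  The fragment below is a hedged sketch of the relevant web.config settings only; the values shown (~200 MB) are placeholders, not a recommendation:

```xml
<configuration>
  <system.web>
    <!-- ASP.NET request limit, in KB (default 4096 KB) -->
    <httpRuntime maxRequestLength="204800" executionTimeout="7200" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- IIS 7+ request filtering limit, in bytes (default 30000000) -->
        <requestLimits maxAllowedContentLength="209715200" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```

Raising these limits treats the symptom rather than the cause, of course — the snapshot can simply outgrow whatever ceiling is set.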

Occasionally the same thing may happen if a large and unexpected volume of submissions with attachments is lodged between replications.

In the near future we are expecting a substantial increase in the volume of attachments lodged via the web site, and are not convinced that we are handling this in the best possible way.

Our web developers have tried using replication filtering in an attempt to alleviate this issue, but we have found that this approach may deliver the parent record (the dispute) without all of its child records (the attachments), or may hold the dispute at the web site for longer than anticipated before it is delivered via replication.
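On the parent-without-children symptom specifically: merge replication's join filters exist precisely to make child rows follow their parent row through a filter.  As a hedged sketch (the publication, article, and column names are hypothetical, not from the actual system):

```sql
-- Illustrative only: names are hypothetical.
-- A join filter ties the child article (Attachment) to the parent
-- article (Submission), so any attachment rows whose parent passes
-- the publication's row filter are delivered with it.
EXEC sp_addmergefilter
    @publication       = N'SubmissionsPub',
    @article           = N'Attachment',
    @filtername        = N'Attachment_follows_Submission',
    @join_articlename  = N'Submission',
    @join_filterclause = N'[Submission].[SubmissionID] = [Attachment].[SubmissionID]',
    @join_unique_key   = 1;  -- join is on a unique key of the parent
```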

Between our external web developers and external database administrators, several approaches have been suggested, including:
•      replication filtering (as mentioned above)
•      replication over a dedicated VPN connection (which in itself may have security connotations)
•      making use of the FILESTREAM feature
•      storing the files on the file system and transferring via secure FTP
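For context on the FILESTREAM option: it stores varbinary(max) data on the NTFS file system while keeping it transactionally consistent with the table, which can relieve pressure on database and snapshot size.  A minimal sketch of enabling it, with hypothetical database, filegroup, and path names (note it must also be enabled at the Windows service level in SQL Server Configuration Manager):

```sql
-- Illustrative only: names and paths are hypothetical.
-- 1. Enable FILESTREAM access at the instance level
EXEC sp_configure 'filestream access level', 2;
RECONFIGURE;

-- 2. Add a FILESTREAM filegroup and container to the database
ALTER DATABASE WebSubsDB
    ADD FILEGROUP AttachmentFS CONTAINS FILESTREAM;
ALTER DATABASE WebSubsDB
    ADD FILE (NAME = N'AttachmentFSFile', FILENAME = N'D:\Data\AttachmentFS')
    TO FILEGROUP AttachmentFS;

-- 3. Store attachment bytes as FILESTREAM data
CREATE TABLE dbo.Attachment
(
    AttachmentID UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE DEFAULT NEWID(),
    SubmissionID INT NOT NULL,
    Content      VARBINARY(MAX) FILESTREAM NULL
);
```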

Unfortunately, with both the replication filtering and the FILESTREAM feature, neither the external web developers nor the external DBAs are familiar enough with replication (and these features) to be confident that either is the best approach.

Basically, we are struggling to formulate the best approach to handle the increased volume in the transfer of the above attachments, together with the details of the submissions the attachments relate to.
Question by: MartinTay

    Accepted Solution

    You could do offsite log shipping with Robo-FTP.  This would give you secure, encrypted file transmissions without using a VPN... like when you order something online with a credit card.

    By restoring the transaction logs at an offsite server you keep the remote database up to date one small chunk at a time.  As a side benefit, you are also testing that your incremental transaction log backups can actually be used to restore your DB in the event of an emergency.

    Here is a sample server event script that shows using Robo-FTP Server as a secure site that automatically restores database backup files as they are uploaded:
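    The server event script itself did not survive in this copy of the thread.  The restore it performs would boil down to something like the following T-SQL, kept here only as a hedged sketch — the database name and file paths are placeholders:

```sql
-- Illustrative only: database name and paths are placeholders.
-- Restore the full backup once, leaving the database able to accept log restores
RESTORE DATABASE WebSubsDB
    FROM DISK = N'D:\Uploads\WebSubsDB_full.bak'
    WITH NORECOVERY, REPLACE;

-- Then, as each uploaded log backup arrives, apply it in sequence
RESTORE LOG WebSubsDB
    FROM DISK = N'D:\Uploads\WebSubsDB_20120826_0915.trn'
    WITH NORECOVERY;

-- Using WITH STANDBY = N'D:\Uploads\undo.dat' instead of NORECOVERY
-- would keep the database readable between restores.
```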

    Here is a sample Robo-FTP client script that runs as a Windows service and automatically creates and sends a transaction log backup every 15 minutes.  It would be trivial to change that to run at whatever interval works for you.  It would also be possible to change it to poll the transaction log to see how full it is, and make and send a backup whenever it reaches, for example, 90% of capacity.
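    That client script likewise did not survive here.  The backup step it repeats every 15 minutes would amount to roughly the following — again a sketch only, with a placeholder database name and path:

```sql
-- Illustrative only: database name and path are placeholders.
-- Taken on a 15-minute schedule; each resulting file is then sent via FTPS/SFTP.
DECLARE @file NVARCHAR(260) =
    N'D:\LogBackups\WebSubsDB_'
    + CONVERT(NVARCHAR(8), GETDATE(), 112)                       -- yyyymmdd
    + N'_'
    + REPLACE(CONVERT(NVARCHAR(8), GETDATE(), 108), N':', N'')   -- hhmmss
    + N'.trn';

BACKUP LOG WebSubsDB TO DISK = @file WITH INIT;
```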

    Author Closing Comment

    Thanks Alex,

    We did consider log shipping; unfortunately, it is not a path our company can go down, because log shipping requires downtime.
