Delta (bit-level) Backups to an external HD

Posted on 2012-08-28
Last Modified: 2012-10-18
I have a client that wants delta-level backups to external drives attached to a
Windows Server 2008 machine.

The reason for this is that there are several large binary files (20-30 GB) that need
to be copied while a very important service is stopped. VSS will not work for these
files while they are open, so a key software program must be temporarily shut down. The
belief was that copying only the delta-level changes to these big files would
dramatically speed up the process.

I have tested Syncrify. While it appears to function, the result is exactly the opposite of what was hoped: using Syncrify to match up and copy one of the big files (the versions were about two weeks apart) took more than 3 hours.

A simple full file copy takes just under an hour.

The folks at Synametrics basically told me it is an "unsupported configuration".

Does anyone know of a good product/solution?


Question by: ziceman
    LVL 13

    Expert Comment

    I'm always pleased with our online/offsite backup solution, but it can be coupled with a device that they supply onsite.  It does full delta backups of Exchange, system state, file shares, SQL, and most things you can chuck at it.

    If you opt for the onsite box as well, it will sync delta changes to the local server and then replicate them offsite at line speed.

    LVL 11

    Expert Comment

    Why wouldn't VSS help?  If the file isn't "quiet" long enough for it to work, then stop the important service, let VSS make a snapshot, then resume the service and let the backup software back up the snapshot.  That way your backup window doesn't have to be restricted by the time you can afford to shut down the service.

    Block-level delta monitoring can't really be done more efficiently than the file system itself does it (with a kernel-mode component like VSS).  What you described with Syncrify would be true for any application outside the file system: it has to query the file system, or simply read the files end to end, to find the changed blocks, which is a slow disk-read process.
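    To make the timing concrete, here is a minimal sketch (Python, untested) of that stop/snapshot/restart sequence.  The service name and volume letter are placeholders, and it assumes "vssadmin create shadow" is available, which it is on the Server editions of Windows; the slow copy then runs against the snapshot after the service is already back up.

        import subprocess
        import time

        SERVICE = "FieldDataCollector"   # placeholder service name
        VOLUME = "D:"                    # placeholder volume holding the big .DAT file

        def run(cmd):
            """Run a command, echo it, and raise if it fails."""
            print(">", " ".join(cmd))
            subprocess.run(cmd, check=True)

        start = time.time()

        # 1. Stop the collector so the big file is quiesced.
        run(["net", "stop", SERVICE])
        try:
            # 2. Snapshot the whole volume -- this takes seconds, not hours.
            run(["vssadmin", "create", "shadow", "/for=" + VOLUME])
        finally:
            # 3. Restart the service immediately; downtime is only the snapshot time.
            run(["net", "start", SERVICE])

        print("Service downtime: %.0f seconds" % (time.time() - start))

        # 4. The long copy can now read from the snapshot ('vssadmin list shadows'
        #    shows its creation time and device path) while the service keeps running.
        subprocess.run(["vssadmin", "list", "shadows", "/for=" + VOLUME], check=True)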

    Author Comment


    We were informed by the vertical market software developer's support engineers that relying on VSS for backups of this large file was "not advised". The software is collecting data from thousands of field instruments at any given moment.

    This being said, how can I confirm when a VSS snapshot has started and when it has completed?
    Where is this data and/or process exposed via a command-line interface?
    LVL 55

    Expert Comment

    If it's collecting data from thousands of field instruments, then presumably it is a database. If so, is it in the traditional database format of a huge data file that you can only take a full backup of once in a blue moon, plus a bunch of transaction logs that you can keep a real-time copy of?

    More info is needed as to what you're backing up; most databases have their own built-in backup routines that are supported, rather than trying to take a snapshot of the data files when they might be quiet. If you don't run those database-aware backup procedures, the transaction logs can get so big that they not only eat up all your disk space but also spawn gremlins that eat your CPU power regulators to feed themselves so they can grow into petabyte-chomping beasts.

    Author Comment

    I only wish it was a database. Some of their other collected information is indeed stored in SQL Server, and these portions are a breeze to back up through the standard maintenance plan.

    This particular chunk is in a 20+ GB proprietary .DAT file (and growing). They do offer a special command-line EXPORT utility that can be run while the system is "up", but the procedure is slow and cumbersome, and it creates a plethora of little files that would make restoration nothing short of a nightmare.

    So, we have that or a manual copy while the system/service is temporarily stopped.

    My whole aim here is simply to reduce the amount of downtime for the backup of this one file.
    LVL 13

    Expert Comment

    I realize I'm going to suggest a second option that isn't directly compatible with what you originally asked, but it's food for thought.

    How about a storage array with Snapshot features?  That way, you can stop the application, trigger a snapshot, and then restart the application.

    The way snapshots work is to mark all of the blocks currently used to store the data as read-only and then write any new data to fresh blocks, thereby protecting the original blocks.  You could then run the slower copy from the snapshot to another drive/device at your leisure.

    The snapshot process would likely only take seconds, rather than minutes or hours.

    See this NetApp PDF for further explanation:
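    Not any vendor's API, just a toy Python sketch of the copy-on-write idea described above: taking a snapshot only freezes the map of blocks in use, and later writes land in new blocks, which is why the snapshot itself completes almost instantly regardless of file size.

        # Toy copy-on-write model -- for illustration only, no real storage involved.
        class CowVolume:
            def __init__(self):
                self.blocks = {}       # live block map: block number -> data
                self.snapshots = []    # each snapshot is a frozen copy of the map

            def write(self, block_no, data):
                # Writes only touch the live map; snapshot maps are never modified.
                self.blocks[block_no] = data

            def snapshot(self):
                # Copies the *map* of blocks, not the data itself, so it is cheap
                # even when the underlying file is 20-30 GB.
                self.snapshots.append(dict(self.blocks))
                return len(self.snapshots) - 1

            def read_snapshot(self, snap_id, block_no):
                return self.snapshots[snap_id][block_no]

        vol = CowVolume()
        vol.write(0, "original data")
        snap = vol.snapshot()          # application stopped only for this instant
        vol.write(0, "new data")       # service resumes and keeps writing
        assert vol.read_snapshot(snap, 0) == "original data"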
    LVL 55

    Expert Comment

    I'd try Symantec CDP, but it may suffer from the same performance problem as your current product: taking a very long time to make the initial backup.
    LVL 11

    Accepted Solution

    If you can put those huge binary files by themselves on a separate volume, it's pretty easy to use the native UI for basic VSS ("Shadow Copies") to coordinate the shutdown of the application with the system's snapshot schedule.  See this for some background info on that:

    Microsoft also has a vshadow.exe tool to do command-line/script manipulation of shadow copies.  It can be downloaded as part of the SDK:

    If your backup software knows how to utilize the native shadow copies directly, it'd be straightforward once the scheduled shadow copies are available.  Otherwise you'd need a tool like vshadow.exe to expose the copies in some way for the backup software to use them.
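    As a rough outline of that second route (untested, and the service/volume/file names are placeholders): stop the service only long enough to create a persistent snapshot with vshadow.exe, bring the service back, expose the snapshot under a spare drive letter, and let robocopy do the long copy from the exposed snapshot.  The -p/-nw/-script/-el flags below are from memory of the VSS SDK's vshadow samples, so verify them against vshadow's built-in help for the SDK version you download.

        import re
        import subprocess

        SERVICE = "FieldDataCollector"      # placeholder service name
        VOLUME = "D:"                       # placeholder volume with the big .DAT file
        EXPOSE_AS = "S:"                    # spare drive letter for the snapshot
        DEST = r"E:\backups"                # external HD target (placeholder)
        VARS_FILE = r"C:\temp\vshadow-vars.cmd"

        def run(cmd):
            print(">", " ".join(cmd))
            subprocess.run(cmd, check=True)

        # 1. Quiesce the application for the few seconds the snapshot takes.
        run(["net", "stop", SERVICE])
        try:
            # 2. Persistent snapshot, no writers (-p -nw); -script makes vshadow
            #    write the new snapshot ID into a small variables file.
            run(["vshadow.exe", "-p", "-nw", "-script=" + VARS_FILE, VOLUME])
        finally:
            # 3. Bring the collector back up immediately.
            run(["net", "start", SERVICE])

        # 4. Pull the snapshot ID out of the file vshadow wrote.
        text = open(VARS_FILE).read()
        shadow_id = re.search(r"SHADOW_ID_1=(\{[0-9A-Fa-f-]+\})", text).group(1)

        # 5. Expose the snapshot as a drive letter and run the long copy from it,
        #    while the live service keeps writing to the real volume.
        run(["vshadow.exe", "-el=" + shadow_id + "," + EXPOSE_AS])

        # robocopy treats exit codes 0-7 as success, so don't use check=True here.
        rc = subprocess.run(["robocopy", EXPOSE_AS + "\\", DEST, "bigdata.dat", "/Z"]).returncode
        print("robocopy exit code:", rc, "(0-7 means success)")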

    Symantec CDP may be a good option, but that requires a whole set of infrastructure to be set up.
