Solved

Backup Raid - Remove a disk for nightly transport offsite

Posted on 2013-01-14
351 Views
Last Modified: 2013-01-22
Ok, so I am tired of dealing with software-based backup solutions. I have tried file syncers and full backup programs like Symantec and Acronis; all have their faults, and as the one IT person at a large company, backups have become a serious stress point for me.


My setup:
1 backup server running Windows Server 2003
1 3TB drive that gets all the network's data synced to it (it syncs our mailserver data folders and our fileserver data folder)
We have five 3TB drives attached via USB 3.0, one for each weekday plus the "Week #" drive:
Monday, Tuesday, Wednesday, Thursday, Week #

We take the current week-number drive home, swap it with the oldest one, and the cycle continues.
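(For reference, the rotation above amounts to a simple selection rule. This is a hypothetical Python sketch; the `drive_label` helper and the assumption that the "Week #" drive is the Friday slot are mine, not part of any product.)

```python
from datetime import date

def drive_label(d: date) -> str:
    """Pick which backup drive a given day's sync targets.

    Monday-Thursday each have a dedicated drive; any other day
    (assumed here to be the Friday run) goes to the rotating
    "Week" drive, labelled with the ISO week number so the
    offsite copies stay distinguishable.
    """
    names = ["Monday", "Tuesday", "Wednesday", "Thursday"]
    wd = d.weekday()  # 0 = Monday ... 6 = Sunday
    if wd < 4:
        return names[wd]
    return f"Week {d.isocalendar()[1]}"
```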

The file sync works perfectly from the network to the one main drive, but when I try to sync all the other drives on a daily basis, it becomes bogged down.

The issue is the number of files. We have 2.5TB of data and seemingly millions of files and folders; no sync program can handle it properly, and the imaging programs seem to throw image errors constantly.

The main drive works fine because it already has a current copy, so there are only a few GB of data to sync each night. The problem lies in syncing the other drives, which are days (and many more GB) out of date.
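(The "only a few GB to sync" behavior is just skip-unchanged-files mirroring, roughly what robocopy's /MIR mode does. Here is a minimal Python sketch of the idea; the `mirror` helper, its 2-second mtime tolerance, and the omission of deletions are all illustrative assumptions, not any product's logic.)

```python
import os
import shutil

def mirror(src: str, dst: str) -> int:
    """Mirror src into dst, copying only files that are new or
    whose size/mtime differ -- the reason a current drive syncs
    in minutes while a week-old drive has days of churn to absorb.
    Returns the number of files actually copied.
    (Removing files deleted from src is omitted for brevity.)
    """
    copied = 0
    for root, _dirs, files in os.walk(src):
        rel = os.path.relpath(root, src)
        out_dir = os.path.join(dst, rel)
        os.makedirs(out_dir, exist_ok=True)
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(out_dir, name)
            st = os.stat(s)
            if (not os.path.exists(d)
                    or os.path.getsize(d) != st.st_size
                    or abs(os.path.getmtime(d) - st.st_mtime) > 2):
                shutil.copy2(s, d)  # copy2 preserves the mtime
                copied += 1
    return copied
```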

So I would like a standalone system that I can send files to as backup, then take one drive out of that system each day, replace it with another one (new, or previously used in the array), and have it rebuild a backup with the latest data.

I do not know much about raid but surely there has to be a solution like this out there?

So maybe a hot-swappable RAID computer with 6-8 bays, where it syncs all the files and I can take one drive out (and still access its files from any computer), then stick in a new one to be "rebuilt" to mirror the main?

Not sure if I am explaining my problem properly, so feel free to ask for clarification. This has become a huge stress point for me and I'll take any advice.
Question by:EGormly
12 Comments
 
LVL 16

Expert Comment

by:choward16980
Dude, at 2.5 TB you should be using tape. Two LTO-5s should do ya.

If you insist on the hard drive method, just use a workstation with two hot-swappable hard drive caddies. Use Windows 7 and make a software RAID 1 (mirror). To remove a drive, shut the workstation all the way down, remove the disk, and boot back up. Delete the orphaned disk from the volume, then re-add it or rebuild the mirror. No software other than the OS necessary. I used to do this, but HDs are really sensitive. You should be taping 2.5 TB if you know what's good for you and your company. Software solutions like Backup Exec will work perfectly at the file level. Don't use imaging for something so large unless you're going to consider a solution like deduplication; at 2.5 TB, no telling how large those deltas can grow.
 
LVL 4

Expert Comment

by:Levi Gwyn
If such a system existed, I would have one. I know this does not answer your question, but we swap our USB drives just like you do and have not found a better hardware solution.

We have about 6TB of data to protect and we're going with a cloud-based backup solution from Barracuda. I have no allegiance to Barracuda - there are other solutions out there like it. Depending on your budget, you may want to consider something like that. It greatly reduces the need to screw around with taking disks (or tape) offsite all the time.
 
LVL 47

Expert Comment

by:dlethe
Tape is the right tool for this job. If it costs too much, then buy used; but bottom line, it is the way to go.
 
LVL 20

Expert Comment

by:SelfGovern
I agree with the posters who say that tape is the right solution.  Because of the number of files you have to back up, you might want to consider a disk-to-disk-to-tape (D2D2T) solution.
Consider using a product like HP's Data Protector as the backup software, because it can perform an "incremental forever with synthetic full backup".  This means that you back up the full filesystem to disk once.  After that, you only run incremental backups.  Periodically (once a week, typically), you have the software assemble a backup tape (your synthetic full) which is the same as if you'd taken a complete full backup at that point in time.
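(The synthetic-full assembly can be illustrated with a small Python sketch, modelling each backup as a simple path-to-version catalog. This is just the concept, not Data Protector's actual format; the `None`-marks-a-deletion convention is my own.)

```python
def synthetic_full(base: dict, incrementals: list) -> dict:
    """Combine one real full backup with a chain of incrementals
    into a catalog equivalent to a fresh full backup, without
    re-reading the source filesystem. Each backup is modelled as
    {path: version}; None marks a file deleted in that increment.
    """
    full = dict(base)
    for inc in incrementals:
        for path, version in inc.items():
            if version is None:
                full.pop(path, None)  # file deleted since base
            else:
                full[path] = version  # new or changed file
    return full
```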

The benefits of this solution are:
- easy off site portability
- very fast restores from disk when needed (particularly individual files or directories)
- provides great archival data stores, if you ever have to go back to data from a year or three ago (think tax data, fraud recovery, contract enforcement, etc.)

IBM's TSM does a very similar incremental forever strategy, but is significantly more complex and requires a lot more administration than Data Protector.

The reason I suggest the incremental forever strategy instead of a simpler/cheaper D2T alone is that you're talking about millions of files.   As a bit of reading here or on other backup forums will reveal, backing up millions of small files quickly is an extremely difficult proposition because of the way backup applications have to walk the filesystem trees.  Incremental Forever minimizes a lot of the difficulty and can give you much better performance.  Additionally, D2D2T will save tons of wear and tear on your tape drives and tape media compared to very slow lots-of-small-files D2T backup.
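(The walk-the-tree cost is easy to see in a sketch: even a minimal "incremental" pass still has to stat every single entry just to decide whether it changed, which is exactly the rescanning that incremental-forever catalogs avoid. A hypothetical Python illustration, not any product's code:)

```python
import os

def changed_since(root: str, cutoff: float):
    """Walk a tree with os.scandir and yield only files modified
    after `cutoff` (a POSIX timestamp). Note that every entry is
    still stat'ed -- with millions of small files, this metadata
    crawl alone dominates the nightly backup window.
    """
    stack = [root]
    while stack:
        path = stack.pop()
        with os.scandir(path) as it:
            for entry in it:
                if entry.is_dir(follow_symlinks=False):
                    stack.append(entry.path)
                elif entry.stat(follow_symlinks=False).st_mtime > cutoff:
                    yield entry.path
```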

Be aware that the typical D2D backup as you described won't provide an archival solution (disks aren't made to retain data for long periods when not powered on), and the space requirements if you retain all data grow pretty amazingly fast.

Also -- if you do go with some sort of RAID disk implementation that has you cycling through disks, check the manufacturer's data for the number of plug/unplug cycles it is designed for.  Many of the common RAID systems have backplanes designed for only a couple dozen remove/add cycles, since the expected usage is to plug the disks in and leave 'em in until a failure or rare disk upgrade requires a swap.
 
LVL 16

Expert Comment

by:Gerald Connolly
It may seem like a good idea to use a pool of disks and keep swapping them over, but in reality it's not.

As @SelfGovern said, disk interface connectors are not normally designed to be plugged in and out on a regular basis, although USB connectors may be more reliable for this purpose.

I agree with the others that tape would be a better option than disk (especially USB disk), as it will be significantly quicker.

The problem with both tape and disks is that they both have one unreliable component - the protein robot that has to unplug/eject the disk/tape and transport it to the remote site (and back again).

One good option would be to have a tape library on a remote site which would cut out this unreliable component altogether.
 
LVL 47

Expert Comment

by:dlethe
For what it is worth... I added a short USB expansion cable to a port on a test system into which I often plug equipment. That way you only wear out half of a cable instead of the all-important USB port soldered onto the motherboard.

You might also just do something completely different.  Contact one of those online backup companies.  I can't imagine you have that much data changing on a daily basis and it may end up costing you a lot less having some cloud-based backup company providing the repository along with a background agent program that does this 24x7x365.

This also eliminates the human factor. Plus, there IS a cost associated with taxiing disk drives around, so that can contribute to the value proposition of going to online backup.
 
LVL 4

Accepted Solution

by:
Levi Gwyn earned 500 total points
Cloud-based backup is the best option in my opinion.  I won't dispute that tape is very convenient for the backup part of the equation but when it comes time for recovery, tape is too slow.

There are a lot of appliance-based backup solutions that also archive to the cloud. Barracuda is an example, but there are plenty of others. This way you have a two-stage backup with a local copy that can be used to recover data quickly in a mid-level disaster scenario. You also have the cloud copy as your offsite replica in the event of a high-level disaster recovery scenario.

This setup takes care of your offsite requirement without the need to screw around with removing USB disks, setting them back up on the server, labeling, etc.

Tape will definitely work - not flaming the folks who are fans - but if your budget can support it, an appliance-plus-cloud solution is the best in my view.
 

Author Comment

by:EGormly
Thanks for the suggestions, guys. I have also been using the cloud-based DriveHQ and it cannot handle the sync at all; it errors out all the time.

so this is another reason I wanted to go with the hard drives.

Can someone tell me about Barracuda?
 
LVL 4

Expert Comment

by:Levi Gwyn
 
LVL 47

Expert Comment

by:dlethe
There are lots of cloud backup vendors, but you really need to do your homework and learn from people who have a large amount of data and FIRST HAND experience.  Also factor in your internet pipe bandwidth.

Probably best to open a new question on cloud backup solutions to get recommendations from users rather than online reviews and provider websites.  But since tape is out of your budget, cloud is the "best" solution.
 

Author Closing Comment

by:EGormly
Thank you, I have chosen the cloud. :)
 
LVL 4

Expert Comment

by:Levi Gwyn
Good luck and please let us know what solution you choose.  Very interested in your experiences.
