
Good solution for 4TB remote backup

We're using a remote backup vendor to manage the backup of our data center. In total we keep 4 TB of data remotely, split into 3.9 TB of file data and 100 GB of MS SQL Server databases. We pay our vendor a lot of money (EUR 2000/month) for this backup, and we're looking for another solution. Specifically, we're thinking about arranging our own backup hardware in another data centre and doing the remote backup ourselves. But I would like to hear from people who have done such setups before and can recommend ways of doing this.

Here are my thoughts:
* A NAS box with good expansion options, so we can safely grow the data over time. Today we grow by 1 TB each year. We would buy this box ourselves and place it in another data centre for redundancy.
* Reliable backup software for large MS SQL Server databases. I like the Tivoli TDP agent we use today, because it transfers the database backup directly to the TSM server instead of making a local backup first and then transferring it. This works OK, but I'm sure there are alternatives.
* Reliable and fast file backup for a large number of files. We have 3.9 TB of file data, split into around 10 million files and 5 million folders (and still growing). We use Tivoli TSM today, because it has a Journal Engine (JBB) that keeps track of file changes in real time. This makes our daily incremental backup very fast, because it does not have to scan the filesystem for changes - it just backs up the files tracked in JBB. This works OK, but I think there must be other ways?
* I would like to be able to take backups/snapshots more often, so I can recover data as close to the time of a breakdown as possible. Today I would lose 1 day of work if the system crashed.
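As a rough capacity-planning sketch for the NAS sizing point above — the 4 TB current size and 1 TB/year growth come from the figures in this question, while the 5-year horizon and the 25% RAID/parity overhead are purely illustrative assumptions:

```python
# Rough capacity planning for the storage box described above.
# Figures (4 TB current, 1 TB/year growth) come from the question;
# the time horizons and RAID overhead factor are illustrative assumptions.

def required_raw_capacity_tb(current_tb, growth_tb_per_year, years, raid_overhead=1.25):
    """Usable data after `years`, scaled by an assumed RAID/parity overhead."""
    usable = current_tb + growth_tb_per_year * years
    return usable * raid_overhead

for years in (1, 3, 5):
    print(f"{years} year(s): ~{required_raw_capacity_tb(4.0, 1.0, years):.2f} TB raw")
```

At these assumed growth rates, a box that can expand to roughly 11-12 TB raw would cover five years.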

Does anyone have good ideas? I look forward to hearing your thoughts. Thanks.
1 Solution
Sounds like you need a SAN. I'm not sure I would use a NAS, though; I'd be more inclined to use some low-end DAS device. Even some of the HP storage arrays would suit - they have fibre channel devices for direct connection to the SAN. I'm not 100% certain, but Tivoli should have the add-ons you need for your situation. As far as backup hardware goes, you'll probably be best off with an LTO3 library. I've always had my own vendor preferences and no doubt you have yours. Personally I would use EMC SAN devices, an HP Ultrium3 tape library, an HP storage array and Veritas Backup Exec/NetBackup. Backup Exec itself can be run remotely, and since it sounds like all your devices and data to be backed up are already at the data center, NetBackup is not a requirement here. You could run hourly snapshots, or even more frequent ones, which would cover your restoration-time issues. As I said, check out what options Tivoli has for snapshots. No doubt other experts will be along shortly to give a perspective on that.
jawsdk (Author) commented:
So you would start out by using a SAN as storage for all my data, both files and databases? And once I use a SAN to store all my data, the backup procedure will suddenly give me a lot of advantages, such as hourly snapshots, remote backup, etc.?

Concerning tape libraries such as an LTO3 library, I think of those as quite old-fashioned. I don't like the fact that I need to switch tapes physically, and I have an idea (without knowing for sure) that a tape solution is more expensive than a disk-based one. I was wondering about placing a storage box with cheap SATA disks remotely and backing up to that.
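The tape-versus-disk cost intuition can be sanity-checked with a rough break-even calculation. Only the EUR 2000/month vendor fee comes from this thread; the hardware and hosting prices below are purely illustrative assumptions:

```python
# Rough break-even check for the "cheap SATA storage box" idea above.
# The EUR 2000/month vendor fee comes from the question; the hardware
# and hosting prices below are purely illustrative assumptions.

def months_to_break_even(box_eur, vendor_eur_per_month, hosting_eur_per_month):
    """Months until the one-off box cost is recovered by the monthly saving."""
    monthly_saving = vendor_eur_per_month - hosting_eur_per_month
    return box_eur / monthly_saving

# Assumed: EUR 8000 for an ~8 TB SATA box, EUR 400/month remote hosting.
print(f"Break-even after ~{months_to_break_even(8000, 2000, 400):.1f} months")
```

Under these assumed prices the box pays for itself within months, which is why a needs/cost analysis like the one suggested below is worth doing with real quotes.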

It's very important to me that the solution is remote, in case of fire or something else happening in my main data center.

However, my next thought would be to take advantage of having all our data in another location: put some servers at that location too, and suddenly have two redundant locations for our complete software solution. We offer hosted web-based systems on the Windows platform (2003, IIS, SQL Server, etc.), so this idea should actually be possible... just a quick thought.
All your thoughts on this are valid options, but of course your budget is the primary issue. Some say that tape is outdated; I disagree. Tapes can be sent offsite to a vault on a daily basis (which is how I do it). Disk-based backups are certainly less expensive than tape-based ones, but I am not convinced the benefit outweighs the risk at such a large scale. When doing network-based backups you need to consider the cost of bandwidth and data: moving 3 TB on a weekly basis will be extremely expensive. If your budget allows for it, then sure, why not - but I would ensure you have a fully redundant backup system in case of hardware failure. That means that for a redundant backup and a redundant site you need four of everything.
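The bandwidth point above can be made concrete with back-of-envelope transfer times for a 3 TB weekly full. The 3 TB figure comes from the thread; the link speeds and the 80% usable-bandwidth factor are illustrative assumptions:

```python
# Back-of-envelope WAN transfer time for the 3 TB weekly full mentioned above.
# Link speeds and the 80% efficiency factor are illustrative assumptions.

def transfer_hours(data_tb, link_mbps, efficiency=0.8):
    """Hours to move `data_tb` terabytes over a `link_mbps` Mbit/s link,
    assuming only `efficiency` of the nominal bandwidth is usable."""
    bits = data_tb * 1e12 * 8                  # decimal TB -> bits
    usable_bps = link_mbps * 1e6 * efficiency  # Mbit/s -> usable bit/s
    return bits / usable_bps / 3600

for mbps in (10, 100, 1000):
    print(f"{mbps:>5} Mbit/s: ~{transfer_hours(3.0, mbps):.1f} h")
```

Even a dedicated 100 Mbit/s link needs several days for a 3 TB full, which is why incremental or de-duplicated transfers matter so much for a remote disk target.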

With the proper architecture the SAN will allow you to do everything you need. The architecture is the key to success in this kind of deployment.

As far as your backup software and options go, I am saying that proper configuration and the correct amount of storage allocation can allow you to take hourly snapshots. Snapshots are a great idea for another reason too: your backup window becomes much larger, because backups are taken from the snapshot rather than from the live files, allowing you to back up during business hours with no impact whatsoever on the network or the server resources.

For the service you're running, I agree a redundant site is quite important, and again, your architecture and budget will dictate what you can achieve in your specific environment. This should be analyzed on a needs basis.
If disk-based backups are what you are thinking of, then I think ExaGrid has the right kind of solution with its disk de-duplication technology, which can transfer backup data at the byte level.
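The general de-duplication idea — store each unique chunk of data only once, so repeated backups of mostly unchanged data take little extra space and bandwidth — can be illustrated with a minimal content-hash sketch. This is a toy demonstration of the technique, not ExaGrid's actual implementation:

```python
# Toy fixed-size chunk de-duplication: identical chunks are stored once,
# keyed by their SHA-256 hash. Real products use variable-size chunking
# and more robust indexing; this only illustrates the principle.
import hashlib

def dedup_store(data: bytes, chunk_size: int = 4096):
    """Split `data` into chunks; return (unique chunk store, ordered hash list)."""
    store, order = {}, []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)   # identical chunks stored only once
        order.append(digest)
    return store, order

# Two "backups" that share most of their content:
monday = b"A" * 8192 + b"B" * 4096
tuesday = b"A" * 8192 + b"C" * 4096       # only the last chunk changed
store, _ = dedup_store(monday + tuesday)
print(f"{len(store)} unique chunks for 6 stored chunks")
```

Because most chunks repeat between the two backups, only three unique chunks are kept for six chunks' worth of logical data — the same effect that makes daily backups of a slowly changing 3.9 TB file set cheap to ship over a WAN.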

Refer to this link.

Read this white paper on disk-based backups.

Going forward, if you want to manage your backups yourselves, you will need to get people capable of doing so.

The advantage of using a SAN is that the backup does not interfere with the network. But if you involve NAS, it's advisable to have a dedicated network for the backups.

As far as high availability is concerned, you can go for solutions like CA XOsoft or Double-Take for your SQL, Exchange and file servers, which can perform replication over WAN links.


The ExaGrid system is certainly a great product and will definitely fit this scenario. The only drawback is that if all reliance is on this type of system, there is no recovery option for deleted or corrupted files. A backup system is still required if you intend to take hourly snapshots for this type of recovery.
I should clarify: what I am saying is that the ExaGrid systems are useful both for file storage and for disk-based backups. You would need to consider that in your cost and needs analysis.