allanburrows
asked on
Any advice on backing up a very large amount of data?
We currently have around 30 TB, soon to increase to 80 TB, in a full backup that currently goes to tape and is taking far too long. We have a disk-based solution to which we plan to continue doing weekly fulls and daily incrementals using Backup Exec 2010.
To speed things up, I can split the existing job into, say, five jobs, all going to virtual tape libraries on the Quantum.
My question is this: after the first full backup, the next week's full will probably have around 80% of its data exactly the same as the previous week's. The Quantum will dedupe this, so space isn't an issue, but the time wasted sending all that unneeded data across the network is.
Has anyone heard of any product or backup method that can eliminate sending the unnecessary data?
I like the sound of synthetic backups, but I have been told they are only OK if the baseline backup is very reliable; otherwise, every full backup thereafter will be missing files. We have over 100 servers and never get all of them to complete 100%, so synthetic worries me.
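For context on the worry above: a synthetic full is built on the media server by merging the baseline full with the incrementals taken since, so nothing is re-read from the source servers. The sketch below is a toy illustration of that merge (not any vendor's actual implementation; the file paths and the dict-based backup format are illustrative assumptions). It also shows why a gap in the baseline propagates: a file missing from the baseline, and never touched afterwards, is absent from every synthesized full.

```python
def synthesize_full(baseline: dict, incrementals: list) -> dict:
    """Build a synthetic full from a baseline full plus incrementals.

    Each backup maps file path -> file contents; an incremental carries
    only the files changed since the previous backup. Later backups win.
    (Toy model: deletions are not tracked, unlike real products.)
    """
    full = dict(baseline)
    for inc in incrementals:  # oldest first, newest last
        full.update(inc)
    return full

# Week 1 full -- note /home/report.doc was skipped (backup failure)
baseline = {"/etc/hosts": b"v1", "/var/app.db": b"v1"}
inc_mon = {"/var/app.db": b"v2"}   # Monday: database changed
inc_tue = {"/etc/hosts": b"v2"}    # Tuesday: hosts file changed

synthetic = synthesize_full(baseline, [inc_mon, inc_tue])
# /home/report.doc never changed, so no incremental carries it:
# it is missing from this and every later synthetic full.
```

This is exactly the failure mode described above: with 100+ servers that never all complete cleanly, an unreliable baseline silently poisons the whole synthetic chain.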
ASKER
A good point. We use Backup Exec, but that has dedupe too. Currently the Quantum DXi 6701 we have does our dedupe, but I shall investigate whether the BE 2010 agent can work with it using OST.
ASKER
It looks like the best we can achieve is for the media server to dedupe before it sends the data to our disk-based device, using Symantec's solutions. However, our bottleneck is at the source servers, before the data gets to the media server. I think it was a Data Domain product that had an agent that knew what had been backed up previously and only sent what was needed. Pretty expensive, though. I shall investigate more of the options for our VMware environment that will pick up the speed and see how that helps.
ASKER CERTIFIED SOLUTION
ASKER
While looking into another problem, we stumbled across the client-side dedupe option, which seems to be the only option for us with our software setup.
http://www.symantec.com/business/support/index?page=content&id=TECH127071
They have some sort of client-side deduplication, which will most probably reduce the on-network traffic.
I haven't tested this feature, but the other features NetBackup has delivered so far do work.