EICT (United Kingdom) asked:

Backing up to AWS using Backup Exec 2014 - Do I really need to have 315 GB of virtual storage?

Hi,
We currently have about 350 GB of data stored on a physical server, which we back up in full each week to a local HDD using Backup Exec 2014. Differential backups are performed during the week.

For disaster recovery purposes I would like to store a copy of our data in the cloud each week using Amazon Web Services (AWS). Glacier looks like the best solution, especially as we do not require frequent recovery access.

Going through the AWS instructions, it looks like I'm going to have to:
1. Create a virtual environment with a VM.
2. Have a 150 GB virtual disk for an upload buffer.
3. Have a 165 GB virtual disk for cache storage (steps 2 and 3 are sketched in code after this question).

Given that I have only 350 GB of data, do I really have to provide 315 GB of virtual storage just for the upload-to-AWS process?

Is this setup using a Virtual Tape Library (VTL) really my only option?
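
For context, here is roughly what attaching those two disks amounts to if scripted with boto3 rather than done through the console wizard. This is a minimal sketch, assuming an already-activated gateway; the region, gateway ARN and disk IDs are placeholders, not our real values.

import boto3

sgw = boto3.client("storagegateway", region_name="eu-west-1")  # placeholder region
GATEWAY_ARN = "arn:aws:storagegateway:eu-west-1:123456789012:gateway/sgw-EXAMPLE"  # placeholder

# Discover the virtual disks the gateway VM can see.
for disk in sgw.list_local_disks(GatewayARN=GATEWAY_ARN)["Disks"]:
    print(disk["DiskId"], disk["DiskSizeInBytes"], disk["DiskAllocationType"])

# Dedicate one disk to the upload buffer and another to cache storage.
sgw.add_upload_buffer(GatewayARN=GATEWAY_ARN, DiskIds=["disk-id-of-150GB-volume"])
sgw.add_cache(GatewayARN=GATEWAY_ARN, DiskIds=["disk-id-of-165GB-volume"])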
ASKER CERTIFIED SOLUTION
Ben Hart (United States)

(The accepted solution text is available to Experts Exchange members only.)
EICT (ASKER):

Thanks Ben. I shall try it out tomorrow and let you know how I get on.
Stuart Scott:
Hi,

You could also use the CloudBerry Lab suite of products (www.cloudberrylab.com), which lets you link to your Glacier account via a GUI and simply drag and drop your data.

I wrote an article on the CloudBerry Lab product here, if you want to take a look at it:

https://www.experts-exchange.com/articles/20139/Review-of-Cloudberry-Explorer-Pro-Linked-with-AWS-Amazon-Web-Services-S3.html
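
If you would rather script the upload than use a GUI, it is only a few lines with boto3. A minimal sketch; the region, vault name and file path below are placeholders, not anything from your environment:

import boto3

glacier = boto3.client("glacier", region_name="eu-west-1")  # placeholder region

# CreateVault is idempotent, so this is safe if the vault already exists.
glacier.create_vault(vaultName="weekly-backups")  # placeholder vault name

with open(r"D:\Backups\weekly-full.bkf", "rb") as f:  # placeholder path
    resp = glacier.upload_archive(
        vaultName="weekly-backups",
        archiveDescription="weekly full backup",
        body=f,
    )
print("Archive ID:", resp["archiveId"])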

Cheers,

Stu...
EICT (ASKER):

FastGlacier has done the trick and is working well with AWS Glacier, using a script to automate the process.
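
For anyone following this later: our script uses FastGlacier's own tooling, but the weekly automation amounts to something like the boto3 sketch below (the region, vault name, backup folder and log file are placeholders, not our real setup). One detail worth keeping: Glacier retrieves by archive ID rather than filename, so record the ID returned for each upload.

import csv
import datetime
import pathlib
import boto3

glacier = boto3.client("glacier", region_name="eu-west-1")  # placeholder region
BACKUP_DIR = pathlib.Path(r"D:\Backups")                    # placeholder folder
VAULT = "weekly-backups"                                    # placeholder vault

# Append one row per uploaded file so archives can be found again later.
with open(BACKUP_DIR / "glacier-archives.csv", "a", newline="") as log:
    writer = csv.writer(log)
    for bkf in sorted(BACKUP_DIR.glob("*.bkf")):
        with open(bkf, "rb") as f:
            resp = glacier.upload_archive(
                vaultName=VAULT,
                archiveDescription=bkf.name,
                body=f,
            )
        # Glacier retrieval needs the archive ID, so log it per file.
        writer.writerow([datetime.date.today().isoformat(), bkf.name, resp["archiveId"]])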