Backing up to AWS using Backup Exec 2014 - Do I really need to have 315 GB of Virtual Storage?

We currently have about 350 GB of data stored on a physical server, which we back up in full each week using Backup Exec 2014 to a local HDD. Differential backups are performed during the week.

For disaster recovery purposes, I would like to store a copy of our data in the cloud each week using Amazon Web Services. Glacier looks like the best fit, especially as we do not require frequent recovery access.

Going through the AWS instructions, it looks like I'm going to have to:
1. Create a virtual environment with a VM.
2. Provision a 150 GB virtual disk as an upload buffer.
3. Provision a 165 GB virtual disk as cache storage.

Given that I have only 350 GB of data, do I really have to provide 315 GB of virtual storage just for the upload-to-AWS process?
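To put the question's numbers in one place, here is a quick sanity check (the disk sizes are the ones from the AWS instructions above; the ratio is just arithmetic):

```python
# Local virtual-disk overhead for the Storage Gateway VTL setup,
# using the figures from the AWS instructions quoted above.
DATA_GB = 350            # weekly full backup
UPLOAD_BUFFER_GB = 150   # virtual disk staging data in flight to AWS
CACHE_GB = 165           # virtual disk holding recently used tape data

overhead_gb = UPLOAD_BUFFER_GB + CACHE_GB
print(overhead_gb)                      # 315
print(round(overhead_gb / DATA_GB, 2))  # 0.9 -- nearly the dataset itself
```

So the gateway's working disks come to roughly 0.9x the size of the dataset being protected, which is what prompts the question.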

Is this setup using a Virtual Tape Library (VTL) really my only option?

I used a product called FastGlacier.

It's multi-threaded, and I liked it a lot. Of course, that was in 2014; I haven't used it since we migrated to Commvault, but back then it performed very, very well. And it was cheap, too.


Thanks Ben. I shall try it out tomorrow and let you know how I get on.
Stuart Scott, AWS Content Lead at Cloud Academy
Most Valuable Expert 2015
Top Expert 2015


You could also use the CloudBerry Lab suite of products, which allows you to link to your Glacier account via a GUI and simply drag and drop your data.

I wrote an article on the CloudBerry Lab product here if you want to take a preview look at it:




FastGlacier has done the trick and is working well with AWS Glacier, using a script to automate the process.
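For anyone scripting uploads directly against Glacier rather than through a GUI tool: large archives go up via multipart upload, and the Glacier API requires the part size to be 1 MiB multiplied by a power of two (up to 4 GiB), with at most 10,000 parts per archive. A minimal sketch of picking a valid part size (the function name is my own; the limits are from the Glacier API documentation):

```python
MIB = 1024 * 1024

def glacier_part_size(archive_bytes):
    """Smallest valid Glacier multipart part size for an archive.

    Glacier requires a power-of-two part size between 1 MiB and
    4 GiB, with no more than 10,000 parts per archive.
    """
    size = MIB
    while archive_bytes > size * 10_000:
        size *= 2
    if size > 4096 * MIB:
        raise ValueError("archive exceeds Glacier's multipart limits")
    return size

# A 350 GB weekly full backup fits in 5,600 parts of 64 MiB each.
print(glacier_part_size(350 * 1024**3) // MIB)  # 64
```

If you script this with boto3, the chosen size is what the `glacier` client's `initiate_multipart_upload` call expects in its `partSize` argument.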
