Do I have to back up my S3 storage?

Do I have to back up my S3 storage, or is it geo-replicated by default in some way?
tike55 asked:

David Johnson, CD, MVP (Owner) commented:
You should back up somewhere else, because replication is not backup: if you delete or modify a file, the deletion or change is propagated to all replicas.
Omar Soudani (Senior System Engineer) commented:
You have to create a new bucket for cross-region replication, then enable replication on the main bucket you want to replicate.

Note: cross-region replication requires bucket versioning; it must be enabled on both the source and the destination bucket.

Only new or changed objects are replicated automatically. You need to use the command line to copy objects that already existed in the main bucket.
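The setup above can be sketched with boto3. This is a minimal illustration, not a complete recipe: the bucket names, account ID, and IAM role ARN are placeholders, and the actual API call is shown commented out since it needs credentials and an existing replication role.

```python
import json

# Sketch of a cross-region replication configuration, in the shape
# accepted by boto3's s3.put_bucket_replication. All names and ARNs
# below are placeholders.
replication_config = {
    "Role": "arn:aws:iam::123456789012:role/s3-replication-role",
    "Rules": [
        {
            "ID": "replicate-everything",
            "Status": "Enabled",
            "Priority": 1,
            "Filter": {"Prefix": ""},
            "DeleteMarkerReplication": {"Status": "Disabled"},
            "Destination": {
                "Bucket": "arn:aws:s3:::my-replica-bucket",
                "StorageClass": "STANDARD",
            },
        }
    ],
}

# With credentials configured, you would apply it roughly like this:
# import boto3
# s3 = boto3.client("s3")
# s3.put_bucket_replication(
#     Bucket="my-main-bucket",
#     ReplicationConfiguration=replication_config,
# )

print(json.dumps(replication_config, indent=2))
```

For the objects that existed before replication was enabled, a one-off copy such as `aws s3 sync s3://my-main-bucket s3://my-replica-bucket` (bucket names again placeholders) covers the gap.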
nobus commented:
It never hurts to be safe, so please make a backup before you end up posting a question about lost data.
David Favor (Linux/LXD/WordPress/Hosting Savant) commented:
Read David Johnson's comment closely.

S3 data is replicated by Amazon, meaning there are multiple copies: while your data is in S3, you have copies of it.

If you delete some S3 data, then the deletion is also replicated. At this point all copies of your data disappear.

So, best to have an archive policy to keep permanent copies of important files stored somewhere forever.

This is very easy to do locally these days. Raw 14TB disk drives are cheap, quiet, and fairly cool, so you can run your own local archive (deep freeze for files) at home or at your office.

noci (Software Engineer) commented:
Isn't the backup for S3 called Glacier?
David Johnson, CD, MVP (Owner) commented:
S3 is near-real-time storage; Glacier is cold storage. S3 doesn't automatically back up to Glacier. Glacier is a lot cheaper than S3, but you may have to wait around 5 hours to retrieve anything. Anything that doesn't need near-real-time access I send directly to Glacier. I use CloudBerry Explorer, and anything older than 6 months I move to Glacier.
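The "move to Glacier after six months" policy described above can also be automated with an S3 lifecycle rule rather than a desktop tool. A minimal sketch, in the shape accepted by boto3's `s3.put_bucket_lifecycle_configuration` (the bucket name is a placeholder, and the API call is commented out since it needs credentials):

```python
# Sketch: lifecycle rule that transitions all objects to Glacier after
# roughly six months (180 days). Names are placeholders.
lifecycle_config = {
    "Rules": [
        {
            "ID": "archive-after-six-months",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},
            "Transitions": [
                {"Days": 180, "StorageClass": "GLACIER"},
            ],
        }
    ],
}

# With credentials configured:
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-main-bucket",
#     LifecycleConfiguration=lifecycle_config,
# )
```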
noci (Software Engineer) commented:
I don't expect Glacier to automatically start anything... Amazon needs at least some involvement to get their invoices for the service paid. The customer must at least agree to allow payments to be charged.

And backup is normally a cold version of current data; getting a tape from a vault and restoring part of the data is not expected to be instantaneous (at least I don't expect that from backup).
Shalom Carmel (CTO) commented:
IMHO, getting cloud storage and then using on-premise hardware for backup defeats the purpose of going to the cloud in the first place.

When talking about S3 backup, you have several risks to address.

* Accidental deletion of storage objects
The solution is to enable versioning on the S3 bucket. Versioning prevents loss of data due to accidental deletion; the AWS documentation explains how to enable it.

* Technical service disruption (S3 in us-east-1 gets nuked in a terrorist attack).
The solution is to define cross-region bucket replication, which includes versioning by definition. It is best to replicate to a different account, with a different payer; see the next point to understand why. Consider also adding lifecycle rules to move older data to Glacier.

* Vendor business disruption (AWS goes bankrupt, or closes your account access due to a business dispute).
This is tricky, but easily doable via a Lambda function that listens to S3 events. I have such a setup that automatically copies S3 objects to the Alibaba OSS storage.
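A Lambda function of the kind described in the last point can be sketched as follows. This is an illustrative skeleton only: `copy_to_external()` is a hypothetical placeholder for whatever third-party SDK you would actually use (Alibaba OSS in the comment above), and the sample event shows only the fields the sketch reads.

```python
# Sketch of a Lambda handler that reacts to S3 object-created events
# and forwards each object to an external store.
def extract_objects(event):
    """Return (bucket, key) pairs from an S3 event notification."""
    return [
        (rec["s3"]["bucket"]["name"], rec["s3"]["object"]["key"])
        for rec in event.get("Records", [])
    ]

def handler(event, context):
    for bucket, key in extract_objects(event):
        # copy_to_external(bucket, key)  # placeholder: push to off-AWS storage
        print(f"would replicate s3://{bucket}/{key}")

# Minimal event shape for local testing (real S3 events carry many
# more fields):
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "my-main-bucket"},
                "object": {"key": "reports/q1.csv"}}}
    ]
}
```

Wiring this up requires configuring an S3 event notification on the bucket that invokes the function, plus credentials for the external store; those parts are provider-specific and omitted here.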