Solved

AWS S3 File - Local File Transfer

Posted on 2016-09-15
Medium Priority
809 Views
Last Modified: 2016-09-27
Hi, I'm new to AWS but have some rudimentary skills.  I need to transfer local files on a server to our S3 bucket in our AWS environment.  I assume I can use either AWS Tools for Windows PowerShell or the high-level s3 commands in the AWS Command Line Interface.  If either of these tools would work, I would need the syntax to automate these data transfers on a daily basis.

Any help would be great.

Thank you.
Question by:ITADUMA
7 Comments
 
LVL 5

Expert Comment

by:Laroy Shtotland
ID: 41800469
Uploading a file from EC2 to S3 is easy:
$ aws s3 cp newfile.txt s3://testbucket/
https://aws.amazon.com/cli/
Here is a simple python script example https://code.google.com/archive/p/s3afe/
Also take a look at http://s3tools.org/s3cmd
But you'd better start with the best practice of using an IAM role for your EC2 instance instead of distributing access credentials.
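If the source machine can't use an instance role and you end up with access keys, the one-time CLI setup is just the following (the key values and region below are placeholders from the AWS docs, not real credentials):
$ aws configure
AWS Access Key ID [None]: AKIAIOSFODNN7EXAMPLE
AWS Secret Access Key [None]: wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
Default region name [None]: us-east-1
Default output format [None]: json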
 

Author Comment

by:ITADUMA
ID: 41800515
Thank you for the reply.  Just to be clear, the files I need to get to our S3 bucket are on a server at our HQ location; they are not already in AWS.
 
LVL 5

Expert Comment

by:Laroy Shtotland
ID: 41800783
Sorry, no EC2 roles then, just regular user credentials.

The AWS CLI has an aws s3 sync command that will copy directories and sub-directories to/from Amazon S3. You can also nominate which file types are included/excluded. It only copies files that are new or have been modified since the previous sync.

As an alternative to aws-cli you might consider using https://github.com/minio/mc
It implements an mc mirror command to recursively sync files and directories to multiple destinations in parallel, and features a cool progress bar and session management for resumable copy/mirror operations.

Also you may take a look at:
http://www.bucketexplorer.com/documentation/amazon-s3--schedule-commander-on-windows-os.html
http://www.cloudberrylab.com/subdomains/backup/freeware/
and cyberduck.io
- maybe you'll find them helpful.
Sorry, I don't know your server's OS and requirements.
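Whatever the OS, a rough sketch of the CLI route (the bucket name, paths, and the "s3" alias below are made up, and mc subcommand names may differ between versions - check mc --help):
// AWS CLI: push new/changed files from a local folder to the bucket
$ aws s3 sync /data/exports s3://yourbucket/exports
// minio client: register the S3 endpoint once, then mirror the folder
$ mc config host add s3 https://s3.amazonaws.com ACCESSKEY SECRETKEY
$ mc mirror /data/exports s3/yourbucket/exports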
 
LVL 33

Expert Comment

by:shalomc
ID: 41801555
If this is a one time operation, then consider using the AWS Import/Export function, where you can ship a physical disk to AWS and have its contents uploaded to S3.

This is especially true if the transfer size is several GB or larger.
 
LVL 5

Expert Comment

by:Laroy Shtotland
ID: 41801625
> If this is a one time operation, then consider using the AWS Import/Export function, where you can ship a physical disk to AWS and have its contents uploaded to S3.
> This is especially true if the transfer size is several GB or larger.

IMHO, AWS Snowball is worth the money only if the transfer size is several TB, you need to upload it ASAP, and your backbone is some poor asymmetric ADSL.
 

Author Comment

by:ITADUMA
ID: 41801883
Thanks guys, shipping a drive is not an option at this point.

@Laroy we're using Windows Server 2008 and 2012 for these file transfers.  Would you be able to provide the syntax for the aws s3 sync command that copies directories and sub-directories to/from Amazon S3?

Thank you.
 
LVL 5

Accepted Solution

by:Laroy Shtotland earned 2000 total points
ID: 41801936
The sync command has the following form:
$ aws s3 sync <source> <target> [--options]
Possible source-target combinations are:
Local file system to Amazon S3
Amazon S3 to local file system
Amazon S3 to Amazon S3
The following example synchronizes the contents of an Amazon S3 folder named path in my-bucket with the current working directory. s3 sync updates any files that have a different size or modified time than files with the same name at the destination. The output displays specific operations performed during the sync. Notice that the operation recursively synchronizes the subdirectory MySubdirectory and its contents with s3://my-bucket/path/MySubdirectory.

$ aws s3 sync . s3://my-bucket/path
upload: MySubdirectory\MyFile3.txt to s3://my-bucket/path/MySubdirectory/MyFile3.txt
upload: MyFile2.txt to s3://my-bucket/path/MyFile2.txt
upload: MyFile1.txt to s3://my-bucket/path/MyFile1.txt
Normally, sync only copies missing or outdated files or objects between the source and target. However, you may supply the --delete option to remove files or objects from the target not present in the source.

The following example, which extends the previous one, shows how this works.

// Delete local file
$ rm ./MyFile1.txt

// Attempt sync without --delete option - nothing happens
$ aws s3 sync . s3://my-bucket/path

// Sync with deletion - object is deleted from bucket
$ aws s3 sync . s3://my-bucket/path --delete
delete: s3://my-bucket/path/MyFile1.txt

// Delete object from bucket
$ aws s3 rm s3://my-bucket/path/MySubdirectory/MyFile3.txt
delete: s3://my-bucket/path/MySubdirectory/MyFile3.txt

// Sync with deletion - local file is deleted
$ aws s3 sync s3://my-bucket/path . --delete
delete: MySubdirectory\MyFile3.txt

// Sync with Infrequent Access storage class
$ aws s3 sync . s3://my-bucket/path --storage-class STANDARD_IA
The --exclude and --include options allow you to specify rules to filter the files or objects to be copied during the sync operation. By default, all items in a specified directory are included in the sync. Therefore, --include is only needed when specifying exceptions to the --exclude option (for example, --include effectively means "don't exclude"). The options apply in the order that is specified, as demonstrated in the following example.

Local directory contains 3 files:
MyFile1.txt
MyFile2.rtf
MyFile88.txt
$ aws s3 sync . s3://my-bucket/path --exclude '*.txt'
upload: MyFile2.rtf to s3://my-bucket/path/MyFile2.rtf
$ aws s3 sync . s3://my-bucket/path --exclude '*.txt' --include 'MyFile*.txt'
upload: MyFile1.txt to s3://my-bucket/path/MyFile1.txt
upload: MyFile88.txt to s3://my-bucket/path/MyFile88.txt
upload: MyFile2.rtf to s3://my-bucket/path/MyFile2.rtf
$ aws s3 sync . s3://my-bucket/path --exclude '*.txt' --include 'MyFile*.txt' --exclude 'MyFile?.txt'
upload: MyFile2.rtf to s3://my-bucket/path/MyFile2.rtf
upload: MyFile88.txt to s3://my-bucket/path/MyFile88.txt
The --exclude and --include options can also filter files or objects to be deleted during a sync operation with the --delete option. In this case, the parameter string must specify files to be excluded from, or included for, deletion in the context of the target directory or bucket. The following shows an example.

Assume local directory and s3://my-bucket/path currently in sync and each contains 3 files:
MyFile1.txt
MyFile2.rtf
MyFile88.txt
// Delete local .txt files
$ rm *.txt

// Sync with delete, excluding files that match a pattern. MyFile88.txt is deleted, while remote MyFile1.txt is not.
$ aws s3 sync . s3://my-bucket/path --delete --exclude 'my-bucket/path/MyFile?.txt'
delete: s3://my-bucket/path/MyFile88.txt
// Delete MyFile2.rtf
$ aws s3 rm s3://my-bucket/path/MyFile2.rtf

// Sync with delete, excluding MyFile2.rtf - local file is NOT deleted
$ aws s3 sync s3://my-bucket/path . --delete --exclude './MyFile2.rtf'
download: s3://my-bucket/path/MyFile1.txt to MyFile1.txt
// Sync with delete, local copy of MyFile2.rtf is deleted
$ aws s3 sync s3://my-bucket/path . --delete
delete: MyFile2.rtf
The sync command also accepts an --acl option, by which you may set the access permissions for files copied to Amazon S3. The option accepts private, public-read, and public-read-write values.

$ aws s3 sync . s3://my-bucket/path --acl public-read
As previously mentioned, the s3 command set includes cp, mv, ls, and rm, and they work in similar ways to their Unix counterparts. The following are some examples.

// Copy MyFile.txt in current directory to s3://my-bucket/path
$ aws s3 cp MyFile.txt s3://my-bucket/path/

// Move all .jpg files in s3://my-bucket/path to ./MyDirectory
$ aws s3 mv s3://my-bucket/path ./MyDirectory --exclude '*' --include '*.jpg' --recursive

// List the contents of my-bucket
$ aws s3 ls s3://my-bucket

// List the contents of path in my-bucket
$ aws s3 ls s3://my-bucket/path

// Delete s3://my-bucket/path/MyFile.txt
$ aws s3 rm s3://my-bucket/path/MyFile.txt

// Delete s3://my-bucket/path and all of its contents
$ aws s3 rm s3://my-bucket/path --recursive
When the --recursive option is used on a directory/folder with cp, mv, or rm, the command walks the directory tree, including all subdirectories. These commands also accept the --exclude, --include, and --acl options as the sync command does.

https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
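For the daily automation piece on your Windows Server 2008/2012 boxes, one common approach is to wrap the sync call in a scheduled task. A rough sketch (the task name, folder, bucket, and start time below are made up; run from an elevated prompt, and the AWS CLI must be installed and configured with credentials for the account the task runs as):

// Create a task that syncs C:\Data to the bucket every day at 02:00
> schtasks /create /tn "DailyS3Sync" /sc daily /st 02:00 /tr "aws s3 sync C:\Data s3://my-bucket/path"

If you'd rather use the AWS Tools for Windows PowerShell you mentioned, Write-S3Object can upload a folder tree in one call (again, the names are placeholders):

// Upload C:\Data and its subfolders under the "path" key prefix
PS> Write-S3Object -BucketName my-bucket -KeyPrefix path -Folder C:\Data -Recurse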