AWS S3 - Local File Transfer

Hi, I'm new to AWS but have some rudimentary skills.  I need to transfer local files from a server to our S3 bucket in our AWS environment.  I assume I can use either the AWS Tools for Windows PowerShell or the high-level s3 commands in the AWS Command Line Interface.  If either of these tools would work, I would need the syntax to automate these data transfers on a daily basis.

Any help would be great.

Thank you.
Asked by ITADUMA

1 Solution
Laroy Shtotland, IT Security Consultant, commented:
Uploading a file from EC2 to S3 is easy:
$ aws s3 cp newfile.txt s3://testbucket/
https://aws.amazon.com/cli/
Here is a simple Python script example: https://code.google.com/archive/p/s3afe/
Also take a look at http://s3tools.org/s3cmd
But you'd better start with the best practice of using an IAM role for your EC2 instance instead of distributing access credentials.
 
ITADUMA (Author) commented:
Thank you for the reply.  Just to be clear, the files I need to get into our S3 bucket are on a server at our HQ location; the files are not already in AWS.
 
Laroy Shtotland, IT Security Consultant, commented:
Sorry, no EC2 roles then, just regular user credentials.
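
For example, on the HQ server you could configure the credentials of an IAM user that has write access to the bucket. A minimal sketch (the profile name, key values, region, and bucket are placeholders):

$ aws configure --profile s3-upload
AWS Access Key ID [None]: AKIA................
AWS Secret Access Key [None]: ....................
Default region name [None]: us-east-1
Default output format [None]: json

// Quick check that the credentials can reach the bucket
$ aws s3 ls s3://testbucket/ --profile s3-upload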

The AWS CLI has an aws s3 sync command that will copy directories and sub-directories to/from Amazon S3. You can also specify which file types are included or excluded. It only copies files that are new or have been modified since the previous sync.

As an alternative to the AWS CLI you might consider https://github.com/minio/mc
Its mc mirror command recursively syncs files and directories to multiple destinations in parallel, and it features a progress bar and session management for resumable copy/mirror operations.
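
A minimal sketch of that workflow (the alias name, access keys, bucket, and local path below are placeholders, and the alias command may differ between mc releases):

// Register the S3 endpoint under an alias
$ mc alias set s3 https://s3.amazonaws.com ACCESS_KEY SECRET_KEY

// Mirror a local directory into the bucket
$ mc mirror ./localdir s3/testbucket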

Also you may take a look at:
http://www.bucketexplorer.com/documentation/amazon-s3--schedule-commander-on-windows-os.html
http://www.cloudberrylab.com/subdomains/backup/freeware/
and cyberduck.io
- maybe you'll find one of them helpful.
Sorry, I don't know your server's OS and requirements.

 
shalomc, CTO, commented:
If this is a one-time operation, then consider using the AWS Import/Export service, where you can ship a physical disk to AWS and have its contents uploaded to S3.

This is especially true if the transfer size is several GB or larger.
 
Laroy Shtotland, IT Security Consultant, commented:
If this is a one time operation, then consider using the AWS Import/Export function, where you can ship a physical disk to AWS and have its contents uploaded to S3.
This is especially true if the transfer size is several GB or larger.

IMHO, AWS Snowball is worth the money only if the transfer size is several TB, you need to upload it ASAP, and your uplink is a poor asymmetric ADSL line.
 
ITADUMA (Author) commented:
Thanks guys, shipping the drive is not an option at this point.

@Laroy, we're using Windows Server 2008 and 2012 for these file transfers.  Would you be able to provide the syntax for the aws s3 sync command that will copy directories and sub-directories to/from Amazon S3?

Thank you.
 
Laroy Shtotland, IT Security Consultant, commented:
The sync command has the following form:

$ aws s3 sync <source> <target> [--options]

Possible source-target combinations are:
Local file system to Amazon S3
Amazon S3 to local file system
Amazon S3 to Amazon S3
The following example synchronizes the contents of an Amazon S3 folder named path in my-bucket with the current working directory. s3 sync updates any files that have a different size or modified time than files with the same name at the destination. The output displays specific operations performed during the sync. Notice that the operation recursively synchronizes the subdirectory MySubdirectory and its contents with s3://my-bucket/path/MySubdirectory.

$ aws s3 sync . s3://my-bucket/path
upload: MySubdirectory\MyFile3.txt to s3://my-bucket/path/MySubdirectory/MyFile3.txt
upload: MyFile2.txt to s3://my-bucket/path/MyFile2.txt
upload: MyFile1.txt to s3://my-bucket/path/MyFile1.txt
Normally, sync only copies missing or outdated files or objects between the source and target. However, you may supply the --delete option to remove files or objects from the target not present in the source.

The following example, which extends the previous one, shows how this works.

// Delete local file
$ rm ./MyFile1.txt

// Attempt sync without --delete option - nothing happens
$ aws s3 sync . s3://my-bucket/path

// Sync with deletion - object is deleted from bucket
$ aws s3 sync . s3://my-bucket/path --delete
delete: s3://my-bucket/path/MyFile1.txt

// Delete object from bucket
$ aws s3 rm s3://my-bucket/path/MySubdirectory/MyFile3.txt
delete: s3://my-bucket/path/MySubdirectory/MyFile3.txt

// Sync with deletion - local file is deleted
$ aws s3 sync s3://my-bucket/path . --delete
delete: MySubdirectory\MyFile3.txt

// Sync with Infrequent Access storage class
$ aws s3 sync . s3://my-bucket/path --storage-class STANDARD_IA
The --exclude and --include options allow you to specify rules to filter the files or objects to be copied during the sync operation. By default, all items in a specified directory are included in the sync. Therefore, --include is only needed when specifying exceptions to the --exclude option (for example, --include effectively means "don't exclude"). The options apply in the order that is specified, as demonstrated in the following example.

Local directory contains 3 files:
MyFile1.txt
MyFile2.rtf
MyFile88.txt
$ aws s3 sync . s3://my-bucket/path --exclude '*.txt'
upload: MyFile2.rtf to s3://my-bucket/path/MyFile2.rtf

$ aws s3 sync . s3://my-bucket/path --exclude '*.txt' --include 'MyFile*.txt'
upload: MyFile1.txt to s3://my-bucket/path/MyFile1.txt
upload: MyFile88.txt to s3://my-bucket/path/MyFile88.txt
upload: MyFile2.rtf to s3://my-bucket/path/MyFile2.rtf

$ aws s3 sync . s3://my-bucket/path --exclude '*.txt' --include 'MyFile*.txt' --exclude 'MyFile?.txt'
upload: MyFile2.rtf to s3://my-bucket/path/MyFile2.rtf
upload: MyFile88.txt to s3://my-bucket/path/MyFile88.txt
The --exclude and --include options can also filter files or objects to be deleted during a sync operation with the --delete option. In this case, the parameter string must specify files to be excluded from, or included for, deletion in the context of the target directory or bucket. The following shows an example.

Assume the local directory and s3://my-bucket/path are currently in sync and each contains 3 files:
MyFile1.txt
MyFile2.rtf
MyFile88.txt
// Delete local .txt files
$ rm *.txt

// Sync with delete, excluding files that match a pattern. MyFile88.txt is deleted, while remote MyFile1.txt is not.
$ aws s3 sync . s3://my-bucket/path --delete --exclude 'my-bucket/path/MyFile?.txt'
delete: s3://my-bucket/path/MyFile88.txt

// Delete MyFile2.rtf
$ aws s3 rm s3://my-bucket/path/MyFile2.rtf

// Sync with delete, excluding MyFile2.rtf - local file is NOT deleted
$ aws s3 sync s3://my-bucket/path . --delete --exclude './MyFile2.rtf'
download: s3://my-bucket/path/MyFile1.txt to MyFile1.txt

// Sync with delete, local copy of MyFile2.rtf is deleted
$ aws s3 sync s3://my-bucket/path . --delete
delete: MyFile2.rtf
The sync command also accepts an --acl option, by which you may set the access permissions for files copied to Amazon S3. The option accepts private, public-read, and public-read-write values.

$ aws s3 sync . s3://my-bucket/path --acl public-read
As previously mentioned, the s3 command set includes cp, mv, ls, and rm, and they work in similar ways to their Unix counterparts. The following are some examples.

// Copy MyFile.txt in current directory to s3://my-bucket/path
$ aws s3 cp MyFile.txt s3://my-bucket/path/

// Move all .jpg files in s3://my-bucket/path to ./MyDirectory
$ aws s3 mv s3://my-bucket/path ./MyDirectory --exclude '*' --include '*.jpg' --recursive

// List the contents of my-bucket
$ aws s3 ls s3://my-bucket

// List the contents of path in my-bucket
$ aws s3 ls s3://my-bucket/path

// Delete s3://my-bucket/path/MyFile.txt
$ aws s3 rm s3://my-bucket/path/MyFile.txt

// Delete s3://my-bucket/path and all of its contents
$ aws s3 rm s3://my-bucket/path --recursive
When the --recursive option is used on a directory/folder with cp, mv, or rm, the command walks the directory tree, including all subdirectories. These commands also accept the --exclude, --include, and --acl options as the sync command does.

https://docs.aws.amazon.com/cli/latest/reference/s3/sync.html
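
Since the goal is a daily automated transfer from Windows Server 2008/2012, one simple approach (a sketch only; the script path, source folder, bucket, and schedule are placeholders, and it assumes the AWS CLI is installed and credentials are already configured on that server) is to wrap the sync in a small batch file and register it with Task Scheduler:

:: C:\scripts\s3sync.bat
aws s3 sync "D:\Data" s3://my-bucket/path

:: Register a task that runs the batch file every day at 02:00
schtasks /create /tn "DailyS3Sync" /tr "C:\scripts\s3sync.bat" /sc daily /st 02:00

The same sync command could also be called from a PowerShell script or swapped for the AWS Tools for Windows PowerShell cmdlets; the batch wrapper is just the most minimal option.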