Hi, I'm new to AWS but have some rudimentary skills. I need to transfer local files on a server to our S3 bucket in the AWS environment. I assume I can use either AWS Tools for Windows PowerShell or the high-level S3 commands in the AWS Command Line Interface. If either of these tools would work, I would need the syntax to automate this data transfer on a daily basis.
Any help would be great.
Thank you.
Laroy Shtotland
Uploading a file from EC2 to S3 is easy:
$ aws s3 cp newfile.txt s3://testbucket/
https://aws.amazon.com/cli/
Here is a simple Python script example: https://code.google.com/archive/p/s3afe/
Also take a look at http://s3tools.org/s3cmd
But you'd better start with the best practice of using an IAM role for your EC2 instance instead of distributing access credentials.
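Since you also mentioned AWS Tools for Windows PowerShell, here is a rough sketch of the same upload with the Write-S3Object cmdlet. Bucket name, paths, and keys are placeholders, and this assumes the AWSPowerShell module is installed:

Import-Module AWSPowerShell
# One-time: store credentials for the module (keys are placeholders)
Set-AWSCredential -AccessKey AKIA_PLACEHOLDER -SecretKey SECRET_PLACEHOLDER -StoreAs default
# Upload a single file
Write-S3Object -BucketName testbucket -File C:\data\newfile.txt -Key newfile.txt
# Upload a whole folder recursively
Write-S3Object -BucketName testbucket -Folder C:\data -KeyPrefix data/ -Recurse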
ITADUMA
ASKER
Thank you for the reply. Just to be clear, the files I need to get to our S3 bucket are on a server at our HQ location; they are not already in AWS.
Laroy Shtotland
Sorry, no EC2 roles then, just regular user credentials.
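A minimal sketch of setting those up for the CLI (the profile name is a placeholder):

aws configure --profile hq-sync
# prompts for the Access Key ID, Secret Access Key, default region, and output format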
The AWS CLI has an aws s3 sync command that will copy directories and sub-directories to/from Amazon S3. You can also nominate which file types are included/excluded. It only copies files that are new or have been modified since the previous sync.
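For example, a sketch of syncing one folder while filtering file types (local path and bucket are placeholders):

aws s3 sync C:\data s3://testbucket/data --exclude "*" --include "*.csv"

The filters are evaluated in order, so excluding everything and then including *.csv uploads only the CSV files; repeated runs copy only new or changed files.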
As an alternative to aws-cli you might consider using https://github.com/minio/mc
It implements an mc mirror command to recursively sync files and directories to multiple destinations in parallel, and features a progress bar and session management for resumable copy/mirror operations.
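A rough sketch of the mc equivalent (alias name, keys, and paths are placeholders, assuming a recent mc build; older builds used mc config host add instead of mc alias set):

mc alias set s3 https://s3.amazonaws.com ACCESS_KEY_PLACEHOLDER SECRET_KEY_PLACEHOLDER
mc mirror C:\data s3/testbucket/data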
If this is a one time operation, then consider using the AWS Import/Export function, where you can ship a physical disk to AWS and have its contents uploaded to S3.
This is especially true if the transfer size is several GB or larger.
Laroy Shtotland
IMHO, AWS Snowball is only worth the money if the transfer size is several TB, you need it uploaded ASAP, and your backbone is some poor asymmetric ADSL.
ITADUMA
ASKER
Thanks guys, shipping the drive is not an option at this point.
@Laroy, we're using Windows Server 2008 and 2012 for these file transfers. Would you be able to provide the syntax for the aws s3 sync command that copies directories and sub-directories to/from Amazon S3?
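For reference, a sketch of automating the daily sync on Windows Server with Task Scheduler. The paths, bucket, profile name, and schedule are all placeholders, and this assumes the AWS CLI is installed and credentials are configured as above:

rem sync-to-s3.bat -- runs the daily upload; paths and bucket are placeholders
aws s3 sync "C:\data" "s3://testbucket/data" --profile hq-sync

rem Register the batch file as a daily task running at 2:00 AM:
schtasks /Create /SC DAILY /ST 02:00 /TN "S3DailySync" /TR "C:\scripts\sync-to-s3.bat"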