
Import files to S3 bucket from the server

RadhaKrishnaKiJaya asked:
Hi Experts,
I am completely new to AWS. I have been assigned a task to transfer the files in folder 'x' on the server to an S3 bucket, because the server space is filling up very fast. We need an automated PowerShell script, or something else, that moves the files every day automatically and deletes them from the server.

Any help would be greatly appreciated!

AWS has example documentation about this, using Powershell scripts:
https://docs.aws.amazon.com/powershell/latest/userguide/pstools-s3-upload-object.html
 
It doesn't completely fit your needs, but it should point you in the right direction.
 
Hoping this helps,
Cheers!
Prabhin MP, DevOps Engineer
Distinguished Expert 2018

Commented:
First, you need to install the AWS CLI and configure your access key and secret key, which you can get from the IAM page: go to the user and download the key pair from there.


Use the command for your distro:
apt-get install awscli      # Debian/Ubuntu
yum install awscli          # RHEL/CentOS

After installing, run the following command to configure the AWS CLI:

aws configure

You will now be asked to enter the ‘AWS Access Key ID’, then the ‘AWS Secret Access Key’, and lastly the ‘Default Region Name’. All of this information can be obtained from the AWS dashboard.
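If you would rather not answer the interactive prompts (for example, when provisioning the server from a script), the same values can be set non-interactively with `aws configure set`. The key values and region below are placeholders; substitute the pair you downloaded from the IAM console:

```shell
# Non-interactive equivalent of `aws configure`.
# Placeholder values -- replace with your own IAM access key pair.
aws configure set aws_access_key_id     AKIAEXAMPLEKEYID
aws configure set aws_secret_access_key exampleSecretAccessKey
aws configure set region                us-east-1
```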

Now create a bucket in S3.
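The bucket can be created from the CLI as well. The name below is a placeholder; S3 bucket names must be globally unique across all AWS accounts, so pick your own:

```shell
# Create the destination bucket ("my-backup-bucket" is a placeholder).
aws s3 mb s3://my-backup-bucket
```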


Then create the backup script:

vi backup.sh

#!/bin/bash
# Copy the source folder into a dated local folder, upload it to S3,
# then remove the originals and the local copy from two days ago.
nowdate=$(date +"%Y-%m-%d")
dayago=$(date --date='2 days ago' +"%Y-%m-%d")

mkdir -p backup
cd backup && mkdir -p "folder-$nowdate"
cp -r /path/to/takebackup "folder-$nowdate"
aws s3 cp "folder-$nowdate" "s3://bucketname/folder-$nowdate" --recursive
rm -rf /path/to/takebackup
rm -rf "folder-$dayago"
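Since the requirement is to move the files and delete them from the server every day, an alternative sketch uses `aws s3 mv`, which deletes each local file only after its upload succeeds, so a failed transfer leaves the data intact. The function and paths below are illustrative assumptions; replace the folder and bucket name with your own:

```shell
# upload_and_purge: move everything under a local folder to a dated
# prefix in an S3 bucket.  `aws s3 mv --recursive` removes each local
# file only after its upload succeeds.
upload_and_purge() {
    local src="$1" bucket="$2" stamp
    stamp=$(date +"%Y-%m-%d")
    aws s3 mv "$src" "$bucket/$stamp/" --recursive
}

# Example call (assumed paths -- replace with your own):
#   upload_and_purge /path/to/x s3://my-backup-bucket
```

To run it every day automatically, call the function from a script scheduled in cron, e.g. a crontab entry like `0 2 * * * /usr/local/bin/daily-s3-move.sh` for a nightly 2 a.m. run.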

Author

Commented:
Thank you!