mikha
asked on
Reading from S3 in a Python program
I have a Python-based program which I plan to put in a Docker container and deploy on AWS EC2.
Currently the program reads XML files from a folder on the local machine. I want to move the code to AWS EC2 and reference an S3 bucket in the Python code, so it can read the data from there.
How can I reference an S3 bucket, and what roles or policies would I need?
You are looking for code that uses the boto3 Python library.
The following are bare-bones code snippets to work with S3.
This lists the files under a prefix (note that list_objects returns at most 1,000 keys per call, so a large bucket needs pagination):
import boto3

s3bucket = 'your-bucket'
s3folder = 'somefolder/'

botoclient = boto3.client('s3')

# List the objects under the given prefix
bucket_list_response = botoclient.list_objects(
    Bucket=s3bucket,
    Prefix=s3folder
)
This returns an actual XML file as a string:

filename = 'somefile.xml'
s3key = s3folder + filename
# or alternatively take the key name from the list response
s3key = bucket_list_response["Contents"][0]["Key"]
# Fetch the object and decode its body into a string
s3object = botoclient.get_object(
    Bucket=s3bucket,
    Key=s3key
)
filedata = s3object["Body"].read()
filedata = filedata.decode('utf-8')
print(filedata)
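Since the body comes back as a plain string, you can feed it straight into the standard library's XML parser. A minimal sketch, assuming the file is well-formed XML (the tag names below are made up for illustration; in practice filedata would be the decoded get_object body):

```python
import xml.etree.ElementTree as ET

# Sample XML standing in for the string downloaded from S3
filedata = "<orders><order id='1'>widget</order><order id='2'>gadget</order></orders>"

# Parse the string and walk the child elements
root = ET.fromstring(filedata)
for order in root.findall("order"):
    print(order.get("id"), order.text)
```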
1) Move your data from your local machine to an S3 bucket.
2) Then set a policy: public-read if the data is public, or, if it is private, restrict access so that only your identity (for example an EC2 instance role with s3:GetObject and s3:ListBucket on the bucket) can read it.
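For the private case, a minimal sketch of a read-only IAM policy you could attach to the EC2 instance's role might look like this (the bucket name is a placeholder):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::your-bucket"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::your-bucket/*"
    }
  ]
}
```

Note that s3:ListBucket applies to the bucket ARN itself, while s3:GetObject applies to the objects inside it (the /* suffix).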