Hi all,
I'm just getting started with boto3, and I'm having some trouble understanding how to integrate it into a Flask application so users can upload files directly to S3. The upload interface uses a WTForm with a FileField. From this form, I grab the file out of the request, which is a FileStorage object wrapping a BytesIO stream of the data. So far I have the following, which successfully uploads the file to S3:
def upload():
    form = UploadForm()  # This contains the field: data_file = FileField()
    if request.method == 'POST' and form.submit.data:
        try:
            file = request.files[form.data_file.name]
            s3 = boto3.client('s3')
            s3.put_object(Body=file, Bucket='mybucket', Key=file.filename)
        ......
However, I would like to use the S3 customization s3.upload_file() instead, so that I can take advantage of automatic multipart uploads for large files. I am assuming (please correct me if I'm wrong) that s3.put_object() does not do this.
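For reference, here's my understanding of how upload_file is normally called, wrapped in a small helper: it takes a filesystem path (not a stream) and splits large files into multipart uploads itself. The path and bucket name below are just placeholders.

```python
def upload_path(s3_client, path, bucket, key):
    """Upload a file from disk using boto3's managed transfer.

    As I understand the docs, upload_file takes a filesystem path and
    automatically switches to multipart uploads for large files.
    """
    s3_client.upload_file(path, bucket, key)

# Usage (untested sketch; path and bucket are placeholders):
# import boto3
# upload_path(boto3.client('s3'), '/tmp/example.dat', 'mybucket', 'example.dat')
```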
When attempting the following, I get an exception saying that either a string or bytes must be passed for the Filename argument.
def upload():
    form = UploadForm()  # This contains the field: data_file = FileField()
    if request.method == 'POST' and form.submit.data:
        try:
            file = request.files[form.data_file.name]
            s3 = boto3.client('s3')
            s3.upload_file(file, 'mybucket', file.filename)
The problem is that the FileField returns a BytesIO stream. I've tried various ways of reading the stream out, but my impression is that the method expects a path to a file rather than the body of the data. Any suggestions on how to go about using upload_file (either by getting a file path from the form, or by handling the BytesIO it returns)? Or is the recommendation to manually set up a multipart upload for large files?
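One thing I did spot in the docs is upload_fileobj, which looks like it accepts a readable file-like object (rather than a path) while still using the same managed multipart machinery. Is something like this the recommended route? Untested sketch below; the bucket name is a placeholder.

```python
def upload_stream(s3_client, stream, bucket, key):
    """Upload a file-like object using boto3's managed transfer.

    upload_fileobj takes any readable file-like object (e.g. the BytesIO
    behind werkzeug's FileStorage) and, per my reading of the docs, still
    performs automatic multipart uploads for large payloads.
    """
    s3_client.upload_fileobj(stream, bucket, key)

# Usage inside the Flask view (untested; bucket is a placeholder):
# file = request.files[form.data_file.name]
# upload_stream(boto3.client('s3'), file, 'mybucket', file.filename)
```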
Thanks!