!web2py, but i'm asking for someone's help with aws-sdk-js.


lucas

May 15, 2019, 11:44:52 PM
to web2py-users
hello one and all,

i'm in a nightmare trying to upload files via pure client-side javascript to aws s3 using, i guess, aws-sdk-js.  perhaps someone out there who's done this can have a discussion with me; i don't know where else to go.  i know this isn't web2py, but i'm in the weeds on this one and pretty damn frustrated.  alright, here is my code in the view under the head block, which goes into the header of the html page as assembled by web2py:
{{block head}}
<script src="https://sdk.amazonaws.com/js/aws-sdk-2.283.1.min.js"></script>
<script>
const bucketName = 'bucketXYZ';
const bucketRegion = 'us-east-1';

var s3 = new AWS.S3({
    params: { Bucket: bucketName },
    region: bucketRegion,
    accessKeyId: 'someKeyId',
    secretAccessKey: 'someAccessKey'
});

function local_file_upload(event) {
    event.preventDefault();
    var e = jQuery('input[type="file"]');
    var file = e.prop('files')[0];
    var filename = file.name;
    console.log('uploading file: ' + filename);
    var folder = encodeURIComponent('foldername' + '/' + filename);
    s3.upload({
        Key: folder,
        Body: file,
        ACL: 'public-read'
    }, function(error, data) {
        if (error) {
            // alert() takes a single string, so concatenate rather than pass two arguments
            return alert('Upload Error: ' + error.message);
        }
        alert('Upload Successful');
    });
}
</script>
{{end}}
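note that nothing in the snippet above actually calls local_file_upload; the markup that triggers it isn't shown.  a minimal wiring, assuming a hypothetical button with id "upload_button" exists somewhere on the page, would be:

// hypothetical wiring: bind the upload handler to an assumed button
jQuery('#upload_button').on('click', local_file_upload);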

i'm going crazy because the examples are spread all over hell, and some have similar code but then mix in a thing called node.js, and hence a require('aws-sdk'); i can't figure out whether node.js is client-side javascript, because it seems to get installed on the server.  anyway, i may not need node.js at all and would prefer to stay with the min.js file, as in the above code.  at this point i'm not worried about passing the access and secret keys as plain strings; i know to protect them later with an ajax request to the server so the keys aren't hard-coded in the page.

the s3 variable doesn't error; it's only when i run the local_file_upload function that it throws: Failed to load resource: Preflight response is not successful.

anyway, if anyone has an idea and can guide me, i would really appreciate the help.  thank you, lucas


Dave S

May 16, 2019, 4:38:46 AM
to web...@googlegroups.com


On Wednesday, May 15, 2019 at 8:44:52 PM UTC-7, lucas wrote:
[...] 
i'm going crazy because the examples are spread all over hell, and some have similar code but then mix in a thing called node.js, and hence a require('aws-sdk'); i can't figure out whether node.js is client-side javascript, because it seems to get installed on the server.

FWIW, node.js is usually used as a lightweight server providing either JSON (the J in JWT) or stuff going to React and the like.  It can also be used as a "client server", but this doesn't seem to be common at the moment.

Sorry I'm not answering the actual question.  You could look at Niphlod's original posts on the subject, but many of the comments and examples are actually in the gluon file from when it was adopted into core.

Also,  I use AWS but haven't gotten to S3 yet.

/dps
 

lucas

May 16, 2019, 6:56:13 AM
to web2py-users
hey Dave S,

yes, I've been using ec2 with a centos 7.6 os for 12 years, and s3 for the same amount of time.  I have a t2.micro-sized server; it costs about $20 and I don't have to think about ethernet traffic, throughput, RAM, downtime, etc.  call me old-fashioned, but it's really weird to "set up" a server, log in for the first time, and get a blinking prompt on an empty server.

I've used boto in web2py for server-side uploading of files to s3, and it's worked fine.  but as a professor posting podcasts that run to hundreds of megabytes, I wanted client-side javascript upload for 2 reasons: to offload the intermediate server-side liaison and be more efficient by sending the file from the client, where it originated, directly into s3; and to show progress status during the upload, since some of these files can take a while to upload, like an hour.  so the rationale is sound, I just have to figure out how to do it.  but again, I'm lost in the weeds.

so not using node.js makes sense, because its aws sdk is really just a javascript counterpart to python's boto, which is what I've been doing on the server all along anyway.  what I really need is a javascript library for uploading from the client only.  I don't understand why it's so hard to find examples when s3 has been around since 2006.
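for what it's worth, the browser bundle loaded in the earlier snippet (aws-sdk-2.x.min.js) is exactly that client-only library: it attaches a global AWS object to the page and never involves require().  a quick sanity check from the browser console, assuming the script tag has loaded:

// the browser build of aws-sdk-js attaches a global; no require() needed
console.log(typeof AWS);   // "object"
console.log(AWS.VERSION);  // the bundled sdk version, e.g. "2.283.1"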

so, I'm hoping to get guidance from my peers on web2py google groups.  Lucas

lucas

May 17, 2019, 8:57:29 AM
to web...@googlegroups.com
actually Dave S, 

your indirect answer showed me exactly where not to go, which was server-side node.js.  believe it or not, that steered me to the place i actually needed to be, and i conjured the final code after about 44 hours of toil.  it works freaking great: pure client-side, with progress too.  so i'll post my code snippets for anyone else in the future.

so here is most of the snippet from the view:
{{block head}}
<link rel="stylesheet" href="{{=URL('static','css/jquery-confirm.min.css')}}"/>
<script src="{{=URL('static','js/jquery-confirm.min.js')}}"></script>
<script src="{{=URL('static','js/aws-sdk-2.456.0.min.js')}}"></script> <!--aws-sdk-js-->
<script>
var fileChoose, fileChooseUploadButton, fileChooseSpan, fileETag;

function local_file_upload(event) {
    event.preventDefault();
    var submit = jQuery('form#item_edit > div#submit_record__row > div.col-sm-9 > input[type="submit"]');
    var file = fileChoose.prop('files')[0];
    console.log('uploading file: ' + file.name);
    fileETag.prop('readonly', false).val('').prop('readonly', true);
    fileChooseUploadButton.val('Uploading...').prop('disabled', true);
    submit.prop('disabled', true);
    // synchronous on purpose, so the credentials are in place before the upload starts
    jQuery.get({url: "{{=URL(c='default', f='ajaxAws')}}", async: false}).done(function(t) {
        if (t.response) {
            console.log('rtn t.response: ' + t.response);
            //console.log('rtn t: ' + JSON.stringify(t));
            AWS.config.update({
                "accessKeyId": t.accessKeyId,
                "secretAccessKey": t.secretAccessKey, // .replace('K', '9') was left in from testing a deliberately bad key
                "region": "us-west-1"
            });
        }
    });
    var s3 = new AWS.S3();
    var params = {
        Bucket: 'happyBucketDays',
        Key: '{{=professor_folder}}/' + jQuery('form#item_edit > div#lecture_items_filename__row > div.col-sm-9 > input#lecture_items_filename').val(),
        ContentType: file.type,
        Body: file,
        ACL: 'public-read'
    };
    s3.upload(params, function(error, results) {
        if (error) {
            fileETag.prop('readonly', false).val('').prop('readonly', true);
            fileChooseSpan.html(error.message);
            return jQuery.alert({title: "Error Uploading Data: ", content: error});
        } else {
            // s3 returns the ETag wrapped in literal double quotes; strip them
            var etg = results.ETag.replace('"', '').replace('"', '');
            fileChooseSpan.html('Successful Upload Confirmed.  Be sure to click Submit below.');
            fileETag.prop('readonly', false).val(etg).prop('readonly', true);
            console.log('file uploaded with result: ' + JSON.stringify(results));
            submit.prop('disabled', false);
            return jQuery.alert({title: "Successfully Uploaded File", content: ""});
        }
    }).on('httpUploadProgress', function(progress) {
        fileChooseSpan.html('Uploaded: ' + parseInt((progress.loaded * 100) / progress.total) + '%');
    });
}

function local_filename_change() {
    var e = jQuery('input[type="file"]');
    console.log('local file name: ' + e.prop('files')[0].name);
    jQuery('input#lecture_items_filename').val(e.prop('files')[0].name);
    fileChooseUploadButton.css('display', "inline").val('Upload File').prop('disabled', false);
    console.log(fileChooseUploadButton.prop('type') + " ___ " + fileChooseUploadButton.prop('id'));
    fileChooseSpan.html('Make sure the "Server Filename" is correct.');
    jQuery('form#item_edit > div#lecture_items_filename__row > div.col-sm-9 > input#lecture_items_filename').focus();
}

jQuery(document).ready(function() {
    var sx = "jQuery.version: " + jQuery.fn.jquery;
    console.log("jQuery.document.ready begin..." + sx);
    // inject the file chooser, hidden upload button, and status span into the SQLFORM
    jQuery('form#item_edit > div#lecture_items_text_before__row').after('<div class="form-group row" id="lecture_items_local_filename__row"><label class="form-control-label col-sm-3" for="lecture_items_local_filename" id="lecture_items_local_filename__label">Local Filename</label><div class="col-sm-9" style="text-align:center;"><input type="file" class="btn btn-primary" id="lecture_items_file" onchange="local_filename_change();" /><input class="btn btn-primary" type="submit" id="lecture_items_file_upload" value="Upload File" onclick="local_file_upload(event);" style="display:none;" /><span class="help-block" style="display:block;"></span></div></div>');
    fileChoose = jQuery('form#item_edit > div#lecture_items_local_filename__row > div.col-sm-9 > input#lecture_items_file');
    fileChooseUploadButton = jQuery('form#item_edit > div#lecture_items_local_filename__row > div.col-sm-9 > input#lecture_items_file_upload');
    fileChooseSpan = jQuery('form#item_edit > div#lecture_items_local_filename__row > div.col-sm-9 > span.help-block');
    fileETag = jQuery('form#item_edit > div#lecture_items_fileetag__row > div.col-sm-9 > input#lecture_items_fileetag');
    fileETag.prop('readonly', true);
    console.log("jQuery.document.ready end...");
});
</script>
{{end}}

{{if ('body' in globals()) and (body is not None):
    =body
pass}}
where "body" is passed in as an SQLFORM from the controller; the form fields referenced in the <script> region are implied by that form.
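one knob worth knowing for files this big: s3.upload() takes an optional second argument that controls the multipart chunking and concurrency.  a minimal sketch, with illustrative values only:

// hedged sketch: tune how s3.upload() splits large files into multipart chunks;
// the numbers are illustrative, not recommendations
s3.upload(params, {
    partSize: 10 * 1024 * 1024,  // 10 MB per part
    queueSize: 4                 // up to 4 parts uploading in parallel
}, function(error, results) {
    /* same callback as above */
}).on('httpUploadProgress', function(progress) {
    /* same progress handler as above */
});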

the jQuery.get is a quick fetch from the server so the accessKeyId and secretAccessKey are never hard-coded in the page, and hopefully not visible to the user, the console, or a hack.  i welcome anyone correcting me on that latter point.
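one caveat on that point: whatever the ajax call returns is still visible in the browser's network tab, and AWS.config holds the keys afterward, so this hides them from the page source but not from a determined user.  a tighter pattern is for the server to mint short-lived credentials (e.g. via STS on the boto side) instead of returning the long-lived keys; on the client that only adds a sessionToken field.  a minimal sketch, where the sessionToken response field is hypothetical:

// assumes the server returns temporary STS credentials instead of the real keys;
// t.sessionToken is a hypothetical field the server would have to supply
AWS.config.update({
    "accessKeyId": t.accessKeyId,
    "secretAccessKey": t.secretAccessKey,
    "sessionToken": t.sessionToken,  // required when using temporary credentials
    "region": "us-west-1"
});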

the real trick was getting aws s3 permissions in line, because there are like a thousand protocols and formats and every other damn thing.  i believe the IAM approach is the most straightforward, and it's set up when you set up your s3 account.  under python's boto library, you'd set up a policy with boto and nothing was really special on the s3 side except your accessKeyId and secretAccessKey.  however, when trying to upload to or access the s3 bucket via client-side javascript (aws-sdk-js), you have to set up the CORS configuration under S3 > Permissions > CORS configuration for that s3 bucket.  mine is fairly open and looks like:
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>DELETE</AllowedMethod>
    <ExposeHeader>ETag</ExposeHeader>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>

where it's probably wise to delete the "<AllowedMethod>DELETE</AllowedMethod>" line, and to narrow the wildcard <AllowedOrigin> to your site's own origin, for added security.

you don't need to change or add any of the other s3 permissions, i.e., "Block public access", "Access control list" (ACL), or "Bucket Policy", for the above script to work and access your bucket.  they are all bears anyway.
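tying this back to the "Preflight response is not successful" error from my first post: the preflight is the OPTIONS request the browser sends before any cross-origin PUT, and s3 refuses it until a CORS configuration like the one above exists on the bucket.  while debugging, the sdk's built-in request logging can help separate sdk-level failures from browser-level ones:

// logs each api call the sdk makes to the console; the preflight itself is
// issued by the browser and only shows up in the network tab, not here
AWS.config.logger = console;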

i did play with changing the accessKeyId and/or secretAccessKey by one character and the CORS config and such to make sure the upload was denied when the proper access was not provided.  all those security measures worked.

the jQuery.get call points to:
@auth.requires(lambda: any([auth.has_membership(r) for r in ['Administrator', 'Professor']]))
def ajaxAws():
    if (session is not None) and ('ajaxAws_allowed' in session) and (session.ajaxAws_allowed is not None) and session.ajaxAws_allowed:
        from xyz1 import s3AccessKey, s3SecretKey  # xyz1 is a module, not a controller
        session.ajaxAws_allowed = None  # one-shot flag: the keys can only be fetched once per page load
        return response.json({'response': True, 'accessKeyId': s3AccessKey, 'secretAccessKey': s3SecretKey})
    return response.json({'response': False})

@auth.requires(lambda: any([auth.has_membership(r) for r in ['Administrator', 'Professor']]))
def lecture_item_upload():
    # SQLFORM creation stored into body
    # ...
    session.ajaxAws_allowed = True  # arm the one-shot flag for this page's upload
    return dict(body=body, professor_folder=sProfessor.create_lecture_foldername)
where you can see i've written a website for professors to create their own courses and upload their podcasts.  since the podcasts can be huge, hundreds of MB, the uploading is really best done directly from their computers right into s3, as opposed to running it through the server.

so, that is it.  i hope this helps others.  lucas

Dave S

May 20, 2019, 4:08:58 AM
to web2py-users


On Friday, May 17, 2019 at 5:57:29 AM UTC-7, lucas wrote:
actually Dave S, 

your indirect answer showed me exactly where not to go, which was server-side node.js.  believe it or not, that steered me to the place i actually needed to be, and i conjured the final code after about 44 hours of toil.  it works freaking great: pure client-side, with progress too.  so i'll post my code snippets for anyone else in the future.

[much protein elided] 

Congratulations, and thanks for the update.  I may need to come up to speed on S3 sometime soon, although our current disk configuration is working for us (as long as I periodically retire older uploads from the boot disk to the "cold" disk).

/dps
