pyfilesystem support (store uploads in S3)


Massimo Di Pierro

May 31, 2012, 1:56:47 PM
to web...@googlegroups.com
Here is an example:

easy_install pyfilesystem

>>> import fs.s3fs
>>> myfs = fs.s3fs.S3FS(bucket, prefix, aws_access_key, aws_secret_key)
>>> db.define_table('test',Field('file','upload',uploadfs = myfs))

Now all your uploaded files will go on S3.
Here is a list of supported filesystems: http://packages.python.org/fs/filesystems.html

WARNINGS: 
- Needs testing. I have tested with OSFS and I am confident it works.
- I do not think this will work on GAE; it should be tested.
- uploadfolder and uploadseparate are ignored when uploadfs is specified (this should be changed; any takers?)

It should be possible to wrap myfs in an encryption layer, but I have not done it yet.
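For anyone who wants to experiment with the uploadfs hook without S3 credentials: as far as I can tell, web2py mainly calls open() and remove() on the uploadfs object, so a tiny in-memory stand-in can be used. This is a hedged sketch; MemoryFS and its method set are my own assumptions, not part of web2py or pyfilesystem:

```python
import io


class _CaptureBuffer(io.BytesIO):
    """BytesIO that copies its contents into a dict when closed,
    so writes survive the close() that web2py performs."""

    def __init__(self, store, path):
        super().__init__()
        self._store = store
        self._path = path

    def close(self):
        self._store[self._path] = self.getvalue()
        super().close()


class MemoryFS:
    """Minimal in-memory stand-in for a pyfilesystem object.
    Implements only the calls the upload machinery is assumed
    to make: open(path, mode), exists(path), remove(path)."""

    def __init__(self):
        self.files = {}

    def open(self, path, mode='rb'):
        if 'w' in mode:
            return _CaptureBuffer(self.files, path)
        return io.BytesIO(self.files[path])

    def exists(self, path):
        return path in self.files

    def remove(self, path):
        del self.files[path]


# usage, mimicking what a store/retrieve cycle would do
myfs = MemoryFS()
f = myfs.open('uploads/test.file', 'wb')
f.write(b'hello')
f.close()
print(myfs.exists('uploads/test.file'))  # True
print(myfs.open('uploads/test.file', 'rb').read())  # b'hello'
```

Passing an instance of something like this as uploadfs=myfs should let you unit-test the upload path before pointing it at S3FS.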

We may want a more comprehensive strategy and allow every web2py file (including apps, sessions, tickets, etc) to go into a pyfilesystem. Is this necessary? On linux one can mount filesystems in a folder anyway. Is this more trouble than it is worth?

Massimo





Jason (spot) Brower

Jun 1, 2012, 3:34:04 AM
to web...@googlegroups.com
Sounds fun!

nils

Jun 1, 2012, 5:32:04 PM
to web...@googlegroups.com

Hi,
I use S3 for storing images. I was looking at creating something like this ages ago, but my Python fu and web2py fu were not up to scratch...

I will test this; just wondering if I can change my DB model over. Currently I have S3 mounted as a filesystem, so web2py is just writing to a directory.

Nils

Massimo Di Pierro

Jun 1, 2012, 11:23:06 PM
to web...@googlegroups.com
For production, your solution is better than using web2py+pyfilesystem. Yet, if you can help us test, it would be great.

howesc

Jun 2, 2012, 6:19:57 PM
to web...@googlegroups.com
i bet we could do something similar using the boto library on GAE.

Diogo Munaro

Jan 8, 2014, 12:01:03 PM
to web...@googlegroups.com
Hey guys! I know this topic is old, but here it's working great with mysql or sqlite on web2py 2.7.4.

Just install:

pip install fs

Then on model:

import fs.s3fs
myfs = fs.s3fs.S3FS(bucket, prefix, aws_access_key, aws_secret_key)
db.define_table('image',Field('image','upload',uploadfs = myfs))

I'm using ubuntu 12.04 amd64 with python 2.7.3 virtualenv.

Thx for your help!

Lakshmi Narasimha

Feb 20, 2016, 10:35:02 AM
to web2py-users
I'm using the exact same method to test uploading images to S3. But I'm receiving the following error:


Ticket ID

127.0.0.1.2016-02-16.14-39-32.325d6b5d-68c7-405f-b63d-4e6971c2623e

<class 'boto.exception.S3ResponseError'> S3ResponseError: 403 Forbidden

Any idea where I'm going wrong? I couldn't add the region name here, and I guess one reason is that the bucket is not in the default region. Thanks
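In case it helps later readers: classic boto signs requests against the default us-east-1 endpoint unless told otherwise, and under the legacy naming scheme a bucket in another region lives at s3-&lt;region&gt;.amazonaws.com, which can surface as a 403 rather than a clearer error. A small illustrative helper (my own sketch of the legacy endpoint scheme, not part of web2py, boto, or pyfilesystem):

```python
def regional_s3_endpoint(bucket, region=None):
    """Build the legacy region-specific S3 endpoint for a bucket.

    Illustrative only: us-east-1 (the default) uses
    s3.amazonaws.com, while other regions use the
    s3-<region>.amazonaws.com form current when this
    thread was written."""
    if region in (None, "", "us-east-1"):
        return "%s.s3.amazonaws.com" % bucket
    return "%s.s3-%s.amazonaws.com" % (bucket, region)


print(regional_s3_endpoint("mybucket"))               # mybucket.s3.amazonaws.com
print(regional_s3_endpoint("mybucket", "eu-west-1"))  # mybucket.s3-eu-west-1.amazonaws.com
```

If you are talking to boto directly (outside S3FS), boto 2 also offers boto.s3.connect_to_region() as a way to target the bucket's region explicitly; whether S3FS exposes a region option would need checking against its own docs.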