SuspiciousOperation at filebrowser and s3


Darren Ma

Apr 3, 2013, 10:54:53 AM
to mezzani...@googlegroups.com
I'm getting this SuspiciousOperation error when I switch to s3boto storage and try to add a new blog post or open the Media Library.

Exception Value:
Attempted access to '/app/backlash/media/uploads' denied.

I'm not sure why it's reading the local file path to begin with. All the other static files seem to be loading fine.

I have created an 'uploads' folder in my bucket and set it to public.
I have also tried setting FILEBROWSER_DIRECTORY = MEDIA_URL + 'uploads/' 
which then gives me
Exception Value:
Attempted access to '/app/backlash/media/https:/myproject.s3.amazonaws.com/media/media/uploads' denied.

Any suggestions?

Sorry if this has been asked before. Didn't find a solution in the resources I looked at.

Sachin Shende

Apr 3, 2013, 12:26:34 PM
to mezzani...@googlegroups.com
Are you sure it's not a permissions problem?

Darren Ma

Apr 3, 2013, 12:48:17 PM
to mezzani...@googlegroups.com
I've made all the folders in my bucket public. Still got the same issue.
Sometimes I wonder if all the pain of setting up S3 is worth it...

Stephen McDonald

Apr 3, 2013, 6:40:59 PM
to mezzani...@googlegroups.com
This is an ongoing issue which is being tracked here:


Personally I don't use S3, so I don't have the available space or motivation to dive into this - desperately hoping someone who actually needs it can put in the work to resolve it.

--
Stephen McDonald
http://jupo.org

Darren Ma

Apr 4, 2013, 3:49:15 AM
to mezzani...@googlegroups.com, st...@jupo.org
Thanks for the reply, Stephen.
I knew S3 was going to be a dark road!

We are just starting a small news blog site and decided to host on Heroku because it's free until we get bigger.

Does anyone have any suggestions for alternative static/media storage on Heroku or do I have to change my hosting platform?

Marcos Scriven

Apr 8, 2013, 7:16:11 AM
to mezzani...@googlegroups.com, st...@jupo.org
Can you post your S3 boto settings in full, please? (Obviously minus your secret key!)

I've not seen this error, but I'll see if I can replicate it.

Darren Ma

Apr 8, 2013, 7:41:45 AM
to mezzani...@googlegroups.com, st...@jupo.org
Hi Marcos,
I'm on 
  • Django 1.5.1
  • Mezzanine 1.4.5
  • boto 2.8.0
  • django-storages 1.1.8
My s3 settings are:
########## S3 STATIC CONFIGURATION
DEFAULT_FILE_STORAGE = 'backlash.storage.S3ProtectedStorage'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = get_env_setting('AWS_ACCESS_KEY_ID')
AWS_SECRET_ACCESS_KEY = get_env_setting('AWS_SECRET_ACCESS_KEY')
AWS_STORAGE_BUCKET_NAME = get_env_setting('AWS_STORAGE_BUCKET_NAME')
AWS_MEDIA_BUCKET_NAME = get_env_setting('AWS_MEDIA_BUCKET_NAME')
STATIC_URL = 'https://%s.s3.amazonaws.com/static/' % AWS_STORAGE_BUCKET_NAME
MEDIA_URL = 'https://%s.s3.amazonaws.com/media/' % AWS_MEDIA_BUCKET_NAME
# Used to make sure that only changed files are uploaded with collectstatic
AWS_PRELOAD_METADATA = True
AWS_LOCATION = 'static'
AWS_QUERYSTRING_EXPIRE = 7200
#turns off https for static files (necessary)
AWS_QUERYSTRING_AUTH = False
########## S3 STATIC CONFIGURATION

Here is the link to the storage class I'm using: http://pastebin.com/2XW0DPaQ (not sure, perhaps one could use the default S3BotoStorage, but I had issues with that on previous deployments, which is why I use this one).

It works with local file storage but not with S3, which is why I'm curious to see your setup.
I should also mention that with my current setup I have an older version of Mezzanine and Django working (kind of; the Media Library is buggy).

Thanks for the effort!

Marcos Scriven

Apr 8, 2013, 8:22:44 AM
to mezzani...@googlegroups.com, st...@jupo.org
Hi Darren

I don't use that S3boto class (I use the default one), but I do think I found the reason for your issue:


Let me know if that helps, if not I'll look into it further.

Marcos

Darren Ma

Apr 8, 2013, 9:04:15 AM
to mezzani...@googlegroups.com, st...@jupo.org
That post on SO doesn't offer a solution. It is where the error is bubbling up from, but I don't understand s3boto well enough to make changes to fix it.

I also reverted to DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
And set my FILEBROWSER_DIRECTORY = MEDIA_URL + 'uploads/'

Unfortunately no luck. Still get this error:
Attempted access to '/home/darren/Projects/heroku/backlash_project/backlash/media/https:/authentictrendmedia.s3.amazonaws.com/media/uploads' denied.

I tried playing around with _clean_name(), e.g. removing the replace('\\', '/'),
and also messing with _normalize_name() by commenting it out and just using pass.

No luck though.

Marcos Scriven

Apr 8, 2013, 4:13:08 PM
to mezzani...@googlegroups.com, st...@jupo.org
Oh....

Then I see the problem right away - it's literally trying to find a directory called:

'/home/darren/Projects/heroku/backlash_project/backlash/media/https:/authentictrendmedia.s3.amazonaws.com/media/uploads' denied.

On your machine. 

But I'm still not clear on when exactly this error occurs for you. When trying to add a file in the Media Library? Or just when loading the Media Library at all?

Personally, I set my MEDIA_ROOT to an empty string: ""

MEDIA_ROOT doesn't make any sense when using S3, since it's an absolute local file path.

Darren Ma

Apr 9, 2013, 3:00:54 AM
to mezzani...@googlegroups.com, st...@jupo.org
Hi Marcos,

Yes, you are right, MEDIA_ROOT was being set in my base.py settings.
By editing these settings I can finally load and kind of use my Media Library!

########## S3 STATIC CONFIGURATION
MEDIA_ROOT = ''
# FILEBROWSER_DIRECTORY = MEDIA_URL + 'uploads/'

I still wouldn't say it's usable though. I tried to upload a file and it completes, but every time I am greeted with this error:

Request Method: GET
Request URL: http://127.0.0.1:8000/admin/media-library/browse/?ot=desc&o=date
Traceback:
File "/home/darren/.virtualenvs/bl/local/lib/python2.7/site-packages/django/core/handlers/base.py" in get_response
  115.                         response = callback(request, *callback_args, **callback_kwargs)
File "/home/darren/.virtualenvs/bl/local/lib/python2.7/site-packages/django/contrib/admin/views/decorators.py" in _checklogin
  17.             return view_func(request, *args, **kwargs)
File "/home/darren/.virtualenvs/bl/local/lib/python2.7/site-packages/django/views/decorators/cache.py" in _wrapped_view_func
  89.         response = view_func(request, *args, **kwargs)
File "/home/darren/.virtualenvs/bl/local/lib/python2.7/site-packages/filebrowser_safe/views.py" in browse
  108.         if fileobject.filetype == request.GET.get('filter_type', fileobject.filetype) and get_filterdate(request.GET.get('filter_date', ''), fileobject.date):
File "/home/darren/.virtualenvs/bl/local/lib/python2.7/site-packages/filebrowser_safe/base.py" in _date
  85.             self._date_stored = time.mktime(default_storage.modified_time(self.path).timetuple())
File "/home/darren/Projects/heroku/backlash_project/backlash/storages/backends/s3boto.py" in modified_time
  455.         return parse_ts_extended(entry.last_modified)
File "/home/darren/Projects/heroku/backlash_project/backlash/storages/backends/s3boto.py" in parse_ts_extended
  39.         rv = parse_ts(ts)
File "/home/darren/.virtualenvs/bl/lib/python2.7/site-packages/boto/utils.py" in parse_ts
  390.     ts = ts.strip()
Exception Type: AttributeError at /admin/media-library/browse/
Exception Value: 'NoneType' object has no attribute 'strip'

The REALLY strange part is that this error persists and I can't load my Media Library until I do a restart on the server, e.g. stopping runserver and starting it again. Then all of a sudden the Media Library loads and I can see the files uploaded there.

Marcos Scriven

Apr 9, 2013, 4:20:59 AM
to mezzani...@googlegroups.com, st...@jupo.org
Hi Darren

From the bottom of that stack trace it looks to me like S3 isn't providing a timestamp (hence 'NoneType'), and by restarting I suspect you are forcing S3 to reload the files (and hence pick up a timestamp).

I would definitely recommend putting a debug point in here:

File "/home/darren/.virtualenvs/bl/local/lib/python2.7/site-packages/filebrowser_safe/base.py" in _date
  85.             self._date_stored = time.mktime(default_storage.modified_time(self.path).timetuple())

And follow into the call to modified_time to see why that's coming back as None for new files.
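
For example, a quick way to poke at that call directly from ./manage.py shell (assuming the s3boto backend is your DEFAULT_FILE_STORAGE; the file name here is just a placeholder for something you just uploaded):

from django.core.files.storage import default_storage

try:
    # the same call filebrowser_safe makes for each file it lists
    print(default_storage.modified_time('uploads/example.jpg'))
except AttributeError as e:
    # "'NoneType' object has no attribute 'strip'" means the preloaded
    # metadata entry had no last_modified value for this key
    print('modified_time failed: %s' % e)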

Marcos

Darren Ma

Apr 9, 2013, 9:24:18 AM
to mezzani...@googlegroups.com, st...@jupo.org
Marcos, you are a good man.

In the S3BotoStorage class in s3boto.py there is this method:

    def modified_time(self, name):
        name = self._normalize_name(self._clean_name(name))
        entry = self.entries.get(name)
        # only call self.bucket.get_key() if the key is not found
        # in the preloaded metadata.
        if entry is None:
            entry = self.bucket.get_key(self._encode_name(name))
        # Parse the last_modified string to a local datetime object.
        return parse_ts_extended(entry.last_modified)

The issue is that entry.last_modified was None, which was breaking everything.
I've edited the method as follows:

    def modified_time(self, name):
        name = self._normalize_name(self._clean_name(name))
        entry = self.entries.get(name)
        # only call self.bucket.get_key() if the key is not found
        # in the preloaded metadata.
        if entry is None:
            entry = self.bucket.get_key(self._encode_name(name))
        if not entry.last_modified:
            # Fall back to "now" if S3 hasn't reported a timestamp for this
            # entry. (Assumes `import datetime` and the ISO8601 / ISO8601_MS
            # format strings are available at the top of s3boto.py.)
            try:
                entry.last_modified = datetime.datetime.now().strftime(ISO8601)
            except ValueError:
                entry.last_modified = datetime.datetime.now().strftime(ISO8601_MS)

        # Parse the last_modified string to a local datetime object.
        return parse_ts_extended(entry.last_modified)

It is working. Not sure if this will have negative consequences further down the line but I'm just happy it's working.
Thank you again for steering me in the right direction.

Stephen McDonald

Apr 9, 2013, 9:28:27 AM
to Darren Ma, mezzani...@googlegroups.com
Really enjoying this thread! I'd love to get to a point where we can say do X, Y and Z and S3 will just work.

Are the issues:

- Set MEDIA_URL to an empty string
- s3boto needs a patch for modified time

Is that it? Anything else?

Is there anything we could do in Mezzanine to monkey-patch our way into getting the modified time issue working?
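
Something along these lines is roughly what I have in mind (completely untested, just a sketch that drops the stale cache entry and re-fetches the key from S3 when the timestamp is missing):

from storages.backends.s3boto import S3BotoStorage

_original_modified_time = S3BotoStorage.modified_time

def _patched_modified_time(self, name):
    try:
        return _original_modified_time(self, name)
    except AttributeError:
        # last_modified was None in the preloaded metadata; drop the stale
        # cache entry and let the backend fetch the key again from S3
        key = self._normalize_name(self._clean_name(name))
        self._entries.pop(key, None)
        return _original_modified_time(self, name)

S3BotoStorage.modified_time = _patched_modified_time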

Marcos Scriven

Apr 9, 2013, 10:01:10 AM
to mezzani...@googlegroups.com, Darren Ma, st...@jupo.org
The MEDIA_ROOT needs to be set to empty string, rather than MEDIA_URL. You definitely need MEDIA_URL to be set to wherever you're serving your media from.

While the patch to S3 boto might work, I would worry about a couple of things:

  • If the issue doesn't occur after restart (without the patch), it would be preferable to set the last modified time via the normal code path. If that means reloading the file from S3 to get the timestamp, so be it. Much preferable to get the timestamp according to S3 than just putting one in.
  • Timezones/daylight savings - I'd use datetime.datetime.utcnow().replace(tzinfo=utc) (or whatever timezone you want; see the two-line sketch after this list). Even if you're not working across timezones, you can get burnt twice a year when the clocks change over. If you happen to have a cron job running (for example) that depends on the last modified timestamp, you could be caught out.
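
In code, that fallback timestamp would just be (utc here comes from django.utils.timezone):

import datetime
from django.utils.timezone import utc

fallback = datetime.datetime.utcnow().replace(tzinfo=utc)  # timezone-aware "now"
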
I didn't actually experience this timestamp issue - so perhaps it's a new bug (or an old one). I'll check what versions I have.

The only issues I have with it at the moment are:

  • The Media Library is very slow indeed - I rather suspect that's due to it trying to pick up metadata like file sizes and timestamps, when really all you want is a filename and a thumbnail. I'd have to look and see if that's Mezzanine or Filebrowser (I suspect the latter).
  • With the thumbnail thing in mind, I found that Mezzanine tries to write the thumbnail directly to the local filesystem, rather than using default_storage (hence this topic https://groups.google.com/forum/?fromgroups=#!topic/mezzanine-users/WGVaNhD5vRs). Initially I thought we could use an in-memory file to save the PIL output, but looking again I don't really see the need to save it anywhere other than default storage at all, since it's only ever served from MEDIA_URL? A rough sketch of that idea follows below.
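
Very roughly, what I have in mind is something like this (the function and names are illustrative, not Mezzanine's actual thumbnail code):

from io import BytesIO
from PIL import Image
from django.core.files.base import ContentFile
from django.core.files.storage import default_storage

def save_thumbnail(source_path, thumb_path, size=(100, 100)):
    # read the original via whatever storage backend is configured (S3 here)
    with default_storage.open(source_path) as f:
        image = Image.open(f)
        image.thumbnail(size)
    # write the resized image to an in-memory buffer, then hand it to the
    # storage backend so it lands in the bucket instead of on local disk
    buf = BytesIO()
    image.save(buf, format='PNG')
    return default_storage.save(thumb_path, ContentFile(buf.getvalue()))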

Thanks

Marcos

Stephen McDonald

Apr 9, 2013, 10:15:34 AM
to mezzani...@googlegroups.com, Darren Ma
Yes, I was thinking the same thing with the change you proposed - if we can have the thumbnail written directly via the storage API, then everything should just work.

Thanks for all your help in troubleshooting this.

Darren Ma

Apr 9, 2013, 10:26:45 AM
to mezzani...@googlegroups.com, Darren Ma, st...@jupo.org
So far, using django-storages for S3, I'd recommend the following (condensed settings sketch below):

MEDIA_ROOT = ''
FILEBROWSER_DIRECTORY does not need to be set.
You should create the media/uploads folder in your bucket and make it public.
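
Condensed into a rough sketch (a single placeholder bucket here, rather than my separate static/media buckets; adapt to your own project):

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_STORAGE_BUCKET_NAME = 'my-bucket'  # placeholder
STATIC_URL = 'https://%s.s3.amazonaws.com/static/' % AWS_STORAGE_BUCKET_NAME
MEDIA_URL = 'https://%s.s3.amazonaws.com/media/' % AWS_STORAGE_BUCKET_NAME
MEDIA_ROOT = ''  # an absolute local path makes no sense with S3
# FILEBROWSER_DIRECTORY is deliberately left unset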

As for the little timestamp issue I've been having, I'll try to see how to do it better and get back to you all if I ever get there.

Darren Ma

Apr 9, 2013, 10:33:04 AM
to mezzani...@googlegroups.com, Darren Ma, st...@jupo.org
Sorry to harp on.
I was going to submit an issue to django-storages regarding this timestamp issue, but I believe one has already been filed.


Cheers

Stephen McDonald

Apr 9, 2013, 10:36:16 AM
to mezzani...@googlegroups.com, Darren Ma
Awesome - I just added a link back to this thread in that issue.

Marcos Scriven

Apr 9, 2013, 10:37:39 AM
to mezzani...@googlegroups.com, Darren Ma, st...@jupo.org
Ah... well spotted!

It looks like it is indeed a new issue. It was introduced here a couple of weeks ago:
 

Whereas I've not updated since January (version 1.1.6).

Marcos

Flavio Barros

Apr 22, 2015, 9:28:33 PM
to mezzani...@googlegroups.com, darren...@gmail.com, st...@jupo.org
I'm still having this problem. Any solutions?

Flavio Barros

Apr 22, 2015, 9:53:44 PM
to mezzani...@googlegroups.com, darren...@gmail.com, st...@jupo.org


Tom Longson

Jun 9, 2015, 8:19:52 PM
to mezzani...@googlegroups.com, st...@jupo.org, darren...@gmail.com
I'm still having the ts.strip() error; I tried using django-storages-redux, but ran into another error:

ImproperlyConfigured at /admin/media-library/browse/
Error finding Upload-Folder. Maybe it does not exist?

Still looking for a solution.

Tom

Daniel Blasco

Sep 1, 2015, 8:54:03 PM
to Mezzanine Users, st...@jupo.org, darren...@gmail.com
Hi,

A different way to deal with this problem is to override the modified_time function so that it actually fetches the key again from storage if last_modified is None:


from storages.backends.s3boto import S3BotoStorage, parse_ts_extended

class MyS3BotoStorage(S3BotoStorage):

    def modified_time(self, name):
        name = self._normalize_name(self._clean_name(name))
        entry = self.entries.get(name)
        # only call self.bucket.get_key() if the key is not found
        # in the preloaded metadata or if modified time stamp is empty.
        if entry is None or not entry.last_modified:
            entry = self.bucket.get_key(self._encode_name(name))
            self._entries[name] = entry

        # Parse the last_modified string to a local datetime object.
        return parse_ts_extended(entry.last_modified)

Then reference this new class from STATICFILES_STORAGE and DEFAULT_FILE_STORAGE in your settings.
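
For example, if the class above lives in a module called myproject/storage.py (the dotted path is just a placeholder):

DEFAULT_FILE_STORAGE = 'myproject.storage.MyS3BotoStorage'
STATICFILES_STORAGE = 'myproject.storage.MyS3BotoStorage'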

By the way, it looks like the original project on Bitbucket (https://bitbucket.org/david/django-storages) is not being maintained. Are there any forks or alternatives?

Derek Adair

Mar 12, 2016, 3:44:37 PM
to Mezzanine Users, st...@jupo.org, darren...@gmail.com
Django Storages Redux is now on GitHub. I think they even took over the PyPI namespace.