Nginx + uwsgi file download problem

Thomas Bellembois

Jan 27, 2012, 7:25:22 AM
to web...@googlegroups.com
Hello,

I have a file download problem with my Web2py application. The file is
about 200kB.

web2py version: 1.99.2 (2011-09-26 06:55:33) stable
nginx-1.0.6
uwsgi-0.9.9.2
Debian Squeeze.

One of my controllers creates a tar.gz file and returns it to the user.
It works well with the default Rocket server, but not in my production
environment. The file is corrupted; when I look at the file size, it is
never the expected size.

I have tested the following two return methods:
- return open(the_filename, 'r+b').read()
- return response.stream(open(the_filename, 'r+b'))
with the same results.

I have the following Nginx configuration:

--------------------------------
user nginx;
worker_processes 4;

events {
    worker_connections 1024;
}

http {
    sendfile on;
    client_max_body_size 100M;

    charset utf-8;
    default_type application/octet-stream;
    ignore_invalid_headers on;
    include mime.types;
    source_charset utf-8;

    gzip on;
    gzip_vary on;
    gzip_comp_level 2;
    gzip_proxied any;
    gzip_types text/plain text/html text/css application/json
               application/x-javascript text/xml application/xml
               application/xml+rss text/javascript;

    ...

    server {
        server_name myserver.ens-lyon.fr;
        listen 443;

        ssl on;
        ...

        keepalive_timeout 70;

        access_log /var/log/myserver_access.log;
        error_log /var/log/myserver_error.log;

        location ~* \.(ico|css|js|gif|jpe?g|png)(\?[0-9]+)?$ {
            root /var/www/myserver/applications/;
            expires max;
        }
        location / {
            uwsgi_pass 127.0.0.1:9001;
            include uwsgi_params;
            uwsgi_param UWSGI_SCHEME $scheme;
        }
        location /static {
            root /var/www/myserver/applications/init/;
            expires max;
        }
    }
}
--------------------------------

I have googled the question and tried some Nginx configuration changes
suggested for file download problems, without any success.

Could you help me?

Regards,

Thomas

pbreit

Jan 27, 2012, 5:53:50 PM
to web...@googlegroups.com
Hard to say.

I don't know if this helps, but here's how appadmin does it for a CSV file:

def csv():
    import gluon.contenttype
    # serve the result with the MIME type registered for .csv
    response.headers['Content-Type'] = \
        gluon.contenttype.contenttype('.csv')
    db = get_database(request)
    query = get_query(request)
    if not query:
        return None
    # suggest a download filename derived from the query
    response.headers['Content-disposition'] = 'attachment; filename=%s_%s.csv'\
         % tuple(request.vars.query.split('.')[:2])
    return str(db(query).select())

Roberto De Ioris

Jan 29, 2012, 1:56:52 PM
to web...@googlegroups.com

Try to set the Content-Length header to avoid nginx losing data.
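
A minimal sketch of that suggestion in a web2py controller; the path and
filename below are illustrative, not taken from the thread:

import os

def download():
    # illustrative path; substitute the archive the controller builds
    path = '/tmp/backup.tar.gz'
    response.headers['Content-Type'] = 'application/gzip'
    # st_size is already a byte count, so it can be used as-is
    response.headers['Content-Length'] = os.stat(path).st_size
    response.headers['Content-Disposition'] = \
        'attachment; filename=backup.tar.gz'
    return response.stream(open(path, 'rb'))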

--
Roberto De Ioris
http://unbit.it

nick name

Jan 31, 2012, 1:38:47 AM
to web...@googlegroups.com
Almost surely the same problem discussed in this thread: https://groups.google.com/d/msg/web2py/1_b63bhBeQs/sYFbXNJL8D4J

Thomas Bellembois

Jan 31, 2012, 3:02:32 AM
to web...@googlegroups.com
On 31/01/2012 07:38, nick name wrote:
> Almost surely the same problem discussed in this thread:
> https://groups.google.com/d/msg/web2py/1_b63bhBeQs/sYFbXNJL8D4J
>
It looks like the problem in this thread is with IE8?
I tried to set the Content-Length header, but I am missing something:

response.headers = {'Content-disposition:': 'attachment; filename=chimitheque_db.tar.gz',
                    'Content-Length:': os.stat('my_filename').st_size/8,
                    'Content-type:': 'application/gzip'}

The server response is not correct. I am running the latest stable version, 1.99.4.
I will try with Python 2.7...
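
For comparison, a sketch of those headers without the trailing colons in
the key names and without the division (os.stat().st_size is already a
byte count, not bits); 'my_filename' is kept as the placeholder from the
snippet above:

import os

response.headers['Content-Disposition'] = \
    'attachment; filename=chimitheque_db.tar.gz'
# no '/8': st_size is bytes, not bits
response.headers['Content-Length'] = os.stat('my_filename').st_size
response.headers['Content-Type'] = 'application/gzip'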

nick name

Jan 31, 2012, 3:18:39 AM
to web...@googlegroups.com
No, the thread started with IE8 being the suspect, but at least in my experiments it is a problem in Rocket that can be triggered with any browser, or even without one (e.g. with wget/curl instead of a browser).

See e.g. https://github.com/explorigin/Rocket/issues/1#issuecomment-3734231

The reason it works locally for you and not in production is probably that you have lower bandwidth to the production machine, which triggers timeouts.

If changing SOCKET_TIMEOUT to 10 in rocket.py makes the problem go away for you (or at least much less frequent), then this is your problem too.

(This is not really a solution -- it doesn't address the cause, and it interferes with timely detection of problems -- but it will let you know if you are experiencing the same problem or a different one)
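
For reference, the constant in question is a module-level setting in
web2py's bundled copy of Rocket (gluon/rocket.py in typical installs;
the exact location is an assumption and may vary by version):

# in gluon/rocket.py (location assumed; may vary by web2py version)
SOCKET_TIMEOUT = 10  # raised for diagnosis only, per the suggestion above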

Thomas Bellembois

Jan 31, 2012, 3:37:19 AM
to web...@googlegroups.com
Thanks for the explanation. I have changed the SOCKET_TIMEOUT to 10,
removed the *.pyc files to be sure, and I have had partial success.

I now get a tar.gz file that is NOT corrupted, but its content is wrong:
the archive contains a single 2.0 GB temporary file (not its real size,
of course; that is the size shown in the unzipping tool) instead of
several .csv files.

I will install the same environment (uwsgi + nginx) on my local machine
to see the differences.

Thanks,

Thomas

pbreit

Jan 31, 2012, 3:50:32 AM
to web...@googlegroups.com
If it's a big file, you need to adjust:

client_max_body_size 100M;
