That's an interesting approach. Unfortunately, what I'm trying to do is protect against file-upload attacks (both oversized files and files containing viruses), so I really do need a mechanism to cut off the upload once we decide it's a problem.
On Sunday, September 30, 2012 8:27:52 AM UTC+1, David M. wrote:
I also had a hard time trying to implement a file-upload process that streams the data directly to another backend system without caching it in a temporary file.
The main problem is that HTTP is designed so that the server has to wait until the client has finished sending the whole request body before sending a response. If you send an error code, it won't be acted on until the request has been completely consumed; if you close the connection by throwing an exception, the browser appears to hang. So there seems to be no way, from the server's point of view, to properly cancel a request that has already started.
Before doing the actual upload, I call a specific controller (isUploadValid) that tells me whether the upload will be valid (based on the target file size, the currently logged-in user's account settings, ...).
If it won't be, I simply don't do the upload.
Otherwise, I proceed with the real upload.
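The pre-check step above can be sketched as a plain function; the class and field names here (AccountSettings, maxBytesAllowed, quotaRemaining) and the exact rules are illustrative assumptions, not the actual controller:

```java
// Sketch of the pre-upload validation step (hypothetical names and rules).
// The client reports the intended file size before uploading; the server
// answers whether the real upload should proceed.
public class UploadPreCheck {
    public static final class AccountSettings {
        final long maxBytesAllowed;
        final long quotaRemaining;
        public AccountSettings(long maxBytesAllowed, long quotaRemaining) {
            this.maxBytesAllowed = maxBytesAllowed;
            this.quotaRemaining = quotaRemaining;
        }
    }

    /** Returns null if the upload may proceed, or a rejection reason. */
    public static String isUploadValid(long declaredSize, AccountSettings settings) {
        if (declaredSize <= 0) return "invalid size";
        if (declaredSize > settings.maxBytesAllowed) return "file too large";
        if (declaredSize > settings.quotaRemaining) return "quota exceeded";
        return null; // OK to upload
    }
}
```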
In the "real" upload Action I have written a specific part handler that handles the uploaded data (in my case I just want to stream it somewhere else) and always reads the full user-provided data. In case of a problem I keep reading the content (but no longer use it), and once everything has been read I send an error code.
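That drain-then-error pattern can be sketched independently of Play's part-handler API; the chunk-iterator shape and all names below are illustrative assumptions:

```java
import java.util.Iterator;
import java.util.function.Consumer;

// Sketch of a part handler that streams chunks onward until a size limit is
// hit, then keeps draining (and discarding) the remaining chunks so the
// request body is fully consumed before an error status is returned.
public class DrainingUpload {
    /** Returns the HTTP status to send: 200 on success, 413 if too large. */
    public static int consume(Iterator<byte[]> chunks, long maxBytes,
                              Consumer<byte[]> sink) {
        long total = 0;
        boolean exceeded = false;
        while (chunks.hasNext()) {
            byte[] chunk = chunks.next();
            total += chunk.length;
            if (!exceeded && total > maxBytes) exceeded = true;
            if (!exceeded) sink.accept(chunk); // stream good data onward
            // once exceeded, chunks are still read but silently dropped
        }
        return exceeded ? 413 : 200;
    }
}
```

The key point is that the loop never stops early: the body is always consumed to the end, so the browser gets a well-formed response instead of a hung connection.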
Of course, if someone trying to "attack" your site calls the upload controller directly without going through the check first, this solution won't work, but I guess that's another topic.
Since implementing this solution I have heard about the Expect handshake. HTTP defines an Expect: 100-continue handshake: in this mode the client starts by sending only the headers of the request (including Content-Length) and waits for the server to return either an error code or a 100 Continue response. If the latter is received, the client then sends the full request with the body. I think this would be the most elegant way to handle your problem, but I don't know whether the Play! framework supports it yet.
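The server-side decision in that handshake can be sketched as follows; the header names come from the HTTP spec, but the size rule and everything else here is an illustrative assumption (the thread doesn't confirm whether Play exposes this):

```java
import java.util.Map;

// Sketch of the server-side Expect: 100-continue decision. Given only the
// request headers (no body yet), the server either invites the client to
// send the body ("100 Continue") or rejects the request up front.
public class ExpectContinue {
    public static String interimResponse(Map<String, String> headers, long maxBytes) {
        String expect = headers.get("Expect");
        boolean expects = expect != null && expect.equalsIgnoreCase("100-continue");
        if (!expects) return ""; // no Expect header: no interim response needed
        String len = headers.get("Content-Length");
        if (len != null) {
            try {
                if (Long.parseLong(len) > maxBytes)
                    return "HTTP/1.1 413 Request Entity Too Large";
            } catch (NumberFormatException ignored) { }
        }
        return "HTTP/1.1 100 Continue";
    }
}
```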
Good luck! David
On Wednesday, September 26, 2012 1:55:25 AM UTC+2, jimr wrote:
I've been trying to implement an upper bound on the request body length using the maxLength body parser combined with the multipartFormData body parser (or the raw body parser), so that huge requests can be blocked without loading all of the data into memory.
Unfortunately, this doesn't seem to work when uploading large files from the browser. When I choose a large file (e.g. 5 MB) and submit the <form>, the request hangs in the browser and never completes. On the server side the request completes, but the browser never seems to see the response.
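For reference, the shape of a maxLength-style combinator like the one being combined above can be sketched independently of Play's actual BodyParser type; modelling a parser as a function over an iterator of byte chunks is an illustrative assumption, not Play's API:

```java
import java.util.Iterator;
import java.util.function.Function;

// Sketch of a maxLength-style combinator over a simple streaming parser.
// A parser is modelled as a function over an iterator of byte chunks.
public class MaxLengthSketch {
    /** Wraps `inner` so parsing stops as soon as the byte count passes max. */
    public static <A> Function<Iterator<byte[]>, A> maxLength(
            long max, Function<Iterator<byte[]>, A> inner, A tooLargeResult) {
        return chunks -> {
            long[] total = {0};
            boolean[] exceeded = {false};
            // Hand the inner parser a view that cuts off at the limit, so the
            // oversized remainder is never buffered in memory.
            Iterator<byte[]> limited = new Iterator<byte[]>() {
                byte[] pending;
                public boolean hasNext() {
                    if (pending != null) return true;
                    if (exceeded[0] || !chunks.hasNext()) return false;
                    byte[] c = chunks.next();
                    total[0] += c.length;
                    if (total[0] > max) { exceeded[0] = true; return false; }
                    pending = c;
                    return true;
                }
                public byte[] next() { byte[] c = pending; pending = null; return c; }
            };
            A parsed = inner.apply(limited);
            return exceeded[0] ? tooLargeResult : parsed;
        };
    }
}
```

Note that this sketch only stops *parsing*; as discussed above, something still has to consume or abort the rest of the request body, which is exactly where the browser-hang symptom comes from.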