Streaming vs Nginx proxy


Tomas Morstein

Mar 8, 2014, 7:47:30 PM
to chica...@googlegroups.com
Hello,

Is anybody using streaming actions behind an Nginx proxy?

Let's say we have the following action:

%% Stream generator: each call emits Max bytes of a single letter
%% (cycling through the alphabet as N increases), until N reaches Max.
stream_test('GET', [], _) ->
    StreamGen = fun
                    ({N, Max}) when N < Max ->
                        {output,
                         list_to_binary([ <<($A + (N rem ($Z - $A)))>>
                                          || _ <- lists:seq(1, Max) ]),
                         {N+1, Max}};
                    (_) ->
                        done
                end,
    {stream, StreamGen, {0, 64}}.


And just a "proxy_pass" rule on the Nginx side.

If I call the "stream_test" action directly, everything works fine. If I try to call it via the Nginx proxy, I get the error:
  upstream prematurely closed connection while reading response header from upstream

I have tried various proxy buffer sizes, toggled buffering on and off, and tweaked most of the available timeout options, but none of that helped.
(My tests are based on CB 0.8.7. I also tried manually upgrading ranch and cowboy to their latest versions; it didn't help.)

Any ideas?

Tom

Dmitry Polyanovsky

Mar 8, 2014, 11:57:26 PM
to chica...@googlegroups.com
Hi,

As I recall, this error occurs when you send data before the headers (or don't send headers at all). There are two possible ways to solve it:
1) try to send some headers from your code prior to streaming (see the sketch after this list)
2) stream through Nginx itself; there is no reason to pass the request to CB if the file exists on the filesystem
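
For option 1, a minimal sketch of what that could look like, reusing the stream_test action from the first message and assuming the optional fourth tuple element carries extra response headers:

stream_test('GET', [], _) ->
    StreamGen = fun ({N, Max}) when N < Max ->
                        %% emit one short chunk per generator call
                        {output, <<"chunk\n">>, {N+1, Max}};
                    (_) ->
                        done
                end,
    %% explicit Content-Type header sent before any streamed data
    {stream, StreamGen, {0, 64},
     [{"Content-Type", "text/plain"}]}.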

P.S. Upgrade your CB to the latest version and you will get the latest versions of all its dependencies too.

Tomas Morstein

Mar 9, 2014, 7:38:13 AM
to chica...@googlegroups.com
Hi,

Well, according to the CB code, the headers should be included (at least the default "Content-Type: text/html") even if we don't add anything explicitly.
I tried to change {stream, StreamGen, {0, 64}} to {stream, StreamGen, {0, 64}, [{"Content-Type", "text/plain"}]}, but it didn't help.

If I call CB with cURL in verbose mode, I can see all the headers I've passed from the controller. If I call it via Nginx, it gives me an HTTP 502 error.

The background story is that we have a CB-based application server that lives inside a per-customer Docker container, while Nginx is just a vhost proxy with dynamically generated configuration for each container's (randomly assigned) port.
Each of the containers is reachable through SSH (a "legacy" character-cell application; also proxied, so we cannot upload anything with sftp/scp) and HTTP (mostly reporting), so if we want to upload a hotfix patch package (instead of rebooting the container from a new Docker image), the only way is to upload it over HTTP.
But in that case, we want to report some progress messages (the output of the installation procedure; we don't upgrade just the Erlang parts, which are almost clean and silent), so we have a controller that runs an OS command via an Erlang port and streams the command's output back to the client, roughly as sketched below.
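
The idea is something like this (a simplified sketch, not our actual code; the action name and script name are just placeholders):

upgrade('GET', [], _) ->
    %% run the installation script through a port, capturing stderr too
    Port = erlang:open_port({spawn, "sh ./install_patch.sh"},
                            [binary, exit_status, stderr_to_stdout]),
    StreamGen = fun (P) ->
                        receive
                            {P, {data, Chunk}} ->
                                %% forward each chunk of command output as it arrives
                                {output, Chunk, P};
                            {P, {exit_status, _Code}} ->
                                done
                        end
                end,
    {stream, StreamGen, Port, [{"Content-Type", "text/plain"}]}.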

This worked perfectly whenever we had an on-site deployment with no HTTP proxy, but on the "hosting" side there's no way to live without an HTTP proxy... (which was originally Apache, but because of WebSockets and other reasons, we've switched to Nginx).

So the base facts are:
- I think headers are sent correctly
- Nginx has no direct access to the CB's filesystem, even if the streamed content was a file (it is not, in this case)
- we cannot simply follow the CB upstream since we have some custom patches (not just in CB, but also in several dependency projects). Some of them cannot be merged upstream for multiple reasons... It could be fixed, but that would require some time, which is something I don't have at the moment :-(

Tom