
Streaming STDOUT to client, 4GB limit


Gary W

Jun 14, 2006, 1:05:45 PM
I have a process that streams text to STDOUT, which Apache 2.0
passes on to the web client. It fails in various ways after a few GB.
I use a fictional MIME type to give the user a chance to save it
to a named file. I want to be able to send them the whole thing,
100GB, at once. Is this something I can fix in Apache with
-D_LARGEFILE_SOURCE -D_FILE_OFFSET_BITS=64 ?
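
[For reference, the fictional-MIME-type trick comes down to the two
headers the CGI emits before the body. A minimal sketch in Python;
the type name and filename are hypothetical stand-ins, not anything
from this thread:]

#!/usr/bin/env python
# Minimal CGI preamble: an unrecognized Content-Type plus a
# Content-Disposition header makes the browser offer a "save as"
# dialog instead of rendering the stream inline.
import sys

out = sys.stdout.buffer
out.write(b"Content-Type: application/x-crawl-dump\r\n")    # made-up type
out.write(b'Content-Disposition: attachment; filename="crawl.txt"\r\n')
out.write(b"\r\n")                                          # blank line ends the headers
out.flush()
# ...the multi-GB body gets streamed here...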

W2K Mozilla 1.7b - removed the ?.dat file as advised;
says there is no room, but there is
W2K Firefox - stops progressing after a while; no file is left
W2K IE6 - saves 3.99GB, then stays at "saving..."
RedHat Mozilla 1.7.8 - "file could not be saved"

How do I do this?

Gary Wesley
--
"Play it once and it's wrong. Play it twice and it's right".
-- Ornette Coleman

http://InfoLab.Stanford.EDU/~gary/



Andreas Micheler

Jun 14, 2006, 4:25:48 PM
Gary W wrote:

> I have a process that streams text to STDOUT, which Apache 2.0
> passes on to the web client. It fails in various ways after a few GB.
> [snip]
> How do I do this?

I'm not an expert, but can't you just write the file to the server's
disk and then let the user download it like any other file?
You could use a lookup table for the files if you have several users.
And to delete the temporary files you could write a little daemon
that looks at the temp directory from time to time and removes files
older than some cutoff; see the sketch below.
I don't think there's a 2 or 4 GB file size limit on the server, is there?
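
[A minimal sketch of that cleanup daemon; the directory, cutoff, and
polling interval below are made-up values, not anything from the thread:]

#!/usr/bin/env python
# Sketch of the cleanup-daemon idea: periodically scan a temp
# directory and delete files older than a cutoff.
import os
import time

TEMP_DIR = "/var/tmp/crawl-exports"   # hypothetical temp directory
MAX_AGE = 24 * 60 * 60                # delete files older than one day
POLL = 15 * 60                        # check every 15 minutes

while True:
    now = time.time()
    for name in os.listdir(TEMP_DIR):
        path = os.path.join(TEMP_DIR, name)
        # only remove regular files whose mtime is past the cutoff
        if os.path.isfile(path) and now - os.path.getmtime(path) > MAX_AGE:
            os.remove(path)
    time.sleep(POLL)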

Cheers,
Andreas

Gary W

Jun 15, 2006, 1:13:26 PM
Andreas,

Thanks, but there is not that much space on the server disk.
Some of these crawls are 0.5TB compressed, and in general
several users can be asking for different crawls at the same time.
The process actually reads and decompresses a whole file tree of
many hundreds of files that make up a stored web crawl.
We have dozens of these crawls available.
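
[For what it's worth, a rough sketch of what such a streaming process
could look like, assuming the crawl members are gzip files under a
single root; the path, layout, and compression scheme are assumptions,
not what Stanford actually uses:]

#!/usr/bin/env python
# Sketch of the streaming side: walk the crawl's file tree, decompress
# each member, and pipe it to the client in small chunks, so the full
# 100GB never has to be staged on the server disk.
import gzip
import os
import sys

CRAWL_DIR = "/data/crawls/example"   # hypothetical crawl root

out = sys.stdout.buffer
out.write(b"Content-Type: application/x-crawl-dump\r\n\r\n")  # same header trick as above

for root, dirs, files in os.walk(CRAWL_DIR):
    for name in sorted(files):
        if not name.endswith(".gz"):
            continue
        with gzip.open(os.path.join(root, name), "rb") as f:
            while True:
                chunk = f.read(64 * 1024)   # 64 KB at a time
                if not chunk:
                    break
                out.write(chunk)
out.flush()

[Only one chunk is ever held in memory, which is the point: the stream
goes straight from the compressed tree to the client.]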

Gary


--
When you're through changing, you're through.
-- Bruce Barton

http://InfoLab.Stanford.EDU/~gary/

Gary W

Jun 27, 2006, 4:49:59 PM
It turns out to be a browser limitation.
Perhaps I could use FTP instead of HTTP
to transmit the stream from inside the browser?

Gary
