Huge JSON data in response and the Content-Length header

snillo

Oct 14, 2009, 3:28:32 AM
to MooTools Users
Hi,

For testing I made the following script:

PHP:
$output = array();
for ($i = 0; $i < 8000; $i++) {
    $output[$i] = "testString";
}
$resp = json_encode($output);
header('Expires: Mon, 26 Jul 1990 05:00:00 GMT');
header('Last-Modified: ' . gmdate("D, d M Y H:i:s") . ' GMT');
header('Cache-Control: no-cache, must-revalidate');
header('Pragma: no-cache');
header('Content-Length: ' . strlen($resp)); // HERE IS THE PROBLEM
header('Content-Type: application/json');
print($resp);

Now if I make a Request.JSON call in MooTools, the response is not
fully loaded: some characters are missing and the JSON string cannot
be decoded on the JS side. This only happens with huge data like in
the example above. If I remove the line
header('Content-Length: '.strlen($resp)); on the PHP side, everything
works fine, so the problem must be there.
How do you set the Content-Length header, or is that header perhaps
not necessary, so that MooTools also works without it?

Thanks for any advice,
Georg

Jan Kassens

Oct 14, 2009, 3:55:55 AM
to mootool...@googlegroups.com
Did you check the actual response and the string response in the
onSuccess handler of Request in Firebug? It shouldn't be necessary to
set the Content-Length header by hand.
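
For example, something along these lines (the URL is just a
placeholder for your script) logs both the decoded result and the raw
string it was built from:

new Request.JSON({
    url: '/index.php?eID=myAjax', // placeholder URL
    onSuccess: function(json, text) {
        // json is the decoded object, text is the raw response string
        console.log('received ' + text.length + ' characters');
        console.log(json);
    },
    onFailure: function(xhr) {
        console.log('request failed, HTTP status ' + xhr.status);
    }
}).send();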

Sent from my iPhone

Sanford Whiteman

Oct 14, 2009, 4:16:19 AM
to Jan Kassens
> It shouldn't be necessary to set the Content-Length header by hand.

Mmm, actually it is necessary unless there is a global handler active.

-- Sandy

Sanford Whiteman

Oct 14, 2009, 4:17:29 AM
to snillo
> How do you set the Content-Length header, or is that header perhaps
> not necessary, so that MooTools also works without it?

C-L is not required.

I would hazard a guess that the problem has to do with content
compression, i.e. you set the plaintext length and then something
subsequently gzips the content without updating the header, or
something like that. So you end up with a mismatch. Without the
header, there's nothing to mismatch.
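
One rough way to eyeball such a mismatch from the JS side (the URL
here is hypothetical) is to compare the advertised length with what
actually arrived:

new Request.JSON({
    url: '/data.php', // hypothetical endpoint
    onSuccess: function(json, text) {
        // this.xhr is the underlying XMLHttpRequest object
        var advertised = this.xhr.getResponseHeader('Content-Length');
        console.log('Content-Length: ' + advertised +
                    ', characters received: ' + text.length);
    }
}).send();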

-- Sandy

snillo

Oct 14, 2009, 7:19:21 AM
to MooTools Users
Hi,

Thanks for answering quickly. If C-L isn't required, I'll get rid of
it. Still, it's strange, because I don't do any gzipping or other
content compression. I also checked the response in the Firebug
console and, as I said before, I can see there that not the whole JSON
string is sent.

greetings,
Georg

samdev

Oct 14, 2009, 8:54:55 AM
to mootool...@googlegroups.com
Firebug may already be showing you the unpacked string.

I would not specify the length by hand unless the connection times out
while sending back a very long string, and even in that case I would
rather rethink the design than tweak headers. Usually 30 seconds is
more than enough (especially with modern connection speeds) to request
and receive a response from the server over an asynchronous connection
with JSON, XML, HTML or whatever. I personally prefer to stick with
JSON because it is compact data that can be expanded into HTML or XML
in the browser with MooTools.
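
For example, a small sketch of expanding a JSON list of strings into
HTML (the endpoint and the element id are made up):

new Request.JSON({
    url: '/list.php?t=' + $time(), // made-up endpoint returning ["a", "b", ...]
    onSuccess: function(items) {
        var list = $('output'); // made-up <ul id="output"> already in the page
        items.each(function(text) {
            new Element('li', { text: text }).inject(list);
        });
    }
}).send();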

Do you really have to transmit such a long list of strings?

Jan Kassens

Oct 14, 2009, 9:37:01 AM
to mootool...@googlegroups.com
If Firebug is already showing a wrong result, it's at least not a bug
in MooTools but, I suppose, somewhere in your setup (such a bug would
already have been spotted if it were caused by Firefox).

Jan

--
Jan - MooTools committer
twitter/blog: http://kassens.net

snillo

Oct 14, 2009, 11:33:52 AM
to MooTools Users
OK, now I found the mistake. I use Typo3, and the PHP script which
produces the response is called via the eID GET parameter in Typo3 (I
don't know whether any of you know Typo3). It looks like Typo3 writes
something to the output buffer, because if I change the PHP script
to:

PHP:
$output = json_encode($outputArr); // $outputArr is built dynamically earlier
header('Expires: Mon, 26 Jul 1990 05:00:00 GMT');
header('Last-Modified: ' . gmdate("D, d M Y H:i:s") . ' GMT');
header('Cache-Control: no-cache, must-revalidate');
header('Pragma: no-cache');
header('Content-Length: ' . strlen($output));
header('Content-Type: application/json');
ob_clean(); // THIS DOES THE TRICK: drop whatever is already in the output buffer
flush();
print $output;

then everything works as expected. I don't know exactly what
ob_clean() does or how Typo3 can write to the output buffer before I
print the response, but that's another story.

Samdev, the response is generated dynamically, and in some cases I
might have huge data to transmit. So would you recommend setting the
Content-Length header? If I specify the Content-Length, is the request
then never timed out, or is there another way to influence the
time-out of the request?

greetings,
Georg

samdev

Oct 14, 2009, 12:34:27 PM
to mootool...@googlegroups.com
OK, my concerns about your code would be:

1. Call header(...) in PHP before any other functions (another example
is session_start()). Otherwise functions, especially DB ones, may
produce error or warning messages that get output before the headers
and thus break the data transmission. If header() cannot be called
first, suppress warnings and error messages in PHP with the @ prefix
on the function call, for example:

$output = @json_encode($outputArr);

2. Why do you care about the page timeout? Is the user connected via a
modem at an old connection speed of a few kB/s? What's the point? Do
not transmit large data. Break it into smaller logical pieces and make
several requests instead (see the sketch after this list). Examples:

- Log: instead of transmitting the whole DB, have the server return
only the entry headers, and let each entry download its body with its
own request, because the body may be a long text.
- Users list: the same idea as with the log: return basic data, say a
users list with IDs only, and then request each user separately.

Such an approach will work faster than requesting one huge block of
data and then waiting five minutes to get it back from the server.
After all, it requires less neat programming.

3. I also write pages using MooTools Requests and honestly have never
had problems with C-L. The whole idea of Requests (and AJAX in the
first place) is that the browser updates SMALL parts of a page;
otherwise you might as well refresh the whole page. That means: do not
send long contents. There are always at least two different ways to
implement the same concept in programming. Redesign your code and
you'll see that setting headers explicitly is not necessary, except in
extremely specific cases such as generating an image on the fly.
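
As a rough sketch of point 2, first fetch only a lightweight list of
entries and then fetch a single (possibly long) body on demand. All
URLs, ids and field names below are made up:

// fetch only the lightweight list of entries (ids plus header data)
new Request.JSON({
    url: '/log/headers.php', // made-up endpoint
    onSuccess: function(entries) {
        entries.each(function(entry) {
            // assumes an element with id 'entry_<id>' exists per entry
            $('entry_' + entry.id).addEvent('click', function() {
                // fetch the long body only when it is actually needed
                new Request.JSON({
                    url: '/log/body.php?id=' + entry.id, // made-up endpoint
                    onSuccess: function(body) {
                        $('entry_' + entry.id).set('html', body.text);
                    }
                }).send();
            });
        });
    }
}).send();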

By the way, a nice way to suppress caching by the browser is to add
the time to the URL, e.g.:

new Request.JSON({
    url: './pages/get/data.php?t=' + $time(),
    ...
}).send();

$time() has millisecond precision, and there is no way your code is
going to make two requests at exactly the same moment.

Thierry bela nanga

Oct 14, 2009, 12:55:32 PM
to mootool...@googlegroups.com
That's not exactly true.

You can prevent anything from being sent by calling ob_start() in PHP
at the beginning of your script.

Afterwards you can call ob_clean() to delete any previous output, and
finally you just echo what you want.
--
http://tbela99.blogspot.com/

fax : (+33) 08 26 51 94 51

Thierry bela nanga

Oct 14, 2009, 12:59:23 PM
to mootool...@googlegroups.com
I meant the explanation about header(). You could do this:

<?php

ob_start();

echo 'string';

header(...);

?>

without problem.

snillo

Oct 14, 2009, 12:59:47 PM
to MooTools Users
Thanks for your advice. I understand that it is not good to transmit
large data, but in some cases it is necessary. In my case I generate
images on the server side dynamically. It can take a long time
(1-? sec, it depends) until the PHP script finishes, so I'm asking
myself whether there is a timeout for AJAX requests and how I can
control that.
Unfortunately I can't call header() before any other functions,
because I'm using Typo3 and can only hook into it.

samdev

Oct 14, 2009, 1:32:14 PM
to mootool...@googlegroups.com
I think you are missing the point in this case.

AJAX, MooTools Requests and asynchronous requests in general are NOT
made or designed for images. They are made for HTML, XML, JSON and
TEXT replies, but NOT images! Let the browser handle loading the image
itself, and attach a handler to the load event of the image that is
generated dynamically on the server. So, do something like this in
your code:

var _newImage = new Element('img', {
    src: './images/dynamic_photo.php?t=' + $time() + '&params',
    events: { load: _onLoadImage }
});

I am sorry for possible incorrect syntax in the image element; it is
only intended to give an idea. Once the image has been downloaded by
the browser, your _onLoadImage function gets called.
The browser automatically loads the image and waits until it has been
completely transmitted. In your case you may need to specify the
content length (I'm not sure about that), but you definitely need to
set the content type in the response header.
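
To complete the picture, the load handler could then put the finished
image into the page; the 'preview' container id below is made up:

function _onLoadImage() {
    // inside a MooTools element event handler, 'this' is the element,
    // so here it is the finished <img>
    this.inject($('preview')); // made-up container element
}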

Anyway, I do not understand why a Request is needed for this at all.
By the way, Safari 4 has extremely friendly developer tools that you
may wish to use to see all requests and responses to/from the server.

snillo

Oct 14, 2009, 2:24:01 PM
to MooTools Users
That would be a good way of doing it. I hope you didn't misunderstand
me: I don't transmit the image via AJAX, I just create it on the
server side and give the path to the image back in the AJAX response.
I think there is no problem doing it directly with an AJAX request.
Normally it takes about 1-4 sec to generate the image on the server
side, but it can also take 10 sec or more, depending on what the
surfer wants to draw. But the surfer has to wait just as long if I let
the browser load the image via a PHP file. So is there a timeout for
the response with Request.JSON? I've never hit one, so I don't think
there is.
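
As far as I can tell there is no built-in timeout option to set on
Request.JSON here, but a rough do-it-yourself sketch (the 30-second
figure, the URL and the json.path field are all made up) is to cancel
the request by hand after a while:

var req = new Request.JSON({
    url: '/draw_image.php', // made-up script that generates the image
    onSuccess: function(json) {
        // json.path would hold the path to the generated image (made up)
        console.log(json.path);
    },
    onComplete: function() {
        clearTimeout(timer); // the request finished, so stop the watchdog
    }
});
req.send();

// give up after 30 seconds instead of waiting forever
var timer = setTimeout(function() {
    req.cancel(); // cancel() does nothing if the request already finished
}, 30000);

onComplete fires after success or failure, so the timer is cleared as
soon as the request actually finishes.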