Hi All:
It seems to me that there is a problem with the Max-Connections test.
I tried it on both desktop Chrome and FF recently, and this is what I
found:
As you can see in the JS code excerpt below, the test GETs 2 images
from each of five different domains, but then it requests the same
images from the same domains again. The browser recognizes this and
serves the repeats straight from its own cache. On desktop Chrome and
FF I don't even see conditional GETs for these images, because the
server sets max-age and Expires headers well into the future. In both
Chrome and FF I see a total of 10 HTTP GETs issued by the browser (2
images x 5 domains) and no subsequent GET requests at all.
I think either the code is buggy, or the server config is wrong [at
the very least it should set a no-store header and hope the browser
obeys].
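For the server-side fix, the resource.cgi responses would need to carry
something like the following (a sketch of the relevant headers, not the
actual server config, which I haven't seen):

```http
Cache-Control: no-store
Pragma: no-cache
```

With no-store the browser is told not to cache the response at all, so
every repeated request should actually go out on the wire.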
// Done w/ Javascript to prevent bots from hitting our servers hard.
var imgC = document.getElementById('img-c');
var img = document.createElement('img');
for (var i = 1; i <= 30; i++) {
  var addr = (i % 5) + 1; // aka number of resource-cgi backends -1
  for (var j = 1; j <= 2; j++) {
    var clone = img.cloneNode(true);
    clone.id = 'img' + i + '-' + j;
    clone.onload = countImage;
    clone.src = 'http://' + addr + '.cgi.browserscope.net/cgi-bin/' +
        'resource.cgi?type=gif&sleep=6&n=' + j + '&t=1320862013';
    imgC.appendChild(clone);
  }
}
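Alternatively, the client-side code could defeat the cache itself by
making every URL unique. A minimal sketch of that idea, building the
same 60 URLs as the excerpt above but with a per-request cache-buster
(the `cb` parameter name is my own invention, not part of the test):

```javascript
// Build the 60 image URLs the test intends to fetch, appending a
// unique cb=i-j query parameter to each so the browser cannot satisfy
// any of them from its cache.
function buildUrls() {
  var urls = [];
  for (var i = 1; i <= 30; i++) {
    var addr = (i % 5) + 1; // cycles through backends 1..5
    for (var j = 1; j <= 2; j++) {
      urls.push('http://' + addr + '.cgi.browserscope.net/cgi-bin/' +
          'resource.cgi?type=gif&sleep=6&n=' + j +
          '&t=1320862013&cb=' + i + '-' + j);
    }
  }
  return urls;
}
```

Since every URL is now distinct, all 60 GETs should hit the network,
which is presumably what a max-connections test wants to measure.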