Making browsers faster: Resource Packages


Alexander Limi

Nov 16, 2009, 10:01:14 PM
to ar...@mozilla.com, bliz...@mozilla.com
Hi dev.platform,

I have published the first public draft on Resource Packages, and
would like to hear your feedback:

http://limi.net/articles/resource-packages/

Summary:
What if there was a backwards compatible way to transfer all of the
resources that are used on every single page in your site — CSS, JS,
images, anything else — in a single HTTP request at the start of the
first visit to the page? This is what Resource Package support in
browsers will let you do.
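For reference, the markup involved is minimal. Here is a rough sketch with illustrative filenames (the draft at the URL above has the authoritative syntax):

```html
<!-- Sketch only: filenames are made up. The package is declared once;
     browsers without support ignore the link and fetch files as usual. -->
<head>
  <link rel="resource-package" href="site-resources.zip">
  <link rel="stylesheet" href="styles/main.css">
  <script src="javascript/jquery.js"></script>
</head>
```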

darkyndy

Nov 16, 2009, 11:52:15 PM

It's a very good idea and it will help speed up loading times.
You must take into consideration that the .zip file can contain a virus
or other vulnerabilities, so please keep that in mind.

I hope to see it in Firefox 3.7.

Ryan Doherty

Nov 17, 2009, 2:51:04 AM
Overall I really like the idea. We're pretty much heading in this
direction anyway (1 large CSS file, 1 large JS file + sprites).

One addition I'd suggest is a way to have multiple manifest files.
Currently, a way to speed up websites/pages is to load only what is
required for the current page and post-load the rest. For example, a
homepage would load only the manifest file it needs so it loads
faster, then post-load another manifest file. Or another page could
load a different manifest file later on. Considering some sites are
huge, their manifest files could get far too large for most internet
connections.

And making sure browsers can pull resources out of the manifest file
*before* it has completed downloading should be a requirement.
Currently browsers deal with missing/incomplete files as gracefully as
they can, I'd like that graceful degradation to continue.

The ordering of assets should be CSS, JS then images inside the
manifest file.

Might even want a way to notate if the JS should be parsed after the
page is ready or not (defer). (Equivalent of putting it at the bottom
of the page)

Good stuff!

Magne Andersson

Nov 17, 2009, 3:25:05 AM
(I posted this on Bugzilla first, then realized the discussion might
belong here instead.)

I like this proposal and I hope that more browsers implement it soon.
But,
let's say that you have a partial ZIP file, where a CSS file is
referenced on
the page itself through a <link> tag, but does not exist in the ZIP.
Will this
be read and applied separately? Will this happen after everything from
the ZIP
file has been applied?

Or, let's say a file inside the ZIP was corrupt (or would that make
the whole
ZIP corrupt? What happens then?). If the browser can notice this, will
it try
to read the file referenced in the page instead?

And, a quite silly question, but to be ready for all situations, are
multiple resource packages allowed on the same page? (While it would
be quite dumb to do it, I don't see why it shouldn't be allowed)

Mook

Nov 17, 2009, 3:40:31 AM
Alexander Limi wrote:
> Hi dev.platform,
>
> I have published the first public draft on Resource Packages, and
> would like to hear your feedback:
>
> http://limi.net/articles/resource-packages/
>

Interesting spec. Sorry if some of these questions look familiar; I
recall discussion but not any of the answers :( Here's a collection of
mostly unorganized questions:

How would this deal with CDNs / files hosted on a completely different
domain? Just have the href= be an absolute URL instead? (Zip files don't
allow directory entries like ../../../file.html do they? Appnote.txt
doesn't seem to say anything about that...) Would this lead to more
GIFAR-like exploits?

Appnote.txt specifies that file names all come with separate size counts
(i.e. they _can_ contain null bytes); it'll be fun to make sure that
doesn't break things. But the jar: protocol handler probably does it
right already :)

Would the content-type magic guessing do interesting things? I guess it
would finally allow XHTML 1.1 transitional to be parsed with an XML
parser (since I'd imagine IE to not understand it, and parse it with a
HTML parser - especially if it ignores the resource package completely).

I'm having trouble understanding the third bullet about charsets in
"Additional notes". Zip files (with extra field 0x7075, UPath) can
contain UTF8 file names; what are the charsets here referring to? The
textual content of plain text files?

Are multiple <link rel="resource-package"> elements allowed per page
(say, to have different caching policies)? If those elements end up
containing the same resource, who wins?

How would that affect caching policy, for both requests from the current
host, and for requests from actual documents hosted on the CDN host?
What happens if the zip file has a different cache header than the
normal, unpackaged resource? Possibly the first one to be used wins, I
guess.

Is this straight up cache injection? If I include a HTML file, then
have a link to where the HTML file maps to, does the browser not need to
make a request?

I wish something other than .zip was as ubiquitous, since having a
format with a central directory at the end of the file sucks, especially
given that normal zip archives don't expose the order of files within an
archive to the user at all. They're quite free to tack the new file at
the end instead. That and the fact that pathnames are separated with a
DOS-style backslash; I vaguely recall some problems with reading zip
files with forward slashes for jar: in Firefox at some point.

(Clarification request: who is "we" in the context of the blog post?)

--
Mook

Magne Andersson

Nov 17, 2009, 3:40:24 AM
I also have a suggestion. We know that some files are more important
than others on some sites, this is especially true on large sites
loaded on slow connections. How about a way to set priority to files
inside manifest.txt?

From your example:

PRIORITY: 1
styles/reset.css
styles/grid.css
styles/main.css
PRIORITY: 2
javascript/jquery.js
PRIORITY: 3
images/save.png
images/info.png

This would make sure the style sheets finish loading before the
javascript, and after that comes the images. This would be completely
optional and wouldn't matter on fast connections unless the site is
huge. Still, it could be a large (perceived) performance win on slow
connections as what is required for the site to function is loaded
first.
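Magne's PRIORITY syntax is hypothetical (proposed only in this thread, not in the draft spec), but a sketch of how a browser might parse it into an ordered download list:

```python
# Sketch of a parser for the hypothetical PRIORITY manifest syntax above.
def parse_priority_manifest(text):
    """Group entries by their PRIORITY section, then flatten in order."""
    groups = {}
    current = 1  # entries before any PRIORITY line default to 1
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.upper().startswith("PRIORITY:"):
            current = int(line.split(":", 1)[1])
        else:
            groups.setdefault(current, []).append(line)
    return [name for p in sorted(groups) for name in groups[p]]

manifest = """\
PRIORITY: 1
styles/reset.css
PRIORITY: 3
images/save.png
PRIORITY: 2
javascript/jquery.js
"""
# Styles first, then scripts, then images, regardless of listing order:
print(parse_priority_manifest(manifest))
```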

Christian

Nov 17, 2009, 3:55:10 AM
Hi everybody,

those Resource Packages are a very interesting idea!
I like your priority idea Magne, it's quite simple and might get rid
of the sorting problems within the zip files.

But there's one issue, which darkyndy already addressed in the first
comment: viruses.
How do you think this should be handled? With such a ZIP file being
downloaded, almost anything could come along with it. I guess if you
first had to perform a virus check with the antivirus tool of each
user's choice, it would eat up all the speed advantage gained.
This client-side virus check should be considered anyway. Wouldn't
that ZIP file be picked up by any of the popular antivirus programs
just like any other download, and therefore be checked? Might that
become a problem? Especially on larger websites, where the ZIP file
might be considerably larger than usual.
Maybe the "multiple manifests" idea mentioned above could help here?

Of course there could also be a server-side check of the ZIP file...

Looking forward to seeing this come to life!
greetz
Christian

semper.cras

Nov 17, 2009, 4:13:31 AM
The priority idea from Magne (though maybe not the syntax) is what your
proposal really lacks. While it is possible to put files into a zip
archive in an arbitrary order, which a browser could treat as a
priority, I don't think that's a transparent and easy operation with
all zipping tools.

As for viruses, I don't quite see how this is more dangerous than
downloading separate JS files. Is it realistic for a virus in this zip
to auto-execute after downloading?

Magne Andersson

Nov 17, 2009, 4:33:13 AM

I agree with you that the syntax is a bit odd, and I'd like some
feedback/suggestions for a better syntax.

Gervase Markham

Nov 17, 2009, 4:35:44 AM
On 17/11/09 07:51, Ryan Doherty wrote:
> An addition I'd add would be a way to have multiple manifest files.

You mean multiple resource packages? The spec already supports that.

> load a different manifest file later on. Considering some sites are
> huge, their manifest files could get far too large for most internet
> connections.

The manifest file only contains a list of items in the zip file. If the
manifest file is enormous, then the zip file will be 20x as big. I think
you might be getting confused between manifest files and resource packages.

> And making sure browsers can pull resources out of the manifest file
> *before* it has completed downloading should be a requirement.

This is also in the spec.

> The ordering of assets should be CSS, JS then images inside the
> manifest file.

The spec doesn't mandate an order; that's up to the site designer.

> Might even want a way to notate if the JS should be parsed after the
> page is ready or not (defer). (Equivalent of putting it at the bottom
> of the page)

That notation will be on the <script> tag referring to the resource.
Resource packages just give an alternate way to obtain the resource.

Gerv

Gervase Markham

Nov 17, 2009, 4:37:15 AM
On 17/11/09 08:25, Magne Andersson wrote:
> (I posted this on Bugzilla first, realized the discussion might belong
> here instead.)
>
> I like this proposal and I hope that more browsers implement it soon.
> But,
> let's say that you have a partial ZIP file, where a CSS file is
> referenced on
> the page itself through a<link> tag, but does not exist in the ZIP.
> Will this
> be read and applied separately?

Yes; the purpose of manifest files is to let the browser know quickly
which files are in the ZIP and which are not. Files not in the ZIP will
be downloaded as normal, in parallel, just like any other request.

> Or, let's say a file inside the ZIP was corrupt (or would that make
> the whole
> ZIP corrupt? What happens then?). If the browser can notice this, will
> it try
> to read the file referenced in the page instead?

Probably not, no. How would you detect "corrupt"? You might be able to
in some cases, but then surely the fix is to fix the zip file.

> And, a quite silly question, but to be ready for all situations, are
> multiple resource packages allowed on the same page? (While it would
> be quite dumb to do it, I don't see why it shouldn't be allowed)

It wouldn't be at all dumb, and it's certainly allowed.

Gerv

Gervase Markham

Nov 17, 2009, 4:37:49 AM
On 17/11/09 08:40, Magne Andersson wrote:
> I also have a suggestion. We know that some files are more important
> than others on some sites, this is especially true on large sites
> loaded on slow connections. How about a way to set priority to files
> inside manifest.txt?

ZIP files contain files in a specific order; put the most important
files first, and they will be downloaded and used first.
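This can be seen with Python's zipfile module, used here purely as an illustration: entries are stored, listed, and streamed in exactly the order they were written.

```python
import io
import zipfile

# Write a package with the manifest first, then the "important" files.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("manifest.txt", "styles/main.css\njavascript/app.js\n")
    z.writestr("styles/main.css", "body { margin: 0 }")
    z.writestr("javascript/app.js", "console.log('hi');")

# Reading it back preserves that order.
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as z:
    names = [info.filename for info in z.infolist()]
print(names)
```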

Gerv

Gervase Markham

Nov 17, 2009, 4:38:52 AM
On 17/11/09 08:55, Christian wrote:
> But there's on issue, which Darkyndy already adressed in the first
> comment: Viruses.

Resource packages don't introduce any new virus issues that I can see.
The browser is not going to be executing any native code downloaded
without asking the user.

> How do you think this should be adressed? With such a ZIP File to be
> downloaded almost anything could come along with it.

But that is also true of an <a href="...">.

Gerv

Gervase Markham

Nov 17, 2009, 4:41:43 AM
On 17/11/09 08:40, Mook wrote:
> How would this deal with CDNs / files hosted on a completely different
> domain?

The CDN could distribute the ZIP file.

> I'm having trouble understanding the third bullet about charsets in
> "Additional notes". Zip files (with extra field 0x7075, UPath) can
> contain UTF8 file names; what are the charsets here referring to? The
> textual content of plain text files?

Yes.

> Are multiple <link rel="resource-package"> elements allowed per page
> (say, to have different caching policies)? If those elements end up
> containing the same resource, who wins?

Yes. But it's a good question as to which wins if they both contain the
same resource name but different resource data. I'd say the behaviour is
undefined (i.e. "don't do that").

> How would that affect caching policy, for both requests from the current
> host, and for requests from actual documents hosted on the CDN host?
> What happens if the zip file has a different cache header than the
> normal, unpackaged resource? Possibly the first one to be used wins, I
> guess.

The normal unpackaged resource isn't downloaded if the resource is in
the ZIP file. So the ZIP file's caching headers govern all the resources
in it.

Gerv

Magne Andersson

Nov 17, 2009, 4:43:42 AM

Will all "proper" ZIP programs keep them in that order? I mean, files
in a regular folder can easily be sorted by name, date of creation, or
type.

Mariusz Nowak

Nov 17, 2009, 4:45:05 AM

I think this is the real solution, and CSS sprites are just a
workaround that was developed because something like this was missing.
But (!) if there could be a solution to this problem at the HTTP level,
I'd rather focus on that; let's cut it off at the source.

Magne Andersson

Nov 17, 2009, 4:47:07 AM
> It wouldn't be at all dumb, and it's certainly allowed.
>
> Gerv

What I meant by that was that the goal of this proposal was to reduce
the number of requests. By having multiple packages, you add at least
one more request without any reason.

Ictinus

Nov 17, 2009, 4:55:45 AM
I think I'd rather not have to unzip/edit/re-zip every time I modify a
resource (oops, someone forgot to zip!), especially when the resources
in question are not maintained in a file system.
E.g. IBM/Lotus Domino holds its resources within another file
structure (.nsf).
I would much prefer to have the server dish up a single (cached)
gzipped, minified resource to the browser.
See www.dominoexperts.com/dapInfo

Magne Andersson

Nov 17, 2009, 5:00:26 AM

Well, obviously, you wouldn't include a frequently edited resource in
the ZIP, but most sites keep their resources static for quite a long
time.

Sander

Nov 17, 2009, 5:14:18 AM
Gervase Markham wrote:
> On 17/11/09 08:40, Mook wrote:
>> Are multiple <link rel="resource-package"> elements allowed per page
>> (say, to have different caching policies)? If those elements end up
>> containing the same resource, who wins?
>
> Yes. But it's a good question as to which wins if they both contain the
> same resource name but different resource data. I'd say the behaviour is
> undefined (i.e. "don't do that").

I'd say this really _should_ be defined clearly before people start to
rely on any particular (perceived) behaviour. I thought of several
strategies where I'd have a large static "main" resource package, but
would occasionally want to be able to override a file from it on a
case-by-case basis with a different version of that file specified in a
second resource package. It'd be hacky, but it'd also be really useful.

But then I realized that this'd mean that content from later resource
packages could potentially override content from earlier resource
packages, which would mean that the browser would have to wait for all
resource packages to be downloaded before it'd be capable of doing
certain things, which'd be a bad thing. (The same situation exists in
reverse if a second small resource package finishes loading before a
first large resource package, containing the same resource, though. If
only speccing "whichever file gets seen first by the browser" wouldn't
be so indeterminate...)


In general, the entire capability of defining multiple resource
packages probably deserves a more explicit mention in the spec than
being tucked away inside an "additional note" about headers. Splitting
up available resources into multiple packages was the first thing I (a
working web developer) wanted to ask about after reading the spec. On
any non-trivial site I'll want to split up my resources into two or
three resource packages per page (one global and site-wide, cached
forever; one high-priority, maybe reusable per section; and one with
whatever is left over and specific to the page).

Sander

Ictinus

Nov 17, 2009, 5:17:32 AM

I agree. Perhaps a ZIP file would be useful in the short term, but a
server-managed solution long term would allow all resources to get the
performance benefit.
Not having to manage the zip file makes a developer's life easier.
Having it server-side means more benefit. After all, we already do
gzip server-side.

semper.cras

Nov 17, 2009, 5:22:02 AM

Basic ZIP programs do NOT let the user (re-)sort files in an archive
manually (I tried GNOME's File Roller and the basic Windows XP
archiver). So the only way that will probably work is to create the
archive and then add files to it one by one, which is by no means
elegant or transparent. Moreover, even after this, one cannot be sure
that the order is right, because there's no obvious way to check it.

And now imagine that I want to change the order of two images in a zip
file with, say, 20 files. Do I have to unzip it and then zip all the
files one by one to make them go in the needed order?

So, there's a problem. If we set a priority in the manifest file and
the real order (in the zip file) is different, we can lose the
advantage of not waiting for the zip to download completely. Maybe a
partial solution would be promoting/creating handy tools for managing
file order in an archive?

Magne Andersson

Nov 17, 2009, 5:24:26 AM

I don't think you understood what I meant. I said that this solution
is great for resources that don't update frequently. It is also
designed to eliminate requests; gzip doesn't do that, it just makes
the files smaller.

Magne Andersson

Nov 17, 2009, 5:27:39 AM

Or ignore the order of the files in the ZIP if priority is already
specified in the manifest file.

Duncan

Nov 17, 2009, 5:45:25 AM
On Nov 17, 10:22 am, "semper.cras" <semper.c...@gmail.com> wrote:
> Basic ZIP programs do NOT let the user to (re-)sort files in archive
> manually (tried Gnome's File Roller and basic WinXP archivator). So
> the only way that'll probably work is to create archive and than add
> files there one by one, which is by no means elegant and transparent.
> Moreover, even after this one cannot be sure that the order is right,
> because there's no obvious way to check it.

Most command-line zip programs (e.g. Info-ZIP) will read a list of
files and zip them in the order specified. So something like:

zip resource.zip manifest.txt -@ <manifest.txt

should be all you need to create a resource file with manifest.txt
first, followed by all the files listed in manifest.txt in the order
they appear. That's both elegant and transparent.

On another note, what I like about this proposal is that you can
potentially speed up browsing even for old, broken browsers by adding
support for resource packages to a proxy server. So everyone wins.

Ictinus

Nov 17, 2009, 5:56:08 AM

No, I understand, and I agree (short term), but I believe a better
long-term solution is one where we don't need to do the extra work
(maintain separate zip files), no matter how infrequently.
A server-generated/managed single resource (combined JS/CSS/images),
minified (where appropriate), gzipped, and with appropriate headers to
allow caching (based on decisions similar to those described in the
ZIP suggestion above), would provide the same performance benefits,
but without the developer needing to do the extra work (i.e. let the
server do the work).
Perhaps the 'manifest' idea could be the actual request, and the
server would read the manifest and build the single combined resource.
If the manifest is requested frequently, the resulting single resource
can be cached and not even re-built.

Joris

Nov 17, 2009, 7:10:57 AM
On Nov 17, 4:01 am, Alexander Limi <l...@gmail.com> wrote:
> Hi dev.platform,
>
> I have published the first public draft on Resource Packages, and
> would like to hear your feedback:
>
> http://limi.net/articles/resource-packages/
>
> Summary:
> What if there was a backwards compatible way to transfer all of the
> resources that are used on every single page in your site — CSS, JS,
> images, anything else — in a single HTTP request at the start of the
> first visit to the page? This is what Resource Package support in
> browsers will let you do.

This could be taken one step further: allow direct linking to a file
in such a package (alongside the current proposal).
So I propose that support is added for URLs like these examples:
http://www.mysite.com/resource.zip/contact.html ,
file:///home/joris/test/resource.zip/contact.html . If I recall
correctly, Firefox already uses something similar internally, but I
could be wrong.
index.html, or a file defined in the manifest.txt, would be the
default file, similar to DirectoryIndex in Apache.
The above, in combination with a new file extension (.har?), would be
an excellent way to distribute documentation: all you need is one zip
file on your hard disk.
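A sketch of how such "zip-path" URLs might be resolved (this splitting rule is my illustration, not part of the draft spec):

```python
# Split a hypothetical zip-path URL into (archive URL, member path).
def split_zip_url(url, default="index.html"):
    marker = ".zip/"
    if marker in url:
        archive, member = url.split(marker, 1)
        return archive + ".zip", member or default
    if url.endswith(".zip"):
        return url, default  # bare archive: serve the default file
    raise ValueError("not a zip-path URL: " + url)

print(split_zip_url("http://www.mysite.com/resource.zip/contact.html"))
# -> ('http://www.mysite.com/resource.zip', 'contact.html')
print(split_zip_url("http://www.mysite.com/resource.zip"))
# -> ('http://www.mysite.com/resource.zip', 'index.html')
```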

Henri Sivonen

Nov 17, 2009, 7:11:41 AM
Is there a reason not to use the same manifest location already used by
jar and ODF?

The benefit of reuse would be the ability to use the jar tool to create
archive files that put the manifest first in the zip file.

--
Henri Sivonen
hsiv...@iki.fi
http://hsivonen.iki.fi/

Magne Andersson

Nov 17, 2009, 7:17:52 AM
On 17 Nov, 13:11, Henri Sivonen <hsivo...@iki.fi> wrote:
> Is there a reason not to use the same manifest location already used by
> jar and ODF?
>
> The benefit of reuse would be the ability to use the jar tool to create
> archive files that put the manifest first in the zip file.
>
> --
> Henri Sivonen
> hsivo...@iki.fihttp://hsivonen.iki.fi/

For those of us who are lost, what is the location of the manifest
file in jar and ODF?

Jesper Kristensen

Nov 17, 2009, 7:18:21 AM
Gervase Markham skrev:

>> Are multiple <link rel="resource-package"> elements allowed per page
>> (say, to have different caching policies)? If those elements end up
>> containing the same resource, who wins?
>
> Yes. But it's a good question as to which wins if they both contain the
> same resource name but different resource data. I'd say the behaviour is
> undefined (i.e. "don't do that").

Please don't. It is hard to make 4 browser engine makers, who all know
what they are doing, follow a spec. It is impossible to make thousands
of web developers, some of whom don't have a clue about what they are
doing, follow a spec. Web developers should be able to expect that if
something works in one browser, it should also work in another.

Jesper Kristensen

Nov 17, 2009, 7:22:00 AM
Magne Andersson skrev:

> I also have a suggestion. We know that some files are more important
> than others on some sites, this is especially true on large sites
> loaded on slow connections. How about a way to set priority to files
> inside manifest.txt?
>
> From your example:
>
> PRIORITY: 1
> styles/reset.css
> styles/grid.css
> styles/main.css
> PRIORITY: 2
> javascript/jquery.js
> PRIORITY: 3
> images/save.png
> images/info.png

How would you implement such a thing? The files are all located in one
zip file, and the zip file is downloaded from beginning to end, so the
priorities can't be used for anything. The browser could make multiple
HTTP Range requests for individual files in the package to download the
most important first, but doesn't that defeat the whole purpose of this?

Jesper Kristensen

Nov 17, 2009, 7:26:02 AM
Ictinus skrev:

> I agree, perhaps a ZIP file would be useful in the short term, but a
> server managed solution long term would allow all resources to get the
> performance benefit.
> Not having to manage the zip file makes a developers life easier.
> Having it server side means more benefit. After all we already do gzip
> server side.

But a zip file can be exactly the server-managed solution you're
requesting, can't it? No one says the zip file has to be created by
the developer manually. It can be generated by the server. You could
use a server configuration file, or you could even do:

<link rel="resource-package"
href="site-resources.php?files=a.css,b.js,c.css,d.jpg" />
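A minimal sketch of that server-side approach (framework-free Python; the function name and file list are made up), building the package in memory with the manifest first:

```python
import io
import zipfile

def build_resource_package(files):
    """files: mapping of path -> bytes. The manifest is written first so
    browsers can tell early which resources the package contains."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as z:
        z.writestr("manifest.txt", "\n".join(files))
        for path, content in files.items():
            z.writestr(path, content)
    return buf.getvalue()

pkg = build_resource_package({"a.css": b"body{}", "b.js": b";"})
with zipfile.ZipFile(io.BytesIO(pkg)) as z:
    print(z.namelist())  # manifest first, then the listed files
```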

Jesper Kristensen

Nov 17, 2009, 7:28:32 AM
Joris skrev:

> This could be taken one step further: Allow direct linking to a file
> in such a package (alongside the current proposal).
> So I propose support is added to format urls like these examples:
> http://www.mysite.com/resource.zip/contact.html ,
> file:///home/joris/test/resource.zip/contact.html . If I recall
> correctly, Firefox already uses something similar internally, but I
> could be wrong.

Yes, you can just use
jar:http://www.mysite.com/resource.zip!/contact.html, but that is not
backwards compatible.

Magne Andersson

Nov 17, 2009, 7:31:44 AM
On 17 Nov, 13:22, Jesper Kristensen

I thought one advantage of using ZIP in this case was that it didn't
have to be downloaded from beginning to end before use; instead the
browser could apply what has already been downloaded while the rest is
still downloading?

"Can be unpacked even in partial state — which means that we can
stream the file, and put CSS and JavaScript first in the archive, and
they will unpacked and made available before the entire file has been
downloaded."
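The streaming behaviour quoted above can be illustrated by hand: in a stored (uncompressed) zip, each entry's local header carries its name and size up front, so the first file is recoverable from a truncated prefix of the archive. A sketch using Python's struct and zipfile modules:

```python
import io
import struct
import zipfile

# Build a small stored package, then keep only the first 80 bytes,
# mimicking a download that has not finished yet.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_STORED) as z:
    z.writestr("styles/main.css", "body { margin: 0 }")
    z.writestr("javascript/app.js", "console.log('hi');")
partial = buf.getvalue()[:80]

# Local file header: signature, versions/flags, sizes, name/extra lengths.
(sig, ver, flags, method, mtime, mdate, crc,
 csize, usize, nlen, elen) = struct.unpack_from("<IHHHHHIIIHH", partial, 0)
assert sig == 0x04034b50  # "PK\x03\x04"
name = partial[30:30 + nlen].decode()
body = partial[30 + nlen + elen:30 + nlen + elen + csize]
print(name, body.decode())
```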

Magne Andersson

Nov 17, 2009, 7:34:32 AM
I hope we'll get to see an automatic "zip together" feature in FTP
software, so you always have an up-to-date ZIP file when uploading.

Neil

Nov 17, 2009, 7:37:08 AM
Alexander Limi wrote:

>I have published the first public draft on Resource Packages, and would like to hear your feedback
>

So, which resources does this apply to? I have only seen images, style
sheets and scripts mentioned, but someone might want to supply audio
clips, or embedded SVG documents, or arbitrary frames?

--
Warning: May contain traces of nuts.

Magne Andersson

Nov 17, 2009, 7:39:39 AM

Don't forget fonts, those take an enormous time to load ;)

I don't see why this couldn't apply to every kind of file that can be
stored inside a ZIP file/folder?

Jesper Kristensen

Nov 17, 2009, 8:05:53 AM
Alexander Limi skrev:
> Hi dev.platform,

>
> I have published the first public draft on Resource Packages, and
> would like to hear your feedback:
>
> http://limi.net/articles/resource-packages/

First of all, I think this is great and should be done sooner rather than later.

Is something like this allowed?

<link rel="resource-package"
href="http://some.other.site.com/site-resources.zip" />

It seems to be, since you mention CDNs. What are the security
implications of this? A site cannot download data from another site,
except in special cases for historical reasons or when HTTP Access
Control is used. E.g. a site can include JS and CSS from another site,
and by analyzing how the included code influenced the page, it can
partly detect what was in the file. Sites protect against this by not
putting authenticated information in files that conform to the syntax
of these formats. Do current sites also have this protection for the
zip format? If not, they would have to add it, or this feature would
make them vulnerable to information theft.

You say that the manifest must be first in the zip. What happens if it
comes in the middle? Will it be ignored or used anyway? What happens if
a manifest does not list a file but it is still in the zip? I think it
is important to document browser behavior on these things, as we cannot
predict web developer behavior.

The purpose of the manifest file seems to be to make it faster to
answer "no" to the question of whether a file is in the zip or not. I
see the benefit of being able to answer "no" quickly, to start
downloading the resource sooner. But do we need a manifest for that,
or does the manifest just add unneeded complexity to the spec? You
just have to arrange the file structure on the server such that files
packed in the zip are located in one directory and all other files
elsewhere. The browser does not need a manifest to know that
http://example.com/foo.css cannot possibly be in the resource package
http://example.com/static/resources.zip
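That directory-based rule amounts to a simple prefix test; a sketch (illustrative only):

```python
# Could this resource be inside the package? With packaged files kept
# under the package's own directory, a prefix check answers without
# opening the zip at all.
def may_be_packaged(resource_url, package_url):
    package_dir = package_url.rsplit("/", 1)[0] + "/"
    return resource_url.startswith(package_dir)

pkg = "http://example.com/static/resources.zip"
print(may_be_packaged("http://example.com/foo.css", pkg))          # False
print(may_be_packaged("http://example.com/static/main.css", pkg))  # True
```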

Regarding the notes on MIME type: while it is true that browsers must
use heuristics today to make this work, it is seen time and again on
live sites that this fails, and pages are not displayed correctly. We
have to live with that for sites that do not declare this information
correctly, but making heuristics the only option seems to spell
trouble to me. The spec should define standard mappings from file
names to MIME types, or maybe even allow MIME types to be declared in
the manifest.

Assuming that files inside the zip are UTF-8 seems really bad. Most
text-based formats have established rules for declaring character
encoding inside the files, like CSS with its @charset declaration and
fallback to the document encoding. Changing how this works is
incompatible with existing standards and practice, and I can imagine
it would break things.

Jesper Kristensen

Nov 17, 2009, 8:08:05 AM
Magne Andersson skrev:

>> How would you implement such thing? The files are all located in one zip
>> file, and the zip file is downloaded from beginning to end, so the
>> priorities can be used for nothing. Except the browser may do multiple
>> HTTP Range requests for individual files in the resource to download the
>> most important first, but doesn't that defeat the whole purpose of this?
>
> I thought one advantage with using ZIP in this case was that it didn't
> have to be downloaded from beginning to end, but instead could
> download styles and apply the previously downloaded at the same time?
>
> "Can be unpacked even in partial state — which means that we can

> stream the file, and put CSS and JavaScript first in the archive, and
> they will unpacked and made available before the entire file has been
> downloaded."

Yes, files are available as they are downloaded. That is a feature of
the zip format. But that does not allow files to be downloaded in
another order than the order they are placed in the zip file.

Paweł Gawroński

Nov 17, 2009, 8:08:19 AM
I would suggest an optional, general header/section syntax for
manifest.txt.
It could be like mime-headers (but in utf-8) and one of those headers
should be "Version: 1.0".
Example file:

Version: 1.0

css/style.css
...etc.

Of course, if no headers are provided, the spec falls back to 1.0.
This should simplify extending the format and provide room for ideas
like prioritization, HTTP header overriding, language-specific blocks,
external resources (like "Import: [URL]"), etc. I hope you like this
idea.
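A sketch of how such a versioned manifest might be parsed, assuming a header block separated from the file list by a blank line (the syntax is hypothetical, proposed only in this post):

```python
# Parse the hypothetical versioned manifest: optional header block,
# blank line, then one file path per line.
def parse_manifest(text):
    headers = {"Version": "1.0"}  # no headers given: fall back to 1.0
    head, sep, body = text.partition("\n\n")
    first = head.splitlines()[0] if head else ""
    if sep and ":" in first:  # a header block is present
        for line in head.splitlines():
            key, _, value = line.partition(":")
            headers[key.strip()] = value.strip()
        listing = body
    else:
        listing = text
    files = [line for line in listing.splitlines() if line.strip()]
    return headers, files

print(parse_manifest("Version: 1.0\n\ncss/style.css\njs/app.js\n"))
print(parse_manifest("css/style.css\njs/app.js\n"))
```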

PG

Magne Andersson

Nov 17, 2009, 8:36:34 AM
On 17 Nov, 14:08, Jesper Kristensen

<moznewsgro...@something.to.remove.jesperkristensen.dk> wrote:
> Magne Andersson skrev:
>
> >> How would you implement such thing? The files are all located in one zip
> >> file, and the zip file is downloaded from beginning to end, so the
> >> priorities can be used for nothing. Except the browser may do multiple
> >> HTTP Range requests for individual files in the resource to download the
> >> most important first, but doesn't that defeat the whole purpose of this?
>
> > I thought one advantage with using ZIP in this case was that it didn't
> > have to be downloaded from beginning to end, but instead could
> > download styles and apply the previously downloaded at the same time?
>
> > "Can be unpacked even in partial state — which means that we can

> > stream the file, and put CSS and JavaScript first in the archive, and
> > they will unpacked and made available before the entire file has been
> > downloaded."
>
> Yes, files are available as they are downloaded. That is a feature of
> the zip format. But that does not allow files to be downloaded in
> another order than the order they are placed in the zip file.

But with a manifest.txt file which defines which files the ZIP
contains, it can skip ahead and download and apply the prioritized
files first.

Eddy Nigg

Nov 17, 2009, 8:54:31 AM
to
On 11/17/2009 05:01 AM, Alexander Limi:

> Hi dev.platform,
>
> I have published the first public draft on Resource Packages, and
> would like to hear your feedback:
>
> http://limi.net/articles/resource-packages/
>
> Summary:
> What if there was a backwards compatible way to transfer all of the
> resources that are used on every single page in your site — CSS, JS,
> images, anything else — in a single HTTP request at the start of the
> first visit to the page? This is what Resource Package support in
> browsers will let you do.
>

Seems like a maintenance headache to me, though scripting could
eventually help with that. Unless the package could include all
resources for a complete site: download it once and reuse it for
whatever request from that site comes along, i.e. a per-site package,
reused over and over again for every page, with the contents of the
zip cached, obviously.

--
Regards

Signer: Eddy Nigg, StartCom Ltd.
XMPP: star...@startcom.org
Blog: http://blog.startcom.org/
Twitter: http://twitter.com/eddy_nigg


Jesper Kristensen

Nov 17, 2009, 9:10:25 AM
to
Magne Andersson skrev:

> On 17 Nov, 14:08, Jesper Kristensen
> <moznewsgro...@something.to.remove.jesperkristensen.dk> wrote:
>> Magne Andersson skrev:
>>
>>>> How would you implement such thing? The files are all located in one zip
>>>> file, and the zip file is downloaded from beginning to end, so the
>>>> priorities can be used for nothing. Except the browser may do multiple
>>>> HTTP Range requests for individual files in the resource to download the
>>>> most important first, but doesn't that defeat the whole purpose of this?
>>> I thought one advantage with using ZIP in this case was that it didn't
>>> have to be downloaded from beginning to end, but instead could
>>> download styles and apply the previously downloaded at the same time?
>>> "Can be unpacked even in partial state � which means that we can

>>> stream the file, and put CSS and JavaScript first in the archive, and
>>> they will unpacked and made available before the entire file has been
>>> downloaded."
>> Yes, files are available as they are downloaded. That is a feature of
>> the zip format. But that does not allow files to be downloaded in
>> another order than the order they are placed in the zip file.
>
> But with a manifest.txt file which defines which files the ZIP
> contains, it can skip ahead and download and apply the prioritized
> files first.

But skipping ahead means multiple HTTP requests. And the whole point is
to avoid those. So the simplest way of implementing priority would be to
do it the old way, as the performance gain would probably be lost anyway.
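To illustrate the point: each prioritized file maps to its own byte range inside the zip, so fetching it early means one extra Range request per file. A rough sketch (the helper name is mine) of computing such a range with Python's zipfile module:

```python
import io
import struct
import zipfile

def member_range(buf, name):
    # Compute the (start, end) byte range of one member's local record
    # inside a zip held in `buf` -- the range a browser would have to
    # request separately to fetch that file out of order. Assumes no
    # trailing data descriptor (flag bit 3 unset). Illustration only.
    with zipfile.ZipFile(io.BytesIO(buf)) as z:
        info = z.getinfo(name)
    start = info.header_offset
    # Local header is 30 bytes, then name and extra field, then the
    # compressed data; read the lengths from the local header itself,
    # since its extra field can differ from the central directory's.
    name_len, extra_len = struct.unpack('<2H', buf[start + 26:start + 30])
    end = start + 30 + name_len + extra_len + info.compress_size
    return start, end
```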

Bernhard H.

Nov 17, 2009, 9:24:48 AM
to
On 17 Nov., 04:01, Alexander Limi <l...@gmail.com> wrote:
> Hi dev.platform,
>
> I have published the first public draft on Resource Packages, and
> would like to hear your feedback:
>
> http://limi.net/articles/resource-packages/
>
> Summary:
> What if there was a backwards compatible way to transfer all of the
> resources that are used on every single page in your site — CSS, JS,
> images, anything else — in a single HTTP request at the start of the
> first visit to the page? This is what Resource Package support in
> browsers will let you do.

Why not make pipelining work? (Or provide a fallback.)
If you have a page with 20 images (like thumbnails) + CSS + sprite +
JS, you might save 2 requests with your .zip package. All the other
image requests are hardly predictable, and dynamically creating .zips
on the server is not really an option (too much CPU and bad
maintainability; it would slow everything down again).
On the other hand, for pipelining you don't have to do anything, and it
would save the time of 22 requests.

Magne Andersson

Nov 17, 2009, 9:25:58 AM
to
On 17 Nov, 15:10, Jesper Kristensen

<moznewsgro...@something.to.remove.jesperkristensen.dk> wrote:
> Magne Andersson skrev:
>
>
>
> > On 17 Nov, 14:08, Jesper Kristensen
> > <moznewsgro...@something.to.remove.jesperkristensen.dk> wrote:
> >> Magne Andersson skrev:
>
> >>>> How would you implement such thing? The files are all located in one zip
> >>>> file, and the zip file is downloaded from beginning to end, so the
> >>>> priorities can be used for nothing. Except the browser may do multiple
> >>>> HTTP Range requests for individual files in the resource to download the
> >>>> most important first, but doesn't that defeat the whole purpose of this?
> >>> I thought one advantage with using ZIP in this case was that it didn't
> >>> have to be downloaded from beginning to end, but instead could
> >>> download styles and apply the previously downloaded at the same time?
> >>> "Can be unpacked even in partial state — which means that we can

> >>> stream the file, and put CSS and JavaScript first in the archive, and
> >>> they will unpacked and made available before the entire file has been
> >>> downloaded."
> >> Yes, files are available as they are downloaded. That is a feature of
> >> the zip format. But that does not allow files to be downloaded in
> >> another order than the order they are placed in the zip file.
>
> > But with a manifest.txt file which defines which files the ZIP
> > contains, it can skip ahead and download and apply the prioritized
> > files first.
>
> But skipping ahead means multiple HTTP requests. And the whole point is
> to avoid those. So the simplest way of implementing priority would be to
> do it the old way, as the performance gain would probably be lost anyway.

You got me there. Good, you made me learn something. (I know that the
goal is to reduce them, but I didn't know that it would generate more
of them inside the same file.)

Ted Mielczarek

Nov 17, 2009, 9:43:37 AM
to dev-pl...@lists.mozilla.org
On Tue, Nov 17, 2009 at 9:24 AM, Bernhard H. <
bernhard....@googlemail.com> wrote:

> Why not make pipelining work? (Or provide a fallback)
>

We've tried this; there are still issues with the real web:
https://bugzilla.mozilla.org/show_bug.cgi?id=414477 - https should run with pipelining
https://bugzilla.mozilla.org/show_bug.cgi?id=422978 - pipelining breaks secure Internet banking websites
https://bugzilla.mozilla.org/show_bug.cgi?id=264354 - Enable HTTP pipelining by default

-Ted

Ant Bryan

Nov 17, 2009, 9:48:24 AM
to
On Nov 16, 10:01 pm, Alexander Limi <l...@gmail.com> wrote:
> Hi dev.platform,
>
> I have published the first public draft on Resource Packages, and
> would like to hear your feedback:
>
> http://limi.net/articles/resource-packages/
>
> Summary:
> What if there was a backwards compatible way to transfer all of the
> resources that are used on every single page in your site — CSS, JS,
> images, anything else — in a single HTTP request at the start of the
> first visit to the page? This is what Resource Package support in
> browsers will let you do.

A couple comments.

Connection limits were removed from HTTPbis recently.
"You are probably familiar with the issue; a well-known optimization
technique is to reduce the number of HTTP requests that are done for a
given web site, since browsers only do 2–6 requests in parallel."

Missing "be"?


"Can be unpacked even in partial state — which means that we can
stream the file, and put CSS and JavaScript first in the archive, and

they will BE unpacked and made available before the entire file has
been downloaded."

Do you mean "...omit it in documents where it is INvalid..."?
"The default MIME type for a resource package is application/zip, and
you can omit it in documents where it is valid, like in HTML5, where
an equivalent would be:"

--
(( Anthony Bryan ... Metalink [ http://www.metalinker.org ]
)) Easier, More Reliable, Self Healing Downloads

Boris Zbarsky

Nov 17, 2009, 10:14:44 AM
to
On 11/17/09 9:24 AM, Bernhard H. wrote:
> Why not make pipelining work? (Or provide a fallback)

Because the pipelining breakage tends to be on the server side (or
rather transparent proxy side), not on our side. And we don't control
that code. And neither does the person authoring the website,
typically. And it's been broken for close to a decade now, with no
improvement in sight....

> On the other hand for pipelining you don't have to do anything and it
> would save the time of 22 requests.

Not quite. It would save the cost of spinning up 22 connections; you
still have to make 22 separate requests on the one connection. Still a
big win, of course.

Note that you're assuming that there will be a sprite. One of the
points of the package idea is that you no longer have to sprite and take
the resulting memory and performance hits during rendering.

-Boris

Boris Zbarsky

Nov 17, 2009, 10:15:53 AM11/17/09