On Mon, Dec 7, 2009 at 9:22 PM, EnviroChem
<k...@environmentalchemistry.com> wrote:
> I've spent much of today using The Page Speed add-on for Firefox and
> have observed the following bugs/quirks/inaccuracies/annoyances:
>
> 1) INACCURACY: It falsely claims width/height clues missing from
> images if those clues are handled by CSS file as they are supposed to
> be. It appears add-on is only looking for old HTML width and height
> attributes, which have been superseded by CSS.

Sounds like you found a bug. Can you give us a snippet of html
and css that shows the problem?

<div class="BLAd" style="width:120px;margin:380px Auto 0"><a href="http://www.sciam.com/earth-and-environment" title="Environmental News" rel="nofollow"><img src="/images/SANetworkLogo.png" alt="Scientific American: Partner Network"></a></div>

> 4) ANNOYING: Minimize DNS lookups and parallelize downloads across
> hostnames are contradictory instructions.

How so? Minimize DNS Lookups recommends not serving one resource
per domain, and serving javascript in the first 10% of the DOM from
the same host as the main document. Parallelize Downloads Across
Hostnames recommends balancing static resources across four hostnames
when at least one host serves 10 static resources.

> Also the extra DNS lookups
> are frequently caused by Google JS files (e.g. Google AdSense and
> Analytics). The only way to reduce those DNS lookups would be to
> remove Google code or for Google to consolidate such files one domain.

See the thread "[page-speed-discuss] Recommendations are google
issues", posted yesterday.

> 5) QUIRK: Page Speed Activity doesn't allow one to scroll left or
> right along time line to see details that have scrolled off page.
> This isn't very helpful.

Yes, that could be improved.

> One of the pages I was using for my testing was http://environmentalchemistry.com
>
> Overall the tool is very helpful for uncovering potential performance
> issues, but the above listed issues also make it very frustrating to
> use.

Glad to hear that. Thanks for letting us know what is broken, so
we can fix it.
Sam
--
Sorry, my message above was still being composed and the issues were
still being tested. It got sent before I was ready (I wasn't even aware
it had been sent).
On Dec 8, 10:20 am, Sam Kerner <sker...@google.com> wrote:
> On Mon, Dec 7, 2009 at 9:22 PM, EnviroChem
>
> <k...@environmentalchemistry.com> wrote:
> > I've spent much of today using The Page Speed add-on for Firefox and
> > have observed the following bugs/quirks/inaccuracies/annoyances:
>
> > 1) INACCURACY: It falsely claims width/height clues missing from
> > images if those clues are handled by CSS file as they are supposed to
> > be. It appears add-on is only looking for old HTML width and height
> > attributes, which have been superseded by CSS.
>
> Sounds like you found a bug. Can you give us a snippet of html
> and css that shows the problem?
>
This problem may have been a caching issue with the browser using
outdated CSS.
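
For reference, here is a minimal sketch of the point-1 situation. The
markup is from the snippet in my earlier (truncated) message; the CSS
rule and the 60px height are made up here purely for illustration. The
img tag has no width/height attributes, and its dimensions come from
the stylesheet instead:

<!-- hypothetical stylesheet rule; the real CSS is not shown here -->
<style type="text/css">
  .BLAd img { width: 120px; height: 60px; }
</style>
<div class="BLAd">
  <a href="http://www.sciam.com/earth-and-environment" rel="nofollow">
    <img src="/images/SANetworkLogo.png" alt="Scientific American: Partner Network">
  </a>
</div>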
Very simply, long cache times for CSS and JS files are a solution to a
non-issue. They will cause more development and maintenance headaches
without ANY appreciable advantage to the end user.

> If you know for a fact that you will change a file next week, you
> might decide that you want to set caching headers for a shorter amount
> of time. I have seen too many smart and well meaning people mess this
> up to recommend it.

If I knew what I was going to change next week or next month, I would
change it now and be done with it. This isn't how development works.
> Keeping track of what content is cached for how
> long is hard, and getting it right is unrealistic for all but the
> simplest sites.

EXACTLY! This is why recommending long cache times for JS & CSS will
cause more problems than it will resolve. What benefits the user most
is a cache time that allows the vast majority of a site's visitors to
keep support files in their browser cache for the duration of their
session, yet still makes sure that the next time they visit they get
the proper versions of those files. The advice to maintain really long
cache times for CSS & JS files WILL lead some developers to set their
cache time to too long a period without considering the ramifications.
That will then end up causing problems for some of their users the next
time the developer updates one of these files.
> > 3) INACCURACY: Incorrectly advises applying gzip to image files.
> > Image files are already compressed. Applying gzip to them wastes cpu
> > cycles and can actually result in a larger download.
>
> Page Speed should not recommend compressing images. Can you give
> a URL that demonstrates this?

It is not doing it now; this may have been one of the notes I was still
trying to pin down when my message was accidentally sent.
> > 4) ANNOYING: Minimize DNS lookups and parallelize downloads across
> > hostnames are contradictory instructions.
>
> How so? Minimize DNS Lookups recommends not serving one resource
> per domain, and serving javascript in the first 10% of the DOM from
> the same host as the main document. Parallelize Downloads Across
> Hostnames recommends balancing static resources across four hostnames
> when at least one host serves 10 static resources.

This is a much clearer explanation of what those notes were trying to
convey than what I got from the Page Speed recommendations. The reality
is, however, that most developers will be able to do nothing about the
minimize DNS lookups suggestion, as the single-request domains are
typically for 3rd party widgets like Google Analytics. The only way to
address the issue would be to remove said widget, and oftentimes that is
not a viable solution. Removing the widget also wouldn't be a desirable
solution for Google, as frequently it would be their widgets getting
removed.

Parallelizing downloads isn't even a viable option for most websites, as
it would require registering multiple domains and setting up tertiary
sites just to handle this. This suggestion would create major website
management headaches and increase web hosting costs. If this advice is
going to be given, then there needs to be some kind of guideline as to
how much of a performance improvement it would provide. This could mean
a massive retooling of a website, and we need to know up front how much
value such an undertaking would provide. Without that knowledge, I'd
suggest getting rid of this recommendation, because it will be of no use
to 90% of us and will only cause confusion.
On Dec 24, 12:07 am, Richard Rabbat <rab...@google.com> wrote:
> You may disagree with a lot of the opinions here, since we believe that
> every millisecond counts. the 100 ms is just a slippery slope.
I do see counting milliseconds as important; I spent a lot of time A-B
testing my .htaccess file and PHP to shave off every last millisecond I
could. The problem comes when shaving milliseconds via the CSS file
versioning recommendation becomes a development nightmare. In my case,
setting a cache length of one week for JS and CSS files would serve the
surfing habits of 90%+ of my user base. There would be almost no gain
for almost all of my user base in setting cache lengths longer than one
month (which I'm doing for some files, like images). I really see
setting cache times of up to a year as obscene, with no payback for
anyone except really big sites (e.g. Google) where lots of people
return day in and day out.
> there are several schools of thought. let's say you're using a javascript
> library at version 1.1. if you don't keep track of changes that the
> developer makes, you may end up pulling a new version that is incompatible
> with the functionality that you use. This doesn't happen often, but being
> able to test before you ship is sometimes a good idea. See for example how
> the api's that code.google.com hosts allow you to specify a version number
> that keeps you comfortable that you won't get regressions (whether latency
> or functionality related).
I can see that if a JS file is shared (like a library) and hosted on a
3rd party site (e.g. FriendConnect), versioning would be a good idea to
keep things from breaking on other people's sites. I could also see
using versioning if there are major changes that could break things
when the wrong file is pulled from cache, but for minor stuff this
doesn't make sense.
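
For what it's worth, a rough sketch of the kind of file-name versioning
being discussed (the file names and version tokens below are made up for
illustration): the version is baked into the URL, so a long-cached copy
can never be stale, because any update changes the URL the pages
reference.

<!-- hypothetical example: pages reference the current versioned names -->
<link rel="stylesheet" type="text/css" href="/css/site.v12.css">
<script type="text/javascript" src="/js/menu.v12.js"></script>

<!-- after an update, the markup points at the new names instead -->
<!-- <link rel="stylesheet" type="text/css" href="/css/site.v13.css"> -->
<!-- <script type="text/javascript" src="/js/menu.v13.js"></script> -->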
For instance, I use a dynamically generated JS file for a pull-down menu
(yes, there are static HTML menus as well, via site/section directory
pages). The contents of the menu are dynamically generated, but only get
updated when a new page is added to the site. It also pulls an RSS feed
from my blog. These menu items don't change all that often, so it really
doesn't matter if a user caches this JS for a couple of days.
The real beauty of this JS menu is that it costs only 13kb of download
yet links to hundreds upon hundreds of pages on my site. As a separate
JS file, visitors download it once and can use it for their entire
session. If this menu were static HTML it would have added about 100kb
to the download and certainly couldn't be a part of every page. This
makes it much easier to jump directly from one page to another on my
site without using an intermediary directory page.
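
As a rough sketch of that sort of approach (the actual menu code isn't
shown here, so the file name, class name, and page list below are
invented for illustration): every page includes one small, cacheable
file, and that file holds the link list and writes the pull-down markup.

<!-- every page includes the same shared, cacheable file -->
<script type="text/javascript" src="/js/menu.js"></script>

<!-- what /js/menu.js does, shown inline for clarity -->
<script type="text/javascript">
  var menuPages = [
    ["/topics/air-quality/", "Air Quality"],
    ["/topics/water-quality/", "Water Quality"]
    // ...the real file would carry hundreds of entries
  ];
  var html = '<ul class="pulldown-menu">';
  for (var i = 0; i < menuPages.length; i++) {
    html += '<li><a href="' + menuPages[i][0] + '">' + menuPages[i][1] + '</a></li>';
  }
  html += '</ul>';
  document.write(html);
</script>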
My point is that pushing most websites to extend cache times beyond
one week or one month (depending upon the type of resource) and to use
versioning will be of no value to most of their users, but it will
cause development headaches.
> that is not the intent of the rule. You would be surprised how many websites
> hit just too many different servers. See the results of this rule for 2
> different sites:
> One of them has:
> [snip]
> 12 and 6 DNS hits respectively to web servers with one resource each.
> It just adds up
Okay, I confess: I call two stats counters, Google Analytics and
Quantcast. I'm using the asynchronous version of GA, and Quantcast is
the very last thing that gets called before the closing </body></html>
tags. I use GA for its great stats, and one of my ad providers wants
Quantcast to provide 3rd party stats for prospective advertisers. I did
end up killing FriendConnect and will roll my own page recommendation
code that has a smaller footprint. I also pull stuff from three ad
providers, but those scripts pay the bills and put bread on the table,
so they are a necessary "evil".
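
For anyone who hasn't switched yet, this is roughly the asynchronous
Analytics snippet Google currently publishes (UA-XXXXX-X is a
placeholder account ID): it queues the tracking calls and injects ga.js
without blocking the rest of the page from loading.

<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']);
  _gaq.push(['_trackPageview']);
  (function() {
    // create the ga.js script element and mark it asynchronous
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>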
By chance I did register a really short version of my domain name for
other purposes recently. It will be a cookieless domain, so I was able
to do some parallelizing of CSS and core "skin" image files to this new
domain, which should help. Content images will still be pulled from the
main domain, as it would take way too much work to change these, but
this should still improve things. Now if Webmaster Tools would just give
me new averages sans the old methods. I'm finding there is a two to
three day lag before the full effect of changes is realized on the
"Site Performance" page.
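
To make the split concrete, here is a hypothetical illustration (the
static hostname below is made up; I'm not naming the real domain here):
the stylesheet and skin images move to the short, cookieless host, while
content images stay on the main domain.

<link rel="stylesheet" type="text/css" href="http://static.example-ec.com/css/site.css">
<img src="http://static.example-ec.com/images/skin/logo.png" alt="Site logo">
<img src="/images/articles/sample-photo.jpg" alt="Content image served from the main domain">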
> You can blame http 1.1 section 8.1.4. It seems most of the browsers ignore
> this rule now but ie6/ie7 (which have a sizable portion of the browser
> market) are limited to 2.
Ya, I read up on that whole mess and what a mess it is. There are
major shortcomings in HTTP specs.
> I'll take the feedback to the friendconnect team; unfortunately, I can't
> see the css for the snipped DoubleClick url.
Like I mentioned above, I ended up killing off the FriendConnect
script because it was so bloated and didn't provide enough value
to users. The DoubleClick problem comes and goes. I'm sure if the
DoubleClick folks were made aware that Page Speed is detecting issues
with their code they could find and resolve them.
> Speed is one of many issues
> that a web developer has to optimize and we personally love standards (as
> you might have seen from the numerous posts about HTML5 for example).
I've been as faithful to the HTML/CSS specifications as possible since
the days of HTML 3.0. I try very hard to make sure all my sites validate
to the HTML/CSS specifications and to follow WCAG. Right now my target
is HTML 4.01 Strict and CSS 2.1. I'll wait on HTML5 until it gets
finalized before adopting it. What I'd really like to see is for ad
providers (e.g. AdSense) to make the generated code their scripts spit
out validate to the W3C HTML/CSS specifications and be JS error free.
> thanks for the feedback and keep it coming :)
I'm all for anything that helps us improve the experience of visitors
to our websites.