I'm using FF 45.0.1 and also see the page does not render correctly.
Looks like they use a ton of Javascript, some of which uses Canvas and
some of which points to off-domain resources. I restarted FF in its safe
mode to eliminate CanvasBlocker or any other add-on causing the
problem. Still only got part of the web page.
I hit F12, clicked on the Console tab, and noticed an error about the
site trying to use geolocation (where the client provides the geo data
via getHTML5Location(), as opposed to the site doing an IP-based
lookup). I have that disabled in Firefox. There is another error (I
only remember seeing something about "undefined" for several
functions). In FF safe mode that couldn't be due to the adblocker
(uBlock Origin) preventing the site from accessing a resource, since in
safe mode that add-on isn't loaded. There is also a warning that the
site references js-sec.indexww.com, which is using a weak SHA-1
certificate. The same errors and warning showed in the console when I
restarted FF in normal mode (all add-ons loaded) and visited
weather.com (except the weak-cert warning disappeared, because the
adblocker prevented weather.com from accessing resources from that
off-domain site).
I temporarily disabled my anti-virus software and refreshed the web
page, but still only part of it got retrieved. I think the site is
fucked.
The navbar still works ... sort of. Click on "Forecasts" and then
"Today's Forecast" and I get the error page saying "Oh no! The page you
are looking for does not exist. Go back, friend, go back!". This is
because the navigation links between pages are broken and their logic
defaults to pointing at the local "/page-not-found" notification page.
Their "National Forecasts" navbar link does correctly navigate to the
"forecast/news/national-forecast-20141009" page, but that page still
has resource problems (it can't get to the resources needed to
dynamically construct the page, like the wxnode_video_player).
Shit happens, and you have to wait until the site admins get around to
flushing the turds. I've seen this when a site migrates to a new layout
but has not yet cleared its web server of the old pages, or when the
links in the new pages did not get updated to point at the other new
pages (they point at old pages that aren't there anymore or are now
incomplete).
weather.com paints okay (but slowly) when I use Internet Explorer 11.
So I started to wonder whether the site is stupidly using the
User-Agent header to determine what client is visiting (and then
altering what HTML it serves to that client). I installed the User
Agent Switcher add-on, which comes with only a minuscule pre-defined
set of UA strings: IE6, 7, and 8. I set Firefox to send the IE8 string
as its UA header and verified the change by visiting
useragentstring.com. Yep, I'm reported as an IE8 client. Still,
weather.com wouldn't paint correctly when using FF 45.0.1 pretending to
be IE8.
Although you would not expect a web site to catastrophically fail just
because you configured your client to NOT voluntarily send geolocation
data, this site will. I had geo.enabled set to false because I don't
want my client to send that info. If a site wants to know where I am,
let it use an IP-address lookup against a geolocation database. Yet
when I *enable geolocation in Firefox* (geo.enabled = true),
weather.com paints okay. This comes back to the error I noted in the
console about the site's Javascript failing when it tries to get my
client (Firefox) to divulge my geolocation. With geo.enabled = true, I
get a popup in Firefox asking if I want to share my geolocation with
the site. I just X'ed out of the popup alert and the site still painted
okay. So the site DEMANDS that the client have its geolocation
functions enabled for use by their Javascript, even if you elect not to
hand the data over. They are too stupid to check a function's error
status to determine that the client does not support, or has disabled,
its geolocation function.
So decide if you want to continue visiting a site so poorly coded that
its Javascript pukes because it assumes the *client's* geolocation
function is available. You can leave geo.enabled = false to eliminate
one method of tracking you; most sites that try to use geolocation in
their Javascript will simply fail over to other code or skip past the
error by checking the return status. Not this site. If it detects that
your web browser has a geolocation feature, you MUST have it enabled
(rather than the site coding for a function's error status).
For me, I will continue using Firefox with geo.enabled = false. I
stopped using weather.com a long time ago and switched to
accuweather.com, which has no problem with a web browser that is
capable of geolocation but has it disabled.
geo.enabled = true (default) - Firefox will offer geolocation data to a
site that requests it via Javascript (after prompting you to share).
geo.enabled = false - Firefox will error on Javascript calls to the geo
API. The site needs error recovery in its page code (e.g., simply check
the error status when the call fails).
weather.com does NOT provide graceful error recovery for a client it
knows supports geolocation but where it is disabled in the client.
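For reference, the W3C Geolocation API spells out how that error recovery is supposed to work: the error callback receives a PositionError whose numeric code distinguishes "the user/browser refused" from other failures. A tiny sketch (the helper name describeGeoError is mine, but the codes and names come from the spec):

```javascript
// Error codes defined by the W3C Geolocation API for the PositionError
// object handed to getCurrentPosition()'s error callback.
const GEO_ERRORS = {
  1: "PERMISSION_DENIED",    // user or a browser pref refused to share
  2: "POSITION_UNAVAILABLE", // no position fix could be obtained
  3: "TIMEOUT"               // no answer within the requested time
};

function describeGeoError(err) {
  return GEO_ERRORS[err && err.code] || "UNKNOWN";
}
```

A site that bothered to look at err.code could tell a privacy-conscious client (code 1) apart from a genuine failure and carry on rendering either way.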