Well, I am bringing it to your attention now. I have been watching the
effect of this particular meta tag for two or three years.
Back when Google still reported supplemental pages, the effect was almost
immediate: shortly after people added this incorrectly closed meta tag to
their homepages, all or most of their other pages went supplemental, with a
marked drop in expected SERPs for their typical queries. It got to be
extremely predictable: as soon as someone complained of their pages going
supplemental, I'd go and find exactly this meta tag, with the wrong closure,
used in a non-XHTML document. Most of the time those were pages with no
DOCTYPE at all. And even if they had tons of other validation errors, this
particular one was the straw that broke the camel's back.
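
To make the wrong closure concrete, here is a minimal sketch. I'm assuming
the tag in question is the verify-v1 site verification meta tag handed out
with Sitemaps, with a placeholder content value:

    <!-- Broken: XHTML-style self-closing syntax pasted into a plain HTML
         page with no DOCTYPE. Under strict SGML parsing, the "/" ends the
         tag and the stray ">" becomes character data, which implicitly
         closes the head and opens the body prematurely. -->
    <html>
    <head>
    <title>Example page</title>
    <meta name="verify-v1" content="PLACEHOLDER" />
    </head>

    <!-- Fixed for plain HTML: simply drop the trailing slash. -->
    <meta name="verify-v1" content="PLACEHOLDER">

The self-closing form is only correct in a genuine XHTML document; pasted
verbatim into pages that are nothing of the sort, it is exactly the kind of
head-breaking error I keep finding.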
Now there is no more mention of supplemental pages, but the effect is the
same: the pages may remain indexed (or occasionally disappear), yet they are
cached less frequently, even while the homepage is freshly cached. Yes,
cached, but their content isn't actually indexed.
You've not been following the endless heated discussions we've been having
in the Webmaster groups on this. It's true no Googlers have stated that this
is a factor, nor have they denied it. I can only speculate as to why. My
theory is that admitting that breaking the head in this way causes the
content of the page to be ignored would attract the ire of the many
webmasters who've been having problems ever since they jumped on the
Sitemaps bandwagon. Even if they were to point out that a simple run through
the validator would have caught the problem, somebody is always bound to say
that google.com isn't valid, and to trot out the old chestnut that
amazon.com is wildly invalid. Besides, those sites were typically invalid
long before anyway. But this broken head is what triggered the main problem
with the new smart robots that parse semantically.
I can only maintain a firm stance on this.
Christina
www.webado.net