I'm trying to reduce the number of dynamic web pages the Googlebots
access. While at the same time I want the pages to validate properly
to HTML specifications.
1. According to the W3C, "nofollow" is not a valid value for the REL
attribute, yet their validator doesn't flag it.
2. The REL attribute is not valid on links created with <FORM>.
3. REL stands for Relationship, and "nofollow" doesn't really fit
that definition.
It does not sit well with me to clutter a web site's pages with
invalid code just to please Google.
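To make the first two points concrete, here's a sketch of the markup in question (the URLs are made-up examples):

```html
<!-- Point 1: "nofollow" as a REL value on an anchor -->
<a href="https://example.com/page.php?id=1" rel="nofollow">a dynamic page</a>

<!-- Point 2: FORM has no REL attribute, so there is no
     equivalent way to mark a form-based link -->
<form action="https://example.com/search.php" method="get">
  <input type="submit" value="Search">
</form>
```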
I've always managed to cover things with the Robots Meta tag and
recently started excluding some links with the robots.txt file. These
I know properly meet HTML and other specifications.
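For comparison, these are the two standards-compliant approaches I mean. First, the Robots meta tag in a page's <HEAD> (the "noindex,nofollow" combination is just one possible directive):

```html
<meta name="robots" content="noindex,nofollow">
```

And second, a robots.txt at the site root (the /cgi-bin/ path is only an example):

```
User-agent: *
Disallow: /cgi-bin/
```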
This brings up a whole bunch of questions:
1. Has Google gone through the process of formally requesting that
"nofollow" be added to the valid REL attribute values?
2. Do any of the other search engines pay attention to rel="nofollow"?
3. Will using the Robots Meta tag and/or robots.txt to exclude
links that might damage a site's ranking work the same way as
rel="nofollow"?
4. Does Google take "nofollow" literally and not fetch that link
at all, or do they still peek at the destination for other purposes?