I know Google's Matt Cutts has said that he prefers a tree-like structure in a
website as his personal preference. I do too.
Trouble is, I've got literally hundreds of product categories, and I am
using a CSS expanding menu to make jumping to ANY category a one-click
operation from any page (make pages for users, not for search engines).
Trouble is, GoogleBot gets confused by all the links.
Is there a method of causing GoogleBot to ignore certain links, and if
so, what is Google's preferred method of tagging links as such?
The site in question is www.lingerieplace.com. You can see why I like
being able to have the expanding menu: it pleases the humans. I want
to please GoogleBot and humans!
Google's guidelines recommend a maximum of 100 links on a page, and although
it's not a fixed limit, I believe that GoogleBot stops reading links
after around 200. This would mean that any links within the actual
page content would be ignored, and links towards the top of the menu
would automatically be favored.
I have a sitemap file for the robots, but I'm still concerned that it
may be better for the robot to ignore the links within the menu (or at
least the links more than one level deep). I'm not sure whether I'm
right to be concerned, nor how to go about marking the inner links so
that GoogleBot ignores them.
When I say Googlebot is getting confused, I mean that it has no
context to recognize the greater importance of higher-level menu links
vs lower-level links.
All pages can be accessed without the expanding menu (with multiple
clicks).
I like the nofollow idea. That would probably be most in the spirit
of following Google's guidelines, which I wish to do.
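Something along these lines might do it - just a rough Python sketch on
my part, and the helper name, URLs and the depth cut-off are all made
up for illustration (Google doesn't prescribe any of this):

    # Emit a menu link, adding rel="nofollow" below the top level.
    # Function name, paths and the depth threshold are assumptions.
    from html import escape

    def menu_link(url, label, depth):
        rel = ' rel="nofollow"' if depth > 0 else ''
        return '<a href="%s"%s>%s</a>' % (escape(url, quote=True), rel, escape(label))

    print(menu_link('/bras/', 'Bras', 0))             # top level: followed
    print(menu_link('/bras/push-up/', 'Push-up', 1))  # deeper: nofollow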
If we assume that:
1. Google does, in fact, only consider the first, let's say, 200 links,
2. You have, say, 600 links.
3. The first, say, 400 links are tagged nofollow
does that mean:
a. Google will see only the 200 links which aren't tagged with
nofollow, or
b. Google will see no links, since the first 200 links were nofollow
and it "stopped trying"?
Now, that's my opinion only: I'd say they'll consider following up to
100 (150, whatever) links per page, regardless of where those links sit
relative to the nofollowed ones, be that before, after or between them.
This came up before (a few times) regarding CSS menus etc.
I think the general response was akin to...
... how is Google meant to identify the 'important' pages in that big
list of links? ...
Which kind of makes sense.
Technically, when we structure navigation in such a way, we are
defining groupings... parents, children, grand-children etc...
users/visitors/bots have to "drill-down".
Believe it or not - the important stuff is meant to be top-level.
This is so that visitors can easily locate such info - straight away.
This is part of the problem with getting "inner-pages" indexed.
So - that's the theory side covered - and the clues are there.
We already have the suggestion of nofollow.
Brilliant ... now - can we make that dynamic?
For an example... let's say we have 2 Parents... Parent1 and Parent2.
In each of those, we have Children (7 each) and Grandchildren (2 per
Child). So we have a total of 44 pages (is that right?)
That 44 is split into 2 branches.
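Quick sanity check on that count (throwaway Python, names are just
placeholders):

    # 2 Parents, 7 Children each, 2 Grandchildren per Child.
    parents = 2
    children = parents * 7         # 14
    grandchildren = children * 2   # 28
    print(parents + children + grandchildren)  # 44 - yes, that's right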
IF I go to Parent1 -> Child3(P1) -> Grandchild2(P1C3) ...
I don't really want the bots wandering off and looking at ANYTHING in
the other branch.
In fact ... the only things I want them to look at are the sibling (the
first Grandchild of P1C3) and the immediate Ancestor (Child3) ... that
way it can move sideways and back a step - that's it.
Now - I'm not saying that it will work - but the logic is there ...
you should be able to control what path the bots can take. This means
you can (hopefully) stop them wandering off and looking at other
things rather than following the prescribed/logical path/branch ...
plus it means at any one time, you are automatically reducing the
number of links that the bot should follow.
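As a rough Python sketch of that rule (the tree paths and the helper
are invented for illustration - not something tested on a real crawler):

    # Follow only the current page's siblings and its immediate
    # ancestor; everything else in the menu gets rel="nofollow".
    def rel_for(link_path, current):
        parent = current[:-1]
        is_sibling = len(link_path) == len(current) and link_path[:-1] == parent
        is_ancestor = link_path == parent
        return '' if (is_sibling or is_ancestor) else ' rel="nofollow"'

    current = ('Parent1', 'Child3', 'Grandchild2')
    print(rel_for(('Parent1', 'Child3', 'Grandchild1'), current))  # '' (sibling)
    print(rel_for(('Parent1', 'Child3'), current))                 # '' (ancestor)
    print(rel_for(('Parent2',), current))                          # nofollow (other branch)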
To go a step further, you could not only apply this logic to the path
being followed.... but to the depth as well.
Make the menu so that it only allows 1 step from current.
So if you are at Root and looking at the 2 Parents... the only links
you could follow are to the Parent pages...
If you are on a Parent page, the only places you could go are the
other Parent pages, the children of the current Parent, or back to Root.
Again - this means we should be able to focus the bot's attention on
what we want.
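Sketching that in Python too (the tree below is a stand-in, not the
real site structure):

    # Only render links one step from the current page: its parent,
    # its siblings, and its direct children.
    TREE = {
        'Root': ['Parent1', 'Parent2'],
        'Parent1': ['Child1', 'Child2'],
        'Parent2': ['Child3'],
    }
    PARENT = {c: p for p, kids in TREE.items() for c in kids}

    def one_step_links(current):
        links = []
        parent = PARENT.get(current)
        if parent:
            links.append(parent)                                 # back a step
            links += [s for s in TREE[parent] if s != current]   # siblings
        links += TREE.get(current, [])                           # down a step
        return links

    print(one_step_links('Root'))     # ['Parent1', 'Parent2']
    print(one_step_links('Parent1'))  # ['Root', 'Parent2', 'Child1', 'Child2']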
> Now, that's my opinion only: I'd say they'll consider following up to
> 100 (150, whatever) links per page, regardless of where those links sit
> relative to the nofollowed ones, be that before, after or between them.
Just for the record, we can process more than 100 links per page :-).
We do however recommend the limit of 100 because it generally makes
sense for users (and search engines).
At any rate, I like how you did the CSS menu and I think the use of
rel=nofollow there is a great idea (good stuff, Luzie!). Since we can
find the lower level pages from the higher level ones, you should be
fine. As Autocrat mentioned, it sometimes helps us (and users) when the
higher level categories are easy to find, which is something your menu
does naturally for users -- and with the rel=nofollow search engines
can spot that as well.