Thanks, that clears things up a lot.
I'll change them all to "#!".
I knew that if I used "?" the user would need page refreshes, but that
would only be seen if the user has JavaScript disabled. I figured the
search-engine-crawlable version could double as a "JavaScript-less"
version of the site.
I guess I can still do that: use "#!" for the links when
"_escaped_fragment_" is in the URL, and "?" if it isn't but
JavaScript is disabled.
(If JavaScript is enabled, "?"s can be changed to "#!"s as long as
_escaped_fragment_ isn't there.)
I think that will work.
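The link-selection logic above can be sketched as a small helper. This is just an illustration, not GWT code; the function names are my own. It assumes Google's AJAX-crawling convention, where a crawler requests `/page?_escaped_fragment_=state` in place of the pretty URL `/page#!state`:

```typescript
// Decide which link style to emit, per the scheme described above.
// "hashbang" = links like "/page#!state"; "query" = links like "/page?state".
function chooseLinkStyle(
  url: string,          // the URL of the current request
  jsEnabled: boolean    // whether JavaScript is available on the client
): "hashbang" | "query" {
  // Crawler request: serve "#!" links so the pretty URLs get indexed.
  if (url.includes("_escaped_fragment_=")) return "hashbang";
  // Normal user with JS: "#!" links, handled client-side without refreshes.
  if (jsEnabled) return "hashbang";
  // JS disabled: fall back to "?" links, which work with full page loads.
  return "query";
}

// Illustrative rewrite of a "?"-style link to its "#!" equivalent,
// which the JS-enabled page could apply to its anchors.
function toHashbang(queryUrl: string): string {
  const [base, query = ""] = queryUrl.split("?");
  return query ? `${base}#!${query}` : base;
}
```

The point of the first branch is that crawler-served pages must link with "#!" even though the crawler itself arrived via "?_escaped_fragment_="; otherwise the "?" URLs get indexed and the site looks like two separate copies.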
I certainly don't want any crawlers treating the two versions of the
site as separate, since page-for-page the content will be the same.