Dave Smart
Apr 29, 2024, 5:45:07 AM
to Chrome UX Report (Discussions), barryp...@google.com, psabha...@gmail.com, Dave Smart
Thanks too for that follow-up!
Yeah, definitely understood re the param stripping.
The blocked-by-robots.txt one is interesting to me. Since robots.txt is the only real method to control crawling, it's far from uncommon, or bad practice, for folks to block things like filtered/faceted navigation on ecommerce sites to prevent wasted crawling (noindex etc. still require the page to be crawled to be seen).
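For context, a typical (illustrative, not from any specific site) robots.txt pattern for this looks something like the following, blocking any URL with a query string while leaving the base category pages crawlable:

```
User-agent: *
# Block crawling of all parameterised URLs (e.g. filtered/faceted navigation)
Disallow: /*?
```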
These may or may not be params, but very often are, so:
example.com/shoes?colour=red&size=10
example.com/shoes?colour=blue&size=12
and so on. So if robots.txt isn't stopping CrUX eligibility, site owners might not know that example.com/shoes includes visits to those filtered variants. Often those variants aren't as well cached and primed, so they may actually be performing worse, or even the other way around, performing better as there's less product to return.
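To make the aggregation concrete, here's a minimal sketch of how query-param stripping would collapse those filtered variants onto the base URL. This assumes CrUX keys page data by the URL without its query string, per the discussion above; the function name is illustrative only:

```python
# Sketch: query-param stripping collapses filtered variants onto one key.
from urllib.parse import urlsplit, urlunsplit
from collections import Counter

def strip_params(url: str) -> str:
    """Drop the query string and fragment, keeping scheme, host and path."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

visits = [
    "https://example.com/shoes?colour=red&size=10",
    "https://example.com/shoes?colour=blue&size=12",
    "https://example.com/shoes",
]

# All three visits aggregate under the same key.
print(Counter(strip_params(u) for u in visits))
# → Counter({'https://example.com/shoes': 3})
```

So metrics reported for example.com/shoes would blend the plain page with its filtered variants, which is exactly why the caching/performance differences mentioned above could go unnoticed.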
Definitely seems like a good question for the Search team to see if there's some guidance on how they report eligibility to CrUX in those circumstances!