Core Web Vitals report in GSC - using lab data instead of field?


Maria Ibarra

Nov 5, 2020, 6:21:26 PM
to Chrome UX Report (Discussions)
Hello! 

It's my understanding that the Core Web Vitals report in GSC uses field data from the CrUX report.
The puzzle I'm trying to understand is the following:
  • PSI field data shows a good LCP of 1.6s and says that the page passes the Core Web Vitals assessment. 
  • PSI lab data shows a poor LCP of 4.0s. 
PSI test results here.
The Core Web Vitals report in GSC is bucketing this and other URLs in the "poor" category, citing that these pages have a poor LCP >4.0s. My own explanation is that GSC is using the lab data instead of the field data for some reason. Is this possible, or could this be a bug?

We ran validation twice, but both attempts failed, and it's been well over 28 days. Any ideas what the problem could be and how to solve it?

Thanks in advance!
Maria 

Rick Viscomi

Nov 5, 2020, 7:45:30 PM
to Chrome UX Report (Discussions), Maria Ibarra
Hi Maria,

You're correct that Search Console's Core Web Vitals report uses field data from CrUX:

> The data for the Core Web Vitals report comes from the CrUX report. The CrUX report gathers anonymized metrics about performance times from actual users visiting your URL (called field data).
I think what's causing the misalignment is how GSC groups URLs together:

> An issue is assigned to a group of URLs that provide a similar user experience. This is because it is assumed that performance issues in similar pages is probably due to the same underlying problem, such as a common slow-loading feature in the pages.

So GSC will group all pages similar to /senior-care and assess them by their aggregated LCP experience: "Agg LCP (aggregated LCP) shown in the report is the time it takes for 75% of the visits to a URL in the group to reach the LCP state."

Because this is a group-level aggregation, it may sometimes differ from the URL-level aggregation used by PSI, as you're seeing in this case.

To fix the LCP issue that GSC is reporting, I'd recommend testing some of the other sample URLs in the same group with poor aggregate LCP. There seem to be one or more pages dragging the group's assessment down, most likely popular pages with more influence over the aggregate experience. There should be about 20 other sample URLs listed, so I would expect most of them to also be individually assessed as having poor LCP in PSI, similar to the group-level assessment given by GSC.
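To make the group-level aggregation more concrete, here's a toy sketch in Python. The paths, visit counts, and LCP values are all made up, and it's a simplified model of the aggregation rather than the exact Search Console implementation, but it shows how one popular slow page can pull a group's Agg LCP into the poor range even when most URLs in the group are individually fine:

# Toy model: each URL in the group contributes its visits to one pooled
# distribution, and the group's Agg LCP is taken as the 75th percentile
# of that pool. All paths and numbers below are hypothetical.
group = [
    # (path, visits, typical LCP in seconds)
    ("/senior-care",          1_000, 1.6),
    ("/senior-care/rates",      800, 1.8),
    ("/senior-care/near-me",    600, 2.1),
    ("/senior-care/guide",    4_000, 4.3),  # popular outlier
]

# Pool every visit's LCP and take the 75th percentile across the group.
experiences = sorted(lcp for _, visits, lcp in group for _ in range(visits))
agg_lcp = experiences[int(0.75 * (len(experiences) - 1))]
print(f"Group Agg LCP ~ {agg_lcp:.1f}s")  # ~4.3s ("poor"), though 3 of 4 URLs are good

In that toy example, three of the four URLs have good LCP on their own, but the fourth gets the most traffic, so the pooled 75th percentile lands above 4s and the whole group is flagged as poor.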

I'm especially interested in making the transition from GSC to PSI smoother, so please let me know if you have any thoughts on the current implementation or ways we could improve it. I'd also be interested to hear if you find that many or most of the sample URLs are not actually reported as having poor LCP by PSI.


Thanks,

Rick

Maria Ibarra

Nov 6, 2020, 1:17:40 PM
to Chrome UX Report (Discussions), Rick Viscomi, Maria Ibarra
Hi Rick,

Many thanks for answering so quickly and being so helpful!

I checked the 20 sample URLs listed under the same poor LCP bucket, and only one of them showed a poor LCP of 4.3s. All the others seem to be good or borderline around 2.5s. I wonder if there is a way to check more of the 1,000 similar URLs in the same group? We can improve those pages, but we just don't know which they are, and I wouldn't have thought that only one page would drag down the whole group's assessment. 

What's strange to me is that the problematic page is an international URL (/en-au/) with quite a different experience from the other URLs in this group-level assessment. I've noticed that all the other international pages are usually grouped together, except this one.  

About your question on how to improve the transition from GSC to PSI - I think the existing implementation is helpful, as it provides a link to audit each URL with PSI. What would make the experience delightful would be to surface the individual metric being evaluated for each of the 20 URLs in the group, rather than just the high-level Agg. metric (in my case, it would have saved me 20 clicks through to PSI and opening so many tabs!).

When a validation fails in the Core Web Vitals report, I often need to audit a few random URLs from the group to determine which ones did not benefit from our perf improvements (lots of clicks). In theory, the Agg. metric would be enough, but as we just saw in the case above, the results sit in different parts of the distribution, and finding the one URL with the problem isn't easy. 

I'm happy to provide more feedback as long as it helps to make the web faster :)

Thanks again,
Maria

Rick Viscomi

Nov 6, 2020, 1:53:12 PM
to Chrome UX Report (Discussions), maria....@care.com, Rick Viscomi
Thanks Maria, this is excellent feedback and I'll share it with the relevant teams. I'd love to hear any other feedback about your experience; it's really helpful to document these edge cases and find ways to make our tools better.

> I wonder if there is a way to check more of the 1,000 similar URLs in the same group? We can improve those pages but we just don't know what they are, and I wouldn't think that only one page will drag down the whole group's assessment.

I'd suggest using your analytics data to get a list of the most popular pages and evaluating their LCP in PSI one by one. It's possible that one of these pages gets so much traffic that it's pulling the group's Agg LCP into the slow category, despite most pages having good LCP performance. That's currently the best option, as Search Console only gives a sample of URLs in the group and not the full list.
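If you'd rather script that than click through PSI for each page, here's a rough sketch that queries URL-level field data from the CrUX API, which is where PSI's field data comes from. It assumes Python with the requests library; the API key and URLs are placeholders, and error handling is minimal:

import requests

API_KEY = "YOUR_CRUX_API_KEY"  # placeholder; create one in the Google Cloud console
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

# Replace with the popular URLs pulled from your analytics data.
candidate_urls = [
    "https://www.example.com/senior-care",
    "https://www.example.com/en-au/senior-care",
]

for url in candidate_urls:
    resp = requests.post(ENDPOINT, json={
        "url": url,
        "metrics": ["largest_contentful_paint"],
    })
    if resp.status_code != 200:
        # The API returns an error (typically 404) when a URL doesn't have
        # enough field data to be reported on its own.
        print(f"{url}: no URL-level CrUX data")
        continue
    p75_ms = float(resp.json()["record"]["metrics"]
                   ["largest_contentful_paint"]["percentiles"]["p75"])
    verdict = "good" if p75_ms <= 2500 else "poor" if p75_ms > 4000 else "needs improvement"
    print(f"{url}: p75 LCP = {p75_ms:.0f} ms ({verdict})")

A missing record for a URL isn't necessarily bad news either: it usually just means the page doesn't get enough traffic to have its own field data, which also makes it unlikely to be the one dominating the group.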


Rick

Maria Ibarra

Nov 6, 2020, 4:23:11 PM
to Chrome UX Report (Discussions), Rick Viscomi, Maria Ibarra
Thanks Rick, I'll try that and see what I can find.

Best,
Maria

Ishan Anand

Nov 6, 2020, 5:58:53 PM
to Maria Ibarra, Chrome UX Report (Discussions), Rick Viscomi
Rick, one related piece of feedback. My experience has been that, on large sites, the quality of GSC's grouping by similarity is hit or miss. Sometimes it makes sense, but other times it associates pages that are clearly different. Having only 20 sample URLs is a real problem, especially in a large org where different teams are responsible for different sections of the site. We really need the full list in GSC, at least as a CSV/XLS export. Then it would be a simple exercise to know which team is responsible.



--
Ishan Anand
Moovweb
340 Pine Street, Suite 400
San Francisco, CA 94104
Cell: +1-415-335-6094 (sms/texts are welcome)

Rick Viscomi

Nov 9, 2020, 2:14:03 PM
to Chrome UX Report (Discussions), Ishan Anand, Chrome UX Report (Discussions), Rick Viscomi, maria....@care.com
Thanks for the feedback, Ishan; this is very helpful.