GSC, CrUX Dashboard, PageSpeed Insights - which is the right source?


Drew Ailes

May 12, 2021, 6:23:19 PM
to Chrome UX Report (Discussions)
I'm just making sure I'm following, as for some reason Google makes this process oddly convoluted for measuring something that has been expressed as being so...core...and...vital to the web experience:

- We have the Core Web Vitals, which contribute to a Performance Score, but the weighting of the metrics changes over time

- We have the lab data from Lighthouse/Pagespeed Insights

- We have the CrUX metrics that Google Search Console uses, but it reports aggregate scores for groups of similar pages

- We have the CrUX Dashboard in Data Studio, which reports at the URL level, but the individual metric views differ from the rolled-up scores on the first page

Is this right? If so, I get that everything I've read says we should use CrUX, but I'm at a loss as to what I'm supposed to use for reporting without presenting conflicting sources of information.

Does anyone have any input?

Thank you.

Ziemek Bućko

May 13, 2021, 4:36:52 AM
to Chrome UX Report (Discussions), andrewro...@gmail.com
I think you got most of that right. But let me just go through those points again and maybe clear some things up along the way:

- Core Web Vitals are performance metrics. They do contribute to the Performance Score in both PageSpeed Insights and Lighthouse, and the metric weightings in that score do change every now and then.
 Core Web Vitals measured in the field (gathered from Chrome users who meet certain data-sharing conditions) are stored in the CrUX database, and that's the data Google is planning to use in search ranking. The Performance Score doesn't play a role here.
 Not every website will have the field data available in CrUX because there are certain traffic thresholds you need to meet to be included.
 CrUX via BigQuery contains detailed origin-level data, but you can access page-level data via PSI API, CrUX API, or the CrUX Dashboard (I think? I haven't used that in ages)
- You can also simulate Core Web Vitals in a lab environment - a machine with a predetermined connection speed and computing power opens the page and measures its performance, including CWV. That's what you see in Lighthouse and PageSpeed Insights (although PSI also shows some field data from CrUX)
- Google Search Console shows a very high-level overview of CrUX data, yes.
- Again, haven't used it in ages but that seems correct.
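
Since the PSI/CrUX APIs came up: here's a minimal sketch of pulling page-level field data from the CrUX API and reducing one metric to the numbers people usually report. The endpoint and response shape follow the public CrUX API docs; `YOUR_API_KEY` is a placeholder, and the sample response below is invented for illustration, not real data.

```python
import json
from urllib.request import Request, urlopen

CRUX_ENDPOINT = "https://chromeuxreport.googleapis.com/v1/records:queryRecord"

def query_crux(api_key, url, form_factor="PHONE"):
    """POST a page-level query to the CrUX API (requires a real API key)."""
    body = json.dumps({"url": url, "formFactor": form_factor}).encode()
    req = Request(f"{CRUX_ENDPOINT}?key={api_key}", data=body,
                  headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.load(resp)

def summarize_metric(metric):
    """Reduce one CrUX metric record to (good %, p75).

    The first histogram bin is the 'good' bucket (e.g. LCP under 2500 ms),
    and the bin densities are fractions that sum to ~1.0.
    """
    good = metric["histogram"][0]["density"]
    return round(good * 100, 2), metric["percentiles"]["p75"]

# An invented response in the real API's shape, for illustration only:
sample = {
    "record": {
        "key": {"url": "https://example.com/", "formFactor": "PHONE"},
        "metrics": {
            "largest_contentful_paint": {
                "histogram": [
                    {"start": 0, "end": 2500, "density": 0.5721},
                    {"start": 2500, "end": 4000, "density": 0.28},
                    {"start": 4000, "density": 0.1479},
                ],
                "percentiles": {"p75": 3100},
            }
        },
    }
}

lcp = sample["record"]["metrics"]["largest_contentful_paint"]
print(summarize_metric(lcp))  # (57.21, 3100)
```

The same `summarize_metric` helper works on the PSI API's `loadingExperience` field data, since it uses the same histogram-plus-percentiles shape.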

The key thing to understand is the difference between lab and field data (shameless plug here, sorry). Field data gives you a more realistic view of how your users actually experience your website, and it's what Google will use for search. BUT lab data is what you need when debugging, simply because you can run a lab test on demand and don't have to wait for real user data to come in.

For reporting, you need to use whatever looks better ;p I'm assuming you mean reporting to the business side - field data is what ultimately represents your performance as seen by your users and will correlate well with business metrics, so as someone with business KPIs that's what I'd wanna see.
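
One concrete, stable number you can report from field data: whether each metric's 75th percentile clears the published "good" thresholds. That p75-against-threshold check is the assessment the CrUX-based tools apply; the sample inputs below are invented.

```python
# Published "good" thresholds for the 2021 Core Web Vitals
# (LCP and FID in milliseconds, CLS is unitless).
GOOD = {"lcp": 2500, "fid": 100, "cls": 0.1}

def passes_cwv(p75s):
    """True when every metric's 75th percentile is in the 'good' range.

    p75s: dict like {"lcp": 2400, "fid": 80, "cls": 0.05}.
    CWV tools assess the 75th percentile of field data, not the mean.
    """
    return all(p75s[m] <= GOOD[m] for m in GOOD)

print(passes_cwv({"lcp": 2400, "fid": 80, "cls": 0.05}))  # True
print(passes_cwv({"lcp": 3100, "fid": 80, "cls": 0.05}))  # False (LCP too slow)
```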

Rick Viscomi

May 13, 2021, 5:04:05 PM
to Chrome UX Report (Discussions), zie...@onely.com, andrewro...@gmail.com
Could you give an example of the conflicting information you're seeing? Based on your description I think I understand what's causing your frustration but it'd be good to have a concrete example to work with.


Rick

Drew Ailes

May 24, 2021, 5:32:56 PM
to Chrome UX Report (Discussions), rvis...@google.com, andrewro...@gmail.com
So on the note of that last bullet about the Data Studio dashboard, I'm having trouble understanding why the metrics don't show the same information in the Core Web Vitals (main) section as in the individual metric reports. For example, in the screenshots they both seem to be the same month and same device, but the CWV page says LCP is 64.39% and the LCP page says it's 57.21%. The same thing happens with the other metrics. What am I missing here? Why are they different?


Sorry I've gotta use those obnoxious links, work computer prevents me from uploading images here.

Rick Viscomi

May 25, 2021, 12:25:35 AM
to Chrome UX Report (Discussions), Drew Ailes
Hi Drew,

I think this discrepancy happens when you deselect the latest value from the Month picker on the CWV page. Data Studio attempts to aggregate all months together, which produces a less-than-helpful average distribution. For example, here's the CWV page set to April 2021 with good phone LCP at 57.21% as expected:

[Screenshot: Screen Shot 2021-05-25 at 12.12.56 AM.png]
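
To make the aggregation effect concrete, here's a toy illustration (the monthly numbers are invented, not Drew's actual data, and a simple mean stands in for whatever blending Data Studio does): rolling several months together can report a noticeably different "good" share than the latest month alone.

```python
# Invented monthly "good LCP" densities for one origin (fraction of
# page loads with LCP under 2.5 s). Only April is the latest month.
months = {
    "Feb 2021": 0.70,
    "Mar 2021": 0.66,
    "Apr 2021": 0.5721,
}

# Selecting only the latest month reports that month's share:
latest = months["Apr 2021"]

# Deselecting the month filter makes the tool blend all months together
# (a simple mean here; the dashboard's exact weighting may differ):
blended = sum(months.values()) / len(months)

print(f"Apr only: {latest:.2%}, all months blended: {blended:.2%}")
# Apr only: 57.21%, all months blended: 64.40%
```

So two views of the "same" origin and device can legitimately disagree if one of them is silently averaging across months.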

I think part of the confusion had to do with the "Month" label next to the origin saying "Apr 2021" even though you had no months selected. I don't have a lot of customization options, but I've just edited the dashboard to default to the oldest month rather than the newest one to at least raise some red flags that something is off:

[Screenshot: Screen Shot 2021-05-25 at 12.23.29 AM.png]

Happy to look into any other examples of discrepancies between tools.


Rick

Drew Ailes

May 25, 2021, 3:32:22 PM
to Chrome UX Report (Discussions), rvis...@google.com, Drew Ailes
Thanks Rick, this is a big help. 