This article explains why you might see score and metric differences between PageSpeed Insights and other tools.
You can run Lighthouse in many different environments:
- PageSpeed Insights and web.dev
- Monitoring tools like DebugBear
- On your local device using Chrome DevTools or the command-line interface
Each of these environments has its own device characteristics and set of configuration options.
If you live in the UK, opening a website that's hosted on a server in London will be faster than opening one that's hosted in New York.
PageSpeed Insights picks the server to run the test from based on your current location. It uses one of four locations:
- Northwestern US (Oregon)
- Southeastern US (South Carolina)
- Northwestern Europe (Netherlands)
- Asia (Taiwan)
web.dev tests used to always run in the US, but now match the PageSpeed Insights behavior.
DebugBear tests pages from a server in South Carolina by default, but can run tests from 10+ locations.
If you run Lighthouse on your own computer, then the test results will always show how a user in your location would experience the website.
To achieve realistic and consistent results, Lighthouse uses a throttled network connection to test pages. For mobile devices, this connection uses a bandwidth of 1.6 Mbps and 150ms server round-trip time.
However, there are multiple different ways of throttling the connection.
PageSpeed Insights loads the page on a fast connection without any throttling, and then simulates how the page might have loaded on a slower connection. This is the default setting for Lighthouse.
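When running Lighthouse yourself, the throttling behavior can be controlled through its settings object (Node API or config file). A minimal sketch, using values matching Lighthouse's documented mobile defaults; treat the exact numbers as illustrative, since they can change between versions:

```javascript
// Sketch of a Lighthouse config file controlling throttling behavior.
// Values mirror Lighthouse's mobile defaults at the time of writing.
const settings = {
  throttlingMethod: 'simulate', // 'simulate' (default), 'devtools', or 'provided'
  throttling: {
    rttMs: 150,               // simulated round-trip time in milliseconds
    throughputKbps: 1638.4,   // simulated bandwidth, ~1.6 Mbps
    cpuSlowdownMultiplier: 4, // simulated CPU slowdown factor
  },
};

module.exports = { extends: 'lighthouse:default', settings };
```

Switching `throttlingMethod` to `'devtools'` gives the browser-level applied throttling described below.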
If you run Lighthouse in Chrome DevTools, you can choose between simulated throttling and browser-level "applied" throttling, where the browser introduces a delay to each network response.
Other tools like DebugBear throttle the local network connection at the operating system level, delaying each network packet as it arrives on the test device. This is more realistic, but tests also take more time and show more variance between individual test results.
Every time Lighthouse tests a page it runs a very simple CPU benchmark. The DebugBear servers reach a score around 600, while PageSpeed Insights reports a score around 800. So the PSI servers are slightly faster, resulting in better performance scores.
You can see the CPU benchmark at the bottom of the Lighthouse report.
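The benchmark score is also stored in the Lighthouse result JSON, under `environment.benchmarkIndex`. A sketch using the illustrative scores mentioned above:

```javascript
// The CPU benchmark score appears in the Lighthouse result JSON as
// environment.benchmarkIndex. Fragments with the scores mentioned above:
const debugbearRun = { environment: { benchmarkIndex: 600 } };
const psiRun = { environment: { benchmarkIndex: 800 } };

// A higher benchmark index means a faster test machine.
for (const [name, run] of [['DebugBear', debugbearRun], ['PSI', psiRun]]) {
  console.log(`${name} benchmark index: ${run.environment.benchmarkIndex}`);
}
```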
The Chrome version used to run the test can affect performance as well. For example, newer versions may include performance improvements that make them faster.
The way performance metrics are measured by browsers also changes over time. Chrome provides a changelog showing how the definitions of First Contentful Paint, Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift have changed over time.
As of April 2021 PSI uses Chrome 88 while DebugBear uses Chrome 91.
Over time, Lighthouse changes how tests are run, and the extent to which different metrics contribute to the overall Performance score changes as well.
As of April 2021, PageSpeed Insights uses Lighthouse 7.1, and DebugBear uses Lighthouse 7.2.
Lighthouse provides a number of configuration options that can impact metrics. For example, you can control when Lighthouse ends a test by controlling timeout settings.
If the page doesn't render or finish loading within a certain amount of time, Lighthouse gives up and finishes the page analysis. In some cases, this happens before the page has actually finished loading. In other cases the page finishes loading quickly, but Lighthouse doesn't always correctly detect this, for example when there is ongoing CPU activity.
PageSpeed Insights and DebugBear use different timeout settings:
| Waiting for... | PageSpeed Insights | DebugBear |
| --- | --- | --- |
| First Contentful Paint | 15 s | 30 s |
| Load | 35 s | 60 s |
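If you run Lighthouse yourself, these cutoffs correspond to settings in its config; a sketch using the DebugBear-style values from the table above (setting names follow the Lighthouse source and may change between versions):

```javascript
// Sketch: raising Lighthouse's give-up timeouts via its settings object,
// using the DebugBear column of the table above.
const settings = {
  maxWaitForFcp: 30 * 1000,  // abort if no First Contentful Paint after 30 s
  maxWaitForLoad: 60 * 1000, // abort if the load event hasn't fired after 60 s
};

module.exports = { extends: 'lighthouse:default', settings };
```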
Note that because PSI uses simulated throttling, pages load faster during the actual test run, so it can generally get away with shorter timeouts than a tool like DebugBear that actually slows down the page load. For the same reason, PageSpeed Insights also waits a shorter amount of time before concluding that a page has finished loading.
The longer thresholds used on DebugBear also apply if running Lighthouse in DevTools with the Simulated throttling option disabled.
| Threshold before test finishes | PageSpeed Insights | DebugBear / Non-simulated |
| --- | --- | --- |
| Pause after FCP | 1.00 s | 5.25 s |
| Pause after Load event | 1.00 s | 5.25 s |
| Network quiet threshold | 1.00 s | 5.25 s |
| CPU quiet threshold | 1.00 s | 5.25 s |
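These wait thresholds are also exposed as Lighthouse settings (names follow the Lighthouse source; treat them as version-dependent). A sketch mirroring the non-simulated column above:

```javascript
// Sketch: the wait thresholds Lighthouse applies before ending a test,
// set to the non-simulated values from the table above.
const settings = {
  pauseAfterFcpMs: 5250,          // keep recording for 5.25 s after FCP
  pauseAfterLoadMs: 5250,         // keep recording for 5.25 s after the load event
  networkQuietThresholdMs: 5250,  // require 5.25 s without network activity
  cpuQuietThresholdMs: 5250,      // require 5.25 s without long CPU tasks
};

module.exports = { extends: 'lighthouse:default', settings };
```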
If you run Lighthouse in Chrome DevTools, Chrome extensions can impact your Performance scores. Use Incognito mode or a new Chrome profile to run tests with extensions disabled.
What are the right values?
Which tool takes the best measurements: PageSpeed Insights, a local Lighthouse run, or a hosted tool like DebugBear?
Lab-based testing can only ever generate a snapshot of how your page behaves in a certain environment. How well that snapshot reflects user experience depends on what the network and device of your users look like.
If you want to find out how fast your site is for real users you need to capture performance data for real users.
Debugging Lighthouse score discrepancies
As mentioned earlier, Lighthouse calculates simulated performance metrics by default. If you see discrepancies between tools, it can be useful to look at the raw metric values Lighthouse collected from Chrome. Lighthouse refers to these as observed metrics.
- Open the Lighthouse HTML report (you'll have to click View Report on web.dev)
- Open the DevTools console
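The report page exposes its full results JSON as a global, and the metrics audit inside it holds both the simulated (reported) and the observed values. A sketch of what to run and of the object's shape; the numbers here are illustrative:

```javascript
// In the report's DevTools console, the full results JSON is available as
// the global __LIGHTHOUSE_JSON__, so you can run:
//   __LIGHTHOUSE_JSON__.audits.metrics.details.items[0]
// An illustrative fragment of the object this returns:
const metrics = {
  firstContentfulPaint: 1700,         // simulated FCP in ms (the reported value)
  observedFirstContentfulPaint: 1800, // FCP actually observed on the unthrottled load
};

console.log(`Simulated FCP: ${metrics.firstContentfulPaint} ms`);
console.log(`Observed FCP:  ${metrics.observedFirstContentfulPaint} ms`);
```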
In this example, the unthrottled observed FCP is greater than the simulated FCP, with values of 1.8s and 1.7s, respectively. This suggests that the simulation is underestimating the real FCP value, as even on a fast connection it took 1.8s for the page to start rendering.
Why would Lighthouse underreport the First Contentful Paint? In this case, the page contained a large number of unnecessary preload tags. While these tags hurt performance in Chrome, the Lighthouse simulation does not capture their impact with complete accuracy.
If you use `--throttling-method devtools` or `--throttling-method provided`, the observed metrics will be the same as the reported ones, as Lighthouse does not run the simulation.
Lab vs field data
Finally, performance metrics can be collected in a lab tool like Lighthouse, or from real users. PageSpeed Insights reports both field data from the Chrome User Experience Report and the results from an ad-hoc Lighthouse test.
The field data will often be noticeably faster than the lab data, for example for the BBC homepage.
In this example, the First Contentful Paint measured in the field is 1.3s, while the lab-based Lighthouse test reports an FCP of 6.9s.
The reason for this is that Lighthouse simulates a device and network connection that's significantly slower than what real users of the BBC homepage are using. Lighthouse simulates a 3G connection with a 150ms round-trip time, but many mobile users are on a wifi connection with less than 10ms of latency.