Why is my Lighthouse score different from PageSpeed Insights?

21 Aug 2019 – updated 31 Mar 2021

This article explains why you might see score and metric differences between PageSpeed Insights and other tools.

You can run Lighthouse in many different environments:

  • PageSpeed Insights and web.dev
  • Monitoring tools like DebugBear
  • On your local device using Chrome DevTools or the command-line interface

Each of these environments has its own device characteristics and set of configuration options.

Test locations

If you live in the UK, opening a website that's hosted on a server in London will be faster than opening one that's hosted in New York.

PageSpeed Insights picks the server to run the test from based on your current location. It uses one of four locations:

  • Northwestern US (Oregon)
  • Southeastern US (South Carolina)
  • Northwestern Europe (Netherlands)
  • Asia (Taiwan)

Map showing the 4 PSI test locations

web.dev tests are always run from a server in the US.

DebugBear tests pages from a server in South Carolina by default, but can run tests from 10+ locations.

If you run Lighthouse on your own computer, then the test results will always show how a user in your location would experience the website.

Network throttling

To achieve realistic and consistent results, Lighthouse uses a throttled network connection to test pages. For mobile devices, this connection uses a bandwidth of 1.6 Mbps and 150ms server round-trip time.
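In Lighthouse's configuration, this mobile profile corresponds to a throttling settings object along these lines (the field names match Lighthouse's throttling settings; exact values can vary between versions, so treat this as a sketch):

```javascript
// Sketch of Lighthouse's default mobile throttling profile,
// matching the 1.6 Mbps / 150 ms figures above.
const throttling = {
  rttMs: 150,               // simulated round-trip time in milliseconds
  throughputKbps: 1638.4,   // ~1.6 Mbps download bandwidth
  cpuSlowdownMultiplier: 4, // the CPU is throttled too (see "CPU performance")
};
```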

However, there are multiple different ways of throttling the connection.

PageSpeed Insights loads the page on a fast connection without any throttling, and then simulates how the page might have loaded on a slower connection. This approach, called simulated throttling, is the default setting for Lighthouse.

If you run Lighthouse in Chrome DevTools you can choose between simulated throttling and browser-level "applied" throttling, where the browser introduces a delay to each network response.

Other tools like DebugBear throttle the local network connection at the operating system level, delaying each network packet as it arrives on the test device. This is more realistic, but tests also take more time and show more variance between individual test results.
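To get an intuition for why simulated results can diverge from a real throttled load, here is a deliberately simplified toy model of simulated throttling (our illustration, not Lighthouse's actual simulator): estimate load time from the number of sequential round trips plus the time to transfer the page's bytes at the throttled bandwidth.

```javascript
// Toy model of simulated throttling: latency cost plus transfer cost.
// This is a simplification for illustration, not Lighthouse's simulator.
function estimateLoadTimeMs({ roundTrips, totalBytes, rttMs, bandwidthKbps }) {
  const latencyMs = roundTrips * rttMs;
  // 1 kbps = 1 bit per millisecond, so bits / kbps yields milliseconds
  const transferMs = (totalBytes * 8) / bandwidthKbps;
  return latencyMs + transferMs;
}

// Example: 5 sequential round trips, 500 KB of resources,
// on the 1.6 Mbps / 150 ms mobile profile described above.
const estimate = estimateLoadTimeMs({
  roundTrips: 5,
  totalBytes: 500 * 1024,
  rttMs: 150,
  bandwidthKbps: 1638.4,
}); // ≈ 750 ms of latency + 2500 ms of transfer ≈ 3250 ms
```

A real throttled load would also be affected by connection warm-up, server think time, and CPU contention, which is one reason applied and packet-level throttling show more variance.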

Read more about the different throttling methods and the impact they have.

CPU performance

If a site runs a lot of JavaScript code or requires complex layout computations, a faster CPU will make the page load faster. The CPU speed will differ between PageSpeed Insights, your local computer, and a monitoring service.

Every time Lighthouse tests a page it runs a very simple CPU benchmark. The DebugBear servers reach a score around 600, while PageSpeed Insights reports a score around 800. So the PSI servers are slightly faster, resulting in better performance scores.

You can see the CPU benchmark at the bottom of the Lighthouse report.
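The idea behind the benchmark is straightforward: run a fixed workload and see how much of it the CPU gets through in a given time window. A minimal sketch in that spirit (not Lighthouse's actual BenchmarkIndex implementation):

```javascript
// Minimal CPU benchmark sketch: count how many iterations of a fixed
// workload complete within a time window. Faster CPUs score higher.
// This illustrates the concept only; Lighthouse's BenchmarkIndex
// uses its own workloads and scaling.
function simpleBenchmark(durationMs = 100) {
  const start = Date.now();
  let iterations = 0;
  while (Date.now() - start < durationMs) {
    let s = '';
    for (let i = 0; i < 1000; i++) s += 'a'; // fixed busywork per iteration
    iterations++;
  }
  return iterations;
}
```

Because the score depends on the hardware it runs on, comparing the benchmark index across reports tells you whether two test environments are even comparable.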

Benchmark index in Lighthouse footer

Chrome version

The Chrome version used can affect performance as well. For example, newer versions may have performance improvements that make them faster.

The way performance metrics are measured by browsers also changes over time. Chrome provides a changelog showing how the definitions of First Contentful Paint, Largest Contentful Paint, First Input Delay, and Cumulative Layout Shift have changed over time.

As of April 2021 PSI uses Chrome 88 while DebugBear uses Chrome 91.

Lighthouse version

Over time, Lighthouse changes how tests are run, and the extent to which different metrics contribute to the overall Performance score changes as well.

As of April 2021, PageSpeed Insights uses Lighthouse 7.1, and DebugBear uses Lighthouse 7.2.

Different settings

Lighthouse provides a number of configuration options that can impact metrics. For example, you can control when Lighthouse ends a test by controlling timeout settings.

If the page doesn't render or finish loading within a certain amount of time, Lighthouse gives up and finishes the page analysis. In some cases this might happen before the page has actually finished loading. In other cases the page might finish loading quickly, but Lighthouse doesn't always detect this correctly, for example if there is ongoing CPU activity.

PageSpeed Insights and DebugBear use different timeout settings:

  Waiting for...            PageSpeed Insights   DebugBear
  First Contentful Paint    15s                  30s
  Load                      35s                  60s

Note that because PSI uses simulated throttling it can generally get away with lower timeouts than a tool like DebugBear that actually slows down the page load.
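If you run Lighthouse yourself, these timeouts can be adjusted through its settings. A sketch using the maxWaitForFcp and maxWaitForLoad settings (these are real Lighthouse setting names, but check your Lighthouse version's documentation for exact defaults and behavior):

```javascript
// Sketch: a Lighthouse config that adjusts the timeout settings.
// Values shown here mirror the DebugBear column in the table above.
const config = {
  extends: 'lighthouse:default',
  settings: {
    maxWaitForFcp: 30 * 1000,  // give up if there's no FCP after 30s
    maxWaitForLoad: 60 * 1000, // give up if load hasn't finished after 60s
  },
};
```

A config like this can be passed to the Lighthouse Node API or saved to a file and referenced with the CLI's --config-path flag.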

Chrome extensions

If you run Lighthouse in Chrome DevTools, Chrome extensions can impact your Performance scores. Use Incognito mode or a new Chrome profile to run tests with extensions disabled.

What are the right values?

Which tool takes the best measurements: PageSpeed Insights, running Lighthouse locally, or a hosted tool like DebugBear?

Lab-based testing can only ever generate a snapshot of how your page behaves in a certain environment. How well that snapshot reflects user experience depends on what the network and device of your users look like.

If you want to find out how fast your site is for real users you need to capture performance data for real users.

Debugging Lighthouse score discrepancies

As mentioned earlier, Lighthouse calculates simulated performance metrics by default. If you see discrepancies between tools, it can be useful to look at the raw metric values Lighthouse collected from Chrome. Lighthouse refers to these as observed metrics.

  1. Open the Lighthouse HTML report (you'll have to click View Report on web.dev)
  2. Open the DevTools console
  3. Run __LIGHTHOUSE_JSON__.audits.metrics.details["items"][0]

The result will look something like this:

{
  "observedFirstContentfulPaint": 1835,
  "largestContentfulPaint": 10035,
  "firstContentfulPaint": 1755,
  "observedLargestContentfulPaint": 2566,
  "cumulativeLayoutShift": 0.36618412272135414,
  ...
}

In this example, the unthrottled observed FCP is greater than the simulated FCP, with values of 1.8s and 1.7s, respectively. This suggests that the simulation is underestimating the real FCP value, as even on a fast connection it took 1.8s for the page to start rendering.

Why would Lighthouse underreport the First Contentful Paint? In this case, the page contained a large number of unnecessary preload tags. While these tags hurt performance in Chrome, the Lighthouse simulation does not capture their impact with complete accuracy.

If you use --throttling-method devtools or --throttling-method provided the observed metrics will be the same as the reported ones, as Lighthouse does not run the simulation.
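To spot discrepancies like this programmatically, you can compare each simulated metric against its observed counterpart. A small sketch, assuming a metrics object shaped like the example above (values in milliseconds):

```javascript
// Compare simulated metrics with their unthrottled observed counterparts
// from a Lighthouse metrics object like the one shown above.
function compareMetrics(metrics) {
  const pairs = [
    ['FCP', metrics.firstContentfulPaint, metrics.observedFirstContentfulPaint],
    ['LCP', metrics.largestContentfulPaint, metrics.observedLargestContentfulPaint],
  ];
  return pairs.map(([name, simulated, observed]) => ({
    name,
    simulated,
    observed,
    // If even the unthrottled observed value exceeds the simulated one,
    // the simulation is likely underestimating that metric.
    simulationTooOptimistic: observed > simulated,
  }));
}

const result = compareMetrics({
  firstContentfulPaint: 1755,
  observedFirstContentfulPaint: 1835,
  largestContentfulPaint: 10035,
  observedLargestContentfulPaint: 2566,
});
// For FCP, observed 1835 ms > simulated 1755 ms, flagging the
// underestimation discussed above.
```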

Lab vs field data

Finally, performance metrics can be collected in a lab tool like Lighthouse, or from real users. PageSpeed Insights reports both field data from the Chrome User Experience Report and the results from an ad-hoc Lighthouse test.

The field data will often be noticeably faster than the lab data, for example for the BBC homepage.

PSI field and lab data discrepancies

In this example, the First Contentful Paint measured in the field is 1.3s, while the lab-based Lighthouse test reports an FCP of 6.9s.

The reason for this is that Lighthouse simulates a device and network connection that's significantly slower than what real visitors to the BBC homepage are using. Lighthouse simulates a 3G connection with a 150ms round-trip time, but many mobile users are on a Wi-Fi connection with less than 10ms of latency.

DebugBear is a website monitoring tool built for front-end teams. Track performance metrics and Lighthouse scores in CI and production. Learn more.
