Why is my Lighthouse score different from PageSpeed Insights?

21 Aug 2019 – updated 7 Nov 2020

This article will explain why you might see score and metric differences between PageSpeed Insights (or web.dev) and other tools.

Other tools might be:

  • Running Lighthouse on your local machine
  • Monitoring your page with DebugBear
  • Using other Lighthouse-based monitoring tools

Client locations

If you live in the UK, opening a website that's hosted in London will be faster than opening one that's hosted in New York.

PageSpeed Insights picks the server to run the test from based on your current location. There are four locations:

  • Northwestern US (Oregon)
  • Southeastern US (South Carolina)
  • Northwestern Europe (Netherlands)
  • Asia (Taiwan)

DebugBear tests pages from a server in South Carolina by default.

web.dev tests are always run from a server in the US.

Network throttling

To achieve realistic and consistent results, Lighthouse uses a throttled network connection to analyze pages. However, there are different ways of throttling the connection.

PageSpeed Insights loads the page without any throttling, and then simulates how the page might have loaded on a slower connection.

DebugBear throttles the local network connection and then loads the page. This is more realistic, but also takes more time and increases variance between tests.

Read more about the different throttling methods and the impact they have.
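The difference between the two approaches can be illustrated with a toy model. This is not Lighthouse's actual simulator — the formula, the page values, and the throughput numbers below are simplified assumptions — but it shows the idea: simulated throttling observes an unthrottled load and then estimates what it would have cost on a slow connection, while applied throttling slows the network down before the page loads.

```javascript
// Toy model of simulated throttling (NOT Lighthouse's real algorithm):
// estimate the load time on a slow connection from unthrottled observations.
function estimateThrottledLoadMs(observed, network) {
  // Time to push the observed bytes through the throttled link...
  const transferMs = ((observed.totalBytes * 8) / network.throughputBps) * 1000;
  // ...plus one round trip per server connection.
  const latencyMs = observed.connections * network.rttMs;
  return transferMs + latencyMs;
}

// Hypothetical unthrottled observations for a page.
const observed = { totalBytes: 500 * 1024, connections: 4 };

// Roughly Lighthouse's default mobile preset: ~1.6 Mbps down, 150 ms RTT.
const slow4g = { throughputBps: 1.6 * 1024 * 1024, rttMs: 150 };

const estimate = estimateThrottledLoadMs(observed, slow4g);
console.log(Math.round(estimate)); // a few seconds, even if the unthrottled load was fast
```

Because applied throttling actually waits out those seconds on every request, it takes longer per test run — which is the trade-off described above.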

CPU performance

If a site runs a lot of JavaScript code or requires complex layout computations, a faster CPU will make the page load faster.

Every time Lighthouse analyzes a page it runs a simple CPU benchmark. The DebugBear servers reach a score of around 540, while PageSpeed Insights reports a score of around 660, so the PSI servers are slightly faster.
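The idea behind such a benchmark can be sketched in a few lines: count how much work the CPU gets through in a fixed time window. This is illustrative only — Lighthouse's real benchmark is computed differently, so scores from this sketch are not comparable to the 540/660 figures above.

```javascript
// Minimal CPU benchmark sketch (not Lighthouse's actual benchmark):
// count how many loop iterations complete in a fixed 100 ms window.
function simpleBenchmark() {
  const end = Date.now() + 100;
  let iterations = 0;
  let acc = 0; // accumulate so the arithmetic can't be optimized away
  while (Date.now() < end) {
    acc += Math.sqrt(iterations);
    iterations++;
  }
  return iterations + (acc < 0 ? 1 : 0); // higher score = faster CPU
}

console.log(simpleBenchmark());
```

Run on two machines, the faster CPU completes more iterations in the same window — the same principle that lets Lighthouse compare test environments.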

Chrome version

The Chrome version used for the test can affect performance as well. For example, newer versions may include optimizations that make pages load faster.

As of September 2019 PSI uses Chrome 74 while DebugBear uses Chrome 75.

Different timeout settings

If the page doesn't render or finish loading within a certain amount of time, Lighthouse gives up and completes the analysis anyway. This can happen before the page has actually finished loading.

Here are the timeout settings as of September 2019:

Waiting for...             PageSpeed Insights   DebugBear
First Contentful Paint     15s                  45s
Load                       35s                  120s

Note that because PSI uses simulated throttling it can generally get away with lower timeouts than a tool like DebugBear that actually slows down the page load.
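If you run Lighthouse yourself, these limits are configurable. A sketch of a custom config using the PSI-style values from the table above — the setting names maxWaitForFcp and maxWaitForLoad are used by recent Lighthouse versions, so check the configuration docs for the version you run:

```javascript
// Custom Lighthouse config applying the PSI-style timeouts from the table above.
const config = {
  extends: 'lighthouse:default',
  settings: {
    maxWaitForFcp: 15 * 1000,  // give up if there's no First Contentful Paint after 15s
    maxWaitForLoad: 35 * 1000, // give up if the load event hasn't fired after 35s
  },
};

module.exports = config;
```

A tool using applied throttling would raise these limits, since the throttled page genuinely takes longer to load.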

What are the right values?

So which tool takes the best measurements: PageSpeed Insights, running Lighthouse locally, or a hosted tool like DebugBear?

Lab-based testing can only ever produce a snapshot of how your page behaves in one particular environment. How well that snapshot reflects the experience of your users depends on how closely the test environment matches the ones your users actually load your page in.

If you want to find out how fast your site is for real users, you need to capture performance data from real users.

Debugging Lighthouse score discrepancies

As mentioned earlier, Lighthouse calculates simulated performance metrics by default. If you see discrepancies between tools, it can be useful to look at the raw metric values Lighthouse collected from Chrome. Lighthouse refers to these as observed metrics.

  1. Open the Lighthouse HTML report (you'll have to click View Report on web.dev)
  2. Open the DevTools console
  3. Run __LIGHTHOUSE_JSON__.audits.metrics.details.items[0]

The result will look something like this:

  "observedFirstContentfulPaint": 1835,
  "largestContentfulPaint": 10035,
  "firstContentfulPaint": 1755,
  "observedLargestContentfulPaint": 2566,
  "cumulativeLayoutShift": 0.36618412272135414,

In this example, you can see that the unthrottled observed FCP is greater than the simulated FCP, with values of 1.8s and 1.7s, respectively. This suggests that the simulation is underestimating the real FCP value.
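You can do this comparison programmatically too. In the snippet below the values from the excerpt above are hard-coded so it runs anywhere; in the DevTools console on a real report you would read them from __LIGHTHOUSE_JSON__.audits.metrics.details.items[0] instead.

```javascript
// Compare simulated vs observed First Contentful Paint.
// Values hard-coded from the report excerpt above for illustration.
const metrics = {
  firstContentfulPaint: 1755,         // simulated (throttled) estimate
  observedFirstContentfulPaint: 1835, // measured in the unthrottled load
};

const ratio = metrics.firstContentfulPaint / metrics.observedFirstContentfulPaint;
console.log(ratio.toFixed(2)); // below 1: the simulation predicts a faster FCP than was observed
```

A ratio well below 1 on an unthrottled load, as here, is a hint that the simulation is optimistic for this page.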

If you use --throttling-method devtools or --throttling-method provided the observed metrics will be the same as the reported ones, as Lighthouse does not run the simulation.

DebugBear is a website monitoring tool built for front-end teams. Track performance metrics and Lighthouse scores in CI and production. Learn more.

