Why is my Lighthouse score different from PageSpeed Insights?
This article explains why you might see different scores and metrics between PageSpeed Insights (or web.dev) and other tools.
These other tools include:
- Running Lighthouse on your local machine
- Monitoring your page with DebugBear
- Using other Lighthouse-based monitoring tools
Different server locations
If you live in the UK, opening a website that's hosted in London will be faster than opening one that's hosted in New York.
PageSpeed Insights picks the server based on your current location. There are four locations:
- Northwestern US (Oregon)
- Southeastern US (South Carolina)
- Northwestern Europe (Netherlands)
- Asia (Taiwan)
By default, DebugBear uses a server in South Carolina to analyze pages.
Analyzing pages from other locations is currently in beta. For other locations DebugBear also uses a nearby proxy that adds around 10ms to each request, depending on the number of round-trips required.
The proxy is necessary to allow access to websites that are restricted by the client's geographical location, based on the IP address. The underlying issue is that DebugBear uses Google Cloud VMs which use IPs that are associated with the US, regardless of where the machine is physically located.
Different network throttling methods
To achieve realistic and consistent results, Lighthouse uses a throttled network connection to analyze pages. However, there are different ways of throttling the connection.
PageSpeed Insights loads the page without any throttling, and then simulates how the page might have loaded on a slower connection.
DebugBear throttles the local network connection and then loads the page. This is more realistic, but also takes more time and increases variance between analyses.
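The two approaches map to Lighthouse's `throttlingMethod` setting. A minimal config sketch (option names as in Lighthouse 5.x; the values shown are Lighthouse's mobile defaults at the time and should be treated as illustrative):

```javascript
// Lighthouse config sketch.
// 'simulate': load unthrottled, then model a slow connection (PSI-style).
// 'devtools': throttle the connection while the page loads (DebugBear-style).
module.exports = {
  extends: 'lighthouse:default',
  settings: {
    throttlingMethod: 'simulate', // or 'devtools'
    throttling: {
      rttMs: 150,                      // simulated round-trip time
      throughputKbps: 1638.4,          // simulated downlink
      cpuSlowdownMultiplier: 4,        // CPU throttling factor
      requestLatencyMs: 562.5,         // read by 'devtools' mode instead of rttMs
      downloadThroughputKbps: 1474.56, // read by 'devtools' mode
      uploadThroughputKbps: 675,       // read by 'devtools' mode
    },
  },
};
```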
Different server CPU speeds
Every time Lighthouse analyzes a page it runs a very simple CPU benchmark. The DebugBear servers reach a score around 540, while PageSpeed Insights reports a score around 660. So the PSI servers are slightly faster.
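The idea behind such a benchmark can be sketched as follows: time a fixed CPU-bound workload and turn the elapsed time into a score, so a faster machine gets a higher number. This is a hypothetical illustration, not Lighthouse's actual BenchmarkIndex computation:

```javascript
// Hypothetical CPU micro-benchmark: higher score means a faster machine.
// Not Lighthouse's real benchmark formula, just the underlying idea.
function cpuBenchmark(iterations = 5_000_000) {
  const start = Date.now();
  let x = 0;
  for (let i = 0; i < iterations; i++) {
    x += Math.sqrt(i); // fixed arithmetic workload
  }
  const elapsedMs = Date.now() - start || 1; // avoid divide-by-zero on fast runs
  return Math.round(iterations / elapsedMs / 10); // arbitrary scaling
}
```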
Different viewport sizes
The size of the viewport can affect some visual rendering metrics.
With DebugBear there's no difference in the mobile viewport size, but on desktop the viewport is slightly smaller than with PageSpeed Insights (1280x800 vs. 1350x940).
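As a hypothetical illustration of why this matters: an element near the bottom of the page can be inside PSI's 940px-tall desktop viewport but outside DebugBear's 800px one, changing which elements count toward visual metrics. The element position below is made up:

```javascript
// Illustrative only: checks whether an element's top edge falls inside
// the initial viewport for two different desktop viewport heights.
function isAboveTheFold(elementTop, viewportHeight) {
  return elementTop < viewportHeight;
}

const heroImageTop = 870; // hypothetical element position in CSS pixels
console.log(isAboveTheFold(heroImageTop, 940)); // → true  (PSI desktop)
console.log(isAboveTheFold(heroImageTop, 800)); // → false (DebugBear desktop)
```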
Different timeout settings
If the page doesn't render or finish loading after a certain amount of time, Lighthouse gives up and finishes the analysis. This can happen before the page has actually finished loading.
Here are the timeout settings as of September 2019:
| Waiting for... | PageSpeed Insights | DebugBear |
| --- | --- | --- |
Note that because PSI uses simulated throttling it can generally get away with lower timeouts than a tool like DebugBear that actually slows down the page load.
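When you run Lighthouse yourself, the overall load timeout is configurable via the `maxWaitForLoad` setting (a real Lighthouse 5.x option; the value below is illustrative, the default at the time was 45 seconds):

```javascript
// Lighthouse config sketch: raise the load timeout for pages that are
// slow under applied throttling. Value is in milliseconds.
module.exports = {
  extends: 'lighthouse:default',
  settings: {
    maxWaitForLoad: 60 * 1000, // give up after 60 seconds
  },
};
```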
No HTTP/2 support in PageSpeed Insights
PageSpeed Insights currently doesn't support HTTP/2, and the HTTP/2 audit is disabled. If you run Lighthouse locally or use DebugBear, HTTP/2 is correctly detected and supported.
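In a Lighthouse JSON report this check appears under the `uses-http2` audit id. A small sketch reading it from a report object (the `report` below is a mocked fragment, but the audit id is the real one):

```javascript
// Reads the HTTP/2 audit result from a Lighthouse report object.
function usesHttp2(report) {
  const audit = report.audits['uses-http2'];
  return audit ? audit.score === 1 : null; // null if the audit was disabled
}

// Mocked report fragment for illustration.
const report = {
  audits: {
    'uses-http2': {score: 1, title: 'Uses HTTP/2 for its own resources'},
  },
};
console.log(usesHttp2(report)); // → true
```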
Different Chrome versions
The Chrome version used can also affect performance. For example, newer versions may include performance improvements that make them faster.
As of September 2019 PSI uses Chrome 74 while DebugBear uses Chrome 75.
What are the right metrics?
So which generates more accurate results: PageSpeed Insights, running Lighthouse locally, or using a tool like DebugBear?
Lab-based testing can only ever give a snapshot of how your page behaves in one specific environment. If you want to know how fast your site is for real users, you need to collect performance data from those users in the field.
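A browser-side sketch of what collecting field data can look like, using the `PerformanceObserver` API to report Largest Contentful Paint (the `/analytics` endpoint is hypothetical; this runs in the browser, not Node):

```javascript
// Browser-side sketch: capture Largest Contentful Paint from real users
// and send it to a (hypothetical) analytics endpoint.
new PerformanceObserver((list) => {
  const entries = list.getEntries();
  const lcp = entries[entries.length - 1]; // latest LCP candidate
  navigator.sendBeacon('/analytics', JSON.stringify({
    metric: 'LCP',
    value: lcp.startTime,
  }));
}).observe({type: 'largest-contentful-paint', buffered: true});
```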