Changelog

Learn about recently added features here. To see upcoming features, or to make suggestions, check out the roadmap.


3 Sep 2021

August 2021 release notes

Web Vitals tab

If you want to optimize Web Vitals, DebugBear now provides all the metrics in one place. You can also see the DOM element that caused the Largest Contentful Paint, and a list of the layout shifts that occurred on the page.

In addition to the DebugBear lab data, the tab also shows real user metrics from the Chrome User Experience Report.

DebugBear Web Vitals tab

Improved request waterfall

The request waterfall now shows badges for render-blocking requests (B) and resources that are preloaded (L).

Render-blocking and preloaded files

Improved trendlines

You can now choose to view daily instead of weekly trendline data. In addition to trendlines for Lighthouse scores and Performance metrics, you can now also look at HTML and console errors.

Page error trendlines

Upgrades to Chrome and Lighthouse

DebugBear now runs tests using Chrome 92 and Lighthouse 8.3.0.

Blog posts

Why does Lighthouse lab data not match field data? – PageSpeed Insights often shows very different values for the two types of metrics, and this article explains why that is.

CSP error noise caused by Chrome extensions – I tested 1000 Chrome extensions to see how many of them cause false CSP error reports.

22 Jul 2021

July 2021 release notes

New scheduling options

Instead of setting a test interval like "every 4 hours" you can now also provide specific test times like "4pm".

DebugBear performance test schedule

Lighthouse 8.0 and Chrome 91

DebugBear now runs tests using the latest versions of Chrome and Lighthouse.

The Performance score was updated in Lighthouse 8.0 to emphasize Core Web Vitals.

Lighthouse Performance score breakdown

Weekly email

DebugBear can now send you a weekly summary of your project's performance.

Weekly performance email

Configure email subscriptions in your account settings or in the Project Integrations tab.


New blog post

I've investigated how Chrome extensions impact browser performance.

Chrome extension performance impact

1 Jun 2021

May 2021 release notes

Improved timeline

The DebugBear timeline now shows user timings and layout shifts, as well as when the chunks of each request are received. Plus, you can select a time range and zoom in!

DebugBear performance timeline

Layout Shifts table

The Performance tab now includes a list of all layout shifts, letting you see exactly what caused a change to your Cumulative Layout Shift metric.

DebugBear layout shifts

Delete pages and trigger tests in bulk

You can now trigger ad-hoc tests for all pages in your project, or delete a bunch of pages you no longer need, in one go.

DebugBear bulk edit and trigger

Create custom metric charts

Create charts combining any number of metrics in the Performance tab.

Custom performance metric charts

Disable default wait

On some pages the CPU and network never become fully quiet. As a result, the Lighthouse test ends up timing out.

To test these pages you can now disable the default Lighthouse wait and instead use your own custom condition, for example an element that exists in the DOM.

Disable default Lighthouse wait
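
For example, a wait condition might check that a specific element has been rendered. This is just a sketch; the selector below is made up, so replace it with an element that only appears once your page is ready.

// Hypothetical custom wait condition evaluated in the page:
// the test only finishes once this expression returns true.
!!document.querySelector("#order-confirmation")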

New blog posts

Profiling site speed with the Chrome DevTools Performance tab – an in-depth look at how to interpret the performance data in Chrome DevTools

Optimizing page performance by lazy loading images – case study showing how to use the loading="lazy" attribute to improve performance.

13 Apr 2021

April 2021 release notes

Testing warm load performance

By default, DebugBear tests how fast a website loads the first time someone visits it. But for regular users performance will be better, as some page resources can be loaded from the browser cache.

You can now track the performance of these warm loads. Just check Disable clearing cache in the advanced page settings.

Testing warm loads in DebugBear

Better request diffs

DebugBear now shows request headers and response bodies directly in the requests table.

DebugBear response headers

You can also prettify JavaScript, JSON, CSS, and HTML responses to see exactly what changed.

Pretty HTML diff

Collapsible sidebar

Collapse the sidebar to leave more space for charts and performance analysis. This also hides the chat widget, so it doesn't overlap content you want to see.

Sidebar collapsed

New blog posts

Web Vitals FAQ – learn what Google has to say about the metrics that will impact search rankings from May 2021.

Common problems with rel="preload" – preload link tags can tell the browser what files it should prioritize, but sometimes they make the browser prioritize the wrong things, slowing down page performance.

New performance docs

Time to Interactive – when do CPU and network become idle?

First Input Delay – how quickly does your page start processing user input?

4 Mar 2021

February 2021 release notes

Track user timing measures

The User Timing API lets you create custom performance metrics for your website. Previously, DebugBear only tracked performance.mark timings. Now you can also see performance.measure entries.

tracking performance.measure entries
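
As a quick sketch of the underlying browser API (the mark and measure names here are made up), a page can create these entries like this:

// Record custom timestamps with the User Timing API
performance.mark("search-results-start")
// ... application code renders the results ...
performance.mark("search-results-end")

// Create a performance.measure entry spanning the two marks
performance.measure("search-results", "search-results-start", "search-results-end")

DebugBear will then show both the marks and the "search-results" measure.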

One metric per chart

Combining many metrics in one chart can help give context, but it can also make individual metrics hard to read. The One metric per chart option allows you to break down the Performance charts.

Separate charts for each metric

New server regions

You can now test site speed from the following additional regions:

  • Canada (Montreal)
  • US Central (Iowa)
  • US West CA (California)

Export project metrics

Click the Export button in the top right of the project overview page to generate a CSV with data for all your sites.

Project CSV export

Lighthouse 7.2

DebugBear now uses the latest version of Lighthouse.

Delete test results

You can now delete individual test results, for example if your site was down for a few moments.

Delete performance test result

19 Jan 2021

January 2021 release notes

Trendlines

The project overview page now shows weekly averages for the last 10 weeks, rather than just the most recent scores and metrics.

Web performance metrics trendlines

Lighthouse 7.0

DebugBear now runs tests using the latest version of Lighthouse.

  • The third-party facades audit suggests using a static placeholder for third-party widgets, until the user starts interacting with the widget.
  • In some cases Lighthouse now waits longer for the page to load, for example if there's a single XHR request that delays rendering. This makes the metrics more accurate.

Improved mobile site

The mobile project overview now also shows performance metrics, and you can use the same filters as on desktop.

Mobile site

New blog posts

Why is the Google Cloud UI so slow? – a look at a large JavaScript application and what's slowing it down.
Debugging web performance with the Chrome DevTools Network tab – a detailed explanation of the information DevTools provides about network activity.

Updated documentation

Want to script user journeys and measure their performance? This article explains how to do that for single-page apps.

4 Dec 2020

November 2020 release notes

Set up multiple devices and locations in one go

To make setup easier, you can now select multiple device types and test locations and set up monitoring for them in one step.

Set up performance monitoring for multiple devices and locations

Bulk edit pages

You can now update some properties of multiple pages at once:

  • Page Title
  • Page URL
  • Test frequency
  • Tags

First, click the edit icon in the top right of the Project overview page.

Enter bulk edit mode

Next, select the pages you want to update, either by using the standard search filters or by toggling the checkboxes. Then set the new values and click the Update button.


Page switcher

You can now navigate directly from one monitored page to another. The dropdown normally shows pages with the same URL first, so you can easily switch between Desktop and Mobile monitoring results.

DebugBear page switching

To make space for the dropdown, the "Open tested page" link has moved to the top right, next to the page ID.

Open tested page link

New blog posts

Lighthouse automatically tests the performance, SEO, and accessibility of your website, but you can also add your own audits and audit categories.

Repeating performance tests reduces overall metric variability – this blog post quantifies by how much variance is reduced when running tests 3, 5, or 7 times.

Creating a web performance team can help make site speed a priority in your company. Marc Radziwill explains how to get started and make performance teams successful.

Millions of websites are built using website builders – we took a look at how site performance compares between different site builders.

New metrics documentation

The documentation now contains an overview of the Core Web Vitals, which start affecting Google search rankings next year.

There's also an in-depth look at one of the Core Web Vitals, the Largest Contentful Paint.

1 Nov 2020

October 2020 release notes

Manage monitored pages with tags

You can now tag your pages to make them easier to group and filter. Check your project settings to show tags in the navbar.

DebugBear tags

Order pages by metric

Another improvement to the page listing: sort pages by metric to identify pages that are slow or have SEO opportunities.

Click on the heading for the metric column to enable sorting. In this screenshot we're sorting by First Contentful Paint.

Pages ordered by First Contentful Paint

Block ads and tracking

Device settings now have an option to block ads and tracking using uBlock. This helps reduce test variability and makes sure DebugBear tests don't impact analytics. However, if the ads on your site have a meaningful performance impact, it can also skew your results and make them look better than they are.

Ad blocking option in the device settings

Stats mode for the page comparison

Stats mode allows you to aggregate data over a time range to identify longer-term trends. You can enable stats mode via the date dropdown.

Stats mode for page performance comparison

You can also aggregate metrics across all pages in order to track trends across your website, rather than for specific pages.

Aggregated page performance metrics

2 Oct 2020

September 2020 release notes

Customize device speed

There are now 4 default simulated devices that you can test on:

  • Mobile – 1.6Mbps bandwidth, 150ms latency, 4x CPU throttling – this matches the default Lighthouse Mobile settings
  • Mobile Fast – 12Mbps bandwidth, 70ms latency, 2x CPU throttling – this matches the LTE setting on WebPageTest
  • Desktop – 8Mbps bandwidth, 40ms latency, no CPU throttling – this matches the default Lighthouse Desktop settings
  • Desktop Fast – 100Mbps bandwidth, 2ms latency, no CPU throttling – this matches a fast wifi connection

You can also create new devices that match the characteristics of your users:

Setting up a 5G emulated device for monitoring

Compare arbitrary tests

It's now possible to compare the experience of a user in Australia to that of a user in Finland. Or you can compare a test result from today to one from a year ago.

To do that, go to the Overview tab of one of the pages you want to compare and scroll down to the Compare section.

Compare site speed results on DebugBear

For example, this site is notably slower in Brazil than it is in the US:

Comparing web performance between countries

New articles

We've published an in-depth article on how to optimize front-end JavaScript performance. Learn about common performance issues, how to identify them, and how to fix them.

That article focusses on execution times, but you can also read about JavaScript memory leaks.

Finally, a new documentation page takes an in-depth look at the Cumulative Layout Shift metric.

24 Aug 2020

August 2020 release notes

Filmstrips and CPU timeline

Each DebugBear result now contains a filmstrip and CPU timeline. Use it to understand how your page renders and what's holding back performance.

Web performance filmstrip

Compare filmstrips

You can also select "Filmstrips" on the project overview page to compare performance with your competitors.

Comparing site speed with competitors

Disable scheduling for pages

If you primarily use the API to trigger tests you can now disable scheduled tests.

Only test web performance as needed

Set up monitoring for many URLs more easily

Do you have 10 URLs you need to monitor? Instead of submitting the "new page" form 10 times you can now set all of them up in one go.

You can specify page titles by putting a space after the URL, followed by the desired title. If no title is passed in, the origin and pathname will be used, for example "example.com – /about".
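
For example, assuming one page per line, a bulk setup might look like this (the URLs and titles below are placeholders):

https://example.com/pricing Pricing page
https://example.com/about

The second entry has no title, so it would show up as "example.com – /about".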

Bulk website monitoring setup

30 Jul 2020

July 2020 release notes

Here's a roundup of some of the changes we've made recently.

Reduce variability by running tests multiple times

You can now run tests up to 7 times and then save the median result. This removes outliers and avoids unnecessary alerts.

Running each performance test once, 3 times, 5 times, or 7 times

Annual plans

Save 20% on your subscription by paying annually.

Two new monitoring locations

See how users in Mumbai and Singapore experience the performance of your website.

Lighthouse 6.1

Lighthouse 6.1 includes bug fixes, more data on long JS tasks, and a new SEO audit that makes sure search engines can crawl your links.

DebugBear now also uses Chrome 84 to test your pages.

Page loaded conditions

Does Lighthouse sometimes finish the test before your page has fully loaded? You can now set up a JavaScript expression that needs to be fulfilled before the test finishes.
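
For instance, if your application sets a flag once all relevant content has rendered (the flag name below is hypothetical), the expression can simply check for it:

// Somewhere in your application code, after the page has fully loaded:
window.__pageFullyLoaded = true

// Page loaded condition configured in DebugBear:
window.__pageFullyLoaded === true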

New articles

Reduce variance between Lighthouse runs

Debug and improve server response time

Performance impact of Chrome extensions

First Contentful Paint and how to improve it

21 Jul 2020

Improved list of network requests

The list in the requests tab now shows the request duration by default, and you can add other columns as well. You can also click on the column headers to sort by that column.

Response time in list of requests

For example, you can break down the request duration into time spent on each part of the HTTP transaction: DNS lookup, TCP connection, SSL connection, Time to First Byte, and actually downloading the response content.

DNS, TCP, SSL, TTFB, and download

Or you can look at the content encodings, decoded response size, and response statuses. The request start time is relative to when the initial document request was made.

Request response status and content encoding

Larger request changes will also show up in the Overview tab. Here you can see that the First Contentful Paint increased because the response for the initial document request took longer.

Overview tab list of requests

19 Jun 2020

Console tab redesign with code snippets

The console tab now has a custom design rather than showing the text-based diff by default.

Console messages will also include a call stack and code snippet where available.

Console errors in monitoring results

Request errors also show an HTML snippet, if the request was triggered by the page HTML.


19 May 2020

Lighthouse 6.0

DebugBear now tests your websites with version 6 of Lighthouse. We've also upgraded Chrome from version 78 to 83.

New metrics and scoring

Lighthouse 6.0 introduces several new metrics and changes how the overall Performance score is calculated:

  • Total Blocking Time – How often do JavaScript or rendering tasks make the page unresponsive?
  • Largest Contentful Paint – When is the largest content element displayed on the screen?
  • Cumulative Layout Shift – How much does page content move around after being rendered?

The composition of the Performance score has changed as follows:

Existing metrics

  • First Contentful Paint – 20% ➔ 15%
  • Speed Index – 26.7% ➔ 15%
  • Time to Interactive – 33.3% ➔ 15%

New metrics

  • Largest Contentful Paint – 0% ➔ 25%
  • Total Blocking Time – 0% ➔ 25%
  • Cumulative Layout Shift – 0% ➔ 5%

Deprecated metrics

  • First Meaningful Paint – 6.7% ➔ 0%
  • First CPU Idle – 13.3% ➔ 0%
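
Roughly speaking, the overall Performance score is a weighted average of the individual metric scores. The sketch below skips how Lighthouse converts raw metric values into 0–100 scores (it uses log-normal scoring curves) and just shows how the new weights combine; the per-metric scores are made up:

// Lighthouse 6.0 metric weights (see the list above)
const weights = { fcp: 0.15, si: 0.15, tti: 0.15, lcp: 0.25, tbt: 0.25, cls: 0.05 }

// Hypothetical per-metric scores on a 0–100 scale
const scores = { fcp: 90, si: 85, tti: 80, lcp: 60, tbt: 45, cls: 95 }

const performanceScore = Object.keys(weights)
  .reduce((total, metric) => total + weights[metric] * scores[metric], 0)

console.log(Math.round(performanceScore)) // 69 for the scores above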

New metrics for Lighthouse V6 in the Lighthouse report

You can find the charts for the new metrics in the Performance tab.

Charts for TTFB, largest contentful paint, total blocking time, and cumulative layout shift

Read more about these metric changes in the Lighthouse 6.0 announcement post.

Better timing budgets

Lighthouse performance budgets now support more timing metrics. If you've set up a performance budget on DebugBear those metrics will also show up in the Lighthouse report.

Lighthouse performance budgets for First Contentful Paint, Speed Index, Time to Interactive, Largest Contentful Paint, and Total Blocking Time

Stack packs

These were introduced in Lighthouse 5.6.0, but DebugBear was previously running version 5.5.0. The Lighthouse report now includes recommendations tailored to your tech stack:

Lighthouse React stack pack

23 Mar 2020

Better and more customizable notifications

DebugBear automatically generates notifications if it looks like there's been a regression on your site. That means you don't need to do any work to get set up, but you might get some notifications that aren't relevant to you.

From now on you can configure when a notification is sent. It's been possible to mute specific notifications for a while, but I've now added some documentation for it.

Configuring DebugBear alerts

One common issue has been notifications for performance issues that can't be reproduced later on. Maybe the server was busy, or something weird happened with DNS. To avoid this problem in the future, some performance alerts are now only sent if the performance problem occurs more than once in a row.

25 Feb 2020

HTML validation

DebugBear now has a Validation tab which shows errors and warnings generated by the W3C HTML validator.

Most of these errors aren't very helpful. The HTML might not be valid, but as long as all browsers handle it fine that's not a problem. And sometimes the validator doesn't know about a recently added feature and will complain about it.

So DebugBear doesn't list common validation errors by default, and currently no email or Slack alerts are sent if there's a regression.

However, there are many potential problems the validator can identify:

  • Duplicate attributes (e.g. two style attributes on the same element)
  • Stray start and end tags
  • Invalid inline CSS, like style="background: [object Object]"

Monitoring HTML validation errors, charts showing error and warning count, list of errors

20 Feb 2020

User flows

Until yesterday, DebugBear had a Login Steps feature that allowed you to fill out a login form before testing your page. There were a few problems with this though:

  • What if your site's login flow is split over two separate pages?
  • What if you have more than two form fields that need to be filled out?
  • What if you want to fill out a search form rather than do a login?

User flows are the solution to these problems. Rather than shoehorning all this functionality into a login form, you can now set up flexible steps that run before the actual page analysis.


14 Feb 2020

API updates: TypeScript typings, custom headers, and access build results

We've published a new version of the Node API. Here's an example of what you can do with it:

const { DebugBear } = require("debugbear")

const debugbear = new DebugBear(process.env.DEBUGBEAR_API_KEY)
const pageId = "12345" // placeholder – the ID of the monitored page you want to test

async function runAnalysis() {
  const analysis = await debugbear.pages.analyze(pageId, {
    // A commit hash is required to generate a build
    commitHash: "abc123",
    customHeaders: {
      "X-Enable-Experiment": "true"
    },
  })
  const result = await analysis.waitForResult()
  console.log(result.build.status) // "success"
}

runAnalysis()

Check out the migration guide if you're moving from version 1 of the API.

24 Jan 2020

Capturing OCSP requests

Browsers make OCSP requests to check if a certificate is revoked. Chrome only does this for Extended Validation (EV) certificates.

These requests are now included in the request chart, so it should be easier to understand if your SSL connection takes a long time:

OCSP request in request chart