
API

Getting started

To get started, install the debugbear Node module and generate an API key. You'll also need to find the ID of the page you want to analyze.

Learn more about getting started.

Managing projects and pages

Read about managing projects and pages with the Node API.

Running a website test

You can either run a single test or trigger tests in bulk.

Create a script.js file like this:

const { DebugBear } = require("debugbear");
// TypeScript: import { DebugBear } from "debugbear"
const debugbear = new DebugBear(process.env.DEBUGBEAR_API_KEY);

const pageId = 185;
debugbear.pages.analyze(pageId).then((analysis) => {
  analysis.waitForResult().then(() => {
    console.log("Test complete, view results here: " + analysis.url);
  });
});

Then run DEBUGBEAR_API_KEY=... node script.js.

Additional options

The Node module supports similar arguments to the CLI.

Building a particular commit

debugbear.pages.analyze(pageId, {
  commitHash: "e2ba122",
  buildTitle: "Add support for tags",
  // infer additional details from the environment, for
  // example the name of the current branch
  inferBuildInfo: true,
});

Customizing the URL and HTTP headers

debugbear.pages.analyze(pageId, {
  url: "http://staging.com",
  customHeaders: {
    "X-Feature-Flags": "tags",
  },
});

Access build status and metrics

const analysis = await debugbear.pages.analyze(pageId, {
  commitHash: "abc123",
});
const res = await analysis.waitForResult();
console.log(res);

This might log something like this:

{
  "url": "https://www.debugbear.com/viewResult/787431",
  "hasFinished": true,
  "build": {
    "status": "failure",
    "oneLineSummary": "PF 100 ➔ 96, SEO 85 ➔ 80, Req# +1",
    "metrics": {
      "analysis.date": "2020-02-14T19:06:42.201Z",
      "performance.speedIndex": 1087,
      "performance.interactive": 895,
      "performance.firstContentfulPaint": 845,
      "performance.firstMeaningfulPaint": 1301,
      "performance.score": 0.96,
      "accessibility.score": 0.55,
      "bestPractices.score": 0.79,
      "seo.score": 0.8,
      "pwa.score": 0.54,
      "pageWeight.total": 1666205,
      "pageWeight.document": 5745,
      "pageWeight.stylesheet": 4562,
      "pageWeight.image": 1571870,
      "pageWeight.script": 65597,
      "pageWeight.font": 18431,
      "pageWeight.ajax": 0,
      "pageWeight.media": 0,
      "pageWeight.other": 0,
      "pageWeight.redirect": 0,
      "cpu.total": 42,
      "cpu.scriptEvaluation": 8.4,
      "console.errors": 0,
      "console.warnings": 0,
      "html.errors": 1,
      "html.warnings": 1,
      "mark.start": 120,
      "mark.fully-rendered": 14125,
      "measure.timeout": 14005,
      "crux.granularity": "url",
      "crux.fcp.p75": 906,
      "crux.lcp.p75": 901,
      "crux.cls.p75": 0,
      "crux.fid.p75": 2
    }
  }
}

The build status will be neutral if no performance budget has been set up; otherwise it will be either success or failure.
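
For example, in a CI job you could use this status to fail the build when a performance budget is not met. A minimal sketch, continuing from the res value in the example above:

console.log(res.build.oneLineSummary);
if (res.build.status === "failure") {
  // a performance budget was not met, so fail the CI job as well
  process.exitCode = 1;
}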

Using the HTTP API directly

You can trigger tests without using the Node module:

curl https://www.debugbear.com/api/v1/page/PAGE_ID/analyze \
-X POST \
-H "x-api-key: API_KEY" \
-H "Content-Type: application/json" \
-d '{"url":"http://example.com","buildTitle":"Site update"}'

Run tests in bulk

You can trigger multiple tests at once with the analyzeBulk function.

const bulkTests = await dbb.pages.analyzeBulk([123, 124]);
const results = await bulkTests.waitForResult();

Note: Currently the bulk test API does not support passing additional options like a commitHash or custom headers. Contact support if you are interested in these features.

The result looks like this:

{
  "hasFinished": true,
  "results": [
    {
      "hasFinished": true,
      "build": {
        "status": "neutral",
        "metrics": {
          "analysis.date": "2024-05-26T07:56:39.634Z",
          "performance.largestContentfulPaint": 1822,
          "performance.totalBlockingTime": 190,
          "...": "..."
        },
        "oneLineSummary": "No changes.",
        "budgets": []
      },
      "analysis": {
        "commitHash": null,
        "commitBranch": null,
        "buildTitle": null
      },
      "page": {
        "id": "123",
        "name": "MZ Finland",
        "url": "https://www.example.com/",
        "...": "..."
      }
    },
    "..."
  ]
}
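
To act on the bulk response, iterate over the results array. A minimal sketch based on the shape shown above:

const { results } = await bulkTests.waitForResult();
results.forEach((result) => {
  // each entry contains the tested page and its build outcome
  console.log(
    `${result.page.name} (${result.page.url}): ${result.build.status} - ${result.build.oneLineSummary}`
  );
});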

Export lab test result metrics

The getMetrics function provides the same metrics as the on-page data export button, for example performance metrics, page weight, and Lighthouse scores.

let metrics = await debugbear.pages.getMetrics(pageId, {
  // JavaScript Date months are zero-based, so this range covers
  // 1 May 2022 to 1 June 2022
  from: new Date(2022, 4, 1),
  to: new Date(2022, 5, 1),
});
console.log(metrics[0]["performance.score"]);
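
getMetrics returns an array with one entry per test result in the requested date range. A sketch that prints the date and performance score of each entry, assuming the same metric keys as in the result shown earlier:

metrics.forEach((result) => {
  // each entry is one test run, keyed by the same metric names as above
  console.log(result["analysis.date"], result["performance.score"]);
});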

Using cURL

You can also load page metrics by using the HTTP API directly.

curl https://www.debugbear.com/api/v1/page/PAGE_ID/metrics \
-X GET \
-H "x-api-key: API_KEY" \
-G \
-d from=2022-02-01 \
-d to=2022-03-01

This will load data spanning from midnight on 1 February 2022 to midnight on 1 March 2022.

Get recent results for pages in a project

Call projects.getPageMetrics to get the latest metrics for all your pages, similar to what you'd see on your project overview page on the DebugBear website.

const pageMetrics = await debugbear.projects.getPageMetrics(project.id);
pageMetrics.forEach((item) => {
  console.log(`SEO score for ${item.page.name}: ${item.metrics["seo.score"]}`);
});

Getting older metrics

By default, the API returns the most recent results. To load results for an older build, pass the before parameter; results after that date will be ignored.

const pageMetrics = await debugbear.projects.getPageMetrics(project.id, {
  before: new Date(2020, 8, 4),
});

Using cURL

curl https://www.debugbear.com/api/v1/projects/PROJECT_ID/pageMetrics  \
-X GET \
-H "x-api-key: API_KEY" \
-G \
-d before=2020-02-01

Retrieving a project

Use projects.get to retrieve a project, including the list of pages within it.

const project = await dbb.projects.get(projectId); // the ID of the project to load
console.log(project.name, project.pages);
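
The pages array is also a convenient way to look up the page IDs used by the other endpoints. A sketch, assuming each page object exposes the same id, name, and url fields as in the bulk test response above:

project.pages.forEach((page) => {
  console.log(`${page.name} (${page.url}): page ID ${page.id}`);
});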

Using cURL

curl https://www.debugbear.com/api/v1/projects/PROJECT_ID  \
-X GET \
-H "x-api-key: API_KEY"

Timeline annotations

Call annotations.create to create an annotation.

await annotations.create(project.id, {
  title: "Staging release",
  description: "some description",
  pageFilter: "",
  date: new Date(),
});

Use project.annotations.list(projectId) to retrieve annotations.

pageFilter property

A filter string that applies the annotation only to specific pages.

Use pageId:1234 if you want to add an annotation to just one specific page.
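
For example, to attach an annotation to a single page you could pass that filter when creating it. A sketch reusing the create call from above, with a placeholder title and page ID:

await annotations.create(project.id, {
  title: "V5 release",
  pageFilter: "pageId:1234", // only show this annotation on page 1234
  date: new Date(),
});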

Using cURL

To create an annotation:

curl https://www.debugbear.com/api/v1/project/PROJECT_ID/annotation \
-X POST \
-H "x-api-key: API_KEY" \
-H "Content-Type: application/json" \
-d '{"title": "V5 release", "date": "2022-12-21T11:00:00.000Z"}'

To list annotations:

curl https://www.debugbear.com/api/v1/project/PROJECT_ID/annotations \
-H "x-api-key: API_KEY"

Loading RUM metrics

Use the getRumMetrics(projectId) method to load RUM data for a given project.

const rumData = await dbb.projects.getRumMetrics("123", {
  groupBy: "urlPath",
  device: "mobile",
});
console.log(rumData);

The endpoint returns aggregate RUM metrics. The value shows the metric value (by default the 75th percentile) and the count shows the number of page views included.

{
  "info": {
    "groupBy": "urlPath",
    "stat": "p75",
    "from": "2024-04-25T20:15:00.000Z",
    "to": "2024-05-26T20:15:00.000Z"
  },
  "lcp": [
    {
      "count": 115,
      "urlPath": "/",
      "value": 1614
    },
    {
      "count": 70,
      "urlPath": "/product",
      "value": 698
    },
    "..."
  ],
  "cls": ["..."],
  "inp": ["..."],
  "fcp": ["..."],
  "ttfb": ["..."]
}
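
To work with the response you can iterate over the per-metric arrays. A minimal sketch that logs the LCP entry for each URL path, based on the shape shown above:

rumData.lcp.forEach(({ urlPath, value, count }) => {
  // value is the aggregated metric (p75 by default), count is the number of page views
  console.log(`LCP for ${urlPath}: ${value} ms (based on ${count} page views)`);
});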

Use the from and to parameters to specify a date range. Pass in custom metrics with the metrics parameter. Use device or urlPath filters to only include specific experiences.

const MS_PER_DAY = 24 * 60 * 60 * 1000;
const rumData = await dbb.projects.getRumMetrics("123", {
  from: new Date(new Date().valueOf() - 31 * MS_PER_DAY),
  to: new Date(),
  metrics: ["load", "dcl", "fcp"],
  urlPath: "/",
});

Need other API features?

Let us know and we'll work with you to support your requirements.