Real User Monitoring (RUM) collects performance metrics from live sessions on your website and provides insight into user experience and site speed.
Using Calibre RUM, you can see how users experience your site, allowing you to quickly identify the biggest performance bottlenecks and make impactful improvements.
Real User Monitoring is part of Calibre’s performance monitoring suite, offered on the Starter, Team, and Company plans. You can get started by signing up and installing RUM tracking on your website.
Calibre RUM is currently available through an early access trial. If you would like to try it out, please contact us.
The RUM Dashboard presents a summary of key information about the tracked site:

The RUM Dashboard is a great starting point for performance and user experience discovery. It summarises important data and shows the Core Web Vitals and Web Vitals metric charts at a glance. Each bold title takes you to a more in-depth report.
Each metric chart can display dashed lines marking the good, needs improvement, and poor measurement thresholds, based on the data that was collected. One or more threshold lines may be visible on your charts, making it easy to see how close measurements are to your targets, such as getting LCP back to “good” or keeping INP below “poor”.
Filtering choices are retained as you navigate to specific metric pages (for example, Largest Contentful Paint) from the Dashboard.
The Dashboard can be filtered by choosing an option within the filter bar at the top of the page (including devices, dimensions, URL, aggregation, and time period).
The Pages Leaderboard ranks your Pages by key performance metrics. You can quickly find the pages or templates that have the worst user experience by sorting by any metric.

The displayed metrics can be changed based on what’s important in your context:
The Pages Leaderboard can be filtered by selecting an option within the filter bar at the top of the page (including devices, dimensions, URL, aggregation, and time period).
The report updates instantly based on the selection. Selected filters can be cleared by pressing the x button next to the filter name.
You can easily view the Real User Metrics Dashboard for each page by clicking on its path in the table. The Dashboard includes current live visitors to the page, total session count, audience locations, good UX session percentage and charts for all collected metrics.
Each performance metric has a dedicated report, allowing you to deep-dive into trends and segment by various dimensions.

Subparts are available for Largest Contentful Paint, Interaction to Next Paint, and Time to First Byte:
| Metric | Subparts |
|---|---|
| Largest Contentful Paint | Time to First Byte, Image load delay, Image load duration, Render delay |
| Interaction to Next Paint | Input delay, Processing duration, Presentation delay |
| Time to First Byte | Waiting, Cache, DNS, Connection, Request |
To help you identify areas that need attention, the metric measurements and subparts have badges that indicate changes relative to the previous time period (for example, Time to First Byte is 876 ms, which is +67 ms compared to the previous period).
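Conceptually, a metric’s subparts partition the total measurement: for LCP, the four subparts listed above sum to the LCP value, which makes it easy to see which phase dominates. A minimal sketch (the field names mirror the subparts above; the numbers are purely illustrative, not Calibre output):

```typescript
// Illustrative LCP subpart breakdown: the four phases sum to the LCP value.
// All numbers here are made up for demonstration.
interface LcpSubparts {
  timeToFirstByte: number;   // ms until the first response byte
  imageLoadDelay: number;    // ms between TTFB and the image request starting
  imageLoadDuration: number; // ms spent downloading the LCP image
  renderDelay: number;       // ms between the image finishing and the paint
}

function lcpTotal(parts: LcpSubparts): number {
  return (
    parts.timeToFirstByte +
    parts.imageLoadDelay +
    parts.imageLoadDuration +
    parts.renderDelay
  );
}

// Which phase contributes the largest share of LCP?
function dominantPhase(parts: LcpSubparts): keyof LcpSubparts {
  const entries = Object.entries(parts) as [keyof LcpSubparts, number][];
  entries.sort((a, b) => b[1] - a[1]);
  return entries[0][0];
}

const example: LcpSubparts = {
  timeToFirstByte: 876,
  imageLoadDelay: 120,
  imageLoadDuration: 640,
  renderDelay: 90,
};
```

With the example values above, the server response (Time to First Byte) is the dominant phase, so backend or CDN work would move LCP more than image optimisation.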
Each metric chart can have dashed lines that show the good, needs improvement, and poor measurement thresholds. Thresholds help you visualise how a metric is performing and are useful for goal setting, such as getting back to “good” on LCP or keeping INP under “poor”.
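The threshold model can be sketched with the published Core Web Vitals thresholds from web.dev; note this is an illustration of the good / needs improvement / poor rating scheme, not the exact values Calibre draws, since the dashed lines are derived from your collected data:

```typescript
// Published Core Web Vitals thresholds (from web.dev), used here to
// illustrate the good / needs improvement / poor rating model.
type Rating = "good" | "needs improvement" | "poor";

const THRESHOLDS: Record<string, [good: number, poor: number]> = {
  LCP: [2500, 4000], // ms
  INP: [200, 500],   // ms
  CLS: [0.1, 0.25],  // unitless layout-shift score
};

function rate(metric: keyof typeof THRESHOLDS, value: number): Rating {
  const [good, poor] = THRESHOLDS[metric];
  if (value <= good) return "good";
  if (value <= poor) return "needs improvement";
  return "poor";
}
```

For example, an LCP of 2,400 ms rates as “good”, while an INP of 600 ms rates as “poor”.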
The Metric reports can be filtered in two ways:
Clear the selected filters by pressing the x button next to the filter name.
The Audience report shows how many sessions had a good, needs improvement, or poor experience based on Core Web Vitals performance. You can filter and segment by various dimensions, including location, device type, browser, and more.

The Audience report presents the Core Web Vitals assessment as a percentage split between good, needs improvement, and poor rated sessions. To help you identify emerging trends, the percentage values for each grading are compared to previous periods (for example, a Poor grading of 9.9%, which is a 9.2% rise compared to the previous period).
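The percentage split and the period-over-period comparison can be sketched as follows (the session counts are hypothetical, chosen to reproduce the 9.9% example above):

```typescript
type Rating = "good" | "needs improvement" | "poor";

// Percentage split of sessions by Core Web Vitals rating, rounded to 0.1%.
function split(counts: Record<Rating, number>): Record<Rating, number> {
  const total = counts.good + counts["needs improvement"] + counts.poor;
  const pct = (n: number) => Math.round((n / total) * 1000) / 10;
  return {
    good: pct(counts.good),
    "needs improvement": pct(counts["needs improvement"]),
    poor: pct(counts.poor),
  };
}

// Change in a rating's share versus the previous period, in percentage points.
function change(current: number, previous: number): number {
  return Math.round((current - previous) * 10) / 10;
}

// Hypothetical period: 1,000 sessions, 99 of them rated poor.
const current = split({ good: 700, "needs improvement": 201, poor: 99 });
```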
The Audience report can be filtered in two ways:
Selected filters can be cleared by pressing the x button next to the filter name.
RUM reports show all devices in aggregate (the All option) or separately for Desktop, Tablet, or Mobile. Filtering by device is especially helpful for surfacing the often large differences in experienced speed between Desktop and Mobile sessions.
RUM reports can be filtered by various dimensions:
| Dimension | Description and examples |
|---|---|
| Page Path | Filter by specific page URLs. Example: /pricing, /cart |
| Page Grouping | Filter by custom page groupings. Example: /blog/*, /products/* |
| Navigation Type | Filter by how the user navigated to the page. Example: Navigate, Back-forward Cache, Prerender |
| Attributed Element | Filter by specific elements using attribution. Example: hero-image, start-new-search-button |
| Browser | Filter by browser name. Example: Chrome, Safari |
| Browser Version | Filter by specific browser versions. Example: 144, 16 |
| Device Vendor | Filter by device vendor. Example: Apple, Samsung |
| Device Model | Filter by specific device models. Example: iPhone, Moto G |
| Operating System | Filter by operating system name. Example: iOS, Windows |
| Operating System Version (“OS Version”) | Filter by specific operating system versions. Example: 10.15 |
| Country | Filter by the country from which the user is accessing the site. Example: Australia, Germany, United Kingdom, Poland |
| City | Filter by the city from which the user is accessing the site. Example: Melbourne, Berlin, Liverpool, Krakow |
The Real User Monitoring → Pages report can be filtered by URL type (singular or grouped):
| Option | Description |
|---|---|
| Path | URL of a single page. |
| Page grouping | Group of pages matching a defined URL pattern. |
While filtering by path helps find the best or worst performing pages, filtering by page grouping brings attention to specific groups, such as landing pages, cart pages, blog posts, documentation pages, and so on.
RUM reports can be aggregated using the following options:
| Option | Description |
|---|---|
| P50 (median) | 50% of session metric measurements fall below the P50 value, and 50% above. |
| P75 | 75% of session metric measurements fall below the P75 value, and 25% above. |
| P95 | 95% of session metric measurements fall below the P95 value, and 5% above. |
| P98 | 98% of session metric measurements fall below the P98 value, and 2% above. |
| Minimum | The lowest metric value collected during the selected time period. |
| Maximum | The highest metric value collected during the selected time period. |
P75 aggregation is a good starting point for evaluating how a site is performing. It’s often used as a monitoring baseline, including by the Chrome User Experience Report (CrUX). If your site passes the Core Web Vitals assessment at P75, we recommend moving the target to P95. At P95, the vast majority of visitors have a positive user experience (for example, 95 visitors in 100 have a good experience, while only 5 don’t).
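The percentile aggregations above can be sketched with a simple nearest-rank calculation. Production RUM pipelines typically use interpolation or sketching data structures, so this is an illustration of the concept rather than Calibre’s exact method:

```typescript
// Nearest-rank percentile: the smallest collected value such that at least
// p% of measurements fall at or below it.
function percentile(values: number[], p: number): number {
  if (values.length === 0) throw new Error("no measurements");
  const sorted = [...values].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.max(0, rank - 1)];
}

// 100 hypothetical LCP samples (ms), evenly spread for illustration:
// 30, 60, 90, ..., 3000.
const samples = Array.from({ length: 100 }, (_, i) => (i + 1) * 30);
```

With these samples, P50 is 1,500 ms while P95 is 2,850 ms: the higher the percentile, the more of the slowest sessions the target has to cover.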
RUM reports can be displayed for the following time periods:
| Option | Display |
|---|---|
| Today | Hourly |
| Last 7 days | Daily |
| Last month | Daily |
| Last 3 months | Weekly |
| Last 6 months | Monthly |
| All time | Yearly |
The longer the time period, the easier it is to spot persistent performance trends that might need to be addressed. Shorter time frames can help you find newly emerging changes, such as after a new release.
Page groupings let you use URL patterns to create labels for groups of pages. This allows you to see performance metrics for specific types of pages, such as all blog posts, product pages, or documentation pages.
Creating a Blog Posts grouping with the pattern /blog/* allows you to see how all blog post pages perform as a whole.

You can create and manage page groupings by navigating to Site → Settings → Page Groupings. Page groupings can be used as filters in all RUM reports, and can also be used to group pages in the Pages Leaderboard.
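A pattern like /blog/* behaves like a URL prefix wildcard. Calibre’s exact pattern syntax isn’t detailed here, so the following is a minimal sketch assuming a trailing * matches any suffix of the path:

```typescript
// Minimal pattern matcher: a trailing "*" matches any suffix of the path.
// This is an assumption for illustration; check the product docs for the
// exact pattern syntax Calibre supports.
function matchesGrouping(pattern: string, path: string): boolean {
  if (pattern.endsWith("*")) {
    return path.startsWith(pattern.slice(0, -1));
  }
  return path === pattern;
}

// Bucket paths under named groupings; unmatched paths fall into "other".
function groupPages(
  groupings: Record<string, string>,
  paths: string[]
): Record<string, string[]> {
  const out: Record<string, string[]> = { other: [] };
  for (const name of Object.keys(groupings)) out[name] = [];
  for (const path of paths) {
    const name =
      Object.keys(groupings).find((g) => matchesGrouping(groupings[g], path)) ??
      "other";
    out[name].push(path);
  }
  return out;
}

const grouped = groupPages(
  { "Blog Posts": "/blog/*", Products: "/products/*" },
  ["/blog/rum-intro", "/products/rum", "/pricing"]
);
```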