When you’re building a modern web experience, it’s important to measure, optimise and monitor if you’re to get fast and stay fast. Performance plays a significant role in the success of any site, as high-performing sites engage and retain users better than poorly performing ones. Tools like PageSpeed Insights (PSI) and Lighthouse can be valuable for monitoring the performance of your sites. But what do they do?
Lighthouse offers a comprehensive set of performance audits, along with optimisation opportunities and an estimate of how much load time each one could save. You can find it in the Chrome DevTools Audits panel and in PageSpeed Insights.
PageSpeed Insights reports on the performance of a page on mobile and desktop devices and provides suggestions on how that page may be improved.
PSI provides both ‘lab’ (Lighthouse) and ‘field’ data about a page. Lab data is useful for debugging performance issues, as it is collected in a controlled environment. However, it may not capture real-world bottlenecks. Field data is useful for capturing true, real-world user experience but has a more limited set of metrics.
At the top of the report, PSI provides a score that summarises the page’s performance. This score is determined by running Lighthouse to collect and analyse lab data about the page. A score of 90 or above is considered fast, 50 to 89 is considered moderate and below 50 is considered slow.
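Those thresholds can be expressed as a small helper. The function below is a hypothetical example (not part of PSI or Lighthouse itself) that buckets a 0–100 performance score using the ranges above:

```javascript
// Hypothetical helper: classify a performance score (0–100) using the
// thresholds described above — 90+ fast, 50–89 moderate, below 50 slow.
function speedCategory(score) {
  if (score >= 90) return 'fast';
  if (score >= 50) return 'moderate';
  return 'slow';
}

console.log(speedCategory(95)); // 'fast'
console.log(speedCategory(75)); // 'moderate'
console.log(speedCategory(42)); // 'slow'
```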
Lab versus field data
Lab data is performance data collected within a controlled environment with predefined device and network settings.
This offers reproducible results and debugging capabilities to help identify, isolate and fix performance issues.
Benefits:
- Helpful for debugging performance issues
- End-to-end and offers deep visibility into the UX
- Provides a reproducible testing and debugging environment

Limitations:
- Might not capture real-world bottlenecks
- Cannot correlate against real-world page KPIs
Note: Tools like Lighthouse and WebPageTest (https://www.webpagetest.org/) collect this type of data.
Field data (also called real user monitoring, or RUM) is performance data collected from real page loads your users are experiencing in the wild.
Benefits:
- Captures true real-world user experience
- Enables correlation to business key performance indicators

Limitations:
- Restricted set of metrics
- Limited debugging capabilities
Note: Public data sets like the Chrome User Experience Report (https://developers.google.com/web/tools/chrome-user-experience-report/) and performance tools like the PageSpeed Insights speed score report this type of data.
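As a concrete sketch: the PSI v5 API (endpoint `https://www.googleapis.com/pagespeedonline/v5/runPagespeed`) returns both kinds of data in one response — lab data under `lighthouseResult` and CrUX field data under `loadingExperience`. The trimmed, made-up response below shows where each lives; the field names follow the v5 API, but treat the exact shape as an assumption to verify against the API reference:

```javascript
// Made-up, trimmed PSI v5 response for illustration only — real
// responses contain many more fields.
const sample = {
  lighthouseResult: { // lab data, produced by Lighthouse
    categories: { performance: { score: 0.93 } }, // 0–1 scale
  },
  loadingExperience: { // field data, from the Chrome UX Report
    metrics: {
      FIRST_CONTENTFUL_PAINT_MS: { percentile: 1400, category: 'FAST' },
    },
  },
};

// Lighthouse reports the score on a 0–1 scale; PSI displays it as 0–100.
const labScore = Math.round(
  sample.lighthouseResult.categories.performance.score * 100
);
const fieldFcp = sample.loadingExperience.metrics.FIRST_CONTENTFUL_PAINT_MS;

console.log(labScore);          // 93
console.log(fieldFcp.category); // 'FAST'
```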
In the past, web performance has been measured with the load event. However, this moment doesn’t necessarily correspond with what the user cares about. Over the last few years, browser teams have been working to standardise a set of new metrics and APIs that more accurately capture the performance of a web page.
To help ensure that the metrics are relevant to users, they are framed around a few key questions about the user experience.