mPulse Real User Monitoring Methodology

Document created by B-F-F08DRX Employee on Jul 20, 2017. Last modified by Sheril Joseph on Aug 11, 2017. Version 3.

Akamai strongly believes that the most successful web sites today implement a holistic web performance management program incorporating functional testing, load testing, and monitoring.  It is not enough to develop and test a site and hope for the best.  An ongoing web performance monitoring program is critical to ensuring that the site meets its goals for performance, availability, and critical business objectives.


Using the right tools, like Akamai mPulse, is only part of the solution.  You need a strategy, you need to collect the right kinds of data for performance and business metrics, you need to be able to assess the data and use it to drive improvement, and you need a plan for alerting, triage, and diagnostics when things go wrong.  This document covers Akamai's methodology for web performance management and how to get the most out of mPulse.




Web site performance monitoring has evolved over time to help site owners manage increasingly complex and challenging site designs and business goals.


The first web performance monitoring solutions focused primarily on availability.  In the days when the Internet itself was not as reliable as it is today, site owners needed to know if their customers could actually get to their sites at all and alert them if their ISP, or some other critical component in their service offering, was failing.  Soon, site owners demanded more and the first external site performance monitoring offerings began to appear on the market. 


These “synthetic” web measurements, as they came to be known, were taken from servers located in a handful of places on the Internet which robotically requested specific web pages at regular intervals.  These early approaches did not use real web browsers, did not execute scripts on the pages, did not track user states, and did not step through user journeys on a site.  As synthetic web measurement technologies improved, some of these limitations were overcome.  Test services began to use real browsers, and allowed a user to script a multi-step journey.  Test measurements could be run on a variety of network connection types and even on real mobile devices.


However, even with these improvements, synthetic web site monitoring could not tell the complete story for web performance.  Measurements were taken from a small sample of locations that did not reflect the real variety of visitors in small and medium markets.  The choice of network connections was also limited, especially on mobile smartphone measurements.  Entire countries might have no probes from which to take measurements at all.  Often, only one or two browsers were available to measure with, and they might not be the most popular versions of those browsers.  Some pages, such as purchase confirmation pages, might be unreachable in a synthetic script, and the vast majority of pages on a site might never be included in a measurement script at all.


How much better would it be if a site's own visitors could report their real experiences, rather than relying on simulated users repeating the same canned journeys over and over?


Real User Monitoring 


Actually, why couldn't the users' browsers self-report how long the page takes to load?  Browsers have standard timing events that fire during page loads, and JavaScript can access the details of those events.  Why not have a small piece of JavaScript running on each page that collects timing information and sends it back to a central data repository?  Out of these observations, Real User Monitoring (RUM) was born.
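The core idea can be sketched in a few lines of JavaScript.  This is an illustration only, not the mPulse or Boomerang implementation: the timing field names follow the W3C Navigation Timing API, while the beacon payload shape and the `/beacon` endpoint are assumptions invented for the example.

```javascript
// Minimal RUM sketch (illustrative only; not the mPulse implementation).
// Given a Navigation Timing object (window.performance.timing in a
// browser), derive basic timers and build a beacon payload.

function computeLoadTimeMs(timing) {
  // Total page load: navigation start through the end of the load event.
  return timing.loadEventEnd - timing.navigationStart;
}

function buildBeacon(timing, pageGroup) {
  return {
    t_done: computeLoadTimeMs(timing),                     // full page load
    t_resp: timing.responseStart - timing.navigationStart, // time to first byte
    page_group: pageGroup,                                 // hypothetical field
  };
}

// In a browser, this would run once the load event has fired, e.g.:
//
//   window.addEventListener('load', function () {
//     var beacon = buildBeacon(performance.timing, 'Home Page');
//     navigator.sendBeacon('/beacon', JSON.stringify(beacon)); // endpoint is hypothetical
//   });
```

A real RUM library does far more (retries, unload handling, single-page-app support), but the collect-then-beacon shape is the essence of the approach.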


Unlike synthetic monitoring, a RUM measurement strategy collects and reports on real user experiences by directly examining the time it takes pages to load in the real user's own web browser.  The first RUM implementations collected only page load times, but as interest in this approach grew, the W3C developed new standards for web browsers to give developers more detail into the performance of real web page visits.  New APIs like Navigation Timing and Resource Timing have standardized the kinds of performance data used by the industry across most modern browsers.  Efforts to standardize the way the JavaScript used for RUM loads on a web page, to prevent it from blocking or delaying any critical page content, gave site developers confidence to use this new measurement approach.
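As an illustration of what Resource Timing exposes, the sketch below aggregates an array of resource entries (shaped like those returned by `performance.getEntriesByType('resource')` in a browser) into a per-initiator summary.  The summary logic itself is a hypothetical example, not part of any standard.

```javascript
// Summarize Resource Timing entries: count and total download time per
// initiator type (script, img, css, ...). The entry shape follows the
// W3C Resource Timing API; the aggregation is only an illustration.

function summarizeResources(entries) {
  var summary = {};
  entries.forEach(function (e) {
    var s = summary[e.initiatorType] ||
            (summary[e.initiatorType] = { count: 0, totalMs: 0 });
    s.count += 1;
    s.totalMs += e.responseEnd - e.startTime; // fetch start to last byte
  });
  return summary;
}

// In a browser:
//   summarizeResources(performance.getEntriesByType('resource'));
```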


By using real user performance data rather than synthetic web site measurement data, web site owners gain comprehensive insight into their users' experience.  Data comes in from all kinds of browsers, locations, network connections, and device types.  The challenge is how to cope with the massive quantity of data.  With millions or billions of page views a month, how do you make sense out of the performance data?  How do you correlate performance data with business objectives and outcomes?


Akamai mPulse is built on the industry-standard open-source Boomerang project.  mPulse extends the functionality of Boomerang, but retains the key design principles that have made it, and RUM in general, successful in the marketplace.   


The graphic below illustrates the elements of Akamai's mPulse Methodology, which leverages existing best practices to extend traditional approaches to web performance monitoring and address the new opportunities and challenges presented by Real User Measurements.



The following is a description of each pillar and step in this iterative methodology (shown above).  The fundamental approach, from a business strategy standpoint, is to assess the current people, processes, and tools used by an organization and map those to the Akamai methodology.  A number of methodology assets, which Akamai makes available to its customers, are associated with each pillar.  These assets include videos, tips, and best practices documents covering custom timers and metrics, custom dimensions, custom dashboards, alerts, and more.





A clear and generally accepted strategy is core to the Akamai performance methodology.  The broader performance management, operations, and development teams need to understand and buy into the overall web performance management program.  All stakeholders should have a voice in the overall strategy and stand to benefit from its implementation.



Committing to standards and principles of measurement will help ensure that all stakeholders in the web performance management program can be invested in the results.  Akamai mPulse is committed to the following principles in the use of Real User Measurements:


Performance Measurement 101

Although performance measurement solutions are becoming commonplace, many people still have only a vague idea of what they actually do.  This article highlights how synthetic and real user measurement work, the pros and cons of each, and how they complement each other.

Synthetic and RUM: A Recipe for Webperf Success

This presentation helps you understand how RUM can complement an existing synthetic performance measurement program.

mPulse Boomerang Overhead

The mPulse Boomerang implementation is designed to have the smallest possible footprint, whether Boomerang is loaded into a mobile browser or into some other browser type.  The mPulse Team has ensured by design that your users' page performance comes first, and that the performance being measured is not adversely affected by the measurement tool itself.

The mPulse Script Loader

This mPulse Technical Article describes why it is safe to place our JavaScript snippet at the top of the page.
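The non-blocking pattern such a snippet relies on can be sketched as follows.  This is a generic async-loader illustration with a placeholder URL, not the actual mPulse snippet:

```javascript
// Generic non-blocking script loader (illustrative; not the real mPulse
// snippet). An async script element does not block the HTML parser, so
// a loader like this can safely sit near the top of the page.

function injectAsyncScript(doc, src) {
  var s = doc.createElement('script');
  s.src = src;
  s.async = true; // key property: fetch and execute without blocking parsing
  doc.head.appendChild(s);
  return s;
}

// In a browser (placeholder URL):
//   injectAsyncScript(document, 'https://example.com/boomerang.js');
```

Passing the document in as a parameter is only for testability here; a real snippet would reference `document` directly.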

Measure Everywhere

This white paper describes why you want to collect 100% of all page views (link coming soon...)

Only Real Users

This technical reference documents the User-Agent strings that Akamai mPulse uses to identify beacons originating from synthetic web performance measurements and other bots (search engine crawlers, etc.).

(link coming soon...)
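A simplified version of that kind of filter can be sketched as below.  The patterns shown are common generic examples, not Akamai's actual list:

```javascript
// Simplified synthetic-traffic filter based on User-Agent substrings.
// These patterns are common generic examples only; the actual list
// mPulse uses is documented in the technical reference above.

var BOT_PATTERNS = [/bot/i, /crawler/i, /spider/i, /headless/i, /ptst/i];

function isSyntheticUserAgent(ua) {
  return BOT_PATTERNS.some(function (re) { return re.test(ua); });
}
```

Beacons matching such patterns would be excluded so that only real-user measurements reach the dashboards.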



Begin a web performance management program by deciding what Key Performance Indicators are important to your online business.  Different stakeholders will have different ideas about what is important to measure.  Sites in different industries may even have different objectives for their online presence.  The most effective web performance management programs work to a consensus agreement on what timers, metrics, and measurements matter.  Periodically reviewing and adjusting these KPIs over time is also important.


What Should I Measure?

A PPT presentation on the various options for timers, custom timers, and metrics that you can collect in Akamai mPulse. (link coming soon...)

Desktop, tablet, mobile?

Is your site built on the principles of Responsive Web Design?  Or do you have separate sites for desktop and mobile?  This document describes options and strategies for approaching RUM in a multi-screen world.

(link coming soon...)

How to Provide Real User Monitoring for Single-Page Applications

At Akamai, we’re seeing a growing need to be able to monitor today’s latest breed of websites built with SPA frameworks such as AngularJS. To do so, we’ve made improvements to Boomerang, which is responsible for gathering performance metrics for mPulse.

Mobile Apps vs Mobile Web

Users experience your online presence in different ways when using mobile apps versus the mobile web.  How does this affect your performance measurement strategy? (link coming soon...)



After establishing the kinds of data that matter to the business, successful web performance management programs identify specific goals and establish the means to reach those goals.


Conversion rates

This document describes conversion rate, revenue metrics, and how to set goals for better online business results. (link coming soon...)

Bounce rates

What is a good bounce rate for my industry or my site?  This document describes what bounce rate is, why you should track it, and what it can mean to your business. (link coming soon...)
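For reference, bounce rate is conventionally computed as the share of sessions with exactly one page view.  A minimal sketch, with a hypothetical session shape:

```javascript
// Bounce rate: percentage of sessions that viewed exactly one page.
// The { pageViews: n } session shape is hypothetical for this example.

function bounceRatePct(sessions) {
  if (sessions.length === 0) return 0;
  var bounced = sessions.filter(function (s) {
    return s.pageViews === 1;
  }).length;
  return (bounced / sessions.length) * 100;
}
```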

Medians, Percentiles, and Percentages

Akamai believes that performance can never be scored with a single numerical value.  Read here about strategies to use more than just the median in determining whether or not your site is meeting your performance goals. (link coming soon...)
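The point about not relying on a single number can be made concrete: a median and a 95th percentile of the same load-time distribution often tell very different stories.  A small sketch using the nearest-rank percentile method (one common convention; mPulse's exact computation may differ):

```javascript
// Compute a percentile of load times using the nearest-rank method.
// Interpolating methods give slightly different values; this is just
// one common convention.

function percentile(values, p) {
  var sorted = values.slice().sort(function (a, b) { return a - b; });
  var rank = Math.ceil((p / 100) * sorted.length); // nearest-rank
  return sorted[Math.max(rank - 1, 0)];
}

// Example: a page whose median looks healthy can still have a slow tail.
var loadTimesMs = [900, 1000, 1100, 1200, 1300, 1400, 1500, 1600, 4000, 9000];
// percentile(loadTimesMs, 50) -> 1300 (looks fine)
// percentile(loadTimesMs, 95) -> 9000 (one in twenty users waits 9 seconds)
```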





In order to implement a performance monitoring strategy, it is critical to understand the people and processes that need to be in place. This is the link between the strategy and execution. In some cases, it is one or two people who do everything and the processes are rather simple. In other cases, there will be multiple people for many of the responsibilities and a comprehensive web performance monitoring program that is part of a much larger process.



Every company will have different needs when it comes to data access and control.  There are a number of responsibilities in a web performance management program.  You may be able to perform them yourself or you might choose to look for professional services support from Akamai.  In most situations, there will be multiple people involved in executing discrete tasks, such as app configuration, tagging the web sites, performing data analysis, and responding to alerts. For some customers, Akamai will help support many of these roles.


Roles and Responsibilities

These slides provide a high level overview of the roles and responsibilities associated with ongoing web performance monitoring and how they relate to development and testing organizations.  (link coming soon...)

Managing Users and Groups

This video covers how to create, manage, and delete users and groups in mPulse.  (link coming soon...)

Permissions in mPulse 

This mPulse Technical Article describes the Permissions feature in Akamai mPulse.  This feature lets you grant or restrict access to apps, dashboards, and alerts within the system to specific users or groups.



Properly configuring mPulse can be the difference between a successful web performance management program and one that fails to deliver the expected insights and value to the business.  The following tasks are critical to the implementation of mPulse.


mPulse Setup

This mPulse Knowledge Base article describes at a high level the steps required to set up and deploy an mPulse Web App and begin collecting performance beacons.

Loading the JS snippet

How the JS snippet loads on the web pages being monitored is important.  This document covers best practices for calling the JS snippet, and options for specific tag management systems.  (link coming soon...)

Page Groups

This video describes the options for configuring page groups, assesses their pros and cons, and covers best practices. This can be found on the mPulse Customizations tab.
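One common way to define page groups is an ordered mapping from URL path prefixes to group names.  The rules below are hypothetical examples; real configurations are done in the mPulse Customizations tab:

```javascript
// Map a URL path to a page group name using ordered prefix rules.
// The rules here are hypothetical examples only.

var PAGE_GROUP_RULES = [
  { prefix: '/product/',  group: 'Product Detail' },
  { prefix: '/category/', group: 'Category' },
  { prefix: '/checkout',  group: 'Checkout' },
];

function pageGroupFor(path) {
  for (var i = 0; i < PAGE_GROUP_RULES.length; i++) {
    if (path.indexOf(PAGE_GROUP_RULES[i].prefix) === 0) {
      return PAGE_GROUP_RULES[i].group;
    }
  }
  return 'Other'; // catch-all for unmatched paths
}
```

Grouping pages this way lets thousands of distinct URLs roll up into a handful of meaningful rows in a dashboard.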

Custom Timers and Metrics

These videos describe the options for configuring custom timers and metrics, assess their pros and cons, and cover best practices. These can be found on the mPulse Customizations tab.
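A custom timer often corresponds to the interval between two User Timing marks.  The sketch below derives a duration from mark entries shaped like those returned by the browser's User Timing API; the mark names are hypothetical:

```javascript
// Derive a custom timer from two User Timing marks. The entry shape
// ({ name, startTime }) follows the W3C User Timing API; the mark
// names used in the comments below are hypothetical examples.

function customTimerMs(marks, startName, endName) {
  var byName = {};
  marks.forEach(function (m) { byName[m.name] = m.startTime; });
  if (!(startName in byName) || !(endName in byName)) return null;
  return byName[endName] - byName[startName];
}

// In a browser, the marks would be set around the code being timed:
//   performance.mark('hero_start');  ...render hero image...
//   performance.mark('hero_loaded');
//   customTimerMs(performance.getEntriesByType('mark'),
//                 'hero_start', 'hero_loaded');
```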

A/B test variable

This video covers configuring the A/B Test Variable in Akamai mPulse and how to use it in the Dashboards for data analysis. (link coming soon...)

Configuring mPulse to Measure User Bandwidth

This mPulse Knowledge Base Article describes how to configure mPulse to collect throughput data for a sample of your real users.





Collecting data, through thoughtful configuration of mPulse, is just the first step to empowering your team to drive change.  You need to be able to make sense out of the vast volume of data.  Different teams will need different ways to access, process, and assess the data.  Some teams will need real-time access for immediate concerns and incident response.  Others may need more complex, deep dives into the data.  Having the right tools is critical, but just as important is understanding how to approach the process of data analysis.   It is important not just to be able to get answers to questions, but to know what questions to ask.



A surprisingly challenging task for many organizations is assessing the baseline performance of a site.  Akamai mPulse can serve as the authoritative reference for all invested stakeholders in an organization.


How Fast Are We?

This document discusses how the answer to this question can vary from one organization to the next, how terminology differs across RUM data, and the options available for answering the question of “how fast are we?”  (link coming soon...)

mPulse Beacon Parameters Reference

mPulse collects hundreds of data points for each page view.  This reference describes each field in the raw data.



For most organizations, fast page load performance is not an end in itself.  What really matters is how faster page load times can contribute to improved business outcomes.  mPulse has powerful tools for correlating performance timer data with business metrics data.
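The kind of correlation involved can be illustrated by bucketing sessions by load time and computing a conversion rate per bucket.  The session shape and bucket edges below are hypothetical:

```javascript
// Conversion rate by load-time bucket (illustrative only; the session
// shape and bucket edges are hypothetical). Sessions slower than the
// last edge are ignored in this sketch.

function conversionByBucket(sessions, bucketEdgesMs) {
  var buckets = bucketEdgesMs.map(function (edge) {
    return { upToMs: edge, total: 0, converted: 0 };
  });
  sessions.forEach(function (s) {
    for (var i = 0; i < buckets.length; i++) {
      if (s.loadTimeMs <= buckets[i].upToMs) {
        buckets[i].total += 1;
        if (s.converted) buckets[i].converted += 1;
        break; // count each session in its first matching bucket
      }
    }
  });
  return buckets.map(function (b) {
    return { upToMs: b.upToMs, ratePct: b.total ? (100 * b.converted) / b.total : 0 };
  });
}
```

A table like this, built over real traffic, is what makes a claim such as "sessions under one second convert twice as often" testable rather than anecdotal.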


How do changes in performance impact my KPIs?

Learn how to use custom metrics to track the business level impact of performance changes on the site.  (link coming soon...)

What If Dashboard

A video tour of Akamai’s What If Dashboard, a powerful tool for modeling future changes in revenue, conversion, and performance.  (link coming soon...)

What Should I Trend?

This document offers best practice advice on what KPIs should be examined over longer periods of time.  (link coming soon...)



Having lots of data means that there are lots of opportunities to explore and discover.  Questions will come up that you may never have considered when you first decided to invest in a RUM deployment.  RUM gives you the opportunity to make new connections between user experience and user satisfaction.


Creating Custom Dashboards

This video describes the basic steps to creating and using custom dashboards in mPulse.  (link coming soon...)

Custom Dashboard Tips

Best Practice recommendations on setting up custom dashboards from Akamai’s Professional Services experts.  (link coming soon...)





An important operational requirement for most web performance management programs is the ability to identify problems with the site as quickly as possible.  No matter how much testing and preparation you have put into the reliability and performance of the site, problems can and do happen.  Being able to identify these issues and react to them quickly can be the difference between success and failure in the digital marketplace.



Using RUM data to identify problems on your site can lead to fewer false positives than other approaches.  However, it is a surprisingly complex task to identify exactly what conditions should generate an alert.  mPulse has many ways to translate a concept like “the servers are down” or “the pages are too slow” into specific conditions that can trigger notifications to your operations team.
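Translating "the pages are too slow" into a concrete trigger usually means a threshold evaluated over a sliding window of recent beacons, with a minimum sample count to avoid false alarms on thin data.  A minimal sketch, with hypothetical numbers:

```javascript
// "Pages too slow" as a concrete trigger: alert when the median load
// time over the most recent window exceeds a threshold, with a minimum
// beacon count to avoid firing on thin data. All numbers below are
// hypothetical examples, not mPulse defaults.

function medianMs(values) {
  var s = values.slice().sort(function (a, b) { return a - b; });
  var mid = Math.floor(s.length / 2);
  return s.length % 2 ? s[mid] : (s[mid - 1] + s[mid]) / 2;
}

function shouldAlert(recentLoadTimesMs, thresholdMs, minSamples) {
  if (recentLoadTimesMs.length < minSamples) return false; // too little data
  return medianMs(recentLoadTimesMs) > thresholdMs;
}

// Example: shouldAlert(lastFiveMinutesOfBeacons, 3000, 50)
```

Using the median (or a high percentile) rather than the mean keeps a handful of outlier beacons from triggering spurious notifications.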


Using mPulse Alerts

This article provides basic information about the Alerts system in Akamai mPulse, including how to configure and track Alerts.

What should we alert on?

This document provides best practice guidelines for what kind of alerts are commonly used with RUM data.  (link coming soon...)

Alert Configuration Walkthrough

This video will step you through how to configure an alert and notifications.  (link coming soon...)



Once an alert has been triggered, it is important to get as much information about the situation as quickly as possible.  With Akamai mPulse, you can configure custom dashboards to jumpstart the investigation by delivering you directly to the problem spot.


Using mPulse to investigate alerts

This video illustrates how mPulse alert notifications work with custom dashboards.  

(link coming soon...)





Once you have established goals and objectives and assessed how closely the site is meeting them, you will find gaps that need to be addressed.  RUM data is a rich source of information from which to identify opportunities for improvement. Akamai mPulse puts this data at your fingertips through powerful Dashboards and the Data Science Workbench.



Developers need to be able to drill into performance events and concerns to identify the root cause of the problem.  These guides offer approaches for solving common performance issues, and best practice guidelines to using RUM to help develop the next iteration of your web site.


Diagnosing performance events

This document describes common approaches to diagnosing performance events using mPulse dashboards.  (link coming soon...)

The Waterfall Analysis Dashboard

This video tutorial describes how to use the mPulse Waterfall Analysis Dashboard.  (link coming soon...)



While operational or business management interests often bring mPulse into an organization, the RUM data can be extremely useful for ongoing development efforts. Instead of using the data solely to validate performance changes, devops teams can use it directly to support site development and new releases.


Trending Best Practices

This document offers guidelines and best practices for trending performance over time with mPulse.  (link coming soon...)

Data Science Workbench FAQ

An FAQ document about using the Data Science Workbench to investigate RUM data in ways that cannot be done in real-time Dashboards.  (link coming soon...)

mPulse Aggregate Data API

Guide to configuring and using the mPulse Aggregate Data API to export data into your own systems.

S3 Beacon Upload Logs

How to access the raw beacon-level data in your own Amazon S3 bucket.  



Connect mPulse with Akamai CloudTest and you have a powerful combination for generating accurate and effective test plans.  mPulse provides an accurate view into real user behavior on the site.  Rather than relying on guesswork, leverage this information to create your test plans.


Develop test plans with mPulse

Best practices document for using mPulse data to identify peak load conditions and scenarios on your production web site.  (link coming soon...)