Measuring Real User Experience with Site Speed Gauge

Many web performance testing service vendors, such as Keynote, offer last-mile testing data. In addition, many popular tools, such as WebPagetest, provide a convenient way to submit test cases so that you can collect large volumes of performance metrics. However, most of these services and tools are synthetic, meaning that the data does not come from real users: testing agents perform the tests from dedicated machines in a controlled environment.

The real world can be very different. A great number of parameters can affect the data dramatically; to name a few: network connection speed (last-mile speed in particular), browser type, and computer CPU power. More importantly, the distribution of parameter values affects the results substantially. For example, we can get DSL and high-speed backbone test results from Keynote for a particular type of browser, but we don't know exactly how many users have a network speed comparable to the testing network's speed, and connection speeds that exist in the real world but aren't covered by the tests are missed entirely. It's difficult to weight each of the tests and compute an average that reflects the real world.

Several years back, at eBay we began building a framework called Site Speed Gauge to capture real-world, end-user site speed data. Site Speed Gauge has proven to be an effective way to understand real user experience with web performance, while also correlating site speed with the conversion of traffic to purchases. This framework is now the gold standard at eBay for monitoring site speed. Compared to synthetic tests, which yield at most several thousand data points per day for a site page, we get millions of data samples for the same page. This large, statistically significant sample gives us stable trending and helps us identify real issues. This post provides an overview of how we use Site Speed Gauge to monitor and improve site performance as well as to collect meaningful business metrics.

The art of site speed measurement

We often hear questions such as, Do you know how fast or slow my web page is? Does it matter to end users? Depending on the source of the question, you'll likely have a different answer. If a page takes 10 seconds to load and the person asking the question can wait that long, then the speed is fine. However, if I am an engineer, I probably would consider 10 seconds too slow, and I'd want to know why the page is slow. Is the server response slow, or is the network connection slow? How long do I need to wait to see the first impression on the page? How long does it take to render the full page after getting the first byte of data? What is the speed on different types of browsers, so I can tune the slow ones? If I am a business person, I would want to know how many users are affected by slow web performance; a distribution of performance data, such as the median or the 90th percentile, would be useful. Most importantly, everyone wants to know the correlation between site speed and business metrics: does site speed affect our company's top line?

To answer these different questions, we need various measurements and data analysis views. Throughout the design and enhancement of Site Speed Gauge, we have continually ensured that it is extensible enough to meet these different needs. Site speed measurement is an art; you appreciate its beauty when the measurement matches exactly what a particular circumstance calls for.

How Site Speed Gauge works

The following diagram describes the 10 steps of Site Speed Gauge, from receiving a user request to reporting data:

[Figure: The 10 steps of Site Speed Gauge, from receiving a user request to reporting data]

For older browser versions, where the Navigation Timing object is not available, we mainly use JavaScript to capture the timers. Client-side timer measurements using JavaScript are relatively easy and accurate, as we can start the timer at the beginning of the page, and use the client machine’s clock to measure any points during the page rendering. The difficult measurement is the total end-to-end time, from when the user’s click initiates the page request through when the page completes with the browser onload event. We can use a cookie to store the timestamp when a user leaves the previous page, and measure end-to-end time when loading of the current page is completed. However, if the previous page is not from eBay—for example, if it is a third-party search or referral page—we will miss the end-to-end time metric.
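
A minimal sketch of this cookie-based approach for older browsers might look like the following. The variable and cookie names (ssgStart, ssg_nav_start) and the beacon URL are hypothetical, not eBay's actual implementation:

```javascript
// Minimal sketch of client-side timing for older browsers
// (hypothetical names, not eBay's actual implementation).

// As early as possible in the <head>: record when the page starts processing.
var ssgStart = new Date().getTime();

// When the user leaves the page, store a timestamp in a cookie so the *next*
// page can compute a full end-to-end time. This only works when the previous
// page is also one of ours.
window.onbeforeunload = function () {
  document.cookie = 'ssg_nav_start=' + new Date().getTime() + '; path=/';
};

// On load, compute render time from the in-page timer, and end-to-end time
// from the cookie if it exists.
window.onload = function () {
  var now = new Date().getTime();
  var renderTime = now - ssgStart;  // time spent rendering this page
  var match = document.cookie.match(/(?:^|; )ssg_nav_start=(\d+)/);
  var endToEnd = match ? now - parseInt(match[1], 10) : null;  // null if the referrer was external

  // Report the timers with a beacon request (an image GET keeps it simple).
  new Image().src = '/ssg/beacon?render=' + renderTime +
                    (endToEnd !== null ? '&e2e=' + endToEnd : '');
};
```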

Therefore, we instead use server-side timers for end-to-end time measurement. The beginning timestamp st1 is taken when an app server receives the page request, and the end timestamp st2 is taken when a separate metrics collection server receives the site speed beacon request. We miss the network time of the user's initial page request, but the beacon request time roughly compensates for it. Because the app server's clock and the metrics collection server's clock may not be in sync, we can use a time-sync service on the two machines; to provide sufficient accuracy, the sync service should return timestamps in milliseconds. Alternatively, we can use a single database timestamp to eliminate the clock-sync issue.
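
Conceptually, the end-to-end time is simply st2 minus st1. The Node-style sketch below illustrates the idea with both roles in a single process; the endpoint names and the trick of echoing st1 back on the beacon are assumptions for illustration, not eBay's actual design:

```javascript
// Sketch of server-side end-to-end timing (illustrative only; endpoint names
// and the mechanism for passing st1 to the page are assumptions).
const http = require('http');

http.createServer((req, res) => {
  if (req.url.startsWith('/page')) {
    // st1: the app server receives the page request.
    const st1 = Date.now();
    // Embed st1 in the page so the beacon can echo it back.
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end('<html><body onload="new Image().src=\'/ssg/beacon?st1=' + st1 + '\'">' +
            'page content</body></html>');
  } else if (req.url.startsWith('/ssg/beacon')) {
    // st2: the metrics collection server receives the beacon request.
    // (Here both roles run in one process; with separate machines, their
    // clocks must be synchronized to millisecond accuracy, as described above.)
    const st2 = Date.now();
    const st1 = Number(new URL(req.url, 'http://localhost').searchParams.get('st1'));
    console.log('end-to-end ms:', st2 - st1);  // approximate user-perceived end-to-end time
    res.writeHead(204);
    res.end();
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);
```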

For the latest versions of browsers, we also send back measurements from the Navigation Timing object those browsers create. These measurements give us very useful information about client-side performance, such as DNS lookup time and network connection time. Through our analysis of the data, we have identified DNS and connection times as major sources of overhead if our datacenters and our clients are not located on the same continent.
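
For browsers that support it, these timers can be derived from the standard performance.timing (Navigation Timing Level 1) object. A minimal sketch, with an illustrative beacon URL:

```javascript
// Sketch of collecting Navigation Timing metrics in browsers that support them.
window.addEventListener('load', function () {
  if (!window.performance || !performance.timing) return;  // older browsers: fall back to JS timers
  var t = performance.timing;
  var metrics = {
    dns:      t.domainLookupEnd - t.domainLookupStart,   // DNS lookup time
    connect:  t.connectEnd - t.connectStart,             // TCP (and TLS) connection time
    ttfb:     t.responseStart - t.navigationStart,       // time to first byte
    domReady: t.domContentLoadedEventStart - t.navigationStart,
    load:     t.loadEventStart - t.navigationStart       // full page load
  };
  // Send the metrics back on the beacon (query-string format is illustrative).
  var qs = Object.keys(metrics).map(function (k) { return k + '=' + metrics[k]; }).join('&');
  new Image().src = '/ssg/beacon?' + qs;
});
```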

Site Speed Gauge features

Currently, Site Speed Gauge supports these major features:

  • Key performance metrics are provided, such as total page end-to-end time, client-side rendering time, first byte time, DOM ready time, certain JavaScript execution times, server processing time, above-fold time, and graphical ads time. About 30 timers are available from the various stages of page processing.
  • Metrics are broken down by page, browser, device, international site, and client IP location. These breakdowns are available as selection dimensions when you query and view the data.
  • Data sampling is adjustable, from 100% sampling for low-traffic pages to smaller percentages for high-traffic pages. For a heavily trafficked site like eBay, with billions of page views per day, processing and scaling this big data requires controlling the sample size (a minimal sampling sketch follows this list).
  • Through the gauge’s support for A/B testing, we can tie site speed to site feature changes. This ability is very useful for collecting business metrics correlation data; more on this in the next section.
  • In addition to collecting web performance data, we can plug in collection of other user behavior data: user clicking, scrolling, and browser size data. We have built heat maps on top of this data to analyze user interactions with the pages.
  • We can plug in other browser performance objects, such as the Navigation Timing object available in new browser versions. As described previously, this capability enables us to get more visibility into the network layer, such as DNS lookup and network connection times.
  • Site Speed Gauge also supports other client-side capturing, such as JavaScript error capturing. Without this capability, we would be flying blind, unaware of problems until getting complaints from end users.
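
Here is a minimal sketch of the adjustable sampling mentioned in the list above; the page names, rates, and beacon URL are hypothetical:

```javascript
// Sketch of adjustable beacon sampling (page names, rates, and beacon URL are
// hypothetical, not eBay's actual configuration).
var SAMPLING_RATES = {
  lowTrafficPage: 1.0,   // 100% sampling for a low-traffic page
  searchResults:  0.1,   // 10% sampling for a high-traffic page
  viewItem:       0.05
};

function shouldSample(pageName) {
  var rate = SAMPLING_RATES[pageName];
  return Math.random() < (rate === undefined ? 1.0 : rate);
}

// Only build and send a beacon for sampled page views.
window.onload = function () {
  if (!shouldSample('searchResults')) return;
  var renderTime = new Date().getTime() - window.ssgStart;  // assumes a start timer set in <head>
  new Image().src = '/ssg/beacon?page=searchResults&render=' + renderTime;
};
```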

Integration with A/B testing

Tracking site speed trends helps us to identify site feature rollout issues and to monitor the site speed improvements we achieve from various optimizations. In addition, we can run different versions of a web page at the same time and compare the site speed of each version. The resulting data enables us to correlate business metrics with site speed in a precise way. One of the characteristics of eBay's business is seasonality; if we can simultaneously monitor site speed and business metrics for a page through seasonal and other variations, we can build meaningful correlations.

To enable such analysis, we have integrated Site Speed Gauge with eBay’s A/B testing platform. In this platform, business metrics are collected and analyzed based on testing and control groups; a user session id identifies which group a user belongs to. We use the same session id to collect site speed as well as business metrics. Once a user is sampled for site speed, all pages viewed by the same user are sampled so that we have all site speed and business metrics data for this user.
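
One simple way to make the sampling decision sticky per user is to derive it from the session id rather than from a random number, so that once a user is sampled, every page view in that session is sampled too. The sketch below illustrates the idea; the hash and bucketing scheme are illustrative, not eBay's actual algorithm:

```javascript
// Sketch: derive the sampling decision from the session id so that all page
// views from a sampled user are sampled (hash and threshold are illustrative).
function hashSessionId(sessionId) {
  var h = 0;
  for (var i = 0; i < sessionId.length; i++) {
    h = (h * 31 + sessionId.charCodeAt(i)) >>> 0;  // simple 32-bit rolling hash
  }
  return h;
}

function isUserSampled(sessionId, samplingRate) {
  // The same session id always maps to the same bucket, so the decision is
  // consistent across every page the user views.
  return (hashSessionId(sessionId) % 100) < samplingRate * 100;
}

// Example: sample 10% of users; the same session id always yields the same answer.
console.log(isUserSampled('a1b2c3d4', 0.1));
```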

Several years ago, we ran two versions of the eBay search page. The versions had the same features but different implementations, one our classic search and the other a new implementation. We collected data on site speed as well as buyer purchases per week (PPW, a business metric related to conversion of traffic to purchases), and over time we found a strong correlation between site speed and PPW. As the chart below shows, the correlation is not linear: a 10% difference in site speed corresponded to roughly a 1% difference in PPW, while a 35% difference in site speed corresponded to roughly a 5% difference in PPW. Our interpretation of this result is that a small change in page speed might not have much impact, but a large page speed change can have a noticeable effect on end users; for example, a large slowdown can cause user activity to drop, or even abandonment of the site, thus affecting conversions.

[Figure: Correlation between site speed and PPW]

As we established the correlation between site speed and PPW, business people and engineers alike began vigorously emphasizing site speed when they designed and implemented features. Now, the engineering culture at eBay is that site speed and optimization are part of the design, implementation, and rollout of features. Many important changes to the site first go through A/B testing to ensure that we’re maximizing site speed as well as business impacts. In addition, the Site Operations team uses dashboards and alerts to monitor site speed 24×7.

Data processing and visualization dashboards and tools

Although we typically sample only 10% of traffic for high-traffic pages, those pages still generate large amounts of beacon data. A batch job running at regular intervals processes and aggregates the site speed data in ways that meet the different needs of its various consumers.
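
As a rough illustration of that aggregation step, the sketch below computes 50th and 90th percentile end-to-end times per page and hour from raw beacon records; the record shape and grouping key are assumptions, not eBay's actual pipeline:

```javascript
// Sketch of batch aggregation: compute 50th and 90th percentile end-to-end
// time per page and hour from raw beacon records (data shape is illustrative).
function percentile(sortedValues, p) {
  var idx = Math.min(sortedValues.length - 1, Math.floor(p * sortedValues.length));
  return sortedValues[idx];
}

function aggregate(beacons) {
  var groups = {};  // key: page + hour bucket
  beacons.forEach(function (b) {
    var key = b.page + '|' + new Date(b.timestamp).toISOString().slice(0, 13);
    (groups[key] = groups[key] || []).push(b.e2e);
  });
  return Object.keys(groups).map(function (key) {
    var values = groups[key].sort(function (a, b) { return a - b; });
    return { key: key, p50: percentile(values, 0.5), p90: percentile(values, 0.9), count: values.length };
  });
}

// Example run with a few fake beacon records.
console.log(aggregate([
  { page: 'search', timestamp: 1388534400000, e2e: 1200 },
  { page: 'search', timestamp: 1388534460000, e2e: 1900 },
  { page: 'search', timestamp: 1388534520000, e2e: 800 }
]));
```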

Site Speed Gauge provides dashboards and analytical tools that support data visualization in highly configurable ways. For example, we can select daily results, hourly trending on the 50th percentile, or hourly trending on the 90th percentile. We can view data on a specific eBay page, for a particular international or US site. We can see the site speed for various A/B testing groups.

Here is an example of the site speed breakdown by browser:

[Figure: Site speed breakdown by browser]

And here is an example of the browser usage breakdown:

[Figure: Browser usage breakdown]

Those who are interested in network latency and the impact of a CDN can view results by geographic location; we process the data based on client IP address to support this feature. The image below shows a heat map of site speed for the United States. A picture is worth a thousand words; the heat map helps people assess page speed visually.

[Figure: Heat map of site speed across the United States]

Conclusion

At eBay, Site Speed Gauge helps us to identify user web performance issues and to monitor our site speed improvement efforts. Its ability to collect real-world data and correlate it with business metrics makes it a powerful tool for eBay's highly trafficked, consumer-oriented web site. We built extensibility into Site Speed Gauge from the start; over the past several years we have enhanced it to support different needs, and we expect to continue enhancing it in the future.