When one of the first slides in a presentation offers this level of candor, you know you’re in for a treat:
File this presentation under “I wish I’d been there in person.” Cliff Crocker, Aaron Kulick, and Balaji Ram joined forces at February’s SF Web Performance Meetup to tell a RUM (real user monitoring) story through the lens of three job functions at Walmart.com: the performance data analyst, the developer, and the business analyst.
You can check out the full slide deck here (you need to be a Minus member to view it online, but anyone can download it), but I want to highlight my favourite slides.
First, some back story.
The problem: Folks at Walmart knew their pages were slow. For instance, initial measurements showed that an item page took about 24 seconds to load for the slowest 5% of users. Why? The usual culprits: too many page elements, slow third-party scripts, multiple hosts (25% of page content is served by partners, affiliates, and Marketplace), and various best practice no-nos.
The goal: Create and meet a performance SLA that would bring load times for Walmart’s 95th percentile traffic down to 20 seconds.
The approach: To get to that goal, they dedicated a scrum team to one sprint of performance optimization. At the start of the process, the team established a baseline, using their RUM data to look at the load times of key pages and look for patterns. The team then created targets for page performance and, at the end of the sprint, measured the impact of optimization on key metrics.
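The deck doesn’t show the tooling behind that baseline step, but the core of it — pulling a 95th-percentile load time out of raw RUM samples — can be sketched in a few lines. (The load times below are invented for illustration; any real RUM dataset would have far more samples and fields.)

```python
# Sketch: compute a p95 baseline from RUM load-time samples.

def percentile(samples, p):
    """Return the p-th percentile (0-100) using the nearest-rank method."""
    ordered = sorted(samples)
    rank = max(1, -(-p * len(ordered) // 100))  # ceil(p/100 * n), via ceiling division
    return ordered[rank - 1]

# Hypothetical onload times (in seconds) collected from RUM beacons.
load_times = [3.1, 4.8, 5.2, 6.0, 7.5, 9.1, 11.4, 14.0, 19.7, 24.3]

p95 = percentile(load_times, 95)
print(f"95th percentile load time: {p95:.1f}s")
```

Track that one number sprint over sprint and you have both your baseline and your SLA scorecard.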
The following slides are a great use of historical RUM data to make a powerful case (embodied in compelling graphs) for investing in performance. Throughout this set, there’s a clear, consistent correlation between load time and conversion.
Awesome slide #1: Overall, converted shoppers received pages that loaded 2X faster than non-converted shoppers.
This should be of interest to anyone who’s asked themselves how to set targets for their own performance.
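If you have your own RUM beacons, the underlying comparison is easy to reproduce. Here’s a minimal sketch using invented session records — the deck doesn’t describe Walmart’s actual data schema, so the field names are my own:

```python
# Sketch: compare median load times for converted vs. non-converted sessions.
from statistics import median

sessions = [  # hypothetical RUM session records
    {"load_s": 3.2, "converted": True},
    {"load_s": 4.1, "converted": True},
    {"load_s": 3.8, "converted": True},
    {"load_s": 7.9, "converted": False},
    {"load_s": 6.5, "converted": False},
    {"load_s": 9.2, "converted": False},
]

buyers = median(s["load_s"] for s in sessions if s["converted"])
non_buyers = median(s["load_s"] for s in sessions if not s["converted"])
print(f"buyers: {buyers:.1f}s, non-buyers: {non_buyers:.1f}s "
      f"({non_buyers / buyers:.1f}x slower)")
```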
Awesome slide #2: The above trend persists, even on individual pages that experience greater load times.
An example is shown here, in which a specific page loaded almost 2 seconds faster for buyers than it did for non-buyers. (There’s nothing I love more than seeing parallel lines on a graph like this.)
Awesome slide #3: Non-buyers were served category pages that were 2-3 seconds slower than those served to buyers.
I really appreciate the comprehensiveness of this set of slides. It’s a good proxy for measuring page flows, which is a much trickier task, due to the massive number of permutations involved. (Learn how you can do something similar using WebPagetest.)
Awesome slide #4: Bounce rate strongly correlates to page speed.
No huge surprise here, but great to see this validated.
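If you collect bounce flags alongside load times, this correlation is straightforward to check yourself. A sketch with made-up beacon data, bucketing load times into 4-second bands:

```python
# Sketch: bounce rate by load-time bucket (hypothetical beacon data).
from collections import defaultdict

beacons = [  # (load time in seconds, did the visitor bounce?)
    (1.8, False), (2.4, False), (3.9, True), (4.2, False),
    (6.1, True), (7.3, True), (9.8, True), (11.5, True),
]

buckets = defaultdict(lambda: [0, 0])  # bucket start -> [bounces, total]
for load_s, bounced in beacons:
    b = buckets[int(load_s // 4) * 4]  # 4-second-wide buckets
    b[0] += bounced
    b[1] += 1

for start in sorted(buckets):
    bounces, total = buckets[start]
    print(f"{start}-{start + 4}s: {bounces}/{total} bounced "
          f"({100 * bounces / total:.0f}%)")
```

A table like this, plotted over real traffic, is exactly the kind of graph that appears in the deck.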
Conclusions: Optimization resulted in improved conversions, revenue, and SEO
You may be happy to know that the Walmart team not only hit their SLA at the end of their optimization sprint, they actually beat it by almost 3 seconds. They note the following benefits:
- For every 1 second of improvement they experienced up to a 2% increase in conversions
- For every 100 ms of improvement, they grew incremental revenue by up to 1%
- SEO benefits for entry pages and reduced bounce rates
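Taken at face value, those per-unit rates add up quickly. A back-of-envelope calculation — the rates come from the slides, but the 3-second figure and the linear scaling are my own simplifying assumptions:

```python
# Back-of-envelope: apply the reported per-unit rates to a hypothetical
# 3-second improvement (roughly the margin by which the SLA was beaten).
# Assumes the rates scale linearly, which the slides don't claim.

seconds_saved = 3.0

# Up to 2% more conversions per second of improvement.
conv_uplift = 0.02 * seconds_saved
print(f"conversion uplift: up to {conv_uplift:.0%}")

# Up to 1% incremental revenue per 100 ms of improvement.
rev_uplift = 0.01 * (seconds_saved * 1000 / 100)
print(f"revenue uplift: up to {rev_uplift:.0%}")
```

Even if the real-world numbers land well below these ceilings, the business case makes itself.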
Do it yourself
Check out the original slide deck to learn how Walmart used RUM tools, first to baseline performance and its benefits, and then to measure the results of optimization.
If RUM isn’t an option for you, then read this post to learn how to identify and measure the performance of sample page flows using WebPagetest or HTTPwatch. And then watch this short video to learn how to use Google Analytics to perform a page speed/revenue analysis of the pages in your flows.
Before I sign off, I want to applaud the team at Walmart for providing this level of disclosure. Case studies like this are a massive boon for our industry. It’s always inspiring to see e-commerce giants joining the likes of Amazon and Shopzilla in their willingness to share data.