A/B testing is the go-to technique for improving online conversion. It lets you test your ‘improve performance’ ideas and determine which ones actually produce the results you’re after. The standard approach is to test two variations, A and B, to determine which gives the better results. Put simply, an A/B test splits your traffic into two equal halves, sending one half to variation A and the other half to variation B. By comparing the results from each variation, you can conclude which version performs best.
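To make the 50/50 split concrete, here is a minimal sketch in Python of one common way to assign visitors to variations. The `assign_variation` helper name and the user IDs are made up for illustration; the idea is simply that hashing a user ID, rather than flipping a coin on every visit, keeps a returning visitor in the same variation.

```python
import hashlib

def assign_variation(user_id: str) -> str:
    """Deterministically assign a user to variation 'A' or 'B'.

    Hashing the user ID (rather than choosing randomly on each visit)
    means the same visitor always sees the same variation.
    """
    digest = hashlib.md5(user_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# A given user ID always maps to the same bucket.
print(assign_variation("user-42"))
```

Because the hash output is effectively uniform, a large pool of users lands roughly half in A and half in B, giving you the equal split described above.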
Why A/B Test
Why even A/B test? Why not just make a change and see if you get an improvement? For example, why not change the call to action (CTA) on an email campaign and see if the results are better than the last campaign’s? Why not make a change on your store and see if sales go up? This type of approach, or ‘testing’, produces unreliable results and can lead you to the wrong conclusion, because you don’t know what else is changing. Multiple factors affect results; if you don’t control for the variability in these other factors, you cannot reliably conclude what caused the change in results.
Consider this example: you make a change to a regular email, conversion rates drop versus the last campaign, and so you conclude that the change reduced results. However, unknown to you, many of your email subscribers were on holiday and weren’t checking their emails – and the ones who did open the email converted at a higher rate, indicating the change actually improved results. Without an A/B test, you would draw the wrong conclusion.
In summary, to get reliable insights, you need to compare ideas under the same conditions – at the same time, with the same audience – to remove or control for other factors you’re not measuring or aware of. This is an A/B test.
How to A/B Test
Before starting your A/B test, begin with a theory or hypothesis of which version of your marketing asset will work best and why (your marketing asset being your landing page, email, blog post, etc.). Starting with a theory helps you learn, as you conduct your test, what works and what doesn’t. This newfound insight and understanding will help you create better A/B tests and achieve even better results in the future. If you don’t have a theory or hypothesis, you can start with a question, e.g., “why is my bounce rate so high?” This will allow you to build up information and understanding of what is happening to your users as they interact with your marketing asset.
It’s important to have a clear idea of what you’re optimizing for – what your goal is. Some common examples are getting the best click-through rate, reducing your bounce rate, achieving the best conversion rate, or generating the largest total sales. It’s unusual to be able to optimize for multiple goals simultaneously, so it’s important to select the one goal you’re aiming for. It’s also important to vary only one element between options A and B, such as the CTA text or an image on a landing page; otherwise you won’t know what is creating the improved performance.
To get the best results from A/B testing, I recommend starting with an understanding of how your pages are performing. Use Google Analytics to track traffic, referral sources, conversion rates and other insightful information. By observing how visitors interact with your site, you can gain insight into the areas that need improvement. To observe users, you can run heat maps and scroll maps to get a collated picture of their behavior. To get individual visitor insight, you can record individual user sessions with tools such as Fullstory and/or pay people to test and interact with your site using services such as UserBob. This last type of service has the advantage that you can specify the scenario you want ‘tested’ and the results come back with detailed commentary from the tester. Seeing exactly what a visitor does when she or he lands on a specific page can give you invaluable insights into where customers may be getting stuck or frustrated.
What to A/B Test
Now that you’re armed with knowledge of how your customers respond to your current marketing strategy from Google Analytics, and you have insights into the areas that need improvement from observing your users, you’re ready to generate ideas of what to A/B test. Essentially, you want to ‘guess’ what single change will produce the biggest improvement in performance, e.g., change the layout, change an image, change the heading etc. Once you have sufficient ideas, select the idea you think will generate the best results and A/B test it.
How long to A/B Test
You only need to run your A/B test until it’s clear that one variation is producing better results than the other. The standard approach is to continue until the results from each variation are statistically different. Essentially, this means the difference you’ve measured is likely to be ‘real’ and sustainable, as determined by statistics. Sometimes the results of a test, whilst positive, are unlikely to be repeatable as determined by statistics – which means the results are not useful to you. For example, if you toss a coin 10 times and it comes up heads every time, you might be tempted to conclude the coin always lands heads. But a streak like this can happen by pure chance (about 1 time in 1,024 for a fair coin) and is unlikely to repeat – which is why the statistical tests behind A/B testing demand enough evidence before declaring a result significant.
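To make ‘statistically different’ concrete, here is a minimal sketch in Python (standard library only) of a two-proportion z-test – one common way of comparing two conversion rates. The conversion counts and the `two_proportion_z_test` name are made up for illustration; by convention, a p-value below 0.05 is taken as statistically significant.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a / conv_b: number of conversions in each variation.
    n_a / n_b: number of visitors in each variation.
    Returns the z statistic and the approximate two-sided p-value.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference).
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: A converts 200 of 5,000 visitors, B converts 250 of 5,000.
z, p = two_proportion_z_test(200, 5000, 250, 5000)
print(round(z, 2), round(p, 4))
```

In this hypothetical example the p-value comes out below 0.05, so you could call B the winner; with smaller samples the same 4% vs 5% gap would not reach significance, which is exactly why tests need to keep running until enough data accumulates.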
If you’re using an A/B testing tool, the tool should tell you when you meet the statistical threshold. If you’re running the test yourself and don’t have statistical testing capability, you can graph the results to visualize the difference. What you’re looking for is a consistent pattern of ‘difference’. The graph below shows the results of an A/B test: A, with the Frankie personalized recommendation platform, vs B, without it. The graph shows a consistent improvement in items per order “with Frankie” vs “without Frankie”. Statistical tests of these results also indicated a statistically significant improvement with Frankie.
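If you want to build such a graph yourself, one simple approach – sketched below in Python with made-up outcome data and a hypothetical `running_rates` helper – is to plot the cumulative conversion rate of each variation as the data arrives. A gap between the two lines that stays steady as observations accumulate is the consistent pattern of difference you’re looking for.

```python
def running_rates(events):
    """Cumulative conversion rate after each observation.

    `events` is a list of 0/1 outcomes (1 = converted) in arrival order.
    Plotting these running rates for A and B side by side shows whether
    one variation maintains a steady lead over the other.
    """
    rates, conversions = [], 0
    for i, outcome in enumerate(events, start=1):
        conversions += outcome
        rates.append(conversions / i)
    return rates

# Hypothetical outcome streams for each variation.
rates_a = running_rates([0, 1, 0, 0, 1, 0, 0, 0])
rates_b = running_rates([1, 0, 1, 0, 1, 1, 0, 1])
print(rates_a[-1], rates_b[-1])  # final running rates: 0.25 and 0.625
```

Feed each list of running rates into any charting tool (a spreadsheet is fine) to get the kind of two-line comparison graph described above.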
Always A/B Test
Marketers joke that A/B Testing stands for Always Be Testing, and as with all good jokes, there is an element of truth in it. As soon as you find one element that improves performance, you’ll find yourself wanting to test another element to continue to improve results.
Recommendations and A/B testing
When selecting a recommendation solution for your eCommerce store, we recommend you select a solution that uses A/B test results to demonstrate a measurable benefit to you, the merchant – like Frankie does. Ideally, you want to select a recommendation solution that has A/B testing built in – like Frankie. This allows you to conduct your own ongoing A/B tests to determine which recommendation algorithms generate the best conversion rates for your store.
Key Steps in A/B testing
I’ve covered a lot of ground on A/B testing, so please refer to this high level recap of the key steps in A/B testing. You may want to print it out and keep it handy:
- Start with an understanding of how your site is performing (use analytics, user recordings, heat maps etc)
- Set a goal (what you’re trying to improve)
- Generate multiple ideas of what single change will give you the best improvement
- Select the best idea to A/B test and create a variant for it – this is variant ‘B’
- Choose an A/B testing tool
- Set up your test
- Accumulate data from each variant
- Analyze the A/B results using statistics and graphs
- Implement the winner
- Repeat steps 2 to 9.
A/B testing is the most reliable way of determining which changes to your online assets generate improved results. I can’t recommend A/B testing strongly enough. A/B testing allows you to:
- learn what your audience needs to convert
- streamline the journey through your website, reducing friction at all stages
- improve your conversion rates and boost your bottom line
A/B testing is not a single event, but rather a philosophy. A/B testing allows you to improve the performance of your online assets, but it can do more. You should always be asking yourself if there is A/B data to support the effectiveness of an idea, a tool or any approach you’re considering. Adopting the philosophy of A/B will serve you well – get started on your A/B journey today.