A/B testing to improve conversion


Consider A/B testing if you’re looking for a way to learn more about your users and improve conversion on your website.

What is A/B Testing?

A/B testing compares different variations of a web page to see which is more successful. Comparing variations involves sending 50% of the users who visit your website to variation A (often the existing design, or the control) and the other 50% to variation B. You can test three or four variations as well, but be wary of how much traffic your website gets and how long it might take to reach an adequate sample size.
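To make the 50/50 split concrete, here is a minimal sketch in TypeScript. It assumes each visitor has a stable ID (for example, from a first-party cookie) and hashes that ID so the same visitor always lands in the same bucket; the hash function and visitor ID are illustrative only, not taken from any particular testing tool.

```typescript
// A minimal sketch of a 50/50 split, assuming each visitor has a stable ID.
// Hashing the ID keeps the assignment deterministic, so the same visitor
// always sees the same variation.

type Variation = "A" | "B";

// Simple non-cryptographic hash (FNV-1a), good enough for bucketing.
function hashVisitorId(id: string): number {
  let hash = 2166136261;
  for (let i = 0; i < id.length; i++) {
    hash ^= id.charCodeAt(i);
    hash = Math.imul(hash, 16777619);
  }
  return hash >>> 0; // force to an unsigned 32-bit integer
}

function assignVariation(visitorId: string): Variation {
  return hashVisitorId(visitorId) % 2 === 0 ? "A" : "B";
}

// Usage: route the visitor to the matching page variant.
console.log(`Show variation ${assignVariation("visitor-123")}`);
```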

After an appropriate amount of time or number of visits, you compare the conversion rate (the percentage of users who completed a specific task) of each variation to determine the winning design. A/B testing is sometimes referred to as champion/challenger testing, where the winning design (the champion) from a previous round of A/B testing competes against a new design (the challenger).
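As a rough illustration of that comparison, here is a small sketch of computing and comparing the conversion rates of two variations; the visitor and conversion counts are made-up example numbers, not data from the article.

```typescript
// A minimal sketch of comparing conversion rates once a test has run.
// The visitor and conversion counts below are made-up example numbers.

interface VariationResult {
  name: string;
  visitors: number;    // users sent to this variation
  conversions: number; // users who completed the chosen task
}

function conversionRate(result: VariationResult): number {
  return result.conversions / result.visitors;
}

const control: VariationResult = { name: "A (champion)", visitors: 5000, conversions: 150 };
const challenger: VariationResult = { name: "B (challenger)", visitors: 5000, conversions: 190 };

for (const result of [control, challenger]) {
  console.log(`${result.name}: ${(conversionRate(result) * 100).toFixed(2)}% conversion`);
}
// A (champion): 3.00%, B (challenger): 3.80% -- B wins this round,
// provided the difference is statistically significant.
```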

How A/B testing works

A/B Testing vs. Usability Testing

A/B testing has a few benefits over typical usability testing. You’re testing real user behaviors in natural environments, which is something that can never truly happen with usability testing. It’s a quick and easy way to find out what’s most effective for your users, although you first need a fully functional website that’s been through quality assurance (QA) in order to get accurate results.

A/B testing will tell you what your users prefer, but it won’t tell you why they prefer it; that’s where usability testing comes in. It’s best to use A/B testing in combination with usability testing: test your concept first with usability testing, then refine the design with A/B testing.

What to Test and Measure

Before you begin A/B testing, you’ll need to decide what you’re testing and how you’ll measure success. Think about which conversion rate you want to improve and possible design solutions that might increase that rate. There’s a lot you can A/B test on your website, including wording, colour, images, the size and layout of elements, or even the overall design aesthetic. For example, if you wanted to increase the number of newsletter signups, you might try moving the signup call to action to the top of the page or changing the subscribe button to a different colour. A/B testing can be a great way to settle design decisions if you’re caught between two options.

You may be deciding between two completely different design concepts, or just the wording of a call to action. For best results, choose one goal to test at a time; trying to measure more than one conversion rate in the same test makes the results much harder to interpret.
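One way to keep a test focused on a single goal is to record a conversion event only when that goal is completed, tagged with the experiment and variation. The sketch below assumes a hypothetical logConversion() helper and a hypothetical #newsletter-form element; in practice your testing tool provides its own event-tracking call.

```typescript
// A minimal sketch of recording a single conversion goal per test. The
// logConversion() helper is hypothetical and stands in for whatever
// endpoint your analytics or testing tool actually provides.

interface ConversionEvent {
  experiment: string;   // e.g. "newsletter-signup-cta"
  variation: "A" | "B";
  goal: string;         // the one goal this test measures
  timestamp: number;
}

function logConversion(event: ConversionEvent): void {
  // In practice you would send this to your testing tool;
  // console.log stands in for that call here.
  console.log("conversion", JSON.stringify(event));
}

// Fire the event only when the user completes the chosen goal,
// e.g. submitting the newsletter signup form.
document.querySelector("#newsletter-form")?.addEventListener("submit", () => {
  logConversion({
    experiment: "newsletter-signup-cta",
    variation: "B",
    goal: "newsletter-signup",
    timestamp: Date.now(),
  });
});
```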


How to Run an A/B Test

Once you’ve decided what you want to improve and what you’re going to test, you’ll need to prepare each design variation and choose a tool to run the test with. There are many tools out there (Unbounce, Google Analytics Experiments, and VWO to name a few) and they all have their strengths and weaknesses. Do your research and choose a tool that best suits your needs and budget.

Each tool will report a statistical confidence level while the test is running; the higher the confidence level, the less likely it is that the difference you’re seeing is due to chance. Ideally, you should run the test until you reach at least a 95% confidence level. Another option is to use an online calculator to determine how long you should run the A/B test. When the test is complete, compare the conversion rates of each variation to decide on a winning design.
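Under the hood, the confidence figure these tools report typically comes from a significance test on the two conversion rates. Here is a rough sketch of one common approach, a two-proportion z-test; the normal CDF approximation and the example numbers are illustrative only and are not how any specific tool computes its rating.

```typescript
// A minimal sketch of the kind of calculation behind a confidence rating:
// a two-proportion z-test on the conversion rates of variations A and B.
// The example counts are illustrative, not from the article.

function normalCdf(z: number): number {
  // Abramowitz-Stegun style approximation of the standard normal CDF.
  const t = 1 / (1 + 0.2316419 * Math.abs(z));
  const d = 0.3989423 * Math.exp((-z * z) / 2);
  const p = d * t * (0.3193815 + t * (-0.3565638 + t * (1.781478 + t * (-1.821256 + t * 1.330274))));
  return z > 0 ? 1 - p : p;
}

function confidenceLevel(
  visitorsA: number, conversionsA: number,
  visitorsB: number, conversionsB: number
): number {
  const rateA = conversionsA / visitorsA;
  const rateB = conversionsB / visitorsB;
  const pooled = (conversionsA + conversionsB) / (visitorsA + visitorsB);
  const standardError = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  const z = (rateB - rateA) / standardError;
  // One-sided confidence that B genuinely beats A.
  return normalCdf(z);
}

// Example: 150/5000 conversions on A vs 190/5000 on B.
const confidence = confidenceLevel(5000, 150, 5000, 190);
console.log(`Confidence: ${(confidence * 100).toFixed(1)}%`); // keep running until this reaches ~95%
```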

Tip: Show repeat visitors a consistent design for the duration of the test. Being shown a different design on every visit can be disorienting and may skew the results. Similarly, if the design variations include a major change that might alarm regular users of your website, consider running the test with new visitors only.
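A simple way to keep repeat visitors on the same design is to persist the assignment the first time a visitor is bucketed. The sketch below does this with localStorage; in practice, most testing tools handle this for you, usually with a cookie.

```typescript
// A minimal sketch of keeping repeat visitors on the same variation by
// storing the assignment in localStorage on their first visit.

type Variation = "A" | "B";

function getStickyVariation(experiment: string): Variation {
  const key = `ab-${experiment}`;
  const stored = localStorage.getItem(key);
  if (stored === "A" || stored === "B") {
    return stored; // repeat visitor: reuse the earlier assignment
  }
  const assigned: Variation = Math.random() < 0.5 ? "A" : "B";
  localStorage.setItem(key, assigned);
  return assigned;
}

console.log(getStickyVariation("newsletter-signup-cta"));
```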


Case Studies

Company: Birchbox
Goal: Increase clicks to the Birchbox website via social sharing.
A/B test variations: Variation A, the original design, used the standard and uninviting heading “Deluxe Beauty Samples”, while variation B used the heading “The Perfect Holiday Gift” to entice people.


Birchbox’s share widget heading variations. Image Credit: Friendbuy


Results:
Referral traffic and referral conversions from social media doubled with “The Perfect Holiday Gift” heading.

Company: ZAGG.com
Goal: Increase the number of “add to cart” clicks.
A/B test variations: A 360-degree product view was used in variation A, and a video of the product was used in variation B.


ZAGG.com’s product image display variations. Image Credit: Optimizely Blog


Results:
Variation A, the 360-degree product view, increased the average order price by over 10%.

A/B testing can help improve your conversion rate significantly, often with only a simple change to your design. Keep testing regularly, and combine A/B testing with usability testing to maximize effectiveness and make sure your website is still meeting your users’ needs.

If you’re interested in learning more about other forms of user research, watch our webinar series Putting Users in UX: Research Methods in Strategy, Research Methods for Design and Planning, Conducting, and Analyzing User Research.


References:

https://vwo.com/ab-testing/

http://www.smashingmagazine.com/2010/06/the-ultimate-guide-to-a-b-testing/

https://www.maxymiser.com/resources/ab-testing

https://www.friendbuy.com/blog/a-simple-ab-test-to-optimize-social-sharing-and-referral-performance/

https://blog.optimizely.com/2013/12/06/ab-testing-optimize-product-image/