Does A/B testing actually work?
A/B testing has become widely used over the past three years; according to a recent study by Econsultancy, 67% of businesses use A/B testing to optimise their conversion rates.
However, 90% of A/B testers say that they’re unsatisfied with the results!
Does that mean that A/B testing just doesn’t work? Yes…except for 10% of people using it!
Of course, A/B testing isn’t a magic solution that waves its little wand the moment you take out a subscription with AB Tasty or Optimizely.
Take the example of Laurence, who works in the marketing department of an online shoe shop; he discovered A/B testing in 2014:
One Monday in September, I made an important decision: to introduce A/B tests on my business’s website!
After some Google research on the A/B testing services available, I drew up a shortlist: AB Tasty, Kameleoon, Optimizely and VWO.
In the end I went for AB Tasty because our office is based in France; supporting the industry and all that!
The starter pack wasn’t expensive: only €29 for testing 5,000 visitors. Seemed pretty reasonable for a first test…
After a quick sign-up process, I arrived on a large, all-white page with just one visible button: “Create a campaign”. There was also a video and tutorial section to the right of the screen, but the explanations were a bit too… “technical” for me at this point!
I was asked “What do you want to test on your site?”
Most of the online articles I’d read talked about how simply changing the colour of a call-to-action button can increase conversions by 300%, so I thought I’d start with something simple like that.
After just 3 hours, my credit of 5,000 visitors had run out and the AB Tasty interface was showing a very low reliability score, which of course meant I would need to keep testing to get a trustworthy result.
But how much more testing would be enough? What volume of visitors did I need? It was all a bit vague on that front… so I upgraded to a €99 pack to try to get some satisfying results from this first test.
The next day, the reliability level had gone up to ‘Green’ with an improvement rate of 27% showing! I was elated at this result and I went straight to my Manager to give them the good news: a 27% increase in revenue!
What I hadn’t realised was that this 27% increase wasn’t visible in the end-of-month revenue report; in fact, it barely showed in the figures at all, and my Manager wasn’t at all convinced that any uplift was down to my A/B testing…
In the end, my first experience with A/B testing wasn’t particularly conclusive. It made me realise that signing up to an A/B testing solution probably isn’t the way to go unless you know exactly what you want to test and how to run the test correctly.
Laurence’s experience is a classic one: A/B testing solutions aren’t a magic fix for your conversion rates; they are simply a tool with which you can ultimately reach your goal.
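Laurence’s “green light” is a good example of why it pays to check the numbers yourself. As a rough illustration – this is a generic two-proportion z-test sketch with made-up figures, not how AB Tasty computes its reliability score – here is how easily a large observed lift on a small test can be plain noise:

```python
from statistics import NormalDist

def lift_p_value(visitors_a, conversions_a, visitors_b, conversions_b):
    """Two-sided two-proportion z-test: the probability of observing a
    difference at least this large if A and B actually convert identically."""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    # Pooled conversion rate under the "no real difference" hypothesis
    p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = (p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# A 5,000-visitor test split in two, with a ~26% observed lift (hypothetical figures):
p = lift_p_value(2500, 50, 2500, 63)
print(p)  # well above 0.05: the "lift" could easily be chance
```

With the same conversion rates but 100,000 visitors per variation, the same function returns a p-value far below 0.05 – it’s sample size, not the size of the observed lift, that makes a result trustworthy.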
5 things to remember about A/B testing:
- Without a method for optimising conversion rates, a structured analysis and proper preparation, an A/B testing service will only waste your time and money.
- A pack that only tests 5,000 visitors won’t be of much use – unless your conversion rate is already around 10% and your variation somehow achieves a 15% relative lift! Bear in mind that even a 10% increase in your conversion rate would be quite a remarkable feat. You can check the figures for yourself with the calculator at AB Testguide.
- Changing the colour of a button is one of the classic fixes touted on blogs, but – unless your site has some major design faults – this alone won’t deliver a sustained improvement in your conversion rate.
- A/B testing is invaluable for testing well-founded improvement ideas, but you should always check the statistical validity of your results – otherwise you risk waking up each morning convinced you’ve doubled your sales!
- A/B testing isn’t always applicable: if your site doesn’t generate enough conversions, starting A/B tests won’t be of much help, as you’ll never reach statistical significance – that is, a large enough volume of conversions to genuinely validate your results. In this case, you’d be better off looking at approaches based on neuroscience.
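To put a figure on “a large enough volume”, the textbook sample-size formula for comparing two conversion rates can be sketched in a few lines (the function name and the default thresholds below are my own choices, not any vendor’s):

```python
from statistics import NormalDist

def sample_size_per_variant(baseline_rate, relative_lift, alpha=0.05, power=0.8):
    """Approximate visitors needed in EACH variation to detect the given
    relative lift with a two-sided two-proportion z-test."""
    p1 = baseline_rate
    p2 = baseline_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # significance threshold
    z_beta = NormalDist().inv_cdf(power)           # statistical power
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
          + z_beta * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
         / (p1 - p2) ** 2)
    return int(n) + 1

# A 2% baseline conversion rate and a hoped-for 10% relative lift:
print(sample_size_per_variant(0.02, 0.10))  # roughly 80,000 visitors per variation
```

Against numbers like these, a 5,000-visitor starter pack is an order of magnitude short for a typical e-commerce conversion rate – which is exactly why a first test like Laurence’s couldn’t settle anything.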