The Importance of AB Testing in Marketing

June 30th, 2022

One of the things I’ve always been a firm believer in within marketing is the importance of AB testing. According to Wikipedia: “A/B testing is a user experience research methodology. A/B tests consist of a randomised experiment with two variants, A and B. It includes application of statistical hypothesis testing or ‘two-sample hypothesis testing’ as used in the field of statistics.”

In plain English, it’s the notion of doing the same thing in slightly different ways and measuring the impact each tweak has on the results. The idea is that you improve things like conversion rates by getting your campaign closer and closer to perfect.
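
To make the hypothesis-testing part concrete, here’s a minimal sketch in Python of the two-proportion z-test commonly used to judge whether variant B genuinely beats variant A. The numbers are made up for illustration; they aren’t from any of our campaigns.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for the difference between two conversion rates.

    |z| > 1.96 suggests the difference is significant at roughly
    the 95% confidence level.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally
    p = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative only: B converts 260 of 10,000 impressions vs 200 for A
print(f"z = {two_proportion_z(200, 10_000, 260, 10_000):.2f}")  # z = 2.83
```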

As marketers, we instinctively carry a lot of unseen biases, so it’s vital that we let the data tell the story and guide us towards the best possible results.

AB Testing, tested

Take Twitter ads as an example. Our recent ad campaigns have all been extensively AB tested, whether that’s the inclusion of a hyperlink, the hashtags, the images, or the copy. Even a slight rewording of the same sentence can make a huge difference to conversion.

“Are a few extra visitors worth the effort?” you might think, keen to apply a quantity-over-quality approach. Well, let’s let the statistics tell us the answer by comparing our worst-running ad with our best-running ad. We’re purely looking at site visits rather than conversions here, which only tells part of the story, but bear with me.

Ad #1 – Mexican Standoff

Our worst-performing ad did not perform well at all. It played on the notion that negotiating insurance with your broker can feel like a Mexican standoff in which everybody is wearing a blindfold. It didn’t land at all, costing us $5.13 for every visitor who clicked through to our website.

Whilst that’s still considerably less than the $56 Google Adwords cost per click for insurance, it amounted to a site visit rate of just 0.02%. Interestingly, it used exactly the same copy as our best-performing ad and similar visuals, but the background image was animated rather than static. As such, for every $100 we spent on this ad, we could expect 19.5 visitors to end up on our website.

Ad #2 – Your Business

Our best-running ad was more customer-focussed: about us making their business our business. The copy remained the same, pointing to a cost saving, but the image was more about them than it was about us. As a result, that advert was delivering at $0.02 per visitor – 256x more effective than the first one, over thousands and thousands of impressions. The result was a 21% site visit rate, driving more than 1,300 unique visitors to our website. That’s 5,000 visitors for every $100 spent.
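
For the spreadsheet-averse, the per-$100 figures are just the budget divided by each ad’s cost per visitor. A quick sketch in Python, using only the dollar amounts quoted above:

```python
BUDGET = 100.00  # dollars

# Cost per site visit for each ad, from the figures quoted above
ad1, ad2 = 5.13, 0.02

print(f"Ad #1: {BUDGET / ad1:,.1f} visitors per $100")  # 19.5
print(f"Ad #2: {BUDGET / ad2:,.1f} visitors per $100")  # 5,000.0
print(f"Ad #2 is {ad1 / ad2:.0f}x more effective")      # 256x
```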

Conclusion

Now, the important thing to understand is that these results sit at either end of the scale, and this was not simply an AB test. It was an ABCDEFGHIJKLMNOPQRSTUVWXYZ test, with more than 40 variations of the same ad running to establish which landed most effectively. The overall result was a total spend of $907 – equivalent to roughly 16 PPC clicks on Google Adwords – generating just shy of 24k unique visitors to our website.

Is AB testing worth the effort? Well, had we simply run Ad #1, that campaign would have delivered 176 unique visitors to our website. Conversely, had we simply run Ad #2, it would’ve generated 45,350. So – I think we can comfortably go with “Yes” as the answer to that question.
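
As a sanity check, the same division applied to the full budget reproduces those counterfactuals (the one-visitor gap from the quoted 176 is just rounding in the $5.13 cost per visitor):

```python
TOTAL_SPEND = 907  # dollars, spread across the 40+ ad variants
GOOGLE_CPC = 56    # quoted Google Adwords cost per click for insurance

print(f"Equivalent Google clicks: {TOTAL_SPEND / GOOGLE_CPC:.0f}")  # 16

for name, cost in [("Ad #1 only", 5.13), ("Ad #2 only", 0.02)]:
    print(f"{name}: {TOTAL_SPEND / cost:,.0f} unique visitors")
# Ad #1 only: 177 unique visitors
# Ad #2 only: 45,350 unique visitors
```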

To summarise? Don’t just run one ad and hope it works. You’re making yourself 256x less effective.

Find out more about Taveo