
A/B testing is a lie. Here’s the truth

April 01, 2020
datasine

“In reality, A/B testing wastes huge amounts of money, time and resources in pursuit of the perfectly optimised creative”


The marketing industry has a dirty little secret: A/B testing isn’t as effective as we’ve been told.

In reality, the vast majority of teams implementing it just aren’t getting reliable data or actionable insights from the approach – and are wasting valuable time and resources in the process.

So, what can we do instead to make sure we’re using the very best ad creatives possible? 

A/B testing: The holy grail of marketing?

Finding the right creative for your audience is incredibly important for the success of your ads. According to Stand Up For Creative, a campaign’s success depends 70% on the creative (and just 30% on the media plan), so it’s critical that each campaign uses the best creative possible.

A/B testing is one practice for discovering the best creative that many data-driven marketers swear by. It’s all about finding the perfect image, text and CTA (among other things) for different audiences.

A/B testing works by testing two (or more) versions of an ad on the audience to see which version is the most engaging. Using the approach, teams can test which image, headline, CTA, text and more drives the highest performance for that ad.
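
For the statistically minded, here is a rough sketch of how the result of a simple two-variant test is typically judged: a two-proportion z-test on click-through rates. This is a generic illustration with made-up numbers, not a description of any particular platform’s method.

```python
# Generic illustration of judging an A/B result: a two-proportion z-test
# on click-through rates. All numbers below are hypothetical.
from statsmodels.stats.proportion import proportions_ztest

clicks = [480, 530]              # clicks on variant A and variant B
impressions = [10_000, 10_000]   # impressions served to each variant

z_stat, p_value = proportions_ztest(count=clicks, nobs=impressions)
print(f"z = {z_stat:.2f}, p = {p_value:.3f}")

# The difference between the two creatives is only treated as "real" if
# p falls below a chosen threshold (commonly 0.05). With these numbers,
# p is around 0.11 - the test is inconclusive despite 20,000 impressions.
```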

However, A/B testing isn’t the absolute answer we’ve been led to believe it is. In reality, the approach wastes huge amounts of money, time and resources in pursuit of the perfectly optimised creative. 

“Only 10% of A/B testing experiments result in actionable change”

While all the creative elements rotate in the A/B test, thousands, even millions, of potential customers end up seeing un-optimised creatives that don’t push the right message – and that don’t lead to a sale. This equates to many, many lost opportunities and wasted budget.


In fact, some of our clients say they waste up to 70% of their budget on A/B testing before they find the creative they eventually settle on. And many never find the perfect creative at all.

Here’s the truth:

  • Only 25% of A/B tests actually produce significant results (Appsumo)
  • Only 10% of A/B testing experiments result in actionable change (Northwestern IDS)
  • 50% of teams could not make decisions from A/B testing experiments due to inconclusive or poorly measured data (Northwestern IDS)

Since finding the perfect creative is so critical to the success of a campaign, what approaches can marketers use to make sure these creatives will hit the mark, without relying on flawed A/B testing?


Do less A/B testing for longer

One of the big issues with A/B testing is that it’s often not used correctly.

Northwestern IDS concluded that most companies are “running A/B tests too frequently for too little time, contributing to a high failure rate that makes A/B test results less valuable and meaningful”. 

“Less frequent A/B tests are more likely to result in successful experiments”

With this in mind, it makes sense to simply give A/B testing more time.

Less frequent A/B tests that last longer than a month are more likely to result in successful experiments that let you know what the best creative is. 
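
To see why time matters, here is a quick back-of-the-envelope power calculation (again with hypothetical numbers): how many impressions each variant needs before a modest lift can be detected reliably.

```python
# Rough power calculation: impressions needed per variant to detect a
# 10% relative lift on a 2% baseline CTR. All numbers are hypothetical.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

baseline_ctr = 0.020
lifted_ctr = 0.022   # the 10% relative improvement we hope to detect

effect = proportion_effectsize(lifted_ctr, baseline_ctr)
n_per_variant = NormalIndPower().solve_power(
    effect_size=effect, alpha=0.05, power=0.8
)
print(f"~{n_per_variant:,.0f} impressions needed per variant")

# With tens of thousands of impressions required per variant, a campaign
# serving a few thousand impressions a day needs weeks, not days, before
# the test can say anything reliable - which is why tests cut short
# rarely reach significance.
```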


That said, we probably don’t need to tell you that a month is a long time in a marketing team. If your message or approach pivots at any point, you’ll need to start the process from the beginning.

And even if you commit to a lengthy process of A/B testing, waiting a longer time to be sure you’re using a properly optimised creative can seriously delay the time it takes to see and action results.

Persona-led creative selection

Building out personas for your audience is one common approach to crafting creatives that will engage them – especially if those personas are built using the data your company holds on them.

With a better idea of who it is you’re trying to appeal to, you’ll have more guidance on what they’ll want to see from your creatives, meaning higher-performing ads, in theory.


However, even if you’ve done all the research possible and you’re using the most detailed personas you can build, you’re still left relying on guesswork to craft and select the best creatives.

Insights from panels 

Hearing from your audience themselves about what they find engaging in creatives can be very helpful in crafting genuinely effective creatives. 

This is why testing your creatives with panels can remove a great deal of the guesswork that goes into making ads. A/B testing no longer has to be the middleman when you know exactly what your customers want from creatives.

However, panels aren’t a magical solution. The creatives you are bringing to them have still been created using guesswork, after all. And they rely on people knowing exactly what their content preferences are. 

In reality, what people find engaging in creatives is not as fixed as we might think. A lot of their content preferences are unconscious, and they may struggle to identify in a panel setting which creatives they would actually click on ‘in the moment’.

Pre-optimising with creative data – and Content Atomisation

All of the approaches outlined above may be effective, but they share the same problem: the creatives being tested were created using guesswork in the first place.

“These approaches share the same problem: the creatives being tested were created using guesswork in the first place”

So, how can you take guesswork out of the loop altogether? 

Content Atomisation is a data-informed approach that has proven enormously effective at driving performance while removing guesswork from the equation.

Here’s how it works: Content Atomisation uses AI to analyse the performance of individual creative features. Once you understand what features are driving engagement, you understand exactly what your audience wants to see – from whole groups right down to the individual – and your messaging engages them. Every. Single. Time. 
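
As a purely illustrative sketch – not Datasine’s actual model – the idea of scoring individual creative features against past engagement might look something like this: tag each historical creative with simple features and fit a model to see which features correlate with above-average performance. The feature names and data below are hypothetical.

```python
# Illustrative sketch only - not Datasine's actual model. The idea: score
# individual creative features against past engagement, so you can see
# which features drive clicks rather than A/B testing whole creatives.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical historical data: each row is a past creative described by
# binary feature tags, with a label for above-average engagement.
feature_names = ["contains_face", "warm_colours", "price_in_headline", "urgent_cta"]
X = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 1],
    [1, 1, 1, 0],
    [0, 0, 0, 1],
])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = above-average engagement

model = LogisticRegression().fit(X, y)
for name, weight in zip(feature_names, model.coef_[0]):
    print(f"{name:>20}: {weight:+.2f}")

# Positive weights suggest features associated with higher engagement,
# which can guide which elements to combine in the next creative.
```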

This means A/B testing is no longer necessary, because with Content Atomisation you know exactly which creatives will perform for your audience.

Read more about how Content Atomisation works. 

At Datasine, we’ve created a tool that allows you to easily implement Content Atomisation – and early adopters have seen gains of up to 59% in conversions.  

Our tool lets you:

  • Access AI predictions of how well any number of creatives you upload to the platform will perform for your audience
  • Select highly-engaging assets from a variety of content libraries
  • See actionable insights into what content works for your brand – and why


Want to learn more? Organise a call with our friendly team to talk about the impact Content Atomisation can have for your brand now.


