As marketers, we use data as our compass. It points us toward the actions, ideas and processes that ignite our marketing campaigns, and it steers us away from costly mistakes.
One common method for collecting this data — and, in turn, making informed decisions based on the visibility that data provides — is A/B split testing.
An A/B split test is an experiment that compares two variants of a single variable by presenting each variant to a randomly “split” audience over a specific period of time. As long as the experiment is controlled and unbiased, A/B split testing is a great way to understand which marketing tactics yield the best results for your business.
The process for A/B split testing is similar to the scientific method:
Form a hypothesis.
Test your hypothesis.
Analyze the data.
Draw a conclusion.
Take action based on the results.
Repeat.
The good news is that you don’t need to be a data scientist to execute accurate A/B split tests for your marketing tactics.
In marketing, A/B split testing is often applied to things like email subject lines, landing pages, calls-to-action and more.
For example, you could send two otherwise identical emails with different subject lines to a randomized list of contacts in your database to learn which subject line encourages the most recipients to open the email.
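To make that concrete, here's a minimal Python sketch of the randomized split, assuming your contacts are already exported as a simple list of email addresses. The split_audience helper and the generated contact list are illustrative, not part of any real email platform:

```python
import random

def split_audience(contacts, seed=None):
    """Randomly assign each contact to group A or group B (a 50/50 split)."""
    rng = random.Random(seed)
    shuffled = list(contacts)   # copy so the original list is untouched
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

# Illustrative contact list; in practice this comes from your database.
contacts = [f"subscriber{i}@example.com" for i in range(500)]

group_a, group_b = split_audience(contacts, seed=42)
print(len(group_a), "contacts receive subject line A (control)")
print(len(group_b), "contacts receive subject line B (challenger)")
```

Shuffling before splitting is what keeps the assignment random rather than, say, alphabetical, which is exactly the kind of hidden bias you want to avoid.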
You should document every split test you execute to keep track of how each one performs. Different documentation systems will work better for different marketers, but ideally, your documented split test should look something like this (a code-based version of the same record follows the example):
Hypothesis: If I add the word “free” to my email subject line, I’ll boost open rates by 10 percent. (Side note: This one’s debunked! Subject lines with the word “free” in them tend to get significantly lower engagement rates than those without.)
Variable A: Subject line “Get your e-book on Inbound Marketing”
Variable B: Subject line “Get your free e-book on Inbound Marketing”
Sample size: 500 random blog subscribers
Duration: 1 email send
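If you'd rather log your tests programmatically, a lightweight record like the following Python dataclass captures the same fields. The SplitTest structure and its field names are a suggested sketch, not part of any particular tool:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SplitTest:
    """One documented A/B split test, mirroring the fields above."""
    hypothesis: str
    variable_a: str          # the control
    variable_b: str          # the challenger
    sample_size: int
    duration: str
    started: date = field(default_factory=date.today)
    result: str = ""         # fill in once the test concludes

test = SplitTest(
    hypothesis="Adding the word 'free' to the subject line boosts open rates by 10 percent.",
    variable_a="Get your e-book on Inbound Marketing",
    variable_b="Get your free e-book on Inbound Marketing",
    sample_size=500,
    duration="1 email send",
)
print(test.hypothesis)
```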
In this example, variable A is the “control” variable, whereas variable B is the “challenger.” The sample size is an audience of email contacts with similar attributes — they’re all subscribed to your blog — but to ensure an unbiased result, you’ll need to send out the control and the challenger email to random contacts within that audience.
HubSpot Enterprise includes an A/B split testing tool that helps you determine your sample size and automatically splits your audience for you. If you’re testing something that doesn’t have a fixed audience, such as a landing page, just make sure you define the right length of time to run your test.
Typically, we’d recommend running a split test for three to six months. The more data you can gather, the more confident and accurate a conclusion you can draw from the final result.
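If you'd like a back-of-the-envelope sense of how large a sample you need, the standard two-proportion power calculation is a useful sketch. This assumes a baseline open rate and the minimum lift you care about detecting, and it uses only Python's standard library:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_group(p1, p2, alpha=0.05, power=0.8):
    """Contacts needed in EACH group to detect a move in open rate from
    p1 to p2, using the standard two-proportion power calculation."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # desired statistical power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2)

# Example: baseline 20% open rate, and you want to detect a lift to 22%.
print(sample_size_per_group(0.20, 0.22))  # about 6,500 contacts per group
```

Notice how quickly the required sample grows as the lift you want to detect shrinks; that's why small differences need long-running tests.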
The two most common and harmful mistakes that marketers make when they do A/B testing are:
Skewing their data by forgetting to account for outliers and biases
Making drastic, wholesale changes based on a small A/B test
For example, December tends to be a low-performing month for us at New Breed because of the holidays. If we were to perform an A/B test during a few weeks in December without considering the historical context of that month, we might draw the wrong conclusions from the data we collected. However, we could account for that by extending the timeframe for our test to include the following three months as well.
Ultimately, getting as close as possible to an apples-to-apples comparison is critical for running an accurate A/B split test. If you’re not looking at comparable data sets, you won’t have the concrete information you need to take action based on the result.
And if you’re not looking at a large enough data set, you risk getting a false result — it might be OK to change a word in your email subject line based on a small sample size, but you wouldn’t want to redesign your entire website based on a week-long A/B test that only reached a small fraction of your usual web visitors.
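One way to guard against a false result is a quick significance check before acting on any test. Here's a sketch of the standard two-proportion z-test using only Python's standard library; the open and send counts below are purely illustrative:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_p_value(opens_a, sends_a, opens_b, sends_b):
    """Two-sided p-value for the difference between two observed open
    rates, via the standard two-proportion z-test."""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    pooled = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Illustrative counts: 52 of 250 opens for the control vs. 61 of 250 for
# the challenger. The gap looks promising but isn't statistically significant.
print(round(two_proportion_p_value(52, 250, 61, 250), 3))  # ~0.336, above 0.05
```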
A/B split testing is a key driver for optimization — one of the core principles of inbound marketing.
By continuously measuring the performance of your marketing campaigns and readjusting your approach as needed, you can move closer and closer toward the best possible result for your business. Remember your high-level business goals while you’re A/B testing and use your data as a guide toward those goals. Finally, if you think you're ready to accelerate your testing and experimentation, then check out multivariate testing.