What does a late 19th-century merchant have to do with modern-day click rates?
Over a century ago, a merchant named John Wanamaker observed that half the money he spent on advertising was wasted. The question was, which half? The decades that followed revealed just how challenging it is to measure return on investment.
Out of this problem was born the A/B test: a method of comparing two versions of something to determine which performs better. Fast forward to today: the A/B test is one of the most popular ways to run a controlled experiment, making for more effective marketing campaigns.
Why do A/B tests?
Good data doesn’t lie.
A/B tests are an opportunity to assess the success of your marketing methods and get to know your audience more deeply. They are perfect for optimizing websites, e-commerce, and advertising.
What to test?
- Calls to action
- Pricing and offers
- Subject lines for emails
- Graphics and placement
- Timing of email and mail
- …and the list goes on! The possibilities are endless.
How to do the test:
Treat the A/B test as a sporting event with two competing teams. You set up the playing field and let the variables play the game. Here are the four main things to know when conducting an A/B test.
Determine your metric for success.
If your goal is to get more form-fills on your website, that is the result you focus on. Be clear about what your goal is so that other data doesn’t cloud the question you’re trying to answer.
Pick what variables are being tested.
Colors? Copy? Images? Create a question or hypothesis on which to base the test. If you are testing whether larger text on an ad will get more clicks, that is your basis for the test. All other variables must stay the same for the two groups (A and B). Use existing data to gauge what has already been successful, so you can test variables you’re not sure about. Test variables you believe will significantly increase your performance, not ones that will have only a minor impact. Use the test to educate and improve your strategy.
Have appropriate test groups and splits.
If you’re not sure which subject line will get the best open rate, do a 50/50 split. Take a sample of your entire list (we recommend 10-15% of the total list per segment). If you have a good idea of which will be the winner (and are doing the A/B test to confirm that), you can do an uneven split of up to 95/5. Send the same campaign with the different subject lines to each group.
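The sampling and split above can be sketched in a few lines of Python. The function name, the 15% sample size, and the 50/50 default are illustrative choices, not a fixed recipe:

```python
import random

def ab_split(contacts, sample_frac=0.15, a_frac=0.5, seed=42):
    """Draw a test sample from a contact list and split it into
    groups A and B. Defaults are illustrative: 15% of the list
    is tested, split 50/50 (set a_frac=0.95 for a 95/5 split)."""
    rng = random.Random(seed)  # seeded so the split is reproducible
    sample_size = int(len(contacts) * sample_frac)
    sample = rng.sample(contacts, sample_size)  # no contact appears twice
    cut = int(sample_size * a_frac)
    return sample[:cut], sample[cut:]

contacts = [f"user{i}@example.com" for i in range(10_000)]
group_a, group_b = ab_split(contacts)
# 15% of 10,000 contacts = 1,500, split into two groups of 750
```

Random sampling matters here: slicing the first N names off an alphabetized list can quietly bias the groups.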
Test one element at a time, at the same time.
To get the most accurate results, level the playing field. If you are testing emails, for example, send each option on the same day and at the same time of day. That way, other factors don’t meddle with the results.
Achieve statistical significance.
Have enough volume in both groups for your results to be statistically valid. Give the test time to render results. When you see the result you’re looking for, pay attention to other trends as well. For example, if a certain subject line gets a higher open rate but also more unsubscribes, that is worth noting.
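One standard way to check whether a difference in open rates is real or just noise is a two-proportion z-test. A minimal sketch using only Python’s standard library, with made-up open counts for illustration:

```python
from math import sqrt, erf

def two_proportion_z_test(opens_a, n_a, opens_b, n_b):
    """Two-sided z-test for the difference between two rates
    (e.g. open rates). Returns the z statistic and p-value;
    a p-value below 0.05 is a common threshold for significance."""
    p_a = opens_a / n_a
    p_b = opens_b / n_b
    # Pooled rate under the null hypothesis that A and B are equal
    p_pool = (opens_a + opens_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: subject line A opened 180/750, B opened 240/750
z, p = two_proportion_z_test(180, 750, 240, 750)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With those hypothetical numbers the p-value falls well below 0.05, so the difference would count as significant. With far smaller groups, the same percentage gap often would not, which is why volume matters.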
Before any kind of digital marketing existed, John Wanamaker had the foresight to improve the advertising field. Each A/B test we run today is a nod to him and his contribution to history.
Imagine how future generations will benefit from your marketing strategy today.
Want to get your A/B test started? We’ve got you covered.