A/B Testing Emails – What Works?


A/B testing is one of the most effective ways to make measurable improvements to your emails. When executed correctly, it gives marketers concrete evidence of which tactics work on their audience(s) and which don’t. As a result, marketers are constantly told to “test, test, test”; however, eConsultancy’s 2012 Email Marketing Industry Census concluded that only 16% of marketers test frequently. If the ROI on A/B testing can be so high, why are so few marketers doing it on an ongoing basis?

Here at Litmus, we’ve had some difficulties with A/B testing, and we suspect we’re not the only ones. Our biggest challenge has been that out of all the A/B tests we’ve run, only one has produced conclusive results. Like the other 72% of marketers surveyed for the 2012 Email Marketing Benchmark Report, we typically test our subject lines when doing A/B testing. Here are some examples of the subject line tests we’ve tried:

  • April Newsletter
    • Subject Line A: Squash the ‘image gap bug’ + test your emails in Chrome
    • Subject Line B: Sneaky Yahoo! ads, Chrome testing for email, and a chance to win a Litmus tee
    • What we tested: More aggressive/attention grabbing language (example: “sneaky”) vs. CTA language
  • May Newsletter
    • Subject Line A: Get your email opened: First impressions, subscriber engagement infographic + Interactive Testing sneak peek
    • Subject Line B: Get your email opened: First impressions + new infographic
    • What we tested: Length of subject line
  • June Newsletter
    • Subject Line A: Mobile opens take the lead: new market share stats
    • Subject Line B: Mobile opens take the lead with 80% increase over 6 months
    • What we tested: Using statistics vs. not using statistics in the subject line

Unfortunately, none of these tests yielded significant results. This may be a result of our loyal audience opening our newsletters regardless of what the subject line is (we hope!), or perhaps our testing methodology needs some work! Check out the (very inconclusive!) results from the May Newsletter:

May 2012 Newsletter - A/B Test Results

According to Silverpop, successful A/B tests should compare the same variables, and they should be run multiple times in order to minimize outside factors that could affect the outcome. The April Newsletter test did not compare the same variables, which may explain why it was inconclusive; however, both the May and June Newsletter tests did compare the same variables, so I’m not sure why those didn’t yield results either. Perhaps if we tested the same variable numerous times (for example, testing subject line length across more than one campaign), we would have seen more conclusive results.
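
To make “conclusive” a little more concrete: a common way to judge a subject line split is a two-proportion z-test on open rates. Below is a minimal sketch in Python, using made-up send and open counts (none of these figures come from our campaigns, and the function name is just for illustration).

```python
# A minimal sketch of checking whether an A/B subject line test is conclusive.
# The counts below are hypothetical, not real campaign numbers.
from math import sqrt, erf

def two_proportion_z_test(opens_a, sends_a, opens_b, sends_b):
    """Return the z statistic and two-sided p-value for the difference
    between two open rates (a standard two-proportion z-test)."""
    p_a = opens_a / sends_a
    p_b = opens_b / sends_b
    # Pooled open rate under the null hypothesis that A and B perform the same
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: 2,100 opens of 10,000 sends vs. 2,000 opens of 10,000
z, p = two_proportion_z_test(2100, 10000, 2000, 10000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p < 0.05 would suggest a real difference
```

With open-rate gaps of only a point or two, a calculation like this will often land well above the usual 0.05 threshold, which is roughly what our “inconclusive” subject line tests look like in practice.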

Testing call to action/button language

The only A/B test that has given us significant results was one on the text of a button in our July newsletter.

A/B Testing Results

Here are the results from this test:

A/B Testing Chart

In this test, we tried different language on our main CTA. We wanted to see if the psychology behind button language could impact clicks. Version A encouraged our subscribers to use our products (“Start testing”) while version B promoted learning about a topic (“Read our overview”). From this A/B test, we learned that our subscribers are much more responsive to CTAs that focus on education rather than features. Also, there’s a bit more commitment involved in running a test than there is in reading an article. In order to draw a stronger conclusion, we’ll have to test this theory a few more times to see if it yields the same results.
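
Part of “testing this a few more times” is knowing how big each half of the list needs to be before a repeat test can realistically settle the question. Here is a back-of-the-envelope sketch using the standard two-proportion sample size approximation; the baseline click rate and target lift are hypothetical, and this is not a Litmus tool.

```python
# A rough sketch (hypothetical numbers) of estimating how many recipients each
# variant needs before a test can detect a given lift in click rate.
# Uses the standard normal approximation for comparing two proportions.
from math import sqrt, ceil

def sample_size_per_variant(baseline_rate, minimum_lift,
                            z_alpha=1.96, z_power=0.84):
    """Recipients needed per variant to detect an absolute lift in click rate
    at ~95% confidence (z_alpha) with ~80% power (z_power)."""
    p1 = baseline_rate
    p2 = baseline_rate + minimum_lift
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_power * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / minimum_lift ** 2)

# Hypothetical example: 4% baseline click rate, detect a 1-point absolute lift
print(sample_size_per_variant(0.04, 0.01))  # roughly 6,700 per variant
```

Because the required sample grows with the square of the inverse of the lift, halving the lift you want to detect roughly quadruples the list size you need, which is one reason small refinements are so hard to prove on a single send.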

We’re at the beginning of a very long journey of A/B testing. We want to optimize our emails for our subscribers, and A/B testing is definitely a way to move towards that goal. We’d love to hear about your own journey with A/B testing: what types of tests have you run, and what kind of results have you seen?

Additional Resources

  • How to determine the right sample size by Silverpop.
  • 101 things you could be testing by Email Insider.
  • Answers to the 19 most frequently asked questions about A/B testing by HubSpot.
Comments

Anonymous Coward

With the business I work for, we have done a lot of split testing and found that, although we could massively increase the click-through rate, the sales rate remained the same. We tried many variations of text, design, and subject lines. The only thing that seemed to increase sales was the “triple-slam” technique, where you tell your prospects that something will expire soon, like a promotion ending in a few days, then remind them again a few days later, then send one last email with a “today is the last day” type of subject.