Use Testing to Continuously Learn More About Your Audience
To produce well-performing campaigns, it’s crucial to send emails that not only render well, but also resonate with your audience. But how do you know which content, calls-to-action (CTAs), and design work best for your audience? It’s simple: test! Through testing, you can gain insights into your subscribers and their preferences that help you send strategic, optimized, and better-performing campaigns.
The team over at Emerson, a manufacturing and technology company, wanted to generate interest in their product by offering a free trial via email. While they knew their B2B audience consisted mostly of conservative, middle-aged engineers, they were unsure which type of offer would resonate best—and ultimately produce the most leads.
So, they set out to test…and test…and test again! (Looking to test your emails? Try Litmus free.)
While their results were surprising, they now know which types of email perform well with their audience. More importantly, they have made testing a standard part of their process, allowing them to learn even more about their subscribers over time.
A/B TEST: FREE TRIAL VS. WHITE PAPER
Emerson’s first test was to see whether including a white paper alongside a free trial would generate more leads than simply a free trial alone. Their hypothesis was that the white paper would distract from the free trial, producing fewer leads. They sent 50% of their audience the “free trial only” email, and the other 50% received the combination free trial and white paper version.
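A 50/50 split like the one described above comes down to randomly dividing the subscriber list in half. A minimal sketch (the function name and seed are illustrative, not Emerson’s actual tooling):

```python
import random

def split_audience(subscribers, seed=42):
    """Shuffle the list and split it in half: one control group, one test group."""
    rng = random.Random(seed)   # fixed seed keeps the split reproducible
    shuffled = list(subscribers)
    rng.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

control, test = split_audience(["a@example.com", "b@example.com",
                                "c@example.com", "d@example.com"])
```

Shuffling before splitting matters: slicing an unshuffled list (say, one sorted by signup date) would bias one variant toward older subscribers.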
This test also pitted two CTAs against one another: the CTA in the control email enticed readers with a “Free trial and installation,” while the test version asked subscribers to “Download the white paper.” In both emails, the CTAs were in the same place and used the same color: one in the header image and one at the bottom of the email. In the test email, there was also a secondary CTA in the sidebar for a “Free trial and free installation.”
The subject lines differed as well. The test email drew attention to the white paper with “[White Paper] The Impact of Failed Steam Traps on Process Plants,” while the control email used “Free Trial & Installation: Capture Energy Savings with Automated Steam Trap Monitoring.”
The results really threw the Emerson team for a loop! For starters, the subject line of the test email resulted in 23% more unique opens than the control email: the control had an 8.92% open rate, while the test had a 10.96% open rate. This showed Emerson that their subscribers are more apt to open an email that includes content rather than just a free trial.
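The 23% figure is simply the relative change between the two open rates. A quick sketch of that arithmetic:

```python
def relative_lift(control_rate, test_rate):
    """Percentage change of the test metric relative to the control."""
    return (test_rate - control_rate) / control_rate * 100

# Open rates from Emerson's subject line test
lift = relative_lift(0.0892, 0.1096)
print(f"{lift:.0f}% more unique opens")  # → 23% more unique opens
```

Note that this is a *relative* lift; the absolute difference between the two open rates is only about 2 percentage points.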
While both emails generated the same level of interest in the free trial, the white paper was significantly more popular than the free trial and generated additional inquiries. Take a look at the clicks for the emails:
WHERE ARE PEOPLE CLICKING?
After using A/B testing to see what type of offer (content vs. trial) resonates well with their audience, Emerson began adding content offers to their free trial emails. However, they wanted to determine if clicking on a white paper CTA or a free trial CTA indicated interest in the trial. Their hypothesis was pretty straightforward—that clicks on a free trial link indicated more interest in free trials than clicks on a white paper CTA.
They started tracking the location of clicks in each email and cross-referencing these clicks to inquiries and interest about the offers:
Once again, the white paper received significantly more clicks than the free trial itself. However, both free trial CTAs received the same number of clicks: 19. The sidebar CTA earned 14 clicks on the button and 5 clicks on the corresponding image, while the CTA at the bottom earned all 19 clicks on the button. But which CTA placement led to the most leads and the most interest in a trial?
Lead generation forms for both CTAs had the same qualification questions: 1) Are you interested in learning more about a specific product? 2) Are you interested in a free trial?
Forms submitted without checking either box were considered inquiries, forms with either checkbox checked were considered leads, and forms with both checkboxes checked were considered interested in the offer.
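One way to read that qualification rule (assuming a single checked box is what qualifies a submission as a lead) is a small branching function like this; the function name and category labels are illustrative:

```python
def classify_submission(product_interest: bool, trial_interest: bool) -> str:
    """Bucket a form submission by which qualification boxes were checked."""
    if product_interest and trial_interest:
        return "interested in offer"   # both boxes checked
    if product_interest or trial_interest:
        return "lead"                  # exactly one box checked
    return "inquiry"                   # no boxes checked

classify_submission(False, False)  # → 'inquiry'
```

Encoding the rule as a function like this also makes it easy to re-bucket historical submissions consistently if the definitions ever change.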
Email offer interest
Once again, the results surprised them! The initial interest in the white paper generated more clicks that later converted into leads, and not just leads, but leads expressing interest in the trial offer. Their conclusion: content really is king.
WHAT HAPPENS IF YOU RE-SEND AN EMAIL?
Re-sending messages (especially to non-openers) is a frequently debated email marketing tactic. When deciding to re-send, you might use the same email, or rewrite the message to differentiate it from the first send. Which practice is more effective, and a better use of marketing resources?
This was another question the team at Emerson wanted to answer. After filtering out anyone who had already downloaded the white paper or signed up for a free trial, they re-sent the exact same email to everyone who had not, one week after the original send. The only difference between the two emails was their subject lines:
- Whitepaper: The Impact of Failed Steam Traps on Food Processing Plants
- Whitepaper: Food Processing Plants; The Impact of Failed Steam Traps
The team feared that sending identical emails might annoy their audience; however, they didn’t see an increase in unsubscribes with the second send. Unsurprisingly, the first email generated more traffic to the landing page, since its click rate was much higher.
However, they were shocked to see that the second email actually generated more leads than the first.
After diving into the click data between the two email sends, they discovered that there wasn’t much overlap between subscribers who clicked both emails. In fact, only 30 people clicked links in both emails; 568 subscribers clicked links in the first email and 270 different subscribers clicked links in the second email.
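An overlap analysis like this comes down to set arithmetic over subscriber IDs. A minimal sketch (names are illustrative, not Emerson’s actual tooling):

```python
def send_overlap(first_clickers, second_clickers):
    """Compare the sets of subscribers who clicked links in each send."""
    first = set(first_clickers)
    second = set(second_clickers)
    return {
        "both": len(first & second),        # clicked links in both emails
        "first_only": len(first - second),  # clicked the first send only
        "second_only": len(second - first), # clicked the second send only
    }

# Toy data: subscribers 3 and 4 clicked both sends
send_overlap([1, 2, 3, 4], [3, 4, 5])
```

Running this on per-send click exports is how you would spot, as Emerson did, that the two sends reached largely distinct groups of clickers.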
What did Emerson learn from this? For starters, the subscribers who clicked their first email may indeed have been annoyed at receiving the same email again: only 20 of them clicked links in the second email. However, while the second email earned fewer clicks overall, those clicks converted into more leads, and both emails resulted in roughly the same percentage of people expressing interest in the free trial.
While the results of this test were very interesting, they weren’t conclusive enough for Emerson to make a decision for future sends. They plan to do more testing on re-sending campaigns to see whether multiple sends can lead to an increase in leads. They are interested in looking at changing subject lines and excluding clickers from the initial send.
By continuously testing (and often being shocked by those results!), Emerson continues to learn more about their audience and what types of emails, messaging, and CTAs resonate the best with them.
A big thanks to…
We’d like to thank Vanessa Bright, Online Marketing Manager; Scott Pries, Marketing Communications Manager, Flame & Gas Detection; and Charlie Oracion, Team Lead, Online Marketing, from the Emerson Process Management—Rosemount team for sharing all of their testing efforts with us.
DISCOVER YOUR AUDIENCE + OPTIMIZE YOUR MESSAGES
A/B testing is just one way to learn what resonates best with your audience. Litmus Email Analytics shows you engagement, email client, geolocation, forward, and print data to help you make key design and HTML build decisions, giving you opportunities to surprise and delight your subscribers (not to mention increase conversions!).
Optimize Your Emails
When it comes to email marketing, it’s all about your audience. Use Litmus Email Analytics to discover which email clients are most popular with your subscribers. Sign up free!