A Year in Review: Looking Back on Our 2012 Emails
After returning to work from the holidays, we’ve been reviewing what we did in 2012 and determining what worked, what didn’t, and what we wanted to do but never got to. Most importantly, we’ve been reflecting on our experiences in 2012 and planning what we’re going to do this year. It’s an exciting (and somewhat stressful!) time!
Justine recently wrote about our most popular blog posts of 2012, which showed us that our readers appreciate our educational content the most. Our most popular posts were easily referenceable resources, like infographics, and research-heavy posts, like addressing a common rendering issue. We will definitely have more of these in 2013!
Unfortunately, analyzing what works in our emails hasn’t been such an easy task. We’ve blogged in the past about our difficulty with A/B testing — we can’t seem to get any conclusive results — and now we are having some issues with determining exactly which email did best in 2012 and some key takeaways for optimizing our emails this year. Let’s take a look!
HOW TO DETERMINE SUCCESS? LET US COUNT THE WAYS…
What determines which email has been the most successful? One of the most obvious answers here has to be revenue. However, since we don’t include coupons for Litmus subscriptions in all of our emails (most of our emails are educational & informative) AND it’s a bit of a process to drill down into how many coupon trials later convert to paid customers (hey, we’re only 15 people here at Litmus!), we don’t have much data on the amount of revenue raised from each email campaign. However, we’re working on making this easier to measure in 2013! As marketers, we know how important it is to tie revenue to the campaigns we’re working on.
In terms of coupon redemptions, our Interactive Testing launch email came in first place:
Since this email focused on a new feature release, I’m not surprised that it had the most coupon redemptions (the two buttons offering the free trial probably didn’t hurt, either!). Three other emails also included coupons, but the focus of those emails was primarily educational articles, with one section mentioning products or features. In addition, the CTA for the free trial appeared only once in those emails.
Takeaway: While I’m basing this assumption on only 4 emails (not a huge sample group!), we see significantly more coupons redeemed when the main focus of the email is one product & the CTA is to try that product via a free trial. Without the clutter of other CTAs, coupons get more visibility in the email. Definitely some food for thought for the next time we use coupons!
Traffic To Site & Click Rates
Another way to gauge the success of our email campaigns is the traffic they bring to our website. Clearly we want our subscribers to visit our website (and, hopefully, be convinced to buy a Litmus subscription!). The following newsletters were the top sources of traffic to our site:
All of these newsletters contain links to numerous informative articles so I guess it’s not too surprising that these drove a lot of traffic to the site. By only including articles that are extremely relevant and useful to our subscribers, it makes sense that our subscribers would want to click through to our site to read the articles in detail. Relevant content is king!
After looking at the newsletters that drove the most traffic to our site, I wanted to check out the click-through rates for each of the campaigns to see if there was a correlation between our top sources to the site and high click-through rates. While they aren’t in the same order, three of the top sources also had the highest click-through rates:
- April Newsletter: 27.01%
- March Newsletter: 26.57%
- September Newsletter: 25.73%
- June Newsletter: 25.22%
- August Newsletter: 23.91%
At first I was very confused by this — if we are always sending to the same list, and the August Newsletter drove the most traffic to the site, why doesn’t it have the highest CTR too? Then it dawned on me — our list grows every month, so it makes sense that newsletters later in the year would drive more traffic to the site (since they go to more people)! So while April may have had the highest CTR, it also went to 20% fewer people than the August newsletter, so it’s no surprise that August drove more traffic!
Which is more important — the CTR or the traffic to the site? Without hesitation, traffic to the site. If I sent an email to 10 people and 5 clicked, my CTR would be 50%, yet only 5 people would have visited the site. Conversely, if I sent an email to thousands of people and the CTR was only 20%, hundreds of people would have visited the site — a far greater feat!
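The trade-off above can be sketched in a few lines of Python. The list sizes and click counts here are hypothetical round numbers for illustration, not our actual subscriber data:

```python
# Sketch of the CTR vs. absolute-traffic trade-off.
# All numbers below are hypothetical, chosen only to mirror the
# 10-person vs. thousands-of-people comparison in the text.

def ctr(clicks, recipients):
    """Click-through rate as a percentage of recipients."""
    return 100.0 * clicks / recipients

# Tiny list: stellar CTR, negligible traffic.
small_clicks, small_list = 5, 10
# Large list: modest CTR, far more visitors to the site.
large_clicks, large_list = 600, 3000

print(ctr(small_clicks, small_list))   # 50.0 — but only 5 visits
print(ctr(large_clicks, large_list))   # 20.0 — yet 600 visits
```

The second email "loses" on CTR by 30 percentage points while delivering 120x the visitors, which is why we rank traffic above CTR.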
As a result, when determining which email was the most successful by looking at clicks and traffic to the site, the August Newsletter takes the cake:
With a larger list than in earlier months, informative content (2 infographics & inspiration posts), a free trial, and product updates, this email has it all. In round one, Interactive Testing wins, while in round two, the August newsletter wins. Talk about confusing! Which one was truly our “most successful” newsletter? Let’s take a look at some more metrics.
Opens & Engagement
Other metrics that we always look at when measuring the success of our campaigns are open and engagement rates. By using our Email Analytics tool, we are able to generate both of these results. As a thought leader in the email marketing industry, it’s very important to us that our emails are not only opened by our subscribers, but that they are actually read as well. If our subscribers aren’t opening and engaging with our emails, then we aren’t doing our job — we want to provide them with content that matters to them!
- Mobile Masterclass Announcement: 31.31%
- Interactive Testing Launch: 30.58%
- May Newsletter: 29.31%
- November Newsletter: 28.94%
- August Newsletter: 27.71%
While all of these emails have a similar design (the most prominent CTA is above the fold and present in the preview pane, which can have an effect on open rates), they all have very different subject lines — some are long, some are short, some are vague and some are direct. Unfortunately, this makes it difficult to determine which types of subject lines work best for us. It may be that our loyal audience opens our newsletters regardless of the subject line (we hope!), or perhaps our subject line testing methodology needs some work!
What about our engagement rates? Do they coincide with our open rates (i.e., would the emails with higher open rates also have higher engagement rates)? There is somewhat of a correlation:
- Interactive Testing Launch & Mobile Masterclass Announcement: 76%
- July Newsletter: 73%
- March & June Newsletter: 71%
For starters, what do these engagement metrics mean? Our Email Analytics tool tracks how long the recipient had the email open to determine whether they read, skim read, or glanced at/deleted it. If the email is open for 10 or more seconds it is considered read, 2 or more seconds (but less than 10) is considered skimmed, and less than 2 seconds is considered glanced/deleted. The engagement metrics above combine read and skim read, meaning that at least 71% of openers spent 2 or more seconds with each of our top emails!
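The bucketing just described is easy to express in code. This is a minimal sketch using the thresholds from the paragraph above (10 seconds and 2 seconds); the function name and sample durations are our own invention, not part of the Email Analytics tool:

```python
# Sketch of the read / skim read / glanced-deleted buckets using the
# 10-second and 2-second thresholds described above. Names and the
# sample durations are hypothetical.

def classify_open(seconds_open):
    """Bucket a single open by how long the email stayed open."""
    if seconds_open >= 10:
        return "read"
    elif seconds_open >= 2:
        return "skim read"
    return "glanced/deleted"

# Hypothetical open durations, in seconds.
durations = [0.5, 3, 12, 1, 25, 8]

# Engagement = read + skim read, i.e. every open of 2+ seconds.
engaged = [d for d in durations if classify_open(d) != "glanced/deleted"]
engagement_rate = 100.0 * len(engaged) / len(durations)
```

With these sample durations, 4 of the 6 opens count as engaged, giving an engagement rate of about 67% — the same read-plus-skim calculation behind the percentages listed above.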
There is definitely a correlation between the emails with the highest open rates and engagement rates — the Interactive Testing Launch & Mobile Masterclass Announcement top both lists!
These newsletters are different in content — one is a product announcement & one is inviting our subscribers to attend our mobile email design class. However, both of these emails are different than our typical newsletters (which link to educational articles and sometimes include product announcements/updates).
Although this sample size is incredibly small, perhaps we can infer that our subscribers are more engaged with emails that have a few focused CTAs rather than many. Do these successful open and engagement rates make the Interactive Testing Launch & Mobile Masterclass Announcement emails our most successful? While neither of these had the highest CTRs or traffic to the site, Interactive Testing did have the most coupon redemptions (and perhaps the most revenue?). It’s a toss up!
Forwards & Prints
Lastly, we always find our forward and print metrics (via our Email Analytics tool) very interesting. When these metrics are high, it’s easy to conclude that our subscribers not only found the content interesting, but found it interesting enough to share. While a successful email marketing campaign encourages its subscribers to act on a call to action, an even more successful email goes further — it persuades the reader to share the message with others. To see what actually motivates people to share emails & why, check out our infographic, “Send it Forward: Get Your Emails Shared.”
Forwards:
- June Newsletter: 871
- Mobile Masterclass Announcement: 695
- November Newsletter: 493
- September Newsletter: 442
- Interactive Testing Launch: 401

Prints:
- June Newsletter: 61
- July Newsletter: 21
- November Newsletter: 16
- Mobile Masterclass Announcement & March Newsletter: 13
Our June newsletter was, by far, our most shared newsletter!
The main focus of this newsletter was the announcement that mobile email opens have finally surpassed desktop opens, with a link to an infographic of the stats. In addition, there was a link to an infographic with mobile-friendly design tips. With mobile becoming such a big topic in email marketing, I’m not surprised that this was such a popular email. We assume that our subscribers were sharing these statistics (and email design tips) with their coworkers, managers & teammates. Many people are hesitant about investing resources in a mobile strategy, but these stats show how important it is!
The Mobile Email Masterclass Announcement is also on the list for a large number of forwards and prints. I believe that this is due to our subscribers either forwarding/printing it to give to their managers to convince them to let them attend the event, or vice-versa (managers sending to employees to gauge attendance interest).
How do you measure success?
So which was our most successful campaign? It’s hard to say. Because our Interactive Testing Launch email had the most coupon redemptions, as well as high open and engagement rates, it could be considered our most successful newsletter. However, our August Newsletter drove the most traffic to the site, which is also an important measure of success. In addition, our Mobile Email Masterclass announcement had high engagement rates, as well as many shares. We’re conflicted — how do you measure the success of your email campaigns? Based on the results above, which of our emails do you think was the most successful?
What would you like to see from us in 2013?
What were your favorite newsletters that we sent in 2012? Here’s the whole list:
- Interactive Testing Launch
- Mobile Email Masterclass Announcement
- March Newsletter
- April Newsletter
- May Newsletter
- June Newsletter
- July Newsletter
- August Newsletter
- September Newsletter
- October Newsletter
- November Newsletter
What newsletters did you like and which ones didn’t you like? Leave your thoughts, opinions, and suggestions below! We’d love to hear from you.