Around Autumn 2017, we realized something weird about our email open rates:
They weren’t that great.
They weren’t BAD, but they were pretty consistently below the industry average – and if we’re being honest, that didn’t feel too amazing.
With a few little tweaks, though, we got a HUGE boost to our email performance.
Better open rates, better click rates, more conclusive results for our tests – the total package!
How’d we do it?
And what do you need to know about your options for doing the same?
Here’s what we learned:
Every week, we send out a newsletter sharing our latest posts from this blog, along with some of our favorite recent reads from around the web.
(If you haven’t already, you can join here so you never miss a post.)
In addition to being a valuable source of traffic – more on how you can make that part of your own content strategy here – the responses we get to our newsletter every week help us understand what’s most interesting and relevant to our readers.
We’re able to keep an eye on which topics are most popular with our readers, and which are the least popular. We even perform a weekly A/B test of the newsletter’s subject lines, so we can continually write better ones based on how they perform.
But none of that works without paying attention to ONE thing that’s easy to neglect:
The natural decay of your email list.
We’ve been building our list for years – and with well over 100k people subscribed to our weekly newsletter, it makes sense that not all of them are still as interested as they used to be, or even still receive it!
Email addresses are abandoned. People’s needs change. Mail gets sorted into different categories, inboxes, and tabs, making it easy to miss or ignore.
(Think of how many emails you receive in a day. How many of those do you delete without even opening or reading them?)
Considering that perhaps not everyone on our list was still an engaged reader, we did a little investigating – and were surprised by what we found.
We’d been growing our list for years, but not monitoring for that decay – so by the time we finally did investigate, we discovered tens of thousands of inactive subscribers still on our list.
(Note: when you’re monitoring your own email list for decay, your definition of active or inactive may vary. Because we send our newsletter once per week, we consider recipients inactive if they haven’t opened a single email within the past three months.)
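To make that definition concrete, here’s a minimal sketch of the kind of activity check we’re describing. (This is our own illustration, not MailChimp’s actual API – the field names and sample records are assumptions, and the 90-day window approximates our three-month threshold.)

```python
from datetime import datetime, timedelta

# Hypothetical subscriber records; in practice these would come from
# your email provider's export or API.
subscribers = [
    {"email": "a@example.com", "last_open": datetime(2017, 9, 20)},
    {"email": "b@example.com", "last_open": datetime(2017, 2, 1)},
    {"email": "c@example.com", "last_open": None},  # never opened anything
]

def is_active(subscriber, now, window_days=90):
    """A subscriber counts as 'active' if they opened at least one
    email within the window (here, roughly three months)."""
    last_open = subscriber["last_open"]
    return last_open is not None and (now - last_open) <= timedelta(days=window_days)

now = datetime(2017, 10, 1)
active = [s["email"] for s in subscribers if is_active(s, now)]
inactive = [s["email"] for s in subscribers if not is_active(s, now)]
```

However you define the cutoff, the key is to apply it consistently, so the same subscriber can’t drift between segments from one report to the next.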
Here’s why that matters so much.
First of all, an email list that has experienced decay can cost you more money.
A service like MailChimp, for example – which is what we use – may charge more per month depending on the size of your list. If your list has decayed, you could be paying for a lot of recipients who either don’t see or don’t open your emails.
Second – and more importantly – decay can skew your test results and make it hard to determine what works for your business.
Take a subject line test for example.
When we send our weekly newsletter, we want to make the subject line as appealing as possible.
(Because why wouldn’t you, right?)
Using MailChimp, we set up an automated A/B test that measures two subject lines against each other.
MailChimp sends our newsletter to about 20% of the people on our list – half of that group receives one subject line, and the other half gets the other. After a few hours, we can see which one had better open rates, and we send the newsletter to the remaining 80% of our list using that subject line.
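The mechanics of that split can be sketched in a few lines. (MailChimp handles this automatically – this is just our illustration of the 10% / 10% / 80% arrangement described above, with made-up addresses.)

```python
import random

def ab_split(recipients, test_fraction=0.2, seed=42):
    """Split a recipient list into two equal test groups (each sent one
    subject line) and a remainder that later receives the winner."""
    shuffled = recipients[:]
    random.Random(seed).shuffle(shuffled)  # randomize to avoid ordering bias
    test_size = int(len(shuffled) * test_fraction)
    group_a = shuffled[: test_size // 2]
    group_b = shuffled[test_size // 2 : test_size]
    remainder = shuffled[test_size:]
    return group_a, group_b, remainder

recipients = [f"user{i}@example.com" for i in range(1000)]
a, b, rest = ab_split(recipients)
# With 1,000 recipients: 100 get subject line A, 100 get subject line B,
# and the remaining 800 get whichever performed better.
```

The random shuffle matters: it’s what makes the two test groups comparable – which is exactly the property that list decay can quietly break, as the next section shows.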
If you don’t factor in decay, though, your tests may be a waste of time.
Imagine running the test we just described, but one of the two test groups is more heavily weighted with inactive subscribers, and the other is more heavily weighted with active ones.
The test results wouldn’t give you an accurate picture of which subject line is actually more effective, because the group weighted with inactive subscribers would have a lower open rate no matter what.
At best, the results are useless – and at worst, they could lead you to make bad choices based on untrustworthy data!
(We’ll show you an example of what this looks like in practice in just a second.)
So what can you do about email list decay?
You have two major options.
The first is to remove inactive subscribers from your list – an option that’s especially valuable if you’re trying to save on your email expenses by no longer paying to send to inactive subscribers.
You don’t have to shave people off your list with no warning, either! One common practice is to send a re-permission or reactivation campaign to inactive subscribers, giving them a chance to proactively remain on your list.
Here’s an example of what one of those might look like:
If you don’t want to cut subscribers from your list, though, you have another option – and this is the one that we took.
When we determined the extent of our own email list decay back in 2017, we decided that instead of cutting inactive subscribers from our list entirely, we’d use them as a learning opportunity.
(For example, if subscribers who rarely open our emails suddenly open one in higher-than-average numbers, we might be able to draw some conclusions about why it spoke to them when little else does.)
That said, we also didn’t want those inactive subscribers skewing our test results, or keeping us from understanding the overall effectiveness of our emails.
Here’s what we did about it.
When we send the newsletter every week, we actually send it to two separate audience segments: one labeled active, and one labeled inactive.
Our active segment includes only subscribers who meet at least one of two criteria:
(The second criterion ensures we’re not leaving out anyone who just joined the list and hasn’t yet had much opportunity to open a newsletter.)
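Those two criteria could look something like this in code. (The field names and the 30-day new-subscriber window are illustrative assumptions on our part; the three-month open window matches the threshold we actually use.)

```python
from datetime import datetime, timedelta

def in_active_segment(subscriber, now, open_window_days=90, new_window_days=30):
    """A subscriber lands in the 'active' segment if they meet at least
    one of two criteria: (1) they opened an email within the past three
    months, or (2) they subscribed recently enough that they haven't had
    much chance to open one yet. (The 30-day window for criterion 2 is
    an illustrative assumption.)"""
    opened_recently = (
        subscriber["last_open"] is not None
        and now - subscriber["last_open"] <= timedelta(days=open_window_days)
    )
    joined_recently = now - subscriber["subscribed"] <= timedelta(days=new_window_days)
    return opened_recently or joined_recently

now = datetime(2018, 1, 15)
veteran_reader = {"last_open": datetime(2018, 1, 10), "subscribed": datetime(2015, 3, 1)}
brand_new = {"last_open": None, "subscribed": datetime(2018, 1, 12)}
lapsed = {"last_open": datetime(2017, 6, 1), "subscribed": datetime(2015, 3, 1)}
```

Note the `or`: a brand-new subscriber with zero opens still counts as active, which keeps the segment from penalizing people who simply haven’t received many newsletters yet.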
We still test our subject lines the way we talked about earlier, and because the active segment includes only people who have expressed some form of interest within the past three months, we can trust that the results will be more accurate and relevant!
Here’s an example.
Take a look at the test results for one of our January 2018 newsletters, which we sent to about 20% of the people in our active subscriber segment:
(The blog post referenced in that subject line is this one, by the way.)
As you can see, this test showed us that one subject line was more compelling than the other, but that once people had opened the email, its contents did the job pretty reliably! It taught us a valuable lesson about subject line construction.
Compare that to the results for our inactive segment (which was sent to that segment in its entirety):
The overall open and click rates are WAY lower – we can see that if someone hasn’t demonstrated interest within the past three months, that pattern is pretty likely to continue!
(Cue the sad trombone.)
If the inactive and active subscribers had been grouped together in a single segment, our data would have been skewed, and we would have drawn the wrong conclusions.
We might have made strategic decisions based on inaccurate data – and that means those decisions might have been really, really wrong!
This all goes back to something we mentioned at the very beginning of this post, too: using our newsletter as a way of learning about our audience.
Because we focus on active subscribers, our open and click rates more accurately represent our audience. That means we can develop a more accurate understanding of what topics interest our audience the most, and how we should write about those topics!
Plus, it gives us a much better understanding of how our open rates compare to our industry’s average.
(And that just makes us feel good.)
So there you have it – if you haven’t been keeping an eye on your email list’s decay, now is the time.
How many people on your list haven’t opened a single email in three months? Or six? Or twelve?
What happens to your performance stats when you stop including those subscribers in your tests?
What information are you missing out on that could lead you to make exciting new decisions – or avoid bad ones?
Determine the point that separates your active and inactive subscribers, and experiment with segmentation.
If you’re like us, you might find that some of your results are a little surprising!
(And if you want to read more posts like this one, you can sign up for our own weekly newsletter right here!)