ABCs of A/B Testing: 9 Ways to Optimize Your Email Campaigns

Testing is essential for optimizing your email campaigns; you want to find the combination of design and content that does the best job of enticing your audience and spurring them to action. After all, connecting with customers requires both art and science, and finding the right way to reach them can be a challenge. Brilliant concepts and executions are obviously important, but they're only the beginning when it comes to developing successful marketing campaigns.

You can’t count on what worked in the past; market conditions change, and so do your buyers. Automated email testing lets you take some of the risk out of trying new things. It gives you the data you need for decision-making, so you can develop and evolve your campaigns with more confidence.

Find a Winner

When you run an A/B test, you’re trying to find out whether version A, the control version, does better or worse (according to your metrics) than version B. By testing each version with a small sampling of your target audience, you can quickly determine which design and copy combination will likely get the most conversions before launching your larger campaign.

For example, you can discover whether a call-to-action like “Buy Now” or “Shop for Deals” resonates better with your target segment. The better the response, the higher your return on investment.
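
Most email platforms handle the mechanics of the split for you, but the underlying idea is straightforward: draw a random sample of your list and divide it evenly between the two versions. Here's a minimal sketch in Python; the function name and the numbers in the usage comment are illustrative, not any particular vendor's API.

```python
import random

def split_sample(recipients, sample_size, seed=42):
    """Randomly draw a test sample and split it evenly between versions A and B."""
    rng = random.Random(seed)              # fixed seed so the split is reproducible
    sample = rng.sample(recipients, sample_size)
    half = sample_size // 2
    return sample[:half], sample[half:]    # (version A group, version B group)

# e.g., test on 2,000 of 50,000 subscribers, then send the winner to the rest:
# group_a, group_b = split_sample(all_subscribers, 2000)
```

Random assignment matters: if you split alphabetically or by signup date, the two groups may differ in ways that skew the results.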

Using A/B testing, you can experiment with any aspect of an email that might have an impact on conversion, such as:

  1. Subject line: This is probably the most common A/B test. Remember to look not only at open rates, but also at clickthroughs and conversions; sometimes the email that gets the most opens isn’t the one that gets the most conversions (see the sketch after this list).
  2. From address: You can test whether sending from a named individual (such as the CEO) consistently boosts opens and clickthroughs vs. a generic From address like “Newsletter.”
  3. Preheader: Test whether an email performs better when it has a preheader, and by how much. You can also determine whether a link in the preheader drives more responses. Too often, preheaders are used only for “View this email in a browser.” Test using this real estate as a continuation of the subject line instead; it may optimize response, particularly for recipients who have images turned off and/or use preview panes.
[Image: Pottery Barn email preheader] This email header provides an incentive to click as well as a link, while also offering a mobile and online viewing experience.
  4. Format: Does a bolded call-to-action get results? What if it’s underlined as well? Should you make the font a different color? Testing can tell you.
  5. Buttons: Color, size, label, positioning, or even something as arcane as whether or not a button has a drop shadow can all make a big difference in clickthrough rate.
  6. Link text: Try out different configurations of your link text, like “Download the email best practices eBook” vs. “Get better results on every email campaign.”
  7. Content: Test any other copy elements such as headings, benefit statements, customer testimonials, or product descriptions.
  8. Images: Does a photo of happy people on a sailboat outperform the photo of a beautiful beach? There’s only one way to find out.
  9. Prices: Discover if your audience responds better to certain price points, and test discount strategies like “Get 25% off” vs. “Save $200.”
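
To make the point about metrics in item 1 concrete, here's a minimal sketch of how you might compare open, clickthrough, and conversion rates for two variants. The numbers are hypothetical, and the helper function is ours, not part of any email platform.

```python
def rates(sent, opens, clicks, conversions):
    """Return open, clickthrough, and conversion rates for one variant."""
    return {
        "open_rate": opens / sent,
        "click_rate": clicks / sent,
        "conversion_rate": conversions / sent,
    }

# Hypothetical results: A wins on opens, but B wins where it counts.
a = rates(sent=1000, opens=280, clicks=60, conversions=12)
b = rates(sent=1000, opens=240, clicks=75, conversions=19)
print(a)  # {'open_rate': 0.28, 'click_rate': 0.06, 'conversion_rate': 0.012}
print(b)  # {'open_rate': 0.24, 'click_rate': 0.075, 'conversion_rate': 0.019}
```

Version A's subject line earned more opens, but version B converted more readers; judged on opens alone, you'd pick the wrong winner.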

Have a Clear Goal for Each Test

One key benefit of A/B testing is that it delivers unambiguous results: a single variable (the subject line, let’s say) is tested via two possibilities, A and B. You are comparing how your email recipients respond to one version against how they respond to a single variation, and the results show in a concrete, measurable way which of the two alternatives performs better.

People do sometimes create A and B versions that have multiple differences; this will tell you which email performs better, but not necessarily why. Limit the variables if you want to answer a specific question. For example, if you want to know whether phrasing a subject line as a question works with a particular segment, keep it as similar as possible to the control:

Get Tips for a Healthy Heart

vs.

How Healthy is Your Heart?

Notice we’re not testing the control against something very different like, “Are You at Risk of a Heart Attack?” or “Is Your Heart Beating Too Fast?” That’s because our stated mission was to determine whether or not a question format gets more people to respond. If we wanted to test whether a positive message works as compared to a fear-based message, then we could use one of the scary subject lines. It’s important to have a clear goal in mind when creating an A/B test. Otherwise, you’ll just be throwing out ideas to see what sticks. (Of course, that’s still better than not testing at all.)

A/B Testing Tips

Here are some additional tips to help you get the most out of your A/B testing strategy.

  • Sync up your tests. If version A goes out this week and version B runs next week, it’s not a true A/B test. Version A might have performed better only because version B was sent during a holiday weekend. A staggered test might still give you some insight into trends, and it’s better than nothing, but ideally the two versions should run at the same time for exactly the same amount of time. Marketing automation makes this easy.
  • Listen to the data. Don’t let your personal preference be your guide; trust the test outcomes. The winners are often surprising. If the call-to-action button you personally think is ugly gets the best results, don’t be afraid to use it. If you’re sure people love pictures of puppies, and yet the results show kittens win every time, go with the facts.
  • Take the right amount of time. Let your test run long enough (and reach a big enough audience) to get statistically meaningful results. But don’t run it too long: the poor performer could be costing you conversions. If you want to maintain a higher conversion rate during a test, you might want to use something (like Act-On’s Optimize for Conversions feature) that directs more traffic to the best-performing execution rather than splitting it evenly; see the sketch after this list.
  • Keep it simple. It’s a best practice to test one element at a time. If you try out a new button color and you also change the headline, you won’t know which one caused the uptick in conversions.  If you are pressed for time, you can certainly test version A against version B and go with the winner. (Your insights won’t be as granular but at least you’ll improve results.) You might also want to test how a plain-looking email (an HTML email with basic text and no graphics) performs against a polished, designed message. Sometimes simple really is better.
  • Make it an ongoing thing. You’ll want to keep testing on a regular basis, since the effectiveness of any element in your email (or landing page) can change over time.
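
Act-On doesn't publish the mechanics of its Optimize for Conversions feature, but adaptive allocation of this kind is commonly built as a "bandit" strategy: keep sending mostly to the current leader while still exploring the alternative. The epsilon-greedy sketch below is purely illustrative, not Act-On's algorithm.

```python
import random

def choose_version(stats, epsilon=0.1, rng=random):
    """Epsilon-greedy allocation: usually send the current best version,
    but keep exploring the other one a fraction of the time.

    stats maps version name -> (sent, conversions).
    """
    if rng.random() < epsilon:                           # explore
        return rng.choice(list(stats))
    # exploit: pick the version with the higher conversion rate so far
    return max(stats, key=lambda v: stats[v][1] / max(stats[v][0], 1))

stats = {"A": (500, 6), "B": (500, 11)}
print(choose_version(stats))  # usually "B" once it pulls ahead
```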

How can you be sure which test won? Get an in-depth look at determining the confidence level for email A/B testing, and find out how to interpret the results of your tests.
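
As a rough illustration of what "confidence" means here, the sketch below runs a standard two-proportion z-test on conversion counts. The 95% threshold (|z| > 1.96) is a common convention, not a rule, and real testing tools do this math for you.

```python
import math

def z_score(conv_a, sent_a, conv_b, sent_b):
    """Two-proportion z-test: how strongly does B's conversion rate
    differ from A's?"""
    p_a, p_b = conv_a / sent_a, conv_b / sent_b
    p = (conv_a + conv_b) / (sent_a + sent_b)            # pooled rate
    se = math.sqrt(p * (1 - p) * (1 / sent_a + 1 / sent_b))
    return (p_b - p_a) / se

z = z_score(conv_a=12, sent_a=1000, conv_b=19, sent_b=1000)
print(f"z = {z:.2f}")  # ~1.27: below 1.96, so not yet 95% confidence; keep testing
```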

Email optimization is a continual quest of discovery and refinement. A/B testing is a tried-and-true method for uncovering what works … and what doesn’t. It doesn’t always have to be complicated, but it should be an integral part of your development processes.

Just getting started with email marketing?

Get the jump on it with this free email toolkit.
