A/B Testing: Optimize Your Emails and Landing Pages and Improve Conversions
One of the tough things about being a marketer these days is the fact that what worked in the past doesn’t always work anymore. Our B2B customers and prospects are changing the way they search, consider, and buy. They’re also spending much more time doing their own research.
According to Forrester, B2B buyers may be as much as 90% of the way through their journey before they reach out to a salesperson. Customers are also consuming content like never before, visiting social channels, watching videos, attending live and virtual events, and evaluating options on mobile.
To keep up with our information-seeking audiences, we need to create content that engages and connects with their needs. What’s more, we need to attract them in effective ways so they have the opportunity to see this amazing content. And to do that, we have to keep fine-tuning our approach. One thing is for sure: We can’t afford to send a lackluster email that drives to an underwhelming landing page. Because once we miss the mark, there’s a possibility that we’ve missed it forever. That’s why testing is a critical aspect of developing digital marketing campaigns.
Testing is a proven method to find the best combination of design and content so you can attract your audience and get them to take action. Coming up with compelling creative concepts is a great start. But that headline you think is brilliant might not shine so brightly when your prospects open the email. Sometimes a no-nonsense approach is much more effective than a clever turn of phrase. Some audiences like a big red button to click, and others prefer a smaller text link.
How can you know for sure? You have to test, test, and test again.
According to the Direct Marketing Association, 68% of digital marketers rated the ability to test new campaigns as having the greatest impact on their email marketing efforts. And that’s not too surprising, since testing takes a lot of the risk out of trying new tactics. Testing gives you the data you need to back up your decisions, so you can develop and evolve your campaigns with more confidence.
There are two basic types of testing used by digital marketers today.
A/B testing examines the impact of one change, known as the variant, against a baseline, known as the control. By keeping the tests identical, other than the one variation, you can look at the results to see which variant impacts your email or landing page performance. For example, you can quickly discover whether the call to action “Download the Guide” or “Read Now” works better with your target audience. If you change other elements, like the color of the button or the size of the copy, you won’t truly know which change brought about the difference in results.
Multivariate testing changes many different elements in an email or landing page at once. It’s great if you need to test multiple variables but don’t have the time to conduct a series of one-off tests. It will help you discover which version performs best overall, but you won’t be able to pinpoint which individual change had the biggest impact on the performance of your campaign.
Another key factor to keep in mind when testing emails and landing pages is the statistical significance of the results. If you run an A/B test and the results show a very small difference between the two options (say, below 5%), it could just be random chance. If you have a higher number, it’s more likely to have been caused by the change you introduced – in which case you might say the results are statistically significant. That can give you the confidence level you need to move forward with the winning version. However, it can be more complex than that. Here’s an in-depth look at calculating the significance of a result and determining the confidence level of an A/B test.
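One common way to check whether an A/B result is statistically significant is a two-proportion z-test. This is a generic sketch of that calculation, not Act-On's implementation; the conversion counts below are made-up illustrative numbers.

```python
from math import sqrt, erf

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test comparing conversion rates of versions A and B.

    Returns (z, p_value); a p-value below 0.05 is commonly read as
    statistically significant at a 95% confidence level.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B perform the same
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical test: A converted 200 of 1,000 recipients, B converted 260
z, p = ab_significance(200, 1000, 260, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these example numbers the p-value comes out well under 0.05, so the lift would be treated as significant rather than random chance; a tiny gap between versions would produce a p-value far above that threshold.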
In general, you will want to test on a small segment of your entire audience so you can send the winning version to the bigger population. For example, you could take 20% of your list and randomly split it in half. So 10% of the test group receives version A, and 10% gets version B. The winner is subsequently sent to the remaining 80%.
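The 10/10/80 split described above can be sketched in a few lines. This is an illustrative example (the contact list and fractions are assumptions), not how any particular platform implements it.

```python
import random

def split_for_test(contacts, test_fraction=0.2, seed=42):
    """Randomly split a list into group A, group B, and a holdout.

    Half of `test_fraction` gets version A, half gets version B,
    and the remaining contacts receive the winning version later.
    """
    shuffled = contacts[:]                     # don't mutate the caller's list
    random.Random(seed).shuffle(shuffled)      # seeded for reproducibility
    n_test = int(len(shuffled) * test_fraction)
    half = n_test // 2
    group_a = shuffled[:half]
    group_b = shuffled[half:n_test]
    holdout = shuffled[n_test:]
    return group_a, group_b, holdout

contacts = [f"user{i}@example.com" for i in range(1000)]
a, b, rest = split_for_test(contacts)
print(len(a), len(b), len(rest))  # 100 100 800
```

With 1,000 contacts and a 20% test fraction, 100 recipients see version A, 100 see version B, and the remaining 800 are held back for the winner.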
If you think the open rate on an email you plan to test will be low, you need to increase the number of members in your sample pool to achieve significance. There are plenty of calculators out there that can help you make sure you’re getting a statistically significant result. Just keep in mind that the lower the number of conversions expected, the larger the sample size should be. Aim to receive at least 200 responses to each email variation.
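The relationship between expected conversion rate and required sample size can be made concrete with a standard power calculation for comparing two proportions. This is a simplified sketch assuming 95% confidence and 80% power (the usual defaults in online calculators); the rates below are illustrative.

```python
from math import ceil

def sample_size_per_variant(base_rate, min_lift, z_alpha=1.96, z_beta=0.84):
    """Approximate recipients needed per variant to detect an absolute
    lift of `min_lift` over `base_rate`.

    z_alpha=1.96 corresponds to 95% confidence (two-tailed);
    z_beta=0.84 corresponds to 80% power.
    """
    p1 = base_rate
    p2 = base_rate + min_lift
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2 * variance) / (min_lift ** 2)
    return ceil(n)

# A low-conversion email needs a much bigger sample than a high-conversion one:
low = sample_size_per_variant(0.02, 0.01)   # 2% baseline, detect +1 point
high = sample_size_per_variant(0.20, 0.05)  # 20% baseline, detect +5 points
print(low, high)
```

Running this shows the low-conversion scenario needs several times more recipients per variant, which is exactly why low expected open or click rates force you to enlarge the sample pool.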
With the right solution in place you can automate this process so you don’t have to worry about manually splitting lists or scheduling the launch of the winning versions. Learn more about using Act-On solutions to test your emails, landing pages, and forms. You can also try out an A/B test for free with Act-On’s ABtesting.net online service.
Best Practices for Testing
With the right strategy in place, you’ll be testing in no time. There are many different elements that could impact key metrics like open rates, clickthrough rates, and conversions. You’ll find that A/B testing doesn’t double the work – in fact, it can decrease the amount of time and effort you need to spend on campaigns. Here are a few best practices for creating a successful testing strategy.
Set goals: Start with a theory for the test and define which metrics you will monitor to identify the winning version. Is a red button better than blue? Does “better” mean more clicks or more conversions? Make sure you have a clear hypothesis from the beginning.
Trust the data: Don’t let your personal preference fool you. Look at the data, study the results, and trust the test outcomes. Even if you think that red button is ugly, if it gets more conversions, you’ll want to use it.
Sync up your tests: Your tests should run for exactly the same amount of time. And unless you’re testing factors such as time of day or day of week, you should always run your tests simultaneously. Otherwise, different variables like holidays or weekends might have an impact. Fortunately, marketing automation with A/B testing technology makes this easy to do.
Make it a regular thing: Always keep testing, since the effectiveness of any element in your email or landing page can change over time. The blue button in this month’s email campaign might not work next month.
Share the results: The insights you gain from your testing might have an impact on campaigns in other areas of your company. Be sure to share the results so everyone can continue to optimize across campaigns and channels.
Ready to start testing your own emails, landing pages, and forms? Be sure to read this eBook to get the ABCs of A/B Testing. In addition to many of the basics we’ve covered here, you’ll find an in-depth look at some real-world case studies from our Act-On marketing team. Download the eBook today and find out how to optimize the results of every campaign.