A/B testing, also known as split testing, is a way of working out which of two campaign options is more effective at encouraging opens or clicks.
In an A/B test you set up two variations of the same campaign and send them to a small percentage of your total recipients. Half of the test group is sent Version A and the other half gets Version B. The version that gets the most opens or clicks wins, and that version is sent to the remaining subscribers.
- A/B test types
- Choose recipients
- Define test settings
- Test before you start the test
- Monitor A/B test results
Important: When an A/B test is in progress you can't change the campaigns in any way, so be sure to test thoroughly before you start a split test.
A/B test types
You can split test an email campaign in one of three ways. Open Create & Send, click Create a new campaign then select A/B test, as shown here:
The three test types are explained below:
Subject line
For this test, campaign versions A and B are identical except for the subject line. For example, you could test whether a generic subject gets more opens than a longer, more specific one:
Here are some more examples:
- Test two completely different topics as the subject line, to see what content is of most interest to subscribers.
- Add personalization to identical subject lines to see if a first name greeting, for example, gets a better response.
- See what kind of promotion works best by offering "Free Shipping" versus "15% Off".
From name
Sender details are important because many people will not open an email if they don't recognize who it's from. With the From name test you can use a different name and email address for Version A versus Version B, as shown below, or change just one or the other.
Tip: The best approach depends on your relationship with your subscribers. Consider whether they are more likely to recognize an individual's name, your company name, or the name of the product your campaigns are about.
Email content
This test compares different elements of the campaign itself, for example: section titles, article length, calls to action, header images and more. You might even test two completely different designs to see which one gets the most clicks.
Campaigns for this type of test can be created using one of your saved templates or designed externally and imported (if that option is available to you). If you're doing an email content test, follow the on-screen instructions to set up Version A first. After that, you'll be prompted to repeat the process to set up Version B.
Tip: If you're using the same template and only making minor changes to content, save time by copying the content from Version A to create Version B, as shown here:
Choose recipients
When both versions of the campaign are set up you'll be prompted to choose the subscriber list, or lists, to send to. Subscribers added to a list after the A/B test has started will not be sent the campaign, so you'll have to target them separately.
Tip: You can use the resend a sent campaign function to send the winning campaign to new subscribers. The campaign will be duplicated as an A/B test, but just follow the instructions to step four, then click the Edit button next to "Define campaign and sender" and select Regular as the campaign type. This will discard the losing version and leave you with the winning campaign to send.
Define test settings
The next step is selecting the size of your test group, deciding how the winner will be chosen, and setting a length of time to run the test.
Test group size
Use the slider to define a test group, which should be a small subset of your recipients, say 20-30%. Recipients in the test group are selected randomly. Half of them will be sent Version A while the other half are sent Version B. The remaining recipients will be sent the winning version.
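To make the split described above concrete, here's a rough sketch in Python. This is purely illustrative, not the product's actual code; the function name and parameters are invented for the example:

```python
import random

def split_recipients(recipients, test_fraction=0.2, seed=None):
    # Randomly pick a test group of the given size, then split it
    # evenly between versions A and B. Everyone else is held back
    # to receive the winning version later.
    rng = random.Random(seed)
    pool = list(recipients)
    rng.shuffle(pool)
    test_size = int(len(pool) * test_fraction)
    test_group = pool[:test_size]
    half = len(test_group) // 2
    version_a = test_group[:half]
    version_b = test_group[half:]
    remaining = pool[test_size:]  # will be sent the winner
    return version_a, version_b, remaining

# With 1,000 recipients and a 20% test group: 100 get A, 100 get B,
# and 800 wait for the winning version.
a, b, rest = split_recipients(range(1000), test_fraction=0.2, seed=42)
```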
How will the winner be decided?
Select either Open rate, Total unique clicks or Total clicks on a selected link as the performance metric for your test. You can't change the metric after a test has started but you can monitor activity for all three metrics while the test is in progress, as shown here:
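To clarify how the three metrics differ, here's a hypothetical sketch of how each could be computed from a campaign's event log. The event format and names are invented for illustration and don't reflect the product's internals:

```python
def campaign_metrics(total_sent, events, selected_link=None):
    # Each event is (recipient_id, kind, link), where kind is
    # "open" or "click". Link is None for opens.
    openers = {r for r, kind, _ in events if kind == "open"}
    clickers = {r for r, kind, _ in events if kind == "click"}
    link_clicks = sum(
        1 for _, kind, link in events
        if kind == "click" and link == selected_link
    )
    return {
        # Open rate counts each recipient at most once.
        "open_rate": len(openers) / total_sent,
        # Unique clicks: recipients who clicked anything, once each.
        "total_unique_clicks": len(clickers),
        # Clicks on a selected link: every click counts, repeats included.
        "clicks_on_selected_link": link_clicks,
    }

events = [
    (1, "open", None), (1, "click", "/sale"),
    (2, "open", None), (2, "click", "/sale"), (2, "click", "/sale"),
    (3, "open", None),
]
metrics = campaign_metrics(total_sent=10, events=events, selected_link="/sale")
```

Note how recipient 2's repeat click raises clicks on the selected link to 3 but leaves total unique clicks at 2.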
Test run time
You can run an A/B test for anywhere from one hour up to five days. There's also an option to manually select a winner while the test is in progress.
Test before you start the test
It's very important to test and check for errors of any kind before you start an A/B test because you cannot make any campaign changes when an A/B test is in progress.
To double-check everything you've set up so far, we summarize it for you in a campaign snapshot (pictured below). If you need to change something, just click the Edit buttons on the right:
Tip: If you're testing email content using a template-based campaign, the snapshot is where you can check the plain text version of your emails. What looks great as HTML might need a little work as text-only. Click Preview to see how they look and Edit to make changes, as highlighted above.
When you're satisfied that everything is good to go, click Test and define delivery. Run a Quick test to send yourself previews of both versions, or click Complete test to run a fully automated design and spam test.
Send now or schedule for later
You can start the A/B test immediately or select Schedule it to start at the following time, as shown here:
If you schedule the test to start later, we'll send you an email notification when it starts.
Monitor A/B test results
While a test is in progress you can monitor results with the A/B test report, which looks like this:
The graph makes it easy to see which version of the campaign is winning, according to the performance metric you've chosen: open rate, total unique clicks, or total clicks on a selected link. Below the graph you can also see which version is winning for the two metrics you didn't select.
For some A/B tests there's a clear winner before the test time is up. On the report page, pictured above, you can see the time remaining and a link to manually select the winner. Click this to choose the winner yourself and end the test early.
After a winner has been decided, through test completion or manual selection, the remaining recipients will be sent the winning version and a full campaign report will become available. The A/B test results are included in the full report so you can revisit them when designing your next campaign.
If you let the test run to completion we'll send you an email notification as soon as it's finished. If the two versions are tied at the end of the test, version A will be sent to the rest of the list.
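The winner-selection rule described above, including the tie-break in favor of Version A, boils down to a simple comparison. A minimal illustrative sketch (not the product's actual logic):

```python
def pick_winner(metric_a, metric_b):
    # Whichever version scores higher on the chosen metric wins;
    # a tie goes to Version A, which is also sent to the rest of
    # the list if the test ends level.
    return "B" if metric_b > metric_a else "A"

winner_clear = pick_winner(0.24, 0.31)  # B's open rate is higher
winner_tied = pick_winner(0.20, 0.20)   # tie, so A wins
```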