How to carry out an A/B test?

A/B testing, also known as split testing, is the process of comparing two or more versions of the same marketing asset, for example, an email campaign. Test different elements of your email, measure their performance, and create your future email campaigns based on what’s working and what’s not.

IN THIS ARTICLE

How it works
How to carry out an A/B test
FAQ

How it works

Please note

A/B Testing is available on the Team Pro and Agency plans. Learn how to upgrade your plan »

How does A/B testing work? 

Carry out the A/B test by adding two or more versions of your email. 
  • The original version is marked as “Version A”.
  • To create a new email version, either copy an existing version or add a blank one by clicking the green ‘+’ symbol.
Create and test a maximum of 5 email versions per step in your campaign. 
  • That means that besides the original “Version A”, you can add versions B, C, D, and E to your opening email (Email 1), and up to 5 versions to each of the follow-up messages (Email 2, Email 3, etc.).
  • Please note that you can have a maximum of 8 emails in one Path (1 opening email and 7 follow-ups), which gives you the possibility of testing up to 40 email versions in one campaign.
Test one thing at a time. 
  • If you want to increase your open rate, focus on multiple variations of your subject line. 
  • If your goal is to engage your prospects and receive more replies, experiment with the content of your email, for example, by including a different call to action (CTA) in each email version.
Send each email version to at least 100 prospects. 
  • If you’re planning to test 4 email versions, it’s best to import at least 400 prospects who share similar characteristics. 
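The sizing guideline above is quick arithmetic. Here's a minimal sketch in Python; the 100-prospects-per-version threshold and the 5-version limit come from this article, while the function name and error handling are illustrative:

```python
# Sketch: smallest prospect list for a split test, assuming the guideline
# of at least 100 prospects per email version (max 5 versions per step).
MIN_PROSPECTS_PER_VERSION = 100
MAX_VERSIONS_PER_STEP = 5

def min_list_size(num_versions: int) -> int:
    """Smallest list that gives every version at least ~100 recipients."""
    if not 1 <= num_versions <= MAX_VERSIONS_PER_STEP:
        raise ValueError("a step supports 1 to 5 email versions")
    return num_versions * MIN_PROSPECTS_PER_VERSION

print(min_list_size(4))  # 4 versions -> at least 400 prospects
```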

Why should you A/B test your email campaign?

  • Improve email deliverability by experimenting with varied content. 
  • Learn which of your cold email practices bring better results.
  • Study your prospects’ behavior to craft personalized messages that fit your target group.
  • Turn your cold prospects into valuable leads by having engaging conversations.
  • Create lasting business relationships.

What can you A/B test?

  • Subject line to measure your open rates.
  • Email content to increase reply rates.
Please note

Delivery time, timezone, and signature stay intact for the whole step. These settings cannot be changed separately for each email version.

How to carry out an A/B test?

Before you start

  • Take care of your domain and email warm-up.
  • Prepare an up-to-date list of prospects. 
  • Divide your target group by characteristics. 
  • Plan your test.
  • Prepare various email versions.

Step #1 Create different email versions

Each campaign starts with version A of the opening email, which is your original message. Add one or more email versions to start testing your campaign.

Adding versions

  • Click the green ‘+’ symbol to add an empty version of your email. 
  • Click the ‘Options’ icon and select ‘Copy’ to duplicate your email version.

Deleting versions

  1. Hover over the version that you want to remove. 
  2. Click the 'Options’ icon.
  3. Select ‘Delete’ from the dropdown menu.
  4. Click 'I'm sure' to confirm.

Please note
  • You need to have at least one email version in your campaign.
  • You can’t delete a version once any of the versions has already been sent.

Step #2 Add follow-ups

Add one or more follow-ups to increase the chance of your email being opened, and test them too.

Sending follow-ups

Sending follow-ups in the same thread 

Leave the subject empty if you want to send your emails in the same thread. Your follow-ups will be sent with the previous subject line, attached to the other emails in the sequence, and will start with “Re:” followed by the subject line. For example, “Re: Hi!”

Sending follow-ups as a separate message 

Type in a new subject to send follow-ups separately. Each follow-up will be sent as a separate message with the new subject line and without the body of the previous message attached as a citation. 

However, if you wish to attach the previous message, just click the quotation mark icon. Your follow-up will be sent with the new subject and the content of the previous message will be included in the thread.

Read more »

The process is much the same as when you split test the opening email. Add a new blank version by clicking ‘+’ (‘Add version’) or copy an existing version. One version of a follow-up will be randomly assigned to a prospect in each step. 

How does it work? Let’s say you have 1 opening email and 2 follow-ups in the campaign sequence. That gives 3 steps: 

  • Email 1 (opening email) without A/B testing.
  • Email 2 (follow-up) with A/B testing.
  • Email 3 (follow-up) also with A/B testing. 

A version will be drawn for a prospect right before a message is sent. That means that your prospect can receive, for instance, version B of Email 2 and version A of Email 3.
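The per-step draw can be sketched like this. This is an illustrative simplification, not the product’s actual code; the uniform random choice and the step/version labels are assumptions:

```python
import random

# Illustrative sketch: one active version is drawn independently for each
# step right before the message is sent, so a prospect can receive, say,
# version B of Email 2 and version A of Email 3.
steps = {
    "Email 1": ["A"],        # opening email without A/B testing
    "Email 2": ["A", "B"],   # follow-up with A/B testing
    "Email 3": ["A", "B"],   # follow-up with A/B testing
}

def draw_versions(steps):
    """Pick one version per step for a single prospect."""
    return {step: random.choice(versions) for step, versions in steps.items()}

print(draw_versions(steps))
```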

Step #3 Preview the campaign

Check how each version of your email and follow-ups will look in your prospect’s inbox. Scroll down to the bottom of the page and click the PREVIEW button.

  • In the preview, you’ll see all versions of your email for each step in the campaign. 
  • Only one version will be sent to each prospect. 
  • The versions can be different for each step. For example, Prospect #1 can receive version A of the opening email and version D of the follow-up message. Prospect #2 will be sent version C of Email 1 and version A of Email 2, and so on.

Customize the opening email 

After you customize an email, we add the symbol of the version that you edited, for example ‘A’, along with a ‘CUSTOMIZED’ tag.

Step #4 Send test campaign

Test the sending and see how your email will look in your prospect’s inbox. 

  • Click ‘SEND TEST CAMPAIGN’ to send several randomly selected email versions to your own email address. 
  • To avoid cluttering your inbox, we won’t send all of your versions at once.

If everything looks good, click ‘SEND’ to start your A/B test.

Step #5 Compare the results

Your campaign will start running as soon as you hit the ‘SEND’ button. The emails will be sent according to the Delivery time and timezone that you set up for this campaign. 

To check your results, open your stats dashboard.

Statistics

  • Steps with different versions will be marked with the ‘A/B’ tag.
  • Your best results will be marked by a special winner icon.
  • You can turn off lower-performing email versions.

‘Best result’
  • Your best results will be marked with a little badge, a special winner icon.
  • The ‘Best result’ badge is assigned to the best open rate (‘OPENED’), click rate (‘CLICKED’), and reply rate (‘RESPONDED’).
  • The ‘Best result’ badge for Interest level is available only for prospects marked as ‘Interested’.   

Turning off/on an email version
  • You can turn off an email version with worse results so it won’t be sent to prospects anymore.
  • Open your stats dashboard and click the toggle switch to turn off/on an email version.

  • Inactive versions are marked in grey.
  • Active versions are marked in green.
  • Keep in mind that at least one version of each email has to remain active.

  • If you decide to turn off email versions where you customized an email, those customized emails will still be sent.

If-campaigns

  • Switch between Path YES and Path NO to compare your email performance in campaigns with conditions.

Last Activity

  • Open the PROSPECTS tab in your campaign and check your prospects’ activity.  
  • See when and which version of your email was SENT, OPENED, CLICKED, or RESPONDED to.

Exporting results

  • Go to your campaign stats dashboard → click ‘Actions’ → and select ‘Export as .csv’
  • We add a ‘VERSION’ column regardless of A/B testing. If you didn’t use A/B testing in your campaign, the column will be left empty. 

Simple campaigns

In campaigns without conditions, the PATH column is intentionally left empty.

If-campaigns 

In campaigns with conditions, the PATH column is filled in with ‘Path YES’ or ‘Path NO’.
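Once exported, the .csv can be summarized outside the app. A small sketch using only the Python standard library; the ‘VERSION’ column name comes from the export description above, while the file name and sample rows are made up for the demo:

```python
import csv
from collections import Counter

# Sketch: count exported rows per email version using the 'VERSION' column.
def rows_per_version(path):
    with open(path, newline="") as f:
        return Counter(row["VERSION"] for row in csv.DictReader(f) if row["VERSION"])

# Tiny made-up export file for the demo:
with open("export.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["EMAIL", "VERSION", "PATH"])
    writer.writerows([["a@x.com", "A", ""], ["b@x.com", "B", ""], ["c@x.com", "A", ""]])

print(rows_per_version("export.csv"))  # Counter({'A': 2, 'B': 1})
```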


For more information, check out the A/B Testing: FAQ »

Learn more on our Blog

Still need help? Contact Us »