AB testing for higher conversion

Written by Team e-Village

Email offers enormous opportunities for increasing your conversion, but only if you make optimal use of every mailing. That is why it’s a good idea to test your email campaigns with A/B testing before sending them out. Testing always yields interesting information, and the more you test, the better you get to know your target market. The result? Effective email campaigns with a higher email ROI. Read all about A/B testing with our omnichannel marketing automation software Clang, right here.


Clang offers several options for A/B testing. For instance, you can use the ‘Split-Run’ feature: you select several mailing variants, and your addressees are divided evenly across them. After sending, your statistics show which mailing performed best.
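As a rough sketch of the idea behind a Split-Run, here is how an even division of addressees could look in Python. The function name and the round-robin split are our own illustration, not Clang’s actual implementation:

```python
def split_run(addressees, n_variants):
    """Divide the addressees evenly across n mailing variants (round-robin).

    Illustrative sketch only; Clang's real splitting logic is internal.
    """
    return [addressees[i::n_variants] for i in range(n_variants)]
```

With 10 addressees and 3 variants this yields groups of 4, 3 and 3, so every addressee receives exactly one variant.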


This makes Split-Run testing enormously valuable. But there is another way to apply A/B testing: ‘Smart-Run’. This testing mode works similarly to Split-Run: with a Smart-Run, you also send several variants to your target market, and this form of testing also shows statistical results.

The advantage of Smart-Run

With a Split-Run, your addressees are divided evenly across the various mailings. Smart-Run works a bit differently: you select a small batch of your chosen target market to run a quick ‘test’ on, based on a percentage.

Figure 1: Smart-Run settings

For ‘Sample size’ you indicate the percentage of total addressees you want to run the A/B test on.

After a set amount of time, Clang will check which mailing variant got the best results. This top variant is then sent out to the rest of the addressees.
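The Smart-Run flow described above can be sketched in a few lines of Python. Everything here (the function name, the round-robin split, the score dictionary) is our own illustration of the concept, not Clang’s internal code:

```python
def smart_run(addressees, variant_scores, sample_pct):
    """Illustrative Smart-Run: test on a sample, send the winner to the rest."""
    # 'Sample size': the percentage of total addressees used for the A/B test.
    sample_size = len(addressees) * sample_pct // 100
    sample = addressees[:sample_size]
    remainder = addressees[sample_size:]
    # The sample is split evenly across the variants, as in a Split-Run.
    variants = list(variant_scores)
    test_batches = {v: sample[i::len(variants)] for i, v in enumerate(variants)}
    # After the test window, the variant with the best result on the chosen
    # metric (CTR, clicks, CTO or conversion) wins...
    winner = max(variant_scores, key=variant_scores.get)
    # ...and is sent to everyone who was not part of the test.
    return winner, test_batches, remainder
```

For example, with 100 addressees, a sample size of 20% and two variants, each variant is tested on 10 addressees, after which the better-scoring variant goes to the remaining 80.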

Smart-Run vs. Split-Run

The biggest advantage of Smart-Run over Split-Run is that Smart-Run runs completely automatically. By A/B testing a small percentage of your total target market beforehand, you can run highly effective tests during your email campaign.

You can indicate in the test what you want to test for. This could be:

  • Click Through Rate (unique clicks / emails delivered)
  • Number of clicks
  • Click To Open (unique clicks / unique opens)
  • Conversion

Click Through Rate (CTR)

The CTR is measured by dividing the number of unique clicks by the number of emails delivered. The number of clicks records the total number of clicks the email receives, including when someone clicks the same link several times.
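The calculation above can be written out as a one-line helper (a minimal sketch; the function name is ours):

```python
def click_through_rate(unique_clicks, delivered):
    # CTR = unique clicks / emails delivered
    return unique_clicks / delivered if delivered else 0.0
```

For example, 150 unique clicks on 5,000 delivered emails gives a CTR of 3%.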

When you apply A/B testing to the Click Through Rate or the number of clicks, design a test scenario around content: for instance, CTA buttons in different colours, different ratios of text to images, or variations in the colours and shapes used in your emails.

Click To Open (CTO)

The CTO is measured by dividing the number of unique clicks by the email’s number of unique opens. When you test the number of opens, you can vary the subject line, sender name or snippet (pre-header). After all, these are the first things the email recipient sees. If the goal of the test is to generate more opens, it’s best to vary these three elements.
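Analogous to the CTR, the CTO calculation looks like this (again a minimal sketch with a function name of our own choosing):

```python
def click_to_open(unique_clicks, unique_opens):
    # CTO = unique clicks / unique opens
    return unique_clicks / unique_opens if unique_opens else 0.0
```

For example, 150 unique clicks on 1,200 unique opens gives a CTO of 12.5%.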

Figure 2: Sender name, subject line and snippet

The subject line can be tested with different keywords, use of symbols, personalisation, etc.
If you want to test sender names, you could for instance compare a masculine and a feminine name to see what works best for your target market. If recipients have a regular contact person at your organisation, you can include that name in the sender name instead of just the company name; this can make the email more personal and attractive.
The snippet can be tested in basically the same way as the subject line: you can use symbols, a salutation, or various keywords.


Conversion

The most important goal of the email is to generate the highest possible conversion. If you want to use A/B testing to measure conversion, you will be testing content relevance, originality, personalisation, etc. Conversion results often differ per organisation or per campaign. Conversion is an often-used test goal, but you could also use the number of registrations for an event, downloads, or sign-ups as conversion results.

Winning content

The winning content is automatically sent to the rest of the addressees that were not part of the test.


The report below shows the sending statistics and which content came out on top.

Figure 3: A/B-test statistics

In this example, I use two different subject lines, and I want to find out which one does best with my target market. Because I am testing two subject lines, I indicate during planning that I want to test for ‘Click To Open’ (CTO).

Version 2 has a higher ‘Click To Open’, so the rest of the target market receives version 2 in their inbox. An interesting insight:

The word ‘discount’ remains one of the best incentives for opening your email ☺.

Would you like to know more?

If you’re looking for more help using the Smart-Run feature, please contact our Campaign Services department.

Don't want to miss a blog?

Then sign up for our newsletter!

  • By subscribing to our newsletter, you agree to e-Village's Privacy Statement and to your data being used for marketing purposes. You can unsubscribe at any time.
