How to approach Klaviyo Flow A/B testing

A/B testing is central to optimising email marketing automation: it provides a controlled, measurable process for testing new strategies against existing ones.


Typically, we work with five main types of A/B testing:

A/B test type | Metric to improve | When to use this test
Old/New content | Improve placed order & click rates | To test a new strategy on an existing flow, e.g. when implementing a new content strategy for an under-performing flow
Subject/Preview | Improve open rates | On every email in a new flow (for best deliverability), and on underperforming individual flow emails
Content/CTA | Improve click & conversion rates | To test content structure and different wording for the same overall CTA
Plain text vs. designed | Improve open, click & conversion rates | On accounts where account markers suggest designed emails are going into spam, or when the email content could benefit from a more personal feel
Time delay | Improve open, click & conversion rates | On accounts where account markers suggest the email content is not aligned with customer urgency

Old/New A/B Testing

Whenever we build new content for a flow and the client has existing content, we will A/B test our content against theirs. If the client has sufficient traffic volume, we may also run a Subject/Preview A/B test at the same time.


A flow with an Old/New A/B test typically starts with a random sample split and has two distinct branches that do not join back together unless the flow requires a completion tag.
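Klaviyo's random sample split handles the 50/50 assignment inside the flow builder, but the underlying idea can be sketched as a deterministic hash-based split. This is a hypothetical helper for illustration, not a Klaviyo API:

```python
import hashlib

def assign_branch(contact_id: str, flow_id: str = "old-new-test") -> str:
    """Deterministically assign a contact to the Old or New branch.

    Hashing the contact with the flow keeps each contact on the same
    branch if they are re-evaluated; across many contacts the split
    converges on 50/50.
    """
    digest = hashlib.sha256(f"{flow_id}:{contact_id}".encode()).hexdigest()
    return "new" if int(digest, 16) % 2 == 0 else "old"
```

Because the assignment is derived from a hash rather than a coin flip, re-running it for the same contact always yields the same branch, which mirrors why the two branches never need to rejoin.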


The example below includes the aforementioned Subject/Preview tests running in conjunction with the Old/New A/B test.


While open rate is a useful indicator, the content varies between these emails, so winners are chosen by Placed Order rate first and Click Rate second.
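That ordering can be expressed as a simple ranking rule. The sketch below assumes hypothetical per-branch totals copied out of Klaviyo's analytics; the field names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class BranchStats:
    name: str
    recipients: int
    clicks: int
    placed_orders: int

    @property
    def placed_order_rate(self) -> float:
        return self.placed_orders / self.recipients if self.recipients else 0.0

    @property
    def click_rate(self) -> float:
        return self.clicks / self.recipients if self.recipients else 0.0

def pick_winner(a: BranchStats, b: BranchStats) -> BranchStats:
    # Rank by Placed Order rate first; Click Rate breaks ties.
    return max((a, b), key=lambda s: (s.placed_order_rate, s.click_rate))
```

For example, a branch with a slightly lower click rate still wins if its placed-order rate is higher, which matches the priority described above.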


We look for an overall better-performing trend. On the better-performing side of the flow, don't be surprised to see emails further down the flow performing worse than their counterparts on the opposite side: more conversions have already occurred earlier, so the remaining contacts are less engaged. However, any outlying emails performing well on the "losing" side should be considered for a content A/B test.


Screenshot the "All-Time" data (or a sufficiently long time frame) and compile it in a Google Doc for later reference. Example.


Once the winning split has been established, make sure it is situated on the left side of the flow, that 100% of traffic is diverted to it, and that the losing side's emails are switched off.


We won't delete any content in this flow until we've compiled additional learnings to implement the next stage of A/B testing.


Watch the video below for a full walk-through of Old/New A/B testing.


Subject/Preview A/B Testing

The most common type of A/B testing we perform is Subject/Preview across flows and campaigns. Here, we'll focus on the applications in flows.


The only optimisation goal with Subject/Preview A/B testing is improving open rates, so establishing winners is straightforward.


A typical Subject/Preview A/B test flow will look like this, with multiple variations on each email and no random sample splits. Other types of content splits may be present.


To choose the winning variations, click through to the A/B testing data in Analytics and ensure the data sample is set to All-time. Typically a total sample of 100 or more recipients is required to find a clear winner. However, some examples may be obvious with a smaller sample, some may need a larger sample, and some may remain neck-and-neck regardless.
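Whether a difference is "clear" at a given sample size can be sanity-checked with a rough two-proportion z-test on the open rates. This is an illustrative statistic, not Klaviyo's internal method:

```python
from math import sqrt, erf

def open_rate_winner(opens_a: int, n_a: int, opens_b: int, n_b: int,
                     confidence: float = 0.90):
    """Return "A", "B", or None when the open-rate gap is not significant.

    Uses a pooled two-proportion z-test; `confidence` is the minimum
    confidence level required to declare a winner.
    """
    p_a, p_b = opens_a / n_a, opens_b / n_b
    p_pool = (opens_a + opens_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    if se == 0:
        return None
    z = (p_a - p_b) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return None if p_value > 1 - confidence else ("A" if p_a > p_b else "B")
```

With a total sample of 100 (50 per variation), a 60% vs 36% open-rate split comes back as a clear winner, while 50% vs 48% does not, matching the intuition that small gaps need larger samples.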


Example of a clear winner with a circa 100 sample:


Example of a clear winner with a smaller sample:


Example where a clear winner is unlikely to be established:


All of these screenshots should be compiled into a document for later reference, including the test end date. Example.


Once the winning variations have been established, we can safely delete the losing variations as we have the Subject/Preview variations in our copy document and the A/B testing screenshots for future reference.