Last Updated: April 2023
A/B testing involves segmenting your audience to evaluate several creatives and determine which one is the most effective and efficient. Put simply, you can show Creative A to 50% of your audience and Creative B to the other 50% to see which one performs better.
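The 50/50 split described above can be sketched in a few lines of Python. This is an illustrative example, not part of the Kayzen platform: the `assign_variant` function and its seed are assumptions, and hashing the user ID (rather than flipping a coin per request) simply keeps each user in the same bucket across sessions.

```python
import random

def assign_variant(user_id: str, seed: int = 42) -> str:
    """Deterministically assign a user to Creative A or B (50/50 split).

    Illustrative sketch only; the function name and seed are assumptions,
    not part of the Kayzen platform.
    """
    rng = random.Random(f"{seed}:{user_id}")
    return "Creative A" if rng.random() < 0.5 else "Creative B"

# The same user always lands in the same bucket.
assert assign_variant("user-123") == assign_variant("user-123")
```

Because assignment depends only on the user ID and a fixed seed, a user who sees Creative A today will still see Creative A tomorrow, which keeps the two test groups clean.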
Add Creative Variations
To begin an A/B test, follow the steps below:
Select a campaign and click on "Edit".
Proceed to the Creatives section, and add the desired creative variations (such as Variation A, Variation B, and so on).
Once a new creative variation is added, Kayzen Creative Optimization redirects enough traffic to the new variation(s) to assess their performance.
It is recommended to run your A/B test for 7 to 14 days; this gives Kayzen enough time to collect the sample size needed for you to determine a winner.
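The reason a test needs to run this long is sample size. As a rough, hedged sketch (this is the standard normal-approximation formula for a two-proportion test, not a Kayzen-specific calculation), you can estimate how many impressions each variant needs to detect a given lift:

```python
import math

def sample_size_per_variant(p_base: float, delta: float,
                            z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Rough per-variant sample size for a two-proportion test.

    Uses the common approximation n ~ 2 * (z_alpha + z_beta)^2 * p*(1-p) / delta^2,
    where p_base is the baseline rate and delta is the smallest lift you want
    to detect. Defaults correspond to 95% confidence and 80% power.
    Illustrative only; actual requirements depend on your traffic and metrics.
    """
    p = p_base + delta / 2  # average rate across the two variants
    n = 2 * (z_alpha + z_beta) ** 2 * p * (1 - p) / delta ** 2
    return math.ceil(n)

# e.g. baseline CTR of 1.0%, smallest lift worth detecting 0.2 percentage points
n = sample_size_per_variant(0.01, 0.002)
print(f"~{n} impressions per variant")
```

Small baseline rates and small lifts push the required sample into the tens of thousands of impressions per variant, which is why a 7-to-14-day window is usually needed.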
In the Reports tab, you can generate a performance report and download it instantly from the system. You can also save reports for later viewing; saved reports are accessible from the Overview tab.
Please follow the steps below:
Click on “New Report”.
Select the “Filter By” options to filter the campaign/advertiser for which you would like to analyze the performance.
Select the “Group By” option, and make sure “Creative name” is checked in the “Group By” filter.
Finally, choose the metrics you would like to analyze in the report, such as Impressions, Win Rate, Clicks, and CTR.
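Grouping by creative name works the same way on a downloaded report. The sketch below assumes hypothetical report rows of `(creative_name, impressions, clicks)`; the column layout is an assumption for illustration, not the exact export format:

```python
from collections import defaultdict

# Hypothetical rows as they might appear in a downloaded report.
rows = [
    ("Variation A", 12000, 180),
    ("Variation B", 11800, 236),
    ("Variation A", 8000, 96),
    ("Variation B", 8200, 123),
]

# creative name -> [total impressions, total clicks]
totals = defaultdict(lambda: [0, 0])
for name, imps, clicks in rows:
    totals[name][0] += imps
    totals[name][1] += clicks

for name, (imps, clicks) in sorted(totals.items()):
    print(f"{name}: {imps} impressions, {clicks} clicks, CTR = {clicks / imps:.2%}")
```

Aggregating first and computing CTR from the totals (rather than averaging per-row CTRs) avoids skew from rows with very different impression counts.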
The Kayzen Dashboard gives you a quick summary of creative metrics, letting you monitor and compare performance without going through detailed reports. The “Performance by Creative” table displays all creative variations for the filtered campaign(s). You can easily add or remove performance metrics from the widget editor.
Once the test is complete and you know which creative fits your campaign best, you can keep the winning creative and remove the others from the campaign.
Statistical Significance Test
When conducting A/B testing experiments, statistical significance measures the probability that the observed difference in performance between versions of your creative is real rather than due to chance. For example, at a 95% significance level, you can be 95% confident that the observed difference is statistically meaningful. Before deciding which creative variation to adopt, verify that the differences you observe are statistically significant.
Statistical Significance Calculator
You can use the Statistical Significance Calculator to determine whether your findings are statistically significant by entering your metrics and testing at either a 95% or 90% significance level. For example, to compare the conversion rate of two creative variations, enter the click and install values for Variation A and Variation B. The calculator tells you whether the difference in performance between the two variations is a genuine outcome or simply due to chance.
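Under the hood, a calculator of this kind typically runs a two-proportion z-test. The sketch below is a minimal, standalone implementation of that standard test, not Kayzen's exact method; the example click and install counts are made up for illustration:

```python
import math

def z_test_two_proportions(clicks_a: int, installs_a: int,
                           clicks_b: int, installs_b: int):
    """Two-proportion z-test on conversion rate (installs / clicks).

    Returns the z statistic and a two-sided p-value using the pooled
    normal approximation. Illustrative sketch of the kind of test a
    significance calculator performs, not Kayzen's implementation.
    """
    p_a = installs_a / clicks_a
    p_b = installs_b / clicks_b
    p_pool = (installs_a + installs_b) / (clicks_a + clicks_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / clicks_a + 1 / clicks_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via the error function).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical inputs: Variation A converts 150/5000 clicks, B converts 200/5000.
z, p = z_test_two_proportions(5000, 150, 5000, 200)
print(f"z = {z:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Difference is significant at the 95% level")
```

A p-value below 0.05 corresponds to the 95% significance level mentioned above (below 0.10 for the 90% level): the smaller the p-value, the less likely the observed difference is due to chance.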