A/B testing

A/B testing is an experimental procedure that is particularly well suited to optimizing the performance of landing pages. Its aim is to compare the performance of different landing page variants in order to identify the better version. During an A/B test, incoming traffic is split automatically: part of the visitor flow is diverted from the original landing page A to a slightly modified landing page B, which differs from the original in only one feature. A subsequent analysis of visitor behavior then shows whether the change to the landing page has a positive impact on the relevant KPIs.

Visitors are split between the different versions randomly and without their knowledge. Finally, statistical analyses are used to evaluate which version contributes more efficiently to achieving the target.
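The random split described above can be sketched in a few lines. This is a minimal illustration, not the implementation of any particular testing tool; the function name, the visitor ID (e.g. a cookie value) and the 50/50 split are assumptions. Hashing the ID instead of drawing a fresh random number ensures that returning visitors always see the same variant:

```python
import hashlib

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    """Deterministically assign a visitor to a test variant.

    Hashing the visitor ID (e.g. a cookie value) rather than calling a
    random generator on every request keeps the assignment stable for
    returning visitors while still splitting traffic evenly overall.
    """
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same visitor always lands on the same variant:
print(assign_variant("visitor-42") == assign_variant("visitor-42"))  # True
```

Real testing tools add weighting, exclusion rules and session handling on top, but the core bucketing mechanism is essentially this.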

Multivariate testing is a special form of this procedure. It can be used to determine which combination performs best when several elements are changed at the same time. For example, if you want to compare two variants of an image and two variants of a headline, you could do this with two A/B test runs or with one multivariate test (see illustration). In the A/B test, variant A (current image) would be compared with variant B (image alternative) in the first run, and the better-performing variant would be kept. In the second run, the current headline would then be compared with the headline of variant C. In a multivariate test, four versions are created directly from all element combinations and compared against each other. Whether such a procedure makes sense in an individual case depends largely on the traffic of the landing page: because of the large number of variants, multivariate tests require significantly more users to deliver meaningful results.
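The combinatorics behind the image-and-headline example can be made concrete with a short sketch. The element names are illustrative; the point is that a multivariate test creates one version per combination, so the number of versions multiplies with every element tested:

```python
from itertools import product

# Hypothetical landing page elements with two variants each:
images = ["current image", "image alternative"]
headlines = ["current headline", "new headline"]

# A multivariate test creates one version per element combination.
versions = list(product(images, headlines))
print(len(versions))  # 2 x 2 = 4 versions

# Adding a third element with two variants would double this to 8,
# which is why multivariate tests need so much more traffic.
```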

The advantage of A/B testing over before-and-after testing is that both variants are tested simultaneously, which minimizes the risk of distorted results. Conversion rates of landing pages can be influenced, for example, by whether traffic arrives on a working day or at the weekend, in the morning or the afternoon, or in good or bad weather. A/B tests eliminate these kinds of disruptive factors. However, to obtain truly valid results, three further factors must be taken into account.


1. Change only ONE variable (unless you are running a multivariate test)

The change in the conversion rate must be clearly attributable to a changed factor. If variant A and B differ in several variables, it is no longer possible to draw any concrete conclusions.

2. Let the test run long enough

The test must be active long enough. Jörg Dennis Krüger, Managing Director of Conversionboosting, recommends a test period of at least 14 days in order to test at least twice on each day of the week. Otherwise, it cannot be ruled out that the winning version will only deliver better results at a certain point in time.

3. The sample size must be large enough

Krüger advises a minimum of 100 conversions per landing page variant within the two-week test phase. This makes it clear that multivariate tests in particular only make sense with high traffic, as many different landing page versions are tested at the same time. However, the 100 conversions per version is a minimum; the validity of the results naturally increases with the amount of data evaluated.
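Krüger's rule of thumb is easy to encode as a sanity check before reading anything into test results. This is a hypothetical helper, not part of any testing tool, and the threshold of 100 is the minimum cited above:

```python
def has_enough_conversions(conversions_per_variant, minimum=100):
    """Check Krüger's rule of thumb: at least `minimum` conversions
    per variant within the test phase before trusting the results."""
    return all(c >= minimum for c in conversions_per_variant)

# Variant B has only 95 conversions, so the test is not yet conclusive:
print(has_enough_conversions([120, 95]))  # False
```

For a multivariate test with four versions, every one of the four buckets would need to clear the threshold, which illustrates why the traffic requirement grows so quickly.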


Conclusion on A/B testing

In general, test results should only be relied upon if a significant change in the conversion rate has been achieved. Significant here means statistically reliable: you can be at least 95% sure that the result is not due to pure chance but to the changes made to the landing page. How quickly this 95% threshold is reached depends on the traffic, the initial conversion rate and the size of the effect of the changes; it can take anywhere from a few days to several weeks. For websites with low traffic, it is therefore advisable to test only landing pages with comparatively high visitor numbers.
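One common way to check the 95% threshold (a standard two-proportion z-test, not necessarily the exact method your testing tool uses) can be sketched as follows; the visitor and conversion counts in the example are purely illustrative:

```python
from math import sqrt, erf

def confidence(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test for two conversion rates.

    Returns 1 - p-value, i.e. how confident we can be that the
    variants genuinely differ rather than varying by chance.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return 1 - p_value

# 100 vs. 130 conversions out of 5,000 visitors per variant:
print(confidence(100, 5000, 130, 5000) > 0.95)  # True: significant at 95%
```

With a smaller difference or less traffic, the confidence drops below 95% and the test would simply need to keep running.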

There are various tools for carrying out and evaluating A/B tests. Optimizely, AB-Tasty and VWO are among the most widely used professional options, and Google also offers a free alternative with "Google Optimize". For paid tools, prices are usually based on the number of monthly users on the test pages and on other factors such as contract duration and the scope of consulting (from self-service through consulting to full-service solutions). Before choosing a tool, it is therefore important to determine which services can be provided internally and which should be purchased.

The requirements for good landing pages are of course high, because just as offline, first impressions count. In Part IV, the last part of our blog series on conversion rate optimization, you will find practical tips on how to optimize your landing page and achieve your goals.
