About the Author: Klaus Enzenhofer

Klaus is a Senior Strategist in the Center of Excellence at Compuware APM, where he influences the strategic direction and development of application performance management solutions. He has deep experience gleaned from years of developing and running large-scale web and mobile applications for online businesses.

What to do if A/B testing fails to improve conversions?

A/B and multivariate testing are often used to improve the conversion funnel. These tools randomly serve alternative images, text, or other design elements to visitors and gather statistics about how each variation affects behavior. Companies have had great success with such solutions, but sometimes multiple rounds of testing still produce inconclusive data: changing the color or image on a page has no significant impact on the overall conversion rate.
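To make the mechanics concrete, here is a minimal TypeScript sketch of what such a tool does under the hood: deterministically bucketing each visitor into a variant and tallying conversions per variant. The hashing scheme and record shapes are illustrative assumptions, not any particular product's implementation.

```typescript
type Variant = "A" | "B";

// Deterministically bucket a visitor so they see the same variant on every visit.
// (Illustrative hash; real tools use more robust bucketing.)
function assignVariant(visitorId: string): Variant {
  let hash = 0;
  for (const ch of visitorId) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return hash % 2 === 0 ? "A" : "B";
}

// Tally visits and conversions per variant so the rates can be compared later.
const tallies: Record<Variant, { visitors: number; converted: number }> = {
  A: { visitors: 0, converted: 0 },
  B: { visitors: 0, converted: 0 },
};

function recordVisit(visitorId: string, didConvert: boolean): void {
  const variant = assignVariant(visitorId);
  tallies[variant].visitors += 1;
  if (didConvert) tallies[variant].converted += 1;
}
```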

Does this mean there is no way for that business to improve the conversion funnel? In these cases, marketing and other business stakeholders often think, “We have chosen the wrong images. Let’s try some more!” and a new test cycle is started. But I think we should consider other factors, specifically how the responsiveness and performance of these pages affect the end-user experience. I will explain how to do this in four steps:

Step 1: Visualize the conversion funnel

First, we need to select the pages we are interested in and chart how many visitors reach each of them, as shown in the screenshot below.

Conversion Funnel Dashboard
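As a rough illustration of what sits behind such a dashboard, the sketch below counts distinct visitors per funnel step from raw page-view records. The step names mirror the funnel in the screenshot; the record fields are assumptions.

```typescript
interface PageView {
  visitorId: string;
  page: string; // e.g. "search", "login", "booking"
}

const funnelSteps = ["search", "login", "booking"];

function funnelCounts(views: PageView[]): Map<string, number> {
  // Collect the distinct visitors seen on each funnel step.
  const visitorsPerStep = new Map<string, Set<string>>();
  for (const step of funnelSteps) {
    visitorsPerStep.set(step, new Set());
  }
  for (const view of views) {
    visitorsPerStep.get(view.page)?.add(view.visitorId); // ignore non-funnel pages
  }
  // Reduce each step's visitor set to a count for charting.
  const counts = new Map<string, number>();
  for (const step of funnelSteps) {
    counts.set(step, visitorsPerStep.get(step)!.size);
  }
  return counts;
}
```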


Step 2: Measure the abandonment rate

As the screenshot below shows, the most visited page is the search page, which is expected: customers run more searches than logins or bookings. But does that fully explain the drop between the number of searches and the number of logins? Or are some visitors abandoning the site? Next, we need to create a chart showing how many customers abandoned the funnel at each step, shown as the “Exit page count” below.

Conversion Funnel with Exit Page Dashboard
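A hedged sketch of how an exit page count can be derived: treat the last page of each visit as the point of abandonment. The field names, and the simplification that one visitor equals one visit, are assumptions for illustration.

```typescript
interface VisitPage {
  visitorId: string;
  page: string;
  timestamp: number; // milliseconds since epoch
}

function exitPageCounts(views: VisitPage[]): Map<string, number> {
  // Find each visitor's final page view.
  const lastPage = new Map<string, VisitPage>();
  for (const v of views) {
    const prev = lastPage.get(v.visitorId);
    if (!prev || v.timestamp > prev.timestamp) {
      lastPage.set(v.visitorId, v);
    }
  }
  // Count how many visits ended on each page.
  const counts = new Map<string, number>();
  for (const v of lastPage.values()) {
    counts.set(v.page, (counts.get(v.page) ?? 0) + 1);
  }
  return counts;
}
```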

Step 3: See if end-user experience impacts conversions

Now we know that visitors are leaving on certain pages, but we are still guessing why. This is where A/B testing solutions usually come in: we assume it has something to do with the images. But could poor user experience and a lack of page responsiveness be the real issue? The Apdex score is a standard used to quantify this aspect of user experience: a unified value between 0 and 1, where 1 reflects a superior experience. So let’s test this hypothesis with a User Experience Management solution that captures Apdex scores and see whether poor user experience explains the low conversion rates. The screenshot below shows the Apdex score for each step of the funnel. Now we know that the poor user experience on the last two steps contributes to the abandonment.

Conversion Funnel with User Experience Dashboard
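The Apdex calculation itself is simple and worth seeing once. Given a target response time T, responses at or under T count as satisfied, responses up to 4T as tolerating, and anything slower as frustrated; the score is (satisfied + tolerating / 2) / total. The sketch below implements exactly that; the sample numbers are invented.

```typescript
// Compute an Apdex score from observed response times and a target time T.
function apdex(responseTimesMs: number[], targetMs: number): number {
  let satisfied = 0;
  let tolerating = 0;
  for (const t of responseTimesMs) {
    if (t <= targetMs) satisfied += 1;            // fast enough: satisfied
    else if (t <= 4 * targetMs) tolerating += 1;  // sluggish: tolerating
    // anything slower than 4 * targetMs counts as frustrated
  }
  return (satisfied + tolerating / 2) / responseTimesMs.length;
}

// Example: with a 2-second target, this mix of fast and slow pages scores 0.7.
console.log(apdex([800, 1500, 3000, 9000, 1200], 2000)); // 0.7
```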

Step 4: Find the root cause of the user experience issue

Now we know that changing an image will not solve our users’ problem. We need to find out what is really making the experience so bad: is it slow response times, or are there errors on these pages?

Because we have a User Experience Management solution in place, we can distinguish between an error, as shown in this screenshot on the last step of the conversion funnel, and a slow response, as on the penultimate step.

Impact Analysis Conversion Funnel
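To show the distinction the impact analysis draws, here is a minimal classifier that splits funnel visits into error-driven and slowness-driven abandonment. The threshold and field names are illustrative assumptions, not the product's actual data model.

```typescript
interface FunnelVisit {
  page: string;
  responseTimeMs: number;
  hadJavaScriptError: boolean;
}

type RootCause = "error" | "slow" | "ok";

// Errors take precedence: a broken page drives users away regardless of speed.
function classify(visit: FunnelVisit, slowThresholdMs = 3000): RootCause {
  if (visit.hadJavaScriptError) return "error";
  if (visit.responseTimeMs > slowThresholdMs) return "slow";
  return "ok";
}
```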

To fix the problem with our conversion rate, we need to bring in the developers and give them the data they need. Because we have wisely chosen a solution that captures deep-dive information, including the full visit, JavaScript errors, click paths, browser version, and more, the developers have everything they need to fix the problem the first time, without having to re-create the issue or wait for it to happen again.

If you’re looking for more information about the actionable data provided by monitoring solutions, please have a look at my previous blog, “Fact Finders: Sorting out the truth in Real User Monitoring.”
