Curated by: Luigi Canali De Rossi
 


Wednesday, May 2, 2007

Usability Testing Approaches: A Review Of Split A/B Testing


If you are serious about doing business with your online presence, whether you run advertising on your pages or sell products and services online, it is essential to make sure that your visitors do not get lost when trying to find where to click to order, and that your contextual ads are placed in the best and most-clicked position on the page.

Photo credit: bluestock

In all of these cases the very best way to separate what works from what does not is to run some serious A/B testing on each and every one of your critical items.

A/B usability testing approaches give you the opportunity not simply to correct or change something that doesn't work as well as expected; they offer you a unique advantage: pitting alternative solutions against each other to see which one works best, while measuring its numerical impact relative to the others.

This is in fact the same road that I have taken here at Robin Good's Media Network over the course of the last two years to improve the effectiveness and impact of the contextual advertising on our content pages.

Thanks to a dedicated specialist I have hired to oversee these ongoing tests, the results we have enjoyed have been nothing short of spectacular: in some cases we were able to identify small, apparently irrelevant changes that generated very tangible increases in the click-through rate of our ads.

Unfortunately, most such discoveries are not universal laws that can be applied to each and every site, due to significant differences in site architecture, content, overall page layout and the type of audience reading it.

So, to be effective and to make significant progress in identifying where and how you can greatly improve the monetization and interaction opportunities offered by your site, you need to set up serious, well thought-out tests, give each test abundant time to collect sufficient data, and remain unemotional about the truths and discoveries you will make.

As you will find out for yourself, your readers may have drastically different opinions from yours about where ads should appear and how your pages should be navigated, but unless you go out and verify like a scientist you will not find the holy grail.

A/B usability testing is a must-do activity for any serious online publisher who expects to create sustainable and growing revenue from their online publishing efforts.

In this article, Lisa Halabi, head of usability at Webcredible and a highly accomplished usability specialist, takes you through the basics of split A/B testing procedures, goals and management, introducing you to a must-know topic for any serious web entrepreneur.

Split A/B testing

by Lisa Halabi

Any website usability study usually uncovers a number of usability problems. There is often debate within an organisation as to the best solution for each problem, with no one really knowing the optimal answer. Rather than letting the person who shouts the loudest get his or her own way, a better approach can be to test two solutions in a live environment: whichever performs best is clearly the superior solution. Welcome to split A/B testing!



Buridan's ass and A/B testing


Have you heard the story of the donkey standing between two equally appealing stacks of hay? He spent so long trying to decide which one to start eating first that the poor animal starved to death without having moved a single inch. This paradox, first discussed by Aristotle, is known as "Buridan's ass" and is an often-discussed psychological phenomenon.

If you're managing a website you might face similar situations when you need to decide which of two different designs to opt for. Nowadays, you need to continuously improve and evolve your site by making small, frequent adjustments.

But how do you know which change will have the highest impact on the customer experience?

Split A/B testing is a way of finding out which changes help your users' performance. It provides a controlled method of measuring the effectiveness (or not) of alterations to your site. It's often used for small tweaks (e.g. "Is this heading style clearer than the original?") but can also be used to test bigger, wholesale changes (e.g. "Is this new 1-click checkout process better?"). In essence, it involves running two different versions side by side to see which is more effective.



A typical A/B scenario


You feel your users might not be finding the 'Proceed to checkout' button hidden below the fold of the page and think this might be causing people to leave your site before making a purchase. You've created a new design which you feel is more effective but you'd like to know for sure.

To A/B test your new page, you serve the regular page to, say, 90% of your visitors as usual, while a randomly selected 10% are shown your new design. Then you sit back, pour yourself an ice-cold gin and tonic, and watch your web statistics.

If your new page has the positive effect you suspect, you should see an improvement in the 10% group, as measured by conversion rates: proof that you should publish the more successful page to all visitors.
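
Under the hood, the split is usually just a stable random assignment of each visitor to one of the two versions. Here is a minimal sketch in Python of how a 90/10 assignment could be done; the visitor ID (for example a cookie value), the function name and the percentages are assumptions for illustration, not a description of any particular tool or of the method described in this article.

import hashlib

def assign_variant(visitor_id: str, new_design_share: float = 0.10) -> str:
    """Bucket a visitor into 'A' (current page) or 'B' (new design)."""
    # Hashing the visitor ID (e.g. a cookie value) keeps the assignment stable,
    # so the same person sees the same version on every page view.
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    bucket = (int(digest, 16) % 10_000) / 10_000.0  # pseudo-uniform value in [0, 1)
    return "B" if bucket < new_design_share else "A"

# Example: route an incoming request and log the variant alongside any purchase.
print(assign_variant("cookie-1234"))  # 'A' for roughly 90% of visitors, 'B' for 10%

Serving the page that matches the returned variant, and logging that variant next to each completed purchase, gives you the two conversion rates to compare.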



Advantages of split A/B testing


There are a number of benefits to A/B testing:

  • Low risk approach

  • Cheaper than other methods such as focus groups

  • Provides proof

  • Invisible to most of your users

  • Great way to 'test run' new designs before full roll-out (to avoid negative surprises on launch day)

  • Can solve internal disputes



Disadvantages of split A/B testing


A/B testing isn't always suitable. Some of its disadvantages include:

  • New designs might have to "wear in" before you can measure their real performance (visitors' initial response might be negative because they're used to the old solution)

  • You can only compare two versions with a single factor that differentiates the designs (for more factors / variations you need to deploy multivariate testing, which is more difficult to analyse)

  • It takes technical know-how to set up and analyse the results (a sketch of the analysis step follows this list)

  • You're testing in a live environment so external factors might have an impact on the outcome

  • If your site is really in bad shape, you'll still need to do a proper overhaul
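
To give a concrete idea of what "analysing the results" involves, here is a minimal sketch of one common approach, a two-proportion z-test that checks whether B's conversion rate is genuinely better than A's. The function name and the visit and conversion counts below are invented purely for illustration; they are not figures from this article.

from math import sqrt, erf

def two_proportion_z_test(conv_a, visits_a, conv_b, visits_b):
    """Return (lift, one-sided p-value) comparing conversion rates of A and B."""
    p_a, p_b = conv_a / visits_a, conv_b / visits_b
    pooled = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(pooled * (1 - pooled) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    # One-sided p-value: chance of seeing a lift this large if there were no real difference.
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))
    return p_b - p_a, p_value

# Invented example figures: 9,000 visits / 270 sales on A, 1,000 visits / 42 sales on B.
lift, p = two_proportion_z_test(conv_a=270, visits_a=9000, conv_b=42, visits_b=1000)
print(f"lift: {lift:.2%}, one-sided p-value: {p:.3f}")

A p-value below a chosen threshold (0.05 is a common, if arbitrary, choice) suggests the new design really is better rather than simply enjoying a lucky streak.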



In a nutshell

A/B testing is powerful stuff and a useful method for a quick comparison of two different designs. However, bear in mind that it's no substitute for getting proper feedback from your users. Only this will give you the whole picture straight from the ass's mouth.




Original article published in May 2007 as "Split A/B testing" by Lisa Halabi on Webcredible.

About the author

Lisa Halabi is the head of usability at Webcredible, an accomplished usability specialist, and a founding member of the Usability Professionals' Association UK chapter. She has over six years' experience consulting for blue-chip FTSE-100 companies both in the UK and abroad. She holds a degree in Ergonomics and a Masters in HCI.



Photo credits

Donkey sign: Harris Shiffman
Graph: Anton Gvozdikov
Arrow up: pablo631
Arrow down: Jafaris Mustafa

Robin Good - Lisa Halabi
Reference: Webcredible
 
 
 
Readers' Comments    
2007-05-02 14:34:25

John

I can't believe this is a headlining article? A/B testing is very basic - and effective, but its not rocket science. Just reviewing this old-school approach was less than impressive. A complete waste of time to read.



 