A/B testing: how it helped Obama’s campaign and how it can help improve your product

Obama’s 2008 presidential campaign was famous for turning down public financing and raising its funds online. One of the strategies used to maximise the amount of money raised was A/B testing. This strategy consists in randomly presenting one of two (or more) variations of a product (a webpage in this case) to each user, in order to measure which one “converts” (performs) best. Once there is enough data to decide, the best-performing version is set as the only one.
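To make the idea concrete, here is a minimal sketch (not the campaign’s actual code) of what an A/B split can look like on a webpage: the visitor is randomly assigned to variation A or B, the choice is remembered in a cookie so they always see the same version, and sign-ups are counted against the variation that produced them. The element ids and the sendConversion() call are placeholders for whatever markup and analytics you actually use.

    function getVariation() {
      var match = document.cookie.match(/ab_variation=([AB])/);
      if (match) {
        return match[1];                                  // returning visitor: keep the same version
      }
      var variation = Math.random() < 0.5 ? 'A' : 'B';    // 50/50 random split for new visitors
      document.cookie = 'ab_variation=' + variation + '; path=/; max-age=2592000';
      return variation;
    }

    var variation = getVariation();

    // Show only the page version that matches the assignment.
    document.getElementById('splash-' + variation).style.display = 'block';

    // Record a conversion for that variation when the visitor signs up.
    document.getElementById('signup-form').addEventListener('submit', function () {
      sendConversion(variation);                          // placeholder for your analytics call
    });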

You might wonder why this is useful and think that you are able to choose which version works better for the user. The reality is that you can’t know: you don’t think and choose the way your users do. A data-driven approach is usually a better guide than intuition.

Let’s look at the experiment run for the Obama 2008 campaign: two versions of the page, one with Obama in the crowd, the other with his family. Before continuing past the images, try to guess which one resulted in more sign-ups.

The result of the experiment: the version on the right performed 40% better. Unexpected, and for such a big campaign it meant raising around $60M more.

More examples from a book on the subject. Netflix used A/B testing as well, discovering that a homepage with continuous scrolling of titles performed better than one presenting 4 fixed titles. Another e-commerce company increased its revenue by 15% just by simplifying the checkout page and making some sub-fields appear only after the user’s selection.

Essential for startups

Lots of startups use this approach. There are many examples in this book where (if I remember well after many years) the author talks about his own startup, an online chat with 3D avatars, which he continuously iterated entirely based on user testing and feedback. I worked for a startup myself for over a year, and I can confirm that blindly developing prototypes based on our assumptions never led us in the right direction.

Two more examples from my experience

I implemented an A/B test in 2010 for a company’s voucher website. We tested which position of the sign-up box collected more emails. Surprisingly, a box in the left sidebar collected twice as many emails as a bigger banner in the middle of the page. We never understood why; maybe websites at the time normally presented it there as well. It didn’t really matter, though: the users chose for us.

I also used this strategy for a website I developed to compare product features. In the comparison tables (e.g. an Asus ZenBook comparison) I tested for a few weeks which text on the orange button resulted in more clicks. Unexpectedly, a “More info” button resulted in 3 times more clicks (see below), which also increased revenue. I learnt that after a product comparison the user is often still not ready to make the purchase, but prefers to learn more first.

[Images: the “More info” button, which performed 3 times better, and the initial version]
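How do you know a difference like that is real and there is “enough data to decide”, rather than just noise? One common check is a two-proportion z-test. The sketch below is illustrative only: the visitor and click counts are made-up placeholders, not the real numbers from my test.

    // Two-proportion z-test: is variation B's click-through rate really
    // higher than variation A's, or could the difference be chance?
    function zScore(clicksA, visitorsA, clicksB, visitorsB) {
      var pA = clicksA / visitorsA;
      var pB = clicksB / visitorsB;
      var pooled = (clicksA + clicksB) / (visitorsA + visitorsB);
      var standardError = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
      return (pB - pA) / standardError;
    }

    // Made-up example numbers: 5,000 visitors saw each button.
    var z = zScore(120, 5000, 360, 5000);
    // Above ~1.96 the difference is significant at roughly the 95% level.
    console.log(z > 1.96 ? 'Declare a winner' : 'Keep collecting data');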

How to implement an A/B test (technical)

Google Analytics has an Experiments page under “Behaviour”, where you can set up your experiments simply by specifying the variation pages. Alternatively, you can manually write the JavaScript conditional code that modifies the page depending on the result of a function (provided by Google Analytics) that tells you which variation was chosen for the visitor. That’s what I did in my second example.
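For the manual route, the conditional code can be quite small. Below is a rough sketch of what I mean, assuming the Content Experiments JavaScript API is loaded on the page via //www.google-analytics.com/cx/api.js?experiment=YOUR_EXPERIMENT_ID (the experiment ID, the selector and the button text here are placeholders, not my exact code):

    // Ask Google Analytics which variation this visitor should see.
    // chooseVariation() returns 0 for the original page and 1, 2, ...
    // for the variations, and keeps the choice stable across visits.
    var chosenVariation = cxApi.chooseVariation();

    document.addEventListener('DOMContentLoaded', function () {
      // Variation 1: change the orange button's text to "More info",
      // leaving everything else on the page untouched.
      if (chosenVariation === 1) {
        var button = document.querySelector('.compare-button');
        if (button) {
          button.textContent = 'More info';
        }
      }
    });

Google Analytics then splits the traffic between the variations and reports how each one performs against the objective you defined for the experiment.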

Another tool I’ve heard about is Optimizely. I’ve never used it yet, but I’ve heard good things about it and I’ll have a look at it sooner or later.

 
