Big Data Introduction


Big Data applications - A/B testing





Another application is A/B testing. The idea of A/B testing is to compare two or more versions of the same thing and analyze which one works better. It not only tells us which version performs better, but also provides evidence of whether the difference between the versions is statistically significant or just due to chance.
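As a rough illustration of how users might be split between variants, here is a minimal Python sketch. The assign_variant helper and its names are hypothetical, not part of any particular platform; it simply hashes a user id so that each user lands in the same variant consistently and the split is roughly even.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a user to a variant by hashing the user id.

    The same user always gets the same variant for a given experiment,
    and the assignment is roughly uniform across variants.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Example: split users between variant A and variant B of an experiment
print(assign_variant("user-42", "buy-now-button-color"))
```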

Let us go back to the history of A/B testing. In the 1950s, scientists and medical researchers started conducting A/B tests in clinical trials to test drug efficacy. In the 1960s and 1970s, marketers adopted this strategy to evaluate direct response campaigns: for instance, would a postcard or a letter to target customers result in more sales? Later, with the advent of the World Wide Web, the software industry also adopted A/B testing to understand user preferences. So basically, a concept that was applied offline is now being applied online.

Let us now understand why A/B testing is so important. In real-world scenarios, companies might think that they know their customers well. For example, a company may anticipate that variation B of its website will be more effective at generating sales than variation A. But in reality, users rarely behave as expected, and variation A might lead to more sales. So, to understand their users better, companies rely on data-driven approaches: they analyse and understand user behaviour based on the user data they have. The more data they have, the fewer errors they make. This in turn contributes to reliable business decisions, which can lead to increased user engagement, improved user experience, higher revenue, staying ahead of competitors, and more.

Now let us see more clearly how it is done. Say we have an e-commerce website, and we want to see the impact of the colour of a product's Buy Now button. We can randomly select 50% of the users to be shown the button in a new colour (say blue), while the remaining users are shown the old (here, green) button. We observe that 23% of the people who were shown the new version of the button bought the product, whereas only 11% of the people who were shown the old button ended up buying it. Thus the newer version seems to be more effective.
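To check whether such a difference is statistically significant, one common approach is a chi-square test on the conversion counts. The sketch below assumes hypothetical group sizes of 1,000 users per variant (the percentages above do not specify the actual counts) and uses scipy; it is an illustration of the technique, not the exact analysis used in the example.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: 1,000 users saw each variant (the 50/50 split above).
# 23% of the blue-button group and 11% of the green-button group converted.
conversions_blue, users_blue = 230, 1000
conversions_green, users_green = 110, 1000

# 2x2 contingency table: [converted, did not convert] for each variant
table = [
    [conversions_blue, users_blue - conversions_blue],
    [conversions_green, users_green - conversions_green],
]

chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p-value = {p_value:.4f}")
# A small p-value (e.g. below 0.05) suggests the difference in conversion
# rates between the two button colours is unlikely to be due to chance.
```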

Let us have a look at how some A/B testing cases have led to more effective decisions in the real world. In 2009, a team at Google could not decide between two shades of blue, so they tested 41 different shades to decide which colour to use for advertisement links in Gmail. The company showed each shade of blue to 1% of users. A slightly purple shade of blue got the most clicks, resulting in a $200 million boost in ad revenue. Likewise, Netflix, Amazon, Microsoft and many others use A/B testing. At Amazon, we have many such experiments running all the time. Every feature is launched via A/B testing: it is first shown to, say, 1% of users, and if it performs well, the percentage is increased.

