In this Insivia Insight, Patrick talks about the process of A/B testing and how it can be used to increase conversions on your website. A/B testing is a surefire way to identify strengths and weaknesses in your digital marketing strategy.

In a nutshell, A/B testing is a process used to determine the optimal design, content, or functional elements on a website. But A/B testing is not limited to websites. Other applications include:

  • Newsletters
  • Emails
  • Landing pages
  • Homepages
  • Menu design
  • Video/digital media content
  • Conversion/contact forms & popups

The process is quite simple. First, you choose a “sample,” or number of intended recipients. A sampling error occurs when the sample is too small, producing results that don’t represent the population as a whole. There is no magic number, but a good rule of thumb is to keep your sample size at around 100 or more.

Another common mistake is trying to test multiple variables simultaneously. A true A/B test compares exactly two versions that differ in a single variable; adding a third version (or a second variable) simply muddles the results, because you can no longer tell which change caused the difference. If there are multiple items that need to be evaluated or optimized, test them independently of one another.

Once you have your sample size and variable chosen, the next step is to collect data. Here’s a hypothetical example:

Janet, a marketing manager at the law offices of Evans, Smith, & Wessen, is interested in finding out whether increasing the size of a call-to-action button on her homepage will lead to more clicks. Her colleague, Randy, argues that “common sense dictates that making a button bigger will entice more people to click on it.”

Janet is wise enough to know that it’s flawed logic to base decisions purely on assumptions, so she decides to split the next 100 site visitors into two groups:

  • Group A (50 people) will view the original site
  • Group B (50 people) will see the alternate site with a larger call to action button
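A split like Janet’s can be sketched in a few lines of code. This is a minimal illustration, not a production traffic splitter: the visitor IDs are hypothetical, and the seeded-hash approach is just one common way to make assignment deterministic, so a returning visitor always sees the same version.

```python
import random

def assign_group(visitor_id, seed=42):
    """Deterministically assign a visitor to group 'A' or 'B'.

    Seeding a generator with the visitor ID means the same visitor
    always lands in the same group on repeat visits.
    """
    rng = random.Random(f"{seed}:{visitor_id}")
    return "A" if rng.random() < 0.5 else "B"

# Split 100 hypothetical visitors into the two groups.
groups = {"A": [], "B": []}
for visitor_id in range(100):
    groups[assign_group(visitor_id)].append(visitor_id)

print(len(groups["A"]), len(groups["B"]))
```

Note that a random split like this lands *roughly* 50/50 rather than exactly 25/25/…; real testing tools typically alternate or bucket visitors to guarantee equal group sizes.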

After collecting the data, Janet discovered that Randy’s assumption was wrong. Group B, the one with the larger button, actually clicked it less often. Furthermore, the bounce rate for that group increased by 12%. As a result of the test, Janet concluded that she shouldn’t change the button size, and started exploring other ideas, such as changing the button’s shape or color.
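Before acting on a result like Janet’s, it’s worth checking whether the difference could just be noise. A standard way to do that is a two-proportion z-test; the click counts below are made up purely for illustration, since the example above doesn’t give exact numbers.

```python
from math import sqrt

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """z-statistic for the difference between two click-through rates."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    # Pooled rate under the assumption that both versions perform the same.
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: 12 of 50 clicked the original, 7 of 50 clicked the variant.
z = two_proportion_z(12, 50, 7, 50)
print(round(z, 2))  # |z| below about 1.96 means not significant at the 95% level
```

With these made-up numbers the gap isn’t statistically significant, which is exactly why small samples are risky: a 50-person group can show a “winner” that evaporates with more traffic.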

Why go through all this trouble? Technically, you don’t have to. We handle A/B testing as part of our conversion optimization services, so you don’t have to worry about the nitty-gritty details. Identifying the elements of a website that increase your conversions is important, but as the example above illustrates, learning what decreases conversions is equally (if not more) important.