How to Use A/B Testing to Increase Conversions and Engagement

A/B testing is the best way to know what works and what doesn’t when optimizing your app and website. Here are tips from Aaron Glazer, CEO of Taplytics, on how to run A/B tests that give you the evidence you need to confidently make changes with maximum effect.

_____

If there’s one thing that’s true of mobile apps and websites, it’s that there is always room for improvement and a near-endless number of ways to optimize. A/B testing can help guide those decisions and remove some of the guesswork about which changes will yield the biggest impact. With A/B testing, decisions are informed by data, which keeps you from going too far down a costly path with changes that deliver minimal ROI.

When you know what resonates with your users, you can create better experiences and drive conversions—and what marketer doesn’t want to do that? While A/B testing used to sit with a company’s more technical team members, it has largely moved into the marketing purview over the past decade as testing technology has simplified. Marketers who can go beyond the most basic A/B tests will see more users, more engaged users and a quicker path to their long-term goals.

What Is A/B Testing?

A/B testing works by taking two identical versions of your site or app and making one change to the design or experience to see how it impacts user behavior. These changes could entail anything from replacing a hero image, to updating copy, to reconfiguring the layout of your site or swapping the color of a sign-up button. Half of your audience is shown variation A, the control, which is what users would normally experience, while the other half is shown variation B, the version you hope will accomplish your pre-set goal. After letting the test run for a couple of weeks, you can use the data to see which variation performed better. A key benefit is that you’re able to understand which version users prefer before committing to a permanent change.
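To make the mechanics concrete, here is a minimal sketch in Python of how a 50/50 split might be implemented, assuming each user has a stable ID. The function and experiment names are hypothetical illustrations, not any particular platform’s API; in practice, a testing tool handles this assignment for you.

import hashlib

def assign_variant(user_id: str, experiment: str) -> str:
    # Hash the user ID together with the experiment name so each user
    # lands in the same bucket on every visit, independently of their
    # buckets in other experiments.
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100      # a stable number from 0 to 99
    return "A" if bucket < 50 else "B"  # even 50/50 split

# Hypothetical usage: the same user always sees the same variation.
print(assign_variant("user-123", "signup-button-color"))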

Traditionally, A/B testing was primarily relegated to ads and websites, but it’s now just as prevalent in the mobile world. With 69.4% of internet users using shopping apps on their mobile devices and 69% of users preferring to do product research on their phones, it’s critical to test and optimize your users’ and customers’ cross-channel experiences. The next channel being added to the mix is over-the-top (OTT) streaming, where tests can be applied to tvOS, Android TV/Fire TV, Roku and other smart TV apps.

For some of the most successful organizations, A/B testing goes far beyond the surface. Take Booking.com, which has built an entire culture around A/B testing and experimentation, where everyone has carte blanche to test without management’s authorization. It’s been reported that Booking.com runs more than 1,000 concurrent tests, and presumably more than 25,000 tests a year. This testing culture has catapulted Booking.com from a small startup to an online accommodation juggernaut.

At Netflix, the culture of experimentation runs just as deep and has been imperative to its continued transformation and success. The company has frequently blogged about its fervent A/B testing practices, pointing out that it tests every product change, which has led it to completely overhaul its UI layout and launch a personalized homepage. It even A/B tests most movie title images, sometimes leading to a 20-30% uptick in viewing.

What You Need To Know About Setting up A/B Tests

The first step to running an A/B test is to determine what element you’re going to experiment with. It’s critical to only make one change at a time when running an experiment. Everything on your site should remain the same aside from the element you’re testing. It varies by goal, but examples of testing elements include the copy for a promotion you’re running, the color of a call-to-action (CTA) button and the layout of a page. By only testing one element at a time, you’ll know with confidence that the specific variable you changed is what’s driving more (or fewer) conversions.
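As an illustration, a single-variable experiment can be described with just a control, one variant and a goal metric. The structure below is a hypothetical sketch in Python, not any specific tool’s configuration format; note that the two versions differ in exactly one field:

experiment = {
    "name": "cta-button-color",
    "goal": "sign-up conversions",
    "control": {"cta_color": "#2D7FF9"},  # variation A: the current design
    "variant": {"cta_color": "#27AE60"},  # variation B: one change only
}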

A/B tests should run for about two weeks to gather enough data. It’s important that the two variations of the page are tested at the same time, and that the control and test groups are evenly and randomly divided, for accurate results. If you run version A for two weeks and then version B two weeks later, external factors such as the time of year could skew the test results.

Once the two-week period is complete, analyze the results to see which variation performed better. The winning variation can then be made permanent, and what you’ve learned can guide future A/B tests.
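“Performed better” should mean more than a raw difference in conversion rates; you want to know the gap is larger than random noise. One common way to check this is a two-proportion z-test. The sketch below, with made-up conversion numbers, shows the idea in Python:

from math import sqrt
from statistics import NormalDist

def z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    # Compare the two conversion rates against the pooled rate to see
    # whether the observed lift is larger than chance would explain.
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided test
    return p_b - p_a, p_value

# Hypothetical results after two weeks: 10,000 users per variation.
lift, p = z_test(conv_a=480, n_a=10_000, conv_b=540, n_b=10_000)
print(f"lift: {lift:.2%}, p-value: {p:.3f}")  # adopt B only if p is small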

Examples of Impactful Tests

Every A/B test should begin with devising a hypothesis and setting a succinct goal. Some of the more common goals are to improve user engagement with a website or app, drive conversions and see how users react to new features. The sky really is the limit, though. If you want to reduce drop-off during onboarding or direct users toward a different CTA within your app, you can build tests around those goals as well.

For example, Chick-fil-A, one of our Taplytics users, was looking to optimize its mobile app’s payment flow. The app had a strong in-app payment system, but the layout of the payment options confused customers, many of whom didn’t realize they could pay with a credit card. Chick-fil-A received countless customer complaints about this misconception and needed to figure out how to remedy the confusion. The Chick-fil-A mobile team used A/B tests to identify a payment flow that better highlighted the credit card option, increasing credit card orders by 6% and eliminating incoming calls related to mobile payments.

Talkspace, a pioneer in online therapy, also found success through A/B testing. The company A/B tested its onboarding flow to see whether reducing the number of steps in the process would increase conversions, hypothesizing that cutting the length of an animation in half would reduce friction in onboarding. The test proved the hypothesis true and led to a 60% increase in conversions.

Feeding Results Back into the System

Based on the A/B testing results, you can keep the winning variation and remove the losing one. Use your learnings from the test to determine other areas within your website or app to test and to better understand what resonates with your customers. If a certain CTA or piece of imagery has proven to elicit a positive response, you can apply it in other areas to enhance the user experience.

A/B testing allows you to create a personalized experience for your users, to continuously learn from them and to bring them closer to your brand. There are endless elements to A/B test, and the more tests you run, the more informed and successful you’ll be as a marketer.

Aaron Glazer

Aaron Glazer is the CEO and co-founder of Taplytics, the most comprehensive feature management and experimentation platform for the modern enterprise. Digital leaders like Grubhub, Chick-fil-A, and RBC Royal Bank use Taplytics to create a competitive advantage. Aaron grew Taplytics from a couple of friends in a basement to a Y-Combinator-backed powerhouse. Before Taplytics, Aaron was a strategy consultant at ZS Associates and also worked in a management consulting role at Accenture.
