It’s highly likely that you’re not A/B testing as much as you should be — and leaving performance gains on the table.

While the Don Drapers of years past may have relied on gut instinct to make important marketing decisions, today’s marketers have a much more reliable tool at their disposal: data. With access to countless data points — particularly in digital settings — there’s essentially no reason to make decisions that aren’t backed by evidence. 

One of the leading methods for gathering data is A/B testing, a methodology commonly employed by marketers and UX designers to better understand consumer actions. At its most basic level, A/B testing is a way to compare two versions of any piece of digital collateral to determine which performs better. It is essentially a simple scientific experiment: you establish a control and a variation, and then make observations.

Even as web analytics becomes more sophisticated, A/B testing is still one of the most important tools in any marketing team’s repertoire. The question is — are you using it effectively to drive results that matter? 

How A/B Testing Works 

An A/B test starts with an element — on your site or simply part of your campaign — that you would like to evaluate for its effectiveness. This could be as simple as an email subject line or the size of a subscribe button. Then determine how you would like to measure the variable’s performance — via conversions, impressions, click-through rate, time on page, etc.

To run the test, you’ll randomly assign one set of users to the control and another to the variant. After a set amount of time, you’ll determine which version moved your chosen metric most significantly.
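Once the test has run, "determining which variant won" usually comes down to a significance test on the two conversion rates. As a rough sketch (the function name and traffic numbers here are hypothetical, and real testing tools use more sophisticated statistics), a two-proportion z-test looks like this:

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare conversion rates of control (A) and variant (B)."""
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    # Pooled conversion rate under the null hypothesis of "no difference"
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical traffic: 1,000 users per arm, 50 vs. 68 conversions
z, p = two_proportion_z_test(50, 1000, 68, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

Note that in this made-up example the variant converts noticeably better (6.8% vs. 5%), yet the p-value still hovers near the conventional 0.05 cutoff — a reminder of why sample size matters so much.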

Of course, this summary is a bit of an oversimplification. You’ll also need to determine the sample size required to achieve statistical significance, and you will likely need to run slightly more complex tests. For instance, while it may seem logical to test a large vs. small button first and a red vs. blue button second, that sequential approach may prevent you from discovering that a small red button outperforms all other combinations.

But because running more than two tests at a time requires a larger sample size, it also takes longer to get statistically significant results — which is why patience and persistence are two hallmarks of A/B testing.
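The sample-size question above can be estimated before the test even starts. A common back-of-the-envelope formula (sketched below with a hypothetical function name, assuming the standard 95% confidence / 80% power z-scores) shows why small expected lifts demand large audiences:

```python
from math import ceil

def sample_size_per_variant(p_base, min_lift, z_alpha=1.96, z_beta=0.84):
    """Approximate users needed per arm to detect an absolute lift
    of `min_lift` over baseline rate `p_base`, at ~95% confidence
    and ~80% power (hence the default z-scores)."""
    p_var = p_base + min_lift
    # Sum of Bernoulli variances for the two arms
    variance = p_base * (1 - p_base) + p_var * (1 - p_var)
    n = ((z_alpha + z_beta) ** 2) * variance / (min_lift ** 2)
    return ceil(n)

# Hypothetical: 5% baseline conversion, hoping to detect a 1-point lift
n = sample_size_per_variant(0.05, 0.01)
print(n)
```

With these assumed numbers, each arm needs on the order of eight thousand users — which is exactly why multi-variant tests, which split traffic further, take so much longer to reach significance.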

Why Is A/B Testing So Important? 

Most marketers and UX designers are aware that A/B testing is a powerful tool — but they still may not be utilizing it to its full potential. Virtually anything — not just email subject lines or CTAs — can be A/B tested and optimized for better performance. Neglecting to test can mean missing out on game-changing insights. 

Here are three tips to keep in mind that will help you make your marketing team’s A/B tests more effective: 

1. There are almost always performance gains to be made.

When it comes to their websites and digital marketing efforts, many companies tend to “set it and forget it.” That’s understandable — after weeks or months perfecting a display ad or a new homepage, the last thing most teams want to do is continue to tinker with their creation even after it goes live. 

That’s why testing should be a mindset, not a one-off activity; for the most skilled marketers, testing isn’t optional, it’s expected. Everything from email subject lines to CTA forms, images, fonts, colors, and layout can be tested and optimized for greater engagement and, ultimately, more conversions.

2. You may discover something unexpected that defies best practices.

Testing is such a powerful tool precisely because there’s no definitive guide for what will or won’t perform well. A/B testing is frequently surprising — if it weren’t, it wouldn’t be useful. It’s not unheard of for an outdated homepage that breaks every UX rule in the book to outperform a sleek, visually appealing update. 

Every company is different, and so are their audiences. What works in most cases won’t work in all cases. But you won’t know what’s going to win over a given audience until you test.

3. Something that used to work may not work anymore.

Testing is a continual process. Even if a feature on your website previously won out in A/B tests, consumer preferences may have shifted, and there’s no guarantee that it will continue to provide the same results indefinitely. 

It’s vital to repeatedly test in different permutations to ensure that your website, emails, or digital ads remain optimized over time. For lasting results, assume that the best combination of elements is constantly changing, and test continuously to keep up. 

How to Get Started with A/B Testing

Even marketers who are sold on the value of A/B testing may not be sure where to begin. Generally speaking, there are three paths companies take when starting an A/B test: testing manually in-house, using software such as Optimizely, or finding an agency partner.

Choosing the right path for your business depends largely on the amount of time and resources you’re willing and able to dedicate to A/B testing — as well as the kind of results you wish to see. While it’s certainly possible to see substantial results from manual testing, it requires an intensive investment of time and personnel. A software solution or an agency partner, while more expensive upfront, demands less effort from your team and tends to produce faster, more reliable outcomes.

But no matter which testing method you choose, one thing is certain — you’ll be better off with A/B tests than without.

Author: Madeline Killen

Before graduating from Dartmouth College, Madeline studied English and Italian literature, edited the arts section of the campus newspaper, worked as an Italian tutor, and completed a senior honors thesis on Emily Dickinson. At L&T, she’s translated her passion for language — no pun intended — into success in her role writing and managing social media for clients across multiple industries. When she’s not at work, you’ll find her reading, running, or enjoying the perks of moving back to civilization after four years in the New Hampshire wilderness (even though she does miss the trees).
