This is the third post in our Analytics to Action series, where we outline the ways you can gain insight from everyday interactions with your customers. Thus far we’ve covered what goes into creating a data-driven marketing organization, and how you can derive customer insights from search behavior.
As the series progresses, we’ll be diving into more ways you can learn from interactions with your customers. One tool you may not currently be using for customer insight is A/B testing. When it comes to preferences, actions often speak louder than words. A/B testing relies on customer action to determine which ads, email copy, or landing pages are the most effective. The typical testing ethos is about finding what works and doing more of it, rather than asking why it works and carrying those lessons into other aspects of your marketing.
Right now, you’re probably using A/B testing as a conversion optimization tool (and if you aren’t, you should be), but A/B tests can, and should, be much more.
Thinking Beyond Conversion Optimization
Here’s a typical A/B test:
This test asks which display ad will convert better: the blue or the orange? Tests like this can provide an immediate payoff in the form of increased conversions and higher return on ad spend. But that’s about it. With a test like this you can increase your clicks, conversions, and ultimately revenue, but you haven’t learned anything about your customer that you can apply to other aspects of your marketing. Maybe your audience preferred the blue over the orange, but there’s no guarantee that color preference will hold in your next ad campaign, and even if it did, a color preference is hardly a meaningful customer insight.
An insight-focused A/B test might look something like this:
A test like this doesn’t focus on aesthetics, but rather on messaging. This test hopes to discern which benefit, reach or ROI, is most important to the customer. You can then use that insight to inform your next ad campaign, email marketing messages, and any other marketing collateral that highlights the benefits of your product or service.
Insight-focused tests like this are only effective if the two messages you’re testing represent fundamentally different benefits and aren’t slight variations in phrasing around the same proposition.
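To make the outcome of an insight-focused test like this concrete, here is a minimal sketch of how you might tally the results per message variant and compare conversion rates. The variant names ("reach" and "roi") and all the numbers are hypothetical illustrations, not data from the test above:

```python
# Minimal sketch: compare conversion rates for two message variants.
# Variant names and figures are hypothetical.

def conversion_rate(conversions, impressions):
    """Fraction of impressions that converted."""
    return conversions / impressions

# Hypothetical results from an insight-focused ad test.
results = {
    "reach": {"impressions": 10_000, "conversions": 180},
    "roi":   {"impressions": 10_000, "conversions": 240},
}

for variant, data in results.items():
    rate = conversion_rate(data["conversions"], data["impressions"])
    print(f"{variant}: {rate:.2%}")

# Which message the audience responded to more strongly.
winner = max(
    results,
    key=lambda v: conversion_rate(
        results[v]["conversions"], results[v]["impressions"]
    ),
)
print("Leading message:", winner)
```

The takeaway isn’t just the winning ad; it’s the winning *benefit*, which you can carry into emails, landing pages, and other collateral.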
More Opportunities for Insight
Many email marketers test subject lines to optimize email open rates. As with ad copy, this data is well worth having, but its benefits are limited when the test turns on specific phrasing rather than a stark distinction between benefits.
In addition to testing variations on subject lines, you can also test the body of your emails. Typically, these tests hope to optimize click-through and conversion rates. Here, there is likely to be more room for insight-focused experimentation.
If you’re a devotee of testing, you’re probably already using A/B or multivariate testing to optimize your landing pages. Here there are countless opportunities for running effective tests that maximize your conversions: which copy to include, whether to feature an image, where to place the call-to-action?
When it comes to landing pages, the significant real estate offers ample opportunity to test a variety of different messages and combinations of messaging. Make sure you don’t limit your tests to button placement or calls to action.
From Insight to Optimization
The two goals of A/B testing, conversion optimization and insight gathering, needn’t be mutually exclusive. You can learn about your customers while optimizing for conversions.
When running your A/B tests, it’s often effective to move from far-reaching, more conceptual tests that tell you something about your customers toward pure optimization plays. For example, in the display ad example above, it would be prudent to test the messaging variations first, then move on to simpler changes like colors and calls to action.
As with any A/B test, make sure you aren’t testing more than one variation at a time, so you can properly attribute any difference. If you were to test tagline one in orange against tagline two in blue, you wouldn’t be able to discern which change was responsible for the performance difference: color or copy? Before setting up an A/B test, clearly define your goals and set strict expectations for what you hope to get out of it.
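Once a single-variable test has run, you still need to check that the difference you see is real rather than noise. A common approach is a two-proportion z-test on the conversion rates; here is a minimal sketch using Python’s standard library, with hypothetical numbers:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z-statistic for the difference between two conversion rates.

    conv_* are conversion counts, n_* are impression (sample) counts.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical results: tagline one vs. tagline two, everything else identical.
z = two_proportion_z(180, 10_000, 240, 10_000)
print(f"z = {z:.2f}")
# A rough rule of thumb: |z| > 1.96 is significant at the 95% level.
```

Because only the tagline differed between the variants, a significant z-statistic can be attributed to the copy rather than to an incidental change like color.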