Brands and retailers seeking to increase conversion and revenue must constantly innovate to win and retain online shoppers. Hibbett, a leader in omnichannel shopping experiences, has built a culture of innovation and iteration around customer experience, making it easier for shoppers to find and buy the products that are right for them.
In a recent webinar, we were joined by Bill Quinn, SVP of Marketing and Digital at Hibbett, who shared best practices for A/B testing new on-site features – from planning to execution. Also in conversation were Avi Zuck, Director of Customer Success at Syte, and Dana Zachor, Enterprise Success Manager at Syte.
Watch the webinar on-demand here.
Identifying Points of Friction
It all begins with understanding what needs testing and where. Bill explains that usability software can highlight where customers are getting stuck or “rage-clicking.” Hibbett also runs live usability testing with customers and conducts UX audits to identify points of friction in the purchase journey.
Customer service interactions are another source of feedback about the shopping journey on your site, as are post-purchase surveys. There’s also the “eat what you cook” approach, as Bill puts it: every time you introduce a new feature, have your team try it out and experience it themselves, which helps uncover issues quickly.
Two Main Dimensions of Any A/B Test
Perceived benefit and level of effort are the two critical dimensions to weigh before running an A/B test. Benefit might be measured in terms of incremental sales and cost savings; effort covers the time investment and resources needed to build and run the test.
Hibbett aims for tests that are high in benefit and low in effort before running a split test. Stakeholders from each department – designers, developers, and marketers – help weigh one against the other.
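To make this concrete, here is a minimal sketch of how a benefit/effort prioritization could be scored in code. The test ideas, the 1–5 scoring scale, and the sort order are hypothetical illustrations, not Hibbett’s actual process.

```python
from dataclasses import dataclass

@dataclass
class TestIdea:
    name: str
    benefit: int  # 1 (low) to 5 (high): expected lift in sales or cost savings
    effort: int   # 1 (low) to 5 (high): design, development, and analysis time

# Hypothetical backlog of test ideas, for illustration only.
ideas = [
    TestIdea("Visual search on product pages", benefit=5, effort=4),
    TestIdea("Sticky add-to-cart button", benefit=3, effort=1),
    TestIdea("Reordered checkout fields", benefit=2, effort=2),
]

# "High benefit, low effort" first: sort by benefit descending, then effort ascending.
for idea in sorted(ideas, key=lambda i: (-i.benefit, i.effort)):
    print(f"{idea.name}: benefit={idea.benefit}, effort={idea.effort}")
```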
Diving Into the Metrics
It’s important to look at metrics, but don’t fall into analysis paralysis. If a test doesn’t work, use every metric you can to understand why. If it does work, understand why and use the metrics to tell that story.
“It’s kind of a trap to have too many metrics. The metrics can tell a story, so it’s good to look at more than one. But at the end of the day, there has to be one metric you’re looking at that says you’re winning or not winning. For us, it’s revenue per session,” explains Bill.
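As a simple illustration of that single deciding metric, here is a minimal sketch of computing revenue per session for each variant. It assumes per-variant session counts and revenue totals are available from your analytics tool; all figures are made up.

```python
def revenue_per_session(total_revenue: float, sessions: int) -> float:
    """Revenue per session, guarding against division by zero."""
    return total_revenue / sessions if sessions else 0.0

control = revenue_per_session(total_revenue=182_000.0, sessions=50_000)
variant = revenue_per_session(total_revenue=191_500.0, sessions=50_000)

lift = (variant - control) / control
print(f"Control: ${control:.2f}/session, Variant: ${variant:.2f}/session, lift: {lift:+.1%}")
```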
A/B Testing Timing
At Hibbett, an A/B test typically runs for two weeks, particularly when the change being tested is subtle. A 50/50 traffic split naturally collects a large amount of data quickly.
Higher-risk split tests can run longer, upwards of six weeks, because they expose a smaller group to the change: traffic is split at 85% control and 15% test.
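The arithmetic behind those timelines is straightforward: the test runs until the smaller arm of the split reaches the sample size you need. Here is a minimal sketch; the daily-traffic figure and required sample size per arm are hypothetical, and in practice the sample size would come from a proper power calculation.

```python
import math

def days_to_reach(sample_per_arm: int, daily_sessions: int, test_share: float) -> int:
    """Days until the smaller arm of the split reaches the required sample size."""
    smaller_share = min(test_share, 1.0 - test_share)
    return math.ceil(sample_per_arm / (daily_sessions * smaller_share))

DAILY_SESSIONS = 40_000    # hypothetical site traffic
SAMPLE_PER_ARM = 250_000   # hypothetical output of a power calculation

print(days_to_reach(SAMPLE_PER_ARM, DAILY_SESSIONS, test_share=0.50))  # 13 days (~2 weeks)
print(days_to_reach(SAMPLE_PER_ARM, DAILY_SESSIONS, test_share=0.15))  # 42 days (~6 weeks)
```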
Best Practices
Keep these three tips in mind when it comes to implementing new technologies on your site and testing them:
- Understand the financials upfront before investing, and determine your break-even point in terms of revenue and profit (see the sketch after this list).
- Survey your customers to see if a new technology you’re implementing is something they need.
- Look at best practices outside of your industry to see how a specific service was executed.
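On the first tip, a break-even calculation can be as simple as the sketch below. The annual cost, gross margin, and average order value are all hypothetical figures, not numbers from the webinar.

```python
# Break-even point for a new on-site technology, under assumed figures.
ANNUAL_COST = 120_000.0   # license plus implementation (hypothetical)
GROSS_MARGIN = 0.35       # profit retained per incremental revenue dollar (hypothetical)
AVG_ORDER_VALUE = 80.0    # hypothetical

# Incremental revenue at which the margin it generates covers the annual cost.
break_even_revenue = ANNUAL_COST / GROSS_MARGIN
break_even_orders = break_even_revenue / AVG_ORDER_VALUE

print(f"Break-even: ${break_even_revenue:,.0f} in incremental revenue "
      f"(about {break_even_orders:,.0f} orders per year)")
```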
In Summary
You can’t lose with A/B testing: there’s always something to learn when you’re constantly testing, iterating, and improving. “In the end, A/B testing is a win-win situation. You’re either performing better, you’ve learned something, or you have another hypothesis,” says Avi.
Watch the webinar on-demand here.