Website Testing Wins: Out of Sight Only Equals Out of Mind

By Peter Borden

June 25, 2012

Okay, stay with me for this one. With a series title like “Website Testing Wins,” I realize you expect each installment to showcase a test that came out on top. The last thing you'd think I'd write about is a test that failed, right?

But as I mentioned last week, you can learn from your testing losses, just as you do from your wins. For this reason, marketers like Sam Decker have advised companies to fail faster; when you know what doesn't work, you can better focus your future efforts to improve faster, too.

For example, let's take a look at a test an online retailer conducted to determine the impact of simplifying its category pages. The hypothesis was that hiding the product quickview function would improve KPIs like conversion rate and new customer acquisition.

While the head-to-head test did produce a small improvement in bounce rate, it ultimately failed to make a positive impact on any other KPI. With a fairly neutral response to the proposed page change, the online retailer now knows that simply hiding product quickview isn't enough to drive performance on these pages.
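To put a "fairly neutral" verdict like this on firmer footing, here's a minimal sketch of how a marketer might check whether a small KPI difference between two page versions is statistically meaningful. It uses a standard two-proportion z-test with entirely hypothetical visit and conversion counts; the retailer's actual numbers aren't given in this post.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a = conv_a / n_a                            # control conversion rate
    p_b = conv_b / n_b                            # variant conversion rate
    p_pool = (conv_a + conv_b) / (n_a + n_b)      # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_b - p_a, z, p_value

# Hypothetical 50/50 split: 10,000 visits per arm, conversions counted per arm
diff, z, p = two_proportion_z_test(conv_a=312, n_a=10_000, conv_b=324, n_b=10_000)
print(f"lift: {diff:+.4f}, z = {z:.2f}, p = {p:.3f}")
```

With numbers like these, the p-value lands well above 0.05, which is exactly the kind of result that earns a test the "fairly neutral" label: the change, by itself, didn't move the needle.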

Further, the retailer might build on this insight by running follow-up tests to see whether refining the quickview presentation itself could enhance value and lift response. Or, the marketer might give new thought to any proposed tests oriented around hiding other page elements.

In the end, yes, the test is a failure. But the information gained, in our book, is most certainly a testing win.

Peter Borden is a former marketing strategist at Monetate and was responsible for PPC and email marketing strategy. Peter is also an expert on the psychology of persuasion, influence, and conversion, and an active iOS developer.
