Not too long ago, we got a phone call from HOLZ Racing. It took us a minute to get over the excitement of driving a decked-out UTV, but once we did, we were able to drill into their problem. Although HOLZ had been around since 1995 and had awesome products and a loyal customer following, online sales still hadn’t kicked into high gear.
It didn’t take long to figure out why. Their website was difficult to navigate, and while it had all the features an ecommerce site needs, the design was messy and confusing.
While the problem was easy for our team to see, HOLZ wasn’t convinced. So we set out to prove it to them.
How to Decide What to Split Test
Our first step was to decide what the goal of the website was. In this case, it was easy: we wanted to sell more products online.
Next, we had to build a list of all the areas where we could improve the site to meet this goal. I’m not talking about arbitrarily disliking a color scheme. This is about creating a list of ideas, each based on a clear hypothesis. We created a testing document for this very purpose. It went something like this:
Ecommerce Test Idea: Update the homepage design to focus on the most popular products and not on the image gallery and/or industry news.
Conversion Rate Testing Hypothesis: Users visiting the website are looking for racing products. By showcasing the most popular products, they’re more likely to find something they’re interested in. This will also remove at least one step in the buying process by allowing customers to quickly identify a potential product and move into the checkout phase. Combined, this will drive more customers to buy and increase the conversion rate.
After a quick brainstorm meeting, we came up with about 25 areas that could have an impact. The next step was deciding what to test first, so we sat down and rated each test based on what we thought would have the highest impact.
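If you’re wondering what “rating” looks like in practice, the exact formula matters less than being consistent. A common approach (a reasonable stand-in for what we did, though our actual spreadsheet isn’t shown here) is an ICE score, where each idea gets 1–10 ratings for impact, confidence, and ease. Here’s a minimal Python sketch with made-up ideas and ratings:

```python
# ICE-style prioritization sketch -- idea names and ratings are illustrative.
test_ideas = [
    {"name": "Focus homepage on popular products", "impact": 8, "confidence": 6, "ease": 5},
    {"name": "Emphasize the 'Add to Cart' button", "impact": 9, "confidence": 8, "ease": 9},
    {"name": "Simplify the checkout form", "impact": 7, "confidence": 5, "ease": 4},
]

# Multiply the three ratings into a single sortable score.
for idea in test_ideas:
    idea["score"] = idea["impact"] * idea["confidence"] * idea["ease"]

# Highest score first -- that's the test to run next.
for idea in sorted(test_ideas, key=lambda i: i["score"], reverse=True):
    print(f"{idea['score']:>4}  {idea['name']}")
```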
Choosing an A/B Test
One test came out on top: button color. It sounds simple enough, but button color was a major piece of the puzzle, and you’ll see why.
The more complex version of the test idea was this: “Update the product detail page in a way that drives more attention to the ‘Add to Cart’ button.” We had the following hypothesis: “If we update the product detail page to provide a clearer focus on the ‘Add to Cart’ button, it will decrease friction and reduce the amount of time customers spend deciding whether to buy. This will increase how quickly customers add an item to their cart and what percentage of visitors complete checkout.”
With this in mind, our design team went to work. We know you’re curious what the original design looked like, so for reference, here it is:
[Screenshot: the original product detail page]
After our design team was done, the new variation looked like this:
[Screenshot: the redesigned product detail page]
As you can see, we didn’t redesign the whole website. What we did was tweak the layout of the page to increase focus on the primary call to action. Here are the key changes we made:
- Increased the size of the product image, so customers could better see what they were getting
- Made the “Add to Cart” button larger and changed the color; the new contrast color and button size helped it pop off the page
- Moved the product price directly below the button; this forced users to look past the button to find the price
- Moved the product description down the page and away from the button, removing it as a distraction; this also created negative space around the button, which established a focal point
This updated site design was launched as an A/B test using Optimizely. We split the traffic down the middle, with 50% of visitors seeing each version (there’s a quick sketch of how a split like that works after the list below). We then created tracking for the two goals we wanted to monitor:
- Add to Cart: the number of customers who added a product to their shopping cart.
- Purchases: the number of customers who made a purchase.
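Optimizely handles the traffic split for you, so there’s nothing to build here. But if you’re curious how a 50/50 split typically works under the hood, here’s a minimal Python sketch (a common hashing approach, not Optimizely’s actual code; the IDs are made up). Hashing the visitor ID guarantees the same person sees the same version on every visit:

```python
import hashlib

def assign_variation(visitor_id: str, experiment_id: str) -> str:
    """Deterministically bucket a visitor into 'original' or 'variation'."""
    # Hash visitor + experiment so the assignment is stable across
    # page views and independent across experiments.
    digest = hashlib.md5(f"{experiment_id}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a number from 0 to 99
    return "original" if bucket < 50 else "variation"

# The same visitor always lands in the same bucket.
print(assign_variation("visitor-123", "add-to-cart-test"))
print(assign_variation("visitor-123", "add-to-cart-test"))
```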
Split Test Results
We ran the test for 14 days and reached a statistical significance rating of 92% (that basically means there’s about a 92% chance the difference between versions is real and not random noise; there’s a quick sketch of the math after the results below). At the end of the day, we rokked it. From just one test, we saw a huge improvement to the company’s bottom line. Here are the results:
- The new design had a 60% increase in the number of customers who added a product to their cart.
- Total purchases increased by 44%.
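If you want to sanity-check numbers like these yourself, the math is simpler than it sounds. The sketch below uses made-up visitor and conversion counts (our actual traffic numbers aren’t in this post, and the 92% above came from Optimizely’s reporting, not this code) to show how relative lift and a two-proportion z-test confidence are computed:

```python
from math import erf, sqrt

# Hypothetical counts for illustration -- not our actual test data.
control_visitors, control_conversions = 2000, 100  # 5.0% conversion
variant_visitors, variant_conversions = 2000, 130  # 6.5% conversion

p_control = control_conversions / control_visitors
p_variant = variant_conversions / variant_visitors

# Relative lift: how much better the new design converted than the original.
lift = (p_variant - p_control) / p_control
print(f"Relative lift: {lift:.0%}")  # 30%

# Two-proportion z-test: how confident can we be that the lift is real?
p_pooled = (control_conversions + variant_conversions) / (control_visitors + variant_visitors)
se = sqrt(p_pooled * (1 - p_pooled) * (1 / control_visitors + 1 / variant_visitors))
z = (p_variant - p_control) / se

# One-sided confidence that the variation truly beats the original.
confidence = 0.5 * (1 + erf(z / sqrt(2)))
print(f"Significance: {confidence:.0%}")  # ~98% for these made-up numbers
```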
That’s a big win for any business, but getting it in just 14 days makes it even better. The best part about conversion rate optimization and split testing is that you get clear results.
With the results in, we knew the updated design performed better and should be rolled out to 100% of customers. After that, we were able to head back to our testing document and identify the next test.
Want to learn more about launching successful tests? Get a free copy of our Excel testing document! Download it for free here.
If you have questions about testing or digital marketing, please feel free to email us, give us a call, or post a comment below.
Until next time, we hope you have a Rokkin day.