We recently discovered that one of our new services had a New Visitor Bounce Rate in the high 60s, even after removing an advertisement that had pushed our bounce rate into the mid-80s. (We were running a very targeted ad campaign that was driving traffic but killing our quality metrics.)
On top of the poor bounce numbers, 95% of those new visitors were bouncing within 10 seconds. Clearly we weren't meeting visitor expectations and needed to fix it. So Justin and I sat down and came up with a good test to run. Being the lazy programmer I am, I was trying to find the simplest change possible…
We decided to run a simple split test comparing a static landing page with pricing on top against two versions featuring an auto-playing sales animation.
There is a great split testing system called Unbounce that lets you design and split test landing pages easily. It's perfect for the non-techie to test out theories and rapidly get to winning designs. But since we churn out fully programmed sites, we didn't really want to pay for a WYSIWYG system like that, so we quickly coded a solution ourselves.
On a user's first visit we assign them one of the home page variations we're testing. Since there are three pages in the test, we cookie the assigned page, and that's all the user will see as they navigate around the site.
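A minimal sketch of that first-visit assignment might look like the following. The cookie name and the variant labels are assumptions (the post doesn't name them), and the assignment logic takes the cookie string as a parameter so it can run outside a browser:

```javascript
// Hypothetical names -- the post doesn't give the real cookie or variant IDs.
const VARIANTS = ["static_pricing", "video_a", "video_b"];
const COOKIE_NAME = "home_variant";

// Read one cookie value out of a document.cookie-style string.
function readCookie(name, cookieString) {
  const match = cookieString.match(new RegExp("(?:^|; )" + name + "=([^;]*)"));
  return match ? decodeURIComponent(match[1]) : null;
}

// Returning visitors keep the variant they were cookied with;
// new visitors are bucketed at random.
function assignVariant(cookieString) {
  const existing = readCookie(COOKIE_NAME, cookieString);
  if (existing && VARIANTS.includes(existing)) {
    return { variant: existing, isNew: false };
  }
  const variant = VARIANTS[Math.floor(Math.random() * VARIANTS.length)];
  // In the browser you'd then persist it, e.g.:
  // document.cookie = COOKIE_NAME + "=" + variant + "; path=/; max-age=2592000";
  return { variant, isNew: true };
}
```

Sticking the assignment in a cookie is what keeps the experience consistent: every later page view reads the same value instead of re-rolling the dice.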
We set up Google Analytics to track this data via an overridden ga('send', 'pageview') call that pushes in the actual page version the user landed on. Now we can see all of the quality metrics (bounce rate, time on site, etc.) by version.
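That override can be sketched roughly as below. The virtual path format is an assumption, and the `ga` function is passed in as a parameter (in the real page it would be the global command queue created by the analytics.js snippet):

```javascript
// Instead of the default pageview, send a virtual path for the variant the
// user was assigned, so each version shows up as its own page in GA and
// bounce rate / time on site can be segmented by version.
function trackVariantPageview(ga, variant) {
  const virtualPath = "/home/" + variant; // assumed path scheme
  ga("send", "pageview", virtualPath);
}

// In the browser, after the standard analytics.js snippet has defined `ga`:
//   trackVariantPageview(window.ga, "video_a");
```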
KissMetrics has an amazing system for tracking this type of information, and we'd have loved to use them as part of a more comprehensive analysis effort. However, at our volume it'd be over $500/month, and honestly that's not worth investing in right now. We actually sent them a message about pricing options and never even got a response. So we'll probably look into alternatives once we hit the growth curve.
Back to our test… Once a user registers and logs in, we record their test version in their record in our user database, which lets us run simple SQL reports on conversion rates between the versions. A little data massaging and we can see the best-performing test version on both quality metrics and overall sales.
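The "data massaging" step amounts to grouping users by the version they saw and dividing sales by sign-ups. A sketch, assuming the SQL report returns one row per user with a hypothetical `variant` column and `purchased` flag (e.g. `SELECT variant, purchased FROM users;` — the real schema isn't given in the post):

```javascript
// Given rows like { variant: "video_a", purchased: true }, compute a
// conversion rate (sales / users) for each home page variant.
function conversionByVariant(rows) {
  const stats = {};
  for (const { variant, purchased } of rows) {
    if (!stats[variant]) stats[variant] = { users: 0, sales: 0 };
    stats[variant].users += 1;
    if (purchased) stats[variant].sales += 1;
  }
  const rates = {};
  for (const [variant, s] of Object.entries(stats)) {
    rates[variant] = s.sales / s.users;
  }
  return rates;
}
```

Combined with the per-version GA metrics, this gives both halves of the picture: which version keeps visitors around, and which version actually sells.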
Note: this is a currently running test (and it has only been running for 2 full days), but instantly we noticed a difference. (EDIT: Now, a week later the data holds.)
Our average time on site before the change was 50 seconds, even after we got rid of the advertisement.
The red circle was when we killed the ad campaign. As you can see, the visits dropped way down, but the average time on site stayed around the same.
We then added the sales animation video to the home page and set it to autoplay.
As you can see, from the 11th to the 12th, adding the video took the New Visitor Time On Site metric from 48 seconds to 4 minutes 32 seconds – about 5.7× the original, a 467% increase.
Bottom line: well-done auto-play videos can make an immense impact on your time on site, and if people hang around to watch the video, they'll learn more about your offering and (hopefully) purchase your services.
Our bounce rate is still high, but it has dropped to under 50% for the first time since we launched the site. In the coming weeks expect more mini-tests like this as we work to improve our overall experience.