At Browser to Buyer we’re constantly running tests with SaaS and subscription businesses around the world.
While every site is unique, we’re sharing some of our recent results and findings to inspire you!
Our client was struggling to convert users into free trials, despite not requiring credit card details at this stage. Their signup form featured numerous fields, creating plenty of opportunities for errors and confusion.
We greatly simplified the form, split it into two steps, and achieved a 7.5% increase in the number of trials.
We did all of this without the client needing to change anything on their end – the work involved a lot of complexity behind the scenes. Get in touch if you want to know how we did this.
We’ve seen a double-digit increase in conversion for software products as a result of changing the hero graphic from an abstract illustration to a video that demonstrates the UI and how it is used. These users no longer need to be told about the features – they can see for themselves what the product does. (See new hero graphic below:)
Sites that do this exceptionally well are Monday.com and Asana, as you can see in this short clip:
We added a G2 Leader badge site-wide in prime real estate (right next to the free trial button at the top of the page). Too often these awards are hidden at the bottom of the page or tucked away in email signatures.
The result was a 3% increase in conversion rate – not huge, but because the change was site-wide it had a large impact on revenue.
An affiliate site that we work on sends traffic to a variety of subscription products. One of these subscription product providers increased the complexity of their package choice and as a result we saw the conversion rate drop off massively.
While most providers ask the user to choose between 3 simple options, this provider now requires the user to choose between 12 different combinations of plan and duration.
This provider had clearly tested the change and is presumably making it to shift focus from growing users to increasing revenue. So it works from a revenue perspective, but it has decreased their underlying conversion rate, most likely because it confuses users. Fundamentally, the user starts questioning: does the standard plan give me everything I need?
We fixed the issue by giving users guidance on our client’s site before they clicked out. By reassuring them that the standard plan would give them everything they needed for their use case, we managed to increase the conversion rate.
We recently ran a really interesting test for a VPN site, which showed exactly why you have to be careful about taking overall test results at face value. The test was simple – adding a bar across the top of the site, like the one below:
It’s typical to see this kind of bar across VPN sites – so we expected it to be a slam dunk to test it across the site. It wasn’t!
We ran it on both desktop and mobile, with a 50:50 traffic split.
What was amazing was that on desktop the bar DROPPED the conversion rate by about 20%, while on mobile it INCREASED the conversion rate by 33%.
If we hadn’t broken down the results by device, we might have harmed the desktop conversion rate significantly. Instead we’ve been able to take all of the upside on mobile, and none of the downside on desktop.
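To make this concrete, here’s a minimal sketch with made-up visitor and conversion counts (chosen to mirror the roughly 20% desktop drop and 33% mobile lift above – not our client’s actual data) showing how a pooled read can hide two opposite per-device effects:

```python
# Illustrative (made-up) A/B test counts showing why results must be
# segmented by device: pooled together, the change looks flat, but the
# per-device lifts point in opposite directions.
results = {
    # device: {arm: (visitors, conversions)}
    "desktop": {"control": (5000, 250), "variant": (5000, 200)},
    "mobile":  {"control": (5000, 150), "variant": (5000, 200)},
}

def rate(visitors, conversions):
    """Conversion rate for one test arm."""
    return conversions / visitors

for device, arms in results.items():
    c = rate(*arms["control"])
    v = rate(*arms["variant"])
    lift = (v - c) / c * 100
    print(f"{device}: control {c:.1%}, variant {v:.1%}, lift {lift:+.0f}%")
    # desktop: control 5.0%, variant 4.0%, lift -20%
    # mobile:  control 3.0%, variant 4.0%, lift +33%

# Pooled across devices both arms convert at 4.0%, so an overall-only
# read would call the test a wash and miss both effects entirely.
pooled_control = rate(10000, 250 + 150)
pooled_variant = rate(10000, 200 + 200)
print(f"pooled: control {pooled_control:.1%}, variant {pooled_variant:.1%}")
```

The same segmentation logic applies to any dimension that might behave differently – device, traffic source, new vs returning users – as long as each segment still has enough traffic for the result to be statistically meaningful.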
Changes having completely opposite effects on mobile and desktop is something we’ve also found previously for other SaaS sites, as well as eCommerce clients.
These tests all worked well for specific sites, but every site and user is different. To truly understand which tests to run and to get the biggest results, you have to start with user research to find out what stops users from converting, so that you can fix those problems. At the very least, don’t just make changes based on what you’ve seen here – run them as tests to see if they work for your users too.