Let’s be upfront: this is, in the end, a features-versus-benefits case study. But there are a few reasons this one is worth reading.
First, unlike the many marketing experts who tell you to favor benefits over features without any data to back it up, we actually A/B tested it, so there’s data here (including tests that didn’t work).
Second, this case study is for a SaaS company. (We were trying to optimize their conversion rate via free trial starts from their homepage.) And the thing about software companies is that they love talking about their features.
And frankly, I don’t blame them.
Below, we show user research that supports the assertion that prospective customers’ primary barriers to signup were feature related (“How does [this or that] work?”, “Do you integrate with [such and such software]?”).
In the end, we show how following the original user research led us to test adding more features, which did not improve conversion rates (free trial starts from the homepage), and how a simple list of three benefits (inspired, of all things, by a promo video) did.
We hope various SaaS companies, or even other startups (subscription services, other business models), can use these results to test similar benefit summaries and possibly increase their own conversion rates.
User Research: Can we optimize conversions by showing how this software works?
When I Work, a client of Growth Rock, sells employee scheduling software for brick-and-mortar businesses with hourly employees.
They are one of the leading platforms in this space, and their offering has lots of genuinely useful features:
These features are listed on a separate “Features” page, but not on their homepage, which (at the time) included only a high-level overview of key features.
A simple hypothesis thus emerged: Could listing some of these features on the homepage, or key landing pages, improve free trial conversion rates?
Being responsible conversion optimizers, however, we turned to user research first (before running, guns blazing, into an AB test).
We asked users on the homepage:
What else do you want to know about When I Work before signing up?
We categorized their responses via our Sort and Measure Method of quantifying text-based survey responses and got this high level overview:
The “how it works” category was by far the leader.
Note: Cost is almost always a big topic in user polls. But price is price: you can play with ways to frame it, but often there’s little you can do on that front. If you find ways to increase perceived value, though, that helps counteract consumers’ obsession with price.
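To make the quantification step concrete, here is a minimal keyword-tagging sketch in Python. The categories, keywords, and sample responses are all hypothetical, and this is not the actual Sort and Measure Method (which involves human judgment in sorting responses); it is just an illustration of turning free-text survey answers into category counts.

```python
from collections import Counter

# Hypothetical categories and keywords -- in practice, categories emerge
# from reading the responses, not from a predefined list.
CATEGORIES = {
    "how it works": ["work", "schedule", "shift", "log in"],
    "integrations": ["integrate", "sync", "payroll"],
    "cost": ["price", "cost", "fee"],
}

def categorize(responses):
    """Count how many responses touch each category (a response can hit several)."""
    counts = Counter()
    for text in responses:
        lowered = text.lower()
        for category, keywords in CATEGORIES.items():
            if any(kw in lowered for kw in keywords):
                counts[category] += 1
    return counts

responses = [
    "Can I schedule people at different locations?",
    "Do you integrate with our payroll software?",
    "How much does it cost per employee?",
]
print(categorize(responses))
```

Keyword matching like this is only a first pass; ambiguous responses still need a human read before you trust the category totals.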
This category of responses included things like this:
- “Will this work for scheduling multiple shifts for one employee?”
- “How can my employees log in using a laptop?”
- “Can I schedule people at different locations?”
It also included a slew of questions that weren’t more specific than “So how does this thing actually work?” That is, we got the sense they just wanted to see the software in action.
Note: This is common. Customers for software want to see the software. Don’t you?
All the more reason, we figured, to show more screenshots. Screenshots were of course abundant on the feature page.
The First Approach: Give the Users What They Want, Features
So, with this hypothesis backed by some user voices, we decided to test adding features and a “how it works” explanation to some landing pages that were near replicas of the homepage (for no other reason than that the homepage had other tests running at the time).
We tested this in two ways:
- via scrollable images and text,
- via a “How It Works” video with a list of features as icons below the video.
We wanted to make sure any effect we saw wasn’t specific to a particular layout, format, design, or copy.
After over 30 days and hundreds of signups per variation, there was no statistical difference in conversion rate.
Too bad, no difference. (Note: For client confidentiality, we blur actual conversion rates.)
We even tried other variations to list features on various landing pages and the homepage, such as this one:
Adding this list of features reduced signups by 10%. Argh.
We should mention that these features are genuinely useful to When I Work’s customers: customers use them, and historically customers have often requested them, so we knew they were valuable. They were even asking about these things in the user research polls. And yet variation after variation of listing features in different ways failed to increase conversion rates.
The Second Approach: Simple benefits
So we stopped and went another route, also built on user research: Testimonial Analysis.
We analyzed video testimonials of When I Work customers talking about what they liked about the product. Keep in mind, these videos are produced by the company and heavily edited. Nonetheless, you can hear customers describe what they like about the product in their own words.
In these videos, customers did mention features they liked, but we noticed two key characteristics:
- They spent a lot of time talking about their frustrations
- Features were almost always wrapped in the benefits they provided the business.
For example, instead of saying “We love that When I Work sends schedule updates by SMS” (feature), they’d say something like “It’s really easy to send updates to my whole team” (benefit), or “My team loves that they can easily see updates and schedule swaps by text” (benefit that mentions a feature).
Could it be this simple? Or even this cliché? (“Benefits over features!”)
We tested this approach via an experiment that added a short “Benefits” section to the top of their homepage (which was constructed similar to most paid media landing pages).
It was only one variation, and it just added the above section to the page. That’s it. “Spend more time growing your business” was probably the “stretch”-iest benefit on there, in that it didn’t allude to any feature directly, just a general promise of time savings.
Adding this section increased signups on the homepage by 10%.
(Note: Our A/B testing software, Optimizely, showed only 8% significance because it uses a statistical model that factors in sample size, which could have been larger. The p-value of this experiment, though, was 0.032, i.e. roughly 97% statistical significance. With that, plus over 1,000 conversion events per variation across multiple weeks and the variation winning the entire time, we felt comfortable calling this test.)
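For readers who want to sanity-check a result like this themselves, the p-value for a difference in conversion rates can be computed with a standard two-proportion z-test. Here’s a minimal Python sketch using only the standard library; the conversion counts are hypothetical, since the actual rates in this test are blurred for confidentiality.

```python
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for a pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical numbers: 1,000+ conversions per arm, ~10% relative lift
print(two_proportion_p_value(1000, 20000, 1100, 20000))
```

A fixed-horizon test like this assumes you picked the sample size in advance; tools like Optimizely use sequential models precisely because people peek at running tests, which is why their reported significance can differ from a raw p-value.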
An observation and a guess
I found two aspects of this result interesting.
First, I love the simplicity of the benefits section. The features sections are elaborate: take a look at their features page and you can scroll through a huge list of features, with images, zoom-ins, etc. The benefits section, on the other hand, was just three points of text with some stock icons on top. Yet it worked.
Second, I’m guessing this result is audience specific. In particular, their target customers are small business owners (owners of cafes, restaurants, dentist offices, etc.), who are often not tech savvy. So it makes sense that a long list of tech features may not attract them. It may even overwhelm them:
“This looks like it’ll take too long to learn.”
If you’re making software for developers, though (even something as streamlined as Stripe), your customers may very well want a more technical list of features rather than a high-level summary of benefits. So, as always, be careful and test it on your own audience.
More strategies for optimizing SaaS conversion rates
This case study is actually an excerpt from a recent ebook we wrote on optimizing SaaS landing page conversion rates. You can download the PDF at that link, free, no email optin required.
Finally, if you run a SaaS (or other online) business where a 10% increase in conversions would be worth 6, 7, or 8 figures in annual revenue, hit that Get Proposal button on the top right and we can chat about getting similar results for you.