A constant challenge for any startup is figuring out what’s next and where to focus its scarce resources to improve the business. Sometimes a small tweak to the product’s onboarding works magic, sometimes it takes a whole new feature, and other times a simple change in pricing strategy can dramatically improve key metrics.
There’s never a lack of ideas for what to do, but when it comes to deciding what to tackle first, there’s often a lack of information to make an educated decision.
It’s always tempting to base decisions on data about how people are interacting with the product, because this data is easily available through tools like Google Analytics. When all goes well, the impact of changes can be measured quite quickly, so it seems like a good choice to just go ahead, try that tweak, and see whether it moves the metric.
Often, however, it makes sense to look not only at the how, but also at the why.
Imagine your analytics show you a high one-time user rate. You might be tempted to rework things to get your product’s value across more efficiently. But what if, by asking those one-time users why they dropped off, you found that most of them understood your product perfectly in the first session, and that those who never came back did so simply because they didn’t find the product useful?
Wouldn’t it make more sense to focus instead on the users that come back? Wouldn’t that insight save you a lot of time and help you to better focus your resources?
I think everyone would agree that having both the how and why answered would help businesses make more educated decisions. But how many of you manage to continuously ask a big enough set of customers about the whys?
At Blinkist, we didn’t ask enough customers the right questions for a long time. Of course, we would interview customers about their experience with our product and frequently let them play with new features in beta. And certainly, we would send around surveys, some of them at regular intervals and some of them randomly. But we found both approaches to be limited for several reasons:
First of all, interviews and surveys require some time investment from your users, so you’ll mostly get answers from users who are already engaged, while it might be more valuable to get answers from those who weren’t engaged in the first place.
Furthermore, interviews (and to some extent surveys) are time consuming for you, too, so there’s always a risk you’ll never get around to interviewing enough of your users and end up basing your decisions on faulty data. We noticed this especially in cases where we had one specific question and deemed a survey too much effort. The result? We didn’t bother to ask.
The more little questions we had about little changes and tweaks we were discussing, the more I asked myself how we could find a way to quickly ask our users simple questions about a behavior we wanted to understand better.
We started some tests with simple Google Form surveys, but response rates were very weak: we couldn’t ask the questions directly within the email, so customers had to click through to a separate page before they saw the questions and could pick their answers.
When I was searching for solutions that would allow us to include simple one-question surveys directly in an email to improve user response, I was surprised to find that none of the big survey or email marketing providers offered anything like it. It seemed it wouldn’t be too much of a challenge for these companies to implement such a feature, and it would be very powerful for a lot of their customers.
It took me three hours to set up our one-click email survey (a developer would have probably managed to do it in 10 minutes) and we’ve been using it for two months now.
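Conceptually, the trick is that each possible answer becomes a pre-filled link in the email body, so a single click records the response. Here’s a minimal sketch in Python of how such an email could be generated, assuming a hypothetical `https://example.com/survey` endpoint that logs the query parameters of each click (the endpoint, IDs, and function names are all illustrative, not our actual setup):

```python
import urllib.parse

# Hypothetical endpoint that records one response per click.
SURVEY_ENDPOINT = "https://example.com/survey"

def answer_link(survey_id, user_id, answer):
    """Build a pre-filled link; clicking it records the answer in one step."""
    params = urllib.parse.urlencode(
        {"survey": survey_id, "user": user_id, "answer": answer}
    )
    return f"{SURVEY_ENDPOINT}?{params}"

def build_survey_email(survey_id, user_id, question, answers):
    """Render an HTML email body with one clickable link per possible answer."""
    links = "".join(
        f'<p><a href="{answer_link(survey_id, user_id, a)}">{a}</a></p>'
        for a in answers
    )
    return f"<p>{question}</p>{links}"

html = build_survey_email(
    "finished-books",
    "user-42",
    "You read the whole book, but never marked it as finished. Why?",
    ["I didn't know I could", "I didn't want to"],
)
```

On the server side, the endpoint only needs to store those three parameters and redirect the user to a short thank-you page, which is why the whole setup can be done so quickly.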
We’ve set up some automatic emails based on events that are very important to us, but I’m also using it frequently when I want to get quick feedback on theories I have.
For example, when we launched a feature that allowed readers to move their books from their “to read” to their “finished” lists, roughly 15% of our users read a book to its last chapter, but never marked it as finished. We weren’t sure what to make of this data: were some users unaware of the feature or did they simply not want to mark the books as finished? Instead of discussing it for a long time or even creating a test to make the feature more visible, we simply sent out an email to a set of customers who didn’t seem to use the feature.
We got an impressive response rate of 34% to this email, and other emails that we send out automatically based on certain events often see response rates of more than 20%.
The one-click email survey made it much easier for us to quickly get answers to questions and validate hypotheses we were discussing. And as a nice side effect, it proved to be a very good tool for increasing customer happiness, too. I never received a single negative response, but plenty of positive feedback about how we care about our users and let them help shape our product.
I hope that this article inspires some of you to give it a try, too, and that it’ll work as well for you as it has for us!