One of the most useful tools in the marketer’s tool kit is the survey. Yet one of the most despised emails to receive as a prospect or customer is also the survey. What starts with good intentions of making my experience better turns into a 45-question onslaught from hell where I’m asked everything from how I’m feeling that day to my blood type. But wait: in exchange for my time, I do have a minuscule chance to win a $50 Amazon gift card.
All sarcasm aside (well, mostly), we’re noticing a depressing trend taking SaaS by storm: we’re miserably bad at collecting feedback from current and potential customers through surveys, because we’re sending really crappy ones.
This goes way beyond survey length, too. We’re asking the wrong questions and therefore aren’t getting the feedback we should from such a powerful tool. How do we know this to be true?
Well, we have some experience sending surveys. Over the past three years we’ve sent over 5 million SaaS customer development surveys in our work building the pricing and revenue engines for some of the biggest and best companies in SaaS. Because our pricing software hinges on collecting data from different user segments, we’ve had to iterate on our process and tactics for higher response rates and better user feedback.
To pass these lessons along, let’s walk through:
1. Why shorter surveys are better (and why you need to tell customers how short your survey is)
2. How to avoid asking lazy questions
3. How to ask better questions to get better feedback
Shorter surveys get better feedback and can be sent more often
Every survey could be made better by being made shorter. One of the biggest complaints about surveys is their length, because most of the time they massively interrupt the respondent’s workflow. The negative impact of lengthy surveys shows up in major drop-offs in both response rate and quality of responses.
Take a look at the output below comparing the time it takes to complete a survey and the completion rate (measured by the share of individuals who start the survey and get through its entirety). Notice the massive drop-off that occurs after 4 minutes. Yet in our analysis, the average survey takes 14.3 minutes to complete.
What’s worse, the quality of responses from even those who do complete the lengthy surveys is atrocious. The output below shows the same surveys, but of the completions, how many responses were accepted. Beyond four minutes we once again saw massive churn in response quality, meaning that not only are these surveys going unfinished, but the results that do come in are pretty crappy, as well.
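The analysis described above boils down to bucketing survey sessions by how long they took and computing the completion rate per bucket. Here’s a minimal sketch of that calculation; the field names (`minutes`, `completed`) and the sample data are invented for illustration, not our actual dataset.

```python
# Illustrative sketch: bucket survey sessions by time spent and compute the
# completion rate per bucket. Field names and data are hypothetical.
from collections import defaultdict

def completion_rates(sessions, bucket_minutes=2):
    """sessions: list of dicts with 'minutes' (time spent) and 'completed' (bool).
    Returns {bucket_start_minute: completion_rate} sorted by bucket."""
    started = defaultdict(int)
    completed = defaultdict(int)
    for s in sessions:
        bucket = int(s["minutes"] // bucket_minutes) * bucket_minutes
        started[bucket] += 1
        if s["completed"]:
            completed[bucket] += 1
    return {b: completed[b] / started[b] for b in sorted(started)}

# Toy data: completion collapses for the longer sessions.
sessions = [
    {"minutes": 1.5, "completed": True},
    {"minutes": 3.0, "completed": True},
    {"minutes": 5.0, "completed": False},
    {"minutes": 6.5, "completed": True},
    {"minutes": 7.0, "completed": False},
]
print(completion_rates(sessions))  # → {0: 1.0, 2: 1.0, 4: 0.0, 6: 0.5}
```

The same bucketing works for the quality metric: swap `completed` for an `accepted` flag on each response.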
What should you be doing then? Well, we’ve found that surveys that can be taken in less than a minute (on average) are typically the best in terms of both response rate and response quality. This means you really don’t have room for more than five questions. Better still, these short surveys can be sent much more often, with some companies sending customer development or feedback surveys to the same groups of people every 21 days.
Customers need to be trained, though, that your surveys are going to be much shorter and much more useful to them. To do this, your email copy is absolutely crucial and needs to communicate the actual length of the survey. We’ve found our favorite subject line to be: “Share in Shaping [COMPANY]’s future - 30 Seconds”. Customers then know it’s a quick task that can be completed lickety-split.
Cut down on length by not asking lazy questions
Shorter surveys are great in theory, but how do you get all the information you need in so little time? Start by not asking lazy questions. Of the surveys we examined for this post (and in our day-to-day work), most of the bad ones ask questions whose answers the team sending the survey should absolutely already know.
You should never be asking the respondent for their email address, what plan they're on, how often they use the product, etc., because you should already know the answers to these questions in your respective databases.
Practically, this means that when your survey is ready, you need to go through every single question and make sure there isn’t a way you can already get the data. Don’t waste your respondents’ time. You only need to piss off a respondent once for them to swear off giving you feedback forever.
Cut down on length by asking the right questions
Additionally, surveys can be made much shorter by asking the right questions. We see time and time again that SaaS teams wanting feedback start out very focused on their feedback needs (e.g., “We want to see which of the next five features is most important to a prospect”), but inevitably succumb to “survey creep”: other teams pile on feedback requests, or one survey tries to do too much. Always assume that every pixel you add to your survey lowers the quality of responses.
To circumvent these issues, keep the goal of the survey focused and singular, but also make sure you’re forcing respondents to make tradeoffs and decisions. Too many surveys ask people to rank a bunch of features on a scale of 1 to 10 and get results like the below, where you have no idea what’s best or worst in the list.
Instead, force respondents to make decisions by choosing a most important and a least important feature out of a group, or by asking open-ended (but targeted) questions, such as “At what price point is the product so cheap that you’d question its quality?”. There’s more on the mechanics of feature value analysis and price sensitivity analysis at those respective links (including how to ask the right questions), but the moral of the story is to take survey experimental design very seriously, because you only have so much time with a respondent.
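The most/least-important approach described above can be tallied with a simple best-worst count: a feature’s score is the number of times it was picked as most important minus the number of times it was picked as least important. This is a generic sketch of that scoring, not Price Intelligently’s actual method, and the feature names are made up.

```python
# Illustrative best-worst scoring for "most important / least important" picks.
# Score = (# times picked most important) - (# times picked least important).
from collections import Counter

def best_worst_scores(responses):
    """responses: list of (most_important, least_important) feature name pairs."""
    most = Counter(m for m, _ in responses)
    least = Counter(l for _, l in responses)
    features = set(most) | set(least)
    return {f: most[f] - least[f] for f in features}

# Hypothetical responses from three survey takers.
responses = [
    ("API access", "dark mode"),
    ("API access", "SSO"),
    ("reporting", "dark mode"),
]
scores = best_worst_scores(responses)
print(sorted(scores.items(), key=lambda kv: -kv[1]))
# → [('API access', 2), ('reporting', 1), ('SSO', -1), ('dark mode', -2)]
```

Unlike 1-to-10 ratings, every response here costs one feature a point, so the results separate winners from losers by construction.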
Surveys are a powerful tool. Use them wisely.
Surveys are an exceptionally powerful medium, mostly corrupted by those who wield them. Use them wisely and you’ll be able to form quantified buyer personas, save money through a targeted product roadmap, and even make sure your customers are happy. Use them poorly and you’ll alienate your customers and prospects to the point that they won’t want to give you feedback anymore. The power is yours! :)
Here’s some other feedback for you based on questions asked about surveys:
Incentives only work if they’re large enough and given to each respondent: We use a lot of market panelists (people who are paid with cash, airline miles, donations to charity, etc. to take surveys) to complete surveys for targeted user types. Yet incentives structured as a contest for a free iPad or another gift typically work very poorly. The difference is that market panelists opt in to providing feedback, whereas someone contacted via email needs more intrinsic motivation to take a survey.
Qualtrics and Price Intelligently’s software are better platforms for collecting surveys: We’re biased on the latter option, but both of these platforms allow you to collect who answers what for proper segmentation and survey design. Qualtrics is a massive platform, and PI’s tools are a bit more targeted.
Respondents respond best to a “community” feeling: We’ve found that brands that have built communities around their products, even if the community is just people who want to see the company succeed, get the best response rates. Evoke a sense of “this is a two-way conversation” and you’ll do better.