There are so many surveys out there. Surveys that organizations do yearly, surveys that get sent out after customer interactions, surveys for donors, surveys for new products, millions of surveys for millions of reasons. And a lot of them suck. SO MUCH. I click on pretty much any survey I see, and without knowing the objectives behind them, I can still tell they aren’t going to get people what they want. So here are my top five reasons why your last survey sucked, or five ways to make your survey suck less. (There’s more than this, btw, so hang in there for the Why Your Survey Still Sucks post.)
ONE: You don’t really know what you want to achieve. If you don’t have a super clear understanding of what you want to get out of your results, you won’t have super clear questions, and you won’t have clear action steps based on the data. You’ll never be able to move forward and develop your knowledge further. The best data and research not only provide you with clarity and insight into specific objectives at a specific time, they should also allow you to progress that clarity and insight over time. You can fix this right now by summarizing in 1-3 (no more) objective statements, like:
- Identify current and prospective donors’ interest, understanding and connection to our cause
- Identify the path to purchase for Clothing Product X
- Understand the day-to-day experiences of those living with a disability in our region
TWO: No one does anything with the results. Or no one knows what to do with the results. This is a big one. This is partly connected to your objectives, and partly connected to the results you’re distributing. Ask yourself, do the objectives feed into the reasons you’re doing the survey in the first place? Think about it this way…
You want to identify current and prospective donors’ interest, understanding and connection to our cause. Why? You need to be able to answer that question easily and succinctly. For example: Our most recent fundraising campaign didn’t meet its goals. We don’t have any concrete information on how our own donors, or prospective donors feel about our cause. So, we have no idea if we’re communicating with clarity the right messages at the right time. The survey will tell us what we need to address.
Now, the results part. Reporting back on the data will mean answering the objectives and the why behind your objectives in simple terms. For example: 70% of our current donors indicated that their emotional connection to our cause lies in their perception that their donation allows us to bring a smile to a child’s face on their behalf. 80% of current donors did not see that message within our most recent campaign. We’re recommending next year’s campaign includes this simple message in a variety of ways… If you aren’t getting people the information they need, in a way that’s usable, you’re going to lose your internal engagement and the chance to change things for the better.
THREE: There are more open-ended questions than anything else. If your survey is 10 questions long and has more than 3 open-ended questions, then you should not be using a survey methodology. If you don’t know enough about how your target is going to respond to a question to provide them with inclusive options to most questions, you need to think about qualitative research instead (see upcoming blog post Why Your Last Focus Groups Sucked), or you need to think about your questions more, and how you’re going to analyze the data. Think about this example:
- Identify the path to purchase for Clothing Product X
Now, if you knew the path to purchase already, you wouldn’t be doing the research, but you at least need to know the possible components with some room for expansion. A question like this should be easy for you to design:
Think back to when you first considered purchasing Clothing Product X. Do you remember where you first saw it?
- A magazine (hard copy)
- A magazine (digital)
- A look book
- A friend was wearing it
- A retail store
- Something else we didn’t mention? Please specify.
If you can’t figure out at least MOST of the options, think about qual, or think about who else might have that information before you run your survey.
FOUR: You’re asking people to manually skip over questions. When you’re using survey software, you need to spring for the paid version that allows respondents to skip questions, or receive specific questions based on their responses. For the last question example on path to purchase, if someone selects retail store and you want to ask more questions, you should be using skip logic to ask only the people who selected that response. It’s easy, and every survey tool will have info on how to do it. The question should not be:
If you selected yes at retail store, tell us…if not, skip over blah blah
It should be:
You said you first saw our product in a retail store, what was the name of that store?
- X Store
- Y Store
- Z Store
- Not sure/Can’t remember
It’s a minimal cost, it’s minimal effort to learn how to skip questions based on responses, it makes a better experience for your respondents, and the data is cleaner and easier to analyze. Don’t ask respondents to do it themselves; it makes your data messy and, very likely, full of errors. Respondents will also skip that question even when it applies to them, simply because they don’t feel like answering, and you’re going to lose data. That’s more than enough reasons to budget $50 or less for a properly functional survey tool.
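If it helps to see what the survey tool is doing under the hood, the skip logic above boils down to a simple branch. Here’s a minimal sketch in Python (the question text and options come from the examples above; the function name and dictionary shape are mine, not any particular survey tool’s API):

```python
def next_question(first_saw_it):
    """Route respondents based on where they first saw the product.

    Only people who answered "A retail store" get the follow-up about
    which store; everyone else skips it automatically -- no "if you
    selected yes, tell us..." instructions needed.
    """
    if first_saw_it == "A retail store":
        return {
            "text": ("You said you first saw our product in a retail store, "
                     "what was the name of that store?"),
            "options": ["X Store", "Y Store", "Z Store",
                        "Not sure/Can't remember"],
        }
    return None  # respondent never sees the follow-up

# A magazine reader is routed straight past the store question:
assert next_question("A magazine (hard copy)") is None
assert next_question("A retail store")["options"][0] == "X Store"
```

The point is that the branching happens in the tool, not in the respondent’s head, which is exactly why the data comes out cleaner.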
FIVE: No one responded. Surely the saddest reason of all! You need to think about why people are going to fill your survey out. If you’re assuming they’ll respond just to be helpful, they probably won’t, unless you have a really compelling, interesting, or engaging cause. Incentives that work:
- Prize draws – think gift cards, technology
- Product discounts – 30-50% off your product, minimum
- Direct experiences with your organization (if your organization works for this, e.g. you have an experience that respondents can have, learn, see, etc.)
Once incentives are set, how will people see it? Database? Social media? At your location? Assume a 5-10% response rate (and that’s generous for most), then calculate all the possible “eyes” you can get it in front of, and you’ll know what you can expect. By this point, you’ve spent a lot of time getting things right, so spending a little more time and budget is worth it.
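The back-of-envelope math above is worth spelling out once. Here’s the calculation in Python; the channel reach numbers are made up purely for illustration:

```python
# Hypothetical reach per channel (illustrative numbers, not real data)
email_list = 2000
social_reach = 5000
in_store_flyers = 500

# All the possible "eyes" you can get the survey in front of
total_eyes = email_list + social_reach + in_store_flyers  # 7500

# At a (generous) 5-10% response rate:
low_estimate = int(total_eyes * 0.05)   # 375 responses
high_estimate = int(total_eyes * 0.10)  # 750 responses

print(f"Expect roughly {low_estimate}-{high_estimate} responses")
```

If that range is too small to act on, you know before launch that you need more channels or a stronger incentive, not after the survey closes.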
So there you go, five reasons your survey might suck, and five things to change to get yourself an awesome survey. Check back for more ways to crank up your research and data efforts, and check out our Summer Research and Data workshops in Toronto and NY here.