“We love you! ….Do you love us?” That’s the gist of an email from my web hosting company, asking me to take their customer satisfaction survey. And it’s guaranteed to generate bad data.
The subject line shouts “Media Temple Champions! Please share your feedback.” The email then says “Thank you for being a Media Temple champion. We would love to hear your feedback….”
As far as I know, I’ve never signed up to be a Media Temple advocate or told them I’m their biggest fan. But they called me their “champion” twice in two sentences.
I couldn’t give them a bad rating after all that love, could I? And that’s likely the point. This email came from the company’s customer success manager — a man whose bonus probably depends on high satisfaction scores. So he decided to kill his customers with kindness and hope they returned the favor.
There are lots of ways surveys bias their respondents, intentionally or not. Have a look through your survey questions and make sure you’re not:
1. Buttering respondents up. “You’re smart, you’re attractive, and everyone loves you. Now, what about us?” Media Temple used this tactic above; it’s hard for people to criticize someone if that person’s just been gushing about them.
2. Telling respondents the answer they “should” give. We saw this in last week’s LinkedIn poll: “Apple CEO Tim Cook starts his day at 3:45am… what time do you wake up?” Remind respondents what successful people do — or what their peers do — and they’ll feel pressured to say they do the same.
Likewise, if you tell your respondents how great something is (“ExxonMobil loves the environment!”) before you ask them to rate it, it’ll likely score much higher.
3. Only offering respondents your preferred responses. An artist I follow recently posted an Instagram poll: “Are you ready to see my next project?” The two responses offered: “Yes” and “Definitely!”
While the artist was joking, many survey writers genuinely present only the responses they want to receive. Questions and answers always rely on editorial discretion, but consider the full range of responses your survey takers want to give — not just what you want to hear them say.
Before you publish your next survey or poll, re-read your questions and answers. Have you slipped any bias into the way you ask your questions? Have you only offered the answers you’re hoping to receive? You can’t get good data if you don’t ask good questions.
What’s the craziest way you’ve seen a survey bias its respondents? Let me know in the comments below, or on LinkedIn or Twitter.
Want good data delivered to your inbox? Subscribe here.