7 Worst Questions To Ask In Customer Surveys

How to avoid common mistakes that ruin surveys

Gathering feedback is an essential part of building a great customer experience. It's also one of the keys to running a successful organization.

It helps improve customer service, product development, and employee satisfaction. On top of that, surveys can be used to generate qualified leads.

But it only works if it's done well. Surveys are surprisingly easy to mess up, and a messed-up survey is useless.


Get Started with Online Surveys With One of Our 180+ Templates

  • Customer Satisfaction Survey
  • Customer Feedback Survey
  • 360 Employee Evaluation
  • Coffeehouse Rating
  • Market Research Survey
  • Computer Skills Assessment
  • Event Feedback Survey

We've established a list of common pitfalls and how to avoid them.

Want to improve your surveys? Read on!

1. Double-barreled questions

The more information you get, the better informed your decisions will be.

But that doesn't mean you should ask for too much at once.

In fact, you should only address one issue at a time.

A double-barreled question addresses two or more issues at once while expecting a single answer.

As a result, you can't tell which issue the respondent was reacting to, which renders the question useless for analysis.

Examples of double-barreled questions

Here’s what it looks like:

"Do you like the design and features of the product?"

"Are you satisfied with the evolution and the performance of the software?"

Even if it makes your survey longer, you should break down your questions. Then the results will be meaningful.

(Image: an IKEA survey question that is triple-barreled!)

Double-barreled questions are often used by criminal attorneys during cross-examinations. It's a way to get people to incriminate themselves.

For example: "Did you call the ambulance after injuring the victim?" Whether you answer yes or no, you're screwed.

2. Leading questions

Leading questions are framed to push the respondent to answer in a specific manner. They usually already contain information that the survey creator wants the respondent to confirm.

It can be intentional or not. That’s why we need to be aware of our own biases.

Here are things to look for to avoid bias:

  • ✗ Elements of assumption
  • ✗ Influence or moral judgement in the question
  • ✗ Moral implication in answering either way
  • ✗ High influence of moral values on the response
  • ✗ Forcefulness in the question

Examples of leading questions

    Assumption: "How innovative did you think the latest release of our product was?"

    This assumes that the customer thinks of the product in terms of innovation.

    Ask this instead: "What did you think of the latest release of our product?" or "Do you think the latest release of our product includes innovative elements?"

    Influence in the question: "How satisfied are you with the outstanding performance of our customer service department?"

    Whether your performance was outstanding or not is for the respondent to decide.

    Ask this instead: "How satisfied are you with our customer service department?"

    Moral implications: "Do you feel justified in causing deforestation so you can enjoy a chocolate bar?"

    Depending on the answer, the question frames the respondent as a good or a bad person.

    Ask this instead: "Do you worry about the environmental impact of your buying habits?"

    Here’s a real-life example of moral pressure in a survey:

(Image: survey question applying moral pressure)

    Forcefulness: "95% of our customers are happy with our products. Can we assume you're one of them?"

    This question doesn't really ask for information so much as it asks for confirmation.

    Ask this instead: "How would you rate your satisfaction with our product?" or "Would you say you're generally satisfied with our products?"

    Check your own biases and make sure respondents can answer freely and truthfully. Otherwise, the results of your survey won't be usable.

3. Badly designed questions

    Crafting good questions is important. But you also need to make sure they're easy to understand and respond to.

Poor design happens more often than you'd think, and to big companies too.

    Examples of badly designed questions

    Unnecessarily complicated design

    This survey by Bank of America makes a very simple rating question unnecessarily complicated. The question would’ve been perfectly clear with a regular scale.

    On top of that, they ask the participant to respond as fast as they can. This is something you want to indicate at the beginning of the question. That way, the participant can comply as soon as they’re done reading, instead of being surprised with the request.

(Image: Bank of America survey question)

    It’s also worth noting that this question probably wouldn’t look good on mobile.

    That can be problematic since according to surveys conducted by Toluna with millions of people, 25% of Americans, 32% of Brits, and as much as 53% of Turks complete surveys on their mobile phones.

    Mismatching question and answers

    This recent survey by Cisco suffers from a severe flaw.

(Image: Cisco survey question)

The question is formulated in terms of frequency, but the answer options are expressed in terms of likelihood. There's an important mismatch between what is asked and how you can answer.

    Wrongly used mutual exclusion

This survey by Google only allows you to pick one option, even though the options are clearly not mutually exclusive. As a result, the respondent might not know what to choose.

(Image: Google survey question)

    As you can see, how you set up your answers is as important as the question you’re asking.

    Make sure that questions are easy to understand as well as easy to respond to, and that answer choices match the question perfectly.

    4. Questions with uncommon language

    Avoid using jargon, slang, or colloquialisms in your surveys.

    If your customers can't understand the question, you won't get a valuable answer.

    Keep it as clear and simple as possible.

    Examples of questions with uncommon language

    Here are a few examples of what not to do.

    Jargon: "How do you feel about the CSS improvements in our new release?"

    Most people don't know what CSS is. They just browse the website.

    Ask this instead: "How do you feel about the visual appearance of our new release?"

    Colloquialism: "How do you feel about the number of lorries driving through your neighborhood?"

    "Lorry" is typically used in Britain to refer to a large freight truck.

    If you're addressing an international audience, you need to adapt.

    Ask this instead: "How do you feel about the number of freight trucks driving through your neighborhood?"

    5. Ambiguous questions

    The goal of a survey is to understand how respondents really feel about something.

    But if they don't know what you're after, they can hardly give you a usable response.

    That's why you need to make your questions as clear as possible. There should be no room for interpretation.

    You either need to narrow down the question or break it up into several questions.

    Examples of ambiguous questions

    Here are some examples of vague or ambiguous questions.

    Vague: "Do you usually shop at our stores?"

    "Usually" doesn't mean much. It has a different meaning for everybody and it depends on the situation. Any answer to that would be useless.

Ask this instead: "How often do you shop at our stores?" Then offer specific and relevant timeframes to choose from.

    Ambiguous: "Do you enjoy our product?"

    The problem here is that product experience usually has many components. There's ease of use, return on investment, safety, reliability, and so on.

    So any answer to that won't tell you much.

What you want to do here is break the question down. This is how EASI structures its customer satisfaction surveys.

(Image: EASI customer satisfaction survey)

    6. Badly scaled questions

    It's not enough for your question to be relevant. You have to give respondents a reasonable and usable scale to answer it.

    Otherwise you won't be able to do much with those answers.

    Examples of badly scaled questions

    Here are just a few cases in which the scaling system makes the results of the survey worthless.

    Nonsensical scale

    The scale in this example appears to have a common gradation but, upon closer inspection, it doesn’t make any sense.

(Image: survey question with a nonsensical scale)

Surely, if you received less support than you needed, you still received some of the support you needed. And if you received neither less nor more support than you needed, then you received all the support you needed.

Also, the first two options are written in comparative terms (more/less), while the fourth and fifth options are written in quantitative terms. You can't mix both in the same scale.

    Slightly off scale

In this example, the survey creator uses a scale from 1 to 10 and treats 5 as the middle point. On a 1-to-10 scale, however, the true middle falls between 5 and 6.

(Image: survey question with a 1-to-10 scale)

This is why odd-numbered scales are used when respondents need a true neutral midpoint.
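The arithmetic behind this is simple; here's a quick sketch (plain Python, no survey tool assumed) of why even-length scales lack a middle option:

```python
def midpoint(lo, hi):
    """Return the midpoint of an integer rating scale running from lo to hi."""
    return (lo + hi) / 2

# An even-length scale such as 1-10 has no single middle option:
print(midpoint(1, 10))  # 5.5 -- falls between options 5 and 6

# Odd-length scales such as 1-5 or 1-7 have a true neutral middle option:
print(midpoint(1, 5))   # 3.0
print(midpoint(1, 7))   # 4.0
```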

    Disproportionate scale

    In this example, the middle point isn’t neutral. There are 3 options to express satisfaction and 2 to express dissatisfaction. And there’s no way to express a neutral opinion.

(Image: survey question with a disproportionate scale)
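For contrast, here's a minimal sketch of a balanced 5-point scale (the option wording is hypothetical): equal numbers of positive and negative options around a neutral midpoint.

```python
# A balanced 5-point satisfaction scale (hypothetical wording):
# two negative options, a neutral midpoint, and two positive options.
balanced = [
    "Very dissatisfied",
    "Dissatisfied",
    "Neither satisfied nor dissatisfied",
    "Satisfied",
    "Very satisfied",
]

# Symmetry checks: an odd number of options gives a true neutral
# midpoint, with the same count of options on each side of it.
mid = len(balanced) // 2
assert len(balanced) % 2 == 1
assert mid == len(balanced) - 1 - mid
print(balanced[mid])  # "Neither satisfied nor dissatisfied"
```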

    7. Intrusive questions

    It's understandable that you'd want to know as much as possible about your respondents.

    That being said, some subjects are more sensitive than others. You need to approach them as such.

    This concerns questions relating to people's personal lives. It can include sexuality, gender identity, illegal activities, distressing events, and so on.

    How to ask a sensitive question

If you don't prepare your respondents for sensitive questions, you might hurt your survey. Respondents might skip the question, drop out of the survey, or give untruthful responses.

    None of that helps. So here are a few things you can do.

    1. Guarantee anonymity - Only say it's anonymous if it really is. In that context, respondents might be comfortable with answering the question truthfully.

    Obviously, anonymity can't only be theoretical. If you run a small company and your survey asks about race, and you only employ one person of color, they can be identified.

    In this case, that's not real anonymity.

    2. Lead up - Consider not asking a sensitive question out of the blue. Build up to that question with questions that already touch on the subject.

    For example, don't ask someone if they're using illegal/hard drugs right away. Ask first about their opinion of drug regulations. And if they've used legal drugs already.

    And then ask your question.

    3. Provide context - If respondents don't know what their data is going to be used for, they might be suspicious.

    You can prevent that by explaining why the question is important. And what you're going to do with that piece of information.

    4. Normalize - You might have to ask a question about something that's usually frowned upon. That's okay. An easy way to avoid respondents getting defensive is to normalize the behavior.

For example, you might need to ask how frequently respondents attend parent-teacher meetings. In that case, consider starting by saying that there are many reasons why parents can't attend meetings (work, illness, and so on).

    And then ask them how frequently they go.

    5. Frequency - You'll have a better chance with some questions by asking about frequencies rather than yes or no answers.

    If you ask someone whether they do illegal drugs, they'll probably say no. Because that's what's socially acceptable.

But if you provide frequency brackets (never, 1 to 6 times a year, 7 to 12 times a year, and so on), they'll be more likely to respond.

    6. Provide an out - Give them an option not to respond. Sometimes that's the price for them not to drop out. If your survey doesn't depend on these questions, you'll benefit from their other answers.

    7. Do it at the end - Ask sensitive questions towards the end of your survey. That way you make sure that you'll at least have answers to the rest of your questions.

    Examples of intrusive questions

Questions about the respondent's sexuality or gender, political opinions, and financial or credit history (among other topics) are usually seen as intrusive.

    Here are a few examples.

    Intimacy - "Would you describe your sex life as satisfying?"

    Politics - "Did you vote in the most recent presidential election?"

    Money - "How frequently do you go into overdraft?"

    How guilty are you?

    It's crucial that you analyze your own work and fix those mistakes.

    Otherwise surveys won't teach you anything. And, in turn, you won't be able to make informed decisions.

    You need to:

  • ✓ Limit yourself to one piece of information per question
  • ✓ Check and eliminate bias as much as possible
  • ✓ Design your survey in a way that makes it easy to respond
  • ✓ Use language your audience can understand
  • ✓ Leave no room for interpretation
  • ✓ Use relevant scales
  • ✓ Be extremely careful with intrusive questions

If you keep all these in mind, your next survey should be a success!


    About Author

    Forster Perelsztejn is in charge of marketing at Rooftop, an email management and all-in-one collaboration tool. He has spent most of his career working in SaaS and creating content for a variety of authoritative publications. When he’s not working, you can find him playing music, taking photos, and taking care of his pets. Connect with him on LinkedIn!