April 18, 2026 · 4 min read · Nikhil Kumar

How to write survey questions people actually want to answer

Most surveys are written by committee for an audience of one — the person who designed them. Here's how to write questions that get real answers from real humans.

The reason your survey response rate is bad is almost never the survey tool. It's that the questions are written like a tax form.

I've read maybe two thousand survey questions in the last year, and I can tell you the failure modes by heart. Here are the ones that matter, with fixes.

1. You're asking two questions in one

"How easy was it to sign up and start using the product?"

That's two questions. Sign-up could be smooth and onboarding could be a disaster. The respondent picks the option that averages their feelings, and now your data is useless.

Fix: Split it. "How easy was it to sign up?" then "How easy was it to get started after signing up?" Two questions, two answers, two actionable insights.

2. You're using your internal vocabulary

"How would you rate the AAR on your last engagement?"

The respondent doesn't know what AAR means. They guess. They get it wrong. They feel dumb. They close the form.

Fix: Write like a normal person. "After your last project wrapped up, did your team actually do a review?" If you have to use jargon, define it in the question. Don't make people Google your acronyms.

3. You're leading the witness

"How much did you love our new redesign?"

You've already told them the answer you want. The honest ones will give it to you anyway because they don't want a fight. The dishonest ones — the people who hated it — will lie. Either way, your data is shaped by your wording, not by reality.

Fix: Be neutral. "What did you think of the new design?" Or, better: "What's one thing you'd change about the new design?" Open-ended, neutral, gets you the actual signal.

4. The scale is meaningless

"On a scale of 1 to 10, how satisfied are you?"

What's a 7? What's the difference between an 8 and a 9? Nobody knows. NPS gets away with this because it's a benchmark — but for everything else, 1-10 scales are noise.

Fix: Use a 5-point scale with words. Very dissatisfied · Dissatisfied · Neutral · Satisfied · Very satisfied. Words anchor people. Numbers don't.

(NPS is fine. Just know that you're using it for the benchmark, not because the data itself is more useful than a 5-point scale.)
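If you want to report a single number from a worded 5-point scale, a common choice is a "top-two-box" share rather than a mean of arbitrary numbers. A minimal sketch, with invented labels and answers:

```python
# Sketch: summarizing a worded 5-point scale with a top-two-box share.
# The labels and response data below are invented for illustration.

answers = ["Satisfied", "Very satisfied", "Neutral",
           "Satisfied", "Dissatisfied", "Very satisfied"]

# Share of respondents who picked one of the top two labels.
# This is easy to explain and doesn't pretend the gap between
# "Neutral" and "Satisfied" equals the gap between 6 and 7.
top_two = {"Satisfied", "Very satisfied"}
top_two_box = sum(a in top_two for a in answers) / len(answers)

print(round(top_two_box, 2))  # 0.67
```

The point of the words is that every respondent interprets "Satisfied" roughly the same way; the summary statistic just has to preserve that.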

5. You're asking for things they can't remember

"How many times did you use the dashboard in the last 90 days?"

Nobody knows. They'll pick a round number. You'll average a bunch of made-up round numbers and call it research.

Fix: Either use behavioral data (you have analytics — use it), or ask about the most recent instance. "Did you use the dashboard yesterday?" You can answer that. Aggregate it across enough people and you have something real.
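The reason the recent-instance question works: you're sampling one day per respondent instead of asking each person to do 90 days of arithmetic in their head. A rough sketch of the aggregation, with invented data (and assuming the surveyed days are representative of typical usage):

```python
# Sketch: estimating 90-day usage from yes/no "did you use it yesterday?"
# answers. Response data is invented for illustration.

responses = ["yes", "no", "yes", "yes", "no", "no", "yes", "no"]

# Fraction of respondents who used the dashboard on a given day.
daily_rate = sum(1 for r in responses if r == "yes") / len(responses)

# Assumption: yesterday is a typical day, so scale up linearly.
est_uses_per_90_days = daily_rate * 90

print(daily_rate)                   # 0.5
print(round(est_uses_per_90_days))  # 45
```

Each individual answer is trivially easy and accurate; the estimate comes from the aggregate, not from anyone's memory.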

6. The required-field tax

You add an asterisk to every field "just in case." Now the form is twice as long and the respondent has to invent reasons to put something in the "company size" box.

Fix: Make almost everything optional. The only things that should be required are the ones you genuinely cannot do your job without. Email, maybe. The rest is bonus.

People will tell you more, not less, when you stop forcing them.

7. You're saving the good question for last

The classic mistake. Twelve demographic questions. Then, on screen 13, the actual thing you wanted to know.

By question 13 your best respondents have left. You're now collecting opinions from the people who finish surveys for a living. Those are not the people you want to hear from.

Fix: Put the most important question first or second. Demographics last. If you only get one answer per respondent, get the one that matters.

The general principle

Write each question as if you're asking it out loud, to a friend, in a coffee shop, who has somewhere else to be in five minutes.

If they would say "what do you mean?" — rewrite it.

If they would say "why are you asking?" — rewrite it or cut it.

If they would say "all of the above I guess" — split it.

If they would just stare at you — definitely cut it.

Most surveys would be twice as good if they were half as long. Cut everything that isn't load-bearing. Then cut more.


A small plug: I built coolform so that the form-design part isn't the bottleneck — you write the questions, the AI suggests the right types, the layout happens automatically. That way you can spend your energy on the part that matters: writing questions worth answering.

— Nikhil
