
customer interviews

The customer will not tell you what they want. They will tell you what they think you want to hear. Your job is to make that impossible.
Talvinder Singh, from a Pragmatic Leaders discovery workshop

I have watched hundreds of PMs run customer interviews. Most of them walked out of the room convinced they had validated something. They had not. They had performed validation theater — a ritual that feels like research but produces nothing actionable.

The tell is always the same: the PM goes in with a hypothesis, asks questions that confirm it, and comes back saying “customers love the idea.” The engineer asks what specific pain point it solves. The PM cannot answer without hedging.

This page is about stopping that pattern before it starts.

Why interviews fail before you open your mouth

Most interview failures are structural. They happen in the 48 hours before the interview, not during it.

Failure 1: You are testing your idea, not learning about their life.

The moment your interview goal is “does this customer want feature X,” you have already lost. You will ask biased questions. You will interpret ambiguous answers favorably. You will remember the one “yes” and forget the three “maybes.”

The correct goal for any customer interview is this: understand what the person’s life looks like in the domain you care about, in enough detail that you could predict what they would say about your idea before you pitch it.

If you can do that, you do not need to pitch the idea in the interview at all.

Failure 2: You are interviewing the wrong person.

In Indian B2B contexts — mid-market companies, enterprise, government procurement — the person who experiences the pain and the person who signs the cheque are almost never the same person. The accounts manager who uses your expense software daily has a different job than the CFO who approved the purchase and the IT head who manages the deployment.

Most PMs interview whoever is easiest to reach. That is usually the mid-level user, who will tell you the product is “fine” because they do not want to rock the boat with their management.

You need all three people for a complete picture. Knowing which one to prioritize depends on what decision you are making.

Failure 3: You do not know what “done” looks like.

An interview without a clear definition of a useful output is an hour of pleasant conversation that produces nothing. Before every interview, write down: “This interview was useful if I learned X.” That X forces you to design questions that actually go there.

The Mom Test: three rules that fix everything

Rob Fitzpatrick’s Mom Test is the most practically useful framework for customer interviews I have encountered. It is called the Mom Test because it gives you rules so good that even your mom — who will lie to you to protect your feelings — cannot give you misleading answers.

Three rules:

Rule 1: Talk about their life, not your idea.

Do not describe your product. Do not ask if they would use it. Ask about their actual behavior: what they do today, how they do it, what breaks, what workarounds they have invented. Their behavior is data. Their opinions about your future product are noise.

Bad: “Would you use an app that automatically reconciles your expense reports?”

Good: “Walk me through what happened the last time you submitted expenses. What was the most annoying part?”

Rule 2: Ask about specifics in the past, not generics in the future.

“Would you pay for this?” tells you nothing. People are optimistic about their future behavior and have no skin in the game when answering. “What did you use to solve this six months ago and what did you pay for it?” is anchored in reality.

People’s past behavior predicts their future behavior far better than their stated intentions. If they have never paid for a solution to this problem before, their claim that they would pay for yours deserves skepticism.

Rule 3: Listen more than you talk.

A good interview is 80% listening. The moment you start explaining, defending, or pitching, the interview is over. You have flipped from learning mode to selling mode. The customer will start telling you what you want to hear.

Silence is your most powerful tool. When someone finishes a sentence, wait three seconds before speaking. Half the time they will add something more important than what they just said.

// scene:

A customer interview at a logistics startup in Chennai. The PM is three months into building a route optimization product.

PM: “So tell me — if we built something that optimized your delivery routes automatically, would that be useful?”

Customer: “Yeah, definitely. Route optimization would be great.”

PM: “And would you be willing to pay for it? Like a monthly subscription?”

Customer: “Sure, if it's reasonably priced, yeah.”

PM: “That's great feedback. Really encouraging.”

The PM leaves convinced of product-market fit. The customer leaves and goes back to manually fixing routes in Excel, which they have been doing for four years and have no actual plan to change.

// tension:

Asking someone if they would buy something costs them nothing. It tells you nothing.

Question design: what to actually ask

A good interview has three phases. Each phase has a different goal.

Phase 1: Context (10 minutes)

You are building a map of their world. No hypotheticals. No product.

  • “Tell me about your role. What does a typical week look like?”
  • “How does [the domain] fit into your day? Where does most of your time go?”
  • “Who else is involved in [the process]? Where do handoffs happen?”

You are listening for: how complex their environment is, who the stakeholders are, and what the natural unit of their work is (a shipment, a campaign, a patient visit, a payroll cycle).

Phase 2: The struggle (20 minutes)

This is the core of the interview. You want specific past experiences, not general opinions.

  • “Tell me about the last time [the process] went badly. Walk me through what happened.”
  • “What did you do when that happened? What was your workaround?”
  • “How long has this been a problem? Have you ever tried to solve it?”
  • “What would it mean for you if you could solve this?”

The question “what would it mean for you” is powerful. It connects the functional problem to the emotional and professional stakes. In India especially, where job security and hierarchy matter, the answer often reveals the real motivation: “If I could fix this, my manager would stop questioning my team’s competence.”

Phase 3: The current solution (10 minutes)

Whatever they are using today is your real competition. Understand it thoroughly.

  • “What do you use right now to handle this?”
  • “How long have you been using that? What made you choose it?”
  • “What do you like about how you solve it today?”
  • “What would have to be true for you to switch?”

That last question is the most important one most PMs never ask. The answer tells you exactly what problem you have to solve to win.

// thread: #discovery — After a well-run interview with a mid-market HR manager in Pune
Sneha (PM) OK so she's been using Excel + a CA for payroll for 3 years. Has looked at two SaaS tools. Rejected both.
Arjun (Engineering) What was her objection to the SaaS tools?
Sneha (PM) Neither handled variable pay correctly for her blue-collar workforce. The logic was too rigid. Her CA can handle edge cases. The software could not.
Arjun (Engineering) So the switch criteria is: handle variable pay edge cases as flexibly as a human CA.
Sneha (PM) Exactly. And the emotional stake: if payroll goes wrong, she gets calls from 200 workers' families. She said that. Verbatim. 🎯
Arjun (Engineering) That's a totally different product spec than what we had. The current design would fail for her.

The five questions that almost always work

These are not a script. They are a toolkit. Use two or three per interview, not all five.

Question → What it surfaces

  • “Walk me through the last time you did [X].” → Real workflow, not ideal workflow
  • “What happened next?” (repeated) → Depth — pushes past the polished summary
  • “Who else was involved?” → Stakeholder map, decision chains
  • “How did you handle it before [current solution]?” → How entrenched the status quo is
  • “What would you do if this tool disappeared tomorrow?” → How critical the need actually is

The question “what would you do if this disappeared tomorrow?” is a particularly honest proxy for value. If the answer is “we’d figure something out easily,” the pain is not acute. If the answer is “we’d be in serious trouble,” you are in the right territory.

Synthesis: turning notes into insight

Most PMs treat synthesis as “reviewing my notes.” That is not synthesis. Synthesis is the process of finding what your interviews mean, not just what they said.

Step 1: Capture raw notes immediately.

Do not wait until the end of the day. Write your raw notes within 30 minutes of the interview ending. You will forget tone, hesitation, the moment someone looked uncomfortable, the example they used to illustrate something. These details matter. Memory degrades fast.

Have a note-taker if you can. Being the interviewer and the note-taker at the same time splits your attention at exactly the moment you need it most.

Step 2: Extract observations, not interpretations.

Separate what they said from what you think it means. Write observations as quotes or close paraphrases. Write interpretations separately, clearly labeled as your inference.

“She said she spends 3-4 hours every month reconciling expenses before the 5th.” — observation. “This suggests the month-end deadline creates a predictable pain spike.” — interpretation.

Keep them separate because your interpretations are hypotheses, not facts. They need to be tested across multiple interviews before you act on them.
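One lightweight way to enforce the separation is to capture notes in a structure that has no single “notes” field at all, so an interpretation can never masquerade as a quote. A minimal sketch (the field names and example text are illustrative, not a prescribed template):

```python
from dataclasses import dataclass, field

@dataclass
class InterviewNote:
    # Observation: a quote or close paraphrase of what the person actually said.
    observation: str
    # Interpretations: your inferences, kept separate and clearly labeled.
    interpretations: list = field(default_factory=list)

note = InterviewNote(
    observation="She spends 3-4 hours every month reconciling expenses before the 5th.",
)
note.interpretations.append("The month-end deadline creates a predictable pain spike.")
```

The point of the structure is the discipline, not the tooling: every interpretation stays attached to the observation that produced it, and stays visibly marked as a hypothesis.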

Step 3: Find the pattern across interviews.

After five to eight interviews, do an affinity exercise. Put every major observation on a sticky note (physical or Miro). Cluster them by theme without forcing a predetermined structure. The clusters that surprise you are the ones worth investigating further.

Three types of patterns to look for:

  • Frequency: How many people mentioned this without being asked?
  • Intensity: Who had the strongest emotional reaction to this topic?
  • Consistency: Is this pattern holding across different company sizes, industries, roles?

A pattern that appears in seven of eight interviews, across two industries, from both users and managers, and produces visible emotional intensity — that is a real problem worth solving.
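The frequency and consistency checks are simple tallies once observations are tagged by theme. A rough sketch of how you might run them over a batch of interviews (the roles, industries, and theme tags below are made up for illustration):

```python
from collections import Counter

# Each interview: (role, industry, themes the person raised unprompted).
# All data here is illustrative, not from real interviews.
interviews = [
    ("user", "logistics", {"manual-reconciliation", "deadline-pressure"}),
    ("user", "logistics", {"manual-reconciliation"}),
    ("manager", "logistics", {"manual-reconciliation", "audit-risk"}),
    ("user", "payroll", {"manual-reconciliation", "deadline-pressure"}),
    ("manager", "payroll", {"manual-reconciliation"}),
]

# Frequency: how many people mentioned each theme without being asked.
frequency = Counter(theme for _, _, themes in interviews for theme in themes)

# Consistency: does a theme hold across different roles and industries?
def consistency(theme):
    roles = {role for role, _, themes in interviews if theme in themes}
    industries = {ind for _, ind, themes in interviews if theme in themes}
    return len(roles), len(industries)
```

Intensity resists automation: it lives in tone, hesitation, and verbatim quotes, which is exactly why Step 1 insists on capturing those within 30 minutes.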

Step 4: Write one “problem story” per segment.

For each customer type you interviewed, write a one-paragraph story: who this person is, what their world looks like, what breaks for them, what they have tried, what they would need to change. Concrete names, roles, company sizes, and stakes.

This is not a persona template. It is a narrative that anyone on your team can read and understand immediately. It grounds every subsequent product discussion.

Mistakes that turn interviews into theater

Pitching your idea in the first half. Once you describe what you are building, the customer goes into feedback mode. They stop telling you about their reality and start reacting to yours. If you must share your idea, save it for the last five minutes and frame it as a thought experiment, not a pitch.

Asking hypothetical future questions. “Would you use…” “Would you pay…” “Would you recommend…” None of these are anchored in behavior. Replace every “would you” with “have you” or “did you.”

Accepting “yes” without digging. When someone says “yes, that’s a problem,” ask “can you give me a specific example?” Unprobed agreement is worth nothing. Specific examples are worth everything.

Interviewing people who like you. Your most enthusiastic early customers, your LinkedIn connections who cheer you on, your pilot users who are emotionally invested in your success — these are the worst people to interview for unbiased discovery. They will tell you what you want to hear. Find skeptics. Find people who tried something similar and rejected it. Find people who chose the competition. Their answers are calibration.

Taking feature requests literally. “I wish the export was faster” is a feature request. “What would you do with the data once it exported?” is the question that reveals whether faster export actually solves the job or whether you are optimizing something they do not actually need.

// exercise: · 90 min prep + 3 interviews
Run three interviews this week

This exercise is designed to be run on a real product you work on — not a hypothetical.

Before the interviews:

  1. Write down your current hypothesis about the biggest problem your users face. Be specific: who feels this pain, when, and what does it cost them?
  2. Write your interview goal: “This interview was useful if I learned ___.”
  3. Design five open-ended questions using the Mom Test rules — past behavior, specifics, their life not your product.

During the interviews:

  1. Record if you have consent. Do not rely on memory.
  2. Let silence do work. Wait three seconds after they finish before speaking.
  3. Follow every “yes that’s a problem” with “can you give me a recent example?”
  4. Never mention your product until the last five minutes. Ask them to react to it as a thought experiment, not a pitch.

After each interview:

  1. Write raw notes within 30 minutes. Separate observations from interpretations.
  2. Note the one thing you heard that surprised you most.

After all three:

  1. Do an affinity exercise. What patterns repeat? What surprised you across multiple interviews?
  2. Compare the patterns to your hypothesis from before the interviews.
  3. Write one sentence: “We were wrong about ___ because ___. We now believe ___.”

If that sentence is easy to write, your interviews produced real insight. If you cannot fill in the blanks, you probably stayed too close to your hypothesis during the interviews. Do three more.

The India-specific layer

Running discovery in India is different in ways that most frameworks do not account for.

Hierarchy shapes what people will say. In Indian organizations, especially mid-market and enterprise, junior users will not tell you the product is bad if they think it reflects on their manager’s decision to buy it. You will hear “it’s fine” when they mean “it does not work.” Ask about workarounds. Workarounds are honest. Opinions are not.

The buyer and the user are often strangers to each other. The IT head who evaluated your product and the accounts executive who uses it daily may have never discussed whether it actually works. Your discovery needs to reach both. The IT head will tell you about integration and security. The accounts executive will tell you about the daily reality.

“We will figure it out” is a culture of improvisation, not satisfaction. Indian mid-market businesses are exceptionally good at improvising around broken tools. An Excel sheet with 47 formulas and a WhatsApp group for approvals is not customer satisfaction. It is a workaround that has become invisible through repetition. Your job is to make that workaround visible — to help them see the cost of improvisation they have stopped noticing.

Competence anxiety is real. When interviewing potential customers, many Indian business owners will not admit they do not know something or that their current process is poor. It reflects on their competence. Create safety explicitly: “We are not evaluating your process. We are trying to understand the reality of how this category works so we can build something useful.”

Test yourself

// interactive:
The Encouraging Interview

You are a PM at a B2B SaaS startup building a vendor management tool for mid-size Indian manufacturers. You have just finished your fifth customer interview. The prospect — a procurement head at a Pune auto parts company — seemed enthusiastic throughout. He said 'yes, this is a problem' multiple times. He called the demo 'impressive.' He said he would 'definitely explore this.' Your founder wants to know if you have validation.

You are writing up the interview notes. Your founder messages: 'How did it go? Do we have signal?' You need to decide what to tell him.

// learn the judgment

You are interviewing users for Groww's mutual fund product. After 8 interviews, 7 users say they would 'definitely use' a goal-based SIP recommendation feature. Your PM mentor says interview data like this is almost always overoptimistic.

The call: Do you recommend building the feature based on 7/8 positive interviews, or do you need more evidence?


Where to go next

  • The framework for structuring what you learn: Jobs to Be Done — translating interview patterns into a clear statement of the job your product needs to get done.
  • Turning interviews into a research practice: User Research Methods — when to use interviews versus surveys, usability tests, and behavioral analytics.
  • What to do after discovery: Writing PRDs — how to turn interview insights into a spec your engineering team can execute against.
  • The broader question of what problem to solve: Problem Definition — how to write a problem statement that keeps your team aligned before and after discovery.