
Customer Interviews

Customer interviews are structured 1:1 conversations whose only job is to surface what real users actually do: not what they say they'll do, not what they wish for, not what they hypothetically might pay for. Steve Portigal's Interviewing Users (Rosenfeld Media, 2013; 2nd ed. 2023) frames it cleanly: a good interview is 'rapport-building plus the disciplined avoidance of leading questions.' The interviewer's job is to elicit specific past stories and let the user do 80%+ of the talking. Teresa Torres' Continuous Discovery Habits made the cadence concrete (one interview per week per product trio) and tied each interview to an opportunity on the team's tree. The asymmetry: a 30-minute interview with a real user routinely surfaces an insight that six months of internal debate missed.

Also known as: Discovery Interviews, Problem Interviews, 1:1 User Interviews, Generative Interviews, JTBD Interviews

The Trap

The single biggest trap is asking about the future. 'Would you use this?' 'Would you pay $X for that?' 'How important is feature Y to you?' All produce noise dressed up as signal. People are terrible forecasters of their own behavior and will lie to be polite. Erika Hall calls this 'asking the audience to do your job.' The second trap is interviewing for validation rather than learning: going in with a solution and asking questions designed to confirm it. The third trap is talking too much. PMs who pitch the product mid-interview train the user to nod along, then act surprised when the launched feature flops.

What to Do

Run interviews using the past-tense pattern from The Mom Test (Rob Fitzpatrick) and Portigal: (1) Schedule 30-45 minutes with one user, never groups. (2) Open with rapport, then ask for a SPECIFIC past episode: 'Walk me through the last time you tried to do X.' (3) Probe with 'Then what?' and 'Why?', never 'Would you?' (4) Listen for behaviors, workarounds, money already spent, and emotional language. (5) Stop talking; the 5-second silence is where the real answer lives. (6) Synthesize within 24 hours into your opportunity tree. Aim for one interview per week, minimum.
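The 'never ask about the future' rule in step (3) is mechanical enough to check before the call. A toy script linter, sketched in Python (the phrase list is my own heuristic, not from Fitzpatrick or Portigal):

```python
# Heuristic linter for interview scripts: flag questions that ask about
# hypothetical future behavior instead of specific past episodes.
FUTURE_TENSE_FLAGS = [
    "would you", "will you", "could you see yourself",
    "how important", "do you think you'd", "would that be",
]

def is_leading(question: str) -> bool:
    """Return True if the question smells like a future/hypothetical ask."""
    q = question.lower()
    return any(phrase in q for phrase in FUTURE_TENSE_FLAGS)

script = [
    "Walk me through the last time you tried to export a report.",
    "Then what happened?",
    "Would you pay $20/month for better dashboards?",
]
flagged = [q for q in script if is_leading(q)]
```

Anything flagged gets rewritten as a 'last time you…' question before the interview, not improvised on the call.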

In Practice

Steve Portigal recounts a study for a kitchen appliance brand where the team was certain users wanted more pre-set programs on their bread machines. Instead of asking 'would you like more programs?', interviewers asked users to walk through the last loaf they'd baked. They discovered most users only ever used one or two settings, and abandoned the machine entirely after a single failure. The real opportunity wasn't more programs; it was a recovery experience for the first failed loaf. The team killed a planned firmware feature and shipped a 'what went wrong?' troubleshooting flow that doubled second-loaf rates. (Source: Interviewing Users, Steve Portigal, 2013)

Pro Tips

  • 01

    Erika Hall's rule from Just Enough Research: 'The most useful research is the kind that disconfirms your assumptions.' Track which interviews changed your mind. If 10 interviews in a row confirmed everything you already thought, you're either a genius or, far more likely, asking leading questions.

  • 02

    Record (with permission) and have a teammate observe. The interviewer can't catch every cue while running the conversation. The observer's notes catch the body language, the long pauses, and the moment the user's energy shifted.

  • 03

    The exact phrases users use to describe their pain are the headlines for your landing page, the tooltips in your product, and the names of your features. Steal their language verbatim โ€” never translate it into 'product speak.'

Myth vs Reality

Myth

"You need a statistically significant number of interviews before you can act on what you hear"

Reality

Qualitative interviews aren't surveys. The Nielsen Norman Group's research shows ~5 well-chosen interviews surface ~85% of recurring patterns. By interview 6-8 you're hearing repeats. Continuing past saturation is procrastination disguised as rigor.
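The ~85% figure traces to NN/g's problem-discovery model, 1 - (1 - L)^n, with their commonly cited average per-user discovery rate of L ≈ 31%. A quick check of the curve in Python:

```python
# NN/g problem-discovery model: share of recurring issues surfaced after
# n interviews, assuming each session independently reveals a given issue
# with probability L (~0.31 in Nielsen's published averages).
def share_discovered(n: int, L: float = 0.31) -> float:
    return 1 - (1 - L) ** n

for n in (1, 3, 5, 8):
    print(n, round(share_discovered(n), 2))
# n=5 lands near 0.84, i.e. roughly 85% of recurring patterns
```

The curve also shows why saturation kicks in: each interview past the fifth adds single-digit percentage points, which is why repeats dominate by interview 6-8.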

Myth

"Customers know what they want; just ask them"

Reality

Customers know their PROBLEMS in vivid detail. They are unreliable narrators of their own future behavior. The interview's job is to extract the problem, not crowdsource the solution. Designing the solution is the team's job.

Try it


Scenario Challenge

You're interviewing a product marketer at a target customer. They tell you: 'Our biggest problem is that we don't have good analytics; we'd definitely buy a tool that gave us better dashboards.' What do you ask next?

Industry benchmarks

Is your number good?

Calibrate against real-world tiers. Use these ranges as targets, not absolutes.

Customer Interviews per Month (per product trio), B2B SaaS product trios:

Continuous Discovery (Torres): 4+ / month
Healthy: 2-4 / month
Sporadic: 1 / month
Feature Factory: 0 (interviews only at kickoff)

Source: Continuous Discovery Habits, Teresa Torres
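The tiers above translate directly into a lookup you could drop into a team health dashboard. A minimal sketch (tier names from the table; the function and thresholds-as-code are mine):

```python
def interview_tier(per_month: float) -> str:
    """Map monthly interview count per product trio to Torres' tiers."""
    if per_month >= 4:
        return "Continuous Discovery"
    if per_month >= 2:
        return "Healthy"
    if per_month >= 1:
        return "Sporadic"
    return "Feature Factory"
```

The point of encoding it is less the label than the trend: a trio sliding from 'Healthy' to 'Sporadic' is a leading indicator of a roadmap drifting back to opinion.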

Real-world cases

Companies that lived this.

Verified narratives with the numbers that prove (or break) the concept.


Steve Portigal โ€” Interviewing Users

2013-2023 (2nd ed.)


Portigal's Interviewing Users became the operating manual for product discovery interviews. His central argument is that interviewing is a learnable craft, not innate empathy. He decomposes the interview into preparation (research goals, recruiting, screener), execution (rapport, open questions, the long pause, probes), and synthesis (tagging, pattern-finding, sharing). Teams that train the craft report 2-3x more usable insights per interview vs. teams that just 'go talk to customers' without a method.
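Portigal's synthesis step (tagging, then pattern-finding) is, at its simplest, frequency counting over tagged quotes. A minimal sketch of that step, assuming hypothetical tag names rather than any taxonomy from the book:

```python
from collections import Counter

# Each entry: (interview_id, tag) applied to a quote during synthesis.
# Tags here are illustrative examples, not Portigal's.
tagged_quotes = [
    (1, "workaround"), (1, "export-pain"),
    (2, "export-pain"), (3, "export-pain"),
    (3, "pricing-confusion"), (4, "workaround"),
]

# Count how many DISTINCT interviews each tag appeared in: a tag that
# recurs across interviews is a pattern, not a one-off anecdote.
pattern_strength = Counter(tag for _, tag in set(tagged_quotes))
```

Counting distinct interviews (rather than raw quote counts) keeps one talkative user from manufacturing a 'pattern' on their own.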

Recommended Length: 60-90 minutes
Target Interviewer Talk Time: ~20%
Sample Size for Pattern: 5-8 per segment
Books Sold: Industry-standard reference

Customer interviews fail not from lack of empathy but from lack of craft. The discipline of asking past-tense questions and shutting up afterward is teachable, repeatable, and produces dramatically better insight than 'just talking to users.'


Erika Hall โ€” Just Enough Research

2013-2019 (2nd ed.)


Erika Hall (co-founder of Mule Design) wrote Just Enough Research as an antidote to two failure modes: research theater (massive studies that don't change decisions) and research nihilism ('we don't have time, just ship'). Her thesis: small, focused interviews, done weekly, are the highest-ROI activity in product development. She popularized the framing that research's job is to disconfirm your assumptions, not validate them. Teams that internalize this rule kill 20-40% more bad ideas before they hit the backlog.

Recommended Cadence: Weekly
Recommended Sample/Round: 5-6 users
Highest-ROI Output: Disconfirmed assumptions
Common Failure Mode: Research theater

You don't need a research department to run interviews. You need a weekly slot, a script that asks about the past, and the discipline to act on what disconfirms your plan.


Decision scenario

The Sales-Driven Roadmap

You're a PM at a 50-person SaaS. Sales is pushing hard for an 'enterprise reporting' module based on 3 lost deals where prospects mentioned reporting. The CEO wants a commit by Friday. You have one week and zero customer interviews on the calendar.

Recent Customer Interviews: 0 in 90 days
Sales Anecdotes Driving Roadmap: 3 lost deals
Proposed Build Size: 10 engineering weeks
Time Until Commit: 5 business days

Decision 1

You can either ship the commit based on sales' anecdotes or use the week to interview 6-8 actual customers about how they handle reporting today.

Option A: Commit to the build; sales has the closest customer signal and the CEO needs an answer Friday.
Outcome: 10 weeks later you ship enterprise reporting. 6% of accounts open it. Sales says prospects 'still want better reporting'; it turns out the 3 lost deals each meant something different by 'reporting' (one wanted SOC 2 audit logs, one wanted a CSV export, one wanted a Power BI connector). You built a fourth thing none of them asked for.
Engineering Weeks Spent: +10 · Adoption: 6% · Lost Deals Recovered: 0

Option B: Spend the week running 6 interviews (3 with the lost-deal contacts, 3 with current customers), then commit on Friday with real evidence.
Outcome: Interviews reveal the 'reporting' need is actually 3 distinct jobs. You re-scope the commit to a 3-week CSV export plus scheduled email (the largest overlap), with a follow-on roadmap item for the audit log work. Adoption hits 41%. Two of the three lost deals re-engage. You bank 7 weeks of engineering capacity.
Engineering Weeks Spent: +3 (saved 7) · Adoption: 41% · Lost Deals Re-Engaged: 2 of 3


Beyond the concept

Turn Customer Interviews into a live operating decision.

Use this concept as the framing layer, then move into a diagnostic if it maps directly to a current bottleneck.

Typical response time: 24h · No retainer required