
Cohort Retention Analysis

Cohort retention analysis groups customers by the period they signed up (the "cohort") and tracks what percentage are still active 1, 3, 6, and 12 months later. Instead of an aggregate churn number that hides everything, you get a triangular table showing whether your January cohort retains better than your June cohort. The shape of the curve matters more than the absolute number: does retention flatten (healthy) or keep dropping toward zero (a leaky bucket)? Cohorts are the only honest way to detect whether product changes, onboarding tweaks, or pricing shifts actually improved retention, because they isolate the variable of WHEN someone joined.

Also known as: Cohort Analysis, Retention Curves, Cohort Tracking, Vintage Analysis

The Trap

The trap is reading averaged retention curves instead of cohort tables. Aggregate retention can look healthy because new signups inflate the active user count even as old cohorts hemorrhage. Founders celebrate "85% monthly retention" that's really 60% for last quarter's cohorts, hidden by a flood of new signups. The other trap: comparing cohorts of different sizes or sources without normalizing. A 50-person organic cohort and a 5,000-person paid cohort behave completely differently; paid traffic almost always retains worse, so blending them masks the problem.

What to Do

Build a cohort table monthly: rows = signup month, columns = months 0-12, cells = % still active. Color-code green to red. Look for three things: (1) the M1 retention number (your onboarding quality), (2) the M3-M6 slope (your product-market fit signal), (3) whether the curve flattens by M6 (the existence of a "sticky core"). Then segment cohorts by acquisition channel, plan tier, and persona. The leverage isn't in the average; it's in finding the cohort segment that retains 2x better and pouring acquisition spend there.
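The monthly table above can be assembled in a few lines of code. A minimal sketch in Python, assuming a simple in-memory data shape (`signups` mapping user to signup-month index, `activity` as (user, month) pairs) rather than any particular analytics tool's schema:

```python
from collections import defaultdict

def cohort_table(signups, activity, max_month=12):
    """Build a {cohort_month: {month_offset: retention}} table.

    signups:  {user_id: signup_month_index}  (e.g. 0 = January)
    activity: set of (user_id, month_index) pairs, one per active month
    """
    # Original cohort sizes: how many users signed up in each month.
    cohort_sizes = defaultdict(int)
    for user, m0 in signups.items():
        cohort_sizes[m0] += 1

    # Group active users by (cohort, months-since-signup).
    active = defaultdict(set)
    for user, month in activity:
        offset = month - signups[user]
        if 0 <= offset <= max_month:
            active[(signups[user], offset)].add(user)

    # Retention = active users at offset k / original cohort size.
    table = {}
    for cohort, size in cohort_sizes.items():
        table[cohort] = {
            k: len(active[(cohort, k)]) / size
            for k in range(max_month + 1)
            if (cohort, k) in active
        }
    return table
```

Reading across a row shows whether that cohort's curve flattens; reading down a column (fixed month offset) compares cohorts against each other, which is exactly the triangular table described above.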

Formula

Cohort Retention (Month N) = Active Users in Cohort at Month N ÷ Original Cohort Size
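The formula is a direct division. As a quick worked sketch (numbers invented): 1,000 January signups with 280 still active six months later gives M6 retention of 28%:

```python
def cohort_retention(active_at_month_n, original_cohort_size):
    # Cohort Retention (Month N) = active users at month N / original cohort size
    return active_at_month_n / original_cohort_size

cohort_retention(280, 1000)  # -> 0.28, i.e. 28% M6 retention
```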

In Practice

Slack's early growth team famously built cohort retention dashboards segmented by team size. They discovered that teams that sent 2,000+ messages in their first 3 weeks retained at 93% after a year, while teams under 2,000 messages churned at 70%. This single insight reshaped onboarding: every product decision was optimized to push new teams past the 2,000-message threshold. The cohort analysis didn't just measure retention; it identified the activation metric that defined whether a team would stick.

Pro Tips

1. If your retention curve doesn't flatten by M6, you don't have product-market fit; you have a leaky bucket. No amount of acquisition spend will fix it. Fix retention first, scale second.

2. Always include a "self-serve vs sales-led" cohort split. Sales-led cohorts almost always retain 2-3x better because they're qualified, but they're also expensive. The gap tells you whether your self-serve onboarding is broken.

3. Plot cohorts as overlapping curves on a single chart, not just a table. Patterns jump out visually: a cohort that retains worse after a specific product change is immediately obvious as a downward shift in the curve.

Myth vs Reality

Myth

"Cohort retention is only for SaaS"

Reality

E-commerce, mobile apps, marketplaces, and content sites all have cohort dynamics. An e-commerce "retention curve" is the % of buyers who order again in months 2, 3, 4. Mobile apps track day-1, day-7, day-30 retention by install cohort. The framework is universal: anywhere you have repeated customer engagement, cohorts apply.
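Day-based install cohorts work exactly like month-based ones. A small sketch (the data shapes here are assumptions) computing classic day-N retention, the share of an install-day cohort seen again exactly N days after install:

```python
from datetime import date, timedelta

def dn_retention(installs, events, n):
    """Classic day-N retention per install-day cohort.

    installs: {user_id: install_date}
    events:   set of (user_id, event_date) pairs
    Returns {install_date: fraction of that cohort active on day n}.
    """
    # Group users into install-day cohorts.
    by_day = {}
    for user, d0 in installs.items():
        by_day.setdefault(d0, []).append(user)

    out = {}
    for d0, users in by_day.items():
        target = d0 + timedelta(days=n)
        retained = sum(1 for u in users if (u, target) in events)
        out[d0] = retained / len(users)
    return out
```

Note that "classic" day-N retention counts only users seen on day N itself; some analytics tools also offer an unbounded variant (seen on day N or any later day), so make sure cohorts are compared on the same definition.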

Myth

"A flat retention curve is the goal"

Reality

A flat retention curve is the MINIMUM goal. Best-in-class consumer products show an upward-sloping curve in later months: old cohorts re-engage and become more active over time. This is the "smile curve" that Facebook and TikTok have. Flat retention is healthy; rising retention is exceptional.

Try it

Run the numbers.

Pressure-test the concept against your own knowledge: answer the challenge or try the live scenario.


Knowledge Check

Your January cohort shows 60% M1 retention, 35% M3, 28% M6, and 27% M12. Your June cohort shows 70% M1, 50% M3, 40% M6. What's the most accurate read?

Industry benchmarks

Is your number good?

Calibrate against real-world tiers. Use these ranges as targets, not absolutes.

M12 Cohort Retention (B2B SaaS, annual contracts, paid logos)

Best-in-Class SaaS: > 80%
Healthy SaaS: 60-80%
Average SaaS: 40-60%
Leaky: 20-40%
Broken Model: < 20%

Source: OpenView SaaS Benchmarks 2024

Mobile App D30 Retention (consumer mobile apps, organic installs)

Best-in-Class: > 25%
Good: 15-25%
Average: 8-15%
Poor: < 8%

Source: Adjust Mobile Benchmarks 2024

Real-world cases

Companies that lived this.

Verified narratives with the numbers that prove (or break) the concept.


Slack (2014-2015)

Slack's growth team built cohort retention dashboards segmented by team size and message volume in the first 3 weeks. They discovered teams that hit 2,000 messages had a 93% one-year retention rate, while teams below that threshold churned at 70%. This "aha number" became the north-star metric for onboarding. Every product change (channel suggestions, integration prompts, the "invite teammates" nudge) was measured against whether it pushed new teams past 2,000 messages.

Activation Threshold: 2,000 messages in 3 weeks
Above Threshold Retention (1yr): 93%
Below Threshold Retention (1yr): 30%
Outcome: $27B Salesforce acquisition

Cohort analysis isn't just a measurement tool; it's a discovery tool. The 2,000-message insight defined Slack's entire activation strategy and was only visible by slicing cohorts by behavioral attributes.


Netflix (2019-2022)

Netflix uses cohort retention curves segmented by "first content watched" to make programming decisions. Cohorts whose first show was a binge-able original series (Stranger Things, Squid Game) retained at 90%+ after 6 months. Cohorts whose first content was a movie or licensed show retained 30-40% lower. This drove their pivot from licensing to original content: they weren't measuring "engagement" generally, they were measuring whether new subscriber cohorts found a "sticky" show fast enough.

Original Series First-Watch Cohort 6mo Retention: ~90%
Licensed Content First-Watch 6mo Retention: ~55%
Annual Original Content Spend (2022): $17B
Subscriber Base: 230M+

Cohort retention by first experience reveals what makes users stick. Netflix's $17B/year original content spend is justified by cohort math: subscribers who hit a tentpole show in week 1 retain dramatically better.


Decision scenario

The Hidden Cohort Crisis

You're VP Growth at a Series B SaaS. Aggregate monthly retention reads 88% and you've been celebrating. Your CFO asks you to slice the cohort table for the board deck. You discover that cohorts from the last 6 months are retaining at 65% by M3, much worse than the 80% your older cohorts hit. The aggregate looked healthy because new signups (driven by a paid push) inflated active users while older cohorts continued to retain well.

Aggregate Monthly Retention: 88%
Recent Cohort M3 Retention: 65%
Older Cohort M3 Retention: 80%
Paid Acquisition Spend: $120K/mo
New Signups/mo: 850
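This masking effect is easy to reproduce with toy numbers (every curve and cohort size below is invented): the blended month-over-month retention stays above 90% because the large, young paid cohorts are still early on their curves, even though their M3 retention is only 65%:

```python
def aggregate_monthly_retention(cohorts, month):
    """Share of users active in `month` who are still active in month+1,
    blended across all cohorts (what a single dashboard number shows).

    cohorts: list of (signup_month, size, curve), where curve[k] is the
    fraction of the cohort still active k months after signup.
    """
    active_now = active_next = 0.0
    for m0, size, curve in cohorts:
        k = month - m0
        if 0 <= k < len(curve) - 1:
            active_now += size * curve[k]
            active_next += size * curve[k + 1]
    return active_next / active_now

# Invented curves: old cohorts flatten at 80%; recent paid cohorts sink to 65% by M3.
old_curve = [1.0, 0.90, 0.85, 0.80] + [0.80] * 9
new_curve = [1.0, 0.88, 0.74, 0.65, 0.62, 0.60, 0.59, 0.58]

# Six small seasoned cohorts, then six large recent paid cohorts.
cohorts = [(m, 200, old_curve) for m in range(6)] \
        + [(m, 850, new_curve) for m in range(6, 12)]

aggregate_monthly_retention(cohorts, month=11)  # stays above 0.90 despite 65% M3
```

Slicing the same data by cohort, as the CFO asks, exposes the 65% M3 immediately; the blended number cannot.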

Decision 1

The board wants growth numbers. You can present the aggregate (which looks great), or surface the cohort gap (which raises uncomfortable questions about your paid spend).

Option A: Present the aggregate retention; slice by cohort once you've fixed it internally, no need to alarm the board.
Outcome: You buy yourself one quarter. The next quarter, the older cohorts continue to age out and the recent cohorts' poor retention becomes mathematically unhideable. Aggregate retention drops to 76%. The board now sees both the bad number AND that you knew earlier and didn't flag it. You lose credibility, your paid budget gets slashed by the board, and the VP Growth role gets restructured.
Aggregate Retention (next Q): 88% → 76%. Board Trust: High → Damaged.

Option B: Show the cohort table, identify that paid social cohorts are the problem, and propose pausing paid spend to fix activation.
Outcome: The board respects the transparency. You pause paid social spend, freeing $120K/mo to invest in onboarding redesign. Within 2 quarters, M3 retention on new cohorts climbs to 78%. You restart paid spend with a much higher bar: only channels with M3 retention above 70% get scaled. Net: lower top-line growth for one quarter, much healthier unit economics, and a credible growth playbook.
Recent Cohort M3 Retention: 65% → 78%. Paid Spend Efficiency: Low → 2x.


Beyond the concept

Turn Cohort Retention Analysis into a live operating decision.

Use this concept as the framing layer, then move into a diagnostic if it maps directly to a current bottleneck.

Typical response time: 24h · No retainer required
