Learning and Development Strategy
L&D strategy is the deliberate plan for building the capabilities your business will need in 12-36 months, not the catalog of training courses you happen to offer. Strong L&D answers three questions: (1) What capabilities do we need that we don't have? (2) Build, buy, or borrow? (3) How do we measure that capability now exists at scale? The 70-20-10 framework remains the most credible model: 70% of development comes from on-the-job stretch assignments, 20% from coaching/mentoring, 10% from formal training. Most L&D budgets invert this, spending 70% on the 10% that produces the least learning. The result: massive course catalogs, stagnant capability.
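That inversion is easy to check with back-of-envelope arithmetic. A minimal sketch, assuming a hypothetical $2M budget; the figure and the spend split below are illustrative, not from any specific company:

```python
# Back-of-envelope check of the 70-20-10 inversion described above.
# Budget figure and spend split are hypothetical illustrations.

budget = 2_000_000

# Where learning actually happens, per the 70-20-10 model:
learning_share = {"on-the-job": 0.70, "coaching/mentoring": 0.20, "formal training": 0.10}

# Where a typical catalog-heavy budget actually goes (inverted):
typical_spend = {"on-the-job": 0.10, "coaching/mentoring": 0.20, "formal training": 0.70}

for channel, share in learning_share.items():
    dollars = budget * typical_spend[channel]
    print(f"{channel}: drives {share:.0%} of learning, receives ${dollars:,.0f}")
```

The point of the exercise is the mismatch per line: the channel that drives 70% of learning receives the smallest slice of spend.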
The Trap
The trap is measuring L&D by activity (courses offered, hours consumed, completion rates) instead of capability (can people DO the thing). A company runs 200 courses annually with 85% completion rates and reports 'a thriving learning culture.' Meanwhile: managers can't run a structured interview, only 12% of engineers have shipped a feature using the new framework, and the 'data-literate workforce' initiative produced no actual data analysis behind decisions. Activity metrics make L&D feel productive while the underlying capabilities don't move. The other trap: confusing 'access to learning' (LinkedIn Learning subscription for everyone) with 'capability building' (people actually develop measurable skills).
What to Do
Run a capability gap audit: list 5-10 strategic capabilities the business needs in 24 months (e.g., 'product managers can run rigorous experiments,' 'engineers can deploy ML models'). For each, score current state (1-5) and target state. The biggest gaps get the L&D budget. For each gap, design with 70-20-10: a stretch project (the 70%), a paired coach/mentor (the 20%), and a focused workshop (the 10%), in that priority. Measure capability via demonstrated work (a shipped experiment, a deployed model), not training completion. Cut any program that can't show capability change in 12 months.
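The audit above can be sketched as a tiny script. The capability names and scores here are hypothetical placeholders for your own:

```python
# Hypothetical capability gap audit: score current vs. target state (1-5),
# then rank by gap size; the biggest gaps get the L&D budget first.

capabilities = {
    # capability: (current_state, target_state)
    "PMs can run rigorous experiments": (2, 4),
    "Engineers can deploy ML models": (1, 4),
    "Managers can run structured interviews": (3, 4),
}

ranked = sorted(
    capabilities.items(),
    key=lambda item: item[1][1] - item[1][0],  # gap = target - current
    reverse=True,
)

for name, (current, target) in ranked:
    print(f"gap {target - current}: {name} ({current} -> {target})")
```

The output order is the budget priority order; everything below the top few gaps is a candidate for cutting.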
In Practice
Hypothetical: A mid-stage SaaS company replaced its 180-course catalog with 4 capability tracks (product analytics, customer-research interviewing, technical architecture, executive communication). Each track required completing a stretch assignment with a paired coach, judged by a panel of senior leaders. Total annual L&D spend dropped 40%. After 18 months, the % of PMs running structured experiments rose from 20% to 71%, and the % of senior engineers leading architecture reviews rose from 35% to 68%. Activity metrics (course completions) plummeted; capability metrics tripled.
Pro Tips
1. The 70-20-10 model (originally from Center for Creative Leadership research, 1980s) is widely cited but often misapplied: the percentages aren't a budget-allocation prescription, they're an observation about WHERE learning happens. The real lesson: design programs that anchor learning in real work, with formal training as supporting scaffolding.
2. Run a 'capability calibration' annually: senior leaders rate the % of their team that demonstrates each strategic capability at the required level. The gap between rating and reality is the L&D backlog. This single conversation refocuses L&D more effectively than any training-needs survey.
3. Replace 'learning hours' as a metric with 'capability-tagged work products': when a developer ships their first ML model, when a manager runs their first calibration session, when a PM completes a structured experiment. These are demonstrations of capability, not consumption of content.
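The annual capability calibration described above reduces to simple arithmetic: compare leaders' rated coverage against the share of the team with a demonstrated, capability-tagged work product. A minimal sketch; all numbers are invented for illustration:

```python
# Hypothetical capability calibration: leaders' rated coverage vs. the
# share of the team with a demonstrated, capability-tagged work product.
# All figures are invented for illustration.

calibration = {
    # capability: (leader_rated_coverage, demonstrated_coverage)
    "structured experiments": (0.60, 0.20),
    "architecture reviews": (0.50, 0.35),
}

# The gap between rating and reality is the L&D backlog.
backlog = {
    capability: round(rated - demonstrated, 2)
    for capability, (rated, demonstrated) in calibration.items()
}

for capability, gap in sorted(backlog.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{capability}: rated {gap:.0%} higher than demonstrated")
```

Sorting by the gap turns the calibration conversation directly into a prioritized backlog.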
Myth vs Reality
Myth
"More training = better learning"
Reality
Ebbinghaus's forgetting curve shows people forget 50% of new information within 24 hours and 90% within a week, UNLESS they apply it. Training without application is content theater. The 70% that sticks comes from on-the-job application, not the training itself.
Myth
"L&D should be self-directed via platforms like LinkedIn Learning"
Reality
Self-directed learning has near-zero correlation with capability building at the organization level. People click through content they're already comfortable with, completion rates are vanity metrics, and no one is held accountable for outcomes. Capability requires designed pathways, real work, and human accountability, not a content library.
Industry benchmarks
Is your number good?
Calibrate against real-world tiers. Use these ranges as targets, not absolutes.
L&D Spend per Employee (Annual)
Companies of 1,000+ employees, knowledge-worker-heavy industries

Heavy Investment: > $1,500
Strong: $1,000-1,500
Average: $500-1,000
Light: $200-500
Minimal: < $200
Source (hypothetical): composite of ATD State of the Industry and Brandon Hall benchmarks
Real-world cases
How companies live this.
Case narratives with the numbers that prove (or break) the concept.
Hypothetical: Mid-Stage SaaS Capability Pivot
2024-2025
A 600-person SaaS company faced board pressure: 'Why is L&D $1.4M when product velocity isn't improving?' The new VP of People killed 75% of the course catalog and built 4 capability tracks: Product Experimentation, Technical Architecture, Customer Research, Executive Communication. Each track required a real stretch project (e.g., run an A/B test that ships, lead an architecture review for a real system) reviewed by a senior leader panel. Course completion metrics dropped 90%; capability metrics tripled in 18 months.
Course Catalog Reduction: -75%
PMs Running Structured Experiments: 20% → 71%
Senior Eng Leading Architecture: 35% → 68%
L&D Budget Change: -40%, refocused
L&D budgets are wasted when measured by content consumption. The same dollars produce 3-5x more capability when redirected to stretch assignments + coaching. The hardest part is killing the courses โ they have political constituencies even when they don't work.
Beyond the concept
Turn Learning and Development Strategy into a live operating decision.
Use this concept as the framing layer, then move into a diagnostic if it maps directly to a current bottleneck.
Typical response time: 24h · No retainer required