The Myth of the Magic Number: How to Actually Find Your Product's Activation Moment
How should you define and measure activation?
"Seven friends in 10 days." Facebook's legendary activation metric has launched a thousand cargo cults. Every growth team wants their own magic number -- a clean, simple threshold that predicts whether a user will stick. The problem is that most teams reverse-engineer a number without understanding what made Facebook's work, and end up optimizing for a proxy that does not actually cause retention.
The real lesson from Facebook is not the number. It is the method. And the method, as five growth leaders reveal, is both more rigorous and more flexible than the popular retellings suggest.
You are building a growth team. You know that users who reach some "aha moment" are far more likely to retain. How do you identify that moment, define it as a metric, and build your onboarding to get users there faster?
The 5 Positions
Evidence from the Archive
- Facebook used growth accounting to discover retention was the biggest lever, then identified friending as the key behavioral proxy
- Facebook's new user experience (upload photo, find friends) was built specifically to drive the activation metric
- Airtable's W4MUA metric: at least 2 users working together in week 4, with a 5-15% hit rate
- Airtable tracked four metrics simultaneously: W2A and W4A at the user level, W4MUA and build rate at the workspace level
- LogMeIn's aha moment was the first remote control session, which required going to a different computer -- the most complex activation funnel Ellis has ever seen
- GitLab's PLG motion required users to see a workflow simplified before they would convert
- A company Qu consulted with discovered users staring at a blank screen after free-trial signup -- the fix was pre-built sample data
- Snyk's F30D metric: teams that fix a vulnerability within 30 days of its creation are dramatically more likely to retain at 3 months
- A single fix was sufficient -- fixing additional vulnerabilities within the 30-day window did not significantly improve retention
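To make the time-boxed pattern concrete, here is a minimal sketch of an F30D-style check: did a team fix at least one vulnerability within 30 days of that vulnerability's creation? All names and the data shape are assumptions for illustration; Snyk's actual definition may differ.

```python
from datetime import datetime, timedelta

def team_hit_f30d(vulns, window_days=30):
    """F30D-style check (names illustrative): a team 'activates' if it
    fixed at least one vulnerability within `window_days` of that
    vulnerability's creation. `vulns` is a list of
    (created_at, fixed_at_or_None) pairs for one team."""
    return any(
        fixed is not None and fixed - created <= timedelta(days=window_days)
        for created, fixed in vulns
    )

team = [
    (datetime(2024, 1, 1), datetime(2024, 1, 20)),  # fixed in 19 days
    (datetime(2024, 1, 5), None),                   # never fixed
]
print(team_hit_f30d(team))  # True
```

Note that the clock starts at the vulnerability's creation, not at signup -- the "window" in an activation metric need not be anchored to account age.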
The Synthesis
The cargo cult of "find your magic number" misses the deeper lesson. Facebook's "7 friends in 10 days" worked not because of the specific number but because of three properties: it was experience-based (adding friends, not just visiting the site), it had a time constraint (10 days, creating urgency), and it was causally linked to the product's core value loop (more friends means more relevant content in the feed).
Your activation metric needs three properties: it must describe a specific user experience (not a usage count), it must happen within a defined time window (to create urgency in onboarding), and it must be causally connected to the reason users retain. Facebook's "7 friends in 10 days" had all three.
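As a concrete sketch, an "N events within D days of signup" metric is just a windowed count over an event log. The function and field names below are illustrative, not any company's actual implementation:

```python
from datetime import datetime, timedelta

def hit_activation(signup_at, event_times, threshold=7, window_days=10):
    """Did the user log at least `threshold` qualifying events
    (e.g. friend adds) within `window_days` of signup?
    Names and defaults are illustrative."""
    cutoff = signup_at + timedelta(days=window_days)
    in_window = [t for t in event_times if signup_at <= t <= cutoff]
    return len(in_window) >= threshold

# Example: a user who adds 7 friends in the first week activates.
signup = datetime(2024, 1, 1)
adds = [signup + timedelta(days=d) for d in range(7)]
print(hit_activation(signup, adds))  # True
```

The threshold and window are the tunable parts; the experience (which event counts) and the causal link are what the rest of this piece is about.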
The most common mistake is picking a metric that correlates with retention but is not actionable. "Used the product 100 times" predicts retention, but you cannot design onboarding around it. The test is simple: can you design a first-session experience that gets users to this metric?
Facebook's "adding friends" genuinely caused a better product experience. Optimizing for profile photo uploads -- which also correlated -- would have been gaming a proxy. The antidote is a three-step process: brainstorm potential aha moments, run regression against retention, then run experiments to prove causation.
The second common mistake is changing the metric too often: commit to an activation metric for at least six months, because domain expertise and compounding improvements take time. And a counterintuitively low activation rate (5-15%) may indicate a better-chosen metric with stronger predictive power.
For inherently collaborative products, individual-level activation metrics miss the point. A user who creates an Airtable base alone has not truly activated. The convergence across multiple voices is clear: use team-level or workspace-level metrics when collaboration is core to your product's value.
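A workspace-level metric in the spirit of Airtable's W4MUA (at least 2 users active together in week 4) can be sketched as a set computation over activity events. Everything here -- the function name, the event shape -- is illustrative:

```python
from collections import defaultdict

def multi_user_active_workspaces(events, week=4, min_users=2):
    """Given (workspace_id, user_id, week_index) activity events,
    return the workspaces where at least `min_users` distinct users
    were active in the given week -- a W4MUA-style, workspace-level
    activation check. Names are illustrative."""
    users_by_ws = defaultdict(set)
    for ws, user, wk in events:
        if wk == week:
            users_by_ws[ws].add(user)
    return {ws for ws, users in users_by_ws.items() if len(users) >= min_users}

events = [
    ("ws1", "alice", 4), ("ws1", "bob", 4),   # two users in week 4 -> activated
    ("ws2", "carol", 4),                      # solo -> not activated
    ("ws3", "dave", 1), ("ws3", "erin", 1),   # collaboration, but wrong week
]
print(multi_user_active_workspaces(events))  # {'ws1'}
```

The unit of analysis is the workspace, not the user -- which is exactly the shift the synthesis above recommends for collaborative products.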
Which Approach Fits You?
Three questions determine which approach fits your situation:
- What type of product are you building?
- Can you design onboarding around your activation metric today?
- How does value in your product compound?
Notable Absences
The Bottom Line
Lenny's benchmarking survey of over 500 companies provides calibration: the average activation rate is 34% overall (36% for SaaS), with a median of 25%. But Isford's counterintuitive advice complicates these benchmarks: a low activation rate (5-15%) may indicate a better-chosen metric with stronger predictive power.
There is also a dangerous conflation between correlation and causation. Facebook's "7 friends in 10 days" worked because adding friends genuinely caused a better product experience. If they had optimized for profile photo uploads instead -- which also correlated with retention -- they would have been gaming a correlation without driving value. Lenny's newsletter on how to determine your activation metric prescribes a three-step antidote: (1) brainstorm potential aha moments from usage data, (2) run regression analysis against retention, and (3) run experiments to prove causation, not just correlation.
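Step 2 can start simpler than a full regression: compare retention rates between users who did and did not perform each candidate event. The sketch below is a stdlib stand-in (a real analysis would use logistic regression with controls for confounders), and all field names are illustrative:

```python
def retention_lift(users, candidates):
    """For each candidate activation event, compute the difference in
    retention rate between users who performed it and users who did not.
    `users` is a list of dicts with a boolean flag per candidate plus a
    'retained' flag. A large lift flags a candidate worth testing
    causally in step 3 -- it does not prove causation by itself."""
    def rate(group):
        return sum(1 for u in group if u["retained"]) / len(group) if group else 0.0
    lifts = {}
    for c in candidates:
        did = [u for u in users if u[c]]
        didnt = [u for u in users if not u[c]]
        lifts[c] = rate(did) - rate(didnt)
    return lifts

users = [
    {"added_friends": True,  "uploaded_photo": True,  "retained": True},
    {"added_friends": True,  "uploaded_photo": False, "retained": True},
    {"added_friends": False, "uploaded_photo": True,  "retained": False},
    {"added_friends": False, "uploaded_photo": False, "retained": False},
]
print(retention_lift(users, ["added_friends", "uploaded_photo"]))
# {'added_friends': 1.0, 'uploaded_photo': 0.0}
```

In this toy data, friend-adding separates retained from churned users while photo uploads do not -- but only an experiment (step 3) can show the lift is causal rather than a proxy.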
Sources
- Naomi Gleit, Head of Product at Meta — Lenny's Podcast, Oct 2024
- Lauryn Isford, Head of Growth at Airtable — Lenny's Podcast, Feb 2023
- Hila Qu, Growth Strategist at Reforge / GitLab — Lenny's Podcast, Apr 2023
- Sean Ellis, Author of Hacking Growth — Lenny's Podcast, Sep 2024
- Ben Williams, VP of Product at Snyk — Lenny's Podcast, Nov 2022
- Lenny's Newsletter — "What is a good activation rate"
- Lenny's Newsletter — "How to determine your activation metric"