Pinpoint your activation metric: from candidate list to causal proof
Your team has been guessing at the activation milestone, and the retention curve has not budged. This guide walks the team through brainstorming candidate moments, running a regression against retention, and designing the experiment that proves the milestone is causal, not just correlated.
Activation Metrics Decide Whether Onboarding Has a Job to Do
A product without a defined activation metric runs onboarding by guess. A product with the wrong activation metric runs onboarding against a target that does not move retention. Pinning down the right milestone is one of the highest-leverage moves in growth, and it is also one of the most commonly skipped because the work crosses qualitative discovery, quantitative correlation, and causal experimentation.
Amplitude's writing on activation rate frames the milestone as the first moment a new user gets enough value that they are likely to retain. The definition is simple. Picking the right specific moment for your product is the work.
Why correlation alone is dangerous
Once a team has analytics, the temptation is to score every event by retention lift and crown the winner. The lift is real, but the causal claim usually is not. Power users hit many moments early because they are power users; the moment that correlates strongest with retention may simply be a marker of motivation, not a lever the team can pull.
Three failure modes follow:
- Marker, not lever. "Users who hit X retain 2x" sounds actionable until you build onboarding around X and retention does not move. X was tracking motivation, not creating it.
- Survivorship. Cohorts already self-select before the candidate moment. The correlation is real but the moment is downstream of an earlier filter.
- Power-user contamination. A handful of heavy users dominate the retention numerator. The moment correlates with retention because heavy users hit everything.
The remedy is the experiment. If a treatment that increases the share of users hitting the moment also moves retention by the predicted amount, the moment is causal. If it does not, you found a marker.
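That check can be made concrete. A minimal sketch, with entirely hypothetical numbers: if the milestone were fully causal, moving a known share of users across it should move retention by the activation lift times the correlational retention gap. Comparing that prediction against the observed result tells you how much of the correlation was causal.

```python
# Sketch: is the milestone causal? Compare the retention lift the correlation
# predicts against what the experiment actually delivered.
# All numbers below are hypothetical placeholders, not benchmarks.

def predicted_retention_lift(activation_lift, retention_delta):
    """If the milestone were fully causal, moving `activation_lift` share of
    users across it should add `activation_lift * retention_delta` retention."""
    return activation_lift * retention_delta

# Correlation pass said activated users retain 40pp better (0.40).
# Treatment moved activation from 30% to 38% (+0.08 share of users).
predicted = predicted_retention_lift(0.08, 0.40)  # +3.2pp retention expected

observed = 0.004  # the experiment observed only +0.4pp retention
causal_share = observed / predicted  # fraction of the correlation that is causal
print(f"predicted +{predicted:.1%}, observed +{observed:.1%}, "
      f"causal share ~{causal_share:.0%}")
```

A causal share near 100% supports building onboarding around the moment; a share this low (12.5% in the toy numbers) says you found a marker.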
How the "Pinpoint your activation metric" prompt works
Step 1 brainstorms 6-10 candidate moments at three levels: account, user action, workspace. Mixing levels matters because B2B products often need both a user-level activation and a workspace-level milestone tracked alongside it.
Step 2 filters candidates by the three constraints that make a metric usable: observable, influenceable, and hit by most users within their first 7 days. A moment that takes 30 days to hit is not an activation metric; it is a retention metric.
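The Step 2 filter is mechanical once the candidate list exists. A minimal sketch, with hypothetical candidate moments and timings:

```python
# Sketch of the Step 2 filter: keep only moments that are observable in
# analytics, influenceable by onboarding, and hit by most users inside
# 7 days. Candidate names and timings are hypothetical.

CANDIDATES = [
    # (name, observable, influenceable, median_days_to_hit)
    ("created_first_project", True,  True,  1.0),
    ("invited_teammate",      True,  True,  3.5),
    ("felt_aha_moment",       False, True,  2.0),   # not instrumented -> drop
    ("renewed_subscription",  True,  False, 30.0),  # too slow, not a lever
]

def usable(name, observable, influenceable, median_days):
    return observable and influenceable and median_days <= 7

kept = [c[0] for c in CANDIDATES if usable(*c)]
print(kept)  # only the observable, influenceable, fast moments survive
```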
Step 3 runs the correlation pass. The retention horizon is product-specific: D7 for high-frequency consumer apps, W4 for prosumer SaaS, M2 for B2B with longer adoption cycles. Amplitude's cohort analysis primer explains why the horizon should match the natural frequency of value delivery.
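The core of the correlation pass is a split: for each surviving candidate, compare retention at the horizon between users who hit the moment and users who did not. A toy sketch over six hypothetical users:

```python
# Minimal Step 3 correlation pass over toy event data: for each candidate
# moment, compare the retention rate of users who hit it vs. those who did
# not. Users, moments, and retention flags are hypothetical.

users = [
    # (user_id, moments_hit, retained_at_horizon)
    (1, {"created_project", "invited_teammate"}, True),
    (2, {"created_project"},                     True),
    (3, {"created_project"},                     False),
    (4, set(),                                   False),
    (5, {"invited_teammate"},                    True),
    (6, set(),                                   False),
]

def retention_split(moment):
    """Retention rate for users who hit the moment vs. users who missed it."""
    hit    = [retained for _, moments, retained in users if moment in moments]
    missed = [retained for _, moments, retained in users if moment not in moments]
    rate = lambda flags: sum(flags) / len(flags) if flags else 0.0
    return rate(hit), rate(missed)

for moment in ("created_project", "invited_teammate"):
    hit_rate, miss_rate = retention_split(moment)
    print(f"{moment}: hit={hit_rate:.0%} missed={miss_rate:.0%}")
```

In real data this runs per cohort at the product's horizon (D7, W4, or M2 as above) rather than one pooled pass.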
Step 4 tests for the threshold effect. The strongest activation metrics have a clear plateau (hitting the moment twice retains the same as five times) and a clear time bound (the lift collapses if it takes more than three days). Both signals reduce the search space. A moment with no plateau is probably a usage proxy, not an activation gate.
Step 5 designs the causal experiment. The hypothesis is named, the treatment is named, the sample size detects the smallest lift that would justify deploying engineering, and guardrails prevent the team from celebrating a noise win. Reforge's north-star playbook and Amplitude's north-star primer both stress that activation metrics live underneath the north star, and the experiment is what proves the link.
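The sample-size part of Step 5 has a standard back-of-envelope form. A sketch using the normal approximation for a two-proportion test at alpha = 0.05 (two-sided) and 80% power; the baseline rate and minimum detectable effect below are hypothetical:

```python
# Back-of-envelope Step 5 sample size for a two-proportion experiment,
# normal approximation, alpha=0.05 two-sided (z=1.96), power=0.80 (z=0.84).
# Baseline activation and minimum detectable effect are hypothetical.
import math

def sample_size_per_arm(p_base, mde, z_alpha=1.96, z_beta=0.84):
    p_treat = p_base + mde
    p_bar = (p_base + p_treat) / 2
    num = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_beta * math.sqrt(p_base * (1 - p_base)
                                + p_treat * (1 - p_treat))) ** 2
    return math.ceil(num / mde ** 2)

# e.g. 30% baseline activation; smallest lift worth shipping is +3pp
n = sample_size_per_arm(0.30, 0.03)
print(n, "users per arm")
```

The point of computing this before launch is the one the step makes: if the smallest lift that justifies the engineering work needs more traffic than you have, the experiment design changes, not the success criterion.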
Step 6 stands up the dashboard. Activation rate, retention split, weekly trend with confidence intervals. The dashboard is what keeps the metric alive after the analysis ships.
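The confidence-interval piece of the dashboard keeps weekly wobbles from being over-read. A sketch using the Wilson score interval for one hypothetical weekly cohort:

```python
# Sketch of the Step 6 dashboard math: a 95% Wilson score interval around a
# week's activation rate. The weekly cohort counts are hypothetical.
import math

def wilson_interval(activated, total, z=1.96):
    p = activated / total
    denom = 1 + z ** 2 / total
    center = (p + z ** 2 / (2 * total)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / total
                                     + z ** 2 / (4 * total ** 2))
    return center - margin, center + margin

lo, hi = wilson_interval(activated=156, total=480)  # one week's new users
print(f"activation {156 / 480:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```

Two weekly points whose intervals overlap are not a trend; that is the guardrail the dashboard encodes.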
B2B and multi-player products need both a user metric and a workspace metric
Single-player products can pick one activation moment. Multi-player and B2B products usually need two: a user-level action ("invited a teammate," "shared a doc") that the growth team can move, plus a workspace-level milestone ("3 active editors by week 4," "first integration installed") that gates account-level retention. The user-level metric is operational; the workspace-level milestone monitors whether the account is on a healthy trajectory.
Teams that pick only the workspace milestone find the growth team stuck because the metric is too coarse to influence with onboarding. Teams that pick only the user action find that strong user activation does not always translate into account retention if the workspace stalls. Both are needed, and they should be tracked side by side.
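Tracking the two side by side just means computing both from the same event stream. A sketch with hypothetical event names and a hypothetical "3 active editors" threshold:

```python
# Sketch of side-by-side tracking for a multi-player product: a user-level
# activation flag plus a workspace-level milestone, computed from the same
# events. Action names and the 3-editor threshold are hypothetical.
from collections import defaultdict

events = [
    # (workspace, user, action)
    ("acme", "u1", "shared_doc"), ("acme", "u2", "edited_doc"),
    ("acme", "u3", "edited_doc"), ("acme", "u1", "edited_doc"),
    ("bloom", "u4", "shared_doc"),
]

# User-level metric: shared a doc (the lever onboarding can pull)
user_activated = {user for _, user, action in events if action == "shared_doc"}

# Workspace-level milestone: 3+ active editors (the account health gate)
editors = defaultdict(set)
for workspace, user, action in events:
    if action == "edited_doc":
        editors[workspace].add(user)
workspace_activated = {ws for ws, eds in editors.items() if len(eds) >= 3}

print(sorted(user_activated), sorted(workspace_activated))
```

In the toy data, "bloom" has an activated user but a stalled workspace, which is exactly the gap that tracking only the user metric would hide.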
When to use it
- Your retention curve has not moved despite onboarding changes and you suspect the activation target is wrong.
- A new product is launching and the team needs to commit to an activation metric before the first month of data lands.
- A previous activation metric correlated with retention but did not move when the team experimented against it (a marker, not a lever).
- A new growth lead is joining and wants to install a defensible activation contract before designing experiments.
- A B2B product is missing a workspace-level activation alongside its user-level metric.
Common pitfalls
- Stopping at correlation. A 2x retention lift is not proof. The experiment is what proves causation.
- Wrong horizon. Measuring D7 retention on a B2B product where adoption takes 30 days produces noise, not signal.
- One metric for multi-player products. B2B and multi-player products need both a user action and a workspace milestone.
Sources
- What is a good activation rate - Amplitude
- Activation metrics primer - Amplitude
- Cohort analysis primer - Amplitude
- North-star metric playbook - Reforge
- North-star metric primer - Amplitude