Design a 90-day launch metric review
Updated 4/17/2026
Description
Your launch hit its week-1 activation target and everyone moved on. Ninety days later, nobody can say whether the feature mattered. This prompt designs a 90-day review that measures sustained engagement, business impact, and the cost of keeping the feature alive, so you can decide whether to double down or sunset.
Example Usage
You are running the 90-day review for {{feature_name}}. Launch date: {{launch_date}}. Success bar pre-committed: {{success_bar}}.
## Step 1 — Quantitative scorecard
| Metric | Target | Actual | Verdict (win/neutral/loss) |
|--------|--------|--------|----------------------------|
| Activation rate | | | |
| Sustained engagement (weekly active) | | | |
| Retention impact (vs. non-users) | | | |
| Revenue impact (new ARR, expansion, reduced churn) | | | |
| NPS contribution | | | |
| Support load | | | |
| Maintenance cost (eng hours/quarter) | | | |
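The scorecard above can be sketched as a small verdict function. This is a minimal illustration with hypothetical targets and actuals; the classification rule (win if the actual meets the target, loss if it misses by more than 20%, neutral otherwise) is an assumption, not part of the prompt.

```python
def verdict(target: float, actual: float, higher_is_better: bool = True) -> str:
    """Classify a metric as win/neutral/loss against its pre-committed target."""
    # For cost-type metrics (support load, maintenance), lower is better,
    # so the ratio is inverted.
    ratio = actual / target if higher_is_better else target / actual
    if ratio >= 1.0:
        return "win"
    if ratio >= 0.8:  # within 20% of target — assumed neutral band
        return "neutral"
    return "loss"

# Hypothetical numbers for illustration only.
scorecard = {
    # metric: (target, actual, higher_is_better)
    "Activation rate": (0.40, 0.43, True),
    "Sustained engagement (weekly active)": (0.25, 0.21, True),
    "Retention impact (vs. non-users)": (0.05, 0.02, True),
    "Revenue impact (new ARR, $k)": (150, 160, True),
    "Support load (tickets/week)": (20, 35, False),
    "Maintenance cost (eng hours/quarter)": (80, 120, False),
}

results = {m: verdict(t, a, hib) for m, (t, a, hib) in scorecard.items()}
for metric, v in results.items():
    print(f"{metric}: {v}")
```

Filling the Verdict column this way keeps the review honest: the rule is fixed before anyone sees the actuals.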
## Step 2 — Qualitative read
- Top 3 positive themes from customer feedback
- Top 3 friction themes
- The one customer quote that best captures the value
- The one customer quote that best captures the risk
## Step 3 — Decision
Pick one:
- **Double down**: feature hit the bar; invest in a v2 scope
- **Sustain**: feature hit the bar; keep as-is with no new investment
- **Reduce**: feature partially hit the bar; cut maintenance to the minimum
- **Sunset**: feature missed the bar; retire it with a migration plan
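The four-way decision can be expressed as a simple rule over the Step 1 verdicts. The thresholds below (all wins means double down, no losses means sustain, and so on) are illustrative assumptions; calibrate them against your own pre-committed success bar.

```python
def decide(verdicts: list[str]) -> str:
    """Map a list of win/neutral/loss verdicts to one of the four decisions."""
    wins = verdicts.count("win")
    losses = verdicts.count("loss")
    if wins == len(verdicts):
        return "double down"   # every metric hit its target
    if losses == 0:
        return "sustain"       # nothing missed badly; no new investment
    if wins >= losses:
        return "reduce"        # partial hit; minimize maintenance
    return "sunset"            # missed the bar; plan the migration

print(decide(["win", "win", "win"]))        # every metric a win
print(decide(["win", "neutral", "win"]))    # no losses
print(decide(["win", "loss", "neutral"]))   # mixed result
print(decide(["loss", "loss", "neutral"]))  # mostly losses
```

Writing the rule down before the review forces the team to argue about the threshold, not the verdict.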
## Step 4 — Output
1. Filled scorecard
2. Qualitative read summary
3. Decision with rationale
4. A 1-page memo for leadership
5. The one input that, if different, would have flipped the decision