Updated · 10 min read
Onboarding flows: signup to activated
Activation is where the lifecycle does its heaviest lifting, and where most programs chronically underinvest. Teams confuse 'sending onboarding emails' with 'onboarding'. Not the same thing. Here's the playbook — define the activation event, ship a first-seven-days sequence that matches the urgency, coordinate with in-product so you stop repeating yourself, and stop the sequence the moment the job is done.
By Justin Williames
Founder, Orbit · 10+ years in lifecycle marketing
Activation is one specific action, not a feeling
The discriminating power of the event is what matters. Pick a signal that splits your retention curves and commit to it.
Before anyone writes an email, define the activation event. This is the step most programs skip, and it's why they can't tell if the work is working. What you need is one specific, measurable action that empirically predicts long-term retention. Signed up doesn't count. Opened the app twice doesn't count. A completed profile rarely counts either.
What it is: a threshold pulled from your own data, tuned so users who cross it retain at two to three times the rate of users who don't. In a messaging product it might be a count of messages sent in the first month. In a subscription product, reaching a specific feature or workflow. In a marketplace, a second transaction. The number itself doesn't matter — the discriminating power does. Pick something behavioural (the user did a thing), measurable (you can count it per user), and predictive (retention curves visibly split either side of it).
How do you find it? Pull the retention curves for users who took various actions in week one. Find the action where the curve for users-who-did flattens out well above the curve for users-who-didn't. That's your signal. It will almost always be more specific than "logged in twice".
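The curve-splitting search above can be sketched in a few lines. This is an illustrative sketch, not Orbit's implementation: it assumes a list of user records with an (invented) set of week-one actions and a 30-day retention flag, and ranks candidate actions by the retention lift of users who did them over users who didn't.

```python
# Hypothetical sketch: find the week-one action whose presence best splits
# 30-day retention. `users` is a list of dicts; the field names
# (`week_one_actions`, `retained_day_30`) are invented for illustration.
def retention_rate(users):
    return sum(u["retained_day_30"] for u in users) / len(users) if users else 0.0

def best_activation_candidate(users, candidate_actions):
    best = None
    for action in candidate_actions:
        did = [u for u in users if action in u["week_one_actions"]]
        didnt = [u for u in users if action not in u["week_one_actions"]]
        if not did or not didnt:
            continue  # no split to measure: everyone (or no one) did it
        lift = retention_rate(did) / max(retention_rate(didnt), 1e-9)
        if best is None or lift > best[1]:
            best = (action, lift)
    return best  # (action, retention lift of doers over non-doers)

users = [
    {"week_one_actions": {"login", "sent_3_messages"}, "retained_day_30": True},
    {"week_one_actions": {"login"}, "retained_day_30": False},
    {"week_one_actions": {"login", "sent_3_messages"}, "retained_day_30": True},
    {"week_one_actions": {"login"}, "retained_day_30": True},
]
print(best_activation_candidate(users, ["login", "sent_3_messages"]))
# → ('sent_3_messages', 2.0)
```

A lift around 2–3x, as in this toy cohort, is the "two to three times the rate" threshold the definition above calls for; "logged in" fails here precisely because everyone did it, so it has no discriminating power.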
A representative activation funnel, showing the drop-off from signup down to activated:

[Figure: a typical activation funnel]

The drop-off between signup and true activation is usually larger than teams expect. Most onboarding programs spend their energy on the top of this funnel, where drop-off is already lowest. The real opportunity sits in the middle.
The first seven days do the heavy lifting
Most activation happens in the first 72 hours or not at all. The sequence has to match that urgency. Spreading six emails across 30 days means half the audience has already made the call by the time email four lands. The 72-hour window guide has the intervention grid; for the sequence itself, here's a structure that works.
Hour 0 — Welcome with one specific action. Not "explore the product". Not "here's what we do". One clear CTA that moves the user toward the activation event.
Day 1 — Value proof. Show the user what they're going to get, with a specific example. Don't ask for another action yet. Show value first, earn the next ask.
Day 2–3 — The second action. The next meaningful step. Branched by whether the user completed the hour-0 action or not.
Day 5 — Social proof or pattern. How other users are succeeding. Works when the users are named and specific. Reads as noise when it's generic.
Day 7 — Check-in. Friendly, low-pressure. This is the message that separates users still on the journey from users who quietly dropped out.
Four to six messages across seven to ten days is the sweet spot. Longer sequences hit diminishing returns hard — because most activation already happened, and you're now emailing a cohort that's decided. The discipline isn't hitting every planned message. The discipline is stopping the sequence the moment activation fires.
On the content mix: early messages prompt action with minimal education. Users who just signed up have appetite, not patience. Middle-sequence messages can carry more "why this works" once the user has skin in the game. Pure education without a next step is usually the weakest email in the whole sequence.
Every message is conditional on the user NOT having activated yet. The moment the activation event fires, the sequence stops and the user moves to post-activation lifecycle. Continuing to send onboarding emails to activated users is how you train them to ignore everything else.
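The stop rule is simple enough to sketch directly. This is an illustrative sketch of the gating logic, not a real ESP or Braze API; the sequence schedule mirrors the structure above, and the field names are assumptions.

```python
# Illustrative sketch: every onboarding send is gated on the activation event
# NOT having fired. Template names and user fields are invented.
SEQUENCE = [
    {"day": 0, "template": "welcome_one_action"},
    {"day": 1, "template": "value_proof"},
    {"day": 3, "template": "second_action"},
    {"day": 5, "template": "social_proof"},
    {"day": 7, "template": "check_in"},
]

def next_send(user, today):
    # The moment activation fires, the sequence stops: nothing else qualifies.
    if user["activated"]:
        return None
    due = [s for s in SEQUENCE
           if s["day"] <= today and s["template"] not in user["sent"]]
    return due[0]["template"] if due else None

user = {"activated": False, "sent": ["welcome_one_action"]}
print(next_send(user, today=1))   # → value_proof
user["activated"] = True
print(next_send(user, today=3))   # → None: sequence stopped, move to post-activation
```

The important property is that the activation check comes first: an activated user gets nothing further from this sequence no matter how many steps remain unsent.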
Coordinate with in-product, or the emails get wasted
Email is one channel. In-product tooltips, checklist widgets, and empty-state prompts are another. Push is a third. If these three don't coordinate, you send the user the same reminder four times and create friction instead of progress. The single most common onboarding mistake isn't a bad email. It's sending the same messages regardless of what the user just did in-product. An email saying "haven't set up your first project?" when the user set one up yesterday is worse than no email at all.
Coordination looks like this. The in-product checklist owns the canonical state of what's been completed. The email sequence reads that same state and adapts. Push fills the gaps where email timing is wrong — overnight pushes don't work, overnight emails often do. And no channel fires its "you haven't done X yet" message if another channel just fired the same reminder within the last N hours.
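The "no channel repeats a reminder another channel just fired" rule can be sketched as a suppression check over a shared send log. A minimal sketch under assumed names — the 24-hour window stands in for the "N hours" in the text, and none of this is a real orchestration API:

```python
# Sketch of the cross-channel suppression rule: no channel fires a
# "you haven't done X" nudge if ANY channel sent the same reminder
# within the last N hours. All names here are illustrative.
from datetime import datetime, timedelta

SUPPRESSION_WINDOW = timedelta(hours=24)  # the "N hours" from the text, chosen arbitrarily

def may_send(reminder_id, channel, send_log, now):
    recent = [
        entry for entry in send_log
        if entry["reminder_id"] == reminder_id
        and now - entry["sent_at"] < SUPPRESSION_WINDOW
    ]
    # `channel` is deliberately unused in the check: a recent copy of this
    # reminder from ANY channel suppresses every other channel.
    return not recent

log = [{"reminder_id": "setup_first_project", "channel": "in_app",
        "sent_at": datetime(2024, 5, 1, 9, 0)}]
print(may_send("setup_first_project", "email", log, datetime(2024, 5, 1, 18, 0)))  # → False
print(may_send("setup_first_project", "email", log, datetime(2024, 5, 2, 12, 0)))  # → True
```

Keying the log on the reminder, not the channel, is the whole point: it lets the in-product checklist, email, and push behave like one program rather than three.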
The Orbit Multi-Channel Orchestration skill handles exactly this — channel selection, frequency governance, and the adaptive sequencing that lets onboarding programs behave like one coherent experience instead of three that happen to share a user ID.
Progressive profiling — ask for data when it matters
The data you collect at signup is usually the minimum viable. Email, maybe a password, maybe a name. That's the right call, because every extra field at signup increases drop-off. But you still need the extra data to personalise the rest of the lifecycle. Progressive profiling is how you get it — collect more over time, at moments when asking makes sense.
The practical rule: tie each data request to a specific piece of product value the user is about to receive. "What industry are you in?" alongside a feature that genuinely changes by industry. "What's your team size?" alongside the pricing page. Never a generic profile-completion email with a form full of demographic questions. It converts poorly. It also signals to the user that you're doing data collection, not personalisation — and they can tell the difference.
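The tie-each-question-to-a-value-moment rule amounts to a small lookup plus a never-re-ask guard. A hypothetical sketch — the moment names and profile fields are invented for illustration:

```python
# Sketch of progressive profiling: each question is attached to the moment
# the answer changes what the user receives, and known data is never re-asked.
PROFILE_PROMPTS = {
    # value_moment: (question, profile field it fills) — all names invented
    "industry_template_gallery": ("What industry are you in?", "industry"),
    "pricing_page_view": ("What's your team size?", "team_size"),
}

def prompt_for(value_moment, profile):
    entry = PROFILE_PROMPTS.get(value_moment)
    if entry is None:
        return None            # no matching value moment: ask nothing
    question, field = entry
    return None if profile.get(field) else question  # never re-ask known data

print(prompt_for("pricing_page_view", {"industry": "retail"}))  # → What's your team size?
print(prompt_for("pricing_page_view", {"team_size": 12}))       # → None
```

Note what's absent: there is no "complete your profile" path. If no value moment is in play, no question is asked.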
When the onboarding sequence is done
A sequence completes when one of three things happens: the user activates, the user reaches the explicit end without activating (in which case they transition to the non-activated-user lifecycle stage), or the user hits a hard negative signal — marked a message as spam, unsubscribed, or hasn't opened anything in N days.
The third condition is the one most often missed. A user who hasn't opened any message in the sequence is telling you they don't want these emails. Keep sending and you damage sender reputation (see the Deliverability Management skill for the mechanism) in exchange for a tiny incremental activation rate. Cut the sequence short when the engagement signal is clearly negative.
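The three completion conditions, including the easily-missed negative-signal cutoff, reduce to a priority-ordered status check. A sketch under assumed thresholds and field names:

```python
# Sketch of the three sequence-completion conditions. The thresholds
# (sequence length, unopened cutoff) and field names are assumptions.
SEQUENCE_LENGTH_DAYS = 10
MAX_UNOPENED = 4  # cut the sequence short after this many consecutive unopened sends

def sequence_status(user):
    if user["activated"]:
        return "activated"
    if (user["spam_complaint"] or user["unsubscribed"]
            or user["consecutive_unopened"] >= MAX_UNOPENED):
        return "stopped_negative_signal"   # the condition most programs miss
    if user["days_since_signup"] > SEQUENCE_LENGTH_DAYS:
        return "completed_not_activated"   # moves to non-activated lifecycle stage
    return "in_sequence"

u = {"activated": False, "spam_complaint": False, "unsubscribed": False,
     "consecutive_unopened": 4, "days_since_signup": 6}
print(sequence_status(u))  # → stopped_negative_signal
```

The ordering matters: the negative-signal check runs before the end-of-sequence check, so a user who has stopped opening is cut at day 6 rather than carried to day 10.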
How do you measure the whole thing beyond opens and clicks? Primary metric: activation rate within the measurement window — the percentage of signups who hit the activation event within N days. Secondary: 30-day retention of activated users versus non-activated. The onboarding programs that earn their budget improve both. They lift activation AND the downstream retention of the newly-activated cohort. Shift one without the other and you're usually just moving users into a stage they weren't ready for.
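The two headline metrics can be computed together from one signup table. An illustrative sketch with invented field names — the 7-day window stands in for "N days":

```python
# Sketch of the program metrics: activation rate within the window, plus
# 30-day retention of activated vs non-activated signups. Fields are invented.
def program_metrics(signups, window_days=7):
    activated = [u for u in signups
                 if u.get("activated_on_day") is not None
                 and u["activated_on_day"] <= window_days]
    rest = [u for u in signups if u not in activated]

    def retention(group):
        return sum(u["retained_day_30"] for u in group) / len(group) if group else 0.0

    return {
        "activation_rate": len(activated) / len(signups),
        "retention_activated": retention(activated),
        "retention_non_activated": retention(rest),
    }

signups = [
    {"activated_on_day": 2, "retained_day_30": True},
    {"activated_on_day": 5, "retained_day_30": True},
    {"activated_on_day": None, "retained_day_30": False},
    {"activated_on_day": 9, "retained_day_30": True},  # activated, but outside the window
]
m = program_metrics(signups)
print(m["activation_rate"])  # → 0.5
```

Reporting the two retention numbers side by side is what catches the failure mode in the text: if activation rate rises while the activated cohort's retention falls, you've moved users into a stage they weren't ready for rather than activated them.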
This guide is backed by an Orbit skill
Related guides
The welcome email sequence: the 7-day structure that works
Most welcome sequences over-pitch, under-onboard, and keep firing long after the user has either activated or wandered off. Here's the 7-day shape that moves signups to activation — shorter, sharper, conditional on what the user actually did — and the stop rules that keep it from training users to ignore you.
The first 72 hours decide who activates
Activation isn't a seven-day project. It's a 72-hour race most teams lose without noticing. Why the window is that short, what to watch inside it, and how to intervene while users are still reachable.
Abandoned cart emails: what actually works
Cart abandonment is the easiest program to get wrong because the defaults work well enough to hide the problem. Here's the structure that actually moves incremental revenue — timing, sequencing, and the discount policy most teams have backwards.
Post-purchase emails: what to send after the receipt
Post-purchase is the highest-engagement window in the entire customer relationship and most lifecycle programs spend it sending a receipt, a generic welcome, and then silence. Here's the 30-day sequence that actually earns the second purchase.
Win-back flows: 12 patterns that earn their place
Win-back is the highest-ROI program most lifecycle teams underbuild. Twelve patterns that work, when each one fits, and the sunset policy that stops the program quietly eating your sender reputation.
Transactional emails: the highest-engagement messages you ignore
Order confirmations, password resets, receipts, shipping updates. Transactional emails post open rates two to three times higher than marketing sends — and most lifecycle teams have never touched them. Effort is going to the wrong place.
Use this in Claude
Run this methodology inside your Claude sessions.
Orbit turns every guide on this site into an executable Claude skill — 54 lifecycle methodologies, 55 MCP tools, native Braze integration. Pay what it's worth.