Onboarding email flows: signup to activated
Most onboarding programs fail in the same three ways: no clear activation metric, emails sent without awareness of what the user did in-product, and a sequence that keeps going after the user clearly activated. Fix those three and your onboarding starts doing the thing onboarding is supposed to do — move signups to activated users, measurably. Here's the playbook I've used across B2C onboarding programs.
Justin Williames
Founder, Orbit · 10+ years in lifecycle marketing
Activation is one specific action, not a feeling
The discriminating power of an activation event is what matters — not the number itself. Pick a signal that behaviourally splits your retention curves.
Before you write a single email, define the activation event. This is the single most-often-skipped step in onboarding work, and the reason so many programs can't tell whether they're working. Activation is a specific, measurable action that empirically predicts long-term retention. Signed up is not activation. Opened the app twice is not activation. Completed a profile is rarely activation.
What is: a threshold you derived from your own data, tuned so that users who cross it retain at 2–3x the rate of users who don't. For a messaging product it might be a specific count of messages sent in the first month; for a subscription product it might be reaching a feature or workflow; for a marketplace it might be a second transaction. The number itself isn't meaningful — the discriminating power is. Pick a signal that's behavioural (the user DID something), measurable (you can count it per user), and predictive (retention curves split visibly either side of the threshold).
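The threshold search described above can be sketched directly: sweep candidate thresholds and compare retention either side of each one. A minimal example, assuming each user record carries a week-one action count and a 30-day retention flag (the field names and numbers are illustrative, not from any real schema):

```python
# Hypothetical shape: one record per user with their week-1 action count
# and whether they were still active at day 30.
users = [
    {"messages_week1": 12, "retained_d30": True},
    {"messages_week1": 1,  "retained_d30": False},
    {"messages_week1": 7,  "retained_d30": True},
    {"messages_week1": 0,  "retained_d30": False},
    {"messages_week1": 3,  "retained_d30": False},
    {"messages_week1": 9,  "retained_d30": True},
]

def retention_split(users, threshold):
    """Retention rate above vs below a candidate activation threshold."""
    above = [u for u in users if u["messages_week1"] >= threshold]
    below = [u for u in users if u["messages_week1"] < threshold]
    def rate(group):
        return sum(u["retained_d30"] for u in group) / len(group) if group else 0.0
    return rate(above), rate(below)

# Sweep candidate thresholds; the best one maximises the gap between the
# two rates, not the absolute number.
for t in (2, 5, 8):
    hi, lo = retention_split(users, t)
    print(f"threshold={t}: retained {hi:.0%} above vs {lo:.0%} below")
```

On real data you would run this per cohort and pick the threshold where the two curves separate most cleanly, then sanity-check that the split holds across cohorts.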
Figure: a representative activation funnel, showing the typical drop-off from signup to activated. The drop-off between signup and true activation is usually larger than teams expect. Most onboarding programs spend their energy on the top of this funnel, where drop-off is already lowest; the real leverage is in the middle.
The first seven days do the heavy lifting
Most activation happens in the first 72 hours or not at all. The onboarding sequence has to match that urgency — spreading six emails across 30 days means half your audience has already decided by email four. The 72-hour window guide covers the intervention grid in more depth; for the sequence itself, a practical structure:
Hour 0 — Welcome with one specific action. Not "explore the product". Not "here's what we do". One clear CTA that moves the user toward the activation event.
Day 1 — Value proof. Show the user what they're going to get from the product, ideally with a specific example. Don't ask for another action — show value first, earn the next ask.
Day 2–3 — The second action. The next meaningful step toward activation. Differentiated by whether the user completed the hour-0 action or not.
Day 5 — Social proof or pattern. How other users are succeeding. Works when real users are referenced specifically; reads as noise when generic.
Day 7 — Check-in. Friendly, low-pressure. This is the message that separates the users still on the journey from the users who have silently dropped out.
All messages are conditional on the user NOT having activated yet. The moment the activation event fires, the sequence stops and the user moves to the post-activation lifecycle stage. Continuing to send onboarding emails to activated users is how you train them to ignore you.
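The sequence above can be sketched as a simple gate: pick the earliest unsent message whose send time has passed, and return nothing at all once the user has activated. The user-record shape and template names here are hypothetical, not from any particular tool:

```python
from datetime import datetime, timedelta

# Illustrative sequence definition: (send offset from signup, template id).
SEQUENCE = [
    (timedelta(hours=0), "welcome_one_action"),
    (timedelta(days=1),  "value_proof"),
    (timedelta(days=2),  "second_action"),
    (timedelta(days=5),  "social_proof"),
    (timedelta(days=7),  "check_in"),
]

def next_onboarding_email(user, now):
    """Return the next template to send, or None if nothing should go out.

    `user` is assumed to carry `signup_at`, `activated` (has the activation
    event fired?) and `sent` (set of template ids already delivered).
    """
    if user["activated"]:
        return None  # activated users leave the sequence immediately
    for offset, template in SEQUENCE:
        if template in user["sent"]:
            continue  # already delivered, move to the next message
        if now >= user["signup_at"] + offset:
            return template  # earliest unsent message that is now due
    return None  # nothing due yet
```

The key property is that the activation check runs on every send decision, so an activation event fired between messages stops the sequence without any extra wiring.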
Coordinate with in-product, or the emails get wasted
The email sequence is one channel. In-product tooltips, checklist widgets, and empty-state prompts are another. Push notifications are a third. If these three don't coordinate, you send the user the same reminder four times and create friction instead of progress.
Coordination means: the in-product checklist drives the canonical state of what the user has completed. The email sequence reads from that same state and adapts content accordingly. Push notifications fill the gaps where email timing is wrong (overnight pushes don't work; overnight emails often do). And no channel fires its "you haven't done X yet" message if another channel just fired the same reminder within the last N hours.
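The last rule, no channel repeating a reminder another channel just fired, can be sketched as a suppression check against a shared send log. The data shape and the 24-hour window are assumptions for illustration, not a prescription:

```python
from datetime import datetime, timedelta

SUPPRESSION_WINDOW = timedelta(hours=24)  # the "N hours" from the rule above

def should_fire(reminder_key, channel, recent_sends, now,
                window=SUPPRESSION_WINDOW):
    """Suppress a reminder if ANY channel sent the same reminder recently.

    `recent_sends` maps (reminder_key, channel) -> last send time. The
    channel requesting the send doesn't matter: suppression is keyed on
    the reminder itself, across all channels.
    """
    for (key, _ch), sent_at in recent_sends.items():
        if key == reminder_key and now - sent_at < window:
            return False
    return True

# Example: the in-product checklist nudged "set up your first project"
# three hours ago, so the email version of the same reminder is held back.
recent = {("setup_first_project", "in_product"): datetime(2024, 1, 1, 9)}
print(should_fire("setup_first_project", "email", recent,
                  datetime(2024, 1, 1, 12)))  # suppressed -> False
```

The important design choice is that the log is keyed by reminder, not by channel, which is what makes the in-product checklist and the email sequence read from one canonical state instead of each keeping their own.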
The Orbit Multi-Channel Orchestration skill handles exactly this — channel selection logic, frequency governance, and the adaptive sequencing that lets onboarding programs actually behave coherently.
Progressive profiling — ask for data when it matters
The data you collect at signup tends to be the minimum viable — email, maybe a password, maybe a name. That's the right call: asking for more at signup increases drop-off. But you still need that additional data to personalise the lifecycle program. Progressive profiling is how you get it: collect more user data over time, at moments when asking makes sense.
The practical rule: tie each additional data request to a specific piece of product value the user is about to receive. "What industry are you in?" alongside a feature that genuinely changes based on industry. "What's your team size?" alongside a pricing page. Never a generic profile-completion email with a form full of demographic questions — that converts poorly and signals to the user that you're doing data collection, not personalisation.
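A minimal sketch of that rule: one profiling question per product moment, asked only when the answer is still missing. The moments and field names are hypothetical placeholders:

```python
# Illustrative mapping from a product moment to the single profiling
# question that moment justifies.
PROFILING_PROMPTS = {
    "industry_feature_opened": ("industry", "What industry are you in?"),
    "pricing_page_viewed":     ("team_size", "What's your team size?"),
}

def profiling_prompt(moment, profile):
    """Return at most one question, only if we don't already have the answer."""
    field, question = PROFILING_PROMPTS.get(moment, (None, None))
    if field and not profile.get(field):
        return question
    return None  # no question for this moment, or the answer is on file
```

The point of routing every prompt through the user's existing profile is exactly the anti-pattern above: you never re-ask, and you never ask outside a moment that justifies the question.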
When the onboarding sequence is 'done'
An onboarding sequence is complete when one of three things happens: the user activates (fires the activation event), the user reaches the explicit end of the sequence without activating (transition to the non-activated-user lifecycle stage), or the user hits a hard negative signal (marked the email as spam, unsubscribed, hasn't opened any message in N days).
The third condition is the one most often missed. A user who hasn't opened any email in the sequence is telling you they don't want these emails. Continuing to send them damages sender reputation (see the Deliverability Management skill for the connection) and produces a tiny incremental activation rate. Cut the sequence short when engagement signals are clearly negative.
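The three exit conditions can be collapsed into a single check, sketched here under an assumed user shape (flags for activation, sequence exhaustion, and the negative signals listed above):

```python
from datetime import datetime, timedelta

# The "N days" without an open; tune per program.
ENGAGEMENT_CUTOFF = timedelta(days=10)

def sequence_done(user, now):
    """Return the exit reason, or None while the sequence should continue.

    The user fields (`activated`, `sequence_exhausted`, `spam_complaint`,
    `unsubscribed`, `last_open_at`, `signup_at`) are an assumed shape.
    """
    if user["activated"]:
        return "activated"  # hand off to the post-activation stage
    if user["sequence_exhausted"]:
        return "non_activated_stage"  # sequence ended without activation
    negative = (
        user["spam_complaint"]
        or user["unsubscribed"]
        or (user["last_open_at"] is None
            and now - user["signup_at"] > ENGAGEMENT_CUTOFF)
    )
    if negative:
        return "negative_signal"  # cut the sequence short
    return None
```

Running this check before every send is what keeps the third condition from being missed: the no-opens cutoff fires automatically instead of depending on someone remembering to build it.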
Frequently asked questions
- How many emails should an onboarding sequence have?
- Usually 4–6 across the first 7–10 days, conditional on the user not having activated yet. Longer sequences hit diminishing returns fast because most activation happens in the first 72 hours. The discipline is stopping the sequence the moment the user activates, not hitting every planned message.
- What's the single most common onboarding mistake?
- Sending the same messages regardless of what the user did in-product. Email and product are one experience to the user. An email saying 'haven't set up your first project?' when the user set up their first project yesterday is worse than not sending anything.
- How do I define the activation event for my product?
- Pick a behavioural event that empirically predicts long-term retention. Look at the retention curves for users who took various actions in week one; find the action where the curve of users who took it visibly separates from the curve of users who didn't. That's your activation signal. It's usually more specific than 'logged in twice'.
- Should onboarding emails include product education or just prompts to act?
- Both, but in the right order. Early messages prompt action with minimal education (users who just signed up have appetite, not patience). Middle-sequence messages include more 'why this works' education once the user has some skin in the game. Pure education without a next step is typically the weakest email in the sequence.
- When should progressive profiling data be collected?
- At moments tied to specific product value. 'What industry are you in?' alongside a feature that genuinely changes by industry. 'What's your team size?' alongside a pricing page. Never a generic 'complete your profile' email with a form of demographic questions.
- How do I measure onboarding success beyond opens and clicks?
- Primary: activation rate within the measurement window (% of signups who hit the activation event within N days). Secondary: 30-day retention of activated users vs non-activated. The best onboarding programs improve both — they lift activation rate AND the downstream retention of the newly-activated cohort.
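The two metrics in that last answer can be computed together over a signup cohort. A sketch, assuming each user record carries the day they activated (None if never) and a 30-day retention flag; the field names are illustrative:

```python
def onboarding_metrics(cohort, window_days=14):
    """Activation rate within the window, plus the 30-day retention split."""
    def is_activated(u):
        return (u["activated_day"] is not None
                and u["activated_day"] <= window_days)

    activated = [u for u in cohort if is_activated(u)]
    non_activated = [u for u in cohort if not is_activated(u)]

    def rate(group):
        return sum(u["retained_d30"] for u in group) / len(group) if group else 0.0

    return {
        "activation_rate": len(activated) / len(cohort) if cohort else 0.0,
        "retention_activated": rate(activated),
        "retention_non_activated": rate(non_activated),
    }
```

Tracked per weekly signup cohort, the gap between the two retention numbers is also a running check that the activation event still discriminates.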
Related guides
The cadence question: how often should you email?
Everyone asks how often to email and almost nobody answers it properly. This guide is about what 'cadence' actually means, why the right number isn't a single number, and the five inputs that settle the debate for your specific program.
The first 72 hours decide who activates
Activation isn't a seven-day project — it's a 72-hour race most teams lose without noticing. This guide is about why the window is that short, what to watch inside it, and how to intervene while users are still reachable.
Personalisation that doesn't feel creepy
There's a line between personalisation that earns a user's trust and personalisation that breaks it. This guide is about where the line actually is, how lifecycle programs cross it without noticing, and the specific patterns that keep you on the right side.