The cadence question: how often should you email?
The most common lifecycle question I've been asked across a decade of CRM work is 'how often should we email?'. It comes from a CEO who thinks the team sends too much. It comes from a growth lead who thinks the team sends too little. It comes from a CX team who wants a blanket cap. The answer is never a single number, and most of the heat the question generates is a consequence of treating it like one.
By Justin Williames
Founder, Orbit · 10+ years in lifecycle marketing
'How often should we email?' is the wrong question
The right cadence for user A is the wrong cadence for user B. Program-level cadence questions force an answer that's wrong for most of the audience, every time.
Built into the question is an assumption: cadence is a single program-wide setting — a dial you turn to find the magic number. It isn't. Cadence is an emergent property of five other decisions, each of which is a more productive conversation than the top-level one.
A user three days into onboarding should get more mail than your average subscriber. A user dormant for six months should get less. A user who bought this week has a different post-purchase sequence than one who's never converted. The subscriber who opens everything and clicks three a week can receive what a cold-list recipient cannot. The right cadence for user A is the wrong cadence for user B, so the program-level version of the question forces an answer that's wrong for most of the audience.
The more productive version of the question: what are the inputs that determine the right cadence per cohort? Answer those, and the program-level frequency falls out as a consequence, not as an argument.
Five inputs that actually settle it
Lifecycle stage. Different stages tolerate different frequencies. Onboarding is intense by design — more email in week one than in any other week of the relationship. Engaged users tolerate regular cadence because they're finding value. At-risk users tolerate some cadence but need higher-signal messages, not more of them. Lapsed users need specific low-density sequences, not the standard program. The Lifecycle Program Design skill covers stage-specific cadence for each canonical stage.
Engagement tier within the stage. Inside any stage, users vary. Someone who opened four of your last five emails can handle more than someone who opened one. Tier explicitly and ship different cadences by tier; most programs that pick a single rate end up under-mailing the engaged base, over-mailing the light base, and producing worse aggregate numbers than they could.
Natural product cycle. A daily-engagement product tolerates a different frequency than a monthly-engagement one. The lifecycle cadence should match or just slightly lead the product's usage rhythm — not impose a separate one that ignores when the user is ready for the product.
Deliverability headroom. Strong sender reputation and clean list hygiene give you more room. With a weak reputation, more volume compounds the damage. Most programs that "can't send more email" actually can — they just need to fix hygiene first so the additional volume doesn't poison reputation. The deliverability guide covers the full connection between frequency and sender reputation.
Content inventory. You can only send as many messages as you have worth sending. Shipping a second email in a week just because the schedule said so — when the content is thin — is worse than shipping one good email. The cadence ceiling is set by the quality bar; the floor is set by the cycle.
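A minimal sketch of how the first two inputs, stage and tier, combine into a per-user cadence. The stage names, multipliers, and the opens-based tiering rule here are illustrative assumptions, not a prescription:

```python
# Illustrative only: stage names, cadence numbers, and the tiering rule
# are assumptions for the sketch, not a recommended configuration.

# Weekly marketing-email target by lifecycle stage (this is a ceiling).
STAGE_BASE = {"onboarding": 5, "engaged": 3, "at_risk": 1, "lapsed": 0.5}

# Multiplier by engagement tier within the stage.
TIER_MULT = {"high": 1.0, "mid": 0.6, "low": 0.3}

def engagement_tier(opens_last_5: int) -> str:
    """Tier a user by opens across their last five sends."""
    if opens_last_5 >= 4:
        return "high"
    if opens_last_5 >= 2:
        return "mid"
    return "low"

def weekly_cadence(stage: str, opens_last_5: int) -> float:
    """Cadence is computed per user, not picked once for the program."""
    return STAGE_BASE[stage] * TIER_MULT[engagement_tier(opens_last_5)]
```

The point of the structure is that cadence falls out per user, so the program-level "how often" question never needs a single answer.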
Frequency capping is the safety net
- 3–5/wk: common marketing message cap for B2C audiences.
- 1/day: absolute ceiling, including transactional messages.
- 0: messages that should cross the cap without explicit priority rules. Decide before you need to.
Even with well-designed per-cohort cadences, a user-level frequency cap is what prevents compound damage when multiple programs fire at the same user in the same window. Without it, a user can end up with onboarding email plus lifecycle newsletter plus product-update broadcast plus abandoned-cart push all in one afternoon — not because anyone designed it that way, but because four independent systems fired independently.
Practical cap: no more than N marketing messages per user per week, with transactional and critical-service messages exempt. N depends on program and tier; three to five marketing messages per week is the range that balances engagement and fatigue for most B2C audiences. Under-mailing a cohort 20% below the cap rarely hurts; over-mailing 20% above it reliably does.
Priority is the harder question. Which message gets cut when a user is about to cross the cap? Decide in advance and encode it in the system. Onboarding beats newsletter. Abandoned-cart beats promotional. Transactional beats everything. Without explicit priority, the cap produces random cuts and the program becomes less coherent, not more.
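The cap-plus-priority logic above can be sketched as a simple filter. The message types, the priority order, and the cap value are assumptions for illustration:

```python
# Sketch of weekly cap enforcement with an explicit priority table.
# Types, order, and the cap value are illustrative assumptions.

WEEKLY_CAP = 4  # marketing messages per user per week

# Lower number = higher priority; transactional is exempt from the cap.
PRIORITY = {"transactional": 0, "onboarding": 1, "abandoned_cart": 2,
            "newsletter": 3, "promotional": 4}

def apply_cap(queued):
    """Keep all transactional sends, then fill the cap in priority order."""
    transactional = [m for m in queued if m["type"] == "transactional"]
    marketing = [m for m in queued if m["type"] != "transactional"]
    marketing.sort(key=lambda m: PRIORITY[m["type"]])
    return transactional + marketing[:WEEKLY_CAP]

week = [{"type": "newsletter"}, {"type": "promotional"},
        {"type": "onboarding"}, {"type": "abandoned_cart"},
        {"type": "transactional"}, {"type": "promotional"}]
kept = apply_cap(week)
# The transactional send survives; the lowest-priority promotional is cut.
```

Encoding the priority table next to the cap is what turns random cuts into deliberate ones.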
Negative signals beat cadence rules every time
A user who hasn't opened anything in 30 days should receive dramatically less than their tier's usual cadence — not because the cadence is wrong, but because the engagement signal is telling you the user is heading toward lapsed. Continuing to mail at the normal rate accelerates that journey.
A spam complaint, a move-to-junk, or an abandoned unsubscribe flow is a louder signal. Suppress outright or drop to minimum-touch immediately. The cost of one extra send to a user who's already said "stop" through behaviour is a complaint, and complaint rate is the metric mailbox providers actually weigh against your reputation. Unsubscribes don't damage deliverability. Complaints do.
The principle: cadence rules define the ceiling, not the floor. The ceiling is the maximum you can mail without damage. The floor is whatever engagement signals say the user is willing to receive, and it can be much lower than the ceiling. A cadence system that ignores the floor over-mails disengaging users and ships them straight to spam complaints — the one thing the cap was supposed to prevent.
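One way to express the ceiling/floor principle in code. The thresholds (30 and 90 days) and the signal names are assumptions for illustration:

```python
# Sketch: cadence rules set the ceiling; engagement signals set the floor.
# Thresholds and field names are illustrative assumptions.

def effective_weekly_cadence(ceiling: float, days_since_open: int,
                             complained: bool) -> float:
    """Return the cadence a user should actually receive this week."""
    if complained:
        return 0.0                     # suppress outright on a complaint
    if days_since_open > 90:
        return 0.0                     # treat as lapsed: win-back flow only
    if days_since_open > 30:
        return min(ceiling, 0.25)      # roughly one send a month
    return ceiling                     # engaged: ceiling applies as-is
```

Note the floor logic runs last-mile, per send decision, so a disengaging user drops off the normal cadence without anyone re-tiering them by hand.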
When the CEO says 'we email too much'
The lifecycle team rarely gets to answer cadence questions in isolation. The common pattern: a senior stakeholder observes they're getting "too many emails from your own company" and asks the team to cut back. The ask is real, but it's a sample size of one. A cadence cut based on a single user's experience is usually wrong for the base.
The productive response isn't "actually our engagement rates are fine" — that reads as defensive. It's to surface the tiering. Show which users are receiving how many messages. Show engagement-by-cadence data for each tier. Invite the stakeholder to look at whether their own cadence is actually aligned with their own engagement tier — which, usually, it isn't. They opened two emails all year and are receiving three a week. The right answer is almost never "cut everyone's cadence". It's "this user's cadence doesn't match their engagement — fix the mismatch, not the program".
The other version of this question: what's the risk of over-mailing? Higher complaint rates, higher unsubscribe rates, and sender reputation damage that compounds over months. The cost usually surfaces 30–90 days after the frequency increase, which is why programs rarely link them. The signal to watch is complaint rate, not unsubscribe rate. Complaints poison deliverability. Unsubscribes just trim the list.
Significance testing has a role here too — any cadence change deserves a proper test before it rolls program-wide. A 20% cut that feels right but produces a 30% revenue drop is worse than the problem it was trying to fix.
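A cadence holdout can be checked with a standard two-proportion z-test, stdlib only. The counts below are invented for illustration:

```python
# Minimal two-proportion z-test for a cadence holdout.
# The unsubscribe counts here are made-up illustrative numbers.
import math

def two_prop_z(x1: int, n1: int, x2: int, n2: int) -> float:
    """z statistic for the difference between two observed proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Unsubscribes: higher-frequency test cell vs current-frequency control.
z = two_prop_z(x1=240, n1=20000, x2=180, n2=20000)
# |z| > 1.96 means the unsubscribe lift is significant at the 5% level.
```

The same test applies to complaint rate, which is the metric that matters more; run both before rolling a frequency change program-wide.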
Frequently asked questions
- How often should I email subscribers?
- Depends on category and audience expectations. Daily works for news and content-heavy programs (Morning Brew, The Daily). Weekly is the standard for most B2C marketing and newsletters. Monthly is right for low-frequency relationships (SaaS onboarding-complete users, infrequent-purchase categories). Two to three times per week works for e-commerce with sale-driven content. The only wrong answer is a number detached from what your audience opted into and what your content can consistently earn.
- What's the optimal email frequency?
- The frequency where incremental unsubscribes equal incremental revenue. Below that frequency, you're under-mailing — leaving revenue on the table from users who would engage more. Above it, you're damaging retention — sends that produce revenue today erode the audience you'll mail tomorrow. The practical test: run a holdout group at higher frequency vs your current frequency for 8+ weeks, compare incremental revenue against incremental unsubscribes and complaints. The break-even frequency is what you want.
- Should email frequency be the same across segments?
- No. Engaged subscribers tolerate and benefit from higher frequency; dormant subscribers should receive less, not more. Good programs run a 5–10x frequency gap between engaged and dormant cohorts: daily for the top-engagement tier, monthly for the low-engagement tier. Sending the same cadence to everyone is the most common failure mode: it over-fatigues the engaged and doesn't rescue the dormant.
- Does higher frequency always mean higher revenue?
- Short-term yes, long-term no. Every incremental send produces a revenue bump (some percentage of the list converts on that send). But every incremental send also produces unsubscribes and complaint-rate drift, which permanently shrinks the audience and the ceiling of future revenue. The cumulative effect compounds downward. Programs that chase short-term send counts over 12-18 months almost always trail frequency-disciplined programs on total revenue.
- How do I handle opt-outs by frequency rather than channel?
- Preference centre with frequency tiers. Users choose: daily, weekly, monthly, or topic-only. Each tier maps to a different sending cadence. This reduces unsubscribes substantially (users who would have unsubscribed entirely instead downshift to monthly), preserves the subscription relationship for future content, and lets you segment by implied-engagement — users who chose daily signalled higher intent, which is a useful input to personalisation.
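The break-even logic from the "optimal frequency" answer above can be put on the back of an envelope: compare the revenue an extra weekly send earns against the future revenue lost to the extra unsubscribes it causes. Every number here is an illustrative assumption:

```python
# Back-of-envelope break-even check for one extra weekly send.
# All inputs are illustrative assumptions; plug in your holdout data.

def net_value_of_extra_send(list_size: int, rev_per_recipient: float,
                            extra_unsub_rate: float,
                            subscriber_ltv: float) -> float:
    """Revenue from the send minus future value lost to extra unsubscribes."""
    revenue = list_size * rev_per_recipient
    churn_cost = list_size * extra_unsub_rate * subscriber_ltv
    return revenue - churn_cost

# Example: 50k list, $0.04 revenue per recipient per send,
# 0.15% extra unsubscribes per send, $20 future value per subscriber.
net = net_value_of_extra_send(50_000, 0.04, 0.0015, 20.0)
# Revenue roughly 2000 vs churn cost roughly 1500: still net positive.
```

When `net` crosses zero, you've found the break-even frequency the FAQ describes; the honest version of this sum uses incremental numbers from a real holdout, not list-wide averages.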
This guide is backed by an Orbit skill
Related guides
What is lifecycle marketing? A field guide for operators starting from zero
If you're new to CRM and lifecycle, the field reads like a pile of acronyms and vendor demos. It's actually one simple idea executed across five canonical programs. Here's the frame that makes the rest of the library make sense.
The lifecycle audit — a 30-point checklist
Lifecycle programs decay silently. A recurring audit is the cheapest discipline that catches drift before it shows up in the revenue deck. Here's the 30-point list, grouped by severity, that takes three hours the first time and ninety minutes thereafter.
Lifecycle marketing for flat products
The standard lifecycle playbook assumes weekly engagement and neat stage progression. Most real products aren't shaped like that. This is how to design lifecycle for products used once a year, once a quarter, or whenever the user happens to need you — where the textbook quietly makes things worse.
Choosing which lifecycle programs to build first
New lifecycle lead, empty Braze account, a laundry list of programs you could build. The question nobody trains you for is which to build first. This is the selection framework — by business type, by team size, by data maturity, and the programs I'd actively wait on.
Segmentation strategy: beyond RFM
RFM is the floor of audience segmentation, not the ceiling. Every program that stops there ends up describing what users already did without ever predicting what they'll do next. Here's the segmentation stack that actually drives lifecycle decisions — and how to build it in Braze without ending up with 400 segments nobody understands.
Retention economics: proving lifecycle ROI to finance
Lifecycle programs get deprioritised when they can't defend their impact in dollars. The four models that keep the budget — LTV, payback, cohort retention, incrementality — and the four-slide pattern that wins a CFO room.
Use this in Claude
Run this methodology inside your Claude sessions.
Orbit turns every guide on this site into an executable Claude skill — 54 lifecycle methodologies, 55 MCP tools, native Braze integration. Pay what it's worth.