Review request emails: the timing that actually produces reviews
Every ecommerce and SaaS program needs reviews, and the review request email is the primary mechanism. Most programs send them as a batch 14 days after purchase with generic 'How'd we do?' copy and get 2% response rates. The better version — timed to actual product-use milestones, written like a human, asking one specific question — produces an order of magnitude more reviews. Here's how.
Justin Williames
Founder, Orbit · 10+ years in lifecycle marketing
The timing that works
The single biggest lever is when you ask. Ask too early and the user hasn't used the product. Ask too late and the moment has passed.
Ecommerce physical product: 7–14 days after delivery (not after order). Shipping time means the user may still be waiting when you send at 14 days from order. Trigger on delivery confirmation + N days where N depends on product use cycle — apparel: 5 days, electronics: 10 days, furniture: 21 days.
SaaS product: after the user hits a meaningful usage milestone — 30 days of active use, or completion of a primary workflow. Time-based alone is weak; usage-based is strong.
Content / course: after completion or a meaningful percentage consumed. "Finished the course? Tell us how it went" converts better than "2 weeks since you signed up, review?".
The right time to ask is when the user has formed an actual opinion. Before that, they have nothing to say. After that, the moment has passed.
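As a sketch, the delivery-based trigger above might be implemented like this. The category delays come from the guidance here; the function and constant names are illustrative, not from any particular email platform:

```python
from datetime import date, timedelta

# Per-category use-cycle delays from the guidance above; tune to your catalog.
USE_CYCLE_DAYS = {"apparel": 5, "electronics": 10, "furniture": 21}
DEFAULT_DELAY = 10  # fallback for uncategorised products

def review_request_date(delivery_confirmed: date, category: str) -> date:
    """Schedule the ask N days after delivery confirmation, not after order."""
    delay = USE_CYCLE_DAYS.get(category, DEFAULT_DELAY)
    return delivery_confirmed + timedelta(days=delay)

# Furniture delivered 1 March -> ask on 22 March
print(review_request_date(date(2025, 3, 1), "furniture"))
```

The same idea applies to SaaS, with the trigger fired by a usage-milestone event rather than a calendar offset.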
The message pattern
Subject: "How's the [product] working out?" or "Quick question about your [product]". Question-framed subjects convert higher than command-framed ("Please leave a review").
Opening: reference the specific product and delivery/use context. "Your [product name] arrived on [date]. By now you've probably had a chance to use it." Specific beats generic.
The ask: one question, one CTA. "Would you share a quick review? It helps other customers decide." Not three CTAs, not a survey — one review link.
Make it easy: link directly to the review form with the product pre-filled. Every extra click loses response rate. If possible, embed a 1–5 star rating in the email itself; tapping any star deep-links to the review form with that rating pre-filled.
Social proof the ask: "Join [X] customers who've shared their experience with [product]." Reinforces that reviewing is a normal action.
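A minimal sketch of the pre-filled deep link described above, assuming a review form that accepts the product and rating as query parameters (the parameter names here are an assumption; your review platform defines its own scheme):

```python
from urllib.parse import urlencode

def rating_deep_link(base_url: str, product_id: str, stars: int) -> str:
    """Build a review-form link with the product and tapped rating pre-filled."""
    return f"{base_url}?{urlencode({'product': product_id, 'rating': stars})}"

# One link per star in the email; each lands on the form with that rating selected.
links = [rating_deep_link("https://example.com/review", "sku-123", s)
         for s in range(1, 6)]
```

Each star image in the email points at its own link, so the tap itself records the rating before the form even loads.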
Why incentives usually backfire
Most review platforms (Amazon, Google, Trustpilot, Shopify's built-in reviews) prohibit direct discount-for-review exchanges. Violating those terms can get your reviews removed and your product delisted, which erases far more value than the incentive ever generated.
Ethical incentives: a loyalty-program points reward (not a discount), or entry into a monthly draw for reviewers. Both are weaker incentives than direct discount but don't violate platform rules. Check your specific review platform's policies.
The stronger alternative is to invest in better asking — timing, messaging, frictionless flow. A good timing + messaging improvement lifts submission 2–3× without the platform risk.
Handling low ratings
Some programs route low ratings (1–3 stars) to support rather than to public review. This has a legitimate version and a dark version:
Legitimate: if a user rates 1 star, trigger a support flow — "We're sorry; can we help?". Offer resolution. After the resolution, they can still leave a public review if they want. This is customer-service first, reviews second.
Dark pattern: intercepting low ratings entirely so they never post publicly, while routing high ratings straight to public review platforms. Technically shapes review averages; widely considered unethical and against most platform terms.
The first pattern is fine and often helpful; the second is risky and platform-violating. Be honest about which you're doing.
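The legitimate routing pattern can be sketched as a simple branch. The threshold and flow names here are illustrative; the key property is that the support flow comes first but never replaces the public review:

```python
def route_rating(stars: int) -> str:
    """Customer-service-first routing for an in-email star rating.

    Low ratings open a support flow first, but the user can still post
    a public review afterwards (never suppressed).
    """
    if stars <= 3:
        return "support_flow"    # "We're sorry - can we help?" then offer resolution
    return "public_review_form"  # high ratings go straight to the review form
```

The dark pattern is the same branch with the public review link removed from the low-rating path, which is exactly the part platform terms prohibit.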
Measuring review program health
Submission rate: percent of emailed users who submit a review. 5–15% healthy with good timing and messaging; 1–3% typical for generic unoptimised programs.
Average rating: should be directionally high (4+ on a 5-scale) if the product is genuinely good. If average is low, fix the product before optimising the request mechanism.
Review depth: average word count of submitted reviews. Higher depth = more useful reviews for prospective customers. Open-ended prompt ("What would you tell a friend about this product?") produces better depth than structured form.
Time-to-review: median days from request to submission. Users who submit in the first 24 hours are your highest-intent; if most reviews come in days later, the request is being shelved and forgotten.
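The four metrics above can be computed straight from raw submission data. A sketch, assuming each review record carries its star rating, word count, and days elapsed since the request (field names are illustrative):

```python
from statistics import median

def program_health(requests_sent: int, reviews: list[dict]) -> dict:
    """Compute the four review-program health metrics.

    Each review dict: {'stars': int, 'words': int, 'days_to_submit': float}.
    """
    return {
        "submission_rate": len(reviews) / requests_sent,
        "avg_rating": sum(r["stars"] for r in reviews) / len(reviews),
        "avg_depth_words": sum(r["words"] for r in reviews) / len(reviews),
        "median_days_to_review": median(r["days_to_submit"] for r in reviews),
    }
```

Track these per flow, not just per program: an ecommerce delivery trigger and a SaaS milestone trigger will have different healthy baselines.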
Review program placement in the broader lifecycle is covered in a separate guide: typically a standing trigger after key product milestones, not a periodic batch.
Frequently asked questions
- When should I send the review request?
- After the user has had time to form an opinion, not before. For physical products: 7–14 days after delivery confirmation (not after order, which includes shipping time). For SaaS: after the user hits a meaningful usage milestone. Time-based alone usually underperforms usage-based triggers.
- Can I incentivise reviews with a discount?
- Generally no. Most review platforms (Amazon, Google, Trustpilot, Shopify's built-in reviews) prohibit direct discount-for-review exchanges. Violating terms can remove your reviews and delist your product. Legitimate alternatives: loyalty points (not directly redeemable for discount), prize-draw entries, or simply better asking.
- Should I send follow-ups if they don't review?
- One follow-up 7 days after the original, yes. Two total messages is the right number; more than that starts to feel like pestering. If the user hasn't reviewed after two asks, they're not going to — cut the flow.
- How do I handle low ratings?
- Route them to a support flow ('We're sorry this wasn't right — can we help?') before they post publicly. This is customer service, not review suppression: after support resolution, they can still leave the review. What's not okay is intercepting low ratings entirely so they never become public while pushing high ratings to public platforms.
- What review platform should I use?
- Depends on the business. Ecommerce: platform-native reviews (Shopify, WooCommerce) plus Google Reviews for SEO. Higher-consideration purchases: Trustpilot or Google Reviews. SaaS: G2, Capterra, TrustRadius depending on category. Multiple platforms are fine; just make sure the request flow is specific to each.
- Should the subject line mention 'review'?
- Test both. 'How's the product working out?' (question-framed) and 'Share your review of [product]' (direct) both work. Question-framed often has higher open rates, but direct-framed users who open know exactly what's being asked. Try both for your audience.
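The follow-up cadence from the FAQ above (one follow-up 7 days after the original, two asks total, then stop) can be sketched as:

```python
from datetime import date, timedelta

def next_touch(original_sent: date, reviewed: bool, followups_sent: int):
    """Two total asks: one follow-up 7 days after the original, then stop.

    Returns the date of the next send, or None if the flow should end.
    """
    if reviewed or followups_sent >= 1:
        return None  # already reviewed, or both asks used - cut the flow
    return original_sent + timedelta(days=7)
```

Function and argument names are illustrative; the point is that the stop condition is explicit, so the flow can never drift into a third ask.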
Related guides
Product launch email sequence: the five emails that actually sell a new product
A product launch with one big announcement email captures a fraction of the addressable audience. A proper five-email launch sequence catches multiple attention windows, builds anticipation, and converts the users who needed a second or third touch. Here's the structure.
Browse abandonment: the program that sits between ads and cart
Browse abandonment catches the users who viewed a product and left without adding to cart. Smaller lift than cart abandonment, but larger addressable audience. Here's the trigger logic, data requirements, and the timing that works.
Referral program emails: the three flows that make a referral program work
Referral programs live or die on the lifecycle messaging around them. Here are the three flows every referral program needs — inviter prompt, invitee welcome, reward confirmation — and the timing and copy that make each convert.
Trial-to-paid: the seven-email sequence that converts 20%+ of free users
Trial conversion is the most financially leveraged lifecycle flow in SaaS — every percentage point of improvement compounds against CAC. Here's the seven-email sequence that reliably moves trial conversion from 5% to 20%+.
Replenishment emails: the lifecycle flow that buys itself
Replenishment emails remind users to re-order a consumable product before they run out. Done right, they generate the highest revenue-per-send in any lifecycle program because the purchase intent is already established. Here's the timing, data, and copy.
Price increase emails: how to raise prices without a churn spike
A price increase is one of the highest-risk lifecycle moments — done wrong, it triggers churn, public complaints, and a reputation dent that outlasts the extra revenue. Done right, most users accept the change without friction. Here's the sequence that works.