
retention loops

Every growth team I have seen celebrates acquisition numbers. Almost none of them can tell me what percentage of users from three months ago are still active today. That is the number that determines whether your company exists in two years.
Talvinder Singh, from a Pragmatic Leaders growth masterclass

Retention is the growth metric most teams measure last and fix never. They have dashboards for signups, install costs, activation funnels, revenue per user. Ask them for their D30 retention by cohort and you get silence, or a number someone pulled from a spreadsheet six months ago.

This is not a knowledge gap. It is a priority failure. Acquisition is visible, fundable, and easy to celebrate in all-hands meetings. Retention is invisible, expensive to fix, and forces you to confront the possibility that your product is not as good as you think it is.

After watching hundreds of PMs go through this realisation at Pragmatic Leaders, I can tell you the pattern: the ones who build careers in growth are the ones who became obsessed with retention early. Everyone else stays on the acquisition treadmill, running faster to stay in the same place.

The math your CEO ignores

Here is a number exercise that should change how you think about growth budgets.

Company A acquires 10,000 users per month at ₹200 per user. D30 retention is 10%. After six months, they have roughly 6,000 active users and have spent ₹1.2 crore on acquisition.

Company B acquires 5,000 users per month at ₹200 per user. D30 retention is 25%. After six months, they have roughly 7,500 active users and have spent ₹60 lakh.

Company B spent half as much and has more users. That is not a rounding error. That is the entire difference between a sustainable business and one that runs out of money while celebrating growth charts that go up and to the right.

The compounding effect gets more extreme over time. At month twelve, Company A has roughly 12,000 active users (still losing 90% of every cohort, still spending). Company B has roughly 15,000 active users on half the spend. By month eighteen, the gap is unclosable.
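The arithmetic behind these snapshots is easy to sketch. Below is a deliberately simplified model (an assumption for illustration, not a standard formula) in which each monthly cohort settles at its D30 retention rate:

```python
# Simplified model: each monthly cohort of `acquired_per_month` users settles
# at its D30 retention rate, so active users after m months ~= m * acquired * retention.
def growth_snapshot(acquired_per_month, cac_inr, d30_retention, months):
    active = int(months * acquired_per_month * d30_retention)
    spend = months * acquired_per_month * cac_inr  # total acquisition spend in ₹
    return active, spend

# Company A: 10,000 users/month at ₹200, 10% D30 retention
print(growth_snapshot(10_000, 200, 0.10, 6))   # (6000, 12000000) -> ₹1.2 crore
# Company B: 5,000 users/month at ₹200, 25% D30 retention
print(growth_snapshot(5_000, 200, 0.25, 6))    # (7500, 6000000)  -> ₹60 lakh
```

Run it at twelve months and the divergence the text describes falls straight out of the same two inputs: volume and retention.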

This is why every growth conversation should start with retention. Not because acquisition does not matter, but because acquisition without retention is a marketing budget attached to a leaky bucket. Fix the bucket first.

Three types of retention, and which one matters for your product

Retention is not one metric. It is a family of metrics, and using the wrong one will mislead you completely.

Consumer apps (daily use case): D1, D7, D30 retention. Measured as the percentage of users from a given cohort who return on day 1, day 7, and day 30. If you are building a food delivery app, a payments app, or a social product, these are your numbers. The benchmarks vary wildly by category. Swiggy might consider 40% D30 retention strong for a city launch. A social app like ShareChat needs 25%+ D30 to have a viable business. Anything below 15% D30 in consumer means the product does not have a use case that justifies keeping the app installed.

That last point matters in India specifically. Storage is scarce on most budget Android devices, so users routinely delete infrequently used apps to free up space for WhatsApp and a game, and iOS offloads unused apps automatically. A monthly use case is not frequent enough to keep the app installed. For consumer products in India, you either earn a place in the daily or weekly routine, or you get uninstalled.

Productivity and B2B tools (weekly/monthly use case): Weekly active users (WAU) or monthly active users (MAU), plus feature adoption depth. A project management tool does not need D1 retention. Nobody opens Jira on a Saturday. What matters is weekly engagement and whether usage deepens over time. Are they using one feature or five? Did the team adopt it, or is one person logging in alone?

The top churn reason I have seen in B2B SaaS survey data is not “the product is bad.” It is “I could not get my team to adopt it.” That makes team activation within the first week the single most important retention lever for any multi-user B2B product. If only the admin is active by day seven, that account is dead.

Subscription/contract businesses: Net Revenue Retention (NRR), logo churn rate, revenue churn rate. For a product like Freshworks or Zoho, the question is not whether individual users come back daily. It is whether accounts renew, expand, or contract. NRR above 100% means existing customers are spending more over time, which means you can grow without acquiring a single new customer. NRR below 90% means your bucket has a hole that no amount of sales effort can fill.
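The NRR arithmetic is simple enough to make explicit. A minimal sketch, with illustrative figures that are not from any real company:

```python
# Net Revenue Retention for a period, computed on the starting cohort only:
# NRR = (starting MRR + expansion - contraction - churned MRR) / starting MRR
def nrr(starting_mrr, expansion, contraction, churned):
    return (starting_mrr + expansion - contraction - churned) / starting_mrr

# Illustrative figures (assumed, not from any real company):
print(nrr(100_000, 15_000, 3_000, 7_000))   # 1.05 -> grows with zero new customers
print(nrr(100_000, 2_000, 5_000, 12_000))   # 0.85 -> the leaky bucket
```

Note that new-customer revenue never appears in the formula: NRR isolates what your existing base is doing, which is exactly why it is the honest metric for subscription businesses.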

Product type | Primary retention metric | Time window | Danger threshold
Consumer daily-use (Swiggy, PhonePe) | D1 / D7 / D30 | Daily/weekly | D30 below 15%
Consumer periodic (Nykaa, MakeMyTrip) | M1 / M3 repurchase rate | Monthly | M3 below 20%
B2B SaaS (Freshworks, Zoho) | NRR + logo churn | Quarterly/annual | NRR below 95%
Productivity tool (Notion, Slack) | WAU/MAU + team adoption | Weekly | WAU/MAU below 40%
Gaming / fantasy sports (Dream11) | Session frequency + D7 | Daily during events | D7 below 30% in season
Edtech (courses) | Completion rate + re-enrolment | Per-course | Completion below 15%

India retention benchmarks: Consumer social D30: 10-15%. Fintech D30: 8-12%. E-commerce D30: 5-8%. Quick commerce D30: 12-18%. If your D30 is below these ranges, fix retention before spending on acquisition. See PM Benchmarks for the full table.

Pick the metric that matches your product’s natural usage frequency. If you track D1 retention for a product people use monthly, you will always look terrible. If you track MAU for a product people should use daily, you will miss the churn signal until it is too late.

// scene:

A growth team standup at a consumer fintech in Mumbai. The CEO has just seen the quarterly report.

CEO: “We hit 500K downloads last quarter. Great work on the campaigns. But I noticed MAU is only 80K. That seems low.”

Growth Lead: “Yeah, our D30 retention is around 8%.”

CEO: “Eight percent? So 92% of the people we acquire are gone within a month?”

Growth Lead: “That is... technically correct.”

CEO: “We spent ₹1.5 crore on acquisition last quarter. At 8% retention, the effective cost per retained user is... almost ₹400?”

Growth Lead: “We have been focused on top-of-funnel. The retention team is just one PM and a data analyst.”

The CEO paused. Then cancelled the next quarter's ₹2 crore acquisition budget increase and redirected half of it to a retention squad.

// tension:

The company celebrated 500K downloads while quietly losing 460K of them. The acquisition machine was running perfectly, feeding a product that could not hold users.
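The CEO's back-of-envelope math generalises. A minimal sketch, taking the scene's ₹1.5 crore quarterly spend and treating 500K as the quarter's installs:

```python
# The CEO's mental math: spend divided by the users who actually stayed.
def cost_per_retained_user(spend_inr, installs, d30_retention):
    return spend_inr / (installs * d30_retention)

# Scene's numbers: ₹1.5 crore spend, 500K installs over the quarter, 8% D30
print(cost_per_retained_user(15_000_000, 500_000, 0.08))  # 375.0 -> "almost ₹400"
```

Cost per install looked like ₹30; cost per retained user is ₹375. Retention is the denominator that turns a healthy-looking CAC into an alarming one.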

The engagement loop: what actually brings users back

The Hook Model (trigger, action, variable reward, investment) is the most-taught retention framework in PM courses. I have taught it myself, to CleverTap’s customers and in Pragmatic Leaders sessions. It is a useful mental model. But it has a problem: it describes the mechanics of habit formation without telling you how to design for your specific product.

Here is the framework I use instead. I call it the Engagement Loop, and it has four parts that map directly to product decisions:

1. Trigger (why do they come back right now?) Not “we sent a push notification.” That is a delivery mechanism, not a trigger. The real trigger is the reason the notification matters. PhonePe sends a reminder that your electricity bill is due. That is a trigger tied to a real-world event the user cares about. CRED tells you your credit card payment is due and you will earn coins. Dream11 tells you a match is starting in two hours and your friends have set their teams. The trigger works because it connects to something the user already wanted to do.

The question to ask: what event in the user’s life naturally creates a need for our product, and are we present at that moment?

2. Core action (what do they do when they arrive?) This must be the shortest possible path to value. Every step between the trigger and the core action is a place where users drop off. Zepto understood this: the core action is placing an order, and they stripped the experience down to under two minutes from app open to checkout. Swiggy Instamart followed the same logic. The fewer decisions between “I need this” and “I have ordered this,” the stronger the loop.

Design question for your product: what steps can you remove from the path between trigger and core action? Every screen, every choice, every loading state is friction. Remove it or justify its existence.

3. Reward (what do they get that feels good?) Three types of rewards, and the best products use at least two:

  • Tribe rewards (social validation): Dream11 leaderboards among friend groups. Swiggy showing your order count compared to friends during year-end wrap. CRED’s members-only positioning.
  • Hunt rewards (resources, deals, information): Meesho’s daily deals that change. PhonePe’s cashback on transactions. The variability matters. If the reward is the same every time, it becomes invisible.
  • Self rewards (mastery, progress, competency): Duolingo’s streak counter. An edtech platform showing course completion percentage. The sense of getting better at something.

The critical question: how are you rewarding the core action? Is the reward immediate, satisfying, and variable? If the user completes the core action and feels nothing, the loop is broken.

Spotify Wrapped is a masterclass in all three. Users get a personalised review of their listening (self reward), they share it on social media for reactions (tribe reward), and they discover new patterns about themselves (hunt reward for self-knowledge). It creates a moment users actively look forward to every year. Users crave self-expression, nostalgia, and social currency. Wrapped delivers all three in one feature that costs Spotify almost nothing to build.

4. Investment (what did they put in that makes leaving harder?) Every action a user takes that stores value in your product increases switching cost. Notion users build entire knowledge bases. PhonePe users link all their bank accounts and set up recurring payments. Dream11 users build a history of team selections and win/loss data. Nykaa users accumulate reviews, wishlists, and loyalty points.

The investment does not have to be dramatic. Even small data deposits compound. Typeform understood this when they turned form creation from a boring utility into something people spent time customising. The moment you have invested fifteen minutes making a form look exactly right, you are not switching to Google Forms. They grew that single insight into a business worth hundreds of millions.

Product type | Trigger | Core action | Reward type | Investment
Food delivery (Swiggy, Zepto) | Hunger + time pressure | Place order | Hunt (deals, new restaurants) | Order history, saved addresses, favourites
Payments (PhonePe) | Bill due, money request | Complete transaction | Hunt (cashback) + Self (spend tracking) | Linked accounts, autopay, transaction history
Fantasy sports (Dream11) | Match starting | Create team | Tribe (leaderboard) + Hunt (prize pool) | Team history, friend network, win record
Fashion (Nykaa, Myntra) | New collection / sale | Browse + purchase | Hunt (deals) + Self (style discovery) | Wishlist, reviews, size preferences
B2B SaaS | Team activity, workflow need | Use core feature | Self (productivity) + Tribe (team adoption) | Data, integrations, team workflows

// thread: #product-growth — A PM analysing why a feature launched to improve retention actually hurt it
Priya (Growth PM) We launched the gamification layer two weeks ago. Badges, streaks, points. D7 retention went DOWN by 3 points. I am genuinely confused.
Arjun (Data Analyst) I looked at the session data. Average session time went up by 40 seconds, but sessions per week dropped from 4.2 to 3.1. Users are spending more time per visit but coming back less often.
Priya (Growth PM) So the badges are making each session more engaging but we broke the return trigger somehow?
Arjun (Data Analyst) Looks like it. The badges created a 'completion' feeling. Users see their badge count, feel accomplished, and leave satisfied. Before, they left mid-task and had a reason to come back.
Priya (Growth PM) We accidentally gave them closure. The open loop was the trigger, and we closed it.
Arjun (Data Analyst) The streak mechanic might help. Streaks punish absence. But the badge wall needs to go. It is creating a finish line in a product that should feel infinite.
Priya (Growth PM) Killing the badge wall tomorrow. Keeping streaks, adding a weekly challenge that resets every Monday. The loop needs to stay open.

That Slack exchange illustrates the most common retention mistake I see: adding engagement features without understanding the existing loop. Every product has a natural return trigger, even if it was unintentional. Before you add gamification, notifications, or rewards, map the existing loop. Then strengthen it. Do not replace it with a new mechanic that might be weaker.

Retention in India: the notification spam problem

Indian consumer apps send more push notifications per user than almost any other market. The average Indian smartphone user receives 50-80 notifications per day across apps. The result: notification blindness is severe, and aggressive notification strategies actively hurt retention.

I have seen this play out repeatedly. A product team sees declining D7 numbers and their first instinct is “send more notifications.” So they increase frequency from two per day to five per day. Open rates drop. Uninstall rates spike. D7 gets worse. So they send even more. The death spiral continues until someone in the room asks “are we annoying people into leaving?”

The answer is almost always yes.

Byju’s is the cautionary tale. At their peak, they sent an extraordinary volume of notifications, emails, and even phone calls to drive engagement. Parents complained. Users turned off notifications entirely. The aggressive retention tactics created a brand association with spam that damaged trust far beyond what any individual notification could recover.

What works instead:

Event-triggered notifications only. PhonePe does not send “come back to PhonePe!” notifications. They send “your electricity bill is due in 3 days” and “Ramesh sent you ₹500.” The notification is about the user’s life, not the app’s engagement metrics. The user opens the app because they have a reason to, not because they were nagged.

Transactional value, not promotional noise. Zepto sends “your order is 2 minutes away” — information the user is actively waiting for. That notification has a 90%+ open rate because it is useful. Compare that to “Check out today’s deals!” which trains users to ignore you.

Respect the quiet hours. Swiggy learned this. Late-night food notifications work at 9pm. The same notification at 6am gets the app uninstalled. Time-of-day sensitivity is not optional in a market where phone notifications wake people up.

The products winning retention in India are not the ones sending the most messages. They are the ones sending the fewest, highest-value messages. Every notification should pass this test: would the user thank you for sending this? If the answer is no, do not send it.

Cohort analysis is the only retention tool

If you are looking at aggregate retention numbers, you are seeing an average that hides everything interesting. Aggregate retention mixes users who joined yesterday with users who joined six months ago. It tells you nothing about whether your product is getting better or worse at keeping people.

Cohort analysis solves this. You group users by when they started (weekly or monthly cohorts) and track each group’s retention independently over time.

Here is what a cohort table reveals that aggregate numbers hide:

Is your product improving? If January’s cohort has 12% D30 retention and March’s cohort has 18%, your product changes are working. If it is the other way around, something you shipped in February is hurting retention and you need to find it.

Which acquisition channels bring users who stay? Cohort by channel. You might discover that users from organic search retain at 25% while users from paid Instagram ads retain at 8%. The Instagram campaign looks great on a cost-per-install basis and terrible on a cost-per-retained-user basis.

Where is the cliff? Most products have a specific day where retention drops sharply. For many consumer apps in India, the cliff is between D3 and D7. Users try the product for a few days, do not form a habit, and leave. That cliff is where your engagement loop needs to activate. If you do not know where your cliff is, you are guessing about when to intervene.

Did that feature launch help or hurt? Compare the cohort that experienced the new onboarding flow against the one that did not. Not an A/B test (though those are better) but a directional signal. If post-launch cohorts retain better, the change is working.

Everything else — DAU/MAU ratios, session counts, time-in-app — is interesting context. But none of it tells you whether the people you acquired last month are still here. Only cohort analysis does that. If your analytics tool does not make cohort tables easy, switch tools. Mixpanel, Amplitude, and CleverTap all do this well. Even a SQL query against your events table will work. The tool does not matter. The discipline of looking at cohorts weekly does.
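The weekly cohort discipline can be sketched without any tooling at all. A minimal pure-Python version, assuming you can export two illustrative structures (not any particular tool's schema): a map of user to signup-week label, and a set of (user, days-since-signup) pairs marking active days:

```python
from collections import defaultdict

# Build a cohort retention table: rows are signup weeks, columns are Dn windows.
def cohort_table(signups, activity, windows=(1, 7, 14, 21)):
    cohort_users = defaultdict(set)
    for user, week in signups.items():
        cohort_users[week].add(user)
    table = {}
    for week, users in sorted(cohort_users.items()):
        table[week] = {
            # percent of the cohort active on day d after signup
            f"D{d}": round(100 * sum((u, d) in activity for u in users) / len(users), 1)
            for d in windows
        }
    return table

signups = {"u1": "W1", "u2": "W1", "u3": "W2", "u4": "W2"}
activity = {("u1", 1), ("u1", 7), ("u2", 1), ("u3", 1), ("u3", 7), ("u3", 14)}
print(cohort_table(signups, activity))
# {'W1': {'D1': 100.0, 'D7': 50.0, 'D14': 0.0, 'D21': 0.0},
#  'W2': {'D1': 50.0, 'D7': 50.0, 'D14': 50.0, 'D21': 0.0}}
```

Reading across a row shows you the cliff; reading down a column shows you whether newer cohorts retain better than older ones. Those are exactly the two questions the exercise below asks.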

// exercise: 15 min
Run a cohort retention analysis

Do this for a product you work on or have data access to. If you do not have access, use a public dataset or ask your data team.

  1. Pull all users who signed up in each of the last four weeks (four cohorts).
  2. For each cohort, calculate what percentage were active on D1, D7, D14, and D21 after signup.
  3. Arrange this in a grid: rows are cohorts (Week 1, Week 2, Week 3, Week 4), columns are retention windows (D1, D7, D14, D21).
  4. Look for two things:
    • The cliff: Where does the sharpest drop happen across all cohorts? That is your intervention point.
    • The trend: Are newer cohorts retaining better or worse than older ones? That tells you whether recent product changes are helping.
  5. If newer cohorts are worse, list every product or acquisition change between the best and worst cohort. One of those changes is the cause.
  6. If you cannot get this data within 15 minutes, write down what is blocking you. That blocker is your first retention project: you cannot improve what you cannot measure.
// learn the judgment

You are the growth PM at Swiggy. Your D30 retention has been at 28% for two quarters — strong for food delivery. The growth team proposes increasing the daily push notification cap from 2 to 5 per user, targeting inactive users (no order in 7+ days). Projected impact: a 12% lift in reactivations from the dormant segment. Your data shows that users who receive 4+ notifications per day have a 3x higher uninstall rate than users who receive 1-2. The dormant segment is 35% of your installed base.

The call: Do you approve the 5-notification-per-day cap to drive reactivations, or hold it at 2?
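One way to structure the call is a back-of-envelope trade-off. In the sketch below, the installed base and the baseline uninstall rate are assumptions for illustration; the brief only gives the 3x uninstall multiplier, the 12% reactivation lift, and the 35% dormant share.

```python
# Back-of-envelope for the notification-cap decision.
installed_base = 1_000_000                  # ASSUMED, not given in the brief
dormant = 0.35 * installed_base             # 350,000 users with no order in 7+ days
reactivated = round(0.12 * dormant)         # projected lift: 42,000 reactivations

baseline_uninstall = 0.02                   # ASSUMED monthly rate at 1-2 pushes/day
extra_uninstalls = round(dormant * baseline_uninstall * (3 - 1))  # 3x rate -> 2x extra
print(reactivated, extra_uninstalls)        # 42000 14000
```

The raw counts appear to favour the lift, but they are not symmetric: an uninstall is permanent, a reactivated dormant user still has to survive the same retention curve, and the assumed baseline rate drives the whole result. The scenario is a judgment call precisely because the arithmetic alone does not settle it.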

// practice for score

Same scenario as the judgment call above: Swiggy, 28% D30 retention, and a proposal to raise the daily notification cap from 2 to 5 for the dormant segment. Write out your call and your reasoning (minimum 80 characters).
// interactive:
The Retention Drop Detective

You are the growth PM at a consumer fintech app in Bangalore. Your D30 retention had been steady at 22% for three months. After a major feature launch (a new savings goal feature with social sharing), D30 dropped to 15% in the most recent cohort. The CEO is alarmed. Your task: diagnose the cause.

You have Mixpanel access, user interview capacity, and one engineer for instrumentation. The savings goal feature launched four weeks ago. Where do you start?

Career-stage considerations

0-2 years: Learn to instrument funnels and read cohort charts. These are foundational analytical skills that every PM needs, not just growth PMs. If you cannot pull a cohort retention table and explain what it shows, you are making product decisions blind. Start here before worrying about engagement loops or notification strategy.

3-5 years: Own the retention metric for your product area. At this stage, you should be the person who can explain why D30 moved up or down, what caused it, and what you are doing about it. Retention ownership is one of the fastest paths to senior PM because it forces you to understand the entire user journey, not just one feature.

5+ years: Design the engagement system, not just the experiments. At this level, you are not running individual retention experiments — you are architecting the loops, triggers, and reward structures that make the product inherently retentive. This is systems design applied to user behaviour, and it is the work that separates a Head of Growth from a senior growth PM.

Where to go next
