LTV 201: A Simple Method for Forecasting

Lifetime value is tricky. The simplest -- and also most reliable -- calculations tell you what happened in the past: segment X made Y dollars per user. In a perfect world, that would be enough.

But usually, you’ll want to know the future, and that means building a more complex model.

“Complex” can mean a lot of things, from relatively quick and easy formulae to cutting-edge machine learning setups. What most startups find when they begin researching LTV models is that it’s easy to find examples, but hard to find reviews or evidence of which model is best.

The lack of substantive info is not only because big companies don’t want to talk about what they use. The other reason is that there is no “best” practice. There’s only what works in a particular situation. So for this post, the follow-up to our LTV 101 intro, I’m going to talk about why you may need models and how to start from scratch.

A Quick Recap

LTV 101 offered several typical reasons developers calculate LTV: spending acquisition dollars wisely, staying informed of their app’s health, and making business decisions.

Each of these purposes can be satisfied by even the simplest calculation: take the known revenue of a cohort over a period, and divide it by the number of users in the cohort. So for instance, if you know your Facebook 18-25 cohort, made up of 1,000 users, earned revenue of $10,000 over a 90-day period, then you have a provable 90-day LTV of $10 per user.
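As a sketch, the calculation is a single division. The figures below are the example cohort from this post (assumed values, not real data):

```python
# Historical LTV: known revenue over a period, divided by cohort size.
cohort_size = 1_000      # users in the Facebook 18-25 cohort
revenue_90d = 10_000.0   # dollars earned by the cohort over 90 days

ltv_90d = revenue_90d / cohort_size
print(f"90-day LTV: ${ltv_90d:.2f} per user")  # prints "90-day LTV: $10.00 per user"
```

The result is backward-looking by construction: it tells you what this cohort was worth, not what the next one will be.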

This number might tell you that it’s OK to acquire similar users for up to $10; that your app has a chance with investors; and that you can scale up user acquisition. But it doesn’t tell you about the future of other cohorts.

And what if the revenue was just $1,000? Without the ability to project future LTV, you miss chances to pivot when needed, or to charge ahead confidently when the signs are all positive.

One Method for Predicting LTV

You can start projecting LTV for additional cohorts with a simple historical model we call the “Ratio Method.”

To use this method, you’ll need 90 days of data from at least one past cohort. Based on this past cohort data, you’ll be able to find a rate of growth that can be applied to future cohorts.

Here’s how to use the Ratio Method, step by step:

  1. For your past cohort, find the dollar amounts of 7-day and 90-day LTV (e.g. $1.50 on day 7, and $6 on day 90)

  2. Divide the 90-day amount by the 7-day amount (6 / 1.50 = 4)

  3. Find the 7-day LTV amount of a new cohort (e.g. $2)

  4. Multiply the new cohort’s 7-day LTV of $2 by the original cohort’s ratio of 4

In this scenario, your expected 90-day LTV of the new cohort is $8.
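The steps above can be sketched as a small helper. The function name and the example figures are just the ones used in this post, not a standard API:

```python
def ratio_method(past_ltv_7d: float, past_ltv_90d: float, new_ltv_7d: float) -> float:
    """Project a new cohort's 90-day LTV from a past cohort's 7d->90d growth ratio."""
    ratio = past_ltv_90d / past_ltv_7d   # e.g. 6 / 1.50 = 4
    return new_ltv_7d * ratio            # apply the ratio to the new cohort

# Example numbers from the steps above
projected = ratio_method(past_ltv_7d=1.50, past_ltv_90d=6.00, new_ltv_7d=2.00)
print(f"Projected 90-day LTV: ${projected:.2f}")  # prints "Projected 90-day LTV: $8.00"
```

The same function works for any pair of observation windows (30-day to 180-day, say), as long as the past cohort's two measurements bracket the horizon you want to project.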

Your ratios will become more accurate as time passes and more cohort data is collected, so this method remains a useful backup even once you have a more complex formula, both for quick back-of-envelope estimates and for benchmarking your other models.

Next Up: A More Complex Model

The next step after learning the Ratio Method will be building your full formula, which can be based on principles like a power curve or even machine learning. As you work toward this goal, it’s worth keeping two other factors in mind that may feed into LTV:

  • Advertising -- If you run ads in your app, you’ll need to add that revenue to your LTV calculations.

  • Virality -- Viral invites can also be added to LTV in apps with measurable virality: when one user invites another, the inviter is effectively responsible for the invitee’s future spending. Calculate K-factor (for example, 0.1), multiply it against the projected LTV of the invitee (for example, $5), and add the result to the inviter’s LTV (0.1 * $5 = $0.50 additional LTV).
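
The virality adjustment from the bullet above looks like this in code; the base LTV of $8 is carried over from the Ratio Method example, and all figures are illustrative:

```python
# Add viral credit to an inviter's LTV (illustrative numbers from the text).
k_factor = 0.1        # invitees generated per user
invitee_ltv = 5.00    # projected LTV of an invited user
base_ltv = 8.00       # inviter's own projected 90-day LTV

viral_credit = k_factor * invitee_ltv        # 0.1 * $5 = $0.50
total_ltv = base_ltv + viral_credit
print(f"Viral credit: ${viral_credit:.2f}, adjusted LTV: ${total_ltv:.2f}")
```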

Tenjin’s own tools do account for ads and virality out of the box.

What comes after is a bit more difficult -- but manageable! Stay tuned for our third post in the series, in which we’ll introduce several models Tenjin keeps on hand for LTV.
