Mobile App Uninstall Rate: Benchmark and Analysis

Mobile app data benchmarks are quite opaque. Though getting accurate data for your app category and competitors is difficult, a general sense of good and bad can be based on publicly available data. This also helps in figuring out the product-market fit of your product. I have gathered this information from different mobile app analytics companies and the analytics teams of different startups.

Andrew Chen and Quettra Study 

For an average app, D7 retention (the share of users still active seven days after install) is 29% – it has already lost 71% of its users. D30 retention for an average app is 10%.

Here, "users are lost" doesn't mean users suddenly go completely inactive – they might just be using the app once per week and happen not to be active on that particular day. Don't confuse this with app install retention, though the two are correlated: if a user is not using the app, chances are high that the app will be uninstalled.
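As an illustration of how Dn retention is computed, here is a minimal sketch in Python. The event log below is entirely hypothetical; real analytics tools compute this from their own event streams:

```python
from datetime import date

# Hypothetical event log: user_id -> (install date, set of dates the user opened the app)
activity = {
    "u1": (date(2024, 1, 1), {date(2024, 1, 1), date(2024, 1, 8), date(2024, 1, 31)}),
    "u2": (date(2024, 1, 1), {date(2024, 1, 1)}),
    "u3": (date(2024, 1, 1), {date(2024, 1, 1), date(2024, 1, 8)}),
}

def dn_retention(activity, n):
    """Share of installers who were active exactly n days after install."""
    retained = sum(
        1 for install, opens in activity.values()
        if any((d - install).days == n for d in opens)
    )
    return retained / len(activity)

print(dn_retention(activity, 7))   # u1 and u3 opened on day 7 -> 2/3
print(dn_retention(activity, 30))  # only u1 opened on day 30 -> 1/3
```

Note that by this definition a user who skips day 30 but returns on day 31 counts as "lost" for D30, which is exactly why D30 understates how many users are still around.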

According to Ankit Jain (formerly head of search+discovery for Google Play)

“Users try out a lot of apps but decide which ones they want to ‘stop using’ within the first 3-7 days. For ‘decent’ apps, the majority of users retained for 7 days stick around much longer. The key to success is to get the users hooked during that critical first 3-7 day period”.

App Uninstall Rate Benchmark

Flurry App Uninstall Benchmarks

Flurry came out with a great article breaking down retention versus frequency for a number of mobile app categories.

This serves as a 30-day retention benchmark for different categories, i.e. Health and Fitness, Education, Entertainment, Finance, News, etc. As per the report, Health and Fitness has the highest retention, 48% on Android. Games generally have low retention. Please note that this retention over 30 days is different from D30 retention: D30 is based on active users lost, i.e. users not active on the 30th day, while Flurry measures active app installs over 30 days (numbers provided by Google Play Console and iTunes Connect).

Flurry Report Uninstall Benchmark

Apsalar Android Benchmark

Marketers are most interested in understanding and mitigating uninstall-rate problems in the first weeks of a customer engagement. But data show that after a few weeks, user behavior is mostly set for the long term. That's why a four-week benchmark is provided. These numbers seem a bit biased toward good-quality apps, but are nevertheless a good benchmark.

The APAC region has the worst mobile app uninstall rate. A major reason rates are higher in India and across APAC is that the less expensive phones purchased there have much smaller memory.

Apsalar Mobile App Uninstall Rate Benchmark

Localytics and SimilarWeb Report

After a user installs an app, it gets used a lot in the first 3 days. However, on average an app retains only 23% of its daily active users within 3 days of installation. In general, apps with high D3 retention rates drive high engagement later on. The average app is uninstalled in roughly 8 months; for gaming apps it is about 7 months.

Users who are retained for 7 days tend to stick around. The percentage of users who abandon an app after one use is now 23%. The percentage of users who open an app 11 or more times, and hence count as retained, is 38%. That means a whopping 62% will use an app fewer than 11 times.

“For a successful retention strategy – build a good relationship within the 3-7 day period and before 11 app opens.”

Localytics and SimilarWeb Mobile App Uninstall Benchmark

Mobile App Uninstall Rate Category Wise

User Retention Mobile App Category Wise

Retention in the Localytics report below is defined as returning to the app at least once within 30 days.

Localytics Mobile App Uninstall Rate Benchmark for the Average App

“So in general, with the exception of gaming apps, anything around a 20% retention rate at 90 days is average across the mobile industry, and anything north of 25% is what you should strive to reach.”

Below are statistics for top-performing apps. These apps are defined as top performers by having over 1 million monthly active users.

Localytics Mobile App Uninstall Rate Benchmark for Top-Performing Apps

There are some paid tools and partner tools you can use to track retention for competitors' apps. Retention in terms of DAU or MAU relative to installs correlates well with the uninstall rate. I find them quite useful, apart from the fact that they cost money and ask you to share data. Here are some: Survey Monkey, App Annie and Priori Data.

Naming Convention Best Practices For Events in Analytics Platforms

The most critical step of setting up analytics for non-tech people is naming the actions on an app or website right. If you get this wrong, you not only have to repeat this step but also risk making dubious business decisions based on corrupted data. Furthermore, many tools still don't provide an option to rename events, leading to loss of historical data.

Common Taxonomy Mistakes 

Some of the common mistakes while naming include:

  • Ignoring the case-sensitive nature of Google Analytics, Firebase, and other third-party analytics tools
  • Ignoring the hierarchy of events. Every third-party tool has two sets of actions – primary, generally called ‘events’, and secondary, generally called ‘properties’ – and these need to be named appropriately
  • Creating a huge number of primary events while making little use of the secondary-events feature
  • Not thinking from a funnel perspective, and hence defining too many events while risking missing the key ones
  • Not defining user-specific properties, and hence missing out on customer-segment data even though every third-party analytics tool comes with user-specific properties
  • Not keeping a record of the taxonomy of events, or making changes to event names without updating the central repository for everyone's knowledge
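To see why the case-sensitivity mistake matters, here is a small sketch (the event names are made up) of how inconsistent casing silently splits one event's count across several rows:

```python
from collections import Counter

# Hypothetical raw event stream where developers were inconsistent about case
events = ["purchase", "Purchase", "purchase", "PURCHASE"]

# Case-sensitive tools treat each spelling as a distinct event,
# fragmenting what should be a single funnel step
raw_counts = Counter(events)
print(raw_counts)  # three distinct keys instead of one

# Normalizing at the source keeps the count in one place
clean_counts = Counter(e.lower() for e in events)
print(clean_counts)  # a single 'purchase' key with count 4
```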

When Social Capital partner Ashley Carroll spoke on behalf of DocuSign regarding events in analytics, she used the following meme:

Boromir

 

Step-by-Step Guide to Taxonomy:

1. Don’t measure every event – scarcity is good

You don't need to measure every event, only those that are actionable. As a principle, track only events whose rise and fall in numbers will be a basis for your action.

For instance, consider a fitness app that gives the user the facility to look at his activity history. Tracking whether the user is "scrolling down" on the profile to check history doesn't make much sense, because there is no action associated with the following 2 cases:

  • Low Usage of Feature: As this feature is a sort of basic hygiene and available in all fitness apps, even if usage is low, you can't do away with it
  • High Usage of Feature: Even then you can't do much to improve the profile, and it won't lead to significant action anyway

2. What events to track

First, decide what user actions are critical for you. Then decide the chain of events leading to that action. We call this the funnel approach.

Here is an example of a conversion funnel:


Once you decide the broad events based on the required funnel, determine what secondary information you will need in order to deep-dive into why major event values increase or decrease against expectations. These secondary events are called by different names in different tools – parameters (in Firebase), properties (in Mixpanel or Google Analytics), etc.
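The funnel approach can be sketched in code. The four-step e-commerce funnel and per-user event logs below are hypothetical, but the step-by-step drop-off count is the core of what any funnel report shows:

```python
# Hypothetical funnel: ordered primary events
funnel = ["view_product", "add_cart", "start_checkout", "purchase"]

# Hypothetical per-user sets of events each user has performed
user_events = {
    "u1": {"view_product", "add_cart", "start_checkout", "purchase"},
    "u2": {"view_product", "add_cart"},
    "u3": {"view_product"},
}

def funnel_counts(funnel, user_events):
    """Count users reaching each step; a user counts for a step only if
    they performed every earlier step as well."""
    counts = []
    for i, step in enumerate(funnel):
        required = set(funnel[: i + 1])
        counts.append(sum(1 for ev in user_events.values() if required <= ev))
    return counts

print(funnel_counts(funnel, user_events))  # [3, 2, 1, 1]
```

The biggest drop between adjacent counts is where secondary properties (device, source, price band, etc.) earn their keep, because they let you segment exactly that step.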

3. Create a taxonomy implementation sheet

It’s extremely important to create an implementation sheet. This will be helpful in keeping every team on the same page and avoiding confusion.

Here is an example of the implementation sheet for Mixpanel and Firebase.

Mixpanel Implementation Sheet

Firebase Implementation Sheet

Sample Tracking Plan – ECommerce

Sample Tracking Plan – Social

You can similarly create a sheet for different third party tools.
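One way to make the implementation sheet enforceable (the event and property names below are hypothetical) is to mirror it as data and validate incoming events against it, so the sheet stays the single source of truth:

```python
# Hypothetical tracking plan: primary events mapped to their allowed
# secondary properties, mirroring the implementation sheet
TRACKING_PLAN = {
    "add_cart": {"product_id", "price", "currency"},
    "purchase": {"order_id", "revenue", "currency", "payment_method"},
}

def validate(event, properties):
    """Reject events or properties that are not in the tracking plan."""
    if event not in TRACKING_PLAN:
        return False, f"unknown event: {event}"
    extra = set(properties) - TRACKING_PLAN[event]
    if extra:
        return False, f"unplanned properties: {sorted(extra)}"
    return True, "ok"

print(validate("add_cart", {"product_id": "p1", "price": 9.99}))  # accepted
print(validate("add_cart", {"color": "red"}))                     # rejected
```

Running a check like this in a debug build (or in a CI test over sample payloads) catches taxonomy drift before it pollutes production data.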

To avoid upper-case/lower-case mix-ups and similar confusion, here are some thumb rules for naming events:

  1. Always use lowercase letters
  2. Use "_" instead of spaces (tools like Firebase only allow letters, digits, and underscores in event names)
  3. Use numbers only if they make absolute sense. Don't ever name an event purchase_step_1; that would be really confusing to other team members
  4. Don't abbreviate unnecessarily. For example, "add_cart" is preferable to "adcrt"
  5. Avoid prepositions and use only the present tense. For example, "add_cart" is preferable to "added_to_cart", and "stop_video" to "stopped_video"
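These thumb rules can also be enforced programmatically. A minimal normalizer sketch, not tied to any particular analytics SDK:

```python
import re

def normalize_event_name(name):
    """Apply the naming thumb rules: lowercase, underscores instead of
    spaces and hyphens, and only letters, digits, and underscores."""
    name = name.strip().lower()                # rule 1: lowercase
    name = re.sub(r"[\s\-]+", "_", name)       # rule 2: underscores, not spaces
    name = re.sub(r"[^a-z0-9_]", "", name)     # drop stray punctuation
    return name

print(normalize_event_name("Add Cart"))    # -> add_cart
print(normalize_event_name("Stop-Video"))  # -> stop_video
```

Calling a helper like this in the one place that fires events makes rules 1 and 2 impossible to violate; rules 3-5 still need a human reviewing the implementation sheet.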

4. Understand different types of events

In some third party tools, there are different types of events, which have different tracking functionalities.

For example, in Mixpanel, apart from events and properties, there are user properties, applied to user-related attributes, and super properties, applied across all events. These help you dissect events and funnels even further, so make use of these features for better dissection.

In Firebase, you have user properties apart from events and parameters. First, these attributes help in user segmentation. Second, they help in dissecting funnel elements even further.
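Here is a minimal sketch, deliberately not the real Mixpanel or Firebase API, of how super properties and user properties enrich tracked events:

```python
# Toy tracker illustrating the concepts: super properties ride along with
# every event, user properties describe the person for segmentation
class Tracker:
    def __init__(self):
        self.super_props = {}  # attached to every event (Mixpanel calls these "super properties")
        self.user_props = {}   # per-user attributes (Mixpanel people / Firebase user properties)
        self.log = []

    def register(self, **props):
        """Set super properties, merged into all future events."""
        self.super_props.update(props)

    def set_user(self, user_id, **props):
        """Set user properties for segmentation."""
        self.user_props.setdefault(user_id, {}).update(props)

    def track(self, user_id, event, **props):
        record = {"event": event, "user": user_id,
                  **self.super_props, **props}
        self.log.append(record)
        return record

t = Tracker()
t.register(app_version="2.1.0")   # from now on, sent with every event
t.set_user("u1", plan="premium")  # segmentation attribute, not an event
rec = t.track("u1", "start_video", video_id="v42")
print(rec["app_version"])  # super property attached automatically
```

The payoff is that a funnel over `start_video` can now be split by `app_version` or by `plan` without the developer ever passing those values at each call site.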

5. Always keep the implementation sheet updated

It can be a huge source of confusion if the implementation sheet is not updated regularly; in the worst case, it might lead to a disastrous decision. Ideally, there should be a restriction on who can change event names, and that person should have the responsibility to update the sheet.

Now, a lot of third-party tools also provide a facility to track events without making changes in code. I have used this feature in Mixpanel, and it significantly reduces dependence on developers.

I hope this answers all your questions. In case you have any queries, please ask in the comments.