- Rockbite Games founders offer insight into how to utilise analytics to boost your game metrics
- Tips include avoiding small iterations when a project has low metrics; instead, developers should quickly test different hypotheses
- Teams can also reach out to publishers with a similar game portfolio to save time and money
Understanding game analytics can significantly impact your game in the mobile market. However, understanding analytics in a way that allows you to create meaningful change in your game can be challenging.
Rockbite Games founders Avetis Zakharyan and Gevorg Kopalyan experienced this scenario as they had to find a way to boost their games’ metrics, which meant working closer with an analytical team.
In this article, the founders share insight into what they learned in this process, highlighting tips and common mistakes to avoid.
The project is soaring, marketing returns more money than it spends, and you keep scaling up and buying more traffic.
This is the dream scenario, right?
The project is growing, and all you’re thinking about is improving product metrics. Most developers imagine this, especially those who know how to create compelling projects but don’t have much experience in marketing and analytics.
We went through this ourselves, but eventually, the money from our hit ran out, and we had to change our approach to developing new projects, analytics, and working with publishers.
During the crisis, we had to figure out how to find growth points that would not only maintain our metrics but also significantly boost them.
The solution was deep, close collaboration with the analytics team (assuming you have one at all). The next step was understanding what to measure and which common mistakes to avoid.
Product analytics ≠ Marketing analytics
Measuring playtime, drop-offs, and retention helps identify bottlenecks, which, when fine-tuned, can give your metrics a nice uplift. For example, in our last project, Idle Outpost, the economy was struggling due to a high percentage of drop-offs at the third level.
After this level, a new mechanic significantly increased long-term retention, but only for players who discovered it. We needed to guide the rest of the players to this point so they wouldn’t quit early, and that’s where product analytics comes in handy.
We created plot summaries and emotional "hooks" to explain where everything was heading and encourage players to keep playing. And then you just roll out the update to 50% of the audience and compare the metrics, right? Of course not.
When your marketing is actively purchasing, the key is to segment the audience, figure out where the traffic came from, break it down by GEO, and exclude organic traffic. It’s not always obvious, but you won’t get reliable results without these basics.
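To make the idea concrete, here is a minimal sketch of that segmentation step before comparing variants. The record fields, sources, and GEOs are invented for illustration; the point is simply that organic installs are excluded and the remaining traffic is broken down by GEO before any A/B comparison.

```python
from collections import defaultdict

# Hypothetical install records; field names are illustrative, not from the article.
installs = [
    {"source": "organic",  "geo": "US", "variant": "A", "retained_d1": True},
    {"source": "facebook", "geo": "US", "variant": "A", "retained_d1": True},
    {"source": "facebook", "geo": "US", "variant": "B", "retained_d1": False},
    {"source": "unity",    "geo": "DE", "variant": "B", "retained_d1": True},
    {"source": "facebook", "geo": "DE", "variant": "A", "retained_d1": False},
]

def segment_d1(installs):
    """D1 retention per (geo, variant), paid traffic only."""
    buckets = defaultdict(lambda: [0, 0])  # (geo, variant) -> [retained, total]
    for row in installs:
        if row["source"] == "organic":  # exclude organic before comparing variants
            continue
        key = (row["geo"], row["variant"])
        buckets[key][1] += 1
        if row["retained_d1"]:
            buckets[key][0] += 1
    return {k: retained / total for k, (retained, total) in buckets.items()}

print(segment_d1(installs))
```

Only once the numbers are split this way can a variant comparison be trusted; a raw A-vs-B average would mix organic users and cheap-GEO traffic into both arms.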
Initially, we used Firebase, but it wasn't the best fit for our needs and had to be combined with other services, which was too complex and inconvenient. It takes several million dollars of marketing spend before you can reliably understand traffic sources and segment audiences during testing.
We didn't have that experience with previous projects and ended up with $300,000 in losses. We went into debt, and with high burn rates and ongoing marketing issues, the studio had only a couple of months of runway left.
We acquired traffic and planned for a rough profit, but after Apple's IDFA changes, tracking problems started. We realised we were buying at a loss, and a significant one at that.
When we started working with AppQuantum, we began upgrading our product analytics. The publisher helped us with expertise and advice for our in-house analyst. We couldn’t have discovered some insights without marketing test budgets and benchmarks based on big data.
Now, we maintain our own event database, which is more reliable when used with Firebase for audience segmentation. The main feature is that we process the data ourselves.
The analytics stack is pretty simple and time-tested:

- Firebase
- AppsFlyer
- Airflow
- Tableau
- Grafana and Prometheus for monitoring

We're also currently expanding the department and setting up new infrastructure, such as a ClickHouse cluster for events.
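The heart of that setup is keeping your own event store and processing the data yourself. Below is a heavily simplified sketch of the idea, using Python's built-in sqlite3 as a stand-in for an events cluster like ClickHouse; the table schema and event names are invented for illustration.

```python
import sqlite3

# Stand-in for a self-managed event store (sqlite3 instead of ClickHouse).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        user_id TEXT,
        name    TEXT,
        ts      INTEGER
    )
""")

def track(user_id, name, ts):
    """Write one gameplay event to our own database."""
    conn.execute("INSERT INTO events VALUES (?, ?, ?)", (user_id, name, ts))

# Ingest a few hypothetical gameplay events, then query them ourselves.
track("u1", "tutorial_complete", 100)
track("u1", "level_3_drop_off", 200)
track("u2", "tutorial_complete", 150)

rows = conn.execute(
    "SELECT name, COUNT(*) FROM events GROUP BY name ORDER BY name"
).fetchall()
print(rows)
```

Owning the raw events is what makes the Firebase data more reliable: you can re-segment, re-attribute, or re-count at any time without depending on a third party's aggregation.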
Don’t launch projects blindly
Even if the game seems cool and testing conditions are met, it’s tough to move forward blindly without extensive experience in launching and running successful projects.
Initially, we focused more on the quality of games. Then, realising our lack of expertise in marketing and analytics, we were afraid to approach publishers for fear of losing control.
Basically, we made almost all the mistakes and had all the fears of a startup.
To avoid the same pitfalls, here are some basic principles for working with metrics during prototype testing and soft launches:
- Collect benchmarks for your niche. Period. Reach out to publishers with a similar portfolio. This will save you a ton of money and time. We released about 50 games before it finally clicked.
- If the project's metrics are very bad, don't move in small iterations. Fine-tuning might seem like a good idea, but it isn't. If the project is 60-70% short of its target ROI, you must quickly test drastically different hypotheses. A 0.5% uplift from changing the colour of a button and polishing the design is a waste of time.
- With very poor metrics, A/B tests may also be unnecessary. They are a great and proper tool but require a lot of time, data, and effort. If your retention is significantly below the benchmark, either drop the project and start a new one or quickly test major hypotheses. Small changes won't help, and there's no time for A/B tests.
- Micro-tuning is acceptable for projects with good basic metrics or engaging core gameplay. In other cases, it's a waste of time in pursuit of perfection. If the game is already making money, you don't want to lose your audience, so conduct A/B tests and make small changes to boost metrics by a few percent.
- Test new versions on identical and statistically significant audiences. Several hundred installs from a cheap region won't give you the full picture. Start with a retention test: the core of the game is key, and if it's solid, you can work with it further. For days one and three, you can gather metrics if there's enough content; by day seven, you need at least two to five thousand players. Payment metrics require 10,000 to 20,000 installs in the main GEOs (Canada/UK or other similar markets), depending on your payer percentage. People often buy from multiple sources, such as Facebook and Unity, so first do basic attribution, distinguishing organic from paid traffic; organic installs may come in during the test, and it's important to filter them out. Also split traffic by campaign and source, and be sure to attribute traffic from each country.
- Monetisation mechanics differ by genre. If day-one retention (R1) is 25% but long-term retention is over 10%, the project is worth working on: expand the primary funnel and deepen monetisation.
- Learn from others' experiences. If your metrics are critically low and there's no time or reason for tuning, figure out where to go next. There will be fewer questions if you've launched and tested dozens or hundreds of prototypes and run various successful projects. We didn't reinvent the wheel; we went to a publisher experienced in our genre (we wish we had done it sooner).
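The "statistically significant audiences" point above can be made concrete with a standard two-proportion z-test, which is one common way (not necessarily the authors' exact method) to check whether a retention difference between two test arms is real or noise. The numbers below are hypothetical.

```python
from math import sqrt

def retention_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-statistic: is variant B's retention really different
    from variant A's, given the sample sizes?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)            # pooled retention rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))   # pooled standard error
    return (p_b - p_a) / se

# Hypothetical: 2,000 installs per arm, D7 retention of 8% vs 10%.
z = retention_z(160, 2000, 200, 2000)
print(round(z, 2))  # |z| > 1.96 is roughly significant at the 95% level
```

With only a few hundred installs per arm, the same 2-point difference would produce a z-value well inside the noise band, which is exactly why cheap-region micro-tests mislead.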
As a bonus, here’s a table with benchmarks from point one using our past Metropolis project as an example:
Analyse your competitors
We’ve rethought game mechanics multiple times, but moving forward without guidelines is counterproductive.
I’m going to be honest with you: even though we’re gamers, we never played other people’s projects as much as we did when collaborating with a publisher. It’s been a steady stream of games worth checking out. They helped us regularly analyse the market, features, successful projects, mechanics, and trends.
This approach keeps you from worrying about your project, helps you stay on trend, and significantly increases the chance of a successful launch.
For example, let's take idle tycoon games. Here's a must-check list for developers:
- Idle Courier: answers a lot of questions.
- AdVenture Capitalist: an idle classic for genre newcomers.
- Cheech and Chong Bud Farm: Eastside Games' top-earning project this year. Eastside is a big player in the idle game market, releasing games with identical cores for different franchises.
- Frozen City: one of the main projects in the niche.
- Idle Office Tycoon: the largest project in its "room tycoon" subgenre.
- Eatventure: birthed a new subgenre, a simplified production chain plus equipment meta.
“There is no silver bullet for making a successful project – only teamwork and the ability to learn from past experiments. Only an iterative approach and communication between the developer, publisher, and team can increase the chances of releasing a hit. We sincerely believe that 1+1=3; only this form of teamwork allows companies to succeed in a highly competitive market.
We've nailed how to launch successful products. It's no longer just a shot in the dark. We've got a clear plan for us and our partners. And now we're excited for the release of new hits!" – Evgeny Maurus, founder of AppQuantum
There are never enough events
LiveOps has long been crucial for any project aiming to reach a plateau rather than die after a profit spike and audience influx.
Even with profitable metrics, numbers will decline as acquisition volume increases, so you need to constantly refine the game to counterbalance this and track the key points in product analytics.
This approach has become second nature to us. Here are several key points to save time analysing metrics and working with updates from the start:
Evaluate the entire starting funnel (including the additional loading stages):

- Client loading event and passing the GDPR window
- Server login
- Additional content loading initialisation
- Game initialisation after loading
User onboarding: Track every user step – monitor every gameplay-changing element to understand user progress and where drop-offs occur.
Measure D1 retention and tutorial completion: this is crucial in initial tests to widen the top of the funnel and get as many users as possible progressing further into the game.
After expanding the primary funnel, work with win rate: Adjust the balance carefully to provide a challenge without overcomplicating things.
Finally, examine monetisation level by level. Identify levels with good indicators and use them as variants in A/B tests to build a chain of levels with maximum monetisation and minimal drop-off.
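The starting-funnel evaluation described above boils down to computing step-to-step conversion so the worst stage stands out. A minimal sketch, with invented step names and counts:

```python
# Hypothetical counts of users reaching each loading/onboarding step.
funnel = [
    ("client_loaded",      10000),
    ("gdpr_accepted",       9300),
    ("server_login",        9100),
    ("content_downloaded",  8200),
    ("game_initialised",    8000),
]

def step_conversion(funnel):
    """Conversion from each step to the next; the lowest rate is the bottleneck."""
    return [
        (nxt, round(n_next / n_prev, 3))
        for (_, n_prev), (nxt, n_next) in zip(funnel, funnel[1:])
    ]

for step, rate in step_conversion(funnel):
    print(step, rate)
```

In this made-up data the content-download step converts worst, so that is where the next update effort would go; the same per-step view applies equally to onboarding and level-by-level monetisation.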
Edited by Paige Cook