Oscar Clark is chief strategy officer at Fundamentally Games. Clark will be presenting a free webinar discussing validating game ideas on Tuesday, May 24, 2022, available through this link.
For so much of my career, I’ve had the responsibility of making a call on the commercial potential of a game: trying to pick the winners in the mobile Java era, consulting (usually to find a way of rescuing a failing project), and particularly now in my role as chief strategy officer of a living games publisher.
I’ve done this a lot. And I want to share the way that I look at games, including my own efforts in design, to identify and prepare projects for launch.
For full disclosure, my approach is biased towards Living Experiences – games on mobile, PC, or console that aim to immerse players in ongoing active play over months or years, sometimes known as live service.
There are three different perspectives I like to take:
- Direction: why, what and for whom?
- Design: mechanic/context/metagame
- Data: forecasts, testing and performance
Direction – why, what and for whom?
When looking at a game, the first thing I want to understand is what is providing the direction that underpins the concept, and how that drives its (usually commercial) purpose.
This starts with trying to understand why this game is being made. This is as much about appreciating what motivates the developer as how the game is expected to attract, retain, and convert the player (to spending). Are the developers trying to create a genre-busting disruptive innovation that will blow my mind, or are they trying to recapture the way that Mario made them feel 20 years earlier? Many teams are motivated by peer review, some by cold hard cash, others by Metacritic ratings.
I also need to understand what they are trying to build. This isn’t about genre or platform, although that can provide an easy shorthand for what to expect and will be useful later when looking at data. I’m more interested in whether the game will be a linear, singular experience with a start, middle, and end, or something that evolves and develops over time through ongoing updates, configurations, content releases, and a commitment to a community.
This is as much about appreciating what motivates the developer as how the game is expected to attract, retain, and convert the player
Oscar Clark
These kinds of distinctions are fundamental when assessing how a game might perform. They will also shine a light on other factors, such as whether the team has the skills to realise the game’s potential, and they frame further questions around costs, resource requirements, planning, and the development and testing strategy required. If a developer is making a simple hypercasual twitch game, the criteria will be very different from those for an open sandbox experience that routinely adds new missions.
In many cases, the most important question of direction is who the game is being made for. The target audience for the game should (almost) never be the game designer. If we are validating a game, we need to understand the potential audience in enough detail that we can appreciate why those players will care enough to part with their precious time and hard-earned cash.
We need to understand what they feel, think, say, and do around games like this and, importantly, how you can legitimately find them through social media and marketing. This means understanding their interests, their characteristics, and the channels where they already are, and building reasons for them to trust your team and brand.
Design – mechanic/context/metagame
Assuming we understand the direction of the game, we need to understand the design and how this ‘resolves’ the requirements set out for the team.
I start by isolating the core mechanic – at its core, what the player does – into a loop that can usually be expressed in four steps. I’m looking for:
- A start condition, which sets up a…
- Challenge for the player, which leads to a…
- Resolution, after which the player ultimately receives a…
- Reward.
Simple enough, but this alone doesn’t identify what makes the game fun. Instead, we need to look at how these steps take the player through an emotional journey. There is no one way to do this, but we find it useful to look at how the start condition begins with the player in a calm or relieved state while at the same time creating a high level of anticipation.
As the game introduces the challenge, we might expect to see the player move emotionally from relief to tension, i.e. awareness of the potential for failure. However, if the game fails to sustain the level of anticipation that was set up in the player’s mind at the start, this can morph into churn. At the resolution stage, we look for the emotional journey to go from anticipation to some kind of fear of missing out (FOMO).
We still want there to be tension, but now combined with an awareness of the consequences of failure, i.e. missing out on a future reward or experience. The final stage, once the action is complete, concludes that pattern by revealing the reward, but even here we need to see the player’s state move to a sense of calm or relief.
This helps us judge the potential for players to enter the ‘flow state’ popularised by Mihály Csikszentmihalyi and eagerly sought by many a game designer. However, combining this with thinking about the switch between anticipation and FOMO helps us understand whether the game can tease the player’s interest in playing again (and again). If the reward stage resolves the player’s emotional experience completely, where is the motivation to replay?
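To make this easier to audit in a design review, here is a minimal sketch in Python of how the four steps and their intended emotional transitions might be written down and sanity-checked. The step and emotion names are purely my own illustrative shorthand for the framework described above, not an established tool or library.

```python
from dataclasses import dataclass

# Illustrative shorthand only: these names are assumptions made for this
# sketch, not part of any established framework or library.
@dataclass
class LoopStep:
    name: str
    emotion_in: str   # the state the player arrives in
    emotion_out: str  # the state the step should leave them in

core_mechanic = [
    LoopStep("start condition", "calm/relief",  "anticipation"),
    LoopStep("challenge",       "relief",       "tension"),
    LoopStep("resolution",      "anticipation", "FOMO"),
    LoopStep("reward",          "tension",      "calm/relief (with some anticipation left)"),
]

# Simple sanity check: the final step should not resolve the player completely,
# or there is no emotional pull towards the next play.
last = core_mechanic[-1]
if "anticipation" not in last.emotion_out and "FOMO" not in last.emotion_out:
    print(f"'{last.name}' resolves the player completely – where is the replay motivation?")
```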
A similar principle applies to the second layer, the context loop: the element of game design that gives the player a reason to repeat that mechanic and come back to play once again.
Again, we need to consider the emotional journey of the player as well as the key elements that drive the loop forward. I think about purpose (‘why am I doing this?’) as the starting point where the player can feel anticipation and rest/relief. But as we move towards progression (how we can see the impact of our actions against that purpose), we move from relief to tension. That pattern continues as we look at optimisation (how we play and the choices we make) and move from anticipation to FOMO.
Finally, I look at the narrative (even if it’s just our own story of play) as providing a conclusion to the loop, moving the player from tension to relief while maintaining a sense of FOMO to drive forward the next play (or next episode).
Frequently, that’s as far as many designers go. But I think this isn’t enough, especially for living games. We also need to consider what I call the metagame – the social and lifestyle impact that the game has on the player. Only by understanding where the game fits into the player’s daily routine can we have confidence that it will attain its real potential.
Similar to the other two layers, we are looking at key elements moving through these same emotional states. We start with the lifestyle fit – what the player is doing outside your game – which means considering how the mode of play is shaped by the device being used and how this affects the player’s habits.
For example, using a phone in portrait mode means the gameplay has to be conducted one-handed, will be interrupted regularly, and may be played discreetly in otherwise inconvenient places (such as on the toilet). That’s generally not a consideration for a console, where you may instead have to negotiate with the rest of the family to take over the central TV in the main living space.
The combination of device and circumstances drives many behaviours, but the initial state still needs to provide a sense of calm (relief) and anticipation in order to set up the conditions to sustain long-term player engagement.
Adding levels of collaboration, whether directly in the game through multiplayer or via social community platforms, helps sustain or even build on the excitement, but it also needs to build a tension if you are to maximise engagement. Competition can, for many games, be a massive driver of that positive FOMO we need to build engagement; however, this doesn’t have to mean traditional, direct, one-on-one confrontation as in many sports. With games, we have the power to look at competition in fresh, exciting ways.
Finally, as we move from tension to relief, we should be building up a sense of zeitgeist around the game, creating hype and inspiring debate within the game’s community.
I use this model not only to work out new game ideas, but also to dissect existing games – and sadly, I too often see that it’s the context and metagame loops that aren’t fully considered. It’s not the only way to look at games, but unlike a design based on ‘pillars’, it makes us think about the player experience as a system.
You’ll notice this doesn’t take into account numerous important questions: is the art style compelling (does it fit the IP, genre, and so on)? How immersive are the game’s narrative, sound effects, and the rest? All of these are centrally important, but I find that dissecting the game into these abstract loops allows me to set aside my preconceptions.
Data – forecasts, testing, and performance
As much as I like to think about the direction and design, in the end what really matters is the data. This means thinking about ‘what success looks like’.
If the game has yet to be shipped, you need to make forecasts – frankly, another article in itself. However, if you can take data from previous games (directly or via tools like SteamSpy, data.ai, or Reflections.io), you can build a model which shows you what success can look like. This is what Fundamentally Games calls a ‘top-down model’, showing what revenue and downloads have been achieved, even if we don’t know the costs of UA or development.
For the more diligent amongst you, it can also be important to create a ‘bottom-up model’ to compare against the top-down view. This takes your assumptions about user acquisition costs, organic installs, and revenues and builds a picture of what your game could actually deliver. The gap between the two can be very illuminating, as it helps you understand the scale you can work with for that game.
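As a rough illustration of what a bottom-up model can contain, here is a minimal sketch. Every input value is a hypothetical assumption chosen purely for the example, not data from any real game, and the structure is far simpler than a real forecast would be.

```python
# Minimal bottom-up forecast sketch. All inputs are hypothetical assumptions
# for illustration – replace them with your own estimates.
paid_installs     = 50_000
cost_per_install  = 2.50    # assumed paid CPI
organic_uplift    = 0.40    # assumed organic installs earned per paid install
payer_conversion  = 0.04    # assumed share of players who ever spend
revenue_per_payer = 45.00   # assumed average lifetime spend per paying player

total_installs = paid_installs * (1 + organic_uplift)
ua_spend       = paid_installs * cost_per_install
revenue        = total_installs * payer_conversion * revenue_per_payer

print(f"Total installs:          {total_installs:,.0f}")
print(f"UA spend:                {ua_spend:,.2f}")
print(f"Modelled revenue:        {revenue:,.2f}")
print(f"Margin before dev costs: {revenue - ua_spend:,.2f}")
```

Comparing the revenue a sketch like this produces against the top-down benchmark from comparable titles is where that gap becomes visible.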
The second form of data – user testing – collates insight and feedback from players. This is usually qualitative: it’s about understanding how the game makes the player feel, using surveys and polling. We find it useful to use third-party platforms like Antidote and PlayTestCloud (mobile-focused), as this keeps a distance between us as the publisher and the players, helping to prevent bias in the results.
As much as I like to think about the direction and design, in the end what really matters is the data
Oscar Clark
This gives the developer or designer solid, independent insight into what’s happening for players. Some triple-A studios will use biometrics and other advanced techniques, which can be amazingly useful but are generally out of the reach of most independent studios. The key objective for us with user testing is to identify whether players would play again and whether they would recommend the game to a friend. Recommendation – also known as the net promoter score – is a key factor, as it helps us understand the potential for organic user acquisition.
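Net promoter score itself is a simple calculation over the ‘would you recommend this?’ question, usually asked on a 0–10 scale: the percentage of promoters (9–10) minus the percentage of detractors (0–6). A minimal sketch, with made-up responses:

```python
# Standard NPS calculation; the responses below are made up for illustration.
responses = [10, 9, 9, 8, 7, 7, 6, 5, 9, 10, 3, 8, 9, 6, 10]

promoters  = sum(1 for r in responses if r >= 9)
detractors = sum(1 for r in responses if r <= 6)
nps = 100 * (promoters - detractors) / len(responses)

print(f"NPS: {nps:.0f}")  # positive means more promoters than detractors
```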
Finally, we look at the performance data. This is the gold standard for understanding how a game will perform, but it doesn’t always help you understand exactly why – hence it’s useful to combine with the testing data. The key variables I’m interested in are CPI (blending paid and organic installs), day 1/3/7/14/30/60 retention, ROAS (usually the percentage return over 30 days), and the percentage of repeat spenders. Again, there is a whole article in looking at performance data, but the important thing is having a metric in mind before you run the tests and treating it as a hypothesis that can be proved or disproved.
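As a sketch of what ‘having a metric in mind before you run the tests’ can look like in practice, the snippet below checks test results against targets agreed in advance. All the targets and results are hypothetical numbers chosen for illustration, and the metric names are my own shorthand rather than any particular platform’s.

```python
# Hypothetical, pre-agreed targets for the test (the 'hypothesis').
targets = {"blended_cpi": 2.00, "d7_retention": 0.12, "roas_30d": 0.40}

# Made-up results from a test cohort, for illustration only.
results = {
    "spend": 10_000.0,
    "paid_installs": 4_200,
    "organic_installs": 1_300,
    "d7_retained": 620,
    "revenue_30d": 3_800.0,
}

all_installs = results["paid_installs"] + results["organic_installs"]
blended_cpi  = results["spend"] / all_installs
d7_retention = results["d7_retained"] / all_installs
roas_30d     = results["revenue_30d"] / results["spend"]

checks = {
    "blended_cpi":  blended_cpi <= targets["blended_cpi"],
    "d7_retention": d7_retention >= targets["d7_retention"],
    "roas_30d":     roas_30d >= targets["roas_30d"],
}
for metric, passed in checks.items():
    print(f"{metric}: {'pass' if passed else 'fail'}")
```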
In summary
I’ve talked about three key approaches for understanding and validating any game by understanding the direction, design, and data.
However, what we use at any time usually depends on the stage of development. We aim to find problems rapidly and to focus on the basic needs of the game first, so we have something to build on with the later analysis. This means we start by understanding who the target audience is, then run a Facebook ad test using a 30-second gameplay video and see how many clicks we get.
That gives us an immediate understanding of how attractive the game might be. If that goes well, then we might do the user testing process, and only after that start looking at retention tests, betas, and soft launches.
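Going back to that first ad test, the arithmetic involved is deliberately simple: a click-through rate compared against a benchmark agreed in advance. The numbers below are made-up placeholders rather than real benchmarks.

```python
# Hypothetical ad-test results and a made-up benchmark, for illustration only.
impressions = 120_000
clicks      = 1_800
target_ctr  = 0.02   # the click-through rate agreed in advance as 'success'

ctr = clicks / impressions
verdict = "worth taking further" if ctr >= target_ctr else "rethink the concept or creative"
print(f"CTR: {ctr:.2%} – {verdict}")
```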
In short, we see validating a game as an ongoing process, treating the game as a hypothesis to be tested: agreeing KPIs in advance and improving our understanding step by step.
If you want to know more about this approach to validating your game ideas, check out my upcoming free webinar on Tuesday, May 24, 2022 at validategameideas.eventbrite.co.uk.