Janne Huuskonen is director of marketing and communications at Utopia Analytics
Facebook’s lack of moderation across its billions of users has put it at the centre of a political and social media storm, with the realisation that a utopian belief in the ability of people and platforms to self-moderate simply doesn’t work when anger and outrage fuel an entire business model.
Games are generally not driven by the same engagement-based financial model, but effective moderation is just as important, especially with games like Roblox and Fortnite blurring the lines between games and social spaces. With more and more games built around multiplayer and online community play, the behaviour of a toxic minority is becoming a significant problem that risks alienating players and harming the gaming brands involved.
Utopia Analytics surveyed 1,000 adults in the US who identified as gamers across mobile, console, and PC to quantify the scale of the problem. The report’s findings revealed that toxic behaviour in online gaming has snowballed into an industry-wide problem.
The scale of the problem
Our research found that 70 per cent of respondents had either experienced toxic behaviour directly or witnessed it firsthand in an online video game – 38 per cent were the direct target of abusive remarks and 32 per cent had witnessed abuse, while only 30 per cent had encountered neither.
In mobile, RPGs are the most toxic genre, with just over half of respondents experiencing abuse in one form or another. RPGs were followed closely by strategy and adventure mobile games, with 46.5 per cent and 42.5 per cent of players respectively experiencing toxic behaviour. However, the issue is so widespread across the industry that it seems unfair to single out specific communities.
Nearly half (49 per cent) of the toxic behaviour experienced centred on personal identities – such as ethnicity, gender, and sexual orientation. While this shows that anyone can be a target of abuse for any reason, our poll also suggests that women bear the brunt of toxic behaviour: 72 per cent of the 489 women polled said they had experienced gender-specific discrimination when gaming online.
Young male toxicity
Measuring the fallout of abusive behaviour is one thing, but zeroing in on its root cause is an entirely different beast. A January 2021 study in the journal Cyberpsychology, Behavior, and Social Networking found that while age is not a deciding factor, toxicity seems to be driven by young males motivated by achievements in competitive games.
In his book Die Tryin’: Videogames, Masculinity, Culture, Derek A. Burrill found that young males most often engage in toxic behaviour. He suggested ignorance of social conventions as a potential cause and argued in favour of teaching teenagers which phrases are and are not socially acceptable, as they are often unaware of the social implications of what they say. Given how essential diversity is to both history and today’s reality, Burrill argues that education in the early years, in addition to in-game moderation, will help curb toxic behaviour.
Moderation: a moving target
Companies often dedicate considerable resources to ensuring user safety. But with 70 per cent of respondents experiencing toxicity, are these strategies fit for purpose?
Moderation has traditionally been handled by human teams, either building internal tools or moderating content in real time. Besides being outdated, this approach is prone to bias, less effective at scale, and very costly – not only in a monetary sense, but also in the toll that exposure to this kind of content takes on human moderators.
Yet the alternative that many companies turn to – software that filters content using rules and lists of banned words – doesn’t solve the problem either. Some of these tools claim to be AI-based, but they are not AI in the true sense; rather, they are extensive dictionaries of words and phrases that need to be regularly updated as players find simple workarounds, like swapping or misspelling words, to get around the filters.
The introduction of chat apps, emojis, localisation and other languages, upside-down text, Unicode, and voice-to-text makes using these strategies even more of an uphill battle.
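To make the failure mode concrete, here is a minimal sketch of a dictionary-based filter – the banned-word list and messages are hypothetical stand-ins, not any real product’s word list – showing how a single character swap or a Unicode look-alike slips past an exact-match check:

```python
# Minimal sketch of a dictionary-based chat filter, illustrating why
# simple word lists are easy to evade. The banned-word list is hypothetical;
# real lists run to thousands of entries but fail in the same way.
import re

BANNED_WORDS = {"noob", "trash"}

def is_blocked(message: str) -> bool:
    """Flag a message if any alphabetic token matches the banned list exactly."""
    tokens = re.findall(r"[a-z]+", message.lower())
    return any(token in BANNED_WORDS for token in tokens)

print(is_blocked("you absolute noob"))   # True  – an exact match is caught
print(is_blocked("you absolute n00b"))   # False – one character swap slips through
print(is_blocked("you absolute ɴoob"))   # False – a Unicode look-alike evades it entirely
```

However long the list grows, every new spelling variant has to be added by hand, which is exactly the arms race the paragraph above describes.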
Years ago, there simply wasn’t this volume of data and chat logs to contend with. Today, PUBG Mobile has been downloaded over a billion times, Minecraft has 600 million player accounts, and Fortnite has 350 million registered players. The pandemic has only acted as a catalyst for the dramatic rise in player numbers, drastically escalating the moderation challenge.
The current cutting edge in moderation is context-aware advanced AI, which is built and trained on a community’s own dynamics, so the tool is bespoke to that specific community. This kind of AI learns context within the text, meaning it can moderate based on intent rather than just recognising words and phrases, and it can moderate in multiple languages because it doesn’t rely on dictionaries.
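As a rough illustration of the difference, the sketch below scores whole messages with an off-the-shelf transformer classifier instead of looking words up in a list. It uses the open-source unitary/toxic-bert model from Hugging Face purely as a public stand-in – not Utopia Analytics’ system, which is trained on each community’s own data:

```python
# Illustrative sketch: context-aware scoring of chat messages with a
# transformer classifier rather than a banned-word dictionary.
# unitary/toxic-bert is a public stand-in model, not Utopia Analytics' system.
from transformers import pipeline

classifier = pipeline("text-classification", model="unitary/toxic-bert")

messages = [
    "gg everyone, well played",               # friendly – should score low
    "uninstall the game, you're worthless",   # abusive without any slur words
    "u r w0rthless lol",                      # obfuscated spelling, same intent
]

for msg in messages:
    result = classifier(msg)[0]  # top label and its confidence score
    print(f"{msg!r}: {result['label']} ({result['score']:.2f})")
```

Because the model generalises from patterns in its training data, misspellings and novel phrasings don’t require a manual dictionary update – though, as described above, a production system would be trained on the community’s own chat rather than a generic public model.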
Key takeaways
Of the $180.3 billion in revenue generated by games in 2021, over 50 per cent came from the mobile games industry, which is now worth more than all other gaming platforms combined. As developers bring more multiplayer features, in-game chat, and cross-platform functionality to mobile games, publishers must ensure they have the right moderation strategies in place or risk leaving players open to abuse.
Gamers are passionate about the games they play. For many, gaming doesn’t just provide fleeting online interactions; it’s a vital part of their social lives, or a way to relax and unplug at the end of a long day. Toxicity creeping into their favourite hangout space considerably disrupts engagement, so it’s essential these communities are protected.
The responses and player testimonies captured by our study paint a picture of an industry that has allowed toxic behaviours to become deeply ingrained in gaming culture. The industry needs to do more to build and foster safe online environments and protect the mental health and well-being of the people who support it.