Toxicity in many online games, and in the gaming community at large, can kill a game's success: 81% of multiplayer gamers experience some form of harassment, a number that has risen in recent years.
While it may seem like harassment is simply part of the gaming experience, it doesn't have to be this way. Building Trust & Safety practices in right from your alpha or beta launch can have a huge impact on the ongoing success of your community and game, and it will feel better for your players, too.
Here are some of the ways T&S can help your game succeed.
People leave games that are toxic. A quarter of players say they have quit a specific game because of negative experiences in it. Notably, they don't stop gaming altogether; they simply move to another platform or game.
Every toxic interaction you allow on your platform is a tacit invitation for other players to leave. When enough of them do, you no longer have a player base.
Player toxicity doesn't just take a toll on the players who experience it personally; it also drains your internal resources. If your player support (PS) team is always working through a backlog of abuse and harassment claims, they don't have the time for more meaningful, longer-term work, and handling that material all day is emotionally draining. Along with player churn, you will also start to see higher turnover on your support team.
Giving your PS team the room to focus on conversations and issues beyond toxicity and moderation means you will likely deliver better one-off support experiences to players who reach out about other problems. Following Trust & Safety best practices can be a lifesaver here.
We all know how valuable winning a net-new player to a game or series can be. Newer players, however, are far more likely to be driven away by toxic behavior.
Imagine this: you turn on a new game, you are fully immersed in the beautiful scenery, and are just starting your first quest, and then someone comes up, kills you, and camps your corpse. As a new player, this can be discouraging, and you may even believe that this is how the rest of the game will go for you.
Gamers who have played a game before are much more resilient to abuse, but a game can't survive on returning players alone.
It's a good look to care about the wellness of your player community. The community that builds itself around a specific game or fandom can be truly remarkable; take a look at all of the communities that have formed around Super Smash Bros., for instance. Players have the capacity to form lifelong, meaningful friendships within your game.
That said, toxicity within your game hurts the players who care about it most. Many players report heightened anxiety and lower self-esteem as outcomes of toxicity in the games they play, and an ADL survey in 2020 found that 11% of players reported depressive or suicidal thoughts after experiencing harassment in an online game.
You owe it to your players to care about their mental health.
It is a commonly accepted truth that if you allow people to behave badly, they will continue to do so, and the anonymity of online games makes it even easier for this to happen. After all, how will players learn to stop if there are no personal repercussions for bad behavior? Taking action on unwanted behavior right at the start pulls the problem out at the root.
Riot and Valve illustrate two different ways to handle moderation. Riot takes a hard line in League of Legends: penalties escalate with each infraction, up to permanent bans. On the flip side, Valve uses a community moderation method called Overwatch. The system assigns players a behavior score, and players with a high behavior score are asked to review suspected cases of poor behavior and pass judgment. Players who have been flagged have restricted access to the platform until their case is reviewed.
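To make the community review model concrete, here is a minimal sketch of a behavior-score-gated review queue. All names, thresholds, and rules here are illustrative assumptions, not Valve's actual implementation:

```python
from dataclasses import dataclass, field

REVIEWER_SCORE_THRESHOLD = 8000  # illustrative cutoff for "trusted" reviewers
GUILTY_RATIO_TO_ACT = 0.8        # fraction of guilty verdicts needed to convict

@dataclass
class Player:
    name: str
    behavior_score: int = 10000
    restricted: bool = False  # flagged players get degraded platform access

@dataclass
class Case:
    suspect: Player
    verdicts: list[bool] = field(default_factory=list)

class ReviewQueue:
    def __init__(self) -> None:
        self.open_cases: list[Case] = []

    def flag(self, suspect: Player) -> Case:
        """Flag a player: restrict their access and open a case for review."""
        suspect.restricted = True
        case = Case(suspect=suspect)
        self.open_cases.append(case)
        return case

    def submit_verdict(self, reviewer: Player, case: Case, guilty: bool) -> None:
        """Only players with a high behavior score may pass judgment."""
        if reviewer.behavior_score < REVIEWER_SCORE_THRESHOLD:
            raise PermissionError("behavior score too low to review cases")
        case.verdicts.append(guilty)

    def resolve(self, case: Case, min_verdicts: int = 5) -> None:
        """Once enough verdicts are in, either penalize or reinstate."""
        if len(case.verdicts) < min_verdicts:
            return  # keep collecting verdicts
        if sum(case.verdicts) / len(case.verdicts) >= GUILTY_RATIO_TO_ACT:
            case.suspect.behavior_score = 0  # penalized; access stays restricted
        else:
            case.suspect.restricted = False  # cleared; full access restored
        self.open_cases.remove(case)
```

The appeal of this design is that moderation capacity scales with your player base, and the behavior-score gate keeps bad actors from judging their own kind.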
There are a number of ways players can protect themselves against bad behavior, but ultimately tasking them with self-protection degrades their experience and can cost you players. Relying on players to report in-game toxicity is unlikely to have much of an impact, either: the same ADL study found that less than half of players use in-game tools to report others.
Establishing a dedicated T&S team ensures your players get the best experience possible. Consider augmenting that team with AI tools such as ToxMod, which specializes in analyzing voice chat.
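As a rough illustration of how automated detection can feed a human T&S team, here is a sketch of a triage pipeline. The thresholds and the `score_toxicity` stand-in are hypothetical placeholders for whatever moderation model or vendor API you integrate; this is not ToxMod's actual interface:

```python
from dataclasses import dataclass

AUTO_ACTION_THRESHOLD = 0.95   # illustrative: near-certain violations are actioned automatically
HUMAN_REVIEW_THRESHOLD = 0.60  # illustrative: ambiguous cases go to the T&S team

@dataclass
class ChatEvent:
    player_id: str
    transcript: str  # e.g. speech-to-text output from a voice chat clip

def score_toxicity(transcript: str) -> float:
    """Stand-in scorer; a real system would call your moderation model or a vendor API."""
    flagged = {"idiot", "trash", "uninstall"}
    hits = sum(word in flagged for word in transcript.lower().split())
    return min(1.0, hits / 3)

def triage(event: ChatEvent, review_queue: list[ChatEvent]) -> str:
    """Route a chat event: act automatically, queue for a human, or do nothing."""
    score = score_toxicity(event.transcript)
    if score >= AUTO_ACTION_THRESHOLD:
        return "auto_mute"          # immediate, reversible action
    if score >= HUMAN_REVIEW_THRESHOLD:
        review_queue.append(event)  # a T&S agent makes the final call
        return "queued_for_review"
    return "no_action"

# Example: benign chatter passes through untouched.
queue: list[ChatEvent] = []
print(triage(ChatEvent("p1", "gg well played"), queue))  # -> "no_action"
```

Keeping the automatic action reversible (a mute rather than a ban) leaves room for the human team to correct false positives.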
Having a team dedicated to ensuring that your players have the best, safest experience is one of the most impactful ways to create success in your game.