By Andreas Gkoumplias
Communities in video games are fundamental to the gaming experience. Beyond the main gameplay, it’s the players, alongside the game developers, who support and cultivate the game’s concept, keeping it alive. Particularly in online multiplayer games, there are numerous instances where real-life friendships are formed through in-game interactions. However, along with the opportunities these communities create, issues have arisen throughout the years as well, the most serious and the most difficult to completely eradicate being toxic behavior from certain players.
In a gaming context, toxic behavior is defined as any deliberate attempt to spoil other players’ experiences, whether by not contributing to the game’s objectives, using offensive language, or demonstrating a generally negative attitude. As is fairly understandable, there are two main categories of video games: games you play alone (single-player games) and games you play with others (multiplayer games, such as MMORPGs, or Massively Multiplayer Online Role-Playing Games). The latter category is the one that suffers the most from toxicity incidents.
There are myriad ways players can experience toxic behavior in games. Some examples are: discrimination of any kind (race, color, ethnicity, etc.), flaming (offensive language), and hate raids (when people encourage others to flood a chat, mostly a streamer’s, with offensive messages), as well as more serious situations like swatting (falsely reporting an emergency so that armed police are sent to a person’s home, often while the victim is broadcasting live) and doxing (publishing an individual’s private and sensitive information).
So, if the issue of toxicity is readily identifiable, why is it not equally manageable? In recent years, there have been quite a few attempts to at least limit these types of behavior: stricter restrictions on accounts involved in such incidents, permanent bans, and so on. But it is not as easy as it sounds. Simply put, in large gaming communities, such as those of Call of Duty, Counter-Strike, and League of Legends, it’s nearly impossible to monitor every player for signs of toxic behavior. The player base is so vast that only a handful of incidents can be handled at a time. Of course, other tactics have been implemented against toxicity. In League of Legends, for example, Riot Games (the developer) has been giving small extra perks to those who stay positive instead of turning toxic, and has also added the option to mute other players in the lobby and in game. The catch is that in games where the key element is being able to properly communicate and coordinate with your team, needing a mute option at all, both in the lobby, where you have limited time to strategize, and in game, where communicating even the essentials is indispensable, only underscores that toxicity is at an all-time high.
But if we look at it from a different perspective, it sometimes seems that those developing the games don’t care enough to remove toxicity altogether, and that is partly what drives old-school players away. Simply put, when you are compelled to report someone after a game and that report has no meaningful impact and does not result in a temporary or permanent suspension, it shows two things: first, that actions DON’T have consequences, and second, that big companies potentially care more about the money they are going to make than about having a pleased and loyal player base.
There is no definitive ‘cure’ for the toxicity plaguing today’s games. This is understandable, given that it’s impossible to actively control a player’s mood to prevent them from ruining the experience for others. And as a person who has been actively involved in multiplayer games for at least a decade, I can easily confirm that toxicity has spiked over the years. Sure, developers and companies are trying their best to limit toxic behavior and restrict those who take the fun out of playing video games, but it is my understanding that in order to achieve real, situation-changing results, we all, developers and players alike, still have a long way to go.