Have you played Project Winter yet? It’s an 8-person multiplayer game from Other Ocean focused on social deception and survival, in which communication and teamwork are essential to the survivors’ ultimate goal of escape. Players must work together to repair a radio and call for rescue while surviving harsh weather and a hostile environment, but beware! One or more players may turn out to be traitors trying to sabotage the mission. Mixing teamwork, strategy, and subterfuge, players must use their wits and resources to outsmart their opponents and survive until they are rescued.
It’s clear from that description that teamwork and communication are core facets of gameplay in Project Winter. So what happens when a player faces hate, harassment, or toxicity in the middle of all that?
There are a few likely outcomes for any multiplayer game in this situation. The player may choose to simply quit playing to avoid any future harmful interactions. Or, they might file a report against the offending player to flag the bad behavior to the game’s moderation team. More likely than not, though, the toxic behavior will simply be ignored and accepted as “par for the course,” something that can’t be avoided. With no action taken against the bad actors, the community can slowly morph into a negative and hostile environment that makes the game unenjoyable and unwelcoming for many players.
Project Winter has an excellent set of community guidelines, with rules that include “harassment of any form will not be tolerated” and “be excellent to each other.” But how can the community team properly enforce those rules when player reporting proves to be insufficient and much of the harmful behavior goes unreported?
With ToxMod, Other Ocean is arming the Project Winter team with the tools they need to address harms and toxicity more effectively. ToxMod employs proactive voice moderation, detecting toxicity in real time thanks to our machine learning technology, which is built to understand the nuances of voice communication. It goes beyond player reports by providing much-needed context around a harmful incident, allowing moderators to make more informed decisions and operate far more efficiently.
With ToxMod, the Project Winter team will see these benefits:
- Faster response time: Proactive voice moderation software like ToxMod automatically detects and flags toxic speech in real time, enabling an immediate response to toxic behavior. In contrast, reactive moderation relies on players to report toxic behavior and requires moderators to spend time investigating the context of each submission.
- Increased accuracy: ToxMod’s machine learning models have proven highly accurate, reducing the risk of false positives and false negatives. Though we often say ToxMod acts like another player in your game who files reports on bad behavior, there’s no risk of ToxMod reporting another player just because they lost!
- Improved player experience: ToxMod enables a safer and more positive environment for players, reducing the negative impact of toxic behavior.
- Less busywork for moderators: With ToxMod identifying toxic behavior and providing the context needed to decide what action to take against hate and harassment, the team behind Project Winter can redirect their energy from manually sifting through reports toward building a positive and engaging community.
Promoting a positive and respectful community is crucial for fostering teamwork and cooperation, which are key elements in a survival game like Project Winter where players must rely on each other to achieve their objectives. By reducing toxicity through proactive voice moderation, players can have a more enjoyable and fulfilling experience and help create a stronger, more collaborative gaming community. We’re thrilled to be working with the team at Other Ocean to protect the Project Winter player base and ensure the game stays fun for everyone.