ToxMod makes voice chat safe
The only proactive voice chat moderation solution purpose-built for games
Built on advanced machine learning technology and designed with player safety and privacy in mind, ToxMod triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and enables moderators to quickly respond to each incident by supplying relevant and accurate context.
VOICE-NATIVE
INTELLIGENT
SECURE
PLUG-AND-PLAY
FLEXIBLE
DETAILED
Go beyond player reports and take the burden off your users
In contrast to reactive player reports, which rely on players to report bad behavior themselves, ToxMod is the only voice moderation solution in games today that enables studios to respond proactively to toxic behavior as it happens, preventing harm from escalating.
67%
A majority of multiplayer gamers (67%) say they would likely stop playing a multiplayer game if another player were exhibiting toxic behavior.
83%
Five out of six adult gamers (83%) report facing toxicity online. This spans every player demographic, though harassment often falls most heavily on underprivileged groups.
How ToxMod works
First, ToxMod triages voice chat data to determine which conversations warrant investigation and analysis (a brief illustrative sketch follows the list below).
- Triaging is a crucial component of ToxMod’s efficiency and accuracy, flagging the most pertinent conversations for toxicity analysis and removing silence or unrelated background noise.
- Unlike text, voice data is labor-intensive and often cost-prohibitive to process, necessitating accurate and reliable filtering.
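To make the filtering idea concrete, here is a minimal sketch of one common pre-filtering approach: an energy-based check that drops silent frames before any expensive analysis runs. This is purely illustrative and is not ToxMod's actual triage logic; the sample rate, frame size, and silence threshold are assumptions chosen for the example.

```python
# Illustrative only: a simple energy-based pre-filter, not ToxMod's triage system.
# Assumes 16 kHz mono audio supplied as float samples in [-1.0, 1.0].
import math

FRAME_SIZE = 1600              # 100 ms of audio at the assumed 16 kHz sample rate
SILENCE_RMS_THRESHOLD = 0.01   # frames quieter than this are treated as silence

def rms(frame):
    """Root-mean-square energy of one audio frame."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))

def frames_worth_analyzing(samples):
    """Yield only frames loud enough to plausibly contain speech.

    Deeper toxicity analysis is comparatively expensive, so discarding
    silence and near-silence up front keeps the pipeline affordable.
    """
    for start in range(0, len(samples) - FRAME_SIZE + 1, FRAME_SIZE):
        frame = samples[start:start + FRAME_SIZE]
        if rms(frame) >= SILENCE_RMS_THRESHOLD:
            yield frame
```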
Second, ToxMod analyzes the tone, context, and perceived intention of those filtered conversations using its advanced machine learning processes (see the sketch after this list).
- ToxMod’s powerful toxicity analysis assesses the tone, timbre, emotion, and context of a conversation to determine the type and severity of toxic behavior.
- ToxMod is the only voice moderation tool built on advanced machine learning models that go beyond keyword matching to provide true understanding of each instance of toxicity.
- ToxMod’s machine learning technology can understand emotion and nuance cues to help differentiate between friendly banter and genuine bad behavior.
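As a rough illustration of what going beyond keyword matching can look like, the hypothetical sketch below blends a lexical signal with vocal cues such as anger and laughter into a single severity estimate. The feature names, term list, and weights are invented for this example and do not describe ToxMod's actual models.

```python
# Illustrative only: not ToxMod's models. Shows how vocal cues can temper a
# purely lexical signal so banter and genuine harassment score differently.
from dataclasses import dataclass

@dataclass
class ConversationFeatures:
    transcript: str        # assumed output of a speech-to-text step
    anger_score: float     # 0..1, assumed output of a vocal-emotion model
    laughter_score: float  # 0..1, a cue that the exchange may be friendly banter

FLAGGED_TERMS = {"example-slur"}  # placeholder; real deployments use curated, localized lists

def severity(features: ConversationFeatures) -> float:
    """Blend lexical and vocal cues into a 0..1 severity estimate."""
    has_flagged_term = any(word in FLAGGED_TERMS for word in features.transcript.lower().split())
    base = 0.7 if has_flagged_term else 0.2
    # Heated delivery pushes severity up; shared laughter pulls it down.
    score = base + 0.3 * features.anger_score - 0.4 * features.laughter_score
    return max(0.0, min(1.0, score))

# The same words score very differently depending on how they are delivered.
print(severity(ConversationFeatures("example-slur again?!", anger_score=0.9, laughter_score=0.0)))  # ~0.97
print(severity(ConversationFeatures("example-slur again?!", anger_score=0.1, laughter_score=0.9)))  # ~0.37
```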
Third, ToxMod escalates the voice chats deemed most toxic and empowers moderators to efficiently take action to mitigate bad behavior and build healthier communities (an illustrative sketch follows the list below).
- ToxMod’s web console provides actionable, easy-to-understand information and context for each instance of toxic behavior.
- Moderators and community teams can work more efficiently, allowing even small teams to manage and monitor millions of simultaneous chats.
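To illustrate the escalation step, here is a minimal sketch that orders analyzed incidents worst-first so a small team always reviews the most severe clip next. The incident fields and scores are hypothetical and do not reflect ToxMod's actual console or data model.

```python
# Illustrative only: a worst-first escalation queue, not the ToxMod web console.
import heapq

def escalate(incidents):
    """Yield incidents ordered by descending severity.

    Each incident is assumed to be a dict carrying a 0..1 'severity' plus
    whatever context the analysis stage attached (clip id, excerpt, speakers).
    """
    # heapq builds a min-heap, so negate severity to pop the most severe first;
    # the index breaks ties without comparing the dicts themselves.
    heap = [(-item["severity"], index, item) for index, item in enumerate(incidents)]
    heapq.heapify(heap)
    while heap:
        _, _, incident = heapq.heappop(heap)
        yield incident

# Hypothetical usage: moderators see the most severe incident first.
sample = [
    {"severity": 0.41, "clip_id": "clip-002", "context": "heated but mutual trash talk"},
    {"severity": 0.92, "clip_id": "clip-001", "context": "repeated slurs aimed at one player"},
]
for incident in escalate(sample):
    print(incident["clip_id"], incident["severity"])
```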
Ready for enterprise
ToxMod provides best-in-class community health and protection services for AAA studios and indies alike
Modulate’s support team goes above and beyond the call of duty for our customers, and Modulate’s technical teams are available 24/7 to help address any critical issues in real time.
ToxMod can distinguish real harms in multiple languages, and even keep track of context in mixed-language conversations.
Modulate's team supports customers in writing clear Codes of Conduct, producing regular Transparency Reports, and conducting regular Risk Assessments.
Book a Demo
Learn how ToxMod can help you protect your players and empower your community and moderation teams.