Schell Games Elevates Player Safety with ToxMod
February 13, 2025

The Challenge

With 90% of Among Us VR players relying on voice chat rather than pre-set text messages in quick chat, it was critical for Schell Games to ensure a safe and inclusive environment. However, relying solely on player-generated reports proved insufficient due to inaccuracies and the time-intensive nature of manual reviews. Moderating all in-game audio for code of conduct offenses was also infeasible given the sheer volume: nearly 50,000 hours of audio per month.

Recognizing that voice chat was both a popular feature and a key selling point for the VR experience, Schell Games needed a scalable solution to balance player safety with the demands of real-time communication.

The Solution

Schell Games integrated ToxMod, Modulate's AI-driven voice moderation tool, to moderate player voice chat interactions accurately and quickly at immense scale, allowing moderators to flag and address toxic behavior.

ToxMod empowered moderators to efficiently process large volumes of audio data, prioritizing critical infractions. With ToxMod analyzing and flagging likely code of conduct violations, moderators could focus on the most severe cases, such as hate speech and bullying. The result was a streamlined process, reduced staffing needs, and improved player safety.

The Results

The implementation of ToxMod transformed moderation in Among Us VR. Key outcomes include:

  • Efficiency Gains: ToxMod processes over 50,000 hours of audio monthly, significantly reducing the need for additional staff.
  • High Accuracy: ToxMod achieved at least 95% detection accuracy for severe infractions, improving as the AI model was refined to match player behavior.
  • Scalable Moderation: Moderators review an average of 30 cases per hour, addressing 2,000–3,000 problematic players weekly.
  • Focused Actions: Approximately 80% of moderator actions focus on racial or cultural hate speech, with additional focus on sexual or gender hate speech and bullying. 

“Without ToxMod, we couldn’t moderate the community as efficiently,” said a Schell Games moderator. “It pinpoints the important stuff that needs our attention.”

A Paradigm Shift

Through continuous refinement of the AI model, ToxMod has become an integral part of Schell Games' moderation workflow, automating obvious infractions and allowing human moderators to focus on nuanced cases.

“Constant review and adjustments have made ToxMod a reliable tool,” said Laura Hall, Senior Player Support Specialist at Schell Games. “It helps us efficiently monitor and remove the most toxic utterances in the game.”

With ToxMod, Schell Games has set a new standard for scalable and effective content moderation, demonstrating how AI-driven solutions can create safer and more inclusive gaming environments.

Industry

Gaming

Team Size

50+ Employees
