Modulate prepares platforms for regulatory compliance
Experts in online safety policy for any platform
Regulations
Digital Services Act

European Union
The Digital Services Act (DSA) is a European law that requires online platforms to combat harmful and illegal behavior and to foster transparency across the industry.
The DSA outlines a variety of obligations. Among the most notable are regular transparency reports disclosing the effectiveness of each platform's moderation efforts, and a requirement to ban users who repeatedly upload or share illegal content.
Effective Date
February 17, 2024
Potential Financial Impact
Up to 6% of global annual revenue
Requirements
- Transparency reports
- Banning repeat toxic users
- Prompt, transparent user reports and appeals process
How Modulate Helps
ToxMod can identify all the worst behavior across your platform, helping you stay compliant and genuinely improve your community experience.
Children's Online Privacy Protection Act

United States
The Children's Online Privacy Protection Act (COPPA) imposes controls on online platforms that are targeted toward kids or that host a significant number of underage users.
Among other requirements, COPPA requires parental consent before platforms can process PII (personally identifiable information) of children under the age of 13. COPPA is designed specifically to protect children, and allows for looser interpretations when it’s in the best interest of the child.
"COPPA 2.0" is currently on the Senate floor and would extend these protections to teens as well.
Effective Date
April 21, 2000
Potential Financial Impact
$43,000 per impacted child
Requirements
- Obtain parental consent for any data collected
- Protect all collected data securely
- Minimize data collected
How Modulate Helps
Modulate limits our collection of PII wherever possible, and we restrict all collected data to be used solely for moderation that improves user safety. We're certified COPPA-compliant.
UK Online Safety Act

United Kingdom
Several countries currently have online safety laws implemented or under consideration. The UK's Online Safety Act requires platforms to proactively remove illegal content, provide expansive explanations in their terms of service regarding moderation practices, and limit the risk of underage users accessing adult or otherwise harmful content.
Effective Date
October 26, 2023
Potential Financial Impact
Up to 10% of global annual revenue
Requirements
- Proactively remove illegal content
- Limit children from accessing dangerous content
- Enable transparent reporting and appeals flows
How Modulate Helps
ToxMod proactively monitors conversations across the ecosystem and provides a categorized, prioritized list of the worst harms back to your team, allowing you to engage with the most severe content first.
eSafety Industry Codes

Australia
Australia's Online Safety Act established the office of the eSafety Commissioner, which is solely devoted to enforcing online safety standards.
The eSafety Commissioner is empowered to publish "Industry Code" expectations for a variety of industries, including game developers. The Industry Code relevant to games was initially rejected and is now being reworked to require more proactive efforts by platforms.
Effective Date
January 23, 2022
Potential Financial Impact
Variable, at the eSafety Commissioner's broad discretion
Requirements
- Automated tools to reduce harmful content like CSAM
- Robust reporting options for users
How Modulate Helps
ToxMod can ensure studios become aware of illegal content even if players don’t report it, and can also augment player reports with substantially more context to help platforms take action efficiently and consistently.
Code of Practice for Online Safety

Singapore
The Singapore Code of Practice for Online Safety is a set of guidelines aimed at promoting safe and responsible online behavior. It outlines measures for online service providers to prevent and combat harmful content and activities on their platforms while ensuring user privacy and freedom of expression.
Effective Date
July 18, 2023
Potential Financial Impact
S$1 million fine, plus S$100,000 per day
Requirements
- Minimize exposure to CSAM and terror content proactively
- Respond to user reports in a timely manner
- Transparent updates to reporting users about actions taken
How Modulate Helps
ToxMod's proactive moderation technology helps studios surface CSAM and terror content in near real time, and connects with player report systems for speed and transparency.