Privacy and Data Management
Privacy
ToxMod manages three different classes of data - Live Operations data, Player Profile data, and Aggregated Statistics. We’ll break down each of these data categories below, but first, a few points which are consistent for all three types of data:
- All data is encrypted both at rest and in transit.
- No data is ever shared, in individual or aggregate form, with any party except you and Modulate.
- ToxMod never collects any personally identifiable information (username, phone number, address, etc.) about any players. All of ToxMod’s data for each player is linked to an anonymous and unique ID within our system. We provide tools for you to connect these IDs with information from your own internal player databases if necessary for moderation (a minimal sketch of such a linkage follows this list), but at no point do we ever see or store any of that personal information.
- As necessary for compliance with GDPR or other privacy regulations, Modulate can delete all data belonging to a specific player, or deliver a report of all such data, upon request. (More on our compliance with privacy regulations can be found below.)
- Any data which ToxMod collects may be used, in aggregated and anonymized form, to further improve Modulate’s services. Any individual player can exempt their data from inclusion within these training datasets upon request.
- Modulate will never create a VoiceWear voice skin of any individual using data collected by ToxMod without explicit consent from the individual.
- In some rare cases, Modulate may retain data beyond the durations listed below to comply with certain regulations or law enforcement requests, such as in the event of suspected child abuse or grooming. If the suspicion is later cleared by manual review, all such data is immediately deleted.
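To illustrate the ID linkage mentioned above, here is a minimal sketch of a lookup that could live entirely in your own systems; the table contents, names, and function are hypothetical and do not reflect any actual Modulate interface.

```python
from typing import Optional

# Hypothetical linkage kept entirely in YOUR systems: ToxMod's anonymous
# player ID on one side, your internal account ID on the other.
# Modulate never sees or stores the right-hand values.
ID_LINKAGE = {
    "toxmod-anon-7f3a9c": "studio-account-102394",
    "toxmod-anon-b21e44": "studio-account-558201",
}

def resolve_for_moderation(anonymous_player_id: str) -> Optional[str]:
    """Look up your internal account for a flagged anonymous ID so your
    moderators can act on it. Illustrative sketch only."""
    return ID_LINKAGE.get(anonymous_player_id)
```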
Modulate’s secure services run within an AWS Virtual Private Cloud (VPC), with all traffic into, out of, and within the VPC being encrypted. All data which Modulate collects and stores is kept within this VPC under state-of-the-art encryption at rest. Modulate also employs additional AWS services to monitor our systems for breaches, security vulnerabilities, or other anomalous behavior in real time, and we undergo periodic cloud security audits through Amazon to ensure we are utilizing best practices for security.
For more information about AWS’s security capabilities, please visit their website.
Each of Modulate’s systems has been developed from the ground up with privacy regulations in mind. As a result, all of our services respect standard requirements including:
- Breach notification: We use automated systems, managed both by us and by AWS, to monitor our systems for breaches, and can notify you within 24 hours of any breach or vulnerability.
- Consent: We never collect data directly from players; we only do so within the context of your game or platform, so deploying our services may expand the scope of consent you need to request from your players. We can supply recommended language for amending your Privacy Policy and consent requests as necessary.
- Right to Access: A report of all data we have for any given player can be provided upon request.
- Right to be Forgotten: We can completely remove all data corresponding to an individual user upon request without requiring downtime or retraining any models (a hypothetical request sketch follows this list).
- Data Portability: None of our data is stored in proprietary formats. Some neural network predictions are stored in vector format without a precise verbal meaning attached, but our reports put this information into context to ensure it is well understood.
- Personal Data Use: No personal data is ever disclosed to any party aside from you and Modulate. In particular, no personal data is sold to third parties for any reason.
- Equity: Players who choose to opt out of having their data stored or used for training to improve Modulate’s services are not treated differently in any way by any of our services compared to other players.
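As an illustration of how Right to Access and Right to be Forgotten requests could be wired into your own tooling, the sketch below shows a hypothetical client that exports or deletes all data for a single anonymous player ID. The endpoint paths, payloads, and class name are assumptions made for illustration; they are not a published Modulate API.

```python
import json
import urllib.request

class PrivacyRequestClient:
    """Hypothetical client for data-subject requests keyed by anonymous
    player ID. Endpoints and payloads are illustrative assumptions, not
    a documented Modulate API."""

    def __init__(self, base_url: str, api_key: str):
        self.base_url = base_url.rstrip("/")
        self.api_key = api_key

    def _post(self, path: str, payload: dict) -> dict:
        req = urllib.request.Request(
            f"{self.base_url}{path}",
            data=json.dumps(payload).encode("utf-8"),
            headers={"Authorization": f"Bearer {self.api_key}",
                     "Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    def export_player_data(self, anonymous_player_id: str) -> dict:
        # Right to Access: request a report of all stored data for this ID.
        return self._post("/privacy/export", {"player_id": anonymous_player_id})

    def delete_player_data(self, anonymous_player_id: str) -> dict:
        # Right to be Forgotten: request removal of all stored data for this ID.
        return self._post("/privacy/delete", {"player_id": anonymous_player_id})
```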
Additional information can be provided upon request to demonstrate our compliance with any specific regulations.
Live operations data is anything which is generated live while the player is chatting, and is securely deleted by default after 35 days. This data includes:
- Voice chat clips from in-game voice chat (including some derivations from these clips, such as detected emotions and text transcripts)
- Partially anonymized IP addresses (we zero out the last octet for privacy, but retain the rest for rough geolocation, as some moderation heuristics depend on location; see the sketch after this list)
- Current game state (score, deaths, round length, etc.)
- Party characteristics (are the players in the chat all friends? Is this a private chat? etc.)
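To make the IP handling above concrete, here is a minimal sketch of zeroing out the last octet of an IPv4 address while keeping the remaining octets for coarse geolocation. The function name is our own; this is an illustration of the idea, not Modulate’s actual implementation.

```python
import ipaddress

def partially_anonymize_ipv4(raw_ip: str) -> str:
    """Zero out the last octet of an IPv4 address, keeping the first
    three octets for rough geolocation. Illustrative sketch only."""
    octets = str(ipaddress.IPv4Address(raw_ip)).split(".")  # validates the input
    octets[-1] = "0"
    return ".".join(octets)

# Example: "203.0.113.57" -> "203.0.113.0"
print(partially_anonymize_ipv4("203.0.113.57"))
```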
Live operations data is often processed natively on the player’s device, with only flagged audio that is likely to be relevant to a harm being sent to our servers. This on-device processing is optional, and some customers prefer to send ToxMod all audio, but even in that case, Modulate will immediately separate out data which is relevant to a harm from conversations which are completely benign. Benign conversations will not be accessible to anyone before they are fully deleted. Conversations likely to be related to a harm, however, will be made available to your moderation team (and, as needed, Modulate’s support staff, who will only view data when necessary for debugging or error correction purposes).
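A minimal sketch of that on-device triage flow appears below: audio is scored locally, and only clips whose harm score crosses a threshold are forwarded. The scoring function, threshold value, and upload call are placeholders we have assumed for illustration, not Modulate’s actual client code.

```python
HARM_THRESHOLD = 0.7  # illustrative cutoff, not a real ToxMod parameter

def triage_clip(audio_clip: bytes, score_harm, upload_flagged) -> bool:
    """Score a voice clip on-device and forward it only if it appears
    relevant to a harm; benign audio never leaves the device.

    `score_harm` and `upload_flagged` stand in for the on-device model
    and the upload path; both are assumptions for illustration.
    """
    harm_score = score_harm(audio_clip)  # local model inference, returns 0..1
    if harm_score >= HARM_THRESHOLD:
        upload_flagged(audio_clip, harm_score)  # only flagged audio is sent
        return True
    return False  # benign clip: nothing is uploaded
```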
The exact PII status of voice clips is not well defined in current regulations. Voiceprints are considered sensitive biometrics, and ToxMod at no point generates or stores voiceprints from any user. Beyond this, most regulations consider voice clips to be problematic only when combined with other PII (which Modulate does not collect). That said, we recommend consulting a legal expert before deploying ToxMod to ensure your privacy policy clearly indicates the kinds of data, including voice clips, that ToxMod will collect.
Player profile data includes any persistent information we need to store in order to build a robust understanding of the individual’s behavior over time for moderation purposes. This data includes:
- Our prediction of the player’s general geographical region, as derived from comparing the partially anonymized IP addresses collected in live operations data over time
- Moderation history, such as previous strikes and bans. (To be clear, this only includes the metadata indicating that this event occurred. Other metadata, and other Live Operations data, is not stored as part of the Player Profile.)
- A history of which other players they have interacted with (again, only identified by anonymous Modulate player IDs)
- And finally, estimates generated by our machine learning systems (a sketch of the resulting profile record follows this list):
  - The probability between 0 and 1 that the user is underage
  - The probability between 0 and 1 that the user's voice is perceived as female by other users
  - The language most frequently spoken by the user on the given application
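To make the shape of this profile concrete, here is a minimal sketch of what such a record could look like. The class and field names are our own illustrative assumptions, not Modulate’s actual schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PlayerProfile:
    """Illustrative sketch of a per-player profile record; all field
    names are assumptions, not Modulate's actual schema."""
    anonymous_player_id: str                 # Modulate-internal ID, no PII
    predicted_region: str                    # derived from partially anonymized IPs over time
    moderation_history: List[str] = field(default_factory=list)      # e.g. ["strike", "ban"]
    interacted_player_ids: List[str] = field(default_factory=list)   # anonymous IDs only
    p_underage: float = 0.0                  # model estimate in [0, 1]
    p_voice_perceived_female: float = 0.0    # model estimate in [0, 1]
    primary_language: str = "en"             # most frequently spoken language in this application
```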
Player profile data is initially processed on the player’s device and stored on Modulate’s secure servers for up to 3 years after the user last accesses the relevant application online. As with live operations data, the only people who will have access to this information are your moderation team and Modulate’s support staff, who will only view data when necessary for debugging or error correction purposes.
Aggregated statistics are summaries across large batches of users which are used to give an at-a-glance understanding of the overall community. This data includes:
- The total number of instances of disruptive behavior per hour/day/week/month, segmented by the type of disruptive behavior, geographic region, and basic demographics (see the illustrative aggregation sketch after this list)
- The total number of each type of moderation action across the same distributions as described above
- The total number of downloads of, and average usage time for, each voice skin managed by VoiceWear, across the same distributions as described above
- The total number of instances of any word or topic tracked by VoiceVibe, across the same distributions as described above as well as being broken down by sentiment
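For illustration, the sketch below shows one way such aggregate counts could be bucketed by period, behavior type, and region without retaining any per-player data. The function and field names are our assumptions, not Modulate’s implementation.

```python
from collections import Counter
from typing import Dict, List, Tuple

def aggregate_incidents(events: List[Dict[str, str]]) -> Counter:
    """Count disruptive-behavior events per (week, behavior type, region)
    bucket. Illustrative sketch only; field names are assumptions."""
    counts: Counter = Counter()
    for event in events:
        key: Tuple[str, str, str] = (event["week"], event["behavior_type"], event["region"])
        counts[key] += 1
    return counts

# Example: two harassment events in the same week and region collapse into a single count of 2.
sample = [
    {"week": "2024-W01", "behavior_type": "harassment", "region": "NA"},
    {"week": "2024-W01", "behavior_type": "harassment", "region": "NA"},
    {"week": "2024-W01", "behavior_type": "hate_speech", "region": "EU"},
]
print(aggregate_incidents(sample))
```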
Aggregated statistics cannot be tied back to any specific players or small groups, and are stored for up to 10 years on Modulate’s secure servers to provide a record of high-level community behaviors for you over time.
Please reach out to us to discuss your security needs - we’re confident we can find a solution that works for everyone.