Over the last few years, we’ve seen a number of major shifts in gaming. Cloud gaming was introduced and, in the minds of many, faded back off the field shortly thereafter. COVID brought the value of online communities into sharp relief, bringing millions of new players to the industry. User-generated content drove player engagement with titles like Roblox and Fortnite, even as battle-tested formulas like Call of Duty and League of Legends continued to hold their own.
And, as the ADL reports, toxicity, harassment, and exploitation got worse along every conceivable dimension.
As we look ahead to 2023, there are a number of articles talking about exciting new trends in gaming experiences, discoverability, and even business models. But we’d like to look through a different lens and consider the fundamental questions of safety, privacy, and inclusivity. Will 2023 mark a change in direction, as platforms begin to address the problem more holistically? Or will the trends continue to get worse?
Overall, the Modulate team is optimistic; but that doesn’t mean we’re comfortable resting on our laurels. Let’s take a look at our top 5 predictions for 2023.
1. Multiplayer online games will take over a growing share of playtime across all demographics
Let’s start with an easy one. Look back ten or fifteen years, and a video game was akin to a museum: a prepared, curated experience that you’d walk through roughly in line with the designer’s planned path. If you brought friends, it was mostly to experience the art and story in parallel, rather than to do something together with them. And though you might be presented with a few choices of where to go next…usually you would end up in the same place regardless of which you picked.
Nowadays, top titles like Roblox, Fortnite, and Rec Room function more like a shopping mall or fairground. They host a variety of possible experiences, many of which change over time, so there’s always something new - and sometimes a patron with a great idea can actually become a contributor to the event instead. You and your friends also come to the forefront: you gather at the fair first and foremost to hang out, and secondarily to explore whatever experiences you like…with those experiences usually involving more interplay within your group than before. And crucially, players get tons of creative control over what they do next.
COVID only reinforced the importance of games as a space, not just a curated experience, and we’ve since seen the advent of more casual multiplayer games, such as Fall Guys and Among Us, that bring in a wider audience. In 2023, we expect to see more games emerge that are designed from the ground up around offering a social experience…and that will therefore need to pay closer attention to questions of safety.
2. A major game studio will pay upwards of $700M in penalties related to privacy and safety violations
The FTC recently announced that Epic will pay over half a billion dollars in relation to COPPA violations and unfair payment practices. This is part of an upward trend, as online platforms have been hit harder and harder for such violations, and the FTC’s statements made clear that they are, if anything, only planning to widen the net in the future. Couple this with incoming regulation - including a number of new proposals for improved child privacy as well as a variety of online safety proposals around the globe - and it’s clear that platforms will have their work cut out for them to live up to the duty of care that regulators and consumers demand.
Top studios are well aware of this scrutiny, and we’ve seen many reinforcing their Trust & Safety teams or even creating whole new divisions over the past few years. These platforms appreciate the need for change, and are hard at work updating their systems.
But unfortunately, they aren’t doing enough to actually deliver what consumers are asking for. Most platforms are trying to bind the wound by introducing or cleaning up user-report functionality. This is certainly valuable, but the simple fact is that it doesn’t solve the biggest issues. Less than 10% of players report the harm they experience, meaning user reports miss a massive amount of misbehavior; and even more importantly, children will never report that they are being groomed by a predator, because they simply don’t realize it’s happening.

The regulation we’re seeing emerge is quite clear about a comprehensive duty of care, requiring platforms to protect every user, not just those who raise their hands to submit a report. We’re confident that many studios will get there (see our next prediction), but some of the behemoths of the industry are unlikely to be willing or able to take the leap without a sharp prod from regulators. We’re hopeful that the industry overall looks much safer next year…but there will be a few stragglers, and we won’t be at all surprised if the FTC or other watchdogs look to make examples of them.
3. Moderation and safety will transition from ‘cost center’ to a core part of the design process
As we laid out above, failure to comply with privacy and safety regulation can result in costly penalties; but that’s not the only way it costs platforms. As online games have become more fundamentally social, they’ve also become more reliant on network effects and chat features to keep players invested. Players who have a toxic or disruptive experience in chat are likely to clam up to protect themselves, and that inevitably leads to churn - whether immediately, or later as their interest fades for lack of connection to the game. In the meantime, it also means less playtime, fewer purchases, and overall a quieter and less passionate community.
In other words, it means a less successful title.
Solving this problem isn’t as simple as “get rid of all the bad actors.” A lot of negative experiences online don’t come from someone trying to be toxic, but rather from confusion about community norms, lack of awareness of offensive topics, or other mistakes made by generally well-intentioned players. (This is why Apex Legends had such success cutting recidivism rates by simply explaining to players why their actions were wrong.) And further, of course, badness is relative; no matter how much effort we all devote to this problem, we’ll never be able to stop every single offensive word from being said without placing crippling levels of restriction on players.
But the good news is that you don’t have to solve it perfectly to have a massive impact. What truly matters is making players aware that you’re actually trying to solve the problem in the first place. When Rec Room first launched ToxMod, their community was well aware we couldn’t solve everything, but nonetheless reacted with joy:
"It’s so refreshing to have devs that care about user experience and not just money."
"To be honest I love this idea! What I take from this is that Rec Room isn’t giving up on their game completely and that’s just the best to see and know!"
"My kid has been playing Rec Room for a long time. So I got a vr so we could play together. Let's say I was literally stunned the first 10 minutes of playing. I'm thankful they are dealing with it."
We hope and expect that studios will be taking this factor into account in 2023. Features that act like a “black box” and struggle to build community trust - such as player reports with no clear follow-up for the reporting player, or text chat messages that just vanish with no explanation of why they were moderated - will need to take a backseat. They should be replaced by features designed not just to make the community safer, but to remind the community that the platform cares about safety in the first place. Real-time voice moderation like ToxMod achieves this both through PR and through bad actors being visibly removed from chat as they misbehave; other helpful approaches include in-game reminders of the code of conduct, player reputation systems that influence matchmaking, and features that reinforce sportsmanship (like Super Smash Bros. characters clapping for the winner at the end of a match).
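To make the reputation idea a little more concrete, here’s a minimal, purely hypothetical sketch of how a matchmaker might weigh a conduct score alongside skill. The names, numbers, and weighting below are our own illustration, not any studio’s actual system.

```python
# Hypothetical sketch: letting a player reputation score nudge matchmaking.
# All names and weights are illustrative only.
from dataclasses import dataclass

@dataclass
class Player:
    player_id: str
    skill: float        # e.g. an Elo-style rating
    reputation: float   # 0.0 (frequently actioned) .. 1.0 (clean record)

def match_cost(a: Player, b: Player, reputation_weight: float = 200.0) -> float:
    """Lower cost = better match. Skill difference dominates, but a large
    reputation gap adds a penalty, so well-behaved players tend to be
    grouped together rather than paired with frequent offenders."""
    skill_gap = abs(a.skill - b.skill)
    reputation_gap = abs(a.reputation - b.reputation)
    return skill_gap + reputation_weight * reputation_gap

# Example: pick the best opponent for `seeker` from a small queue.
seeker = Player("p1", skill=1500, reputation=0.9)
queue = [
    Player("p2", skill=1510, reputation=0.2),
    Player("p3", skill=1550, reputation=0.85),
]
best = min(queue, key=lambda candidate: match_cost(seeker, candidate))
print(best.player_id)  # "p3": slightly worse skill match, far closer reputation
```

The design choice here is simply that conduct becomes one more input to an existing optimization, rather than a separate punishment system - which is part of why players tend to experience it as the platform caring, not policing.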
4. Privacy and safety regulation will begin to merge
This is a bit of a technical point, but it’s an important one. For a long time, privacy regulation has demanded that studios absolutely minimize the player data they access; while safety regulation has demanded that studios review lots of player data to find the bad actors and mitigate their damage.
Obviously these two demands are in tension with each other, and most studios decided that the privacy angle had more teeth, so they prioritized that side of things. This resulted in our current status quo, where many platforms are passionately focused on things like GDPR compliance, but when asked about how they prevent child predation or protracted bullying campaigns, they basically answer “you can’t blame us since we don’t know for sure what’s happening on our platform.”
Regulators are starting to recognize this tension, though, and are working to close the gap. A major portion of the FTC’s COPPA claims against Epic was tied to the fact that Epic was putting the safety of kids at risk - highlighting that merely keeping kids anonymous while they get harassed isn’t good enough. New regulations under discussion, like the UK’s Online Safety Bill or the US’s Kids Online Safety Act, attempt to tie these two elements together even further.
This is a complicated issue, and regulation does tend to move slowly, so we don’t expect this topic to be fully resolved by 2024. But we do expect that 2023 will be a key year in which case law around these regulations will be established, and we predict that the legal consensus by the end of 2023 will change from the idea of “choosing” privacy or safety, to truly needing solutions that provide both.
5. Transparency reports will fill the gap left by game ratings, for now
Game ratings - things like “T for Teen” or “M for Mature” - served a crucial role in the early days of gaming, helping parents and sometimes players themselves identify which titles were a match for their interests and sensibilities. But as games have grown more social, a hole has emerged in this ratings process, all tied to four little words at the bottom of the rating notice: online interactions not rated.
This makes sense, in a way - how could the ESRB hope to make a commitment about the kind of experience players will have online, without knowing who they might end up speaking with or what might be said? This is a genuinely hard problem, and the ESRB certainly isn’t at fault for lacking an easy answer.
That said, it nevertheless creates a problem, because it leaves players and their guardians in the dark about what to expect from new titles - which, in turn, leads these folks to assume the worst of the games and avoid them just to be safe.
But not all games are created equal. Titles like Sky: Children of the Light, Rec Room, and Roblox are designed with an intense focus on safety, and while they can’t guarantee every player will have an ideal experience, they certainly can claim that players are comparatively safer than in more competition-oriented titles that don’t give the same attention to safety. Right now, though, there’s no way for these games to prove themselves, and their own claims about their safety largely fall on deaf ears.
The ideal solution to this problem would be the ability to rate online interactions. Tools like ToxMod make this possible - by monitoring the platform more comprehensively for safety issues, Modulate can actually objectively assess how safe or unsafe a platform is compared to others. But it will take some doing to actually institutionalize such ratings, set standards the industry fully agrees upon, and get buy-in from ratings agencies as well as game studios themselves, so we realistically expect this kind of work will take longer than the next year to bring fully to fruition.
In the meantime, though, there’s something that could help bridge the gap - transparency reports. A few gaming companies, such as Wildlife Studios, Discord, and Xbox, have begun issuing regular reports outlining safety outcomes on their platforms, including the number of player reports, the number of banned users, and a breakdown of what types of toxicity are occurring. These reports are of course imperfect - without proactive moderation and analysis tools, these platforms can’t be sure they’re working from a representative sample; and we’re forced to trust that the platforms aren’t fudging the numbers. But even so, these reports are invaluable, because they offer players and guardians insight into what kind of environment these platforms are trying to foster in the first place. Remember, toxicity isn’t black-and-white - behavior that’s welcome or even celebrated at a sports bar might be problematic in a courtroom or on a school playground. Transparency reports help platforms make clear what kind of space they are trying to cultivate - enabling guardians to make responsible decisions about which platforms are suitable for their kids’ specific interests, needs, and level of maturity.
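For illustration, here’s a rough sketch of the kind of structure such a report might aggregate, using only the metrics mentioned above - player reports, banned accounts, and a breakdown of toxicity types. The field names and figures are hypothetical, not any platform’s actual schema or data.

```python
# Hypothetical sketch of a quarterly transparency report's core metrics.
# Field names and numbers are illustrative only.
from dataclasses import dataclass, field

@dataclass
class TransparencyReport:
    period: str                              # e.g. "2023-Q1"
    player_reports_received: int
    accounts_actioned: int                   # warnings, mutes, bans, etc.
    accounts_banned: int
    toxicity_breakdown: dict[str, int] = field(default_factory=dict)

    def summary(self) -> str:
        top = max(self.toxicity_breakdown, key=self.toxicity_breakdown.get, default="n/a")
        return (f"{self.period}: {self.player_reports_received} reports, "
                f"{self.accounts_banned} bans; most-reported category: {top}")

report = TransparencyReport(
    period="2023-Q1",
    player_reports_received=120_000,
    accounts_actioned=18_500,
    accounts_banned=4_200,
    toxicity_breakdown={"harassment": 52_000, "hate speech": 21_000, "spam": 30_000},
)
print(report.summary())
```

Even this bare-bones structure makes the point: the categories a platform chooses to count and publish say a lot about the kind of space it’s trying to cultivate.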
Bringing it all together
Gaming is an enormous industry - and, as we’ve alluded to here, the lines between gaming and social are blurring over time. No one blog post or organization can hope to discuss all of the major trends that will emerge in the next year, any more than a single telescope can be used to view the whole sky at once.
That said, we think few would dispute that one of the largest areas to watch in gaming will be trust & safety - the infrastructure, consumer sentiment, regulatory landscape, and of course developer mindsets that enable safer, more privacy-respecting, and fundamentally more inclusive games. We hope this post has whetted your appetite for some of the changes we’re expecting in this area, and we’re looking forward to seeing gaming evolve further towards something that everyone can truly enjoy and participate in. Happy 2023, everyone!