From Games to Delivery Apps: Promoting Community Positivity

Whether you're a game, a social platform, or even an enterprise delivery or ride-share app, you are building your product for your community. Your users are sacred, and it's your goal to give them a magical, engaging experience every time. That's why it's a shame that so many community management techniques are inherently punitive, in a way that puts you at odds with your users.

Don't get us wrong. Sometimes punitive actions are necessary. Those few but loud bad actors seeking to ruin the experience for others do need to be stopped, and other users who might be making unintentional mistakes should be informed and coached towards better behavior. Even your most loyal users need to see that you're taking your community seriously and acting to mitigate misbehavior, lest you lose their trust. This is why Modulate started off focused on moderating severe misconduct, and we stand behind that work.

But punitive action is only a piece of the puzzle. In an ideal world, you'd achieve your community goals primarily through encouragement and positive reinforcement - cultivating the best instincts of your community, rather than constantly fighting the worst.

In recent years, we've seen more and more organizations talking about this focus on prosocial behaviors, and some even attempting to build detectors to recognize positive behaviors instead of just negative ones. This caught our attention, and we set our sights on building our own such detector. But as we did so, a critical question arose - once you find good actors, what exactly do you do with them?

With bad actors, the intervention is obvious - get them off the platform, or push them to change their behavior. But your best community members are, by definition, already on the platform and already engaging constructively. We've seen many platforms build out their positivity detectors, only to fall flat when they realized that data didn't really help them bolster their community in any meaningful way.

Ken and Chris recently shared their thoughts on how to conceptualize "prosocial behaviors," but today I want to talk about what you as a platform might want to do once you've detected which users of yours are most prosocial.

A Framework for Detecting Prosocial Behavior

First, let's consider what it means to detect prosocial behavior. When detecting toxicity, it's important to monitor everyone - after all, the simple truth is that some of the worst harms, things like child grooming or violent radicalization, may take place between two people your platform identifies as friends based on how they're connected with each other.

But for positive behaviors, there's a bit less justification - the downside of missing someone who's behaving well is less extreme. So the first question you might ask is: do I really need to monitor all my users to find positive behaviors?

Now, one possible answer is yes. Some platforms may well choose to say, "Hey, I would, in fact, like to monitor all of my users, because I think that's going to help me get a better sense of what these positive behaviors look like, or maybe help me better catch things like attempts to defuse toxicity." But, while users typically accept that monitoring is necessary for toxicity detection, privacy advocates may be less pleased with prosocial monitoring. So an alternative approach is to invite your users to volunteer. Specifically, you'd invite users to opt into some kind of special user group for "trusted ambassadors" or similar, and one condition of membership would be that those users' behavior is monitored to ensure they're living up to the group's expectations. Of course, membership would also come with perks. Maybe members get special in-game items or special powers. Maybe this is how they qualify to become a community representative or moderator. Maybe the best users automatically get invited to the next big in-person event. There are a lot of different options, but they basically boil down to a reputation system where you're encouraging your users to invest in the health of your community, both for their sake and your own.
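
To make that concrete, here's a rough sketch of how such an opt-in reputation system might be wired up. Everything here - the class names, the perk thresholds, the scoring - is hypothetical; it's meant to illustrate the shape of the idea, not prescribe an implementation.

```python
# A minimal sketch of an opt-in "trusted ambassador" program, assuming a
# hypothetical backend where user IDs are strings. AmbassadorProgram,
# record_observation, and the thresholds are illustrative, not a real API.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AmbassadorRecord:
    user_id: str
    consented_to_monitoring: bool          # explicit opt-in to extra observation
    joined_at: datetime
    prosocial_score: float = 0.0           # running reputation score
    perks_unlocked: list[str] = field(default_factory=list)


class AmbassadorProgram:
    def __init__(self, perk_thresholds: dict[str, float]):
        self.members: dict[str, AmbassadorRecord] = {}
        self.perk_thresholds = perk_thresholds   # e.g. {"exclusive_item": 5.0}

    def opt_in(self, user_id: str, consented: bool) -> bool:
        """Enroll a user only if they explicitly consent to monitoring."""
        if not consented:
            return False
        self.members[user_id] = AmbassadorRecord(
            user_id=user_id,
            consented_to_monitoring=True,
            joined_at=datetime.now(timezone.utc),
        )
        return True

    def record_observation(self, user_id: str, score_delta: float) -> None:
        """Update a member's reputation when prosocial behavior is observed."""
        member = self.members.get(user_id)
        if member is None:
            return  # never score users who haven't opted in
        member.prosocial_score += score_delta
        for perk, threshold in self.perk_thresholds.items():
            if member.prosocial_score >= threshold and perk not in member.perks_unlocked:
                member.perks_unlocked.append(perk)


# Example usage:
program = AmbassadorProgram(perk_thresholds={"exclusive_item": 5.0})
program.opt_in("player_123", consented=True)
program.record_observation("player_123", score_delta=6.0)
print(program.members["player_123"].perks_unlocked)  # ['exclusive_item']
```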

What's especially cool about this is that it allows you to promote your goals to your community much more directly. Through the very existence of this program, you're reminding users what you want them to be doing, how you want them to behave, and that it matters to you - whether the reward is just a status signal like a leaderboard position or a special indicator next to their username, or something more substantial. The real reward, and the one users will understand, is partnership with the platform: they get to feel like a deeper part of the community, working alongside the developers to realize a shared vision, and that's going to be extremely compelling to your users.

So that's one approach to making use of positivity. Let's say you don't want that kind of proactive encouragement, though, and you really just want to see how users behave completely organically. In that case, you'll need to start by deciding what kinds of behaviors to search for. Those positive behaviors could be things like simple politeness. They could be efforts to defuse toxicity, especially successful efforts that demonstrate some skill at the art. They could be coaching and encouragement for users who are a little less skilled or more introverted, helping them feel more at home and more included in the community. They can even be support for things happening beyond the online space itself - if someone's coming into that space looking for an escape from the physical world, and another user is offering much-needed emotional support, that may be something you want to encourage and want your community to be able to offer. So you're looking for all of these behaviors, and now you're starting to identify users who are acting in line with those positive goals.
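
As a loose illustration, here's one way you might enumerate those behavior categories and roll them up into a single signal. The categories mirror the examples above, but the weights are entirely made up - every community will value these behaviors differently.

```python
# A minimal sketch of a prosocial behavior taxonomy and a simple roll-up score.
# The categories and weights are hypothetical placeholders; a real detector
# would be tuned to each community's own goals.
from enum import Enum


class ProsocialBehavior(Enum):
    POLITENESS = "politeness"
    DEESCALATION = "deescalation"          # defusing toxicity in the moment
    COACHING = "coaching"                  # helping less skilled or quieter users
    EMOTIONAL_SUPPORT = "emotional_support"


# Hypothetical weights reflecting how much each behavior matters to this community.
BEHAVIOR_WEIGHTS = {
    ProsocialBehavior.POLITENESS: 1.0,
    ProsocialBehavior.DEESCALATION: 3.0,
    ProsocialBehavior.COACHING: 2.0,
    ProsocialBehavior.EMOTIONAL_SUPPORT: 2.5,
}


def prosocial_score(observed: list[ProsocialBehavior]) -> float:
    """Aggregate a user's observed behaviors into a single reputation signal."""
    return sum(BEHAVIOR_WEIGHTS[behavior] for behavior in observed)


print(prosocial_score([ProsocialBehavior.POLITENESS,
                       ProsocialBehavior.DEESCALATION]))  # 4.0
```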

Encouraging and Amplifying Prosocial Behaviors

Once you've detected these behaviors, what can you do to encourage those users to continue behaving that way? Well, to start, you could use the "trusted ambassadors" approach we discussed above.

You could also provide them something a little meatier. This will depend on the environment - for instance, some semi-competitive social games could offer better items or matchmaking advantages to users who are more supportive of the community. If you don't want to go that far, you can still look at matchmaking and say, "now that I've identified this user as a really positive one, maybe I want my new users to be matched with people like this more often." This lets you make sure that users arriving on the platform for the first time have a more positive experience, because you're pairing them with trusted users. You can also potentially use this to defuse some toxicity - if you see that certain groups of players seem to be escalating toward more and more harmful behavior, bringing in other users known to be effective at defusing that kind of situation, especially those skilled at protecting vulnerable users, can be a great way to keep things from spiraling. And in MMOs and other kinds of open worlds (including many VR games), you have even more flexibility.
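
Here's a hedged sketch of what prosocial-aware matchmaking could look like. The Candidate fields and the "seat new players with prosocial veterans" heuristic are assumptions for illustration, not a production matchmaker.

```python
# A minimal sketch of prosocial-aware lobby filling, assuming each candidate
# carries a precomputed prosocial_score and an is_new_player flag.
from dataclasses import dataclass


@dataclass
class Candidate:
    user_id: str
    skill: float
    prosocial_score: float
    is_new_player: bool


def fill_lobby(candidates: list[Candidate], size: int) -> list[Candidate]:
    """Build a lobby, preferring prosocial veterans when new players are present."""
    new_players = [c for c in candidates if c.is_new_player]
    veterans = [c for c in candidates if not c.is_new_player]

    if new_players:
        # Seat new players first, then the most prosocial veterans available.
        veterans.sort(key=lambda c: c.prosocial_score, reverse=True)

    return (new_players + veterans)[:size]
```

In a real system you'd blend this signal with skill, latency, and party constraints rather than letting it override them; the point is simply that a prosocial score can become one more input to the matchmaker.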

Do you want to actually designate some of your users as the in-universe law enforcement, where part of their job within the virtual world is that they get access to a special stream of data indicating things like "this person is being bullied right now" or "these people are becoming toxic and violating our policies"? Then, rather than just acting punitively, you're encouraging this positive actor of yours to go into the space where the bad behavior is happening and engage human-to-human with those bad actors, in a way that tries to convince them to change their behavior. Deputizing your users like this can be an extremely powerful way to encourage better behavior within your community.
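
To illustrate, here's a rough sketch of how detected incidents might be routed to those deputized users. The Incident fields, the severity cutoff, and the notification hook are all assumptions - in particular, you'd likely still want severe cases going straight to human moderators rather than to fellow players.

```python
# A minimal sketch of routing detected incidents to opted-in "deputy" users,
# assuming the moderation pipeline already emits incident events.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Incident:
    session_id: str
    kind: str               # e.g. "bullying", "escalating_toxicity"
    severity: float         # 0.0 (mild) to 1.0 (severe)


class DeputyDispatcher:
    def __init__(self, notify: Callable[[str, Incident], None]):
        self.deputies: set[str] = set()   # users who opted in to this role
        self.notify = notify              # e.g. an in-game ping or DM

    def register(self, user_id: str) -> None:
        self.deputies.add(user_id)

    def dispatch(self, incident: Incident) -> None:
        # Only surface lower-severity incidents; severe cases should still go
        # to human moderators rather than to fellow players.
        if incident.severity >= 0.8 or not self.deputies:
            return
        for deputy in self.deputies:
            self.notify(deputy, incident)


# Example usage:
dispatcher = DeputyDispatcher(notify=lambda user, inc: print(f"ping {user}: {inc.kind}"))
dispatcher.register("helpful_veteran_42")
dispatcher.dispatch(Incident(session_id="lobby-7", kind="escalating_toxicity", severity=0.4))
```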

Of course, some platforms may feel uncomfortable with the idea of deputizing their users, and might instead want to have NPCs (in games) or AI agents (in other enterprise systems) that can model positive behaviors. This can be tricky - AI systems are known to go haywire, especially when "trolls" on the platform attempt to push them towards worse behaviors. So you're going to need to train these NPCs and AI agents in a way that makes them robust and teaches them the skills to delicately sidestep attempted toxicity. Training these skills just once and calling it a day rarely ever works - after all, language and behaviors evolve, so even if your NPC was robust yesterday, it might struggle to understand what's happening today. So if you really want your AI agents to be effective at this, there are few better places to look for training data than your prosocial human users.
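
As a sketch of that last point, here's one way you might assemble agent training data from your prosocial users. The field names assume transcripts already carry labels from both your toxicity and prosocial detectors; a real pipeline would also refresh this set regularly as language and behavior evolve.

```python
# A minimal sketch of selecting prosocial exemplars for NPC/agent training,
# assuming labeled transcripts. Field names are illustrative placeholders.
from dataclasses import dataclass


@dataclass
class Transcript:
    user_id: str
    text: str
    prosocial_labels: list[str]     # e.g. ["deescalation"]
    policy_violations: list[str]    # output of the existing toxicity detector


def build_training_set(transcripts: list[Transcript],
                       trusted_users: set[str]) -> list[str]:
    """Keep only clean, prosocial exchanges from trusted users as exemplars."""
    return [
        t.text
        for t in transcripts
        if t.user_id in trusted_users
        and t.prosocial_labels          # must demonstrate a positive behavior
        and not t.policy_violations     # and contain no violations of its own
    ]
```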

We also need to address privacy in this context. This is something that needs to be done carefully, to make sure you're not obtaining PII and building it into your NPC models and agents, and, more broadly, that you're getting the consent of the users whose interactions you're training on. But many users across gaming, social, and enterprise ecosystems consistently express interest in this kind of use, and are often extremely intrigued by the idea of their own behavior serving as a template for how AIs will behave.
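
Here's a minimal sketch of the consent and PII gate that would sit in front of that training pipeline. The regex-based redaction is only a stand-in - a production system would use a dedicated PII-detection service - and the consent check assumes you're tracking opt-ins explicitly.

```python
# A minimal sketch of a consent + PII gate for training data. The regexes below
# are placeholders for a real PII-detection service.
import re
from typing import Optional

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")


def redact_pii(text: str) -> str:
    """Strip obvious identifiers before the text can reach a training set."""
    text = EMAIL_RE.sub("[EMAIL]", text)
    text = PHONE_RE.sub("[PHONE]", text)
    return text


def prepare_for_training(text: str, user_id: str,
                         consented_users: set[str]) -> Optional[str]:
    """Only use data from users who consented, and only after redaction."""
    if user_id not in consented_users:
        return None
    return redact_pii(text)
```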

Harness the Power of Collaboration for a Positive Community

So, let's summarize. Once you find your good users, what can you do with that information?

- You can offer membership in a "trusted ambassador" program (or even in-platform "law enforcement"), which both rewards these good citizens and enables you to match them with bad actors for more personal coaching and gentle discouragement from misbehavior

- You can adjust your matchmaking systems to improve the experience of new users or better protect your most vulnerable users

- You can directly incentivize more good behavior with gifts, special skins or items, and so on

- You can use your best users as a model to improve your AI agents, ensuring your automated systems can demonstrate the same compassion and resilience as your most dedicated users

And of course, there are surely other options too. We're only beginning to scratch the surface here, but what's so exciting about this is that it's all collaborative and constructive. When you're dealing with toxicity, a suspension or a mute fundamentally feels confrontational and negative to your users. It's very hard to imagine a banned user thanking the platform for it. So punitive content moderation fundamentally puts the platform and the players in opposition. But the beauty of prosocial work is that it puts the players and the platform on the same side.

You're all working to cultivate the same positive community. And while the platform gets final say on what that positive community looks like, the users - who are so invested in the community, who know its ins and outs, who actually understand the social dynamics within your space, possibly much better than you do - are being empowered to meaningfully contribute to realizing that vision. They get to see that this is not just a platform they're using; it's their community as much as it is yours - and that's a wonderful feeling for everyone.