How EA’s Positive Play Initiative Wants to Move From Banning Toxic Players to Creating Positive Experiences

We’ve known for years that online gaming can be a minefield of toxicity and bullying, especially for women. And while moderation tools have been around almost as long, only in recent years have major gaming companies really begun to acknowledge their responsibility, and their power, not just to stop this behavior but to proactively create positive spaces.


Just last month, we saw Riot Games and Ubisoft partner on just such a project, and Xbox has recently begun publishing data on its moderation efforts as well. But one company that has been publicly promoting this strategy for a few years now is EA, via its Positive Play program.

The Positive Play program is spearheaded by Chris Bruzzo, EA’s chief experience officer. He’s been at the company for eight and a half years, and stepped into the newly created role after six years as EA’s chief marketing officer. It was while he was still in that role that he and current CMO David Tinson began the conversations that led to Positive Play at EA.

“David and I talked for many years about needing to engage the community in this, and address toxicity in gaming and some of the really challenging things that were happening in what have been rapidly growing social communities either in or around games,” Bruzzo says. “And so a few years ago [in 2019], we held a summit at E3 and we started talking about what's the collective responsibility that gaming companies and everybody else, players and everyone involved has in addressing hateful conduct and toxicity in gaming?”

Pitching Positive Play



EA’s Building Healthy Communities Summit featured content creators from 20 countries, EA employees, and third-party experts on online communities and toxicity. There were talks and roundtable discussions, as well as opportunities to provide feedback on how to address the issues that were being brought forward.

Bruzzo says that both going into the summit and in the feedback that followed it, it was very clear to him that women especially were having a “pervasively bad experience” in social games. If they disclosed their gender or their voice was heard, women would often report being harassed or bullied. But the response to the summit convinced him that EA was in a position to do something about it, and that’s how Positive Play came to be.


He sought out Rachel Franklin, former head of Maxis, who had left for Meta (then Facebook) in 2016 to be its head of social VR, a role in which, Bruzzo suggests, she unfortunately picked up even more relevant experience.

“If you want to find an environment that's more toxic than a gaming community, go to a VR social community,” Bruzzo says. “Because not only is there the same amount of toxicity, but my avatar can come right up and get in your avatar's face, and that creates a whole other level of not feeling safe or included.”

With Franklin at the helm as EA’s SVP of Positive Play, the group got to work. In 2020 it published the Positive Play Charter, effectively an outline of do’s and don’ts for social play in EA’s games. Its pillars include treating others with respect, keeping things fair, sharing clean content, and following local laws, and it states that players who don’t follow those rules may have their EA accounts restricted. Basic as that may sound, Bruzzo says it formed a framework with which EA can both step up its moderation of bad behavior and begin proactively creating experiences that are more likely to be progressive and positive.

The Moderation Army



On the moderation side, Bruzzo says they’ve tried to make it very easy for players to flag issues in EA games, and have been increasingly using and improving AI agents to identify patterns of bad behavior and automatically issue warnings. Of course, they can’t fully rely on AI – real humans still need to review any cases that are exceptions or outliers and make appropriate decisions.
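EA hasn’t shared the internals of this system, but the flow Bruzzo describes (player reports feed an automated check, clear-cut cases trigger an automatic warning, and anything ambiguous goes to a human) can be sketched roughly as follows. Every name, score, and threshold below is hypothetical, not EA’s actual pipeline:

```python
from dataclasses import dataclass, field

@dataclass
class Report:
    reporter_id: str
    offender_id: str
    evidence: str  # e.g. the flagged chat message

@dataclass
class ModerationQueues:
    auto_warnings: list = field(default_factory=list)  # handled automatically
    human_review: list = field(default_factory=list)   # outliers for a person

def toxicity_score(text: str) -> float:
    """Stand-in for a trained classifier; returns a score in [0.0, 1.0]."""
    flagged = {"cheater", "trash"}  # placeholder vocabulary
    words = text.lower().split()
    hits = sum(w in flagged for w in words)
    return min(1.0, 3.0 * hits / max(len(words), 1))

def triage(report: Report, queues: ModerationQueues,
           warn_at: float = 0.9, review_at: float = 0.5) -> None:
    score = toxicity_score(report.evidence)
    if score >= warn_at:
        queues.auto_warnings.append(report)  # clear-cut: warn automatically
    elif score >= review_at:
        queues.human_review.append(report)   # ambiguous: a human decides
    # below review_at: no action taken
```

A real system would presumably weigh patterns of behavior over time rather than single messages, but the split between automated action and human review of exceptions is exactly the part Bruzzo emphasizes.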

For one example of how AI is making the process easier, Bruzzo points to player names, one of the most common toxicity issues EA runs into. While it’s easy enough to filter certain inappropriate words, players who want to behave badly will use symbols or other tricks to slip past those filters; with AI, EA is getting better and better at identifying and preventing these workarounds. This past summer, he says, they ran 30 million Apex Legends club names through their AI checks and removed 145,000 that were in violation. No human could do that.
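To give a concrete flavor of the workaround problem: a filter that only matches literal strings misses a name like “b4d.w0rd”. One standard countermeasure (not necessarily what EA’s system does) is to normalize symbol substitutions before checking a blocklist. A minimal sketch, with an illustrative substitution map and a placeholder blocklist:

```python
import re

# Illustrative map of common symbol/digit stand-ins for letters.
LEET_MAP = str.maketrans({
    "0": "o", "1": "i", "3": "e", "4": "a", "5": "s",
    "7": "t", "@": "a", "$": "s", "!": "i",
})

BLOCKLIST = {"badword"}  # placeholder; a real list would be far larger

def normalize(name: str) -> str:
    """Lowercase, map symbol substitutions, strip separators, collapse repeats."""
    name = name.lower().translate(LEET_MAP)
    name = re.sub(r"[^a-z]", "", name)          # drop punctuation/spacing tricks
    return re.sub(r"(.)\1{2,}", r"\1\1", name)  # "baaaad" -> "baad"

def violates(name: str) -> bool:
    norm = normalize(name)
    return any(term in norm for term in BLOCKLIST)

# e.g. violates("B.4.D-w_0.r.d") -> True, since it normalizes to "badword"
```

The arms race Bruzzo describes is that players keep inventing substitutions a fixed map doesn’t cover, which is where a learned model has the edge over a hand-maintained filter.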

And it’s not just names. Since the Positive Play initiative started, Bruzzo says EA is seeing measurable reductions in hateful content on its platforms.

The minute that your expression starts to infringe on someone else's ability to feel safe... that's the moment when your ability to do that goes away.

“One of the reasons that we're in a better position than social media platforms [is because] we're not a social media platform,” he says. “We're a community of people who come together to have fun. So this is actually not a platform for all of your political discourse. This is not a platform where you get to talk about anything you want…The minute that your expression starts to infringe on someone else's ability to feel safe and included or for the environment to be fair and for everyone to have fun, that's the moment when your ability to do that goes away. Go do that on some other platform. This is a community of people, of players who come together to have fun. That gives us really great advantages in terms of having very clear parameters. And so then we can issue consequences and we can make real material progress in reducing disruptive behavior.”

That covers text, but what about voice chat? I ask Bruzzo how EA handles it, given that it’s notoriously much harder to moderate what players say to one another over voice comms without running afoul of privacy laws around recorded conversations.

Bruzzo admits that it’s harder. He says EA does get significant assistance from platform holders like Steam, Microsoft, Sony, and Epic whenever voice chat is hosted on their platforms, because those companies can bring their own toolsets to the table. But at the moment, the best solution unfortunately still lies with players blocking or muting offenders, or removing themselves from toxic comms.


“In the case of voice, the most important and effective thing that anyone can do today is to make sure that the player has easy access to turning things off,” he says. “That's the best thing we can do.”

Another way EA is working to reduce toxicity in its games may seem a bit tangential – they’re aggressively banning cheaters.

“We find that when games are buggy or have cheaters in them, so when there's no good anti-cheat or when the anti-cheat is falling behind, especially in competitive games, one of the root causes of a huge percentage of toxicity is when players feel like the environment is unfair,” Bruzzo says. “That they cannot fairly compete. And what happens is, it angers them. Because suddenly you're realizing that there's others who are breaking the rules and the game is not controlling for that rule breaking behavior. But you love this game and you've invested a lot of your time and energy into it. It's so upsetting. So we have prioritized addressing cheaters as one of the best ways for us to reduce toxicity in games.”

Good Game



One point Bruzzo really wants to get across is that as important as it is to remove toxicity, it’s equally important to promote positivity. And it’s not like he’s working from nothing. As pervasive and memorable as bad behavior in games can be, the vast majority of game sessions aren’t toxic. They’re neutral at worst, and frequently are already positive without any additional help from EA.

“Less than 1% of our game sessions result in a player reporting another player,” he says. “We have hundreds of millions of people now playing our games, so it's still massive, and we feel…we have to be getting on this now because the future of entertainment is interactive…But it's just important to remember that 99 out of 100 sessions don't result in a player having to report inappropriate conduct.

So far in 2022, the most common text comment between players is actually ‘gg’.

“And then the other thing that I was just looking at the other day in Apex Legends, so far in 2022, the most common text comment between players is actually ‘gg’. It's not, ‘I hate you.’ It's not profanity, it's not even anything competitive. It's ‘good game’. And in fact, ‘thank you’. ‘Thank you’ has been used more than a billion times just in 2022 in Apex Legends alone.

“And then the last thing I'll say, just putting some votes in for humanity, is that when we warn people about stepping over the line, like they've broken a rule and they've done something that's disruptive, 85% of those people we warn never offend again. That just makes me hopeful.”

It’s that spirit of positivity that Bruzzo hopes to nurture going forward. I ask him what EA’s Positive Play initiative looks like in ten years if it continues to be successful.

“Hopefully we've moved on from our number one problem being trying to eliminate hateful content and toxicity, and instead we're talking about how to design games so they're the most inclusive games possible. I think ten years from now, we're going to see games that have adaptive controls and even different onboarding and different servers for different styles of play. We're going to see the explosion of creation and players creating things, not just like cosmetics, but actually creating objects that are playable in our games. And all of that is going to benefit from all this work we're doing to create positive content, Positive Play environments, and positive social communities.”


Rebekah Valentine is a news reporter for IGN. You can find her on Twitter @duckvalentine.
