How EA’s Positive Play initiative aims to move beyond banning toxic players to creating positive experiences

We’ve known for years that online gaming can be a minefield of toxicity and harassment, especially for women. And while moderation tools in general have been around for a long time, only in recent years have the big game companies begun to acknowledge their responsibility, and their power, not just to prevent this behavior but to actively create positive spaces.

Just last month, we’ve seen Riot Games and Ubisoft collaborate on such a project, and Xbox recently started publishing data on its moderation efforts as well. But one company that has been publicly pursuing this strategy for several years now is EA, through its Positive Play program.

The Positive Play program is led by Chris Bruzzo, EA’s chief experience officer. He’s been with the company for eight and a half years and took on this newly created role after six years as EA’s chief marketing officer. While still in that old role, he and current CMO David Tinson started the conversations that led to Positive Play at EA.

“David and I have talked for years about the need to get the community involved in this, while also addressing the issue of toxicity in games and some of the really challenging things that are happening in these communities. Social communities are growing rapidly in and around games,” says Bruzzo. “And so a few years ago [in 2019] we held a summit at E3, and we started talking about the collective responsibility that game companies, players, and everyone else involved must have in addressing disruptive and toxic behavior in games.”

Positive Play

EA’s Building Healthy Communities summit featured content creators from 20 countries, EA staff, and third-party experts on online communities and toxicity. There were talks and roundtable discussions, as well as opportunities to provide feedback on how issues were being addressed.

Bruzzo said that both going into the summit and from the feedback that followed, it was clear to him that women in particular were having a “hugely bad experience” in social games. If they reveal their gender, or if their voices are heard, women frequently report harassment or bullying. But the response from the summit convinced him that EA could do something about this. That’s how Positive Play was born.

He sought out Rachel Franklin, the former Maxis executive who had left in 2016 for Meta (then Facebook) to become its head of social VR, a role in which, Bruzzo said, she had unfortunately gained some extra relevant experience in this regard.

“If you want to find a more toxic environment than a gaming community, go to a social VR community,” says Bruzzo. “Because not only is there the same level of toxicity, but my avatar can immediately come up and touch your avatar’s face, and that creates a whole other level of not feeling safe or included.”

With Franklin leading as EA’s senior vice president of Positive Play, the team got to work. In 2020 they published the Positive Play Charter, essentially an outline of the dos and don’ts of social play in EA games. Its pillars include treating others with respect, keeping things fair, sharing clean content, and complying with local laws, and it stipulates that players who don’t follow those rules may have their EA accounts restricted. As basic as it sounds, Bruzzo says it formed a framework through which EA can both strengthen moderation of bad behavior and begin to proactively create experiences that are more likely to be positive.

Moderation at scale

On the moderation front, Bruzzo says EA has been trying to make it easier for players to flag issues in its games, and is increasingly using and improving AI tools to identify patterns of bad behavior and flag them automatically. Of course, it can’t rely entirely on AI; real humans still need to review edge cases and exceptions and make the right calls.

For an example of how AI makes this process easier, Bruzzo pointed to player names. Player names are one of the most common sources of toxicity they encounter, he said. While it’s easy enough to train a filter to ban certain inappropriate words, players intent on bad behavior will use symbols or other tricks to slip past the filters. But with AI, they are getting better and better at identifying and blocking these workarounds. This past summer, he said, they ran 30 million Apex Legends club names through their AI screening and removed 145,000 offending names. No human team could do that.
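EA hasn’t published how its screening actually works, but the symbol-substitution trick Bruzzo describes can be illustrated with a simple normalization pass that maps common look-alike characters back to letters before checking a blocklist. This is a minimal sketch only; the substitution table and blocklist terms here are hypothetical, not EA’s system:

```python
# Illustrative substitution-aware name screening.
# The mapping table and blocklist are placeholders, not EA's actual filter.

SUBSTITUTIONS = str.maketrans({
    "0": "o", "1": "i", "3": "e", "4": "a", "5": "s",
    "7": "t", "@": "a", "$": "s", "!": "i", "|": "l",
})

BLOCKLIST = {"badword", "slur"}  # placeholder terms

def normalize(name: str) -> str:
    """Lowercase, map look-alike symbols to letters, drop everything else."""
    mapped = name.lower().translate(SUBSTITUTIONS)
    return "".join(ch for ch in mapped if ch.isalpha())

def is_offensive(name: str) -> bool:
    """Flag a name if any blocklisted term survives normalization."""
    normalized = normalize(name)
    return any(term in normalized for term in BLOCKLIST)

print(is_offensive("B4dw0rd_99"))   # True: '4'→'a', '0'→'o'
print(is_offensive("FriendlyFox"))  # False
```

A real system would go further (Unicode confusables, spacing tricks, context), which is where the machine-learning approach Bruzzo describes earns its keep over a static table like this one.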

And it’s not just names. Since the Positive Play initiative began, Bruzzo said, EA has seen a significant reduction in hateful content on its platforms.

The moment your expression begins to encroach on another person’s ability to feel safe… that’s the moment your ability to do that disappears.

“One of the reasons we are in a better position than social media platforms [is because] we are not a social media platform,” he said. “We are a community of people who come together to have fun. So this isn’t really a platform for all your political speech. This is not a platform where you can talk about whatever you want… The moment your expression begins to encroach on other people’s ability to feel safe and included, or to keep the playing field level and let everyone have fun, that’s the moment your ability to do that disappears. Go do it on some other platform. This is a community of players who come together to have fun. That gives us the really big advantage of having very clear parameters. And so then we can come up with consequences, and we can make real progress in reducing disruptive behavior.”

That covers text, but what about voice chat? I asked Bruzzo how EA handles that, since it’s notoriously difficult to moderate what people say to each other over voice without running afoul of privacy laws around recorded conversations.

Bruzzo admits it’s harder. He said EA gets significant support from platform holders like Steam, Microsoft, Sony, and Epic whenever voice chats are hosted on their platforms, because both companies can bring their toolkits to the table. But for now, unfortunately, the best solution still lies with players blocking or muting offenders, or removing themselves from toxic conversations.

“For voice, the most important and effective thing anyone can do these days is make sure that players have easy access to turning things off,” he said. “It’s the best we can do.”

Another way EA is working to reduce toxicity in its games may seem a little unexpected: actively banning cheaters.

“We find that when a game has bugs or cheating in it, when there’s no good anti-cheat engine or the anti-cheat engine falls behind, especially in competitive games, one of the root causes of huge rates of toxicity is players feeling the environment is unfair,” Bruzzo said. “That they can’t compete fairly. And what happens is, it makes them angry. Because suddenly you realize that there are other people breaking the rules, and the game has no control over that rule-breaking. But you love this game and you’ve invested a lot of time and energy in it. It’s frustrating. So we’ve made it a priority to tackle cheating as one of the best ways for us to reduce toxicity in games.”

Good game

One point Bruzzo really wanted to address is that just as it is important to eliminate toxicity, it is equally important to promote positivity. And it’s not as though he’s working from scratch. As pervasive and memorable as bad behavior in games can be, the majority of gaming sessions are not toxic. At worst they are neutral, and they have often been positive without any help from EA.

“Fewer than 1% of our game sessions result in one player reporting another player,” he said. “We have hundreds of millions of people playing our games, so that’s still a huge number, and we feel… we have to keep at this, because the future of entertainment is interactive… But it’s important to remember that 99 out of 100 sessions don’t result in players having to report inappropriate behavior.

By far the most common text comment among players in 2022 is actually ‘gg’.

“And then another thing I just looked at the other day in Apex Legends: in 2022, the most common text comment among players is actually ‘gg’. It’s not ‘I hate you.’ It’s not vulgarity. It’s not even anything competitive. It’s ‘good game.’ And after that, it’s ‘thank you.’ ‘Thank you’ has been used over a billion times in 2022 alone in Apex Legends.

“And then the last thing I would say, just as a vote for humanity: when we warn people about crossing the line, as in they broke a rule and did something disruptive, 85% of those we warn never offend again. That just gives me hope.”

It’s that positive spirit Bruzzo hopes to foster going forward. I asked him what EA’s Positive Play initiative would look like in ten years if it continued to succeed.

“Hopefully we’ve moved on from the number one issue being trying to get rid of hateful and toxic content, and instead we’re talking about how to design games so they’re as inclusive as possible. I think ten years from now we will see games with adaptive controls, and even different servers and different experiences for different play styles. We will see an explosion of creativity, with players creating things, not just cosmetics, but actually creating playable objects in our games. And all of that will benefit from all the work we’re doing to create positive content, a positive play environment, and a positive social community.”

Rebekah Valentine is a news reporter for IGN. You can find her on Twitter @duckvalentine.


