At Reboot Develop Blue 2024, Unity’s Micaela Hayes took the stage to tackle the subject of online toxicity: how to fight it, and why it’s important to confront it not only from an ethical standpoint but, frankly, as a business decision.
Between 2021 and 2023, she explained, the share of players experiencing some form of toxicity rose from 68% to 76%, and roughly 49% of players say they avoid specific games for this reason. When, later in the day, we had a chance to interview her on these topics, we asked why she thinks the numbers have been rising.
"I think it’s gotten worse because during COVID and 2020, a lot of people were inside playing games," she said, "The gaming community grew and we’ve kind of plateaued since then, because a lot of people have gone back to school, back to work. They’re not at home as much."
The lockdown obviously had a huge impact on mental health, and Hayes thinks that "people’s empathy changed": players became more self-focused and lost out on the kind of experience that face-to-face interaction provides. That is particularly relevant for young people, students who missed two or three years of human interaction during a very delicate phase of their emotional development.
"It very much siloed people’s environments and took away the humanity of it," Hayes explained. "And I think that translates in all aspects of the world, especially gaming. Particularly because you’re hidden behind a computer screen. You don’t have a camera. There’s no identifiable information coming from your profile or whatever. So people feel safety in that."
"Moderation sees one of the highest turnovers in the industry because they face obscene language and terrible threats on a daily basis. The human psyche can only take so much before asking if it’s worth it"
Data also shows that toxic behaviour doesn’t necessarily come from a small group of angry people. "There’s a generalized increase in toxicity," Hayes said.
But the current situation can be traced back much further than 2020; Hayes suggests the absence of any anti-toxicity culture during the early Noughties could be part of the issue.
"When I started playing games, there wasn’t the anti-toxicity movement at all," she said. "That kind of behaviour was just part of the culture, part of the experience, deal with it or leave.
"Your foundational understanding of video games comes from when you first start. And I think that when you start your gaming adventures in a toxic environment like that, and you think that’s normalized, then you’ll expect it to continue on that way, because that’s just part of the experience, unfortunately."
She added that changing online behavioural patterns isn’t easy: "It’s been normalized for so long, people just think it’s part of how games work. Thankfully, there’s been a super high increase of companies that understand that not only is it important, but the onus is on them to do something about the environments that they create for their players."
During her talk, Hayes pointed out a silver lining to the grim stats mentioned earlier: studies show that roughly 96% of people want to do something about the issue. Many players also say they would be willing to pay as much as double the monthly price for a game if it means the game environment won’t be toxic.
But what can be done, and how have companies actually tried to tackle the situation? Hayes briefly summarized the options during her talk, explaining that there’s almost always a compromise involved. Report-based moderation, for example, is complicated: it’s often not backed by evidence, it depends on players’ willingness to act and report, and it rarely produces any direct feedback for the user. Using forums or Discord means the player has to leave the game to report. Speech-to-text transcription ignores nuance, tone and community culture, which can also be a problem with outsourced moderation.
"More companies understand the onus is on them to do something about the environments they create for players"
And one of the biggest issues is how hard it is to evaluate contextual behaviour: a certain kind of banter could be insulting between complete strangers yet perfectly fine between friends, or generally accepted in a first-person shooter and wildly unacceptable in a game aimed at young children.
So, how does the industry address this complex web of issues? Hayes’ approach can be traced back to her teaching background; she worked as a high school math teacher before she had the chance to turn her love of gaming into a job, becoming a community manager at Hi-Rez Studios.
"I worked there for about five or six years, but during that time I transitioned from community management to the business development side of things."
After specializing in community safety and anti-toxicity, she started working on Vivox, Unity’s voice and chat service, focusing on in-game communication systems. And in her work, she’s been able to apply what she learned as a teacher.
"As a teacher you… get up and you public speak every day. So having the ability to kind of just off-the-cuff chat with people comes very naturally for me."
Dealing with online communities to fight back against toxic behaviour can in many ways be seen as teaching, she said, because it’s not just about punishing, it’s also about educating, which can be done in many ways. During her talk, for example, Hayes mentioned how certain games gate rewards, limiting them to players who haven’t received bans.
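To make that mechanic concrete, here’s a minimal sketch of how such reward gating might work; the function, fields and 90-day window below are our illustrative assumptions, not any specific game’s implementation.

```python
# Hypothetical sketch of reward gating: names and the 90-day
# window are illustrative assumptions, not any game's actual rules.
from datetime import datetime, timedelta
from typing import Optional

def eligible_for_season_reward(last_ban: Optional[datetime],
                               clean_window: timedelta = timedelta(days=90)) -> bool:
    """A player earns the seasonal reward only with no ban inside the window."""
    if last_ban is None:
        return True
    return datetime.now() - last_ban >= clean_window

print(eligible_for_season_reward(None))                                 # True
print(eligible_for_season_reward(datetime.now() - timedelta(days=10)))  # False
```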
Part of the issue with toxicity management also lies in manpower. "Moderation and customer support see one of the highest turnovers in the industry because they are faced with obscene language and terrible threats on a daily basis," Hayes told us. "And the human psyche can only take that so much before asking if it’s worth it."
Also, she added, a lot of support teams are "at the bottom of the totem pole." Many people try to get into the industry this way but "then they get a bad taste for the gaming industry at large."
"And their pay is not great. It really isn’t, especially compared to other jobs in the same company. And so people start thinking, ‘Okay, how many times do I need to read ‘kill yourself’ before I need to quit this job?’"
This specific issue ties in strongly with the second half of Hayes’ talk, which focused on presenting Safe Voice, the cross-platform tool developed by Unity that uses machine learning to manage online toxicity, combining transcription with tonal analysis and monitoring the whole environment. The tool tracks player behaviour and responses, such as whether a particular player gets muted by all other participants, or how people react to specific behaviours.
The moderation needs of a child-centric title like Roblox will differ from those of a game with a more mature audience, such as Call of Duty

Safe Voice launched in closed beta last July and recently entered its open beta phase. It has been designed to give exactly what’s usually missing: context, reported in detail and easy to parse.
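Unity hasn’t published how Safe Voice actually weighs its signals, but as a rough illustration of the approach the talk described, a system of this kind might blend transcript, tone and behavioural data along these lines; every name, weight and threshold below is a hypothetical stand-in.

```python
# Illustrative only: Unity has not published Safe Voice's internals, so every
# signal name, weight and threshold here is a hypothetical stand-in for the
# kinds of inputs the talk described: transcription, tonal analysis, and
# session behaviour such as mute events.
from dataclasses import dataclass

@dataclass
class VoiceClip:
    transcript_toxicity: float  # 0..1, from speech-to-text plus a text classifier
    tone_hostility: float       # 0..1, from tonal/prosody analysis
    muted_by_fraction: float    # share of other participants who muted the speaker

def flag_for_review(clip: VoiceClip, threshold: float = 0.6) -> bool:
    """Blend text, tone and behavioural signals into one review decision."""
    score = (0.4 * clip.transcript_toxicity
             + 0.35 * clip.tone_hostility
             + 0.25 * clip.muted_by_fraction)
    return score >= threshold

# A hostile tone plus mass-muting flags a clip even when the words alone look mild.
print(flag_for_review(VoiceClip(0.3, 0.9, 0.8)))  # True
print(flag_for_review(VoiceClip(0.3, 0.1, 0.0)))  # False
```

The point of blending the signals is the one Hayes makes throughout: the words alone aren’t the whole story; how something is said, and how the room reacts, moves the needle too.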
"It’s machine learning at its best," Hayes told us. "It’s a way of protecting people, but it’s not just protecting the players, it’s also protecting the moderators."
But in automating this kind of task, isn’t there a risk of perpetuating misinterpretations of context, nuance and jokes? That’s precisely the field in which Safe Voice is strong and effective, Hayes explained to us. It’s also fully customizable, so the studio using it can adapt it to behaviours that a specific community considers acceptable.
"For instance, games come with ratings," Hayes said. "So if the game is Mature or 17+, obscenities are more understandable and more part of just generalized language that doesn’t necessarily even mean toxic." In those cases, she added, "maybe using the word ‘damn’ or ‘shit’ in a kind of neutral tone is just part of the cadence of talking."
"[Fully automated moderation] could happen in future, but some studios want to some sort of human check and I’m all for that as well"
Hayes also doesn’t think that allowing certain kinds of language in specific communities risks alienating potential new players, because those players "should be adults and should understand the context of how these obscenities are being used, especially in contrast to a game like Roblox, where the majority of the player base is 7 to 14."
But focusing on language and tone is not enough: the system needs to discern more subtle nuances and take into account how the same word can have different meanings in different places. We asked Hayes about this, and about the fact that languages like Portuguese, Spanish and of course English are spoken quite differently depending on the territory; there are a couple of English words that are completely unacceptable in the US but tolerated elsewhere. She agreed and once again pointed to tonal analysis.
"It’s not just about what you say and how you say it, although those are important factors, but also the reception of the listener or the other players. So, understanding if you’re offended by what I said, based on your tone, and taking that into account."
That’s the big advantage over simple automatic transcription, where people could say things that look fine on paper but are quite offensive in context. And of course it can work in the opposite direction, too.
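As a sketch of how that listener-reception signal could fold back into a toxicity score, assuming hypothetical distress and laughter measures (Safe Voice’s real inputs aren’t public):

```python
# Sketch of the "reception of the listener" idea Hayes describes: the same
# utterance is weighted up or down by how the other players audibly react.
# Signal names and weights are illustrative assumptions, not Safe Voice's API.
def adjust_for_reception(base_score: float, listener_distress: float,
                         listener_laughter: float) -> float:
    """Raise the score when listeners sound distressed,
    lower it when the exchange reads as shared banter."""
    adjusted = base_score + 0.3 * listener_distress - 0.2 * listener_laughter
    return min(max(adjusted, 0.0), 1.0)

# Identical words: escalated among upset strangers, waved through among laughing friends.
print(adjust_for_reception(0.5, listener_distress=0.9, listener_laughter=0.0))  # 0.77
print(adjust_for_reception(0.5, listener_distress=0.0, listener_laughter=0.9))  # 0.32
```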
But even with all the grunt work done by AI, the system still needs human intervention to parse the flagged data, evaluate it and then take action. Will we ever get to a point where the full process is automated?
"It could happen in the future," Hayes said. "We would love to get to a place like that. But I think studios want to have some sort of human check and I’m all for that as well."