Activision is turning to AI to make your CoD matches a bit less toxic

A new voice moderation tool, powered by AI, has been added to Call of Duty to protect players from “toxic or disruptive behaviour they may encounter.” This tool is said to help increase Activision’s ability to “identify and enforce against bad behaviour that has gone unreported.”

The initial rollout of the voice chat moderation tool will begin in North America for Call of Duty: Modern Warfare 2 and Call of Duty: Warzone from today. It’ll be rolled out globally (excluding Asia) on November 10 when Call of Duty: Modern Warfare 3 arrives.

How exactly does this tool work, and what behaviour will it be looking out for? In a comprehensive FAQ shared to Activision’s website, the developer shares that voice chat moderation is “managed and operated by Activision and uses the AI-powered model ToxMod from Modulate.” It will be focused on “detecting harm within voice chat versus specific keywords”, and violations of Call of Duty’s Code of Conduct will see more toxic players be punished for their behaviour.

In case you’re concerned about flaming your friends during a round of Warzone and subsequently being punished for it, the developer has shared that the tool does allow for “trash-talk and friendly banter.” However, as is to be expected, hate speech, sexism, and other types of discrimination will not be tolerated.

And if you don’t like the tool? Well, the developer suggests that those who do not wish to have their voice chat moderated should simply “disable in-game voice chat in the settings menu.” Problem solved. It has also specified that ToxMod’s job is just to moderate voice chat; the tool will not be dishing out punishment to problematic players.

“Call of Duty’s Voice Chat Moderation system only submits reports about toxic behavior, categorized by its type of behavior and a rated level of severity based on an evolving model. Activision determines how it will enforce voice chat moderation violations.”


So, if you’re just mucking about in Call of Duty having a good time with your friends, this tool won’t affect you. And if you’re worried your trash-talk might get flagged, anything that ToxMod flags should, according to Activision’s FAQ on the tool, be judged by a human before any punishments are dished out.

Beyond that, Modulate states that ToxMod “was born to understand all the nuance of voice. It goes beyond transcription to consider emotion, speech acts, listener responses, and much more.” If that holds true, you can hopefully expect to see less toxicity in your Call of Duty games.



Related Forum: Call of Duty Forum

Source: https://www.vg247.com/activision-ai-moderation-call-of-duty

Comments

"Activision is turning to AI to make your CoD matches a bit less toxic" :: Login/Create an Account :: 16 comments


Theseusovo posted:

These games are rated M for Mature, 17+ years of age.
How can they do this?
Maybe two different play styles like back on 360. Casual & Underground
They are catering to 12 year olds and these new age people
The game itself has cursing and blood/gore
What is the world coming to?

Xiripiti posted:

What the hell...
Just completely remove voice chat then if they don't want toxicity.

Gibsn posted:

They took the fun away when they turned up the SBMM a little bit more.

Billet posted:

Just takes the fun away from it imo

QT posted:

I remember the days of OG mw2 game chat.. ahh when times were good

This is bs

Runts posted:

Yeah this is straight up BS. Literally impossible to even play anymore or be yourself. Just gonna get banned or muted for every single thing. CoD is meant to talk crap lmao, wtf is the point of game chat then???!