Games publisher Activision has announced that it will use artificial intelligence to listen in on the voice chat of ‘Call of Duty’ players and punish anyone using “hate speech” or “discriminatory language.”
The feature was rolled out on Wednesday for US players of Call of Duty's 'Modern Warfare II' and 'Warzone' titles, and will launch globally with 'Call of Duty: Modern Warfare III' on November 10.
In a statement on Wednesday, Activision said that it will use a machine learning tool to “identify in real-time and enforce against toxic speech,” including “hate speech, discriminatory language, harassment and more.”
Activision’s announcement made no distinction between private chat among teammates and public chat audible to all players in a server. Presumably, both will be subject to the same monitoring.
Call of Duty's code of conduct bans insults based on race, sexual orientation, gender identity, age, culture, faith, and country of origin. Players deemed to have breached the code will face temporary voice chat restrictions, temporary account bans, or, for repeat offenders, permanent bans, Activision said.
Around 90 million gamers worldwide play Call of Duty every month. At present, Activision relies on reports from other players and text chat monitoring to weed out allegedly "toxic" conduct. According to the publisher, around one million accounts have been penalized since the launch of 'Modern Warfare II' last October.
Players will be unable to opt out of being listened to, except by disabling voice chat entirely.
Purging "hate speech" from Call of Duty will likely please the US military, which reportedly planned to use the game to recruit women and minorities. While the Pentagon identified Call of Duty as a "potentially useful branding and recruiting tool," it canceled a deal to sponsor in-game ads and an esports tournament at historically black universities in the US after allegations of sexual harassment at Activision surfaced in late 2021.