‘Call of Duty’ employs AI for voice chat moderation
- Web Desk
- Aug 31, 2023
SANTA MONICA: Famous first-person shooter franchise Call of Duty is set to incorporate an AI-powered in-game voice chat moderation system to identify and curb instances of hate speech, discrimination, and harassment during gameplay.
First-person shooter games have long been the talk of the town for harbouring toxic environments in their lobbies and voice chats.
The Verge reported that surveys have labelled the franchise’s fan community as particularly negative, with incidents like a SWAT team being called over a player dispute. Activision has been striving to address the issue, and artificial intelligence may now play a role in the solution.
Activision has joined forces with Modulate, a company specialising in this field, to introduce “in-game voice chat moderation” to their games. This new moderation system, utilising AI technology named ToxMod, is designed to promptly detect harmful behavior and speech during gameplay.
As per the media reports, the beta release of ToxMod has commenced in North America, marking its debut in Call of Duty: Modern Warfare II and Call of Duty: Warzone. A global rollout (excluding Asia, as stated in the press release) is scheduled for November 10th, coinciding with the launch of this year’s new addition to the franchise, Call of Duty: Modern Warfare III.
Although Modulate’s press release doesn’t provide many specifics about how ToxMod works, the company’s website explains that it can analyse conversations in voice chat, pinpoint toxic behaviour, and give moderators pertinent context for a swift response.
Meanwhile, the CEO of Modulate emphasised that the technology does more than just convert spoken words into text. It also considers cues such as a player’s emotion and how loudly they are speaking to tell the difference between harmful speech and playful banter.
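Modulate has not published how ToxMod actually makes these decisions. Purely as an illustration, the kind of multi-signal judgement the CEO describes, weighing what was said against how it was said, might be sketched like this (the function, scores, and thresholds are all hypothetical, not ToxMod’s real logic):

```python
# Hypothetical sketch of combining a text-toxicity score with vocal cues.
# All names and thresholds are invented for illustration only.

def classify_utterance(transcript_toxicity: float,
                       anger_score: float,
                       loudness_db: float) -> str:
    """Label an utterance by mixing transcript and prosody signals.

    transcript_toxicity: 0..1 score from a text classifier
    anger_score: 0..1 estimate of hostile emotion in the voice
    loudness_db: relative loudness of the audio clip
    """
    # Toxic wording delivered angrily and loudly: escalate to a human.
    if transcript_toxicity > 0.8 and anger_score > 0.6 and loudness_db > 70:
        return "flag_for_moderator"
    # Toxic wording in a calm, quiet delivery may be banter between friends.
    if transcript_toxicity > 0.8:
        return "possible_banter"
    return "ok"

print(classify_utterance(0.9, 0.8, 75))  # flag_for_moderator
print(classify_utterance(0.9, 0.1, 40))  # possible_banter
```

The point of such a design is the one the CEO makes: identical words can land as abuse or as friendly trash talk, so the audio itself has to inform the verdict.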
However, it is pertinent to note that at least for the time being, ToxMod will not directly take action against players based on its assessments. Instead, it will provide reports to Activision’s moderators. This approach acknowledges the necessity of human oversight, as research has indicated that speech recognition systems can exhibit bias when interacting with users of diverse racial backgrounds and accents.