Published: 8th September 1:17PM
In the gaming world, curbing toxicity and disruptive behavior has been an ongoing challenge. Call of Duty, a franchise known for its action-packed multiplayer experience, has taken a significant step forward in addressing the issue. Ahead of the November 10th release of Call of Duty: Modern Warfare III, Activision is introducing real-time voice chat moderation in collaboration with Modulate, a move set to reshape in-game communication.
A New Era of Moderation
Call of Duty’s anti-toxicity team is dedicated to creating a more inclusive gaming environment, and the introduction of voice chat moderation, powered by Modulate’s ToxMod technology, is a major step in that direction. This AI-powered system can identify and address toxic speech in real time, including hate speech, discriminatory language, harassment, and more. It’s a promising development that complements Call of Duty’s existing moderation efforts.
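As an illustration only, here is a minimal Python sketch of the general shape such a real-time voice moderation pipeline might take: transcribe an audio chunk, classify the text against toxicity categories, and flag high-confidence hits for review. Every name in it (transcribe, classify, the Category enum) is hypothetical; ToxMod is proprietary, and nothing here reflects Modulate’s actual implementation.

```python
# Hypothetical sketch of a real-time voice moderation pipeline.
# All names are invented for illustration; ToxMod's internals are not public.
from dataclasses import dataclass
from enum import Enum, auto


class Category(Enum):
    HATE_SPEECH = auto()
    DISCRIMINATORY_LANGUAGE = auto()
    HARASSMENT = auto()


@dataclass
class Flag:
    speaker_id: str
    transcript: str
    category: Category
    confidence: float


def transcribe(audio_chunk: bytes) -> str:
    # Stand-in for a speech-to-text step; a production system would run
    # an ASR model here instead of decoding bytes directly.
    return audio_chunk.decode("utf-8", errors="ignore")


def classify(text: str) -> list[tuple[Category, float]]:
    # Toy keyword lookup standing in for an ML toxicity classifier.
    lexicon = {
        "slur": Category.HATE_SPEECH,
        "threat": Category.HARASSMENT,
    }
    return [(cat, 0.9) for word, cat in lexicon.items() if word in text.lower()]


def moderate(speaker_id: str, audio_chunk: bytes, threshold: float = 0.8) -> list[Flag]:
    text = transcribe(audio_chunk)
    return [
        Flag(speaker_id, text, cat, conf)
        for cat, conf in classify(text)
        if conf >= threshold  # only high-confidence hits reach enforcement review
    ]


if __name__ == "__main__":
    print(moderate("player42", b"that was a slur"))
```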
Holistic Moderation Approach
Call of Duty’s commitment to fostering a positive gaming community extends beyond voice chat moderation. The franchise already filters in-game text, including chat messages and usernames, across 14 languages. Additionally, a robust in-game player reporting system empowers players to report disruptive behavior effectively.
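For a rough sense of what multilingual text filtering involves, here is a deliberately simplified sketch. The word lists, language codes, and function names are invented for this example; Call of Duty’s actual filter is not public and is certainly far more sophisticated.

```python
# Illustrative sketch only: a minimal multilingual chat/username filter.
import re
import unicodedata

BLOCKLISTS: dict[str, set[str]] = {
    "en": {"badword"},
    "de": {"schimpfwort"},
    # ... one list per supported language (the article mentions 14)
}


def normalize(text: str) -> str:
    # Fold accents and case so simple evasions ("Bädword") still match.
    folded = unicodedata.normalize("NFKD", text)
    return "".join(c for c in folded if not unicodedata.combining(c)).lower()


def contains_blocked(text: str, lang: str) -> bool:
    words = re.findall(r"\w+", normalize(text))
    return any(w in BLOCKLISTS.get(lang, set()) for w in words)


print(contains_blocked("Bädword incoming", "en"))  # True
```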
Beta Rollout and Global Release
The voice chat moderation technology began an initial beta rollout in North America on August 30 within the existing titles Call of Duty: Modern Warfare II and Call of Duty: Warzone™. A full worldwide release (excluding Asia) is planned to coincide with the launch of Call of Duty: Modern Warfare III on November 10th. Moderation is initially available in English, with support for additional languages to follow.
Positive Impact and Player Engagement
Call of Duty’s existing anti-toxicity measures have already yielded positive results. Since the launch of Modern Warfare II, over 1 million accounts have been restricted for violating the Call of Duty Code of Conduct. Notably, 20% of players who received a first warning did not reoffend, suggesting that clear feedback helps players correct their behavior. Those who did reoffend faced account penalties, including voice and text chat bans.
Addressing False Reporting
The collaboration with partner studios has led to important policy updates. One such update is the inclusion of Malicious Reporting in the Call of Duty Security and Enforcement Policy. This step was taken to combat the rise in false reporting within the game, ensuring that the reporting system remains fair and just.
The Community’s Role
Call of Duty’s commitment to creating a safer gaming environment would not be possible without the active involvement of its player community. Players are encouraged to continue reporting disruptive behavior as part of the ongoing effort to reduce and eliminate toxic conduct in the game.
A Brighter Future for Call of Duty
The introduction of real-time voice chat moderation marks a significant stride towards a more enjoyable and respectful gaming experience within the Call of Duty franchise. Combining cutting-edge technology with community engagement, Call of Duty is on a mission to make its games fair and fun for all. As the gaming industry evolves, the commitment to combating toxicity remains at the forefront, ensuring that the virtual battlegrounds of Call of Duty are spaces where every player can thrive.
This announcement sets a new standard in the gaming industry, and it’s a testament to Call of Duty’s dedication to its players and the gaming community as a whole. Stay tuned for more updates as Call of Duty continues to shape the future of online gaming moderation.
