Published: 8th September 1:17PM
In the gaming world, the battle against toxicity and disruptive behavior has been an ongoing challenge. Call of Duty has taken a significant step forward in addressing this issue. With the upcoming release of Modern Warfare III on November 10th, real-time voice chat moderation is being introduced.
A New Era of Voice Chat Moderation
Call of Duty’s anti-toxicity team is dedicated to creating a more inclusive gaming environment. The introduction of voice chat moderation, powered by Modulate’s ToxMod technology, is a game-changer. This advanced AI-powered system can identify and address toxic speech in real time, including hate speech, discriminatory language, harassment, and more. It’s a promising development that complements Call of Duty’s existing moderation efforts.
Holistic Voice Chat Moderation Approach
Call of Duty’s commitment to fostering a positive gaming community extends beyond voice chat moderation. The franchise has already implemented text-based filtering across 14 languages for in-game text, including chat messages and usernames. Additionally, a robust in-game player reporting system empowers players to report disruptive behavior effectively.
Beta Rollout and Global Release
The voice chat moderation technology will undergo an initial beta rollout in North America on August 30, going live within the existing games Call of Duty: Modern Warfare II and Call of Duty: Warzone. A full worldwide release, excluding Asia, is planned for the launch of Modern Warfare III on November 10th. Initially available in English, support for additional languages will follow in due course.
Positive Impact and Player Engagement
Call of Duty’s existing anti-toxicity measures have already yielded positive results. Since the launch of MWII, over 1 million accounts have faced restrictions due to violations of the Code of Conduct. Notably, 20% of players who received a first warning did not reoffend, while those who did reoffend faced account penalties, including voice and text chat bans.
Addressing False Reporting
The collaboration with partner studios has led to important policy updates. One such update is the inclusion of Malicious Reporting in the Call of Duty Security and Enforcement Policy. This step was taken to combat the rise in false reporting within the game, ensuring that the reporting system remains fair and just.
The Community’s Role in Voice Chat Moderation
CoD’s commitment to creating a safer gaming environment would not be possible without the active involvement of its player community. Players are encouraged to continue reporting disruptive behavior as part of the ongoing effort to reduce and eliminate toxic conduct in the game.
A Brighter Future for Call of Duty with Voice Chat Moderation
By combining cutting-edge technology with community engagement, Call of Duty is on a mission to make its games fair and fun for all. As the gaming industry evolves, the commitment to combating toxicity remains at the forefront, ensuring that the virtual battlegrounds of Call of Duty are spaces where every player can thrive.
This announcement sets a new standard in the gaming industry, and it’s a testament to Call of Duty’s dedication to its players and the gaming community as a whole. Stay tuned for more updates as Call of Duty continues to shape the future of online gaming moderation.