When Call of Duty first got multiplayer, fans loved the in-game communication. Being able to talk to your teammates while coordinating an attack on the enemy felt like a whole new level of gameplay. However, players quickly discovered how much toxicity in-game mics could bring out.

Now there is a new development in the way Activision tackles toxicity. Partnering with Modulate, Activision has worked to bring AI moderation to Call of Duty in order to reduce toxic comms in the game.

Call of Duty combats toxicity with new tools

The issue of toxicity in CoD is not a recent one. It has existed for as long as players have had mics and could speak to one another in-game. What Activision can control is how toxic players are moderated and subsequently dealt with. The plan is to bring in AI technology named ToxMod through a partnership with a company called Modulate.

What sets this apart is the way it works. ToxMod will operate in real time to identify toxic behaviors such as hate speech, harassment, and discrimination. Text-based filtering will also run across 14 separate languages. Additionally, the system will reinforce the reporting tools that already exist within the game.
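
To picture that flow at a very high level, here is a minimal sketch of a flag-as-you-go pipeline. It is not Modulate's actual code or API; names like Utterance, score_utterance, and TOXICITY_THRESHOLD are invented for illustration, and it assumes each voice clip has already been transcribed to text.

```python
# Illustrative sketch only -- not ToxMod's real code or API.
# It mimics the described flow: score an utterance for toxicity
# categories in real time and flag anything above a threshold.
from dataclasses import dataclass

TOXICITY_THRESHOLD = 0.8  # hypothetical cutoff for flagging


@dataclass
class Utterance:
    player_id: str
    language: str    # e.g. "en", "fr" -- one of the supported languages
    transcript: str  # text produced from the voice clip


def score_utterance(utt: Utterance) -> dict[str, float]:
    """Stand-in classifier: returns a score per toxicity category.

    A real system would use trained models; this fakes it with a
    tiny keyword list just to show the shape of the pipeline.
    """
    keywords = {"hate_speech": ["<slur>"], "harassment": ["trash", "uninstall"]}
    text = utt.transcript.lower()
    scores = {}
    for category, words in keywords.items():
        hits = sum(word in text for word in words)
        scores[category] = min(1.0, hits / max(1, len(words)))
    return scores


def flag_if_toxic(utt: Utterance) -> dict | None:
    """Return a flag record if any category crosses the threshold."""
    scores = score_utterance(utt)
    worst = max(scores, key=scores.get)
    if scores[worst] >= TOXICITY_THRESHOLD:
        return {"player": utt.player_id, "category": worst, "score": scores[worst]}
    return None


if __name__ == "__main__":
    clip = Utterance("player_42", "en", "You're trash, uninstall the game")
    print(flag_if_toxic(clip))  # flags 'harassment' for player_42
```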

READ MORE: From Modern Warfare II Vault Edition to Black Ops Cold War: Activision Reveals Massive Call of Duty Sale To Excite Fans Ahead of 2023’s New Premium

ToxMod's website elaborates on how its system works: "ToxMod triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and enables moderators to quickly respond to each incident by supplying relevant and accurate context."

Apart from this, neither Modulate nor Activision has revealed much about what the system entails. That could be a smart move for Activision, since sharing too many details could give players ideas on how to dodge the system. There is also a built-in check: the AI cannot ban players directly. Instead, it only submits reports to Activision moderators, who then review them before any action is taken.
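
As a rough illustration of that safeguard, here is a small sketch of a flag-then-review loop. The names (review_queue, submit_report, moderator_review) are hypothetical, not Activision's or Modulate's real tooling; the point is simply that the automated side can only file reports, while a human moderator makes every enforcement decision.

```python
# Illustrative sketch only -- names and structure are assumptions.
# The AI side can only queue reports; a human drains the queue and
# decides the outcome, so no ban is ever issued automatically.
from collections import deque

review_queue: deque[dict] = deque()


def submit_report(flag: dict, context: str) -> None:
    """AI side: queue a report along with conversation context."""
    review_queue.append({"flag": flag, "context": context, "status": "pending"})


def moderator_review(decide) -> list[dict]:
    """Human side: review each report; only this step triggers action."""
    resolved = []
    while review_queue:
        report = review_queue.popleft()
        report["status"] = decide(report)  # e.g. "no_action", "warning", "ban"
        resolved.append(report)
    return resolved


if __name__ == "__main__":
    submit_report({"player": "player_42", "category": "harassment", "score": 1.0},
                  context="repeated insults after a lost match")
    # A moderator reviews the context and decides; the AI never bans directly.
    print(moderator_review(lambda report: "warning"))
```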

What they did reveal is that the system considers factors such as a player's emotion and volume when speaking. ToxMod has already been deployed in several other titles, such as Among Us.

Call of Duty plans to bring the system fully into place with the release of Modern Warfare III. For now, it has already rolled out to North American players in Modern Warfare II as well as Warzone 2. Only time will tell how effective the system really is.

WATCH THIS STORY: Legendary Mission Impossible Composer Is Returning to Call of Duty for Modern Warfare III
