Technology company Modulate Inc. is introducing artificial intelligence that moderates online gaming chat in real time, attempting to take the next step in solving the rampant issue of player toxicity in online gaming.

Smack talk from players has essentially existed since online gaming's inception. However, recent reports suggest online behavior is as bad as ever, recently spurring a Respawn Entertainment developer to criticize Apex Legends players for their toxicity.

RELATED: Nvidia Aims to Harness AI For Medical Imaging

Modulate's new moderation service is called ToxMod. Touted as the world's first voice-native moderation service, ToxMod uses voice itself as the medium, allowing it to moderate more effectively than plain text moderation. In other words, the program can detect how something is said, whereas older tech monitors only what was said.

Modulate Co-Founder and CEO Mike Pappas stated that developers are now able to take more detailed, nuanced approaches to dealing with problematic speech. Instead of simply shutting a conversation off entirely, ToxMod enables devs to block individual words or phrases, such as racial slurs. The AI can also isolate personal information like phone numbers or other identity-related details.
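Modulate has not published how ToxMod implements this, but the kind of selective blocking and redaction described above can be sketched with a simple transcript filter. Everything here, from the function names to the phrase list and the phone-number pattern, is an illustrative assumption, not Modulate's actual API.

```python
import re

# Placeholder phrase list a developer would supply (hypothetical).
BLOCKED_PHRASES = {"example_slur"}

# Simple US-style phone number pattern, purely for illustration.
PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def moderate_transcript(text: str) -> str:
    """Mask blocked phrases and redact phone numbers instead of
    cutting off the whole conversation."""
    for phrase in BLOCKED_PHRASES:
        text = re.sub(re.escape(phrase), "***", text, flags=re.IGNORECASE)
    return PHONE_RE.sub("[redacted]", text)

print(moderate_transcript("call me at 555-123-4567"))
# -> call me at [redacted]
```

The point of the sketch is the granularity: individual tokens are masked while the rest of the conversation is left intact.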


Similar to what Overwatch is doing to combat chat toxicity, ToxMod utilizes machine learning models to understand exactly how players are saying what they're saying. This includes factors like emotion, volume, and inflection, which is intended to yield more accurate predictions about whether a specific statement's intent is malicious or disruptive. Moderators can then identify the involved parties and reach a solution before the issue turns into something more severe. ToxMod processes each player's data in real time and only sends out audio clips if toxicity has been detected, thus (mostly) preserving privacy.
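The triage flow described above can be sketched as a scoring step followed by a threshold check: each clip is scored on several vocal and lexical signals, and only clips that cross the threshold are escalated for review. The signal names, weights, and threshold below are illustrative assumptions, not Modulate's actual model.

```python
from dataclasses import dataclass

@dataclass
class ClipSignals:
    anger: float        # 0..1, from an emotion classifier (assumed)
    volume: float       # 0..1, normalized loudness
    harsh_words: float  # 0..1, from a transcript/keyword model

TOXICITY_THRESHOLD = 0.6  # assumed cutoff

def toxicity_score(s: ClipSignals) -> float:
    # Weighted blend of "how it was said" and "what was said".
    return 0.4 * s.anger + 0.2 * s.volume + 0.4 * s.harsh_words

def should_escalate(s: ClipSignals) -> bool:
    # Only clips above the threshold leave the real-time pipeline,
    # which is how the article's privacy claim would hold up:
    # everything below it is discarded without human review.
    return toxicity_score(s) >= TOXICITY_THRESHOLD

print(should_escalate(ClipSignals(anger=0.9, volume=0.8, harsh_words=0.7)))  # True
print(should_escalate(ClipSignals(anger=0.1, volume=0.3, harsh_words=0.0)))  # False
```

The design choice worth noting is the asymmetry: scoring happens on everything in real time, but audio only persists past that step when the score demands it.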

While many gamers may see this piece of technology as somewhat invasive, ToxMod isn't the first to actively moderate gaming chat speech. Gamers who've managed to snag an elusive PlayStation 5 may want to be aware that PS5 users can record party chat and report it to Sony. A key difference, however, is that Sony leaves it to players to report each other, whereas ToxMod takes a more proactive approach. Either way, the gaming industry appears to be trying to make significant changes to purge as much online gaming toxicity as possible.

While online games have long been a catalyst for player toxicity, one game studio's new title aims to do the opposite. Assembled by a team of ex-Riot developers, Vela Games is opening up testing for its new title Project-V, designed specifically to combat toxicity and re-imagine co-op. Hopefully, it won't mandate any AI eavesdropping.

MORE: Is Call of Duty: Black Ops Cold War's Toxicity Out of Control?

Source: Skewed & Reviewed