Modulate Partners with the Anti-Defamation League to Combat Hate and Toxicity in Online Gaming
Modulate, creator of voice technology that elevates the health and safety of online communities, today announced that it has partnered with ADL (the Anti-Defamation League) to support the organization's Center for Technology and Society (CTS) in scaling its work to combat hate in online gaming spaces.
ADL has a proud history of monitoring antisemitism and extremism online to protect Jewish communities and other marginalized groups across the country. Since early 2017, ADL's Center for Technology and Society has led the charge against cyber hate, convening key stakeholders such as technology companies, NGOs, and academics to measure and stop its spread. As ADL and CTS continue to expand their research in the games industry, Modulate will apply these learnings to the continued development of ToxMod and other products designed to help studios and development teams fight hate online.
“ADL is thrilled to partner with Modulate to advance our shared goals of a hate-free digital world,” said Daniel Kelley, Director of Strategy and Operations at CTS. “Digital spaces, especially online gaming communities, offer wonderful opportunities for entertainment and positive interpersonal experiences but have the potential to become havens of online hate, with over 80% of adult gamers experiencing harassment.”
“The ADL’s research has been a driving force in Modulate’s development of ToxMod to fight toxic behavior in online games,” said Hank Howie, Games Industry Evangelist at Modulate. “Becoming an ADL Corporate Partner Against Hate is a real validation of our efforts to address the growing problem of toxic behavior.”
ToxMod is gaming's only proactive, voice-native moderation solution. Built on advanced machine learning technology and designed with player safety and privacy in mind, ToxMod triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and enables moderators to respond quickly to each incident by supplying relevant and accurate context. In contrast to reactive reporting systems, which rely on players making the effort to report bad behavior, ToxMod enables studios to respond to toxic behavior proactively and prevent harm from escalating.
For more games industry news, be sure to follow @XONEHQ on Twitter, YouTube, Instagram, Facebook, and Pinterest, download the free XBOX app for Android, and stay tuned!