We're excited to be joining the speaker lineup at The AI Summit Series in New York this year! We often hear about the risks and dangers associated with AI technologies, but how might AI be used to help keep people safe? Join CEO & Co-founder Mike Pappas for a talk on "Using AI to Create Safer Online Spaces" on the Game Developers Conference Stage. Check out the full speaking lineup for #TheAISummit here: https://bit.ly/4hXLuFj
Modulate
Software Development
Somerville, Massachusetts 2,312 followers
We make voice chat safe
About us
We are a Boston-based startup using machine learning and AI technologies to create safer & more inclusive voice chat experiences with the revolutionary #ToxMod platform.
- Website
- http://modulate.ai
- Industry
- Software Development
- Company size
- 11-50 employees
- Headquarters
- Somerville, Massachusetts
- Type
- Privately Held
- Founded
- 2017
- Specialties
- Machine Learning, Audio Processing, Voice Chat, AI, Content Moderation, Trust and Safety, Analytics, Voice Analytics, Sentiment Analysis, Compliance, Audio Engineering, and Recording
Products
ToxMod
Content Moderation Software
ToxMod is gaming’s only proactive, voice-native moderation solution. Built on advanced machine learning technology and designed with player safety and privacy in mind, ToxMod triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and enables moderators to quickly respond to each incident by supplying relevant and accurate context. In contrast to reactive reports, which rely on players to make the effort to report bad behavior, ToxMod is the only voice moderation solution in games today that enables studios to respond proactively to toxic behavior and prevent harm from escalating. Developers and game studios of all sizes use ToxMod to reduce churn, delight players, and build healthier communities.
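The flag-then-analyze-then-escalate flow described above can be illustrated with a minimal sketch. This is purely a hypothetical toy, not Modulate's actual implementation: the cheap keyword screen stands in for a lightweight first-pass model, the density score stands in for the heavier nuance analysis, and all names and thresholds are invented.

```python
from dataclasses import dataclass

@dataclass
class Clip:
    speaker: str
    text: str  # hypothetical: downstream of speech-to-text

# Stage 1: cheap screen (stands in for a lightweight first-pass model).
HARM_TERMS = {"threat", "slur"}

def flag(clip: Clip) -> bool:
    return any(term in clip.text.lower() for term in HARM_TERMS)

# Stage 2: heavier "nuance" analysis (stands in for a larger model);
# here just a toy score based on harmful-term density.
def score(clip: Clip) -> float:
    words = clip.text.lower().split()
    hits = sum(w in HARM_TERMS for w in words)
    return hits / max(len(words), 1)

def triage(clips, escalate_at=0.2):
    """Return (clip, score, context) tuples worth a moderator's time."""
    incidents = []
    for i, clip in enumerate(clips):
        if not flag(clip):  # most clips exit cheaply at stage 1
            continue
        s = score(clip)
        if s >= escalate_at:
            context = clips[max(0, i - 2):i]  # preceding clips as context
            incidents.append((clip, s, context))
    return incidents
```

The point of the layering is cost: the expensive analysis only runs on the small fraction of clips the cheap screen flags, and moderators only see incidents that clear the escalation threshold, already bundled with surrounding context.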
Locations
- Primary
212 Elm St
300
Somerville, Massachusetts 02144, US
Employees at Modulate
Updates
-
Modulate reposted this
Thanks Davit Baghdasaryan for inviting me onto the #VoiceAI podcast! I really enjoyed getting to share more about how Modulate has expanded: from starting with one of the hardest challenges (differentiating gaming banter from real hostility) to commercial applications in call centers and gig work platforms...and soon, to so much more!
Did you know?
• ~35% of players experience toxicity
• ~35% of players exposed to toxicity are likely to churn
• ~10% of the playerbase churns due to toxicity

How can Voice AI help? In this interview, Mike Pappas, CEO & Co-Founder of Modulate, and I do a deep dive into AI voice moderation. Here’s what stood out to me most 👇
1) Modulate pivoted from voice-changing tech to voice moderation, driven by the need for safe, engaging social and gaming spaces.
2) The ToxMod tool analyzes voice in real time to detect toxicity in gaming.
3) The triaging system layers AI models to identify harmful behavior while minimizing privacy risks and costs.
4) Voice moderation must handle the nuances of language and emotion, going beyond simple word detection to assess intent.
5) Most toxicity detection happens on-device, reducing the need to process every conversation.
6) The gaming industry’s shift toward social experiences highlighted the importance of moderation in keeping players coming back.
7) Developer priorities have shifted toward safety and community engagement rather than just gameplay mechanics.
8) In Call of Duty, ToxMod led to a 10% monthly reduction in repeat offenses, and reducing toxic behavior boosted user retention by 25%.
9) Privacy-preserving tech is essential as more platforms adopt real-time monitoring.
10) Modulate is starting to work with games to detect positive behavior and reward it.
11) Voice AI is reshaping how developers collect feedback, giving them real-time insights into user sentiment and product impact.
12) Modulate plans to extend its voice moderation to other social applications, like dating apps, to maintain safe and respectful interactions.
13) Modulate sees the gig economy as a potential expansion area for ToxMod, where call centers are decentralized and workers lack voice protection tools.
14) Mike believes there's a place for humans talking to bots, but that people deeply value talking to other humans.
Mike, thanks for your time and insights 🙏 Full interview here 👉 https://lnkd.in/eBRSfvn7
-
We're LIVE with our latest Prosocial Hour, in which our own CEO Mike Pappas is discussing the Online Safety Act with k-ID's Kieran Donovan. Join us!
In just one month, Ofcom is slated to publish the #OnlineSafetyAct content risk assessment guidance and the first version of the illegal harms codes of practice, requiring in-scope companies to report on risk assessment and mitigation measures 😱

Join us for another episode of Prosocial Hour, hosted by Modulate! This month, CEO & Co-founder Mike Pappas is joined by Kieran Donovan, CEO & Co-founder of k-ID, to chat about the UK's OSA:
• The OSA's current impact and any tangible improvements to online safety.
• The immediate and long-term impact we expect in the gaming sector and beyond once the soon-to-be-released Ofcom guidance lands.
• How platform leaders should prepare themselves and their teams to meet the OSA's compliance requirements.
• What impact the OSA might have on other countries' online safety regulation.
Prosocial Hour: What's Next for the Online Safety Act?
www.linkedin.com
-
Part 2 of our guide to VoIP for gaming is out now! Zachary N. shares his insights into navigating the sometimes-intimidating process of selecting and integrating VoIP into your game. In Part 2, Zach gives an overview of authentication, plus tips for testing and deploying. Read more here: https://bit.ly/4fIBRZc #GamesTech #VoIP #GameDev
-
Modulate is #hiring a Senior Data Analyst! We're looking for a qualified analyst to join our Customer Success team. Here's some of what you'll be working on:
• Guide the way through complex data by telling meaningful stories through intuitive reports and visualizations
• Extract and manipulate data from SQL databases
• Utilize Python for statistical analysis and visualization
• Identify trends, patterns, and anomalies in data
• Communicate technical insights to non-technical stakeholders

At Modulate, we make #ToxMod, voice chat moderation technology that uses machine learning models to identify the worst behaviors happening across millions of voice chats, allowing Trust & Safety teams to take action against racism, threats of violence, and other kinds of harm at scale.

Learn more about the role and how to apply here: https://bit.ly/48Joupf

#DataAnalystJob #StartUpJob #JobSearch #JobHunt #NowHiring
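For candidates wondering what the SQL-plus-Python workflow in those bullets might look like day to day, here's a minimal, self-contained sketch. The `sessions` table, its columns, and the anomaly rule are all invented for illustration; this is not Modulate's schema or methodology.

```python
import sqlite3
import statistics

# Hypothetical schema: per-player voice-chat toxicity observations.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sessions (player TEXT, toxicity REAL)")
conn.executemany(
    "INSERT INTO sessions VALUES (?, ?)",
    [("a", 0.1), ("a", 0.9), ("b", 0.2), ("b", 0.3), ("c", 0.8)],
)

# Extract and manipulate data from SQL: per-player average toxicity.
rows = conn.execute(
    "SELECT player, AVG(toxicity) FROM sessions GROUP BY player ORDER BY player"
).fetchall()

# Use Python for simple statistical analysis: flag players whose
# average sits more than one standard deviation above the mean.
scores = [avg for _, avg in rows]
mean, stdev = statistics.mean(scores), statistics.pstdev(scores)
outliers = [player for player, avg in rows if avg > mean + stdev]
```

In practice the aggregation would run against a production database and the results would feed the kind of reports and visualizations the role describes; the shape of the work (query, summarize, flag anomalies, explain) is the same.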
-
Congrats to our friends at Rec Room for their expansion onto Nintendo Switch consoles! Rec Room is coming to Switch™ November 6th! https://bit.ly/4fQ5pUT
-
The November Trust & Safety Lately newsletter is comin' at ya a little early here on LinkedIn. Something about an #election coming up...

Here's a preview of what we've got in this issue:
✅ X Pushes Political Content
☢️ Anti-toxicity in Call of Duty: Black Ops 6
💃🏽 ByteDance Turns to AI for Moderation
💬 Ofcom: "Enough talk!"

Activision | ByteDance | TikTok | Ofcom | X
November 2024
Modulate on LinkedIn
-
Modulate reposted this
Thanks to Boston Business Journal for the honor of being included in your #40under40 list - and for the truly unique questions that allowed me to share my story. I've benefited from so many wonderful members of the Boston community, and it's humbling to be included alongside such incredible people. And just in time for 🎃 Halloween 🎃, this was also a chance to reveal my favorite candy - Mounds. Which I've been told by many people in my life is a problematic choice, but hey, I gotta be me. What's the unusual candy preference that reveals something about who you are? https://lnkd.in/eKuCs_Ey
Mike Pappas, CEO of Modulate - Boston Business Journal
bizjournals.com
-
We had a fantastic discussion with Yasmin Hussain, Head of Trust & Safety at Rec Room, and Imran Khan on the evolving landscape of voice moderation in online gaming! 🎮 From ensuring player-to-player interactions remain positive to addressing the unique challenges of VR, our fireside chat provided valuable insights for building safer, more inclusive, and longer-lasting online and gaming communities. Voice chat is essential for immersive gameplay, but unmoderated conversations can quickly turn toxic. Platforms need to invest in a user safety strategy that ensures long-term player engagement. Thanks to Yasmin H., Mark F., and Imran Khan (VentureBeat) for the opportunity! #VoiceModeration #OnlineSafety #TrustAndSafety