
The Sound of Safety: How Modulate is Cleaning Up Voice Chat & Beyond with Mike Pappas

Player Driven
31 Plays · 5 days ago

Mike Pappas discusses Modulate's ToxMod, a sophisticated voice AI that analyzes emotional nuance and behavioral dynamics in online interactions. Initially focused on combating toxicity and promoting positivity in gaming, the technology is now expanding to address fraud and challenges like deepfakes. Mike emphasizes the importance of transparency with users, collaboration with studios based on their codes of conduct, and a privacy-centric approach focusing on behaviors, not individuals.

Key Themes & Insights:

  • Advanced Voice Analysis: ToxMod interprets emotion, nuance, and behavioral impact in voice, not just words, to identify harmful or positive interactions.
  • Studio Collaboration & Transparency: Modulate tunes ToxMod to studio-specific guidelines, with studios making final decisions. Open communication with players about moderation is crucial.
  • Privacy-Conscious Moderation: The system focuses on identifying harmful or positive behaviors exhibited by users, rather than building profiles of individuals.
  • Expanding Beyond Gaming: Originally for gaming toxicity, Modulate's tech is now being applied to detect fraud and improve interactions in other sectors, like call centers.
  • Addressing Evolving Threats: The technology can help identify synthetically generated audio often used in scams, treating it as a form of fraud when coupled with problematic behavior.

Subscribe to Player Driven for more insights and share this episode!

Transcript

Introduction and Episode Preview

00:00:00
Speaker
Welcome to Player Driven. Here's what you're about to listen to on today's episode. Today we're talking to Mike Pappas. He is the CEO and co-founder of Modulate. Modulate provides their ToxMod service for in-game protection of players through voice moderation.

Modulate's ToxMod and Voice Intelligence

00:00:17
Speaker
We talk about the cutting edge of voice intelligence and how they're able to detect emotion, nuances, and behavioral dynamics in voice chat. We talk about building trust through transparency and collaboration and how they work with studios and players alike to help understand what the technology does.
00:00:33
Speaker
You can hear how they're

Collaboration with Community Clubhouse

00:00:34
Speaker
expanding their voice tools to protect people in other industries as well from things like fraud. It's a really great episode, and this is another collaboration between Community Clubhouse and Player Driven. Community Clubhouse is a great place for anyone in the gaming industry to start learning about best practices, trust and safety, and community support. They're there to provide free courses as well as in-person events at GDC and Gamescom to help spread these best practices to everyone. It's a great group, and it's highly recommended for anyone in the industry to check them out. There's no cost; you can just learn from the best. With that, I hope you enjoy the rest of this episode. If you
00:01:12
Speaker
haven't liked Player Driven on the socials yet, please follow us on LinkedIn, TikTok, Instagram, or YouTube. We're on all the social media platforms, so be sure to follow us for clips and other best practices. I hope you enjoy today's episode.
00:01:30
Speaker
Good morning, everybody, and welcome to Player Driven. Today, we are continuing our Community Clubhouse sponsored episodes, and we are talking to Mike Pappas, the CEO and co-founder of Modulate.

Role of ToxMod in Online Safety

00:01:43
Speaker
Modulate has created one of their major products, ToxMod, to help protect online communities and players from bad behavior online. We'll learn more about that, but it's a great technology. We've spoken to Mike in the past, as well as other people from Modulate. It's a really cool tool. I'm excited to talk more about

Understanding Emotions in Voice Chat

00:02:01
Speaker
this. This is a topic we're all passionate about.
00:02:03
Speaker
Mike, thank you for coming to the show. How are you doing on this Friday? Thank you very much for having me, Greg. I'm doing well and excited to dive into the chat. Yeah, I'm excited. You've been posting a lot about fraud protection recently online, and you're making a bit of a pivot, not a

Broadening Focus Beyond Gaming

00:02:21
Speaker
complete pivot, just kind of a vertical pivot, to take a look at how we can apply this technology in other verticals.
00:02:26
Speaker
Before we break it down, do you want to do a better job of explaining what Modulate is to our audience? You did a fine job, but I'm happy to add my spin to it. And I think it connects to that so-called pivot you're talking about, which I'd say is really just a broadening of what the tool is focused on.
00:02:43
Speaker
So Modulate builds pro-social voice intelligence technology. That's a bunch of keywords, but what it ultimately means is we are extremely good at understanding conversational voice: not just what's being said, but the emotion and the nuance of it, the behavioral dynamics.
00:03:01
Speaker
If I say something and then you suddenly go into a shocked silence, we can notice those kinds of behavioral impacts that tell us, is this conversation going the way that we wanted it to, or is there some kind of key event happening?
00:03:14
Speaker
As you noted, the sort of first thing we did with that technology is we went to games and we said, hey, we can use this to help you protect your community from bad behavior. And as we've expanded over time, it's also been to promote and enable good behavior.
00:03:27
Speaker
We work with some studios on detecting that positivity. Everything from, hey, this user is really good at defusing toxicity, to, this is a user that's really welcoming to a new player and helps coach them in the game.

Identifying Synthetic Voices and Deepfakes

00:03:40
Speaker
And then, tying into this fraud piece that you mentioned, that conversational voice intelligence can look for all kinds of different types of harms or positive actions.
00:03:51
Speaker
It's not itself restricted to toxicity. We wanted to start there because we knew that was such an immediate, poignant problem for so many gamers.

Trust Evolution in Gaming

00:04:00
Speaker
But as we've been thinking more and talking more to people about what else this technology could really provide in society, fraud comes up a lot and does a lot of damage in the world, especially with the advent of things like deepfakes. So we've been really excited to see just a broadening of the impact that we're able to have out there.
00:04:19
Speaker
That's fascinating, that you can start to cluster individuals based on their actions and what they talk about online. And this is like a wild thought, but can you identify deepfake videos? Can you identify things like that? I mess around with CapCut a lot now, and you can change the voices and all this stuff. Like, can you identify that this is an AI
00:04:45
Speaker
bot's voice versus a real person's? We can do that pretty well, and I'll speak to

Modulate's Collaboration with Studios

00:04:50
Speaker
that. I also want to just clarify: we look at behaviors more than we look at individuals. And I just want to bring that up because I know this is a very sensitive privacy topic and we take that very seriously.
00:05:02
Speaker
We're not building up, oh, here is our understanding of Greg as a person, he's playing these three different games, or something like that. That's not our business. It's not our interest. It is, hey, we're looking for indications of fraud, of toxicity, of positivity.
00:05:19
Speaker
And it happens to be that user 372 just exhibited that behavior. The game would know user 372 happens to be Greg, but we're not building up that profile.
00:05:30
Speaker
Putting that clarification to the side, deepfakes are really interesting. From an audio standpoint, digital audio is digital audio, however it was

Community Feedback and Moderation Efforts

00:05:40
Speaker
sort of created.
00:05:41
Speaker
There's no inherent signal in it saying this was synthetic. And in some ways, it's all synthetic. Once we record it and put it into the computer, we're running compression, we're running different algorithms over it.
00:05:53
Speaker
So we can often detect falsified audio. But we encapsulate that as one kind of fraud, especially because there are legitimate uses for synthetic audio. There are some people who can't speak for various reasons and use text-to-speech to augment their ability to interact in the world.
00:06:14
Speaker
There are folks with voice dysphoria who use voice changers to better express their identity. We don't want to say

Studio Collaboration on Conduct Policies

00:06:20
Speaker
every indication of syntheticness is harmful. So instead we say, all right, we're looking for those signs of syntheticness coupled to other kinds of problematic behavior that indicate that this is someone not just expressing themselves differently, but actually intending to manipulate your impression of them in a negative way.
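The coupling Mike describes can be sketched in a few lines. This is purely illustrative, not Modulate's actual API: the function name, behavior flags, and threshold are hypothetical. The point is that a likely-synthetic voice is not flagged on its own, only when it co-occurs with independently problematic behavior.

```python
def assess_fraud_risk(synthetic_score: float, behavior_flags: list) -> bool:
    """Return True only when a likely-synthetic voice co-occurs with
    independently problematic behavior signals. (Hypothetical sketch;
    names and the 0.8 threshold are illustrative, not Modulate's.)"""
    likely_synthetic = synthetic_score > 0.8
    problematic = any(flag in ("scam_pattern", "impersonation", "harassment")
                      for flag in behavior_flags)
    return likely_synthetic and problematic

# A text-to-speech user behaving normally is left alone:
print(assess_fraud_risk(0.95, []))                # False
# A synthetic voice exhibiting a scam pattern is flagged:
print(assess_fraud_risk(0.95, ["scam_pattern"]))  # True
```

The design choice mirrors the transcript: syntheticness is one signal among several, never a verdict by itself, which protects legitimate users of text-to-speech and voice changers.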
00:06:42
Speaker
That's fascinating. I never thought about the use cases behind it, but that makes a lot of sense. Not everyone can communicate in the same manner, and it puts a worry on me again of, will AI take over one day? But back to gaming, right? What does trust look like in gaming today? We talk about this a lot, and I think trust and safety over the past probably five years has changed quite a bit. What is the concern? What is trust in gaming today?
00:07:11
Speaker
Yeah, I mean, if you look back a couple decades, the relationship that players had with games and game developers was a little bit more transactional. The game developer creates this experience, you go into the experience, you have a good time, and then eventually you're done.
00:07:28
Speaker
And so trust was very one-way in that sense: the game developer had an obligation to give you the kind of experience they told you they were going to give you. If it was a game rated E for Everyone, you shouldn't suddenly find yourself, you know, shooting up a bunch of people in the middle of a city.
00:07:43
Speaker
That was what that really meant at the time. But now we've moved more to this world of games as social platforms. And first of all, that introduces a lot of new kinds of both positive and negative behaviors as you have players interacting with each other

AI Challenges and Human Oversight

00:08:00
Speaker
online.
00:08:01
Speaker
But I think the more fundamental thing is it creates a different kind of sense of ownership of that space for the players. It no longer feels to a player like this is purely the developer's space that they get to control because the players are themselves contributing significantly to it and helping to cultivate and enrich that community.
00:08:23
Speaker
So they increasingly feel the need and right to have a say. And so to me, what trust looks like when done successfully in a game studio is navigating that tension of the studio saying, hey, there are some things we won't stand for, there are some things we expect of everyone, we are going to impose some rules, but doing it in a way that's transparent and engaged with the community, so that they actually feel like they're a part of it and not like this is just being thrown upon them, or even worse, done inconsistently or unfairly in a way that they can't

Learning and AI's Role in Development

00:09:00
Speaker
plan for and make informed decisions around.
00:09:04
Speaker
It's fascinating. Well, the rise of community, right? We're seeing it in a lot of places now. Kind of the response to the past few years, in my mind, is that now people are looking to build up that community again, do more things together. And when you look at a game, you know, I know you guys work with Activision, right? Like you're
00:09:29
Speaker
entering a house that has already been built. The foundation has been laid, and you need to start to alter and consider, how do we rebuild this foundation without knocking down the house? And you're probably looking at

Transition to Text-Based Communication

00:09:40
Speaker
one of the top three games that are out there, right? Like, how do you work with partners to take a look at
00:09:45
Speaker
a foundation that's there already and say, hey, this is how we have to work with your community to build this out? Yeah, I mean, there are two layers. There's us with the studio, and then there's us and the studio with the community.
00:09:57
Speaker
Yeah. So the way we work first with the studio is it's ultimately their code of conduct, their standards. We don't tell them what's permissible in their game. We don't tell them what actions to take against their users.
00:10:09
Speaker
We have to collect from them an understanding of what you consider acceptable, problematic, or rewardable behavior in your space. And we will tune our system against that.
00:10:21
Speaker
And then we will flag things that we believe meet that criteria for you, the studio, ultimately to review. So that's very important to us that we're leaving

Voice Changing to Toxicity Moderation

00:10:30
Speaker
the ultimate decision making power in the hands of the studio.
00:10:34
Speaker
Equally important, though, is us and the studio going to the community and transparently saying, here's what's happening, here's how it works, here's why. This is something that we push very, very hard with all of our partners and say, look, you have to tell your players not just that you're moderating voice chat, but why you're doing it.
00:10:55
Speaker
You're not doing it because you don't want anyone to ever have fun trash talking with their friends. And people who have seen misbehaving AI agents will rightly fear that that's going to be the consequence if you don't talk to them about this.
00:11:09
Speaker
Some of the studios that we've worked with, I think, have done a really good job proactively messaging this and laying out, look, you all know that we've had some of these toxicity problems before.

Broader Applications Beyond Gaming

00:11:20
Speaker
We really believe in the ability to, you know, have fun and poke fun with each other and do all these things. But there are some extremes, and we're trying to find a way to stop those extremes without stopping the fun.
00:11:32
Speaker
Here's our first attempt at it. We're going to be monitoring it. We're going to be tracking how well it's working. We're going to be adjusting it over time. And not only is that transparency valuable, but it also creates this path where you're saying to the community, hey, you as our community can help us iterate on this thing.
00:11:50
Speaker
You can send us examples of where you think it's misbehaving. You are a part of the process to ultimately reshape our community for the better. It's not just something that's happening to you.
00:12:03
Speaker
And that's really important. I think you mentioned a few points there which I think are important to highlight. One is that you're following the company's guidelines of what's okay, right? You provide the tools to help them figure out who the characters are that might be good, that might be bad, that might be saying things

Preventing Scams with Technology

00:12:20
Speaker
that are not heading in the right direction, but it's up to the studio to decide, how do we treat that? And I think it's super important. I know there's been some backlash to things like this online, with shadow bans or people saying that it's not working right, but I think it's part of understanding that this is a new technology that's growing, and it's there for the better at the end of the day.
00:12:43
Speaker
There are going to be people that are angry about technologies that could potentially listen to them, but if you're worried, I don't understand the sentiment there. You shouldn't have to be worried if you're just playing the game online, right? And now, as a father of two, I want my kids to be able to go out and have fun in these games, and be safe. If you think you're getting banned for dropping an F-bomb, that's not going to be the case. Or if we're playing Call of Duty and I say, I'm going to shoot you in the head, that's fine. But if we're playing Sudoku and for some reason we're talking to each other and I say, I'm going to shoot you in the head, it's a very different sentiment coming from
00:13:17
Speaker
a casual game than a

AI's Role in Customer Service

00:13:20
Speaker
multiplayer game. So I think it's, well, not growing pains. I think the community needs to do a better job of accepting that there are tools out there to help protect them, to make sure that everyone can game online safely. Yeah.
00:13:36
Speaker
There are so many different factors to balance. I will say, I get what you're saying of the, hey, if you're just behaving well, then there's nothing to fear. I'll actually say I'm somewhat sympathetic to the people that have fears there.
00:13:49
Speaker
AI can misbehave. AI can get things wrong. People have seen various attempts at AI that doesn't understand these dynamics. We've invested a lot, and had the opportunity to work deeply in the gaming space, to really get the difference between flirting and the same sentence that's being received as sexual harassment, between friendly trash talk and something that's being received as, again, harassment or hate speech.
00:14:18
Speaker
It's important to understand not just what the intent

Building Trust in AI Moderation

00:14:20
Speaker
is of the speaker, but how that's being received in the community space and what's expected there. So I get the skepticism from people.
00:14:29
Speaker
I just look for them to, you know, be open-minded enough to say, hey, I worry about this, but if the game developers are being open that they've looked into this, and they're using this tool, Modulate's ToxMod, that we've seen in other games successfully,
00:14:45
Speaker
maybe it's worth giving a little bit of benefit of the doubt and seeing if it can help here too. Do most of the studios that implement it have some sort of human in the loop to provide some review on the actions that are being taken against these individuals?
00:15:00
Speaker
Yeah. So there's always some human in the loop at the beginning. As they start to review the performance of ToxMod, they will sometimes say, hey, when ToxMod flags things above this score,
00:15:13
Speaker
It's right 99.9% of the time. And so it's worth just issuing a warning based on that. We don't need a

Proactive Moderation and Community Response

00:15:20
Speaker
human to review it. So as they get validation through the stats, they will start to transition more of that actioning to happen automatically for the very high confidence stuff.
00:15:32
Speaker
And then for the stuff that's mid-confidence, they'll keep that human in the loop to review it, until we're able to improve ToxMod to a sufficient level of accuracy that, again, we can go automatic.
00:15:43
Speaker
And of course, even if it's 99.9%, you've still got that 0.1%. And that's why you have your appeals process. And that's why you continue to iterate and improve on the system even past that point.
00:15:55
Speaker
It's unfortunately the case that you'll never get every single decision right. But at the very least, we want to make sure that if we get a decision wrong, there's not too much friction in the way of getting that corrected.
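The tiered flow Mike describes can be sketched as a simple routing function. The thresholds, field names, and outcome labels below are hypothetical, chosen only to illustrate the idea: auto-action the very-high-confidence flags a studio has validated, keep humans in the loop for mid-confidence ones, and drop the rest.

```python
# Hypothetical sketch of confidence-based flag routing; not Modulate's API.
AUTO_ACTION_THRESHOLD = 0.999   # studio-validated as ~99.9% accurate
HUMAN_REVIEW_THRESHOLD = 0.7    # below this, the flag is not surfaced

def route_flag(flag: dict) -> str:
    """Decide what happens to a moderation flag based on its confidence."""
    score = flag["confidence"]
    if score >= AUTO_ACTION_THRESHOLD:
        return "auto_warn"      # issue a warning without human review
    elif score >= HUMAN_REVIEW_THRESHOLD:
        return "human_review"   # queue for a moderator
    return "discard"            # too uncertain to act on

print(route_flag({"confidence": 0.9995}))  # auto_warn
print(route_flag({"confidence": 0.85}))    # human_review
print(route_flag({"confidence": 0.4}))     # discard
```

In practice, as the transcript notes, the auto-action tier only exists after the studio has reviewed enough flags to trust the stats, and an appeals process catches the residual errors.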
00:16:08
Speaker
I have a question that I don't know if it's going to make sense, so I'll try and ask it and we'll see if it works or not. You've been doing this now for maybe close to two years, maybe more than that. Do you find it's getting easier to understand the nuances and the predictability of people? Or, as quick as AI is moving, with what people are using to mask themselves, is it becoming harder and harder?
00:16:36
Speaker
It's not really becoming that much harder or easier, to be perfectly honest. The world is complicated. Language evolves. We'll never be done understanding all of the nuance here.
00:16:49
Speaker
So it certainly won't get easy in that way. I think what has gotten easier is we have a better understanding of how to navigate that threefold relationship between

Importance of Community Clubhouse

00:17:00
Speaker
Modulate, the studio, and the community. And part of that is that we now know what pitfalls to avoid, what kind of messaging is going to be most helpful.
00:17:10
Speaker
Part of it is also that the community and studios are more familiar with what it is that we do. When we first launched in our earliest titles, we had folks from our team jumping into the Discords to directly answer questions from the players about how our technology worked and what it would be doing.
00:17:28
Speaker
What we found in the last couple of years is sometimes we would put someone in that Discord and they would be beaten to the punch by other players in the community, who would say, no, no, no, I've actually seen ToxMod working in this other game, and it works like this, and you're misunderstanding. Check their website. I made a video about it, check out the video.
00:17:46
Speaker
So it's been really rewarding to see the community building trust in us and sort of rewarding us for that outreach that we've done by, you know, carrying that along and helping new communities to better understand what it is that we're doing.
00:18:01
Speaker
We hear indies do that a lot, right? Building their audience and their community up well, and their community becomes their advocates. You rarely hear that from a service provider, that they're working with the community to educate the community, and the community is going out there to educate for them. And I think the power of word of mouth from fellow gamers goes a lot further than the preaching of anyone else, right? If other gamers are standing up for you and saying this is a good technology, I think that's such a great sign from the gamers. It's almost another strategy: hey, let's just go educate players on what we're doing, so they continue to spread it. There's no better way to do that, right? It sells itself at that point.
00:18:42
Speaker
Absolutely. And, you know, again, it's something we always have to curate together with the studio. Some studios are understandably more protective of the way they message with their community, and they want to really make sure that that's all run through them.
00:18:55
Speaker
But ultimately, it's just about having some way for the community to voice their honest concerns and questions so that they can get a response and, again, feel like they're part of the process and there's a meaningful conversation happening.
00:19:11
Speaker
Now, in the process after Modulate is sold and put into the studio, what teams do you see normally working with the tool internally? They'll have different names. It all sort of sums up to trust and safety. Sometimes there's actually a trust and safety team.
00:19:28
Speaker
Sometimes it's internal tools or central tools or something like that. Sometimes it's community management or community health. So we'll see different titles that are all associated with that.
00:19:41
Speaker
Usually, the way it really breaks down in practice is there's a more technical product person whose responsibility is to figure out the right tools to solve this problem.
00:19:53
Speaker
There's a team of moderators whose job is to actually review what we're putting forward. There's a policy team whose job is to say, hey, what is our code of conduct in the first place?
00:20:05
Speaker
And then there's a compliance team that needs to put together all the reporting about how this is all going, both internally but also externally, as there are more and more regulations that they need to be providing data to about how their internal moderation efforts are going.
00:20:22
Speaker
And has that always been the case? Is it starting to change? I've heard cases where it's not just a support team's problem anymore, right? It's rolling up to other departments as well. You mentioned product and stuff like that. Especially as we start going into other industries as well, do you see that ownership starting to change, or is it always going to be trust and safety?
00:20:47
Speaker
Within games, I think it's always going to be some flavor of trust and safety slash community, which to me are so interchangeable because you can't have a meaningful community if you don't sort of build that sense of trust with them.
00:21:03
Speaker
But that hasn't changed too much. There's been some change as we see consolidation in the games industry, and you have a major publisher that now has 10 or 20 different titles.
00:21:16
Speaker
There's been more attention on, hey, maybe that publisher should have one central team that manages this kind of stuff for all of the titles across their portfolio. So that's been a little bit of a shift, but it doesn't break down the fundamental skill sets of the people we're talking with.
00:21:33
Speaker
It doesn't set that in any different direction. Okay. Outside of the game space, it's more complicated, because the way you intervene looks different.
00:21:44
Speaker
So in fintech, when you're doing fraud, you have fraud investigators. You don't have real-time content moderators. So the way this stuff needs to be surfaced and the teams that need to be analyzing it looks different.
00:21:57
Speaker
That's the major change as you look at each different space. So I have questions about the other verticals. Before we do that, I want to jump into my little fireball round, where I'm just going to throw some random questions at you to get some feedback here.
00:22:10
Speaker
So, good to go? Go for it. What is the last game you played? The last new game I played was Split Fiction with my wife.
00:22:21
Speaker
Nice. How was it? I am not quite done with it yet, but we've been enjoying it. It's so rare to find co-ops that actually balance skill sets between the two players. And it's a nice enough way for us to relax, though honestly, I play so many fewer games being part of the games industry than I'd like to, just given the time of day.
00:22:45
Speaker
Yeah, you know, we're time-poor as we get older. We have less time to do this stuff. Yep. It is Memorial Day weekend here in the States, and I'm wondering, what would be the thing you're going to throw on your barbecue for Memorial Day?
00:23:00
Speaker
My go-to barbecue is chicken thighs and just a bunch of fresh vegetables. All right. Lightweight, easy, but it feels just so, so tasty every time. It's the thing that, ever since growing up, I could just keep eating grilled chicken thighs
00:23:18
Speaker
forever and not really get full and not really get tired of it. There you go. And it's good protein, so there you go. How about you? How about me? I feel like a good burger, you can't go wrong. I think this year, I have a smoker, so I think I'm going to make some ribs. But I feel like every once in a while, I've just got to go burgers and dogs, keep it simple. It's the way to do it. Fair enough. What was the last show you binge-watched? You're asking me to go back a couple of years. If you want to go movie, what's the last movie you watched? We could go that too.
00:23:55
Speaker
I mean, my wife and I do try and keep up with John Oliver's Last Week Tonight, which is not exactly binge-watch material, but we watch it steadily.
00:24:05
Speaker
Wow. Oh, my wife got me watching some of Ted Lasso not too long ago, which I've been hearing about for ages and didn't have the right streaming services for. And then she reminded me that DVDs exist.
00:24:18
Speaker
They do. So we've been working our way slowly through that. Ted Lasso's on DVD? Wow. I bet at this point it's backdated onto VHS for the amount of passion that people have for it. Get the Laserdisc copy.
00:24:34
Speaker
Last question would be: what would be your preferred method of learning? Would it be video, audio, visual? My preferred method of learning is to throw myself into a real-world problem and try and fix it, more than any kind of, let me absorb the facts from somewhere else.
00:24:57
Speaker
Just the exercise of trying to think something through for myself is where, to me, the actual remembering happens.
00:25:08
Speaker
I enjoy watching good, well-made educational videos much more than I do sort of reading a textbook or something like that. But if you ask me 30 seconds after the video ends, hey, what was it talking about? I've probably lost most of it unless I was trying to absorb it towards a purpose. So Mike is jumping into the deep end to learn how to swim.
00:25:29
Speaker
Yep. Cool. For better and worse. And I've come close to drowning a few times, but we do learn. When we talk about it a bit, it's like, you go to school to learn accounting, and then you get your first accounting job, and the first thing your job teaches you is how they do accounting there.
00:25:47
Speaker
I think, you know, you can learn as much as you can in school, but when you get your first job, most of the time they're going to teach you their method of doing it. And it's good to know the basics, but... I actually think of it a bit differently. We talk about this with a lot of our software engineers. You go to school to learn how to code, and then you go to your first job and you realize that most of the job isn't coding. And this is why I'm not so worried about, oh no, is AI going to replace all of my software engineers? No, because the actual job they do is communication and planning and architecting. Actually writing the code is a fraction of the time that they spend, and all power to them if they can augment that with AI. But it's the design piece that's harder, and it's the soft skills that we don't really know how to teach in school in general.
00:26:34
Speaker
Yeah, I agree with that. That's what we always talked about when chatbots came along. You can't get rid of everyone because of chatbots. Figuring out the best methodologies of how to get them working, the best flows, right? There's a whole art to this that you can try and replicate, but you need someone there to actually sit and help think through it. And I think the human in the loop, again, is something that can never be replaced. You need that human look at it to see how it works.
00:27:01
Speaker
Yeah. All right. So I want to talk about Modulate from my perspective and just get your thoughts, right? When you started Modulate about seven years ago, trust and safety was an issue in the industry. Everything was really text-based back then. A lot of typing, a lot of people on computers.
00:27:20
Speaker
No one did voice at the time. Everyone was saying voice is expensive, which it is. And in my mind, gaming at the time was still very voice-heavy. And yes, there were definitely games with text, right? But it was voice first. And years went on, and everyone stayed text-based, and few voice vendors were out there, if any, right? Because voice is expensive. Then...
00:27:39
Speaker
ToxMod comes out, and ToxMod is pure voice, and you're selling to some of the biggest games out there. It's like you swung for the fences in your first at-bat by going with voice. No one else was touching it.
00:27:52
Speaker
Was the goal always to do voice, or did you realize while you were getting stuck in the mud that, hey, this is a problem we've got to solve? It was always voice, and we came at it from an oblique angle. The original tech that my co-founder actually built was real-time voice-changing tech.
00:28:13
Speaker
If you search hard enough for it, there's a video out there of me speaking with Barack Obama's vocal cords and talking about how this cool technology works. But when we started going to games, among others, to understand, hey, how would you want to use this? Do you want to maybe sell people the Morgan Freeman voice skin or something like that? Obviously we'd have to deal with licensing, but that was the original vision. And what we heard from the games is, hey, this idea of augmenting voice chat is really cool. Voice is so important to the social features of our community.
00:28:44
Speaker
But the real thing stopping people from engaging in voice chat is this fear of toxicity and harassment. And then I had no less than three different gaming execs bring up the idea of, you know what, Mike?
00:28:56
Speaker
You could solve sexual harassment. You could solve sexism by making all of the women online sound like men so they don't get harassed.
00:29:09
Speaker
And I kind of stared at them for a little bit. And I was like, okay, I would like to help you solve this problem, and I'd like to do it in a better way than that, because that's a really bad way to solve that problem.
00:29:20
Speaker
But that's really what spun us into, okay, why isn't anyone moderating voice chat? And as we looked into that and learned more about the expense and the complexity, what we realized is that this voice synthesis, voice generation model we had...
00:29:37
Speaker
My co-founder cringes heavily when I say this, but broadly speaking, if you run a generator in reverse, you get a classifier. Totally wrong, but let's assume that's right for now. What that means is we had the bones in our hands of something that could understand emotion and could run really cost-effectively, because it needed to run on-device to do that real-time changing.
00:29:58
Speaker
We had actually cracked the biggest problem stopping anyone from doing voice moderation before even realizing that was the problem we wanted to solve. And that confluence was what led us to say, okay, this is clearly the right destiny for Modulate, and it took us forward from there. That's fantastic. First of all, you're giving me flashbacks to the original Xbox, when they had their not-so-great voice-masking technology and you were always talking to kids in baby voices.
00:30:25
Speaker
So it was always gaming from the beginning? You wanted to be in the gaming world? When we first came up with that voice-changing tech, games were at the top of our list of who's probably interested in this.
00:30:40
Speaker
We did talk to a bunch of other markets: call centers, podcasts, marketing agencies. We talked to some folks in the public sector about it. We were just trying to figure out who would use this kind of technology.
00:30:54
Speaker
And games were the ones that not only said, oh, this is cool, but actually said, we've thought of this. They hadn't built it yet, but they'd thought of it. They had ideas about how to use it. Some of those ideas misunderstood how to solve sexism.
00:31:08
Speaker
But, you know, they at least had a vision for it. And that really solidified it for us. Okay, this is a space we'd be excited to be part of. We're both a little bit kids at heart; we play a lot of these games. It was exciting to think that we could be a part of that journey.
00:31:25
Speaker
But also, this is an industry that is ready for this kind of technology, that has the need today, not just someone who's exploring and figuring it out. Yeah. And then you cracked the code, figured out you can create these classifiers, and all of a sudden the possibility of new verticals opens up, right? I mean, I come from a fintech background and I've worked in compliance for a long time, and we've heard of scams where someone gets a call, supposedly from their grandmother or their daughter, saying she's been arrested and needs bail money sent, and it turns out it isn't true. All these new scams started coming out, with people getting robocalls and...
00:32:09
Speaker
Did it click with you at some point that, hey, this is a world we could play in? Yeah, I mean, we were actually thinking about it long ago. We always want to take our gaming roots seriously and continue to invest in gaming.
00:32:28
Speaker
But we always wanted to think about what else we would be able to do on top of that. We actually got our first major inbound from outside of the games industry after we launched our first big case study with Call of Duty, because we had folks from outside of gaming reaching out and saying, look, if you can tell the difference between trash talk in Call of Duty and actual hostility, then you're going to do just fine telling if someone's harassing someone in a call center.
00:32:59
Speaker
Right, like you've actually already solved the really hard problem, which gives us confidence you can solve our still-hard-but-easier problem over here. And that's what pulled us into that space. And it started out in enterprise as still toxicity detection: when is someone really irate, when is someone yelling unfairly?
00:33:20
Speaker
But it quickly became, hey, a lot of the reason you're getting one of your agents acting frustrated or irate is because they can tell they're being scammed, or they feel like they're being scammed, or someone's trying to manipulate them.
00:33:34
Speaker
And so it very organically started to find fraud. And our first big customer, a major delivery platform, told us after only two weeks of running our toxicity detection tool that that tool alone was finding them five times more fraud than any of their previous reporting tools were generating for them. Because so few people report it. Even when it's part of your job to report it, there's just so much friction.
00:34:02
Speaker
And this was escalating so much more. And so that got us thinking about, okay, that's the obvious next category we should build. We did toxicity. We did positivity. Now let's do fraud, scams, that sort of deception.
00:34:17
Speaker
That's the next kind of behavior where we can add a lot of value to the world. I think from a humanity standpoint, that's something great. We need to protect people on both sides of the phone, right? People should not be mean to other people or attack other people. And it goes back to your positivity stance: these are easy things to detect. The question is, how do you get a studio to invest in putting these things in? And I think you start to see it in retention, where you have happier employees when they're not getting yelled at. I mean, my first job was in a call center, and we got yelled at by people because they didn't know how to set up Outlook.
00:34:55
Speaker
Right. So when we had a really crappy day, if our employers could come back and protect us, I think that would be really good, and it would make us feel a lot better that our employers are putting in some time to protect us as well.
00:35:09
Speaker
And I think it goes to the scamming side as well, right? Eventually find a way to protect us from ourselves, from people that we know, because these tools are becoming easier and easier. And I just think it's such a cool technology, being able to implement things in
00:35:23
Speaker
voice and understand the nuances of a voice, to understand that this person's really not in duress even though they're saying they're in duress, so maybe you need to be careful about who this is. Like, I don't know, I think what you're doing is super cool. And I'm curious how that transitions, and whether you continue to grow that out. And I don't know if there's even an answer to that at this time.
00:35:43
Speaker
No, I appreciate it. And I should caution again: AI isn't magic. What AI can do is be about as perceptive as a very well-trained human being at this kind of thing.
00:35:55
Speaker
So we cannot see through your head and say, oh, it turns out you're lying because it took you eight microseconds longer to respond. It's not that kind of pseudoscience. It's as simple as: hey, you might have a gig worker, a relatively untrained junior agent, or, heck, an AI agent taking these calls, who doesn't have the sophistication to recognize the signs that a trained operator would be able to recognize.
00:36:25
Speaker
And so let's help give everyone that same level of perceptiveness and prompt that. Where does this go from here? We're talking a lot to folks in that AI agent space. There's a lot of concern about, are my agents going to go rogue and violate our brand guidelines or make promises we can't keep or just generally hallucinate?
00:36:44
Speaker
And on the flip side, how angry are my customers when they're talking to my agent? I see a lot of call centers saying, you know, AI agents maybe help us save some money, but they're also ruining our customer experience.
00:37:02
Speaker
Well, one way you can maybe manage that is use the ai agent as the very frontline operator to get the details of whether it needs to be routed to a person or not.
00:37:12
Speaker
but use technology like ours to get a better understanding of the sort of sense of urgency and fear and anger coming from the caller. And we can very quickly identify this caller needs to talk to a person.
00:37:25
Speaker
Let's get them through the chain and get them over there. This is not the sort of thing an AI agent is going to be able to handle for you. Yeah, you know, I think about all the times I've called an IVR phone system and gone, agent, agent, agent, right? All right, I just want to speak to an agent. But just because I want to speak to an agent doesn't mean it's an emergency, right? If it can get a better sense of what the best route is... I know people don't want to work with those initial
00:37:53
Speaker
bots, but the truth is they're there to help make things quicker, right? If you say something's wrong, it will usually get you to the place where something's wrong. If you say you need a refund, it will get you to the place where you get a refund. It's there to triage.
00:38:06
Speaker
I basically agree with you. I'll say my personal gripe is that I'm part of the particular generation that just hates calling anyone on the phone, ever.
00:38:19
Speaker
By the time I'm calling on the phone, I've tried all of the obvious website features. I've tried all of the clear settings. And so by the time I'm calling you on the phone, it's probably because I have a non-standard issue. So to your point, the whole secret code of just shouting agent
00:38:38
Speaker
seems bad to me. I wish it could be, you know, hey, here's the frontline operator. We can solve a number of standard problems for you. If you don't believe your problem is on this list and you've already tried everything, here is the actual formal setting to please add me to the queue for humans.
00:38:57
Speaker
I get why they don't really want to do that, because everyone would just try to dodge around the AI. But I wish we could live closer to that world, is I guess what I'm saying. I read a story this morning that an Uber driver accidentally drove away with a child still in the car. The parents were trying to call Uber and couldn't get through. They ended up calling the police, and the police could get through to Uber, and they got the child back. It took about 80 minutes, but that's a perfect use case.
00:39:25
Speaker
Right. Most of the time Uber doesn't want to talk to you, and I get it, because it's Uber. But you drove away with my kid. That's something that needs to go to a human, like, now, not in 80 minutes. Right? And you can sense that type of oh-shit in my voice.
00:39:41
Speaker
Exactly. And again, we can listen to what's being said, too. It's like, okay, you're talking about a child abduction; that fits into the category of things worth bothering people over. Mm-hmm.
00:39:55
Speaker
From a studio or a company perspective, when looking into these sorts of technologies, are there early indicators of harmful behavior happening in your community that you may not recognize out of the gate? What are the early signs that something bad is brewing, or potentially brewing?
00:40:16
Speaker
So from a game perspective, because in enterprise it's not quite the same flavor of community, right? The people who are calling your call center aren't also talking to each other and changing the way each other engages with your call center. Yeah. At least most of the time; we do some stuff with internal employees, and that looks a little different. But in the gaming space, it's not, you know, oh, everyone is doing low-key toxicity, and that means everyone's going to do more extreme toxicity soon.
00:40:52
Speaker
It's more about how people that test the waters are received in your community. And the communities that have done the best job setting the stage, some of them have concrete tools, like an in-game code of conduct that you can put in front of people in a VR space.
00:41:08
Speaker
What you find in those strong communities is someone starts toeing the line, starts saying, hey, I'm going to toss out these slurs and see if I can get a laugh.
00:41:19
Speaker
And they don't get a laugh, and people say, hey, man, this is not the place for that. In the spaces that are a little less effective at maintaining that, you'll see those people testing the waters.
00:41:31
Speaker
And the worst case is they get a few laughs and start to feel even more emboldened. But often it's as simple as no one really pushing back on them. And the system doesn't push back on them if the studio isn't using something like ToxMod to detect it.
00:41:46
Speaker
And so even though the code of conduct says it was bad, their experience is, well, I just did it and nothing bad happened, so I'm going to keep doing it. And that's the leading indicator: how much people keep testing those waters, and how far they're trying to push it without any repercussions.
00:42:07
Speaker
And the more that happens, the more they get emboldened and the more they start to feel entitled. And this happened, not to poke at anyone in particular, but Helldivers had this problem fairly recently with their Discord community, where it started out as a really good community but rapidly, within a matter of a couple of weeks, descended. And when the developers came back in and said, hey, this is not what we're about...
00:42:34
Speaker
At that point, these bad actors had developed this sense of entitlement. where they were saying to the developers, this is not your space. This is our space. You don't get to tell us what to do.
00:42:46
Speaker
And that's when it really starts being a difficult problem to solve. That's crazy. And the truth is, any game that gets popular is going to have bad players enter the arena.
00:43:01
Speaker
I guess it's up to the studio to stay on top of it. You can tell people about trust and safety, community trust and safety, but unfortunately they usually wait until zero hour: oh, we've got to do something now. Yeah, I mean, think about parenting, or a school, or any other place with kids, right? You have people saying fart because they think it's funny, and maybe you let them do that and it's not the biggest deal. And then one of them tries saying the N-word, because they've also heard that they're not supposed to say that.
00:43:31
Speaker
And then one of them tries to kick another student. And, okay, you need to be setting really clear barriers and making it clear there are some things that will not be permitted, or else they'll just keep pushing until they find those barriers. That's a natural part of human psychology. It doesn't mean these are evil people, any more than the average kid is evil.
00:43:51
Speaker
It just means you haven't given them the structure to feel like they know what's expected of them in this space. I think that's so well said. I mean, you gave me the parenthood example, which again hits home, right? And it makes me think, you know what, it's up to the parent to step in and say, hey, you can't do that, you can't say that, that's not right. You have to discipline or find a way to educate your child about that stuff. And I guess the same goes in your example: if no one comes in that first week and says, hey, you can't do that, they're going to keep pushing it. And I did a podcast with a PhD, her name was Ruth Diaz, and she talks to bullies and trolls online and works to understand why they do it. And it aligns with what you're saying.
00:44:39
Speaker
First of all, they may not be the most popular people in the world, but if they get a laugh, then they have someone to go along with them, and then they're going to continue to poke and pry. Most people will be alienated by this person, but they'll have a little group that keeps coming. If no one comes to say no or stop, they're going to keep doing it and take over, and at some point it's just too hard a problem to solve. And that might be where you then have to mute them or ban them or kick them out, right? You don't want to start by kicking them out. You want to be able to educate them and help fix the problem. But if it gets too far out of control, it's too hard to keep control of.
00:45:11
Speaker
Exactly. Yeah. You've been a part of Community Clubhouse since the beginning, I think, since the first GDC in San Fran. Why is it important for Modulate to be a part of Community Clubhouse?
00:45:28
Speaker
The word community. I mean, you asked before whether we always thought about voice, because there's text, there's all this other stuff. We are a voice solution.
00:45:40
Speaker
That's what we are focused on, because voice is such a technical barrier that it requires true expertise for us to be able to deliver it. But that is only part of the community experience, right? You also have people engaging in text chat. You also have in-game behaviors to think about. You have stuff outside of the game.
00:46:00
Speaker
How are you cultivating your community across Discord, across Reddit? There's so much to building and fostering a really rich community that we cannot do on our own.
00:46:12
Speaker
So to me, venues like Community Clubhouse are the way to bring together practitioners across different segments of that overall community management story, not just so that an individual developer can hear from all of them, but so that I can hear from all of them and get a little smarter about where our tools fit in.
00:46:34
Speaker
I'm often speaking with the leads of the studios we work with, and they're asking me, hey, Mike, who should we use for text moderation? Who should we talk to about filling out our moderation staff?
00:46:46
Speaker
I want to have informed answers for them. I don't just want to say, oh, well, I heard about this one text moderation provider, they're probably fine. I want to actually help them solve their problem. And to do that, I need to understand the nuances of things beyond just the scope of what Modulate tries to deliver.
00:47:04
Speaker
I love it. Yeah. And you know, they have so many great panels that talk about not just trust and safety, but how to protect the full community and how to monetize eventually, if you want to get there. I think just hearing how other people in the industry have conquered these problems gives you the little lightbulb ideas: oh, I should do this, or maybe I should consider that, or just talk to this person about that. And I appreciate Modulate always being a great partner at Community Clubhouse, so thank you for all that.
00:47:32
Speaker
Yeah, we've appreciated the opportunity to be a part of it. It's been a great group. So Mike, we learned about Modulate, ToxMod, communities, and how you're entering the fintech space and call centers to help protect them. I think you're doing some amazing stuff, and I thank you for the tools and technology you're putting out there. I think it's really important for community growth to be protected. I'm reiterating this time and time again, but as a father of two children, I'm excited about them playing games. I'm more excited about them being able to play games in a safe space where I can just walk away, let them keep playing, and come back
00:48:07
Speaker
later. Before we do go today, is there anything you want to say or share? We kind of hit everything. I appreciate you walking us through the questions; I've enjoyed the conversation. I'll say, as always, if anyone wants to learn more about Modulate: our website is modulate.ai, and you can follow me on LinkedIn. We've also got a regular newsletter called Trust and Safety Lately that shares updates about Modulate, but more broadly is really focused on what we're seeing across the trust and safety and community health ecosystem
00:48:40
Speaker
in gaming and beyond. So I would love to connect with more folks who are interested in talking about these problems. We will have links to Modulate; we'll have links to everything Mike just mentioned.
00:48:53
Speaker
They truly are making a really cool technology, one of the coolest in the space to help protect players. I recommend everyone check them out, if only to understand how this technology works.
00:49:05
Speaker
Mike, I really do appreciate you coming out here and talking through it. I hope you have a great Memorial Day. I hope you have lots of chicken thighs and veggies and that you enjoy yourself. Thank you again. Thanks very much, Greg. Great to be here.