Introduction to Trust and Safety in Gaming
00:00:00
Speaker
Hey, everybody. It's Greg from Player Engage. In this episode of the Player Engage Podcast, we're going to dive into the crucial topic of trust and safety in gaming. You'll hear from three amazing guests from our Community Clubhouse panel at GDC: Mike Pappas, CEO of Modulate; Kieran Donovan, CEO of KID; and Tess Lynch, a privacy and IP attorney at PreMac Rogers.
Balancing Player Privacy and Safety
00:00:20
Speaker
We're going to discuss how to balance player privacy with safety, strategies for creating happy and engaged gaming communities, and practical advice for indie developers. These experts are going to share their insights on leveraging technology for proactive moderation and the importance of early collaboration between developers and community teams. Don't miss this deep dive into making games safer and more enjoyable for everyone. I hope you enjoy it, so let's tune in now and listen to the latest episode.
00:00:53
Speaker
Hey, everybody. Welcome to the Player Engage Podcast. Greg here. Today we have a really exciting episode. We're joined by three excellent guests that spoke with us at GDC's Community
Technical Solutions and Industry Trends
00:01:05
Speaker
Clubhouse. We have Mike Pappas, the CEO of Modulate, Kieran Donovan, the CEO of KID, and Tess Lynch, a privacy and IP attorney at PreMac Rogers. I'm going to give you all a chance to introduce yourselves here in a minute, but today we are going to be talking about balancing trust and safety in the current gaming environment. We're going to explore the technical solutions, the legal challenges, and best practices for creating safer gaming environments while also respecting player privacy.
00:01:33
Speaker
We're going to draw from recent industry events, and our experts here will discuss moderation, regulatory compliance, and emerging trends in gaming safety. So whether you're a developer or just interested in gaming and tech, this is going to be a really fun episode. This is also our second podcast in conjunction with Community
Guest Introductions
00:01:49
Speaker
Clubhouse. We're going to be recapping what we spoke about at GDC about trust and safety, as well as starting to set the stage for Community Clubhouse at Gamescom. That's a lot of talking. I'm going to stop here for a moment and go through our guests and let you introduce yourselves. So Tess, you're first on my screen here. You want to kick it off? Sure. Thank you guys for having me. Hi, everyone. I'm Tess. I'm a privacy and IP lawyer. I work with PreMac Rogers. I need to do a disclaimer or something to get that out of the way now. Hi, I'm an attorney. I'm not your attorney unless I am your attorney, and that's because we signed an engagement agreement. If I don't know you and we haven't talked, please don't consider this legal advice. Actually, as a general rule,
00:02:27
Speaker
none of what I'm going to say is legal advice. You should really talk to a different attorney who knows the niche and nuances of your circumstance, because it could be the data you collect, the content you're moderating, or the services you provide, and not all legal advice fits every situation. That said, I'm also just an individual, a humble human. That means that all the things I'm about to say are my opinion, not the opinion of my firm. But if you do want to bring it up with them, please feel free to send them this podcast. Okay, that's it. Disclaimer over. Thank you for your attention. Thank you, Tess. Great, great disclaimer. And thank you for joining us. Mike, you want to go next?
00:03:03
Speaker
Sure, yeah. Hi, everyone. I'm Mike Pappas, as introduced, CEO of Modulate. The things I say are also not legal advice, but that's because I'm not a lawyer at all. My focus with Modulate: we work with game studios primarily on privacy-aware voice moderation. So finding that balance between making sure you're giving players space to have a lot of fun with trash talk and banter, and know that they're given an opportunity to just express themselves the way they want, but also having the ability to recognize when those conversations really do take a dark turn and need someone to step in to protect players. I've had the opportunity to work with a number of studios through my direct work at Modulate, as well as other groups in the industry, including Community Clubhouse, the Fair Play Alliance, and the Gaming Safety Coalition. So I like to think I've been able to see a really good cross-section of the strategies used in the industry, and I look forward to talking about it more.
00:04:02
Speaker
Great. Thank you, Mike. And Kieran? Yeah, thanks very much, Greg. Great to be here. My name is Kieran. I'm the CEO and founder of KID. I'm a bit of a hybrid of Tess and Mike: I am an attorney, so I can second everything that Tess has said. I've been an attorney for, well, forever, a decade, and done an orbit of the planet. I've practiced in multiple jurisdictions: Australia, the UK, Hong Kong, and Singapore. So I've seen first-hand how challenging it can be to launch something globally when you're dealing with all the complexity of multiple markets, with the types of issues that we're dealing with, whether legal but also cultural and socio-political as well. There are all sorts of sensitivities when you're dealing with these types of issues for a global go-to-market.
00:04:52
Speaker
That's really my sweet spot, and the reason we created KID was because we saw an opportunity to build a really unique technology that would allow those more empowered experiences for kids where they are, while respecting who they are. Beautiful. And I love it. Again, as a father, protecting children who are playing games is one of those things that are near and dear to my heart, and I think it's a great thing. And I was telling people I'm meeting with lawyers and attorneys to do a podcast, and everyone got so excited about that. So let's start with the fun question first: what games are you guys playing?
Insights from Recent Gaming Experiences
00:05:27
Speaker
Tess, you want to kick it off? Oh, man. So I took a brief Baldur's Gate 3 hiatus to finish off Hellblade 2, the Horizon
00:05:38
Speaker
Forbidden West sequel, and then the God of War Ragnarok series. And now I'm back to Baldur's Gate 3. Wow, look at you running the gauntlet there. Mike? So I grew up almost entirely on Nintendo games. My parents had this weird theory that online games might have some kind of toxicity or something that was dangerous for young kids, so I was entirely restricted to single-player Nintendo growing up. So I still have a soft spot for that. Recently,
00:06:09
Speaker
I'll say I finished playing Tears of the Kingdom, by which I mean I got to the end of the story. I do not mean I completed all the things in the game, because no one will ever complete all of the things in that game. But it's been a very enjoyable one to explore. I love the thought in my head that you started Modulate to prove your parents wrong, that there is no toxicity now on any of their platforms as well. Kieran, what are you playing? Yeah, I've actually just come off the back of watching all the GDQ streams, so I'm all inspired to get back to retro games and do some speedrunning. I've actually been practicing some speedruns on GoldenEye over the course of the last couple of months. Nothing that I would be willing to stream at this stage, but it's been a good bit of fun learning some of those skills and going back and playing the game again. Love it. Love it. So
00:07:00
Speaker
Appreciate you all sharing that. Let's start by recapping Community
Approaches to Privacy and Safety in Gaming
00:07:04
Speaker
Clubhouse. It's been a few months now, but we had GDC's Community Clubhouse. And I'm wondering: we all have these things where, after the event, we're like, shoot, I wish I'd still had time to say that; oh my God, things happened so quickly, I couldn't even remember this. So I'd like to understand if each of you had any key takeaways, or things that you wanted to say that maybe you remembered after the event. And this time we'll go in reverse order, so Kieran, you want to start? Yeah, I think for me, some of the things that are really a key focus for us are, like, how do we flip the script? Taking a step back, there are a lot of issues that we can get really hung up on, and I think people have very strong views, and good views. But taking a step back, what do we want in terms of player happiness?
00:07:51
Speaker
And one of the things that we've been working on in the background is this whole idea of a player happiness index, and being able to communicate to parents: this is where your kid's happiest. We can get into the nuts and bolts of the regulation and the complexity of how to deal with things by geo and all sorts of stuff like that. But really, that's the fundamental takeaway: where a player is happiest. And Mike's done some fantastic research in this space, and I'm sure we'll talk about that. But looking at how that aligns with the more safe and empowered experience as well, and the direct impact that has on happiness and player engagement, I think that's the real takeaway for me from a lot of those discussions at that high level.
00:08:40
Speaker
I like that idea of a happiness index. And I want to question it a little later, because I can tell you for a fact my kids are happiest while playing games. But then, is there too much happiness? We can talk about that as the podcast goes on. Mike, same question. Yeah, thinking back to that panel at GDC, which was a fantastic discussion, I'll agree with Kieran on the player focus but put a different lens on it. I think one of the topics we discussed a lot was the importance of platforms communicating with their players about what's expected and what they're trying to do in the first place.
00:09:16
Speaker
We've had the opportunity to work with a lot of these big AAA games, often M-rated titles. And every once in a while, we'll have a user reach out and say something like, I don't understand why I got banned; all I was doing was shouting a bunch of homophobic slurs at a seven-year-old until they cried; it's an M-rated game, I should be allowed to do that. And, you know, after saying, thanks for raising your hand so we can take you off the platform, we'll take a step back and say: if someone could really reach out like that, then they so deeply don't understand what these games are trying to do. Because even for the M-rated games, their goal is not hostility. Their goal is for people to be able to have fun.
00:10:01
Speaker
And so that gap of understanding, where some users see, you know, it's M-rated, or it's a violent or a competitive title, and take that to mean literally anything goes, that's a misunderstanding. And the more these platforms can get out there really clearly and say, look, you're allowed to be competitive, you're allowed to trash talk, but that's different from just terrorizing someone, that's going to make a huge difference far before we even get to any of the technical interventions. Well said. Tess? I think I had the same problem as I do with most of these conversations, which is we can always get to the surface level of what the issue is. Like, what are privacy laws? What do content moderation requirements look like today?
00:10:50
Speaker
But we never really have enough time to dig into implementation, or even just the basic bones of what your compliance program is going to look like. And we're never going to have the time to hit everyone's use case there, but there are certainly ways we can start to conceptualize it: driving home where the priorities are for your business, what kinds of strategies you value, and other ways to triage to get you to the same point effectively. I think that's a great point, and that's actually quite a bit of the feedback we heard from Community Clubhouse. There was a lot of great stuff that people loved to hear, but a lot of people also wanted to hear more in-depth technical material, like how do you implement this, how do you do that. And that's great feedback, because as a nonprofit, Community Clubhouse is here to teach the best practices of the industry, but if we're not really getting
00:11:36
Speaker
down and dirty, for lack of a better phrase, with our hands on the tools and showing people how you can use them, there's a gap there. So it's trying to understand who is more interested in trust and safety and who's more interested in community, right? It's casting a big net and then trying to find out where you want to funnel the appropriate people. That's the stuff that goes on behind the scenes, but let's kick off the real conversation here. The panel you were part of was moderated by Lewis Ward, and he emphasized the need to balance privacy and safety, and we hear a lot about that. I use the term, and it's not the best term, but Big Brother: everyone thinks everyone's out to watch you, but that's not the case. So what are your thoughts, each of you, on balancing player safety as well as player privacy in an environment like today, when everything is online?
00:12:26
Speaker
Let's kick it off with Mike. I think we have to be really clear what we mean when we talk about privacy. There are a lot of people who have this kind of intuition in their head that if we're trying to get privacy, what we mean is we don't want anyone to collect any of our data ever or have any knowledge of anything that we're ever doing. And that's a very absolutist approach that isn't really reflected in the way we actually live life offline. The example I always use is, if you're taking your kid to the playground and you're saying, hey, I value my kid's privacy, well, the absolutist privacy approach is to leave your kid at the playground, go home, wish them luck, and hope for the best. Most people don't do that. Most people don't think that's the right trade-off.
00:13:13
Speaker
Instead, they'll say, hey, no, there's some compromise we need to make between privacy and safety. But also, we don't want to be looming behind the kid, watching every single thing that they do or say; that's too much of a privacy invasion in the name of safety. So what we're going to do instead is strike that compromise: we're standing over at the sidewalk, maybe talking with the other parents, watching out of the corner of our eye. And so from a privacy standpoint, you can't say that parent never knows what the kid is saying. They'll get some snippets, some little glimpses of what the kid is saying and doing. But it is still substantially more private, substantially more freedom for the kid to explore and express themselves, compared to if mom or dad was overseeing their every action.
00:14:02
Speaker
So I think shifting to more of that mindset: instead of saying that if you can ever access any iota of data, that's a fundamental, massive breach, really think about the circumstances in which you're getting this data. How is the data being used? What is the scale of data being collected? And then find that happy middle ground where you're collecting the minimum data needed to ensure safety, but you are collecting that data so that you can ensure safety in the first place.
00:14:37
Speaker
I think that's the perfect analogy. You need to make sure everything's okay, but you can do it from afar, making sure that if something's going wrong, you can step in. And Kieran, I think KID is building a solution to help with this, so maybe you want to share your perspective. Yeah, what Mike said is absolutely right. The example I often use is when Disneyland opened the Matterhorn Bobsleds in 1959, the fastest roller coaster on the planet with lots of sharp turns, they didn't suddenly say, oh well, we'd better stop kids coming into Disneyland. The approach is: well, how do we deal with that, the fact that there is this danger, this risk, for younger kids when it comes to that roller coaster? And the way they deal with it is by creating, you know, the height check
00:15:30
Speaker
at the roller coaster. And so there is that trade-off, right, where on the pathway in, when you're going to get on the roller coaster, there's that check: is this thing going to be safe for you? Are you going to be okay with it? We don't need any information about you as an individual, who you are, where you're from, how long you've spent at Disneyland, what rides you want to get on, all that sort of thing. All we need to know is, for this specific instance, getting on the roller coaster, is it okay for you? Are there any risks that you're taking on by getting on?
00:16:02
Speaker
And so when I think about that balance between privacy and safety and what we're building, it's very much with that at the core of our thinking: okay, what is the minimal amount of data that's required in order to obtain the best advantage from a safety and empowerment perspective? And what we've realized is, actually, it doesn't take a huge amount of data, or you can do things in ways that are very creative or privacy-preserving and have huge upsides when it comes to that safety and empowerment.
00:16:37
Speaker
So we take that approach, and that's why we have really strong views on some of the trends, particularly around things like age verification and age assurance, because we really do want to get into the weeds of the problem. At what point is there the risk to the child or the teen? It's not at the point they sign up; it's at the point that they access that particular function or feature where there's that sensitivity.
Global Privacy Regulations and Gaming
00:17:05
Speaker
And that's the place where you do that balancing act of working out what's the minimum data I require in order to make sure they're safe in this particular space of the overall ecosystem or game. Another great analogy. And, you know, different rides in different countries also might have different restrictions on how tall you need to be as well.
00:17:25
Speaker
Or go back to thinking about bars: if I went to the Caribbean when I was younger, I could drink, but not in the States. So it's probably not the best mindset to go about thinking about it, but I think that's a great way to understand it. And Tess, with such great analogies, the pressure's on. But from your perspective, I imagine it's a big gray area that keeps moving, because we're learning these new things as time goes on. Privacy, what is privacy? And maybe I'm overthinking this, but from your perspective, how do you juggle privacy and safety? I could talk about this one topic for three hours, so stop me if I tangent too much. But earlier in my career, I was told this analogy, so I'm going to add another one; we're going to add a third analogy to the mix. Security, or cybersecurity specifically: in this analogy, security builds walls, privacy builds bridges. The idea of the golden age of surveillance for the past 20 years was that we have to collect all this information, as much information as we can possibly find, so that we can detect red flags, we can detect bad actors,
00:18:21
Speaker
we can find hackers before they compromise our systems, right? And we really did this indiscriminately, whatever our purposes, right? It was more of a throw-as-much-spaghetti-against-the-wall-and-see-what-sticks approach. Now, with the advent of privacy and this new kind of societal shift into the idea that privacy, at least in Europe, is a right, and that our autonomy shouldn't be impacted by the services we choose to consume, we should have some level of control over the information that we give others, or at least be able to access it, correct it, delete it, or, if it's sensitive, restrict certain processing of that sensitive information. So I don't see privacy and cybersecurity as diametrically opposed, as a lot of people do.
00:19:08
Speaker
I do think that the motivations for data collection are inverted, right? And that's where you can find conflicts. But at the same time, going back to analogy two, Kieran's roller coaster analogy: the ride isn't collecting the parent's ID documents, right? They're not collecting the child's date of birth. They're not even taking a photo for their records to show that this is a child that was denied the service because they were a child. All of that extraneous data collection has really just been boiled down to one indicator that can, dispositively or with some accuracy, confirm that this person is in fact a child, or that this ride would not be unsafe for them regardless of their age.
00:19:52
Speaker
That's great. And I think everything you're all saying is clearly setting the stage for understanding the difference between privacy and the tools you can use to make sure that your data is safe. And I love, Tess, that you brought up data. I mean, GDPR; we have CCPA here in California,
Voice Moderation and Privacy Challenges
00:20:08
Speaker
right? And I think control of that data is a fascinating thing, that we can't control our own data in the States here. And the first time I spoke with Kieran, he was mentioning how all 50 states have different rules, and it's maddening. It made so many things click in my head. And maybe these are all things that you all know as common sense, but when I'm hearing you guys talk about it, it just bends my mind.
00:20:32
Speaker
Mike and Kieran, you both have tools and technologies that make this easier on different parts of it, right? So, Mike, at GDC you talked about the tools that you've built with Modulate, your ToxMod, and Call of Duty had implemented it at that time. Can you talk a little bit about what ToxMod is, how it works, and how it respects player privacy? Yeah, the basics of ToxMod: we're providing a voice moderation solution. The idea is to recognize when you're in a chat with someone and they're throwing hate speech or harassment at you, or they're doing something even more insidious: they're trying to groom a child, they're trying to radicalize a disaffected teen, something in that direction. The platform does not, by default, have any awareness that something like this is happening,
00:21:23
Speaker
which means that all of the burden is on the users to do something about it. And that's problematic for a number of different reasons. First of all, you often have especially younger users who don't even realize that what's happening is harmful to them until much later. In the case of something like grooming, it's designed so that the kid does not realize that this is a harmful behavior. Even on top of that, though, even with, and I hate to say it this way, run-of-the-mill hate speech and harassment, you still have even adult users who generally don't report that stuff. We actually just released a case study with Call of Duty, where we found that a little bit under 20% of the users that ToxMod was able to detect
00:22:13
Speaker
graphically and egregiously violating the code of conduct were ever actually being reported by other users. So there's this huge gap: 80% of your offenders that you're just not aware of and not able to do anything about. The intention behind Modulate is to say, hey, what if you could build a system that could come in and actually be listening, so that it's not entirely on your users, and flag to the platform when something bad is happening. There are two major ways that we incorporate privacy into this approach.
00:22:49
Speaker
First of all, what do we mean when we say ToxMod is listening? That doesn't actually mean we're transcribing everything and analyzing everything you say in detail. It certainly doesn't mean we have any idea who you are, or what the ID or address or name or anything like that of the person speaking is. All we know initially are some very cursory details of the conversation. It's, again, hearing things from that distance away, over at the sidewalk. So you can tell: hey, are people shouting at each other? Has someone burst into tears? Is someone clearly seeming really unnerved by the conversation? These very basic, privacy-preserving cues are often enough for us to be able to say this conversation is innocuous, or this conversation is scary and worth a closer look.
00:23:42
Speaker
As we get those warning signs, we'll zoom in a little closer, and a little closer, and iteratively start to unpack what's going on in the conversation. But the end result is very similar to user reporting: if you know that another player could report you at any time, you know there's some chance that, if a player thinks you're doing something harmful, your audio could get recorded and shared with the platform. The odds of that happening with ToxMod are actually probably lower, because even though we're catching five times as much stuff, people do a lot of false reports, about five times as many false reports as accurate ones. So on net, you're not really at greater risk of ToxMod sending something, especially if you are a good actor. That's the first layer: the iterative approach, where we're never collecting data that we don't know for sure we need.
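To make that iterative approach concrete, here is a minimal sketch of what a staged, privacy-preserving escalation could look like. The stage names, signal fields, and thresholds are invented for illustration; this is not Modulate's actual API or model.

```typescript
// Hypothetical staged escalation: deeper (more privacy-sensitive) analysis
// is unlocked only when coarser cues justify it, so innocuous chatter is
// never transcribed or retained.

type Stage = "cursory" | "closerLook" | "fullReview";

interface ClipSignals {
  shouting: number;   // 0..1, coarse acoustic cue (no transcription)
  distress: number;   // 0..1, e.g. crying or alarm in the audio
  priorFlags: number; // earlier escalations in this session
}

function nextStage(stage: Stage, s: ClipSignals): Stage | "dismiss" {
  switch (stage) {
    case "cursory":
      // Only very coarse, privacy-preserving cues are inspected here.
      return s.shouting > 0.7 || s.distress > 0.6 ? "closerLook" : "dismiss";
    case "closerLook":
      // Still no identity data; just decide whether a human should see it.
      return s.distress > 0.4 || s.priorFlags >= 2 ? "fullReview" : "dismiss";
    case "fullReview":
      return "fullReview"; // packaged tightly and sent to moderators
  }
}
```

The design property worth noticing is that the depth of data collection is gated by evidence: most conversations never leave the cheapest, least invasive stage.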
00:24:38
Speaker
And then the other side is who gets access to that data once we've identified it. Even when ToxMod says this is a harmful situation, we will package that up as tightly as we can and share only that with the studio's moderators. Because there's all kinds of misbehavior that you could imagine from a mal-intentioned moderator who says, hey, I'd like to know everything this person says because I'm out to get them; or, hey, I'm into this person based on their voice and I want to figure out more about them. You don't want moderators being able to say, hey, let's follow this person, let's dig into what they're saying in more detail. You only want them to ever have access to the actually scary things that that person did,
00:25:28
Speaker
without them being able to change what ToxMod is looking for. So that's the other set of controls we have in place: we can talk at a high level with a platform about what kinds of priorities they have, but the moderators and users of the system on their side do not have any ability to direct ToxMod or to gather additional information. They're only given the reports that ToxMod determines are relevant. I think that's great. It's preventing access to the audio logs that they shouldn't have, and I find it fascinating. We see the same thing at Helpshift: when a game has a problem, players don't typically reach out. It takes proactive notifications to say, hey, did something happen? Was something wrong? And it seems like it's the same thing with trust and safety, where if someone calls me a terrible thing,
00:26:17
Speaker
fine, I feel bad, but I'm not going to take my time to report it to the game. It sounds like it's making this process more automated, so it's still being captured and found, but it doesn't require me, the human, to step in and say, hey, someone just bullied me. A good community process doesn't put the burden on the victims to fix things, right? You need to have better tools than that.
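Mike's point about scoped moderator access suggests a simple design principle: the moderator-facing surface exposes only finished reports, with no query paths a bad actor could abuse. A hypothetical sketch, with invented names:

```typescript
// Moderators only ever see completed reports; the class deliberately has
// no API for following or surveilling a specific player.

interface ModerationReport {
  reportId: string;
  offenseCategory: string;  // mapped to the studio's code of conduct
  flaggedClip: ArrayBuffer; // only the offending span, tightly packaged
}

class ModeratorPortal {
  private queue: ModerationReport[] = [];

  // Only write path: the detection pipeline pushes completed reports.
  ingest(report: ModerationReport): void {
    this.queue.push(report);
  }

  // Only read path: moderators pull pending reports, nothing else.
  nextReport(): ModerationReport | undefined {
    return this.queue.shift();
  }

  // Intentionally absent: searchByPlayer(), streamLiveAudio(),
  // retargetDetection() -- any of those would let a bad-faith moderator
  // follow or profile a specific user.
}
```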
Youth Gaming Needs and Parental Challenges
00:26:40
Speaker
And Kieran, KID is building tools to help with that. KID recently got its Series A from a16z, so congratulations; there's clearly some excitement about that in the market. Can you tell us how KID is doing that, at a basic level, and in as much detail as you want, I guess?
00:27:02
Speaker
Yeah, fundamentally, we're solving for three gaps. One is that dastardly pop-up that we've all experienced through the whole history of the internet: I confirm I'm over 13, which every kid has always respected. The second one is the fact that if I am under the age of 13, 16, or 18, depending on where I am in the world, I often don't have a single sign-on option. I can't use Facebook. I can't use anything else to get instant sign-on into a game. My only option is to manually go through a process to sign up for a native account system that is limited to that particular game or ecosystem or platform,
00:27:45
Speaker
and I can't take that anywhere with me, because I'm a child and that's not the way the internet works. The third one is, even if I try to do the right thing as a kid and as a parent, the experience of signing up today as a parent is not a great one. A lot of our research has shown that sometimes it's 40 or 50 steps. You're going back and forth over and over again. You're trading devices. You're scanning QR codes, getting punted to different web portals. It's all over the place. You're re-verifying yourself over and over again
00:28:20
Speaker
on different platforms. It is a friction-filled user experience. And the scary statistic that really sits at the back of my mind for everything we build is that almost 70% of kids that end up on social media or in a game where they're not intended to be, because they're under the minimum age, get there with the help of a parent. And that tells you that there's something wrong with the underlying system. If there's too much friction, if there aren't options for kids to get on board safely, then something's wrong with the system. And that's fundamentally what we're solving with KID. We take each of those problems one by one. We solve it for publishers by taking on all of the regulatory complexity, and we've built all the developer tools to act on all the great advice that someone like Tess will provide to a publisher. We have all the tools to deal with that. What happens when regulation changes?
00:29:15
Speaker
We've had two countries in the last two weeks that changed the age of digital consent upwards, which means now, all of a sudden, you've got to go back and get parental consent and notify parents. You have to attach parents to accounts that didn't have parents attached to them before. Your engineering system needs to be calibrated to deal with that. You can't just go, oh, wait a minute, I never collected certain things in the first place, and now I have to go and create even more friction for my entire user base to solve for this one little use case. So we built engineering tools that evolve with regulation and go where the puck's going for publishers, which unlocks single sign-on for kids, which unlocks unified parental tooling. So as a parent, I can see everything. I don't have to re-verify myself over and over again. I can passport my preferences across games. So if I've already spent time thinking about,
00:30:05
Speaker
do I want to give access to DMs? Do I want to approve their friends? I'm cool with them playing with their friends after dinner, but I don't really want them doing it at breakfast before school. Once they've gone through that process and calibrated their settings, they can port that across other games, so they're not having to rethink and go through all of that arbitrage every single time. So we take all the friction out for publishers, parents, and kids.
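Here is a minimal sketch of the "regulation changed underneath you" problem Kieran describes: a country raises its age of digital consent, and you have to find the accounts that now need a parent attached. All names and shapes are hypothetical, not KID's actual system.

```typescript
// Given updated per-country consent ages, find accounts that now require
// a parental re-consent flow.

interface JurisdictionRule {
  country: string;             // ISO country code
  ageOfDigitalConsent: number; // new threshold after the change
}

interface Account {
  userId: string;
  country: string;
  age: number;
  parentAttached: boolean;
}

function accountsNeedingParentalConsent(
  accounts: Account[],
  updatedRules: JurisdictionRule[],
): Account[] {
  const consentAge = new Map(
    updatedRules.map((r) => [r.country, r.ageOfDigitalConsent] as const),
  );
  return accounts.filter((a) => {
    const threshold = consentAge.get(a.country);
    // Below the new threshold with no parent on file: trigger re-consent.
    return threshold !== undefined && a.age < threshold && !a.parentAttached;
  });
}
```

Note that this only works if country and age were captured in the first place, which is exactly the gap Kieran points out when a publisher says "I never collected certain things."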
Advice for Indie Developers on Privacy and Safety
00:30:32
Speaker
Amazing, and crazy to think about. Again, when we first spoke, I didn't realize that all the different countries, and all the different states here, have different rules on age and all that stuff. And you said it best: what is the definition of a kid? Every country has a different definition of what a kid is. And I didn't even think about, as an indie developer that's releasing a game, having to think, oh, I'm going to do a global release; crap, I've got to think about all the different countries and all the laws.
00:31:00
Speaker
Crazy stuff, and it's amazing what you're doing. Tess, say I'm an indie developer that's building a game, right? There are a lot of different things coming at me, different types of tools. If I want to be able to provide a safe environment for players, and privacy, something that could protect players, where do I start? So any developer should really start by looking at their exposure points. Normally, in a non-internet-based society, I would say look at your jurisdiction: where are you going to release this game? Because that's going to give you the biggest indicator of what laws are going to be in play. Unfortunately, as we are in the age of the internet, nearly everyone who's going to be releasing a game within this decade is going to do so globally, so that means global regulations apply. So the next couple of steps would involve: all right, what kind of features are you going to offer in this game?
00:31:47
Speaker
Is there going to be messaging at all? Voice communication, text messaging, DMs, friending, banning, blocking; to what extent can the users engage with each other? Because a lot of these regulations are going to look at where the children are put within the game features themselves. Are they accessible to other users? Can they be influenced by other users? Is there certain content that they shouldn't be able to see? Are there certain forums that they can't go into? Et cetera, and so forth. Now, if those features don't really apply, then okay, maybe content moderation isn't really necessary for you. But look into the sensitivity of the content that you're producing, the features that you control, and even, I will say, when it comes to gauging your age range, understanding what other amenities you offer outside of the main service, like giveaways, promotions, or cash credit
00:32:44
Speaker
for anything, right? Things that involve a financial transaction between adults normally aren't something that's appropriate for kids. And so all of those things, once stacked together, give you an idea: all right, my age demographic, based on the content, the amenities that I offer, the services, and the different forums that I have in tow, is going to segment me into a market that's going to be 16-plus, or 18-plus if finances are involved. Anything outside of that is going to go into the realm where children could potentially access it. And where children can potentially access it, that's where you need to start incorporating things like, okay, what do I need to get a parent involved to approve this child's access to this feature? And then just another quick aside before I wrap this one up: content moderation doesn't really care if you're a child or not. There are a lot of international regulations out there that are just like, we hate hate speech. Australia is one, Germany is one, and they're all going to have different standards that you need to monitor your content by.
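Tess's "exposure point" inventory can be pictured as a simple checklist that maps features to rough obligations. This sketch is a caricature for illustration, with invented rules and names; it is not legal advice (see her disclaimer at the top of the episode).

```typescript
// Enumerate the features and amenities you offer, then derive a rough
// compliance posture from them.

interface FeatureInventory {
  voiceChat: boolean;
  textChat: boolean;
  directMessages: boolean;
  friending: boolean;
  realMoneyTransactions: boolean; // giveaways, promotions, cash credit
}

interface ExposureAssessment {
  contentModerationNeeded: boolean;
  parentalInvolvementNeeded: boolean;
  roughMarketSegment: "all-ages" | "16-plus" | "18-plus";
}

function assessExposure(
  f: FeatureInventory,
  childrenCanAccess: boolean,
): ExposureAssessment {
  const social = f.voiceChat || f.textChat || f.directMessages || f.friending;
  return {
    // Hate-speech regimes (e.g. Australia, Germany) apply regardless of age.
    contentModerationNeeded: social,
    parentalInvolvementNeeded: social && childrenCanAccess,
    roughMarketSegment: f.realMoneyTransactions
      ? "18-plus"
      : social
      ? "16-plus"
      : "all-ages",
  };
}
```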
00:33:44
Speaker
I would be remiss if I didn't bring up Section 230 here, which is the bedrock of at least the American internet. It's the reason why forums like Facebook and Reddit even exist in the first place. Section 230 basically says that the platform owner isn't liable for the content that users post on their sites, and that they can make moderation decisions, and that those moderation decisions are effectively the platform's speech, as decided recently. All that to say, content moderation could still be a factor even if you're not servicing children or expecting children on your platform. So I have a question that builds upon that, that we didn't really discuss prior. That is great information, but from the outsider perspective, it's overwhelming. If you're starting your own game company, right, and pretend Ed Hartman's building this really great game that includes
00:34:39
Speaker
lovable characters like Pokemon, right? How does an indie developer get started looking at this stuff? Obviously, like you said, you've got to take a look at all the things you're offering in-game, but how does someone that only knows how to develop and build games get started? Okay, maybe if I was getting started, I would start by developing my ethos. And I would ask one question: how would I want my data to be treated in this environment? What sacrifices would you be expecting other companies to make for you in this situation? Would you want access to everything that you did in-game? To what extent is that your information? To what extent is that something that you, the company, own? And try to qualify that ethos across the board. Wonderful. Mike or Kieran, do either of you have thoughts on that as well?
00:35:31
Speaker
Yeah, I certainly don't disagree with what Tess is saying, but I'm going to go a little bit deeper here and say, all right, so you're trying to do this complicated thing that has a lot of different moving parts; how do you get started? The most obvious answer is you hire an expert. For the same reason that if you're trying to build a game and you've never actually written code before, and you want to get it running on all kinds of fancy new pieces of hardware, sure, you can go learn from a lot of different tutorials and stuff like that, but it's probably good to go talk to an expert, because they're going to be able to point you to things that you ought to learn about that you wouldn't even have come to in the first place.
00:36:12
Speaker
I mentioned earlier, there are a lot of good groups in the games industry, including the Fair Play Alliance, the Gaming Safety Coalition, and of course Community Clubhouse, that are trying to get that expertise out and more available to as many different people as possible. There are also just great people working in trust and safety at a lot of these top studios who, frankly, are really excited to talk to you. If you're an indie developer, even if you can't afford to hire someone full time, I think you would be shocked how many people in trust and safety would be tripping over themselves to help if you reached out and said: I actually want to do this right; I care about this; I'm passionate about it.
00:36:58
Speaker
That's what these folks have been dying to hear for so long. And I think the more they're hearing it, the more they're excited to keep carrying that wave. There you go. Just reach out; it's perfect, right? I think Kieran said on our other call that you have to talk to good people or hire good people. That's what you do: you hire people that know what you don't know. So I don't know if you had anything to add, Kieran, or if I just stole it from you. I would say, yeah, if you're an indie developer and you're staring down this complexity and trying to figure it out,
00:37:32
Speaker
hit me up. This is exactly the problem we solve, and it's free for indies. We made it free because we want to democratize access to the answers to this sort of complexity. I think, to double-click on it, one of the biggest challenges is that it's so complex and the risks can be so great. What you need, to Mike's point and to what Tess said, is someone who can help triage that risk for you. Because it's not necessarily about trying to solve everything out of the gate. You don't have the budget to do that, or to build all of the tools for that, and there are great people and tools you can rely upon for that. But what you need is someone who can say, hey, here are the things that I would focus on first. You've got real-money loot boxes in here;
00:38:27
Speaker
we should really double-click on that. Or, do you know what, you've integrated some interesting cross-platform functionality, and that means data's being shared in certain ways. Well, hey, guess what? You're launching in Vietnam, Korea, and Brazil, and there are some pretty unique sensitivities around regulation in those markets, so let's spend some time on that. So I think having that expert who can help you triage the noise from what you should actually worry about, that'll be money well spent.
00:38:58
Speaker
Lovely. It all came full circle: Tess talking about taking a look at your ethos, what you would want your privacy to be if you were playing your own game, and then go find some experts. And as you said, Mike, everyone in trust and safety I've ever spoken to gets very excited when the conversation comes up. So if you're looking for best practices, anyone here, or anyone who works in trust and safety, would be happy to give you theirs. But I want to talk a little bit about common mistakes that studios might make when implementing these tools, or even when thinking about them. Do they overthink it? Do they underthink it? So I guess, just to talk through it, what are the common mistakes or misconceptions that people encounter when they're trying to achieve their privacy or safety goals here? Kieran, you want to kick it off for us this time?
00:39:43
Speaker
Yeah, I mean, having built these privacy programs from the ground up, even pre-GDPR, before the world we live in today,
Compliance Misconceptions and Company Culture
00:39:52
Speaker
I think the best thing you can do is think forward to: how do I build an engineering language and a culture within my organization that puts privacy first? And that means whether you're thinking about marketing, or thinking about, okay, what backend services am I going to integrate with, what features am I going to integrate into my game,
00:40:20
Speaker
all these things, whether you're a single developer or a team of 20 or a team of hundreds or thousands, embedding that privacy-first culture within the organization pays dividends down the road, because inevitably it will mean that things get picked up along the way where they otherwise wouldn't. I've seen examples of developers and engineers picking things up and saying, wait a minute. I'll give you a really specific example: if you run the argument that you are a general-audience game in the US, you don't want to be collecting age information and retaining users that are under the age of 13. And therefore, when you run a market survey, you don't want a category that says, oh, you're under the age of 13,
00:41:11
Speaker
because now every user who clicks that box is someone about whom you're imbued with actual knowledge. Legal might catch that, right? But it may never reach legal. And so if you've got that culture of privacy first, whoever is crafting that market survey on the marketing team is thinking, oh, wait a minute, we need to be sensitive about this; we need to be careful about this question around demographics and age. Again, it will pay dividends down the road in terms of mitigating risk. Mike, is there anything you want to add to that?
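Kieran's market-survey example can be reduced to a small check: if you position as a general-audience game and don't retain under-13 users, the survey itself must not manufacture "actual knowledge." The bracket labels below are illustrative only.

```typescript
// Risky: anyone picking "Under 13" becomes a user you now actually know
// is a child, undermining the general-audience position.
const riskyBrackets = ["Under 13", "13-17", "18-24", "25+"];

// Safer framing: the youngest bucket starts at your minimum age.
const saferBrackets = ["13-17", "18-24", "25+"];

function createsActualKnowledge(brackets: string[], minimumAge: number): boolean {
  return brackets.some((b) => {
    const m = b.match(/under\s*(\d+)/i);
    return m !== null && Number(m[1]) <= minimumAge;
  });
}

console.log(createsActualKnowledge(riskyBrackets, 13)); // true
console.log(createsActualKnowledge(saferBrackets, 13)); // false
```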
00:41:47
Speaker
Yeah, I think there are two related misconceptions that I want to talk about. The first and maybe broadest is this belief that there's one weird trick that will get you done with this, and then you can just move on with your life. And maybe that's a little bit of a pessimistic phrasing. I know everyone wants to be able to say, hey, we're done with privacy, we're done with safety, we don't have to worry about it anymore. But is there one weird trick where now you're done with user acquisition? Is there one weird trick where now your software architecture is finished and is just going to work perfectly for all of your players forever? No. These are, in fact, ongoing efforts.
00:42:27
Speaker
And I think where you see platforms get themselves into trouble is where they're not willing to actually put in that effort, or for whatever reason they don't put in that effort, and they try to use those one weird tricks to hand-wave stuff away. Whether that's attempting to disclaim COPPA despite the fact that you clearly have kids on your platform, or saying, hey, I don't want people to get toxic playing this game, so I'm going to try to make it harder for people to participate in chat. It feels like a neat thought for the first 10 seconds. But what you end up seeing is people will go somewhere else and just chat in an even less regulated way, and that's still going to blow back on your community.
00:43:12
Speaker
I think Helldivers is a good example of this recently. They had a game that deals with an intense topic that's liable to create a lot of toxicity, and they had the initial thought of saying, hey, let's try to avoid PvP, let's minimize all of these comms systems so that there's just less room for toxicity. And what happened is, when they first launched, they had a couple of weeks where they got glowing comments on how proactive they were to make sure that there was no toxicity and everyone was safe. And then people started to find all of the Discord servers and all of the other communities that these players had created.
00:43:54
Speaker
And they realized, oh wow, these communities are way worse than anything we've seen before, because there's no one home. There's no one trying to moderate this stuff. And in some sense, the Helldivers team had already thrown in the towel and given up their right to tell the players, no, you're supposed to behave differently, because they had already washed their hands of the whole thing. So I think there are a lot of these instances of platforms trying to find some way to have their cake and eat it too, and not really have to engage with all this, and
00:44:31
Speaker
that just comes back to bite them time and time again. The other misconception is that it would actually be good to have that one weird trick in the first place. I think the reason so many studios think that is because they see trust and safety and privacy and all that stuff as a cost center. They see it as a drain on the value of their business. And it's just not. People, it turns out, don't like experiencing horrible toxicity. Some people don't like experiencing, in a game designed for adults, lots of kids running around shouting and interfering with stuff that they shouldn't be part of. There are lots of these kinds of things that will drive people off of your platform.
00:45:17
Speaker
Again, I mentioned this case study we did with Call of Duty. We were able to do a pretty effective A/B test looking at the impact of having meaningful content moderation, and again, this is an M-rated, highly competitive game with extremely loyal fans. Thanks to only a single month of content moderation, there was a 30% larger number of people sticking around after that month in the moderated space compared to the unmoderated space.
00:45:49
Speaker
30% more users. Think about how much money that is. If that's the lens you have to look at it through, think about how much money you're leaving on the table. This is not a cost center. This is part of your customer acquisition strategy. And I think the studios that understand that better are the ones that are ultimately seeing more and more success these days. Thank you. And Tess, from your perspective on the legal side, I don't know if you can talk about cases, but have you been a part of anything where it's like, why didn't you think about this sooner? Or have you seen it from a different angle from these guys? I am going to echo the past complaints about not getting legal involved sooner, just because it's in my interest to say, please, please: it's so much cheaper to do it ahead of time, proactively, than it is reactively.
00:46:41
Speaker
That said, the two major misconceptions I see are actually kind of opposite of each other, but I'll start with: we need to be compliant with everything all at once. It's unattainable. You can't. I don't think any company could physically be compliant with every jurisdiction in the world; half of them conflict with each other, right? And there's a certain amount of data inventory management that you have to do in order to even understand if you are compliant, and I just don't see anyone at that level of sophistication. The second misconception is: all I need is a privacy policy and I'm done. To go back to the easy button, right? Most companies, and not my clients, because they're very well behaved, but most companies will just have a privacy policy and be done with it. That's it, box checked. And really, the privacy policy is but a drop in the bucket, the tip of the iceberg, of what
00:47:39
Speaker
the privacy regulations actually expect of you. So those are the major misconceptions, and those two are both based on the same lack of privacy education and awareness. And that's okay; not a lot of people read compliance regulation for fun. I understand that. I just wanted to echo something that Tess said there, about how you can't be compliant with everything all at once, and just acknowledge that we're all kind of joking about that because the alternative is to cry. But I do think it's worth taking a moment to recognize, not that any of us can really do anything directly about it, that it's so bad.
00:48:16
Speaker
It's so bad that we are in this state where all this stuff is so mixed and conflicting that there is no clarity. And it's great that there are great attorneys out there like Tess, and great platforms like what Kieran's building at KID, that are trying to help resolve all of that chaos. But I do think it's worth also taking a moment and patting everyone on the back that has to deal with this and saying: we feel the pain; it's bad; it shouldn't have to be this way, but it is; maybe one day we'll be a little bit less fragmented. But I think it's worth acknowledging that that's not
00:48:53
Speaker
what a healthy industry looks like. And on getting more regulators engaged, having more of those conversations effectively, there's a whole other set of things that I'd love to strive for over the longer term to clean a lot of that up. Yeah, and just on a more lighthearted note, because I realize that did sound kind of dire: the goal of compliance is to get started, right, to get started and to keep going. It's never going to be perfect. There's going to be a new problem every year, every quarter, something new for you to deal with and come up with a solution for. So the idea is to not get exhausted and to see this as a fun exercise in helping people exercise their rights, enshrine their own autonomy, and enjoy your service.
00:49:43
Speaker
Something you mentioned was policy, and having a policy is just a drop in the bucket. In our pre-call, you talked about policy versus implementation, and I can imagine what I think that means, but can you talk a little bit about the difference between just having a policy and implementation? Yeah, sure. The privacy professionals are going to be a little angry at me for misusing the term privacy policy, because per GDPR it should be privacy notice, and privacy policy refers to your internal policy. But that's a good way to illustrate this. Your privacy policy, as we know it today, is the policy document that lives on your website. Someone clicks onto your web page, they scroll down to the footer, and they can see the terms of service, your do-not-sell/do-not-share link from CCPA, and then your privacy policy from GDPR and other regulations. So your privacy policy basically sets out what you as a company do to protect user data. This is not an agreement.
00:50:40
Speaker
Users do not sign it. They don't agree to it. At best, they can acknowledge that they have read it and that they understand it, right? Which is helpful for you as the business, because really all you're doing here is providing notice and transparency: hey, these are the practices that we're using with your data, we're going to adhere to these practices, and if we deviate from those practices, the customers can say, hey, stop it. You give them a whole bunch of ways to say, hey, stop it, including exercising the rights that they can assert against you specifically as the controller of their data, or directing them to the other people who actually do control their data. We'll get into the taxonomy of those two terms at some other point. All that to say, policy is going to set up the rules of the road for how you interact with the user's data. But the implementation of that policy is quite different.
00:51:34
Speaker
When you're talking about implementation, you need interdepartmental buy-in. That means all of the departments within your company; we're talking finance, marketing, people who are very heavily interfacing with consumer data. They're the folks that are going to need to implement the privacy practices that you come up with. A privacy practice might be something like data tagging: I need to know whose data belongs to a 13-year-old, right? And in doing so, we would have a way to communicate that across the database to each department, so that they could then look at their own practices, and marketing could say, hey, we have this giveaway; should we include this list of users who are clearly under 13? And they can use it and implement the privacy practice in their own departmental workflow. So that's the difference between policy and implementation. There are a lot of processes we could talk about, but I'll save that for a deep dive.
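Tess's data-tagging example translates into very little code once the tag exists: tag a record at collection time, propagate the tag across departments, and let each workflow filter on it. A minimal sketch with hypothetical field names:

```typescript
// Tag records once, then every departmental workflow can honor the tag.

interface UserRecord {
  userId: string;
  email: string;
  tags: Set<string>; // e.g. "under-13", set when the data is collected
}

// Marketing's giveaway involves a financial transaction, so anyone
// tagged under-13 is excluded before the mailing list is built.
function giveawayAudience(users: UserRecord[]): UserRecord[] {
  return users.filter((u) => !u.tags.has("under-13"));
}
```

The hard part, as she notes, is not the filter; it's the interdepartmental buy-in that gets the tag written and respected everywhere.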
00:52:31
Speaker
This sounds a little bit like what Kieran was saying: policy at the core. Get your whole company to buy in, even though it's a little more than agreeing to it, and make sure that everyone understands: hey, this is what we do as a company; this is our core pillar, for lack of better words; this is privacy, this is what it means, and this is how we're going to adhere to it. Is there anything you guys want to add to that? All right. Let's talk about challenges you've encountered, either when people are implementing tools or just related to player safety and privacy. I don't know if you're able to talk to them or address them, but Kieran, have you run into any problems like that with customers? I think one of the biggest challenges, which I never experienced as an attorney, having lived the life that Tess lives,
00:53:22
Speaker
was actually implementing the advice that I had given for the better part of 14 years. Sitting there with a technology platform that we were building and thinking, okay, this is simple, right? The first thing you do is you build a universal age API: based on IP address, it returns a result that says, hey, this is the age of digital consent in that particular market. Let's start with that. That seems like a really simple thing that we should have everywhere, right? We can comply with the regulation and we can do that. And you're like, okay, that's great, wonderful. And then you sit down with a publisher, and the first thing they tell you is, oh, but I don't know what market my user is in, because I don't collect the IP address. It's passed from my underlying platform, my Xbox and my PlayStation, and they don't tell me that. They tell me the user is just in Europe.
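A minimal sketch of that universal age lookup, including the coarse-region fallback Kieran runs into next. The consent ages below are a small illustrative sample (check current law before relying on any of them), and falling back to the strictest threshold in a region is my assumption, not necessarily how KID implements it.

```python
from __future__ import annotations

# Age of digital consent by market (illustrative subset only).
AGE_OF_DIGITAL_CONSENT = {
    "DK": 13,  # Denmark
    "BE": 13,  # Belgium
    "FR": 15,  # France
    "DE": 16,  # Germany
    "IE": 16,  # Ireland
    "US": 13,  # COPPA threshold
}

# Coarse region signals, as a platform might pass instead of an IP address.
REGIONS = {
    "EU": ["DK", "BE", "FR", "DE", "IE"],
}

def consent_age(country: str | None, region: str | None = None) -> int:
    """Return the age of digital consent for a market.

    If only a coarse region signal is available (Kieran's Xbox/PlayStation
    case), fall back to the strictest threshold in that region.
    """
    if country and country in AGE_OF_DIGITAL_CONSENT:
        return AGE_OF_DIGITAL_CONSENT[country]
    if region and region in REGIONS:
        return max(AGE_OF_DIGITAL_CONSENT[c] for c in REGIONS[region])
    return 16  # conservative default when no signal resolves
```

The fallback branch is exactly where the "simple" API stops being simple: the answer depends on what signals each upstream platform is willing to pass.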
00:54:12
Speaker
Or they tell me the user is just outside the United States. And you suddenly scratch your head and you're like, wait a minute, this is beyond just regulation. This is implementation: the struggle of dealing with all the different stakeholders in this space, who all have their own interpretation of how to manage their own risk, and everybody else needs to somehow fit into that jigsaw puzzle. I think what's been the biggest eye-opener for me is working through those issues where I had previously given advice saying, oh, well, it's fine, the age of digital consent in Denmark and Belgium is this, they're your two key markets, deal with this. And now when I'm dealing with an Xbox or a PlayStation implementation or an App Store implementation, it's like, wait a minute. I only get the signals that are being passed from the App Store. So the first thing I need to do
00:54:59
Speaker
is have an intermediary layer, an arbitrage layer, that actually allows me to pass the IP address so that now I can
Moderation Practices and Community Response
00:55:04
Speaker
do something. And so you go down that rabbit hole and you're like, wow, this is so much more complicated than I ever could have imagined back when I was sending off my memo and thinking, that was great. I love it. From the outside looking in, you see a whole bunch of weeds, but once you start walking through there, there could be poison ivy, there could be thorns, all these other things you just don't see from the outside. So thank you for sharing that story. Mike, you have anything? Yeah, it's interesting. I'm almost the inverse here. From an integration standpoint, like Kieran, I totally recognize all of this complex data validation, and do you even have access to that? From our side, we can use some other metadata in a couple of particular cases, but all we need is the audio.
00:55:49
Speaker
So now, integrating to get audio is a technical challenge in some sense, but we have a great team, and we've been able to build that stuff largely one-and-done. Getting access to the data we need hasn't been the particular challenge. The challenges we do face are, one, a lot of studios don't actually know what their code of conduct is. They know the policy they wrote down, but how does that translate into actually making decisions about certain kinds of edge-case scenarios? What we'll often do with studios that say, oh, we know exactly what the rules are for what we do, is: here's our survey of 20 audio clips. Run that by five of your moderators to decide what action they would take, and let's see what the consistency is. It's not always great.
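That consistency test is easy to approximate in code. A minimal sketch, assuming plain majority agreement as the metric; Mike doesn't describe Modulate's actual methodology, so treat this as illustrative only.

```python
from collections import Counter

# Each row: the action five moderators chose for one audio clip.
ratings = [
    ["mute", "mute", "warn", "mute", "ban"],
    ["warn", "warn", "warn", "warn", "warn"],
    ["ban",  "mute", "ban",  "warn", "ban"],
]

def clip_agreement(actions: list) -> float:
    # Fraction of moderators who chose the most common action for this clip.
    _, top_count = Counter(actions).most_common(1)[0]
    return top_count / len(actions)

scores = [clip_agreement(clip) for clip in ratings]
print(f"mean agreement: {sum(scores) / len(scores):.0%}")  # 73% for this sample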
00:56:41
Speaker
So just getting that understanding: what do you think your code of conduct is? What is it actually? How do we help you bridge that gap by educating and training your moderators, giving them the insights and the tools they need to engage really consistently with it? That's one big challenge. The other is, honestly, studios with misconceptions about what we do and how that's going to impact player sentiment. The number of studios we've talked to who initially said something like, yeah, look, moderation is really important, but we don't want people to know that we're surveilling everyone, so can you just do it in a hidden way where no one ever knows? Now, first of all, we're not surveilling everyone. We've designed this really carefully to make sure we're not doing that terrible thing. Second of all, your solution to doing the terrible thing is to cover it up? What are you doing?
00:57:38
Speaker
And then third, when we actually do launch, and we've had the opportunity to do this with so many studios now, players respond positively again and again, even for franchises like Call of Duty that have been around for a long time and have these really loyal users. We still get people coming out of the woodwork saying, oh, thank God, I'm so excited to dive back in. I'm so glad that my comments have been heard and that someone is taking this seriously. And so we have to spend a lot of time educating and managing
00:58:15
Speaker
the anxieties of the studios, which I can understand and appreciate. They don't want to do something their players are going to be upset about, even if the players might be misunderstanding it. But I think we now have a really strong battle record of being able to show that we can communicate to players that this is not simply vacuuming up all of your data, that this is something a lot more careful and a lot more measured, and it does have a positive effect. That means you can not only dare to admit it, but actually be proud, make some noise, and celebrate this good work you're doing for your community. I think that can't be said enough. I remember hearing a stat a while ago that on Facebook and Reddit, about 10% of the audience are the ones that interact with each other and the other 90% are just lurkers. And you have a few
00:59:07
Speaker
bad people that speak up, and the 90% of other people just want to play the game. When you start hearing people continuously using racial slurs or doing terrible things, it's going to dissuade you from that game. But when you finally take care of that, it's just like, all right, I can go back to playing my game quietly and not have to worry about this stuff. So I just don't think it can be stated enough, in any form, anywhere, that this is something that is better for the user base. Yes, you're going to piss a few people off, that's just what it is, but the people who are going to get mad are the ones that want to drop racial slurs.
00:59:41
Speaker
And one really quick add-on I'll share there: when we first started going live, we were really worried about these misconceptions. So we worked with our studio partners to embed members of our community team into their Discord channels so we could answer questions directly as they came up. After we had done that with five or six games, we started to find other players were actually beating us to the punch. So many players play a lot of different games, and they'd seen us somewhere else, started to learn about it, and gotten a deep appreciation of what this is. Someone new would come in and say, oh my God, I can't believe they're recording everything. And before we could even get there, three other users would come in saying, man, read the product page. Here's what they do. They're not, actually.
01:00:27
Speaker
It's been really great to see that it's not just the community accepting it, but the community really welcoming it and building up their understanding around it. That's your community you're building as well, right? So yeah, you have advocates out there standing up for you, and that's the greatest feeling in the world.
Resource Management for Small Teams
01:00:44
Speaker
It's been a nice step forward, yeah. Tess, is there anything you want to add to that? That's the beauty of transparency.
01:00:53
Speaker
Yeah, my biggest challenge is mostly around resource management. I work with small teams; they don't really even have IT sometimes. And I need to figure out how I'm going to get these processes into place when it's just a few folks developing and defining the policy and the implementation steps. It does seem a little overwhelming at times, but there are certainly battles you can pick, easy wins, right? Privacy policies, number one, I suppose. But aside from that, I think the other biggest challenge is really, again, just the education around it: having to have the same conversations about what these terms mean and why they're important, and using a little bit of fear-mongering to say, oh, you could be fined up to 4% of your annual global turnover if you don't comply, that sort of thing. Guys, this has been a really awesome conversation, and a lot of great stuff came from it.
01:01:51
Speaker
The last question I want to kick to you guys, to wrap this up: what advice would you give a game developer who's looking to enhance both player safety and player privacy? Tess, do you feel like starting off? Figure out what information you need and get rid of the rest. That's my big tip for you, because the smaller your data footprint, the less you're going to have to manage. Now, if you have a sprawling data footprint and you like it that way, that's fine. Hire an expert. I think that's the next best tip I can give you. Thank you. Kieran? There are so many incredible tools popping up, and there is such an incredible industry of champions in this space when it comes to building inclusive and empowered experiences. Those are the two words we always use inside KID about everything: inclusive and empowered.
01:02:49
Speaker
The point is that you don't have to think that when it comes to, in our case, a youth audience, there's going to be a compromise. You can actually benefit from building an empowered experience for a youth audience and build a better community because of it. Thank you. Mike, take us home. Game developers, we're all idealists, right? We're building these games because we have this really cool vision of an experience that we want to bring to other people, that we want to give to other people.
Recap and Contacts
01:03:22
Speaker
Tap into that passion and take ownership over that whole experience from the get-go, and say, hey, if I want to create this experience for people, what does that entail? What are the different moving parts?
01:03:34
Speaker
Use that. That's the foundation of what safety by design is. It's not just safety by design; it's the experience you intended, by design, which inherently involves safety but involves all this other stuff too. I think the worst thing you can do is try to graft on safety or privacy as a sort of aftereffect, rather than recognizing that the whole vision of the game you're building is fundamentally about having fun and having positive experiences with each other, and then thinking from the get-go, how do I want to do that? And then, of course, as I mentioned before, just reach out to any of us, and to everyone else in the industry who is so passionate about this. This is such a passion-driven industry. There are so many bright people who've done this kind of work before.
01:04:23
Speaker
Make use of all the people around you who want to help. Great stuff. So, a lot of awesome topics and conversation points. Just a few I wrote down (and I learned I need better handwriting in the future): figure out the data that you need, and capture and keep only the data that's important to you. Set up a privacy-first culture in your company. I think it's good for any company in the world to understand the data you're capturing, what you're doing with it, how you're tagging it, and how you store it, because again, it's a lot easier to do this stuff upfront than to go back and do everything afterwards. Educate: educate your employees, educate your players, all that stuff. And remember, if you're an indie developer, you're an indie developer. Build the greatest game you can build, and ask for help. There's a whole bunch of experts in trust and safety who are willing to talk, have a conversation, and point you in the right direction. So
01:05:13
Speaker
build great games, and when you need help with the privacy stuff, or you need help scaling, again, ask for help. So I'll let you guys say anything else you want to call out or share. Tess? Well, thank you so much for having me. My name is Tess. I also go by Inver online. You can find me mostly on LinkedIn, sometimes Instagram. And if you want to check out the PreMac Rogers firm page, you can probably send me an email that way as well. Mike? Yeah, we missed the opportunity for all of us to pitch each other at the very end here, but I guess I'll do the self-indulgent thing and say, if you're interested in talking more about this stuff, reach out to me at mike@modulate.ai. We, again, focus on voice moderation from a product side, but my focus, and a lot of the passion behind our team,
01:06:07
Speaker
is around strategic design of really great experiences in a much more comprehensive way. You don't have to be using voice chat or trying to plug in a voice moderation tool for it to be valuable just to hear how everyone else is thinking about these same kinds of problems and to be able to jam together. So please reach out if you ever just want to chat. Kieran? I'll echo everything that Mike and Tess have said. Look, it's kieran@k-id.com, feel free to reach out anytime. I'm always happy to jam and have a chat. I love talking to people in this space who are passionate and mission-driven. Feel free to reach out to us if you're looking to build a youth-empowered experience: k-id.com.
01:06:52
Speaker
And that's really the focus of the platform we're building: demystifying all of the complexity around global go-to-market when you want to build the best experience for kids and teens. Great. And I'm ready to challenge you in Complex with proximity mines in GoldenEye when it's time, so we can hit that up. But again, you've given me more than an hour of each of your time, and I am eternally grateful. It has been such an educational experience for me. We'll have all the information for all of our speakers: check out modulate.ai, k-id.com, and premackrogers.com, as well as their websites and how to get in contact with anyone there. Because like we said, if you're ever wondering what to do next in the world of trust and safety, there are tons of connections we all have, and we can at least point you in the right direction. So again, thank you everyone for your time today. I'm looking forward to seeing you at a future Community Clubhouse. Thanks, and have a great day.