
Christian Nunes on Deepfakes (with Max Tegmark)

Future of Life Institute Podcast

Christian Nunes joins the podcast to discuss deepfakes, how they impact women in particular, how we can protect ordinary victims of deepfakes, and the current landscape of deepfake legislation. You can learn more about Christian's work at https://now.org and about the Ban Deepfakes campaign at https://bandeepfakes.org 

Timestamps:

00:00 The National Organization for Women (NOW) 

05:37 Deepfakes and women 

10:12 Protecting ordinary victims of deepfakes 

16:06 Deepfake legislation 

23:38 Current harm from deepfakes 

30:20 Bodily autonomy as a right 

34:44 NOW's work on AI 

Here are FLI's recommended amendments to legislative proposals on deepfakes: 

https://futureoflife.org/document/recommended-amendments-to-legislative-proposals-on-deepfakes/

Transcript

Introduction and Mission of NOW

00:00:00
Speaker
Christian, it's such a pleasure to get to meet you again. And welcome to the Future of Life Institute Podcast. Thank you so much for having me on, Max. I'm extremely excited about our conversation today. So for those of you who haven't had the pleasure of listening to Christian before, she is the president of the National Organization for Women.
00:00:20
Speaker
And maybe you can start by telling us about what this organization is. Yeah, absolutely. So the National Organization for Women was founded in 1966, and it is the largest, oldest feminist organization in the United States. It was founded with a focus on creating equality for women in all aspects of life, social, economic, and political, and on happiness and equality across generations.
00:00:48
Speaker
What we do is we work on education and awareness, we work on legislative advocacy, and we are a grassroots organization made up of members and supporters. And our work is really centered on making sure we're looking at it from an intersectional feminist perspective.
00:01:05
Speaker
So when I say that, I mean that we understand how multiple social issues are interconnected, depending on a person's identity. The levels of oppression you experience are going to be compounded and exacerbated because of your multiple identities.

Core Issues Addressed by NOW

00:01:27
Speaker
Our work at the National Organization for Women really truly focuses on using a multi-issue, multi-strategy approach to make sure we're fighting for true equality for women and girls, really women and families, and gender equality in all aspects. I would say that one of the most important parts of our organization is to make sure that we're
00:01:46
Speaker
creating narratives and uplifting the narratives of marginalized voices, and that we're speaking truth to power in the work that we're doing. And we don't back down from that. So we get involved in six core issues. And our core issues are ending violence against women,
00:02:03
Speaker
constitutional equality, reproductive rights and freedom, LGBTQIA rights, economic justice, and racial justice. That's how we frame the work that we do, and anything that falls within those six core issues is how we govern our policy.
00:02:20
Speaker
Great stuff. So before we go off and talk about fascinating issues like deepfakes and more broadly artificial intelligence and how it impacts these issues you mentioned, let's talk a little bit more about you first.

Christian's Journey to Leadership

00:02:34
Speaker
So can you say a little bit about where you grew up and the arc of your life that took you to the work you do now?
00:02:40
Speaker
Sure. So the story of my trajectory to being the national president of NOW is very interesting, because my background is actually in social work and mental health. But I always tell people it's a very direct trajectory if you really think about it. I was born and raised in Houston, Texas, and around junior high school we relocated to Arizona. So for the majority of my life, I grew up in Arizona.
00:03:05
Speaker
And I went to undergrad there and got my bachelor's in social work. Then I worked for a little bit before grad school, deferring my admission, and I worked as a victim advocate doing domestic violence advocacy, and also with the juvenile justice system with those who were victims of sexual abuse and things like that. So I took time off and worked, and that was a really interesting place, because I responded to
00:03:33
Speaker
calls with the police. I helped take people to court for protective orders, helped make sure they understood the court process, and worked really closely with victim witness services. And it really gave me good insight into what really happens with policy and what happens in real life. So as I said, I took a year off, and then I decided I really wanted to work more directly. I went to New York, to Columbia University, for my master's.
00:03:58
Speaker
And I did clinical work and focused on policy and health, mental health, and disabilities, really understanding that to truly make change and impact in communities, you have to get into that community and work from a healing and justice framework. But along that path, I still saw the disconnect between what policies and procedures look like on paper and how they actually play out
00:04:24
Speaker
in real life when you're trying to serve communities, because I was still working as a community organizer outside of my actual career. So from being a community organizer, I initially got involved with NOW and kind of worked my way up from local to state to national.

Policy vs. Real-life Implementation

00:04:42
Speaker
And so now that I'm in the seat of being the national president, it's an opportunity for me to communicate: here is where the disconnects are in these policies, in these bills. Because a lot of times, and I don't think it's intentional, Max, but what happens is
00:05:01
Speaker
when people write policy, sometimes they forget the person in the process. They leave the person out of it, and that's a problem. When you're working directly with people, you see where they forgot about trauma, how trauma impacts a person, or what marginalized communities are experiencing. So
00:05:21
Speaker
it's really helpful for me to have that experience now leading the organization, because I'm able to really talk about and identify those disconnects in legislation and help communicate how we can make it better informed to really serve people. Fascinating.
00:05:38
Speaker
Yeah, so we are, of course, very, very excited that you and the National Organization for Women have joined the Ban Deepfakes campaign (bandeepfakes.org) to rein in deepfakes.

Deepfakes and Violence Against Women

00:05:50
Speaker
Can you say a little bit about why your organization decided to support this campaign, and what you see as the relationship between the deepfake problem and women's issues in particular? Why is this a women's issue?
00:06:01
Speaker
Yeah, well, first I want to thank you for even approaching the National Organization for Women with the opportunity to join this campaign. When we talk about what's happening, when we talk about gender-based violence, when we talk about ending violence against women,
00:06:17
Speaker
we have to really think about it in the year 2024, right? Before, we only wanted to think about it in terms of intimate partner violence, direct intimate partner violence: physical abuse, emotional abuse, partner sexual abuse, things like that.
00:06:33
Speaker
But in this new day and age, it has expanded and changed completely. We know social media has a major impact on so many people's lives. And when we're talking about relationships, and we're talking about assault and harassment and stalking and gender-based
00:06:51
Speaker
harassment and stalking, we have to have the conversation of what that looks like. We're talking about technology, AI, social media, and what they have caused. And there is a direct correlation where cyber-related harassment
00:07:09
Speaker
leads down the pathway to actual physical harassment as well. And the harms are just as traumatic. They're just as detrimental. They're just as life-altering. So we can't have a conversation about protecting women, or truly figuring out how we
00:07:27
Speaker
provide full protections for women and create safe spaces for women, if we're not having a conversation about deepfakes and AI at all. Because we know that the majority of the deepfakes that are created of women are sexual in nature; they're deepfake porn, right?
00:07:42
Speaker
And then on top of that, we hear about a lot of celebrity cases, but we don't realize that the vast majority of those cases involve marginalized persons, people who don't have the means and the money to be able to make that fight, right? So they're silenced again. So now we're marginalized in another community, within this AI and social media space. And so this Ban Deepfakes
00:08:08
Speaker
campaign was so important, because you all take it a step further, which I was really happy to see. You don't just say hold the person who perpetrated the violence accountable. You also hold the tech firms, the tech companies, the developers, everyone who's part of creating an environment and ecosystem for abuse,
00:08:33
Speaker
toxicity, violence, and gender-based violence accountable. Because if they did not have the means to create the deepfakes, the deepfakes wouldn't exist, right? So by you saying, we've got to think about why we're not holding these people accountable, I thought, I'm
00:08:51
Speaker
so glad they finally did that, right? Because so many people haven't done that. They just kind of disregard the fact that, hey, this person didn't create this app. If the platform kicks them off and takes away their privileges, then they don't have the opportunity to continue to abuse people and abuse women. So that's one of the main reasons why
00:09:13
Speaker
we decided to sign onto this campaign: because we see how you all are being holistic in your approach and understanding that everyone has to be involved in this. Everyone has to get involved in protecting women. It's not the responsibility of only women and children, but men and LGBTQ
00:09:31
Speaker
or gender non-conforming people as well. It is the responsibility of everyone to make sure that they are doing their best to provide a safe space and end violence, whether it's in-person, physical, or cyber-related. Now, that's such a good point you're making there. The people who create
00:09:52
Speaker
most of these sexual abuse deepfakes, you know, they're often 14-year-old boys who actually don't have the technical skills to create and build their own AI tools to do it.
00:10:03
Speaker
Which means if you can put guardrails and liability on the companies who provide these tools, it really makes a huge, huge difference for the better.

Centering Survivors' Voices

00:10:12
Speaker
And I'm also really glad you brought up this fact that it's not just very high profile celebrities like Taylor Swift who are victimized, but the vast majority of the victims are actually teenagers and other very vulnerable people who are not super famous, who most people haven't heard of. So how can we give voice to them?
00:10:30
Speaker
How can we make more of their stories well known, so that people appreciate the point that it's not just Taylor Swift?
00:10:39
Speaker
Yeah, thank you for asking that question. Well, it goes back to making sure that we're always centering those who are directly impacted, right? So as we're doing these campaigns, as we're pushing these bills, we're making sure that we're also going to those survivor-based organizations and groups and coalitions that are doing this work, reaching out to them and asking them:
00:11:02
Speaker
from your experience, from those survivors, from those victims, what do they think about this? Do they feel like this encompasses everything they need? What else do you feel would have helped in your situation? So we're really taking into consideration what they believe will be beneficial, because they've lived it. They know firsthand. And it's always important that we're uplifting their narrative and their voice in this first, more than anything.
00:11:27
Speaker
So I think that's one way we definitely can do it: always make sure there are several seats at the table, not just one, but several seats at the table for those who are living this, experiencing this, the directly impacted activists and advocates and survivors,
00:11:42
Speaker
so that there's a good, enriched conversation about what needs to happen, including things that sometimes we don't know about, because we think we know everything. But people who've gone through this can tell you a little bit more about some of the technicalities and some of the unspoken parts of their experiences. And when we include them, we have better ways to
00:12:07
Speaker
be preventative, but also to find solutions for creating safety and healing for those people. So I would say that. And the other thing I would say is, this is not about trying to punish anybody,
00:12:23
Speaker
but really about thinking about how we create our societies the way we really want them to be. If we're always saying that we want to leave our world in a place where our legacies and our futures have a place to grow and be well and healthy, well, what are we doing to make sure that's happening?
00:12:48
Speaker
Because right now, what they're facing is abuse online, predators online, exposure to all kinds of things that they don't need exposure to before their brains can possibly even fully understand,
00:13:06
Speaker
developmentally, what's going on. And so, as you said, a 14-year-old boy might take a friend, or some girl or boy that he liked, and make a graphic image of them because he's upset they broke up with him.
00:13:22
Speaker
They don't understand the consequences of that. They're young, they're acting impulsively, and there's no one educating them about it. So we have so much work to do. And I think that's why there have to be education campaigns at school about this. There has to be work with schools, with survivors; everyone has to be brought to this table to make sure we're solving this. So I'm curious, actually, what kind of support networks are out there right now, if there's anything you can share

Legislative Efforts on Deepfakes

00:13:52
Speaker
with us.
00:13:52
Speaker
For example, there are, of course, some support networks out there for survivors of sexual assault and so on. But are there any support networks that you know of, that someone who has just found out that they are the victim of deepfake sexual abuse can reach out to? Could you speak a little bit about what the situation is there?
00:14:13
Speaker
So I don't know of any that are specifically designed just for deepfake sexual abuse, but I do know there's RAINN, there's Joyful Heart. Those are the two main ones, I would say. RAINN is a really popular hotline,
00:14:28
Speaker
the one that everyone goes to when they've experienced any type of abuse. There's also the Joyful Heart Foundation, which provides resources. So there are ones out there that exist when we're talking about experiences of sexual abuse and harassment and things like that. But I would probably have to look and find out; I have not personally heard of one that's specifically designed just for cyber flashing
00:14:57
Speaker
or deepfake abuse or anything like that. But it is probably going to become a need, because it's a very different, nuanced type of experience that people are going to have, and it's something I think we're going to have to start addressing sooner rather than later. Yeah, so that was rainn.org, for anyone watching this who has been victimized or knows someone. I think this is a really important point you raised, because
00:15:28
Speaker
When something like this happens, one wants to make it really easy for the victim, not just to find someone who can talk to them who actually understands the technical aspects of what happened.
00:15:37
Speaker
They can help them contextualize and get support, but also with practical things: how do they report this? Who do they report it to? What do they do? And maybe help follow up also, because it's so new. I have a strong concern that much of this just completely slips through the cracks of the system. And that's also a segue to talking about the current legislative situation. I know you do a lot of wonderful work where you go educate policymakers
00:16:05
Speaker
in Washington and elsewhere. Do you want to say a little bit about how you see the current legislative situation on this, and also how much support, or not, you feel you get from policymakers when you talk to them about deepfakes? Well, I'm actually quite encouraged by the current legislators and Congress, in their productivity and how much they're engaged right now with trying to address deepfakes.
00:16:34
Speaker
There's a bill on the Senate side and there's a bill on the House side that have been rolling around. I think there's a third one coming out, but I'm not quite sure. They're really working on addressing deepfakes and the influence and the impact they're having, making sure there are some guardrails in place and
00:17:01
Speaker
some accountability. There's the DEFIANCE Act, from the Senate side, which stands for the Disrupt Explicit Forged Images and Non-Consensual Edits Act of 2024.
00:17:15
Speaker
So this one guarantees that there are provisions to stop the continued growth of these non-consensual, edited images, the deepfakes, and also to have some guardrails in place for when it does happen and make sure there are ways to address it. And on the
00:17:39
Speaker
U.S. House side, there is the Preventing Deepfakes of Intimate Images Act. That's one coming from Representative Morelle's office, and that's another one that's really important, with very similar language. I mean, not the exact same language, but the exact same point: making sure there is accountability, making sure there's protection,
00:17:59
Speaker
making sure we're doing what we can to create safeguards and accountability and things like that. So the fact that there's two. And then there are also some other bills lingering around that have some mixed language on cyber flashing, deepfake porn, and revenge porn. So to me, I think they're really taking this seriously and making sure they're pushing this. And I would say they have bipartisan support.
00:18:27
Speaker
Everyone's concerned about protecting children. Everyone's concerned about protecting the most vulnerable. And these bills are seeing bipartisan support, which is great. So the likelihood of these getting through, I think, is going to be really positive, and I'm optimistic about it. And I want to just encourage everyone: how you can be active, if you care about somebody or you know somebody who's been affected, is to call your congressional member,
00:18:54
Speaker
talk to them about why you support these pieces of legislation, why you want people to vote for them, and make sure they pass, because we have to start getting more active in protecting women and children.

Bipartisan Support and Personal Reflections

00:19:14
Speaker
It's really heartening that this is still such a bipartisan issue. It's getting harder and harder to think of any bipartisan issue these days. And yet, you know, I saw this recent poll where about 90% of Democrats and about 90% of Republicans
00:19:31
Speaker
want to really crack down hard on deepfakes. I can't think of any other issue at this point that's so bipartisan. So it's really great that you're trying to push them across the finish line, to not just talk the talk but walk the walk and pass some laws here. Right. And Max, I think it all just shows that when we're really talking about protecting children and making sure that they
00:19:55
Speaker
aren't targeted and harassed, people will eventually start thinking about how the issue impacts them, their life, and their family. And nobody wants their sister or their child or their loved one to be a victim of that. And they know the damage it can do. So I really think that's what's driving this bipartisan support. We all know that once something is out there on
00:20:18
Speaker
the World Wide Web, how difficult it is to get it removed and taken down, and how it just turns into this metamorphosing, continuing image, on and on and on.
00:20:33
Speaker
There's already a problem with people taking people's images and making memes out of them without their consent. We've already gotten to the age where people have their cameras out, recording people without them knowing and using the footage for laughs. And there are a lot of problems we're having with
00:20:49
Speaker
non-consensual images of people. And I'm not talking about porn; it's non-consensual images of people that are floating around the internet, and people don't understand the impact that has on a person's life when that person has been ridiculed or mimicked,
00:21:05
Speaker
become a laughingstock, or been used just to gain likes or attention. So when you already have that, and you add in the compounded part of turning it into a sexualized thing, not understanding the impact that sexualizing something has for a woman, and how people misread cues all the time.
00:21:30
Speaker
People don't understand whether that person did it or not, and they don't care. And now you're adding a whole other layer of trauma. You're adding another layer of abuse. And every time that image comes out, that person's being re-traumatized. They can't get past it. They can't get over it.
00:21:47
Speaker
It's something they're going to have circulating in their life for a very long time. So we really have to crack down on this. We can't keep allowing people's lives to be affected and altered without their consent, without their knowledge,
00:22:03
Speaker
just because we think it's funny, or because we're angry, or because someone set their boundaries. Couldn't agree more. And it was great to get your overview; there are many legislative efforts underway, so people watching or listening to this podcast can see what's going on and reach out and encourage their senators or representatives to push ahead with this. We'll put a link in the chat to a
00:22:30
Speaker
very nice little summary graphic that my FLI colleague, Alexandra, has made summarizing all the latest efforts, so people can see how much is going on. Oh, and just to name names for the DEFIANCE Act: it's led by Durbin, Graham, Klobuchar, and Hawley. Those are the ones leading the DEFIANCE Act, and Morelle is leading the one on the U.S. House side. So I just want to put those names out there so we know who is leading those acts.
00:22:59
Speaker
Great. And looking through all of them, it's interesting: even though there are some that I think could be made stronger, what I found very encouraging, Christian, was that if you take the strongest parts from all the different bills and put them together, you actually get all of the key things that our campaign is asking for: go after the whole supply chain, emphasize that
00:23:23
Speaker
a deepfake is something that's non-consensual, et cetera. And the more these policymakers feel support from their constituents and elsewhere, the more likely it probably is that they'll take the best parts from all the different acts and pass something good.
00:23:38
Speaker
We've spoken very much now about the horrible use of AI for non-consensual sexual abuse. But in addition to that, as if that wasn't bad enough, we also have a massive explosion in deepfake fraud, which is up 1600% year on year. Some mother in California got called up by what she thought was her daughter, saying she had been kidnapped; it was all fake. A lot of companies are losing
00:24:06
Speaker
a lot of money through deepfake scams, and a lot of older people are being heavily victimized by someone calling them up who they think is a relative and is not. And then politicians, of course, are also feeling this quite directly, with deepfakes of them, which can undermine democracy. So in a way, it feels like maybe the perfect storm is brewing. How hopeful are you that this perfect storm can actually lead to some legislation?

Need for Comprehensive Deepfake Legislation

00:24:33
Speaker
I'm hopeful. I don't know if it'll happen before this election. I'm hoping it does, though. I'm really hoping it does, because I think it's important that it happens as soon as possible, for exactly the reasons you just named. People are being scammed.
00:24:48
Speaker
The fraud, the scam level, is affecting people's livelihoods, right? So that goes into the whole economic realm of this, and we are witnessing people who are on disability, people who are on retirement,
00:25:06
Speaker
even people at all levels of income, being affected by this, getting these voice calls or these messages or these texts purporting to be someone they know. And actually, I'll share: my line got spoofed, and someone was sending out messages from my phone, one of my numbers. Wow.
00:25:28
Speaker
Someone texted me and said, did you just text me and ask me for this? I said, that was not me; please delete this now. So I sent a message out saying: if you got a text message from me, this was not me. And the problem is, this is where it lies: if someone trusts you as a person, if they think you're a trustworthy person, then they're going to trust what you ask them for. Or if they have a good relationship with you, they're going to trust what you ask them for, and they're going to
00:25:55
Speaker
just do it, because they trust you, right? And so this just happens. And I was like, well, why wouldn't they call me? But then they said, I didn't think I needed to call you.
00:26:08
Speaker
And I was like, you know, that makes sense. And this is what's happening. This is what we're seeing, right? Because why? They trust me. They know I'm a person of good character. They know I would never ask. And I had no idea that it had happened, and I had to go through this whole scrubbing thing and figure out what was going on. But this is happening. And then I know someone else whose mother was scammed out of a lot of money, when she was already on a very limited income.
00:26:38
Speaker
She was spending a lot of money, and actually borrowed money she didn't have to help this person, because of these deepfakes. That's how serious this is. People are getting into debt over deepfakes. They're getting into debt from the fraud. It's affecting people's relationships. It's a horrible thing that is causing a lot of chaos.
00:27:07
Speaker
And so this is why the need for more legislation, I think, Max, needs to be looked at from a holistic perspective, right? How does it impact us? We're talking about sexualized images, right? That's one. But then you're also talking about fraud; that's another. Academics, professions, another. We have to look all the way around at how deepfakes
00:27:32
Speaker
and AI really truly start impacting people's abilities to have successful and sustainable lives. How is it really getting in there and impacting that? And then, what do we need to do with legislation to address it? Because to me, as long as we keep piecemealing it, we're going to miss something. So the best way is to get some legislators together.
00:27:52
Speaker
Sit down at the table. Let's talk it out. Let's break it down and look at this as a whole person. What would be the most devastating places that a deepfake could impact them, that could affect their ability,
00:28:07
Speaker
their housing, their ability to pay their bills, their relationship with their families, the harassment they face? And then, how do we connect this? How do we put parts in this legislation that are going to hit all these different areas? This is where I believe, and this is my social work background speaking,
00:28:29
Speaker
that we have to start looking at things from an ecosystem perspective, where we're looking at each part and what's going on with it, where the person is the core, or our community is the core, or the issue is the core, and we look at all the things connected to them and how those things are going to impact them. And then that's how we start building out this legislation and this bill, so that we're connecting every single piece and we're not missing anything.
00:28:56
Speaker
Like you're saying, if we keep just doing it piecemeal, we're going to leave something out. And when we leave those things out, or, like you said earlier, when one bill takes pieces of all of them, you get this one great bill. So that's part of what we have to do. And I think we have to really look at that, because I'm really afraid of this fraud. I'm afraid that people are already in a place of feeling like they can barely survive, or they're struggling,
00:29:20
Speaker
or they're in a place of feeling like this world is kind of a fragile place and our democracy is a fragile place. It's another way of silencing people when you take away their actual consent. And this is what deepfakes are doing: taking people's consent. So that's another way of silencing.
00:29:36
Speaker
Well, I guess one synthesis of what you said there, zooming out to the big picture and not being piecemeal, is that one should not just pass a bill that cracks down on deepfakes used for fraud, or a bill that just covers deepfakes for non-consensual sexual abuse, or ones that can affect elections, but rather go after them all at once and say:
00:30:00
Speaker
no, a deepfake is simply something which can be taken as real, is of someone, and which you don't have their consent for, regardless of whether it's for fraud or sexual abuse or throwing an election.
00:30:13
Speaker
The laws should catch all of this. And then also, on the last thing you said there about taking away people's consent: can we maybe think about this as a fundamental right, in the same sense that we believe we should have a right to our own beliefs, a right to have whatever religion we want, et cetera?
00:30:36
Speaker
We should also have a right to our own identity, so that people can't just take your identity and do whatever they want with it. Well, it is about bodily autonomy, right? It is one of our rights. I mean, it's one of our human rights. It's that we have bodily autonomy, free agency. We can't have free agency when we have these deepfakes taking that from us.
00:31:00
Speaker
So, yes, I agree with that. As for one big piece of legislation: well, let me just go back and say that I'm very happy about the legislation that's out there. I'm very happy with what they're doing. I'm so thankful. NOW has endorsed it as an organization, and I personally endorse it.
00:31:22
Speaker
I'm very happy about it. And I think it's okay to get down to defining things a little more specifically, but I think there still needs to be one overall bill, like we're talking about, that really targets the whole issue of deepfakes: what a deepfake really is, breaking it down. Because sometimes, I think, when people hear deepfakes, they automatically just assume it's deepfake porn. They don't understand how it manifests in other areas.
00:31:51
Speaker
So if you create one piece of legislation that really bans deepfakes in general, which are non-consensual, right,
00:32:04
Speaker
tampering with bodily autonomy, taking away your free agency, then the other ones that fall under that umbrella are going to be more enforceable, because you have this one piece that basically explains what it is and says: this is not okay, this is against the law when there's no consent, we don't support non-consent. And then the other ones will be more enforceable, and you can have more resources flow from there.
00:32:32
Speaker
Yeah, that sounds very promising. And on an optimistic note, it's worth mentioning, I think, that this can be perfectly consistent with the First Amendment and free speech. If one simply bans things which are non-consensual, and
00:32:47
Speaker
which a reasonable person could actually believe are real, then satire, the really funny kind, like The Onion, is still allowed, but it completely eliminates the fake sexual abuse, eliminates the fraud, and the political impact.
00:33:05
Speaker
I wanted to also ask, just so people don't get too depressed by thinking that it always has to take forever to have impact: do you think that even if it takes a while to pass the laws, just seeing statements from policymakers on both sides of the aisle saying, yeah, we support this, we're going to do it, might make tech companies, for example, see the writing on the wall, that this is coming down the pike, and actually start taking some action, cracking down on it long before
00:33:34
Speaker
the laws come? Oh, absolutely. Absolutely. I think it will. I think it's important that motion starts happening before the laws pass. And I also think it's important that everyone starts reaching out to those tech companies and having conversations about how we can assist them to do so. Before it gets to accountability: how can we help you make sure that you're doing your part in preventing this and being accountable? I think that's an important conversation to have too.
00:34:03
Speaker
How do we pull you into the solution, more than anything, so it doesn't have to get to a place of accountability, because you bought into it? You understand that you can still be successful as a tech company and still not allow some things to occur within your company.
00:34:20
Speaker
And the more educational awareness there is about this, the more it helps members of Congress push forward, and helps their understanding and willingness on why this is important and why it needs to pass. But it helps empower the community as well, about what they can do in their communities
00:34:39
Speaker
to provide supports, like we were talking about earlier, supports and resources for those impacted. Before we finish, maybe we could come back to the even bigger picture you were talking about earlier, and just zoom out a little more broadly and look at the impact of AI on society and on your work. So I'm wondering if, at the end here, you
00:35:00
Speaker
wanted to say a little bit more broadly how NOW views issues of transparency, fairness, and bias in AI. What are the things that you feel are most important here, things that you'd like to see done, guardrails you'd like to see put in place?
00:35:15
Speaker
So NOW is just now getting into this area of work with AI, but one thing that's important to us, that I think threads through the values I mentioned, is protecting and creating a safe space for women, right? And so when we think about why this is important for
00:35:34
Speaker
the organization, it's because it creates more avenues for legislation and advocacy to guarantee that in all aspects we're creating justice and fairness. We're looking at it from everywhere it might impact a woman's life, right?
00:35:54
Speaker
I would say we're big on transparency, and we're also big on making sure that we're doing our part to go from education all the way to the top. We're grassroots-based, so we understand the power of the community and the power of people. And that's a big part of how this organization was founded.
00:36:14
Speaker
So for us, that means centering communities, that means reaching out to your congressional members and letting them know what's happening in your communities, that means relationships and coalition building and movement building. Those are the things that
00:36:32
Speaker
our organization is known for. And so, as we have these conversations about AI and about the work in this space, it makes sense, because we recognize that this is affecting our community and the communities that we serve. So if we're looking at how we continue to protect our community, we have to include this in the conversation.
00:36:56
Speaker
Great, Christian. Well, thank you so much for taking time out of your busy schedule to have this conversation with me, and thank you so much for the wonderful work that you're doing. Thank you so much for having me on, and we look forward to continuing to work with you on the campaign.