
Her Media Diary Episode 46: “Curating Datasets against AI-Facilitated GBV” with Mwende Mukwanyaga

E46 · Her Media Diary

Mwende is the Co-convener of the AI Salon by Webworks, with expertise in AI ethics, data journalism, ethnographic research, and dataset building. She also specialises in critiquing models and systems for bias, and teaching media and institutions how to adapt to AI in a meaningful and responsible manner.

In this episode, she joins us to talk about tech-facilitated GBV, the risks of AI-generated harm, and what it means to reclaim data with care and creativity.

Mwende reminds us that data isn't neutral; it's shaped by who gets counted and who gets erased. And as generative AI becomes more powerful, we need storytellers like her, brave enough to collect what's missing, and wise enough to tell it with care.

Subscribe, leave a review and share this episode with someone who needs to hear it.

If you’d like to join an episode of this podcast, send an email to yemisi@africanwomeninmedia.com. Or visit our website at www.hermediadiary.com

Subscribe and follow Her Media Diary on all your favourite podcast platforms. Also, tune in to our partner radio stations from anywhere across Africa. And don't forget to join the conversation using the hashtag #hermediadiary.

Transcript

Podcast Introduction by Dr. Yemisi Akinbobola

00:00:03
Speaker
Welcome to Her Media Diary, the podcast where African women share their real stories, bold moves, and behind-the-scenes moments that shaped their journeys. I'm your host, Dr. Yemisi Akinbobola, and with each episode, we're pulling back the curtain on what it really means to build a media career, to break barriers, and to stay true to your voice.
00:00:24
Speaker
So whether you're just starting out or you're already making waves, this space is for you.

Focus on Technology-Facilitated Gender-Based Violence

00:00:30
Speaker
In today's episode, our focus is on the rise of technology-facilitated gender-based violence, often abbreviated as TFGBV, and how artificial intelligence and social media platforms are shaping and sometimes threatening women's safety and freedom online.
00:00:49
Speaker
Today, I'm joined by someone who doesn't just report the data, she makes it. Mwende Mukwanyaga is a journalist, ethnographer, and creative whose work refuses to separate storytelling from lived experience.
00:01:04
Speaker
She's part activist, part artist, part builder, literally compiling her own datasets on gender-based violence when institutions fail to do so. She joins us to talk about TFGBV, the risks of AI-generated harm,
00:01:21
Speaker
and what it means to reclaim data with care and with creativity. So stay with us.

Interview with Mwende Mukwanyaga on Activism and Data Storytelling

00:01:39
Speaker
Okay, so Mwende, welcome to the Her Media Diary podcast. I've been looking forward to this conversation, but before we go into the AI and the heavy stuff, who is Mwende outside of data and journalism? Tell us about your upbringing and what shaped your creative spirit.
00:01:56
Speaker
Right, so I think outside of Mwende the journalist is Mwende the fermenter. So I do ferment teas.
00:02:07
Speaker
I brew a lot of kombucha and I've also grown a bit of the ginger bugs, so I'm making probiotic sodas and kefir. I'm into the fermenting bit a lot.
00:02:23
Speaker
I have recently picked up jewelry, and I think jewelry for me has been a very interesting space because it requires intricacy.
00:02:36
Speaker
It's a very delicate art, and it's also an art where mistakes scream; they're visible, very visible. And outside of that, I also really enjoy watching birds, a lot, a lot.
00:02:54
Speaker
So I've set up my balcony in a way where I'm able to sit outside every single morning to watch the birds. I've always been a very curious person, right from when I was a child, and I think that has really shaped my coming into this space of AI, and I'll get to it. Just in the larger sense of the everyday, for example the fermenting: the slowness, the time it takes, how quickly the introduction of, let's say, a ratio which shouldn't be there means your entire batch is gone, you know?
00:03:35
Speaker
So it's teaching me a lot. Just being able to translate my world in the everyday, the hobbies, the non-serious, is giving me a lot of structure for how to think about the ethical frameworks that I'm hoping to build and continue to investigate and interrogate.

The Role of Hobbies in Shaping Work Ethics

00:03:55
Speaker
That's really interesting, how something like a hobby and the things in your personal life actually shape how you approach these very important conversations.
00:04:06
Speaker
So I will confess that I do not have the greenest hands when it comes to gardening. It's one of those things; I just look at my garden and I'm exhausted already, and I've not even stepped outside.
00:04:17
Speaker
But actually, in the last couple of weeks, I saw the garden of a friend of mine. He posted it on social media, and I was like, wow, John, your garden looks amazing; look at mine, it's a mess, what do I do? And I've actually picked up on the art of starting off with a small bud, watering every day, weeding, and creating a safe space for it to grow. It's just been about three weeks and I've only done a small part of my garden, but actually there's a sense of, I don't know how to describe it, it's quite therapeutic.
00:04:52
Speaker
You know, it teaches you patience as well, because you realize that actually you're not going to get that whole garden done in the next year. Just don't rush, you know; you're going to need to take time.
00:05:05
Speaker
So it sounds like some of these things, patience, and structure as well, are something you're taking from your personal life. How did it begin? How did all of that manifest for you from a young age?
00:05:20
Speaker
So from a young age, I have grown up with a mom who is very, I'll say, permissive; she allows me to be.
00:05:32
Speaker
And the allowing was also, I think, allowing me to ask a lot of questions and allowing me to do a lot of experiments. I've run so many losses, you know; I'd experiment with
00:05:46
Speaker
anything that I set my heart on. This week I'm going to do beading, and then I'm bored, so next week I'm like, oh wow, new hobby found. And I think in her entertaining my curiosity, I grew into a space where my brain is able to, more or less,
00:06:13
Speaker
move through various mediums and experiences. Even when you're learning how to do things when you're small, it's, for example, realizing how something you learned in the kitchen, let's say that if you apply oil on these labels,
00:06:32
Speaker
the labels that they stick on ceramics, like the cups, etc., they will come off, right? And then when you come to class, I'm able to very quickly relate that in my chemistry: when the teacher is talking about this chemical reaction, it's not something that's so removed from your everyday.
00:06:55
Speaker
So I feel like it's been a great experience for me: I'm able to, or I'm allowed to, experience a myriad of things, and in turn I'm also able to translate these things to
00:07:09
Speaker
the everyday. Yeah, absolutely. I mean, I've got three girls; my eldest just became a teenager over the weekend. Thank you, thank you, I feel as if it's my birthday. But parenting is not easy for anybody, and it's also not easy when you're trying to be that parent that allows the child to be the child while also trying to maintain certain boundaries. From your perspective, how important was that for you growing up, being allowed to explore your curiosities, as you put it, while also navigating those boundaries that you inevitably have as well?

Questioning Authority and Learning Responsibilities

00:07:49
Speaker
So I think that's a very beautiful question, because for me, it first allowed me to ask institutions questions. And by institutions I even mean the most basic: your parents. Being able to question your parents gives you a lot of room to also start questioning bigger systems as you grow, and you're able to be like, huh, this here doesn't feel right; one, two, three would look different; or I feel this in this type of way.
00:08:20
Speaker
And the second thing for me was that in the permission, you also start learning the responsibilities and the limitations, which was also extremely important, because it's important that you realize there are going to be consequences when you do certain things, right?
00:08:39
Speaker
So yes, you do them. Go and try to light a fire: you'll catch a beating, you will perhaps get burnt. The consequences will be there, but I think it's just that space for...
00:08:57
Speaker
I think it's also given me a lot of permission with failing, right? Because that's where a lot of us are also very touchy. So it's a space where I feel okay to try things even when I am not sure of the outcomes, or when I know there will be consequences which might be painful. So yeah.
00:09:20
Speaker
That aspect was extremely important in molding the person moving through the world now. Because even the activism ties a lot into my being able to start the questioning pretty much earlier on.
00:09:39
Speaker
Yeah, absolutely. And so you've described yourself as a journalist, a photographer, a florist, and we're hearing about the teas now, a mixologist, so now we understand that. Which is such a rich palette, right? And we've talked about how your upbringing kind of shaped that.
00:09:56
Speaker
But when did you realize that storytelling and especially data storytelling was your way of resisting silence?

Storytelling and Data Journalism as Tools for Change

00:10:03
Speaker
So I will give a very small spoiler. I was in a competition, which you should not look up.
00:10:13
Speaker
Okay, my team failed miserably. However, within this competition, it was investigative journalism. So all my life before that, I knew I wanted to be a lawyer because I talk a lot.
00:10:26
Speaker
And I do stand up to people a lot, so my entire lifetime I wanted to be that. And my compositions were great. So in Kenya, in English, as part of the test you have to write an essay composition, and it's quite a big chunk of the marks. I used to excel at that because it requires a lot of creative thinking.
00:10:51
Speaker
So with the storytelling, I think I've always had it in me in terms of the writing. But the journalism, the data journalism, began during that competition, because we had a session where, I remember one,
00:11:05
Speaker
the setting, I'm going to set this in the sense that they were trying to explain how many bombs the US has rained on different Middle Eastern countries across the different regions.
00:11:20
Speaker
How they showcased this was in the form of rainfall intensity, from showers all the way to thunderstorms, during Obama's time.
00:11:31
Speaker
And in my head, it was like, this is the simplest way to ever explain a war. A child could understand this, right? And it's relatable, exactly; it's something where we can all understand the difference between a light shower and a thunderstorm.
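For the technically curious, here is a minimal sketch of the count-to-category idea behind that kind of visualization, in Python. The thresholds and labels are hypothetical illustrations, not those of the original graphic.

```python
# Sketch: translate raw event counts into a familiar "rainfall" scale.
# Thresholds and labels are hypothetical illustrations.

def rainfall_category(strikes_per_month: int) -> str:
    """Map a monthly strike count to a rainfall-intensity label."""
    if strikes_per_month == 0:
        return "clear skies"
    if strikes_per_month < 10:
        return "light showers"
    if strikes_per_month < 50:
        return "heavy rain"
    return "thunderstorm"

# Hypothetical monthly counts become an at-a-glance narrative.
for month, count in [("Jan", 3), ("Feb", 27), ("Mar", 112)]:
    print(f"{month}: {rainfall_category(count)} ({count} strikes)")
```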
00:11:50
Speaker
So I think for me, that was the beginning of the exploration: beyond the investigative world, beyond the data which we now have our hands on, how do I communicate data in a way that makes sense to a person who perhaps is just, you know,
00:12:07
Speaker
just moving about that day, not a technical person, not someone who even cares in the slightest about what we're talking about? So I think that was the beginning of my conviction in this space. But also, I have been doing GBV work since 2017, 2016.
00:12:28
Speaker
So for me, GBV is also a space that's personal, in the sense that there is a lot of systemic failure, for example in Kenya, and we've had to create systems and collectives that support survivors, or even families of victims who succumb to GBV.
00:12:51
Speaker
So I think this has also shaped the ways in which I see the world, right? And it has shaped the ways in which I view and interact, for example, with tech, right?
00:13:03
Speaker
So the activism for me is also in the sense that you cannot afford to be silent; silence is complicity.
00:13:13
Speaker
So for me now, having those two things, I'm finding grounding in a space where, one, the data is complex and the language is technical.
00:13:24
Speaker
How do I explain to you in the simplest terms what this data means? And then, taking that, how do I also make sure that this data causes impact? So I think that for me is
00:13:35
Speaker
the intersection of my work: how do I explain this simply, and how do I make this count? Yeah, and you've done something that many institutions haven't, which is to build a gender-based violence dataset, a GBV dataset, from the ground up, right? What sparked that decision, and how did it change how you see data, especially data on women's experiences? And we should also come back to what you just said about how you then translate that GBV data into something that everyday people understand.

Creating and Analyzing a Gender-Based Violence Dataset

00:14:08
Speaker
So let me first start with how we began the building. And this began in a conversation about Valentine's.
00:14:21
Speaker
Okay, I love Valentine's. I love a holiday. Me too, me too. And a chance to eat chocolates and celebrate is my thing. Yeah, we declare today a holiday for you, Yemisi.
00:14:38
Speaker
Exactly, exactly.
00:14:42
Speaker
So Valentine's, I think for me, was the concept of love. And one of the conversations going around at that time: there was a femicide case that had just happened, which was touching pretty much home, I'll say.
00:15:00
Speaker
It was someone we went to school with; it was a very high-profile femicide case. And I remember sitting through and seeing people's thoughts around the femicide: how the women are being killed because we like money, we like sponsors, we don't work for things, we eat people's money.
00:15:24
Speaker
It was a whole thing, you know. And I think the other framing around that was of the men abusing women as not being human, not people.
00:15:37
Speaker
And I think that's a very harmful trope too, because it removes a lot, you know. So for me, that was the shape of the container within which I was working. And it was first going back to the narratives: what narratives are people pushing around the femicide?
00:15:59
Speaker
Okay, that women are being killed because of one, two, three, and they seemingly are being killed by strangers. And we did the first collection of a few datasets. And it's interesting to first note that of the few cases we had collected, let's say the first 20 cases, all of them
00:16:18
Speaker
were done by either ex-husbands, ex-boyfriends, boyfriends, partners. So someone very close to home, right? Intimate partners. Thank you. So I think for me, it was a very interesting first: you pause, and the data is saying a very big story of patterns. And this is a thing I've been saying: we tend to report on GBV and TFGBV as isolated cases,
00:16:47
Speaker
and we miss the very big patterns happening. So for instance, when we put these 20 cases together, you realize there's a pattern of intimate partner abuse, and it's already dispelling a very big misconception we have about where GBV is happening. So we had to introduce, of course, tools.
00:17:12
Speaker
There were AI tools, models involved, to pick cases out from the news stories. So we literally got datasets of news stories in the thousands and just ran through names.
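As a rough illustration of that step, here is a minimal sketch assuming a spaCy-style named-entity model. The keyword list, field names, and example article are hypothetical, and in a workflow like the one described, every flagged case would still go to a human reviewer.

```python
# Sketch: screen news stories for candidate femicide cases by pulling
# out person names and intimate-partner language. Keyword list and
# fields are hypothetical; flagged cases go to a human reviewer and
# are never auto-accepted into the dataset.

import spacy

nlp = spacy.load("en_core_web_sm")  # small English NER model

# Hypothetical relationship terms that signal intimate-partner cases.
PARTNER_TERMS = {"husband", "ex-husband", "boyfriend", "ex-boyfriend", "partner"}

def screen_article(text: str) -> dict:
    """Return the people named in one story and an intimate-partner flag."""
    doc = nlp(text)
    people = [ent.text for ent in doc.ents if ent.label_ == "PERSON"]
    lowered = text.lower()
    return {
        "people": people,
        "intimate_partner_flag": any(t in lowered for t in PARTNER_TERMS),
    }

article = "Jane Doe was killed on Sunday. Police arrested her ex-boyfriend."
print(screen_article(article))
# {'people': ['Jane Doe'], 'intimate_partner_flag': True}
```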
00:17:29
Speaker
And then we did the legal part, where you have to check the criteria by which you're defining, for example, femicide. And we had to go with the UN one, and of course there's critique, because the UN one doesn't capture the entire
00:17:45
Speaker
story of what I think femicide should be. So I think that was just the beginning of our work. And I was talking to Irene and Lobnan, telling them I wish I had my hands on the Kigali Declaration before the onset of this work.
00:18:03
Speaker
It would have shifted a big chunk of the approach. It would have grounded a bit of the things; sometimes we threw things out because they felt like gut instincts and were not grounded in anything.
00:18:23
Speaker
So it's very hard for you to make a case for something that's just, you know, "I feel like". So I think for me that has been a big chunk of the why. But also, now coming to the TFGBV one, I am currently building a platform dossier for interrogating the platform failures.

Platform Design Flaws and Safety Measures

00:18:45
Speaker
And this is coming from a place where I'm realizing that from the onset, the platforms have been designed for violence, right? So for example, you look at Grok, and there is no reason on earth why Grok should have come to us without having safeguards and guardrails around people being able to undress women.
00:19:11
Speaker
In the most basic sense. Right, and that goes to show us that the people who are building these systems are not thinking about women, or even, I'll say, women and other vulnerable groups like the LGBTQ+ community,
00:19:31
Speaker
because it's also violence that's very specific to certain vulnerable groups, right? And I think it's the same thing with looking at guardrails, for example, around things like TikTok, and realizing that they allow things like child sexual exploitation to happen in the guise of lifestyle content.
00:19:51
Speaker
And that's something that's really bugging me: our methodology for reporting around tech in Africa is missing a big chunk of interrogating the platforms, the algorithms, the code.
00:20:08
Speaker
So I think that's exactly where my mind is at right now, because I am very concerned that we're just building, and then when it comes to safety it's very reactive as opposed to proactive measures. And that's not something we can actually allow to continue persisting as we go.
00:20:30
Speaker
Yeah. And you often say that data is power. And in your work, it feels like data is also about care, right? Being proactive, thinking forward, and not waiting for the feedback to react, right?
00:20:45
Speaker
So in this world where generative AI is producing all those fake images, mimicking voices and causing harm, what are you seeing in terms of how these forms of technology are being used against women? But then also, in connection to what you just said about the platforms building with that consideration, how should we be engaging with these platforms to make sure that they are proactively thinking about these things?

AI's Role in Accelerating Harm

00:21:13
Speaker
So I'll start first with the role, how the platforms are mostly exacerbating TFGBV. I think for me, there are three things in terms of the how. First, the scale and speed of harm is accelerated, right? So let's talk about things like deepfake production. And I think the Nigerian case has really stayed with me this week. So for the entire of this week, there has been
00:21:43
Speaker
a lady's picture which was manipulated on the Nigerian X circle. And they basically depicted her as being in a relationship with one person while another person is claiming her. Basically, they labeled her many names.
00:22:03
Speaker
And it's been very crazy to see how it has circulated. It was a network of blue-check accounts. And when I say blue check, the kind of reach that they have is phenomenal.
00:22:18
Speaker
So she kept getting threats. She got texts; she was abused in texts. People did entire long threads to, you know, talk about women and their bad habits, etc.
00:22:30
Speaker
So I think for me, it's been very crazy to watch the scale and speed at which AI is helping harm be perpetuated.
00:22:42
Speaker
The second thing for me has been the biased moderation. I've been watching, and so far it's only two of the tweets from the blue-check accounts which have been taken down.
00:22:54
Speaker
And women are really running under the comments to report, right? And even for us, you report, but it takes quite a lot of time before they say anything has been done.
00:23:06
Speaker
I've gotten one back where they're like, it does not go against community guidelines. So I think there's also that: they don't understand the nuance of, for example, Pidgin.
00:23:20
Speaker
So there is that, where they miss it and don't flag it as a harm when it should be flagged as a really, really big harm. And then lastly, I think the other thing we've been seeing in terms of exacerbation is the surveillance.
00:23:35
Speaker
So it's been...
00:23:39
Speaker
It's growing more and more in terms of the stalker aspects, where, for example, now on X, even when you block someone, they can still keep up with your everyday, right?
00:23:52
Speaker
And that's a big gap in terms of being able to get at any person. So for instance, if you are a person experiencing what this particular lady is, even when you block the people behind these accounts, they can still see, they can screenshot and make fun of your work and your account still.
00:24:13
Speaker
Now, in terms of the emerging harms, and I'm glad you've actually brought those up: the AI-generated, non-consensual deepfakes. And a lot of this is teetering towards pornographic content.
00:24:28
Speaker
So there is a lady who did a test yesterday where she asked Grok. Okay, so she was using two non-paid options, two free versions of generative AI, and one which was paid for.
00:24:45
Speaker
And I'll use an example also with Canva, since I'm very conversant with Canva's AI, where there are certain things you ask Canva and Canva will totally refuse to do.
00:24:57
Speaker
But take that to Grok, which is free and very easily accessible, and it's happy to do it, right? So it's been a very interesting thing to see that kind of harm emerging, especially with the free-to-use tools.
00:25:17
Speaker
The second one, which is now on the rise and I think will get crazier as we move, has been romance fraud. And I'll take us back to the Tinder Swindler story that came out quite a while ago. But the Tinder Swindler at that time, remember, gen AI was not where we are right now.
00:25:39
Speaker
We're in a space where we can literally be on this call, and the Mwende you'll see on your end is not who's here.
00:25:50
Speaker
Scary. Yeah, it's scary, extremely scary. And I've been very scared, especially about the dating platforms. That's another space, in terms of the platform dossiers I'm talking about, that I want to interrogate a bit.
00:26:07
Speaker
What safeguards are you actually putting around the usage of AI and AI tools, and the fraud that would occur within this space? Then maybe the other thing: looking at the algorithms and what they reward.
00:26:23
Speaker
And this has been very, very clear with the manosphere content I'm talking about now in Nigeria, where we are looking at content that's trolling the girl going viral,
00:26:36
Speaker
and content that's trying to defend the girl still not making even half the views of the other one; the ones that are trying to debunk are not moving as quickly.
00:26:48
Speaker
So the algorithm rewards some topics, right? And the blue-check accounts know that, and that's exactly what they exploit, because they know that so long as something can more or less evoke some kind of emotion, the algorithm is very happy to spread it out.
00:27:11
Speaker
Then lastly, the one which has been sitting heavy with me, and this is basically from TikTok; it's not just TikTok, but it's modelled a lot on TikTok: the gamified harassment.
00:27:26
Speaker
So people are using, for example, TikTok challenges and maybe the sounds, you know. And I'll give an example from Kenya. During this femicide conversation, a song that has femicide connotations was being used on some posts, and you're left feeling very injured about it.
00:27:52
Speaker
But it's so hard to report to TikTok and give context for why this is actually extremely harmful. I think for me, that's something I'm really, really concerned about, because it's something that hides in plain sight, but it's also so, so deeply wounding, right?
00:28:11
Speaker
In terms of engaging the platforms, I'm realizing that one of the big places where we have failed as media practitioners is that we are not giving feedback in public media.
00:28:25
Speaker
You know, we're not giving public feedback in that sense. We're just disgruntled, but no one is writing about the harms. No one is keeping tabs: for example, if X said that they will always uphold certain standards, why are we not holding them to the things that they've said?
00:28:45
Speaker
So I think for us, a big chunk of it is that we need to revise the methodology with which we are reporting on harm. Let it stop being just this one incident of, let's say, deepfake porn has happened. It's not one incident.
00:29:03
Speaker
It's about us coming together to interrogate what structures about X allowed this to, one, be created; two, be amplified; and three, take this much time to be taken down.
00:29:15
Speaker
So we need to have methodologies. Yes.
00:29:23
Speaker
Hello, Her Media Diary listeners. Permit me to introduce to you the Kigali Declaration on the Elimination of Gender Violence in and through Media in Africa. It's a groundbreaking commitment to address the forms of gender violence experienced by women in media.
00:29:39
Speaker
and how media reports on gender-based violence. So whether you sign up as an individual or as an organization, it is a sign that you are pledging to consciously play your role in tackling this important issue and working towards creating safer and more inclusive work environments for everyone.
00:30:01
Speaker
Imagine a media landscape that treats every story with balance and every voice with dignity. By adopting the Kigali Declaration, it's not just a commitment, it's a powerful step towards social change.
00:30:14
Speaker
And that change starts with us. So if you are ready to take that bold step and make that social change, visit our website at africanwomeninmedia.com slash declaration to read the declaration, to commit to it, and to begin to take action.
00:30:35
Speaker
Yeah, I was going to say, I mean, imagine. We started our AWiM23 conference in Kigali in 2023 with the sentence "imagine a world where", right? Imagine a world where...
00:30:49
Speaker
platforms actually have zero tolerance for gender-based violence on their platforms. Yeah, that's literally the mantra, the motto they need to take: zero tolerance for GBV. Because you've given the example of Canva, right, where you ask a certain set of things and it just refuses to do them. So it is possible.
00:31:16
Speaker
It is absolutely possible. It just really requires the will to say, okay, there's a boundary here that we just will not cross or compromise on, right? And one thing I also wanted to highlight is that what we're seeing here is not new forms of violence; actually, a lot of this is just AI enabling existing forms of abuse and patterns through these new tools. How do you feel about that statement?
00:31:48
Speaker
I actually agree.

AI and Online Violence Reflecting Real-life Issues

00:31:50
Speaker
And I actually think one of the things I keep saying is that online gender-based violence and technology-facilitated violence are mirrors of what happens on a day-to-day in real life.
00:32:07
Speaker
When you think stalking: I can come to your house, or I can come to your account, right? I can beat you up physically, or I can send trolls to beat you up on your account,
00:32:21
Speaker
or I can come and troll you. So I like to think of it as: the violence you see on a day-to-day will be translated online. It just assumes a different medium, but it shall be translated. Now, I like the thing you've introduced of AI more or less accelerating. And it's true, because if stalking has always existed, now, for example, I can create as many
00:32:48
Speaker
bot accounts as I want, right? And automate as many messages as I want to you, or automate whatever things I want towards you.
00:32:59
Speaker
So I feel you're absolutely right on AI accelerating, and not on it starting absolutely new trends.
00:33:10
Speaker
Because even when you're saying Grok is undressing people, I'm like, this is not new violence. If you go online and search "woman undressed", I'm sure it's going to bring up quite a number of things, right? So you're very right on that, Doctor.
00:33:35
Speaker
And you've spoken about how there's no real line between social media and real life, which is basically what we're saying now, right? And that really resonates, especially when we look at technology-facilitated gender-based violence and how it spills from the screen into people's lives.
00:33:50
Speaker
How do you approach building GBV datasets in this context of connection with real life?

Centering Survivors in Data Practices

00:33:59
Speaker
So, one, it's survivor at the center, right? The survivor not just as a data point, but as an actual person who has experienced actual harm.
00:34:13
Speaker
So I think, one, it's even in the most basic of the language that we use in this collecting. I tend to collect a lot of datasets directly from the people, or if you're going to use, for example,
00:34:30
Speaker
the articles that we need, we have to reach out, for example, for consent to have some of these things up, etc. So in terms of approaching
00:34:45
Speaker
dataset building, for me, I think it's just survivor at the center. And the other thing has been the time. So I've not been taking projects that require me to work in two days.
00:34:58
Speaker
Because how can you? We're not machines. We're not AI. Before we train the model on the kind of language, you know, before you train the model to not pick certain words, that these words are wrong,
00:35:19
Speaker
it takes time. It actually takes time before you come up with frameworks, before you decide this data we shall use, this we shall not, for these various reasons. I'm huge, huge on documenting your process: how did you arrive at this decision?
00:35:36
Speaker
What other decisions did you have to make along the way, you know? So I think for me, when I say the data needs time, I like projects which give me time to actually build the dataset with the care it requires.
00:35:54
Speaker
Because when we rush it, it also means that I will not have time to think about the guardrails I need to put in; I just need to give results, right? So it's been us actively, specifically picking longer-term projects which we are able to build on.
00:36:12
Speaker
And then the second thing has been the survivor being our compass. So: is this supportive to the survivor? Is this information we need to know, you know?
00:36:23
Speaker
Because sometimes it's just that it's a juicy detail, but it's not the most important thing that I need to be telling the audience. So it's also been about being able to discern what will be supportive and what will not. There are also, for example, the parts of this work I don't let AI touch, right? For example, the deciding, the annotations, the aspects of this which you have to do yourself so that you're able to give better direction to the models for how they pick things up and what they drop.
00:36:58
Speaker
And we've actually had to shelve some of the models we were using, one model we were using for collecting the TFGBV data.
00:37:09
Speaker
Because you soon realize, we had trained it using Kenyan data, but you start realizing that it's giving you a very victim-blaming kind of mindset, and that's because the data you're using also looks a certain way, right?
00:37:31
Speaker
So that model had to go, because retraining it would have taken much more time than just building and setting up afresh. So I think for me, that's been the bigger compass: the survivor must be at the center, but also being very, very clear about the places where I'm willing for AI to work. The scraping can be tedious;
00:37:52
Speaker
we can automate that for sure. But the legal things are the things where you have to ensure that a human being is there. Because, again, I like to say AI can understand patterns;
00:38:07
Speaker
AI does not understand emotion. AI does not understand pain. Yeah, and you can't teach AI to understand those things, but you as a human being can, right? So I engage AI to help me pick out patterns.
00:38:23
Speaker
Help me see what all these cases are saying together, right? Tell me the big numbers. Give me a dashboard. But let me sit down and interpret, and say that in the Kenyan system, culturally, this is happening and this is the context. This is the container for the kind of violence we're seeing, the kind of dataset we are seeing now.
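Here is a minimal sketch of that division of labour, assuming a pandas-style workflow; the column names and records are hypothetical, not the actual dataset schema.

```python
# Sketch: the machine surfaces the big numbers; the interpretation
# stays human. Column names and records here are hypothetical.

import pandas as pd

# The kind of case records an automated scraping step might produce.
cases = pd.DataFrame({
    "year": [2023, 2023, 2024, 2024, 2024],
    "perpetrator_relation": ["ex-boyfriend", "husband", "stranger",
                             "boyfriend", "ex-husband"],
})

# Machine's job: aggregate for the dashboard.
print(cases["perpetrator_relation"].value_counts())
intimate = cases["perpetrator_relation"].isin(
    ["husband", "ex-husband", "boyfriend", "ex-boyfriend"]).mean()
print(f"Intimate-partner share: {intimate:.0%}")

# Human's job: the context note is written by a person who knows the
# cultural container, never auto-filled by a model.
interpretation = ("Most cases involve intimate partners, contradicting "
                  "the 'killed by strangers' narrative.")
```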
00:38:45
Speaker
So I think that for me has been the bigger compass. And I really want to spotlight something that was implicit in your response, because you are using AI. Yes. Right?
00:38:56
Speaker
We're not saying don't use AI. We're not saying we are enemies of AI. You're literally using AI to do this very important work, right? But we're saying that actually there are things that also need to be considered.
00:39:10
Speaker
There are the biases that AI brings into the work, right? And there are the ways in which others might use AI negatively. Because in this conversation, when we're raising the alarm, right, and raising awareness about the gendered experiences of these new tools, it can be misconstrued as if we're saying AI is the bad actor here.
00:39:31
Speaker
Well, we're not saying that, are we? No, no, no. What I'm actually very keen on saying is that AI is not a neutral participant. It's not "don't use AI". It's not "AI is good". It's not "AI is bad". It's just: remember it's not a neutral participant, ever.
00:39:51
Speaker
So it's going to come in with your bias, with the biases of the people who built it, the datasets' bias. It has bias everywhere. And one thing I've come to say is,
00:40:02
Speaker
as human beings, we cannot exist without biases. Even here, we have our own biases about things. So I do think it's imperative that you don't interact with AI as an absolute knowledge system.
00:40:19
Speaker
It's not a knowledge system. It doesn't have the capacity to be the only player that you, for example, rely on for research.
00:40:33
Speaker
Yeah, it has to be something where you remember this has been trained on the backs of other people. So it's upon you to interact with it the way you would in any other conversation.
00:40:45
Speaker
I like to interact with GPT, for example, as if I'm in a conversation with a friend whose mind I like picking, right? So you won't tell me, for example, it's sunny, and I'm like, okay, cool, thanks. No, no, no, no, no.
00:40:58
Speaker
How did you arrive at "it's sunny"? Yeah, exactly. Small, small things. So interact with AI from that point of view: you're not an absolute knowledge system, and I should not trust you with everything.
00:41:22
Speaker
I also think that's why I lean a lot towards NotebookLM, because it has its hallucinations, but it also depends a lot on the information you feed it, right? So whenever I feed it garbage in, it gives me garbage out.
00:41:40
Speaker
But when I give it good data, I can see direct results. So I'm also starting to imagine spaces where, because with Notebook, I'm now able to very carefully curate what, quote unquote, the model is telling me, the way it speaks to me,
00:42:04
Speaker
and even the way it reasons, the way it arrives at things, right? So I think that's also the other thing, because in terms of using AI, the other thing we're lacking is patience and time,
00:42:18
Speaker
which we must be taught by our little hobbies. Yes, yes, yes. I think with GPT and DeepSeek, and I'll say this with a big pinch of salt, because when we say bias, I think it's important to talk about who owns the platforms.
00:42:38
Speaker
So with, for example, the US investing in ChatGPT, the OpenAI one, and DeepSeek in China, there are going to be actual things you have to contend with in terms of the kind of information you're able to extract, get, or receive from that model.
00:42:59
Speaker
However, I think it's important that the models also allow us room to exist. So for example, with GPT, you can train your own GPT in a way that it will still hallucinate, but at least with the thought processing, it's able to think a bit better. So give the models time; don't just start.
00:43:23
Speaker
I think one of the things I keep saying is: don't start using GPT today and then give me answers using that GPT. It's a different person with everyone, right? For my GPT, I've trained it for the last two, three years.
00:43:38
Speaker
It's one of the things I keep saying: that's on my list for inheritance, who the GPT goes to in the will, because it's a dossier that you keep improving over time. You keep feeding it content, feeding it material that you want it to think about, asking it questions, annotating for it, giving it the thumbs down when it gives you a bad answer, giving responses in real time, you know?
00:44:07
Speaker
Yeah, I think that's a big one. Absolutely, absolutely. Because, I mean, just to put it out there also: anybody that is copying and pasting an email suggested by GPT and sending it to us, we know that it's GPT. Like, yeah, thumbs up, you're right. We're not saying don't use AI. We're saying use it
00:44:33
Speaker
whilst thinking, you know, whilst enabling it to think also. And like you said, AI is not neutral; don't just copy and paste what it gives you. The email thing, because I feel like we are getting lost: there are things that AI should be doing, and other things that really we should keep thinking about ourselves. And I feel like this is a conversation where we keep saying, no, kids can't write
00:45:00
Speaker
using pens, right? And it's an art that you lose over time because you're only typing. And it's the same thing: the more and more you keep telling it, just write me an email response, the more you're switching off your thinking and your creativity.
00:45:19
Speaker
So I think it's also very imperative that we decide: this task I can outsource, but this particular task I don't need to outsource. It's okay for my email to come out sounding raw and, you know, not as polished, just sounding as human as it possibly can be. It's okay for that to happen.
00:45:42
Speaker
But I can utilize AI for something else, maybe the patterns, or helping me do my to-do list for the day because I have a hundred tasks. So yeah. Absolutely. And also, just on the important issue you raised there around who owns AI, ChatGPT, who owns these platforms, right? It's usually a white male in Silicon Valley, right, who's never even stepped onto the continent, who still thinks Africa is a country, you know, just to put it simplistically. But we do have the African Union's Continental Artificial Intelligence Strategy, and I would really encourage everybody to go and have a look at that, because it's really talking about what AI looks like for our context, right?
00:46:31
Speaker
Because we also have the power to start creating our own. And these strategies, these documents, are examples of how we can begin to think about that for our context. So the foundation already exists.
00:46:44
Speaker
It's just for us now to proactively go after them and say, okay, based on this continental strategy, based on all of these experiences that we're sharing, what are we going to do? What am I literally going to do as
00:46:57
Speaker
an agent of force in this space? A hundred, a hundred, a hundred percent. Yeah, because the more we stay passive, it also means they continue to make them and we continue to be used to build them. Just in the sense of, for example, GPT: they built it in Silicon Valley, but Africans are the ones training it, right?
00:47:18
Speaker
And getting paid very poorly while at it. Yeah, so you're very right, Dr. Yemisi, we do need to get our feet in, and I think people are. And maybe the other struggle here is that, one, even in spaces where we have access to funds, we need to channel more funds towards people building solutions, Africans specifically building solutions for the African space.
00:47:48
Speaker
And then the second thing for me would be that the people who've built these need visibility. So for example, at the conferences that we have, we need to have spaces where we are showcasing a bit more of what AI things we are actually doing, right? Let's give people platforms to talk about their work,
00:48:05
Speaker
and amplify; let's amplify a bit more, because sometimes we do not actually know that the kind of work you're doing exists. Absolutely.
00:48:17
Speaker
And just to take us back to TFGBV: you spoke earlier about media not doing enough to talk about this, to raise awareness, to call out the platforms.
00:48:31
Speaker
Looking forward, what would you like to see media organisations, journalists, and content producers do better when telling stories of GBV, and when experiencing technology-facilitated gender-based violence?
00:48:45
Speaker
How should we expect media to be more proactive in that space?

Media's Role in Reporting TFGBV with Methodologies

00:48:51
Speaker
So I'm going to take this back to methodology. Our strategy must be through the methodologies we're using to collect, to hold, and to honor the data on harm that has been caused, and also the stories of the survivors, right?
00:49:08
Speaker
And I think I want to be very categorical and say that methodology is also very political. It's not neutral. In the same way AI is not neutral, your methodology cannot be neutral.
00:49:23
Speaker
And I think for me, I'll talk about four methodologies which I am currently exploring and hoping to see a bit more of.
00:49:34
Speaker
So I think the first one for me has been the participatory. Let survivors be more or less at the forefront of this. So, for instance, let's say you're collecting data on AI porn, right? Let people, for example, have a space where they're able to contribute. And I think I really liked this from the training that we had: there's a group, Engineer Ahmed from Egypt has a form where women can fill in any kind of harassment that they've been facing, give context, and it's fully pseudonymous.
00:50:14
Speaker
Right. But now what this enables you to do is curate or co-build something with a survivor. They're not just data points, right? The database is informed by very lived experiences.
00:50:31
Speaker
And I think this is important because a lot of the way we report on TFGBV is shaped by the Western lens, and that's because we don't have datasets to go back to and say, here,
00:50:43
Speaker
these are the patterns we're seeing across our space, in Nigeria, in Kenya, in Malawi, right? So I think, for me, one is that participatory approach. The second thing I'm really, really looking at is us being able to audit. I want to see more audits of AI development and practices, more things which are annotated, for example, in local languages. Let's see people talk about these tools and give nuance, for example the bias audits we're doing for Google Notebooks. So I'm running a separate thing called The Conflict Ledger, and part of what we do is, after the podcast is created on Notebook, we do a bias audit.
00:51:33
Speaker
And part of the bias audit is being able to say where or how the model missed, or wasn't aligned to cultural realities.
00:51:45
Speaker
I'll give an example: in one of the episodes, you notice where the model has assigned the man as the interviewer, yes, and the lady as the expert, but her voice is devoid of emotion.
00:52:01
Speaker
No emotion. And in the African context, women are a big part of the culture as the people who carry emotions and the people who, you know, hold them. Expressive, also very expressive. Even as experts, it does not take that away; it's part of what makes you human.
00:52:26
Speaker
So I think that was one of the things we noticed: whenever they assign, or if we ask the model to assign, a woman as the expert, they make her completely devoid of emotion.
00:52:39
Speaker
So I think for us it's about being able to also audit, from the point of view of: not only am I going to audit your code, I'm also going to audit you for language, for culture, for, you know, even power. Like in the interview, how is it that when the woman is speaking, and she's an expert or she's an interviewer, she sounds like she doesn't know anything, like she's in class being taught with us,
00:53:02
Speaker
right? But I can see that when it's a man interviewing, he has the capacity to hold space for certain conversations. So those small things might look very minute, but you see, the more of us producing podcasts and not calling it out, the more the bad behavior in the model is being reinforced. So I do want to see more auditing practices.
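Here is a minimal sketch of what a structured audit log in that spirit could look like; the fields and the example entry are hypothetical, not The Conflict Ledger's actual format.

```python
# Sketch: a structured bias-audit log for AI-generated audio.
# Fields and the example entry are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class BiasAuditEntry:
    episode: str
    dimension: str         # e.g. "gender roles", "language", "power"
    observation: str       # what the model actually did
    cultural_context: str  # why it matters locally
    action: str            # feedback given, prompt changed, flagged

audit_log: list[BiasAuditEntry] = [
    BiasAuditEntry(
        episode="EP-12",
        dimension="gender roles",
        observation="Woman assigned as expert but voiced flat, devoid of emotion.",
        cultural_context="Expressiveness is part of expertise in many African contexts.",
        action="Logged, re-prompted with explicit voice direction, flagged for review.",
    ),
]

for entry in audit_log:
    print(f"[{entry.episode}] {entry.dimension}: {entry.observation}")
```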
00:53:29
Speaker
And then lastly, I want to see more, I'll say, trauma-informed design principles added to tools, or even to stories.
00:53:44
Speaker
And I'll say, for example, in the most basic way, it's the language. And I think Kigali has really captured some of the ways, because one of the questions I really liked from the class was,
00:53:55
Speaker
how will my story be interesting if I can't put in the sensationalised aspect? Right? And I was like, that's a very interesting question. But you see, it's about us learning to play with language in a way that we're able to communicate while being more trauma-informed. Trauma-informed doesn't mean boring.
00:54:21
Speaker
It does not mean flat.
00:54:25
Speaker
So I think for me that's been a big, big one, because interacting with TFGBV data, as I was, for example, putting together the material for the sessions, is very traumatizing for a lot of people, even for us.
00:54:41
Speaker
And again, it's because of the language: the way people describe the things, the way people write, the way people talk about it. So I think that's something I'm really, really looking forward to. And I'd also like for us to explore different formats.
00:54:57
Speaker
Because I feel like writing is not doing us a good service in terms of capturing the nuance of TFGBV. And I'll put it in a very plain way: if, for example, Dr. Yemisi were to text and email about this conversation, it would not have the same depth we've had right now.
00:55:21
Speaker
So I think I'd like for journalists to also explore different mediums. Let's explore mediums that allow for nuance, that allow for space to be held,
00:55:32
Speaker
And that also allows for time. And of course, amongst the survivors I was talking about are the women in media themselves, the journalists, the country producers, who themselves are quietly so surviving, either because they themselves have personal experience of GBV, or as journalists, they're obviously facing, and media producers,
00:55:52
Speaker
facing various forms of gender-related violence in their line of work

Advice for New Storytellers and Tech Enthusiasts

00:55:57
Speaker
as well. So what would you say to someone, to our listeners, who wants to work in storytelling or tech or feels unsafe or unwelcome?
00:56:05
Speaker
So first, I will say: I believe you. Because I know it's very, very hard to come out and first say it. Even when you've not said it and you know it's happening, sometimes you don't even clock it until you're out of the context.
00:56:20
Speaker
So I think for me, one, it's to say that we are in an unsafe world, whether we acknowledge it or we don't.
00:56:31
Speaker
And it's going to be a thing of grounding in community. Make sure you're not doing this work alone. I like to say this work is expensive in terms of care. You're going to have to put a lot, a lot into your everyday care,
00:56:48
Speaker
into the everyday small things that fill your pockets. So if it's working out, the baking, the little things you need to do on a day-to-day, make sure you do them.
00:57:01
Speaker
And maybe lastly, the other thing I'd say is: do it afraid. A lot of the work, a lot of the time I'm moving through this space, I'm shaking in my boots. And I'll also say, sometimes you're doing work which will shake you to the core in terms of believing in humanity, right? And I think it's also important that we come to this space open to the experience of losing faith, losing a lot of faith, in systems and in humanity.
00:57:37
Speaker
But also be very grounded about your why. It's going to be tough, and you'll need a very clear "this is my why" at any particular time. And one of the things I keep telling people, for instance, is: know yourself. And again, I talked about limitations, right? The places where you're willing to go and where you're not willing to go.
00:58:03
Speaker
And I think for me, for instance, one of the things I know, and I've always been telling people, is that I know I might make the error of one day amplifying a falsified claim, right?
00:58:17
Speaker
But I'd rather err on that side and apologize, or deal with the consequences of that if it ever happens, than err on the side of not believing a survivor or a victim.
00:58:31
Speaker
So I think for me, it's also: know the concessions you're willing to make, and communicate them to the people you're working with. The people I'm working with at the AI Salon know.
00:58:42
Speaker
And if ever we have to make decisions, maybe that's also the other thing: we have come up with very specific checklists. So we have a bias checklist, we have a defamation checklist, we have different checklists for different things. Because I've also come to realize that in the heat of the moment, when you need to make decisions about what you need to do, especially as a collective,
00:59:10
Speaker
you don't want to afford the mistake of making decisions at the height of emotion. So ground it in a thing that exists before you start the work, right? Let's say that we risk defamation: these are the checklist items you're going to go through for any story.
00:59:28
Speaker
And no matter how anyone in the group is feeling, so long as it's passed this checklist, it's going, right? Or if it fails, it's not going anywhere, no matter how strongly you feel about it. So I think that maybe is the other thing: be grounded in frameworks.
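A minimal sketch of such a pre-agreed publication gate follows; the checklist items are hypothetical examples, not the AI Salon's actual lists.

```python
# Sketch: a pre-agreed checklist gates publication, so the decision is
# made before the heat of the moment. Items here are hypothetical.

DEFAMATION_CHECKLIST = [
    "Claims corroborated by at least two independent sources",
    "Right of reply offered and documented",
    "Survivor consent for identifying details confirmed",
]

def passes_gate(story_checks: dict[str, bool]) -> bool:
    """Publish only if every agreed item is satisfied, regardless of
    how strongly anyone feels about the story in the moment."""
    return all(story_checks.get(item, False) for item in DEFAMATION_CHECKLIST)

# Example: one unmet item holds the story back.
checks = {
    "Claims corroborated by at least two independent sources": True,
    "Right of reply offered and documented": True,
    "Survivor consent for identifying details confirmed": False,
}
print("Publish" if passes_gate(checks) else "Hold: checklist not met")
```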
00:59:45
Speaker
And the Kigali Declaration is also here as a brilliant, brilliant thing. I think we can all collectively continue co-building it, making it stronger, and localizing it, right?
00:59:57
Speaker
Make it better for your local context, in your local languages. I really like that there's a Somali translation. We need more. And annotate for context, annotate for context,
01:00:08
Speaker
for things, and let Kigali guide you in terms of the checklists that you might need, the guardrails that you might need for this work. And maybe lastly: remember you're not working on this alone, and try as much as possible to never pick up a thing and run alone. Find community.
01:00:30
Speaker
I promise there are two or three more people who are willing to be in the inner kitchen with you as you grow that thing. So let people hold your hand; delegate what you can. For example, also understand our context, right?
01:00:46
Speaker
So I think one of the things I've been learning from watching one of my very close friends work on a story has been, for example: we know this particular person will not listen to women or won't give you the answers you need.
01:00:59
Speaker
engage that journalist who's a male ally, right? Let them do that drilling for you. You don't have to suffer through everything. This is not a punishment.
01:01:12
Speaker
So I think those would be the main things I'd tell the people who want to venture into this space. And also: tech is fun and it's hard. And people will make you feel like you don't know what you're doing, but I promise you do.
01:01:26
Speaker
You actually do know a lot more than you give yourself credit for. So yeah. Absolutely,

Closing Thoughts on Data-Driven Storytelling

01:01:32
Speaker
absolutely. And just to say that if anybody wants to learn more about the Kigali Declaration, just visit AfricanWomenInMedia.com slash declaration. It's the Kigali Declaration on the Elimination of Gender Violence in and through Media in Africa by 2034, which was co-produced by the AWIM community in 2023.
01:01:51
Speaker
Thank you so much, Mwende, for your time today. It's been such a pleasure; I've really, really enjoyed my time with you. I think we have a lot of conversations still to have around tea, but also around datasets and AI and all of that. So definitely our conversation will continue beyond this.
01:02:08
Speaker
So thank you so much for joining me today. And thank you for inviting me. So Mwende reminds us that data isn't neutral. AI isn't neutral. It's shaped by who gets countered, who's producing these AI platforms, and who's getting erased.
01:02:24
Speaker
And as generative AI becomes more powerful, we need storytellers like Mwende, brave enough to collect what's missing and wise enough to tell it with care.
01:02:37
Speaker
So thank you for listening to Her Media Diary. If you found today's conversation inspiring, don't forget to subscribe, leave a review, and share this episode with somebody who needs to hear it. If you'd like to join me on an episode of the podcast, send me an email at yemisi@africanwomeninmedia.com or visit our website at hermediadiary.com. That's the podcast website; and of course, our main website is africanwomeninmedia.com.
01:03:02
Speaker
Subscribe, follow Her Media Diary on your favorite podcasting platforms, and tune in to our partner radio stations, which you can find across various African countries. And don't forget to join the conversation using the hashtag #hermediadiary.
01:03:16
Speaker
Until next time, stay curious, stay safe and keep amplifying the stories that matter.