
Her Media Diary Episode 45: “Regulating AI: Who's Responsible for Gendered Harm?” with Dr Grace Githaiga

E45 · Her Media Diary

Dr. Grace Githaiga is the CEO of KICTANet (the Kenya ICT Action Network) and a visionary leader with over 20 years of expertise in ICT policy advocacy. She spearheads progressive advocacy for inclusive ICT policies on data protection, cybersecurity, gender, accessibility, and civic digital rights.

In this episode, she joins us to unpack what feminist digital justice really means in the age of AI, and why we must act now to make our digital futures safe and inclusive for all. She encourages women to report cases of tech-facilitated GBV both to law enforcement agencies and to tech platforms. According to her, platforms are far more likely to act when many people report abuse than when only a few do.

Subscribe, leave a review and share this episode with someone who needs to hear it.

If you’d like to join an episode of this podcast, send an email to yemisi@africanwomeninmedia.com. Or visit our website at www.hermediadiary.com

Subscribe and follow Her Media Diary on all your favourite podcast platforms. Also, tune in to our partner radio stations from anywhere across Africa, and don't forget to join the conversation using the hashtag #hermediadiary.

Transcript

Introduction to Her Media Diary & African Women's Stories in Media

00:00:03
Speaker
Welcome to Her Media Diary, the podcast where African women share their real stories, bold moves, and behind-the-scenes moments that shaped their journeys. I'm your host, Dr. Yemisi Akinbobola, and with each episode, we're pulling back the curtain on what it really means to build a media career, break barriers, and stay true to your voice.
00:00:23
Speaker
Whether you're starting out or already making waves, this space is definitely for you. In today's episode, our focus is on the rise of technology-facilitated gender-based violence and how artificial intelligence (AI), social media platforms, and tech companies are shaping, and sometimes threatening, women's safety and freedom online.
00:00:44
Speaker
Our guest

Discussion with Dr. Grace Githaiga on Feminist Digital Justice

00:00:45
Speaker
is Dr. Grace Githaiga. So Dr. Grace is a policy leader, digital rights advocate, and the powerhouse behind the Kenya ICT Action Network, KICTANet.
00:00:57
Speaker
Yes, KICTANet, the Kenya ICT Action Network. She has spent years shaping Africa's tech policy landscape and centering women in conversations about internet governance and digital safety.
00:01:11
Speaker
Today she joins us to unpack what feminist digital justice really means in the age of AI and why we must act now to make our digital futures safe and inclusive for all.
00:01:22
Speaker
So stay with us.

Digital Safety and Justice for Women

00:01:36
Speaker
Good morning, or good afternoon, Dr. Grace. It's already good afternoon here in Nairobi. Great to have you on the podcast. Thank you so much for making the time.
00:01:48
Speaker
Same here, same here. I'm really glad that I'll be sharing some of my experiences and actually addressing this topic that is very close to my heart. Yeah, absolutely. I can see that from your past and the work you've done over these many, many years.

Dr. Githaiga's Journey from Journalism to Tech Policy

00:02:04
Speaker
So I'm looking forward to learning so much from you.
00:02:06
Speaker
But let's start by going beyond the title and where you are today. Who is Dr. Grace Githaiga, beyond the policy tables and TV screens?
00:02:17
Speaker
Okay, Dr. Githaiga is, like you have rightly pointed out, this person who has been in policy conversations: initially policy conversations supporting community radio in Africa and ensuring regulators actually embraced it and entrenched it as a third sector in broadcasting regulations. So I spent a lot of time doing that.
00:02:46
Speaker
And apart from that, I was actually trained as a journalist. But in 2011, I just switched into technology, and I must say, again, I have found myself.
00:02:59
Speaker
When it comes to tech policy, I enjoy what I do. Besides that, Dr. Githaiga is a mother of a son and a daughter. Yes, and I met your daughter just briefly just now. But before we go on, because there's something that you said really interestingly there around that shift from journalism to tech.
00:03:17
Speaker
And I'm going to come back to that in a second, because I really want to go back to the early years of Grace, before you became Dr. Grace. Did you grow up in Nairobi, or Kenya, or somewhere else?
00:03:29
Speaker
No, I didn't grow up in Nairobi. I grew up in Kenya, but not in Nairobi. I grew up in a town known as the Flamingo City, and that is Nakuru.
00:03:41
Speaker
So born, yes, and brought up there. It's the city that has flamingos. So I was born and brought up there, and I only came to the city to attend college after high school. And let's just say, like they say, the rest is history. I never went back to Nakuru, so I only go there to visit.
00:04:03
Speaker
Yeah, but what was it like? I mean, I have this very beautiful image of flamingos in Nakuru. Yes. I mean, I've not been there, but obviously the visual representation of it is very beautiful.
00:04:15
Speaker
What was it like growing up? What was your household like? Do you have siblings? Tell us about growing up. Yeah, actually we grew up in an ordinary Kenyan family. Both my parents were teachers, primary school teachers.
00:04:28
Speaker
So, you know, we grew up in that simplicity, but we were a very happy family. Five of us: four girls, two of whom are already late, and one boy, who happens to be my last-born and only brother.
00:04:43
Speaker
But we grew up in a home that always had relatives. So we really never experienced growing up as a nuclear family, because there were always relatives in our house.
00:04:56
Speaker
My father happened to have been the eldest son in their family of 11, you know, nine girls and two male children, and he was the fourth born but the eldest son. So he was constantly taking care of either his sisters or my cousins.
00:05:17
Speaker
We didn't really grow up as a nuclear family, and I'm only getting to experience that now with my own family, because we didn't do that. But needless to say, we were happy. There was always something to laugh about in that house, and there was always something new, because every time, whether with an uncle or an auntie, there was just something new in that home. A very warm home, yeah.
00:05:45
Speaker
But my father was also, apart from that, a father of four girls. And I must say that my sisters, actually the two late ones, were very beautiful girls, and even the one who is still alive is really pretty.
00:06:05
Speaker
And I think I am different in that house. I was the first born, but I'm kind of different, you know: short, kind of the shortest, kind of the darkest.
00:06:17
Speaker
Actually, neighbours used to call our home the home of the brown girls. That's how it was known. But it's interesting that you can make that distinction about being shorter and darker-skinned. Like, did those things factor into how you were observed and seen as you were growing up?
00:06:37
Speaker
You know, now having been in this debate, I will tell you, I think it's a discrimination thing. Because discrimination happens, you know, when we talk of women being discriminated against, I think discrimination is also at different levels.
00:06:54
Speaker
If you notice, women who are considered plump are discriminated against because, you know, they are not considered beautiful. They will not be going to those beauty pageants.
00:07:06
Speaker
Women who are my colour, for example — maybe now, but initially women my colour would never find themselves in those beauty pageants. I had a stint as a Humphrey Fulbright fellow in the US because I was doing very unique work. I guess my name was bigger than I was...
00:07:28
Speaker
And so I would get invited — you know, I had speaking opportunities and I would be invited to speak about my work, especially on community radios in Africa.
00:07:44
Speaker
At the time I had an African-Brazilian roommate, and I would sometimes tag her along to such places.
00:07:57
Speaker
And every time we got to the place, people would happily greet us, "Hi, Grace," you know. And because she was also in those debates on discrimination, she would just go straight and tell me, "Grace, it's because you're short."
00:08:12
Speaker
So they think she's the one giving the speeches, and she'd be very quick to say, "Ah, this is Grace." And she'd just really enjoy watching people's expressions as they realize that the person they invited is not what they imagined.
00:08:33
Speaker
But it's interesting that despite it all — I mean, that's so incredibly, I don't know the right word, disempowering. I don't know how you felt. But this idea that there's a particular image or look one should have that aligns with the perception of who you are, especially when somebody like yourself has done such amazing work, yet there's still that connection with your physical appearance.
00:08:58
Speaker
How did you navigate that? I never allowed anyone to make me feel small. Actually, it has never even occurred to me that being short is, you know, discrimination, or should be a thing. You know, for me, it was a strength, because I was short, I was petite.
00:09:22
Speaker
I would be, I guess, in places where people didn't expect. So I took it as an advantage. And even my classmates, you know, everyone would be saying, this small girl is in I-don't-know-which class.
00:09:37
Speaker
And it would make me feel so happy. So I think, for me, it worked the opposite way. It encouraged me. It made me feel proud of what I had done and where I was.
00:09:52
Speaker
So it has never, never been anything to me. And, you know, of course, you've worked across technology, governance, media, all over the world — not just focused on Africa, but all over the world with your amazing work.
00:10:07
Speaker
And you said something earlier on about making the leap from journalism to technology, and I'm really intrigued by that. And I'll tell you why: I was at a conference recently in Livingstone, Zambia, where there was this very intense conversation about how media organizations need to begin to see themselves as both media and technology organizations, and not separate the two.
00:10:39
Speaker
So tell us about your journey, like that transition from journalism to tech and kind of how that's evolved over time.

Kigali Declaration and Gender Violence in Media

00:10:45
Speaker
I tell people that actually it happened by default, because I had been working on establishing community radios in the region. And that went on very well, after a lot of fights with the regulators and with the government, because the government wondered why I was pushing for community radios — radios that would be utilizing local languages to articulate their agendas, in an African context where you have governments that are suspicious, especially when people speak in a language that they cannot understand. It just gave me that strength, that I was forever agitating. So whenever there was a meeting and I knew I needed to make
00:11:44
Speaker
a suggestion, or I needed to use the platform to articulate the community radio agenda, I would do that. So part of the work was establishing the community radios.
00:11:55
Speaker
But, you know, together with colleagues, we realized that you can't establish them without a binding law or policy that supports their establishment.
00:12:10
Speaker
And so in the course of my work, I had to work a lot on policies. So some of the things I was pushing for, one was the recognition of community radios as a third sector in broadcasting policy.
00:12:24
Speaker
I was also pushing for them to be allowed — you know, I actually pushed for, I forget the word, but when it came to the paying of frequency fees,
00:12:46
Speaker
I pushed for that to have differentiated provisions, so that the public broadcaster pays one rate and commercial broadcasters pay another.
00:12:59
Speaker
And then I pushed for affirmative action when it came to community radios. Because I was saying, besides, these are tools that are helping communities speak to each other and also speak their agenda.
00:13:13
Speaker
And it's communities that have no money. So you can't make them pay the same amount as commercial stations. So I actually pushed for that affirmative action.
00:13:25
Speaker
And then I was also pushing for the public broadcaster to be recognized as, you know, a public broadcaster that actually serves the public interest.
00:13:38
Speaker
And towards that, I actually did a book on public broadcasting in Kenya. Now, when I came back from my stay in the US as a Fulbright Fellow, I remember it was at a time when we had a government that was starting to recognize the usefulness of media and the convergence of technology and media.
00:14:16
Speaker
And so again, I jumped onto those debates. And there were regulations, broadcast regulations that were produced. They were called the broadcast regulations of 2010.
00:14:29
Speaker
And in that, actually, I would say 75% of what I had been pushing for was effected.
00:14:41
Speaker
And I remember the director of the National Communications Secretariat, which was in charge of supporting the Minister of Information to come up with those laws, walked up to me and told me, "You know, Grace, you are the winner in these regulations, because what you have been asking for has actually been adopted."
00:15:09
Speaker
Then he told me, "I think now you can consider going into retirement."
00:15:19
Speaker
We laughed, you know, with the people who were sitting with me, and I asked him, "Now, if I retire at my age, what will I do before I get to the age of retirement?"
00:15:31
Speaker
Absolutely. Yes. And at that point, I remember going home and thinking, what did he tell me? And I thought, actually, he has a point. Is there anything new that I can add to the sector?
00:15:45
Speaker
And I just thought, it doesn't seem like there's anything new I can add into this sector. And I think I have done my bit. I have brought in all the energy I needed to bring in.
00:15:57
Speaker
And we have made wins. So I started thinking, okay, what next? What next? What do I need to do? And it is actually at that point that I applied for the, you know, there was a PhD scholarship.
00:16:14
Speaker
And the PhD — actually, there were two opportunities in that PhD scholarship. And, you know, a friend of mine just shared it with me and said, this looks right up your street. You know, I saw community radio, blah, blah, blah. Then there was something on tech and women, and so on.
00:16:32
Speaker
ah So I actually thought the community, I would apply for the community radio one because that is what I knew best.
00:16:42
Speaker
And I completely forgot about it. Then, I remember, just two days before the deadline, I remembered that scholarship. And I just said, you know what?
00:16:53
Speaker
I need to give it a try. I need to try it out. It would be such a mistake if I do not try, and then I'll actually be regretting it.
00:17:05
Speaker
So I woke up and I read, and I realized the community radio scholarship was for Tanzania. The tech one was for Kenya. So I called a colleague and we discussed it, and he advised, this is something you can do. You've been working with women.
00:17:23
Speaker
You've been working with women in radio. The only difference here is tech. And tech, you know, nobody was born with tech. So this is what you need to do.
00:17:34
Speaker
I remember I read that Saturday; I read and read. And then Sunday, I started doing my concept, and I didn't finish it until maybe 3 a.m.
00:17:47
Speaker
So I shared it with that colleague of mine, and I asked him, please check, because I need to submit it. The deadline is 5 p.m. Kenyan time.
00:17:59
Speaker
And I went to sleep. When I woke up, I found he had responded, but had given me so many comments. And I responded to him. I called him — his name is Njuki — and I said, "Njuki,
00:18:15
Speaker
you've given me so many comments. I do not have time to respond to comments. What I want you to do is give me high-level comments that I can address in 10 minutes."
00:18:30
Speaker
ah So while on the phone, he told me, number one, ah the concept requires that you only do seven pages. Yours is 10 pages.
00:18:41
Speaker
Can you look for what to cut out? And so we worked together on that and I was ready. I remember I submitted that concept two minutes to the deadline. Wow. Yeah. And then I went for an interview, and the same day I was awarded the PhD scholarship by DANIDA, that is, the government of Denmark. And so that's how my journey changed. Because then KICTANet — KICTANet was then an email list,
00:19:19
Speaker
and a friend of mine, her name is Alice Munyua — she's now moved to Mozilla — was the convener.
00:19:32
Speaker
And I remember just sending her a message and telling her, I'm available. If you need support in anything, let me know. And immediately she responded and said, you know, there's this research we are doing.
00:19:47
Speaker
We are looking at the dark sides of ICT, how ICTs affect women, and so on. Maybe you can come support in that. And so I got in, and you know, I had just come from an academic institution.
00:20:03
Speaker
I was able to, you know, support that research. And then there was a meeting, I think in Johannesburg, to meet with a freedom of expression rapporteur, to help him discuss what the issues in tech are.
00:20:23
Speaker
And it had brought together so many people from media. She didn't want to go, because her daughter was home and was going back to school,
00:20:39
Speaker
so she wanted to spend time with her daughter. And so when she said I would represent her, they thought I was at the same level as her. And they gave me a topic that I did not even understand what it was all about.
00:20:52
Speaker
I still remember it was on intermediary liability. I didn't know what that was. But being a researcher, you know, you talk to people, you understand what happens.
00:21:06
Speaker
And I remember one gentleman — he's also in KICTANet, and we did that study on the dark side of ICTs with him — his name is Muriuki Mureithi.
00:21:18
Speaker
And he explained to me what intermediary liability was all about. And so from there I went and presented, and I got very good comments, and I was later told that my paper was one of the best in that conference.
00:21:36
Speaker
Out of that, I think I just found favour with different people, because out of that I got sponsorship to go to the summer school at the European University in Hungary.
00:21:55
Speaker
I also got a sponsorship for one week. The same people who had sponsored me there sponsored me to go to Harvard for a week, for a conference again on tech.
00:22:07
Speaker
And then I think there was a third training. Yes, there's an organization called Diplo that provides basic, you know, introduction courses to internet governance.
00:22:18
Speaker
Again, they gave me a scholarship. By the time I was done with those three, I had become an expert. And now I have completely, completely changed, and I can comfortably say that I am one of, you know, one of those main voices in Africa.

Technology-Facilitated Gender-Based Violence (TFGBV)

00:22:42
Speaker
Hello, Her Media Diary listeners. Permit me to introduce to you the Kigali Declaration on the Elimination of Gender Violence In and Through Media in Africa. It's a groundbreaking commitment to address the forms of gender violence experienced by women in media and how media reports on gender-based violence.
00:23:02
Speaker
So, whether you sign up as an individual or as an organization, it is a sign that you are pledging to consciously play your role in tackling this important issue and working towards creating safer and more inclusive work environments for everyone.
00:23:20
Speaker
Imagine a media landscape that treats every story with balance and every voice with dignity. Adopting the Kigali Declaration is not just a commitment; it's a powerful step towards social change.
00:23:33
Speaker
And that change starts with us. So if you are ready to take that bold step and make that social change, visit our website at africanwomeninmedia.com slash declaration to read the declaration, to commit to it and to begin to take action.
00:23:54
Speaker
If I may, if I could summarize that journey you've been on: it started off with somebody posing a challenge to you. "Now that you've done this, what next?" was basically the question. For them, it was about retirement; for you, it was like, oh, that's true, what's my next big challenge?
00:24:09
Speaker
And then taking those bold steps each time, whether it's applying for that scholarship that you felt maybe wasn't for you, but you still went for it anyway, or giving that talk on a topic you knew absolutely nothing about, but you still decided to show up and do your research, and subsequently everything else fell into place, right? So it's a very powerful, strong journey that brings you here to where you are today, as a very significant voice in this space around tech, media, and governance across the continent.
00:24:39
Speaker
Yes. Yeah? Correct. And just thinking about that very early research around the dark side of ICT, and bringing it to 2025 where we are now: we're talking about technology-facilitated gender-based violence, we're talking about artificial intelligence, machine learning, and all that.
00:24:59
Speaker
In simple terms, how do you define technology-facilitated gender-based violence, especially now that AI tools are, you know, the thing? They're here to stay.
00:25:11
Speaker
We're in it now. It's no longer a thing coming up. It's not emerging. It's here to stay. And also, reflecting on that very early research to where we are now, what has changed?
00:25:24
Speaker
First, I'll say, you know, technology-facilitated — we call it TFGBV — technology-facilitated gender-based violence is actually violence that is enabled by technology.
00:25:41
Speaker
So we are talking of the internet and different platforms: it can be through the phone, it can be on social media, it can be on email.
00:25:55
Speaker
So it is, you know, that violence that people can see. But then, because people are enabled with internet, with bundles, and they have platforms where they can express themselves,
00:26:11
Speaker
they do that. So, looking at it from a very simplistic perspective, a simplistic definition in the way that I understand it now: one of the things that has happened is that over time, since we conducted that research — and it is now over 10 years, yes, because we actually conducted it in 2012 and then had an update in 2014 — there is
00:26:44
Speaker
an increase. There is an increase in technology-facilitated gender-based violence because, number one, we have seen more and more people utilizing the internet. We have seen populations adopting online platforms, facilitated maybe by phones, at least for Kenya.
00:27:10
Speaker
The fact that there are so many people utilizing mobile phones has also contributed to more people getting online.
00:27:23
Speaker
And the more people get online, the more we're actually starting to see a lot of misogynistic content.
00:27:35
Speaker
Secondly, I think people feel safe because they sit behind a screen. They can, you know — sometimes we say they can "poop".
00:27:48
Speaker
They can "poop" online, thinking that they have that safety, and that it is targeted. You know, we're also starting to see men suffering, although women have actually borne the brunt of that suffering.
00:28:10
Speaker
We are also seeing an amplification of that, because sometimes people find online platforms the most accessible and the most available tools for them to express themselves.
00:28:29
Speaker
So, you know, we are finding an unprecedented way for people to communicate. And as they communicate online — for example, if you were to look at Kenyan online communities, it's as if they had been locked away somewhere.
00:28:48
Speaker
But now these online platforms have provided them an avenue to express their anger, to express their joy. People now,
00:29:00
Speaker
really, in practical terms, people have become prosumers, where they produce and at the same time consume. So that availability of the platforms, I would say, has exacerbated TFGBV as we know it.
00:29:20
Speaker
And the fact is that even now we are finding young men and young women involved in it. Initially it just used to be, for example, women in positions — especially women in entertainment, journalists — those would be the ones who would really suffer. But we see now that practically everyone is affected.
00:29:47
Speaker
And in 2025, I think now, with the adoption of artificial intelligence, with the adoption of generative artificial intelligence tools, we are seeing that the tools have actually simplified the creation of non-consensual intimate imagery.
00:30:13
Speaker
And I'm sure you have seen uses of artificial intelligence where the voice sounds almost exactly like somebody's voice.
00:30:30
Speaker
And of course, it's not necessarily true. So what we are seeing is that AI, inasmuch as it has come to simplify the tools, is actually intensifying trauma
00:30:47
Speaker
for women when they are attacked, and that is, you know, even through deepfake pornography. That is so common, especially where women are concerned.
00:31:00
Speaker
Currently, I'll tell you, in Kenya there's a content creator, a famous content creator. Let me not say who he is, because I think I'd be participating in that TFGBV.
00:31:13
Speaker
He and his wife, I think, fell out. And so it is said that he shared his wife's nudes on a family WhatsApp group.
00:31:28
Speaker
Wow. And so the wife retaliated and did the same, shared his nudes online.
00:31:41
Speaker
And then I think she thought twice and brought down the posts, but people had already taken screenshots.
00:31:53
Speaker
So there has been that beef between the two of them, including him saying that when they started dating, she didn't have, I think, teeth.
00:32:04
Speaker
And so he's asking her to return the teeth. And it is so funny, you know, because that beef is now on different platforms, and it's been going on for like three days.
00:32:17
Speaker
So you can imagine the trauma, the trauma for those who are going through it, because I think the trauma is both ways. And so again, what we are seeing is this:
00:32:29
Speaker
initially, when we did that study, women would be attacked, and what they would do is just leave the platforms. And so we were encouraging women to try not to leave the platforms, because the fact that they leave the platforms denies them
00:32:51
Speaker
the right, you know, to engage, to associate, to express themselves. And those are rights that are provided for in Kenya's constitution, in the Bill of Rights.
00:33:06
Speaker
So what we are saying is that there is now more of it. And I think the algorithmic amplification of such content means it's being generated more.
00:33:25
Speaker
So you just need to go on one platform, see, you know, the beef and the insults, and then the algorithms
00:33:37
Speaker
start bringing you more and more. And it is because artificial intelligence can tell some of the sites that you visited and some of the content that you viewed. And in that way, it sort of intensifies it, so more people are able to see it.
00:33:59
Speaker
Yeah, we'll come to algorithms in a second.

Legal and Platform Accountability for TFGBV

00:34:01
Speaker
But when you think about it from your perspective, when you think about policy and governance, do you think — especially with the kind of cases that you're highlighting, and I should also add the caveat that we're not saying that there should be limited access,
00:34:15
Speaker
because access is good, right? There is a positive side to it. It's just about how that access functions, you know, in a way that does not disenfranchise certain communities or certain conditions and contexts.
00:34:29
Speaker
So do you think our current laws and digital safety measures are catching up fast enough to protect women, especially from AI-enabled abuse and violence?
00:34:42
Speaker
I think — now that is something you have made me think about, because at KICTANet, we have been at the forefront of AI principles, supporting AI policy work.
00:34:56
Speaker
And now I don't remember if we put in anything on women, which would be so wrong, because, you know, we are at the centre of looking at how women can be protected.
00:35:08
Speaker
But in terms of, you know, law, whether law is handling that: one, we need to still point out that law always plays catch-up where technology is concerned.
00:35:21
Speaker
We have seen the dynamism of technology. And sometimes by the time law catches up, that technology has actually evolved into something else.
00:35:34
Speaker
And so I think what is very important is that we need to keep these conversations alive and push for — I know we have been trying to push for this in our law, but I won't say we have been successful.
00:35:54
Speaker
The only thing I would say we have been successful at is that now we are getting people to talk about TFGBV; there are more conversations, there are more organizations participating in this.
00:36:07
Speaker
We are getting even lawmakers to start discussing this. So it's a journey — you know, advocacy can be slow, it's a journey — but somehow we may get there.
00:36:18
Speaker
And of course, what we need to do is push for the criminalization of TFGBV, and, because of the sharing of information, especially on the different genders, I guess also to continue contributing to the data protection law.
00:36:40
Speaker
For example, Kenya's data protection law, so that there is a mandate — or we support a mandate — for people to disclose when they use deepfakes or when they are using AI. And of course, there is also a need for platform accountability, sometimes for content moderation, because some platforms need to check on their algorithms, because when platforms
00:37:22
Speaker
pick up those deepfakes, they emphasize them, you know, they are enhanced. I think, for me, I would just say we need a very honest conversation. And that conversation must bring the different stakeholders together.
00:37:41
Speaker
At KICTANet, we are very strong on pushing for multi-stakeholder approaches. And on such an issue, I would say that this calls for honest conversations and multi-stakeholder collaboration, so that we have the government, we have tech firms, we have civil society
00:38:06
Speaker
coming together to — I think we are calling it these days, what? Co-develop, co-produce, co-create — co-create policies, where we probably insist as a rule, especially for content moderation and platform accountability, that we put in a certain percentage of human oversight. Because even as we know how AI works, we know it still requires some human touch, some human control, to determine what is possible or true within each context. So there are also contextual
00:39:00
Speaker
things that need to be put into consideration. And finally, I think it's very important for those of us working in tech to continue with awareness campaigns. We shouldn't really get tired, because sometimes you put something online that is meant to be a campaign or

Promoting Secure Online Engagement and Reporting Mechanisms

00:39:22
Speaker
an awareness effort, and you get your good measure of attacks on why you are doing it. So I think it's very important that in this multi-stakeholder
00:39:38
Speaker
collaboration and these approaches, we get each stakeholder to do awareness campaigns and digital literacy in the areas that they work on.
00:39:54
Speaker
And of course, emphasize the need for consent. When you share my pictures, when you share information about me — that is information about me as a data subject —
00:40:08
Speaker
we need to emphasize the importance of consent. And of course, even as young people engage online, you know, to continue raising awareness on the need for engaging in and embracing secure practices,
00:40:29
Speaker
and emphasizing that, inasmuch as you have freedom of expression, that freedom has limitations insofar as your freedom affects mine.
00:40:41
Speaker
And to simply emphasize that, you know, your freedom ends where mine begins — just to break that down and let people know that that is very important.
00:40:55
Speaker
So what I'm saying is that on this journey,
00:40:59
Speaker
there are still so many rivers to cross, but we must remain focused on the big picture. We must advocate for secure and robust online platforms that allow people to express themselves and engage while also feeling safe.
00:41:21
Speaker
Yeah. And I suppose part of that journey — to connect back to what you said earlier on about algorithms — is who's producing these AI platforms, and how well are we empowered, as African women, people on the continent, people from diverse backgrounds, to produce our own AI tools in order to begin to mitigate some of those biases that perhaps exist because of who's creating the AIs.
00:41:45
Speaker
I completely agree with you. And I think when it comes to tech tools, there's a need to actually deploy AI-driven safety features in those tech tools.
00:41:59
Speaker
I know sometimes we talk and people tell us what we are supporting is theory, but we must also push for what we call safety-by-design
00:42:13
Speaker
tools, so that people are thinking about it. We need to think about reporting portals. We need to have places where people can report if they have been attacked. I think we also need to do what platforms have now started doing:
00:42:35
Speaker
there are online communities that support those engagements. I think, basically, that safety by design is very important if we are going to have AI. And I know AI, for example, is still evolving, so sometimes we don't have time to start talking about regulation of something that we do not understand.
00:43:07
Speaker
But we can have principles that guide that, and, of course, encourage experimentation —
00:43:21
Speaker
sandboxes that allow people to experiment with AI and those safety features that we are recommending.
00:43:33
Speaker
Otherwise, if we don't try, we are still going to experience it, you know, online violence — and still, one gender, I think women, are more affected
00:43:47
Speaker
than men. So yeah, we must do something. Which is why it's important that as these conversations are happening, they're also being grounded in the lived experiences of African women, and the very particular experiences of African women in all of this.
00:44:06
Speaker
I completely agree with you. I think, as you just said, at that point it ignited a thought. Remember, there was a time in the US when they had that campaign, hashtag MeToo.
00:44:21
Speaker
If you remember, women were being encouraged to report harassment, sexual harassment. The MeToo movement. Yes. What I realized is that African women are harassed a lot, but they don't have the confidence to speak out, for fear of stigmatization.
00:44:48
Speaker
And so, you know, when it comes to tech, I think we must also push — and I'm so happy about this conversation we are having, because we've been working on this as KICTANet, and these are some of the things that I hadn't thought of.
00:45:04
Speaker
We must push for confidential reporting. Yeah. You know, reporting that enables secure evidence storage.
00:45:18
Speaker
And then prioritizing
00:45:25
Speaker
especially when a woman says she's in trouble, just prioritizing to see how that woman can be helped. So basically, I guess, thinking out of the box, having a holistic approach: how do we integrate
00:45:45
Speaker
safety into AI, how do we integrate it, how do we start thinking of legal aid? Even legal aid, for example, when you go onto platforms that have chatbots that are the ones that respond to you.
00:46:05
Speaker
So I think it's still, as they say, a long road. Yeah, absolutely.
00:46:16
Speaker
And I was just thinking, I mean, you've obviously been in this space for decades, right?

Advocacy for Digital Security Education

00:46:21
Speaker
Was there, when you reflect on that time, was there a turning point, maybe a case or a moment where you thought, if we don't get this right, women's rights will be set back decades by technology?
00:46:34
Speaker
Were there any moments like that, where you had that revelation? Actually, when we were updating our study in 2014, one of the Supreme Court judges came to our launch as a guest speaker. She had actually suffered. She's a very beautiful woman, but somebody had actually photoshopped and shared her nudes — you know, supposed nudes.
00:46:59
Speaker
And she said she went to talk to the Minister of Interior, because she thought of the need for security. And he just found it such a joke that he laughed so much. He didn't realize that the judge had suffered.
00:47:14
Speaker
So at that point, we realized, you know, when you do research, you also need to come up with practical solutions, because, you know,
00:47:26
Speaker
we realized even out of the other women that we talked to, those who had been affected had left the online platforms. And we thought, this is not right, because why are women being run out of these platforms?
00:47:42
Speaker
And so we realized that it would be very important to actually suggest practical measures. And one of the things I know that we have continued doing, that came out of that research: we did,
00:47:58
Speaker
you know, lobby funders to support resource allocation for capacity building, especially for the groups that were most affected — you know, it's the bloggers,
00:48:14
Speaker
it's the women journalists, it's women human rights defenders. And we have continued to build capacity so that they know how to stay safe online.
00:48:28
Speaker
You know, issues of digital security — how do you defend yourself when you're attacked? And I must say that that work has actually continued to grow.
00:48:41
Speaker
So apart from capacity building, which I am so happy about — because, for example, in Kenya, so many other organizations have taken that up — meaning that there is still so much work that needs to be done, because as KICTANet, what we were doing was just a drop.
00:48:59
Speaker
So there are many other organizations that have taken that agenda. We have also seen funders taking that as a very serious issue and there is resource allocation to many different organizations to do that.
00:49:14
Speaker
So for us, what we are doing now is continuing to look at — for example, we've just released a study looking at the legal framework, on, you know, what supports this area,
00:49:28
Speaker
and just highlighting what the gaps are. So out of that, we're actually going to continue doing advocacy so that we can have strong laws, like we are saying — advocacy so that you can have security by design on platforms.
00:49:44
Speaker
And I know, like I've rightly pointed out, advocacy can be very slow, but we need to stay in the game. Absolutely. And so what are some of those gaps that your study found?
00:49:56
Speaker
One was, of course, that women were afraid to engage once they had been attacked. Oh, you're asking about the legal framework? Yeah, the legal framework. Yes, on the legal framework, the finding is that, yes, we have laws that address issues of online violence, but they're not specific to gender.
00:50:21
Speaker
That's a major finding. And do you see feminist thinking really helping us reimagine these policies and legal frameworks? How do you see that maybe informing our approaches?
00:50:35
Speaker
No, definitely, that is very important, because you need to make it an issue. We need to point out how this relates to feminism.
00:50:47
Speaker
We need to point out what those gaps are. We need to, you know — because policymakers, legislators don't have time to read, or even sometimes, you know, they are too busy engaging their supporters on the ground.
00:51:08
Speaker
So when it comes to this high level, I think, you know, there are other places that we can borrow from. I am sure there are places we can borrow from, like the EU and what they provide.
00:51:23
Speaker
And I think, you know, especially when it comes to AI, I think Rwanda is also doing something. So it's just to look at what has worked in other areas and what we can adapt.
00:51:41
Speaker
And you've trained and mentored many people through KICTANet. So what practical advice would you give to activists, journalists, content producers
00:51:55
Speaker
who are worried about being targeted online? Number one, they need to learn more about digital security. They need to embrace cyber hygiene as a practice, so that every other day, you know, they're changing their passwords.
00:52:12
Speaker
When they go onto platforms, they sign out as opposed to just closing the browser. They also need not to feed trolls. If they are trolled — sometimes you need to engage from a very mature perspective, but don't engage trolls, don't respond to them.
00:52:32
Speaker
And then, once they think it's a criminal offense, we have mechanisms; we are telling people we need to start reporting, even
00:52:43
Speaker
to law enforcement, so that they can see that it is an issue. If it is only one person reporting, they don't consider it an issue. So we encourage people to report.
00:52:54
Speaker
We also encourage people to report on the platforms that they are utilizing, so that even the platforms can actually start seeing that this is an issue that requires attention.
00:53:08
Speaker
Absolutely. And for those of our listeners who are passionate about digital rights, who are passionate about digital safety, and, you know, want to get involved in that area of work but don't know where to start, what would you say to them?
00:53:23
Speaker
I would just say that they are absolutely welcome to engage in this sector. And it is because what we are doing is just a drop in the ocean.
00:53:35
Speaker
So, you know, it is because of technology — that dynamism in technology — that we are now starting to see misuse of personal data now that we have our data online.
00:53:53
Speaker
So that data is being misused, being, you know, used for doxxing, for intimidation. That actually was not there when we did the first study.
00:54:05
Speaker
So we are starting to see violations of privacy rights. We are starting to see the use of AI, you know, to portray the different genders in a negative light.
00:54:24
Speaker
So what I'm saying is, because technology is so dynamic, sometimes we don't know the end game of technology. So I would say the more the merrier, and it's still not enough. There is still so much to be done.
00:54:39
Speaker
Much more to be done, yeah. And

Hope for Technology and Gender Justice

00:54:41
Speaker
when you think about the future of technology and gender justice in Africa more broadly, what gives you hope? The fact that we now have more people engaging in the issues, the fact that we are starting to see lawmakers also interested in the issue, the fact that we are also starting to, you know, create awareness, especially with young people,
00:55:09
Speaker
so that they realize that even as they communicate, there's also the need to respect other people's rights, and to know that there are issues about violating privacy rights.
00:55:23
Speaker
So for me, that ongoing conversation is very important and it gives me hope because when we did the study in 2012, people didn't even understand what that meant. There were very few women online.
00:55:36
Speaker
Now there are so many women online. And we have, of course, continued to see more attacks being facilitated online.
00:55:49
Speaker
So that continuous conversation is very important. And it gives me hope, because one day we will all agree to sit at a table and agree on what is doable and what is not doable.
00:56:04
Speaker
Thank you so much for your time, Dr. Grace Githaiga. It's been a pleasure listening to you and hearing your decades of experience in the field. Thank you so much for joining me. You're welcome. And thank you so much for the opportunity.
00:56:16
Speaker
So technology can build bridges or reinforce barriers. As we've heard from Dr. Githaiga, it's not just about regulating AI or chasing down harmful content.
00:56:26
Speaker
It's about reshaping the digital space with empathy, equity, and intention.

Conclusion and Call to Action

00:56:32
Speaker
Thank you for listening to Her Media Diary. If you found today's conversation inspiring, don't forget to subscribe, leave a review, and share this episode with someone who needs to hear it.
00:56:41
Speaker
If you'd like to join me on an episode of this podcast, send me an email at yemisi@africanwomeninmedia.com and visit our website, hermediadiary.com. Subscribe and follow Her Media Diary on all your favourite podcast platforms.
00:56:54
Speaker
And you can also tune in on our partner radio stations from across Africa. And don't forget, join the conversation using the hashtag Her Media Diary. Until next time, stay safe, stay curious, and keep amplifying the stories that matter.