
Her Media Diary Episode 47: “Linking Human Bias to AI & Machine Learning” with Deborah Kanubala

E47 · Her Media Diary

Deborah is a PhD Candidate at Universität des Saarlandes (Germany), researching fair machine learning to reduce bias against marginalized groups. Prior to her PhD, she was a Lecturer in Data Science & AI at the Academic City University College, Accra, Ghana. She is also the co‑founder of Women Promoting Science to the Younger Generation (WPSYG), an organisation supporting more girls into STEM and AI.

In this episode, Deborah unpacks how generative AI can deepen gender-based harm, especially for African women. She discusses how historical human bias is transferred into machine learning models, leading to continued discrimination.

According to her, investing in research that audits these systems should not be an afterthought for big tech companies, but something they embed into their systems from the start.

Subscribe, leave a review and share this episode with someone who needs to hear it.

If you’d like to join an episode of this podcast, send an email to yemisi@africanwomeninmedia.com. Or visit our website at www.hermediadiary.com

Subscribe and follow Her Media Diary on all your favourite podcast platforms. Also, tune in to our partner radio stations from anywhere across Africa. And don't forget to join the conversation using the hashtag #hermediadiary.

Transcript

Introduction to Her Media Diary

00:00:03
Speaker
Welcome to Her Media Diary, the podcast where African women share their real stories, bold moves, and behind-the-scenes moments that shaped their journeys. I'm your host, Dr. Yemisi Akinbobola.
00:00:14
Speaker
And with each episode, we're pulling back the curtain on what it really means to build a media career, to break barriers, and to stay true to your voice. So whether you're just starting out or already making waves, this space is definitely for you.

Impact of Generative AI and Ethics

00:00:29
Speaker
Today's guest is a brilliant force in the AI world, Deborah Kanubala. She's a PhD researcher in fair machine learning, a lecturer, and a fierce advocate for equity in technology.
00:00:44
Speaker
So in this episode, Deborah unpacks how generative AI can deepen gender-based harm, especially for African women, and what it means to build ethical, inclusive systems from the ground up. From deepfakes to biased datasets, from STEM advocacy to policy shifts, this conversation is equal parts alarming, empowering and visionary.
00:01:03
Speaker
So stay with us.
00:01:17
Speaker
Deborah, it's a pleasure to have you on the podcast. Welcome to Her Media Diary. Let's start with your story. You grew up in northern Ghana, I believe, you studied financial mathematics, and now you're doing cutting-edge AI research in Germany. So what led you down this path?
00:01:33
Speaker
Did you grow up thinking you were going to end up here? Okay, that's a very long but very interesting question. So I...

Deborah's Educational Journey

00:01:45
Speaker
I mean, I was born in Bolgatanga, that's the Upper East region of Ghana. And when I was about six, my mom was moving to a different job, so we moved to what was then the Northern region, Damarango, but which is now, I think, the Northeast region; the regions have been changed since. Yeah.
00:02:08
Speaker
So we moved there. And the interesting thing was that I actually didn't like mathematics growing up. Wow, okay, so how did you end up here? Yeah, I actually didn't like math, and I don't know whether it was the teachers or just that I didn't like it. Then somewhere in junior high school we had this math teacher, I still remember his name very well, Mr. Ijo, and the way he explained concepts was completely different from the other teachers I'd had before. After class I would usually want to go back and go through the concepts from class. And I remember the subject; it was
00:02:54
Speaker
algebraic expressions. Okay, wow, such a scary title. He used analogies, like, oh, how can you add mangoes and oranges, those kinds of things. Okay, so quite relatable, things you can relate to your day-to-day. Exactly, exactly. And I think that as teachers it's important to be able to make such connections; it's easier for students to follow. So I think that was what happened. And yeah, it just became much more interesting for me. Every day after school, I would just go over the materials. And I realized I started getting better. And I was like, okay, it's not so bad after all.
00:03:29
Speaker
And yeah, from there I proceeded to senior high school. I took elective maths. And I actually didn't have a programme at university in mind.

From Financial Mathematics to AI

00:03:38
Speaker
I just knew that I wanted to do something related to maths and economics, something in between those areas. And I think my mom also helped me a lot. And we settled on financial maths.
00:03:48
Speaker
So I went to do financial maths, and then afterwards I applied for master's studies with the African Institute for Mathematical Sciences in Senegal and got accepted. So I went for that program, and it was during that program that I got to know about machine learning and about coding. I pretty much didn't know anything about that before.
00:04:08
Speaker
But during my first master's program I got exposed to it, and I was like, okay, I think I really want to dig deeper into this area. So I did a second master's with this prestigious scholarship, the African Master's in Machine Intelligence.
00:04:22
Speaker
Yeah, so I did that program. And from there, I worked as a lecturer at Academic City University. So i was teaching courses in data science and machine learning.
00:04:34
Speaker
And whilst also working remotely with a Canadian-based company as an NLP engineer. So we did some projects for the Bank of Ghana and some other banks in Africa before I started my PhD. So it wasn't a very linear path, I would say. And it's not that I planned my life from beginning to end. It just happened: okay, start from this, I'm interested in this area, and then you keep building on it.
00:05:01
Speaker
That's how I landed here. Yeah. I mean, life is hardly ever linear. And, you know, it only looks linear when you look back. It kind of makes sense in hindsight: okay, yes, this led to this and that led to that.
00:05:13
Speaker
You know, but it's so interesting how your story there started from the simplification of math education. Right. So let's unpack that a

Understanding Machine Learning

00:05:23
Speaker
bit more. What would you say that instilled in you? What were those light-bulb moments that led you towards this path?
00:05:33
Speaker
Okay. So I would say it was just the love for numbers, just being able to understand how numbers build together,
00:05:44
Speaker
and the fact that you can actually relate it to everyday life. Like when he was using the oranges and mangoes, it seemed like, okay, definitely I know I cannot just put oranges and mangoes together. But that basic thing kept building up, and with time I knew that everything I learned in class I could actually relate to something. So I would say that, from that angle, that small thing helped get me to where I am now. I don't know if that answers the question exactly. No, it does, it does. And you spoke about your mom as well being quite instrumental in that journey.
00:06:19
Speaker
How was that? Yeah, so I think my mom, at the time I was in junior high school, had just her first degree. So in Ghana, she went through the O-levels; that was our educational system before we changed to the West African Senior School Certificate Examination
00:06:38
Speaker
system. So she had this O-level, and from there she was a trained teacher, worked for a while, then went to do her degree. So she started from the bottom before she built upon it. During my junior high school in particular, she only had her degree, and she herself, as a person, was an inspiration to me. And she gave us a lot of support. I mean, I didn't grow up in a rich home or anything; I would just say we were probably average, just trying to live. We didn't have much, but we were content with whatever we had.
00:07:20
Speaker
So she encouraged me a lot. Sometimes she would teach us herself. My brother in particular got a lot of whips from my mom. Oh, dear.
00:07:34
Speaker
So yeah, she would encourage us and all of that. So for me, it was always like, okay, I didn't like math, then I started liking math, and she gave me the support. And after that it was more like, what do you want to do?
00:07:45
Speaker
Left to her alone, she would have said I should go and do nursing, because becoming a nurse at that time in Ghana meant you would at least get paid, you'd have your job guaranteed afterwards, right? And this was me, who wanted to do something with maths and economics. It's like, okay, what can you become with that, are you going to become a teacher or something? But she didn't discourage me, which was great. It was more like, if you are interested in that course and this is what you want to do, you go for it, and let's see where that leads you. And I think it was actually good that she didn't force what she wanted down my throat. That probably would have killed my dreams, and maybe my path would have been different. I don't know; it may have been for the better or for the worse. But my mom played a huge role in our upbringing generally. And
00:08:37
Speaker
yeah, I mean, she made a lot of sacrifices as well, a lot of them, right? I remember one day she actually told us that even if she didn't have money, if she had to sell her last material to pay fees for us to go to school, she was willing to do that. And for me, that meant a lot, like the extent she was willing to go to. I mean, generally as women, we want to look good, we want to wear nice clothes, all the Ankara outfits.
00:09:04
Speaker
And this was a woman who was willing to sell whatever she had just to ensure that she could pay our fees and

AI Biases and Societal Implications

00:09:12
Speaker
all of that. So, a very, very inspiring and supportive mom.
00:09:16
Speaker
I think all of those foundational environments, the teacher that knows how to make things relatable and the kind of home environment that encourages your dreams, are always critical.
00:09:27
Speaker
I'm curious to know at what stage you decided that machine learning was the way to go. Yeah, so this was actually during my second master's. Second master's, so your first one was... You sound very, very Nigerian at this point, you know; we have like 10 master's degrees and 10 PhDs. So I guess that's what we have in common. Maybe it's an African thing.
00:09:51
Speaker
Exactly, African thing, exactly, the value for education. So your first master's was in what? So it was in applied maths, but with a focus on financial maths, again, and big data. And this was at the African Institute for Mathematical Sciences.
00:10:07
Speaker
In Senegal. Yeah. So the second one was in machine intelligence. This one was again hosted by the African Institute for Mathematical Sciences, in Ghana, but the program itself is the African Master's in Machine Intelligence. The program was funded by Google, and it still runs; it's funded by Google and Facebook.
00:10:27
Speaker
And the founder was the former head of Google AI in Accra, Moustapha Cissé. He started this program with the main aim of training African students in machine learning so we can solve African problems.
00:10:40
Speaker
So um all of us are from African countries. And but from different backgrounds as well, you had people, electrical engineering, computer science, math and stuff. So it was really very diverse in different areas.
00:10:56
Speaker
So currently the program is run in Senegal. Yeah. So it was during this master's program, we had to do our research, of course, like in any master's program.
00:11:06
Speaker
So I was working on financial inclusion, and I was working with this dataset from Kaggle. Kaggle is just a website where they put a lot of datasets; people can go and do machine learning competitions and things like that.
00:11:21
Speaker
So I got this dataset from there, on financial inclusion. And the main idea was to build a machine learning model that could sort of measure whether someone was financially included or not. How we had defined financial inclusion was whether you have access to a bank account or not.
00:11:41
Speaker
So the dataset spanned different East African countries, like five East African countries. And so I built a model. My evaluations and everything were great; I checked my accuracy and it was good. And I went a step further: okay, let me just check how the model was performing between the different sensitive group attributes. In this case, it was gender.
00:12:07
Speaker
And this was where I noticed that, okay, hey, it didn't have the same performance, despite this good model that I had. Because when you look at it on average across the whole population it was great, but when you look at it between groups, there was a problem. And I think it was at this point that I began questioning, okay, what could be happening? Why did I have a good model overall, but within groups it wasn't performing as I wanted it to? So it now became, for me, a matter of focusing on what was happening behind the system. Because let's say I deployed this model and it was going to be used by, say, a governmental institution to make predictions, and then based on the predictions they would have to take
00:13:01
Speaker
some steps or decisions, right? What is going to happen to the group that the model wasn't actually performing well for? It means that if you want to take a decision like, okay, distribute resources, some of these people would be left out, because the model wasn't accurate for them.
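For anyone who wants to see what that group-wise check looks like in code, here is a minimal sketch. The file name and column names ("gender", "bank_account") are illustrative assumptions loosely modelled on the Kaggle Financial Inclusion dataset mentioned above, not details taken from the episode.

```python
# Minimal sketch: overall vs. per-group accuracy for a binary classifier.
# File name and column names are illustrative assumptions.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("financial_inclusion.csv")             # hypothetical local copy
X = pd.get_dummies(df.drop(columns=["bank_account"]))    # one-hot encode features
y = df["bank_account"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
preds = model.predict(X_test)

print("overall accuracy:", accuracy_score(y_test, preds))

# The step Deborah describes: disaggregate the same metric by a sensitive attribute.
gender_test = df.loc[X_test.index, "gender"]
for group in gender_test.unique():
    mask = (gender_test == group).to_numpy()
    print(group, "accuracy:", accuracy_score(y_test[mask], preds[mask]))
```

A large gap between the per-group numbers, even with a healthy overall score, is exactly the signal she is describing.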
00:13:20
Speaker
So this was what prompted me, or where I developed the interest in looking into fairness, or understanding the impacts of these predictions generally. Okay, really interesting. So we're going to take some steps back, because some of us do not understand what you mean by machine learning.
00:13:39
Speaker
I mean, I have a vague idea, but I think a lot of our listeners will be thinking, okay, in our-level terms, not your level, what exactly is machine learning? And how, therefore,
00:13:54
Speaker
what makes gender and fairness really critical as part of machine learning, if that makes sense? Yeah, so machine learning: I like to relate machine learning to how we also learn as human beings, in a way.
00:14:11
Speaker
um I mean, growing up as a child, usually you just learn either from things around you, right? I mean, for example, your parents start speaking to you in your local language.
00:14:23
Speaker
You keep hearing all of this. So you can think of it as you just collecting data in your head, right? And as you're collecting the data, you're processing it. And that's why, the moment you begin to talk, you don't end up speaking German when you are in, say, Ghana, right? You start speaking the language that you have been hearing, all the data that you have been collecting, in a sense, right?
00:14:44
Speaker
So I like to compare machine learning to something like that. This time around, we're not looking at it as a human being taking in data and processing it to do something, but from the machine's perspective, right? So if you have a machine, you're able to collect some data for this machine, and it can draw patterns: okay,
00:15:07
Speaker
"come", for example, in my language means "wa", so you can connect these things. The machine can connect these dots and begin to learn that, okay, if you want to say "come", you say "wa"; if you want to say "go", you say "ga". That's how it is in my language, right?
00:15:24
Speaker
Yeah, so you can think of a machine as just learning from data and then being able to perform a particular task, whether that's translating sentences or making predictions on who should be granted a loan or not.
00:15:39
Speaker
So, just different things. I mean, in very layman's terms, this is how I would describe machine learning. On the second part of your question, with regards to how gender comes into all of this and why it even matters, right?
00:15:56
Speaker
So I think one of the main reasons is that, well, people believe, and it is the case, that we human beings generally are biased.
00:16:07
Speaker
We have our own stereotypes. You meet some people and just from their appearance, you can already conclude, oh, they'll be rude, they'll be kind, they'll be nice.
00:16:18
Speaker
I mean, sometimes the stereotypes are all right, other times they are not, right? So what that means is that, like I said, machine learning also works with data, right? So I'll pick a case of a loan approval.
00:16:30
Speaker
So say I'm a bank manager and someone walks up to my bank and says, I want a loan, probably to buy a house. And usually you have to assess various factors. But at the end of the day, at least from my little experience with an internship at a bank, sometimes the person actually processing the application can also give some input on whether this person would actually be able to pay back the loan or not.
00:16:55
Speaker
And in deciding on this, you could have your own stereotypes that influence your decision. So it can influence your decision such that you may end up not giving someone the loan, and...
00:17:10
Speaker
the impact of not giving the person the loan, when meanwhile this person would have paid it back, the negative effects on that person could be super high. And you as the bank, you also lose.
00:17:22
Speaker
You could also use your stereotypes to influence your decision and end up giving a loan to someone who actually doesn't pay it back. And the impact of that is that this person's credit score just drops; it affects them, so in the future, if they have to apply for a loan again, it's going to be very difficult, and you as the bank also lose money, right? But then this is some action that has been performed, you took some decision, and it goes into your database, right? So after a while you have this large database of people you've given loans to in the past, those that paid back and those that did not pay back.
00:17:59
Speaker
And then you decide that, okay, I use this data. I want to automate this process such that you don't need, say, five or 10 bank managers to sit and decide who should be granted a loan or not. You want to use machine learning to learn the decision-making process, right?
00:18:15
Speaker
So you build these models, forgetting the fact that this data could carry historical discrimination based on, like I said, those stereotypes, right? So you use this data and build a model.
00:18:27
Speaker
What the model does is just learn from the data. So if there is any historical bias or discrimination, it simply learns from it, and it could potentially even amplify the existing bias.
00:18:40
Speaker
So if there was some gender discrimination that occurred in your bank previously, and then you build a model with that data, you will only continue propagating this problem, right? And that's why it's super important to look into the details: has this data you're working with been historically discriminated, or even in the process of model building, have you perhaps assigned higher weights to some particular groups of people, knowingly or unknowingly, right? You could have done that from
00:19:14
Speaker
the machine learning engineer's perspective. So these are some of the reasons why it's important to do these things, because at the end of the day, once you have the model, you want to deploy it, and then there will be a feedback loop: you deploy it, people keep using the system, and it keeps rejecting those particular groups of people.
00:19:33
Speaker
That goes back into the system, and their lives will never get better, right? So these are the reasons why we need to think about fairness generally, and gender discrimination in particular, when we think about machine learning.
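To make the "the model just learns from the data" point concrete, here is a small self-contained sketch with synthetic loan data. The bias is injected deliberately so the effect is visible; none of the numbers come from the episode or from any real bank.

```python
# Self-contained sketch: a model trained on historically biased loan decisions
# reproduces the gap between groups. All data is synthetic and the bias is
# injected on purpose; nothing here reflects a real bank's figures.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
income = rng.normal(50, 15, n)        # stand-in for creditworthiness
group = rng.integers(0, 2, n)         # sensitive attribute: 0 = group A, 1 = group B

# Historical human decisions: group B was held to a stricter bar.
approved = (income > 45 + 10 * group).astype(int)

X = np.column_stack([income, group])
model = LogisticRegression(max_iter=1000).fit(X, approved)
preds = model.predict(X)

for g in (0, 1):
    print(f"group {g}: historical approval rate {approved[group == g].mean():.2f}, "
          f"model approval rate {preds[group == g].mean():.2f}")
```

The model's per-group approval rates track the historical gap almost exactly, which is the propagation Deborah describes; nothing in the learning algorithm corrects for the stricter bar group B faced.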
00:19:45
Speaker
Yeah, I think that's probably the clearest way I've had somebody describe machine learning bias, and the fact that it is this cycle: the bias starts from the human first, it's not the machine that creates the bias, it learns our own biases.
00:20:03
Speaker
But then there's the perpetuity it creates by not unlearning, and therefore marginalized groups continue to be marginalized because of that initial stereotype, like you said.
00:20:17
Speaker
So when you think about that in the context of other forms of bias that emerge from AI tools, like facial recognition, language, and those kinds of things, how are they similar?

Language Accessibility and AI

00:20:28
Speaker
How do things like facial recognition and language become a bias issue within AI?
00:20:34
Speaker
So, since you mentioned facial recognition, I thought it might be useful to mention this. There was a study called Gender Shades, maybe you've heard about it, from Timnit Gebru and a group of researchers.
00:20:50
Speaker
And what they found was that these models actually found it difficult to detect Black women's faces, right? Now, facial recognition systems are deployed in lots of places: at the bank, at the airport, different places. Or maybe for security purposes at your institution you have to show your face to get in, right?
00:21:23
Speaker
And what that means, especially if it's going to be used for, let's say, a system to detect whether someone is a criminal or not... I mean, another thing I should make clear is that there are certain applications of AI that are possible to build, but that doesn't mean you need to build them.
00:21:43
Speaker
Yeah, I was going to say, because that proposition is inherently biased in itself. The idea that just by looking at somebody's face you can tell whether they're a criminal is in itself a bias, isn't it? Exactly. It is. It is.
00:21:56
Speaker
So that's the problem, right? And that's why I first of all have to say that the fact that something is possible, that you can do something, doesn't mean it needs to be done, right? But let's say these facial recognition systems are used for things like this. What is going to happen is that there is already gender discrimination, and not just gender but intersectional discrimination, because the system is failing to detect not just women but Black women in particular. So what it means is that if you are using it to detect criminals,
00:22:31
Speaker
and this model just keeps flagging more Black women as criminals, what is going to happen to those groups of people? They are only going to face negative consequences for crimes they are innocent of, or anything of that sort.
00:22:45
Speaker
And this is a problem, a huge, huge problem, right? So I think that when it comes to facial recognition systems, at the end of the day it's not just a technical flaw; it can lead to wrongful arrest or even worse consequences, which can actually affect people's lives, right? So that's one particular thing I want to say about facial recognition systems.
00:23:15
Speaker
In terms of language,
00:23:21
Speaker
I mean, aside from the fact that most of these systems are usually trained on English, I would say. And not everyone speaks English, right?
00:23:34
Speaker
I mean, for most of us, yes, English is the official language in our countries, but we still have our own languages. I don't speak English at home, I speak Dagari at home. So at the end of the day, if you have someone who hasn't gone to school, or is semi-literate, or is not very good at English, it would be hard for this person to actually get whatever information they want from translation systems.
00:24:02
Speaker
Like if you use a language model, a large language model, to do, let's say, translations, right? You could lose out in these translations. And certain contexts are just peculiar to the words.
00:24:14
Speaker
And it will be hard for you to actually contextualize it from a cultural setting, to get certain information that you actually need from these systems.
00:24:25
Speaker
And this, I would think, is another challenge when it comes to language and bias related to that. Yeah. And just to go back, for those who are interested in learning more about Gender Shades, you can just go to gendershades.org.
00:24:45
Speaker
And just to add to what you're saying there about language and English: only 17% of the world speaks English, you know? So it is hugely problematic that a lot of these things are happening in that singular primary language.
00:25:01
Speaker
And it's something that this lady, Sophia Kianni, I think it is, spoke about in a TED Talk some time back, around 2022 I think, about how the language of climate change is predominantly English, which means there are thousands, millions of people who cannot access even UN-level documents that highlight the issues of climate change, et cetera, who are not able to engage with it because of that language barrier.

Environmental Impact of AI

00:25:31
Speaker
So it is quite a significant thing for us to consider. Yeah, and talking about climate change as well, I really like this paper; again, this is also from Timnit Gebru and colleagues, the one they call Stochastic Parrots.
00:25:46
Speaker
And there's also this other book from Kate Crawford that talks a lot about these large language systems, right? Because the thing is that they are now trained on not just millions of parameters, but billions of parameters, right?
00:26:02
Speaker
And whilst you're training the system, you actually have carbon emissions, right? And at the end of the day, we are talking about language. Probably your poor grandma who cannot even speak English has never heard of any of this.
00:26:19
Speaker
There's ChatGPT, there's Gemini; they don't know anything of that sort, right? And yet, at the end of the day, climate change is going to affect all of these people, right? They haven't done anything, they don't even get the benefits of it. At least if you benefit from the thing being created, you know there's a trade-off: you get something and you lose something. But here I don't even gain anything, and yet I'm losing everything.
00:26:43
Speaker
And this is also, I think, something I would say has to do with bias, because in this sense these people have been disadvantaged by not getting access to any of these systems, and yet they suffer the consequences, right? So that's also something I wanted to add when you spoke about climate change.
00:27:04
Speaker
I mean, I would even go beyond calling it bias to these people. I would say it's inequality, it's violence, right? You're not benefiting, you're not accessing, you don't have the same level of access, but you bear the consequences.
00:27:18
Speaker
And that consequence is a form of violence, you know, as well. So I think it's it's quite a substantive area for us to kind of pick up on. But talking about violence, and we've done a lot of work around gender-based violence.
00:27:30
Speaker
And obviously, there's a lot of kind of focus on technology-facilitated gender-based violence because, not just because of AI, but historically, even before AI, social media and all of that as well,
00:27:41
Speaker
And now we're entering this phase of deepfakes and digital impersonation. You know, I was listening to the radio the other day and they were talking about, I think, a Canadian-based organization that has developed an AI that does interviews for you.
00:28:00
Speaker
So it's an AI that speaks to you and it's your interviewer. And again, I'm just trying to think, okay, what levels of bias are going to exist in that? Who is going to end up being disenfranchised from accessing jobs because of this?
00:28:15
Speaker
So I guess the question really is, because we also mustn't be so dystopian about AI, right, we mustn't think only negatively. From your perspective, what are the things that you are doing, because you are in this space, or what are the things you see emerging, that are actively trying to counter these inequalities, these imbalances, these avenues through which these kinds of biases enter?
00:28:45
Speaker
So I would say, I mean, currently I'm doing my PhD and my focus is mostly on fairness generally. But whilst looking into fairness, I don't just look into how we can mitigate biases;
00:29:00
Speaker
I also question systems, right? Sometimes we end up doing things in a particular way that we assume is correct or right, without actually reflecting:
00:29:13
Speaker
the way I'm actually doing it, is it actually the right way? Is it actually doing what I wanted it to do, right? Because certain things just become a habit and you don't even realize that there could be some problems with the system.
00:29:25
Speaker
So something that I am currently doing is really related to something I mentioned, intersectionality, right? I've seen a lot of work that people have done on intersectionality, which involves discrimination that happens at the intersection of more than one sensitive attribute. So
00:29:45
Speaker
you're not just discriminated against because of your gender or your race, but at the intersection of both your gender and your race. So what I've been looking into first was to re-question the existing metrics, at least from a computer science perspective, of how people capture what intersectionality is, right? I try to dig deeper into understanding what these metrics even capture. Are these metrics actually capturing what we want them to? And to me, from what I saw with the analysis I've done so far, yes, we are capturing some disparities. But when we dig into the social, philosophical and legal perspectives of what intersectionality is, it's not just about capturing disparities, but actually capturing the non-additive
00:30:32
Speaker
discrimination that exists at the intersection of those sensitive attributes. It's not just because you're a Muslim or a woman, but because you're a Muslim woman, that you're discriminated against, right? It is different from saying that because I'm a Muslim I'm discriminated against, because I'm a woman I'm discriminated against, and then putting them together. It's not that. If you look at all women, you as a Muslim woman are probably not discriminated against, because we compare you to the average and there's no discrimination.
00:31:02
Speaker
The same way that we compare you with all Muslims, which include Muslim men, and you are not discriminated, but at the intersection, then you are discriminated. But then these metrics don't tend to capture this concept that I have explained.
00:31:16
Speaker
But these are metrics that are out there, and people say, yes, we have metrics that can help us, from the model side, detect whether there is intersectional discrimination or not. So I think it's important that we always take a step back, re-question systems, and try to really understand: are they doing exactly what we think they're doing? So this is something that, from my side, I'm looking directly into.
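A toy illustration of the non-additive point she makes: each marginal comparison can look only mildly unequal while one intersectional subgroup is clearly worse off. The rates below are constructed purely for illustration and assume equally sized subgroups; they are not measurements from any study.

```python
# Toy illustration of non-additive intersectional disparity: marginal gaps are
# small, but the intersectional subgroup (Muslim women, per the example in the
# conversation) is far worse off. Rates are made up, and subgroups are assumed
# to be the same size so simple means are meaningful.
import pandas as pd

rates = pd.DataFrame({
    "gender":        ["woman", "woman", "man", "man"],
    "religion":      ["muslim", "other", "muslim", "other"],
    "positive_rate": [0.40,     0.80,    0.80,   0.60],
})

print(rates.groupby("gender")["positive_rate"].mean())    # women 0.60 vs men 0.70
print(rates.groupby("religion")["positive_rate"].mean())  # muslim 0.60 vs other 0.70
print(rates)                                               # the intersection sits at 0.40
```

A metric that only compares women to men, or Muslims to non-Muslims, sees a modest 10-point gap; only looking at the intersection reveals that Muslim women sit 40 points below the best-off subgroup, which is the kind of discrimination she says the usual metrics miss.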
00:31:40
Speaker
A second thing that I'm doing, also still related to re-questioning systems and thinking about new ways to do things: currently, we focus mostly on automating systems by predicting.
00:31:54
Speaker
So I want to predict who will pay back a loan or not. But actually, it's not just about who will pay back a loan or not; it also has to do with the conditions under which a loan is given.
00:32:07
Speaker
Because the conditions under which a loan is given can actually affect your ability to pay it back or not. And so if you build a system that just focuses on determining whether to give you a loan or not,
00:32:22
Speaker
You actually forget that before we get to that part, you give me some loan conditions, the interest rates. And it's been shown that interest rates can actually differ between different racial groups.
00:32:33
Speaker
Right. And so it means that people with higher interest rates would struggle to pay back a loan, while someone with a lower interest rate would find it much easier, because you just have to pay lower repayments; your monthly installment is so much lower.
00:32:48
Speaker
And so the burden is not as high as for someone with a higher interest rate. So right now I'm rethinking how we can move from a system of just predicting whether I give you a loan or not, and go forward to think about how we can actually decide on loan terms that are fair, that can actually enable people to pay back their loans.
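To see why the terms matter so much, the standard amortization formula is enough: the monthly burden shifts heavily with the interest rate alone. The principal, rates and term below are illustrative numbers, not figures from the episode.

```python
# Standard loan amortization: monthly payment for a fixed-rate loan.
# payment = P * r / (1 - (1 + r)^(-n)), where r is the monthly rate and n the
# number of monthly installments. All figures below are purely illustrative.
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

for annual_rate in (0.05, 0.12):
    payment = monthly_payment(100_000, annual_rate, 20)
    print(f"{annual_rate:.0%} interest: {payment:,.2f} per month")
```

On these assumed numbers, the same 20-year loan costs roughly 660 a month at 5% but about 1,100 a month at 12%, which is the repayment-burden gap she points to when interest rates differ systematically between groups.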
00:33:10
Speaker
So this, in my own little way, as far as my research is concerned, is what I'm doing to at least help us mitigate the consequences of some of these systems. And I always say that I don't think AI or ML is bad.
00:33:26
Speaker
Definitely, I mean, it's good. It's helping a lot of things. But at the same time, we need not to forget that there are some consequences and we need to put work into seeing how we can mitigate these negative consequences.
00:33:38
Speaker
Yeah, absolutely. And also remember that we have decades of experience of social media, of digital platforms where all of these things have also been perpetuated and not to forget all of the things we've learned.
00:33:49
Speaker
And as African Women in Media, we have this thing called the Kigali Declaration

Kigali Declaration and Gender Violence

00:33:53
Speaker
on the Elimination of Gender Violence in and through Media in Africa. I don't know if you've heard of it. Yeah.
00:34:00
Speaker
And one of the articles, I think it's Article 7, no, Article 6, I believe, speaks directly to platforms in terms of what they can be doing to address gender violence on their platforms.
00:34:14
Speaker
Now, if you were to sit with tech companies like Meta, Google, OpenAI, what would you ask them to be doing differently to prevent gender violence, gender abuse on their platforms?
00:34:26
Speaker
I think... I don't know whether it has to do with capitalism or something. but i To do with what? Capitalism. Capitalism, right? Okay. Yeah.
00:34:38
Speaker
I think people are a bit focused so much on the gains. Like, I just want want to get profits, you know, regardless of who suffers, you know. I just want to gain.
00:34:50
Speaker
And for me, if I have to sit down with them, I would say that investment in research or work that focuses on auditing their systems should not be an afterthought.
00:35:06
Speaker
It should not be something that they do at the end just to get through the laws of certain countries, right? It should be something they embed in their system, their entire modeling pipeline, right from the start.
00:35:23
Speaker
And they need to audit the system at every step as they keep building. Aside from that, it's important that whilst you're auditing, you don't audit with a team of all, say, white males, right? I mean, definitely, I know they may think differently among themselves, but the things they see, or how they reason about things, they may not be able to detect things that could actually be affecting groups of people who do not necessarily look like them.
00:35:54
Speaker
And so it's important that we create not just gender-balanced teams, but racially balanced teams as well.
00:36:05
Speaker
Because when you don't experience something, or you don't go through something, it's easy for you to just not even realize it's happening, right? So if I have to sit with them, one thing is to tell them that it shouldn't be an afterthought, and that it's important to create diverse teams that can at least help them in the auditing process, to think from different perspectives and ways of doing things. And potentially also transparency. I don't think these organizations are very transparent in the way they do things, their models.
00:36:37
Speaker
They tell us that the models are open source and all of that, but definitely, I don't think they give us the whole picture. We don't know the things that go into the model development process, all the features they used, their model fitting, how the weights are obtained.
00:36:55
Speaker
And I think it's important that they are transparent in whatever they do, right? So, for me, that's speaking from where my research stems.
00:37:06
Speaker
These are some of the things that I would want to touch on.
00:37:13
Speaker
Hello, Her Media Diary listeners. Permit me to introduce to you the Kigali Declaration on the Elimination of Gender Violence In and Through Media in Africa. It's a groundbreaking commitment to address the forms of gender violence experienced by women in media, and how media reports on gender-based violence.
00:37:33
Speaker
So whether you sign up as an individual or as an organization, it is a sign that you are pledging to consciously play your role in tackling this important issue and working towards creating safer and more inclusive work environments for everyone.
00:37:51
Speaker
Imagine a media landscape that treats every story with balance and every voice with dignity. By adopting the Kigali Declaration, it's not just a commitment, it's a powerful step towards social change.
00:38:04
Speaker
And that change starts with us. So if you are ready to take that bold step and make that social change, visit our website at africanwomeninmedia.com slash declaration to read the declaration, to commit to it and to begin to take action.

Empowering Women in STEM

00:38:25
Speaker
Yeah, and you've co-founded initiatives like Women Promoting Science to the Younger Generation. I think that's such a brilliant initiative, by the way, because of that intergenerational exchange.
00:38:40
Speaker
Because, I mean, I've got kids, my youngest is seven. So she's growing up and this is her reality. This is for her what the landline was for us growing up, I guess, right? You know, tell us more about that initiative and what it does. So we started this initiative whilst I was doing my master's at AIMS Senegal.
00:39:03
Speaker
And it started, I mean, I think we weren't bad in terms of gender numbers; we were like 11 girls out of a class of 33. It wasn't that bad, at least for a math class. Yeah.
00:39:19
Speaker
But then, because we were also from different African countries, we knew from where each of us was coming from. Like my financial maths class, we had three girls and it was a class of 10.
00:39:31
Speaker
So that's like 30%, it's not 50, right? And then there was this other girl who said they were like two, and their class was over 50. So although we had come together in one place and made up a relatively good number, from our experiences it wasn't always a good number, good representation. So we thought that, okay, with our experiences and everything, we could come together and form something through which we keep talking to young people. And I think sometimes it's easier to talk to people when your age gap or generation is not too wide apart, right?
00:40:07
Speaker
So, I mean, now they have Gen Alpha and then they have Gen Z, then we have the Millennials. So it's like, it's easy for a Gen Z to relate to, you know,
00:40:18
Speaker
a Gen Alpha, than for a Gen Alpha to relate to a Millennial, right? So we started that: okay, maybe we could start something like this just to talk to more young people, share with them our experiences, encourage them, because you cannot be what you cannot see, right? So if you see someone who is at the top, you know that you can be there too, because this person is there. So it was just something like that that we started. And our main focus was just to help get more young people into STEM, to study STEM courses, because we believe that this definitely holds the future.
00:40:57
Speaker
So this was the main goal of our initiative. And we've done quite a lot of work. We've organized bootcamps in Python programming, and we've also had conferences, online conferences. Most of the time we do things online, because putting people together in one space can be costly, and when you don't have sponsorship it becomes a bit difficult. But yeah, these are most of the initiatives we run: teaching people how to program, motivation, and providing mentorship.
00:41:29
Speaker
So here's a thought. In a previous interview I did for this podcast, on this topic around AI and regulation, et cetera, we reflected on the notion that as a journalist, as a media person, you're not just a media person; actually, in today's world, you need to be thinking of yourself not just as a media professional, but as media and technology, right?
00:41:55
Speaker
And so obviously your your your initiative is focusing on the younger generation, but actually there's a whole massive older generation that's thinking, AI, what's that? You know, AI agents, what's that?
00:42:06
Speaker
So I guess what are the barriers for entry for women? So not just young women who are going to go in the workforce in the next 10, 20 years and ai is just a standard, but then also those that have, you know, the the ones beyond the millennials who are kind of,
00:42:25
Speaker
still grappling with TikTok, and here is AI. So what are those barriers, and what do you see as opportunities for initiatives, your initiative, mine, African Women in Media, in addressing these barriers, and for the platforms too?
00:42:43
Speaker
I think one would be unequal access to resources, learning resources. And this can come in various forms, not just because they are not available. Like if you go online and say you want courses on machine learning, you get a lot of them, but some you have to pay for to get access.
00:43:08
Speaker
Right. And I think the last time, I was actually at a workshop where I mentioned this about accessibility and costs, and affordability, that's the right word, affordability. I mean, if you picture ChatGPT, for example, a subscription is about 20 dollars a month, and if I go back to Ghana and convert 20 dollars to Ghanaian cedis, it's a lot of money, money that someone could spend on meals for two or three days. In fact, not everyone even earns up to a dollar a day on average, right? And when I think about it, do you think that a woman who is trying to make sure she can get some money to feed herself and her kids would rather pay for a subscription of twenty dollars, or a course that costs about two or three hundred dollars? And I think that this is one
00:44:03
Speaker
major barrier that sometimes having not just the access but being able to afford these learning resources is a huge barrier to most people, the cost involved, right?
00:44:15
Speaker
And the second thing, in my opinion, at least for now when I look at my colleagues, has to do with work-life balance. At the end of the day, we are women, and aside from working you also have a life, you have a family, you want to be able to be there as a mother for your family, and you still have to work to earn money, right? And you have to upskill yourself. So it's
00:44:48
Speaker
a lot of tension and demands on your time, which is sometimes super hard, and that can create a barrier to even getting into the field of AI, because you need to upskill yourself, right? And with the demands of family life, it's super hard to navigate all of those challenges. And I think this is also another challenge, in my opinion: there's so much to balance out.
00:45:15
Speaker
And maybe the other thing could be exclusion, right, in terms of either culture or teams. I mean, depending on where you are or where you're based, let's say abroad or somewhere where you have less representation of African women or Black women generally, sometimes you can be overlooked in certain things, or you are questioned more than other people, like male peers or, let's say, white women, in terms of your decisions or your assigned tasks in jobs and so on.
00:45:58
Speaker
And that can have an effect on your self-esteem, your self-confidence, so you can start to think that you can't even learn new things. And I think that sometimes these
00:46:12
Speaker
things that happen indirectly can actually also create a barrier to wanting to enter the field of AI. But I think for me, most importantly, it would be the affordability of these learning resources.

Grassroots AI in Africa

00:46:27
Speaker
Yeah. And are there any innovations or stories, best-case stories, in technology or AI on the continent right now that are really giving you hope, especially around the things you're passionate about: fairness, gender equality, et cetera?
00:46:43
Speaker
Yes, I would say there are definitely lots of grassroots AI communities in Africa right now, and the growth is incredible. I actually also have one: we have this Women in Machine Learning group that we started, which we co-organize with some other colleagues.
00:47:04
Speaker
And I think that's super amazing. We also have the Deep Learning Indaba, which is an annual gathering of machine learning and data science engineers, and it happens every year in one African country.
00:47:17
Speaker
And this year is happening actually in Rwanda. And there's also AI Saturday Lagos that is in Lagos and their main focus is just to help people get started in the field. So in terms of these grassroots communities, I think the the growth is huge.
00:47:34
Speaker
Aside from that, there's also a lot of work and startups focused on solving African problems. A colleague just created an app that she calls the Ma app, shout out to Asada, and the main idea of this app is to help African mothers, right? So you can use this app to keep track of your antenatal visits, you can use it to keep track of
00:48:04
Speaker
whether you have to take folic acid. So it's for people who are trying to become mothers, who are already pregnant, or who are postpartum, right? So it can help different groups of women depending on which stage you are at.
00:48:20
Speaker
And I think that this is definitely incredible; it has to do with us as people and what we really need. Also, the big companies: Google right now is in Accra, Ghana, and their work there focuses on solving mostly African-based challenges.
00:48:36
Speaker
So I think this growth for me is amazing, notwithstanding that there's also focus on thinking about the ethical implications. So at the Deep Learning Indaba, there's this Trust AI workshop um that will be happening. And the main idea is just to discuss trustworthiness of AI, accountability, transparency of these systems.
00:48:59
Speaker
And I think it's amazing to see these kinds of conversations already kicking into place, right? And governments in other African countries are also doing well to come up with their own national AI strategies, so that they can guide their respective countries in terms of AI development. And I think, for me, seeing this growth of AI ethics in the community is really great. And it's not just about adapting to AI tools; we should also contribute to shaping

Advice for Women Entering AI Fields

00:49:36
Speaker
them.
00:49:36
Speaker
And this actually gives me some hope for the continent. And lastly, what advice would you give to any woman in Africa who wants to, you know, break those barriers and venture into machine learning and AI, but just doesn't know where to start?
00:49:52
Speaker
So the first thing I would say is believe in yourself. Most often, we are more capable than we think we are, honestly.
00:50:03
Speaker
And that's the first piece of advice. And I would say, don't wait for tomorrow, don't say "I'll begin tomorrow". Just start small and stay consistent. Get your laptop; I mean, now there are even apps where you can actually learn how to code.
00:50:18
Speaker
Just start gradually and be consistent. Just start small and keep going. And it's also important to find the right support, people, right?
00:50:30
Speaker
Because either you're joining a local community or you get a mentor or you you find people who you can easily talk to and share whatever challenge you're facing.
00:50:41
Speaker
So it's really important to also find that such that when it becomes difficult, you can share with them and just own your own perspective. We are all different people and our lived experiences are definitely different and we don't have to do things in the same way.
00:50:57
Speaker
So just own your own perspective and run your own race. Don't let anyone pressure you. Yeah, those are wise words from you, Deborah. And I should say that as of two nights ago, I started the process of developing my very first AI agent, so thank you. I'm looking forward to it. I'm not going to tell you what it's for; it's very basic, nothing extravagant, but it's just, like I said, taking those steps. Exactly, yeah. Because, you know, I hear about AI agents and everybody's saying, oh, you could be the next millionaire.
00:51:31
Speaker
And it's not about that. It's just actually about just what the hell is this AI agent and how do I do it? And actually, I believe once you get started, like you said, more will follow. and And that's how you learn. So thank you so much for your time today, Deborah. It's been a pleasure speaking with you.
00:51:44
Speaker
Thank you for having me, and I really appreciate the invitation. So, as AI continues to evolve, so must our frameworks for ethics, for fairness, and for protection, especially when it comes to women's safety and visibility on these platforms and through these tools.
00:52:02
Speaker
Deborah reminds us that justice has to be built into the foundation of every algorithm. So thank you for listening to Her Media Diary. If you found this conversation inspiring, don't forget to subscribe, leave a review, and share this episode with someone who needs to hear it. If you'd like to join me on an episode of the podcast, send me an email at yemisi@africanwomeninmedia.com.
00:52:23
Speaker
And you can look at all of our other podcasts on hermediadiary.com. So subscribe to Her Media Diary through your favorite podcasting platform. And you can also tune in through our partner radio stations from anywhere across Africa.
00:52:36
Speaker
And don't forget, join the conversation using the hashtag Her Media Diary. So until next time, stay safe, stay curious, and keep amplifying the stories that matter.