Her Media Diary Episode 44: “Open, Free & Secure Internet for Women” with Sandra Aceng

E44 · Her Media Diary

Sandra Aceng is a human rights defender and digital rights advocate from Uganda. She is also the Executive Director of the Women of Uganda Network (WOUGNET), an organization that engages in research, policy advocacy, and capacity-building programs to ensure gender inclusivity in Uganda's digital landscape.

In this episode, Sandra discusses her personal journey into activism, the digital realities faced by women in Uganda, how AI may be perpetuating gendered harm, and what the path forward looks like for creating safer digital spaces.

As AI advances and our lives become increasingly digitised, we cannot afford to overlook gender in tech conversations. Sandra’s work is a powerful call to action for governments, developers, media platforms, and all of us to centre women’s experiences and safety in every aspect of the digital world.

Subscribe, leave a review and share this episode with someone who needs to hear it.

If you’d like to join an episode of this podcast, send an email to yemisi@africanwomeninmedia.com. Or visit our website at www.hermediadiary.com

Subscribe and follow Her Media Diary on all your favourite podcast platforms. Also, tune in to our partner radio stations from anywhere across Africa. And don’t forget to join the conversation using the hashtag #hermediadiary.

Transcript

Podcast Introduction

00:00:03
Speaker
Welcome to Her Media Diary, the podcast where African women share real stories, bold moves, and the behind-the-scenes moments that shaped their journeys. I'm your host, Dr. Yemisi Akinbobola.
00:00:15
Speaker
And with each episode, we're pulling back the curtain on what it really means to build a media career, to break barriers, and to stay true to your voice. Whether you're just starting out or already making waves, this space is definitely for you.

Gender-Based Violence & Online Safety

00:00:30
Speaker
In today's episode, our focus is on the rise of technology-facilitated gender-based violence, or TFGBV as some of us will know it, and how artificial intelligence, social media platforms and digital surveillance are shaping, and sometimes threatening,
00:00:47
Speaker
women's safety and freedom online. Joining me is a fierce advocate and expert in this space, Sandra Aceng. Sandra is the Executive Director of the Women of Uganda Network, a leading voice in gender and ICT across Africa.
00:01:03
Speaker
Phenomenal. From shaping digital rights and policies to protecting women's expression online, Sandra brings years of experience and a feminist lens to this issue, which is very important.

Sandra Aseng's Activism Journey

00:01:16
Speaker
In this episode, we're talking about Sandra's personal journey into activism, the digital realities of women in Uganda, how AI may be reinforcing gendered harm, and what the path forward looks like for a safer digital space.
00:01:44
Speaker
Sandra, a pleasure to have you on the podcast. Thank you for having me as well. Great. So we'd love for you to take us back to your early years. What was it like growing up in Uganda? Assuming you did grow up in Uganda.
00:01:56
Speaker
Oh, yes, I did grow up in Uganda. I grew up in a small village in Oyam District, in northern Uganda, where I did a bit of my schooling. Then I moved to a city called Lira, where I did my primary school, moved again to Gulu City, where I did my secondary studies, and then moved to Kampala,
00:02:27
Speaker
where I did my university studies. In Kampala I did my bachelor's in social sciences at the Makerere University School of Women and Gender Studies. It was fun, I don't know what the right word is, but it was a very mixed experience, because during that time at campus we didn't own a computer,
00:02:58
Speaker
so when we were doing our coursework, what you had to do was search for your information, write it out on paper, and then take it to an internet cafe for it to be typed.
00:03:09
Speaker
Why? Because I didn't own a computer, and probably because I didn't know how to use one. And it was also stressful to own one, because everyone looked for you.
00:03:20
Speaker
For example, if you took just one minute to go and shower, you would find your computer gone. So you find yourself in the middle of: do you want to live a free life, where you can go have some fun at the club without having to carry your computer with you, without thinking that your computer is now your new baby?
00:03:45
Speaker
So that is how it was. But one of the interesting course units that I did was also around gender and technology.
00:03:56
Speaker
So after finishing campus, preparing for a career became a reality, and I started doing short courses. Luckily enough, the School of Women and Gender Studies at Makerere University also used to offer a partial scholarship, half the payment, especially for girls, to enable them to pursue technology as a course and prepare them with skills. So that is my journey of where I grew up and the experiences I had, from campus to after campus. Yes.
00:04:42
Speaker
Yeah, yeah. So, I mean, you said a lot of things that I'd really love to unpack during the course of this conversation. But I want to go back to all of those movements, from one city to another to another.
00:04:54
Speaker
Tell us about that period. What was driving all of that movement across different parts of the country? What was growing up like? What were your dynamics at home? So what really caused that movement was getting better education.
00:05:10
Speaker
Yes, getting better education. And it was really influenced by where I grew up, which was a rural setting that didn't have, maybe, better schools.
00:05:24
Speaker
So parents would be sort of in a cycle whereby, if Sandra's parent recommends a school as good, you'd probably find the whole village going to that school. So I think that is what really influenced my movement from one city to another, just to pursue education. Yes.
00:05:44
Speaker
And how many of you were in your household, and what was growing up like? We were eight. Unfortunately, in 2017 we lost our brother. So we were eight, and I'm the second-last born in our family.
00:06:03
Speaker
Yes. Okay. So tell us more about those early years, though, in that dynamic of a family of eight, with you being the second-to-last born. What was that like for you?
00:06:14
Speaker
I think for me, it was first of all growing up in that big family, but also seeing your brothers and sisters going to school, being inspired by what they do, and trying to be even better than them
00:06:34
Speaker
in your education. So it was more of me looking up to them, I don't know what the right word is, but sort of looking up to what they do and trying to do it too in my own education. It was more: I want to be like my big sister, how focused she is, how she has studied all the way from primary school to secondary school.
00:07:11
Speaker
And growing up, we were in this family with our mother by then, and she, remember, someone who is a primary-four dropout, was able to raise school fees. She was able to sell all sorts of local brews to pay for our education at primary and secondary level, ensuring that all of us actually attained an education.
00:07:51
Speaker
And luckily enough, all of us at least went to university. Now, if you want to advance, you have to work hard, get some money, and pursue maybe a master's degree or a PhD if you'd like to advance your knowledge, but also be ready for the career path.
00:08:13
Speaker
Yeah, absolutely. I mean, your mother sounds like such an incredible force, navigating all of that alone with eight children, and all of that movement, to make sure you all had the essential foundation of education that you needed.
00:08:30
Speaker
And it was interesting, you were talking about your experience at university, not having a computer, but then also the benefits of not having a computer back then. So can you remember the first time you realised the power of technology, not just as a tool for change, but also as something that can be misused?
00:08:50
Speaker
So I will just give a bit of how I started my career and got into this field. My career was deeply grounded in digital rights, but also in communication, especially for social change.
00:09:15
Speaker
It was really driven by a passion to amplify the voices of women and marginalised groups in Uganda, and that is also because I was raised by a single mother back then. My career with civil society organisations was focused more on digital literacy, ICT for development, and advocating for inclusive access to technology, especially for rural women. So I grew in my role when I joined the Women of Uganda Network, which I actually joined as a volunteer,
00:10:00
Speaker
and initially I served as a programme officer for the gender and ICT policy advocacy programme. In that role, I really worked at the intersection of technology and policy: analysing some of the national frameworks, like the Data Protection and Privacy Act and the Rural Communications Development Policy, which by then was being drafted with the Uganda Human Rights Commission, and advocating for the inclusion of gender perspectives in laws.
00:10:39
Speaker
My shift into artificial intelligence, and into looking at technology-facilitated gender-based violence, wasn't just accidental.
00:10:53
Speaker
It emerged naturally through this work. As I engaged in more research on technology-facilitated gender-based violence, I started seeing clear patterns, especially around how algorithmic biases, platform inaction in handling victims' issues, and AI-generated content like deepfakes were contributing to emerging issues such as gendered disinformation, silencing women and harming especially structurally silenced women.
00:11:30
Speaker
So this really compelled me to explore, in technical depth, the digital harms women especially face. I also became very much involved in a number of global collaborations, contributing especially to a book that I am really proud of: Technology and Domestic and Family Violence: Victimisation, Perpetration and Responses, which was edited by a good friend of mine. This book explored a lot of the intersection of tech, AI and violence.
00:12:20
Speaker
There are also a lot of global collaborations I have participated in, joining several expert advisory boards, such as one with Ofcom in the UK, where they are conducting a literature review on perpetrators' behaviour in online violence, particularly where AI is used to automate abuse.
00:12:45
Speaker
And also serving on other advisory boards, such as for the UNFPA 2025 Global Symposium on Technology-Facilitated Gender-Based Violence that happened early this year, in February.
00:13:00
Speaker
And serving on the UN Women advisory committee that is working on a handbook on technology-facilitated gender-based violence,
00:13:12
Speaker
and several others, such as the accountability committee of the GNI. So these roles have really given me, I would say, a platform not only to advocate for safer spaces, but also to shape global conversations, because I believe that local realities really shape our global conversations, especially when we talk about the ethical use of AI in regards to gender justice, particularly in the Global South.
00:13:47
Speaker
Today, being the executive director, I've also led a number of initiatives that combine community-based interventions, survivor-centred tech platforms and policy advocacy, with a vision to see that technology really empowers, rather than harms, women especially.
00:14:14
Speaker
We're going to unpack each of these aspects. You talked about AI algorithms and, you know, the impact at different levels, and also in terms of rural women.
00:14:26
Speaker
And I'm interested in what you said about starting off as a volunteer at the Women of Uganda Network before becoming the executive director. Tell us about that journey.
00:14:37
Speaker
Because that's quite an inspirational journey, entering an organization as a volunteer and today becoming its executive director. Yeah, so it wasn't an easy journey, I would say.
00:14:52
Speaker
It was...
00:14:56
Speaker
Yeah, I also don't know how I got into this, but I think it's being passion-driven, goal-oriented and all that, that has really enabled this journey. For example, when I was a programme officer, I challenged myself to apply for different fellowships. There's this remarkable fellowship that led to a lot of my exposure, to be able to learn as much as possible. So I got a fellowship with the Global Network Initiative and Internews, and at that time I was really young; when I won, it was big for me.
00:15:47
Speaker
And I didn't know how to use this money, but it was good that I was surrounded by people who were able to guide me. So I was able to do a project researching the impact of internet shutdowns on women's rights online.
00:16:05
Speaker
When I researched that, I would say it is one of the first pieces of work Uganda has ever had looking at how internet shutdowns or network disruptions impact women's rights online. And that also led to several other opportunities, whereby I was invited by Internews. I was like, do I deserve this? You know?
00:16:37
Speaker
So I served on an advisory committee that was reviewing grant applications for different initiatives. And you know when people say that sometimes you're put in a seat that you feel you don't deserve? Yeah, but I think for me, it's about finding what exactly you can do, combining that with mentorship, and surrounding yourself with people who are able to guide you to know what you can do. So some of those fellowships, I think, really impacted my career, and also being goal-oriented, knowing what you want,
00:17:27
Speaker
being, I would say, respectful to your leaders, and doing your work as it is recommended.
00:17:39
Speaker
Sometimes when we do our work, it's not just the people we work with who recognise our efforts; the people in your ecosystem are also able to recognise the effort that you put in.
00:17:52
Speaker
So that led to my promotion to programme manager. That role also really exposed me to working with a number of teams, learning how to supervise teams and to mentor, and all that.
00:18:11
Speaker
Then in 2022, our executive director decided to transition to another role, and I was asked by the board members. The chairperson called me and was like, if you were ever to be... do you think you can manage? I asked myself, why is she asking me this question? I had never actually thought of being an executive director.
00:18:49
Speaker
Yes. Yeah, I don't know. But before that, I thought it was a nice role. I think it also came at a time when I was looking at transitioning to another
00:19:08
Speaker
workspace, you know. So it wasn't easy to let go of what I was thinking and take on this role. But I must say, I think this is a role that everyone should actually have a taste of. It makes you a better person. It makes you learn a lot about people. It makes you change certain behaviours, be patient, and understand that people are different. There's a lot you can learn from leadership, actually, and you start to do everything
00:19:54
Speaker
thinking: if I were in your shoes, what would I feel? You know? Such experiences have really shaped me to be a better person. And I know that if I'm to leave this role one day, because I don't want to be like these African leaders who stay in power forever, I think it will really help me be a better employee wherever I go, because it has really shaped the way I think, the way I do things and the way I behave as a person.
00:20:36
Speaker
Absolutely. And you wear many hats: you're a gender researcher, a digital rights advocate and a policy analyst. How do those fields intersect for you, and how did they draw you specifically to issues of technology-facilitated gender-based violence?

Research, Policy & Advocacy

00:20:55
Speaker
I think being a researcher, you are able to really understand the issues that affect or impact women's access to and use of technology.
00:21:09
Speaker
But also, research really very much informs advocacy, because for you to be able to advocate for a policy, you should have data, and not just data but, from my perspective, gender-inclusive data that is able to inform your position, that is able to influence how you position yourself, or how you think about or analyse policies from a gender perspective. So that kind of shaping is really important.
00:21:52
Speaker
It was able to influence and support my work, especially in research and in policy advocacy: knowing that research plays a key role in being able to influence policies.
00:22:12
Speaker
And so, in your experience working with grassroots women and tech users in Uganda, how would you describe the changing nature of gender-based violence in this digital period? I mean, we're well into the digital age; it's nothing new now.
00:22:27
Speaker
But how would you describe the changing nature of GBV in today's context? So I would say that... there is a saying that I like: you cannot fix what is broken by using the law.
00:22:50
Speaker
And that means that for us to be able to fix what has happened, like I said, we need to first do a lot of training, a lot of capacity building, and that also informs the policy that is being developed.
00:23:08
Speaker
I would say that the current nature of technology-facilitated gender-based violence keeps evolving because tech is evolving.
00:23:21
Speaker
The policies may not be evolving, but tech keeps evolving every day. Just yesterday we didn't know what AI was; we would just hear of it as something that would come up.
00:23:34
Speaker
So I would say there are now emerging issues such as gendered disinformation, and the different forms of technology-facilitated gender-based violence keep evolving day and night.
00:23:54
Speaker
And now, with the existence of AI, we've seen several aspects of deepfakes, and we also have what are called cheapfakes. And there are forms of predictive AI, like GPS navigation apps, which may in a way take women through maybe unsafe roads, you know, because now we are trying to embrace as much technology as possible.
00:24:29
Speaker
So it might take women to maybe unsafe roads or areas at night, if the app happens to determine those routes as the fastest on the map. You also see state-run systems for public safety, and this could be predictive policing, facial recognition or AI policing. But these can also be used, for example, to surveil gender-diverse groups, women in politics, women in leadership, women journalists who might be researching something that is politically contrary to the state's doing.
00:25:13
Speaker
And this may later result in abuse; this may later result in violence towards some of them. We have also seen things like home security systems that can be used to stalk partners, or to surveil domestic workers.
00:25:33
Speaker
And these domestic workers who help are usually women in these instances. So those are some of the ways in which technology is evolving, and now with AI, I think it may lead to increased cases, or new forms, of technology-facilitated gender-based violence.
00:26:01
Speaker
And now various groups have emerged, for instance on platforms such as Telegram, which now has a lot of groups where you don't know whether they're safe or whether they can be harmful. And when you see some of those groups, they really send sexually explicit images of women, and with AI this has been enabled to spread as fast as possible. Yeah. So it's really interesting what you've said about things as simple and everyday as GPS, your Google Maps, right, and perhaps how
00:26:48
Speaker
it needs to be more intelligent in order to navigate women away from streets that may be notorious for certain types of violence, right. And I just wondered whether you know of work and activities happening around that, to try and influence the platforms to think about incorporating some of these things into the app, so that it advises against certain routes, irrespective of whether they're the fastest or not.
00:27:16
Speaker
Is that something that you're actively involved in, or that you know of innovative ideas around? So, for example, as WOUGNET we run community-based digital safety workshops, and we have also developed an OGBV web portal. It's www.ogbv.wougnet.org.
00:27:46
Speaker
And this platform provides digital security tips, legal resources and psychosocial support for survivors.
00:27:57
Speaker
And this has been really helpful in seeing how to strengthen support, especially for survivors. On this, we have also integrated a chatbot, a chatbot with AI integration,
00:28:15
Speaker
but with the human aspect, whereby AI just helps us to gather this information, while through human interaction we are the ones who respond to these issues.
00:28:27
Speaker
And recently there was a lady who reached out, whose sexually explicit images were circulated on WhatsApp.
00:28:39
Speaker
When her images were circulated on WhatsApp, she reached out for help, and we were able to connect her to Meta. But unfortunately, you find that even with these resources, or with these tech companies, in place,
00:29:01
Speaker
there's a lot of bureaucracy involved, and also the fact that there was this specific WhatsApp group that was circulating her images,
00:29:14
Speaker
where it started from. Meta could not, and I understand, in regards to privacy or some of the policies they have in place, they were not able to do anything about that specific WhatsApp group. What they wanted was for her to identify or speak to one of the admins, and she was not willing to do that. So it ended up not being as effective as we thought.
00:29:45
Speaker
And she also wanted some of the legal policies that would support her, but it was very technical for her to understand what those policies are and how she could apply them in her own case.
00:30:02
Speaker
And then, as WOUGNET, we've also done a lot of capacity-building training, especially with an emphasis on holistic security. Holistic security is really about ensuring that we do not only train digital security, but also recognise that technology-facilitated gender-based violence can start online and move offline. That's why we also emphasise physical security. I have a laptop here: how am I able to ensure that the physical security of this laptop is actually safe? Do I have simple things, like maybe an external security lock? Where have I put mine? Having things like
00:31:00
Speaker
the covers that go over your camera. And also just being aware of the environment: you might be in a hotel, but actually there's a camera that is recording you.
00:31:11
Speaker
So we have done a number of those trainings. And recently we also trained about 20 trainers of trainers, who were scaling survivor-centred approaches to about 60,000 women and girls using our comic books, which are very interesting to read, not just in English but also in other languages such as Luganda and Lunyoro. So they were to go to various communities, schools and markets to read these books and help people understand that technology-facilitated gender-based violence is actually an issue, and also to do a number of radio talk shows,
00:31:54
Speaker
because sometimes we cannot reach everyone through the book, but we also want to see who are the people we can reach through radio and some of the safe spaces, so that we can deliver messages around not just online gender-based violence, but digital rights and the aspect of closing the gender digital divide.
00:32:18
Speaker
Media Diary listeners, permit me to introduce to you the Kigali Declaration on the Elimination of Gender Violence in and through Media in Africa. It's a groundbreaking commitment to address the forms of gender violence experienced by women in media and how media reports on gender-based violence.
00:32:38
Speaker
So whether you sign up as an individual or as an organization, it is a sign that you are pledging to consciously play your role in tackling this important issue and working towards creating safer and more inclusive work environments for everyone.
00:32:55
Speaker
Imagine a media landscape that treats every story with balance and every voice with dignity. By adopting the Kigali Declaration, it's not just a commitment, it's a powerful step towards social change. And that change starts with us.
00:33:11
Speaker
So if you are ready to take that bold step and make that social change, visit our website at africanwomeninmedia.com/declaration to read the declaration, to commit to it and to begin to take action.
00:33:30
Speaker
Yeah, I think the example you've given there, and also the work you've done, demonstrates the different levels of intervention that need to happen, right? So you've talked about working with the platforms, and them being unable, or perhaps, I don't know whether there's an element of unwillingness also, but unable to actually take the necessary action needed.
00:33:50
Speaker
And then there's the aspect of working with the policymakers, but also the aspect of raising awareness at the grassroots, even in the marketplace. But I wonder, in the example you gave of the WhatsApp group circulating this picture, this content of this woman, what was the role of law enforcement in this, in terms of the police and the actions they can actually take?
00:34:16
Speaker
Because surely there's a law being broken there, in Uganda, in any country. In any country. So this case actually started with it being reported to the police, but based on what she told us, I think the police did not do anything.
00:34:35
Speaker
And this really calls for the aspect of seeing how we strengthen our legal and institutional frameworks, in regards to being able to tackle technology-facilitated gender-based violence holistically and collectively.
00:34:52
Speaker
How do we develop, how do we enforce, specific laws on technology-facilitated gender-based violence? Because in most cases the police do not see tech-facilitated gender-based violence as a form of gender-based violence; rather, they would prioritise someone who has been physically violated as a victim of abuse.
00:35:21
Speaker
And this also includes being able to criminalise some of these happenings. Based on our research in 2022, where we explored the impacts, the threats and the harms of technology-facilitated gender-based violence, there was a lot of discussion around criminalising the non-consensual sharing of intimate images and videos, which in most cases happened especially to those who are public figures, and specifically to women. And that is a double standard, especially because you're a woman,
00:36:07
Speaker
you're in the public space and you're also, you know, a public figure. So you face double violence: for being a woman, but also for being in a space that is perceived as a space for men.
00:36:22
Speaker
So you find that such an image or video is leaked by an ex-lover or a jilted lover,
00:36:33
Speaker
and when you report the case, the police will instead blame you, because they feel sexuality should be something kept secret and not made public. So they will blame you, and they will not blame the person who shared the image. And that's why it's very important to start thinking about, or doing, research on perpetrators: who are the perpetrators of technology-facilitated gender-based violence?
00:37:00
Speaker
And also, speaking to the role of the police: I think we need to see how we mandate platform accountability, which is one of the areas really being talked about right now. How do we ensure that platforms are active and not inactive? How do the platforms work with the police, through regulations that require privacy in content moderation, but also redress mechanisms and algorithmic accountability, which is also one of the key aspects? But also, I think very importantly, if we are developing those platforms, are these platforms being built with the integration of a gender-sensitive perspective? Are they developed by women?
00:37:58
Speaker
Are we nurturing women who are able to develop these platforms, platforms that are favourable and inclusive for everyone?
00:38:12
Speaker
So I think those are some of the things that we need to take into consideration, so that we can have safe and inclusive digital platforms for all.

AI Biases & Digital Safety

00:38:24
Speaker
I think if we go to the question of AI, algorithms and the biases of creators, you know, there is that discourse around the fact that AI tools are being produced only by those outside your context, and we are not ourselves producing our own AIs
00:38:41
Speaker
ones that understand our context, whether in terms of geography, within Africa, within Uganda, but also in terms of gender. That creates an environment where you have young people in Silicon Valley producing something for use in Kampala, and they've never even stepped into Africa and don't even understand these issues around gender-based violence either.
00:39:05
Speaker
So it does add to some of the critical things that we need to be thinking of. And I know that the African Union has done some policy work around artificial intelligence in Africa, for it to work within Africa.
00:39:21
Speaker
But I just wondered, what are your reflections on that issue of where the creators are, and how that's exacerbating the experiences that you're highlighting? Okay, so thank you for that.
00:39:33
Speaker
I think for me, again, when we're developing or creating AI, it's very important that we have, for example, the perspective of women, and that we ask who the creators of this AI are.
00:39:50
Speaker
Because, for example, when we are looking at digital technologies that have also enabled the scaling of, say, telecounseling, legal aid,
00:40:05
Speaker
or reporting platforms for survivors, that involves being able to have AI-enabled features included or integrated in these platforms.
00:40:20
Speaker
For the case of Uganda, like I mentioned before, having platforms such as the OGBV portal, which is really a centralized hub that provides legal information, safety tips, but also psychological

AI's Role in Prevention

00:40:36
Speaker
support. This really helps survivors to report incidents safely and also to access services.
00:40:43
Speaker
But we also understand that AI is a tool for prevention and can help with early detection. We shouldn't look at it as something that is bad, but as something that can also be an enabler, just like technology. So AI-powered systems can also help identify some early warning signs of gender-based violence, particularly in the digital space.
00:41:16
Speaker
For example, we've seen natural language processing algorithms used on specific platforms like Facebook and also YouTube.
00:41:30
Speaker
So they really help to detect abusive language, threats, and harassment in real time, and they're also able to flag and remove harmful content. And in some instances, you see that chatbots are also used to provide anonymous support and referrals for survivors of domestic violence.
00:41:59
Speaker
So when we are at the stage of designing AI, we shouldn't look at it as something that is bad, but see how we can have a human-rights-centric approach integrated into these systems.
00:42:14
Speaker
We have also seen instances, let's say surveillance and deepfakes, where AI can actually be used to create non-consensual deepfake pornography and stalkerware, and also to automate some of the harassment. But you see,
00:42:38
Speaker
in Uganda and also other parts of Africa, women politicians and activists have reported the use of deepfakes to tarnish their reputation, but also to intimidate them out of public life.
00:42:59
Speaker
There are surveillance tools such as stalkerware, and interestingly, during the COVID-19 pandemic lockdown, there were also employers using it.
00:43:14
Speaker
And in that case, it's called bossware. It was used to monitor people to see if they were really working. That also breaches a lot of privacy and security, and some facial recognition systems can also be weaponized to control and track victims. So I would say that integrating some principles, such as the Principles for Digital Development, is very key when we're developing any tool, including AI platforms, so that we see how we
00:43:59
Speaker
design with the user, with gender principles included, and also with marginalized groups in mind, because AI can also be biased, especially against some groups such as the LGBTI community at large. Yeah, and you're absolutely right that obviously we are concerned about some of the negative aspects of AI, but despite this, there is also an opportunity for AI-based solutions as well. So let's talk more about solutions.
00:44:33
Speaker
What are some of the most effective community-led strategies you've seen that help women and girls protect themselves online? So I had actually talked about some, I think:
00:44:45
Speaker
being able to see how you work with the communities, not just training them once, but also considering sustainability, to ensure that once you train them, they're able to continue with some initiatives of their own.
00:45:11
Speaker
For example, at WOUGNET, we have established a feminist-led community network, especially catering to underserved communities in the rural parts of Uganda.
00:45:31
Speaker
So we started in Apac district, in northern Uganda, and we scaled it to Oyam district. A community network is really there to enable access to the internet and ensure that communities who do not have access to the internet get access.
00:45:53
Speaker
But in most cases, access is first. After access, what else? So we have also been able to establish committees, such as gender committees, to ensure that they're able to identify the issues that women face in their communities, document them, and also work with WOUGNET to see how to respond to them.
00:46:19
Speaker
And when we did the assessment, guided especially by the Feminist Principles of the Internet that were developed by the Association for Progressive Communications, we found that one of the challenges the community was facing, besides digital illiteracy, was also the aspect of digital security, or the threats and abuse they face on the internet. They're able to access the internet, but they should also know how to access it safely, securely, and freely. So we were able to do a number of trainings that gave them hands-on skills in
00:47:01
Speaker
checking their phones and phone settings, seeing that they're safe, and also giving them general tips on how to enhance their safety and security. Of course, there is nothing like 100% safety, but there are mechanisms we can have in place, such as having a VPN, that can enable safety
00:47:28
Speaker
and security, especially for marginalized communities. So for us, we thought that was a good approach, because it's not just a one-off.
00:47:39
Speaker
There's a community network there through which they're able to access the internet, but there's also the aspect of women coming together to talk about their issues, to speak, you know, to gossip. Every time, I tell people that gossip is healthy, so long as it doesn't harm one another.
00:47:59
Speaker
And being able to also enhance their skills, I thought that is a good approach to providing community internet, especially by WOUGNET, with support from partners such as the Association for Progressive Communications and also the Internet Society Foundation. And having continuous training on security and safety, including physical security, has also been very helpful.
00:48:28
Speaker
But besides that, there are also networks that we belong to. Like I said, we are currently contributing to a handbook on technology-facilitated gender-based violence.
00:48:44
Speaker
So this handbook can be handy if we see how to localize it and make it fun, in a way that people can actually learn how to keep safe and how to protect themselves online. And we're also thinking of the use of games for safety: finding ways to learn in a fun way, so women can embrace and also learn some of these safety skills for protecting themselves online.
00:49:15
Speaker
So, I mean, you're clearly very much involved in advocacy, but also at the policy level. So what would you say, in your opinion, based on your work and your on-the-ground understanding of what's happening, are the kinds of policies and frameworks that African countries need now to effectively tackle technology-facilitated gender-based violence?
00:49:39
Speaker
So for the policies, I think when we look at the African Union, what is that resolution? Is it 522? I'm not so sure, but Resolution 522, the one that was recently developed on digital violence. And this really recognized digital violence as a human rights issue, which is one step forward. But it also tried to ask member states to ensure that they have in-country policies
00:50:21
Speaker
that protect women, in-country policies on technology-facilitated gender-based violence that can also enhance safety and security, especially for women and girls.
00:50:34
Speaker
Because we do these things, but we do not have an explicit policy, including in Uganda, because currently we have the Computer Misuse Act 2022 and also the Data Protection and Privacy Act of 2019.
00:50:50
Speaker
But these policies do not explicitly tackle technology-facilitated gender-based violence. For example, the Computer Misuse Act talks about misuse of a computer, which is just one aspect when you talk about technology, because technology-facilitated gender-based violence can happen anywhere online, not just via the computer.
00:51:14
Speaker
And then the Data Protection and Privacy Act says really nothing about women's privacy and security, much as civil society organizations tried to amend it and share their inputs,
00:51:28
Speaker
though it does talk about the privacy of children. So we understand that some of those gaps can really inform future work.
00:51:39
Speaker
So having the African Union work with different member states to have in-country technology-facilitated gender-based violence policies would be very impactful in shaping conversations and strategies, and also in implementing activities that can collectively and holistically counter technology-facilitated gender-based violence in different African countries.
00:52:09
Speaker
Yeah, and you're right, it is AU Resolution 522, which is titled Protection of Women Against Digital Violence in Africa. So you were absolutely right on the title there.
00:52:20
Speaker
And you mentioned earlier on about having feminist approaches in your work as well. And I just wanted to explore that connection between AI regulation, tech design, harms, and policy, et cetera.
00:52:34
Speaker
And what, in your opinion and from what you've seen work, would a feminist approach to some of these things look like? In terms of what you said, AI regulations, tech policy and all that, I think for me, one of the references that would really be very helpful, especially when we talk about feminist principles, let me just look it up here, is the Feminist Principles of the Internet.
00:53:05
Speaker
I think this is a set of principles that work towards empowering women and also queer persons, in all their diversity, to fully enjoy their rights and to be able to engage in pleasure and also play.
00:53:27
Speaker
That's for me how I look at the Feminist Principles of the Internet. And what is very unique or very interesting about the Feminist Principles of the Internet is that they talk about access, you know, they talk about the importance of movement building.
00:53:46
Speaker
They also talk about the aspect of anonymity, the aspect of consent, and currently they have also integrated something around the environment. It is a living document, a document that is trying to shape perspectives, that is trying
00:54:20
Speaker
to shape how we do our implementation, how we ensure that not just women and feminists, but also groups such as queer persons, are included.
00:54:33
Speaker
So what is interesting is that when the Feminist Principles of the Internet talk about access especially, they talk about access to information, but also access to technology, and also the usage of technology. Like I said before, when you have access, how are you using it? Are you meaningfully using it?
00:54:59
Speaker
When you're meaningfully using it, are you able to ensure that women and queer persons actually have the right to code, the right to design, the right to adapt and also sustainably use ICTs, and can reclaim their spaces, especially on the internet, as a platform to express themselves, to creatively advocate, but also to challenge some of the socio-cultural happenings on the internet: some of the sexism, some of the stereotypes, some of the discrimination that happens in these spaces. And because they cover a wide range of issues when we talk about the Feminist Principles of the Internet, just like we have the African feminist principles,
00:55:56
Speaker
it's also important to have one for the internet. Yeah. And for those listeners who are interested in hearing more about the Feminist Principles of the Internet, you just need to go to feministinternet.org.
00:56:09
Speaker
And there are opportunities to even contribute and to look at those principles there as well. So obviously we've covered quite a mixed bag: what is happening negatively, but then also the opportunities.
00:56:24
Speaker
And I'm just curious, what gives you hope despite all of those things, despite the rise in technology-facilitated violence?

Stories of Hope

00:56:33
Speaker
What are those moments in your journey that really remind you that the fight is worth it?
00:56:38
Speaker
I think what really gives me hope is the stories that I hear, the feedback that I get, especially from women who have been beneficiaries of the work that we continue to do to ensure that there is an open, free, safe, and secure internet. I often tell people that for you to be able to make change,
00:57:23
Speaker
if you are able to change one person, that is change. That one person is also able to change another. Seeing people who are able to benefit from the work that we do, who are able to start up initiatives and scale them in various areas, for me, that gives me hope.
00:57:44
Speaker
That shows that the trainings we are doing and the advocacy we're doing are actually working, because more people are learning, more people are scaling, more people are creating. And that is what we want to see: that if we are doing something meaningful, we should embrace that it's not only you, or gatekeepers, in this work, but see that the work that you're doing is also being replicated.
00:58:13
Speaker
So seeing young people take up spaces, create initiatives of their own, and apply the knowledge that they have gained.
00:58:26
Speaker
For someone to just say, oh, I was able to use the knowledge that I got from you, maybe to apply for a grant, or to apply for a course at the university, that gives me hope, and that makes me think that I am on the right path.
00:58:44
Speaker
Yeah. Thinking about our conversation so far, for all of our listeners, especially young people listening who are passionate about digital rights and safety but don't know where to start, what would you say to them?
00:58:59
Speaker
Thank you. I think for me, it's about starting where you are, starting with what you have. This could be with the skills you have.
00:59:14
Speaker
This could be with the network you have. This could be with the resources you have. And, as I can draw from my personal journey, when I started, I didn't look at money, but I looked at what I needed to prepare myself for the job market.
00:59:40
Speaker
So I was able to start with volunteering, but I was also lucky enough to meet people along the way who were able to hold my hand.
00:59:51
Speaker
These are friends who were able to tell me, oh, this is happening, this and this is happening. But I was also able to just walk up to people and tell them who I am, what I can do, and, yeah, the skills that I have. So what happened was, there was this conference. It was my first time travelling outside the country.
01:00:22
Speaker
And there was this organization that had reached out. Sometimes you wonder why certain things happen, but I think it's the right time, right? So they reached out through Pacta magazine for me to write an article, and I gave it my best.
01:00:41
Speaker
And one thing again is: in everything you do, give it your best. Have people who are able to validate what you can do. So that was my first time to travel, it was in Addis, and I'd gone for a conference, for training. And I had started developing a passion for writing.
01:01:03
Speaker
So I met one of the editors of Global Voices. And I just showed her: this is the work that I've been able to write in Pacta magazine; I'm interested in writing.
01:01:16
Speaker
Just after about a month, she reached out, and she gave me my first gig, actually, to contribute to a project on internet freedom.
01:01:29
Speaker
And I was able to write about the social media taxation in Uganda. And luckily enough, it was also a paid gig. So that made me start to be visible. Sometimes you don't have to wait for people to create spaces for you; sometimes you're supposed to create spaces for yourself. So I started doing that.
01:01:49
Speaker
And that also made me known. And it led to creating resources, so that when people Google Sandra, they can find something that I've been able to do.
01:02:04
Speaker
So that is also one thing I learned: to have something associated with your name. You know, what is that one product you'll be known for? It is usually very important to embrace the creation of resources; it could be an article, it could be research. That way, sometimes people don't know you, but they will know your work.
01:02:27
Speaker
And when you speak, people will associate your work with your name. That also was a good stepping stone for me. But even before that, when I just started volunteering, I remember I used to literally go to Eventbrite, look at conferences and meetings happening, just go there, show up, learn as much as possible, and network with people. And I also embraced the power of the internet, especially social media, to share what I've learned. It could even be that when you post, no one will ever like your post.
01:03:16
Speaker
But that does not mean that you're not seen. Just continue doing what you're doing. Don't give up. I think the power of consistency is very key. So I think for me it's doing the little things, being consistent, having the right network, the right mentors who are able to guide you and give you genuine feedback
01:03:37
Speaker
to improve yourself. I think that is how I would sum it up for anyone who is trying to find themselves and improve their career path. That's the advice I would give to them. It's been such a pleasure discussing this very important issue with you, one that you're clearly an expert in. It's been such a pleasure having you on the podcast. Thank you too for having me. So that was Sandra Aceng, sharing not just insight but urgency, reminding us that the future of technology must be inclusive, ethical, and safe for all, especially women and marginalized voices.

Conclusion: Call to Action

01:04:18
Speaker
As AI advances and our lives become more digitalized, we cannot afford to leave gender issues out of technology conversations. Sandra's work is a powerful call to action for governments, developers, media platforms, law enforcement, and all of us to center women's experiences and safety in every aspect of the digital world.
01:04:41
Speaker
So thank you so much for listening to Her Media Diary today. If you found today's conversation inspiring, don't forget to subscribe, leave a review, and share this episode with someone who needs to hear it.
01:04:52
Speaker
And if you'd like to join me on an episode of the podcast, send me an email at yemisi@africanwomeninmedia.com. And you can also visit our website, hermediadiary.com, for more episodes of Her Media Diary.
01:05:05
Speaker
So listen to the podcast on your favorite podcasting platforms and tune in to our partner radio stations from anywhere across Africa. And don't forget to join the conversation using the hashtag #hermediadiary.
01:05:17
Speaker
Until next time, stay safe, stay curious, and keep amplifying the stories that matter. And like Sandra said, build your network, do the work, and be consistent while you do it.
01:05:29
Speaker
Take care.