
Best Of: Confronting big tech’s abuses as a question of human rights

S4 E20 · Bare Knuckles and Brass Tacks

We're off this week, deep into planning and scheduling for next year. Please enjoy this Best Of episode, originally released in October.

Hannah Storey, Advocacy and Policy Advisor at Amnesty International, joins the show to talk about her new brief that reframes Big Tech monopolies as a human rights crisis, not just a market competition problem.

This isn't about consumer choice or antitrust law. It's about how concentrated market power violates fundamental rights—freedom of expression, privacy, and the right to hold views without interference or manipulation.

Can you make a human rights case against Big Tech? Why civil society needed to stop asking these companies to fix themselves and start demanding structural change. What happens when regulation alone won't work because the companies have massive influence over the regulators?

Is Big Tech actually innovating anymore? Or are they just buying up competition and locking down alternatives? Does scale drive progress, or does it strangle it?

What would real accountability look like? Should companies be required to embed human rights due diligence into product development from the beginning?

Are we making the same mistakes with AI? Why is generative AI rolling forward without anyone asking about water usage for data centers, labor exploitation of data labelers, or discriminatory outcomes?

The goal isn't tweaking the current system—it's building a more diverse internet with actual options and less control by fewer companies.

If you've been tracking Big Tech issues in silos—privacy here, misinformation there, market dominance over here—this episode is an attempt to bring those conversations together in one framework.

Mentioned:

Read more about the Amnesty International report and download the full report here: “Breaking Up with Big Tech: a Human Rights-Based Argument for Tackling Big Tech’s Market Power”

Speech AI model helps preserve indigenous languages

Empire of AI, by Karen Hao

Cory Doctorow’s new book, "Enshittification: Why Everything Suddenly Got Worse and What To Do About It"

Transcript

Introduction of Hannah Story and Her Report

00:00:00
Speaker
Gosh, yeah, that's a difficult question. I'm not sure. We only ask easy questions, Hannah, just softballs all day long. Welcome to the show, Hannah.
00:00:12
Speaker
Hey, listeners, we're off this week. We are preparing a lot of new lineups and guests for next year.

Human Rights Perspective on Big Tech

00:00:19
Speaker
But enjoy this Best Of episode with Hannah Storey from Amnesty International, looking at a new way to think about resisting big tech.
00:00:38
Speaker
Welcome back. It's Bare Knuckles and Brass Tacks. This is the tech podcast about humans. I'm George K. Hi, I'm George A. And today we have Hannah Storey, who is an advocacy and policy advisor on technology and human rights with Amnesty International. Y'all, I feel like we got the big time. So George sent me this report, and I just straight up cold emailed Amnesty's press office. The report is “Breaking Up with Big Tech: a Human Rights-Based Argument for Tackling Big Tech's Market Power.” And it turns out Hannah Storey is the author of that report.

Challenges and Insights of Hannah's Report

00:01:16
Speaker
It's just a very unique take on some of the issues that we've addressed on the show, which I think are normally examined through a legal lens, an antitrust lens, a commercial lens. But this is the first time we'd seen it examined explicitly through a human rights law lens. And we just wanted to have her on the show because we're like, let's dig into that argument. And Hannah delivered for sure.
00:01:38
Speaker
Yeah. And you know what? Again, part of the entire theme this season has been, let's tackle the actual big issues. And this definitely was it. I wish we could have had more time.
00:01:49
Speaker
I wish we weren't struggling through three time zones and hotel Wi-Fi and all the things. But that said, I think she really delivered. The report is outstanding. I think we got into the nitty gritty of some really big, critical questions, each of which alone is probably worth an entire episode, or a series of episodes, to try to break down.
00:02:12
Speaker
She was brilliant enough, and so gracious, to still try to give us her, you know, summary responses in a minute or two for topics that multiple books could be written on, and that she's done massive amounts of research for. So I really do think this is for anyone who has any sort of interest in figuring out what is happening in society right now. What is the relationship between big tech and normal people?
00:02:40
Speaker
What are we going to do to actually, you know, fight for our rights and save any semblance of the society that we thought we knew? How do we fight back against enshittification and the techno-feudalism taking over?

Amnesty's Unique Human Rights Approach

00:02:54
Speaker
I mean, we really went into it. I'm not saying we went like full Che Guevara, but I think it's a good day to
00:03:01
Speaker
have the kind of conversations where, hey, things aren't great right now, can we start calling it out? Yes. A thousand percent. This was that show. Yeah. All right. Well, you know, we only ask easy questions. So let's turn it over to Hannah Storey. Hannah, welcome to the show.
00:03:28
Speaker
Hi, thanks so much for having me. We're really excited to have you here because, one, the report is exhaustive, and two, it takes a very novel argument, one that we have not seen. We both work in tech and have done so for a long time. We're very familiar with a lot of the legal arguments, especially antitrust arguments, for example.
00:03:52
Speaker
But this is the first time that we had seen resistance to big tech framed through a human rights lens. So I just want to start by giving you the floor to explain: why did you decide to take this approach, or how did the idea come up that there was enough of an argument here to base it on human rights? And then we will dig in a little further.
00:04:18
Speaker
Yeah, thanks so much. I mean, you've really got to the nut of it, I think, already. We

Human Rights in Big Tech Discourse

00:04:24
Speaker
were very interested in this market power question, but every time I was reading analysis around this, it was really coming from a competition, fair competition perspective, or from a consumer rights perspective, or looking at the impact on small and medium-sized enterprises.
00:04:47
Speaker
And some were looking at human rights, but not wholly. So as a human rights organization, it was very difficult for us to get involved in the conversation. We couldn't really talk about these other perspectives because that's not our place in this conversation. So we knew that we needed a wholly human rights perspective. And to be honest, when we started out,
00:05:12
Speaker
you know, we didn't start out with the conclusion. It wasn't, we know that we can make this argument. It was, can we make this argument? So we started looking at different human rights impacts that big tech companies have and started to think about, well, what's the market power component of this? Is there an issue with scale? Is there an issue with control? Is there an issue with...
00:05:39
Speaker
a lack of options around different technologies. So that was kind of the genesis of this report: can we make that human rights argument? Which I'm pleased to say we could. And we managed to publish the report this year.
00:05:53
Speaker
But it was that need for a sort of wholly human rights perspective that we weren't necessarily seeing coming from other organizations. Yeah, I really appreciate that. When I went through the report, I feel like it pulled together a lot of things that we had seen in individual use

Universal Declaration of Human Rights and Media

00:06:16
Speaker
cases, right? We knew about the issues in Myanmar and Facebook. We knew about issues with Tigray and Ethiopia, but those were always viewed as like a media problem, right?
00:06:30
Speaker
And then we also have terms and conditions. Is it regulatory capture? Is it consumer lock-in? And we have those as commercial issues. I think one of the things that I learned in the report that really stood out to me is, while I thought I was familiar with the Universal Declaration of Human Rights,
00:06:52
Speaker
I really appreciated the level of detail: that there is a right to hold views without interference or manipulation. Right. And what was drafted back then, and probably seen through the lens of the press and sort of authoritarian propaganda machines, is something that can be applied to
00:07:15
Speaker
the media landscape as we see it today, which we've talked about extensively on the show. Like, the problem with this algorithmically led media is that it's just bubble after bubble.
00:07:25
Speaker
But I want to, I guess we're going to drill in here. We'll start sort of general and then move in. So, what would be your ideal outcome here? Is this to drive a policy discussion? Is it also to drive awareness at the individual user level? Like, where do you see the next step in terms of action taking place?

Beyond Regulation: Structural Issues in Tech

00:07:52
Speaker
Yeah, that's a great question. I mean, this is quite a policy-wonky document, so I'm not necessarily expecting a lot of engagement with this kind of policy piece on an individual level. So I think there are two things that are key here. One is that we need to start a conversation at a policymaker level around going beyond regulation. Regulation of big tech, I think, is a really key part of the solution. We need to be thinking about
00:08:28
Speaker
you know, it's been a Wild West up until now. They've been able to do much of what they want, and we need to start reining that in. But what we found in this research was that even if you start regulating them, they have so much power in terms of regulatory influence.
00:08:44
Speaker
You know, I can't say that they definitely have influenced regulation in the past, but I can say they do have a lot of influence. So regulation... It's OK, Hannah. I will go ahead and say they have a metric fuck ton of influence. If anyone has ever seen Facebook's D.C. headquarters on K Street, it's very clear why that exists. Yeah, I mean, I've not seen it, but I've heard. The floor plans, let's just say the floor plans are a little bit bigger than civil society organizations'.
00:09:16
Speaker
So regulation is one component, but we're seeing a situation now in the EU, for example, where we have regulation of big tech and the implementation is not up to standard. These companies have to do risk assessments, look at their potential and actual human rights impacts, and mitigate them.
00:09:36
Speaker
I've read some of these risk assessments. They're not sufficient. They're not doing enough to really address the issue. And the EU needs to be making sure that they're implementing that well. And unfortunately...
00:09:47
Speaker
We're not necessarily seeing that. So we need part of the solution to be thinking about the structural issues. We cannot just think about regulation. So at a state level, at a policymaker level, we need to start thinking about what already exists, maybe competition or antitrust regulation, that we could be utilizing to start addressing some of these human rights issues.
00:10:13
Speaker
And so on a practical level, what does that mean? Well, competition regulators don't know about human rights, maybe on a personal level, but generally they don't. Human rights experts don't necessarily know about competition and antitrust.
00:10:28
Speaker
Data protection regulators... you know, everyone's very siloed. We need to start bringing these constituencies together so that when you're making decisions around competition, you're thinking about the human rights impact, you have that level of knowledge. When a merger or acquisition is in front of a regulator, they're not necessarily thinking about how that might impact our rights.

Questioning Big Tech's Influence and Media Diversity

00:10:53
Speaker
That is a way that we could move towards: bringing those different constituencies together. And then on an individual level, I think it's calling this out. I think we've all seen, you know, Trump's inauguration pictures with all those CEOs stood right there. I think it's right in front of us that the power of these companies is very significant. Absolutely. And starting to question that, and starting to question some of the business models that are underpinning these companies.
00:11:24
Speaker
We could have alternatives. We could have an online world that doesn't track us across the entire web, for example. We could imagine something better that we prefer. And on a personal level, you know, this isn't necessarily an Amnesty position, but when I go on social media...
00:11:40
Speaker
I'm seeing a lot of rubbish, certainly on some of these platforms. Well, I think the joke is that in the incentivization of the internet, it's like you open up your browser and it's basically the same five websites and all of the content on each is just content from the other four, right? Anyway. Yeah.
00:11:59
Speaker
Yeah. So, I mean, that's kind of the problem, and we deal with that in our world too. And, you know, I remember going from signals intelligence in the military to cyber intelligence, and the matter of circular reporting, where
00:12:14
Speaker
you know, finding an actual primary source of a lead versus the seven or eight blogs that just repost the same information over and over again. And when you start looking at citations of other reporting outlets, you see, wait, they're citing like three or four blogs that, yes, are different publications or different publishers, but still fundamentally the same information.
00:12:37
Speaker
And I think that's a form of numbing us as well, along with the slop that a lot of these AI outlets are now producing. It's funny you bring this all up, because I have a friend who works in creative writing, and she recently reached out to me about a movement that her friend is starting, advocating on behalf of human creative writers and fighting for the survival of the creative writing profession,

Big Tech as Public Utilities and Innovation Stifling

00:13:01
Speaker
which I would.
00:13:02
Speaker
George and I should probably dig into it in another episode. You've just inspired me to remember it. I'm unfortunately traveling on the road for work, and there are a million other things I'm doing, and there's just the stuff in our heads. That's the common battle of being in this economy today. That said, your point on the lobbying, I think, is really kind of a critical issue that we have to discuss with this, because
00:13:26
Speaker
you know, I don't want to be an idealist. And, we'll say, in other hats in my life, I've worked very closely with Amnesty on other issues that are very dear to me, because Amnesty does tend to represent the side of justice and good and the way that we should be approaching humanity.
00:13:43
Speaker
And on this issue, you know, you are right. You are fundamentally correct. I mean, even in the report, you describe these platforms as being so embedded in public life that they're essentially utilities.
00:13:56
Speaker
I mean, if we take that premise as true, do we need a new sort of digital social contract, something that protects our rights but also acknowledges that these platforms are now public squares, whether we like it or not? And that's one part of the question. The other part of the question is,
00:14:14
Speaker
Knowing what society is now, in the state of techno-feudalism that we have essentially entered, knowing that the billionaire class have now done everything imaginable to protect themselves, to fortify themselves, through lobbying. And we're seeing millions of people in America and in Europe protesting on behalf of their rights, of different democratic movements.
00:14:40
Speaker
And they're constantly being ignored because the lobbying dollars control everything. And it seems to me that we are no longer led by people who have any sort of ethics.
00:14:54
Speaker
How then do we, if we want to accept that these platforms are now essential public utilities that are key to having a modern life, how then do we achieve that recognition in a way that also affords us the rights that come with using a public utility service like any other, you know, using your running water, using your fire department, it's that level? And how then do we break the hold that the profiteering these organizations do for each other has on us? Because I feel that
00:15:30
Speaker
Your report paints a picture of the way things should be. And I just don't know how we overcome the delta of greed and profit that defines this entire generation that we are unfortunately in.
00:15:47
Speaker
Gosh, yeah, that's a difficult question. I'm not sure I've got... We only ask easy questions, Hannah, just softballs all day long. Welcome to the show, Hannah. Yeah.
00:16:00
Speaker
I think one area that I find really interesting is that these companies are not

Data Privacy and Reducing Dependency on Big Tech

00:16:14
Speaker
necessarily bringing innovation anymore.
00:16:17
Speaker
And we, I think, have become quite used to the status quo of these companies. This is how we access the internet. You know, when I go on to a social media platform or I'm using these services, I'm aware that I'm being tracked at a horrific level of detail. And I'm very uncomfortable with that, but that's my job. I know about this because it's my job. But I think a lot of us know about that. I mean, if you went back 50 years and you told people,
00:16:50
Speaker
you're going to be tracked, and this is the level of detail at which a company is going to be tracking you, they would be horrified. And that has become completely normalized. So we need to keep talking about these issues. We need to be visibilizing the fact that these companies are not creating
00:17:06
Speaker
innovative, interesting solutions. They are abusing our human rights. They are doing things that we are not comfortable with. We need to kind of take charge of the narrative, because they are winning that narrative. That's part of the lobbying power. It's not just access; they also have the narrative a lot of the time. So we need to be constantly questioning what they put in front of us. Yes, I use that language, saying they're almost public utilities, but in a way we have to be careful we're not feeding into the idea that these things are essential for our digital world. It could look different. The internet could look different. We don't have to have these companies. And then I think, on some of the solutions, I agree we're probably not going to fully get rid of these companies, but how do we open them up?
00:17:51
Speaker
So it's not just about potentially structural breakups, although I think that might be part of the solution with some of these companies, but also: what can we do so that people really have a choice? So, I have my friends and family, I don't know, maybe your kids' school organizes on Facebook groups or WhatsApp, or you have to have Instagram to access whatever. You know, we're locked into these services. So, how can we,
00:18:19
Speaker
how can we come up with solutions that mean I have a choice? I don't have to use Instagram. I can use another service that does respect my privacy, that doesn't amplify very harmful content, that, you know, I choose to use, but still access maybe the services that Instagram provides. And I pick Instagram just off the top of my head, of course.

Social Dynamics and Tech Narratives

00:18:43
Speaker
But I have to ask, though, because I get where you're going with this. Yeah.
00:18:48
Speaker
There is a mind virus, where if it was just a case of the working class against the rich, not controlling us and not controlling our media, we would have new media, which would provide us news, and we'd all be on new media platforms and we'd all be getting our info from Substack.
00:19:05
Speaker
But if you go on any social platform, even a new media platform, there are plenty of content creators who are supportive. Like, before, it was just the shameless fans of Elon Musk and all the things that he would do, and defending Elon, and blah, blah, blah. But now it's actually just people who look to these CEOs as if they are the new gods.
00:19:30
Speaker
And to speak ill of them is almost demagoguery.
00:19:35
Speaker
And how do we then break the social hold? Because that's where the crux of the report, I think, kind of gets to: that we have to change the way that we think about this, because we, as the people, you know, capital-P the People, have willfully accepted this condition.
00:19:58
Speaker
So I think before we can even elicit that kind of policy change, how do we, in your opinion, based on the amazing amount of research that you've done, change the narrative at the lower level, so that we can begin to unify the discussion, to actually bring together some form of power to counter the movement? Because at the end of the day, it's a game of power, don't you think? Yeah.
00:20:24
Speaker
I mean, I think part of the problem is that a lot of these harms aren't that visible to us. So as Amnesty International, as a human rights organization, we have a responsibility for visibilizing those harms and bringing that to our movement. I mean, Amnesty is a movement of 10 million people. That is quite a lot of people, but there's also a lot of people outside of that.
00:20:47
Speaker
I think a lot of these harms feel quite distant and not necessarily impactful in our day-to-day. And we need to be thinking about how does this impact my mum when she opens this app or a person who's just accessing you know their friend's Instagram in Ethiopia or whatever it is.
00:21:08
Speaker
Sometimes these harms can feel very far away. And I think we do have to think about how this impacts our day-to-day. which isn't always

Shifts in Tech Policy and Market Influence

00:21:17
Speaker
easy to do. No, but I do... I feel like the tide has not turned, but I do feel it is turning. I think worldwide, not just in the U.S., we have Australia, we have Canada, we have lots of new legislation coming on just to even limit contact between younger people and social media, for example, but also new laws coming online trying,
00:21:43
Speaker
I wouldn't say explicitly antitrust, but trying to reduce the influence and dependency. When I read reports like this, I do think about my friends who own small retail stores and basically rely on Instagram ads and Facebook ads as hyper-localized media, because there's no local newspaper really left anymore. Or, you know, radio ads are a thing that they do. But like,
00:22:12
Speaker
I don't think that they like being locked in, but it just also feels like a necessity to them. Like, no modern business can operate, you know, without being on social media. But I want to get to a more personal question. You have mentioned a few times now what happens when you go on social. I guess, how has writing this report changed your relationship? And not just with social. I mean, we single out social because it's easy, but I think,
00:22:40
Speaker
a lot of technology. You also call out the Google and Apple app store lock-in, these ecosystems that circumvent choice, things like that. So yeah, just, Hannah, curious: in your world, how are you metabolizing this thing that you put together?
00:23:00
Speaker
Yeah, it's such a good question. I think I obviously spend quite a lot of time with kind of technology and human rights people who have a lot of great alternatives to these platforms, to the services that these companies provide, which I do try and use. But I have to say,
00:23:21
Speaker
I think what this report really made me realize is, it's all well and good if those alternatives exist, but if they're not easily accessible to the majority of people, it doesn't really matter. You know, what I use on a day-to-day level is a personal choice, but I don't think we should have to think about our human rights

Surveillance Capitalism and Competition

00:23:42
Speaker
all the time. You know, I don't think many of us have time for that. Let's be honest. It's my job; I spend my life thinking about it. But if you don't, you shouldn't. You should be able to use an app or use the internet, and it's just a given that your rights are respected. I think that's the core of it for me. We shouldn't have to be thinking of our own alternatives, even though,
00:24:06
Speaker
on an individual level, yeah, absolutely, there are places you can go. But it's very difficult. I mean, if we go back to talking about tracking and privacy and the sort of extent of Google and Meta's advertising technology: even if you avoided their actual services, you didn't have a Facebook account, you didn't use Google search, you're not going to be able to avoid them having some kind of profile on you or on your IP address. It's just so vast across the internet, this kind of ad tech, that...
00:24:38
Speaker
I think even a very, very diligent person would not be able to avoid it. So I guess, in answer to your question, it's just part of a necessary evil of online life these days. I'm very aware of it. I'm very freaked out by it, but also...
00:24:54
Speaker
it's difficult to avoid, so you kind of have to get used to it and channel that into, I guess, this work and pushing back. But coming back, you made the point around local organizations and adverts and not having alternatives. I think that's really part of the problem that I maybe was less aware of before I wrote this report: just how much these companies are, you know, either buying up competition or implementing rules around using their platforms that prevent competitors from thriving. There's really quite a web of actions they've taken to lock down innovation or stop other actors rising up.
00:25:43
Speaker
Hey listeners, we hope you're enjoying the start of season four with our new angle of attack, looking outside just cyber to technology's broader human impacts. If there's a burning topic you think we should address, let us know.
00:25:56
Speaker
Is the AI hype really a bubble about to burst? What's with romance scams? Or maybe you're thinking about the impact on your kids or have questions about what the future job market looks like for them.
00:26:08
Speaker
Let us know what you'd like us to cover. Email us at contact at bareknucklespod.com. And now back to the interview.
00:26:19
Speaker
Cory Doctorow has written extensively. I think he has a new book also about, again, what he calls the enshittification of the internet, which he

Enshittification and Internet Diversity

00:26:27
Speaker
identifies as this, you know, you get people in the door for free.
00:26:30
Speaker
They love it. And then you slowly start, I think he calls it twiddling, you just start changing the business model, and they've essentially gotten locked in. So local newspapers, everyone was drunk on referral traffic from Facebook and Google. And then one day they turn off the spigot,
00:26:48
Speaker
because they want to capture the eyeballs instead. And then, I mean, advertising collapses. Yeah, I think you're right. I have been accessing the internet since 1993, which is crazy to say out loud.
00:27:05
Speaker
And my memories of the early internet are getting harder and harder to recall. Like, how freeing and weird it was. I mean, sure, there was like some scary stuff on there, but... How could you forget the sound of that dial-up modem? It wasn't even just the modem. It was just like you could just...
00:27:26
Speaker
Now, if I think really hard about it, just even the number of search engines. You would run one search through, like, Ask Jeeves, because it was better semantically at finding this one thing. And then this other thing through AltaVista, because it was way better at finding those illegal MP3s that you wanted to spend 20 minutes downloading. But there was just so much diversity. And it does feel like when I log on, it's like, I guess I'm going to the same five websites, you know, essentially.
00:27:55
Speaker
Yeah, it just felt so much freer. I'm not really going anywhere with the question, I just wanted to get your take, Hannah. I think it's good work, but it's also so hard to hold these two things at the same time, right? Because even we try to limit our time on social as much as we can, but you know, sometimes an interesting post comes through and we share it, and it's just kind of like a weird... I guess this is how we communicate now.
00:28:24
Speaker
Yeah, absolutely. And I mean, I think the thing that I really wanted to progress with this report was that big tech probably aren't part of the solution.

Showcasing Alternative Tech Models

00:28:41
Speaker
And I think for many years as civil society, we have been talking to big tech, trying to highlight these issues, asking them to change, and they haven't.
00:28:53
Speaker
And now we have to go beyond that. And I think that may sound like a small shift, but I think it's a huge shift in how human rights organisations are thinking about this.
00:29:07
Speaker
But again, regulation alone is not going to work. So I don't have all the solutions, but I think if we can shift that framing in terms of what we're asking for, which is a more diverse internet with options, with less control by fewer companies, that in itself is a step in the right direction, so that we can start to visualize what we actually want at the end rather than just talking about the problems.
00:29:39
Speaker
Yeah, I think there's also a lot of value in highlighting alternative models, too, which I'll get into in a second. But, George, go for it. Sorry. Yeah, because I was going to say, this actually segues perfectly into what I was going to ask you.
00:29:52
Speaker
And that's that the report is heavy on the harms of surveillance capitalism. And I think that's a massive, growing problem, and it's only going to get worse. I just read a report about the Deputy Secretary of Defense in the States now trying to push towards Wall Street essentially funding, you know, billions, if not trillions, of dollars of army futures, infrastructure development, which, you know,
00:30:19
Speaker
having the people who make profit off of war, financing the organization that does the war probably isn't going to go well you know for the sake of world peace.

Ethical Implications and Human Rights Advocacy

00:30:31
Speaker
And, you know, in our circles as well, especially in my world, I deal with a lot of investment bankers. I deal with a lot of venture capitalists. You know, that's a very common social circle I find myself in, as someone who's a little bit more socialist-leaning than most people.
00:30:49
Speaker
So take Palantir as an example, and the amount of people that I know who are not pro-surveillance-state, they're not necessarily pro-authoritarianism, but they love, love, love talking about how much money they've made off Palantir because they invested in it five years ago and watched their stock spike.
00:31:12
Speaker
And all they understand is the casino that is the stock market and their winning bet, not understanding the consequential implications of that bet winning.
00:31:29
Speaker
So many users, especially in rural areas, actually depend on the Googles and the Amazons for basic services, if we go back to that. How then is there a path to demonopolization that doesn't also decimate access for both lower-income and remote users?
00:31:50
Speaker
And at the same time, still retains the premise, the Western premise, of individuality, of freedom of choice, of not being controlled by the state.
00:32:04
Speaker
How do we combat this? Yeah, when you said you ask difficult questions, you definitely do. I mean, I think I've already said this, but I don't think that people are fully aware of the human rights implications of a lot of these technologies. And around Palantir, for example, I think there's still a lot of work to be done to bring out those stories of what they're actually doing.
00:32:35
Speaker
I think the way that we use and manipulate data for all sorts of purposes, in the name of efficiency, in ways that are really harmful for human rights, has become so normalized that we feel we almost have to catch up, because there's been so much change in the last few years.
00:32:59
Speaker
So I do think human rights organisations have a part to play in raising awareness around that.
00:33:08
Speaker
I'm sorry, I'm blanking on your second question. I jumped into Palantir then, but could you repeat it for me, George? We're trying to help save access for folks who are now dependent on these services. So there's two angles to this, right? You talk about surveillance capitalism.
00:33:25
Speaker
One is the dependency of lower-income or traditional working-class people, or people who live in rural areas, who rely on those services. Especially considering, you know, I'm in Canada and we're on the verge of losing our door-to-door postal service for all sorts of reasons, financial mismanagement, blah, blah, blah. But postal services were absolutely critical for keeping people engaged and connected, and banking by mail is actually a big thing in rural Canada.
00:33:55
Speaker
We might lose that. So that means they're going to be even more dependent on digital payment processors, which are hosted on Amazon, et cetera. Right? So how can we allow these services to still be in place, because the people living in these areas still need them, but then, by keeping these organizations in place and the ecosystems that they exist in,
00:34:20
Speaker
and the way that they manipulate and use the data from the people who rely on them to essentially control them. How do we then retain the ability for these people to stay connected to society while not continuing to invest in and reinforce the dragnet that we find encapsulating them more and more?
00:34:43
Speaker
And we see human rights and business as so separate. They've become very separate conversations. And my whole career, I've worked in corporate accountability and business and human rights.
00:34:58
Speaker
And bringing these two things together is to say: why is it that you're allowed to operate a company that does not consider its potential human rights impacts, whether that be in its supply chain or whether it's disrupting a service, as you mentioned, George, around the postal service,
00:35:21
Speaker
there's no obligation for them to even think about it, to even consider it. They can just put out a product. Actually, I was listening to your interview around stalkerware, and you were talking about this, right? Putting out a product...
00:35:34
Speaker
these tracking products without thinking at all about the potential impacts. So why? The first thing is we need to embed this at the beginning. Companies should be required to look at this in the same way that they're required to look at other types of impacts from the beginning, embedding human rights due diligence

AI Sector: Risks and Regulatory Needs

00:35:54
Speaker
all the way through. And obviously this doesn't address things that have already happened, but where are regulators anticipating and assessing some of these impacts? So one sector that I'm really worried about, I mean, I think we're all worried about, is this growth in AI, the generative AI sector, where obviously big tech are extremely embedded into the infrastructure and AI services.
00:36:23
Speaker
We're already seeing potential human rights risks around the right to water, for the water and energy needed for data centers. We're seeing supply chain impacts, with people like data labelers bringing cases against these companies for labor rights abuses. We're seeing discriminatory outcomes with AI. There are so many risks that are already apparent to us. But this sector is just rolling on, and we're letting it happen without any kind of regulation, without any kind of reining it in. Where there are moves, like in the EU, for example, with the AI Act, there are talks now of rolling that back in the name of competitiveness, in the name of simplification for companies.
00:37:12
Speaker
So I think we need to embed human rights into all of these decisions, whether that is companies having to look at their human rights impacts as they roll out a new product or service, or whether that is competition regulators looking at a merger or acquisition and asking, OK, but how could this impact human rights? You know, if we're not embedding this from the beginning, these impacts, I think,
00:37:35
Speaker
are, if not inevitable, at very high risk of occurring. So I think we need to be having that conversation about how we embed human rights throughout business operations. To me, they're not separate conversations, but I think these two different groups are often not talking to one another.
00:37:56
Speaker
Yeah, and I also think part of this work is reframing it as a human rights issue. And this is not necessarily a question, but I think we can close out here. Part of the work, whether it's us, whether it's Amnesty, or other organizations that are concerned, is also highlighting the alternative

Community-Driven AI Development

00:38:15
Speaker
models. Because I think there's a lot of work being done in small corners, and by design they're not these huge, bombastic programs, but that also means they're not getting enough attention. And I think we take for granted, especially my worries around
00:38:33
Speaker
the way generative AI is developing, that it's like the previous 10 years of social media, but on steroids. You take this shitty thing and you just double down on it, versus asking what are some other ways of doing it. And I had the great honor and privilege of seeing Karen Hao speak
00:38:53
Speaker
very recently about her book, Empire of AI. She highlights in the book, but for the benefit of our listeners, this project in New Zealand where a local radio station, which has been broadcasting in the native Māori language, has all this archival information.
00:39:11
Speaker
Audio, right? And so what they did was they went to the community and they said, we would like to develop tools using this data as a means of preserving this language, and also actually beginning to reclaim it, to teach it back to the younger generation.
00:39:27
Speaker
And so, as a community, they developed consensus: yes, we approve of this use of these archives; we will develop transcripts. They used NVIDIA's NeMo toolkit, which is open source, and just a few A100 GPUs. And they're building out a speech recognition model for that specific application.
00:39:48
Speaker
By the community, with that community's consent, for that community. And it's not built on a for-profit model, which is one way of doing things, but it is using AI in a way that really helps.
00:40:03
Speaker
And I just think if we try to highlight more examples like that, it gives people something else to latch onto. And to your point, I think the bombshell, spoiler alert,
00:40:15
Speaker
is not that big tech is bad, it's that they aren't innovating. We just got stuck in this mire, and they're these large behemoth organizations that aren't really incentivized to innovate as rapidly as they once did. And I would encourage future founders and people who want to use tech in a good way to start looking for different ways of doing it, because I don't think we should take it for granted that this is the only way

Envisioning a Diverse Internet Future

00:40:40
Speaker
you do it. You just get billions of dollars and you scorch the earth and you...
00:40:44
Speaker
do whatever you want. I just think, quote unquote, scale is not always the most profitable or best way of doing things. Yeah, absolutely. And I want to get excited again about using the internet. You know, you were talking about the 90s and all the exciting things you could do and find and explore, and I think we've lost some of that. So yeah, what's that vision?

Conclusion and Gratitude

00:41:07
Speaker
Like, what do we want? What could exist? Or what does exist already that's better than what we have? Yeah, I would love to think more about that. I think there are so many opportunities and possibilities, but we've got quite locked into a very set way of how things work, and it doesn't have to be that way.
00:41:25
Speaker
Absolutely. Well, Hannah, we'll end things there. I want to thank you so much. I know we had to do a little time zone tango. What are we doing, like three different time zones right now, because George is overseas? But yes, thank you for making it work. We really appreciate you taking the time.
00:41:41
Speaker
And I appreciate you handling everything I threw at you. I think you're absolutely brilliant. I hope we can have you on again. I will absolutely be a follower of your work. Please let us know when you put out more articles,
00:41:55
Speaker
more reports. This show is a fan of yours. So thanks, Hannah. Yeah, thanks so much for having me on. I'll definitely keep in touch. And yeah, I really appreciate you engaging in this. It's just great to see people all over who are interested in these topics and want to discuss them. I did think when I put this out that it might be a bit policy-wonky, that only those people would be interested, so it's really great to have this. Yeah, a peek behind the curtain: I think George sent me something.
00:42:27
Speaker
And I was like, you know what? Fuck it. I'm going to email Amnesty and see if we can get them on. And here we are. So I really, really appreciate it. I have been a fan of the organization for a long time, ever since high school. So thank you again for coming on the show.
00:42:43
Speaker
Yeah, thank you. Bye.
00:42:47
Speaker
If you like this conversation, share it with friends and subscribe wherever you get your podcasts for a weekly ballistic payload of snark, insights, and laughs. New episodes of Bare Knuckles and Brass Tacks drop every Monday.
00:43:00
Speaker
If you're already subscribed, thank you for your support and your swagger. Please consider leaving a rating or a review. that helps others find the show. We'll catch you next week, but until then, stay real.
00:43:15
Speaker
Well, I appreciate the work that you put in. I remember the one page where, I think, more of the page was footnotes than text. Yeah, I made the error of not doing my footnotes until the end. I'd left placeholders, and oh my gosh, that was horrific.