
Confronting Big Tech’s Abuses as a Question of Human Rights

S4 E13 · Bare Knuckles and Brass Tacks

Hannah Storey, Advocacy and Policy Advisor at Amnesty International, joins the show to talk about her new brief that reframes Big Tech monopolies as a human rights crisis, not just a market competition problem.

This isn't about consumer choice or antitrust law. It's about how concentrated market power violates fundamental rights—freedom of expression, privacy, and the right to hold views without interference or manipulation.

Can you make a human rights case against Big Tech? Why civil society needed to stop asking these companies to fix themselves and start demanding structural change. What happens when regulation alone won't work because the companies have massive influence over the regulators?

Is Big Tech actually innovating anymore? Or are they just buying up competition and locking down alternatives? Does scale drive progress, or does it strangle it?

What would real accountability look like? Should companies be required to embed human rights due diligence into product development from the beginning?

Are we making the same mistakes with AI? Why is generative AI rolling forward without anyone asking about water usage for data centers, labor exploitation of data labelers, or discriminatory outcomes?

The goal isn't tweaking the current system—it's building a more diverse internet with actual options and less control by fewer companies.

If you've been tracking Big Tech issues in silos—privacy here, misinformation there, market dominance over here—this episode is an attempt to bring those conversations together in one framework.

Mentioned:

Read more about the Amnesty International report and download the full report here: “Breaking Up with Big Tech: a Human Rights-Based Argument for Tackling Big Tech’s Market Power”

Speech AI model helps preserve indigenous languages

Empire of AI, by Karen Hao

Cory Doctorow’s new book, "Enshittification: Why Everything Suddenly Got Worse and What To Do About It"

Transcript

Introduction of Hannah Storey and Her Role

00:00:00
Speaker
Gosh, yeah, that's a difficult question. I'm not sure I've got... We only ask easy questions, Hannah, just softballs all day long. Welcome to the show, Hannah.
00:00:16
Speaker
I think one area that I find really interesting is that these companies... are not necessarily bringing innovation anymore.
00:00:29
Speaker
And we, I think, have become quite used to the status quo of these companies. This is how we access the internet. You know, when I go on to a social media platform,
00:00:46
Speaker
or I'm using these services, I'm aware that I'm being tracked at a horrific level of detail. And I'm very uncomfortable with that, but that's my job. I know about this because it's my job. But I think a lot of us know about that. I mean, if you went back 50 years and you told people,
00:01:03
Speaker
"this is the level of detail that a company is going to be tracking you in," they would be horrified. And that has become completely normalized. So we need to keep talking about these issues. We need to be visibilizing the fact that these companies are not creating innovative, interesting solutions. They are abusing our human rights. They are doing things that we are not comfortable with. We need to take charge of the narrative.
00:01:36
Speaker
Welcome back. It's Bare Knuckles and Brass Tacks. This is the tech podcast about humans. I'm George Kamide. I am George Al-Koura. And today we have Hannah Storey, who is an advocacy and policy advisor on technology and human rights with Amnesty International.
00:01:54
Speaker
Y'all, I feel like we hit the big time. So George sent me

The 'Breaking Up with Big Tech' Report

00:01:58
Speaker
this report. I just straight-up cold-emailed Amnesty's press office. The report is Breaking Up with Big Tech: A Human Rights-Based Argument for Tackling Big Tech's Market Power.
00:02:10
Speaker
And it turns out Hannah Storey is the author of that report. It's a unique take on some of the issues we've addressed on the show, which are normally examined through a legal lens, an antitrust lens, a commercial lens. But this is the first time we'd seen them examined explicitly through a human rights law lens. And we just wanted to have her on the show because we're like, let's dig into that argument.
00:02:33
Speaker
And Hannah delivered, for sure. Yeah. And you know what, part of the theme this entire season has been tackling the actual big issues. And this definitely was it.
00:02:44
Speaker
I wish we could have had more time. I wish we weren't struggling through three time zones and hotel Wi-Fi and all the things. But that said, I think she really delivered. The report is outstanding. I think we got into the nitty-gritty of some really big, critical questions, each of which alone is probably worth an entire episode, or a series of episodes, to break down.
00:03:09
Speaker
She was brilliant enough, and so gracious, to still give us her summary responses in a minute or two for topics that multiple books could be written on, and that she's done massive amounts of research for. So I really do think this is for anyone who has any sort of interest in figuring out what is happening in society right now, what the relationship is between Big Tech and normal people,
00:03:38
Speaker
what we are going to do to actually fight for our rights and save any semblance of the society that we thought we knew, and how we fight back against enshittification and the techno-feudalism taking over.
00:03:52
Speaker
I mean, we really went into it. I'm not saying we went, like, full Che Guevara, but we had the kind of conversation where, hey, things aren't in a good place right now.
00:04:05
Speaker
Can we start calling it out? Yes. A thousand percent. Yeah, a thousand percent. This was that show. Yeah. All right. Well, you know, we only ask easy questions, so let's turn it over

Human Rights Perspective on Tech Market Power

00:04:16
Speaker
to Hannah Storey.
00:04:23
Speaker
Hannah Storey, welcome to the show. Hi, thanks so much for having me. We're really excited to have you here, just because, one, the report is exhaustive, and two, it makes a very novel argument, one that we have not seen.
00:04:40
Speaker
We both work in tech and have done so for a long time. We're very familiar with a lot of the legal arguments, especially antitrust arguments, for example. But this is the first time that we'd seen resistance to Big Tech framed through a human rights lens. So I just want to start by giving you the floor to explain: why did you decide to take this approach, or how did the idea come up that there was enough of an argument here to base it on human rights? And then we will dig in a little further.
00:05:16
Speaker
Yeah, thanks so much. I mean, you've really got to the nut of it already, I think. We were very interested in this market power question, but every time I was reading analysis around this, it was really coming from a fair-competition perspective, or from a consumer rights perspective, or looking at the impact on small and medium-sized enterprises.
00:05:45
Speaker
And some were looking at human rights, but not wholly. So as a human rights organization, it was very difficult for us to get involved in the conversation. We couldn't really talk about these other perspectives because that's not our place in this conversation.
00:06:02
Speaker
So we knew that we needed a wholly human rights perspective. And to be honest, when we started out, we didn't start out with the conclusion. It wasn't "we know that we can make this argument." It was "can we make this argument?"
00:06:16
Speaker
So we started looking at different human rights impacts that big tech companies have and started to think about, well, what's the market power component of this? Is there an issue with scale? Is there an issue with control? Is there an issue with...
00:06:36
Speaker
a lack of options around different technologies. So the genesis of this report was: can we make that human rights argument? Which I'm pleased to say we could. We managed to publish the report this year.
00:06:50
Speaker
But it was that need for a wholly human rights perspective that we weren't necessarily seeing coming from other organizations. Yeah, I really appreciate that. When I went through the report, I felt like it pulled together a lot of things that we had seen in individual use cases, right? We knew about the issues with Facebook in Myanmar. We knew about the issues with Tigray in Ethiopia. But those were always viewed as, like, a media problem.
00:07:27
Speaker
And then we also have terms and conditions. We have, is it regulatory capture? Is it consumer lock-in? And we have those as commercial issues. I think one of the things I learned in the report, something that really stood out to me while I thought I was familiar with the Universal Declaration of Human Rights,
00:07:50
Speaker
is the level of detail: that there is a right to hold views without interference or manipulation. Right. And what was drafted back then, probably seen through the lens of the press and authoritarian propaganda machines, is something that can be applied to
00:08:12
Speaker
the media landscape as we see it today, which we've talked about extensively on the show. Like, the problem with these algorithmically led media is that it's just bubble after bubble after bubble. But I guess we're going to drill in here. We'll start general and then move. So what would be your ideal outcome here? Is it to drive a policy discussion? Is it also to drive awareness at the individual user level? Where do you see the next step in terms of action taking place?
00:08:50
Speaker
Yeah, that's a great question.

Limitations of Tech Regulation

00:08:51
Speaker
I mean, this is quite a
00:08:54
Speaker
policy-wonky document, so I'm not necessarily expecting a lot of engagement with this kind of policy piece on an individual level. So there are two things that I think are key here. One is that we need to start a conversation at a policymaker level around going beyond regulation.
00:09:18
Speaker
So regulation of big tech, I think, is a really key part of the solution. We need to be thinking about
00:09:25
Speaker
you know, it's been a wild west up until now. They've been able to do much of what they want, and we need to start reining that in. But what we found in this research was that even if you start regulating them, they have so much power in terms of regulatory influence. I can't say that they have definitely influenced regulation in the past, but I can say they do have a lot of influence.
00:09:49
Speaker
So regulation alone- Oh, it's okay. It's okay, Hannah. I will go ahead and say they have a metric fuckton of influence. If anyone has ever seen Facebook's DC headquarters on K Street, it's very clear why that exists.
00:10:02
Speaker
Yeah. I mean, I've not seen it, but I've heard. Let's just say the floor plans are a little bit bigger than civil society organizations'. Yeah.
00:10:14
Speaker
So regulation is one component, but we're seeing a situation now in the EU, for example, where we have regulation of big tech and the implementation is not up to standard. These companies have to do risk assessments, look at their potential and actual human rights impacts, and mitigate them.
00:10:33
Speaker
I've read some of these risk assessments. They're not sufficient. They're not doing enough to really address the issue. And the EU needs to be making sure that they're implementing that well. And unfortunately,
00:10:45
Speaker
We're not necessarily seeing that. So we need part of the solution to be thinking about the structural issues. We cannot just think about regulation. So at a state level, at a policymaker level, we need to start thinking about what already exists in terms of maybe it's competition or antitrust regulation that we could be utilizing to start addressing some of these human rights issues.
00:11:10
Speaker
And so on a practical level, what does that mean? Well, competition regulators don't know about human rights; maybe on a personal level some do, but generally they don't. Human rights experts don't necessarily know about competition and antitrust.
00:11:26
Speaker
Data protection regulators... you know, everyone's very siloed. We need to start bringing these constituencies together, so that when you're making decisions around competition, you're thinking about the human rights impact, so that you have that level of knowledge.
00:11:41
Speaker
When a merger or acquisition is in front of a regulator, they're not necessarily thinking about how it might impact our rights. That is a way we could move: bringing those different constituencies together. And then on an individual level, I think it's calling this out. We've all seen, you know, Trump's inauguration, the pictures with all those CEOs stood right there. I think it's right in front of us that the power of these companies is very significant. Absolutely.
00:12:15
Speaker
And starting to question that, and starting to question some of the business models that are underpinning these companies. We could have alternatives. We could have an online world that doesn't track us across the entire web, for example. We could imagine something better that we prefer.

Should Tech Platforms Be Public Utilities?

00:12:31
Speaker
And on a personal level, you know, this isn't necessarily an Amnesty position, but when I go on social media... I'm seeing a lot of rubbish, certainly on some of these platforms. Well, yes, I think the joke is that in the enshittification of the internet, you open up your browser and it's basically the same five websites, and all of the content on each is just content from the other four, right? Anyway. Yeah.
00:12:55
Speaker
Over to you, George. Yeah. So, I mean, that's kind of the problem, and we deal with that in our world, too. And, you know, I remember going from signals intelligence in the military to cyber intelligence, and the matter of circular reporting, where
00:13:12
Speaker
you're finding an actual primary source for a lead versus the seven or eight blogs that just repost the same information over and over again. And when you start looking at citations of other reporting outlets, you see, wait, they're citing, like, three or four blogs. Yes, they're different publications or different publishers, but it's still fundamentally the same information.
00:13:35
Speaker
And I think that's a form of numbing us as well, along with the slop that a lot of these AI outlets are now producing. It's funny you bring this all up, because I have a friend who works in creative writing.
00:13:47
Speaker
And she recently reached out to me about a movement that her friend is starting, advocating on behalf of human creative writers and fighting for the survival of the creative writing profession, which...
00:14:00
Speaker
George and I should probably dig into in another episode. You've just reminded me of it. I'm unfortunately traveling on the road for work, and there are a million other things I'm doing, and there's just the stuff in our heads. This is, like, the common battle of being in this economy today.
00:14:14
Speaker
That said, your point on the lobbying, I think, is really a critical issue that we have to discuss here, because, you know, I don't want to be an idealist. And I'll say, in other hats of my life, I've worked very closely with Amnesty on other issues that are very dear to me, because Amnesty does tend to represent the side of justice and good, and the way that we should be approaching humanity.
00:14:41
Speaker
And on this issue, you know, you are right. You are fundamentally correct. I mean, even in the report, you describe these platforms as being so embedded in public life that they're essentially utilities.
00:14:53
Speaker
I mean, if we take that premise as true, do we need a new sort of digital social contract, something that protects our rights but also acknowledges that these platforms are now public squares, whether we like it or not?
00:15:08
Speaker
And that's one part of the question. The other part of the question is: knowing what society is now, in the state of techno-feudalism that we have essentially entered, knowing that the billionaire class has done everything imaginable to protect themselves, to fortify themselves through lobbying.
00:15:28
Speaker
And we're seeing millions of people in America and in Europe protesting on behalf of their rights, of different democratic movements. And they're constantly being ignored, because the lobbying dollars control everything. And it seems to me that we are no longer led by people who have any sort of ethics.
00:15:51
Speaker
How then, if we want to accept that these platforms are now essential public utilities, key to modern life, how then do we achieve that recognition in a way that also affords us the rights that come with using a public utility service like any other, like, you know, your running water, your fire department?
00:16:15
Speaker
It's that level. And how then do we break the hold that the profiteering these organizations do for each other has on us? Because I feel that your report paints a picture of the way things should be.
00:16:31
Speaker
And I just don't know how we overcome the delta of greed and profit that defines this entire generation that we are unfortunately in.
00:16:45
Speaker
Gosh, yeah, that's a difficult question. I'm not sure I've got... We only ask easy questions, Hannah, just softballs all day long. Welcome to
00:16:57
Speaker
the show, Hannah.

Normalizing Privacy Invasion by Tech Companies

00:17:01
Speaker
I think one area that I find really interesting is that these companies are not necessarily bringing innovation anymore.
00:17:14
Speaker
And we, I think, have become quite used to the status quo of these companies. This is how we access the internet. You know, when I go on to a social media platform or I'm using these services, I'm aware that I'm being tracked at a horrific level of detail, and I'm very uncomfortable with that, but that's my job. I know about this because it's my job. But I think a lot of us know about that. I mean, if you went back 50 years and you told people,
00:17:47
Speaker
"this is the level of detail that a company is going to be tracking you in," they would be horrified. And that has become completely normalized. So we need to keep talking about these issues. We need to be visibilizing the fact that these companies are not creating innovative, interesting solutions. They are abusing our human rights. They are doing things that we are not comfortable with. We need to take charge of the narrative, because they are winning that narrative. That's part of the lobbying power. It's not just access; they also have the narrative a lot of the time.
00:18:20
Speaker
So we need to be constantly questioning what they put in front of us. Yes, I use that language saying they're almost public utilities, but in a way we have to be careful we're not feeding into the idea that these things are essential for our digital world. It could look different. The internet could look different. We don't have to have these companies.
00:18:40
Speaker
And then I think some of the solutions... I agree we're probably not going to fully get rid of these companies, but how do we open them up? So it's not just about potential structural breakups, although I think that might be part of the solution with some of these companies, but also: what can we do so that people really have a choice?
00:19:02
Speaker
So if my friends and family... I don't know, maybe your kids' school organizes on Facebook groups or WhatsApp, or you have to have Instagram to access whatever. You know, we're locked into these services. How can we,
00:19:16
Speaker
how can we come up with solutions that mean I have a choice? I don't have to use Instagram. I can use another service that does respect my privacy, that doesn't amplify very harmful content, that, you know, I choose to use, but still access maybe the services that Instagram provides. And I pick Instagram just off the top of my head, of course. But I have to ask, though, because I get where you're going with this. Yeah.
00:19:46
Speaker
There is a mind virus. If it were just a case of the working class against the rich not controlling us and not controlling our media, we would have new media, which would provide us news, and we'd all be on new media platforms, and we'd all be getting our info from Substack.
00:20:03
Speaker
But if you go on any social platform, even a new media platform, there are plenty of content creators who are supportive. Before, it was just the shameless fans of Elon Musk, defending Elon and all the things that he would do, and blah, blah, blah.
00:20:19
Speaker
But now it's actually just people who look to these CEOs as if they are the new gods. And to speak ill of them is almost blasphemy.
00:20:32
Speaker
And how do we then break the social hold? Because that's where the crux of the report kind of gets to: we have to change the way that we think about this, because we, the people, you know, capital-P the People, have willfully accepted this condition.
00:20:56
Speaker
So before we can even elicit that kind of policy change, how do we, in your opinion, based on the amazing amount of research you've done, change the narrative at the lower level, so that we can begin to unify the discussion and actually bring together some form of power to counter the movement? Because at the end of the day, it's a game of power, don't you think?
00:21:21
Speaker
I mean, I think part of the problem is that a lot of these harms aren't that visible to us. So at Amnesty International, as a human rights organization, we have a responsibility for visibilizing those harms and bringing that to

Invisible Tech-Related Harms and Amnesty's Role

00:21:37
Speaker
our movement. I mean, Amnesty is a movement of 10 million people. That is quite a lot of people, but there are also a lot of people outside of that.
00:21:45
Speaker
I think a lot of these harms feel quite distant and not necessarily impactful in our day-to-day. And we need to be thinking about how this impacts my mum when she opens this app, or a person who's just accessing, you know, their friend's Instagram in Ethiopia, or whatever it is.
00:22:05
Speaker
Sometimes these harms can feel very far away. And I think we do have to think about how this impacts our day-to-day, which isn't always easy to do. No, but I feel like the tide has not turned, but it is turning. Worldwide, not just in the US, we have Australia, we have Canada, we have lots of new legislation coming on just to limit contact between younger people and social media, for example, but also new laws coming online trying,
00:22:41
Speaker
I wouldn't say explicitly antitrust, but trying to reduce the influence and dependency. When I read reports like this, I think about my friends who own small retail stores and basically rely on Instagram ads and Facebook ads as hyper-localized media, because there's no local newspaper really left anymore. Or, you know, radio ads are a thing that they do.
00:23:06
Speaker
But, like, I don't think they like being locked in; it just also feels like a necessity to them. Like no modern business can operate, you know, without being on social media.
00:23:20
Speaker
But I want to get to a more personal question. You have mentioned a few times now what happens when you go on social. How has writing this report changed your relationship with it? I single out social because it's easy, but you also call out a lot of technology, the Google and Apple app store lock-in, these ecosystems that circumvent choice, things like that. So, Hannah, curiously, in your world, how are you metabolizing this thing that you put together? Yeah, it's such a good question.
00:24:00
Speaker
I obviously spend quite a lot of time with technology and human rights people who have a lot of great alternatives to these platforms, to the services that these companies provide, which I do try to use. But I have to say,
00:24:19
Speaker
I think what this report really made me realize is it's all well and good if those alternatives exist, but if they're not easily accessible to the majority of people, it doesn't really matter. You know, what I use on a day-to-day level is a personal choice, but I don't think we should have to think about our human rights all the time. You know, I don't think...
00:24:41
Speaker
I don't think many of us have time for that, let's be honest. You know, it's my job, so I spend my life thinking about it. But if you don't, you shouldn't have to; you should be able to use an app or use the internet, and it's just a given that your rights are respected.
00:24:56
Speaker
I think that's the core of it for me. We shouldn't have to be thinking of our own alternatives, even though, on an individual level, yeah, absolutely, there are places you can go; but it's very difficult. I mean, if we go back to talking about tracking and privacy and the sheer extent of Google and Meta's advertising technology,
00:25:19
Speaker
even if you avoided their actual services, you didn't have a Facebook account, you didn't use Google search, you're not going to be able to avoid them having some kind of profile on you or on your IP address. This kind of ad tech is just so vast across the internet that I think even a very, very diligent person would not be able to avoid it. So I guess, in answer to your question, it's just part of a necessary evil of online life these days. I'm very aware of it. I'm very freaked out by it, but also
00:25:51
Speaker
it's difficult to avoid, so you kind of have to get used to it and channel that into, I guess, this work and pushing back. But coming back, you made the point around local organizations and adverts and not having alternatives. I think that's really part of the problem that I was maybe less aware of before I wrote this report: just how much these companies are, you know, either buying up competition or implementing rules around using their platforms that prevent competitors from thriving.
00:26:28
Speaker
There's really quite a web of actions they've taken to lock down innovation or prevent other actors from rising up.
00:26:40
Speaker
Hey listeners, we hope you're enjoying the start of Season 4 with our new angle of attack, looking outside just cyber to technology's broader human impacts. If there's a burning topic you think we should address, let us know.
00:26:53
Speaker
Is the AI hype really a bubble about to burst? What's with romance scams? Or maybe you're thinking about the impact on your kids or have questions about what the future job market looks like for them.
00:27:05
Speaker
Let us know what you'd like us to cover. Email us at contact at bareknucklespod.com. And now back to the interview.
00:27:17
Speaker
Cory Doctorow has written extensively, and I think he has a new book about, again, what he calls the enshittification of the internet, which he identifies as this: you know, you get people in the door for free.
00:27:28
Speaker
They love it. And then you slowly start, I think he calls it twiddling: you just start changing the business model, and they've essentially gotten locked in. So local newspapers, everyone was
00:27:38
Speaker
drunk on referral traffic from Facebook and Google, and then one day they turn off the spigot because they want to capture the eyeballs instead. And yeah, then advertising collapses. I think you're right. I have been accessing the internet since 1993, which is crazy to say out loud.
00:28:03
Speaker
And my memories of the early internet are getting harder and harder to recall, like how freeing and weird it was. I mean, sure, there was some scary stuff on there, but... Who could forget the sound of that dial-up modem?
00:28:20
Speaker
It wasn't even just the modem. It was just, like, you could just... Now that I think really hard about it, even the number of search engines: you would run one search through Ask Jeeves because it was better semantically at finding this one thing, and then this other thing through AltaVista because it was way better at finding those illegal MP3s that you wanted to spend 20 minutes downloading.
00:28:40
Speaker
But yeah. There was just so much diversity. And it does feel like when I log on, I guess I'm going to the same five websites, essentially.
00:28:52
Speaker
Yeah, it just felt so much freer. I'm not really going anywhere with the question; I just wanted to get your take, Hannah. I think it's good work, but it's also so hard to hold these two things at the same time.
00:29:06
Speaker
Right. Because we try to limit our time on social as much as we can, but, you know, sometimes an interesting post comes through and we share it, and it's just kind of weird. I guess this is how we communicate now.
00:29:22
Speaker
Yeah, absolutely. And I mean, I think the thing that I really wanted to progress with this report was that big tech probably aren't part of the solution.
00:29:38
Speaker
And I think for many years as civil society, we have been talking to big tech, trying to highlight these issues, asking them to change. And they haven't.
00:29:50
Speaker
And now we have to go beyond that. And I think that may sound like a small shift, but I think it's a huge shift in how human rights organisations are thinking about this.
00:30:05
Speaker
But again, regulation alone is not going to work. So I don't have all the solutions, but I think we can shift that framing in terms of what we're asking for, which is a more diverse
00:30:19
Speaker
internet, with options, with less control by fewer companies. That in itself is a step in the right direction, so that we can start to visualize what we actually want at the end rather than just talking about the problems. Yeah, I think there's also a lot of value in highlighting alternative models, too, which I'll get into in a second. But George, go for it.
00:30:45
Speaker
Sorry. Yeah, because this actually segues perfectly into what I was going to ask you. The report is heavy on the harms of surveillance capitalism, and I think that's a massive,
00:30:56
Speaker
growing problem, and it's only going to get worse. I just read a report about the Deputy Secretary of Defense in the States now trying to push toward Wall Street essentially funding, you know, billions, if not trillions, of dollars of army futures and infrastructure development, which, you know,
00:31:17
Speaker
having the people who profit off of war financing the organization that does the war probably isn't going to go well. So here's hoping, for the sake of world peace.
00:31:28
Speaker
And, you know, in our circles as well, especially in my world, I deal with a lot of investment bankers. I deal with a lot of venture capitalists. That's a very common social circle I oddly find myself in, as someone who's a little bit more socialist-leaning than most people.
00:31:47
Speaker
So I think we can look at Palantir as an example. The amount of people I know who are not pro-surveillance state, they're not necessarily pro-authoritarianism, but they love, love, love talking about how much money they've made off Palantir, because they invested in it five years ago and watched the stock spike.
00:32:10
Speaker
And all they understand is the commonality of the casino that is the stock market, and their winning bet, not understanding the consequential implications of that bet winning.
00:32:26
Speaker
So many users, especially in rural areas, actually depend on the Googles and the Amazons for basic services, if we go back to that. How then is there a path to demonopolization that doesn't also decimate access for both lower-income and remote users,
00:32:47
Speaker
and at the same time still retains the premise, the Western premise, of freedom of individuality, freedom of choice, of not being controlled by the state?
00:33:02
Speaker
How do we combat this?
00:33:05
Speaker
Yeah, when you said you ask difficult questions, you definitely do. I mean, I think I've already said this, but I don't think people are fully aware of the human rights implications of a lot of these technologies. And around Palantir, for example, I think there's still a lot of work to be done to bring out those stories of what they're actually doing.
00:33:31
Speaker
I think the way that we're using and manipulating data for all sorts of purposes in the name of efficiency, in ways that are really harmful for human rights, has become so normalized that we feel we almost have to catch up, because there's been so much change in the last few years.
00:33:57
Speaker
So I do think human rights organisations have a part to play in raising awareness around that.

Embedding Human Rights in Business Operations

00:34:06
Speaker
And sorry, I'm blanking on your second question. I jumped into Palantir then, but could you repeat it for me, George? We're trying to help save access for folks who are now dependent on these services. There are two angles to this, right? When you talk about surveillance capitalism, one is the dependency of lower-income or traditional working-class people, or people who live in rural areas, who rely on those services. Especially considering, you know, I'm in Canada, and we're on the verge of losing our door-to-door postal service for all sorts of reasons, financial mismanagement, blah, blah, blah. But postal services were absolutely critical for keeping people actually engaged and connected. And banking by mail is actually a big thing in rural Canada.
00:34:53
Speaker
We might lose that. So that means they're going to be even more dependent on digital payment processors, which are hosted on Amazon, et cetera. Right? So how can we allow these services to still be in place, because the people living in these areas still need them,
00:35:12
Speaker
but then, by keeping these organizations in place, we keep the ecosystems they exist in and the way they manipulate and use the data from the people who rely on them to essentially control them.
00:35:25
Speaker
How do we then, you know, retain the ability for these people to still be connected to society, while not continuing to invest in and reinforce the dragnet in which we find ourselves more and more encapsulated?
00:35:41
Speaker
And we see human rights and business as so separate. They've become very separate conversations. And I've worked my whole career in corporate accountability and business and human rights.
00:35:55
Speaker
And bringing together these two things to say: why is it that you're allowed to operate a company that does not consider its potential human rights impacts, whether that be in its supply chain or whether it's disrupting a service, as you mentioned, George, around the postal service?
00:36:18
Speaker
There's no obligation for them to even think about it, to even consider it. They can put out a product. Actually, I was listening to your interview around stalkerware, and you were talking about this, right? Putting out a product, these tracking products, without thinking at all about the potential impacts. So the first thing is, we need to embed this at the beginning. Companies should be required to look at this in the same way that they're required to look at other types of impacts of their companies, from the beginning: embedded human rights due diligence all the way through.
00:36:53
Speaker
And obviously that doesn't address things that have already happened, but where are regulators thinking about, anticipating, and assessing some of these impacts? So one sector that I'm really worried about, I mean, I think we're all worried about, is this growth in AI, the generative AI sector, where obviously big tech are extremely embedded in the infrastructure and AI services.
00:37:21
Speaker
We're already seeing potential human rights risks around the right to water, for the water and energy needed for data centers. We're seeing supply chain impacts, with data labelers bringing cases against these companies for labor rights abuses.
00:37:43
Speaker
We're seeing discriminatory outcomes with AI. There are so many risks that are already apparent to us. But this sector is just rolling on, and we're letting it happen without any kind of regulation, without any kind of reining it back in. Where there are moves, like in the EU, for example, with the AI Act, there are talks now of rolling that back in the name of competitiveness, in the name of simplification for companies.
00:38:10
Speaker
So I think we need to embed human rights into all of these decisions, whether that is companies having to look at their human rights impacts as they roll out a new product or service, or competition regulators looking at a merger or acquisition and saying, OK, but how could this impact our human rights?
00:38:29
Speaker
You know, if we're not embedding this from the beginning, these impacts are, I think, if not inevitable, then at very high risk of occurring. So I think we need to be having that conversation about how we embed human rights throughout business operations. To me, they're not separate conversations, but I think these two different groups are often not talking to one another.
00:38:53
Speaker
Yeah. And I also think part of this work is the reframing as a human rights issue. I also think, and this is not necessarily a question, but I think we can close out here, part of the work, whether it's us, whether it's Amnesty or other organizations that are concerned, is also highlighting the alternative models. Because there's a lot of work being done in small corners, and by design they're not these huge bombastic programs, but that also means they're not getting enough attention. And I think we take for granted, and my worry especially around the way generative AI is developing is that it's
00:39:35
Speaker
like the previous 10 years of social media, but like on steroids.

Alternative Tech Models and Community Consent

00:39:39
Speaker
And so you take this shitty thing and you just double down on it, versus: what are some other ways of doing it? And I had the great honor and privilege of seeing Karen Hao speak very recently on her book, Empire of AI. And she highlights in the book, for the benefit of our listeners, this project in New Zealand where a local radio station, which has been broadcasting in the native
00:40:03
Speaker
Maori language, has all this archival audio, right? And so what they did was go to the community and say, we would like to develop tools using this data as a means of preserving this language and also actually beginning to reclaim it, like teach it back to the younger generation.
00:40:25
Speaker
And so as a community, they developed consensus: yes, we approve of this use of these archives, we will develop transcripts. They used NVIDIA's NeMo framework, which is open source, and just a few A100 GPUs, and they're building out a language recognition model for that specific community, with that community's consent, for that community. And it's not built on a for-profit model, which is one way of doing things, but it is using AI in a way that really helps.
00:41:01
Speaker
And I just think if we try to highlight more examples of that, it gives people something else to latch onto. And to your point, I think the bombshell, spoiler alert, is not that big tech is bad; it's that they aren't innovating.
00:41:16
Speaker
We just got stuck in this mire, and they are these large behemoth organizations that are actually not really incentivized to innovate as rapidly as they did. And I would encourage future founders and the people who want to use tech in a good way to just start looking for different ways of doing it, because I don't think we should take it for granted that this is the only way you do it: you just get billions of dollars and you scorch the earth and you
00:41:41
Speaker
do whatever you want. I just think, quote unquote, scale is not always the most profitable or best way of doing that. Yeah, absolutely. And I want to get excited again about using the internet. You know, you were talking about the 90s and all the exciting things you could do and find and explore, and I think we've lost some of that. So yeah, what's that vision? What do we want? What could exist, or what does exist already, that's better than what we have? I would love to think more about that. I think there are so many opportunities and possibilities. We've got quite locked into a very set way of how things work, but it doesn't have to be that way.
00:42:23
Speaker
Absolutely. Well, Hannah, we'll end things there. I want to thank you so much. I know we had to do a little time zone tango. What are we doing, like three different time zones right now, because George is overseas?
00:42:34
Speaker
But yes, thank you for making it work. We really appreciate you taking the time. And I appreciate you handling everything I threw at you. I think you're absolutely brilliant. I hope we can have you on again. I will absolutely be a follower of your work.
00:42:50
Speaker
Please let us know when you put out more articles, more reports. This show is a fan of yours. So thank you, Hannah. Yeah, thanks so much for having me on. I'll definitely keep in touch. And I really appreciate you engaging in this. It's just great to see people all over who are interested in these topics and want to discuss them. I did think when I put this out that it might just be a bit policy-wonky, that only those people would be interested. So it's really great. Yeah, I mean, a peek behind the curtain: I think George sent me something,
00:43:24
Speaker
and I was like, you know what? Fuck it. I'm going to email Amnesty and see if we can get them on. And here we are. So I really, really appreciate it. I have been a fan of the organization for a long time, ever since high school.
00:43:38
Speaker
So thank you again for coming on the show. Yeah, thank you both.
00:43:44
Speaker
If you like this conversation, share it with friends and subscribe wherever you get your podcasts for a weekly ballistic payload of snark, insights, and laughs. New episodes of Bare Knuckles and Brass Tacks drop every Monday.
00:43:58
Speaker
If you're already subscribed, thank you for your support and your swagger. Please consider leaving a rating or a review; that helps others find the show. We'll catch you next week, but until then, stay real.
00:44:12
Speaker
Well, I appreciate the work that you put in. I remember the one page where, I think, more of the page was footnotes than text. Yeah, I made the error of not doing my footnotes until the end. I'd sort of left placeholders, and oh my gosh, that was horrific.