
Algorithms of Oppression author Safiya Noble visits the show

S7 E183 · The PolicyViz Podcast

Author and professor Safiya Noble visits the podcast to talk about the intersection of race and technology, and how "technological redlining" will impact people of color today and in the future. 


Transcript

Significance of Election Day

00:00:17
Speaker
Welcome back to the PolicyViz Podcast. I'm your host, Jon Schwabish. I hope you and your families are all well and healthy in these strange times. And obviously today here in the United States, it's Election Day, so it's a pretty big day. I remember just a few years ago when people would argue that there wasn't a big difference between either the major presidential candidates or the political parties. That has certainly changed over the last several years. The differences between the candidates couldn't be larger. The differences between the parties couldn't be larger,
00:00:47
Speaker
both in terms of policy and politics in the approach to governing. My hope today is that everyone who wants to vote can vote and does vote. Voter suppression strikes at the very core of any democratic country, and for any party, any person, any organization to threaten voters is to me quite upsetting and outrageous, so I hope today we have a smooth day in our elections.
00:01:12
Speaker
But OK, enough about politics.

Newsletter Relaunch & Updates

00:01:14
Speaker
And before we get to the show, a few things about what's going on at PolicyViz. First, I've restarted my newsletter. Up until this fall, I sent maybe one or two newsletters out each year. It's a lot of work to do. I didn't think
00:01:27
Speaker
folks really wanted to hear from me even more than they already do on the blog and the podcast. Plus, the way I was writing them, they just felt kind of like big advertisements. But I have finally found what I think is a good balance. I have the newsletter going out now every other week. It comes out the day before the podcast.
00:01:43
Speaker
And it talks a little bit about the podcast episode that people will see the next day, so it's a little bit of a sneak preview. I also write a behind-the-scenes post, which is pretty much a short blog post; I've only done two or three so far. I've already written about the last phases of my book-writing process for the book that comes out in January, and I've written about some data viz hacking I've been doing in Excel. So if you'd like to sign up for the new newsletter, do head over to PolicyViz and check it out.
00:02:13
Speaker
I've also added a few more things to the PolicyViz Shop, a couple more shirts. I've updated the Match It data visualization game, which is now in stock. And if you're into the R programming language, you'll find a couple of shirts that I think will strike your fancy, just saying.
00:02:29
Speaker
Okay, enough of all that, and on to

Introducing Safiya Noble

00:02:31
Speaker
the show. Okay, let me just say, I love all of my guests. My guests are great, they do great work, they take time out of their schedules to come sit and chat with me, and I love all of them. I will say, however, that this week's guest is special. It was a great conversation, super fascinating, and I was really excited to have this guest on the show with me. So this week,
00:02:55
Speaker
Safiya Noble joins me on the show. Safiya is an associate professor at UCLA, where she serves as the co-founder and the co-director of the Center for Critical Internet Inquiry. She's also the author of the bestselling book, Algorithms of Oppression. I picked it up over the summer and I could not put it down, and I'm just so happy that she
00:03:15
Speaker
took time out of her schedule to come chat with me on the show. Her book covers racist and sexist algorithmic bias in commercial search engines. It's a problem that has not yet gone away, as you'll hear about in the interview. So it was just a pleasure to chat with Safiya Noble. I had a really great time during this interview, as you will certainly hear in the audio.
00:03:37
Speaker
Her work is so interesting. Her perspectives on the intersection of race and technology are so interesting; they're so important, and they're invaluable for anyone who works with and communicates data. So I hope you will enjoy this week's episode as much as I enjoyed doing the interview and putting it together. So here's my conversation with Safiya Noble.

Safiya's Background in Advertising

00:04:00
Speaker
Hi, Safiya. Thanks for coming on the show. It's great to chat with you.
00:04:06
Speaker
It's great to be here. Thank you. I'm really excited to talk about your book, which is linked on the show notes page: Algorithms of Oppression: How Search Engines Reinforce Racism.
00:04:16
Speaker
And I want to talk both, if we can, about the book and also your current work and what you're looking at now, especially in the current era, the moment that we're in. But maybe we can start by having you talk a little bit about yourself and your background, and then you can talk about the book: where it came from, how you went through the process of writing it, and what you were trying to accomplish with it.
00:04:40
Speaker
Sure.

Research on Search Engine Biases

00:04:42
Speaker
So, you know, this book was really the outgrowth of my dissertation when I went to grad school. And I always like to tell people that, you know, I spent my first career in advertising and marketing for about 15 years. And, you know, everything that probably isn't good for us, like SUVs and booze, I was a part of selling all those things.
00:05:12
Speaker
When I went to grad school at the University of Illinois at Urbana-Champaign, I was thinking about advertising and marketing. That's the thing that I really knew super well, inside and out. I'd been an expert in multicultural and urban marketing and public relations.
00:05:31
Speaker
And when I went back to grad school, it was kind of a time when, of course, Google was on the rise, Facebook was on the rise. These platforms were becoming platforms before our very eyes. And people were kind of enamored in a way that I thought was kind of curious in academia, because when I was in industry,
00:05:56
Speaker
I was really clear that platforms like Google and Yahoo in particular were advertising platforms. I mean, there were places where we were doing pretty significant media buying. We were definitely working with public relations teams to create what we called like advertorial, right? To make it to the front page of search and to make our clients' products and services seem like they were credibly third-party verified, so to speak.
00:06:26
Speaker
In academia, people were starting to talk about Google like it was the new public library. And this disconnect really was fascinating to me. And that was the impetus for starting to think about, well, wait a second. Let's double click a little bit here on Google in particular. Really, as a case, I mean, it could have been Microsoft or Yahoo, but no one really uses them the way they use Google.
00:06:51
Speaker
Unfortunately, Google got the attention because they were the monopoly leader. But you know, that's really how I kind of got to it. And because I'd worked in the multicultural and urban marketing space for so long, I was attuned to thinking about how vulnerable communities, marginalized communities, underrepresented communities, and even
00:07:14
Speaker
trend-setting and trend-leading communities were represented. I think of African Americans, for example, my community, as being often politically, socially, and economically marginalized, and at the same time being the trend setters and the trend leaders in popular culture. So it's this, you know, interesting duality, plurality. That's where I kind of started looking at how communities were represented in search engines. And that really opened up, I think,
00:07:44
Speaker
the Pandora's box that became Algorithms of Oppression. Right. Do you find it odd, even today, how many people seem to think that these services that we get for free are just free and there's no other component? I love that word, advertorial. Yeah. I mean, it still surprises me. On one hand, it surprises me that people who I think should know don't know.
00:08:11
Speaker
And that's the sliver that surprises me in particular. So I remember, before we were all locked down, I was giving a talk at a big conference, and a public librarian approached me and she said, you know, I always thought Google was a nonprofit. And I thought, oh my gosh, it was so important to learn that from her. And it made me realize that even the people that we
00:08:39
Speaker
think of as being highly trained and astute. They are also just as susceptible to the kind of marketing discourses that come out of Silicon Valley as anybody else, right? So Silicon Valley's really for 30 years tried to convince us that their products are just tools and that anything that happens in them is the fault of the public, not the fault of the faulty tool.
00:09:06
Speaker
We have a lot of work to do for sure with all kinds of audiences, but I like to focus on data scientists and computer scientists and librarians and teachers and professors and people who I think have kind of like an exponential type of impact on helping us understand what we're really dealing with.
00:09:30
Speaker
Right. Do you want to talk about an example or two from the book on how the algorithms, particularly in Google's search engine, have misrepresented, or still do misrepresent, I guess, people of color and underrepresented groups? Just to give people who haven't read it a flavor of it. And I'll give people a warning: the examples you show are pretty shocking in how these algorithms work and misrepresent different groups.
00:09:58
Speaker
Yeah, this is probably the point where we should warn everybody that if you are listening to this podcast in the car with the kids, this is where you want to turn the volume down. Okay. So what got me going in this study, like the first real study, was looking at a variety of keywords that represent
00:10:24
Speaker
women and girls, and not just women and girls. I mean, I looked at a lot of different keyword combinations. The first, let's say, well-crafted study that I did was I took all of the identities represented in the US Census. So I took the gender categories and the racial and ethnicity categories, and I paired them and comboed them in 80 different ways.
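Editor's note: to make that pairing concrete, here is a minimal sketch, in Python, of this kind of keyword-combination setup. The category lists below are illustrative placeholders, not the exact US Census categories or the 80 combinations used in the study.

```python
# A sketch of pairing identity categories into search queries.
# These lists are illustrative stand-ins, not the actual Census categories.
from itertools import product

gender_terms = ["girls", "boys", "women", "men"]
ethnic_terms = ["Black", "African American", "Latina", "Asian", "white"]

# Combine every ethnic marker with every gender term to build the queries.
queries = [f"{ethnicity} {gender}"
           for ethnicity, gender in product(ethnic_terms, gender_terms)]

for query in queries:
    # Each query would then be run through a search engine, and the
    # first page of results recorded and coded for analysis.
    print(query)
```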
00:10:50
Speaker
And what was shocking to me, and by shocking, I mean not really shocking, but still shocking, was to see how, when you paired any kind of ethnic marker, except for white, with the word girls, so African American, Black, Black in particular, and I thought Black was important because most, you know, Black girls identify as Black girls, right? I mean, African American is kind of
00:11:18
Speaker
a way we might characterize ourselves in more formal settings, but in our families and with our friends, you know, many of us just identify as Black as kind of a cultural and political identity. And so it was shocking to see that for Black girls, Latina girls, Asian girls, the whole first page of search was almost exclusively pornography or hyper-sexualized content. And, you know, the first of these being, I think, you know,
00:11:47
Speaker
In 2009, when I was looking at this, the first hit for black girls was hotblackpussy.com. And then by 2011, it was sugaryblackpussy.com. And you know, you ask yourself, my God, you know, how can, how can this be when you don't have to add the word sex or porn, but black
00:12:09
Speaker
girls have become synonymous, Asian girls, Latina girls have just become synonymous, without adding any sexualized keywords. And this really is what opened up what became a much longer inquiry into all of the ways in which these systems really fail vulnerable people. And we have a lot of mythology around why this happens. So for years, I would talk about my work and people would say, well, that's just because that's what's most popular.
00:12:39
Speaker
And I would say, well, how can we say that this is what's most popular without adding these kinds of hyper-sexualized keywords? And how fair is that? I mean, this is tyranny of the majority, because girls of color in the United States will never be in the majority in a way to affect the way they're misrepresented in search engines. But more importantly, they'll also, in the near-term future as children,
00:13:09
Speaker
never have the capital to be able to use AdWords and keyword-planning tools and all of the back end that helps optimization happen. And of course, the porn industry is masterful; many, many technologies that we have are really due to the research and investment that they've made. But also, is it fair, and is it moral, and is it ethical? And that's really the kind of opening salvo of the book, Algorithms of Oppression,
00:13:39
Speaker
which follows with many, many more kind of gruesome tales of misrepresentation and harm. You know, it's really interesting. I read Ibram Kendi's book Stamped from the Beginning after reading your book, and the entire,
00:13:54
Speaker
I mean, at least the first half of the book, he talks about how the way that white people would describe the black community and people in bondage was hyper-sexualizing both men and women. And so for someone to say that what happens in the algorithms or what happens in their Google search is because that's what's popular is sort of ignoring a long history of the way the white power structure has viewed and I guess pushed down the black community in this country.
00:14:24
Speaker
It's really true. I mean, I try to give a whole historical and kind of sociological context, like Ibram's book does. His book came out around the same time; I wasn't able to read it before or while I was writing my book, and I wish I could have. But these histories: many Black feminists have written for decades about the gruesome hyper-sexualization
00:14:53
Speaker
of African peoples. And of course, one of the reasons for that is because it helps justify the enslavement of people, right, the degradation, part of the dehumanization of people. And when you dehumanize people, you are much less attuned to their loss of rights, the ways in which they are discriminated against, the way in which they are oppressed.
00:15:19
Speaker
You lose empathy and the ability to care. And so talking about how dehumanization happens in the digital space is very important, because many of the ways that we're engaging online are dehumanizing, and they're racialized and gendered in that dehumanization. And if we think there isn't a relationship between people coming across disinformation, propaganda, racist ideologies
00:15:48
Speaker
that seem so subtle, right? I mean, I can't tell you how many thousands of people have told me over the last decade that what shows up on the first page of search is just what it is. And there's no politics, right? There's no power. There's nothing nefarious happening. It's just what it is. I actually had a professor once say to me when I was a grad student presenting my work, he said,
00:16:11
Speaker
maybe Black girls just do more porn. I mean, the temerity, right, of the way in which people would justify these kinds of problems is really why, in this book, I try to expose people to, you know, how to think about the way in which we've all been socialized in racist systems
00:16:36
Speaker
that then make it very difficult, when we are designers and makers of technology, to even be willing to look at these problems.

Digital Redlining Explored

00:16:47
Speaker
Right. I want to pivot a little bit: because we've looked back a little bit, I want to look forward. So in this book, and also elsewhere in some of your writings, you've coined this term technological redlining, which I think is really a fascinating concept. And I'll just, for a brief moment, talk about what redlining is for listeners who don't know. So in the early part of the 20th century,
00:17:12
Speaker
there was a federal government agency whose job it was to assess housing values in major metropolitan areas around the country. They created maps that assessed risk, and the red areas of those maps were the highest-risk areas.
00:17:29
Speaker
An area could have just one Black household living in it and be assessed the highest risk, and the impact of those maps has persisted for generations, affecting the accumulation of wealth, the accumulation of income,
00:17:43
Speaker
and neighborhood segregation in the US. And so Safiya, I'm interested from your perspective, when you think about technological redlining, how does that affect groups now? And then also in the future, does it have the same intergenerational effect going forward? It's such a great question. You know, as I was writing the book, I didn't know about the genius of my colleague, Chris Gilliard.
00:18:09
Speaker
who also writes about digital redlining. And I think of our concepts of kind of technological or digital redlining as being quite similar in that what we're arguing is that these systems, all kinds of digital engagements, not just search. I mean, every type of digital engagement is really tied to tracking, surveilling, categorizing, and predicting us into certain types of futures.
00:18:40
Speaker
And those futures, and the way in which we're classified digitally, are usually something we have no ability to affect for the most part. We really don't know what our digital profiles are. We don't know, just like African Americans in the past did not know that they lived in zip codes that were being redlined, keeping them out of mortgages and financial services. In many ways, those same processes are happening now. Our digital traces, the
00:19:10
Speaker
information gathered about us, are also used when we are doing things like looking for insurance quotes online or looking for different kinds of banking and educational products. You know, there's a great book called Lower Ed by Tressie McMillan Cottom, where she talks about how, over and over again, poor women, and especially poor women of color, Black women,
00:19:38
Speaker
whose digital traces are embedded in these systems that they're using all the time. When they go to look for things like higher education, they are targeted with predatory ads that push, you know, the Trump Universities, the for-profit, predatory kinds of educational, and I'm using the quote-y fingers here, right,
00:20:00
Speaker
scams, quite frankly, and are targeted for all kinds of predatory products. So these are the kinds of ways that we're talking about digital or technological redlining. The thing that I worry about the most, that I'm studying, is really the predictive analytics that underlie so many of the systems we're engaging with, because I don't think that it's far-fetched to see how systems of social credit, which really originate in the United States
00:20:29
Speaker
with our own kind of credit-scoring systems, and certainly are in play in the UK. You know, we put a lot of focus on China, but we want to remember that a lot of these models started before China picked them up. So these social credit systems, these predictive models about who has access to all kinds of opportunities, goods, and services in the future, really are the fundamental logics of so many products now.
00:20:58
Speaker
And I think these are things that are hidden from view. People don't understand them. When they hit the news, like the algorithms predicting who would get to go to college or not in the UK, which hit about a month ago, then people kind of realize like, whoa, whoa, wait, what? But until people start to understand that many of the future opportunities for our kids and our next generations are going to be overdetermined by algorithms and AI,
00:21:27
Speaker
then we haven't done enough work yet. And I think we need regulation, we need policy, and we need a huge amount of social awareness about what these technologies I think are going to do to our quality of life and the way we live.

Combating Algorithmic Bias

00:21:46
Speaker
Yeah. So on that note, what can people do to combat algorithmic racism? And especially for people who might be listening to this podcast who are sort of immersed in data, immersed in technology, you know, what can we do, if anything, to address some of these inequities? It's really a good question. I mean, I think there's intervention on a couple of
00:22:14
Speaker
levels. Well, first, let me say, for data scientists and people who work with data: one of the things I always try to teach my students at UCLA is that data, like race and gender, are social constructs. Data come from somewhere, and they're incredibly subjective. People think that data is somehow neutral, that it just is what it is, that it has no politics. But see, those of us who make data,
00:22:43
Speaker
people who are researchers, we know, for example, the sociologist knows that where they kind of mark the beginning of one category and the end of the next category is sometimes arbitrary. It's sometimes a guesstimate. It's sometimes kind of like the best you can do. It's sometimes just a number. It's just 25%. And that's what it is. And that kind of data making is also often, especially when you're talking about big data sets,
00:23:11
Speaker
It's representative data. It isn't a close reading of people. Representative data is somebody's grandma. It's somebody's family member. It's people who are vulnerable, who can't speak for themselves, who can't clarify and nuance the way in which they're represented. So this kind of representative data gets used and taken up like a truth teller. And that is very dangerous. So I think we got to pull back and think about
00:23:39
Speaker
what we're doing when we're doing things with data. You know, obviously there are so many tech workers who are also intervening at the level of just saying, I'm not going to work on certain products, because I think these are unethical products, or these are going to change the future in a way that we don't want. I look at people like climate scientists now, who lament the fact that all they did was kind of
00:24:07
Speaker
put the information out and say, hey, the way human beings are living in an organized society is really going to kill the planet. Here's the research. But they didn't take an activist stance about that work.
00:24:21
Speaker
So I think we have to, those of us who work closest to these models, we have to take a more activist stance about the harm and say, wait a minute, we don't want to do this. This is dangerous. And not lay back like the climate scientists until it's too late. And then I think, you know, for everyday people, obviously, there are so many ways that we should be thinking about protecting our data,

Local Policy & Tech Issues

00:24:46
Speaker
and our privacy, and how we engage with the internet.
00:24:50
Speaker
But I think we need to really exercise our power by getting candidates educated. We need to vote for policymakers who are astute about these issues. Many of the interventions can happen at local and state levels. We certainly see things like bans on facial recognition and greater privacy laws in California, where I live. Those things don't all have to go to the federal government. And I just want to remind people that we still have a lot of
00:25:18
Speaker
power to connect and organize locally where we live. Yeah, I wanted to ask, and you may not know the answer to this, but are there states or countries or other jurisdictions, or even individual policymakers, that are leading in these efforts to address these issues? Well, I think Senator Warner's office, the Senator from Virginia, has been way out front on things like antitrust and harm
00:25:48
Speaker
that's coming from Big Tech, let's say, broadly as a category. And I think we want to support and watch what's happening in his office. I think that on the Federal Trade Commission, we have Commissioner Rohit Chopra, who is one of the best and smartest thinkers about consumer harm. So he, for example, is taking on the cases where
00:26:16
Speaker
you know, automotive dealerships, for example, not only tell their salespeople, when they see Black and Latino customers approach, to mark up the price of the car no matter what, but are also using predatory software that gives higher interest rates to African American and Latino customers. He's taken them on, and took on a case out of New York. And so I think that
00:26:40
Speaker
there are individual policymakers who are definitely trying to create a critical mass on the federal level. I think California has been a place where we can test a lot more thinking, and it's a place where we need to. I mean, here we are, the heart of Silicon Valley is right here in Silicon Beach, right down the street from where I live. And here you have the largest industry in the world that doesn't pay taxes, doesn't pay back into the system,
00:27:09
Speaker
you know, takes the cream of the crop students into their employee ranks, and then really gives back so little to society. So I think, you know, there are people, certainly those of us in the UC system at UCLA, our Center for Critical Internet Inquiry, we're really trying to track these things and your subscribers can certainly follow us and we can try to keep pushing out resources. I think in Europe,
00:27:36
Speaker
The EU has probably been the most aggressive, Germany and France especially, around thinking about the harms of data. But it's very complicated, because the scale of these systems is so intense that, you know, when you talk about taking on these systems, you're talking about taking on the entire financial services sector, which is all digital now. You're talking about taking on the markets, the global financial markets. You're talking about taking on state governments
00:28:06
Speaker
that are now so deeply intertwined with digital systems that it's quite difficult to see where big tech companies begin and end and where states begin and end. And then of course you have companies like Facebook, who are operating at the level of a nation-state unto themselves, trying to make their own currency, trying to make their own laws to govern themselves. So there are many points of pressure that I think we need to be paying attention to.
00:28:34
Speaker
Right. I can't decide which final question I want to ask. I might ask both of them. This is just so fascinating. Let me ask this one first. Now, we're in October of 2020, so it's an odd time for many reasons, but I want to ask whether you think things are changing for the better or changing for the worse. I can't really tell based on what you said so far, whether you think we're going uphill or downhill on this. Well, I think that

Optimism for Tech Equality

00:29:05
Speaker
We haven't bottomed out yet, and so that makes me a bit nervous, because, you know, one of the things that I say in the book is that we have more data and technology than ever, and more global social, political, and economic inequality to go with it. So one of the things we know, for example, is that the promises of the Internet and the promises of digital technologies were that they were going to
00:29:32
Speaker
even out the world, right? Make things more equitable, make democracy more plausible, accessible. What we've seen, though, is that many of these technologies are being used in service of the kind of rise of authoritarian regimes, including in the United States, I'm sorry to say. We're seeing, again, a level of control
00:30:00
Speaker
and lack of transparency by so many of these actors, between the tech sector and governments, that I'm not actually feeling hopeful yet that everything will turn on the next election. What I will say, though, is that we are in the era of kind of the techlash, and people are starting to think more critically, and
00:30:26
Speaker
films and television shows and scholars and thinkers and podcasters and, you know, public intellectuals like yourself are taking up these issues in a way that they really did not 10 years ago. So in that way, I'm super
00:30:45
Speaker
energized, that we're building a critical mass of people. And I like to think of it this way, that right now I'm writing about the relationship between three different eras, the era of big cotton, the era of big tobacco, and the era of big tech. And one of the things that they all have in common is that it really was a small group of people who were kind of abolitionists who took on
00:31:13
Speaker
huge industries that people thought could never be taken on. And they shifted the paradigm of how we think about the enslavement of African peoples. They shifted the paradigm away from thinking that the public health crisis associated with big tobacco is just what it is and there's nothing we can do about it. And I think we're in a moment where some of us are thinking about what technologies should be abolished.
00:31:39
Speaker
What should be made illegal? What is too harmful? What are the secondary and tertiary effects of some of these technologies? Even if you aren't on Facebook, how are you affected by those kinds of platforms? So I think that in that way, if we pull back and take a long view instead of a short view, I feel hopeful that there will be enough of us who will talk about the ethical and moral failings of the institution of big tech
00:32:09
Speaker
and shift us towards something better. And I believe we've already done that. Human beings have already done that in different moments. I think we'll do it again around this moment. Well, I like the way you shifted from the pessimistic outlook to the optimistic outlook. That was the best question, too, Jon. I wanted to ask just one last question, on what you're working on now and what you and your colleagues are looking at for future work.

Work at UCLA Center

00:32:38
Speaker
Okay, so at UCLA we have this new center: the UCLA Center for Critical Internet Inquiry. We're part of a network of centers that have been funded by the Minderoo Foundation. They include kind of critical technology centers at NYU, New York University; at Cambridge; at the University of Western Australia; us at UCLA; and some friends at Oxford, at the Oxford Internet Institute.
00:33:08
Speaker
I guess there are kind of two things we're all thinking about. One is how to strengthen the research so that people can access it and pick it up and touch it and do things with it; that's so important. And we're thinking about public policy: what are the ways that we can inform policymakers with the best, you know, evidence-based research, because we think that's what should inform policy, not just raw power.
00:33:35
Speaker
And we're thinking about how to shift culture. So, you know, my colleague Sarah Roberts at UCLA is kind of one of the world's authorities on commercial content moderation, so we're obviously thinking about policies that affect technology workers and the traumas that they experience in moderating content. That'll always be core, and I'll always be kind of working on algorithmic discrimination and oppression.
00:34:00
Speaker
But we're taking up and trying to link arms, I think, with other researchers around the world who also care about these issues. Most of the places that have studied the internet and society have really been advocates for the tech sector and have taken a lot of funds and resources from the tech sector. So we're kind of like
00:34:20
Speaker
You know, we're on a shoestring compared to those places, but we don't bring big tech money into the center because we don't want it to influence the research we do. And we think that gives us a space for people who are interested again in studying the most vulnerable and those who are most harmed. And we need help and support. So people who are interested in that kind of work, again, should link up with any of these places where this work is being done.
00:34:49
Speaker
Yes, absolutely. And I'll put links to the center and to your work, so people can check that out. Safiya, thanks so much. This has been really interesting. The book is great. Your work is great. And I really appreciate you taking the time to come chat with me. Oh, thanks so much, Jon. It's really my honor and my pleasure. Appreciate you.
00:35:20
Speaker
And thanks to everyone for tuning into this week's episode of the podcast. I hope you will check out Safiya's book and her work at UCLA. I have linked to all the things that we talked about in the episode in the show notes. If you'd like to support the podcast, please tell your friends and colleagues about it, write a review on iTunes, or head over to my Patreon page. So, until next time, this has been the PolicyViz Podcast. Thanks so much for listening.
00:35:56
Speaker
A number of people helped bring you the PolicyViz Podcast. Music is provided by the NRIs, audio editing is provided by Ken Skaggs, and each episode is transcribed by Jenny transcription services. The PolicyViz website is hosted by WP Engine and is published using WordPress. If you would like to help support the podcast, please visit our Patreon page.