
Automating Inequality author Virginia Eubanks talks inequality in tech

S7 E181 · The PolicyViz Podcast

On this week's episode of the show, I chat with Virginia Eubanks about how high-tech tools and software profile and punish people of color and low-income people and families. 


Transcript

Introduction to High-Tech Inequality

00:00:16
Speaker
Welcome back to the PolicyViz Podcast. I'm your host, Jon Schwabish. And on this week's episode of the show, we're going to be talking about how high-tech tools and software profile and punish people of color and low-income people and families.

Interview with Virginia Eubanks

00:00:30
Speaker
And to discuss these complex, really interesting issues, I chat with Virginia Eubanks, who is an Associate Professor of Political Science at the University at Albany in New York State.
00:00:40
Speaker
Virginia is also the author of the 2017 book, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. Now, if you haven't read Virginia's book, I really, really highly recommend

Impact of Technology on Communities

00:00:50
Speaker
it. Some of the stories she tells, and she does a really good job of weaving in the data with the stories of people and places, a topic that comes up again and again in this podcast, are just amazing to me.
00:01:01
Speaker
For me, the book is right up my alley. It's a mix of public policy considerations, data considerations, and technology considerations. So it's right in that sweet spot. Virginia and I also talk about a lot of groups doing amazing work in this space, including the Center for Media Justice, Data for Black Lives, and the Ella Baker Center, all of which I will link to on the show notes page. And I'm pretty sure you're going to learn a lot on this week's episode of the show. So here's my discussion with Virginia.

Algorithmic Injustice and Personal Stories

00:01:33
Speaker
Hi, Virginia. Welcome to the show. Thanks for taking time out of your schedule. Yeah, thanks so much for having me.
00:01:38
Speaker
I'm really excited to chat with you. I really, really enjoyed your book, Automating Inequality. There seem to be a bunch of these books out right now, but the thing that struck me about your book, and that I hope we'll spend some time talking about, is that you weave in not just the numbers and not just the technical parts of what's happening, but also the stories and how it all impacts real people and real families and real communities.
00:02:04
Speaker
Um, and so there's a lot of books out there that I think are a little more academic, and yours, for me at least, really struck a chord. I'm excited to chat with you about it. I thought maybe we could start, you could just talk a little bit about yourself and your background and, um, why you decided to write this particular book on this particular

Political Nature of Public Services Technology

00:02:20
Speaker
topic. Yeah. So I'm really glad to hear that the book spoke to you in this way, um, because
00:02:27
Speaker
It just has always seemed really obvious to me that algorithmic injustice or digital decision-making or whatever it is you want to call it right now, particularly in public services, it's all about people, right? It's all about people and it's all about politics. And one of the things that I get really concerned about when we talk about these issues sort of in public
00:02:50
Speaker
is that we sometimes frame them, these new technologies, just as issues of sort of administrative upgrades or efficiency upgrades. And so they're not political in themselves. But one of the arguments I try really hard to make in the book is that these technologies are political decision-making machines. And in fact, the thing that is most, sometimes most worrisome about them
00:03:18
Speaker
is that they are sort of politics pretending they're not politics.

Personal Experiences with Digital Surveillance

00:03:23
Speaker
So this great political scientist that I love named Deborah Stone is writing a new book about numbers. And one of the great lines in the book is she says, numbers are just stories pretending they're not stories. And that's very much sort of the approach I took to this work, which is like, there's so much great work out there, including, earliest out in some ways, Cathy O'Neil's wonderful book, Weapons of Math Destruction.
00:03:47
Speaker
One of the things that's so strong about that book is she's a quant person herself. She's a really good writer. And so she makes it really clear how the technology works and what the impacts might be. But I felt myself, after reading it as much as I love the book, really hungry to hear from the people who were being affected. And that goes way, way back in my history. So the moment I think of as the sort of
00:04:14
Speaker
origin story of this book is all the way back in 2000, I was working on a project with a group, a community of women who lived in a residential YWCA in my hometown of Troy, New York. And we were working together around issues of sort of technology and economic inequality. And the sort of idea that was really current at the time was this idea of the digital divide, this idea that the sort of most important social justice issue or one of the most important social justice issues
00:04:43
Speaker
of the digital age was the issue of lack of access, whether that was along racial lines or that was along class lines or gender lines.
00:04:51
Speaker
And so I kind of went into this collaborative project in the late 90s with this in my head. This

Digital Surveillance in Welfare Systems

00:04:58
Speaker
community of wonderful women at the YWCA really sort of eventually just sat me down and like forced me to have what we in the South would call a come to Jesus moment around my assumptions. And basically said, look, Virginia, we don't lack interaction with technology in our lives. It's just the interactions we're having with them are terrible.
00:05:18
Speaker
are really exploitative, make us feel unsafe, make us feel vulnerable. And one of those moments, which I recount very briefly at the beginning of this book, feels very much like the seed of Automating Inequality. And that was I was talking to a young mom on public assistance who goes by a pseudonym in the book, Dorothy Allen.
00:05:39
Speaker
Um, and we were talking about her electronic benefits transfer card, her EBT card, which is the sort of ATM-like card you get public benefits on in most places now, but they were pretty new in 2000. So we were talking about it. And, um, she said, you know, I was asking her different questions about how it was working for her. And she said, you know, um, right, maybe there's some ways that the stigma is a little bit less than like pulling food stamps out in the grocery store. But frankly, like.
00:06:08
Speaker
Most of the tellers don't know how to use them. So they just shout, you know, like, you know, food stamps, how do I deal with this card? Right. So not that much less stigma. Like in some ways it's more convenient. Yeah, I guess. But in reality, like the thing that really stands out to me, she said, is that when I go to see my caseworker, all of a sudden she's asking me questions like, why are you spending all this money at the convenience store on the corner? Don't you know it's cheaper to go to the grocery store?
00:06:35
Speaker
And so she sort of pointed out that this digital record that was being created by her electronic benefits transfer card was creating a trail that her caseworker could follow to track all of her movements and all of her purchases. And I must have had this like incredibly naively shocked look on my face. I was, I don't know, 25 at the time and had only been on one public benefit program in the past and not on food stamps.
00:07:03
Speaker
And so she kind of looked at my face and laughed at me for a while and then like got actually really sort of quiet and concerned and was like, oh, Virginia, like you all meaning I believe at the time, meaning sort of professional middle class people, I was a graduate student at the time, like you all should be paying attention to this because you're next. And that moment has always stuck in my head.
00:07:26
Speaker
Not only because I think that was actually incredibly generous of Dorothy, right? Of her being like, oh, we're dealing with this shoot storm, you know? And you all should be concerned because it might impact you too. But also it stuck in my head this idea that like the folks who are sort of on the cutting edge of a lot of the most intrusive, invasive digital surveillance technologies

Focus on Current Technological Impacts

00:07:55
Speaker
are poor and working class people. And you need to go to the source to ask people how those tools are operating in their lives. And so I was really committed, writing Automating Inequality. I did more than 100 interviews for the book. I talked to lots of different people. I talked to designers. I talked to policymakers. I talked to cops. I talked to frontline social workers. I talked to welfare case examiners. But in every case, I started by talking
00:08:24
Speaker
to the people who felt like they were the targets of the system I was describing. So in Indiana, it was folks who either struggled to keep their benefits or lost their benefits during that benefit modernization. In Los Angeles, it was unhoused folks who had interacted with the coordinated entry system. And either it had gotten them housed, and it was often a happy story, or they had been shut out somehow. And in Pennsylvania in the Allegheny County story, I started with the families.
00:08:55
Speaker
who felt like they were being targeted by this algorithm that risk rates their parenting based on the potential risk to their children of abuse or neglect. And it just turns out that these stories of these magic new digital tools look really different from the point of view of the targets of those tools. Yeah, I think it's just really crucial to start with impact. Start with who does it matter to and how's it affecting their real lives every day.
00:09:20
Speaker
Yeah, I mean, a lot of what you're expressing is this idea of empathy through the storytelling, right? Being able to put ourselves in someone else's shoes and say, what if I was the person on SNAP receiving benefits and having to use this card? What would my experience be like? And maybe that's something that we've lost a little bit over the last, let's say, three years or so. Yeah, in a sense. I mean, for me, where that instinct comes from is less about empathy and more about fact.
00:09:49
Speaker
You know, the old saw is the future has already arrived, it's just unevenly distributed, right? This is something that's widely said, that William Gibson said in the sort of 80s. And I think he meant it slightly differently than I do. I think he meant that, like, the newest, flashiest technology goes to wealthier, more powerful people first. I think in the kinds of
00:10:18
Speaker
um, cases I'm talking about, these tools are tested first in communities where there's sort of a low expectation that people's rights will be respected. Um, and so where you see the sort of most bald-faced uses of these tools tends to be in these communities, not just poor and working class communities and not just communities where people are using public assistance, but I do think that that's an important place to look, but also migrant communities, um, communities of color.
00:10:46
Speaker
First Nations, indigenous folks interact with these tools in really different ways than non-native people do. So it's really about not projecting potential harm of these tools into the future, right? Like the example would be, and this is actually really important, right? Like, so let's talk about which we do a lot, what like an autonomous car
00:11:13
Speaker
would do if it came upon a box of puppies and a bicyclist in the road, like which one would it hit, right? That's actually an important question to ask. We should be asking that question. But we have this tendency when we talk about technology and policy to talk about the problems that might come in the future instead of just going and talking to people about what's actually happening in their lives right now. And so that just tends to be my approach, which is like, you know,
00:11:42
Speaker
These future problems are interesting and in some ways sort of beautiful puzzles that people like to sort of grapple with in their heads. But if we want to get real about what's actually happening, we have to go ask people and we have to go ask people in these places where there aren't real expectations that people's rights will be respected. Right. So dealing in the now so that we could deal with the future, we can evolve to the future that we want to get to. Exactly. Yeah, exactly.

Public Service Modernization Failures

00:12:08
Speaker
I think that's a great way to put it. Yeah.
00:12:10
Speaker
So one of the examples, and it's early in the book, is this experience in Indiana you just mentioned, where the public services, mostly TANF, I believe, and food stamps and Medicaid, were trying to modernize the system. It involved a lot of the technology issues that you talk about throughout the book.
00:12:29
Speaker
And I wanted to ask, you know, a lot of what you talk about in the book, and a lot of what people in the world talk about, is that a lot of these modernization efforts are about efficiency. They're about cost cutting. And sort of privacy gets a little bit of a wink and a nod, from what I see. And I'm just curious, how should we, as both consumers,

00:12:50
Speaker
people who are receiving these benefits or involved in these programs, or just as citizens, so how should we think about these competing incentives? Because there is a budget constraint for some of these programs. And yet we have these, what I'll call after reading your book, these fairly scary outcomes that are possible. Yeah. So I think that idea that we have to work with
00:13:18
Speaker
resources that are limited beyond our ability to change them is one of the most common reasons people will give you for going to these digital tools. There's generally two first-run reasons that people give for these digital tools. The first is efficiency, cost savings, and sometimes the identification of fraud, waste, and abuse. The second is
00:13:45
Speaker
ferreting out legacy patterns of discrimination in frontline decision-making. And both of them are reasonable, right? We want there not to be frontline discrimination in decision-making. We want rules to be applied the same way in each case, in most cases, though we can talk about that some more in a minute because people are individuals and their problems and needs and resources are different. But let's just talk about the efficiency issue, like the triage issue.
00:14:16
Speaker
So even though I spoke to, like I said, 100 people, and I spoke to lots and lots of designers, and all the designers were quite different in their approach and their politics and what they thought the problem was, to a person, every single one of them would say that they had to use the tool because it was necessary to do a kind of digital triage, that there weren't enough resources for everyone
00:14:46
Speaker
and that in the absence of having sufficient resources, they had to make really hard decisions about who should get access to benefits and who shouldn't. And one of the things that I try to raise in the book is this idea that triage actually isn't an appropriate way to talk about programs that have been relentlessly defunded.
00:15:12
Speaker
So when you talk about public services, for example, since 1996 and actually even before, we've made a series of really consequential political decisions to defund our public service system. And you can't then say,
00:15:31
Speaker
Oh, like, let's relate this to like a natural disaster. We have to do triage because, like, how could we know that this tsunami was coming and we don't have enough medicine? Right. Like this is clearly not what's happening. And so one of the arguments I make in the book is that it's actually not appropriate to use the language of triage, because if the problem is not temporary and if there are not more resources coming,
00:16:00
Speaker
then what you're doing is not actually triage, it's digital rationing. And so if it's digital rationing, like let's name it, like let's say that's what we're doing and have a conversation about that. I find this idea that like, oh, we have to do it because we just don't have enough resources. I find that specious. I find it really unconvincing and potentially malintentioned.
00:16:25
Speaker
And you see that across all three of the cases, right? That's the same argument that's made in Indiana and Los Angeles and in Allegheny County. It says that there's just not enough resources. But you end up with these more, like thornier, more awful problems by defunding these programs at the front end.
00:16:47
Speaker
For example, if you look down the road of the book towards the Allegheny County case, so we're talking now about building a tool that is supposed to risk rate all the families in Allegheny County based on their potential to maltreat, that is, abuse or neglect, their children in the future, so that they can be investigated by the Office of Children, Youth and Families there,
00:17:13
Speaker
with an eye for potentially pulling children out of the home and putting them in foster care. The reality is 75% of children who are put in foster care across this country are put in foster care because of neglect, not because of emotional, physical, or sexual abuse. And the textbook definition of neglect is basically very similar to just being poor. It means not having safe housing. It means not having enough food.
00:17:42
Speaker
It means having to leave your child alone or with someone who's not terribly trustworthy because you have to go to work. All of those are downstream problems from not funding public services. I find it so sneaky that you then say, oh, but we have to do this because we don't have enough resources to investigate all these dangerous families. When the state is making those families dangerous, it's not parents who are making the families dangerous, it's the state.

Racial Dynamics in Welfare Policies

00:18:10
Speaker
And I mean, even in a very practical way in Indiana, if this was about cost savings, it did not work, because they signed what was originally a $1.16 billion 10-year contract. It ended up being a $1.34 billion, with a B, contract to create a system that basically worked to deny people public assistance, worked so badly that the community rose up and shut it down like three years into
00:18:39
Speaker
a 10-year contract, and then IBM turned around and sued the state for breach of contract and originally won, right? Like, won damages on top of the money they had already collected. And if you had just looked at the contract with an eye to like how public services actually work and what the impact might be on affected communities, you could tell from the contract out what was going to happen. Like all of the metrics were
00:19:04
Speaker
Nothing was about whether or not people got benefits they were entitled to. Nothing was about whether the decision that was made was correct. All the metrics were how fast did you get off the phone and how many cases did you close. Right, right, right. And so you absolutely, I mean, you could have known from the beginning that that was going to be the effect.
00:19:26
Speaker
So that the metrics that they're looking at are efficiency, but not necessarily the usefulness of the program. And certainly not to the people who are participating in the programs. Well, the metrics are short-term efficiency. The metrics are like, how many people can we get off public assistance this year? They certainly wouldn't say that was the metric, but I think you could read between the lines of the contract pretty easily that that's actually the metric.
00:19:51
Speaker
Um, but that just creates, like I said, all of these downstream problems, right? And I'm not even talking about the human costs, like people like Omega Young, who lost her Medicaid because she missed a phone appointment because she was in the hospital dying of ovarian cancer. Right. So I'm not even talking about the human effects or the political effects of a community that now will not trust public service programs
00:20:18
Speaker
because they've had these god-awful experiences of being digitally surveilled. I'm not even talking about those. I'm just talking about the straight money. They just lost money on that bet. And that's not even counting the legal case that the state had to engage in to fight back against IBM's suit. That doesn't include the hundreds, probably thousands, of fair hearings that they had to hold when people were wrongly denied their benefits.
00:20:48
Speaker
Like, so not even talking about the cost to people, like the actual cost, they lost money on that. Right. I'm curious. So the book came out three years ago, in 2017, which feels like
00:20:59
Speaker
90 years ago right now. That is really funny. It's really true, right? But I'm curious how, if at all, your perspective has changed after the murders of George Floyd and Breonna Taylor and, unfortunately, so many others. Has your perspective changed on these issues, especially over the last few months, or has it not?
00:21:21
Speaker
Yeah, so I think that you can't talk about public benefits in the United States without talking about race and without talking about policing and the criminalization of poverty. Right. So I think in this moment, I might have framed what I said slightly differently, but I really think
00:21:45
Speaker
that so much of the really exciting and important conversations we're having about police brutality right now are absolutely clear and obvious in the cases in the book. So though white people still make up the majority of people on public assistance in the United States, perceptions about welfare as a sort of, quote, black thing impact like all of our policies
00:22:11
Speaker
And all of the ways things are implemented, I mean, everything from, you know, racial disparity in foster care to sanction rates in different states, that is, sanctioning is throwing people off of public benefits because they've made a mistake. All of that is racially determined in some really serious ways.
00:22:33
Speaker
In each case that I talk about, Indiana and Los Angeles and Allegheny County, race plays a really significant role. In Indiana, race played a really significant role in where they rolled out the system first as they were testing it. And actually, what I thought was really interesting was that there are really just a handful of counties in Indiana that have the majority of the black and African-American population.
00:23:03
Speaker
It seems they quite intentionally rolled the system out to the counties that did not have black populations first, which I think is really fascinating. I have some real suspicions that I couldn't confirm that it might have to do with sort of using racial resentment as a political tool. So I saw race at play very much in Indiana, and also in the case of folks like Omega Young, who really faced the worst outcomes of the system. They were majority black women.
00:23:34
Speaker
In Los Angeles, I look not just at Skid Row, where many of the stories of the unhoused community come out of and for good reason. It's a huge and very politically active community. But I also look in South Central, which actually has more unhoused people than Skid Row, but gets much less funding and much less attention, largely because of race. And in Allegheny County, I look at the way that past legacies of racial discrimination
00:24:04
Speaker
are used actually to rationalize the implementation of this tool and sort of the problems with saying that data is racially neutral, which I think we're all pretty familiar with now that we've had these sort of conversations about policing, right? Like the way that sort of stop and frisk use data or the way that racism skews data, I think is a conversation we're much more comfortable having these days.

Digital Tools in Social Services and Policing

00:24:30
Speaker
So I mean, I feel like that conversation was very much in the book. One of the things that I'm really excited about that has happened since the book is that there have been these sort of intentional linkages that we've started to build across different areas of policing. I think of like lowercase p policing. So Dorothy Roberts wrote this really great review of the book for the Harvard Law Review.
00:24:56
Speaker
that talks about the connections of what I call the digital poorhouse to what she calls the digital carceral state. And it's exactly what I hoped would happen with the book, is that we would start making these connections about how policing operates in different areas, like not just in criminal justice and law enforcement, but also in child protection, like also in public assistance, also in homeless services, that like
00:25:24
Speaker
as my really brilliant colleague Mariella Saba of the Stop LAPD Spying Coalition says, policing wears many uniforms. But these processes of policing show up in all of these different social programs, and how dangerous that is when you start to conflate economic support programs and law enforcement under the same data structure, under the same rubric, and using the same people. I think that's actually an incredibly dangerous thing.
00:25:55
Speaker
Yeah, I'm curious. How do you see people, organizations out there trying to work and remedy the problems that you've identified? I mean, I feel like we can see police violence, right? We can see police officers pointing guns at young black men standing at a bus stop, which is a story that came out this morning.
00:26:18
Speaker
But the issues that you're highlighting are more of these hidden forms of racism and structural racism. And how have you seen organizations working to turn things around? And how have they sort of gone about doing that? Yeah, I mean, I think that's one of the things that makes talking about these technologies so interesting and so important is that
00:26:41
Speaker
Um, we sort of talk about technology, particularly as a tool that is neutral, that is like, you can use it in sort of any old way. I think it's much more useful to talk about tools as like manifestations of structures. Right. So like, of course the technology that we build for the foster care system is going to be racist. Right.
00:27:00
Speaker
In every single county in the United States, there's a problem with racial disproportionality in foster care, and that's affected all of the data, and that affects all of the machine learning, and that affects all of the outcomes of all of these tools. And so it's like this manifestation of the structural problems that we're already facing. And I think you're right, though. I think that when we look at these technologies, the harm looks really different
00:27:28
Speaker
than the interpersonal conflict, right? Like police officer or black youth. But the problem that we have in the United States is not racist cops. I mean, we do have that problem.
00:27:41
Speaker
The problems, though, are structural, are really deep. So even if we replaced every police officer tomorrow with Gandhi, we would still have a lot of these problems. And so that's the thing about talking about the tech, is that it allows us to have those conversations in a way that I think is really, really powerful.

Community Resistance and Organizing

00:28:02
Speaker
So yeah, there is some work going on. I'm probably out of the loop of the newest, most exciting work around that. I was involved as the book was coming out and after the book with a really great project called the Our Data Bodies project that is starting to imagine what community safety looks like
00:28:25
Speaker
in a world of sort of digital surveillance. You know, the Center for Media Justice (I think they changed their name recently, but I don't remember what it is, which is terrible; sorry, guys, you're awesome) have been doing that work. There's also, you know, Data for Black Lives, right? But I think it's not the kind of work that necessarily needs a whole slate of new organizations.
00:28:53
Speaker
it feels to me like it is a layer that we add to the organizing work we're already doing, right? So if you're interested in economic justice, you also have to think about algorithms now, right? If you're interested in police brutality, you also have to think about CompStat, or electronic shackling, right? That it's just, it's a dimension of the work that so many people are already taking on.
00:29:19
Speaker
And I think one of the most important things that I really hope people take out of the book is that none of these systems are inevitable. So Indiana is the perfect example. The state was like, you know what? We're doing this. We don't care what people say. We're going to hold basically no public comment period on this more than billion dollar contract. We're just going to do it. And the citizens of Indiana shut that thing down. They just did old-school organizing, had town hall meetings,
00:29:49
Speaker
went door to door, handed out flyers, and they were just like, no, you don't get to treat us like this. And I think we've seen those kinds of successes around facial recognition technology, around tech workers refusing to work on projects that they find morally reprehensible. So we're seeing that kind of work pop up all over the place. I'm really excited about that moment, the moment that we're in. One of the things that I really still
00:30:18
Speaker
I want to say again, because I so want to see it happen, is like this connection between the policing apparatus of law enforcement and the policing apparatus of the programs that we think of as more charitable or more helpful. Because we have a tendency to think that these tools are okay as long as they're like, quote, just helping and not punishing anybody. But the reality is,
00:30:44
Speaker
Things like child protective services play both a helping role and a charitable role and a punishment role. And so if we don't see that as a policing system, we're really in danger of giving it a pass on things that we would never accept in law enforcement.
00:31:04
Speaker
Yeah, there's this book by Zach Norris, who runs the Ella Baker Center out in Oakland, called We Keep Us Safe. He expands on this exact point, that a lot of what our public services and programs do is about punishment as opposed to
00:31:21
Speaker
trying to keep people safe and get them, you know, get people who need services to the right place where they can be successful in the long term, as opposed to, you know, kicking kids out of school, putting people behind bars, you know, all sorts of other punishments that we inflict on people in our existing public service programs. Yeah, well, I love the Ella Baker Center. They're amazing. And I have heard about this book. I haven't read it yet, but I'm really excited about it.
00:31:50
Speaker
because one of the things that sort of kept me out of the loop for the last couple of years is as I started to write the book, my very, very dear and much beloved partner of many years, more than I care to admit, Jason Martin was attacked and really badly beaten in our neighborhood and ended up suffering from a pretty horrifying case of post-traumatic stress disorder. And living with someone with PTSD,
00:32:17
Speaker
And after the book came out and things calmed down a little bit, we've largely sort of turned our attention to his healing and me being supportive to his healing. And one of the things that's become really clear to me as a partner of somebody with PTSD, and particularly as a partner of somebody with PTSD during a pandemic, is how poorly the systems we hope will work
00:32:44
Speaker
to keep us feeling, and actually physically being, safe and healthy, you know, how routinely and in what life-destroying ways those systems fail us. And I am really interested in figuring out how to live in a world where we keep us safe, and that means something beyond dialing 911.

System Failures and Personal Stories

00:33:12
Speaker
That feels really important to me, not just on an intellectual level, but on a day-to-day, leaving-my-house-to-go-to-the-corner-store level. So that kind of safety or security, particularly community security, feels so crucial to me right now. And the pandemic just makes that all the more obvious, right? Like the pandemic has made it so clear how harmful these cracks in the system are to everyone.
00:33:41
Speaker
I mean, especially to the people who had already been suffering from falling through the cracks before the pandemic, but the pain has, I think, been widespread enough through the pandemic that a lot of people are starting to wake up to the cracks in the system.
00:33:56
Speaker
I want to ask you one last question on that exact note. You've struck both an optimistic and a pessimistic tone that we are seeing these cracks, and there's a lot of work going on to improve the system. I wonder, going forward, and I know we've already talked about, let's not just go into the future, but looking ahead, are you optimistic that things will change for the better?
00:34:19
Speaker
Or, and probably most of us are feeling sort of down and pessimistic these days, but on the things that we've been talking about, are you feeling like things are heading in the right direction or the wrong direction? So that's a great question. Like, do I think things are getting better or worse? I'm going to say both.
00:34:37
Speaker
And I know what a frustrating answer that is, right? So for a long time, I have identified myself, and this comes out of the work I did at the YWCA back in the early 2000s, I've identified myself as a hard-won optimist, in that I feel very aware of the real, life-shattering,

Hope for Change and Activism

00:35:04
Speaker
system-destroying, potentially world-ending catastrophe we seem to be teetering on the edge of. And I also feel like I see so much in the social movements I've been part of, in the community I live in, and in the work that I've just been honored to do with all sorts of different folks, but primarily with poor and working-class communities.
00:35:29
Speaker
I just feel so honored to be in the presence of our continued optimism that we have hope, that we can create change, and that out of this mess that we're in, we can birth the kind of world we want to live in. So I don't think I'm naively optimistic, but I also feel like hope is part of revolution, right? Like you have to live. Barbara Smith, who I did a book with many years ago, is a Black feminist who is a great hero and friend of mine.
00:35:59
Speaker
She says for the revolution to happen, you have to live as if the revolution is always possible. And, you know, I think for me, that feels like a final word. Yeah. Well, I think that's a great place to stop. But I always try to give Barbara the last word. That's a good quote to write down right there. Yeah, that's right. Well, thanks so much for coming on the show. Well, thank you so much for having me.

Supporting PolicyViz Podcast

00:36:35
Speaker
And thanks to everyone for tuning into this week's episode. I hope you learned a lot. I trust you did. If you would like to support the PolicyViz podcast, please consider sharing it on your networks, on Twitter, or leaving a review on your favorite podcast provider platform. Or you can go over to my Patreon page and support the show financially. For just a few bucks a month, you get a coffee mug, or you get my thanks, or you get a note from me every month, giving you the heads up on what's coming up on the podcast. So until next time, this has been the PolicyViz podcast. Thanks so much for listening.
00:37:13
Speaker
A number of people helped bring you the PolicyViz podcast. Music is provided by the NRIs, audio editing is provided by Ken Skaggs, and each episode is transcribed by Jenny Transcription Services. The PolicyViz website is hosted by WP Engine and is published on WordPress. If you would like to help support the podcast, please visit our Patreon page.