
Pieta Blakely and Eli Holder on Data Equity

S9 E224 · The PolicyViz Podcast

Pieta Blakely, PhD helps mission-based organizations measure their impact so that they can do what they do well. She started her nonprofit career as a teacher in workforce development and adult basic education. It was important work and she was worried that they didn’t really know if they were doing it well. In the process of trying to answer that question, Pieta got a Masters in Education and a PhD in Social Policy, and became an evaluator.

Pieta has been an evaluator for over fifteen years, the past five of those as a consultant helping mission-based organizations use evaluation to build better and more effective programs. She believes that evaluation isn’t a test, it’s an ongoing process of trying things, measuring the results, and making adjustments. Her goal is to help build organizational cultures that thrive on joyful accountability and doing important work well.

Pieta is known for explaining complicated things clearly, an emphasis on ethics and justice in evaluation, an understanding of how not-for-profits work, and her unpredictable efforts in vegan and wheat-free baking.

You can read her blog at pietablakely.com or watch her live show, Coffee Time with Masterminds, where she talks about leading mission-based organizations through uncertain times.

Eli Holder is a dataviz designer, researcher, and founder of 3iap, a data visualization design firm. 3iap (3 is a pattern) specializes in psychologically effective information design, approachable analytics, and developing human-centered data products. If you’re a data designer, journalist, or analyst, Eli’s Equity-Oriented Dataviz Workshop can quickly teach your team how to visualize data on inequality, without reinforcing inequality. This covers not only his recent research, but also the underlying psychology and alternative design approaches to conventional (harmful) visualizations of racial outcome disparities. 

Transcript

Introduction to Equity and Inclusiveness in Data

00:00:13
Speaker
Welcome back to the PolicyViz podcast. I'm your host, Jon Schwabish. And on this week's episode of the podcast, I talked to Pieta Blakely and Eli Holder about their work on equity and inclusiveness in data and data visualization. If you've been following my work for the last year or two, you know this has been a big topic of interest for me. I've written a couple of
00:00:33
Speaker
papers about this particular topic, with more that I'm currently working on through the Urban Institute, including two volumes of what we're calling the Do No Harm Guide. So I hope you'll check those out, and I hope you'll check out the blog posts that Pieta and Eli have written, which I link to on the show notes page.
00:00:49
Speaker
So I hope you'll take a listen to this episode, and I hope you'll think about ways in which you can be more inclusive, more thoughtful, and more strategic in the way that you talk about and visualize different groups across the world. So here is my conversation with Pieta and Eli.

Guests' Backgrounds in Data Visualization and Equity

00:01:06
Speaker
Hey, Eli and Pieta. Good morning. Welcome to the show. How are you both? Hello. Great. Thank you. Great to have you both on the show; excited to chat about all the great work you're doing. I thought we would start with some introductions for folks who don't know of you or your work, and then we can get into all the nitty gritty of the content. So maybe, Eli, you'd like to start? Tell folks a little bit about yourself and your background.
00:01:30
Speaker
I'm Eli Holder. A long time ago I studied computer science in school, did my first kind of scientific visualization research in undergrad, got out of school, and started a couple of startups. One of them, again, came back to dataviz, looking at
00:01:48
Speaker
how can we take Fitbit data and personal health tracking and make it less clinical, make it more motivating, exploring the more emotional side of dataviz. And then from there, a few miserable years as a product manager working on kind of data-oriented products.
00:02:06
Speaker
And then 2020 happened. I think we all became a little bit more introspective and retrospective, and I realized that the dataviz side of data, the storytelling and the psychology of it, was just so much more compelling to me than really anything else. And so in 2020 I started working on my own practice, 3iap, kind of in earnest. And since then I have been doing client projects around
00:02:35
Speaker
either communicating data, storytelling kinds of things, or data products, and how you design both of those in ways that are not just informative but psychologically effective, that create the outcomes that we want to see. And yeah, so that's where I'm at. Part of it is that I enjoy the work; it's also an excuse to do side projects. And this one turned into what I thought would be maybe a two- or three-week exploration, and it's now turned into a year-long research project.
00:03:04
Speaker
Yeah. But that's part of the fun. That's why I'm here.

Mix of Computer Science and Psychology in Data Visualization

00:03:08
Speaker
You know, it's so funny, because I talk to so many people on the show, and, you know, folks in dataviz kind of gravitate toward computer science because that's a big part of making data visualizations. But you started with computer science and then kind of gravitated toward psychology, which is the other part of it, which is pretty interesting. That's great.
00:03:29
Speaker
So, Pieta, could you talk a little bit about yourself and then how you and Eli kind of teamed up on these projects? So I am an evaluation consultant. I work for mission-based organizations and I help them measure their outcomes, and all of the work that I do has to do with some kind of marginalized or disadvantaged population
00:03:52
Speaker
and is aimed at building thriving urban communities because that's what I care about. So, you know, under that umbrella, there's a lot of youth work, local community development, a little bit of healthcare, things like that. And I'm often collecting data, right?

Targeted Universalism and Its Application

00:04:10
Speaker
creating a visualization and then talking to people about it because the goal is always, so how can we use this to do better? How can we run our programs better? And so I had a thought one day about targeted universalism and data visualization. And I thought, how would I set up my data visualizations to support a conversation rooted in targeted universalism?
00:04:36
Speaker
And I wrote a blog post entirely based on my thinking.
00:04:42
Speaker
as it was emerging. I was like, here's what I think I would do, and I put it out there. And you picked up on it, and Eli picked up on it, which was really exciting. Eli reached out to me and said, I am trying to implement some of these things you've written about and I'm getting some pushback. And by asking me these questions, which was great for me, because I had just been thinking about this by myself, he really got me to articulate some of the assumptions,
00:05:12
Speaker
you know, that were in that initial piece. And so we decided to take some of our thinking, which was getting so much richer
00:05:23
Speaker
through us being in conversation about it, and write a follow-up blog piece. And he's really taken some of the ideas that I had just thought of, and he's starting to create some experiments and actually demonstrate how these things work in the real world and validate some of that. So that's been really, really cool to see.
00:05:46
Speaker
That's great. So I want to get to the experiment part, but I want to make sure that folks are sort of on the same ground, I guess, that we all start in the same place. So I was hoping that you could start by just defining targeted universalism for folks. Yeah, that's a big phrase. And then I think we'll need Eli to define deficit framing, because that's another big part of that. So targeted universalism is an approach
00:06:13
Speaker
I've seen it a lot in education, but I think it applies in all kinds of fields. We have one universal goal for everybody and then targeted approaches to help different populations reach that goal. So it's a middle ground between a purely universal approach, where everybody gets the same thing, and then if you don't do as well, well, we gave you what everybody had, right?
00:06:42
Speaker
and targeted programs which
00:06:45
Speaker
focus on particular populations. And then some people say, well, this is not fair, different people are getting different things, whatever those critiques are. So targeted universalism says, look, we've got high standards for everybody, and some populations have specific barriers, and we're going to create targeted programs to make sure that they can reach the targets that we've got for everybody. And so that was what was in my head when I started that first
00:07:14
Speaker
set of charts. And so you're thinking about this in the context of evaluations, right? You're thinking about, if a firm or a researcher is doing an evaluation, how do they, for example, target a particular policy?
00:07:33
Speaker
Yeah. How do they target particular populations? Could we say, hey, are girls doing really well in this program? Do boys have some different barriers to accessing this material? Do students who are speaking English as a second or third language have different barriers to getting the best out of this program than other people do? That was the kind of thinking. Gotcha. I think that there was
00:08:00
Speaker
an assumption embedded there that I did not say explicitly in the blog and that only got said out loud later, which was: I thought that you had to design your data visualizations to support different kinds of conversations, and that people looking at the visualization could have different conversations based on how you design that visualization. That's the important thing that I omitted from the initial piece. Right. Yeah.
00:08:30
Speaker
Yeah. Okay.

Avoiding Deficit Framing and Racist Narratives

00:08:32
Speaker
So now we've got deficit framing. I don't know exactly how they fit together, whether deficit framing subsumes it. But anyway, Eli, I'm hoping you could give people sort of a framework for deficit framing.
00:08:46
Speaker
So deficit framing is a phrase that I kind of made up or that came out of the paper. It refers to a framing effect on a chart that leads to deficit thinking. Deficit thinking, I think, is the more common term.
00:09:01
Speaker
And it's a term that I wouldn't have known about if it weren't for Pieta's posts. I'd just never come across it prior to this. I think it's used a lot more in the education space, which I'm not as close to. But what it describes is this tendency to generally think about outcomes for minoritized groups only in relative terms to outcomes for majority groups or for
00:09:31
Speaker
groups with better outcomes. And one of the main harms that comes from it is this tendency to conflate the outcomes with qualities of the people. So to blame outcomes on the people being visualized as opposed to
00:09:48
Speaker
more systemic reasons or more external factors. And so it's kind of a nasty form of victim blaming, essentially, that comes out of it. I think I've been working with a pretty narrow definition of it; there's certainly more to that concept. But for the purposes of how this works with charts and graphs and data design, I kind of latched onto the part around personal attribution and how certain charts can lead to that. Right. Yeah.
00:10:18
Speaker
I think that's really the key thing, right? That, you know, how do we avoid looking at a disparity or difference and then blaming people with all these racist narratives that have always been used to explain different outcomes?
00:10:34
Speaker
Right. So the example that you sort of anchor both of these posts on, I'll just describe it here so folks can get it in their head, and then I'll let you all dive into it. But it's basically the percent of students who achieve some threshold on a test score.

Multiple Graphs and Chart Order Discussion

00:10:51
Speaker
Right. And so you've got this line chart and you've got four different lines for four different racial groups: white students, Black students, Asian students, et cetera, et cetera. And so the argument that you both make in these posts, and similarly the argument that Alice Feng and I made in the Do No Harm Guide, is, should that be one graph or should it be multiple graphs? And so I'm going to leave that as the
00:11:16
Speaker
context. So for people who haven't seen it, well, they should go read it, but you have this one graph versus these sort of small multiple graphs. So maybe, Pieta, I'll let you talk a little bit about it and we can chat through it. So there were two reasons why I thought it should be multiple graphs. One was that in that initial graph, there is no target for everybody.
00:11:40
Speaker
And so you end up with the white group becoming the benchmark, and that centers whiteness in ways that I think are really problematic. The other thing was that it leads me to obviously read it as a comparison between the groups, right? I'm going to say,
00:12:00
Speaker
well, yeah, the Black students are not doing as well as the white students, which just leaves this perfect-sized space to think something like, they're not trying as hard, a space we just fill in with assumptions and bad things. What I wanted to demonstrate was the space between each group and where we want them to be,
00:12:25
Speaker
which allows for, I think, a much more nuanced conversation, where we actually think about each group individually and what their strengths and their barriers might be. Right. Yeah. So the one graph where they're all together asks, I mean, depending on the context, right, what is the highest group or the lowest group, and is that the baseline? Is that the goal? And that sort of centers, in this case, white students, who score the highest. Yeah.
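To make the small-multiples idea concrete, here is a minimal matplotlib sketch. It is not taken from either guest's posts; the group names, the values, and the 80 percent target line are invented for illustration. Each panel plots one group against the same universal goal, so the shared dashed target line, rather than any one group's line, is the reference, and the alphabetical panel ordering discussed next falls out of the sorted() call.

```python
# Sketch of the small-multiples idea: one panel per group, each judged against
# the same universal target rather than against the highest-scoring group.
# Group names, values, and the 80% target are made up for illustration.
import matplotlib.pyplot as plt

years = [2018, 2019, 2020, 2021]
pct_meeting_goal = {                    # hypothetical "% of students meeting the goal"
    "Group A": [55, 58, 60, 63],
    "Group B": [40, 44, 47, 52],
    "Group C": [62, 61, 65, 68],
    "Group D": [48, 50, 49, 55],
}
TARGET = 80                             # one universal goal, shared by every panel

fig, axes = plt.subplots(1, len(pct_meeting_goal), figsize=(12, 3), sharey=True)
# sorted() orders the panels alphabetically, the default ordering rule discussed below.
for ax, (group, values) in zip(axes, sorted(pct_meeting_goal.items())):
    ax.plot(years, values)
    ax.axhline(TARGET, linestyle="--", color="gray")   # same target line in every panel
    ax.set_title(group)
    ax.set_ylim(0, 100)
    ax.set_xticks(years)
axes[0].set_ylabel("% meeting goal")
fig.suptitle("Each group measured against the shared goal, not against each other")
plt.tight_layout()
plt.show()
```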
00:12:55
Speaker
So I wonder then, to take it a next step, when you think about, let's say there are four groups, how do you think about ordering or aligning those four charts next to each other? Oh, that is a really good question, which I did not think about in that first iteration. And I think the best answer I have at this point is probably alphabetically.
00:13:24
Speaker
I mean, I don't know, Eli, do you have any thoughts on this? But this to me is one of the big challenges. So the ordering effect, I think, is definitely important.
00:13:36
Speaker
The context for me is that I've done this in reports where there are just multiple different cuts of data involving race or gender or things like that. And if you can establish early on that all these groups are ordered alphabetically and that's just the rule across the board, you've at least established, here's the reason for it.
00:13:59
Speaker
Yeah. It doesn't always work. There will be cases where it does have a centering effect, where it just doesn't feel good to have certain groups at the top or at the bottom. Yeah. And for stuff like that, I've never been shy about saying, I'm going to reorder it so that the group that needs the most attention, or the group that's most relevant to this conversation, is front and center. Yeah. But I do want to
00:14:25
Speaker
push back on, or dive into, the question of separate charts versus the same chart. And we can get to that in a second, but I think there's a little more nuance there that's worth unpacking, too. Yeah, absolutely. I think you're right. The thing about breaking the chart up is that the alphabetical ordering is sort of objective once you pull away all the other stuff. And even if you order it by value,
00:14:53
Speaker
you can do that by magnitude or whatever it is. It's still an objective ordering, but it's not putting them all together on the same graph, where you have this competing effect. I think for my work, it's like, I'm just going to be upfront about how I've made this decision, and that's the decision. Let's dive into this single versus multiple panels question.
00:15:16
Speaker
Yeah, so I think, to one of Pieta's original points around this, the centering effect is definitely a good reason to separate. And I think for the purposes of goal setting, the majority group can tend to act as a benchmark, and
00:15:36
Speaker
that's not always good, because who's to say that the majority group has the best outcome? So I think those are good reasons to separate, where you need to show different goals and create that emphasis. But the harms that come from blaming people individually or blaming people personally, these personal attributions.
00:15:57
Speaker
At least a big chunk of that doesn't seem to be related to whether they're separate or not. It's closer to the way that you present it, even on the same chart. The research that I was doing

Reducing Personal Blame with Data Variability

00:16:12
Speaker
All the different variations of charts that we tested showed the outcomes on the same chart, but we varied the way that we showed them. And we were able to, even on the same chart, significantly reduce the viewer's tendencies to make these kind of personal attributions. It's essentially stereotyping. And so I think that the benefit of having them in separate charts is it takes emphasis away from these direct comparisons.
00:16:40
Speaker
which can be harmful, but there's ways of doing that that don't require necessarily splitting it out. I think for stories where it's maybe high risk and you're worried that it could be misperceived, for stuff like that, I'll lean pretty heavily into any trick I've got to water down the differences or make sure that, not water down, but
00:17:00
Speaker
to clarify the differences, I'll do that. But I don't think it always needs to be the case that you separate, if you have other options to make sure that people don't fool themselves into these personal attributions. And so, do you want to talk about what those are, if that's helpful?
00:17:20
Speaker
Yeah. Yeah. Well, I was just going to ask, is that a visual component in the graph itself, or is it the text that you use in and around the graph? It's probably going to be both, but one of the big takeaways is definitely understanding that, by default, a lot of people jump to these blaming tendencies.
00:17:38
Speaker
And this isn't like a new concept. This is like something that we all learned about in intro to psychology, like fundamental attribution error, correspondence bias. Just as people, particularly as Americans or people that live in individualistic or Western cultures, we jump to personal blame much faster than we jump to looking to external reasons for any kind of outcome or behavior.
00:18:04
Speaker
And so if you take that as a premise, and you assume that, given any chart, a lot of people will tend to explain it in terms of these personal attributions, that's the main thing that we're trying to solve for. And you can solve for that in a couple of different ways. I think the annotation layer and the text around it, this is
00:18:28
Speaker
separate from the research project that I was doing, but I think it's actually probably an even more important component. Do as much as you can to frame it in terms of, the outcomes that you're seeing could likely be caused by these external factors X, Y, and Z, or these systemic factors X, Y, and Z.
00:18:50
Speaker
A lot of the research that looks at misperceptions around causality shows that if you can provide people with alternative explanations for what they're seeing, their tendency to jump to superstitions or other false conclusions around causality diminishes. And so the annotation layer and titles are a great place, I think, to do that.
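As an aside, here is one way the annotation-layer idea might look in code. This is a hypothetical sketch, not a chart from the research Eli describes: the groups, the numbers, and the external factors named in the subtitle are placeholders. The title names the pattern, and the second line points the reader toward external explanations instead of leaving the causal gap open.

```python
# Hypothetical sketch: even on a conventional chart, the title and annotation
# can supply external/systemic explanations so viewers don't default to
# personal blame. Groups, values, and the factors named are placeholders.
import matplotlib.pyplot as plt

groups = ["Group A", "Group B", "Group C"]
values = [63, 52, 68]                      # hypothetical "% meeting goal"

fig, ax = plt.subplots(figsize=(6, 4))
ax.bar(groups, values, color="#999999")
ax.set_ylabel("% meeting goal")
ax.set_ylim(0, 100)
# Two-line title: first line names the pattern, second offers the alternative
# (external) explanation rather than leaving the gap to stereotypes.
ax.set_title("Outcomes differ across groups\n"
             "Gaps track differences in program access and funding, not effort or ability",
             loc="left", fontsize=10)
ax.annotate("Hypothetical data for illustration", xy=(0.99, 0.01),
            xycoords="axes fraction", ha="right", fontsize=7, color="gray")
plt.tight_layout()
plt.show()
```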
00:19:18
Speaker
And then within the charts themselves, what we found was that it has a lot to do with showing variability. The charts that tend to be the most problematic are charts like bar charts or dot plots or even confidence intervals. These are all charts that really emphasize the average group outcome. And they don't show anything about the variability of outcomes within a certain group.
00:19:45
Speaker
And for something that's as loosely defined as race, for example, if you're looking at something like earnings by race, you'll always have a lot of people in any group who earn very little, and you'll always have a lot of people in any group who earn a whole lot,
00:20:02
Speaker
even if the average outcomes for those groups are different. So by showing the variability of those outcomes and making sure that it's clear, using something like a jitter plot or a prediction interval, you can help people see not only that there are a lot of differences within groups, but that the differences between groups aren't actually as pronounced as something like a bar chart makes them seem.
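Here is a minimal sketch of the jitter-plot idea, using simulated data rather than anything from the study: individual outcomes get a little horizontal jitter so the spread within each group stays visible next to the group mean, instead of each group collapsing to a single bar.

```python
# Illustrative jitter plot with simulated data: within-group spread stays
# visible, so small gaps between group means aren't read as categorical
# differences between the groups themselves.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
groups = ["Group A", "Group B", "Group C"]
group_means = [52, 48, 55]                 # hypothetical average outcomes
samples = {g: rng.normal(loc=m, scale=15, size=200)
           for g, m in zip(groups, group_means)}

fig, ax = plt.subplots(figsize=(6, 4))
for i, g in enumerate(groups):
    x = i + rng.uniform(-0.18, 0.18, size=samples[g].size)   # horizontal jitter
    ax.scatter(x, samples[g], s=8, alpha=0.3)
    ax.hlines(samples[g].mean(), i - 0.25, i + 0.25, color="black")  # group mean
ax.set_xticks(range(len(groups)))
ax.set_xticklabels(groups)
ax.set_ylabel("Outcome (simulated)")
ax.set_title("Within-group variation dwarfs the gaps between group means")
plt.tight_layout()
plt.show()
```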
00:20:27
Speaker
And so it becomes much more obvious that something like race is not a good predictor of an outcome like income, because you can see how much variation there is within any given group for whatever that outcome is. Right. So I'm curious then, Pieta, about the folks that you work with,

Resistance to New Chart Types

00:20:47
Speaker
who are probably, I would guess, generally more accustomed to line charts, bar charts, pie charts, that world. Have you had the experience of telling people, yeah, let's try a jitter plot or a beeswarm chart, you know, or something like this? And what is their reaction? With this framing of, because we're trying to do this, what is their reaction to stuff like that? Yeah. I mean, I think,
00:21:08
Speaker
I think because Eli does even more of this work, he's gotten even more pushback, but yeah, definitely. People are like, oh, but this is just how we've always looked at it. These are the charts that people know how to read. Yeah, absolutely.
00:21:23
Speaker
And my data visualization work is not as sophisticated as some people's. But yeah, I get tons of pushback on it, even on just basic improvements.
00:21:38
Speaker
But it's interesting, because I'm guessing that you're talking to your clients or partners and you're saying, we want to take this different framework where we're not ranking or implying this or that, and so here's an alternative. But I'm guessing they're still pushing back. I mean, they probably agree with you on the framing piece, but they're still pushing back because, well, it's a bar chart, right? Right. Yeah, exactly. This is what we know how to do.
00:22:06
Speaker
I think Eli, early on, came across a much more interesting objection. Yeah, this is exactly how we first started talking. The level of objection that I've gotten is just, oh, this is what we're used to, you know, kind of the pie charts, this is what we know how to read. The objection that Eli came across, which I think is way more interesting and important, is, oh, but everybody knows, right?
00:22:34
Speaker
We just need to point out the disparity because everybody who reads our chart knows the disparities are caused by structural racism. That's a really interesting problem because the answer is no, you can never assume that your chart is only going to be read by people who read charts or visualizations from the same perspective that you do. And so part of this is like sending your
00:23:03
Speaker
your work out into the world in a way that is complete, in a way that kind of guards against those kinds of readings. Yeah. I'm also curious: we've been talking about race and ethnicity, and that was sort of the grounding of the posts.

Broadening Visualization Principles to Other Contexts

00:23:22
Speaker
I'm curious if you've done any thinking about
00:23:26
Speaker
other characteristics. I mean, internationally, there's certainly the ranking of, you know, less developed countries versus developing versus developed. There are issues around gender and sexuality and all these things. So have you thought specifically about these other groups, or is your thinking right now still sort of evolving to spread out? I think it applies
00:23:50
Speaker
to anything where you have groups of people and there could be stereotypes about those people. So I've been thinking about road safety and Vision Zero, where a lot of cities have a target of zero road deaths. Does this apply? Do we need targeted approaches for
00:24:14
Speaker
pedestrians, people with physical disabilities, cyclists, drivers? Are there stereotypes about cyclists? You know, or scooter users? Like, yeah, definitely. I think anything where you think, oh, in the absence of data, people are going to decide there's a story that fits in here, right? Yeah.
00:24:39
Speaker
The driving one and the road safety one is one that I've started noticing a lot recently. So now that I have a...
00:24:48
Speaker
clear understanding of how pervasive the personal attribution bias really is. I see it everywhere, and I can't unsee it. It's crazy in that way, but it's pretty eye-opening. I think one of the topics that is just full of victim blaming is road safety.
00:25:11
Speaker
So there was an infographic that came out a few years ago relating driving time to obesity, and I think the title on the infographic was "driving is making you fat." But in reality, it's not.
00:25:28
Speaker
In the United States, outside of maybe three or four cities, you don't have a choice but to drive. It's not the driving that's doing it; it's not your personal choice. It's years and years of suburban sprawl and years and years of transportation policy that force you into cars. And it also has something to do
00:25:48
Speaker
with the cost of housing that's proximate to your workplace, which makes cycling to work not a feasible choice, and with the fact that your lower income status is also influencing your access to outdoor space and healthy food. Yeah. I wanted to finish up by asking what you think folks in the dataviz field.

Aligning Visualization with Values and Ethics

00:26:12
Speaker
I'm pausing here, I'm just thinking for a moment because
00:26:17
Speaker
I'm saying dataviz field, but I think it's actually broader than that, so let me just say people, then: people who write about numbers sometimes. What should people who write about numbers in their organizations do? Because I think that's the other piece of it.
00:26:36
Speaker
You know, for example, Pieta, I think your experience is probably the prime example. You could be all about trying these other methods and pushing your clients, and you can imagine someone in an organization getting the same pushback that you're getting: well, we always make our charts this way and that's it. So what do you think people should do? What should their first step be to put the lessons in the stuff that you're talking about and writing about into place? You know, I think
00:27:05
Speaker
we do not think enough about how things like data visualization should fit with the overall philosophy and guiding principles of our organizations. For one, I think we assume that charts are just neutral,
00:27:22
Speaker
and, like, they're numbers so they must be true, and however they come out of your software is fine. We never think about what it means if our organization has these beliefs and tenets about how we operate and how we treat people.
00:27:41
Speaker
I think a lot of organizations recently are starting to think, well, that has effects for what words we use, you know, how we write about people in text. Well, the next step is: how does it affect how you write about people in charts and graphs?
00:28:00
Speaker
Right. I don't have an answer for you. That is a conversation to start having in your organization, to start thinking about how this matters: in the same way we can do harm with our words, or be more careful and thoughtful with our words, we can do it with data visualizations too. Yeah.
00:28:20
Speaker
Eli, any thoughts on this? Yeah, on getting started. So I think there are two things. One is, and you've kind of hinted at this, you can maybe expect pushback of the, what is a jitter plot, variety. And a lot of that, I think, stems from this false notion that we need simplicity, that we need it to be as
00:28:41
Speaker
simplistic as possible to reach a wide audience. And I think that's a false trade-off. With the right designs, you can still get the message across. But what you need to remember is that with these more simplistic designs, you're creating awareness, but you're not necessarily creating awareness around the right thing.
00:29:02
Speaker
So it's not enough to teach the whole world that there are these wide disparities between different groups. The actual goal is much closer to, we need to teach that these disparities exist because of these external factors, because of these systemic factors.
00:29:23
Speaker
You can kind of fool yourself into thinking that something like a bar chart is better because it'll reach a wider audience. And maybe it will, but is it reaching the audience with a message that will actually help solve the problem or not? And the other one, which is more targeted toward designers and people who do this in general, is this:
00:29:46
Speaker
I mentioned this before, but realize how deeply embedded this concept of personal responsibility is within the American mindset. It creates this trap when you're exploring data or you're trying to figure out what causes these outcome differences. It
00:30:04
Speaker
kind of tricks your brain into stopping at answers that just aren't as enlightening or compelling or interesting. A much bigger thing is going to be typically some external factor. And so if you stop yourself from this habit of immediately jumping to blame and stopping at blame,
00:30:20
Speaker
and just ask yourself, OK, what are the external factors? What are the systemic factors? You'll get to much richer answers, much more interesting and enlightening answers, I think, for you and your audiences. And that, I think, would be my main ask for anybody communicating numbers: think about the external

Questioning Traditional Disparity Charts

00:30:42
Speaker
factors. Try to make sure that your audience is thinking about the external factors. I think we've looked at these disparity charts
00:30:49
Speaker
for years and years, and we have not solved the disparities yet. So that's probably not the most useful visualization. I think, Eli, to your point: how do we set up these visualizations to help us really think deeply and in much more creative ways about these issues? I think that's really important.
00:31:13
Speaker
Yeah. Terrific. Pieta, Eli, thanks so much for this conversation and the work. I look forward to seeing where you go with it. There's a lot more to do, I'm sure. There's a lot more to do. I mean, this is really just the beginning of a conversation, and we're inviting everybody to join in. That's great. Well, I'll link to all the blog posts and everything in the show notes, so people should check them out. But yeah, thanks to both of you for coming on the show. Really appreciate the chat.
00:31:41
Speaker
John, thank you very much for having us. Thanks so much, John. Appreciate it.
00:31:45
Speaker
Thanks for tuning in to this week's episode of the show. I hope you enjoyed that. Maybe you learned a little bit, thought about some strategies you might like to implement in your work, in your visualizations, in your data analysis. And be sure to check out all the links on the episode notes page; there are a bunch of papers and blog posts there, and I hope you'll check them out. If you want to learn more about this, you should check out some of the resources on the Urban Institute website, with lots of different toolkits and other things that you can use in your own work. So until next time, this has been the PolicyViz podcast. Thanks so much for listening.
00:32:15
Speaker
A whole team helps bring you the PolicyViz podcast. Intro and outro music is provided by the NRIs, a band based here in Northern Virginia. Audio editing is provided by Ken Skaggs. Design and promotion are created with assistance from Sharon Sotsky-Ramirez, and each episode is transcribed by Jenny Transcription Services. If you'd like to help support the podcast, please share and review it on iTunes, Stitcher, Spotify, YouTube, or wherever you get your podcasts.
00:32:39
Speaker
The PolicyViz podcast is ad-free and supported by listeners. If you would like to help support the show financially, please visit our Venmo, PayPal, or Patreon pages, all linked and available at policyviz.com.