
Episode #71: Tracy Gordon

The PolicyViz Podcast

Hi everyone! Welcome back to the PolicyViz Podcast! On this week’s episode, I’m excited to welcome my Urban Institute colleague Tracy Gordon to the show. Tracy is a senior fellow with the Urban-Brookings Tax Policy Center, where she researches and writes...

The post Episode #71: Tracy Gordon appeared first on PolicyViz.

Recommended
Transcript

Introduction and Sponsorship

00:00:00
Speaker
This episode of the PolicyViz podcast is brought to you by JMP, Statistical Discovery Software from SAS. JMP, spelled J-M-P, is an easy to use tool that connects powerful analytics with interactive graphics. The drag and drop interface of JMP enables quick exploration of data to identify patterns, interactions, and outliers.
00:00:19
Speaker
JMP has a scripting language for reproducibility and for interfacing with R. Click on this episode's sponsored link to receive a free info kit that includes an interview with DataVis experts Kaiser Fung and Alberto Cairo. In the interview, they discuss information gathering, analysis, and communicating results.

Guest Introduction: Tracy Gordon

00:00:49
Speaker
Welcome back to the PolicyViz podcast. I'm your host, Jon Schwabish. On this week's episode, I'm very pleased to welcome my Urban Institute colleague and pal, Tracy Gordon, a senior fellow in the Tax Policy Center here at Urban. Tracy, welcome to the show. Hi. Thanks for having me. Otherwise known as Trago. Yes. Perhaps the best nickname here in the building. It's a very good nickname. It's a very good nickname. You should really go by Jay Schwab. Have you thought about that?
00:01:14
Speaker
I've tried it. It doesn't quite roll off the tongue the way Trago does. I need to change my Twitter handle and do more branding. Just go with Trago. Change my legal name. Do the whole thing. This is going to be a fun little conversation, I can tell.

Social Science Methods and Causal Relationships

00:01:29
Speaker
Let's talk about research. You do a lot of work with state and local finances, taxes, and we're going to talk about a cool new product that you have that's just come out in a little bit.
00:01:40
Speaker
But we've been talking over the last few weeks, sort of here and there, about some of the things that we've seen that are, I wouldn't even say wrong with the social science field, but maybe lacking. That's a good term. So how are you feeling these days about the work that we're doing — and "we" being sort of the entire field of social science research? Right, right.
00:02:01
Speaker
It's funny, when you frame the question that way, I think of the State of the Union, and I feel the need to say the state of social science is strong. We've perhaps never lived in better times in terms of being able to discern relationships between things. There was a great graphic, actually — your audience is expert in this — in The Economist about the use of different types of evidence,
00:02:23
Speaker
you know, what's hot at various points in time. Now it's regression discontinuities and RCTs, randomized controlled trials. For a while, it was instrumental variables. So there are these flavors and fashions. But for the most part, I think we're getting better and better at uncovering causal relationships. But I think, you know,
00:02:41
Speaker
this podcast originated out of a conversation where you and I were leaving the building and standing on the steps and saying, okay, so what? So you've identified this causal relationship where you think you have based on using the latest tools and techniques, but maybe you have this sinking feeling that you haven't quite uncovered the truth. And you know, what is it all for?

Economists in Policy: Role and Critique

00:03:00
Speaker
And so this is kind of a long standing frustration of mine. And I don't know that I have answers. I think I have a lot of questions, but
00:03:07
Speaker
You know, throughout my career, I've kind of tried to straddle this seam between research and policy. And I think, having worked in government and having been around a lot of people who've worked in government, I'm not sure we're training people to be effective as social scientists influencing policy. My degree is in public policy, although I did most of my coursework in economics. And when I think of economists in the policy process,
00:03:31
Speaker
I remember a meeting at the National Tax Association, which is more fun than it sounds. It is more fun. Not just economists, but lawyers and accountants, too. I mean, it's not that much fun. But they had the major advisors to the presidential campaigns at the time. It was 2008.
00:03:50
Speaker
and they asked them sort of how they did their jobs and what was surprising to them. And what was surprising to me was how much personality really mattered, because so much of what we're taught in graduate programs in economics is this kind of perfectionist criterion: basically, do a proof, show that you're right, QED, I'm done, drop the mic, that's it. And from my very brief experience in government, if you do that in a meeting, people are like, that's nice, but we've got this problem here that we need to solve.
00:04:17
Speaker
And I know a lot of economists who have served in government will say, you know, my biggest value added was not getting something through, but shooting down dumb ideas. And I feel like if that's the way you think of your job, then no one's ever going to invite you to

Training Social Scientists for Policy Impact

00:04:30
Speaker
a meeting, right? You're the guy who says no all the time. Nobody wants you around.
00:04:33
Speaker
What we don't teach people is how to take an idea that may be flawed and make it better — to shape it. And I think that in law schools, my sense is that people get that a little bit. There's a little bit more thinking strategically or working in the real world. And I fear that we don't teach that. So, you know, again, I come from a public policy school, too. And
00:04:53
Speaker
Public policy schools came about because people like Aaron Wildavsky, who founded my program at Berkeley, were concerned about not enough analysts having the right toolkit. So it was very focused on just getting the tools, not a lot of normative theory, and using the answer from cost-benefit analysis or whatever to speak truth to power. And I think that's a great idea. But again,
00:05:12
Speaker
you know, sort of, how do you make sure that you're getting information into the right hands and that it's used correctly? One of my favorite economists, Charlie Schultze, who unfortunately passed away recently, has an article in the Journal of Economic Perspectives where he says at the end of it: forget about any theory of the second best — this idea that you could fudge perfectionism a little bit and might actually achieve better outcomes in a second-best world.
00:05:36
Speaker
He said, in reality, it's much more about taking the sixth best to the third best. And really listening — asking a policymaker, what are you trying to achieve? Because they might not even know. They may just say, I want X. And if you could say, well, I can get you X prime, but it's going to require giving up this and maybe getting that — I just think it makes us much more effective. And it would bring about better policies. So that's my soapbox.
00:06:05
Speaker
Okay, so let's start with graduate school. I've said for a long time that when I was in graduate school — and it was similar for you — I wasn't really taught how to present. I wasn't taught how to make graphs. I wasn't taught even how to write well. I mean, I was taught to write for a journal, which is not writing well. None of that communication was actually part of the education. It was like, do the proof, write QED, and move on to the next thing. And you sort of mentioned lawyers and some other fields.
00:06:33
Speaker
people in healthcare and people in the legal profession, they have to do these externships or they have to do these residency programs. I mean, is that what we need in social science where you need to take a semester and go work for a nonprofit and you need to have an internship and actually do the job? I always feel like I wasn't taught how to be an economist in graduate school. I had to learn when I went out to work. Maybe that's how it is for most people, most fields. But we're also trying to talk to policy makers about changing policies.
00:07:01
Speaker
And if you go straight into academia from graduate school, you could live your whole life without having that experience. Which, you know, has its place, right? Because in social science, we like to think that it's a pure science and we can live this life of the mind and uncover actual truths. And so that's wonderful. But I guess what concerns me is I think there's almost this law of inverse appeal — that you're not a serious economist
00:07:23
Speaker
if you think about presentation. You know, that it's the really serious guy who's still using transparencies and a marker, covering them up with a piece of paper so there's the big reveal when you've got the QED at the end. That's right. But they all melt together. Yeah, yeah, yeah. But yeah, and so I like the idea of an internship. And for that matter, I wanted to have an internship here for policymakers, to come learn what we do. Yeah. So yeah, we should all
00:07:51
Speaker
talk more and, you know, sort of learn to respect each other's professions. I mean, I think there's obviously a place for the people who are doing the theory, right — doing the "let's expand the theoretical concepts or the econometric techniques" work, because we're going to need to use them. But I'm a little frustrated, which I think you are as well, that
00:08:09
Speaker
anyone can run a regression now, we click a

Effective Communication in Economics

00:08:12
Speaker
button, we run a regression, and I think the field sort of glosses over a lot of these econometric problems, like omitted variable bias — all these biases where we sort of wave our hands and say, okay. Whereas maybe it's just the fundamental variation, or a simple correlation, that's the most important thing. So I remember one of these fads for a while was non-parametric estimation, which someone said
00:08:38
Speaker
to me was sort of "let the data speak." And I still like that. If you can't see a basic relationship between two variables in the means, then there probably isn't anything there. And I've tried and tried to make something happen when there's not that fundamental relationship. So yeah, I think you can learn a lot from a table or a scatterplot or a good visualization, obviously. But it concerns me that we would also then be more lackadaisical about the rules of evidence. And
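The "let the data speak" check Gordon describes can be sketched in a few lines. This is an illustrative example with simulated data, not anything from the episode: before fitting an elaborate model, look at the raw correlation and the difference in group means.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: two variables with a weak true relationship
x = rng.normal(size=200)
y = 0.1 * x + rng.normal(size=200)

# The basic checks: a simple correlation, and the difference in
# mean(y) between high-x and low-x observations. If neither shows
# anything, an elaborate model probably won't rescue the analysis.
r = np.corrcoef(x, y)[0, 1]
high = y[x > np.median(x)].mean()
low = y[x <= np.median(x)].mean()

print(f"correlation: {r:.2f}")
print(f"difference in means (high x vs low x): {high - low:.2f}")
```

The point is not the code but the habit: eyeball the scatterplot and the means before reaching for the estimator of the season.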
00:09:05
Speaker
I was at a conference once where someone said — she said, I was trained like everybody else here, and we all know that randomized controlled trials are the gold standard, but then we know what happened to the gold standard, didn't we? And I just thought, that's a strange analogy. And — hopefully she's not listening to your podcast — she was promoting what she referred to as video evaluations: program evaluations done by video,
00:09:27
Speaker
which I ran by a friend of mine, and he said, oh, you mean a commercial? And I think you and I were both at a talk where someone referred to "research evidence," with "research" as a modifier, because there are other types of evidence. So yes, we'd never want to rely exclusively on research. Nevertheless, the whole point is to follow the scientific method, not to trust your gut.
00:09:53
Speaker
And being in policy, by the way, is a good way of testing that. And I remember there was one conversation I was in about a program where some of us were pushing for better evaluation and this guy who was very
00:10:05
Speaker
enthusiastic about the program, and very smart, and very well read about all these methods — his idea of evaluation was data collection. He said, we're doing evaluation, we're collecting metrics, and not only are we collecting metrics, we're using those metrics to feed back into the program and do continuous quality improvement. And of course my head hit the table, because that's not an evaluation — you've just ruined your evaluation, basically. It was good for me to have to say: if you care about this policy,
00:10:32
Speaker
then you need to be able to say why it works, why we need to bring it to a million other places — not just this one place where,
00:10:40
Speaker
you know, they were experiencing these very dramatic improvements in graduation rates. So is it about researchers being able to communicate? Is it about researchers recognizing that not everyone understands what heteroscedasticity is? Is it about researchers in some ways getting back to first principles? Where if you were queen of the researchers, which would be an awesome title. Which I actually am.
00:11:10
Speaker
But you are, okay. So if you're a queen of the researchers, what? Okay, so let's put aside the graduate school stuff.
00:11:16
Speaker
Yeah. Say no one new is coming out of graduate school, so you're just dealing with, let's say, the Urban Institute or places like us. What would your first couple of policies be to say, here's how we're going to try to improve the way our research actually makes a difference or affects policy, or whatever it is? Is it on the communication side, or is it...?

Research with Direct Policy Applications

00:11:35
Speaker
I think it's, it's even, you know, formulating the questions because I think a lot of what we refer to as policy oriented research is kind of
00:11:43
Speaker
here's this neat question that I have in mind, that I can look into because there happens to be this county-level variation, and someone designing a revenue system at the local level should just naturally pick up my paper and be interested in it, because the question is
00:11:59
Speaker
plausibly related to what they do. I hate that. I feel like that's offensive. That person, who's trying to figure out how to raise revenue in probably the most efficient and equitable manner, just in terms of keeping her job — they don't have time to decipher it. I remember during the financial crisis, the podcast Planet Money had a contest where they asked people to translate the abstracts from economics journal articles.
00:12:22
Speaker
And one of them was for something about economies of scope in the banking industry, which was so pertinent when you're talking about breaking up banks and too-big-to-fail and all that kind of stuff: is there some benefit to size, to the ability to offer different kinds of products? But this abstract was impenetrable. And so one of the winning translations was something like,
00:12:42
Speaker
"We think things are cool, but we were too shy to talk to any actual bankers. So we found this data and we ran some regressions, and here's what we found." So I mean, that's, I guess, talking to people, right? I think you can overdo the video interviews and you can overdo the case studies, but really talking to people who are affected by policies — oftentimes they intuitively know even the identification issues that we spend a lot of time worrying about, based on the complexity of these programs.
00:13:13
Speaker
But how do you institutionalize it? So we've talked about an internship program, maybe even more collaborative projects — having people actually write research papers together. And it is interesting, right? Because we use big data, or not even big data — large data sets, small data sets — and there are people behind those, and yet we never actually talk to
00:13:33
Speaker
them. Right, right.
00:13:53
Speaker
That's not what a journalist knows how to do, either. Everyone has their little specialty, I guess, and yet the paths don't really cross. That's what we need more of. Then again, it's exciting that people in policy are at least reading from the same prayer book when it comes to evidence and data, and that folks in journalism are producing these amazing examinations where they really get their hands dirty, dig into data, and produce these visualizations.
00:14:22
Speaker
I was at a talk where someone pointed out the footnote to one of these visualizations, which said that it was created with sort of a dash of kernel estimation and then a little bit of this, a little bit of that. It basically wasn't clear what this map was. It was a lot of vegetation. Yeah. And so, like you were saying, it's become very cheap to run a regression. It's become very cheap to do a lot of what's now called data science, I guess.
00:14:45
Speaker
But how much of it is really following the scientific method and making sure of just very basic stuff, like: are you measuring what you think you're measuring?
00:14:53
Speaker
I remember taking a class in qualitative methods that was taught by someone who did not believe in qualitative methods. She said, it's all just small n, right? Basically, you're doing statistics with a very small sample, so you really have to worry about generalizability and sample selection. And I thought that was a great point, actually. Yeah. Oh, I know one, too. There was a book called Diverse Tools, Shared Standards,
00:15:19
Speaker
which emphasized that we're all trying to ask the same questions. I think it was edited by Henry Brady and David Collier at Berkeley. It was about interdisciplinary research, but I think it goes for policy and journalism and other domains as well: we're all trying to do the same thing, and we have different tools, and that's okay, but we should have the same standards about asking, are we really uncovering a relationship here? What else could be happening in the background?
00:15:48
Speaker
You also mentioned personality earlier, like at the NTAs — because tax economists are fun, right? I spent my birthday there. Wow. That's brave of you. It would have been fun for the whole group to sing Happy Birthday, though.
00:16:06
Speaker
There was cake, but there was no singing. How important is it, do you think, for researchers to be able to actually talk to policymakers?

Translating Research for Policymakers

00:16:15
Speaker
Jargon is a big problem, but I think we tend to worry about jargon in written products and not so much about speech.
00:16:24
Speaker
You know, we spend a lot of time — "we" being here at Urban, but also other places — editing papers and blog posts and making sure that a general lay audience might be able to read them. But then I sort of feel like, if someone has a hearing, we just push them out the door. Or they'll meet with their guru, and we push them out the door and say, good luck to you.
00:16:43
Speaker
So how important is that piece of it? Yeah, very important. You do a lot of this. Yeah. And we have colleagues, whom I think the world of, who are the world's experts on certain topics, whom I would never send to those kinds of meetings. So sometimes I just repeat their research. I figure it's important that what's in those people's heads gets into the hands of people who can use it.
00:17:13
Speaker
And so, for example, at the State and Local Finance Initiative, of which I'm a part, we did a summer meeting for budget analysts — a compendium of Urban research that was important to people putting together budgets. A lot of that was trying to translate some of the research that goes on here into dollars and cents, but you could imagine other kinds of compendia like that for different kinds of policy people. But yeah, when I've asked this question of how we train economists better, and people talk about economists who've been successful in government,
00:17:42
Speaker
oftentimes it does come down to personality. Laura Tyson, for example, someone I worked with in graduate school — she just had a knack for staying in the room and for offering useful insights to the process. I think it's translating: basically being able to go back and forth, and checking your ego a little bit. Maybe you don't have to be the one to come up with the answer.
00:18:08
Speaker
I think part of our training, too, is to try to always do things yourself and be competitive. But maybe you go to the person who's the world's expert, because it's more expeditious, and you figure out what the answer is, and you find a way to explain it to the person who needs it — maybe better than the person who actually came up with it could.
00:18:26
Speaker
You have to be willing to be more of a conduit sometimes, instead of the originator of the new fact. But if you're a tenure-track faculty member, then you've got to generate the new fact. There was a great — actually, apropos for your audience, I think — there was a great blog post I remember seeing when I was in government that had a table of academic versus policy economics.
00:18:48
Speaker
And so in academia, it's important to be original. In policy, it's important to be right. In academia, it's important to identify a direction of an effect. In policy, the magnitude actually matters. And some people just have a knack for one or the other. It can be very stressful when someone comes to you and says, I need to know how X policy is going to affect GDP and I need it in 10 minutes. And so it's good that people self-select. Not everybody has to do everything.
00:19:16
Speaker
But yeah, perhaps we could provide more training at an earlier stage. This is maybe a little bit of a different question, but just a quick question as you were talking.

Gender Dynamics in Economics

00:19:25
Speaker
From your perspective as a woman doing this sort of work, how do you view the gender split and how we have these conversations?
00:19:33
Speaker
I mean, I know that's a big topic and I don't want to get too far down the path, but I'm just curious about your perspective from doing things like the NTAs, for example, right? And I always find it interesting when you go to some of these panels and it's, you know, five older white men talking about gun violence in the inner city. And you've seen the David Hasselhoff thing, where it's like, congratulations, you have an all-male panel.
00:20:02
Speaker
So in graduate school, I really did feel like it was a meritocracy. And yeah, oftentimes fewer than 20% of the people in the room were female. That still happens sometimes. But in some cases, I feel like it's helped me more than it's hurt — I think people remember you if you're the only one who's like you.
00:20:24
Speaker
And having some of these more kind of female-coded skills, like listening and trying to bring people together — it matters. So yeah, I wouldn't trade it. But I've also been in conversations with all that stuff of, oh, John just made a really good point — well, I said that 10 minutes ago, actually. So we can definitely all do a better job there.
00:20:52
Speaker
But it's funny — I hadn't thought of it before, but maybe some of what I'm talking about is everyone being a more effective policy analyst if they had these kind of more, you know, female-type personality traits.
00:21:06
Speaker
Yeah. I mean, I think a lot of it spans what we are and are not teaching people: both how they communicate and how they relate to one another in the room, right? And graduate students — how they relate to one another and how they treat one another. And that's just gender. We haven't even talked about race. We haven't talked about nationality.
00:21:26
Speaker
Sure. And there is that thing — I think there was a study that confirmed this — that, as in the entertainment industry, as with a lot of industries, people tend to hire people like themselves. That guy reminds me of me; he looks like the next Steven Spielberg. And that happens in academia all the time. So I think there was a blind study where they sent
00:21:46
Speaker
emails to professors saying, I would like to work with you. My name is, you know, John. And it was so disheartening, right? Because if it was like a white male, the guy would be like, yeah, sure. I have some time. And honestly, you know, I have seen that when I've asked.
00:22:03
Speaker
people to write letters of recommendation for more junior people. There was one case where I really had to ask several times, and I was like, come on. But I don't think it was deliberate — it's just a blind spot that that person had. So yeah, I think we have to be vigilant about that.
00:22:22
Speaker
Okay, so this probably hasn't been the most uplifting conversation for the field, but before we close up, I want to turn to this new project you have that's just come out, aptly titled, I think: What Everyone Should Know About Their State's Budget.

Demystifying State Budgets: A New Tool

00:22:38
Speaker
I mean, there's no jargon there that tells you exactly what it's supposed to do. So can you talk a little bit about the project and what went into it and what people are going to get out of it?
00:22:48
Speaker
So, one way I think about this project: I was in a dress shop in DC, and this woman, the owner — who's very smart, very entrepreneurial, very savvy — said, you know, my state doesn't spend enough on education. And I thought, well, that's interesting — what's enough? And at the same time, she was concerned that class sizes were too big and that teachers weren't getting paid enough. And as economists, we always think about trade-offs. So the goal of this tool is really to show how spending is determined, and how,
00:23:16
Speaker
in some cases, there's just the hand that you're dealt, in terms of the geographic conditions of your state or the demographics of your state, and then there are the policy choices that you make. So we separate those two things and show how, in any area of spending — whether it's social services, where we often think in terms of caseloads and how much we spend per person, or transportation, where you could think of the caseload as the number of drivers you have on the highway — those inputs interact to determine overall spending.
00:23:41
Speaker
And you know that if you want more of one thing, you're going to have less of the other. So if you want to pay teachers more, that's great, but you might have to have larger class sizes. And we do cost-of-living adjustments and show explained versus unexplained portions due to labor market conditions. I think the value of the tool is providing this uniform framework across all areas of spending, and really trying to show people where their tax dollars go and to what effect — we have some outcome measures in there —
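The teacher-pay-versus-class-size trade-off described here can be sketched with a toy calculation. All numbers below are illustrative assumptions, not figures from the actual Urban tool:

```python
# A hypothetical, simplified version of the trade-off: with a fixed
# per-pupil instruction budget, higher teacher pay implies larger classes.
# Both numbers are made up for illustration.

budget_per_pupil = 6_000   # dollars available for teacher pay per pupil
teacher_salary = 60_000    # average annual teacher salary

# Teachers per pupil the budget can support, and the implied class size.
teachers_per_pupil = budget_per_pupil / teacher_salary
class_size = 1 / teachers_per_pupil
print(f"class size at ${teacher_salary:,}: {class_size:.0f} pupils")   # 10

# Raise salaries 20% with the same budget: class size grows 20% too.
raised_salary = teacher_salary * 1.2
raised_class_size = 1 / (budget_per_pupil / raised_salary)
print(f"class size at ${raised_salary:,.0f}: {raised_class_size:.0f} pupils")  # 12
```

The point of the tool, as described in the episode, is that decompositions like this — caseloads times spending per case — make the "hand you're dealt" versus "choices you make" distinction concrete.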
00:24:07
Speaker
and trying to demystify state and local budget processes, because I really think people want to trust their state and local governments. If they're going to trust government, it's going to be at the state and local level, as most research has shown. But budgets can be opaque, and it's important for people to understand
00:24:25
Speaker
what governments are up against and what they're trying to accomplish. Yeah. And what's really interesting — I mean, there are lots of things that are interesting about it — is that you actually do break down what builds up to spending. So it's sort of a fun little mathematical tool that shows you the breakdown, and then shows you the data by state, so you can actually see what's going on in your state. Right. So every state —
00:24:48
Speaker
every state, for all the social programs: Medicaid, higher ed, public safety, et cetera, et cetera.

Conclusion and Farewell

00:24:54
Speaker
So very cool. I'll put the link to the new tool on the show page, and folks can go check it out. Very good. Wonderful. Thanks for coming on the show; this has been a lot of fun. It was a pleasure. Thank you very much. Thanks to everyone for tuning in to this week's episode. Until next time, this has been the PolicyViz podcast. Thanks so much for listening.
00:25:26
Speaker
This episode of the PolicyViz podcast is brought to you by JMP, Statistical Discovery Software from SAS. JMP, spelled J-M-P, is an easy to use tool that connects powerful analytics with interactive graphics. The drag and drop interface of JMP enables quick exploration of data to identify patterns, interactions, and outliers.
00:25:45
Speaker
JMP has a scripting language for reproducibility and for interfacing with R. Click on this episode's sponsored link to receive a free info kit that includes an interview with DataVis experts Kaiser Fung and Alberto Cairo. In the interview, they discuss information gathering, analysis, and communicating results.