
Episode #120: Alberto Cairo

The PolicyViz Podcast

Alberto Cairo is the Knight Chair in Visual Journalism at the University of Miami. He teaches data visualization and infographics in our Journalism and Interactive Media Masters programs, and he is also the director of the Visualization Program at UM’s Center...


Transcript

Introduction and Greetings

00:00:11
Speaker
Welcome back to the PolicyViz podcast. I'm your host, Jon Schwabish. This week, I have the one and only Alberto Cairo joining me on the show. Alberto, how are you, friend? Doing all right, Jon. Thank you for having me again. How's the weather? You can imagine, right? It's rough, right? Yeah, it's rough. Very rough.
00:00:32
Speaker
Really rough, sitting out by the pool, writing books. I don't know if you have seen that, but sometimes I shoot photographs of landscapes here in Miami, and I post them on Twitter, and I write, "the long and harsh winter." You actually sent me one. I think it was when we were having some nasty weather up here, and it's been a pretty mild winter. And you just sent me a nice note with a picture of you
00:00:56
Speaker
reading a book in front of your pool. I was like, this is why people hate academics. This is the reason right here. I was working though. I was working really hard. I know. I know. But outside by the pool, it's not so bad. Margarita time starts early down there. Of course, of course.

Visual Trumpery Tour Insights

00:01:15
Speaker
Well, thanks for coming on. So I wanted to chat with you because you are on your Visual Trumpery Tour, and in late January, you were up here in DC. You gave a talk at the Urban Institute, and the day before, you were up in Baltimore at the Maryland Institute College of Art giving the Trumpery talk. So you're going all over the place. You're now in New Zealand and coming back to North America soon. But maybe you could talk a little bit about what the tour is for people who haven't yet seen it,
00:01:42
Speaker
or, unfortunately, you're not gonna be in their town for whatever reason, and then we can just chat about it. Sure. So the Visual Trumpery Tour is a talk. I prefer to call it the Visual Trumpery talk. It's a series of lectures that I'm doing
00:01:58
Speaker
All over the United States and also in other countries, as you said, New Zealand, and then next I'm going to Canada, then back to the US to California. Later on during the summer, I'm doing a mini tour in Europe. I'm going to Norway, Finland, the UK, Italy and other places.
00:02:15
Speaker
People, by the way, can see the upcoming dates and places on the website of the talk, which is trumpertour.com. But anyway, the talk is not about politics. The title of the talk is actually a provocation
00:02:29
Speaker
to bring people in. So I usually joke during the talk that I selected the title on purpose, with several goals in mind. The first one was to provoke people, to trigger them into thinking that the talk is about, again, politics, when it's actually a talk about graphics,

Deceptive Visualizations and Self-awareness

00:02:45
Speaker
right? How visualization lies.
00:02:48
Speaker
and how we lie to ourselves, and this is perhaps the most important part of the talk, how we lie to ourselves using visualizations quite often, more often than not. But the original title of the talk was much more boring. It was "graphicacy," which is visual literacy, or graphical literacy. But the joke that I make is that I spoke to some friends who told me that if I titled the talk
00:03:11
Speaker
"graphicacy," you would not attract as many people as if I selected a title that is a little bit easier. So yeah, after the election, someone on my Twitter feed tweeted the meaning of the word "trumpery," which is basically something that deceives, in particular something that deceives the eye. It's a very old
00:03:30
Speaker
word in English that comes from French. So I said, you know, this is perfect, perfect as a title. So yeah, that's what the talk is about: how graphics lie, or how we lie to ourselves with graphics, and what we can do about it to avoid it, and so on.
00:03:47
Speaker
You make a couple of arguments in the talk.

Source Verification and Misinformation

00:03:50
Speaker
One of them is about how we, as consumers of graphs and data, need to be careful about what we retweet or share, or the arguments that we make based on things that we see. Can you talk a little bit about that, and how you view people's responsibility in this era of just quickly sharing things around the world?
00:04:13
Speaker
Sure. So today it is easier than ever to distribute content among friends and family. You see a news story on Twitter, on Facebook, on Instagram, or wherever, and if the title of that story and the charts or maps that the story includes speak to your own prejudices or your own ideological positions, it is quite likely that you will retweet it without thinking.
00:04:39
Speaker
I believe that this contributes to an informational environment that is getting worse and worse by the day. And I am blaming all of us; I do that myself. So in the talk, actually, I present a couple of cases of stories that I retweeted mindlessly, and
00:04:55
Speaker
told myself afterward that I should have spent a little bit more time verifying the claims of the stories, taking a closer look at the graphics, and even verifying the origin of the data, which is something that I encourage people to do before we retweet anything on social media, anything that we see in the media. The primary sources are usually one of the main problems in graphics. A graphic may look beautiful, very well designed,
00:05:20
Speaker
But if the data that it is depicting is not correct, or has not been gathered with the right standards for accuracy, then the graphic is wrong, no matter how beautiful or attractive it is, right? Right. Do you think people will embrace that idea? And do you think there's a level or a spectrum of when we should be more discerning of a graph before we share it, versus
00:05:47
Speaker
ones that, you know, maybe don't shout out a special outlier or a big trend, but are just interesting.

Realistic Data Verification Goals

00:05:53
Speaker
But maybe it's not something that we have to dive into. It's like, you know, GDP growth across these eight countries. Well, that's interesting. But something on guns or abortion, a hot topic, those are the sorts of things where maybe our ears should perk up a little bit. Yeah.
00:06:08
Speaker
So, well, let me clarify though that the expectation of the talk is not that we will embrace the principles that I teach 100% of the time. It is completely unrealistic to expect that every person will verify the original source of every single graphic that we see every day, right? We cannot do that.
00:06:27
Speaker
But if, let's say, 10% of the people who come to the talk start verifying the sources of 10% of the stories that they see every day, that's progress. That likely means fewer bad graphics and fewer bad data stories being shared on social media.
00:06:44
Speaker
The more people we convince of certain principles of verification, and the more people apply these principles, again, not 100% of the time, but when they have a couple of minutes to verify the source or to debunk a story that they believe is wrong, that is still progress. Now, what kinds of stories should we keep a closer eye on? Precisely those stories that we find more appealing
00:07:10
Speaker
for ideological reasons. So if you're a liberal, for example, there is an example that I show in the talk that speaks about how expensive healthcare in the United States is in comparison to other countries. And I explain why I believe that that particular story is wrong. The reason it is wrong is the original source of the data.
00:07:32
Speaker
But that's the kind of story that I might retweet mindlessly myself, because I do believe, I am convinced, that healthcare prices in the United States are absolutely insane. I'm European, and prices over there are much lower than they are here. So that's the kind of story that I would retweet. Well, that's the kind of story that you should double-check
00:07:51
Speaker
the most, because it is very easy, and a lot of fun, I would say, to debunk stories and graphics that go against your ideological principles. It is much harder to do when those stories confirm your ideological principles, but those are the most important ones to pay attention to, because those are the ones that are most likely to lie to you or deceive you.
00:08:13
Speaker
I mean, it seems like we are in our own bubbles now; only the folks on the left read certain news organizations that, people sort of argue, are on the left, and certain news organizations

Transparency in Journalism

00:08:24
Speaker
are on the right. And if you are only reading the news that agrees with your perspective, those are the ones that you're checking, so it seems like there's a possibility of actually fact-checking the organizations that agree with you. Let me ask this question.
00:08:38
Speaker
What responsibility do journalists and news organizations have to enable this fact-checking, this checking of data and graphs? Well, I do believe that we have a huge responsibility. And I still consider myself a journalist in that sense. So I do believe that there are certain practices that we journalists need to apply systematically. The main and most basic one is that we should always link to primary sources, which is something that many people do already, but some do not.
00:09:08
Speaker
Some organizations don't link to primary sources. And I have examples, which I sometimes don't show during the talks, of news stories that were extremely dubious, and it was hard to find the original source of the data they were using because there wasn't a link to it. Finding it required Google searches, talking to people, et cetera. It took quite a long time to find.
00:09:28
Speaker
So if a news organization doesn't link to primary sources, my rule of thumb is: don't trust that organization, period. Don't trust it. That's the first thing. But we should go way beyond that. We should, I believe, move toward what organizations such as
00:09:45
Speaker
538 or ProPublica, or even the New York Times and the Washington Post to a lesser extent, are actually doing, which is to create repositories of data and also articles that explain the methodology used to generate the graphics and the data in their stories. ProPublica does this really, really well. 538 also does it quite often. So they publish these for, say, a long story about guns
00:10:12
Speaker
or healthcare or whatever, and that story contains tons of visualizations or arguments based on data. These organizations will not just publish the story and the graphics; they will also publish a methodology page, like scientists do, disclosing what they did with the data, where they obtained the data, what kinds of manipulations or transformations they applied, and what the assumptions were
00:10:35
Speaker
behind the story, and so on and so forth. This is an exercise in transparency. I must say that I don't think more than perhaps 1% or 5% of readers will ever take a look at that methodology. But that 1% or 5% of readers are actually the ones who can help us become better journalists down the road, because those are the most critical readers, the readers who can give constructive criticism, which is something that we in journalism really need, I believe.
00:11:02
Speaker
And again, I put myself in this group, right? We journalists are experts on nothing. So we need to collaborate with people who know much more than we do about the topics that we cover. I think you're right on that. It's an interesting thing about data and journalism that a lot of data journalists, or whatever you want to call them, journalists who are working with data, are in some ways doing a sort of academic research, and yet they're not held to the same standards as researchers are,

Peer Review in Journalism

00:11:28
Speaker
you know, a formal peer review. And like you said, putting the data out makes it available, right? Yeah, there needs to be much more of, we could call it peer review, but we should not confuse it with academic peer review. Academic peer review takes much longer, and we need to be realistic about that.
00:11:46
Speaker
Journalists need to publish. That's the reality. So I'm not advocating that a journalist take months to publish a story. That's not realistic. But I am advocating for a little bit more peer review. So you put your data out, you publish your story, you open yourself to constructive criticism from experts who can take a look at that data and say, this assumption over here is not quite right, or perhaps you should have put a little bit more
00:12:13
Speaker
nuance in this story over here, or perhaps, in the case of visualization, you should have shown your data at a more granular level of detail, or you should have included the confidence intervals, right? That kind of criticism I find extremely valuable when it comes from experts. And I speak based on my own experience. I do this with every book that I write; for example, the most recent one is The Truthful Art.
00:12:40
Speaker
While I'm writing it, I let several friends of mine read it in order to spot mistakes or things that could be better explained, etc. The friends who I usually contact are people who work in data science or statistics or computer science or whatever, people who know much more than I do about things that I sometimes need to write about in the books that I write.
00:13:03
Speaker
Right. Now we've been talking so far about visuals, about graphs that may be misleading.

Hurricane Visualization Misinterpretation

00:13:11
Speaker
I think the tone we've sort of taken so far is that they're misleading for a purpose or for a goal. But they're not always misleading because they have some sort of evil objective; sometimes the visual itself is just misleading. Yeah, it happens.
00:13:28
Speaker
Usually not because the graphic is badly constructed; it happens because of things inside your brain that are enabled by the graphic. Were you going in that direction? Well, I think there are two branches of this. The example that I had in my head was the one that you've written about a lot, which is the hurricanes. I guess it lines up with what you're saying, but you have this example about how NOAA shows the cone for hurricane projections.
00:13:57
Speaker
And they're not drawing that graph to lead you astray. They're not trying to bias the data, but the way the graphic is made up actually might lead you to come to the wrong conclusion. Yes. So that's an example of a graphic that is not built for the audience who is seeing it. And this is one of the critical things in visualization, which is that rules apply always in a particular context.
00:14:22
Speaker
Or not always; rules often apply in a particular context. There are certain rules that I believe are quite universal, but most of them are not. Most of them depend on the audience, etc. So what you're talking about is usually called the cone of uncertainty. So whenever hurricane forecasts are represented visually,
00:14:39
Speaker
scientists use a cone of increasing size to represent a range, basically a range of possible paths that the center of the hurricane could take. So you need to envision it as an expanding cone, basically,
00:14:55
Speaker
being very narrow, where the hurricane is right now, and being much wider for five days in the future. So it becomes bigger and bigger and bigger. So what that represents is a range of possible paths. The center of the hurricane could move anywhere.
00:15:11
Speaker
within the boundaries of that cone. But some people don't read it that way. That's how a scientist reads it. Scientists understand that it represents thousands of possible lines inside that cone. But some people who are not scientists, when they see it, believe that it's the area that is going to be affected by the hurricane. They interpret it as the area that is going to be impacted by the hurricane.
00:15:32
Speaker
And moreover, even scientists, and this is part of the reason I say that the graphic was not well built. Many scientists believe that it represents a 95% probability, meaning that 95% of the time, the center of the hurricane will be within the boundaries of that cone of uncertainty, and only 5% of the time
00:15:51
Speaker
we could get an outlier, and the hurricane could go outside the cone of uncertainty. But in reality, the number that is rarely disclosed is that the actual figure is 66 or 67%. That means that two out of three times, the hurricane will be inside the cone, but one out of three times, 33% of the time, the hurricane could go outside the boundaries of the cone of uncertainty [see the simulation sketch below]. So NOAA, the National Hurricane Center, several designers, et cetera, are trying to push other kinds of graphics
00:16:20
Speaker
to depict a hurricane forecast because they are very aware of the shortcomings of this graphic. A graphic that was originally created to be read by scientists is being used by news media. So it's a completely different audience. Scientists can understand it, sort of, but the general public can't.
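
[Editor's note: a minimal simulation sketch to make the cone's calibration concrete. The random-walk error model and the numbers in it are illustrative assumptions, not NOAA's actual procedure; the point is that a cone sized to the 66th-67th percentile of track errors contains the storm's center roughly two out of three times at each lead time.]

```python
import numpy as np

rng = np.random.default_rng(0)

n_tracks, n_days = 10_000, 5  # forecast lead times: days 1..5
# Hypothetical cross-track error: a random walk around the official forecast
# line (an illustrative model, not the National Hurricane Center's statistics).
daily_wobble = rng.normal(0.0, 50.0, size=(n_tracks, n_days))  # km per day
cross_track = np.cumsum(daily_wobble, axis=1)  # error at each lead time

# Cone radius per lead time: the 67th percentile of |error|, mirroring how the
# cone is sized so that about 2/3 of historical forecast errors fall inside it.
radius = np.percentile(np.abs(cross_track), 67, axis=0)

inside_each_day = np.abs(cross_track) <= radius
print("per-lead-time coverage:", inside_each_day.mean(axis=0).round(2))  # ~0.67

# Staying inside the cone at EVERY lead time is rarer than 2/3, one reason
# "the storm will stay in the cone" is an easy over-reading.
print("whole-track coverage:  ", inside_each_day.all(axis=1).mean().round(2))
```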
00:16:39
Speaker
So scientists are trying to push, for example, what we could call a spaghetti graphic, which shows you tons of possible lines, each one of them representing a possible model, a possible path of the hurricane [a sketch of this appears below]. Newspapers have begun adopting these other methods of depicting hurricane forecasts, but scientists have so far been unsuccessful at pushing these kinds of maps on TV. Why? I believe, and this is just a conjecture,
00:17:08
Speaker
that TV stations still use the old cone of uncertainty map because it looks so clear. It looks very clear-cut, very easy to understand, although that is misleading: it looks easy to understand, but it is not. And TV journalists, I speak based on my own experience, get this map wrong. I have seen people talking about this map on newscasts, getting everything wrong about that map and explaining it wrong.
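
[Editor's note: a minimal sketch of the spaghetti-style map described above, using synthetic tracks rather than real model output. Each faint line is one possible path, so the spread itself, rather than a single solid shape, carries the uncertainty.]

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)

# Synthetic ensemble: 40 possible hurricane tracks fanning out from one origin
# (illustrative data, not real forecast model output).
n_members, n_steps = 40, 24
start_lon, start_lat = -80.0, 25.0
lon = start_lon + np.cumsum(rng.normal(0.35, 0.15, (n_members, n_steps)), axis=1)
lat = start_lat + np.cumsum(rng.normal(0.20, 0.12, (n_members, n_steps)), axis=1)

fig, ax = plt.subplots(figsize=(6, 4))
for i in range(n_members):
    ax.plot(lon[i], lat[i], color="steelblue", alpha=0.35, lw=1)  # one possible path
ax.plot(start_lon, start_lat, "ko", label="current position")
ax.set_xlabel("Longitude")
ax.set_ylabel("Latitude")
ax.set_title("Spaghetti plot: each line is one possible track")
ax.legend(loc="upper left")
plt.show()
```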
00:17:37
Speaker
So that is dangerous, obviously. I would say, just to finish with this question, that that's an example of a graphic that misleads for two reasons.

Feedback and Audience Engagement

00:17:45
Speaker
First of all, because it is not well built, that's the first thing, and scientists are very aware of this; we cannot blame them, they are aware of it. And also because it is a graphic that is being shown to the wrong audience, an audience that doesn't have
00:17:59
Speaker
the necessary prior knowledge to read that kind of graphic well. But there are other kinds of graphics that lie even if they are perfectly built. That's another part of the talk. Let me ask one more question. How many of these talks have you given so far? I don't know, I would need to count, but I would say around 20 already, or something like that. So have you heard a general theme of questions, or people pushing back? Has anything popped up where you're like, oh yeah, I've heard this five times now?
00:18:26
Speaker
No, not really. I mean, because of repetition, you mean? No, no, no, because, you know, you're making an argument, they hear one part of the argument, and they say, well, yes, but maybe not in this particular case. No, not really. I have gotten quite a lot of feedback from people who have attended the talk, and their reaction is usually quite positive.
00:18:49
Speaker
But I have gotten a lot of feedback to perhaps tweak an example, or explain something a little more clearly, that kind of very constructive and very positive feedback, which I appreciate and welcome, because every iteration is a little bit better than the previous one thanks to this kind of feedback.
00:19:08
Speaker
Hardly any negative feedback at all. I did get an email once from a person who is quite conservative, saying that at the beginning of the talk I said that the talk is nonpartisan, and it really is, and complaining that the balance between right and left in the choice of examples was not a perfect 50-50 split, like 50% liberal, 50% conservative.
00:19:31
Speaker
My reply to him was that doing a 50-50 split would not be true to reality because sadly, right now, much of the visual bullshit that we see is coming from the far right in particular, so we cannot avoid that. There are certainly examples that come from liberal media, and I try to highlight them, and I have two or three of them in the talk, but most of the ones I have in my collection come from far right sources.
00:19:57
Speaker
Yeah. And unfortunately, we keep seeing more and more of these. So you have multiple goals in this talk, and one of them is to open people's eyes to these different visualizations, these different tricks, more or less. In terms of positive feedback, have you found that people are
00:20:14
Speaker
taking some of your recommendations to heart? Are they going out looking at the data? Are they sending you things they find that raise red flags for them? I do get emails sometimes from people sending me examples. I remember right now someone who attended the lecture
00:20:31
Speaker
sent me an example from Luxembourg, in particular a graphic that was not well constructed, and also another one from California. So yeah, people email, and I also get feedback on the fly, in the very place of the talk, right after. I remember really well, for example, the talk in Redlands, California, which was very well attended by a lot of retirees.
00:20:51
Speaker
So a lot of people were retired; they attended the talk for some reason, and I had like 40 or 50 elderly people in the room. I was a little bit concerned because I said, well, perhaps they would be a little bit bored by a talk about graphics, about charts.
00:21:07
Speaker
Quite the contrary: they were super involved, their eyes were wide open, and after the talk I had this very nice lady come to me to thank me for explaining how to read a scatterplot, because she had never seen a scatterplot, or she had seen one but had never understood how to read it.
00:21:28
Speaker
So that's exactly what the talk is about, right? It's about the level of graphicacy of people. I say that the talk is not for experts. I mean, it can be for experts because what I try to do in the talk is also to provide a set of tools that experts can use to explain quite complex statistical and visualization principles to the general public. So if you need to explain all these principles to your friend who is not a statistician or a data journalist or whatever,
00:21:55
Speaker
How would I do it? What kinds of fun examples could I use? Here are the examples, and I show them in the talk. Feel free to use them or repurpose them. But the other purpose of the talk is to address general audiences, people who don't have any sort of expertise and who are perhaps sometimes confused by the kinds of visualizations that they see in the media.

Public Understanding and Upcoming Talks

00:22:16
Speaker
So I explain how to read some of them, among them the scatterplot.
00:22:20
Speaker
Okay, so you've done 20 or so. Where are you headed next? So, if I'm not wrong, the next one is going to be Quebec City in Canada, then I'm going to California. I'm taking a look at the schedule now: California, Los Gatos. I'm going to Finland in May, then London and Wales in June, Norway in June, Italy in June,
00:22:45
Speaker
then Switzerland at the end of June, and then in the fall I'm probably going to visit three or four cities in North Carolina. I will go to Nashville, and Santa Clara University in California. Yeah, I have like 10 or 15 more places. You're going to need a new passport. Yeah, with all those places, right? Yeah. So I don't know, I'm still receiving requests from people. And it's a fun talk. I really enjoy doing it.
00:23:12
Speaker
And I enjoy the fact that people seem to be taking something useful from it. So I would probably keep doing it from this point on. I keep tweaking it every time that I deliver it. I include more examples and I withdraw other examples. So yeah, it's fun.
00:23:28
Speaker
Yeah, that's great. I'll link to the whole site so folks can make sure they attend the talk when you're in their city, and they can send you a note to get you to their city. And if you can't get there, there are, I think, a variety of recordings of the talk here and there. I know Urban has one, and I think there are a couple of others out there. So people should check those out, and I'll throw some links up. So, Alberto, thanks a lot for coming on the show. Thanks for coming to Urban to do the talk. Well, thank you so much for having me again.
00:23:55
Speaker
Thanks, everyone, for tuning into this week's episode. If you have comments, questions, or suggestions, please drop me a line on the website or on Twitter. So until next time, this has been the PolicyViz Podcast. Thanks so much for listening.