
Lilach Manheim Laurio Shows Us How to do a Better Job Critiquing Data Visualizations

S9 E231 · The PolicyViz Podcast
964 plays · 1 year ago

Lilach Manheim Laurio leads the Data Experience Center of Excellence at Visa, where she helps data practitioners across the company elevate the quality of their data products and improve their skills in data visualization and data experience design. Lilach's data visualization work blends together a background in art history, library science, and human-centered information design, along with a passion for visual metaphor and puns.

Lilach has served as a Tableau Zen Master (2018-2019), a Tableau Public featured author, and co-organizer of her local Tableau user group chapter. She has contributed as a guest author to the Tableau blog and the Nightingale journal, writing about design and user experience in data visualization. She has also spoken on topics ranging from visual metaphor to dataviz critique at Tableau conferences and user groups across the U.S.

Lilach holds a Bachelor's degree in Art History and a Master of Library and Information Science (MLIS).

Episode Notes

Lilach | Web | Twitter | Tableau Public

Visa Chart Components
Elevating Data Experiences framework
Chris DeMartini (Twitter)
Frank Elavsky (Twitter)
Data Visualization Society
The Shape Parameter of a Two-Variable Graph (the "banking to 45 degrees" paper from Cleveland, McGill, and McGill)

Related blog posts:

Books

Transcript

Introduction and Sponsorship

00:00:00
Speaker
This episode of the PolicyViz podcast is brought to you by User Interviews. User Interviews connects researchers with quality participants who earn money for their feedback on real products. So there's high demand right now for software developers and engineers to provide feedback on products that are being created for developers. So if you want to help shape the future of the tools that we use in the data visualization, data communication field, this is your opportunity to provide that feedback to product developers.
00:00:29
Speaker
So you can go in and you can sign up for free. You can apply for your first study in under five minutes, and they will send you updates for surveys that are related to the work that you do. So you can actually customize what opportunities you see from User Interviews. Now, most studies, at least the ones I've seen here, are less than an hour, and they pay over $60 for an hour's worth of work.
00:00:51
Speaker
Some studies are more like focus groups or one-on-one conversations, and those pay even more money, up to several hundred dollars. So there's some opportunity here not only to help shape the future of technology, but also to earn some money for your time. So if you're ready to earn extra income for sharing your expert opinion on software development, engineering, hardware, or software, head over to userinterviews.com/hello to sign up and participate today.

Guest Introduction: Lilach Manheim Laurio

00:01:30
Speaker
Welcome back to the PolicyViz podcast. I'm your host, Jon Schwabish. Hope you're having a good start to the year. I hope you enjoyed the last few episodes of the show. We've been dealing with some interesting pieces on different data visualization books, and today's actually going to be no different, because on this week's episode of
00:01:46
Speaker
the show, I am fortunate enough to be chatting with Lilach Manheim Laurio, whom I've been chatting with a lot lately, because I've been thinking more about how we as a field critique data visualizations and the different ways that we critique data visualizations. And so Lilach and I have been emailing back and forth. She's been
00:02:05
Speaker
so kind and generous to take a lot of time reading through this very long post that I'm publishing today, along with this podcast episode, about the field of dataviz critique. And Lilach has a book coming out later in the year about this concept of

Lilach's Work at Visa and the Data Experiences Framework

00:02:20
Speaker
critique. How do we do it? How can we do it better, both as a critic and as someone receiving criticism? And how can we do that not just as individuals, but also how can we do that within teams and within organizations?
00:02:32
Speaker
And so what you're going to hear in today's conversation is the start of Lila's work on this at Visa, where she and her team created this data experiences framework. It's like 10 pages long. It's terrific. It gives you a lot of ways to think about generating critique and to think about all the different aspects of a data visualization that you might want to think about and talk about and try to improve upon.
00:02:57
Speaker
And I'll link to that in the show notes. There are a lot of books that we talked about here as well, not just in data visualization, but also in the UX/UI fields, so I linked to those as well. So I really encourage you to listen to today's conversation. I encourage you to check out the blog post that I'm publishing along with today's conversation about
00:03:16
Speaker
my sort of view on dataviz critique and what we as a field need to do, what we need to do better, and where I think we need to focus our attention, rather than just saying, oh, this graph is garbage, this graph is great, and what we need to do to sort of move the field forward. So I hope you'll listen to today's conversation. Hope you'll check that out.

Lilach's Career Path

00:03:34
Speaker
Hope you'll check out all the other resources on policyviz.com, on my YouTube channel, my Twitter feed, my Winnow feed,
00:03:41
Speaker
wherever you like to connect with me. And if you have comments or questions, please let me know. You can connect with me at all those different places. So here is today's conversation of the PolicyViz podcast with Lilach Manheim Laurio.
00:03:56
Speaker
Hi, Lilach. Good afternoon. Welcome to the show. Hi, Jon. Thanks so much for having me. I mean, this is really exciting because, so for those who, I mean, no one really knows, save for us, but we've been emailing at length for a while about a lot of different topics, primarily what we're going to talk about today. So this is very nice to actually see your face and chat in person. Definitely. And definitely thank you for your patience and openness to my, like,
00:04:22
Speaker
diatribes. No, it was great. I mean, this is the thing, I mean, I don't know. I think everybody has their own fears of feedback and rejection and criticism about something, like you have. For me, it's always been writing. That's always the thing. It's like, you know, you just have to learn to embrace
00:04:44
Speaker
the feedback and the critique. And that's the only way to get better. And that, of course, is a great segue to what we're going to talk about today, because you are working on a book on critique and feedback, which I'm really excited to talk about. But you've already done a bunch of that work. And I'm sure at least some of the listeners to the show are familiar with the data experiences checklist.
00:05:04
Speaker
It's not really a checklist, per se. We call it a critique framework. A critique framework, yeah, from you and Frank Elavsky and a couple of others, I think. So I'd like to talk about that, and talk about the book and how exciting it is that that's coming out. So maybe we can just start: you can just sort of give a little background on how you ended up at Visa and what you do there on the dataviz side.

Design-Focused Approach at Visa

00:05:30
Speaker
Sure. So I think I'm one of those who
00:05:35
Speaker
got into dataviz through a bit of a unique pathway. So I initially studied art history, and then a few years later did my master's in library and information science,
00:05:50
Speaker
with a focus on information-seeking behavior, which is really a fancy, I guess, library-term way of saying: when you have something to research, how do you go about looking for information? And then, while I was doing that degree, I really got heavily into how people deal with information overload, and how you kind of interact
00:06:18
Speaker
with a lot of information and decide when you've found enough information to answer your question or complete your task. And so during the last year of my program, I kind of, a little bit accidentally, discovered dataviz and Tableau. And it was like, you know,
00:06:41
Speaker
the light from the heavens. It was just such a crazy mash-up of everything that I loved and had learned or had been doing before that. I never really anticipated using anything I learned in my art history degree,
00:06:55
Speaker
you know, outside a museum. But I definitely, you know, I love how I can bring all of those things into dataviz. So, you know, I spent a few years kind of building dashboards and doing the BI thing.

Development of Data Experiences Framework

00:07:10
Speaker
And I think just as I got further into it, and especially getting more involved in the Tableau community, I really started falling more in love with the teaching piece and like how to
00:07:24
Speaker
How can I enable others to improve, especially, the design piece of the skill set? And so, yeah, the opportunity opened up. I'd say a big part of it was definitely the Tableau community, because otherwise I would never have known Chris DeMartini, who's running that group. But it was a really good opportunity to work in a center of excellence, which is kind of more of an enablement piece.
00:07:50
Speaker
And the thing that really, really appealed to me, especially about the Visa team, is that it's, I like to call it, a small but mighty team.
00:08:04
Speaker
Partly because of the size of the team, but also kind of the nature of working in a somewhat big company while really focusing on just kind of tool-agnostic data visualization. So, I mean, I think that's just a really, a super interesting challenge, a chance to be creative. A little, you know, not just in making dataviz, but like, how do you
00:08:28
Speaker
kind of abstract out a little, but not too much. Like, what makes something a good data experience, a good data innovation? And really helping people, because it's not just tools, right? It's even the difference between, you know, BI folks who are making dashboards for other people to do interactive analysis, or someone like a data scientist presenting
00:08:57
Speaker
results of a, you know... these are very different types of things that we're making, experiences that we're making.

Human-Centered Design at Visa

00:09:05
Speaker
Yeah. And I assume both data scientists internal to Visa, but also communicating with banks and customers, the external piece. Yeah. There's, yeah, Visa does a fair amount of, like,
00:09:19
Speaker
consulting to customers, like banks. So we have our VCC, Visa Chart Components, which was fairly recently put out there open source. So definitely, I think because a fair amount of the data products that are made go beyond just internal use, there was an
00:09:45
Speaker
opportunity to do something like open source the charts library. Yeah, that's great. So I guess we can start with the data experiences framework. I want to start with where or why that came about. Like, this is like a 10-page framework
00:10:07
Speaker
for doing a better job of providing feedback for all these different types of visualizations that you just mentioned, across, I mean, I don't know how many it is, I want to say like 20 different domains. So I'm curious: was that sort of demand-driven within Visa, or was it supply-driven, where you and Chris and people on the team were like, we definitely need to do this better, let's create this data experiences framework?
00:10:35
Speaker
Yeah, that's a good question. So I think it is somewhat demand-driven, in that we kind of got asked within our org, how can we improve the maturity of data products? And obviously that's part of our team's mission. But how can we measure
00:10:57
Speaker
whether a dashboard is good or not so good, and how can we have some consistency in saying, these are the things that should be addressed to improve the quality of the experience.
00:11:14
Speaker
But the great thing was that we got a lot of flexibility in terms of how to do that. And I think especially because of our focus on being tool agnostic, that was one of the challenges: what are the things that are universal, right? But, you know, still specific enough that they can actually help a person improve a visualization. And I think the other thing is that we really,
00:11:42
Speaker
kind of, I think Visa generally has a pretty high level of maturity, you know, at least compared to most companies for like the design maturity.

Framework Goals and Human-Centered Outcomes

00:11:55
Speaker
So we have a whole team for the design system and also like accessibility. So there's already, like people are already used to doing a pretty thorough accessibility check.
00:12:07
Speaker
like for any product, not just data products. And so we wanted to see how we can infuse more human-centered design. I think generally, because we're data people, right, we think about the data. So when we design, we're thinking about how can we just present this data, rather than, what does a human need to do with the data
00:12:32
Speaker
first, and then, you know, what information can we give them to do that. And I think similarly with critique, a lot of times we focus on the things in the data visualization, right? Like, do you have a good title?
00:12:48
Speaker
You know, do you, I don't know, did you pick the right chart? These are all very system focused; in our case, the system is also data focused. So we really wanted to do something where, instead of a checklist of going through, did you do this and this and this, you're able to look at a dashboard or a dataviz and say, is it
00:13:17
Speaker
accomplishing these things. It was great because I was like, oh, I wonder if we could take
00:13:28
Speaker
some of the concepts from heuristic evaluations, which are very well established in the UX field, and apply them more to a data product. And this is where I really love, this is where our team is awesome, because Chris was like, okay, what would it look like? Just run with it.
00:13:49
Speaker
Yeah. And so I looked at some of the UX usability heuristics and then there's also been like a good amount of work on how to apply
00:14:03
Speaker
some of those heuristics to dashboards. Some of them are very domain focused; the one that I looked at, that I cited in the framework, I think had to do specifically with the domain of health
00:14:20
Speaker
information in a hospital setting, but it's still a good foundation. So we kind of took that as a starting point and did a lot of mapping, very information-architecture kind of brainstorming, where I'd be like, okay, if you take something like
00:14:44
Speaker
visibility of system status, that's one where you want filters, or really anything that a user interacts with, to be something users don't have to work to find. Right, right. So filters, dropdowns. Yeah. Yeah.
00:15:00
Speaker
So, for dataviz, obviously, there are some unique things that that applies to. So that could apply to, do you have legends that are easy to find, or something like that. So it was a process of coming up with, what would some questions be, specific to a dataviz, that would test for that heuristic. And then expanding on that to, what are maybe some additional heuristics
00:15:30
Speaker
that aren't there yet, you know, because we want to go beyond just usability. Right. So one that I know we added kind of has a somewhat similar,
00:15:44
Speaker
not necessarily a heuristic, but more a principle in UX, that you want something to be useful. So usable is important, obviously, but useful as well. And we basically called it valuable. But we really tied it to, you know, what you learn
00:16:03
Speaker
or what you're able to learn from using a product. So, does it answer the question; what kind of business value does it add to a user if they were to use the product to answer that question? So in the UI/UX field, a heuristic would be a button or a
00:16:28
Speaker
toggle or a switch or a scroll, just as examples. But when you think about heuristics applied to dataviz, in addition to those pieces that you might have, a filter or search bar in, like, a dashboard, in just any general data visualization a heuristic would be defined as all those elements on the chart space itself: the legend, the axes, all those pieces would be a heuristic.
00:16:55
Speaker
So I think the heuristic is the kind of test. And those elements are what you look at to see if they have achieved that heuristic.

Creating a Culture of Feedback and Critique

00:17:07
Speaker
So like preventing errors, let's say that's one of the common ones.
00:17:14
Speaker
So you could look at all the things, whether it's how filters work or a label on an option on a filter. So I think where it expands is, we still could use a lot of those basic, kind of overall tests of, does the content on the page achieve the functionality, does it achieve these heuristics
00:17:44
Speaker
describing what's a good experience. But then you evaluate all the little pieces. And I think that's where one of the challenges with our critique framework came up: we have a set of heuristics, and then we also have what we call design pillars. Those are really the actual objects on the page, what you look at when you are, you know, critiquing a dataviz.
00:18:15
Speaker
The heuristics are kind of like goals, principles of what we want; it's almost how we describe whether it's a good experience. And I think the somewhat unique thing we did is that we took Nielsen's original 10 usability heuristics, and we had an additional six;
00:18:40
Speaker
some were a little bit invented. Like, I added one for limiting distraction, which isn't really, you know, an official heuristic anywhere. But what we did is we took those individual heuristics and we kind of categorized them into five broader categories. And the category is kind of more like an outcome. And I think it helps you...
00:19:06
Speaker
So we call them human-centered heuristics and outcomes, to really make the point that this is really describing what a great data experience feels like for
00:19:19
Speaker
a human being. So if you think of something like efficiency, which is something we tend to think a ton about in dataviz, we call it productivity, so it's framing it a little bit more as: this is what it allows the user to feel or to do. And then that rolls up to the
00:19:46
Speaker
focused-and-clear category. So, you know, if a data experience is focused and clear, it basically allows the user
00:19:57
Speaker
to focus on the most important information, like for completing their analysis. So the heuristics are really like describing it in even more detail, like what does it mean to be focused and clear, what causes it to be like productive.

Challenges in Abstracting Design Concepts

00:20:13
Speaker
Providing flexibility is another one. And I think the other couple in there are limiting distraction and directing attention.
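For readers who want to see the shape of what Lilach is describing, here is a minimal sketch, in Python, of how outcome categories, heuristics, and dataviz-specific check questions might fit together. The category and heuristic names are taken from the conversation above; the structure and the question wording are illustrative assumptions, not Visa's published framework.

```python
# Illustrative sketch only: names come from the conversation, but the structure
# and question wording are assumptions, not Visa's actual framework.
from dataclasses import dataclass, field


@dataclass
class Heuristic:
    name: str
    questions: list[str] = field(default_factory=list)  # dataviz-specific checks


@dataclass
class OutcomeCategory:
    name: str                  # the human-centered outcome, e.g. "Focused and clear"
    heuristics: list[Heuristic] = field(default_factory=list)


framework = [
    OutcomeCategory(
        name="Focused and clear",
        heuristics=[
            Heuristic("Productivity",
                      ["Can the user focus on the most important information "
                       "for completing their analysis?"]),
            Heuristic("Provides flexibility",
                      ["Can the user adapt the view to their own question?"]),
            Heuristic("Limiting distraction",
                      ["Does anything on the page compete with the main message?"]),
            Heuristic("Directing attention",
                      ["Is the eye drawn first to the primary chart?"]),
        ],
    ),
    # The remaining outcome categories and heuristics would follow the same pattern.
]

# A critique session could simply walk the structure and record observations.
for category in framework:
    for heuristic in category.heuristics:
        for question in heuristic.questions:
            print(f"[{category.name} / {heuristic.name}] {question}")
```

The point of the structure is the one made in the conversation: the heuristics describe goals for the experience, while the concrete elements on the page are what you inspect to judge whether each goal is met.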
00:20:23
Speaker
It is really interesting the way you describe it because the change in the word efficiency to productivity I think is really smart because efficiency in a lot of ways sort of implies speed. Like can I get the point of this graph as fast as possible as opposed to how does this graph actually help me do my job or make a decision, which is different.
00:20:45
Speaker
Yeah, that's interesting. I want to ask you, and then I do want to shift here and talk about how this launched into the book. But you framed the work on this as sort of tool agnostic, but lots of people are using lots of different tools, and this could be used for any of those. And I'm curious whether you think
00:21:03
Speaker
that approach was limiting, or it was freeing because I can imagine if you're like let's build this for Tableau, you might in some sense say well, then you should use this filter type for this type of data and this type of map for this type of data.
00:21:22
Speaker
even though those can be applied to different things. So I'm just curious, like, in retrospect, if someone said, let's make this for Tableau or for Excel or for JavaScript or Python, whatever, would you have been like, oh, okay, like, yeah, that puts me in a fairly simple box, but also like too constraining. Um, so I think it probably,
00:21:42
Speaker
just made the challenge really interesting. I think it did help to move away from, you know, thinking about building specific stuff. Building, right. You know, I mean, I think one thing is, yeah, obviously I think
00:21:59
Speaker
through a lot of dataviz stuff in Tableau terms. So when I think about something like negative space, right, a lot of times I'll think about it in terms of, did you add padding to a container? I think that can apply to, I mean, there are ways to phrase that for, you know, CSS.
00:22:22
Speaker
Oh, sure, sure, sure. Right. You know, anything, I mean, any tool will provide a way to create negative space with some sort of feature. Right. But it does sort of put you a little bit in a box of
00:22:36
Speaker
Am I thinking the way someone who's, yeah. Yeah. Just the way we think about it is sort of different when we're in our little tool. Yeah. Yeah, totally. I do think, though, that what helped is, at least on our team, and I think definitely in the design team,

Guiding Critique and Communication

00:22:54
Speaker
we follow more of the design process of designing something
00:22:58
Speaker
outside the tool first, right? Like sketching and also then using, you know, a design tool like Figma or something. And, um, yeah, I know that's kind of like a little controversial, like, you know, like how much work do you want to put into, it's a skill to like abstract it out and not get too detailed into like, Oh, what's the, what's like the realistic data distribution for this bar chart?
00:23:23
Speaker
I think that's really what the challenge is. I don't know what my data look like until I make the chart with the data and I can go draw what I want it to look like. Then I throw the data on top of it and it doesn't work because I have some huge outlier and so I need to use a log scale or something.
00:23:40
Speaker
Yeah, yeah. I mean, you've very kindly sent me a couple chapters of the book, which I'm going to use as a segue here. In just the two chapters you sent me, there are a bunch of sketches. While it is a little controversial, I think a lot of people, and I put myself in this box, are a little, not embarrassed, but shy about sharing our sketches, even though it is such an important part of the process,
00:24:10
Speaker
just, to your point from earlier, to pull yourself away from the tool. Because, like, in Tableau I'm not quite a newbie, but half a step above a newbie, and I'll think, oh, I just made this thing this morning, I want to make this thing in Tableau, it should be super easy. And it's not, because I don't know how to do it, right? And so,
00:24:31
Speaker
So I think it's just those different levels and those different steps, which brings us to your book coming out later this year. And I'm increasingly thinking about this new evolution of dataviz books that go beyond the 101. So we have the Bridget and Vidya book, Functional Aesthetics. We have Jen Christiansen's book, Building Science Graphics. And I think your book is going to fit nicely into this new
00:24:57
Speaker
space. So your book is Let's Talk About Data Visualization, and it's really focused on, I don't want to give it short shrift, but it is kind of the more in-depth version of what we've been talking about with this framework, and it's really pushing people toward a more,
00:25:15
Speaker
I don't know, a more concrete way to think about critique and feedback. And so before I ping you with my questions, so people can hear what I was struggling with in the blog post that I'm publishing today along with this episode, I just want to ask you to talk a little bit about the book and what you think it's going to provide people who are in the dataviz field.
00:25:40
Speaker
Yeah, definitely. So first I'll just say thank you for including me in that category. I'm definitely honored; some of those are some of my new favorite books. And it's exciting to see the field moving toward that. I think often when I've spoken at a conference, I've noticed that a lot of times design gets put into the intro-level stuff. And it's really not, like...
00:26:06
Speaker
I'll give you one book as an example. One of my favorite books is A Primer of Visual Literacy. That was written actually in the late 70s, I think, and I've reread it, I think, at least five times. And every time there's something else I take out of it.
00:26:26
Speaker
It's a little crazy how applicable it still is. Yeah. But it's very much not a beginner book. It's foundational. I would say the more you learn about design, the more you can go
00:26:41
Speaker
back and reread it and get a lot more out of it. But it's nice to see more and more people, like, books being written on the beyond-the-basics stuff that can still be very foundational. Right. And I think there's always going to be, I mean, I'll say this as an author of one of those 101 books, there will always be a space for those, because there's always going to be people coming to the field new, right? You know, they haven't really thought about how to make a B-form chart or something like that, right?
00:27:10
Speaker
And so that's sort of a new experience. But they, like all of us, are going to grow and make more and more, and what are the next steps? And I think one of the other things that we don't see a ton about in the field, maybe aside from

Inspiration and Problem-Solving in Critique

00:27:26
Speaker
Ben Jones's books and maybe Andy Kirk's book, is on teams and organizations and sort of larger groups, which I kind of feel like your book is really going to help people with. You're not just the person on your own making stuff, putting it out, and being done. You are working in a team, maybe for a boss, but we all work for a boss, you're working for someone. Even if you're not working for someone, you're a freelancer.
00:27:50
Speaker
Like, you are working for your audience, right? Or you're working for your client. So there are always more people involved. And I think that's where the literature isn't quite at right now. Yeah. And so that's kind of my goal in the book. I would say I was very inspired by this book in the more general UX field called Discussing Design. And that highlighted some issues in how the general UX field
00:28:19
Speaker
had room, let's say, to grow in terms of doing critiques better. And so I think that's a good example of the fact that it is a kind of growing pain. I think it was one of the ways that UX has matured as a design field. I mean, there are many flavors of UX, obviously. But I think that one of the main
00:28:47
Speaker
points that they make in that book, that I really try to spend some time in my book expanding on what it might look like for dataviz, is the idea, which we pretty much agree on in the design process, that if you're following a good design process, you're separating out defining the problem and coming up with solutions for the problem. And yet,
00:29:13
Speaker
A lot of times, when we critique, we just jump right ahead to the solution. Right, right. Now, they talk about it as a skill, like muscles that you have to build, because when you're so used to doing it that way, it takes relearning the habit, right? You can start out by, you know, maybe catching yourself: oh, what I just said...
00:29:39
Speaker
Really, if you look out for anything like "I suggest" or "what if you did it this way," those are all solutions, and that's fine. So I think you start to learn to step back and say, okay, this is a solution that's coming to my mind that I'm saying.
00:29:59
Speaker
What's the problem that I'm trying to solve with it? And then learning to eventually train yourself to at least talk about the problem first. This is something I guess I've kind of adjusted to, because I think I probably do get to do a lot more critique in my current role than I did in previous roles. It's well beyond my own stuff.
00:30:24
Speaker
And especially for people who are at the beginning of their learning curve, especially in design concepts, you do need to give them some solutions, because they don't really know how. You can help them understand the problem that you're seeing, but, you know, they can't fix it if you're saying, you know, maybe the typographic hierarchy is the issue, or
00:30:51
Speaker
your design would be much more scannable and help people find what they need on the page if you improved your visual hierarchy. But what does that mean to someone who doesn't know there are best practices for a type ramp, and the fact that, to get a little bit into detail, you're probably better off starting out in the middle
00:31:21
Speaker
and then going out to the edges of what's going to be the biggest and what's going to be the smallest. And that's a very procedural, you know, it's just
00:31:31
Speaker
I don't know where I got taught that, honestly. But it makes it so much easier to do something like that. So I think it's both, but I do think like that whole, so I'm not saying, I guess I'm not saying don't ever give someone, like this is how you should do it, but just getting more aware of kind of starting out with defining
00:31:57
Speaker
the problems you see. I think also just generally learning to see more when we look. That's another kind of big piece of... And by that you mean like more of the detailed pieces of a visualization?
00:32:14
Speaker
Yeah, so I mean, I think some of that is, we can take from kind of art history and visual analysis, being able to kind of identify, so let's say you start with identifying what's the focal point, what is your eye most attracted to, and then kind of looking at what are the
00:32:33
Speaker
individual design decisions that are causing, what are the elements on the page that are causing your eye to kind of be moved in a certain way or be attracted to a certain part. So I mean, I can give an example, I think this is a kind of common
00:32:52
Speaker
example. I think it's a pretty good like specific application of this.

Abstracting Design Rules

00:32:58
Speaker
So if you think about the rules that we have, that we try to follow, and, you know, whenever we see something in the wild we're like, it doesn't follow this rule. So one that I think is fairly old, not old old, but a few years old, is the idea of, how do you decide on the size for a chart?
00:33:20
Speaker
Right. And I believe it was Cleveland and McGill, I think. They came up with a whole measure of banking to 45 degrees. Yeah, right. So thank you. So that's a very specific rule. But if you think about it, that's really
00:33:42
Speaker
describing a solution, but a very rule-based one of make sure it has the 45-degree angle. And so the problem with that, obviously, is there's going to be a lot of exceptions.
00:33:58
Speaker
It especially doesn't work when you're working with a line chart that's pretty flat. Right. Right. And so abstracting out a level from that would be, well, we want to size it in a way that makes sure it's true to the data and doesn't distort, you know, make the data look in a way that is not true to the actual shape and meaning of the data. Yeah.
00:34:23
Speaker
So I think that's definitely a step in the right direction, because it starts you thinking about what problem you're solving. You want to make sure that people aren't having interpretation errors because it's a weird shape.
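As an aside for readers, the rule being abstracted away from here can be written down in a few lines. Below is a minimal sketch, assuming Python and NumPy, of the median-absolute-slope version of banking to 45 degrees; the function name and the sample series are illustrative, not taken from the episode or from the original paper's code.

```python
import numpy as np


def bank_to_45(x, y):
    """Median-absolute-slope banking: pick a plot aspect ratio (height / width)
    so the median on-screen orientation of the line segments is about 45 degrees."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    slopes = np.abs(np.diff(y) / np.diff(x))            # segment slopes in data units
    slopes = slopes[np.isfinite(slopes) & (slopes > 0)]
    x_range = x.max() - x.min()
    y_range = y.max() - y.min()
    # On screen, a segment's slope is (data slope) * aspect * x_range / y_range,
    # where aspect = height / width. Setting the median of that to 1 gives:
    return y_range / (x_range * np.median(slopes))


# A fairly flat series gets an aspect ratio below 1 (a short, wide panel),
# which is exactly where the rule starts to strain, as discussed above.
x = np.arange(100)
y = np.sin(x / 15.0) + 0.01 * x
print(round(bank_to_45(x, y), 2))
```

The sketch also shows why the rule is "a very specific solution": it says nothing about the relative importance of the chart or the shape of the surrounding frame, which is where the conversation goes next.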
00:34:44
Speaker
But then I think like there's even like further, like when I say, when I think about like, how do we see more when we look at it? So you could start thinking about also, and I think this is where it starts coming in that like, we can't really look at just one chart. So like one of the big things that I think you need to consider when you're deciding on the size is
00:35:08
Speaker
like, what's the relative importance of a particular chart. Is it the primary chart? Is it a secondary chart? And that should really drive how big you make one chart versus another chart. But then, I think this is where my internal art historian comes in,
00:35:30
Speaker
which is, I always really try to consider the wider frame. So in an interactive dashboard, I think we can usually assume the shape of the page is going to be a product of your screen, or whatever screen it was designed for. But you're rarely going to have a perfectly square page shape, right? It's
00:35:56
Speaker
rectangular or long form, even if it's long form, you're only looking at it like one screen at a time. So I think a really important thing is, let's say it's your primary chart, thinking about like, what's the shape of the frame? And is the shape, so basically aspect ratio, is the shape of the chart echoing the shape of the frame?

Understanding Goals and Constraints in Critique

00:36:20
Speaker
or contrasting to it. So like, I think if you have, speaking in really broad terms, if you have, you know, a rectangular shape of the chart,
00:36:33
Speaker
and it echoes the rectangular shape of the frame, the page frame, then it's a kind of static design. There's not a lot of movement in the design. And so that could make sense, but then just kind of thinking about, if you make something, and maybe it's a chart, maybe it's something else, if you think about maybe a menu that's a tall shape,
00:37:02
Speaker
that has a kind of vertical movement. Yeah. Like if you have, you know, one chart, that's very wide, that could lead the eye, you know, left to right. Right. But that could also depend on other, you know, what are the shapes of the other charts in there? Competing with the kind of shape that the, you know, your main chart is creating against the frame. So things like that, they're very,
00:37:31
Speaker
like, it's really visual analysis, which it is. But it's also, I mean, one of the things, and we can talk about this in a second because I think it leads back to the post that I published today that you helped me with, I mean, I think the other piece of it is recognizing that
00:37:48
Speaker
Different creators have different goals. When I think about a columnist at the Times or the Post, their goal is to get eyes on the page. That's what matters. Whereas for someone working inside Visa, or someone who's providing a memo to their boss, their goals are just different.
00:38:08
Speaker
You are creating something for your colleague at Visa. You know they're going to read it. The goal isn't to get people to click on it; the goal is to make a decision, or, as you mentioned earlier, to increase productivity as opposed to efficiency, which I starred in my notes. I just think that's really smart. I mean, there's been for a long time this discussion
00:38:29
Speaker
that sort of ebbs and flows on impact and how do you measure impact and can you even measure impact? And I think the efficiency thing sort of goes hand in hand. I think the other thing that goes hand in hand with that is speed. It's one of the things that I just like, there's this like obsession with, do I get the message of the graph?
00:38:48
Speaker
just as fast as possible, as if that should be a metric. So I guess the next question I want to ask is, when you see people doing critique today, and we'll keep it sort of in the Twitter world, so public critique, not within teams, but in the public sphere,
00:39:07
Speaker
what do you think is the thing most people are doing the most wrong? Like, is there an aspect of critique generally where you think people are just missing the point, or they're critiquing the wrong thing?
00:39:23
Speaker
So how long do you have? I mean, obviously, I'm writing a book about this. Writing a book about it, right? Yeah. But I think you did hit on something that's really important, which is that there is a big difference between critiquing someone's work and critiquing work with someone.
00:39:48
Speaker
And really the critique that I do on a regular

Critique as a Constructive Conversation

00:39:53
Speaker
basis at work, and that I've sometimes also done with public work, like with people I collaborated with, is the second kind, and it's really a conversation. You know, sometimes people make themselves available and you can ask
00:40:12
Speaker
kind of clarifying questions. But really, if you're doing the second kind, where you get to have the conversation with someone, and that's why I called my book Let's Talk, that should always start with, just like when we start developing a data visualization, we're trying to discover and ask questions. It's a bit similar, but you're trying to discover
00:40:41
Speaker
what the creator's goals and constraints were, and then really trying to understand what choices they made, what the design is, and how those work or don't work for what they're trying to do. And you do that before you go into more of evaluating what you think isn't working, right? You have to really understand, seek to understand,
00:41:11
Speaker
what the current design is. I think we probably don't do enough of that. But saying that, I think there's definitely room for the other kind, the public kind,
00:41:23
Speaker
where you don't get a chance to talk to the creator. But just kind of recognizing that it's a slightly different thing. It's almost like criticism. If you think about the art history field, there are critics who go and look at artwork and write up what they think of it.
00:41:47
Speaker
But that's almost like a different goal. So if you're having more of a conversation with someone, your goal is to help them figure out how to improve their data visualization. Or you could be trying to evaluate your own work, but your goal is to
00:42:04
Speaker
figure out what's not working and how to maybe make it better. Right. And to help other people do better, right, when they're... Yeah. I think the one thing that really is a bit of a pet peeve, the thing that I feel starts things off a little wrong, is when someone
00:42:24
Speaker
posts "all feedback welcome." And I'll be honest, I fall into that. Sometimes I still have to catch myself. And I think that's one of the things I really loved that the Discussing Design book brought up: really, it's a two-way street. So if you're not getting useful feedback, if the feedback you're getting is all, you need to make this button bigger or wider,
00:42:53
Speaker
some of that might be coming from the fact that you're not asking for the feedback you need; it's kind of on the person asking for the feedback. Obviously, critique on Twitter doesn't usually involve someone asking for it. I have a whole chapter devoted to that as well.

Clarifying Feedback Requests

00:43:11
Speaker
How can you
00:43:14
Speaker
articulate and describe what feedback you're looking for? So what's the feedback you're not looking for? Right. I don't care about the size of the buttons or the color of the button, I need to fix that. But so, instead of "all feedback welcome," do you have a good pithy phrase for people, like, this is the hook, so they could... you know, they have to wait a few months till the book comes out. But do you have that phrase in mind for people?
00:43:41
Speaker
I probably should have a good pithy phrase. But I think the point is maybe there isn't a good pithy phrase, because sometimes maybe you do want to know, are the buttons in the right spot, versus the colors of the line chart, because I can't control that, because this is what our company follows and it's a red line and that can't be changed, but the button, is the button in the right spot? So maybe there isn't a phrase and it's just,
00:44:06
Speaker
you need to be more focused in your listening? Yeah. And obviously this is tougher to do on Twitter. I think what we've done at Visa, and I've seen something similar in the DVS critique channel, is we have a set of three questions
00:44:29
Speaker
that when you come in and ask for critique, you kind of fill out ahead of time, to explain to people what type of feedback you're looking for. In the book I outlined that there are two main questions that you as a creator should answer to help people understand what type of feedback you want. First is just, what are your
00:44:53
Speaker
goals and objectives to whatever level of detail you want to get. Like what is the design trying to achieve? And the second is what are the elements that you want to have evaluated? So I think it can be helpful to kind of think about it in terms of
00:45:15
Speaker
I use design layers in the book. So, you know, is it like something to do with the information architecture, or is it the thing that we usually tend to focus on, which is the chart design? So is it the information visualization layer? So I did, I guess, give some tools for
00:45:39
Speaker
describing what parts of the information product you want, like you want to get feedback on. And it may be like a specific part of it, right? Like let's say you want help with the interactions, interaction layer. So like a really broad question would be,
00:45:56
Speaker
how well do the interactive features that I have support the analytical flow, or the questions that I want to enable people to answer, right? The broad ones are, do they make you more productive? Yes. They make you more efficient, they make you more productive. Yes. Or do they give you interesting, actionable answers, right?
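To make that concrete for readers, here is a minimal sketch, in Python, of what a structured critique request along those lines might look like. The field names and the example wording are hypothetical, pieced together from the two questions Lilach describes (goals, and which elements or design layers to evaluate), not copied from her book or from the Visa template.

```python
# Illustrative only: a structured "ask" that replaces "all feedback welcome."
from dataclasses import dataclass, field


@dataclass
class CritiqueRequest:
    goals: str                        # what the design is trying to achieve
    layers_to_evaluate: list[str]     # e.g. information architecture, chart design, interaction
    specific_questions: list[str] = field(default_factory=list)
    out_of_scope: list[str] = field(default_factory=list)  # constraints reviewers can skip


request = CritiqueRequest(
    goals="Help regional managers spot underperforming products each Monday.",
    layers_to_evaluate=["interaction"],
    specific_questions=[
        "Do the interactive features support the analytical flow?",
        "When you use a filter, is it clear what changed?",
    ],
    out_of_scope=["Brand colors are fixed by the company style guide."],
)

print(request.goals)
for question in request.specific_questions:
    print("-", question)
```

Writing the request down this way forces the creator to separate broad questions (does this make the user more productive, does it lead to actionable answers) from the specific ones that come up next in the conversation.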

Purposeful Critique and Understanding

00:46:17
Speaker
A lot of times, the classic example is having a million filters that don't really help you come to new answers.
00:46:26
Speaker
Yeah, and the other kind would be, you know, maybe, do I have clear feedback, where when you interact with some feature, it's clear what happened. Yeah. Right. So that's a more specific one. Yeah. Or even, is this button difficult to find? Right. Right. Are the outcomes obvious, is it clear what I have to do? Right. Yeah.
00:46:52
Speaker
So I think in the short term, before the book comes out and you have all these checklists... by the way, everybody, for those of you listening or watching, the checklists, I don't want to call them checklists because they're not really checklists, they are frameworks and cues, are amazing. They're going to be super helpful to you. So be ready. But I think, in the meantime, till the book comes out, it is
00:47:10
Speaker
a good lesson for folks to keep in mind: when you are asking for feedback, be specific and be purposeful so that you can get the feedback that you want. And because we've already been talking for an hour, I want to wrap up. But I think also, as the critic, you know, to be a little bit more purposeful and thoughtful. And I like this idea that
00:47:32
Speaker
critiquing someone's work versus critiquing with someone are two very different approaches. So, Lilach, thank you so much. I mean, we could keep going, but at some point people are going to hit two-times speed. I know this kind of started a big Twitter conversation, but I still think it really crystallizes what my book tries to do and what the critique framework also tries to do. It was,
00:48:02
Speaker
by, let me see, I wrote it down. It was by Dr. Kat Hicks, and she said, often we try to make complex problems easy rather than making it easier to work on complex problems. So I think if there's any one thing that can push us forward in how we critique, it's that we do need to think a little deeper. I think we've done some initial, really, we have a lot of really great work on, you
00:48:32
Speaker
know, very specific rules that kind of cover the easier things. And I totally get that instinct to want just a rule, a do and a don't, right? But it really is about how to think a little deeper about all these things. And so I do think there's space for tools. I hope the critique framework
00:48:56
Speaker
is the beginning of that: more tools that help you think through some of those deeper things without giving you a set of simple yes or no answers.

Conclusion and Further Resources

00:49:09
Speaker
Right. I think that's the challenge with a checklist, where it's like you check a box, check a box, check a box. Because, as you said, sometimes they don't apply. And some things, I would say, are more important than other things.
00:49:25
Speaker
I mean, I don't know, like the integrity of the data is more important than like the font size of your title, right? I guess if I had to come up with that, it's not a pithy statement per se, but I think the message here is to think more deeply. I think on both sides is what I'm hearing from you, right? As a critic, to think more deeply about what you are critiquing.
00:49:46
Speaker
And as the person being critiqued or soliciting critique, being more thoughtful about what you want people to say, and maybe how you respond to those, what I kind of call the hit-and-run or drive-by critiques. It's like, this is garbage. And maybe you respond in kind of a different way, like, well, I know that you don't like the colors, but those are the branded colors, and that's what I use.
00:50:10
Speaker
You know, but does the graph type, like, is that useful? So I like that. My wife would call it a compliment sandwich: you give a compliment, and then some critique, and then another compliment at the end.
00:50:22
Speaker
So, okay, so the book comes out when, like in the fall of this year? Summer? I'm not sure of that because... The mystery of the publishing world. Yes. Right. Okay. But in the meantime, people can find you where? On Twitter? If they're on Twitter, they can. I have my regular handle where, I will be honest, I
00:50:47
Speaker
tweet about probably politics as much as dataviz. But for updates and sneak peeks about the book, you can also follow Dataviz Crit.
00:50:59
Speaker
Okay, great. That's great. I'll put all this and the links to everything that we talked about in the show notes so people can check it out, because there's a lot here and a lot for people to think about and hopefully do better, as individuals and as teams. So this was great. Thank you so much for coming on the show and taking time out of your day. And yeah, I'll talk to you soon. Okay, thanks, Jon. Thank you.
00:51:21
Speaker
And thanks, everyone, for tuning into this week's episode of the show. I hope you enjoyed that. I hope you'll start to think a little bit more in depth, a little bit more creatively, and with a little bit more sophistication and purpose about your efforts in critiquing and receiving critique on your and others' data visualizations.
00:51:40
Speaker
Be sure to check out all the links in the show notes. I've got links to all these different checklists and frameworks that we talked about in the interview, and links to all the books that we talked about as well. There are some great new books out on the market, and I hope you'll check them out. And of course, don't forget to read the blog post that is up at policyviz.com, where you can sort of get my take on dataviz critique. So until next time, this has been the PolicyViz podcast. Thanks so much for listening.
00:52:06
Speaker
The whole team helps bring you the PolicyViz podcast. Intro and outro music is provided by the NRIs, a band based here in Northern Virginia. Audio editing is provided by Ken Skaggs. Design and promotion are created with assistance from Sharon Sotsky-Ramirez, and each episode is transcribed by Jenny Transcription Services. If you'd like to help support the podcast, please share and review it on iTunes, Stitcher, Spotify, YouTube, or wherever you get your podcasts.
00:52:29
Speaker
The PolicyViz podcast is ad-free and supported by listeners, but if you would like to help support the show financially, please visit our Venmo, PayPal, or Patreon pages, all linked and available at policyviz.com.