
#10 Sara Migliorini: Biometric Harm

AI and Technology Ethics Podcast

Sara Migliorini is Assistant Professor of Law at the University of Macau, specializing in international law, AI, and big data. We'll be discussing her 2023 article "Biometric Harm," which examines how the use of biometric identification—identifying people by their bodily or behavioral features—can pose significant harm to both individuals and society.

Some of the topics we discuss are the different technologies used for biometric identification, the human need for unobserved time, the right to control our informational narrative, and laws that might protect us from biometric harm—among many other topics. We hope you enjoy the conversation as much as we did.

For more info on the show, please visit ethicscircle.org.

Transcript

Who is Sara Migliorini?

00:00:16
Speaker
Hi everyone, and welcome to the AI and Technology Ethics Podcast. Today we are joined by Sara Migliorini, Assistant Professor of Law at the University of Macau. She specializes in international law, AI, and big data.
00:00:32
Speaker
And we'll be discussing her 2023 article, "Biometric Harm," which examines how the use of biometric identification, identifying people by their bodily or behavioral features, can pose significant harm to both individuals and society. Sara Migliorini, it's my pleasure to welcome you to the show. Thank you, Roberto, and hi to Sam as well. Thank you for having me. And thanks for coming.

Why shift from law to technology?

00:01:00
Speaker
So Sara, I think we should begin by just kind of loosening up and asking you a little bit about yourself. Tell us about yourself and your research. How did you come to study the things that you study?
00:01:13
Speaker
Yeah, so actually by necessity. My initial training is in a discipline called private international law, which studies the regulation of cross-border legal relationships, so, for example, contracts between companies that are established in two different countries. And of course, while some of this is relevant to technology, the field is much broader. So they're not the same thing.
00:01:42
Speaker
But I became interested in technology by necessity, as I mentioned, because even though I'm now in a tenure-track position in Macau, for the best part of ten years I was on research contracts
00:01:57
Speaker
that were grant-based and would last one or two years, with different universities and research centers. Frankly, that's the state of academia right now. And so during this time, I had to follow the funding. Technology issues became very trendy around 2017 and 2018, and then with the pandemic, of course, funders were even more interested in these new technologies.

What are biometric technologies and how have they evolved?

00:02:20
Speaker
So I got a few projects on this. And then when I applied in Macau, I built on that for my research agenda. Now I'm part of a group with colleagues where we work on these issues. So I feel quite lucky that this is now my job, to think about these issues. And I usually combine it with my initial training, because I usually study issues that arise when there's litigation. For example, the paper that we're going to discuss today is about harm in tort litigation. So I try to merge it with my initial training. Awesome. Yeah, that's really cool, and it's really well done. It's fascinating stuff to think about harm in the context of this important emerging technology.
00:03:12
Speaker
So, speaking of that, before we dive into this idea of biometric harm, could you briefly explain or describe the technology that you're responding to? What exactly are biometric technologies?
00:03:28
Speaker
Yeah, sure. Obviously, my training is in the law, so I'm not an engineer. But to write this paper, I really tried to read some literature on this from the engineering field. So my understanding of the technology is the following. First of all, the idea that this technology implements is not new: we want to tell people apart based on how they look,
00:03:55
Speaker
and in particular, more specifically, based on measurements of their bodies. This is a very old idea; we can talk about it later maybe. But the technology implements this idea by first of all creating a database
00:04:15
Speaker
of measurements, which could be fingerprints, right? And then, when someone is measured again, the technology matches that measurement against the existing database. As an output, it will give you the name or the identification, whatever was put in the database in the first place for that measurement.
00:04:39
Speaker
And of course, even though the idea is quite old, the technology has been evolving really fast. And again, I feel that the pandemic has been an accelerator of this, especially in the part of the world where I live, in Asia and in Greater China. So now we can identify people not just by fingerprints, which were the first biometric measurements, but by face geometry, of course, by gait and other behavioral characteristics, by corneas, or by the way in which you move your eyes, even if you're wearing a face mask. So the uses of this have multiplied.
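To make the enroll-then-match pipeline Sara describes concrete, here is a minimal sketch in Python. It is not from the paper, and real systems derive their templates from trained models (face embeddings, gait signatures, and so on); here the templates are toy feature vectors and the names are hypothetical.

```python
import math

# Minimal sketch of the enroll-then-match pipeline described above.
# Templates are toy feature vectors; a real system would extract them
# from images or recordings with trained models.

database = {}  # label -> enrolled template

def enroll(label, template):
    """Store a measurement (fingerprint, face geometry, gait...) under a label."""
    database[label] = template

def identify(probe, threshold=0.5):
    """Return the enrolled label nearest to the probe measurement,
    or None if nothing in the database is close enough."""
    best_label, best_dist = None, math.inf
    for label, template in database.items():
        dist = math.dist(probe, template)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= threshold else None

# Whatever was put in the database for a measurement is what comes out.
enroll("person_a", [0.10, 0.90, 0.30])
enroll("person_b", [0.80, 0.20, 0.70])
print(identify([0.12, 0.88, 0.31]))  # -> person_a
print(identify([0.50, 0.50, 0.00]))  # -> None (no close match)
```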

How are biometrics used in everyday life?

00:05:22
Speaker
And of course, we are all familiar with it. We use it without even realizing it, to unlock our phones and our devices, or when we go through security at the airport, depending on the country. So there are different uses of this that are quite common. Yeah, it's funny, talking about not even realizing it. I read your article a couple weeks ago and thought, okay, what are these biometric technologies that I'm involved in, and I only just remembered today: oh yeah, the fingerprint reader. I have a Mac, and I unlock it with the fingerprint sensor at the corner. Anyway, it's just funny how you don't really think about it, don't really notice that you're using some new emerging technology. But yeah, so it's all about being able to identify a particular individual. Obviously that is important in various contexts. You can think of a medical context, where they need to know who you are, or maybe,
00:06:18
Speaker
most obviously, a travel context, where the airline wants to identify you to make sure that you are who you say you are, more securely than just asking for someone's name, or, I guess,
00:06:36
Speaker
matching a visual identification, matching the person to the ID. More secure than that would be fingerprint analysis, or geometric analysis of the face, I guess. So it definitely seems to be a great advancement in terms of the accuracy with which you can identify people. I do have a question for you. You mentioned gait. What are they using gait analysis for? Is it more so for security? Yeah, so I guess anyone who has access to a device that can do that can use it. There are companies that compile databases, and there are state law enforcement agencies that buy them and implement them. So you're walking in the street, and in many places you're always on camera; I mean, you're always recorded, there are always cameras.
00:07:27
Speaker
And the best-case scenario is that the recording is never actually used by anyone. But if you need to be identified because of a law enforcement need, they can use gait, the way in which you were walking down the street, to try and match it with a previous recording of you and your gait.
00:07:49
Speaker
I think this is done a lot in the US, but also in China. And there has been a lot of discussion about this in Europe with the new AI regulation, which doesn't really prohibit biometrics, but at least prohibits using it in real time. So while you're walking, there cannot be someone identifying you, unless there are some very specific law enforcement necessities.

What is the concept of biometric harm?

00:08:22
Speaker
But I think it's something that we are subject to without even knowing. I mean, we've been subject to it for some time now. Yeah. Interesting. I kind of want to emphasize, I want to bold this part for the listeners. You can put on a mask, a face mask, or I guess another kind of mask or whatever,
00:08:49
Speaker
but the way that you walk alone could betray your identity, right, if there's a sufficiently large database and you're in it. And I guess it's the way our pelvis is tilted and the length of our legs and all that; these things you can't change very easily. So we can be identified quite easily
00:09:12
Speaker
from the way we walk. I keep thinking about that, so I'm going to stop right there. Just to add something more: it's also the keystroke, how you type on your computer. You're sitting in a coffee shop and you're writing something, and that's also something that can identify you. And imagine the way in which you move your eyes. This is what scared me; I got scared about it when I realized how little things that, as you say, cannot be changed about you, unless you work on them very hard, can actually lead to identification. And one point that I wanted to make, because we had this discussion just before,
00:09:55
Speaker
is that, yes, biometric technology can be accurate, but it only works in a closed system. So it is as accurate as the initial input. If my gait has been identified as my sister's gait, well, there's no escaping it. The biometric system is wrong, and the identification will be wrong.
00:10:16
Speaker
And then there's nothing you can do about it, whereas if you have a human trying to identify someone, of course there can be human error, but we know that. The bias with this technology is the assumption that it is accurate, right? And so that can be really dangerous sometimes. Yeah. It's like we trust too much, or we have too much confidence, in the new, exciting technology.
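A toy illustration of that closed-system point, in the same hypothetical style as the earlier sketch: a matcher can only ever return labels that were enrolled, so a wrong label at enrollment is reproduced, with the same apparent confidence, at every later match.

```python
import math

# Toy illustration (hypothetical data): the system is closed, so a wrong
# label at enrollment is replayed at every later match, and the match
# itself looks just as "accurate" as a correct one.

database = {
    "my_sister": [0.40, 0.60],  # actually *my* gait, enrolled under her name
}

def identify(probe):
    # Nearest enrolled label; the matcher has no notion of a mislabeled template.
    return min(database, key=lambda label: math.dist(probe, database[label]))

print(identify([0.41, 0.59]))  # -> my_sister: a precise match, but a wrong identity
```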
00:10:43
Speaker
Or something like that, yeah. We call it the overly credulous problem: we just believe algorithms too much. Awesome. Well, actually, that's already dipping into the next question. But basically, we thought maybe you could just give us an initial summary of your main thesis. Obviously, your article is about recognizing the existence of biometric harm, so we're curious how you would describe your main position when it comes to biometrics. Is it that when you're doing biometric identification systematically across a large society, it is then sort of necessarily harmful? Is that your view? Or, yeah, just how would you describe your main thesis?
00:11:36
Speaker
Okay, yes. I think the main thesis of the article is a bit less broad than that.

How does biometric identification impact privacy?

00:11:45
Speaker
The article tries to put forward the idea that when you are identified by a biometric system,
00:11:56
Speaker
there are some very important values at stake, and that identification should be possible only if it serves other, competing values of the same constitutional level.
00:12:16
Speaker
And because the legal system will only allow you to get compensation when you're harmed if the harm is actionable, if it's recognized as such by the law, my point is that biometric identification should be treated as harmful unless you can justify it with the pursuit of a value that is as important as the values of privacy and self-expression that are at stake. Okay, so there are specific values already built into, what is it, maybe typical Western law. It's already built into typical Western law, which
00:13:02
Speaker
biometric identification is kind of detracting from, or how would you put it, attacking. And specifically, I guess the values that you're thinking of are the value of privacy, primarily? So can you unpack that for us a little bit? There are certain values built into the law, and if those values are attacked, then there's legal standing to say that's harm, something like that?
00:13:29
Speaker
Yeah, so just the premise behind it. The legal system is built on a hierarchy of norms. You have values that are more important; we call them constitutional. In virtually all legal systems you have a constitutional level, it might be called something else, but those values are foundational. So you are not allowed to infringe them, and you have to protect them. But sometimes they conflict with each other, so you cannot protect two of them at the same time in the same way. You have to find a balance between the two. So I'm thinking about that, and I'm seeing that,
00:14:11
Speaker
and I'll explain why in a minute, when you go and identify someone with a biometric system, you are actually infringing upon, yes, privacy, private life, and self-expression. These are constitutional values, certainly in Europe,
00:14:29
Speaker
but probably also elsewhere. And because you're doing that, you have to be able to justify that infringement with other values, for example, medical necessity. We can understand that, right? And if you're not able to do that, then that identification is unjustified, should be treated as harmful, and should be compensated. So that's the idea of the paper. Gotcha. So there's a necessary infringement connected to the utilization of biometric technology. At least, this is what I was going to wonder about: insofar as biometric identification is going on systematically in a society, there is necessarily an infringement on the values of, one, privacy, and two,
00:15:21
Speaker
self-expression. Is that capturing the thesis? And thus, to do it, you would have to justify it with respect to another fundamental value? Yes, that's an accurate summary. I would just say that the paper tries to first describe the harm that happens at the individual level, and then also tries to go to the more societal level, which is a trend now in privacy, not just about biometrics, but about any technology that infringes upon privacy. Okay, great. Awesome.
00:16:06
Speaker
I think all of this will be a lot clearer to listeners once we really get into some of these fundamental human needs that are getting interfered with by biometrics. So let's start to go in that direction. And I'm going to leave this kind of open-ended, Sara, so you can take it where you want: tell us why we need unobserved time so much, and why we need to take breaks from social roles sometimes. Yeah, so that was another interesting discovery that I made when I started to read some anthropology and sociology, stepping out of my field. We always recognize that as humans, we need a social life. It's really embedded in our nature. We couldn't survive without a society.
00:17:03
Speaker
But at the same time, we also need breaks from that social life. In particular, we need times during the day, every day, when we know that no one is watching. It is very simple, but it seems to be a very foundational human need.
00:17:29
Speaker
When we are in the presence of anyone, we are actually playing a role in that interaction. And I don't mean this in a bad way, as in we are acting, we're not ourselves, we're faking it. It's just that if I'm with my children, of course, I have the role of mother.
00:17:52
Speaker
And if I'm with my colleagues, I have the role of a colleague. Those are very different, and they require me to behave in different ways. So sometimes I will need to step out of this role, and to be able to control the moment when I step out of it. I shut the door, because shutting the door is self-defining, right? And I just stay with myself, with the knowledge that no one is watching me and I don't have to play any role.
00:18:22
Speaker
Can I give you an example? And you tell me if I'm getting it. So my friend, not me, when he's done with work, he just needs to, I mean, I just need to... I feel like there's a lot of emotional labor in looking competent throughout the day. And so before I get home, I just unwind, decompress for a little bit, and I switch roles from, you know, teacher to home person, out with my fiancé. So that decompression, the alone time in the car, does that count as this unobserved time? Absolutely. Unobserved time is so foundational as a need that every culture, at least what I've been reading seems to point to this, and every individual
00:19:18
Speaker
interprets it in a certain way. So, for example, where I live in Asia, there are fewer occasions for unobserved time, because the population is very dense. But at the same time, we have spaces where we can do that. So for you, and I guess this is something we would connect very much to the US, people commute by car because the country is so big, right?
00:19:46
Speaker
And then you have that half hour in the car with your favorite music or whatever podcast, and that's your unobserved time. That's great. Other people will commute home in a very packed MTR, the train here in Hong Kong and Macau, and then will have other ways of feeling unobserved. But yes. Yeah, that's crazy. I'm glad you brought up your research into other cultures, because my first thought, devil's advocate kind of thing, was: oh, does this really apply cross-culturally? I'm imagining scenarios where it's a much more communal environment, maybe a multi-generational home, where the kids don't have their own distinctive bedrooms, we pack them in, you know.
00:20:44
Speaker
Anyway, I was wondering: is it really true that there's a need for unobserved time in that context, or in alternative cultural contexts? I was wondering whether there is an element of Western individualism to this. Of course, we would want to have some more unobserved time, potentially, but maybe it doesn't universalize. But anyway, you're saying no, actually, in the literature you've read it's really more of a universal need, even if it's just a slight moment, I don't know, going off for a little stroll or something.
00:21:24
Speaker
Your research is saying it's really fundamental. Yes, and I think it's also my experience here. One of the reasons I got interested is that I'm a European who came to Asia; I've been living in Asia for a long time. I was in Shanghai first, then in Taiwan, and now here. There is less space, and even less compared to the US; I was there visiting two years ago, and yes, it's very different. But the point of the paper is that
00:21:57
Speaker
everybody has that need. Then the other observation we can make is that humans are extremely adaptable, right? We can really adapt to very extreme situations that push us to the limit. That could be in terms of climate; it could be in terms of mental strength in situations of stress. So we can actually go for a very long time against even our fundamental needs.
00:22:26
Speaker
And this is where culture plays a very important role. You have these needs, but they're culturally mediated. I grew up in a multi-generational home in Italy where no one had privacy, because there were too many people. Right.
00:22:44
Speaker
But that doesn't mean that I didn't have the need. It just meant that the need was met with different strategies that would fit my culture, and my social and economic situation. So we do have the need, but we are also able not to prioritize it for a time, or to interpret it through culture, or to change over time as well, depending on our situation. But that moment when you close the door, even metaphorically, is as important as your social relationships for your mental health, really, and for your being human; even more than mental health, for all your health.
00:23:26
Speaker
And so what are some of the things that happen when someone is denied these things, before we talk about biometrics, maybe just because of incarceration or whatever example you'd like? What goes wrong when we are denied this?
00:23:45
Speaker
So there are a few foundational values involved in this, I think. One of them, and it's the most straightforward, though not necessarily the most important, is privacy: the idea that you have to be able to keep certain information about yourself private.
00:24:04
Speaker
And of course, if you don't have any space for yourself or for your things, whatever way you interpret this need for unobserved time, that actually leads to not being well and being under a certain type of stress. But there's another important value at stake, and it is what the European Convention on Human Rights calls self-expression.
00:24:33
Speaker
When we think about self-expression, this is Article 10 of the Convention, we always stop at free speech, the idea that you should be able to speak your mind and voice your opinions with very few restraints. But that's not the only meaning of self-expression
00:24:56
Speaker
in the Convention and in the case law of the European Court of Human Rights. Self-expression also means the possibility to really express your personality in multiple ways. It could be with your clothing, with your likes on social media, with a song that you sing in a public space, with a religious sign that you wear in a public space, and so on.
00:25:28
Speaker
So that need is fundamental for our own humanity. And actually, the paper refers to a foundational article about privacy written in the 1800s, where one of the authors was later a Supreme Court justice in the US. They were reacting to the invention of photography, and at some point the article says something like: well, if you're able to take a picture of someone, then you are fixing that facial expression of that person forever,
00:26:11
Speaker
and that is against self-expression, because that's a moment in your life when you had that expression on your face, and now it defines you because it stays in time, instead of just being a moment that you can step out of. You can just not be that anymore, ever again, but people will remember you that way because of the picture.
00:26:35
Speaker
So what about when this self-expression is never possible? Of course, most of the time it is not possible, because we need to balance it against many other things that are also very important. But when it is never possible, that really restricts your ability to be human, and it creates another situation of stress. And when a person is under this kind of stress,
00:27:04
Speaker
they tend to be less creative. They tend to conform more. They tend to be afraid of expressing themselves. So it's a cycle that reinforces itself. And if you multiply this at the societal level, societies that are under observation, where privacy and self-expression in that sense are not respected enough, are societies that are less creative, less inclusive, more conformist, and so on. So this is the... It's like a chilling effect almost, you know? Fascinating.

How do biometrics control society?

00:27:50
Speaker
Great. Okay, so we have these two fundamental needs: on the one hand, self-expression, which we were just discussing, and then this need for unobserved time. It would be interesting, maybe we'll have time, to circle back and discuss a little more the legal reality of these various needs, like to what degree they're protected and that sort of thing. But perhaps at this point we want to start discussing more directly the issue with biometrics. So, building on what you mentioned, you were referring to Louis Brandeis, I think, the later Supreme Court justice, talking about photographic technology. But how
00:28:47
Speaker
is biometric technology potentially infringing on either of these rights? Initially, you know, you talk a little bit about informational narrative, but however you want to start touching on the biometrics more directly. Yes, so biometrics are just an amplification of something that happens in real life. When I see you, I recognize you, right? Everybody does that. We also might have other ways of amplifying recognition. I take a picture of you, and then I'm able to compare that picture with the image of you as a human. I'm able to do it.
00:29:33
Speaker
And for sure, that is a type of identification. The problem with biometrics is that, on the one hand, the system is a closed one. So, as we mentioned before, it is as accurate as the initial input in the database.
00:29:52
Speaker
And then, as an additional point, as was mentioned before as well, we tend to over-rely on technology. If the computer says it, it kind of must be true. We have this reverence towards the machine. But more than that, I think what the paper tries to say, and I'm not sure I made that point so clear, is that biometrics also amplify this identification. They make it so easy for many private and public actors to do it that it transforms society into this place where you just step outside and you can be identified. And it doesn't matter if it's accurate or not; it is just what you project. And then the world around you could actually be shaped
00:30:44
Speaker
based on your identification. One of the best examples of this is the interactive advertising boards that we have, for example, in malls in different jurisdictions. When you walk towards the screen, they will identify you. They probably won't know your name, but they will know if you're male or female,
00:31:06
Speaker
they will identify your ethnicity, probably your age, and they will connect that to your estimated purchasing power and show you a certain ad, which is also what happens on our social media, right? We get these personalized ads.
00:31:23
Speaker
Well, while ads are just ads, we may want to be exposed to many different things, and we may want to have access to different products. So, for example,
00:31:38
Speaker
women are usually shown on these boards products that are less expensive than those shown to men, because the estimated purchasing power of women is simply lower. And every time with this big data technology, these are generalizations.
00:31:59
Speaker
They come from somewhere, the data is there, but you cannot use that at an industrial level on society, because then you're locking society in the past. You are perpetuating whatever the data suggests.
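As a sketch of the mechanism being criticized here, with hypothetical categories and made-up numbers rather than any real system: the board never learns a name; it maps inferred attributes to an estimate derived from past data and serves whatever ad that estimate suggests, replaying the generalization onto every passer-by.

```python
# Hypothetical ad-board logic (made-up categories and numbers): inferred
# attributes are mapped to a purchasing-power estimate derived from past
# data, and the ad follows the estimate, so the generalization in the
# data is replayed onto every passer-by.

estimated_budget = {("female", "25-34"): 40, ("male", "25-34"): 70}

ads_by_budget = [(0, "discount brand"), (50, "premium brand")]  # (threshold, ad)

def pick_ad(inferred_gender, inferred_age):
    budget = estimated_budget.get((inferred_gender, inferred_age), 0)
    choice = "discount brand"
    for threshold, ad in ads_by_budget:  # keep the highest tier cleared
        if budget >= threshold:
            choice = ad
    return choice

# Two passers-by, identical except for the inferred attribute:
print(pick_ad("female", "25-34"))  # -> discount brand
print(pick_ad("male", "25-34"))   # -> premium brand
```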
00:32:14
Speaker
Kind of constraining, to a certain degree. Yeah, sorry. Oh, yeah. That was also, for me, another type of chilling effect, if you want. It was also the effect I was very surprised about when I was starting this: we really don't want this. This cannot be good for society. We are blocking the possibility to evolve if we're doing this.
00:32:41
Speaker
And this is particularly true because, and I didn't mention this before, biometrics don't only identify your name, identify you as a person; they also identify you as part of a group, if your name is not known, right? Yeah, I was about to ask you about that. There's kind of an important distinction, because sometimes when people are talking about biometric technology, they're really thinking about its use in terms of identification of the civil identity of a person. So, you know, I'm getting on an airplane, it scans my face, it identifies me as Sam Bennett. That's a civil identity or whatever.
00:33:31
Speaker
But there's also, as you bring up in your paper, the need to think about it classifying me through the use of various bodily or behavioral features. So, as you were saying, maybe an ad board isn't trying to identify me as Sam Bennett, but it might be trying to classify me as a male and then adapting. Anyway, it seems like a whole range of different kinds of worries can spring from that specific type of
00:34:17
Speaker
usage of biometric technology? Yeah, definitely. And again, I think the most dangerous and scary part is that it just amplifies things that are already there and doesn't allow for evolution, right? So, for example, we talk a lot about self-identification. But if I'm given an identification in terms of groups just by stepping in front of a board, the work that I do for myself to identify differently is completely useless, because society still sees me in a certain way. I'm not saying that that is
00:35:01
Speaker
not what happens when a human looks at me, right? But there's a difference, I think, between the human and the machine. We can talk about it if you want, but for me, it is not the same thing. That's one of the... Yeah, I mean, I guess we've got to get to that. Right, we've got to get to it, because, okay,
00:35:23
Speaker
one idea, right? In the paper you mention this idea of being in control of your informational narrative. And that seems to be, correct me if I'm wrong, an idea from Floridi, right? And it goes along with what you were just talking about in terms of self-expression, the need for self-expression. And then you think about the Louis Brandeis
00:35:52
Speaker
quote that you just mentioned, about how when someone takes a picture of me, I become locked into that look; somehow it can't be changed. One thought I had, though, an initial thought, is: well, look, there are tons of things that are outside of our control. I can't change who my parents are, I can't change where I'm from. There's just a bunch of facts that are outside of my
00:36:27
Speaker
ability to alter. I can have whatever self-understanding I want, whatever kind of narrative I want, but if I say that I'm not from Virginia, that's just going to be a false narrative. So I don't have control over that. Anyway, sorry, that's kind of a rambling question, but Sara, what do you think about that kind of issue? Our control over our informational narrative is very limited; is it really a big deal if
00:36:58
Speaker
someone takes a picture of us, or if we're being classified by ads as, you know, this or that? So anyway, can you maybe respond to those kinds of worries? Yeah, so, the point about facts: you cannot change facts about yourself, where you were born, who your parents were. Of course you cannot change those facts. But
00:37:20
Speaker
one of the ideas that I think Floridi puts forward is that you can change what you will become. Of course, you are who you are and you come from certain facts, but what you decide to be and what you can be, even if you don't become it, that is part of your humanity. And I think in one of his papers, it really says that any technology that locks you into something, just because you were born there or you look like that, denies your humanity, because the possibility to become is what gives you control over your informational narrative: not the past, but the possibilities of the future. Okay, so you're concerned about these technologies limiting our future
00:38:11
Speaker
opportunities. Yeah. So, Roberto, you go. Yeah, so I happen to specialize in crazy examples, just so you know. So I'm trying to think of a nice example for this, and I'm going to run one by you and see if you like it or not. I'm thinking about how you said
00:38:29
Speaker
this amplifies what might be true, but when you amplify it, only those things are being heard, and everything else that's more subtle and nuanced goes by the wayside. The example that I have in mind is the paparazzi and someone who's super famous, right? Every human is a complex, nuanced being, and there are all kinds of layers to us. But when you are super famous,
00:38:57
Speaker
sometimes the media locks on to one or two things that you've done, and that defines your entire... They amplify those traits to the extent that you become a two-dimensional character; you are not anything beyond that. Is that ringing any bells? Yes. So actually, there's a lot of legal discussion about what kind of privacy someone who's famous deserves, right? A lot of discussion in particular because these people who are famous have enough resources to bring claims against paparazzi, against newspapers that are trying to infringe upon their privacy. And let me say this too: historically, privacy,
00:39:48
Speaker
a good deal of that field of law, developed with respect to people who were famous. Of course, there was a tort of infringing upon your secrets: if someone would tell information about you and that would cause you harm, you could always try to mitigate that. But for the most part, privacy was a thing for people who were famous, because other people would care about what they do, who they wed, who their children are, and so on. So that was one of the initial themes of privacy law.
00:40:23
Speaker
With these technologies, each and every one of us gets those 15 minutes of being famous, and much more. Our lives are completely online, and they're up for grabs for many different actors, again, public and private, and also simply other people,
00:40:42
Speaker
right? So,
00:40:46
Speaker
On the one hand, there's something to be said about being famous. If you're famous, you probably need to be more accountable towards the public.
00:40:57
Speaker
Of course, this need is different depending on whether you're a politician, or an actor, or someone who's just done something and became famous because of what you did, good or bad, right? These are different types of famous people. And so, based on the situation, this need to be accountable to the public can be modulated.
00:41:21
Speaker
But overall there is an understanding that if you're famous, you're more accountable, you deserve less privacy. That doesn't justify the kind of treatment where, for example, someone has been caught drunk driving once, a football player or an actor, and then that defines them forever. That cannot be true. And actually, that information can also be erased, at least from the internet, if not from people's memories. There are legal rights, at least in Europe now, but also elsewhere, to try to erase information that is not useful to the public anymore, even if you're famous. Is that what they call the right to be forgotten? Is that what that is? Yeah, it's a bad name for it, but it is.
00:42:06
Speaker
Okay. And is that justified in relation to the value of self-expression, or how does that... can you, sorry? Yeah, sure. If you're cast into a certain character, for example, the person who was caught drunk driving once, then you don't have much room to express yourself differently. And we hear this from people who are famous all the time, right? So definitely. But again, if you are really famous, if you are someone who has a public role, there's also accountability, and that's a competing interest that needs to be balanced with the privacy that you deserve.
00:42:44
Speaker
Yeah, and actually, this is why I remember in your paper you mentioned that accurate recognition by biometrics, I don't remember if you said it's more harmful, is actually as harmful as inaccurate recognition, right? So can you tell us about accuracy and
00:43:11
Speaker
what dimension that adds to this? Yeah, so if you are accurately identified while you go through immigration to go somewhere else or to enter a country, it's really hard to construe that as harmful, because that's the purpose of the thing. So it's not always the case. But sometimes, for me, it was quite paradoxical to see that if you are identified, especially as part of a group,
00:43:42
Speaker
that can actually harm you, because you're stuck in that identification, and you don't experience the world as you want, with your freedom to control your informational narrative; you experience it as whatever it is that you look like. Again, the ad board is very interesting, but it could happen in other ways. For example, if you're crossing the border, before you arrive at the place where they can actually search you, some biometric system can identify you as threatening or non-threatening based on how you look, your ethnicity, and so on, even your gender, or the way in which you can be identified as assigned male or female.
00:44:28
Speaker
And that is harmful for you, if you are the person who is searched just because you look a certain way, and it's harmful to society, right? Because we continue to perpetuate these biases, this unfair treatment of individuals just because they look a certain way, and that's coded in the data. Right, right. Well, what would you say to someone who says: well, that shows a problem with that particular classification scheme. In other words, we shouldn't be thinking that you can evaluate whether someone is potentially dangerous based upon how they look. That's a foolish inference; that's not an inference we should make, that this is how they look, therefore, potentially, they're a dangerous actor.
00:45:28
Speaker
In other words, what would you say to someone who says: I can see that there are certain ways of identifying people through biometrics that can be problematic, but I still think that overall, using it at the TSA to identify the civil identity of someone is fine. What would you say to someone giving that kind of perspective?
00:45:53
Speaker
Yeah, so I think that there is some truth to that. We have this technology, and in certain circumstances it allows you to be more accurate and be sure who's crossing the border, for law enforcement purposes. That cannot be denied. But the problem with it is that, at least when I wrote the paper, and now we have something in the EU, but
00:46:26
Speaker
there was no specific regulation of it one year ago. And still there isn't; I mean, the AI Act is in force, but it will not be fully applicable until 2026. So these databases of our faces and our behavioral characteristics are compiled already. All of us, I think, were part of a generation that didn't really mind putting our pictures on Facebook 15 years ago. I personally didn't mind. And so we are,
00:47:00
Speaker
I think, in different databases, and these databases can be used with no restraint: by law enforcement, which, as a European, as a Western lawyer, bothers me particularly, but also by private actors that are gaining more and more power, especially those companies that are providing this technology. But not only, right? There have been scandals about companies that we wouldn't know, but that actually were very powerful because of the databases of people's faces they had compiled. Yeah, so actually I would be curious if you could elaborate on that from the legal perspective.
00:47:39
Speaker
What is worrisome about, say, law enforcement finding all the pictures they can of Sam Bennett in high school doing dumb stuff and all that? From a legal perspective, I'm curious how you think about it. Obviously, intuitively it sounds problematic, but I'm curious how you would flesh it out. Yeah, so there is a level that is more practical. If I were still a practicing lawyer and you came to me and said, look, I don't feel good about the FBI having all my pictures, or law enforcement in general; what can I do? Well, of course, we have laws that are based on consent, which is also something problematic, but under which you can try to claim that a particular picture shouldn't be there, because you didn't give consent to collect it, store it, and process it. On that level, that is something the legal system allows us to do in different ways in different countries, but in many countries we have those laws.
00:48:49
Speaker
On a higher level, you have to be a little more conceptual, in the sense that there's no particular article I can cite to say: well, this is illegal. But I think we can see how a police service that has as much information about you as possible, not just about what you do, but what you did, how you looked, and how you look now,
00:49:18
Speaker
and that is in possession of technology that can process images, for example, predict how you will look in 10 years, or what your children look like, whatever is possible to do right now, that is something that runs counter to the idea that we shouldn't be living in a police state, that there's no need to have information about me
00:49:44
Speaker
unless there's a justified need to have it, right? There were the totalitarian systems that kept files on everybody: you didn't do anything wrong, but they would collect information about you and put it in a file, just in case. And law enforcement agencies still do that. In the long run, that is not a free society, right? Where you're observed at all times, and there's information about you at all times.
00:50:12
Speaker
Because they kind of have a burden of proof, right? Don't they need an initial justification to get information from us that's private in nature? Is that basically part of the right to privacy? Yeah, so especially in the US, but also in Europe, the right to privacy is built as privacy against the intrusion of the state.
00:50:38
Speaker
Then we have a privacy tort that operates among individuals. Again, if you reveal secrets about me and that causes me harm, I can try to sue you. But that's a different thing. The constitutional right to privacy, as it has been developed from the Fourth Amendment in the US, but also from the European Convention on Human Rights and the European Charter of Fundamental Rights,
00:50:59
Speaker
is a right to be shielded from the state's intrusion, unless it is necessary for something else. You go to a public hospital, and you have to give all sorts of information; we understand why there's a justification for it. Yeah. Right, so I was thinking about this news from last year, because I gave a presentation about this paper before I tried to publish it; it was already in process in my department, and I used this example. It was close to Labor Day weekend in the US, and in New York City, the police department was
00:51:35
Speaker
flying drones all around the city with cameras and facial recognition technology, so that it could look into people's backyards to make sure that everything was fine, because during Labor Day weekend they get a lot of calls: people are loud, people have people over, it's a party, and there's noise, and so on. So in order to protect, and kind of use well, the resources of the New York Police Department, they would use these drones to look into people's backyards with facial recognition technology. That, for me, is very chilling. I do understand the need to protect police officers; that's not the point. But that the police are able to look into my backyard when there's no
00:52:24
Speaker
notice that something is going on that is not fine, when they don't have a warrant to enter my house, right? That doesn't make me feel completely safe. And I'm taking the US as an example, but I could take many other jurisdictions throughout Europe and here, where something similar could be done or is done.
00:52:49
Speaker
Yeah, fascinating. As we get closer to wrapping up, I'm kind of curious, and Roberto, you can help me with this: given the way you understand the potential harms of biometric technology, what do you think would be the correct legal structure or response?
00:53:18
Speaker
How should the legal world respond? What is it not doing that it should be doing, given the nature of the biometric harm that you're talking about?

How can legal systems address biometric harm?

00:53:27
Speaker
Yeah, so one of the things that I'm trying to write about now is how we should get rid of consent as a legal basis for processing data. I think consent doesn't work very well, in particular for sensitive data, but really for all kinds of data, because if we
00:53:51
Speaker
accept that data protection is a constitutional right, then you should not be able to trade in constitutional rights. So you should not be able to give away your data so that you can have a Facebook profile, or so that you can access any app, which is what we do when we click consent, right? We trade our personal data for a service.
00:54:12
Speaker
So it's almost like a transaction. It's like, I give you my private life and you let me go on Facebook, basically. Okay, so you're saying we shouldn't even be allowed to... And I'm not the only one saying this, it's Anita Allen who has been saying this, but I'm trying to link it to
00:54:35
Speaker
the fact that there's a lot of path dependence in this field, and so consent is now the legal basis and everybody uses it. So, for biometric harm, what I'm proposing, what I think the paper tries to propose, is that if you are identified with biometrics without a legal justification, and that includes consent, we didn't talk about that, but the last part of the paper also discusses consent,
00:55:00
Speaker
if you are identified without justification, that is harmful. It means you can get compensation, you can sue, right? That is a mechanism that usually prompts actors, public or private, to respect the law and to use biometric identification only when they can justify it.
00:55:26
Speaker
So the change to the legal system that I'm proposing is, I think, very small: just make this biometric harm actionable. It should be possible to go to court and get compensation for it.
00:55:42
Speaker
But I have to say that when I tried to publish this paper in a tort law journal, a very classical one, I was actually rejected very fast, because this seems very far-fetched from the point of view of classical tort law. We usually only compensate personal injuries, bodily harm, or economic harm to property, right? So it's a small step, but not that small. Interesting. We don't own our personal data, so it can't be construed as property? There's a lot of discussion about this. Who owns the data? Because the data is about yourself, there's one argument that says it should be mine: my data, my face geometry, whatever, it's mine,
00:56:31
Speaker
and I should be able to trade it, in a way. But because there are these very serious side effects, especially if everybody at the individual level is allowed to trade data, and we're seeing this with all the arguments about the surveillance society that people have been putting forward, we should reconsider the possibility to trade. If you own something, you usually are able to sell it. This would be, in my view: you own it, but you're not able to sell it, at least not that easily. And we should think about what we could allow. Wow. Fascinating. For example, you're recording my voice and my
00:57:24
Speaker
image now, and I should be able to consent to that, right? But at the same time, should I be able to trade my personal data forever, just to have a Facebook profile? Can this be the payment for it? There's a lot of thinking to be done about this issue.
00:57:42
Speaker
I mean, I guess I technically own my kidney, but it's illegal, I think, for me to sell my kidney, right? So it's something like that. Interesting. It is very similar to your body, right? Especially biometrics. It's a small step: think of your face geometry a little bit like your kidney. You own it, but do you really? Can you sell it?
00:58:06
Speaker
I think on that excellent analogy, maybe we should wrap it up. That's fantastic. Thank you so much for coming on, Sara. It's a fascinating paper, and it's been really fun to talk to you about it. Thank you so much. I also had a lot of fun.
00:58:35
Speaker
Thanks, everyone, for tuning in to the AI and Technology Ethics Podcast. If you found the content interesting or important, please share it with your social networks; it would help us out a lot. The music you're listening to is by The Missing Shade of Blue, which is basically just me. We'll be back next month with a fresh new episode. Until then, be safe, my friends.