
#8 Muriel Leuenberger: Track Thyself(?)

AI and Technology Ethics Podcast
97 plays · 3 months ago

Muriel Leuenberger is a postdoctoral researcher in the Digital Society Initiative and the Department of Philosophy at the University of Zurich. Her research interests include the ethics of technology and AI, medical ethics (neuro-ethics in particular), philosophy of mind, meaning in life, and the philosophy of identity, authenticity, and genealogy. Today we will be discussing her articles “Technology, Personal Information, and Identity” and “Track Thyself? The Value and Ethics of Self-knowledge Through Technology”—both published in 2024.

Some of the topics we discuss are the different types of personal information technology, narrative identity theory, and the effects that personal information technology can have on our personal identity (positive, negative, and ambiguous)—among many other topics. We hope you enjoy the conversation as much as we did.

For more info on the show, please visit ethicscircle.org.

Transcript

Introduction to Dr. Muriel Leuenberger's Work

00:00:16
Speaker
All right. So we are thrilled to welcome Dr. Muriel Leuenberger, a philosopher and researcher at the University of Zurich, formerly of Oxford. Her expertise spans digital technology, identity, virtual reality, and the meaning of life. So thanks a lot for joining us today, Muriel. Thank you. Thank you very much for having me.
00:00:34
Speaker
Absolutely. Yeah. So today we're going to be discussing a recent article, "Technology, Personal Information, and Identity," as well as another piece called "Track Thyself? The Value and Ethics of Self-knowledge Through Technology." So before we dive into those, though, could you just kind of tell us a little bit about your

Journey from Science to Philosophy

00:00:52
Speaker
philosophical journey? You know, I'm kind of interested in what initially drew you to philosophy, the context in which you studied it, and just how your interests have evolved over time.
00:01:04
Speaker
Yeah, thank you. So I do actually have a background in science. I studied nanoscience in my bachelor's. And I guess I was just interested in understanding what the world is made of and how it works. But there remain a lot of important unanswered questions that science just doesn't answer. And I also didn't really see myself working in a lab long term. So I switched to a Master's in Philosophy of Science and Technology at the Technical University of Munich.
00:01:35
Speaker
And they have a really nice program there for people with a science or engineering background to switch to philosophy and to reflect on how technology impacts us as individuals and as a society, and on things like how science works on a level that you don't usually learn when you study science, like what defines a law of nature or what a paradigm change is.

PhD Research on Authenticity and Technology

00:02:02
Speaker
Yeah, so then I got kind of hooked on philosophy at that level. And I did my PhD in philosophy at the University of Basel, where I worked on authenticity and neuro-interventions. So, are you authentic if you use antidepressants, or if you change after using a brain implant? Or can you even become more authentic through those things?
00:02:32
Speaker
Yeah, I really enjoyed that topic. And after that, I spent two years in Oxford at the Uehiro Centre for Practical Ethics. And there I worked on a related but slightly different topic.

Exploring Meaningful Life in a Digital Society

00:02:45
Speaker
So, on identity and self-knowledge through technology. The work that we're discussing today is from this project.
00:02:56
Speaker
And then I returned to Switzerland, and I just recently started a new project on the meaningful life in the digital society. Oh, very cool. That's exciting. Well, I want to talk about all that, so let's get to the paper, and then hopefully at the end we'll have a little bit of time, especially for the meaningful life in the digital society, if we can get that in there. Okay, so for the listeners, let's begin with the kind of technology we're going to be talking about today. Let's just lay it all out.

Technological Impact on Personal Data

00:03:27
Speaker
What is personal information technology? What precisely are we talking about when we refer to that term?
00:03:34
Speaker
Yeah, so the kinds of technologies I wanted to look at here are technological devices or applications that provide us with unprecedented amounts of information, so an unprecedented quantity, or qualitatively novel information about ourselves. There's a lot of technology available that can measure us in many different ways, or track us, or infer information about us
00:04:04
Speaker
in ways that were just not possible even a few decades ago. And to get a bit more concrete, I look at four types of technology specifically. The first is technology providing autobiographical information. So those are things like pictures, videos, text messages, call logs, or also browser history. Just anything that tracks what you're doing in your life, where you are, who you're talking to, things like that.
00:04:34
Speaker
And then the second type of technology is health and activity trackers. They've also become much more prevalent now. And I have to admit, I just recently bought myself my first smartwatch for activity tracking. So now I'm also part of this personal information industry of health and activity tracking.
00:04:56
Speaker
Yeah, and they can measure a lot of parameters about our bodies: heart rate, sleep cycles, temperature. Besides the watches that we know, there are also, for example, applications that measure glucose levels for people with diabetes, or apps where you can track how much you drink or how much you smoke if you want to quit or reduce. I love those. Or increase, I guess you could, too.
00:05:25
Speaker
And then a further type of technology that I look at is neuroinformation technologies. So anything involving brain scans or brain imaging. And I think this is an interesting subcategory, because the brain is deemed so central to our identity and our personality. And there have also been a lot of advancements in neuroimaging, and mainly in how we process those data.
00:05:51
Speaker
And it has been applied beyond health care, so beyond just diagnosing specific disorders, for example, but also to get information about social attitudes or personality or humor. And there are even now these direct-to-consumer technologies. So you can order a neurotech device that measures your EEG at home; you can wear it the whole day, and it will allegedly measure your emotional states and stress levels.
00:06:22
Speaker
Yeah, and then the last category, the fourth category, is inferences we can draw from online behaviors, so algorithmic profiling, all the traces that we leave online.
00:06:35
Speaker
um We can, of course, infer a lot of things from this. Many companies earn a lot of money with that by inferring preferences, beliefs, age, personality, and much more.

Identity Construction Through Technology

00:06:45
Speaker
And this is, of course, not necessarily information that is used to inform us about who we are, but to inform companies about who we are, so they can sell us things, or make us vote for someone, or whatever interests they pursue.
00:07:02
Speaker
So, just to recap, just to make sure that I can intelligently ask you questions later: it's basically the functions that a smartphone performs, right? It keeps all your pictures, your call logs, all of that; it's almost like raw data. It's also the bio-information technology like smartwatches and my Fitbit. I'm still trying to get sponsored, Sam, so I'm going to mention it at least twice per episode. We're getting close, I think. Yeah, I think so.
00:07:31
Speaker
Then it's the neurotechnology, and you're including commercially available stuff, right? Yeah. So, on the one hand, the diagnosing technologies, like the big machines you get in the hospital, but also these direct-to-consumer neurotechnologies that we can order at home. Yeah. Okay. So that brings up a distinction: some of these personal information technologies are ones that are on the person. Obviously the cell phone is on the person throughout the day, generally, or the health tracker, if it's a Fitbit, is on the person. Whereas some of these other technologies are more likely to be housed in a hospital, but they are designed to gain,
00:08:20
Speaker
yeah, information about the person; about maybe not their identity or who they are, but at least some physical features of them, like how your brain is operating. So there are kind of distinctions, I guess. Yeah. I think there are a lot of ways we can gather information about ourselves. But I think what is distinct about those types of technologies is that they intend to gather some information from either your body or your mind
00:08:55
Speaker
about, yeah, aspects of your identity, ultimately. And just so I can make sure that the listeners have the fourth category: the last one, you call it algorithmic profiling. And so that's the one where Amazon knows that I like Agatha Christie.
00:09:11
Speaker
So when there's a new movie, it'll be like, you'd like this, you nerd. So it's that kind of thing. Yeah, exactly. You just leave some traces, and based on those traces, which are things you do online, but also the people you connect with, for example, this all creates profiles. These companies throw some algorithms at them, and then they see, oh, you must be someone who likes this movie, or this Agatha Christie book, or whatnot. Okay. Good. So this is the focus: personal information technology. And then the philosophical angle you're taking on it is kind of unique or different, because most people, when they're looking at personal information technology, are thinking about privacy infringement, manipulation, potential abuse of that
00:10:10
Speaker
information they have about you, but you're focusing on personal identity. So can you introduce us to this concern, and to what kind of distinct questions come up when your focus is personal identity? Yeah. So the basic idea of this project, and what this paper came out of, was that we already have all this research that looks at what happens when all this personal information gets into the hands of others, into the wrong hands. What if it gets to people who manipulate you or don't respect your privacy? And these are obviously all very important topics. What I want to do now is look at what happens when this personal information gets into your hands. What does it do to you? And I think because we develop our identity in interaction with others and with our environment, we construct our identity
00:11:07
Speaker
in this really big interactive system, and technology has become an integral part of it. What I want to look at is how technology intersects with those processes of identity construction, by being part of our relationships with others, but also just by being part of our environment and giving us feedback. And I want to look at this on a descriptive level, so just seeing what's going on, where does technology intersect,
00:11:37
Speaker
and also then raise ethical concerns. So, to look at issues that we might miss when we don't have this focus on identity in mind. I think it just gives a new lens to understand some processes that are going on, and to raise some ethical issues that might be overlooked.
00:11:58
Speaker
Right. So it's not about identity risks; that would be another sort of traditional issue. People are obviously worried about, you know, your credentials being stolen when you're remote working, making you vulnerable to cyber attacks. But that's a different kind of concern; I don't know where you would classify that. This is more about personal identity: who you are, what defines you, and how these technologies might be impacting it, influencing it, changing it, that sort of thing. Yeah.
00:12:40
Speaker
So before we move on, I mean, we obviously have to unpack a bunch of stuff here, like narrative identity theory and what self-knowledge is and all that. But before we do, is it possible for you to give us just one example, maybe a pernicious example, of how these technologies might affect us negatively? A negative effect might be, for example, that I am overwhelmed by information that I get about myself and kind of lose focus of what might be important about me. For example, health apps bombard us with information. And then if I try to integrate that, to make sense of that, I might struggle to do so.
00:13:26
Speaker
Or I might believe wrong information. I might buy one of those direct-to-consumer neurotechnology devices, and they're just not very reliable. And then I believe what they say about me: oh, I'm in this and that emotional state, so I react to it in a certain way. And this might just be wrong. And then I build a wrong image of who I am, and what kind of situation I'm in, and how I'm feeling.
00:13:54
Speaker
I have, I think, a funny example here. Every morning I check my sleep tracker to see how I slept last night. And my fiancée will ask me, you know, how do you feel? And now I immediately say, let me ask my tracker to see how I feel. Even though I feel great, if it says I only got seven hours, like I'm missing a whole hour, I'm tired. So I start believing that I'm tired. That's interesting.
00:14:22
Speaker
Yeah, I mean, that's certainly a further concern: on the one hand, you lose trust in your own abilities to know yourself. But you also might just lose those abilities. If I have this neuro device that tells me every second how I'm feeling, why do I have to pay attention? I can just read it off, right? Yeah. One thing I've been keeping track of now is my resting heart rate.
00:14:49
Speaker
It's 51, by the way. Who cares? Like, why do I... It went up; it used to be 50, and now it bothers me that it's 51. But this should be trivial to me. Yeah, that's the thing. You might worry about the wrong things. It shifts your focus to whatever those devices can measure, and that might not be what you should focus on. In some cases it might be good, but in many cases it might just not be very relevant.
00:15:16
Speaker
Okay. Yeah. That's interesting. Great. So we'll have more questions about all that soon, but maybe just a bit of further stage setting: you mentioned that you approach the issue of personal information technology through the lens of narrative identity theory. So can you explain

Narrative Identity Theory and Technology

00:15:39
Speaker
that a little bit? What is narrative identity theory?
00:15:44
Speaker
Yeah, of course. So the basic idea of narrative identity theory is that humans integrate their experiences into an ongoing, internalized, evolving narrative or life story. So you have this self-narrative, which recounts your life from your personal perspective. And it reflects your characteristics, goals, and values, and just what happened in your life.
00:16:11
Speaker
And this doesn't mean that you actually tell yourself this story; you're not going to actually sit down every evening and tell yourself what happened in your life. But we do make parts of it explicit. When you asked me about my philosophical journey, that's exactly what I did, right? I recounted a part of my life narrative. Or if someone asks, oh, why did you do that? Then you're going to give a short account of what your motives and intentions were and what occurred. So it's partially explicit, but we also have all those background notions of what happened in our life trajectory, and this kind of projection into the future: I know I came from here, I'm doing this, now I'm standing here in my life, I have these concerns, and this is where I see myself going next week, next year, in 20 years, whatever.
00:17:06
Speaker
um And the present moment, wherever you're at, is also kind of experienced in relation to this past and future. And it becomes um kind of intelligible and meaningful through it. So say you won a marathon, um then this experience of like winning the marathon,
00:17:26
Speaker
is experienced in relation to all the training you went through, or the goals you have in the future. If you just participated, not very seriously, you just wanted to do it once, then it will be a different experience than if you practiced for a really long time or hope to become a professional or something. So the present moment is set within this narrative, and that changes our experience of it.
00:17:57
Speaker
And it seems that it's action-guiding a little bit, right? So there's that normative element. I guess, essentially, correct me if I'm wrong, it seems like you want to have a nice, healthy narrative about yourself. Is that me reading too far into it, or is that okay?
00:18:14
Speaker
So there are some constraints on it. In the literature, the literature I base this on, they say, well, if you want to engage in specific practices with other people, specifically practices of holding each other accountable and responsible, then we have some conditions. So your identity needs to satisfy some conditions. One of those is the articulation constraint: you need to be able to explain yourself to some degree. If I do something and people ask me, oh, why did you do that? and I just cannot say what was going on, then at some point I will run into trouble if I can never explain myself.
00:19:01
Speaker
And we need to share some common ground about what's going on and how the world works. This is the reality constraint. If someone thinks he's Napoleon, I'm not going to sign a contract with that person, and I'm not going to hold him responsible to the same degree as other people. So these are some important constraints on these narratives. It doesn't have to mean that I'm necessarily super happy with my identity; maybe I'm disappointed in myself, or something like that. But in this sense, a healthy identity is one which is largely correct and which I can articulate. Yeah.
00:19:44
Speaker
Yeah, that kind of brings up a question for me. The way you put that, would you say that a person's actual identity, who they are, is not entirely determined by the person's narrative, because of course that narrative can be incorrect? You could have an erroneous narrative about yourself. You could have a narrative where you're just the good Samaritan, doing great things for people all the time, kind and patient and all that, but that could be totally wrong in virtue of how you actually behave. So,
00:20:30
Speaker
anyway, there's a difference between how you understand yourself, the narrative that you're telling about yourself, versus your actual objective identity, right? Yeah. This is actually a very tricky question for me still. So, I would say there are some hard facts, which you will just run into really big trouble if you don't
00:20:58
Speaker
accept them as part of your narrative, or if you're really self-deluded or ignorant of them. But then, within those facts, there's still room to interpret yourself differently, and there's room for creativity. So say one person is born in France, and for them, oh yeah, being French, this is really me, this is really my identity. And for someone else, this very same fact just doesn't really mean anything to them, and they don't care. Yeah, and in this way,
00:21:36
Speaker
different parts of those facts can become important to us in our self-narratives, or they can be unimportant; we can push them aside, and we can create ourselves to a degree as well. We have some freedom in this narrative. But I think, to some degree, the self-narrative, even if very self-deluded, will create a sort of reality, insofar as it will determine how you experience yourself and your environment. If you think, oh no, everyone hates me, then this is what you will perceive as well, right? So in this sense there is a reality to it, but that doesn't mean there aren't hard facts that might speak against it.
00:22:21
Speaker
Yeah. Sorry, Roberto, go ahead. Yes, sir. Just one follow-up here, and I hope this isn't the question that derails the conversation, so I apologize ahead of time. You're saying that there might sometimes be a bit of a conflict. Okay, so I guess the question is this: is it possible to have more than one narrative, in the sense that in different situations you're like a different person and you follow a different narrative along the way? Because you said it's not always conscious; you don't always have to articulate it, and it's not always a story that you're telling yourself every night, right? So can you talk about that?
00:23:05
Speaker
Yeah, so my idea here, in terms of how we can be different people in different contexts and so on, is that the self-narrative would encompass this. You as a person would still see yourself as: oh yeah, well, at work I'm a bit more shy, and then when I meet these friends I'm a bit crazy, or whatever. I think this can all be part of your self-narrative. I wouldn't say that you have one narrative and another, but maybe different narrative threads,
00:23:35
Speaker
but they come together in your self-narrative, in your overall view of who you are and where you stand. But then in different situations, maybe you're more aware of one than the other; one is more in the background than the other. Right. Well, maybe now we can turn to how personal information technology can impact our self-narratives.

Personal Information Technology's Role in Self-Perception

00:24:02
Speaker
So, the first way that you point out, which is probably the most obvious one, is that it can provide content. One example is that your smartphone can provide the data that, hey, you're on this thing for five hours every day, or whatever it is. So, anyway, can you get into this whole issue of
00:24:30
Speaker
how it can provide content for our self-narratives, and then also the different types of content? Yeah, certainly. I think the main and most obvious way these personal information technologies can impact us is just by providing information that we can integrate into those self-narratives and that can shape who we take ourselves to be.
00:24:53
Speaker
And there are different kinds of content, I think, that we can distinguish. The first is data points. This is content that is very reduced, for example a single measurement of body temperature, or a step count, or a neural state. And this information needs to be integrated into a narrative to gain meaning. A temperature measurement can mean very different things to people in very different situations. For one person, an increased temperature means a really stressful situation, because they have to go to work now and cannot afford to stay home; for someone else, it will be less concerning. Integrating it into the narrative provides this meaning and context to the single data point.
00:25:41
Speaker
And the second type of content I identify is patterns. By patterns I understand something like categories and labels that we can then use to interpret further behavior, in the future or in the past. So, for example, instead of just having a measurement, "oh, I had one bad night," there's "oh, I'm a bad sleeper." That's a pattern: something I can expect to recur, and with which I can explain past instances. Or "I'm an introvert." That's a pattern that can help you identify what was going on in past situations or what you can expect in the future.
00:26:26
Speaker
And the last type is templates and narrative pieces. This is content that already has a temporal, narrative structure. So, for example, a text message of someone saying, oh, I'm breaking up with you. This is not just a single data point; that's kind of a story, right? There's a past, and there's something happening in the future. It's a piece of a narrative that's readily implementable into your own self-narrative, where again it can gain more context, of course. Or also health trackers: they can set us on a path. Here you are now with this really high heart rate, and if you do these exercises, you will bring it down. It kind of sets you on a path where you can project yourself into the future.
00:27:20
Speaker
So, as I said, the last type was templates and pieces with these narrative structures. Templates are generic narrative structures that are typically culturally shared. A common example is a redemption narrative: someone overcoming and learning from negative life events.
00:27:37
Speaker
And some of those technologies also use those templates. There's this smoke-free app that tells you to imagine yourself as a non-smoker, as someone who overcame this addiction and is now smoke-free. So they want you to shape your self-narrative and project yourself into the future as someone within such a redemption story, to keep you smoke-free. Okay. So it's data,
00:28:06
Speaker
patterns, and narrative templates slash pieces. Can I give you an individual example from my life, and you tell me if I got it right? Yes, please. This is ridiculous, but I think it'll help the listeners, too. And this is not just me getting personal advice right now from Muriel. Free counseling session for Roberto. Yes, that's what this is. Anyway, so there you go.
00:28:33
Speaker
Data: how about a photo? An individual photo might be data. The example that I have in mind here is that my iPhone gives me my memories sometimes. It's like, hey, a year ago you did this. And sometimes I'm legitimately surprised: I was in wine country last year at this time? So that would be a data point.
00:28:54
Speaker
Yeah, that would be a data point. Some pictures can also already have narrative structure, if they really strongly point towards past and future, like when there are big events very obviously surrounding them, so that it comes with a story right away. But otherwise, I mean,
00:29:17
Speaker
yeah. I mean, I was thinking, what do you think? The individual picture, I feel, is a data point, but then it starts packaging them. I was going to bring up the same thing, because sometimes I'll look at my phone and it's like, you went on vacation, look over the photos, and it collects the photos into a story. And then, talking about what you mentioned earlier, Muriel, with losing capacities, it's kind of funny how in the past I would have had to
00:29:52
Speaker
do some kind of Kantian synthesis of different things and bring them together into this one story. But instead it's like, no, it's already given to me as this set. Anyway, that's kind of what you were saying: once it packages it, that's starting to get to the narrative piece, or template, maybe.
00:30:12
Speaker
Yeah, and it kind of creates a story for you a little bit already, right? That was the beginning, and then in the middle you were there, and then in the end there was that. So this gets a little bit of a narrative structure already.
00:30:24
Speaker
So the photo can be either a data point or have more of a template attached to it. And then the pattern: I think the easiest example of a pattern for me is what we've already touched on, the sleep stuff. I used to believe that I was a horrible sleeper, and then I started tracking my sleep. And I was like, hey, only like once a month do I sleep poorly. It's not that bad. So that was actually a positive impact of tracking my sleep. So that would be a pattern.
00:30:51
Speaker
Yeah, that would be a pattern. And I think you can also see how it might help redefine your identity: okay, yeah, I'm not a bad sleeper, and then maybe the reason why I'm tired is something else. So you search for different explanations; you try to keep your narrative intelligible, right? Once one explanation is lost, you look for another one. That's how these patterns can restructure our narratives. So, Muriel, what do you think about... I was just thinking of one issue: do you have any opinions about diaries? Because I was thinking keeping a diary is probably something on the wane, I'm going to say. I mean, I'm not sure how popular it was in the past, but I just imagine that it's not as common now, because it's like your
00:31:50
Speaker
phone almost does it for you, to a certain degree. I mean, I don't know. I guess this is just getting back to the whole issue of whether certain skills of ours are going to atrophy, or whether we're not going to develop certain skills, in virtue of these personal information technologies. Because, yeah, just an obvious sort of
00:32:13
Speaker
personal information practice is diary-keeping, where you write down what happened that day or that month. And that's a way of gaining self-knowledge about yourself. Anyway, I would imagine that's probably on the wane, but maybe that's not true, I don't know. Do you have any thoughts about that? Yeah, I think it's interesting. I think
00:32:37
Speaker
I think one reason why we write diaries is, on the one hand, to reflect about ourselves, and also just to keep the record, right? I can go back and look up: what did I do? Who was I? And I think that's part of the motivation why many people take a lot of pictures. It's also to document what's been going on and to go back to it and to remember and to reminisce.
00:33:02
Speaker
And I think, given that we can satisfy this need electronically much quicker, much easier, with much less effort, it's just pressing a button instead of actually sitting down, reflecting, and writing. So I think it makes sense that it goes back a bit. But also we lose something, right? In pictures, we're not really reflecting. We just take the pictures. We're not creating
00:33:29
Speaker
a story and selecting what's been important in the day so much as we would in the diary. There might be something else, though, because when you're writing in your diary, I think the goal is to, you know, accurately record what was going on. But I see a lot of people take pictures, and there's sort of a choreographed, performative element that doesn't really reflect accurately what was going on. Like, a minute before, they were pissed. Are we allowed to say this on the show? I don't even know. But they're like, no, no, take the selfie with me, and they put on a smile. And so that would be an inaccurate memory, almost, if you forgot the context before. So I think there's something there.
00:34:14
Speaker
That is true, that is true. And it's of course because the diary had different functions, in the sense that the idea was that no one's going to read it except you, right? Or maybe in special circumstances. And why are we taking pictures? Like, those pictures that you're talking about are probably to share. So there is this whole other layer of self-presentation going on. Which, I think, also happens in a diary, but it's more towards yourself. You want to maybe not look so bad towards yourself. I think it could be very hard to still be completely honest towards yourself, even in a diary. But definitely much harder if it's aimed at sharing with people.
00:35:01
Speaker
Yeah. Right. Well, we brought up the issue of honesty. Maybe we can turn to kind of a positive impact that this technology can have on our self-narratives, which is potentially overcoming self-delusions. That's one issue you bring up. And then another one is enhancing our self-understanding. So, yeah, can you give us some examples of that?

Technology's Influence on Self-Understanding

00:35:25
Speaker
And then also, yeah,
00:35:28
Speaker
Yeah, it just introduces us to those kinds of positive impacts. Yeah, surely. I think this idea that these technologies can help us to overcome some self-delusions is pretty straightforward. So I can learn about myself, for example about my health, through a health-tracking device. And this can make my self-narrative more accurate and more detailed. Like you said, you have now a more accurate picture maybe of your sleep, of whether or not you're a good sleeper.
00:36:02
Speaker
And I think another quite straightforward one is: if you think that you're a very helpful person, and then you scroll through your texts with your friends and you see, oh, well, they keep asking me to help them with stuff and I just never do it. There's a lot of potential sources for self-knowledge and for overcoming self-delusions. And this can help us to develop a self-narrative which is
00:36:33
Speaker
more suitable for navigating the world, in the sense that we can make better predictions about ourselves. We can pursue projects that fit us better, or have a focus on, say, not my sleep issues, which are not existing, but maybe other things. And it can align my self-image more with the image other people may have of me, that I'm not a very helpful person, for example, which can also make interactions with them easier. And yeah, sometimes this information can be kind of like an addition. There was something I just didn't know anything about and didn't have an opinion about where I stand on it. I didn't have an opinion on what my blood oxygen level is before I had a smartwatch.
00:37:20
Speaker
So that was just an addition that I can integrate if I wish to do so. And sometimes it's kind of a correction. So I had a belief, oh, I'm very helpful, but then I was mistaken, and this information corrected me. And if this information is kind of contradicting me, I can either adjust my self-narrative and say, OK, I see, I'm not helpful after all,
00:37:48
Speaker
or I can try to deny this and deny the credibility of this technology. And depending on the technology, that can be more or less difficult. So if Google characterizes me wrong, I'm not going to say, oh, well, they must know. No, I'm going to say, well, then, what do they know? Or laugh about who they think I am. But then when I look at text messages, that's going to be much harder to refute, or video evidence or something like that.
00:38:18
Speaker
Yeah. That issue of it being difficult to refute, that seems relevant to this kind of question. Because I was thinking, OK, yeah, I could discover that I'm not so helpful by reviewing my text messages and realizing that, you know, when someone asked me to find that PDF, I didn't send it to them. Sorry, Roberto. But, you know,
00:38:48
Speaker
but someone might think, oh, but, you know, that's always the case. Like, every interaction you have with other people is always an opportunity. It's always providing data about who you are. So, yeah, when I ignore the person on the street who asked me for some change, I could use that as data to confirm that I'm not a good Samaritan. In other words, someone might just say, well, yeah, this is
00:39:18
Speaker
another potential source of overcoming self-delusions. But is there anything unique or different about this? Does that make sense, what I'm trying to get at? Yeah. So I think there are a lot of parallels, of course, that's true. Like, other people can tell me, no, man, you never help me, you're not helping me. But I think what is interesting is that there are at least some technologies which are much harder to refute than even evidence from friends, for example. So,
00:39:57
Speaker
if I have it in black and white that I'm just ignoring questions, that's different. Otherwise, it's much easier to just not remember the things that we maybe don't like to remember about ourselves. Or if the friend says something, I'm like, yeah, he's just being cranky for whatever reason, or, I don't know, I can rationalize it easier, I think. There's always ways to rationalize things, even in texts.
00:40:26
Speaker
But I think there's just something about it, this more objective measurement, so to say, of this interaction, or this record of an interaction, or a picture, for example, or a video,
00:40:45
Speaker
that is really hard to grapple with. Because with other people, you can always say, ah, this is something about you, or, oh, it's just because I'm having this insecurity, this is why I think I'm not helpful enough, but actually I am, or whatever. There's always more things to point to, I think.
00:41:06
Speaker
And I think I have an example. Tell me if this sounds good, and if it's good, you can use it and just put me in a footnote somewhere or something like that. Oh, I'm excited. So I know this person, I won't say their name, Joe, who basically says they're very, very generous with servers, right? Because he used to be a server. But I see him not be that generous, to be honest. And I bet you, if we could get a data spread of his, like, you know, credit card statement and the times that he gave less than a 20% tip,
00:41:39
Speaker
right, by the way, this is an American thing, that we tip, but it doesn't matter. So when he gives less than a 20% tip, we can basically track it. And over time, we just present him with this data, and we're like, see how you don't always tip 20%? So this is maybe a data point that'd be the kind of thing you're saying is black and white.
00:41:58
Speaker
Yeah, yeah, I think so. And then, you know, if your friend says, oh, you're never tipping enough, you're like, I am, and then you just remember those three times that you tipped really well. And they're like, yeah, no, but those 20 times that you didn't? And you're like, you know, I forgot them. But then, yeah, if it's in black and white, you can't just selectively look at the data points. They're just all there.
00:42:24
Speaker
OK, let's turn to the more ambiguous effects, right? So I think you already kind of mentioned one, like when Google mislabels you. I turned in my car recently, it was a lease, and I turned it in. And so I have been looking at cars, and now Google is sending me links to, like, you know, one feature that Ford always beats Chevrolet in. And I'm not a car guy. I'm really not. Like, I don't care, honestly, so
00:42:52
Speaker
so, currently, yeah, that's what I think, until Google gets under my skin and turns me into a car person. But as of right now, it's clearly mislabeling me. Like, I don't care, I don't click on it, and so probably later it will fix itself. So maybe that's an ambiguous effect, but maybe you can tell us about what some other ambiguous effects are. And in your response, you can tell us about what the reductive view of ourselves is, what you mean by that.

Reductive View of Identity Through Technology

00:43:22
Speaker
Yeah. So I think one thing about this mislabeling by Google is interesting, because on the one hand, yeah, it's very annoying, right? Then you get all those ads and it's like, that's not me, why do they think that? I mean, in your case, it might be very obvious. But it can also inspire us to think about who we are. It can be a source for self-reflection to just look at Google's characterizations and be like, is this me?
00:43:51
Speaker
That doesn't sound right. Why do I have this in my category list? Yeah, but on to some other and maybe more ambiguous view: this reductive view that I talk about. And yeah, I think I go over it in the paper a bit quickly.
00:44:07
Speaker
So the idea is that we can look at our actions either as the results of kind of biochemical processes. So, for example, when I'm waving my hand, that's like neurons firing, and then there's muscles contracting, and then the bones are doing something and whatnot. So there's this physical, biological, chemical story going on of what I'm doing when I'm waving my hand.
00:44:35
Speaker
But of course, it can also be about something. I can express an intention. I saw my friend, I wanted to be friendly to them, I was waving. So this action can go from these biochemical atoms bumping into each other to: I saw a friend, I wanted to be friendly. And I'm not saying that one of them is correct and not the other. It's kind of two sides of the same coin.
00:45:04
Speaker
But I think if we have this abundance of bio-information, this abundance of information about what's happening in my body (oh, I don't know, my direct-to-consumer neurotechnology device is measuring my EEG, and I'm having this emotion, my brain is firing like that and this), then we can kind of be shifted towards this more biochemical perspective.
00:45:30
Speaker
And this is not bad in every situation, but I think it can make our actions less meaningful. So if I say this hand-waving is just, like, neurons and muscles and something, this is not a meaningful action.
00:45:43
Speaker
This is just chemistry and physics happening. But if I see a friend and I want to say hi, then this can be a meaningful action. So I think if we have a strong shift towards this more biochemical view, it can have this negative impact of limiting the meaningfulness of our actions.
00:46:05
Speaker
So I think I have an example, and maybe this will be good. So I read, I think it was in Nita Farahany's book, The Battle for Your Brain, there is this smart cap or whatever with EEGs on it, and it's to help you meditate. And so when you're meditating,
00:46:27
Speaker
well, I'm not even sure how to say it: when you're actually doing what you're supposed to be doing, you should get a particular neural signature. And so it'll tell you, along with the app, it's like, oh, you're meditating right now. And so I can imagine someone, if someone has a bad meditation morning, you know,
00:46:45
Speaker
then the rest of the day, they'll just say, well, I might just be having an off day. And they'll, you know, ascribe their actions to it being an off day, neurally speaking. And yeah, I don't know, maybe is that close to what you're getting at? Yeah, that's an interesting example. And then it kind of turns
00:47:03
Speaker
meditation, which can also be this very spiritual practice, right? I go, I don't know, to the center of the self, or abandon the self and become one. It turns it into: oh, I'm trying to make my brainwaves look like that. It's a very different thing, right? One is this deep, meaningful, spiritual thing, and the other is I'm trying to manipulate neurons into behaving a certain way and showing up on the screen in a certain way that I'm measuring.
00:47:33
Speaker
Yeah, whenever I see anything that's trying to gamify, you know, any kind of meditative practice, I'm like, I think you're doing it wrong. I'm not sure that's the goal here. Yeah, yeah, you might kind of miss the point of that. Yeah. Well, speaking of the spiritual stuff, that kind of reminds me of a thought I had. So I don't know if you're familiar with the work of Hartmut Rosa, but he talks about how the quantified self movement, which, by the way, is that a movement? I don't know. People talk about it as a movement. Are there people who consider themselves a part of this movement? Anyway, I'm not really clear about what the quantified self movement is. But he kind of claims that all this stuff is heading in the direction, or encourages the idea, that what's important about you is what can be quantified and measured on, let's say, a Fitbit or something.
00:48:29
Speaker
And it kind of made me think about how, you know, if you think about Western history, how in the past people would be, like, in a Christian context, thinking week by week: am I being prudent? Am I being temperate? Am I
00:48:53
Speaker
being just? In other words, they're trying to see, they're kind of watching themselves, or thinking about themselves, in terms of whether they're living up to various virtues, ideally. And it's just interesting to think about how now people are also watching themselves, keeping track of themselves in some sense, but it does seem to be more reductive, in the way you're talking about it. It seems to be more about, like,
00:49:24
Speaker
my physical characteristics, which, you know, obviously are not as important as whether you're just, or whether you are kind to other people. Anyway, I don't know, what do you think about that? Yeah. So one slogan of this movement, and I don't know if it is one, I think it was a thing when those technologies started to become more available, and I'm not sure if there's much of a movement going on still. So one of their slogans, which I find interesting, is: what gets measured gets managed. So whatever you measure, then you see, oh, this is where I stand, and this is where the ideal parameter would be.
00:50:14
Speaker
So you're much more incentivized to work on that than on the thing that you cannot measure and cannot see, where you don't see your progress. I think this just has to do with human motivation, just how we are motivated. This is also exploited by these gamified technologies. They use our psychology of motivation, and by tracking those goals, it is much easier for us to keep going at it and to stay motivated. And it's much easier to measure, yeah, your heart rate or your, I don't know,
00:50:57
Speaker
bicep size, or whatever you want to improve, than justice. I don't know, maybe we will get a neuro device that will be able to measure justice. Who knows? Maybe one day we will try to improve that, but I think that's a long shot. So yeah, I think because we have those technologies that measure specific parameters, we are much more obsessed with those and might neglect others that might actually be much more important. So I wanted to ask, I'm taking my notes here: when you say that there's a technological external perspective that's in tension with your narrative self-understanding, is this what you were getting at, or is that another direction? It is.
00:51:47
Speaker
Yeah, it is a bit, I think there are some parallels. So the idea there was that I have my own perspective of who I am. I have my self-narrative, my story of where I come from and where I'm going. And then there are all these stories that everyone else has about me: my colleagues and friends and my mom, they all have another story of who I am, an external story that I get to know because other people react to me, or they just tell me. And then this personal information technology also provides an external perspective on me. This measured, technological one is all these data points about who I am, which I can integrate into my narrative.
00:52:27
Speaker
And yeah, this is always going to be a selective story, just like my colleague's story about me is selective. It is just about the work context, and my mom's story is selective in a different way. This technology story is selective as well, and it is typically more reductionistic. It is more concentrated maybe on physical aspects or on autobiographical aspects, depending on the type of technology. But there's a lot that falls between the cracks of those technologies, and maybe a lot of the important stuff as well.
00:53:27
Speaker
You bring up Charles Taylor a little bit, and you talk about this idea of, you know, shared horizons of meaning. And it's related to how not all personal information is inherently significant. You have to kind of place it in a horizon of meaning or something like that. And basically, I think you suggest that this technology can shape or influence these shared horizons. Anyway, can you elaborate on that, if I got that right? Yeah.
00:54:01
Speaker
Yeah, so the idea is that not everything about who we are is important. There are thousands and thousands of facts about each person: how many hairs you have on your head, how many nanometers tall you are, or whatever. These are all things we could know about each other that could be self-defining, but many are not. So we kind of pick out what is important about people, what are important characteristics that we, for example, pick out when we
00:54:33
Speaker
describe someone, or when you introduce someone, for example. And those occur in a social context. I can't just define by myself what the important stuff about people is. I can't just decide, oh, no, my job doesn't really matter to who I am, and where I grew up and all my friends, this all doesn't really matter; what really matters is the hair on my head, or whatever silly thing.
00:54:59
Speaker
Instead, we have social norms about what is important, as well as how we measure what is important. And this can change, and technology is also part of this change. And I think an interesting example of this is genetic heritage testing. So for many people, being, for example, 25% Croatian has become an important part of their identity. And this is something that we just didn't know before, and that was just not important in our identities before. And also, the way we measure heritage: it's much more likely to be defined by genetic heritage than by upbringing.
00:55:42
Speaker
So I think technology can introduce shifts in what we deem as important and create new meaningful categories. Yeah, that makes sense. And, you know, looking at the time, as a way of thinking about the big picture here: you mentioned that not everything about who we are is important or could be self-defining. In one of your papers you mention the example of, like, yeah, it's a fact about me that I put my left sock on before my right this morning, but that's just not very significant or substantial self-knowledge. And so I guess, overall, do you think that personal information technology is going to provide really important insight into ourselves, significant insight, or is it going to be more these, like,
00:56:42
Speaker
trivial things? I mean, I guess from my perspective, I want to say that the really important things about a person are: are they honest? Are they courageous? Are they cruel? Basically, I would be thinking in terms of the list of vices and virtues, you know? And when I think about it from that perspective, I'm kind of thinking,
00:57:03
Speaker
I'm not sure personal information technology is going to be so helpful with that. So anyway, yeah, what do you think? At the end of the day, do you think these personal information technologies are going to be an avenue of significant self-knowledge or not? Yeah. I think there is definitely a lot of trivial and unimportant information in it. But I think maybe as philosophers, we might also have a tendency to
00:57:31
Speaker
neglect the embodied side of identity. So I think in the end, it is also important how you feel in your body, the things you can do with it. People have very different capabilities, and these technologies might help you to explore them and get better at them, for example, to pursue certain goals. And then I think there are just some clear benefits when it's very straightforward, like there's a disease that you managed to discover thanks to these technologies. But also, I think, especially these autobiographical technologies that help you remember things in ways that we just have not been able to before, I think that is quite remarkable, that we can keep in touch with our past in a way that was not possible.
00:58:25
Speaker
Yeah, and I think there's this interesting shift that we mentioned before, where we might present ourselves a bit differently. So the question is, which past are we keeping in touch with when we look at those pictures? But I think it can be an important source of self-knowledge. Say, if you can look at a picture of someone who passed away and whom you really loved, you can kind of get deeper into the kind of relationship you had together.
00:58:55
Speaker
So I think that can be a valuable source of self-knowledge and a valuable input into our identity construction. Awesome. Is it okay if I ask you one more question? And this is going to be a softball. Put on your sage hat; you will be our wise sage. What should we use? What should we stay away from? Just, you know. This is not a softball at all.
00:59:24
Speaker
Yeah, is there anything that for sure looks like it's not helpful? Can you give any general guides like that, and maybe some things that either are helpful, or maybe will be helpful, to kind of keep an eye on?
00:59:39
Speaker
So this is a really tricky one. I think, yeah, trying to record and store and share important memories, but then hopefully in a way that's not distracting you from actually living those moments, I think that can be valuable. And getting some information about your health, if this is easily accessible for you in terms of financial effort, I think
01:00:12
Speaker
that can be valuable. Like, maybe often you use such a health checker and it doesn't really give you any important information and it's not very relevant, but in some cases it might be very helpful. It might help you to understand yourself a bit better. But then it's also really tricky, because you really should try to not obsess over those data. I noticed this myself, too. It's really very easy to just stay glued on, I don't know, your pulse tracking. And I think this is a really big risk, that we are just flooded by this information about ourselves in a way that's no longer helpful. So I think
01:00:56
Speaker
these technologies, yes, but just, you know, try to contain the amount of time you spend on them, and just try to pick out what really matters. Yeah, that's a bit of a broad statement. I think that's as far as my wise sage hat is getting me. Perfect. Well, thanks a lot, Muriel, for coming on. And again, I recommend to all the listeners "Technology, Personal Information, and Identity," please check it out, as well as the other article we touched on, "Track Thyself? The Value and Ethics of Self-Knowledge Through Technology." So thank you again, Muriel. Thank you very much. Thank you for having me and for the discussion. Absolutely.
01:01:50
Speaker
Thanks, everyone, for tuning in to the AI and Technology Ethics podcast. If you found the content interesting or important, please share it with your social networks. It would help us out a lot. The music you're listening to is by The Missing Shade of Blue, which is basically just me. We'll be back next month with a fresh new episode. Until then, be safe, my friends.