
#5 Giovanni Rubeis: Liquid Health

AI and Technology Ethics Podcast
81 Plays · 6 months ago

Giovanni Rubeis is a professor and head of the Department of Biomedical Ethics and Healthcare Ethics at the Karl Landsteiner Private University in Vienna. He has also worked as an ethics consultant for various biotech companies. He is the author of the recently published Ethics of Medical AI. And today we are chatting with Giovanni about his article on liquid health.

Some of the topics we discuss are the notion of liquification, the concept of surveillance capitalism, and the perils of liquid surveillance in healthcare—among many other topics. We hope you enjoy the conversation as much as we did.

Transcript

Introduction to Giovanni Rubeis and Medical AI

00:00:16
Speaker
Hello and welcome to the AI and Technology Ethics Podcast. Today, Sam and I are back with Giovanni Rubeis. Giovanni Rubeis is a professor and head of the Department of Biomedical Ethics and Healthcare Ethics at the Karl Landsteiner Private University in Vienna.

Defining Liquification and Surveillance in Healthcare

00:00:34
Speaker
He has also worked as an ethics consultant for various biotech companies
00:00:38
Speaker
He is the author of the recently published Ethics of Medical AI, and today we are chatting with Giovanni about his article on liquid health. Some of the topics we discuss are the notion of liquification, the concept of surveillance capitalism, and the perils of liquid surveillance in healthcare. Among, of course, many other topics. We hope you enjoy the conversation as much as we did.

Relevance of Liquid Modernity in Neoliberalism

00:01:18
Speaker
The concept of liquid health haunted me for quite some time. And because I'm a big fan of Zygmunt Bauman, and especially of his whole concept of liquid modernity, I think this is such a great way of understanding several tendencies and several developments we have witnessed, especially
00:01:42
Speaker
in the last, let's say, four decades, I think. And I think he also emphasizes this somewhere, that this is especially something that is very relevant for the neoliberal era of modernity.

Power and Economic Dynamics in Modern Capitalism

00:02:00
Speaker
and coincides maybe with what people call postmodern or postmodernity. So this idea of liquidity, I think,
00:02:16
Speaker
is simply a stroke of genius, the idea to frame what is going on in the modern era by comparing it to this physical state of a liquid.
00:02:32
Speaker
In the pre-modern era, you have power relations and economic activities, both are interrelated, that are very limited in terms of time and space.
00:02:49
Speaker
So, especially power relations are tied to time and space. And power was mainly exercised in a clearly defined but limited area. And the modern era then unleashed the economic powers. We know this story from Marx and Engels and so on.
00:03:13
Speaker
Even Adam Smith, this unleashing of the productive forces also meant that economic power transcended time and space in a way, global capitalism, global
00:03:28
Speaker
global exchange of goods and commodities and global trade and so on. And this also meant that we are dealing with a totally different mobility and also velocity. Goods travel faster, information travels faster, people travel faster and wider distances. And
00:03:53
Speaker
Bauman tries to explain this by ascribing to this process what he calls extraterritoriality. So power, and especially economic power, becomes extraterritorial in a way that it is no longer bound to these clearly defined and controlled spaces.
00:04:18
Speaker
Also in terms of time. And I mean, telecommunication, digital technologies nowadays, data transfer are examples of that.

Transcendence of Economic Activities

00:04:31
Speaker
But what makes it really interesting is that he also has another category which he calls
00:04:37
Speaker
socially exterritorial, which means that economic activities, power relations and so on also become socially exterritorial, meaning that economy is not tied to traditional norms and roles and relations anymore.
00:05:01
Speaker
It kind of becomes an end in itself, like this concept by Weber, instrumental rationality. And so this also means that not only do these power relations and these economic activities kind of transcend time and space and traditional bonds and so on, they also transform them.
00:05:28
Speaker
So, especially roles and norms and relations, they are not clear anymore. They are constantly changing. They are shifting. Or, in other words, they become liquid.

Bauman's Liquification and Stability

00:05:47
Speaker
Right, right. So, basically, he's using this metaphor of liquefaction to describe numerous
00:05:55
Speaker
developments that he sees in the modern world. And by the way, just for listeners who don't know, Zygmunt Bauman was a Polish sociologist and philosopher, and he's really significant in sociology. He has numerous major ideas, but one of them that you're talking about here, you know, is this idea that you can describe
00:06:16
Speaker
the times we live in as a sort of liquid modernity. And so it's almost like, instead of describing us as living in postmodern times, he's trying to use a more, what's the word, descriptive term. Yeah, it's a time of liquification, and obviously there are numerous immediate connections you can make to
00:06:39
Speaker
the lack of stability, which is obvious, you know. I mean, anyway, so there's numerous, and we can dive into a lot more of that in just a moment. That's great. And so basically what you're doing... Oh, yeah, Roberto. Yeah. Just for clarification, because I'm less familiar with this space.

Historical Onset of Liquidity

00:06:56
Speaker
Right. So I'm just trying to juxtapose this, you know, with something very, very local and fixed, like the medieval era or something like that. Right. So in those days,
00:07:08
Speaker
the kings were basically warlords, and the limit of their power was basically the contested border next to the next warlord or something like that. And you all knew your place, meaning you all knew what role you're supposed to fill. You make shoes, you make whatever. And now, I mean, it's more than just that we're in a condition of very fluid roles; it's also the case that even information is fluid. Basically every aspect of our social order
00:07:37
Speaker
is very, very fluid. Just so that I get the context, when did this kind of thing begin? It feels like it happened during the Renaissance, or I'm thinking printing press too, but are there kind of like big landmarks in the history of this kind of idea being put into practice?

Transition from Territorial to Economic Power

00:07:56
Speaker
Well, I'm not really sure if Bauman really goes into that much historical detail, but I could think of a few developments. As you said, I think the Renaissance
00:08:11
Speaker
is surely a watershed, but there are also other developments. I mean, it also differs from country to country. Not every country has the same pace here.
00:08:28
Speaker
For example, I think that England and France were quite ahead of, let's say, Germany, for example, or Spain throughout that era. We are talking about, let's say, the 16th and 17th century here. But you see this shift from a ruler whose power is bound to a certain territory.
00:08:54
Speaker
to rulers who suddenly rule empires that stretch across oceans.
00:09:02
Speaker
And these empires are not really political entities. They are more like economic empires, right? So the first settlements in the US or in South America, in Africa, in Southeast Asia, by the British, the French, the Dutch, and so on, they didn't have any political significance. They only had, in some cases, a very rudimentary political organization, if any.
00:09:31
Speaker
But they were very significant in economic terms, and I think this is a major aspect. Because, as I said before, power of course always included economic power, but I think throughout medieval times, power was
00:09:48
Speaker
mostly military power and

Economic Power's Role in Modernity

00:09:50
Speaker
the power to basically define law and exercise juridical power. Plus, you had a clear hierarchy, you had clear roles. As you said, if your father was a shoemaker, you will become a shoemaker. And this is as clear as day, and there is no way around it.
00:10:15
Speaker
The caste into which you were born defined your whole life. And there were only very few exceptions who could transcend this, who could climb the social ladder, because simply there was no social ladder, you know. You had clearly defined compartments. And modernity changed all that, and it changed it
00:10:46
Speaker
Today, we would like to believe the story of the Enlightenment, that all this came about through lots of clever people putting their minds to it and telling everyone, come on guys, these hierarchies are not
00:11:01
Speaker
rational. Let's think about

Digital Integration in Healthcare

00:11:03
Speaker
that. Everyone is equal and so on and so forth. But the real driving force of this was the economic development. And I mean, clearly, Bauman is a Marxist here, or a neo-Marxist or whatever, in this very materialistic way of describing these developments. But don't forget the Black Death too, you know.
00:11:30
Speaker
Yeah, the Black Death had significant
00:11:36
Speaker
economic implications, because what happened after the Black Death was, the population was so decimated that suddenly a farmhand, for example, or an artisan became irreplaceable, because he had an expertise and there was such a shortage of people, not even specialists and experts, but even people who just did manual labor,
00:12:04
Speaker
that you had to give them certain benefits, if not rights. It's no coincidence that the later 15th century was an era of social upheaval.
00:12:20
Speaker
throughout Europe because of that. Because suddenly all these nobles, these noblemen and kings and so on, couldn't do what they wanted. If people didn't do what they wanted, they couldn't just put them to death or put them in prison or whatever,
00:12:44
Speaker
because suddenly the life of an individual human had a worth, you know, was worth something in terms of economic worth.
00:12:58
Speaker
It was not that disposable anymore. And so the Black Death played an immense role in this. And as always, there is no one cause for a very complex phenomenon like that. There are many different causes, but they all come together and form this very weird thing that we call modernity.
00:13:24
Speaker
Yeah, yeah, it's funny. I had to laugh when Roberto brought up the Black Death, because I had flashbacks of
00:13:31
Speaker
drinking with Roberto, and he has like five beers and then he starts, you know, pontificating, lecturing you about the significance of the Black Death for history. Anyway, he's a Black Death guy. He's just convinced that it's incredibly... anyway. But good. Okay. So this is good stuff. So maybe let's now shift. Okay. So basically the idea is, you know,
00:13:57
Speaker
for your paper, you're taking this Bauman lens and applying it to the context of healthcare and medicine, because basically Bauman didn't really talk too much about liquidity in terms of healthcare and stuff. Then you're also applying stuff from Zuboff, her idea of surveillance capitalism; you're taking that concept into play.

mHealth Apps and Data Significance

00:14:24
Speaker
But
00:14:24
Speaker
Before we get into all that, let's just briefly talk about... OK, so you're thinking about, you know, sort of ethical issues, normative issues associated with applying or integrating digital technology into healthcare and medicine. So we're talking about mobile health technologies, mHealth,
00:14:49
Speaker
smart wearables, big data, machine learning. So, I mean, just to kind of orient us, could we initially get a sense of what kinds of technologies we're talking about? I mean, we could bring up some examples of mHealth that are arising, like Kilkari in India and other things like that, but yeah, first let's just kind of talk about the technological stuff right now.
00:15:18
Speaker
Well, mHealth has a huge variety of applications. It ranges from so-called symptom checker apps, which you can use for, I mean, it's basically an app.
00:15:35
Speaker
text-based, or maybe also working with pictures. You basically ask the app, hey, I have this and that headache, or I have this and that ailment, or whatever pain or whatever. What could that be? And the app tells you, well,
00:15:56
Speaker
blah, blah, blah, this could be this and that. And this could be one application, but there are so many others. And the basic principle behind mobile health or the basic idea behind this is to collect
00:16:14
Speaker
data from the daily lives or the environment of persons. So usually doctors deal with data from electronic health records or lab data or what they can see and feel when they encounter the patient. So this is all very artificial.
00:16:37
Speaker
It's lab conditions that we have here, and it's also just a snapshot. When you do, for example, a blood test or something like that.
00:16:50
Speaker
It shows you several things. It's very important data, but it's just a snapshot in time. Whereas if you use an app, for example, to collect data on your vital functions, or track your sleep or your bowel movements or whatever, I don't know. If you use this data,
00:17:17
Speaker
It's longitudinal data, which is much more significant. And it's not data obtained under lab conditions, but in the wild, so to speak. So this is immensely valuable, this behavioral and environmental data. So mHealth is basically about collecting data in
00:17:39
Speaker
in your natural environment and in your daily life, and integrating that data into healthcare. And it can be done through apps, through, as you said, all kinds of wearable sensors, but also through Internet of Things applications, like the sensors and monitoring technology that you have in your home.
00:18:04
Speaker
Right.

Innovative Health Tech Examples

00:18:05
Speaker
I was looking at this bed. I'm in the market for a new bed always. And so it is my plague for this episode. This particular bed can sense, you know, there's this pad on the bottom and it can sense your body temperature.
00:18:21
Speaker
and it'll learn what temperature you like to sleep at and kind of tune itself, like, oh, you need it a little bit cooler, okay, cool. And obviously all of that goes immediately to, you know, you get your app, and all that information goes up immediately to the cloud, to their servers,
00:18:42
Speaker
and they're just collecting all this, you know, a third of your life, they're collecting all that data. Right, yeah. Yeah, another example I read about, you know, is diabetes management. So maybe you have an app that's tracking your blood glucose level, it's monitoring your dietary habits, and then it's giving you, like, medical reminders,
00:19:07
Speaker
Maybe, yeah, offering interventions, like you said, based on real-time analysis. It's monitoring in real time. Or, you know, thinking about mental health, it could be tracking your mood and then, you know, prompting you to do a meditation session if you're getting sad. Anyway, I mean, yeah, so you can just think about all these innumerable potential applications.
00:19:34
Speaker
And I think that's, you know... I was reading about the projected, you know, market size.

Deep Medicine and AI's Role in Healthcare

00:19:39
Speaker
mHealth is supposed to get to almost $90 billion by 2030. So anyway, it's big business, which is kind of, you know, where you come in, Giovanni, with your perspective, thinking about the issue of surveillance capitalism there. So,
00:20:00
Speaker
I don't know. I mean, maybe we could do this quickly, because the potential upsides of mHealth are sort of obvious. But maybe, just to really hit home how beneficial this tech could potentially be, we could talk a little bit about the work of Eric Topol, who is really, you know, one of the biggest researchers in the world in terms of the number of peer-reviewed articles he has. But he's somebody who's really championing
00:20:29
Speaker
the use of digital technology in health. So, I don't know, I mean, it's probably already obvious to the viewers how this could be super beneficial, and you've already kind of touched on it, Giovanni, but yeah: why are so many people super excited about the potential for digital technology in healthcare?
00:20:49
Speaker
I mean, in part, you answered this question already by talking about the projected economic aspects of this. Of course, people are hyped by a $90 billion market. But beyond that, I mean, what people like Eric Topol do
00:21:13
Speaker
He came up with this very important concept of deep medicine. The idea behind this is very simple. You have all these digital technologies, especially AI-based machine learning technologies.
00:21:32
Speaker
You have mobile health technologies, and if you combine them, you can make medicine great again, more or less. This is what he tries to tell us. And that means that, basically,
00:21:54
Speaker
doctors spend a lot of time on very repetitive, time-consuming tasks like data collection, data analysis, not to speak of administrative stuff
00:22:09
Speaker
that has nothing to do with patients, right? This is just red tape or other stuff that costs a lot of time and is not considered a medical task at all. And it's not patient-centered. Maybe patient-related, but not patient-centered in terms of interacting with patients or doing something about their health.
00:22:33
Speaker
The idea of Topol is that with these wonderful technologies, you can delegate a lot of these tasks to machines and programs. This would mean that doctors suddenly would have more time on their hands.
00:22:51
Speaker
And they can spend this time on patients, on the doctor-patient relationship. And he calls this deep empathy, which is very interesting because he basically says that the
00:23:09
Speaker
increased use of technology will not have a dehumanizing effect. It's the other way around. It will lead to a more humane medicine because doctors suddenly have more time and can spend it on the patients and so on.
00:23:28
Speaker
This is all really nice, and I would strongly support this. The potential is there in the technology. But I think what Topol and others
00:23:45
Speaker
totally ignore is the economic aspect of this. I mean, what would be the first thing that health institutions and so on would do if they see that their doctors suddenly have more time? Of course, they would push more patients through the system. I mean, this is basic economic thinking, right?
00:24:11
Speaker
The thing about Topol and this idea is not that it is wrong as such. It's just naive to think that developing and implementing technology alone will make medicine humane again and will bring about this deep empathy.
00:24:33
Speaker
It's the other way around. We have to define this as the major goal of A, technology development and B, implementation of technology. Then we could achieve this. But just saying, oh, the technology is there. We just have to implement and use it and then everything will fall into place. That's not going to happen.
00:24:53
Speaker
And it's naive because it ignores the economic frame of this, and not only the economics of the healthcare system, but also the business model of these tech companies and what has been called surveillance capitalism.

Introduction to Surveillance Capitalism

00:25:16
Speaker
Which would be a good time to talk about this, I think, this concept. Yeah, absolutely.
00:25:22
Speaker
You know, to reiterate, like you said, there are at least three things there. One thing is it's supposed to save time for these people. I mean, he has that quote that, you know, the greatest gift of AI would be the gift of time. Totally, right. And then another idea is being able to actually distinguish at-risk populations. Yeah, that's a really interesting one, where,
00:25:46
Speaker
you know, he has this quote. It's like, we send women for mammograms when they hit age 40, but 88 percent of women will never have breast cancer in their life, right? So it's like,
00:25:55
Speaker
if we actually knew who was at risk, then we wouldn't have to send everybody for a mammogram. So anyway, that's kind of obvious: being able to distinguish who is actually at risk here, and using all this data will potentially allow us to really determine who's at risk. And then finally, the empathy bit, which, like you said, is kind of surprising. But yeah, he has this example where the AI could be listening to your discussion
00:26:22
Speaker
with a patient and it will notify you afterwards, like, hey, you interrupted your patient after three minutes. Maybe don't do that. Anyway, okay, but good. Obviously, the upsides are pretty clear there. Let's dive more into what are the concerning aspects. Maybe what is surveillance capitalism and
00:26:48
Speaker
Yeah. What are some of the sort of ethical concerns that you have?

Data Commodification and Predictive Behavior

00:26:56
Speaker
Maybe I'll begin by very shortly outlining surveillance capitalism by Shoshana Zuboff, who wrote this wonderful book in 2019. And the basic idea is that for the last two decades or so,
00:27:14
Speaker
a new business model, a new economic order, has emerged, rearranging the relation of knowledge and power. And what she means by that is that
00:27:30
Speaker
the new oil in this new industry is data. And I mean, data in terms of the data that we share when we interact with all these digital technologies with apps and so on, data that has become
00:27:55
Speaker
not only a commodity as such, but we pay with our data for services that are supposedly free. And I mean, every one of us knows this and does this. If you use Google Maps or similar things, or Google at all, you know, the search engine and so on, this is supposedly free, but of course you pay with your metadata when you use it.
00:28:20
Speaker
And so when such a tech company designs an app and sells the app, the app usually costs almost nothing.
00:28:32
Speaker
But they make their profit not by selling the app. They make their profit from the data that you voluntarily share with them. And so you more or less pay for all these services with your data. And this is what Zuboff calls behavioral surplus.
00:28:55
Speaker
And so far, so good. I mean, there's nothing unfair about this. It's simply a market exchange. I get to use your platform; in exchange, I give you whatever data you can
00:29:16
Speaker
glean from my activity. And I mean, they are transparent. They tell you that somewhere in the terms and conditions, you know, page 357 at the bottom, but at least they tell you. So everything seems fair up to this point. But the thing is that,
00:29:39
Speaker
and this is a very clever idea by Zuboff, I think, to frame it like that. The bed you just mentioned, the smart bed, is a good example of this. These companies collect
00:29:59
Speaker
huge amounts of data about you, so they know you. So services, for example, can be tailored to your preferences and your characteristics, which also sounds pretty good. But the idea behind this is not just to sell you a product that fits you perfectly, but also to predict your behavior,
00:30:22
Speaker
and it doesn't even stop at predicting your behavior, but goes on to shaping your behavior, giving you a push in the right direction.

Ethical Concerns in Healthcare Surveillance

00:30:29
Speaker
To buy, for example. We all know that from online shopping: hey, you just bought this coffee machine, what about this and that? Coffee pads, maybe you're interested in that. Or if you buy golf clubs, of course you need a bag for them; here's the bag, and so on and so forth.
00:30:49
Speaker
And so this is what surveillance capitalism really is: gathering your data, and making profit off your data, maybe by reselling it,
00:31:04
Speaker
but also by shaping your behavior through the knowledge that is obtained by collecting your data. And this is exactly where health technologies come in and where the ethical issues start. And my idea in this liquid health concept is basically taking this concept of liquid modernity from Bauman
00:31:34
Speaker
and combining it with this idea of surveillance capitalism. They both meet in this or they intersect when it comes to surveillance, the role of surveillance. And if you combine them and apply it to the health sector, then you have this, you can use this idea of liquid health as a lens to better analyze the ethical issues.
00:32:04
Speaker
So before we go into the healthcare stuff, I do have a follow-up question on just the surveillance capitalism part of it, but I'll frame it as an anecdote because why not? Immediately after I was looking at beds, of course I was using Google,
00:32:21
Speaker
And so it sort of knows that I have trouble sleeping. So I got an ad for a light that uses, I don't know, soft wavelengths or whatever to naturally wake you up and to kind of lull you to sleep, right? And it was a $200 lamp. And so I'm getting these ads and
00:32:43
Speaker
I haven't bought it, but I'm just... You will buy it, Roberto. That's exactly what I'm getting at. I'm already... I've been thinking... I thought about it twice already in this conversation, right? So I clearly... It's working. I thought you were distracted, Roberto. I saw you looking at me. Really, you're just like on Amazon. It's like the Moth meme, you know, got lamped. Yeah, yeah.
00:33:07
Speaker
So I guess the follow-up question is, do we know anything about the rate of success? I know it works, right? So do we know how often it works? It's an empirical question and it's very interesting. I cannot answer it because I don't know the numbers, but I think that it must be immensely
00:33:31
Speaker
successful if you look at how these companies fared throughout the last 20 years or so. Right. Yeah. Because basically what you were talking about earlier, the "if you like this, then you'll like that" when it comes to Amazon, that's referring to recommender system technology. That's what people call it: recommender systems. And that's basically an essential dimension of the business model for most
00:33:59
Speaker
tech companies, you know. It's really fundamental to it. I mean, Spotify, etc. But in terms of data about that, Roberto, I also think, yeah, besides what Giovanni just said, which is that presumably it's successful given how crucial it is to their business models, I think there is some data with Netflix that the recommenders on Netflix are the main conduit.
00:34:28
Speaker
In other words, I think there's data that most of the stuff that people watch on Netflix was first recommended.
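The "if you like this, then you'll like that" mechanism discussed above can be sketched as item-based collaborative filtering. This is a toy illustration only, not Netflix's or Amazon's actual system: the users, titles, and ratings below are invented, and real recommenders use vastly larger data and learned models.

```python
from math import sqrt

# Toy user-item matrix: 1 = watched and liked, 0 = not watched.
# Users and titles are invented for illustration.
ratings = {
    "ana":  {"doc_a": 1, "drama_b": 1, "scifi_c": 0},
    "ben":  {"doc_a": 1, "drama_b": 0, "scifi_c": 1},
    "cara": {"doc_a": 0, "drama_b": 1, "scifi_c": 1},
    "dan":  {"doc_a": 1, "drama_b": 1, "scifi_c": 1},
}

def item_vector(item):
    """Column of the matrix: every user's rating for one item."""
    return [ratings[u][item] for u in sorted(ratings)]

def cosine(a, b):
    """Cosine similarity between two rating vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def recommend(user):
    """Rank the user's unseen items by similarity to items they liked."""
    items = list(next(iter(ratings.values())))
    liked = [i for i in items if ratings[user][i]]
    unseen = [i for i in items if not ratings[user][i]]
    scores = {i: sum(cosine(item_vector(i), item_vector(j)) for j in liked)
              for i in unseen}
    return sorted(scores, key=scores.get, reverse=True)
```

The key design point the hosts allude to: the system never needs to know why people co-watch items; correlated behavior alone drives the suggestion.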
00:34:35
Speaker
Anyway, I think it's suggestive that it's pretty effective. I can imagine. I have also watched shows on Netflix or bought stuff on Amazon which I wouldn't necessarily have done if it hadn't been recommended to me. Although I have to say, whenever I'm on Amazon,
00:35:04
Speaker
I always think, oh, this poor, poor algorithm, because I think my user behavior is so hard to predict, because I have many weird, incompatible interests.
00:35:21
Speaker
I search for a biography of Adorno, then for a graphic novel based on H.P. Lovecraft, then for a Slayer CD, then for a football jersey from my favorite football club, and so on. I always anthropomorphize the algorithm as some poor, overworked, overweight guy sitting in front of a
00:35:51
Speaker
asking, what's he up to again? What's he up to now? I mean, come on! Right, right. Yeah, you're the last unpredictable man, Giovanni. The rest of us are... No, but... No, I think it's the other way around. I think I'm responsible for a huge part of Amazon's success throughout the last, let's say, 10 years, because through me, they really could hone their algorithms, you know,
00:36:18
Speaker
in a way that maybe they couldn't have done with all you Earthlings out there. You made Jeff Bezos his last billion. Probably. But yeah, it's interesting to return to what you're highlighting with Zuboff. It's like, the most
00:36:39
Speaker
obvious ethical concern that pops into your mind when you think about surveillance capitalism, in other words, when you think about the fact that part of these companies' business model is to track everything you do, the thing that first pops into your mind is privacy concerns. Yeah, but
00:36:58
Speaker
the kind of thing that she emphasizes is the steering of behavior. In other words, manipulation concerns, and manipulation generates moral concerns related to freedom, autonomy, and probably some other stuff. But that's what first jumps to mind: issues related to the freedom and autonomy of the person.
00:37:21
Speaker
But good. Yeah, now let's kind of dive into applying this to the healthcare setting. So what are some specific
00:37:31
Speaker
things you could envision in terms of how this could go awry, how surveillance capitalism could kind of run amok in the healthcare digital technology context? Yeah. I mean, if it were only data harms, like loss of data, or data falling into the wrong hands, data theft and stuff like that, I'd say, okay, this is a risk we have to face, because the benefits are obviously there.
00:38:00
Speaker
And the benefits totally outweigh these risks and doing something against these risks is not impossible. So this would be, I think, only a minor discussion point. But as you pointed out, the idea that you cannot only predict behavior but also shape it,
00:38:24
Speaker
and maybe control it that way. This opens, of course, the possibility of disciplining people, in a Foucauldian sense, towards a certain behavior. And this makes it interesting not only from an economic perspective, where corporations, of course, would have an interest to kind of nudge the users of their health technologies: if you use my app,
00:38:49
Speaker
Oh, we have seen that you have this and that. Maybe you could buy this lamp that Roberto mentioned, or you could buy this bed, or you could buy this and that. So this would be a pure financial interest or economic interest.
00:39:05
Speaker
But my idea was that there is another more sinister potential here, namely that also governments, for example, could use this kind of shaping your behavior, disciplining you for implementing their health agendas.
00:39:33
Speaker
Because maybe, I mean, the whole healthcare system in Europe is totally different than in the US. And here in Europe, for example, with this public healthcare system, the state or the government could have interests, already has, of course, an interest in cutting costs. And it could achieve that by forcing, for example, a certain lifestyle on people.
00:40:05
Speaker
These are two risks that you have here, the ethical risk of exploitation by corporations, and on the other hand, these health agendas by sinister governments.

Emotion Recognition and Ethics

00:40:20
Speaker
And yeah, so I think these consequences are far more serious than simply the data harms that we can somehow address even by technical means.
00:40:26
Speaker
and stuff like that.
00:40:34
Speaker
I want to drive the point home here. I'm wondering if there's any concrete example you can give. It's some particular behavior that the government wants to change. Is there a particular example you can give us? Yeah. I recently published a paper with two dear colleagues of mine from Heidelberg University, where I used to work.
00:41:04
Speaker
And we analyzed so-called emotion recognition and regulation systems.
00:41:14
Speaker
And I mean, it's already in the name, right? So these are basically systems that work with IoT technologies, Internet of Things technologies, sensors, monitoring technologies, mostly computer vision, so cameras play an important role here. And what these systems do is
00:41:36
Speaker
Imagine a camera in your home environment that films you and tries to detect any signs of mood change in your facial expressions.
00:41:53
Speaker
So whenever this system tracks a mood change and what it has learned to be a negative emotion, some facial expression that indicates a negative emotion, it counteracts this by changing the light scheme in the room, for example, by playing delightful music.
00:42:18
Speaker
In some systems, even a social assistive robot comes out and engages you in conversation or tries to distract you or something. This is already a system that is designed not only for recognizing your mood change, but also for regulating it.
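To make the mechanism concrete, here is a minimal sketch, in Python, of the kind of rule-based sense-and-respond loop such a system might run. Everything here is hypothetical: the emotion labels, the escalation ladder, and the function names are illustrative assumptions, not any real product's design, and a real system would run a trained computer-vision classifier in place of the stand-in below.

```python
# Hypothetical sketch of the loop an emotion "recognition and regulation"
# system might run. All names and thresholds are illustrative; no real
# product's API is implied.

NEGATIVE_LABELS = {"sad", "angry", "fearful"}

def classify_frame(frame):
    """Stand-in for a computer-vision model that maps a camera frame
    to an emotion label. A real system would run a trained classifier."""
    return frame.get("label", "neutral")

def choose_intervention(label, level):
    """Escalating 'regulation' responses: light, then music, then a robot."""
    if level == 0:
        return f"warm light scheme (reacting to '{label}')"
    if level == 1:
        return "play upbeat music"
    return "dispatch assistive robot for conversation"

def regulation_loop(frames):
    """Watch a stream of frames; escalate while negative affect persists.
    Note what the loop never asks: *why* the person feels that way."""
    level, log = 0, []
    for frame in frames:
        label = classify_frame(frame)
        if label in NEGATIVE_LABELS:
            log.append(choose_intervention(label, min(level, 2)))
            level += 1
        else:
            level = 0  # symptom gone, system satisfied
    return log

# Three consecutive "sad" frames trigger the full escalation ladder.
stream = [{"label": "sad"}, {"label": "sad"}, {"label": "sad"}, {"label": "neutral"}]
print(regulation_loop(stream))
```

The sketch makes Giovanni's point visible in the control flow itself: the only state the loop keeps is how long the symptom has persisted, and the only goal is to make the label go away.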
00:42:41
Speaker
It really controls your behavior and disciplines your behavior. The system doesn't ask you why you feel bad or sad or whatever; that's of no concern. A negative emotion is just bad.
00:42:59
Speaker
Because the subtext here is, if negative emotions continue, you might have to go see a therapist or whatever, which costs the system money. So we have to do something against this negative emotion. But it's just fighting the symptom. No one is interested in your mental health and doing something about the underlying condition. All they want to do is get rid of the symptom.
00:43:27
Speaker
And this is a technology that is already out there. It's mostly used for older adults or people with mental health conditions. And it's really scary.

Critique of Emotion Recognition Systems

00:43:37
Speaker
Plus, as we found out, what this system defines as an indicator of a negative emotion is totally outdated. It's like some theory of emotion that dates back to the 60s and has been tested
00:43:54
Speaker
exclusively on people from the Global North. So yeah. Right. So that work is by Paul Ekman, and it's been discredited. There's a wonderful book called How Emotions Are Made.
00:44:12
Speaker
Yeah, Lisa Feldman Barrett, where she kind of dispels it as aggressively as she can. Yeah. But yeah, lots of people are going after it. Yeah, these technologies are still based on it. Because, you know, someone maybe googled something 10 years ago, and this was the first thing that came up. Okay, let's take this book. Right. So, good. Well, I was just going to say, it's interesting to think about this case and the different kinds of worries you can have. I mean,
00:44:40
Speaker
so it seems like the one that you're emphasizing here, which is interesting to me, is basically a concern with this technology sort of embedding a superficial understanding of mental health, almost. It's based upon measurable metrics, upon what's easily quantifiable, and the thing is that the human being is... it's just,
00:45:10
Speaker
our mental health, our psychology, is deeper than that. There's something deeper going on. You can't really get a deep understanding of the human person just based on these more superficial metrics, potentially. So maybe that's kind of one concern. I mean, another thing, of course, is if we're thinking about
00:45:38
Speaker
what sort of applications are actually doing the monitoring. Maybe a more obvious concern is: okay, well, this is very private data, the data related to your emotional, mood, and psychological state. And so when you think about data breaches related to that data, that's super concerning. And the potential for manipulation based on that data becomes much more potent, because we're talking about
00:46:09
Speaker
the psyche of people. Anyway, I'm just thinking, you know, there are multiple directions you can take. I mean, there's also the error, right? As Giovanni was saying, the algorithm is looking for an emotional fingerprint: when I see tears, that equals sadness. But it could be wrong, because you might win the lottery and start crying.
00:46:33
Speaker
But obviously it's not out of sadness. It's because it's freaking awesome to win the lottery. Apparently it's actually not awesome to win the lottery, but you get what I'm saying. Yeah. I mean, it's tricky with those kinds of concerns, because someone can always respond to those kinds of things like, oh, well, that'll be figured out. The more data we get, the more we'll realize
00:46:57
Speaker
what is actually a good indication of your true mental state. Anyway, it's tricky with this stuff, because a lot of responses can just be: oh, once we get more data. Yeah. This was exactly
00:47:16
Speaker
my initial idea. What if I just feel like looking moody today? Maybe I'm not in a bad place; I just don't want to smile today. Or maybe I have a severe case of resting bitch face, I don't know.

Solutionism in Healthcare Technology

00:47:37
Speaker
And people would... I mean, a software engineer or data scientist or whoever creates these applications will tell you: yeah, yeah, well, all we need is your baseline. If we know that you have a resting bitch face, that's fine, because we take it from there, right? So this is our baseline then. But
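The engineer's "just give us your baseline" reply can be sketched too, in a purely illustrative way: score expressions relative to a personal average rather than a universal template. The function names and thresholds below are my assumptions, and, as the conversation goes on to point out, this kind of technical fix says nothing about the context in which the system is used or who benefits from it.

```python
# Hypothetical sketch of the "baseline" fix: flag a frame as negative only
# when its negativity score sits far above this person's own typical score.
# All names and thresholds are illustrative assumptions.
from statistics import mean, pstdev

def calibrate(baseline_scores):
    """Learn one person's typical score and its spread from a calibration period."""
    mu = mean(baseline_scores)
    sigma = pstdev(baseline_scores) or 1.0  # guard against zero spread
    return mu, sigma

def flag_negative(score, mu, sigma, z_threshold=2.0):
    """Flag only large deviations from the personal baseline (a z-score test)."""
    return (score - mu) / sigma > z_threshold

# Someone whose neutral face already scores high on a universal "negativity" scale:
baseline = [0.70, 0.75, 0.72, 0.68, 0.70]
mu, sigma = calibrate(baseline)

print(flag_negative(0.72, mu, sigma))  # a typical day: False, not flagged
print(flag_negative(0.95, mu, sigma))  # a large deviation: True, flagged
```

Even where such per-person calibration works, it only refines the symptom detector; it still never asks why the person feels the way they do.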
00:48:00
Speaker
I mean, there are technical issues, of course, to be fixed, but all these technical fixes don't matter, because they don't fix the context in which this technology is used; they totally ignore this context. And they don't even ask one very simple question: why would I need this?
00:48:28
Speaker
Why is this important, and who benefits from it? Do I really benefit from a system that surveils me twenty-four seven, just in case I look moody? I mean, come on. And if I have a mental health issue,
00:48:48
Speaker
I think my compliance wouldn't be boosted by a system that tells me, well, you didn't smile enough today, maybe you should see someone. I see. Okay. So you're kind of thinking there are going to be a lot of people who don't necessarily need
00:49:09
Speaker
this technology. No, but, because we're sort of... I mean, I'm thinking of a guy, I think his name is Hartmut Rosa, who kind of talks about stuff like this, where it's just kind of in the air: you've got to quantify everything, you've got to track everything about yourself, like,
00:49:26
Speaker
you know, that's just what you do. It's irresponsible if you're not tracking everything about yourself, or something. And so people are just going to think, oh, yeah, I guess I should be monitoring my mental health. And the idea is, well, actually, there are certain people who would need that kind of monitoring, for whom it's actually useful, but there are going to be a lot of people who are just going along with this movement of self-
00:49:54
Speaker
quantification or whatever. Is that kind of what you're thinking of there? I mean, there is this fantastic book by Evgeny Morozov, To Save Everything, Click Here. And he has... I don't know if he really invented this, it's always hard to track, but he talks about this concept of solutionism,
00:50:20
Speaker
which means: we have a technical solution, now let's find a problem for it. And this is how a lot of this works. I mean, we all know this concept of technology push: a lot of money is invested in developing certain technologies, then they are out there, and then we have to apply them and implement them to make this economically
00:50:48
Speaker
sound. And there is already a tendency in healthcare and medicine where we see that this kind of technology push leads us to use technologies in contexts where they are not really that
00:51:07
Speaker
fitting, or where they don't really make sense, where other alternatives would be way better. But since we have a technical solution, the narrative is that more technology is

Power Dynamics in Digital Healthcare

00:51:19
Speaker
always good, always more accurate, always more efficient, and stuff like that. But this is more a mythos, you know, or a myth. This is more a narrative behind it, and it's not necessarily true.
00:51:34
Speaker
This is exactly the case here. It's a case of solutionism. We have a very complex problem. Let's find a very easy technical fix for it.
00:51:48
Speaker
If you look at history over the last, let's say, 200 years, you will find that whenever people tried to come up with a very simple technical solution for a very complex social problem, it didn't turn out that great. You know, another kind of worry that occurred to me is that, at a simple level, what we're talking about is
00:52:18
Speaker
the medical context gaining tons more power, in a way, right? In the sense that now not only is your therapist potentially going to give you some recommendations at the end of the session, but they have more power in the sense that now you're going to be getting notifications asking you: hey, are you doing the thing your therapist told you to do? Right. And so,
00:52:47
Speaker
when you assume that the therapist is good, that extra power is sort of appealing; you can see the upside. But the thing with a lot of power is that if the agent wielding it is actually not smart, or maybe has bad intentions, then the potential for
00:53:16
Speaker
bad stuff happening rises. And so the point being: in some sense, what we're saying with mHealth is just that the medical field will gain so much more power over people's lives. So if they make an error, if they think, oh yeah, actually this is great for people, and they're wrong about that, well, in this new time of mHealth that error will cause a lot more damage.
00:53:44
Speaker
Is that... do you guys see what I'm saying here? Does that make sense? And I think this pretty much aligns with what Bauman tries to tell us with his idea of liquid modernity. And this is why I think this lens fits so perfectly for healthcare, because on
00:54:09
Speaker
the face of it, liquid modernity is actually a pretty good development, because it kind of dissolves these power structures, these very rigid power systems and their asymmetries that have been around for centuries. They are suddenly
00:54:31
Speaker
They are suddenly destroyed. And this is pretty much Marxist, by the way. This is exactly what Marx and Engels say in the Manifesto: that capitalism destroys all these archaic feudal systems and so on and so forth.
00:55:05
Speaker
I mean, as soon as you break these rigid systems of power and these asymmetries, and as soon as you also kind of transcend the traditional roles and relationships,
00:55:26
Speaker
this also poses a danger. And the danger is that now it's unclear who has the power, who has what power, what my role is, what my rights are, what my position is here. And this makes it more unpredictable when you're trying to find out, okay, obviously power asymmetries don't just disappear, they just shift. But who is in power right now?
00:55:53
Speaker
And this power... I think Bauman even uses this phrase: power doesn't really have a face anymore.
00:56:02
Speaker
In a traditional patient-doctor relationship, for example, in a very paternalistic model, it's totally clear. The doctor has all the authority, has all the power, and you know what that means as a patient. You know your role, you know your relationship, and you know who has the power. You have literally a face for it. It's a person who wields this power. But with these technologies,
00:56:32
Speaker
You don't know who really has this power, who exercises it, you know, because all you see at best is some app or some computer screen or just a camera or you see nothing at all because you're not aware of all these sensors around you on your body or whatever.
00:56:49
Speaker
But there's a machinery behind it, there's a corporation behind it, a faceless corporation somewhere, and you have no idea what their obligations are, what their power really is, what your place is in all that, and so on. And this is the other side of liquefying. And coming back to Topol, for example: Topol also talks about the patient-doctor relationship.
00:57:13
Speaker
And he thinks it's a good thing that we use all these health technologies, especially mHealth, because this empowers patients and kind of mitigates these power asymmetries. Again, yes, they could do that, but ignoring the real architecture of power behind these technologies, well, it's a bit naive again.
00:57:42
Speaker
It introduces a new agent of power, one for which we don't have any compass: those who produce and control these technologies. I think, as we move to wrap up here, is there
00:58:03
Speaker
any message you want to send as to how to not only raise awareness, but also, are there any practical actions that

Potential and Risks of Digital Health

00:58:12
Speaker
we should take? I mean, should we all not use mHealth? Or any thoughts on that?
00:58:17
Speaker
No, definitely not. This is not the message that I want to send on the contrary. I think that these digital health technologies are really powerful tools and we could achieve a lot of good with them in terms of health benefits, but also in terms of social benefits. Just think about the immense potential of some of these technologies
00:58:41
Speaker
in terms of applying them to underserved populations, for example, or in regions that are structurally disadvantaged, healthcare systems that are underfunded, and so on. So this could have an immense benefit, even in a social respect. But I think if we frame
00:59:09
Speaker
What these technologies do to health and health care as liquid health, the basic idea behind this is that these technologies reshape our concepts of health and illness. They change the scope of the medical domain, and they liquefy roles and relationships that surround health and health care.
00:59:38
Speaker
They reshape the concept of health and illness because, when you use these mHealth technologies and so on to safeguard your own health, you are never really healthy. You're just not ill yet, right?
00:59:59
Speaker
It's always about risks. Like the old joke that doctors used to tell: a patient who is healthy is simply not diagnosed enough. And it's the same thing today. Normally, we don't think about health at all; we only think about health as soon as we're ill.
01:00:28
Speaker
So in a way, health is just the absence of illness, which I think most people would subscribe to. But these technologies change this, because you are always on alert, you're always at risk, when you fixate on weighing and monitoring yourself, right? So this reshapes our understanding of health and illness. At the same time,
01:00:58
Speaker
at the same time, in a weird way, it also changes the scope of the medical domain, because people use more of these technologies themselves, which means, as Topol, for example, would say, that they become empowered
01:01:15
Speaker
through self-management and self-monitoring. But this also means that the clinical gaze isn't bound to the walls of the clinic anymore. It penetrates the private realm as well. So the clinical gaze is always present, even in your daily life. It's a kind of medicalization of your daily life. And the third area
01:01:41
Speaker
where I think this is important is, as I said, the relationship between patients and doctors. Because now their roles are not really defined anymore, and a new player is present in this relationship and changes the power dynamics here.
01:02:01
Speaker
What I'm trying to say is, if we are aware of this, if we use liquid health as a lens to better understand these processes, we can achieve exactly this deep medicine and deep empathy by pointing out the risks that may undermine the use of technology in such a beneficial way.
01:02:23
Speaker
So my idea of liquid health is not that we should all turn Amish and shun all this fancy technology. Not at all. On the contrary, this can really bring about a huge change in medicine and healthcare and make it better, make it more accurate, make it more personalized and more actionable.
01:02:49
Speaker
But this, as I said before, is not something that will just magically happen just because we have this technology and use it. We have to be aware of the risks. And this is what the lens of liquid health is for, to better understand them. And then use that knowledge to
01:03:13
Speaker
design technologies and implement them in a way so that we can achieve these beneficial goals.

Mindful Use of Digital Health Technology

01:03:19
Speaker
So that's the whole idea, or the point I tried to make in this paper. Awesome. Well, thanks so much, Giovanni, for coming on. I highly recommend your article, Liquid Health, to any listeners. We could only really touch on some of the highlights of it. So yeah, thanks so much for coming on. Fascinating piece.
01:03:38
Speaker
And people who liked this paper also liked my recent book, Ethics of Medical AI. So you should check that out too. Absolutely. All right. Thanks again, Giovanni. Appreciate it.
01:04:07
Speaker
Thanks, everyone, for tuning into the AI and Technology Ethics podcast. If you found the content interesting or important, please share it with your social networks. It would help us out a lot. The music you're listening to is by The Missing Shade of Blue, which is basically just me. We'll be back next month with a fresh new episode. Until then, be safe, my friends.