Manipulation Tactics in Technology
00:00:00
Speaker
Similarly, with misinformation, radicalization, it all comes down to manipulation tactics. And so I realized that the manipulation game is going to be extremely prevalent, and we need to not only start building resilience in terms of adapting to these technologies, but to almost build a shield, which is why it's called MindShield. It's literally a shield for your mind, to be able to recognize, resist, and respond to manipulation tactics.
00:00:29
Speaker
And hence, get your power back.
Introduction to Shimona Banerjee and MindShield
00:00:39
Speaker
This is Bare Knuckles and Brass Tacks, the tech podcast about humans. I'm George K. And I'm George A. And today our guest is Shimona Banerjee, who is the founder of MindShield.
00:00:53
Speaker
So she's a psychology PhD candidate, and MindShield is designed to help people protect what she calls their cognitive security. So this is a really interesting and very wide-ranging discussion.
00:01:06
Speaker
And of course, having a psych doctoral candidate on the show means that she also ended up asking us questions too. Yeah, it was really cool. I was thinking about this halfway through the episode, because we got into some really good philosophical-type conversations. We got into how to think, how to rewire your thinking. I was like, fuck, I should have taken mushrooms for this episode. This is like deep as hell. What is truth? We got into that. By the way, we solved it in 40 minutes.
00:01:37
Speaker
Yeah, so I guess this is a psychedelic-friendly show. But she was wonderful. And we met her actually at an event, our BKBT event in Toronto. Mind Over Cyber was there, and she presented on the same stage as us. There was just too much symbiosis there to not bring her on the show. Eventually, she might probably become a regular guest, because I feel like we have our own therapy sessions lined up with her. Yes.
00:02:03
Speaker
I think this is a great first episode with her. So I'm very happy to introduce you to our audience. Yeah, there's a lot in here for individuals. And if you are a parent, I would take notes too. Shimona Banerjee, welcome to the show. Thank you, K and A, for having me on.
00:02:19
Speaker
I know, I met you guys earlier this year. Exciting.
Shimona's Background and Mission with MindShield
00:02:22
Speaker
Lovely to be known by our initials. And we will start in the most logical place, which is: you have started this institute and advocacy group called MindShield. So why don't you take a quick detour and give us a little bit of your background and a little bit about the mission of MindShield, and we will take it from there.
00:02:45
Speaker
So my background is a bit of a rollercoaster. It's definitely not traditional. I was born in India, left when I was 18 to be a dancer. And I've always been an artist at heart. I did see my future being in the performing arts.
00:03:01
Speaker
And while I was in college, due to a physical injury, I couldn't pursue that anymore. So I switched to film. And the world of filmmaking got me into documentary filmmaking, which became my business for the next three years after I graduated. During my experience with documentary filmmaking,
00:03:20
Speaker
as I was doing a few politically sensitive films, I realized that people were afraid to come on camera to share their stories, especially if their families were back home in war zones and in jeopardy.
00:03:34
Speaker
So that was my first time really looking into this world of AI. Google had just come out with a very early form of deepfake technology back then, and I got really fascinated by the use of it. I didn't know it as deepfake. I just thought it was a beautiful technology that would revolutionize filmmaking.
00:03:52
Speaker
And so I started looking into that space. Unfortunately, back then, the technology was very expensive for a college student, inaccessible. And so I could not imbibe that into my project. But a company building that deepfake technology became my client.
00:04:11
Speaker
And that started this entire rabbit hole of trying to learn about how fast tech is moving and the implications of that, not just on work, but on society as a whole, on some fundamental fabrics of what makes us human.
00:04:26
Speaker
And simultaneously, I was also studying psychology and neuroscience. So I got the opportunity. There was an amazing program here in Canada called Strategic Foresight and Innovation.
00:04:39
Speaker
It was a master's allowing you to essentially take two industries that you were fascinated by and innovate in
Technology's Rapid Advancement and Societal Impact
00:04:44
Speaker
them. And so for me, it became tech, like cyberspace and AI. I was also looking at virtual and physical reality, and psychology.
00:04:52
Speaker
So that's where the whole journey first began. And upon studying that, I realized three things. A, technology is moving faster than humans can keep up with, because it's on the exponential curve and humans are known to be linear learners. Yes.
00:05:08
Speaker
And so we're going to reach a point of what people call singularity, where it has exceeded our capacity to control it. Number two, that with it is going to come a lot of acceleration of the problems that we already see in society, especially when it comes to human connection, well-being, critical thinking, and many other areas, even environmental damage.
00:05:34
Speaker
But it's also going to bring a really interesting opportunity to respond and find solutions like we never have before, because of the intelligence of AI. I don't know where that's going to go, obviously, but hopefully it tilts more toward the beneficial, optimistic side.
00:05:50
Speaker
And then the third, that this gap can be closed. And that, you know, there's this fear, this anxiety, because there's so much fearmongering in society about technology taking over humans. But this is an argument that has happened for years, when the first train came out, or the first television, and even when the computer was created.
00:06:11
Speaker
And so I think that there is an opportunity to really close that gap. So that's a bit of my background, in the midst of these three things. And I'm currently pursuing my PhD in cyber psychology, because this is my heart, my passion. I'm going to be doing this till I'm old.
00:06:28
Speaker
And MindShield was created because of something very specific that I noticed was going on, which is that in terms of the risks of exponential technologies, be it AI, be it neurotechnologies like Neuralink and other BCIs that will be out soon,
00:06:47
Speaker
or extended reality, virtual reality, that whole metaverse space, there is a fundamental ability to manipulate human beings. And we are manipulated either into making decisions we don't like, or into conforming to a society that, you know, people in power would essentially benefit from,
00:07:10
Speaker
or in cyber risk and cybersecurity. Because as we see in cybersecurity, most of the problems arise from human factors and social engineering, which is all just manipulation. Similarly, with misinformation, radicalization, it all comes down to manipulation tactics. And so I realized that the manipulation game is going to be extremely prevalent, and we need to not only start building resilience in terms of adapting to these technologies, but to almost build a shield, which is why it's called MindShield. It's literally a shield for your mind, to be able to recognize, resist, and respond to manipulation tactics, and hence get your power back to actually, like, actualize in your life. So that became the mission of MindShield.
00:08:01
Speaker
That's a long, convoluted answer, but it's important to give the little
Human Element and Accessibility in Technology
00:08:05
Speaker
steps, right? Because a lot of people don't like getting into the tech world, because they think that they're not techies.
00:08:13
Speaker
And I think one of my other biggest messages is that technology is human. And I think that's a realization that's going to definitely amplify. And so, people from all backgrounds: if you understand the body, if you understand the mind, if you understand the spiritual nature, and you understand technology, you can work in technology.
00:08:30
Speaker
Absolutely. And I would also tell you, after hundreds of hours of interviews, George and I have rarely encountered someone with a linear or traditional background. So I do appreciate that level of detail.
Decline of Critical Thinking and Truth in Society
00:08:48
Speaker
Yeah, and I think kind of the important thing to take away from the background that you set up, and if we can go back to how we build that shield of resilience, and you know, you and I talked about this a little bit before the show,
00:09:00
Speaker
really, I'm a big believer in the importance of critical thinking and taking the time to read. And I think part of the problem statement that MindShield is addressing really comes down to what society has become.
00:09:16
Speaker
You know, it's been slowly creeping in since the Reagan years onward, but then from 2014 on, it really turned the corner and accelerated, where we had a lack of fundamental belief in what truth is. So when we start as a society losing our common definition of truth, when the sky is no longer blue for everyone, we can't really have a good foundation for a conversation on anything, let alone contentious issues.
00:09:38
Speaker
So I think part of the problem is before we even think about technology, because I think technology is a parallel and it's consequential, but it's not the root cause of the issue. The root cause of the issue, I think, is a lack of critical thinking in human beings, and a lack of
00:09:56
Speaker
the desire and the ability to manually assess and come up with our own conclusions based on information. Because, you know, I know a lot of people, and there's a lot of research coming out about it now, that the human mind is getting stunted because of over-reliance on prompt-based results. People aren't reading things and coming to their own conclusions. They're just looking to a prompt to give them the answers quickly.
00:10:19
Speaker
And I think in academia, we're seeing a lot of problems now. How, in your opinion, based on your life experience and your work, do we return the conversation, in how we raise our children and how we continue to raise ourselves? Because self-growth is kind of a thing that has to happen.
00:10:35
Speaker
How do we get back to a place of reading and acquiring knowledge manually, and building kind of the fundamental tools necessary to be able to think critically? Because I think a big part of having that shield of resistance is being able to discern fact from fiction as best as you can as you're receiving that information.
00:10:56
Speaker
How do we do that? Isn't that the billion-dollar question, eh? Okay, so I'm going to break your question into three parts. We'll start with the concept of truth, right?
00:11:11
Speaker
And then the stunting of the brain, which you mentioned, and how technology is fundamentally changing not just our way of thinking but, neurologically and physiologically, the shape of our brain.
00:11:24
Speaker
And then, what do we do about it? How do we maintain a sense of common ground and flourishing in this world, and prepare our kids and the next generation for it?
00:11:37
Speaker
When it comes to truth, I've had an interesting journey with it, because, you know, in this entire game of misinformation, disinformation, malinformation, there is an underlying presumption that there is information that is true and information that is false.
00:11:59
Speaker
If you took 10 extremely critically thinking people and asked them to decipher a certain problem and analyze why that problem is a problem,
00:12:12
Speaker
I could guarantee you that they would come up with 10 different answers. Because the nature of truth is that it is an accumulation of how someone thinks, the experiences someone has had, the culture they come from, what suits them, what benefits them, the biases that they still have.
00:12:31
Speaker
And so truth is not singular. I'm sure there is a singular truth somewhere, but in terms of how humans function, the brain is fragmented. And so everyone only sees a small portion of truth.
00:12:45
Speaker
And so when someone says my truth is greater than your truth, or the other side is wrong, or misinformation, or fiction, it's very hard to prove that.
00:12:56
Speaker
Now, there are some things that are easy to prove. Like, I can say this pen is red and you say, no, it's green. Actually, I take that back. Even that's debatable, because, you know, maybe the perception of red is actually green. Exactly. Right. But that is your truth.
00:13:11
Speaker
So I think the problem actually comes down to the notion that there is one truth. First, we need to break that paradigm, because when people think that there is a truth, everyone else becomes the opposition.
00:13:24
Speaker
If everyone was just taught or recognized that truth is fragmented and you only have a tiny piece of it, it would create an environment for more discussion, for more conversation, for more truth seeking.
00:13:40
Speaker
to build on your picture. And I think that's what we really need, because the fundamental problem of mis-, mal-, or disinformation, or lack of truth, or suspicion, is that it causes disharmonization of society.
00:13:54
Speaker
Society breaks up, violence occurs, you know, people are not becoming roommates with the opposite side anymore, all kinds of things. It leads to fragmentation. So in order to actually bring people together, I think creating spaces where truth is subjective and all about adding more to the picture will start moving us towards connection.
00:14:15
Speaker
So that's my take on truth. In terms of the brain stunting, you know, this is an evolutionary law.
Impact of Technology on Brain Development
00:14:24
Speaker
What is not used atrophies. Nature gets rid of things that it deems not needed by a species anymore.
00:14:31
Speaker
We know that in the last few decades, our capacity for memory has severely decreased, because now we offload a lot of memorizing to our technologies.
00:14:46
Speaker
And we know similarly with things like critical thinking. And not just critical thinking, it's also emotional regulation, like the fight-or-flight response, being able to control that.
00:14:58
Speaker
It all happens because of this beautiful little bean in our head called the amygdala. And studies have shown, in fact, it was a Harvard study that showed, that the size of the amygdala can be contracted or expanded.
00:15:13
Speaker
It's not fixed. You can actually shift it, depending on how contracted or expanded it is. If it's contracted, you're very impulsive and you have a hard time neurochemically balancing yourself.
00:15:27
Speaker
But if it's expanded, you have a lot more control. So then the entire question becomes: okay, just because someone has grown up in an environment where critical thinking and these skills are not being taught, we can still put in practices, certain kinds of training, certain kinds of schooling systems. Don't get me started on schooling systems, but that needs to change. It needs to change, but we can put things in that actually expand the amygdala and allow for an increase of gray matter in the brain through practices,
00:15:58
Speaker
funnily enough, like meditation. And then, added to that, create a kind of conversational design that allows for truth seeking and questioning and being okay with being wrong.
00:16:11
Speaker
And I think if we were just able to create those three things, we would already be good steps ahead of the game.
00:16:22
Speaker
Hey listeners, we hope you're enjoying the start of season four with our new angle of attack looking outside just cyber to technology's broader human impacts. If there's a burning topic you think we should address, let us know.
00:16:35
Speaker
Is the AI hype really a bubble about to burst? What's with romance scams? Or maybe you're thinking about the impact on your kids or have questions about what the future job market looks like for them.
00:16:46
Speaker
Let us know what you'd like us to cover. Email us at contact@bareknucklespod.com. And now, back to the interview.
Algorithm-driven Media and Societal Perception
00:16:57
Speaker
Well, I think to your point about the objectivity of truth, which, I mean, we don't have enough time in this podcast for an epistemological debate. No, no, no. The idea is that at least you would be curious enough, or, you know, be able to grant the grace for someone to present a counterargument, and you can have a conversation without feeling threatened, like it's some attack on part of your identity. Well, something interesting that I heard recently from a journalist who covered the COVID pandemic very deeply, at a data level, was about
00:17:32
Speaker
places that seem entirely divergent politically, with a lot of noise on social media, especially here in the U.S., between, say, Texas and California, or Texas and New York, whatever.
00:17:46
Speaker
So if you were just chronicling the pandemic online, you would think these were wildly divergent places that had wildly divergent approaches. But when you look at the policy level, what actually happened in quote-unquote reality, as in the place we live, the lived experience was, you know, school closures at the same time, reopening at roughly the same time, with different nuances to it. And so there is this divergence between what we perceive, because we're just taking in a whole bunch of news and media in whatever bubble we're living in, versus, you know, if we stood back and understood what the lived experience was. And
00:18:28
Speaker
I do not take for granted that all of the algorithmic media that we live in now, like, we're not all reading the New York Times or even the Washington Times or the Washington Post, we're sort of all in our own algorithmic feeds that tell us what we want to hear,
00:18:43
Speaker
is creating a lot of that tension and a lot of that stress. And they feed off it, right? Enragement is engagement. And so the fight, flight, or freeze response is very high. So what are some of your recommendations, I guess, or what is MindShield's work related to helping people kind of lower the temperature enough, with techniques or research or whatever, to, I guess, survive some of that, and kind of see through it, or get off of it, or just cope with it?
Navigating Information During COVID-19
00:19:18
Speaker
So before I answer that question, I'd like to ask you: when the entire COVID era began and concluded, how did the both of you decipher, from all the information that you were getting, which perspective you wanted to believe? That's a fine question.
00:19:49
Speaker
I was dealing with children in school, right? So a lot of that information was also just through municipal resources, right? Like, these are the policies that we are enacting to do X, Y, and Z. I happened to have been following the CDC at the time because I was also traveling. Like, March 2020, I was in two hotspots at once. I was in San Francisco,
00:20:13
Speaker
like for RSA, when multiple vendors had pulled out and people weren't really sure if it was really a thing. And then, like a week later, I was in London, walking through London Heathrow at 10:30 in the morning, and there's nobody there.
00:20:26
Speaker
It looked like 28 Days Later. So for me, it felt like the emergence of something very real. I also have a passing interest in epidemiology. I've read a lot about it in the past and stuff like that. So I was looking at the modeling, and I understand how
00:20:44
Speaker
respiratory diseases transfer. And so I will say I was looking primarily at scientific resources. I tried not to look at a lot of the news, because my understanding was that journalists were going to be metabolizing scientific data at the same rate that I was, and I was like, I don't know if they really understand it. Even in the early days, when Fauci came out here in the U.S. and said masking isn't really necessary, I was like, it's a respiratory disease, how can it not be? The math didn't math and the science didn't science. So I took things with a grain of salt. This is new, it is a novel coronavirus, meaning no one knows anything about it. And so I just tried to read as many sources as possible. I guess that's a very long-winded answer, but that was kind of my approach to it.
00:21:31
Speaker
You know, it's a little bit unfair compared to the rest of the population, because I served as an army signaller for a long time, like, operationally. You have to deal with multiple sources, multiple assets that are feeding you completely different narratives, and you have to be able to determine, based on probabilities, what the actual reality is, so that you can make strategic or tactical decisions that often have lives on the line.
00:21:53
Speaker
So my, we'll call it shit shield, for fake news and fabrication is a little bit more refined. And I always break down what is an emotional impact versus what is reality.
00:22:07
Speaker
And so I found that a lot of people were letting emotions and sensationalization drive what they were saying, what they were thinking, whereas my thinking was a lot more calm. Like, I was dealing with a situation where I was living with an ex-partner. We had a one-bedroom condo, which, when life was good, was awesome, because we lived in the heart of Little Italy in Ottawa. All the good things, all the good life. We were barely ever home.
00:22:31
Speaker
Then suddenly lockdown hit overnight and we're stuck on top of each other. She was a personal trainer, so she was furloughed, and I was working extra time at my job, because that was my first CISO job.
00:22:43
Speaker
So I had to deal with a lot of distraction at home, trying to keep a calm situation where two people are now just rammed on top of each other, and we have a couch and a bedroom to deal with, versus, you know, entire societies having to deal with their kids and education and all that other stuff.
00:23:01
Speaker
So I think, for me, I was a bit fortunate, because that's where I was at in my personal life. And then I had the capacity to be able to look at what is really happening out there. What is the federal government saying? What is the provincial government saying?
00:23:15
Speaker
What is the science dictating that we have to do from a vaccine standpoint? And what do we have to do from a cautionary-behavior standpoint? And you can kind of figure out, hey, yes, this is an airborne virus.
00:23:28
Speaker
But at the same time, if I'm outside and it's sunny, I'm probably going to be safe. So yeah, classic probabilistic risk. Yeah, like your lived experience versus what's being sold. And you have to remember, too, when
00:23:41
Speaker
you're in a pandemic like this and they're coming out with a medical-science-type announcement, they're basing it on the science that they had at that time. That doesn't mean it's 100 percent right. So you have to be willing to change with it, because it changes. And I think, you know, that's having the nuance to be intellectually flexible and adaptive. You can't just go, well, I heard this last week and that's what it is. And a month later, well, that's what I heard a month ago.
00:24:06
Speaker
You can't. You have to essentially shift into being information receptors and try to process that as best as you can. But again, I have that level of training. And I think a lot of people, a lot of the time, don't have the time for that kind of critical thinking. And they also don't understand the methods of intaking information and
00:24:26
Speaker
prioritizing: well, that source of truth is a little bit better than this one, and I'm finding multiple sources of correlation. And if it's just one source saying one thing, I probably shouldn't be going around excommunicating members of my family over it.
00:24:38
Speaker
Like stuff like that.
00:24:43
Speaker
That's really fascinating, guys, because what I see as a pattern between how both of you have just described your responses kind of follows this thing we call TAR.
Managing Emotional Responses to Information
00:24:55
Speaker
It's just an acronym we made up, which is essentially trigger, analysis, response. Right? So in both your stories: K, when you were in San Fran, and A, where were you when COVID hit?
00:25:07
Speaker
You were in Ottawa? Yeah, that was in Ottawa. Yeah. So there was a trigger, and that trigger must have been suddenly seeing all the news, you know, people talking about it, the airport being empty. And for you, A, overnight, a lockdown happening.
00:25:26
Speaker
So both of you first took in that information, and from a trigger standpoint, you could see that information and not get way too emotionally affected by it.
00:25:40
Speaker
So I think that is the first part of the triad. Take in information and look at it. And if you find that there is a very heightened emotional trigger associated with it, there's already a red flag.
00:25:55
Speaker
Because we know one of the most common manipulation tactics, and this comes down to just learning what the manipulation tactics are, because actually there are only a handful, really, it's just combinations of those, is emotionally inciting language.
00:26:08
Speaker
Right? So you saw that, and you were like, okay, I'm going to take this information and I'm going to do my own analysis. So, okay, you went in and you read the data around it.
00:26:19
Speaker
A, you already have that aspect built into you because of your military background, and so that's how you went with it too. And what's interesting, this comes to our second part: teaching how to read data and understand data is a very important skill set that is not actually taught across the board unless you are in specific job roles.
00:26:42
Speaker
So I think there's a huge need to teach people how to analyze data, and to teach cognitive biases, because how is data even presented?
00:26:53
Speaker
There's a lot of data out there. There's bullshit. You can find data for any stance on any topic, right? But what exactly are they collecting? What do those numbers actually represent? What were the circumstances of the research methodologies?
00:27:08
Speaker
If you don't have a base understanding of that, it's very easy to manipulate people through data. And draw your own conclusions, because that's the critical thinking part. You know, people will see the same data and come up with two different kinds of conclusions based on what they are looking for.
00:27:22
Speaker
So that was the second part of TAR. So: trigger, emotional regulation upon seeing something that is meant to incite you. B, analyzing the data, and analyzing it with bias in mind.
00:27:36
Speaker
And then R, deciding between three ways to respond: either you resist it, or you resign from the problem, or you respond and react to the problem.
00:27:48
Speaker
And there is a way, a process, to determine which is the more beneficial way to respond. Right?
00:27:58
Speaker
In a relationship, for example, it's a similar thing. You have a trigger. You have a fight with someone. The first thing: for most people, the fight goes on escalating because just that first T hasn't been really nailed down, right? The emotional overload, just boom, boom, boom. Then you have said things that you didn't mean, and they have said things they didn't mean, and it's chaos.
00:28:20
Speaker
So if you could do that, and then analyze: okay, what are we exactly fighting about? What are the facts? Can you give me examples? Right? Okay, let's decipher if these are right, or if you had a different perception and I have a different perception.
00:28:32
Speaker
And then: okay, should we resign? Should we take some time away? Should we fight it out and figure it out? And usually at that point, the problem is solved.
00:28:44
Speaker
So in short, how MindShield is looking at this is to take people through a process, be it through games, be it through assessments, be it through training, all kinds of things and content, but take them through this process of being able to nail down TAR.
00:29:03
Speaker
In another life, Shimona, where we met, I was acting in my capacity at Mind Over Cyber, right? The reason we focus on mindfulness is to lower the temperature of that emotional response, or even to train the ability to discover that first moment, like, why am I reacting to this?
00:29:22
Speaker
I've told George, one of the most liberating passages in all of Marcus Aurelius's Meditations is, I do not have to have an opinion about that. Right? Because I'll read something, or my brother will send me something, or someone will send me some rage-bait headline, and I can feel
00:29:41
Speaker
that response. And I'm like, wait, I don't have to care about this. Like, I do not have to engage with this material. And I guess I often choose the resignation. I just walk away from it. That does not need any of my attention.
00:29:57
Speaker
Yeah. Yeah. And you know, you pointed out something really cool, K. When you have already developed that sensitivity to know that, okay, this is emotionally inciting content,
00:30:09
Speaker
you don't even need to really emotionally regulate. As soon as that information hits you, your body's physiological response isn't, oh my God, it's an eye roll. Like, ugh. Yes, I do roll my eyes maybe more than anything else.
00:30:23
Speaker
Yeah, exactly. And that's the beauty, you know, because people think that cognitive training means you have to overanalyze, that you have to be more conscious about what you see.
00:30:36
Speaker
But actually, you only have to be conscious for a little while, till you are retraining your brain, rewiring your system. And then after that, it becomes subconscious. So you will have a physiological response when you come across something that is manipulative, and automatically your response time will become much faster, in terms of just not responding to it, or countering it in the way that you need to. So yeah, I love that you said you had a physiological response to it.
00:31:04
Speaker
Imagine if people just had that at a wide scale, right? With all the news out there, social media posts, advertising content, cyberbullying tactics. If people just had that shield, where they could just feel that physiologically and be turned off by it.
00:31:23
Speaker
I actually think that's the solution for most of the problems in cyber risk. Let's workshop this for one sec. So let's say we want to help people develop this. And just as an example, obviously I hope people reach out to you and to MindShield for more actual servicing and everything, but just as an example: same concept in fighting, right? It's never the shot that you see coming that actually knocks you out. It's usually the shot that you don't see, that eclipses you. And that's what actually kicks in that knockout reflex and puts you out.
00:31:51
Speaker
I think the same thing happens with misinformation, in terms of, we'll say, an overly negative impact, where something hits you, you read it, you come across it, you hear it, and you get overly offended, right?
00:32:05
Speaker
How do we train people, or how should people begin to train themselves? If you can give them, you know, two to three steps that they can start working on, whether it's a Socratic method, like questions they should ask themselves when they see questionable information, or how they recognize questionable information. How can people in their own lives, in their own homes, who are listening to this,
00:32:25
Speaker
begin training their minds to, one, be ready to intake surprising information; two, be able to at least draw a quick assessment of how seriously should I take this?
00:32:40
Speaker
And three, to protect their emotional well-being as they go through life dealing with the countless headlines that come out any given day, everything from you're going to lose your health care, to we're going to war with a new country, to, you know, whatever terrible thing happens to come up in that news cycle.
00:32:59
Speaker
So just some simple tips that people could start doing at home to begin building the foundation of this resilience.
00:33:09
Speaker
So funnily enough, we are actually building an AI agent, where we thought of the simplest thing, because I used to do this: when you see content that feels too triggering and you're, you know, intaking it,
00:33:23
Speaker
we have a prompt, essentially, where you just screenshot that and upload it, and ask it not whether it's true or not, but what are the communication tactics being used in how this has been written?
00:33:36
Speaker
This is a very basic thing, but it builds awareness, because again, there are only specific kinds of wordings, or ways things are phrased, that are made to elicit a response.
00:33:48
Speaker
But coming back down to what someone can do personally. And, you know, this is a great question, and a very needed one, because especially in the last two years, with AI, with everything that's happening politically, and the wars, people are so bombarded that it can cause paralysis and desensitization and numbing.
00:34:08
Speaker
What I'd say is, one, realize that in how news and algorithms function, they show you what they know will captivate people most, right? That's how they make their revenue, on pretty much any platform online, if you don't pay for it.
00:34:29
Speaker
And the one thing that captures human attention more than anything is fear and negativity. Because as humans, we are biologically wired to focus more on a threat than on something that's not a threat.
00:34:44
Speaker
When we were cavemen, we had to be constantly scanning for a tiger or, you know, something chasing us. So we are naturally wired for negativity.
00:34:57
Speaker
That's what hooks, and that's what's propagated. Anything that you see that is feeding that, just know that it is very intentional. And I get it, you know, given everything that I'm telling you guys,
00:35:11
Speaker
I feel that too. Like, when the first news around what was going on with Israel and Palestine came up, I was very emotionally affected, you know, because I was thinking about the kids, and thinking about how they're going to grow up, and all kinds of things. But there comes a point where you decide that, okay, you can do one of two things. Either you have a channel to act, and in some way you're benefiting the situation or doing something about it, no matter what you believe,
00:35:41
Speaker
or, if you have no control over it, then switch off for a while. Yeah. And it's really hard for people to switch off. Well, yeah. I mean, there's a lot of research there about how a lot of the algorithmic influence is the feeling of needing to keep up,
00:36:01
Speaker
you know, and like, if you're not, you're somehow missing out on something, you're not part of the conversation. It's playing on that. But I do love your point about really refocusing our attention on the algorithmic basis.
00:36:18
Speaker
Whenever, you know, my son might have a question or wants to see something, we'll go look it up on YouTube. And I'm trying to teach him, this is a tool. You had a question, there's an answer.
00:36:29
Speaker
And then there's all this shit in the side panel that they want you to watch. And he's like, ooh, what's that? I was like, we have answered our question, we're closing the window. He's very young, so I just try to explain, like, this is the reason why you're seeing those things.
00:36:45
Speaker
And I'll open up a new profile and do the same search on YouTube, to show him that under a different profile I have for a different corporate entity, whatever those cookies are, it has a completely different reality. And I was like, see, they just want to keep you here, is what I'm trying to teach him.
00:37:03
Speaker
But you're right. It is. It's definitely a shield. And I think the lesson is also that it's not like you erect the shield and then you're one and done. It requires near-constant vigilance, right? It's a skill to maintain over time.
00:37:17
Speaker
But, you know, I would say that even that is temporary, because it does not require constant vigilance. Once it has been actually imbibed into the subconscious, transferred from working short-term memory to long-term memory,
00:37:30
Speaker
similar to what you said, K, when you first saw this news, you kind of didn't read the news, and you just decided to do your own analysis.
Curating Personal Algorithms for Positive Content
00:37:38
Speaker
It becomes a default way of being.
00:37:41
Speaker
And so I think that's what would take the power away from, you know, what we are fighting, essentially. And the third thing is, I think, more philosophical, and you know, it's very hard for people to agree on something philosophical at the same point. But I would say that we live in a world that is actually in constant balance, of threat and violence and chaos, as well as a lot of beauty and a lot of innovation.
00:38:15
Speaker
Yes, a lot of good people doing good things. Just because your algorithm feeds off of, and earns from, you watching negative things does not mean the opposite does not exist.
00:38:27
Speaker
And you can actually curate your algorithm, because, with algorithms, I would say it's a mix of these: sometimes it's a very maliciously, intentionally driven algorithm, like when you have information operations, when you have mis- and disinformation campaigns, it's pushed.
00:38:45
Speaker
But for the everyday person, outside of that, your algorithm is only a reflection of who you are. It's only showing you what it thinks you want.
00:38:56
Speaker
So if you changed and curated your algorithm, you could make it work for you. You could move to watching more positive content, more useful content, things about nature, things about history, civic learning, right?
00:39:09
Speaker
So that algorithm is only a mirror. If you change what you're looking for, you can control it. Well, Shimona, that is the time that we have
Closing Remarks and Future Invitations
00:39:20
Speaker
for today. But clearly we will have you on for part two, three, five, I don't know. We could go on.
00:39:26
Speaker
But thank you very much for taking the time out of your day to talk with us. Thank you so much. I really enjoyed the conversation. Absolutely. Yeah, I definitely have a lot of questions to ask you guys too, given your backgrounds. Excited again.
00:39:40
Speaker
All right. Well, I hope we run into each other soon. I hope so.
00:39:48
Speaker
If you like this conversation, share it with friends and subscribe wherever you get your podcasts for a weekly ballistic payload of snark, insights, and laughs. New episodes of Bare Knuckles and Brass Tacks drop every Monday.
00:40:01
Speaker
If you're already subscribed, thank you for your support and your swagger. Please consider leaving a rating or a review. It helps others find the show. We'll catch you next week, but until then, stay real.
00:40:15
Speaker
And also, is this more like you guys ask questions and I answer, or do I ask questions back to you? Oh, you should 100% feel it's a two-way conversation. We're just hanging out at the pub. That's all we're doing. Yeah. I actually love it when guests ask us questions. They rarely do, but you should. If you want to psychoanalyze us on air, you can do that.