Introduction and Topic Overview
00:00:06
Speaker
Pacific Standard Time. Brian, why don't you unmute yourself and take it away. You got it, brother. All right. Can you hear me? Is the microphone working? Awesome. Is my audio clear? That's outstanding.
00:00:24
Speaker
Okay, so this is the first time I've ever done a virtual space like this before, so I apologize for any sort of awkwardness, but we're just going to go with it. Full send. It's going to be awesome. Hello, everyone. Thank you for the opportunity to come and speak today. I appreciate that. My name is Brian. Marine veteran, served about 10 years, did a number of different things.
00:00:43
Speaker
I think I'm primarily here to talk to you. I believe that the title that we came up with was misinformation in the age of social media.
Fundamentals of Propaganda and Social Media Algorithms
00:00:50
Speaker
I'll do my best to have a conversation in those regards. Basic topics I want to cover today: basic tenets of propaganda, a little bit on information diffusion and the science behind that. And then we'll move specifically into social media and how Instagram and the different algorithms work,
00:01:08
Speaker
in order to provide a base structure for y'all's coding efforts, so you can create something that will hopefully help out in this particular space. Let me say that one more time. We had a hot mic for a minute, sorry about that. Nope. No worries. We're good. At any point in time, please feel free to interrupt me, ask questions. I'm hoping this is going to be somewhat interactive, so it's not just my giant bald head up here talking the entire time.
00:01:40
Speaker
In regards to resources, I'll probably cover down on this again, but for future information for anybody who's curious about these sorts of topics, I've got a couple books that are worthwhile. Robert Cialdini wrote a book called Pre-Suasion, very solid. This was Hillary Clinton's campaign advisor in 2016; he's famous for helping her make up some ground against Donald Trump.
00:02:02
Speaker
It has some very interesting influence methodologies and seriously boosted some of the effects of her campaign. Presuasion is basically the idea of priming somebody to be more receptive to an idea by doing a series of actions beforehand.
Historical Propaganda Examples
00:02:23
Speaker
Along with that, there's a book called When Big Leave, which is a pretty in-depth expose on the 2016 campaign, talking about why Donald Trump won, why Hillary Clinton, who was supposed to be the clear favorite, lost, some of the tenets in regards to the ways they ran their campaigns, pros and cons of each side of the argument. I believe it's very fair, very academic and devoid of emotion, so worthwhile. Alex, what's that, man?
00:02:55
Speaker
Okay. With all that being said, all right, so let's move into propaganda. We'll start off with a quote here. It's paraphrased, but: make a lie, make it big, and repeat it. That's obviously Joseph Goebbels, who, if you're not familiar with who that is, was the Third Reich's primary propagandist during World War II. And there are some worthwhile tenets there to basically take forward and move forward with.
00:03:20
Speaker
That is ultimately, in a nutshell, what propaganda is, right? It's not mind control. I feel like that's a common misconception out there, your Men Who Stare at Goats, George Clooney implanting ideas inside of people's minds. That's 100% not what it is. The basis behind it is the idea that you build upon a truth that is already established, or an idea that people already believe is realistic.
00:03:47
Speaker
And then you start using multiple mediums. So if we're talking like in a warfare sense of things, you could talk about like radio broadcasts or, you know, leaflets, that's everybody's favorite term to be employed there. In the modern age, it's more social media in order to repeatedly bombard your target audience with the ideas that will help slowly and gradually cultivate them towards the general principles that you want.
00:04:19
Speaker
Okay. So I want to pose a question again for the audience out here. In regards to deception and propaganda (bless you), talk about historic examples of propaganda or deception that you've seen in the past. Bring up anything.
00:04:45
Speaker
propaganda. Yeah, does anyone have any? Well, I've noticed a lot of Russian propaganda. Lovely, from this crisis we've got going on. What do you see? Yeah, it seems like 200,000 people in Russia are in favor. They obviously have nationalism in their own country; they want to support their own leadership. And their own leadership has its own propaganda for this sort of land grab that they're doing. It seems like a large part of the NATO allies are against this. But
00:05:15
Speaker
only a couple countries are for this. So propaganda is coming in both ways here. What do you believe? That's ultimately the question, right? So how do you determine who to believe? How do you figure out if information is real or not, right? That's what it boils down to.
Modern Propaganda and Misinformation
00:05:34
Speaker
which I think we can get a little more into, but before we dive primarily into the Russia-Ukraine situation, let's talk about this from a generic perspective, just to get everybody on the same page, right? All right, so Sun Tzu, Art of War, everybody's favorite classic book, right? All warfare is based on deception. I'm sure for those of you who play video games like Call of Duty, you've seen that one before too. That's 100% accurate. That writing is specifically about the Chinese feudal
00:05:59
Speaker
wars, the civil wars that were taking place during that particular century, where they would send out couriers and lie to each other about single combat, then show up with an entire brigade, battalion, thousand-man unit, whatever the equivalent organizational hierarchy was, and be able to trounce their opponent as a result of it. A lot of deception was taking place there.
00:06:20
Speaker
Another example from antiquity would be the Battle of Troy. So hashtag Brad Pitt, a little Eric Bana action here if you prefer the movie to reading the Odyssey or the Iliad. But basically, using the desecration of Apollo's temple, they made the Trojans feel like the
00:06:38
Speaker
Hellenic force was no longer there. They left this offering to Apollo, the sun god Troy was built for, and the Trojans were like, yeah, absolutely, these Greek dudes are pretty chill, they're all right, that's dope. And then next thing you know, they brought in a series of Greek raiders who opened up the gates, allowed for a flow of forces inside the city, and that led to the resulting sack of Troy.
00:07:01
Speaker
A more modern example would be everyone's favorite, World War II, so Normandy. Is anybody familiar with Normandy, how that went down, what specifically was associated with it? Or if I said Operation Overlord or Operation Bodyguard? We've got some thumbs up and thumbs down, yep, it's a mix here. And feel free, folks, if you guys want to unmute.
00:07:32
Speaker
I got it. No worries. It's really cool. So, yeah, what do you got, man? Oh, sorry. Yeah. As far as Normandy, there was a little bit about telling about movements on
Truth Assessment and Information Verification
00:07:47
Speaker
a different coast and trying to manipulate where the forces were amassed. Then they attacked on a different coast. There you go.
00:07:58
Speaker
Absolutely correct. Right. So let's expand upon that. And the only reason I'm bringing these up, right, I'm trying to go from a big concept of misinformation and how that works, driving it down into specifics with like Twitter and social media. And I feel like this will be a broader understanding. So there's more depth in
00:08:15
Speaker
in your minds, so you have a better understanding of how to put this into some sort of code format, right? So, all right, Operation Overlord, Operation Bodyguard. Overlord was the invasion of Normandy that happened June 6, 1944. Bodyguard was the deception associated with that. So, this was
Idea Spread and Social Influence
00:08:30
Speaker
huge. This was gigantic, right? We've never seen anything like it. There were two primary invasion points that could have taken place, right? So, we had Pas de Calais,
00:08:37
Speaker
and then we have Normandy, right? So both are on the northern coast of France. Pas de Calais was actually the less good option for the Allies because of various tidal reasons, so they were not going to be able to go there. Normandy was the better option. Hitler, though, reinforced Pas de Calais. What the Allies did, in order to basically make him keep thinking that they were going to go to Pas de Calais instead, excuse me,
00:08:59
Speaker
they used a number of different items that they created. You probably remember the inflatable tanks, right? So whenever German aerial reconnaissance assets would fly overhead, they'd see these tanks on the ground and think that an invasion force was massing there. So that's muddying up the German intelligence picture.
00:09:17
Speaker
They had fake battalions with radio signatures, so a number of people running radio calls. The German intelligence units would intercept these radio calls, thinking that they'd discovered a treasure trove, and one, miscount the size and scope of the Allied invasion force, and two, think that they had good information about what was going to take place at Pas de Calais.
00:09:38
Speaker
Historically, Patton actually walked a German officer, a POW, around some of his forces and his headquarters in England, and basically let slip that he was looking forward to being in Pas de Calais and the northern portions of France, that specific area. This officer was then released. The way Patton dispensed this information was such that the German officer thought he was nonchalantly letting things go, that he hadn't meant to do it, when in reality it was entirely deliberate, right?
00:10:06
Speaker
So this guy thinks he's got the great information, right? He's figured it out, he's sleuthed it out, right? Goes back to German High Command and is like, yep, they're going to Pas de Calais, 100%. This is what's going to happen out here. So you have all these different factors
Social Media Algorithms and Virality
00:10:20
Speaker
that are taking place, that are lining up to basically build on the idea, which Hitler was already somewhat convinced of, that Pas de Calais was going to be the move, right?
00:10:28
Speaker
Such that when the Allies actually landed on June 6th of '44 in Normandy, and he had actual visual confirmation from his scouts that this was taking place, Hitler withheld the reinforcement that Rommel wanted in order to push them back, because he was so convinced that this was a feint, that this element wasn't real, that Normandy was actually the feint, Normandy was actually the deception taking place there.
00:10:57
Speaker
All right. Any questions on that? Any questions on Normandy? It's a classic historical example; everybody loves to talk about it, and there are great books that go into more exhaustive detail in those regards. I recommend you check them out. Okay. All the thumbs up and hearts. Good. Okay. So.
00:11:21
Speaker
Basically, what do we do with propaganda? And we're kind of closing up this conversation. We look for a preconceived opinion. We develop MOPs and MOEs, so measures of performance and measures of effectiveness.
00:11:31
Speaker
We build on that truth, and we figure out a target audience assessment. How are they receiving information, right? So if
Manipulation Through Bots and Fake Accounts
00:11:40
Speaker
it's a population that consumes information primarily via print newspaper, it's probably worthwhile to push that information via print newspaper. If it's news broadcasts or radio, those are the preferred mediums. This is where it's valuable to have some sort of good social study, if you will, of what the target population is going to be like, what they're most receptive to.
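Since this talk is pitched at the audience's coding efforts, the target audience assessment above can be sketched as a tiny lookup: for each audience segment, record the share reached by each medium and pick the widest-reaching channel. All segment names and numbers here are invented for illustration.

```python
# Toy target-audience assessment: which medium reaches the most of a segment?
# Every segment name and consumption share below is made up.

MEDIA_CONSUMPTION = {
    # segment: {medium: share of that segment the medium reaches}
    "rural_print_readers": {"newspaper": 0.70, "radio": 0.45, "social": 0.10},
    "urban_commuters":     {"newspaper": 0.15, "radio": 0.60, "social": 0.80},
}

def preferred_medium(segment: str) -> str:
    """Return the medium that reaches the largest share of the segment."""
    consumption = MEDIA_CONSUMPTION[segment]
    return max(consumption, key=consumption.get)

print(preferred_medium("rural_print_readers"))  # newspaper
print(preferred_medium("urban_commuters"))      # social
```

In a real campaign assessment the shares would come from audience survey data, but the shape of the decision is the same: match the message's medium to where the target population already gets its information.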
00:12:00
Speaker
We talked about Troy, we talked about Normandy. Desert Storm is another classic example of that. Schwarzkopf kept insisting that he was going to do this Marine amphibious invasion in the vicinity of Kuwait, so the Iraqi Republican Guard built up its forces along the coast. Meanwhile, he ran a left hook through Saudi Arabia and caught them completely off guard as a result. And that involved public affairs and exposure and his comments and concerns and so on and so forth.
00:12:31
Speaker
And then a more modern example would be Crimea in 2014, right? So the initial Russian incursion and aggression, kind of boiling down to the topic that we're here to talk about today. The story was that there were these pro-Russian nationalists within Crimea, but in reality, there were these little green men, and that was kind of
Countering Misinformation Strategies
00:12:50
Speaker
the thrust of their propaganda. The Russian term maskirovka, you'll hear that thrown around there.
00:12:55
Speaker
Synonymous, same thing. But basically, Russian special forces ended up taking the Crimean parliament by force and then forced them to sign over annexation rights, and that's how we wound up in the kind of situation that we're in today. But the Russians are good at this. They like this. They're well-practiced; they've been doing this for a long period of time. They've got a lot of specialists. Okay, closing that out. Propaganda basics: you have an understanding of it. Any questions from there? Can you hear me?
00:13:26
Speaker
Yeah. Can you put your settings on megaphone, DJ, please? Yeah. I was just going to ask more about Crimea, if you could just expand on that, because Crimea just kind of fell without any resistance, and they just submitted to Russian control, and it sounds like
00:13:54
Speaker
they prepped the area really well. Yes, understood, brother. Okay, so I see that we might have some Ukrainian folks in the chat. If y'all have better information on this, please let me know. But my basic understanding of what happened:
00:14:20
Speaker
This is an armchair quarterback methodology, right? So Brian's opinion of what's going on in Ukraine right now versus what happened in 2014. So the Russians had a limited objective back then, right? They wanted access to the sea. They wanted Crimea, which had historically obviously been the Soviet territory.
00:14:36
Speaker
So there was a series of preparatory work there that went with, one, the misinformation campaign, right, so using Russian foreign ministry voices in the area and the Russian media arms, so RT, pushing out that there's all these Russian citizens or ethnic Russians in the vicinity of Sevastopol, obviously the capital,
00:14:54
Speaker
who were desperately wanting to rejoin the Russian Federation, when in fact that was not necessarily the case. And then, in the middle of the night, the concept of little green men. I'm not sure if any of you all remember the articles that came out here, but a number of people basically got photos of Russian Spetsnaz, so special forces folks, running around on the ground pretending to be pro-Russian separatists, Ukrainian nationals who wanted the
00:15:22
Speaker
annexation of Crimea into the Russian Federation, when in reality these were actually Russian soldiers just kind of doing their thing. The operational security of that got blown because of the pictures people were posting on social media. So again, showing the difficulty in the age of information, how hard it is to keep a secret on these sorts of things.
00:15:45
Speaker
But why were they so effective in 2014 versus now? Now they're trying to take over an entire country. Then, they were trying to take a limited-scale location, and they had surprise. There wasn't the massive buildup that they have now; people believed that they were actually just doing exercises, vice, you know, the obvious intent now to come in and try to seize Kyiv and move forward through the rest of the country. DJ, does that answer your question? Does that provide some context? Yeah, it does.
00:16:14
Speaker
They also had the timing down pretty well, because it happened like four days after the president fled. So it caught the whole country off guard, which I guess was pretty important back then. Dark program, am I saying that right? Is that? Yeah, I ran. Got you, brother. What was that? Can you say that one more time and put your microphone on? Sorry about that. Yeah,
Public Education and Resilient Information Campaigns
00:16:40
Speaker
it happened four days after the president ran away. So there was no build up going on.
00:16:45
Speaker
Weren't there concerns... didn't he get ousted as a result of it? There were concerns that there might have been a payoff or something between Putin and him. Yeah, there was rumor of that, but essentially nobody was ready, because basically we were protesting for like three, four months, and then he just ran away and then it happened. Great timing. I'm sorry. I appreciate the context, though. Thank you.
00:17:14
Speaker
I have a question, Brian. How do you navigate decentralized propaganda narratives? Right now, multiple topics are being exploited. I think about... there's a New York Times Magazine article called The Agency, and they're talking about troll farms. Some of the topics they would probe were like,
00:17:34
Speaker
there was a fake chemical plant explosion somewhere in Louisiana, and so it was just internet gossip, basically. But then also other incidents, like exploiting racial tensions or other things going on. All of it is there to kind of degrade overall trust in authorities and stuff like that. But when it comes to that sort of asymmetry, how do you... you know, you can't just counter everything, so how do you build
Case Study: 2016 US Election Influence Operations
00:18:00
Speaker
awareness for how people navigate through those things better? That's a million-dollar question, right? So we'll set some definitions here: misinformation versus disinformation,
00:18:12
Speaker
and how information propagates itself throughout a social body in a construct. So misinformation is exactly what you're talking about, where it's just emphasizing erroneous or false things in order to muddle up the true situation that should be pushed out there. So if you wanted to take this into
00:18:37
Speaker
a networking attack or cyber intrusion methodology, it would be equivalent to a DDoS, right? You're eating up bandwidth. People then become numb as a result of it, because it's like, ah, this thing blew up or these people died, and it's erroneous, and therefore they pay less attention to the actual news that's going out there. Whereas disinformation, that's an actual objective-oriented propaganda effort. So you are specifically pushing either true or made-up
00:19:02
Speaker
stories, tags, tweets, memes, whatever you want in order to actually cultivate that message, that narrative to curate the target population towards a certain goal and objective. So you're asking about how do we stop multiple narratives in multiple different directions?
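For the coders in the audience, the "misinformation as DDoS" analogy above can be made concrete with a toy model: a reader only samples a fixed number of stories from their feed, and as the volume of noise stories grows, the chance they ever see the one true story collapses. The numbers are purely illustrative.

```python
# Toy model of the misinformation-as-DDoS analogy: one true story sits in a
# feed alongside N noise stories, and a reader samples only `sampled` of them
# uniformly at random. Exact probability via fractions; numbers are invented.

from fractions import Fraction

def chance_of_seeing_truth(noise_stories: int, sampled: int = 10) -> Fraction:
    """P(the true story is among `sampled` uniform draws without replacement)."""
    total = 1 + noise_stories
    if sampled >= total:
        return Fraction(1)  # small feed: the reader sees everything
    # P(miss) = C(total-1, sampled) / C(total, sampled) = (total - sampled) / total
    return 1 - Fraction(total - sampled, total)

print(chance_of_seeing_truth(9))    # feed of 10, reads 10: certain to see it
print(chance_of_seeing_truth(99))   # 1/10
print(chance_of_seeing_truth(999))  # 1/100
```

The noise does not have to convince anyone of anything; simply eating up the reader's attention bandwidth is enough to bury the real story.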
00:19:21
Speaker
It's tough. It's very tough, right? Especially when you're talking about the mass quantities. I mean, just think about the trolls on the internet that exist out there, people that are just posting in the comments, if you will, on various celebrities, on various different situations.
Building Resilient Strategies Against Misinformation
00:19:40
Speaker
That's a form of disinformation, right? They're just putting stuff out there in order to cause a ruckus. And that is simply a number of
00:19:53
Speaker
unintentional people that are just doing it for, you know, their pure entertainment, or maybe it goes to self-esteem issues, you know, whatever the motivations associated with it. Facebook, Twitter, Instagram, and all the various big social media platforms have a hard time dealing with that, right? So now you start talking about, how do we deal with
00:20:12
Speaker
objective-oriented, dedicated information campaigns that becomes much more difficult. But I will try to give you the tools to start hacking away at it. And what's the saying? A journey of 1,000 miles starts with the first step. So that's where we go forward. Ian, does that answer your question? Does that provide some context for it?
00:20:30
Speaker
Yeah, I guess I think of gaslighting, where maybe there's a network of other people who are kind of being henchmen for the person who's the primary abuser. How do you just call out the behavior itself, maybe? Is there a compass for it? And I guess that's what you're going to go into.
00:20:51
Speaker
Okay, so let's start with information diffusion, right? Has anybody heard this term before? There's a number of different psychological studies that are associated with it, and we're talking about how information moves throughout a population. If you have, awesome, if you haven't, I'll try to provide a condensed context of it, right? So let me pose this question for the group.
Conclusion and Final Q&A
00:21:10
Speaker
How do you determine if something is true? When you read it in the news, when you see it on Facebook, when you look at Instagram, what are the factors and traits that you're attributing to it that'll determine that this is actually correct and this is not correct?
00:21:22
Speaker
The language gives it away sometimes. Good. Perfect. What do you mean by the language? It's emotional content that appeals to something, that doesn't describe the fact itself, just the reaction to the fact. Good. OK. So dispassionate versus impassioned language, right? So you're talking about the difference between an academic article versus an expose or an editorial, if you will. Is that a fair assessment?
00:21:50
Speaker
Yeah, I guess it focuses on the opinion in both of them. Yes, opinion versus facts. That's good. Okay. What else?
00:21:57
Speaker
trying to deduce the first principles of whatever happened, the actual hard facts, as opposed to the opinion or projections based off those facts. So what's a hard fact? I'm going to put you in the hot seat. Like dates or specific people involved, or I guess you would say specific actions that occurred, as opposed to
00:22:28
Speaker
trying to relay intent or motivation behind the specific occurrences of whatever happened.
00:22:40
Speaker
How would you... okay, so, us as academic individuals, as intelligent folks, to DJ's point about the authenticity of facts, facts vice opinions, and some of the language being put out there that was just mentioned, how would you determine if those were correct? Tell me the academic methodology you're going to go after. Well, yeah,
00:23:07
Speaker
you hear about something, a story that happens. I'm trying to piece or parse out the specific bits of information and where it came from, like sources from which like people talking about whatever happened, like specifically who had the information, where did it come from? Yeah, tried to deduce like where
00:23:32
Speaker
like from first principles what actually happened and then separate what is the trying to the author or person relating the information trying to say
00:23:43
Speaker
with, in response to those facts? Perfect. So you're talking source verification, you're trying to find repeatability, you're determining credibility based on source verification, right? So if it's just Joe Schmoe, some random person with a Hello Kitty picture as their avatar, that's maybe not necessarily real; but a professional headshot, you know, credentials associated with it, or a bio, maybe that's a little more trustworthy, or you're more willing to believe that information.
00:24:11
Speaker
That's good. What about appearance? What are you looking for? So you're going on online news, and you're taking a look at this online news. What specifically about the website would you be looking for to determine if it was real or not? If it has like an About Us page, or it's connected to social media accounts that are legit and have a network of followers that are real people.
00:24:40
Speaker
I'd just be looking for connections to the broader network that seem to be legit.
00:24:49
Speaker
Good. Excellent. Okay. So now you're trying to determine traceability, right? That's what you're saying. It's like a resume, right? If you're at a job interview and they call your references, and none of the references work because all the phone numbers are made up, you're probably not going to get hired, right? But if they call the references and they're real people and they provide a glowing review of your work, you probably are going to get hired. It's the same situation with social media.
00:25:14
Speaker
So, in the interest of time, to kind of go through here, some of the ideas I have, right? So, how do we as people determine if something is truthful? I've got appearance. So, you know, appearing to be professional, and that's the overall structure of the website itself. Links work. Color coordination is correct. It's not just some random scripting text; it's actually got, you know, GUIs and various graphical content, maybe embedded video
00:25:41
Speaker
or media players, things like that going on. Authenticity of the source, that's repeatability. It's an accredited source, right? It's not necessarily Wikipedia or something on a subreddit that you're diving down into at 2 a.m.; I would say a realistic academic source that has some sort of veracity associated with it.
00:25:59
Speaker
Repetition, so the fact that you can confirm that information from multiple different sources. But remember here, don't fall into this trap. What did Joseph Goebbels say? Make the lie big, say it loud, say it over and over again, right? And use different mediums in order to confuse people. So perhaps this becomes a tactic to lull people into a false sense of security that they've found correct information, because multiple different sources seem to back it up, when in reality it is simply just a repeat of the same lie. Jamil, what's up, man?
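That repetition trap can even be probed mechanically: if several supposedly independent sources score as near-duplicates of one another, the corroboration they offer is weaker than it looks. Below is a crude sketch using token-set Jaccard similarity; the snippets and the threshold are made up for illustration.

```python
# Flag "independent" snippets that are really the same line repeated.
# Threshold and example snippets are invented for illustration.

import string

def tokens(text: str) -> set:
    """Lowercase word set with punctuation stripped."""
    cleaned = text.lower().translate(str.maketrans("", "", string.punctuation))
    return set(cleaned.split())

def jaccard(a: str, b: str) -> float:
    """Overlap of two texts' word sets (1.0 means identical vocabulary)."""
    sa, sb = tokens(a), tokens(b)
    return len(sa & sb) / len(sa | sb)

def suspiciously_similar(snippets, threshold=0.8):
    """Pairs of snippet indices that look like copies of the same line."""
    return [
        (i, j)
        for i in range(len(snippets))
        for j in range(i + 1, len(snippets))
        if jaccard(snippets[i], snippets[j]) >= threshold
    ]

stories = [
    "Officials confirm the plant explosion injured twelve workers.",
    "officials confirm the plant explosion injured twelve workers",
    "An independent inquiry found no explosion at the plant.",
]
print(suspiciously_similar(stories))  # [(0, 1)]: two "sources", one line
```

Real deduplication systems use sturdier tricks (shingling, minhashing), but the principle is the point: repetition across outlets only counts as confirmation if the outlets are genuinely independent.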
00:26:30
Speaker
Sorry, I was off on my... No worries, brother. Okay. And then language, as was brought up earlier. Yeah, absolutely correct. So now we're talking about the dispassionate versus the impassioned. What's up? I was going to say potential funders, and then also accountability. That's a good one too. Okay. So the direction you're getting into right now is less propaganda.
00:26:58
Speaker
It is actually the same, right? So news media calls it something different, right? Depending on what side of the aisle you're on, it depends on what type of news agency you're going to get your information from. The term they use is slant, and slant is basically how the information is going to be positioned in order to
00:27:17
Speaker
be more palatable to the readers, if that makes any sense. It's the same basic concept as everything you're talking about. And yes, the funding associated with that information is 100% a real factor. Is that the point you were trying to make? I was also going to say accountability. Oh, yeah. If it's just kind of one-directional versus this person putting their name to it, there may be a little more stakes involved.
00:27:40
Speaker
Dude, 100%, absolutely. The final point I'm gonna make here, maybe you've heard this term, maybe you haven't: if I said cognitive bias or confirmation bias, what does that mean?
00:27:53
Speaker
Oh, hell yeah, you killed it, man. Absolutely. Human beings are weird, right? Whether it's a combat situation where you're experiencing auditory exclusion because you're directly focused on the threat, and you're in Cooper's color code, for those of you that understand what that means, right? Like, I am tunnel-visioned in and I don't hear that I'm getting shot at over here in that situation.
00:28:12
Speaker
Confirmation bias, that's what the mind does, right? We have a tendency that if we figure out something we think is true, and we think it's true based on all the information we talked about before, we start to exclude other information as not true. We don't necessarily know why we do it, and I've seen it happen to even the smartest people. So this is one of those situations where you're looking at a Facebook group and it's a bunch of
00:28:41
Speaker
folks who have a certain opinion about something, right? And they read CNN and they're like, this is all garbage. Or they read Fox News and they're like, this is all garbage, or this is 100% true, or vice versa. That's cognitive bias. That is a trap that is difficult to pull ourselves out of, and we have to always be cognizant of that particular effect. DJ, what's that, man?
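Confirmation bias can also be caricatured in code as lopsided belief updating: a fair reader weighs supporting and contrary evidence equally, while a biased reader discounts whatever contradicts them. The likelihood ratios and discount factor below are invented for illustration, not taken from any study.

```python
# Cartoon of confirmation bias as biased Bayesian updating.
# All weights (the 4:1 likelihood ratio, the discount) are invented.

def update(belief: float, evidence_supports: bool, discount: float = 1.0) -> float:
    """One Bayes-style update; discount < 1 shrinks disconfirming evidence."""
    # Supporting evidence carries a 4:1 likelihood ratio, contrary evidence
    # 1:4, with the contrary ratio raised to the `discount` power.
    ratio = 4.0 if evidence_supports else 0.25 ** discount
    odds = belief / (1.0 - belief) * ratio
    return odds / (1.0 + odds)

mixed_evidence = [True, False, True, False]  # perfectly balanced evidence

fair = biased = 0.5
for e in mixed_evidence:
    fair = update(fair, e, discount=1.0)      # weighs both sides equally
    biased = update(biased, e, discount=0.2)  # shrugs off contrary evidence

print(round(fair, 3), round(biased, 3))  # fair stays near 0.5; biased climbs
```

Same evidence stream, very different conclusions: that is the trap in miniature, and why a Facebook group can read balanced coverage and come away more certain than before.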
00:29:33
Speaker
Say that one more time.
00:29:33
Speaker
Okay, so we talked about all that. Now let's talk about information diffusion and some of the structure associated with it, right? So now we can create a model. We're talking about how an idea moves forward. We start off with a champion, an idea champion. Now, in the age of social media, give me an example of what you think a champion is.
00:29:54
Speaker
Okay, I'll give you an example: media influencers. Say that one more time? Social media influencers. Oh, hell yeah. But let's take it a step beyond that. Give me a specific influencer. Give me the hottest of influencers, hashtag the hotness. Joe Rogan. Joe Rogan, cool, classic example, brother. Awesome. The Joe Rogan Experience, 100%. The Kardashians with their various
00:30:17
Speaker
beauty products, right? Those are idea champions, even if people are like, we hate them, or we love them, or whatever it is. Rihanna, another classic example, with her many beauty products, right?
00:30:27
Speaker
She creates a trend, and suddenly millions of people are going to follow suit on the trend and buy the product and go forward with it. Those are idea champions. They're called idea champions because they've got tentacles and reach and broad influence throughout multiple spheres of society, right? So Joe Rogan, very far-reaching, right? The Kardashians, incredibly far-reaching.
00:30:49
Speaker
Depending on what side of the political aisle you're on, either The Young Turks or, like, Ben Shapiro, all these kinds of niche groups that are associated with it, they have that kind of draw. So an idea champion is the first person a propaganda effort originates with, and you need a strong one in order to kind of have
00:31:12
Speaker
a good host body, if you will. If you're thinking about this like a virus, you have to have something that's going to allow that virus, that idea you're trying to propagate, to replicate and really take hold before it gets broad influence, gets seen by the greatest number of people.
00:31:29
Speaker
Now, from the idea champion, you move into early adopters. This is exactly what it sounds like: an early adopter is one of the first people that are going to get on it. These are the folks that have probably watched every episode of Keeping Up with the Kardashians. They've got the Joe Rogan Experience saved on Spotify; they're going to listen to every new episode that's there. They're the people that will take it hook, line, and sinker, right? That's their game.
00:31:53
Speaker
Late adopters, this is a concept where it takes a couple of years for this information to reach them. Think about it: you can view social circles as these clusters, if you will, with the champion right in the center and then these hubs and spokes that expand outwards: early adopters, late adopters, right? It takes time, a delta-T of time, for that information to diffuse outwards.
00:32:16
Speaker
So late adopters would be the folks that realize a video game is really, really cool 10 years after its release, and they end up buying it on PlayStation for like $9 versus the original $60. Late adopters are
00:32:31
Speaker
another effective example here, right? Folks who have used a, you know, Jitterbug or an old-school touch-tone phone, one of those old Nokias that can't break, and finally they just got an iPhone 13 because the kids are pushing them towards it or something like that. That's the concept.
00:32:50
Speaker
And then the final portion, that outer fringe of that social cluster of this information diffusion is going to be your laggards. Now, laggards may or may not take the idea, but they're going to be the hardest people to reach.
00:33:03
Speaker
So when you're talking about information diffusion, the best way to go after people: you're going to look for good champions, champions in the center of the social circles of the target audience you want to influence, and you're going to look for strong connections with the early adopters. So basically that's social media following, that's readership on an account, right? These are all assessment tools that we would use to gauge how effective our information campaign is going to be and how far it's going to reach.
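That hub-and-spoke diffusion picture can be sketched as a toy simulation. Everything here is invented for illustration: the ring names mirror the talk, and the probability decay just encodes the idea that the outer fringe is hardest to reach.

```python
import random

def simulate_diffusion(rounds, spread_chance, seed=42):
    """Toy hub-and-spoke diffusion: the champion seeds the idea, and each
    round it can hop one ring outward (early adopters, then late adopters,
    then laggards). The hop probability decays with distance from the
    center, so the outer fringe is the hardest to reach."""
    rng = random.Random(seed)
    rings = ["champion", "early adopters", "late adopters", "laggards"]
    reached = {"champion"}  # the idea always starts at the center
    for _ in range(rounds):
        for i, ring in enumerate(rings[1:], start=1):
            # a ring can only be reached once the ring inside it has been
            if rings[i - 1] in reached and rng.random() < spread_chance ** i:
                reached.add(ring)
    return [r for r in rings if r in reached]

# with certain spread, a single round cascades all the way out
print(simulate_diffusion(rounds=1, spread_chance=1.0))
```

With `spread_chance` below 1.0 the outer rings take more rounds (more delta-T) to light up, which is the whole point of the analogy.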
00:33:31
Speaker
Okay, that's information diffusion; that's a lot of stuff. Any questions on any of that? I see a comment here by Leah: many times we are targeted by our bias, so I try to check sources of news that I normally don't check, such as world news. To follow up on that, Wikipedia now has a current events news section. Would you say that's a good source of news?
00:34:02
Speaker
Yes and no. For world events, yeah, so obviously
00:34:09
Speaker
there's always a chance that what's there could have been modified by somebody with illicit intent, but the Wikipedia moderators are so good right now that literally if something gets modified and it's untrue, it's changed within 30 or 40 minutes or so. Bad information doesn't last long up there. So yeah, Wikipedia is great. And I would recommend, again referring back to that term slant,
00:34:37
Speaker
read the news that you would normally read, but then find one or two articles coming from something you wouldn't necessarily read, so you can balance the facts that are being pushed out there. There's a psychological effect here, and I'm by no means a psychologist, but there's a concept where reading information that you don't necessarily agree with and fighting that cognitive bias forces a rewiring of the neurons that
00:35:02
Speaker
will help harden you against some of those confirmation bias and cognitive bias effects in the future, so you're more receptive to new information that will help update you and keep you centered, if that makes any sense. What else? Yo, phenomenal audience, all the avatars, I love it.
00:35:27
Speaker
Okay, the bread and butter of why we're here: social media. So we talked about propaganda and what that is. We talked about information diffusion and the concepts associated with the social cluster and the move outwards. Now we're going to build upon that and talk about how social media works. When we're talking here, we're talking primarily about Twitter, Instagram, Facebook, maybe YouTube, right? The majority fall under the umbrella of Meta slash Facebook, whatever it's calling itself now. But that's what I mean here. So,
00:35:57
Speaker
if I were to throw out: how do you become Instafamous? Does anybody have any idea how you do that? The content? Content, good. What type of content? Salacious content. Sensationalized. Go ahead, DJ. What was that again? Say it one more time. DJ was saying often sensationalized. Sensationalized, viral, good. What does viral content cause? What effect does it have in the social media realm?
00:36:28
Speaker
Quick sharing. Quick sharing, good. Interactions? Excellent. Increased followership? That's it, good. I was going to say, it makes people feel angry, scared, or ashamed. It creates an emotional reaction, absolutely. But what does the emotional reaction cause? Arguments. Good, yes. And where do those arguments happen? Where do they go?
00:36:59
Speaker
They go to the comments section, right? Yes. And they start tagging other friends in there, being like, I can't believe they posted this BS, or I can't believe they did this, right?
00:37:11
Speaker
So the secret of going viral is basically that you want to increase your followership as much as possible, and hopefully I'm launching some people's Instagram careers here, but increase your followership, and then it's interactions. Whether that's a like, a share, a repost, a comment, any sort of interaction that takes place there is worthwhile. So if you ever wonder why somebody's on the internet doing something completely and totally stupid, it's because they're doing something so stupid that
00:37:37
Speaker
they force people to create a reaction to it, and that's basically how the algorithm works. The more followers you have, the more interactions take place as a result, and the higher your content gets promoted, meaning it gets a higher viewership and gets pushed out to more people. So if you've ever seen a sponsored ad,
00:37:56
Speaker
and you're like, how did I get this content? Why am I seeing this, right? One, it's based on your user preferences and profile, but two, it's based on the virality of it and how highly it's been promoted. So: more content, more interactions, more potential reach, more potential touch points for that information to get to. Okay, so now let's talk about doing this artificially. If I said the term, they bought their followers, what does that mean?
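The ranking mechanics just described (interactions feed promotion, followership multiplies reach) can be sketched as a toy score. The weights and field names here are invented purely for illustration; real platform ranking algorithms are proprietary and far more complex.

```python
def promotion_score(post):
    """Toy ranking score: every interaction type counts, but the ones
    that spread content into new feeds (shares/reposts) count the most,
    and a bigger followership multiplies everything."""
    weights = {"likes": 1, "comments": 3, "shares": 5}
    base = sum(weights[k] * post.get(k, 0) for k in weights)
    # more followers means every post starts with a bigger audience
    return base * (1 + post.get("followers", 0) / 10_000)

rage_bait = {"likes": 200, "comments": 900, "shares": 400, "followers": 50_000}
nice_post = {"likes": 1_000, "comments": 50, "shares": 20, "followers": 50_000}
print(promotion_score(rage_bait) > promotion_score(nice_post))
```

Note that the comment-heavy rage bait outranks the post with five times as many likes, which is exactly the "say something stupid, force a reaction" dynamic the speaker describes.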
00:38:28
Speaker
Not genuine or bona fide. Yep, okay. So fake accounts, right? On Instagram, you'll often hear them referred to as bots.
00:38:41
Speaker
Okay, so these are accounts that have zero actual followers, and they follow a bunch of other people. Their whole purpose in the world, and they're run by a computer, set up so that any time somebody they're following posts something, they immediately repost it. Now, let's take this into the context of the information diffusion we were talking about before. You've got your champion of the idea in the center, and then you've got all these early adopters,
00:39:07
Speaker
late adopters, and laggards that are out there as a result. But if I create a series of fake bots around it that are automatically retweeting, all of a sudden the content of the champion in the center gets artificially boosted in regards to its relative reach. That is the basic concept of how social media can be used for misinformation or disinformation,
00:39:30
Speaker
right? So we've got these botnets, a bunch of fake users and fake profiles, and they're simply retweeting information into what's called an echo chamber. That's this hub-and-spoke thing: champion in the center, bots all around them, all retweeting each other over and over and over again. Content gets artificially raised, and all of a sudden it starts to appear in other groups that are out there. So
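As a back-of-the-envelope illustration of that artificial boost, here's a toy reach model. Every number in it is invented; the point is only that automated reposting multiplies apparent reach far beyond the champion's organic audience.

```python
def apparent_reach(organic_followers, bots, reach_per_repost=150):
    """Toy model of botnet amplification: each bot auto-reposts the
    champion's content, and each repost surfaces it in some number of
    extra feeds (hashtags, explore pages, other accounts' timelines).
    All numbers here are invented for illustration."""
    return organic_followers + bots * reach_per_repost

organic = apparent_reach(organic_followers=2_000, bots=0)
boosted = apparent_reach(organic_followers=2_000, bots=500)
print(f"organic reach {organic}, with a 500-bot echo chamber {boosted}")
```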
00:39:58
Speaker
With that concept in mind, now we're talking about the actual purpose of us being here: how do we counter that? How do we move forward with it? Now, I'm by no means a crazy coder; I'm primarily focused on Python and MATLAB for data science purposes.
00:40:16
Speaker
In regards to ways and means, to go after Ian's earlier question of how we counter the mass amount of information that's out there, whether it's misinformation, just trying to muddy the waters, or disinformation, specifically trying to orient a group away from a specific idea or hit a target audience that believes a certain thing, this is my approach. I would get familiar with using the social media APIs and publicly available information so you can do large-scale scraping of some of the data that's out there.
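As a starting point for that scraping step, here's a minimal sketch that builds a request for the Twitter/X v2 recent-search endpoint. The endpoint and parameter names are per the public v2 API docs at the time of writing, but check the current documentation and your access tier before relying on them; the query string and token are placeholders.

```python
import urllib.parse

def build_search_request(query, bearer_token, max_results=100):
    """Build URL and headers for the Twitter/X v2 recent-search endpoint.
    Returns them without sending anything, so you can inspect the request
    or hand it to your HTTP client of choice."""
    params = urllib.parse.urlencode({
        "query": query,
        "max_results": max_results,
        # ask for the fields useful for bot/diffusion analysis
        "tweet.fields": "author_id,created_at,public_metrics",
    })
    url = f"https://api.twitter.com/2/tweets/search/recent?{params}"
    headers = {"Authorization": f"Bearer {bearer_token}"}
    return url, headers

url, headers = build_search_request('("biolab" OR "biolabs") -is:retweet', "YOUR_TOKEN")
print(url)
```

From there you would send the request with `urllib.request` or `requests` and page through the JSON results; a developer account and bearer token are required.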
00:40:46
Speaker
I would focus your code into a Boolean-based filter, like a sieve, where you're sifting through things. So with this in mind, I would look at specifically Ukrainian-based or Russian-based information, or any sort of information, regardless of source, that is deemed to be misinformation or false in regards to the conflict that's currently ongoing, right?
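That sieve might look something like this in Python, with each check as one Boolean layer. The field names and thresholds are invented for illustration; a real pipeline would compute these signals from scraped data.

```python
def looks_suspect(post):
    """Toy layered Boolean filter: source authenticity, repetition,
    confirmability, and language extremity, each reduced to one check.
    Flag only when several independent signals line up, not just one."""
    checks = [
        not post.get("verified_source", False),          # authenticity
        post.get("identical_reposts", 0) > 50,           # repetition
        post.get("independent_confirmations", 0) == 0,   # confirmability
        post.get("all_caps_ratio", 0.0) > 0.3,           # language
    ]
    return sum(checks) >= 3

post = {"verified_source": False, "identical_reposts": 120,
        "independent_confirmations": 0, "all_caps_ratio": 0.1}
print(looks_suspect(post))  # three of the four checks fire
```

Requiring several signals to agree, rather than any single one, keeps a lone noisy check from flagging legitimate content.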
00:41:12
Speaker
I would move this down through a number of layers, right? So it's everything we talked about before in regards to appearance, authenticity of source, repetition, confirmability, language. That is how you would structure your Boolean logic, in order to filter through whether this is actually a verified information source or whether this is probably BS that's being reposted and artificially boosted in position. Once you identify some of the content that
00:41:41
Speaker
has been pushed out there that you've confirmed is factually false, now you move into the actual mapping of the network. Carnegie Mellon has some data science folks doing some pretty interesting work on countering botnets and understanding social media influence and the various algorithms behind it. That's where the majority of my information comes from. I sat through a week-long class, and I'm giving you the 20-minute version of it right now, but
00:42:11
Speaker
the goal then becomes: now we have to identify what's a bot and what's a champion, right? So create a hierarchy or archetype of this echo chamber and get an understanding of what's going on. So what's a bot, right? The telltale characteristics: it's probably going to have a fake picture on it. Its username is going to be something to the effect of, you know, "debristree1234" or something, right? It's not going to make any sense. It's going to look real, real stupid. It might be in Cyrillic, it might not be in Cyrillic, right? Depending on the current situation.
00:42:41
Speaker
Use your best judgment. There are probably better telltale indicators online about what a fake account looks like, but: zero actual followers, probably following like 500 other people, no actual posts of real content themselves, reposting everybody else's content. That's probably a bot. Create a hierarchical map. Figure out who the centerpiece is, and who's fake and who's real in those regards. Figure out who's connected to whom.
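Those telltale signs translate naturally into a scoring heuristic. The thresholds and field names below are invented for illustration, and the example username is hypothetical; a real classifier would be tuned against labeled data.

```python
def bot_score(account):
    """Score an account against the telltale signs above: no followers,
    following hundreds of accounts, no original content, and a username
    that's mostly a random string of digits. 0 = probably human,
    4 = probably a bot. Heuristic thresholds are invented."""
    score = 0
    if account.get("followers", 0) == 0:
        score += 1
    if account.get("following", 0) > 400:
        score += 1
    if account.get("original_posts", 0) == 0 and account.get("reposts", 0) > 0:
        score += 1
    name = account.get("username", "")
    digits = sum(c.isdigit() for c in name)
    if name and digits / len(name) > 0.3:   # e.g. a hypothetical "dbrtre12345"
        score += 1
    return score

suspect = {"followers": 0, "following": 800, "original_posts": 0,
           "reposts": 3_000, "username": "dbrtre12345"}
print(bot_score(suspect))
```

Scoring each account in a scraped network, then grouping accounts by who reposts whom, gives you the hierarchical map the speaker describes: high-score nodes on the rim, the champion at the center.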
00:43:08
Speaker
Then comes the final portion of it: actually combating the misinformation. Now, the administrators of the majority of these social media sites are real good at removing spam, and that's exactly what this would be, so you could absolutely report them. The downside is that they'll probably just pop back up again in a different form, and basically you're playing whack-a-mole in those regards. So the other consideration here is trying to use
00:43:31
Speaker
the echo chamber against itself. Like the DDoS idea we talked about before: maybe you start causing a bunch of doge memes to get thrown at the bot network or something like that, to cause some confusion about what they're supposed to be reposting and what they're not. Basically just causing some sort of disruption within the network itself.
00:44:00
Speaker
All right, I've said a lot of words. I've been talking for a long period of time. This is actually a very complex topic; I mean, there are a number of PhDs who are way smarter than I am at this, but I did my best to give you the down and dirty on it. Any questions on what we've talked about while we're here?
00:44:19
Speaker
Brian, you started by talking about Goebbels. You quoted him. And what if Goebbels and his henchmen are pushing out these messages? For example, in the Russian case, we have a spokesman for the Ministry of Defense of Russia saying, hey, there are American bio-labs in Ukraine. What do you do about that? There are no bots; he's just a guy who works for the government, and he's saying these things.
00:44:45
Speaker
Yeah, okay. So you're asking, how do you make a foreign nation's official spokesperson start telling the truth? That's tough. Do you remember in '03 when the US invaded Iraq, and
00:45:02
Speaker
the Iraqi spokesperson was like, there are no Americans here, it's good, everything's great? That was an obvious version of exactly what you're talking about, in a much scaled-down form. I totally understand where you're coming from here, where it's like, okay, the only reason that guy is saying that is to draw attention away,
00:45:22
Speaker
to try to obscure the fact that there's been an extreme violation, an invasion by a neighboring country taking advantage of another country. Stop muddying the water, stop talking about a country that's 8,000 miles away at this point in time; let's focus on the issue at hand here. The only thing I could say is: take the initiative.
00:45:49
Speaker
It always feels a little bad when I say this, but there are some really good Instagram-based news media outlets out there.
00:45:57
Speaker
Occasionally there are, right? So Tesseron and Atlas News are on Instagram, and they've got some pretty phenomenal live imagery coming from folks on the ground in Ukraine. They're able to effectively uplink via secure data protocols, so there are first-hand accounts of what's actually occurring on the ground. It's very accurate information.
00:46:19
Speaker
If you're talking about how to counter Russian Foreign Minister X from saying stupid things that draw attention away from the actual situation,
00:46:30
Speaker
if you want to go after him yourself: use your own accounts, get your buddies together, and start tagging him and spamming him. That's basically the way I would go after it, and use factual information. So every time he posts something on the media, or every time he says something, you immediately retort and try to correct. Use the same tactics we talked about before in regards to social media following; in other words, get your content prioritized. That's the way to go after it.
00:46:57
Speaker
Does that answer your question? I feel like that's a tough one. It is a tough one, because it has to be a full-time job. We've been dealing with this stuff for eight years, at least, for this type of conflict. This has been going on for a while now. There are a lot of people in our network that have been getting fed this information for years and years and years, and they just absorb what they can take in. So yeah, it's a tough thing you need to do.
00:47:26
Speaker
Let me ask you this, then. Are you seeing a change in Ukraine right now? After eight years of all this misinformation coming down, is there any sort of effect on the targeted population from the Russian perspective?
00:47:48
Speaker
Well, there's been a huge change in the past month, that's for sure. Yeah, but... Essentially, for Russia. Well, definitely for Ukraine. Though I'm not actually in Ukraine; I'm in Estonia. My apologies. No worries, I have it in my title. Yep. Okay, so that is the one thing I would say:
00:48:18
Speaker
it's becoming very apparent that Putin has miscalculated this whole situation, and I'm loving the fact that we're constantly reading about dead Russian generals. It's not nice to say that about people dying, but in this particular case, it's obvious the train is coming off the tracks a little bit here.
00:48:36
Speaker
And the world's support, the massive mobilization of world players in response to this flagrant violation of international law, yeah, absolutely, that's one of the things helping counter the Russian propaganda, the Russian information methodologies going forward. I'm glad to see that. I'm happy to see that the ground truth out there is still pro-Ukrainian, that
00:49:01
Speaker
you know, it's not the pro-Russian narrative that's winning out there. But it's a big problem, separately.
00:49:10
Speaker
I'm afraid we're just scared. There's a confirmation bias going on: we see them retreating, and we kind of think that we're winning, but there's still a chance that they're going to regroup. We're calling them orcs, and basically there are a lot of orcs still to come. And it sounds like there's this vacuum happening right now, that they're retreating and we're winning, so we don't need support, but we actually need that load of weapons. Okay.
00:49:37
Speaker
So if you can talk to somebody who can make that decision, please tell them. I'm just a man, brother. Mad respect, though. With the idea of confirmation bias and the salacious news that gets spread, it's not really the more centered stuff that's, like,
00:50:05
Speaker
decent. It's like bad news travels faster than good news. Could you speak to the idea of useful idiots, where Ukrainians want to make Russians look bad and Russians want to make Ukrainians look bad? That's a generalization, but politically,
00:50:28
Speaker
you know, the left wants to make the right look as bad as they can, and the right wants to make the left look as bad as they can. So the worst parts of any argument or any situation get accentuated. So yeah, if you could speak to that. Okay, okay. So is that Trotsky or Lenin who has the concept of useful idiots? Or is that straight out of the Karl Marx playbook?
00:50:57
Speaker
Obviously, the proletariat was the early stage of communism, the idea that useful idiots were basically the early adopters, and sometimes idea champions, that helped propagate the concept of communism and eventually led to the 1917 revolution and the overthrow of the Romanovs, and, bang bang, Anastasia running through the woods, so on and so forth, right? Just for full context of what he's talking about there.
00:51:24
Speaker
In what context do you mean useful idiots? What are you trying to get at here? Help me drill down on the question a little bit. Yeah, I'm just, you know, when we're trying to figure out misinformation and disinformation, it's like both sides of any argument have an incentive to
00:51:49
Speaker
sway the argument more in their favor by basically muddying the waters around maybe the good points of the other side. And it's like with the Ukrainian situation,
00:52:09
Speaker
you know, I was monitoring social media. In the beginning, I was hearing stuff like, oh, the Ukrainians are doing really good, and there's this ghost pilot that's shooting down a bunch of planes. But then it comes out later that there wasn't really a ghost pilot. That was phenomenal propaganda, by the way; that was real great. And now I'm hearing, from the Russian side, they're saying,
00:52:38
Speaker
when they went to attack Ukraine, they're like, Ukraine is full of Nazis, they're just terrible white supremacists over there, we just have to liberate them. And it's like both sides want to push a narrative. Yep. Okay, sir, I've got your question now. Thank you, brother, for providing more context there. All right, so here's the deal:
00:53:02
Speaker
when we're talking about influence operations, you've got left brain versus right brain, the logical versus the emotional side of the brain. People who are inherently logical will look for the telltale factors about information veracity, right? Whether or not it's true. But if we start playing on emotion, introducing emotions into the situation, suddenly those kinds of checks become less important. So, like, pro-Ukrainian propaganda, for
00:53:27
Speaker
example: the Ghost of Kyiv. That's a feel-good story, right? The underdog in this case is getting beat up by, you know, Vladimir-riding-shirtless-on-his-bear, but there's this ace pilot, the equivalent of Maverick, flying around in this old 1980s-era Soviet technology jet and just downing people like it's cool, right? That's dope, right? That's Iceman, that's some serious Top Gun stuff. I feel the need for speed.
00:53:52
Speaker
People feel a certain way about it, and all of a sudden it doesn't matter if it's true or not, because it evokes an extreme of emotion, and that's another key factor there, right? It's a feel-good story, and that immediately gets more likes and more shares. It's the same equivalent of, you know, why do you watch videos of dogs?
00:54:15
Speaker
Why do you watch videos of cute puppies dancing around? Because in the back of your brain, your amygdala is receiving all the little signals, the endorphins are firing back there, and all of a sudden you get this euphoric sense, basic human physiology. The same question I would ask you is: what type of headlines are going to be on any major news source?
00:54:36
Speaker
They're either going to be really good, or they're going to be really, really bad. So what are the cool ones we've seen recently? A bunch of Russians got acute radiation sickness, ARS, which is not a good way to die, I wouldn't wish it on my worst enemy, because they were digging trenches in the vicinity of Chernobyl. That's horrible.
00:54:57
Speaker
They're conscripts; they had no idea what they were doing in this particular case, right? But that's the kind of horrific news that makes it. Or the Ghost of Kyiv gets pushed out there, or any time the President of Ukraine is sneaking around and
00:55:12
Speaker
doing his posts and his patriotic comments, and that speech he gave to NATO and the EU and the Americans referencing all these different historic events, it's like this Rocky underdog situation. That's what sells. So anything that can cause an extreme emotional reaction has more influence-ability, if that makes any sense. So in regards to useful idiots,
00:55:37
Speaker
same basic concept here, but what you're really saying is that you're lessening inhibitions, mental inhibitions, by bypassing the logical side and going immediately to the emotional side. Adrian, does that answer your question? It does. I'm just trying to think about how, you know, when we're trying to
00:55:58
Speaker
filter through this and build solutions to this. That is how you would shape your Boolean-based logic gates, then. It would be associated with extremism of language, right? Whether extremely euphoric or, in regards to your keyword search, very, very good or very, very bad adjectives associated with it as you're stepping down through your filter. Is there a data set you would recommend to study?
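That extremity-of-language gate could be prototyped with a simple keyword score. The tiny word lists here are invented for illustration; a real filter would use a proper sentiment lexicon or model, but the shape of the check is the same.

```python
def emotional_extremity(text):
    """Toy extremity score: the fraction of words in the text drawn from
    a small lexicon of strongly positive or strongly negative words."""
    extreme_words = {
        "horrific", "outrageous", "catastrophic", "traitor", "evil",
        "miracle", "heroic", "legendary", "unbelievable", "glorious",
    }
    words = [w.strip(".,!?").lower() for w in text.split()]
    hits = sum(w in extreme_words for w in words)
    return hits / max(len(words), 1)

calm = "The ministry issued a statement about grain exports today."
hot = "HORRIFIC! Evil traitor commits outrageous, catastrophic crimes!"
print(emotional_extremity(hot) > emotional_extremity(calm))
```

Posts scoring above some threshold would step down into the next Boolean layer (repetition, confirmability, and so on) rather than being flagged on language alone.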
00:56:29
Speaker
So, Jupyter notebooks: I guarantee if you go on GitHub right now and search social media influence or social media marketing tactics, there will 100% be a data set associated with how the algorithms work and how information is propagated that would be worthwhile for you to do some training and testing of your overall model.
00:56:55
Speaker
There's probably some stuff out there, probably some open-source stuff already pushed forward by some of your countrymen, that would be worthwhile as well. Specific data sets that I have? Not at this point in time, but I guarantee, like I said, you put a search into GitHub and you'll get something worthwhile to help you out there. Sorry I didn't have something ready-made for you. My bad.
00:57:25
Speaker
Sounds like there's going to be a good group of data sets coming up. All right. So here's the thing, too, if you want to take the week, right:
00:57:37
Speaker
the things that are done here can be built upon, but if you want to take some time, get on some of the APIs that are out there, or Meltwater, software packages that you can buy subscriptions to, or limited subscriptions, that will provide the data sets you're looking for. Meltwater is a program used by most marketing companies to do exactly what I've been talking to you about today in regards to veracity,
00:58:03
Speaker
virality, and information diffusion. It provides excellent graphics: word charts of what's trending and what's not, who the major players are, and what the connections look like in regards to clustering of individuals and their associated contacts. And with the keywords that come back from that, I guarantee you could probably sharpen up your Boolean logic to have a more successful filtering effect.
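A minimal version of that word-chart idea, run over an exported mentions file, might look like this. The column name `text` and the sample rows are invented for illustration; match them to whatever your export actually contains.

```python
import collections
import csv
import io

def trending_terms(csv_text, top_n=3, stopwords=("the", "a", "in", "of")):
    """Toy trending-terms chart: count non-stopword words across the
    'text' column of an exported mentions CSV."""
    counts = collections.Counter()
    for row in csv.DictReader(io.StringIO(csv_text)):
        for word in row["text"].lower().split():
            word = word.strip(".,!?#")
            if word and word not in stopwords:
                counts[word] += 1
    return counts.most_common(top_n)

export = """text
biolabs in ukraine exposed
the truth about biolabs
biolabs biolabs biolabs
"""
print(trending_terms(export))
```

The top terms coming out of a pass like this are exactly the keywords you would feed back into the Boolean filter described earlier.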
00:58:31
Speaker
They also export to Python-readable data sets for content distribution as well. That's nice. Brian, I was wondering if you could speak to... oh, go ahead, Thomas. Would you be able to speak a little bit to examples of,
00:58:57
Speaker
I guess, education, as far as teaching the masses how to navigate this? Were there any use cases that were particularly successful without being a sit-down-and-watch-this-workshop kind of dynamic? Or other tools, yeah, tools or kits that could help people navigate better?
00:59:19
Speaker
So are you asking: is this a question on countering misinformation, or is this a question on general education of the public to make them more resilient to information campaigns? Yeah, it's more on the resilience side, I think. Just kind of media literacy, you know, so that they can assess it themselves. So, that's tough. The majority of the human attention span has been drawn down to 15-second TikTok or Reels clips, which makes it very difficult for
00:59:48
Speaker
thorough understanding to take place. What I would recommend in those regards, if you're countering, is spearheading this as a two-pronged methodology of how you educate the masses. So, as much as you're working
01:00:11
Speaker
to eliminate the bad information by finding bad actors, either reporting them or spamming them in order to disrupt their efforts, simultaneously, if you're talking about individual efforts, look at creating your own social media platform. It could be anonymous, it could be associated with the Ukrainian effort, whatever you want, right? But it could be something associated with information resilience, or understanding marketing, or understanding the effects of
01:00:39
Speaker
the warfare of propaganda, and start posting, whether it's something as simple as, hey, you should read something by Robert Cialdini, because that guy knows more about influence than anyone else, or talking about exactly what we said today in regards to how to verify the authenticity of the source you're reading to make sure it's real information. There are a number of studies; if you go on Google Scholar, you could probably link the
01:01:04
Speaker
studies available to the site. And then from that point, start getting as many followers as you can to try to get the truth out there and help educate the masses a little bit. Other than that, it's tough. I mean, you're talking about billions of people here who are cognitively biased, like we talked about before.
01:01:27
Speaker
Here's how bad it is, right? I have experience with this, I've been trained on this, right? I've got a master's degree in this sort of thing, and I still find myself at times walking down paths where I'm excluding information, and then I have to mentally stop myself and be like, Brian, shut up, you're being dumb right now, you're falling into it. It's tough. That's not a great answer to your question, but that's the only methodology I can recommend for it.
01:01:57
Speaker
Do you have any, like, favorite... oh, sorry. Yeah. Well, we want to be respectful of Brian's time here, so why don't we take one more. Ian, you were asking if he had any favorite success stories as far as countering misinformation?
01:02:32
Speaker
So, again, not to get political, I would recommend taking a look at Win Bigly.
01:02:41
Speaker
So again, Scott Adams. It's about the 2016 election. Obviously that incites serious emotions regardless of who you voted for; that's not what I'm talking about here. I'm talking about the dynamics of the influence operations of the political campaigns run by both sides there. One of the interesting points drawn out is why Hillary Clinton was losing in the early stages of it. One of the references is
01:03:02
Speaker
putting yourself in an SNL skit. So Trump puts himself in an SNL skit and he's playing the president in that particular skit, whereas Clinton is in an SNL skit and she's playing a drunk at that point in time, right? So we're already priming, pre-suasion if you will, we're priming the mind to see either Clinton as a drunk or Donald Trump as very presidential in those regards, right?
01:03:25
Speaker
We talked about virality, right? Why people say stupid things. So think of every single time that Donald Trump made up a nickname for somebody, something that was stupid, kind of backyard-bullying nonsense, but immediately it caught on and people started repeating it over and over again. That created virality and reshares, and he got more exposure as a result. Clinton didn't have a similar sort of situation. So we're leading up to the final months of it, and Adams,
01:03:52
Speaker
the author, is an admitted Hillary Clinton supporter, so it's interesting that he gives so much praise to Trump in that regard because of how effectively the campaign was run. Hillary is losing. It's very apparent that she's a couple points behind right up to the election in November of that year, right?
01:04:11
Speaker
He talks about how they bring Cialdini in and start making subtle changes to the way information is being put out. So in regards to a success story of countering information, that is an effective methodology: okay, we're losing the information war right now, and we need to regain it. If you're looking at it from a purely academic standpoint, it shows very much how susceptible people, even educated people, are in those regards.
01:04:40
Speaker
I think the 2016 election is a phenomenal example of how emotion, virality, the use of social media, and authenticity of information all culminated in this perfect storm of an American election. That's all I got.
01:05:05
Speaker
Awesome, Brian. Thank you so much for your time again. It was riveting; we definitely learned a lot, lots of points to take away and pull inspiration from for our projects here. Thanks again for your time, sir. Absolutely, it was great to hang out here. I'll drop off and let you address the rest of the group from here.
01:05:59
Speaker
But yeah, thanks again for your time, Brian.