The Reality of Cybersecurity Threats
00:00:00
Speaker
I don't believe people in any walk of life really believe something without seeing it. So I think whether it's some sort of interactive experience or video, will you get it out of an article that gets published in a monthly newsletter? I don't think so.
Podcast Introduction with a Deepfake Twist
00:00:25
Speaker
Yo, this is Bare Knuckles and Brass Tacks, the cybersecurity podcast that tackles the human side of cyber: buying, selling, trust, respect, and everything in between. I'm George K with the vendor side. Just kidding, this is Aaron Pritz from Reveal Risk, running a deepfake voice mask of George. And we're about to unpack this and more.
00:00:50
Speaker
Okay, this is really Bare Knuckles and Brass Tacks, the cybersecurity podcast that tackles the human side of cyber, and I am really George K from the vendor side. And I was not deepfaked, but I'm George A, the Chief Information Security Officer.
00:01:06
Speaker
And today our guest is both deepfake George K and Aaron Pritz, co-founder and CEO at Reveal Risk. This has been in the planning for a bit, and it was really fun to see.
Deepfake Technology in Action
00:01:18
Speaker
So normally we're only recording audio, and hopefully you'll hear the difference when we do the deepfake audio. But Aaron appeared on video as me, complete with the BKBT fist-and-bolt snapback hat. And it was kind of weird to see myself on the other side of the screen.
00:01:37
Speaker
Yeah, I'm blown away by the whole thing. As a technologist, I thought it was cool as hell. And then as an actual CISO, I'm freaking out. So you'll see in the show, my questions are kind of more, what free consulting can I get out of Aaron while he's here? Guys, this is real.
00:01:54
Speaker
Yeah. And we talked a little bit with Kaylee Miner in a past episode about how her team is using deepfakes in their cyber attack simulations.
Role of Deepfakes in Cybersecurity Awareness
00:02:02
Speaker
But it's one thing to hear about that in the abstract and quite another to be on the receiving end of it. And we talk about how recently this technology has become available. They are using all open source software, and the fact that they can actually get into the forums with threat actors and basically talk best practices is kind of nutty.
00:02:21
Speaker
But yeah, this was a real blast and fun for Cybersecurity Awareness Month, which we have a lot of hard feelings about. But in terms of trying to get your organization to understand new and novel attacks, as Aaron says, seeing is believing; you're not just going to learn by reading an article. That's it. Let's go. Aaron Pritz, AKA George K 2.0, deepfaked in real time. Welcome back to Bare Knuckles and Brass Tacks.
00:02:50
Speaker
Thanks for having me to my own show. Really appreciate it. That's right. So we're doing something new for the first time. Audio listeners, you'll only hear the voice fake, but we do have video for the first time. And Aaron has trained a small model on my face using some rather embarrassing iPhone videos. So he is appearing as me on this podcast, albeit with a bit of an orange beard, but we'll get into the technical difficulties there. Yeah, but I love it. It's the Irish George.
00:03:24
Speaker
That's right. And I do have a little bit of Irish in the background; I am a mutt of a mutt. But yeah, so why don't you start out by talking a little bit about this deepfake process, and then we'll take it from there.
Evolution and Application of Deepfakes
00:03:34
Speaker
First things first, I've got to talk about this hat, because I am now the proud recipient of a Bare Knuckles and Brass Tacks hat. But I want to first make sure the audience knows the proper positioning of the hat, and I'm forgetting my training. Is it 15 percent side-cock right with a slight upward lip on the right side? Correct, yes, that is the right protocol. I think I'm pulling it off, though not quite as well. As you mentioned, Irish beard.
00:04:04
Speaker
The classic Aaron Pritz butt chin is shining through a little bit. So what we'd have to do to improve this would be either using a Snap Camera filter, or, if I really wanted to lean into it, I could paint some beard-looking material on my face with some Halloween equipment. Actually, as our team has talked to some real threat actors in online repositories, we've learned that most of them are using wigs and facial attachments
00:04:34
Speaker
to make their physical appearance look more like the deepfake that they're trying to replicate. So do you think there's the use of prosthetics as well? I mean, are they going full makeup-artist mode? It doesn't take much; you just need low-budget additives. So for example, I didn't want to fully commit to this because it would screw up the rest of my day, but I could probably marker on a beard and the model would take it over the line. But I'm not willing to do that because I've got some important meetings later, and while this is a glass board marker, it might not work out well for me for the rest of the day.
Deepfake Training for Security Awareness
00:05:12
Speaker
That's right. But just for the sake of our audience, who maybe hears about deepfakes in the media, and for some of our practitioners, I want to highlight
00:05:24
Speaker
the scale with which this technology came down, right? In 2019 there was the first deepfake voice call that resulted in a financial loss, for, I believe, the British subsidiary of a German energy company, with the funds wired to Hungary. It was about $243,000, and Trend Micro wrote that up as a novel attack. And then not this past DEF CON, but the one before that, so DEF CON 31 in 2023.
00:05:50
Speaker
They showed the first attempt at real-time video deepfakes, which is what we're using right now. And Aaron, correct me if I'm wrong, they had to train that model for something like three months on their CEO. Is that right? It was a three-month project. The open source software is not new; I think the processing speed is what's mostly changed to make it more accessible.
00:06:13
Speaker
Right now we're running on a nice $5,000 Razer gaming laptop. We've got some better equipment for when we do it for real. NVIDIA is making a lot of money from us, but there are other brands of high-performing GPUs that can pull it off as well. Yeah. And so that's to say that this is technology that was quote-unquote sophisticated. I didn't know we were taking our glasses off, so I'm going to follow your lead. Oh, yeah, that's right. I got really hot; they got steamy. So in five years, the tech has gone from quote-unquote sophisticated to essentially a gaming rig, whether it's a $10,000 desktop or a $5,000 laptop. I mean, that is just really extraordinary. And so we could spend a lot of time marveling at that. But
00:07:00
Speaker
As you know, it is quote-unquote Cybersecurity Awareness Month. It is a month that has a lot of mixed feelings for me, but I know that you guys at Reveal Risk are using this in security awareness training and you've got a lot of buzz around it. So just walk me through the why, why this has people's interest, and how you're using it.
00:07:22
Speaker
Yeah, so it started when I was actually in the audience. It was one of the few talks that I got into through LineCon at DEF CON in 2023. We were in the front of the line, we got in, we sat in the front, we got to see it, we got to hear what was happening on stage and what was happening on the screens. And I mean, I've been playing with deepfakes just from a professional curiosity and cyber standpoint for a couple of years. Smartphone apps, you see them on TikTok being used.
00:07:50
Speaker
That was interesting and cool for a previously recorded video. I think what has changed, and unfortunately become more mainstream, is the live capability. If I could record a video of myself, that's only so useful to play to a target, because you can't adjust. Like, remember the movie Home Alone 2, when Kevin was in New York and he had the little toy recorder, and he took time saying something and then slowing it down to make it sound like his dad?
00:08:20
Speaker
That all takes time. The more you can do it in real time, the more it allows you to interact with somebody. And we'll show a little bit later your deepfake voice hack as well. Quick pause. I see George, the other George, flatlining on audio. George, can you still hear us, and can we still hear you? All right, cool. I had muted myself to let you talk, rather. Love it. All right, cool.
Customizing Cybersecurity Training
00:08:43
Speaker
I just wanted to check.
00:08:44
Speaker
Yeah, so good. That's a good bring-us-up-to-speed. So how are you all using this in awareness training programs? Or, I guess, also if you could speak to the interest of your clients. Yeah. So first of all, I think we vibe on some disdain for awareness programs as a whole.
00:09:07
Speaker
I feel that the cyber market has somewhat commoditized human risk management into buying a tool, phishing employees, and pushing out commoditized training, even though some of it's pretty fun and creative. But it really has nothing to do with what's going on at the company: the risks you're dealing with, how to report concerns, information classification, how to deal with the topics that the CISO is pushing for. So we look for anything and everything that layers on top of the basics to really reach employees where they are, get them engaged, and really transform their behavior and engagement in and around cyber. Because everyone has to own a piece of it and not expect that CISOs like George are doing everything for them and that they can just behave
00:09:56
Speaker
in any regard, however they want. Obviously the technologists and all of us want to make it as easy as possible, but thus far, and deepfakes are a good example of this, it's impossible to eliminate the human from playing a role, thinking and acting in a way that can help you and the company. So bottom line, to get to your question,
00:10:14
Speaker
What are we doing with this? So we started, again, after I got the vision a year ago seeing this, and it took a bit to dial it in, but we're doing a three-part thing where we have picked an executive to deepfake, similar to what we're doing here but much more sophisticated. We're going to use that executive likeness to attack the CISO, who is playing along with the role.
00:10:39
Speaker
They are going to become the hero by stopping it, and in the meantime educating the employees on what they saw, how they stopped it, how they asked questions, just like the Ferrari CEO deepfake story, where one of the other executives who was the recipient of it asked some personal questions that the deepfake con artist was unable to answer. So bottom line, that's step one: a really, really tailored experiential video using real executives and the CISO to show how all this happens. And then secondly, the part that has made my October kind of crazy, we're doing a keynote where we deepfake and show the executive live on stage, just like what happened at DEF CON, both video and audio in real time, providing a keynote to educate folks with it, and then having a demonstration booth that allows employees to come up and become that executive.
00:11:34
Speaker
And I think once they've seen it and understand the reality, and can feel, touch, and interact with it, the goal isn't to train them to go do this. In fact, we're going to make sure that the policy states that they shouldn't.
00:11:47
Speaker
It's really to help them understand, when it happens to them or if it happens to them, how they can help be the stop. Because right now, there's nothing in Zoom or Teams that is automatically detecting a deepfake attack or what's going on. That'll come. I'm sure Microsoft and Zoom are all over it, and if they're not, tip: huge opportunity there to differentiate. But yeah, let me pause there and we'll iterate on the next question.
00:12:14
Speaker
So I want to ask you this. Realistically speaking, we are talking about a threat that could be from anywhere, could be from a nation-state actor. But I want to know, based on your experience actually running these kinds of, we'll say, red-team-type operations to fake out organizations:
Risks and Accessibility of Deepfake Technology
00:12:33
Speaker
How much does it roughly cost, in your opinion, to conduct a malicious deepfake campaign? I'm not saying you need to give me an SBOM; I don't need a bill of materials. Just let me know, generally speaking: I'm a bad dude, I want to do bad dude things. How much is it going to cost me to set up a deepfake operation? Do you already have a laptop? Yes. How good is the laptop?
00:12:58
Speaker
Let's say it's a Mac model from last year. Zero dollars and just your time. It's all open source and available on GitHub, or it was until GitHub pulled it and it got redirected somewhere else. It's literally that cheap. You're saying it's literally that... It is free and widely accessible, and you can interact with... actually, you can get consulting from the originator of one of the software pieces, who only accepts payment in Bitcoin, and there's a whole discussion thread. Unfortunately, as we were diving into it, most of the discussion thread is deep, deep into porn,
00:13:35
Speaker
because unfortunately that's what people are using this for; think about the Taylor Swift situation. Yes, people are using this technology for bad. There are very few use cases where deepfake technology is used for good. One is education, which is what we're doing. Two, I believe, if you think about Star Wars and how they made Mark Hamill come back as young Luke, that was probably done with some really sophisticated CGI, but now I could just run a deepfake and clone my face. Now, I'm going to be a horrible Luke Skywalker because I'm freaking six-five and Mark Hamill's maybe five-four with stilts on, so there's that. At the end of the day, what I'm trying to get to for the sake of the audience is, how easy is it for them to get attacked this way? How susceptible are they? Yeah. So we talked about the cost for the threat actors; I think you're asking about
Company Culture and Cybersecurity Practices
00:14:31
Speaker
how susceptible companies would be. And I would say, as we're getting into this with several large companies that we're very excited to help, helping their employees learn and kind of doing this for good, what we're finding is that there is almost zero awareness that this is even possible. If you're on TikTok, like I mentioned before, you can see, hey, the technology exists with static videos, but most people, even in scoping some engagements, are like, wait, what do you mean you can do this live? I didn't even know this was possible. Yes, that's a good point, and the good point is that I think most people think it's like the Obama deepfake, right? It's pre-scripted, pre-filmed, and an attempt to send you that, versus, no, I could get you on the phone or set up a fake one-to-one meeting and try to infiltrate Zoom and do it in real time.
00:15:23
Speaker
Good point. Great question. I guess we got a little heavy into the tech there. I was saying off recording to George that there are also questions of culture inside a company, right? I think we've seen a lot of attacks recently where people go after contractors or the IT help desk, basically the bottom of the totem pole, where there's a lot of pressure to do something, right? And they create a lot of false urgency: pay this invoice, clear this ticket, whatever.
00:15:49
Speaker
Accounts payable, legal, finance, any general G&A area where money is transferring, that's going to be your target area. So what role does that take in terms of how you're talking to clients about their internal cultures? Because if you're in a toxic culture where it's just clear the tickets, I mean, that's how threat actors got through to MGM: they basically targeted a help desk that was just under a lot of pressure. So how do you get over those? How do you tell people to take a pause when they live and work in an atmosphere that is very high-pressure, rapid-response?
00:16:28
Speaker
Yeah, I think the pause is the important thing. In any urgent request, although you might get one from your legit boss, I think you need to, regardless, before you take any action, take five minutes on any big decision to ask yourself some important questions: Should this be happening? Would this executive normally call me?
00:16:49
Speaker
Did anything seem off with it? Did I ask any questions for clarification? Did I call them back at a number that's listed in the company directory, or Teams, or Google, whatever chat you have?
00:17:04
Speaker
It's worth verifying. Obviously, you don't want to slow anything up, but think about the CEO gift card
Strategies for Identifying Deepfakes
00:17:11
Speaker
scam, right? You should never think that your CEO is going to call for gift cards. But interestingly enough, when I was testing this actual software
00:17:20
Speaker
for the deepfake, I was also on a nonprofit board, and I had asked my EA to order Amazon gift cards for two board members who had completed their term. And literally, our intern who was helping me set this up did a test call to the EA, not knowing that I had done that,
00:17:40
Speaker
asking her for gift cards. And she was like, how real is that? You've never asked me for gift cards before, you legit did in person, and then in our own test we called back asking for gift cards. That's a hard predicament, right?
00:17:53
Speaker
Yeah. But since you're also working at the executive level, I guess what I'm trying to get to is, there are lessons for the people who are going to be victimized, but there have to be lessons also, from a cultural standpoint, for the leaders. Like, look, if you're going to ask people to pause, you can't be an asshole and ask them to hurry all the time. Yes.
00:18:15
Speaker
Yeah, there's no good answer. I think we're going to have to deal with some churn as new techniques and tactics from the criminal side become real. There's going to have to be some learning curve and some, I guess, forced two-way learning to let the controls or the learning soak in. When I was on the corporate side for 17 years, rolling out an early awareness program and phishing people, I had people replying to the company cybersecurity account and saying things like, well, I'm never going to answer any questions from my boss again. Kind of tongue in cheek, but some of them were like, hey, if it's this easy for the threat actors to do it, I just won't respond to email.
00:18:58
Speaker
And it's like, well, that's not the right solution. That's an overcorrection.
Adapting Cybersecurity to Evolving Threats
00:19:02
Speaker
But some people in your organization might behave like that, and you've got to get the balance right so you don't create paranoia soup. Yeah. It's tough, though, because I think the problem is we have to change the way that we train and deploy our operators, essentially. The methodology has to change across the board. We can't be so rigid; we have to be a lot more flexible in terms of how we think about things. And I think that's difficult because when you're in the security industry, everything comes down to frameworks, to what requirements or compliance points can I evaluate this against, what can I assess this risk with. It's hard to assess risk now because I think
00:19:45
Speaker
there are so many variables and the technology is developing so rapidly that we can't actually quantify it into a realistic risk score. I run ISO audits all the time, I do internal audits, and it's like, yeah, you're assigning a risk score because you've got to put a number on it, because that's the exercise, but really, is it that much? So how do we then take this information and accurately conduct risk management? Because security at the board level, at the funding level, speaks in the language of risk, and that's valid. But the language of risk, as it's always been, no longer applies to the nature of the evolving threat. So how do we evolve the language of risk accordingly, in your opinion?
00:20:27
Speaker
Yeah, it's a great point. And speaking of risk, let's talk about the reality of deepfake attacks. Are they being used every day for every attack? Probably way less than 100 percent. I still think it's a very targeted form. You go after Ferrari; you go after the $26 million heist in Hong Kong with the, I think, British-owned firm that got hit in February or March.
00:20:54
Speaker
Those were probably well-planned, targeted attacks. On the flip side, because deepfakes got their start in pig butchering, the elder dating and romance type scams for financial gain, deepfakes were being used there far before we heard the first cases in corporate.
The Threat of Audio Deepfakes
00:21:14
Speaker
This is all about going after the money. Now, could you use it for intellectual property theft? Yeah, if you're motivated and you want to plan an attack and use the tools. But let's even step back from video and talk about audio a little bit, because we've not yet done that demonstration. And George, I sent you two audio clips that we can probably play for the listeners here. One was from the colleague who was helping set up for this; one was from me,
00:21:41
Speaker
and we were talking as you. And as you mentioned, pretty close; we articulated words a little bit differently. Brandon and I, who helped set this up, spent three minutes listening to this podcast and learning your talk track. I was joking that I need to say "rad" more often, and I need to show a level of hype that I might not; in my own way, I'll show a different hype. But I had to emulate your pace, I had to slow down a little bit, I needed to be a little bit more articulate,
00:22:10
Speaker
and I had to get my tone down so the tech could do its business. So bottom line, why am I talking about audio? Audio is way less complex, and it's way easier to get pretty precise. So, like in our pen testing,
00:22:28
Speaker
we're doing social engineering with video deepfakes and audio deepfakes as part of a test, which I think companies will need to start doing, because it's a real avenue of threat. So do we think audio is more likely, because you could call somebody up and they'd be less suspicious? Or do you think they're going to go after video? I'd say audio will come in waves before video, and I think you even mentioned in the case study that an audio fake was probably the first one that got talked about. My question would be more around: okay, let's say you want to run an audio fake on someone, on a target. You then have to take, let's say, X amount of days or hours and train a model or a program on their voice. So let's say you capture a recording of them talking on stage, talking at a company town hall, whatever it is. Because you're joking around about it, but George does have mannerisms when he speaks, I have mannerisms when I speak, there are things that we say colloquially.
00:23:24
Speaker
If we train the model enough to actually capture those, could we then even remove the need to have a human voice actor behind it, and actually prompt-based automate an interaction that comes out sounding exactly like a target mark? Is that possible?
00:23:45
Speaker
Yeah, that's a little bit different technology. I'd say that's less deepfake and more an AI-enabled trained agent. I work with a company that has a gentleman who's lucky enough to spend time in his job studying it, and he's created a full digital replica of himself that you can interact with, and it runs autonomously from the AI model. So that's a little bit different use case,
00:24:08
Speaker
probably too complex for a threat actor to go after. But I mentioned it might take a couple of weeks to train a really good video deepfake. Guess how much time it takes to train a really good audio deepfake? I need roughly four minutes of audio and a couple of hours, and I can do that with open source technology.
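To make that barrier to entry concrete, here is a minimal sketch of the kind of script an awareness or red team might use for an authorized demo with a consenting speaker. It assumes the open source Coqui TTS library and its XTTS v2 voice-cloning model, not necessarily the tooling Aaron's team uses; the file names and the spoken line are hypothetical.

```python
# Minimal voice-cloning sketch for an authorized awareness demo.
# Assumes the open source Coqui TTS package (pip install TTS) and its
# XTTS v2 multilingual voice-cloning model; not the exact tooling
# described in the episode. Use only with the speaker's consent.
from TTS.api import TTS

# Load the pretrained XTTS v2 model (downloads weights on first run).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A few minutes of clean reference audio from the consenting speaker.
reference_clip = "consenting_exec_sample.wav"  # hypothetical file

# Generate a short line in the cloned voice for the training demo.
tts.tts_to_file(
    text="Hey, it's me. Quick favor before my next meeting.",
    speaker_wav=reference_clip,
    language="en",
    file_path="awareness_demo_clip.wav",
)
```

The point of a demo like this, as in the episode, is not the tooling itself but the takeaway: a few minutes of public audio is enough, so out-of-band verification has to be the control.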
00:24:29
Speaker
Yeah. So the cost of entry for both is pretty low. The time it takes to learn on the video side is higher, and getting it right is a little bit more complex. That's why I think audio will come first, but if a threat actor is really motivated, why not go video and audio? And to be fair, before we cut to the brass tacks portion, we have seen audio attacks on civilians. We've seen organized criminal groups just rip audio from some kid on Snapchat and use OSINT to go after their parents. That's a parent's worst nightmare, right? You get a call from your kid saying they're in trouble and you've got to pay somebody, or there's a ransom. That's terrifying. And talk about
00:25:20
Speaker
urgency. And it's fairly easy to do. But okay, let's take a break here, and then we will be back for brass tacks.
00:25:41
Speaker
Hey listeners, if you can believe it, we are fast approaching episode 100 of Bare Knuckles and Brass Tacks. And what that means is an Ask Me Anything episode, with us, George K and George A, in the hot seat
00:25:56
Speaker
with a guest host. And you have to believe it's going to be as awesome as you think it's going to be. We have survived 100 episodes together. We've had the good times, we've had the bad times, we've had the confusing times, but this show just keeps on rolling. And we are looking to you, the audience, to give us the questions for this AMA.
00:26:18
Speaker
That's right. So send your questions in to bareknucklespod@gmail.com with AMA in the subject line; instructions are also in the show notes. All topics are on the table. You want to know about us, you want to know about the future of cyber, you want to know what trends we're seeing, you just want to know what my favorite color is or what my favorite food is, whatever the fuck you want to ask, we're here to answer it.
00:26:45
Speaker
And we're back for brass
Building Effective Security Awareness Programs
00:26:50
Speaker
tacks. Aaron, we share a similar level of eye-roll for the October-heavy portion of Cybersecurity Awareness Month. What advice would you give to practitioners, your clients, when it comes to building out, teaching, and evangelizing security awareness programs in a way that isn't kind of hunt-and-peck, check-the-box, mass-blast, commoditized, whatever? Let's step away from October. What's the advice for, like, April or whenever? Yeah. I think the first advice I have is that it's okay to have October as an emphasis, similar to Movember raising awareness for men's health and growing the mustache, or pick your event. But I think what I've heard at a lot of companies is,
00:27:42
Speaker
no, we've got to wait for October for that, or, we've only got enough budget for October. And I would rather say, don't do 100 things in October. One, what can you spread out, and can that be helpful from a budget standpoint, so it's not a huge spike?
00:27:57
Speaker
And then secondly, I think you can way overdo the commodity October stuff, where everyone's looking for stuff from vendors. I mean, there's good and not-so-good stuff that you might see from vendors. I saw your post on that the other day, George, challenging the vendor space to do more good with their time in October. But I really think, like I was saying earlier, we've got to un-commoditize human risk management from being a SaaS tool rollout, phishing campaigns, and an online training rollout. We need to think, as CISOs and as cyber program leaders, what are the things your program needs most from the workforce?
00:28:39
Speaker
And what we typically do as we're helping companies think through that is look at the entire CISO roadmap, or the cyber team roadmap, and say we're going to check one of two boxes on every single initiative: is this behind the scenes, or is this employee-impacting? So thinking about that, and George, I'm curious on your thoughts, maybe specifically which box you might check, the things that come up are things like
00:29:05
Speaker
an MFA rollout; there's some technical stuff there, but it's all employee- and workforce-facing. You might be rolling out information classification or data classification to ready your DLP program or take it to the next level. You might be rolling out a champions program where you're trying to embed grassroots energy into each area of the business, so it's not just the cyber team shouting from the mountaintop. There's a list of 10 or 12 things that we typically see that would impact the workforce, and then you need to build campaigns and initiatives around that. And I like to say, George, we've talked, I've got a little marketer in me. I'm a 25-year tech guy, but as I've gotten more entrepreneurial, I've realized that, man, I like the creative side. I've always been an artist; I played drums. We both played drums in the past. But
00:29:58
Speaker
there was something in my technical roles that I wasn't able to demonstrate, and I found different ways in my life to get that creative-side fulfillment. So bottom line, find the people in your company who can add that if you don't have it. It could be in marketing, it could be an intern, it could be somebody you recruited from marketing who might be able to help you. And then think through how to make your cyber program and all these workforce-facing initiatives a campaign, or a series of campaigns, or something that's additive and building. Nice. Yeah. So to answer your question, and I fully agree, you pick your flavor for what you can fund that month. So for me, I'm looking at things like MDM, because we're a BYOD shop, so we're going to do a little bit more enhancement there. And with that, there's got to be an education campaign, because
00:30:52
Speaker
since we're doing some things to improve our visibility into and monitoring of devices and applications on people's phones, we have to assure people, hey, we've got it configured so we're not monitoring what you're doing on your own shit. It's just that everything you do that touches corporate access or corporate data, we have to monitor and we have to restrict access to. So for the last two months,
00:31:15
Speaker
with my team, I've been working a little bit more on technical planning and architecture to understand how we give people flexibility while still protecting our environment and still respecting their privacy.
Balancing Monitoring and Privacy
00:31:29
Speaker
We have to come up with a whole series of communications around that, because as soon as you tell people we're going to put some kind of monitoring on your personal device, people freak out.
00:31:39
Speaker
They do, yeah. Especially if they're in Europe. Oh, yeah, that's a whole other thing. But that's kind of the challenge with it. You can either treat Awareness Month like, hey, I'm going to give you a bunch of campaigns and a bunch of things, and there are some fun things, maybe you'll get some swag, maybe we'll have a pizza party, who knows?
00:31:56
Speaker
But we should really look at how we're actually going to do something that's going to help people, and then how we're going to educate people around that thing. And I think if you're not educating people on things that are relevant to them directly, then maybe they don't learn or absorb it as well as they should. And that's kind of what I wanted to ask you about: real-world advice. What would you recommend to me, as an active CISO, to try to prevent being compromised by a deepfake? Just high-level points, what would be your takeaways?
00:32:34
Speaker
Yeah. So I'm a show-and-tell type of guy. Obviously I've personally leaned into this project; it's been one of the more fun summers, getting deep into this topic. And I'm not doing this by myself; I have a great team that has been able to make this a reality. But bottom line, I don't believe people in any walk of life really believe something without seeing it. So whether it's some sort of interactive experience or video, will you get it out of an article that gets published in a monthly newsletter? I don't think so. And I say that after having negotiated four or five of these projects
00:33:19
Speaker
with cyber teams where it was so new to them that they weren't fully understanding, one, what they were getting, and two, how it would work. And then imagine the optics of working with PR and comms teams internally: you want to do what with our executives? As fun as this project was, navigating that for companies and helping the cyber team get to the win of even having permission to do this,
00:33:46
Speaker
that is the biggest challenge to overcome. But bottom line, George, to your question: I think it's creating the experience, having clear guidance, and knowing that it's imperfect. Hey, this is what is happening; it's not likely to happen every day, so let's not go full paranoia. Here are some tips to look for.
Detecting Deepfakes and Final Reflections
00:34:07
Speaker
And actually, George, maybe, and we can edit this out, maybe if we get the beard fixed we can do a quick touch-up demo. What I didn't do earlier is show looking to the side. Pretty much with the tech now, I can look 45 degrees and the model's intact. If I look 90 degrees, you'll start to see the face come apart.
00:34:28
Speaker
The other thing: as I was sipping my coffee earlier, I noticed that as I went to sip it, you saw the coffee cup show through and it glitched out the model. So there are things that you can do. If you suspected something, you could do the Ferrari thing and say, hey, tell me about where we were last week, or, you mentioned something about your mom when we were chatting last week, how's that going? See if you can test through conversation. There's also texting them, an instant message: hey, I'm on a strange call, are you on it? Or maybe it's something else. But I think there have to be actionable steps that people can take to verify, if they feel something is off or wrong.
00:35:15
Speaker
After the Ferrari case, my EA and I set up a code word. Like, hey, if either of us feels like something's off when we're talking to each other, we now have a code word, a prompt and a response. That's like classic Cold War spycraft.
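One way a team could harden that code-word idea, so a static phrase can't be overheard and reused, is a rotating one-time code shared between the two colleagues. This is a minimal sketch assuming the open source pyotp library; the setup flow and messages are hypothetical and go beyond what is discussed in the episode.

```python
# Sketch: replacing a static verbal code word with a rotating one-time code
# shared between two colleagues (e.g., an executive and their EA).
# Assumes the open source pyotp library (pip install pyotp); names are hypothetical.
import pyotp

# One-time setup, done in person or over a channel both people already trust:
# generate a shared secret and store it in each person's password manager.
shared_secret = pyotp.random_base32()

totp = pyotp.TOTP(shared_secret, interval=60)  # code rotates every 60 seconds

# Requester side: on an "urgent" call, read out the current code.
spoken_code = totp.now()

# Recipient side: verify the spoken code before acting on the request.
# valid_window=1 tolerates one interval of clock drift between devices.
if totp.verify(spoken_code, valid_window=1):
    print("Code checks out; proceed with the normal approval steps.")
else:
    print("Code fails; hang up and call back on a directory-listed number.")
```

Even with a check like this, the human behaviors from the episode still apply: when in doubt, hang up and call back on a number from the company directory.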
00:35:29
Speaker
Yeah, exactly. And it's unfortunate that any of that is needed, but it's the reality that we're in, and until there's tech that can stop it, detect it, or put some flashing light up in our face, we've unfortunately got to rely on human intuition, awareness, knowledge, and behavior change. Nice.
00:35:49
Speaker
Cool, man. Thank you. Well, Aaron, thank you so much for coming back on the show and replicating my face, albeit in a fairly Irish manner, and for just walking us through this. I know we've been going back and forth on it, but it was really cool to see it in real time. Thank you.
00:36:13
Speaker
If you liked what you heard, be sure to share it with friends and subscribe wherever you get your podcasts for a weekly ballistic payload of snark, insights, and laughs. New episodes of Bare Knuckles and Brass Tacks drop every Monday. If you're already subscribed, thank you for your support and your swagger. We'll catch you next week, but until then, stay real.
00:36:36
Speaker
But we have regular Aaron rocking the BKBT snapback hat. I think I might have taken a step back; I kind of liked the George look that I was pulling off there. I did send that to my wife, and she said it looked like a serial killer, because you never wear a collared shirt casually. So.