
Distinguishing between movement and progress, in AI, security, and more

S4 E38 · Bare Knuckles and Brass Tacks
116 plays · 8 days ago

Is the tech industry selling us problems it invented?

Ryan Clark, CSO at Black Rifle Coffee Company, doesn't flinch at the big provocations. When Claude's Mythos model showed up in every LinkedIn feed promising a software apocalypse, Ryan's take was blunt: the basics were broken before Mythos, and they'll still be broken after it. The real question about a powerful AI model isn't what vulnerabilities it can find; it's whether you've built a program capable of doing anything about them when it finds them.

But the conversation doesn't stop at hype-busting. Ryan has quietly done something the industry insists can't be done: built a lean, two-person security operation that ditched the big-ticket SIEM vendors, took control of its own telemetry, and outperformed programs with ten times the headcount and budget. When one of those vendors found out, they sent their "heavy hitter" to prove Ryan wrong; he left agreeing that Ryan didn't need them.

What emerges is a portrait of a practitioner who learned to distinguish progress from movement — and who thinks most of the industry is confusing the two. The procurement cycle, the Gartner roadmap, the sequence of investments you're told you must make: Ryan's argument is that inertia dressed up as strategy has left small security teams demoralized and over-leveraged, and that the fix is less about budget and more about the willingness to build your own way out.

And then, at the end of a week of planes and conferences, Ryan says something that reframes all of it. The reason he doesn't chase the car or the watch or the title isn't asceticism — it's that working in security means observing the worst of what people do to each other, and the only way to stay functional is to invest hard in what actually holds. Time. Trust. People who remember how you made them feel.

Transcript

Finding Joy in Humanity's Darkness

00:00:00
Speaker
because of the jobs I've had. And even now, we are literally observing the worst parts of humanity, right? The worst shit there is. Stuff that just burns you out internally, burns out my soul, man, seeing it.
00:00:16
Speaker
But it also gives me the perspective of, like, you know, what actually matters. I need to feel joy all day. I can't be, you know, part of the organization or the enterprise that stops this bad shit, and then let that ruin me. I cannot let that happen. And I can't let my kids see that, like, oh, daddy fights bad guys on the internet, and he comes home and he's just, you know, he's terrible. Or I have to compensate for that by, like, you know, let me get this expensive watch. Which is totally fine. I mean, people should treat themselves. If you got it, go for it. You can't take it with you. But like,
00:00:51
Speaker
None of that matters, man. None of that matters. People matter. People matter so much. And, you know, Maya Angelou: people remember how you made them feel, right? So I love investing my time in people.

Introduction to 'Bare Knuckles and Brass Tacks'

00:01:12
Speaker
Yo, it's Bare Knuckles and Brass Tacks. This is the tech podcast about humans. I'm George K. And I'm George K. And today our guest is Ryan Clark, who is the chief security officer at Black Rifle Coffee Company. He joined us right after getting off the plane, and we want to tackle some topical issues and get into, I guess, more philosophical issues. So I guess the most obvious thing is we talk a little bit about Claude Mythos and the supposed impact it has on cybersecurity.
00:01:46
Speaker
But then we go way left field and get into some more stuff on the innovation side that Ryan is a part of, and also the culture that he builds on his teams.
00:01:58
Speaker
Yeah, I really appreciate it. Like, Ryan and I, you know, go back a few years, and it's actually an interesting friendship. I met him at a Canada-US CISO exchange in Ireland, of all things.
00:02:12
Speaker
And, you know, he's always been the kind of guy that stood out to me, because he's a brilliantly smart guy who comes from a very blue-collar background. I think we come from very similar backgrounds, and it's kind of nice in the industry when you meet your spirit animals, and he's definitely one of them. And, you know, he lives it.
00:02:30
Speaker
And I think if you ever have the pleasure of knowing him or working with him in person, you'll see it. Hopefully this episode primes people up to see that there are awesome people still in this industry doing great things. And if you get a chance to work with Ryan or get involved in any of his projects, you really should go for it.
00:02:51
Speaker
Ryan Clark, welcome to the show. Yeah, man. Hey, thanks for inviting me. It's great to be here. I'm super tired from all this travel. But yeah, man, thanks for making it work.
00:03:03
Speaker
Yeah, absolutely. I am also exhausted and recording from a hotel room, so the travel fatigue is real. And then, you know, George doesn't sleep, so we got that too. That's true.

Cybersecurity Myths and Realities

00:03:14
Speaker
um All right, Ryan. Well, you are head of security, CISO, all this at Black Rifle Coffee Company, which, you know, more power to you, death before decaf.
00:03:24
Speaker
um But we have you on to dispel some myths. So let's set the table. Some of our audience is in cyber. Some is not.
00:03:36
Speaker
You know, there's this big hullabaloo around Claude's mythos model, which is apparently, quote unquote, unsafe to release to the public because it is so good at finding and chaining together vulnerabilities. It threatens what some might say is a software apocalypse.
00:03:56
Speaker
Others have said it's a lot of hot air. So why don't we start there? Let's get your thoughts on it. What are you thinking? Oh, man. You know, it's funny, because the idea of Mythos, or whatever it is, that, hey, there is going to be some apocalyptic capability that comes out, it's going to see everything and daisy-chain together all these lows and mediums and informationals into criticals. That's been...
00:04:22
Speaker
Something we've been talking about for a while, right? What was it going to be called? Is it even happening now? I don't know. Is it happening in six months? But I think it's funny, because, yeah, you used some words just now, like the apocalypse, right? It's like, man, I see a whole lot of really strong language being used.
00:04:41
Speaker
That's like, yeah, there's definitely something to consider, but all of it is really the same problems we've had, right? It's the same basics in the end. Will there be some big changes in how I think all software is compiled, if these capabilities are true? Yes.
00:04:58
Speaker
Could things suck for a while for us as this kind of gets out there? Maybe, you know. However, things have sucked for all of us before as practitioners, right? I think either way, though, whether it's true or not that this Mythos ability is going to find the smallest or the hardest-to-find vulnerabilities and then take advantage of them: if it's true, just plan for it now. And if it's not true, plan for it to be true a year from now.
00:05:30
Speaker
Yeah, but like, bro, look, I mean, like you, you know, we spun up this conversation tonight because, as George was saying, it was one of my sleepless nights working on side projects, because clearly that's what I do and I hate myself. I was literally just taking a bio break, if you will.
00:05:52
Speaker
I was just scrolling through, and I just got so pissed off at the grift. Because, like, okay, cool, whatever. If we get trouble, so be it. But every loser that I see in tech and in cyber who just looks for something to talk about, because they have nothing relevant in their actual career, so they're sitting there all day posting about dumb things. They started talking about this thing like it is the doomsday, right? And you're like, well, guess what?
00:06:18
Speaker
AI was supposed to be the doomsday. And then, you know, what was it? All these zero-days were supposed to be the doomsday. There's always something new. And at day's end, I'm not saying that, you know, someone capable couldn't take something like that and weaponize it. And, you know, you have the case of the Petyas, the NotPetyas, which turned into global spread, right? Which, at the end of the day, all relied on an inability of enterprises, public, private, commercial, academic, whatever,
00:06:49
Speaker
to do basic fundamental security. Which is patch management, which is vulnerability management, which is basic log monitoring, which is basic reporting, which is basic visibility and understanding what's in your environment, and components and segmentation and BCDR, and all the things that, you know, you either pay an exorbitant amount of money to maybe learn from a bootcamp and get a cert on, or you just do the job and you figure out, oh, how do I not get pwned, right?
00:07:18
Speaker
And I think we just... And like,

AI Tools: Risks and Rewards

00:07:20
Speaker
look, even OpenAI. It's like, look at the OpenAI response. And George, I'd love to hear you talk about this. Because, like, Sam Altman, you know, blessed Sam Altman, the hero of...
00:07:30
Speaker
[inaudible]. So he comes out like a week later or whatever. He's like, oh, we have our model, that's super powerful. I'm like, bro, nerd rage elsewhere. No one cares.
00:07:41
Speaker
Like, seriously. Yeah. So I think you are correct: a healthy dose of skepticism, right? So I want to point out, I worked for a startup that was trying to train social engineering detections,
00:07:56
Speaker
early days, using GPT-2. And if everyone recalls, OpenAI also tried to say that GPT-2 was too dangerous to release. Yes. Obviously, that was not true.
00:08:10
Speaker
Or they didn't heed their own warnings. um I just think there are a lot of things happening at the same time. One is the fox watching the hen house.
00:08:21
Speaker
Who does it behoove to say that you have developed things that are so powerful that, you know, watch out? It would be the people trying to invent the everything machine. And I say that as somebody who likes and uses Claude, but that's some strong marketing from Anthropic.
00:08:39
Speaker
And then the other thing is, as Cal Newport recently pointed out, when they pointed some open-source, open-weight models at these same projects, like OpenBSD, which is an open-source
00:08:58
Speaker
operating system used in firewalls, for our audience. Those models also found the same vulnerabilities. So is it the uniqueness of Mythos, or is it literally the ethos and mythos being created around Mythos?
00:09:11
Speaker
I think maybe, Ryan, it's more like the four-minute mile, as a close friend, Connor Sherman, just put it. He was like, it was bound to happen. The watershed is the timing, maybe.
00:09:26
Speaker
You know, LLMs searching for bugs in code is not a new thing. I just think of it this way. Like, George, what you said earlier about this being all the basics of what you must do and all that.
00:09:39
Speaker
I think people are also forgetting to talk about the, I don't know, alleged positives of this. Saying, hey, when this gets released and it is doomsday, at the same time with it will come a capability that also helps prevent that.
00:09:54
Speaker
Helps stop that. Maybe. I don't know. Maybe I'm being a little too idealistic. No, no. Say more. I mean, I think Marcus Hutchins and Casey Ellis, former guests on the show, have said the same thing: we seem to overweight the adversary thinking and not, like, hey, what if defenders also have this?
00:10:11
Speaker
Yeah, it can help. But I think what's funny, though, is it's not going to solve the problem that gets in our way right now. The problem isn't, can I see what's wrong? The whole industry in cyber is happy to sell us things to help us observe what's happening.
00:10:24
Speaker
Not all of it. Well, not none of it. A lot of it does not orient to: what does it mean to me? And give me enough information so we can go get the decision made, whether that be with me, if my team has agency to go take care of it, or with another stakeholder, in a way that they can say, yes, go do this. And then we act, because we know what we're looking for.
00:10:45
Speaker
We know we see it. We know a little bit of what it means to a business or an organization when we have to go and patch this thing or take that down. We don't always know. And we don't always know what we need to do to get after it: we must go patch this thing, we must get a compensating control for that. So I don't think that problem is going to be solved still, though.
00:11:02
Speaker
The decision point: that you have enough information, oriented to your organization, so that you can go and do this faster. I was asking somebody this week, like, what do you think this is going to look like in your daily life, if all this is true? If it's all true, okay?
00:11:18
Speaker
You think you're going on, like, emergency patch calls all day? Is that what it is? And you think you're going to be saying things that already sound hyperbolic, right, when we show up with numbers and all the crap we deal with? Are you going to be like, we have to patch it right now?
00:11:34
Speaker
Well, why not tonight? Because we have minutes. Like, I don't know, man. It's just this doomsday view I'm seeing out of people. And I'm like, dude, you know, I don't know that that's going to be what the reality is. And I'm more confident in our,
00:11:50
Speaker
our skill set to innovate. Like, we've come up with some pretty good technology to solve a lot of these problems. And I think you're going to see the scrappy innovators come up with some stuff to lessen the blow, if it is a blow,

Innovating in Cyber Defense

00:12:01
Speaker
right? Well, the other thing too, man, I think, you know, it's kind of like the whole sex sells, bad news sells thing in media, right? So I think everyone gets hyped up about what the adversary is going to do. And I'm not saying that there aren't capable adversaries, or that nation-state actors aren't going to try to weaponize it, right? But at day's end, like,
00:12:22
Speaker
I think there is a lot more positive progress that is really going to be made out of this um as a capability set once it gets trickled down into solutions that the rest of the mainstream market can use outside of Project Glasswing or whatever.
00:12:39
Speaker
And, you know, I just think it's a case where whenever those global compromises happen, it's almost dumb luck.
00:12:50
Speaker
It's not necessarily intentional. Like, nation-state actors and major cybercrime groups try to pull off global scams every day. Every day, right. And the ones that succeed, the ones that really, really hit, it's like once every five to 10 years.
00:13:08
Speaker
Right. And to your point, Ryan, also, sometimes we get it right. Like, you know, Log4j threatened to do all the bad shit on Christmas.
00:13:21
Speaker
And there was a massive sort of community swell in that direction. And, you know, fortunately, no huge things went sideways. George, remember when you and I were at DEF CON and we met that dude, the original Log4j guy, that actually found it?
00:13:39
Speaker
Yeah. And we asked him, and he was like, oh, yeah, it was by accident. So even discovering the remediation was by accident. Yes. Well, I think about this too. To your point, George, about after Log4j and all the other ones that boiled the planet, right?
00:13:56
Speaker
We did respond, but later. Yeah. And it was an after-action where we were like, oh shit, we need something for this. And then even everything I saw, I won't call it vaporware, but I will certainly call it the marketing rush: the companies I saw at RSA with AI observability enforcement tools.
00:14:13
Speaker
Those are a response. And remember when, you know, generative AI was first coming out, and we were talking about it, right? I mean, I don't know, George, how many times we'd say in a room: we would like to stop talking about this, guys, please stop.
00:14:27
Speaker
But then the market responded with some technology, right? Is this, I don't know, I'm not saying tinfoil-hat conspiracy guy here, because that's totally not what I'm saying, but this is a podcast, so hell yeah, I'm going to say it.
00:14:39
Speaker
Is this all just saying: listen, this type of thing will come eventually. Maybe it's not happening right now. Maybe it's not. But historically, we build the defenses way too late.
00:14:50
Speaker
Right. Do we need to build them now? Hold on. Is that what this is? I think we have an AI industry with a lot of big players who have sunk a lot of money, a lot of investors' money, and it's created something of a bubble, because there isn't an actual tangible ROI for all the resources poured into the infrastructure
00:15:11
Speaker
and, you know, kind of the initiatives built. So it would make sense to me to create a need for suddenly purchasing a whole bunch of AI-driven solutions, which are modeled off the same companies that built a model like Mythos.
00:15:28
Speaker
Well, yes. And so, one comment and then the question. The one comment that I forgot to include, to your point, George, about people just popping off about whatever: despite being an analytical industry, cyber is still made of humans who fall prey to, or have been warped by, social media habituation.
00:15:55
Speaker
Oh, I need to comment on something. Like, no, no, you don't. I mean, I saw the Glasswing announcement. I saw that and I was like, I have a feeling I do not understand the nuances of this. So I do not need to comment on that.
00:16:09
Speaker
It felt, to me, when I saw it, like, oh, you are assembling the Justice League or the Avengers? Am I to believe you're doing all the right things? Like, I don't know. And those are the same companies that are in Dark Room, as we call the evil empire.
00:16:22
Speaker
So, yes. To that point, and to George's point: I don't know if you guys saw, Raffi Krikorian, the CTO at Mozilla, had an op-ed in the New York Times, which was saying, look, if this thing is
00:16:36
Speaker
going after... it's so powerful, it's going to undermine a lot of the open-source components that underpin everything, right? OpenBSD in firewalls. I think he pointed to another piece of open-source software that is pretty much behind all streaming online. All these open-source projects are maintained by developers, but the people who got invited into Project Glasswing are these multi-billion-dollar companies. So they get first access to, supposedly, the thing that is very
00:17:06
Speaker
good and protective, and not the independent developers upon whom people have built, again, billion-dollar companies. Right. And so I thought his point was: the better rollout would be, we know who the developers are. We know who the people who've poured hours into maintaining these projects are.
00:17:27
Speaker
Why are they not being invited in to help be the protectors, if they're the ones maintaining the software? I don't know. I thought that was a good democratization point. I just don't know about those big companies that are part of it. First off, I don't really know what they're actually doing. But is this: hey, they're just getting together to talk about it?
00:17:45
Speaker
They're getting together to ideate on what we should do next, to answer the question you just kind of posed: hey, how do you push this out in a responsible way to everyone else, so it's helpful and not destructive?
00:17:56
Speaker
I don't know. I hope that's what they're doing. I honestly don't know. And part of me... I mean, hopefully they're there. I think they're supposed to be using the preview of the tool to go hunt and find the bugs first and fix them. I would think, I would hope, it's to help them, as like the key players in the economy,
00:18:17
Speaker
to obviously clean up their own vulnerabilities. But I think there was a backdoor deal for, you know, certain ones of them to create new software offerings and services. Yes, a thousand percent. And I think there's a deal to share revenue with Anthropic, right? And OpenAI, they're going to try doing the same thing.
00:18:40
Speaker
It's really all just about generating net new revenue to justify the investment, the spend it's taken to build the infrastructure, which they have done without actually assuring a customer-based need for it.

Corporate Motivations in AI Security

00:18:56
Speaker
Yeah, I think that's the rub: behind it all, there's still a rush to market that they all have as an agenda. And rightfully so, they're businesses. Don't get me wrong. I'm all about that.
00:19:07
Speaker
But that's, I think, the thing. It's like, you're giving it to who, to do this? The guys who have all the money already? Yes. Okay. Okay. So. The ones who have all the money and have all the vulnerabilities. Are we talking Microsoft and Cisco and Palo? Like, come on.
00:19:27
Speaker
Yeah. Yeah, man. I don't know. It's a lot of hype. Wow, man.
00:19:42
Speaker
Well, okay, so let's get off of Mythos and change tack. You said innovation earlier, and building. Do you want to talk a little bit about, I guess, the approach that you're taking inside your teams to do some innovation around what I think people would take for granted as accepted practice?
00:20:07
Speaker
Oh, well, yeah, I think that's the best way to describe this. So our security program is by intentional design. We started physical and cyber and crisis management, and now safety, slip and fall. So don't fall, please. But the intent was always the physical and cyber combined, because in our experience, I visualize a bunch of circles in a Venn diagram, and each circle is the responsibility area of,
00:20:40
Speaker
I'm the physical security leader. I'm the cyber leader. And there's all these other circles around it that are like, I'm the marketing person. I'm legal. And where are the biggest problems that I've ever seen?
00:20:51
Speaker
The areas where they either overlap or don't overlap, in terms of responsibility, or telling who's taking it. It's almost like a seam in the clothing: hey, that's where the bad guys are going to go and take advantage of stuff. Look at fraud. Fraud happens everywhere security is not going to be.
00:21:08
Speaker
It's so funny, because it's just, why would we be there? We've got other stuff to worry about. So we designed our program to be flexible, so that when we have to make decisions on these things that, you know, are becoming more and more cyber-enabled,
00:21:21
Speaker
hey, we have the technology and information together right away. The decision has already been made. So anyway, that was our starting ethos, and it's worked out. It's been about four years we've been building it out. And then we looked at the,
00:21:39
Speaker
I don't know, man, I guess I'll call it the pain, or the lived experience, of working with SIEMs at a global company where we were before, a really big one, where we had data from 110 countries, so much stuff to try and pull together into any monitoring and detection logic.

Challenges and Solutions in Security Practices

00:21:58
Speaker
And then we have these partners, and I'm going to call them partners with a little p, the SIEM providers, who are kind of like these boat anchors, and it's a disingenuous partnership.
00:22:09
Speaker
It's not that... you're buying something for an outcome, but you're being charged for every step of the sausage-making. And it's crazy. It's just no good. It doesn't help. And they don't innovate that much. I wish they did more. So we've taken a bunch of strategies. We're like, look, let's get complete control of all of our telemetry, internally and as much as we can externally. And we didn't go and build our own SIEM like a bunch of mad scientists, because that's stupid. Yeah.
00:22:38
Speaker
That's not sustainable. But you can get the components of it. And you can do it in a way where you can keep data longer. You can analyze it faster.
00:22:48
Speaker
You can save money. But it's not about saving money because I'm all about being a money saver; it's because I've got other stuff to do, man. We have other stuff to use those resources for. Yeah. Instead of just storage costs.
00:23:02
Speaker
Yeah, storage costs or analytics. So we've done something that we presented on a couple of years ago at the RH-ISAC Summit, where we said, hey, look, this is what we think we're going to do in terms of building it out around data pipelines and storage, like S3, and some basic analytics, in advance.
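The shape Ryan is describing, own the pipeline, land telemetry in cheap object storage, run basic analytics over it, can be sketched in a few lines. This is a hypothetical illustration, not his team's actual stack: the local `telemetry/` directory stands in for an S3 bucket, and the partition layout and event fields are invented for the example.

```python
import gzip
import json
from datetime import datetime, timezone
from pathlib import Path

# Local directory standing in for an S3 bucket; a real pipeline would
# write the same keys to s3://<bucket>/ via boto3 or a delivery stream.
BUCKET = Path("telemetry")

def write_batch(source: str, events: list[dict]) -> Path:
    """Land a batch of events under a Hive-style date-partitioned key,
    so any engine (Athena, DuckDB, Spark) can prune partitions later."""
    now = datetime.now(timezone.utc)
    key = (BUCKET / f"source={source}" / f"dt={now:%Y-%m-%d}"
           / f"{now:%H%M%S%f}.json.gz")
    key.parent.mkdir(parents=True, exist_ok=True)
    with gzip.open(key, "wt") as f:
        for ev in events:
            f.write(json.dumps(ev) + "\n")  # newline-delimited JSON
    return key

def scan(source: str, predicate) -> list[dict]:
    """A 'basic analytics' pass: stream every batch for a source and
    keep the events that match a detection predicate."""
    hits = []
    for part in sorted((BUCKET / f"source={source}").rglob("*.json.gz")):
        with gzip.open(part, "rt") as f:
            for line in f:
                ev = json.loads(line)
                if predicate(ev):
                    hits.append(ev)
    return hits

# Invented sample telemetry: two auth events, one failed login.
write_batch("auth", [
    {"user": "svc-backup", "result": "failure", "ip": "10.0.0.7"},
    {"user": "rclark", "result": "success", "ip": "10.0.0.9"},
])
failed = scan("auth", lambda ev: ev["result"] == "failure")
print(len(failed))
```

The design point is that the storage layout, not a vendor contract, is the interface: because the keys are plain `source=.../dt=...` prefixes over compressed JSON, any query engine can be pointed at the same data, which is what makes running five or six tool bake-offs over your own telemetry cheap.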
00:23:22
Speaker
And we said it was a theory, though. Even at the time, we didn't have any contracts signed with any vendors. In fact, we were still figuring it out. We said, hey, we think this can be done. And so we just did a follow-up this last Tuesday showing what we did.
00:23:34
Speaker
That was at another conference, two years later: hey, we actually built this, and we learned a lot of really cool things. And we were able to be very flexible and innovative, and iterate quickly with other technologies we wouldn't normally have. Like, you know, if you want to run a POC or a bake-off with a company, you may only have cycles to do one or two.
00:23:52
Speaker
What if you could do five or six or seven? Because you have your data control, you can move it around. So it's been fun. We're still building it. And now we're building it all really around this decision management, because we think that's still the barrier going forward. Everything in security is, you know, there are going to continue to be tools that come out that will, again, help us observe what's happening.
00:24:16
Speaker
But we need to get the right information, the right context or language, to deliver it to the people who help us make decisions. So, like, you know, I can't go and talk to, well, some people can, but going and talking to a chief marketing officer the way you do a CFO, that's not it. All the CISOs are being trained to be business whisperers, right? Like, oh, speak the business's language. And you should; you should love your brand or whatever you do.
00:24:45
Speaker
But there's also a point where it's like, man, I can only say that so much. And some people who are stakeholders, I'm not going to talk that way to them. There's no way. It doesn't make sense. Well, I think, again, at the end of the day, like,
00:24:56
Speaker
having to break silos, which is kind of our job as CISOs. You have to be a bit of an interpreter to every different type of department that you speak to. But ultimately, quote-unquote speaking the language of business is understanding, okay, like,
00:25:12
Speaker
what exactly do I need to address from a need standpoint for you to get in line with what I actually need you to do, right? Like how do I sell the thing, the directive I need you to do?
00:25:25
Speaker
So it's not like I'm forcing you to do it. It's like, hey, we're aligning on an opportunity to do what you need to do to meet your KPIs in a way that protects our data and protects our brand and protects our environment and protects our customers and our employees, et cetera.
00:25:41
Speaker
I think that's where the interpretation comes in. I'm wondering, from what you're saying, though: are you guys planning on packaging this up and selling it as some kind of a solution? Or are you going to open-source the learnings and the discoveries of the framework and just, and I hope you do this because it'd be the most punk-rock thing ever, just release it to the wild and fuck with all those overpriced SIEM providers and platforms and all that shit, and show people that there's a way to get value and capability without having to spend
00:26:19
Speaker
you know, potentially millions of dollars to do it. We actually did that before. In fact, the first time we presented on it, as a vendor you can sponsor a talk at the ISAC. They didn't know they were sponsoring us. I'm not going to say their name, because I do love them. They're doing a lot of good for the world.
00:26:38
Speaker
I just don't like some of the practices they do. Okay. It was funny. They got really mad at us, one of these vendors. Like, really mad. And told us how wrong we were. And, you know, I'm trying to think of a terrible sci-fi movie where there's some human being that gets a parasite, and they go, oh my God, let's get it out of them. And then they start trying to get this parasite out of the human being, and the parasite starts moving and fighting back.
00:27:04
Speaker
And it's like, oh, we must be doing something good; it's hurting it. That's what we experienced from one of the providers. It was like three years of my life. They were just escalating things inside my company and making my life hell.
00:27:16
Speaker
Because we just talked about, hey, there could be another way. Not 'this is the way,' just, there could be more versions of right, guys. So, yeah, we've done that with a couple of people and a couple of really big companies, huge ones, big, big ones, who are like, hey, I've got this big bill, I want to get off it, but I'm scared. And everything we did, we didn't develop most of the stuff.
00:27:40
Speaker
What I'm saying is, it's off-the-shelf things you can do. But people have got a lot of stakeholders and a lot of things they've got to make happy, right? And self-preservation is a hell of a motivator, right? So it's, I'm not sure we can do this, or that we'll meet the SLAs for the enterprise requirements I have, or hit my compliance requirements. And so we've,
00:28:01
Speaker
we've demonstrated that not only can you, you can do it very, very well, with little to no effort. So yeah, we're going to keep giving, and we'll show anybody. Anybody who wants to see it, we'll sit down on a call with them. Heck, even the SIEM providers: we did it for a couple of them, explained to them what we're doing.
00:28:18
Speaker
And one of them had one of their people come on, who is now a good friend of mine. And they thought he was the heavy hitter; he was going to come and tell us how wrong we were. And he said,
00:28:30
Speaker
Oh, these guys don't need us. They've got it figured out. Damn. That's hardcore. Yeah, that's awesome, man. It's cool. It's cool. You know, the challenge we thought about when we were doing this was, you know, there are so many small teams who look around and compare themselves to

Small Teams, Big Impact in Cybersecurity

00:28:47
Speaker
the big teams with all the resources. Yes. 100%. Like, man, I cannot have those things. I cannot have a big SIEM or a SOC or an MSSP or any of that.
00:28:57
Speaker
And so then I can't really defend. And they're just demoralized, because every day they're still trying to do all the right stuff. And we're like, look, what happens if a small team can do it? Because technically we have two guys in cyber, because I don't count, because I don't actually work. I just talk a lot.
00:29:12
Speaker
And so, to show that, hey, we're able to fend off some pretty impressive attacks once in a while, that we know about things and we have good control, it's cool to be able to show that, hey, we can do that. And we did some other things too with our program, around like,
00:29:27
Speaker
well, there's stuff I won't say from an OPSEC standpoint, but we did a bunch of things around deception right away. And people think of deception as, I don't know, something where you need to get to this level of maturity to do it.
00:29:40
Speaker
And I get it, because they follow what Gartner and everyone else says: these are the steps, you'll do these things in this order. You know, Ryan, this feels like what we just talked about, falling into these mental traps, being habituated or acculturated into the social media hype. I feel like what you're saying is the same thing is true of security professionals. They just kind of fall into the stream and then inertia takes over. And it's all like, well, this is the way it's been done, I guess I've got to go buy this thing because I bought this thing at the other place.
00:30:09
Speaker
Well, and it's because, too, and George, I remember us talking about this, people believe that there is a sequence of investments you must make and that it defines success.
00:30:20
Speaker
Not success on the outcomes. And it's also funny, because people will sling capabilities around, and, what is it, a CISO who has maybe lasted two years somewhere, that's considered success.
00:30:34
Speaker
That's crazy, because how have you gotten anything strategic done? So it's the difference between progress versus just movement. Like, hey, am I moving forward on an issue, or am I just kind of dancing around the same spot, doing the same thing?
00:30:48
Speaker
A lot of business leaders don't know how to assess us as CISOs. And they're like, well, that guy looks like he's doing things, we're buying stuff, and there are really big numbers coming out saying what they're blocking. Must be good, you know? And even the standards for measuring us,
00:31:03
Speaker
like the ones you get trained on at, say, NACD, their doctrine, the curriculum they train on, is a little bit behind. And they know that. So those people, board members, are getting this training and it's not reality.
00:31:17
Speaker
Not keeping up. George, what do you want to say to that? I dig this entrepreneurial vibe, and I don't know how we lost that.
00:31:28
Speaker
Well, what I really love is going back to a sense of community, which I think is the most important thing, where Ryan and his team have found a way to genuinely solve a problem that has been created by what I think is the hyper-proliferation of profiteering and excessive solutions in our industry.
00:31:54
Speaker
And so, you know, going back to the roots of this thing, which is, again, a lot of open source, providing a solution that can help organizations who don't have millions of dollars to spend on securing themselves find a way to get the most out of the money they do have, of the resources they do have.
00:32:13
Speaker
I think that's real thought leadership. I think that's real industrial leadership. And, jokes aside, I've been joking around this whole episode or whatever, but in all seriousness, I have a world of respect for Ryan. He does a lot of really good things. He works in a tough environment to manage.
00:32:34
Speaker
And, you know, I'm proud to see what you've built. And I'm hopeful that there are more people like you, and that you'll inspire the people who have the competence, the technical competence, the operational experience, the charisma, to convince other people like, hey, I have this idea, let's try this out.
00:32:56
Speaker
Because I think we are in such an obsessive, profit-driven industry and world where we celebrate these announcements of like, oh, I got a bazillion dollars in my seed round.
00:33:08
Speaker
Celebrate me. And then we watch these clowns go party in limos in Vegas and stuff. That's not what it is. I mean, you've built something I think you can be genuinely proud of.
00:33:19
Speaker
And it's just cool. What's cool though, man, is, you know, I don't take any credit for it. I'm just a guy that talks about it. My team is just badass. And a lot of us have been together like 10 years.
00:33:31
Speaker
So we've got trust, which is hard, and we have it. But what's really cool, and I'll say this about Black Rifle as an organization, and these are my opinions, whatever, all that stuff, but this is true: because of the nature of the military background of our founders and our leaders, who were the SOF guys in particular,
00:33:48
Speaker
you know, innovation is survival for them. A lot of times those guys are out doing missions where they're alone or far away without a lot of support. And you've got to figure out a way to make it work and innovate.
00:34:01
Speaker
And, you know, it's not crazy if it works, right? And so we're given that leeway to also take risks and innovate. Innovation is actually one of our company's core values, one of the six values.
00:34:13
Speaker
And we're truly given that ability, because if you look at some of the technology and equipment in the military, just the doctrine and stuff that exists now, I would say common in Western militaries,
00:34:29
Speaker
I'm just going to say Western because, sorry, we're not friends right now, certain countries. Sorry, frenemies. You look at what is commodity kit that a regular soldier, Marine, airman, guardian, whatever, might get.
00:34:42
Speaker
At one point, it was a very fringe, advanced capability used by special ops. Then it became commodity after they iterated on it enough. And then someone said, oh, shit. So that's the difference
00:34:57
Speaker
between, hey, you just have an M16A1 or A3, and then that evolved into, oh, an M4, an M4 with all this shit on it, this extra 15 pounds of equipment that I'm sure everyone wants.
00:35:10
Speaker
But like, hey, do these regular soldiers need a red dot? Do they need a gangster grip? Do they need suppressors? So that whole cycle of how that works, right, and how it historically worked,
00:35:23
Speaker
because of our organization, we're able to do that. We're able to say, hey, we're making the next thing, because what we have right now is good, but it's not good enough, and we have to keep that edge. So I'm really thankful that we have that opportunity. And it's funny, man, I hate to say "the vendors," because I don't like using that term on them, but when you find the real people who are building a product, and they're looking for someone who's like, hey, I want to help you build it,
00:35:55
Speaker
I want to break this shit along the way, consistently. We like to be that team. It's like, hey, we'll help you. And a lot of them are really open to it. We have one partner right now we work with who's coming out of stealth here in a little bit, and I'll plug them, because
00:36:08
Speaker
I don't have to plug anybody when it's the truth. You know, it's easy. They're called Ocean Security. And these guys come in as an email security vendor, which, you know, that's a hard new mousetrap to create, man.
00:36:21
Speaker
Like, it's pretty hard. Yeah. We're pretty forward-leaning on our email security. And we threw out one of the leaders, one that people right now will always rave about and recommend to me, because we found that their feedback loop internally was broken.
00:36:36
Speaker
Their engineering feedback loop was broken, and they admitted it. Yes. I love that you highlight that, because I tell founders sometimes, you probably won't lose on tech. You'll lose on service delivery and cycle time to iterate and innovate. It's got to come back around. You've got to be able to quickly learn what's going on, especially now.
00:36:56
Speaker
Especially now, you've got to be able to very quickly apply those lessons, in seconds, minutes, right? It can't be weeks and months anymore. Or they can't promise it on the roadmap and then not deliver it.
00:37:07
Speaker
Yeah. Especially at the speed at which everything's going. So these guys at Ocean have been, man, it's one of the first times we've had a hard time keeping up with them. Nice. So cool. They're going to be a dominant force, I know it. And they're good human beings, like, for real. So I love seeing that. When we find real people, we're like, hell yeah, let's go make this. All right.
00:37:28
Speaker
All right. Let's go. So let me ask you this then, to kind of close it off. You've done some really cool stuff, and I think it's an incredible kind of movement you're a part of, that you kind of started. And I think, quietly, this feels like a quiet revolution, if you will.
00:37:50
Speaker
Forget all that, though. You're a super cool dude. And I think people need to understand: what is generally your outlook on life? Because unlike a lot of other people I know, a lot of other CISOs I know, a lot of other folks who get to the C-suite level and start making that C-suite money, they live with a lot of flash. They love showing off. You know what I mean? Big house, big cars, vacations, all that noise.
00:38:16
Speaker
You're a blue-collar dude, and you remain a blue-collar dude, which I think is why I love you. What is your kind of overall outlook on life, especially given where you started, you know, as a troop, and where you are now as an executive at probably one of the best companies in America right now?
00:38:38
Speaker
Yeah. I mean, I was lucky to have really good mentors who had been through it, who instilled in me, hey, here's what matters, who took the time to really focus on the things they told me mattered to them. And some of them were very successful, you know, financially and all that.
00:38:57
Speaker
I mean, I look at, what's the best way to say this? I look at

Prioritizing Joy and Relationships

00:39:03
Speaker
time, right? Time's the one thing I can't buy right now, unless Mythos is a fucking time machine. But,
00:39:09
Speaker
So I'd rather get experiences and stuff. And like, George, what you're saying, I've lived in, what, like five places since you've known me. I just moved to some tiny-ass town in the middle of nowhere for a
00:39:23
Speaker
house that cost no money. I put everything I owned in storage and lived with my family for a year in a 200-square-foot apartment in the middle of the forest in the mountains, just to see: what's this like?
00:39:36
Speaker
Can we do this? Right. It's funny, now that we've moved into a house again, as I'm taking all that stuff out of storage: I don't need any of that stuff. It has literally zero to do with my happiness. And I think you only get that perspective, for me anyways, and my wife and I talk about this all the time, because of the jobs I've had. We are literally observing the worst parts of humanity, right? The worst shit there is. Stuff that's just...
00:40:06
Speaker
It burns you out internally, burns out my soul, man, seeing it. But it also gives me the perspective that, you know, what actually matters is I need to feel joy all day. I can't be part of the organization, the enterprise that stops this bad shit, and then let that ruin me. I cannot let that happen. And I can't let my kids see that, like, oh, daddy fights bad guys on the internet and he comes home and he's just terrible.
00:40:37
Speaker
Or I have to compensate for that by, you know, let me get this expensive watch. Which is totally fine, I mean, people should treat themselves. If you've got it, go for it, you can't take it with you. But none of that matters, man.
00:40:48
Speaker
None of that matters. People matter. People matter so much. And, you know, Maya Angelou: people remember how you made them feel, right? So I love investing my time in people, because that is far more rewarding. When I had kids, it was the same thing. It's so rewarding to see it,
00:41:09
Speaker
in the industry, you see it with other vets that we get to interact with. Like, dude, it's so awesome. And I can't measure that in a number or a dollar sign, and I don't think it should be measured. So for me, it's just: try to be a good human being, and whatever time I have,
00:41:27
Speaker
use it really well. Well, that is a perfect place to wrap, because we're not going to get any more positive than that, at this hour at least.
00:41:39
Speaker
So Ryan, thanks very much for jumping on, especially just as you got off a plane. Really appreciate the time. I appreciate you guys, and I appreciate you guys doing these.
00:41:52
Speaker
It's real talk, what you guys do on these podcasts, which is needed a lot. Yeah, I think we're really bad at faking it, so this is the only avenue we have. Hey, I'm just saying it's a good sign of character when you suck at lying or being full of shit. Right. So. All right, man. We will talk to you soon. All right. Yeah. Appreciate you guys.
00:42:17
Speaker
All right. So, questions to take forward: question the hype. You know, be interested in AI, but don't believe everything you hear.

Questioning AI and Cybersecurity Hype

00:42:26
Speaker
I guess my question to you is, how are you gut-checking a lot of the noise? Because if you just kind of fall into the inertia,
00:42:36
Speaker
it really stops you from thinking critically about the issue. What is the incentive behind the argument I am hearing? And am I just throwing my weight behind it because I'm impressed with the cachet of the person telling me? You really do have to keep questioning what you hear.
00:42:56
Speaker
Yeah, I think the other thing too is, are you following your good instincts, right? Whether it's a career path decision, something your employer is asking you to do, or, if you're a leader in charge of a program, being pressured and pushed by a vendor to make a purchase or a decision.
00:43:16
Speaker
Does it really align with how you want to run your program and how you want to contribute to your part of the business? The way I see it, people need to start learning to trust themselves better, because when they do, and they don't let whoever is in front of them just influence them,
00:43:34
Speaker
I think that's when we go back to making good decisions again. And, you know, I think this episode really hammered that home. Nice. All right. Take this forward. We will see you next week.
00:43:49
Speaker
If you like this conversation, share it with friends and subscribe wherever you get your podcasts for a weekly ballistic payload of snark, insights, and laughs. New episodes of Bare Knuckles and Brass Tacks drop every Monday.
00:44:02
Speaker
If you're already subscribed, thank you for your support and your swagger. Please consider leaving a rating or a review. It helps others find the show. We'll catch you next week, but until then, stay real.
00:44:18
Speaker
I think we got drunk for like a week in Ireland. It was sweet. I mean, I won't confirm or deny that, but yeah, probably.