
From Hacker to Founder, and Cybersecurity's Future

S3 E37 · Bare Knuckles and Brass Tacks

“When you look at cybersecurity…we've got to be constantly thinking about how we disrupt ourselves in order to actually solve the problem."

Casey Ellis is a hacker, a founder, an advisor, and an investor. Occupying a lot of different vantage points in cyber has given him a unique perspective on the industry.

George K and George A talk to Casey about:

  • How Casey went from hacker to solution architect to entrepreneur, creating a marketplace that connects ethical hackers with companies who need them
  • Why security startups focused solely on acquisition are hurting the industry (and why defenders deserve better)
  • The reality check on AI in security - separating hype from actual value
  • Why human creativity will always be necessary in security (automation is great, but humans build systems and humans break them)

It’s real and it’s raw. As always.

👊⚡️🏳️‍🌈 

Our Pride campaign kicks off in June, and we're looking for brave vendor sponsors! Queer communities are facing backlash and corporations are shrinking back into the shadows.

We’re looking for courageous vendor partners and individuals who will consider matching donations to help us multiply the show's contribution. If you’d like to remain anonymous, that’s fine, too. After all it’s about getting resources to those who need it.

If you're interested, get in touch: contact@bareknucklespod.com 

Transcript

Cybersecurity Challenges for Defenders

00:00:00
Speaker
There's no shortage of point problems to go out and create a flashy solution to. And those are the things that, yeah, either flash in the pan and fail or sometimes get picked up by Apollo or whoever else.
00:00:11
Speaker
And you get kind of that quick turnaround. You know, cybersecurity is inherently disadvantaged on the defender side, and we've got to be constantly thinking about how we disrupt ourselves in order to actually solve the problem that we're saying we're here to solve, which is not to grow and sell a big company. It's to make life harder for the bad guys and easier for folks that are trying to do the right thing, right? I try to look for folks that are doing that when I think about it from an angel and, you know, an advisory standpoint.
00:00:44
Speaker
And I do think that, you know, as we go along, more people are calling bullshit on point solutions that just solve immediate pain in a flashy way that's going to get acquired, you know, on a short turnaround.

Podcast Introduction and Guest Introduction

00:01:04
Speaker
Yo, yo, yo, it's the show. This is Bare Knuckles and Brass Tacks, the cybersecurity podcast that tackles the human side of the industry. I am George K with the vendor side. And I'm George A., Chief Information Security Officer.
00:01:16
Speaker
And today's guest, really excited: we've got Casey Ellis, founder of Bugcrowd, behind the mic. This guy occupies a lot of circles. He's a hacker. He's a founder. He's a researcher. He's worked in policy.
00:01:32
Speaker
So we are just trying to get his viewpoints on a lot in as little time as possible given the breadth of his experience, and this conversation did not disappoint.

Casey Ellis' Cybersecurity Journey

00:01:43
Speaker
I honestly was looking forward to this thing for months. I know you were too. I think Casey, before even stepping onto the show, was already one of our favorite guests. He absolutely delivered on the hype. He really just provided us with a ton of perspective from all the different hats he wears.
00:02:02
Speaker
And I think at the end of the day, regardless of whether you're a seller, whether you're new in cybersecurity, whether you've been in cybersecurity for 10 or 20 years, whether you're a deep technical practitioner or a policy person, Casey's had his fingers on all of it at some point or another, probably all of it today as a CEO. I think this was one of our favorite, one of our best episodes.
00:02:25
Speaker
And, you know, just the fact that we really went into this purely on heart. There was no pre-scripting. This was just, we got a cool dude.
00:02:36
Speaker
Let's see what we can do with him. And I think this turned into a home run. Yeah, well, let's get into it. Casey Ellis, welcome to the show. Thank you for having me.
00:02:47
Speaker
Yeah, this has been a long time in the planning, not logistically, but more George and I going back and forth. We should have Casey on. Glad to have you. um We should set the scene.
00:03:00
Speaker
As you and I were talking, you have occupied many roles, right? Indeed. Hacker, founder, executive, researcher, kind of do a lot of things. And in terms of cyber, you're in the middle of a lot of Venn diagrams.
00:03:12
Speaker
So I will sort of set that stage and let you kind of give a quick background for the listeners. And then we're going to just launch into this fusillade of questions.
00:03:24
Speaker
Yeah, the simple version is: hello, I'm Casey, probably best known as the founder of Bugcrowd, also the co-founder of the Disclose.io project. I came up through hacking into pentest, then moved over to the solutions architecture and kind of salesy side of things. At one point along the way there, I got it in my head that if I'm doing both, I should probably try to be an entrepreneur.
00:03:46
Speaker
Yeah. Right.

Founding Bugcrowd and Disclose.io

00:03:48
Speaker
As one does. Kind of broke bad at that point. And, you know, one thing led to another and now we're here. So that's the very condensed version. But yeah, I do sit in the middle of a lot of stuff like offensive security, you know, researcher rights, and kind of coordination of...
00:04:02
Speaker
the hacker community, but also, you know, creating a more favorable environment for people to operate in good faith. So that's dragged me into a lot of policy stuff, which kind of wasn't on the bingo card 12 years ago when I started Bugcrowd, but here we are.
00:04:16
Speaker
So, yeah. Well, I think to your credit, when you started Bugcrowd and when vuln disclosures became sort of less annoying to the establishment, right? And they're like, oh, right, we should have bug bounties. This is helping us.
00:04:32
Speaker
Like, I think a lot of those folks would not have thought that they were going to end up in policy, right? It was like, that was just the natural transformation of, oh, software is everywhere and we should know what the problems are. Yeah, yeah. And a lot of that had to do with reframing the way that people think about hackers, the way that people even think about vulnerabilities and security. Like, hackers...
00:04:55
Speaker
On that side, it's this idea that we're not all burglars. like Some of us are locksmiths, which is you know a concept that everyone's familiar with, but the digital version was completely foreign at that point in time.
00:05:06
Speaker
And I think on the vulnerability side, like the point that we're at now in 2025...

Changing Perceptions of Hackers

00:05:12
Speaker
that we weren't at in 2013 was this idea that vulnerabilities are a product of the fact that humans write code and humans deploy systems. Like, we're awesome as a species at doing creative stuff, but we're also not perfect. So when we screw up, and then computers go off and amplify that, we end up with a system-level problem. That's just the physics of what we're doing. You know, it's funny you say when humans write code, because interestingly enough, when AI writes code it's even more vulnerability-prone. So, yeah. Yeah. Where did the AI get its training data? Right. So I think there's a bit of a problem.
00:05:43
Speaker
Yeah. Like that whole idea of just getting folk over the hump of, what is the nature of what we're actually working with here? You know, in the early days of Bugcrowd, there was definitely a lot of just straight-up evangelism of those ideas. And that goes on today. But you know, at some point, you realize that the policies and the laws that have been written around preventing legitimate computer crime actually chill good-faith security research as well. So we should probably work on trying to change those, and so on. So yeah, it's definitely a thing that a lot of us that were around back in the day
00:06:19
Speaker
have found ourselves doing at this point in time. I think it's a good thing, but yeah, like I said, it wasn't really on the bingo card when we started. Yeah. And I'll say I want to explore that first phase, where you went from hacker to solutions architect. You just sort of skipped over that, like it made sense to be an entrepreneur, but I feel like let's give that a little bit more time, because not everyone makes that leap.
00:06:45
Speaker
And I also think you were making a leap into, I mean, every startup says they're inventing a new category, but it wasn't, you know, normal to start a company based around vuln disclosures and trying to systematize that and things like that. So where was that inspiration of, I want to go from individual contributor to building a company, which is a completely different mindset?
00:07:12
Speaker
Yeah, for sure. Part of it, I mean, on the personal side, in terms of, you know, entrepreneurship as a word means to enter into a thing, right? For me, the break point between doing kind of back-of-house, pure technical stuff and going out the front of the house and starting to do solutions and sales was actually off the back of a conversation with my wife. She sat me down at one point
00:07:38
Speaker
after we got married and said, hey, like, you computer good, but you people good too. I don't think you realize that not everyone can do both. And to me at the time, I'm like, what do you mean? Because it was just sort of native to how I was operating, and what she was doing at that point in time, to her credit, was drawing out some of the latent potential that I had in that area. So I went off and did sales and solutions architecture and realized that I really enjoyed not just trying to create a solution to a problem, but then connecting
00:08:08
Speaker
that problem-solution fit to as many minds where the problem exists as possible, which I later learned was product-market fit. Right.

From Technical Work to Entrepreneurship

00:08:15
Speaker
And so there's all of these kind of raw ingredients brewing up along the way. And, you know, frankly, the decision to get into it,
00:08:24
Speaker
like, to quit my day job and kind of, you know, hunt and kill what I eat and all that kind of stuff. That was probably the scariest thing in the whole thing, right? Because that's a big step for people to make. That was really after, you know, reading The Four-Hour Workweek, talking to a bunch of other entrepreneurs; it was bubbling up.
00:08:42
Speaker
And, you know, really the thing that drove me personally was like, I think I can probably do that. And now I want to see if I can or not. Well, it's like a different problem to hack, essentially. It's a human problem.
00:08:54
Speaker
Yeah. And this is one of the reasons why I love, you know, cybersecurity entrepreneurs, because there is that crossover in how we think. Like if you, you know, especially if you're in offensive security, like you start at one point, you want to get to an end point.
00:09:08
Speaker
You've got a limited set of things to work with and you've got all these things trying to stop you. Your job is to create a path, right? I see a lot of similarities in the overall high-level thought process between that and building a company, you know, especially creating a category, because of all the different things that we had to kind of get through to get to the point where...
00:09:30
Speaker
Bugcrowd, and really the category that it kicked off, was established. There was a lot that we had to hack our way through in the early days, for sure. Nice. Yeah, I mean, the way that I kind of look at it is, what you've created is really just kind of like a marketplace for us.
00:09:49
Speaker
Sorry, a marketplace for offensive researchers to really put their skills to the test and to actually benefit and profit off of doing so. What you've done is kind of create something that, you know, other companies have tried to do something similar, but I think it's still far too commercial.
00:10:08
Speaker
Whereas with you, there's a very real, authentic vibe to it, that it is a company designed by hackers, for hackers, to essentially help the blue team.
00:10:21
Speaker
And, you know, I think I'm trying to understand kind of where and how you've maintained still fighting for the good guys, right? Because, you know,
00:10:32
Speaker
I'm sure you do quite well financially with what you're doing, but you know you can make a lot more money going down that gray hat, black hat route. So ethically, how are you able to retain kind of your good, right? And really, that's the message I want the listeners to see: you can learn how to hack and you don't always have to do evil shit with it.
00:10:55
Speaker
Yeah, no, for sure. That's a really good question. Like, I think for me, personally, that was just kind of a part of my own moral compass growing up, right? And then back in the early days, having hacked around and done a bunch of stuff in high school and whatever else, and then transitioning out of that into a career, you know, I didn't realize that pen testing was an option until I got into network engineering and started hacking stuff for clients and they started seeing value in that. So all of a sudden, for me at that point in time, it's like, okay, I really enjoy thinking like a criminal, but I don't want to be one.
00:11:34
Speaker
What do I do with that? All of a sudden it had an outlet. And this is like late nineties, early two thousands, right? So it's in the very, very early stages of the industry, especially in Australia, where I'm from.
00:11:47
Speaker
To me, that's just, you know, the opportunity to replicate that experience for other people and create pathways for folk that are of that same sort of mindset. It's like, yeah, I enjoy...
00:12:04
Speaker
There's almost a certain appreciation for criminal creativity, but you've got this moral code, or you've got this sort of set of ethics that you operate by, that doesn't want to cause harm. That's a difficult thing to reconcile unless you figure out that there's a path to use it for good.
00:12:21
Speaker
Yeah. So, yeah, I think, you know, nowadays it's easy because I've been doing this now for a really long-ass time and I've built a career and the ability just to do it. Right. I think for younger players,
00:12:36
Speaker
It's sort of a similar thing. It's like, you know, what do you want to do when you grow up? You can be a criminal. That is an option for everyone. But, you know, Dan Kaminsky used to say it: not everyone wants to be a drug dealer. Right. Everyone could be.
00:12:48
Speaker
But, you know, I mean, for sure. But how did you then find like-minded kindred ilk to join you on this quest? You know, because, look, man, I used to organize all sorts of community events, and doing community-level organizing, in my opinion, is always like trying to herd cats.
00:13:09
Speaker
I can only imagine trying to get hackers to do things in the same fucking direction. It must be similar. How did you do it, man? Yeah.

Building a Collaborative Cybersecurity Community

00:13:19
Speaker
So in the context of Bugcrowd, it was really a lot of, you know, going back to some of that early-stage evangelism that we had to do. A lot of it was educating, you know, the internet collectively on what a hacker actually is.
00:13:35
Speaker
And we still do that today. We do a report every year called Inside the Mind of a Hacker. And the entire purpose of that report is to have people read it and go, oh, I didn't know that. You know what I mean? I thought these were bad people or evil people or whatever.
00:13:49
Speaker
And to kind of humanize that persona and start to make it more familiar and something that people feel comfortable interacting with. We started working on that pretty much from day dot because we knew that, you know, folks are afraid of hackers. We're going to have to deal with that in the market before anything else that we do
00:14:06
Speaker
will work, right? So just being conscious of the fact that that was the problem and actively working on it, I think, was one part of it. You know, just very practically, we've been sort of spoken about over the years as a swag company or a party company that, you know, sometimes dabbles in cybersecurity, right?
00:14:26
Speaker
Like we did a t-shirt, you know, "My other computer is your computer." That was the first t-shirt that we did when we rocked up to DEF CON in 2013. It was, I think, six or seven of us, right?
00:14:38
Speaker
So we wanted to create a splash and get the brand out there. But we also wanted to think about how to create a rally point for folks, basically hackers and CISOs, or hackers and defenders, that wanted to talk to each other but didn't know how to. We wanted to create a branding item that they could both appreciate, that kind of brought everyone together.
00:15:00
Speaker
Right. And that worked really well. And we've sort of repeated that throughout the years. And, you know, it's not all t-shirts and stickers that get it done. But, you know, for us, that was a huge part of it. Just giving folks something to identify with and something to rally around. That was a really important,
00:15:15
Speaker
and continues to be a really important part of, you know, kind of what you're calling out there, George. Well, first of all, marketing hat on, I'm always telling people: you know, threat actor, adversary, cybercriminal.
00:15:28
Speaker
Please stop putting "hacker" on your website as the thing, because you're basically going to piss off a good portion of your would-be customers. So, Casey, you have had this founding journey and were lucky enough to start it early. You're now sitting with more experience than most founders in this industry.
00:15:52
Speaker
George and I were talking just this morning about mission, and this idea came up. I was listening to this interview with Demis Hassabis, the CEO of DeepMind, now Google DeepMind, and he was asked, why did you start the company in London versus in the Valley?
00:16:10
Speaker
Right. And he said, well, you know, in London we knew we could still get world-level talent from Cambridge, Oxford, and Europe. But what was really interesting is he said, you can also get very distracted. And he said the company was started with a mission in mind, and he knew that it was, what he said, a 20-year mission.
00:16:31
Speaker
And he said, like, if you're in the Valley, you're surrounded by, oh well, what if I just peel off and start a photo app and like sell it for a hundred million dollars or whatever. Right. And it was like, then you get sort of pulled from the mission.
00:16:44
Speaker
I want to bring this back to cyber because I want to get your take on that founding ecosystem now, because there was a frothy period, a little bit less so now, but I used to see startups, and I would see the demo and I'd be like,
00:17:00
Speaker
Are you trying to get acquired by Google? It just feels like your whole purpose, your modus operandi, was start, scale from seed to A, sell at B, you know, versus, what can I build for the long term that's going to actively help the community defend its systems? And I get it. Not everyone can be a Palo or a CrowdStrike and grow, you know,
00:17:19
Speaker
exponentially large and cover a lot of the ecosystem, but the actual founding mission, I guess I want to get your take on that because you're there in SF. And yeah, you've probably seen lots of founders at this point.
00:17:31
Speaker
Yeah. Gosh. I mean, that honestly forms a part of my thesis when it comes to advisory and angel investing.
00:17:43
Speaker
Do you know what I mean? Folks that... There's no shortage of point problems to go out and create a flashy solution to. And those are the things that, you know, either flash in the pan and fail or sometimes get picked up by Apollo or whoever else.
00:17:57
Speaker
And you get kind of that quick turnaround. Sometimes the things that get picked up in the process are real kind of transformative tech as well. But, you know, that's kind of the pattern that you were referring to before.
00:18:08
Speaker
But then when you look at it, you know, cybersecurity is inherently disadvantaged on the defender side. And we've got to be constantly thinking about how we disrupt ourselves in order to actually solve the problem that we're saying we're here to solve, which is not to grow and sell a big company. It's to make life harder for the bad guys and easier for folks that are trying to do the right thing. Right.
00:18:33
Speaker
You know, I try to look for folks that are doing that when I think about it from an angel and from, you know, an advisory standpoint. And I do think that, as we go along, more people are calling bullshit on point solutions that just solve immediate pain in a flashy way that's going to get acquired, you know, on a short

Creating Genuine Solutions for Defenders

00:18:56
Speaker
turnaround. Like, yeah, I've noticed a lot more apprehension around early stage. People are like, oh, that's exciting, but are you going to be here next year? Yeah, with the exception of, you know, AI cybersecurity in general, that's starting to cool off a little bit now. But there was some...
00:19:12
Speaker
sort of fairly crazy hype around anything that, you know, moved in that area for a while. But yeah, you know, the whole idea of, are you solving a problem in a way that's going to, you know, potentially be disruptive to the entire way that an industry operates, which is kind of what we tried to do, right?
00:19:30
Speaker
Or are you just building a thing for something that's not going to be a problem in 24 months' time, and you've either needed to have exited by then, or, you know, you're a zombie or you kind of fail at that point. Right. Yeah.
00:19:43
Speaker
I actually really love what you said. It's a reframing that I think is useful:
00:19:50
Speaker
are you building something that is designed to make defenders' lives easier, versus, yeah, are you just designing gizmos and widgets because it looks cool and it fills this hole in the stack, or selling band-aids to people who have a broken arm, right? Like, when you think about it, a lot of what can pop, because, you know, the,
00:20:12
Speaker
the symptom of the problem that requires the band-aid, everyone can kind of get spun up on that from a product-market fit standpoint or a problem-solution fit standpoint pretty quickly. So you can accelerate really rapidly with something like that in market. But if it doesn't address the fundamental problem, then, you know, like I said before, you've either got to punch out before the market realizes, or you end up in this sort of zombie, you know, kind of propping-things-up state.
00:20:38
Speaker
Yeah. I think the whole idea of making secure easy and making insecure obvious, as a fundamental design principle that everyone should sort of try to apply. And then to be thinking about how deep in the defender's process you can embed that in a way that just makes their life easier, so they can get on with actually building the business, which is meant to be the hard part.
00:21:02
Speaker
Right. You know, if you're sort of approaching what you're building and the problem that you're solving, and back to your question of vision, your picture of the future, when that's a thing, that's when I think you're doing it right.
00:21:16
Speaker
And, you know, sometimes, to me, you can tell when you talk to founders that have that just irrational conviction around a problem that they're pissed off enough about to actually go off and try to solve. You can sort of see it, you know what I mean? The people that just want to build a thing and fluff it up, sometimes that spark or that fire is just not there.
00:21:42
Speaker
um Hey, listeners, this June, we will once again be supporting Pride Month with our T-shirt campaign to raise money for scholarships for LGBTQ plus students in cybersecurity programs, both graduate and undergrad.
00:21:58
Speaker
In the month of June, all profits from any Pride gear purchased from the BKBT swag store will be donated. That's all profits. Last year, we put this together in a hurry and we still managed to donate a thousand dollars.
00:22:10
Speaker
This year, we are looking to do a lot more. Why? Because this year is not like last year. Queer communities are facing backlash and corporations are shrinking back into the shadows.
00:22:22
Speaker
To that we say, fuck that noise. We've never feared a fight for just causes and we believe hiding is just pre-surrender. So we are looking for courageous vendor partners and individuals who will consider matching donations to help us multiply our contribution.
00:22:42
Speaker
If you'd like to remain anonymous, that's cool too. After all, it's about getting resources to those who need it. So if you are interested, reach out, either us on LinkedIn or through the official email, which is in the show notes.
00:22:57
Speaker
Now back to the interview. I rail to George at almost every conference we go to now, and it's like almost three years' worth of them.
00:23:10
Speaker
We're like, you know, we all have to do that mandatory walk through the trade show floor. And you go to where Startup City always is, and it's just like, oh, cool, point solutions. I don't give a fuck. Like, you know, it's just like...
00:23:22
Speaker
And I hate to say it, man. I'm on a tight budget, and I have to justify every bit of spend I do, like pretty much the majority of other mid-market CISOs. And I just don't find these things appealing. And it's tough, because we have good friends who are sellers at those organizations. It's just like, why don't you give us a shot? And it's like, well, I like you, but I know that that solution's not going to be here, whether you have all the hype and the cool gadgets or not.
00:23:52
Speaker
So I think you're really bang on there. Now, what I find interesting and what I want to know, given what Bugcrowd does: there's an entire section of technology that is trying to automate out what you guys do,

Role of Automation in Cybersecurity

00:24:07
Speaker
right? So there's the Snyks of the world, the Ravens of the world, and we love Raven. They just won our pitch battle last week.
00:24:15
Speaker
You know, there's the Dynatree. So there's all those solutions that are trying to use automation to take care of what human beings are still the best at solving, because ultimately humans created the problem.
00:24:28
Speaker
So where do you see the industry going in that sense? And is there a way that, you know, a company like yours doesn't have to competitively combat against these organizations? Can you work with them?
00:24:40
Speaker
Or where do you see the future of the space going? Yeah, no, for sure. I think your callout there. So going back to Casey's vision, really it's two things. One is to create a better operating environment for hackers that work in good faith.
00:24:57
Speaker
And the other was to really reduce the economic asymmetry between an attacker and a defender when it comes to creative problem solving. And that's not the written-down version. I kind of winged that one a little bit, but you get the gist of it, right? Like, you need to be able to apply people.
00:25:13
Speaker
The whole reason that we're here in the first place is not just the fact that humans deploy systems and screw it up sometimes. The other side of it is that the adversary is human as well. They're driven by human incentive. They're applying their own human creativity to overcoming whatever we come up with from a defensive standpoint.
00:25:33
Speaker
And whatever you throw into the middle of that as tools to make them more effective on either side, their job is still to innovate past that and get the job done. That's not going to change; it's that sort of cat-and-mouse thing. You know, the way I explain it sometimes is that the idea of someone leaving their front door open and someone else coming along and exploiting that predates the internet by a couple of thousand years, right?
00:25:57
Speaker
This is not a technology problem that we're talking about here. It's red on blue. Yeah. So, you know, to me, what happens, like how it goes forward, is kind of a lot of what we've already seen, but sort of continued. Right. Like ASM as an industry didn't exist
00:26:16
Speaker
when we first kicked off. I think practitioners were very well aware of the fact that folks didn't know where the public-facing infrastructure was, but they weren't. And then all of a sudden, the bug bounty model dropped and you've got folks like Nafi and Shubs and some of these kind of rock stars in the community these days going out and looking in places that other people weren't.
00:26:40
Speaker
And all of a sudden, public-facing infrastructure gets found, they're getting rewarded. All of a sudden, this sort of trash-fire problem that we've got gets revealed. The next phase of that is that people start building automation to basically do that same thing, because now it's a known issue that we're trying to solve systemically across the entire internet. And all of a sudden, ASM as an industry is birthed out of that. Right. So you could argue that that took away...
00:27:07
Speaker
some opportunity from the bounty hunting community. I think that's a fair thing to say, but that's the job of automation. Yeah. And at that point in time, it's really the role of folks doing research for money in this sort of way to figure out how to work with that, to your point before, George, or to upskill into the next area that hasn't really been addressed yet.
00:27:28
Speaker
Yeah, so I see you guys as kind of the bleeding edge: the zero days and the net-new problems are still on human beings to discover. And then once we have solutions in place, then create technology that you can purchase and subscribe to, and it takes care of that shit.
00:27:44
Speaker
Exactly right. Exactly right. You know, it's like the Iron Man suit, right? Tony Stark without the suit is weaker than he needs to be. But the suit without Tony Stark is dumber than it could be. And you put the two together and it kind of works. So yeah, to me, that's sort of that, you know,
00:27:59
Speaker
automation-human interface. I see the two as always being, you know, crucial going forward. And to your point, it's partnership. We do have to deal with the competitive kind of angle of it. You know, folks are asking, oh, is...
00:28:13
Speaker
you know, is the AI pen test or AI red teaming kind of category going to replace, you know, human testers? Well, some of it, yeah.
00:28:23
Speaker
Like your Nessus-plus jockeys and, you know, kind of your cheaper compliance-driven stuff, I can see that taking a pretty big bath over the next period of time. And I think it's about time it did. But in the meantime, there's, you know, multi-stage attacks, things that are actually adversarial emulation, in a real and adaptive way, that AI and those sorts of tooling are going to take a long time to get to.
00:28:48
Speaker
Yeah. Yeah. Because as you said, in terms of incentives, I mean, your classic social engineering is just... It's attacking human psychology, right? Creating false urgency and stuff like that.
00:29:00
Speaker
So um at the time of recording, and we were talking about this just before the recording, right? We have this big kerfuffle with MITRE and the CVE database.
00:29:12
Speaker
Given the path that you have taken, the path that Bugcrowd has followed, I guess I want to get your take on where you think the industry is now in terms of handling this. You know, we have CVEs, we have KEVs now through CISA.
00:29:31
Speaker
And I think teams are getting better at prioritizing vulns instead of just, oh my God, here's the latest list, patch all of them. It's like, we don't even have that stuff except on this non-public-facing server, you know. They're getting better at prioritization.
00:29:47
Speaker
But I guess, how are we doing with patching? How are we doing with discovery? And also, I think you and I were discussing before George joined, how are we doing with patching?
00:29:58
Speaker
Literally just, have we improved software development? Because the bugs are still out there. Yeah. Wow. There's a lot in that question. I think, in general, we're mostly at a place where organizations realize that you can't fix everything. Mm-hmm.
00:30:19
Speaker
Right. So it's like, yep, vulnerabilities happen. This is where, you know, the whole *SPM category came from, this idea that reachability and attack pathways, like the probability of attack, is partly a function of where a vulnerability might exist within a system. And if it's very deep inside, then maybe you can prioritize that lower than the thing that's facing the internet. Right. That just makes sense. But I think we're at a point where people are looking at vuln management and vuln prioritization in that way, because they've realized that we're just not going to get around to all of this. So I feel like that's sort of where we're up to. I'm not sure that we've got a whole lot of good solutions to deal with that truth yet. I think there's different things that have popped out, you know, KEV,
00:31:07
Speaker
The problem that it was trying to solve, or is trying to solve, with respect to sounding the alarm on exploitability, I think that's a really important thing to feed into prioritization, especially as nation-on-nation threat activity accelerates. We can reasonably expect that's going to increase.
00:31:30
Speaker
And what we also know is that nation-states are using fairly commoditized, you know, 0-day basically, or n-day, to, you know, opportunistically get shells and preposition and do things like that. Like as that,
00:31:45
Speaker
That is one example of a threat actor that we're thinking of as defenders now. Knowing that that's going on is a pretty important ingredient that goes into making that decision. So there's all sorts of things, I think, trending in the right direction there. CVE and NVD and that whole thing, it's...
00:32:08
Speaker
It's a mess, because I think that there's a lot that could be done to improve how CVE- and NVD-type processes get done.
00:32:22
Speaker
And I do think that, you know, what we've seen over the past couple of years is the strain. You know, with Bugcrowd, a part of what we have to do is triage the input of the entire internet to our customer base. So we're pretty familiar with the triage problem, right?
00:32:41
Speaker
And to see the scale issue of the number of vulnerabilities being discovered and submitted to the CVE and NVD programs, and then seeing them start to really kind of buckle under that pressure.
00:32:54
Speaker
Yeah. Okay. Something needs to get fixed up in the middle there. But at the same time, to just have it suddenly go dark, that's a bad scene. There's so much kind of
00:33:07
Speaker
existing, you know, mostly functional defense and vuln management infrastructure that's hanging off that as a critical dependency for it to just switch off overnight. I think there's absolutely this kind of meme that real men don't eject the USB drive before they unplug it.
00:33:24
Speaker
Maybe it's being taken a little too far in some instances, because, yeah, sometimes you can get away with that, but when it goes wrong, it goes really, really wrong, at a scale that is very difficult to imagine. To me, I think we're well and truly into testing the limits of that at this point in time. So I'm really interested to hear what your take is on the implications of true AI capabilities, right? Because I have been...
00:33:49
Speaker
one of those annoying CISOs that they bring on the panels, and I'm like, I don't give a shit about AI. If you're talking to me about secure implementations of taking open source LLMs and being able to plug them into a production environment, I give zero fucks, because I can't take that to my devs. I just tell them to put it on Docker and just make sure that it doesn't actually touch the environment.
00:34:10
Speaker
um So I really am just getting tired of... I want to say the snake oil salesmen and the NFT guys who have now become AI guys who are like, yeah, we can just make a SaaS company by ourselves using AI and all this.
00:34:26
Speaker
How do we, I mean, what is your take on it? How do we tell what's real? Like, to tell what's real, yes, but then how do we as real practitioners really combat this whole thing? Because I can't go to an event or even open up my news feed without seeing some kind of overhyped nonsense about what AI either could do, or some weird graphic that someone built based off a prompt.
00:34:52
Speaker
Oh, yeah. The latter part, like... yes, is a way bigger problem. And I think we've actually been dealing with this problem for a lot longer than we realized. When AI dropped, when kind of broad consumer access to generative AI dropped, whenever that was, end of '22, I think everyone was like, oh, this is going to usher in a post-truth era, blah, blah, blah. I'm like, and where have you been? Yes. Yes. Like, hello. Yeah. Machine learning plus social media
00:35:25
Speaker
kicked that job off in a big way a long time ago. And we're sort of really well into that in terms of the need and the fatigue that comes with trying to discern fact from fiction on the internet. I think we're at a point now where it's becoming obvious, almost in a way that, you know, with the optimist hat on, maybe the frog jumps out of the pot at some point and people kind of call BS.
00:35:47
Speaker
But that's a way bigger, like technology kind of system-level problem. We're a little bit out of the realm of cyber at that point. So, yeah, I think with the tech side of it, and thinking about AI, I mean, the way I sort of started to think about it early on was, this is sort of like how I imagine it would have been to be in the Valley in like 1997, 1998, when people started having a website, right?
00:36:20
Speaker
Like there's this enormous peak in the hype cycle. Yeah. Everyone who's doing anything else is going completely crazy because it's distracting them from their job. All those different things. A lot of what gets talked about during that peak phase is directionally predictive, but wrong mostly.
00:36:37
Speaker
But in reality, if you sort of look at that period in the rearview mirror, it did go on to change basically everything, right? But let's take that thought for a sec then.
00:36:51
Speaker
If you parallel that timeline, do you see an AI bubble bursting similar to the dot-com burst that occurred in that era? Cool. That's a fun one. Yeah.
00:37:04
Speaker
It could. I don't think the public markets are anywhere near as exposed to the AI boom as they were with dot-com. There's obviously a lot of other black swans kind of circling, like, at the moment that could precipitate that.
00:37:20
Speaker
But yeah, the exposure looks different on the public market side of things, I think. I do think you saw a lot of capital flows after ChatGPT, and then it took about a year before the VCs were like, oh, wait, this is just a wrapper.
00:37:39
Speaker
And then like the models are just going to build that capability natively, and your API calls are worth zero dollars. yeah So I think some smarter questions are being asked, but I do think there's still a lot being promised that's real weird.
00:37:53
Speaker
Yeah. And when it comes to AI security, I mean, I got involved around that same period with AI, you know, the national cybersecurity strategy, which was being put together at the time, and then some of the executive orders, the stuff that came out of the Biden administration around AI.
00:38:13
Speaker
And it was really interesting because, you know, the unanimous agreement on the Hill and in the White House was like, ah, AI, we've got to do something about this, partly because it's powerful and anything that's sufficiently powerful is inherently dual use, but also partly because it was so accessible to such a large number of people that it just immediately became a retail politics issue.
00:38:34
Speaker
Right. So it's like, we've got to do something. And the thing that was really interesting was when I asked the question, yeah, but what do you mean by AI security? No one had a clear answer to that. Right? It's powerful, keep it safe, I just freaked out. And that was kind of it.
00:38:49
Speaker
um So that's when you know I started to play around with the idea of a taxonomy for like AI as a tool, AI as a target, and as a threat. That's kind of how I rolled it out at that point in time.
00:39:00
Speaker
And you know, the overall thing there is that we don't really understand what AI security means yet, because we haven't really figured out how we're going to use AI.
00:39:11
Speaker
Right. Right. And I still think, I think that's less true today, but at that point in time it was very true, because it was just this collective freak-out and a whole bunch of activity. So, yeah, I don't know, George, going back to your question, I don't think it's BS. I think, you know, the way to combat it is just to get as comfortably familiar with it as you can afford to in order to actually understand the difference between hype and things that could be real, like are real now or could be real in the future. Yeah. I think the thing that's worth pointing out is that this is the first,
00:39:47
Speaker
quote unquote, transformative technology, you know, that has a lot of breadth, depth, scale that is being developed in the private sector and not in the government sector, right? Flight, railroads, the internet,
00:40:03
Speaker
You know, everything that came out of the space race was directed through government funding, because government could afford to take on that risk. And then it slowly bleeds out into the public sphere. But this is the first time that this is being developed outside of government purview, which I think is a little bit scarier.
00:40:22
Speaker
Yeah, I would. Yeah, I broadly agree with that. I do think cloud, fair point, is similar in terms of how it just completely transformed how folk interact with IT, and it was largely privatized in how it was approached. So I think there's some pattern matching that maybe applies there.
00:40:43
Speaker
But you know what was funny about that, though? With cloud, cloud was supposed to replace the DC. But now there's a movement of organizations moving back into brick-and-mortar data centers because they're inherently more secure.
00:40:55
Speaker
And I think what people are waiting on, and I want to know what your thought is on this, is replacing your analysts with AI, because it's just

AI's Impact on Security Analyst Roles

00:41:04
Speaker
not there yet. I just don't see it. Do you think we're even within five years of it?
00:41:09
Speaker
For some analysts, yeah. Yeah. And that's the qualifier. I think that is definitely a trap that always sort of pops up. It's like, oh, we're going to replace the entire SOC with AI.
00:41:22
Speaker
Bro, like, well, what kind of company do you run? What's coming into your SOC? What kind of threat actor do you deal with? What kind of tech do you have on the back end? What does, you know, a SOC 1, SOC 2, SOC 3 from an analyst tiering standpoint mean to you? It's going to be different every time. So to say that
00:41:39
Speaker
you can just wholesale replace that function with AI for anyone, I think is a stupid starting point. um I do think that like commoditized, you know, commoditized functions,
00:41:53
Speaker
like folks that are there to accelerate other people, and that's kind of the only thing that they're contributing, I think they're probably the ones that are most at risk from a job security standpoint. I do think some of those roles, you know, are already in the process of getting replaced.
00:42:08
Speaker
Logically, that kind of creeps its way up the creativity stack as time goes by. So yeah, I mean, that's not really a yes/no question, though it often gets framed as one, if that makes sense. And I do think, you know, for some folk it is kind of existential, and for those folk, I think the best thing to do is
00:42:31
Speaker
get comfortable with the tech, figure out how you're going to use it to upskill yourself, and continue to kind of grow from a career and a value standpoint. Yeah, we might not have time for it tonight, but I'd love to, if I ever get to see you in person again, man, ask you about how a new or interested cybersecurity analyst, a wannabe analyst, should train themselves up. Because I think all of the old methodologies are still SAS 2.0.
00:42:56
Speaker
And I just don't think the old doctrine and the old information that's out there can get you guys hired. Even the old way of approaching a job, you know, like, I'm going to go hard in the paint, get super specialized, because specialization is a super vulnerable position to be in.
00:43:12
Speaker
I think that's right. And I think as well, the refresh cycles on the way that we teach people, I'm not sure they're going to catch up quick enough with that.
00:43:24
Speaker
So, you know, probably the shortest answer to that, George, is kind of learning how to learn as quickly as you can, doing that through community, doing that through, you know, whatever hands-on stuff you can do from a tooling standpoint, wherever your interests might lie.
00:43:37
Speaker
But then connecting with folk and trying to kind of rise as a group. I'm a big believer in that, obviously, with, you know, Bugcrowd's model and how we've kind of taught the crowd over the years and all that kind of stuff. But I think it really does apply here, because it's an adaptive learning model that's meant to be resilient to whatever the environment might throw at you, right? Because, you know, the logic is the crowd that you're in is adapting to that.
00:44:00
Speaker
And if you're plugged into that and a part of that, you'll automatically, to some degree, learn from that and actually get to contribute back as well. Well, that's a brilliant note to end on. We'll end on the high note: resilience, adaptive intelligence. Yeah, perfect. All right. Well, Casey, thanks so much for taking the time and for your attention to our non-softball questions. I know we threw some weird, long-syntax stuff at you, but I appreciate you rolling with that. I was trying to parse those out, but hopefully that went well. No, you're awesome, brother. Thank you.
00:44:34
Speaker
Absolutely.
00:44:39
Speaker
If you like this conversation, share it with friends and subscribe wherever you get your podcasts for a weekly ballistic payload of snark, insights, and laughs. New episodes of Bare Knuckles and Brass Tacks drop every Monday.
00:44:52
Speaker
If you're already subscribed, thank you for your support and your swagger. Please consider leaving a rating or a review. It helps others find the show. We'll catch you next week, but until then, stay real.