The AI-human future: Collapsing silos, leveling up teams, and investing in builders

S3 E51 · Bare Knuckles and Brass Tacks

Rinki Sethi joins the show this week! She's held security leadership roles on just about every possible side, and now she's also a VC investing in the next generation of founders.

George K and George A talk to Rinki about:

  • Real talk and timeline on AI in SecOps
  • How security teams are gonna get reorganized around AI tooling
  • Why most founders fall in love with solutions instead of problems
  • The brutal truth about career development in the AI era

Best quote: "I don't think AI is going to replace people. AI is going to replace people that aren't using AI - and it's already starting to happen."

Rinki breaks down why the finance team beat an engineering team during a hackathon at her portfolio company, and why that feat portends an industry-wide change.

For vendors: Stop trying to impress CISOs with your tech specs. They're drowning in noise and need you to understand the actual problems their teams are facing.

For practitioners: Learn the fundamentals, then use AI to scale. The combo is gonna separate good from great.

————

👊⚡️Support the show!

For as little as $1 a month, you can support the show and get exclusive member benefits, or send a one-time gift!

https://ko-fi.com/bareknucklesbrasstacks

Your contribution covers our hosting fees, helps us make cool events and swag, and it lets us know that what we're doing is of value to you.

We appreciate you!

Transcript
00:00:00
Speaker
The future is going to be agentic. And this is when the organizational silos are going to kind of go away and follow the tech. But I think today you're right, the tech is starting to consolidate. I just heard someone today say that SecOps and Eng, which were distinctly different functions, are starting to consolidate into one, and maybe even AppSec is coming into that picture a little bit.
00:00:26
Speaker
And you're seeing the tech forcing that team, that persona, to become one. And it's still human-driven. But I think as we move more into this agentic future, perhaps those organizational silos are going to be broken down, maybe even between security as a whole and other teams.

The Future of Organizational Silos

00:00:52
Speaker
All right, it's Bare Knuckles and Brass Tacks. This is the technology podcast that tackles the human side of all the things, AI, security, privacy, whatever you want.

Introduction of Rinki Sethi

00:01:03
Speaker
I'm George K. on the vendor side.
00:01:04
Speaker
And I'm George A., Chief Information Security Officer. And today our guest is the incomparable Rinki Sethi. Storied CISO. You've seen her on LinkedIn. She's been at Twitter. She's been at Palo. She's been at Rubrik. She's been at Intuit. She's been at Walmart. She's been everywhere.
00:01:21
Speaker
Client side, product side, vendor side. And it's pretty amazing that we got her. It took a lot of calendar judo, but we're very happy to have her experience on the show.
00:01:32
Speaker
We covered a lot of ground in 30 minutes. It's kind of nuts. Yeah, I got to be real with you, man. I nerded the hell out of this on the AI thing. Yeah, you did. Real expert that lives in San Fran. She's neck deep in the scene.

AI's Role in Security Operations

00:01:45
Speaker
So let's talk about what AI is doing in security operations. Let's talk about the real state of AI in the cyberspace. And let's talk about career implications for people who are trying to break into this damn thing and how AI is going to impact their ability to do so.
00:02:00
Speaker
So I think this is a high value episode. And oh, my God, this was the end of our day, too. Yeah, as much as she's got going on in her past, we really used this interview to focus on the future. So not only AI, not only workforce development, but we also close it out with how she invests, how she looks at the future. It's awesome. Let's turn it over to her.
00:02:22
Speaker
Rinki Sethi, welcome to the show. George, thanks for having me. This has been a long time coming. We had to do a lot of calendar judo, but I'm glad that we made it work and couldn't be more thrilled.
00:02:35
Speaker
You have a storied background. We're not going to go into the origin story because anyone with LinkedIn can scroll through that. But I want to start with that insofar as you have occupied maybe every security role possible: product side, vendor side, client side, back and forth, and now also investing.
00:02:58
Speaker
And instead of looking to the past, I want to use that as a springboard for the future. So, thinking about your career, is there a thread from that collected past experience that is informing how you're seeing the future?
00:03:14
Speaker
Because you write a lot about what's happening, and we're going to get into that, especially at the front lines of AI. But just with the perspective of all the different kinds of seats you have occupied, how is that informing what you look for when you look at either the changing of the practice, the changing of the technology, or, I guess, the changing of controls even?
00:03:36
Speaker
Yeah, it's a great question. And, you know, having spent a little over two decades in cyber now, it's interesting how cybersecurity has just morphed, because I remember early on it wasn't even called cybersecurity. I think

Evolution of Cybersecurity Roles

00:03:50
Speaker
I was an information protection analyst when I first came into the field.
00:03:54
Speaker
And the term CISO didn't really exist outside of banking at the time. Now every company has a CISO or head of cybersecurity or somebody that's focused on cyber, but that wasn't the case then, right? So things have just changed tremendously.
00:04:10
Speaker
And having worked in so many different industries, like utility and retail and financials and social media, the principles of cybersecurity are the same.
00:04:21
Speaker
The speed at which you can adopt new technology, or the type of technology you're even allowed to adopt, can change. And so I think that has given me a really well-rounded, I would say, perspective on innovation and how,
00:04:38
Speaker
as an investor, these companies can sell in the right way to different kinds of industries. But as a practitioner too, just understanding the business model better and knowing how important that is in driving security and driving culture at different companies.
00:04:56
Speaker
Yeah, brilliant. Yeah, I mean, it's really impressive to talk to you because your resume kind of speaks for itself. We kind of pride ourselves on doing the guest research. And I think George was talking to you before me. So he's like, hey, we're going to interview Rinki. And I was like,
00:05:13
Speaker
Oh, cool. I've been seeing her posts everywhere. This is great. So yeah, one CISO to another, thank you for coming on the show. It's quite humbling to have you.
00:05:25
Speaker
So I'm going to pick your brain a little bit, and it might be... I don't want to say contentious, but I really want to pry, because it's rare to get expertise like yours on here. So I'm going to take advantage of that.
00:05:38
Speaker
I want to talk about the real state of AI in security operations for a second, because, you know, in running my own environment and even doing the consulting thing at the firm and a bunch of other stuff I advise on, what I've noticed is the hype cycle is still very much at the precipice of everyone's attention.
00:05:58
Speaker
And the real state of essentially, you know, machine-learning-enhanced automation operations, which is really what we're talking about here with AI and SecOps, it's just not there yet. I mean, people love doing this weird "SOAR is dead" thing, but it's like, no, no, we're still building SOAR use cases. What are you talking about? We're still trying to implement this tech.
00:06:21
Speaker
So, you know, I kind of want to know, in your opinion, what are the realistic timelines for when we can actually create any kind of instantaneous automated response that is correct, like accurate, say above 95% on false positives or false negatives.
00:06:41
Speaker
So, you know, your baseline is complete, you can implement it in the environment. It's probably going to have to be something that's a native instance inside the perimeter.
00:06:53
Speaker
So do you see this happening sometime in the next year, the next five years, the next 10 years? My personal opinion, I think it's more in the four-to-ten-year timeframe before we can actually see this baked out of the box, so we can have these kinds of enhanced capabilities that we can sell at scale.
00:07:13
Speaker
What is your opinion on the real state of AI in SecOps?
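To put rough numbers on the false-positive threshold George raises above, here is a quick back-of-the-envelope sketch in Python. The alert volume and base rate are made-up assumptions, not figures from the episode; the point is only that when real incidents are rare, even a detector that sounds "95% accurate" can still flood an automated response with benign activity.

```python
# Back-of-the-envelope: why "95% accurate" can still bury responders in false positives
# when real incidents are rare. All numbers below are illustrative assumptions.

alerts_per_day = 10_000        # assumed daily alert volume
true_positive_rate = 0.01      # assume 1% of alerts are real incidents
sensitivity = 0.95             # detector catches 95% of real incidents
false_positive_rate = 0.05     # detector wrongly flags 5% of benign alerts

real = alerts_per_day * true_positive_rate            # 100 real incidents
benign = alerts_per_day - real                        # 9,900 benign alerts

flagged_real = real * sensitivity                     # 95 caught
flagged_benign = benign * false_positive_rate         # 495 false alarms

precision = flagged_real / (flagged_real + flagged_benign)
print(f"Alerts flagged for response: {flagged_real + flagged_benign:.0f}")
print(f"Share of flagged alerts that are real: {precision:.1%}")  # roughly 16%
```

Under those assumed numbers, roughly five out of six automated responses would fire on benign activity, which is one way to read why George wants the baseline proven well beyond a headline accuracy figure before handing over instantaneous response.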

AI's Strengths and Limitations in Security

00:07:17
Speaker
Yeah. Okay, I'm a little bit more optimistic than you. And, you know, I may be detached from reality, but I feel like we're two to five years away. Maybe we can break this down and have a good debate or discussion about where I think AI is really delivering value as it relates to SecOps specifically, and where I think it's still struggling or needs time.
00:07:43
Speaker
I think when it comes to noise reduction or prioritization, thinking about lower-value alerts and correlating signals, reducing alert fatigue.
00:07:54
Speaker
That's where I think AI is playing well. I think threat detection and anomaly identification is another area where I think we've used machine learning and AI for a while in SecOps and it's doing well. And then on the automation,
00:08:06
Speaker
in how we do triage and response, SOAR platforms also leverage intelligent playbooks. And that's where AI is, I think, working well. And then finally, threat intel enrichment.
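As a concrete illustration of the noise-reduction and signal-correlation work described above, here is a minimal, hypothetical sketch of the grouping logic such tooling automates: collapsing related alerts by host and time window so an analyst triages one correlated case instead of dozens of raw alerts. The alert fields and the 15-minute window are assumptions invented for the example, not any product's actual schema.

```python
# Minimal sketch of alert correlation: collapse raw alerts into per-host cases
# within a time window, so analysts triage correlated cases instead of raw noise.
# The alert dict shape and the 15-minute window are illustrative assumptions.
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=15)

raw_alerts = [
    {"host": "web-01", "rule": "brute_force_ssh", "ts": datetime(2025, 1, 6, 9, 2)},
    {"host": "web-01", "rule": "new_admin_user",  "ts": datetime(2025, 1, 6, 9, 9)},
    {"host": "db-02",  "rule": "port_scan",       "ts": datetime(2025, 1, 6, 9, 30)},
    {"host": "web-01", "rule": "brute_force_ssh", "ts": datetime(2025, 1, 6, 14, 0)},
]

def correlate(alerts):
    """Group alerts by host, then split each host's alerts into time-windowed cases."""
    by_host = defaultdict(list)
    for alert in sorted(alerts, key=lambda a: a["ts"]):
        by_host[alert["host"]].append(alert)

    cases = []
    for host, items in by_host.items():
        current = [items[0]]
        for alert in items[1:]:
            if alert["ts"] - current[-1]["ts"] <= WINDOW:
                current.append(alert)      # same case: close in time on the same host
            else:
                cases.append((host, current))
                current = [alert]
        cases.append((host, current))
    return cases

for host, case in correlate(raw_alerts):
    rules = sorted({a["rule"] for a in case})
    print(f"{host}: {len(case)} alert(s) -> one case ({', '.join(rules)})")
```

In practice the harder part is the scoring and the environmental context attached to each case; the grouping itself is the easy half.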
00:08:17
Speaker
I think where we're still struggling is, one, contextual understanding. This is where you're hearing the new SecOps players coming in saying, hey,
00:08:29
Speaker
we're going to provide you with all the context. But getting that deep context in your environment, I think we still need to see the truth in that.
00:08:40
Speaker
Trust and explainability, like auditability. I think explainability on models, why they triggered an alert, how they made a decision, that transparency still needs to, I think,
00:08:53
Speaker
be there, and you need very clear explainability around this stuff. There's still the question of, hey, we're heavily audited, how are we going to show that this is doing something similar to what a human would have done? We would have audited that.
00:09:06
Speaker
But then I have to ask, how can you then permit that to securely interact with the data in prod without trying to create some kind of VM environment to replicate it in test? Because I still see massive risks. If I'm taking my security device content and I'm throwing it into an LLM that's connected to the internet, who's to say that through model poisoning or any other type of threat that use case doesn't get stolen, essentially, by a bad actor if they can manage to compromise the model?
00:09:35
Speaker
Again, if you want to say, no, no, let's talk about something else, that's cool. But it's like, hey, these are the things I think about and why I get so apprehensive about taking any of this AI-enhanced SecOps technology seriously.
00:09:50
Speaker
Yeah, because we don't have the security layer around that, or, you know, it's not defined well. So I actually agree with you on that. These are the things where there's going to have to be clear auditability and transparency around how the AI models are working, and that's the part I think is still two to five years out. And maybe you're right, maybe in highly regulated environments we might not be seeing this for 10 years, because it's going to have to be proven again and again.
00:10:18
Speaker
We're going to have to get through all the incidents that are going to happen due to the lack of security around this. So that's where I think there are still issues with AI and where we need to see this go.
00:10:34
Speaker
And I would say the last thing is, I still don't think it's been proven out on false positives and negatives. I don't think that's foolproof yet. In the beginning you're going to see a lot of false reports until the AI gets better, and that's when you're going to start seeing the human feel like, okay, I need to be less in the loop because I trust this more. But we're still far away from that.
00:10:55
Speaker
Yeah, I have a personal hypothesis, which you can pressure test, and it'll lead into the next question. As tooling achieves greater and greater machine speed, my hypothesis is that the problems we will have aren't necessarily on the tech side, but on the process side. Because if you think about the way security teams have been architected to date, it's around human specialization, right? Here's the GRC team, here's the offensive security team, here's SecOps, here's whatever.
00:11:27
Speaker
And the thing that does specialize really, really well is machine learning. So you can have things that will detect, and they may be able to respond very quickly, but then they run into these old team organization structures, which I just call the human wall of meat. It's super slow. So it's like, oh, I can now threat model in two minutes, but it still takes my team three weeks, six weeks to build out.
00:11:57
Speaker
Anyway, I just want to get your thoughts on that part of the equation, because I think we maybe over-index on the tech and we're not thinking about how AI changes how teams are organized.
00:12:07
Speaker
Let me give you a moment there, because I just threw a lot at you, and then I have a follow-up. I agree with you. The tech is consolidating functions, but the people have not oriented around that yet, right? But don't you see a future where, and I'm not saying today, maybe it's 10 years out, but the future is going to be agentic.

The Impact of Technology on Role Consolidation

00:12:29
Speaker
And this is when the organizational silos are going to kind of go away and follow the tech. But I think today you're right, the tech is starting to consolidate. I just heard someone today say that
00:12:42
Speaker
SecOps and Eng, which were distinctly different functions, are starting to consolidate into one, and maybe even AppSec is coming into that picture a little bit.
00:12:53
Speaker
And you're seeing the tech forcing that team, that persona, to become one. And it's still human-driven. But I think as we move more into this agentic future, perhaps those organizational silos are going to be broken down, maybe even between security as a whole and other teams.
00:13:14
Speaker
Good, I do dig that. My shorthand is that when you think about your beginning roles, right, you said specialist, there's often analyst. And I think the paradigm shifts from one of analysis to orchestration, right? Teams become orchestrators of tools talking to other tools. But the real question I was trying to get to is, how would you advise your peers as CISOs to be prepping, working through understanding, one, the tech and its impact on their team structure? I guess what I would dispel for my vendor friends, because I come from the vendor side, is the idea that all the CISOs are at the very cutting edge.
00:13:56
Speaker
That they know all the things and you've got to impress them with all the future casting. But the reality is most CISOs are so busy that they're kind of struggling to keep up with the pace, because they don't have all the time in the world to read arXiv preprints or dig into... George is shaking his head. So in this overwhelming wave, where there is a lot of pressure on security teams to reduce headcount, to automate, to automate with AI, all this, how are you advising peers to, I guess, get baseline knowledge or start to think about how this is coming for their teams?
00:14:33
Speaker
Yeah, the way that I've thought about this, and, you know, you also see resistance from the team, because they're seeing these products that they feel are going to displace them.
00:14:44
Speaker
And because they don't see a future: that is exactly what I do in my role, and now this tool can replace me and scale, and we may not need a human. And I think one of the things that's our responsibility as leaders, and I've been preaching this, I probably heard it from somewhere else and stolen it, is: I don't think AI is going to replace people necessarily. I think AI is going to replace people that aren't using AI.
00:15:07
Speaker
And it's already starting to happen. So I think it's about leveraging AI to be more productive, to be able to do something different and pivot. And I think it's our job as leaders to look at our programs and look at what could be done in different ways; it's almost time to do a start-from-zero exercise: what could a security program look like if I was living with the current tech stack? You kind of have to do that and start training your teams on what they can do with AI instead of being fearful of it, so that they can see, oh my gosh, I can do so many other cool things, so many other fulfilling things; the monotonous part of my role, I don't have to do that anymore. So I think that's a really key, important thing. And I think this applies to every aspect of cybersecurity right now, every aspect of the program could be rethought.
00:15:55
Speaker
And again, I think a lot of the areas are still early. I do think we have to start looking at that pretty quickly. Love it. Yeah. So, you know, it's interesting you say that, because I kind of see things as still being in the hybrid phase. And I use the parallel of training developers.
00:16:16
Speaker
Right. Vibe coding is a thing. But, you know, you'll watch Corporate Bro on Instagram or whatever, and he kind of makes fun of this. Anyone thinks that they can now GPT-code themselves the next great B2B SaaS, and it's literally a running joke.
00:16:33
Speaker
And I think, you know, we still have to stick with the fundamentals. So when a lot of people in the community ask me, how do I get into cyber? I'm like, hey, you've got to learn Wireshark.
00:16:46
Speaker
If you can get a hold of Carbon Black, if you can get a hold of Kali Linux, you can start doing basic use cases. I think there's still value in starting with the fundamental manual skill set so you understand the logic, and then the application of AI as you go. And this is kind of where I want to ask you, because it's like,
00:17:06
Speaker
where do you make that shift to start using it, right? Because you want to learn how to do the actual thing, and then from that you can learn to use AI and the tools out there.
00:17:17
Speaker
What do you actually love doing? What do you love focusing on? Whether it's orchestration, architecture, threat intelligence, whatever it is. And I do think certain fields are going to be passé; basic log monitoring is probably going to go out the window as something a human does.
00:17:32
Speaker
But, you know, incident response is still going to be a human thing. DFIR is still a human thing. So in your opinion, based on all that, how will the practitioner career development lifecycle change because of AI's proliferation? And how does someone, let's say someone either late-career wanting to transition into cyber, or someone getting out of college with no cyber skills at all, how do they get in today versus how they're going to get in in five or ten years, in your opinion?

AI's Influence on Cybersecurity Careers

00:18:03
Speaker
It's a really good question. I think today it's still tough to get in. We see it every day: you and I are seeing college grads struggling to enter cyber, right? More because people are still looking for 10 years of experience, and whether they need that or not is another question, because sometimes having someone new in the field is better. And I think, you know, you can teach somebody a lot of things in cybersecurity. You don't necessarily need someone that's been there for 10 years, but again, it depends on the role and what you're looking for.
00:18:36
Speaker
I do think the barrier to entry, and we've seen this on the attacker side, is getting lower. You can leverage AI to do things. And, you know, you don't even need a sophisticated AI to do phishing attacks; we already know the non-sophisticated ones work, but now you can do it at scale.
00:18:52
Speaker
And going back to your point, I agree with you, the fundamentals still do matter. Knowing how memory works, or how auth flows are structured, or how packets move, is still going to be the difference between good and great.
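For a sense of what "knowing how packets move" can look like as a hands-on fundamental, here is a minimal sketch: the same conversation summary you would eyeball in Wireshark, written by hand. It assumes the scapy library is installed (pip install scapy) and that capture.pcap is a local capture you recorded yourself; both are illustrative assumptions, not anything referenced in the episode.

```python
# Minimal sketch of packet-level fundamentals: read a capture and summarize
# TCP conversations by endpoint pair, the manual version of a Wireshark glance.
from collections import Counter
from scapy.all import rdpcap, IP, TCP

packets = rdpcap("capture.pcap")   # assumed local capture file
conversations = Counter()

for pkt in packets:
    if IP in pkt and TCP in pkt:
        key = (pkt[IP].src, pkt[IP].dst, pkt[TCP].dport)
        conversations[key] += 1

# Top talkers first: who is talking to whom, on which destination port, how much.
for (src, dst, dport), count in conversations.most_common(10):
    print(f"{src} -> {dst}:{dport}  {count} packets")
```

Doing this once by hand makes it much easier to judge whether an AI tool's summary of the same traffic is plausible.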
00:19:07
Speaker
And AI is going to be super powerful, but I don't think it replaces judgment or experience, or the ability to reason about risks and trade-offs and system design, or the experience to influence executives if you're at a different level.
00:19:20
Speaker
I think there's still this fundamental where that experience is going to matter. You can use AI, but still think critically about what is going to find you success.
00:19:33
Speaker
I still think there's a bunch of value in, remember when we used to write detections by hand, you know, you used to write them on the machine? I think there's still going to be a lot of value in that.
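As a small illustration of the hand-written detections she is pointing back to, here is a hypothetical, minimal one: flag a source that racks up repeated failed logins and then a success inside a short window. The log shape and thresholds are invented for the sketch; the point is that the logic stays explicit and auditable, which is what the earlier explainability discussion asks of AI-generated detections too.

```python
# Hand-written detection sketch: repeated failed logins followed by a success
# from the same source within a short window. Event tuples and thresholds are
# illustrative assumptions, not a real product's schema.
from datetime import datetime, timedelta

FAILED_THRESHOLD = 5
WINDOW = timedelta(minutes=10)

# (timestamp, source_ip, user, outcome) - five failures, then a success
events = [
    (datetime(2025, 1, 6, 9, 0, s), "203.0.113.7", "admin", "failure")
    for s in range(0, 50, 10)
] + [(datetime(2025, 1, 6, 9, 1, 30), "203.0.113.7", "admin", "success")]

def detect_brute_force_then_success(events):
    """Yield (source_ip, user, ts) when a success follows >= FAILED_THRESHOLD failures in WINDOW."""
    events = sorted(events)
    for i, (ts, src, user, outcome) in enumerate(events):
        if outcome != "success":
            continue
        recent_failures = sum(
            1 for t, s, u, o in events[:i]
            if s == src and u == user and o == "failure" and ts - t <= WINDOW
        )
        if recent_failures >= FAILED_THRESHOLD:
            yield src, user, ts

for src, user, ts in detect_brute_force_then_success(events):
    print(f"ALERT: possible credential brute force on '{user}' from {src} at {ts}")
```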
00:19:44
Speaker
But I do think, for those of you that learn the basics, now AI is going to help you scale, right? And I'll give you a non-cyber example. I was just talking to a CISO whose company has a really heavy engineering culture, like really top engineering talent.
00:20:03
Speaker
And they did a hackathon. And the team that won was a non-technical team. It was like a finance team or something. And they used three apps, Cursor and two other apps. Was it secure? Was the code really good? Probably not.
00:20:15
Speaker
But did it produce a winning idea? It absolutely did, right? And so it just kind of depends on how you're leveraging AI and what outcome you're looking for.
00:20:26
Speaker
But I think the advantage that college students might have is they're going to be real power users of AI and can get really creative with things. But that doesn't mean you don't need to learn the fundamentals, right? Because you're not going to be able to do it unless you learn the fundamentals. In the same way, I was just talking to someone today and I said it's really helpful in my role that I started off as an engineer and then now I'm in this role.
00:20:49
Speaker
One follow-up on this, forget about George, but based on this, and this is kind of my opinion of what's going to happen as well, in the world of consulting and, you know, independent contractors, independent contributors, and billable rates,
00:21:03
Speaker
I think right now there's a bit of a golden era for folks who are in the IC space, because you can use AI to deliver stuff. And I know a lot of people that still bill eight hours on what's realistically an hour or two of work that they put in.
00:21:16
Speaker
Like car mechanics. Dude, that's what they do. It's a thing that happens right now. My belief is the actual value proposition of an individual consultant is going to come down to their knowledge of the fundamentals along with knowing how to use the AI tools. I think that's what's going to drive high billable rates.
00:21:35
Speaker
I think if you're just rolling in with the vibe-coding type energy, you're probably not going to be able to achieve top-bracket dollars in the consulting space. What's your quick opinion on that?
00:21:49
Speaker
I think I 100% agree with you, full stop. Hey, listeners, if you dig the snark, the stories, and the big swings we take, we'd appreciate your support. You can now become an official supporter of the show. You can send us a one-time gift or sign up as a member to provide ongoing support. Memberships start for as little as one dollar per month. Just follow the link in the show notes.
00:22:16
Speaker
Each membership tier comes with a unique set of benefits, including exclusive discounts to the BKBT swag shop and even advisory services for your team. So really, for less than you'd pay for one cup of coffee per month, you can support the show.
00:22:32
Speaker
It covers our hosting fees, helps us make cool swag, and it lets us know that what we're doing is valuable to you. Many thanks to listener Elizabeth Ramirez for her recent pledge of support. We'd love to have yours too.
00:22:46
Speaker
Now, back to the show.
00:22:50
Speaker
In that case of the finance team, it feels like that is the harbinger of what you were previously talking about, Rinki, of maybe the silos collapsing. So the finance team may have been closer to the business need.
00:23:06
Speaker
And now they had these tools that allowed them to do things that before would have been a conversation over dinner, like, what if we could? And then it'd have to get sent to the product team, and they'd have to build out the stories, and then it gets sent to the devs, you know.
00:23:21
Speaker
There's that. And then I also take your point that if you don't understand the fundamentals, and if you don't have fundamental skills like the ability to influence or communicate, which we talk about a lot on the show, it doesn't do any good to have incredible technical acumen if you can't speak to others or persuade them or convince them, you know.
00:23:40
Speaker
You have to do that work. I mean, you could put it in ChatGPT and say, write this in a more polite fashion, but ultimately you will have to deliver that news. So all of this experience as a practitioner, as an engineer, as a CISO on both sides of the aisle, as it were, product side and client side,
00:24:02
Speaker
I'm very interested in how you take all of this experience because you wear so many hats and how it informs your investment approach,

Rinki Sethi's Investment Approach

00:24:12
Speaker
right? Because it could be a superpower.
00:24:15
Speaker
It could also be sort of paralysis by analysis. But how are you taking in all that signal? And, you know, how do you take that in when you are listening to companies looking for investment?
00:24:27
Speaker
It's so interesting. Because, if anything, on the investment side, what I've learned, and I shouldn't say learned, maybe this is something that's already inside us, but I guess it's been reinforced and put right at the forefront, is that as we're doing our day-to-day job as security practitioners, you kind of get lost in the tech and the team and what the company's outcomes are.
00:24:51
Speaker
But then when I went back to investing and working with founders, you realize this is all a people business. It's all about your conviction in people. And I think as security folks we can read people pretty well, because we're analyzing people too, including the attackers, and we're, maybe in a healthy way, paranoid.
00:25:10
Speaker
But at the end of the day, it comes back to the founders, because the product might just change over time. We've seen companies start in one place and end up doing something different, especially at the early stage.
00:25:22
Speaker
So to me it's: do the founders have a vision? Are they looking into the future? Do they understand? Have they fallen in love with the problem instead of the solution, to a point where they're going to build something really creative? That's, to me, number one.
00:25:35
Speaker
I think I go back to the people because I'm not an expert in all things and I have a lot of blind spots, no question about it. And I like to have people around me that think differently about things and can help me vet things too.
00:25:52
Speaker
And that has really driven me in the last few years to build out community in meaningful ways. I know you guys care about this deeply, and I'm really proud to be a part of the community you guys have built. And I just think going back to the community to say,
00:26:10
Speaker
one, let's understand each other's problems and support each other, because this job is hard as a security practitioner. But also: hey, there's this cool company, what do you think? And sometimes it's, that was not so cool, or, the founder did this one thing, and I'm like, well, they didn't do it to me. Or, the founders seem amazing but the tech isn't, so why don't you push them to pivot? And I think having that different perspective in helping these companies is super important. And so,
00:26:37
Speaker
So obviously I'll analyze a company quickly and say, I think this is a platform, I think this is a feature that could be acquired, and it's interesting. But at the end of the day it's: do I love the founders? Do I like the space they're building in? Can I influence them to pivot if it's heading in the wrong direction?
00:26:52
Speaker
Are they listening to practitioners that are telling them this is a pain point? And are they falling in love with the problem and really wanting to dig in to find the right solution? Yeah. Can you say a little bit more about falling in love with the problem and not the solution? Can we pull on that thread a little?
00:27:09
Speaker
Because I think the general understanding of VCs, unless you're really close to it, is that you're just hunting for good ideas. And you clued in; I've heard many VCs say, I don't really care, everyone has an idea, it's about the people, because they're definitely going to hit a slog, and the question is: are they going to fight each other, or are they going to pivot and be successful on

Keys to Startup Success

00:27:33
Speaker
the way out? But that phrasing that you used about being in love with the problem and not the solution is intriguing to me. And I haven't heard that one yet.
00:27:41
Speaker
Yeah. You know, there's a couple of founders where it was frustrating for me in the moment to even watch. They weren't sure what they were going to build. They talked to me pretty early on and said, here are the three areas that we think have challenges that CISOs want fixed right now.
00:28:01
Speaker
And then they went and talked to, I think, 150 CISOs and security leaders. And I was like, it's been six months, how many more people do you need to talk to? And they're like, we want to understand the problem and why it's a problem, and dig deep, and not just hear from one person or just tech people.
00:28:21
Speaker
We want to go broad and understand before we decide which of these three areas we're going to go build in. And that's what I meant: once they said, okay, here's where we're going to go build, it was really about understanding deeply what the problem is.
00:28:36
Speaker
And they ended up building in the vuln management space, but it was deeply understanding how practitioners in the vuln management space are working right now that makes this such a problem, and what can we actually do to fix it. And they focused a long time on that. Once they said, okay, we now deeply understand this problem, we love what it is, we can now go build a solution, that part they actually did pretty quickly. You know, it's interesting though, I kind of shrugged.
00:29:06
Speaker
You know, I get where you're coming from, especially because I believe you're out in San Fran right now, right? You're in that area. Yeah. For me, though, based on my experience, and obviously I'm more of an East Coast guy, Ontario more than anything, I'm big on: do they have the bona fides? And I think that comes from having a military background. If someone is a co-founder or a founder of a group,
00:29:35
Speaker
and none of them are actually practitioners in the technology space they're trying to build in, it doesn't matter how good it is, the bells and whistles, the hype.
00:29:47
Speaker
I just can't bring myself to believe in it, because I've seen these types of things fail way too many times. And in a past life, it fails and people lose their lives, which, you know, is probably why there's a little bit of trauma to it. But in this space, I don't want to make the business bet on technology that's not backed by experience.
00:30:09
Speaker
And I'd like to get a little bit more of your perspective as to why you would want to believe in organizations like that, founded by people who have no direct security operations experience. They don't do the thing of the problem they're trying to solve.
00:30:30
Speaker
I think, having been an operations engineer and having done the work, or even being an AppSec engineer, I don't think I could build a good product
00:30:45
Speaker
in that space. I actually think you would never want me to build a product, because even though I understood my problem, especially at that time I just would have been terrible. One, I wasn't necessarily the best developer. I wasn't necessarily the best product manager, meaning someone who wants to go and hear people's problems and understand them, right? And how an AppSec engineer works in the Valley versus what an AppSec engineer may be dealing with at a different, much bigger company, like retail or
00:31:21
Speaker
the food industry, it's a little bit different. Yes, the principles are the same, but the problems they're facing and the scale might be different. And I think you need a person that can understand that; it's a different persona, I feel.
00:31:35
Speaker
And then there's the whole go-to-market engine and all of that. As security people, we don't have to sell; you know, we're just doing our work, right? And there are all these components of running a business and starting a business that you have to think about. So I just think building a company is different. That's why I do think that if they're really smart folks, they can sit down with people and say, I want to understand what the problem is, and that might take some time to really deeply understand.
00:32:02
Speaker
But then they're like, OK, I'm a builder, I can quickly go and fix this. Their brains are working differently. Yeah. Yeah, I mean, the anthropologist in me really appreciates that example you brought up, that they were willing to spend a lot of time, as you said, not just understanding the problem that the CISOs saw, but the problem that vuln managers were having in their daily experience, because I find in early stage, not enough attention is given to that UX. You know, I think I've said on the show before, I've seen incredible tech, but it takes nine clicks to do something instead of three.
00:32:38
Speaker
And so if you think about the daily workflow of a security team, you've basically 3x'd their workload. That's not viable. It sounds cool and you can put go-to-market behind it, but ain't no one going to buy that, because it makes their life harder.
00:32:51
Speaker
And so I think UX matters, and I think customer service definitely does too. You know, I've seen in the CISO Society a lot of tech get compared to one another.
00:33:02
Speaker
But I've also seen that they don't really fail on technical stuff. They'll fail because their support tickets aren't answered, or they can't get their account executive online, or the stuff that was promised on the roadmap in three months is still not delivered after eight. It's this squishy human side of solutions delivery. It's not even the product build, really.
00:33:23
Speaker
Totally. And also you have to raise money and be able to work with investors, and there's so much more to building a company. But I do think, to the point, I don't think it has to be a practitioner that goes and builds the company, but I think the close tie back to practitioners is going to be key, right? You cannot build a product unless you understand the problem. And that's what I meant: fall in love with the problem, understand the user base that's having the issue, and
00:33:52
Speaker
get really close with them. And I think there are still a lot of founders that just fall in love with their solution, right? And then it's really hard for them to part ways with it when it's like, okay, it's really not right. And they might go too far down that path, whereas they could have saved that time.
00:34:08
Speaker
Oh, yeah. I think where I'm trying to meet in the middle is, I think in a group of founders, at least someone should be that. So, look at Apple, right?
00:34:22
Speaker
Jobs was the business mind. Wozniak was the CTO, right? And there is no Apple without Wozniak, simple and plain. So I think what's hard is: how then do we
00:34:39
Speaker
encourage more practitioners to actually take the chance and become entrepreneurs, and try to contribute to these teams, and try to actually put their ideas out there? Because I talk to analysts night and day, and they're the ones that really, oftentimes, do see the problem.
00:34:59
Speaker
Not all analysts, yes. I'm lucky to have a very experienced senior team, and everyone sees the bigger picture of the business; that's the culture we built. But when you enable actual operations folks or DFIR folks or just the deep tech folks, and you actually give them the bigger picture of, here's the forest and the trees and the business and here's the state of the industry, and you give them that little bit of understanding, I feel you can teach them how to build those businesses. And I think...
00:35:29
Speaker
I don't know. Do you think the future of being a founder and an entrepreneur is going to come down to having technical chops, or is it just going to be the folks that are wild enough to take a chance and actually try to do the go-to-market journey, try to build an MVP, try to get funding, and all the sleepless nights that come with that?

From Practitioners to Entrepreneurs

00:35:52
Speaker
I think it's a mix, but here's what I think: just now, having invested in companies and seeing the founder journey, you have to be able to attract top talent to go and build what you want to build, and inspire that.
00:36:05
Speaker
And is that going to be a CTO that has this, you know, amazing career building top tech companies? Or is it going to be
00:36:16
Speaker
a Rinki that's been a CISO and has no idea how to lead engineering teams, right? So I think there's that kind of thing. I do think there's a place, and you're seeing CISOs go and do this journey. And I've seen more practitioners go on to say, I'm going to go build because no one's solving this and I feel like there's something unique. And then they are getting help to surround themselves with the people that are going to help them build a successful company, knowing that the idea and having the practitioner experience is just one part of building a company. There's so much more to it.
00:36:46
Speaker
And you have seen folks that have been successful down that path, and folks that have struggled down that path, because it's not easy to build a company. But I also think it can be done in different ways. I joined a company now where I'm the voice of the customer, right, and helping out, and it can be done that way too. So I think there are different models, but I don't think a super-successful company has to come from somebody that's been a practitioner in the space, right?
00:37:14
Speaker
And being a practitioner in the space doesn't mean you know how to build a company. But you see those gems that do come up and you're like, dang, they had both skills, and they had the entrepreneurial side too.
00:37:26
Speaker
Now, having been a practitioner and on the VC side, raising money is super hard. It's not easy, and I have the battle scars from it, right? You learn a lot and you have to pivot. And there are times where you're like, I don't know if I want to do this, right? So it's very different.
00:37:44
Speaker
Well, Rinki, thank you so much for the time. That's a perfect place to wrap. And yeah, we really appreciate you taking the time at the end of your day to join us. No, really, it's an honor to be here with both of you.
00:37:57
Speaker
Pleasure, friend. Take care.
00:38:02
Speaker
If you like this conversation, share it with friends and subscribe wherever you get your podcasts for a weekly ballistic payload of snark, insights, and laughs. New episodes of Bare Knuckles and Brass Tacks drop every Monday.
00:38:15
Speaker
If you're already subscribed, thank you for your support and your swagger. Please consider leaving a rating or a review. It helps others find the show. We'll catch you next week, but until then, stay real.