
Kinder Code Reviews with AI? with Qodo's Nnenna Ndukwe

Hanselminutes with Scott Hanselman

Code reviews are one of the most powerful tools teams have for maintaining quality — but they're also one of the most emotionally charged parts of the development process. With AI coding agents generating more code than ever, the review bottleneck is growing fast. But what if AI-assisted reviews could not only keep up with the volume, but actually be kinder about it? Scott talks with Nnenna Ndukwe, Developer Relations Lead at Qodo, about how AI code review is evolving beyond glorified linting into something that understands context, catches what matters, and delivers feedback developers actually want to read. They explore what happens when the same AI writes and reviews its own code, and whether thoughtful AI review can make code review culture healthier for everyone...not just faster.

Transcript

Importance of Filtering Feedback in Code Reviews

00:00:00
Speaker
And then there's very high-signal feedback that you can get. So how do we make sure that all of that is filtered, so it recalls the relevant information, gives you the level of severity, and surfaces only that very high-signal feedback? Because if not, honestly, it would just be annoying. If you're getting any and all nitpicks, I wouldn't even want to work with a developer who is like that in code review. So why would I want to introduce a tool or a technology that's going to do the same thing to me? Yeah.

Introduction to Nnenna Ndukwe and Her Social Media Presence

00:00:36
Speaker
Hi, friends. I'm Scott Hanselman. This is another episode of Hanselminutes. Today I'm chatting with Nnenna Ndukwe. She's the AI Developer Relations Lead at Qodo, and she's blown up on Twitter. How are you?
00:00:47
Speaker
I'm doing very well. How are you? I'm good. Thanks for hanging out. I've been really enjoying the little snackable videos that you do. You do them mostly on Twitter, but do you do YouTube as well? How many social media networks are you on right now? Right now it's Twitter and LinkedIn, but I'm slowly starting to get into YouTube. Expect a lot more of that this year, 2026. So right now you're the Developer Relations Lead at Qodo, but you have to be an engineer to talk to engineers. How did you start your engineering journey?

Nnenna's Journey from Tanning Consultant to Software Developer

00:01:19
Speaker
First of all, I love that you pointed that out. I think that's super important for DevRel work. I started maybe nine or ten years ago as a software developer. And believe it or not, I got in by teaching myself. Back then there were free tools online like Codecademy and freeCodeCamp. I was working as a tanning consultant in Houston, and in my free time I would just go through these free tutorials. And then I realized, oh, this is an actual career, and solving problems through code is super interesting.
00:01:54
Speaker
And that's when I moved to Boston, fully immersed myself in the tech space, and eventually got a role. And that was all before I ended up studying computer science at Boston University. So I had been doing full-stack software engineering for all of my career before transitioning into DevRel and AI specifically.

Scott's Educational and Career Balance

00:02:17
Speaker
Yeah. One of the things you may not realize is that we're kindred spirits: we both started at community college. It took me 11 years, while working at night, to finish my four-year degree.
00:02:31
Speaker
So I was working in the lab maintaining computers at Portland Community College, got a job, was working full time, but still doing night classes and grinding through stuff. So even though you've been doing this for a decade-plus, you still continued your education.
00:02:47
Speaker
Yeah, it was very difficult to balance both of those, but I'm glad that I initially had the interest, this relentless curiosity to understand what coding and software development was. Having computer science courses I could take at community colleges was an amazing entry point: a bit more structure in how I could learn, plus expertise from professors and mentorship. All of that was an awesome combination. But it was definitely a difficult thing to balance once I was working full time and going to school full time. Yeah, yeah.
00:03:27
Speaker
I was reflecting on this. I've been out of school a little longer than you, but all the languages that I learned in school are dead. Well, C's not dead, but it was C and Windows 3.1 and DOS and Turbo Pascal, stuff like that.
00:03:46
Speaker
What did you learn in school and when you were self-teaching?

School vs Practical Programming Languages

00:03:49
Speaker
Because you're kind of non-denominational now: you do Python, you do React, JavaScript, TypeScript, everything. But I'm curious what you learned in school versus reality versus what you're doing now. Is there a straight line or is it a curvy line?
00:04:02
Speaker
In school, I actually remember one of the first official languages: it was Visual Basic. Yes. Yeah, I knew you were going to be happy with that. That's my jam. So many people in my generation say Visual Basic got us started, because it was the first accessible, you-can-just-do-stuff language.
00:04:20
Speaker
That feeling people have about AI, that you can just do stuff? We felt that about Visual Basic. Literally, it was the first time being exposed to that, and I felt pretty powerful, I would say. And then at Boston University, the big focus was Java, for sure. And I have mixed feelings, I guess, about it.
00:04:44
Speaker
I don't know if I should say that out loud. I know, because we probably have Java friends. We don't want to offend our Java friends. They'll appreciate that. I know, amazing, talented Java friends, like my colleague, a principal architect, a Java guy for 30 years.
00:04:59
Speaker
But still, I think I was comparing it to the ease of use I felt with JavaScript, or how easy it was to get started. When you know nothing, JavaScript felt a bit friendlier than Java when I was learning it in university.

Real-world vs Academic Projects

00:05:15
Speaker
Yeah, I think it's also worth noting, and I think our Java friends would probably agree, that Java as taught in university is not the same as Java in the enterprise. There's a bit of a distance there.
00:05:27
Speaker
Oh, I would like to hear more about that, your opinions on that. Well, I just feel like the stuff I learned was always four years behind. It's JavaBeans and Java for hello world, versus running a large enterprise that needs to scale. I feel like one of the number one missing things in school is scaling stuff.
00:05:46
Speaker
Everything you make in school, as a general rule, is largely a toy, particularly at mid-level schools and community colleges. You're making stuff for yourself. It's not until you get to...
00:05:58
Speaker
fancier schools, and I'm curious if you saw this at Boston University, where you start doing group projects and making something big and scalable. Yeah, group projects definitely made things... you had to grow up, in a way, with more complex problem solving and collaboration on projects. But yeah, I can see what you mean about the difference there. A lot of isolated small projects when you're in your own world, building for yourself. And it's a completely different ballgame at large companies. I feel the same way about startups: working on a very small startup developer team versus thinking for a larger machine at a larger company with a
00:06:41
Speaker
bigger product that brings in millions a year, and how careful you have to be, and the processes put in place to make sure you're shipping quality. I think that is different, and it makes sense why it would be. Yeah.
00:06:57
Speaker
Now, when we start doing group projects... I try to explain this to the young men in my life, my sons, 18 and 20: life is just a big group project, except you don't always get to pick the people on the project. And sometimes that one guy or gal who doesn't do anything comes to all the meetings, just kind of hangs out, and then also gets an A. That happens a lot. That kind of sucks.
00:07:20
Speaker
Did they teach you that in school, or did you just learn that later in life? You know, I was one of those people who, if I noticed someone slacking, said there's no way I'm going to let them negatively impact the overall grade, right? So I would definitely do a lot of their work. Maybe that's not being a good team player. But if I knew there was no way they were going to do it, and they wouldn't have done it anyway, I'm taking on that work.

Personal Nature of Code Reviews and AI's Role

00:07:51
Speaker
I've done a number of podcasts about code reviews, and I feel like code reviews get people tense. It's a moment in a group project where you have to actually look at people's work. And typically code reviews, when I was coming up, were done in a room full of people in front of a whiteboard. We would share our screen and ask, what were you thinking, Nnenna, when you wrote that line of code? Like, what?
00:08:13
Speaker
What was wrong with you when you wrote line 55? It was very personal. But now it's a little more on GitHub, or in GitOps and things like that. What has been your experience, from group projects up through the pre-Git era and now Git code reviews and distributed code reviews, and how does it make you feel?
00:08:32
Speaker
I distinctly remember a similar experience. I don't think a whiteboard was involved, but a big screen share early on in my development career. Code review right there in one room, everyone looking and skimming through. Those were some brutal moments, I'm not going to lie. I think maybe a lot of things have changed since then. So there's some detachment you're able to have with remote work and comments just being on GitHub, instead of everyone being in a room talking about how questionable your code may or may not be. But
00:09:13
Speaker
the personalities can still be strong and show up in those comments on GitHub. So it really depends on who you're working with and what their style is, their preferences, and what everyone cares about.

AI as a Neutral Intermediary in Code Reviews

00:09:26
Speaker
So that's something where now, with AI code review, there's this third party playing a part in the experience. And interestingly, I'm seeing some developers actually
00:09:44
Speaker
prefer to argue with the AI, and maybe fix their work with AI, before another person, another professional, has to come in and review their work. Ideally before a pull request is even live and public. But if it's already public, then there's some fixing up they can do before their colleague jumps in.
00:10:09
Speaker
I remember, in the move from being in school to being at a small company to being at a big company, it's kind of like when you wrote a paper for English and it got returned covered in red ink. And then you have to just go in. But that was very personal, and it felt very opinionated.
00:10:28
Speaker
Because I would be reading what my English teacher thought about my essay. And I'd be like, well, says you. You're more senior and you're fancy, but spelling errors? Yes. Thematically? No, this is good. Code reviews also felt a little personal like that. But I find that with AI code reviews, at least the ones that I do before I put the PR up, it's like, no, we're really just focused on correctness.
00:10:54
Speaker
And I kind of like that. It feels less personal. And I'm saying "spelling" in quotes, because the AI doesn't have a beef with me. It isn't actively trying to destroy me at work because it didn't like me. It just wants to make sure the code is correct.
00:11:10
Speaker
Right. So it sounds like what we're both saying is that some of the downsides of the human collaboration aspect, the variability there, get softened a bit in that experience with

Contextual Importance in AI Code Reviews

00:11:25
Speaker
AI. And it can also be a learning tool for juniors coming up, depending on the type of developer experience the code review can provide and the way in which you engage with it. Those are other elements there.
00:11:40
Speaker
But juniors can potentially learn from a code review experience, and a senior developer can maybe learn how to engage, or which things to focus on, based on the insights that an AI tool can highlight.
00:11:57
Speaker
Yeah. I do not want AIs to hurt people or replace people, but I do like the extra step: the AI code review right before the human looks at it.
00:12:11
Speaker
And I've noticed, though, that it's all about context. And I don't necessarily mean AI context. I mean, why was this done this way? Well, there's someone who works here who wrote it 20 years ago, and it was a good idea back then. That's context. And I don't know if that fits in a context window. And the bigger and more complex the code base, the more context gets missed.
00:12:34
Speaker
So you'll have blinders on, and you'll have a very narrow window in your mind: that line of code sucks. And it's like, well, actually, if you knew the bigger context, you'd understand why that's exactly what needs to happen right now.
00:12:47
Speaker
Exactly. Yeah, having that proper context, no pun intended in my case and yours, definitely helps shape the decisions made in the present about why something is built, the way in which it was built, and whether it should exist at all. And in the AI-specific context,
00:13:11
Speaker
I think there's a really big push and a need for all of that information that could be tribal knowledge: things only the person who's been at the company X number of years, who was in the room during some of those conversations, would know. That information influences the way in which
00:13:33
Speaker
a piece of software is built. I think we're really pushing to codify that information, make it machine readable. Where does it exist?
00:13:44
Speaker
How can we collect it in a way that AI can consume it, traverse it, and ingest it often, in order to influence what's even suggested for code changes or improvements or validation? Now, in your day job, you work at a company called Qodo. It's Q-O-D-O. They have a code review product, and they've got Qodo 2.0 coming out.
00:14:14
Speaker
And it's a very crowded space, right? And I think the question is: is a code review from an AI good? And what's the secret sauce? Because some people listening may have just copy-pasted code into ChatGPT and said, hey, is this good?
00:14:31
Speaker
Or they may go into Claude Code or GitHub Copilot and type slash review. But now there's context engineering; there are complicated multi-level, multi-agent code reviews. Do you think code review is something that's going to be commoditized, where it's not that big a deal, just use whichever one? Or do you think there's secret sauce that companies like Qodo can provide to make it something special?

Qodo's Unique Approach to AI Code Review

00:14:57
Speaker
I think there is a secret sauce. Just being in the weeds and learning so much from the R&D team here, and diving more into what
00:15:08
Speaker
the qualities or components of a good AI code review are, I've realized that there is a secret sauce; there are many secret sauces. Context is one of the things we brought up, and it can empower or influence best practices and architectural decisions across entire code bases and how they all work together. That can influence the value of the insights you get from code review. So context is extremely important and can be a differentiator in the space. But there's another element, and that's the benchmarking around highest precision, highest recall, and constantly improving in that area in a very specialized manner. I think that is what can help with
00:16:00
Speaker
okay, there are a lot of code review tools out there, but some produce a lot of noise, where they call out everything and anything based on the code you give them, or the diff they read through Git. And then there's very high-signal feedback that you can get.
00:16:22
Speaker
So how do we make sure that all of that is filtered, so it recalls the relevant information, gives you the level of severity, and surfaces only that very high-signal feedback? Because if not, honestly, it would just be annoying. If you're getting any and all nitpicks, I wouldn't even want to work with a developer who is like that in code review. So why would I want to introduce a tool or a technology that's going to do the same thing to me? Yeah. You did a blog post a couple of months ago on contextual retrieval, how that's different from just RAG, and how the code is not telling the full story, which I thought was really interesting. If someone says, I'm going to do a code review, and here's the file, or here's the new interface that we're going to review.
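The severity filtering Nnenna describes can be sketched in a few lines. This is purely illustrative and not Qodo's actual implementation: the severity levels, the threshold values, and the `Finding` type are all assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical severity scale; a real tool would define its own taxonomy.
SEVERITY_RANK = {"critical": 3, "high": 2, "medium": 1, "nitpick": 0}

@dataclass
class Finding:
    message: str
    severity: str      # one of SEVERITY_RANK's keys
    confidence: float  # model's confidence in the finding, 0..1

def filter_findings(findings, min_severity="high", min_confidence=0.7):
    """Drop low-severity and low-confidence findings so the reviewer
    only sees feedback worth acting on, most severe first."""
    floor = SEVERITY_RANK[min_severity]
    kept = [f for f in findings
            if SEVERITY_RANK[f.severity] >= floor and f.confidence >= min_confidence]
    return sorted(kept, key=lambda f: SEVERITY_RANK[f.severity], reverse=True)

findings = [
    Finding("Possible SQL injection in query builder", "critical", 0.92),
    Finding("Variable name could be shorter", "nitpick", 0.99),
    Finding("Missing null check on response body", "high", 0.81),
    Finding("Consider extracting a helper", "medium", 0.60),
]
for f in filter_findings(findings):
    print(f"[{f.severity}] {f.message}")
```

The idea is the one from the conversation: rank findings, drop anything below a severity and confidence floor, and surface the most severe items first, so the reviewer never wades through nitpicks.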
00:17:13
Speaker
There's the code, but there might be a whole conversation that happened over in an issue somewhere. There's a whole design document. There are Slack messages. Context is spread all over the company.
00:17:25
Speaker
Yeah, it is. And then there are also best practices in general that are spread out in different areas. It could be comments you might find in some GitHub issues, or comments from past PRs, like: we don't do it that way; here's the actual way we implement it. So if you think about all the different types of context you can find, and all the different places they might exist, the work of converging that into some central plane that can fuel AI code review tools, that's not only going to help the experience of using it on a daily basis, like, okay, this is actually more valuable information. I think that's when you can get into proactive, what I consider real AI,
00:18:25
Speaker
developer experiences. Yeah. Because people say, well, just give it a giant context window, throw everything at it. You have a million tokens; throw it all into the pile. But the AI reasoning layer can only do so much. They call it the needle-in-a-haystack problem. And
00:18:45
Speaker
the thing I think is interesting from your blog post is that there's a context gathering layer that happens well before the AI gets to think about it, because garbage in, garbage out, right?
00:18:57
Speaker
Right, right. There are a few things going on there, and one of them is that if you're worried about the context window, there is an engine running. I think about it like a cron job, right? It's ingesting information on a nightly basis, like a build. And that helps because essentially you don't have to
00:19:25
Speaker
worry about retrieving all of that information in real time. This is information that is baked and ready to be recalled already. And that engine runs on its own. And you can also determine what actually goes into that
00:19:42
Speaker
engine, to fuel or improve the AI itself. So there's a lot going on there that is beyond RAG, I would say, or what people are calling agentic RAG. As we get deeper into the different AI architectural patterns that seem to be more successful now, with more testing and building, it changes. Yeah, it does. Search has always sucked. And I remember, again, I'm of a certain age, so I remember when you had to search for something, you had to use proper case; capital letters mattered. And then we got case-insensitive searches. And then we got fuzzy searches.
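The "cron job" engine described above, ingesting context ahead of time so retrieval at review time only reads pre-baked data, might be sketched like this. Everything here is hypothetical: the class and method names are invented, and the hash-based `embed` function is a stand-in for a real embedding model, so it carries no actual semantic meaning.

```python
import hashlib

def embed(text: str) -> list[float]:
    # Stand-in embedding: deterministic but NOT semantic.
    # A real system would call an embedding model here.
    digest = hashlib.sha256(text.encode()).digest()
    return [b / 255 for b in digest[:8]]

class ContextIndex:
    def __init__(self):
        self.entries = []  # (source, text, vector) tuples

    def ingest(self, source: str, documents: list[str]):
        """Nightly job: bake documents into the index ahead of time."""
        for doc in documents:
            self.entries.append((source, doc, embed(doc)))

    def recall(self, query: str, k: int = 2):
        """Review time: recall the closest pre-baked context; no live fetch."""
        q = embed(query)
        def score(entry):
            _, _, v = entry
            return sum(a * b for a, b in zip(q, v))  # dot-product similarity
        ranked = sorted(self.entries, key=score, reverse=True)
        return [(source, text) for source, text, _ in ranked[:k]]

index = ContextIndex()
index.ingest("github_issues", ["We retry failed webhooks three times"])
index.ingest("design_docs", ["The auth service must stay stateless"])
hits = index.recall("how do we handle webhook failures?")
```

A real pipeline would ingest GitHub issues, PR comments, design docs, and Slack threads nightly, and `recall` would hit a vector database rather than a linear scan; the point is only that the expensive gathering happens before review time, not during it.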
00:20:24
Speaker
And now we have AI searches, where I can say, what was that thing I was talking to Nnenna about? I think it was last week on Slack. Could have been the week before. We were talking about design. You can give a
00:20:36
Speaker
long, yappy, what was that thing with that person, with that guy, he was in this movie with this... and it'll know. And that's where augmentation feels cool. Wow, now I have an exoskeleton for my own brain.
00:20:51
Speaker
So if that context gathering for a code base gets not just commit history, which is basic, but test coverage, conversations, discussions about the feature, a design document from a year ago, and then gives the right amount of context, you can answer those super vague questions.
00:21:09
Speaker
Honestly, I feel the same way as you. I'm still blown away that I can search something and an AI search tool can
00:21:22
Speaker
bring up the most relevant or most closely matching information without even using the exact keywords that match what I was talking about: that semantic layer. I think it's absolutely fascinating, but that is the present. And I think it's the future of what real AI
00:21:47
Speaker
could be and is intended to be for everyday users. Yeah, yeah. So why did you join a code review company? Was it exciting to get into this space? Because you're in the thick of it. I watch you doing videos basically every single day, talking about how you can make people's lives better. And it's a really advanced system. I've been going through the Qodo documentation and learning about all the different contexts. I was just thinking about context generally.
00:22:15
Speaker
But there's the semantic context, there's the temporal context, context over time, and there's the architectural context. And the bigger the code base, the more this matters, because your code base is very likely much larger than your context window.
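Those kinds of context eventually have to be squeezed into a prompt that is much smaller than the code base. Here is a hedged sketch of what that packing step could look like: the three layer names come from the conversation, but the priority order, the rough four-characters-per-token heuristic, and all function names are assumptions made for illustration.

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: about four characters per token.
    return max(1, len(text) // 4)

def assemble_context(layers: dict[str, list[str]], budget_tokens: int) -> str:
    """Pack architectural, temporal, and semantic context into a prompt,
    highest-priority layer first, stopping when the budget is spent."""
    parts, used = [], 0
    for layer in ("architectural", "temporal", "semantic"):
        for snippet in layers.get(layer, []):
            cost = estimate_tokens(snippet)
            if used + cost > budget_tokens:
                return "\n".join(parts)
            parts.append(f"[{layer}] {snippet}")
            used += cost
    return "\n".join(parts)

context = assemble_context(
    {
        "architectural": ["Payments module must not import from the web layer."],
        "temporal": ["This retry loop replaced a queue in 2021 for latency."],
        "semantic": ["get_user() callers assume a cached result."],
    },
    budget_tokens=50,
)
```

Prioritizing architectural context first is just one plausible design choice; a production system would score snippets by relevance to the diff under review rather than using a fixed layer order.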
00:22:29
Speaker
Exactly. There's something about AI for software development, for the software development lifecycle, that, how do I say this, has me in a chokehold. That's the way to put it.
00:22:43
Speaker
I'm so fascinated by structured ways to integrate AI into software developers' experience,

Nnenna's Fascination with AI in Software Development

00:22:52
Speaker
to improve their workflow, to improve the quality, or even maintain the quality they want to maintain with their software, being able to ship it, and maybe do it a bit faster.
00:23:04
Speaker
As we know, when you look at the entire lifecycle, there are many stages. And that means there are many opportunities to peel back the layers, get into the weeds, and find out where the wins are for workflows with AI, where you can improve them. And that's ultimately what drew me to Qodo: not only is it a tool that's meant to enable developers for that, but I think there's so much more to be said about practical implementations. This is not about, like,
00:23:41
Speaker
oh, AI for software development, this is a thing you need to be doing. We hear this message all the time. And when I speak with engineering managers and we have talks and things like that, they're like, that's awesome. Yes, we should be using AI.
00:23:55
Speaker
But how do you actually roll this out? How do you actually adopt it? Those are the real important questions, I think. So exposure to tools is awesome, but being able to walk through how you can actually use it, from an individual to an entire engineering organization, that is so exciting to me.

AI Integration in Azure DevOps for Enterprises

00:24:17
Speaker
Now, I see a lot of code review tools built around GitHub, and everyone talks in a very GitHub-centric, GitLab-centric way, all the Git places. But you also work in Azure DevOps, which I thought was pretty cool: you've got code quality and security reviews all in AzDO. And people maybe don't know this, I don't talk about it enough, but most of my main sites all run in Azure DevOps. Like this podcast: it deploys and runs and is managed out of DevOps.
00:24:48
Speaker
I love to hear it. Yeah, the more we spoke with engineers and engineering leaders about Azure DevOps, the more it was like, wait, this is a missed opportunity. These are folks at large enterprises who need the same kind of support to leverage AI tools that others have on different infrastructure. Why can't they be supported too?
00:25:16
Speaker
And that's when the engineers here at Qodo got to work. It took some months of building that out to make sure you get the same experience on Azure DevOps that you would on GitHub, and it was worth it. I'm really excited about that, because, like we said, it's one thing to work on the easiest-to-access tools or the tools most people are using. But what if, on the job, these particular tools or something else is what you use? Well, we want to be able to support that.
00:25:54
Speaker
Now, we mentioned Twitter, and there are aspects of Twitter that are just an absolute dumpster fire. I see you on LinkedIn as well, and I also see you not just talking about AI tools on Twitter, but also talking about how AI companies are built around business incentives and have really high valuations. It's not the AI that's dangerous; it's the person holding the AI and pounding you on the head with it.
00:26:17
Speaker
How do you find those kinds of takes on Twitter, when everyone is so, for lack of a better word, Crypto Bro about their excitement around AI?
00:26:28
Speaker
You know, I try to be as balanced and as pragmatic about this as possible. I'm so excited about the positive potential, what AI can do for society. And I'm also very well aware that there is nothing new under the sun, and that humans are terribly predictable at a much higher, macro level.
00:26:57
Speaker
And that means that when you have a tool, which is what AI is, just a tool, you can choose to leverage it for good or leverage it for bad. So I never, ever want to dismiss the power of people leveraging a tool in ways that are bad for society.
00:27:18
Speaker
Yeah. I mean, it is a power tool, right? You can take a chainsaw and chop down trees, and you can chop down people. You have to treat these things with respect, and you need to know which is the pointy end of the tool so that people don't get hurt. It is challenging, and sometimes it feels helpless. It's like, hey, I can show you how things can be better
00:27:39
Speaker
if you use this tool. You know, I'm off, quote unquote, selling my things in my day job, talking about Copilot CLI and all these different cool models like Opus and stuff like that. But I'm also acknowledging that people are afraid.
00:27:53
Speaker
And it's like, I don't want anyone to lose their job. I want people to be excited about how this will make your job suck less, and how you can do stuff you couldn't do before, you know? Right, right. There's a general fear, I think, not just among developers but among so many folks: is this going to be a tool that replaces me? And my personal mindset, really, is that because I'm really interested in emerging technology, and because I like to be forward thinking about where I see myself in two years or five years, I'm constantly thinking about how I can make sure I'm getting ahead of things. And what I would love to see is more
00:28:37
Speaker
folks start to do that self-audit. If there is fear, you can transform that energy of fear to figure out: how do I want to get ahead, by educating myself, or empowering myself, or considering a couple of different career trajectories? Maybe that's not the most positive answer, but it is a way to empower yourself by thinking it through.
00:29:05
Speaker
Yeah, it is challenging, because you have to contextualize how you exist in this time, and what you do to fight against the things that suck, but also to promote the things that don't suck. Right. Yeah.
00:29:16
Speaker
Do you feel like you're doing a great job of that with the work that you do? I always say that if I work at a company, the company may not be perfect, but at least it'll be better with me there than if I wasn't there.
00:29:29
Speaker
And the bigger the company, the more likely that company does something dumb or problematic. But at the same time, I can influence things and be a lever for positivity and change on the inside versus the outside.
00:29:43
Speaker
And then, you know, when it doesn't work anymore, I'll go and teach high school science. I love that. Absolutely. That was a perfect, conclusive wisdom nugget right there. Well, that's very kind. The show was about you, but I appreciate the compliment. Well, thank you so much for hanging out with me. Folks can check out Qodo, Q-O-D-O dot AI. You can have AI-powered code reviews with the number one AI code review agent. This is not a sponsored show; this is just a really cool exploration with Nnenna and the work that she and the folks over at Qodo are doing. So check it out, and you can check her out everywhere
00:30:21
Speaker
that social media exists. Thank you so much, Nnenna Ndukwe, for hanging out with me today. Thank you. This has been another episode of Hanselminutes, and we'll see you again next week.