
Rebuilding a YC Real Estate Tech Stack from the Ground Up | Ep. 11

Tern Stories
49 Plays · 10 months ago

In this episode of Tern Stories, I dive deep into the experiences of Bobby Grayson, the first engineer at Quantierra, a YC-backed startup.

Bobby shares his journey from the chaos of WeWork to the unique challenges of a small startup, where he faced a messy codebase and low reliability. 

We discuss the impressive app he helped build, which aggregates vast amounts of data from New York City, and how he transformed the system to better support users' workflows.

If you're curious about the intersection of technology and real estate, or if you're an engineer looking for inspiration on tackling complex projects, this episode is a must-listen. Join us as we explore the future of software development and the exciting possibilities that lie ahead.

Connect with Bobby! ➡️ https://bsky.app/profile/bobbby.online

Get Tern Stories in your inbox! ➡️  https://tern.sh/youtube

Transcript

From WeWork to Quantierra: An Unconventional Journey

00:00:00
Speaker
We are talking about the most atypical company I've ever worked at, and I've worked at WeWork. What stuck out to you about the state of the app? They were all information hoarders. Their ultimate brokerage tool was the god that is email, right? And the utility oftentimes became: okay, send me an email.
00:00:16
Speaker
What was the tech you were dropped into actually doing when you showed up? A high-level view of it would be a series of data input tasks that would grab a ton of stuff from New York City open data and some APIs, and ultimately assemble one God table. And that God table knew just about everything you could know about every single parcel in New York City. That is an extremely impressive app. I've worked for vendors for too long to not be on the side of: you should probably buy software to solve your problems, not build it.
00:00:45
Speaker
What if your users didn't want a dashboard? What if they just needed the right email at the right time with the right data in it? In this episode of Tern Stories, I talked to Bobby Grayson. He was the first engineer at Quantierra, a YC startup.
00:00:57
Speaker
And he walked into a thriving business but a mess of a codebase: low reliability, huge data sets. So we talk about what he did and how he rebuilt the system so it actually supported users' workflows, but we also talk about what it would look like to do this in the future.
00:01:11
Speaker
It's 2025. Would you even build this product today? Or would you just expose the data layer as a bunch of MCPs over a tight core? Maybe. Let's dive in.

Joining Quantierra: First Impressions and Challenges

00:01:20
Speaker
I've got Bobby Grayson here.
00:01:22
Speaker
Bobby has been an engineer at a ton of interesting companies. He's currently a senior software engineer at Pepsi. He's worked at WeWork. But today we're going to talk about his experiences as first engineer of a YC-backed company.
00:01:37
Speaker
Welcome to the show. Thanks for being here. Hi, how's it going? Yeah, just finishing up Monday and here to talk some shop. We're here for a special nighttime edition of Tern Stories.
00:01:50
Speaker
Cool. So tell me a little bit. You were telling me before the show about how you'd gone from one of the bigger companies out there, WeWork, surviving the chaos there, to jumping into an extremely tiny startup.
00:02:06
Speaker
Tell me about how you landed there and what you were seeing in that company. A friend I met at a dinner party. I wanted out. The money sounded all right.
00:02:19
Speaker
Interesting problems to solve. And you've got to remember it was 2019; money was flowing. It was not hard to find good work.
00:02:31
Speaker
We made sure that the folks on our team were going to land in pretty favorable places. Most of the folks on my team moved on to the likes of GitHub, Spotify, Google: very favorable places.
00:02:44
Speaker
For people leaving WeWork thinking it might be a trash fire, everybody left in pretty good condition. And, you know, I thought it would be cool to have that feather in my cap, to go and join a YC firm as early-stage as this was.
00:02:57
Speaker
It was unique because they were kind of in a position of distress, and I excel with a gun to my head. I have a consulting background, coming in to sort of work on businesses in weird situations, exigent circumstances. I thrive in that kind of environment, and I really liked them, and was just like, yeah, let's try it.

Innovating Real Estate with Technology

00:03:21
Speaker
And, you know, we had things arranged where, if we could build this thing, there could be real upside. And yeah, so we dove in.
00:03:32
Speaker
The beginnings of it felt relatively low stakes too, because when you go into something that's already a bit of a mess, what are you going to do, make it worse? For me, at least for my way of working and the person that I am, that is not something that will make me worry.
00:03:57
Speaker
So, yeah. So paint the picture for me. You were first employee there, a couple of founders? I was the first employee, really the first engineer. The founder had built a good bit. And so we were bringing in myself and, at the time, my engineering manager at WeWork; he wanted to get back to being an IC.
00:04:15
Speaker
And so we were both coming in kind of as equals, from him having been my manager before. And we were just like, all right, let's just be app engineers and learn this business and kind of figure out what we need to build.
00:04:28
Speaker
Cool. So what was the business doing at the time, and what was the setup? Fundamentally, brokerage of New York City real estate: finding property for folks who are looking to accomplish something. And oftentimes those goals are complex.
00:04:44
Speaker
You're not going to just go and find a listing. This is very, very specific acquisitions, and working to be able to source and develop deals for that. A reasonable business model.
00:04:56
Speaker
So kind of trying to function as a different model of brokerage than what you would typically expect, working with this technology-first model rather than being deep-seated in the ways of just wheeling and dealing and trying to find people who know each other and might do business; trying to find real value adds that might not have been possible otherwise. Yeah.
00:05:19
Speaker
I didn't put together until just now, despite you having told me I think three times, that going from WeWork to this company... those are similar industries.
00:05:31
Speaker
Yeah, it was. And it was funny, because I was getting exposure to the real estate industry not so much because of WeWork as because I was getting really involved with my local community board, CB3 in Manhattan, with some of the yes-in-my-backyard movement stuff.
00:05:54
Speaker
And so I had gotten sucked up into the disturbing local politics of New York City zoning and land-use policy. And this was explicitly around some folks who were in those circles, and so that had kind of been where the interest and overlap came from. And there was obviously some common interest there as soon as we joined with everyone. Right before I joined, they had published a big thing with the New York Times on how more than half the property in Manhattan today would be illegal to build under current zoning code.
00:06:33
Speaker
Really? Yes,

The Role of Data in Real Estate Brokerage

00:06:35
Speaker
I can send you the Times link. That's wild. Yeah, they did the whole study.
00:06:44
Speaker
It was great.
00:06:48
Speaker
Got it. But yeah, that was quite entertaining to find out. They were really good at finding esoterica like that to dig in on.
00:07:04
Speaker
Finding value where other people just hadn't realized it was there. Yeah, absolutely. And this is a real and physical corner of the tech space, right? This is something where, historically, you're not disrupting an industry that already has a pile of competitors and entrants.
00:07:25
Speaker
You're disrupting something where there's still a lot of manual work. What was the tech you were dropped into actually doing when you showed up? A high-level view of it would be a series of data input tasks that would grab a ton of stuff from New York City open data and some APIs
00:07:47
Speaker
and write all of that to a series of tables. Another series of rake tasks would then run and, in a chain of 150-some materialized views, ultimately assemble one God table.
00:08:03
Speaker
And that God table knew just about everything you could know about every single parcel in New York City. That is an extremely impressive app for just a couple of people. It's very cool. And it also could go so far as to find links in other ways; when we built that out in-house, that was a custom solution that myself and Adam and the founders came up with as we dug into the data further. So there was some interesting and novel stuff there as well.
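The pipeline shape Bobby describes can be sketched in a few lines. This is an invented illustration, not the actual Quantierra code: import tasks land raw NYC open data in base tables, then a chain of materialized views (all names hypothetical) rolls everything up into one "God table" keyed by parcel. The views must refresh in dependency order, upstream first, or a downstream view reads stale data.

```ruby
# Hypothetical view chain, upstream dependencies first.
VIEW_CHAIN = %w[
  mv_noise_violations
  mv_tax_lots
  mv_zoning
  mv_parcel_god_table
].freeze

# Refresh each materialized view in order; `conn` is anything that
# responds to #execute (e.g. an ActiveRecord connection).
def refresh_chain(conn, views = VIEW_CHAIN)
  views.map { |v| conn.execute("REFRESH MATERIALIZED VIEW #{v};") }
end
```

A rake task wrapping `refresh_chain` would give the same "run the imports, then rebuild the rollup" workflow described above.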
00:08:30
Speaker
So there were a lot of really cool problems still to solve. And all of it fundamentally boiled down to data flows. What we ended up doing, as we built up these tests to kind of get our feet in there, is we moved towards only building new application code that would basically be a data engine: a sane way to make a new source and have an adapter for it. So there'd be generators and adapters.
00:08:59
Speaker
You generate... if you're an Elixir user, anyone in that world, it's super similar to declaring your domain in Ash. We would declare a domain by having a data source represented by something, say, like a noise violation on a property.
00:09:15
Speaker
So you're looking up all the noise violations, you give all the attributes that are there, their types, and then you say how it should ultimately tie in to these versions: this is how if it's a condo, this is what if it's a building, this is what if it's commercial, and so on.
00:09:30
Speaker
Get all that mapped out, and then, from a big classification object, be able to run all those imports. And from an object that used an Active Record-style hierarchy, have an ultimate collection assembling how the data needed to be built up, as represented by that really complex chain of views.
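A minimal sketch of that declare-a-source pattern, with invented names: each feed is declared once with its attributes, their types, and the key that ties a row back to a parcel, and a generic importer consumes the declaration. None of this is the real Quantierra code; it just shows the shape.

```ruby
require "date"

# One declaration per NYC open-data feed: attribute types plus the
# key (NYC's borough-block-lot, "BBL") that ties rows to a parcel.
class DataSource
  attr_reader :name, :attributes, :parcel_key

  def initialize(name, attributes:, parcel_key:)
    @name, @attributes, @parcel_key = name, attributes, parcel_key
  end

  # Coerce one raw row (string keys, string values) into typed values,
  # keeping only the declared fields.
  def cast(row)
    attributes.to_h { |field, type| [field, coerce(row[field.to_s], type)] }
  end

  private

  def coerce(value, type)
    case type
    when :integer then Integer(value)
    when :date    then Date.parse(value)
    else value.to_s
    end
  end
end

NOISE_VIOLATIONS = DataSource.new(
  :noise_violations,
  attributes: { bbl: :integer, complaint: :string, created: :date },
  parcel_key: :bbl
)
```

A generic import runner then only needs the declaration: fetch rows, `cast` each, and upsert by `parcel_key`.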
00:09:48
Speaker
So we were able to work the complexity out towards the edges by slowly building up this functional core that we were able to test piece by piece, replacing the old pieces.
00:09:58
Speaker
And we were able to test the old ones against the new version that's using this adapter: if we drop and rebuild this materialized view, are these 50 queries getting the exact same results?
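That drop-and-rebuild equivalence check can be sketched as a tiny harness: run the same query templates against the legacy view and the rebuilt one, and only retire the old path when every result matches. The query runner and view names here are invented for illustration.

```ruby
# `run_query` is any callable that executes SQL and returns results.
# Each template contains a %{view} placeholder so the identical query
# runs against both the old and the new materialized view.
def views_equivalent?(run_query, queries, old_view:, new_view:)
  queries.all? do |template|
    run_query.call(format(template, view: old_view)) ==
      run_query.call(format(template, view: new_view))
  end
end
```

Only when all 50 comparison queries agree do you swap the adapter-backed view in and delete the legacy one.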
00:10:09
Speaker
And if they are, great, move on. Got it. So I guess, at a high level, what was your mandate when you joined this company?
00:10:19
Speaker
Oh, our job was to help Quantierra make money. And so we were heavily thinking about what we could build that would help source more deals or find more value that they would be able to work with.
00:10:35
Speaker
So sometimes that was getting data to support a deal. Sometimes that was thinking about the product and how we could build out something that would be able to potentially service another brokerage. Sometimes that was just speeding up data processes we had so that we could get answers on things in an instant and look really smart. Yeah.
00:10:51
Speaker
Other times that was just making it so the systems would keep running for years if we needed them to. A lot of this was automating our own jobs. We pretty much fired ourselves after two years.
00:11:03
Speaker
Got it. So when you think about sort of breaking that down... I know a lot of engineers who'd be really uncomfortable to have been put in that position.
00:11:15
Speaker
What did you look at when you first dropped in? What was important to you to figure out before you started? Oh, I just tried to figure out where the bodies were buried. Because it's sort of like the approach you take with, what is it, TF-IDF: term frequency, inverse document frequency.
00:11:38
Speaker
The least-said words often have the most meaning. The code that's been touched the least, but is still linked to a database table that is 150 gigabytes of join records? You might want to know a little about that one. One of the problems we hit at this company was the maximum ID for 32-bit. Oh no. That's a problem YouTube hit once, so everyone knew what it was, but we hit it somewhat earlier than that in our company's life cycle.
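The TF-IDF idea Bobby name-checks fits in a few lines: a term's weight rises with its frequency in one document and falls with how many documents contain it, so rarely used terms carry the most signal, just like rarely touched code attached to a huge table. Documents here are plain arrays of words.

```ruby
# tf  = how often `term` appears in this document
# idf = log(total documents / documents containing the term)
# A term that appears everywhere gets idf = log(1) = 0: no signal.
def tf_idf(term, doc, corpus)
  tf = doc.count(term).fdiv(doc.size)
  df = corpus.count { |d| d.include?(term) }
  return 0.0 if df.zero?
  tf * Math.log(corpus.size.fdiv(df))
end
```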
00:12:14
Speaker
Yeah, I mean, like we got to 2 billion join records on this one table like six months in.

AI and Database Challenges: Past and Present

00:12:19
Speaker
And then we were like, wait, we need more. What do we do? And so, you know, that migration process took one of us a month and a half. It would be a lot easier now with AI tooling and a lot of other stuff. But yeah.
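The ceiling they hit is the signed 32-bit integer max that a default Postgres `integer` primary key tops out at. A hedged sketch of the kind of fix involved, with invented table and column names: emit DDL to widen the ID column, and every foreign key referencing it, to `bigint`.

```ruby
# 2_147_483_647: the last id a signed int4 column can hold.
INT4_MAX = 2**31 - 1

# Generate the ALTER statements for one table's id-like columns.
def widen_to_bigint(table, columns)
  columns.map do |col|
    "ALTER TABLE #{table} ALTER COLUMN #{col} TYPE bigint;"
  end
end
```

In practice, on a table with two billion rows this type change rewrites the table under a lock, which is part of why the real migration took a month and a half; a common strategy is adding a shadow bigint column and backfilling in batches before swapping it in.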
00:12:35
Speaker
I think that kind of brings us to: what were we building? We were building an interface that would allow really smart people to do their thing faster. And the way to do that was by watching them.
00:12:46
Speaker
This absolutely would have been a job that makes a lot of engineers uncomfortable. We came into it knowing that it would be atypical. We came into it wanting an atypical job. Anyone else probably would have been set up for failure.
00:13:01
Speaker
Got it. So how did you get that information? Did you physically go out with customers and sit down and figure out what they were doing? Yeah, the brokers worked in our office. So it was just like, go sit next to Jay.
00:13:13
Speaker
And we would get on calls. We would really just watch how they were developing the business as a whole. There would be mandates that would come through, and it would be like, how do we get more of this type of inventory that we could possibly find?
00:13:29
Speaker
And the answer is more data. So you just kind of get clever and figure out certain things that would point at it. I'm trying to think of a parallel example that would be reasonable: you can get a feel for how active a neighborhood is by checking how many lights are on in the windows after a certain hour, right? What's the equivalent of that in some New York City open data you can go and find, that might not be as obvious if you were just doing a straight by-the-numbers evaluation with something like a rep?
00:14:03
Speaker
Yeah, that makes sense. As you're sitting there watching them do their jobs, what stuck out to you about the state of the app?
00:14:16
Speaker
Oh, I mean, they were all information hoarders. Their ultimate brokerage tool was the god that is email, right? So the state of the app wasn't really a big thing. It was: what is there? Is there a path to utility?
00:14:34
Speaker
And the utility oftentimes became: okay, send me an email if these magic conditions happen. I need to know.
00:14:46
Speaker
I want to get an email about it. Yeah. And, you know, at Pepsi today, that is a non-trivial amount of what I have to build. And I think it's a pretty universal need for people in businesses, in many, many forms.
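The "email me when these magic conditions happen" utility reduces to a small pattern: saved alerts are predicates over incoming records, and a dispatcher fires a delivery callback on each match. All names and fields below are invented for illustration.

```ruby
# An alert pairs a recipient with a predicate over a record hash.
Alert = Struct.new(:recipient, :predicate)

# Check every alert against one incoming record; `deliver` is any
# callable (an email sender, a Knock-style notification API, etc.).
def dispatch(alerts, record, deliver)
  alerts.each do |alert|
    deliver.call(alert.recipient, record) if alert.predicate.call(record)
  end
end

# Example "magic condition": Manhattan parcels with 4+ violations.
ALERTS = [
  Alert.new("jay@example.com",
            ->(r) { r[:borough] == "Manhattan" && r[:violations] > 3 })
].freeze
```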
00:14:59
Speaker
Something as simple as little notifications, right? If you want to talk about great little New York City startups, look at Knock. They just do notifications. It's another Elixir company. You can plug that in and have many, many customers doing millions of things with something that simple and small.
00:15:23
Speaker
So finding those little niche services is always great. Absolutely. I think that one of the conceits of every product is that your product is the most important thing to your users.
00:15:34
Speaker
And that's almost never true unless maybe you work on email. Yeah, in this in this case, it's like there are 50 other factors and the technology is the core of like how the business can do a lot of things. But but like there there's something interesting here, which is the like technology can be a driver of like behavior and not just a product.
00:15:55
Speaker
It can be a driver of the function and form of how a business operates. And you end up at this point where you think: what would I offer if I was building something in that space today? It would completely be a conversational LLM interface.
00:16:14
Speaker
And I would be building MCPs over top of this Postgres server. There wouldn't even be an application layer anymore. If I had this kind of open set of problems right now, where you've got a bunch of sources and you want to aggregate some stuff into a nice view for somebody,
00:16:30
Speaker
why not do it inside the robot that can magically draw open a web browser and do 50 other things? It's a lot more powerful than just building a web app. And it's also not an agent or the other stuff that people think about. I think this is a classification of application that people are kind of overlooking right now, and it will become really, really popular once remote MCP servers have more prevalence.
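To make the "MCPs over a tight Postgres core" idea concrete, here is a toy dispatcher, emphatically not the real MCP protocol or any SDK: it exposes a whitelist of named, parameterized, read-only queries, which is the shape of "tool" a conversational client could call. All names are invented.

```ruby
# Whitelisted, parameterized queries; the model can only pick a name
# and supply arguments, never send free-form SQL.
TOOLS = {
  "parcel_summary"   => "SELECT * FROM parcel_god_table WHERE bbl = $1",
  "noise_violations" => "SELECT * FROM violations WHERE bbl = $1",
}.freeze

# `run_query` is any callable that executes parameterized SQL.
def handle_tool_call(name, args, run_query)
  sql = TOOLS[name]
  return { error: "unknown tool: #{name}" } unless sql
  { rows: run_query.call(sql, args) }
end
```

The tight core stays deterministic and auditable; only the conversational layer on top varies.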
00:16:57
Speaker
But that's a whole other topic. Yeah, I want to come back to that, though, because I do think it's really interesting. You were telling me a little bit about the work that you had to do on this app just to get it functional, and how, when you came in, there were no tests.

AI Strategies for Code Development

00:17:13
Speaker
There was, you know... what? Oh, yeah. And now I have absolute strategies, like in the talk on AI-assisted development that I recently gave, not at Empire City Elixir, at Gig City Elixir in Chattanooga.
00:17:30
Speaker
The fundamental approach here is that I use what I call multi-phase prompt augmentation. That's just my name for it; this isn't something academic.
00:17:45
Speaker
What I do is start off with a broad description of my goal, and then I feed that into Claude Opus 4 and say: this is a broad description of my goal; I need to create implementation specifications so that I can feed those to Claude Sonnet 4 and have it go about actually building this for me.
00:18:05
Speaker
Then I have it make those specs, and I go through and redline them. And then I'll give that as an actual implementation plan to go and build. When I'm doing that, I'll have multiple instances of my editor open, or if I were using Claude Code or whatever, you could pick your tool.
00:18:22
Speaker
But I use separate Git worktrees so that you can have these multiple tasks going without tripping over each other. If I'd had tools like that to just churn through the codebase back then, I would have been able to clean up everything there in, you know, weeks.
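The parallel-worktree setup he describes looks like this in practice: one working tree per in-flight task, each on its own branch, so several editor or agent sessions never trample each other's checkout. Branch and directory names here are invented.

```shell
# From inside the main checkout: one extra working tree per task,
# each on a fresh branch, as sibling directories of the repo.
git worktree add ../task-import-cleanup -b task/import-cleanup
git worktree add ../task-notifications  -b task/notifications
git worktree list   # one line per checkout, each on its own branch
```

When a task's branch merges, `git worktree remove ../task-import-cleanup` cleans up that checkout without touching the others.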
00:18:36
Speaker
But also, would that have been the right move? I think that with where we were positionally in the business, the right move would have been to rethink the interface to the data and how people can get to it fast.
00:18:52
Speaker
And I think the product we wanted to offer people would have been a lot different in today's world. Interesting. I think it makes a lot of sense to try and basically churn out as much as you can
00:19:07
Speaker
near the user, because user demands are high. But for a product like this, it's fundamentally a data product, right? The data has to be right.
00:19:19
Speaker
And I think more than anything today, people don't trust products where the data isn't good. You know, and this is so much more than just, is the data good or not? It's also ease of interface. If you are the ice cream man, don't you just want to ask where people are, like,
00:19:37
Speaker
(a) not having ice cream and (b) where it's warm, whatever the criteria for searching are. If you can just ask that, where's the good place to sell ice cream, that's so much better than needing to know which of 90 pieces of data might work out.
00:19:55
Speaker
But being able to go through all the different permutations of what's here and how it could think about those things... you know, it's a fundamentally different approach to what we do now. And I think it goes back to what we were talking about: there is a new class of application that is going to start sprouting up.
00:20:22
Speaker
And it'll be really interesting to see how people start to fully interact with them as they come up. Yeah, I think where I'm wondering, and maybe this is this is just my nice naivete on this kind of application, is i would think you need you almost need bespoke or hand-coded or trusted layer down near the bottom of like this is the data, this is where we get it, it is correct, it is vetted, there's some amount of human eyes on it.
00:20:49
Speaker
And then whatever interface you build on top of that can be a lot more flexible or fast-moving. Is that a reasonable view of the world? Or, I mean, I don't know. I think a reasonable view of the world is that people really like gluing shit together, and I find it very reasonable to believe that people will be motivated enough that pretty much everybody writes an MCP server that goes over their program and says, here's how every command I use operates.
00:21:26
Speaker
I think that will become integrated at the OpenAPI-specification level. I think having that built up for MCP is going to be a default. And so that goes beyond data; it goes into interfaces, right?
00:21:39
Speaker
And so I see the relationship we have with computers fundamentally changing, to where
00:21:51
Speaker
LLMs themselves are a new type of computer, even though there's not a physical box. They're a new type of computer that doesn't have the same interface we're used to. The old ones were really good at math; these ones seem to struggle. It's a different type of machine.
00:22:09
Speaker
And we're in this nascent phase where we're really figuring out different ways to use it to change interfaces and make programs. And I think right now we're very much in the super early days.
00:22:24
Speaker
But look at what the dream of HyperCard was. That's a memory. But think about it: that's kind of the world you could reach if you really had MCP for everything.
00:22:40
Speaker
And I don't know how much you've read or thought about the school of thought around end-user programming as a whole: the model that Scratch offers, adapting that for real business end users, and the history of the uses of that. But I think there's a world where that's starting to appear and become a really present phenomenon, made possible by how we're starting to build things this way.
00:23:04
Speaker
But it's just interesting to me to think about: oh yeah, if I went and started a company today, I don't know if the company would truly have an app.
00:23:15
Speaker
It might be a bunch of MCP servers over a Postgres database. Interesting. So if you went you went back and looked book at like what you did at Quintera, I guess.
00:23:29
Speaker
would you even build that Ruby monolith? I would take the database. I would take the model of what the product they were used to looking at in the God view was like, and I would be able to rebuild it with different tools, with modern things, in a much different way today, much faster, and be able to really quickly service things.
00:23:57
Speaker
But, you know... no, I wouldn't. So what would you not

Building Trust and Efficiency with Data

00:24:06
Speaker
build? I would have built a natural language interface over a really complicated Postgres database that serviced like 10 people internally.
00:24:16
Speaker
And then I would have been thinking about the higher-order problem: how do we make this generalizable enough that you can sell it to any agency or brokerage that keeps their own internal models or something? Or how do you become an acquisition target for a big firm or fund that is doing this kind of stuff?
00:24:36
Speaker
Yeah, absolutely. To some extent, at the beginning of the game, you're always building something that's useful. And whether you end up building it as useful for one company or for as many companies as possible, the first step does not dictate which path you're going to pick. It's down the line that you pick that.
00:24:55
Speaker
That's interesting. It feels like some of the challenges you were describing around... you know, I think this is weird, though, because we are talking about the most atypical company I've ever worked at, and I've worked at WeWork.
00:25:08
Speaker
So, you know, there's a certain ship that sailed at that point.
00:25:18
Speaker
But yeah, that's a level of freedom that most engineers are never really going to have unless they're a founder. So we were in a unique position. Yeah, and I think that's what makes this particular journey so interesting, right? You did have the autonomy to make decisions, from we will choose what we work on and how it is exposed to users, all the way out to what is the value we are creating.
00:25:46
Speaker
And that's a very different set of constraints than, you know, I have a reliability problem, I have a product ship date that we have to hit. And it lets you use a whole different set of tools.
00:26:01
Speaker
Yeah. And, you know, now there would have been so much more we could do. Let's say, as a just hypothetical scenario, that we had four people running a white-labeled version of it.
00:26:16
Speaker
Even if it wasn't getting us value, like we accidentally had these five customers. I feel like a lot of startups actually end up in that position. Yeah. And if we had had that problem and had to go keep the lights on, I would approach it with completely different tools in the modern day than I would have back then.
00:26:36
Speaker
The first thing you'd do would have been: take a week's worth of logs from all of this stuff, from every major white-labeled service, and just run them through and ask, where are the failure patterns between all of these? There's so much static analysis you could do just from having a pile of running stuff.
00:26:56
Speaker
If you were making your own reports and spelunking through that, even on Datadog or a nice service like Splunk, going through and really ascertaining all that would have been days or weeks of reporting and manual aggregation that now is, like, minutes.
00:27:12
Speaker
And so we've really cracked an egg where it's fundamentally a different way of working, and the tooling and, I think, the output are going to reflect that. It's catching up right now and not quite there yet, but the need for quality output and consistent output is going to be big.
00:27:35
Speaker
And I think there's a huge skill issue: people not learning how to use this tooling well enough to get the quality output they could. They can get some stuff quickly, but the long-term maintainable stuff is going to become a challenge.
00:27:49
Speaker
So say more about that long-term maintainability. I do think a lot of the game of what you build is about coercing LLMs into doing something repeatably, because by default, they don't.
00:28:01
Speaker
And by default, it is unclear what they will do. How do you think that shows up, either from a product-building or a product-using perspective, over the long term? There are a few things we can do immediately and pragmatically. Number one, property-based testing.
00:28:18
Speaker
Property-based testing with completely crazy randomized inputs is going to help in general with the kind of instability that you're going to see with LLM inputs and outputs. So by having
00:28:35
Speaker
inputs that range from total nonsense to what's expected, you can really feel out what kind of behavior you're going to get. I think there's a separation you now want to have in your code, where you think about LLM code as its own application layer.
00:28:49
Speaker
It used to be you have assembly, you have C, and then you might have objects on top in C++ or whatever, right? And that's your highest level of abstraction. But now there's this other level of abstraction of, kind of, LLM nonsense, right? Where can it get variance in its inputs?
00:29:10
Speaker
And so I think you want the classic thing of pushing complexity to the edges: having the LLM set up such that it has constrained inputs, and the variance you're going to get is planned for. And you have different structures for reliability and expectations in your application code versus your language-model code.
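A hand-rolled sketch of that idea: throw randomized junk, along with valid input, at the layer sitting in front of the model and assert invariants that must hold no matter what comes in. `sanitize_prompt` is an invented example of such a boundary function, not any real API.

```ruby
# The boundary: whatever reaches the LLM must be valid UTF-8,
# control-character-free, and bounded in length.
def sanitize_prompt(input)
  text = input.to_s.dup.force_encoding("UTF-8").scrub("?") # never crash on bad bytes
  text = text.gsub(/[[:cntrl:]]/, " ")                     # strip control characters
  text[0, 2_000].strip                                     # enforce a hard length cap
end

# Property-test fodder: arbitrary bytes, often invalid UTF-8.
def random_input(rng)
  Array.new(rng.rand(0..5_000)) { rng.rand(0..255) }.pack("C*")
end

# The "property": invariants hold for every randomized input.
rng = Random.new(42)
100.times do
  out = sanitize_prompt(random_input(rng))
  raise "invalid encoding" unless out.valid_encoding?
  raise "too long" if out.length > 2_000
  raise "control char" if out =~ /[[:cntrl:]]/
end
```

A library like PropCheck or Rantly generalizes this with shrinking, but the shape is the same: generators plus invariants, instead of hand-picked examples.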
00:29:35
Speaker
Yeah, I think that makes a ton of sense and comes back to the question I had earlier: reductively, Quantierra could have been a Postgres database with a pile of MCP servers on top of it.
00:29:47
Speaker
But you still need... you know, Postgres is reliable software, but you presumably need to operate it reliably, which might mean some layer on top that is not LLM throwaway code.
00:29:59
Speaker
Yeah, I mean, you could also go a really long way with infrastructure providers, too. If you look at Supabase and what they offer, you can go a long way letting somebody else handle your infra. And this is just a me thing, but I'm usually really inclined to do that. I've historically never been an ops person, and it's a weakness of mine, but that's just the way I work: if I can offset something by putting a little cost into the operations budget, it's just like, well, okay.
00:30:32
Speaker
I've worked for vendors for too long to not be on the side of: you should probably buy software to solve your problems, not build it. But I do think it's rationally the right choice in a lot of cases.
00:30:45
Speaker
Even though it is in tension with this idea that users are also able to create their own complexity and compete.

Balancing Innovation with Practicality in Engineering

00:30:56
Speaker
And there's a component to it, too, of: do you want me to put on my fun-engineering hat, or do you want me to put on my what's-best-for-this-business hat?
00:31:03
Speaker
But, you know, developers have always been able to say: what's best for this business is for me to spend 20% of the effort to get 80% of the value and create the thing right in front of me.
00:31:13
Speaker
Right. That's both one of the weird things about working in the developer tooling space as a vendor and one of the great things about being a developer: there's this fundamentally different relationship between you, the computer, and all the software that people try to sell you.
00:31:31
Speaker
Yeah. If I went and looked at how I would build something like a SaaS platform today, I think my number one thing would be finding some sort of hook into somebody's daily workflow that takes away mild amounts of pain. And I think these interfaces are going to be MCP-driven. It'll be like: open a browser, go to this page, go to this CSS selector.
00:32:14
Speaker
There's a table; copy it to the clipboard with this stuff stripped, open up Excel, put it in there with this option, do this. That seems stupid to people like you and me who are used to "this problem makes me mildly uncomfortable, let's automate it."
00:32:31
Speaker
But the level of that you see in small and medium-sized businesses that are just operating day-to-day, with these custom workflows around a few core pieces of software... addressing that middle market is, I think, going to be
00:32:46
Speaker
a series of not-billion-dollar, but really solid businesses you could build with 10 people in five or six years.
00:32:59
Speaker
And that's what I would be pursuing right now if I were out in the space. So how do you think about greenfield?
00:33:11
Speaker
Greenfield development has been absolutely productive with AI, right? A ton of the apps people talk about, even the ones built for big codebases — people use Cursor to vibe-code and create new stuff.
00:33:24
Speaker
How do you think about people who have existing apps, with users and use cases? What is the path,
00:33:35
Speaker
for something like a Quantierra where you have an app in place, from that to the new interface? And what does that mean technically? Is this just development again, or is there something materially different?
00:33:48
Speaker
I think it's a different way of thinking about building stuff, because you're now working towards building logical components that accomplish goals, rather than classically building features.
00:34:00
Speaker
I think what I would end up with is a different way of thinking about source code fundamentally. I'd be willing to turn things on their ear and say: we have an orchestration layer of generated LLM code, and then we have our core code that's human-maintained, that we know is super crucial.
00:34:21
Speaker
And that's a super minimal part of the codebase. And everything else... with Phoenix and LiveView, we're spoiled, because we can write unit tests for really, really complex interactions, and we can have unit tests for basically the entire suite.
00:34:36
Speaker
Right now, for the stupid Nathan Fielder social network I'm making, I have this content-generation stuff being built up so we can just make the GIFs and share them out.
00:34:48
Speaker
And to get everything going for it, and to get each piece built up, I've had all of these unit tests written covering every possible interaction throughout the site. Right now I've vibe-coded it, but there are over 500 unit tests.
00:35:06
Speaker
So the LLM knows if it breaks even the smallest little thing — like, oh, it was expecting two things in a row here and it got four — it'll blow up. And so you make these really well-defined portions of the application that are reviewed by experts to make sure the implementation looks sane and uses modern best practices.
00:35:27
Speaker
But you're no longer churning out code yourself. I would be very interested to build up a company that exclusively worked that way. I think you could actually have some success with it.
00:35:38
Speaker
And I think it would make a fucking ton of people really uncomfortable. How do you deal with the fact that a unit test, classically (and I don't think LLMs have changed this much), tests only the behavior that you know you want tested?
00:35:55
Speaker
You're building a new feature; you're changing the app. Property-based testing really helps there, where you can randomize the inputs. So tell me about property-based testing. Property-based testing is about fuzzing with crazy amounts of inputs. I could have a property-based test for an input on a form where, ultimately, our rule is that it should only allow ASCII characters, right?
00:36:16
Speaker
A property-based test would go in and try to fill it with, like, three valid things and 997 nonsense ones.
00:36:27
Speaker
And they'd be generated by a seeded random generator that's just spitting out whatever. If you use that around all your edges, you can usually figure out where nonsense can sneak in.
00:36:41
Speaker
So yeah, if you build only to the happy paths, that is totally something that can bite you. Actually, I'd be curious to have you take a look at some examples of the tests for what I've been doing here.
00:36:57
Speaker
So I'm going to pull up this GitHub and link you. I'd be curious to see what you think of some of the unit tests generated for this relatively complex interaction on the site.
00:37:08
Speaker
Here's where I curse that I can't actually screen-share on this platform. That's okay, I'll take a look at it. While you're pulling that up... here we go. I want to take a look at this, and I'm curious to see how it looks for you in real life.
00:37:26
Speaker
But I'm still not quite getting that. When you do a migration... a couple of the folks I've talked to, folks at big banks, people who, when they break things, have billions of dollars at risk.
00:37:45
Speaker
The gold standard for "did I do the migration right?" is a migration-specific test suite, right? Not that I know the behavior of my app now, not that I'm going to write my tests later, but that I have some specific, targeted set of validation I can work through
00:38:05
Speaker
over the incremental period. What I've done with that, historically, in a consulting capacity and internally: we run a replica of the application where we're checking full database inputs and outputs, everything end to end, both for the request flow at the web layer and for the entirety of the database.
00:38:26
Speaker
So a lot of the time it comes down to feature flags and A/B testing running in parallel environments. I guess if you would call that a unit test, that's historically where I've fallen on it.
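The replica approach — replay each request against both systems and diff the responses — can be sketched like this. The handlers here are stand-ins for real HTTP calls to the legacy and migrated stacks; the request shape is invented.

```python
# Shadow verification sketch: run the same traffic through the old and
# new systems and collect every divergence instead of failing fast, so
# you can see the full shape of any mismatch.

def legacy_handler(request: dict) -> dict:
    """Stand-in for a request to the legacy system."""
    return {"parcel_id": request["parcel_id"], "status": "ok"}

def migrated_handler(request: dict) -> dict:
    """Stand-in for a request to the migrated system."""
    return {"parcel_id": request["parcel_id"], "status": "ok"}

def shadow_compare(requests, old, new):
    """Send every request to both systems; return all divergences."""
    mismatches = []
    for req in requests:
        old_resp, new_resp = old(req), new(req)
        if old_resp != new_resp:
            mismatches.append({"request": req, "old": old_resp, "new": new_resp})
    return mismatches

traffic = [{"parcel_id": i} for i in range(100)]
diffs = shadow_compare(traffic, legacy_handler, migrated_handler)
print(f"{len(diffs)} mismatches out of {len(traffic)} requests")
```

In a real migration the same idea extends to the database layer: snapshot both sides after the replay and require a one-to-one match before flipping the feature flag.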
00:38:39
Speaker
Got it. I wouldn't, but I think that is the right approach, right? You can't know. With my clients, I've also had the benefit of working with European clients who are pretty open about the definition of success and what it is, rather than a big US firm that's like, "we're used to working exactly this way, this is how we do things." This was much more:
00:39:03
Speaker
oh, you kind of have an open basket of problems; here's some stuff I can do for those. And it's a flow where your acceptance criterion is that they're still getting value. One of these folks I was dealing with was a European bank.
00:39:23
Speaker
So obviously compliance is huge; we can't have anything be messed up there. But if you can guarantee integrity for every request flowing through the system, for every possible endpoint, matching one to one, and verify that your databases are completely one to one...
00:39:40
Speaker
Yeah, how much further can you go? Then there are industries where you could go further; there are ways you could do that. But I have thankfully never been in the position of working on someone's active heart surgery.
00:39:55
Speaker
The worst I have done is migrations of customer records for a financial institution. That's still serious, but it's not lives at stake, that's for sure.
00:40:07
Speaker
Yeah, absolutely. And I think that's fundamentally a lot of the answer to this, right? You need the context to understand whether the system is running. A couple of weeks ago, we had Mode on the show, and we spent 20 minutes on testing in production, because the only thing that matters is what happens in production. It doesn't matter how many unit tests you have.
00:40:29
Speaker
There's never been a customer in the world that asked about your unit tests. Right. Yeah, I think Charity Majors is a big proponent of that, and I've always been a fan of it. I think the people who get up on their high horse about it are kind of being wimpy.
00:40:45
Speaker
It's just like, I don't know — part of how you succeed in business is making confident moves that are calculated, and you need to be able to do that with your systems as well.
00:40:57
Speaker
I understand that some people are working on Stripe, but a lot of us are moving a little bit of data from A to B and keeping track of some stuff. It's not as crazy. And you have user expectations and things like that — you'd never want to lose somebody's bookmark collection.
00:41:20
Speaker
But I think those critical flows are something you manage with effective communication in your organization. Ultimately it's not one hero in an engineering org going, "I know how to do each step of this."
00:41:36
Speaker
It's going to be a calculated group thing that gets you somewhere successful. What if you have this attitude and you want to push it at a company — you want to say "test in production"? What would you use in 2025, with all of these AI tools at your disposal, to push that viewpoint?
00:41:58
Speaker
Oh, I actually haven't surveyed that space, so I wouldn't have a super good answer. I could tell you how I'd research it, but I wouldn't have an immediate answer. The first thing I think of is that we've got all of these new browser-tooling-based solutions you can work with — Playwright and other similar things. Running a full browser is a totally different game than it used to be.
00:42:20
Speaker
Starting off with agents acting as users, driving through the site regularly in a cloned test environment, could be something I'd find really helpful.
00:42:31
Speaker
One thing I know we have right now, that I have wired into my app day to day, is that I can get a live session recording of everything a user does in Datadog, right?
00:42:45
Speaker
And so if you take whatever data file is feeding that, take a few profiles of those and give them a few hundred sessions, you can now have different profiles of users poking around a copy of your site, with analytics on the logs for any outliers or errors that shouldn't be happening from this seemingly random activity that's pretty similar to your real use.
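A simple way to think about that profile-driven synthetic traffic is: each profile is a weighted set of actions mined from real session recordings, replayed against a cloned environment while you watch for anomalies. Everything here is a stand-in — the profile names, action names, and `fake_app` stub are invented; a real setup would drive a browser or HTTP client against the clone.

```python
import random

# Hypothetical profiles: action weights derived from real session
# recordings (e.g. a Datadog RUM export, schema assumed).
PROFILES = {
    "browser": {"view_listing": 0.7, "search": 0.25, "save": 0.05},
    "power_user": {"search": 0.5, "save": 0.3, "export": 0.2},
}

def fake_app(action: str) -> int:
    """Stand-in for the cloned environment; returns an HTTP-ish status."""
    return 200 if action in {"view_listing", "search", "save", "export"} else 500

def replay(profile: dict, sessions: int, actions_per_session: int = 20):
    """Drive random-but-realistic sessions and collect anomalies."""
    errors = []
    names, weights = list(profile), list(profile.values())
    for s in range(sessions):
        for _ in range(actions_per_session):
            action = random.choices(names, weights=weights)[0]
            status = fake_app(action)
            if status >= 500:
                errors.append((s, action, status))
    return errors

for name, profile in PROFILES.items():
    errs = replay(profile, sessions=100)
    print(f"profile {name!r}: {len(errs)} errors in 100 sessions")
```

Scaling the number of concurrent profiles up and down is then just a knob, which is the appeal over spinning up full load-test infrastructure on demand.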
00:43:12
Speaker
You could have environments like that. You could also do Chaos Monkey-style stuff where you're simulating the dumbest users possible. But I would probably be inclined to start with browser simulation, because while it's kind of expensive to run, it gets you something that highly resembles the real world. And if you're running a system like, say, Bluesky, you'd want an environment where it looks like there are 10,000 people using it, and you don't want to spin up crazy infrastructure to do that every time — you don't want to do it on demand.
00:43:48
Speaker
Having this little parallel thing, with these profiles that you could scale up and down, would be super useful. And then if you want to test scaling, where you're thinking "what if we have 5 million or 10 million users," those are different problems to solve.
00:44:01
Speaker
But for this application-level and flow stuff, that's probably the space I would start out in. As a market, though, I don't actually know where the product space is on that right now, so I wouldn't have a good answer.

Custom Software Solutions: A New Frontier

00:44:12
Speaker
Yeah, that makes sense. One of the things we talked about when we first met is the idea of sort of local — I don't remember what you called it — like user-level computing, people writing their own stuff. Yeah, I think it's just personal-grade applications: it's not something you could put in an app store.
00:44:32
Speaker
Yeah, absolutely. Tell me what you mean by that, because I thought it was an interesting idea. I just think people are going to start making these little things for themselves, because everything can talk to each other now. I made a little thing that goes out, gets my top stories from Hacker News, Reddit, and Pinboard, and throws them at me in a Markdown document so I don't have to open my freaking browser to read them. Little tools like that are going to proliferate, and people are going to come to rely on them very deeply.
00:45:01
Speaker
They're not going to be reproducible. It's going to be "Amy made this and it really works, and Amy works here, so that's fine." And these people are going to have no software deployment strategy, no maintenance strategy.
00:45:15
Speaker
Soon they're going to rely on it as a black box, because Amy got another job but the thing still worked. And now the thing has effectively become a mini god inside the business, and no one knows how to control it. That's an unfortunate situation to be in, even if the god is benevolent.
00:45:36
Speaker
So I see a situation here where this is totally going to be an evolution of a class of software that a lot of people rely on. But as that reliance grows, there are going to be people who need to step in and, quote-unquote, productionize and commoditize all of this.
00:45:53
Speaker
And I think that's going to be the biggest class of new application that we see. Like when Mark Zuckerberg was going all-in on mobile and people were saying, "you're crazy, you'll never get ad revenue there."
00:46:05
Speaker
That's where I see the next big billion-dollar funnel of potential ad revenue. And it's coming from little ways you can tie all of this together, slap your label on it, or collect some data on how the flows work, or whatever.
00:46:19
Speaker
There's going to be a profit model. But finding a way to commoditize that... I see so many examples. I was sitting in the Las Vegas airport just chatting with this guy.
00:46:30
Speaker
He's a commercial roofer in Minnesota. He made, just in ChatGPT, an entire single-page web app that handles all of his business's logistics, accounting, and ordering.
00:46:46
Speaker
He built out every single piece of it as individual pages. It runs in the ChatGPT app, like as a memory. That's amazing, I love that. And this is running his whole business — this is a $15 million commercial roofing business, and it runs the business. I was talking to him, and this dude's got a serious number of employees; he's doing real operational loads in a couple of states, and he's cobbled this thing together and it's running it all. And if he's doing that in 2025, in fucking
00:47:16
Speaker
May, then I can only see where this is going to keep evolving as it gets into more hands and the pricing continues to be as friendly as it is. The value proposition right now is crazy.
00:47:28
Speaker
You have a magical oracle that can basically produce code it would otherwise cost you 80 grand a year to pay somebody for. And you can have that cranked out as a completely non-technical person and solve whatever problem is in front of your face.
00:47:40
Speaker
That's going to generally change how people do a lot of things. One of the things I think about a lot is that if that kind of thing does come true — and I think it's plausible — there are the people on the other side of those APIs, right? The people who run Hacker News and Pinboard and all the different apps.
00:47:59
Speaker
It's always been hard to reason about use cases, because people use your app in different ways. I most recently worked at Slack.
00:48:11
Speaker
People use Slack in wacky ways. Yeah. It's the bug that somebody relies on as a feature — the most classic thing, right? Right. And this is going to be 10x that, right? Because it's not a user using the product in a weird way. Sure, that's weird, I can understand it.
00:48:26
Speaker
It's going to be people's software using the software in a weird way. Oh, yeah. Totally new classifications of ways to analyze all this. It's going to be a huge series of open problems, and that's why I think we aren't all going to lose our jobs: there's just going to be so much more shit to do. Yeah. And that's the reality of it: when people make more things, there's going to be more software; if there's more software, there's more bugs; if there's more bugs, there's more shit for us to do. This is just how the system works. Absolutely.
00:48:58
Speaker
But yeah, I think that's going to be the next big class of application that really starts to evolve. And, you know, if I were 25 years old and had a lot of time, I would probably be in the space of finding a bunch of businesses who needed that and commoditizing it — finding industries where I could really make a difference, and doing them one by one.
00:49:22
Speaker
And if you're not trying to get billionaire-rich, you do that somewhere between one and three times, and you can probably retire in whatever city in the United States you pick.

Leveraging AI for System Overhauls

00:49:33
Speaker
Yeah, we're coming up on time, so I want to ask you one question. Say you're an engineer and you're staring down the barrel of some massive refactor, some giant migration, a big project that scares you a little bit.
00:49:48
Speaker
What's the one thing you should be doing today, in 2025, that you didn't have in your arsenal in 2020? As an engineer? Oh, the amount of prior art you can find with LLMs. The one thing I've seen is that if you take the high-level problem you have and feed it to the reasoning models, you'll eventually get to the point where you're finding academic definitions for the problems you're having, and that's where you get into finding real solutions. It'll be like, okay, here's how we think about load balancing at a scale of
00:50:22
Speaker
a billion users. You're able to go find that paper and read that paper. You don't just take the LLM summary; you use the tool to go find it. Respect the fact that you can actually get all this information at your fingertips, but don't make the machine summarize it for you.
00:50:40
Speaker
Because the ultimate power we have is still being able to take that information in and make really contextual calls. So I think the number one thing is finding that prior art, because there's so much that is now discoverable that wouldn't have been before.
00:50:55
Speaker
Yeah, absolutely. I think the tribal knowledge that used to be locked up inside those big companies is now so, so much more accessible, just as something you can search.
00:51:08
Speaker
Yeah, totally. I mean, that's probably pretty boring advice: go read the docs. That's my "old man yells at cloud" advice — use it to find the really good docs, and then read them.
00:51:25
Speaker
You know, five years ago you didn't have at your disposal the ability to find the right doc to read. So I think it's worth reiterating. Yeah. This is actually the topic of a blog post I'm kicking around right now: there's this great new book on the Erlang VM, and I was like, I could totally have this thing summarize the bullet points for me, and then when I hit those performance things, I'll know when I need to go to the reference text.
00:51:47
Speaker
And then I was like, no, Bobby. You're an expert in this language you've been working with for over a decade. You're going to read the book and you're going to internalize the knowledge. You're going to keep it, and it will be useful.
00:51:59
Speaker
But, you know, it's easy to forget about that now. And I very much don't want to lose that spirit of actually knowing how things work. I do know how a lot of things work, and I pride myself on that.
00:52:13
Speaker
Amen. All right, this has been a fantastic episode. If people have more questions or want to follow up, where can they find you on the internet? The only thing I do is Bluesky. The handle is bobbby.online — Bobby has three Bs.
00:52:27
Speaker
Fantastic. We will drop that in the description of the video. Cool, cool. Thanks for having me. Thank you so much, Bobby. Catch you later.