
Madhav Bhagat on Leaving Google to Build SpotDraft and Redefine LegalTech

Founder Thesis
60 plays · 19 hours ago

How did a Google engineer turn a 90% collections loss into a $XXM ARR LegalTech success story?

Madhav Bhagat's SpotDraft journey reveals the secrets of AI timing, opinionated product strategy, and global SaaS scaling.  

From losing $900K in collections with his NYC nightlife startup DrinkLink to building SpotDraft into a leading contract lifecycle management platform, Madhav Bhagat's entrepreneurial journey is a masterclass in resilience, pivot strategy, and AI-first thinking.   

After leaving Google where he helped scale Google Classroom to millions of users, Madhav co-founded SpotDraft with the vision of democratizing legal technology. What started as a tool for freelancers evolved into an enterprise-grade CLM platform that now serves major companies like PhonePe, Airbnb, and Panasonic. The company has achieved double-digit millions in ARR with 60% revenue coming from the US market.  

In this candid conversation with host Akshay Datt, Madhav shares how SpotDraft reduced contract implementation times from 2 years to 60 days, leveraged AI cost reductions from $40 to 4 cents per contract, and built an opinionated product that outcompetes giants like Icertis and DocuSign. He reveals why being early to AI wasn't always an advantage, how non-technical teams now ship code using AI agents, and why focusing on mid-market customers over enterprise deals became their winning strategy. This episode offers invaluable insights for SaaS founders navigating AI adoption, international expansion, and building sustainable competitive advantages in crowded markets.

What you'll learn from this episode:  

👉 How Madhav built SpotDraft from a Google exit to $XXM ARR in LegalTech

👉 Why AI timing matters more than being first, and lessons from the pre-GPT to post-GPT era

👉 SpotDraft's opinionated product strategy that reduced implementation from 2 years to 60 days

👉 Secrets of scaling SaaS globally with 60% US revenue from an India base

👉 How AI agents are transforming organizations and enabling non-technical teams to ship code

👉 Why mid-market focus beats enterprise deals in competitive SaaS markets

#legaltech #startups #AIstartups #saas #startupfounder #AI #B2BSaaS #startupjourney #entrepreneurship #indianstartups #founderstory #artificialintelligence   

Disclaimer: The views expressed are those of the speaker, not necessarily those of the channel.

Transcript

Why Madhav Bhagat Left Google to Pursue Entrepreneurship

00:00:00
Speaker
That was sort of the push to quit Google and move back to India. This is the only real opportunity I have to take that risk, and I can't take that risk sitting in America. So how does an opinionated product scale up?
00:00:14
Speaker
Madhav Bhagat is the founder and CTO of SpotDraft, an AI-powered legal tech platform helping brands like Airbnb and PhonePe close deals faster and automate legal workflows.
00:00:26
Speaker
I feel people operating in the niches of niches are going to be more successful than people coming from outside and trying to solve it. What does it mean to meter it?
00:00:44
Speaker
Madhav, welcome to the Founder Thesis podcast.

The Birth of SpotDraft and Its Initial Challenges

00:00:49
Speaker
Before we come to your current venture, which is SpotDraft in the legal tech space, I want to understand your journey of getting here. You've been a serial entrepreneur, worked in the US. Just give me a sneak peek of the journey to SpotDraft.
00:01:05
Speaker
Thanks, Akshay. Super excited to be here. So before SpotDraft, I was actually working on a startup while I was at Google in New York, which was called DrinkLink. It was a nightlife concierge service.
00:01:17
Speaker
And we were trying to solve a problem that we felt ourselves, which was that if you're a bunch of guys trying to go out in New York City, it's very hard to know where you'll get in and where you won't.
00:01:29
Speaker
So we set up this app called DrinkLink, which allowed you to put in your requirements and your budget, and then nightlife venues, be it bars, pubs, or nightclubs, would bid on you and confirm whether they would let you in. So instead of you going around wasting time at different places, not getting in, you could know for sure that you're going to get a good deal and you would get into a spot.
00:01:56
Speaker
So that was... Did you quit your day job to do this? I quit my day job because I wanted to work on startups, not necessarily just that one. And as you might be aware, on an H-1B it's very hard to do anything which is not your core job, because as soon as you quit your job, you have to leave the country in 30 or 60 days.
00:02:19
Speaker
So with that sort of sword hanging over me, and this was back, you know, when the Flipkarts and the Snapdeals were really going and raising obscene amounts of money. And, you know, when you're sitting there in New York City, it's a comfortable life. But you're like, this is the only real opportunity I have to take that risk, and I can't take that risk sitting in America. So that was sort of the push to quit Google and move back to India.
00:02:48
Speaker
And as I moved back here and I was working on this startup, I also set up a dev shop in India, mostly to get engineers for the startup for DrinkLink.

From DrinkLink to SpotDraft: Madhav's Entrepreneurial Journey

00:02:58
Speaker
As part of that, we also started talking to other founders and other entrepreneurs to understand, you know, what was going on in the Indian ecosystem, where people were really allocating time and capital.
00:03:11
Speaker
And that was a great experience from a learning perspective. And that is also how SpotDraft came to be. So while I knew Shashank, my co-founder, from New York, and we had met at a couple of parties, a lot of our interactions were limited to me getting free legal advice from him and him getting free technical advice from me.
00:03:31
Speaker
So it was around that time that, you know, he was trying to figure out how to solve this contracting problem that he had seen as a corporate lawyer on Wall Street, but also as a lawyer sitting in India. So he moved back independently and he's like, I'm just trying to extract these rights and obligations.
00:03:52
Speaker
And yeah, I told him, see, I can do it to a certain level, but it's not going to be fully accurate. And so we built something very simple for him. It probably took like a week.
00:04:04
Speaker
So he was your client, basically. So he was, yeah, in a way. Your dev shop, he was a client of it. Okay. We got it built probably within a week. And then he started seeing a lot of value from that.
00:04:18
Speaker
And that is sort of the genesis of SpotDraft, because he's like, okay, how do we productionize this? And how do we sell this? And, you know, he's like, if you don't come on board, I can never figure it out because I know zero tech.
00:04:32
Speaker
So we sort of teamed up. He had spent, I think, six months at a stage where SpotDraft was available as a tool for freelancers and, you know, single mom-and-pop type operations, to really help just draft contracts.
00:04:48
Speaker
And that's why the name also, SpotDraft. And then, when we realized that there was this opportunity to solve the contract review problem, is when we teamed up and we started, you know, raising capital, building this solution to help others.
00:05:05
Speaker
You know, what we had built for him, we wanted to productionize and sell. So then we started going to customers and investors and, you know, hearing these signals of, okay, you actually need a review tool that lets you do more than just see obligations, because what each person cares about is different.
00:05:23
Speaker
This tool was of course built with what Shashank cared about. But when we spoke to customers, they kept coming to us with different asks. And that's when we decided, you know let's make a contract review tool as a product.
00:05:35
Speaker
Now, in fact, this was back in 2017, 2018. And I would say we were a little bit ahead of the tech at that time.
00:05:46
Speaker
So this was when Transformers, which is the T in GPT, had just come out, and it was the state of the art. And we were, of course, using it. But it wasn't as general purpose as a lot of these models are today.
00:06:01
Speaker
So we started, you know, collecting data, training our own models. We did a lot of that heavy lifting, and we even managed to close a few really large customers, like the biggest private bank in India, the biggest telecom operator in Europe.
00:06:18
Speaker
We had a large Japanese investment bank, and all of them said, you know, we'll give you access to our data, you train models for us. And this was their digital transformation story.
00:06:30
Speaker
Of course, they did not give us enough data, so the results were subpar. And as part of some of the work I did at Google, I also built an AI solution in Google Forms.
00:06:42
Speaker
And a key takeaway for me from back then was: if you ask the user to do extra work to train your AI, no one likes doing extra work, right? So they will not do it. And that's essentially what happened with the initial SpotDraft solution as well: users would just not give us the data. And then they would come back and say, your tool doesn't work.
00:07:03
Speaker
So this DrinkLink, then, you shut it down? Yeah, we had to shut it down. I mean, it was also an operational challenge, because one thing was that we were not collecting the booking fees up front. So what we would do is partner up with the venues and say, you know, you'll pay us 5, 10, 15% of whatever the spend was after the billing was done.
00:07:28
Speaker
And of course, that became a collections nightmare, to the extent that we had a million in GMV, and I think our collections was at 100K against that GMV. So we were missing almost 90 percent of our collections. So then... India is a graveyard of these ideas, right? There was a crowd.
00:07:47
Speaker
There was a crowd. That's one of the reasons why we set up the India team: like, okay, let's set up, you know, payments on the platform. It was a very simplistic approach we were using till then. And then what we also realized is that there was a large pull that we were seeing from corporate event bookings.
00:08:02
Speaker
So we created another brand, which was focused on corporate bookings, and we started getting a decent amount of, you know, interest from them. The ticket sizes were larger, $20,000, $50,000. So it was easier for us to manage as well.
00:08:17
Speaker
But when you go corporate, then you also have the expectation of a certain level of operational excellence. And what that meant was that we had to deploy people on the ground. And now that would eat into our margins. And, you know, imagine finding someone in New York City who's willing to work on a Friday night or a Thursday night and, you know, sit in a bar, but not...
00:08:38
Speaker
be part of the party. So it became a big challenge because of that. And also at that point, you know, I was sitting here and my partner and co-founder at the time was sitting in New York, and there was always this, you know, 24-hour gap and time zone challenges and all of that. So that's sort of when we decided that, you know, while this is a great lifestyle business, scaling it requires just so much. Even on ops, it's a very operationally heavy business, and running an operationally heavy business in America is very, very hard, as you can tell with, you know, all the quick commerce apps that have come and gone there. So that's sort of when we decided that, you know, it's not working out.

SpotDraft's Evolution and Customer-Centric Innovations

00:09:21
Speaker
And yeah, we decided to just shutter at that point. And we had done, I think, three million in GMV by that point. It was a good experience from a learning standpoint, but I did take away that, you know, build something that you can operationalize in the long run. Because it was working because we were the ones, you know, going to these bars and making sure everything was in order, but of course that cannot scale. Right. Okay. Interesting.
00:09:50
Speaker
And the dev shop also, you closed it down once you got into SpotDraft? Yeah. So the dev shop for me was initially set up because I didn't have a way to hire people for the US company in India.
00:10:04
Speaker
So we just set it up with that being the start point. And then we did get a few other clients. But the challenge with customers in India that we spoke to was they would fall in two extremes. One extreme would be that I want everything the best.
00:10:19
Speaker
And then of course, being a new business with a very lean team, doing that becomes very challenging, where, oh, you only have one designer, how will this happen? So they would not come to us. And the other extreme was, you know, you're competing with folks who would do everything under the sun for 5,000 rupees.
00:10:37
Speaker
And of course, that means that the customer just cares about getting the cheapest solution out, as opposed to getting something that works. So then again, there was a challenge of, okay, we won't be price competitive, and people just did not care about quality at that point.
00:10:52
Speaker
So what we then decided was, you know... I worked with a few good folks. In fact, we were cash flow positive for most of that dev shop's existence, but it was just such a drain.
00:11:05
Speaker
I was doing sales as well as, you know, leading the technical side, and it just wasn't scalable, because as soon as you bring some of the senior people in, you become cash flow negative. So that was when Shashank came about and he's like, can you help me with this? And the timing just worked out. So I moved all the people from the dev shop to build SpotDraft V0. And then once that took off, I decided that, you know, I would want to dedicate all my time and mental energy to SpotDraft.
00:11:34
Speaker
Okay, interesting. You said Shashank first got it built for his internal use. Internal use in which company? Where was he working that he wanted this tool built? So he was working at Rupee Power, which was this loan pricing engine. So all the loans that you go and get on any bank website would be priced by them.
00:11:58
Speaker
And he was also doubling up as the lawyer there, because, you know, when you have a lawyer, you don't want to pay someone. So he would go through these, because they would have a lot of really complex asks around rights and obligations. A bank is like, if your website goes down for half a minute, you will pay me 50 lakh rupees. Or, you know, that's of course an exaggeration, but things like that. So he would spend an obscene amount of time also doing that.
00:12:23
Speaker
And he just wanted a simple way to say, okay, I'll give you a document and you just tell me what the rights and obligations are, so I don't have to go through 500 pages; I can go through one or two pages. So that is where the start point was. And he found it so valuable that he must have shown it to some folks, and they were like, oh, how do I get access to this?
00:12:41
Speaker
And that then became the start point. And this is around when he had anyway started SpotDraft, but he was looking at it from the angle of enabling and democratizing law for freelancers and, you know, really small companies. But of course, the things we were building are much more valuable when you have a large company, because that is when tracking these rights and obligations becomes a Herculean effort. If you're a one-person shop, you know what those rights are, because that's what keeps you up at night.
00:13:09
Speaker
And that is when, you know, it went from being a tool for him to a tool that we wanted to give to others. And we started talking to customers and prospects, and we realized that there is a large problem in this space. You know, NDA review: people were spending $500 per NDA, right? So we were like, okay, we can do better. We can get it down to $5, $50. Even without AI, we could do it for $5, $50 with just humans. But because we had this tool in place, we said with AI, we can do it even cheaper.
00:13:44
Speaker
And people loved that. And that's why we got a lot of interest in our pilot phase. So in the pilot phase, you were selling it as a service? In the pilot phase, we were...
00:13:57
Speaker
So we had built out a solution where, you know, you can upload a file and it'll highlight things. We used to call it RAG analysis, which is red, amber, green. So it would, you know, color-code your document, and parts would be green and it'll give you an explanation why it's green.
00:14:11
Speaker
But the big challenge was twofold. One was, when you go and ask an organization, what is red, amber, and green for you? Now that internally starts their own debate, where they're like, oh no, why should we accept this? So we would not get a clear picture until we spent a few cycles with them. And the other challenge was that we had explicitly said in our contract, you have to give us 100 good examples and 100 bad examples so that we can train our AI on top of it.
00:14:42
Speaker
So imagine an NDA that you refused to sign, right? So 100 examples of that, and then 100 examples of the final, you know, post-negotiation NDA that you signed. So with one of our customers, they tried really, really hard, but they were only able to find, I would say, some 50 contracts.
00:15:03
Speaker
And that too, when we started running our engine on it, it's like, okay, 25 of these are duplicates. That's because you never store older versions. Yeah, it's like three people sent the same file from their inbox, because you never store the bad version.
00:15:17
Speaker
The good version is still, you know, sitting in some shared drive somewhere, but the old ones aren't. And that is when we spent a lot of time and built a lot of tooling to generate synthetic data. You know, we pulled some 2 million contracts from the public internet, like SEC EDGAR and all the filings that public companies have to do.
00:15:38
Speaker
But still, when we got that data and trained the model, the accuracy was good. But if there was something which was very, very nuanced and specific to an industry or a customer, we did not have that in that training set, and we were not able to get as good an accuracy. We tried things like back-translation, where we take a contract, translate it to German, then back to English, because now it says the same thing, but in a different way.
00:16:06
Speaker
We would also reverse the obligation. So it's like, if your clause says I should be allowed to visit the campus at any time, then we'll reverse it and say, you know, the other party should be allowed to visit the campus at any time. And of course, that means it's a negative thing, which should probably be highlighted in red.
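The obligation-reversal augmentation described here can be sketched roughly as follows. This is a toy illustration, not SpotDraft's actual pipeline: the party names, the regex-based swap, and the labels are all assumptions made for the example.

```python
# Toy sketch of "reverse the obligation": swap the parties in a clause so a
# favorable right becomes an unfavorable one, yielding a synthetic negative
# training example without any new human annotation.
import re

def reverse_obligation(clause: str, us: str = "the Company",
                       them: str = "the Counterparty") -> str:
    """Swap party references so the obligation flips direction."""
    placeholder = "\x00PARTY\x00"  # avoids clobbering already-swapped text
    swapped = re.sub(re.escape(us), placeholder, clause, flags=re.IGNORECASE)
    swapped = re.sub(re.escape(them), us, swapped, flags=re.IGNORECASE)
    return swapped.replace(placeholder, them)

clause = "the Company shall be allowed to visit the campus at any time."
negative = reverse_obligation(clause)
print(negative)
# → "the Counterparty shall be allowed to visit the campus at any time."
```

The original clause (labeled green) and its reversal (labeled red) would both enter the training set, doubling the data for that guideline.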
00:16:24
Speaker
So we did all of that. We got to some 70, 80% accuracy. But the big thing we realized was that if I need to onboard 10 more customers, how much time will I spend just training all these models and gathering this data for those 10 customers? And that means that it's not going to scale like a software business. It's going to scale almost like a services business, even though what we want to sell is software; to get that software working, I need to do so much onboarding and services before that.
00:16:53
Speaker
So that sort of took me back to Google. At Google, I had built in Google Forms an AI solution using only Google-internal data. And that data was created by anyone who had ever used Forms. So can we get the user to create the data while they are doing what they anyway do?

AI Integration in SpotDraft's Contract Solutions

00:17:13
Speaker
And for lawyers, the place that they create data is when they are editing these contracts. So then we built an editor which had superpowers for lawyers. So if you hover on a definition, it would show that definition in line.
00:17:28
Speaker
Right. If you clicked on it, it would navigate you to that part, like Article 3.5.4, right, instead of you having to find it in the document; we would take you there. There, what we would also do is track the contract from its, you know, first version that you received to the final version that you sent.
00:17:46
Speaker
So the idea was, now we'll get this data. But as we started... This is a collaborative tool, like a Google Doc? So, an editor. And we had the review tool built in, and we had some other things: we would verify that if you say section 2.5, does that section even exist in your document, right? Some basic clerical or paralegal work we would automate for you, and keep track of the edit history.
00:18:11
Speaker
So that was great. Had you worked on Google Docs or something like that at Google? So I was part of the Google Apps team, which is where Docs and Slides and everything falls. But I actually was part of the team that was almost like a startup within that org, because we built out Google Classroom from scratch.
00:18:33
Speaker
When I joined the team, it was two people: one manager and one engineer. And when I left, I think it was a team of 60 to 80. And we had scaled it to
00:18:43
Speaker
tens of millions of DAUs, or more than that. I don't recall the exact number, but we had a pretty successful worldwide launch. And during COVID, I actually heard from some of my old friends there, and they're like, oh, we've crossed a hundred million or 500 million, because now everyone is using it every day.
00:18:59
Speaker
So it was there that, you know, what we realized was that teachers need multiple-choice questionnaires, right? And to grade them. And instead of building...
00:19:11
Speaker
Yeah, so then Google Forms came into our team, and that uses all the same infrastructure as the editors. So I was exposed to a lot of that, and I was always very interested in AI. So even before Brain and all those teams came about, we built this solution.
00:19:28
Speaker
And one of the reasons for me leaving was that while that solution had been live within Google for months, we just could not get it launched, because there was all this operational checklist, you know, bureaucracy almost, that at the size and scale of Google you have to jump through.
00:19:45
Speaker
So that was another reason for me to want to leave. And then I saw that the feature I had built was released two years after I left. And when I looked at some of the code behind it, and, you know, you can go poke around, it was still using the exact same code. So that just shows that two years later, finally, they launched it, right? So while I hadn't worked on editors, I had a pretty fair and substantial understanding of how those things worked, and we were using those same techniques here as well. So what we wanted to do was, you know, as we got people to start using the editor... One question first. Hold on.
00:20:24
Speaker
So, okay, I understand the editor is solving the problem of data access. um I don't understand what ah were you building the model for? What was the end goal of the model that you were building?
00:20:37
Speaker
That it should read a contract and accurately classify the information inside it? Something like that; I can elaborate a little bit. Basically, what we would do is, you know, imagine you're signing an NDA and you're like, I want to make sure that all recordings of the podcast are covered under confidential information, right? Because that is your most important asset.
00:21:03
Speaker
So you would set up our tool saying confidential information must include recordings. Now what we would do is, we would go through the document, and where confidential information is defined, we would say, okay, this guideline that you've mentioned is what it should be compared against.
00:21:18
Speaker
So the first part was figuring out which one applies where, because you could have 50 guidelines and the document would be 100 pages. For that, we trained a simpler model, which would take two pieces of text and just say, are they related or not?
00:21:31
Speaker
And then once you get the related pieces of text, we would have another model that would go and say, okay, given this guideline and this text, is it red, amber, or green? And give me a reasoning why it is red, amber, or green.
00:21:43
Speaker
So that was the model that we would essentially train. And for some customers, we had trained 10 of those models, because the variety was such within their documents and their guidelines that one model couldn't do it all.
00:21:55
Speaker
And this was, you know, back when Transformers had just come out and BERT was the state of the art. So we even trained our own BERT from scratch. And that cost some 30,000 to $50,000 back then. And it was a pretty significant cost for a company of our size, which was just starting out. But thankfully, we had credits from GCP and AWS, and we were able to utilize those.
00:22:19
Speaker
So that, you know, gave us a 10, 15 percent accuracy boost. But still, we had to fine-tune and train more models on top of it each time, for each different question. And that's where it became, you know, if I have to onboard 10 customers, I have to train 100 models. My costs go through the roof.
00:22:37
Speaker
My operational overhead goes through the roof. That's when we realized that, you know, it wasn't necessarily the right timing from a tech-readiness perspective. And this was also, incidentally, around the time when people using the editor started asking us, okay, can you give me, you know, control over... can I make this a template? Can I get some approval controls on it? You know, or when I send it out to the counterparty, can I remove internal comments automatically?
00:23:05
Speaker
So we started building these things as they came together. It became sort of what is called a workflow. Now, when you have an editor and a workflow together, that beast is called a contract management platform.
00:23:17
Speaker
So we didn't even know that we were building a CLM till we had built some parts of it. And a lot of that build was just happening based on user feedback. So once we realized... CLM stands for contract lifecycle management? Yes, CLM is contract lifecycle management. If you've ever used a Salesforce or a HubSpot, a CRM: imagine a CRM for your contracts.
00:23:43
Speaker
And, you know, approvals, reviews, managing edit history, version history are sort of the key basic features in it. And so then we spent a lot of time, you know, building that. We realized that, you know, this is a pretty large market.
00:23:58
Speaker
There were incumbents who were really old school, and we wanted to attack that from a new perspective: a cloud-first, new-age, simple-UX approach. And we saw a lot of good traction there.
00:24:11
Speaker
Then we started seeing larger and larger asks from them, and we started just building those things. So we went from having one customer... and I distinctly remember one of our early moments as a startup was that we had Whatfix as our first real customer, and they opened a ticket saying something is not working, and everyone was cheering that, oh, someone has opened a ticket, which means that they're actually using it. That was the moment that, okay, clearly we are solving a problem, because they're like, I need this fixed because my deal is stuck because of this. Okay, now we are in the critical path of your business, which means we are adding value.
00:24:51
Speaker
So we scaled that. That was when you found PMF, basically. Yeah, I would say before PMF.
00:24:59
Speaker
Yeah. So the CLM was sort of the PMF moment, because we started seeing the value that we were giving to customers. Because if it goes down and people start calling you, it means it's important, right? And thankfully it only went down a couple of times, but that is how we knew. And what also worked in our favor was this was also around the time that...
00:25:23
Speaker
When we had a lot of product readiness is also when COVID hit.

Competing on a Global Scale During COVID-19

00:25:27
Speaker
So what ended up happening was that everyone was doing sales online and everything had to move online. So there was both a pull from the demand side because now everyone is looking for these solutions to better collaborate with legal.
00:25:41
Speaker
And we also had a level playing field with other players sitting out of the US, because they are also getting on Zoom calls to sell, and we are also getting on Zoom calls. So now they don't have that advantage of, oh, I'll drop by your office.
00:25:53
Speaker
And that, you know, leveled the playing field for us. So, in a way, COVID worked to our advantage in those early days. And then sort of what ended up happening is we started seeing, you know, okay, now GPT is coming out. And we, in fact, did a course... One second, before GPT.
00:26:13
Speaker
This CLM space, I want to understand it a bit better. Yeah. Who are the legacy companies here in the CLM space? So the big old ones are SirionLabs and Icertis.
00:26:26
Speaker
So you can almost think of them like an SAP-style model, where, you know, it's a platform on which you can build anything. So it is more about customizing it,
00:26:37
Speaker
so you can capture your workflow on it. But that also means that it's very complex to use, and it requires a pretty significant implementation to actually get it to work. One of the customers we onboarded said, we've been trying to go live with one of these two for the last two years. And we came in and we said, we will add contractually that you will be live in 90 days, because we are confident.
00:27:02
Speaker
And we got them live in 60 days. How did you compress the implementation timeline? Like, what was fundamentally different in your product as compared to an Icertis product, that the implementation could happen in 60 days? We were a lot more opinionated, in that, you know, this is how it works.
00:27:24
Speaker
And instead of it being a customization, where, you know, you can build on top of the platform, we made it a configuration where you can choose one of the paths. So now, instead of you having to start from scratch, we will provide you: okay, these are all the ways you can configure it.
00:27:41
Speaker
You tell me how you want to configure it. So there was a little bit of a give and take, where some processes on their end might also need to change. But a lot of our implementation was greenfield.
00:27:52
Speaker
So what would end up happening is the customer would say, okay, I don't actually know what the best path is, so you guys also tell us how we should be structuring our processes. So there was, you know, some level of...
00:28:05
Speaker
them adjusting to make sure their process, you know, adjusted to our tool, and us making it configurable so that whatever their critical asks were, they were still covered by us.
00:28:16
Speaker
As opposed to in Icertis, where you can do anything, but you have to start almost from scratch. So that was the big differentiator. And the other good thing was that there were a few new-age competitors based out of the US who were really educating the market.
00:28:32
Speaker
So we were sort of following in their footsteps, you could say. People were now aware of, and okay with, a cloud-based CLM that can be configured, rather than an on-prem CLM that has to be customized.
00:28:48
Speaker
So it worked in our favor. Is DocuSign one of these newer players, or is DocuSign a CLM? So DocuSign is a very interesting player. We do not really consider them a large competitor to us. DocuSign's bread and butter is, of course, the signing piece.
00:29:07
Speaker
Over the past five to eight years, they have acquired four different CLMs. The challenge from DocuSign is that because they're such an incumbent, they think they can always go and upsell a CLM. But most of their acquisitions have not been successful in terms of outcome. Of course, the acquisitions closed.
00:29:32
Speaker
But I would love to understand why they have not been able to sell, because when we heard that news we were extremely concerned: that brand presence is so strong, how are we going to compete? But we would go into deals and hear, oh yeah, their sales team hasn't even replied to us, so they are not even an option for us anymore. So they acquired four different CLMs, the most recent one last year, and we have not heard of them in many
00:30:03
Speaker
of our competitive bidding processes. So maybe they are focusing on a different part of the market, upper enterprise, whereas we are primarily focused on SMB. And I think there are a lot of reasons for us to do that as well. So that hasn't been a big challenge for us. What are the reasons to focus on SMB?
00:30:23
Speaker
See, one thing we realized is, A, enterprise sales take much longer. And B, all our competition is moving upmarket. So there's just a lot more competition there, because for them to justify their COGS sitting out of the US, which is a very heavy-cost market, it becomes: if I don't get a six- or seven-figure deal, I can't even dedicate people to it. Versus me sitting here,
00:30:49
Speaker
I have a lot more flexibility. And because everyone is moving upmarket, the mid-market becomes a lot less fierce competitively, and it's open. In the CRM world, Salesforce is the enterprise behemoth, but you have a HubSpot who owns the mid-market. If we can replicate that and get some 20,000 customers in the mid-market, we don't need anyone. We don't even want enterprise then, because we have enough revenue, enough time, enough market to sustain and make a profitable business out of it. Okay, this opinionated product strategy sounds like a double-edged sword. On the one hand, yes, onboarding is faster because there are limited choices, and
00:31:42
Speaker
people who are getting 80% of what they need would be okay letting go of the remaining 20% and modifying internal processes. But it could be the reverse also, right? There could be somebody for whom you only cover 20% of what he needs, and he would end up not buying your product.
00:31:57
Speaker
So how does an opinionated product scale up? Yeah, it's a great question, and something we keep running into. What we realized was that a lot of the discovery work that happens on product really has to be done in a strong way. Otherwise, it's very easy for things to break, and we've actually seen that play out. Our initial customer set was sitting out of India, so when we started our first round of sales in the US, there was a lot of pushback,
00:32:29
Speaker
even in terms of the terminology used in the product, before we even got to the workflows and processes we had built. That's when we realized we can't just rely on Indian customers; we have to over-index on the US market.
00:32:42
Speaker
And another example of that: we would do white-glove onboarding, because we realized that's the best way to get customers to see value quickly. We know everything the product can do, and internally it's much easier to get it done.
00:32:57
Speaker
Now that meant we would not expose each and every configuration to the end user. But when you go to America, they want that control, right? They thrive on it: I should be able to configure it on my own.
00:33:12
Speaker
Yeah, Indians are used to having that done for them. Yeah, yeah. So really early on when we started selling there, we almost had to revisit some of our core decisions and say, no, this has to be configurable self-serve.
00:33:25
Speaker
It can't be something that we configure during onboarding. So it is a double-edged sword. People who fall under our exact ICP, per whatever research we've done, love it. They're like, oh yeah, this makes so much sense. This is amazing.
00:33:41
Speaker
But as soon as someone needs a change outside of that, it becomes, what do you mean this is not possible? And what we've learned over the years is to treat all those opinions as best-effort suggestions and not bake them into the product, because we are still unbaking some of the things we baked in super early on.
00:34:03
Speaker
And for me, as long as you have enough customer conversations and don't over-index on one geo or one vertical, you will be fine.
00:34:16
Speaker
And now we are seeing, for example in finance, that whatever questions customers come with, we have solid answers, because we've dealt with so many finance customers. Once you spend enough time in a vertical, you have the right answers to give. Even though your product might solve a problem, if you don't know how to position it in a way that makes sense to them, they are not going to onboard it. Interesting, very interesting. Being able to reframe the product in a way that your customer gets it... sometimes it's just a reframing challenge. Yeah, and
00:34:54
Speaker
one thing we learned there: instead of talking about features and problems, if we can talk about the value we are able to drive, the conversation becomes a lot easier. The customer then understands, see, this is the value I want to see, and this is how you are showing that value. Okay.
00:35:12
Speaker
If I go and say, this is the problem you have, they're like, that's problem number 25 for me. I don't care. Right? I care about closing deals faster. That is what my job as a legal person is.
00:35:22
Speaker
And if we go and say, we are going to help you close deals faster by doing all of this, now we're in business. So I guess you need an insider's understanding of the market, which Shashank probably had, to know how to communicate value. And even to identify what the value is, you need to spend time either studying your customers or having worked in that sector.
00:35:48
Speaker
One of these would help. This unbaking that you spoke of, give me an example of that. Yeah. So we said that every contract has a legal user who reviews the contract and a business user who owns the contract.
00:36:04
Speaker
That's not always the case, but if you build everything with that assumption, undoing it in the code, in the process, and in basically every part of the tool is a pretty heavy lift. And that's where it came from: when we spoke to, let's say, 10 customers, they said, yeah, every time we have a legal user who owns the review side and a business user who owns the contract.
00:36:27
Speaker
Now we have customers who are like, actually, no, it's owned by three different departments, because there are three different pieces to the contract. The InfoSec team owns this part, the sales team owns this part, the procurement team owns this part. So now it's like, okay, we need multiple business owners.
00:36:43
Speaker
And then they're like, okay, the review I actually don't want to set up in the beginning, because based on the time zone, the value of the contract, the area, the entity it's working with, I might have different reviewers.
00:36:55
Speaker
So now we've built a way where you can define this review matrix and it will automatically choose the right person. So there are always things to learn. And the more open-ended you keep things early on, the better.
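The "review matrix" described here can be pictured as a small rule table over contract attributes. This is only an illustrative sketch: the field names, thresholds, and reviewer labels are assumptions, not SpotDraft's actual schema.

```python
# Hypothetical sketch of a rule-based review matrix. All field names,
# thresholds, and reviewer labels below are made up for illustration.

def pick_reviewers(contract):
    """Route a contract to reviewers based on its attributes."""
    reviewers = []
    # High-value deals go to senior counsel (threshold is illustrative).
    if contract.get("value_usd", 0) >= 100_000:
        reviewers.append("senior-counsel")
    else:
        reviewers.append("legal-ops")
    # Region-specific entities may need a local reviewer.
    if contract.get("region") == "EU":
        reviewers.append("eu-privacy-counsel")
    # Security terms pull InfoSec in alongside legal.
    if contract.get("has_security_addendum"):
        reviewers.append("infosec")
    return reviewers

print(pick_reviewers({"value_usd": 250_000, "region": "EU"}))
# ['senior-counsel', 'eu-privacy-counsel']
```

The point of such a matrix is that routing becomes configuration (rows in a table) rather than code a customer has to customize, which is the opinionated-configuration approach described above.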
00:37:10
Speaker
We put some of this in as almost hard assumptions in the code, so then we had to go back and remove those assumptions. And of course, as you build on top of assumptions, it becomes harder and harder to do that.
00:37:25
Speaker
What do you think are the key skills a successful product manager should have? I mean, to me it sounds like you are head of product for SpotDraft.
00:37:36
Speaker
So, considering that you are leading a large product, what are the challenges you have to balance, the different forces pulling on you in different directions, and how do you really succeed as a product manager?
00:37:53
Speaker
Yeah, that's a great question. I think empathy and curiosity towards the customer, their problems, and their processes are probably the biggest one, because if you don't understand what the customer is doing, it's very, very hard to go and improve it. So being able to really empathize, talk to customers, and understand their problems with a curious mindset is probably the biggest one.
00:38:24
Speaker
The second biggest one, and I think anyone who's ever done product will empathize with this, is having a solid framework for how you say no.
00:38:37
Speaker
Right? Now we have some 400 to 500 customers. Every customer is like, oh, make this improvement. Everyone is a product manager. They'll come with their own ideas and suggestions.
00:38:48
Speaker
How do you take that suggestion and think about it in a way that generalizes? And how do you say no in a way that the customer doesn't feel slighted, but instead understands that they did not have the full picture, and that is why it's not possible, or should not be done, or cannot be done? Managing that, being able to say no to the right things, and forcing the team to say no to good ideas so they can work on great ideas.
00:39:19
Speaker
I think that's a genuinely hard skill to have. And finally, I would say something I've seen some of my colleagues at Google do really well: a lot of times it's very easy to go into this feature-factory mode of, oh, let's build this feature also, let's build that feature also.
00:39:35
Speaker
In Classroom, the feature surface area we had was tiny, to the extent that I was talking to the PM and said, we have built nothing. Are you sure people are going to use this? And he's like, trust me, we are saving them hours and hours of time just with this simple tool.
00:39:50
Speaker
And having that conviction, that vision, that even with a very simple tool I know I'm going to save hours and hours of customer time, is a very hard thing to do, because it's very natural to go, oh yeah, that's a great idea, let's build it.
00:40:04
Speaker
But great product managers will take that idea and say, this is why we shouldn't build it. This is why we are not going to solve this problem. And that I think is a very, very hard skill.
00:40:16
Speaker
Yeah, building something simple is really... I guess only the top 1% of product teams are able to build truly simple software.
00:40:28
Speaker
Yeah, exactly. And I guess you need to be opinionated to build something which is simple to use, right? Yeah. That's why Apple is so loved. It is extremely opinionated.
00:40:40
Speaker
But they are also maniacal about keeping it simple for users, keeping it intuitive. Okay, interesting. So let's come back to the journey. You said once COVID hit, you had all those...

AI's Role in Enhancing SpotDraft's Capabilities

00:40:55
Speaker
all the advantages which a US-based company had were erased. You were on a level playing field. You were able to onboard customers for the CLM. Then you were talking about the GPT moment. So why don't you pick up the story from there?
00:41:10
Speaker
Yeah. So see, we were always an AI-first company, and I would... But the CLM was not AI, right? So see, CLM: contracts, templates, negotiation, comments, reviewing, signing.
00:41:28
Speaker
There's really no AI there; it's more workflows and processes. Of course, we were always trying to put AI in. We were like, okay, you give me a Word document which has come from a lawyer, and I will automatically generate a template out of it.
00:41:39
Speaker
I will automatically try to review your contract and insert or suggest comments for you. So we were always trying those things, but it was always a challenge making it work with the technology that was available. And I have always been interested in AI, so I would keep reading papers.
00:41:56
Speaker
So BERT was what we used. It didn't work. There were a lot of different variations of it that came out; we used all of them. Then Google released a model called T5, which was multi-task: instead of doing one task, you could teach it to do many tasks. So we again went and trained that on all the tasks we had. At that point, we had some 200 different tasks.
00:42:17
Speaker
Still, the accuracy was around 60-70%, which for a lawyer isn't good enough, because at that point they're like, I'm going to review everything anyway; I can't rely on something that's only 60% right. AI was like that Microsoft Clippy...
00:42:33
Speaker
Yeah, MS Office had that Clippy feature which was supposed to predict what you needed and make suggestions, etc. But obviously, most people would just find it irritating. Yeah, it would get it wrong more often than it would get it right.
00:42:45
Speaker
Alright. So we were keeping very close tabs on what was going on. And this is when GPT-3 was released.
00:42:55
Speaker
That was almost a step change from my perspective, because we were doing all these evaluations, and this was the first model that, without any fine-tuning or training, was able to match what our fine-tuned and trained models were doing,
00:43:10
Speaker
and as one model, not 10 different trained models. So we actually did an internal company all-hands where we spoke about, okay, this is the future. This was, I think, early in the 2020s; I can look up the exact date, because that message is still sitting in our Slack. That is when Shashank and all the lawyers were like, I don't even understand what you're saying. And I was like, see, if you ask it, you can tell it to think in steps, and then it gives you the right answer.
00:43:40
Speaker
So, you know, this is going to be a big deal. And people were just like, this engineer is doing engineering things. But we started using it, and we started seeing a lot of improvements.
00:43:51
Speaker
And this is when... OpenAI had made it available. Yeah, yeah so this is when OpenAI had just released it and it was still a um you know very engineering and tech-first solution and very ah you know in that niche. It was not something that anyone outside of the tech community even knew about. new about um So then we started saying, okay, now with this technology, we can actually go back to our roots and solve that contract review problem and be first to market with it. And in fact, we built
00:44:23
Speaker
and were first to market in a lot of ways with, an automated contract review tool using transformers. Of course, as we were building, GPT-3.5 came out, then GPT-4, which was in a lot of ways another step-function change. And then we were pretty certain that this is the future.
00:44:42
Speaker
So we really doubled down. We built Verify, our automated contract review tool, on top of this. Now, back then it was so expensive that we had to meter everything.
00:44:54
Speaker
So we built it. We were seeing that really large contracts might even cost us $40 for one single review. So we were like, if we're giving this away at flat pricing, which is what customers wanted, and not per-contract pricing, then we have to meter it and really track everything.
00:45:15
Speaker
What does it mean to meter it? Every single time you run a review, I'm going to track it. And ideally, I want to say, if you pay $50 a month or $500 a month, you get five reviews or 10 reviews.
00:45:30
Speaker
Anything above that, you have to pay $10 or $20 per review. And another reason for metering was for us to actually know the cost. I knew the maximum, based on all the limits we had, was $40 per contract.
00:45:43
Speaker
On average, it was more like $4 per contract. So how do I make sure I'm not losing money on a per-user basis? That's where the whole metering piece came from, and we built all of it.
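The metering scheme described here (included reviews per month, per-review overage, and internal cost tracking) can be sketched in a few lines. The class name, quota, and prices below are illustrative assumptions drawn from the numbers mentioned in the conversation, not SpotDraft's actual billing code.

```python
# Minimal sketch of per-review metering: usage against a monthly quota,
# overage charges, and internal model-cost tracking. Numbers are the
# illustrative figures from the conversation ($4 average cost, $20 overage).

class ReviewMeter:
    def __init__(self, included_reviews=10, overage_price=20.0):
        self.included = included_reviews      # reviews bundled in the plan
        self.overage_price = overage_price    # charge per extra review
        self.used = 0
        self.model_cost = 0.0                 # what the reviews cost us

    def record_review(self, model_cost_usd):
        self.used += 1
        self.model_cost += model_cost_usd

    def overage_charge(self):
        extra = max(0, self.used - self.included)
        return extra * self.overage_price

meter = ReviewMeter(included_reviews=10, overage_price=20.0)
for _ in range(12):              # customer runs 12 reviews this month
    meter.record_review(4.0)     # roughly $4 in model cost per review
print(meter.overage_charge())    # 40.0 (2 reviews over quota)
```

Once per-review model cost dropped to fractions of a cent, tracking machinery like this costs more than what it tracks, which is exactly the argument made next for dropping metering in favor of a flat per-seat uplift.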
00:45:55
Speaker
Then, as the newer models came out, Gemini came out at one-tenth the cost. We went from $40 to four-tenths of a cent over the course of two years.
00:46:06
Speaker
So now my metering became more expensive than what I was metering. That was when we said, okay, we don't need to meter it anymore. We changed how we were selling it to our customers. Earlier it was per seat with a fair-usage policy: you click a button, you consume a credit. All of that we could now remove and just say, okay, we are going to charge you another 20% on your per-seat cost if you want AI.
00:46:31
Speaker
That's it. So it's flat pricing, much easier to reason about and talk about, as opposed to, oh, give me 10,000 credits, which means I can run it on 5,000 documents. And now lawyers are doing math in Excel, and the deal becomes hard to close.
00:46:48
Speaker
So basically we went from contract review to CLM, and back to contract review. And now, within our CLM, we have these AI features peppered everywhere.
00:47:02
Speaker
If a salesperson uploads a contract, it will automatically review it and tell the salesperson what the legal team is likely to flag. So even before the contract goes to legal,
00:47:15
Speaker
it goes back to sales, and sales is like, okay, let me start pushing back on these things so I can keep that conversation warm and not have to wait for legal to get back. We've started seeing that happen pretty frequently.
00:47:27
Speaker
We have this tool called Smart Data Capture that extracts structured data from contracts, and of course contracts are unstructured. That has become a key part of the workflow for a lot of our customers. Imagine you close a deal which says: I need to ship 100 modems at this price point by the end of this quarter.
00:47:50
Speaker
Now that data has to go to an ERP, which is where the rest of the org will pick it up and make sure it gets to the right person. With Smart Data Capture, first I'll get that data out of your contract, even if it was signed on paper, and then I will push it to your ERP.
00:48:06
Speaker
So instead of you having to do it manually, it all happens automatically. And we are seeing certain customers use this even to onboard drivers and educators: to get into the system, you sign a contract on SpotDraft, and SpotDraft sends the information to their system, which then creates and activates the account. Earlier it would be a manual four- or five-step process; now the driver just fills in the contract and they're done.
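The contract-to-ERP flow described here boils down to: extract structured fields from contract prose, then post the record to an ERP endpoint. The sketch below uses regexes and the modem example from the conversation purely for illustration; the patterns, field names, and ERP URL are assumptions, and a real system would use an ML or LLM extractor rather than regexes.

```python
# Hedged sketch of "smart data capture": pull structured order terms out
# of contract text. Patterns and field names are illustrative only.
import re

def extract_order_terms(contract_text):
    """Extract quantity, unit price, and deadline from contract prose."""
    qty = re.search(r"ship (\d+) modems", contract_text)
    price = re.search(r"\$(\d+(?:\.\d+)?) per unit", contract_text)
    deadline = re.search(r"by (\w+ \d{1,2}, \d{4})", contract_text)
    return {
        "quantity": int(qty.group(1)) if qty else None,
        "unit_price": float(price.group(1)) if price else None,
        "deadline": deadline.group(1) if deadline else None,
    }

text = "Supplier will ship 100 modems at $45 per unit by March 31, 2025."
record = extract_order_terms(text)
print(record)
# The integration step would then push `record` to the customer's ERP,
# e.g. with an HTTP POST to a hypothetical endpoint such as
# https://erp.example.com/api/orders
```

The value claimed in the conversation is exactly this handoff: once the terms are structured, downstream systems (ERP, driver-onboarding accounts) can consume them without anyone retyping data.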
00:48:33
Speaker
So yeah, there are a lot of really interesting use cases. In fact, we didn't even think of this one. When our customer success associate was visiting a customer, they asked, what is that QR code that says sign this contract? And they're like, yeah, that QR code goes to SpotDraft. And we were like, wow.
00:48:51
Speaker
Right. So do you also do contract drafting? We would generally tell customers to get their own lawyer and draft their own contract.
00:49:04
Speaker
Of course, if you want a contract drafted based on publicly available knowledge, we do support that. But we are not a legal services company. We don't provide legal advice.
00:49:15
Speaker
So we try to stay away from it. You can use ChatGPT to draft a contract; you can use our agent to draft a contract. But we will not tell you what is right or wrong, because that goes into the legal advice part of things, and that is something we are not set up for. We are not a law firm.
00:49:35
Speaker
We don't provide legal advice at all. But isn't this also just a data problem? You are now privy to...
00:49:46
Speaker
thousands, hundreds of thousands of contracts that your customers are drafting. So you have enough data to create a lawyer agent, an AI agent which is a lawyer.
00:50:01
Speaker
Yeah, so see, the approach we take there is that we will help you draft the contract best. What that means is I will show you all the previous versions of an indemnity clause you've used, so you can choose the right one.
00:50:15
Speaker
But do I know which is the right one in this scenario, as SpotDraft, without the full understanding of your business and your risk tolerance? I do not. And what I don't want to do is say that this is an
00:50:29
Speaker
approved contract, because if something goes south, the reason you pay the lawyer is so you can hold his neck and say, okay, now come and defend what you wrote. We are not set up for that. So we will definitely enable the lawyer to draft a better contract, make more informed decisions on that contract, and do it faster, but we don't want to be that lawyer ourselves. Got it, interesting. What is happening in the LegalTech space overall?
00:50:59
Speaker
Is any company trying to take that lawyer TAM? You know, there is this popular term, the salary TAM: that AI is coming for the salary TAM, that instead of paying salaries to 10 people, you will pay for 10 agents, for example. So is there somebody who's coming for that legal fees TAM?
00:51:23
Speaker
See, I think that's a great question, and something we speak about a lot. Just to set the context: if you look at any large company's legal team spend, the top two line items may be flipped, but they are always going to be in-house legal cost and external legal counsel cost.
00:51:44
Speaker
And there is this list of the top 500 law firms in America. The one at the bottom of that list does millions in revenue.
00:51:55
Speaker
At the top of the list, forget about it, because they do billions. So our view is that there is definitely going to be a change here: if we can enable lawyers to do some of the things they would have gone to external counsel for, at a much, much cheaper price point,
00:52:12
Speaker
they are going

SpotDraft's Agentic Solutions and Market Adaptations

00:52:13
Speaker
to use that. So we have our own agentic solution called Sidebar, which we are testing with customers, and we have a lot of them using it. We are seeing two things play out. One is that in-house legal teams at these companies would not do certain activities before, because they were just too time-consuming and too expensive.
00:52:34
Speaker
One of the guys was telling us that he subscribes to 50 newsletters to stay on top of regulations across the globe, and he basically reads them on his commute to and from work every day, because those newsletters are general-purpose. They are an FMCG company that does wholesaling.
00:52:51
Speaker
So he's like, I don't know which of the newsletters, which of the new laws, apply to wholesale versus retail. Half my time is just spent figuring that out. So in Sidebar, you can just say: I'm a wholesaler, these are the newsletters, go read them and give me a weekly digest.
00:53:07
Speaker
So now he's started using it for that, and he's like, now I don't know what to read on my commute, because I have too much time. And then we have another customer who's also an FMCG, but in retail.
00:53:18
Speaker
So they have the same exact use case, but in a different way. The key thing with agents, which I think earlier software solutions did not unlock as easily, is that you can configure an agent to do exactly what you want, because you can almost code the agent in plain English.
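"Coding the agent in plain English" means the task-specific behaviour lives in a natural-language instruction that a generic agent runtime runs on a schedule. The sketch below is purely illustrative: the config keys, the wholesaler instruction, and the `run_llm` stand-in for a model API are all assumptions, not Sidebar's real interface.

```python
# Illustrative sketch: the whole "program" is a plain-English instruction
# plus a schedule; the runtime just feeds sources and instruction to a
# model. `run_llm` is a stand-in for whatever model API is used.

AGENT_CONFIG = {
    "name": "regulation-digest",
    "schedule": "weekly",
    "instruction": (
        "We are an FMCG wholesaler. Read the attached newsletters and "
        "summarize only the regulatory changes that apply to wholesale, "
        "not retail. Produce a short weekly digest."
    ),
}

def run_digest(config, newsletters, run_llm):
    """Join the instruction and source texts into one prompt and run it."""
    prompt = config["instruction"] + "\n\n" + "\n---\n".join(newsletters)
    return run_llm(prompt)

# With a fake model, just to show the plumbing:
digest = run_digest(
    AGENT_CONFIG,
    ["GST slab changes announced...", "New retail labeling rules..."],
    run_llm=lambda p: f"Digest covering {p.count('---') + 1} newsletters",
)
print(digest)  # Digest covering 2 newsletters
```

The retail customer mentioned next would reuse the exact same runtime with a different instruction string, which is why the speaker calls this configurability the key unlock.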
00:53:37
Speaker
So this is an example of something you probably wouldn't even do before; this is new capability, new value being unlocked. On the other side, there are things where, say, you are in an M&A and you need an external legal firm to do due diligence on hundreds of thousands of contracts. Now you're looking at a bill of six or seven figures USD. What our agent is able to do is a first-pass review of those contracts in minutes as opposed to days, costing hundreds or thousands of dollars as opposed to six or seven figures.
00:54:14
Speaker
So earlier, you would say, you know what, only once the deal is very close to completion will we involve the lawyers, because it's going to cost a lot. Now we are seeing new customers come in and say, as part of the deal due diligence, before we close it, let's start looking at these contracts, because now I can, and I don't need to go and pay so much money.
00:54:34
Speaker
So what they do is run the agent to get that first pass, then take a subset of that data and still send it to external counsel. But now, instead of external counsel going through 1,000 contracts, they're going through 500.
00:54:47
Speaker
Instead of starting from scratch, with each contract taking half a day, they start from an informed place, and each contract takes maybe a few hours. So the counsel spend is not reducing, but it is becoming much more strategic and much more focused.
00:55:03
Speaker
And this is also playing out in law firms, where they are deploying AI agent solutions that, again, make them more effective and more productive. So it's happening on both sides. And we are even hearing of emergent use cases, where earlier, if I wanted to create a report of the risk in all my old contracts, or figure out how I should change my order form or my MSA to help contracts close faster, based on what people usually push back on,
00:55:37
Speaker
this is an effort they would almost never take on, or they would hire a new person just to do it. And now, with our technology, you can do it in a few minutes without having to do it manually. So it's a value unlock of new use cases, as well as making existing use cases work much, much faster.
00:55:55
Speaker
So Sidebar sounds like a generic agent, something that can read emails, summarize, categorize.
00:56:06
Speaker
Is that what it is? Or have you specifically trained it for the legal domain? Help me understand Sidebar a little bit.
00:56:17
Speaker
There is, of course, a lot of overlap with any generic agent, because that standard capability set always needs to exist. For example, you can do deep research on Gemini or ChatGPT; you can do it with Sidebar as well.
00:56:33
Speaker
But Sidebar's deep research is different. If you ask about new GST laws coming out in India that are going to change the slabs, a Gemini or a ChatGPT will give you answers from news outlets and blogs.
00:56:49
Speaker
We will give you answers from the Gazette of India. We want to go to the authoritative sources, because that is what lawyers care about. So that is one big difference. The other big difference is that we heavily over-index on making it work for lawyers' use cases.
00:57:12
Speaker
So we fine-tune the models, everything, to work better for those legal use cases. We also deeply integrate with the SpotDraft CLM. So you can ask, show me the last five contracts that were signed, or which contracts don't have an indemnity clause, and it will go through your entire repository, do a deep research, find those clauses, and give you the answers. It's also been tweaked to work from a lawyer's mindset: a lot of times your Gemini or ChatGPT will use general knowledge to answer.
00:57:43
Speaker
We say if it's not written in the document, you cannot answer. So we focus on reducing those hallucinations, calling out ambiguity when it exists, and linking to sources that lawyers care about.
00:57:56
Speaker
So we're making a verticalized version of ChatGPT or Gemini, and we are integrating with tools and providing tool sets. For example, we have Verify, which is our contract review tool.
00:58:08
Speaker
Sidebar can actually use Verify to do a review. We have Smart Data Capture, so Sidebar can use it to extract data from hundreds or thousands of documents. So it's also aggregating and amalgamating all our different AI solutions in a way where you get more value using them together, as opposed to, oh, let me extract it in one place, get an Excel sheet, upload it to Gemini, and hope it works out.
00:58:48
Speaker
So, for example, in Google Sheets you can use Gemini, but my experience has been underwhelming so far. It's not really able to do the kinds of things you speak about, like: go through my Google Drive and find the resumes of product managers.
00:59:05
Speaker
If I give that prompt, I think the response will be underwhelming. So is that the state of models today, as you see it? What are you seeing?
00:59:18
Speaker
See, yeah. First of all, that's absolutely true, especially in some of those open-domain scenarios. If you look at the Gemini agent in Sheets and say, okay, format this sheet correctly, it can't. It just...
00:59:34
Speaker
It can't. Yeah, it's pretty useless so far. It's a mix of what capability set has been given to the agent, meaning what tools it has access to, and how tuned it is to a particular domain.
00:59:47
Speaker
With Gemini, I think the big challenge is that they are trying to solve it for everything, right? You can use it in Docs, in Sheets, in Drive, as its own standalone. So then you can't focus on one.
01:00:01
Speaker
I would actually look at Cursor as a great example of what is possible when an agent is tuned to a particular domain. Cursor, if you haven't heard of or used it, is an AI-agent-first IDE for developers. All developers use an IDE, an integrated development environment: essentially the place where you write code, with all the bells and whistles that make coding easier.
01:00:29
Speaker
Cursor is an IDE with an agent sitting inside it. What that allows you to do, instead of navigating the code yourself and finding where things happen, is ask Cursor, and it will navigate the code for you and give you answers. That's a very simplistic view of it. We've actually been using it for a good six months now. In the beginning, people were like, sometimes it works, sometimes it doesn't; you have to really give it the right signals and indications.
01:00:58
Speaker
Now with Claude 4 and Gemini 2.5, it's actually become significantly better, to the extent that we have given access to Cursor to our design and product team.
01:01:10
Speaker
And last quarter, they shipped 20 changes without involving engineering. Of course, we still have to review and approve the code. But earlier, they would come to an engineer and ask: how does this work? Can I make this change? How much time will it take?
01:01:24
Speaker
All of those questions are now answered by Cursor. So if they want to understand how a particular part of the code works, they'll ask Cursor, and Cursor will give them a very elaborate answer.
01:01:36
Speaker
Then they'll ask follow-ups, try to get an even deeper understanding, and then go to the engineer saying, see, this is what Cursor is telling me. Is it right? And if it is right, does that mean we can ship this feature in two days as opposed to two weeks?
01:01:50
Speaker
And that has been a fundamental shift in how we operate. In fact, now we have Cursor sitting inside our Slack environment, and I am seeing, say, customer success asking, oh, some customer wants this to be done.
01:02:07
Speaker
Can we answer? Is it even possible? Sometimes you don't even know if it exists already. So now product managers are just asking Cursor, getting the answer, trying it out on the platform, and answering the customer directly.
01:02:22
Speaker
And this doesn't involve writing code. So that is the biggest unlock: someone who doesn't have a technical skill set is able to answer technical questions, and that is being enabled by Cursor. So in terms of what agents are making possible, if you look at agents that operate in a very constrained domain, there is a lot of really great value being added.
01:02:46
Speaker
For more open-domain things, it's not as good. So for the example you were talking about in Google Sheets, there's actually a YC company, I think called Shortcut. They've built an agent that just does Excel.
01:02:59
Speaker
So there you can ask it to, say, do a DCF analysis, and it'll be able to do it, fix the formatting, all of it. So if you have enough focus, then you know what those use cases are and you can truly solve for them. And that's why in Sidebar,
01:03:16
Speaker
we are able to say that one of the great examples we've seen customers use us for is: oh, this new EU AI Act came out. I don't know what I need to change in my privacy policy, my DPA.
01:03:29
Speaker
So you can just give it your DPA. It will go find the EU AI Act and make the changes. And we've heard customers say 70-80% of my work is done. As opposed to if you try to ask Gemini to do it, it'll do all sorts of crazy things.
01:03:42
Speaker
So it's also about having focus, because then you can really prompt and train the agent on that. And that is what we are seeing across the board. Even with tools like Replit, which are again a bit about engineering, developing apps for non-engineers is where the most value is being added right now. Which is why Cursor went from, I think, 100 mil to 500 mil ARR in the course of a year: it's just so easy to deploy, because all those checks and balances are already present in the SDLC, where you still have to get your code reviewed. It's not like suddenly the entire process has to change. So that is another factor:
01:04:28
Speaker
if something fits very easily into an existing process, it's much, much easier to adopt. And those are some of the factors we are keeping in mind as we build out our agents.
01:04:40
Speaker
Do you think all these products will be rebuilt AI-native? Like, say, somebody will build an AI-native spreadsheet software, and that would really have that magical experience where you no longer go inside the cells.
01:04:54
Speaker
I mean, even for a beginner, a spreadsheet is intimidating. How do you use a spreadsheet? How do you feed in a formula? It's like low-level coding, in a way, putting a formula in a spreadsheet: you should know C3 is where the data is, and so on. So you could replace all this with just instructions. You think that's the way forward?
01:05:17
Speaker
Like all of these tools will be rebuilt AI-native, or there will be new companies building an AI-native spreadsheet, an AI-native word processor? Yeah, I think that's a great question. And something, again, we keep thinking about because...
01:05:31
Speaker
we are also like, okay, is there going to be an AI-native CLM? So I think the way I look at it, and what we've seen play out, is that at the end of the day it's about how much autonomy you are giving versus how much control the user still has.
01:05:47
Speaker
You can almost think of it as a slider: on full control, you just ask the agent to do something and copy-paste it; on full autonomy, the agent is doing it all by itself. But at the end of the day, what we are seeing happen is you still need some level of deterministic behavior that is under your control. So, for example, if you look at Canva, the content-creation platform, they have an agent, but their agent is successful because if the agent makes a stupid mistake, I can just go change it in the Canva UI. I don't need to prompt it again.
01:06:22
Speaker
Similarly with Cursor: if it makes a stupid mistake, I can write the code myself. Now, if an AI-native solution is being built, they will eventually reach a point where they need to build the non-AI part of it, because users will need that control.
01:06:39
Speaker
So either you partner with an AI-native solution and you are the non-AI part of it. That's what we are doing with, let's say, VerifAI, where we are sitting inside Word.
01:06:51
Speaker
So anything we do wrong, you can just Command-Z and undo it. Or you can just go type it yourself. So the autonomy isn't fixed at fully autonomous; it can easily go back to no autonomy at all.
01:07:04
Speaker
Which means the user isn't having to do extra work. But look at some of the competitors of Cursor, for example, where there is no IDE. You just type what you want.
01:07:17
Speaker
We'll do it in the cloud and give you the code. And a lot of times it's like, oh, it's just made this one mistake, and now I have tried three prompts to tell it to fix that one mistake, and it just becomes worse and worse.
01:07:28
Speaker
Right. So that's when it breaks completely. Even when we're looking at competitors in the legal tech space, a lot of these AI-native solutions are now building workflows, now building approvals, because those things you want to be deterministic. You want to be able to change them by pressing a button, not by explaining in half a paragraph that, oh, for this one approval, change this one condition.
01:07:54
Speaker
If I have to explain it, I'm already losing my mind, versus just having to click it. So the long-term view from my perspective is some level of convergence there: either it is the incumbents that are able to really give agents control over the existing UI, which means I can see the agent do the work and stop it and change it at any given point,
01:08:19
Speaker
or it's going to be AI-native companies that end up building all that stuff the incumbents had, but maybe in a way more attuned to a chat experience, where the UI shows up in the middle of the chat as opposed to you having to go there.
01:08:34
Speaker
But I think that UI is going to continue to exist, especially in enterprise scenarios. So, ChatGPT costs $20 a month. Does Sidebar cost similar, or how do you price Sidebar?
01:08:49
Speaker
Sidebar is more expensive than 20. We're still working on the pricing because we're in the private beta phase, and we're also seeing what kind of usage people are doing to make sure we price it right from a value-add perspective. But we are looking at something like $3 to $50 a month for that.
01:09:07
Speaker
And for the core CLM, all the AI offerings are offered as an AI-plus package on top of the existing per-seat or per-document charge that you pay for a CLM.
01:09:19
Speaker
You said $350 a month, per seat or a flat fee? Yeah, of course. And there's a very specific challenge there, I'll tell you: you could buy one seat and then deploy it so that everyone in your org sees value.
01:09:39
Speaker
Because I am doing, say, regulatory compliance monitoring and sending out an email. Now the value is being seen by all the people reading that email, but the person paying is just one person. So that's why this is indicative, early thinking around pricing. We are also looking at pricing it from an outcome perspective: each successful chat, you pay half a dollar or a dollar for. That's friction, right? That's what you wanted to avoid. Yeah, so if I charge you for any chat, you're going to say, okay, now it's expensive. But if I charge you for successful outcomes, now it's like: okay, if I had done the same thing with a human, an external counsel, I would have paid maybe $100, so I'm willing to pay one dollar.
01:10:31
Speaker
So we don't have an answer yet, which is why... What I was trying to get to is another place where agents are becoming fairly common and deployed in production: customer support.
01:10:43
Speaker
And one of the marquee companies in this space out of the Valley, other than Zendesk, is Intercom. And Intercom has moved to a pricing model where any ticket closed by the agent, you pay a dollar for.
01:10:58
Speaker
So if the agent is not able to close the ticket, you don't have to pay anything. But if it successfully closes it and the customer gives a thumbs up, you pay a dollar for it. So that aligns pricing with the value the customer wants. And it's a much clearer thing: today, what you truly care about is closing customer tickets, right?
01:11:18
Speaker
Everything else, how many people you have, how many seats you have, is a path to that. So what we want to do is figure out if there is an equivalent in the legal tech space that we can charge for with Sidebar. But the
01:11:35
Speaker
entire industry and competition is charging per seat per month. So if you go into a comparative bidding process and your pricing is different, you're already at a disadvantage, because now people can't compare apples to apples. Which is why we are going with per-user-per-month pricing to start off with.
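The per-seat versus per-outcome tension described above can be sketched in a few lines. All numbers here (seat price, outcome price, volumes) are hypothetical illustrations, not SpotDraft's or Intercom's actual rates:

```python
# Sketch: per-seat vs outcome-based pricing for an AI agent.
# All rates and volumes below are invented for illustration.

def per_seat_cost(seats: int, price_per_seat: float, months: int = 1) -> float:
    """Flat cost, independent of how much value the agent delivers."""
    return seats * price_per_seat * months

def per_outcome_cost(successful_outcomes: int, price_per_outcome: float) -> float:
    """Customer pays only for successful outcomes (e.g. a closed ticket)."""
    return successful_outcomes * price_per_outcome

# A 10-seat team at a hypothetical $50/seat/month:
seat_bill = per_seat_cost(seats=10, price_per_seat=50.0)

# The same team whose agent closed 240 tickets at $1 per closed ticket:
outcome_bill = per_outcome_cost(successful_outcomes=240, price_per_outcome=1.0)
```

The point of the comparison: the seat bill is identical whether the agent closed 240 tickets or zero, which is exactly the misalignment outcome pricing tries to fix.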
01:11:52
Speaker
Do you think AI will increasingly make this possible, pricing moving from input to output to outcome? Absolutely.
01:12:03
Speaker
I think as these agents become more and more powerful, I would be willing to pay Cursor on a per-successfully-merged-code basis, right?
01:12:15
Speaker
Because at the end of the day, that is the value. So instead of me saying, okay, I have to pay $40 a month for everyone, and that's a big cost, and now I have to monitor whether they're actually using it.
01:12:28
Speaker
If I pay, okay, you write code that we merge and we pay per line: great. I know I'm only going to pay for things that actually add value. So definitely it is moving towards that; that, I feel, is going to happen. It boils down to how easy or hard it is to truly capture whether the outcome is successful or not. And in customer support, that's very easy.
01:12:52
Speaker
Chat was closed with a thumbs up, great. Or like a CRM tool charging you for every reply you get to a cold mail campaign. That could be interesting: you're not paying for the campaign per se, but for every successful meeting lined up, for example. Every meeting booked, you pay for. Great. Yeah, yeah.
01:13:14
Speaker
And there are, by the way, people doing agents in exactly that SDR world who are charging like that. And that then aligns everyone, right? I, as the company, want a successful outcome. You want that outcome. Everyone is aligned on the value prop. So I think that is a natural move.
01:13:32
Speaker
Now, the other side of that is it doesn't naturally align with the cost basis that you have. You might have sent 100 emails and got zero replies, but you still had to pay for those 100 emails and all the LLM calls you had to make. So you kind of need to do due diligence on your client then.
01:13:53
Speaker
Like, who are you selling it to? Is the outcome they want even possible? Yeah, so once you have that level of conviction, that I can do this and I can do it right, then charging like that becomes much easier.
01:14:07
Speaker
Otherwise, it's very easy to burn so much money on LLMs, because while they've become multiple orders of magnitude cheaper, they're still very expensive. And in fact, I was reading this article the other day that for a lot of these AI-native companies, like your Perplexity, ChatGPT,
01:14:32
Speaker
the COGS and margins are insane compared to a standard SaaS offering, where 80-90% gross margin is almost taken for granted.
01:14:43
Speaker
For these, it's like 20-30%. And what happens when I get too many free users? Like that Ghibli moment that happened with ChatGPT.
01:14:54
Speaker
Now I'm suddenly paying for free users more than they will ever convert. So it's very early days, which is why pricing is a very big question too.
01:15:06
Speaker
Has the onboarding time come down because of AI? You told me earlier that 30 to 60 days was the onboarding time. Is AI making it possible for customers to do more self-serve onboarding by just writing English?
01:15:24
Speaker
Yeah, so we have an internal mandate to use AI across the org, and for the delivery and onboarding team, I think there's a lot of value to be derived from it.
01:15:38
Speaker
But it also boils down to expectation-setting for the customer. Because I can't go sell to a customer saying I am going to give you 100% accuracy because everything is going to be manually reviewed, and then just run an AI that may be 70-80% accurate. So it depends on what the ask is. So what we've started doing is we'll tell the customer: see, for these things that you want to extract, we're looking at 60, 80, 90% accuracy.
01:16:12
Speaker
And the customer is going to say, you know what, for the contracts of the last two years, which I truly want to track and which are still ongoing, I want 100% accuracy, because those contracts are ongoing and it matters.
01:16:23
Speaker
But for this historical set from the last 20 years, it's okay if it's 70%. So as long as we clearly call that out and make sure expectations align, we can actually see a huge reduction in onboarding time. Just to give you an example, we onboarded a pretty large listed US company, in the chemical space, I believe. And they have contracts that go back 30, 40 years.
01:16:55
Speaker
Initially, we said everything would be manually vetted, and that became like 50,000 or 80,000 contracts that had to be manually vetted. And of course, that's going to take months and months.
01:17:10
Speaker
So we gave them a timeline of almost a year. And we were like, okay, you tell us what is most important, we'll do that first, so you start seeing value. But for the long tail, it's going to take a year, because I'm not going to suddenly get 1,000 people to do your 100,000 contracts.
01:17:24
Speaker
Then we went and asked them: do you really care about 100% accuracy for everything? And we did a few experiments where we asked them, okay, tell us how you want this data.
01:17:38
Speaker
We'll give you the AI output. You can validate and vet it at your end and see if the kind of reporting and analysis you would want to do on these contracts still comes through. And even at 80% accuracy, they were getting enough value out of it. So we took it from a year to almost two months and closed it.
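The arithmetic behind that triage is worth making explicit. A minimal sketch, with invented contract counts and review times (the episode gives only the 50,000-80,000 range and the year-to-two-months outcome):

```python
# Sketch: manual-review effort when only recent, ongoing contracts need
# full vetting and the historical long tail accepts AI-only extraction.
# Volumes and minutes-per-contract are hypothetical illustrations.

def review_hours(contracts: int, minutes_per_contract: float) -> float:
    """Total human review time in hours."""
    return contracts * minutes_per_contract / 60

recent = 5_000        # ongoing contracts: 100% accuracy, fully vetted
historical = 75_000   # older contracts: AI output accepted at ~80% accuracy

full_manual = review_hours(recent + historical, minutes_per_contract=6)
triaged = review_hours(recent, minutes_per_contract=6)
```

Under these assumed numbers, triage cuts the manual workload 16x, which is the shape of how a year-long migration can collapse to a couple of months.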
01:17:57
Speaker
Because now we were able to run it using AI, and AI is able to do it much faster. So instead of an entire year of manual validation, it took two months of manual validation for the last one to two years of contracts, and everything else we ran through AI. And now they're seeing value from even those older contracts. And when they identified, okay, AI got this wrong,
01:18:19
Speaker
we told them, okay, we'll update the prompt. And now we've gotten that also to a higher accuracy. So AI is definitely helping. And this is on the extraction side. Even on the templating side, we have internal tooling that takes a template that's come from your lawyer and converts it to a template on SpotDraft.
01:18:39
Speaker
But that we still manually vet each and every time, because the last thing you want is something there to be set up incorrectly. Just for ongoing contracts. Yeah. And what ends up happening is we set up that template,
01:18:52
Speaker
there are two legal people, but there are 500 salespeople using that template, and the sales guy is not going to read it. He wants to close that deal. So as soon as a contract... What exactly happens in onboarding? What does your onboarding team do?
01:19:06
Speaker
They create templates for the company, as per their regular contracts. Like if a company has a lot of, let's say, contracts with drivers, and so on, there will be maybe 100 different templates. So in the onboarding process, you make those templates available through a dropdown menu where somebody can just select.
01:19:26
Speaker
That's what they do, your onboarding team? So it's not just that. We will do what we call assisted migration. If you have contracts sitting in any tool, or even in a Google Drive or your hard disk, you want them to be correctly set up and imported into the system. You can do it yourself, but a lot of times, especially for US customers, it's just so much cheaper to get it done via us. So they'll give us ten or a hundred thousand contracts. We'll segregate them by contract type, by region, entity, geo, whatever. And again, we use AI for a lot of it, but vet manually depending on the customer's risk tolerance.
01:20:06
Speaker
We will also take your existing template. Usually you will already have a template of sorts: a Word document with highlights saying, only fill in this part, change this part when you have greater than this value.
01:20:20
Speaker
So we'll take that and set it up in our templating system with those conditions in place. So essentially, the salesperson, or whichever non-legal person, can just fill out a questionnaire, and based on the answers, it will automatically generate the right contract. Of course, I'm oversimplifying, because
01:20:42
Speaker
the questions also change based on what answers you give. The contract also changes based on what answers you give. The approval requirements change based on what answers you give. So all of this put together is the workflow.
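The conditional logic just described can be sketched as answer-driven clause and approval selection. The field names, clause names, and the $100,000 threshold are all invented for illustration; a real CLM workflow engine is far more configurable than this:

```python
# Sketch: questionnaire answers drive both the contract's clauses and its
# approval chain. All names and thresholds below are hypothetical.

def build_contract(answers: dict) -> dict:
    clauses = ["standard_terms"]
    approvals = []

    # The contract text changes based on the answers...
    if answers["contract_value"] > 100_000:
        clauses.append("extended_liability_clause")
        approvals.append("legal_head")   # ...and so does the approval chain.
    if answers["jurisdiction"] == "EU":
        clauses.append("gdpr_dpa_clause")

    return {"clauses": clauses, "approvals": approvals}

# A large EU deal picks up extra clauses and a legal-head approval:
deal = build_contract({"contract_value": 250_000, "jurisdiction": "EU"})
```

In a real system the questionnaire itself would also branch on earlier answers, as the speaker notes; this sketch only shows the answers-to-output step.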
01:20:55
Speaker
So when we do an onboarding, we'll say, okay, we'll set up one or two workflows for you and teach you how to set up workflows, so you can do it yourself. Or if you don't have bandwidth internally, you can ask us, and we charge on a per-workflow basis then. So with your license, one or two setups are included, and beyond that, we can do it on a paid per-workflow basis.
01:21:19
Speaker
How much of your revenue is subscription and how much is this services kind of revenue? Around 80-85% is subscription. Services is a sliver. And services only happen when you're signing up a new client?
01:21:34
Speaker
The one-time onboarding. That's what we thought too, but we've also seen it with some of our existing customers. Usually what ends up happening is we'll start with one team. For example: okay, the sales team, we want to scale how much sales we're doing, so we want to automate it. But now we are seeing value in that, so actually, let's get procurement or HR or some other team also set up.
01:21:57
Speaker
And either the legal operations person will try to set those up themselves, or they'll come and say, you know what, in procurement we have these 5,000 contracts sitting.
01:22:08
Speaker
I don't want to manually import them. Let's get SpotDraft to do it. And that then becomes additional one-time revenue that comes after the customer's been onboarded. So we do see some of that happening, especially when there's expansion within a customer, either at a geo or at a business-unit level.
01:22:31
Speaker
And this importing is essentially a classification and tagging process: classify what kind of contract it is, then data extraction and filling up some fields?
01:22:42
Speaker
Yeah. For example, we were working with an events company. They sign a lot of contracts with hotels. Now they'll say: okay, how much food is included? What's the minimum guarantee?
01:22:54
Speaker
What dishes are included? Is tax included or not? Is parking included or not? How much is the parking charge? That is truly what they care about, because those are the things they then have to charge for.
01:23:06
Speaker
So they set up our tool to automatically extract whether parking is free or not, what kind of parking there is, and some 50 different data points. So you can configure the tool to extract anything that is critical to your business. We support some 150-200 fields out of the box, which are very general purpose: when was the contract signed? When does it expire?
01:23:29
Speaker
Who's it with? What is the jurisdiction? So very basic stuff we have out of the box, but when you have things that are unique to your business, you can configure it. And we have a way for you to see how well it is performing; you can keep updating it, and you can still use that data, push it to reports, push it to other systems. That is where the true value comes from, because now you're extracting information relevant to your business. There's this concept of a context window,
01:24:00
Speaker
that you can feed data to your chat, which is currently ongoing. So as long as the data is within the context window, it can refer to whatever you fed it, whatever document, et cetera, and give you information from that.
01:24:16
Speaker
And I believe the bigger the context window, the more expensive it is. So for an organization which has Sidebar, would the context window be all contracts of that organization? Because someone can go to Sidebar and say, like this events company, find all contracts where I am responsible to pay for parking.
01:24:37
Speaker
Then it would actually need to see each and every contract and see where I am responsible for parking. And that sounds like a very expensive query. Is that the case? Or is there a way to optimize it?
01:24:49
Speaker
Great, great question. The short answer is you end up having to put it in the context. But we have three or four different ways where we try to understand which part of the contract is even relevant for this question.
01:25:06
Speaker
So instead of passing the whole contract, I will do retrieval and identify the relevant pieces and only pass those to answer. That's one piece. The second is when you're doing analysis of any sort on more than five items.
01:25:19
Speaker
We actually treat it as doing analysis on each item individually and then summarizing and contextualizing the final answer. So if you have a thousand contracts, it will go through each of them individually and try to answer the parking question.
01:25:35
Speaker
Then it'll take those thousand rows as an Excel or CSV and run an LLM on it again: okay, this is the output. And actually, there we allow the model to also run code, because if you have a thousand items and the model needs to remember all thousand, it's very likely to mess up.
01:25:51
Speaker
But if it can output code that will analyze the thousand items, it's very likely to do the right thing. So we give it those thousand, ten thousand items. It will generate code that says: okay, I need to look at the column of who is responsible for parking and generate a bar chart, or give you a categorical view on it. So it will write some code that counts all of that and gives the final answer.
01:26:15
Speaker
And of course, if you have multiple columns, it can do it across them. And then finally, when it has that, it will again go and summarize, saying: I've analyzed this, and this is an outlier, or this is what I'm seeing as the trend.
01:26:27
Speaker
So

AI's Impact on Organizational Roles and Workflows

01:26:28
Speaker
that post-facto analysis is what the LLM does, but the middle part we actually make it write code for, so that it runs deterministically. Because done probabilistically, you'll run it three times and go: how can this analysis be different each time? And then you lose faith in it.
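The per-contract "map" step followed by deterministic code-based aggregation can be sketched as below. `extract_parking_terms` is a stand-in for an LLM extraction call on a single contract (the contract strings and category labels are invented); the aggregation is plain code, so repeated runs give identical results:

```python
# Sketch: map each contract through an extraction step, then aggregate the
# resulting rows with deterministic code rather than asking the LLM to
# "remember" a thousand items. Data and labels are hypothetical.
from collections import Counter

def extract_parking_terms(contract_text: str) -> str:
    """Stand-in for an LLM extraction call on one contract."""
    if "parking payable by customer" in contract_text:
        return "customer_pays"
    return "free"

contracts = [
    "venue agreement ... parking payable by customer ...",
    "hotel agreement ... parking included at no charge ...",
    "hotel agreement ... parking payable by customer ...",
]

# Map: one extraction per contract (in production, one LLM call each).
rows = [extract_parking_terms(c) for c in contracts]

# Reduce: deterministic aggregation, the kind of code the model is asked
# to generate instead of tallying items in its own context.
summary = Counter(rows)
```

A final LLM pass would then summarize `summary` in prose; only that narration is probabilistic, while the counts themselves are reproducible.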
01:26:45
Speaker
Right, right. Okay. What does the org chart look like once you have become an AI-first organization, where you made it a mandate that everyone needs to use AI? Like, I've heard of every PM being mapped to one engineer, because AI does a lot of the coding, as opposed to, I guess, Google, which would have had one product manager with eight or ten engineers. AI-native companies are saying it should be one-to-one.
01:27:16
Speaker
Do you see that? Is that happening within SpotDraft? So we still have the old-school mapping of around eight to one, eight engineers to a product manager.
01:27:29
Speaker
What we are seeing is that the biggest value-add from AI is actually non-technical people being able to do technical things, and this applies across the board. So we spoke at length about how designers and PMs are now able to write code, in a way. But in terms of the entire org impact: we have revenue operations, whose job it is to give insights and analyze the pipeline, the sales, what they look like, all of that.
01:28:00
Speaker
Now with agents, I am actually able to ask those questions and get answers myself. Earlier, I would be in the CRM trying to figure out how do I get this, what chart, what data point.
01:28:13
Speaker
Now I just ask, and it figures it out on its own. Yeah, like HubSpot has that. I think HubSpot has some sort of sidebar version of HubSpot that you can just ask. Yeah, it's going to be everywhere, right? And in fact, half the time we don't even use the HubSpot agent, because our Sidebar supports MCP, so we can connect HubSpot to Sidebar.
01:28:36
Speaker
But since it's more legal-focused, it'll qualify every answer, which doesn't always fit. So we actually use Claude for that a lot as well, because you can configure MCP there.
01:28:48
Speaker
And in fact, we have our own MCP server. So now the legal team uses Claude to ask questions about the contracts, and questions that mix data from HubSpot and SpotDraft can happen in Claude.
01:29:03
Speaker
Earlier that would be: first let me get a report from the RevOps guy, now let me get a report from the LegalOps guy, now I am sitting trying to map them, trying my luck with VLOOKUP, which I've never gotten to work properly. As opposed to: the agent has done 90% of the work, and now I'm just looking at the final result and going back to RevOps saying, does this data look right?
01:29:25
Speaker
Right. So

Integrating AI with MCP for Enhanced Solutions

01:29:26
Speaker
that has been the key change in operational and organizational behavior: we have enabled people to go get the data themselves and then go to the relevant expert asking, is this right or wrong, versus going to the expert to even get access to the basic data.
01:29:44
Speaker
On the non-technical side, we have in fact set up roles whose job is just: how can I use AI to do everything faster? The BDR team has one person figuring out, can I generate emails automatically?
01:30:02
Speaker
The marketing team has two people figuring out, can I generate copy and content and all of that automatically? So within the teams, we are empowering people to do it. And in product, engineering, and design, if I see someone not using AI, I'm going to have a hard conversation with them: if you don't do this, you will become obsolete.
01:30:27
Speaker
What is MCP? How does it work? Yeah, sorry, I'm so deep in it, I just throw out these terms. A lot of the power of agents comes from their agency to use tools, right? They can decide: I want to use the browser tool or the search tool to go look at something and then give you an answer.
01:30:55
Speaker
So MCP is something that Anthropic, which builds Claude, came up with. It is a protocol that provides a standard way of defining these tools.
01:31:05
Speaker
So if you talk about SpotDraft: we have an MCP server that has tools like find contracts, get contract text, review contract, approve contract, whatever.
01:31:17
Speaker
Now I can give that MCP server to Claude, and Claude suddenly knows how to talk to SpotDraft. Claude says: I can get contracts, I can approve contracts, I can review contracts.
01:31:29
Speaker
Now I can also add another MCP tool, let's say for Google Drive: search drive, get document. Now I can say, find which documents are present in SpotDraft and not present in Google Drive, and copy them over to Google Drive.
01:31:44
Speaker
So what it'll do is: first, call the SpotDraft tool using MCP to get the contracts; then call the Drive tool to search for this contract; then call another Drive tool to save this contract. In the old days,
01:32:02
Speaker
pre-agents, you would have to use these APIs, which means you're writing code. Or you would use something like Zapier and build out a workflow, and then you'd rely on whatever tools exist in Zapier.
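Concretely, an MCP tool invocation like the "find contracts" example above is just a JSON-RPC 2.0 message on the wire. A minimal sketch; the tool name `find_contracts` and its arguments are hypothetical stand-ins for what a CLM's MCP server might expose, not SpotDraft's actual tool schema:

```python
# Sketch: the shape of an MCP "tools/call" request. MCP messages follow
# JSON-RPC 2.0; the tool name and arguments here are invented examples.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "find_contracts",                       # hypothetical tool
        "arguments": {"counterparty": "Acme Hotels",    # hypothetical args
                      "status": "active"},
    },
}

wire = json.dumps(request)   # what the agent sends to the MCP server
decoded = json.loads(wire)   # what the server parses back out
```

Because every server speaks this same envelope, the agent only needs to learn the tool list each server advertises (via `tools/list`), not a bespoke API per product.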
01:32:15
Speaker
So MCP is essentially a way to give the agent the ability to call any tool, because there's a standard protocol around it. And you said you set up an MCP server. So it's not like a self-service thing where you sign up to a service for an MCP server; it's something you have to build for your tool.
01:32:39
Speaker
We as SpotDraft had to build it for SpotDraft. But now if any customer wants to use it, we just give them a URL. You put it there, and it starts talking to your SpotDraft instance. OK. So any of your customers who are using Claude can connect Claude to SpotDraft, because you have built an MCP server for SpotDraft. And I guess this is the new API. There was a time when companies didn't have APIs, and then everybody wanted an API, and a company like Postman became a unicorn on the back of that wave of everyone building APIs and API-first products. So do you think the future is MCP-first products?
01:33:21
Speaker
See, it is too early to say for sure, but we are seeing that if you have a well-designed and well-built MCP server,
01:33:32
Speaker
then a lot changes. If we look back at why APIs were even important, it's that they allow inter-service communication that otherwise someone is doing by copy-pasting, essentially. Right? That is the simplest way to think about why APIs are important and why they became so powerful.
01:33:52
Speaker
Now, MCP is essentially an API, but it also puts the onus on the person developing the MCP server to give enough detail about how it should be used.
01:34:03
Speaker
Right? The same way API documentation meant people could use the API correctly, the MCP description means your agent can use it correctly. So what that allows is, you could maybe get away with not building a lot of very, very nuanced features, because you can say, okay, you can just go ask an MCP-enabled agent and it'll give you the answer. So now you don't need to go and build each and every long-tail functionality. It is definitely going to change how things are built. And when you look at examples and demos from people like Microsoft,
01:34:38
Speaker
they are investing in this very heavily, to the extent that, I think, Satya Nadella in fact said: using Copilot and the MCP server on Microsoft Dynamics, I get insights into what the sales team is doing.
01:34:53
Speaker
Earlier, I needed one person to set up dashboards for me. Now that dashboard is getting generated. So that is where you ask, do I even need dashboard support, because MCP can generate that dashboard.
01:35:04
Speaker
Right. And it makes it much easier to keep that initial customer set happy with the long tail of asks, because you don't need to write code. You don't need to do anything other than describe your problem in English.
01:35:18
Speaker
And if you see enough people doing the same thing, then yes, I will build a dashboard, because clearly it is important. But in the initial days, without this I couldn't solve that problem at all, and now I can. Okay. Fascinating.
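[Editor's note] The tool discovery and dispatch described above can be sketched in a few lines. This is an illustrative toy only, not the real MCP SDK and not SpotDraft's actual API: it just models the shape of the protocol's `tools/list` and `tools/call` requests, and the `find_contracts` tool and its fields are hypothetical stand-ins for the tools mentioned in the conversation.

```python
# Toy model of an MCP server's tool registry and JSON-RPC-style dispatch.
# Real servers use the official MCP SDKs and a stdio/HTTP transport.

TOOLS = {}

def tool(name, description, schema):
    """Register a function as a callable tool, with a JSON schema
    describing its arguments so the agent knows how to invoke it."""
    def wrap(fn):
        TOOLS[name] = {"description": description, "inputSchema": schema, "fn": fn}
        return fn
    return wrap

@tool("find_contracts", "Search contracts by counterparty name",
      {"type": "object", "properties": {"query": {"type": "string"}}})
def find_contracts(query):
    # A real server would query the product's backend; here we fake a hit.
    return [{"id": "c-1", "title": f"MSA with {query}"}]

def handle(request):
    """Dispatch one request the way an MCP server would."""
    if request["method"] == "tools/list":
        # Discovery: the agent learns what tools exist and how to call them.
        return [{"name": n, "description": t["description"],
                 "inputSchema": t["inputSchema"]} for n, t in TOOLS.items()]
    if request["method"] == "tools/call":
        # Invocation: the agent picks a tool and supplies arguments.
        t = TOOLS[request["params"]["name"]]
        return t["fn"](**request["params"]["arguments"])

# The agent first discovers the tools, then calls one:
listing = handle({"method": "tools/list"})
result = handle({"method": "tools/call",
                 "params": {"name": "find_contracts",
                            "arguments": {"query": "Acme"}}})
print(listing[0]["name"])   # find_contracts
print(result[0]["title"])   # MSA with Acme
```

Because discovery and invocation follow one standard shape, the same agent can chain tools across servers (for example, list contracts here, then call a Drive server's save tool) without any bespoke integration code.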
01:35:30
Speaker
What is your ARR today, and how much of it is from the US? Our ARR is double-digit millions. Around 55 to 60% I think is US, 25 to 30% is rest of world, and the remaining is India.
01:35:50
Speaker
Okay, so you're still servicing the India market. India is actually a big market for us, because we were first here. We have now, in a lot of ways, become the benchmark.
01:36:02
Speaker
So people will compare against SpotDraft and say, you know, this other tool is much cheaper, why should I go with SpotDraft? It has become the conversation. Thankfully, that means we get a lot of pull instead of having to push; customers come to us saying, okay, you have to be part of the RFP process for a CLM in India. So yeah, India is still a pretty large market for us.
01:36:26
Speaker
And when you say double digits, what is that? Is that like teens, or 20 plus, 30 plus? It's on the teens side, more on the teen side than on the 90 side.
01:36:41
Speaker
You know, about building an AI-first organization: do you have any advice for non-tech people on how they can become AI-fluent?
01:36:54
Speaker
Yeah, I think that's a good question. I think the bigger question for me is: if a non-tech person is trying to do any sort of activity online, are they going to one of these, ChatGPT or Gemini, and trying it?
01:37:14
Speaker
Because if you try, you might realize, oh wow, I didn't even know this was possible, right? Like, there's a new image-editing model that Google released just a few days back.
01:37:25
Speaker
You almost don't need Photoshop anymore. It is that level of good. And I'm not exaggerating, because we tried it. In fact, I had to publish an article and we needed some marketing images.
01:37:38
Speaker
I just took some of the old images and drew something by hand, and the marketing team was not able to tell it was an AI-generated image. Right? So it's a question of experimenting with the parts that today you are doing manually, or are using a technical person for, and seeing if the result is good enough.
01:37:59
Speaker
To the extent that our HR team wanted to run a quiz as one of the monthly engagement activities. Earlier, they would have found an engineer willing to help them, and maybe taken one day of that person's time.
01:38:15
Speaker
This time that entire quiz was built by them using one of these tools, Replit or Lovable, and they ran it, and everyone was like, oh, this has such nice animations. And they said, yes, I had to prompt it five times for the animation to work.
01:38:30
Speaker
But if you don't try, then it's not going to happen. So I think the biggest thing is at least knowing what the possibilities are, so that when an ask comes up that you would have gone to a technical person for, you know, okay, I can go to a Replit, I can go to a Figma, I can go to a Canva and get that out. And I've seen this happen in our org, where I was showing them the image editing and the design team was like, oh, I didn't know this exists; now I'm going to use this every day.
01:39:01
Speaker
Right. So knowing is, I think, most of the battle right now. Yeah, that's so true. OK, you've built for multiple markets. The DrinkLink product was a consumer product, for maybe a youth kind of market. Then you had the dev shop, which was a B2B service. And even SpotDraft was first for freelancers. So how should someone who's starting off for the first time think about which market to build for? What problem to select and solve?
01:39:39
Speaker
See, I might have a contrarian view there, but the way I personally look at it is: what is your unfair advantage, and how are you tapping into it?
01:39:51
Speaker
So when we talk about DrinkLink, my unfair advantage was that my co-founder was a promoter who already knew 80% of the clubs in New York City. So we knew exactly what their problem statement was, and who was going to be the one making the call on, are we going to let five stags in.
01:40:09
Speaker
So we were really able to build an understanding of that side. And I had seen the other side of the problem, which is: people don't get in, people have plans, and people are willing to pay.
01:40:20
Speaker
So we were able to merge those two. Similarly, in the case of SpotDraft, I had dealt with contracts while working with my dad and working on CoderBuy, where I had seen that I needed someone to sit and review these. I would use free help from friends, but also had to pay a few times when we had a couple of larger contracts to review.
01:40:43
Speaker
So I was aware of that side. And then, of course, Shashank, having worked on Wall Street and being a lawyer himself, had a much, much deeper insight into what exactly in-house counsel does, what outside counsel does, and what some of the challenges are.
01:40:57
Speaker
So it is that combination of expertise in the problem space and being able to understand it from the technical side, and really bringing the two together, where I see success. Because if you don't have an unfair advantage
01:41:16
Speaker
on any of these sides, then you are working from a point of disadvantage, because someone else who has done it before, or who knows that world better, will come up and leapfrog you. And of course, distribution is the last one: if you have any distribution advantage, that is always going to trump everything else.
01:41:34
Speaker
And Slack versus Microsoft Teams is a classic example of this. We saw this play out when we launched Google Classroom, where we got, I think, 50K or 500K signups on the first day.
01:41:47
Speaker
And that was only because we were a Google product, right? Any startup trying to do that would be happy with 50 signups on the first day. So

Leveraging Market Signals for Success

01:41:55
Speaker
those are, I would say, the three key factors for me.
01:41:59
Speaker
And with AI, actually, I feel people operating in the niches of niches are going to be more successful than people coming from outside and trying to solve the problem.
01:42:14
Speaker
Because you know the problem statement and the problem area so much better. And with AI, solving it is becoming so much easier that you can just go in and test that market, test that theory of, okay, is this problem even worth solving, without spending lakhs and lakhs, or millions and millions, building an MVP or prototype.
01:42:35
Speaker
You know, there is this common advice given to founders: persistence, grit, resilience. If you had bought that advice when you were serving freelancers, that we should be resilient and make this market work out, you wouldn't be here today.
01:42:54
Speaker
How do you balance that need for grit with the need to exit? And you've had like three exits, right? You walked away from both those opportunities and the freelance opportunity. So what signals came to you that let you walk away? Yeah, I would say, see, resilience and grit are not about sticking to
01:43:24
Speaker
a problem statement. You have to be willing and able to pivot if you start hearing signals from the market. And that's essentially how we moved away from that freelancer world: we realized that no one is going to pay. Some Indian freelancers will pay anywhere from a rupee to maybe a thousand rupees per contract.
01:43:45
Speaker
But it's not a recurring-revenue business. It's a very, very high-churn business. So that is when we started talking to enterprises, and we realized that these guys are willing to pay what we would have made from 5,000 or 50,000 freelancers. One person will pay.
01:44:01
Speaker
So the resilience-and-grit part, for me, is being able and willing to stick at it and keep at it, versus saying this is the only problem I'll solve. Because we went from freelancers to contract review to CLM, then back to contract review.
01:44:20
Speaker
Right? And it's all about making sure that you're listening to your customers. If they keep saying, over and over again, things that you are not solving for, then pivot, but keep at it. Right?
01:44:36
Speaker
And we've done that two times. In fact, the company was set up in Gurgaon, and we moved to Bangalore; we got everyone to move to Bangalore with us.
01:44:48
Speaker
And they did that because they believed that what we are doing now actually makes sense and is a problem worth solving, and that comes from talking to customers. So the resilience side, for me, is having the conviction to keep going, even if it means changing what you're working on.
01:45:06
Speaker
But why did you move to Bangalore? We were having such a hard time hiring in Gurgaon. Everyone good we would talk to, out of college or even with a few years of experience,
01:45:19
Speaker
they would say, you know, I actually want to go to Bangalore. And we were like, okay, we have to go to Bangalore, otherwise we will never be able to hire. The ecosystem is just here. So if you're not in the Valley, at least be in Bangalore.
01:45:32
Speaker
Got it. Okay. Awesome. Thank you so much for your time. It was a real pleasure, Madhav. Thanks, Akshay. It was a great conversation. I loved chatting, and I'll look forward to seeing the podcast. And if there are any questions, I'd love to answer them as a follow-on as well.