
A deep dive into Enterprise AI | Anand Mahurkar @ Findability Sciences

Founder Thesis
291 Plays · 3 months ago

Anand is an IT veteran who started his career three decades back and in his last role was heading an IT services company. Noticing the pain his customers were facing in searching and interpreting data, he recognized the need for a fundamentally new approach to insight generation, which led to the creation of Findability Sciences. In this episode, he talks about how AI is impacting large enterprises and the value it can bring to businesses.

Get notified about the latest releases and bonus content by subscribing to our newsletter at www.founderthesis.com

Read more about Findability Sciences:

1. AI is key for business growth as it enables data-driven decisions: Findability Sciences CEO Anand Mahurkar

2. Exclusive Interview with Anand Mahurkar, Founder and CEO, Findability Sciences

3. Fortune Magazine recognizes Findability Sciences as one of America's Most Innovative Companies

4. India's AI roadmap in 2023: From CAGR 20% growth to creating endless job opportunities

Transcript

Introduction of Anand Mahurkar and Findability Sciences

00:00:00
Speaker
Hi, this is Anand Mahurkar, founder and CEO of Findability Sciences, an enterprise AI company headquartered in Boston.

Why is AI Adoption by Businesses Essential?

00:00:20
Speaker
AI is the new buzzword, and investors are pumping hundreds of billions of dollars into AI startups. But how will these investors make a return on their investment? Most technological revolutions need adoption by businesses to truly become mainstream and return money to investors. Think of cloud computing: it's as big as it is today because businesses adopted it. And the lack of business adoption is probably why cryptocurrency is still not a mainstream product.
00:00:46
Speaker
In this episode, your host Akshay Dutt speaks to Anand Mahurkar, the founder of Findability Sciences, about how AI is impacting large enterprises and the value it can bring to businesses.

Anand's Background and Founding of Findability Sciences

00:00:57
Speaker
Anand is a veteran of the IT space who started his career three decades ago and in his last role was heading Datamatics, an IT services company. When he heard the pain his customers were facing in finding and making sense of data,
00:01:10
Speaker
he realized the need for a fundamentally different approach to insight generation, and this led to the birth of Findability Sciences. I absolutely loved this conversation because it went behind the hype of AI and helped me understand the enterprise AI space and the tectonic shifts happening there. Stay tuned, and subscribe to the Founder Thesis podcast for more such deep-dive conversations.
00:01:41
Speaker
So Anand, I'd love to learn about your origin story. What were those dots that, looking back, you can connect that made you become an entrepreneur and prepared you for this company? So Akshay, it's a very funny story in terms of how I became an entrepreneur, and maybe your listeners will also enjoy it. Growing up as a child, my father was a professor of accounting,
00:02:10
Speaker
and he was also the author of a lot of cost accounting and accounting books. In the Marathwada region of Maharashtra in India, my father's books were used very regularly by all the students.

Family Influences and Early Career Choices

00:02:25
Speaker
Now, he used to work and constantly write books and set the examination papers, et cetera, and we used to look over his shoulders. Very interestingly, when my father used to give examples in his books for the balance sheet, or profit and loss, what is profit, what is loss, he used to use my name and my brother Deepak's name for the companies.
00:02:50
Speaker
So Anand Private Limited is a company which makes shoes, and it made a profit of this much. Since then, and I have reflected on this over the last decade, there was always a dream that Anand owns a company, or Anand is a business person. And my father, as an accounting professor, always wanted one of his sons to try business because, interestingly, going back probably a million years, nobody in my family had ever done any entrepreneurship.
00:03:21
Speaker
They were either farmers or priests in the temples. So that is where the seed was sown. And then, after engineering, I became a mechanical engineer, and there was a dream that, oh, now I'm a mechanical engineer, instead of going into a job, can I try and do something else? So actually, I drove from Aurangabad city in Maharashtra, where I did my graduation in engineering, on a scooter to Pune, where the then Kinetic Honda headquarters was.
00:03:51
Speaker
And my dream at that time was to set up a garage for Kinetic Honda two-wheelers, because those were the most modern two-wheelers in the 1990s. And I thought, in Aurangabad there is nobody who is serving the Kinetic Honda population, so I can set up a garage. I'm a fresh mechanical engineer, I can do this, that, et cetera.
00:04:11
Speaker
I went there, I knocked on the gates of Kinetic Honda, and the secretary said, oh, why are you here? I said, I want a service station in Aurangabad, can I meet a service manager? After a lot of hassle, he called one guy, who came and said, oh, who are you? What's your background? He said, the minimum requirement is that you need one gala, one place, to start a garage, and five lakh rupees in your bank account. And I said, I have neither. And he said, thank you very much. And I just left the gate.

From Engineering to IT: Anand's Career Journey

00:04:38
Speaker
And then, Akshay, as they say, overnight success is 20 years in the making. I went the entire route of being an engineer, then senior engineer, manager, general manager, vice president. So I actually started with Bharat Forge in Pune,
00:05:00
Speaker
which is a manufacturing company. Bharat Forge at that time was the largest forging manufacturer for all the American automotive companies, like General Motors and Ford. And what is forging?
00:05:14
Speaker
So forging is a process where you take a metal. If I can use a Hindi word, it's lohar work, it's blacksmith work. You take a hot billet and then hammer it with force, which makes the component. If you look beneath any automobile, the crankshaft, the axle beams, all are forged components, because they bring the strength. The only way of manufacturing those load-bearing components in a vehicle or any machinery is as forged components. And Bharat Forge in Pune, which is owned by the Kalyanis, are now the largest forgers in the world. They used to supply this to Ford, and in fact, if you come here to America, to Michigan, most of the forging companies in Michigan have now been acquired by the Kalyanis. So you will see Bharat Forge or Kalyani signs all over.
00:06:09
Speaker
But anyway, I started there as a project engineer, as a graduate engineer trainee, and then kept on changing jobs. After that, I went back to Aurangabad, to Videocon. I was a plant engineer at the Videocon television and washing machine plant, and I worked there for a number of years. Then Videocon was starting an office automation division, so they moved me to office automation, which was my backdoor entry into computers. And then I got a job in real software, because this was Y2K; everybody was looking into software and trying to fix the Y2K bug. So I got into Kale Consultants in Pune, which at that time was India's banking software maker and supplier. I was regional manager in Pune for Kale Consultants.
00:06:53
Speaker
And where did you learn how to do coding and all that? Because I'm sure your course would not have been software work. It's a very interesting question, Akshay. I skipped a part of the story: at that time, Bharat Forge was going through a concept called forge modernization. As I explained to you, traditional forging is literally done in a furnace. You put in the long metal rods and heat them up to 1,200 degrees centigrade. Then you take out that hot billet, and it is all manual. There are overhanging chains,
00:07:31
Speaker
and there is a tong, so you go inside the furnace, pick up the hot billet with the tong, you can imagine that process, and put it into the forging press, where there is another operator pressing the pedal; it bangs and creates the shape. Then you go to the trimmer, remove the rest of the material, and that is how the component is made. It is a very, very inhumane process: the workers have to work in high temperatures, with safety hazards, et cetera. So Bharat Forge, and the world, started modernizing these plants. They said, instead of a human, can there be a robot which goes inside, picks up the billet, brings it and puts it onto the hammer? Can the hammer come down automatically the moment it senses the temperature? Can another robot then pick it up from the hammer and take it to the trimmer, and so on?
00:08:19
Speaker
So, at that time they tied up with a company named Weingarten in Germany. Weingarten were one of the world's first forge modernization companies, and they used to supply these robots and entire assembly lines and so on. I was actually put onto that project, and there were German engineers in the 1990s working in Pune at the Bharat Forge factory. They used to work on German time, which was typically in the evening when the shift was over; that is when they used to implement all the robots. So I used to just hang around with them, and looking over their shoulders, I started learning coding.
00:08:59
Speaker
So again, I never trained myself in a school on software programming, but in factories. At Bharat Forge I got my initial seeds of software. Then I went to Videocon, and at Videocon also, I was a plant engineer, but I used to just hang around in the night shifts and the evenings, go into the computer lab, and play around with programming the computers.
00:09:20
Speaker
So I kept training myself, not only in the extended shifts but also at home, out of curiosity. And that led me to the job with Kale Consultants, managing the region where they had all the banks, like Bank of Maharashtra and all the cooperative banks in that region, which were automated by Kale Consultants. That's the work I was doing. So that's a long answer to your question of how I learned my coding.

The Birth of Findability Sciences and its Concept

00:09:43
Speaker
And which language was Kale using? What was it? It was C++ at that time. And it was on Informix as the database, and it was a black and white window. So it was not a GUI with buttons and clicks and so on.
00:10:01
Speaker
But that's when I started with C++, at Kale. Prior to that, in my engineering, and even the robotics at Bharat Forge, it was on Fortran IV. I don't know whether that language is even known to anybody today, but Fortran IV was a basic programming language on a microprocessor. That's what I started learning, from college to Bharat Forge, then graduated to Informix and C++, and then kept on learning the modern languages.
00:10:30
Speaker
Okay. So coming back to the entrepreneurship story. It took 20 years, but interestingly, it started from Datamatics, where I used to work as an employee; in my last job there I moved to the United States as a pre-sales engineer, and Datamatics used to do... Datamatics is, like, IT services? What business is it in?
00:10:54
Speaker
Yes, Datamatics is an IT services and BPO company. In fact, Datamatics is one of the first companies in India after Tata Consulting Engineers. Datamatics was started as an offshoot of the Tatas, by Dr. Kanodia, who was the first CEO of Tata Consultancy Services.
00:11:12
Speaker
So I worked with Dr. Kanodia and the management team, and my focus at that time was mainly on data- and document-related services: business intelligence, data warehousing, data management, document management, workflows, imaging, OCR. I used to work on that technology, because Datamatics was primarily a BPO company, doing business process outsourcing.
00:11:36
Speaker
And when the US customer used to outsource the work, like HR forms or healthcare forms or insurance forms, the processing of those forms required all these technologies: imaging, OCR, document management, workflow, data warehousing. OCR, for our listeners, is optical character recognition: it's how you can scan a document and convert it from an image into text. Correct, correct. So I worked on all those technologies. And then, about
00:12:13
Speaker
twelve years ago, I was invited by one of our customers in Rhode Island. I had moved to Boston about 21 years ago, worked here for eight, nine years, grew Datamatics, and eventually became the president of the company. But the day that was a life changer for me, to answer your question in a very long format, was when I was invited by this bank, which was deploying a very large data warehousing solution.
00:12:40
Speaker
My team had partnered with IBM, and we were the vendor there deploying this data warehousing solution. And this data warehousing solution was in very, very bad shape in terms of what the customer was looking to get from it. And eventually, the
00:13:06
Speaker
president of the company called me, because I was the president on our side. They said that they were very unhappy with what we were doing, and they had spent by then about $7 million. It was really disheartening to see that the customer was not happy. So I apologized, and I showed them what we were doing: this is the report, this is what we did. And the meeting ended in a very unsatisfactory way. They said, you need to fix this. So I was driving back.
00:13:35
Speaker
And while driving back, I realized that the customer was not looking for a business intelligence solution; the customer was actually looking for the ability to find information,
00:13:48
Speaker
because the customer had a unique issue. They had all the savings accounts and checking accounts in a database, and they had mortgage accounts and loans, car loans, et cetera, in a document management repository. And when we were giving them the BI solution, we were giving it to them only on the structured data. So if they looked for Akshay or Anand,
00:14:10
Speaker
I had to go into the structured data separately and the unstructured data separately in order to generate the report, whereas the customer wanted a unified report: the moment I search for Anand, the checking account, the savings account, the mortgage account, everything, with unified access.
00:14:26
Speaker
And that was the aha moment about the ability to find information. I said, can there be a technology? And actually, I'm talking about pre-big-data, pre-Hadoop, all of that; nobody had come up with it. So I said, I'm going to leave the job. I gave six months' notice and said, I'm going to build a technology which actually improves the ability to find information.
00:14:50
Speaker
I got a couple of developers from the market; I myself was a hands-on developer. I developed an architecture, and I was getting excited that every company needs this; every organization wants the ability to find information. And actually, interestingly for your listeners, this "find" word struck me because there was a sudden pop-up in my mind saying: the pain of search versus the joy of finding. Because nobody tells you to go and search. You may say, hey Anand, go and find that document,
00:15:26
Speaker
or find that image, but nobody asks you to search, because search has a pain associated with it, whereas find has a joy associated with it. So I actually trademarked this line, "pain of search versus joy of finding," and started building this platform, and that is where my entrepreneurship journey started. And just one last point on that: what should the company name be?
00:15:48
Speaker
Findability is not an English word, so I coined the word findability, and it is a science. So Findability Sciences is the company name. And that's how my actual entrepreneurship journey began, almost 25 years after my father wrote in his books that Anand Private Limited has this balance sheet and these products. Interesting. Your LinkedIn shows a two-year stint with Reliance. It was a very short stint after I left Datamatics and before I started the company, because, again, I'm not sure how many of your listeners would remember, or you know, Akshay, but Mr. Anil Ambani at that time had launched a big campaign to set up an IT services company.
00:16:35
Speaker
He had hired about 24 consultants globally to assist him in setting up Reliance IT services. He wanted to do business like Infosys and Wipro and so on. And I was one of them. So it was more of a consulting gig for one year, in the transition between what I was doing and my entrepreneurship. And you worked directly with Anil Ambani?
00:16:59
Speaker
With his office. I met him a few times, but he had his president who was working on this whole BPO and IT setup. Eventually they gave up on that model, but for that one year they wanted to have all these IT and BPO services people worldwide. There was big news in the market about how he picked up 24 people and how he was setting these things up. But that's where I was associated, very briefly. Did that one year of interaction give you any hints of what was to come for Anil Ambani?
00:17:29
Speaker
I can't really say that, because for one, I was in the United States, I was not in India. During that one-year gig, I visited only once
00:17:40
Speaker
and spent about eight days on the Reliance campus. But otherwise I was remote. I was dealing only with those few people who were appointed and had no exposure to any other business. My job was basically helping them, consulting on what services they should launch, et cetera. So I don't think I had any clue about it, or could have guessed anything sitting at that distance. Okay. I want to understand:
00:18:08
Speaker
What was the need for creating software to find things? I mean, it seems like finding is a problem that Google and the search engines would have already solved by that time. So what was it that was unsolved there? What were you solving?
00:18:36
Speaker
So actually, it is not past tense; it's continuous present tense. It is not solved even today. And your example of Google is appropriate, but just imagine: in your own world, you can go to Google and search for anything.
00:18:54
Speaker
You have desktop search and so on. But imagine that in your company you have 10 databases internally, within the firewall. They are not connected to Google. You still can't look for everything; you will have to go to different applications to find the information. So this findability problem exists within the firewalls of companies.
00:19:15
Speaker
Even just before this interview, I was on a customer call. It's a $7 billion company. I'll repeat: a $7 billion company. They have 20 different data sources, structured and unstructured, not unified, not connected, and they can't find information, in 2024. So what I was really trying to solve was the ability to find information within the firewalls of the organization.
00:19:44
Speaker
And what we are talking about here is not individuals or the public domain; we are talking about specific enterprise problems. These are businesses where enterprise applications are running. Again, you know that individuals have files on their desktops; they don't share them with each other. If that person has left or is on vacation, you have access problems. So there are many findability issues within organizations.
00:20:10
Speaker
Okay, okay, I understood. So, like the example you gave of the savings account data in a separate database and the mortgage data in a separate database, and probably these two would not always be linked through a clean customer ID.
00:20:26
Speaker
Maybe the customer ID for the savings account would be different, and the customer ID for the mortgage account would be different. So you have to find a way to figure out that this is the same person whose records exist in both places. Those were the problems you were trying to solve.
00:20:41
Speaker
Yeah. And that is a very simple example and a simple articulation. Anybody would say, if I have a customer ID, then I can find out in two minutes. But that was just my simple illustration. If you go into the complex challenges, like the company I was just talking about, it is a manufacturing company.
00:20:57
Speaker
Their largest data comes from their IoT devices, the Internet of Things devices fitted on the machines. Now just imagine: the machine is spitting out data literally every second, every minute, every hour, every day, and that data is stored.
00:21:14
Speaker
Now I also have maintenance records, and I have an ERP system in which I have spare parts and maintenance data, et cetera. How do I unify this data? How do I find the correlation between this sensor emitting this data and this component I need? That is a much larger problem than just having a common ID so you can match the two databases. But yes, that was the simple articulation: two different data sources, can you match them with one ID? Down the line, we developed 200-plus rules for the ability to find information, covering what all you need to do architecturally, programmatically, technically. So it's actually a much more complex problem than that simple illustration. Okay, okay. You spoke of structured data and unstructured data in different databases; that's part of the problem. What is the difference between structured data and unstructured data?
00:22:09
Speaker
So let's first start with what the difference is between structured data and unstructured data. The simple way I define it: structured data is data which sits in rows and columns. It has a defined structure: height, weight, chest, address, telephone number, name, first name, last name, et cetera. So we have columns, and then we have Akshay's data and Anand's data and Sanjay's data and Tom's data, et cetera. There's a structure to it. And traditionally this structured data sits in relational database management systems like Microsoft SQL Server or MySQL or Oracle, et cetera.
00:22:42
Speaker
So rows and columns, it has structure, and mostly it is numbers or short text, like first name Anand, last name Mahurkar. That is structured data. Now, unstructured data: you applied for a job, I apply for a job, I have a resume, you have a resume. The piece of content is a resume.
00:23:06
Speaker
But you may start with your address, I may start with my experience. You may start with qualifications, I may start with my last title. So the structure within the content piece is different for different people. Even with universities: I graduated from university A, you graduated from university B. We may have done the same course, but your graduation certificate may be horizontal and mine is vertical.
00:23:34
Speaker
The content may be different. So unstructured content is mostly textual, and it does not have a pre-defined structure. So research reports,
00:23:47
Speaker
analytics reports, news, video content, audio content. For example, this podcast is unstructured content, because in your last podcast you may have started with something totally different, in this podcast you are starting with something different, I am answering in a different way, and your other guests will answer in a different way. So this is a podcast, it is about entrepreneurship, but it is unstructured. That's the difference between structured data and unstructured data. And actually, just for your listeners: with the explosion of handheld devices, the unstructured data is now much, much larger than structured data, within enterprises and overall in the world. In fact, there was a very funny statistic published a couple of years ago. Imagine that humans started writing on stone slates, right? We have stone carvings where people used to take a chisel and write on a stone, and that was their slate.
00:24:45
Speaker
That must have happened probably a million years ago, or half a million years ago, or 10,000 years ago, whatever that timeline is. So if you decide that you want to count all the unstructured content created in the world from that time, from the first stone engraving to today, you will be surprised that 99% of the data was created in the last one year.
00:25:09
Speaker
Oh, wow. That's the volume we are creating now, every year. Because imagine the images: we are taking pictures on handheld devices, the text messaging we are doing with our friends, family, business colleagues, et cetera; billions and billions of emails are flying literally every minute, which was not there 10 years ago, not there 30 years ago, not there 40 years ago. That is all unstructured content.
00:25:35
Speaker
So when I speak about structured content and unstructured content: the unstructured content, as I said, is now about 80% of the data in businesses, and this statistic really stuns people. The other 20% of the data is locked in the structured databases. Together, that makes up the enterprise data, and therefore everybody now has to focus on both.
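To make the structured-versus-unstructured distinction concrete, here is a minimal sketch, with entirely made-up records, of how the same customer can show up as both clean rows and free text, and why a naive unified lookup has to treat the two differently:

```python
# Illustrative only: the same customer appears as structured rows and as free text.
structured_accounts = [
    {"customer": "Anand", "account_type": "checking", "balance": 5200.00},
    {"customer": "Akshay", "account_type": "savings", "balance": 1800.50},
]
unstructured_docs = [
    "Mortgage agreement: Anand holds a 30-year fixed loan on 12 Main St.",
    "Support email from Akshay asking about a late fee on his savings account.",
]

def find_customer(name):
    """Naive unified lookup: exact match on rows, keyword scan over text."""
    rows = [r for r in structured_accounts if r["customer"] == name]
    docs = [d for d in unstructured_docs if name.lower() in d.lower()]
    return {"structured": rows, "unstructured": docs}

print(find_customer("Anand"))
```

In a real enterprise the unstructured side needs OCR, entity extraction and indexing rather than a substring scan, which is exactly why the 80% mentioned above is the hard part.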
00:25:56
Speaker
So we talked about the banking and mortgage accounts, and you said account ID. Now imagine I have a customer, and I have lots of emails, documents, contracts, Excel sheets, images, and videos about that customer, and I also have the customer's ERP data, where I have how many items I shipped to that customer, et cetera. So when I'm now looking for that customer's data, where should I go?
00:26:21
Speaker
So that findability problem still exists, but these two kinds of data have now become the fuel for enterprise AI, because a lot of information, history, wisdom, whatever you want to call it, is locked in there in the form of digital content. Okay. Okay. Let's talk about the journey of
00:26:41
Speaker
reaching where you are today. Of course, the AI revolution has made everyone realize the value of data, of being able to use that data for training AI and so on. But, you know, I think 2009 or '10 is when you started Findability, right? So what did you start with?
00:27:06
Speaker
So as I mentioned, I stepped out. I started with: how can I connect structured data to unstructured data, and internal data to external data? And I built the first product, called the Findability Platform. The Findability Platform was basically collecting the data, unifying the data, and giving unified access to customers. I got my first customer in six months, and since then, there has been no looking back.
00:27:34
Speaker
I built this company on my own by generating revenues, bootstrapped it, no investment, et cetera. But very interestingly... The Findability Platform would be like a cloud-native platform that you built? Yes, yes. The company was born in the cloud, operated in the cloud, grew in the cloud, and currently operates in the cloud. Those were the early days of cloud, but you could see that cloud was the way to go.
00:28:00
Speaker
Absolutely. I have never bought even one physical server in the last 11-plus years. So yes, it started in the cloud. And how would it work? Would it have some sort of API integration with the databases, through which it would fetch the information and then present it? It would present a front end, the front end would be Findability, and at the back end the API calls would go out to fetch the data and present it?
00:28:29
Speaker
Correct, you got the answer. But we developed about 200-plus connectors. These 200-plus connectors were into standard systems like ERPs and CRMs, et cetera, or even on the web: web crawlers connecting to standard databases like LexisNexis, Bloomberg, Reuters. So externally we connected with paid and unpaid sources. Internally, we developed standard APIs for the standard products, but there were always some custom products we would encounter, and then we would do the custom work.
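As a rough sketch of the connector pattern being described here (the class names and sources are hypothetical, not Findability Sciences' actual code), each connector adapts one system and everything lands in a central repository that a single front end can search:

```python
# Hypothetical connector pattern: one adapter per source, one central repository.
class Connector:
    def fetch(self):
        raise NotImplementedError

class ERPConnector(Connector):
    def fetch(self):
        # A real connector would call the ERP's API; here we return canned rows.
        return [{"source": "erp", "customer": "Anand", "items_shipped": 42}]

class NewsConnector(Connector):
    def fetch(self):
        # Stand-in for a paid external feed such as a news or market-data service.
        return [{"source": "news", "text": "Anand Pvt Ltd announces a new plant."}]

def build_repository(connectors):
    """Pull every source into one central, searchable repository."""
    repo = []
    for connector in connectors:
        repo.extend(connector.fetch())
    return repo

repo = build_repository([ERPConnector(), NewsConnector()])
print([r for r in repo if any("Anand" in str(v) for v in r.values())])
```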
00:29:02
Speaker
So connect to all of this, pull the data into a central repository, and then on top of it you give, as you said, the user interface, where your pain of search goes away and your joy of finding comes in. And that was a big need. I'm talking in the past tense, but that need is still there today. And since you asked the cloud question: that actually led to our change into an AI company, because I used to host the initial version of the Findability Platform on a technology named Rackspace. Rackspace is an American cloud company, from the early days. Like an AWS competitor? Correct, but at a very early time, when AWS had probably not even thought about cloud.
00:29:48
Speaker
Rackspace was one of the first in the country. But they were not really scaling up. We had lots of outages, we had issues with access, and it was a clunky interface. So I was desperate to move from Rackspace to something else. And in my earlier days, as I told you, at Datamatics I used to work with a lot of IBM people, because Datamatics was an IBM partner; we used to do data warehousing and document management solutions together.
00:30:15
Speaker
So one of their executives... but what was IBM's role in this partnership? IBM would provide the software and you would be the implementation partner?
00:30:27
Speaker
Correct. IBM had their own tools for data warehousing and business intelligence and document management and workflow. Datamatics was an IT services company, so they never had their own technology products; they took third-party products and did what all system integrators do. I was the partner manager for IBM, and therefore I knew some executives there. One of those executives is here in Massachusetts. So I called her up and said, by any chance, do you know any other cloud service? I'm really very unhappy with this Rackspace service. And she said, oh, I'm glad you came to us, because just this month IBM acquired a new cloud company, and IBM is launching IBM Cloud.
00:31:12
Speaker
And in fact, I'm going to be in charge of IBM Cloud; why don't you become a customer? And I said, okay. I know this person, the IBM brand is a good brand, why don't we try it out? So I decided to move from Rackspace to IBM. And when we were moving to IBM, again, cloud was a big thing at that time, and there were 10, 15 people on the call every day while we were shifting everything.
00:31:34
Speaker
So they were able to see what I was doing. And they said, wow. This is, I'm talking about, 2014, 2015. They said, this is actually a foundation for AI. We are now launching IBM Watson, and IBM Watson needs a data foundation connecting structured data and unstructured data. So do you want to become an IBM Watson partner? And actually, again, for you and your listeners: at that time there was a TV show named Jeopardy!. Do you know this Watson story?
00:32:04
Speaker
No. So Jeopardy!, you know that Jeopardy! show in America, it is a question-and-answer show. But there is something unique about that show: they never ask you a question. They tell you the answer, and as a contestant, you have to come up with the question.
00:32:28
Speaker
Now, why is this a tricky show, and why is it difficult? Because our brain, the human brain, from birth is always tuned to answering questions, not crafting them. So if I say, a famous podcaster named Akshay, something like that, then the contestant has to say, who is Akshay?
00:32:54
Speaker
Okay, so that's the way the question gets drafted. So the people who are really winners in this show are some of the greatest human brains, who have control of their minds, who have a lot of knowledge, who understand history, economics, geography, et cetera. So IBM, in 2014 or 2015, decided that they wanted Watson to appear on the Jeopardy! show.
00:33:21
Speaker
They trained IBM Watson on the entire Wikipedia. And the two contestants who had been hands-down winners for the last decade decided to compete with Watson, and it was broadcast live on television. I was actually watching that show that day with my daughters, because it was well advertised in America and we used to watch the Jeopardy! show. And Watson won hands down at the end.
00:33:49
Speaker
And then there was a lot of news that the cognitive era had begun, the computers are becoming smarter, et cetera. Coincidentally, that was the time I moved to IBM Cloud, and the folks at IBM said, can you become a partner of IBM?
00:34:04
Speaker
So in 2016, I took the Findability Platform onto IBM Cloud, became a Watson partner, and started integrating Watson APIs into the technology we had built. And we were getting a much better success rate than other partners, because other partners had no data infrastructure; they had to work on the data first, et cetera. IBM also appointed me at that time as an IBM Watson advisory partner. So I was on the advisory board, and I could learn a lot about natural language processing, how to process unstructured content, et cetera. And I decided that this was actually the future, so I should build an AI stack onto this Findability Platform. From 2017 onwards, we actually converted into an enterprise AI company.
00:34:47
Speaker
So Akshay, for your listeners and for you: we are not currently riding on the hype of the AI curve. We have been doing AI since 2017, so it's seven years. And I can tell you some other stories about how we used AI, well before ChatGPT was launched, in the same way that ChatGPT is used. But that's how I transitioned the company from a findability platform, solving the ability-to-find problem, to an enterprise AI company.
00:35:16
Speaker
Okay, I will ask you to share some of those stories, but before that: what was IBM Watson? For example, today everyone knows what ChatGPT is. It's a large language model, and it predicts the next word which should come in a sentence, and that prediction of words one after another leads to some sort of an answer which looks like intelligence. What was Watson?
00:35:44
Speaker
So Watson was more or less the same. It was never called a large language model, because Watson was driven more by a technology called natural language processing and natural language classification, because six, seven years ago,
00:36:01
Speaker
the technology had not reached today's level; as you must have heard and read, GPT needs processing power, et cetera, and the computing had not reached what it is today. So IBM Watson was actually a computer program which would learn from natural language,
00:36:20
Speaker
not the way an LLM does, but based on classification and understanding sentiments, keywords, words, et cetera. So it was more natural language processing than natural language interpretation and prediction. It was an NLP and NLC technology.
00:36:39
Speaker
Wouldn't it be fair to say that it was rules-based? From what I understand, the innovation of GPT as a technology is that it is not rules-based; you let the system learn on its own. Previous attempts at AI were rule-based, where you would code in thousands of lines of rules: if someone says "I want," then "want" has this meaning. Am I right?
00:37:07
Speaker
From that standpoint, we can call it rule-based, but I would not totally classify it as rule-based. There was some intelligence: once you taught it a sentence, it would understand that sentence and classify it, et cetera. So there is an element of intelligence. But yes, the generative technology has taken the game to the next level. And six, seven years ago, if you want to call it rule-based, yes; I would call it more fuzzy logic,
00:37:35
Speaker
not just purely rules-based. You know, for people who don't understand what's happening behind the scenes of these AI tools, can you talk a bit about the evolution? What is fuzzy logic? What is GPT? What does it mean? What was the innovation which made this possible? A little more of the fundamental technology, if you can share a bit.
00:38:00
Speaker
Yes. We'll come to GPT at a later stage; let's first start, as you say, with the word evolution. In the 1990s, when I talked about my Bharat Forge days, or my Videocon days, or my Kale Consultants days, we as software engineers were very excited that I am now coding a piece of software, and the software asks me: is this an animal? I'll say yes. Does it have four legs? Yes. Is it brown in color? Yes.
00:38:30
Speaker
Does it bark? Yes. And then it will tell me: it is a dog. And that was a great success, right? Then these applications were used in banking, et cetera, and then we built ERPs and CRMs. So mainly you're inputting the data, you're storing the data, you're processing it, and you're getting output. Then fast-forward, and we said: I have an image of a dog, I give it to the computer, and the computer tells me it's a dog.
00:38:59
Speaker
So I'm giving an image. I'm not feeding in four legs, brown, barks, two ears, et cetera; I'm just giving an image. This image part is where the AI started coming in, and I would probably classify Watson into that category. So when I'm talking about fuzzy logic, or it having some logic: yes, it has been trained on millions of dog images, so now it understands a dog, and it gave me the output as dog.
00:39:24
Speaker
Now, fast forward. This technology was called neural networks, right? Neural networks are still there; it is partially that, yes, but neural networks are used in deep learning, et cetera; we'll come to that in a minute. I would call this supervised learning technology, where you have trained the model on something. I'm constantly telling the machine: this is a dog, this is a dog, this is a dog. And I have given it a million examples, so now it will tell me that this is a dog.
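A minimal sketch of that supervised-learning idea, assuming scikit-learn is available; the two made-up numeric features stand in for the pixels of an image, and the labels are the "this is a dog, this is a dog" teaching described above:

```python
# Toy supervised learning: give labeled examples instead of hand-coded rules
# like "four legs and barks", and let the model learn the mapping itself.
from sklearn.linear_model import LogisticRegression

# Made-up features, e.g. [ear_length_cm, snout_length_cm], standing in for an image.
X = [[4.0, 6.0], [3.5, 5.5], [1.0, 1.2], [0.8, 1.0]]
y = ["dog", "dog", "not dog", "not dog"]

model = LogisticRegression().fit(X, y)
print(model.predict([[3.8, 5.8]]))  # -> ['dog'], inferred from the labeled examples
```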
00:39:54
Speaker
Okay. So that's the way it learns. So it is more supervised learning. Yes, it can have a deep learning mechanism in it, because I have trained it with multiple algorithms, et cetera. But fast forward: what happened is that in 2017, the Google Brain engineers published a white paper called "Attention Is All You Need,"
00:40:19
Speaker
and the listeners of this podcast, if they have not heard about this or seen it, please Google "Attention Is All You Need"; that's the title of the white paper. Actually, in the Indian context, our parents told us exactly the same thing in school, right? Pay attention. So these Google Brain engineers came up with an algorithm saying that if you pay attention to a word,
00:40:47
Speaker
and then give a weightage to that word based on your historical content, it can actually predict the next word. So let me give an example, a very interesting one, and in the Indian context it will make sense. We all know the snake, the snake as an animal. Now, with the snake as an animal, each one of us has different learnings. If I'm from the yogic systems, I know that the snake is a highly conscious animal; it should be prayed to,
00:41:17
Speaker
worshipped, et cetera. If I'm coming from a religious teaching, I have a different connotation. If I'm coming from a purely scientific view: this is a very poisonous animal, it can bite you, and you can get killed. So everybody has their own learnings. Keep this in mind; the first thing is learning. Now imagine 20 people are sitting in a room and listening to this podcast, and I suddenly shout, snake! Consider that a prompt.
00:41:47
Speaker
Now your brain pays attention to that prompt called snake. And what does it start doing? It goes into your memory and brings up your learnings. You learned that it's poisonous, it can bite, and it can kill you, so you are going to start running out of that room,
00:42:07
Speaker
whereas I have been trained that it's a cuddly animal and you can actually pet a snake, so I'm going to start looking for that animal. So I generated one reaction, you generated another reaction. This happened because of the training we got, either in textbooks, in our grooming, in our genetics, et cetera.
00:42:30
Speaker
So now take a simple example with GPT. What happened is that when, in 2017, these data scientists published this white paper called "Attention Is All You Need," they also gave a diagram in that white paper, and all listeners, and you, can access that document online. There is a diagram of an encoder and a decoder,
00:42:54
Speaker
and what they said is that you can train this encoder, and then when you give a prompt, it will decode and give you the output. And it was just published, and it was left there. Google engineers were working on it,
00:43:06
Speaker
but Elon Musk and all the OpenAI founders got together. What is the full form of GPT? Was that term also coined in this paper? I'm coming to that. No, it was not there, so I'll come to that in a second. This encoder and decoder in the white paper is called a transformer.
00:43:25
Speaker
They said that the transformer has an encoder and a decoder: if you put a lot of content into it, it will generate a lot of content, and the white paper ends there. They have given formulas for how, by paying attention to a word, you can create the weightage of the word, et cetera. So fast forward: Elon Musk and team took this white paper and decided to experiment with this encoder and decoder in the development of AI.
00:43:55
Speaker
So they took this, trained it with some sample content, and it started generating interesting output. It was published in the public domain, saying it was generating output, and everybody ignored it; it's just a regular program. So this was the transformer which is there in the white paper. What they did is they pre-trained it with content, and they found it is generative in nature.
00:44:25
Speaker
So GPT stands for Generative Pre-trained Transformer. The transformer comes from the white paper; pre-trained means you have given it pre-training; and generative because it started generating new content. So OpenAI decided to call it GPT-1, in 2017.
00:44:50
Speaker
Then they decided, oh, this technology has promise. They got some more investment, they created more infrastructure, and silently, behind the scenes, nobody cared because, yeah, it's OpenAI, they are there to create intelligence. They kept on training on content, and they created GPT-1, GPT-2, GPT-3, GPT-3.5. This they did for about five years,
00:45:14
Speaker
and relentlessly they were training, or pre-training, the transformer which is there in the public domain. You encode, you transform, and then, again, I'm trying to simplify this topic, there are more complexities to it, but they pre-trained this transformer which is there in the attention model (now it is called the attention model), and then it started generating the output. So now, coming to my evolution examples: initially I talked about the four-legged, brown, barking dog. Then I put in the image and it told me it is a dog. And now, with GPT, if I have trained it on content and I put a dog picture into the computer, it gives me the output of a similar dog.
00:45:59
Speaker
It is not the same dog. It is not telling me "dog" in text; it is giving me an image. So if I put in a Bichon Frise picture, which is a dog breed, it may give me a bulldog picture. Now I know it is a dog, but it has generated the content on its own.
00:46:16
Speaker
And hence the topics of hallucination, et cetera; that can be another podcast on its own. But whatever information it had, it took: it is a dog, it created a new dog image and gave it to you; therefore it is called generative technology. So now, answering your question and summarizing the evolution: in regular computer programming, we used to give input and it used to give us the output.
00:46:39
Speaker
In machine learning, we started training the software and it started giving us the output. In the new era, you are training the computer and it is generating the output. Not giving the output; generating the output. And that's where the discussion started: wow, it is writing like a human.
00:47:00
Speaker
But again, it goes back to my snake example: if I have a transformer and I trained it on "poisonous animal," it will generate the output of a poisonous animal. If I trained it on "holy, religious animal," it will give me that kind of output. So, cutting the subject short: this transformer is now in the public domain, so everybody can create their own transformer. And therefore, you must have seen, the moment OpenAI came out, Google said, oh, I have my own transformer; IBM said, I have my own transformer; Meta said, I have my own transformer. And now there are thousands and thousands of large language models out there. A large language model is nothing but a transformer plus content.
00:47:43
Speaker
So I'm training it on a large amount of content and generating a model to which I can ask questions and it gives answers. Okay. Interesting. Can you just explain to me that encoder-decoder part one more time? What does that do?
00:47:57
Speaker
So in the white paper, they created what is called a transformer. They said you can create a software component named a transformer; the Google engineers named it the transformer. It has two elements, or two wings: the encoder and the decoder. Any time you give a prompt, it goes into its learning and decodes the learnings to generate the new content.
00:48:25
Speaker
That's the simplistic explanation. The encoder and decoder are subcomponents of a transformer. So in the Generative Pre-trained Transformer, which is GPT, the T has an encoder and a decoder. What does it encode? If I'm giving a prompt, does it convert that prompt into some sort of series of numbers or something like that? Yeah. It actually creates weightages. There are five principles in an encoder and decoder. How do you weigh a word? Let me give an example. I'm typing "cat sat," and now the next word needs to be completed. But in the content I pre-trained on:
00:49:09
Speaker
somewhere, there was one novel in which it said "cat sat on a zebra." Then somewhere else it said "cat sat on a mat." Then in a third place it said "a cat sat on a wall." But this "cat sat on a wall" got repeated 20 times in my content. So it is now going to give weightage to zebra, to wall, to mat. And then it will say: "wall" appeared many times, I'm going to give the weightage to it, and I'm going to complete the sentence "cat sat on a wall."
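The "cat sat on a wall" illustration is essentially count-based next-word weighting. Here is a toy sketch of that idea; a real transformer learns attention weights over a huge corpus rather than counting, so treat this only as an illustration of the weighting intuition:

```python
# Toy next-word weighting from counts, echoing the "cat sat on a ..." example.
from collections import Counter

corpus = (
    "cat sat on a wall " * 20      # "wall" follows "on a" many times
    + "cat sat on a mat "          # rarer continuations get lower weight
    + "cat sat on a zebra "
).split()

def next_word_weights(context, tokens):
    """Count which word follows the given two-word context in the corpus."""
    counts = Counter()
    for i in range(len(tokens) - 2):
        if (tokens[i], tokens[i + 1]) == context:
            counts[tokens[i + 2]] += 1
    total = sum(counts.values())
    return {word: count / total for word, count in counts.items()}

print(next_word_weights(("on", "a"), corpus))
# -> roughly {'wall': 0.91, 'mat': 0.045, 'zebra': 0.045}; the heaviest word wins.
```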
00:49:39
Speaker
Now, related to that: yes, it creates numbers, it creates weightages, and then based on the weight it completes the sentence, or generates the sentence. I'm just giving you an example with a word; it can be a sentence, a paragraph, pages, books, pictures, images, videos. That's how the technology progressed, but the initial solving was for a word. So it weighs everything. Now imagine I don't have any of that in my content: neither mat, nor wall, nor zebra. It has to generate something. So when it sees "cat sat on," it might say "Anand," because it found Anand a lot in the content. That's when we call it out: oh, this is giving a wrong answer, it's a hallucination, because it has to complete the sentence. That is why we don't trust generative AI technology for applications that need to be factual: because it is generative in nature and not factual.
00:50:37
Speaker
So the encoder and decoder do this: giving weightage to the next word based on the learnings you have. I remember reading somewhere about LLMs being described as confident confabulators.
00:50:51
Speaker
And yes, as I said, the nature of this technology is to complete the sentence, to complete the content. So therefore all these issues happen, right? I am sure all the listeners, and you, have heard that there was an attorney in New York City who used ChatGPT to prepare a case for court. He asked ChatGPT about the case law, and it gave him all the wrong case citations. He presented them, and the judge said, this is not true. And he said, no, I got this from ChatGPT. And then he was sanctioned.
00:51:26
Speaker
So you will get an answer to your question on ChatGPT, or for that matter any other large language model, because, again, I think for listeners it may now be clear: a large language model is a transformer which you have pre-trained on a large piece of content.
00:51:41
Speaker
The more specific I am, like, for example, if I'm now building a large language model for law, then I need to train it only on legal content, and then it will start giving good answers. So I call these LLMs high school graduates. ChatGPT is a high school graduate; it's a foundation model. It knows science, it knows mathematics, it knows English, et cetera. In enterprises, what we do now is bring that high school graduate inside the firewalls of the enterprise. We already talked about the structured and unstructured content. So we start training: we take it from high school graduate to undergrad, graduation, PhD, postdoc, on the topics of that enterprise's business. Then the hallucination can be reduced, it can start really giving you responses, and business applications can be built on top of it. Okay, okay, I just understood. So initially,
00:52:42
Speaker
like in 2017, you saw Watson and you decided that you needed to build your own stack for AI. So what did you build in 2017? Did you also do a GPT, or were you building on top of Watson to achieve that? Just take me through that journey.
00:53:04
Speaker
It's a very good question, because we don't want listeners to form the impression that AI is equal to GPT. GPT happens to be one tree in the jungle of AI, not the whole jungle; it's one technology, one application. But coming back to most AI applications, and again, AI is not new in enterprises: it has been used in financial services, healthcare, et cetera, very extensively for probably the last two decades. The AI solutions we built initially were on structured data, because the technology in 2017-18 had not progressed much on unstructured content, and most business challenges today also lie in structured data.
00:53:50
Speaker
And when I demystify this AI topic, there are three AI applications today, in 2024: predictive AI, interpretive AI, and generative AI. We just covered the generative AI part at length.
00:54:06
Speaker
The other two, predictive and interpretive AI, are what we have done since 2017. So, answering your question: based on the enterprise data which sits in rows and columns, in databases, there are lots of requirements in the market about predicting what will happen. The way I explain it to non-technical people is that AI can tell you what will happen and what to do.
00:54:32
Speaker
And that is the big requirement in all the enterprises, because the C-level executives are fed up with being told what happened. The entire business intelligence we talk about is about what happened; an Excel sheet is about what happened:
00:54:50
Speaker
how much revenue I did, how many employees I have, how many employees left, how many customers I have, how many customers left. So in 2017, some of our existing customers had this challenge: they wanted churn management solutions. I hope the listeners and you know that churn is the word used for retention, or customers leaving your service. So if you talk about a telecom business or an insurance business, or wherever there is a subscription,
00:55:18
Speaker
everybody has some percentage of customers leaving the service. For example, I am with Airtel today, tomorrow I go to Reliance, the day after I go to Jio. So I leave one service. Now, for all the subscription business companies, it is utterly painful to lose a customer,
00:55:37
Speaker
because the cost of losing a customer is much higher than that of getting a new one. So all the customer service managers, sales managers, C-level executives: even today, I get the report of how many customers left last month. Now, that's the past, right? They left; what can I do? I just feel bad that I now have 30% churn, or 35% churn, or 15% churn.
00:56:03
Speaker
That is a good piece of information, but I can't take any action on it. So now AI can come into the picture, and AI can actually tell you how many people are likely to leave you next month, next quarter, next half year, next year. If I have that intelligence today, then as a product manager, customer service manager, CEO, CFO, I can work on many strategies.
00:56:32
Speaker
So if you take a telecom business, and I predict that there is a 90% chance Akshay is likely to move from company A to company B, company A can proactively call, saying, Akshay, how are you doing? Send you a $20 coupon, give you some additional freebies, make sure your service is better. And you can actually bring down your churn rate. So: what will happen? Akshay is likely to leave. What to do? Akshay is shopping for a cheaper price, so maybe give him a discount. AI now has the ability to learn from the past and do that.
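A minimal churn-prediction sketch along those lines, assuming scikit-learn and a tiny made-up subscriber table; a real deployment would use far richer features, far more history, and proper validation:

```python
# Toy churn prediction: learn from subscribers who stayed (0) or left (1),
# then score a current subscriber by their probability of leaving.
from sklearn.ensemble import RandomForestClassifier

# Made-up history: [monthly_bill, support_calls, months_on_plan] -> churned?
X_history = [[30, 0, 36], [85, 5, 4], [45, 1, 24], [90, 7, 2], [35, 0, 48], [80, 6, 3]]
y_history = [0, 1, 0, 1, 0, 1]

model = RandomForestClassifier(random_state=0).fit(X_history, y_history)

# The churn probability is what drives the proactive retention action.
p_churn = model.predict_proba([[88, 4, 5]])[0][1]
if p_churn > 0.7:
    print(f"High churn risk ({p_churn:.0%}): call proactively, offer a discount")
```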
00:57:05
Speaker
So, answering your question: from 2017 onwards we started working on predictive AI, and today also that comprises our major service, providing predictive AI for demand forecasting, supply chain, customer management, employee management.
00:57:21
Speaker
We call it enterprise forecasting, because everybody wants to know what will happen in their business: how much revenue I will do, what price I will have, how much product I will sell. Any business division you take, they want to know what will happen and what to do. In addition to this predictive AI, because of Watson and natural language processing, we started working on interpretive AI.
00:57:45
Speaker
Now, what is interpretive AI? We talked about OCR some time ago; you elaborated for your listeners that optical character recognition is a technology. It is, in a way, interpretive AI: it is interpreting what is there in a piece of content or a piece of paper. But now take the interpretation to the next level: I am now interpreting that the sentiment is bad.
00:58:07
Speaker
I have a news release, I pass it through AI, and I learn: oh, it's bad news. Now take customer feedback. You can take reviews on Yelp, or reviews on Google. You can't read all that content every day, but I can take that content and have it interpreted for me: oh, the sentiment is bad, the keyword is Anand, what are the entities in there? That intelligence let me build a lot of applications. So that is interpretive AI, and we work mostly in interpretive AI and predictive AI. And Akshay, for you and your listeners: the highest level of AI application demand today in the corporates, in the enterprises, is for predictive and interpretive AI, and not that much for generative AI.
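A minimal sketch of that interpretive step, using a crude keyword-list polarity score; this is an assumed toy approach for illustration only, whereas systems like Watson use trained NLP models for sentiment, keywords, and entities:

```python
# Toy "interpretive AI": turn piles of feedback text into a sentiment signal.
POSITIVE = {"great", "love", "excellent", "happy", "fast"}
NEGATIVE = {"bad", "terrible", "slow", "unhappy", "broken"}

def interpret(text):
    words = {w.strip(".,!?").lower() for w in text.split()}
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    label = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    return {"sentiment": label, "keywords": sorted(words & (POSITIVE | NEGATIVE))}

reviews = [
    "The delivery was terrible and the support was slow.",
    "Great product, I love the new interface!",
]
for review in reviews:
    print(interpret(review))
```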
00:58:52
Speaker
Because generative AI, as you must have read, is good for marketing, content generation, authoring. But the real challenges in a business are not that; the real challenges lie somewhere else, where predictive and interpretive AI come in. Even if you read the latest McKinsey report, they talk about the economic impact from traditional AI. This predictive and interpretive AI is also called traditional AI or classic AI; we call it discriminative AI,
00:59:20
Speaker
just to separate it from generative AI. The highest number of applications are there, and we at Findability Sciences have built production solutions using predictive AI and interpretive AI. Okay. Predictive AI is essentially statistics applied to big data? You are correct on part one: it is statistical methodologies applied,
00:59:47
Speaker
but the "big data" part I want to change for your audience and listeners. The myth in the market that you need big data in order to analyze has gone away, and we were again leaders in the industry in calling that out. We actually use a new term, which can be very interesting for you, Akshay, and your audience: it's called wide data, not big data. You need more width to your data. Let me give you an example in 30 seconds.
01:00:18
Speaker
Say I'm predicting demand for a company, and what I have is 15 variables from their ERP system: the product name, its specification, the quantity they sold, the price, where the customers are, in which state they sold, and so on. Let's say 15 or 20. That's a very small width of data.
01:00:37
Speaker
Now I go outside and take, say, weather data. How is the weather impacting my product sales? Or economic data. So I start increasing my width beyond my firewall, taking in more and more data. If I have more columns, that actually helps machine learning understand the interdependencies in the data.
01:00:59
Speaker
When we talk about big data, it is only more rows. I may have only two columns but 20 million rows; that's big data, and it's of little use for machine learning. But if I have wide data, say 200 columns and only 2,000 rows, that's fine; the machine can learn from that. So, answering your question: predictive AI is statistical modeling applied to wide data, not big data alone.
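A rough illustration of that "wide data" point: external columns such as weather and holidays are joined onto a narrow ERP extract before any model is trained. The data sources and column names below are assumed for the example, not a real customer schema.

```python
# Wide-data sketch: widen an ERP extract with external columns before forecasting.
# All file names and columns are hypothetical.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

erp = pd.read_csv("erp_sales.csv")          # date, region, product, units_sold, price
weather = pd.read_csv("weather_daily.csv")  # date, region, avg_temp_c, rainfall_mm
holidays = pd.read_csv("holidays.csv")      # date, region, is_holiday

wide = (
    erp.merge(weather, on=["date", "region"], how="left")
       .merge(holidays, on=["date", "region"], how="left")
       .fillna({"is_holiday": 0})
)

features = ["price", "avg_temp_c", "rainfall_mm", "is_holiday"]
model = GradientBoostingRegressor()
model.fit(wide[features], wide["units_sold"])

# More columns (width) give the learner more interdependencies to exploit
# than simply adding more rows of the same two fields.
```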
01:01:30
Speaker
Okay. And that is why your connectors help you build predictive AI; that's the foundation you build from multiple sources, including Bloomberg, as you gave as an example. Right. And just to add on to that, it enhances the accuracy of the prediction. You would be surprised how many corporates, countless corporates, as of May 2024,
01:01:58
Speaker
run their businesses with accuracies of 65 or 70% on their forecasting, because they have no other means. But we have shown customers that they can go from 65% to 97 or 98% accuracy on their predictions by increasing the width of the data and using better statistical models with AI.
01:02:20
Speaker
And that possibility is there. So it's not just about solving the problem, but about having higher accuracy. It's like the weather example, right? On your iPhone or Android phone you have a weather app. When you step out today in Tokyo and check the weather, if it says there is a 30% chance of rain, you'll say, hey, I don't need an umbrella. But if it says 90%, you are going to carry the umbrella.
01:02:45
Speaker
Now imagine the same kind of output in a business situation. If I tell you that you are going to generate 200 million in revenue at 90% probability if you do one, two, three, four, you would prefer that output. So it's the width of the data and, as you talked about, the connectors: mixing different data, bringing it together, and providing the output. Now, additionally, you asked me earlier what we did that was like GPT, or different from GPT.
01:03:14
Speaker
With this interpretive AI, we actually built some very interesting algorithms about six, seven years ago for summarization, using interpretation. We are interpreting the text and then summarizing it in two sentences. It was not a generative technology; it was again based on weights. If I take 10 sentences and compute a weight for each sentence, then to abbreviate I just take the top three sentences by weight, and I have abbreviated that paragraph.
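The sentence-weighting idea described here is essentially extractive summarization. A toy version is sketched below, using simple word-frequency weights; this is an assumption for illustration, not the scoring Findability actually used.

```python
# Toy extractive summarizer: score sentences by word frequency, keep the top k.
import re
from collections import Counter

def summarize(text: str, k: int = 3) -> str:
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    # Weight of a sentence = sum of its word frequencies, normalized by length.
    def weight(sentence: str) -> float:
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)

    top = set(sorted(sentences, key=weight, reverse=True)[:k])
    # Keep the chosen sentences in their original order.
    return " ".join(s for s in sentences if s in top)

# Hypothetical usage: summarize a long article file into its three heaviest sentences.
print(summarize(open("article.txt").read(), k=3))
```

Nothing is generated here; the output is assembled purely from the highest-weighted sentences of the input, which is why it counts as interpretive rather than generative.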
01:03:44
Speaker
There is a gentleman named Jaspreet Bindra, a very well-known person in India, who was the chief digital officer of Mahindra. We actually did one project at Mahindra, so I met him there. He then left Mahindra and started his own consulting company. In 2017-2018 he authored a book called The Tech Whisperer, which became a bestseller on Amazon and in all the bookstores in India. But it is the world's first book in which one chapter is authored by AI, and this was six years before ChatGPT.
01:04:32
Speaker
And that was done by our algorithm. If you read that book, right from the preface, credit is given to us. The chapter authored by AI is printed on green pages, whereas the rest of the book is on white pages. We did that purposefully, because this was the world's first book in which a chapter was authored by AI. At that time we used this interpretive summarization algorithm: we fed all the Wikipedia content on AI to the algorithm and asked it to summarize it in 10 pages. That's what the technology was. Now you can give one prompt and it will summarize 100,000 pages for you; that's the progress of the technology. But that interpretive technology still has many applications in the business world. Just to give one example so your listeners can visualize it: we work with one of the largest pharmaceutical companies.
01:05:24
Speaker
Now, this pharmaceutical company produces bulk drugs. It's actually a Japanese company, a Japan-headquartered company. They produce batches of drugs as powder, literally chemical powder. They work out the composition, they mix it, and they produce it, saying, OK, this is now a new medicine. Now, in the research and development labs,
01:05:46
Speaker
samples come in for each batch, and the QA testers have to go through the batch and write down the specifications: the color is yellow, the composition is 1, 2, 3, 5, 6, 7. They do all the tests they are told to do in a standard format. Now there is a regulatory angle to it. The regulators tell you to do this testing, and then you have to compare this handwritten report with the regulations.
01:06:10
Speaker
The regulators may want the color of this batch to be white or yellow or gray or pink, whatever; I'm just taking color as a simple example. So what we did is build a platform on interpretive AI technology for this customer. Earlier, when the report was written by the QA tester, a very expensive scientist had to sit down and match each outcome to the regulation: oh, this says beige, but we actually want white, so it's not matching.
01:06:40
Speaker
And they had to do it manually. What we did is train this natural language platform on all the regulations. Now, whenever a test report comes in, they just scan it and upload it into the system. Natural language processing reads each word; it sees "white", then asks, what is the specification for white? It matches: tick mark, tick mark, tick mark, cross, tick mark. We have 100% accuracy on this project, 100%.
01:07:09
Speaker
So they stopped the entire manual work. It's a published use case in the domain; people can look it up. But that is not generative AI; generative AI cannot solve it, because of its generative nature. Here it is OCR, character reading, natural language matching, natural language classification, and finally giving you the output: whether it matched or not. So those are the interpretive AI technologies. We talked about predictive, then interpretive, and now there is generative.
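As a sketch of that matching step (not the actual platform), once OCR has turned the handwritten report into text, each extracted attribute can be checked deterministically against the regulatory specification. The spec values and the OCR text below are made up for illustration.

```python
# Sketch of interpretive matching: OCR'd test-report fields vs. a regulatory spec.
# Spec values and ocr_text are illustrative assumptions.
spec = {"color": {"white"}, "ph": (6.5, 7.5), "moisture_pct": (0.0, 0.5)}

ocr_text = "Color: White\npH: 7.1\nMoisture %: 0.4"   # output of the OCR step

def parse(text: str) -> dict:
    """Turn 'Key: value' lines into a normalized field dictionary."""
    fields = {}
    for line in text.splitlines():
        key, _, value = line.partition(":")
        fields[key.strip().lower().replace(" %", "_pct")] = value.strip().lower()
    return fields

def check(fields: dict) -> dict:
    """Tick or cross each field against the specification."""
    results = {}
    for name, allowed in spec.items():
        value = fields.get(name)
        if isinstance(allowed, set):
            results[name] = value in allowed                      # categorical match
        else:
            low, high = allowed
            results[name] = value is not None and low <= float(value) <= high
    return results

print(check(parse(ocr_text)))   # {'color': True, 'ph': True, 'moisture_pct': True}
```

Because the answer is a fact-based match rather than newly generated text, the result can be exactly right or exactly wrong, which is what makes the 100% accuracy claim possible for this kind of task.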
01:07:37
Speaker
OK, OK. Let me share some thoughts with you. Essentially, generative AI is creative and interpretive AI is not creative. Which is why, when you need something done that has to be 100% accurate and has to match against rules and regulations, creativity is not what you're looking for; in that case interpretive AI is the right solution to go with.
01:08:04
Speaker
I would not subscribe to your word "creative". I would still call the technology generative. The rest of your explanation is all good, but because it is used in the world of creation you're calling it creative; technologically it is generative, and it generates new content. Interpretation is fact-based: you want to know what something is, you want to do matching.
01:08:34
Speaker
You could call it rule-based or semi-rule-based. Say I write "white" by hand on the test report. When it gets scanned, the technology has to read it as "white". I may have written it in Japanese characters, or in Spanish; it has to read it, it has to interpret that this is "white".
01:08:54
Speaker
Then it has to interpret that "white" means the batch color. Therefore it has to go to the regulatory specifications and match the batch, because it's a matching task. You can call that a rule: is it white or is it beige? It's white, so give me the output. That's the interpretation of the content and the logic in it. What is the difference between rule-based and fuzzy logic? I had used the term rule-based and you said it's better to use the term fuzzy logic. What is the difference? So rule-based is very much
01:09:26
Speaker
a predefined output. It's hard-coded: if this number is more than five, then give this output. Something like that. Correct. But now, taking your example: if this number is more than five, then open the door. Okay, that's probably right. But in addition to that, we add 10 more rules, and if any combination of conditions is met
01:09:54
Speaker
it should open; that starts becoming fuzzy logic. It's not only one rule; it's multiple rules cascading with each other and generating the output. Okay, okay. So there is some sort of discrimination needed here, like which rule to apply. Is this why you term this discriminative AI?
01:10:16
Speaker
It is, in a way, yes. What's the logic? Help me understand what the term means. So the word discriminative AI is used mainly because it is all number-driven and based on correlations in the data. Even predictive AI is discriminative AI, because it is about correlation: only when there is a certain pressure in the environment does it rain, right? So there is a correlation.
01:10:45
Speaker
As you just talked about rules, can that be a rule? Not necessarily, because it's not just one output. In addition to pressure, I need the wind speed, I need the moisture, I need all of that in combination. So why "discrimination"? Because it's correlation and inter-correlation between multiple data items.
01:11:04
Speaker
Discrimination here is a term from statistics, right? Yes. What does it mean? It's been a long time since I... Yes, but just for simplification, it is mainly correlation. So the discrimination is basically correlations between various variables. See, if I can go into the statistical method: when you have an x and y axis, it's very easy to define.
01:11:30
Speaker
When I have x, y, and z axes, it is still easy to do. Now imagine I have many dimensions: one variable being pulled by n other variables, and by the way, the pulling force is variance. But you cannot see it. So far in statistics we always talked about there being, say, four important fields.
01:11:53
Speaker
But with the new AI technology, there may be a 470th field which has very little implication on its own but is still impacting the outcome. If you identify that as well, you will get your outcome much better. Now imagine a floating particle in the universe being pulled by 2,000 different variables, and those 2,000 variables have 2,000 different weightages; one is heavier, another lighter. So far, the human brain and the data scientist said four variables, or five variables, and we took only the so-called important ones and worked on them. But with discriminative AI you can actually take everything and then find the correlations. OK, I understand.
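One way to picture that "everything pulls on the target" idea is to compute correlations, or model-based importances, across all available columns instead of a hand-picked four or five. A small illustration under assumed column names and an assumed numeric table follows; it is a sketch, not Findability's method.

```python
# Sketch: let the model weigh hundreds of candidate variables instead of a hand-picked few.
# Assumes a hypothetical, all-numeric training table with ~200 columns.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

df = pd.read_csv("wide_training_table.csv")
target = "units_sold"
features = [c for c in df.columns if c != target]

# Linear view: correlation of every column with the target.
correlations = df[features + [target]].corr()[target].drop(target)
print(correlations.abs().sort_values(ascending=False).head(10))

# Non-linear view: feature importances from a tree ensemble,
# which can surface a weak-looking 470th field that still moves the outcome.
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(df[features], df[target])
importances = pd.Series(model.feature_importances_, index=features)
print(importances.sort_values(ascending=False).head(10))
```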
01:12:39
Speaker
Your journey since 2016-17 has essentially been about building interpretive and predictive AI solutions for corporates. Is this a service or a product? No, we actually built products. Over this period we now have 20-plus products, but we classify them into predictive and interpretive AI, which we call discriminative AI, and which is mainly related to forecasting. If you slice and dice, the maximum number of AI applications in the world today, and again I'm talking about the enterprise context,
01:13:13
Speaker
are related to prediction, forecasting, and that kind of output. So we have a product called the enterprise forecasting algorithm. We built a lot of AI into it and a lot of unique intellectual property on top of it. And then we have the business process copilots, which are partly interpretive AI and partly generative AI. How is the enterprise forecasting product a product? What is it like?
01:13:41
Speaker
In typical SaaS products you have some sort of onboarding and then it is all self-service. Is it like that, or how does it work as a product? Yes. So, on why it is a product, and this can again be a very interesting point for the audience, the listeners, and you as well: we try to classify AI as software. I humbly request you all not to do that from today onwards.
01:14:10
Speaker
And I'm talking from a position of a lot of research and study. Why? Because, as we talked about at the beginning, AI is meaningful only if you have data, right? When you bought an ERP solution, or an accounting solution, or an HR solution, or a CRM solution, there was no precondition on your business.
01:14:36
Speaker
You can be any business; you bring in a CRM and start using it from tomorrow. You input data, you store it, it has workflows. That's a software product. But when you ask how this is a product: if I bring you a piece of software code which I call my enterprise forecasting product, what will you do with it? You have no data in it.
01:15:00
Speaker
So an enterprise AI solution should not be compared to software and should not be classified into the software category. That's my humble request to you all. Now, assuming we all agree with that, or if you want to understand why I'm saying it, take another example. I have two customers, and both are in the hospitality business. They are hotel chains, okay?
01:15:29
Speaker
Now I take this enterprise forecasting product to both of them and tell both that I will predict how many people are going to stay in your hotel, the occupancy rate, and what your room rates will be. This is the most burning problem in the hotel industry, but these are two different hotels.
01:15:52
Speaker
If I'm taking, say, a customer management solution to both hotels, I can just load it onto their machines and say, this is done, here is the UI; as people check in you start entering their names, et cetera, and you use it. But now take the forecasting problem. Hotel A is using Oracle as its ERP; hotel B is using SAP as its ERP.
01:16:21
Speaker
Getting the data out of each of these ERPs has its own challenges, so you need custom work when you are deploying AI. Now, walking you through how this implementation happens, to answer your question: through lots of failures and successes we have developed a unique framework for AI implementation, which does not apply to software. We call it the CUPP framework, spelled with two Ps, and it stands for collection, unification, processing, and presentation.
01:17:08
Speaker
The collection and unification are unique to each business situation. What is unification? We talked about wide data, right? For wide data, I'm now taking a hotel location. I'm taking the data from the ERP, say Oracle, and I have how many customers checked in, what room rent they paid, et cetera.
01:17:32
Speaker
But I have another system, let's say Expedia or Booking.com, and that data is sitting somewhere else. I also need to know the weather in that area, or the holidays in that area.
01:17:49
Speaker
What are the tourist destinations? What are the events? That's external data. I need to unify all this data in order for my machine learning algorithm to learn; that's the unification. If you're taking data from just one source, unification may not be required, but that is typically not the situation. So in this CUPP framework, the collection and unification we do for each customer.
01:18:16
Speaker
Then the processing, the third letter in CUPP, is our product. In enterprise forecasting and in the business process copilot we have pre-developed a lot of software, so you don't need to develop any code there. Now I'm in the software world: like ERP software, like CRM, I have forecasting software. But unless I help the customer with the C and the U, the processing has no meaning.
01:18:48
Speaker
So processing is our repeatable engine; we take it to every customer. It may be hotel A and hotel B in our example, or hotel A and manufacturer B, or manufacturer B and financial services company C, or financial services company C and telecom company E. I can go to anybody with my processing engine. That's our product.
01:19:10
Speaker
Now I put this collected and unified data into processing, and it starts spitting out the output: it gives me the room rates, it gives me the occupancy. But now say I have a hotel chain with one hotel in Tokyo, one in New York, one in Mumbai, one in Delhi. Each of the local owners wants the presentation in a different format.
01:19:34
Speaker
The New York guy says, push it back into my ERP, I will access it from there. The Mumbai guy says, send me an Excel sheet every day by email. The Tokyo guy says, give me a fancy dashboard. So this presentation layer in AI is also very custom-made, because everybody has different requirements even within the same industry.
01:19:57
Speaker
Unlike software, where you scale the same ERP UI to every customer, here you could force them to do that, but people don't want to use one more system. And this is data: the output from the system is really data, and this data will be used for taking actions and decisions. So the flexibility of the presentation layer is very important.
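As a structural sketch, the CUPP flow described here could be organized roughly as below, with collection and presentation swapped per customer and the processing step kept constant. Every function, file, and the stand-in "model" are invented for illustration, not the actual Findability engine.

```python
# Rough CUPP skeleton: Collection and Presentation vary per customer,
# Processing is the reusable engine. All names are illustrative. (Python 3.9+)
import pandas as pd

def collect_hotel_a() -> list[pd.DataFrame]:
    # Customer-specific: pull from the Oracle ERP extract plus OTA booking exports.
    return [pd.read_csv("oracle_extract.csv"), pd.read_csv("ota_bookings.csv")]

def unify(frames: list[pd.DataFrame]) -> pd.DataFrame:
    # Customer-specific: align keys (date, property) and merge into one wide table.
    out = frames[0]
    for f in frames[1:]:
        out = out.merge(f, on=["date", "property_id"], how="left")
    return out

def process(wide: pd.DataFrame) -> pd.DataFrame:
    # Repeatable engine: same code for every customer.
    # A 7-day rolling mean stands in for the real forecasting models.
    wide["forecast_occupancy"] = wide["occupancy"].rolling(7, min_periods=1).mean()
    return wide

def present_as_excel(result: pd.DataFrame, path: str) -> None:
    # Customer-specific: one owner wants Excel, another a dashboard, another an ERP write-back.
    result.to_excel(path, index=False)

present_as_excel(process(unify(collect_hotel_a())), "hotel_a_forecast.xlsx")
```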
01:20:22
Speaker
So the last P is also flexible and customized for each customer. We have now seen umpteen examples where, if you go through this CUPP framework, your success with AI is multiplied. Okay, interesting. I have seen that the very large software businesses
01:20:47
Speaker
focus on product and leave customization to system integrators, which is what built Infosys and Datamatics and these kinds of companies. Whereas in your case, you are doing both product and services. Is there a risk that it limits your growth?
01:21:07
Speaker
You know, if you were instead to say, okay, I will only do product and let system integrators do the customization. Today, in your collection and unification process, you have humans making the subjective judgment of, for this business, what data do I need to collect. You could productize that even further, so that
01:21:35
Speaker
if it is the hotel industry, these are the data sources which need to be collected; or you could have some sort of onboarding questionnaire for your customer which decides that, and then all of it gets automated.
01:21:51
Speaker
So yes, what you're saying is possible. We tried doing that. But again, if you are only thinking within the software framework, you feel it's feasible,
01:22:03
Speaker
but in reality it is not feasible. Let me go back to my example of hotel A and hotel B. If I decide that I need these 10 variables, and I force both hotel A and hotel B, saying this is my questionnaire, these are my 10 variables and you give me those 10, I am actually limiting the use of AI,
01:22:27
Speaker
because hotel B actually has 100 variables, and that is their intellectual property, the gold mine they're sitting on. We talk about data being currency, data being the new oil, the new soil, et cetera. But if I force them to limit themselves to only 10 variables, they're going to get more or less the same output and they won't have any competitive advantage. Their competitive advantage lies in the 100 fields they have.
01:22:53
Speaker
So as an AI provider, I need the flexibility to do different things for different customers. That's part two of the answer to your question. Now part one: yes, system integrators could pick this up and do it. We tried it, and we failed. Two reasons for the failures. One is that system integrators look at AI from the software angle; they need a complete, revolutionary change in their approach.
01:23:22
Speaker
The second: as you rightly said, they need processes, methodologies, documents, templates, very structured things; only then can they perform. That's the reason Infosys and Wipro and TCS have taken third-party products and made big money out of them, because there's a very standard process.
01:23:40
Speaker
In this case you cannot define it, for the reasons I told you. The second is that this AI field is so hot that everybody wants to do it on their own. So the system integrators, while they are not prepared to build that processing layer on their own, want to fiddle in that area. They say, oh, I can also take other third-party products, open source. Most of them are in an experimental mode, and they don't want to take on any third-party product.
01:24:07
Speaker
In fact, my dream was, by the way, I built this as a product company, and as you know a product company is not an easy game. We don't do any services, we don't do any placement of people or professional services. So I also wished I could just give products and people would do it on their own. In the very early days we got a couple of customers who said, oh, I have in-house capacity, you can just deploy the license on our side. We deployed our processing engines on their network. They didn't move an inch,
01:24:36
Speaker
because you need that expertise to do it. So we decided to do both product and services. And again, if you scan the market, it's more or less the same. Take a company named C3.ai, a publicly listed AI company started by the well-known entrepreneur Tom Siebel. They not only provide the product license, they also provide services that are 10 to 20 times more expensive, because they can't just give you the product and say, you do it. Sorry, they're also an enterprise AI company?

AI Integration in Business: Challenges and Opportunities

01:25:10
Speaker
Yeah, exactly, more or less the same thing we do. OK, understood. So that answers it; the SI route, as you said, would be the right business model only in theory. And your question about whether it will limit our growth? Potentially, yes. But only if the industry agrees with our thought process that
01:25:37
Speaker
this software has to be dealt with in a different way will things happen the way you're saying. There have to be new system integrators in the world of AI. With the old system integrators trying to ride the bandwagon, we are going to get a lot of noise around it. Okay. The C and U, the collection and unification process, what kind of people does it need? Is it data scientists who do it, or business people?
01:26:06
Speaker
No, they are more what I would call data analysts and data engineers. The data scientist comes in when the modeling or processing part comes in. These are people who understand databases, crawling, API connectivity, extraction, transformation, loading. So data analysts and data engineers. You don't need business intelligence here, to understand the business of the client and decide what sources to look at?
01:26:33
Speaker
It needs not business intelligence but domain expertise. In fact, Akshay, what I explain to all our prospects is that we look for three things from the customer: data, domain, and decisions. I call them the three Ds. They have to bring these three Ds to the table; only then can our CUPP framework be applied. Now, data needs no explanation, because unless the customer's data is there, nothing can happen. Data is a no-brainer.
01:27:03
Speaker
Then domain. Yes, we need domain knowledge, because technologies like AI are horizontal technologies; the solution gets defined by the data which I put in. Imagine I have a predictive algorithm product. I bring in HR data and it becomes an HR solution; I bring in demand and supply management data and it becomes a demand-supply solution. So we need domain experts to say, I think this data source may impact my outcome, or that data field may be relevant. We need domain knowledge from the customer,
01:27:34
Speaker
because we don't have domain-vertical products. We have concepts which are vertical, like demand forecasting, but in demand forecasting, even for two manufacturers in the same space, things may be totally different. And decisions is also a very interesting word I use, because we failed in many cases where the customers were not in a position to take a decision.
01:27:57
Speaker
Whether it is about the data, or about what the outcome is and how to use it, because it's not just about predicting what will happen. If I tell you, Akshay, it is going to rain tonight, and you say, I'm not going to take an umbrella, and then you go out and get completely wet, who is responsible for it? So does decision refer to what the customer does based on the data, or to the customer deciding what kinds of things they want you to predict? The prediction use case is usually well defined; that's not the decision issue. But along the way there are lots of decisions to take: there are data confidentiality issues, data governance issues, risk issues. But at the end, let me give you a real example beyond this rain forecast one.
01:28:54
Speaker
We work with debt collectors, okay? We work with insurance companies, credit card companies, loan companies. They keep calling these debtors. And by the way, I don't know whether you know, but in the US debt is exponentially high; consumer debt, individual debt, is touching about a trillion dollars. So there are lots of debt collection companies. They call up Akshay, they call up Anand, saying, oh, you have not paid your bill last month,
01:29:23
Speaker
your outstanding is $172, et cetera, please pay. On average, the debt collector has to make 26 calls or contact points in order to collect the money. Now, we brought in AI here,
01:29:40
Speaker
and we now give them output saying Anand is 20% likely to pay, Akshay is 95% likely to pay. Like that, we rank the whole list. We also tell them what to do: send a text message to Anand, call Akshay, send an email to Sanjay, and so on. Now, AI can give you that output, but the call center has to reorganize itself in order to use it. They cannot, because they work on a technology called auto dialers: the agents have no control, they just sit with a headphone, the machine dials, and it pops up in front of them that they're calling Anand and Anand's outstanding is $172. To use AI, they have to take decisions about changing their structure; they have to prioritize accordingly. So we keep asking customers: are you ready for that first?
01:30:38
Speaker
Don't just say, I want to use AI for the sake of AI, because if you don't operationalize that output you will not get the benefit from it. That's what I mean by decisions.
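A minimal sketch of that scoring-plus-action output follows. The model choice, feature names, thresholds, and channel rules are assumptions for illustration, not the actual product logic, and the point stands either way: the output only has value if the call center reorganizes around it.

```python
# Sketch: rank debtors by predicted likelihood to pay and suggest a contact action.
import pandas as pd
from sklearn.linear_model import LogisticRegression

history = pd.read_csv("past_collections.csv")   # hypothetical: outcome 'paid' is 0/1
features = ["amount_due", "days_past_due", "prior_defaults", "contact_attempts"]

model = LogisticRegression(max_iter=1000)
model.fit(history[features], history["paid"])

accounts = pd.read_csv("open_accounts.csv")
accounts["pay_probability"] = model.predict_proba(accounts[features])[:, 1]

def next_action(p: float) -> str:
    # Illustrative thresholds only; a real deployment would tune these per portfolio.
    if p >= 0.8:
        return "call now"
    if p >= 0.4:
        return "send SMS reminder"
    return "send email / deprioritize"

accounts["action"] = accounts["pay_probability"].apply(next_action)
print(accounts.sort_values("pay_probability", ascending=False)
      [["account_id", "pay_probability", "action"]].head(20))
```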

Is AI a Separate Industry?

01:30:50
Speaker
Okay, interesting. In a way, you are like a consulting company,
01:30:58
Speaker
the way, say, a McKinsey helps customers, gives recommendations, and basically helps them be more productive. You are doing something similar, helping a customer be more productive, except it's on the back of data and AI.
01:31:14
Speaker
And that's the reason, Akshay, I come back to what I said: the AI industry should be classified as a separate industry, because it's a combination of system integration, software, consulting, and implementation. It's totally different from just creating software and distributing it to dealers and distributors while someone else implements it. Because it is very niche,
01:31:39
Speaker
the outputs are totally different. For example, the way you may be using ChatGPT versus how I use ChatGPT is totally different. How do you think the enterprise AI market will look in a decade or

Future of Enterprise AI Market

01:31:54
Speaker
two? And I want to give a little bit of a preface here.
01:31:58
Speaker
In the enterprise software market you have this very standard model of SAP, Oracle, Salesforce. All of them operate in a very similar way: they build products and system integrators do the execution. And there is a very standard way they do their B2B marketing and everything around that. How do you think the enterprise AI market will look?
01:32:22
Speaker
The enterprise AI market will, one, continue to grow dramatically, because it's completely nascent, an open field, and everybody needs it. That's the first point. The second is that there's huge pressure from the software industry to put it into a template, the way you were talking about. So eventually there will be some verticalized products with restrictions, as we discussed: these are the 10 fields you give me, and then I'll give you the output.
01:32:50
Speaker
There will be some, and people will adopt them because of the low cost, easy implementation, fairly good output, et cetera. The constant fight is going to be build versus buy. Enterprises are going to think, I can build this, because AI technology is becoming easier; ChatGPT is one example.
01:33:16
Speaker
Now I have a foundation model, I can bring it in. But there are big issues with an organization building versus buying. I was talking to one of my prospects recently, last week, in a meeting in person, and I told them: General Electric makes very good washing machines, but they are not in the laundromat business. You need a totally different mindset, setup, and operation to run a laundromat. Similarly, corporate businesses need to know who they are and what they are doing. If I'm a financial company, a bank, a telecom company, what is my core business?
01:34:03
Speaker
Building and using AI is important for your business, but if you focus on building rather than buying, you're going to land nowhere. And we have a number of such failures in the industry, published and unpublished, where companies trying to do it on their own eventually landed nothing.
01:34:21
Speaker
But that fight is going to keep intensifying, because companies want governance, they have risk issues, compliance issues, internal staff issues. What will happen to these large IT teams they have? They want to re-utilize them, but reskilling and redevelopment is also a big challenge. So it's going to go through a lot of turmoil, but the adoption of AI in enterprises is going to continue to grow over the next 10 years.
01:34:45
Speaker
So I don't doubt that in build versus buy, eventually it will be more buy, and possibly the buy decision will also become easier as costs come down, with more competition and more things getting productized.
01:34:59
Speaker
But,
01:35:02
Speaker
you know, Salesforce would want to be an enterprise AI company, possibly telling a debt collection agency: you maintain your records of the debtors on Salesforce, and Salesforce will predict what you should do and send the SMS or the email reminder. So, you know,

Integrating AI with Existing Systems: Challenges

01:35:25
Speaker
do you see that happening, where each of these companies builds its own enterprise AI into the product?
01:35:34
Speaker
Yes, it's not just going to happen, it has already started happening, because every single product company is launching a so-called copilot within their products. It will be useful, but it has its own limitations, and the big limitation is what you just articulated: you need to first keep your data in SAP.
01:35:54
Speaker
Okay, and for that I have to spend a million dollars or multi-million dollars. Anyway, an enterprise will use something, right? Correct. Not necessarily every product company is going to do it, but these big-muscle companies definitely will. I mean, for them it's an easy upsell. Correct. So even if I'm on SAP and automatically get the copilot, we talked about the wide data concept earlier in this conversation.
01:36:24
Speaker
I am now limited to using my 10 fields of stored data, and my intelligence is going to be weak. I need that external data. And if it says, fine, bring in external data and put it in, then you are going through the same cycle anyway, whether you're using a Findability product or an SAP product. So yes, it will definitely have an impact, and those companies will have a lot of consumers because of their captive base. But currently we work with many companies who got into situations where their native product companies are offering them AI and they found serious limitations, particularly in the planning area. There are planning softwares that say, we do forecasting on your demand also, but the accuracies are around 60, 65%, and they don't generate the output they need. So there will be challenges. But for some companies it might be an easy route to just use that.
01:37:14
Speaker
Okay. That 60-65% which comes with the upsell from a software company: how much can you improve on that if a company takes up a Findability solution? Usually anything between 25 to 30 points more in terms of accuracy. Okay. And the cost will be more or less the same, or less. How do you price it? We price per month per use case, and it varies between $25,000 to $50,000 based on the data, the use case, its impact, et cetera. And ours are annual contracts. If we go back to that hotel example, what would be an example of a use case for a hotel? Is predicting demand for one property one use case, or is it for the entire chain?
01:38:08
Speaker
For the entire chain, predicting occupancy and price would be one use case. Okay. So it seems like your revenue per customer is not that high then; it's not a $500,000 contract, it's more like $50,000 contracts. No, no, I'm talking about per-month billing.
01:38:32
Speaker
Ah right, right, okay. What's your average revenue per customer? Typically in the industry, not only with Findability Sciences, you will see an average of about $240,000 to $300,000 per use case. Okay. So how does this compare with the average pricing of legacy software businesses like a Salesforce or an SAP,
01:38:58
Speaker
these kinds? This is quite a bit cheaper than the legacy product companies pushing their copilots. The copilot is an expensive add-on; I saw Microsoft's pricing for the Copilot feature, and it was fairly expensive.
01:39:13
Speaker
Yeah. But even that, I mean Microsoft Copilot, is a different game. That's for individual productivity; it is not for the enterprise. Yes. It will not do the forecasting. It will create your PowerPoint, answer questions on your Excel sheet, or help you draft a Word document, but it is not taking your business process into consideration.
01:39:34
Speaker
Right, right. Okay. Among your products, you said one is forecasting and the other is the copilot; which is your hero product? I'm guessing forecasting is the one that gets you the most? Yes, that's the highest revenue generator, mainly because we have been serving it for a number of years now. In the business process copilot, the interpretive AI part is also significant. The generative AI business we are just now capturing, and we have a few customers on that today.
01:40:05
Speaker
Give us some examples of what this business process copilot does. So we launched one recently, and this can be a very good segue given your Microsoft example of what Microsoft Copilot does.

AI Solutions and Innovations

01:40:18
Speaker
We launched a business process copilot for medical devices companies. Great. Now, when I launch a medical device, let's imagine I'm a big company, or a medium-sized or small company, or a startup.
01:40:33
Speaker
I get an idea, and let's take a very basic example, though it's not the perfect one. Say I'm launching a new thermometer, and that thermometer needs FDA approval for me to launch in the US market, or PMDA approval in Japan, or the corresponding approval in India. There are regulatory bodies in each country. So I develop this thermometer: I design it, I do some manufacturing, I do clinical trials, and then I'm ready with the data. I can't go to market unless I have approval from regulatory affairs. Now, this approval takes anything between three months and three years.
01:41:14
Speaker
Now, I talked about a thermometer. Imagine a large company developing a complex CT scanner, a machine that runs into hundreds of millions of dollars. I have developed the machine, I have done the clinical trials, and now I'm ready to go to market, but I have stock piled up and I'm waiting for FDA approval. In this FDA approval, taking the US as an example, there is a form called the 510(k) form,
01:41:40
Speaker
which is for pre-market approval. I cannot go to market without this approval. This form often runs into thousands of pages, and in it I have to put my product details, specifications, my clinical trial data, competitors' data, producer data, intellectual property, the FDA rules I have to follow, all categorized. So there are regulatory affairs departments inside each of these medical devices companies,
01:42:12
Speaker
with a team of 3, 5, 10, or 20 people based on the company size. Day and night they are studying FDA rules: is there any change in the rules in my category, in this category, in the competition? It's a humanly impossible task in a way, though the industry has managed so far. So what we did is create a solution which is a combination of predictive, interpretive, and generative AI,
01:42:35
Speaker
where we have trained a large language model on all the FDA regulations, so it now knows every code, every page, every loophole. Then we go to the customer, deploy it in the customer's environment, and train it on all their historic data: there were, say, 20 applications in the past, five got rejected, 15 got accepted, so we train on all of that. Now we have an interface where we say, okay, I have a new product to launch.
01:43:00
Speaker
It is conversational, and it says, upload the specifications. You attach a PDF and upload the specifications. And Akshay, the work it used to take weeks to months to do to create this form now happens in less than 10 minutes. I have the 510(k) form ready,
01:43:17
Speaker
because it generates content, it drafts the content, it does all the predictions, et cetera, and I have the form ready. I can just tweak it manually and submit. Now, once you submit, you may get questions from the FDA. The moment a question comes in, the response also gets automatically drafted.
01:43:33
Speaker
Earlier, what used to happen? I'm getting questions: oh, I'm busy today, next week I'm busy, I'm putting it on the back burner. It might take me a month to respond. Then I have to go research the completed product, et cetera. All of that is done automatically. So that is the business process copilot. A business process copilot for a 510(k) form application
01:43:55
Speaker
is one example, and that is in no way comparable to drafting a PowerPoint presentation or a Word document or an Excel sheet. It's a business process, and it needs an enterprise-level implementation to solve that problem. Okay, interesting. On this business process copilot: I now see a lot of startups focused on just one specific business-case copilot. This itself could be a startup, right? An apply-for-FDA-approval product. Correct, and we are heading in that direction. This comes back to your earlier question of whether this can be productized. It is productized, but again, keep in mind, unlike ERP or CRM systems, while I have now created this copilot as a solution, I still need to train it on customer data.
01:44:44
Speaker
Right. Without that, it cannot be used. A lot of standardization and a lot of vertical training is already done, maybe 70%, but that 30% of the work still needs to be done.
01:44:58
Speaker
That last bit, the training on customer data, can it be self-service where customers train it on their own, or is it a technical process? It's a technical process. While we use the words "training" and "fine-tuning" very lightly, it's a process that needs some technological skills; a business user cannot do it. Okay.
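For readers who want to picture the mechanics, a heavily simplified retrieval-plus-generation loop of the kind such a copilot could use is sketched below. The data structures, the keyword-overlap retrieval, the prompt, and the `generate` stub are placeholders, not Findability's implementation; the real product combines predictive, interpretive, and generative components, and any real submission would still require human review.

```python
# Heavily simplified retrieval-augmented drafting loop for a regulatory form section.
# Every name here is a placeholder; a real system would use a vector store,
# a validated document pipeline, and expert review before anything is submitted.
from dataclasses import dataclass

@dataclass
class Passage:
    source: str   # e.g. a regulation clause or a prior submission (illustrative IDs)
    text: str

def retrieve(query: str, index: list[Passage], k: int = 5) -> list[Passage]:
    # Placeholder retrieval: naive keyword overlap instead of embeddings.
    scored = sorted(index,
                    key=lambda p: sum(w in p.text.lower() for w in query.lower().split()),
                    reverse=True)
    return scored[:k]

def generate(prompt: str) -> str:
    # Placeholder for a call to whatever LLM the deployment uses; returns a stub here.
    return "[draft text would be produced by the model from the prompt above]"

def draft_section(section: str, device_spec: str, index: list[Passage]) -> str:
    context = "\n\n".join(p.source + ": " + p.text
                          for p in retrieve(section + " " + device_spec, index))
    prompt = (f"Draft the '{section}' section of a pre-market submission for this device.\n"
              f"Device specification:\n{device_spec}\n\n"
              f"Relevant regulation and precedent:\n{context}")
    return generate(prompt)

# Illustrative usage with a tiny, made-up index.
index = [Passage("Regulation clause (illustrative)",
                 "devices of this type must state intended use and predicate comparison"),
         Passage("Prior submission (illustrative)",
                 "thermometer cleared with stated accuracy of +/- 0.1 C")]
print(draft_section("Intended Use", "infrared thermometer, range 34-43 C", index))
```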

Building Sustainable Businesses vs. High Valuations

01:45:20
Speaker
Okay. Technology may probably reach that level, but it's not there today. Okay. What kind of ARR do you do now?
01:45:30
Speaker
The total, or per customer? The total, for Findability. About $10 million. $10 million, and you are still bootstrapped, no external funds raised? In 2018 SoftBank of Japan, SoftBank Telecom, invested. That was mainly because we formed a joint venture in Japan with them, and that was one of their conditions for the investment. So we had that investment from them, but that's the first and last. OK.
01:46:01
Speaker
Why have you not sought out more investment? Wouldn't that help you do more productization, create more industry-specific products? It would, but very interestingly in this entrepreneurship journey, and since your podcast is about entrepreneurship: we have practically reinvented ourselves every year, not just by choice; the market pressures and forces were too large. One, the technology landscape is changing dramatically. Second, we went through an economic recession, we went through the pandemic, we went through the post-pandemic economic slowdown. So there is constant market pressure to reinvent yourself.
01:46:48
Speaker
And then AI investments, if you really look at the market, are happening only in the foundational technologies. For the solution technologies, investors ask the way you are asking: where is your template, where is your verticalized product, where is the repeatability? Explaining this CUPP framework to investors takes a lot of time, and people don't have the attention bandwidth or the interest to understand that this market is going to head in this direction. So it's a combination of reasons, not only one. Last but not least, with my humble background, coming from a very small town in Maharashtra, I didn't want to just take someone's money and blow it up. We are building a very conservative, very sustainable business. We might go for investment in the future; we continue to be profitable, we generate money every month,
01:47:40
Speaker
so there's no dire need for us to go for investment. At the appropriate time we might do it. Hypothetically, if you raised $10 million today, what would you use it on? Primarily two things. One is market reach, basically, because we currently operate with a very thin, lean sales team; we have only two salespeople in the company. Oh, wow. We get most of the business through referrals and through my contacts.
01:48:10
Speaker
So we would definitely spend some money on that part. The second is what we talked about extensively: verticalized product development, like this business process copilot for medical devices. That needs an element of investment and proactive development, not just customers. If there is some cash available, those products can be built and launched proactively, and you can afford to fail on a couple of them and succeed on a few.
01:48:37
Speaker
Okay, very interesting. Let me end with this; I have taken up more time than we planned, but what's your advice to young people listening to this podcast? My first advice, and I do mentor a lot of entrepreneurs here in business schools and around Boston: no idea is a bad idea. If you are convinced about the idea,
01:49:07
Speaker
you can go and convince the world. Typically what happens is that everybody wants to become an entrepreneur: they get an idea tonight, they get excited, and tomorrow morning they wake up feeling demotivated about it. So take the idea and just run with it. It will evolve, like the example I gave you of the Findability platform evolving into the business process copilot.
01:49:27
Speaker
I transformed the company every year, and it happens; the universe finally conspires and helps you constantly evolve. So no idea is a bad idea. The second is: no U-turns. You can branch off, you can take detours, but keep moving ahead. The most important thing in entrepreneurship is don't take a U-turn.
01:49:50
Speaker
I have seen many people develop cold feet, get butterflies in their stomach, give up, and start giving 10 different explanations for why the product is a bad idea or why the market is bad. Just keep marching. And last but not least, build a sustainable business. I think with this modern software Americanization of the globe, we are focusing more on creating unicorns.
01:50:16
Speaker
Your question too: why are you not taking investment, why are you not scaling up? Everybody wants to scale up. As humans we want to grow; there is no limit. If I get a billion dollars and you have $2 billion, I'm going to be unhappy and look for 3 billion, and then I see someone who has 4 billion. So with this rat race towards becoming a unicorn, it is now very important for the industry to go and see how many of those unicorns have died prematurely.
01:50:43
Speaker
They have created scandals, they have created scams, they have created bad outcomes for society. So my request to entrepreneurs is: if you can create a sustainable business, that is good. I feel very proud that I'm employing 200 people. For a person of my background, who was struggling to have his own income, if I can do that for 200, I am satisfied.
01:51:07
Speaker
Can I do 2,000? Can I do 20,000? Can I do 200,000? Possibly, yes, but not everybody can do it; not every company can become a billion-dollar company. And actually, coming from our Indian background, the Tatas and Birlas and the Ruias never focused on creating a billion-dollar company, right?
01:51:27
Speaker
They became billion-dollar companies through their sustainable business models and their impact on society: creating employment, helping the nation. So my last request to all entrepreneurs is to focus on a sustainable business; the growth, the money, the valuation will happen automatically. Okay. I want to push back a little bit here.
01:51:49
Speaker
You know, a lot of founders have shared with me that you are only limited by your ambition. And for the Tatas and Birlas, there was no separate concept of fundraising at that time; either you did an IPO or you took a bank loan. But I think in today's market,
01:52:09
Speaker
one of the quickest ways to grow is to fundraise, and it ensures long-term sustainability, because otherwise your competitors who raise more funds than you might be more aggressive in sales and get more business. And I think this is a land-grab phase in the enterprise AI sector. Customers that you acquire will stay your customers for a long time: if somebody is on Salesforce, they stay on Salesforce, it's unusual for someone to change their CRM. It would be similar in the enterprise AI space. So
01:52:49
Speaker
shouldn't you be more ambitious and take advantage of this unique land-grab opportunity which is happening right now? So actually, I would like to highlight what I said. I didn't say don't be ambitious. What I said is build a sustainable business. That doesn't mean you should not take investment. Even if you take investment,
01:53:12
Speaker
use it wisely to build a sustainable business. I'm in no way objecting to taking investment. I'm in no way objecting to dreaming of a billion. I'm not objecting to growth by taking investment. But if by taking investment you are going to crash the company in two years, please don't do that. I'm saying build a sustainable business.
01:53:32
Speaker
Yes, land grab, do it, very positively, no problem. Absolutely, if you have the capacity, if you have that mandate, if you have the team to do it, that should absolutely be done. I'm in no way objecting to any of your thoughts; I'm just putting in one condition: build sustainable businesses. Actually, how many examples do you have from the last year alone, and we will not name the companies in India, that have taken millions or billions of dollars of investment, and where do they stand today?
01:54:03
Speaker
Okay. That's my problem. Yeah. And of course there are successes also; there are companies who have taken investment and created product and market. Let's follow them. Therefore my advice is: build a sustainable business. How you build it is your choice; it's your personality, it's your karma, it's your ecosystem, how you are going to do it. But focus on sustainability. Right, right. Amazing. Thank you so much for your time, Anand. Thank you very much.
01:54:33
Speaker
And that brings us to the end of this conversation. I want to ask you for a favor now. Did you like listening to this show? I'd love to hear your feedback about it. Do you have your own startup ideas? I'd love to hear them. Do you have questions for any of the guests that you heard on this show? I'd love to get your questions and pass them on to the guests. Write to me at adatthepodium.in. That's adatthepodium.in.