
Making Robots Human | Nikhil Ramaswamy and Gokul NA @ CynLR

E84 · Founder Thesis

In this era of start-ups in India, invention, innovation, and disruption have become a reality.

Founder Thesis presents you the journey of one such duo of techies who are on a mission to enable robots with human-like vision, which could unlock endless possibilities.

In a candid conversation with Akshay Datt, Nikhil Ramaswamy and Gokul NA, founders of CynLR, take us through their journeys. Nikhil was an urban boy inspired by tech revolutions, while Gokul saw automation challenges on a farm and studied the impact of code on real life. After gaining experience at National Instruments, they started CynLR in 2019 and have since solved 70+ industrial machine vision problems with a 100% success rate, putting the company on an exponential growth trajectory.

Tune in to this episode to hear Nikhil and Gokul explain cybernetics, what a future with robots will look like, and how it will alter the business landscape.

What you must not miss!

  • The challenges to improving Machine Vision.
  • The Paradigm Shift in Robot systems Design.
  • Scalability & Expansion of the CynLR Platform.

Transcript

Introduction to Founder Thesis Podcast

00:00:03
Speaker
Hi, I'm Akshay. Hi, this is Saurabh, and you are listening to the Founder Thesis podcast. We meet some of the most celebrated founders in the country, and we want to learn how to build a unicorn.
00:00:35
Speaker
The Terminator was one of those sci-fi movies that really influenced a whole generation of tech nerds and innovators.
00:00:41
Speaker
While we all agree that a Terminator-like robot is nowhere on the horizon, some of the capabilities of the Terminator are now available in commercial robots.

Machine Vision in Robotics

00:00:51
Speaker
And one of the biggest game-changing technologies in the field of robotics is the science of machine vision. Traditionally, replacing humans with robots in factories has always been an expensive and lengthy process. In fact, Tesla blamed the robotics in their factory for production delays.
00:01:08
Speaker
But thanks to the work done by Gokul and Nikhil, the co-founders of CynLR, this may no longer be true. They're building the next generation of factory robots that would change the manufacturing landscape in ways that we can't even imagine today. Listen to this fascinating, mind-boggling conversation between the founders of CynLR and Akshay Datt about a future in which robots with intelligent vision fundamentally alter the business landscape.

Career Shift to Solve Automation Challenges

00:01:35
Speaker
So even without having a very solid product goal or a business goal in mind, Gokul and I took the step of leaving our jobs at NI in 2015. While at NI, we had already identified that there were a lot of customers wanting to solve their automation challenges in manufacturing plants using machine vision, but failing terribly. If I had ten problems, I would succeed in three and fail in seven. So we knew that there were seven
00:02:01
Speaker
customer use cases that were unsolved. You quit your jobs to be consultants? The goal of doing consultancy was essentially to discover what could be a use case for this insight which Gokul had. And how did you get these consulting assignments, and were they well-paying assignments?
00:02:23
Speaker
Yeah, so when we were at NI, we recognized that there were a lot of customer use cases where the customer wanted a solution, but the technology available in machine vision was limited and could only more or less absorb it.
00:02:39
Speaker
So we had the opportunity to characterize some of these previously unsolved problems from customers. And the customers already trusted you because you were interacting with them as part of NI? We also made our own efforts, so we drew on a few of our existing relationships. I found a project at G,
00:02:58
Speaker
a company that I was working with. Gokul was working with one of the largest grain milling equipment manufacturers in India, and he was able to convert that into a consulting project as well. We got some leads from other manufacturing customers. All of these helped us identify the use cases where machine vision can play a role.

Consulting Success and Vision for Universal Hardware

00:03:21
Speaker
Machine vision can play a role in physical interaction — in manipulating objects in some way.
00:03:27
Speaker
And then we let ourselves discover what kinds of use cases would get paid for, what kinds of use cases customers wanted solved really, really badly and really immediately. And over the next three years, we ended up developing and delivering over 30 previously unsolved custom machine vision and robotics projects.
00:03:47
Speaker
And it turned out that a majority of these use cases came from discrete manufacturing automation. Wherever a certain set of objects is being put together and you need to automate that task, it is extremely challenging for manufacturers to do that today. And that's where we got most of our use cases. So what does the word discrete mean here?
00:04:11
Speaker
Discrete manufacturing is any manufacturing process where individual components have to be handled and put together. Automotive manufacturing, for instance, is an example of discrete manufacturing. You have about 10,000 discrete parts that go into a car. These are metallic parts or plastic parts or glass parts, and each of these has to be picked, oriented, and placed.
00:04:33
Speaker
Process manufacturing is something like petrochemicals and cement manufacturing, where you don't have these discretized units that you need to handle. Most of that is already mechanically automated because you don't have to individually handle any components; you can put them in containers and the containers can be standardized. Whereas when you come to discrete manufacturing, the car models keep changing and the parts that you're handling keep changing.
00:04:56
Speaker
In smartphone manufacturing, for instance, every twelve months, every eight months, you get a new set of parts to be manufactured and assembled. In all of these use cases it was extremely challenging to deploy robots as they exist today, because robots are inherently blind: they can't see where these objects are, and getting that robot to automate something that is otherwise fairly simple for a human being is super challenging.
00:05:19
Speaker
And we found these use cases through the projects that we did. While doing these projects, we were largely relying on off-the-shelf hardware, but putting that off-the-shelf hardware together on a case-by-case basis using our own fundamental approaches, which differed from how the machine vision or computer vision world otherwise dealt with it. And we were able to achieve a 100% success rate in every attempt, every project that we took up. That's when we were able to validate that our approaches were strong.
00:05:48
Speaker
But we were still doing this in a consulting manner, in a customized manner, relying on off-the-shelf hardware. Because the off-the-shelf hardware was built for identification, it did not have abilities like convergence and autofocus that we really wanted. Through this process, we were also able to identify and articulate the hardware product that we had to build to universalize or generalize this altogether.
00:06:14
Speaker
And that definition came to shape by early 2019 for us. So by early 2019, we knew the exact product that we had to make. We knew the exact stack that we had to build both on hardware as well as software. We also identified customer use cases, an industry that was itching to have this problem solved.
00:06:37
Speaker
which is largely discrete manufacturing; and within discrete manufacturing in India, automotive was the largest buyer and the largest player from which to expect these innovations. With all of that, we were able to go ahead. And anyway, in hardware design there is a gestation period, there is an entry barrier, and there is a capital requirement to put together any hardware of industrial quality.
00:07:03
Speaker
So we identified that we had to fund ourselves with some capital. So, during these pre-2019 years when you were still doing consulting gigs, was it a two-man show or did you build out a team as well, and what kind of revenues were you earning annually in those years?
00:07:22
Speaker
Alright, so we largely did it as consultants, so it was Gokul's and my time, except for one of the projects, which was more of a product development effort that we had to do for a grain sorting application. We could probably talk a little about the technical aspects of that application.
00:07:40
Speaker
We got into a deal with a manufacturer saying that they would hire a couple of resources, we would hire a couple of resources, and then we would go about doing the project engineering. But otherwise, it was a two-man job, and we earned enough revenue to live the standard of living that we wanted and go about discovering the problem and describing the landscape. So we were generating upwards of about 20-25 lakhs of consulting revenue between Gokul and me
00:08:13
Speaker
for the year. But our purpose was different: we spent about 30% of our time consulting and 70% of our time researching and developing the product. So we were very choosy, we were selective, we didn't
00:08:28
Speaker
We didn't really want to build a services business. We never wanted to do that. Our goal was to establish that we would be able to solve this problem universally, and to work out what the product definition and the market outreach would be. So you can consider it as a sabbatical outside of NI.
00:08:46
Speaker
We wanted to do the research and also keep the wheels rolling. Like a product-market-fit time, in a way. Right, discovery. Basically our market discovery and PhD time. Yeah, okay.

Developing a Hardware Platform for Manufacturing

00:09:00
Speaker
PhD, okay. So then, when you had clarity on what you wanted to build — like, you had clarity that this is the hardware that we need —
00:09:14
Speaker
was it specifically for this use case — that in factories we need this kind of hardware — or was it something generic that you would customize as per the use case? How detailed was your vision of what you wanted to build? From the start, one thing we were very, very clear about in our articulation from the beginning is that you need a vision that enables the hand to be able to do things.
00:09:41
Speaker
And when we moved on from there, one thing we were very clear about is that for any machine vision technology, we need to look at who is being profitable and who is buying a lot of it, right? The robotic arm is not something we were making from scratch; we use it as it is and build only the enabling layer on top of it. So if I have to sell, I need to look at which customers were already buying a lot of robotic arms — that is the most convenient. So we had that clarity. We always begin from there. We also wanted to check whether
00:10:11
Speaker
we are biased by whatever the past data is. What if there are other customers who are willing to pay? So we also tried every other customer segment — electronics and others — in different ways. Mostly it was the automotive industry which responded back with their problems, because they are the ones who are already trying automation. So that market outreach itself revealed it to us, rather than us going after something.
00:10:37
Speaker
That is where the traction was highest for us, right? And considering that we are not sitting on an already well-established platform, moving on also necessarily meant picking up the technology, right? And we had a very clear distinction between a technology platform and a product, because if you are selling at a platform level, you always need solution engineering on top of it, right?
00:11:04
Speaker
If it's a product, then some additional support will be enough. A solution might be executed plug-and-play, but most industrial solutions never act like a product — they never behave like a laptop; you need to integrate things together. So the industrial world already had a network of integrators, and a structured process for how to put these things together and make them work. We wanted to leverage that.
00:11:33
Speaker
That's already available. And that again happened to be automotive, which was more conducive for us to go this way. They're also more forgiving about how nascent the tech is, and about a small startup.
00:11:47
Speaker
At the stage we were at, we first put in our own money and ran on that for the first three to four years while we were doing all this; then we started circulating the money from the customers we got through consultancy. In the process there were a lot of opportunities that came along, and we said no to them — specifically, there were a lot of tool-based integration solutions also coming up around the tools we were using, right?
00:12:13
Speaker
Those we ignored; a lot of control- and motion-based systems we also ignored, and we never took mechanical work as part of our integration. We always let the customer take care of that; we worked only on the vision and the algorithm, and specifically only those applications, right? The idea was: I'm not innovating on the business side, only on the tech side, and once the tech is up, then you can go about innovating on the business side. So again, when we looked at it through that lens,
00:12:42
Speaker
through that filter, automotive turned out to be the perfect fit. So that's how we chose our customers. And the idea was to build a platform which would be customized for each customer? And you would work with the existing ecosystem of people who do that customization. Who is this existing ecosystem of people who do customization?
00:13:09
Speaker
Oh, that is a huge ecosystem of... I mean, I can't just name one specific player, because... What type of... Like, these are consultants, basically. They do the whole mechanical fabrication, they do the whole integration of those systems. So you have global-level large giants like Atinium, and then so many other very similar companies. So these are companies selling robotic arms?
00:13:36
Speaker
No, no — robotic arms are sold by companies like ABB, Fanuc, KUKA and all these guys. In fact, that brings up another point: if you buy a robotic arm today, you only get up to the wrist. You won't get the hand.
00:13:52
Speaker
You don't get the gripper. The gripper is an entirely independent industry — Schunk and all the others, $500 million, $200 million, $100 million, $10 million, $1 million companies. There are a lot of those kinds of organizations which sell just grippers, and Schunk is the largest. And even then they don't give you the fingers. You customize the fingers, you customize the gripper,
00:14:12
Speaker
on top of all of it, right? You only get the grasping technology that comes along with it, right? So that's how far the tech is today. You don't have tech that is sophisticated enough, that's complete enough, right? And you can't do anything without the gripper to handle the object, right?
00:14:37
Speaker
So it's the system integrators who start from there onwards. So system integrators are like small hardware tech kind of firms who will procure the arm from one place, the gripper from another place, and the software from a third place, and the cameras from a fourth place, and integrate it all based on what the client needs.
00:15:00
Speaker
And for the vision cameras, again, are there large players who provide those, or are these off-the-shelf cameras? Yeah — nothing of the size of, say, Sony; multi-billion-dollar companies don't exist for these cameras. But there are several-hundred-million-dollar companies like Basler, and from there down to 10-20 million dollar companies as well. Mostly they have all been brought under one acquisition ring.
00:15:30
Speaker
Ten years back there were a lot more companies; nowadays there are only some ten companies at the most that are standard. And in machine vision you typically don't get the camera along with the lens — lenses and optics are a separate world of companies.
00:15:44
Speaker
Oh, okay. So there are actually a lot of different vendors whose products need to be purchased individually. So you need someone technical who understands what the client needs and then creates the specs for each of these products and then integrates it all together. Okay. And what would be your role in this? You would provide the camera tech and the software.
00:16:13
Speaker
So the whole portion of the technical person was to understand what lighting has to be there, what the math behind the lens is, how you choose that lens. Because if you open the catalogue of a lens company like Opto Engineering or Edmund Optics — these are all big giants; Edmund Optics is a giant in optics for machine vision or any custom work —
00:16:35
Speaker
there will be almost a thousand different lenses in there. Right, right, right. How do you know which lens to pick, where to put it, and in what combination? Based on the algorithm you'll have to change the lens; based on the lens you'll have to change the algorithm.
00:16:46
Speaker
That cycle keeps going, and often customers cannot spend on that, cannot keep changing those, because that camera might actually cost more than a lakh of rupees, and the whole system might cost anywhere between six to seven lakhs. You can't just keep experimenting on that, right? So you need an expert who understands it from a physics point of view, sets all of this up together, and is able to give that feedback.
00:17:09
Speaker
So that's the portion we were doing, along with the whole software architecture and also the algorithm, right?

Standardizing Vision and Camera Kits

00:17:17
Speaker
So we would design the system for them, and the customer would purchase the hardware. If we had done that part too, our revenue might have looked much bigger, but we didn't do it that way, because we didn't... This was during your consulting days. You were actually like a system integrator also?
00:17:36
Speaker
Correct. We were a system integrator from the vision point of view, whereas for most of the rest of the system integration there would usually be another company, right? So we had partners like Sasyata, we had partners like Tiltrix, and some organizations had their own capacity to integrate like that — one of the customers
00:17:56
Speaker
had their own in-house capacity to integrate and all. But what did you want to create? Like, you know, when you started CynLR, what did you want to create here? Yeah, yeah, yeah — August 2019. So out of these experiments, we very clearly realized that our approach works, because the method by which I said this lens has to be placed, the method by which this camera has to be placed,
00:18:19
Speaker
the method by which you have to turn it this way or that way to get the data — all of this was essentially fixed, right? And how did we know that? We were never adjusting the system on site; we just designed everything on paper, put it there, and it just worked, right? The moment that keeps happening, you become very confident that you have understood the way you are thinking about the problem. And then we also understood the fundamental layers of tech that needed to be built, right?
00:18:49
Speaker
But we didn't have a universal hardware that was capable of doing all these adjustments by itself and figuring out — the way I was able to figure out — that this coin needs to be placed this way and then looked at from this angle, right? I was able to understand that and get it to that point, but this intelligence needed to be fed into a device, and the device must be capable of doing all of this by itself. That ability was not there; there was no hardware available at that point, right? So we needed to build a common layer of tech first,
00:19:16
Speaker
from a hardware angle, in synergy with whatever software approach we might write. And then there are the layers of information that our brain uses to understand all of this — in some places depth, in some places the reflection of light, in some places just colour — all these things that it is able to combine and build a model around. The ability of a system to build that model by itself is also something we had to build, right? That's what we wanted to do; only then can you productize this business.
00:19:47
Speaker
So you wanted to create like a standard kit of lens and camera and maybe a light source also in that?
00:20:03
Speaker
Okay, so lens, camera and software — this whole thing in a standardized kit, and then that kit can just get attached to the robotic arms which system integrators are creating for these companies. Okay. And with the product that we wanted to do, we didn't want to just keep it as a camera that is attachable to any robotic arm.
00:20:27
Speaker
When we make a final solution at the end of the day — there's a nice statistic that gets brought out every time, right? Robotics is a $48 billion industry today, of which just 30%, amounting to almost $16 billion, is for the robotic arms themselves, right? The rest, 70% of the money, is only for customization, right? And it's not organized.
00:20:52
Speaker
Every time it's a new system, every time it's a new system — replications don't even happen as a replicable system, right? And they go through six months, seven months of stabilizing the hardware design and everything, right? So why is all this customization being done? Because today a robotic arm goes to one location — you just teach it: go to this location,
00:21:12
Speaker
then you have a gripper which you have custom-built such that, if the object is already in this orientation, it simply attaches to it, right? Go to another position and then release it. When you release it, it just falls.
00:21:24
Speaker
The way it falls is the expected orientation that you wanted. So this is the entire contraption that they are building so that it will not fail, right? That design takes an enormous amount of time initially, a heavy amount of time for all of these guys, right? And this customization touches the whole system that you build: how do you bring the object to this position in the expected orientation, so that your gripper is not touching this portion but touching that portion, right?
00:21:51
Speaker
So that contraption they build — the feeding and presentation system — is the majority of the cost today, right? That's the 70% of the cost, and it's entirely customized. And the second problem is that the robotic arm is 20 micrometre precise in most cases. 20 micrometre means: you teach it to come to this position and then go to that position, and for the next five years it will keep repeating exactly those positions, right? Some calibration in between here and there, but it will keep doing that, which means
00:22:21
Speaker
if I'm making a gripper for this, and my object is slightly away, it will fail on the object. Yeah, it will fail. So you also have to present the object at that 0.1 mm of accuracy. It's easy to just present it; presenting it at 1 mm accuracy is much easier, at 0.1 mm it becomes harder, and 50-60 micrometre is extremely hard. Only then is the system able to succeed.
00:22:49
Speaker
This, again, is where that 70% of the cost is going, and this is what we could simply standardize. A human being, on the other hand, will just look at it. He doesn't need his hand to be that precise; he'll come here and he'll adjust, he'll adjust his hand. Whereas because the robot is blind, when you want it to place something you're expecting it to be 20 micrometre precise. But otherwise, if you have a shape into which it is supposed to fit, you'll just try placing it ten times, figure it out, and then put it inside.
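To make that presentation-accuracy point concrete, here is a minimal toy simulation — purely an editorial illustration, not CynLR's software; the clearance values, scatter range and function names are all assumptions — comparing a blind, replay-only pick with a vision-guided pick that measures where the part actually is:

```python
# Toy illustration (not CynLR's software): a blind, replay-only arm needs the part
# presented within its gripper clearance, while a vision-guided arm that measures the
# part's actual position does not. All numbers and names here are invented.
import random

GRIPPER_CLEARANCE_MM = 0.1          # a blind grasp fails if the part is further off than this
VISION_ERROR_MM = 0.02              # residual measurement error of a (hypothetical) vision system

def blind_pick(part_offset_mm: float) -> bool:
    """Replays a taught position; succeeds only if the fixture presented the part precisely."""
    return abs(part_offset_mm) <= GRIPPER_CLEARANCE_MM

def vision_guided_pick(part_offset_mm: float) -> bool:
    """Measures the part's pose and corrects the approach; only measurement error remains."""
    measured = part_offset_mm + random.uniform(-VISION_ERROR_MM, VISION_ERROR_MM)
    residual = abs(part_offset_mm - measured)
    return residual <= GRIPPER_CLEARANCE_MM

# Parts arriving with a loose +/-1 mm presentation scatter.
offsets = [random.uniform(-1.0, 1.0) for _ in range(10_000)]
print(f"blind:  {sum(map(blind_pick, offsets)) / len(offsets):.0%}")          # ~10%
print(f"vision: {sum(map(vision_guided_pick, offsets)) / len(offsets):.0%}")  # ~100%
```

With the same loosely presented parts, the blind pick succeeds only when the fixture happens to land the part inside the gripper's clearance — which is exactly what drives the 70% customization cost described above — while the vision-guided pick makes fixture precision largely irrelevant.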
00:23:18
Speaker
You'll adjust yourself and then put it in, and the more you train yourself, the more you become one-shot at putting it in right. That ability — for the system to be more dynamic when placing — will remove all this customized structuring that is needed, right? So when I'm able to put this camera on and make the system learn to move the camera anywhere — I need the robot anyway, because I need to do all this — the same system can also double up as a pick-and-place system, and then you are able to give a packaged solution to a customer, and the customer can use it like a laptop.
00:23:49
Speaker
Right — for a variety of objects we had a variety of different devices to handle them, just like the variety of data in the 1970s: you had a typewriter, you had your calculator, you had your movie theatre and TV, and everything has become just one laptop today. Handling all this variety of data became possible because of the standardization of circuits into a CPU and the logic built on it.
00:24:13
Speaker
That is our next big vision; that's what we want to do here, right? When I have my robotic arm as a standardized unit across different objects, different orientations and different tasks, people can start replicating more quickly, and it makes the line more universal, so that you could repurpose a line for something else
00:24:32
Speaker
rather than having to change the whole line every time even a small dimension changes, as happens today. In a way, like Microsoft giving Windows to the hardware companies. So you want to be... Basically, you're looking to build the operating system for manufacturing automation.
00:24:56
Speaker
Exactly — we all have this one vision map, and that's the universal factory: an operating system for a factory. If you want to change something, you don't go about changing the hardware or re-circuiting the whole factory, because it's just a circuit, just moving objects here and there between processing units, right? It can still be reprogrammed to change the whole factory. So, did you first raise funds? Tell me about that journey.
00:25:24
Speaker
All right, so we basically launched CynLR in August 2019, along with... And what about — I mean, why this name, CynLR? It's not a very easy name. CynLR is a short form of Cybernetics Laboratories. Cybernetics is something that we, as a philosophy and as a science, subscribe to in everything that we do on the technology side.
00:25:50
Speaker
So we launched CynLR in August 2019 along with a seed round of funding. It was about 750,000 US dollars, raised from some deep-tech VCs based out of India. How did you navigate that? I mean, you had no idea about fundraising and all of that stuff.
00:26:10
Speaker
We did have an idea. I mean, at least my friends from Pilani have raised a lot more money than I can imagine raising in the near future. Very close friends.
00:26:25
Speaker
within one, two, three years. I also came from the entrepreneurial network. So in fact, our problem was not access to the VC network. Our problem largely rested on the fact that our problem and our company did not fit within the typical VC landscape, right? Very few people really understand the tech models. You have no app, right, so you have to prepare an app. Your thesis doesn't fit our problem, basically.
00:26:55
Speaker
therefore although we did have access to
00:26:59
Speaker
most of the top-tier VCs, there was a bit of a struggle with respect to articulating the value proposition and getting recognition. If we had attempted to raise money when we left our jobs at NI, it would have been extremely difficult. But thankfully, between 2015 and 2019, when we raised, there was also a significant amount of churn happening within the VC ecosystem — not at the top tier, but at least at the seed stage —
00:27:27
Speaker
where there was recognition of the opportunity in solving deep-tech problems — very fundamental problems that will have IP value and the value of building business use cases in the future. And then the IP ends up becoming your significant moat. These are going to be
00:27:46
Speaker
companies where it's gonna be significantly difficult for somebody to come and compete with you. If you have done ten years of very fundamental personal research on this, you can't really reverse engineer that in three months. You'll have to go through a lot of experiences that we have gone through.
00:28:04
Speaker
So there were a few players in the market at the seed stage who had started recognizing this and looking for founders like that. And we were also, of course, looking for them. So eventually we found each other. It took much longer than it would typically take to close a round of funding. We began mapping that ecosystem. We also needed very strong deep-tech
00:28:32
Speaker
people who understood that kind of time investment. Right, right. You needed a patient investor. We needed more than a patient investor — we also needed someone who understands this and has made it a very specific thesis. If you look at the history of the investments of our current set of investors, especially the one who led, they are all organizations like ours.
00:28:56
Speaker
So they had that very strong conviction to put in that money early — Speciale, growX, all of these guys. And there are not many competitors who are doing exactly this, and that's another issue. In India this is a very limited space: you have a lot of ground robot systems, but we don't have these robotic-arm-based organizations. Either it will be robotic arm improvisation — there aren't people who are solving this.
00:29:31
Speaker
So that's also another problem: VCs are not aware of this space. Half the time we spend more on educating than on pitching, and you don't have a platform for that either.
00:29:42
Speaker
So, this 750,000 that you raised — what did you build with that? Sure. Like I mentioned, August 2019 is when we got that cash. The first goal, of course, was to crack the hardware deficiencies on the vision side and then build out the software stack, along with the robotic arm adaptability,
00:30:06
Speaker
and then the goal was to launch something very dynamic, something visually intelligent — a robot that can learn how to manipulate a variety of objects. Currently you are pre-product, like you're still building this out? Almost completed. What happened historically is that August 2019 is when we raised, then the pandemic struck and we were out of our laboratories for a while, but we were quick on our feet

Customer Engagement and Product Launch Preparation

00:30:33
Speaker
too. Where we are is, we have built a platform,
00:30:36
Speaker
right? With the platform, we have already started engaging with customers to collect solution use cases, so that we know how to productize this. It's one thing to have this technology that understands motion and is able to work around it. But how is an engineer going to train it?
00:30:51
Speaker
How is the engineer going to use this and integrate it into a solution? Those portions are also part of our next fundraise — we are attempting to make them smoother. But we already have this platform, and we are testing it out with some of the customer applications and use cases that we actually got, on the hardware that is there — I think right behind me those components are lying there. So that's where we are. And we want to bring it to a preliminary shape.
00:31:21
Speaker
Not all applications can be handled with the current form factor that we have, but there is a significant portion — close to a billion dollars of opportunity — just for the current form of what is coming. So that's something we are trying to launch, in association with one of our customers, at IMTEX in January 2022. Sorry, at IMTEX —
00:31:46
Speaker
it's Asia's largest machine and machining tool expo, which happens in Bangalore. If you love such stories of founders, then we have tons of great stories from entrepreneurs who have built billion-dollar businesses. Just search for the Founder Thesis podcast on any audio streaming app like Spotify, Gaana or Apple Podcasts and subscribe to the show.
00:32:15
Speaker
And so, is what you're launching a plug-and-play product, or will it need to be further customized — or will the customization be like, say, ordering a computer from Dell? On the software side, on the software side. On the training, on the software side, there will be something, because the hardware side is complete. What we have achieved with the seed round is that we have stabilized the hardware platform.
00:32:38
Speaker
So you don't have to go about customizing hardware around this device. This can handle like a range of use cases because of the intelligence around it. So it can adjust itself to different kinds of use cases. The definition of the product that we're launching in January is a hardware platform that will scale across use cases, be able to manipulate objects.
00:33:01
Speaker
even from randomized bins — you can present objects in random orientations. However, the system has to be trained for the objects it is handling. That training is a software training process,
00:33:12
Speaker
roughly a two-week training process that we will do in our facility with our robotic arm, and then we ship a software model — an intelligence model — to the robotic arm. In the next phase, what we think this will lead to is learning from these deployments: these are the hiccups, these are the places where you could make a much better UI and make integration easier, and these are the places where the robotic arm is limiting. So this is a learning experience for us to do that engineering, so that we can formulate it and refine it much further.
00:33:42
Speaker
So our next phase of development will be split into two portions: one, standardizing from the software point of view; and the other, coming up with further, deeper technology layers, which will bring more ability for the robotic arms to handle more. Talk to me about both of these. So, the first one — standardizing the software — means that
00:34:05
Speaker
two weeks of training will come down to maybe a couple of hours of training. That's what would happen. Our eventual aim is that in four to five years' time we will reach one day of training, right? We split this into two kinds of training: one is object model training, the other is task training.
00:34:21
Speaker
Task training is fairly quickly achievable, but object model training takes a lot more time today — at least two weeks to a month, varying from object to object and task to task, right? That is what we want to standardize and then shrink down. How do you train it? Like, you're going to train for two weeks — what will you do in those two weeks?
00:34:42
Speaker
There are different procedures, just like for a baby, right? Except we are going to accelerate it. A baby is learning in a very sophisticated environment. One advantage these robots have is that they are in a limited environment in which they have to learn. They are not expected to look anywhere else, not expected to pick another part, and nobody comes in and stops them — none of those things are there. So, how do I first differentiate the object without touching it? How do I know, given all the shades and colours and forms that you can obviously have?
00:35:09
Speaker
How do I know where to go and touch it? Where should I grab it so that it will be able to grab? Am I looking at the pattern of the wood in the table as an object, or is it the whole object itself? Am I picking up the contour of the object, the edge of the object? All of those things are tested.
00:35:24
Speaker
It will start from there. Surely there is some procedure for it to get ahead. And this training happens how? Is this purely you sitting at a laptop and feeding it images, or is it actual, physical? No, no, no.
00:35:41
Speaker
In fact, it looks at it, it makes a guess, and then it goes and touches it and validates: oh yes, the contour that I saw is rigid, this contour that I saw is not. So it learns a lot of those things. It actually feels like watching a baby learn, then — you actually see that arm touching, learning, getting more confident each time. The speed at which it acts keeps getting shorter and the mistakes fewer: oh, it looks like this on the other side; on this side it looked like that.
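As a purely illustrative sketch of the "look, guess, touch, validate" loop being described — not CynLR's code; the scene, names and outcomes below are invented so the loop can run end to end as a toy simulation:

```python
# Toy sketch of a look-guess-touch-validate training loop (hypothetical, for illustration).
import random

# Simulated scene: candidate contours the camera might latch onto.
# Only some belong to the actual rigid part; the rest are table texture.
SCENE = {
    "part_outer_edge": {"rigid": True,  "graspable": True},
    "part_inner_hole": {"rigid": True,  "graspable": False},
    "wood_grain_1":    {"rigid": False, "graspable": False},
    "wood_grain_2":    {"rigid": False, "graspable": False},
}

def touch_and_probe(contour_name: str) -> dict:
    """Stand-in for the arm going out, touching, and feeling whether the guess holds."""
    truth = SCENE[contour_name]
    return {"is_rigid": truth["rigid"], "grasp_held": truth["graspable"]}

def train(trials: int = 50) -> dict:
    model = {"confirmed": set(), "rejected": set()}
    for _ in range(trials):
        # Look: prefer contours not yet settled, otherwise re-verify a known one.
        unknown = [c for c in SCENE if c not in model["confirmed"] | model["rejected"]]
        guess = random.choice(unknown or list(SCENE))
        # Touch and validate the guess instead of trusting vision alone.
        result = touch_and_probe(guess)
        bucket = "confirmed" if result["is_rigid"] and result["grasp_held"] else "rejected"
        model[bucket].add(guess)
    return model

print(train())
# e.g. {'confirmed': {'part_outer_edge'}, 'rejected': {'part_inner_hole', 'wood_grain_1', 'wood_grain_2'}}
```

The point of the sketch is only the structure of the loop: hypotheses from vision, validated by physical contact, accumulating into an object model — the fewer wrong guesses remain, the faster each pass becomes.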
00:36:10
Speaker
All of that learning will be there. Right now we are doing motion sensing and those kinds of things, right? The way it associates force, the way it associates all the information that is there about an object — we say there are seven different parameters of the object that it is supposed to understand and then associate together. The method of association: how is it constructing depth today? How is it able to utilize out-of-focus? How is it able to utilize convergence? How deep is it actually taking its understanding?
00:36:38
Speaker
Understanding colour, understanding texture — so many of those things, right? Those are further layers where the system has to go deeper in its learning. Right now there are some small tricks that we have done here and there with autofocus, and those could become even more machine-learned, so that the system actually becomes more capable on its own. Those are the things we will be concentrating on, so that this system is able to handle rigid objects.
00:37:05
Speaker
Yeah — it cannot handle a wire, it cannot handle a shirt; I can't use the system to stitch shirts, because what it handles today is entirely rigid, monolithic objects. The second thing, one step before that: that's from an object-handling point of view. There is a task-handling point of view also, where there are tasks you can do only with two hands, tasks you can do with one single hand, and tasks for which you might need multiple fingers of a single hand.
00:37:32
Speaker
So those combinations — that product combination you're supposed to come up with — are also part of the next phase. So the customer will have a menu of options and they can say, okay: one robot or two robots, is the robot mounted upside down or is it upright? And will this be put on a table — how does it get installed in a plant?
00:38:00
Speaker
That's a good question, because most people don't know how a robot gets installed, right? Just for public knowledge: every time a robot is installed, there is a lot of civil work that happens, right? What do I mean? They have to grout it, they have to put in the screws — because when I say the robot is 20 micrometre precise, if the robotic arm has shifted and moved even slightly,
00:38:25
Speaker
everything will go haywire. So they have to fix it down properly with cement, and the facility has to be prepared for a robotic arm — all that effort actually happens. Here, what we are doing is: because of the vision this system has, it can actually track its position by itself. It's actually on a stand which has wheels, so you can move it and keep it wherever you want to keep it.
00:38:48
Speaker
So you don't have any of this. Today, just as you can move a person from one station to another and make him work there, you can move this robot around in the same way. The only difference is that the customer must already have trained the objects for both stations — spent the two weeks for each of these objects, and another two weeks for all the manipulation and everything — so that he has a model with him. Then you can just load the model, move the robot to the new position, and ask it to start working.
00:39:15
Speaker
That's how we are actually deploying the robot. What happens if the object is slightly changed? Say things get smaller and smaller. Suppose it was manufacturing a phone and the chip became a little smaller, then would it need to be trained again? That's interesting.
00:39:36
Speaker
Often when the geometry changes — and this is another fallacy most AI and ML systems run into — it's the problem of zooming in and zooming out of an object, right? Without standardizing for it, it is much harder for most systems to learn whether the pattern it is looking at is the same pattern, right? It's the same effect as an object being brought closer, or an object being smaller at the same distance, right?
00:40:02
Speaker
This guy can replicate most of those designs as they are, right? But it is not a precision machine. It's not an arithmetic machine where an equation is written so precisely that it is precisely here and then touches precisely there, and if the part is 0.2 mm smaller, this fellow will simply fail because
00:40:18
Speaker
the object will not mate with the gripper. It's not like that. This guy doesn't even know exactly how big it is; it has an approximation and uses the same cues to calculate again and understand: oh, this is the depth, this is the size, this is how much I'm supposed to grab, and so on. It's dynamic enough, right? So as long as the shape doesn't change — the dimensions of the shape can change, but the shape doesn't change, the features don't change if I rotate it in different directions. The aspect ratio is the same, basically. Same — the aspect ratio is the same.
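One way to picture that tolerance of uniform size changes — a purely editorial sketch, not CynLR's algorithm; the outlines and helper names below are assumptions — is to compare outlines only after normalizing away absolute scale, so that aspect ratio and relative geometry are what matter:

```python
# Illustrative sketch: a shape-based model tolerates a part shrinking uniformly,
# but not a change of shape, if outlines are compared after scale normalization.

def normalize(points):
    """Scale a 2D outline so its longest bounding-box side is 1, anchored at the min corner."""
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    span = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    return [((x - min(xs)) / span, (y - min(ys)) / span) for x, y in points]

def shape_distance(a, b):
    """Mean point-to-point distance between two normalized outlines with matching vertex order."""
    na, nb = normalize(a), normalize(b)
    return sum(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 for (ax, ay), (bx, by) in zip(na, nb)) / len(na)

trained_part   = [(0, 0), (4, 0), (4, 2), (0, 2)]   # rectangle, aspect ratio 2:1
smaller_part   = [(0, 0), (2, 0), (2, 1), (0, 1)]   # same shape, half the size
stretched_part = [(0, 0), (4, 0), (4, 4), (0, 4)]   # different aspect ratio (square)

print(shape_distance(trained_part, smaller_part))    # ~0.0 -> the trained model still applies
print(shape_distance(trained_part, stretched_part))  # large -> a new model (or re-training) is needed
```

The analogy is loose, but it captures the distinction being made: dimension changes wash out under normalization; shape and aspect-ratio changes do not.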
00:40:48
Speaker
If it becomes too small for the fingers, then it's a finger problem — it triggers a customization in that case, right? There's a range within which the fingers can operate: I have a thickness and a width of finger within which, if it's manipulable by a human being, it is also manipulable by the robot. A human being can pick a regular gear with his hand, but for a watch gear he needs a tweezer to put it inside.
00:41:14
Speaker
Right? Even if both are identical-looking worm gears or planetary gears or whatever, right? In that case, this robot will also need a tool — but the tool doesn't have to be detached from the hand; it can stay attached, the finger itself can be translated into the tweezer, and it can do that. So that change might be needed to an extent, but whatever is handleable between a finger and fingers — all those shapes are also handleable by the robot.
00:41:40
Speaker
Okay. And is there a difference in the setting of the software while you're training and while it is on the floor? For example, when you're in the training mode, does it have more autonomy? Can it do more experiments and make more mistakes? And when it goes to the factory, then the autonomy is reduced because you don't want it to do too many experiments. Like, does something like that happen? Yeah, the whole training software is entirely different. The deployment is entirely different.
00:42:09
Speaker
Right, the sequence of training is largely irrelevant the moment it goes out there. The industry may not present the same environment as the sequence of training: I might have used a black table, I might have placed the part a certain way, or customized the stereo setup for it, or whatever I might have done. Those are not the same scenario in which you are actually picking the part or actually operating on the part, right? So the deployment is often very different — that's mostly from a task point of view.
00:42:37
Speaker
Whereas the training happening inside the lab is mostly from an object point of view — understanding the object, right? Once you have understood the object — let's say I have this mouse — am I picking this mouse to move it? Am I picking the mouse to wipe it? Or am I picking the mouse to store it somewhere, right? The logic will change, the motion will change, the environment will change for that, right? That's what the industry, the factory, provides. What I'm doing with it
00:43:07
Speaker
is different from how I'm learning it. Initially I would have clumsily grasped it, done it this way or that way, or dropped it a couple of times — all of that is very different. If the customer wants to train a new object on the system, it will start thinking about it again, but mostly deployment is a very different scenario: it will be able to handle the same object in very variable orientations.
00:43:36
Speaker
It will know how to go and precisely pick it. Its intent is not to touch and feel and learn that object; the intent is to precisely pick it. That's how it is deployed, so the algorithm also changes.
00:43:51
Speaker
Do you see yourself becoming more like a Salesforce or more like a Windows? So like Windows is like plug and play, pure plug and play, whereas Salesforce is something in which companies employ Salesforce developers who customize Salesforce for them. So similarly, like right now you are training and sending it
00:44:11
Speaker
to the companies. Do you think in the long term it would become a scenario where companies just buy the product from you, and then they have a whole army of developers or trainers, or whatever you want to call them, who actually do that last-mile customization, training, tweaking, etc.? Our intention is to move more strongly towards something like Android and the Play Store, or Apple and the App Store.
00:44:37
Speaker
In fact, that matches better, because there is also hardware that we sell. But then, you know, those are consumer products and Salesforce is an enterprise product. And maybe you end up being more like Salesforce because of the varying needs of enterprises, and the fact that enterprises may want to change stuff — they don't want to
00:44:58
Speaker
come to you for every small change. I'll give you one example. Every time I go to, let's say, one customer, I learn one part — say I have learned a bolt, which is a standard bolt across different industries and different customers. The moment I have learned this bolt,
00:45:17
Speaker
all the robots out there have learned that bolt, because I have that model stored in my object store, right? The next customer doesn't have to spend time learning that object; they can simply subscribe to that object store and pick it up, right? And why would he want to subscribe to the object store? You can either subscribe to the whole object store or subscribe to only the bolt model — either of those is a choice, right? That's why we want to keep that training with us; the object model is also our IP, in a way, right?
00:45:47
Speaker
And someone else can also buy these robots and just set up a university where you are simply training object models, right? Teaching the robots to learn objects — that could be one business for someone else. We don't know; at the beginning, as a platform, we can enable that, and they can keep uploading to the object store. And the advantage the customer has is that, similarly, another customer would have asked our system to learn another bolt and kept it there, right? So the customer is enabled with both the bolts.
00:46:16
Speaker
So if his design taste suddenly changes one day — I don't want to use this bolt, I want to use a slightly longer bolt or a thicker bolt — he doesn't have to go about redesigning everything. He just goes to the object store and finds: hey, I can pick this bolt — and he can just use it. So that subscription is something they can get into, right?
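As a rough sketch of how such an object-model store could work in principle — a hypothetical API invented for illustration, not CynLR's product — a robot would first check whether someone on the platform has already trained the part before paying the multi-week training cost:

```python
# Hypothetical object-model store (illustration only, not CynLR's API).
from typing import Optional

class ObjectStore:
    """Shared catalogue of trained object models, keyed by a part identifier."""
    def __init__(self):
        self._models = {}                     # part_id -> opaque trained model blob

    def publish(self, part_id: str, model: bytes) -> None:
        self._models[part_id] = model

    def fetch(self, part_id: str) -> Optional[bytes]:
        return self._models.get(part_id)

def prepare_robot_for_part(store: ObjectStore, part_id: str) -> bytes:
    """Reuse a subscribed model if it exists; otherwise fall back to training and publish it."""
    model = store.fetch(part_id)
    if model is not None:
        return model                          # e.g. a standard bolt someone already trained
    model = b"...weeks of look-touch-validate training..."   # placeholder for the slow path
    store.publish(part_id, model)             # future subscribers skip the training
    return model

store = ObjectStore()
prepare_robot_for_part(store, "bolt_m6x30")   # first customer: trains and publishes
prepare_robot_for_part(store, "bolt_m6x30")   # next customer: instant reuse via subscription
```

The design question the founders raise — subscribe to the whole store versus a single model, and who owns which models — sits on top of exactly this kind of lookup-before-train flow.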

Fundraising and Future Expansion Plans

00:46:34
Speaker
That's the bigger picture. On top of that, one more store also comes in — the task store — and that is the customer's IP, right? One simple example, an analogy that I will again take:
00:46:46
Speaker
we have KFC franchisees everywhere, McDonald's franchisees everywhere. The person running one is someone who was born and brought up here, who learned to cook the way we would cook, the way Indians would cook, in every case. But what is the KFC franchise? It's the method of preparation — the recipes and processes. Correct?
00:47:04
Speaker
So that's beautiful, right? Now KFC, without having to have its own brick and mortar — and without having to figure everything out on the ground — can simply provide the task model, in an encrypted way, to whoever has the infrastructure, with this robot as the common element.
00:47:23
Speaker
Any factory today, anyone who has a standard infrastructure like this, will be able to borrow the task model from them, and they get paid for their invention of the task model, right? And they're able to expand — manufacturing service companies can expand — without having to actively put in the investment from their side, right? Somebody else can agree to buy all these robots and set them up, and they can take the know-how from these guys, and these guys have an option to
00:47:47
Speaker
monetize the know-how. Today, monetization of know-how happens only with the cost of all the brick and mortar, all the infrastructure, being put together, right? Now someone can say: hey, I have all the capacity to run it really well, just give me the know-how, right? But that trust builds very slowly in the manufacturing industry. Why? Because it is so, so risky for me to bet all my infrastructure on that one particular know-how alone.
00:48:16
Speaker
Right? What if this doesn't work? What if it's not clicking with people? What do I do then? But now, since I have a standardized line, I can choose a different know-how to work with. I can run it for two months.
00:48:30
Speaker
If this is not working for two months, for six months, fine — this infrastructure is good for the next seven years, right? I can keep experimenting with multiple different ones. Yeah, so companies, instead of exporting products, will start exporting their task IPs. Yeah. And the material gets localized.
00:48:54
Speaker
Hmmm, right, right, yeah. So your cargo shipment will be very different. Fascinating. Okay, okay, cool. Wow, I mean I can already visualize that world which you are living in.
00:49:19
Speaker
How long before you think you'll open up a Westworld kind of theme park? If this doesn't go well, we'd rather go and write science fiction movies. So what is it priced at, what you're launching in January 2022?
00:49:49
Speaker
The integrated robot with vision and grasping — roughly about $80,000 would be the cost of a robot. Today, the acceptable price point of a robotic arm in the market is anywhere between $80,000 and $150,000. That variability exists because the customization is different each time.
00:50:09
Speaker
Even if the task looks very similar to a human being, depending on what the task is and what object you are handling, the complexity can just explode — and that's the higher end, where anything above $150,000 just becomes too pricey. So that's roughly the price point. We would largely be focusing on a smaller target market in India,
00:50:33
Speaker
where that price point is justified because of the repeatability and the quality that the robotic arm automation inherently handles. But in a market like the US or Japan or Germany, it would be like very, very aggressive pricing in terms of
00:50:51
Speaker
even compared with labour costs. So that's where we would most likely price the product. We already have a few OEM and end-customer engagements in the pipeline for next year in India, and we've initiated some partnership conversations in the US as well.
00:51:12
Speaker
Okay. And how many units do you think you'll sell next year? Like, what's your internal target? Next year will be conservative — next year will be our launch year, product-market fit and figuring things out. So probably about 20 systems is what we will target to sell. The subsequent year we should hit at least 100 systems that we want to deploy.
00:51:34
Speaker
By then, customers would start telling each other, and there would be things to show in terms of productivity improvement. The existing customers themselves offer a deep pipeline of opportunity. What we are addressing now is mostly the pilots that we are going to sell to them, and the follow-on is going to be the replication of those systems.
00:51:56
Speaker
So that's the sequence — 20, then 100, then 200; so actually 0 to 1, and then 1 to 10. We have already identified at least 200 physical locations where the robot can be used and a customer will receive value — we have been there, we've seen it, the customer will invest and buy — but we will take 10% of that and deploy it in the first year.
00:52:20
Speaker
And then subsequently it'll be an expansion mode. For expansion mode, of course, you need a certain amount of execution capability to do that at that scale — each is an eighty-thousand-dollar system; a certain amount of in-house assembly and manufacturing happens,
00:52:35
Speaker
then integration, then shipment, and then training. So you need to scale as an organization. So we'll be looking to scale to a 30-member team by early 2022, by April or so. That will allow us to execute those 20 projects, 20 deployments, and also build the pipeline for the subsequent year. We are dedicatedly targeting the US market as well, to accelerate our sales and deployment.
00:53:05
Speaker
Okay. And what is your current team size? We are an eight member team right now. And most of these would be people either who are coding or doing the training or stuff like that, or some people who do the hardware integration.
00:53:22
Speaker
So five of us are on tech, three of us are non-tech, that includes me. I qualify myself most of the time as somebody who is not doing the tech. Doing the engagement with these potential customers.
00:53:40
Speaker
Yeah, so until now I've been doing anything other than tech. Equally technical, but yeah. I've taken responsibility for anything other than tech. Recently we hired one of our colleagues from National Instruments who spent 10 years of his time at NI to lead our growth and business development from a sales point of view. So I'll kind of have him also
00:54:02
Speaker
support me on that, while I focus more on investments, operations, business execution, and org development. Gokul does the bulk of the org development, the technology, and the design and marketing around what we're building as well.
00:54:18
Speaker
Are you looking to raise a series A? We're doing a pre-series A. We should announce something soon enough. So we'll be doing a pre-series A to go to market and expand. And then a series A, which will be a larger round toward the end of next year. Awesome.
00:54:39
Speaker
You know, Elon Musk talked about the production hell at Tesla. Was that something which you guys could have solved for him? We could solve it for him now, to a certain extent.
00:54:54
Speaker
So possibly the next generation EV manufacturing, that gold rush which is happening in EV manufacturing, that would be the wind beneath your wings, so to say, give you those tailwinds and allow you to really scale up. Probably 100 is too small an ambition then. I guess there is a lot of investment happening in the EV space, and all of those folks would be perfect customers for you.
00:55:23
Speaker
Yeah, of course we're always hopeful for a brighter future than what we have envisioned. The question right now, which Gokul I think briefly mentioned as well: we have absolutely no doubts about the customer demand and the customer requirement. The question is whether we're able to find the capital, and investors who see that opportunity. Will investors see that opportunity? India is yet to see the kind of leap the US has had with respect to
00:55:49
Speaker
the VC ecosystem recognizing that manufacturing is a use case worth solving, with a huge opportunity in it. In India, manufacturing — manufacturing technologies and automation — is not yet an industry largely driven by VC capital. There's often a perception problem; there is a bad reputation that manufacturing has built for itself. One, the processes are very slow and very
00:56:15
Speaker
obstructive — there's an obstructionist attitude and all that. And the second bad rap they often get is that in India manufacturing labour is cheap, so people won't automate. There is often an assumption that automation happens because people are buying machinery, and that automation is just replacing people with machinery, right?
00:56:36
Speaker
That they're buying machinery to replace human labour because human labour has become costly. And then people go back and say: oh, human labour is so cheap here, a worker is paid only so much, so it's going to take a long time for anyone to automate, right? But that's not the reason why anyone automates.
00:56:51
Speaker
Automation comes in prominently for the sake of predictability. I need absolute predictability. Even if there is an error, I need to know where exactly the error is happening and when exactly it is happening, so that I can go back and correct it — predictability and consistency in the dimensions of the product, the accuracy of the product, and whatever else they have. That is also one aspect. In fact, when I say precision, people think only of that predictability, only of precision
00:57:16
Speaker
from an accuracy point of view, right? But there's also process accuracy that they want; that's a major driver for automation, right? So that understanding — and putting capital in with that depth — is the problem. And I can also partly blame the VC world, because mostly they come from the software industry, and the capital associated with it comes from around the finance industry, right? And often that finance is also not drawing from manufacturing;
00:57:44
Speaker
it's drawing from the other world — banking and so on. It's also partially a problem for the manufacturing world itself, to come out and start generating VCs from their side.
00:57:58
Speaker
There might be many LPs, but LPs don't make this a thesis, a focus — to say, hey, I would prefer investments on the manufacturing side. So that tailwind is something that's important. I think the Ola EV factory is going to change that.
00:58:17
Speaker
Hopefully they change the perception and people start looking at manufacturing more seriously. The proof is already there in terms of the pre-orders they have generated, and how quickly they were able to get the factory going as well. Yeah, exactly. I think that whole EV boom is going to change things. I don't see you facing
00:58:45
Speaker
much challenge in raising funds and, you know, being even more aggressive with your targets. Maybe a unicorn in three years' time. Hopefully — that's the plan. To learn more about CynLR, their business and their emerging products, check out cynlr.com. That's c-y-n-l-r dot com.
00:59:11
Speaker
This episode of founder thesis podcast is brought to you by Long Haul Ventures. Long Haul Ventures is the long haul partner for founders and startups that are building for the long haul. More about them is at www.longhaulventures.com