Introduction to Contract Testing Podcast Series
00:00:00
Speaker
Hi and welcome to series two of my podcast on contract testing. It's been a long time, but I'm back. This time I'm speaking with each guest about one topic and having an in-depth chat about it. All episodes will be linked to contract testing in some form. So I hope you enjoy.
Guest Introduction: Tobias Muller and Key Topics
00:00:19
Speaker
In this episode, we have Tobias Muller, Managing Director of Projarl GmbH. We discuss OpenAPI specifications, starting with testresults.io, and autonomous testing. So let's get into the episode. So welcome, Tobias, and thank you so much for coming on the podcast. It's been a long time coming. So if you want to just give us a little introduction about yourself, who you are,
00:00:47
Speaker
Hey, thanks. And thanks for having me on the podcast, Lewis. So, who I am: as you mentioned, I'm Tobias, managing director of Projell. And we are the vendor of a test automation platform that closes the gaps that currently exist in test automation, testresults.io. And yeah, that's me. I'm a 42-year-old guy in IT, like I guess 90% of the people in IT. Yeah. It's always difficult to know what to say about yourself, isn't it?
Creating testresults.io: Addressing Market Gaps
00:01:16
Speaker
Yeah, so testresults.io, how did that come about? How did you start with that solution? Yeah, that was more or less built based on the need for an automation platform in a regulated environment,
00:01:31
Speaker
like with the requirements of audit trails and stuff like that, and approval processes. And that was the first part. The second one is that we wanted to give testers a better experience than the tools that existed on the market back then.
00:01:46
Speaker
Yeah, I have a lot of talks these days about artificial intelligence and autonomous testing. And somehow it brings me back to the beginning, when we started. It's like, we have a lot of marketing, and everybody tells you, yeah, we have self-healing testing, we have autonomous testing. And everybody actually knows that it just doesn't work. And what we wanted to do differently is actually close the gaps
Simplifying Testing with testresults.io
00:02:10
Speaker
in your daily business as a tester. So if you automate something, there are always these small pieces that don't work, which cost you a lot of time, and that is actually where we put our focus, and then testresults.io was born. Cool, yeah. So I guess there are lots of people coming about in the market saying those things, right? So I guess, what does testresults.io do differently than other platforms?
00:02:40
Speaker
Yeah, exactly that. So what you see on the market is, for one example, my best example is actually the self-healing stuff. If you have ID-, CSS-selector- or XPath-selector-based automation, you usually see that they have self-healing claims. And in the end, if you look at it, it's just some form of LCS, so longest common subsequence, or some other bit of dynamic programming or something like that. But with that you can match freaking any element.
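To make that point concrete, here is a deliberately naive sketch of what such "self-healing" often boils down to: a longest-common-subsequence similarity over selector strings that always returns some best match, right or wrong. The code is purely illustrative and not taken from any particular vendor; all names and values are made up.

```csharp
using System;
using System.Linq;

static class SelectorHealing
{
    // Classic dynamic-programming longest-common-subsequence length.
    static int Lcs(string a, string b)
    {
        var dp = new int[a.Length + 1, b.Length + 1];
        for (int i = 1; i <= a.Length; i++)
            for (int j = 1; j <= b.Length; j++)
                dp[i, j] = a[i - 1] == b[j - 1]
                    ? dp[i - 1, j - 1] + 1
                    : Math.Max(dp[i - 1, j], dp[i, j - 1]);
        return dp[a.Length, b.Length];
    }

    // "Heal" a broken selector by picking the most similar selector currently in the DOM.
    public static string Heal(string brokenSelector, string[] candidates) =>
        candidates
            .OrderByDescending(c => (double)Lcs(brokenSelector, c) / Math.Max(brokenSelector.Length, c.Length))
            .First();

    static void Main()
    {
        var candidates = new[] { "#btn-submit-order", "#btn-cancel-order", "#nav-home" };
        // The original id was renamed; this happily "heals" to *something*,
        // with no guarantee it is the element the test actually meant.
        Console.WriteLine(Heal("#btn-submit", candidates));
    }
}
```

The point of the sketch: a similarity score always produces a winner, so the test keeps running even when the "healed" element is the wrong one, which is exactly the concern raised here.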
00:03:07
Speaker
How does that actually work in testing? Is that really self-healing, or is it just: I take a random element and assume that's the correct one, and all of that stuff? And what we are looking at is, how can you really test like a user? The first idea is that we really go only for visual cues.
00:03:25
Speaker
We test the user interface and the API based on the visuals that we have. And in the beginning that looks extremely simple, because you're looking for an image and you need to click at that position, and that's it. But if you really go into it, it's actually pretty difficult to open a dropdown and select an element from that dropdown by only having access to an image. And that is actually what we do differently. It's like we give people high-level access to user interfaces,
00:03:52
Speaker
but while only having an image. So we are completely technology independent, and that's the difference, more or less. And it's easier to use. That's the whole point of it, isn't it? It's a lot easier to use.
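Purely to illustrate the idea of driving a UI from nothing but pixels (this is not a description of how testresults.io is implemented), here is a rough sketch using template matching with the OpenCvSharp4 package. The package choice, file names and threshold are all assumptions.

```csharp
using System;
using OpenCvSharp;

class VisualLocator
{
    static void Main()
    {
        using var screen   = Cv2.ImRead("screenshot.png", ImreadModes.Color);
        using var template = Cv2.ImRead("dropdown_arrow.png", ImreadModes.Color);

        // Slide the template over the screenshot and score every position.
        using var result = new Mat();
        Cv2.MatchTemplate(screen, template, result, TemplateMatchModes.CCoeffNormed);
        Cv2.MinMaxLoc(result, out _, out double score, out _, out Point bestMatch);

        if (score < 0.9)
        {
            Console.WriteLine("Element not visible - a real tool would wait, scroll or fail here.");
            return;
        }

        // Click target = centre of the matched region; an automation tool would now
        // synthesize a mouse click at these screen coordinates.
        int x = bestMatch.X + template.Width / 2;
        int y = bestMatch.Y + template.Height / 2;
        Console.WriteLine($"Would click at ({x}, {y}), match confidence {score:F2}");
    }
}
```

Even this toy version hints at why the approach is harder than it looks: opening a dropdown and picking an entry means chaining several such matches while the screen keeps changing underneath you.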
00:04:06
Speaker
What we are is a bunch of testers and a bunch of developers; the testers had problems and the developers solved those problems. And it took like three years to fine-tune the tool to ensure that the daily life of a tester is actually something you want to have, instead of struggling.
00:04:26
Speaker
Yeah, and I think, looking around and getting a demo of your platform, what really set it apart for me was the fact that it is separate from the code. It's not trying to kind of, yeah, guess what's happening. It's not tied to a specific framework, so I think that's really clever in that respect. So, as you say
A Day in the Life of a Managing Director: Hands-On Approach
00:04:55
Speaker
about the day-to-day life of a tester, what does your day-to-day life as the managing director look like?
00:05:06
Speaker
Actually, perhaps we should put some context on that. So we are like a company of 12 employees, so I'm the managing director of 11 employees. And that means, what does my life look like? Yeah, it's like, what am I doing? Everything. So I'm finding new customers. I'm talking to investors. I'm also doing coding. I'm talking to peers. I'm browsing through LinkedIn, Twitter and different private Slack channels to identify what the competition is doing right now.
00:05:34
Speaker
And stuff like that, looking at videos, challenges, and all of the... I mean, if you look at that in a single day, there are, I don't know, how many hundreds of videos that are produced that show some new technique that you need to learn and understand. Something like that, and yeah. And I think the most important part is actually...
00:05:52
Speaker
and making sure that I have one of the best teams in the world behind me. Because as I mentioned, we are extremely small in comparison to all of the US-based companies and also to the European-based companies. And it is business-critical for me to actually have the best people on my team. And to be honest, and to say something funny, sometimes I even mop the floor. But I guess that is just a requirement. I mean, that is how we grow, isn't it?
00:06:23
Speaker
Yeah, yeah. Sounds like you really get involved in every part of
Tobias's Journey into Testing: From Developer to Leadership
00:06:30
Speaker
it. So how did you come into testing? How did you get into testing, and what's your kind of backstory? And that, Lewis, is an extremely interesting question. The funny part is, everybody's asking me that, but this time, in the API context, I thought, hey, how did I actually get into testing? Because I am a developer. So how did I get into testing?
00:06:53
Speaker
And then I thought, yeah, when was the first time I ever tested? And then I remembered, I came into a project where I had to do nothing else than write unit tests for eight hours a day against an existing solution. They just didn't have any unit tests, there was no documentation, and I had to write those freaking unit tests. And as you can imagine, it was like 12 years ago, 15 years ago, and there were a lot of static classes and a lot of what you would call these days untestable code.
00:07:22
Speaker
And I had to somehow test it and the documentation was created while I tested it. So based on my questions, they actually created the documentation of the code.
00:07:32
Speaker
In summary, it was a mess, but I think that was the first time I came into contact with testing. But in the end, being a developer, you're looking for solutions, aren't you? And then I started to use, I think it was called Moles back then, where you could actually mock away static classes in the IL code behind and ensure that you can actually have your mock classes injected
00:07:57
Speaker
during runtime and I created a firmware simulator actually to simulate stuff that was impossible to replace.
00:08:04
Speaker
and stuff like that. And the project became successful. And afterwards, they asked me to take over another software project, which was also successful. And then they asked me to take over one of their most critical projects as a software project lead. And that was the first time I actually got into contact with what you call verification and validation, so formal testing and also end-to-end testing. And from there, the story unfolded itself. I was in touch with the test team, and there was actually a brilliant tester.
00:08:33
Speaker
And he is, funnily enough, he is still with my company now. So I took him with me. And he was the one who actually showed me what testing really means. What you read on LinkedIn all day long, he was living that 15 years ago. He was really talking to the developers, talking to project managers, talking to customers, trying to identify how the system should work, trying to identify how the system is working.
00:08:59
Speaker
Identifying the gap, is it really a gap, then confirming whether it is a gap or not. Looking at the documentation and then in the end actually extending the specifications based on his findings, and all that stuff. And that actually brought me into the whole universe of testing. And finally, in preparation for this, I thought, this is about APIs, isn't it?
00:09:23
Speaker
And that is this area. And I thought, yeah, I mean, doing unit testing is actually API testing. Perhaps I'm too old for the business. Back when I started coding, everything was an API. I mean, if you use it, it has an API. That's it. And these days, if people talk about APIs, somehow I need to remind myself they are referring to REST APIs and GraphQL and stuff like that. But they don't get the understanding that
00:09:52
Speaker
In programming, actually, you're always coding against an API. Yeah, we've got to talk to the computer somehow. Going back to your first introduction to unit testing, that perfectly segues into what we're going to be talking about today.
Best Practices for Generating API Specs
00:10:16
Speaker
The topic is around OpenAPI documentation, and you spoke about kind of test-driving that documentation. So what do you think is the best way to generate those API specs?
00:10:35
Speaker
So, some advertisement: naturally, with testresults, I would say... no, but what I would do, and I guess that is personal, that is just me, more or less, because I'm thinking in code. So what I always do is write stub code, like only having the class and having the functions in there, and then putting all of the annotations in to generate a specification, because I live in code, more or less. So I guess that only fits me, to be honest.
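As a hedged illustration of that stub-code-plus-annotations workflow, here is a minimal sketch in ASP.NET Core with Swashbuckle, the .NET Swagger middleware mentioned just below. The endpoint bodies are deliberately unimplemented; the attributes alone are enough for the generated OpenAPI document to describe routes, payloads and error responses. Type and route names are invented for the example.

```csharp
using System;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;

// Stub DTO: only the shape matters for the generated schema.
public record OrderDto(int Id, string Customer, decimal Total);

[ApiController]
[Route("api/orders")]
public class OrdersController : ControllerBase
{
    /// <summary>Returns a single order by id.</summary>
    [HttpGet("{id:int}")]
    [ProducesResponseType(typeof(OrderDto), StatusCodes.Status200OK)]
    [ProducesResponseType(StatusCodes.Status404NotFound)]
    public ActionResult<OrderDto> GetById(int id) => throw new NotImplementedException();

    /// <summary>Creates a new order.</summary>
    [HttpPost]
    [ProducesResponseType(typeof(OrderDto), StatusCodes.Status201Created)]
    [ProducesResponseType(StatusCodes.Status400BadRequest)]
    public ActionResult<OrderDto> Create(OrderDto order) => throw new NotImplementedException();
}
```

With the Swagger generator enabled (see the Program.cs sketch further down), this stub already yields a specification that consumers can review and discuss before any behaviour exists.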
00:11:03
Speaker
If you are asking me what is the best way to generate API specifications, I would say: what I always use is the tool most native to the language that I'm using. So right now we are using .NET 6 or 7 or whatever is current, and there's just a middleware for Swagger, and we actually use the annotations in there. And what we are doing, or what I'm doing,
00:11:27
Speaker
mostly is, yeah, generating some stub code, putting the annotations in, and then getting the API specification generated. Nice, yeah. I think creating that skeleton is a really good way to get started with it. And I guess that's the benefit of creating your documentation before you start implementing, right? So that you have something where you have the contract between
00:11:57
Speaker
what the API is and who's going to be consuming it. Exactly. And I'm not the guy who says something like, the code is the documentation, as I guess you hear everywhere. But it's more or less like, no, I'm using the minimum of code to generate the documentation. And the interesting part is, I mean, the documentation is interesting, but the more interesting part is actually taking that documentation and giving it to someone else to understand.
00:12:25
Speaker
Because that is where the discussion starts, where you start refinement, and where the real thinking process actually starts. So you do have your design of an API in mind and you can put that in the documentation, but in the end you actually have customers, so other developers that will use it. And I think that's a more important part of the documentation than having the documentation itself: the discussion about the documentation. Yeah. Sorry, what do you prefer, actually? Yeah, no, I'm completely on your side,
00:12:55
Speaker
using the native kind of tool. And then, yeah, I talk about contract testing a lot, so I'm very much for kind of creating that contract upfront, storing it in a place which is accessible by all parties and lives there, right? So it evolves as the product evolves. It doesn't just live with the code, where someone sees it at a point in time and then
00:13:25
Speaker
it's out of date. So yeah, I think, try and create those conversations upfront, and that's what can drive the building of the application. And I think you mentioned something important: actually, the documentation needs to live. Because that is what I had with one of my first customers, more or less: the regulations required the documentation, and then the documentation was created, signed off and done.
00:13:56
Speaker
And, I mean, there was no added value in that documentation like six months afterwards, just zero. Yeah, exactly. I think that's an important point. And yeah, coming on to kind of the final part: I think you touched upon this a bit earlier, but how can you use the specifications to assist in the testing activities?
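On that point about documentation that lives rather than being signed off and forgotten, one hedged way to enforce it in a .NET setup is an integration test that regenerates the spec from the running application and fails if it has drifted from the contract stored in the repository. The packages, routes and file paths below are assumptions, not anything from the episode.

```csharp
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc.Testing;
using Xunit;

// Requires "public partial class Program {}" in the API project when using minimal hosting,
// plus the Microsoft.AspNetCore.Mvc.Testing and Swashbuckle.AspNetCore packages.
public class OpenApiContractTests : IClassFixture<WebApplicationFactory<Program>>
{
    private readonly WebApplicationFactory<Program> _factory;
    public OpenApiContractTests(WebApplicationFactory<Program> factory) => _factory = factory;

    [Fact]
    public async Task Generated_spec_matches_committed_contract()
    {
        var client = _factory.CreateClient();
        var generated = await client.GetStringAsync("/swagger/v1/swagger.json");
        var committed = await File.ReadAllTextAsync("contracts/openapi.json");

        // A real check would compare parsed documents rather than raw text,
        // but the intent is the same: the contract in the repo must not go stale.
        Assert.Equal(Normalize(committed), Normalize(generated));
    }

    private static string Normalize(string json) => json.Replace("\r\n", "\n").Trim();
}
```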
Role of OpenAPI Specifications in Simplifying Testing
00:14:24
Speaker
So specifically for OpenAPI, it's more like, and I mentioned that before, it's extremely powerful.
00:14:33
Speaker
I looked at the numbers, and it's more or less like doing 60% of your work. So having this OpenAPI specification actually gives you all of the calls. It gives you all of the data structures. And depending on the context provided, it gives you all of the error cases. It gives you so much information that you can take, parse and generate test cases out of, that you could actually claim you have autonomous testing for APIs in there, which is not the case.
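A small sketch of how much a well-filled OpenAPI document already hands you for testing: every path, every operation, the response codes and the schemas behind them. This uses the Microsoft.OpenApi.Readers NuGet package; the file name is an assumption.

```csharp
using System;
using System.IO;
using Microsoft.OpenApi.Readers;

class SpecWalker
{
    static void Main()
    {
        using var stream = File.OpenRead("openapi.json");
        var document = new OpenApiStreamReader().Read(stream, out var diagnostic);

        foreach (var (path, item) in document.Paths)
            foreach (var (method, operation) in item.Operations)
            {
                // Each entry here is effectively a test case waiting to be generated:
                // the call, its parameters, and the responses it is allowed to return.
                var responses = string.Join(", ", operation.Responses.Keys);
                Console.WriteLine($"{method} {path} -> expected responses: {responses}");
            }
    }
}
```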
00:14:59
Speaker
I'm not a marketing guy; it is not the case. But actually that is far, far closer to autonomous testing than what everybody else actually claims about testing. And that is why I love those specifications. Yeah. I mean, if you have a tool that does the correct interpretation of the OpenAPI or Swagger specification, that is like,
00:15:23
Speaker
That's the hard part, isn't it? Understanding those data structures of the API, how to call them, what to put in there. And you have all of that information readily available to you as part of the specification. Although, to be honest, it always depends on how much information is in the specification. Because if you read the standard,
00:15:44
Speaker
There's a lot of freedom. I can give you an OpenAPI specification that you can use for nothing in the end, other than having the URLs for the calls. So there's a lot of freedom in the standard.
00:15:56
Speaker
But if you provide the corresponding and required information, then it actually takes a lot of the work out of testing APIs. I mean, if you go to YouTube and search for Postman tutorials or API testing tutorials, you can actually see
00:16:17
Speaker
how they are typing in URLs at the top and how they are adding some JSON payload to their PUT requests or to their POST requests. And actually, all of that is not needed. I mean, the OpenAPI specification gives you all of that, and you can just create the test cases out of it. So for me, it saves a lot of time. Yeah, for sure. We'll have to link some things in the show notes for where people can go for that stuff.
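Following that point: rather than typing URLs and JSON bodies by hand, request skeletons can be derived from the document itself. A hedged sketch, again using Microsoft.OpenApi.Readers; it only fires parameter-less GET operations and assumes the spec declares at least one server URL.

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.OpenApi.Models;
using Microsoft.OpenApi.Readers;

class SpecDrivenSmokeTest
{
    static async Task Main()
    {
        using var stream = File.OpenRead("openapi.json");
        var doc = new OpenApiStreamReader().Read(stream, out _);
        var baseUrl = doc.Servers[0].Url.TrimEnd('/');

        using var http = new HttpClient();
        foreach (var (path, item) in doc.Paths)
        {
            if (!item.Operations.TryGetValue(OperationType.Get, out var op)) continue;
            if (path.Contains('{')) continue; // skip operations that need path parameters

            // The spec supplies the URL; the documented responses give us the oracle.
            var response = await http.GetAsync(baseUrl + path);
            var documented = op.Responses.ContainsKey(((int)response.StatusCode).ToString());
            Console.WriteLine($"GET {path}: {(int)response.StatusCode} " +
                              $"({(documented ? "documented" : "NOT documented")})");
        }
    }
}
```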
00:16:46
Speaker
Good stuff. Any other comments around OpenAPI specs or specifications in general?
Evolution and Importance of Updated OpenAPI Specs
00:16:56
Speaker
For the OpenAPI spec, it's like what I just mentioned. If you go through the standard, you see that it evolved somehow. I mean, you have the different versions, and most of the systems still use version two. Microsoft supports both version two and version three, but you need to specify it. And that is something where you would think,
00:17:16
Speaker
is a clean cut required or not? I mean, most of the time you say, no, we need to be backwards compatible. But the problem with the current specifications is that there are so many variations of where you can put endpoints in and where you can put additional information in. And I guess nobody's actually using the specification to its maximum.
00:17:38
Speaker
And it could be much easier. But that's always the case in development, isn't it? You start with specification one. In two, you put your learnings in. In three, you try to be perfect. And at some point in time, you restart with all of the learnings that you have. And that is the perfect version afterwards, for the time being.
00:17:57
Speaker
So I think that's my only comment on the OpenAPI specification. I love that it's there, I love that it grew out of Swagger, and that it actually became a real specification that you can rely on and that you can use. Honestly, I also love Swagger UI, for example. I mean, I don't know if that is advertisement or not, but I love that I get a full client out of nowhere. I just give it the specification and I have the full client. I mean, people talk about
00:18:25
Speaker
REST Assured and RestSharp and stuff like that. I mean, honestly, before you even give those a try, it's already there. You just need to enable the middleware, at least in .NET, .NET 6, and you can just have it. You have a full web client that can actually talk to your API, including OAuth, including basic authentication stuff. It's all in there. So that is amazing, the combination of the specification and also this implementation of Swagger UI that is,
00:18:54
Speaker
I would call it magic, but in the end, in my world there is no magic, so I just love it. Yeah, I can't believe we almost didn't mention Swagger UI, because I literally use that on a daily basis. So, yeah, it's really important. Cool.
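For reference, the "just enable the middleware" step as it typically looks in a .NET 6 minimal-hosting Program.cs with the Swashbuckle.AspNetCore package; the OAuth values are placeholders, not anything from the episode.

```csharp
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();
builder.Services.AddEndpointsApiExplorer();
builder.Services.AddSwaggerGen(); // generates the OpenAPI document from the annotated code

var app = builder.Build();

app.UseSwagger();                 // serves /swagger/v1/swagger.json
app.UseSwaggerUI(options =>       // serves the interactive client discussed above
{
    options.OAuthClientId("demo-client"); // optional: pre-fill OAuth settings for the UI
    options.OAuthUsePkce();
});

app.MapControllers();
app.Run();
```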
Future of Autonomous Testing: Upcoming Events and Skepticism
00:19:16
Speaker
And so, yeah, you touched upon kind of the AI and autonomous testing side, and I know you're doing some events around that soon. So yeah, do you want to mention those?
00:19:26
Speaker
Yeah, actually, I did one yesterday together with Bas. Bas had some insights on mutation testing, quite interesting. And we are doing, I think on the 24th or 25th of April, we are doing a webinar together with Bas. And it's more like a chitchat between two people that have their thoughts on autonomous testing, because both of us see a lot of fuss out there and marketing buzz about, wow, we can do everything autonomously and stuff. It's like Tesla in 2017:
00:19:56
Speaker
history repeats itself. This time it's just testing. So yeah, that is what we're planning to do right now. Awesome. Can anyone join? Everybody can feel invited. Yeah, definitely, I'll definitely try and join. Bas was on the first series of the podcast, so yeah, he's a friend of the pod. So that's really great. So yeah, thank you so much for coming on today.
00:20:27
Speaker
Lewis, I have to thank you for having me on your podcast, actually. And yeah, if any of you have any questions, contact me. Contact Lewis.
00:20:36
Speaker
Absolutely. And if you have any insights, and by the way, if you feel, and that is something I have to say in every podcast, I guess, if you feel offended by my opinions, please let me know. You have every right to feel offended. You can have a completely different opinion. That's fine for me as well. Get in touch with me and let's have a discussion. I always want to learn from somebody who knows better than I do. So how can people get in touch?
00:21:01
Speaker
Just send me an email at Tobias.muller.proj lch, or search for Tobias Muller on LinkedIn. You'll find me. Great stuff. Thank you very much. Thank you, Lewis, once again.