
Testing Web APIs with Mark Winteringham

How to start API Contract Testing series
235 Plays · 1 year ago

In this episode we talk about Mark's book Testing Web APIs, his API playground Restful Booker (with planted bugs), and the open-source automation curriculum.

Make sure to sign up for TestBash UK 2023 and buy Mark's book!

Transcript

Introduction and Episode Overview

00:00:00
Speaker
Hey, and welcome back. I had a great conversation with Mark, who's the Ops Boss at Ministry of Testing and author of Testing Web APIs. In this episode, we talk about his book, the Restful Booker API, and also the open-source automation curriculum. So I hope you enjoy the episode.

Role and Responsibilities at Ministry of Testing

00:00:21
Speaker
Hello, Mark, and welcome to the podcast. Thank you so much for coming along. Well, thank you for inviting me on. It's good to be here.
00:00:30
Speaker
Well, yeah, there's a lot of crossover with what you're doing, and so I wanted to get you on the pod. So tell us a bit about yourself. What is an Ops Boss? What do you get up to at Ministry of Testing? And also, what do you get up to outside of Ministry of Testing?
00:00:46
Speaker
Yeah, it's funny, I've been asking myself that question about what an Ops Boss is as well. So yeah, I liken it to being a COO, but for a small company, because there's only nine of us at Ministry of Testing. My focus is kind of on the day-to-day running of MoT and basically helping the team in terms of
00:01:15
Speaker
how we do our work, how we prioritise it, the direction of it as well, and then working with Richard Bradshaw, the Boss Boss, and kind of basically being a rubber duck for him, for his ideas and stuff.
00:01:30
Speaker
You know, taking the ideas that he has and then kind of running with them and being the one that implements them, because we've always found that we work quite well that way, whether it's with MoT work or our training material stuff that we've done outside of MoT. So yeah, that's kind of the Ops Boss stuff. I think I'm a year, year and a half into it, and I'm still kind of working it out myself.
00:01:57
Speaker
It's cool. I really enjoy it. There's a lot of metrics, a lot of monitoring, a lot of sort of getting better at understanding what our members do, what they want, challenging our assumptions, that sort of stuff.

Insights from 'Testing Web APIs'

00:02:09
Speaker
And then outside of work, I write, so I've written a book called Testing Web APIs. I am in the process of writing a second book, which is nominally titled AI Assisted Testing in Action, which I have just finished my first chapter on.
00:02:28
Speaker
Which is pretty cool. And then beyond that, it's just, you know, family life and DIY, just lots of DIY. We've got a house that needs a lot of love. So yeah, those are kind of the main things that I'm focused on at the moment. Yeah. So we wanted to get you on to talk about Testing Web APIs, because you've got a chapter in there about contract testing. I do. But yeah, give us a little elevator pitch of the book.
00:02:58
Speaker
Who is your target audience for the book? So yeah, Testing Web APIs. It's kind of a book where I wanted to have my cake and eat it. So I wanted it to be a book that could be for people who have experience with testing web APIs in certain ways, but would like to dip into other approaches. So if you've done some automation but want to try out contract testing, there's a chapter there. If you wanted to do maybe some testing in production, there's a chapter on that.
00:03:27
Speaker
But then it's also for people who are thinking about testing APIs as a sort of holistic whole. So, you know, what are the different approaches that I need to take? What sort of risks do I need to consider? You know, what does quality mean? And basically how to be strategic in the choices that you make. So
00:03:46
Speaker
some of the chapters talk about strategy, how to put your strategy together and your plans. And then the lion's share of the book is exploring these different types of techniques that we can use and kind of connecting them to risks. So, you know, performance testing will focus on certain risks that exploratory testing won't necessarily focus on, or testing web API design and ideas and stuff is much more focused on a different set of risks than testing in production.
00:04:16
Speaker
Nice, yeah. I just jumped straight into the contract testing chapter, obviously, because I was interested in that.

Understanding Contract Testing

00:04:24
Speaker
Yeah. So yeah, when you're going through the description, I thought it was a really great introduction for someone who hasn't got that experience of contract testing, because it can be really hard to explain to people who haven't got that kind of background. So you mentioned within it,
00:04:41
Speaker
the testing strategy model and how contract testing sits on the imagination side. So can you elaborate on that a bit more? Because I don't have the context of that model that you've described there.
00:04:55
Speaker
Yeah, so the model you're kind of talking about is something that I borrowed from James Lyndsay. So he created this model, but he used it as a way to sort of explain the value of exploratory testing. But I actually kind of ended up internalizing it as sort of my way of thinking about testing in general. So the concept is you have
00:05:16
Speaker
the imagination, which is what we want from a product or an application. So this is explicit things like requirements, but it's also implicit and tacit things like conversations that have been had, assumptions that we're making, the general sense of what quality is. And then you have implementation, which is sort of another area, and that's the product itself. And the idea is that the more we test, the more we learn about both of these things, the more these two things overlap,
00:05:47
Speaker
and we want them to overlap so that we end up delivering what it is that we want. So yeah, it's interesting with contract testing because it is a technical approach. It's about using some tooling and stuff, but I've always thought or I've always felt that it's a technical solution to a people problem.
00:06:08
Speaker
So it sits in the imagination space because it's about changes to our designs, changes to our assumptions about how our applications work. But we are basically putting some automation in place to check those assumptions, or check the assumptions of the other team, so that when a contract test or check fails, it's supposed to trigger off a conversation. You know, in the ideal world, you know, if everyone's talking to each other all the time and there's good communication,
00:06:37
Speaker
There's an argument that you might not necessarily need contract testing because there's that sort of back and forth. But it gets more complex when you start having... I worked on platforms where we had hundreds of microservices. It doesn't matter how good the culture is, it's not feasible to keep an eye on all of that sort of stuff. For sure.
00:06:58
Speaker
So that's kind of why I see it sitting in that space because it's facilitating conversations and that's different to say like sort of more traditional automated tools and stuff like that, which are just checking assumptions that you've sort of codified. Yeah, I completely agree with that. Like when I do my talks, I talk about it as a communication tool.
00:07:20
Speaker
And it's a really great way to get humans to communicate, exactly what you're saying. It's like it triggers that conversation. And also, when you're using contract testing, often the teams are speaking in different programming languages. So it's great to have a tool that sits in the middle, which then you can start to trigger those conversations with. So yeah, that makes good sense to me.
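
To make that idea concrete, here is a minimal, hand-rolled sketch of a consumer-side contract check in JavaScript. It is not an example from the book, and real teams would typically reach for a dedicated tool such as Pact, but the core mechanic is the same: the consumer writes down the response shape it relies on, and a failing check is the cue to go and talk to the provider team. The endpoint and field names below are borrowed from Restful Booker's documented booking response.

```js
// A minimal consumer-side contract check (illustrative sketch, not
// from the book). Assumes Node 18+ for the built-in fetch.

// The "contract": the fields and types this consumer depends on.
const expectedShape = {
  firstname: "string",
  lastname: "string",
  totalprice: "number",
  depositpaid: "boolean",
};

async function verifyContract(baseUrl) {
  const res = await fetch(`${baseUrl}/booking/1`);
  if (!res.ok) throw new Error(`Provider returned ${res.status}`);
  const body = await res.json();

  // Compare the actual response against the agreed shape.
  const mismatches = Object.entries(expectedShape)
    .filter(([field, type]) => typeof body[field] !== type)
    .map(([field, type]) => `${field}: expected ${type}, got ${typeof body[field]}`);

  if (mismatches.length > 0) {
    // A failure here should trigger a conversation with the provider
    // team, not just a quiet fix to the test.
    throw new Error(`Contract broken:\n  ${mismatches.join("\n  ")}`);
  }
  console.log("Contract holds.");
}

verifyContract("https://restful-booker.herokuapp.com").catch(console.error);
```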
00:07:45
Speaker
Yeah, I think another aspect of it all as well is, so I used to do a lot of BDD training, and I think there are commonalities between talking about BDD and talking about contract testing. I've always liked Liz Keogh's thing that having the conversations is more important than capturing
00:08:04
Speaker
the conversations, which is more important than automating the conversations. I think it's the same thing with this: it's a tool that facilitates that conversation at first. You've got to agree on the contract to begin with, or you've got to maybe get into a pattern of understanding, oh, we're going to have to make these changes, so we should have these conversations. But then you've still also got that sort of guidance that you get from the mocks that you create.
00:08:32
Speaker
It gives you some sort of level of controllability. It gives you boundaries as well so that you don't end up going off course. But it also does have that safety net or those indicators that a change that you've made has stepped out of those boundaries. So I see that a lot like BDD as well. It's that same sort of principle of you're using those tools to set the scaffolding, but really the richness is the conversations that sit within it.
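
For a sense of what those guardrails look like in practice, here is a sketch of a tiny provider stub that serves only the agreed contract response. The endpoint and payload are illustrative assumptions, not anything prescribed in the book: the point is simply that anything outside the agreed boundary fails loudly rather than drifting silently.

```js
// A stub provider acting as a contract guardrail (illustrative
// sketch). Run with Node 18+ as an ES module.
import http from "node:http";

// The agreed response shape, shared between consumer and provider teams.
const agreedBooking = {
  firstname: "Sally",
  lastname: "Brown",
  totalprice: 111,
  depositpaid: true,
};

const server = http.createServer((req, res) => {
  if (req.method === "GET" && req.url === "/booking/1") {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(JSON.stringify(agreedBooking));
  } else {
    // Requests outside the contract fail hard: the mock keeps the
    // consumer from quietly stepping outside the agreed boundaries.
    res.writeHead(404);
    res.end();
  }
});

server.listen(3001, () => console.log("Contract stub listening on :3001"));
```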
00:09:01
Speaker
Yeah, 100%. Yeah, I like the comparison as well to other tools about creating that communication. I think so often people take those tools and just follow them word for word. So I think it's more of a framework rather than a defined set of steps. Yeah, and to bring it back to the book, that was the whole idea, was that
00:09:26
Speaker
I think I was more interested in framing the risks and saying this tool can help with that, because that is something that Richard and I talk about in Automation in Testing, this concept of problems before tools.
00:09:38
Speaker
It's very easy to get seduced by the tools. And there's an explicit nature of learning how tools work, but you've got to understand more like the problem spaces, which is a bit more squishy and messy and, you know, has humans involved and stuff. But you have to kind of do that work before you start doing the tooling as well. Yeah, for sure. Cool. Yeah. So if you obviously want to hear more from the book, then go out and purchase it from Manning.

Restful Booker and API Testing Tools

00:10:07
Speaker
Yeah.
00:10:08
Speaker
So I wanted to move on to other things that you have going on. So I discovered Restful Booker a couple of years ago when I was practicing API testing for a job interview. I think it's a really great tool to do that. Before, there was the Twitter API, which I used a lot, but obviously I don't encourage people to do that anymore. So yeah, tell us about Restful Booker and how people can use it and find it.
00:10:36
Speaker
Yeah, so you can find Restful Booker at restful-booker.herokuapp.com. It's very kindly supported by Ministry of Testing, so they pay the bill after Heroku turned off the free instances. Yeah, I thought it might be moving somewhere, but that's great to hear it's still being supported.
00:10:56
Speaker
Yeah, I didn't really want to... I mean, I didn't want to move it in the first place, but yeah, it was good that basically all I had to do was just set up a new dyno in their account and everything was set up and stuff. I have another application called Restful Booker Platform. So Restful Booker is a single API, whereas the Platform is multiple APIs and a front end and stuff, and that's all deployed on
00:11:19
Speaker
automationintesting.online. And that one is a little bit more complex to manage. It does fall over every now and then and stuff. Whereas I find with the Heroku one, it's there. It resets itself every 10 minutes. It's pretty stable. So yeah, I wasn't looking forward to having to deal with the hassle of making the changes.
00:11:40
Speaker
But yeah, the API was actually built because it's funny, like you mentioned the Twitter API. I wanted something for my testing web APIs course and workshops that I was creating at the time, but I wanted something that had bugs in it. I wanted something that didn't quite work like it should. Because one thing I tend to do with a lot of my training is I always have an ulterior motive.
00:12:07
Speaker
So if I'm doing a workshop on testing web APIs, I'm not just teaching API testing, I'm also trying to teach you exploratory testing as well. I'm trying to teach you about heuristics and oracles and stuff like that. Yeah, I wanted something that was filled with bugs and stuff because I wanted people to experience what it was like to actually not just send some requests and get some responses, but to actually analyze them, to actually look at them and think about them and be like, well, what's going on here and stuff?
00:12:37
Speaker
So yeah, so the API was built for that. It's riddled with bugs. Some of them I say are intentional. Some of them I claim are intentional when I get found out that they exist. It's written in JavaScript, so it's really easy to write bugs into it and stuff.
00:12:55
Speaker
But yeah, I just deployed it one day for free, partially just because I was fed up of setting them up and shutting them down. I was just like, I just need one working for my workshops. And then I just started sharing about it and people started picking it up. And it's bizarre. I think I implemented it for a couple of companies to use as their interviewing stuff. So when you say you did it for an interview, I think I know which company that was.
00:13:21
Speaker
But it's mad to see how much it's kind of taken on a life of its own. You know, like literally earlier today, I was preparing for a workshop I'm doing tomorrow, and I created a booking and it was like booking 3,891.
00:13:41
Speaker
And this API resets every 10 minutes as well. So the level of traffic it gets is staggering, really. It does have some problems. I need to start getting rid of the bots and things, which I haven't quite worked out how to do in a way that doesn't impact everyone else.
00:14:02
Speaker
My initial implementations have not gone so well. But yeah, it's just great to see people using it. My intention is to keep it free. And it's nice to see other teachers using it as well; they use it in their courses and stuff like that.
00:14:19
Speaker
Yeah, it's just not what I expected, but I'm quite pleased with what it's become. Yeah, no, I think it's really cool.
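
If you want to give Restful Booker a go yourself, here is a quick sketch of creating a booking against the public instance, based on the API's published documentation. Bear in mind the instance resets every 10 minutes, so your data won't stick around, and remember some of the bugs are planted deliberately, so comparing what comes back against what you sent is exactly the kind of analysis the playground is there to reward.

```js
// Creating a booking against the public Restful Booker instance
// (sketch based on its published API docs; Node 18+ for fetch).
const payload = {
  firstname: "Jim",
  lastname: "Brown",
  totalprice: 111,
  depositpaid: true,
  bookingdates: { checkin: "2023-06-01", checkout: "2023-06-05" },
  additionalneeds: "Breakfast",
};

async function createBooking() {
  const res = await fetch("https://restful-booker.herokuapp.com/booking", {
    method: "POST",
    headers: { "Content-Type": "application/json", Accept: "application/json" },
    body: JSON.stringify(payload),
  });
  const body = await res.json();

  // Don't just check the status code: inspect the echoed booking and
  // ask whether it actually matches what you sent.
  console.log("Created booking", body.bookingid, body.booking);
}

createBooking().catch(console.error);
```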

Community-Curated Automation Curriculum

00:14:27
Speaker
Cool. And then on the Ministry of Testing side, you're co-creating an automation curriculum with the community, which is awesome. So yeah, tell us, what is an open-source automation curriculum?
00:14:42
Speaker
So I did a big talk at Selenium Conf recently about it, so if you want kind of the details of it and the guts of it, I'd recommend checking that out. It was called something like "What do you do as an automator?" I was trying to be clever with it, but I don't think the title really translated very well.
00:15:02
Speaker
So it's kind of been something that we've wanted to do for quite a few years. It's one of the reasons why I joined Ministry of Testing. So it kind of harks all the way back to the software testing clinic stuff that I was doing with Dan Ashby. We wanted to create free, accessible learning material. But the more we got into it, the more we were like, well, we need some sort of structure around it. We need some sort of curriculum.
00:15:26
Speaker
But I was keen not to turn it into the Mark and Dan show. I didn't want it to be like testing according to us. I wanted it to reflect what was going on in the industry. You know, the team at MoT, Richard and Sarah Deary, our Learning Boss, they felt the same.
00:15:45
Speaker
When I joined, yeah, we wanted to work on something like this. We tried the first iteration just before the pandemic. Like we launched it all. It was all about to go live. We just started the process and then everything locked down. So it all just sort of kind of.
00:15:59
Speaker
went to one side. But yeah, so it came back and the way that it works is that it's a community curated curriculum, which is just wonderful to say. And the idea is that we go through the curriculum creation process, but we do it with the guidance of the community. So we've actually finished the automation one and we're now working on a junior testing one.
00:16:23
Speaker
And it starts off with the process of sending out surveys, interviewing people, analyzing training material that already exists out there, looking at job roles: this idea of a needs analysis, what do people need, and then a jobs analysis of what is actually being framed within these job roles. We put all of that together, and then what we do is we come up with a job profile.
00:16:50
Speaker
And the idea is these are the sort of common responsibilities that we see across all roles within the context of either automation or testing. And then we run weekly events, and we run lots of social media questions and again more surveys and stuff. And what we do is we just collect loads of raw information from
00:17:13
Speaker
from the community of like, let's say, one of the interesting ones for a junior tester is creating a test plan. That really surprised me that that was a thing that came in. So we will run a workshop of like, well, what does that involve? What are the actual specific steps and stuff? So we take all of that, we clean it all up, and then we turn those into learning outcomes. And the learning outcomes set the kind of learning journey that you would go through.
00:17:39
Speaker
So all this process, rather than coming from my opinions or the opinions of others at MoT (you know, we'll offer our experiences and our guidance), means we're much more in tune with what the community wants. We put it all together, and then, yeah, once that job profile is completed with all the tasks and all the outcomes, we just offer it free to the community. And that means other trainers could create their own training material. It could be used by managers to
00:18:09
Speaker
create job roles and job specs. It can be used for self-evaluation: like, where are my gaps
00:18:15
Speaker
in my experiences and stuff. So yeah, that's why we've kind of made it open source. We'll create our own learning material. And we feel like we're in a good position for that. But in terms of the actual curriculum itself, it is owned by the community. It will be maintained by the community. We'll put the mechanisms in place to do that sort of stuff. What's great about that is that if something changes in the industry, we can tweak the curriculum and then tweak our learning materials as a consequence.
00:18:44
Speaker
rather than it sort of stagnating because the people who author it can't scale and aren't necessarily aware of everything that's going on. Yeah, I think that's great, because doing mentoring myself, saying to a mentee, what do you want to go away and learn? It's like, okay, so this job's asking for this. And then in my previous experience, this is what I wanted my testers to do. But I don't know whether another
00:19:13
Speaker
QA lead is going to want the same thing. So it's nice to have a central place where you can go and send a junior or someone starting out to.
00:19:24
Speaker
And that's it, that's another aspect of it being community curated is that, yeah, it's never going to be 100% fit. There's always going to be like context dependent aspects of a role that matter to that job. But the more the community is involved, the more likely that if you are to go through like some learning materials and then have some sort of portfolio that says, look, I can do everything that's on this curriculum.
00:19:51
Speaker
If those managers and staff have been involved in this curriculum, or recognize it as something that is useful, then that helps everyone, because as a manager you get someone who has demonstrated that they have the abilities that you want.
00:20:07
Speaker
They have the portfolio and the knowledge to present their understanding as well. And as I said, all of it's still sort of, you know, in tune with what's actually going on as the community evolves and that sort of thing as well. Amazing.

Upcoming Events and Community Engagement

00:20:23
Speaker
So are there any other events coming up at Ministry of Testing? There's always events coming up. Any that you think my listeners would be particularly interested in?
00:20:34
Speaker
Well, I would say the big one for Ministry of Testing is we've got TestBash UK coming up at the end of September, which builds on the back of last year, which you know as well, because you were one of our speakers. It was a bit of a gamble, but it really paid off with this multi-format approach. So you can learn by attending a talk
00:20:55
Speaker
or a workshop, or having a conversation or doing an activity with that person. So we're going to be doing that in Liverpool at the end of September. And now that we've learned from it, we're making it bigger, bolder, you know,
00:21:09
Speaker
all sorts of crazy stuff going on. Really looking forward to that. And then, yeah, the big thing is our pro membership. You know, we're still doing a lot of work around that, a lot of 99-minute workshops each week, so online workshops to get involved in. And then, as a little sneak peek, I'm sure we'll be announcing more online TestBashes in the coming weeks. I think Deanna's just putting all that stuff together.
00:21:34
Speaker
So yeah, there's always stuff going on at MoT, for sure. Exciting stuff, yeah. Definitely head down to TestBash if you haven't been before, or if you're on the fence; the format is awesome. Yeah. So thanks, Mark, so much for coming on. That's quite all right. Yeah. I'll pop all the links and stuff in the show notes so people can check out your book,
00:22:00
Speaker
check out Restful Booker, check out the curriculums. So yeah, thanks again. Cheers. Thanks for having me.