
Exploratory Testing APIs with AJ Wilson

S1 E7 · How to start API Contract Testing series
In this episode we had AJ Wilson, a Quality Engineer at Cazoo, STEM advocate and community champion. We chatted about attending my contract testing workshop as a novice, regular API maintenance, exploring APIs, observability, reviewing conference abstracts for the Grace Hopper Conference, empowering women in tech through mentoring and much more. You can follow @AjParadith on Twitter or connect with her on LinkedIn.


Transcript

Introduction and Workshop Experience

00:00:00
Speaker
In this episode, we had AJ, who is a quality engineer at Cazoo, STEM advocate and community champion. We chatted about attending my contract testing workshop as a novice, exploring APIs, observability of your APIs, and empowering women in tech. It was a pleasure to have AJ on the podcast, and this episode was lots of fun. Here it is. Hi, AJ. Thanks for coming on the podcast. Hi, Lewis. How are you doing today?
00:00:30
Speaker
Yeah, really good. Thank you. I'm really excited to hear about your journey into contract testing. I know you've just got started with it. Yeah, it's very exciting and there's lots to learn. So much to learn. Yeah, it's a bit of a steep learning curve, but I'm sure you'll pick it up quickly.

Testing at Cazoo: A Different Approach

00:00:48
Speaker
So you've started a new role at Cazoo this year, or last year?
00:00:53
Speaker
Yeah, this year I started in January. So new year, new job. Yeah. Exciting. So tell me a bit about your journey to testing at Cazoo.
00:01:05
Speaker
So testing at Cazoo is a little bit different to what I'm used to. The principles in the framework that we have here are very different, and quality is more of a coaching role. So we do take active participation and we do testing, but we think about things more from helping the engineering teams shift left, while also thinking about shifting right in terms of observability.
00:01:28
Speaker
So we do a lot of things around risk storming, thinking about what we can do at a TDD level with code, also moving into design and discovery. So it's quite interesting. Cool. So what did you do before you were a tester at Cazoo?
00:01:43
Speaker
Oh, there's so many things. So I've worked in every kind of testing role I can think of, from finance, payments, checkout, games testing at Nintendo, Wi-Fi testing at Sky, gas and electricity, smart meters, working in the field with hydro, everything you can think of. I've done a little bit of everything. Yeah. I'm like a test janitor. Yeah. Amazing. Done everything.
00:02:07
Speaker
So what was a really interesting one that you worked on, I guess, in terms of challenges that you faced? Oh, the best one I can't ever talk about for legal reasons. Oh, no. Because it's all with deck in the government and stuff.
00:02:20
Speaker
Yeah, what's the second one then? But I think the funnest one was working on Wii Fit.

Challenges in Gaming Projects

00:02:25
Speaker
Oh okay, cool. Because that's when I started relearning about personas, and this is nearly 10 years ago now, but thinking about the edge cases and the unthought-of. So Wii Fit was aimed primarily at adults, and one of the things that the predominantly male group of testers and developers that I worked with didn't think about was that there might be a 12-year-old who might want to try it out. Yeah.
00:02:45
Speaker
And if they do, should we really be telling them that a 12-year-old is overweight, that type of thing? So having to think about those kinds of scenarios and those personas and what we should look for. And then also making the Miis, that was quite fun. Like, do we want to alert when people are making things that are rude?
00:03:02
Speaker
or politically incorrect, that type of thing. So it was a really interesting project to work on. Nice, yeah. That sounds like you get really into who the person is, because that's the thing about the Wii, right? It's very personal and very... Yeah, even down to localisation and other cultures. So what we might feel is a good persona for the UK market might be completely different for Japan or Europe. Yeah, absolutely. Yeah, it's a really interesting one. Yeah, it was really good. So another chat for another time.
00:03:31
Speaker
So contract testing. I run a workshop that you attended. It's fantastic. If anyone's listening, you have to have a little look at it. It's quite good. Thank you. Yeah, it should be up on the Ministry of Testing platform at some point this year. So what did you know going into the workshop?
00:03:48
Speaker
I'll be honest, I knew the basics. I understand the principles of contract testing, but actually seeing someone apply it is very different. And then because of previous roles, when I did touch on contract testing, it was at a very high level, just based on how the teams worked at the time. So that's why I'm kind of hungry to learn a bit more, which is why I came to your session, because I knew that it would be good. Nice. And so what do you think was the biggest takeaway that you took from the session?
00:04:12
Speaker
The biggest takeaway for me was that there were a lot of people; it was a really big session. It was really good. And there were a lot of people from all over the world with different levels of experience. And even those that were much more technically experienced in testing than me were still asking really good questions and still having the same kinds of doubts that I was. So it was really good to see that no question is a silly question, that everyone has the same thoughts. And it's quite good to have that open forum where people can ask questions and discuss things.
00:04:39
Speaker
Definitely. Yeah. I thought the dynamic of the group was really good. We had a lot of interaction from people more senior and more junior. I think one of the things that contract testing highlights is that everyone makes mistakes when it comes to releasing APIs, releasing software with breaking changes. So it's nice to hear people's stories about how this happens all the time. Right.
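For anyone new to the idea being discussed here, below is a minimal sketch of what a consumer-driven contract test can look like, using the pact-python library as one possible tool; the service names, the /users/1 endpoint, and the response fields are illustrative assumptions, not details from the episode.

```python
# Minimal consumer-driven contract test sketch using pact-python.
# Service names, endpoint, and fields are illustrative only.
import atexit

import requests
from pact import Consumer, Provider

# Start a local mock provider that records the expected interactions
pact = Consumer("ExampleConsumer").has_pact_with(Provider("ExampleProvider"))
pact.start_service()
atexit.register(pact.stop_service)


def test_get_user_contract():
    expected = {"id": 1, "name": "AJ"}

    (pact
     .given("user 1 exists")                      # provider state
     .upon_receiving("a request for user 1")
     .with_request("GET", "/users/1")
     .will_respond_with(200, body=expected))

    with pact:
        # In a real consumer test this call would go through the consumer's
        # own client code, pointed at the mock provider's URI
        response = requests.get(f"{pact.uri}/users/1")

    assert response.json() == expected
```

The interactions recorded by the mock service are written to a pact file that the provider team can verify against their real implementation, which is how the kind of breaking changes mentioned above get caught before release.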
00:05:03
Speaker
Yeah, exactly. And there was a product owner in the conversation, and they got so much benefit, even though they weren't a developer, just seeing the kind of conversations that were happening and the thought process. So it was really good. And so you mentioned a topic to me before about exploratory API testing.

Exploratory API Testing and Observability

00:05:21
Speaker
How would you describe it? What is exploratory API testing?
00:05:25
Speaker
Again, it's probably context-driven. So for some people, exploratory API testing would be, if they inherit the API, seeing what it can do, making use of that observability, using the data, and then thinking about what other things we can do. So it's looking at the API where your expectations are open. You have the documentation if you're lucky.
00:05:42
Speaker
You have a handover and a knowledge transfer session if you're lucky. If you had good communication with the other teams, whether internal or external, then that's great. But it still always gives you a lot of value just exploring it for yourself and saying, based on my oracles and experience, what I want to deliver, how I want to improve on this, or what information I have from discovery, what kind of things can we look at? Can we improve it? Can we improve performance? Just having those even 30-minute time-boxed sessions where you just have a play with it.
00:06:11
Speaker
or you have a charter and you go, this is what I want to understand. Yeah, because I think with APIs you can get a narrow mindset towards "the documentation says this, so I want to check that it does this", whereas there's lots of stuff which is implicit, as you mentioned, about the data you're putting through the system. So what other scenarios do you think there are that don't live inside the documentation?
00:06:34
Speaker
I mean, one of the things for me is if your tests are always green, that's sometimes a red flag. Because are the tests good enough? Are there no alerts coming off? Is it because you haven't thought of things you need to alert on? Because alerts are for what you know is going to go wrong, right? If nothing ever goes wrong, then are you focusing on the right thing with the API? Just because the latency is good, is it good enough? Is it scalable? Those kinds of things. And that's where exploratory testing can be really beneficial.
00:06:59
Speaker
Yeah, absolutely. So I've kind of recently discovered tools which replay the documentation, replay your OpenAPI specs, and give you a set of tests and say, OK, yeah, these tests conform with the OpenAPI spec. But that's a computer telling us this is what the outcome is.
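As one illustration of the kind of spec-driven tool being described here (the episode doesn't name a specific one), Schemathesis can generate and run checks directly from an OpenAPI document; the URL below is a placeholder, and the exact loader function can vary between versions.

```python
# Sketch of spec-driven API checks with Schemathesis, run via pytest.
# The URL is a placeholder for wherever your service publishes its spec.
import schemathesis

schema = schemathesis.from_uri("http://localhost:8080/openapi.json")


@schema.parametrize()  # one set of generated cases per operation in the spec
def test_api_matches_its_spec(case):
    # Sends a generated request and validates the status code, headers,
    # and response body against what the OpenAPI document declares
    case.call_and_validate()
```

As the conversation goes on to say, conformance to the spec is all this proves; the exploratory and observability questions that follow still need a human.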
00:07:20
Speaker
When you're exploring the API, it gives you more value than just the automation. What value does it give beyond just automatic checks? I think for me, in terms of exploratory testing, again, that's where it ties in with the observability. If you have an API dashboard or health checker or health board and the latency has been good for a year, one of the things you can get from observability is predicting when a problem is going to happen. When was the last time the version was updated? Who's using those versions?
00:07:48
Speaker
If there's been no maintenance on the code or the API or anything around it or the consumers for more than two years, then do we still need it? Are those fields all relevant anymore? Are we creating bulk for the sake of it? If no one's using that information or those fields, then what's the point in them? People sometimes forget; they go, it's working lovely, crack on. But part of maintenance is going back and saying, is this still the right tool for the job? Is it still the right information we're gathering? Or can we improve the information? Can we make it workable for future projects, that type of thing?
00:08:15
Speaker
Yeah, that's a really good point. Because when you think about software development, you think about the definition of done, and the definition of done is often: we've delivered it in production, we can see what's happening in production. But then that area of continuous improvement comes in, and that doesn't really fit into your development process. How would you fit it into your development process, I guess, is the question.
00:08:37
Speaker
So the way that we do it at Cazoo, for example, is we have constant measurement as part of our observability. So after we do deliver, we have a measurement period where we say, is it actually delivering what we thought? Is latency what we expected? How is it impacting the e-commerce tools and the broad spectrum of all of the other roles that it touches? Are they getting what they need? What errors are we getting back?
00:08:58
Speaker
Are they the right errors? So we don't just deliver it and go, here you go, crack on. We do revisit it as part of our discovery. And then we get to a point where we say, we no longer need to do discovery on this. We think it is where we want to be, but because it is consumed and we are the maintainer, we will revisit it quarterly and just have a quick chat because otherwise it could be there for a year and a lot can happen. People can move teams, knowledge can change, knowledge can be lost. So it's worth revisiting, even if it's just for 20 minutes.
00:09:23
Speaker
Definitely, yeah. I think those are really nice use cases, as you say, about how we're getting the errors back that we expected. And that's where things like contract testing can come in and really detail that. And you can see what your consumers are doing. So I think that has a really nice overlap, but it definitely requires that human intervention of what we see.
00:09:42
Speaker
Different viewpoints are always good, especially when you're focusing on the API and you're delivering it. You do enter a little bit of a bubble sometimes. And it doesn't matter how good your cross-team collaboration is. You both get sucked into this little bubble. So revisiting it with a fresh pair of eyes or a fresh perspective is always good.
00:09:58
Speaker
Definitely prominent in lots of organizations.

Mentorship and STEM Involvement

00:10:01
Speaker
Cool. So I want to talk about the stuff you do outside of, I guess, your testing role as well, because you're an advocate for STEM and women in tech. So yeah, what kind of stuff do you get involved with?
00:10:14
Speaker
All sorts. And it's different levels of busyness at different times of year, depending on what's happening. So on one occasion, I might be reviewing abstracts for a conference, for Grace Hopper. That's a really big global conference that happens online and in person, in Europe and in America. And that's quite good because I get to provide really meaningful feedback to the individuals that apply. It is quite admin related, but it's important because
00:10:36
Speaker
just giving them an opinion saying, no one will find this interesting, or this has been done before, or it doesn't fit in with the theme of the conference, is not going to help that person move on, and it might put them off. So one of the good things about that is you can give them detailed feedback on each section and on what part of the application they may have stumbled, and they get key takeaways, quite good detailed ones. And it's part of a really large committee, and that committee is split into specialties. So mine is human interaction. So that's quite good. I've also started doing abstracts on data, which is really interesting as well.
00:11:06
Speaker
Conferences are very populated with white men, and we need to work out why that is, potentially, and give people the opportunity. I think it's because, as with anything, even with the communities that we're part of, they do start off as a buddy, who-you-know system, and that's how they grow and they build and people bring other people in. And if you don't make an active effort to widen that diversity group and include people that you think aren't being included, then you're just making it worse.
00:11:35
Speaker
Yeah, no, it's a really good point. You do lots more stuff for STEM. So what are some projects that you've been involved with that you're really proud of? So recently, just off the back of working with Stemettes at ASOS, I kept in touch with quite a lot of the young people that were working on that project and also taking part in the schemes and the Prince's Trust. There are still people that I mentor that have come out of the Prince's Trust. They're in careers now, but we still keep in contact. And they've got really great news today, actually, that one of the young people that I mentor
00:12:05
Speaker
They've become a finalist in the new Samsung competition for new development. And if they win, they'll have really great access to CEOs and people like Steve Bartlett who will help them with their product design. So that's quite interesting. I'm excited for them.
00:12:19
Speaker
Yeah, that's really cool. So how do they get involved with that? So one of the things that you can do: one of my mentoring groups is university students at UCL, and whenever they feel they need it, we just have a quick coffee catch-up online and they'll say, I'm really struggling to get an internship. I will just help them figure out ways of contacting people who would be good to have a peer review session with on their product design. Just from my experience in the industry, and even people I don't know. Sometimes they just need someone to help them
00:12:49
Speaker
take that first step and annoy someone, and then make them realise it's not annoying. Like, a lot of people will love to be contacted. It's not about recruitment, but actually because you want their expertise or knowledge. And most people can see the benefit of lifting others up and they will make the time to do it. So it's quite good.
00:13:05
Speaker
Yeah, definitely. I've realised that with the podcast: it's reaching out to people and not just stopping at, oh, sorry, I'm busy right now. Contact them again in two weeks or a month's time. And then, yeah, they're more than happy to contribute. It's just they didn't have the time then. But yeah, they don't find it annoying at all.

Diversity in Tech and Final Thoughts

00:13:23
Speaker
Yeah, I think when you move into tech for the first time, or when you're still quite new to tech, everything that is a possibility can be overwhelming, and you don't know, because you don't have that experience, what path to go down. You've got all these problems, all these challenges. I don't know where to focus. And sometimes just sitting down with someone for 10 minutes and saying, write down everything that you're having a problem with today or tomorrow. What can you actually control? What can you focus on, and what can you not control? So then just ignore that. And just sometimes having that someone to bounce off
00:13:52
Speaker
can be really beneficial. Having someone who's not someone you work with or who's in your direct community or peer group is also very good because it's almost like having a stranger just to bounce ideas off of and that can be really helpful as well. For sure. Yeah, what are you doing in regards to STEM and women in tech? Yeah, it's a good question. So I've joined a couple of mentoring programmes.
00:14:13
Speaker
One of them being Femme Palette. Their whole branding is centered around that. So hopefully I will get matched with some people that I can help.
00:14:23
Speaker
Yeah, I think anything that anyone in the testing community could do for women, femmes and them is always very important. And just thinking about that language when they tweet. And again, when people have those calls on Twitter, for example, specifically saying, I want you to talk with me or be a speaker, and giving them an opportunity and your voice, especially the ones that don't have many followers, because we both know that the amount of followers you have isn't as important as the conversation you're having. So maybe finding those people with conversations and bringing them in.
00:14:51
Speaker
Cool. Yeah, thank you very much for coming on the podcast. So where can people find you? How can people reach out to you if they potentially wanted mentoring in the future? Yeah, this has been really good. Thank you very much, Lewis. So using the power of your mind, telekinesis is always a good one. But if that's not an option, then on Twitter, I'm @AjParadith. Quite hard work to spell because it doesn't exist anymore. I'm on LinkedIn as just AJ Wilson. Yeah, I'll put the links in the show notes so people can find you there.