1 - Crunching Numbers and Collaborating: Tackling Risks in Testing with Qase's Jenna Charlton


Dec 19, 2023 · 47 min

Episode description

In the inaugural episode of QA Time, host Emily Kjos interviews Qase’s US-based Developer Advocate Jenna Charlton to discuss their perspectives on collaboration, risk, and quality within development teams, along with how Jenna got into software testing and quality assurance from a non-coding background.

QA Time was created for quality aficionados of all experience levels: to highlight stories and leaders in the quality assurance field, discuss career journeys and quality in software development, and explore thought leaders' different perspectives on trends and technology in the SaaS industry.

If you have a story to share or a leader to nominate, our guest application is open.

Connect with Jenna on LinkedIn

Tune into some of their talks:



Connect with them at some of their upcoming events:



Jenna’s Must-Reads:



Additional References:

Transcript

Welcome to QA Time: Conversations with QA Leaders, where leaders from the quality assurance community share their expertise, personal stories, and heartfelt experiences. I'm your host, Emily Kjos, Senior Solutions Engineer at Qase. Hi there, I'm so thrilled to welcome you to the inaugural episode of QA Time: Conversations with QA Leaders. I'm your host, Emily Kjos, from Qase Inc. I am so thrilled to present Jenna Charlton, our very own developer advocate

based in the US. They have been testing, doing some code, but not too much recently, and exploring quality assurance as a whole for the past 15-plus years. They write and speak about test practices, methodology, agile practices, and career development for juniors and those seeking further advancement. We'll post the links for their Twitter, LinkedIn, and other channels to connect with them, but you're really in for a treat as we cover everything from their start and how

they got interested in testing and quality. There are definitions, some rapid-fire questions in order to get to know them a little bit better, but I really want to highlight Jenna's start as a quality assurance engineer, moving up to a lead while at Progressive, then product management, becoming a product manager, and then head of product. So they have experienced quality assurance from just about every level that you can find, and they have some really

valuable insights, not to mention their time with training for the ISTQB certification. They earned their own certification as a Technical Test Analyst several years ago, and as an Agile Technical Tester. So, someone with a ton of experience, a lot of thoughtful responses to my questions, and along with that, just a lot of interesting hobbies and fun things that we talk about in the episode. So stay tuned for more, and don't forget to subscribe. All right, welcome to the inaugural

episode of QA Time. I'm your host, Emily Kjos, from Qase, and I'm joined today by our very own US-based developer advocate, Jenna Charlton. Jenna, thank you so much for making time to come on the show and to help me launch this new podcast. Thank you so much. I'm excited to be here. So I thought we'd start just by getting to know you a little bit better, and you've been in the

testing world for quite some time now, and have done research. You're currently in a graduate program, and so we'd love to just start off with how you got into testing, and maybe a little bit about your journey before then. Sure. Yeah, so I got into testing the way a lot of people do. I fell into testing. I'd worked terrible customer service jobs, you know, the call center thing, and all that, and I'd been laid off, and for a while I worked for a friend of the family

who had this weird service called Travel Clinics of America. It was a husband and wife, and they were doctors, and they had bought this company that they franchised essentially. So they taught other doctors how to have travel clinics. Anyway, they taught me how to do some SEO work. I don't really remember how to do it at this point, but I remembered then, because they

showed me. So I worked for them for a little while doing SEO, because it was like six months before Bob and I were getting married, and I knew nobody was going to hire me and then let me have like three weeks off to get married and have time off. And that's important. We should take that time. Oh, yeah. So, good work-life balance. We love that. But that wound up leading me, after a series

of weird things, to working for a company called Brand Muscle. And I joined there because somebody I was working with, who was temping at the company I was at, said, you know, I think you'd really like this company that I'm temping with in the afternoon. Give me your resume, and I'm going to give it to them. And so she did, and I joined their help desk. And let's say about eight months in, Trisha Missola, who was my manager, shout out to Trisha if she winds up listening to this.

She gave me the opportunity to start looking at other opportunities within the company. She was like, you know, you've got to, you got to do more. I'm worried you're going to get bored. And I wound up kind of falling into the testing role and really falling in love with it. And I have to say that seems to be a pretty common trend within the testing space. I mean, I know a lot of friends, especially during this past couple years where we've had some big shifts

in moving in and out of tech, who have trained as developers or engineers. But it seems like a lot of testers come from something else, where quality is something that they care about. It's important.

And I think it's cool that you come in with this different context and these different viewpoints, because staring at code and staring at different infrastructure, sometimes I think if you are so indoctrinated into a certain system or process or way of looking at things, it does make it a little bit hard sometimes to step back and really evaluate what all is going in, and to think a little more creatively or outside the box or from a design perspective, which it sounds like has served you

well as you've kind of continued on in your journey. Yeah, absolutely. And I think a big piece of it too is that a lot of testers come from the business side. So when you're a tester coming from the business side, you have a strong understanding of the business needs and how the business functions outside of IT. You understand your customer base. And that gives you the opportunity to take all

of that knowledge and then bring it to the IT side. And that's where the magic kind of happens, because developers think about things in linear terms and want the quickest solution to a problem. Not that they're cutting corners, but they're not necessarily thinking about experience. Testers, on the other hand, know the customer and say, yeah, I know we can do it this way and it'll work technically, but our users are not going to figure out how to do that. Or, I don't quite understand

how to do it this way, so I know a user is not going to understand how to do it this way. So we bring a mindset, especially when we come from the business, that's unique and is really different. Yeah. Well, as someone who's spent quite a bit of time on the business side and the customer success and solutions engineering side, I can definitely appreciate the need to have some checks and

voice of the customer. But also, if you are too close to anything, it sometimes can kind of cloud your ability to see the bigger picture, or to look at it as someone who's seeing it for the first time, or who has less experience or isn't quite as knowledgeable about it. So I think that's really important. You mentioned in our preliminary chats that the tester-developer relationship is

an area of interest for you. And I know that this can be a little bit tense or a little bit stressful in certain organizations or setups, particularly I think depending on the culture and how those teams are set up. What would you say the biggest challenge you see is based on those relationships and ways that you like to try to overcome or address those proactively? So I want to be upfront that those relationships should not be challenging or combative. They should be peer to peer.

And we should see them as partners. So developers should see testers, and testers should see developers, as partners. One team, one goal. I think that this breaks down in two different situations. One is when test is completely siloed away from development, and work just becomes thrown over the wall at the testers. And we're going to go down a rabbit hole for a second, but if you think about some of the theories of social interaction, and actually I just read a

really interesting study about this last night. This idea of high threat low threat, high commitment, low commitment identities and being part of an individual group identity. When we are not sitting together working together on one team, we develop these separate identities. And it's easy to have your identity and your welfare in terms of not like you think somebody's going to

hurt you, but your emotional welfare, it becomes high stakes and high threat. So when it's just thrown over the wall, it's like, well, they don't care about me and they don't care about my opinion and they don't want to hear my voice. And developers feel the same way. They're like, well, the testers don't care. They're just nitpicking my code. And neither of those are true. But this adversarial relationship develops because nobody's talking to each other because we have two

silos and nothing in between them that's bridging the gap. So on agile teams, testers and developers work together much better. The other way that this becomes really adversarial, and it's really dangerous, and I'm sorry to folks who are going to say, don't say this because managers will be mad. Oh, I love it, I want to hear all of the controversial,

challenging opinions. I wouldn't even say this is controversial or challenging, but it does call into question something that we have made a central way that we evaluate people. And that's using metrics. I believe in whole-team metrics, but when we look at metrics that are number of bugs entered for testers, or number of bugs in your code for developers, we now put our teams in a situation where their needs are in conflict.

And that's a really tough bridge to build or manage as a manager. And you've managed some teams. You've also dealt a lot with students and folks in various stages of their careers. And I would imagine that especially when you're working with very diverse global teams, or different experience levels, different opinions on methodologies and technologies, it can be hard to sometimes find that common ground. But I think that's a really

important point of perspective, and being able to just have that communication. I mean, I've seen this across departments at almost every organization I've worked at that is larger or doesn't have great inter-departmental communication. And that's something I feel very strongly about: proactive, positive communication. You don't have to sugarcoat things or, you know, lie or

white lie. But just being able to be human and empathetic and try to understand where the other team or person is coming from, a little bit about their context, really can go a long way. And I think the whole situation now with remote or hybrid work or very spread-out teams, Qase has people across maybe 12 different time zones, give or take. So it can be very challenging to find that commonality if there's not a chance to discuss or build relationships

that are more personal or outside of just the pure transactional code, testing, fixing. And so I think that's a really, really important point. Yeah, so there's the transactional nature of it, and there's the building-community-within-teams nature of it. But there's also the way we use data, which can either be positive or negative. And metrics are data.

And when we use metrics as an evaluation of a person, we now put ourselves in a situation where everybody's in conflict. Because you may have developers whose code has dozens of bugs, and another developer whose code has just a couple of bugs. But do we understand the context of the work they're doing, to understand why one of these is problematic or not problematic? Because I don't care how many bugs are in a developer's code. What I care about is that they're

identified and they're resolved. They might be working on something that's extremely complex. They might be working on something that touches six other systems, versus the person who has two bugs and is just changing, you know, some text. Hugely different contexts, and metrics don't give us that. Sorry, going off on a tangent, but they become extremely problematic when they're not whole-team metrics. I really like DORA metrics because they're evaluating the whole team.

But that's a bigger conversation. Maybe one of our future episodes. I know we've floated the idea of maybe having some panels coming up, so we'll table that for now. But speaking of complexity and overall quality of code, I would love to hear how you would define quality. And I know this can be very broad, very granular, kind of a loaded question.

But I think it's important to highlight, speaking of communication and quality standards, that while there are quality control standards, that's very different from quality assurance, or a subjective, qualitative series of metrics or expectations that someone could have, especially depending on what type of people you're working with and who your stakeholders are, if they're external or internal. So what does quality mean to you? Quality only has meaning

when the team has established what that meaning is. Quality for one system is not the same as quality for another system. Context matters. And team understanding of the vocabulary matters. So when I look at what quality means, well, to me, quality means software that's usable, that solves the problem it's intended to solve, and is low risk to the user and to the business. But all of that is abstract. None of that is quantifiable. What is quantifiable is the context. So

what does quality mean if I'm working on avionics software? What does it mean if my software runs the hardware in airplanes? What is quality there? Versus, what does quality mean at Facebook? There are risks in both systems, and there are risks to the user in both systems. We saw some of those risks come up with Facebook, with Cambridge Analytica. We saw those risks come up in avionics with Boeing. But the context there is very different. And the

required minimum quality is very different. So quality doesn't have a definition to me, because quality means whatever the team has determined it means, and whatever the context needs it to be. Quality is a feeling and an agreed-upon state of being. But it's not something concrete that anybody can really define. You know, I'm going to give a big shout-out to Damien. I apologize, Damien, I can't pronounce your last name. I don't know who you are, but I'm sure.

I'm sure. Damien talks about semantics, and I happened to be reading a book on semantics when I saw his talk on semantics. And this really drove home for me something I've talked about for years, which is this idea of shared vocabulary. But it really drove home to me that words have zero meaning until we assign meaning to them. And quality has zero meaning until the people who are engaged with it, in that particular organization, in that particular system, assign meaning to it.

So I gave you a non-answer. But I think defining what something is not, and defining that it is a context-based evaluation, assessment, expectation, kind of an intangible that is set by the involved parties, I think that works. That works for me, anyway. And it's a short question with a not-so-easy, straightforward answer. But I definitely appreciate your approach there. And I think we also talked about how risk is a really important part of testing, and designing for risk

when a project is approached. I am a huge advocate for testing being integrated as soon as a project is started, is planned, is designed, because you can't go back and inject quality into something after the fact. And it can be a bit more labor- or resource-intensive at the start, but that can save you

so much down the road. So when you think about risk, what is your key concern there when you're planning for something as, I'm going to say, simple as user data protection in the case of Meta or Facebook, versus something as complex as, you know, software that will pilot a moving vehicle that has weather and other unpredictable elements to consider? So risk has multiple facets, and there are components to risk. And we can look at risk from two different

perspectives. One is the general idea of risk. The other is risk as a metric, essentially, that gives us meaningful information. So when we think about it in terms of, let's start with what happened with Facebook. That wasn't necessarily a software risk. There's nothing that their testers could have done that would have appropriately identified that risk and then been able to put a plan into place to mitigate it. It was a business decision

that had an inherent risk associated. So we're looking at a business risk there. When we think about risk in terms of the components of risk and how we measure and then plan for mitigation, we have to think about this in terms of how we calculate risk. I like a risk calculation. There are other ways to do this. This is my preference. So risk calculation would look at first, what's the likelihood or probability of failure? How likely is it that this thing, this feature,

functionality, whatever it is, is going to fail? We should have data that tells us how many defects we've seen associated with it. And then we have the impact. And this is what the impact is: to the user, to the business, the brand, all of that kind of good stuff. There's this other factor that I like to consider as well that is not part of the formal risk assessment. This goes against what ISTQB and BBST say. I don't care. I don't care that this goes against what they say. Shout out to

Progressive, where I learned it when I worked there. But Progressive has kindly allowed me to proselytize, I think that's the right word to use, to shout this way of doing risk assessment from the rooftops, which is adding this idea of complexity. And complexity can be any number of different ways of looking at something. This can be how many features it touches,

how complicated the code is, how understandable the code is. Let's say we're working with an algorithm that's really complicated and has a lot of potential results, but it's hard to identify what is and isn't correct. That would make it more complex. But when we look at all of those facets, now we can identify what the level of risk is. And there's actually a mathematical formula that I use for this, but we're going too deep to get into that. But we can evaluate

risk on a spectrum. We can kind of put the risks on a chart, evaluate them, and then say, okay, now that we've identified our risks, let's start mitigating. If you're really going for the most logical way to do it, you mitigate the highest-risk items first and work your way down. But risk is, again, subjective. So this is about what the team agrees is a risk. Right. And again, priority, just in and of itself, can be so subjective: how are we categorizing this? Is it

prohibitive to internal functions? Is it prohibitive to clients? What could cause potential security risks that could get us sued or taken through litigation? So there are so many ways to define that. But I think it's, again, something that's really good to consider from the inception of a project, to build that into the planning and design. And just to kind of go back to your time at Progressive. It's been, can I say, how long, over 15 years? Oh, yeah.
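The risk calculation Jenna describes, likelihood of failure and impact, with a complexity factor added, then mitigating the highest-risk items first, can be sketched roughly like this. The 1-to-5 scales, the multiplicative scoring, and the example items are illustrative assumptions for the sketch, not the exact formula mentioned in the episode.

```python
from dataclasses import dataclass

@dataclass
class RiskItem:
    """One feature or piece of functionality being assessed."""
    name: str
    likelihood: int  # 1-5: probability of failure (e.g., informed by defect history)
    impact: int      # 1-5: impact on the user, the business, the brand
    complexity: int  # 1-5: features touched, how complicated/understandable the code is

    @property
    def score(self) -> int:
        # Multiplying the factors means any single high factor pushes
        # the item up the priority list; this convention is an assumption.
        return self.likelihood * self.impact * self.complexity

def prioritize(items: list[RiskItem]) -> list[RiskItem]:
    """Mitigate the highest-risk items first: sort descending by score."""
    return sorted(items, key=lambda item: item.score, reverse=True)

# Hypothetical items for illustration.
items = [
    RiskItem("checkout flow", likelihood=4, impact=5, complexity=4),
    RiskItem("copy change on FAQ page", likelihood=2, impact=1, complexity=1),
    RiskItem("pricing algorithm", likelihood=3, impact=4, complexity=5),
]

for item in prioritize(items):
    print(f"{item.name}: {item.score}")
# checkout flow: 80
# pricing algorithm: 60
# copy change on FAQ page: 2
```

As Jenna notes, the numbers only mean something once the team agrees on what they mean; the point of the chart is the ordering, not the absolute scores.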

I've been working for about 15 years. Yeah. Yeah. So what would you say some of your key accomplishments have been? I know you cover a lot of things. What's funny is you mentioned Progressive, which, yes, I brought them up earlier, but that's probably the most formative time for me in my career. You know, I'd worked my way up to senior; Progressive was my first senior title when I joined there. Can I say that again? Sorry. So, the first time for senior. Yeah. So the first time I was

a senior in title was when I joined Progressive. Okay. While I was there, they really embraced my love of testing, and they embraced my desire to go deeper and really explore and examine. And they gave me the opportunity to do all sorts of things. They sent me to a conference for the first time. They let me bring back the testing circus from TestBash in Philadelphia to Progressive. I put on an event for over 500 people. Oh, yeah. No, I mean, I had a team of like 40 volunteers. This was big.

But they gave me budget and the ability to use volunteers, and they let people do this on the clock and have time away from work. People outside of IT, so customer service people and salespeople, came down and learned about testing; developers and testers came too. And they gave me the opportunity to do my first talk, which was amazing. Yeah. Like, I had mentors there who helped

me with it. My manager helped me with it. And when the next opportunity came along, because I wasn't looking, it fell in my lap, they gave me a pat on the back and a big hug and said, you were meant to do this. You need to go do it. And the door's open to come back one day. So those were my big moments: doing my first talk there, doing my first event with them, really discovering my strength in this industry and where I belong. So those were really big

things that happened all at one time. And then probably the other really big moment was I had my first keynote this past May, at Agile Testing Days in Chicago. And that was huge. That was a personal big accomplishment. Well, I'd say that a keynote speaking engagement is a great achievement by anyone's standards. So I think, again, you came from a background that was not testing related. But you also do coding. You can understand a lot of things. It may,

maybe it's not your favorite, but being able to just have that perspective helps. So for example, I've done some intro boot camps and I've dabbled a little bit, and I'm getting to the point where I can recognize certain infrastructure or syntax or elements. But I think one of my personal goals is to get to the point where I could consider myself a very slow, introductory junior coder. So I think it just helps to be able to understand the language, literally and figuratively speaking,

that your colleagues and your peers are working with. So you've had quite the journey. Do you know offhand how many talks you've given at this point? I don't. I mean, in the dozens, maybe. I would say, if we include virtuals during COVID, maybe 50. That's a pretty solid number. And I've seen a lot of your decks. You put a lot of time and energy into really crafting a story. You are a great storyteller. And, you know, we just had

our session with the folks at Ingenious. They just launched a new program, kind of an introduction to testing, to let folks test out if they would like to pursue that as a career. And there's a ton of great resources, we'll also link some in the show notes, that we would recommend or that our colleagues have explored or checked out if you're curious about getting into testing. But if you are someone who's been in the industry longer: you know, Jenna, you are on the board,

or you're co-chairing, was it track chair, for the Romanian Testing Conference coming up in the coming months? Yes. And that's in June, June of 2024. And then you've been involved with CodeMash as co-track chair for quality for them, and you're an Agile Testing Days ambassador. You're a graduate student. Do you sleep, is, I think, my question? Yeah, no, I don't actually.

No, I do stay very busy, but I have an incredibly supportive, wonderful partner, and all of the things that he picks up because I can't is why I can do all of the things that I do. Well, it's always amazing to have a good support system, but especially someone you're sharing your life, your space, and all of the fun ups and downs of the questions and projects with. It's Bob, right? I hope to meet you someday, Bob. And I'm sure you'll be listening to this. But

well, we're really happy to have you here at Qase. And I think, you know, just from our conversations, the different recommendations you've given me to improve myself as a tester, or at least a testing novice, I'd love to let you give a few shout-outs on some book recommendations. I know there's a lot of books, a lot of resources, a lot of content that we could talk about. But some of the ones that you were telling me about earlier sound a little off the beaten path,

but more in theory. So what would you say your top must-reads are for anyone who is an experienced tester, or looking to further their career and their knowledge? So there are definitely must-reads. And I'm not going to go through the must-reads, because we all know what they are, like Lisa and Janet's books. Okay, I'm going to drop the names anyway. Like Hendrickson's book. So there are books that are must-reads. But getting off the

beaten path, there's a couple of things that I think are really beneficial. The first one I'm going to suggest I actually haven't finished. I just pick it up and put it down, because it's that hard. And that's Gerald Weinberg's An Introduction to General Systems Thinking. I don't have a CS degree; this is stuff that you do in a computer science program. But it is a great opportunity to explore logic. And logic is really beneficial in thinking through how a system

works. So I think it's a great book. You can get it as a Kindle download. I think you can also maybe get it free from the publisher. I'm not sure. But it's a great book. And if you want to challenge yourself, don't feel like you have to finish it. Pick it up. Get through a couple of chapters if you can. Put it away when you feel like you need to challenge your brain again a little bit.

Pick it back up again. It's the kind of book you can do that with. There's also a fantastic book called Thinking in Systems, which, if you think about Weinberg's book, is kind of like the next generation, almost. This is an easier read. It's not an easy read, but it's an easier read. And Donella Meadows, who wrote it, goes into systems and systems thinking, but also how systems interact. What makes up a system? What

is a system? And she goes deeper than just hard systems. She also talks about soft systems, and human systems, which are critical to understand when you think about your users, when you think about your team. These are human systems. So that one is really good, less challenging, but equally beneficial. And then there's another one that I love that is, again, off the beaten path, and I know we're not designers, but it's useful anyway. Right. Yeah. But it all ties together. I think one of the

things I love is that you have a very holistic approach. And your knowledge, your background, your reading, and the different things you spend time on all kind of play well together. So I think it's good to have a variety, not just strictly testing. I would love to say that it's because I'm intentionally holistic, but it's not. It's because I'm ADHD and I just shoot down the rabbit hole of whatever interests me at that moment. The good thing is that all

of these things that have interested me have built a knowledge base that works in wonderful communion. But this last book is called Design for Cognitive Bias. And this is really useful, because we all have cognitive biases. And it's not a bad thing; our brains are just meant to work this way. But when we think about cognitive bias, we have to learn how to kind of shake ourselves out of some of our framing and untether ourselves from our

anchoring bias, so that we can see things from other perspectives, and we understand how our users' brains work, so we better understand how to test for them. So that's another great one. And the name of the author always escapes me, but I'm sure it'll be in the notes. It absolutely will be. I can see to that. So before we get to some of your recent projects and wrap up, I'd love to just do a quick little round of rapid-fire questions, if that's okay.

We've touched upon a few of these things, but just kind of for fun. Mac, Windows, or Linux? We'll start easy. Mac. Mac. Okay. What's your go-to comfort drink or snack for long working hours? Oh, water. I mean, my emotional support water bottle, and coffee. Nice. Are you a black coffee or a coffee-beverage person? A little bit of cream. Nice. A popular UI feature that bothers you or that you don't like? Carousels. Carousels. Oh, interesting. I suppose it is harder to be able to encapsulate everything

in one. Light or dark mode? Dark mode. What about, well, I've seen Excalibur in the background a few times. Cat, dog, bird, snake, or none? All right. Cats, because I'm allergic to dogs. Also, cats are less work. Not birds, though. I have a bad experience. Oh no, we will have to chat about that later. My aunt's bird attacked me a number of times as a child, so not a fan. Sorry. That's okay. He was mean. Well, distant memories, hopefully. And then how about reptiles?

They're fine. I've never had a reptile. They're fine, but I like fuzzy, cuddly things. That's fair. And then do you have a favorite sport or professional skilled activity you like to nerd out about? Oh, pro wrestling. Well, I suppose I kind of walked into that one. How about, besides that, I know you support the Browns? I do. I support the Browns. I support the Guardians. I support the Cavs. Just a few. Just a few. And you were just saying that you're into college

football. It's not something I've dipped into, being primarily West Coast as an adult and in my college years, but I know that's really big back in Ohio, where you're at, and in the South. So, your favorite college teams, if any? So, Ohio State, because I'm in Ohio; it's like a rule. That's fair. But also, you know, I'm of course a fan of my colleges. So Bowling Green and Miami University. Miami University in Oxford, Ohio, by the way, not

the University of Miami, excuse me. Important clarification. They're in Florida, because everybody gets it confused. That's fair. Yeah. A little less obvious than, say, Paris, Texas, or other Paris cities in the States versus abroad, but yes. And then finally, actually, two more. For brainstorming or problem solving, do you prefer physical, like a stylus, whiteboard, or pen and paper, or do you prefer digital, like some sort of design tool?

So I'm in between. I have a Kindle Scribe that I have found to be really, really useful as my brainstorming tool. So it's like paper, but it's digital. Easy to track everything and log it somewhere. I had a Rocketbook for a while that you could scan. And that was nice, but you still have to go through the scanning, and I write in kind of a weird mix

of cursive and D'Nealian, so it does not pick up my scribbles. I just type so much faster, and typing is so much faster now because I don't write as much, but I do try to practice here and there. And then finally, what's a least-known fact, characteristic, or skill about you? Oh, goodness. I feel like everybody knows my skills. Like, I can do what I do. I can cook. Oh, okay. This doesn't happen as much as it used to. So, for those of you who are

old enough to remember scrolling through TV channels, because that was the way you did it: my superpower is that I could find Legally Blonde on TV any time of day. Wow. It didn't matter where I was. If I scrolled through the TV channels, I would find Legally Blonde, which is great, because it is a fantastic movie. Yeah. That is my unknown superpower. That is fabulous. And I mean, it's not one of those that would be playing 24/7, so it would still have to have

some level of chance there. I knew it. But either it's playing 24/7, or somehow talking about the superpower made something align in the first place. I was manifesting it. I love that. I love that. Well, speaking of your Kindle Scribe, just before we end here, what are some of the recent talks or projects that you're working on that people are interested

in and should follow or look for after the show? So, something I'm very excited about that everybody can get involved in: my research project for the semester is looking at job title as it relates to the employee experience. And by experience, I don't mean, like, your experience in your job. I mean your experience in work. I guess that's not quite how I was going to say it: your experience at work.

So there's actually going to be a survey coming out that I'm going to publish, pending IRB review, that's going to ask you lots of questions about how you feel about job titles for testers and what feels best to you. Interesting. Kind of, you know, almost a manifestation in some senses. Yeah, manifesting there. Sorry. Good. Go ahead. Everybody who participates can kind of consider themselves a part of my thesis, because you will have contributed to the

research that will be a part of my thesis. Excellent. Well, shout out to the survey filler-outers, because there's always a need for data, as we all know. And I know you've recently started a series of sketchnotes. So there should be a couple out by the time this goes live, which I'm excited about. The first one was great. And your inspiration for that, I think you got a shout-out from him on LinkedIn, didn't you?

So, yeah. I went to, I apologize, I'm going to mispronounce your name, Katrina's talk. And she'll be linked everywhere. But she did a talk at Targeting Quality on the power of example mapping. And so I did a sketchnote of her talk, and Matt Wynne, who developed the technique, it happened to make it to him, and he reshared it on LinkedIn. It was great. And doing sketchnotes was really inspired by all of the people who do sketchnotes in a way

cooler way than I do. I can't draw, and I certainly can't draw quickly enough. But like, Lucy Hockey and Constance, I don't know your last name, but she goes by Constance Hermit, they do sketchnotes live in talks, like on paper with multiple colors and stuff. And they're incredible. But it's a lot, being an artist and a tester and a student and all these other things. So I was very impressed with your sketches. And again, like with quality or risk, I think

art and beauty are very much subjective, and in the eye of the beholder. And I think they were very clear, and it was really interesting to check those out after the talk. So we'll still post a link for those and the series. We've also got your mixtape coming out, which I love, especially since music is another passion point for you in addition to wrestling

and testing and all of that. So, do you want to? Yeah, I don't know about that. So the testing mixtapes are interviews with testers and people who are really influential in the testing space, and people who are maybe not so influential but have something really interesting to talk about and should be influential. And this was actually inspired by a podcast that used to exist called Testers' Island Discs, which was hosted by Ministry of Testing and which Neil Studd created. But these, as

opposed to being a podcast, are text-based interviews. So you can read it kind of like reading an interview in the newspaper. And all of the contributors then kind of curate a Spotify playlist of songs to test by that are inspired by their interview, or have to do with whatever it is that we were talking about. It's really cool. I'm very excited about it. Lena is our very first interview, so shout out to Lena. Her playlist is awesome. I can't wait for people to read it.

Well, I'm excited to listen to the playlists on Spotify too. I love Spotify, though Pandora was my first favorite. I loved the music genome concept and the idea that it learned the songs I liked. I never could, for the life of me, remember the name or the artist unless it was very, very obvious. But even sometimes people would look at me and say, Emily, are you serious? You don't know who this is? You can't remember? Decades later, I'm still running into this, but it's okay. Making space for other things

in my brain. The lyrics, though, I could always sing along with. But I love having playlists curated by other people: you get a little peek inside their mind, and you're able to have music that fits whatever tasks or responsibilities you're trying to knock out. So I think that'll be really exciting. And then we've got the conferences coming up. So the CFP is open for the Romanian Testing Conference, you mentioned. And we'll also link some of those other ones, like

Agile Testing Days and CodeMash as well. But really, any other final thoughts for those tuning in before we call it? So glad that you made it on today and that we were able to chat. I've really enjoyed our conversations beforehand, but it's always fun to be able to share exciting things less known to a lot of people. My final thought would be: find something new to explore every day. It can be something small. And when I say explore, it doesn't have to be exploratory testing,

but yes, do some exploratory testing. The most interesting people are the people who are interested in things. So find the things that make you excited, that you get passionate about, and start exploring them. The more you explore all of these disparate concepts and things and activities, the more connections you start to make, both in the way systems work and in the people and the communities that you develop. So go explore.

Love that. Yeah, continual learning. And there are some wonderful communities within the testing and open source development world. So if you haven't found those, please go out and check them out. We'll link all of the things we talked about and more in the show notes. So please tune in and look forward to our next episode. We'll talk some more, we'll do some more rapid-fire questions, and we'll bring some more interesting people to share their journeys, their experiences,

and maybe some of their stories from the trenches that they've had to overcome. So thanks for tuning in, and have a great week. Thanks for joining QA Time: Conversations with QA Leaders. Subscribe to keep up to date with the quality assurance community. If you'd like to nominate yourself or someone else to be a QA Time guest, head to qatim.case.io. That's qatim.qase.io.

This transcript was generated by Metacast using AI and may contain inaccuracies. Learn more about transcripts.