
Typisch Tester? Diverser als man denkt - Isabel Evans

Feb 14, 2025 · 24 min · Ep. 122

Episode description

In this episode I talk with Isabel Evans about her research on software testing stereotypes. Isabel, who began her research in 2017, discusses her work on breaking stereotypes in software testing. Her industry survey shows that testers come from a wide variety of backgrounds, such as theatre studies and the arts, which contradicts common IT stereotypes. And it is precisely in testing that this diversity matters, because it brings different perspectives onto the software. Isabel's research shows that only 6% of testers match the traditional IT stereotype.

Transcript

Hello and welcome to a new episode of the Software Testing podcast. I am your host Ritchie, and I have brought you another episode, this time from the 28th testing retreat in Belgium. This time I had the pleasure and honor of having Isabel Evans on the podcast. You probably know her from countless conferences and from the papers and books she has published, and I have to say I was quite excited that she came on my podcast. Because she is one of the people who

have made software testing very intense and forward-looking for me. With her I talked about the question of who actually is a typical tester. Does the template of the typical IT nerd fit here, the introverted loner, or is the group of testers perhaps much more diverse than you think? Isabel did research on the topic and found out a few exciting things, but listen for yourself. Have fun with the episode.

Hi Isabel, nice to have you on the show. Well, thank you for having me. I'm really pleased to be here. Yeah, yeah, very nice. It's the first time we met in person here. It is, isn't it? Yeah, yeah, yeah. We are here at the testing retreat, the 28th testing retreat in Belgium. Yeah. Yeah, it was very nice, and yesterday you gave us a short insight into one of your current works. And you're dealing with how to break stereotypes. Yes. Yes. And so please, please tell us what's going on in there.

All right. So it's part of the research I'm doing. So it's one of the outcomes. I'm not sure where to start here. One of the things that's happened to me with the research, and I get a feeling it might be quite usual, is you have a hypothesis and as you're collecting data, parts of the hypothesis aren't right. And then you say, "Oh, well, actually, well, I need to check this thing over here and collect some more data." And then you find patterns

and form more hypotheses and test them. You're kind of looping round iteratively. And so from the research, which started in 2017, and we're now in 2024, I started to see various different patterns. I was originally looking at testers and their experiences of tools. And one of the things coming out of that was that there was perhaps a lack of clarity about actually who was doing testing. So you've got the title, tester. There's actually lots

of people doing testing. They may or may not have that job title. But even people who have got that job title, they're not all the same. But I hadn't really got the evidence for that. It was kind of hunch and experience. So I ran an industry survey and that went out via online networks and conferences and so on, collected data from open questions. And I was asking people about things like how they did their testing and the approaches they

had, and so on and so forth, the tools they were using. But I was also asking in that survey questions about their hobbies, their backgrounds, to find out about them as people, as opposed to how they were doing their work. And then kind of even starting to line it up, correlating things or finding that they don't correlate. And out of that, I started seeing that testers were from a very wide set of backgrounds. So not just my hunch that

that might be the case, but actually the evidence of it. People coming in from boat building, water studies, international relations, urban planning, artists, and so on and so forth. And a really wide range of hobbies. Arts hobbies, very, oh, I'm kind of losing my track here. A very wide range of hobbies and people with multiple hobbies and so on and so forth. You know, why is that of importance or of interest? Obviously, we're a wide group of people with

a wide interest and all the rest of it. But there's actually a set of stereotypes about who is in IT. And some of those are encapsulated in recruitment and career advice databases. And I came across a paper somebody called McChesney had written, where they were looking

at IT people in general. And they were looking at people who wanted to be in IT and people who actually were in IT and comparing those people with, if you like, the stereotype of an IT person that was contained within recruitment databases and careers advice databases. And what is the stereotype?

Good, thank you. So the stereotype was very much saying that people who are going to be good in IT would be very likely to be people who didn't have very much social activity, weren't interested in the arts, weren't particularly communicative, had perhaps very often one hobby or interest that they were quite mono-fixated on. So quite a nerdy, narrow, particular sort

of person. And so when you look at those careers and recruitment type databases, it's actually saying, oh, if you're this sort of person, you should go into IT. Now when you look at what we want to do, what many people want for IT, there's talk about diversity and inclusion, and talk about whether we're representing the world at large, humanity at large. You know, if you've just got one narrow group of people doing the work, is

that sensible? So McChesney had done that work to look at that. And what they found was a relatively small percentage of the people in their sample who were IT people met that stereotype. There were a wider group of people working in IT. And I'm having an awful old age blank brain moment. I think it was, let's say it was 30%. So maybe 26%, 30%, something like that. Let's talk about it being that kind of amount. So, you know, a significant

proportion met the stereotype, but it wasn't everybody. It wasn't the majority by any means, nowhere near the majority. And when you looked at the people who aspired to be in IT, they were closer to the stereotype. In other words, the recruitment process was narrowing down the people who were going to be encouraged to come into IT or felt they had a right to be in IT. So I took the modelling that McChesney had done with their data and I reapplied it

to my data. So I wasn't expecting to be doing this, but it was kind of like, oh, wow, okay, let's try this. And when I did it with my sample of testers, it was only 6% of them that met that stereotype. And in fact, when you looked at things like the number of people who had arts-related hobbies, it was enormous; a huge number of people had arts-related hobbies. And even within that, the other thing about the IT stereotype was very much about being

quite passive in those interests. I thought, that doesn't fit with what I know. And when you looked at it, so you take something like a love of music as part of those interests. And there were a proportion of people amongst the testers who were talking about listening to music. Okay, that's quite passive, but it's an arts interest. And there were another group of people who were more active. So they were playing music, they were singing in choirs.

That was quite interesting, quite a proportion of those. But then you'd also got people going, oh, I compose music, I write songs. So they're actually really actively engaged and creative. And that was true for a number of those different interests. So that was fascinating to me that

people just weren't, from that point of view, meeting the stereotype. And then when you actually looked at people's backgrounds in terms of what degree they'd done, there were people with arts degrees, with science degrees, with social science degrees, with IT degrees, and also people without degrees at all. So a wide range of backgrounds, and people who'd done a wide range of jobs beforehand. So yes, the original version of

the paper was called From Artists to Urban Planners. And then we actually changed the title later to Breaking Stereotypes. When you looked at the roles that people were taking, I mean, I guess you might think, oh, if they're coming in as arts graduates, they're probably going to be doing more sort of test management-y, maybe some test design, but not so much of the technical roles. 40% of those arts graduates were actually doing test automation, technical

testing roles. When you looked at the IT graduates, around about 40% of them were not doing technical roles at all. So it's kind of like the degree doesn't predict what you're going to end up doing, or your aptitude for doing it, or your enjoyment in doing it. But one of the things that was also interesting, and this was kind of like it was another bit, while I've got this data, I might just try modelling this and seeing, can I find any patterns? So I

was looking at communication styles. And there was a bit of a correlation between people's degree subject and how they expressed themselves when they were talking about test approaches. So the arts graduates wrote very easy to read, well expressed, almost, I'm going to say, essay-like, but storytelling sort of descriptions of what they were doing. And they talked a lot about people and the problems they were solving. The social scientists very much talked

about the organisation as a whole, and teams, and how people were interacting. The science graduates, they made ordered lists, like this is what I do in this order. There's a real structure to it. So it's quite terse, but very informative. The people who didn't have degrees were very much into storytelling, almost like, I've got a glass of beer, and I'm sitting here, and I'm telling you all about it. I happen to be writing it, but that's

what's happening. Now, the IT graduates, loads of technical detail, not very well structured, quite hard to get through, and not very storytelling. And I thought, wow, actually, you know, we need those arts graduates and those social scientists to get us communicating across the organisation and telling our story. And we need those scientists to give us that kind of order and structure. And the IT graduates are bringing in the technical skills that

they've learned, but maybe not those other communication skills. And I was, I'm a computer science graduate, and I know I'm chatting away now, but actually, in my kind of natural state personality, I'm much more over in that solitary kind of Sheldon Cooper side of the

world, you know. But I thought, God, we need those other graduates, we need those people from the other backgrounds, in order to be able to put across, you know, these are the risks we found, here's what we need to communicate to you, what do you need from us? And it leads to very different and diverse perspectives. Yeah, absolutely. And perspectives that reflect the rest of the world. Yeah, yeah. And the basis for this was your work about the test tools.

What were the insights from this diversity of testers for your work? One of the earlier findings that led to that survey was a realisation that, first of all, the testers' lived experience of using those tools. So, their user experience as the users of the tools knocked onto a poor lived experience. So, it's kind of like it was damaging, there's a lot of frustration, a lot of upset. There were some good emotions as well, but it was

very high emotional response. And also a set of findings. So, it's important to do something about this for people's health and welfare at work. But also, I thought usability was going to be the problem. And it kind of was, but not in the way I expected. So, one of the things was that the highest-scoring quality attribute of the tools that people talked about was operability. In other words, can I carry on my workflow? Can I carry out

the tasks to reach the goal that I have as a human being with this tool? And that was the most kind of problematic and the most raised. You infer that from the way people are talking about it. You pull it out from what they've reported. It's not that you've asked them is operability a problem. You've said, describe to me an experience with the tool. And then from that, you pull out what actually the technical and quality attributes

they're talking about. So, it's kind of not directly asked questions. It was all open questions. And also one of the things that was really causing people the most annoyance was tools that had been bought because they had really pretty interfaces. So, superficially, the usability is there, but when you start using it, it's not supporting how you do your testing. And so, out of that came this concept of the illusion of usability. So, I started talking

with my academic supervisors about how do we deal with this? And we started wondering whether a UX or academically an HCI perspective, putting that lens on test tools would help to understand it. And when you do that, you start asking who is this tool for? What are

the characteristics? What are the persona groups that we're dealing with? And as I started investigating that, I initially thought, right, I can set up a framework that's going to help tools designers kind of run this information through and it'll come out and say this is... And it's too hard. There's too many variables. I can't do it. You can't come up with the ideal test tool or even like the framework for helping you design the ideal test tool.

And then I had a meeting with somebody who's an academic at another university, a guy called Hussein Dugan, and he was listening to what I was doing in the work. And he said to me, you know, the contribution could just be a set of heuristics that people consider when they're designing a test tool. Don't try and write a whole framework. Don't try and write a tool for writing tools. Don't try and solve the problem. Go away and come up with what

are the heuristics based on the evidence you've got. So taking all the data, I've come up with 12 heuristics which are couched as questions because they haven't got a single set answer. It's actually around, you need to ask yourself these 12 questions and then where that takes you will tell you how you need to design the tool. And those are now out on a repository.

I'm going through industry case studies with people using them. They've been through several expert reviews and iterations of refinement, and they'll go through a final set of expert reviews and then be published. And my vision for them is that they're out under some sort of Creative Commons license. So everybody in the industry who is designing tools in-house, who is designing open-source-based tools, or vendors or people

who are evaluating tools can pick those heuristics up and use them in any way they want. And to help them think about how am I designing this tool or how am I evaluating this tool. And that's going to be the final product out of the research. And it's just going to be there available for everybody. Also I've realized in designing them it's not like you go through

1 to 12 and you've got the answers. In themselves they're different. I've seen on the different case studies people using them in different contexts, picking out particular ones as being important in this context, picking out a different order to run them in, revisiting them later in the evaluation process or the design process because they realise they've got to go back and revisit a question. So I hope they're really flexible as well. But a tool, if you like: the set of

heuristics is a tool which people have to use with thinking. They're an aid to thinking. They're not a set of answers. So fingers crossed with that. Hoping to get all of that done between now and the end of the year and then really be pushing it out, writing up for my thesis and then publicising them more during next year. Great work. I like it. I read through the questions yesterday. It was really great and

I think also very practical to use. So it's not so high-level, but you can use it in a pragmatic, practical way. I hope so. I mean, if I can ask you, you said you looked at them. For you, was there a particular question in there where you thought, oh, actually, I haven't thought of that before? Or was it confirming, or was it more that there were things in there that sparked something new for you?

I think it was that it's written down. When I think about tool design, it's very intuitive, and now the questions are written down and give a good orientation. Right. Okay. Okay. That's interesting. One question back to the stereotypes. You said that HR, in recruiting, are often very fixed on the stereotypes of IT people. So what do you recommend? How can we soften that up? How can we transform it? Because we need diverse testers and IT people.

I think one of the things is to talk to HR people and talk to recruiters, and I'm hoping that the paper can give people evidence that they can take, not just for recruiting testers but actually for recruiting developers and UX people and whoever else you need: product owners, systems analysts, architects and so on. So I hope the paper gives people things where they can pick it up and go, right, here's a piece of evidence. I can take that.

Also, when I gave this talk at the HUSTEF pre-conference meet-up a couple of weeks ago, what the audience wanted me to do was take it to HR people. So I'm actually just in conversation now with some HR people I know, and I'll talk to them and say, well, where would be a good place to take this information and how to show it.

But again, the paper's out there. So people can pick that up and take it to their own HR department, their own recruitment people, their managers, and start the conversation, I think. Yeah, I think it's a very important part of getting future-proof IT departments and software development and testing. Yeah. Yeah, Isabel, thank you very much for joining the show and telling us about your work here. I think it's very, very interesting, and it opens up the stereotypes we sometimes have in our

head, and with your survey you now give a fact-based evaluation of this. So thank you very much. Oh well, my pleasure. Thank you so much for taking the time to talk to me, because I'm really delighted; I want to get the word out there. So thank you so much for taking the time to talk to me. Thank you too. Thank you. Then have a good time here. Will do. And you. Thanks. Bye now. Bye. (orchestral music)
