In the digital reality, evolution over revolution prevails. The QA approaches and techniques that worked yesterday will fail you tomorrow. So free your mind. The automation cyborg has been sent back in time. TED speaker Jonathan Wright's mission is to help you save the future from bad software. You're listening to the first ever podcast of The QA Lead. Welcome to the show.
The QA Lead is a community for anyone with a passion for quality, for making things work properly, and therefore for breaking them too. We're on a mission to build a QA community exploring the latest and best of automation and testing, including quality in agile, quality at scale, enterprise DevOps of course, and quality in digital, which means we'll be talking about web, mobile, database, web services, cross-browser, big data and the always-painful
regression testing. So whether you're already leading a QA team or just starting out as a QA tester, this is a place for you to connect with your QA tribe. It's a place to be inspired, to be equipped, and to connect with those who are shaping the future of quality assurance. And we want to invite you, dear listener, to join our QA conversation, to discover with us how to lead QA teams and build better software. So keep listening to The QA Lead Podcast to find the tips, tricks and tools you need to learn.
So let's get to the point and learn more about The QA Lead. Let me introduce myself and what this podcast is all about. By way of introduction, I'm Ben Aston, the founder of Black and White Zebra. We're an independent media company based in Vancouver, Canada.
And we make serious, sometimes boring things understandable and a lot more fun. Our mission is to help people and organizations succeed. So that's you, that's your organization. And we do that by creating platforms like theqalead.com that help people get competent, get connected and get confident in their role.
So next up, I want to introduce you to the future host of our podcast, Jonathan Wright. Jonathan is a published author, a TED speaker, and the co-founder and CTO of Digital Assured. He's also the president of Vivit Worldwide's Micro Focus community, among a whole load of other things I think we'll touch on a bit later.
Now Jonathan has got stacks of experience building things and making sure they work properly, with nearly 20 years' experience at companies including Hitachi, George Bank, Lehman Brothers (remember them?), Thomson Reuters, Siemens and Xerox. So Jonathan's going to take on our first ever residency on The QA Lead podcast. So a massive welcome, Jonathan. Hello listeners, it's great to be here.
Good stuff. So we're going to be talking today about the new community we're starting called the QA Lead. We're going to talk to Jonathan about his QA story and then do a bit of a deep dive to get some of his QA insights. But Jonathan, I want to talk to you about the QA Lead, the community for a minute. What's your idea about what we're creating and why do you think it matters?
Well, this I think is going to be the first of a kind for a podcast series that actually focuses on quality. You know, it's very easy for people to get confused between what, say, testing is and what quality is all about. So that's really why we're focusing on this. It's untouched, undiscovered, and, you know, it's the most important thing. We interact with quality every single day and we all have an opinion about it. So there's going to be some great conversations.
A dark, mysterious world. Let's hope we find the light. So I want to dive into some QA basics, maybe for people who are new to QA, new to quality. Let's talk about quality for, I guess, a non-QA person. How would you describe QA and how would you describe quality? Well, there's definitely no one description that covers everything. So I think it's probably worth starting with the idea of, you know, what is quality in your everyday life.
So, you know, you go to a supermarket, you go and buy a product, and you'll notice sometimes you get this kind of little badge to certify that the quality of it is good. And sometimes this might be something like ISO. That's the International Organization for Standardization, who are literally saying, here are some great practices and methodologies that you should really adhere to to make sure that the product is up to a certain level of quality.
And I think that's what's really interesting about this: quality completely changes depending on the person, the price point, and really what you're expecting, what your expectations are of quality. Right. And so why, for you, is that interesting or important? I'm curious because, I mean, obviously you've got lots of technical experience of implementing systems and building things. Why is the quality aspect even interesting to you?
I actually think it's a lost art. You know, it's literally something that's been put away for so many years while we've been talking about doing things faster and getting them to market quicker. And I think we've kind of lost some of that engineering capability where we really take time and care and love for what we're doing. And I think there's a lot of people out there who still agree that quality should be at the front of everything we do.
I want to talk to those people. I want to interview them and have conversations about what quality means to you, the listener. Yeah. No, I think it's fascinating. I think there's been a trend towards more disposability and, yeah, almost planned obsolescence, where we know the things that we buy or the things that we use will break, whereas maybe a couple of generations ago, things wouldn't break.
And you know, you'd still have furniture that's handed down from the generations, that is quality, and it does last and therefore is much more valuable. So thinking about that within the context of, OK, well, how do we, within the, I guess, agile mindset that people tend to be excited about, iterating on things and slowly making things better,
increasing value over time, ask, OK, well, where does quality fit into that? And is quality the final iteration, or is it something that's embedded within those iterations as a product develops over time? Absolutely. And I think, you know, we talk about things like quality is everyone's responsibility, or quality is a culture, or even, you know, quality is at the center of everything that we do.
And I think this is interesting because this is kind of showing that it's no one person's responsibility for quality. However, you know, we're talking about QA leads. We're talking about quality assurance. People who are in a career to make things just better. So where do they fit in? How do they impact the way that quality within an organization as a culture is actually delivered?
And I think this is where I really wanted to start, with a bit of a journey about how I've seen quality change over time in my personal experience. Yeah, I mean, tell us, what is your QA story? How did you end up here? So I think this is really interesting. This is where my passion was. So back in the 90s, I was studying at university doing high-performance computing.
I found myself in the lab one evening and one of my lecturers had come over to talk to me and said, you know, would you be interested in being a tester over the summer holidays? And I thought, oh, okay. So what does a tester do? And he kind of said, well, you know, these guys, they make phones, you know, they do stuff like that, you know, I'm guessing you're going to be testing it.
So my thought initially was that it was some kind of production line: once the phones had finished being made, I was going to sit there and, you know, take a few out of the box and just turn them on and make sure they worked. So that was kind of my initial idea of what testing was. And, you know, that was the first step for me. Apart from all the theory you got out in the academic world, this was my first ever hands-on,
well, is this quality good or is this quality bad, kind of scenario. And was it that? Was it turning on a phone? It wasn't, indeed, but, you know, I was really lucky. I started off in the 90s and I had a summer internship, which then led to being sponsored through my final year and my dissertation, which was all around automation and helping people.
And part of that summer internship taught me so much. So I turned up the first day and they kind of said, here's a whole stack of test scripts, which was a big manual, as thick as you've ever seen, a bit like a phone book used to be.
And they said, you know, there's a thousand tests, and you've got the next six weeks to run all of them against this phone exchange. Now, it wasn't just picking up phones; it was picking up one phone, calling another, hanging up, transferring, putting it on the switchboard, trying all these different combinations to try and break the phones. Now, if anyone's used to a desk phone, you kind of expect that it just works, right? You know, you pick it up, you dial a number, no problems.
And these were the days of voice over IP. So this was the early days where, instead of making a phone call down some copper lines, this was going over the internet. So yes, it did break a little bit more. But those six weeks taught me a whole stack of really hard lessons. I was the first one in. I was excited. I was enthusiastic. And I tested every single one of those, all 1,000, 2,000 scenarios. Do you want to know how many bugs I found? Spoilers.
Zero. So I was the most upset I've ever been in my entire life, right? And, tail between my legs, I went over to my boss. And I said, look, I'm not joking, there's a tick on every single page. You know, I have tested the one where you have 10 people on a call, one person hangs up and then reconnects. And I've tried every combination. And he said, I know you have; we've been running these for the last 10, 20 years.
And they've never failed, because we build products that last. And it's a PABX, a switchboard; it's expected to work, right? We engineer products that will work and last forever. So exactly what you said, you know, stands true.
They build stuff that's solid. You go and pick up your desk phone in your office and it works, right? It doesn't crash on you. It doesn't blue-screen. People have got to the point where they understand that certain things are going to be incredibly reliable. They're bomb-proof.
But for me, who was just starting out and spending those long days, to then be told that in another six weeks (I think you mentioned it in the intro) I was going to have to do another regression when they patched the system again, was the most demoralizing thing I've ever done in my life.
And thankfully, this gave me my first flavor of doing automation. I say automation because this was actually a machine: they pulled in a machine called an Abacus, all ones and zeros, and I programmed it to make phone calls, directly interfacing with the PABX: phone A calls phone B, the call hangs up, phone A calls phone C.
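To make that concrete, here is a minimal sketch of what a scripted call scenario like that might look like in modern terms. The PabxClient class and its methods are hypothetical stand-ins for whatever interface the Abacus rig actually exposed; this is an illustration of the idea, not the original code.

```python
# A minimal sketch of scripted call-scenario automation against a PABX.
# PabxClient is a hypothetical stand-in for the real call-generator interface.

class PabxClient:
    """Fake driver for a PABX test rig; real hardware would replace this."""

    def call(self, caller: str, callee: str) -> bool:
        print(f"{caller} calls {callee}")
        return True  # assume the exchange connects the call

    def hang_up(self, extension: str) -> None:
        print(f"{extension} hangs up")


def run_pairwise_call_scenarios(pabx: PabxClient, extensions: list[str]) -> list[str]:
    """Ring every pair of extensions and collect any calls that fail to connect."""
    failures = []
    for caller in extensions:
        for callee in extensions:
            if caller == callee:
                continue
            if not pabx.call(caller, callee):
                failures.append(f"{caller} -> {callee} did not connect")
            pabx.hang_up(caller)
    return failures


if __name__ == "__main__":
    print(run_pairwise_call_scenarios(PabxClient(), ["1001", "1002", "1003"]))
```

The point of the machine was exactly this: the same exhaustive combinations a student would grind through by hand become a loop that can run overnight, every night.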
And there was no poor student who had to do it again. So I can see why going faster and automating things, especially manual repetitive tasks which don't really add much value, makes a lot of sense. But at the same time, it's essential that they can, you know, at least gauge the level of quality. And Siemens, back in those days, were one of those brands, a bit like a German car manufacturer, that
you just associate with quality. You don't know why you associate quality with it, because unless you own one or you've had experience with one, you've just got this feeling that the German product is going to be better engineered and the quality is going to be higher. So, you know, where does that come from? How do we, in our psyche, decide that a product's been better engineered because it's German?
Yeah, no, that was my opinion about German things until I bought a Mercedes and it always breaks. So that's the whole thing, isn't it: the perception. It's the perception that the product you're going to own is going to be high quality. And, you know, you're also expecting that when you take it in for a service it is going to be expensive, right? Because quality is expensive.
You know, that is the perception you've got. You're not going to go in and go, wow, that was the cheapest servicing I've ever had on any car ever. There's literally a balance between speed, quality and cost, right? And I think that's where the balance is. And, you know, you make a choice of whether or not you buy a premium brand over another brand because you assume there's a level of quality.
You know, the same goes if you're buying, say, beauty products, right? A cheaper one may have the same ingredients, but not have the advertising. It might not have the logo or a famous celebrity endorsing it, but, you know, it's the same thing. So why do people still buy those really expensive products? It's because, you know, it's the quality.
The packaging, everything, is what Apple does so well. You know, when the concept behind Apple came to us, it was not just the experience of owning the phone and the phone being a music device and everything else. It was opening the box. It was everything looking and feeling so well built and well designed. And I think this is where we've lost some of that art, because we're looking at getting things out cheaper and faster.
And that's what I want to rekindle. Yeah, and I think what you're touching on is really interesting: quality being something that you feel, an emotional response.
Rather than quality being something that doesn't break. In your example, where you're talking about the phone system, quality was that it didn't break. Whereas with the quality that you're talking about with Apple, well, we all know that the battery life degrades and they slow down the processor when they're doing upgrades and things like that.
But we still associate it with quality: because it's Apple, because of how it makes us feel, because it's expensive and we paid a lot for it, because it was packaged nicely and we're told that this is the best phone you can get. It's a conundrum.
It is. And I think this is where I kind of want to start helping people understand, well, what are the steps you can potentially take to understand some of these methodologies and toolchains where you can look at how you create better quality. And I think the easiest way to start with this (and, you know, especially if you listen to The Digital Project Manager as well) is that we talk about things like waterfall.
And for you guys who don't know what waterfall is, this is sequential design. So this is the idea where you start with an idea. And you could be Steve Jobs saying I want to make the best mobile phone ever. You know, he doesn't care what processor it has or how much memory it's got; he's just got a vision, right. And that vision has a set of requirements that then need to be developed, delivered, tested, built, and then delivered to the end customer, right.
And, you know, I worked for Apple out in Silicon Valley. And I remember going to Stanford to do a guest lecture, and I went past the old Hewlett-Packard garage. Now this was a great experience, because Hewlett and Packard started off doing hardware, a bit like what we talked about with the telephone exchange. They used to do scientific calculators.
Now, if you ever hear the term Mark 1, Mark 2, Mark 3, you know, the idea was that they would iterate. So Hewlett-Packard's vision was build and test, build and test: iteration one, two, three, Mark 3. You know, they'd release a new calculator every couple of months.
Now we could call this agile because, you know, it's iterative, it's improving on the design. However, you know, it's also engineering. And it may have been sequential design, but it was also, well, actually, isn't something missing from the last one? We'll feed that back and create a better product. A bit like the iPhone generations, every iPhone: 4, 5, 6, 7, 8, and the SE now, I suppose. And, you know, the idea is that each iteration becomes a better product, better quality.
And maybe that's where it's fallen short, in a sense: the form factor, the phones, all look very similar. They do the same kind of thing. There's a perceived quality of the actual product. But going back to that idea which, you know, they had at the start, it was really a powerful message about how they were going to change the world.
And I think there was a certain quality they were looking for, and part of it was building something that would change people's lives, but also would be part of everyone's everyday life. And I think this is where we've got this new kind of challenge around how we deliver quality products to market, not every two years, not every year, but actually extremely quickly. And I think this is a challenge which we're going to address during the series.
Yeah, sounds fun. So let's go back to you in the phone factory doing your testing. You've created some automated tests. Now, you were lucky to inherit all those test scripts, which gave you some guidance and allowed you to build some automation into the testing. And that was kind of a great starting point for you. But for someone, maybe, who's new to QA, who's thinking, hey, this sounds fun.
I like making stuff work better, breaking things, finding out what's wrong with them. What type of certification, books or training would you recommend to people new to the industry, or to people already in the industry who just want to get better? Sure. So back in the 90s, I had what I called the Bible for automation.
It was a book called Software Test Automation, and it was written by a lady called Dorothy Graham. Now Dorothy is known in the industry. She's a good friend of mine. And she's just retired after doing her hundredth keynote on quality and testing. So she's definitely a thought leader in this space.
And this book, which she'd first released, was talking about automation and what she'd been doing for the previous 20 years. So she's known as the grandmother of automation. And she worked for Bell Labs, so another communications company, back at the end of the 60s into the 70s. And she was known as the first person to ever write any automation against systems.
And reading that book, obviously, there was a wealth of knowledge that I applied to my role. But one of the things that I was lucky enough to do was reach out to Dorothy and say, you know, I've got questions; who would you recommend as the best people to talk to?
She was based in the US at that time, and she came back with, well, I know a couple of good people who are in the UK, and I know a couple of good people in Germany; why don't you reach out to these guys? So I reached out to them, and I found myself a mentor, who was actually based in the States and also worked at Siemens.
And I was lucky enough to be able to learn from some of the experiences and lessons he'd had in automation over the previous 10 years. He'd come up with this idea of having graduates like myself, at the time, writing these test scripts during the day. And then, because of the time difference, he would fix what they called the framework in those days,
you know, the engine that got things running, and then he'd execute them overnight. So we had this kind of 24/7 automated testing across three different global locations. And that was amazing at the time. And it was interesting because, when he came over to do the training with the team,
he said to me, there's going to be one word that I'm going to say to you, and that one word is going to change your life. And, yeah, I don't want to build it up too much, but the word was: you've got to want to win, right? And the idea was you have to want to make it a success. And back then it was very hard. You know, there were no online resources.
There were very few books. There was no online training. It was literally: you either learned it from reading the old books, or from papers people had published on different sites. And, you know, you were boldly going where nobody had gone before. And he said to me, you know, if you ever get to a point where you're stuck, instead of waiting, it doesn't matter if I'm in a meeting with the CEO, come in, grab me.
And let's work on it together. And I think part of that mentality, the community side of things, was: how do we build up a community where we can rely on each other to solve these kinds of problems? Obviously things have moved on. And as they've moved on, we've got a whole wealth of good books and training around it. You know, it's very easy. You can jump onto something like Udemy and learn a course on automation in a few hours.
You know, there are books out there. I've done a couple of books myself. My first book was a cookbook. The idea behind it was that it was all reusable recipes of different automation code that I'd written over the years. And the idea was you could just apply that pattern or recipe to what you were trying to do. But that was all about the idea of how do we, you know, share and reuse.
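As a rough illustration of that recipe idea (not code from the book itself), a reusable automation recipe is really just a parameterised pattern you can drop into different contexts; every name below is invented for the sketch.

```python
# Illustration of the "recipe" idea: a reusable, parameterised automation
# pattern rather than a one-off script. All names here are hypothetical.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Recipe:
    name: str
    steps: list[Callable[[dict], None]]  # each step reads/writes shared context

    def run(self, context: dict) -> None:
        for step in self.steps:
            step(context)

# A "login then search" recipe can be reused across products by swapping
# only the context (URLs, credentials, queries), not the steps themselves.
login_then_search = Recipe(
    name="login-then-search",
    steps=[
        lambda ctx: print(f"log in to {ctx['url']} as {ctx['user']}"),
        lambda ctx: print(f"search for {ctx['query']}"),
    ],
)

login_then_search.run({"url": "https://example.test", "user": "qa", "query": "blue shoes"})
```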
And this is kind of what we're trying to do with The QA Lead: how do we point people in the right direction so that they can both contribute and get something back out of it? I'm actually still looking at the ISO website, because I was on the shadow committee for the new ISO standard for software engineering, which I'm not going to bore anybody with, but it's 29119.
I'm actually just reviewing it at the moment and they've just done Part 11, so another addition on there. And that's now covering testing AI systems. You know, there's a section here on DevOps and how you test DevOps systems. Now anyone can look at these kinds of standards and be part of them and, you know, help contribute to them.
But also, you know, reach out to that whole community of people who are happy to talk and exchange their views and experiences around the quality engineering space. Yeah, definitely. And you mentioned earlier The Digital Project Manager; another one of the platforms that we've developed is thedigitalprojectmanager.com.
And it's a similar kind of story in terms of what you're talking about, in terms of the power of community. And I think that's why I get so excited about launching a new platform and building a new community, because there is incredible power in connecting people. Through that connection people can get more competent, and through that they become more confident, and that's what helps people and organizations succeed. So I love this idea of bringing people together.
And it's about together writing a playbook, rather than necessarily always going back to the textbook. It's about sharing with one another and seeing this playbook, those recipes you can use, evolve over time as we have new requirements to test things in different ways with different technologies.
But I'm curious, from your perspective, how has your view of quality, and to that degree testing, changed from, you know, your days testing phones to what you're talking about now, enterprise DevOps? How has your perspective changed or evolved over time? Great question. So, you know, I think everyone's got their journey through QA and testing, and that's what I want to hear.
So anyone listening to this podcast, you know, get in touch, reach out, let's have a conversation. I'm the easiest person to find on LinkedIn because I'm literally linkedin.com/in/automation. You can send me a message, get involved, reach out and share your experience. But, you know, we talked about starting off in the 90s, starting off with this kind of waterfall approach; it was all around quality engineering.
It was about building the best product we could possibly do. And I think I mentioned in the intro there about Lehman Brothers, right? Things had slightly moved on by then, and we saw the damage that doing things wrong can actually do to everyone; every person who's listening was probably impacted by that. So, I was at Lehman's in 2008. I was there on the day of the crash, walking out of the building with my box.
And my job there was automation. I had a framework, you know, this playbook which you're talking about, with my recipes and my patterns. And the idea was we were able to scale things. So, in this case, I was onboarding loans, surprise surprise, from one system to another system.
And there was a certain level of, well, as you're doing that (it could be called RPA now, robotic process automation), we need to make sure you don't miss a zero off.
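The kind of guard that catches that missing-zero problem is a simple source-versus-target reconciliation. Here is a minimal sketch under assumed, invented field names; it is only meant to show the shape of the check, not the system used at the bank.

```python
# Sketch of a source-vs-target reconciliation check for a data migration,
# the kind of guard that catches a dropped zero. Field names are invented.

def reconcile_loans(source: list[dict], target: list[dict]) -> list[str]:
    """Compare loan amounts by ID between two systems and report mismatches."""
    target_by_id = {loan["id"]: loan for loan in target}
    problems = []
    for loan in source:
        migrated = target_by_id.get(loan["id"])
        if migrated is None:
            problems.append(f"loan {loan['id']} missing from target")
        elif migrated["amount"] != loan["amount"]:
            problems.append(
                f"loan {loan['id']}: {loan['amount']} became {migrated['amount']}"
            )
    return problems

source = [{"id": "L1", "amount": 1_000_000}]
target = [{"id": "L1", "amount": 100_000}]   # a zero went missing
print(reconcile_loans(source, target))        # -> ["loan L1: 1000000 became 100000"]
```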
Because if you do miss a zero off, you could potentially cause a massive financial problem; not that I was responsible for that day. But what I'm kind of saying is, part of it was that the Agile movement had kind of moved us on, the Agile Manifesto being back in 2001.
I guess that's why we're talking about the quality manifesto here with the people who are listening. Back in 2001, you know, they created a few items which they believed would help guide people to the Agile movement. And, you know, things have moved on. One of the things that hasn't changed is the manifesto.
And those guys who all met up skiing in 2001, part of it is they were the best guys in the industry to put their thoughts into something that would help everybody else. We're trying to do exactly the same here. We're trying to say, how do we bring some of that quality back? How do we, the future of quality assurance, help create a quality manifesto that's lightweight enough to allow people to actually put some
rigor around it, so you don't have another Lehman Brothers problem, right? But this is the thing. I must have been a glutton for punishment because, a few years later, after leaving, I'd moved to New Zealand and then Australia, where I was working with Microsoft, and then I came back and started working for another bank, a large bank called Deutsche Bank, and they were, again, trying to do this but at great scale.
So kind of what you were talking about there, scaled agile: they wanted quality as a culture. And it's a great German bank, so there were loads of great standards, loads of great good practice out there.
And part of it was, how do we do this at a huge scale? Back then there were 15,000 different products, so it was a massive landscape, anything from derivatives platforms onwards, and therefore we were having to coordinate 7,000 teams across lots of different locations and try to get them all to build better, higher-quality software.
And of course, it's a bank. The last thing you want is for it to go down, right. So how do you do that? How do you actually understand which ones need more investment when you're looking at those large landscapes? And I think this was my first real opportunity to understand what business assurance was.
So we talk about quality assurance, but this is: how do we assure that the business can operate, so it doesn't become another Lehman's, right. And that means you need to understand the impact. I remember seeing this horrible spreadsheet which had, literally, what would 20 minutes of downtime cost for this fintech platform in this country.
How many products, how much loss of revenue would we take? You couldn't do sequential development, because you had to be able to release fixes quickly. So we had this whole shift towards having to think about DevOps. And that's a huge challenge. I think most of the listeners out there may have had experience in large organizations, small organizations.
Your unicorns or your startups; they may have gone to medium-sized businesses which have a certain level of maturity around quality. And everywhere they go, there's a different viewpoint on quality. And I think this is why it's so exciting to be doing this, because this is somewhere that no one's really gone before. And we've got next-generation technologies coming through. And I think you also mentioned the work that I've done with Hitachi.
So this was the next generation of, well, how do you test things like smart cities, right? How do you deal with testing things like Car-to-X? So, you know, people are used to autonomous cars, right? But the idea behind an autonomous car is that it uses the cameras and the internal systems to make decisions.
So yes, it's using computer vision to recognize if a pedestrian is stepping out, but it's autonomous; it makes its decisions on its own. Whereas Car-to-X is a standard around how cars communicate, so that they can increase the flow of traffic. If you're about to crash into another car, it will move out of the way. You know, it interacts with the infrastructure. So in Germany already, if you've got an Audi, you can actually go up to a traffic light.
And the cameras will look at the traffic light, wait for it to go from red to green, and then allow you to go, so you could keep your foot flat down if you really wanted to. But that's not infrastructure-to-X. Infrastructure-to-X is where the smart city will allow all the cars to talk to the traffic lights and make a decision about when it's green and safe to go.
You know, these are really complex systems, and how do you assure quality when it could potentially affect people's lives? Everyone always talks about the most difficult question of, well, what happens if there's an autonomous car, and on the left-hand side there's a motorcyclist not wearing a helmet, and on the right-hand side is an SUV with a child-on-board sticker; which way does the car go?
Now, you know, if I'm testing that scenario, well, what do I test? And the answer is, the autonomous system at the moment isn't clever enough to make that kind of decision. It will make a decision based on the amount of space that it's got, where it thinks the least amount of damage can be caused.
So it doesn't have the ability to make the decisions we do as humans, and we build these systems in our image. So how do we ensure that it makes the right decisions, but also that the quality is high enough that we don't have accidents, we don't have fatalities, and we move things forward for the greater good? And, you know, we'll work together to make that happen.
And I think that is what's going to drive us, and it's going to be a whole new world of products in the future. And we need to be involved in that process from day one, all the way through to when that product's live and it's driving itself.
Yeah, I mean, we've touched on this a bit before, but I want to just deep-dive for a second into your perspectives on quality in agile, because I think the topics that you're raising, in terms of quality in smart cities or quality in infrastructure and these kinds of critical systems.
We have that on the one hand, and then on the other hand we have this excitement and appetite for agile and doing things iteratively and accepting the fact that they're not perfect, but that they're evolving over time. Some of these things are opposing one another: when we're thinking about a smart city, some level of iteration is acceptable, but then if it's a critical system, a banking system,
there just isn't that tolerance for mistakes, or for bugs that are going to cause millions of dollars to just disappear or be transferred to the wrong place. So tell me, what's your view about quality in agile? I mean, you talked about having this quality manifesto, but what is the future of quality in agile, in your opinion? So yeah, I think, you know, this is something we're obviously going to explore in much more detail.
I think the smart cities example you gave is a great one. So the smart city project that I worked on was for Copenhagen. The idea was that they wanted to become carbon neutral by 2020. They wanted to create a citywide data exchange where people could put in information coming from cameras, from footfall, from weather or traffic or whatever else it may be.
So, you know, could we have launched an application on day one and then iterated on that product? And the answer there is probably no. So this calls for new patterns. The approaches and methodologies that may have worked for, say, web development or creating a website may not work in the future.
So one of the challenges we had at that point in time was that we had to create a smart app, and that app would recommend to people the most carbon-neutral way to get to work. It was a really good idea because it was community-driven: the citizens of Copenhagen wanted to adopt a cleaner way of living, they wanted to reduce traffic, they wanted to make a real difference.
So we created a platform which, a bit like Uber or Google Maps, would suggest: take the bike, take this bus, take this tram, take this underground. You know, this is the best way to do it and reduce your footprint. But we also wanted to have gamification, right? We wanted to encourage people to use it.
So the city of Copenhagen came up with a great idea: businesses would all come together, a bit like the company that you run; your team would come together to compete against other companies in Vancouver, for instance, to get a better carbon footprint. And then you'd get tax relief, which would allow you to buy your employees bikes and give them tram passes and all sorts of stuff like that.
But how do we possibly do that and launch the application from day one? I remember seeing the requirements, which said: day one, 38,000 users is what we're expecting. How can we make sure that system works? So we had to be able to at least test, or prove, the quality of the product before we ever went into the wild.
So how do you get historical information to enable gamification? How do you get the historical map information for the different journeys through the application? What we did is we actually reached out to Copenhagen University and said, guys, can you install this GPX logger, which tracks where you go around the city, and in exchange we'll give you beer, right?
Free beer. So the idea was, you know, they would just turn on the tracker. Yeah, it used a bit of their battery, but they would upload the information to us.
And so we had all these historical journeys, right? And we used those as the delta to then generate billions of possibilities: people walking around, going on the tube, not having any signal for a bit, coming out the southern end, walking at two miles an hour, walking at three miles an hour, going in different directions; all this historical information so that we could prove the quality of the product.
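As a rough, purely illustrative sketch of that idea: you can expand recorded journeys into many synthetic variants by perturbing speed and simulating signal dropout. The structure and parameters below are assumptions for the sketch, not the actual Copenhagen implementation.

```python
# Rough sketch of turning recorded GPX journeys into many synthetic variants
# (vary walking speed, drop GPS signal underground) to drive load and
# scenario testing before launch. Purely illustrative.

import random

def synthesise_journeys(recorded: list[list[tuple[float, float]]],
                        variants_per_journey: int = 1000) -> list[dict]:
    """Expand each recorded track into many perturbed variants."""
    synthetic = []
    for track in recorded:
        for _ in range(variants_per_journey):
            speed_factor = random.uniform(0.6, 1.5)       # 2 mph vs 3 mph walkers
            signal_gap = random.random() < 0.3            # e.g. a stretch on the tube
            points = track[::2] if signal_gap else track  # thin points to mimic dropout
            synthetic.append({
                "points": points,
                "speed_factor": speed_factor,
                "lost_signal": signal_gap,
            })
    return synthetic

# One short Copenhagen-ish track expanded into three variants.
journeys = synthesise_journeys([[(55.676, 12.568), (55.680, 12.570), (55.684, 12.575)]], 3)
print(len(journeys), journeys[0]["speed_factor"])
```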
We also had to prove that it would scale, in terms of the number of users that would be needed, and also that the recommendations it was making, the gamification, held up. And this required feedback from the real world, emulating a real world. And, to talk a little bit about my TED talk for a second, this is where I think the Agile Manifesto is missing something. We talked originally about
the Hewlett-Packard approach of build and test, build and test, build and test, right. And in agile, you kind of start off with maybe some ceremonies, so you do your sprint planning, and then at the end you might do some retrospectives, and you may be doing something like scaled agile, something like SAFe or DAD or LeSS, so Disciplined Agile Delivery,
Scrum of Scrums, or whatever that is; you've got that feedback loop to say, well, these are the things that went really well, these are the things that didn't go well. But they're sometimes team-based, sometimes PI-based, or at the level above, your features or your capabilities. And quality is a really difficult thing to deal with when there are different teams with different skills and different methodologies, all using different tooling.
You know, it's that culture aspect of being able to get people to understand what the real quality was, because there could be 50 teams all building something that comes together to represent, say, some smart city app. So the idea with the TED talk was to say, okay, let's break this, let's move this paradigm a little bit. Let's talk about thinking as one activity.
So that's the before stage: well, what are we trying to do? What are we trying to build? In the old days, that might have been called requirements engineering. But now that could be a story, that could be an executable specification.
That could be a vision. That could be some kind of charter of what you want to do as an organization. Whatever it is, that's the thinking aspect, where you're bringing people together to make that happen, right? It's not one person from, say, marketing saying we're going to launch a new type of pizza because, you know, it'll increase sales.
It's literally something which is done collaboratively. And then again, that evolution-over-revolution approach to things is: how do we then evolve the idea? What humans are great at doing is this whole build and test, build and test, kind of iterative approach, which the Agile Manifesto really supports.
And then I think there's the third section of the paradigm, which is the learning aspect. So this is when we start applying machine learning, artificial intelligence, the kind of cognitive engineering that I talk about quite a lot: how do we learn from what's actually happening in the real world? So take the example of the mobile app: once it's gone live, how do we understand how the actual users are using the application?
How do we know when they're having problems? How do we improve it? How do we feed back what we know about the behavior of the users in the real world? And how do we feed that back into the thinking process, so we can continuously improve the quality of that product going forwards?
And I think that's the whole point: it's not build, deploy, and then maintain in production and fix the bugs. And it's not just the ability of DevOps to release 25 times a day. It's about the feedback that comes from those 25 releases per day, to understand how that should change or pivot the design of the system.
And that's this next generation of challenges and opportunities, but it's also about quality, because those people are using your product right now. They have that perception of the quality based on what's happening in the real world. And we need to get closer to the real world to understand, well, what are the actual users doing? And I think this is the exciting shift: understanding the real world is where we're going to understand quality.
Yeah, that reminds me of something like conversion optimization, where we look to see what people do, what they interact with. I'm using marketing terms now, but we start doing A/B tests, we start using heat maps to see where people are going, what they click on. And then we use that intelligence to optimize the experience. And thinking of that through that kind of lens of quality is asking, OK, well,
how does that impact the way that we actually design this? And how can we iterate in a way that makes more sense, using quality as that, I guess, cornerstone, so that we can continue to develop things responsibly and effectively, rather than just being blind about it? And using data, I think, is really the future of how we can develop within this kind of agile context. So I think that's an exciting thing to think about.
I'm curious, Jonathan, what are the kinds of projects that you're working on right now? What are some of the challenges that you're dealing with? Well, you know, that was beautifully said, and actually it leads nicely into what I'm doing at the moment. I've just finished for the day, but I've been helping one of the fastest-growing fashion retailers in the UK and Europe.
And it's exactly what you just said; all of the stuff you're talking about is exactly what we're looking at doing now. So let's just take the thinking aspect for a second, right? We've got a very passionate CEO, and she's got a vision for what this new mobile app is going to be. And they wanted to try something completely different. So first of all, it was a bit like a dating app: you could swipe if you loved it or liked it.
And then we'd use that to be able to make recommendations of the styles and products that you want to buy. So we've come up with this killer app idea, but we need to validate it, right? We've got choice A, which is we go into some kind of development lifecycle and we build the product. But we've decided that actually there are better ways to do it, right?
So, exactly as you said, let's quickly get the vision up there and work out what this app would look like, right? There are loads of tools out there; I'm going to give a shout-out to InVision, which is this free tool where you can upload screenshots and it turns them into an app. So the actual end user feels like they're actually using the mobile app. But all this is just images, right?
This is wireframes, this is sketches; this is literally nothing more than a UX designer cutting some concepts up, right? And we use a product called Maze, which again is free, which allows the internal influencers to use it. So, you know, we're in the fashion industry, and she's got friends in the fashion industry, so she can literally pass it out to them. And to them it's like a mobile app that's working: they can go through, they can order stuff.
They can buy something, return something. And there's no code being written yet. But at the end of each experience they're given a mission: go buy something, or go and look for a product that you like. And at the end of it we get that feedback, a bit like crowd testing. We're getting that feedback to say, yeah, I liked that.
But, you know, I didn't really like how it felt when I clicked the like button; I didn't like the fact that I had to fill in all this extra detail; I just want a one-click-to-pay solution. So we're getting quick feedback. This is the design phase: we've not even passed this to a developer, and we're already getting feedback on the different journeys you can take through the application and what the end users are feeling.
What the influencers say, what the internal fashion experts are saying; and we can go to the crowd and actually get people in the right demographics, based on what your Google Analytics is telling you. You've got your gender split; you could even get down to whether they've got family, dogs, cats. You can get down to that level without ever releasing a product or even writing a line of code.
So, you know, to me, that's exciting. And we can actually automate that: we're using image-based testing capabilities and visual testing techniques to automate the flows, so we can actually drive it with different scenarios, with different combinations, and test it. So there's a level of quality to a product that hasn't had a single line of code written.
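As a very small sketch of the visual-testing idea (not the actual tooling used on the project), here is a screenshot-versus-baseline comparison using the Pillow imaging library; the file paths and tolerance are placeholders.

```python
# Tiny sketch of image-based checking for a click-through prototype:
# compare the screen captured at each step against an approved baseline.
# Paths and tolerance are placeholders; uses the Pillow imaging library.

from PIL import Image, ImageChops

def screens_match(baseline_path: str, actual_path: str, tolerance: int = 0) -> bool:
    """True if the two screenshots are identical within the given tolerance."""
    baseline = Image.open(baseline_path).convert("RGB")
    actual = Image.open(actual_path).convert("RGB").resize(baseline.size)
    diff = ImageChops.difference(baseline, actual)
    extrema = diff.getextrema()  # per-channel (min, max) pixel differences
    return all(channel_max <= tolerance for _, channel_max in extrema)

# Example: fail the "checkout" step of the prototype flow if it no longer
# matches the design the influencers approved.
# assert screens_match("baselines/checkout.png", "runs/latest/checkout.png", tolerance=8)
```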
So that is extremely exciting. And then on the opposite side of it, let's say we did launch this application into the wild; we did put it onto the app store and you downloaded it, right? What we're also doing, on the right-hand side, is something called digital experience analytics. This is looking at every journey that you take on your phone.
And it's creating a DNA, a kind of footprint, of what your activity is and where you're going: how long you're spending on a particular screen, how long you're looking at a particular banner or image or product; what is your user behavior. But not only that, we're looking at every single person's user experience, or digital experience, across all of our million customers, right.
So every day we get new views of the behavior, like you said, of where people are going, where they're dropping off, where they're bouncing. And yes, none of this is new. But we're introducing things like what I've started to coin as 'dark canary'. The idea is it's dark launching, which is this ability to launch a feature hidden from the user base that appears to a certain number of users, an A/B kind of approach.
But it's got that kind of canary rollout aspect, where you can start with 10% of people and then ramp them up, or you can even target them. You can say, well, actually, we'll get these early adopters with these kinds of buying styles to validate this idea or concept.
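As a hedged sketch, the dark canary idea boils down to a feature flag that combines a deterministic percentage ramp with targeting rules; the flag name, segment and attributes below are invented for illustration.

```python
# Sketch of a "dark canary" flag: a feature is dark-launched to a percentage
# of users, optionally targeted at a segment, and ramped up over time.
# Flag, segment and attribute names are invented for illustration.

import hashlib

def in_dark_canary(user_id: str, flag: str, percent: int,
                   segment: str | None = None,
                   user_segment: str | None = None) -> bool:
    """Deterministically bucket a user; the same user always gets the same answer."""
    if segment is not None and user_segment != segment:
        return False  # targeting rule: only e.g. "early_adopter" users qualify
    digest = hashlib.sha256(f"{flag}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 100
    return bucket < percent

# Start at 10% of early adopters, then ramp the percentage up release by release.
print(in_dark_canary("user-42", "new-checkout", 10,
                     segment="early_adopter", user_segment="early_adopter"))
```

Hashing the user ID keeps the bucket stable, so a given user keeps (or keeps not) seeing the hidden feature as the percentage ramps up.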
So there's so much more we can do today, and quality comes in at every single point: not only in the initial designs of something, but in the actual operational state of the app, and how you can improve the quality of that by learning what those digital experiences are.
And looking at how you can remove the drop-off from that. So, you know, these are all exciting things that I know we're going to talk about. I just wanted to give people an idea that this is going to be really exciting. We've got some really well-known industry experts lined up for you who talk about this kind of stuff.
And I think it's going to really inspire you, but it'll also change your view: how do I question this, and how do I actually put this into practice at work? How do I get access to Google Analytics so I can see what my users are doing? How do I ask the design team how I can get in earlier, to look at the wireframes and the mockups of the product before anyone starts building it?
It will change your behaviors, which will help you drive quality across your organization, and help you get involved and learn new techniques, new patterns that will hopefully change the world. That sounds good. And I think, for me, that is what is really exciting about this podcast: we want to go beyond the theory, go beyond the things that people are doing, and actually explore the bleeding edge of quality and what that looks like.
And really, hopefully, lift the lid on new techniques, new approaches, new tools, different ways to use tools, and really build this community around developing this quality manifesto, this playbook for delivering quality over the next however many years, as we see the industry evolving, as we see new technologies emerging and requiring a different way and approach to quality and testing.
In this world of enterprise DevOps, of quality at scale, quality within an agile context, I think it's going to be super exciting. But just as we close, as we think about how this community is going to be different from your perspective, what's your view on this? What is going to be unique about The QA Lead? Sure, so there are loads of testing communities out there. I could list 10, a few of which I'm involved with.
And I think they're all very much focused on providing valuable resources for testing. I think this is where we take it a step up: we're looking at quality, and this is something which has never been done before, because it is so difficult to define and it's difficult to find materials around it. And that's why I think we can really help: we can provide these blueprints, these playbooks, some templates of things you can think about, resources, training, things that you can go off and do.
Stuff that you can learn to help you get better, books, references. And that's why I'm going to play a clip in a second, about this whole, I think, lost art of quality assurance. There's no need for you to listen to this podcast if you can completely define what quality means for you, completely define what quality is for all of your customers and for your organisation.
If so, then, yeah, you've probably nailed it. But for the rest of us, including myself, I just think it's a bit like the Matrix: nobody knows what it is, but we know that it's everywhere, and we need to be able to understand it better. So I'm going to play this clip, and hopefully people will think, yeah, this is an unknown and I want to learn more; and that's why you should subscribe.
Let me tell you why you're here. You're here because you know something. What you know you can't explain, but you feel it. You've felt it your entire life, that there's something wrong with the world. You don't know what it is, but it's there. Awesome. Thank you, Jonathan, so much for joining us. It's been great having you with us. It's a pleasure, and I look forward to some of the great content we've got lined up here.
Yeah, so we have got some episodes that are going to be rolling out over the next few weeks. So come back and make sure that you subscribe. Tell us what you think about today's podcast and come and join our gang. Comment on the post, head to theqalead.com to register to join our soon-to-be-launched forum, where you'll find all kinds of interesting conversations going on about all things quality and quality assurance. But until next time, thank you so much for listening.