¶ Intro
Hi everyone. My name is Patrick Akil and in this episode we cover how security has evolved throughout the years, how to build resilient organizations and software, and how much of A people component is actually there in the field. Joining me today is Jelen Nimos Fedritt. He has one of the coolest titles I have ever seen, National Security Officer over at Microsoft and this was a blast of an episode. Enjoy. I was wondering because I really
¶ Security is a matter of software quality
want to touch on kind of security how it is nowadays. But also I'm very curious since you've been in the field for a long, long time, how you've seen security evolve. Because I I have an opinion and but my opinion is kind of from the last let's say 6-7 years when I've been in this field and your experience kind of goes even even before that.
I feel like security is less of a hurdle than I feel like people had this notion of, let's say, in the previous few years, and it's more so seamlessly integrated, but it's still like a core, core component and I don't think it's ever going to go away as a discipline. Yeah well to maybe to immediately pick up on that last one that's that's I don't think it should go away as a as an area to focus on.
I'm I'm always a little bit playing with the idea on whether it should be a separate discipline or not or if it actually should be more a matter of, yeah, quality almost 'cause right now it still feels a little bit and and I agree with you, it has changed a lot in in recent years. But it still feels a little bit like there is this, yeah, it it is in many cases still an afterthought.
It is still we built something and then there's other people who look at securing the thing or it is is something like you we deploy something in our organization for our colleagues to use. And then there's a security team that tells you that there is one way to use it, which is not recommended because there's another way which is the secure way to use it. And then they're going to do awareness training to teach you and send you fake phishing
emails to test you. And yeah, I'd I'd like to think that that should ultimately be more integrated, more notion of, yeah, we're going to build something and it needs to be secure out-of-the-box. So it doesn't really make sense that that's a separate team, but yeah, really getting those disciplines more closer together and more entangled, so, so to say. I think that makes sense.
¶ Security way of working
Do you think it's, it's always been kind of how it's grown organically where people would build stuff and people would focus on making it secure and that's kind of how it's grown and now maybe they're moving a bit towards each other, but they're still separate. That's probably what where how it started and I think I'm I'm sometimes a little bit critical
to to my own field. I think we have also traditionally not been trying really hard to to integrate and and maybe even because we have to a certain extent glorified the the hacking element of of security. So it's it's it's always fun and and nice to to try and break something and you see that many in many cases that's the fields or the part of the field where people want to move into it's
and again that's getting better. But it was traditionally always like, Oh yeah, I want to be an ethical hacker or a penetration tester or do red teaming and break stuff. And I still think that's an incredibly valuable component of security. But it has always been a little bit the notion of yeah, if you do it wrong you are sort of the annoying kid on the playground and you've just built a very nice tower with Lego or blocks or whatever.
And yeah here comes the sort of the the the school yard bully knocks over your tower and doesn't even help you pick up the pieces or rebuild it. And to a certain extent that that's. Yeah. And and of course I'm I'm exaggerating maybe little bits and and notwithstanding the good examples but that has a little bit been the approach for security. You build it. We test it. We point out 10/15/20 different things and do a little bit of a priority ranking. Good luck.
And yeah, that that's, and again it is. It's getting better. But yeah, I think that that that hasn't helped traditionally to to get this more into an integrated notion. We should really, Yeah, I guess, almost admire that. It's fun to build something that is, yeah, not only easy to use, but also secure and and it doesn't require a lengthy manual to explain how to do things in a
secure way. And yeah, that's where I think security should aim to position itself a bit more broadly and especially a little bit earlier in the development phase, so to say. Yeah, yeah it's it has kind of shifted I feel like because from my side if I if I build
¶ Professional pride
something and I don't know if this is kind of the let's say the expertise of people or also the technology that we use nowadays, but I really feel ownership of that. So if we're someone were to misuse it, let's say, and steal data or have like significant impact on the organization of the software that I just created with my team, yeah, I would feel like disastrous basically means. Your your your personal pride, almost. Exactly. Yeah. Like, it could be an ego thing,
it could be an ownership thing. But really I I take pride in the thing that I do, and I want it to be resilient and I want it to be useful. And if people then misuse it, I take that as a yeah, that's that's kind of on me or on the team, basically. Yeah, that would be hard. It's never happened, but that feeling of ownership, I think, is now at least with me, let's say, and I don't want to
generalize too much. No, but I think that that's exactly the the I guess the kind of thinking and and well almost feeling maybe because that's also a little bit how you explain it that I think would be would be great if if more yeah.
Development teams or or system admin teams would have that like yeah almost this yeah it feels almost a little bit like this professional pride on yeah we this is our thing and and we want people to enjoy using it. We want them to be able to to to put it out there regardless whether you're you are computer literate or or maybe a kid without worry that it's going to be hacked. But also I think and that's where security could define itself also in a broader way
also. Yeah, without yeah, I guess not not breaking that that trust between whoever build it and whoever is using the product. So make you that that you you will look after whatever data that this objects program whatever it is might be holding that you're not doing and using any sort of dark patterns in in the user interface.
I think it's really those things where we should ultimately maybe even see security as more of a had a broader trust topic and and not also maybe not have a Chief Security Officer but a Chief Trust Officer who has this broader perspective on how can we build things that our customers or our employees can actually really trust. Yeah, I think that's I like that
¶ Layers of defense, or excuse?
way of thinking about it, that it's more so about trust rather than it being secure. But it's it's trust both of the end users as well as the people that are creating it. Because with the technologies that I'm leveraging now, I feel like my my within my team, the total cost of ownership has gone down significantly just by virtue of not having as much operational overhead, by really being able to focus on business problems by taking solutions off the shelf cloud natively.
Let's say. I can do a lot of things with a very, very small team and things are more so secure out-of-the-box. Like you still have patterns within your programming but still you have tools for that that do analysis and and you accommodate for that nowadays.
So I feel like we can do a lot. I feel very much more confident and I feel that I have a trust in my team basically by virtue of having that expertise within there and people Even now I have colleagues that have come from a security background, a very specific focused background and that now label themselves as let's say secure software engineers where they leverage their previous experience and they're now software engineers and they always take that with them.
So it's very much integrated in smaller subset of teams. But when there's a bigger organization where you feel like you're a cog of this overarching team and you have security on the one hand, OPS on the other hand, you're in development, then it just becomes these silos. And then people think about, let's say their responsibilities and oh, that's their responsibility. Now this is mine, that's the securities.
And then I do feel like maybe trust is not there as much you trust in your own capabilities then, but maybe not so in the other departments because that's their responsibility. Yeah. And I think especially also with those, those silos or sometimes we, we think they're layers, right. We always talk about layers of
defense. But I think that they sometimes indeed become layers of excuse or or layers of slowness because indeed it is this notion of OK, I built this and yeah I should probably think a little bit about security. But hey we still have somebody doing a penetration test and hey we still have the the second line of defense and we have internal audit and we have. So it's it's yeah, it almost takes away a little bit of that that overall ownership indeed you. Don't have to do it.
And I think also circling back to to what you were saying about
¶ The industrial revolution in IT
modern tools, I think there you you probably in in in a little bit of a, yeah good space I guess where you can really benefit from what I sometimes call this industrial revolution
in in IT, right. When you can now, yeah use the indeed these these modern tools whether it's cloud or or supporting environment to really do things in a consistent, in a repeatable way without having to worry about, yeah actually well carrying a a 19 inch server, screwing it into a rack, putting an operating system on it, configuring it, all those thing things of the past. I think that has indeed created a lot of potentially a a great foundation to build on.
But yeah, not many or more and more organization but not everybody is there yet or they are a little bit in a sort of mix between of course where you have still a lot of legacy on premises environment and you are transitioning to maybe more of a modern cloud operation. And I think especially in that transition, yeah, things probably get even more complex than they were in the old versus the new world. Yeah, yeah, I think so too. That's when I started off this
topic. That's when I was thinking OK the role has really shifted because if this is now let's say the new way of working because I mean definitely from my perspective, I don't want to go back. I was in operations before we had stuff like physically in a data center and a lot of issues because that and I I was also on standby. So I I physically would get alarms on my phone and and I would experience that.
So now saying seeing let's say the the new order and the new way of working, it's just a lot better, ease of mind trusting what is there and what is established and we can do a lot more with a lot less basically, which I think is always going to be great in that way of working. I feel like disciplines will
¶ Security as speciality
also evolve, just like security will be and security will still have a role. And it might be kind of integrated within a team where people that have had that background and now are capable and applying it kind of with them when they create software or are involved in creating software. Or it could still be kind of the specialized focus. Because when I'm thinking of still that speciality, there will always be technology that
is a bit more disruptive. When we think about AI and what kind of 2023 has brought us, or even with the cloud and people that kind of get this toolbox nowadays and just have to do something and are not really mindful of what the right way is of configuring things, I feel like specialized knowledge will always be still valuable there. No agree.
And and I think there it will be interesting to see how to almost compose security teams with I guess I think ideally a bit of a mix on yes indeed maybe you want to have somebody who can go super deep into this technical area. But I also really always admire and that's I think building a little circling back a little bit to what what I mentioned earlier about people coming into security from various different
backgrounds. I think that's something we should should cherish and should also actively look for. Because if we are thinking about something like behaviour of our our customers or our colleagues and we want to maybe do them do things in a certain way, we can try to figure that out as IT security techie kind of people. Or we can speak to our marketing department so we can try to hire a psychologist into the team.
And I think that is something that we again criticizing my own industry a little bit, we've had sometimes try to position ourselves as this. The special snowflakes like our problems are so complex. Never nobody's ever done anything like this before. And sure, of course there's there's elements that are super hard and especially the interaction is is very complicated.
But when it comes to to behaviour and especially how how people in in sort of large environments interact or behave, there's there's lots of other disciplines who've studied this for for decades if not centuries. So let's try to also learn a little bit from them and yeah, integrate at least the knowledge into our teams and maybe even some of those people into our teams if if we have the space for that. Yeah, it could be kind of a
¶ Collaborating with the security department
consequence of being too dogmatic that it also diminishes trust. If you still have a security department from other departments looking at that department, if you are continuously dogmatic, if they also maybe are not aware or there's a lack of understanding and maybe a lack of patience to explain, that's when I feel like friction kind of builds up and then it becomes this kind of check box that people have to go to. And the people that check are then the security, yeah.
And the friction might be a great starting point of course where. Yeah. But indeed I think it's often then easier to indeed, yeah. Get into the sort of, OK, we we come together, there's a bit of friction and we divert again and without meeting and yeah, maybe you should indeed. Yeah. Use that friction to have this, these useful dialogues on.
OK, maybe you are a little bit old fashioned, maybe putting it not a great way to put it if you want to get into a constructive dialogue, but maybe you are adhering a little bit to to to an older, older approach. We come at this topic from a new perspective. Yeah, let let let's sort of see how we can can get into a useful way of of cooperating here. I think so too, and to build on top of that. I was wondering because I always had this notion that when I came
¶ Building bridges
into the software engineering field, I would be, let's say, creating software probably 80% of my time. And nowadays I'm mostly trying to figure out what the right problems are aligning with my team to make sure everyone is aligned. And then the execution part is like not as much, definitely not 80%. And it's hard to put a percentage on that because it
has ups and downs. But still, it's a lot less of work than I thought it would be. The bigger the team, the less coding I'm actually doing, the more I have to align with the team to get everyone kind of on board and have a shared understanding and then investment kind of pays off the bigger or the longer we work as that team. Have you seen that with security as well? Because I I don't have that insight from behind the scenes how much kind of the human factor or human dynamics plays
into the day-to-day. Yeah it's I guess I would similarly struggle to put a percentage on it trying to think if it's, I think it's again indeed traditionally been a lot of focus on the more the the I guess the the the mechanics I think it doesn't Oh yeah depending on your perspective doesn't help that that the the field does have a security does have this notion of yeah we have whatever standards checklist that you have to adhere to. So I think that automatically
gets you a little bit into this. OK let's just start working off the against the standard, taking off the check the checklist. I think if you did I I think security teams that are really performing well are indeed probably doing more of the yeah, the the, the behaviour that you describe where you are indeed building bridges to to other teams to teams outside of IT and security and really trying to understand yeah what's the
¶ Willingness to listen
what's even the business we're in how do we make money. What would be probably the most disruptive disruptive thing that could happen to us. And I think building that knowledge really helps you to indeed get away from the yeah, we need this whatever availability for all our systems to really understanding like, OK, we have, I don't know, let's say 100 types and of of environment. And I understand that 70 of them, we could probably do
without them for a week. I mean it would be annoying and people wouldn't be happy, but hey, we'll survive. But here's this five things that yeah, we, we probably can't do without them for a day or maybe not even an hour and starting to
think from those. Yeah, sort of business understanding and then pairing that again with the IT knowledge because then you sometimes you you start with say OK these are five systems, but then you ask a little bit further and then of course there's always two or three systems that everybody thinks are not important, but the five important systems will never run if those seemingly non important ones aren't running.
So I think those kinds of. Yeah almost yeah it's a little bit almost like scenario building and and I think that can be really helpful to to yeah make sure you are focusing on the right things but while doing this because while you are having these conversations you're also building bridges to the to those other teams and you start to see that this kind of thinking. Yeah, also almost. Yeah. Automatically makes it makes its way into the organization we we've had.
I've done these kinds of of scenario analysis exercises and then you start to see that sometimes you already get to notions of maybe somebody from a business background says yeah, but this is super well protected because there's a four eyes principle always two people who have to check this. And then you start to see the IT person who's at the other side of the table sort of pull the the the the the complicated face
and squint. Yeah. And the squint and and and later explains like yeah indeed that's true. But hey for this very old system everybody gets sort of a default passwords or or or. It's almost always the same and that's already very useful. But the other thing is that you sometimes had you had these those interviews and then one or two days later one of those people would come back to you and say hey, I was cycling to work this morning or having a shower.
And I've been thinking a little bit about this business process and actually I've, I thought of another way in which it can be circumvented, used, disrupted, whatever the IT was and yeah you almost turn. Yeah, the rest of the whole organization into sort of this, this hacker mindset.
And yeah, if you then have shown with these conversations that you are willing to listen to them and that you're going to do something with their feedback, and then all of a sudden you are truly scaling your way of thinking and you are scaling your security organization. Yeah, I think that's that's a great way of doing it. I I'm wondering though what the
¶ Scenario analysis workshops
people think kind of going into that workshop. Do they? Do they, let's say, look forward to it, or they go begrudging? They usually Or is there a train of thought? That that's I guess it would really depend a little bit on, well, the situation of course. And indeed you have to really carefully explain what you what the purpose is so that you're not there to point, point
fingers, lay blame or whatever. Because indeed, it could very well be that you are talking about an environment that they have have maintained for for years or maybe even decades and. We're not there to show that they've done a bad job or to, yeah, make make them look bad, so to say it's. Very delicate. Yeah, well, exactly so.
So there, indeed, that's what. That's probably also a matter where you should yeah, think, think a bit before hands on what how you wanna do this or what what the wording will be. That's. That's the for me, that's the hardest part. No it is but but but that's also the the, yeah, I mean if, if, yeah, if if if it works it it's it's really nice to see because indeed you start to see this this little yeah sort of snowball through the
organization. Because then of course, there's also going to be maybe you start to have this with one team and then maybe by the time you schedule the second set of interviews, they already say like I am, I've been speaking to so and so from the first team and they liked it or they learned something or. So yeah, hopefully you'll get a little bit of a sort of a positive movement around security in the organization. Yeah.
¶ Unpredictable human behaviour
The when it comes to the solutions, do you feel like there's still a lot of solutions that are more let's say technology focused where we have to specifically configure something and then it would solve a security risk or concern? Or is it also kind of people
focused? Because I feel like and this might be kind of a cop out previous conversations there was always this OK, we built a very secure system, resilient even more and more and that's always going to be a continuous process And at some point if people just do what people do sometimes like that is the unpredictable part and I feel like that's kind of the hardest part is educating, bringing awareness and making sure people have this indeed secure way of working.
It's almost going to be quality. I I love the way you put it and kind of professionality in in what you do as a professional, I guess. Yeah and and and I think with that ultimately also having I think you used you used the word
resilience somewhere in that. Also having that in mind when you are building, configuring, designing your system, because indeed, yeah, it it. It would be great if we didn't have those those pesky users making a mess of everything as it sometimes feels, but we really need to rather than fix the fix the human, yeah, fix the system.
Or at least make sure that the system is as forgiving as it can be. Because we indeed as you people will not adhere to instructions, construct instructions might be conflicting from time to time. There's going to be time pressure. Stuff changes, stuff behaves in
unexpected way. There's many reasons to why people indeed don't behave as you sometimes would like them to do. And yeah, there are the, I think indeed we need to make sure that that, yeah, rather than than pushing people back to the intended behaviour, yeah, we should build systems in, in a broad sense, right?
Not just systems, as in the the, let's say the box or the application, but also in the interactions between digital and the real world that are forgiving and that are, well, maybe somehow sounding an alarm if you do something that that you're not supposed to do. But yeah, ultimately building these almost these feedback loops to learn from it rather than sticking to how it was and how it should be. Yeah, yeah, I like that a lot. What you're describing really
¶ Seemless and friction in security solutions
makes me think of kind of a security by default, which which is very seamless, right, for the end user. If it's, if it's secure and you don't even notice it. And I love the example. I forget who gave it to me, but the iPhone is the best example where it's just you look at it and your phone opens and that's basically it. But it scans your face like it couldn't be more secure.
Oh, exactly. And, and, and that's one of those things where I don't think many of us will still think of security when we start using our phones or our laptops and and indeed log on using biometrics, then we just start using them. And under the hood there's, yeah, lots of of complicated things that are taking place. But yeah, the those kinds of seamless things are I think great examples with the notion that I'm it sometimes might also be nice to introduce. I mean I'm. I'm seamless.
Yeah. As much as possible. I think sometimes it could also be useful to to build in some friction at the right places. So I always like the the notion of the what what's called the IKEA effect. This is. Yeah. Apparently I didn't know about this, but I learned this is a cognitive bias where that people play say disproportionate high value or or they they assigned A disproportionate high value to something that they partially assembled themselves.
So you sort of see where the where the naming comes from. So it is if if if you put something together yourself, yeah, you value it more than just, then it's justified just by the the price of the components, so to say the. Purchase of the end result. So and and that that notion building. Yeah. Building friction or or even. Yeah, having people being involved in in maybe putting something together as small as it can be can also mean that they indeed value it more.
And yeah, looking for those kinds of things seamless where where it's where we can, but maybe sometimes something we were that we really value and maybe we should be building a little bit of friction and a bit of a hurdle because then people might actually, yeah, be a little bit more careful or in in that space. Interesting.
¶ Instant cake
I I was trying to think of like a good maybe consumer facing product that would have that because I the IKEA principle, I love the way you explained it, right? And I I completely agree. When I've built something with my hands, when I've put some time and effort into it, that is my investment. And just by virtue of having invested that myself, I feel like it's just more valuable. This is also the and and some. You. Sometimes you shouldn't check all those those stories to
death, right? But there's also the the story that in the the sort of the first cake mix that was designed or created in sort of like I guess the 50s, sixties, it was like yeah, here's the the mix with water and make a cake. But that didn't feel right. So then they they they added the instructions that you also had to add the eggs because then it yeah, it all of a sudden felt a bit more like cooking and really, yeah, indeed you would appreciate the the value of the
cake more. So that's so it's those kinds of of, yeah. And this is again, this is not a digital example, but I think looking for those things and again understanding also how people are using products, that's probably the key part there as well, but. It's, yeah, yeah, interesting. I was wondering if you're coming
¶ Red, blue and purple teaming
from let's say a security discipline, if that's what you've been doing for the last X years and there's new technologies and you want to let's say either bridge the gaps or kind of expand your skill set maybe horizontally rather than vertically. What have you seen kind of trend wise do people move towards because I've seen cloud secure development for example, I've seen software secure software
development. I've also seen people that then just focus more on their soft skills side and really try and bring departments together similar to the workshops that you're organising and try and kind of create this secure first mindset in the organization. Those are kind of the trends I've seen, but those are more local. Have you seen kind of similarly those trends? I'm trying to think, I I do think there is a building on what we would talked about earlier.
I I think a positive movement where we are if if you have sort of the the the breaking things towards building in general. I think we are starting to appreciate the sort of the defensive side of the house a bit more and also still breaking things, but at least doing it in a more constructive way. So you have this this notion of well you have red teaming which is sort of like the the simulated attacks so to say could be technical also more focused on people also even
physical if if necessary. But then taking it further than just delivering a reports but also working together with what's called the blue team, the defensive team. And people sometimes describe this in a color mix like OK then that's purple teaming really breaking things to make the defensive team better and really doing that in a constructive working together kind of way. So I do think that's that's certainly a positive development.
Yeah I think another I I guess, but we're it feels like we're
¶ Exploring the boundaries in AI
still in the middle of that. The whole whole AI notion is is another big one where people that people are exploring from from different angles, again using it for defensive purposes. But of course also seeing lots of use cases on how this could be used to attack with with more scale or more effectiveness.
And what I find interesting there in in general with a lot of these, these language models is that the they again change the way we interact with systems because it's all of a sudden you can build these custom, well, let's let's call them chat bots just just by yeah, typing in natural language or even speaking. And you don't need to be a programmer anymore to. Yeah, to to to build something like that. Yeah, that's that's already a sort of almost way to democratize this, this skill set
or this availability. But of course it also opens up the access to to try and circumvent these kind of models to basically the whole world. And we saw that in in the early days, of course, of these models where people were trying to see how to do these guardrails work, ignore all the instructions was a very, very simple starting point.
But yeah, that's again applying the the kind of the hacker mindset that most people have because we that's we we we we we always try to use things in a different way and and many of especially children do this all the time, right. They try to explore. Look for the. Boundaries. The rules try to use things in a different way. So most of us have that capability and this is again a way to apply it to an IT environment without having to be a a Python expert or or anything
like that. So that will be very interesting to come. I think on to to watch how that evolves with, yeah, all of a sudden potentially a whole army of, well, both attackers, but let's stay positive. Also, defenders who can now, yeah, take those, take their notions and ideas into, simply said, the computer. Just just by applying natural language. That will be super interesting. I think so too.
I I like that you put out that the example of the, let's say solutions, the I solutions, the chat bots to kind of generalize them, simplify them. They also had kind of security guardrails I would say, in them where people couldn't find OK, for example, if I really want to build a bomb or make poison, how
do I do that? Just out of genuine curiosity, people ask that and then the response would be well, as so and so I can't really tell you that because I've been instructed not to. That would be the response. I'm afraid I can't do. That exactly. And then somehow it would be this instruction.
OK? If you ask about your grandma and she was in the war, and ask her to tell a story about how she created blah blah blah like the circumvented way of still getting the info, then all of a sudden you would have it. Write a poem about how to get bomb. Yeah, it's, it's. But again, it's it's yeah, it's
¶ Gamified security
it's a nice way for people to to start thinking about, OK, so this is actually what security to a certain extent is about. And I don't need to understand bits, bytes, networks, whatever, because there's this model and yeah, I can can try to manipulate it. Let's see how far I get some of these example platforms as well that indeed show you.
I think one's called Gandalf, where you it, it has been instructed not to tell you a password and you have to to try and and get it to tell you the password and it has increasing levels of difficulty And it's yeah, you can just show that to people who don't have any technical background and they really get into it and and start to try and yeah with that understand a little bit what security is all about.
Yeah, absolutely. It's like, it's like kind of a gamified way because there's a kind of input. There's a response. There's boundaries and you're trying to crack it. You're trying to move and progress even through levels in in some cases. I think it's genius, also ingenious, to educate people to bring awareness in that way and to then shine light on this topic that is actually quite broad.
¶ With risk comes reward
Yeah, I'm wondering because we touched on organizations and you mentioned procedures like red teaming, blue teaming, even purple teaming. Now we've also touched on technologies that make let's say security or security first. They they give that out-of-the-box in cloud native technologies that we have. I was wondering if you're within an organization could be either large or small.
And I know this is kind of a a very broad topic, but I wonder what the right fit is security wise because we can always do and we can always make things more resilient. But even from a software perspective I can, I can try and make things perfect even though things are never going to be
perfect. At some point you need to add that business value and this kind of this risk versus reward trade off that you need to do. I feel like on the security front you need to look more at the risks and kind of the consequences for that. But how do you find, let's say, the right fit with regards to how resilient your product needs to be when it comes to a digital part? Yeah, that's. I know it's a broad that was a
that was a pop that popped. Up and maybe starting with I, I think what we sometimes forget and that's again a mindset that doesn't come naturally to most security people or teams. Yeah with risk comes reward, right. And we we we sometimes go with it very much from the from the risk side. But we also need to understand that yeah that that or or we should also keep our eyes open to the reward side. So I think that that's and it's a simple sentence and an easy statement to make.
But even taking that into account and and I guess or somehow even promoting that kind of thinking within the security team could already be be lead to a little bit of a positive change. And I think if you try to keep that in mind of course it depends a little bit on on nature of the business and and whatever digital products or
environment you might have. But there could could be many ways where you can look for for things where indeed yeah sort of switching the dials a little bit on on on on security and maybe a little bit more from risk to rewards.
Yeah, you you could think of that and you could even look for ways to or or or even show ways where you say, hey we've changed the maybe the the login process on our consumer website because it right now it was, I don't know, let's say you had to create an account with every order and we had a complex password requirements whatever and we saw a certain churn
percentage. People were put stuff in their shopping cart and then they got to the account creation part and they just abandoned Maybe we can do something with a magic link where you just you you leave your e-mail address you get send a link to log in for in the future you don't need to create an account. Yeah it's it's elements. It has elements of security but it's really close to the actual
business. So looking for those kinds of things or or elements where it's it's more like maybe coming up with using more biometric passwordless ways of of logging into your well let's say your laptops or your your work environment. Maybe you can easily prove that that actually leads to reduction in in help desk calls because you you might be dealing with with password resets or or other costly things that your help desk is working on.
I think it's constantly looking for those kinds of examples and and yeah the really getting that mindset into to the minds of the
¶ Security costs vs. benefit
security team can can really help. And yeah, with that get into sort of an approach and and maybe you can even formalize that into how you actually pick your sort of priorities and projects.
But where you, yeah, almost make this sort of cost benefit, risk rewards analysis and yeah, with that indeed accept that sure there might be a sort of 100% great technology solution, but nobody uses it because it's super complicated versus the well maybe then there's an 80% solution and yeah, and everybody gets it and and then if you do sort of a scalability analysis for both, then the net result is probably better for the 80%
solution that everybody uses. And that's I think the way where we which is, I mean this is not super complicated but and we do this all the time for vulnerabilities which is the interesting thing because there we always do something like yeah what sort of the the frequency and the impact and then you get one these green, orange, red risk tables and yeah doing something similar for the sort of the defensive measures, the the fixes so to say that that could actually really help to
also prioritize those maybe a bit more of an eye that that's a bit more closer to the business side of the house so to say. Yeah, I I like that you emphasize that it's not really rocket science and a lot of people can do it. Almost anyone can do it. The only thing that's going to prerequisite is the knowledge of OK, what are the alternatives and what decisions can we make and what is then that risk
reward? Yeah. And and and I think also the willingness of of let's say you are interacting with with security experts, the willingness of those experts to also, yeah, maybe understand that some of their knowledge might also not be applicable to today's world anymore. That's because we do have a tendency to keep adding things. And that's not just security. That's what we do everywhere, everywhere.
Software also, yeah. Yeah. So, but, but yeah, so sometimes to also say, yeah, sure this this indeed made a lot of sense 10 years ago and and we've paid the annual license ever since. But the world has changed and we actually have two or three different products or approaches that more or less tackle the same problem. So it actually cost us money and we get slower. Yeah, let's just stop doing that.
But yeah, that that's in many, yeah, it's sort of the the the messy reality of of especially large organizations. That's not something that happens easily, unfortunately. No, I I feel like maybe that is the harder path.
¶ Frequent password changes
Because if there's this notion and this way of working and things are established like we've always done, that this has always been in place, or it must be for XY and Z reasons, because otherwise why would we have it To then challenge that and to investigate that, that might be very, very valuable. It's it's something as as seemingly simple as frequent
password changes. Yeah. Like yeah, which we well I would love to say like we used to have but I think many organizations still have this but every 60 or 90 days you have to change your password.
It's been there's been been researched and and and advised by by all sorts of by NIST and and other standard setting organizations internationally you say yeah we've actually found that this doesn't lead to people creating more secure passwords and also the sort of the threat to tech model it that that's there's there's other ways to to mitigate that So essentially they are saying stop doing this and still yeah it's it's there's still many organizations who are who are
sticking to this from a sort of a yeah this is what we've always been done also it's something that you can easily do so yeah that's another yeah sometimes another contributing factor of course it's if it's just a setting yeah why why bother just leave it in place and but. No more questions. It yeah, if no one does the research and it even from a logical perspective as you explain it, it even kind of makes sense how it sounds, right?
Because all it's updated and then people will never have kind of the, let's say recent or it's always different in that way. It kind of makes sense. But then if you actually do the research and say this is rather harmful than helpful, yeah, then yet you kind of have to make that step in that change. But I mean exactly as you say. I mean, I'm a consultant.
I go in and out of organizations and then I see, OK for this tool, Yes. For that tool, no. Like it's also kind of not standardized then within the organization. Like somewhere it went wrong here and no one's changing it. Yeah. And that that's again back to what we spoke about a bit earlier. Of course.
Also it's it's easy to say if you have sort of a green fields, yeah, let's configure everything great, nice, shiny, out-of-the-box versus yeah, we have have decades of yeah, sometimes well built, sometimes not so well built things that try to interact with various log on mechanisms and yeah, that's. So that's of course also the reality that you have to deal with in many organizations. Yeah, yeah, that makes sense. As a as a final thought, I was really curious on your opinion
¶ Verizon Data Breach Investigations Report
when it comes to the education and awareness. Let's say for people that want to learn more about security or people that want to go deeper either from a software engineering point of view or even from a completely non-technical side. Do you have any any resources where people could look into? Because I know for example one of my friends found the website and there's this like again gamified progressive way of hacking someone and he's loving it and he's learning a ton on the way as well.
But for people that are maybe non tech focused as well, do you have any resources? Yeah it's trying to think in in general I I I think it's one of my my mantras is we we don't need security awareness for people we need people awareness for security. Because I do think that yeah, we, we security teams and again the sort of pair, a little bit of a black and white description here.
But yeah we we many security teams don't understand well enough how people interact with systems and environments and all that on a daily basis. And yeah as I said before try to fix the human instead of built for giving really resilient systems. So but then maybe not not replacing awareness by knowledge. Yeah, I think that that depends a little bit on, on the the, the, the sort of viewpoint you want to start with.
I think in general there's a number of of what I think always find very good research reports which which sounds a little bit dry if if I say it like this, but so so Verizon has the Data Breach Investigations report which is very good, very factual analysis of a large data sets of of yeah, actual confirmed
incidents. So really drawing on the experience from the airline industry where whenever a plane crashes, there's an investigation done by a review board and the report is shared because so the whole industry can benefit from it. And with that mindset this, this report started by now probably what was it 15 plus years ago.
So that's a very good approach to to really show like OK, if things go wrong, this is what's happening to the attackers from the outside, was it inside, was it a mix, what were the industries, what are other other trends and yeah that's. So that's that's certainly one one that I would would really recommend with I guess the additional, so with the report from from, from our, from our current team from Microsoft, the
¶ Sidney Dekker - Human error doesn't exist
digital defence report which is again drawing on the experience of different teams, they're not only doing instant response and monitoring but also building products again showing yeah a number of trends. We're seeing a bit of recommendations as well. So that's another one that I would would certainly recommend. Yeah. And with with that and and I think another maybe two sets of thinking are more I guess also building a little bit on the the resilience and and approach that.
There's a person, a professor that I really like a lot, who's called Sydney Decker, he was a Dutch safety researcher who's also happens to be a pilot himself.
So he has a lot of experience in the aviation industry and he is one of the main thinkers in what is in the more traditional safety space really focusing more on the role of the of of of the the human so to say and getting less into a compliant driven control approach but more empowering people to make the right choices with the knowledge they have available at that
point in time. And and he has a lot he he publishes a lot of books as as he says himself I I write them faster than you can read them but he so so there's one that's called behind human error where human error is in air quotes because he his his approach he says yeah human error doesn't exist that's all that's always a systemic problem leading to a human making a certain decision that we then label an error in
hindsight. So that's one I would recommend, but he also has another one which I think is called the the Safety Anarchist. But yeah, so and if you're not that much of A reader, check out his website because he also has a lot of a couple of short mini documentaries where he tries to apply this kind of thinking to other industries.
So not just traditional safety aviation, but also for example, the hospitals or supermarkets where they really try to do actual research and how much of these compliance rules can we take away And yeah, does that then lead to and and it does actually in in many case lead to to not only better safety records but also Better Business outcomes. So that that's a really interesting notion. Those are very valuable, yeah. And and and to to then because
¶ Kelly Shortridge - Sensemaking
that's not not directly security of course that that sort of that. I then try to apply to security but there's another book by by Kelly Shortridge which I think it's called Security Chaos Engineering if I well maybe title might be might be a little bit off there. But if if you check out her website Kelly Shortridge, she has a great blog as well which tries to apply again a lot of this kind of thinking resilience thinking a little bit of the
safety approach to to security. So, yeah, that's another one that, yeah, maybe they're not necessarily sort of a security one-on-one, but at least they I think are good starting points to to try to get this kind of thinking into how to look at security. Yeah, it definitely piqued my interest also because the industries you labeled like very
¶ Sharing knowledge around security
tangible, first of all. And second of all is one of my favorite things about being in this industry is that people, people do the research and then put it out for others to educate themselves to create awareness and to then better kind of the industry as a whole is one of my favorite things. And I think that's it's great for companies to do that. It's great for individuals to do that.
It makes everyone better and it gives again this, this perspective that I definitely me, I'm looking for that. But I think a lot of people are seeking as well to kind of form their own opinion on things because otherwise it's just this black box and really kind of hard to work with. And if you have an idea, then at least you have a starting point of a conversation and you can work towards a better outcome. Yeah, I've really enjoyed this yell. I must say, this has been a blast.
Yeah. Is this kind of comparable to the podcast you've done before? Yeah, but different. Different topics, I guess. But I yeah, at least I enjoyed the conversation this way. So. Awesome, cool. Thank you so much for coming on and sharing, and then I'm going to round it off here. If you're still listening, leave a comment in the comment section below. Reach out to yell and let him know you came from our show. And with that being said, thank
you for listening. We'll see you on the next one.