Welcome to Cybersecurity Today, the Month in Review. Joining us today are Laura Payne from White Tuque. Hello, Laura. Hey, Jim. Great to be back. Great to have you back. And David Shipley from Beauceron Security. Welcome, David. Thanks for having me again. Okay, panelists, you know the game, and I'm sure most of our audience does too. In the show, we focus on the stories that had the greatest impact over the past month.
We try to do a deeper dive into them, and this month started so calmly. A professor and his wife disappeared, maybe, and everybody was wondering what had happened to them. Hertz lost a trove of data, an amazing amount. All those things that you tell people so that you can get insurance as well as a car, where you're staying, where you're going. They misplaced all of that in a hack. But that was just another day in paradise, 'cause then things got really weird.
Government officials were using a commercial app in the middle of an American conflict. They were attacking a group of rebels, the Houthis, I think they're called, and these guys are chatting away on Signal about this, with locations and data. Nothing wrong with that. And then, I'll just add, the use of emojis in a way that we've never seen. Bam. Fire, pow, American flag, when the bombs were dropping. Oh yeah. That, to me, was 21st century warfare right there.
I think they were probably selling tickets to this. They didn't have to sell tickets, 'cause they had a journalist that they had invited onto the call. That was really special. And then the government, they're using a commercial app, and then they say, let's go after Chris Krebs, the guy who used to run CISA. And that was the start of the month. I have to say it got more exciting after that. The past few weeks have been in overdrive.
I've seen some of your notes as they've come across about how we're gonna narrow this down, but we have to find enough stories to get through an hour. Why don't we start with easy ones? I wanna take a victory lap on PowerSchool first. Yes, please. On this show, the patent-pending finger wag of "stop paying cyber extortionists" was had, because what did we say? We said you can't trust criminals. And what have they done? They didn't delete the data, and now they're going for a double payment.
'Cause why not? Why not get two scoops of tasty data-extortion ice cream? So rewind and give us the story. Not everybody remembers David Shipley's lessons on this, so let's get a little history first. So PowerSchool is one of the largest software-as-a-service providers to K-12 schools, if not globally, then definitely in North America.
They were hit, and hit badly, with a data exfiltration, not your typical ransomware, but simply the other half of the double extortion, via insecure parts of their support infrastructure, and paid an undetermined sum. We can assume it's likely in the millions, 'cause that's the way these kinds of things go. And then they sent the communications on to the individual school districts, which caused all kinds of really interesting conversations about how sensitive this data is. In some cases, highly sensitive.
In terms of, maybe, the bus pickup locations and other issues or notes that teachers may have had about students, depending on the school district, how long they'd used the platform, et cetera. It was an unholy mess of notification, and it affected school districts in Canada from the west to the east, you name it. And of course, one of the defenses we've seen trotted out in these data exfiltrations, we saw this in Canada at LifeLabs years ago. Man, this feels like a lifetime ago.
But a long time ago, somebody else tried to do this, to pay, and then tell you it's okay, they deleted it. And I believe, if I remember the story correctly, and Laura and Jim, correct me if I'm wrong, they got proof, they got a video, and they were shown on video the data being deleted. Okay. There's a video of deleting the data.
Yeah. And now it turns out they're targeting individual school districts for a double dip of the data extortion, which is as uncouth in cyber as it is with nachos, right? You stick to the single dip. But it's so funny, because ages ago we used to say that cyber criminals had great help desks. They helped you through paying the ransom. They made sure that they gave you a decryption key, because they were in business.
When it became a franchise and anybody could be out there, it got dirty. There's no longer any honor in "you're paying the ransom," as PowerSchool did. But now, as you've said, these guys are going around from school to school, telling the schools, we have your data. And you pointed out some of the data that's in there: the data on children who are vulnerable. It just goes on and on. If you're a parent, your heart just sinks. So what do you do about this?
Will PowerSchool survive? That's gonna be a big question at this point. But the second one is, how do these local school boards deal with this? And I think it comes back to, we've learned, hopefully, right? It's an education system, so here's your education from the criminals. Fool me once, shame on you; fool me twice, shame on me. You don't wanna get that wrong. The first one was on you.
The second one's on me. There's no point in paying this ransom; they're just gonna be back for more. You gotta figure out what you're gonna do about it. The horse is outta the barn. And yeah, it might mean that PowerSchool is not in your docket anymore and you gotta figure out something better, or maybe you're back to paper in the meantime. This is not the kind of thing where it's a click-and-pay-and-solve, right? Money doesn't solve this problem.
Could we have possibly gotten to the point where this might be the straw that broke the camel's back, and people might stop paying ransoms? These people might stop paying ransoms, the decision makers here, but there's no universal "oh, we're gonna stop paying ransoms." Because it will depend on the situation, it'll depend on the knowledge of the person in that seat making that decision at that time.
Do they consult somebody who's going to give them good guidance, to know what their options are and what the likely risks are if they pay? I'm sure there are still some, and I'll call them this with a heavy dose of salt, "noble" cyber crime groups that have good reputations, but they are fewer and farther between, right? So if you're going to pay the ransom, you just have to know: it might give you a spell of time to deal with something, but it's not your end game. You can't delete the data once it's gone.
If they've got it, they've got it. And I think there's a distinction here. The use case of a hospital paying a ransom to get systems back operational as quickly as possible remains a deeply ethically and morally problematic space. But when we're talking very specifically, it's not about your operational IT infrastructure, it's about data that was exfiltrated. There is no value.
And I think the only way we're gonna change this is if the courts say, we don't care if you pay to get them to pinky swear they deleted it. That doesn't matter. It's untrustworthy, unreliable; there's no definitive way to prove it. Your penalty is for the loss of custody, not for everything you try to do after the fact. You devalue the incentive to pay, 'cause part of the reason they pay is so that they can go around and lower their eventual class action lawsuit losses.
I don't think they genuinely care as much, or really, in their heart of hearts, believe the criminals are trustworthy actors that are gonna delete the data. We gotta take that out of the equation, and I think this is the first step to getting out of the pay-the-ransom mess that we're in. But the definition of crazy, man, is doing the same thing over and over again and expecting a different result. Are we just looking at this through rose-colored glasses?
You're saying the whole game might be just keeping their liability down, so it doesn't matter. Do we know how much they paid? Anybody know how much they actually paid? Millions? No, I don't think we know. Yeah. We don't know. And they may have negotiated very hard, and the ransomware gang came back. Or maybe they did go really big and pay a large ransom, and so the criminals figure there's more from the well to draw. I don't know.
And I don't know what decisions the payers made going into that. I wouldn't wanna be in their shoes, but I think at the end of the day this just goes to prove that you can't get the data out of the ecosystem once it's out there. A leak is a leak. The criminals might be in for a bit of a surprise, though. They might be looking at public data and going, oh, the Toronto School Board, they've got $80 million in budget. The fact is they don't have $8 to spare, as a matter of fact.
Yeah. They owe money. They're in a deficit, so they're not gonna be able to come up with a lot of cash. And I don't think the parents are gonna go for a bake sale on this one, no. And the insurer may have covered that first payment, but they're not gonna be up for a round two. The impact of this was, what, $60 million? All the reporting I've seen has said the payout was in the quote-unquote millions.
For context, PowerSchool has reportedly 18,000 clients covering 75% of K-12 schools across North America, and up to 60 million people are in that system. So on the general rule of 10 cents to a dollar per impacted user, the going market rate for extortion, that's anywhere between, what, six and 60 million potentially paid out.
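To make that back-of-the-envelope math explicit, here is a minimal sketch in Python using the rule-of-thumb figures quoted above; the rates and the 60 million figure are the rough numbers from the conversation, not disclosed facts about the actual payout.

```python
# Back-of-the-envelope extortion estimate using the rule-of-thumb rates
# discussed above (10 cents to a dollar per impacted record). Figures are
# illustrative only; the actual PowerSchool payout has not been disclosed.

impacted_records = 60_000_000        # reported upper bound of people in the system
low_rate, high_rate = 0.10, 1.00     # dollars per record, rough market-rate range

low_estimate = impacted_records * low_rate
high_estimate = impacted_records * high_rate

print(f"Estimated extortion range: ${low_estimate:,.0f} - ${high_estimate:,.0f}")
# -> Estimated extortion range: $6,000,000 - $60,000,000
```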
It sounds crazy to be talking about tens of millions of dollars, but we had a still-unknown Fortune 500 company pay something like 47 or 50 million dollars last year as a ransom. It was the single biggest ransom take. So yeah, I guess that's where we're at. The one thing I wanna make a point about as well is that the end of the first era of international, transnational ransomware came with the big takedowns: the LockBit takedown, the other gang takedowns, et cetera.
The groups kind of self-destructed on their end, and that was the end of the easy days of this crime. One of the things that I had predicted was that their tactics would get more vicious, but that the industry itself would get more resilient. So it'd be harder to take them down, but they would also, in turn, become more ruthless. This was the natural, Darwinian evolution of this crime.
So what we're seeing now with the re-extortion and the retargeting of healthcare post-pandemic and other things, this is cyber crime, ransomware 3.0, and they are going for the throat. It's interesting to see the physics of it, for every action there's an equal and opposite reaction, play out in the cyber battle between the good guys and the bad guys.
For some, that's an excuse to do nothing and to say we were better off just having this as basically an international IT tax that we were paying to Russian and other cyber crime gangs, and at least we knew the amount, but now it's even worse. I think we have to get through to the other side of this, whether or not we have the will to get through to the other side of this. And keep in mind, all this is happening as the Counter Ransomware Initiative and other really good US-led initiatives are all for naught now.
There's a giant sucking sound, and that is the vacuum of previously incredibly valuable US-led leadership on this global issue. And I don't think anyone else has got the heft to keep that charge going. So I think the good old days and the nasty days are combining. 2025, so much fun. And we'll get to it later: CISA is deader than we think. I do wanna add one clarifying point on this story, 'cause I think we glossed over it.
In this particular story, we had a third party who was breached, and they paid the ransom. The attackers are now coming back, not to the same third party, they're coming back to the users of the third party. The victims. Yeah. To the victims. It's not the case that the boards paid out the ransom in the first place. They are now being asked to pay a ransom.
And who knows, maybe the attackers were hacked on the back end. Maybe they really did delete the data, but somebody else grabbed it first. You don't think they're really good at security? LockBit was hacked this month. And 4chan. Yeah, of course. The honor among thieves these days. Yeah. Let's twist around to that, 'cause you brought up the 4chan story and you were gonna cover that.
That's another story of wrong gone wrong. I don't know how else to put it. Yeah. It's a little refreshing to see that bad security affects the bad guys too. And 4chan, I guess, is not objectively bad in the sense that they didn't start out to be full of evil crap, but that's where they are. And not shockingly. They started as a counterculture message site, yeah.
And they just allowed whoever, whatever communities, to coalesce on their site and to exchange information and opinions, and less information and more opinions, maybe. But it got pretty sketchy toward the end. Yeah, it is pretty full of things that are undesirable from a moral standpoint. And, not surprisingly, it became more and more difficult for them to engage vendors to provide the hardware and the updates that were needed for their infrastructure.
Also not shockingly, a lot of people weren't willing to pay to advertise, so there went their revenue streams. And here's another shocker: people who aren't, maybe, on the right side of morals and ethics don't like to pay for services, so they weren't gonna pay for a subscription to 4chan. So your revenue model doesn't really work. And so, you know, this is a foreseeable circumstance of a very popular site that has no revenue and is known for being on the wrong side of a lot of things.
Really not being able to sustain itself. It's back online. It will probably go the way many things do, where it devolves and causes splinter sites to crop up in its vacuum, and something else will probably grow out of there. I hope this is the beginning of the end of that particularly large community. The site collapsed, but they brought it up. It's gasping, at least, last I've heard. They can't bring it all back up; they've lost a lot of data. We won't miss them.
One of the largest purveyors of deepfake pornography services, Mr. Deepfakes, was taken down as a result of US legislation now coming after non-consensual intimate image deepfakes as well as real non-consensual intimate image distribution, which is fantastic.
But on a sad note for Canada, the individual that has been determined to be responsible, or a key individual, is a Canadian citizen. CBC and a whole bunch of other media worked together to piece together this person's identity through opsec leaks and actually confronted the individual about it. Think about AlphaBay: that was an expat Canadian who was running what was at the time one of the world's largest dark web narcotics marketplaces, with other things on it too.
Not exactly a moment of Canadian pride that some of these clowns, the biggest clowns out there, have been coming from up here. At a time when we want to excel in digital industries, this is not exactly what we want to be known for. Is that not the industry we were aiming to excel in? Not quite. Yeah, we wanna focus on a different clientele, I think. But what's interesting is to see these groups, like LockBit, in the middle of a fight and the things that are happening.
And this gets back into the interesting sort of politics and drama that happens within these environments. We've seen these core ransomware infrastructure providers stiff their affiliates, and we've seen infighting, like when the latest expansion of the war in Ukraine happened in 2022, which wasn't the start of it.
It goes back to 2014. But in 2022, with the most recent actual, official Russian invasion of that area, we had gangs that fell apart because they were part Ukrainian and part Russian. And that leaked all of their internal chats, their playbooks. This is where we learned about the HR structure, the payment schemes, the feedback from project managers for your particular ransomware group.
So we're seeing this play out, and it is nice to see the self-destruction. But the one thing I will say to describe 4chan, 'cause we're post May the Fourth: if there ever was a Mos Eisley, that den of scum and villainy referred to by Obi-Wan Kenobi, 4chan was it. It was the Mos Eisley of the internet, the den of scum and villainy. And you just love to see it burn to the ground. So, wouldn't we rather talk about Hertz?
No. Talking about how the month started out, Hertz just lost a whole pile of our data. And then we saw the impact at the Co-op and Marks and Spencer. Not everybody knows who the Co-op is in the UK, but it's large; I forget how many thousands of grocery stores they have, but they cover a vast swath of the UK, or England.
I can't tell the difference between the UK and England, but Marks and Spencer also got hit, and people were starting to wonder whether the new pursuit of ransomware and hacking was going to be retail. This has been a big hit for these stores, though. There was a story just today: the stores at the Co-op, they can't stock the shelves. They're rationing milk.
Rationing has been announced and all that entails. "Never you mind," I said to her, "you put on a kettle, we'll have a nice cup of boiling hot water." Beyond the Fringe fans will like that one; nobody else will figure it out. But the issue is, that just about wiped out two major grocers. How much more of our infrastructure that we don't think about is vulnerable at this point?
Yeah, and this is something I've thought about in the context of the discussion around critical infrastructure and, in general, the discussion of how you rank the criticality of different devices, different systems, different industries. And unfortunately, I do frequently come back to the conclusion that yes, some are more quote-unquote critical than others, but everything is so integrated now that the difference between the importance of protecting the less critical and the more critical is shrinking, right?
Because when we think about the things that are most critical, we typically think of the power grid and water resources and things like that. Now we think, okay, food security, and transportation and logistics, which make all of that happen.
And then all the suppliers that go into that, whether it's professional services, which typically are smaller organizations, and all of these supporting organizations. There really isn't anything left at the end of the day that's not in the circle of supporting a critical service. And so it just reinforces that you can't just protect a few things and assume the rest of it will be acceptable. Everything needs to have a reasonable level of security.
Not that it'll be perfect, but that the risk, or the impact when there is an incident, is a smaller impact. That it's more contained. That it's more manageable. And I think a couple of different things. It's stunning to me that Marks and Spencer did not see what was happening in Canada, both what happened to Empire and Sobeys and what happened to the drug chain, London Drugs, and not say, okay, we need an incident plan. We need to invest in this area.
It's stupefying that we saw ransomware rise, we saw it happen, we saw the impacts in very specific retail and food security verticals, and we still failed to act. Isn't that fascinating? A realistic level of security: what's your RLS? And then, what, alternatively, is your realistic level of resilience? Can you go back to pen and paper? Can you stock the shelves so that people can get milk? Absent regulation, it's very clear the private sector's North Star of shareholder value
will dominate the risk conversation to the point where they're ill-prepared for this. The evidence could not be clearer. I feel like Matlock resting his case at the end of the episode. Ah, here's the proof. This is why we need regulation. And yet, as I raised the point on LinkedIn this week, Prime Minister Carney is gonna be delivering his speech from the throne here in Canada, with the King coming. This is super exciting for us.
I hope in that speech cyber regulation for critical infrastructure, at least what they were trying to do with Bill C-26, gets some kind of nod of intention and gets back on the legislative radar. But what I will say is that even what we were trying to do before it failed here in Canada didn't even contemplate dealing with food distribution. Food security is not critical infrastructure in this country under a federally resourced mandate.
And it means the provinces are left to pick up whatever they can on this. And the good news is some provinces are; I'm aware of conversations happening within provinces to say, Ottawa is not coming to our rescue, we're gonna have to figure this out ourselves. For American listeners, what you have to understand is that CISA is gutted. You had a globally leading agency; Trump created the Cybersecurity and Infrastructure Security Agency, which was a success, and President Trump deserves credit for creating CISA. Absolutely.
Term one, that was one of the wins. Here I am actually giving appropriate credit where it needs to go on that. There were lots of people involved in the creation and formation of that policy, but it was under his administration. Now, Trump two is literally butchering one of the successes of Trump one and turning around saying the states are gonna be responsible for security. And there are some states, New York, California, Texas, sure. But are you kidding me?
The rest of the states are being left to their own devices. This is bad news bears. And CISA was known as the coordinator for critical infrastructure security and resilience. It's where, I don't know the total system, but it's where you reported a lot of things to. Yeah, they did great work. Jen Easterly, the most recent past CISA director, was advancing so many amazing things.
She was building consensus on really deep, long-term issues like software quality and holding the software supply chain accountable for creating secure software, secure coding technologies, evolving away from the things that we know are inherently risky, that make it too easy to create these CVEs. These are big, substantial, national-level and international issues, and all that's gone. It's worse than gone. For those of us who remember, Chris Krebs was the head of CISA.
He was fired for honestly saying that elections were safe. So the heads of CISA learned not to speak truth to power, even if it's the right thing to do. They've learned that. It's worse than that. I can't disclose totally why I know all of this stuff, but as I was telling you guys before the show, we've been doing some interviews with whistleblowers, and CISA is non-existent. It is a political front right now.
It is a puppet, and it is overruled by political decisions coming from God knows where. It is no longer trusted, and that is a crime. And again, I'm not getting into American politics. You've given Trump a nod; I don't want to talk about Trump or anybody else. But the fact is we have lost one of the best assets in the world at leading the fight on cyber crime. And Laura probably can speak with more intelligence than I can about MITRE and the CVE database.
We had an asteroid-almost-hitting-the-planet moment for the software reliability and resiliency, automation and vulnerability scanning environment. Previous to the current administration, it was a very fragile budget approval process, and shame on those for not getting it figured out beforehand. So let's put appropriate blame where it belongs. The fact that it was so heavily depended upon and yet so brittle to begin with was not good.
Then it almost came completely unglued. And I don't know what your thoughts are from your perspective. Yeah, there were a couple of things, and I think your comment is very on point. It sounded, in some of the write-ups around this, like it actually wasn't unusual for it to come down to the 23rd hour for approval to come through. But what was different was that in previous administrations it was, yeah, we're getting close to the wire, but we're pretty sure it's coming.
There was confidence that the job was gonna get done. There was not confidence this time that it would actually get done. And what that shows is, yeah, there was that laissez-faire nature around it. Maybe not by everybody, but by enough people that it was business as usual: yeah, we just get her done.
Instead of saying, no, we need these things done in a timely fashion so that there are checks and balances that can reasonably be put in place, and so that if the funding isn't going to come through, there's enough time to do something else. And that's really what's been missing. But this is not new for the US government: the number of times that things have been pushed through at the last minute, or they've gone into basically a government freeze because they haven't finished passing their budget for the year.
This is just another symptom of a larger issue that has been building for quite a long time, and it really highlights the fact that when any one organization is such a linchpin in our system of security, if there is any trouble in that organization, it has huge potential for impact across so many other places.
And whether that's government, or whether it's a nonprofit, or a collaboration of governments, whatever it is, our lesson learned out of this is to take a better look at the warning signs as they're coming and try to build, as a community, that resilience, because any organization can fail. So whatever organizations we're putting these key services into, we need to have a better way of staying on top of what's going on there so we can course-correct before it becomes a crisis.
Absolutely. And Laura, you're talking about the CVE system, right? And in particular, in April, it was the funding, again. And not everybody is a security junkie jumping into this, so: the CVE system is a critical piece of how, like CISA, we manage any sort of hacks or problems. Zero days, all of those things. Do you wanna just give us a brief explanation, a quick primer?
So CVE is the Common Vulnerabilities and Exposures database. It is a mass collection of all the known, published, vetted, confirmed vulnerabilities in software. That doesn't mean it's the full universe of software vulnerabilities, because we know there are people who found them and haven't reported them. But it is the biggest single central source for vulnerability knowledge.
All of the major vulnerability scanning players and technologies bring CVE data in from this source as part of their ecosystem. It is the most robust, as robust as anything is in this day and age, collaboration point for those vulnerabilities. So to lose it, or to have it not be effectively maintained. And it requires staffing as well: it's not just about the technology of the database, there are real people attached to reviewing the submissions, scoring them and writing them up.
And there are lots of people who have lots of debate about the efficacy of the scoring. That's neither here nor there, because the most important thing is that there are actually people working together to try and make the effort to get these things into a place where everybody can then contribute and use the information that comes out of it. That would've been a huge loss. It would've put new findings at a standstill for being reviewed.
It would've meant things that were in progress weren't finishing, weren't getting through. And then the older findings do get brought back forward and updated from time to time as things change.
So all that new context information wouldn't have been published. It would've been a huge win for the bad guys to have this lack of communication in the ecosystem for the defenders. Anybody who listens to this show hears the words CVE-something-or-other, and they'll know that instead of having to look in 16 or 60 or 600 databases across God knows where to find out if there's something out there, there is one database, archived, normalized to the best degree possible.
And I hear the criticism of the scores. But whether it's an eight or an 8.25 or a 9.13, it doesn't matter, it's bad; and if it's a three, it's not as bad as a nine. It's not perfect, but it has accuracy. The one thing critics ignore about it is that it moved us from low, medium, high, or low, moderate, high, whatever way you want to put it, which is even less granular, to something a little bit more specific, with a little bit more rationale, harder for organizations to argue, oh, it's not that big of a deal.
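For listeners who want to see what "one normalized database" and the numeric scoring look like in practice, here's a minimal sketch that queries the public NVD API (which republishes CVE records) for a single CVE and maps its CVSS v3.1 base score onto the standard severity bands. The endpoint and response field names reflect my understanding of the NVD 2.0 API and should be verified against the current documentation.

```python
# Minimal sketch: look up one CVE in the NVD (which republishes the CVE list)
# and translate its CVSS v3.1 base score into the standard severity bands.
# Assumes the public NVD 2.0 REST API; verify the endpoint and field names
# against the current NVD documentation before relying on this.
import requests

NVD_URL = "https://services.nvd.nist.gov/rest/json/cves/2.0"

def severity_band(score: float) -> str:
    # Standard CVSS v3.x qualitative bands
    if score == 0.0:
        return "None"
    if score < 4.0:
        return "Low"
    if score < 7.0:
        return "Medium"
    if score < 9.0:
        return "High"
    return "Critical"

def lookup(cve_id: str) -> None:
    resp = requests.get(NVD_URL, params={"cveId": cve_id}, timeout=30)
    resp.raise_for_status()
    vulns = resp.json().get("vulnerabilities", [])
    if not vulns:
        print(f"{cve_id}: not found")
        return
    cve = vulns[0]["cve"]
    metrics = cve.get("metrics", {}).get("cvssMetricV31", [])
    if metrics:
        score = metrics[0]["cvssData"]["baseScore"]
        print(f"{cve_id}: CVSS {score} ({severity_band(score)})")
    else:
        print(f"{cve_id}: no CVSS v3.1 score published")

if __name__ == "__main__":
    lookup("CVE-2021-44228")  # Log4Shell, a well-known example
```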
There's a BeyondTrust report I just came across. Apparently they've been doing this for a number of years, where they look at the total number of CVEs that come outta Microsoft per year and how many turn out to be critical versus the total volume.
And it enables that kind of analysis, which is our only real barometer of software quality. And if we don't have that kind of data to work with, we really don't know how the trajectory is going. And that's just one example. But I think we rely, in our modern digital economy, on so many things that are brittle and fragile, that are just one bad actor away from breaking. And I think, Jim, you posted this in the prep notes today.
It was the Russian-hosted open source software that's used, oh God, yeah, everywhere. And everyone's going, oh, all of a sudden this is a problem. And remember, the open source community, okay, hang on. Let's make a point of going back and giving the audience the context; they don't know what we're talking about at this point. The story I posted was that a Russian group is the source of a module that is in practically everything open source. It's a Go package.
If you follow Linux, you'll know that Go is where everybody wants to go, from C++ to this new language, Go, which is supposed to be more secure. Except for one particular module that is supported only by Russians: everybody who supports this module is Russian, and the head of the organization that supports it is under sanctions. And what it does is supply the parsing and retrieval of JSON. Now, you don't have to be a smart programmer to know
that JSON is used in a lot of places, isn't it? Yeah. So that is what has been discovered, and we've all woken up going, wait a minute, this guy, and he's not named Vladimir, is supporting almost all of our open source libraries for this function. That's a bad thing. But I wanna paint a broader picture for a second. I was like, okay, why did this happen? Remember the dawn of the age of open source.
I just wanna give a shout out: these are some of the best of us as humans. They were thinking about how we genuinely collaborate for the collective benefit of everybody on the internet. This was non-commercial: let's make better software together, let's advance technology together. And I love that. This is the Ben and Jerry's ice cream of the internet, right? And these are still the internet hippies, and I love the internet hippies, right? It was fantastic.
And at the time this was happening, we were heading towards that Fukuyama end-of-history view: the Cold War was over, unending prosperity and the end of war, democracy was on the march thanks to the power of the internet, authoritarianism was done, and we're gonna have unending economic growth, yada yada. Looking back on the nineties is so sweet now.
You look back with a nostalgia that I think people in the eighties had looking at the fifties; now I get it. Not to say the fifties were great for everybody, they weren't. Nostalgia is a funny thing. But the open source community, that's how you ended up with people from Russia doing cool key things. Because before Vladimir Putin's rise, we thought we were gonna have a more normal relationship globally with all kinds of places, including Russia. And now, geopolitically, this has changed.
And I think this whole idea of a fractured internet, of this unwinding of globalization, this is a canary in the coal mine of what that means for the open source movement. And what I can tell you is that if the open source movement genuinely is as dead as globalization is for the next 30 years, software costs, everything's about to get expensive. Everything's about to get real, real expensive if everyone has to do their own coding from the ground up. And I'm not saying that we can't. Yeah.
And before everybody gets on their high horse and says, open source, we should get rid of it, just in case anybody gets on their sort of "well, open source, I always knew that was bad": can you say supply chain hack, boys and girls? We're reusing modules in commercial software too. What we've got is an explosion of software without any ability to track its provenance, where it came from. That's at the heart of it. And I wanna challenge that a little bit.
There's the ability to do it. That's established. It's just whether people do it and whether they put the right checks and balances all the way up the chain. And it depends on the limitations of how you're scanning, right? Maybe you go three layers deep. Do you go six layers deep? Do you go seven layers deep? Do you go 10 layers deep? And this kind of comes back to the theme of the critical infrastructure, right? It's all connected. You can't say, six layers deep, that was good enough.
You need to go all the way back, and then you need to protect that chain. The open source community, I don't think it's dead, and I don't think the movement is dead. But I think, like all the things we've learned along the way about communication innovation, and you could say the same thing about the printing press, right? People thought it would be this great boost for democracy, which it was, but it was also a great boost for authoritarian dictatorship too, right? It's just a hammer, right?
What you use it for is what makes the difference, whether it's good or bad. So I think open source is gonna be the same, where people will have to get more serious about checking the provenance. If you're a software supplier, you need to be more serious about the provenance of all of the modules and libraries you've used.
The smaller the code chunks that you can look at, the more likely it is that you'll actually understand what's going on in them, and be able to make sure that they're good and less open to security issues. So I'll put that out there.
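As a deliberately simple illustration of what "checking the provenance of your modules" can start with, here's a sketch that reads a Go project's go.mod, lists its direct dependencies, and groups them by hosting domain. It's a first-pass inventory only, assuming a standard go.mod layout, not a substitute for real software composition analysis or an SBOM.

```python
# Minimal provenance sketch: read a Go project's go.mod, list the modules it
# declares in its require directives, and group them by the domain that hosts
# them, so a reviewer can see at a glance where code is coming from.
# Covers only direct requirements; real supply-chain review also needs the
# transitive graph (e.g. `go mod graph`) and a proper SBOM. Illustrative only.
import re
from collections import defaultdict
from pathlib import Path

REQUIRE_LINE = re.compile(r"^require\s+(\S+)\s+\S+")   # single-line require
BLOCK_LINE = re.compile(r"^(\S+)\s+\S+")               # line inside a require ( ... ) block

def direct_requirements(gomod_path: str) -> list[str]:
    modules, in_block = [], False
    for raw in Path(gomod_path).read_text().splitlines():
        line = raw.strip()
        if line.startswith("require ("):
            in_block = True
        elif in_block and line == ")":
            in_block = False
        elif line and not line.startswith("//"):
            match = BLOCK_LINE.match(line) if in_block else REQUIRE_LINE.match(line)
            if match:
                modules.append(match.group(1))
    return modules

def group_by_host(modules: list[str]) -> dict[str, list[str]]:
    by_host = defaultdict(list)
    for module in modules:
        by_host[module.split("/")[0]].append(module)   # first path element is the hosting domain
    return dict(by_host)

if __name__ == "__main__":
    grouped = group_by_host(direct_requirements("go.mod"))
    for host in sorted(grouped):
        print(f"{host}: {len(grouped[host])} module(s)")
        for module in grouped[host]:
            print(f"  {module}")
```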
And David's gonna hate this one, but this is a really good use of AI. You can hate the code that AI generates, but there's nothing better for finding out what code does, given that no code is ever documented, or documented well, than running AI against it and asking, what does all this stuff do? It can read code that I can't read, and it can tell me what it's doing. So we have tools, and you've pointed out quite well, Laura, we've got the tools. We know the issue. We just gotta do it.
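A minimal sketch of what "run AI against undocumented code and ask what it does" might look like, assuming the OpenAI Python client; the model name and prompt are placeholders, and any comparable code-capable model or local tool could be swapped in. The summary it returns is a starting point for human review, not ground truth.

```python
# Minimal sketch: ask an LLM to summarize what an undocumented source file does.
# Uses the OpenAI Python client (openai>=1.0); the model name is a placeholder
# assumption, swap in whatever code-capable model or local tool you actually use.
# Treat the summary as a reviewer's starting point, not as ground truth.
from pathlib import Path
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def explain_source(path: str, model: str = "gpt-4o-mini") -> str:
    source = Path(path).read_text()
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system",
             "content": "You are a code reviewer. Summarize what this code does, "
                        "its external dependencies, and anything that looks risky."},
            {"role": "user", "content": source[:20000]},  # crude truncation for very large files
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(explain_source("legacy_module.py"))
```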
That becomes part of the due diligence, or part of the quality, of making any software program or taking any software into your organization: understanding who it hangs out with. Because with tangible things, people can see and feel it a little bit better, right? We have regulations around how bridges are built. We have strong provenance as far as how the calculations are done. We have a licensing regime around the people who are allowed to sign off on the designs. We have rules around inspection.
The materials that go into it are sourced, and people understand the full supply chain, down to the screws and the rivets, where those things are coming from, and the material composition within them. And we want all of that because we drive on them and put millions of tons of weight on them, X number of meters up in the air. I'm gonna use meters 'cause I'm Canadian, though I wanted to use something else.
And we have these expectations of those things, but a bridge is so geographically limited, and the people who use it are within a confined space. And then we look at systems that we've now extended to billions of users on the planet, and there are none of these things in place. And then we wonder why we have the problems we have.
And you're hitting on something really important, and it's this: we still live in an era, and maybe in the next monthly review I'll remember the exact research study, but I came across this just before the show, and it said something like, organizations with more security tools are less secure, because they keep just buying the stuff but they don't actually properly implement it.
They probably don't actually fundamentally change how their business is operating to be more secure and resilient. Going back to the Marks and Spencer example, right? I'm sure there are a lot of vendors that are gonna sell them their next-generation firewall or their next-generation AI-powered MDR, full-on 14-year-old-David eye roll, but they're not fundamentally gonna fix
their ability to have a good, tested incident response plan, or to review their supply chain for their software and their systems and put in place an SBOM. The thing about security is we always go for the sugar-rush, quick-calorie junk food solution. We don't do the hard work, from the field to the table, that's required for a proper, nutritious security meal. You can tell that I'm very much focused on food lately. It's interesting to see where we are, right?
We've got an internet that we walked into when it was what the nerds did and it wasn't critical to our everyday life, and it's now essential to sustaining the life and wellbeing of 8 billion people on the planet. But we still treat it like it's a plaything, like it's just a fun thing.
Think about how smart tractors in Canada are solving the labor shortage on farms across the country, having the things use AI and run themselves in the field because we don't have enough people to staff them, or people who want to do that manual labor at the rate we're willing to pay for our food. We depend on this stuff, but we don't protect it.
You said a curious word; you said "we," and I don't wanna make it seem like it's the people who are out there working in cybersecurity that are the problem at one point or another. The reason why you can tell where the rivet on a bridge came from, or, if there's an outbreak of E. coli poisoning in Minden, Ontario, where I live, they can trace where the lettuce came from, is not because the lettuce farmers wanted it, and it's not because the rivet makers wanted it.
It's because there is legislation and regulation that says thou shalt provide a chain that tells me where any piece came from. There are standards that must be upheld, from engineering and other places, that say you must know this. We've got to step up.
Finally, 50 years into our world, and maybe, I don't know, the first virus I saw was in the 1980s, so let's say 45 years into our world of cybersecurity, we need to start to treat this like a profession, like engineers or anyone else who has to obey certain rules. And that goes against the grain right now of this deregulation thing. But I gotta tell you, every time you fly in an airplane, you want to know that there was regulation. Trust me. And listen, Jim, I wanna give a shout out to AI.
I mentioned this on the other show, but the AI that does collision avoidance, the machine learning, the simple logic-based, structured automation that avoids humans potentially making the human errors that turn a bad situation into a tragic one. Yes. Laura, you were about to say something relevant. I was. Ooh, that's a shot fired. No, just to wrap up the thinking on what you were saying, and in particular a shout out to engineering.
There are debates around how the term software engineer has been used and abused over the course of time, and it is a sore point for regulated engineers who actually had to do properly qualified training in ethics and uphold certain promises whenever they sign off on something.
I think there are a lot of people who call themselves software engineers who, if they had to put their name to sign off on the code that's published, and then, if there is a failure in their code that causes a mass issue with availability or security or whatever other issue it might come down to, they were personally on the hook to be sued, that would change a lot of perspective around how people feel about the quality of the code being put out there.
'Cause at the end of the day, if there's not a person on the hook, whether it's within the corporation or an engineer, the responsibility just gets distributed and people feel like they're safe. I'm a consultant. I spent 20 or 30 years of my world being a consultant. I am a certified management consultant. I hold a designation that is duly authorized by the province I live in, and I'm licensed in terms of being able to do this. I have a code of ethics I have to adhere to.
I can be called before a discipline committee in the same way an actuary can or a lawyer can. And I agree: if you're gonna call yourself a software engineer, and not just a guy who writes code or a lady who writes code, then you have to step up to have that profession be regulated. But again, we also need regulation, and we need those two pieces. They're missing in this industry like in almost no other industry.
But also there are no rules to hold somebody to a professional standard right now when it comes to code. Yeah, absolutely.
But the other part is that, with this level of diligence and approach, the work that Tanya Janca from Semgrep does on educating developers to create secure code requires investment by companies to actually do it: to slow down their build cycles, to do the validation checks, to create cultures where developers can actually be assertive and say, no, this is an unsafe way to do that. That is gonna come at a cost, and it is going to significantly increase the cost of cheap software.
We live in the era of cheap software. It's incredibly valuable to us, it does amazing things, but we don't pay what it's worth. And our willingness to pay to have secure, quality software is questionable. That'll be the interesting thing to see: I don't think the market has ever sent that signal. I think we're in an era of deregulation now in the globe's largest marketplace, so the pressure is not there.
The Europeans are overwhelmed already trying to hold everyone to a higher standard on data privacy, AI and security. Software quality may be a bridge too far for them. That leaves us with a lot of questions. But I'm gonna put this to you, and not to disagree with you, David, but I'm just gonna say: would we really miss half the crap that gets into software? Do we really need a Word program that has 50 million lines of code in it?
I'm not so sure that I would really miss some of this stuff. As a matter of fact, I've been saying, could you slow down the development a little bit? 'Cause I have to use this stuff, and I like to be able to use it regularly. Let me put it another way. Imagine we went back to the days where you had to pay for your annual Windows upgrade, right?
Now we're in an era where you get basically almost a new version of Windows every six months, with significant feature enhancements, and it costs you nothing. But would we be willing to pay 150 bucks each for the next six-month upgrade of our OS? We used to, but now we expect it for free. To have a Windows that had half the vulnerabilities, would we be willing to wait twice as long and pay more for software that's also secure? That's the interesting challenge.
Look at this madness around the AI arms race, right? It's gotta be first, it's gotta be best, we gotta just release this as fast as we can, regardless of the harm it can cause, from deepfakes to hallucinations to all kinds of other chaos, the environmental side. We just gotta go, we gotta go, we gotta go. We're repeating the same mistake with AI writ large, and the insanity around that, that we did with software. Either you let somebody else develop it or you do.
And that's the world we're in. We're in a competitive world. I think we need software minimalism, right? We need some Marie Kondo, maybe. Yeah, that'd be good. But we do need it, and nobody in AI has been willing to participate in any form of regulation. No. And regulation is not necessarily bad if it's there to protect you. We went through that with NASA. When NASA first started launching people up into space, you were rated on having no mistakes.
After a while it became, you were rated on being faster, and we had the great disaster where we lost seven astronauts. So we go back and forth on this continuum. But right now, the issue before us here is not world hunger; it's that we're in a world where we can't do this any longer with software. We're at a place where stuff's gonna start crumbling if we don't address the security in software. And, I kept saying Providence, provenance in software, where it comes from.
And those are two things I think we can wrap up on. No, I wanna get in one more story. Oh, sorry. Yep. Laura, did you wanna say that? Okay. I don't wanna leave, we can't leave, without this story. The story that won't stop. And we all remember, back at the early part of the month, the big story was a bunch of people in the Department of Defense who were watching an attack going
against Houthi rebels. And they were merrily putting this stuff out on their phones in a commercial app called Signal. But of course it's encrypted, so we're okay. All kinds of problems with that. And we've had great scandals, but this story just will not go away. So in the middle of all this, we not only had the story of Signal, we not only had the story of how it got there. I don't think anybody's actually been fired over this. Mike Waltz, no longer the National Security Advisor.
And he's now the UN ambassador. Fired, transferred, demoted, I don't know. Put him where he can't do any harm, right? But not the guy that actually disclosed the timings of the airstrikes; that would be Pete Hegseth. He's got a lot of time on tape saying people who do exactly what he did should be in jail. Yeah. But that's American politics. The what-about-him part is American politics. That's just their game. You can't do anything about it.
But what is frightening to me is that there's supposed to be a structure here, and it's broken down, by which these people are saying this stuff was put on their machines. So how could this be? I just wanna back up for a second. At first we were all calling this Signalgate, and credit to Signal, they just rolled with it, had fun with it. They did a very cheeky software update that I thought was hilarious. But it turns out it wasn't Signal.
It was this thing called TM Signal, made by a company called TeleMessage, which took the Signal open source code and made a third-party version of it that could then communicate with other Signal clients. Now, here's the delicious irony. In some perverse way, what we've now learned is that Mike Waltz and others were using TeleMessage. The reason you use TeleMessage is that it complies with the archival record-keeping required by the US federal government.
So the deep and twisting delicious irony is that someone probably went to somebody and said, we wanna use Signal, and they're like, the only way to do it and comply with the legislation is to use this Israeli-made tech. Okay, great. I get to do what I want and I'm compliant. Great. You notice what I'm saying? User need, compliance, and massively insecure are all in the same paragraph. Yeah. Because somebody came along and hacked TeleMessage in 15 minutes. But it's better.
It's better than just the vulnerabilities. So first of all, a shout out to all the hacker nerds who just dive-bombed on this. You can tell I spent the weekend learning World War II stuff and watching movies about the Pacific, but they literally dive-bombed the carrier that was TeleMessage to the point where they turned themselves off. They found out that the open source code had hard-coded creds. Oh my God, please just stop. And then their infrastructure was vulnerable. They got into that.
But the chef's kiss goes to the person that did the process flow and captured the video and screenshots to show that this app would take end-to-end encrypted messages from the Signal protocol, take a plain-text copy and save it, including to destinations like your friendly neighborhood Gmail account. Sweet. It's gonna be really interesting seeing what more comes out of this.
I don't think anybody has actually admitted yet, even though we know which account the person was added by, whether that individual ever actually admitted to adding the Atlantic's account. So how much access did some actors maybe have beforehand? Maybe it wasn't the individual and their sloppiness. Perhaps this was actually malicious. I don't know. I'll put it out there.
So the story that I heard about this, and this is one of my David-time patented cheap shots at AI that Jim loves so much, but apparently an Apple autofill suggestion came in from a media request from this journalist, and it had the same name as another contact on this person's phone. And it was like, hey, do you wanna update so-and-so's contact information? And he said yes. And so that was the person he intended to add to the chat, not the Atlantic reporter.
So I like that story 'cause it fulfills my particular zealotry. But I can see Jim, well, before you do the "ya burnt": Apple has been taken to task legally over calling that stuff they put out AI; they've been asked to pull it back, saying you can't call this AI. Anyway, back to this. All these people's phone numbers are out there on the internet.
Any one of the devices they're using, personal devices, is not only probably hacked, I will say it is hacked. But once again, and we can make this political or not, we can also go back to our own worlds and say: every time an executive can overrule your security procedures and not take personal responsibility for it. I'm not saying
CEOs or senior executives shouldn't be able to run a company, but they should have to go back to the security person and say, I will take personal responsibility if we change this. And I'll tell you, we've been dumping on CISOs, and we've been saying we wanna take CISOs to court and we want to prosecute them. Let's start prosecuting some of these people who come in and say, no, you're just a hack, you're just an IT person, you don't know anything.
Let them take responsibility fully and say, I will now personally take responsibility for the fact that if we as a corporation get hacked, it was my decision. And this is when I get back to it: you can do it nicer than that, but just go back to your executives and say, what risk are you willing to run? And I would just reinforce this: blind compliance creates insecurity. You need to look at your compliance mandates and say, is this way of complying safe and secure?
Just because it says it complies doesn't mean it's gonna do it in a safe, smart way. That's not to say that compliance isn't important. Laws, regulations, contracts matter. So I'm not slamming compliance. I'm just saying don't misinterpret "the software is compliant" as "the software is secure." Yeah. Your user requirements are not the same as your security requirements.
I think the really important point out of this whole story, for sure, is that at the end of the day, all of the tools, all of the controls, all of the good things we put in place, they're all able to be worked around by a user making a mistake. It could be intentional, or it could be completely by error, but the user who has permission to take an action, like adding a person to the chat.
If they add a person to the chat that wasn't supposed to be there, there is no security technology that's likely to catch that. "Are you sure?" is the best we have as a security control for that problem. So we have to wrap. Yeah, we have to wrap up, but I wanna leave with at least one positive story, because we've been in the morass of all the things that are going wrong. Laura, you dropped in a note about the fact that we're actually doing some post-quantum work now.
So, for the April roundup: AWS has now released their updated libraries, or their updated services, for ACM and Secrets Manager with the ML-KEM post-quantum algorithm. The important takeaway from that is, if you use AWS and you use those services for encryption, you have access to post-quantum algorithms that you should be looking at bringing into your usage. Google had announced in February, for their KMS, that they were supporting some PQC as well.
My challenge with the previous conversations about quantum readiness was that there was nothing tangible for the majority of people to do. There were certainly lots of tangible things for people on the math side to do, and on the coding side to do once we had the algorithms established, but there wasn't anything for the majority of people, 'cause I do not recommend rolling your own crypto. But now there are things to do.
So keep paying attention. These are the first items really coming out, being ready to go, and there will be more. Get your inventory up to date, know where you're using cryptography, where you need to be ready to update to modern libraries, and get her done.
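Since "know where you're using cryptography" is the actionable step, here's a minimal first-pass sketch: walk a source tree and flag files that mention common crypto libraries or algorithm names. The keyword list is illustrative and the substring matching will produce false positives; a real inventory also has to cover certificates, TLS configuration, key stores and third-party services.

```python
# Minimal first-pass crypto inventory: walk a source tree and flag files that
# mention common cryptographic libraries or algorithms, as a starting point for
# a post-quantum migration inventory. Crude substring matching, so expect false
# positives; a real inventory also needs certificates, TLS configs, key stores
# and anything encrypted by third-party services.
from pathlib import Path

CRYPTO_KEYWORDS = [
    "rsa", "ecdsa", "ecdh", "x25519", "ed25519",      # public-key algorithms to migrate
    "aes", "sha256", "sha-256", "hmac",               # symmetric/hash primitives
    "openssl", "cryptography", "pycryptodome", "bouncycastle", "javax.crypto",
]
SOURCE_SUFFIXES = {".py", ".go", ".java", ".js", ".ts", ".c", ".cpp", ".cs", ".rb"}

def scan(root: str) -> dict[str, list[str]]:
    findings: dict[str, list[str]] = {}
    for path in Path(root).rglob("*"):
        if path.suffix not in SOURCE_SUFFIXES or not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore").lower()
        except OSError:
            continue
        hits = sorted({kw for kw in CRYPTO_KEYWORDS if kw in text})
        if hits:
            findings[str(path)] = hits
    return findings

if __name__ == "__main__":
    for file, hits in scan(".").items():
        print(f"{file}: {', '.join(hits)}")
```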
Yeah. The other thing I'll just add is that this inventory opportunity you just mentioned is also the opportunity to say: do we need this data anymore? Is this system one we want to keep? If we're gonna talk about software and data minimalism, our Marie, what was her name again? Marie Kondo. Yeah. While we're improving security, we can clean our house. And wouldn't that be nice for everybody? It doesn't mean you have to get rid of everything, but it has to have joy and purpose in your business. So prepare yourself for quantum. And this is the positive side of competition, because if AWS announces it, everyone else will have to.
So there's our positive side of competition in this industry. And the second is we get to declutter. Absolutely. Organizers, rejoice. Thank you, Laura Payne, thank you very much, and David Shipley. This has been a great piece, and as usual, I'll have the editing of it to do, but I think we've had a great discussion. I hope the audience has enjoyed it as much as I have.
And if you're out there listening, wherever you're listening, I never know when people are tuning into this, whether it's on a Saturday morning or sometime through the weekend, whenever you take in long podcasts. Thanks a lot for joining us. Have a coffee, relax, and think about cybersecurity. I'm your host, Jim Love. Thanks for listening.