Welcome to CyberFocus from the McCrary Institute, where we explore the people and ideas shaping and defending our digital world. I'm your host, Frank Cilluffo, and this week I have the privilege to sit down with Michael Daniel. Michael serves as the head of the Cyber Threat Alliance, and prior to that was the Cyber Coordinator and Special Assistant to the President under President Obama. He came to that role through OMB, where he ran
the intelligence programs at the Office of Management and Budget. Michael is known to everyone in the cyber field, and I'm really excited to sit down with him today. Michael, welcome. Oh, thank you for having me. It's a privilege to sit down with you. We've been talking cyber together for many, many years. I thought what we would do today is go deep on maybe three or four issues and obviously draw on your wealth of knowledge and insight. But before we go into those issues, just out of curiosity, what was your why? When did you really realize cyber matters? And did you ever think you were going to be in leadership roles around cybersecurity? Oh,
well, clearly not. I mean, I'm old enough that, you know, cybersecurity wasn't a thing, for example, when I was. Well, I mean, it was a thing, but nobody really talked about it. It was infosec, right, back when I was in college or grad school. Right. And even when I first came into government. No, it was really at about sort of the 2004, 2005 time frame when a lot of the intelligence
agencies started asking for a lot more money for this thing called cyber. And I was overseeing their budgets as part of the intelligence branch of the Office of Management and Budget. And not very many people in the White House understood what this cyber thing was. And so, you know, at the direction of the
OMB leadership and others, we started getting more and more into it. And I very distinctly remember a conversation that I had where the, you know, a bunch of folks from NSA had come over, along with CIA and a few other folks, and they were briefing us on kind of the issues, not so much the budget, but the issues and sort of how the bad guys do their work and that sort of thing at the time. And I remember stopping one of them and saying, wait, wait,
wait, wait, let me get this straight. The bad guys are getting in through a hole that we know about, that we have a fix for, but we just haven't done it? And they were like, yeah, pretty much. And I remember thinking, this is not a technical problem. This is way bigger than a technical problem. This is like an economic incentives problem. And so that made it a lot more interesting, because I realized that it was a much broader thing than just
the ones and zeros. I do think you brought a unique skill set to that
coordinator role, where it wasn't purely the beeps and squeaks. Policy without resources is rhetoric, and at the end of the day, if you can figure out how to marshal and mobilize the economic instruments, we can start moving the ball. Yeah, yep, absolutely. I think in subsequent years, Chris Inglis and Kemba Walden and others rolled out the National Cyber Strategy. And one of the big takeaways, an issue we've discussed, but I'm glad to see it was sort of foot stomped in the National Cyber Strategy itself, was to try to shift the burden around cybersecurity from end users to entities that are more in a position to be able to punch back. That said, translating those concepts into reality is a tough set of issues. So what are your thoughts on that generally? Well, I also,
I give them a lot of credit for actually producing a cybersecurity strategy that says something. Right. Even if you disagree with some pieces of the strategy, you should be pleased that, like, it says something more than cybersecurity is good and we should have more of it. Right? More gobbledygook. Yeah. It actually lays out some ideas that we want to try to pursue. And one of them, as you're very clearly
stating, is this idea of realigning the security burden, shifting that cybersecurity burden. But you're absolutely right. So the question is, okay, where do we shift it to, and what do we give those who are taking on this burden, like incentives? What incentives do we give them to actually take on this burden that we're asking them to take on? Now, in some cases, maybe we say, okay, you've actually been making a lot of money, cybersecurity or software industry, maybe you could take on
this burden. But in other cases it needs to come with resources or legal protection or other things. Part of this, for example, is this question of software liability. We have very few products in our society where the creator, maker of those products bears no liability for how those products operate. Exactly. That's not a very common model. And so while you certainly can say there's a lot of things that users do that
just like with cars, right. You're not going to hold the car company responsible for people driving dangerously. Right. But we do hold auto manufacturers responsible if, like, the brakes fail. Exactly. Right, exactly. And that also played out in the technical report,
the building blocks. And we're going to have Anjana on soon to discuss that in greater detail. But in addition to the what, who should the burden be thrust upon? Yeah,
and I think a lot of this has to do with who has the capabilities. Right. And so in general, I think you're talking about mostly larger organizations and institutions, although not always, but I think you're largely looking at the structures in the ecosystem that are better able, better technically able: ISPs, software providers, telecommunications companies, cloud providers, all of those kinds of things. Those kinds of structures, right, those kinds of institutions that are going to be better positioned to take on some of the burden. And you mentioned incentives. What comes top of mind around
those issues. And what about regulatory and legal? Do we need new laws to codify some of this? In some cases, I think we probably do. A good example actually
is the telecommunications industry, ISPs and denial of service attacks. We actually know what denial of service traffic looks like, and it's pretty easy to identify a lot of it. And we could actually provide the telecommunications companies with, you know, some protection to say, look, don't deliver known denial of service traffic to the end customer, just drop it on the floor, and we'll give you some protection that if in the process
of doing that in good faith, you accidentally also block some other traffic. Right. And we'll hold, you know, we'll protect you, we'll indemnify you for carrying out this public function of eliminating the denial of service traffic. Right. And would that eliminate all denial of service traffic? Of course not, because the bad guys would figure out ways to hide it and obscure it and other things, but it sure would remove a lot
of it. Right. And again, this is not hard to identify. This is not about choosing content; this is not a content debate. Denial of service traffic is pretty easy to identify as just empty packets or whatever. And so you don't even have to get into the content debate to, you know, do that. But you've got to give the telcos some reason to provide that service. Right. Because right now all they see is downside. Exactly right. There's no upside for them. And we have in large part been blaming victims in this space, haven't we? Yeah, yeah.
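As a purely illustrative aside on the kind of filtering being described, a minimal sketch, assuming made-up port lists and thresholds rather than any real provider's policy, of how an upstream operator might flag the most obvious reflection and amplification traffic could look like this:

```python
# Minimal sketch: flag the most obvious reflection/amplification traffic upstream.
# The port list and size threshold are illustrative assumptions, not a real
# provider's policy or a complete DDoS detection method.

from collections import Counter

# UDP services commonly abused for amplification (illustrative subset).
AMPLIFICATION_PORTS = {53: "DNS", 123: "NTP", 1900: "SSDP", 11211: "memcached"}

def classify_packet(pkt: dict) -> str:
    """Return 'drop' for traffic matching a crude amplification signature,
    otherwise 'deliver'. `pkt` is a simple metadata record, e.g.
    {"proto": "udp", "src_port": 123, "dst_ip": "203.0.113.7", "size": 4680}.
    """
    if pkt["proto"] == "udp" and pkt["src_port"] in AMPLIFICATION_PORTS:
        # Very large responses from amplification-prone services aimed at a
        # customer address are a strong hint of reflected attack traffic.
        if pkt["size"] > 1400:
            return "drop"
    return "deliver"

if __name__ == "__main__":
    sample = [
        {"proto": "udp", "src_port": 123, "dst_ip": "203.0.113.7", "size": 4680},
        {"proto": "tcp", "src_port": 443, "dst_ip": "203.0.113.7", "size": 600},
    ]
    print(Counter(classify_packet(p) for p in sample))
```

A real operator would layer per-customer volume baselines and signature feeds on top of something like this; the point being made in the conversation is the legal cover for acting on that traffic, not the detection itself.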
And I think that's the kind of conversation we should have about, okay, so how do you do that? So that's how you do that particular burden realignment. Okay, well, then you look at different areas and you figure out what you need to do for different types of the security burden. I also think a big chunk of the security burden is not just about secure by design, it's also about the defaults.
Like what are your defaults? How do they come out of the box? Because as you know, a lot of people never change the default settings on their devices or anything else. And you know, so in a way. And this is a pet rock
issue of mine: secure by design, cyber-informed engineering. These are incentives and levers we need to be thinking about at the very get-go. Yes, yeah, absolutely. And that's
a good place where we might actually start to use some of the regulatory or statutory tools to say, you know, okay, if you're a critical infrastructure provider, and we'll probably apply this to the government too, we're just not going to buy software where the creator isn't able to demonstrate that they followed secure by design principles. And I think that gets to a question that we
started with is that it's not just the cybersecurity issue, it's actually embedding cybersecurity principles into design, into software, into hardware, in other disciplines. Yes, yeah. And I think again
you come back to, if you look at the market structure for a lot of software, right, there's no prize for being second, or if there is a prize for second, it's a much, much smaller prize than first prize. Right. And so almost every incentive in the software industry is to get to market fast. And we have to think about, okay, software manufacturers and designers are responding rationally
to those market pressures. Right. They're not just excluding security because they're like, I don't care about that. They're making a rational economic decision about what they have to prioritize in order to succeed in their industry. So we need to help shape that. We need to help shape those incentives so that we get the market forces working for us rather than against us. Now, are there issues of moral hazard here and how
do we balance that? Because that also does put a lot of emphasis and onus on some of these entities. Yes. So I would definitely say that you don't want
to shift all of the burden. Right. Again, I'll go back to the car analogy: you still have to put on your seatbelt, you still have to drive carefully, be aware of other drivers, and you are responsible for not drinking and driving. I would say the same thing is true here, that users will always be responsible for some pieces of the security puzzle, for being careful, for maintaining hygiene and that sort of thing. You'll never be able to take that away from them. And I don't think you should, nor should you. Yeah, I was about to even say, no, I don't think we should do that, because we want everybody to have some investment in their security. Absolutely. It's just that we're expecting the end user to do everything right now, and then we're surprised when it doesn't work. Especially given recent trends where small and
medium sized businesses, state, local, tribal, territorial, these are prime targets. Absolutely. Ransomware has made this a very democratized threat and risk. And I do think, to put a fine point on it, that's why this burden realignment is so important. Yes, I agree. And sort of since we talked a little bit about ransomware, let's talk about another issue you and I have had many debates and discussions on in the past and I
think we're in agreement. But it's the imposition of cost and consequence on bad cyber behavior. I mean, there have been some very good developments of late. You've seen clawing back of ransomware funds, but to be very blunt, these are onesies and twosies. We've got to get to the point where we can scale some of this. What are your thoughts there? Particularly around ransomware providers. Nation state actors, that's a little more Uncle Sam, but curious what your thoughts are. Well, I
think there's several different factors that are intertwined here. So one of them is that if you just, if you step back and you think about the scope and scale of the problem, clearly the level to which we are putting costs on the adversaries
is just not sufficient. One of the think tanks here in town, I think it was the Center for a New American Security, did a study not all that long ago that showed that, all things being equal across the board, if you commit a crime in the physical world, rob a 7-Eleven or steal a car or whatever, likely you'll get locked up; there's something like a 50% chance you will get arrested and prosecuted. So, you know, you start to stack that up, and it doesn't take too long before your odds of
being arrested and prosecuted are pretty high, which induces changes in behavior. Right. For cybercrime, the figure is less than half a percent. So it's a very safe sort of crime for a lot of people to commit.
More likely to get shot, too. Yes. And then you just add on the fact
that there are countries that have found it in their interest to harbor these kinds of cyber criminals and have them operate from their territory, primarily Russia. But there are others that do that, and the result is that, you know, it's very hard to arrest cyber criminals. And so the rate at which we are finding and prosecuting and arresting cyber criminals is just far too low to actually have any sort of deterrent
effect on, you know, somebody that's thinking about getting into that line of work. So clearly we have to increase the scope and scale and primarily the cadence, how regularly, the speed, the frequency with which we're actually carrying out those operations, so that it's actually happening all the time. Actually, one of my jokes, Frank, is that I've always said that we want to get it to the point where the media doesn't
even bother reporting on it because it happens so often. Ten-four on that. Right? You mean the good guys' activity? The good guys' activity gets so frequent that nobody even bothers reporting on a takedown or an operation, because it happens so often that, you know, it's just not even news. Right. We want to get to that point. And we're nowhere near that. And we're nowhere near that. So a piece of this is also starting to think about, then, okay, how do you impose those costs if it's not arrest and prosecution? Now, if we can scale that, maybe we should. If we can, great. You know, if the cybercriminals are stupid enough to, you know, take a trip to the Riviera and the French, you know, arrest them or whatever, that's great. You know, they show up in Monaco and Interpol meets them at the airport. Great. You know, extradition treaties are
not everywhere, though, right? Yeah, that's right. So that's great. So we don't take that
off the table as a tool. Right. But it just. It's not our primary tool, or it won't be our primary tool for most of these activities. So then you get into the question of, okay, how do you impose those costs and where do you impose them, and who can actually help you with that? And it turns out that in cyberspace, we have an unusual set of capabilities that are resident in the private sector in a way that are not in most other sort of policy, law
enforcement areas. Counterterrorism, for example, is primarily a government game. Right. Counter weapons of mass destruction is primarily a government game, even, like, you know, narco trafficking and those kinds of things. Again, primarily, yes, there are a few private sector entities that have insight and things, but they're kind of niche and there are a limited number of them, right? Largely
payment systems. Right? Yeah. But in cyber, the capabilities in the private sector are vast: both the intelligence about and knowledge of criminal activity, knowledge of the payment systems, knowledge of the software industry. The assets that are being stolen and the assets that are being attacked by ransomware are primarily in private sector hands. There's a lot of capability in the cloud and in the ISPs and in other cybersecurity companies
to actually take action. The issue here, I think, is that we want to get to the point where we can do what I call operational collaboration between the government and the private sector. So this goes beyond information sharing to actually synchronizing actions in time and space. You know, I'm going to bring up the JCDC. So for complete transparency, through the Cyberspace Solarium Commission we really emphasized this; we called it the JCPO. We weren't great with acronyms, but most of them got over the goal line. So we'll take the Joint Cyber Defense Collaborative, but very much from an operational perspective. And that is, I think, crucial, because in addition to the tools, it's about trust.
Right. If you're not playing ball with someone every day, you can't expect, even if you have the dream team in the U.S., if they don't practice together, the result isn't going to be what you always want come game time, because the repetitions matter. You gotta do the work, and you gotta build trust, which takes a long time to build and nanoseconds to lose. So I'd be curious what your thinking is in terms of models there. And I also wanna
touch on, because of your unique background, looking at crime as an industry. I think that's when we started seeing some impact on the narcotics industry, when we treated it as an industry. But before we jump into that, let's talk the JCDC, what you think that collaborative could look like, grade where we are to an extent, and where we hope to be. So I would say that you can think
of three things that the JCDC could do, that public-private, operational collaboration, on. And I think they do two of the three. So one is you can use the JCDC to help you manage an incident that's already occurred. Something bad has happened, Colonial Pipeline at the high end, or just the Change Healthcare one. Something bad has happened and you want to get industry together to help you with the response. So JCDC clearly can perform that function, and it has. You can also imagine we think
an incident might happen, right. And so we're going to get industry together to help us figure out and plan for what might happen. Right. Or to get the word out about what might happen. And a really good example of that was the run-up to the Russian invasion of Ukraine, right, where CISA, the U.S. government, leveraged the JCDC very well. Yeah, sorry, yes, no, there's a debate. So they leveraged the
JCDC very well to get the word out. Right. And to help organize industry, to help increase the level of protection on our critical infrastructure, you know, under the assumption that the Russians might, you know, take the opportunity to try to attack U.S. critical infrastructure during the, you know, the initial invasion. So those are two of the functions.
And I think JCDC does those. Now we could talk about how well it does those, but what I don't think it does right now, and I would be happy to be proved wrong, but I haven't seen evidence of this is it doesn't do the third kind, which is, okay, we know this set of actors is causing a lot of problems, so we want to organize a campaign around, around those actors to impose costs on them. That's what I don't see JCDC doing. And maybe that's even
beyond their remit. Maybe that needs to be another kind of organization because that's not really CISA's mission. Right. That's more of the sort of law enforcement and, you know, military intelligence world. So maybe you need a different structure for that. You could use JCDC for that. But I don't see evidence right now that they're doing it. But
I think the key ingredient here is the private sector. And there's still to this day, contrary to what we've all discussed for so many years, there's still a mindset, government lead, private sector follow. We've got to shift that calculus, don't we? Yeah, absolutely.
And I should be clear here, I'm not referring to hack back. I'll just, you know, give my public service announcement: that's a really bad idea. I'm talking about what the private sector can do under, you know, existing law, hand in glove with the government, what they can do on their own networks, on things that they own that are in the public domain. Before we jump
to the economic model, there is one group that I think does this exceedingly well, and it's Europol. Europol has served as a good enabler, working with other law enforcement agencies around the world, at least in allied countries, as well as with the private sector. Quite honestly, they couldn't do any of their job without private sector input here. That's right. And actually Interpol's getting better too. Really? Yeah, they've
got quite an operation going now in Singapore; that's where their cybercrime, you know, program is. Some of the Interpol members are part of the problem. Yes, they are
part of the problem. But, you know, nevertheless, I think they've done a good job
at starting to marshal some of the capability here. Great, great. Happy to hear that.
Let's talk the model now, because I think that's where we can potentially take a new bite at this apple and think about it a little differently. What are your thoughts there? So when you look back over the, say, last decade, you know, going
back certainly 15 years to the late, you know, aughts. Right. You know, cybercrime was still very much a sort of individual effort or a small group of people, and they sort of did everything, right? It was kind of boutique operations, and it was like an art. It was artisanal. What we've seen over the last 10 to 15 years is the industrialization of cybercrime. Yeah, yeah, right. It is. Now, as I always say, if your image of the hacker is still the dude in
a hoodie living in his mom's basement, like, that is not the primary model anymore. It's a big business. They've read their Adam Smith, they've read their Harvard Business School cases, and they run these things like a business, right? They have a supply chain, they have diversification and specialization. So in other words, some groups now, all they do is initial access. All they do is get access to a company. They outsource other capabilities. And they sell that access to somebody downstream, right? And then there's somebody else who
goes. And all they do is they design and build ransomware kits and they sell those or license them to, you know, users who actually conduct the actual, you know, they hit the button and actually encrypt stuff, right? And then those people, they have, you know, they have to get their money. So they mostly get their money in
cryptocurrency. But you still, Tesla's aside, you still can't buy very much with cryptocurrency. So you've got to eventually turn that back into cash, you know, fiat currency, some euros, dollars, you know, whatever. And so that means you've got to get access to the regular legitimate financial system. So you got all sorts of people that launder money and convert, you know, convert cash. So this kind of goes to your third leg of
this stool that we're not doing. Because do we have, right now... we don't have campaigns that are lighting up the system so you can understand all the bits and pieces. Or do we? I think we do, but again, it's not at the scope
and scale and regularity that we need, so we need to scale up those activities. So, for example, there's a project that is being sponsored by the World Economic Forum called the Cybercrime Atlas. Okay. And the reason we, those of us who helped get the project off the ground, call it the Atlas is because the idea is you want it to be like a book of maps, a reference set, like, okay, I want to understand the criminal supply chain for ransomware. Who's making it, where do they
sit? Sort of like a CIA World Factbook. Yeah, absolutely. For all the bad actors. For all the bad actors. Pretty cool. Who are the bulletproof hosters of choice and where do those bulletproof hosters sit? Oh, look, they're all, you know, 75% of them are in this country over here. I wonder if we could put some diplomatic pressure on that country, some economic pressure on that country's government to go crack down on
some of those bulletproof hosters. And why aren't we doing more of that, just out
of curiosity? Because some of those are not in adversary countries; some of those are in allied countries. Yes. Yeah, absolutely. I think it's because we haven't really started to take
this holistic view of the cybercrime ecosystem from beginning to end. And that's some of what the Cybercrime Atlas is designed to do, is to say, okay, here's what the private sector understands about the cybercrime ecosystem. And there's a remarkable amount that you can pull from open source data, right. Or data that companies are willing to provide and make public. Right. Even if it came from proprietary sources, they're willing to make it
public. And if you start doing that and start assembling this, this map, these maps, right. Then you can, it enables you to do a few things. A, it allows you to start saying, okay, well, this is what the private sector knows. So if I'm a government, I don't need to spend law enforcement or intelligence assets collecting that information because the private sector has already got it. I can spend my resources getting to the stuff that only governments can get to. Right. And then it also enables
you to say, okay, here's the roadmap for the places we should look at. Oh,
look at this. Everybody and their cousin is all using the same four cash out
services. I guess we know who to go target. Again, you're not going to like, is this going to stop other people from stepping up and providing cash out services? No, but all of this is about increasing the friction and reducing the profits that are available for cybercrime. I mean, we're never going to stop all cybercrime, but we
are going to make it harder. Or, we can make it harder. Yep. This reminds me of all the discussions around counterterrorism and the National Intelligence Priorities Framework. We could get to the point where we could really focus our exquisite and limited resources on very specific things. Right. And the private sector. But to
get to that, you got to have the trust. And I might note, when I look at successes against organized crime historically, just like the good guys rely on trust, so do the bad guys. Yes. And if they don't have confidence or trust in their broader ecosystem, that can cause a run and it can cause people to overreact. With the Italian mob, the Italian Mafia, if you thought your brother, cousin, or whoever was working with the FBI or anyone else, that kind of creates a chill
in their business model too, right? Absolutely. Yeah. And again, that's an example of how
you have to think more holistically about how you organize a campaign. And it has
to be a campaign. It has to be a sustained campaign using a lot of
different levers, some of which will be in the private sector, some of which will, you know, a lot of which will be in the government. But who should lead
this campaign? NCD, the National Cyber Director, or wherever? No, to me that's FBI. I think, to me, yeah, this is really a law enforcement thing. You know, at the end of the day, this is DOJ. This is a DOJ thing. Right. They need to be in the driver's seat for this, supported by, you know, the intelligence community. Supported by. There may even be times when we want to say, you know what, this is a high enough priority, we're even going to bring in Cyber Command,
because ransomware is a national security threat. And if we think they're going to disrupt a, you know, major piece of critical infrastructure, we're actually willing to use a military asset against these guys. You know, I think that should be uncommon, but it shouldn't be off the table. You don't know until, you know, so. And we don't
have that. So it actually gets to, you know, again, in the counterterrorism business, do you string them up or string them along? We can't make those decisions holistically right now because we don't have a concerted campaign. And it needs to be more than a campaign against cyber. Right. It's about actors. Yes. So you've got nation states here
where that's where I do think government needs to obviously take the lead. Yes. But when it comes to ransomware and other criminal organizations, I do think the private sector plays as significant a role in providing the information, because they're going to have the data. Absolutely. And I'll just make another point here. This has actually become something that
I've really thought a lot more about in the last year or two, Frank. One of the hazards that we have in the cybersecurity industry is we have all these names for these actors. Cutesy names. Yeah, cutesy names. And we say that they're for these actors, but the truth is they're actually for sets of behaviors. Right. They're sets of activities that we've lumped together. And some of them, I mean, and this goes back to the government days, right, some of them are really cool names. I
mean, who wouldn't like a name like Byzantine Hades? I mean, that is just an awesome name for a set of activities. But at the end of the day, your point is we're not going to arrest, prosecute, sanction, disrupt, or impose costs on Sparkly Kitten. We're going to impose that on the people behind them. Part of the other point of the Cybercrime Atlas is to actually start driving down to: can we actually start to do what we would call, from the old CT world, the link analysis? Who are the people, even if you don't have a true name? Like, you know, dude X. Right. And dude X and dude Y, and if you take them out, what the impact would be. Right. And here's how they link to the other group. Oh, it looks like they actually service multiple groups, and they have multiple personas, right, that they use in different contexts. We've also got to start treating it that way, because I think part of our problem has been we've stopped short of actually getting to the people. And that's really hard. Right. It's often easier with the behaviors and the things that you can find on a computer; you can get to those. Right. And oftentimes it's difficult to get to the human behind the computer. Exactly, exactly. But we have to do that if we actually want to start really imposing costs on the people, because that's going to be the only way that we actually have a deterrent effect or disrupt their ability to do what they're doing.
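As a toy illustration of that link analysis idea, connecting personas and shared infrastructure back toward people, a minimal sketch might look like the following; every name in it is an invented placeholder, not real actor data:

```python
# Toy link analysis: connect personas to shared infrastructure and see which
# clusters collapse onto the same operator or crew. All names are invented.

from collections import defaultdict
from itertools import combinations

# Observations of the form (persona, infrastructure it was seen using).
observations = [
    ("dude_x", "cashout_service_1"),
    ("dude_x", "bulletproof_host_A"),
    ("dude_y", "cashout_service_1"),
    ("dude_y", "ransomware_kit_Z"),
    ("dude_z", "bulletproof_host_A"),
]

# Invert the data: which personas touch each piece of infrastructure?
infra_to_personas = defaultdict(set)
for persona, infra in observations:
    infra_to_personas[infra].add(persona)

# Any two personas sharing infrastructure get a link; repeated links across
# different infrastructure are stronger evidence they are the same person
# or part of the same crew.
link_strength = defaultdict(int)
for personas in infra_to_personas.values():
    for a, b in combinations(sorted(personas), 2):
        link_strength[(a, b)] += 1

for pair, strength in sorted(link_strength.items(), key=lambda kv: -kv[1]):
    print(pair, "shared infrastructure count:", strength)
```

The repeated overlaps across different pieces of infrastructure are exactly the kind of signal that lets an analyst hypothesize that two personas belong to the same operator, which is the step the conversation is pointing at.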
You're absolutely right. Not to sound like a fortune cookie philosopher, but at the
end of the day, technology changes, human nature remains consistent. You have to get inside that decision loop and impose consequences on people. And not just people; I would say countries and organizations and the like. Before we jump to the third topic, safe havens: I remember, no joke, about eight or nine years ago, there was an alert that Russia put out to their criminal entities, to some of these cyber operators: don't travel to countries that have extradition treaties with the United States. How do we get around that issue? That's a hard, it's a hard nut to crack. It is a really difficult one. And I think to some degree in
the short term, we're not going to be able to do much about it. So
that then means being more disruptive, then. So what that may mean in the short term is we have to start thinking about where we are willing to take risk and where we are willing to, for example, burn a computer network, disrupt a computer network, even if it's inside Russia, if it's a criminal network. Like, even if it's of
intelligence value. You gotta act right, you know, and force them to replace that equipment,
force them to, you know, re-stand up stuff, right? And yeah, that's gonna come with some risk. And so we have to balance that risk in any given situation. I think over the long term, the question is how do you start to marshal enough of world opinion, enough of the global diplomatic cadre, and how do you connect the safe havens to other things that those countries want? There's a reason why they find it in their interest to provide safe haven. And so we eventually have
to start making it not in their interest to do that. And so if you start having the fact that they're providing safe haven, start impacting negotiations and other things that they want, then they'll start to pay a price for serving as a safe haven. And right now, on balance, they see that as being in their national interest.
And so eventually we have to figure out a way to marshal enough of the world to say that's just not behavior that responsible nation states engage in, and you want to be a participant in the global system, then you're going to have to not harbor cybercriminals. Yeah. Or it gets to the point where the bad actors bite
the hand that feeds them. Right. And we're not going to see that anytime soon.
That's right. Yeah. Hopefully we can be a little creative there. And you raise a
very valid point. Our greatest instruments may not be cyber in kind, and often aren't cyber in kind, but I'm not sure we're smart enough yet, until we take this campaign approach, to be able to calibrate the levers in a smart way. So let's go to another issue you and I have discussed in the past, and that's sort of the core functions of the Internet, some of which are in the hands of nonprofits that aren't necessarily well funded and operate in the shadows. Rather than sort of jump into that, I'd
be curious what some of your thinking is there. And if I'm not mistaken, you've been doing some good work with friends and colleagues like Phil Reitinger and others on some of this. So please. Yeah, absolutely. So actually, you know, a couple years ago,
Phil really had the idea of starting to put together a coalition of the nonprofits that actually provide products and services, that provide operational functions, that keep the Internet running, but also that do other things. Like CTA, we share threat intelligence. So my organization now, the Cyber Threat Alliance, we actually share threat intelligence among our members on a daily basis. We have activities. There are other organizations, like the CyberPeace Institute,
that actually provide, for example, services. They'll help an organization do an assessment. Global Cyber Alliance provides toolkits. So there's these organizations that are the doers; it's outside of the think tank space, right. And so that coalition was called Nonprofit Cyber. And as that got going, we started compiling catalogs of, like, here are all the different products and service
offerings that are available. It started dawning on some of the people in this group that, like, wait a minute. Some of these are actually some pretty critical functions. Yeah.
Mention some of those. Yeah, yeah. So you've got things like Shadowserver, which is the registrar of last resort, which means that if you're seizing a domain, one of these, you know, going back to our previous conversation about taking down illicit domains, seizing domains and things, you've got to put them someplace. The way the Internet functions, you've got to have them housed somewhere. Shadowserver often provides that function. Right. So that's a critical function. How is Shadowserver funded? Well, today it's funded by
Craig, as in Craig Newmark. Wow. Right. It's primarily funded through philanthropy. And do we really want some of these critical functions to be funded through philanthropy alone? Alone. And that's great. And by the way, Craig's dedication to this area and the amount of money that he is putting in should absolutely be applauded. You know, and so that is great. But I am not sure that that's the right model for, you know, all of the functions that we want to
have done. Any other examples? Yes. So you've got everything from, like, even things like position, navigation and timing for a long time were run by a university. Right. And
clocks rule the world. GPS. Are we there yet? Yeah, yeah. You know, and again,
it's like, so, like, what if that university, like, decided to reallocate resources? Right. Or
it was run by someone with different intentions? Intelligence. Yeah. Yeah. So again, there's several
of these functions that, you know, are critical to how the Internet operates. I mean, you've got others where we've actually figured out a funding model. So you've got, you know, the. The Internet Corporation for Assigned Names and Numbers. Right. Is a big deal. Right. ICANN's a big deal. But they. What we've decided is we found a market for that because we said, you know what, if you're going to register a domain, you're going to pay a little bit for that because that has value to you,
so you should be willing to pay for that. So ICANN has a funding model, because they can charge something for the service that they provide to manage all of those assigned names and numbers. Right. We don't have that for some of the other functions that are done. And so as a result, there's now this idea of these functions that are in the common good. You could also call them public goods if you're an economist. And so the question is, how should we be structuring
and funding these public goods? And what are your thoughts there? So I think we need to look at some different combinations of where do we want governments to actually do some support? You know, we may say we still want the function done by a nonprofit. We don't want this to be a government function, but we actually want to have some dedicated funding streams coming from, you know, the government. In other cases, we may want to say, you know what, really, you're actually closer to a public
utility. And so we're going to actually treat you like a public utility. You're private, but you're going to be regulated like a public utility. There's going to be you know, so the good news is you get a better rate structure, but the bad news is you have to, you know, get some approval for your rate structure. Right. And so I think there's just a number of these different models that we need to explore. And I should also note that this is an international problem. Almost none
of these functions that I'm talking about are purely U.S. This is really also about global institutions to actually support a lot of these activities. You know, what becomes really
difficult is when you look at it legislatively alone, you still have so many different committees that have a bite at the cybersecurity apple, but there are only a handful of committees that really have the heft to get something over the goal line in terms of funding as well as even authorization. The problem with many of these issues is that they sit in between jurisdictional entities. Right. Whether on Capitol Hill or even
the Executive Branch. Yeah. And that's right. Sometimes it's not even clear who should be doing what. And I should be clear that in some of these cases, I mean, there's a current example going on right now, that's been going on for the last few months, of the National Vulnerability Database. Right. You know, NIST, the National Institute of Standards and Technology. Cheri Pascoe was just on to talk about the Cybersecurity Framework. The Framework, yeah. And they do great work there.
But they very much had to reduce resources going to the enrichment of vulnerabilities. And what were the entries that were going in? So once a CVE, once a vulnerability has been identified and has been given a number, a Common Vulnerabilities and Exposures identifier, right, and given a number, NIST has for years, decades almost. Yeah. Going back to like 1999 or 2000, has been enriching that data and putting it into the National Vulnerability Database. Well, they largely had to stop doing that in mid-February, and they've been struggling to catch back up and to find the appropriate resources to actually put against this function. But this is a critical function that many, many organizations in the private sector rely on to help them figure out which vulnerabilities to prioritize for fixing.
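To make concrete what that enrichment buys a defender, here is a minimal, entirely illustrative sketch of the prioritization step that enriched vulnerability data enables; the CVE records, product names, and inventory are made up and do not reflect NVD's actual schema:

```python
# Minimal sketch of why NVD-style enrichment matters for prioritization: once a
# CVE record carries a severity score and affected-product data, an organization
# can rank fixes against its own inventory. All records below are made up.

enriched_cves = [
    {"id": "CVE-0000-0001", "cvss": 9.8, "product": "examplecorp:webserver"},
    {"id": "CVE-0000-0002", "cvss": 5.3, "product": "examplecorp:webserver"},
    {"id": "CVE-0000-0003", "cvss": 8.1, "product": "othervendor:database"},
]

# What this hypothetical organization actually runs, and how exposed each asset is.
inventory = {
    "examplecorp:webserver": {"internet_facing": True},
    "othervendor:database": {"internet_facing": False},
}

def priority(cve: dict) -> float:
    """Score a CVE for this environment: 0 if we don't run the product,
    otherwise its severity score plus a crude bump for Internet exposure."""
    asset = inventory.get(cve["product"])
    if asset is None:
        return 0.0
    score = cve["cvss"]
    if asset["internet_facing"]:
        score += 1.0
    return score

for cve in sorted(enriched_cves, key=priority, reverse=True):
    print(cve["id"], "priority", round(priority(cve), 1))
```

Without the severity and affected-product fields that enrichment supplies, there is nothing to sort on, which is why so many organizations quietly depend on that function.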
And so, again, this is a function that not a lot of people have thought about, and we haven't thought about ensuring that we have the right dedicated funding stream to support this activity that is critical to doing good cybersecurity across the Internet. So we really have to start thinking about how we fund some of these critical activities and the resources that go into them, because a lot of them are funded on a shoestring or as an afterthought. And that was probably okay when, like, the Internet was two, but it's probably not okay now that the Internet is a critical national function for commerce.
Everything, everything you can think of, and not just for the United States, but for the whole, you know, almost every country, you know, in the world. And so I really think that one of the interesting things about cybersecurity policy over the next decade will be us actually figuring out, okay, what are the funding models that work? How do we get the Internet? How do we make sure we've got these functions properly
resourced and, you know, financed? And I'm really glad you foot stomped that point. I
mean, this is not meant to be the empty pejorative, impossible to answer question. But the reality is we have not built the business case yet, have we? That's right. And the actors that have to be part of that play are so many. It's not just the three lettered agencies. That's right. And we need to get to that point. The question is, we all kind of realize that, but what's keeping us from doing that? And that's a hard one. And I'm not asking you, expecting you to
answer that. If you had that answer, we're all good, and I probably would have an island. But, you know, I think part of it. So, for example, in the Common Good Cyber area, one of the efforts is to actually build the business case, to actually lay out in concrete terms: here are the benefits that these organizations provide, and this is the benefit that they provide to the Internet as a whole. And here's the extent to which we can quantify it, and to the extent that we can't, here's at least some qualitative data to support that, so
that you can actually start making that case. Right. Or you go the opposite way, where it's a utility as a whole. Right. I don't think that's the answer in my eyes; I lean toward the free market. But there are functions that maybe are utilities. That's right.
And I think that in most cases, I would rather go down the path of trying the market incentives first. And if they really don't work, then, okay, then you think about maybe this is a public utility. But, you know, but we're not there yet. We're not there yet. And I think we need to build that business case. And again, I will say in some ways, we shouldn't be overly surprised at this. Right. You know, the. I mean, cyberspace as a whole is 55 years old, maybe
sort of at the outside. Right. You know, the Internet, the sort of worldwide web as we know it, is essentially 30 years old. That's not long. I mean, I know to people who are in tech, they think that's ages. Younger than us. But in the policy world, right, that's not very long. Exactly right. And particularly in the international relations world, that's really not very long. And so,
you know, nobody should be surprised that we're still evolving our thinking, evolving the policies, evolving the, you know, laws in this space, because it's still so new. It takes time, but we need to move with urgency. And unfortunately, it normally takes something like a really bad day for us to act. And we have time, and I hope we have some time, but we need to start moving. Michael, we covered so much territory here. What questions didn't I ask that I should have? Well,
I think there's still some broader questions out there that are, you know, I'll raise just a couple that. Or at least one. I don't think the conflict, the war between Russia and Ukraine has played out in cyberspace and through cyberspace quite the way that a lot of people thought, including myself. Me, too. And so there's a lot we need to do some thinking about. Okay, was that kind of a unique situation,
or should we draw some lessons? What lessons should we draw from that? And how do we not overdraw the lessons, and how do we apply those to potentially the next conflict, which will have a cyber dimension. Not necessarily. And this one does, too. Right. Russia-Ukraine has definitely got a lot of cyber conflict going on and everything else. A lot of cyber conflict going on. And a lot of hard work to
get to the point that they could be a little more resilient. Yeah, absolutely. So
what's the next conflict going to look like? And what are the lessons that we should learn from the conflict in Ukraine, the war in Ukraine? And I think that's a really important thing for us to think about. I think the other thing that we have to really consider is that we still don't have a really good handle on a lot of the systemic issues. And I've been amused that, like, it seems like almost every time we think we understand all of the critical infrastructure, not just the sectors, right, you know, it's: oh, wait, we didn't think about that company being a critical provider. Change as an example. That's right. Yeah. We didn't think about that. And, you know, we expand it a little bit more, because the connections across our economy are so complex, and they're almost chaotic in the mathematical sense of that term, that it's really hard to
understand ex ante. And the implications writ large. Because there are a lot of unintended consequences. Consequences here. Right. And so those are just two of the things that I
think about that are definitely issues that will continue to need policy development and policy work. You hit a word that is near and dear to me. Systemic. This is
where I think we need to do more to batten down risk. Because at the end of the day, we're not preventing everything. Right. We are trying to become more resilient, we're trying to manage risk, and we're trying to get the greatest risks off the table quickest. Easier said than done, but thanks to people like you who continue to fight the good fight, I am confident we will get there at some point. Michael, thank you for joining us today. Thank you for your commitment to these
issues for so long, making a difference. And until next week, stay safe, stay secure, stay curious. Thank you, Mike. Thank you for having me.