I'm Christian O'Brien, Managing Editor at NFX. How would a government regulate social media when the content they need to regulate is happening in real time, billions of times a day? These aren't printing presses anymore. These are networks, and network problems demand network solutions. Today, with Chris Anderson, former editor in chief at Wired, and James Currier, general partner at NFX, we're proposing a solution to bridge the divide between technology and society.
But to get there, here's Chris and James breaking down what Silicon Valley CEOs and government regulators need to understand about the new era of regulating networks. You start with the word stewardship. There's a notion that once tech companies get to a certain scale, they have responsibilities beyond those to their employees and their shareholders. And we can debate whether or not that's the case, but clearly this is being played out in real time.
Congress feels they do. China feels they do. Many people feel they do. So it's worth thinking about: how do we, as capitalists? And when we talked before — you know, I think companies suck at this stuff. They suck at stewardship, and they really are only good at serving their customers, their employees, and their shareholders.
So how do we find ways to satisfy society's needs that aren't completely ineffective and don't cripple a company's ability to be a company? Yeah. The control of the language and the public sphere changes when technology changes, and the technology has changed. Right. And it's not just a matter of technology changing — scale also matters. Little companies can get away with murder.
Big companies shouldn't, can't — well, maybe they do, but not for long. And so I think there are at least two dimensions we can explore here. One dimension is the idea that the big company is different. Big is different. Define big by some metric.
The other is that the levers of the social compact between companies and society fail as technology changes. Basically, technology moves faster than regulation and laws, and as a result we find ourselves constantly in the gray zone, with nobody happy.
And so you can't just say, well, let antitrust deal with it, because antitrust moves at the pace of tectonic plates. You can't say, let the courts deal with it, because again: slow. You can't say, just let the market deal with it, because we've seen what happens.
And so there must be something else that's nimble, and aligned — impedance-matched, even — with a company's internal processes, that can represent society's interest. Yeah. And so you brought up the concept of the ombudsman. The Wikipedia entry on ombudsman is really excellent. And if I could just read the first two paragraphs, as we're sitting here, and sort of ask: is this a problem that's been solved elsewhere?
An ombudsman is an official who is charged with representing the interests of the public by investigating and addressing complaints of maladministration or violation of rights. They're usually appointed by the government or by parliament, but with a significant degree of independence. In some countries, it's similar to a citizen's advocate or an inspector general.
Below the national level, an ombudsman may even be appointed by, or work for, a corporation such as a newspaper, or an NGO or professional regulatory body. The typical duties of an ombudsman are to investigate complaints and attempt to resolve them, usually through recommendations or mediation. They sometimes also aim to identify systemic issues leading to poor service or breaches of people's rights.
Anyway, it goes on, and you can read it there. But it seems to me the advantage of an ombudsman is that it's a well-understood role, one that involves the characteristics of independence as well as power.
An ombudsman has, if nothing else, a bully pulpit. And often they do have mediation or legal power to force the entity they work with to follow their recommendations. And they're faster and more nimble than a committee or a court — with Facebook's disastrous, or not disastrous, but Facebook's board as an example of what doesn't seem to be scalable or working.
An ombudsman could be more effective. And we have decades of precedent with newspapers, which are the closest analogy we have to social media. So anyway, that's kind of where I start. Yeah. And when you're talking about the board, you're talking about the Facebook Oversight Board, which they've been publicizing more.
Yeah. So the Facebook Oversight Board has certain qualities which are good, such as independence. They're transparent, which is good. But they're entirely complaint-driven. And it seems like they can only handle three or four complaints a quarter, as far as I can tell. And of the first five, they got at least one disastrously wrong. Now, I bet Facebook gets it wrong by at least the same proportion. But then what's the point?
If this board gets it wrong in about the same proportion as Facebook, what have we achieved? Right. And there are other reasons to be critical, too — how they're compensating the people who are on the board. Sure. The speed at which they're doing things — my main concern is that they're slow and bureaucratic.
You know, it's a committee. It seems more like a court than a board, and they're entirely complaint-driven. So they have no purview over Facebook's algorithms, for example. Facebook makes thumbs-up, thumbs-down decisions about whether certain posts should be kept up or not, and the board can reverse that. But the board can't change the underlying algorithms that Facebook uses, which create more problems than any individual case. Right.
And they're not smart enough to think about how the interface might change. Exactly. Here's the issue. One of the things that made newspapers a little bit easier to manage was that anybody could understand the technology of a newspaper. They could see a column. They could see a photo. They could see a headline — what was big, what was small. They could see who had written it.
These interfaces drive behavior on billions of posts today, behavior that's driven by the software design. How big is the box that you type into? Who gets to see it? Is the button called a like button, or is it more nuanced than that?
These sorts of more fundamental decisions about the playground in which we're all playing — very few people in the world are good at that at all, including most of the people at Facebook, and certainly at the thousands of social media startups that have existed over the last decade which have gotten no traction. People who have committed their lives to being good at it are still no good at it. It's a very difficult thing to do. It's a little bit like refining uranium.
One of the reasons nuclear weapons haven't proliferated is that refining uranium is just damn hard. So I would argue that they're more similar than you think. You described two elements of the UI — the user interface of Facebook and the user interface of a newspaper: headline, text, etcetera. But in both cases, there's an opaque, black-magicky process that feeds into that.
So the algorithm makes recommendations; and what's going on behind the scenes of a newspaper is reporting, which involves human beings trying to filter through a lot of noise and understand what's true — involving some elements of reportorial technique, some elements of judgment, some elements of bias or lack thereof. And so I would argue that in both cases, there is some sausage-making.
There's an algorithm. In both cases, there's an algorithm which is not apparent to the outside world. And an ombudsman would have visibility into both. So an ombudsman at a newspaper — if somebody complains and says, in your article about climate change, for reasons that are unclear, you decided to have a climate change skeptic to match the climate change believers.
Even though believers are 99.999%, you decided, in the interest of balance, to give the skeptics equal time. So that's a terrible algorithm. Well, that is in fact the algorithm of media — to try to bring balance — but not all issues deserve equal balance. And you could argue that's an algorithm badly deployed, and that Facebook also has well-meaning algorithms that sometimes aren't well deployed.
But it's simpler in the sense that once you publish a newspaper, it gets published, let's say, once a day with the old technology, and it was published the same to everyone. Neither of those things is true in digital media, because it's being published all the time. Who saw what, what time of day they saw it, what it was next to — that's different for everyone.
And so the complexity for someone adjudicating the content of the network was much lower with the old technology than it is with the new. For sure. My point would be that the structure of the product itself — Facebook versus Snapchat versus TikTok — they have different interfaces, and those things end up impacting what gets shared and how it gets shared. That's true.
I mean, you're absolutely right that newspapers are one-size-fits-all. So it's not a question of whether this article was right for him but not for her. It's: is this right for everybody, or is it right for nobody? And that's something an ombudsman or an advisory group could say. Absolutely.
You could say there's a threshold, and this article did not rise above our quality threshold and should not have been exposed to anybody. Whereas at Facebook, you could say, well, maybe this post isn't right for everybody, but it might be right for her. And so it's a much more difficult problem. However, I would argue that if you're looking for binaries, the atomic unit of the decision is: take the post down for everybody, or don't take it down for anybody.
And so there is some similarity in that respect. Yes, there's a crossover in those cases, but the oversight of the system is more complex in the new world, by far, than it was in the old world. So you could argue there are three classes of content. There's content that's clearly fine for everybody. There's content that's clearly not fine for everybody.
And then there's the gray zone, where content might be fine for some people, but not for everybody. It's easy to deal with the first two, and it's difficult to deal with the third. And you could argue that the first thing the ombudsman, or whatever it is, should do is just make sure we're really good with the stuff that nobody should see. Right?
And that's essentially what the Facebook Oversight Board is doing: saying nobody should have seen this, or maybe everybody should have seen this, but you got it wrong, Facebook. But the speed at which Facebook and other systems like Twitter work is so fast that there's no time to go back and say, well, we shouldn't have shown it. It came out 15 minutes ago.
That means 42 million people already saw it, copied it, and can now spread it elsewhere, right? And so not only is the deciding more difficult — because not everybody can write a MySQL query to figure out who saw what when, whereas a normal human can just read a piece of paper — but now you don't have the time to actually decide. Yeah.
There's no utility in an oversight board or an ombudsman, three weeks later, having a post rescored after it's no longer relevant. The only utility is deriving some lasting lesson from that mistake and changing the underlying algorithm so it doesn't happen again. That's right. And, or the algorithm, and or the interface. And or the policy. It could be all three. It could be an algorithm, it could be human beings instructed differently, or it could be how you display things.
Yep. It could be. As an example, in terms of how people should understand the design choices: if, on Instagram, the like button is bigger, and the size of the count of your like numbers is bigger on the page, it's very clearly indicating to you that you are playing this game to get the most number of likes.
People then go in and spend hours figuring out what they can post to get the most likes, and what they inevitably find is that the most extreme content gets the most likes. So, I mean, you've now escalated this discussion into something much bigger than a case-by-case content decision. You've now escalated it to the overall raison d'être of any platform.
Is this platform about getting likes? Is it about increasing your follower count? Is it simply about community? And so I wonder whether that's in scope. In other words, would it be appropriate for a body of some sort — an advisory board, or an ombudsman, or even Congress — to tell Facebook that they can't have likes? Right. That's the question, and that's the extreme version.
I don't think I'm proposing that, but I do want everyone who's thinking about this issue to be clear that the design of the product itself drives certain behavior, and we can't get away from that. It absolutely does. But you could argue that's a root cause. Like, okay, let's imagine you've got some conflict, and the root cause of the conflict is a civil war 300 years ago that people are still angry about. Right?
You know, the UN can't do anything about that. So what they do is come in and say, okay, given that you're all really angry, how do we stop you from killing each other? And so I think what you've described there — the underlying motivations or incentives of a platform — is probably baked in to some degree. Maybe. But I will say this.
I will say that we can put a lot of mechanisms — ombudsmen and other things — on top. But fundamentally, what's happening is that because Facebook is driven by ad revenue, they necessarily are driven by dwell time, time on site. And for Zuckerberg, according to people close to him, this has been true from day one, and it's still true.
This is the mental model driving his decisions, and it's preventing him from realizing that becoming a statesman, becoming a steward, is required once you have this platform — because he's so attached to dwell time. And it's absolutely true that we are driven by the baser elements of our soul. He's not there for the money, let me be clear. I'm not saying he's doing it because he wants to make more money.
He's already worth $120 billion or whatever it is. It's not that he wants $130 billion. It's that he wants to maintain his relevance, his power. He wants to maintain the ecosystem that he's built, and all the people he's employing, and all the love that he gets. For sure. So let's say that engagement is how he keeps score.
Now, money is how Wall Street keeps score of engagement, but Zuckerberg keeps score of how well he's delivering on his vision with engagement. And it's true that the darker angels of our soul — including things like conflict and tribalism — do engage us. And you've seen The Social Network, and you've read the book.
And it's clear — and by the way, they said the same thing about television — that it plays to our base instincts, etcetera. So it is true that our technology has moved faster than our evolution. In the same way that we like sugar and fat, we like conflict and titillation.
And television, like tabloids, like Facebook, is just giving us what we want. And by the way, there are certainly regulatory bodies that will stop us from taking drugs. Even though our body wants them, the DEA will stop us from taking them, or the FBI, or the FDA, or the USDA, or whatever.
And so there are some substances that are considered so harmful and bad for society that they must be regulated. And you could call Facebook a Class 1 drug. Yeah, you could call it a Class 1 drug, and you could regulate it. That feels draconian. But we certainly have that lever. And in China, by the way, they were perfectly happy to declare Falun Gong — a religion — a Class 1 drug, and ban it.
Okay. Yeah. Opiate of the masses, right? They're perfectly happy to put up their wall and decide. Exactly like Apple can decide what apps go on iOS, the Chinese government can decide what goes on their internet. Absolutely. So not only is that possible, it has been done — by a perfectly valid parallel universe, which is China. Now, we believe there's a cost to that: a cost in free speech and democracy and those things.
And we believe that's a price too high to pay. Right. And so we have these things where we agree, and then we have these things where it's unclear. And in those spaces, most societies, I believe, require some sort of temperament, some tempering of behavior. So for instance, Jack Ma went past that tempering point in China, and now he's disappeared, because the Chinese government decided he had gone too far.
He lost the thread of being a steward of his position in society. Right? So, I mean, it's a great example, because the average Chinese person would say, no way — he's a hero. He's someone we would all like to emulate. The Chinese government clamped down. Did he do anything wrong in particular? Probably not.
What the Chinese government is always concerned about is centrifugal forces — which is to say, power distributing. If power could distribute to the cities and the mayors, that would be bad. If power could distribute to individuals or companies, that would be bad. Any distribution of power, any decentralization of power, is a threat to the Chinese Communist Party.
And if you're the CEO of a company and you become too rich and powerful, they will find some violation of the law — because basically everything's against the law in China, it's just differentially enforced. They will find something, and they will dial you back in. So that is simply a control factor. And we don't believe that a central government should maintain that kind of control, or that any entity that gets too big threatens Washington, D.C. The United States is built on the opposite perspective, which is that of a balanced network structure. Exactly. So we believe in democracy. That's the foundation of our true American exceptionalism: we've been able to build this distributed network structure where everyone's encouraged to balance. Absolutely. So between the three things — federalism, democracy, and the balance of powers — we have chosen a path which is not the Chinese path.
And any decision we make about social media should be coherent and consistent with that path. Right? That's right. But now we've got these new people sitting at the table of the balance of powers. Right. Suddenly you have a Jack Dorsey. Suddenly you have a Mark Zuckerberg who are not vetted. They are not trained in stewardship or statesmanship; they are trained in A/B testing. Right. Well, we've always had those individuals.
They used to be called newspaper owners — the William Randolph Hearsts of the world. And they, too, were not trained. They, too, were just entrepreneurs. However, their media was regulated by the FCC. So there were standards of media, both broadcast and print. And there was a kind of gentleman's contract within the media that with great power comes great responsibility.
And they accepted that cultural idea, that mental model, for at least 60 to 80 years. Yeah. You could argue that every now and then one of them would go off the reservation — Murdoch, or Fox News, or even William Randolph Hearst, you could argue — they kind of violated the social norms of media. But by and large, the idea was that it's like owning a football team.
You know, once you're in the club, you kind of have to abide by the rules. Yeah, that's right. Or you're not going to get the social benefit that you want. And that's the norm — you can't write those into laws. This is the sort of cultural norming that takes place in the fuzzy center where things are unclear, and every society needs it. And there were professional bodies and associations that were designed both to set standards and to enforce them.
They had moral suasion, if not legal power. But there definitely was a sense that there were trade groups, a sort of industry organization. And if you were a newspaper, or any other form of medium — like doctors and accountants all have professional bodies — if you violated these social norms, you were called out, at least.
You weren't thrown into jail. There was no clear white line or black line that you could cross, and the same thing is going to be true forever. Even the Chinese government can't paint the exact lines. So when Jack Ma disappears, we're all surprised.
Did he violate a law, or did he just kind of stray too far? Every society, whether you're an authoritarian version like China or a more balanced version like this one, needs some sort of cultural norms to help adjudicate those muddy centers.
And when you get too much power, like a Zuckerberg, there's a gentleman's agreement, if you will, that you're going to hand some of that back over to other folks so that the balance of powers stays in balance, and the system continues to function optimally, or close to optimally. And right now, there's some debate. Do you think there is a gentleman's agreement that that's the case? There was.
There was, prior to the internet. And I think this has been thrust upon Dorsey and thrust upon Zuckerberg, and they aren't aware of those cultural norms. They aren't aware that there's this requirement in order for society to function better — that perhaps you don't maximize your dwell time.
Facebook, maybe instead of having a $750 billion company, you just have a $550 billion company — and that $200 billion of lost value today allows you to still have $550 billion in 20 years, versus getting split up and dealing with all that mess. You know, in the 20th century, when there was the great global battle between capitalism and communism,
the countries that were flirting with socialism would say that any time a company got too big, it would be nationalized — in the interest of security, or the people, or whatever. And so that was always a threat. And even in AT&T's time, there was a move to nationalize AT&T; instead, they broke it up. So there has been that stick in our past.
Now, I don't think it's plausible to nationalize Facebook. Right. Because no one could run it. Within a year or two, I guarantee you that product would die, because it's like a piece of art. It's a living thing. Look, Facebook was one of 150 social networks that got going with the same five features. It's very delicate how these things run.
And if you tried to do something like that, the talent would leave, the thing would deteriorate, and it would die. Yeah. And of course, that was true for the telcos and the water companies and the electric companies as well. They all sort of entropically degraded after nationalization, Venezuela being the primary example. However, it took decades for Telefonica to suck.
Actually, I'm not sure it took that long. They probably sucked to start with. But it did take decades. This would happen overnight.
This would happen overnight for lots of reasons. The physical network effects that those telephone systems have are really the strongest network effects in the world, and the technology is relatively simple, so tens of thousands of people can understand it, repair it, maintain it. It's a lot easier than digital interfaces and viral flows and A/B testing.
All the trust and safety work that Facebook and Twitter do is just enormous. Let me bring up another historic example, which is more recent and maybe more relevant. In between the newspaper era and the social media era, there was the banner ad — Web 1.0. And there was the Internet Advertising Bureau.
The people who did ad tech, for lack of a better word, were members of the Internet Advertising Bureau, and they agreed on things like the sizes of banner ads, and cookies, and some basic standards and norms. Then third-party advertising kind of disappeared, and it all went to first-party — it went to social media advertising. I don't know whether the Internet Advertising Bureau has any influence anymore.
Do you know anything about that era? No. I mean, with Google and Facebook absorbing 85% of all the advertising online, they just get to decide what they want to do. The IAB still impacts the other players, but it doesn't matter much. And the IAB could not be a norm-setter for Google and Facebook? No, they wouldn't accept it. I think the IAB was an apolitical entity helping with simple things like banner sizes.
I don't think there was anything other than standardization for the flow of commerce — just like you have an Ethernet standard and you maintain the Ethernet. Do you think it's an attractive proposition to say to Mark Zuckerberg: accept the following limits. It'll cost you 30% of your revenues, but do it because it's good for the world. Or is there a better way to sell this? It's a great question.
If I were Mark, it would be attractive to me — but you have to say "or else." There has to be an alternative that's worse. Right. I mean, that's one of the puzzling things to me, Chris: Facebook thinks they're not going to get broken up if they integrate their infrastructure. I don't understand why the government didn't care about that.
What the government should be caring about is the fact that if they split the company up, you're not going to have three or four different management teams trying to do trust and safety — getting rid of the child porn, getting rid of the hate speech. It's hard to do, and it requires a lot of algorithmic work, and who can afford the very smart people? If you split the companies up, they might not do as good a job as Facebook is doing. I have no idea.
I mean, you could argue that even if you break out WhatsApp and Instagram and Facebook, all three of them have critical mass. All three of them have enough resources to do a good job of that. And they're different enough that maybe "good at that" is different between the three, and it would be better for them to focus. It could be. It's hard to know which way that would go. Just take YouTube versus Google.
They are pretty radically different. Now, there's probably some shared technology on the back end, but again, they both have enough resources to do the job. So look at a person like Bezos, right? He's made his money, he's got his fame, and now he's turned to buying The Washington Post, and to trying to keep The Washington Post the steward organization it has been for the last 80 years.
And then he's looking at, you know, Space exploration. He's he's now wants to leave Amazon and start spending his time on things that are more about his legacy and about the future of humanity and his mindset has shifted from the rapaciousness that was required for him to get where he is to a place of more stewardship. He's made that maturing jump as Joseph Campbell discussed in the hero of a 1000 faces and that I believe is what could happen to a Zuckerberg Flint a Dorsey? For sure.
I mean, I, I, I, you use stewardship, I would say, you know, philanthropy. I mean, for example, you know, bug, Bezos is you know, interests also include climate change, you know, environmental and, you know, and and James, as well. So, you know, I think what they decided was that the best vehicle for them to exercise their social obligations was not their company.
They had to leave their company and take their money, and only then would they be free to be the stewards they wanted to be. Because inside, they needed to pursue high share prices for all their employees; they were stuck in that mindset. You've got all sorts of conflicts of interest. You could say they had shareholder conflicts, that they were hemmed in by, you know, competition. It just felt more natural for them to make a clean break.
And you have argued this for twenty years. Right, Chris? I have. I've argued... I mean, back when I was at The Economist, you know, back in the nineties, we argued this. That was the beginning of corporate social responsibility. There was this sort of CSR movement, and I was on the team that argued against it, on the grounds that companies suck at this, and that governments are good at social issues.
Governments are good at deciding what society's priorities should be, and companies are really good at maximizing shareholder value. If companies try to do both, they'll do both badly. And so the best thing for companies is to be the profit-seeking entities they're very good at being, and government is designed to rein them in.
And then there's, you know, a balance of power. The companies create a thriving economy, the government reins them in when they go too far, and everyone knows their role. And I think you could argue the jury's still out on corporate social responsibility, but a lot of it has, I think, been either ineffective or window dressing.
Sure. But you can find a Howard Schultz, who while running Starbucks gave health care to all of his part-time people, or Hilton Hotels doing the same thing in the fifties with their maids. And you could argue that that's very aligned with their profit seeking. It could just be that it's a win-win.
People talk about the triple bottom line, you know, good for the company and good for the environment. Sure. In the long term, it's probably better for Starbucks that he did that, and he might have been right to do it, even though at the time people thought it was controversial and the stock price, you know, dropped, blah blah blah, people thinking, oh, he's too much of an activist.
But you might make the same argument about something like social media, which is a particular type of product that requires some maturity to manage; it can't be as rapacious as maybe some other forms of business. Because a lot of its value is coming from the ownership of the social sphere, and that is a commons. That is a collectively owned thing. Whereas with coffee, I can choose to go to Peet's, I can choose to go wherever.
If anything, there's even more of a reason... I'm not sure I'm getting the distinction. So, Facebook doesn't own the social sphere. They have your time if you choose to give it to them, but there are many other places you can go. What's the distinction you're drawing between Facebook and Starbucks? One's a product, and the other is something that affects how you think. It's like running a news program.
It's a newsfeed. Okay. That's true. But in terms of individual choice and individual action, they're the same. Starbucks has got, you know, a lot of coffee shops; Facebook has a lot of eyeballs. But if you don't like it... we choose to give Facebook its power. Yeah. But Starbucks isn't making you think about, you know, anti-vaccine stuff.
It's not making you think about, you know, the Proud Boys storming the Capitol. That's not what it does. It gives you a coffee. It might say, you know, let's give some more money to the people in Colombia, but that's marginal. Yeah. I mean, there are certainly people who believe that Starbucks is a gentrifying influence and, you know, destroys local coffee shops.
I mean, there are certainly people who will find a social bad in any large multinational corporation, for all sorts of reasons, including competitive ones; they just have, you know, access to capital. So I think there are some analogies there. But getting back to, you know, what Facebook should do. So Facebook is now threatened with a breakup, which may or may not be plausible.
They are aware that they're causing harm, and they're trying to do something about it, perhaps not with their full heart and energies. They've set up some mechanisms to do it better, such as the review board. And our argument is that they haven't nailed it yet, that their internal methods and this board are not sufficient to address the root cause, which is the algorithm.
And Congress is not good at it either; it's a blunt instrument and would probably do more harm than good. And the courts and the regulators suck in their own way. And so if you were going to reduce this to a proposal, the draft might be something like this: once any entity gets above, you know, a hundred million users, daily actives, weekly actives, monthly actives...
Okay. Once you get above a hundred million monthly actives, once you get to a hundred million MAUs, you have now triggered the threshold by which you have to have some independent oversight. And you can either set up your own independent oversight, which is transparent and independent and nevertheless has power over the company, or Congress will do it, or a regulator will do it to you.
You know, and they're gonna stick you under a public utility commission and, you know, good luck to you. Yeah. It might even be fifty million, depending on what type of content it is. The problem with fifty million is that, you know, Clubhouse, for example, will go from nothing to having triggered it in about two months. Yes, it will.
Which seems a little... I mean, they could also disappear two months later. They could, but they're also owning the public sphere. I would argue that you'd have to have a size and a time. So maybe it's a hundred million for a year, and now you're on the radar. Yeah. Okay. I'd say fifty million for four months. I would come down faster on it, having built some of these myself.
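The trigger rule being debated here, a user-count floor that has to be sustained over time, can be written down as a simple check. This is only a sketch: the 50-million MAU floor and four-month window are the figures floated in the conversation, not a settled proposal, and the function name is illustrative.

```python
from datetime import date, timedelta

# Figures floated in the conversation, not a settled proposal.
MAU_THRESHOLD = 50_000_000
SUSTAIN_PERIOD = timedelta(days=120)  # roughly four months

def oversight_triggered(monthly_actives: dict[date, int]) -> bool:
    """True once MAUs have stayed above the floor for the full sustain period.

    `monthly_actives` maps measurement dates (roughly monthly) to MAU counts.
    """
    # Keep only the measurement dates that are over the floor, in order.
    over = sorted(d for d, mau in monthly_actives.items() if mau >= MAU_THRESHOLD)
    if not over:
        return False
    run_start = prev = over[0]
    for d in over[1:]:
        if d - prev > timedelta(days=35):
            run_start = d  # a sub-threshold month broke the run; restart it
        prev = d
        if prev - run_start >= SUSTAIN_PERIOD:
            return True
    return prev - run_start >= SUSTAIN_PERIOD
```

A Clubhouse-style spike that fades after two months never satisfies the sustain period, which is the "size and time" point being made above.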
Yeah. Okay. We can agree that there's a threshold somewhere in there. And you could say that, you know, with great power comes great responsibility. Good news: you're a success. And by the way, I lived this, right? I lived this with drones, because drones were highly regulated.
It's aerospace, it's weapons, all this kind of stuff, and we found all these loopholes. As did, you know, Uber and Waymo and PayPal and everybody else. We found all these loopholes, and we just sort of innovated with, you know, open source and community. Until at a certain point we had a million units out there and ten million globally. And at that point the regulators said two things.
Number one, they said, look, we can't stop you now; the genie's out of the bottle. But we can now put walls around your sandbox. And if you want to move to the next layer of permissions... we can't stop you from doing what you're doing, but we can stop you from doing the next thing you want to do.
And the way you're gonna get to that next thing is not by conforming to our rules, but by working with us to develop new rules. And that was ten years ago, five years ago in particular, when they brought us in. It basically says that the penalty for your skyrocketing success in the early days is that you're just gonna have to play the regulatory game for your next chapter.
Your next five years are gonna be working with us to come up with appropriate regulations, to protect the public from your drones falling on their heads, or self-driving cars, or fintech. Robinhood, you know, is going through this right now. Yeah. And all of them are saying... Yep. Absolutely. All of them are saying, basically, we can't stop you, but we can make your life miserable unless you work with us to do the right thing for society.
But, Chris, what was your reaction when they came to you with that? My reaction was: look, the old model was "do things our way." The new model was "help us do things the new way." And my primary reaction was initially, you know, euphoria. They're listening to us. They understand that the game has changed. They want to be more internet, more Silicon Valley. I'm thrilled that they're paying attention.
They respect us enough to ask us to help write these new regs. Then that led to the dismay of realizing just how slow and inefficient this process is. And now I find myself, you know, basically sitting in regulatory committees for the rest of my life. And yeah, I gotta say that the honor I feel at being invited is diminished by the tedium of actually doing it.
But nevertheless, when I pull back, the one thing that all revolutionaries learn is that it's great to be a revolutionary. Fuck the man, tear down the system. Then what do you do? What do you win? Now you're the establishment, and you have to move from tearing down to building up. And it's a completely different skill set, and it's not that much fun, but it's kinda what you gotta do. Different personality type.
Exactly. So I think that now that we've torn down newspapers and torn down traditional media, we have to build up something new. And although it's not in our DNA and it's no fun, it's kinda what we gotta do. And I kind of feel that now's the time. Everyone's like, oh, I'm gonna start this network effect company. It's gonna be wonderful. Everyone's gonna talk about it. Everyone's gonna love me. We're gonna make a ton of money.
It's gonna be perfect. There's gonna be no downsides. And I think, come on, grow up. Yes, there are gonna be some downsides, and it is the case that you have to take care of the public sphere. You can't let the drones drop on people's heads. Yeah. I mean, watching Robinhood, you know, deal with the most extraordinary penalty of success, which is their capital requirements...
It's like, we have so much business that we can't satisfy our capital requirements, our reserves, to take people's money. It was an incredible kind of wake-up call. Yeah. Definitely. So the idea would be to go to them and say, look, you haven't done enough; you need to do more. And there needs to be what, a government-paid person who gets embedded in the company? Absolutely not. So, the newspaper ombudsman.
They're called ombudsmen. I don't know why, but that's just the old-fashioned term. They were always paid for by the newspaper; they're employees of the newspaper. But they're given a set term, so they can't just be fired. Their reports are public as well as private. They report on the complaints that come in. And they have the ability to talk to anybody and go anywhere within the company.
They have a pass, you know; they can open the kimono and see the algorithm if they want. And so how would you pick somebody? You try to pick somebody who has wisdom, judgment, respect, and gravitas, and who isn't an asshole, because, you know, you can't fire them until the term is over. And that person needs to be proactive.
And if they're considered to be a puppet, they won't be respected from the outside. And if they decide to kind of go rogue and, you know, rip up the company, then it's gonna be an unpleasant situation, and they'll appoint someone less draconian next time.
Nice. And this person would be allowed to address the root algorithmic causes, not just the consequences, not just an individual piece of content, but the algorithmic problems and maybe even the product design problems. Well, it's not clear. Let's say they discover that there's an algorithm that is, you know, causing people to, I don't know, become racist. They may not be allowed to disclose the details of the algorithm.
It might be proprietary. And so there'd probably have to be some sort of NDA, where proprietary information doesn't cross the firewall. They would absolutely have to have access to it, but they might not be able to completely describe it in their public report. So there could be some redaction. And, you know, okay, so be it.
But they can at least report on the fact that, look, the way the algorithm works today produces X amount of racism, and we want to reduce that from a factor of 25 down to a factor of 8 or 10. Exactly. And these proposals have been made, and we're gonna hear in two or three weeks from the algorithm team whether they accept those recommendations or not. If so, then we'll implement them within four weeks.
We'll see if it actually moved it down to 8 or 10, because our recommendations may or may not be right about how to move that number. And we'll let you know as it goes forward. So there would be this focus; people could be talking about it, reading about it. Yeah. And the company... so, like an inspector general, or the Government Accountability Office doing an investigation. The GAO, by the way, is the government equivalent of an ombudsman.
The GAO has access to the government employees. I think the GAO may even have subpoena power. They can see all the books. They come out with a report, and that report may have some redactions. And then the agency or the entity is required to respond within 60 days. And theoretically this person could be nimble. Do you think there could be a committee of people that do that? You know, an engineer and a writer?
There's no reason why the ombudsman can't have a staff. I think the problem with a committee is that you end up with these kind of, you know, blanched rulings. You end up looking for consensus, or even a majority, and it just dulls things. So-and-so has conversations here and there, and you end up with dissenting reports and things like that. It gets complicated. So I like this idea.
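The reporting cycle described a moment ago, publish a finding with a measurable target, give the company a fixed window to respond, then remeasure, can be sketched as a small record. All the names here are illustrative, and the 60-day window simply echoes the GAO-style deadline mentioned in the conversation.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class OversightReport:
    """One cycle of the ombudsman process: finding, target, response, remeasure."""
    finding: str                 # e.g. "ranking amplifies extremist content"
    metric_baseline: float       # level measured today
    metric_target: float         # level the recommendations aim for
    published: date
    response_window: timedelta = timedelta(days=60)  # GAO-style deadline
    redacted_sections: list[str] = field(default_factory=list)  # proprietary detail withheld

    def response_due(self) -> date:
        """Date by which the company must accept or reject the recommendations."""
        return self.published + self.response_window

    def remeasure(self, new_level: float) -> str:
        """Public follow-up once the changes have shipped."""
        if new_level <= self.metric_target:
            return f"target met: {self.metric_baseline} -> {new_level}"
        return f"target missed: {self.metric_baseline} -> {new_level} (goal {self.metric_target})"
```

The point of the record is that everything except `redacted_sections` is public, so outsiders can track whether the recommendations actually moved the number.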
It just requires a very special person to be this ombudsman. Except we've done it successfully for, you know, 80 years within newspapers. The New York Times had an ombudsman, the public editor, until a few years ago. But I guess my point is that it's much more complicated now, because you have to understand product, you have to understand algorithms. There are a lot of details to understand. But I do think it reduces to a modest proposal.
And the modest proposal is: when companies pass a certain threshold, they have triggered a requirement to exercise their responsibility to society, through an ombudsman and their staff, under these particular rules. Cool. Alright. Awesome. You're the best. Thank you so much. Bye. You've been listening to the NFX podcast.
You can rate and review this show on Apple Podcasts, and you can subscribe to the NFX podcast on Apple Podcasts, Spotify, Google Podcasts, or wherever you get your favorite podcasts. For more information on building iconic technology companies, visit nfx.com.