
Things Only Look Crazy When You Stand Too Close

May 30, 2022 · 1 hr 7 min · Season 3, Ep. 6

Episode description

In this episode I look at the "delusions" that happen when a large number of people get together in groups. We'll explore a number of prominent economic bubbles and end-of-times cults to reveal the core properties of groups that lead to mania. I discuss the Terra Luna collapse as a recent example, explore human biases, and ultimately show that the so-called "delusions" that occur in groups are really just one of the costs of making complexity tractable.

Support the show

Become a Member
nontrivialpodcast.com

Check out the Video Version
https://www.youtube.com/@nontrivialpodcast

Transcript

Hey everybody. Welcome to Nontrivial. I'm your host, Sean McClure. In this episode, we're gonna take a look at a number of interesting concepts related to the madness of crowds, but also the intelligence of crowds. We're gonna take a look at some of the manias that go on in economic bubbles and in the world of cults, and we're gonna take a look at the biases that come into play, like cognitive dissonance and confirmation bias, and why crazy things seem to happen when a bunch of people get together.

Some of those biases play a role, and that relates to some of the craziness that we see in crowds. We'll take a look at what all that means in terms of complexity, and how there's an actual cost to complexity. In other words, I'll frame this in terms of how this so-called madness that occurs in aggregate, in large crowds, is actually something that needs to happen to make long-term stability possible.

We'll take a look at scale dependence and narratives, and how some of that turmoil and collapse is absolutely needed for long-term stability. So let's go ahead and get started. OK. Things only look crazy when you stand too close. So let's take a look at what I mean by that. If you would like to support Nontrivial, please head on over to patreon.com/nontrivial.

And for a few dollars a month, you can help support the show. You get access to the slide decks that I use, some of the visuals that I put together, a little bit of extra media, and sometimes a few little goodies in there, like some piano composed by yours truly, as we did on the last episode.

I try to keep things a little bit interesting, but basically there are some visuals to help anchor the narratives that I put together in these episodes. For those of you on Patreon already, you can see the overview slide for this episode. We're gonna take a look at a lot of interesting concepts, but let me just skip ahead to the book that I will use this week to anchor the conversation. The book is called The Delusions of Crowds:

Why People Go Mad in Groups, by William Bernstein. And so this is a common pattern that we see throughout all of nature: obviously, when a large number of organisms get together in aggregate, they produce behaviors that cannot be explained by the individuals themselves. Of course, that's a hallmark of complexity itself.

And one of those patterns that can emerge is what this book calls delusions, or what a lot of researchers and individuals would consider delusional or mad. The common example being things like a stock market bubble, right, where people start investing like crazy, despite evidence to the contrary in terms of the viability of the company, or set of companies, they're investing in.

You know, despite that contrary evidence, people just keep pumping the money in, and this bubble grows and grows and grows and eventually explodes. And then in hindsight you look back and say, OK, well, that was stupid, right? There were all these obvious red flags. Why do people do that?

Of course, hindsight is 20/20. But there is a certain kind of madness, almost delusion, that seems to occur when a large number of people get together. You can see this even in the stampedes that might happen: something little sets it off, and all of a sudden it propagates throughout the crowd and you've got this kind of madness. So we'll take a look at that pattern, this kind of madness, this craziness that happens in groups, and where it comes from.

What are some of the underlying mechanisms? We'll use that book, The Madness of Crowds, or sorry, The Delusions of Crowds, by William Bernstein, to see his take on it. I'll agree with some parts and disagree with some other parts. But we'll take a look at the properties of those bubbles, of that madness that happens. And anyways, for our Patreon subscribers, you can see the list of patterns and concepts that I'll be talking about this episode.

But yeah, things like emergence, right? The limits of logic, sustainable growth, biases, tractability, scale dependence, the narrative fallacy, and decisions with heuristics. So let's go ahead and get started. OK. I want to start with this System 1 versus System 2 thinking. Daniel Kahneman is a very famous psychologist, a Nobel Prize-winning psychologist, who wrote a book called Thinking, Fast and Slow.

And this is the way a lot of people will frame human thinking nowadays, in terms of System 1 and System 2, and it's not a bad way to think of the two ways that people typically think about things. One is what he calls System 1, which is the instinctive, unconscious, fast, associative, emotional way of thinking. And the other is System 2, which is more slow, reasoned, and analytical.

OK. So in everyday life, when we're faced with a situation, we tend to think in one of two ways, right? We either use our System 1 mind, which is the more kind of reptilian, been-around-forever, at least 50,000-year-old mind, where we are unconsciously acting very quickly, doing things based on instinct. It's very associative, it's very emotional, right? And this constitutes, you could argue, let's say up to 95% of our thinking, right?

We just use these kind of high-level heuristics, we do things very quickly, we use our gut feeling, and that's how we navigate the complexities of life. And then System 2 is where you slow it down, you try to be more reasoned, more logical, more analytical. It takes a lot more effort. And that's kind of the rational mind.

And you could say that's probably only about 5% of the time in most people. But that would be the breakdown: System 1 versus System 2, the kind of emotional, fast stuff versus the slower, rational stuff. And the usual narrative around this that you always hear is that System 2 is kind of the more advanced, the more enlightened way of thinking, more suitable to modern life, right?

You always hear people talking about how people aren't actually that good at thinking statistically or probabilistically. Most people won't break a problem down into its components and try to be very logical about how they add up to some good result. They'll just kind of be emotional, or they'll base it off gut instinct.

And so the usual way this System 1 versus System 2 thinking is framed, and I'm not saying Daniel Kahneman is necessarily framing it this way, but most people will talk about these two ways of thinking in terms of System 1 being kind of archaic and not really suited to modern life, whereas System 2 is something that we could all use more of. That's the usual narrative. And I'm going to argue against that a little bit.

Not that that idea is completely wrong, that there's this mismatch between System 1 thinking and modern life. I mean, there is definitely a mismatch there, but that is not to say we should be walking away from System 1. And I have this kind of talk that I give a lot about coming full circle, right? How a lot of technologies today are actually getting us closer to what I think makes System 1 more suitable to modern life.

But I'll talk about that a little bit later. In the book The Delusions of Crowds, Bernstein talks about System 1 essentially leading to mania, right? That's kind of how he frames this up. He sees group decisions as illogical and dangerous, right? So the decision making that happens in groups is not something that's relegated to just the individuals, right? The groups can, quote unquote, decide as well.

And we think of that as an emergent phenomenon. I've used the ant colony and the termite colony as examples a lot, right? Nature solves problems by taking a lot of little pieces and running them in aggregate, and the aggregate, the group of all those organisms working together, is actually how the decision making happens, quote unquote decision making, how the problem solving happens, right? That's how complex problems become tractable.

So these decisions can be considered mania, though, as though there's something wrong with them, if they lead to poor outcomes, right? So if you have a bunch of organisms, or individuals in the case of humans, coming together, and they operate in aggregate and it leads to something quote unquote bad, whether that was people getting stampeded, right?

Trampled in a stampede, or whether that's people investing in something that they shouldn't have, this kind of propagation of bad narratives that ends up happening at large, we can think of that as a mass delusion. So anything that leads to a bad outcome is framed by Bernstein as essentially System 1 leading to mania. That's the way I interpreted it from his book. So that makes us think: well, what is modernity, right?

What is modern life's definition of delusion, right? This idea that System 1 is leading to deception? Well, to believe that, you have to think that anything outside of a strictly logical framework is deemed wrong by today's standards, right? So if you're looking at a situation, and again, kind of in hindsight usually, you pick it apart and you say, OK, all these people, let's say, joined a cult or threw their money at something.

And when you look at all the wrong things that occurred, and not just on the output side of things where obviously bad things happened, but you look at the decisions that seem to have been made by the individuals that went into it, or what the group decided to do at large, it seems kind of ridiculous. You know, why would they do this? Why would they do that? It doesn't make sense.

So from a strictly logical framework, it's deemed wrong, it's deemed delusional, it doesn't make sense. But recall my episode on the limits of logic. I had a two-part episode on this back in the first season, where I looked at the limits of logic. And I argued that while logic is definitely powerful in a lot of contexts, it also leaves a lot out, right?

It, by definition, strips out much of the context that makes a situation what it is. You know, it frames situations in an overly simplistic fashion, where all the different pieces come together and sum to produce the final result. And if you can't see how those different pieces sum to produce the result, then society tells us that's almost like a failing, right?

It gets fuzzy, it gets messy, and therefore it's not in the realm of logic, and therefore it's hard to make sense of, and therefore there's something wrong with it. In other words, if you can't understand how the pieces came together to produce the result, we tend to think there's something wrong with that. But of course, that's ridiculous, right? Because the overwhelming number of phenomena in real-world life are synergistic, right?

Where the whole is not equal to the sum of the parts, right? We cannot take that reductionist approach and peel back the layers and find how those components act together. We can't understand how the sum happens. In fact, it's not really a sum at all, right? There's this complex interaction of all those components. So the reason why I'm saying that is: modernity's definition of delusion is not necessarily a delusion.

There might be a higher purpose to it, if that's the way you want to say it. In other words, if you step back and look at the more holistic, larger picture, you can begin to understand that maybe that wasn't so delusional. Maybe there's a reason that things work like this, and these things that are not that great are just a cost of how complex problems get solved in the long run. So I'll talk a little bit more about that later.

But I just wanna point out that when we say something like "the delusions of crowds," that is not a factual statement: that when something quote unquote went wrong, it was definitely wrong, or definitely delusional.

In other words, if people invest in something really, really stupid, yeah, in hindsight you can say that shouldn't have happened. But that's in hindsight, and if you step back and see the more holistic picture of the entire market, you might realize, well, that actually needs to happen.

In other words, to give a more concrete example: if you wanted a stable economy, if you wanted a stable market, one of the costs of doing that is to have bubbles that burst. You have to have things go wrong, and I'll give examples from nature of this a little bit later on.

OK. But just keep in mind that when we say things are delusional, when we say things like "the madness of crowds," the quote unquote madness term in there is subjective. It's only mad or crazy under a very narrow definition of what correct is. And that's the strict, extremely naive logical framework that we typically use to define things as correct.

Remember in my episode where I was talking about science saying more about explaining less? I talk about the great disconnect. It's this idea that only under very contrived laboratory settings do you tend to say, well, this is true and this is right, and everything else kind of isn't.

But if you actually let the complexity go, so to speak, let it run its course, all that context that you stripped away in order to define something as quote unquote true was actually absolutely critical. You know, I talk about those hidden dependencies that make something what it actually is. So anyway, go listen to that episode if you want; it's related to that pattern. But let's jump into some specific examples.

Now, there have been a lot of these madness-of-crowds or mania examples throughout history. I'm gonna take the economic side of things for this episode, for the most part, although I'll talk a little bit about cults later on. When we think about economic bubbles, right, those are always interesting, right?

You've got this kind of speculating frenzy that happens in society. For whatever reason, or a number of reasons, a particular company or domain becomes very interesting to many people, and the economics of the day, the banks and the governments, make it possible for the everyday citizen to essentially put some of their money into that.

And there's all this narrative baked in, and there's this feedback loop that happens, and people just keep speculating and getting almost overly optimistic about situations, so much so that many people will actually pump all their life savings into something. And then of course it grows and grows and grows without valid reason, and then collapses. And what do I mean by "without valid reason"? Well, if you remember my episode on the WeWork example, right?

You had this kind of relentless growth, but there was no way to backfill that growth with something real. And it's kind of like the Malthusians, right? This idea that you're growing, growing, growing, but you don't have the resources in place to actually backfill that growth with something. In other words, the growth of something can outpace the reality, right?

You can grow something from an informational standpoint, whether that's via digital currency or money or just something that's more informational, but the physicality of the thing has to catch up to that informational side of things, right? And I talked about that in the WeWork example, where you've got the separation between the informational side of things and the physical side of things, and eventually they have to catch up to each other.

So anyway, let's take a look at some of these economic bubbles. I think they're pretty interesting. The first one, what history would normally call the first major economic bubble, was the South Sea bubble. OK. So there was this South Sea Company, which was a public-private partnership, and this was done to essentially consolidate and reduce the cost of the national debt. This is in Britain.

You know, after two wars, there was this massive amount of debt, and they needed to try to solve that. And so what ended up happening is they created this South Sea Company, and the government granted it a monopoly to supply African slaves to the islands in South America. OK. So at the time, we're talking the early 1700s here, you've got the slave trade, and you've got slaves being essentially sold to South America.

And Britain is obviously playing a large role in that. And so by creating this public-private company, the South Sea Company, and then the government granting a monopoly to essentially capitalize off the slave trade, that caused a speculation frenzy. Because if you're looking at this as kind of an everyday citizen, you say, OK, well, there's the slave trade going on, and that's obviously, for lack of a better term, a hot market at the time.

And you've got this company that has a monopoly on that, and that monopoly was granted by the government, and they're doing this to get rid of their debt. It looks like a situation that could be quite profitable. And so the company's stock rose greatly in value, largely via propaganda, right?

And the reason why it was largely propaganda is because there was this war going on with Spain at the time; there were essentially all these reasons to believe this was not going to work out particularly well, if you kind of picked it apart, did the fundamental analysis, if you will, of the company and the market and the culture and what was going on at the time.

You know, again, hindsight's 20/20, but you could see that there were obviously a lot of red flags there, that this might not work out too well. And that was the case with the South Sea Company: the company's stock rose greatly in value via a lot of this kind of propaganda, because they were just trying to raise money and get rid of the debt from the two wars. It peaked in 1720, then collapsed, and it ruined thousands and thousands of investors.

So it's kind of one of the first examples of: government runs into a problem, they form this company, and they put all this propaganda behind it. I mean, in this case the government's involved, and the company, just through that massive speculation, grows and grows and grows, at least from an informational side of things, and then it collapses. And people who were just pumping money in, of course, don't reclaim that investment.

There's no ROI on it; they lost a lot, and so it crashed. And so that was kind of the first well-documented mania-of-crowds example, where people went crazy, they invested, and we had the market crash, or specifically the company crash, from it. And then another example would be the British railway bubble. So this is going back to the Bubble Act. In 1825, the government had repealed the Bubble Act, for whatever reason, right?

So time goes by, people's memories are short, and there's this potential new opportunity to raise a lot of money through investment and grow a very big company. Railways were obviously being laid down at a very rapid pace, and if you look at it, it makes a lot of sense. You know, think about the internet today; we'll use the dot-com bubble as an example in a bit.

But you know, you're alive at a time when a new technology comes around and it just makes so much sense, as rail obviously did at the time, and you've got this major player in that domain.

You get into this overoptimistic speculation again, so much so that in 1825 the government actually repealed the Bubble Act, meaning that whatever regulations had been put in place to prevent another South Sea bubble situation were kind of pared back a bit. And people started to invest their entire savings into prospective railway companies.

So a number of railway companies were involved, and they raised a lot of money. That mania reached its zenith in 1846, and dozens of rail companies formed, but their unviability eventually became clear, and again, things crashed. So you've got this overinvestment, this overoptimistic speculation; you grow the bubble, you grow the bubble, and then it falls down and collapses again.

What's interesting here, I think, is that you had this Bubble Act in place from the South Sea bubble, right? The South Sea Company disaster in 1720. A hundred years goes by, and people start loosening things up: well, maybe this is OK, this is different this time, right? It's always different this time. And so the same thing kind of happens. Obviously the most famous example, I suppose, throughout all history would be the 1920s stock market crash, OK.

It's an example of essentially borrowing going crazy. In the autumn of 1929, share prices on the New York Stock Exchange collapsed, as we all know, and of course that signaled the beginning of the Great Depression. Now, you can go read all kinds of reasons behind this and the mechanisms at play, but one of the main core mechanisms was that brokers were routinely lending small investors more than two thirds of the face value of stocks.

And so basically what that means is, think of the housing crisis of 2007-08, where a lot of people were basically able to buy things that they were not supposed to be able to buy. I mean, this is a common pattern that happens again and again, right?

And so the banks, or the brokers, are able to lend to investors who really shouldn't be able to get into this, because they don't have the money to do so. They start lending the money so that they can do that. Obviously there's a reason the banks would do that: they can make a big return. People want to do that because they can start buying things that they wouldn't otherwise be able to.

I mean, it's like, if you're not making a lot of money, you probably shouldn't go buy a $60,000 vehicle, and yet people do that all the time because banks are able to lend that money out. But this was $8.5 billion out on loan in the 1920s, and that was more than the entire amount of currency circulating in the United States at the time. More than the entire amount of currency circulating in the United States was lent out on borrowing.

So, kind of stepping back, you've got the situation where you've got the Roaring Twenties, right after that first World War. You've got a lot of optimism, new technologies are coming out, businesses are starting up, you've got all this "hey, everything's great." Banks are feeling the health of that economy, so they're lending, they're getting more aggressive.

Investors are trying to capitalize off that, so they go on margin. OK, so this is where the broker essentially lends an investor an amount of money; you borrow that, then you use that borrowed money to essentially amplify the potential returns you can make on the investment. Of course, it can also amplify the losses. So it's very risky to borrow on margin. But you saw that in the mid-to-late twenties.

And basically, eventually, if the value of the assets that you have drops below a certain level, then you have this margin call, where the broker calls you back and says, OK, you either gotta pay me back to get the amount of assets in that account up to a certain level, or you gotta move stocks from another account over to this one.

Basically, what you own cannot drop too low, or the person who lent you the money to make that happen, to make the return possible, is gonna want you to essentially pay back a lot of that money or get stuff into that account. So anyways, things started to go down with all the investments being made in the 1920s.
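The margin mechanics described here can be sketched with some simple arithmetic. All the dollar amounts and the 25% maintenance threshold below are illustrative assumptions, not figures from the episode:

```python
# Hypothetical numbers: an investor puts up $1,000 of their own money and
# borrows $2,000 from a broker to buy 30 shares at $100 each.
OWN_CASH, LOAN, SHARES = 1_000.0, 2_000.0, 30
MAINTENANCE = 0.25  # assumed maintenance-margin fraction (illustrative)

def equity(price):
    """Investor's equity: market value of the shares minus the loan."""
    return price * SHARES - LOAN

def margin_call(price):
    """The broker calls in the margin when equity falls below the
    maintenance fraction of the position's market value."""
    return equity(price) < MAINTENANCE * price * SHARES

# At the purchase price, equity equals the cash the investor put up.
assert equity(100.0) == OWN_CASH

# A 20% price rise turns $1,000 of equity into $1,600, a 60% gain,
# because the borrowed money amplifies the move.
assert equity(120.0) == 1_600.0

# A 20% price drop leaves only $400 of equity (a 60% loss), and
# 400 / 2400 is below the 25% maintenance level: margin call.
assert equity(80.0) == 400.0
assert margin_call(80.0)
```

The asymmetry is the point: the same 20% price move triples in magnitude on the investor's own cash, up or down, which is why widespread margin lending made the 1929 collapse so abrupt.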

So the brokers started calling in the margin, and investors couldn't make those margin calls, because they had obviously borrowed too much and didn't have enough money or other assets to fill the gap. And this is when the collapse happens. OK, so it's the same pattern. It doesn't matter if you're looking at the 1920s stock market, the British railway bubble, or the South Sea bubble.

You've got this informational aspect of things, this ephemeral kind of money that's out there, and it's not always tied to physical things, right? It's supposed to ultimately be tied to physical things, but it's easy to make it not that tied, right?

You think about a lot of the financial engineering that goes on, where people start to mathematically layer on different financial instruments, and then it gets further and further away from the underlying physical asset that it's supposed to represent. And this allows things to grow at rates that are far beyond what the underlying physicality would normally allow. Now, a little bit of that can be good, but it can easily get out of hand. And this is just another example.

So you can just lend money and lend money, and there's this kind of unreality that happens in the investment world. And if it gets too bad, and you've got this massive leverage, this massive amount of margin, or borrowed money, whatever you wanna call it, you have this underlying gap that needs to be filled.

And if it gets past the breaking point, then that's the collapse; there's nothing to fill in the big gap that's been created. Again, I like to think about that as the informational aspect of things being separated from the underlying physicality of things, OK? That's really the fundamental pattern happening here. OK. You know, and I laugh because in hindsight we can always look back and say, well, yeah, I mean, this was obvious, right?

But again, in hindsight things are always obvious. That's not to say that there aren't definitely red flags. Before bubbles, or before collapses rather, there are always individuals who do point out: look, this might not work out too well. But it doesn't seem to matter how many times this happens, right?

And when I laugh, I'm not disregarding the lives that get ruined by these things, obviously. But look, there are fundamental patterns here that do have a predictive aspect to them. A more recent example would be the dot-com bubble. OK, so, mid-to-late nineties, the internet is starting to take off and really become popular.

You get a lot of these internet companies, thousands and thousands of internet companies start popping up, and, just like with the railway example, you've got this new technology, the internet, which is bringing together unrelated buyers and sellers in these seamless and low-cost ways, and the potential is just massive, right? So it led to excessive speculation in internet-related companies.

In the late nineties, the NASDAQ rose about 400%, and it then fell 78% from its peak when it collapsed; that's what we call the dot-com bubble, right? So the internet companies are all forming, they're getting tons of investment, people are borrowing money and investing in companies, and then again, there's nothing to backfill it. And this is particularly strong in internet companies, right?

When we were looking at WeWork, we had this real estate company essentially pretending to be an internet company. Internet companies, because they're based on software, are basically fully informational, so they can really rapidly grow; they can scale like crazy. But they still have to be tied to something real. There has to be a viable business there, and if there's not, there's a problem.

So the NASDAQ, during the dot-com bubble in the late nineties, rose 400% and then fell 78% from its peak; it gave up all its gains during the bubble. Whatever gains it made during that rapid investment were given up when it collapsed. And this is another common pattern that you see in all complex systems, right?
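The arithmetic behind "gave up all its gains" is worth making explicit, since percentage gains and losses compound multiplicatively rather than cancelling additively. The starting index level of 100 here is just a convenient illustration, not an actual NASDAQ value:

```python
# Percentage moves compound multiplicatively, which is why a 78% fall
# undoes a 400% rise. Start from an arbitrary index level of 100.
start = 100.0
peak = start * (1 + 4.00)   # up 400%: 100 -> 500
trough = peak * (1 - 0.78)  # down 78% from the peak: 500 -> ~110

assert peak == 500.0
assert round(trough) == 110  # nearly all of the 400% gain is gone

# In general, recovering from a fractional drop d requires a subsequent
# gain of d / (1 - d): after a 78% drop, you need roughly a 355% rise
# just to get back to the peak.
required_gain = 0.78 / (1 - 0.78)
assert round(required_gain, 2) == 3.55
```

That asymmetry between drawdowns and recoveries is exactly why the one bad day can dominate years of gains.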

You know, the fat tails of things. If you look at the distributions, it's those rare events that end up dominating, over the long run, the properties of things, the reasons behind things, the collapse, right? So in other words, you can make money, make money, make money, and then you could lose everything that you made over, let's say, 10 or 15 years; you could lose it in one day.

So the one day is an extremely rare event, and yet it had the overwhelming, domineering effect on the entire system, right? Everything was basically dictated by what happened in one day, not the 15 years. And so that's why you get into trouble when you think history alone is going to tell you what's going to happen next.

You know, people come up with these predictive models and they try to forecast based on patterns of what came before. But sometimes what came before won't matter, because it will just take one catastrophe, one day of something happening, to wipe out everything that happened up to that point. So this happens again and again and again, right? This is just a fundamental property of complex systems.

So in this case, the majority of technology startups that raised money and went public folded, right? Like, almost all the internet companies just folded and are no longer with us. Um, and so you had these telecommunications equipment companies that invested more than $500 billion at the time, and most of that was financed with debt, right? And a lot of this went into the laying of fiber optic cables. Um, and it's not all bad. There are some good things about that, OK?

I'll talk about this a little bit later, like the laying of fiber optic cables. Even though most of those companies collapsed in the process, there was good that came out of it. Um, in the British railway bubble example, even though most of those rail companies were unviable and got destroyed in the collapse of the British railway bubble, they still laid, uh, the overwhelming majority of rail that gets used today, right?

So there's still, um, some benefit there, but we'll talk a little bit about, you know, the benefit of turmoil later, which I've also brought up in the library episode. OK. So my examples here were, um, the South Sea bubble, the British railway bubble, the 1920s stock market, the dot-com bubble. Um, getting out of the financial side of things, we also have another mania example in end-of-world cults.

OK. So this is not an investment; this is now the cults that have formed throughout history, um, where people buy into kind of these really, really crazy narratives.

Again, there's always that hindsight bias there, but some really crazy things. And we can distinguish, you know, it's always a bit of, um, a touchy subject when you try to make a distinction between cults and religions, and people sometimes get upset about that because, well, you know, it's easy to call what somebody else is doing a cult and not what you're doing a cult, right? If it's not your own religion and all this kind of stuff.

But generally speaking, if you take a look at what society defines as cults, you've got some, you know, what you might call really, really crazy stories that people are buying into, like really edge-of-the-distribution kind of things that most people just would not buy into, right?

You've got the 16th-century Anabaptists, the Adventists, or, you know, the Millerites that came about. A lot of these have end-of-times narratives that are very, very extreme, and a lot of them, the ones that are based around the Bible, are usually, you know, transmuting the Bible's ambiguity.

OK. So the issue is that whenever you have something like the Bible, and of course it's not just the Bible, something which has a lot of flexible interpretation around it, you're always going to have a small group of people who take that to the extreme. In other words, they bring just absolutely, what you might call, crazy interpretations of what they read. So an example would be, and you can do this with anything.

But again, because the Bible plays such a large role in society, it tends to be the anchor or the target for a lot of this stuff. Um, there's this kind of numerology that people will do, where they'll start finding patterns in anything, right?

So they'll say, well, you know, this chapter of the Bible mentioned this, and it happened this many times, and that same number appears in this book over here, and maybe there's a connection, and they start building and building and building, and pretty soon you create this whole narrative around the numbers in the Bible and the meanings behind the words. And again, it doesn't have to be the Bible; you can do this with anything, right?

In fact, as I'll talk about in a bit, you can even do this with science. It's not, uh, immune to this. But you particularly see this in areas where there's a lot of flexible interpretation possible. And if you do enough of this kind of numerology and these connections and this pattern finding, you can eventually put together a narrative that sounds fairly realistic, to at least a number of people, right?

It's kind of like the flat-earth society: you can convince people, in fact you can do it using pure logic if you wanted, that the earth is flat, right? And some people will buy into that. Um, you can come up with all kinds of examples of just flexible interpretation allowing people to put narratives together. And in fact, in one of my earlier episodes, I think I was talking about, you know, the 9/11 towers, right?

Um, or the twin towers, sorry, during 9/11, where you've got these conspiracy theories where maybe the government brought down the towers, right? And you can use pure logic; you can just take a look at, well, here's the plumes of smoke coming out, here's the rate at which the building fell. You can use nothing but facts, but you can use them to piece together a narrative that most people would agree is fairly nonsensical, you know, who knows?

I'm not here to say whether conspiracy theories are correct or not. But the point is, you can make a lot of these really crazy narratives, and you can get enough people together to buy into them. And the book The Delusions of Crowds by Bernstein uses cults as another example, in addition to economic collapses, of this kind of delusion that ends up happening.

This mass delusion: people get together, they feed off each other's interpretation of a book like the Bible or something else, and you get into this feedback loop and you get these really crazy narratives. You know, and when I say crazy, I mean things that are well beyond anything the Bible would talk about, as an example, right?

It could be like, you know, the aliens are gonna come, and the UFOs, and they're gonna take us up to heaven, just really, really bizarre kind of stuff, right? Again, to each their own, right? You're allowed to believe whatever you want. Um, but most people would agree that this is taking the interpretation of something to the extreme end. And so there's a kind of mania to it, there's a kind of delusion; most people would agree to things like that.

And again, as I say, we can define these things as the bad outcome. So a lot of these cults that have these end-of-times narratives to them, um, end up in mass suicides, right? Or at least, I shouldn't say a lot, but there are a number of, um, definite ones over the last few decades that have ended in, you know, up to 70 people committing suicide, basically in one shot. Um, and so you can go look at those cults.

So that's another example of many people getting together and getting into this kind of mania, right? Of something that, in hindsight, because of the bad outcome, whether it's an economic collapse or a bunch of people in a cult committing suicide, appears obviously quite delusional. Um, OK, so I'm gonna go back to, uh, another economic example, just because it's been in the news recently. So many of you are familiar with Bitcoin, or cryptocurrencies in general, right?

This digital version of currency that tries to get away from the downfalls that you have with government-issued, kind of centralized currencies that, uh, you know, we are most used to. So I'm not gonna go in and explain all about cryptocurrency and Bitcoin; I think most of you are fairly familiar with at least what they are. They are a hugely, um, attractive, basically speculative asset at this point.

As opposed to a currency, you could say; it's still a currency, but it's used mostly as a speculative asset that people are investing in. So if the value goes up, you can cash in on that, and if the value goes down, you can lose. So it's got, um, you know, a lot of the markers of what you would see, um, with stocks, right, stocks and bonds, something that you invest in that can go up and down.

So it is prone to, uh, you know, potentially being a speculative bubble; cryptocurrencies could be a speculative bubble. And they're actually characterized as such by a number of, uh, notable winners of the Nobel Memorial Prize in Economic Sciences, uh, by certain central bank officials, and by notable investors.

So there are investors out there who see, uh, cryptocurrencies, and particularly Bitcoin, as being so volatile that, as something to invest in, you could characterize it as a speculative bubble. And you've seen a lot of ups and downs, and things that you could, um, define as crashes, over these last few years. And this month, just recently, there was the collapse of Terra/Luna, which is a cryptocurrency, or, uh, kind of, well, I'll explain the mechanism in a bit.

It's actually two, um, currencies that are brought together in this kind of balancing act that's supposed to stabilize the coin. Terra/Luna is built around a cryptocurrency called a stablecoin, and it's stabilized by assets that fluctuate outside of the cryptocurrency space. And so what that means is, just as you can try to stabilize a regular currency by, let's say, backing it with gold, um, stablecoins try to tether themselves to something like the US dollar.

And so, whereas cryptocurrencies like Bitcoin are normally quite volatile, stablecoins try to stabilize that volatility by tethering themselves to something like the US dollar. Well, this did not work too well for Terra/Luna. Uh, and just a few weeks ago, it wiped out almost $45 billion of market capitalization over the course of a week. So let's just explain the mechanism a little bit.

I'm not gonna get super deep into it, but I just want to show that, you know, again, people will try to do these kinds of financial engineering tricks, and they'll do it in the cryptocurrency space, and people will once again get really excited and invest a lot of their life savings into these things, only to have it collapse. Um, and so these bubble mechanisms are just baked in; you can't get away from them, right? They are eventually going to happen.

So stablecoins, though, are just yet another thing that was invented, uh, so that crypto investors could, you know, lower the volatility in their portfolios. And so stablecoins claim to be backed by the dollar, right? Just as you would hold dollars in reserve, kind of like a gold-standard currency. Algorithmic stablecoins, though, essentially trade direct ties to the dollar for decentralization.

So if you tie, if you peg or tether your currency to something like the US dollar, that's almost a form of centralization, because you kind of have a single point of failure. If something happens to the US dollar, something's gonna happen to your currency, right? So they try to decentralize it, uh, in the true kind of dream of cryptocurrency, by adding these algorithms to it. And we're not gonna get into all of how that happens.

But think of it as layers of financial engineering that are sitting on top of a currency, trying to kind of play whack-a-mole, really, right? Every time you do something, you lose something else; there are always these tradeoffs. So, you know, we're gonna tie ourselves to the US dollar, but that's kind of a, you know, point of failure, so now we have to go decentralize it more. And so they add and add all these layers, and you see this in the investment world all the time, right?

The 2007-08 crash, um, tied to mortgages, had all these crazy, um, you know, investment vehicles that were essentially created, uh, with algorithms and some fancy mathematics. So you add layers and layers, and it hides a lot of the underlying danger of what's happening. And you could say that that's what happened with Terra/Luna. So Terra/Luna, it's kind of like two coins that play off each other. OK?

But if you've got the demand for one going up, the price is gonna go up, because of supply and demand, the usual relationship, right? And so you will create more of that one coin, which will increase the supply, which will make the price go down, and then the peg will be restored. And then the inverse is also true. So if you are on Patreon, you can see the image I'm using; it might be a little hard to follow this.

If you're not used to some of this financial jargon, or if you're just following along with my words, hopefully you can visualize it in your mind. But essentially, you've got this usual mechanism where you try to maintain the peg of your currency to something like the US dollar, right? So if the demand for something is really, really high, then the price is going to be higher, let's say, than the dollar, right, relative to the dollar.

And if the demand for that coin goes down and it drops below a certain level, then, uh, the price is gonna go down relative to that dollar. And so if you have very, um, high demand, you are printing, or increasing, let's say, the supply of a certain coin, and eventually that supply gets so high that you've got this drop in price, because that's how it works with supply and demand, usually. And then you've got the inverse, right?

If the demand goes too low and the price drops, you'll start to destroy, um, the cryptocurrency, and so the supply will start to decrease, and then the price goes back up relative to the US dollar. So basically what I'm saying here is, you've got this kind of built-in equilibrium, or mechanism, that tries to maintain the price of the cryptocurrency relative to the dollar. And that's how you control the volatility. OK. So that's the basic idea behind stablecoins.
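That mint-and-burn peg idea can be sketched in a few lines of code. This is a deliberately simplified model, not any real protocol's logic: the price function (demand divided by supply) and the 1% mint/burn steps are made-up assumptions, just to show the balancing act:

```python
# Simplified model of an algorithmic peg: price rises with demand and falls
# with supply, and the protocol mints or burns supply to steer the price
# back toward $1. All functions and numbers here are illustrative assumptions.

def price(demand: float, supply: float) -> float:
    return demand / supply

demand = 1_050_000.0   # demand jumped, so the coin trades above $1
supply = 1_000_000.0

p = price(demand, supply)
while abs(p - 1.0) > 0.005:          # until we're back within half a cent of the peg
    if p > 1.0:
        supply *= 1.01               # mint coins: more supply pushes the price down
    else:
        supply *= 0.99               # burn coins: less supply pushes the price up
    p = price(demand, supply)

print(round(p, 3))                   # 0.999 -- back near the $1 peg
```

The point of the sketch is just the feedback loop: whichever direction the price drifts, adjusting supply nudges it back toward the peg, as long as the demand side cooperates.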

Well, in the case of Terra/Luna, they've got this kind of fancy financially engineered, um, situation where you have TerraUSD as one coin and Luna as another coin. So you've kind of got these two coins that are playing off each other. The idea is that you can always exchange one TerraUSD for $1 worth of Luna. OK? It doesn't matter what's happening to the prices of the coins themselves; you can always make that trade, one TerraUSD for $1 worth of Luna.

OK. So you burn a TerraUSD and you mint Luna in order to earn a profit. So there's this arbitrage possibility that can happen. In other words, say the TerraUSD coin drops to 99 cents relative to $1. OK? So just like before, we've got this kind of supply and demand, the usual market fluctuation happening, and the price of that coin can change relative to the dollar. So let's say it drops relative to the dollar.

Well, now my TerraUSD is worth 99 cents, but I can always exchange it for one US dollar's worth of Luna. So I'm gonna burn my TerraUSD and get $1 worth of Luna. And that's called arbitrage: any time you try to take advantage of discrepancies between the prices of, you know, things like currencies or other things. If you act upon that discrepancy fast enough, you can gain a little bit of a profit. And if you do that writ large, then you can potentially make a lot of profit.

And so that's called arbitrage. So as more and more people holding TerraUSD try to earn, let's say in this example, just one cent of profit by burning it for Luna, the supply of TerraUSD would shrink, and its price would rise until it hits its $1 peg. So again, it's just this way of trying to stabilize the volatility of the cryptocurrency. So again, you've got the TerraUSD cryptocurrency; its price starts to fall.

It's not worth as much anymore, so I'm going to burn it and trade it for this other cryptocurrency called Luna, because I can always do that for $1. That price discrepancy allows me to earn the one-cent profit, and I'm hopefully going to do this thousands and thousands of times to make a worthwhile profit.

But the way that arbitrage, the people acting on these arbitrage opportunities, changes the supply and demand of the coin, and therefore its price with respect to the US dollar, it has this built-in kind of balancing act. So anyway, hopefully that wasn't too confusing. But now imagine the situation where so many people are trying to take advantage of that arbitrage, right, that the price of TerraUSD actually rises. So now TerraUSD is worth a lot.

It's worth, well, one cent more relative to the US dollar. Now, that means that people who are holding the other coin, Luna, realize that if they burn $1 worth of Luna, they can get a TerraUSD and make an extra cent of profit. So it's the exact same thing, but just the inverse. So at the end of the day, it doesn't matter if you're not following all the jargon. The point is, you've got these kind of two coins playing off each other, and you can always trade them for that $1.

And that causes these little arbitrage opportunities where, if the price goes up, I can trade one for the other, and if the price goes down, I can trade the other way. And because you've got different investors doing this on different ends of the spectrum, you essentially stabilize the cryptocurrency overall.
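That two-sided burn-and-mint trade can be written down as a tiny function. A minimal sketch with made-up prices; the real Terra mechanism involved on-chain swaps, fees, and market depth that this ignores:

```python
# Minimal sketch of the TerraUSD/Luna arbitrage described above.
# The protocol's promise: 1 TerraUSD is always swappable for $1 worth of Luna,
# and $1 worth of Luna is always swappable for 1 TerraUSD.
# Prices are illustrative; fees and slippage are ignored.

def arbitrage_profit_per_coin(terra_market_price: float) -> float:
    """Dollar profit per coin from trading against a drifted peg."""
    if terra_market_price < 1.0:
        # Buy TerraUSD cheap on the market, burn it for $1 worth of Luna.
        return 1.0 - terra_market_price
    if terra_market_price > 1.0:
        # Burn $1 worth of Luna to mint one TerraUSD, sell it at market price.
        return terra_market_price - 1.0
    return 0.0  # peg is exact: no free money

print(round(arbitrage_profit_per_coin(0.99), 2))  # 0.01 -- the one-cent example
print(round(arbitrage_profit_per_coin(1.01), 2))  # 0.01 -- same profit, inverse trade
```

Either trade also pushes the price back toward $1: burning TerraUSD shrinks its supply so its price rises, and minting TerraUSD grows its supply so its price falls, which is the built-in balancing act.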

And so at a high level, stepping back from this whole thing, it becomes an attractive investment, uh, for investors in cryptocurrencies, because not only can they try to make a profit off this arbitrage opportunity, but the price should never fluctuate too drastically. So it's kind of a more stable thing. OK. Now, so what happened?

So again, we've got this kind of crazy financial engineering situation where you try to stabilize this coin, and you're doing it with some fancy math and algorithms, and, you know, it should work, right? But something can always go wrong.

Well, what happened was that a large number of people actually held their TerraUSD, one of those cryptocurrencies in that balancing act, in something called the Anchor protocol, which was basically this savings account for TerraUSD that had a 20% interest rate. Which is good, right? Any savings account with a 20% interest rate is probably something you're gonna want to hold.

So prior to this Terra/Luna crash, 75% of all TerraUSD in circulation was deposited in Anchor, this savings account with 20% interest. In March, Anchor replaced their 20% interest rate with a variable rate, and so large amounts of TerraUSD were withdrawn from Anchor. Many burned their TerraUSD in exchange for Luna, using that mechanism I just explained, and so the supply of Luna ballooned, which made its price tank, right, again supply and demand.

And both TerraUSD and Luna ended up crashing. And so the entire crypto market has been slashed by more than half, from $2.9 trillion to $1.2 trillion. So I shouldn't be laughing, because obviously people had a lot of their, you know, life savings invested in this. And, you know, apparently there were even suicide hotlines that were, you know, put up for this situation.

And so, you know, again, regardless of how much you want to stand behind the mechanisms, what we're talking about here is some fancy financial engineering, and, uh, jumping on kind of the hype train of cryptocurrency, which is going on right now, convincing people that they've created a more stable, a more lucrative version of this.

And inevitably, you know, you've got some kind of organization or some protocol that gets, uh, created somewhere within this environment.

And then, you know, some business decision was made, which made people all of a sudden act a certain way, and it basically pushed one of the coins off the balancing act, and then the whole thing crashed. And when you step back from these things, like, look, it doesn't even matter how it happened. There's always going to be this risk of a speculative bubble, which can happen any time a group of people gets really excited about something and just goes for it, right?

The madness of the crowd, right? So we have gone over, um, the South Sea bubble, right, the British railway bubble, the 1920s stock market, of course the dot-com bubble; we used cults as an example; and then one more economic example with, um, the Terra/Luna crash that literally just happened this month, and we took a look at that mechanism. When we step back from all of these, we can see the properties of bubbles in general: they have four common attributes of manias.

And this is what is talked about in the book that I'm referencing, The Delusions of Crowds. Number one is overoptimism and intense speculation, OK, which is obvious. We just saw that with the Terra/Luna example; we saw this with, you know, the railway example, um, all those other examples, the dot-com crash. It was this overoptimism and intense speculation. Two, leaving good situations in life, like family and jobs, to pursue the speculation or belief.

So this is when you really get into the delusion or the madness, where you get caught up in that overoptimism, where you start to sacrifice things like, you know, time with your family, or, you know, people will leave their jobs just to spend their time on it, because they think they've really figured something out in the market, and they start to dump a lot of their life savings into it. Three, becoming angry at naysayers regardless of how bad things look.

So, even when people are giving you, like, obvious red flags, look, if you just look at the basic mechanisms of, you know, finance or markets or whatever it is, you can see that this is maybe not a good thing to sink all your life savings into.

Uh, or in the case of cults, they'll do the same thing when they start to really believe crazy, um, interpretations of the Bible, or whatever it is, whatever the cult is, you know, aliens are gonna come down and they're gonna take us to heaven. And people are just saying, look, maybe just slow down and think about what you're saying; I don't think you realize just how crazy it sounds. It's almost like they don't hear it. They're getting caught up in this frenzy, and they will become angry at naysayers.

They will justify the decisions they're making, no matter how crazy it is, no matter how many red flags there are, because they're caught up and they can't stop. And then four is making extreme predictions.

This kind of prognostication where people will start, you know, like the end-of-the-world cults, saying, you know, I think two months from now the earth is absolutely going to end, because we've taken a look, and it sounds so crazy to anyone from the outside. But for the people who are inside and caught up in this frenzy, what they're doing is kind of like the numerology with the Bible or something: you start seeing patterns.

I mean, humans are really, really good at seeing patterns and that's, of course, a good and a bad thing. It can help us survive, it does help us survive in many ways.

But under these kinds of contrived situations, which I'll talk about in a bit, uh, you know, under modernity, right, where we do things like have stock markets and, you know, established institutions, and yeah, even religions and things, when you start to put a lot of those in place, that pattern recognition can actually come back and bite you, because you'll start to see patterns everywhere.

And so you'll start to make these kind of ridiculous predictions, but they will make sense to you because to you, they logically add up. So a bubble is anything where the asset prices are much higher than the underlying fundamentals can reasonably justify, right? And you get these overly optimistic projections about the future. And so that's what we've seen, whether it's economic, whether it's cults, whether it's anything, any kind of frenzy that happens in groups.

OK. Well, the book The Delusions of Crowds kind of, uh, talks about these mechanisms in terms of two biases, and that's cognitive dissonance and confirmation bias. And these are things that people do on a regular basis. So cognitive dissonance, which has been a pretty popular term these last five years; it pops up all the time.

I don't know why it's particularly popular now, but you hear about it a lot; I guess maybe it's because of the polarized, you know, political environment that we're in. Um, people talk about echo chambers and things like that. Well, cognitive dissonance is any time a narrative and data are in conflict: the perception of contradictory information, or beliefs clashing with new information.

So if you have a certain worldview, right, based on your experience or your echo chamber or whatever feeds into that, and then all of a sudden you are presented with contrary evidence, this causes, uh, you know, a problem for you, right? It's contradictory information to what you believe. You have a certain worldview, and you expect to see things go a certain way, and then all of a sudden something is presented to you that doesn't agree with that.

And that causes this kind of internal consternation where you really can't handle it. And so people will actively work toward reducing that cognitive dissonance. In other words, you can't just exist in a state where you have these two opposing thoughts in your mind, that ambivalence, right? You have to do something to resolve it. And people will handle or resolve that by usually investing even more in their narrative.

So in other words, I'm going along in life and I have a certain worldview, and I expect everything I see and hear to conform to that, because that's how I think the world works. And then all of a sudden I'm presented with something contrary. Well, instead of kind of reasoning about it and saying, OK, well, maybe I should change my view, we tend to just invest even more into our own narrative.

And, you know, that's kind of going back to this becoming-angry-at-naysayers property of bubbles, right? It doesn't matter what I'm being presented with, or even how much; sometimes I'll just sink even more deeply into my narrative. And we as humans do this all the time. And so this is one of the reasons why people will get caught up in bubbles, no matter how ridiculous they are, no matter how damaging they are to their own lives and to their families' lives.

Confirmation bias is another one; this is another type of biased thinking. This is the tendency to favor information in a way that conforms to or supports prior beliefs, ignoring the contrary information. So, kind of a similar thing. I like to think of confirmation bias as: once you think something is a certain way, that's the only thing you can see. OK? So imagine that. It basically affects your interpretation of things, right?

Um, you know, if you think there is a systemic problem, um, that is baked into the system, then you're going to see that everywhere, even when that's not the case. That's confirmation bias, right? So you interpret ambiguous evidence as supporting your existing attitudes.

Um, this ends up biasing the things that you search for, the way that you interpret, even your memory, the way that you recall information. You will think that something happened that didn't, because you're taking what you remembered and you're conforming it to your built-in biases, as opposed to the more, you know, objective reality.

Um, now psychologists say that confirmation bias cannot be eliminated, but it can be managed, for example with critical thinking skills, right? So if you're aware of this confirmation bias when you're searching or interpreting or recalling information, and you know this is just something that humans do, you can try to keep it in check, and maybe try to be more reasoned and balanced in your understanding of the world and the decisions you make therefrom.

OK. So we talked about, you know, these bubbles; we talked about the properties; we talked about the built-in biases that people have. And what I want to talk about now is what I call the cost of complexity. And the stance I'm kind of taking on this is that a lot of this is viewing this kind of feeding frenzy, this delusions of the crowds, in a purely pejorative sense, right? Like, this is obviously a bad thing, and there are bad outcomes, right?

Because people will lose all their money, uh, people will commit mass suicide; I mean, there are obviously bad things that happen. But if you step back and see the bigger picture, you can also say that, you know, this is the cost of complexity. So let me give you an example of what I mean by that. For those of you on Patreon, you can see what is called an ant mill. I like to use ants a lot, right? We talk about complexity.

So what you're seeing here in this gif, this video on the right, is these ants swirling around in circles. And if you're not on Patreon and you don't see this, go ahead and YouTube this, go Google "ant mill." It's also called the death spiral.

Uh, and it's these ants who have essentially lost their way from the usual colony, and hundreds and hundreds of them are just going in circles and going in circles and going in circles, and they'll do this until they get exhausted and die. And I don't know if they always die, but that's something that can happen. And so the way this happens is, ants are usually following a pheromone trail, right?

These chemicals that get released, the other ants pick up on those pheromones, and that's how they know, you know, where to follow. If one ant or a few ants go off the usual trail, and other ants decide to follow their pheromones, and then those ants follow those pheromones, and on and on and on, you'll kind of have this diverging path where hundreds of ants go away from the rest of the colony and start getting looped into this death spiral.

OK. So they're not doing what they're supposed to do, right? They're not following the usual trail of ants. One ant went away, everybody started following that pheromone trail, and it got caught in this feedback loop. And so now they're all kind of releasing their pheromones and following each other while they're spiraling and spiraling. And so you've got this death spiral, which is called an ant mill. So again, go check that out if you don't see it.

Um, on Patreon, which I'm showing now. Well, this is a property of complex systems. And so the reason why I want to bring this example up is because I want to highlight the cost of complexity, the cost associated with things that actually do work when you step back and get the bigger picture. So if you zoom in and you look at these ants doing what they do, it looks ridiculous, right?

Why would nature design ants in such a way that they could get caught up in this ridiculous situation where they spin and spin and spin until they exhaust themselves and die? But the same things that make them die in the death spiral are the things that make them so successful as a species, right? They're able to solve complex problems, they're able to make extremely complex challenges tractable, because of this property, right?

So in other words, the same things that make you successful are also the things that can get you into trouble, right? And in aggregate specifically. This is definitely true in nature: many things have to die so that many things can live, right? You have powerful emergent mechanisms in nature that allow things to survive over the long run, but those same mechanisms are going to lead to things like this.

So when we look at the economy, right, you're not gonna stop bubbles from happening. They have to happen. The reason the market works, regardless of how well you think it works, the reason it works at all, is because of things like bubbles, right? The mechanisms that cause bubbles are the same mechanisms that allow the market to work successfully. OK?

Just the same way the ants have attributes that cause death spirals, those same attributes are what allow the ants, overall, to be so successful as a species. OK. So you take certain properties of complexity, like feedback loops, and those can work for or against you.
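As a toy illustration of that double edge, here's a minimal sketch of trail reinforcement. This is my own made-up model, not from the episode and not real ant data: two trails start equal, each ant disproportionately prefers the stronger-smelling trail, and following a trail strengthens it. The same positive feedback that would concentrate a colony on a good path locks it onto an arbitrary one.

```python
import random

def run(ants=1000, seed=0):
    """Toy two-trail pheromone model (illustrative assumption only).

    Each ant picks a trail with probability proportional to the square of
    its pheromone level (a nonlinear preference for the stronger trail),
    then reinforces the trail it picked.
    """
    random.seed(seed)
    pheromone = [1.0, 1.0]  # both trails start equally attractive
    for _ in range(ants):
        w0, w1 = pheromone[0] ** 2, pheromone[1] ** 2
        choice = 0 if random.random() < w0 / (w0 + w1) else 1
        pheromone[choice] += 1.0  # following a trail strengthens it: the feedback loop
    return pheromone

levels = run()
share = max(levels) / sum(levels)
print(share)  # dominant trail's share of pheromone -- typically close to 1
```

The same loop run on a trail that leads to food looks like collective intelligence; run on a trail that loops back into the crowd, it's an ant mill. The mechanism itself doesn't know the difference.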

And in line with that, the take-home message here is that if you stand too close to the situation, it's gonna look delusional. But if you step back and see the bigger picture, if you're able to, you'll be able to make sense of it. And a lot of times we can't do that. We can't get the bigger, holistic picture because the epistemic uncertainty is just too great.

We can't understand the big, ultimate reason for things, but that doesn't mean it's not there. And as a general property, we know this is the case: if you take too narrow a slice, too myopic a view of something, it's going to look pretty delusional. But that doesn't mean it actually is delusional or wrong. In fact, it's only wrong under the contrived situations of modernity, right? Modern life creates a lot of contrived situations.

Even participating in the market. I mean, there's something very natural about organisms competing with a currency in a market, but there's also something kind of contrived about investing and trying to get a return on something. Nature doesn't care if you invest money, right? So there's something contrived about that situation. And you see this in the sciences. I think of the French paradox, right?

Back in the day, I don't know if they still talk about this, but they used to ask: why are the French not dying earlier? They seem to eat a lot of butters and creams, they drink a lot, and their diet does not reflect what science at the time said was healthy. And so you had this French paradox: why do they have good life expectancy,

maybe even one of the best, if they're eating all this presumed crap? But of course, it was the contrived, bad science of the time that thought saturated fats were bad, and that alcohol was always bad, when maybe there was a dosage at which it wasn't so bad. And red wine was the answer people settled on for the French paradox: well, it must be all the wine they're drinking.

Look, what this really is, is science having an extremely limited view of what health is, right? They take that kind of 1980s, 1990s, now quite outdated, nutrition advice about what is healthy, look at a diet loaded with saturated fats or something, and say, well, this is a paradox. Well, it's not a paradox.

The way you resolve a paradox is you take away the thin slice and get a broader view of something. And that usually can't happen until decades and decades later, when enough time has passed. And now we know saturated fats, although this is still up for debate, are largely regarded as probably not that unhealthy at all. In fact, our ancestors were probably ingesting a lot of saturated fat.

And so it was that outdated, myopic, overly contrived view of reality that made something look like a paradox. But when you get more information and a bigger picture of how things actually work, and you look at human survival over time, it's not actually a paradox. So it's that idea that if you're taking too thin a slice of something, things are going to look wrong. But it's not because the situation actually is wrong.

Your view is just too myopic; you don't see the full thing. And again, a lot of times we don't get to see the full thing, and so something might look delusional. This is why, if something has survived over a long period of time, the safest assumption is that it is correct, even though there's a good chance you're not gonna be able to uncover the causality of why that is. OK. We also talked about system one versus system two.

System one is the rapid, emotional, intuitive way of thinking, which is like 95% of what we do, because that's how our 50,000-year-old brains primarily operate. And system two is much more slow, logical, and analytical. System two might be, obviously, what you would associate with science, right?

But there's far more narrative to science than laymen tend to realize, right? Data doesn't necessarily save you from narrative. In fact, data can supercharge narrative, because data, when you have some of it, can give you the illusion that you must really be doing something intelligent, that you must actually understand the situation.

But again, going back to the previous visual, for the Patreon viewers, you can see that thin slice. Data is always going to be a very thin slice through all of reality, and if you anchor too greatly on the data, you can wrap all kinds of BS narrative around it.

And the problem with that, and you get the same problem with mathematics, is that you put so much faith in what looks rigorous that you really invest in your narrative, right? Because it looks scientific, because it looks rigorous, because there's numbers, there's data, therefore it must be true. And people do this all the time. So data can actually make narrative worse.

So even though you have something like cognitive dissonance, which is when narrative and data are in conflict, that's typically how we think of it, data can actually supercharge narrative and make the problem worse. Data and evidence are a very thin slice used to anchor a narrative.

And if you think about a problem like the replication crisis, where 70% of studies in the soft sciences can't even be replicated, or there's very little evidence that they can be, you know we have a lot of problems in science. And physicists chasing too much elegance, and all the stuff I've talked about before.

A lot of that is because of this overinvestment in system two, this assumption that system two must be better, and not realizing just how much bias and narrative is carried by data. Data is not the opposite of narrative, right? Evidence does not necessarily get you away from narrative; it does not objectively cut through human bias. There are times where it can, but it is not guaranteed, and a lot of times it can do the exact opposite, is what I'm saying.

And then finally, I've talked about the benefit of turmoil before, I think it was in the library episode, The Immortal Library. Collapses bring benefits. Nature doesn't prefer good outcomes over bad ones, right?

Nature just is. And so again, if you're zooming in close to something like the railway mania that we talked about, people were giving up their life savings, and there was this overoptimism and speculation and investment in the railways. It looks like a bad thing: it's mania, and it collapsed; nothing good about that. But then over time you step back and realize, well, actually 6,220 miles of railway, which is just over 10,000 kilometers, was laid during that time.

So in the moment it was bad, but overall there was a reason for it, you could argue, right? Something good did come out of it. The dot-com mania led to large amounts of fiber optic cable being laid, maybe too much, but still, that laid a lot of infrastructure. So again, you're standing too close, you've got a really myopic view of something, and it looks bad.

You allow more time to pass, you get a more holistic, bigger picture of something, and you realize, you know what, maybe that was kind of meant to be, or regardless, it was the system doing what it does. OK. So there's always a cost to complexity. There's going to be bubbles, there's going to be manias, there's going to be bias, there's going to be a lot of this stuff that looks bad in the moment but, when you step back and get the bigger picture, is not necessarily so.

There is a benefit to turmoil. You need that variation; that is how things become tractable. And look, at the end of the day, we compared system one to system two. I think this is a really outdated way of looking at things. Or you could still say, OK, yeah, there's system one and system two, but this idea that system two is the type of thinking to be favored,

and that system one is this outdated, reptilian, old way of thinking that really makes living in modernity hard. In a way that is true. I think that a lot of times humans aren't naturally going to pick things apart logically and analytically. And if we think about the Enlightenment world and the Industrial Revolution, where a lot of things were very deterministic, cogs and pistons and slide rules,

And that type of engineering, that type of world being brought into being, logical, analytical thinking served you quite well. But I've talked about this before: the way technology is moving now, we're coming full circle, where a lot of the technologies we're building now are softer, more heuristic, right?

Machine learning, AI: we're getting away from programming computers explicitly with rules, and we're using softer statistical methods that approximate their outputs. And so I think we're gonna be coming full circle, where system one is actually going to be a far more powerful way of thinking.

I think it always has been a far more powerful way of thinking, but it's actually gonna have stronger utility in society as the technologies we create become higher and higher level, more abstract, and softer. It's gonna be more of that intuitive, high-level thinking that's gonna be far more important than the low-level stuff, which has already been taken over by machines and is, I would say, less relevant to operating in society.
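To make that rules-versus-approximation contrast concrete, here's a minimal sketch. It's my own toy example, not from the episode, and the temperature formula is just a stand-in: the same mapping is written once as an explicit rule and once as a statistical fit learned from example pairs.

```python
def rule_based(celsius):
    # The "explicit rules" approach: the programmer encodes the formula.
    return celsius * 9 / 5 + 32

def learn_from_examples(pairs):
    # The "softer" statistical approach: fit a line to observed
    # input/output pairs (one-feature least squares, done by hand).
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    slope = (sum((x - mx) * (y - my) for x, y in pairs)
             / sum((x - mx) ** 2 for x, _ in pairs))
    intercept = my - slope * mx
    return lambda x: slope * x + intercept

# Learn the mapping from examples instead of being told the rule.
examples = [(c, rule_based(c)) for c in [-40, 0, 10, 25, 100]]
model = learn_from_examples(examples)
print(rule_based(37), model(37))  # the two agree closely on this noiseless toy data
```

The learned model never sees the formula, only its outputs, which is the shift the episode is describing: from encoding rules to approximating behavior.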

So I think system one will end up being recognized as more powerful in the long run anyway. OK. So that is it for this episode. We covered quite a few aspects. I know I got a little jargony around some of that TerraUSD, uh, Terra Luna collapse stuff, but I hope that made sense. Right? In one context, it is right to think of what happens in aggregate in crowds as a bit of madness, and we have to be careful.

But also, when you get the bigger picture, that really is just the cost of complexity. I think that when people get together, they can solve problems in groups; there is a real wisdom to the crowd. And I think over time it really will be seen as wisdom, but in the moment it can definitely look delusional, and it's up to us to maybe not get caught up in the death spirals like the ants do.
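The statistical core of that "wisdom of the crowd" idea can be sketched in a few lines. This is my own illustration with made-up numbers, not anything from the episode: many independent, noisy guesses, averaged together, land closer to the truth than the typical individual guess does.

```python
import random

# Each "person" guesses a true quantity with independent noise.
random.seed(1)
truth = 1000.0
guesses = [truth + random.gauss(0, 150) for _ in range(500)]

# How far off is the crowd's average, versus the average individual?
crowd_error = abs(sum(guesses) / len(guesses) - truth)
typical_error = sum(abs(g - truth) for g in guesses) / len(guesses)
print(crowd_error, typical_error)  # crowd error is usually far smaller
```

The catch, and it's the whole point of the ant mill, is that this only works when the errors are independent. Once everyone starts following everyone else's pheromones, the noise stops canceling and starts compounding.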

Hopefully you're not part of that ant group, and you're part of the other ants that go on to survive. So understand that this is a property, that there is a cost to complexity, and try to put yourself on the right side of that equation. Thanks, everyone, for listening. If you want to help support Nontrivial, head on over to patreon dot com slash nontrivial, and for a few dollars a month you can help support what I do.

And of course, again, that gives you access to the visuals that I use on these episodes. So thanks again. I know there was a bit of a break there, a few weeks since the last episode, so I'm trying to get back on track. I'll try to get the next one out soon after this one; the book is already read for that one, so pay attention for that. Thanks, everyone, for listening. Until next time. Take care.

Transcript source: Provided by creator in RSS feed