
How Much Do We Really Know?

May 19, 2025 · 55 min

Summary

We often think we understand the world in much greater detail than we actually do, a phenomenon called the “illusion of knowledge.” Cognitive scientist Phil Fernbach explains why this happens, using examples ranging from everyday objects like toilets to complex systems like automated cockpits and financial markets. He also explores how this illusion impacts our views on political issues and offers strategies, like explaining concepts in detail, to bridge the gap between what we know and what we think we know.

Episode description

You probably know someone who thinks they know more about something than they really do. But you could never be described that way . . . could you? This week, cognitive scientist Phil Fernbach explains the "illusion of knowledge" — the fact that we think we understand the world in much greater detail than we actually do. He'll explore why this happens, and how to close the gap between what we know and what we think we know. 

Hidden Brain is about to go on tour! Join Shankar in a city near you as he shares key insights from the first decade of the show. For more info and tickets, go to https://hiddenbrain.org/tour/

Transcript

This is Hidden Brain. I'm Shankar Vedantam. Some months ago, I brought seven key insights from the first decade of Hidden Brain to live stage performances in San Francisco and Seattle. The evenings were electric. We got so much positive feedback from those two sold-out shows that we've decided to launch a tour to more than a dozen cities in the coming months.

I'll be coming to Portland, Denver, Minneapolis, Chicago, Austin, Dallas, Boston, Toronto, Clearwater, Fort Lauderdale, Phoenix, Baltimore, Washington DC, and Los Angeles. To snap up your tickets, please go to hiddenbrain.org slash tour. You can also sign up to say hello and get a photo with me. In some places, you can sign up for an intimate chat with me and a handful of other fans. I'd love to see you there. Again, go to hiddenbrain.org slash tour. Okay, on to today's show.

This is Hidden Brain. I'm Shankar Vedantam. Almost exactly a century ago, a British ship, the Endurance, became trapped in Antarctic sea ice. Aboard were 28 men, led by the explorer Ernest Shackleton. On October 27, 1915, the pressure of the ice crushed the hull. Freezing water rushed in. The sailors dragged lifeboats and supplies from the ship onto the ice. Carrying the lifeboats, they started hiking toward open water.

After a week with little progress, Ernest Shackleton realized they would be better off camping on an ice floe and waiting for it to drift toward open water. What followed was an extraordinary story of knowledge and resourcefulness. The explorers managed to get their lifeboats into the water and row to dry land, a place called Elephant Island. But the desolate island offered little by way of resources and no hope of rescue.

So Ernest Shackleton and a smaller group took one of the lifeboats and set out for a whaling station. It was 800 miles away, across towering seas and deadly storms. The explorers had only basic navigation equipment. Somehow, using the stars to guide them, they made it to the island of South Georgia. But storms had blown them to the wrong side of the island from the whaling station. They now had to hike across the island, across frozen mountains and icy glaciers, to reach help.

Ernest Shackleton eventually commanded a ship back to Elephant Island to rescue the remainder of his crew. He saved every last man. This is the kind of story that is popular in novels and movies. The explorers came up with creative solutions to problems and displayed a deep understanding of the challenges before them. They kept their heads and saved themselves. Stories like this can obscure how little most of us understand about our own world.

When we face obstacles, even simple problems like a punctured tire, a malfunctioning phone, or an odd pain in our stomachs, are we any good at figuring out what to do? Closing the gap between what we know and what we think we know, this week on Hidden Brain. Human beings are, as far as we know, the most intelligent creatures ever produced by evolution.

Our brains are unimaginably complex and capable of extraordinary feats of invention and creativity. And yet, these amazing minds also come with certain limits. At the University of Colorado Boulder, Philip Fernbach is a cognitive scientist. He has spent many years studying a rather humbling question: How much do we actually know about the world in which we live? Phil Fernbach, welcome to Hidden Brain. It's great to be with you, Shankar. Thank you.

Phil, in 1946, eight men gathered in a nuclear lab in Los Alamos, New Mexico. They were there to watch a distinguished researcher perform a maneuver known as Tickling the Dragon's Tail. Who were these people, Phil? What was going on? This was one of the most fascinating periods in American history. It was the development of the nuclear bomb at Los Alamos in New Mexico. These were very eminent physicists who were testing the reaction of the fissile material in the bomb, the plutonium.

And this particular experiment, which the physicist Richard Feynman, as you said, called Tickling the Dragon's Tail, was a very delicate experiment that involved taking two hemispheres of beryllium that were surrounding the plutonium core and moving them closer and closer together to test the reactivity of the plutonium.

So the plutonium is radioactive and gives off neutrons. Those neutrons rebound off of the beryllium and create the reaction. As the hemispheres get closer together, you get more of that reaction. What's so delicate and dangerous about this experiment is that if the hemispheres get too close together, it can create a chain reaction that releases a burst of radioactivity that can be very dangerous.

So the lead scientist who was running this experiment, who was tickling the dragon's tail, if you will, was named Louis Slotin, and he was a very experienced physicist, was he not? He was. He was one of the developers of the bomb and an extremely eminent and experienced physicist. He was the most important member of this experimental team because he was the one who was actually engaged in the process of bringing those hemispheres of beryllium closer together.

How was he doing it? What was he doing? Well, this is the crazy part of the story. He was actually using a common flathead screwdriver to keep the two sections of beryllium apart. Unfortunately, at the critical moment, the screwdriver slipped. The two hemispheres of beryllium crashed together, and they released this intense burst of radiation. Slotin, who was right next to the apparatus, took the worst of it, and he actually died in the infirmary nine days later of radiation poisoning.

The rest of the physicists in the room, all eminent scientists, survived the initial burst. Some of them unfortunately died before their time, potentially due to the radiation dose that they received. Was there a way of conducting this experiment more safely? There certainly was. For instance, an obvious way to do it would have been to suspend one of the hemispheres of beryllium, and then the other hemisphere could be raised from the bottom. In that case, if anything slipped or there was any problem, gravity would have just pulled the two hemispheres apart, and that would have been a much less dangerous way to conduct the experiment. Why do you think Louis Slotin didn't do it that way, Phil? I think Slotin, like any other person, by virtue of his experience, did not foresee this potential problem.

He became overconfident in his ability to conduct this experiment because he had done it so many times before and he had so much experience in this domain. I want to ask you about another incident that took place many years later. In 2009, Air France Flight 447 crashed into the ocean, killing more than 200 passengers on board.

When the black box of the plane was recovered, it appeared that the aircraft had stalled. Can you first explain what a stall is, what you're supposed to do when a plane stalls, and what the pilots of Flight 447 actually did? A stall is a very scary thing. That's when an airplane loses airspeed and literally starts falling out of the sky.

Every pilot, when they learn how to fly an airplane, this is one of the most important things that they train for. So they actually will purposely put the airplane into a stall and then learn how to take it out of the stall. How do you take a plane out of a stall? Well, actually, what you need to do is point the nose of the plane down, counterintuitively. You want to go up, but you actually have to point the plane down. You have to regain airspeed, and then you can recover your altitude.

When they recovered the black box from this airplane, surprisingly, what they found was that what this very experienced pilot had done was exactly the opposite of what you're supposed to do in a stall situation. He was actually trying to pull the plane up, but without any airspeed, that's just impossible. There's no way that the plane could recover. And so, unfortunately, this led to this devastating outcome. I'm wondering whether the Federal Aviation Administration or other safety organizations evaluated why the error had come about. Did they look into that, Phil?

Yeah, they did very extensive analyses to try to figure out what went wrong. And one of their overall conclusions was that pilots in general have become too reliant on automation and have lost some of their basic flying skill. Modern airplanes, especially modern jetliners, are so technologically sophisticated that a lot of the time the software and the airplane are doing most of the work. The pilot is almost like an observer watching the airplane do the job.

The pilot, of course, has to be there to intervene in unusual situations when something goes wrong. The problem is that those situations are becoming more and more rare because the software is becoming more and more capable. So in cases where an intervention is necessary, the pilot might not be as prepared as they should be, because they've become too reliant on the automation. I mean, I'm thinking on a much smaller scale: I am so reliant on my navigation tools and GPS technology to get me around that on the occasions when my phone dies, I suddenly feel like I can't find my way out of a paper bag.

I've had the exact same experience. In fact, I've heard stories of people driving into lakes because that's what the GPS tells them to do. My own personal experience with this is that I use self-driving software all the time when I'm driving now. It works remarkably well. But one thing I've noticed over the last few years is that I've actually become a worse driver. No one wants to admit to themselves that they've become a worse driver, but for me, the evidence is kind of incontrovertible at this point, because I've gotten into a few fender benders over the last few years, which I've never done before. And so I've asked myself, what is going on here? And I really think it's because I've become so reliant on the car doing the work that I've lost my situational awareness. And then when something unusual happens and I need to take over, I'm not ready to do it. Just like a pilot on an automated airliner.

Both the Los Alamos accident and the Air France crash involved highly experienced people who made seemingly elementary mistakes.

Phil's research suggests there are many other instances where less experienced people also make blunders, in part because they assume that other smart people know what they're doing. Take for instance the 2008 financial crash in the United States. The financial crash in 2008 was in a great part due to a massive decline in the value of these financial products, a certain kind of financial product called a derivative.

Derivatives are really complicated financial instruments and it's really hard to predict exactly how they're going to behave. Traders like derivatives because they can often generate really good returns. In financial markets, return is always correlated with risk. So if something has a high return, it's also going to have a high risk.

The problem with derivatives is because they're so complicated, it's often impossible to see the risk, or it's very hard to see the risk. The risk only emerges in sort of unusual situations. So it's like the product seems to be behaving just very safely, but then something changes and all of a sudden, instead of losing 10% of its value, it loses 90% of its value or something like that.

Now, what ended up happening in the financial crisis was that these derivative products were ending up in places where they didn't belong, in places that should not have been taking on this amount of risk, things like pension funds that should be relatively safe investments. Why did they end up there? Because the people buying them just did not appreciate or realize how complicated they were.

And so what happened was, when the price of real estate in the United States started going down, these products, which were tied to the value of real estate and to the default probabilities on mortgages, crashed in value.

And there was this huge amount of risk in our financial system because these products had entered into all these different areas and they were highly represented in all these different areas. And all of a sudden, everybody was caught by surprise that the pension fund was losing a huge amount of its value, because it had exposure to these very risky things, because people didn't realize that the risk was there.

In some ways, it's analogous to what happened on the plane. The planes are getting so complex that the pilots, in fact, are trusting the plane most of the time to fly itself. And perhaps traders or people who manage pension funds are basically looking out at the market, and the market has gotten so complex that they basically are saying, okay, we trust that these big banks know what they're doing, these big hedge funds know what they're doing. And in some ways they're outsourcing, or reducing, their own capabilities to actually deal with a crisis, partly because the situation has gotten so complex.

That's absolutely right, and I think it happens at more than one level. So if you think about a pension fund manager who knows what a derivative is but isn't a super expert in derivatives, he's relying on the people who do the analysis of the derivatives. He's thinking, oh, this is potentially a good investment vehicle for my portfolio. He's not thinking in super careful detail about what the risk profile of that instrument is, and maybe he or she should have, but they obviously didn't. But it also occurs at another level.

The person who actually is the super expert on the derivative, even that person, it turned out, didn't understand completely how these things were going to behave. So the big banks who were developing these instruments and selling them would have a mathematical model that would determine exactly how these things would behave. The problem is that those mathematical models work really well most of the time, like an airplane that's highly automated.

But they actually break down in very unusual situations, a black swan event, so to speak. And that's exactly what happened. The models failed, and so the super experts on derivatives, even they completely misunderstood the amount of risk that was present in these products. So it was across the entire financial system that there were these sorts of miscalibrations in understanding, and there was this cascade effect that started with some slight decreases in the price of housing and then cascaded into this massive crash and devastation that not only affected the United States, but affected countries all over the world, because the entire global system is so connected nowadays. When we come back, why we imagine we know more than we do, and what we can do about it. You're listening to Hidden Brain. I'm Shankar Vedantam. This is Hidden Brain. I'm Shankar Vedantam.

Perhaps you've had the experience of driving to an important event. You're running late and you decide you want to outsmart all the other drivers who are stuck in traffic. You take a side street because you know it's a shortcut around the traffic jam. Three blocks later, you realize the shortcut isn't a shortcut. In fact, it doesn't get you where you need to go.

What do you tell yourself in moments like that? Do you say, oh, I got that really wrong? Or do you say, my sense of navigation is impeccable, except for this one tiny misstep? Philip Fernbach is a cognitive scientist at the Leeds School of Business at the University of Colorado Boulder. Phil, as a cognitive scientist, you've studied how humans can be so brilliant but also really dumb. That feels like a paradox.

That is a paradox, and that is the paradox at the heart of humankind, I think. On the one hand, we have visited the moon and created incredible artificial intelligence and all these other sort of almost magical abilities. On the other hand, everybody knows that people can engage in behavior that's incredibly ignorant and extreme and foolish. And we've all done it ourselves and we've all seen it in others.

So researchers who study the mind have come up with a term to describe why our own ignorance is often hidden from us. They call it the illusion of knowledge. What is this illusion? The illusion of knowledge is the idea that we think that we understand the world in much greater detail than we actually do. In cognitive science, this is sometimes called the illusion of explanatory depth. There was great research done in the 1990s by a psychologist at Yale named Frank Keil and his colleague. And that's precisely what they were interested in: How well do people understand how everyday objects work, things like toilets or ballpoint pens or zippers? And in these studies, what they asked people to do was to first just give a sort of impression of how well they understand things. And what's fun about this experiment is that your listeners can actually do this experiment on themselves right now. So think about it. How well do you understand how a toilet works?

And if you're like most people, you're kind of nodding your head right now and saying, well, I have a decent understanding of how a toilet works. You think that somewhere in your mind is something like an annotated plumbing diagram that you could tell us about. But here's the trick. In the next part of the experiment, what I'm going to do is ask you to explain to me in detail exactly how it works. And when I do that, something really remarkable happens.

People reach inside and they realize they have very little to say. It turns out that we tend to know remarkably little about the way that the world works. And yet that initial impression we have is that we do understand in a lot of depth. And that's what Keil called the illusion of explanatory depth, and it's sometimes referred to more simply as an illusion of knowledge or an illusion of understanding.

So when it comes to something like a toilet, I think most people imagine you press a handle and water flows into the bowl and the toilet flushes and that's how the toilet works. That's actually, most of the time, pretty much all you need to know about how a toilet works until the toilet breaks and you have to fix it. Then you realize, actually, there's a lot more going on. At this point you may be asking yourself, how do toilets work?

I asked Phil to test his own knowledge, but he declined to offer an explanation from memory. As he put it, he's learned it about 10 million times but can never retain it. So here he is, reading the explanation. So the most popular flush toilet in North America is the siphoning toilet. And by the way, this is a really ingenious mechanism. Its most important components are a tank, a bowl, and a trapway.

The trapway is usually S- or U-shaped, and it curves up higher than the outlet of the bowl before descending into the drainpipe that eventually feeds the sewer. The tank is initially full of water, and when the toilet is flushed, the water flows from the tank quickly into the bowl, raising the water level above the highest curve of the trapway. This purges the trapway of air, filling it with water.

As soon as the trapway fills, the magic occurs. A siphon effect is created that sucks the water out of the bowl and sends it through the trapway, down the drain. It's the same siphon action that you can use to steal gasoline out of a car by placing one end of a tube in the tank and sucking on the other end. Let me try and say it back to you to see if I can test my own illusion of knowledge here. What I think it's saying is that you have this S-shaped curve in the pipe behind the toilet bowl.

And as you flush the toilet, it fills with water. It starts to fill up this S-shaped pipe, and as it fills the pipe with water, the water basically pushes out the air, creating something of a vacuum on the back end of this S-shaped pipe. And the vacuum then suctions the rest of the water inside the bowl out into the drain. How did I do, Phil? Well, I think we should test that with a plumber, not with me. But I think you did a pretty good job.

What you're saying is I should stick to podcasting and not plumbing, is the advice I'm hearing you give me. I think we both should. But what this highlights is just how complicated the world is. Who would have thought? There's this ingenious mechanism in this object that we use every single day, and we never think about it.

So talk for a moment about how the illusion of knowledge is in some ways a product of the way the human mind works, and its limitations as it interacts with this extremely complex world. Why is it that the workings of the world are so hard for us to wrap our heads around, Phil?

I think this reflects a really deep fact about the nature of the mind and what thinking is actually for. Thinking evolved to make us more effective at acting in the world. The world is extremely complex, and to be able to choose effective actions in the world really requires acting in environments that are very different from one another. We almost never see the same exact situation arise twice. And so our minds have to be really effective at generalization.

They have to understand that this situation is similar to that situation in some way, to be able to act effectively. So storing a huge amount of detail about the world and the way that it works is actually counterproductive. What we really want to do is throw away all of the irrelevant detail and just retain the deeper principles, the deeper generalizable structures that are going to allow us to choose effective action.

And of course this must be true for all manner of different things in the world. Everything from bacteria to how trees work to how a hurricane works. All of these in some ways are complex phenomena where we have a general understanding of what bacteria do, a general understanding of what a hurricane is, but not the exact mechanics of what's actually happening at a granular level. That's exactly right. The world is so complex when you start digging into the details of almost anything.

You realize how complicated it is, and that you don't necessarily appreciate that complexity off the bat. I remember hearing a story about a rock, paper, scissors tournament. Do you know the game Rock, Paper, Scissors? Yes, yeah.

Very simple game, right? It seems like there's nothing to learn and nothing to understand about that game. How could you have a tournament? You just throw randomly, right? Well, actually, it turns out that there's a group of people who have gotten really good at rock, paper, scissors. And what do they do? They master the details of how the human mind chooses

what to throw and how it engages in pattern matching and other kinds of things and they can identify certain kinds of patterns in what a more novice rock-paper-scissors player will do and take advantage of that. I see. So in other words, if I'm a novice, there might be certain patterns that I slip into unconsciously without realizing I've slipped into the pattern. And if you're an expert, you sort of can take advantage of that.

That's exactly right. It's a funny kind of dumb example, but something where when I first looked at it, I said, there is no way that somebody could develop any level of expertise in that area. But it turns out, no, there is actually something to be learned there. So we can see the problem with mastering too much detail about the world. When we look at people with a very rare condition called hyperthymesia, what is this condition, Phil? How does it work?

Yeah, this is a super fascinating example. So hyperthymesia is also called highly superior autobiographical memory. And these are people who literally remember everything that's ever happened to them. And so you could talk to a person with hyperthymesia and say, what happened to you at 10:30 in the morning on August 15th, 1985? And they can relate exactly what occurred. They basically have perfect recall.

So before, I described how our minds are very good at throwing away irrelevant detail. If you have hyperthymesia, it's the opposite. You retain everything that's ever happened to you. It sounds like a superpower, but it actually makes life really difficult. There's an amazing short story by the Argentine writer Borges called Funes the Memorious, and it describes exactly this: a man who falls off a horse

and develops perfect autobiographical memory so that he remembers every little detail of his life and it drives him crazy. And that's because the purpose of thinking, the purpose of cognition, is really not to store detail. It's to throw the detail away to be able to generalize. And that's such an important function of cognition. If you have this hyperthymesia, it actually makes life really difficult.

So what I hear you saying, Phil, is that in many ways it's actually more efficient and more effective for us to discard a lot of details or maybe even never learn the details in the first place and stick with this very generalized understanding of the world. So the problem is not that we don't know everything, that's just the human condition. The problem is that most of us think we know more than we actually do.

In fact, it would be futile to try to know everything. The world is just way too complex. We've already talked about that. There's just no way that an individual can know enough about the world that it would be effective to store all the details about everything. That's just part and parcel of what it means to be a human being. It is a problem that we don't appreciate the extent to which we don't understand.

So in April 1995, a man named MacArthur Wheeler came up with what seemed to him to be a foolproof way to rob banks and get away with it. Tell me what he did, Phil. This is a crazy story. So this guy was under the impression that he would be invisible to the cameras if he put lemon juice on his face.

You remember when you were a kid, you could create invisible ink by putting lemon juice on the paper. So he thought, for some reason, that this would apply to making him invisible to the camera. So he actually went in to try to rob a bank with no mask, no subterfuge whatsoever, just lemon juice on his face.

So his image is broadcast on the news. Within minutes, the police arrive at his door to arrest him, and he is completely incredulous. How could they have possibly known it was him? Because he had the lemon juice on his face. And in fact, it turned out that he had tested the method by taking a Polaroid of himself and making sure that his face was indeed invisible to the Polaroid camera. It's never been discovered why the Polaroid didn't show his face,

but one possibility is that he missed his own face with the camera because his eyes were filled with lemon juice. So what you're saying is he was not just a terrible bank robber, he was a terrible photographer as well. Apparently so. Phil has also found that our faulty memories play a role in the illusion of knowledge. Take, for instance, the research into how good people think they are when it comes to making smart investments in the stock market.

It's been well documented that people tend to be very overconfident in their capabilities in terms of beating the market. And that can lead to really disastrous outcomes, like people take on too much risk. And it actually turns out that the more people actively trade, the worse they do. And active trading is sort of an indicator of being overconfident in your ability to beat the market. This is a paper that I wrote with Dan Walters.

And what we found in this paper is that one reason for the overconfidence, the reason that people feel that they can beat the market, is because they tend to remember the good outcomes and forget the bad outcomes. So if I make a trade and I do great on it, that one's going to stick in my memory more so than when I make a trade and it does poorly.

That's not to say that if you have a really disastrous trade, you're not going to remember that. You certainly would. But on average, you're going to tend to inflate the good over the bad. And one reason for that is because we tend to remember the things that help us maintain a positive self-image. We want to believe that we're smart and good at investing. That's a very natural human tendency, to want to remember the good and kind of forget the bad.

And so this doesn't just occur in the domain of investing but can occur in other domains as well. So we've looked at how the illusion of knowledge and the illusion of explanatory depth applies to physical objects, things like bicycles or toilets. Can they also apply to historical events or public policy, in that we imagine that we understand things better than we actually do, Phil?

Yes, and that is what made me so passionate about these ideas. Because it turns out that the reason we should care about this illusion is not because people don't understand how a ballpoint pen or a toilet works. But I realized at some point that the illusion applies to just about everything that we grapple with as a society and as individuals. I was doing this work in the midst of a political environment in the United States that was becoming more and more polarized. And as I was doing this work, I had this big insight that, wow, we are arguing with incredible vitriol across the aisle about issues that are extremely complex and that nobody understands in a lot of depth and detail. And yet we have these passionate, strong views and are unable to compromise across the political divide. And that was a core issue that really got me interested in this stuff.

Now, I think if you talk to a lot of people with strong political opinions, they will say, well, you know, I do understand the issues. I do understand what it means to have universal health care. I do understand the consequences of a flat tax. Are they wrong? I think in many cases they are wrong, and actually we've demonstrated that in experiments. So what we've done...

is something very akin to the toilet experiment that I described earlier. We bring people into the lab and we ask them about their position on the issues of the day, the things that we're arguing about as a society. And then we ask them how well they understand those issues. People tend to say, oh yeah, I understand it pretty well. And then we ask them to explain the mechanism, explain in detail how it works.

And we find this large decrease in the sense of understanding. So people are humbled by that because they try to explain and they realize, wow, I just have a talking point or two. I actually don't understand this thing in detail. You've also found that people who have some of the strongest views on an issue are sometimes also those with the lowest levels of knowledge about the issue. Why would this be the case, Phil?

So I've studied this in the context of controversial scientific issues, things like the safety of genetically modified foods or the reality of climate change. In those cases, there's a substantial minority of the population who expresses really extreme, strong counter-consensus views, views that are counter to the scientific consensus. And what we find in that case is that the people who have the strongest

counter-consensus views have the highest levels of subjective understanding. They feel like they understand these issues the best, which makes sense, because if I feel like I understand it really well, I'm going to have a strong opinion. But when we measure their actual understanding of the issue in a variety of ways, they actually have the lowest levels of objective knowledge. So when you put those things together...

The people who have the most extreme counter-consensus views have this huge gap between what they think they know and what they actually know. Why do you think that's happening? The fact itself, of course, is striking and surprising, but is there a relationship between the two? Is the fact that they don't know partly what causes their certainty? I think that that's right.

So if I feel like I understand something and I feel like I know it well, I'm going to be less likely to listen to counter evidence and counter explanations or to do more research into the issue to learn more about it.

If I already feel like I understand it, it's really hard to reach me with counter evidence. And so the people whose views seem to be the most out of line with what the scientific community says are the ones who are hardest to reach, because they already feel like they understand the issue. It turns out that many of the problems that beset us in the modern world have a common source. We think we know something when actually, we don't. When we come back, combating the knowledge illusion.

You're listening to Hidden Brain. I'm Shankar Vedantam. This is Hidden Brain. I'm Shankar Vedantam. Philip Fernbach is a cognitive scientist at the Leeds School of Business at the University of Colorado Boulder. Phil, you say that in the face of the knowledge illusion, one effective way to better align what we think we know with what we actually know is to challenge ourselves to explain technologies, procedures, and policies in our own words. Why does this dispel the knowledge illusion?

So we're not in the habit of engaging in a lot of explanation. Most of the time we just take things for granted, as we've been discussing. And so when we do get in the habit of explaining, it's sort of hard to hide from ourselves that we don't understand things as well as we thought we did. A great example of this comes from work by Rebecca Lawson on people's understanding of bicycles.

So if you ask somebody, do you know how a bicycle works? A lot of people say, oh yeah, I kind of know how that works. But then in the study, what she did was ask people to draw a bicycle. And if you sit down and try to do it, and I encourage the listeners to try, it's much more difficult than you might have anticipated.

And that's just a great example where, in that case, it's impossible to hide from yourself the gaps in your knowledge. They've just become revealed on the page as you try to draw. I mean, in some ways this is not hugely different from what happens in a school setting, where you're giving a student a test and you're basically saying, produce for me your knowledge. You know, you think that you know how

differential equations work? Produce for me that knowledge by solving this problem. That's right. A test that is designed to actually gauge understanding of the way things work, as opposed to a test that is merely regurgitating facts. So a test that's well designed to actually evaluate whether somebody understands the details or mechanisms of the way that something works would indeed reveal those gaps.

I have a colleague who was recently telling me a story about one of his classes, where he was just imploring students: if you are using artificial intelligence to try to help you with the class, make sure that you understand the material and are not just relying on the artificial intelligence. And yet when the test came around, and they had all been using ChatGPT, they learned that they

did not understand the material as well as they thought they did, because they had been relying on the artificial intelligence. And they got very upset with him. And he sort of explained to them, look, I tried to tell you. But this is such a natural human thing, to overestimate our understanding, that it's very hard to actually appreciate

the gaps in our knowledge. It's so natural for us to go through life just not really questioning, and assuming we know more than we do, that we don't really see it. And that's why getting into this habit of questioning yourself and questioning your understanding is a very powerful tool for dispelling the illusion. So you've asked people with strong views on political issues to explain public policies in detail. What effect does this have on their certainties, Phil?

So in our studies, we've often compared two different types of explanations. One is this mechanistic explanation: how does it work? And the other is more about why you believe what you do, so reasons. And you can think about those two modes of thinking as being more explanatory or mechanistic on the one hand, and then more argumentative

or advocacy-based on the other hand. I think we are much more commonly engaged in this argumentation or advocacy when we're talking about things like political issues. We very rarely engage in this kind of mechanistic or explanatory discussion. But it turns out that the mechanistic or explanatory mode leads to a little more open-mindedness.

It gets people to be a little less certain about their positions because it reveals the complexity of issues. And this has been demonstrated to some extent in our work, where we simply ask people, how do you feel about these issues afterwards? But I think even more interestingly and meaningfully, there's a great paper by Elnakouri, Huynh, and Grossmann a few years ago, where they asked people on different sides of the political divide to have political discussions, and then they measured

different things about the discussion, like how well it went and how open-minded people seemed to be and how much they were listening to the other side, those kinds of things. And what they found is that the discussions were more productive when people engaged in this more explanatory kind of discussion. Did you find that when people are asked to engage in this more explanatory style of conversation, political polarization goes down, because some of people's political certainties go down?

It does. They move to the middle a little bit. The challenge with that is that people get defensive as well. So if you challenge people on their understanding, they might actually double down and say, no, I know what I'm talking about, because people naturally tend to be a little bit defensive. So the way to actually implement such an intervention can be a little bit challenging in its details, because, again, human beings are not simple. They're complex.

So if you reveal people's lack of understanding to them, it may have the effect of making them a little bit more humble or more moderate, but it could also make them double down on their position a little bit.

I'm wondering whether you might have advice for people who are sitting across the table from a friend or family member with whom they're having a really strenuous disagreement or debate. How would you encourage them to invite the other person to engage in explanatory thinking as opposed to argumentative thinking? The key is to be curious about the other side's position. So if you start from a perspective of,

the other side is not as smart as me, is not as ethical or moral as me, they're bad, they're stupid, then you're not likely to have a very productive discussion. If you start instead with the perspective of, the person that I'm talking to is as smart as me and as moral as me,

then you become curious. Why is it that they maintain a strong position that's so different from yours when it seems like the right answer is just so obvious to you? And then when you become curious, and if both sides are mutually curious,

and they want to understand what is behind the other side's position, then you can have a more productive discussion. And I think what you will find is that when you engage jointly in this explanatory kind of discussion, it's more of a collaboration than it is an argument. And that collaboration is likely going to reveal that both sides don't know as much as they thought they did at the beginning. That's going to be the most common outcome.

I mean, I think one thing that I take away from that is that even when we encounter people with beliefs that are very different from our own, or even beliefs that we can prove are certifiably wrong, perhaps the right response actually is some degree of compassion. Because in some ways, I think what I hear you saying is that the process by which they arrive at their erroneous beliefs is not different from the process by which I arrive at what might be a correct belief.

I agree with that 100 percent, because we all have beliefs that are incorrect, and we all have beliefs that we have much more strength and conviction in than we should. And conspiracy theories are a bizarre sort of extreme example of this, but they're part and parcel of the same kind of mechanisms that lead to all of our beliefs. So another way to better grasp what we don't know is to engage in a practice of considering the unknowns. What is this practice, Phil?

Yeah, so there's been a lot of research in the psychology literature on overconfidence. People tend to be overconfident in a lot of different ways. One classic reason for overconfidence is what's called confirmation bias. That is, we're preferentially disposed to find evidence for the position that we start with, the one that we want. Some of the work that I've done looks at another reason that we're overconfident. It's not just that we tend to preferentially weight the evidence

for our positions, but also that we tend to neglect all of the unknown information. And that's part and parcel of all of the themes that we've been talking about today, that the world just seems simpler than it is. If the world seems simpler and we're confronted with an issue, then we're going to tend not to think about all of the stuff that we don't know.

We're going to tend to think about the stuff that we do know. And if we thought about all the stuff that we don't know, it would make us more moderate in our positions, because, wow, there's a lot more to know about this. There is another dimension of the illusion of knowledge that affects organizations and groups, and it affects us when we are part of large groups. This aspect of the illusion of knowledge shapes gyrations in the stock market and your retirement portfolio.

What we find is that when people search the internet for financial information, they become overconfident in their knowledge. And not only do they become overconfident in their knowledge, but that leads to downstream behaviors like taking on more risks. To listen, please check out the episode titled, Smarter Together, Dumber Together in our subscription feed, Hidden Brain Plus.

If you're not yet subscribed, go to support.hiddenbrain.org. If you're using an Apple device, go to apple.co slash hiddenbrain. Your support helps us produce high-quality shows with scientific experts. Every episode of our show is thoroughly researched, fact-checked, and edited. By signing up for Hidden Brain Plus, you are helping us build a show you love. Again, that's support.hiddenbrain.org or apple.co slash hiddenbrain.

Philip Fernbach is a cognitive scientist at the Leeds School of Business at the University of Colorado Boulder. With Steven Sloman, he is co-author of The Knowledge Illusion: Why We Never Think Alone. Phil, thank you so much for joining me today on Hidden Brain. This has been a really fun conversation, Shankar. Thank you so much. Do you have follow-up questions for Phil Fernbach? If you'd be willing to share your questions with the Hidden Brain audience, please email them to us at ideas at hiddenbrain.org.

That email address again is ideas at hiddenbrain.org. Use the subject line, knowledge. Hidden Brain is produced by Hidden Brain Media. Our audio production team includes Annie Murphy-Paul, Kristen Wong, Laura Querell, Ryan Katz, Autumn Barnes, Andrew Chadwick, and Nick Woodbury. Tara Boyle is our executive producer. I'm Hidden Brain's executive editor. We end today with a story from our sister podcast, My Unsung Hero. This My Unsung Hero segment is brought to you by Discover.

Our story comes from Kara Beth Rogers. In 2008, Kara Beth's brother Luke passed away unexpectedly. Kara Beth was 20 years old at the time, studying abroad in Morocco. The shock of the news made it nearly impossible for her to form a sentence, let alone fly all the way back to the U.S. She managed to get to the airport, but as soon as she boarded the plane, the grief became unbearable.

I was sitting on an aisle, and I will never forget the way that it felt to try to sit still. It was impossible. I couldn't stop moving. The strength of the emotions was so intense. I really stood out, and I felt like people were avoiding eye contact with me. They weren't really sure what to do with me. At some point through the flight, this man came up to me. I was sitting on the aisle on the right side of the plane, so he came up to me and he crouched down next to me on my left side,

and he was so gentle. He made direct eye contact with me, and he spoke softly and slowly, and he was really sincere. And he said, I know you don't know me, and I don't know what's going on for you, but I want you to know that if you need anything, I'm here. And I said, thank you. I never ended up going to him during that flight, but knowing he saw me... you know, I felt like I was in this cavern of just, like, untenable emotion, and that I was deeply, deeply alone. And,

you know, having lost a brother, we were so close in age. We grew up just one year in school apart. And knowing that I was on a plane with somebody that could see me, that knew that I needed something, even if I didn't know what it was, even if they didn't know what it was, was an incredibly powerful experience. I will forever be grateful for him. It was a really powerful moment for me. Kara Beth Rogers lives in Los Angeles.

This segment of My Unsung Hero was brought to you by Discover. Discover believes everyone deserves to feel special, and celebrates those who exhibit that spirit in their community. I'm a longstanding card member myself. Learn more at discover.com slash credit card. I'm Shankar Vedantam. See you soon.

This transcript was generated by Metacast using AI and may contain inaccuracies. Learn more about transcripts.