Today it's great to have Steven Pinker on the podcast. Dr. Pinker is the Johnstone Professor of Psychology at Harvard University, a two-time Pulitzer Prize finalist, and the winner of many awards for his research, teaching, and books. He has been elected to the National Academy of Sciences and named one of Time's 100 Most Influential People and one of Foreign Policy's 100 Leading Global Thinkers. His books include How the Mind Works, The Blank Slate, The Stuff of Thought, The Better Angels of Our Nature, The Sense of Style, Enlightenment Now, and most recently, Rationality: What It Is, Why It Seems Scarce, Why It Matters. Steven Pinker, so great to have you back on the Psychology Podcast. Nice to be here. I always enjoy talking to you. You always stimulate my brain, which is kind of the motto of our podcast. So in this new book of yours, you argue that, quote, rationality ought to be the lodestar for everything we think and do. Why is that? The very fact that you're asking that question and waiting for my answer means you've already accepted the answer, namely that we ought to persuade each other with good reasons. Now, it's not the only way that we can get other people's assent. We can bribe them, we can threaten them, we can beat them up, we can try to cancel them. But if you're tossing out a question, waiting for the answer, presumably pondering what I'm saying, then you have already committed yourself to reason. Reason is the water we swim in, the air we breathe. Well, you know, the key part of that sentence I asked about is the "everything we think and do." So I must ask: must we always follow reason? Do I need
a rational argument for why I should fall in love, cherish my children, enjoy the pleasures of life? Isn't it sometimes okay to go crazy, to go wild, to be silly, to stop making sense? Absolutely, and there is a false dichotomy between reason and goals, motivation, emotion. As thinkers going back at least to David Hume have noted, reason is a means to an end. It's a way of achieving a goal using knowledge. It can't tell you what the goal is. If you enjoy pain, if you enjoy
being miserable and hungry and cold. Well, reason can help you accomplish that if that's what you want to accomplish, and it isn't going to tell you that that's not something you ought to accomplish. But whatever it is you want to accomplish, reason is the way you gain it. But the goals are not part of reason per se.
Although I'm going to add an asterisk there. One can still rationally talk about goals, such as what happens when two or more of your goals come into conflict, such as having fun now versus being comfortable and respected in the future. And there are also rational arguments to be had when one person's goals clash with another's, and that brings us into the domain of morality. But there's nothing irrational about falling in love or dancing and partying. There's nothing irrational about them. You know, I want to talk for a second about what rationality is, because I'm a big fan of Keith Stanovich's work, and I was really excited to see that you quoted him in your book. He often distinguishes between two types of rationality in his work: instrumental rationality and epistemic rationality. Your definition of rationality seems to be the merger, a pithy condensation of both. So you define
rationality as the ability to use knowledge to attain goals. It kind of combines both instrumental and epistemic in a way. Yes, and that is a fundamental distinction, sometimes framed as the distinction between what is true and what to do, or pure reason and practical reason. The reason that I merged them in my own attempt at a definition, namely the use of knowledge to attain goals, is that even when it comes to epistemic rationality, what is true, we don't really consider someone rational who's just, say, spitting out the digits of pi for all eternity, or using logic to state true but useless facts like "either Paris is the capital of France, or unicorns exist; either Paris is the capital of France, or the moon is made of cheese." I mean, those are all true, but we tend not to consider it particularly rational just to say true stuff. Even in epistemic rationality, we have some goal of understanding the world, expanding our knowledge base. We have goals for which truths we prioritize. So that's why I merged them. But you're right that one could say there is a logical distinction between them. And at its base, I mean, rationality helps
you get the things that you want. And it's very interesting, you know, that goal-conflict thing. I'm not over that yet, because I think about it all the time. We can have two long-term goals that conflict within ourselves, right? A goal conflict isn't always a short-term desire weighed against a long-term one; we can have two long-term goals. And is there a system in the brain, like a meta-rationality system, that can help us adjudicate between two rational potential avenues we could be taking? Does that make sense? Oh, it makes complete sense. But the way I would put it is that rationality is itself meta. That is, if you had a separate meta-rationality, then you'd say, well, gee, do we need another part of the brain for the meta-meta-rationality?
It's not turtles all the way down, or all the way up. But taking a leaf from the field of linguistics and cognitive science, which going back at least to Noam Chomsky and George Miller emphasized the power of recursive computation and recursive representation: an idea can contain an idea, including an instance of its own kind. A reasoning process can step back and consider its own shortcomings or flaws, and we can then in turn step up a level and criticize the criticism of our reasoning. As long as you have the computational power to embed a proposition in a proposition, or have a routine call an instance of itself, that automatically makes it meta for as many levels as you want, until, of course, the mind boggles because it just gets
too complicated to keep track of them. But it is essential, and I'm glad you mentioned it, when it comes to adjudicating among long-term goals, which is much harder. We are all, of course, faced with the tension between immediate gratification and longer-term satisfaction; that's just the stuff of self-control, of maturity. But you're right that we often have long-term goals that come into conflict. In talking about reasoning about goals, I consider adjudicating among goals within a person to be kind of what we mean when we talk about wisdom, and adjudicating among goals from different people to be kind of what we mean when we talk about ethics and morality. But there's no obviously correct answer to how I should trade off creating my masterpiece against spending time with my kids. Both of them are long-term goals, at least the building of a satisfying relationship with your kids. And it is part of the agony of being a mature adult that one has to grapple with these conflicts, and there's no single correct answer. Yeah. There's a question I want to ask you, and it's opening up a can of worms: what is truth? I've heard other very smart people attempt to discuss that topic, and it can go in
a very long conversational direction. But I feel like that's kind of the elephant in the room when we talk about rationality. Truth is a very important part of rationality; you want to use correct knowledge, right, to attain goals. Your definition is the ability to use knowledge to attain goals, and the inference there is that it's the ability to use correct knowledge. There's a lot of knowledge going around these days. According to one fairly well-known characterization borrowed from the field of philosophy, knowledge is sometimes defined as justified true belief. So by definition, knowledge is true. If it isn't, we don't call it knowledge, we call it belief. We don't say John knows the moon is made of cheese, although we could say John believes the moon is made of cheese. When you use the word knowledge, you're kind of committing yourself to truth behind
the scenes. But then, you're right, it does raise the question of what we mean by truth. And again, this is a question much discussed in the field of philosophy, and in many ways above my pay grade. A famous definition of truth, from Tarski, is that to say that X is true is to say X. Now, for many people that's not particularly satisfying. Or: truth is what is the case. But it's like reason: it doesn't submit to a conventional definition, because it's deeper than that.
You can't have a definition of reason unless you know how to reason. You can't talk about truth unless you are already grounded in some tacit commitment to truth. Otherwise nothing that you say would be, in a sense, worth saying. It'd be, oh, take it or leave it, I'm just making noise with my mouth, I'm not making a claim to anything that is factually accurate or worthy of your belief. So we're kind of committed to truth whenever we assert, persuade, explain. And I think one way of parsing this somewhat enigmatic definition of truth, that to say X is true is to say X, is that truth is just kind of what we mean when we say stuff in the first place. And so there is something almost superfluous about differentiating between saying that "the world is round" is true and saying the
world is round. I mean, this just raises the question: can it be rational in certain instances to be completely delusional about something, if the delusion itself is what gets you to attain your goal? Would it then be the more rational thing, to have incorrect knowledge? Well, it could be, if we relativize it and say that it is a rational way to attain a particular goal. But I don't think we would call it rational overall.
And again, especially as someone who is very interested in words, I know that it's not up to me to legislate their meaning. When I answer a question of whether the word applies, I'm tapping into the community's intuitions about when it's natural to use it. The word means what people understand the word to mean. So the answer to your question, is it rational, is really: would most people use the word rational in describing that? I think most people would not, unless you narrow it and say, is this a rational way of doing whatever he wants to do? But is it rational to believe things that we, standing outside that person, know to be false? I think the answer is no. In my own admittedly somewhat makeshift definition of rationality, namely the use of knowledge to attain goals, knowledge is packed in. And as we spoke about just a couple of minutes ago, the conventional understanding of knowledge is justified true belief, and so truth is packed into that. So in the most general sense, no, you can't be rational if you believe things that are not true, at least if there are reasons for you to know that they're not true, or reasons for us to know that they're not true. Yeah, Steven, this is
really interesting. So you distinguish between logic and reasoning. Can you be completely illogical about something but still make a rational decision about it? I don't think you can be illogical, but you have to apply a lot more than logic, because the problem with logic is that it basically just expands what is already in the premises. If something is true, it tells you some other things that are true, some things that aren't true, and some things that might or might not be true. But because rationality involves the pursuit of some goal, even if the goal is understanding something more deeply or more accurately, logic doesn't get you there. There are just too many things you can prove. You can prove all kinds of crazy but irrelevant things. If P is true, then P or Q is true, and so is P or Q or R, and P or Q or R or S. You could spin out all kinds of useless true statements indefinitely, so logic isn't enough. Moreover, if you are applying nothing but logic, there's a sense in which that can be irrational, in the sense that you are deliberately forgoing the use of possibly relevant knowledge.
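The point about logic licensing endlessly many true but useless statements can be made concrete with a small sketch. This is purely illustrative Python of my own construction (the propositions and their truth values are invented, not from the conversation): starting from one true proposition P, disjunction introduction generates a pile of true statements that add no information beyond P itself.

```python
from itertools import combinations

# Toy propositional model: P happens to be true; Q, R, S are false.
model = {"P": True, "Q": False, "R": False, "S": False}

def disjunction_true(atoms):
    """Truth value, in the model, of the disjunction of the given atoms."""
    return any(model[a] for a in atoms)

# Disjunction introduction: from P, infer P-or-anything.
# Every disjunction containing P is guaranteed true, however irrelevant
# the other disjuncts are.
true_statements = []
for k in range(1, 5):
    for combo in combinations("PQRS", k):
        if "P" in combo and disjunction_true(combo):
            true_statements.append(" or ".join(combo))

print(true_statements)
# All eight statements containing P come out true, from "P" up to
# "P or Q or R or S", and none is more informative than P alone.
```

With four atoms this already yields eight true statements; with n atoms it would be 2^(n-1), which is the "too many things you can prove" problem in miniature.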
The point about logic is that it's formal, in the sense that logic doesn't even care about the content of sentences. It cares only about their form, about how propositions, predicates, and arguments are joined by ands and ors and ifs and nots and alls, and necessary and possible, but it really doesn't care what those sentences are about, whether they're about the moon being made of cheese or little green men emerging from flying saucers. A predicate is a predicate, and logic allows you to apply predicates to new sentences. And not only is that what logic is all about, but if you're really being a logician, you may not make inferences that aren't licensed by the premises and the conclusions. You may not drag in your knowledge, your beliefs, your conjectures. And in real life, that's kind of a crazy thing to
do. So, for example, if you're a geometry student and you're asked to prove that in an isosceles triangle the two base angles are equal, a perfectly rational thing to do would be to pull out a protractor, measure one angle and measure the other, and say, hey, they're the same, as best I can tell. Now, you don't get credit on a geometry test if you do that, even though in real life that's exactly what you might do, because you're not using the rules of the system, namely, in this case, the rules of Euclidean geometry, which allow a certain number of axioms and rules of inference. You have to stick to those. You can't pull out your ruler and your protractor. That's not very rational, but it's what strict logic demands. And likewise, in all kinds of reasoning, as many experiments have shown, people are in this sense illogical but rational, or at least non-logical but rational. They can't help mixing their real-world knowledge in with the logic, even though strictly speaking, logic doesn't care about real-world knowledge. So if you say, for example, all plant matter
is healthy; tobacco is plant matter; therefore tobacco is healthy. Is that a valid inference? The answer is yes, it's a valid inference: all X are Y, and P is X, therefore P is Y. That's what the rules tell you. Now, you give that to people and you say, solve this just as a logic problem, and they won't do it. They can't block out of their heads the thought: tobacco, that's a really toxic substance, what are you talking about, that it's healthy? You say, no, no, forget what you know about tobacco, just pay attention to the alls and the ises, and ordinary people have a lot of trouble doing that. So is that irrational? Well, not in everyday life, but it is illogical in the sense that, in the actual application of the rules of logic, you're breaking the rules of the game. Now, there are some cases, in fact, in which we don't want people to be so rational;
we want them to be logical. So, for example, we may have a moral principle that you may not prejudge someone by their race or religion. If you're screening for terrorists, for example, even if you have statistical data saying that people of one religion are more likely to be suicide terrorists than people of another religion, and there's a sense in which it would be more rational to scrutinize them more, we decide on moral grounds: no, you can't do it. You've got to follow the rule that all people have equal rights. So we have the ability, at least sometimes, to turn off what we know and follow the rule. Likewise, in the courtroom, there may be evidence that's highly relevant to the case, but if it was illegally obtained, or if it was inadmissible because it referred to the person's criminal record or to racial statistics, we say we're going to follow this strict, logic-like rule system: you must throw out what you, as an ordinary person, know. So there are times when it's actually good that we can turn off the totality of our knowledge and apply rules exactly. Or another example: is a dolphin a fish or a mammal? Well, if we apply everything we know (fish swim, they're streamlined, they live in the ocean), you'd say, yeah, dolphins are fish. But when we apply the strict logic of science, namely that fish belong to one class
and mammals to another, and that if you are warm-blooded and suckle your young and have fur, then you're a mammal even if you look like a fish, then again we're turning off our knowledge, applying strict definitions and rules in a more logical mode of thinking. And that's what makes science possible. Does that contradict your idea that rationality ought to be the lodestar for everything we think and do? Well, it's a good question, and there's a kind of meta-rationality where we can decide, depending on our goals, which tools of rationality to apply and which ones to sideline. If our goal is justice, that's not the same goal as, say, efficiency or the best actuarial statistics, and we might decide, well, justice is more important. Let's turn off our statistical reasoning; let's apply this ironclad rule. And if your higher-level goal is justice, then, since rationality is always in pursuit of a goal, you might decide that one kind of rational thinking must be disabled and another kind deployed. You make a point in your book that there can be no trade-off between rationality and social justice or any other moral
or political cause. That's a heavy statement. I really want to unpack that for our listeners, because I think a lot of people's intuitions are not that. So can you unpack what you mean, and then maybe we can get into the essential features of morality as you do in your book? Yeah. If you believe that social justice can itself be justified, then there are reasons to pursue it. It's not just a battle between our side and the bad guys. Now, admittedly, a lot of people who claim to pursue social justice are treating it that way, in which case they've got to live with the possibility that the other side might be stronger than them and crush them. But if they're hoping to persuade open-minded third parties, if they're hoping to have reasons for what they do, then they've got to follow the laws of reason, including factual accuracy.
Is it the case, for example, that African Americans are disadvantaged relative to white people? There are a lot of people on the right who would say no, it's the other way around: it's white people, white working-class people, who are oppressed. Well, if you think they're wrong, you'd better have reasons to show why they're wrong. If you think that particular measures, such as reparations or compensatory policies, are morally justified, well, either they are or they aren't. If they are, then you should be able to provide those reasons. If you can't, then maybe you should rethink them. But you're certainly not going to recruit others, or at least not others who are open-minded and aren't just joining a mob for the fun of being part of a mob. Now, you could concede that it's all about mob rule, our mob is bigger than their mob. But if you do that, then say goodbye to recruiting hitherto unaffiliated people, and say goodbye to claiming that you're right when the other side happens to be stronger than you are. Do you feel like you're ever misunderstood? Oh, if I answer that honestly, yes. This comes up in my head because I read an article, a criticism of you, and it was framed as: why is Steven Pinker far right now all of a sudden? And I don't get the sense that you've suddenly become far right in terms of politics. Understatement. Let me go on record: I think I'm the second-largest contributor to the Democratic Party among Harvard faculty. So let's get that on the record. Let's get that on the record. You know, as I sometimes say, a lot of academics and intellectuals live at
a hypothetical place that I call the left pole. When you're at the North Pole, all directions are south. When you're at the left pole, all directions are right. And when you're sitting at the left pole, anything that diverges from a pretty rigid set of orthodoxies is considered to be on the right. I consider that a pathology of some of the leftism in academia and journalism: it is so rigid, it's such a catechism, like a religious catechism. There's just no room for dissent, and just as anyone who doubts the Trinity is a heretic, anyone who doubts certain sets of axioms on the hard left is considered to be on the right. But no, certainly the right doesn't consider me to be
on the right. Yeah, you know, it is interesting, because it does seem like something has changed in the last five or ten years. Maybe twenty years ago, if you published a scientific finding, it wasn't automatically politicized; it didn't automatically make people think the author must be at a certain point on the political spectrum because they're presenting a certain form of knowledge. But there's something these days about how the knowledge you present is so inextricably intertwined with perceptions of your personal political stance. I just haven't seen the two so tied together in my past. It's so interesting: you could just make a point about something where you think the evidence suggests, say, that there might be some progress when you look at the data, and then they're like, oh, he's on the far right. I feel like this is something new, you know what I mean? It's not brand new, and I can recall, way back when I was a college student, the Marxist-Leninist Socialist Workers Party being popular among students. But you're right to have perceived a change, and I think things really took a lurch three or four years ago. Jonathan Haidt has written about this as well; I think he fingered 2016 as the turning point. Now, I don't know if it's that Donald Trump suddenly polarized the whole country. I don't think so, though it didn't help. I don't know if it's that social media led to mutually reinforcing claques and cheerleading squads, or if it's just that sometimes there are social trends that, for chaotic and unpredictable reasons, gain momentum and take on an energy of their own. It's a good question. Yeah, it's a fascinating psychological question. It's funny: I was rereading your book, and I would think of a question to myself,
and then on the next page you'd ask that question. I guess you tried to think through what people would come up with. So you ask: is knowledge always power? Sometimes it really is rational to plug your ears with wax. Ignorance can be bliss, and sometimes what you don't know can't hurt you. Can we riff off that idea a little bit? It's actually related to an earlier point I made about whether irrationality can ever be rational. Here I was influenced by the great political theorist Thomas Schelling, who, way back sixty years ago in his book The Strategy of Conflict, explained a lot of puzzles and paradoxes of negotiation and bargaining and threats and promises, and articulated something that I think people have known probably for millennia: there can be a strategic advantage to being ignorant, irrational,
or out of control. A simple example: you know the Brinks truck that goes to the bank and picks up the sacks of money at the end of every day? There's a sticker on the side of the truck that says, "Driver does not know combination of safe." He's flaunting his ignorance to his advantage, because if he doesn't know the combination, then the robber can't put a gun to his head and say, open the safe or I'll blow your brains out. He can credibly say, I just don't know the combination, and so it's useless to blow his brains out. So there are cases in which ignorance is a strategic advantage, and there are cases in which powerlessness is a strategic advantage.
If protesters lie down on the railroad tracks, the engineer of the train could just keep going, knowing that they'll have to scramble if they want to save their own lives. But if they handcuff themselves to the tracks, then the engineer has got to stop the train. Likewise, the suicide terrorist who has explosives attached to his body, rigged to go off at the slightest jostling, out of his control: he can't be persuaded to run away, and he can't be attacked. And then there's also the case in which irrationality itself can be a strategic advantage. Namely, if someone can't be threatened, if their own self-interest means nothing to them, then you can't credibly threaten them, because they can threaten you right back by refusing to comply. In fact, they don't even have to threaten; they just have to be so crazy that there's no point. This is sometimes called, in the context of international relations, the Madman Theory, named after the alleged tactic that Richard Nixon deployed during the Vietnam War: flying nuclear-armed bombers alarmingly close to the Soviet border, allegedly to make the Soviets think that he was so unbalanced that they'd better not mess with him, and that if they knew what was good for them, they'd better pressure their North Vietnamese clients to make concessions. That guy's crazy; don't try to push him, because who knows what he might do.
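The strategic logic of credible commitment behind the Madman Theory can be sketched as a toy two-player, dare-or-yield standoff. This is my own illustration with invented payoff numbers, not anything the speakers computed: once one side visibly locks itself into daring, a rational opponent's best reply flips to yielding.

```python
# Toy "dare or yield" standoff (a chicken-style game).
# Payoffs: (A's payoff, B's payoff). Numbers are invented for illustration.
PAYOFFS = {
    ("yield", "yield"): (0, 0),      # both back down: mild loss of face
    ("yield", "dare"):  (-1, 1),     # the one who dares wins
    ("dare",  "yield"): (1, -1),
    ("dare",  "dare"):  (-10, -10),  # mutual disaster
}

def b_best_response(a_move):
    """B's payoff-maximizing reply once A's move is known or credibly committed."""
    return max(("yield", "dare"), key=lambda b: PAYOFFS[(a_move, b)][1])

# If A credibly commits to "dare" (the madman posture), B's rational
# reply is to yield rather than risk mutual disaster.
print(b_best_response("dare"))   # -> yield
print(b_best_response("yield"))  # -> dare: an uncommitted A invites daring
```

The catch raised later in the conversation, both sides attempting the commitment at once, corresponds to both players being locked into "dare," which lands in the mutual-disaster cell.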
He might even do something that's crazy for himself. And in our interpersonal relations, many of us have had experience with high-maintenance romantic partners and hotheads and borderlines, people who kind of get what they
want because there's no reasoning with them. Now, the problem with these paradoxical tactics is that since you have taken yourself out of the game of persuasion and reason, you've kind of left the other guy no choice but to take you out if he ever has the opportunity, because there's no reasoning with you. So it does come with that disadvantage. And of course, both sides can play it, as in a game of chicken, where, as has often been noted: how do you win at chicken? Well, if you put a club on your steering wheel, put a brick on the accelerator, and climb into the back seat, so the car's no longer in your control, then the other guy's got to swerve. On the other hand, if it occurs to both teenagers to try that at the same moment, it's a recipe for disaster. So even the strategic rationality of irrationality can
have its limits. Sounds like you're describing most celebrity relationships. The paradoxical rationality of irrational emotion, is that the technical term for this paradox? Yes, indeed. And to add to the sometimes mind-dizzying implications, we can also deploy these tactics against ourselves, the most famous example being Odysseus, who was able to hear the Sirens' song without steering his ship onto the rocks because he had his sailors plug their ears with wax and tie him to the mast. He was incapable of ordering them to sail toward
the Sirens, and they were not tempted by the Sirens. So the voluntary incapacitation and the voluntary ignorance, in the cases of Odysseus and his sailors respectively, were a higher-order advantage in his own self-control. And we often do that. In fact, it's often considered the most effective means of self-control, rather than just exerting brute willpower, which is kind of beyond most of our powers. If you just make it impossible for yourself to succumb to temptation later, that is, you don't buy the brownies in the first place, then when you get hungry at midnight, you won't be tempted to eat them, or if you are tempted to eat them, it doesn't matter, they're not there. Totally. I've been meaning to buy one of those jars that you put your iPhone in at night; you set a timer and it won't open up for you. Exactly. It's sometimes called Odyssean self-control, after the Odyssey. That makes a lot of sense. Is it irrational to think taboo thoughts? And then my follow-up question: is it rational to condemn someone merely
for thinking their thoughts? Yeah, another fascinating topic on the possible boundary conditions of rationality. Here I draw on work by Philip Tetlock and his collaborators on the psychological phenomenon that we do tend to deem certain thoughts evil to think, even though one might reason that, you know, no harm, no foul: what goes on in the privacy of your head is no one else's business.
That's often not the way we think; even discussing some things as hypothetical possibilities can be morally compromising. Tetlock gives examples like the taboo trade-off: how much money should we spend to save a little girl's life? Now, you can't avoid making that decision, at least implicitly, because the hospital can't drain its entire budget to provide the absolute cutting-edge medical treatment for one sick child at the expense of all the other sick children. How much should we spend to preserve the environment, to save an endangered species? It seems kind of dirty to put a price on these things, but in a sense we don't have a choice, and often it's done in the shadows, so that we do it but we don't talk about it. Another example is the heretical counterfactual. This is what got Salman Rushdie into trouble: for merely depicting an alternative life history of Muhammad, in which he was tempted by the devil instead of by God, he nearly paid with his life when a fatwa was imposed on him by the Ayatollah Khomeini. But that's a little exotic, so I came up with a real-life example that affects all of us.
This is a true story I heard of a party game that people played after dinner, where the game was: of course, none of us around this table is the least bit racist or bigoted or prejudiced, but let's just say hypothetically that you were; which ethnic group would you be prejudiced against? And that's the kind of game you really don't want to play. Even though you're not confessing to racism, there's something about confessing to hypothetical racism that's almost as bad as confessing to the real thing, if you even allow your mind to go there. I saw a movie where they played that at dinner, and the wife broke up with the guy based on his answer. I'm trying to remember what the movie was; the woman dumped her boyfriend when he answered "Jews." Yeah. And then
there's the, ah, what was the third type of trade-off? So: the heretical counterfactual, the taboo trade-off, and the forbidden base rate. Ah, yes. We actually even touched on this earlier in the conversation. If you're screening at the airport for terrorists, are you allowed to consider the statistics of whether, you know, Buddhists, Catholics, Jews, Muslims,
and Protestants, their base rates of committing terrorism? If you're a good Bayesian reasoner, that's exactly what you should do, in admitting someone to university, in judging them in a criminal courtroom. If you wanted the statistically cutting-edge, state-of-the-art, most accurate possible prediction of how well they would do, then you should throw into the equation their gender and their race and their religion.
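(An aside for readers who want the nuts and bolts: the base-rate reasoning being described here is just Bayes' rule. Here's a minimal sketch in Python, using made-up numbers and a neutral medical-screening example rather than demographics; it's an illustration, not anything from the book.)

```python
# Bayes' rule with an explicit base rate (the prior), sketched with
# made-up numbers for a neutral medical-screening example.

def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test), combining the base rate with
    the test's hit rate and its false-alarm rate."""
    true_pos = prior * sensitivity
    false_pos = (1 - prior) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# A rare condition (1% base rate) and a fairly accurate test:
p = posterior(prior=0.01, sensitivity=0.9, false_positive_rate=0.05)
print(round(p, 3))  # 0.154: ignoring the base rate badly overstates the risk
```

The point of the exercise is that leaving the base rate out of the calculation makes the prediction worse, which is exactly the tension being described: the statistically accurate move can be the morally forbidden one.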
But to say that that's kind of emotionally icky would be an understatement, to say nothing of politically inflammatory, and so certain base rates we consider immoral even to think about. Now, as Tetlock emphasizes, this is not completely irrational, because in our social lives we pick our friends and our allies not just by
what they do, but by who they are. Namely, not just has this potential friend treated me well so far, but if the chips were down, if he were ever tempted to stab me in the back when my back was turned, or to sell me down the river, would he? And of course when we pick our allies, our friends, we want to peer into their soul and know what they're capable of, not just what they've done so far, and which thoughts they're capable of
thinking is very much relevant to that judgment. You know, if someone said, for how much money would you sell your child, or betray your spouse, or be unfaithful to your spouse, as in the movie Indecent Proposal, the correct answer is not "Well, what are you offering?" The correct answer is "I'm offended you asked that question." That is, you're indicating that there are certain relationships that are sacred.
The problem being that what's rational in the realm of choosing our friends and allies is not so rational when we are setting policy for an entire society, when we're doing science, where we might really want to have the most accurate calculation of costs and benefits, and not project our friendships and our romantic relationships onto running a government or doing science. But why, in our heads, do we let fiction writers off the hook? I don't think anyone
looks at Stephen King, reads his books, sees what he's capable of thinking, and projects that onto him. You know, we like Stephen King; we're like, oh, he's probably a good guy. It's a great question. And actually, I don't know if anyone has explored this; I would love to see someone explore it. I'll just toss off some ideas. Now, it's not always this innocuous, as in the case of Salman Rushdie. I mean, it worked out with Stephen King and
it did not work so well for Salman Rushdie. And even within our culture, there have been novelists who have been condemned. I think it was Bret Easton Ellis who depicted scenes of female sexual torture and mutilation that were a little too close for comfort, and there was, at least at the time, a lack of the forgiveness that, you're right, we do extend to Stephen King. Partly, and I'm gonna riff here, I don't know the answer, but
I think it's a fascinating question. Partly, there may be certain conventions where, because the convention already exists, someone working within that convention is given a pass. The murder mystery is a classic example: you know, Agatha Christie and that whole genre are ones where we as a culture have anointed it as an acceptable cultural form, and we don't think the worse of the
writers who operate within that genre. And it may be that for certain genres of horror there can be changing mores, so that we do conventionalize them as an acceptable genre or form, although there are sometimes tensions at the boundaries, like where you do have some guardians, bluenoses, objecting to violent entertainment: Quentin Tarantino, comics
in the nineteen fifties, certain kinds of violent movies. And indeed, different constituencies do make different moral arguments, such as the arguments against pornography or against sexualized violence against women, where the genre itself may be deemed dangerous and the people who think those thoughts might be morally condemned. But anyway, it's an interesting question. Maybe there's a PhD thesis
for some brilliant English literature student in that. For sure, someone needs to analyze American Horror Story, the TV series, because that is as extreme as can be, but I don't think the writers of it get in trouble. It would be interesting for someone to go over the history of popular fiction. I think in every historical era there probably are debates and moral condemnation at the boundaries,
and then the culture itself can sometimes change the boundaries. Yeah, I guess this is just another one of those examples where context is everything. You know, if you're on a date and you share the fact that you have fantasies, you know, have conjured up ideas of a serial killer in a small town with supernatural elements, your date's going to run for the hills. But if you're at a sci-fi convention and you talk about your new plot for a story, people are like, oh, that's really
clever and creative. So this is indeed one of those cases where context matters. Yeah. So why is rationality so uncool? First of all, there is the confusion that we talked about before between rationality and coldness, joylessness, dourness. That's just, if you'll pardon the expression, irrational; it's a non sequitur.
There is, and there always has been, at least since the nineteenth century, a Romantic movement that valorizes spontaneity and authenticity, and of course the Romantics have all the great art, so that has led to the uncool connection. Yeah. Well, do you think you're making it cool? I would love to. I think it's probably beyond my powers, but if that were an outcome, I would be delighted,
I think. I do think you are, to a certain extent. But I think that you need to, like, for your book launch, you know, pair with, like, Snoop Dogg, and I'm, like, serious, and get him being like, "I love this new book on rationality and why it matters," you know, while we're smoking a joint or something. And, you know, I feel like, well, I mean, there are certain comedians who work in that venue, like
like Bill Maher. Also, there is this kind of eccentric culture called the rationality community that I think probably unsuccessfully tries to make rationality cool, although at least they make it a thing. And there are people who are associated with it: Scott Alexander, Julia Galef, Eliezer Yudkowsky to some extent, Robin Hanson, Scott Aaronson. The problem being, there are a couple
of problems. One of them is that, like any community, clubs develop their own internal culture very quickly. As we know as psychologists, throw a bunch of eleven-year-olds together and pretty soon they have their own lingo, their own habits, and that's true of any community. The other being that, you know, ultimately having a club for rationality kind of misses the point. Everyone should be rational; it shouldn't be a niche like, you know,
stamp collecting or cosplay. Of course, I think they may be the first to agree, and I give them credit for trying to put it on the radar as something that is at least potentially cool and that ought to be cooler. Yeah, it shouldn't be geeky to be rational. I agree. And look, I think
that part of the blame is the Spock character. But the more that I read your book and really understand what you're saying, I feel like Spock was more logical than rational, you know, because to be rational there's this question of: what does a person want? There's a very human dimension to it in the way that you describe it. Indeed, perhaps. And of course warmth, love, friendship,
our being social animals and emotional animals: there's nothing irrational about that, you know, except when it conflicts with other goals, or it conflicts with someone else's goals for pleasure and warmth and social connection. Yeah, agreed. Well, there's so much rich content in your book, I'm trying to think through, okay, where do I want to go next? Here's one: let's talk about the tragic trade-off between hits and false alarms. Oh yes,
or misses and correct rejections. What does a rational observer do? Right. You know, this is one of the motivations for writing this book: an intuition that I think many people in our field sometimes have, namely, there are just some tools that we use to understand phenomena that really ought to be part of conventional wisdom. And one of them is the tool that you and I and our students and fellow academics learned in perception class.
It's called signal detection theory, originally applied, at least in our field, to the human subject in the booth with the headphones, pressing a button whenever he hears a sound. And has he heard it, or hasn't he? Well, there's no answer to that question when the sound is just at the very threshold of hearing. He can either say, well, yeah, I guess I heard it, or, I'm not sure that I heard it. A lot depends on whether you pay him for correct detections, hits, and whether you penalize him for
false alarms. And there's a mathematics as to how the ideal observer ought to respond in cases where they're detecting something under conditions of noise: they may detect it, they may not. And you can predict how they set their trade-off, their cutoff; I mean, between being trigger-happy, yea-saying, you know, saying yes at the slightest hint, and being much more conservative, gun-shy, nay-saying, defaulting to no unless they're really positive.
How do you set that threshold? Well, it depends on how bad the costs are for a false alarm compared to a miss. There's mathematics on how to do that optimally. But the way of thinking is this: often in life we just don't know the truth. We're fallible; none of us is omniscient. We have a good guess. We like to think that when we think we see something, or detect something, or hear something, it's true more often than not. But it can't be true all the time.
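(A concrete aside for readers: the optimal-cutoff mathematics alluded to here can be sketched in a few lines. The standard ideal-observer rule from signal detection theory sets a likelihood-ratio criterion from the base rates and the costs of the two kinds of error; the numbers below are made up purely for illustration.)

```python
# Ideal-observer criterion from signal detection theory: say "signal"
# when the likelihood ratio P(evidence|signal) / P(evidence|noise)
# exceeds beta. Beta depends on the base rates and on how costly each
# kind of error is. All numbers here are made up for illustration.

def criterion(p_signal, cost_false_alarm, cost_miss):
    """Likelihood-ratio cutoff (beta) for an ideal observer."""
    p_noise = 1 - p_signal
    return (p_noise * cost_false_alarm) / (p_signal * cost_miss)

# Equal costs: a neutral criterion.
beta_neutral = criterion(p_signal=0.5, cost_false_alarm=1, cost_miss=1)
# Misses ten times worse than false alarms (say, a cancer screen):
beta_screen = criterion(p_signal=0.5, cost_false_alarm=1, cost_miss=10)
print(beta_neutral, beta_screen)  # 1.0 0.1: a much lower bar to say "yes"
```

Lowering beta is exactly the "trigger-happy" end of the trade-off described above; raising it is the "gun-shy" end.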
What do we do? And the answer is: well, think carefully about how bad it is to be wrong in each of those two ways, namely, you miss or you false-alarm, and set your cutoff accordingly. Now, why is that relevant? It's relevant to things like medical decision making. There's a blob on a scan: is it cancer or is it a harmless cyst? First of all, in all signal detection problems, the imperative should be to
increase your sensitivity. That is, fine-tune your judgment, your instruments, your forensics, so that you pull apart the signal and the noise and there are the fewest possible opportunities for confusion. But given that you're never going to be perfect, you should set your cutoff mindful of each type of cost. In the case of the cancer, it could be: how bad would it be to have painful and disfiguring surgery if I am cancer-free, versus how much of a risk am I taking with my life if I really
do have cancer and I fail to operate. In the case of the judicial system, and I talk about signal detection theory in the courtroom, there's evidence as to whether the suspect committed a crime or not. It's never perfect. A jury nonetheless has to render a verdict. We reckon the costs in that case not in pain and suffering, not in dollars and cents, but in terms of justice.
How abominable is it to send an innocent person to prison, or even worse, to send them to the gallows, versus how much justice is denied if you let a guilty person go free? There's no correct answer, since there's never an objectively correct answer for questions of goals and costs, including moral costs, but it's good to be clear as to what costs you're willing to pay or not. Classically, the criterion, going back to the jurist Blackstone, is better to let
ten guilty people go free than to falsely convict one innocent. Now, even that's not totally arbitrary; I think you can make arguments as to why that's reasonable. But it's an example, straight out of signal detection theory, of figuring out the costs of the two kinds of errors and setting a cutoff accordingly. Well, this is tricky stuff. I mean, well, trade-offs are always tricky, right? It's always tricky. But what is always good
is to know when you're faced with them, again subject to Phil Tetlock's point that sometimes people don't like to think about trade-offs. They might even think it's immoral to think about trade-offs, that we're going to trade one thing off against another, such as, you know, should people be allowed to sell their kidneys on eBay? You know, there is an argument there that
everyone wins, no one's worse off. But there's a lot of moral opposition to anything that smacks of quid pro quo when it comes to a sacred commodity, namely organs. Oh, for sure. I mean, there are lots of examples. We could bring up prostitution; there's a contemporary example, sex work as we now call it, where the debate has been revived. In older times, we forget, there was a debate over whether people should be able to buy their way out of jury duty or out of military service,
or to sell their votes. So these things do change with the moral evolution of societies. I do feel like, in general, though, utilitarian reasoning is uncool; it's not as cool. Well, yes: "you don't have a heart" is how utilitarians are often defined. I think you're right. I think my colleague Joshua Greene calls it nerd morality, and it is a kind of cost-benefit analysis.
On the other hand, a lot of what we credit as our most glorious milestones of moral progress came from utilitarian reasoning, like the decriminalization of homosexuality. It's a straightforward utilitarian argument: no one's harmed, no harm, no foul. And the animal rights movement: that suffering is bad, and it doesn't matter how smart you are, it's a question of how much
you can suffer. The decriminalization of heresy: as Thomas Jefferson put it, it doesn't matter to me whether my neighbor believes in one god or ten gods; it neither picks my pocket nor breaks my leg. That's
the utilitarian argument, and an argument for women's equality. There are many, many arguments that in fact appeal to the notion that, you know, ain't nobody's business if I do: if no one gets hurt, what consenting adults do in private. These are all utilitarian arguments, and so paradoxically they're cool in the sense that they often are on the progressive side of things, and they do overall tend to win the day. But if you laid them out as calculations,
then, you're right, they seem tragically unhip. Yeah, the word "utilitarian" itself is bad branding. Gosh, I was just going to say, it's got an image problem. Well, I think we have reached the point of our interview where we can finally discuss the pandemic of poppycock. Indeed. I could not get away without a chapter that I call "What's Wrong with People?", knowing that if you bring up the topic of rationality, the first question you
get is, why is the world losing its mind? And there isn't a single answer to that, because there isn't a single kind of poppycock. But I think in part it comes from motivated reasoning: namely, because rationality is always in service of a goal, that goal needn't always be universal objective truth. It can be my own reputation, esteem, respect, deference. It can be the glory of my sect, my tribe, my coalition, my party, the so-called myside bias, which our
friend Keith Stanovich has recently written a book about. So that's a major component: people care more about glorifying their political coalition than achieving universal objective truth. So that's one. Another is that we're all, as humans, vulnerable to certain
kinds of deep-seated intuitions. We're dualists, as Paul Bloom has argued: we think that people have a body and a mind, and from there it's a short step to imagining minds that aren't attached to bodies, namely souls and ghosts and spirits, and hence ESP and psychic powers. We are all essentialists, as Susan Gelman and others have argued: we think that living things have an internal, invisible essence that gives them their form and their powers. And so
we're subject to homeopathy, we're skeptical about vaccines, we're open to bloodletting and emetics and detoxification, things that intuitively feel like ridding the body of poisons. Hugo Mercier has shown that bloodletting is found in many, many cultures as a kind of quack cure. We're all teleologists; Deb Kelemen has emphasized this. We know that our own tools and plans and artifacts are designed with a purpose, and it's a short step to think that the
world was designed for a purpose, that the universe, that everything, happens for a reason. So we easily become creationists. These intuitions are all implanted in us by our evolutionary history. The question isn't why people believe these things, but why some of us don't believe them. Why do we actually believe that the mind is the product of the brain? Why do we believe that signs of design in
the living world come from evolution? Well, it's because some of us not only have a scientific education, but we trust the scientists; what they say is good enough for us. If you aren't in that social sphere of influence, then you're liable to fall back either on your own intuitions or on communities that ratify those intuitions, which might run afoul of the scientific consensus. But the scientific consensus you think of as just another
clique, another tribe; they're not my clique or my tribe. And I think there is a sense in which, when we ask why people believe weird things, it depends on what you mean by belief. There's a category of belief that isn't the same as literal belief that there's milk in the fridge or gas in the car, where it's provably
true or false. Robert Abelson, the great social psychologist, differentiated between distal beliefs and testable beliefs, where there's a whole realm of things like, you know, does God exist,
what's the origin of the universe, what are the bankers and presidents and powerful people actually doing in secret, where you can't find out if you're an ordinary person, and you don't really care, because you can never find out anyway. And so you believe things that are socially uplifting, that make your side look good, that make the other side look bad, that are entertaining. And we all to some extent swallow these not quite true, not
quite false beliefs, in religion, in historical fiction, in national mythology, where, are they literally true or false? Well, we just don't care that much, but we do know that they are inspiring, uplifting, entertaining. And when it comes to beliefs outside of people's personal sphere of day-to-day living, whether they believe them is not a matter of whether they can show that they're true or false,
but whether it is morally empowering to believe them. And I think a lot of crazy beliefs are like that. So, you know, was Barack Obama a Muslim? Did Hillary Clinton run a pedophile ring out of the basement of a pizzeria? If you say you believe it, it isn't so much that you have grounds for believing it, but it's a way of saying "Boo, Hillary": namely, that's the kind of thing that she could be capable of, such a bad person. Well, there was a phrase that I
thought was really interesting: expressive rationality. Exactly, yeah, that's a good way of putting it, exactly right. Yeah, I think that really resonated with me, and the distinction between these two realms, as you put it, the distinction between reality and mythology. Mythology is very powerful. It can have a very powerful effect, as many cult leaders over the years have discovered, and religious leaders and political leaders, often the same thing. Yeah, funny about that.
And this is laid out very convincingly by Jonathan Rauch in his new book The Constitution of Knowledge. So we almost shouldn't ask the question, why do people believe weird things, as much as, why do at least some people, some of the time, believe true things?
And the answer to that is, it's not because they are particularly rational or have better brains; it's because they have embedded themselves in a community with norms and rules and institutions that are explicitly designed to weed out
falsehoods and to steer the entire community towards truth. Institutions like science with empirical testing, and scholarship with peer review, and journalism with fact checking, and the court system with adversarial proceedings, and government with checks and balances and free speech.
And by letting one person's biased thinking bump up against another person's biased thinking, no one gets to impose their crazy beliefs on everyone else, but the community as a whole is playing a game that will move them all toward truth, when it works, although of course it's always imperfect and it's always being corroded. Yeah. I mean, if you live your life like a scientist, like, twenty-four-seven, then don't you never have a belief, right? Like, that word doesn't really... I wouldn't put
it that way. I mean, you could have a degree of credence. You could be a Bayesian and say, I put, you know, point nine, on a scale of zero to one, credence on that belief. So in that sense I believe it. I might even, and then going back to signal detection theory, act on it even though I'm uncertain, because the costs of being wrong and not acting on it are worse than the
costs of acting on it if I'm wrong and it's not really true. And you assume that there is a truth and that you don't know it; you try, the community tries, to get as close as possible, always leaving open the possibility that you might be mistaken. Right. So I think it makes sense to believe that there's a certain high probability that something is true, you know, in terms of probabilities. But that
seems like a different state. That's right: I believe it to a certain degree, exactly. And that's the Bayesian approach: to treat degree of credence as a probability, a number that you assign. But you're right that it's part of the commitment to epistemic humility, to fallibility, to considering the possibility that you might be mistaken, that is inherent to science, inherent to democracy, inherent
to liberalism, inherent to humanism. Hear, hear. You know, you make this point, and I love this, I think I'm gonna tweet this quote of yours after our conversation: each of us has a motive to prefer our truth, but together we're better off with the truth. I'm just going to conclude this interview by saying, in an era in which rationality seems both more threatened and more essential than ever, I do agree that your book is an affirmation of the
urgency of rationality. So congratulations on the publication of the book, Steven. Thanks so much, Scott. As always, it's been stimulating and enjoyable to speak with you. Yeah, I feel the same way. Thanks for listening to this episode of The Psychology Podcast. If you'd like to react in some way to something you heard, I encourage you to join in the discussion at thepsychologypodcast.com. That's thepsychologypodcast.com. Thanks for being such a great supporter of the show, and tune in next time for more on the mind, brain, behavior, and creativity.