Stuff You Should Know

How Cognitive Biases Work

56 min
Feb 10, 2026
Summary

This episode explores how cognitive biases work, examining the mental shortcuts our brains use to make decisions quickly and the systematic errors that result. Hosts discuss how pioneering psychologists Amos Tversky and Daniel Kahneman revolutionized our understanding of human decision-making through their research on heuristics and biases, with practical applications in economics, medicine, and everyday life.

Insights
  • Cognitive biases are hardwired evolutionary features, not character flaws—all humans experience them unconsciously as the brain prioritizes homeostasis over perfect decision-making
  • Confirmation bias is the most consequential bias because people actively seek information supporting existing beliefs and reject contradictory evidence, even when presented with facts
  • Behavioral economics demonstrates that people make predictably irrational financial decisions based on loss aversion, relative thinking, and probability misjudgment rather than rational self-interest
  • Cognitive biases have real-world consequences in high-stakes fields like medicine and forensic science where anchoring bias and outcome bias can lead to misdiagnosis and wrongful convictions
  • Awareness of biases alone provides minimal protection; effective mitigation requires deliberate practices like delaying decisions, seeking contradictory sources, and documenting expectations
Trends
  • Growing recognition that cognitive biases affect AI systems similarly to humans, creating emergent errors in machine learning models
  • Increased scrutiny of confirmation bias in polarized media environments where algorithmic feeds reinforce existing beliefs
  • Behavioral economics principles being weaponized by marketers and corporations to exploit predictable irrational consumer behavior
  • Medical field beginning to address cognitive biases in diagnosis through structured decision-making protocols and bias awareness training
  • Forensic science reform efforts targeting cognitive biases in eyewitness identification and evidence interpretation
  • Rise of Bayesian reasoning frameworks and lesswrong.org-style methodologies for systematically overcoming personal biases
  • Cognitive bias modification techniques emerging as therapeutic interventions for anxiety and other mental health conditions
Topics
  • Heuristics and mental shortcuts in decision-making
  • System 1 vs System 2 thinking (fast vs deliberate cognition)
  • Hindsight bias and false memory reconstruction
  • Self-serving bias and fundamental attribution error
  • Anchoring bias in negotiations and pricing
  • Availability heuristic and emotional salience
  • Inattentional blindness and selective attention
  • Dunning-Kruger effect and expertise illusion
  • Gambler's fallacy and pattern recognition errors
  • Base rate fallacy and statistical reasoning
  • Confirmation bias and belief perseverance
  • Loss aversion and prospect theory
  • Behavioral economics and irrational consumer behavior
  • Cognitive biases in medical diagnosis and treatment
  • Cognitive biases in forensic science and criminal justice
Companies
iHeartRadio
Podcast network producing and distributing Stuff You Should Know and other shows
Kentucky Fried Chicken
Used as example of behavioral economics exploitation through product bundling and Pepsi pairing
People
Daniel Kahneman
Israeli psychologist who challenged rational actor theory and co-developed heuristics and biases research program
Amos Tversky
Israeli mathematical psychologist who partnered with Kahneman to revolutionize behavioral economics research
Richard Thaler
Behavioral economist who built on Tversky and Kahneman's work to develop predictive models of irrational decisions
John Ridley Stroop
Psychologist who discovered the Stroop effect demonstrating System 1 interference with System 2 thinking
Peter Wason
Psychologist who coined the term confirmation bias and designed the 2-4-6 sequence experiment
Dan Ariely
Behavioral economist credited with concept that humans are predictably irrational
Eliezer Yudkowsky
AI researcher and founder of lesswrong.org, platform for overcoming cognitive biases through Bayesian reasoning
Adam Smith
Historical economist whose work preceded modern behavioral economics research on irrational decision-making
Quotes
"All humans are dummies as far as cognitive biases go. It's not just you."
Josh, early in episode
"Corporations, marketers, basically everybody who wants to sell you something knows about these things and they can manipulate those things."
Josh, mid-episode
"We're not absolutes. No, it's more just, yeah, your tendency to think in certain ways."
Chuck, discussing bias patterns
"Confirmation bias is probably the granddaddy of all biases, I think."
Josh, near end of episode
"Being aware seems like 2% of the problem. It's like you're aware that you have an unconscious bias. It doesn't make you understand the bias."
Chuck, solutions section
Full Transcript
This is an iHeart Podcast. Guaranteed human. Hi, it's Jill Winterstein, host of the Spirit Daughter Podcast, where we talk about astrology, natal charts, and how to step into your most vibrant life. And today I'm talking with my dear friend, Krista Williams. It can change you in the best way possible. Dance with the change, dance with the breakdowns. The embodiment of Pisces intuition with Capricorn power moves. So I'm like delusionally proud of my chart. Listen to the Spirit Daughter podcast starting on February 24th on the iHeartRadio app, Apple Podcasts, or wherever you listen to your podcasts. Welcome to Stuff You Should Know, a production of iHeartRadio. Hey, and welcome to the podcast. I'm Josh, and there's Chuck, and Jerry's here too, and we are getting down to business, Just getting right to it here on Stuff You Should Know, because we've got a lot to cover here. That's right. So, Chuck, I got a little bit of an intro. Let's hear it. Was that it? That wasn't it? Yep. Do you remember how homeostasis used to come up a lot? Yes. So, for those of you who haven't been listening that long, homeostasis is what your body and your mind and your brain wants to return to, right? You just want everything nice and even keel and normal and without exerting too much effort and energy, right? That's homeostasis? That's, are you asking me? Mm-hmm. Sure. Okay. So one of the ways that your brain returns to homeostasis as fast as it can is to use shortcuts in making decisions, right? Because if you're having to decide something, you're actively being challenged. You have to, you're not in your homeostatic space. So if you use a shortcut, you can say something like, I've had the red apple in the past and it was delicious. I've eaten the brown mushy one before and it was awful. I'm going to eat this red apple, right? Rather than going to the trouble of pulling both apples out and analyzing them with a microscope and all that, you can just kind of use a little shortcut. 
That's a heuristic. And it makes a lot of sense because your brain is like, great, I didn't use that much energy. I made the right decision and we're good to go. The problem that comes about, though, is that with heuristics, you're not always right. You don't always make the right decision. You're not always taking all the information into account. And when that happens, you start stumbling into cognitive biases. Yeah. Like, this is a frustrating episode because I feel like the title could be Cognitive Biases, Everything You Think You Know Is Wrong. Yeah. Well, that's a great title. Let's go with that. It just, it made me feel like a dummy the whole time. Oh, no. You're not a dummy. All humans are dummies as far as cognitive biases go. It's not just you. And this stuff is hardwired into us because, like I just said, we take mental shortcuts. And the problem, Chuck, is what we're talking about mostly today are unconscious biases. Right. There's conscious biases. We just usually call those biases, right? Those are the active challenges that you need to overcome to be a better version of yourself. These are like unconscious. So there's not a lot you can do about it. Although at the end, we're going to kind of give you some tips and pointers. But it's a challenge for absolutely everybody. It doesn't make you dumb. Yeah. I mean, I think the tips and things can help for sure. But it's just part of being human, you know, the unconscious bias. And there's not a lot we can do to completely eradicate them. No. And if it's a you know, if it's a real problem, then I'm sorry. Well, one of the big problems that we all kind of face is that we are predictably irrational, as was said by Dan Ariely, who is a behavioral economist. And because of that, corporations, marketers, basically everybody who wants to sell you something knows about these things and they can manipulate those things. They can trick you into making decisions you wouldn't otherwise make. Yeah, for sure. 
And we wouldn't even be here probably talking about this if it hadn't been for two kind of revolutionary thinkers who ended up being some of the most oft-cited researchers in the history of research, as we'll learn, especially when it comes to economics. And they were a couple of psychologists, Israeli psychologists, named Amos Tversky. And I looked up different ways to pronounce this because we always get guff, and I've heard everything from a hard "Tversky" to "Tuh-versky." And his colleague, Daniel Kahneman, doing more of one of those, "Tversky." Oh, I just heard him refer to him as Big T. The Big T. Because it's T-V. But those were the two guys working together. They developed this concept in the 70s at the Hebrew University of Jerusalem and really like got down to it pretty quickly as a result of Kahneman, I think, taking some issue with the Big T's research. And I guess they kind of bonded over that or something. Yeah, it was pretty cool because Tversky, basically he was a mathematical psychologist, which any time you hear mathematical and it's to do with something other than math, what that means is you've taken something and you've set it out in a very standardized way. So you can explore it. You can teach it based on certain facets. And the upshot of mathematical psychology, as far as human behavior goes, these are the people who came up with the crock idea that humans behave as rational actors. We're self-interested. We take all the best information available to make the best decision for ourselves. And Daniel Kahneman was like, this is not at all true. And he started challenging Amos Tversky's theories, and Tversky, instead of saying like, no, you shut up, was like, all right, let's go figure it out. Let's get to the bottom of this. And because of that, yeah, like they formed this partnership that had a huge impact on the world. Yeah, I think it's kind of heartening that they, as academics, you know, got together. There were no ruffled feathers, or at least it didn't end up that way.
And they worked together. It's kind of a heartening thing, I think, these days. Yeah, there's got to be at least one. Yeah, that's right. They came up with a program called the Heuristics and Biases Program to basically, you know, study how human beings make their decisions, how they go through life making choices when they don't have like all the information at hand, all the most perfect information to make that choice. Or if they don't have like all the time in the world to look at the information that they do have to make that choice. So like, how are people making decisions? How are they making mistakes in their decision making? And they ended up coming up with a couple of different systems, one which is super quick and one which is much more deliberate. Yeah, Daniel Kahneman came out with Thinking Fast and Slow, which was one of those super popular airport books, you know? Yeah, Thinking, Fast and Slow. Yes, thank you. Eats, Shoots & Leaves. That's right. And in it, he basically lays out this kind of shorthand model. He's very explicit to say, like, this is not how your brain is actually laid out, but it's a good metaphor for it. And System 1 is how you think quickly. You think almost unconsciously. You make rapid decisions. And that is kind of how we generally navigate life. System 2 is much more deliberate. It's where we take into account different ideas. It's where we really stop and think about something before making a decision. And they're essentially competing. There's something called interference. And System 1 has a really great tendency to interfere with System 2. And there was a psychologist working all the way back in 1935 named John Ridley Stroop, who basically discovered the Stroop effect, that is a way of demonstrating how System 1 interferes with the slower, more deliberate System 2. Yeah. Oh, boy. I bet he patted himself on the back after this one because it's one of those things that's so simple.
But I bet he winked at everyone like, watch this. Yeah. This is going to break your brain. It's genius. It really kind of is. So what they did was they simply wrote down the names of colors, but they would write down the name of that color in a different color. And then he would just ask the person to say out loud the color that the word is written in, not the word itself. And it is surprisingly difficult to do that. It's just a little weird brain-breaking thing. Yeah. Yeah. So he's showing that your System 1 just wants to hurry up and read it. Yeah. And it's getting it wrong. And that's interference, right? So that kind of like started to lay the groundwork for this idea that we do have kind of competing ways of seeing the world and making decisions. And what Kahneman was saying is that most of the decisions we're walking around making are actually the System 1 super fast shorthand decisions, but we think that we're using our more rational mind because we make post hoc explanations for why we decided that. And that's not to say we're all walking around with these creepy little secrets, that we know we're like fooling ourselves. We don't realize we're doing this. That's why these biases are unconscious. Even if you stop and think about what you're doing, you may still not come up with the answer like, oh, yeah, I was making up explanations after the fact to explain why I actually used System 2 when I didn't. It's really hard to do that. Yeah, for sure. Livvy gives a pretty good example of that as far as like hiring somebody. Someone may make an impression in an interview that kind of locks it up from the second they walk in. Maybe they look like their mom or dad or a relative, or maybe they remind them of themselves, or maybe, like, who knows what it could be. Then they end up getting that job. And later, if you ask the person who hired them, they might say, oh, well, it was because of this, this and this and this.
When in fact, that's really just system two kind of confirming like, no, it's because the guy walked in wearing a New York Giants t-shirt. Yeah. And we'll get into some of the problems with this stuff throughout, but this is a good example, right? If the opposite happened, if like you didn't hire somebody because they weren't quite like you. That's an example of a bias too, even if you don't think that that's why you did it. If you're looking at their CV afterward and you're like, oh, they didn't graduate from college. That's why. But really it was not because you're racist. It's not because you're a woman hater. It was because you're preserving your own level of comfort because other groups that are different than you make you uncomfortable. And that's how groups can become entrenched, right? You just, once one group kind of dominates an organization, they tend to continue doing that because people hire other people who they're comfortable around rather than pushing themselves outside of their comfort zone and probably improving their organization. And that's why diversity programs exist in the first place because of that human tendency. Yeah. Or maybe they were just a Jets fan. That's possible. I mean, no Jets fan is going to hire a Giants fan. Yeah. Well, here's a tip. I don't know a lot about interviewing other than be yourself and try and get someone to like you, but don't go into any interview wearing any sort of branded sports apparel. Yeah, especially a jersey. Yeah. I think that says quite a bit. Yeah, you wear that Giants jersey in there. Well, I guess you're rolling the dice. You've either got that job or there's no way you're going to get it. So maybe it's not a bad idea then. I don't even know what I'm – I might be wrong. Yeah. I mean, I guess if you dressed it up with a bow tie, maybe you could get away with it. But yes, it is still a gamble regardless. Well, not all jobs you have to wear a suit and tie, you realize. 
I know, but I'm saying like you dress for the part you want. If you're wearing a Giants jersey with a bow tie, I think you're making a good impression out of the gate. All right. So I guess we can talk about we're going to go through a list of about 10 different biases. And they're all pretty interesting. And I know everyone can identify with probably each of these at some point. But before we do that, we need to point out that like these are all mental shortcuts or the result of the mental shortcut, but not all of them work in the same way and how our brains work. A lot of, you know, there could be a lot of things at play. Emotions can come into play. Maybe like we were just talking about, like it's hard to reassess something after you've gotten a first impression. People, humans, historically through their life tend to make bad guesses at things. Because if you make great guesses, then things like, you know, gambling would be super easy. So all of these things come into play. There's not just like a single way that it's a broken system. Right. Yeah, yeah, yeah. But people generally, it seems like universally in a lot of these cases, behave in these ways under the same circumstances. Yeah. Like some of their stuff wasn't replicatable, but that's sort of standard for studies in psychology. Like a lot of this stuff, as we'll see, has checked out across cultures. Yeah. Which is huge, you know, considering the whole weird problem in psychology in particular. The weird people are? Western, educated, industrial, rich. What's the R? Yeah, rich. Democratic, I think. All right. So we'll start, we'll, we'll leave the biggest guy for last, I think, which will be after a break probably, but we'll start then with maybe hindsight bias. Yeah. And this is the idea that after something has occurred, and we've talked about this one before here and there, that we, we tend to think like, oh, well, of course that was going to happen. 
In fact, not only was that, should I have seen that coming? It was probably inevitable that that happened. And a lot of the time it may be because you're misremembering your expectation before it even happened. Right. Like we can rearrange our memory of how we felt about the event or the outcome of the event afterward to basically match the outcome. Yeah. I guess because we have this never-ending need to be right. Yeah, that probably has something to do with it. And I knew you were going to say that. So I've got another one for you, Chuck. All right. Self-serving bias combined with a little fundamental attribution error on the side. Yeah, that's a good one. It's a good side dish. So these things basically go hand in hand. It's basically how we see ourselves in a great light and how we see other people in a more negative light. Self-serving bias is basically saying if something good happens to you, it's because you are good, like you earned it. It's because of you doing something right. Something bad happens to you, it's external forces that made that happen. Fundamental attribution error is the exact opposite with other people. If they do something right, it was just luck. If something bad happens to them, it's their own fault. So a good example of this is like if a coworker comes in late one day, you're like, they're just lazy and slack. But then you come in late the next day and you're like, it was traffic. That's basically those two things going hand in hand. And those are both biases. Yeah. And I hope people understand that like all of those things can also be true, you know? Sure. So if you're thinking like, well, no, but sometimes I did deserve the thing and sometimes it was someone's fault. Yeah, sure. That can happen. We're not, these aren't absolutes. No, it's more just, yeah, your tendency to think in certain ways. Yeah, sometimes you're going to be right. Sometimes you're going to be wrong for sure. Like humanity's tendency. Yeah, you got to take a big broad view here.
Yeah, but also you specifically. Right. You, James Kirkland, listening in Baltimore. There's one, oh man, James Kirkland is going to pull over to the side of the road right now and really freak out. I hope, man, I hope I nailed it. Yeah, I think you picked a common enough name. We'll see. All right. So anchoring bias is another one. This one I've fallen prey to. I'm going to say that probably about all these, but that is the first piece of info that you get about something. can really affect, and even in a very disproportional way, things that happen after that. Like once something is kind of locked in, it's hard to unwind that. Yeah. That first piece of information, it's like, oh, okay, this is going to basically prime you in your answer, your decision, right? Yeah. So a good example I saw is there was a study that said like, okay, the Mississippi River is less than two miles long. How long is it? And those people would say something like 1,500 miles. And then other people would say, okay, the Mississippi River is less than 500 miles long. And people would say like, it's like 300 miles. And then another group was the Mississippi River is less than 80 miles long. Those people would answer like 60. It's the same thing, the length of the Mississippi River, but they were presented with this, basically this priming number, a large one, a middle number, or a smaller number. And their answers were related to that first piece of information that they got. And that's anchoring bias. Yeah. And Livia pointed out another little side dish here, which is called the decoy effect. And that's when you will go into a restaurant and they might, and this is how just kind of one way this can affect economics, which will come up a lot. But you'll go into a restaurant and they might have one super expensive bottle of wine on the menu. And maybe it's even placed at the top so you see it first. 
And then the other bottles of wine might seem like a decent deal after that, even if they're also overpriced. Yeah. Exploitation. But all wine in restaurants is overpriced. I hope everyone realizes that. I mean, by a lot, right? Yeah. Don't they make most of their margin on that? Oh, totally. It's frustrating, but that's the business. That's also, I think, Chuck, this anchoring bias is why they say you should never lead in a negotiation with your actual price, that you want to go higher or lower depending on your position. Oh, like if they ask what you want to get paid? Yeah. Or like you're trying to buy a car or something like that, you know? Yeah, I hate all that stuff. Yeah. Yeah. If somebody, it's like, well, what do you think I should get paid? What are you looking to make at this job? But that's what I'm saying. What you make should always be the answer. Exactly. You just want to add, I don't know, 50% to what you actually want, and then you have room for negotiation. Right. There's one other thing that's related to this, framing bias, and that's basically the same thing. But rather than the first piece of information guiding you, this is more directly guiding you. So, for example, some drug maker says 10% of patients die. You're like, oh, God, that's a lot. You could say it the opposite way. 90% of patients live. You're like, oh, that's great. Same amount of people dying. It's just framed differently to exploit your response. To exploit your aversion to dying. Pretty much. And that's a human thing, isn't it? For sure. Shall we take a break? Yeah. All right. Josh said human things, so it's time for a break, and we're going to come back with more biases right after this. Minnie Driver. The Irish traveler said, when I was 16, you're going to have a terrible time with men. Actor, storyteller, and unapologetic Aquarian visionary. Aquarius is all about freedom loving and different perspectives.
And I find a lot of people with strong placements in Aquarius are misunderstood. A sun and Venus in Aquarius in her seventh house spark her unconventional approach to partnership. He really has taught me to embrace people sleeping in different rooms, in different houses and different places, but just an embracing of the isness of it all. If you're navigating your own transformation or just want a chart side view into how a leading artist integrates astrology, creativity, and real life, this episode is a must listen. Listen to the Spirit Daughter podcast starting on February 24th on the iHeartRadio app, Apple Podcasts, or wherever you listen to your podcasts. All right, Chuck, up to bat is number 23, availability heuristic. Can we put like a stadium echo effect on that? Next up, Manny Mota. Very nice. So what is that availability heuristic? And I'm sure this has happened to you before, right? Yeah, none of these have ever happened to you, which is the funny thing about us doing this episode. But the availability heuristic is what you have available to call up in your brain at any given moment. So you're going to rely more on what you can immediately think of in the moment. And chances are what you're immediately able to think of in the moment is something that probably aligns with your worldview or something like that, which is a sort of a, well, we won't talk about the C bias because that's coming up. Well, yeah, or something that like really kind of goosed you emotionally. Like that's very available because it's, you know, loud and scary in your mind, kind of. You know, like if you saw something about a plane crash in the last like day or so. Right. And somebody asks you how frequent plane crashes are, you're probably going to give a much higher estimate than you would have before that, you know, maybe based on the number of times you've flown and nothing bad happened. Yeah, that's a good one.
there's also inattentional blindness and before anybody before we talk about this because we're going to spoil it I want to send everybody if you have the means to do this go onto YouTube and search for selective attention test and this is Daniel Simon's YouTube channel and then watch the test where the people are passing the basketball back and forth we'll wait a second All right, that's enough. All right, great. So hopefully you press pause and you didn't just try to watch it while Chuck was doing the Jeopardy theme. It is short, but it's not that short. It's like a minute and a half or something, right? Yeah. So tell them about this video, Chuck, because it's pretty great. That's right. In the video, they have a group of, what was it, like six people probably? Six on the nose. Six college students. I guess three are wearing white shirts, three are not wearing white shirts. Yeah. And they're in a very tight, small circle. Yeah. It looks very awkward. They have a basket, I think, was it two basketballs? Mm-hmm. There's two groups. Yeah, two groups, two basketballs. And what you're told is the task at hand is to count the number of times that people in white, the white team, are passing the basketball. So you're counting, all right, one, two, three, four, four, five, six, seven. Mm-hmm. And that's all you're supposed to do. And at the end, you're supposed to say, you know, how many times they pass a basketball. Right. And now now hit him, hit him with the good stuff. So apparently half of the people who do this, which is astounding to me, half of the people who watch this video and take this test don't notice that in the middle of it, a person in a gorilla suit walks into frame and turns to the camera. and I think beats on their chest and then walks out of frame. 
Like in the middle of these people throwing these basketballs around the gorilla, half of the people are paying such close attention to counting how many times the people wearing white t-shirts are passing the basketball that they do not notice the gorilla until the end of the video when it's pointed out. Yeah, and we're assuming it's a person in a gorilla costume. I'm hoping. First of all, that might be a bias at play, that it's not a real gorilla. Well, I guess it depends on the amount of funding they had. Yeah, it looks actually like the gorilla from Trading Places. It totally does. Which is like, were they even trying? No. Okay, good. I just wanted to make sure. Did you watch this video before you knew about this? I had heard it from some friends who do magic, and they were basically talking about this on a little podcast that they made. Whoa, whoa, whoa, whoa. You have friends who do magic? So, you know our friend Toby? Oh, yeah. He has very good friends that do magic. Wow. Like professionally? Yes. I became kind of friends with him. So, yeah, I guess I do. I have friends that do magic. Well, buddy, next time we are in Los Angeles at the same time, our good friend and friend of the show, Adam Pranica, is a member of the Magic Castle. And it's one of his and his wife Elaine's favorite thing to do is to take friends to the Magic Castle. So have you ever been? No. It is great fun. Adam Pranica just keeps getting better and better, doesn't he? Yeah, I haven't been with him, but I've been a couple of times. Once many, many years ago, then another probably 10 years ago. But it's a lot of fun. I'm a big fan of magic. Yeah. And it's pretty magical when people don't see that gorilla in a very tight frame. It's not like it's on a big basketball court and the gorilla sneaks in there. Like there's six people and then there's a seventh, very clearly. It is very obvious. So, I mean, what this is showing is that our attention is limited, right? 
When we're really focused on a task, even the half of the people who didn't notice the gorilla still saw it, but they were so focused on the task that their brain was just getting rid of information that was unrelated to the task because it's not pertinent. It can become pertinent, though, when that gorilla decides to attack you. And so this is a cognitive bias we have where we're ignoring potentially important information to take in the stuff that's related to the task at hand. Yeah, you know where they could really get away with this, because where you have great concentration is at a professional sports game on the Jumbotron when they have the baseball under the helmet or whatever. And then they're moving them around and you got to find, you know, it's like three-card monte, because you're concentrating so hard on that. They could put whatever they wanted on that screen while that's going on, and I bet you most people would not notice. Or maybe I guess it's half if that's what they found. Yeah, I'll bet you're right. I'll bet you're right, man. I was trying to think of something where you're super trying to follow because I was happy I came up with the correct amount of passes at the end. You did. You said, what does it mean if I noticed the gorilla and got the correct number of passes? And I said, it means you're a perfect human. That's right, which we all know is not true. Um, none of us are, Chuck. None of us are. Yeah, I know. So there's another one that you may have heard of before, even if you've not heard of any of these other ones, called the Dunning-Kruger effect. It became kind of viral because if you take it through the pop culture meat grinder, it becomes much more simplified and kind of loses some of its actuality, but people still like it because it's a good way to put other people down. Yeah, it is.
Uh, this is the idea, the correct idea, that people with little understanding in an area tend to overestimate their ability and their knowledge about something. Right. Because they know so little, they don't even know what they don't know, kind of. Right. Exactly. But what you are talking about is how it's kind of been transformed into, like, the morons among us are the most braggadocious, which can be true. It can be. You know, I think that's one of the things, like you said: you can be right with cognitive biases. You're not wrong with them all the time. So, yeah, that kind of supports that. But that's not what the Dunning-Kruger effect actually says. You said it. And then there's the opposite way too, where the more experience you have, the more expert you are in a field, the more you assume that it should be easier for you than it is. Yeah. That's a very valuable thing to understand, I think. And you get much further in life if people are like, well, you're the expert. And the expert's usually the one going, yeah, but I don't know, maybe we should hold off because, you know, X, Y, and Z. Right. Yeah. So that's the actual Dunning-Kruger effect. And I saw that it's being assailed right now. People are starting to question even the basic version of it, like the actual academic version of it. Yeah. So we'll see what happens with that. Oh, interesting. We've got the gambler's fallacy next. And that is, oh boy, if you have ever gambled anywhere, but especially if you go to casinos and stuff like that, you're going to see this all over the place. You're going to hear it spoken out loud. And this is the idea that you find patterns where there are no patterns. Yeah. So if you're at the blackjack table and you hear the person next to you like, well, oh man, see, I lost four in a row, so I'm going to bet, like, I'm going to go all in on this, because I'm bound to win since I lost four in a row. There's no way I'm going to lose five in a row. Right.
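As an aside, the "bound to win after four losses" logic can be checked with a quick simulation of independent flips. This is just an illustrative sketch; the streak length, flip count, and function name are arbitrary choices, not anything from the episode:

```python
import random

def streak_then_next(flips, streak_len=4):
    """After each run of `streak_len` consecutive heads, record whether
    the NEXT flip is heads. If flips are independent, this is ~0.5."""
    wins = total = 0
    run = 0
    for i, f in enumerate(flips[:-1]):
        run = run + 1 if f == "H" else 0
        if run >= streak_len:
            total += 1
            wins += flips[i + 1] == "H"  # True counts as 1
    return wins / total

random.seed(0)
flips = [random.choice("HT") for _ in range(200_000)]
p = streak_then_next(flips)
print(round(p, 2))  # close to 0.5: a streak doesn't make the next outcome "due"
```

However long the streak, the conditional frequency stays near one half, which is the whole point of independence.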
The problem is, each of those hands of blackjack are unrelated to one another. They don't form a pattern, but you are predicting a pattern that just doesn't exist. Yeah. That means you're a fallacious gambler. It can get you in real trouble. I mean, you can do the same thing on the playground with coin tosses. In fact, coin tosses, I think, is a lot of times the way they sort of try and prove this. Yeah, because each coin toss, considering you're playing with a perfect, unflawed coin that has no bias whatsoever, each coin toss is totally unrelated to the last. So you could get 100 heads in a row and that doesn't mean anything. It doesn't mean a tail is coming, because each of those 100 coin tosses had nothing to do with the last one or the next one. I know, that's hard to break out of, though, because it seems very human to think, like, they flipped four heads in a row, there's no way there's going to be a fifth. Well, that's another reason why this is so hard. We're hardwired to find patterns and stuff. It's the way we navigate the world: by finding patterns so that we can recognize things in the future and thus spend less energy getting back to homeostasis. That's right. This is all so interesting to me. I love this stuff. I knew that you loved it. This is Josh Clark Central. I love observing it because I can't grasp what it feels like to suffer any of these. So just to discuss it in this way is really fascinating to me. All right. Let's talk about the base rate fallacy. That means you put more weight on just, like, one very specific piece of information instead of looking at all the pieces of information that have come your way. Yeah. And usually it's individuated information, meaning, like, say, some quality or characteristic of one person. And then you're ignoring the base rate, which is, like, pure statistical information about what you're trying to figure out.
And a really good example of this is, let's say that you're looking at somebody who is super fit, a woman who's very fit and athletic, and you're asked, do you think that woman is a personal trainer or a teacher? Basically the only evidence you have there is that this woman is very athletic and fit, so you might say personal trainer. But if you took all the base rate information into account, you would know that even though the portion of teachers who are very fit and athletic is small compared to the total number of teachers, it's still much larger than the total number of personal trainers in the world. So statistically speaking, it's much likelier that that very fit, athletic woman is a teacher and not a personal trainer. You don't do that, because you think: athletic and fit, must be a personal trainer. You've just fallen prey to the base rate fallacy, my friend. Yeah, but she has on yoga pants and Hokas. Exactly. That doesn't narrow down anything these days, you know. How about the mere exposure effect, Chuck? And, like, "mere" is part of it. I'm not making a judgment about it. That's right. That means just merely being exposed to something has a vast impact. So the more we experience something, the more we like it, which is why you see that commercial for the thing over and over and over, that Burger King ad over and over and over. Although I wouldn't say that you might like that one the more you heard it. That's the outlier for me, too. But that's the idea, though. Just mere exposure will get you there. And then there's a related thing called the illusory truth effect, which is basically that repeated exposure to a lie causes you to eventually believe it if you hear it enough times, even if you initially knew that it wasn't true. So that makes me wonder if it just wears you down over time, like your brain is tired of defending itself against being assailed with a lie.
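The teacher-versus-trainer example above can be turned into arithmetic. The population sizes and fitness rates below are made up purely for illustration; only the structure of the calculation matters:

```python
# Hypothetical, made-up numbers to illustrate the base rate fallacy.
teachers, trainers = 4_000_000, 300_000            # assumed population sizes
fit_given_teacher, fit_given_trainer = 0.10, 0.80  # assumed fitness rates

fit_teachers = teachers * fit_given_teacher   # 400,000 fit teachers
fit_trainers = trainers * fit_given_trainer   # 240,000 fit trainers

# Among all very fit people in these two jobs, what share are teachers?
p_teacher_given_fit = fit_teachers / (fit_teachers + fit_trainers)
print(round(p_teacher_given_fit, 3))  # 0.625
```

Even with a trait eight times likelier among trainers, the sheer number of teachers means a fit person is still more likely a teacher; ignoring that denominator is the fallacy.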
And it's just like, fine, that's true. I don't care. Yeah, I mean, sure. Politics certainly comes to mind. Repeat the lie, repeat the lie, repeat the lie. Yeah. And I mean, like, it's a viable way to exploit people's cognitive biases in that respect. Should we close out with the big daddy of them all, the big C, confirmation bias? Yeah, let's do it, baby. All right. Why don't you start this one? Okay. So there's a guy named Peter Wason back in the 60s. He coined the term confirmation bias, and he basically had an experiment that's really clever. It's hard to understand at first, but it's very clever. He basically said, hey, here is a sequence of numbers: two, four, and six. Figure out what the pattern is. Just to be clear, this is really hard to explain. If you find somebody who can explain this well, you'll get it, but I don't think I'm a candidate for that. I think we all know that I'm not going to explain this very well. Oh, I don't think that's true. Do you want to take a crack? Okay. The original numbers were 2, 4, 6. And people might tend to go with, like, all right, 8, 10, 12, and they're thinking it might be, all right, it's an even number sequence, ascending even numbers. Right. And they would say, no, that hypothesis is not correct. And you'd say, well, maybe it's 4, 8, 12, and it's, like, doubled or something. And they would say, well, that hypothesis is also incorrect. And then you're at wit's end, because what you haven't done is just tried any ascending order. You didn't go 1, 79, 300. All right, let me take a crack at it. You ready? Sure. So the original numbers are 2, 4, 6, and the participants would try to come up with the explanation of why, like, what pattern are those numbers following, right? So you might say, like, does 8, 10, 12 work? And they would say yes. And you'd say, okay, well, then you're just looking at even numbers. And they would say, no, you still got this right. This still fits the pattern, but your hypothesis for it is wrong. Right.
That's the key. Right. Here's where the confirmation bias came in. People would then go back and continue trying to find versions that fit their hypothesis to explain this, even though it was wrong. Yeah. Rather than take their hypothesis and say, okay, this is right, this fits the pattern, but it's still not correct, and start trying to break their original hypothesis by coming up with, like, just completely random stuff that doesn't fit their original hypothesis. In which case they might have said something like, does 1, 6, 27 work? And they would say, yes, that fits. And then that might lead the person to see that actually the only thing that has to be correct to be part of the model is that the numbers have to ascend in order. That was it. But people, man, just go. You might even try and break it by saying 3, 5, 7, but you're still using that original hypothesis. Exactly. Yes, say, like, that you think it goes up by two or something like that. Yes. Very few people go back and try to break their own hypothesis. And that's the point of confirmation bias. Let's move on from that experiment. The point of confirmation bias that this shows, if you actually can understand it from other people besides us, is that we tend to take our initial ideas, our beliefs in a lot of cases, and look for information that supports those and discard information that doesn't support them. Right. And that's, of course, you know, I mentioned politics a minute ago. That's where you most firmly see this these days: you are in a media bubble, probably. I don't know a ton of people that get their news from completely disparate points of view. And the news these days that you're getting is, a lot of it is, so slanted to begin with, it's probably not even the best example anymore to use.
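Wason's 2-4-6 task can be sketched as a tiny program. The hidden rule really is just "any strictly ascending triple"; the probe triples below are illustrative choices:

```python
def rule(t):
    """The experimenter's hidden rule: any strictly ascending triple."""
    a, b, c = t
    return a < b < c

# Triples chosen to CONFIRM the "goes up by 2" hypothesis all pass...
confirming = [(8, 10, 12), (20, 22, 24), (100, 102, 104)]
assert all(rule(t) for t in confirming)  # ...so the wrong hypothesis survives

# Only probes that could FALSIFY the hypothesis are informative:
print(rule((1, 6, 27)))  # True: not "up by 2", yet it fits the hidden rule
print(rule((6, 4, 2)))   # False: descending triples are what actually fail
```

Confirming probes can never distinguish "up by 2" from "any ascending numbers"; only a probe your own hypothesis predicts should fail can reveal the difference.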
But you're probably, this is a long way of saying, you're probably going to be seeking out news that confirms your beliefs, because you don't want your beliefs challenged. Yes. I mean, like everybody else, I have trouble with that as well. But I have to boast: Yumi is actually really good about getting news from different sources. Yeah. And one reason that I find it difficult to do is because I have, like, physical reactions sometimes. Yeah. And that is a thing. That's one of the reasons why they think we use confirmation bias: it sucks to have your beliefs challenged, right? It's really difficult to overcome that. And there's this thing called belief perseverance, which is, even when your beliefs are challenged with, say, an indisputable fact, you can still use confirmation bias to preserve that belief, because we usually attach our identity to, or build our identity around, our beliefs. That's who we are. So it's like we're being personally attacked. And then even more than that, there's the backfire effect, right? Did you see that? I did not. So the backfire effect says that being presented with information that counters your own beliefs can actually make you solidify your original incorrect belief, right? You'll believe it even more strongly, even though you've just been given facts that contradict it. So we really, really hang on to our beliefs as much as possible. And that is a huge, huge thing that humans trip over. Confirmation bias is probably the granddaddy of all biases, I think. Yeah, that's why I saved it for last. Yeah. And, you know, there are a lot of reasons people do this. You might be protecting yourself, like your self-esteem, because otherwise you're admitting that you may have been wrong about something. And it takes a big person to do that. You want to believe that you're right about stuff. And it also might just be difficult to process more than one hypothesis at once.
It might just be a little too brain-breaking. Yeah, because once you lock into an explanation, your brain is just like, no, we've got it. We don't have to figure this other thing out. Homeostasis, homeostasis. You know, it is very hard to entertain something that is counter to what we already think is true. That's right. Uh, all right, everyone, as you can tell by the clock, we are taking our second break. Uh, this is a long one, and we're going to come back and talk about behavioral economics right after this. Well, wait, before we do that, let's try to explain this confirmation bias study again. Yeah, we should. All right. We'll be right back. Hi, this is Jo Winterstein, host of the Spirit Daughter Podcast, where we talk about astrology, natal charts, and how to step into your most vibrant life. And I just sat down with Minnie Driver. The Irish traveler said, when I was 16, you're going to have a terrible time with men. Actor, storyteller, and unapologetic Aquarian visionary. Aquarius is all about freedom-loving and different perspectives, and I find a lot of people with strong placements in Aquarius are misunderstood. A sun and Venus in Aquarius in her seventh house spark her unconventional approach to partnership. He really has taught me to embrace people sleeping in different rooms, in different houses, in different places, but just an embracing of the isness of it all. If you're navigating your own transformation or just want a chart-side view into how a leading artist integrates astrology, creativity, and real life, this episode is a must-listen. Listen to the Spirit Daughter Podcast starting on February 24th on the iHeartRadio app, Apple Podcasts, or wherever you listen to your podcasts. Okay, I promised to talk about behavioral economics. A lot of the work that Tversky and Kahneman did was super applicable and kind of revolutionary in a lot of ways for the world of economics and how people's buying behavior is affected. They didn't invent it.
Adam Smith wrote about stuff like this. And starting around World War II is when they started really kind of homing in on stuff like this, like using mathematical models. And it all kind of started with the assumption that people and companies and organizations are really just trying to pursue their self-interest at the end of the day. Yeah, and they're going to make the most rational decision. Yeah. And they were like, yes, we know people make irrational decisions, but these are outliers. Like, if you take all of the information and data in aggregate, you'll see that humans generally try to make the most rational decision. That's just not true. People don't do that. We make all sorts of irrational decisions that very frequently run counter to our own best interests. And again, we'll even reject stuff, like information that would help us make decisions in our own best interests, if it counters our beliefs. So there's a guy named Richard Thaler who ended up becoming a colleague of Tversky and Kahneman. And he took some of their papers and he realized that these mistakes, these cognitive biases, can be predictable. Right. You can actually basically map how somebody is going to make a bad decision. And this became the basis of behavioral economics. Yeah. Well, let's talk about prospect theory, because this was from Tversky and Kahneman. It was an article from 1979 called "Prospect Theory: An Analysis of Decision under Risk." And Livia says it's probably the most cited economics paper of all time. Like, this was a revolutionary landmark paper. And they didn't write a ton of papers, for researchers. They did, like, eight total, which just shows what an outsized impact they had. But in this paper they talk about a lot of attitudes about risk.
One is loss aversion, which is the idea that you're going to experience more emotional suffering when you lose money than you will gain happiness when you gain money. So you may pass up an offer that gives you equal odds of winning $25 or losing $20. There was another example I think kind of gets it across more: there was an experiment in 1996 where they gave participants a lottery ticket. And before you scratched it off or whatever, let's say it was a scratch-off, they said, all right, well, hold on. Before you do that, I'll trade you another lottery ticket plus $10 in cash. And for no logical reason at all, people tended to think that that first ticket was the one, even though it was just a lottery ticket. There was no difference at all. They would turn down that extra 10 bucks. I think less than 50% of them took that deal. Yeah. Because in giving away or trading that first ticket, they risked a loss, even though the gain was right there. Just by trading the ticket, you got an extra 10 bucks, right? That's fairly irrational. We also have a lot of trouble with rare events. Yeah, we tend to overestimate them. It can be a positive event, it can be a negative event, but we're really bad at probabilities and statistics. And this is essentially, it's like, you won't let your kid walk to school because you're afraid of your kid being kidnapped, even though the chance of your kid being kidnapped is just ridiculously low. It's technically irrational, even though very few people would fault you for that. But it's still an irrational decision. Yeah, for sure. We talked about that in the, well, was it Free Range Kids? Maybe. I can't remember. It tied in with the satanic panic and stuff like that, I think. Definitely did. Back in the day.
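The win-$25/lose-$20 gamble above can be run through Kahneman and Tversky's value function. The curvature and loss-aversion parameters below are the commonly cited estimates from their later 1992 work, used here only as illustrative defaults, not as the episode's numbers:

```python
def v(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: gains are curved by alpha,
    losses are amplified by the loss-aversion factor lam."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

# Coin flip: win $25 or lose $20. In plain dollars the bet is favorable...
ev_dollars = 0.5 * 25 + 0.5 * (-20)   # +2.5
# ...but the felt (subjective) value is negative, so many people decline.
ev_felt = 0.5 * v(25) + 0.5 * v(-20)
print(ev_dollars > 0, ev_felt < 0)  # True True
```

Because losses are weighted roughly twice as heavily as gains, a gamble with positive expected dollars can still have negative expected feeling, which is exactly why the equal-odds offer gets passed up.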
Then there's thinking in relative rather than absolute terms. This is a great example: you might drive an extra 10 minutes to buy a shirt from a store that's selling it for 20 bucks rather than from the one closer to you that sells it for 30, because that saves you 10 bucks. But you won't drive five minutes down the road to save $20 on a car, because you're like, oh, it's $20, and the car's $20,000. In absolute terms, though, you'd be saving twice as much money as you did on that T-shirt purchase. Right. But like you said, it's all relative. Yeah. Again, totally irrational. But all this stuff relates to economics. And like you said, this stuff can be replicated. There was a 2020 study that looked at prospect theory in particular. This major study was conducted in 19 countries in 13 different languages, and prospect theory held up. Not bad. No, that's not bad at all. And so it's not just economics. It's not just being exploited by the wine list or, you know, Kentucky Fried Chicken or something like that to make you buy their stuff. This actually can have, like, life-and-death consequences, too. Although I guess so can wine and Kentucky Fried Chicken. You know what you're going to get at Kentucky Fried Chicken? What? Pepsi. That's right. You will get some Pepsi. You know how I know that? Because you just had Kentucky Fried Chicken? I did. After our tour, I was a little tired and needed just some fried chicken. So I got fried chicken. What do you get? Just the original or extra crispy? Because you're crazy if you don't get extra crispy. I get extra crispy, but they were out. They couldn't satisfy the three-piece I got. They had two more pieces of extra crispy, and they asked if one piece of original recipe was okay. And I was like, yeah, sure. I'm not going to not eat a piece of chicken. It is good. They do chicken right. Yeah, they do.
Did you get the mashed potatoes and gravy? You know it, buddy. Times two, and an extra biscuit. I went all in. It was a rare eating frenzy. Did you drink a Pepsi? I did. Awesome. Well, that all fits somehow. I don't know how, but it somehow fits this episode. So where I was saying that this can be life and death is with medicine. Because although doctors have God complexes and like to present themselves as infallible, they are quite fallible. They're humans, and they can suffer the same cognitive biases as us, but they have your life in their hands. We rarely have others' lives in our hands. Yeah. Do you watch the show The Pitt? I tried and it just did not grab me. I gave it like 10 minutes, but I hear nothing but good things. Yeah. I mean, I really like it. I've never been a hospital show guy, so this is kind of one of my first forays into it, but I like it a lot. I haven't started season two, but I noticed when reading through these biases, like medical biases, that they do, or at least Noah Wyle does, a really good job on the show with these younger residents trying to bust through. And a lot of this stuff comes up. He doesn't say, hey, that's the affect heuristic. He just will talk about what that is. And now that I know the definitions, I'm like, oh, what he's talking about is an outcome bias or an anchoring bias. It's fairly interesting. Yeah, rather than, say, being presented with a really high price for a bottle of wine to make the other overpriced wine seem like a bargain, this can be, like, your first lab work comes back and that forms the anchor, your initial impression of your condition. And even as new lab work comes back, that doctor may fail to adjust their view of your condition, because they're not taking into account this new stuff. They're giving more weight to that original number. So, yeah. And that's just the anchoring bias. Like you said, there are all sorts of other ways for it to happen.
And all of it can result in poorer outcomes for patients, just because their doctors are humans and we don't really approach cognitive biases in a methodical or deliberate way. Yeah. In fact, now that I'm thinking about it, they do this so much on the show. The show could be called Medical Confirmation Bias. Nice. Because you see it all the time. Outcome bias is when you're convinced a shift in the patient's health is the result of a treatment, like, it's because of that thing I did. Or the affect heuristic that I mentioned: an emotional reaction to a patient kind of overrunning deliberating on the thing in a logical way. This happens all the time on the show. Yeah, well, another field where it happens is forensic science, which, as we've gone to great lengths to point out, is junk in most cases. And a lot of that junk is just based on cognitive biases. Yeah, for sure. I mean, certainly the way they do lineups is flawed. I feel like you're right, we've done this a lot on the show. The way they have done a lot of this is super flawed. And I think maybe they're looking at it some, but not a lot. No. So if you want to fight cognitive bias in your own mind, Chuck, what do you do? What do you do? Well, there's a list of good tips here, and I think these are pretty good tips. The first one is just being aware that you have these, which is something that we've already kind of worked through on this episode, except for you, of course, because you don't have these. Sure. But studies show that just being aware, it's not one of those things where, like, well, being aware is half the battle. It's like being aware is maybe 2% of the battle. Yeah. It's like, you're aware that you have an unconscious bias, but it doesn't make you understand the bias. You just know that it's there. Right. That's the problem: it's unconscious. What else?
There are some, like, actual things you can do, like delay decision-making. Yeah. Don't come to snap judgments. Go get more information. Go get information from a contradictory source or a different source or something like that. And then, kind of tied into that, you can have, like, personal rules. Like, if there's a big decision, you will not make that decision until you've slept on it. Yeah. For example, don't buy a TV unless your friend says, yeah, good idea. Try and consider your past experience, for sure, because optimism bias could come into play. Like, hey, it worked out last time. Yeah. Like, why would I take more time this time? Yeah, and that's another way that you can kind of do that. An exercise you can do is write down your expectations for an outcome and then go back and look at it afterward and see if you were right or not. That can kind of help you realize, like, I do kind of tend toward the optimism bias. Yeah, because I believe that was one of the other biases, even. And, like, it's hard to recognize because you're biased, and you misremember what you thought going into it. So writing it down is a good one. Right. But if you're super, super unconsciously biased, you might be like, someone else wrote this in my handwriting. I've never been this wrong. What about Thomas Bayes and Bayesian reasoning? So he was a minister from the 18th century, and he basically came up with a standardized formula for taking into account the probability of an outcome, right? So I saw this on lesswrong.org, founded by Eliezer Yudkowsky, one of the guys who wrote If Anyone Builds It, Everyone Dies, about AI. The whole point of lesswrong.org is to overcome your biases in a methodical way. And they love Bayesian reasoning. And it basically says there's no such thing as something just being true.
Everything is just a probability, and you can kind of try to determine how probable something is based on whatever evidence you can gather about it, just basically going through life like that. You know who hates that website, l-e-s-s-w-r-o-n-g dot org? That dude who started his own personal comedy website, lesswrong.com. That's right. He's just getting smashed. What else, Chuck? What else is there? So, cultivate a growth mindset. That's a big one. Hey, I make mistakes. I screw things up. And I need to recognize that and try and grow from that, rather than, you know, just being confirmed in my own biases constantly. Yeah, and maybe look around at some of the ways that you're commonly exploited, say, by advertisers. Like, scarcity is one. When somebody says, act now, supplies are limited, they're creating a scarcity mindset in you. Social proof is basically, like, these people like this, so you probably should too, and you're like, oh, I should like that too. Yeah. And then two other things I saw. There's something called cognitive bias modification, I think is what it is. Okay. You can use this for, like, treating anxiety, right? Like, people with anxiety tend to seek out negative facial expressions. Oh, yeah. And this treatment is like, here's a thousand frowny faces, find the smiley face in there. And just screen after screen, you're looking for the smiley face, and you're training your brain to stop putting as much weight on negative facial expressions. Basically exploiting your cognitive bias to get over your cognitive bias. Oh, wow. And then the last thing, Chuck, is apparently AIs are starting to show signs of emergent cognitive biases, because they use heuristics too. So they're starting to make errors in judgment in predictable ways, which are cognitive biases, just like humans. Rob Zombie, more human than human. That's right. You got anything else? I got nothing else. This is a good one. This was fun, Chuck. Yeah, I like these.
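The Bayesian reasoning mentioned above can be written as a one-line update. The prior and likelihood numbers below are invented purely for illustration:

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | E): how much to believe hypothesis H after seeing
    evidence E, given how likely E is under H and under not-H."""
    num = prior * p_e_given_h
    return num / (num + (1 - prior) * p_e_given_not_h)

# Start 90% sure of a belief, then see evidence that's four times
# likelier if the belief is false (0.8) than if it's true (0.2).
p = bayes_update(0.9, 0.2, 0.8)
print(round(p, 3))  # 0.692: confidence drops, but doesn't flip to zero
```

This is the anti-confirmation-bias move in miniature: the evidence doesn't make the belief "false", it just shifts the probability, and repeated honest updates are what eventually overturn a wrong prior.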
Well, since Chuck and I both liked this episode, that means we have no choice but for listener mail to be triggered right now. I'm going to call this follow-up on Sebastopol, because I wondered what the connection there was. Because if you didn't listen, there's a Sebastopol, California, and we were talking about the Sevastopol in the Crimean War. And I was like, there's no way that's a coincidence. It's not. Hey guys, listening to the podcast on the Light Brigade from Sonoma County. Our Sebastopol was named after Sevastopol, and here's a little information. The settlement was apparently originally named Pine Grove, and the name change to Sebastopol was attributed to a bar fight in the 1850s, which a bystander allegedly compared to the long siege of the seaport of Sevastopol during the Crimean War. Wow. So the original name survives only in the name of the Pine Grove General Store downtown. And that is it. There's also the Russian River there, Russian River Valley. So apparently there is some Russian influence in that area, which I didn't know about. And that is from Marsha Ford. Yeah. Also, we want to apologize to all of our Iron Maiden fans who wrote in to be like, that song, The Trooper, is about that whole battle. Yeah, I didn't know. I like Iron Maiden, but I didn't have as much shame upon my head as you. But reading the lyrics, it doesn't say, you know, Crimean War and Charge of the Light Brigade, does it? I don't know. I haven't heard it in a while. I'm a big fan of the poster. I love the poster a lot. Yeah, me too. Well, sorry, all of you Iron Maiden fans out there. We'll try to do better next time. Yeah, missed opportunity. Who was that that wrote in about Sebastopol? That was Marsha, I believe. Thanks, Marsha. Marsha, Marsha, Marsha. We really appreciate you. And if you want to be like Marsha, you can email us as well. Send it off to stuffpodcasts at iheartradio.com. Stuff You Should Know is a production of iHeartRadio.
For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows. This is Special Agent Regal. Special Agent Bradley Hall. In 2018, the FBI took down a ring of spies working for China's Ministry of State Security, one of the most mysterious intelligence agencies in the world. The Sixth Bureau podcast is a story of the inner workings of the MSS and how one man's ambition and mistakes opened its vault of secrets. Listen to The Sixth Bureau on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. This is an iHeart Podcast.