Are you still quoting 30-year-old movies? Have you said cool beans in the past 90 days? Do you think Discover isn't widely accepted? If this sounds like you, you're stuck in the past. Discover is accepted at 99% of places that take credit cards nationwide. And every time you make a purchase with your card, you automatically earn cash back. Welcome to the now. It pays to Discover. Learn more at discover.com slash credit card. Based on the February 2024 Nilson Report.
You can go to kitted.shop and use the code SMART50 at checkout, and you will get half off a set of thinking superpowers in a box. If you want to know more about what I'm talking about, check it out in the middle of the show.
Welcome to the You Are Not So Smart Podcast, episode 301.
I am David McRaney. This is the You Are Not So Smart podcast, and this is part two in a two-part series about cognitive dissonance. And
I could do a 500-part series about cognitive dissonance. We could just start a whole new podcast that was only about cognitive dissonance if we wanted to. It's a very old idea in psychology, about 70 years old, and there are lots of tributaries and offshoots from the original research. In the last episode, we discussed the sort of inception of all of this. And in this episode, I want to talk about...
the landmark study that came out of the stuff we were talking about in the previous episode. And let's just get started. Let's do that. I want to begin by telling you about one of my favorite studies ever about anything, like in all of psychology and all of science, really. And it just so happens to be one of the original landmark studies into cognitive dissonance. So to set the stage,
let's go back to 1959 to Stanford University, where the psychologist Leon Festinger is about to change the way we think and feel about the way we think and feel. Officially, the authors of the study in which this experiment appears are Leon Festinger and Merrill Carlsmith.
We met Festinger in the previous episode of this show and joined him as he and his team infiltrated a doomsday cult to observe their behavior after the day of doom came and went. Not only did most of the cult members not leave the cult after encountering pretty strong evidence they had been wrong all along.
But for most, their convictions and beliefs and attitudes and loyalties grew even stronger. Festinger coined the term cognitive dissonance to describe the stress that people within that cult felt in the presence of such striking disconfirmatory evidence. And he remarked that the dissonance in this instance was so strong
that they chose to not admit they were wrong at all. They changed their minds about what was real, about what was true, about what was rational and logical and reasonable, instead of simply admitting that they had been mistaken. Festinger wrote a book about all of this. He became a world-famous psychologist. He joined the faculty of Stanford, wrote a second book about how he thought dissonance likely worked. And then, after all that,
decided it would probably be a great idea to design some controlled laboratory studies to quantify and measure and test his hypotheses. After Festinger did his very weird and wonderful study into an alien cult, he needed to create some lab studies that could measure roughly the same kind of psychological phenomena. That is the voice of Dr. Sarah Stein Lubrano. She is a political scientist and theorist and
academic who studies how cognitive dissonance affects all sorts of political behavior. She's also the co-host of a podcast about activism called What Do We Want? And she wrote a book that's coming out in May of 2025 titled Don't Talk About Politics, which is about how to talk about politics without
talking about politics. So, yeah, I look at the intersection of how people change their minds or don't a lot of the time and what that means for our political life on planet Earth. Okay, enough bona fides. Back to Festinger. He is riding high on the cult infiltration study and the resulting book and fame. He's now at Stanford, and he has designed a lab experiment to study a phenomenon he observed in the wild. He
designed this experiment with Merrill Carlsmith, an undergraduate at the time who actually came up with the original idea for all of this. Carlsmith would go on to become an influential psychologist, but he isn't one yet. And here is how the study worked. They got 71 psychology students at Stanford to sign up for a project called Measures of Performance, which they told them was part of a project that
the psychology department would be conducting to improve the quality of all their psychological experiments in the future. None of that was true, but it allowed them to tell the students that they needed everyone to be completely frank and totally honest in the interviews that would come after the experiment. So one at a time, a student would arrive and then get escorted into an office holding area,
where they waited for their turn to be asked to step inside the official experiment room, where they would learn what they would be doing for the next hour. They ask them to do an incredibly boring task. I think it was pretty much literally taking wooden knobs and turning them like 90 degrees or 180 degrees. Yeah, that was
task two. First, they sat down in front of a tray filled with 12 wooden spools, and they were then instructed to empty the tray with one hand, one spool at a time, refill it one spool at a time, and repeat that for 30 minutes or so,
while, quote, working at their own pace, end quote. Meanwhile, the experimenter, the scientist, sat nearby with a stopwatch and pretended to take notes. Then, after half an hour, the experimenter removed the tray and replaced it with those square pegs, a horizontal board on which 48 square pegs had been mounted.
The new task was to rotate each peg one quarter turn until they'd rotated them all, then repeat all of that over and over for half an hour at their own pace while the experimenter watched. And if this sounds excruciatingly boring and mind-numbingly repetitive, that was the idea. Festinger and Carlsmith hoped that by the end of all of this, the participants would hate
what they were doing, would regret they'd signed up for this, and would not wish this on anyone. And by the way, you can watch this study. They have videos and those are online. So if you Google this, there are videos of people doing this from back in the day with wild 50s haircuts and so on. So they do this task. It's very boring. And then something very clever happens.
As with most psych studies, there's a lie in the study, because that's how you make sure the person doesn't realize what it is you're actually measuring. So you lie about what the study is about. And I love this part, the lie part, because at the end of the hour, the experimenter in the room with the student leans back and lights a cigarette, like, clink. And then, through a cloud of smoke, tells the student what the real study is about.
And the researcher wearing a lab coat and looking very professional says, you know, oh, thank you so much for doing this study.
Actually, what we're trying to measure, just so you know, is we're trying to measure if we tell someone a task is interesting, will they find it more interesting? And obviously, I didn't tell you that the task was going to be interesting or boring. But another student who's coming next is going to be told the task is really interesting. But I have a problem. And the student's like, OK. And the researcher says, I have a problem. My colleague who's supposed to tell the next student that this task is really interesting, he just didn't show up for work today.
And the student goes, oh, okay. And the researcher says, could you help me? Do you think actually, just to solve this problem that I'm having, if I pay you, will you go into the hallway and tell the next student that the task is really interesting so we can continue this study? And most of the students said yes, but they were paid different amounts of money. This established the conditions of the experiment. From here on,
they created three groups. There's the control, but most crucially, there is the $1 group, which is about $9 in today's money, and the $20 group, which is about $190 in today's money. The control group proceeded straight to the interview, but the $1 and $20 groups go back into the waiting area,
where another student is waiting, and they tell that student, wow, I just did this research study and it was really interesting. To reiterate, the students spend an hour performing a very, very boring task. And then they are asked if they would mind helping the psychology department by telling the next person waiting to do this task that it is actually quite fun and interesting and not boring.
They also tell these participants, these subjects in this study, that what they're studying are expectations, and the person who is supposed to lie and say all this couldn't make it today. Also, they are told if they agree, they will be officially hired and might be called back sometime later to help with other stuff. So all of this is out there. And some of these students
are offered the equivalent of $9 to do this, and some students are offered the equivalent of $190 to do this. That's the only variable. How much are they paid, right? Almost everyone agrees. And then those who have agreed are escorted back into the waiting room where the next student,
who is, of course, not actually a student, but is in fact the lab assistant in disguise as a student. And they tell this pretend next student, who they believe will be the next person doing that very boring experiment: oh, yeah, I just did it. It was so interesting. They lie. They lie. They tell the next student it will be fun, even though they know it will not be fun. And like Dr. Stein Lubrano said, this supposed next student they're telling all this to is actually in on the experiment.
And when the experimenter leaves the room saying they'll be back in about two minutes, the fake student waits until the participant tells them the experiment was interesting. And then she tells them she's surprised to hear that, because one of her friends who had done this experiment already told her it was terrible. And in response, nearly all the students tell her no,
it's not terrible, it's actually very fun, and she is going to enjoy it. However, two did tell the truth at this point, and one got her number so he could tell her the truth after. At least that's what he said. Also, three rejected the money during the light-a-cigarette, can-you-help-us part of the experiment. And all of those students who were just not playing along were excluded from the final analysis. But the rest, 65 in all,
They lied. And when the experimenter returned after leading the fake student into the experiment room, those participants went down the hall for their post-experiment interviews. And here, each student was asked to honestly rate how fun the study that they were just participating in, how fun was it really on a scale from negative five to positive five.
And then they asked each participant what they felt the study was actually about and if they suspected anything. And at that point, five students said, is this about lying? And so they got excluded as well.
So that leaves us with a pool of people who believed that they were actually helping the psychology department and they were actually lying about something that they knew that they were lying about. And some of them were doing it for $1, which is $9 in today's money.
And some of them were doing it for $20, which is $190 in today's money. In a couple of weeks, the same students are asked in a survey, how interesting was the task really? And what's interesting is that the students who had to lie for only $1 have a very different result than the students that had to lie for $20. The students that lied for $20 are like, that task was incredibly dull. Because of course it was. It was a really dull task.
And the students that were paid only $1 seem to think the study is interesting. And that is what is incredibly weird, right? They went out and lied for not a huge amount of money, and they now believe their own lie. And they really do seem to. And this is where I really recommend watching the YouTube videos. Because there are interviews where Festinger or his colleague asks the student like,
You know, some other students have told us that that task was boring and the students are in like shock. They're like, no, it was really fun. You must be talking about another study because the one I did was great. It was a lot of fun. This is what makes this study so incredible because it starkly reveals the strangeness
of how we often deal with the discomfort of cognitive dissonance. Both groups observed themselves saying something they did not believe. Both noticed an inconsistency, an incongruence, a dissonance between their behavior and their beliefs, their experiences and what they said about those experiences. And both groups not only noticed this inconsistency,
but they also had to contend with the fact that they had accepted a bribe, a bribe to do something that might be considered by others as morally questionable. But their reactions to all of that differed only depending on the size of that bribe.
When they asked the students paid the equivalent of $190 how they actually felt about the boring tasks, those students said, yeah, the tasks were really, really boring. I hated it. I do not recommend it. When they asked the students who had been paid the equivalent of $9 how they felt about the boring tasks, they said, actually, in all honesty, I loved it.
It was fun. It was not boring. They recommended it. They would have done the tasks for nothing and would happily participate again. And here is the thing that really blows my mind about this study. They really did feel that way. After lying about how they felt, they then changed how they felt so that it would not be a lie.
No part of them, nothing inside them was telling them that they were fibbing, that this was a sham, this was a charade. None of it. From that point forward, it was fun to them. They changed their own minds to reduce the dissonance created
by the experimenters. So the question is, how did this person come to believe that actually the task was interesting? That's a really weird thing to gaslight yourself about, to misappropriate a word. And the answer is that the person is suffering from a cognitive inconsistency, right? They went and lied to the student and told them,
That was a really interesting study. And they actually found the study quite boring originally. And they need to reconcile the dissonance they face about the contradiction between their actual belief that the study was boring and their actual action, which was saying that it was interesting. And unlike the student that was paid a lot of money, who can reconcile this by saying, well, but I was paid a lot of money,
So this is consistent with the fact that it was boring, but I was willing to lie because I was paid a lot of money. The student who wasn't paid very much doesn't really have a good rationale for having lied to the student in the hallway. And they appear to shift their beliefs to tolerate this contradiction. They're like, oh gosh, I lied to that student and it wasn't for very much money, but actually it wasn't that boring. So it's not that much of a lie, right? They're doing a rationalization so that they don't have to live with the discomfort of a contradiction between their beliefs and their actions.
This is a very repeatable finding and a very weird study, but I actually think it works really well for explaining just how funky this is. And notice that there are all kinds of elements at play in it that come up again and again in dissonance studies after this. There's your sense of yourself as a good person that's at stake. There's a sense of yourself as having actually chosen the thing that you did, even though you were really compelled to do it.
And that will fit into a lot of other things that are worth saying about dissonance and the way it features in our life, that a lot of the time we feel dissonance about actions we kind of had to undertake but don't really feel comfortable with. And the way that we manage that is by deciding that in the end, we definitely wanted this thing and chose it for sure. Yes, this is... I'm a good person. I don't take bribes to do bad things.
I don't do boring tasks. I would have quit if that had been boring. I don't just lie to people. That would be heinous. I'm doing something for the betterment of humanity. And there are all these opportunities for you to see the truth of what is happening and possibly benefit from being honest with yourself. But people will variably do that depending on how much money you give them, because the story you tell yourself is different:
Why did I do that? Because I was paid. But even then, that's going to divide people into different groups. Some people are going to be like, I'm not okay with the fact that I did that because I was paid well to do that. I wouldn't do that no matter how much you would pay me, and so forth and so on. It gets complicated very quickly. But the fact that this is introspection that you're not aware is taking place, to the point that you actually do now truly believe that that was not a boring task, is very
freaky, upsetting, and weird to me. And always has been. I'm making up this story about who I am and why I am at all times, and I'm unaware that I'm doing it. And a good portion of it is going to be fiction for the sake of not thinking I'm an inconsistent bad person. Sarah, what are we supposed to do with that? What are we supposed to do with knowing that about ourselves?
Okay, well, look, I think it's very valid, as we like to say in pop culture discourse, to feel uncomfortable about this aspect of human psychology, and we should. And something I'm very interested in lately is a bunch of studies by a wonderful researcher who I've interviewed called Kristin Laurin. And she looks at what happens when people are faced with limitations on their actions and how quickly they do a really similar thing to the people in the study we just talked about. So in the study we just talked about, people
were kind of gently coerced and/or bribed into lying. And then they came in some cases to convince themselves, oh, I actually didn't lie, I totally believed in that thing in the first place, right? And Laurin's work looks at what happens when people don't like a restriction that is placed on them, but they think it's unavoidable, and how quickly they decide that actually they always liked this restriction. So she looks at two examples. She looks at a plastic water bottle ban in San Francisco, and she looks at smoking restrictions in restaurants in Ontario.
And she found that the very same people would have extremely different beliefs about their preferences before and after those bans. So many people before a smoking ban would say, this is, you know, government interference and overreach. I don't want to be limited this way. Or same with the water bottle.
bottles, right? Like this is a huge pain. And then the day that the ban comes into effect, you can ask exactly the same people. And a lot of them are like, oh, I like this ban. And I've never really smoked that much in restaurants anyway. And they actually don't remember that they used to not like this ban. And they also have misremembered how much they were smoking. Of course, you can imagine a lot of scenarios where these kinds of restrictions are a lot more authoritarian or harmful. You can see that the same mechanism at work probably lets people justify
adhering to really awful despotic laws and doing terrible things to other people. So we should be alarmed, and I want to validate that. And I also want to say that at the same time, I always look for this sort of utopian kernel in this as well, which is to say that actually, if human beings were really little sentient robots...
who were just responding to incentives and disincentives about their material interests. If all we cared about was, am I getting paid well and am I getting laid and am I getting to maybe have a fun time? I'm not that interested in that human subject. That's kind of boring and it's pathetic and it's kind of not that interesting morally. But a human subject that is desperate to tell a story about itself where it's a good
creature that does good in the world, and that keeps track of its sense of self, and wants to think about itself as an active, good person who can do something in the world. I'm interested in that subject. And I find it reassuring that that is even possible for human beings, even though it can be used against us to make us do terrible things as well. It gives me a little bit of hope in a time of real political despair that human beings care about the kind of creature that they are and the kind of force that they are in the world, and that they want that to be for good.
We'll be right back after this break. I know for sure that most studies show that new habits fail simply because you don't have a plan.
And without a plan, it's really hard to stick with a new habit or routine. That's why Prolon's five-day program is better than any trend out there. It's a real actionable plan
for real results. Prolon. It was researched and developed for decades at USC's Longevity Institute. It's backed by leading U.S. medical experts. Prolon by L-Nutra is the only patented fasting-mimicking diet. Yes, fasting-mimicking.
You're going to trick your body and your brain into believing that you are fasting. And when combined with proper diet and exercise, it works on a cellular level to deliver potential benefits like targeted fat loss, radiant skin, sustained weight loss, and cellular rejuvenation.
So I received a five-day kit from Prolon. And let me tell you, right away, I knew this was going to be fun and fascinating. The kit comes in this very nice box with this very satisfying Velcro latch. They did a great job with this
presentation packaging. And inside, on the underside of the lid, you get this message often attributed to Hippocrates about letting food be thy medicine. And below that, a QR code that takes you to your personal page for guidance, tips, and tracking. And then under all that, each day's food, snacks, vitamins, and supplements packaged within its own separate box. Each day has its own box. And
When you get in there, right away, it looks like it's going to be easy and fun. And it was. Both of those things. I was totally willing for all this food to taste bland and boring in service of the concept, in service of the mimicking of a fasting experience. But it turns out, it all tasted great. And day one...
is all set up to prepare your body for the program. And they give you everything you need to stick to it. It's very clearly designed by people who know what they're doing. And Prolon tricks your body into thinking you aren't eating. But here's the thing. It works without being painful because you do get to eat it.
You eat all sorts of little bits and bobbles that are like minestrone soup and these crunch bars and olives. And there's so much stuff in each day's kit. I've got one, I got it right here. Almond and kale crackers and a Choco Crisp bar, an intermittent fasting bar, minestrone soup, algal oil. I love how right away I was like, oh, I can't wait to try this stuff out. And yes, you do get hungry, but not nearly as hungry as I thought you would
get. And most importantly, by day three, I noticeably felt great. I had this, oh, I'm in on a secret feeling. And when it comes to hunger, by day five, I didn't feel like I was missing out on anything at all. The hunger was very minimal. And by the end of all this, my skin looked noticeably glowy and healthy.
And yeah, I lost six pounds. Six. It was easy to follow, and I felt reset afterwards, like I had rebooted my system. To help you kickstart a health plan that truly works, Prolon is offering You Are Not So Smart listeners 15% off site-wide, plus a $40 bonus gift when you subscribe to their five-day nutrition program. Just visit
ProlonLife.com slash Y-A-N-S-S. That's P-R-O-L-O-N-L-I-F-E dot com slash Y-A-N-S-S to claim your 15% discount and your bonus gift. ProlonLife.com slash Y-A-N-S-S.
The last thing you want to hear when you need your auto insurance most is a robot with countless irrelevant menu options, which is why with USAA auto insurance, you'll get great service that is easy and reliable all at the touch of a button. Get a quote today. Restrictions apply. USAA.
The School of Thought. I love this place. I've been a fan of the School of Thought for years. It's a nonprofit organization. They provide free creative commons, critical thinking resources to more than 30 million people worldwide. And their mission is to help popularize critical thinking, reason, media literacy, scientific literacy, and a desire to understand things deeply via intellectual humility.
So you can see why I would totally be into something like this. The founders of the School of Thought have just launched something new called Kitted Thinking Tools, K-I-T-T-E-D thinking tools. And the way this works is you go to the website and you pick out the kit that you want.
There's tons of them. And the School of Thought will send you a kit of very nice, beautifully designed, well-curated, high-quality prompt cards, each one about double the size of a playing card, printed on matte-cello 400 GSM stock,
and a nice magnetically latching box that you can use to facilitate workshops, level up brainstorming and creative thinking sessions, optimize user and customer experience and design, elevate strategic planning and decision-making, mitigate risks and liabilities, and much, much more. And each kit can, if you want to use it this way, interact with this crazy cool app.
Each card has a corresponding digital version with examples and templates and videos and step-by-step instructions and more. You even get PowerPoint and Keynote templates.
There's so many ways you could use this. Here's some ideas. If you're a venture capital investor, you could get the Investor's Critical Thinking Kit and use it to stress test and evaluate different startups for Series A funding. If you're a user experience designer, you can get the User Design Kit to put together a workshop with internal stakeholders for a software product. Or if you're an HR professional, you could mix and match these kits to create a complete professional development learning program tailored specifically for your team over the course of the next decade.
So if you're the kind of person who is fascinated with critical thinking and motivated reasoning and intellectual humility and biases, fallacies, and heuristics, you know, the sort of person who listens to podcasts like You Are Not So Smart, you're probably the kind of person who would love these decks. If you're curious, you can get a special 50% off offer today.
That's right, half off offer right here. You can get half off of one of these kits by heading to kitted.shop. Okay.
K-I-T-T-E-D dot shop and using the code SMART50 at checkout. That's SMART50 at checkout. 5% of the profits will go back to the School of Thought. So you're supporting a good cause that distributes free critical thinking tools all over the world, on top of receiving a set of thinking superpowers in a box. Check all of this out at kitted.shop or just click the link in the show notes.
And now we return to our program. I'm David McRaney. This is the You Are Not So Smart podcast. And we just talked a whole lot about cognitive dissonance.
I'm going to pick the conversation back up again with Dr. Sarah Stein Lubrano. But first, a brief recap and a brief summary of where we are in this dissonance theory thing. First, Festinger.
Festinger infiltrated a cult, then he wrote a book about it, then he wrote a book about cognitive dissonance, and then he conducted the famous dissonance study we were just talking about before the break. So, strange order, but it can't be overstated just how revolutionary the Festinger-Carlsmith boring-tasks study was. It's still the subject of replication, meta-studies, and retrospective analyses.
All of these things are still a thing. For 70 years, we have been experiencing dissonance concerning the fact that cognitive dissonance can have such an impact on human thoughts, feelings, and behaviors. We now know you can create a situation that will generate dissonance, and that dissonance will then compel people to sometimes change their beliefs.
You can manipulate a person's environment and get them to tell a story about themselves that will alter their attitudes or their values, their opinions, their actions, their intentions to act, and so on. And they will do it. They will change themselves. They will rewrite the truth of who they are. Okay, so what is the definition of cognitive dissonance? Let's
have Dr. Sarah Stein Lubrano answer that. So usually the phrase cognitive dissonance is used colloquially to tell people on the internet that they are idiots.
But that's not what it means to people who study it. And in particular, psychology uses this term to refer to the discomfort that we feel, often unconsciously, when we're faced with a contradiction between two or more of our beliefs or actions.
So what that means is that we might notice, I don't know, that we want something, but we also want another thing that's incompatible with it, and we feel a tension about that, and then we erase that tension, sometimes unconsciously, by choosing one and devaluing the other. I'm using this example first because it's not about hypocrisy, and I want to distinguish: it's not always hypocrisy. Sometimes it's just ambivalence. Ambivalence about not being able to reconcile different parts of our belief system, or our actions and our beliefs.
Lots of instances of dissonance also are about hypocrisy. We might believe one thing but do another. We might believe in climate change but fly to the Maldives. We might know smoking is bad for us but pick up another cigarette. And we largely experience dissonance in those occasions as well. So it's discomfort about a dissonance between our beliefs and our actions or two of our beliefs or two of our actions. Unconscious is an important thing there as well.
Most people experiencing dissonance don't appear to be conscious of what is happening for them. And what happens a lot of the time is that we find clever ways to get rid of the discomfort without noticing that we've done that. So the most common of the ways that we deal with dissonance discomfort as human beings is usually that we rationalize it.
The way I explain what a rationalization is, is it's when you give a series of reasons for something that are not the real reasons you did that thing. If you decide that the person who dumped you was always a terrible person and you should never ever have met them, but actually you're just grumpy because you wish that you were still dating, that's a rationalization. It's not the real reason you think they suck, but you're not willing to admit to yourself even that that's not the reason,
right? Or if you are, you know, not going on a run and you tell yourself it's because it's rainy outside and you might fall and slip, that's a rationalization. That's not the real reason you didn't go on a run. You're just being lazy today, but it's easier to find a rationalization. So when we face dissonance, when we encounter a contradiction between our beliefs and our actions, we might come up with a rationalization like,
It's fine that I'm flying to the Maldives because the plane is going to take off anyway, right? So yeah, it's describing kind of human beings' discomfort with contradiction and ambiguity in their own worldview and with their sense of self and their sense of themselves as good people. Something that we haven't covered at all for some reason is that dissonance theory emerged in the 1950s and into the 1960s right as psychology was going through a sort of
punk response to the stodgy lab coat behaviorism of the 30s and 40s, which had no real interest in introspection. It was all conditioning back then. Watson and Pavlov and Skinner, they had this picture of humans as basically simple animals, easily trained via rewards and punishments.
And instead of seeing brains as bags of chemicals passively responding to the external environment,
cognitive psychology emerged in the 1950s as a way of seeing the brain as actively involved in the construction of knowledge and meaning, actively organizing and integrating information, actively generating schemas and priors and assumptions and memories, actively, on purpose, knowingly curating concepts and consciously perceiving, interpreting, and categorizing the world.
Cognitive psychology said we did a lot of this within purely contemplative spaces in our imaginations, while worrying and thinking and ruminating, while in simulated internal worlds that we use to imagine potential futures and outcomes of our as-yet-uncommitted acts, the results of our as-yet-undecided decisions.
When cognitive dissonance theory was first published, behaviorists were arguably the leading group of psychology researchers in America. They were very interested in this idea of the human being, and indeed of the animal in general, as a creature that seeks material benefits and responds to rewards, and also, of course, is disincentivized by punishment.
To be clear, every living organism does do this to some significant degree. If I pay you a certain amount of money, you will do many things. If I am continuously rude to you, you will probably avoid me. There are lots of instances where behaviorism can, somewhat accurately anyway, summarize how human beings or even other animals respond to incentives and disincentives. But it turns out that that is only one model for how human beings behave.
and that there are other systems, let's say, in our psychology that can override that. And cognitive dissonance theory is one of the descriptions we have for when that very basic theory about human beings responding to rational incentives or not falls apart. That there are times, maybe only very specific times in a human being's life or particular areas of cognition where we are not calculating that way and we are not responding to what is in our rational interests. And as you say, it's because we are developing a
story, for lack of a better word, about ourselves. Or actually, I like to think about it more in terms of a map. So, you know, a story might be about the past, but a lot of what dissonance is responding to, I would argue, is actually about our sense of who we are in the world and what we could possibly do in the future.
And that's probably why it exists, why we have dissonance at all. If we are facing too many contradictions about what we think the world is like and what we think we are like, we won't know how to act anymore, right? We won't know, okay, is it good or bad that I'm a Democrat or Republican, a feminist, a Christian, a Jew or whatever? If we face a contradiction in our sense of self that is quite profound or in our sense of how the world actually operates, that
more or less prevents us from knowing how we should take action. And I would say that while you can only ever theorize ultimately about why something evolved in the human subject, this is a pretty good theory for why we would have a system like this that forces us back into cognitive consistency rapidly, because otherwise we might never know how we want to act next. And also, because we are such interpersonal animals, and we're such sort of
You know, we don't just live in the present as human beings, we have these long-term projects, we collaborate with other people, we build shared systems of narrative together that structure how we collaborate. We're probably driven to have meaning like this so we can have a shared sense of meaning. Some evolutionary theorists, I think Sperber and Mercier, right? They talk about this, that like a lot of what human beings do when they engage in reasoning isn't so much about trying to find a fundamental true fact.
It's about trying to have a shared set of reasons we can give each other so we can keep collaborating. And dissonance helps us do that also. It helps us stick to something that feels consistent enough that we could communicate it to anybody else. Dissonance theory is something that's been evolving ever since that original
research, the forced compliance experiments that we were talking about before the break. That was just the beginning. We've done lots of research since, but those studies are called the forced compliance experiments. Those are the ones in which a person is compelled to say or do something counter to their beliefs, attitudes, or values.
Well, we learned from those that the weaker the external justifications for compliance (that is, the fewer consonant cognitions and/or the more dissonant cognitions that compliance generates), the more likely a person is to produce an internal justification.
But we now know dissonance can be generated in many other ways. One is called effort justification. Researchers have found that if you engage in a painful initiation ritual or a pointless or laborious work project or go on an expensive or terrible vacation, there's a high likelihood you will rationalize what you've gone through and see it as well worth your time instead of admitting the pain, harm, and waste. And you'll truly believe it.
You'll even become defensive about it. That's effort justification, a form of dissonance reduction where people justify their efforts by inflating the value of the outcome of those efforts. Then there's post-decision dissonance. In studies where people are asked to choose between two equally appealing products, after making their choice, people will rate the chosen item as being much better than the rejected item. That's true in all sorts of other similar situations.
we will enhance our commitment to a choice to reduce any discomfort after making a difficult decision. There's also something called the hypocrisy paradigm.
In one such study, participants were encouraged to advocate for condom use. And when they were all reminded of the times they had not used condoms, it created dissonance. This hypocrisy induction, as they call it, led to a higher likelihood of future condom use. And it illustrated how dissonance from perceived hypocrisy can influence future actions to reduce feeling like a hypocrite.
We also know now that people must feel like they have a choice in the matter, whether or not they really did. Otherwise, they just won't feel very much dissonance about doing or saying something that runs counter to what they think, feel, or believe. They can always just blame it on the coercion. They didn't actually choose to do that. They can't be blamed for it.
In studies in which people are paid a little or a lot to write an essay that runs counter to their attitudes, if they don't feel like they had a choice to opt out, the greater the reward, well, the more they'll adjust their attitudes to match the essay. If they do get a chance to opt out, though, then the opposite is true. The less the money, the more the attitude change, just like the Festinger experiment. And one of the most important findings since the early days of dissonance research is the fact that
People tend to actively seek situations that provide consonance and actively avoid situations in which they might experience dissonance. And we will do both without realizing we're doing either, whether that's avoiding cable news channels that might threaten our attitudes or spending time with people who will likely praise all our decisions.
We don't just respond to cognitive dissonance after the fact. We actively manipulate our environment to optimize for it before it might happen. When we have studied cognitive dissonance by trying to get down into the neurophysiology of what's actually going on, when we notice dissonance between our attitudes and our actions, our beliefs and our experiences, our current understanding and some disconfirmatory evidence, it's the anterior cingulate cortex that seems to be mostly where this is coming from. The portion of the brain that notices errors, the error detection system, is very active in this regard. But also, what becomes active during these moments of strongly felt dissonance are the
aspects of the prefrontal cortex, which is involved in higher-order thinking and decision-making, planning, thinking about what you're going to do next. The dopamine system, the dopaminergic system, is a system for motivation. And within that system, dopamine affects the feelings that arise when outcomes don't match our expectations. And varying dopamine levels will then motivate us to notice, learn, and adjust our predictions going forward.
Also, upon resolving dissonance, which is to say justifying one's behavior, you get a dopamine release, providing a little bit of, hmm, that's nice. Also, the sympathetic nervous system is activated during moments of intense dissonance. This generates increased heart rate, sweating, anxiety, and so on. This is a lot of slapping you around from the inside to get you to pay attention. But most of all, it's the anterior cingulate cortex, the part of the brain
that plays the most crucial role in error detection, emotional regulation, and cognitive control. So yes, cognitive dissonance is a real thing. It is a bodily thing. It is a physiological reaction. The dissonance reduction behavior witnessed in that boring task study, the one with the spools and the lying, today, that's called the insufficient justification effect.
Without a sufficient extrinsic justification, people will create an internal one. And we now know that there is an over-justification effect as well. In studies where people are told they will be greatly rewarded if they choose to engage in activities that they already enjoy doing, those people will, over time, report enjoying those activities less. When tasked with explaining themselves,
the justifications no longer seem intrinsic. The answer to why did I do this becomes because I got paid, not because I think this is fun. We're the unreliable narrator in the story of our own lives, right? And something about that has always messed with me because it leads to the next question, which is,
Why are we doing this? I know this is going to be speculation and we don't understand the mind and the brain this well, even at this point in our history trying to make sense of it. But why wouldn't it be better to pursue raw accuracy when it comes to fact-based stuff? I'm going to pull that bullet point aside first. Why not try to, when you notice you're wrong, attempt to...
admit that you're wrong and then be right. Factually speaking, evidence-based, what's up with this? Why would that not be installed into the adaptive functions of the brain? So I want to point out, first of all, that because we've run so many studies, we know that dissonance actually doesn't happen for
all factual information corrections. And actually, I would argue it only happens for a very specific set of them. If you and I are having a conversation about whether it's raining outside, and you're like, it's raining, and I'm like, it's not raining, and you go to the window and you see that it's whatever, you will adjust probably to the reality of whether it's raining. And more broadly, we tend to revise our beliefs a lot in favor of what you might call like Bayesian reasoning or, you know, lots of things we're very capable of adjusting our beliefs about
There's just a specific genre of things that we're not very good at adjusting our beliefs about. Okay, this is perfect. This is perfect. What is this genre? Help me understand the genre where this is going to be more likely.
If we had to summarize all this research, over time, as researchers have done more and more studies, they found that, okay, we only find cognitive dissonance, rationalizations, and confirmation bias and other evidence that this is happening in specific circumstances. And that's usually when people's sense of self is under threat in some way, and or their sense of themselves as good agents. So doing things in the world that have good or bad outcomes. And I could run you through tons of different
ways that they measure this. But a good example, for the thing about our actions, is that often, if people are told, oh, that was actually just an experiment, the letter you just wrote telling, you know, the university to change its policy is now going to be thrown in the trash, they don't adjust their beliefs, because it's not a real action. It doesn't have any negative consequences, right? So they stop feeling dissonance, because ultimately their sense of self isn't thrown into question that much anymore, and their actions in the world will have no effect.
And so it seems like dissonance is happening only around issues that either make us feel like we are bad people or make us feel like our actions in the world, which have had a consequence, are causing a contradiction. And again, that's actually a relatively small part of our lives. Like most of the time you and I, Sarah and David, are wandering around, you know, discovering for real whether the coffee is hot and whether we should have repaired the other way and like whether parallel parking is possible.
possible on this street. We adjust our beliefs all the time, but we don't adjust our beliefs about things that affect our sense of self and our sense of agency. And unfortunately, there are some very important issues where our sense of self and agency are kind of always involved. And politics is the big one. It's probably that and religion and, you know, very difficult interpersonal conflicts you've had. And in those cases, we're going to have dissonance basically all the time. Festinger created this term by demonstrating the
scenarios in which it can lead to really terrible outcomes, which is a group of people who, like, destroyed their lives thinking that some aliens were going to come pick them up and take them off of Earth because of a flood. And then when it didn't happen, they doubled down, tripled down, and ruined their lives even further. They had opportunities to update and go, oh, well, I guess I was wrong about that. And, um,
this person is manipulating me. And they had chances to say, all right, look, okay, maybe this, I wasn't completely stupid to have done what I've done up till now, but to keep doing it is very stupid. And yet they doubled down, tripled down. So clearly, even if this is usually adaptive, there are times when this is bad. It's not something we should do. Yes.
That's right. And you know, you can see the sort of tragic nature of humanity in that in a way. But yes, I mean, the fact that we've evolved a certain way, as we know from every other sort of Evo psych study, doesn't always mean that it benefits the individual. And actually, we also are now living in very different circumstances than we evolved in. Right.
So we're both sometimes just situationally screwed over by cognitive dissonance. But also, more broadly, I would say that our brains are not well adapted to the modern news environment, right? It was probably much easier to live with your cognitive dissonance when it essentially just allowed you to
get along with your neighbors or at least collaborate with a couple of them against the other ones or whatever. And it's pretty poorly suited to a constant barrage of information and misinformation that challenges your sense of self all the time and cognitively overwhelms you and disorients you in the world where it's not even clear about a lot of political issues, what we could even do. Right. And I think that's actually a big struggle as well. It's one of the reasons in my book I talk about
Not so much presenting people with constant arguments, but giving them options to change how they actually operate day to day. So if you want someone to believe in climate change, one of the best things you can do is give them something they can do about climate change that makes them feel like a good person. Okay, so what do we do with this knowledge? How do we actively manipulate people using this information?
Toward goals that we assume may be good. You were talking about climate change. Like, how do you encourage someone to do things that prevent the world from ending? And you were saying, encourage them in a way where they feel good for doing the right thing or feel good for doing the thing.
Don't let me put it in my own words. I only want to hear your words here. This is really cool stuff. I like this angle. So let's switch to this, which is, okay, with all this in mind, how do we use this in some way or another? Instead of just letting it play out and then writing cool think pieces and Substacks about it, how can we actively use dissonance theory to adjust things
and actions and so forth and so on. Give me some ideas here. Right, I will, yes. I mean, look.
All of us in a certain way are stuck in a certain kind of liberal ideology. And by that, I don't mean liberal like Kamala Harris. I mean liberal like the tradition of liberalism that started in the 16th century vaguely, right? By that, I mean a system that has a legal system that defends private property, a system that thinks about people as individuals primarily, first and foremost, rather than families or communities. There are certain cultural aspects of liberalism
that have very much filtered into the way we think about everything, even the way we do psych studies for that matter. And one of the downstream effects of that shift that we've experienced in the West is that we think that we are rational agents who can just change our minds by having discussion. And if you think about it, that's like how our legal system is set up and how our parliaments are set up. We think political change or like even personal change is like we have a discussion and then we have a new opinion and that's how change happens.
And my book is all about how that's mostly not true. That's not how historical change happens. It's not how we change our minds as individuals, right? But that doesn't mean that people don't change their minds about hugely important political issues. It's just that they change their minds when they are faced with new action possibilities that they could really take or new relationships that they could have. And they are faced with them in a way that allows them to articulate their ambivalence, to grapple with it, and ultimately to choose a different way of looking at the problem
so that they can go out in the world and behave differently. And actually, your book is one of the ones that's helped me think through this, right? Why does deep canvassing, which you write about in one of your books, work? It works because in a way, the person there is articulating their ambivalence, so the cognitive dissonance is made somewhat conscious, even if it's not a term they're familiar with. But also, they're building a new relationship with the person on the doorstep who's told them an important story about their own life, right? And often, they're being given the opportunity to engage in an action
even if it's just voting in a referendum, that would allow them to be a good person, even if they change their mind. So they might be faced with information about, I don't know, a ban on trans people using the bathroom of their choice. And then they're given the opportunity to learn a new thing and then engage in an action that lets them be a good person and vote in defense of these rights. And by the way, there is climate canvassing as well now. So people do deep canvassing, this technique, these long-form conversations now.
And that is pretty effective as well in the studies we have about it. So I guess, to bring it all the way back, my suggestion is that the really effective forms of political action we can take right now, and there are other things we can do too, partially involve giving people the opportunity to learn about a new action they can take in the world that will let them hold on to their sense of self as a good person, as something they can do next. And actually, you can really see this in the climate research I've looked at. So
For example, the number one predictor of whether people will do something that's climate friendly in their life, like whether they'll install a heat pump in their house, doesn't have anything to do with whether they're given arguments for it. And interestingly, it doesn't even matter whether they're given financial incentives for it as much as it matters whether their friends are doing it.
Right? Which sort of aligns with some of these findings. Similarly, people are much more likely to, you know, change their minds about gay people if they discover that they know one. Suddenly, they need to either align their actions with their beliefs and defriend this person, or align their beliefs with their actions. And often they choose the latter, by which I mean they choose to remain friends with that person and shift their beliefs on homosexuality. That's a very consistent finding. So what we would learn from this, in my opinion, is that if we give people
opportunities to try new ways of living or form new relationships, they are much more likely to change their mind than if we just give them a bunch of arguments.
And I want to point out that we're actually living in a low point in American society in particular for a lot of these opportunities and relationships. Americans have fewer friends than they did 20 or 50 years ago. They spend less time with their friends. We are actually resegregating in a lot of different ways, not just racially, but often economically. Millennials in particular, but other people as well, are moving away from city centers because of a lack of affordable housing. So we're actually forming fewer and fewer relationships with our neighbors in a lot of cases.
We're at a very low point for social capital, to use a sociology term. And that's actually quite frightening to me as a political theorist, because it means we're probably much more cognitively rigid and isolated. So I guess the point is that I think the number one thing we can do, including and especially in the next four years, is try to create spaces and affordable opportunities for people to mix with people not like themselves and to try out new ways of living, even if that's just, you know, installing a solar panel or, you know,
helping a refugee. And that all the arguments down the road for who we should vote for, what we should believe, are in some ways secondary to the options we give people for how they live their lives.
That is it for this episode of the You Are Not So Smart podcast. For links to everything we talked about, head to youarenotsosmart.com or check out the show notes right there in your podcast player. You can find my book, How Minds Change, wherever they put books on shelves and ship them in trucks. Details are at davidmccraney.com and I'll put links to all sorts of things related to that right there in your podcast player.
For all the past episodes of this podcast, go to Stitcher, SoundCloud, Apple Podcasts, Amazon Music, Audible, Spotify, or youarenotsosmart.com. Follow me on Twitter and threads and Instagram at David McCraney. Follow the show at Not Smart Blog. We're also on Facebook slash youarenotsosmart. And if you'd like to support this one-person operation, go to patreon.com slash youarenotsosmart. Pitching in at any amount
It gets you the show ad-free, but the higher amounts get you posters, t-shirts, signed books, and other stuff. The opening music, that's Clash by Caravan Palace. And if you really want to support this show, just tell somebody about it. Share it somewhere. If there was an episode that really meant something to you, share that episode and check back in about two weeks for a fresh new podcast.