
301 - Cognitive Dissonance - Part Two

2024/11/25

You Are Not So Smart







Welcome to the You Are Not So Smart Podcast, episode 301.

I am David McRaney. This is the You Are Not So Smart podcast, and this is part two in a two-part series about cognitive dissonance. And

I could do a 500-part series about cognitive dissonance. We could just start a whole new podcast that was only about cognitive dissonance if we wanted to. It's a very old idea in psychology, about 70 years old, and there are lots of tributaries and offshoots from the original research. The last episode, we discussed the sort of inception of all of this. And in this episode, I want to talk about...

the landmark study that came out of the stuff we were talking about in the previous episode. And let's just get started. Let's do that. I want to begin by telling you about one of my favorite studies ever about anything, like in all of psychology and all of science, really. And it just so happens to be one of the original landmark studies into cognitive dissonance. So to set the stage for

Let's go back to 1959 to Stanford University, where the psychologist Leon Festinger is about to change the way we think and feel about the way we think and feel. Officially, the authors of the study in which this experiment appears are Leon Festinger and Merrill Carlsmith.

We met Festinger in the previous episode of this show and joined him as he and his team infiltrated a doomsday cult to observe their behavior after the day of doom came and went. Not only did most of the cult members not leave the cult after encountering pretty strong evidence they had been wrong all along, but

For most, their convictions and beliefs and attitudes and loyalties grew even stronger. Festinger coined the term cognitive dissonance to describe the stress that people within that cult felt in the presence of such striking disconfirmatory evidence. And he remarked that the dissonance in this instance was so strong that

that they chose to not admit they were wrong at all. They changed their minds about what was real, about what was true, about what was rational and logical and reasonable, instead of simply admitting that they had been mistaken. Festinger wrote a book about all of this. He became a world-famous psychologist. He joined the faculty of Stanford, wrote a second book about how he thought dissonance likely worked. And then, after all that,

decided it would probably be a great idea to design some controlled laboratory studies to quantify and measure and test his hypotheses. After Festinger did his very weird and wonderful study into an alien cult, he needed to create some lab studies that could measure roughly the same kind of psychological phenomena. That is the voice of Dr. Sarah Stein-Lubrano. She is a political scientist and theorist and

academic who studies how cognitive dissonance affects all sorts of political behavior. She's also the co-host of a podcast about activism called What Do We Want? And she wrote a book that's coming out in May of 2025 titled Don't Talk About Politics, which is about how to talk about politics without

talking about politics. So, yeah, I look at the intersection of how people change their minds or don't a lot of the time and what that means for our political life on planet Earth. Okay, enough bona fides. Back to Festinger. He is riding high on the cult infiltration study and the resulting book and fame. He's now at Stanford and he has designed a lab experiment to study a phenomenon he observed in the wild. He...

designed this experiment with Merrill Carlsmith, an undergraduate at the time who actually came up with the original idea for all of this. Carlsmith would go on to become an influential psychologist, but he isn't one yet. And here is how the study worked. They got 71 psychology students at Stanford to sign up for a project called Measures of Performance, which they told them was part of a project that

the psychology department would be conducting to improve the quality of all their psychological experiments in the future. None of that was true, but it allowed them to tell the students that they needed everyone to be completely frank and totally honest in the interviews that would come after the experiment. So one at a time, a student would arrive and then get escorted into an office holding area where

they waited for their turn to be asked to step inside the official experiment room, where they would learn what they would be doing for the next hour. Ask them to do an incredibly boring task. I think it was pretty much literally taking wooden knobs and turning them like 90 degrees or 180 degrees. Yeah, that was...

Task two. First, they sat down in front of a tray filled with 12 wooden spools, and they were then instructed to empty the tray with one hand, one spool at a time, refill it one spool at a time, and repeat that for 30 minutes or so.

while, quote, working at their own pace, end quote. Meanwhile, the experimenter, the scientist, sat nearby with a stopwatch and pretended to take notes. Then, after half an hour, the experimenter removed the tray and replaced it with those square pegs, a horizontal board on which 48 square pegs had been mounted.

The new task was to rotate each peg one quarter turn until they'd rotated them all, then repeat all of that over and over for half an hour at their own pace while the experimenter watched. And if this sounds excruciatingly boring and mind-numbingly repetitive, that was the idea. Festinger and Carlsmith hoped by the end of all of this that the participants would hate

what they were doing, would regret they'd signed up for this, and would not wish this on anyone. And by the way, you can watch this study. They have videos and those are online. So if you Google this, there are videos of people doing this from back in the day with wild 50s haircuts and so on. So they do this task. It's very boring. And then something very clever happens.

As with most psych studies, there's a lie in the study because that's how you make sure the person doesn't realize what it is you're actually measuring. So you lie about what the study is about. And I love this part, the lie part, because at the end of the hour, the experimenter in the room with the student leans back and lights a cigarette, like clink. And then through a cloud of smoke tells the student what the real study is about.

And the researcher wearing a lab coat and looking very professional says, you know, oh, thank you so much for doing this study.

Actually, what we're trying to measure, just so you know, is we're trying to measure if we tell someone a task is interesting, will they find it more interesting? And obviously, I didn't tell you that the task was going to be interesting or boring. But another student who's coming next is going to be told the task is really interesting. But I have a problem. And the student's like, OK. And the researcher says, I have a problem. My colleague who's supposed to tell the next student that this task is really interesting, he just didn't show up for work today.

And the student goes, oh, okay. And the researcher says, could you help me? Do you think actually, just to solve this problem that I'm having, if I pay you, will you go into the hallway and tell the next student that the task is really interesting so we can continue this study? And most of the students said yes, but they were paid different amounts of money. This established the conditions of the experiment. From here on,

They created three groups. There's the control, but most crucially, there is the $1 group, which is about $9 in today's money, and a $20 group, about $190 in today's money. The control group proceeded straight to the interview, but the $1 and $20 groups go back into the waiting area and

where another student is waiting and they tell that student, Wow, I just did this research study and it was really interesting. To reiterate, the students spend an hour performing a very, very boring task. And then they are asked if they would mind helping the psychology department by telling the next person waiting to do this task that it is actually quite fun and interesting and not boring.

They also tell these participants, these subjects in this study, that what they're studying are expectations. And the person who is supposed to lie and say all this couldn't make it today. Also, they are told if they agree, they will be officially hired and might be called back sometime later to help with other stuff. So all of this is out there. And some of these students agree.

Some students are offered the equivalent of $9 to do this. And some students are offered the equivalent of $190 to do this. That's the only variable. How much are they paid, right? Almost everyone agrees. And then those who have agreed are escorted back into the waiting room where the next student

who is, of course, not actually a student, but is in fact the lab assistant disguised as a student. And they tell this pretend next student, who they believe will be the next person doing that very boring experiment: Oh, yeah, I just did it. It was so interesting. They lie. They lie. They tell the next student it will be fun, even though they know it will not be fun. And like Dr. Stein-Lubrano said, this supposed next student they're telling all this to is actually in on the experiment.

And when the experimenter leaves the room saying they'll be back in about two minutes, the fake student waits until the participant tells them the experiment was interesting. And then she tells them she's surprised to hear that because one of her friends who had done this experiment already told her it was terrible. And in response, nearly all the students tell her no.

It's not terrible. It's actually very fun. And she is going to enjoy it. However, two did tell the truth at this point. And one got her number so he could tell her the truth after. At least that's what he said. Also, three rejected the money during the "light a cigarette, can you help us" part of the experiment. And all of those students who were just not playing along, they were excluded from the final analysis. But the rest, 65 in all,

They lied. And when the experimenter returned after leading the fake student into the experiment room, those participants went down the hall for their post-experiment interviews. And here, each student was asked to honestly rate how fun the study they had just participated in really was, on a scale from negative five to positive five.

And then they asked each participant what they felt the study was actually about and if they suspected anything. And at that point, five students said, is this about lying? And so they got excluded as well.
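All of that bookkeeping is easy to lose track of, so here is a quick tally of the participant pool using the counts as stated in this episode (the grouping of the exclusions is my own bookkeeping, not the paper's):

```python
# Participant bookkeeping, with the counts as stated in the episode.
signed_up = 71           # Stanford psychology students recruited

refused_money = 3        # rejected the bribe outright
told_truth = 2           # admitted to the "next student" it was boring
got_her_number = 1       # planned to tell her the truth afterward

lied = signed_up - refused_money - told_truth - got_her_number
suspected_deception = 5  # asked "is this about lying?" in the interview

analyzed = lied - suspected_deception

print(lied)      # 65, matching "the rest, 65 in all"
print(analyzed)  # 60 participants left in the final analysis
```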

So that leaves us with a pool of people who believed that they were actually helping the psychology department and they were actually lying about something that they knew that they were lying about. And some of them were doing it for $1, which is $9 in today's money.

And some of them were doing it for $20, which is $190 in today's money. In a couple of weeks, the same students are asked in a survey, how interesting was the task really? And what's interesting is that the students who had to lie for only $1 have a very different result than the students that had to lie for $20. The students that lied for $20 are like, that task was incredibly dull. Because of course it was. It was a really dull task.
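Those dollar conversions are simple multiplication; here is a minimal sketch, assuming a cumulative inflation multiplier of roughly 9.5× from 1959 to today (a round hypothetical figure; the exact factor depends on the CPI series and target year):

```python
# Convert 1959 dollars to rough present-day dollars.
CPI_MULTIPLIER = 9.5  # assumed cumulative inflation factor since 1959

def adjust(dollars_1959: float) -> float:
    """Return an approximate present-day value of a 1959 dollar amount."""
    return round(dollars_1959 * CPI_MULTIPLIER, 2)

print(adjust(1))   # 9.5   -> the "$1 group" is about $9-10 today
print(adjust(20))  # 190.0 -> the "$20 group" is about $190 today
```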

And the students that were paid only $1 seem to think the study is interesting. And that is what is incredibly weird, right? They went out and lied for not a huge amount of money. And they now believe their own lie. And they really do seem to. And this is where I really recommend watching the YouTube videos. Because there are interviews where Festinger or his colleague asks the student like,

You know, some other students have told us that that task was boring and the students are in like shock. They're like, no, it was really fun. You must be talking about another study because the one I did was great. It was a lot of fun. This is what makes this study so incredible because it starkly reveals the strangeness

of how we often deal with the discomfort of cognitive dissonance. Both groups observed themselves saying something they did not believe. Both noticed an inconsistency, an incongruence, a dissonance between their behavior and their beliefs, their experiences and what they said about those experiences. And both groups not only noticed this inconsistency, but

They had to contend with the fact that they had accepted a bribe. A bribe to do something that might be considered by others as morally questionable. But their reactions to all of that differed only depending on the size of that bribe.

When they asked the students paid the equivalent of $190 how they actually felt about the boring tasks, those students said, yeah, the tasks were really, really boring. I hated it. I do not recommend it. When they asked the students who had been paid the equivalent of $9 how they felt about the boring tasks, they said, actually, in all honesty, I loved it.

It was fun. It was not boring. They recommended it. They would have done the tasks for nothing and would happily participate again. And here is the thing that really blows my mind about this study. They really did feel that way. After lying about how they felt, they then changed how they felt so that it would not be a lie.

No part of them, nothing inside them was telling them that they were fibbing, that this was a sham, this was a charade. None of it. From that point forward, it was fun to them. They changed their own minds to reduce the dissonance created by

the experimenters. So the question is, how did this person come to believe that actually the task was interesting? That's a really weird thing to gaslight yourself about, to misappropriate a word. And the answer is that the person is suffering from a cognitive inconsistency, right? They went and lied to the student and told them

That was a really interesting study. And they actually found the study quite boring originally. And they need to reconcile the dissonance they face about the contradiction between their actual belief that the study was boring and their actual action, which was saying that it was interesting. And unlike the student that was paid a lot of money, who can reconcile this by saying, well, but I was paid a lot of money,

So this is consistent with the fact that it was boring, but I was willing to because I was paid a lot of money. The student who wasn't paid very much doesn't really have a good rationale for having lied to the student in the hallway. And they appear to shift their beliefs to tolerate this contradiction. They're like, oh gosh, I lied to that student and it wasn't for very much money, but actually it wasn't that boring. So it's not that much of a lie, right? They're doing a rationalization so that they don't have to live with the discomfort of a contradiction between their beliefs and their actions.

This is a very repeatable finding and a very weird study, but I actually think it works really well for explaining just how funky this is. And notice that there are all kinds of elements at play in it that come up again and again in dissonance studies after this. There's your sense of yourself as a good person that's at stake. There's a sense of yourself as having actually chosen the thing that you did, even though you were really compelled to do it.
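For reference, the mean task-enjoyment ratings usually quoted for Festinger and Carlsmith (1959), on that −5 to +5 scale, look roughly like this. I'm citing the figures from memory, so treat them as approximate rather than exact; the shape of the result, not the decimals, is the point:

```python
# Approximate mean "how enjoyable was the task?" ratings (-5..+5 scale)
# commonly quoted for Festinger & Carlsmith (1959). Values from memory,
# so treat as illustrative.
mean_rating = {
    "control": -0.45,  # did the task, never lied about it
    "$1":      +1.35,  # lied for a small bribe
    "$20":     -0.05,  # lied for a large bribe
}

# The insufficient-justification signature: only the $1 group came to
# rate the dull task as genuinely enjoyable.
assert mean_rating["$1"] > mean_rating["$20"] > mean_rating["control"]
```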

And that will fit into a lot of other things that are worth saying about dissonance and the way it features in our life, that a lot of the time we feel dissonance about actions we kind of had to undertake but don't really feel comfortable with. And the way that we manage that is by deciding that in the end, we definitely wanted this thing and chose it for sure. Yes, this is... I'm a good person. I don't take bribes to do bad things.

I don't do boring tasks. I would have quit if that had been boring. I don't just lie to people; that would be heinous. I'm doing something for the betterment of humanity. And there are all these opportunities for you to see the truth of what is happening and possibly benefit from being honest with yourself. But people will variably do that depending on how much money you give them, because the story you tell yourself is different and,

Why did I do that? Because I was paid. But even then, that's going to divide people into different groups. Some people are going to be like, I'm not okay with the fact that I did that because I was paid well to do that. I wouldn't do that no matter how much you would pay me and so forth and so on. It gets complicated very quickly. But the fact that this is introspection that you're not aware is taking place to the point that you actually do now truly believe that that was not a boring task is very...

freaky, upsetting, and weird to me. And always has been. I'm making up this story about who I am and why I am at all times, and I'm unaware that I'm doing it. And a good portion of it is going to be fiction for the sake of not thinking I'm an inconsistent bad person. Sarah, what are we supposed to do with that? What are we supposed to do with knowing that about ourselves?

Okay, well, look, I think it's very valid, as we like to say in pop culture discourse, to feel uncomfortable about this aspect of human psychology, and we should. And something I'm very interested in lately is a bunch of studies by a wonderful researcher who I've interviewed called Kristin Laurin. And she looks at what happens when people are faced with limitations on their actions and how quickly they do a really similar thing to the people in the study we just talked about. So in the study we just talked about, people...

were kind of gently coerced and or bribed into lying. And then they came in some cases to convince themselves, oh, I actually didn't lie. I totally believed in that thing in the first place, right? And Laurin's work looks at what happens when people don't like a restriction that is placed on them, but they think it's unavoidable, and how quickly they decide that actually they always liked this restriction. So she looks at two examples. She looks at a plastic water bottle ban in San Francisco, and she looks at smoking restrictions in restaurants in Ontario.

And she found that the very same people would have extremely different beliefs about their preferences before and after those bans. So many people before a smoking ban would say, this is, you know, government interference and overreach. I don't want to be limited this way. Or same with the water

bottles, right? Like this is a huge pain. And then the day that the ban comes into effect, you can ask exactly the same people. And a lot of them are like, oh, I like this ban. And I've never really smoked that much in restaurants anyway. And they actually don't remember that they used to not like this ban. And they also have misremembered how much they were smoking. Of course, you can imagine a lot of scenarios where these kinds of restrictions are a lot more authoritarian or harmful. You can see that the same mechanism at work probably lets people justify

adhering to really awful despotic laws and doing terrible things to other people. So we should be alarmed, and I want to validate that. And I also want to say that at the same time, I always look for this sort of utopian kernel in this as well, which is to say that actually, if human beings were really little sentient robots...

who were just responding to incentives and disincentives about their material interests. If all we cared about was, am I getting paid well and am I getting laid and am I getting to maybe have a fun time? I'm not that interested in that human subject. That's kind of boring and it's pathetic and it's kind of not that interesting morally. But a human subject that is desperate to tell a story about itself where it's a good

creature that does good in the world, and that keeps track of its sense of self and wants to think about itself as an active, good person who can do something in the world. I'm interested in that subject, and I find it reassuring that that is even possible for human beings, even though it can be used against us to make us do terrible things as well. It gives me a little bit of hope in a time of real political despair: that human beings care about the kind of creature that they are and the kind of force that they are in the world, and that they want that to be for good.

We'll be right back after this break.

And now we return to our program. I'm David McCraney. This is the You Are Not So Smart podcast. And we just talked a whole lot about cognitive dissonance.

I'm going to pick the conversation back up again with Dr. Sarah Stein-Lubrano. But first, a brief recap and a brief summary of where we are in this dissonance theory thing. First, Festinger.

Festinger infiltrated a cult, then he wrote a book about it, then he wrote a book about cognitive dissonance, and then he conducted the famous dissonance study we were just talking about before the break. So, strange order, but it can't be overstated just how revolutionary the Festinger-Carlsmith boring-tasks study was. It's still the subject of replication, meta-studies, and retrospective analyses, and

All of these things are still a thing. For 70 years, we have been experiencing dissonance concerning the fact that cognitive dissonance can have such an impact on human thoughts, feelings, and behaviors. We now know you can create a situation that will generate dissonance, and that dissonance will then compel people to sometimes change their beliefs.

You can manipulate a person's environment and get them to tell a story about themselves that will alter their attitudes or their values, their opinions, their actions, their intentions to act, and so on. And they will do it. They will change themselves. They will rewrite the truth of who they are. Okay, so what is the definition of cognitive dissonance? Let's

have Dr. Sarah Stein-Lubrano answer that. So usually the phrase cognitive dissonance is used colloquially to tell people on the internet that they are idiots.

But that's not what it means to people who study it. And in particular, psychology uses this term to refer to the discomfort that we feel, often unconsciously, when we're faced with a contradiction between two or more of our beliefs or actions.

So what that means is that we might notice, I don't know, that we want something, but we also want another thing that's incompatible with it, and we feel a tension about that, and then we erase that tension, sometimes unconsciously, by choosing one and devaluing the other. I'm using this example first because it's not about hypocrisy, and I want to distinguish. It's not always hypocrisy. Sometimes it's just ambivalence. Ambivalence about not being able to reconcile different parts of our belief system, or our actions and our beliefs.

Lots of instances of dissonance also are about hypocrisy. We might believe one thing but do another. We might believe in climate change but fly to the Maldives. We might know smoking is bad for us but pick up another cigarette. And we largely experience dissonance on those occasions as well. So it's discomfort about a dissonance between our beliefs and our actions, or two of our beliefs, or two of our actions. Unconscious is an important thing there as well.

Most people experiencing dissonance don't appear to be conscious of what is happening for them. And what happens a lot of the time is that we find clever ways to get rid of the discomfort without noticing that we've done that. So the two most common ways that we deal with dissonance, discomfort as human beings are usually either we rationalize it.

The way I explain a rationalization is: it's when you give a series of reasons for something that are not the real reasons you did that thing. If you decide that the person who dumped you was always a terrible person and you should never, ever have met them, but actually you're just grumpy because you wish that you were still dating, that's a rationalization. It's not the real reason you think they suck, but you're not willing to admit to yourself even that that's not the reason.

right? Or if you are, you know, not going on a run and you tell yourself it's because it's rainy outside and you might fall and slip, that's a rationalization. That's not the real reason you didn't go on a run. You're just being lazy today, but it's easier to find a rationalization. So when we face dissonance, when we encounter a contradiction between our beliefs and our actions, we might come up with a rationalization like,

It's fine that I'm flying to the Maldives because the plane is going to take off anyway, right? So yeah, it's describing kind of human beings' discomfort with contradiction and ambiguity in their own worldview and with their sense of self and their sense of themselves as good people. Something that we haven't covered at all for some reason is that dissonance theory emerged in the 1950s and into the 1960s right as psychology was going through a sort of

punk response to the stodgy lab coat behaviorism of the 30s and 40s, which had no real interest in introspection. It was all conditioning back then. Watson and Pavlov and Skinner, they had this picture of humans as basically simple animals, easily trained via rewards and punishments.

And instead of seeing brains as bags of chemicals passively responding to the external environment,

Cognitive psychology emerged in the 1950s as a way of seeing the brain as actively involved in the construction of knowledge and meaning, actively organizing and integrating information, actively generating schemas and priors and assumptions and memories, actively on purpose, knowingly curating concepts and consciously perceiving, interpreting and categorizing the world.

Cognitive psychology said we did a lot of this within purely contemplative spaces, in our imaginations, while worrying and thinking and ruminating, in simulated internal worlds that we use to imagine potential futures and the outcomes of our as-yet-uncommitted acts, the results of our as-yet-undecided decisions.

When cognitive dissonance theory was first published, behaviorists were arguably the leading group of psychology researchers in America. They were very interested in this idea of the human being, and indeed of the animal in general, as a creature that seeks material benefits and responds to rewards, and also, of course, is disincentivized by punishment.

To be clear, every living organism does do this to some significant degree. If I pay you a certain amount of money, you will do many things. If I am continuously rude to you, you will probably avoid me. There are lots of instances where behaviorism can, somewhat accurately anyway, summarize how human beings or even other animals respond to incentives and disincentives. But it turns out that that is only one model for how human beings behave.

and that there are other systems, let's say, in our psychology that can override that. And cognitive dissonance theory is one of the descriptions we have for when that very basic theory about human beings responding to rational incentives or not falls apart. That there are times, maybe only very specific times in a human being's life or particular areas of cognition, where we are not calculating that way and we are not responding to what is in our rational interests. And as you say, it's because we are developing a story,

for lack of a better word, about ourselves. Or actually, I like to think about it more in terms of a map. So, you know, a story might be about the past, but a lot of what Dissonance is responding to, I would argue, is actually about our sense of who we are in the world and what we could possibly do in the future.

And that's probably why it exists, why we have dissonance at all. If we are facing too many contradictions about what we think the world is like and what we think we are like, we won't know how to act anymore, right? We won't know, okay, is it good or bad that I'm a Democrat or Republican, a feminist, a Christian, a Jew or whatever? If we face a contradiction in our sense of self that is quite profound or in our sense of how the world actually operates, that

more or less prevents us from knowing how we should take action. And I would say that while you can only ever theorize ultimately about why something evolved in the human subject, this is a pretty good theory for why we would have a system like this that forces us back into cognitive consistency rapidly, because otherwise we might never know how we want to act next. And also, because we are such interpersonal animals, and we're such sort of

You know, we don't just live in the present as human beings, we have these long-term projects, we collaborate with other people, we build shared systems of narrative together that structure how we collaborate. We're probably driven to have meaning like this so we can have a shared sense of meaning. Some evolutionary theorists, I think Sperber and Mercier, right? They talk about this, that like a lot of what human beings do when they engage in reasoning isn't so much about trying to find a fundamental true fact.

It's about trying to have a shared set of reasons we can give each other so we can keep collaborating. And dissonance helps us do that also. It helps us stick to something that feels consistent enough that we could communicate it to anybody else. Dissonance theory is something that's been evolving ever since that original

research, the forced compliance experiments that we were talking about before the break. That was just the beginning. We've done lots of research since, but those studies are called the forced compliance experiments. Those are the ones in which a person is compelled to say or do something counter to their beliefs, attitudes, or values.

Well, we learned from those that the weaker the external justifications for compliance, that is, the fewer consonant cognitions and or the more dissonant cognitions that compliance generates, the more likely a person will produce an internal justification.

But we now know dissonance can be generated in many other ways. One is called effort justification. Researchers have found that if you engage in a painful initiation ritual or a pointless or laborious work project or go on an expensive or terrible vacation, there's a high likelihood you will rationalize what you've gone through and see it as well worth your time instead of admitting the pain, harm, and waste. And you'll truly believe it.

You'll even become defensive about it. That's effort justification, a form of dissonance reduction where people justify their efforts by inflating the value of the outcome of those efforts. Then there's post-decision dissonance and studies where people are asked to choose between two equally appealing products. After making their choice, people will rate the chosen item as being much better than the rejected item. That's true in all sorts of other similar situations.

we will enhance our commitment to a choice to reduce any discomfort after making a difficult decision. There's also something called the hypocrisy paradigm.

In one such study, participants were encouraged to advocate for condom use. And when they were reminded of the times they had not used condoms, it created dissonance. This hypocrisy induction, as they call it, led to a higher likelihood of future condom use. And it illustrated how dissonance from perceived hypocrisy can influence future actions to reduce feeling like a hypocrite.

We also know now that people must feel like they have a choice in the matter, whether or not they really did. Otherwise, they just won't feel very much dissonance about doing or saying something that runs counter to what they think, feel, or believe. They can always just blame it on the coercion. They didn't actually choose to do that. They can't be blamed for it.

In studies in which people are paid a little or a lot to write an essay that runs counter to their attitudes, if they don't feel like they had a choice to opt out, then the greater the reward, the more they'll adjust their attitudes to match the essay. If they do get a chance to opt out, though, then the opposite is true: the less the money, the more the attitude change, just like the Festinger experiment. And one of the most important findings since the early days of dissonance research is the fact that

People tend to actively seek situations that provide consonance and actively avoid situations in which they might experience dissonance. And we will do both without realizing we're doing either, whether that's avoiding cable news channels that might threaten our attitudes or spending time with people who will likely praise all our decisions.

We don't just respond to cognitive dissonance after the fact. We actively manipulate our environment to optimize for it before it might happen. When we have studied cognitive dissonance by trying to get down into the neurophysiology of what's actually going on, when we notice dissonance between our attitudes and our actions, our beliefs and our experiences, our current understanding and some disconfirmatory evidence, it's the anterior cingulate cortex that seems to be mostly where this is coming from.

The portion of the brain that notices errors, the error detection system, is very active in this regard, but also what becomes active during these moments of strongly felt dissonance are the

aspects of the prefrontal cortex, which is involved in higher-order thinking and decision-making, planning, thinking about what you're going to do next. The dopamine system, the dopaminergic system, which is a system for motivation. And within that system, dopamine affects the feelings that arise when outcomes don't match our expectations. And varying dopamine levels will then motivate us to notice, learn, and adjust our predictions going forward.

Also, upon resolving dissonance, which is to say justifying one's behavior, you get a dopamine release, providing a little bit of, hmm, that's nice. Also, the sympathetic nervous system is activated during moments of intense dissonance. This generates increased heart rate, sweating, anxiety, and so on. This is a lot of like slapping you around from the inside to get you to pay attention. But most of all, it's the anterior cingulate cortex, the part of the brain

that plays the most crucial role in error detection, emotional regulation, and cognitive control. So yes, cognitive dissonance is a real thing. It is a bodily thing. It is a physiological reaction. The dissonance reduction behavior witnessed in that boring task study, the one with the spools and the lying, today, that's called the insufficient justification effect.

Without a sufficient extrinsic justification, people will create an internal one. And we now know that there is an over-justification effect as well. In studies where people are told they will be greatly rewarded if they choose to engage in activities that they already enjoy doing, those people will, over time, report enjoying those activities less. When tasked with explaining themselves,

the justifications no longer seem intrinsic. The answer to why did I do this becomes because I got paid, not because I think this is fun. We're the unreliable narrator in the story of our own lives, right? And something about that has always messed with me because it leads to the next question, which is,

Why are we doing this? I know this is going to be speculation and we don't understand the mind and the brain this well, even at this point in our history trying to make sense of it. But why wouldn't it be better to pursue raw accuracy when it comes to fact-based stuff? I'm going to pull that bullet point aside first. Why not try to, when you notice you're wrong, attempt to...

admit that you're wrong and then be right. Factually speaking, evidence-based, what's up with this? Why would that not be installed into the adaptive functions of the brain? So I want to point out, first of all, that because we've run so many studies, we know that dissonance actually doesn't happen for

all factual information corrections. And actually, I would argue it only happens for a very specific set of them. If you and I are having a conversation about whether it's raining outside, and you're like, it's raining, and I'm like, it's not raining, and you go to the window and you see that it's whatever, you will adjust probably to the reality of whether it's raining. And more broadly, we tend to revise our beliefs a lot in favor of what you might call like Bayesian reasoning or, you know, lots of things we're very capable of adjusting our beliefs about

There's just a specific genre of things that we're not very good at adjusting our beliefs about. Okay, this is perfect. This is perfect. What is this genre? Help me understand the genre where this is going to be more likely.

If we had to summarize all this research, over time, as researchers have done more and more studies, they found that, okay, we only find cognitive dissonance, rationalizations, and confirmation bias and other evidence that this is happening in specific circumstances. And that's usually when people's sense of self is under threat in some way, and or their sense of themselves as good agents. So doing things in the world that have good or bad outcomes. And I could run you through tons of different

ways that they measure this. But a good example, for the thing about our actions, is that often, if people are told, oh, that was actually just an experiment, the letter you just wrote telling, you know, the university to change its policy is now going to be thrown in the trash, they don't adjust their beliefs, because it's not a real action, it doesn't have any negative consequences, right? So they stop feeling dissonance, because ultimately their sense of self isn't thrown into question that much anymore, and their actions in the world will have no effect.

And so it seems like dissonance is happening only around issues that either make us feel like we are bad people or make us feel like our actions in the world, which have had a consequence, are causing a contradiction. And again, that's actually a relatively small part of our lives. Like most of the time you and I, Sarah and David, are wandering around, you know, discovering for real whether the coffee is hot and whether we should have repaired the other way and like whether parallel parking is

possible on this street. We adjust our beliefs all the time, but we don't adjust our beliefs about things that affect our sense of self and our sense of agency. And unfortunately, there are some very important issues where our sense of self and agency are kind of always involved. And politics is like the big one. It's probably like that and religion and, you know, like very difficult interpersonal conflicts you've had. And in those cases, we're going to have dissonance basically all the time. Festinger created this term by demonstrating the

scenarios in which it can lead to really terrible outcomes, which is a group of people who, like, destroyed their lives thinking that some aliens were going to come pick them up and take them off of Earth because of a flood. And then when it didn't happen, they doubled down, tripled down, and ruined their lives even further. They had opportunities to update and go, oh, well, I guess I was wrong about that, and

this person is manipulating me. And they had chances to say, all right, look, okay, maybe I wasn't completely stupid to have done what I've done up till now, but to keep doing it is very stupid. And yet they doubled down, tripled down. So clearly, even if this is usually adaptive, there are times when this is bad. It's not something we should do. Yes.

That's right. And you know, you can see the sort of tragic nature of humanity in that, in a way. But yes, I mean, the fact that we've evolved a certain way, as we know from every other sort of evo-psych study, doesn't always mean that it benefits the individual. And actually, we also are now living in very different circumstances than we evolved in. Right.

So we're both sometimes just situationally screwed over by cognitive dissonance. But also, more broadly, I would say that our brains are not well adapted to, like, the modern news environments, right? It was probably much easier to live with your cognitive dissonance when it essentially just allowed you to

get along with your neighbors or at least collaborate with a couple of them against the other ones or whatever. And it's pretty poorly suited to a constant barrage of information and misinformation that challenges your sense of self all the time and cognitively overwhelms you and disorients you in the world where it's not even clear about a lot of political issues, what we could even do. Right. And I think that's actually a big struggle as well. It's one of the reasons in my book I talk about

Not so much presenting people with constant arguments, but giving them options to change how they actually operate day to day. So if you want someone to believe in climate change, one of the best things you can do is give them something they can do about climate change that makes them feel like a good person. Okay, so what do we do with this knowledge? How do we actively manipulate people using this information?

Toward goals that we assume may be good. You were talking about climate change. Like, how do you encourage someone to do things that prevent the world from ending? And you were saying, encourage them in a way where they feel good for doing the right thing or feel good for doing the thing.

Don't let me put it in my own words. I only want to hear your words here. This is really cool stuff. I like this angle. So let's switch to this, which is, okay, with all this in mind, how do we use this in some way or another? Instead of just letting it play out and then write cool think pieces and sub stacks about it, how can we actively use dissonance theory to adjust things?

and actions and so forth and so on. Give me some ideas here. Right, I will, yes. I mean, look, I think

All of us in a certain way are stuck in a certain kind of liberal ideology. And by that, I don't mean liberal like Kamala Harris. I mean liberal like the tradition of liberalism that started in the 16th century vaguely, right? By that, I mean a system that has a legal system that defends private property, a system that thinks about people as individuals primarily, first and foremost, rather than families or communities. There are certain cultural aspects of liberalism

that have very much filtered into the way we think about everything, even the way we do psych studies for that matter. And one of the downstream effects of that shift that we've experienced in the West is that we think that we are rational agents who can just change our minds by having discussion. And if you think about it, that's like how our legal system is set up and how our parliaments are set up. We think political change or like even personal change is like we have a discussion and then we have a new opinion and that's how change happens.

And my book is all about how that's mostly not true. That's not how historical change happens. It's not how we change our minds as individuals, right? But that doesn't mean that people don't change their minds about hugely important political issues. It's just that they change their minds when they are faced with new action possibilities that they could really take or new relationships that they could have. And they are faced with them in a way that allows them to articulate their ambivalence, to grapple with it, and ultimately to choose a different way of looking at the problem

so that they can go out in the world and behave differently. And actually, your book is one of the ones that's helped me think through this, right? Why does deep canvassing, which you write about in one of your books, work? It works because in a way, the person there is articulating their ambivalence, so the cognitive dissonance is made somewhat conscious, even if it's not a term they're familiar with. But also, they're building a new relationship with the person on the doorstep who's told them an important story about their own life, right? And often, they're being given the opportunity to engage in an action

even if it's just voting in a referendum, that would allow them to be a good person, even if they change their mind. So they might be faced with information about, I don't know, a ban on trans people using the bathroom of their choice. And then they're given the opportunity to learn a new thing and then engage in an action that lets them be a good person and vote in defense of these rights. And by the way, there is climate canvassing as well now. So people do deep canvassing, this technique, these long-form conversations now.

And that is pretty effective as well in the studies we have about it. So I guess to bring it all the way back, my suggestion is that the really effective forms of political action that we can take right now partially involve, there are other things we can do too, but partially involve giving people the opportunity to learn about a new action they can take in the world that will let them hold on to their sense of self as a good person, as something they can do next. And actually you can really see this in the climate research I've looked at. So

For example, the number one predictor of whether people will do something that's climate friendly in their life, like whether they'll install a heat pump in their house, doesn't have anything to do with whether they're given arguments for it. And interestingly, it doesn't even matter whether they're given financial incentives for it as much as it matters whether their friends are doing it.

Right? Which sort of aligns with some of these findings. Similarly, people are much more likely to, you know, change their minds about gay people if they discover that they know one. And suddenly, they need to align either their beliefs with their actions and defriend this person, or, and often they choose to align their beliefs with their actions, by which I mean they choose to remain friends with that person and shift their beliefs on homosexuality. That's a very consistent finding. So what we would learn from this, in my opinion, is that if we give people

opportunities to try new ways of living or form new relationships, they are much more likely to change their mind than if we just give them a bunch of arguments.

And I want to point out that we're actually living in a low point in American society in particular for a lot of these opportunities and relationships. Americans have fewer friends than they did 20 or 50 years ago. They spend less time with their friends. We are actually resegregating in a lot of different ways, not just racially, but often economically. Millennials in particular, but other people as well, are moving away from city centers because of a lack of affordable housing. So we're actually forming fewer and fewer relationships with our neighbors in a lot of cases.

We're at a very low point for social capital, to use a sociology term. And that's actually quite frightening to me as a political theorist, because it means we're probably much more cognitively rigid and isolated. So I guess the point is that I think the number one thing we can do, including and especially in the next four years, is try to create spaces and affordable opportunities for people to mix with people not like themselves and to try out new ways of living, even if that's just, you know, installing a solar panel or helping a refugee. And all the arguments down the road for who we should vote for, what we should believe, are in some ways secondary to the options we give people for how they live their lives.

That is it for this episode of the You Are Not So Smart podcast. For links to everything we talked about, head to youarenotsosmart.com or check out the show notes right there in your podcast player. You can find my book, How Minds Change, wherever they put books on shelves and ship them in trucks. Details are at davidmccraney.com and I'll put links to all sorts of things related to that right there in your podcast player.

For all the past episodes of this podcast, go to Stitcher, SoundCloud, Apple Podcasts, Amazon Music, Audible, Spotify, or youarenotsosmart.com. Follow me on Twitter and Threads and Instagram at David McCraney. Follow the show at Not Smart Blog. We're also on Facebook slash youarenotsosmart. And if you'd like to support this one-person operation, go to patreon.com slash youarenotsosmart. Pitching in at any amount gets you the show ad-free, but the higher amounts get you posters, t-shirts, signed books, and other stuff. The opening music, that's Clash by Caravan Palace. And if you really want to support this show, just tell somebody about it. Share it somewhere. If there was an episode that really meant something to you, share that episode and check back in about two weeks for a fresh new podcast.