#112 Adam Grant: Rethinking Your Position

2021/6/1

The Knowledge Project with Shane Parrish

People
Adam Grant
Shane Parrish: Founder and CEO, focused on cybersecurity, investing, and knowledge sharing.
Topics
Adam Grant: The book explores the importance of rethinking and how to change your own views and those of others. Grant argues that anyone who voices an opinion publicly has a responsibility to stay open-minded and be willing to change that opinion when faced with better logic or stronger data. People resist rethinking because it makes the world feel more unpredictable and makes them feel they are no longer experts, which threatens their identity and status. He suggests tying identity to values rather than beliefs, so that we avoid clinging to outdated or even harmful practices. In decision-making, we should emphasize process accountability rather than focusing only on outcome accountability. In hiring, we should define the evaluation criteria first and then assess candidates against them, rather than looking at candidates before setting standards. Leaders who openly criticize themselves build psychological safety more effectively than those who merely seek feedback. People's mental modes fall into four types, preacher, prosecutor, politician, and scientist; we should avoid the first three and instead think like scientists. Effective communication requires finding common ground and asking questions that prompt the other person to think, and "how" questions work better than "why" questions. When facing feigned knowledge, stay curious and use questions to guide the other person's thinking rather than rebutting directly. The goal of communication should be learning and growth, not winning. Shane Parrish: In conversation with Adam Grant, Parrish explores the challenges and methods of rethinking and how to build psychological safety at both the individual and organizational level. He emphasizes the importance of focusing on outcomes and the need to balance openness with decisiveness in decision-making. He also discusses how to build psychological safety in organizations and how to guide children and students to rethink. Parrish notes that people often prefer leaders who appear tough and confident, even when they may not be correspondingly competent. The desire for belonging, and for status within a group, shapes how people think and behave. Effective communication requires finding common ground and asking questions that prompt the other person to think. Feigned knowledge hinders effective communication and decision-making and should be treated with caution. Rethinking is not easy; it takes time and deep thought.


Chapters
Adam Grant discusses his journey to writing about rethinking, influenced by experiences in organizations where leaders and students resisted new evidence and ideas.

Transcript


Yeah, you're entitled to your own opinion if you keep your opinion to yourself. If you decide to say it out loud, then I think you have a responsibility to be open to changing your mind in the face of better logic or stronger data. And so I think if you're willing to voice an opinion, you should also be willing to change that opinion. It goes right back to something we talked about earlier, which is to say, okay, when would I change my mind? If you can't answer that question, you are no longer thinking like a scientist. You have gone into preacher or prosecutor mode.

Welcome to the Knowledge Project Podcast. I'm your host, Shane Parrish.

This podcast is packed with timeless ideas and practical insights to help you get the most out of life and business. If you're listening to this, you're not currently a supporting member. If you'd like special member-only episodes, access before everyone else, transcripts, and other member-only content, you can join at fs.blog. Check out the show notes for a link. Today I'm speaking with Adam Grant.

Adam has spent the past 15 years researching and teaching evidence-based management, and has helped the likes of Google, Pixar, and the NBA re-examine how they can design meaningful jobs, build creative teams, and shape collaborative cultures. Adam's latest book, Think Again, is his best yet.

We're going to talk a lot about thinking, or more specifically, rethinking, attempting to answer the questions of how we update our own views, how we change the views of others, and how we create a community. This podcast is an invitation to let go of the knowledge and opinions that are no longer serving you well, and to tie your identity to flexibility and not consistency. It's time to listen and learn. ♪

The IKEA Business Network is now open for small businesses and entrepreneurs. Join for free today to get access to interior design services to help you make the most of your workspace, employee well-being benefits to help you and your people grow, and amazing discounts on travel, insurance, and IKEA purchases, deliveries, and more. Take your small business to the next level when you sign up for the IKEA Business Network for free today by searching IKEA Business Network.

Can you tell us in your own words how you came to write about rethinking? This is such a hard question to answer because everything I want to say, I'm tempted to rethink. So we're going to have to question a lot of what I say here. I think I really started thinking deliberately about it when...

I just had experience after experience of going into a new organization. And, you know, usually I'd either give a keynote speech or the CEO or founder would reach out for some advising. And I'd start to walk through the evidence on whatever their question was. And more often than not, I'd get answers like, well, that won't work around here. Or that's not how we've always done things. And at some point, I just started saying, hmm.

BlackBerry? Blockbuster? Kodak? Sears? Should I keep going? It was both surprising and mildly annoying to me that the very people who had called me because they thought I could help them rethink their vision, their culture, their strategy, were closed to the best evidence I could find.

I got really curious about why that was. And then I saw a lot of the same thing with my students, where they'd come in completely locked into being investment bankers.

I've had enough students regret that path that I've spotted some of the warning signs that somebody might not find nirvana going down that road. And I'd try to encourage them to reconsider. And I got the same resistance. And any time I see a 21-year-old and a 61-year-old grapple with the same exact problem, I think there's something really important and interesting to explore. So here we are.

Yeah, definitely. We always say it's easy to see that other people should rethink things, like we're looking at them going, oh, you know, that's Blockbuster. They should have rethought that, or they should be rethinking that. It's really hard for us to rethink ourselves. Why is that? There are probably multiple reasons for it. But I think two of the reasons why people are really hesitant to rethink things are one, it makes the world feel much more unpredictable.

You know, if my views aren't fixed, then who am I and how do I navigate a really confusing and often turbulent world? And two, it makes me feel like I am not an expert. And a lot of us take pride in our knowledge. You know, when I think about power, there's a classic framework from French and Raven.

Where they said, look, there's expert power, there's what's called referent power, which is basically being liked and respected. And then there's coercive, reward, and legitimate power. And most of the bases of power that people have in life come from a position that they happen to hold. So my ability to reward you or punish you, my ability to get you to listen to me because I have a role of authority is not something I can carry with me.

And so the knowledge I have is one of the few things that I get to hold on to. And the idea that that might be fragile, it not only questions my identity, it also, I guess, questions my status and my standing in the world, which is something pretty uncomfortable to do. Let's double click on that identity concept, because I think that you sort of said our views are almost tied to our identity and that gets in the way sometimes. This is something I'm always puzzled by. And I feel like I see it in every field. There are professionals in almost every field who

not only are interested in particular ways of being or particular practices, but they actually define themselves by living those practices. You know this already, but when I was writing the book, I started thinking through, you know, how terrible would the world be

If some professions had not rethought some of their convictions. So imagine, for example, that you went to a doctor whose identity was to be a professional lobotomist. That would be extremely dangerous. And yet there was a time when a lot of physicians defined themselves by that method, by that set of tools.

We've seen the same thing with police officers who identify themselves as the kinds of people who would stop and frisk because you never know where a criminal could be. And we know from the evidence that that just had horrendous effects, particularly when it comes to disproportionately arresting and prosecuting people of color.

particularly black people here in the US. You know, we've had teachers and parents who identified with practices that were just highly ineffective and maybe even harmful. And I think that it's dangerous, right? I think that for me, an identity is not about what you believe, it's about what you value. And so I want to have a set of principles. For me, my highest values are generosity, excellence, integrity, and freedom.

And I am completely flexible on the best ways to live those values. And so you might come tomorrow and tell me, you know what? The randomized controlled experiments that you do, the longitudinal studies you do, there's a fatal flaw in them and there's a better way to be helpful and excellent at your job. And I would be skeptical because I believe in science, but I would be open to hearing the idea.

How did you get to that point? Like, how do we convince ourselves to attach our identity to values and not beliefs? Like, that's a tricky path, isn't it? Do you think so? I don't know. Like, how do you? It seems like it is. Otherwise, we'd just be rethinking all the time. I mean, isn't that part of the fun of being human, though? To me, rethinking is code for learning, isn't it?

Well, it is, but we don't update our views very often. That's part of the issue. Our own and others, right? It's hard to see it when we don't do it. It's really easy to see when others don't do it. What is the process by which we update our views? That's a great question. The way that I've landed at thinking about this is to say, look, when you have a belief, you have two options. One is you can subject it to a rethinking cycle. The other is you can fall victim to an overconfidence cycle.

So an overconfidence cycle is something we've all both committed and witnessed probably too many times. But the basic idea is we start by being proud of something that we think we know. And that leads us then to feel a lot of conviction. That kind of launches us into confirmation bias, where we look for information that confirms our expectations, as well as desirability bias, where we look for information that basically reinforces what we want to be true.

And then we see what we expected to see and what we wanted to see, and we get validated, and that only makes us prouder of what we know and less open to rethinking. The rethinking cycle is very much the opposite. It starts for me with intellectual humility, which is about knowing what you don't know. You know, no matter how much of an expert you are in a given field or a given topic, you have a long list of things that you're clueless about.

And being aware of what your ignorance is leads you to doubt your convictions. It makes you curious about what you don't know. And that opens your mind to new discoveries. And then every time you learn something new, it's not this sign that, oh, now I'm an expert. It's this sense that, well, there's so much more to learn, right? And I've made a tiny, tiny dot of progress in a whole universe of knowledge. And I can't wait to see what I learn next.

And so I think one of the things we need to do is we need to give ourselves permission to enter rethinking cycles. And there are a lot of ways to do that we could talk about. But Shane, I'm going to ask you about this because a couple years ago, you wrote a post about how we should have more second thoughts.

And I had literally started writing about that. I think it must have come out around the time that I was writing the Think Again book proposal. And I had proposed a tentative title for this book as Second Thoughts. This is amazing. You're on the exact same wavelength as me. And this is what you do for a living, right? You rethink things. You also ask the Farnam Street community and your whole audience here at the Knowledge Project to rethink a lot of their convictions. So where do you start your rethinking cycles? And how do you know when it's time to enter one?

I think like I've just summed this up as like outcome over ego. And so I usually try to wrap my outcome or wrap my sense of identity or ego in the outcome. And that's something I learned when I was working for the intelligence agency, right? Like it wasn't about me having the best idea. It was like, who's got the best idea? Because that's going to get the best outcome. And then

You sort of grow up in an environment where that becomes, I would say, the norm by and large. It's hard in a knowledge environment, though, right? Because so much of your worth is tied up in it. You want to contribute to something. I think there's a biological need to contribute to something larger than us. And if you're not mechanically making something you can see, there's nothing tangible to what you're producing, then you effectively are a knowledge worker in one way or another, and then you're paid for your judgment. So if your judgment isn't right, what is it?

And then what you do is you force your way, right?

You don't intentionally sabotage other people, but you only look for confirming evidence. You're not open to changing your mind because your sense of identity is tied to being right, because that's how you contribute to the organization. It's interesting, but not at all surprising to me that you really learned this in the intelligence community, because the way you're describing your process of rethinking is exactly what I learned from studying superforecasters.

right, which is they will often come in to making a judgment and say, okay, the only way to have a better shot at being right is to recognize all the places where I'm wrong. And I love this practice in particular that came from one of the super forecasters in the book, Jean-Pierre Beugoms, who, when he forms a tentative opinion, will actually make a list of the conditions under which he would change his mind. And I've actually started doing this over the past few months because I don't want to get locked into something that was, you know, maybe...

Sort of a soothing belief, but ultimately one that's not going to serve me well. I want to come to something you said about BlackBerry and Blockbuster. I was thinking, you know, like, how do you balance this notion of, okay, we don't know everything. There's a lot of uncertainty in what we're doing. I'm open to rethinking it, but I also, I need to take action and I need to do something. And then you have an escalation of commitment, the more action you take, which it becomes harder and harder to rethink. You have these sunk costs building up. You have other sort of escalations happening.

How do you balance those two things between being open and also affording yourself the choices that you need to make to exist in an organization and seize opportunity? I don't know that there's a way to get the best of both worlds in every situation. I do think, though, that you can create conditions that at least increase the probability that you end up both open and decisive at the same time, which is a sort of strange combination. So

For me, that's really about changing the way that we reward people. So in too many organizations, people are basically counted as successful if they get a good result and failed if they get a bad result.

And the problem is it often takes years to find out what the results were. It's very easy for people to persist with a failing project for a long time and convince themselves and everyone else around them that they're on the right path. What a lot of the research on this suggests is that we want to shift to process accountability, not just outcome accountability. And ask people to really think seriously about, okay, how would I know that this is a thorough and thoughtful decision process as opposed to one that's driven purely by whim or intuition?

and ends up being much more shallow. So I drew this little two by two that I've found helpful where I cross the quality of the outcome with the quality of the process. And I think we need to stop rewarding good outcomes with bad processes because that's just luck. That's kind of a boneheaded decision that happened to turn out well.

And we need to start either celebrating or at least normalizing good processes with bad outcomes. Because if you have a very thorough process, let's say, for example, you're going to launch a new product or you're going to even try to reinvent your culture a little bit or you're trying to figure out, you know, should we hire somebody or not?

In all of those decisions, the common ingredient is you don't know what the outcome is going to be a year, two years, five years down the road. What you do know, though, is that there are more systematic, more rigorous ways of evaluating the decision now. And so if you can score yourself on a set of benchmarks around, okay, was my process thorough, then even if the outcome wasn't good, you could say, well, that was an experiment worth running because that's part of how you become a learning organization.
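To make that two-by-two concrete, here is a minimal sketch, not anything Adam describes building, of how you might score decisions on process quality versus outcome quality. The "luck" and "experiment worth running" labels come from his wording above; the other two quadrant names and the function itself are illustrative assumptions.

```python
# Illustrative sketch of the process-vs-outcome two-by-two described above.
# "Luck" and "experiment worth running" are Adam's labels; the other two
# quadrant names are assumptions added for the example.

def classify_decision(good_process: bool, good_outcome: bool) -> str:
    """Place a decision in the quality-of-process x quality-of-outcome grid."""
    if good_process and good_outcome:
        return "earned success: reward the process, not just the result"  # assumed label
    if good_process and not good_outcome:
        return "experiment worth running: normalize or even celebrate it"
    if not good_process and good_outcome:
        return "luck: a boneheaded decision that happened to turn out well"
    return "avoidable failure: fix the process"  # assumed label

# Example: a thorough, well-benchmarked process whose outcome still went badly a year later.
print(classify_decision(good_process=True, good_outcome=False))
```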

And I'm constantly shocked by how few people actually think this way. How do you judge a process? How do you walk through judging that and knowing that the outcome won't happen for years, but also knowing that you need to update the process as you go along to get better and better and incorporate new knowledge? So let's take a specific kind of decision. So let's do a hiring decision since that's easy to work with.

So Shane, if you and I are going to make a hiring decision together and- I'm delegating to you for sure. No, you shouldn't. What you want is for me to weigh in on how to design the process and you want to be the one that implements it. Okay. What I would do is I would start by saying, okay, most basic mistake that people make in these kinds of decisions is they don't consider criteria before looking at candidates, right? So they interview their three people and they start to compare them as opposed to saying, no, I should have an independent standard for the skills and values that I'm trying to select on.

And let's identify those really clearly. Let's not just do those from my opinion. Let's, you know, let's try to build some wisdom from a crowd here. And of course, not all crowds are equally wise. So let's go to people who are knowledgeable about the key dimensions of our culture, the key challenges of the job.

And then once we've built out the criteria we're looking for, the next step is to say, okay, how do we rigorously and comprehensively assess people's standing on those criteria? And, you know, in a lot of cases, there's one interviewer. We know that it's better to go up to three or four empirically. In too many cases also, it's each interviewer's job to make an overall assessment of the candidate, which makes it too easy to decide you like someone, and then confirmation bias and desirability bias are basically driving the process.

So what we do instead is we break this down and we say, okay, Shane, I'm looking for somebody who's a giver, not a taker. Your job when you meet this candidate is to solely assess them on that dimension and come back with your behavioral data on whether they fall more on the selfish or the generous end of that spectrum. And then we have someone else assessing their intellectual humility and curiosity.

We have somebody else who's maybe gauging their levels of integrity. And so nobody has a conviction about whether the overall candidate is good or not. They're building the pieces of the puzzle to say, OK, does this person meet our criteria? And then after that's done, we would then come together and say, OK, now let's make an overall judgment having pooled all of our knowledge. Right. That's a thorough process. And it's very different from how most organizations hire.
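As a rough illustration of the hiring process Adam just walked through, here is a minimal sketch, my own and not a tool he mentions, of a structured scorecard: criteria are fixed before anyone is interviewed, each interviewer rates only one dimension, and ratings are pooled only once every dimension has been assessed. The criteria names and the 1-to-5 scale are assumptions for the example.

```python
# Sketch of a structured hiring scorecard (illustrative; criteria and scale are assumed).
from statistics import mean

# Step 1: agree on the selection criteria before looking at any candidates.
CRITERIA = ["giver_vs_taker", "intellectual_humility", "integrity"]

# Step 2: each interviewer submits a rating on a single dimension only,
# based on behavioral evidence gathered in their interview.
scores = {
    "giver_vs_taker":        {"interviewer": "Shane", "rating": 4},
    "intellectual_humility": {"interviewer": "Adam", "rating": 5},
    "integrity":             {"interviewer": "Third interviewer", "rating": 3},
}

# Step 3: pool the ratings into an overall judgment only after every
# criterion has been independently assessed.
def pool(scores: dict) -> float:
    missing = [c for c in CRITERIA if c not in scores]
    if missing:
        raise ValueError(f"Assess every criterion before judging the candidate: {missing}")
    return mean(entry["rating"] for entry in scores.values())

print(f"Overall score: {pool(scores):.1f} / 5")
```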

Why is it so different from what most people do? Just because it's time consuming and...

If I delegate my knowledge and my experience to what feels more like a formula, then maybe I'm out of a job. And maybe also, you know, my superior intuition, my gut feeling about a candidate is going to get ignored. And that's what I've hung my hat on for a lot of my career.

I also think it's boring. So managers love having the freedom and flexibility to go wherever the interview takes them. And the idea of being much more structured in your interview process of saying, okay, let's get a well-defined work sample. Let's figure out if somebody says they're a good salesperson, let's actually ask them to sell us something. And let's compare all the candidates on the same selling task. It kind of reduces the variety that I get in my job.

And I think those are a couple of the reasons why it's uncommon. But I think to your point, yeah, it's expensive. Right. So it ultimately will require a bigger investment of time. It probably requires more people involved, too.

That time has an opportunity cost. And so maybe we feel like we're giving something up. I don't know about you. If I'm going to hire someone and commit to working with them, I cannot invest enough time upfront to decide that that's a good choice. It's interesting to me to listen to you say that because what comes to mind is like there are organizations that invest incredible amounts of energy, time, money into this. There are sports organizations, all sports, you know, before they draft somebody, it's like, what is the person's character? Right.

How well do they recognize the plays? How well do they, is it intuition versus professionalism on their part? Right. And they have these ways of evaluating that. Special forces, the same thing. They're investing a lot in sort of evaluating these recruits.

Why do you think they're so variable? It's hit or miss, right? They haven't cracked that code, if you will. I think it's hit or miss for a few reasons. One is we don't have all the criteria that we need. I've worked with a whole bunch of professional sports teams over the past few years on this exact problem. And they're at best assessing on maybe seven or eight attributes when there might be 200 that are going to drive people's future performance, right?

So I think that's the first problem. The second problem is the measures are extremely noisy. So it's one thing to say, okay, you know, if I'm trying to hire a – if I want to draft somebody to play for the Toronto Raptors, right, I can figure out how tall they are. I can figure out how high they can jump.

But when it comes to quickness and diving for a loose ball, I can't measure that as precisely as I would like. And then I'm also trying to come up with a score for grit and generosity and humility. Good luck with that, right? They're very, very intangible factors to measure.

And then how you would weigh them would vary based on the individual candidate, too, I would imagine, and team character and a whole bunch of other things. Exactly. The aggregation problem is huge. So I actually had this question posed by a sports team last year that was hiring a head coach.

And they had used some of my assessments. And the question was, okay, we have two finalists, do we go with the coach who scored higher in intelligence or the coach who is more of a giver? I don't know what the right answer to that is. Was this in the NBA? I'm not at liberty to say, but they ended up choosing the coach with the higher intelligence score and firing that coach, I think, at the end of the season or shortly thereafter.

I don't know what the right answer is there, right? I think there's probably a threshold. I would not want a coach who's extremely selfish.

I also wouldn't want a coach who's not reasonably intelligent. But then when you get into the ranges of, well, anybody with these attributes could succeed, I don't know how to trade those off. And then to your other point, well, how are they going to gel with the culture of the team and with the players involved? Those are all open questions. And so this is a very messy problem. Let's come out of this problem a little bit. And I'm going to hire you as an advisor. How do I encourage my organization, the culture within the organization,

so that people feel psychological safety, I guess? That's the core requirement to rethink as an individual: you feel it's not threatening to your identity. It's not going to have an impact on your job, your career. Nobody's going to hold it over your head. How do we build psychological safety within an organization? So Constantinos Coutifaris and I just finished some studies on this exact topic.

We started from the premise of saying, look, if you're a leader and you want to build psychological safety, you want to give people the freedom to take risks and to know they won't be punished if they rethink something or they voice a problem that needs attention. Then what most leaders think they should do is ask for feedback because then the door is open. And we did find that CEOs who seek feedback more often had higher psychological safety in their top management teams.

But we found that when we went and encouraged managers to go and ask for feedback, it didn't have a lasting effect on psychological safety. And it seems like a couple of things broke down in our follow-up analyses. The first one was sometimes leaders and managers would ask for feedback, and then they didn't like what they heard, and they got defensive, which immediately says, nope, guess the door is closed.

The second problem was even when they were open to ideas, sometimes the feedback was irrelevant or it addressed areas that were outside their span of control. And so they said, okay, this is not a priority for me or I can't do anything about it. And that led them to stop asking and it led their employees also to stop giving and to stop speaking up because it seemed like an exercise in futility, right? So even if you took the fear away, that doesn't mean that I can have an impact if I raise an idea or I challenge my leader to rethink something, right?

So we got curious about alternative approaches that might have a more lasting effect on psychological safety. And the one we tried out that worked effectively was instead of just asking for feedback, we actually had leaders criticize themselves out loud. In some cases, managers would bring in their performance review and they'd say to their team, hey, here's what my boss told me I need to work on. And I would love your input on whether I'm making progress in these areas.

And not only did CEOs who did that naturally have higher psychological safety in their top management teams, but when we randomly assigned managers to kind of criticize themselves as opposed to asking for criticism, just inviting them to do that once increased psychological safety in their teams for at least a year, which is a staggering effect. And there are a couple of things that happen that are really different from what happens when you just seek feedback. So one thing that happens when you criticize yourself is you show you can take it.

And it makes people immediately less fearful about challenging you. The second thing that happened, which I think is in some ways even more interesting, is it created mutuality. There's now a dialogue that's going on where I've said, you know what, Shane? Here are all the places where I just, I stink. And I really need your help in getting better. And you now not only have the freedom to tell me how I can improve, but you feel like you can be more vulnerable with me.

There's basically a normalization of vulnerability that happens where once I say, hey, I'm a work in progress, everybody on my team is more comfortable acknowledging that too. And it makes it easier too for the team to hold me accountable. So let's say, for example, I have a tendency to talk too much in meetings and I come into my team and one day instead of saying, hey, could you give me some feedback on our meetings? I say, you know, I've realized I have a tendency to not shut up when I should. I would love you all to help me with this.

Then two meetings later, I'm rambling, and that gives permission to my team to say, hey, you know how you said you wanted to talk less? It hasn't happened yet, right? So then what we saw is a lot of the teams work together to create practices for keeping the door open. They'd do a check-in in the first five minutes of every meeting: is there anything anyone can do to improve? They'd hold a monthly vulnerability meeting in some cases where people would just talk about their development areas and their progress and where they were struggling.

I think that we could all be more open in criticizing ourselves. And I'm not saying that every leader should stand up in front of the thousands of people who work below them and talk about all the things they're bad at. But I think with the core people that you work with, odds are they know what your weaknesses and shortcomings are anyway. And if you can own up to them, people are much more likely to help you rethink your ways of fixing those flaws. And I think ironically, knowing that people might be a little bit on guard or a little bit suspicious

actually led leaders to be more authentic and to come in and say, you know what? The whole reason that I'm going to talk about areas for growth is I want to get better. And so the more honest I can be about that, the more likely I am to grow. And so I think it's the responsibility of people in power to open that door and keep it open, right? And the step to prove it, right, is to say: here is the stuff people told me I am terrible at. Here's how I benefited from hearing those things in the past. Right?

That's one of the best ways that you can show that you really mean it. I think when I asked this question about psychological safety, I actually presumed to know what some of those variables are, but maybe you can help. What are the variables that go into feeling safe psychologically, either at home in a relationship with your spouse or at work? Are they the same? Are they different? So in the early research on psychological safety by Amy Edmondson and Bill Kahn and some of their colleagues, the two foundations were really trust and respect.

And I think a lot of people, as Amy has pointed out, get psychological safety wrong. They think it's about being nice to everyone or being tolerant of everything or having no standards or not holding people accountable. No, it's none of those things. Psychological safety is knowing that other people are going to treat you with respect and trusting that if you go out on a limb and say something uncomfortable or challenge a deep-seated belief, that you are not going to be punished for that.

The basic ingredients, trust and respect, those matter in any relationship, whether it's romantic or professional. I mean, one of the most basic mistakes people make on this is they forget that one of the most effective ways to earn trust is to show trust. And I think that's why I want leaders to begin with vulnerability. When you ask somebody for feedback, you are saying, hey, I respect you, right? Shane, I want your input. I want you to tell me what I should rethink in my book about rethinking.

But if I don't criticize myself out loud, then you can't really trust that I'm going to listen to you and that I'm not going to bite your head off or get offended in some way. I think I'd get a much more honest answer from you if, instead of saying, hey, tell me what I should rethink, I came in and I said, you know what? I think one of the biggest mistakes I made in writing this book was I really understated the importance of preaching and prosecuting

relative to being in scientist mode, which maybe is something we'll talk about. Maybe we won't. But once I say that, like, hey, you know what? I know this book is not perfect. I poured a lot of energy and time into it. But I really want to find out how I can evolve my thinking. And you're a great person to help me do that. Let's talk about the preachers, prosecutors and politicians and scientists. Yeah. So credit to Phil Tetlock for bringing this framework onto my radar today.

Phil wrote this amazing paper almost 20 years ago now where he said, look, a lot of research on decision making and judgment assumes that people are thinking like hyper-rational economists or scientists. And we're not, actually. We're much more social creatures than that. And as I read this paper, it suddenly dawned on me that

This is a perfect metaphor for me as an organizational psychologist because we spend an inordinate amount of time thinking and talking like professions we have never held, like occupations we were never trained in. So think about how much time you spend in your life preaching. You've already found the truth and your job is to proselytize it. Prosecuting, you find somebody who you think is wrong and your job is to prove it and win your case or come out ahead in the argument.

and politicking, where you think, "Okay, I've got a base of people who I'm trying to curry favor with, and so I've got to campaign for their approval and support." When I was actually about halfway through writing Think Again, I realized it needed an organizing framework. And so much of what I was trying to encourage people to do was about getting out of the mode of preaching, prosecuting, and politicking, and into the mode of thinking more like a scientist.

And part of the reason that I wanted to do that is I think that, you know, the danger of preaching and prosecuting is that you don't change your mind. You're right. Everyone else is wrong. And so you might be very motivated to get other people to rethink, but your views are frozen. They're set in stone.

And politicking is interesting because when we're being political, we're actually more flexible, right? We might even flip-flop, but we're doing it at the wrong times and we're doing it for the wrong reasons because we're just doing it to appease our tribe as opposed to doing it to find the truth. And so I think we could all get better at thinking more like scientists to say, you know what?

Your views, they're actually just theories, right? You could kind of make them into hypotheses and then you could run little experiments in your life to figure out whether they're true or false. And that should leave you not only more mentally flexible, but also more likely to change your mind at the right times for the right reasons.

If it's not a law of nature, effectively, it's just a theory. Exactly. Can you say that again and repeat it to approximately 8 billion people? Yeah, I wish. In that case, would you want your identity sort of tied up with your profession a little bit? Because scientists are known to sort of think and change their mind and look for evidence. Ooh, that is such an interesting reframing of my stance on identity. I think you might be right.

I see myself as a social scientist, right? And thinking about myself, seeing myself as someone who likes to think and talk scientifically and who was trained to do that. What that means to me is I value truth. I'm more interested in getting the answer right than I am in being right.

You know, that means lots of my opinions are still flexible, right? I have a set of tools. So I really like experiments. I really like doing, you know, carefully constructed longitudinal studies. And I think those tools have been rigorously tested over centuries, right, as being the most valid and probably independently verifiable or at least most difficult to falsify, right, techniques for reaching the truth or at least getting closer to it.

And I think as an identity, scientist is helpful because it reminds me how much we don't know and how hard it is to, you know, to arrive at the truth or an approximation of it. I want to preface my next sort of question with, I don't want to talk about politics. I don't want to talk about sort of liberal Democrat, Republican, conservative, anything to do with that. I really, what I want to hit at is it seems that most of our leaders, we elect them for being strong-minded leaders.

You know, often charismatic. Why are we drawn to these people if we know that actually maybe the best elected official would be the one who gets up and says, I don't know how to fix this, I would just hire the best people and listen to them? But we would never elect that person. Why do you think that is? Why are we drawn to this? I don't know. I think we have elected that person. I don't think it happens that often. But I think that in the U.S.,

Franklin Delano Roosevelt, that was literally his campaign with the New Deal. It was a whole campaign to say, you know what, we're going to run a bunch of trial and error experiments and learn from what works. I think it's hard for that person to get elected, though, because as we face crises and we grapple with uncertainty, we're drawn to people who we feel like are going to figure it out and going to fix it.

And so if somebody hedges too much, if somebody shows too much humility, I think we mistake that as a sign of ignorance, right? And it's the basic trap that you've railed against for years, Shane, which is we should stop confusing confidence for competence, right? Just because somebody is sure of an opinion does not mean they actually know what they're talking about. And in fact, anybody who is familiar with the Dunning-Kruger effect knows

that the more sure people are of their opinions, the more hesitant we should be to listen to them. Right. But there's something very intoxicating about following someone who believes they've already found the way. And I think it, you know, it gives us a sense of coherence. It can give us a sense of purpose. It's easy to put our trust in people who, you know, who have a clear vision.

Of course, you know, in the long run, those are the people that I worry most about because they're the ones who are most likely to get too attached to that vision and stick to it long past its time. And I think it removes uncertainty, too. We would almost rather be wrong and certain than uncertain and land in the correct spot, because it wreaks havoc on us. At some level, you're right. It's a way of letting other people do your thinking for you.

And this is why political parties have always been such a mystery to me. When people ask me what my politics are, I think for myself, I try to form independent opinions based on the information that I encounter. And the idea of identifying myself as a Republican or a Democrat or a liberal or conservative, that's ridiculous to me because it means I've outsourced my thinking to some group of people that I don't think are thinking very scientifically.

Let's talk about that without using politics, sort of, but like tribes, like we fit into these groups, we want a sense of belonging. It's a very human thing to want to fit in with a group, be part of a group, have status within that group in a hierarchy. There's a biological sort of hierarchy need that we have, even if we're lowest on the totem pole, we sort of like want to know where we stand in this pecking order. And then we assume group identities and group positions. And those are really hard. Talk to me about that.

Well, I think the idea that comes to mind right away here is these twin desires that human beings are constantly grappling with. We want to fit in. We also want to stand out. We want belonging. We also want status. There's a theory that I love. Marilynn Brewer calls it optimal distinctiveness. And she says, look, there's a way to fit in and stand out at the same time. It's by joining unique groups.

Because then you are part of something and you're not only part of something, you're part of something that has a clear identity because there are very few other groups like it.

But you also stand out because of the very way that that group has differentiated itself from others. And so if you can join a group that gives you that sense of optimal distinctiveness, if you can join a kind of an unusual group or a group with clear, well-defined boundaries, then you're able to satisfy those motivations simultaneously.

That explains the rise of a lot of movements and a lot of groups where people will say, OK, I want to belong somewhere where I also feel like, you know, I'm not like everyone else. You know, it goes back to, that's how people gain a sense of predictability. It's how they have a sense of control in their lives. It's how they avoid feeling excluded. And frankly, the other function it serves is an existential function.

One of the most robust findings in psychology over the past three decades is that belonging to a unique group actually serves a terror management function, that it helps you avoid threats to your own mortality.

Or at least it makes you worry less about what might happen to you in the future and whether you have a legacy. It connects you to something larger and more lasting than yourself. And so it's easy to see why people are drawn to these groups. It's also a little bit scary. So we've talked about sort of the individual, but how do we change the views of other people, the people we work with, our partners? This is probably the part of the conversation everybody wants because we're right. How do I change somebody else's views? How do I convince them that they're wrong and I'm right?

Well, it helps to let go of that sense that you're right and say, look, if this is a one-sided exchange and you're supposed to change your mind, but I get to stand still, then

probably not going to make a whole lot of progress here unless you are a very good prosecutor or the other person is very happy to be preached to. One of the more interesting things that psychologists have done when looking at disagreement and debate is what if we thought about it as a dance? There's something about dancing that says, look, we're both going to move.

I think a good argument or a good debate or a good disagreement is one where neither of us has really choreographed all the steps we're going to take. And sometimes we step forward and sometimes we step back. Other times we sidestep. But ultimately, we're actually trying to get in rhythm or in sync, which is a very different goal from trying to change somebody else's mind. Yeah.

I think, yeah, that's a much better way to frame it. So what are the steps that we take for this dance then, not only for sharing our opinion, but also receiving their opinion? So one of my favorite ways to think about that is a classic study by Neil Rackham of expert negotiators, where he compared them both pre-negotiation and actually in the live negotiations to average negotiators to look at what the two groups did differently.

And the first takeaway was that the experts spent far more time planning for and then talking about common ground. So a lot of people, when they go to have an argument or win a debate, they think their job is basically to find the differences quickly so that then they can fix them, right? Yeah.

That's not at all where I want to start. I want to start by saying, Shane, let's identify areas that we agree on, which gets us in synchrony, right? And also says, hey, you know what? This is somebody who shares some of my values or maybe some of my views. That makes the conversation non-defensive and collaborative right from the start. Then a second difference that jumped out pretty clearly is the experts asked a lot more questions than the average negotiators. And they weren't leading questions. They were questions motivated by genuine curiosity.

Let's talk for a second about how versus why questions. So a lot of times when you discover that somebody has a different view from you, you naturally will say, well, why do you think that? And the problem is you're setting the other person up for confirmation bias. You're giving them an invitation to make a list of compelling reasons that they themselves generated for why they're going to cling to their preexisting convictions, right? So you're doing some of their work for them. What tend to be more effective are how questions.

Where instead of asking, why do you believe this? You ask, how would you implement this? Or how would that idea work if we were to come to some agreement on it? And that cultivates intellectual humility, right? The term for it in psychology is the illusion of explanatory depth.

And the idea is that people think they understand things much more than they actually do. And if you ask them to explain how, right, one version of that is just how does this work? Another version of that question is how would you explain that to an expert or, you know, how would you implement that in the real world? They suddenly realize, gosh, I don't really know what I'm talking about.

And this has recently been demonstrated for policy questions where if you've got somebody who disagrees with you on, let's say, climate change or on tax laws, instead of asking them why they believe what they believe, if you were to just ask them, well, how would you implement that tax law and what are all the effects it would have?

Or how would you address this climate problem that I know there are a range of complicated solutions to? As they try to answer that, they realize how little they know. They become less polarized. They're much more open to hearing alternative views. And the hope is that you are too. One of your pet peeves is feigned knowledge. Talk to me about how you deal with this, because you must see it all the time, not only as a professor, but posturing in organizations.

How do you deal with this? It eats at the core of my soul when someone claims to know something that they don't. And I have not always dealt with this well. So the story that comes to mind, I was called by an investment bank some years ago, and they asked me to figure out how to motivate and retain their junior analysts and associates.

So I did two months of research. I had experiments. I had longitudinal survey data, interviews, observations. I had lots of outside research as well as internal data. And I came back with 26 evidence-based recommendations. And I was presenting them. I think it was actually my first ever video conference I'd done. This was at least maybe eight years ago. And I'm presenting to the co-heads of investment banking. They're on multiple continents.

And I think I was on about recommendation five or six. And one of the co-heads interrupts me and he says, well, why don't we just pay them more? And if there is one recommendation that was not on my list of 26, it was to solve the problem with money because they had already thrown a lot of money at the problem. These people were already well paid, by which I mean overpaid. And if money were going to attract them and retain them, it already would have. And

I'm embarrassed to say that my response at the time was, I've never seen a group of smart people act so dumb. That did not obviously accomplish a whole lot for me, right? Although they did tell me they got a kick out of it later because people don't normally talk to them that way. But of the mental modes other than scientist, the one that I spend the most time in is prosecutor mode. And I've been accused of being a logic bully from time to time.

And I decided after that that I wanted to be in scientist mode if somebody had feigned knowledge, right? If somebody claimed to know something that they didn't. And what would a scientist do? A scientist would be just riveted. Like, who is this person? How could they possibly believe these things? And what is it that would possibly change their mind?

And so I wrote a little script for how I want to respond whenever somebody challenges my evidence or claims to know something that I think is false, which is just to get really curious and say, what evidence would change your mind? And I don't always remember to ask the question. I don't always get curious enough to really want to know.

But in the situations where I've pulled it off, it has completely changed the tone of the conversation. And often what will happen is the person will map out the kind of study they would find convincing. And now we're on my turf because I have a mental library of studies that I can then cite. And they're helping me figure out which kinds of data they would find compelling. And also, I find it really helpful because...

It refocuses the conversation in the realm of evidence as opposed to just opinion. Right. And so when I ask them to tell me what a well-designed study is going to look like, we can then agree on what valid methods are. And once I tell them what the findings are using their chosen methods, it's a lot harder for them to just have a knee jerk objection. So I decided after that experience that you can lead a horse to water, but you can't make it think.

The hope is I learned something from that conversation. And now we're arguing to learn as opposed to arguing to win. Talk to me about this logic bully thing. Well, I had a student some years ago now. Her name is Jamie. And she was trying to make a big career decision about whether to do an MBA and if so, what school she should go to. I think she had been accepted at two schools. I just said, Jamie, look, I'm not saying...

You shouldn't do an MBA. But let me give you all the reasons why I think it might be a waste of time and money. You already have an undergrad business degree. You don't need an MBA for any job that you want. And OK, maybe there's some firm that will tell you you need it to get promoted, in which case I would say you probably shouldn't work at that firm because it's not like having an MD or a JD, right? There's not a codified body of knowledge that you need to run a business.

And I just wanted her to think through carefully, would, you know, was this a good use of time and money? And if you could take two years and a quarter million dollars, is there a better investment of that given your goals and your life circumstances? And so I didn't have a stake in the outcome, but I argued the other side as I often do. I guess she activated my prosecutor mode and she just, she came back and she said, you're a logic bully. I was like, what? And she said, a logic bully.

And she went on to tell me that I had overwhelmed her with rational arguments and she didn't agree, but she couldn't fight back. And my first reaction, Shane, was yes, because I thought that was my job as a social scientist. Right. I want to come with airtight logic and rigorous data and make the most compelling case I can. But what I was depriving her of was the opportunity to own her own choice. Right. And to reason through this for herself. Right.

And so I've tried to get out of logic bully mode. And that same conversation I've had with a number of students now, instead of listing all the reasons, I would say, all right, Jamie, can you tell me the pros and cons of doing an MBA versus not? Why are you excited about it? What are the risks? And then I go the extra step and say, and why are you coming to me? Are you here because you want my advice? Are you here because you're looking for my validation of a decision you've already made? Or are you here because you want me to challenge your thought process?

And once I know what her goals are, it's a lot easier for me to...

invite her to rethink some of her assumptions without insulting her or causing her decision process to go haywire or without making such strong arguments that she thinks I'm trying to give her an answer when I'm really just trying to test her thought process. I love that term logic bully. And, you know, you think initially you're like, oh, this is so helpful. And then at the end, you're kind of like, oh, that wasn't what I thought it was going to be. I'm curious, how do you institute rethinking, or how do you help kids,

not only students, but your own kids, rethink? What are parents supposed to do at a young age for elementary school kids or high school kids and university students in terms of opening their mind to different possibilities? And what sort of things can we do? One of my favorite things that I learned while I was writing the book came from Wisconsin's Middle School Teacher of the Year, Erin McCarthy.

What Erin does with her students is she gives them a section of a history textbook and she sends them out to rewrite it. And what they do is they, you know, they look at primary sources, they interview people, and they realize how much information is missing from the way that we've narrated past events.

And what that does is it allows them to think a little bit more like fact checkers when they encounter new information. Where instead of, you know, I read something in the news or I heard it on TV, it must be true, they say, well, what are the sources of that information and how do we really know? And I think we should all be lucky enough to have a project where we get to rewrite a section of a textbook. I thought that was a brilliant assignment.

A variation on that that we do occasionally at dinner is Allison and I, with our kids, will have a myth-busting discussion. And I think originally it came about because our kids had learned interesting things at school that surprised us. Like one day, one of our daughters came home and said, we were just doing an Egypt unit. This was an elementary school. And I found out that King Tut probably did not die in a chariot accident.

I was like, oh, that's so cool. That's not what I learned. What else are you learning in school that is different from what we thought was true at the time? And so it became sort of an occasional tradition for us to say, okay, who's going to bring a myth or a fun, surprising fact to the table? And what I want to do in these, whenever we have these conversations is I want our kids to experience the joy of being wrong. To say it is such a delight to discover that something you thought was true was actually false because now you know you've learned something.

I like that a lot. I'm still grappling with Pluto not being a planet, so I'm a little sad about this. You and me both, Shane. I think those kinds of moments, right? If you observe your own emotional reaction in them,

Why was I so upset when I found out that Pluto was not a planet? It was just reclassifying the name of an object. The object hasn't changed, right? It's still floating out there. It's still somewhere in the same position that we thought it was. It doesn't affect my life in any material way. Why does this bother me so much? It's because I like to have a set of beliefs about the world that I can count on. And when it comes to my understanding of the solar system, it's like,

I have a tower made out of Jenga blocks and somebody just pulled out the wrong block and now all of a sudden the whole tower is coming crashing down. And I want a much more solid and sturdy foundation for the things I believe. What else do you do with your kids? One of the other things that we realized a couple years ago we weren't doing enough of was when it comes to teaching values, talking about how we had failed to sometimes live our values.

So, you know, Allison would often roll her eyes when I would talk to our kids about being givers, not takers. It's one thing to, you know, to make that case. It's another thing to say, you know what, here's a time when I was not kind to somebody in my class who was being bullied. And I regret that, right? That was me failing to be a giver. When I tell those stories, right, when I share mistakes I've made, when I talk about embarrassing decisions I've made, right,

What I'm trying to do is I'm trying to signal to our kids that it's okay to be wrong and it's okay to rethink the choices we've made. That's, to me, one of the functions of regret. I think of most negative emotions as teachable moments. And the whole point of experiencing them is you're supposed to learn something from what you did wrong so you can make it right in the future. And so much of regret is saying, okay...

I did something that led to an undesirable outcome or that violated one of my values. And so how do I do this differently moving forward? Instead of trying to deny these emotions, let's actually listen to them and figure out what the lesson is in them. You mentioned rewriting the history section was one of your favorite things that you learned. What's another favorite thing you learned when you were writing the book? I mean, one of the things I just never thought about before was binary bias.

So I knew I wanted to write a chapter about having charged conversations and how people could talk about the most divisive issues that many of us have just been shying away from because it feels hopeless. And I really came out rethinking my view that what we need to do is better understand the other side, which has been so much the political narrative. I think actually that's part of the problem, not the solution, because there is no charged issue that's ever simple enough to have only two sides.

And the research by Peter Coleman and his colleagues on this really opened my eyes to the fact that we want to complexify two categories into a spectrum. Anytime you run into a binary like, you know, liberals and conservatives, you should picture a whole spectrum of beliefs there and say, well, you know, there are relatively conservative and relatively liberal members of each of those groups. And if you break down the multiple issues, very few people agree with their party on all 16 or 17 of the major issues equally. Right.

And so if I can see the nuance there and I can start to get people to think about, huh, where in this very nuanced spectrum do I fall, then I see more shades of gray and less black and white in my beliefs. And so what I've been doing a lot now is catching myself in binary bias.

I'm so glad in retrospect that Give and Take had three categories rather than two, because if there were only givers and takers, I would have missed the matchers, the people trading favors who actually are the most common at work. And I think that there are so many... I mean, this is relevant to any part of life, but

It's something that runs across almost every project I've done in psychology, right? The best leaders are not the introverts or the extroverts on average. They're the ambiverts who fall somewhere in the middle of that spectrum and are comfortable flexing between talking and listening. The most creative people are not the precrastinators like me who dive right in or the procrastinators who wait till the last minute. They're the people who are quick to start but slow to finish, somewhere in the middle of that spectrum. And I think any spectrum that you draw, you can almost

always find an advantage for being somewhere in the gray. And I think we should all do that more often. That's flexibility and adaptability right there, right? Bingo.

One of the stories from the book that I liked was Daniel Kahneman teaching you about how to respond to a surprise. Can you tell us that story? I went to give a speech at a conference and I didn't know a lot about who was going to be in the audience. And I get up on stage and sitting in the audience is Danny Kahneman, Nobel Prize winning psychologist, one of the giants of our field. Afterward, I ran into him and he said something like, that was wonderful. I was wrong.

And those two things don't normally go hand in hand, right? Either a talk was wonderful because you were right, or it was bad because you were wrong. My first reaction was to say, holy cow, Danny Kahneman liked something I said.

My second reaction was to say, this is so strange that he thought it was wonderful to have been wrong. What's behind this? And so I ended up following up with him and I asked him why. His face literally lit up when he talked about how exhilarating it was to be wrong. And he said, it's the only way I know I've learned something, right? If I find out I was wrong, it means I am now less wrong than I was before. And he talked very eloquently about

the value of detaching, like we talked about, your opinions and your ideas from your identity, to say, look, every idea I have is just a hypothesis. Might be true, might be false. And if I have a vested interest in it being true, then I'm not going to discover as much as if I really want to find out whether it's true or when it's true. I think Bezos had something along these lines too, right? Where he's like, people who don't change their mind are wrong.

So if you want to be right, you have to be somebody who changes your mind a lot. Yeah, I think that it's been one of Amazon's persistent competitive advantages. Love them or hate them, right? They are extremely adaptable. And I've thought about this in terms of another two by two. I met Jeff Bezos a few years ago and I asked him how he goes about making hard decisions. And he said, OK, basically two questions. One is, is this decision reversible?

And the other is, how consequential is it? How high are the stakes? And he's willing to act very quickly anytime a decision is reversible, because he can change his mind tomorrow, or low stakes, because it doesn't really matter. The decisions he puts off until the last possible minute, maybe even procrastinates on, are the ones that are both irreversible and consequential, because those are the ones that are not just gambles. They are not just experiments. Those are real commitments.

And I think this is as relevant to personal decisions as it is to running a company, right? To say, okay, before I go into a major decision, if it is both irreversible and highly consequential, then I want to spend a lot of time rethinking my own views up front. Whereas if it's pretty easy to reverse and undo or it doesn't really matter, I'm going to go forward and make the decision knowing I'll have time later and the opportunity later to second guess it.

But it's not enough to just convince yourself you have the opportunity. You have to be in the mindset that that's almost what you're looking to do. Did you learn anything else from Jeff or Amazon about how they make decisions? Because I think they're one of the best large organizations we've ever seen at collectively making decisions. Yeah, I think one of the other smart things they do is they really take this idea of process accountability seriously.

So if you ever go to a senior leadership meeting at Amazon, you'll see this sort of awkward experience of people sitting silently reading a memo for 15, 20, 30 minutes. And the reason they do that is they want everybody thinking, with careful, focused attention, about the alternatives around this big decision we have to make. Is the problem framed the right way? What's the long-term impact of this decision? It is so rare.

We all live in a world where we're too busy and we have too many distractions coming our way constantly. How unusual is it to sit down with a group of thoughtful people, read and reflect on a common document, and then say, okay, what assumptions should we be rethinking here? I would be thrilled to see more leaders adopt that practice.

I think it's brilliant because it doesn't assume people have read it before they come in. It gives people time to read it and discuss it. One of the things I noticed when I was running meetings like that is that people would come in having read only the first couple of paragraphs because they didn't have time, which wasn't really their fault in some ways.

And then they would signal that they'd read the document by rephrasing some of those first paragraphs, and everybody would just do the same thing. And I'm like, you're all just saying the same thing, as if we can assume this is common knowledge, but you can't actually assume that people have read it. So I like the approach of distilling it down to something digestible, giving people time to get on the same page, and allowing that time in the meeting. Most meetings don't need to be as long as they are anyway. And then everybody's talking from a common base. Yeah, I think...

it's much more likely, if you adopt that practice, that people are actually learning. Also, I can't think of a better time to do it than during a pandemic, when we're all stretched for time.

And now, if I know at least the first 10 or 15 minutes of the meeting are going to be processing time, that's one thing I've taken off my to-do list. I can actually do it during the meeting as opposed to trying to squeeze it in the night before. And I like how it's not PowerPoint; it's actual thinking instead of just point form. We could use more of that. There is a part of me that wonders whether...

especially, again, for a high-stakes, irreversible decision, do you want to send that out in advance? Encourage people to read it at least once or twice, and then really digest it again when you come together. This goes back to that second thoughts post that you wrote. I'm not sure that our second thoughts are always our most insightful observations, right? They're usually the easiest ones to think of. And I worry a lot that we spend too much time listening to people who think fast and shallow,

and not enough time hearing people who think slow and deep. And I think we want to become the people who think slow and deep, because that's where most of our good rethinking happens. I want to end with a question that comes up a lot in organizations, because I think you'll have a good response to it. People say, I'm entitled to my opinion. And, you know, that's their way of ending the conversation and saying, you can't change my mind on something. How do you respond to that? Yeah, you're entitled to your own opinion if you keep your opinion to yourself.

If you decide to say it out loud, then I think you have a responsibility to be open to changing your mind in the face of better logic or stronger data. And so I think if you're willing to voice an opinion, you should also be willing to change that opinion. I think that's, of course, easier said than done. But I think it goes right back to something we talked about earlier, which is to say, okay, when would I change my mind?

And if you can't answer that question, you are no longer thinking like a scientist. You've gone into preacher or prosecutor mode. That's a perfect place to end this conversation. Thank you so much, Adam. No, we can't end it. You haven't told me what I should rethink. I'm going to push for this.

Hey, one more thing before we say goodbye. The Knowledge Project is produced by the team at Farnam Street. I want to make this the best podcast you've listened to, and I'd love to get your feedback. If you have comments, ideas for future shows or topics, or just feedback in general, you can email me at shane at fs.blog or follow me on Twitter at shaneaparrish.

You can learn more about the show and find past episodes at fs.blog slash podcast. If you want a transcript of this episode, go to fs.blog slash tribe and join our learning community. If you found this episode valuable, share it online with the hashtag the knowledge project or leave a review. Until the next episode.