At the tone, please record your message. Hi, this is Chris Hughes. Look, I can't reach my sister, and since you're next door, could you please check on Terry? I'm really worried. I called the super. He's got her apartment open. Just please look around. I've called Terry, but she's not picking up. I don't know where she is. Anything you could find would be a big help. Look, I owe you. Okay, I...
That's an opening scene from a video game called Missing: The Pursuit of Terry Hughes. The game was designed to help train U.S. intelligence agents to avoid decision pitfalls by working through a fictitious missing persons case. Today on the show, we're bringing you something a little different. We're calling it Choiceology's Guide to Better Decisions. We've gathered some advice from expert guests who study the science of debiasing decisions, and we hope their insights will help you make ever more thoughtful choices.
You'll also hear from two guests whose work on the video game Missing has had a lasting, measurable impact on its players, improving the decision-making skills of people who work in one of the highest-stakes sectors in the United States, intelligence analysis. I'm Dr. Katie Milkman, and this is Choiceology, an original podcast from Charles Schwab.
It's a show about the psychology and economics behind our decisions. We bring you true and surprising stories about high-stakes choices, and then we examine how these stories connect to the latest research in behavioral science. We do it all to help you make better judgments and avoid costly mistakes. This is James Korris. I'm the president and CEO of Creative Technologies Incorporated in Los Angeles, a woman-owned small business.
Back in 2015, James was contracted to design a game for the government agency IARPA. That stands for the Intelligence Advanced Research Projects Activity.
Missing, Part One: The Pursuit of Terry Hughes is a character-centric narrative that follows the first-person player through a series of discoveries. And basically, you are in the role of a person living in New York in a cool part of Manhattan, but you have a very boring job. Your neighbor is a
very exciting, very interesting person whom you know only very slightly because even though she's a next-door neighbor in New York, she's almost a total stranger. And the inciting incident that gets the narrative rolling is a phone call from your neighbor's brother, who's very concerned because she's been missing for three days and he can't figure out what's happened to her. So basically it's a mystery.
The game is set up to challenge the player's decision-making skills. Okay, so the first thing that pops up here is the user's smartphone. We see that Chris has texted. He says, yeah, you're in the apartment. Please look around. Take as many pictures as you like, but pick the best three that you think are good clues to help us figure out what happened to Terry. I hope nobody grabbed her. The clues are cleverly designed to elicit a biased response.
In this case, they trip up the player by leveraging people's tendency to jump to conclusions with limited data. By Chris saying, I hope nobody grabbed her, he's setting up the hypothesis that there were malign actors who broke into her apartment and took her away. So the question is, will you be looking for evidence to support that, or would you consider a counterfactual hypothesis?
And this is where we start to navigate around the apartment. So starting in the living room, you see there's a desk here. We see there are some papers around, and see this bookcase. We move to this document here and pick it up. And you can see that there are restaurants in Los Angeles. And if you think about it for a moment,
If someone is generating a list of restaurants in another city, it does not support the hypothesis that criminals broke into the apartment and kidnapped a person. So if you were to take a picture of this, then I would suggest that you do have some interest in exploring counterfactual hypotheses. So we'll take a picture of that.
Okay, so here's a chair that was turned on its side. Then we say, well, that might support the hypothesis that she was taken by force. Bathroom's kind of interesting. So here's what's in the bathroom. Wet towel on the ground. You know, what does that tell us? Let me take a picture of that.
And then over here, here's her closet. And one of the things we notice is that there is a suitcase missing. So again, if someone is taken by force, they don't usually pack a bag. So we got the chair that shows maybe by force, the towel, which suggests that she wasn't taken by force, and the missing suitcase. Sounds like she packed for a trip.
And so we're going to send them to Chris. As the game progresses, other characters start to share clues, like the apartment building superintendent, who comes over to Terry's apartment and shows you some security footage. Hi, you're from 12G. I got a call from Chris, Terry's brother. He told me you were in here looking. The guy is massively freaking. Did you find anything? Come with me.
Then he takes us into her office and there's more discovery here. I've got the security footage from the elevator. Here we go. This was just a couple hours ago. Terry seemed like a nervous person to you?
This clue might stand out if you remember the Choiceology episode about the fundamental attribution error. In that episode, we talked about our tendency to attribute other people's behavior to their underlying attributes or personality and to disregard how dramatically their situation has likely influenced their behavior.
We might assume that she's anxious because there's something seriously wrong going on in her life, or it just might be the fact that she's, you know, in a creaky old elevator in Manhattan. That's the kind of thing that would make anybody anxious. Personally, I think Terry's in some kind of trouble. She always seems short of cash these days. Late on the rent last month.
There's lots of stuff to explore on her desk. Here is an electric bill that says it's past due, so this would support the hypothesis that she's having financial problems.
After clues are gathered and the first part of the game is over, users complete an after-action review and hear lectures on decision biases that might have tripped them up during the game. Confirmation bias is a mistake that people make when they have one idea of what's right, and then they search for evidence and interpret evidence in a way that confirms it. Because people are trying to confirm one idea, they don't evaluate all evidence objectively. So that's Carey, who's speaking now.
In the game, for example, you searched for clues about why Terry wasn't in her apartment. If you only searched for clues that Terry had a financial problem and you didn't check whether she was gone for other reasons, you committed confirmation bias. That's Carey Morewedge. Carey is a professor of marketing at Boston University who studies psychological biases and how to reduce them. Back in 2015, James and Carey worked together on the project we've just described to marry behavioral science insights with game design.
and to make the game as effective as possible at improving people's decisions. Carey joined me to talk about the game and about his research on strategies for de-biasing decisions.
Hi, Carey. Thank you so much for taking the time to be here today. Hi, Katie. It's such a pleasure to be here. I want to start by asking you if you could just describe your 2015 paper about two different de-biasing training programs that you designed, which I think of as really the best work on de-biasing. They had remarkable and durable effects on reducing decision biases. So I'd love it if you could just explain the trainings you designed and how you proved their value.
The backstory is I got a call asking me if I'd be interested in building video games to reduce cognitive biases. And that sounded absolutely crazy to me. IARPA, which is an intelligence research program of the United States government, was doing these high-risk moonshot kinds of programs where a bunch of teams compete to see if they can achieve some kind of goal that has flummoxed the scientific community.
And the program manager at IARPA who was running this program was a gamer. She got the idea that maybe serious games, which are video games with a learning component, could be used to reduce a lot of the cognitive biases that she was reading about. And then we had an initial meeting with them, and I said, yeah,
you know, the data that you can reduce cognitive biases is pretty thin, right? When we think about biases in decision-making, I think as a field, we've often concluded that
there's not much we can do. I had accepted that kind of conclusion, but it was also very discouraging, because it feels like, why are we spending all this time discovering these kinds of biases if we feel like we can't do that much about them? So this was exciting to me. I didn't think it would work,
but I was game to give it a shot. The basic structure of what we did was they had identified six cognitive biases that they thought were really important for intelligence analysis. So, for example, one was confirmation bias, the idea that when we're testing a hypothesis, we tend to look for information that confirms that hypothesis rather than disconfirms it. So there were six of these different kinds of biases, and they basically broke them into two video games, where each game was designed to treat
three of the cognitive biases that they'd identified. And then we competed against six other teams to see if we could reliably reduce these kinds of biases, both immediately and then after a delay. And that delay was two or three months, depending on which game we were testing. Each game takes about 90 minutes to play, but the games are really surprisingly effective at reducing cognitive bias. So how do you measure the effectiveness of the games? Yeah, that was really interesting too. I looked around and there weren't
amazing scales to measure these biases at an individual level. And so we developed these scales
for each of the biases. And so we could go out and measure people's susceptibility to the six different kinds of biases before they played the game, immediately after they played the game, and then two or three months afterwards. And we basically see, on average, a reduction of their propensity to exhibit these biases of about 30% from immediately before the game to after. And then,
two or three months later, we see the game is still reducing their propensity by about 23%. That's really interesting. And you mentioned that there are six biases. You gave one example. Could you talk a little bit about the full set of six and whether there were any differences in how well teaching people in this immersive way worked across the different biases? So the six biases were bias blind spot. That's seeing more bias in other people than in yourself.
Confirmation bias. Fundamental attribution error, which is that we tend to attribute our own actions to the situation and other people's actions to them, or to dispositional effects. Like, that was you versus that was your context. Anchoring. The representativeness heuristic. Those are the kinds of cases where we often use the similarity of situations as a proxy for the probability of situations.
And then the last one is social projection. When people try to think about how much other people have a particular attitude or a particular preference, they often use their own attitudes and preferences to predict how other people feel, right? So for example,
If you find Crocs to be very comfortable footwear and like Crocs, you might think that most people like Crocs, right? If you think Crocs are horrifically unfashionable, then you would think that no one wears Crocs because they're terrible, right? That's a sort of schlocky example, but you can see that politics often divides how people feel about different kinds of issues, and people tend to overestimate how many people agree with them on their particular kind of political issue. So those are the six biases.
P.S., I have to say, those are great biases to try to tackle. And I think we've covered four of them maybe on the show, but you just gave me two more topics we'll have to cover in future episodes. So thank you for that, too. And I'd love to hear how it all worked.
So the biggest surprise to me is how effective these de-biasing strategies are. The second thing is in other papers, we used random YouTube clips of people's lectures as a de-biasing intervention. And we've even done things where we just explain the biases to professionals like intelligence analysts or risk analysts, and they work. And I think actually the biggest problem was the measurement. Ah, okay.
So you're saying our pessimism prior to this research about how easily we can de-bias people was because we didn't measure people's success at dodging biases very well, and you came up with better measurement tools. Yeah. I will just say, as someone who teaches about bias and who obviously spends a lot of time through this podcast trying to teach the world, it's such an encouraging message that even though we'd seen a lot of failures before, de-biasing through education really does work, and it was maybe a measurement issue. But obviously,
I also want to point out that in your paper, you show that these really engaging video game trainings work better than watching lectures by some of the best researchers in our field. And I would love it if you could share a little bit of your insight about why the video game was a particularly useful way to improve people's decision making. I think the video game basically throws the kitchen sink of what we think works for de-biasing at the problem.
To de-bias someone, they need to know what the bias is. They need to know what direction it affects judgment in. Then giving people feedback helps. And then they need coaching and strategies.
And the video game does all of that in as much intensity as possible in 90 minutes. I love that. And of course, watching a passive video doesn't give you that feedback component, and it can't be personalized or engaging in the same way as an active video game. Yeah. And I think there's a layer of bias blind spot to watching a video, too, where you see other people commit these biases. You know, like, yeah, maybe I do that, but not as much as that person, or, like, that's not as much of a problem for me.
And the game actually is like, yeah, you showed this bias in this particular scenario, and here's what you could do about it. So there's kind of an effect of that feedback on your receptivity to the training. And in our bias blind spot paper that describes the scale that we use for bias blind spot, we actually find that people who are higher in bias blind spot are less receptive to de-biasing trainings than people who are lower. And so I think shattering that kind of bias blind spot is helpful in itself.
The other aspect that our findings suggest is that we get this transfer effect. So learning about cognitive biases in one domain does seem to help you in other kinds of related problems. The take home I would say is that I think we've been really underselling the value of a lot of the public outreach of decision scientists.
that people are really benefiting from a lot of this kind of work. And part of it is that it's not as exciting for many people to measure individual differences in these biases that have already been established as it is to understand or document them. As we mature as a group and as a field, I think it's very useful for us to try to think about, okay, we've done a lot of good trying to
show all of these biases. How do we go from the descriptive to the prescriptive and try to improve social welfare that way? I love that. And obviously, birds of a feather: I think you and I would both love for the incentives to change a little bit, so there's evaluation not only of the basic science, but also of some of the applied work that can have a positive social impact. So I love this research you've done.
Is there anything that you do differently in your life as a result of the research you've done and that you've read on de-biasing?
Yeah, I think I'm just more optimistic about our field. And also, I think that it's important for people who do this kind of work to do more public outreach, because it seems to work. I really admire people like you who are going out and talking about the science to everyone and trying to get the idea out. And I think that we could do more as a science to try to think about ways to improve on many of the kinds of biases that we've identified. And there's a lot of important work to be done on
what works best and for whom, and how related these biases are. And is there something that would work for many of them, right? And just understanding more about the structure of different biases and how they fit together, I think, would be really helpful to move forward. And for listeners who aren't in the science, I think this suggests that
doing things like listening to this podcast, reading a book, or watching a talk online are all things that can help you improve your decision making. And it's not futile. Your biases are not all permanent. You can do something about them.
Carey, thank you for the very kind words, and also what an incredibly optimistic note to end on. It's so encouraging that the research shows learning works and that we're not just stuck with these biases. And just like we can learn calculus and algebra and addition, we can learn how to make better choices. So thank you for the incredible work you have done to teach us that
It's not futile. And thank you for taking the time today to talk to me and to our listeners about some of your findings and how we can better de-bias ourselves. Thanks, Katie. It's such a pleasure to be here. Carey Morewedge is a professor of marketing and the Everett W. Lord Distinguished Faculty Scholar in the Questrom School of Business at Boston University. James Korris is president and CEO of Creative Technologies Incorporated and is a pioneer in immersive game-based simulation for military learning.
You can find links to Carey's research and to a video of the game he collaborated on with James in the show notes and at schwab.com slash podcast. The video game developed by James Korris, Carey Morewedge, and their colleagues shows that learning about behavioral biases and developing strategies to counteract them may be more effective than we thought. The robust results point to the value of education around de-biasing.
With that in mind, we thought it would be useful to share some practical strategies to help you make better decisions. A kind of checklist for a broad range of choices that you might encounter at work or at home or anywhere really. You may have heard some of this advice before on Choiceology, but I've asked my next guest to distill it into a checklist that you can take with you as you make your next big or small decision.
Jack Soll is an expert in debiasing strategies. He's a professor of management and organizations at the Fuqua School of Business at Duke University. Hi, Jack. Thanks so much for taking the time to talk to me today. Hi, Katie. It's a pleasure to be here. Well, it's a pleasure to have you. And I'm hoping we could dive right into some strategies for better decision making. We've done an episode on the effectiveness of checklists, so I'd actually love to get your list of the best advice in that format.
All right, checklist. Okay, so I have four pieces of advice here, probably incomplete, because as we know, anytime one person sits down and writes a list, there are probably some things they're not thinking about. But here are four.
So one is be decision ready. And so what do I mean by that? We often fall back on our intuition and can be biased if we are distracted, if we're tired, if we're hungry, or if we're really rushed and don't really have time to deliberate. And we often make better decisions if we actually, one, know how to make the right decision, so we know what principles to apply, but also have some time to think about and apply those principles.
So there are a lot of situations where we're not decision ready. Maybe we're angry at somebody, or maybe we're trying to do multiple things at once and we're distracted. And so at those times, it's best to put off the decision to another time. I love that, Jack. That's a great item number one. What else? Yeah, I think it's really important. You know, we miss opportunities because we didn't think broadly enough. And so a second piece of advice is to
broaden the frame. A lot of biases actually could be attributed to thinking too narrowly, that we're only thinking about the one option that's in front of us. Should I go to this college? Should I rent this apartment? And when we do that, we often have only one objective in mind. We're not thinking about all the things we care about. So
Imagine a student choosing a college who wants to be a software engineer. They might focus on which school has the best computer science department, and that might turn out to be the best choice for them. But it's also the case that if they stop and think about it, they might realize, well, in the future, I might prefer a different career path. Or there are other things I care about in a college, you know, like the social dimensions and things like that. And so I think
the advice is to try to think more broadly about what can happen, how might our preferences change, what can happen in the world. And at the end of the day, decisions are limited by the option set, right? A lot of research has shown that people are not necessarily so great at choosing. Well, they might be okay at choosing between A, B, and C. But sometimes none of those options
are the right option, right? Sometimes what you actually ought to do is throw out all those options and invent some new ones. And so that's what I mean by broadening the frame. Number three on the checklist, take advantage of the wisdom of others. So seek advice and...
When doing this, make sure to get independent advice. And so don't tell the other person what you think the answer is, because this is going to influence what they tell you. They'll either think that you're asking for confirmation and they'll tell you what they think you want to hear, or their thinking will be influenced by your thinking. Either way, if you really want to get good advice, you're going to be better off getting independent advice. And then fourth, experiment.
Try things that you think won't work. Do this in a low-cost way. The idea here is to try to generate disconfirmation. Try to prove yourself wrong. One way to do this is empirical. So if you think you know what a website should look like, you could do some A-B testing, and you could create variations that are different from what you think are best to kind of test whether or not your idea is correct.
But in other situations, experimentation is difficult, and there we have to rely on the wisdom of others. And one way to do this is to have a devil's advocate, have somebody argue for an alternative point of view.
Yeah, I love that. I often also tell my students to try to find somebody who will play the role of devil's advocate and just argue the opposite when I'm worried that I might be too attached to a given path or argument and not thinking broadly enough. Oh, for sure. You want to think about reasons why you might be wrong. We often think about why we're going to succeed or why our answer is right.
And you want to think about reasons why the answer might be much different from what you thought. And, you know, sometimes we're all on our own. And, you know, this is a little bit challenging. But ask yourself why you might be wrong. There is one other way to do this.
Kind of this, I guess, fourth point of trying to seek disconfirmation is to tackle the same problem, but on different occasions. So one trick that a lot of professors actually use when they're grading papers, number one, is you don't want to have the name of the student on the paper. And it's very tempting to know whose paper you're grading, but if
the name's not there, then you could come up with an assessment of the work which is not tainted by your views about the person. So this is called blinding. But in addition to that, the advice here would be to do it twice. And it takes extra time, but oftentimes what happens is, at any given moment, there's some noise in our judgment, or we have different things in mind.
And so if we look at the same decision problem twice or are grading the same essay twice, we might come up with different answers. And in those cases, if they're different from each other and it's quantitative, you can average the grades together.
If it's more discrete or if it's categorical, what you can do is look at it a third time or have somebody else look at it. And so there's a lot of work along these lines in medicine where if you ask a radiologist to look at the same x-ray on multiple occasions, they might come up with different answers. And this isn't really a critique of their expertise. This is just symptomatic of how difficult the task is.
But for those kind of tricky ones where they're giving different answers on different occasions, then maybe a third look or bringing in another expert can be helpful.
I love that, Jack. And P.S., it also makes me feel better about the fact that I'm that dissertation advisor who, after six months, reads a new draft of a student's dissertation and says, I really don't like the third paragraph of the intro. You should cut it. And they tell me, actually, I added it because you told me to. Anyway, I'm often inconsistent in my advice to my students, but I'm like, well, you're getting the wisdom of the crowd within. Exactly. It's like two independent perspectives. Aren't you lucky? Anyway, that's my framing trick.
There's more than one Katie Milkman. That's right. That's right. You got me hungry today. Jack, this has been so enlightening and helpful, and I really appreciate you taking the time to talk with me and share these insights. Thank you so much. Sure. You're welcome. It was a pleasure. Jack Soll is the Gregory Mario and Jeremy Mario Distinguished Professor of Management and Organizations at the Fuqua School of Business at Duke University.
You can find links to Jack's research, as well as an article we wrote together with John Payne titled Outsmart Your Own Biases, in the show notes and at schwab.com slash podcast. Cognitive and emotional biases can have a big impact on your financial life. To learn more about debiasing techniques and how to approach a variety of specific financial decisions, like how to save for college or what to do with an old 401(k),
check out the Financial Decoder podcast. You can find it at schwab.com slash financial decoder or wherever you get your podcasts. This episode has been a bit atypical. We've shared the great news that just by learning about decision biases, you can improve your judgment. I think we need to run an experiment to test whether listening to this show successfully de-biases people. But there are a lot of different mistakes we make that we need to be aware of.
We often seek confirmation of our existing beliefs rather than searching in a balanced way for information. We misattribute people's behavior to fixed features of their personality instead of their situation and think too narrowly about important problems. And that's just scratching the surface. The good news is that cures exist for these types of biases. As Jack shared, you can set yourself up for success by being decision-ready. Part of that, of course, is being familiar with decision biases.
Carey explained just how valuable that can be. But also, you should make sure you're well-rested, calm, and well-informed before making a consequential decision. It can also be valuable to broaden your thinking by deliberately considering a wider scope of possibilities and future preferences than you would naturally. And you can be intentional about collecting information and seeking independent advice from other people with relevant expertise. Finally, try to prove yourself wrong by testing.
That might mean A-B testing or just having a friend play devil's advocate. I hope that the more you listen to Choiceology, the more reliably you'll be able to dodge decision biases. You've been listening to Choiceology, an original podcast from Charles Schwab. If you've enjoyed the show, we'd be really grateful if you'd leave us a review on Apple Podcasts, a rating on Spotify, or feedback wherever you listen. You can also follow us for free in your favorite podcasting app.
And if you want more of the kinds of insights we bring you on Choiceology about how to improve your decisions, you can order my book, How to Change, or sign up for my monthly newsletter, Milkman Delivers, on Substack. Next time, we'll bring you stories about two very different battles from the American Revolution, and we'll share how people make decisions in the face of real versus hypothetical losses. I'm Dr. Katie Milkman. Talk to you soon.
For important disclosures, see the show notes or visit schwab.com slash podcast.