
53. What’s the Secret to Making a Great Prediction?

2021/5/23

No Stupid Questions

People
Angela Duckworth
Stephen Dubner
American author, journalist, and radio and television personality, best known for the Freakonomics series.
Topics
Stephen Dubner: This episode explores the secret to making accurate predictions, and how even successful people have bad days and can learn from them. Using his son as an example, he illustrates the importance of staying rational and objective when predicting, rather than letting personal feelings intrude. He also discusses the hallmarks of superforecasters, such as gathering evidence from multiple sources, thinking probabilistically, working in teams, and continually evaluating the accuracy of their own predictions. He shares how he recovers from a bad day, for instance by reflecting on his own decision-making mistakes and learning from them.

Angela Duckworth: Angela Duckworth explains why experts perform poorly at prediction, citing hindsight bias and overconfidence. She stresses the importance of taking the outside view: considering the base rates of similar phenomena rather than focusing only on the particulars of a single case. She also discusses the advantages of prediction markets and how probabilistic thinking improves forecast accuracy. On why even successful people have bad days, she argues that successful people typically attempt more challenging tasks and therefore fail more often, and cites research showing that fluctuations in mood and productivity are universal: everyone has days that are bad relative to their own baseline. Rather than trying to avoid bad days, she suggests learning from them and turning them into fuel for improvement, offering strategies such as committing to a fresh start, forgiving yourself, recording positive events, and noting lessons learned.


Chapters
The episode explores the art of making accurate predictions, discussing the qualities of superforecasters and the challenges experts face in predicting the future.

Transcript


This episode is brought to you by AARP. 18 years from tonight, Grant Gill will become a comedy legend. When he totally kills it at his improv class's graduation performance, knees will be slapped.

Hilarity will ensue. That's why he's already keeping himself in shape and razor sharp today with wellness tips and tools from AARP to help make sure his health lives as long as he does. Because the younger you are, the more you need AARP. Learn more at aarp.org slash healthy living.

From Freakonomics Radio, a series about the economics of higher education. The supply and the demand, the controversies and the hypocrisies, the answers and the questions. Why are more women going to college than men? What happens when Black and Hispanic students lose admissions advantages?

How does the marketplace for higher education operate? I tell you something, it's a darn good question. Freakonomics Radio goes back to school, a series from the Freakonomics Radio podcast. This is what we're going to do. It's going to work out well because I said so. I'm Angela Duckworth. I'm Stephen Dubner. And you're listening to No Stupid Questions. Today on the show, what's the secret to making a great prediction?

My prediction would have been right if these things had happened. Also, why do even the most successful people have bad days? The eggs were overcooked. Angela Duckworth. Stephen Dubner. There is at least one way that I would very much like to be more like my son, Solomon, who's a college sophomore.

He seems capable of making predictions in a way that is totally divorced from emotion, even when he has a stake in the thing he's predicting. So, as an example, he works in politics. Okay. And

he follows things closely and he has some decent information. So he often has a pretty good sense of who's going to win a campaign, whether it's his candidate or the other. And even if it's the opponent, he's pretty realistic about not letting his fan interests get in the way. The same thing for sports.

And he'll even say, I have a lot of confidence in this particular projection. Even though he has never studied with the forecasting guru Phil Tetlock, whom you know (he's probably never heard of Phil Tetlock), he's able to really assess a probability about something that he cares about and then follow it in a really unemotional way. And when I watch him do that, I think I would like to have some of that.

Not just predictions and not just betting and not just sports and politics, but how can I and maybe other people learn to make decisions that are less influenced by emotion and then relatedly not get swept up by the emotional response while monitoring the outcome?

Well, that is really remarkable, actually, that Solomon is able to make predictions about things that people have emotions about in particular, right? You know, is my candidate going to win? Will my team win the Super Bowl? And not to get wrapped up in that. Sounds like he is what the venerable Phil Tetlock would call a superforecaster. What Phil would need in order to

say that Solomon was indeed a superforecaster is to actually have Solomon make a series of predictions. How's the stock market going to do? Are we going to invade this country or not? Will the United States make this decision or that decision? And then he would actually see how accurate Solomon is. That's how Phil identified with his colleagues a group that

consistently performed in the top 2% of forecasters who were all competing in a tournament that extended over years. And we should say one reason he ran that superforecasting tournament

was because in many, many years of work before that, he studied pundits, essentially: people who in their profession make predictions. So these are people in academia, political scientists, let's say, and people in the stock market. So he looked at all these different domains, and what he found, which

is both surprising and yet not surprising at all, is that experts are really bad at predicting the future, because predicting the future is really hard. But experts may have an added dilemma in that they tend to have a lot of confidence, including overconfidence, which Tetlock identified as a particularly

powerful and problematic trait working against our ability to predict well. Yeah, this study that he did, which is really one of the big studies to happen in the last 20 or so years, found that experts' predictions are really only very slightly better than

throwing darts at a dartboard. So why is it that they're so confident? It may be that, first of all, many experts are making quote-unquote predictions in hindsight. So they are the Monday morning commentary on what happened at the Sunday football game and

we turn to the experts for an explanation. But of course, in hindsight, everything is 20-20. The other reason I think confidence ends up becoming a hallmark of commentators whom we assume to be experts at what they do is that, if it were not for that confidence, we wouldn't be listening to them. And there might be a weak positive correlation between the confidence in your judgment and its accuracy. So when we, for example, go and ask a waiter, what's the best thing

on this menu. If the waiter very confidently says, you must get the chicken salad on a croissant, hands down, it's fantastic, most people, including me, would be more inclined to order it than if the waiter were like, I don't know, maybe the chicken salad?

See, when I hear I'm pushing the chicken salad on the croissant, I think, oh, they made four gallons of it Thursday and they can't get rid of it. It's about to go bad. But in general, I think there could be a weak positive correlation between the accuracy of a forecast and our confidence in it. And that might lead the audience to spuriously conflate

confidence with accuracy. So you said of Solomon, not only does he have this uncanny ability to be rational and deliberate and therefore maybe more accurate than you are about the same kinds of forecasts, you said he was very confident in his forecasts. Well, he will say occasionally, this is a forecast that I have a lot of confidence in.

Let's say it's electoral outcomes. So during the most recent election, he worked on a couple of campaigns, and I think both his candidates lost. And he told me long before the outcome that one of them was definitely going to lose, even though I thought that person was a favorite. The other one was...

bit of a dark horse, but was getting a lot of headlines. And so everyone assumed that that person was going to win. The coverage certainly said it. Some of the polling said it. But Solomon told me, like, nah, it's not going to work out that way. The interesting thing, just to underscore, though, on confidence is that superforecasters as a group would tend to actually have less confidence in their predictions overall. One of the

mistakes that most people make when they are making predictions is to have too narrow a confidence interval. Confidence interval is a statistical term, but in lay terms, it would mean being overconfident about how precise you are in, you know, guessing how many electoral votes one candidate is going to get versus the other. So this idea that to be a superforecaster, you would not only try to divorce

personal bias or emotion from your predictions, but also that you'd be willing to say, hey, you know what? There's actually a large range of possibilities. And I think this is going to happen. But to some extent, who knows?
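That point about too-narrow intervals is mechanically checkable. Here is a minimal Python sketch with invented numbers: if your stated 90% intervals are honest, roughly nine in ten of them should contain the eventual outcome.

```python
# Hypothetical check of interval calibration: if your "90% confidence"
# intervals are honest, about 9 in 10 should contain the actual outcome.
intervals = [(280, 310), (250, 330), (190, 240), (300, 350), (260, 290)]
outcomes = [306, 232, 227, 365, 304]  # invented electoral-vote results

hits = sum(low <= actual <= high
           for (low, high), actual in zip(intervals, outcomes))
print(f"{hits}/{len(intervals)} of the '90%' intervals contained the truth")
# Hitting far less often than 9 in 10 is the signature of overconfidence.
```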

Can I ask you a question that's always confused me about the prediction literature? I know that Phil Tetlock and his colleagues argue that one component of a good predictor, or a better-than-bad predictor at least, is that they tend to employ what's called an outside view. I believe that phrase may have come from Danny Kahneman, not from Phil Tetlock, right? Right. When Danny talks about the outside view rather than the inside view, he's talking about looking at the base rate of similar phenomena in the world that are not in this special case. For example, if a company says, you know, we're really having a problem with our inventory management, could you come and work

with us. We're going to tell you about our inventory. We're going to talk about our org chart. We're going to show you our numbers. That's all the inside view, because that's the particulars of that company. But really, the outside view is also helpful, which is: in general, when people have inventory management problems, what is the typical culprit? And if you then

think about things like hiring. I make this mistake all the time: when I have a candidate that I start to root for, like I think I'm going to hire this person, I feel like all I take is the inside view. I just reread their CV again and convince myself again that they're perfect. Now, the outside view, Danny might say, would be this: Angela,

Forget all about that. At a base rate, how many people are successful in this job, in that they go on to, you know, a PhD program, etc.? Then I would say, well, Danny, that's kind of sobering, because, you know, not many. You use the inside information to calibrate, but the starting point is the base rate. Right. Especially when you talk about hiring or project completion.
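To make the outside view concrete, here is a minimal sketch of the arithmetic being described: anchor on the base rate, then let the case-specific inside-view evidence adjust it via Bayes' rule. All the hiring numbers below are invented for illustration.

```python
# Hypothetical numbers: anchor on the outside-view base rate, then update
# with inside-view evidence using Bayes' rule.

def update_from_base_rate(base_rate: float,
                          evidence_given_success: float,
                          evidence_given_failure: float) -> float:
    """Posterior P(success | evidence), starting from the base rate."""
    p_evidence = (base_rate * evidence_given_success
                  + (1 - base_rate) * evidence_given_failure)
    return base_rate * evidence_given_success / p_evidence

# Outside view: suppose historically 15% of hires truly flourish in the role.
# Inside view: a glowing CV shows up for 60% of eventual successes,
# but also for 30% of eventual failures.
posterior = update_from_base_rate(0.15, 0.60, 0.30)
print(f"P(success | glowing CV) = {posterior:.2f}")  # ~0.26
```

The point of the sketch is the ordering: the base rate is the starting point, and the glowing CV moves the estimate from 15% to only about 26%, not to the near-certainty the inside view alone suggests.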

I do remember seeing this research about the usefulness of what are called prediction markets. A prediction market is essentially creating a market for predictions so that people have some skin in the game. And we've seen that prediction markets are better at predicting than pundits, because they represent a diversity of views and...

A pundit will often have a rooting interest, but not that much at stake. So it's fun to go on TV or write an op-ed for the Times or the Journal and make a big, bold prediction without necessarily being right. And then they have all different ways to explain after the fact: well, my prediction would have been right if these things had happened. Well, yeah, that's why predicting is hard. But what a prediction market will do is bring together all the information.

One area where I've read about it applied in a particularly fruitful way is within companies that are trying to, let's say, open up business on a new continent or start a new product. What happens is you have this internal prediction market and it may be anonymous.

You ask people, what is your level of expectation that this project will be completed on time? If it's not, what will be the problems? And what this tends to do is float up, especially from the bottom of the organization, information that's much more realistic rather than the project manager or the CEO or the CFO saying,

this is what we're going to do. It's going to work out well because I said so and because we have a plan. Leaders of this sort often experience what's called go fever. I think the phrase came from NASA, where once you put the rocket on the launch pad, you really want to launch it.

Firms and institutions all over the world do this all the time without really knowing how well that project launch will go. So if you can have this anonymous internal prediction market, you can probably glean much more useful information, because you're hearing from the people who have better information than the phony fans or people who are rooting at the top.
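The pooling that makes such an internal market useful can be sketched in a few lines. This is a hypothetical illustration, not any particular firm's system: gather anonymous probability estimates and combine them with a robust statistic such as the median, which a couple of boosters at the top cannot drag around.

```python
# Hypothetical anonymous internal forecasts of "project ships on time".
# Names and numbers are invented for illustration.
from statistics import median

estimates = {
    "exec_sponsor": 0.95,  # go fever
    "project_mgr": 0.80,
    "engineer_a": 0.40,    # people closest to the work are often less rosy
    "engineer_b": 0.35,
    "qa_lead": 0.30,
}

# The median is robust to a handful of extreme answers, so a couple of
# wildly optimistic entries can't single-handedly move the pooled view.
pooled = median(estimates.values())
print(f"Pooled probability of on-time delivery: {pooled:.0%}")  # 40%
```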

Rooting for an outcome and allowing that to cloud your judgment of whether the outcome is going to happen is indeed a dangerous thing. We want something to be so, and therefore we predict that it will be so. And superforecasters, as a rule, tend to do that less. And that indeed makes them better forecasters. I know that Phil has spent much of his life studying how what people

want to believe and what people do believe end up blending; our ideology clouds our judgment. And I do think that if you run a large organization, rather than having the four people whose vested interest is in promoting this new project and getting their part of the budget allocated for the next fiscal year, taking everyone's judgment and weighing it, the obvious problem of that is: say you're in this prediction market and you get an email that says, hey,

you know, we're thinking about starting this new branch. We're wondering, what do you think the likelihood of success is? What's your lower bound? What's your upper bound? What you don't have is a lot of information. One of the things that Phil Tetlock and Barbara Mellers, his wife and collaborator, and their colleagues would say is that

it's not that all opinions are equal. Like the wisdom-of-crowds idea, which is that collectively we may come to a better answer, that is true. But it doesn't mean that all people's answers are equally good or bad. You know, there were two other attributes that I take real comfort in when Philip Tetlock talks about people who are better than average at predicting. And you touched on both of them briefly.

One is that you can't be dogmatic. If you believe your prediction will come true because it is what you believe in, then it's likely to be a pretty poor prediction.

And another one is just humility, essentially, but it's really acknowledging not only what you don't know, but what is unknowable. And I think that's something that people have a really hard time with, even with something as low stakes and binary as sports. It's remarkable when you hear sports pundits who are really bad on average at predicting outcomes. And these are people who have a lot of experience, a lot of information, but their prediction records are often terrible.

And I think the reason why is even though they might be a former player or former coach and they may know much more about the game than the average person, there is so much more in the way the world works and the way a particular athlete or team will perform on a given day. And then there's chance. And how do you measure chance? We're not very good at those things either. So it kind of makes you want to throw up your hands and say, well, the world's unknowable.

I know it will be colder in winter than it is in summer. But other than that. Well, OK, we can take some lessons from the superforecasting research that Phil and colleagues did when they studied the ability of this top two percent of their whole prediction tournament.

And they did this year on year. So then at the end, they actually had a reasonably large number of superforecasters they could study, and they could look for systematic features of these superforecasters that we might emulate. One is that superforecasters systematically gather evidence from a variety of sources. Now, what this looks like for us is: don't just read The New York Times, right? Like, yes, love

your New York Times, but also occasionally read the Wall Street Journal or The Economist or tune into Fox News just to see what's going on. So don't narrow yourself to a very small number of informational sources that are starting to become redundant with each other.

The second thing is that superforecasters tend to think probabilistically. This doesn't mean that they were mathematicians. Actually, it's not the case that the superforecasters necessarily had a statistics background or a higher-level mathematics background. They simply thought probabilistically. For example, if you tell me, I think it's going to rain tomorrow, what a superforecaster would immediately do is think

that there's some probability that it's going to rain, that there's an upper bound and a lower bound. They're not thinking it'll rain or not rain. And Phil has done intervention research to show that you can actually help people think a little bit more probabilistically. Third is they work in teams and they tend to benefit a lot from other people's opinions. And then the last thing I think is most important because it might explain why

sports commentators and political pundits are not as expert as they ought to be, which is that you have to basically keep score of how you're doing. When your prediction turns out not to be true, then you need to examine what happened and you have to admit that you're wrong and then update your beliefs. And I think that's the thing that pundits and commentators are not incentivized to do.
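One concrete way to keep score is the Brier score, the kind of scoring rule used in forecasting tournaments like Tetlock's. A minimal sketch with made-up forecasts: it is just the mean squared gap between the probability you stated and what actually happened (1 if it did, 0 if not).

```python
# Brier score for binary forecasts: lower is better. Always guessing 50%
# scores 0.25, so anything below that beats pure fence-sitting.
def brier_score(forecasts: list[float], outcomes: list[int]) -> float:
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

forecasts = [0.9, 0.7, 0.2, 0.6]  # probabilities you assigned to four events
outcomes  = [1,   0,   0,   1]    # 1 = it happened, 0 = it didn't

print(f"Brier score: {brier_score(forecasts, outcomes):.3f}")
# (0.01 + 0.49 + 0.04 + 0.16) / 4 = 0.175, better than chance, room to improve
```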

You know, in terms of thinking probabilistically and explaining your predictions in probabilistic terms, I think actually that meteorologists have done an amazing job. If you get the five-day weather forecast, there's a 20% chance of rain next Sunday morning. That means it's probably not going to rain, but if it does, I've got plausible deniability, 20%.

It's very rare, at least in this part of the country where I live in the Northeast, that you see 100 percent chance of anything. I actually think that's brilliant. What I would like is to have all my pundits talk to me that way. Like there's a 70 percent chance that this very likely thing will happen. But let's not deny the fact that 30 percent is not nothing. Right. There's a 60 percent chance that this health care reform is going to be good, like net for everyone. There's a

40% chance this is going to be bad. We hope you'll vote for it. I like that. It makes the Weather Channel come out as one of the great elevators of the human experience. And I actually, in all seriousness, wonder whether more people think probabilistically because of the Weather Channel.
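What makes those weather numbers trustworthy is calibration, and calibration can be checked: bin past forecasts by stated probability and compare each bin against how often it actually rained. A minimal sketch with invented data:

```python
# Calibration check: group past forecasts by stated probability and compare
# with the observed frequency. Data below are invented for illustration.
from collections import defaultdict

history = [  # (stated probability of rain, 1 if it rained else 0)
    (0.2, 0), (0.2, 0), (0.2, 1), (0.2, 0), (0.2, 0),
    (0.7, 1), (0.7, 1), (0.7, 0), (0.7, 1),
]

buckets = defaultdict(list)
for stated, rained in history:
    buckets[stated].append(rained)

for stated in sorted(buckets):
    observed = sum(buckets[stated]) / len(buckets[stated])
    print(f"said {stated:.0%} -> rained {observed:.0%} of the time")
# A well-calibrated forecaster's 20% days really do rain about one time in five.
```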

On the other hand, the Weather Channel is really good at taking a squall in some state a thousand miles away and making it into the end of the world. Stephen, they have 24 hours of programming to get through every day. You got to make the most of a squall. Still to come on No Stupid Questions, Stephen and Angela discuss ways to get over a bad day. It's why there's a thing called cocktail hours.

No Stupid Questions is sponsored by Rosetta Stone. Traveling to a place where people don't speak a lot of English? Then Rosetta Stone, one of the most trusted language learning programs, is for you. Rosetta Stone teaches through immersion, like matching audio from native speakers to visuals, reading stories, participating in dialogues, and more.

The true accent feature even provides feedback on your pronunciation. Plus, learn on the go with convenient, flexible, and customizable lessons as short as 10 minutes. Rosetta Stone can be used on a desktop or as an app, and you can download lessons for offline use. See for yourself why Rosetta Stone is beloved by millions.

For a very limited time, our listeners can get Rosetta Stone's lifetime membership for 50% off. That's 50% off unlimited access to 25 language courses for the rest of your life. Redeem your 50% off at rosettastone.com slash questions.

This episode is brought to you by AARP. 18 years from tonight, Grant Gill will become a comedy legend. When he totally kills it at his improv class's graduation performance, knees will be slapped.

Hilarity will ensue. That's why he's already keeping himself in shape and razor sharp today with wellness tips and tools from AARP to help make sure his health lives as long as he does. Because the younger you are, the more you need AARP. Learn more at aarp.org slash healthy living.

Stephen, I have a question from Tyler Thorstrom. Here it is, very briefly. Why do successful people have bad days? Why do successful people have bad days? We may have to change the name of the show. Oh, you think it's a... Don't say it! That is a borderline... Tyler, cover your ears. I mean, my answer would be, why wouldn't...

successful people have bad days. But maybe Tyler meant to ask something beneath the surface that I'm not seeing. I'm guessing that's what it is. Maybe he means to say, why do even successful people have bad days? I think this question actually might be, and Tyler, forgive me if I have this wrong, that say you really are a productive, effective, wonderful human being. A successful person, in other words. Yes. Now, you have all these wonderful competencies, but

why wouldn't you only have good days? In some ways, it makes a lot of sense. Like, why is there variability in your life after you've achieved a certain capability, right? I don't know. I think that feels absolutist and utopian in a weird way. But look, I kid, Tyler. I think it's a really interesting observation and good question in terms of why, quote, even successful people have bad days. And you could argue, depending on how you want to define a successful person,

that people who are, quote, successful probably try more difficult things than people who are not successful. And therefore, you might argue that they would have more bad days than unsuccessful people. I was just reading this book. I read that one, too. Did you? We're both doing the same thing. There was a quote there. This is now fourth hand or something.

I read the book, and then the author said that the founder of Spanx had a father who, when she was a little girl growing up, would ask, what have you failed at today? And that taught her that, look, if you're going to do hard things, you're going to fail. Is there any evidence from the psych literature showing that there are some humans who've never had a, quote, bad day? Well, there are so many studies that track people's experience over time.

And so much evidence that there's variability in mood over time, in relationships over time, and in productivity, that just on that evidence, you could say that everybody has, at least relative to themselves, a bad day. Let me ask this, channeling my inner Tyler Thorstrom.

Stephen, do you have bad days? Let's assume that you would count as a successful person. What are they like, and why do you have them? I appreciate you counting me as a, quote, successful person. I'm flattered. Do I have bad days? Absolutely. But I don't think of them as bad days. I think of them as,

a bad thing or three happened on this day. Okay, give me some examples. Well, when bad things happen, that's not what I'm talking about when I think of a bad day, because terrible things happen all the time. Someone gets killed or there's a terrible tragedy, like the world is full of bad things happening. If you're the person to whom that happens, that's the very definition of an infinitely bad day. You're right. We should acknowledge that. But I don't think that's what Tyler meant.

Right. And there's a spectrum of bad. For me, when a bad thing happens that touches me somehow, but it's not of my doing,

I don't have that big of a problem with that. I feel it's unfortunate. If there's a failure with our work. Like the power goes out or a file gets deleted. Yeah, it's unfortunate and you deal with it. To me, what feels like a bad day or bad incidents in a day is when I did something that I regret. So you had to have some volition. Some agency, yeah. For me, a bad day feels like when I've made a series of decisions that led to

a suboptimal outcome and maybe a bad emotional response, maybe hurt feelings or frustration. That's what feels like a bad day to me. So I'll give an example. Let's say we're working on a particular piece, a podcast or a book chapter or something like that.

And you're constantly making decisions about what's an idea worth pursuing. Is it new and interesting? Is it demonstrably true? What's the data or stories to explain this idea to a listener or reader? And there will be a decision along the way, like we think this person would be a good contributor to that. And in the pit of my stomach, I may think, what I know of their work, it doesn't look that good. It doesn't look that interesting. It doesn't look that current.

It doesn't look that empirical. And then I'll either talk myself into it or let myself get talked into it. And then you start to prepare for hours and hours.

And then the interview comes. And in the back of my mind, I'm thinking this might have just been a massive waste of time. And then it is like within five minutes, I can tell that this person is just not the right person for what we're trying to do. They don't have a command of the literature or the history that we're trying to tell. They are not an interesting talker, whatever it is. And then afterwards, I am ticked. That's a bad day.

that's something that I'm going to feel bad about for a while and be frustrated at. And then what happens? And then I think, okay, let's post-mortem the crap out of that sucker. Why did it happen exactly? Where did our assumption go wrong? Maybe it's just chance. Maybe I just didn't do a good job. Maybe the energy just wasn't right. Maybe their kid was home with the flu and they were distracted and worried.

But it does lead you to constantly change your protocol of how to figure out how to spend your time more productively. Because as my friend Angela Duckworth has told me time and again, life is short. And even an hour that you could spend doing something interesting, productive, fruitful is worth doing interestingly and productively and fruitfully. And yet I still fail at that kind of thing.

probably as much now as I did 10 years ago. So I'm not really getting very good at figuring out how to solve that sort of problem retrospectively. You're doing pretty well, though. You've got this curious attitude about what you can harvest from this crop of failure. I'm feeling like you're

pretty evolved. I've been reading this book from, gosh, I don't know, the 70s or something called The Erroneous Zones or Your Erroneous Zones. Have you ever heard of it? So this is a play on your erogenous zones, correct? It's a play on your erogenous zones. It was like one of these early self-help books. It was a runaway bestseller. The reason I'm reading it is because my literary agent, Richard Pine, told me it's a book that changed his life.

His dad was also a literary agent. And apparently this manuscript comes in, and I don't know, maybe nobody in the office had any interest in it. And his dad gives it to Richard, and he says, take a look at this. Tell me what you think. Richard reads every word, and then his dad says, what should we do with it? And Richard says, it's the best book I've ever read. It changed my life. I'll never be the same. What this book says is that I have choices, and I

I am the sum total of those choices. I have agency. The book is titled Your Erroneous Zones because you have mistakes that you're likely to make, traps that you're likely to get in being the human that you are. And it's a guide to making better choices. So I was listening to the audio version of this book. I think the author is Wayne Dyer.

And Wayne says in Your Erroneous Zones that at an earlier point in his life, he didn't think there would be such a thing as a bad day if you did everything right. And part of growing up is just realizing that no matter how successful you become, and even if you do make all the quote-unquote best choices, you should expect

that you will have some days that don't feel good, and that should be okay. You shouldn't run away from it or labor under the illusion that it's anything but that way. So these are really nice pieces of advice, and it reminds me of something I heard recently that just so tickled me. I have a friend who is a doctor. He's maybe 60, early 60s, and he was telling me about a friend and mentor of his, I believe he's a rheumatologist,

the older gentleman; he's in his 80s now and he's still practicing medicine, still teaching. And my friend, whose name is David, says that every time he sees this 80-year-old friend and mentor, he says, you know, how are you doing? And the answer is always the same: Never been better.

Which somehow when you're in your 80s, it seems wrong. You've never been better. Come on. What about when you were 21? And he says, if you say it every day to yourself and to other people, it's not like it's a magic trick and it becomes true. But a little part of you believes a little part of it and makes it better. But if the inclination is to eliminate bad days, I don't think that is a good inclination. Because then you'll take no risks. You'll reach for nothing beyond your comfortable reach.

Well, not that I'm Wayne Dyer, and not like I wrote Your Erroneous Zones, but in this 1970s self-help book, he does say that one of the pieces of advice he would like to impart is that what's done is done. The past is the past. And not to relive that bad day for many more days. Make it at least one bad day and not an infinite number of bad days by revisiting it, chewing on it, ruminating. One solution

that I've come up with is if I had what you might call a quote bad day,

I try to focus on: What can I do between now and tomorrow to be in a good frame of mind for tomorrow? Because I don't want it to carry over. And part of what I do then is, I think, what a lot of people do; it's why there's a thing called cocktail hour. Now, everybody has a different form of cocktail hour, but whatever it is, it's to try to put an ending to even a bad day that feels like a better ending. This goes back to this

colonoscopy paper that we talked about a while ago on the show. This was back when colonoscopies were painful to have administered. They're not anymore. But the notion was that if the end of the procedure is less painful than the rest of it, then your impression is that the whole thing wasn't so bad. Now, you could say it's just lipstick on a pig. The bad day was the bad day. Or

You could look at it with a little more optimism. My mom used to say, a little powder and paint makes a girl what she ain't. The bad day recedes as the happy ending takes over. And I don't know if a little self-delusion isn't, in fact, quite helpful in order to get you on a better track so that tomorrow at least you have a belief that you'll have a better day, which I think is important, honestly. Yeah.

And I guess there's a reason why happy hour is not at 10 in the morning and it's not at noon. You know, let's end our day with some happiness.

Uh, that's one reason. I think there may be some other reasons for that. So let me ask you this. You hang out with a lot of people who have really studied how to change the mindset with which you approach a task or a day. And you've done this yourself for years. This is what you do. So let's say that Tyler or someone comes and says, I had a really bad day and I'm determined to have a better day tomorrow, even though the circumstances that produced the bad day today may still exist.

So what are some ideas to at least approach it with a little bit more optimism? Any kind of reset or fresh start or other good mind tricks? Well, you just named one of them, and that's the idea from Katy Milkman of having a fresh start. And that is actually more mental than physical: you would just frame the new day as the beginning of a new streak, and things are actually going to be different. And that could happen at the beginning of the month,

the beginning of the week, your birthday, January 1. But you could also just mentally create your own fresh start. Like today is a new day. I turn a new page. I'm very fond of a math teacher in New York City named Jeff Lee, and he lives by the mantra, forgive yourself every night

recommit every morning. When I hear that, I almost can picture a river where you baptize yourself every night and you forgive yourself and you begin fresh and new with commitment to go do the good work that you're doing. That's, again, something of a mental fresh start reframing. I have an understanding

Yeah.

Zero. Plus one. Good day. Very simple. And then he says one thing that he did that was completely factual. Like, I had two eggs sunny side up for breakfast. Completely mundane. That would make my whole day good right there. Just the two sunny-side-up eggs. Especially if they're just right.

And then finally, something that he learned. And I think in a way, that's the best attitude for any day, honestly, but especially the bad ones. Because you can imagine if you said minus one bad day, the eggs were overcooked. That's what I had for breakfast.

And then what I learned is I should take the pan off the stove one minute earlier if I want my eggs to be sunny side up. There you go. And I think that was good advice. That's good advice. I hope Tyler enjoys this advice, too. I hope Tyler is not vegan.

No Stupid Questions is part of the Freakonomics Radio Network, which also includes Freakonomics Radio, People I Mostly Admire, and Sudhir Breaks the Internet. This episode was produced by me, Rebecca Lee Douglas. And now, here's a fact check of today's conversations.

Stephen recalls that the phrase "go fever" originated with NASA. This is correct, but more specifically, the term was initially used to describe the 1967 Apollo 1 disaster. Three astronauts died when their command module caught on fire during pre-flight testing, weeks before their scheduled launch. A review board determined that a number of design oversights led to the fire. The cabin had been pressurized with pure oxygen,

and there were many combustible materials in the capsule, in addition to, quote, vulnerable wire and plumbing. Critics blamed go fever, a groupthink phenomenon in which protocols were rushed in order to meet President Kennedy's challenge of landing a man on the moon before 1970.

Later, Stephen and Angela share their thoughts on how to recover from a bad day. Stephen tells the story of an 80-year-old rheumatologist who says he's, quote, never been better when anyone asks him how he's doing. This is a story that he tells in a book called The Story of a Bad Day.

This is actually a form of self-affirmation. In psychotherapy, a self-affirmation is a positive statement about the self that a person repeats on a regular basis. Mental health professionals may recommend this tool as part of a treatment program for depression. Regular affirmations have been shown to improve a person's relationships, reduce their negative thoughts, and encourage behavior change.

Angela suggests additional tools to help move on from a bad day, including committing to a fresh start, journaling, and trying not to ruminate.

Stephen recommends wrapping up your day with a metaphorical cocktail hour, whatever that means to you. Cognitive behavioral therapists might also recommend listening to music, spending time outside, talking to loved ones, practicing mindfulness, and getting good sleep. While a literal cocktail with friends may be the solution for some, for others, alcohol can result in increased feelings of next-day anxiety, or "hangxiety." That's it for the Fact Check.

No Stupid Questions is produced by Freakonomics Radio and Stitcher. Our staff includes Alison Craiglow, Greg Rippin, Mark McClusky, James Foster, Joel Meyer, Tricia Bobeda, Zach Lipinski, Mary Diduch, Brent Katz, Morgan Levey, Emma Tyrrell, Lyric Bowditch, Jasmin Klinger, and Jacob Clemente. Our theme song is "And She Was" by Talking Heads.

Special thanks to David Byrne and Warner Chappell Music. If you'd like to listen to the show ad-free, subscribe to Stitcher Premium. You can also follow us on Twitter @NSQ_Show and on Facebook @NSQShow. If you have a question for a future episode, please email it to [email protected]. And if you heard Stephen or Angela reference a study, an expert, or a book that you'd like to learn more about,

You can check out Freakonomics.com slash NSQ, where we link to all of the major references that you heard about here today. Thanks for listening. I'm going to put my retainer back in. Hold on a second. I have to, like, suck all the food out of my teeth. The Freakonomics Radio Network. Stitcher.