
Nate Cohn Threads the Needle

2024/2/1

Beyond the Polls with Henry Olsen

People

Henry: Host of Beyond the Polls; political analyst and commentator.
Nate Cohn: Chief political analyst at The New York Times.
Topics
Henry: The Biden administration has failed to reach a deal with Republicans on the border, leaving it in a bind both in domestic politics and on aid to Ukraine. Chaos at the border is Trump's main political asset; by failing to solve the problem, the administration has handed Trump a political advantage. In order to placate the progressive wing of the Democratic Party, it has neither adopted a politically advantageous electoral strategy nor delivered timely aid to Ukraine.


Chapters
Henry Olsen discusses Joe Biden's challenges in striking a deal on border control, highlighting the political and internal party pressures affecting his decisions.

Transcript


Hey there, entrepreneurs. What's the easiest choice you can make? Outsourcing tasks you hate. What about selling with Shopify?

Shopify is the global commerce platform that helps you sell at every stage of your business. From the launch your online shop stage to the first real life store stage, all the way to the did we just hit a million orders stage. Shopify is there to help you grow. Whether you're selling scented soap or outdoor outfits, Shopify helps you sell everywhere. With their all-in-one e-commerce platform and in-person POS system, Shopify's got you covered.

Shopify helps turn browsers into buyers with the internet's best converting checkout, 36% better on average. Plus, you can sell more with less effort thanks to Shopify Magic, your AI-powered all-star. I love how Shopify gives you everything you need to take control and take your business to the next level. Sign up for a $1 per month trial period at shopify.com slash dax, all lowercase. Go to shopify.com slash dax now to grow your business no matter what stage you're in. That's shopify.com slash dax.

Hey, it's Ryan Seacrest. Life comes at you fast, which is why it's important to find some time to relax. A little you time. Enter Chumba Casino. With no download required, you can jump on anytime, anywhere for the chance to redeem some serious prizes. So treat yourself with Chumba Casino and play over 100 online casino-style games all online.

for free. Go to ChumbaCasino.com to collect your free welcome bonus. Sponsored by Chumba Casino. No purchase necessary. VGW Group. Void where prohibited by law. 18 plus terms and conditions apply.

Welcome back to Beyond the Polls. This week features an extended discussion with the New York Times Chief Political Analyst, Nate Cohn, along with my weekly rant. Let's dive in.

There's lots of things to rant about. It's American politics, after all. And if you're not ranting, you're not listening. But this week's rant has a title: Biden on the border, re-election on the fence. It's the beginning of February. Joe Biden's aid package to Ukraine is still being held up as negotiations with Senate Republicans aren't producing a clear path that the GOP can live with.

What's going on? I thought saving Ukraine was essential to the future of democracy, the future of the world, etc., etc., etc. But yet we're in month three of these negotiations and progress continues to be like progress on the ground in the Donbass, slow.

What makes it even odder is Joe Biden's political situation. Yes, his job approval remains stuck around 40%. It's four points lower than it was 11 months ago, despite good economic news. So what's going on? Well, it might have something to do with the continuing record high number of migrants crossing into the country illegally. It's going up. It's going up. It's even making news.

People see this. They know this. Opinions are changing. People who used to be opposed to the wall are now for it. People who used to be for a path to legal citizenship are now for deportation. And Joe Biden's job approval on immigration is a paltry 32%. So you think this is a win-win.

Joe Biden saves the world with the aid package to Ukraine. He saves his political future with a border deal that the American people want and that can show him to be bipartisan. So why can't he get to yes?

Well, the answer is internal Democratic politics. You see, Joe Biden is not just another politician. Joe Biden has superpowers. Yes, it may seem strange that Uncle Joe has this political superpower, but the fact is, for 50 years, he has had an innate ability to ascertain where the center of opinion in the Democratic Party is at any particular time and effortlessly occupy it.

This is a guy who was pro-life when that was the center of public opinion. He was anti-busing when that was the center of public opinion. And now look where Joe Biden is. That's a degree of flexibility over five decades that is really remarkable. But the problem is his typical way of doing this is to occupy a position that satisfies all the competing contestants. And that works when you're a senator.

But it's harder when you're a president. It's harder when you're a president in part because you're no longer one of a hundred. It's hardest when you're president, though, because oftentimes you have to make decisions that can't split the difference. He's running into this in Israel, where he's trying to support Israel while also getting Israel to support the ceasefire and the end of the war that his left wants.

He can't make people happy there. So by continuing the aid to Israel, he's continually aggravating his left on that because the rhetoric isn't enough for them. He's trying to split the difference. He's de facto choosing to support Israel, but he's not solving his political problem because he's trying to be a senator, not a president. And that's what's going on here at the border.

is that he knows that a third to half of his party like what's going on down there. Now, they would prefer, kind of like the withdrawal from Afghanistan, that it not be so chaotic and not be so uncontrolled. But do they like the idea

that three and a half million people have presented themselves at the American border and that many of them are being admitted? Absolutely. Do they like the idea that they're not being held in detention centers or tents that are being rapidly built in the desert, but instead they're going to communities and that maybe even they can work and embed their families here? Yes, they want that. It's not what the middle American voter wants, according to the polls, and it's certainly not what the Republican Party wants.

But in order to get to yes, you have to say no to somebody. And that no is the Democratic Party's left progressive wing. So then the question is, why can't he do that? It doesn't seem to matter to him that doing so would be politically sound for the general election and would free up the aid for the policy that he says is crucial.

Absolutely crucial to American interests. He doesn't seem to care that his very policy of placating the progressive wing is hurting the two things that would seem to be most important to him.

Maybe it will take something to concentrate the mind. Maybe it will take the Ukrainian line collapsing in some way and an unnecessary fallback to raise the specter of a second Afghanistan. We know that once a line collapses, movement can happen pretty quickly.

I think that the Biden administration might want to consider that as they are continuing to drive a bargain that can't get the deal done. But what it means is that he's holding himself back. He's trying to keep his party intact, and that seems to be more important to him than delivering the political win at home and the policy win abroad.

It's clear that disorder at the border is the best issue Trump has. That's why Trump has told Republicans, don't give in, don't do a deal. Because for Trump, having that disorder is politically helpful.

One would think that Biden might want to take that issue off the table, that he would then have a stronger position to be able to say, take a look at that man behind January 6th. Take a look at that man with all the problems that he's got. Maybe you should care more about that. But Biden's refusal to give anything close to what Americans and Republicans want means that he's giving Trump the win that Trump needs.

As a result, Biden is straddling the border issue. And straddling the border means his reelection is on the fence.

Step into the world of power, loyalty, and luck. I'm going to make him an offer he can't refuse. Where family, cannolis, and spins mean everything. Now, you want to get mixed up in the family business. Introducing The Godfather at ChumbaCasino.com.

Test your luck in the shadowy world of The Godfather Slots. Someday, I will call upon you to do a service for me. Play The Godfather, now at ChumbaCasino.com. Welcome to the family. No purchase necessary. VGW Group. Void where prohibited by law. 18 plus. Terms and conditions apply.

Well, anybody who follows politics seriously knows my next guest. He is the man with the needle, the man with the numbers. He is Nate Cohn, the chief political analyst at The New York Times. Nate, welcome to Beyond the Polls. Thank you for having me. I'm excited to chat.

Well, the first thing I wanted to get into is that for a lot of people who do this work, you can see from their background how you get into it. They did campaigns or they were journalists or they have a polling background and so forth. You have a very unique background, but yet have developed a well-deserved reputation as being one of the best out there. What's your background, and how did you get interested in doing this?

My background has very little to do with what I now do, as you know. I mean, I focused on international relations in college, and then I worked at an international security think tank, mostly on South Asian security issues like nuclear weapons in India and Pakistan. And I, throughout, had always been very interested in elections. In fact, I was interested in elections before I was interested in everything else about politics. I mean, I was the sort of person who printed out back then in school...

every CNN exit poll from 2000 and put them into a large binder and committed most of the findings to memory at the time. So that was always in the background, but you're right. The idea of working for a political campaign never seemed serious to me. I didn't work in polling. But in December 2011, my boss at the time got cancer and I was sort of left without work.

Around the same time, I was applying to graduate schools in international affairs, and I was just stuck with nothing to do, mulling what I ought to do. And there was this Republican primary going on in Iowa, and I decided I had some things to say about the race at the time, and I started writing a blog. I didn't really do very much to promote it other than

put it on Twitter, which was much smaller then, and a few people caught it and retweeted it, and in very short order, I was hired by the New Republic to write for them. It was only five months later. Wow. That's kind of like the election Twitter version of you'll go out there, you know, the Broadway story, you'll go out there, the understudy, and you'll come back a star.

Yeah, I mean, I was helped, I think, by a lot. Like, that was when Nate Silver was sort of in his peak moment, and I think people were looking for more people like him in a way that I don't think is true today. I think that I was also helped by, like, sort of the influx of big tech money into the media at that time. Chris Hughes had just bought the New Republic and they had this extra pot of money heading into the election. But I like to think there was at least a little bit of skill involved, too.

Well, of course there is. And one of the things that sets you apart from somebody like Nate, you know, I respect Nate a lot. I was a fan of his being a fantasy baseball person when all we knew him for, those of us who do this, was for creating the baseball hitting and pitching projections program, PECOTA. Of course.

Yes. We can talk about the Mariners when we're at the end of this too, or whatever baseball team you prefer. Well, I root for America's team, the team that plays in the house that Ruth built and the house that Cashman is tearing down. Well, but maybe we shouldn't talk.

But one of the things that you do that I really like that does set you apart is precinct analysis, that what Nate really did well was show that polling, at least at the aggregate level, averaged and projected, has real predictive power.

much like past baseball stats have predictive power for future baseball stats. But you do, you kind of like bring home the bacon and fry it up in a pan. You do the polling aggregation and analysis and creation, which we'll get to. But you also get into precinct data. And what is it that each of these sources tells you that might be a little bit different so that you want to look at both?

And I'll add one final piece of data that I think is worth being in the conversation, too, and that's the voter file, which has this incredibly rich data on exactly who voted and who didn't in every election. And, you know, we may talk about this in a little bit, but, you know, I think that we're basically building

a harmonized framework for integrating all of this information, voter file data, hard election results by precinct, and polling that's linked to the voter file to generate the most accurate sort of estimates we can of how voters behave in a given election. And we means the New York Times. Sorry.

When you say we are doing this, do you mean the industry or do you mean? I mean, the New York Times. And, you know, I mentioned at the outset that the thing I printed out, you know, in middle school or whatever, was the 2000 election exit polls. For some reason, ever since I first realized that, you know, King County, Washington, where I was from at the time, voted completely differently than New York.

you know, the county just a little bit to the east, Yakima County or something, that there was a completely different group of people voting differently, and that always fascinated me. And I think that is something that the 2008 version of Nate Silver certainly did when he was estimating the results for Democratic primaries, you know, based on the demographic characteristics of each state.

But I think that's always been more central to what interested me in politics. And a lot of what we do, I think, is built around doing that really well.

And is that something where the intersection between precinct results and maybe the demographic information that you can get from the American Community Survey lets you fill out or create a demographic database, so each precinct has appropriate blocks assigned to it? Exactly, yeah. Yeah. And of course...

Cross-index with the voter file. Is that what that really helps you do, so that you can then take the polling results and drive them down to a granular level? Or is it kind of more of a going in both directions, that you learn something from the polls that helps you construct your model, and you learn something from the results that helps you construct your polls?

It's all an integrated framework at this point, really. I mean, first of all, our polls are taken off of the voter file. And so, you know, we draw a sample of registered voters from the voter file and call them. And even before we've drawn telephone numbers off of the voter file, we have appended to the voter file the results for every voter in their precinct. And just to walk through the mechanics of this:

We have like a map of the results in a state, like every precinct. And we know the latitude and longitude of every voter on the voter file based on their address. And we can fit everyone into their precinct to know how their own precinct voted. We also can do the same thing for census block group information, as you just suggested. And so when we conduct a poll, we then learn how this actual human from this particular precinct voted or says they'll vote.
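To make that step concrete, here is a minimal sketch of the point-in-polygon join Cohn describes: placing each geocoded voter inside a precinct boundary so the precinct's past results travel with the voter record. The file names and columns are hypothetical, and the Times' actual pipeline is surely more involved; the same kind of join against census block-group polygons would attach ACS demographics, as Olsen suggests.

```python
# Rough sketch: place each voter (by latitude/longitude) inside a precinct
# polygon, then inherit that precinct's past results. Files and columns
# are hypothetical, purely for illustration.
import geopandas as gpd
import pandas as pd

voters = pd.read_csv("voter_file.csv")  # hypothetical: voter_id, lat, lon, ...
voters_gdf = gpd.GeoDataFrame(
    voters,
    geometry=gpd.points_from_xy(voters["lon"], voters["lat"]),
    crs="EPSG:4326",
)

# Precinct shapefile with past results (hypothetical columns).
precincts = gpd.read_file("precincts.shp").to_crs("EPSG:4326")

# Point-in-polygon join: each voter picks up their precinct's attributes.
voters_with_precinct = gpd.sjoin(
    voters_gdf,
    precincts[["precinct_id", "dem_share_2020", "geometry"]],
    how="left",
    predicate="within",
)
```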

And then it all comes back to the voter file again, where we, you know, I don't know if this is too jargony, but we fit a model of how the people in our Times/Siena polls voted as a function of all this information that we already knew about them from the voter file and that we have on everyone on the voter file.
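And a bare-bones sketch of that fit-on-respondents, score-everyone step, with a logistic regression standing in for whatever model the Times/Siena team actually uses; the feature names here are invented.

```python
# Illustration only: model poll respondents' stated vote choice as a
# function of voter-file covariates, then score every record on the file.
# Column names are hypothetical and the real model is certainly richer.
import pandas as pd
from sklearn.linear_model import LogisticRegression

FEATURES = ["age", "party_reg", "precinct_dem_share_2020", "voted_2022"]

respondents = pd.read_csv("poll_respondents.csv")  # FEATURES plus vote_dem (0/1)
voter_file = pd.read_csv("voter_file.csv")         # FEATURES for every registrant

X_resp = pd.get_dummies(respondents[FEATURES])
model = LogisticRegression(max_iter=1000).fit(X_resp, respondents["vote_dem"])

# Score everyone, aligning dummy columns with the training frame.
X_all = pd.get_dummies(voter_file[FEATURES]).reindex(columns=X_resp.columns, fill_value=0)
voter_file["p_dem"] = model.predict_proba(X_all)[:, 1]
# These per-voter estimates can later be adjusted so precinct-level totals
# match the certified results, as Cohn goes on to describe.
```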

And then we now can use that model to estimate how everyone on the voter file voted, even if we didn't talk to them. And then we can adjust even that number to match the actual result by precinct once we get the final election results in. And not just to have it match the final election result, but also to have it match the election result among the group of people who we learned actually voted in that precinct, once we get the vote history data back. So there's a sort of endless cycle of data integration and analysis that yields very detailed estimates for how

everyone voted in an election. And I think that the insights that we generate from that are sort of all over our coverage of election night and turnout and the quality of our polling and so on, even if we very rarely say how it all is working under the hood. Well, you know, I'll say two things. First of all,

If you were interested in doing this professionally for campaigns, you'd be a very rich man because this sort of big data integration that leads to individualized messaging or movement is the wave of the future, whether it's through digital advertising or through personal contact for the sort of things that various grassroots groups do or so forth.

But what I really like is that's not what you're doing. You're actually trying to understand voters as opposed to move one-tenth of one percent of them in a direction to help somebody who's paying you. So that leads me to the next question, which is,

This sort of thing is really only limited by the N, the number of people that you have in your database. You've been doing this for a number of years. Is this the sort of thing where you can envision having enough of a sample that's done within a reasonable enough timeframe that what people said in 2016 or 2017 probably isn't as predictive as

You know, more recent things. I mean, what sort of N are we talking about that you're able to do this from within a two or three year time frame to run your model, to create and adapt your model?

Well, we have the Times, and the Times/Siena poll is the overwhelming source of the data that we have for understanding the electorate. I mean, we use this other data to refine it, like the precinct results and the census demographics, the voter file and so on. But at its core, you know, high quality polling isn't really replaceable. And I think that the quality of the polling is just as important or even more important than the raw N. And, you know, in a typical election year, we have

in a typical presidential general election year,

You know, I think we can reasonably hope to have at least 10,000 interviews from the heart of the campaign season, when we think that preferences are reasonably settled and suitable, you know, for making inferences about the final results, or for understanding the final results thereafter. But, you know, surveys are super expensive, and they get more expensive all the time. And,

It would be ideal to conduct the same analysis over hundreds of thousands of people.

And there are certainly certain things that we lose because we don't have sample sizes that are as big as that. I mean, you alluded a moment ago to the world of private campaign analytics and partisan firms that do something really very similar to what I just described. But we didn't invent all of this. The campaign world has been doing things similar to what we do for a long time, and before we ever did it.

But those models are powered by far more observations than we have. Now, I do actually think that the quality of our underlying data is better respondent for respondent than a lot of those firms. But in terms of the raw sample size, that's not really comparable in a lot of cases, especially because they can pool across all the campaigns and committees and so on.

So how does this—and one of the things that you're famous for is the creation of the famous election night dial. And for those of you—you know, I am an addict to the Times graphics and to the dial. And—

It's amazingly accurate once you get a reasonable number of precincts. And, of course, you know that, so you don't launch the dial when the first three precincts come in. How do you put the dial together? Because obviously each place that you are looking at has a different set of demographics, a different set of voters, etc.

On election night, on Super Tuesday, you probably don't have a model of American Samoa that has a caucus, but the other 15 states you will, presumably. How do you actually do this, constructing a dial and then keeping it up to date? Because the dial is going to be different in November than it is on Super Tuesday.

So the core concept that underlies, you know, we call it the needle, or at least that's what most people call it. But I like the dial. A dial is probably a more accurate description of the thing. But in any case, you know, the fundamentals of it, I think, are really straightforward, and they're just like what all the election nerds like yourself are doing on Twitter. We are looking at the completed results, you know, where the early voting and the election day vote has all been counted. We

We compare those results in each precinct, county, township, whatever the geographic reporting unit for the election may be, to our pre-election estimates for how we thought that county, township, or precinct would vote. And as we obtain more information, the model makes an inference about how the results differ from that expectation. So in New Hampshire, if we started out believing that Donald Trump would win by 17 points, like the polling average said ahead of time,

And then after 15 townships, Haley is doing five points better than we thought. Then, you know, the needle will move towards being five points better for Haley than that original Trump plus 17 expectation. It's more complicated than that. You know, there's a statistical model that lets you say, OK, it's actually in college educated areas where she's doing well, and we'll revise our estimates there, but not

in the working class areas and so on. But that core thinking is really not very different from what we do ourselves on election night. It is a formalized version of what I think most people do on their own. The actual underlying mechanics of making this happen are far more daunting than the fundamentals. The actual core statistical insights of the model?

They're just a tiny fraction of the whole base of code that sprawls across multiple people and contributors and teams to make all this happen. There's an election results team that's obtaining data from the Associated Press that powers the county maps that you see on the homepage. There's a separate team that's scraping precinct results

from state and county, you know, election websites across the country. There are, as part of that, people who have had to go out and obtain shape files from the counties to learn what these precincts even are. And then there's a process that, you know, is typically my responsibility, which is to generate estimates for all the geographic units that we have and say, okay, we think that in Hanover, Nikki Haley is going to get

80 percent of the vote to 17 for Trump. That was one we talked about a lot because it stood out so much. But we did have to have an estimate for how Hanover was going to vote, right? It's the New Hampshire primary. And then we have a very complicated underlying set of logic for the statistical model that underlies the needle about, okay, is this good data?

How much weight do we assign to those 10 observations versus our priors based on the polling? You know, it's one thing to say Trump should be at 17, but how quickly do you go from Trump plus 17 to Trump plus 11? Is it 10 townships? Is it 15? And there's a bunch of historical data that

we've collected to try and optimize that process. And then there's a simulation that yields the probability, and then that gets outputted. You put it on the internet, and at each of these steps, something can break. And, you know, longtime readers know that there have been times we've had to

pull this off the internet because there are so many moving parts. But, you know, I think that for what it's worth, I'll just say that, like, we've hired a lot of people over the last two years to sort of rebuild this on more stable ground. And I'm really excited about what the product's going to look like in November. I'm excited about what the product's going to look like on Saturday. Are you guys going to have, and you're right, I called it the dial, but it is called the needle. Are you going to have the needle for the South Carolina Democratic primary? We're not. We're not.

I'm sorry to disappoint. For all the reasons I just described, it is a huge heave to put one of these things on the Internet. It is not like our election results, where you flip a switch and then it goes on. And so, until it gets easier, we mostly focus on the stuff where, you know, we think the value pays off. As of today, the South Carolina Republican primary, I think, does meet that criteria. Yeah.
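Returning to the prior-versus-observations weighting Cohn sketched a moment ago, here is a toy version of that core update: start from a pre-election expectation (say, Trump +17 from the polling averages) and shrink it toward what the reporting townships are showing. Every number and the simple normal-normal update are assumptions for illustration, not the Times' actual model.

```python
# Toy version of the needle's core update: a prior margin from the polling
# average, observed deviations from pre-election township estimates, and a
# precision-weighted compromise between the two. Purely illustrative.
import numpy as np

prior_margin = 17.0      # expected GOP margin, in points
prior_sd = 6.0           # uncertainty about that expectation
township_sd = 8.0        # noise in a single township's deviation

# Deviation of each reporting township from its pre-election estimate
# (negative = Trump doing worse than expected). Hypothetical data.
deviations = np.array([-4.0, -6.5, -5.0, -3.5, -6.0])

n = len(deviations)
obs_mean = deviations.mean()
prior_prec = 1 / prior_sd**2
data_prec = n / township_sd**2

# Posterior shift away from the prior margin, and its uncertainty.
post_shift = (data_prec * obs_mean) / (prior_prec + data_prec)
post_sd = (prior_prec + data_prec) ** -0.5

print(f"Updated margin estimate: Trump +{prior_margin + post_shift:.1f} "
      f"(±{post_sd:.1f}), after {n} townships")
```

With more townships reporting, the data term dominates and the estimate moves away from the prior more quickly, which is the "how many townships does it take" question Cohn describes tuning with historical data.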

Yeah, I mean, whatever the result is, good, bad, or line ball for Haley,

As long as she doesn't drop out before Saturday, which she shows no indication of, it's going to be an important race. Whereas, of course, yes, what happens on this Saturday? I have my benchmarks for what I think Biden has to do in order for it not to be an election nerd slash campaign strategist chatter discussion.

But it's not going to affect the outcome. He's not going to lose. He's not going to lose the nomination as long as he's walking, talking and breathing. That's right. So tell me a little bit about the polls. You know, the fact is Siena College had a great reputation for many years. And then suddenly...

You basically locked them up in an exclusive, as far as I can tell. How do you personally interact with their pollster? Do you write the questionnaires? Do you share the questionnaires? Do you help model or pull the sample? What's your personal involvement with the pollster? And how did you come to pick Siena?

Well, I guess in chronological order. I mean, the Times/Siena poll really originated out of the 2012 presidential election. If you think back, you may remember the Obama campaign had this critique of the public polls: that they were too volatile, that their likely voter models were bad, that they weren't using the voter file. And in the end, the Obama

campaign's critique sort of rang true. At the end of the day, what they said was going to happen happened. And I did a lot of, you know, reporting to try and understand the differences between the Obama approach and the typical public poll.

And, you know, after a few years of it, I kind of gradually came to the realization that it was pretty doable, that what they were proposing was not especially challenging, and that some of the things necessary to do it, like obtaining a national voter file, had other uses, even if, you know, we didn't end up going the polling route. And,

you know, after playing a little bit with it, I sort of decided that I thought there was a real chance that we could do something that was distinctive from other pollsters by, you know, relying on the voter file and taking an approach that's much more like what the campaigns do, or what the Obama campaign did in 2012. Siena stood out pretty quickly as being the best option for that, oddly enough. Siena, unlike most public pollsters at the time,

was doing registration-based samples off the voter file. They had a fantastic 2014. I mean, you may remember that in the Katko district they had this poll, like three weeks before the election, with Katko up 20, and all of what little election Twitter existed at the time said that was nonsense. And then Katko really did go on to win by 20. And I just love that they put out a number like that and it was right. And, you know, it wasn't exactly what we wanted to do, but

it was a lot closer than most. And the Times had worked with Siena before, with polling for the New York mayoral race. Internally, back then, the New York Times had a separate polling unit that did the Times/CBS poll. And so there was already an existing relationship there. And I reached out to them and asked them what they thought about this kind of idea. And they were

totally game. I mean, I can't emphasize how sort of rare this is in the world of polling. I don't want to be too critical of the rest of the polling world, but, you know, there are a fair number of very conventional and old-fashioned pollsters out there who are not especially innovative and do things the way that they've always been done. And Siena was really excited about the idea of doing this thing that was a little bit different from what they were already doing. In terms of the actual mechanics, you know, we

draw the sample on our end from the voter file. We do it in a way that I think is pretty complicated and different than other people where we don't just draw a random sample of telephone numbers, but we draw a sample of telephone numbers that's adjusted for whether we think the people will respond to the survey and whether the number of people on the voter file with that number is sufficient for us or whether we need to dial or whether we need to draw a disproportionate number of people

in order to end up with a representative sample. Siena then fields the poll. I should note we've done the preponderance of the work for questionnaire development at the Times as well. And that is also really driven by our colleagues on the politics desk. I mean, they...

They don't write the questionnaires, but in terms of what topics are important and what they want to write about, that's a huge factor for us. And then on the back end, so Siena fields the poll, and on the back end we weight the poll to match the estimates that we have from the voter file on the makeup of the likely... Let me just interrupt. You do the weighting. They do the collecting, but you do the weighting. That's right.
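A bare-bones sketch of the kind of weighting step just described: raking (iterative proportional fitting) so the weighted respondents match voter-file targets on a few dimensions. The categories and target shares below are invented, and the actual Times/Siena procedure is certainly more elaborate.

```python
# Raking (iterative proportional fitting): adjust respondent weights so the
# weighted sample matches voter-file targets on each dimension in turn.
# Categories and targets are invented for illustration.
import pandas as pd

sample = pd.DataFrame({
    "age_group": ["18-34", "65+", "35-64", "18-34", "35-64", "65+"],
    "party":     ["D",     "R",   "D",     "R",     "I",     "D"],
})
sample["weight"] = 1.0

# Hypothetical target shares derived from the voter file.
targets = {
    "age_group": {"18-34": 0.25, "35-64": 0.50, "65+": 0.25},
    "party":     {"D": 0.35, "R": 0.35, "I": 0.30},
}

for _ in range(50):                      # iterate until the margins settle
    for var, shares in targets.items():
        current = sample.groupby(var)["weight"].sum() / sample["weight"].sum()
        factors = {k: shares[k] / current[k] for k in shares}
        sample["weight"] *= sample[var].map(factors)

# Weighted age shares should now be close to the targets.
print(sample.groupby("age_group")["weight"].sum() / sample["weight"].sum())
```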

We also did this really fun thing in 2018 that we called live polling, where we weighted the results after every respondent, in not quite real time, but real enough time that it seemed like it was truly real time. In the first Times/Siena polls, in 2016 and 2017, Siena did the weighting. And once we did live polling, that was on

the New York Times side that had to be integrated into, you know, the code for developing this live-updating product on the Internet. And since then, we've done the weighting as well. Yeah, no, I thought that was totally cool. You could kind of go in and see, oh, it's the Republican woman in this area of rural Iowa. It was so fun.

So you wrote some stuff recently about the real Democratic problems with young and Hispanic and black voters. And of course, there's some overlap, because the young voters are also the least white age cohort.

But they are also very distinct. Can you, for my listeners who may not read everything you write, summarize the argument and then tell us what some of the pressures are that each of these groups of voters faces? You know, what is it that...

Why is a Hispanic voter maybe peeling off and being less likely to vote Democratic than they were six years ago? And what can bring them back? What do your data tell us? So I guess the questions are: can you summarize the argument and tell us what the pressures are that we should be looking out for?

So first, you know, I don't want to be pedantic, but it's not quite just an argument. Like, it is just describing what the Times/Siena data shows. And the Times/Siena data can be wrong. You know, it's been wrong before. But what it shows is that Joe Biden is performing much worse than he did in 2020, as you mentioned, among Black, young, and Latino voters, in some cases by an amount that strains credibility. Like, it's hard to believe that he could really...

lose young voters, for instance, as our most recent national poll in December shows. And, you know, if you aggregate all of our polls together or look at likely voters or something, it's not quite so bad. But, you know, there are really striking shifts among all three of those groups. And, you know, I don't want to say any numbers aloud from memory that will turn out to be wrong, but there are 10 plus point shifts in all of these groups. And as you noted, even more so

among young black voters or young Latino voters. There seems to be a disproportionate shift there. One really fun wrinkle on this is that these losses, they're not just concentrated among young voters or black or Latino voters. They're concentrated among the least engaged of these voters, people without a robust track record of voting in elections.

So, again, at some risk of saying some numbers that aren't entirely accurate from memory, you know, the black voters in our polls who voted in the midterms, they're still returning, you know, 85 plus percent support for Biden, with Trump in a single-digit range. It's the black voters who didn't vote in the midterm

that are producing results that look nothing like what we've seen in a previous election with, you know, Biden in the 60s or even the 50s sometimes among these black voters that didn't participate in the midterm or in primaries or however you want to define for this purpose, the high turnout vote.

To me, that really informs my thinking about what kinds of issues are driving this. It is probably not the sort of issue that political junkies care about. It's the sort of thing that affects even people who don't pay any attention to politics at all.

And there happens to be no shortage of explanations for why voters like these may have soured on Biden. There's the economy. Granted, there are some positive signs on that over the last couple of months, but even now, most voters don't think the economy is doing well. You know, prices increased and so on. There's his age.

It's really hard to know how to think about that problem, especially since everyone thinks Biden is too old, regardless of whether they are actually willing to vote for Trump or not. So you can't untangle whether that's the issue that's hurting him when everyone thinks it. But clearly that could be it.

I would toss, you know, Joe Biden's issues on Palestine and Israel into this conversation, especially over the last few months. It does seem to me that, at least in our October, November and December polls, he is doing worse among younger voters consistently than he did in our polling over the summer or last year. And that's a plausible explanation. And

there's evidence in our data that the young voters who disapprove of Biden on this issue are more likely to be backing Trump, which is really fascinating given that Trump obviously isn't some lefty on the creation of a Palestinian state. And the other thing I'll add to this picture, that we have very equivocal evidence on, and I think we're going to keep pursuing it before I write something big on this.

This may either prove to be a sneak preview of something for your listeners, or it may prove to be completely wrong. But in our most recent poll, we did ask people about media usage. And it was really striking how much of Biden's problem was concentrated among social media users, and in particular people using TikTok. The people who were receiving their news through more traditional media did not show the same sort of tendency to defect and shift towards Trump.

And I'm intrigued by that. I mean, you know, this is completely anecdotal, but for the holidays I was with a lot of voters who are in the demographic category where Biden does pretty well: old white people, of which I have several in my family and beyond. And it really did feel like the, you know, different information environment there

helped explain why I was talking to a group of people that were much more cemented in their 2020 vote choice one way or another (and I am from a politically split family) than the young voters, who I think are not bombarded by the same set of news.

You know, for lack of a better word, there's this talk about the vibes that you may have seen. Maybe that whole thing is worth a conversation in its own right. But I do think that there is a different mood on social media than there is through traditional media. And it would not surprise me if, as we collect more data, it turns out this is an important twist that helps explain some of what's going on here.

Well, one of the things you wrote about recently was the crisis in issue polling as opposed to head-to-head polling. Can you explain what do you mean by issue polling? Why is it in a crisis? And how are you, as you say that you're going to try and take some new approaches in order to address this problem, how are you and the Siena College pollsters working together to try and do that?

So just for background here, I've been a skeptic of issue polling for a long time, and in particular over the last eight years or so, during the Trump era. I mean, I don't know if you have this impression or not, but I just felt,

from the first time I started following polls all the way to the present, that they offered a systematically liberal view of public opinion on the issues. Some political scientists refer to this as operational liberalism, as distinct from symbolic ideology, where, you know, you may identify as a conservative, but on these poll questions, you say that you're, you know, a liberal. And there are a lot of different ways that people have tried to square the differences between

what the issue polls imply about what people view, what people think about, I don't know, a cap and trade bill or background checks for guns or whatever. And their stated conservatism and propensity to vote for Republican candidates, despite whatever it is they said about their views on all these liberal issues. And during the last 10 years or so, I have gradually come to think that like the likeliest explanation here is that these issue polls are not doing their jobs.

The referenda on issues like abortion and cap and trade and gun rights tell a very mixed story about whether the issue polls line up with public opinion. The abortion referenda line up pretty well with what you would expect based on the polls that show abortion rights are very popular.

But on something like background checks, the referenda that get held on background checks have no resemblance to these poll numbers that continuously show 90 plus percent of Americans support expanded background checks. And I just think, you know, the more elections that we have where Republicans successfully campaign on the issues the polls say aren't popular, and the more times that we have referenda in which voters reject the things that the issue polls say are popular, the more we have to speculate that the problem is in the polling.

And I think there are a lot of reasons for that. But, you know, one overarching thing here is that we can't validate issue polls. We do an election poll, we get the actual result, and we have to be like, OK, we underestimated Trump by four points in 2020. We know we were wrong. We have nothing like that for these issue polls. We have no way to know whether they're right or not. So I come to it just with the presumption that things may not be all well.

And then, you know, I've already told you a bunch of things that led me to be skeptical, but we then had a separate issue in the midterm election, I think, where we in the polling community, myself included, wrote based on the polls that the economy was poised to play a bigger role in the election than issues like abortion and democracy.

And then when the final results came in, we had a very different take. We thought that abortion and democracy did a lot to help Democrats. And we could see it on a race-by-race basis, where, you know, in a state or in a district where the Republicans had nominated a MAGA candidate, the Republicans lost, and then right across the state line, where they hadn't, the Democrats lost. And it just seemed to give this clear story that the most important thing about this election we had managed to miss in our pre-election polling.

And so you put it all together, and I gave it the label that it's a crisis. Whether you want to call it a crisis or not, I don't know. But I think that, at the very least, I'm not satisfied that we've successfully told the story of American public opinion on the issues. I think there's a lot of reason to think that we haven't done a good job of it, and that we need to go back to the drawing board on how we do it. And there are a few approaches that we've tried already, like

offering experiments with different kinds of candidates. So rather than just ask people, is democracy important? Give half the sample a Republican who thinks we should overturn the election and give the other half of the sample a Republican who says Biden lost and see which hypothetical Republican does better. And if the hypothetical Republican who says

the election was stolen does five points worse than the hypothetical Republican who says Biden won, then maybe that means democracy was a pretty important issue, regardless of what voters said in the earlier question. I think we've got to think harder about raising the costs of

supporting a policy in a poll. You know, in the real world, status quo bias, for lack of a better word, lowercase-c conservatism, is real. It's hard to vote for a referendum. It's hard to go out and support big change. It's very easy in a poll to say you support something in principle or, you know, aspirationally,

but then not think it's actually a good idea in the real world, divorced from that simple question. I mean, do you support or oppose a home renovation that would improve your kitchen,

to take something I saw on Netflix the other night. I mean, it's very easy to say yes to that question, but how many people actually go and renovate their kitchens after they're asked that? I mean, I can imagine that we do a poll that says 70% of Americans support renovating their kitchens, but like 1% are going to do it this year. But I think that there's something like that that leads to this, you know, we'll call it a liberal bias in these issue polls,

because we don't capture in this sort of setting, like the real reasons why change is hard. I agree with you 110%. And, you know, your native Washington has an example of that, of, you know,

If you take a look at issue polls, you'd believe that Democratic voters are overwhelmingly in favor of action on climate change. But yet two initiatives to try to institute climate change policies failed statewide in Washington, in 2016 and 2018. Yeah. And, you know, part of the reason I think this has maybe been closer to my mind is coming from a state with such a fun record of, you know, initiatives and referenda over the years that it's like,

wait a second, this doesn't line up. And one thing I'll say: I don't want to be too critical of the polling community, which I think actually does a lot of really good work on the issues.

But I have been surprised by how reluctant public pollsters are to think that the referendum and initiative results specifically pose a problem for polling. Many of them frame it as, well, initiatives and referenda have a status quo bias. And to me, it could be quite the exact opposite of that: the polling could be the one with the bias, a bias towards the ease of change. And it's just,

Again, we don't know the truth. There is no way to validate the result of an issue poll like there is our pre-election polling. But I do think that in the absence of validation, these are serious questions that we need to think hard about.
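For readers who want the arithmetic behind the split-sample experiment Cohn describes, here is a small, invented-data sketch: respondents are randomized into two arms that see different hypothetical candidates, and the gap in support between arms estimates the cost of the position. None of these numbers come from actual Times/Siena results.

```python
# Tallying a split-sample experiment: compare support for the hypothetical
# Republican across two randomized arms. Data are simulated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# 1 = would vote for the hypothetical Republican, 0 = would not.
arm_stolen = rng.binomial(1, 0.44, size=500)    # candidate says the election was stolen
arm_concede = rng.binomial(1, 0.49, size=500)   # candidate says Biden won

diff = arm_concede.mean() - arm_stolen.mean()

# Two-proportion z-test on the difference between arms.
pooled = np.concatenate([arm_stolen, arm_concede]).mean()
se = np.sqrt(pooled * (1 - pooled) * (1 / len(arm_stolen) + 1 / len(arm_concede)))
z = diff / se
p_value = 2 * (1 - stats.norm.cdf(abs(z)))

print(f"Estimated penalty for the 'stolen election' position: {diff * 100:.1f} points "
      f"(z = {z:.2f}, p = {p_value:.3f})")
```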

So this kind of leads me to my last question, which is the question that every pollster is asking, which is what would be the effect of a Trump conviction on your vote? And of course, everyone is saying, well, look, it would cost Trump a lot of support. So I kind of have two questions. How much do you believe that? And secondly, why does no one ask the flip of that, which is how would a Trump mistrial

affect your vote? Well, I have a hard time seeing 12-0 for acquittal, but I do not have a hard time seeing 11-1. Oh, me neither. So those are my two questions, which is, how much do you believe these polls, which we're seeing more and more of... Well, we did a version of this ourselves. We asked specifically about imprisonment, which...

I don't think is likely, but I wanted to set the very highest bar just to see. And even then, you know, the proportion of Trump supporters who said they would actually vote for Biden was very low. You know, one thing that I'm looking at in these polls is not just what proportion say. So sorry, let me step back. One thing that's really important to me in the construction of these questions is that they still mimic the actual choice voters will have to make, like make these people say Biden.

these people being Trump supporters, to be clear. Make the Trump voter actually say they're going to vote for Biden. Don't just let them say, oh, it'd be a problem for me or it would be disqualifying or whatever.

It's very easy for 30% of Trump voters to say, oh yeah, that's disqualifying. It will be much harder for them to actually go ahead and vote for Biden. And so one thing that I really liked about our version of this imprisonment question, and again, I don't remember the exact results, was that we didn't just ask how bad is this? We re-asked the ballot. The number of Trump voters who were actually willing to say, oh yeah, I'll vote for Biden was much lower. I think that's a way better way to get at this. And then a conviction was...

without imprisonment, I would think, would be even lower than that. You know, I do think that there is probably a sliver of voters who are going to have a harder time voting for Trump if he's convicted of the crimes he's accused of than they otherwise would, but I think we're talking about a sliver here. I don't think that it's likely that Biden is going to win by 20 or something if that happens. But, you know,

and whenever we're talking about a sliver, this stuff gets so hard. You know, the democracy stuff on the MAGA candidates in 2022 is such a great example of that, right? Like, how were we supposed to have found, before the election, that there was this sliver of voters that was willing to vote for, you know, J.D. Vance for Senate, but not vote for

a Republican House candidate who was at January 6th. It's such a small sliver of people we're talking about. If 3% of the electorate is in this category, that's the difference between a close race and a landslide, right? If 3% go from one side to the other, you go from a tie to six or in the context of the national popular vote for president, maybe it's the difference between Biden plus four and Biden plus 10. It doesn't take much to create huge effects here. And it's so hard

to hone in on that kind of group. So, to actually circle back after my stream-of-consciousness rambling there: you know, I'm pretty agnostic on what the effect is. It's going to be really hard for us to nail it down. It's clear to me in our own polling that very few Trump voters are willing to just go out there and say they'll vote for Biden, which I think would be the clearest test that this is serious. But on the other hand, even a sliver of voters can be decisive, and that's super hard to measure. You know, one of the things that

helped me in 2016 be pretty accurate with my final prediction was picking up in October that the changes in the margin, which most analysts were talking about (oh, Trump down by two, Trump down by seven), all happened because of variations in Trump's support. It was always moving. Trump goes down. People move to undecided. They don't move to Clinton. And then it pops back up again. And I was like, oh,

there are people who really don't want to vote for Trump, but they really don't want to vote for Clinton. And once you saw that, it was like, well, actually, if that's true, then we can know something about the people who remain undecided or who think that they're going to vote for a third-party candidate. I'm really afraid of this: you know, if Trump is convicted, I'm really afraid that

a lot of Republicans who don't want to vote for him are going to move right into the undecided column. Biden is going to open up a big lead. And then on Election Day, they're going to come back to Trump. I think that's entirely plausible. You know, as someone rooting for accurate polling, that's not what I want to see. You know, in 2016, I literally knew lots of people who, if you had talked to them on Friday, they would have said, I'm voting for Gary Johnson.

Yeah, one by one, when it came right down to it. Just like the person who says, I don't want to vote for Trump if he's convicted: many of them will go and do it anyway, because the alternative, as a partisan, is something they can't take. Absolutely. And if you were to make me list the five things that keep me up about the election, that scenario is one of them.

I wish I could. I mean, I wish I had all of our crosstabs. I wish I had printed out my crosstabs just like I did in middle school, so I could flip to the right number. But, you know, I think that when we asked what Republicans would do, Trump fell to the 60s among Trump voters. And I don't think that Biden cleared 10. But you can go and look it up if you want, dear listener.

Well, we could make this the Nate Cohn Show. It's been a wonderful conversation. Aside from subscribing to the New York Times, how can my listeners find your work? That's a good question. So first, I have a newsletter now called The Tilt. I would encourage you all to subscribe to it if you're a subscriber. I'm a little bit not... My social media presence is a little bit...

weaker than it used to be. I am still on Twitter and looking at it regularly at Nate Cohn, and I'm not really looking at anything else. But unfortunately, the quality of my Twitter feed has deteriorated pretty markedly. I just can't deal with all this negativity in my life and people yelling at me in the interactions. You get enough Twitter followers, and people yell at you all the time. I spend less time on there and tweet less than I used to, but

I do sometimes have things to say on Twitter and not really anywhere else. And Cohn is spelled C-O-H-N. You got it. Well, Nate, it's been a fabulous conversation. I'm looking forward to the needle on the night of February 24th, and I'd love to have you back. I would be happy to join.

That's it for this week. Next week, we'll discuss the state of play in the battle for House control with the University of Virginia's Kyle Kondik. Until then, let's reach for the stars together as we journey beyond the polls.

And live the Chumba life.

No purchase necessary. Void where prohibited by law. See terms and conditions 18 plus.