
From Kennedy to Kardashian: An American Survey

2024/2/15

Beyond the Polls with Henry Olsen

People
Henry
Host of Beyond the Polls; a political analyst who examines election results, polling methodology, and political advertising.
Topics
Henry: Democrat Tom Suozzi won the special election in New York's 3rd Congressional District, and turnout and demographics were the key factors. Democrats have an advantage in special elections because many of their supporters see Trump as a threat. Although Democrats performed well in this race, that does not mean they will necessarily win the general election in November, since turnout in a presidential year will be much higher. How independents vote will be crucial to the general election outcome. While this victory is a positive sign for Democrats, they still have a nationwide general election problem to solve; they cannot win on turnout alone and need to do more to persuade voters.

Doug Rivers: YouGov conducts polling with a distinctive online panel method, recruiting large numbers of participants through internet advertising and then selecting representative samples based on demographic and political characteristics, in order to address the low response rates that plague traditional telephone surveys. The method is used worldwide, and its validity is demonstrated by accurately predicting election results. YouGov also uses multilevel regression and post-stratification (MRP) models to forecast elections; these are more accurate than traditional weighting, especially for groups underrepresented in the sample. MRP models use a large number of characteristics to predict how each group will vote and then apply those predictions to the whole population, improving accuracy. YouGov is working with media partners to use MRP to forecast the 2024 U.S. elections, including control of the House and Senate and the final Electoral College result.

Henry: Robert F. Kennedy Jr.'s Super Bowl ad cleverly traded on nostalgia and low production costs, successfully raising his name recognition and earning a large amount of free publicity.


Chapters
The special election in New York's 3rd Congressional District saw a victory for Democrats, but the implications for the general election in November are mixed. The turnout and demographic advantages for Democrats in this special election may not translate to the higher turnout expected in a presidential election year.

Transcript


It's time for today's Lucky Land Horoscope with Victoria Cash. Life's gotten mundane, so shake up the daily routine and be adventurous with a trip to Lucky Land. You know what they say, your chance to win starts with a spin. So go to LuckyLandSlots.com to play over 100 social casino-style games for free for your chance to redeem some serious prizes. Get lucky today at LuckyLandSlots.com. No purchase necessary. VGW Group. Void or prohibited by law. 18 plus. Terms and conditions apply.

Welcome back to Beyond the Polls. This week we'll dive deep into polling arcana with Doug Rivers of YouGov. Mine Tuesday's results in the New York 3 special election for tea leaves for November. And take a look at the political ad that has everybody talking, and you know what I'm talking about. Let's dive in.

Well, if you're a listener to this podcast, you know what happened on Tuesday night. We had a special election in New York's 3rd Congressional District. This is the North Shore of Long Island in Nassau and in Queens. And as you surely know, Tom Suozzi, the Democrat who used to represent this area, is being returned to the House. Yes, George Santos, the fabulous...

is being replaced by Tom Suozzi, who is now getting that surname, the Fabulous, from the Democrats, because they're very happy.

Does this mean anything for November? Does this foretell that Joe Biden is going to win re-election? Does this give super messaging platforms to Democrats? Is this a bad sign for House Republicans? Well, the answer, as usual, is not in the extremes, but somewhere in between. What really happened?

on Tuesday night? Well, the simple answer is that this came down to a matter of turnout and demographics. Turnout was definitely in favor of Democrats, and demographics meant this was always going to be a difficult seat for Republicans to hold.

Democrats have developed an advantage in special elections in the Trump era, and it's not hard to see why. Large, large, large numbers of the Democratic voter base see Trump and anything associated with him as the end of life on this planet as they know it.

That's a slight exaggeration, but anyone who knows progressives knows it's only a slight exaggeration. And what that means is they will turn out anywhere, for anything, to vote to protect their values. So we have a pattern of Democratic overperformance caused by overrepresentation in smaller electorates.

Now, this special election was not low turnout. As of right now, over 170,000 people have voted in this congressional district, but it is a lower turnout. Over 270,000 people voted in the midterm, and well over 300,000 will vote in the congressional election during the presidential year. So what does this mean?

As far as the data show, Democrats had a larger advantage among those voting on Tuesday than they had in 2022. Data posted by UMichVoter on X say that Democrats had roughly an eight-to-nine-point advantage over Republicans in the electorate that cast ballots. Now, compare that to the 2022 electorate, which was Democrats plus four.

That means that Tom Suozzi started, assuming that most registered Democrats vote for the Democratic candidate, with a significant edge. And in fact, his final margin is nearly exactly the registration edge. With some votes left to count, he's ahead by 7.8 points, and the registration edge was a little bit over 8, which suggests that perhaps the independents didn't split 50-50, that they swung a little bit towards the Republican, but not much.

So what does this mean? First of all, it means that most of the reason why Tom Suozzi won, at least half of it, I should say, is the Democratic special election advantage. That's something that's not going to be replicated in November, because if there's one thing we know, it's that low-information, low-propensity voters do show up for a presidential election. As I mentioned, the turnout in this congressional district will be at least double in November.

But the good news for Democrats is that independents apparently did not vote disproportionately Republican. There's no way to explain George Santos' large victory, a roughly eight-point victory, in 2022 without saying that many Democrats and even more independents voted Republican when they often vote Democratic. That didn't happen. Now, that means one of two things. It means either that

Republican-leaning or anti-Democratic-voting independents and Democrats tended to stay home, or it means that Suozzi and the Democratic campaign persuaded many of them to come back. The truth, again, is probably somewhere in between, but anything that suggests there's movement back to the Democrats is good news for them. Again, this is not an earth-shattering

foretelling of Democrats taking control of the House, but this is a pretty decent result. The other thing, though, that you have to keep in mind is that this is a district that Biden won, with a presidential turnout, by 8.1 points. So Tom Suozzi's margin is slightly lower than Joe Biden's margin, with a turnout that tilts towards the Democrats.

That suggests that if we're talking about a presidential-level turnout here, Tom Suozzi and Joe Biden might win this district. Lee Zeldin, the Republican running for governor, carried this district on the basis of massive swings away from the Democrats in 2022.

But he carried it with a much smaller margin. What does that mean for the presidential race? Well, let's think about it. If Biden won this by 8.1, that means he won it by about 3.7 points more than he won the nation by.

If he wins this by only four, the Democratic registration edge in a presidential race, that means it will have swung four points to the Republicans. If that were repeated nationwide, Biden would narrowly carry the popular vote, but he would almost certainly lose the electoral college.
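To spell out that arithmetic, here is a rough sketch. It assumes Biden's 2020 national popular-vote margin was roughly 4.4 points, which is the figure the "about 3.7" comparison above implies:

```latex
% District lean relative to the nation, using the 2020 result:
\text{NY-3 lean} \;\approx\; 8.1 - 4.4 \;\approx\; 3.7 \text{ points more Democratic than the country}

% If the district falls to roughly D+4 in November, the implied national margin is:
4.0 - 3.7 \;\approx\; 0.3 \text{ points}
```

A popular-vote edge that thin is the "narrowly carry the popular vote, likely lose the Electoral College" scenario just described.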

In other words, underneath the good news for Democrats is a bit of bad news, which is to say they may have avoided the shellacking that they took in New York in 2022, and that's good news for them, but they still have an underlying general election problem that they can't deal with by turnout alone. They have to persuade more people to come back.

And that's something that they have months to do, but they're by no means assured of doing it. So on balance, I would say this is a decent night for Democrats, but it is by no means something that should cause Republicans to run scared for the hills and say, woe is me. As with most things in politics, it sets a baseline, but then actions, words, deeds

all interplay. How do you react to it? What does this mean going forward? And what it means going forward is that the Democrats may not have as much of a problem in New York as they did, but they have a problem nationwide that they still have to deal with. And that's something that they can't deal with

by outspending their foes three to one. They can't deal with it by getting the accident of a snowstorm that depresses Election Day turnout. They're going to have to deal with it with months of persuasion effort, and we're going to see whether or not that works in the weeks and months ahead.

It's time for today's Lucky Land Horoscope with Victoria Cash. Life's gotten mundane, so shake up the daily routine and be adventurous with a trip to Lucky Land. You know what they say, your chance to win starts with a spin. So go to LuckyLandSlots.com to play over 100 social casino-style games for free for your chance to redeem some serious prizes. Get lucky today at LuckyLandSlots.com. No purchase necessary. VGW Group. Void or prohibited by law. 18 plus. Terms and conditions apply.

Well, the title of this podcast is Beyond the Polls, and sometimes to go beyond the polls, we have to kind of dig very deep into them to understand what they're telling us and how to integrate that with the other knowledge that we have. And I'm very honored to be joined by one of the industry's pioneers, Doug Rivers, who is the chief scientist at YouGov and also a professor of political science at Stanford University and a senior fellow at the Hoover Institution. Doug, welcome to Beyond the Polls.

Nice to be here.

Well, one of the reasons I've been interested in YouGov for a long time is your distinctive methodology: you use online panels to obtain your survey information and questionnaire responses. And most American firms, even though they've moved away from what was the classic-era gold standard of random-digit, computer-assisted dialing, have not adopted your approach. Can you tell my listeners

about the distinctive online panel methodology that YouGov pioneered?

Sure. So polling, before about the 1970s, was people going around with clipboards and talking to people at their homes or on the street corner. From the 1970s on, political polling, media polling, campaign polling was largely done by telephone.

And that was the era where you called a household and asked to speak to somebody. It's kind of quaint thinking about it. That was the era where my daughter wouldn't know what our telephone number was, much less answer it.

You know, we live in a world today where fewer than half of households have landlines. So starting in the late 1990s, people got interested in using the internet, the newfangled thing, for polling, as a replacement for the telephone.

And there are a number of models for this. I started a company called Knowledge Networks in 1996, in a period when less than a quarter of the households had home internet access.

And we actually paid to give people a web TV unit so they could use the internet on their television. We recruited people using random digit dialing, which is a standard telephone methodology where you randomly generate telephone numbers. That was enormously expensive.

That panel and several like it continue to exist. It's now called the Ipsos Knowledge Panel and is used by, I think, the Associated Press, and there are others. The National Opinion Research Center at the University of Chicago has a panel like this, and there are a few others. They're called probability panels, where they use

probabilistic selection. It's no longer random digit dialing, which doesn't work in the era of cell phones. They actually used mail as the initial way of recruiting people. I tried that and

saw its problems. And so, starting in 2004, I started a company called Polimetrix in the US. There was a company in the UK called YouGov that was doing the same things. And these are opt-in panels. We weren't the first of these, but we've been by far the most successful. And the idea is

to use internet advertising to recruit as many people as you can, build up a panel as large as possible, and then select a sample for an individual survey from that panel of people. And the basic idea is that if you can get enough people and you can accumulate enough information about them,

you can select a representative sample. That's the point of random selection: to get a representative sample. It's not randomness you're after, it's representativeness. And so the question is, can you do that? If you could do real random sampling,

the law of large numbers guarantees that the sample will be representative. But in this day and age, 98% of the people you try to contact aren't willing to be interviewed. So the randomness actually isn't doing much for you. You have to rely on some method of sample selection and weighting that deals with the unwillingness of most people to participate in polls.

So with respect to representativeness, that's the thing: people of a certain age, and I will, as my listeners know, freely fess up that my certain age is of the dinosaur era, you know, which is to say three television networks, Ronald Reagan is president, and most people have landlines.

We were taught that polling had validity because of this representativeness that randomness would produce. The polling industry, of course, has recognized what you've recognized. How does representativeness work? You know, which is to say that as a

common listener, I would listen to this and say, how do you know a priori what will be representative of the sample that you want? Aren't you risking putting your finger on the balance scale by making that determination up front, rather than letting it come to you through randomness?

When response rates were 70%, 50%, even 30%,

the adjustments you needed to make for unrepresentativeness were pretty small, and they were kind of obvious. So, like, if you were calling people on landlines, you tended to reach whoever was at home, and so you got a disproportionately female sample. And so the samples were weighted to be, you know, 51% female, 49% male.

The problem has gotten worse over time. First, there's a big age problem: if you call a landline, there are few people younger than us who are willing to answer it. So there are attempts to fix this by calling cell phone numbers, and, well, I can talk about some of the complications, but

You know, the hope was that if you had a respectable response rate, which is something certainly not in the single digits, demographic weighting would be enough to cure problems around the edges, you know, by age, by gender, by race.
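As a concrete illustration of what that kind of demographic weighting looks like, here is a minimal sketch in Python. Every category, target share, and answer below is invented for the example; none of it reflects any pollster's actual targets.

```python
# Cell weighting: each respondent is weighted by their cell's population share
# divided by that cell's share of the sample. All values are illustrative.
from collections import Counter

sample = [
    {"age": "18-29", "gender": "F", "degree": True,  "vote": "D"},
    {"age": "18-29", "gender": "F", "degree": True,  "vote": "D"},
    {"age": "65+",   "gender": "M", "degree": False, "vote": "R"},
    {"age": "65+",   "gender": "F", "degree": False, "vote": "D"},
]

# Known population share of each age x gender x education cell (e.g., from the Census).
population_share = {
    ("18-29", "F", True):  0.10,
    ("65+",   "M", False): 0.20,
    ("65+",   "F", False): 0.20,
}

def cell_weights(sample, population_share):
    """Weight = (cell's share of the population) / (cell's share of the sample)."""
    n = len(sample)
    counts = Counter((r["age"], r["gender"], r["degree"]) for r in sample)
    return [
        population_share[(r["age"], r["gender"], r["degree"])]
        / (counts[(r["age"], r["gender"], r["degree"])] / n)
        for r in sample
    ]

def weighted_share(sample, weights, choice):
    """The weighted estimate the adjustment is meant to fix (here, share voting `choice`)."""
    total = sum(weights)
    return sum(w for r, w in zip(sample, weights) if r["vote"] == choice) / total

weights = cell_weights(sample, population_share)
print(weighted_share(sample, weights, "D"))
```

The failure mode Rivers gets to later is visible here: when a cell contains only one or two respondents, the population-to-sample ratio blows up and that handful of people ends up standing in for the whole group.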

But what we've observed is first, there's a huge education bias in who wants to take polls, not just telephone polls, online polls have exactly the same problem.

In particular, online polls have a problem in that they're typically self-administered. So you have to read, and 15% of the U.S. population is functionally illiterate. That is, they have difficulty reading and certainly wouldn't choose to voluntarily do something that requires reading for a period of time.

And that didn't used to make that big a difference, because there wasn't a high degree of education polarization. But now college graduates are 20, 25 points more Democratic than non-college degree holders. So if you have that bias, you're going to seriously over-represent high-education voters, who now tend to be disproportionately Democratic.

And that's been a problem in the last few election cycles. We fix it as best we can. That is, we ask people what their education is and weight to the known distribution of education among voters. But

it's not enough. Demographic weighting by itself has led to polls with fairly big biases, by which I mean on the order of five points.

So how is YouGov approaching that? Are you using different weighting

approaches than some of your competitors to try to address this response-rate bias, or super-low-education bias, whatever you want to call it? Are you using a different weighting approach to address that? Some of your competitors are using different sample approaches, which is to say they use text opt-in messages or advertisements on video games or something or other to try to get them in the sample. How is YouGov dealing with this issue?

So it requires a multifaceted approach. So there's not like one thing that fixes this problem because then everyone would be doing it and it wouldn't be a problem. The first thing is we are doing panel-based research. That is, we have people who join a panel and take repeated surveys from us. So that differs from one-shot surveys, which is the typical telephone survey.

The advantage of panel-based research is that you can collect a lot of information on people. And in particular, as your relationship with the panelists improves over time, you're able to get more sensitive information. And so we ask people for their name and address, and we match them to a voter file so we can tell if they're actually registered to vote.

how often they voted in the past. We know the value of their homes. There's a whole slew of information you can get on people, the most valuable of which we believe is your past voting behavior. Did you vote and who did you vote for?

That's information that, if you call someone out of the blue, you don't have. If you call them off a voter file, you can get their past turnout, but you don't know how they voted in past elections. We've been willing to use that information more aggressively than other people. In particular, past vote is a key variable that we use,

because we find demographics alone don't distinguish between Trump voters, MAGA voters, and other types of voters. You know, there's all sorts of stuff you can do after you've selected the sample, but if you have a panel, you consciously select who you're going to interview for a survey.

And so we're not letting who responds determine the composition of the sample. We select systematically people with demographic and political characteristics to match the known distribution in the population. And if someone doesn't respond, we replace them with someone similar to themselves.

So if we are getting a low response rate among MAGA voters, we replace them with other people who've told us they're MAGA voters in the past. So that's an attempt to deal with the underrepresentation of Republican voters that we and other people in the industry have had in past years.
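A rough sketch of that select-and-replace idea, under the assumption that the panel is organized into cells combining demographics with past vote; the cell labels, targets, and response rule below are all hypothetical, not YouGov's actual procedure.

```python
# Invite panelists cell by cell to match known targets; when an invitee doesn't
# respond, substitute another panelist from the same cell so that nonresponse
# doesn't reshape the composition of the completed sample.
import random

panel = [
    {"id": 1, "cell": ("no_degree", "voted_R_2020")},
    {"id": 2, "cell": ("no_degree", "voted_R_2020")},
    {"id": 3, "cell": ("no_degree", "voted_R_2020")},
    {"id": 4, "cell": ("degree",    "voted_D_2020")},
    {"id": 5, "cell": ("degree",    "voted_D_2020")},
]

# Completed interviews wanted per cell, proportional to that cell's share of the electorate.
targets = {("no_degree", "voted_R_2020"): 2, ("degree", "voted_D_2020"): 1}

def draw_sample(panel, targets, responds):
    """Keep inviting within each cell until its target is met or the cell runs dry."""
    by_cell = {}
    for p in panel:
        by_cell.setdefault(p["cell"], []).append(p)
    completed = []
    for cell, need in targets.items():
        pool = list(by_cell.get(cell, []))
        random.shuffle(pool)
        got = 0
        for p in pool:
            if got == need:
                break
            if responds(p):       # the panelist actually completes the survey
                completed.append(p)
                got += 1
    return completed

# Toy run: pretend panelists with an odd id ignore the invitation.
print(draw_sample(panel, targets, responds=lambda p: p["id"] % 2 == 0))
```

The point of substituting within the cell is that nonresponse changes who had to be invited, not the demographic and political mix of the completed sample.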

So how does one get to be in the YouGov panel that you then get solicited to participate in various polls? I mean, can you literally go on YouGov US and say, "I want to take your surveys and get entered that way," or do you solicit people?

It's mostly done via advertising, but you can go to the YouGov website and there's a button you can click on to join the panel. This is considered apostasy by traditional pollsters that the sample is self-selected. And it has risks. That is, we've been...

targeted by the Yang gang and the Tulsi Gabbard supporters and that sort of thing in the past. They would post links to our website and we would get an influx of them. We do have a pretty good idea of who is joining, though, and can spot any anomalies.

We appreciate the Yang gang taking our brand marketing surveys, but we kept them out of our political surveys because it would have way overrepresented the voters for that group.

But, you know, ideally we would be entirely selecting the panelists, the same way that when you randomly select a phone number, you are selecting a person, and you want those people to respond. The problem is they won't respond these days. So, whether we like it or not,

our samples are self-selected, and the people who choose to take a poll are a small slice of the population. So, you know, you have to do something about that. You can't just hope for the best. And I would say that's kind of the key underlying idea of everything that we do to, you know, obtain representative samples.

So one of the things, you know, you've got polls throughout the world, and I'm a bit of a nerd about that, so I follow polls all around the world. And one of the things that I like about your methodology is among the

multinational pollsters, you know, you hold up pretty well. You know, you go into Australia and you're pretty good. You go into Spain and you're pretty good. And, you know, I think Ipsos does a fairly decent job of international polling, but not a lot of these firms do. And that suggests your methodology works regardless of nationality. It's not just an American thing or a British thing or something. It's a methodological thing.

What is your experience? You're an American who founded polling companies in America, but is that a selling point for YouGov, the ability to point to cross-national success and say, hey, this is due to the methodology, not to any particular fluke or understanding of our native population?

Yeah, so we adhere to the same methodology worldwide. You know, it's really important not to make it up and not to put your finger on the scale by thinking that you know what the results should be. What we do is use the same methods. And when there are problems, we try to figure out what the nature of the problem is and come up with a principled solution

that can be provided. But the other thing is, YouGov was started by two Tory pollsters, along with a Labour pollster who joined them, Peter Kellner. So polling is in the company's DNA. We view the ability to get elections right as an indication of quality.

Most market research firms view polling as a distraction and something they would like to stay away from. But, you know, you don't know what the answer is going to be when you put your data out there for the world to see. And so I think it's an important part of transparency and monitoring data quality to show that you can get elections right. If you can't, what makes you think you can get other things right?

Right.

One thing I'm often told is that, yeah, we have to get within a couple of percentage points to get an election right. For market researchers working for companies, whether it's 62% or 59% who like the product, they're looking more at direction and movement than at precision. So if you can get an election right within that small a margin, that makes you highly accurate for the purposes of commercial market research.

Yeah, being off by two points in an election is the difference between a really accurate and a very embarrassing poll. You know, so the standards are higher. Being directionally correct is good enough on a lot of product questions. But

It is amazing how often, when you get data back, the results are implausible. And, you know, sometimes that's due to bad sampling, bad methodology. It's often due to the fact that people tell you what they think you want to hear. You know, so, like, every poll I know of grossly overestimates the turnout rate. If you ask people if they're going to vote, the fraction that tell you they're going to vote is much higher than

the actual fraction that votes. And that's not due entirely, or even mostly, to the samples being too full of voters. It's people thinking they should vote and telling you that.

Yeah, from my experience, the same is true if you ask somebody, are you registered to vote? A certain percentage of people know they are not registered to vote, and they will tell you they are.

Yeah, or maybe I was registered at some point, or I thought about registering.

So one of the things that made YouGov's reputation in England was the 2017 election, when it used

Get ready, listeners, for big words: multi-level regression and post-stratification, in other words MRP, as it's known. And YouGov did MRP modeling, and the national polls were saying it looks like it's going to be a good Tory win. And what they did was project, using the MRP, all 650 British constituencies and said,

You know, this is not in the cards. This could very well be a hung parliament. And of course, that proved to be right. Can you explain how an MRP differs from a normal poll, and how an MRP can be used to understand sub- or small-demographic differences in an electorate that have significant political repercussions?

I'll try, and I'll try not to go on too long. The MRP idea is due to Andrew Gelman, who's a statistician at Columbia.

The basic insight concerns the traditional method pollsters have used to correct their samples, which is weighting. So if your sample has, you know, 40% women and the target is 50%, you weight women by five-fourths and men by four-fifths, or, excuse me, five-sixths.
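Written out, the weight for each group is just the target share divided by the sample share:

```latex
w_{\text{women}} = \frac{0.50}{0.40} = \frac{5}{4} = 1.25,
\qquad
w_{\text{men}} = \frac{0.50}{0.60} = \frac{5}{6} \approx 0.83
```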

And that evens everything out. And weighting works fine for small adjustments. But let me tell you a story, a true story. There was a poll, I believe in 2016.

I'll leave the names out, but they were trying to weight on people who were too young to vote in the previous election.

And in particular, they had one black voter who was 18 to 21 years old. And in a sample of several thousand, there should have been, I don't know, something on the order of maybe 10 black voters, 18 to 21 years old. So traditional weighting says weight that person up by a factor of 10.

Well, it turned out this voter was a black Republican. And so they ended up with too many Republicans in their sample. This one person, being weighted up by a large amount, made their sample too Republican, which actually didn't hurt them in that election. But

the problem is not that we don't know the number of 18-to-21-year-old blacks in the population. The problem is that you had one person and a poor estimate of what the voting rates would be of blacks aged 18 to 21. And in general, the problem with weighting is that you know the fraction that a demographic group is of the population.

But you have too few people in the sample in that group, and then you weight them up. And because you're basing your estimate on a small number of people in a group that's underrepresented in the sample, you have poor estimates. And Gelman's insight, which is interesting and

based on a whole lot of things that have been developed over the years in the survey and statistics world, is this: what you should do is not take a small group of people and base an estimate of how the group votes on them.

If you have too few people in the sample, what you should use is a statistical model that uses the characteristics not just of 18 to 21-year-old black males living in some state, but how blacks vote generally and how young people vote generally. And a statistical model can give you a pretty good estimate that uses people who aren't exactly in that category.

And that's what's called a multi-level regression model. And it's related to things that are done in machine learning and artificial intelligence, but essentially using a large number of characteristics to predict how

groups that would be small in the sample, or even an individual on the voter file, would vote, and to then predict for everyone in the population how they would vote based on a large number of characteristics, and then aggregate that up. So instead of weighting the sample, what you're doing is using the sample to fit a model that predicts how people would vote and then applying that to the overall population.
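Here is a compressed sketch of that fit-then-aggregate step in Python. For clarity it swaps in a plain regularized logistic regression where real MRP would fit a hierarchical (multilevel) model with partial pooling across groups; the features, cells, and counts are all invented.

```python
# Fit a model on survey respondents, predict each population cell, then take a
# count-weighted average over the post-stratification frame.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import OneHotEncoder

# Survey respondents: categorical features plus the outcome we want to model.
X_survey = np.array([
    ["18-29", "Black", "no_degree", "NY"],
    ["65+",   "White", "degree",    "PA"],
    # ... a few thousand rows in practice ...
])
y_survey = np.array([1, 0])  # 1 = votes Democratic, 0 = votes Republican

enc = OneHotEncoder(handle_unknown="ignore")
model = LogisticRegression(C=1.0, max_iter=1000)
model.fit(enc.fit_transform(X_survey), y_survey)

# Post-stratification frame: one row per demographic cell in the population,
# with a count of how many people (or expected voters) fall in that cell.
cells = np.array([
    ["18-29", "Black", "no_degree", "NY"],
    ["18-29", "White", "degree",    "NY"],
    # ... one row per cell, e.g. built from Census data ...
])
cell_counts = np.array([120_000, 95_000])

# Predict each cell's Democratic share, then take the count-weighted average.
p_dem = model.predict_proba(enc.transform(cells))[:, 1]
estimate = np.average(p_dem, weights=cell_counts)
print(f"Projected Democratic share: {estimate:.3f}")
```

The multilevel part matters because it lets a sparse cell, like the lone 18-to-21-year-old black Republican in the story above, borrow strength from related groups instead of being weighted up on its own.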

And that method has been very successful. You know, it's able to produce estimates that are good at the constituency level, even though we only have, you know, 20, 30, 40 people in a parliamentary constituency or in a U.S. congressional district. It's hard, even with a very big sample, to get more than 100 people, you know.

So those are not going to be good samples. They're going to be skewed. And the question is, can you use modern statistical methods to make better predictions about how people are going to vote? So that's the rough idea.

Your MRP of the Spanish election did better than standard statistical estimates at translating votes to seats. It

called a much closer election for Pedro Sanchez than most Spanish pollsters did. And in fact, Pedro Sanchez

got re-elected as president, as they call it, although we would say prime minister. I'm surprised more people aren't using this method, because we describe exactly what we do. We put out detailed descriptions that other people could copy. ABC News

used this method before the 2016 election, I think, and didn't publish the results. And afterwards, they published a paper saying we did this and it gave us a much better result in our polls. But they continue to use the same methods they've used in the past on their individual polls.

It baffles me why this is not just universally used as a method for election prediction.

So when I see the Economist/YouGov or the CBS/YouGov polls, are those

traditional polls because your media partners ask for that? Or are you integrating MRP methodology into those polls as well?

The individual weekly polls for CBS and The Economist are conventional polls, and I'll explain why in a second. But CBS does a model where they do Electoral College predictions

using MRP, and we help them with that. So the problem with a weekly poll is you've got a couple thousand people and you've got 50 questions in the poll that people want results on. And MRP works well when you are making a prediction of, like, one variable: how people vote. The reluctance has been,

you know, I've got multiple things I want to predict; do I have to develop a model for each? There are newer methods. There's a thing called matrix completion, which you may hear about in the future. It sounds kind of cutting-edge, and it enables you to make predictions for a larger set of variables. So we're, you know,

looking at ways to bring that across everything that we do. But, you know, the weekly polls are much more standard. We interview 1,500 people for The Economist every week, and we report the results on, you know, the 60 to 80 questions that we ask those people.
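For the curious, here is a toy version of the matrix-completion idea Rivers mentions: treat the respondents-by-questions answers as a matrix with missing entries and fill the gaps with a low-rank approximation. This is a generic iterative-SVD sketch, not a description of YouGov's actual method, and all the numbers are made up.

```python
# Fill missing survey answers (np.nan) by repeatedly projecting the matrix onto
# a low-rank SVD, keeping the observed entries fixed each iteration.
import numpy as np

def complete_matrix(M, rank=2, iters=100):
    """Fill missing entries of M with a rank-`rank` approximation."""
    mask = ~np.isnan(M)
    col_means = np.nanmean(M, axis=0)
    X = np.where(mask, M, col_means)        # start from column means
    for _ in range(iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        low_rank = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        X = np.where(mask, M, low_rank)     # keep observed answers, update the gaps
    return X

# 4 respondents, 3 questions (say: presidential vote, Senate vote, an issue item),
# coded numerically; NaN marks the answers we never observed.
responses = np.array([
    [1.0, 1.0, np.nan],
    [0.0, np.nan, 0.0],
    [1.0, np.nan, 1.0],
    [np.nan, 0.0, 0.0],
])
print(complete_matrix(responses, rank=1).round(2))
```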

So do you either independently or in conjunction with one of your media partners intend to do an MRP for control of the House or control of the Senate in the upcoming United States elections?

So we have two big projects going at the moment. One is our work with CBS, where they make the analytic decisions. We have another, which is called SAY24: Stanford, Arizona State, and Yale. It's a study where we

did a baseline interview with 112,000 people in December, and we're re-interviewing subsets of that group every week from now to January of 2025. It's an academic study, but we're going to be releasing regular updates and doing a model of every House race, every Senate race, and every

Electoral College result.

And where will that be published? If it's a tri-college project, will it be

on the colleges' websites? Or will it be on YouGov's website? I mean, where would somebody who is interested go?

Yeah, so if you watch the YouGov website, you'll see releases from it. We just did something showing where the DeSantis and

Christie and Ramaswamy voters went after they withdrew, because it's a unique design where, since it's re-interviews of everyone, we know what they told us

a month ago or three months ago. And so we can trace movements of people. And the idea is that we're expecting some events to occur. I hear there's some trials and so forth. And so we want to see how people are responding to these events this year.

We are working with some media partners and the universities. We're going to do some conferences and so forth where we share the results. But most academics are more interested in things that take years to get published. So the results will trickle out.

Well, Doug, where would my listeners who are not academics awaiting years' worth of publishable materials follow YouGov's work across its various platforms? I'm just going to say for the United States, because you are an international pollster: where would they follow YouGov's work in the United States?

So if you go to YouGov.com, you can type in what you're interested in, and we will share with you all the data we have on that topic.

We post our results, and we're committed to transparency. So we post full crosstabs on our polls, so you can see how subgroups answer each question. You can see what our weighting targets are and who's in the sample and the kind of people we're talking to.

So that's the best spot to see what we're doing. But if you watch CBS on Sunday mornings, they usually have a release of a new poll of ours on the 2024 election.

Well, Doug, I really appreciate your insight. I appreciate the time. And I hope to have you back on Beyond the Polls.

Thanks, Henry. I really enjoyed it.

Hey there, it is Ryan Seacrest with you. Do you want to make this summer unforgettable? Join me at Chumba Casino. It's this summer's hottest online destination. They are rolling out the red carpet with an amazing welcome offer just for you. So don't wait. Dive in now and play hundreds of social casino games for free.

your chance to redeem real prizes is just a spin away. Care to join me? Sponsored by Chumba Casino. No purchase necessary. VGW Group. Void where prohibited by law. 18 plus. Terms and conditions apply.

Well, usually on the ad of the week, I conjure up some obscure ad from some place that you may or may not be following or even aware there's a race, and tell you what group of voters you may or may not have heard of are being targeted with this ad. Well, that's not the case this week. We're talking about a presidential ad. We're

talking about a presidential ad that aired nationwide, a rarity these days. We're talking about one that you probably saw, and if not, you've heard about. Yes, I'm talking about the Robert F. Kennedy Jr. Super Bowl ad. Let's listen. ♪

Do you want a man for president who's seasoned through and through? A man who's old enough to know and young enough to do? Well, it's up to you, it's up to you, it's strictly up to you. American Values 2024 is responsible for the contents of this advertisement.

Now, the first thing that stands out about this ad is actually not the words. Kennedy, Kennedy, Kennedy, Kennedy. That is nice for name ID, and that's what this primarily is. It's an ad that is telling people who may not be politically

aware, the sort of person who tunes into the Super Bowl and will vote in November but does not follow the ins and outs of the campaigns, that there's this guy named Kennedy running and that he's a good guy. But what stands out is the visuals. The visuals are that they took a 1960 John F. Kennedy ad, superimposed Robert F. Kennedy Jr.'s picture, and made only a few slight changes to the words. So what does this do?

Why is this an interesting ad? Well, first of all, it's an interesting ad because the sheer low production quality of the ad garners attention. We're used to the slickest ads imaginable. We have ads that cost as much as movies to make showing up on the Super Bowl. And the fact that you counter-programmed that with an old-fashioned, low-production-value ad means people watched it.

Watching is the first thing. Getting eyeballs to pay attention is the first thing. Usually these days people do it by ratcheting things up more and more or having something unusual. Like, is that really Tom Brady? Why is Beyonce wearing a hat? You know, that sort of thing that we saw in the Super Bowl. This achieves the same effect.

for virtually no money. And as a result, it got noticed. The fact that we're talking about Kennedy, set against this happy, peppy music, again, this is name ID. It may strike listeners of this podcast as unusual, but there are still lots of people who are going to vote who don't know Robert F. Kennedy Jr. is running for president, haven't heard the name before, or haven't heard it very much, and suddenly they're aware that there's this Kennedy who's running.

The other thing that is good about this ad is that, because it was so out of the box and unexpected, it's getting millions of dollars' worth of free advertising. I'm talking about it. You followed it. It got a few days' worth of commentary from people who talk about Super Bowl ads. It got talked about in the political press. It probably got talked about in certain newsrooms,

normal non-political outlets. In other words, by spending this money, they got an add-on, a follow-on effect that increases the reach of the ad in a way that most of those Super Bowl ads never got. That's exactly what you want as a political candidate, particularly when you're talking about establishing awareness that you are there and that you are running, rather than simply trying to make an argument or message.

The final thing that makes this an interesting ad is how focused it is, because of its old-fashioned production values, on nostalgia. We're talking about an ad that, for anybody who is 60 years old or over, would conjure up some memory of the Kennedys.

The Kennedys once were the royal family of American politics. There's the mystique of Robert F. Kennedy Jr.'s father, Bobby Kennedy, who was assassinated on the cusp of the nomination after winning the California primary. He was almost certain to become the Democratic presidential nominee. He was murdered

and has a martyr status. John F. Kennedy, cut down in the prime of life. For decades thereafter, people related to the family, Caroline Kennedy, Joseph Kennedy, other Kennedys, would run and win. Think back to when Arnold Schwarzenegger, who starred in his own non-political Super Bowl ad, was running for governor.

He was married to a woman named Maria Shriver. Maria Shriver is part of the Shrivers. The Shrivers are cousins of the Kennedys. It was part of the Kennedy mystique, and that was mentioned when he was running for governor. Even in the early aughts, the Kennedy mystique, 40 years after John, 35 years after Bobby, was still something. If you are older, you are highly likely to vote. You're highly likely to be watching television.

And you're also pretty likely to have some sort of positive

vibe about that era and about these people. And this played right into that. It's not going to make Bobby Kennedy president, but it does mean that, once again, at a time when, since he doesn't have primaries, he's not getting talked about in the political press, for at least 72 hours Bobby Kennedy was on the lips of political commentators and of average people asking what the heck they just saw. And that's why this is the ad of the week.

That's it for this week. Join me next week for an expert's look at the all-important South Carolina Republican primary with the Palmetto State's leading political journalist, Skylar Crump. Until then, let's reach for the stars together as we journey beyond the polls.

It is Ryan here, and I have a question for you. What do you do when you win? Like, are you a fist pumper, a woohooer, a hand clapper, a high fiver? If you want to hone in on those winning moves, check out Chumba Casino. Choose from hundreds of social casino-style games for your chance to redeem serious cash prizes. There are new game releases weekly, plus free daily bonuses. So don't wait. Start having the most fun ever.

ever at ChumbaCasino.com. Sponsored by Chumba Casino. No purchase necessary. VGW Group. Void where prohibited by law. 18 plus. Terms and conditions apply.