This is a CBC podcast. Hi, I'm Dr. Brian Goldman. Welcome to The Dose. Earlier this month, the Canadian Medical Association published its 2025 Health and Media Annual Tracking Survey. Among the findings, a significant increase in Canadians having encounters with health misinformation.
and a direct link between misinformation and negative health outcomes, especially in those who rely heavily on social media for news, and we know that's on the rise. So this week we're asking, how can I spot and deal with health and science misinformation?
Hi, Tim. Welcome to The Dose. Thanks for having me on, Brian. So why should we believe what you're about to tell us? I think you should always believe the process, believe the nature of evidence that is being utilized.
more than leaning into perhaps the person that is providing the information. Having said that, you know, when you find trusted voices, I think that's a good thing. In my new book, I actually joke about the paradox of me relying on evidence and saying things in this world of spin and misinformation. I think it's really important to always kind of
pause and ask, where is that evidence coming from? That's a great answer. And I am going to shout out my friend and colleague, Dr. Ken Milne, who's the host of his own podcast called The Skeptic's Guide to Emergency Medicine. And his tagline every week is always be skeptical. How's that for a good piece of advice?
It's a great piece of advice. You want to be skeptical, Brian, but you don't want to be cynical, right? You don't want to think every bit of evidence is to be questioned. On the contrary, know that there's good evidence out there. Embrace it. That's the stuff that you should turn to. And that's the stuff you should share.
But first, we're going to talk about misinformation. And before we begin, can you give us a "hi, my name is" and tell us what you do and where you do it? Well, my name is Timothy Caulfield, and I'm a professor in the Faculty of Law and the School of Public Health at the University of Alberta. And really, for decades, I've done interdisciplinary research on how science and health are represented in the public sphere.
Can't wait to dig into the conversation. We're going to talk about red flags. Let's start with the definition first. What's the difference between misinformation and disinformation? I use the umbrella term misinformation, but it is important to recognize that there really is this kind of misinformation continuum. On one end, you have disinformation, and disinformation is when the person providing it knows that
It's a lie. Knows that it's spin and it's being provided to satisfy a particular agenda, build a brand, perhaps sell products. You know, state actors do this, right? And we see it happening in the political sphere all the time. Move along that continuum and maybe you have wellness influencers. And you're kind of not sure, you know, do they believe this nonsense or do they not? Maybe they bought into it, but the information is still wrong, and the intentions behind putting it out there are mixed up.
and then move to the other end of the spectrum along that continuum. And you have individuals that are just trying to do what's best for themselves, maybe for their family and community, and they don't realize it's misinformation, but they're spreading it anyway. Brian, all of it does harm, but I think the tools that we use and maybe the policies that we bring to bear differ depending on where we fall on that continuum. What are some of the top topics today
driving the most misinformation that you're seeing? You know, I think about RFK Jr. saying that the HPV, human papillomavirus, vaccine is accounting for higher rates of cancer when it's clearly shown to do the opposite. It's driving rates of cervical cancer toward zero. What are some of the ones that you're seeing? People are probably sick of this. It is vaccines. It is vaccination. The amount of misinformation surrounding this topic
is absolutely astounding. And Brian, as you know, I've been following this for a very long time. There's always been misinformation there. You know, there's always been anti-vax communities. But now it's at an all new level and it's been very much politicized. This has become part of political rhetoric. Again, politics have always been there, right, Brian? But now it's at the fore and I've never seen anything like this. Being anti-vax is now a political
flag, right? It's a representation of where you reside on the political continuum. I think we can't underplay the degree to which this is a public health issue. But I also think we can't forget the stuff that sounds absurd, right? It still matters to debunk that because I think one of the reasons we are where we are today is because there was this tolerance for pseudoscience for a long time, right? Even the absurd stuff like drinking raw water,
drinking raw milk. I think it's really important to talk about what the science actually says on that, not just laugh it off, but really point out the degree to which this is problematic and doesn't accord with the good science. And here's another one from the CMA survey, a healthy lifestyle alone can cure cancer.
That was one of the top ones. Bleach cures autism is another one that I've seen. Ivermectin, of course, early in the pandemic, but now it seems to have made a comeback. There's a lot of chatter on social media about ivermectin these days. And I also think that the ivermectin story has now also become political. Again, there have been good studies that have demonstrated you can almost guess someone's political affiliation by their opinion on ivermectin. And it really has never been like that before. There was a fascinating study done in the United States where they surveyed
both physicians and the general public about the polarization around ivermectin and other unproven COVID treatments. And with that polarization, the strongest predictor of where people sat on that spectrum was what cable news show you watched. Think of what that study says. If you are a patient in the United States and you go to a doctor, your doctor, and the study actually says this, your doctor's opinion on unproven COVID treatments is more likely to be correlated with whether they watch Fox or CNN than with the scientific literature, which is incredible. I look at those claims and I laugh out loud, but a lot of people believe them.
So what's in the pitch that makes them so believable? When individuals, when patients listen to this, I think we need to try to be empathetic. And I have to remind myself to do this. These individuals are trying to do what's best for themselves. They're looking for answers. And I think that our anger should be more focused on those pushing the misinformation than those necessarily receiving it.
I think a lot of it does have to do with community, Brian. There's a lot of really interesting research on this. So in other words, people are living in echo chambers. They're receiving information from communities. And when you're part of that community, and we all do this, I'm not pointing fingers, we all do this, right, believing stuff that your community believes becomes easier. So it's often framed that way.
The other thing that's happening now, I think, at an all-time level is the creation of distrust. And the Canadian Medical Association survey showed this, right? Yes, people still trust doctors and nurses and scientists in general, but the trust is decreasing, distrust is increasing. And I think it's important to remember that distrust is largely generated by the spread of misinformation. We're doing research in this space online, our team at the University of Alberta, and we see purveyors of misinformation deliberately creating that distrust. And the reason they do that is to create room for the different narratives.
The ivermectin story, I think, is a really good example of that. You can't trust conventional medicine. You can't trust the health care system. You can't trust the CDC. They're hiding something from you. Try ivermectin. Despite what all the evidence from clinical trials published in reputable biomedical journals says, don't trust that. Trust me. I'm part of your community and we can't trust these other voices. Who are some of the loudest spreaders of misinformation in Canada?
I hesitate to use specific names. I also don't think we should underplay the role of the rhetoric in the United States. People like RFK Jr. and others are having an impact on the Canadian discourse. I think what you see in Canada, unlike in the United States, is politicians embracing some of this misinformation, but it's more implied. It's less likely to be explicit. But that sort of permission that is given by those politicians to believe misinformation, I think that's having...
a big impact in Canada. But also social media influencers, you know, many of them also from the United States, people like Joe Rogan, are incredibly powerful. And there are other wellness influencers in this space that are really sort of directing the conversation for a large portion of the population. We live in an era where Joe Rogan, and this is gonna sound like hyperbole, but I think he really is one of the most powerful voices in the health space. How did we get here? A discussion between Mel Gibson and Joe Rogan about the untrue benefits of ivermectin in the context of cancer would result in a massive spike all over the world in people searching the word ivermectin, and in the Canadian cancer community having to respond. It's a really horrifying situation.
And there's actually evidence that Canadians are turning to American purveyors of health misinformation? I think so. And I think that evidence is that the patterns are the same in Canada as we're seeing in the United States. So those patterns are, you know, who's believing this? You know, what is the political side of this story? And also when you look at other countries, in fact, I'm at an event right now in the United States, I talked to individuals from the UK. The story there is more complicated. It isn't so
perfectly polarized between right and left, as we see here in Canada. And Brian, look, I'm using the word politics a lot here. I want to be really careful because I don't want to come across as totally partisan. And you can back me up on this.
Historically, health misinformation has come from across the political spectrum. And in the wellness space, something that you and I have followed for years, a lot of the misinformation really did come kind of from the left. You know, it was kind of new agey counterculture. And now we've seen a shift. So for most of my career, I was critiquing those voices. Right. And now we're seeing it almost entirely, almost entirely, not completely, but almost entirely on the right.
So let's dig into some of the red flags. What are some of the more obvious signs that something is a piece of misinformation? Let's go back to how we started this conversation, right? Always ask yourself what kind of evidence is being used. And if it's only anecdote, if it's only a testimonial,
That should give you pause. And our team has done research on this. We know that testimonials, anecdotes drive the sale of unproven therapies. Whether you're talking about supplements or unproven stem cell therapies, it's testimonials, right? And I get that. Research tells us that a powerful anecdote, a powerful testimonial causes us to kind of shut down scientifically. You know, we're hardwired to listen to stories and to narratives.
And so that should be something that, right out of the gate, you should be wary of. Is this only a testimonial? Is this only an anecdote? Always ask yourself, is there a body of evidence supporting this claim? Then there may be, as you've suggested, the conspiratorial notion that this is what the pharmaceutical industry or this is what doctors don't want you to know about a cure, that you're being invited to receive special knowledge. For sure. I mean, that's such a common ploy. You probably see it all the time too, right?
You know, I've got the answer and the biomedical infrastructure is keeping this from you. And I find that frustrating on two fronts. First of all, if there was an effective treatment, I promise you would know about it, right? You would know about it. Look, scientists can barely coordinate to write a research grant, let alone, you know, keep this massive conspiracy over decades. And the other thing I think is really important to highlight, and again, this is something our team has researched for a long time,
Medicine is hard. Science is hard. It moves forward iteratively. I often start my class every semester asking the students to name 10 genuine biomedical breakthroughs that have revolutionized health care. It's not easy, Brian. The list is not that long. And so that should always remind you when someone is promising some miracle cure today.
Be skeptical, right? Be skeptical. Always ask yourself, what are they selling? What's the goal here? Let's talk about health and science content that's generated by AI, artificial intelligence. What are some of the tells that a piece of information that you're getting on social media, video, audio has been generated by AI? You know, unfortunately, Brian, it's getting more difficult. In the past, you know, you could look carefully at an image and you could see things that looked unnatural or hyperreal. I think people know what I mean by that look. But AI is getting so sophisticated.
It is getting much more difficult just on the face of it to discern whether it was AI-generated or not. The reason we need to recognize that is it means you need to fact-check. You need to go deeper. You just can't use your own AI-detecting skills in order to decide if this was created by AI. There's really interesting research that tells us that people think they can detect AI, Brian, but we can't. It's kind of like a Dunning-Kruger effect that impacts all of us.
Given the existence of AI, I think it should be a reminder for everyone to pause and always dig deeper if they think they might be being fooled. Recently, news media, including us at the CBC, got caught by a study on black plastic with faulty arithmetic that was published in a peer-reviewed journal. The math was off by an order of magnitude. The study authors stand by the fact that brominated flame retardants are present in black plastic cookware. That error got caught, which suggests some accountability.
How often does misinformation get caught and corrected, or not caught and corrected? It often doesn't get caught, right? And there are subtle forms of it. We can put this on a continuum too, right? Sometimes it's out-and-out misinformation that doesn't get caught. Sometimes it's like this, you know, where there is an error in a study and it's the headline about the risk that spreads, because the negativity bias always wins, right? Brian, you know that scary story about how this stuff is bad is what's going to dominate. And even though it's corrected, the lie lives on. It's a zombie story, right? It's impossible
to kill. So I think that's problematic. And sometimes it's just hype about science and the hype lives on, even though the research we know is more complicated, more iterative, and the effect sizes are much smaller than originally predicted. So all of those kinds of myths, lies, misrepresentations can live on and be very difficult to kill.
Other podcast hosts, not us, will advertise supplements or miracle drinks and reference supporting evidence like expert testimonials or even published research. How do you identify the red flags in those cases? Well, first of all, the selling of supplements is so pervasive in the misinformation sphere today.
That should be a red flag. That should be a red flag. My friend and colleague, Dr. Jen Gunter, has suggested that it's gotten so bad in the context of supplements that the supplement industry is almost like funding the misinformation universe. So many purveyors of misinformation make their money
selling supplements. So it should be a red flag. And Brian, I want to be really careful here. If you have a clinically identified deficiency and your family physician talks to you about supplements, that's a different story than the massive supplement industry that's promising all these miracle cures. So that's one thing to look for. And the other thing, again, we talked about it, ask what kind of study is being used. Always look to the body of evidence
on a particular topic. Never fall for what's been called the single study syndrome, right? It's never going to be just one study. Humans are complicated, right? Health is complicated. You need a body of evidence in general to support, you know, good clinical care. You know this better than I do. So never fall for the single study syndrome, which you often see pop up
on podcasts, and podcasters can be very seductive in how they present it: here's this really fascinating study that says X, Y, Z. Is it an animal study? Is it just one small study? Does it run counter to a larger body of evidence that suggests otherwise? These are all questions that you can ask yourself, and always go in, as our friend Ken says, with a degree of skepticism, knowing that
Biomedical science is hard, and in general, we move forward iteratively. I want to move to talking about accountability. Meta recently announced plans to adopt something called a community notes model of fact-checking. On X, formerly Twitter, users are able to leave what are called community notes, comments that provide additional details or clarification about information contained in posts.
Why are you concerned about that? So I am very concerned about it. I'm concerned that the move was taken, right? That this policy shift has happened. I think that's problematic. Look, everyone who does research on fact-checking, and there is a growing, wonderful community of excellent scholars, recognizes that fact-checking isn't perfect, that we're still trying to do it better, that there are unintended consequences often associated with fact-checking of various kinds.
But it does, in the aggregate, help. It moves the needle.
Community notes, some studies suggest, can be helpful also. It's one other tool, right? And there is concern that community notes can be manipulated by crowds with particular agendas. Let's just start with the premise that maybe it could be one potential tool. But the spread of misinformation is going to require a multi-pronged approach. We have to come at it from every direction. And if you take away one of those tools,
it's problematic. And I'm more concerned. So I'm concerned that they made the move, but I'm more concerned about the rhetoric that was used to justify it by people like Zuckerberg. You know, this idea that there is this massive bias in fact-checking. And studies have shown that that bias exists, in other words, that conservatives have their content fact-checked more than liberals in the United States. But that bias exists because, right now, in this cultural moment, and I'm not being partisan, the evidence tells us that more misinformation is emanating from that corner of the ideological spectrum. So, of course, there's going to be more fact-checking there. In addition to that, fact-checking, contrary to what Zuckerberg and others have said, is not an assault on freedom of expression.
On the contrary, it happens within the marketplace of ideas. Those who believe in freedom of expression should be embracing fact-checking because it is about putting good content in our information environment and letting people learn and make their own decisions. Most misinformation tools do happen within the marketplace of ideas. It's not about censorship. It's not about cancelling and silencing. It is about
supporting freedom of expression and making sure the good content exists.
You know, it's interesting that there's almost universal approval for having referees in NFL football games, but we seem to be trying to get rid of the referees when it comes to misinformation and information. It's just a weird double standard that people hold. Now, a lot of people get their information from Wikipedia. I want you to comment on the reliability and accountability of the information that's on that platform.
Well, Wikipedia is another fascinating story, isn't it? The history of it has kind of had ups and downs. Wikipedia, I think, is getting better. It's getting more and more transparent. It's not perfect, and I think people need to be careful not to rely solely on it. But it is an example of aggregated information, right, being constantly updated and improved and checked.
As you probably know, years ago, it was more heavily criticized. But I think anytime there's aggregation of information, you want it to be done responsibly and in a way that's transparent and in a way that allows for, you know, sort of real-time updating as the evidence emerges. And by the way, that's a good segue to another really important point. And I know you agree with this, but I'm going to say it anyway.
It shouldn't be viewed as shameful to change your mind. It shouldn't be viewed as a badge of dishonor. On the contrary, it should be viewed as a badge of honor. When the evidence changes and when science evolves, you should change your mind, right? And unfortunately, we live in this era where flip-flopping is viewed as a weakness, where once you adopt a perspective you're supposed to hold on to it and never change your mind.
But now we live in this era where changing your mind based on evidence is viewed as proof that science was wrong and you were wrong. On the contrary, that's science working, and the individual that changed their mind should be praised, not damned. The most pertinent example that immediately pops into my mind was what the scientific community was initially saying about COVID, that it was spread by droplets. And later on, airborne transmission became more obvious in the face of the science, as did the utility, the value, of using masks.
Last question I'm going to ask you. We've been talking about what we can do to spot misinformation. You've given us a lot of tips. What's your advice for talking to friends and family who strongly believe that a piece of misinformation is actually true? And, you know, a lot of us have those conversations, particularly at holiday dinner tables and social events. How do you talk to people who seem to be spouting misinformation, people who you know and love?
There is no magic, and you can probably guess what I'm going to say. And there is some evidence to back this stuff up, because it's similar to talking, I think, with patients. My wife's a family physician, so we actually talk about this a lot. You want to be patient. You want to keep the temperature down. Don't make this an argument. You want to give people a path to credible information. No one changes their mind in front of you, right? No one ever goes, "Right,
Brian, now that you mention it, you're right." That doesn't happen. Oh, it happens very rarely. So you want to do that. I think you also want to keep the relationship alive. That relationship is important, right? So you don't want to blow up the relationship. You want to maintain the relationship so you can come back to the conversation. And that path to credible information is important. You know, give people the opportunity to see what the evidence says. And there's some evidence, we're still researching this, that we all live in echo chambers, and if you can expose people to other perspectives, it does make a difference. So invite people to see those other perspectives. Invite people to see the good science. Be patient. Maintain that relationship. And hopefully, hopefully, better days ahead. Well, Professor Tim Caulfield, it's been a pleasure speaking with you about misinformation, sharing some stories. And you've provided a lot of tips that I think will be very valuable to listeners of The Dose. So I want to thank you for speaking with us. Thanks for having me on.
Tim Caulfield is a professor in the Faculty of Law and the School of Public Health and research director of the Health Law Institute at the University of Alberta. And he's got a new book out, The Certainty Illusion: What You Don't Know and Why It Matters. Here's your dose of smart advice. A 2025 survey by the Canadian Medical Association says Canadians are encountering more health misinformation.
The survey found 37% of millennials and 58% of Gen Z get their news from social media. Those who rely heavily on social media for news remain the most vulnerable to health misinformation. Among the survey's top health claims that are false: that masks don't stop the spread of airborne illnesses like COVID (they do), and that a healthy lifestyle alone can prevent cancer (it doesn't).
The survey found a direct link between misinformation and negative health outcomes, with more Canadians delaying medical treatment and experiencing heightened anxiety due to false health claims. Here are some ways to spot health misinformation. Health claims should not be based solely on testimonials. They should be backed by research in reputable journals like the Canadian Medical Association Journal or the New England Journal of Medicine.
Always double-check information first using reputable websites such as the Mayo Clinic, MedlinePlus and the Public Health Agency of Canada. You should be able to find major health claims from more than one source. Look out for health claims that use buzzwords like "cures" and "quick fixes." Ask yourself if the wording is designed to entice you. Avoid claims that use polarizing language or words and images that instill fear or anger.
When an expert makes a health claim, go online to see if they have the expertise to make those claims. Steer clear of so-called experts who are influencers or who are selling a product. And beware of health claims that are attributed to unnamed experts.
And consider pausing before sharing claims with others. If a family member or friend appears to believe a health claim you know to be false, don't be confrontational. Instead, ask them where they get their information. Ask them to show you the source and how they evaluate the source's information.
Try to be empathetic and acknowledge that it's difficult to find information you can trust these days. If you have topics you'd like discussed or questions answered, our email address is thedose@cbc.ca. If you liked this episode, please give us a rating and review wherever you get your podcasts. This edition of The Dose was produced by Samir Chhabra. Our senior producer is Colleen Ross. The Dose wants you to be better informed about your health. If you're looking for medical advice, see your health care provider. I'm Dr. Brian Goldman. Until your next dose.
For more CBC Podcasts, go to cbc.ca slash podcasts.