
The challenges of studying misinformation, and what Wikipedia can tell us about human curiosity

2024/10/31

Science Magazine Podcast

People
Dani Bassett
Kai Kupferschmidt
Topics
Kai Kupferschmidt: The field of misinformation research faces many challenges, chief among them the lack of a clear definition of misinformation and of consensus on research methods. Researchers disagree over whether to focus on fake news, misleading headlines, or information that is factually correct but slanted. Moreover, the effects of misinformation are multifaceted and hard to measure; it affects public health, elections, and trust in institutions. Research is also hampered by limited data access from social media companies, whose interests may skew the direction of research.


Key Insights

Why is misinformation research challenging?

Misinformation research faces challenges due to a lack of consensus on its definition and strategies to combat it. Researchers debate whether misinformation must be entirely false or if slanted, factually correct information that creates a misleading impression also qualifies. Additionally, the field struggles with disentangling the multifaceted impacts of misinformation on public trust and behavior.

What are the three curiosity styles identified in Wikipedia browsing?

The three curiosity styles are the busybody, the hunter, and the dancer. Busybodies flit between unrelated topics, hunters focus on solving specific puzzles, and dancers leap between different areas of knowledge, creatively linking them.

How does the curiosity style correlate with Wikipedia content?

Hunters tend to browse STEM-related pages, while busybodies explore culture and entertainment topics. Dancers make creative leaps between different areas of knowledge, often stitching together interdisciplinary ideas.

What role do social media companies play in misinformation research?

Social media companies provide access to vast datasets on user behavior, which is crucial for studying misinformation. However, their involvement raises ethical concerns, as their interests may skew research toward psychological interventions rather than systemic changes, such as altering algorithms or business models.

How does the curiosity style vary across countries?

Countries with greater equality in education and gender tend to have browsers who explore a diversity of topics, in the style of the busybody. In contrast, countries with higher inequality often have browsers who behave more like hunters, focusing on specific topics.

What is the 'inoculation' approach to combating misinformation?

The inoculation approach involves exposing people to weakened doses of misinformation to help them build immunity. By understanding common logical fallacies or emotional language used in misinformation, individuals can better recognize and resist it when encountered in the future.

What are the limitations of the inoculation approach?

While inoculation works well in controlled studies, its effectiveness in real-world scenarios, such as social media feeds, is harder to measure. It requires constant identification and pre-bunking of misinformation, which is difficult due to the sheer volume of new misinformation daily.

How do researchers predict when misinformation will spread widely?

Researchers monitor social media for recurring narratives and frames that often lead to specific types of rumors. By identifying these patterns, they can predict which rumors are likely to gain traction and spread quickly.

What is the 'rumor clinic' approach to addressing misinformation?

The rumor clinic approach involves monitoring media and social platforms for potential misinformation storms. Researchers then provide rapid debunking information to journalists and officials, helping them contextualize and address the misinformation before it spreads widely.

Why is Wikipedia a valuable tool for studying human curiosity?

Wikipedia provides a vast, accessible dataset of browsing behavior, here covering more than 480,000 users across 50 countries. This allows researchers to analyze how people seek and navigate information, revealing different curiosity styles and their correlations with country-level measures and content types.

Chapters
Misinformation research faces dilemmas defining misinformation, assessing its impact, and navigating data access from social media companies. Researchers explore various approaches like rumor clinics, inoculation, and pre-bunking to combat misinformation's spread and impact on public trust.
  • Lack of consensus on a definition for misinformation among researchers.
  • Difficulty in measuring the real-world impact of misinformation exposure.
  • Ethical concerns limit experimental research on misinformation.
  • Social media platforms provide valuable data but present access challenges and potential biases.
  • Researchers explore alternative data sources like surveys, user-donated data, and simulations.
  • Rumor clinics monitor and analyze misinformation trends to anticipate potential crises.
  • Inoculation and pre-bunking aim to build resistance to misinformation by exposing individuals to weakened doses or warnings.
  • The effectiveness of these interventions is still being studied, particularly in real-world online environments.
  • The weaponization of misinformation for personal gain poses a serious concern.
  • Gamification of truth online contributes to the spread of false information.

Transcript


This podcast is supported by the Icahn School of Medicine at Mount Sinai, the academic arm of the Mount Sinai Health System in New York City, and one of America's leading research medical schools. What are scientists and clinicians working on to improve medical care and health for women? Find out in a special supplement to Science Magazine prepared by the Icahn School of Medicine at Mount Sinai in partnership with Science.

Visit our website at www.science.org and search for Frontiers of Medical Research-Women's Health, the Icahn School of Medicine at Mount Sinai. We find a way.

This is the Science Podcast for November 1st, 2024. I'm Sarah Crespi. First up this week, contributing correspondent Kai Kupferschmidt joins me to talk about what's missing from misinformation research. While it seems like misinformation is absolutely everywhere, scientists in the field don't always agree on a common definition or a shared strategy for fighting misinformation.

Next, did you know that there are different styles of curiosity, things like the hunter or dancer style? Researcher Dani Bassett explains what we can learn from Wikipedia about different curiosity types and how the approach you use may depend on where you live.

This week in Science, contributing correspondent Kai Kupferschmidt wrote several stories on the state of misinformation research. Hi, Kai. Welcome back to the Science Podcast. Hey, Sarah. Good to be on. To me, this set of stories feels timely. Misinformation during the election season, especially in the United States, seems to be everywhere.

We hear about it in the news. We are drowning in it on social media. And you yourself have actually been looking at this for a long time. How did you get into this topic and why is now a good time to produce these stories? I mean, my personal story has less to do with politics than with the pandemic. I've been covering infectious diseases now for something like 15 years. My takeaway over the last few years has been that misinformation has always played a role in infectious diseases.

These outbreaks are kind of scary. The situation changes rapidly. And in the beginning, there's little known. So there's always room for misinformation. But it feels like over those 15 years, it's also gotten a lot worse.

At least in my personal life, after the COVID-19 pandemic and then the mpox coverage that I did, I felt like I really needed to understand a little bit better what's going on with misinformation. And so I ended up applying for the Knight Science Journalism Fellowship at MIT. And I went there for 10 months, basically trying to spend the time to really understand misinformation, because it felt like if I want to be good at my job of covering infectious diseases, I do need to understand a little bit better what's going on with misinformation as well.

That came with a lot of frustration, I would say, in the beginning, because I think I was hoping for some easy answers and solutions. And that really wasn't the case. I ended that fellowship in May. And since then, I've kind of been trying to order my thoughts and write that down. And then the other thing is, like you said, we're very, very close to the US presidential election. And

A lot in the field really started happening after 2016, after the election of Donald Trump. For a lot of researchers, that was either the moment when they got into the field or the moment when something they'd been studying, which was slightly fringe, really became mainstream, because so many people were suddenly asking questions about how can this be? And so, yeah, in a lot of ways, my personal and kind of the overarching trajectory have made this a good moment for me to write about it.

I see you have a story giving an overview of the major dilemmas facing this field. And it really was surprising to me how little consensus there is. So, for example, how are the people studying this defining misinformation? I mean, that's where it really...

already starts to get complicated because when the field really got going in a major way, a lot of the concern was about fake news. In the beginning, the concern was, okay, we have these stories here that look like newspaper articles or online articles, but they're actually fake. Over time, it just became clear that that is such a small part of the information that people consume that it really isn't the issue that people worry about.

Over time, instead, people have said, okay, let's look at everything that's false information. So not just stuff that looks like a newspaper article, but any post, anything that gives false information. Even there, it becomes quite complicated because sometimes, for instance, you can say something that goes against the scientific consensus is false. So there are these kind of complicated questions about how to define it. And then if you go further, a lot of the researchers told me, look,

It's not even that something has to be false to be misinformation. There's a lot of stuff out there that's factually correct, but that is so slanted or cherry picking that it ends up creating the wrong impression in people's mind. And that is misinformation too.

One of the studies that I found really fascinating is a 2024 study that we actually published at Science. They looked at the articles about vaccines on Facebook that spread the most in the first three months of 2021, and they found that the article that had by far the most readers, I think something like 50 million readers, was an article in the Chicago Tribune with the headline "A healthy doctor died two weeks after getting a COVID-19 vaccine. CDC is investigating why."

They looked at that article and they said, look, this spread way further than all the fake news articles and everything that was labeled misinformation on Facebook. And they kind of modeled that, even though the article itself maybe drives vaccine skepticism less on an individual level than a fake news article about vaccines, on the whole, because it was seen by so many people, it had a way bigger impact.

It's a misleading headline or a clickbait headline, but is that misinformation? Is that what we're talking about when we're talking about misinformation? These are complicated debates that are happening. And I think one of the issues that the field has grappled with is that everybody kind of talks about different things when they talk about misinformation. And increasingly, people are being very specific about what they're talking about. Are they talking about fake news? Are they talking about misleading headlines? Are they talking about rumors, conspiracy theories? So the field is really, I think, kind of like getting into these different subfields in a way.

Refining the definitions. Yeah. Yeah. It's interesting that you point out that that was something that was published in the newspaper and then shared on social media. And like we, as people who work in journalism, it's very frustrating to see bad information being circulated, people relying on sources that aren't very reliable. But do we know how much of an impact being exposed to this content has? We can say this is their reach, but do we know if it leads to behavior changes? Yeah.

So there's all of these different ways of thinking about misinformation. And when I went to MIT in the beginning, you know, I was ready to dig into misinformation and I always asked people, you know, okay, so what have we learned? And I was shocked in the beginning that everybody, first of all, kept going back to, well, what is misinformation, you know? But now, of course, I understand that this is such a big part of it: really just making sure we're talking about the same thing is really important. And I think that's where we as, you know, journalists also have

a job because I think the reporting on misinformation as well hasn't always done a great job of bringing that nuance across. I think the fear about its impact is real. I mean,

There's concerns that misinformation can lead to people not participating in public health measures. It can lead to problems with how people vote or how elections are run or decreased trust across the board in all these institutions. Yeah, I think that's part of the issue is that the impacts are on so many levels, right? If you take the elections, if there's a rumor circulating, you know, on the day of the elections about

your polling station is closed or something, it might just disenfranchise people, right? That's one very concrete little thing. But then on the other hand, of course, over time, all of this misinformation and maybe eroding trust in democracy and public institutions may just lead people to not vote more generally. There are just so many different layers on which this operates because

Once you decrease trust, any information that comes from those sources has less of an impact. I think researchers are really struggling with trying to disentangle all of these different things. And then, of course, showing that a certain piece of misinformation has a certain effect in the real world is really hard. You can't just go into the lab and feed people misinformation and then see if they act on it in the real world, right? That would be unethical. But also, it's hard to do. Yeah. So much of the misinformation research that's being done is being done in partnership

with social media companies, or perhaps despite social media companies. We've seen a big change in how Twitter or X participates in research, and we just had a bunch of studies come out of Facebook, but there are questions about what was really going on behind the scenes.

There's a kind of weird paradox in all of this, because really the reason that people have concentrated on social media so much in the first place is that it was such a good place to get actual data, right? Because you have these huge data sets of the links that people have clicked, the posts that people have engaged with and all of these things. And so you can look at a huge amount of data on billions of people potentially and what kind of information they consume online. That, first of all, of course, is a problem because a lot of

the information we consume isn't online. We listen to radio and podcasts, hopefully. We watch TV, all of these things. And so there is a question whether there's already kind of a bias there. But then on the other hand, people have gone all in because they had this data conveniently available. And then over time, of course, the data access has gotten worse and worse. On the one hand, some researchers are saying, well, I mean, it's really terrible that we're losing this access, but at the same time, maybe it's also good because it forces us to do a different type of research.

On the other hand, there are people who are then trying to collaborate with the social media companies and doing this kind of research to have better access. But that also raises a ton of thorny questions.

And I think there's valid concern that in general, the kind of research that social media companies are going to allow researchers to do is going to be slightly skewed towards the kind of thing that they're interested in. So for instance, a lot of the research on the platforms has been on psychological interventions. So something that happens with the end user. So say, you know, media literacy tips.

That's good for the platforms because it means they don't have to change anything about the algorithm or the business model or anything. I think it's been much harder for researchers to gain access and really look at these systemic issues, which everybody I talk to in the field says, look, clearly we're not going to solve this problem with these individual psychological interventions. And that's where I think the power of the social media companies plays a role.

It's to their advantage to have everybody engaging with content, and conspiracy theories, rumors, all that stuff is very engaging. And it is what pays the bills for social media. And it is interesting to think about comparisons like, say, climate change, right? If you imagine that only the oil companies had the data needed to investigate climate change, right?

that would really change what we know about climate change, probably. And that's kind of the situation. One of the researchers I talked to, she said actually climate change maybe isn't the best example. It's more like fisheries, because really the researchers have to go on the boats to get the data to understand the quotas and what's happening there. So if they don't get involved with social media data or social media companies, what could researchers look at?

How could they explore this further? I think a lot of the research going forward is going to have to be, you know, sometimes doing surveys of people, for instance, asking them how they use certain platforms. Some people are asking users to donate their data. People

can basically use a plugin or just download the data from certain social media platforms and then give that to researchers. Some researchers are even building their own social media, where they basically have AI agents talking to each other and simulating a social network. And then there's scraping: at least for the platforms where a lot of stuff is public, like X, people can just try and grab all that data continuously

from the net. There's a question about how legal and ethical that is, but that is something that some researchers are also doing. I thought this idea of a rumor clinic, where researchers are monitoring what's happening in media, online, looking out for potential misinformation storms, basically, like

kind of being ready to respond to a crisis, treating misinformation like a crisis. You know, how easy is it for people to predict when misinformation is going to hit the big time? I think what really helps them, stupid as that sounds, is that they just spend

insane amounts of time wading through the worst of social media. And so they see all of these frames and narratives, because I think one of the things that people misunderstand about rumors is that they think, oh, you know, some rumor comes up and then, you know, maybe it goes big or it doesn't, and there's not really a good reason for why it does. But most of the time, there are certain narratives that are already set or certain frames, and they really lead to a specific

type of rumor coming up. So for instance, with the elections, if you take Sharpiegate, I don't know if everybody remembers, but this was this worry that these Sharpies being used in polling stations were basically part of a ploy to make certain votes not count.

A lot of this is based on, you already have a narrative out there, which is, oh, people are trying to steal the election. And once that's the narrative, then almost organically, people who see certain things in their polling station or outside of the polling station will pick up these things and then they will put it in that frame and say, oh, here's evidence that this is happening. And then that quickly spreads. There's a lot of these kinds of examples where, if you are already in these circles and you are

seeing these frames and these messages being pushed by certain people, it is easier to see what kind of rumor might come up. And when it does come up, what kind of rumor might spread easily. If you're working in a rumor clinic and you're keeping your eye out for these types of things, what good is knowing maybe a few days in advance that something is going to go big,

but it's wrong? What can you do with that? Yeah, that's a really good question. I'm not sure we have a good answer on it. So what Kate Starbird and her team at the University of Washington do a lot is that they put the information out there, in what they call rapid research posts. They say, here's a rumor. This is how it's developed and been spreading. This is what we know about the actual background to it. It kind of gives

journalists, but also election officials and others, something to work with. It kind of gives you a different frame or a different narrative. You then know, oh, okay, this is coming up because there is this rumor and this rumor has come about this way. And so it allows you to put something in context that otherwise maybe would just surprise you and you wouldn't know what to do with it. Ideally, of course, you'd be able to reach people who believe that rumor and

debunk it and explain to people what's actually happening. But in our fragmented media ecosystem, that's probably really hard to do. This almost sounds like inoculation, which is the other approach that you write about in one of your stories, which is like letting people know in advance some facts so that they don't get fooled later. Is that putting it too simply?

I mean, it's more than that. It's not letting them know the facts. So it's warning them that they might hear misinformation and then telling them what the misinformation is likely going to be. The idea goes back to William McGuire, a psychologist in the US in the 50s and 60s who worked a lot on this. And his idea was that people are sometimes easily persuaded

because even though they might hold something to be true and self-evident, if they're never exposed to criticisms of these ideas, they don't actually learn how to counter those kinds of arguments. And so he was comparing it to people who've been brought up in an aseptic environment. So they've never built up any antibodies against a lot of infections. And so the idea was, like a vaccine, let's expose people to kind of a weakened

dose of the misinformation. And that way, people can then start to build up an immunity to it. That analogy is all over the inoculation idea, of course. So does it work? Yeah, yeah, that's the question, right? I mean, what I took away from my reporting is that it seems to work pretty well in the

more confined sense, often called pre-bunking, where you're doing that very specifically for one idea. So Sander van der Linden, who's maybe the biggest proponent of this, he works at the University of Cambridge and he comes from the climate change space. His first big study on this was testing whether you can inoculate people

against this argument that there is no scientific consensus around climate change. So he showed that if you tell people that there's a scientific consensus, that 97% of scientists agree on climate change, people revise their estimate of the consensus upwards. But if you then counter it with this famous petition, which has been signed by lots of people and is

described often as more than 30,000 American scientists have signed a petition saying that there's no consensus. Now, that's not really true, but that misinformation can completely erase the effect of telling people about the consensus on climate change. And so he was testing whether you could inoculate people by telling them

So there is a consensus on climate change, but there are also people who are spreading misinformation, you know, in this form, and that's not actually true. And just to see whether that way you can counter that. And that seemed to work really well. And there's other studies that suggest that it works quite well on that level. But you have to find each case where misinformation is, identify it, and then create a way of telling people the correct information. It's just a huge amount of work, right?

Yeah, absolutely. You know, every day a million new pieces of misinformation come up somewhere. So it's impossible, I think, to pre-bunk all of that. But what Sander van der Linden is trying to do now is to be much more general. So to try and say, look, here are certain characteristics of misinformation. His idea has been to try and take things that all of this misinformation has in common. And he's identified certain kinds of logical fallacies or emotional language being used.

and then inoculate people to that. But I do think that's a very different idea from the pre-bunking. And it seems to work in studies to a certain extent. I think it's very hard to know how well it works if you're just scrolling through your feed on Twitter and you saw a video one day or a week ago warning you that certain logical fallacies are being used by people spreading misinformation, like how well that actually...

then translates into your behavior online. That's really hard to know. I think we're still trying to figure that out. But at least it's something that people are working on because otherwise this problem just sometimes seems so big and so systemic that it can be really frustrating, I think. Yeah. Are you going to keep reporting on it going forward?

That is such a good question. So I've been telling everybody that this is my get-out-of-jail card. Like, I'm going to write this one package of stories and then I've done my due. You're going to get out of the fog. I don't think that's how it works. I do think it's a really important topic. We're learning interesting things, but I think it's also quite slow. And what we're learning is often very limited by, you know, oh, this is on that platform in that country in that time period. And you can't ever really generalize it to other things. I'm definitely going

to keep track of the field and probably write about it every now and then. I hope that a lot of it is just going to also inform my reporting on other topics. Yeah. Okay, Kai, one very cynical question is, have people always believed a bunch of garbage and we just didn't know about it? We just have more insight into what they think? Yeah.

I mean, that is one of the interesting questions that's so hard to answer. I mean, you could argue that, yes, people have always believed a lot of nonsense. You and I probably believe a lot of nonsense about some things. At least, you know, if we look back in 100 years, people will be able to pinpoint those. And so I think you always have to be a little bit skeptical and self-critical. At the same time, there is, I think, the thing that most

worries me about all of this is how people are weaponizing it. There is a pathway here for people to really kind of game the system and to really very deliberately misinform people about things for their own gain. To some extent online, we have gamified the truth in a way that's really, really worrying. All right, Kai, this has been super fascinating. Thanks so much for talking with us on the show. Thanks for having me. I hope I didn't depress you too much.

But I mean, we keep fighting the good fight. Kai Kupferschmidt is a contributing correspondent for Science. You can find a link to the stories we discussed at science.org slash podcast. Don't touch that dial. Up next, researcher Dani Bassett talks about using Wikipedia to understand human curiosity.

This podcast is supported by the Icahn School of Medicine at Mount Sinai, one of America's leading research medical schools. Icahn Mount Sinai is the academic arm of the eight-hospital Mount Sinai Health System in New York City. It's consistently among the top recipients of NIH funding. Researchers at Icahn Mount Sinai have made breakthrough discoveries in many fields vital to advancing the health of patients, including cancer,

COVID and long COVID, cardiology, neuroscience, and artificial intelligence. The Icahn School of Medicine at Mount Sinai. We find a way.

Japan's Noster specializes in postbiotic gut microbiota metabolite-based pharmaceutical research to treat metabolic and immune-related diseases. Noster's products include biosynthesized GMP bacterial preparations and QMEC, the world's first HYA-50 metabolite postbiotics healthcare supplement.

Their analytical services include liquid chromatography, mass spectrometry, metabolome analysis, and state-of-the-art genetic sequencing for gut microbiota analysis. Visit www.noster.inc to discover how Noster can help you.

As a super curious kid in the 80s, I was basically waiting for the internet. I found out later that I could have just been calling the library every time something came to me that I wanted to know about, some curiosity, some burning question. But instead, whatever came out of fiction books, you know, as background information, or the 1940s dictionary that we had in the house, were my primary sources.

You know, it was never, ever going to match up with the way things are today, with what feels like this infinite access to information. And I have to say, one of the cornerstones of my curiosity, of my dream of never-ending information, is Wikipedia. Despite early concerns that it would be constantly shifting and kind of shifty as a source of facts, it's really lasted and, in my opinion, proven itself endlessly fascinating and useful. So

This week in Science Advances, Dani Bassett and colleagues write about how people use Wikipedia and what it tells us about human curiosity. Hi, Dani. Welcome to the Science Podcast. Hi, thanks for having me. Sure. This is great. I'm so excited to learn more about how people use Wikipedia and what it says about people. It's such an interesting question. So can you give us the big picture? Like, what are you trying to find out about, I guess, people's curiosity,

with this type of research? Absolutely. Yeah. What we're doing is that we're trying to understand how different people seek information. We are detecting different styles of curiosity. It turns out that each of us is differently curious from one another. What we want to do is to understand what those styles are and how different people use them as they browse Wikipedia. What are the different curiosity styles?

There are three different styles of curiosity that are predicted from a historical assessment of text over the last 2,000 years. Those three styles are the busybody, the hunter, and the dancer. The busybody is somebody who flits between different pieces of information that might have nothing to do with one another. So these people love all

Wikipedia pages, all pieces of information, all pieces of news. The historical accounts around this kind of person are more like a social butterfly, but in the epistemic space, so in the space of information. The second style is the hunter, and that's a person who is seeking a particular piece of information. They're trying to solve a puzzle. They have a goal in mind, and they are pressing through an information space to solve, to reach that goal.

The third style is the dancer, and that's somebody who spends some time in one area of knowledge and then leaps, jumps to another area of knowledge and stitches them together in a creative motion that's linking two different sectors of knowledge. So interdisciplinary work, fantasy, ideas that bring two pieces of information together you wouldn't expect. So these are the three styles of curiosity that we're seeking to detect in humans today.

Wikipedia is a huge site and its user base is also ridiculously large. And here you have data from over 480,000 people spanning 50 countries. How are you able to get access to all this user data from Wikipedia?

Yeah, so we collaborated with Martin Gerlach, who's a senior research scientist at the Wikimedia Foundation, and they have anonymized data that they can share from millions of browsers, in fact, across the world. What kind of data were they sharing? Were they sharing these kind of journeys that you were just describing when you were talking about the different curiosity types?

Yes, they share information about what page each browser went to. So which Wikipedia page and at what time and then how they moved from one page to another. So we can track how they were moving through Wikipedia. Did they also do it by the types of information? Were there certain areas of Wikipedia that were really sticky or that people tended to do deep dives once they started down that route?

Yes. Interestingly, what we found is that people who browse more like hunters tend to be browsing pages in STEM, so science, technology, engineering, and mathematics. Whereas people who browse more like busybodies tend to be browsing pages on culture, pages on entertainment, music, famous individuals, things like that. It seems that

two important factors come up. One is that we differently browse Wikipedia. So some of us are hunters, some of us are busybodies, some of us are dancers. But also, if we're a hunter, we tend to go to STEM pages. If we are a busybody, we tend to go to culture pages. Just a quick reminder that...

Busybodies are going from page to page. They just, they tend to expand out their search. And these are the ones looking at the culture pages. And then hunters tend to press forward into one area and stick very close to where they started. And dancers make big leaps between different areas of content on the site. And so you did see this correlation between what kinds of content these different curiosity styles were associated with.
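To make the kind of session analysis described above a bit more concrete, here is a minimal sketch of how one reader's page visits could be turned into a small network and summarized with a couple of graph statistics. The page lists, topic tags, and "shared tag" relatedness rule below are invented for illustration; this is not the pipeline from the Science Advances paper, only a toy version of the idea that hunter-like browsing produces a tight, well-connected network while busybody-like browsing falls apart into disconnected pieces.

```python
# Illustrative sketch only: the sessions, topic tags, and relatedness rule are
# hypothetical, not the method used in the study discussed in the episode.
import networkx as nx

# Hypothetical mapping from page title to topic tags.
TOPICS = {
    "Bayes' theorem": {"math"},
    "Conditional probability": {"math"},
    "Prior probability": {"math"},
    "Markov chain": {"math"},
    "Eurovision Song Contest": {"music"},
    "Giant panda": {"animals"},
    "Art Deco": {"design"},
    "Sourdough": {"food"},
}

def session_network(pages):
    """Nodes are the pages a reader visited; an edge links two pages that share a topic tag."""
    G = nx.Graph()
    G.add_nodes_from(pages)
    for i, a in enumerate(pages):
        for b in pages[i + 1:]:
            if a != b and TOPICS.get(a, set()) & TOPICS.get(b, set()):
                G.add_edge(a, b)
    return G

def summarize(pages):
    G = session_network(pages)
    return {
        "density": round(nx.density(G), 2),           # focused sessions are densely connected
        "clustering": round(nx.average_clustering(G), 2),
        "pieces": nx.number_connected_components(G),  # scattered sessions break into many pieces
    }

hunter_like = ["Bayes' theorem", "Conditional probability", "Prior probability", "Markov chain"]
busybody_like = ["Eurovision Song Contest", "Giant panda", "Art Deco", "Sourdough"]

print("hunter-like:  ", summarize(hunter_like))    # dense, a single connected piece
print("busybody-like:", summarize(busybody_like))  # sparse, four disconnected pieces
```

In this toy example the hunter-like session comes out fully connected (density 1.0, one piece) and the busybody-like session has no edges at all (density 0.0, four pieces); the actual study works with far longer reading sessions, many more readers, and richer network measures.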

What other characteristics did you see that correlated with curiosity style, and what was in the data?

I think the one that I'm most excited about and interested in is a relationship between the curiosity style of a whole country, all the browsers in the country, and measures of education and gender inequality in the country. So what we observe is that countries that have greater inequality also tend to have browsers who browse more like hunters, whereas countries with greater

gender equality and education equality tend to have browsers that browse a diversity of topics like the busybody. What are some of the thoughts about why that might be? There are many possibilities. One is that in different countries with more equality, people may be coming to Wikipedia with a different set of motivations.

So it is possible for people to come to Wikipedia mostly for work in some countries or more for entertainment or more for leisure or more for widening the mind. And that may be different across different countries. It's also possible that people are coming to Wikipedia in different countries at different ages.

Or even being of different genders. It's possible that in some countries, the people who browse Wikipedia are more of one certain age or more of one gender. We don't have that information at this point because, as I mentioned, the data are completely anonymized.

So there are open questions of why. Could it be a difference in the browsers themselves? And how does that relate to the social structures of equality versus inequality? I do want to ask about the international aspect of this. So I feel like there are large Wikipedias in certain languages and smaller ones in other languages. Is that something that you can see in your data? And does that reflect anything about...

you know, how people are using Wikipedia in their country and their language? Yes, there are definitely differences in the size depending on the language. And there are also differences in the proportion of science pages or STEM pages to non-STEM pages. We need to do further digging or another study needs to be written on how those differences in size and the topics that are being covered

might relate to how people are approaching Wikipedia. But it is true that sometimes we see that people who are in Germany are reading English Wikipedia, and sometimes they're reading German Wikipedia, right? And that's true for many countries. So I'm curious about how people choose which Wikipedia to go to when they are bilingual or multilingual. Yeah. A lot of times I end up on a different country's Wikipedia because an image is used there

And I want to see the image and it's only uploaded to the German version. And it's like a special fish that I want to look at or whatever. So I end up over there.

It seems like a very good reason. What did you learn about these different curiosity styles? Did you get any more insight into how they operate? One of the big findings that we had is this observation that the busybodies browse a diversity of topics, mostly across culture pages more than STEM, whereas the hunters browse more in the STEM areas. So that was a new finding that we observed.

And then the additional new finding was the observation of the dancer style of curiosity. Previous work had suggested that busybodies and hunters existed in a very small laboratory data set of 149 people. This was the first time that we saw the dancer style and were able to see how people move through this space in a way that is intellectually creative. Mm-hmm.

We knew about dancers from historical textual analysis, but not from the smaller lab-based study. It really only popped out in this giant Wikipedia study. What did you see in terms of proportions across your whole data set of dancer searches versus hunter searches versus busybodies?

Yeah, that is different by country. So I hesitate to put a single number on that overall. So you see a difference in the proportion of these different curiosity styles, depending on what country you're looking at?

That's correct. Yes. I also, though, at this point, I want to say that while we observe that there are different styles of curiosity in each browser, I also think that we can show that browsers have varying tendencies. So while they may prefer one kind of style, a hunter might spend a little bit of time more like a busybody and then go back to being hunter-like.

That makes sense. I mean, none of us fit perfectly into any kind of cubbyhole, right? We're all going to have a primary and then some secondary characteristics that kind of, especially to meet the moment, like if you're in school and you need to focus on your work, then maybe you're going to just stick with the pages that are very tightly connected. Whereas if you're, you know, playing that game, like how many steps into Wikipedia before you get to X page, yeah, things are going to be a little nutty.

Yes, I agree. And in fact, the hypothesis that we're motivated to test in future work is that people, to engage effectively in the world around them, have to use different curiosity styles. And so perhaps one of the goals of education is to help kids learn

how and when to use different curiosity styles. When do I need to be a hunter? When do I need to transition to be more of a dancer or a busybody? And I think that in my work as a scientist, I have to use all three kinds of curiosity. I need to be very open-minded at the beginning. Then I need to turn into the hunter and focus on a single question. And then at the end, I need to ask myself how my discovery relates to previous theories or work in other fields. So you have to do this stitching together of

different disciplines. So what made you get interested in this kind of work in curiosity as a topic for research? I think I've been interested in why we seek information. And a lot of the work that's been done in the field has focused on

how curious we are about the answers to trivia questions, which I think is a very important area of work. However, I also know that we are curious in different ways. Some of us are not very curious about trivia and others of us are. And so what I really wanted was to understand more about the different ways that people are curious, instead of defining it in a single way. This also came up when you look at the work on how curiosity might change over the lifespan.

So some work suggests that curiosity peaks when people are kids and then goes down after that and especially decreases in later age.

My perspective is that it is likely that there are different kinds of curiosity that are present in kids and young adults and then adults and then in later age. And understanding those different kinds of curiosity will help us to better understand the human experience. Yeah, especially, you know, as our lives change, as we have different responsibilities, different interests. Yeah, our curiosity may not be the same driver as it was. Exactly. Yeah.

Now, are there any extremes in the Wikipedia journeys that you witnessed? Were there any that span more pages than anyone could possibly expect? There are some people who span a wide variety of pages, although the ones that I think are most notable to me are the ones where somebody stays in a single area of Wikipedia for a very long time. And we noticed this in the earlier study of 149 participants in the lab.

There, for example, we had one person who for three weeks just read Wikipedia pages on the royal family in England. Did that coincide with the death of Queen Elizabeth? No, it was before that. That's so interesting. I know, it is interesting. And another person spent the whole three weeks reading pages on Jewish history. So there are some people who really focused in for a very long period of time in one area. Okay.

And those are not STEM. That's really interesting, too. They're not. Neither. Yeah. Yeah. Very cool. I'd love to see this on Reddit. Just to see how people spend their time on Reddit. That's a good idea. And, you know, there is a lot of expertise hidden in some corners of Reddit. Like if you look at Ask Historians. Knowing where there is expertise reminds me of an interest in Wikipedia articles.

and misinformation in Wikipedia, or specifically pages that are related to conspiracy theories, for example. So there is an open question by the Wikimedia Foundation of kind of how do these pages come to be, who browses them? So an open question at the moment is, is there a curiosity style that's characteristic of people who tend to go towards pages with misinformation or with conspiracy theories? And if so, are there

maybe even recommendation processes that could say, oh, you might also be interested in this other page that has a very different perspective on this topic. The last thing I want to mention, this is totally a little bit off topic, but one thing about curiosity that I think connects with something

I've read before, talked to someone about before, is this idea that you can have the sensation of learning, and that it's very enjoyable, but not actually learn anything. That's something that education research has shown: that you can separate those two things, and that people really enjoy the sensation of learning, and that that itself is a driver for some behaviors,

as opposed to, like, accumulating knowledge. Yeah, I think that is, I feel that when I listen to audiobooks. I have so much enjoyment that I'm learning something. And then I ask myself, you know, a couple months later, what was in that book? Hmm.

No, no recollection. Absolutely. For me, it has to be a conversation. It has to like fit in with problems I'm trying to solve. Otherwise it's just out the window. Yeah. Yeah. But yeah, the sensation of learning is very, like we just go and rub our brains sometimes. That feels right. It does feel right. Thank you so much for talking with me.

Yeah, thank you. Dani Bassett is a professor in the Department of Bioengineering at the University of Pennsylvania. You can find a link to the Science Advances paper we discussed at science.org slash podcast.

And that concludes this edition of the Science Podcast. If you have any comments or suggestions, write to us at [email protected]. To find us on podcasting apps, search for Science Magazine, or you can listen on our website, science.org/podcast. This show was edited by me, Sarah Crespi, and Kevin MacLean. We had production help from Megan Tuck at Podigy.

Our music is by Jeffrey Cook and Wenkui Wen. On behalf of Science and its publisher, AAAS, thanks for joining us.