
The New Conservationists: AI is Making Meaning from the Sounds and Visuals of Wildlife (Part 2)

2024/12/16

Science Quickly

People
Ashleigh Papp
Matthew McCown
Rachel Feltman
Tanya Berger-Wolf
Topics
Ashleigh Papp: Traditional fieldwork is time-consuming and labor-intensive, making it difficult to get a full picture of species numbers and distributions. AI can make data collection and analysis far more efficient, helping protect wildlife more effectively. Matthew McCown: Natural systems are full of noisy data. AI can help uncover hidden signals, expand the scale of observation, and speed up data collection. His company, Conservation Metrics, uses AI to analyze data on many species, including seabirds, songbirds, bats, insects, and coral reefs, drawing on sound and image data to better understand species' activity patterns and ecosystem health. For example, analyzing reef soundscapes can reveal a reef's condition and inform protective measures. Tanya Berger-Wolf: Wildlife photos on social media can be used for conservation science. Computer vision and machine learning can automatically extract information from animal images, including species identification, individual identification, population estimates, and social behavior, which is crucial for protecting endangered species. Wildbook (later part of Wild Me and now Conservation X Labs) gathers animal images from many sources for conservation work. Rachel Feltman: AI both contributes to environmental problems and offers solutions. We should use it to better understand and protect wildlife and to confront the challenge of biodiversity loss, while also working to reduce its environmental footprint.

Deep Dive

Key Insights

Why are conservationists turning to machine learning to process nature's complexity?

Conservationists are turning to machine learning because the natural world is incredibly complex, with many unexplained factors influencing animal behavior and population fluctuations. Machine learning helps cut through the statistical noise, expanding the scale of observations and reducing the time needed to find relevant data in large datasets.

How does AI help in monitoring coral reefs?

AI helps in monitoring coral reefs by analyzing audio recordings to identify the unique sounds of healthy reefs. These sounds, which include popping, clicking, and grunting, convey information about the health and biodiversity of the ecosystem. AI can detect changes in these sounds, indicating the degradation of the reef.

What are the benefits of combining different methods of observing coral reefs?

Combining different methods of observing coral reefs, such as traditional scuba diving, acoustic sensors, and video cameras, provides a more comprehensive understanding of the ecosystem. Each method has its biases and blind spots, so combining them helps researchers cover different areas, time periods, and species, leading to more accurate and detailed observations.

How is social media contributing to animal conservation science?

Social media is contributing to animal conservation science by providing a vast number of images of wildlife. These images, often shared by tourists and nature enthusiasts, can be analyzed using machine learning to identify and track individual animals, determine population sizes, and understand their social networks and behaviors.

What is the significance of the algorithm developed by Tanya Berger-Wolf for identifying zebras?

The algorithm developed by Tanya Berger-Wolf for identifying zebras is significant because it can recognize individual zebras from photographs in just two clicks. This automated process, which was previously a time-consuming manual task, allows researchers to quickly and accurately track individual animals, providing valuable data for conservation efforts.

Why is there a need for better biodiversity data?

There is a need for better biodiversity data because more than 10% of the world's species are threatened with extinction, and the exact extent and rate of this loss are not well understood. Better data helps in making more informed conservation decisions and understanding the impacts of climate change and habitat loss on different species.

Chapters
This chapter explores the use of artificial intelligence in conservation, focusing on how AI helps researchers process large amounts of data from various sources to understand animal populations and their habitats better. It highlights the challenges of traditional field research and how AI can overcome them.
  • AI helps process nature's complexity
  • AI expands the reach of observation
  • AI reduces the time to find interesting data points in massive datasets

Shownotes Transcript


Ryan Reynolds here for Mint Mobile. You know, one of the perks about having four kids that you know about is actually getting a direct line to the big man up north. And this year, he wants you to know the best gift that you can give someone is the gift of Mint Mobile's unlimited wireless for $15 a month. Now, you don't even need to wrap it.

For Scientific American Science Quickly, this is Rachel Feltman. You're listening to the second episode of our Friday Fascination miniseries, The New Conservationists.

Today we're heading into the field, under the sea and to the savannah, with researchers who are using artificial intelligence to change the way we understand and protect animals and their ecosystems. Our guide once again is Ashleigh Papp, an animal scientist turned storyteller. She's here to explain why we're turning to machine learning to process nature's complexity, and how it's extending the reach of what our eyes can see and our ears can hear.

After college, I spent time as a field researcher in Costa Rica working with endangered sea turtles. It was a lot of hard work — basically hours of walking up and down remote beaches hoping to spot a turtle nesting in the sand. The fundamental idea behind field research rests on two questions: How many animals are there, and where do they live?

If we know the answers, we can learn a lot about a species, or in some cases, an entire ecosystem.

And even though they're simple questions in theory, answering them can be incredibly time-consuming and expensive. The natural world is super complicated. That's Matthew McCown. He started a company called Conservation Metrics, which uses technology such as AI automation to decode nature. There are endless numbers of factors that influence the way animals behave, the fluctuations of populations, and so on.

So natural systems have a lot of statistical noise, a lot of unexplained factors that are contributing to what you see on any given day. Matthew struggled for years to find the hidden signal in that noise until he realized that AI could help, basically by greatly expanding the kind of observational work that I used to do by hand in Costa Rica.

He needed a digital surveillance network that could watch and listen for sea turtles or tree frogs or parrotfish. Really, whatever species you're interested in conserving. What technology allows us to do is really increase the scale of our observations. So increase the number of places that you can be watching or measuring aspects of animal behavior.

and the amount of time you can do it, as opposed to just having one person in one place in a day, you can have 50 sites that are monitored for the full year. And that really helps us get a better understanding of the actual true signal of what's happening in these communities. Think cameras mounted on a tree in the jungle to capture monkeys swinging through. Or a hydrophone dropped into the water to record audio of whales swimming by.

But all those new observations, in turn, create a new problem. Then you start to generate huge amounts of information. Huge, many tens of thousands of hours of recordings. So instead of having one carefully curated notebook of field observations, you now have terabytes of passively recorded video and audio files. It becomes impossible to actually find the things you're interested in amongst the thousands of hours of recordings.

Enter machine learning. If you have 80,000 hours' worth of data and the thing that you're looking for is only in there for half an hour, computers can really help reduce the time it takes to find those things you're interested in.
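That needle-in-a-haystack search is easy to picture in code. Here's a minimal, hypothetical sketch of the kind of detector Matthew describes (not Conservation Metrics' actual software): score_window stands in for a trained classifier, and the window length and threshold are invented for illustration.

```python
# Hypothetical sketch: flag the few windows of a huge recording that
# likely contain a target call, so humans review minutes, not months.
import numpy as np

SAMPLE_RATE = 22_050   # samples per second (assumed)
WINDOW_SEC = 5         # score the recording in 5-second windows
THRESHOLD = 0.9        # only surface high-confidence detections

def score_window(window: np.ndarray) -> float:
    """Stand-in for a trained classifier: returns the probability that
    the target species is audible in this window. A real model would be
    a neural network trained on labeled examples."""
    return float(np.clip(np.abs(window).mean() * 10, 0.0, 1.0))

def find_detections(audio: np.ndarray) -> list[tuple[float, float]]:
    """Return (start_time_seconds, score) for every window over THRESHOLD."""
    hop = SAMPLE_RATE * WINDOW_SEC
    hits = []
    for start in range(0, len(audio) - hop + 1, hop):
        score = score_window(audio[start:start + hop])
        if score >= THRESHOLD:
            hits.append((start / SAMPLE_RATE, score))
    return hits

# 80,000 hours of audio becomes a short list of timestamped candidates.
```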

So Matthew got into building and refining computer software that cuts through the noise and pulls out the interesting stuff. Because then, the science can happen. His company started off monitoring seabirds, a notoriously difficult group of animals to study. From there, Conservation Metrics branched out into songbirds, bats and insects, and more recently, the team took the technology for a swim. We're super interested in coral reef communities.

Coral reefs are a powerhouse under the sea. About 25% of all marine species are found in, on, or around reefs. And the biodiversity of these ecosystems rivals rainforests on land. It's estimated that 1 billion people rely on reefs for food, income, and protection.

And those reefs are in danger. Climate change, unsustainable fishing, and pollution are their top threats, which can cause one of the reef's main energy producers, tiny algae called zooxanthellae, to flee in search of better living conditions.

There's a lot of really interesting science going on around the world, a bunch of university labs that have shown that healthy reefs have a unique sound and that that sound disappears as reefs get degraded. For those of us who haven't had the chance to snorkel or scuba dive near a reef, I had to ask Matthew: What does a healthy coral reef sound like? Coral reef sounds are really interesting, a little alien. There's a lot of popping and clicking on a lot of tropical reefs, which is related to several species of shrimp that make clicks with their appendages. And then there's a lot of grunting. These fish are doing a lot of grunting. And, you know, it sounds kind of like off-gassing sometimes. They're like... Like a bubble coming up to the surface.

It turns out all these pops, clicks, grunts, and bloops convey a lot of information. Right now, the Conservation Metrics team is working with a few universities in the U.S. and a coral restoration group on Moorea, an island in French Polynesia that's not far from Tahiti, to translate the noises. The field researchers on the island drop hydrophones into the water, press record, and then ship the sound files off for processing.

We're building these detection and classification models, these things that computers use to strip through the audio recordings and find the things that you're interested in. And it's crazy because we don't know what's making sounds. So we just have all these sort of unknown sound categories. We just give them a generic name, and then we just track that sound even though we don't know what's producing it.
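That "give it a generic name and track it" workflow looks a lot like unsupervised clustering. Below is a hedged sketch, not the team's actual pipeline: the two-number feature and the cluster count are invented, and a real system would likely cluster spectrogram embeddings instead.

```python
# Hypothetical sketch: group unlabeled reef sounds into generic
# categories ("unknown_0", "unknown_1", ...) that can be tracked over
# time without knowing which animal produces them.
import numpy as np
from sklearn.cluster import KMeans

def clip_features(clip: np.ndarray, sr: int = 22_050) -> np.ndarray:
    """Crude two-number summary of a clip: loudness and dominant pitch."""
    spectrum = np.abs(np.fft.rfft(clip))
    dominant_hz = np.argmax(spectrum) * sr / len(clip)
    return np.array([np.abs(clip).mean(), dominant_hz])

def categorize(clips: list[np.ndarray], n_categories: int = 8) -> list[str]:
    """Assign each clip a generic label; counts of each label per day
    become a trackable signal even with no species name attached."""
    X = np.stack([clip_features(c) for c in clips])
    labels = KMeans(n_clusters=n_categories, n_init=10).fit_predict(X)
    return [f"unknown_{k}" for k in labels]
```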

That decoded and organized soundscape is providing valuable new information for coral research on Moorea, which has been ongoing for decades. Researchers are combining recorded audio and video with over 20 years' worth of observational data, the physical notes of what a researcher sees when they swim in the same area every day.

We're really interested in combining different ways of making observations on reefs. So there's traditional scuba divers, where they swim a line and they count all the fish they see. They've been doing that for a long time. And at these same locations, we're now going to start to add an acoustic sensor, and then we're adding a video camera with the computer vision side of that. All of these are different ways of trying to make observations about how a community is doing. And so we're really interested in how you look across these lines of observation to learn more, because each of these methods has its biases, its downsides, its blind spots. And they cover different areas, different time periods, different taxa, different species.

Matthew hopes that better understanding the soundscape in places like Moorea will also help us learn how the noises of a healthy reef impact the animals that live there.

Animals are making decisions based on what the soundscape is. So there's a lot of experimental evidence that shows that certain species' larvae are floating around the oceans, and they will drop out of suspension when they hear a healthy reef or a healthy oyster bed, et cetera. Like, they're making decisions based on the sound field.

So some researchers are using AI to better understand the soundscapes of ecosystems. But what about using AI to literally see how ecosystems work? There are millions and millions of images out there of animals. That's Tanya Berger-Wolf, a computational ecologist. She's a professor at The Ohio State University and leads the Imageomics Institute.

So bringing all these images together automatically with the modern computer vision and machine learning approaches, we can find all the ones that contain animals, find where the animals are in those pictures, put a bounding box around each one and identify not only species, but down to individual animal.
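As a rough picture of that pipeline, consider the sketch below. It is hypothetical: the detector and identifier are placeholders for real computer-vision models, not the Imageomics Institute's code.

```python
# Hypothetical sketch of the image pipeline: find animals, box them,
# then narrow from species down to a known individual.
from dataclasses import dataclass
from typing import Any, Callable, Optional

@dataclass
class Detection:
    box: tuple[int, int, int, int]      # (x, y, width, height)
    species: str                        # e.g. "plains zebra"
    individual_id: Optional[str]        # None if no match in the database

def process_image(
    image: Any,
    detector: Callable,      # placeholder: returns [(box, species), ...]
    identifier: Callable,    # placeholder: matches coat pattern to an ID
) -> list[Detection]:
    detections = []
    for box, species in detector(image):
        individual = identifier(image, box, species)
        detections.append(Detection(box, species, individual))
    return detections
```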

Consulting firm Rise Above Research estimates that almost 2 trillion photos will be snapped this year. And for many people, sharing photos of wildlife and outdoor adventures is a popular pastime. Anything for the gram, right?

Well, it turns out these social media posts can be useful for animal conservation science. Anything striped, spotted, wrinkled, or notched, even the shape of a whale's fluke or the dorsal fin of a dolphin: these are all unique identifiers.

And so then with information on when and where the image was taken, you can really now finally start using images as the source of all kinds of information about animals, tracking them, counting them, determining their range, and even, yes, their social network.

Tanya spent years studying mostly mathematics. Only when she met her now-husband, ecologist Moshe Wolf, did she start considering how math could help with conservation work. And for many years, while doing my very theoretical computer science PhD, still pretty much math, I was involved in many conversations with him and his friends and colleagues where I would walk away with a feeling: oh, there's got to be a better way of answering that question, which was an ecological question. She chose a postdoc position in ecology and evolutionary biology and eventually started working with a group studying the social behavior of zebras in Africa. Zebras are super social animals, and they often hang out in big groups called a zeal or a dazzle.

To understand each individual animal's behavior, the group needed to be able to quickly identify one zebra from another amid a dazzle of black and white stripes. Tanya, at that time, had been working from behind a computer screen, wanting to bring a computational and algorithmic approach to the work. But after a few years on the team and a lot of prodding from her colleagues, she went to go see the animals herself and had a big realization.

When I finally went and saw my data, one of the things that immediately became very clear was that all the assumptions I was making about my algorithms, and the way I approached the problem, were completely off. It also became very, very clear that I did not understand my data.

To generate the data, one of Tanya's colleagues would go out every day and take photos of the zebras. Then the colleague would use a computer program to very carefully match, pixel by pixel, the zebra's stripes to recognize each animal and note who was standing next to whom, at what time of day, and any other particulars about the animal's behavior.

But Tanya, seeing the zebras in the wild and then watching the very manual process of matching the stripes, thought to herself: This is nuts. Five minutes later, I'm like, this is insane. This is taking too long. It's got to be two clicks. Come on. So she posed a friendly bet to her colleagues, which they gladly accepted. She then went back to her graduate students and explained their task. I'm like, look, I just bet my reputation that we can recognize individual zebras from photographs in two clicks.

And she and her students did it. They developed a computer program that could recognize a zebra by its stripes in two quick steps. And soon enough, the effort picked up steam and the algorithm was expanded to other species, allowing valuable information like spots on a giraffe or notches on a shark's fin to be extracted from images. The program could be used to do everything from determining population size all the way down to tracking individual animals.
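To make the two-step idea concrete, here's a hedged sketch using off-the-shelf ORB keypoints from OpenCV as a stand-in for whatever matcher the team actually built; the match threshold and the database of known animals are hypothetical.

```python
# Hypothetical two-step individual ID: (1) turn a photo into local
# stripe descriptors, (2) rank known animals by descriptor agreement.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=500)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

def describe(gray_flank_photo: np.ndarray) -> np.ndarray:
    """Step 1: extract binary descriptors around stripe keypoints."""
    _, descriptors = orb.detectAndCompute(gray_flank_photo, None)
    return descriptors

def identify(query: np.ndarray, database: dict[str, np.ndarray]) -> str:
    """Step 2: return the known zebra whose stripes agree most with the
    query (more low-distance matches = more pattern agreement)."""
    def agreement(known: np.ndarray) -> int:
        matches = matcher.match(query, known)
        return sum(1 for m in matches if m.distance < 40)
    return max(database, key=lambda name: agreement(database[name]))

# Usage: identify(describe(photo), {"zebra_042": stored_descriptors, ...})
```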

Today, the effort Tanya started, originally called Wildbook, which later became part of Wild Me and now Conservation X Labs, has more than 50 species in its databases. Images come from researchers, autonomous vehicles, camera traps, and even tourists. And the list of contributors continues to grow each year. For computational ecologists, however, this project is just the beginning.

If I show you pairs of photographs of zebras and ask you, "Are these two more similar to each other than these two?" you couldn't answer. No way! And no amount of training will help you. But the same algorithm that identifies these zebras also quantifies the similarity between stripe patterns and allows us, for the first time ever, to compare stripe similarity to genetic similarity and to start understanding the mechanism behind stripe pattern development. Is it hereditary? Can zebras tell each other apart using stripes?

Or do they not use stripes that way at all? And those questions, now more easily answered thanks to machine learning and AI, can help expedite conservation work around the world.

Biodiversity has a data problem. More than 10% of the world's species are threatened with extinction. That is a shockingly large number. But the problem is that we really don't know exactly what we're losing and how fast.

Today's climate is changing because of us. Our technology has emitted a lot of the greenhouse gases that are warming the planet. But animals are paying the price. Around the world, their habitats are changing too quickly for them to adapt, and species are disappearing altogether.

And by the way, every time we ask ChatGPT to answer a question using AI, it requires about 2.9 watt-hours of electricity. For comparison, a 60-watt incandescent light bulb burning for an hour uses 60 watt-hours.
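A quick back-of-the-envelope check on those figures, using the episode's own numbers:

```python
# Rough arithmetic with the figures quoted above.
WH_PER_QUERY = 2.9        # watt-hours per ChatGPT query
BULB_WH_PER_HOUR = 60.0   # a 60-watt incandescent bulb for one hour

print(BULB_WH_PER_HOUR / WH_PER_QUERY)  # ~20.7: about 21 queries per bulb-hour
```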

While ChatGPT's electricity usage may not sound like a lot, keep in mind how many people are using AI every second of every day. This means AI is also contributing to the problems that animals in our environment face. So if we're going to keep using AI, we should employ it to help find some of the solutions too, right?

We've been looking at and listening to animals since time immemorial. But now, finally, we're harnessing some of our ecosystem-disrupting technology to figure out what creatures are left in the natural world and how they're coping. Equipped with this information, we can make more informed conservation decisions.

Animals face more challenges than ever before. But with the help of technology and perhaps a whole lot of wildlife selfies, they might just stand a chance. That's it for today's show. Join us again next time when we'll meet two conservationists who don't fit the historic mold for who does this work. Spoiler alert, conservation has a diversity problem.

Science Quickly is produced by me, Rachel Feltman, along with Fonda Mwangi, Kelso Harper, Madison Goldberg, and Jeff DelViscio. This episode was reported and co-hosted by Ashleigh Papp. Shayna Posses and Aaron Shattuck fact-check our show. Our theme music was composed by Dominic Smith. Subscribe to Scientific American for more up-to-date and in-depth science news. For Scientific American, this is Rachel Feltman. See you next time.