Summary
This TED Radio Hour episode explores how AI and technology are revolutionizing our understanding of nature, from using citizen science platforms like iNaturalist to decode biodiversity patterns, to deploying acoustic recorders in Yellowstone to analyze wolf communication. Researchers are building specialized AI tools designed for environmental constraints that help scientists process massive datasets and make conservation decisions at unprecedented scales.
Insights
- Community science data now comprises 90% of all biodiversity data collected, democratizing ecological research and enabling AI systems to find patterns humans cannot detect at scale
- Specialized, energy-efficient AI designed for field deployment is more practical for conservation than large generative models, challenging the narrative that AI inherently requires massive computational resources
- Understanding animal behavior through technology (spectrograms, acoustic analysis) reveals complex communication patterns that inform conservation strategies and help protect keystone species like salmon and wolves
- The gap between species discovery and species understanding is critical—knowing a species exists is insufficient; conservation requires understanding habitat needs, diet, and climate resilience
- Technology enables proactive conservation triage, such as coordinating city-wide light-off campaigns during bird migration windows, turning data insights into actionable policy
Trends
- Citizen science platforms becoming primary data sources for biodiversity research, shifting from expert-only to crowdsourced ecological monitoring
- AI-powered bioacoustics emerging as a field for understanding animal communication and behavior without invasive observation methods
- Energy-efficient, edge-deployed AI models designed specifically for remote field work rather than cloud-dependent large language models
- Keystone species restoration (salmon, wolves) driving ecosystem-wide monitoring and measurement of cascading ecological effects
- Integration of multi-modal data (images, audio, GPS, environmental sensors) to create comprehensive digital representations of ecosystems
- Shift from species discovery to species understanding as the bottleneck in conservation efforts
- Collaborative partnerships between tech experts and field biologists becoming essential for designing practical conservation tools
- Acoustic monitoring and spectrogram analysis revealing previously unknown animal communication structures and behavioral patterns
- Dam removal and habitat restoration projects using real-time AI monitoring to measure ecological recovery at landscape scales
- Technology-enabled poaching detection and wildlife protection within national parks and protected areas
Topics
- AI for Biodiversity Conservation
- Citizen Science Platforms and Data Collection
- Bioacoustics and Animal Communication Analysis
- Spectrogram Analysis for Wildlife Research
- Keystone Species Restoration (Salmon, Wolves)
- Energy-Efficient AI for Field Deployment
- Ecological Data Mining and Pattern Recognition
- Dam Removal and Habitat Restoration Monitoring
- Wildlife Poaching Detection Technology
- Climate Resilience and Species Survival Prediction
- Computational Linguistics Applied to Animal Behavior
- Remote Sensor Networks for Ecosystem Monitoring
- Human-AI Collaboration in Scientific Research
- Digital Ecosystem Representation and Modeling
- Conservation Policy Informed by Real-Time Data
Companies
Google DeepMind
Collaborating with researchers to develop AI models that count wolves from chorus howls using audio recordings
iNaturalist
Citizen science platform with 200 million images used for biodiversity data collection and AI training
MIT
Home institution of Sara Beery, who leads research on AI for ecological conservation and developed the Inquire system
NOAA
Federal agency collaborating on salmon escapement estimation using AI and sonar technology for sustainable fisheries
CalTrout (California Trout)
Conservation nonprofit using AI-powered sonar cameras to monitor salmon populations after dam removal projects
Global Biodiversity Information Facility
Clearinghouse database receiving standardized biodiversity records from iNaturalist, containing over 3 billion species occurrence records
Grizzly Systems
Company founded by Jeff Reed to develop acoustic recording devices and AI tools for wildlife monitoring in Yellowstone
People
Sara Beery
MIT assistant professor of AI specializing in ecological conservation; developed the Inquire system for querying biodiversity databases in natural language
Jeff Reed
Computational linguist and founder of Grizzly Systems; leads the Cry Wolf Project analyzing wolf communication through acoustic recordings
Charles Darwin
Historical reference point for centuries of species research; contrasted with modern discovery gaps in biodiversity
Kira Cassidy
Researcher who discovered wolves can counter-estimate rival pack sizes from chorus howls to inform territorial decisions
Quotes
"We're sitting on an ecological gold mine, and the problem is accessing the knowledge efficiently."
Sara Beery•Mid-episode
"You can't save animals that you don't know exist."
Sara Beery•Early episode
"Community science data makes up probably 90% of all biodiversity data we have and have ever collected."
Sara Beery•Early episode
"You tend to protect what you love and you only love what you understand."
Jeff Reed•Late episode
"If your body represented the total weight of all the world's land mammals today, your right forearm would be what's left of the wild ones."
Jeff Reed•Late episode
Full Transcript
This is the TED Radio Hour. Each week, groundbreaking TED Talks. Our job now is to dream big. Delivered at TED conferences. To bring about the future we want to see. Around the world. To understand who we are. From those talks, we bring you speakers and ideas that will surprise you. You just don't know what you're gonna find. Challenge you. We truly have to ask ourselves, like, why is it noteworthy? And even change you. I literally feel like I'm a different person. Yes. Do you feel that way? Ideas worth spreading. From TED and NPR. I'm Manoush Zomorodi. On the show today, translating nature. How researchers are using AI and other technology to better understand what the world around us is saying. Yeah, I think honestly, it always shocks me how little we know about the planet. This is Sara Beery. She's a professor of AI at MIT. When I think about the hundreds and hundreds of years that scientists have studied species, you know, all the way back from Charles Darwin until today, and all of the technological advances that have come with that, you would think that we would know every species on Earth at this point, but we really don't. Sara says that by best estimates, humans have documented maybe two million species of animals, which sounds like a lot. But our best statistical estimates to try to figure out how many species are on the planet suggest that we have somewhere between 10 and even 100 million species on Earth. As you might imagine, this creates a problem for conservation efforts. You can't save animals that you don't know exist. And Sara says it's not enough to just discover a new species, give it a Latin-sounding name, and call it a day. Because that's not enough, right? If you want to protect a species, it's not enough to just know that it exists. You have to understand what it needs, what it eats, how it relates to its environment. For example, if the temperature goes up by five degrees, will that species still be able to live and survive?
Or not, right? Without a firm understanding of these details, it's difficult to save animals and their habitats, and we'll continue to lose species at an alarming rate. So unfortunately, I mean, if we look at our best estimates of even just the population sizes of wildlife on Earth, it's estimated we've lost over 70% of all wildlife since 1970. Imagine that, right? Within the lifetimes of people on Earth, we've lost 70% of all wildlife. That's shocking. But despite the dire statistics, Sara is actually hopeful. Because over the last decade or so, citizen science has exploded. You might have an app like iNaturalist or eBird on your phone right now, where you can take a picture of an animal or plant and identify it almost instantly. And then you can share and contribute your photo, location information, and field notes. You don't need to be a scientist. You don't need to be formally trained as a scientist to help collect scientific data. And Sara says researchers are mining and learning from all those millions and millions of entries. And this has already been a complete game changer for biodiversity. At this point now, community science data makes up probably 90% of all biodiversity data we have and have ever collected. So I've played around with iNaturalist, and it's really fun. You feel like you're part of a big group of humans out in the world seeing what's out there and documenting it as you go. But you're saying that this is actually a very important source of data for researchers. Yeah. Every image in iNaturalist gets identified as a species by the community and then turned into this scientific record of the presence of a species in a given place and time. And then once those images have been identified to what's termed research grade, they get directly exported into the Global Biodiversity Information Facility, which is a clearinghouse of what is now over 3 billion species occurrence records.
And so this has already been really transformative because it's making this information standardized and scalable. But I think one of the things that I am most excited about in all of this is the information that we're currently leaving on the table. If you look at the pixels of these images, there's a lot more ecological information hiding in those pixels than just the presence of a single species. Can you give me an example? I mean, if someone goes out into the world, they take a photo and they're like, okay, yes, that is a blue jay. They're also potentially capturing the type of tree that the blue jay is sitting on, other species that might be in the picture. Absolutely. Not just, oh, here's a blue jay and here's the type of tree it's sitting on. You can look at the feathers of the blue jay and determine whether it's in breeding plumage or whether it's a juvenile or a male or a female. You can look and see how that blue jay is behaving in its environment. Is it calling, perhaps, you know, some sort of attraction behavior? Is it looking for food? Is it maybe even actually currently eating something like an insect or a seed? You can look at the type of tree that it's sitting on. You can look at the sort of structure and the health of that tree. You can look at the amount of vegetation in the background and you can identify the specific niche preferences of species related to their environment. So blue jays prefer specific types of trees or structures of trees to be their habitat. And those preferences might be quite different for another species like a tree sparrow. And so iNaturalist has collected 200 million images. I'm so excited about the potential of building AI systems that help scientists really quickly find new information that's currently there. It's there for the taking, but we just don't have the right systems to make it accessible, to make it possible to do at scale.
Scientists have long learned about the natural world by observing it firsthand, painstakingly, in the field. But new tools, from the satellites orbiting Earth to remote monitors deep in forests, to the smartphones in our pockets, are all capturing more information about our ecosystems than ever before. So how can we make all that data useful? Today on the show, translating nature: making sense of the world around us to protect plants, birds, apex predators, all with a little help from technology. Which brings us back to Sara Beery. We're sitting on an ecological gold mine, and the problem is accessing the knowledge efficiently. Here she is on the TED stage. So say you want to look through all this data. Assuming it takes you about a second to look at every image, you would need to work full time for 40 years to look through all the images in iNaturalist alone. And this is where AI is transformative. It can just help us look through all the data quickly. So say an ecologist today is interested in bird diets, and they want to find examples of birds eating insects in the database. What they can do is they can train an AI model to help them. So to do this, they collect hundreds or even thousands of examples to teach the model what to look for. Now, once they've trained this model, it's an incredible tool. It can very, very quickly find new examples of birds eating insects in the database. But this process of collecting hundreds or thousands of examples every time we want to look for something new, it's still too slow. So let's reframe the question. Scientific discovery really begins with scientific curiosity, with asking questions about the world and how it works. Things like: how far can a Grant's zebra migrate? What plants grow back after a forest fire? Do birds eat insects during the winter? Wouldn't it be great if instead we could just directly ask questions to our databases and get answers back? This is what my team at MIT has been working towards.
And we've developed a system that we call Inquire that helps ecologists find answers in the data without collecting any examples to teach an AI model or needing to write any lines of code. Now, under the hood, what we're doing is we're developing AI models that can learn and understand similarities between images and scientific language. And this is what allows us to just ask. Can you just give us an example of how Inquire works? Yeah. So one example of a collaborating scientist that we've been working with: they're interested in how forests are growing back after forest fires and how the severity of forest fires actually impacts the growth of the forest coming back. And so what they've been doing is they've been searching within a specific area where they knew that there were several past forest fires that had burned. They're searching for evidence of previous forest fires in the background of images. And then they're able to characterize how severe that burn was based on that sort of evidence of past burn in the background. And then they're looking at the distribution of what types of trees are growing back. Is it more deciduous, more coniferous trees growing back in the foreground? So it's essentially just a discovery tool. But it can be used in a really open-ended way. What that researcher found in some preliminary studies is that it appears that areas that are more severely burned have more deciduous trees growing back afterwards. And areas with less severe burns have more coniferous trees growing back afterwards. And that's important because we know that burns are actually happening faster and hotter than they were previously. So this is going to influence the way that our forests shift and change as more of the forests burn and as the severity of the burns goes up. And presumably that starts to be a data point for modeling how the world may change in the next year, five years, 20 years? Yeah, exactly.
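Searching by "just asking" typically works by embedding both the text query and every image into a shared vector space (CLIP-style vision-language models) and ranking images by similarity. The transcript doesn't spell out Inquire's internals, so this is only a hedged sketch of the retrieval step, with made-up three-dimensional vectors standing in for real model embeddings:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def search(query_vec, image_vecs, top_k=2):
    """Rank images by similarity of their embedding to the query embedding."""
    ranked = sorted(image_vecs,
                    key=lambda name: cosine(query_vec, image_vecs[name]),
                    reverse=True)
    return ranked[:top_k]

# Hypothetical 3-d embeddings; a real system would get hundreds of dimensions
# from a pretrained vision-language model.
images = {
    "jay_eating_insect.jpg": [0.9, 0.1, 0.0],
    "jay_perched.jpg":       [0.7, 0.4, 0.1],
    "empty_forest.jpg":      [0.0, 0.2, 0.9],
}
query = [1.0, 0.0, 0.0]  # pretend embedding of "a bird eating an insect"
print(search(query, images))  # → ['jay_eating_insect.jpg', 'jay_perched.jpg']
```

In a real deployment the embeddings would be precomputed once for millions of iNaturalist images and indexed, but the core operation is this same similarity sort.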
I mean, so much of science, environmental science, and ecological science is just trying to understand what's currently happening and what might happen in 10 years and 20 years. Understanding the implications of the way that the climate is changing, the way that larger-scale human development is changing the Earth's surface, helps us understand what to prioritize, where to prioritize, how to take action to protect species. So triage, essentially? Yeah, triage, but proactively, right? So there's a great example of this that I think is very intuitive, which is we know that a lot of birds die during migration by being sort of distracted and moving into cities because of lights. And then they'll run into windows in the cities, and we have estimates of hundreds of millions of birds that are dying each year during the migration season just by hitting windows in cities. One of the adaptive management campaigns that people are looking at is lights-off campaigns. So when we actually understand, we've detected that the migration is going through, now there are campaigns for cities that are in the path of that migration to just turn off all their city lights at night, because that migration happens just a couple days a year, right? The vast majority of birds are migrating through the US just a couple days a year. So just turn off your lights a few days a year and you're already having a huge beneficial impact. So easy, so easy, but we need to know when and where it's happening to be able to take that step. When we come back, Sara Beery takes us into the field and explains how she can train her AI without using massive amounts of energy. On the show today, translating nature. I'm Manoush Zomorodi and you're listening to the TED Radio Hour from NPR. Stay with us. This message comes from TED Talks Daily, the podcast that brings you a new idea every day. Learn what's transforming humanity, from balancing AI and your critical thinking to surprising discoveries about the adolescent brain.
Find TED Talks Daily wherever you listen. It's the TED Radio Hour from NPR. I'm Manoush Zomorodi. On the show today, translating nature. And we were just talking to Sara Beery. She's an MIT professor who specializes in AI and ecological conservation. Sara and her team have built a new AI tool they call Inquire, which lets researchers search huge treasure troves of data just by asking normal, open-ended questions. Kind of like how ChatGPT works. It's helping scientists do their work faster and easier. But I asked Sara: at what cost? Whenever I talk to people about AI and the environment, people are like, well, AI is part of the problem. It is. Everybody, all these techies say, oh, AI is going to solve our climate problems. But actually, all of the energy that these data centers require is just making it worse. How do you see it? Yeah, so AI is actually, I mean, it's a term that is incredibly broad. And there are many different ways that AI can be used. Things like ChatGPT, some of these other large generative language models and vision-language models that you're interacting with if you're playing with some of these AI chatbots. This is really only one type of AI, and it is probably the type of AI that is most energy-intensive. Both in the training of the models, and the models themselves are very, very large, and in the use of the models. It does take a lot of energy. But much of the AI that we're building for environmental science is AI that is designed to be not computationally intensive from the beginning. Because if you need to use AI in a remote rainforest or out in the middle of the savanna, where you don't have a large-scale data center, you don't have access to GPUs, you don't have the bandwidth to move the data to the cloud, then by design, you have to work within the resource constraints.
So often, we're building AI that is powerful but small, and that is actually far less computationally intensive than maybe some of the traditional statistical methods that are used in ecology. So we've talked a lot about the data and using computers to make sense of it all. But what about field work? Because I know that you do get out of the lab quite a bit and get out there yourself, right? What have you been looking at recently? Yeah, so one of the most important things, as we're starting to see AI grow and be deployed in the real world, is direct, integrated co-design and development with the stakeholders who are going to be using the models. And to me, that means going to the field and seeing the models in use, in deployment, understanding what gaps remain, and then prioritizing our research to be innovation that's driven by these applications of the models. So one example: we have a project where we've been working with a bunch of different state agencies, NOAA, and CalTrout, to try to improve our estimates of salmon escapement to build more sustainable fisheries. So escapement is basically, as the salmon are coming upstream, you want to count the number that are going upstream to spawn so that you have a good estimate of how many will actually successfully breed and then kind of go back out to the sea and then come back in three years, right? So to keep a sustainable population of salmon, you need to ensure that enough have escaped upstream. Now, one of the things that is very exciting is across the states of California, Washington, and Oregon, we're seeing a lot of dam removal projects going in, because we've realized that dams have been a pretty significant barrier for salmon populations. So the Klamath dams: there was a series of four dams on the Klamath River in Northern California that were all taken down in the last year. It's the largest dam removal project in the history of the world, and it's opened up 800 square miles of habitat for salmon.
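One common way to turn sonar footage into an escapement estimate is to track each fish and count it once when its track crosses a fixed "gate" line heading upstream. This is a toy illustration of that counting logic, not CalTrout's or NOAA's actual pipeline; the tracks and gate position are invented:

```python
def count_escapement(tracks, gate_x=0.0):
    """Count fish whose tracked position crosses gate_x moving upstream
    (toward increasing x). Each fish is counted at most once."""
    count = 0
    for track in tracks:
        for x0, x1 in zip(track, track[1:]):
            if x0 < gate_x <= x1:  # crossed the gate heading upstream
                count += 1
                break
    return count

# Invented per-frame positions (along the river axis) for three tracked fish.
tracks = [
    [-2.0, -1.0, 0.5, 1.5],    # swims steadily upstream past the gate
    [-1.5, -0.5, -1.0, -2.0],  # approaches, then turns back downstream
    [-0.2, 0.1, 0.8],          # upstream
]
print(count_escapement(tracks))  # → 2
```

Real systems have to handle detection noise, fish that mill around the gate, and species identification, which is where the AI models come in; the final tally, though, is still a crossing count like this.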
And so what we've been doing is we've been working with CalTrout. They have sonar cameras in the river, and we're using that to automatically count the number of salmon that are escaping upstream. So we were just there at the dam removal site, actually working with the team that's managing the sonar, that's kind of using and evaluating our models, and still trying to understand what the gaps are. You know, what do they still need? What does the technology need to do to improve? And I think this type of interactive partnership, this interdisciplinary understanding, it's vital if we want to make sure that AI is used in a way that's truly beneficial. It's not a hammer looking for a nail. We really want to design the right tools for the right problems. And sorry, forgive me for sounding obtuse, but what did the salmon get out of it? So there's actually, the estimate is that there's potentially been close to 10,000 salmon that have gone up into this previously inaccessible habitat in the very first year. And salmon are what's called a keystone species. Oh, they are. Yeah, yeah. So keystone species, they're essentially species that... another very famous example of a keystone species is wolves. Like when the wolves were reintroduced in Yellowstone, we saw this kind of expansive effect where the entire food chain, the entire ecosystem started to really flourish in response to having the presence of this keystone species. So salmon, not only do they provide food for a huge number of different types of organisms, right? Both large predators, like bears, wolves, coyotes, but also, as they decompose after they've spawned, they're introducing a huge amount of nutrients into these rivers. And with their presence as a keystone species, we're hoping, again using a lot of these technologies, to actually measure how that trickles outwards.
So to be able to, at this type of comprehensive scale, monitor the regeneration, or the kind of bouncing back, of an entire 800-square-mile river system, that just wouldn't have been possible 10 years ago. And we're still figuring out how to do it today, but I'm very optimistic that we're going to be able to really understand things at scales that were just not possible in the past. I mean, Sara, it feels so good to hear optimism and progress, and you don't hear a ton of that when it comes to the environment, and certainly not when you layer on AI, usually. But I know as much as you are an optimist and see the amazing applications of this technology, you have also been very pragmatic and critical of those who claim, like, oh yeah, no problem, AI is going to fix so many things. I know earlier this year, Google came out with a tool called AI co-scientist, and you sort of were like, yeah, let's be careful what kind of promises we're making, big tech. Can you explain what your sort of concerns are with this tool and maybe others? Yeah, I mean, not to even pick on that tool in particular, but I do think that in general, computer scientists have had a bit of a problem with what some people call tech saviorism, where they'll sort of say, oh, like, AI could do that. And it turns out that hard problems are hard for a reason. And AI is not a magic wand. And so I think a lot of my work, my research, and also some of the ways that I kind of act as a leader in our field, is really thinking about how do we do AI robustly and rigorously, and how do we build systems that ensure that we're evaluating AI in ways that validate the claims we're making.
I think AI can be a really amazing tool to help an expert do something more efficiently, to help our experts spend less time looking through millions of images and instead spend time analyzing them, understanding the trends that they kind of help us establish, and then working in the policy space to sort of shape and take the actions that are going to have an impact. But I really think it's vital that we have these experts very tightly connected in the loop, right? We design systems where AI is a beneficial tool, but not a solution in and of itself. We stand at a unique point in history. We have both an unprecedented biodiversity crisis, but we also have unprecedented tools to address it. We have millions of people around the world eager to contribute to nature conservation and scientific discovery, and we have AI tools that enable scientists to find patterns in all of that data at scales impossible for humans alone. The future of conservation doesn't just lie in remote rainforests or deep ocean trenches. The future of conservation is hiding in our ecological databases, both the ones we have now, but also the ones we have yet to collect. And that is where all of you come in, because everyone can contribute. Everyone can collect data and upload it to platforms like iNaturalist. Every photo uploaded, every sound recorded, every observation shared is a piece of the puzzle. We know that we need to act now to save nature under threat. And together, with scientific AI tools in our toolbox, we can help by building the complete picture of life on Earth. That was Sara Beery. She's an assistant professor of AI at MIT's Computer Science and Artificial Intelligence Lab, and she runs the Beery Lab. You can see her full talk at TED.com. Today on the show, translating nature. And now we want to go deeper into decoding one particular voice in the wild: a wolf's howl. A primal sound that's been heard and mimicked by humans for thousands of years. People have been howling forever.
I'm sure the Clovis people that came here 13,000 years ago were doing it. The Mountain Shoshone, who used to live in Yellowstone, were described by trappers as being the best imitators of animals around. They could howl like wolves. They could neigh like horses and imitate any bird. Hunters have used it to draw in wolves and kill them. Some people did it during COVID. They'd go outside their back door on the porch and they'd howl to prove to the world that they were still alive. Across cultures, wolves are symbolic. For some Indigenous nations, they stand for loyalty and family. In European stories like Little Red Riding Hood, they are a reminder that danger lurks everywhere. And of course, wolves were domesticated, becoming man's best friend and pet. But a century ago in the US, wolves were widely seen as predatory enemies, hunted to near extinction. In places like Yellowstone, they were wiped out entirely. So Jeff Reed, who grew up on the outskirts of the national park, never heard their howls as a child. I was born in the late 60s in Southwest Montana, in what we call the Absaroka Mountains here, just north of Yellowstone National Park, one of the premier ecosystems in the world. And we lived outdoors all the time. Without wolves, Yellowstone lost its apex predator. The elk population surged and plant life was devastated. It took several decades, until 1995, when conservationists reintroduced wolves in the hopes of restoring the park's delicate ecosystem. By the time Jeff Reed moved back to the area after spending his adult life in Silicon Valley, wolf howls were everywhere. And he wanted to know what the wolves were saying. So when I moved back, wolves were back. And I always knew that there's nothing more iconic than a wolf howl to tell you about nature. Mm-hmm. Yeah, I was going out and studying them just as a casual observer. And it really dawned on me that wolves are hard to study.
Like, you can't walk up with a mic and say, you know, what did you mean by that? And I knew a lot of the biologists who were working in the park, and they're amazing researchers. And I just started talking to a few of them and said, hey, what if we did a large-scale bioacoustics project where we put recorders out on the landscape and then really started studying what the wolves were saying, so to speak? Jeff's background in tech and linguistics made him the perfect person to build on the work already being done to understand wolves. Wolves have been researched for their sounds for a long time, and there are captive facilities, often sanctuaries, for wolves. And so you can hear them in captive centers, but we were really interested in wild wolves. Like, what are your kids saying when they leave the home and they're with their friends? That's really what we wanted to get after: how are they communicating in their natural soundscapes? Well, what were the tools that they were using before you got there? So wolves in Yellowstone are the most studied wolves in the world. So they already have this huge database: genetics, you name it, observations. They fly planes over, they put GPS collars on, and so we just started to amplify that with an acoustic project. To do that, Jeff needed to upgrade the technology. There was certain acoustic technology that you could record with, but it really hadn't been innovated upon from a biologist's perspective on the landscape. And so I was like, of course we can do this, right? We do this all the time in Silicon Valley. It's just, we're only doing it to make money. And so I was like, okay, let's start reinventing these devices, these camera traps and these acoustic recorders, and have them record for a year. Do you remember when you first started placing these devices in the park? Yeah, for sure. You know, we're walking out, it's spring, right? Kittentails are coming up, these beautiful purple flowers.
So we put these out there and I'm thinking to myself, is this gonna work? Is this really gonna work? What was going through your mind? Just so many unknowns. If you think about Yellowstone, it's a big landscape, it's very rugged, it's very remote. Yeah. If you wanted to put an acoustic recorder out in the middle of Yellowstone and have it record for a year, it's about three terabytes of data that you have to store for that year. And it's gotta live through temperatures of minus 50 to 110. You don't want birds to fly away with it, all sorts of things, right? We leave it near a wolf den site and we come back home and we wait a couple months. I bring the recorders home, I pull the SD cards, and we've recorded over 200,000 hours, which would take me roughly 50 years to listen to. Okay, so the recordings worked, which was great. But then you had so much data to use. What do you do with all these hours, years worth of audio? What you do is, they're recorded in one-hour chunks. So you get this WAV file and I pull it up on this huge monitor and it turns into a spectrogram. A spectrogram is a visualization of sound, showing how pitch and volume change over time. Waves go up and down; loud sounds show up brightly while quieter ones are more faint. A good linguist can take human speech, turn it into a spectrogram, and tell you what they said without hearing it. And the same is true for all animal communication; we're animals too. So I'm going through these one-hour files and I pull them up one by one, and I think I was two days in. And I pull them up on a big screen and I look at them. And Jeff sees nothing. Spectrogram after spectrogram. We're not seeing anything. And I'm like, oh man. But then something big and bright and spiky appears. Finally I saw this beautiful chorus howl. He sees another chorus howl and another one and another one. So that was kind of the first time I said, OK, we've got something. The Griscam gave scientists a way to eavesdrop on wolves.
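The spectrograms Reed describes are computed by slicing audio into short frames and measuring how much energy each frequency carries in each frame. Below is a minimal, dependency-free sketch using a naive DFT; real tools use windowed FFTs (e.g. scipy or librosa), and the sample rate here is a toy value, far lower than field recorders use:

```python
import cmath
import math

def spectrogram(signal, frame_size=64, hop=32):
    """Magnitude spectrogram via a naive DFT: one row per time frame,
    one column per frequency bin (0 .. frame_size // 2)."""
    frames = []
    for start in range(0, len(signal) - frame_size + 1, hop):
        frame = signal[start:start + frame_size]
        row = []
        for k in range(frame_size // 2 + 1):
            s = sum(x * cmath.exp(-2j * math.pi * k * n / frame_size)
                    for n, x in enumerate(frame))
            row.append(abs(s))
        frames.append(row)
    return frames

rate = 1000  # samples per second (toy rate for the sketch)
# A synthetic "howl": a steady 125 Hz tone.
tone = [math.sin(2 * math.pi * 125 * t / rate) for t in range(512)]
spec = spectrogram(tone)
loudest_bin = max(range(len(spec[0])), key=lambda k: spec[0][k])
print(loudest_bin)  # → 8, since bin k maps to k * rate / frame_size = 8 * 1000 / 64 = 125 Hz
```

In a real recording, a howl appears as a bright ridge that bends slowly across frames, which is exactly the "big and bright and spiky" shape Reed spotted on his monitor.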
But what to do with those hours and hours of howls? How to make sense of them? When we come back, crunching the data with the help of AI. Today on the show, translating nature. I'm Manoush Zomorodi and you're listening to the TED Radio Hour from NPR. Stick with us. It's the TED Radio Hour from NPR. I'm Manoush Zomorodi. Today on the show, translating nature. And we were just hearing how wolf researcher Jeff Reed built new technology to capture the sounds of wolves communicating in Yellowstone National Park. I love it so much. There's so much going on. Yeah. What is happening? So I hear the, like, "oooo," but then there's also a little one who seems to be, like, telling somebody a story as well. I don't know, what were we listening to? When you first start hearing these chorus howls, they kind of sound like a cocktail party. And when you first walk in, you kind of hear all this chatter, right? And you're not necessarily listening to any one person. You're just hearing this noise. Humans and other animals are really good at isolating sound in this kind of chorus. And so we can pick out a person in a room and focus on their voice. And so with some time, you start being able to pick out individual voices. And the easy one is the howl, right? You heard the howl that started at the beginning. Yeah. And then you start hearing these other sounds, right? And that's what we were interested in: what are these other sounds? Who makes them? Why are they making them? And these are all the questions we're just starting to raise. And we're figuring out a few things, but I'll be dead by the time the younger biologists really start decoding all of this information. To move way faster, Jeff and Yellowstone biologists turned to, yes, you guessed it, AI. They had to begin by training a new model. The first thing we built were some AI tools to go find the howls in all of these recordings so that we can gather more data. I did it manually for 200,000 of them. Wow.
And then built an AI model from that, so that now, as we're recording across half of Yellowstone, we can more easily go get their voices, so to speak. The tool can quickly find and classify different howls collected over months. It also picks up on things that Jeff and the biologists might never notice. There are clustering tools and other tools that are machine learning based that allow us to use software to say, what are we missing as humans? What might be another way to look at this howl? The AI is seeing patterns in it; there might be some functional behavior associated with it. These battery-operated devices use AI to record only the sound and motion that matters to the scientists, saving them time and money. Jeff Reed explains more from the TED stage. And researchers have already determined that wolves can identify one another just from their howls. But we do not know yet if they have names like Teddy or Rachel, or if it's more like you picking up the phone and recognizing that person's voice. Now this is where it gets interesting to me. A chorus howl is when a pack, or family, of wolves communicates as a group, often to signal that this is their territory. And in this video, a wolf from the pack is off-camera, howling but being totally ignored by his packmates, until something happens. As soon as the alpha female howled, the rest of the wolves looked at her and the chorus began. And as it kicks off, the wolves come together in this mosh pit of dancing bodies making different types of calls. The entire event lasts about 60 seconds and then everybody goes about their business. So taking this example of the chorus howl, like, break it down for me: how does AI help you understand what the wolves are doing and saying to each other? So there are two things the researchers are really looking at with chorus howls. The first is that a chorus howl will start where one wolf is howling and then the rest of them will join in.
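The project's actual detectors are trained machine-learning models built from Jeff's 200,000 hand-labeled examples, which the episode doesn't detail. As a toy stand-in, the simplest form of "find the events in a long recording" is a sliding-window energy detector: flag stretches where the signal's RMS loudness exceeds a threshold. Everything below (function name, window size, threshold) is illustrative, not from the Cry Wolf Project.

```python
import math

def find_loud_events(samples, rate, window_s=0.5, threshold=0.1):
    """Toy detector: return (start, end) times in seconds of contiguous
    windows whose RMS energy exceeds `threshold`. A crude stand-in for
    the trained howl classifiers described in the episode."""
    win = max(1, int(window_s * rate))
    hot = []
    for i in range(0, len(samples) - win + 1, win):
        chunk = samples[i:i + win]
        rms = math.sqrt(sum(x * x for x in chunk) / win)
        hot.append(rms > threshold)
    events, start = [], None
    for idx, flag in enumerate(hot):
        if flag and start is None:
            start = idx
        elif not flag and start is not None:
            events.append((start * window_s, idx * window_s))
            start = None
    if start is not None:
        events.append((start * window_s, len(hot) * window_s))
    return events

# Synthetic check: 10 s of silence with a loud 2 s "howl" from t=4 to t=6.
rate = 1000
sig = [0.0] * (10 * rate)
for i in range(4 * rate, 6 * rate):
    sig[i] = 0.5 * math.sin(2 * math.pi * 320 * i / rate)
print(find_loud_events(sig, rate))  # -> [(4.0, 6.0)]
```

A real pipeline would then hand the flagged clips to a classifier (and, as Jeff describes, to clustering tools that group calls by acoustic similarity), since loudness alone can't tell a howl from wind or an elk.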
But when a wolf's howling, it doesn't always turn into a chorus howl. So the question is, who's deciding, like, we all need to chorus howl together? And we've got a little bit of evidence that suggests it's the dominant wolves: once they join in to a howling event, then the rest of the pack joins in. So we're very interested in who decides that this whole territorial chorus howl is going to start. The other thing is, we're working with Google DeepMind to see if we can count wolves just from their chorus howl. Can we just use audio recordings to know how many wolves were in that pack's chorus howl? And what one researcher, Kira Cassidy, figured out was that wolves can count, or estimate, the number of wolves howling in a rival pack. And they make a decision: if they're outnumbered, you wouldn't want to get killed, so you leave the area. If you're not outnumbered, the wolves often approach that rival pack, because they know they're outnumbering the rival pack. Now, wolves also bark like your dog, but unlike most dogs, wolves combine barks with other sounds into sentence-like structures. And in this video, a female wolf uses a bark howl after being chased and nearly killed by a rival pack. There's real range in that one. So you hear two things in there: you hear bark and you hear howl, but the howl is highly modulated at the end. And so what I was trying to get people to think about was, is this a form of combining two different sounds together to form some new function? Like if you take the word cry and you take the word wolf separately, and you combine them together: cry wolf. They slightly change their meaning when they're brought together. We've known about this bark howl forever. The bark is an alarm and the howl is "here I am," at least as its basic function. And they'll make this when there's a serious threat. That one you just heard was a particular female wolf that was just chased by another pack and was about to get killed. And she got away and she started bark howling.
And the idea I'm throwing out there for us to think about is: the bark is an alarm, like, there's a threat. And the howl is trying to get the attention of your packmates and say, here I am. So come help me. So she was in distress. Well, for sure she was in distress, but was she trying to communicate something more than just distress? Come help me. I'm not saying that's exactly what wolves are doing, but I am saying these are two different sounds they make. They combine them together. Is this a basic form of recursion or combinatorial meaning? And recursion is what most linguists would say today is what's unique about human languages. I want to talk about a wolf who sounds rather special. 907. 907 is definitely a special wolf. She lived 11 years, and the average age of a wolf in Yellowstone National Park is about three and a half years. Oh, that's it? Yeah, they would live as long as a dog, but the wild is a different place, right? So your lifespan is shorter. She lost one of her eyes when she was about four. Later in life she had a limp; she surely had arthritis, broken bones and whatnot. But she lived to 11, and she holds the record in Yellowstone National Park for the most puppy litters. Oh, wow. She always had puppies, every year. She's kind of what got me into this project. I'd listened to her howls a lot and just fell in love with her. You fell in love with her. What was it about her? These wild animals, who we pretty much exterminated from the lower 48 with poison and you name it, they earn a lot of respect with hunters like me, because they're surviving on the landscape. The number one cause of death for a wolf in Yellowstone is another wolf. Outside of Yellowstone, in non-protected areas, it's humans. You see how they take care of their pups, how they take care of family members. And you can't help but look at your own family structure and realize that there are personalities: black sheep in the family, the high achievers.
And she was just, like, I called her the queen of Slough Creek, which is where her natal den was. Okay, let's listen to 907. Oh, it is soulful and mournful. That's what it sounds like to me, but what do you hear? What else did you hear besides the howl? The howl was easy. She's crunching around, and then I liked how there was, like, a "whoo," like a bridge sort of action. There's a tempo to it. Yeah. So those vertical bars you're seeing after the first howl, that was her walking around on shale rocks right next to a fir tree that had the recorder in it. And then what was the sound at the end that you heard? Baby wolves? That was her breathing into the recorder. So she was so close to the recorder. Yeah. If you imagine yourself sitting there, she would be breathing on your cheek. So her howl was very flat. We call it a flat howl. But there's still, you know, a rise and fall. The typical wolf howl in Yellowstone is 350 hertz, which is about middle F on your piano keyboard. 907 was a stout girl, bigger than your average female wolf, and her pitch was about 320 hertz, where the average wolf would have been a little higher than that. And then other wolves will do a "whoo-oo," a rise and then a fall. And research has shown that wolves can identify one another from their howls. And there are at least two ways they do that. One is where the pitch ends at the end of the howl. So where it tails off, that lower pitch is a good way for another wolf to know, oh, I know who that is. And the other is how loud it gets throughout the howl. Is it ridiculous to think that she's saying, I'm here, it's me, and they're like, oh yeah, it's 907, or that she's calling her own name, that she's telling them that she's there? Context always means something to language, right? You don't know what someone said unless you know the context, often. What had happened a half hour before this recording is a herd of elk came running right by this recorder, and they were being chased by the pack.
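Those pitch figures are easy to sanity-check. In equal temperament, a frequency maps to a piano key via the MIDI note number n = 69 + 12·log2(f/440). This small converter (my own illustration, not a tool from the project) confirms that 350 Hz lands nearest F4, the F just above middle C, while 907's roughly 320 Hz sits a couple of semitones lower:

```python
import math

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def freq_to_note(freq_hz):
    """Nearest equal-temperament note name for a frequency (A4 = 440 Hz)."""
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    return f"{NOTE_NAMES[midi % 12]}{midi // 12 - 1}"

print(freq_to_note(350))  # F4 -- the "middle F" of the typical Yellowstone howl
print(freq_to_note(320))  # D#4 -- near 907's lower, flatter pitch
print(freq_to_note(440))  # A4 -- the reference
```

So a trained ear (or a spectrogram) really can separate her from an average pack member on pitch alone, before even looking at how the howl tails off or swells.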
She was very old at this time, and for some reason she was like, I don't want to keep going. She stopped, and the rest of the pack chased the elk, and they were about a mile away. And then she howled. And so I think your intuition is getting close. She's letting them know: here I am. This is me, right? 907. They're like, okay, 907's back over there. We're up here. And what happened after this set of howls? She howled like this for half an hour. But on the spectrogram, I noticed a distant faint line, and that was one of the other wolves howling back at her. So she knew where they were, they knew where she was, and that was the communication. So it's like getting lost in the grocery store and you're like, hey, where are you guys? I'm over here. And they're like, we're in aisle 11, kind of thing. Exactly. I love that. Where are you, Mom, Dad? Totally. I do want to just pull back. There's a lot of talk about AI. AI for good. AI for nature. Do you get excited when you talk about the artificial intelligence aspect of being able to analyze all these hours of wolf calls that you have? Or do you think, ugh, people just want to talk about the hot new thing, and that's why they insist on this part of the project? It's both. I think AI is a tool that can help us understand animal behavior, but it's not the be-all and end-all. I don't foresee a Google Translate for wolfish, right? Because what you're really trying to do is get in the mind of a wolf, and that's different than just transcribing their different types of howls into the basic functional units. And to what end? Is this conservation? Is this simply expanding human knowledge of other species? These devices that we're building can do a lot of things. A lot of people don't know that poaching happens all across America, across the world, but even within our national parks. Still? Yeah, still to this day. Two wolves were poached last year on the northern border of Yellowstone. Why can't we solve for poaching?
Because this is a national heritage, right? It's everybody's national park. And so one of the reasons we're building these devices is to locate gunshots on the landscape and help law enforcement protect these resources for the public. But then there are a lot of other reasons that tie to conservation. One of them is simply building the digital wild. Human language across the globe is losing its words and grammar for nature. It's slowly going away, because as we move to the cities and we don't experience the wild kingdom, we're seeing changes in our lexicon, so to speak. There's no turning back the digital age. And so one of the things we can do with these camera traps and these sound recorders is bring this digital wild and storytelling to people, kids who are sitting at home, and bring them a little closer to nature. You tend to protect what you love, and you only love what you understand. Now look, I don't know if I will ultimately decode animal communication, but I do know this: there have to be animals to decode. If your body represented the total weight of all the world's land mammals today, your right forearm would be what's left of the wild ones. The rest of your body is us, our livestock, and our pets. As for carnivores, things like lions, tigers, and wolves: it's less than my pinky. The challenge we collectively face as real humans, not artificial ones, goes far beyond individual opinions on wild wolves. It's about the future of wildness itself. Do you have a name in wolf? I don't know, Jeffrey, if that's a ridiculous question. For me? Yes. If you were to go out into the field and you were to say, hey wolves, it's me, Jeff. I'm collecting the spectrograms and the recordings. Well, what I do is I howl. I've gotten good at imitating 907's howl. So I howl like her. Why don't you do it with me? Just try to keep it flat. Okay. And then try to get your pitch around 320 hertz. Okay. I'm going to count to three and then we're both going to do it. One, two, three. Oh.
Perfect. That felt good. Yeah. I feel like I'm a member of your pack now, Jeff. Thank you. Yeah, or I'm a member of yours. Thank you. That was Jeff Reed. He's a computational linguist and the founder of Grizzly Systems and the Cry Wolf Project. You can see the wolf howl spectrograms and hear more sounds from Yellowstone at the Cry Wolf Project's website. You can also watch his full talk at ted.com. Thank you so much for listening to our howls today. If you liked it, please rate us or leave us a review on Spotify or Apple Podcasts. We read all your notes and we love hearing from you. This episode was produced by James Delahoussaye, Phoebe Lett, Katie Monteleone, Harsha Nahata, Matthew Cloutier and Fiona Geiran. It was edited by Sanaz Meshkinpour and me. Our executive producer is Irene Noguchi. Our audio engineers were Patrick Murray and Becky Brown. Our theme music was written by Ramtin Arablouei. Our partners at TED are Chris Anderson, Roxanne Hai Lash and Daniella Balarezo. I'm Manoush Zomorodi and you have been listening to the TED Radio Hour from NPR.