A UC Berkeley study shows that when zebra finches hear the call of another zebra finch, they have a mental representation of its meaning — they understand what they’re hearing.

A mating pair of male (left) and female zebra finches. Native to the interior of Australia, zebra finches congregate in very vocal groups of as many as a hundred individuals. They disperse to breed, bonding with one mate for life.
Julie Elie/UC Berkeley
November 4, 2025
When a bird spots a predator and emits an alarm call, do its neighbors think “predator” and then react? Or do they automatically freeze or fly away because that’s what they’re wired to do?
Surprisingly, biologists don’t know the answer to that question, even in cases where animals clearly exhibit different behaviors when they hear different alarm calls. Vervet monkeys, for example, look down in response to a snake call and look up in response to an eagle call. But are the animals really “picturing” a snake or an eagle?
New research from neuroscientists at the University of California, Berkeley, suggests that the answer is yes, at least for zebra finches.
The researchers showed that zebra finches categorize calls much as humans hear and group them. Finches and humans both organize the calls into about a dozen call-types, or “words,” which are used to sound an alarm, advertise identity and position (“Here I am!” calls), court or pair-bond with a partner, or signal distress, hunger or aggressive intent. But the researchers also found that the birds sometimes confuse calls within the same semantic group (that is, calls with similar meanings), even though those calls sound quite different, including to human ears.
“As long as call-types have clearly different meanings for the birds, they are very well distinguished even if their acoustics are quite similar. But call-types further apart in the acoustic space that can be lumped in the same semantic category are surprisingly mistaken more often by the bird,” said Julie Elie, a research associate in Berkeley’s Department of Neuroscience. “It’s proof that they have this mental representation of the meaning, which leads them to make errors. Otherwise, if this representation of meaning was not there, there’s no reason they would make errors more often between call-types that belong to the same semantic group.”
“We have shown, indirectly, that birds understand what they are saying,” said Frédéric Theunissen, UC Berkeley professor of neuroscience.
Zebra finches have a repertoire of about a dozen call-types. UC Berkeley researchers designed an ‘operant’ task to test their classification of these vocalizations. Here, a female zebra finch is performing the classification task. While the bird interrupts the playback of vocalizations that do not belong to the target song category, she correctly refrains from interrupting the playback of the target song and is given a seed reward. During each day of the experiment, zebra finches were asked to classify a different call-type from their repertoire. (Video credit: Julie Elie and Frédéric Theunissen, UC Berkeley)
This is also the first time anyone has “actually tested whether animals agree with the human experts that calls have different meanings” and that the acoustic differences humans detect are also recognized by the birds, he added.
If a small bird like the zebra finch has a mental representation of meaning, Elie said, birds such as crows, which have more complicated vocalizations, likely have an even more elaborate perceptual landscape. These results show that vocal communication in birds is not entirely reflexive and that there is room for decision-making in their vocal exchanges. She’s now collaborating with a group in France to investigate the mental representations of vocalizations in mice.
Elie, who calls herself a computational neuroethologist, is the first author and Theunissen is the senior author of a paper about the findings published in the Sept. 18 issue of the journal Science. The work was funded by the National Institutes of Health (R01DC018321).
A model for human speech learning and understanding
Zebra finches are a common subject of study because they are very talkative, and the young males learn unique mating songs in a way similar to how humans learn speech. This makes them a good model for understanding vocal learning and the auditory perception of communication sounds. Theunissen has studied these birds for decades to understand how their brains process pitch, timbre and rhythm, trying to learn how the brain distinguishes sounds in the natural world, including the songs of other finches. He showed that both social and auditory experience are critical for the development of the auditory cortex in these birds.
Julie Elie in Australia, where she was studying the calls of zebra finches.
Courtesy of Julie Elie/UC Berkeley
But he admits that he has ignored other vocalizations, such as calls, of this chatty bird.
“My work was really focused on the auditory processing of communication songs versus natural sounds. What Julie brought to the lab, which was kind of an eye-opener for me, is that song is only one of the signals that they’re producing. If you really want to study communication systems, using just one signal that has one meaning is not the right thing to do,” Theunissen said.
As a graduate student in the ENES lab in France, Elie spent time in Australia trailing pairs and flocks of zebra finches, learning about and memorizing the various vocalizations they use to communicate with one another. By putting microphones in their nests, she showed in 2010 that they were singing private quiet duets.
After joining Theunissen’s lab 15 years ago, she recreated the conditions in the lab to record and catalog the same vocalizations in captive zebra finches. She counted 11 distinct vocalizations, though an untrained ear might not easily distinguish them. Her catalog agreed with the calls documented by the late ornithologist Richard Zann in a 1996 book on zebra finches, though Elie, unlike Zann, also analyzed the acoustics — pitch, timbre and rhythm — of these calls.
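For readers curious what that kind of acoustic measurement looks like in practice, here is a minimal sketch using the open-source librosa audio library. The specific features, a YIN pitch track as a proxy for pitch, the spectral centroid as a proxy for timbre and the onset-strength envelope as a proxy for rhythm, are common stand-ins chosen for illustration, not necessarily the measures used in the study, and the file name is a placeholder.

```python
import numpy as np
import librosa  # open-source audio analysis library

def describe_call(wav_path):
    """Illustrative acoustic summary of one recorded call (not the study's code)."""
    y, sr = librosa.load(wav_path, sr=None)                    # load the recording
    f0 = librosa.yin(y, fmin=400, fmax=6000, sr=sr)            # pitch track in Hz
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)   # rough timbre proxy
    onset_env = librosa.onset.onset_strength(y=y, sr=sr)       # rough rhythm proxy
    return {
        "mean_pitch_hz": float(np.nanmean(f0)),
        "mean_spectral_centroid_hz": float(centroid.mean()),
        "onset_strength_std": float(onset_env.std()),
    }

# e.g., describe_call("tet_call_bird01.wav")  # hypothetical file name
```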
Crucially, she also recorded the behavior associated with each call, compiling a complete list of behaviors called an ethogram. This allowed her to associate each ethogram-based call-type with specific behaviors produced by the birds emitting it and elicited in the birds hearing it.
Three male and two female zebra finches, including two mating pairs. Females are typically all gray, though here one female is a white morph. Males are colorful, with red cheeks, zebra stripes on their chest and brown patches with white spots on their sides.
Julie Elie/UC Berkeley
Elie designed an experiment that confirmed the birds can indeed discriminate between the call-types. The researchers played the birds an audio track of several hundred zebra finch calls, arranged randomly and drawn from a list of about 8,000 calls from more than 30 birds, to see whether the birds could pick out the one call-type (the same “word” uttered by many different birds) that was associated with a reward of seeds. They pecked a button to rapidly skip through unrewarded calls, like channel surfing on a TV, until they got to the rewarded calls.
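As a rough illustration of how an interruption-based task like this could be organized, here is a hypothetical sketch in Python. The trial loop, variable names and reward rule are assumptions made for illustration; they are not the authors’ experimental code.

```python
import random

def run_session(calls, target_type, bird_interrupts):
    """Simulate one session of a (hypothetical) interruption task.

    calls: list of (call_id, call_type) drawn from many birds.
    target_type: the one rewarded call-type ("word") for this session.
    bird_interrupts: function deciding whether the bird pecks to skip a call.
    """
    random.shuffle(calls)  # calls are presented in random order
    results = []
    for call_id, call_type in calls:
        interrupted = bird_interrupts(call_id, call_type)
        # The bird is rewarded with seeds only when it lets a target call
        # play to the end; pecking through non-target calls just skips them.
        rewarded = (call_type == target_type) and not interrupted
        results.append({"call": call_id, "type": call_type,
                        "interrupted": interrupted, "rewarded": rewarded})
    return results
```

Counting how often a bird wrongly interrupts target calls, or sits through non-target ones, is what yields the error patterns discussed below.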
“This tells us that they agree with whatever organization of the repertoire we made,” Elie said. “The human is here observing and saying, ‘Those are your words.’ And the bird is saying, ‘Yes, these are my words.’”
“Birds have various degrees of intelligence, and I don’t think zebra finches are very high in that spectrum,” Theunissen said. “But in terms of auditory discrimination while doing this task, they are really quite phenomenal.”
Finally, to test whether zebra finches have an internal, mental representation of the meaning of calls, the researchers analyzed the results of the discrimination task to determine whether the birds ever misunderstood calls. They discovered that “when the birds made mistakes, they occasionally made mistakes based on the acoustics, but even more often, (they made mistakes) based on the semantics of the call types. This was something that was really surprising to us,” he said.
For example, the finches sometimes confuse two contact calls, even though they are acoustically quite different. One, referred to as a “tet” call, is used when birds are close to one another, going about their business while keeping track of each other’s location through the exchange of this soft call: tet, tet, tet. It’s like a Marco Polo call and response, Elie said: “Are you here? I’m here. You’re here!” The other, the distance call, sounds like a loud honk in females or a loud “pitchiou” in males and is used when birds are far enough apart that they can’t see one another.
An alarm call, however — a thuk, thuk, thuk, thuk — sounds much like the tet call but is seldom confused with it.
The fact that calls with similar meanings more often cause confusion than calls that sound similar suggests that the birds are extracting meaning from the calls, Elie said.
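One way to picture the comparison behind that conclusion: given a confusion matrix of error rates between call-types and a semantic grouping of those types, one can ask whether errors pile up among call-types that share a meaning. The sketch below assumes those inputs already exist as simple Python dictionaries; it is purely illustrative and not the analysis code from the paper.

```python
from itertools import combinations

def summarize_errors(call_types, confusion, semantic_group):
    """Compare confusion within vs. between (assumed) semantic groups.

    confusion[a][b]: rate at which type-a calls were treated as type b.
    semantic_group[a]: category label such as "contact" or "alarm".
    """
    same_meaning, diff_meaning = [], []
    for a, b in combinations(call_types, 2):
        err = (confusion[a][b] + confusion[b][a]) / 2.0
        if semantic_group[a] == semantic_group[b]:
            same_meaning.append(err)
        else:
            diff_meaning.append(err)

    def mean_error(errs):
        return sum(errs) / len(errs) if errs else 0.0

    # If errors track meaning rather than sound, same-meaning pairs should
    # be confused more often; a fuller analysis would also control for the
    # acoustic distance between call-types.
    return {"same_meaning_error": mean_error(same_meaning),
            "diff_meaning_error": mean_error(diff_meaning)}
```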
Julie Elie holding a zebra finch in a UC Berkeley lab.
Courtesy of Julie Elie/UC Berkeley
“By studying vocal communication, we get a better sense of the cognitive ability of animals,” Elie said. “Maybe at one point we’ll be able to communicate with other animals. If we make the effort of really deciphering their language, we might be able to understand them better.”
Theunissen and Elie are currently making brain recordings as these finches engage in call discrimination tasks to determine, “Where is that percept of meaning in the brain? How is it represented in the brain? How is it that they’re making those errors between call types that belong to the same semantic hypercategory, such as all the contact calls?” Elie said.
Theunissen said that this discovery extends the research he has been conducting on how the auditory cortex in the brain extracts a signal — a song, for example — from background noise.
“Now we’re going from sensation to perception, to use a more psychological term,” he said. “Perception is like assigning a label, where you can actually say, ‘Oh, I am listening to a symphony or I’m hearing a bus going by in the street.’ Or here, ‘I understand what you’re speaking.’”
Other coauthors of the paper are Aude de Witasse-Thézy of the Université de Lyon in France and Logan Thomas and Ben Malit of UC Berkeley.