
Bat Brains Reveal Vocal Code

The meaning of the vocalizations we hear is fundamental to navigating our environment and regulating our behavior. Hearing a baby cry, for example, elicits a very different response than hearing a friend laugh. Language itself depends on our ability to recognize and categorize speech sounds (phonemes). Our capacity to distinguish highly similar sounds, such as "beer" versus "bear", exemplifies the sophisticated neural computations at work.

Bats, true masters of sound, use echolocation to navigate in complete darkness and communicate via vocalizations. Recently, a team of neuroscientists uncovered how a specific brain region categorizes these calls. Using two-photon microscopy, they visualized the real-time activity of individual neurons and neuronal populations, revealing how this region identifies and discriminates among different sound types. The study shows that discrimination already occurs in subcortical structures, upstream of the auditory cortex, which accelerates auditory categorization.


Bat brain during social calls

Visualizing Neuronal Activity in the Bat Brain

The researchers focused on the dorsal cortex of the inferior colliculus (DCIC)—a part of the midbrain known for processing sound features. This subdivision was particularly interesting due to its potential role in vocalization categorization.

To study neuronal activity, a virus was injected into the DCIC to express a calcium indicator in neurons. This allowed the visualization of neuronal activity in response to sound stimuli with high spatial resolution in live animals.
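To give a sense of what these imaging data look like downstream, here is a toy sketch (all numbers invented) of the standard ΔF/F normalization used to turn a neuron's raw calcium-indicator fluorescence trace into an activity trace:

```python
# Toy dF/F computation: the standard way calcium-imaging movies are turned
# into per-neuron activity traces. The trace below is synthetic; real data
# come from the two-photon microscope, frame by frame.
baseline_frames = 20
trace = [100.0] * baseline_frames + [100, 140, 180, 160, 130, 110, 100, 100]

f0 = sum(trace[:baseline_frames]) / baseline_frames  # baseline fluorescence
dff = [(f - f0) / f0 for f in trace]                 # fractional change from baseline

peak = max(dff)
print(f"baseline F0 = {f0:.0f}, peak dF/F = {peak:.2f}")
# prints: baseline F0 = 100, peak dF/F = 0.80
```

A transient like this 80% rise above baseline is the kind of signal that reports a burst of neuronal activity in response to a sound.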

Tonotopy: Mapping Sound Frequencies

When the bats were exposed to various vocalizations, individual neurons increased their activity according to their frequency tuning. The precision of two-photon microscopy enabled 3D reconstruction of neuron positions, revealing a tonotopic map: neurons were organized along the rostro-lateral axis (the X-Y plane), but not along depth (the Z axis). In this tonotopic organization, sound frequency is mapped systematically across the brain surface, with neighboring neurons tuned to similar frequencies, forming a functional auditory map.
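A toy illustration of what such a map implies statistically (synthetic neurons with invented positions and frequencies): a neuron's best frequency should correlate with its rostro-lateral position but not with its depth.

```python
import random
random.seed(1)

def pearson(xs, ys):
    """Pearson correlation between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical neurons: best frequency rises along the rostro-lateral (x)
# axis but is independent of depth (z), mimicking the pattern reported
# for the DCIC. Frequencies in kHz are toy numbers.
neurons = []
for _ in range(200):
    x, z = random.random(), random.random()
    best_freq = 20 + 60 * x + random.gauss(0, 5)
    neurons.append((x, z, best_freq))

xs = [n[0] for n in neurons]
zs = [n[1] for n in neurons]
fs = [n[2] for n in neurons]
print("corr(best freq, rostro-lateral position):", round(pearson(xs, fs), 2))
print("corr(best freq, depth):", round(pearson(zs, fs), 2))
```

The first correlation comes out strongly positive and the second near zero, which is what a surface tonotopic map with no depth organization looks like in numbers.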

Neuronal Coding of Meaning

Bats use vocalizations for both spatial navigation and social interaction. Since many of these sounds are acoustically similar, it was important to understand how the DCIC distinguishes between them.

Researchers trained a computational decoder on a subset of neuronal responses to specific stimuli and tested it on the held-out data. This classifier decoded social vocalizations more accurately than navigational ones, suggesting that the DCIC encodes socially meaningful sounds with higher fidelity.
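The train-then-test logic can be sketched with synthetic population responses and a simple nearest-centroid classifier (the study's actual model and data differ; every number here is invented for illustration):

```python
import random
random.seed(0)

def synth_response(category, n_neurons=20):
    """Synthetic population response: each category has its own mean pattern plus noise."""
    base = 1.0 if category == "social" else -1.0
    return [base + random.gauss(0, 0.8) for _ in range(n_neurons)]

# Labeled "trials" of population activity (hypothetical data)
trials = [("social", synth_response("social")) for _ in range(50)] + \
         [("navigational", synth_response("navigational")) for _ in range(50)]
random.shuffle(trials)
train, test = trials[:70], trials[30:][:30] if False else (trials[:70], trials[70:])[1] and (trials[:70], trials[70:])

train, test = trials[:70], trials[70:]

# "Train": compute the mean response vector (centroid) per category
centroids = {}
for cat in ("social", "navigational"):
    vecs = [r for c, r in train if c == cat]
    centroids[cat] = [sum(col) / len(col) for col in zip(*vecs)]

def decode(response):
    """Assign the category whose centroid is closest (Euclidean distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(centroids, key=lambda c: dist(response, centroids[c]))

# "Test": decode the held-out trials and score accuracy
accuracy = sum(decode(r) == c for c, r in test) / len(test)
print(f"decoding accuracy on held-out trials: {accuracy:.2f}")
```

Comparing this accuracy between social and navigational stimuli, as the authors did with their decoder, is what reveals the DCIC's bias toward social calls.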

Categorization vs. Frequency Differentiation

Was this selectivity merely due to differences in the frequency preferences of the neurons? Surprisingly, neurons selective for social and navigational sounds did not differ significantly in frequency tuning. This suggested that the DCIC was categorizing vocalizations based on more complex acoustic or contextual features, not just frequency.

Neuronal clusters act as switches that categorize the meaning of sounds

To explore this further, the researchers created hybrid vocalizations that gradually morphed from social to navigational sounds. They found that neurons typically responded in an “all-or-nothing” fashion: each neuron preferred one category or the other, rather than showing a gradual shift in activity.
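The contrast between an "all-or-nothing" (categorical) neuron and a hypothetical graded one can be sketched as a steep sigmoid versus a linear response along the morph continuum (illustrative functions only, not fits to the study's data):

```python
import math

def categorical_response(morph, boundary=0.5, steepness=20.0):
    """'All-or-nothing' neuron: a steep sigmoid that is nearly silent on one
    side of the category boundary and fully active on the other."""
    return 1.0 / (1.0 + math.exp(-steepness * (morph - boundary)))

def graded_response(morph):
    """Hypothetical graded neuron: activity simply tracks the morph level."""
    return morph

# morph = 0.0 is a pure social call, 1.0 a pure navigational call
for i in range(11):
    m = i / 10
    print(f"morph {m:.1f}  categorical {categorical_response(m):.2f}  "
          f"graded {graded_response(m):.2f}")
```

The categorical neuron stays near 0 until the morph crosses the boundary, then jumps to near 1; the DCIC neurons behaved like the sigmoid, not like the ramp.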

Moreover, neurons selective for each type of sound formed spatial clusters within the DCIC. These clusters were aligned along the tonotopic axis but were not determined solely by frequency. Instead, they reflected a more abstract, behavioral organization based on the meaning of the sounds.

Conclusion

Bats are true masters of sound. They emit calls to perceive their environment, echolocating objects and other animals through the returning echoes, but they also rely on vocalizations for social communication. The brain must categorize these different types of information to guide behavior. The finding that the DCIC categorizes sounds in an on/off manner between spatial and social contexts shows how a bat can rapidly process a call and respond appropriately to its environment. It also challenges the paradigm that the auditory cortex is solely responsible for categorizing the information arriving from the ear. I will continue exploring how visual and other sensory information is processed in the brain, emphasizing the value of studying species with specialized processing abilities; understanding their neural computations could inform technological innovation and healthcare applications.

If you're curious, you can listen to examples of bat vocalizations here (https://www.nps.gov/subjects/bats/echolocation.htm). While we may only hear the frequency differences, the bat's brain goes further: it categorizes the calls by their social versus navigational meaning.





Key concepts

Calcium indicator: A fluorescent molecule used to detect changes in calcium levels, which correlate with neuronal activity.

Categorical perception: The brain's ability to sort continuous stimuli into discrete categories according to the type of information they carry.

Echolocation: The biological ability bats use to navigate and localize themselves by emitting calls and interpreting the returning echoes.

Decoder (in neuroscience): A computational model trained to predict stimuli (like sound type) based on patterns of neural activity.

Tonotopy: The spatial arrangement of neurons in the brain according to the frequency of sound they process.

Two-photon microscopy: An advanced imaging technique that allows deep, high-resolution visualization of live brain tissue.



References

Lawlor, J., Wohlgemuth, M.J., Moss, C.F. et al. Spatially clustered neurons in the bat midbrain encode vocalization categories. Nat Neurosci (2025). https://doi.org/10.1038/s41593-025-01932-3


