Jerusalem, 4 June, 2025 (TPS-IL) — Israeli scientists have uncovered a surprising twist in how mice perceive their environment: the animals may be “hearing” through their whiskers, a discovery that could open avenues for new sensors that improve prosthetics, tools for the visually impaired, and rehabilitation techniques.
A study by the Weizmann Institute of Science suggests that the subtle sounds produced when a mouse’s whiskers brush against surfaces are not only real but are actively processed by the brain’s auditory system.
“These whiskers are so delicate that no one thought to check whether they produce noises that mice can hear,” said Prof. Ilan Lampel of the Institute’s Department of Neuroscience, who led the study. “But it turns out that these faint sounds are not only audible to mice—they’re meaningful.”
Long thought to function purely as tactile sensors, whiskers have now been shown to serve as part of an intricate, multisensory system. The study — published in the peer-reviewed journal Current Biology — challenges the conventional notion that the senses operate in isolation.
“Contrary to textbooks, the sharp and clear separation between the senses does not necessarily exist in reality,” Lampel explained. “In fact, perception often combines different sources—touch and hearing in this case.”
The team, including Dr. Ben Efron, Dr. Athanasios Natalzos, and Dr. Jonathan Katz, began by recording the nearly inaudible sounds made when whiskers rubbed against surfaces like aluminum foil or dry leaves. Using high-sensitivity ultrasonic microphones positioned just two centimeters from the whiskers — the typical distance from whisker tip to mouse ear — they captured these previously overlooked acoustic signatures.
They then measured neural activity in the auditory cortex of mice as the animals touched various objects with their whiskers. Even when the neural pathways for touch were blocked, the auditory system still responded to the sounds, indicating the brain treats them as a separate form of sensory input.
To probe deeper, the researchers turned to artificial intelligence. They trained machine learning models to identify objects based on either the neural activity in the auditory cortex or the recordings of the whisker-produced sounds. Remarkably, both models performed almost identically, suggesting the brain was indeed reacting to sound alone and not touch or other cues like smell.
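The decoder comparison described above can be illustrated with a minimal sketch. The code below is not the team's actual analysis pipeline: it fabricates two stand-in feature sets (one standing in for auditory-cortex activity, one for whisker-sound recordings) that encode the same three hypothetical objects, then trains the same simple classifier on each and compares their accuracies, mirroring the study's logic that near-identical performance suggests both carry the same object information.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fabricated stand-in data (the study's real inputs were neural recordings
# and ultrasonic microphone recordings): three "objects", each with its own
# feature signature, observed with independent noise in each modality.
n_per_class, n_classes, n_feat = 40, 3, 16
labels = np.repeat(np.arange(n_classes), n_per_class)
centers = rng.normal(size=(n_classes, n_feat))
neural = centers[labels] + 0.3 * rng.normal(size=(labels.size, n_feat))
audio = centers[labels] + 0.3 * rng.normal(size=(labels.size, n_feat))

def nearest_centroid_accuracy(X, y, train_frac=0.5):
    """Train a nearest-centroid classifier on half the data, test on the rest."""
    idx = rng.permutation(y.size)
    split = int(train_frac * y.size)
    tr, te = idx[:split], idx[split:]
    cents = np.stack([X[tr][y[tr] == c].mean(axis=0) for c in range(n_classes)])
    dists = np.linalg.norm(X[te][:, None, :] - cents[None, :, :], axis=2)
    return (dists.argmin(axis=1) == y[te]).mean()

acc_neural = nearest_centroid_accuracy(neural, labels)
acc_audio = nearest_centroid_accuracy(audio, labels)
print(f"neural-decoder accuracy: {acc_neural:.2f}")
print(f"audio-decoder accuracy:  {acc_audio:.2f}")
```

In this toy setup both decoders succeed because both modalities encode the same class structure; in the study, the analogous match between the neural and acoustic decoders is what pointed to the auditory cortex responding to the whisker sounds themselves.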
But could mice use this information to navigate the world? To find out, the team conducted behavioral experiments. Mice whose sense of touch had been neutralized were trained to recognize aluminum foil using only the sound of their whiskers brushing it. The results were clear: the mice could reliably associate the sounds with specific objects.
“The results indicate that the mouse’s whiskers are an integrative and multisensory sensory system,” said Lampel. “It may have evolved this way to help mice locate food or protect themselves from predators.” For example, he said, “a mouse navigating its environment and fearing detection may use the weak acoustic signals from its whiskers to choose between a dry thorn field and a soft meadow.”
The discovery has far-reaching practical implications that could inform the development of better prosthetics, rehabilitation techniques, and tools for the visually impaired.
In the field of prosthetics, the study could lead to the development of advanced artificial limbs equipped with sensors that emit specific sounds or vibrations when touching surfaces, enabling users to “hear” textures or resistance. For people recovering from strokes or injuries that impair touch, sound-based feedback systems could help retrain the brain to interpret environmental stimuli through alternative sensory channels.
The findings also offer new directions in assistive technology for the visually impaired. Smart canes inspired by the whisker system could detect surfaces and relay information through auditory or vibratory signals. Much like echolocation, this approach could help users identify object types or navigate complex environments using sound cues alone.
In robotics, integrating sound-sensitive tactile sensors modeled on whiskers could revolutionize how machines operate in low-visibility conditions such as smoke, fog, or debris. Robots could interpret acoustic feedback from surface contact to avoid obstacles or make decisions in search-and-rescue missions where human presence is risky.
Insights from the mouse model may also influence the design of brain-computer interfaces or train AI systems to interpret complex sensory data more like biological organisms.
“Integrating different types of sensory input is a major challenge in designing robotic systems,” said Efron. “The whisker sensing system may serve as inspiration for developing early warning technologies for navigation and collision avoidance in environments with limited visibility.”
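A whisker-style early-warning system of the kind Efron describes can be sketched in a few lines. Everything here is invented for illustration — the sampling rate, signal amplitudes, and threshold rule are assumptions, not details from the study: a simulated contact-noise stream is monitored with a rolling RMS envelope, and a warning fires when the envelope rises above a level calibrated on a known-quiet span.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated sensor stream (all numbers illustrative): 1 s of faint ambient
# noise followed by 0.25 s of louder whisker-contact noise.
fs = 1000                                   # assumed samples per second
quiet = 0.01 * rng.normal(size=fs)
contact = 0.2 * rng.normal(size=fs // 4)
signal = np.concatenate([quiet, contact])

def rolling_rms(x, window):
    """RMS envelope of x over a sliding window (zero-padded at the start)."""
    padded = np.concatenate([np.zeros(window - 1), x ** 2])
    kernel = np.ones(window) / window
    return np.sqrt(np.convolve(padded, kernel, mode="valid"))

envelope = rolling_rms(signal, window=50)
# Calibrate the alarm threshold on the first half-second, known to be quiet.
threshold = 5 * envelope[: fs // 2].mean()
alarm_at = int(np.argmax(envelope > threshold))
print(f"contact warning raised at sample {alarm_at} (contact began at {fs})")
```

The design choice mirrors the whisker idea: contact itself generates the sound, so a robot needs no camera or dedicated rangefinder to notice it has brushed against something — a property that survives smoke, fog, and darkness.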