Issue 3, Volume 1

Echolocation in Humans: How hearing helps people with vision loss “see” their environment



Composite art by Josh Green; brain illustration credit: iStock.com/bauhaus1000

Imagine you were placed in a brand-new environment. Could you identify the composition of your surroundings based on sound alone? Would you be able to locate the wooden table in front of you, or the ivy growing several feet away? For a subset of visually impaired individuals who harness the power of echolocation, this is a reality. Like bats, whales, and dolphins, these individuals detect objects in their environment by generating a sound (for example, snapping their fingers or clicking with their mouth) and sensing the echoes reflected from the surrounding objects. The brain then translates these echoes into a mental map of the surrounding space. The sound reflected from an object depends on the object’s physical properties, such as its shape, distance from the listener, and material; echoes can therefore convey a remarkable amount of information about the surrounding environment. Visually impaired human echolocators can judge the location, shape, and size of nearby objects with incredible acuity. According to new research from Western University, human echolocators can also identify the material of an object, and researchers may have determined the brain region that allows them to do so1.
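Distance is the simplest of these echo cues: the later the echo arrives after the click, the farther away the object is. A minimal sketch of the arithmetic (not part of the study, and assuming sound travels at roughly 343 m/s in room-temperature air):

```python
# Illustrative only: how an echo's delay encodes distance.
SPEED_OF_SOUND = 343.0  # metres per second, in air at ~20 °C

def distance_from_echo(delay_seconds: float) -> float:
    """Return the distance (in metres) to an object, given the round-trip echo delay."""
    # The click travels to the object and back, so halve the round trip.
    return SPEED_OF_SOUND * delay_seconds / 2

# An echo arriving ~12 ms after a mouth click implies an object roughly 2 m away.
print(round(distance_from_echo(0.012), 2))  # → 2.06
```

Real echolocators, of course, do this implicitly and combine delay with many other cues, such as loudness and spectral colouring, which carry the shape and material information the study examined.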

In collaboration with researchers from Toronto, California, and the UK, the Goodale Lab at Western University’s Brain and Mind Institute sought to assess how accurately blind echolocators can identify a target’s material, and which brain region grants them such acuity. The study brought three visually impaired echolocation experts to the anechoic chamber (literally “echo-free”, a sound-dampening room) at the National Centre for Audiology at Western University. The chamber allowed the researchers to isolate only those echoes reflected from objects introduced into it. The echolocation experts were asked to produce echolocation mouth clicks in front of three materials: a whiteboard, synthetic foliage, and a fence covered with a blanket. The researchers recorded both the mouth-click sound and the corresponding echo, then assessed how well the echolocators could identify the material when listening to the recorded click-and-echo pairs. As a control, the researchers also evaluated the performance of visually impaired and sighted individuals with no experience in echolocation. Notably, the audio clips were shuffled so that echolocators never listened to their own recordings. During testing, the participants’ brain activity was measured using functional magnetic resonance imaging (fMRI), which uses powerful magnets to measure activity in different brain regions during specific tasks.

Impressively, the echolocation experts reached 80% accuracy in identifying materials; for example, they could reliably tell when a mouth click had echoed off a synthetic plant rather than a whiteboard. In contrast, the non-echolocators performed in the 40–60% range. When the researchers examined the brain activity of all participants, they found that the blind echolocation experts showed activity in regions associated with visual processing, a pattern not seen in non-echolocators. Specifically, activity appeared in areas related to visual pattern detection, suggesting that regions that normally detect visual patterns had become sensitive to spatial auditory patterns.

Surprisingly, the echolocation experts also showed activity in the parahippocampal cortex, a brain area within the limbic system, which is thought to be involved in emotion and memory. This region has also been implicated in the visual detection of material composition; in practice, it helps you recognize that a jacket is made of fleece rather than canvas. In the echolocation experts, the parahippocampal cortex was active while they processed the material stimuli, whereas it was not engaged in the non-echolocators.

Overall, this study highlights the brain’s remarkable ability to adapt to changes in its input. In this case, brain regions that normally respond only to visual input had adapted to distinguish specific spatial auditory input in visually impaired individuals. The research further supports the hypothesis that brain pathways normally associated with visual processing are repurposed for auditory cues in blind human echolocators2. Since the publication of this article, more research has explored the fascinating abilities of human echolocators. Using these findings, researchers have even begun to train machines to discriminate materials based on human echolocation clicks, but those machines have a long way to go to match the human brain3!



  1. Milne JL, Arnott SR, Kish D, et al. Parahippocampal cortex is involved in material processing via echoes in blind echolocation experts. Vision Res. 2015;109:139–148.
  2. Norman LJ, Thaler L. Retinotopic-like maps of spatial sound in primary ‘visual’ cortex of blind human echolocators. Proc R Soc B Biol Sci. 2019;286. DOI: 10.1098/rspb.2019.1910.
  3. Abdullah RSAR, Saleh NL, Rahman SMSA, et al. Texture classification using spectral entropy of acoustic signal generated by a human echolocator. Entropy. 2019;21:1–20.

Author: Sam Mestern