Human echolocation

Human echolocation is the ability of humans to detect objects in their environment by sensing echoes from those objects. By actively creating sounds, such as tapping their canes, lightly stomping a foot, snapping their fingers, or making clicking noises with their mouths, people trained to orient by echolocation can interpret the sound waves reflected by nearby objects and accurately identify their location and size.

Background

The term "echolocation" was coined by zoologist Donald Griffin in 1944; however, reports of blind humans being able to locate silent objects date back to 1749.[1] Human echolocation has been known and formally studied since at least the 1950s.[2] In earlier times, human echolocation was sometimes described as "facial vision" or "obstacle sense", as it was believed that the proximity of nearby objects caused pressure changes on the skin.[3][4][5] Only in the 1940s did a series of experiments performed in the Cornell Psychological Laboratory show that sound and hearing, rather than pressure changes on the skin, were the mechanisms driving this ability.[1] The field of human and animal echolocation was surveyed in book form as early as 1959[6] (see also White, et al. (1970)[7]).

Many blind individuals passively use natural environmental echoes to sense details about their environment; however, others actively produce mouth clicks and are able to gauge information about their environment using the echoes from those clicks.[8] Both passive and active echolocation help blind individuals sense their environments.

Those who can see their environments often do not readily perceive echoes from nearby objects, due to an echo suppression phenomenon brought on by the precedence effect. However, with training, sighted individuals with normal hearing can learn to avoid obstacles using only sound, showing that echolocation is a general human ability.[9]

Researchers at Durham University, led by Lore Thaler, investigated whether echolocation could be taught. Over a ten-week period, they trained 12 blind and 14 sighted participants to echolocate.[10]

Mechanics

Vision and hearing are alike in that each interprets reflected waves of energy. Vision processes light waves that travel from their source, bounce off surfaces throughout the environment and enter the eyes. Similarly, the auditory system processes sound waves as they travel from their source, bounce off surfaces and enter the ears. Both neural systems can extract a great deal of information about the environment by interpreting the complex patterns of reflected energy that their sense organs receive. In the case of sound, these waves of reflected energy are referred to as echoes.

Echoes and other sounds can convey spatial data that are comparable in many respects to those conveyed by light.[11] A blind traveler using echoes can perceive very complex, detailed, and specific features of the world from distances far beyond the reach of the longest cane or arm. Echoes make information available about the nature and arrangement of objects and environmental features such as overhangs, walls, doorways and recesses, poles, ascending curbs and steps, planter boxes, pedestrians, fire hydrants, parked or moving vehicles, trees and other foliage, and much more. Echoes can give detailed information about location (where objects are), dimension (how big they are and their general shape), and density (how solid they are). Location is generally broken down into distance from the observer and direction (left/right, front/back, high/low). Dimension refers to the object's height (tall or short) and breadth (wide or narrow).
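As a simple illustration of the distance component described above: sound travels through air at roughly 343 m/s at room temperature, so an echo that returns t seconds after a click implies a reflecting surface about 343 × t / 2 metres away. The sketch below is purely illustrative and is not drawn from any study cited in this article; the speed of sound and the example delay are assumptions.

```python
# Illustrative sketch only: relating echo delay to object distance.
# Assumes a speed of sound of ~343 m/s (dry air at about 20 degrees C);
# real conditions (temperature, noise, the precedence effect) vary.

SPEED_OF_SOUND_M_S = 343.0

def distance_from_echo_delay(delay_s: float) -> float:
    """Return the one-way distance (in metres) to a reflecting surface,
    given the round-trip delay (in seconds) between a click and its echo."""
    return SPEED_OF_SOUND_M_S * delay_s / 2.0

# Example: an echo heard 10 ms after the click suggests an object ~1.7 m away.
print(f"{distance_from_echo_delay(0.010):.2f} m")  # prints "1.72 m"
```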

By understanding the interrelationships of these qualities, much can be perceived about the nature of an object or multiple objects. For example, an object that is tall and narrow may be recognized quickly as a pole. An object that is tall and narrow near the bottom while broad near the top would be a tree. Something that is tall and very broad registers as a wall or building. Something that is broad and tall in the middle, while being shorter at either end may be identified as a parked car. An object that is low and broad may be a planter, retaining wall, or curb. And finally, something that starts out close and very low but recedes into the distance as it gets higher is a set of steps. Density refers to the solidity of the object (solid/sparse, hard/soft). Awareness of density adds richness and complexity to one's available information. For instance, an object that is low and solid may be recognized as a table, while something low and sparse sounds like a bush; but an object that is tall and broad and very sparse is probably a fence.[12]
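The kind of inference described in this section can be thought of as a small set of rules of thumb mapping coarse qualities (height, breadth, density) to likely object types. The sketch below is purely illustrative; it simply restates the examples given above as code and does not represent any algorithm actually used or taught by echolocators.

```python
# Illustrative restatement of the examples above as simple rules of thumb.
# Height and breadth are coarse labels ("tall"/"low", "broad"/"narrow");
# density is "solid" or "sparse". Not a real echolocation algorithm.

def guess_object(height: str, breadth: str, density: str) -> str:
    """Map coarse echo-derived qualities to a plausible object type."""
    if height == "tall" and breadth == "narrow":
        return "pole"
    if height == "tall" and breadth == "broad" and density == "sparse":
        return "fence"
    if height == "tall" and breadth == "broad":
        return "wall or building"
    if height == "low" and density == "solid":
        return "table"
    if height == "low" and density == "sparse":
        return "bush"
    if height == "low" and breadth == "broad":
        return "planter, retaining wall, or curb"
    return "unknown"

print(guess_object("tall", "narrow", "solid"))  # pole
print(guess_object("low", "broad", "sparse"))   # bush
```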

Brain areas associated with echolocation

Echo-related activity in the brain of an early-blind, trained echolocator (left); no echo-related activity is evident in the brain of an untrained sighted person listening to the same echoes (right).

Some blind people are skilled at echolocating silent objects simply by producing mouth clicks and listening to the returning echoes. Although few studies have been performed on the neural basis of human echolocation, those studies report activation of primary visual cortex during echolocation in blind expert echolocators.[1][13][14] The driving mechanism of this brain region remapping phenomenon is known as neuroplasticity.

In a 2014 study by Thaler and colleagues,[15] the researchers first made recordings of the clicks and their very faint echoes using tiny microphones placed in the ears of the blind echolocators as they stood outside and tried to identify different objects such as a car, a flag pole, and a tree. The researchers then played the recorded sounds back to the echolocators while their brain activity was being measured using functional magnetic resonance imaging. Remarkably, when the echolocation recordings were played back to the blind experts, not only did they perceive the objects based on the echoes, but they also showed activity in those areas of their brain that normally process visual information in sighted people, primarily the primary visual cortex or V1. This result is surprising, as visual areas are normally only active during visual tasks. The brain areas that process auditory information were no more activated by sound recordings of outdoor scenes containing echoes than they were by sound recordings of outdoor scenes with the echoes removed. Importantly, when the same experiment was carried out with sighted people who did not echolocate, these individuals could not perceive the objects and there was no echo-related activity anywhere in the brain. This suggests that the cortex of blind echolocators is plastic and reorganizes such that primary visual cortex, rather than any auditory area, becomes involved in the computation of echolocation tasks.

Despite this evidence, the extent to which activation in the visual cortex in blind echolocators contributes to echolocation abilities is unclear.[9] As previously mentioned, sighted individuals have the ability to echolocate; however, they do not show comparable activation in visual cortex. This would suggest that sighted individuals use areas beyond visual cortex for echolocation.

Notable cases

Daniel Kish

Echolocation has been further developed by Daniel Kish, who works with the blind through the non-profit organization World Access for the Blind.[16] He leads blind teenagers hiking and mountain-biking through the wilderness, and teaches them how to navigate new locations safely, with a technique that he calls "FlashSonar".[17] Kish had his eyes removed at the age of 13 months due to retinal cancer. He learned to make palatal clicks with his tongue when he was still a child, and now trains other blind people in the use of echolocation and in what he calls "Perceptual Mobility".[18] Though at first resistant to using a cane for mobility, seeing it as a "handicapped" device, and considering himself "not handicapped at all", Kish developed a technique using his white cane combined with echolocation to further expand his mobility.[18]

Kish reports that "The sense of imagery is very rich for an experienced user. One can get a sense of beauty or starkness or whatever—from sound as well as echo."[17] He is able to distinguish a metal fence from a wooden one by the information returned by the echoes on the arrangement of the fence structures; in extremely quiet conditions, he can also hear the warmer and duller quality of the echoes from wood compared to metal.[17]

Thomas Tajo

Thomas Tajo was born in the remote Himalayan village of Chayang Tajo in the state of Arunachal Pradesh in north-east India. He became blind around the age of 7 or 8 due to optic nerve atrophy and taught himself to echolocate. Today he lives in Belgium and works with Visioneers (World Access for the Blind) to impart independent navigation skills to blind individuals across the world. Tajo is also an independent researcher who studies the cultural and biological evolutionary history of the senses and presents his findings at scientific conferences around the world.[19]

Ben Underwood

Ben Underwood was a blind American who was born on January 26, 1992, in Riverside, California. He was diagnosed with retinal cancer at the age of two, and had his eyes removed at the age of three.[20]

He taught himself echolocation at the age of five, becoming able to detect the location of objects by making frequent clicking noises with his tongue. His case was featured in 20/20: Medical Mysteries.[21] He used echolocation to accomplish such feats as running, playing basketball, riding a bicycle, rollerblading, playing football, and skateboarding.[22][23] Underwood's childhood eye doctor claimed that Underwood was one of the most proficient human echolocators.

He inspired other blind people to follow his lead. He died of cancer in 2009.

Lawrence Scadden

Lawrence Scadden lost his sight as a child due to illness, but learned to use echolocation well enough to ride a bicycle in traffic. (His parents thought that he still had some sight remaining.)[24] In 1998, he was interviewed at the Auditory Neuroethology Laboratory at the University of Maryland about his experience with echolocation.[7] The researchers were aware of the Wiederorientierung phenomenon described by Griffin[6] where bats, despite continuing to emit echolocation calls, use path integration in familiar acoustic space. Scadden said he did the same, as echolocation required extra effort.

The National Science Teachers Association created the "Lawrence A. Scadden Outstanding Teacher Award of the Year for Students With Disabilities" in his honor.[25]

Lucas Murray

Lucas Murray, from Poole, Dorset, was born blind and is one of the first British people to have learned human echolocation, taught by Daniel Kish.[26] Lucas' parents saw a documentary about Daniel Kish teaching Ben Underwood echolocation.[27][28] Months later, they learned that Kish would be visiting a Scottish charity called Visibility[29] and contacted him. Kish taught the five-year-old Lucas the basics of echolocation over four days. By age seven, Lucas was proficient enough not only to judge the distance of objects accurately but also to identify their material, and could join other children in activities such as rock climbing and basketball.[30][31] In 2019, he enjoyed a week's work experience with South Western Railway.[32]

Kevin Warwick

The scientist Kevin Warwick experimented with feeding ultrasonic pulses into the brain (via electrical stimulation from a neural implant) as an additional sensory input. In tests he was able to discern distance to objects accurately and to detect small movements of those objects.[33]

Juan Ruiz

Blind from birth, Juan Ruiz lives in Los Angeles, California. He appeared in the first episode of Stan Lee's Superhumans, titled "Electro Man". The episode showed him capable of riding a bicycle, avoiding parked cars and other obstacles, and identifying nearby objects. He entered and exited a cave, where he determined its length and other features.[citation needed]

In popular media

The 2017 video game Perception places the player in the role of a blind woman who must use echolocation to navigate the environment.[34]

In the 2012 film Imagine, the main character teaches echolocation to students at a clinic for the visually impaired. This unconventional method spurs a controversy but helps students explore the world.[35]

In the 2007 children's fantasy novel Gregor and the Code of Claw, protagonist Gregor learns echolocation. This skill proves useful for fighting in the Underland, an underground civilization which is the main setting of the book.[36]

References

  1. ^ a b c Kolarik, Andrew J.; Cirstea, Silvia; Pardhan, Shahina; Moore, Brian C. J. (2014-04-01). "A summary of research investigating echolocation abilities of blind and sighted humans". Hearing Research. 310: 60–68. doi:10.1016/j.heares.2014.01.010. PMID 24524865. S2CID 21785505.
  2. ^ Richard L. Welsh, Bruce B. Blasch, online Foundations of Orientation and Mobility, American Foundation for the Blind, 1997; which cites S. O. Myers and C. G. E. G. Jones, "Obstacle experiments: second report", Teacher of the Blind 46, 47–62, 1958.
  3. ^ Raymond J Corsini, The Dictionary of Psychology, Psychology Press (UK), 1999, ISBN 1-58391-028-X.
  4. ^ M. Supa, M. Cotzin, and K. M. Dallenbach. "Facial Vision" - The Perception of Obstacles by the Blind. The American Journal of Psychology, April 1944.
  5. ^ Cotzin and Dallenbach. "Facial Vision": The Role of Pitch and Loudness in the Location of Obstacles by the Blind. The American Journal of Psychology, October 1950.
  6. ^ a b Griffin, Donald R., Echoes of Bats and Men, Anchor Press, 1959 (Science Study Series, Seeing With Sound Waves)
  7. ^ a b White, J. C., Saunders, F. A., Scadden, L., Bach-y-Rita, P., & Collins, C. C. (1970). Seeing with the skin. Perception & Psychophysics, 7, 23-27.
  8. ^ Thaler, Lore (2015-11-25). "Using Sound to Get Around". APS Observer. 28 (10). Retrieved 2016-04-22.
  9. ^ a b Wallmeier, Ludwig; Geßele, Nikodemus; Wiegrebe, Lutz (2013-10-22). "Echolocation versus echo suppression in humans". Proceedings of the Royal Society of London B: Biological Sciences. 280 (1769): 20131428. doi:10.1098/rspb.2013.1428. ISSN 0962-8452. PMC 3768302. PMID 23986105.
  10. ^ Machemer, Theresa (June 4, 2021). "People Can Learn Echolocation in Ten Weeks". Smithsonian Magazine. Retrieved 2023-06-06.
  11. ^ Rosenblum LD, Gordon MS, Jarquin L (2000). "Echolocating distance by moving and stationary listeners". Ecol. Psychol. 12 (3): 181–206. CiteSeerX 10.1.1.540.5965. doi:10.1207/S15326969ECO1203_1. S2CID 30936808.
  12. ^ Kish D. (1982). Evaluation of an echo-mobility training program for young blind people: Master's Thesis, University of Southern California (Thesis).
  13. ^ Thaler L, Arnott SR, Goodale MA (2011). "Neural correlates of natural human echolocation in early and late blind echolocation experts". PLOS ONE. 6 (5): e20162. Bibcode:2011PLoSO...620162T. doi:10.1371/journal.pone.0020162. PMC 3102086. PMID 21633496.
  14. ^ Bat Man, Reader's Digest, June 2012, archived from the original on March 15, 2014, retrieved March 14, 2014
  15. ^ Thaler, L., Milne, J. L., Arnott, S. R., Kish, D., & Goodale, M. A. (2014). Neural correlates of motion processing through echolocation, source hearing, and vision in blind echolocation experts and sighted echolocation novices. Journal of Neurophysiology, 111(1), 112-127.
  16. ^ "World Access Online".
  17. ^ a b c Kremer, William (12 September 2012). "Human echolocation: Using tongue-clicks to navigate the world". BBC. Retrieved September 12, 2012.
  18. ^ a b Kish, Daniel (1995), Evaluation of an Echo-Mobility Program for Young Blind People, Master's thesis, San Bernardino, CA: Department of Psychology, California State University, p. 277, archived from the original on February 2, 2002
  19. ^ "Thomas Tajo". Visioneers.org. 2017-11-20. Retrieved 2023-01-22.
  20. ^ "Humans With Amazing Senses". ABC News.
  21. ^ Moorhead, Joanna (January 27, 2007). "Seeing with sound". The Guardian. London.
  22. ^ "How A Blind Teen 'Sees' With Sound". CBS News. July 19, 2006.
  23. ^ "The Boy Who Sees with Sound". People Magazine.
  24. ^ Scadden, Lawrence. Surpassing Expectations: Life Without Sight.
  25. ^ "Recognizing excellence—The Lawrence Scadden Teacher of the Year Award | NSTA". www.nsta.org. Retrieved 2023-09-21.
  26. ^ "Lucas learns echo technique to 'see'". Daily Echo. 7 October 2009. Retrieved 8 October 2009.
  27. ^ "Ben Underwood | Blind Boy Who Could See".
  28. ^ "Blind boy uses his ears to 'see'". BBC News. 5 October 2009. Retrieved 8 October 2009.
  29. ^ "Visibility Scotland - Listening and responding to people affected by sight loss across Scotland". Visibility Scotland.
  30. ^ Irvine, Chris (5 October 2009). "Seven year old blind boy uses echoes to see". The Daily Telegraph. London. Retrieved 8 October 2009.
  31. ^ "Action for Blind People merged with RNIB". RNIB - See differently. 23 March 2017.
  32. ^ Cartlidge, Sarah (19 May 2019). ""The best work experience ever": Blind teenager enjoys "phenomenal" placement". Bournemouth Echo. Retrieved 28 May 2020.
  33. ^ Warwick K, Hutt B, Gasson M, and Goodhew I. "An attempt to extend human sensory capabilities by means of implant technology". Proc. IEEE International Conference on Systems, Man and Cybernetics, Hawaii, October 2005. pp. 1663–1668.
  34. ^ Skrebels, Joe (May 25, 2017). "Perception Review". IGN. Retrieved May 25, 2017.
  35. ^ Grierson, Tim. "Imagine". ScreenDaily. Retrieved June 21, 2017.
  36. ^ Collins, Suzanne (2007). Gregor and the Code of Claw (1st ed.). New York. ISBN 978-0-439-79143-4. OCLC 74568049.
