Research reveals that your ears produce sounds when you move your eyes.
I recently attended the International Multisensory Research Forum (IMRF) in Reno, Nevada, where scientists from around the world gathered to share their research on how the multiple senses (vision, hearing, touch, etc.) interact with each other. There were fascinating talks about all sorts of sensory interactions, with a focus on visual-auditory interactions—how our visual and auditory systems share information to process events more efficiently.
As it turns out, there are many brain regions, and many stages of perceptual processing, where information is combined, or integrated, across the senses. Some studies show evidence of late integration: integration that happens after the visual cortex has processed the visual information in a scene and the auditory cortex has processed the auditory information. But there is also plenty of evidence that sensory information can combine across the senses much earlier than that. In a fascinating keynote presentation by Duke University Professor Jennifer Groh, I learned that our eyes may actually be communicating directly with our ears every time we make an eye movement.
To understand why such eye-to-ear communication would even be useful, it is important to understand how the brain combines visual and auditory information to estimate where things are. Visual information that enters our eyes produces a retinotopic map of the world based on how light hits our retinas. Depending on where our eyes are pointed at any given moment, our retinas receive a particular image of the world that is specific to that direction of gaze. As soon as we make a new eye movement, all the visual information gets remapped to different areas of our retinas. Yet, we are able to maintain a stable perception of the world, in part because our brain knows how much our eyes move each time, and can counteract that motion to keep a stable representation of our surroundings.
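To make that remapping concrete, here is a minimal Python sketch (my illustration, not from the article) that treats directions as horizontal angles: each eye movement shifts where an object lands on the retina, but adding the gaze direction back recovers a stable world position.

```python
def retinal_position(object_deg: float, gaze_deg: float) -> float:
    """Where an object lands on the retina, measured relative to gaze.

    object_deg: the object's direction relative to the head (0 = straight
    ahead, positive = rightward); gaze_deg: where the eyes are pointed.
    """
    return object_deg - gaze_deg

# An object fixed 5 degrees to the right, viewed before and after a
# 15-degree rightward saccade. The retinal image shifts, but adding the
# gaze direction back recovers the same stable world position:
for gaze in (0.0, 15.0):
    r = retinal_position(5.0, gaze)
    print(f"gaze {gaze:+5.1f} deg: retinal {r:+5.1f} deg, "
          f"recovered world {r + gaze:+5.1f} deg")
```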
Our ears work differently. When we hear a sound, we are able to estimate the direction of the sound source based on how the sound wave hits our two ears. For example, a sound coming from your right side will arrive at your right ear a fraction of a millisecond sooner, and a tiny bit louder, than it arrives at your left ear. Based on these differences in timing and loudness between the auditory information processed by our two ears, the brain can estimate the general direction a sound came from, relative to our head position.
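To put rough numbers on that timing difference, here is a simple Python sketch (my illustration; the 18-cm ear spacing and the straight-path model are idealized assumptions) estimating the interaural time difference for a sound at a given angle:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees Celsius
EAR_SPACING = 0.18      # meters; a rough distance between the two ears

def interaural_time_difference(azimuth_deg: float) -> float:
    """Idealized interaural time difference (ITD), in seconds.

    azimuth_deg: sound direction relative to straight ahead
    (0 = front, +90 = directly to the right). Uses the straight-path
    approximation ITD = (d / c) * sin(azimuth), ignoring head curvature.
    """
    return (EAR_SPACING / SPEED_OF_SOUND) * math.sin(math.radians(azimuth_deg))

# A sound 45 degrees to the right reaches the right ear ~0.37 ms sooner:
print(f"{interaural_time_difference(45.0) * 1000:.2f} ms")
```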
But in order to integrate information from vision and audition, our brain needs to somehow combine visual information that is based in a retinal reference frame with auditory information that is based in a head-centered reference frame. How the brain does this so quickly and efficiently is still a mystery, but research by Professor Groh’s team offers compelling new clues.
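One standard way to picture the computation (a textbook-style sketch, not a claim about Professor Groh’s model; the reliability values below are made up) is a two-step process: shift the visual estimate into head coordinates using the current eye position, then average the two estimates weighted by their reliability:

```python
def integrate(visual_retinal_deg: float, eye_position_deg: float,
              auditory_head_deg: float,
              visual_var: float = 1.0, auditory_var: float = 9.0) -> float:
    """Combine a retinal visual estimate with a head-centered auditory one.

    Step 1: convert the visual direction to head coordinates by adding the
    current eye-in-head position. Step 2: apply the reliability-weighted
    average from maximum-likelihood cue-combination models (weights are
    the inverse variances of each estimate).
    """
    visual_head_deg = visual_retinal_deg + eye_position_deg
    w_v, w_a = 1.0 / visual_var, 1.0 / auditory_var
    return (w_v * visual_head_deg + w_a * auditory_head_deg) / (w_v + w_a)

# Eyes rotated 20 degrees right, object 8 degrees left of gaze, ears
# reporting about 12 degrees right of the head's midline -- once both
# estimates are in the same reference frame, they agree:
print(f"{integrate(-8.0, 20.0, 12.0):.1f} deg")  # -> 12.0 deg
```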
Her research team showed that every time our eyes move, a tiny sound is produced in our ear canals, called an “eye-movement-related eardrum oscillation” (or EMREO). They discovered these sound waves by placing tiny microphones inside people’s ear canals, where the microphones could clearly pick up the sounds. An accompanying video in the original article presents amplified versions of these eye-movement-related sounds.
Intriguingly, these sounds are not only present but also informative about the eye movements that produced them. The research team was able to decode how much the eyes moved, and in what direction, based on the amplitude and frequency of these EMREOs. In other words, they could reconstruct the direction and magnitude of an eye movement based only on the sounds produced in the ear canal.
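To illustrate what such decoding could look like, here is a self-contained Python sketch (simulated data, not the study’s actual analysis pipeline; the papers below describe regression analyses on real in-ear recordings) that fits a linear decoder mapping waveforms to saccade vectors:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Simulated stand-in for real data: one "EMREO" waveform per saccade
# (n_trials x n_samples of microphone signal) and the corresponding
# horizontal/vertical saccade displacement in degrees (n_trials x 2).
rng = np.random.default_rng(0)
n_trials, n_samples = 500, 200
saccades = rng.uniform(-18.0, 18.0, size=(n_trials, 2))

# Make each waveform a saccade-dependent mix of two fixed templates,
# plus noise -- a toy version of a parametric eye-movement signature.
templates = rng.standard_normal((2, n_samples))
waveforms = saccades @ templates + 0.5 * rng.standard_normal((n_trials, n_samples))

# Fit a linear decoder on the first 400 trials, test on the last 100.
decoder = LinearRegression().fit(waveforms[:400], saccades[:400])
predicted = decoder.predict(waveforms[400:])

mean_error = np.abs(predicted - saccades[400:]).mean()
print(f"mean decoding error on held-out trials: {mean_error:.2f} degrees")
```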
Further research is needed to determine whether the brain actually relies on these auditory signals to aid its representation of the world, but Professor Groh’s work reveals that multisensory integration can start at the very earliest stages of sensory processing.
References
Gruters, K. G., Murphy, D. L., Jenson, C. D., Smith, D. W., Shera, C. A., & Groh, J. M. (2018). The eardrums move when the eyes move: A multisensory effect on the mechanics of hearing. Proceedings of the National Academy of Sciences, 115(6), E1309-E1318.
Lovich, S. N., King, C. D., Murphy, D. L., Landrum, R. E., Shera, C. A., & Groh, J. M. (2023). Parametric information about eye movements is sent to the ears. Proceedings of the National Academy of Sciences, 120(48), e2303562120.
Source link : https://www.psychologytoday.com/za/blog/illusions-delusions-and-reality/202406/how-your-eyes-communicate-with-your-ears?amp
Publish date : 2024-06-28 22:56:01
Copyright for syndicated content belongs to the linked Source.