
Ear sounds reveal eye movements


Summary: Researchers have discovered that the ear emits subtle sounds in response to eye movements, allowing them to determine where someone is looking.

The study demonstrates that these faint sounds, likely produced by middle-ear muscle contractions or hair cell activation, can reveal the position of the eyes.

This finding challenges existing beliefs about the function of the ear, suggesting that sounds from the ear could help synchronize visual and sound perception. The team’s innovative approach could lead to new clinical hearing tests and a deeper understanding of sensory integration.

Highlights:

  1. Research has found that subtle sounds from the ears correspond to eye movements, providing insight into where a person is looking.
  2. This phenomenon is likely caused by the brain’s coordination of eye movements with ear muscle contractions or hair cell activation.
  3. The findings pave the way for new clinical tests and a better understanding of how the brain integrates visual and auditory information.

Source: Duke University

Scientists can now determine where a person’s eyes are looking simply by listening to their ears.

“You can actually estimate eye movement, the position of the target that the eyes are going to look at, just from recordings made with a microphone in the ear canal,” said Jennifer Groh, Ph.D., lead author of the new report and a professor in the departments of psychology and neuroscience as well as neurobiology at Duke University.


In 2018, Groh’s team discovered that the ears emit a subtle, imperceptible noise when the eyes move. In a new report published the week of November 20 in the journal Proceedings of the National Academy of Sciences, the Duke team now shows that these sounds can reveal where your eyes are looking.

It also works the other way around. Simply by knowing where someone is looking, Groh and her team were able to predict what the ear’s subtle sound waveform would look like.

According to Groh, these sounds may be generated when eye movements prompt the brain to contract either the middle-ear muscles, which typically help dampen loud sounds, or the hair cells that help amplify quiet ones.

The exact purpose of these ear sounds is unclear, but Groh’s initial hunch is that they might help sharpen people’s perception.

“We think it’s part of a system that allows the brain to match the location of images and sounds, even though our eyes can move while our head and ears don’t,” Groh said.

Understanding the relationship between subtle sounds in the ear and vision could lead to the development of new clinical tests for hearing.

“If each part of the ear contributes individual rules for the eardrum signal, then they could be used as a sort of clinical tool to assess which part of the ear anatomy is dysfunctional,” said Stephanie Lovich, one of the lead authors of the paper and a graduate student in psychology and neuroscience at Duke.

Just as the pupils of the eye contract or dilate like the aperture of a camera to adjust the amount of light entering, the ears also have their own way of regulating hearing. Scientists have long thought that these sound regulation mechanisms only help to amplify quiet sounds or attenuate loud sounds.

But in 2018, Groh and his team discovered that these same sound regulation mechanisms were also activated by eye movements, suggesting that the brain informs the ears about eye movements.

In their latest study, the research team followed up on their initial finding and examined whether faint auditory signals contained detailed information about eye movements.

To decode the sounds from people’s ears, Groh’s team at Duke and University of Southern California professor Christopher Shera, Ph.D., recruited 16 adults with intact vision and hearing to Groh’s lab in Durham to take a fairly simple eye test.

Participants looked at a static green dot on a computer screen, then, without moving their head, followed the dot with their eyes as it disappeared, then reappeared up, down, left, right, or diagonally from the starting point. This gave Groh’s team a wide range of auditory signals generated when the eyes moved horizontally, vertically or diagonally.

An eye tracker recorded where participants’ pupils darted so the movements could be compared against the ear sounds, which were captured using a pair of earbuds with built-in microphones.

The research team analyzed the ear sounds and found unique signatures for each direction of movement. This allowed them to decipher the ear’s sound code and calculate where people were looking simply by scrutinizing a sound wave.
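To make the decoding idea concrete, here is a minimal sketch in Python of this kind of analysis, using entirely synthetic data: per-direction waveform templates are estimated from one set of trials by least squares, and held-out recordings are then projected onto those templates to estimate gaze displacement. The templates, trial counts, and noise model are illustrative assumptions, not the study’s actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_samples = 200, 400

# Hypothetical dataset: each trial pairs an eye displacement (dx, dy),
# in degrees, with an ear-canal waveform recorded around the eye movement.
targets = rng.uniform(-20, 20, size=(n_trials, 2))

# "True" templates, used here only to simulate the microphone recordings.
t = np.linspace(0.0, 0.04, n_samples)
true_templates = np.stack([np.sin(2 * np.pi * 75 * t),
                           np.cos(2 * np.pi * 75 * t)])
waveforms = targets @ true_templates + 0.8 * rng.standard_normal((n_trials, n_samples))

# Training: least squares estimates one waveform template per gaze
# component (horizontal, vertical) from the first half of the trials.
train, test = slice(0, 100), slice(100, None)
templates, *_ = np.linalg.lstsq(targets[train], waveforms[train], rcond=None)

# Decoding: project each held-out waveform onto the learned templates
# to estimate where the eyes moved on that trial.
decoded, *_ = np.linalg.lstsq(templates.T, waveforms[test].T, rcond=None)
mean_err = np.abs(decoded.T - targets[test]).mean()
print(f"mean absolute decoding error: {mean_err:.2f} degrees")
```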

“Since a diagonal eye movement is just a horizontal component and a vertical component, my labmate and co-author David Murphy realized that you can take these two components and guess what they would be if you put them together,” Lovich said.

“Then you can go in the opposite direction and observe an oscillation to predict that someone was looking 30 degrees to the left.”
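A small sketch of that superposition idea, again with invented waveforms: the predicted sound of a diagonal eye movement is the weighted sum of a horizontal and a vertical template, and regressing an observed oscillation onto the same two templates recovers the gaze shift (here, 30 degrees to the left). All signals are hypothetical stand-ins for measured ear-canal recordings.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.05, 500)  # 50 ms of ear-canal signal

# Hypothetical unit templates for purely horizontal and purely
# vertical eye movements (stand-ins for measured waveforms).
h_template = np.sin(2 * np.pi * 60 * t) * np.exp(-t / 0.02)
v_template = np.cos(2 * np.pi * 60 * t) * np.exp(-t / 0.02)

# Forward: the predicted sound of any eye movement is the weighted
# sum of the two components (a diagonal is just both at once).
dx_true, dy_true = -30.0, 0.0  # 30 degrees to the left
observed = dx_true * h_template + dy_true * v_template
observed = observed + 0.5 * rng.standard_normal(t.size)  # mic noise

# Inverse: regress the observed oscillation onto the templates to
# recover the eye movement that produced it.
A = np.column_stack([h_template, v_template])
(dx_est, dy_est), *_ = np.linalg.lstsq(A, observed, rcond=None)
print(f"estimated gaze shift: {dx_est:.1f} deg horizontal, {dy_est:.1f} deg vertical")
```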

Groh is now beginning to examine whether these ear sounds play a role in perception.

One set of projects focuses on how the sounds of eye and ear movements may be different in people who are hearing or visually impaired.

Groh is also testing whether people who do not have hearing or vision loss will generate auditory signals that can predict their performance in a sound localization task, such as spotting where an ambulance is while driving, which relies on mapping auditory information onto a visual scene.

“Some people get a really repeatable signal on a daily basis and you can measure it quickly,” Groh said. “You would expect these people to be really good at a visual-auditory task compared to others, where it’s more variable.”

Funding: Groh’s research was supported by a grant from the National Institutes of Health (NIDCD DC017532).

About this visual and auditory neuroscience research news

Author: Dan Vahaba
Source: Duke University
Contact: Dan Vahaba – Duke University
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Parametric information about eye movements is sent to the ears” by Jennifer Groh et al. PNAS


Abstract

Parametric information about eye movements is sent to the ears

When the eyes move, the alignment between the visual and auditory scenes changes. We are not perceptually aware of these changes, indicating that the brain must integrate precise information about eye movements into auditory and visual processing.

We show here that small sounds generated in the ear by the brain contain precise information about contemporaneous eye movements in the spatial domain: the direction and amplitude of eye movements could be inferred from these small sounds.

The underlying mechanism(s) likely involve the different motor structures of the ear and could facilitate the translation of incoming auditory signals into a frame of reference anchored to eye direction and thus to the visual scene.
