Presented By: Department of Psychology
CCN Forum: Visual speech modulates auditory speech perception in the auditory cortex through multiple distinct mechanisms
Karthik Ganesan, Graduate Student, Cognition and Cognitive Neuroscience
Abstract:
Visual speech cues are often needed to disambiguate distorted speech sounds in the natural environment. However, understanding how the brain encodes and transmits visual information for use by the auditory system remains a challenge. One persistent question is whether visual signals have a unitary effect on auditory processing or elicit multiple distinct effects throughout auditory cortex. In this talk, I will present results from our group that attempt to better understand how vision modulates speech processing. I will discuss neural responses to audiovisual speech recorded from intracranially implanted electrodes, as well as pilot results from a complementary fMRI study. These data show that visual speech modulates auditory processes in the auditory cortex in multiple ways, eliciting temporally and spatially distinct patterns of activity that differ across the theta, beta, and high-gamma frequency bands. These results are consistent with models that posit multiple distinct mechanisms supporting audiovisual speech perception.
Livestream Information
Livestream: February 26, 2021 (Friday), 2:00pm
Joining Information Not Yet Available