Audiovisual speech perception
Speech comprehension under challenging listening conditions
Neural entrainment to speech
Multimodal language use
I'm currently working on a project investigating how seeing a speaker's face influences the synchronization ("entrainment") of neural activity to speech in difficult listening situations. The key question is how neural entrainment relates to speech comprehension. Specifically, I ask whether the comprehension benefit of seeing the speaker under challenging listening conditions can be explained by increased phase coherence between brain oscillations and the speech rhythm.
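One common way to quantify this kind of phase coherence is the phase-locking value (PLV): the phases of the neural signal and the speech envelope are extracted in a frequency band of interest (e.g. the theta band, roughly matching the syllable rate), and the consistency of their phase difference over time is measured. The sketch below is a minimal, illustrative implementation using NumPy and SciPy on simulated data; the function name, band choice, and toy signals are my own assumptions, not the project's actual analysis pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_locking_value(neural, envelope, fs, band=(4.0, 8.0)):
    """Phase coherence (PLV) between a neural signal and a speech
    envelope within a frequency band (theta band by default).
    Returns a value between 0 (no phase locking) and 1 (perfect)."""
    # Band-pass filter both signals, then extract instantaneous phase
    # via the analytic signal (Hilbert transform).
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phase_neural = np.angle(hilbert(filtfilt(b, a, neural)))
    phase_env = np.angle(hilbert(filtfilt(b, a, envelope)))
    # PLV: length of the mean unit vector of the phase differences.
    return np.abs(np.mean(np.exp(1j * (phase_neural - phase_env))))

# Toy example: a signal entrained to a 5 Hz "syllable rhythm" vs. pure noise.
rng = np.random.default_rng(0)
fs = 200.0
t = np.arange(0, 10, 1 / fs)
rhythm = np.sin(2 * np.pi * 5 * t)            # simulated speech rhythm
entrained = rhythm + 0.5 * rng.standard_normal(t.size)
noise = rng.standard_normal(t.size)
print(phase_locking_value(entrained, rhythm, fs))  # high: phases track the rhythm
print(phase_locking_value(noise, rhythm, fs))      # low: no systematic relation
```

In real analyses the "neural" signal would be a band-limited EEG/MEG channel and the "envelope" the broadband amplitude envelope of the speech audio, and PLV would typically be computed per trial and compared across conditions (e.g. audio-only vs. audiovisual).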