Characterizing the development of neural mechanisms underlying the co-learning of speech perception and production
Learning to speak, from simple babbling to the pronunciation of complex sentences, requires learning to combine articulatory movements with abstract linguistic knowledge. Our perceptual system develops to understand speech as it is produced by the articulatory system, suggesting that the perception of speech sounds is shaped by the process of their production.
Recent studies have shown that neural oscillations are a key computational principle in both speech perception and production. Our project aims to better characterize the neural oscillatory principles underlying the acquisition of speech perception and production, as well as the emergence of their interplay, thereby contributing to a better understanding of speech development. To this end, we will test whether visual speech stimulation modulates the top-down influence of the motor cortex on auditory activity by recording neuronal oscillations with magnetoencephalography (MEG), in addition to analyzing child vocalizations.