Brice Bathellier's team combines advanced analytical and modeling techniques with a broad array of experimental approaches: two-photon calcium imaging, multichannel electrophysiology, optogenetics, and behavioral analyses of auditory perception.
The team's principal projects include the large-scale deciphering of sound representation in the mouse auditory system, the development of optogenetic methods for generating auditory percepts through targeted activation of central neural networks, and the exploration of how neuronal connections between brain areas processing different sensory modalities contribute to perception.
Operations structuring auditory perception
The success of deep learning networks in complex perception tasks, such as image or word recognition, has highlighted the importance of non-linear operations for constructing invariant representations of relevant objects and signals. One of the team's current projects aims to harness the power of two-photon imaging to identify precisely the non-linearities implemented in the auditory system. Combining this approach with behavioral tasks, the researchers are trying to identify the operations that are key to the development of sound perception. For example, they have recently shown that, in the mouse auditory cortex, non-linear operations allow the construction of divergent representations of opposite variations in sound intensity, consistent with the divergent perception of these directions of variation in humans.
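The role of non-linearities can be illustrated with a toy model (an assumption for illustration, not the team's actual analysis): a purely linear unit responds to rising and falling intensity ramps with merely sign-flipped outputs, whereas rectified "ON" and "OFF" channels, a classic non-linearity, route the two directions of variation into disjoint channels, producing genuinely divergent representations.

```python
import numpy as np

# Two opposite sound-intensity profiles over time
t = np.linspace(0.0, 1.0, 101)
ramp_up = t        # rising intensity
ramp_down = 1 - t  # falling intensity

def linear_unit(stim):
    # linear temporal-derivative readout: net intensity change
    return np.diff(stim).sum()

def on_unit(stim):
    # rectified derivative: responds only to intensity increases
    return np.maximum(np.diff(stim), 0.0).sum()

def off_unit(stim):
    # rectified derivative: responds only to intensity decreases
    return np.maximum(-np.diff(stim), 0.0).sum()

# Linear readout: the two ramps evoke the same signal with opposite sign,
# i.e. a redundant (merely inverted) code.
print(round(linear_unit(ramp_up), 2), round(linear_unit(ramp_down), 2))

# Rectified population: each ramp direction activates a distinct channel,
# i.e. a divergent representation of opposite intensity variations.
print(round(on_unit(ramp_up), 2), round(off_unit(ramp_up), 2))
print(round(on_unit(ramp_down), 2), round(off_unit(ramp_down), 2))
```

The rectification here is only a stand-in for whatever non-linearities the imaging experiments actually reveal; the point is that without it, opposite stimuli cannot acquire independent representations.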
Manipulation of the neuronal representations of sounds
Beyond deciphering the patterns of neuronal activity evoked by auditory stimulation, establishing causal links between these patterns and perception is a major challenge. The team's researchers are pursuing this objective by using light-patterning methods to generate patterns of cortical activity, and determining whether these artificial "auditory" stimuli can trigger behaviors or interfere with perceptual decisions.
Reinforcement learning models for sensory discrimination tasks
Sensory discrimination tasks are essential for studying how animals perceive external stimuli. Surprisingly, however, each mouse learns each type of task at its own rate and with its own dynamics. Brice Bathellier's team has developed biologically inspired reinforcement learning models to describe how the synapses transmitting auditory information to decision centers change as a sound becomes associated with a behavioral decision.
Using these models to interpret the learning of an association between a sound and a particular behavior will make it possible to understand the causes of interindividual variability in learning, and to determine which characteristics of auditory, or more generally sensory, representations are important for accelerating learning.
Multisensory interactions in the cortex
The cortex is a vast network of highly interconnected areas, and the role of this recurrent architecture is a fundamental issue for our understanding of sensory perception. Recent studies have shown that the cortical areas dedicated to hearing and vision are strongly connected. The researchers have begun to characterize precisely the information transmitted by this connection and its impact on visual processing. They have shown, for example, that this impact can be negative or positive depending on the sensory context: negative in the dark, because vision cannot account for sound information in the absence of light, and positive when visual information is available. The team's researchers are also exploring how tactile and olfactory information is combined in cortical circuits to refine and stabilize object recognition, and studying the impact of these two senses on auditory perception.
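The context-dependent sign of the auditory influence on visual cortex can be caricatured as follows (a purely illustrative toy, with made-up drive values, not the team's circuit model): the same auditory input suppresses a visual unit in the dark and boosts it when visual evidence is present.

```python
def visual_unit(visual_drive, auditory_drive, light_on):
    """Rectified visual-cortex unit receiving a cross-modal auditory input.

    The sign of the auditory -> visual influence flips with the visual
    context: facilitation when light (visual evidence) is available,
    suppression in the dark.
    """
    gain = 1.0 if light_on else -1.0
    return max(visual_drive + gain * auditory_drive, 0.0)

# With light: the sound boosts the visually evoked response
print(visual_unit(0.5, 0.3, light_on=True))   # 0.8

# In the dark: the same sound suppresses residual visual activity
print(visual_unit(0.2, 0.3, light_on=False))  # 0.0
```

The hard-coded context switch stands in for whatever circuit mechanism implements the sign change in vivo, which is precisely what the recordings aim to identify.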