Research

Computations structuring auditory perception

The success of deep learning networks in performing complex perceptual tasks such as image or speech recognition highlights the importance of non-linear computations for constructing invariant representations of relevant objects and signals from the environment. One ongoing project in the lab aims at exploiting the power of two-photon imaging to extensively characterize the non-linearities implemented in the mouse auditory system. By combining imaging with behavior, we try to identify the neural computations that shape sound perception.

Manipulating neuronal representations of sounds: towards auditory cortical implants

Beyond deciphering the neural activity patterns produced by auditory stimulation, a major challenge is to establish causal links between these patterns and perception. To work towards this goal, we are using mesoscopic (DMDs) and single-cell resolution (two-photon) light-shaping methods to generate patterns of activity in the cortex and test whether these artificial "auditory" stimuli can drive behaviors, interfere with perceptual decisions, or even reconstruct perception.

This fundamental aim aligns with the need to find new avenues for sensory rehabilitation. In this respect, the Bathellier lab currently leads the EU-funded Hearlight project, which brings together 6 teams working to establish a proof of concept in mice that cortical implants can be used for auditory rehabilitation. This project involves mesoscopic patterned optogenetics, deep learning methods, bioelectronics and implant-driven behavior in mice. Here is a movie to learn more about this project.

Reinforcement learning models of sensory discrimination tasks

Appetitive discrimination tasks are pivotal for understanding how animals perceive external stimuli. Yet every mouse learns such tasks at its own pace and with its own dynamics. We have developed compact, biologically inspired reinforcement learning models with which we can propose precise hypotheses about the neural and synaptic factors generating inter-individual variability. Using these models to interpret mouse behavior also helps us understand which features of sensory representations are important for discrimination learning.
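To give a flavor of this modeling approach, the sketch below simulates a go/no-go sound discrimination task with a simple Rescorla-Wagner-style value update. This is an illustrative toy example only: the stimulus labels, learning rate, and read-out function are assumptions for the sketch, not the lab's published model.

```python
import math
import random

def simulate_learning(n_trials=2000, alpha=0.1, beta=3.0, seed=0):
    """Toy reinforcement learning model of a go/no-go discrimination task.

    alpha: learning rate of the prediction-error update (assumed value).
    beta:  steepness of the sigmoid mapping value to lick probability.
    """
    rng = random.Random(seed)
    # One learned value per stimulus: S+ (rewarded) and S- (unrewarded).
    values = {"S+": 0.0, "S-": 0.0}
    correct = []
    for _ in range(n_trials):
        stim = rng.choice(["S+", "S-"])
        # Sigmoid read-out of the stimulus value -> probability of licking.
        p_lick = 1.0 / (1.0 + math.exp(-beta * (values[stim] - 0.5)))
        lick = rng.random() < p_lick
        reward = 1.0 if (stim == "S+" and lick) else 0.0
        if lick:
            # Rescorla-Wagner prediction-error update, applied on responses.
            values[stim] += alpha * (reward - values[stim])
        correct.append((stim == "S+") == lick)
    # Return learned values and accuracy over the last 200 trials.
    return values, sum(correct[-200:]) / 200

values, late_accuracy = simulate_learning()
```

Varying parameters such as the learning rate or the read-out steepness across simulated animals is one way such compact models can generate hypotheses about inter-individual differences in learning dynamics.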

Uni- and multimodal representation workspace in the cortex

The cortex is a wide network of broadly interconnected areas, both in humans and mice, and the role of this recurrent architecture is a fundamental question. One likely consequence of this architecture, shown in multiple studies, is that even early sensory areas of the cortex carry information about the context of the sensory scene. We are interested in the role of these multimodal workspaces. Recent results from our lab have described what type of auditory information is primarily channeled to the visual cortex in mice and how it can boost representations of visual stimuli coincident with abrupt sounds.

Mice are also nocturnal animals that often rely on their sense of smell and on whisker contacts to explore objects and conspecifics in their environment. Interestingly, we showed that olfactory information coexists with tactile information in the whisker (barrel) cortex. Our long-term aim is to understand how this information arises in the cortex beyond primary sensory areas and integrates into the cognitive workspace.

Understanding working memory for sounds and long term sensory predictions  

Related to the transfer of sensory information into a more cognitive workspace, the lab is also interested in how the brain deals with the long-term temporal structure of auditory information. We are currently following two axes. First, we use working memory tasks to investigate how cognitive mechanisms can exploit the information contained in a sequence of distinct sensory events. Second, we investigate how the auditory cortex encodes regularities in temporal sequences of sounds.

The role of brain state in sensory processing

Finally, we are interested in how sensory processing is affected by brain states. We have recently shown that anesthesia profoundly transforms the encoding of sounds and the ongoing dynamics of the auditory cortex compared to wakefulness. We are currently investigating the impact of sleep on auditory cortical processing.