Computations structuring auditory perception
The success of deep learning networks at complex perceptual tasks such as image or speech recognition highlights the importance of non-linear computations for constructing invariant representations of relevant objects and signals from the environment. One ongoing project in the lab exploits the power of two-photon imaging to extensively characterize the non-linearities implemented in the mouse auditory system. By combining imaging with behavior, we aim to identify the role of these computations in shaping sound perception. For example, we recently showed in mouse auditory cortex that non-linear computations allow the construction of divergent representations of sounds whose intensity profiles ramp in opposite directions, an observation that correlates with the divergent perception of these sounds in humans.
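As a toy illustration of the idea (not the lab's actual analysis; the channel names and ramp values are illustrative assumptions), a simple rectifying non-linearity applied to the derivative of the intensity envelope is enough to make rising and falling ramps diverge, even though a linear readout of total sound energy cannot tell them apart:

```python
import numpy as np

def ramp_responses(intensity):
    """Responses of two hypothetical rectified channels to an intensity envelope."""
    d = np.diff(intensity)                  # derivative of the envelope
    up_channel = np.maximum(d, 0).sum()     # rectified intensity increases
    down_channel = np.maximum(-d, 0).sum()  # rectified intensity decreases
    return up_channel, down_channel

up_ramp = np.linspace(0.2, 1.0, 5)   # rising intensity profile
down_ramp = up_ramp[::-1]            # same samples, falling profile

# Total energy is identical for both ramps, so a linear sum is blind
# to ramp direction; the rectified channels respond in opposite ways.
print(ramp_responses(up_ramp))   # up channel active, down channel silent
print(ramp_responses(down_ramp)) # down channel active, up channel silent
```

The key point is that rectification makes the representation non-invertible with respect to time reversal, which is one minimal way such divergent codes can arise.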
Manipulating neuronal representations of sounds
Beyond deciphering the neural activity patterns evoked by auditory stimulation, a major challenge is to establish causal links between these patterns and perception. Toward this goal, we use light-shaping methods to generate patterns of activity in cortex and test whether these artificial "auditory" stimuli can drive behaviors or interfere with perceptual decisions.
Reinforcement learning models of sensory discrimination tasks
Appetitive discrimination tasks are pivotal for understanding how animals perceive external stimuli. Yet every mouse learns such a task at its own pace and with its own dynamics. We have developed compact, biologically inspired reinforcement learning models with which we can formulate precise hypotheses about the neural and synaptic factors that generate inter-individual variability. Using these models to interpret mouse behavior also helps us understand which features of sensory representations matter for discrimination learning.
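A minimal sketch of this kind of model (not the lab's actual implementation; stimulus names, the value-as-lick-probability readout, and the reward/omission learning-rate split are illustrative assumptions) is a Rescorla-Wagner-style rule for a go/no-go task, where separate learning rates for rewarded and unrewarded licks provide one hypothetical knob for inter-individual variability in learning speed:

```python
import random

def simulate_discrimination(alpha_reward=0.1, alpha_omission=0.1,
                            n_trials=1000, seed=0):
    """Rescorla-Wagner-style model of an appetitive go/no-go task.

    Each stimulus carries a learned value in [0, 1], read out directly
    as the probability of licking. Values start at 0.5, standing in
    for the initial lick bias of a thirsty mouse.
    """
    rng = random.Random(seed)
    values = {"S+": 0.5, "S-": 0.5}  # S+ rewarded, S- unrewarded
    for _ in range(n_trials):
        stim = "S+" if rng.random() < 0.5 else "S-"
        lick = rng.random() < values[stim]
        if not lick:
            continue  # no outcome observed, so no prediction error
        reward = 1.0 if stim == "S+" else 0.0
        delta = reward - values[stim]  # reward prediction error
        lr = alpha_reward if reward else alpha_omission
        values[stim] += lr * delta     # delta-rule value update
    return values
```

With a high learning rate the model separates the two stimulus values within a few hundred trials, while lowering either rate slows acquisition, mimicking slow learners.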
Auditory-visual interactions in cortex
In both humans and mice, the cortex is a wide network of broadly interconnected areas, and the role of this recurrent architecture is a fundamental question. Recent results show that areas dedicated to the auditory and visual modalities are strongly connected. Because this is a striking example of recurrent coupling across brain networks, we have started to characterize precisely the information conveyed by this connection and how it impacts visual processing. For example, we recently found that the sign of the effect of auditory cortex inputs on visual cortex is context-dependent: negative in the dark, when vision cannot help decipher sound information, and positive in the light.
Olfactory-tactile interactions in cortex
Mice are nocturnal animals that often rely on their sense of smell and on whisker contacts to explore objects and conspecifics in their environment. To better understand the synergies between these ecologically crucial inputs, we are exploring how touch and smell information is combined in cortical circuits to eventually refine and stabilize object recognition.