1. Neural mechanisms underlying self-motion perception for spatial navigation
Accurate estimation of self-motion through space requires both visual (optic flow) and vestibular (inertial motion) cues. We use psychophysics, neurophysiology, and computational modeling in awake, behaving nonhuman primates to study how the brain employs visual and vestibular signals to form precise heading estimates. Specifically, macaques are trained to judge heading direction using visual, vestibular, or combined cues. We simultaneously record from multiple areas in the parietal cortex and manipulate their activity by delivering electrical currents or chemical agents to test causal links between neural activity and the animals' perceptual choices.
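For illustration, here is a minimal sketch of how choices in such a heading-discrimination task are commonly summarized: a cumulative-Gaussian psychometric function is fit to the proportion of "rightward" choices, with the fitted mean giving the bias (point of subjective equality) and the fitted standard deviation giving the discrimination threshold. The headings and choice proportions below are invented, and this is not the lab's actual analysis code.

```python
# Fit a cumulative-Gaussian psychometric function to invented heading-choice data.
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

headings = np.array([-9.0, -3.0, -1.0, 0.0, 1.0, 3.0, 9.0])     # heading (deg), rightward positive
p_right = np.array([0.02, 0.15, 0.35, 0.50, 0.68, 0.88, 0.99])  # proportion of "rightward" choices

def psychometric(x, mu, sigma):
    """P(choose rightward | heading x), modeled as a cumulative Gaussian."""
    return norm.cdf(x, loc=mu, scale=sigma)

(mu, sigma), _ = curve_fit(psychometric, headings, p_right, p0=[0.0, 2.0])
print(f"bias (PSE) = {mu:.2f} deg, threshold = {sigma:.2f} deg")
```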
2. Neural mechanisms underlying optimal sensory cue integration
We live in a multisensory world: every day our brain receives redundant information from different sensory channels. Many psychophysical experiments have shown that humans integrate multiple cues to improve perception in a statistically optimal way. To explore the underlying neural mechanisms, we use nonhuman primates (macaques) to study how visual and vestibular signals converge, are integrated, and accumulate across sensory and sensorimotor-transformation areas of cortex. This research helps us understand how the brain optimizes decision making and behavioral output in complex environments where multiple pieces of information are available.
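The benchmark for "statistically optimal" here is the standard maximum-likelihood cue-integration model, in which each cue is weighted by its reliability (inverse variance) and the combined estimate is more precise than either cue alone. The sketch below illustrates this textbook model; the single-cue estimates and thresholds are invented numbers, not measured values.

```python
# Reliability-weighted (maximum-likelihood) combination of two independent Gaussian cues.
import numpy as np

def optimal_integration(est_vis, sigma_vis, est_vest, sigma_vest):
    """Combine visual and vestibular heading estimates by inverse-variance weighting."""
    w_vis = (1 / sigma_vis**2) / (1 / sigma_vis**2 + 1 / sigma_vest**2)
    est_comb = w_vis * est_vis + (1 - w_vis) * est_vest
    # Predicted combined threshold is below both single-cue thresholds.
    sigma_comb = np.sqrt(sigma_vis**2 * sigma_vest**2 / (sigma_vis**2 + sigma_vest**2))
    return est_comb, sigma_comb

est, sigma = optimal_integration(est_vis=2.0, sigma_vis=3.0, est_vest=0.0, sigma_vest=2.0)
print(f"combined estimate = {est:.2f} deg, predicted threshold = {sigma:.2f} deg")
```

With these invented numbers the more reliable vestibular cue dominates (weight ≈ 0.69), and the predicted combined threshold (≈ 1.66°) is lower than either single-cue threshold, which is the behavioral signature of optimal integration.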
Questions our lab asks
1) What are the temporal and spatial dynamics of vestibular signals for self-motion perception?
Vestibular signals play crucial roles in self-motion perception, locomotion, and navigation. Previous research has revealed that these signals are complex during self-motion, carrying multiple components in both the temporal domain (for example, velocity-like and acceleration-like response components) and the spatial domain. We are interested in the exact functional implications of these components, and in how they are integrated with other sensory signals.
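As a toy illustration of what "temporal components" can mean, the sketch below regresses a synthetic firing rate onto the velocity and acceleration profiles of a motion stimulus, one simple way velocity-like and acceleration-like components are often separated. Every number here, including the Gaussian velocity profile and the simulated response, is invented.

```python
# Decompose a synthetic neural response into velocity- and acceleration-like components.
import numpy as np

t = np.linspace(0.0, 2.0, 200)                    # time (s) within one motion epoch
velocity = np.exp(-0.5 * ((t - 1.0) / 0.3) ** 2)  # Gaussian velocity profile
acceleration = np.gradient(velocity, t)           # its time derivative

# Invented "recorded" firing rate: baseline + both components + noise.
rng = np.random.default_rng(0)
rate = 5.0 + 8.0 * velocity + 3.0 * acceleration + rng.normal(0.0, 0.5, t.size)

# Least-squares fit: rate ~ b0 + b_v * velocity + b_a * acceleration.
X = np.column_stack([np.ones_like(t), velocity, acceleration])
b0, b_v, b_a = np.linalg.lstsq(X, rate, rcond=None)[0]
print(f"velocity weight = {b_v:.2f}, acceleration weight = {b_a:.2f}")
```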
2) How are different signals, such as visual, vestibular, and auditory signals, utilized for self-motion perception?
Signals from different sensory modalities have different spatial and temporal dynamics. We investigate how the brain handles these challenges and integrates the signals to optimize perception. We have studied interactions between visual and vestibular cues extensively in past years, and we are now also exploring how auditory flow may contribute to multisensory self-motion perception, and how auditory signals are integrated with the other self-motion signals.
3) Are there causal links between different brain areas and self-motion perception?
We perform causal-link experiments using chemical inactivation and electrical microstimulation to achieve two goals. One is to test, respectively, whether a region of interest is necessary and sufficient for the behavior. The other is to reveal decoding mechanisms by manipulating a small group of neurons; in particular, we examine whether the resulting bias in behavioral choices can be predicted from the tuning functions of the artificially manipulated neurons.
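The logic behind the second goal can be shown with a toy population-decoding simulation: artificially boosting neurons tuned to one heading should pull a decoder's estimate, and hence the predicted choices, toward their preferred heading. The Gaussian tuning curves, stimulation gain, and population-vector-style decoder below are hypothetical choices for illustration, not the lab's actual model.

```python
# Toy simulation: microstimulation biases a population decoder toward the
# stimulated neurons' preferred heading.
import numpy as np

prefs = np.linspace(-90.0, 90.0, 37)  # preferred headings of a model population (deg)

def rates(heading, stim_gain=0.0, stim_pref=20.0):
    """Gaussian-tuned population response; optionally boost neurons near stim_pref."""
    r = np.exp(-0.5 * ((prefs - heading) / 30.0) ** 2)
    r += stim_gain * np.exp(-0.5 * ((prefs - stim_pref) / 10.0) ** 2)  # "microstimulation"
    return r

def decode(r):
    """Population-vector-style readout: rate-weighted mean of preferred headings."""
    return np.sum(r * prefs) / np.sum(r)

for heading in (-10.0, 0.0, 10.0):
    base = decode(rates(heading))
    stim = decode(rates(heading, stim_gain=0.5))
    print(f"heading {heading:+5.1f} deg: decoded {base:+5.1f} -> {stim:+5.1f} deg")
```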