
Publication: The medial entorhinal cortex encodes multisensory spatial information.

First Author: Nguyen D
Year: 2024
Journal: Cell Rep
Volume: 43
Issue: 10
Pages: 114813
PubMed ID: 39395171
MGI J Number: J:358156
MGI ID: MGI:7779359
DOI: 10.1016/j.celrep.2024.114813
Citation: Nguyen D, et al. (2024) The medial entorhinal cortex encodes multisensory spatial information. Cell Rep 43(10):114813
Abstract: Animals employ spatial information in multisensory modalities to navigate their natural environments. However, it is unclear whether the brain encodes such information in separate cognitive maps or integrates it all into a single, universal map. We address this question in the microcircuit of the medial entorhinal cortex (MEC), a cognitive map of space. Using cellular-resolution calcium imaging, we examine the MEC of mice navigating virtual reality tracks, where visual and auditory cues provide comparable spatial information. We uncover two cell types: "unimodality cells" and "multimodality cells." The unimodality cells specifically represent either auditory or visual spatial information. They are anatomically intermingled and maintain sensory preferences across multiple tracks and behavioral states. The multimodality cells respond to both sensory modalities, with their responses shaped differentially by auditory or visual information. Thus, the MEC enables accurate spatial encoding during multisensory navigation by computing spatial information in different sensory modalities and generating distinct maps.
Quick Links:

Expression

Expression annotations: 0

Other

Bio Entities: 4