
Dombeck Lab examines hippocampal spatial mapping of a multisensory environment

August 3, 2021

A recent paper from the Dombeck Lab was led by NUIN student Brad Radvansky, with help from IBiS student Jun Young Oh and postdoc Jason Climer. Radvansky had previously developed an olfactory virtual reality system, and here he applied it to study how place cells map space in a multisensory environment. The team designed a task in which mice receive a reward at either an odor cue (pine smell) or a visual cue (green goalpost) placed at a random location in a sparse environment. They found more visually responsive neurons when the visual cue was rewarded and more olfactory-responsive neurons when the olfactory cue was rewarded. Finally, they trained mice to lick for reward at the first cue encountered, regardless of its modality, and identified a new class of neurons that responded relative to that first cue. Overall, the results demonstrate that the rodent hippocampus can map multiple sensory and abstract spaces, depending on the animal's goal.
