We gain information about the world through our senses, but how does the brain decode this information? A recent study by O’Connor et al. addresses this question using one of neuroscientists’ sneakier techniques: virtual reality. Along the way, the authors also shed light on how attention affects perception.
Using the mouse somatosensory barrel cortex as a model, the authors bring circuit-level resolution to the question of sensory coding. Mice were trained to detect the location of a vertical pole with their whiskers and report whether it appeared in the “YES” location or the “NO” location. The animals solved this task by selectively whisking around the YES location and simply determining whether or not an object was present. The brain could decode the resulting cortical activity in one of two ways: from the timing of spikes, which reports the whisker’s position when contact occurred, or from patterns of activity that differ with object location. In this case, a pole in the center of the whisking area (the YES location) contacts the whisker more frequently and with greater force, and thus drives more activity, than an object in the more eccentric NO location. So, is it spike timing or spike number that the mouse brain is listening to?
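The spike-count hypothesis above can be sketched with a toy simulation. All numbers, thresholds, and function names here are illustrative assumptions for the sake of the sketch, not values or code from the study:

```python
import random

random.seed(0)

# Toy version of the count code: a pole at the YES location is contacted
# more often and more forcefully, so it drives more spikes per trial.
# The mean rates below are made up for illustration.

def simulate_trial(pole_at_yes):
    mean_spikes = 8 if pole_at_yes else 3
    return max(0, int(random.gauss(mean_spikes, 1.5)))

def count_decoder(n_spikes, threshold=5):
    # Report "YES" whenever total spiking exceeds a learned threshold.
    return n_spikes > threshold

# A decoder that reads only spike number separates the two locations well.
trials = 1000
correct = 0
for _ in range(trials):
    pole_at_yes = random.random() < 0.5
    if count_decoder(simulate_trial(pole_at_yes)) == pole_at_yes:
        correct += 1
print(f"count-decoder accuracy: {correct / trials:.2f}")
```

With these toy rates, a simple threshold on spike count already discriminates the two pole positions, which is what makes the count code a plausible readout in the first place.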
Enter virtual reality. An infrared beam served as a virtual pole and was coupled to a photodiode such that a whisker “contacting” the virtual pole triggered photostimulation of barrel cortex; the animals expressed channelrhodopsin in barrel cortex excitatory neurons, so each virtual contact drove activity in the barrel through this closed-loop system. With the virtual pole placed next to the YES location, animals still performed well on YES trials, but on NO trials they were biased toward incorrectly reporting the presence of a pole in the YES location; that is, the virtual pole tricked mice into perceiving an object in the YES position.
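The closed-loop logic can be sketched in a few lines, with the photodiode and laser driver mocked out. The function names, polling scheme, and contact probability are assumptions for illustration, not the authors’ actual rig code:

```python
import random

random.seed(1)

def beam_broken():
    # Stand-in for the photodiode: True when the whisker crosses the
    # infrared beam that serves as the virtual pole (probability made up).
    return random.random() < 0.1

def trigger_photostimulation(delay_ms, log):
    # Stand-in for the laser driver: each virtual "contact" drives a pulse
    # of ChR2-mediated activity in barrel cortex. The delay parameter mimics
    # the authors' later manipulation of spike timing.
    log.append(("pulse", delay_ms))

def run_whisk_cycle(n_samples=100, delay_ms=0):
    events = []
    for _ in range(n_samples):      # poll the sensor at a fixed rate
        if beam_broken():           # whisker "touches" the virtual pole...
            trigger_photostimulation(delay_ms, log=events)  # ...drive cortex
    return events

events = run_whisk_cycle()
print(f"{len(events)} virtual contacts -> {len(events)} photostimulation pulses")
```

The key design point is that stimulation is contingent on the animal’s own whisker movement, so cortical activity is yoked to active touch rather than delivered at arbitrary times.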
This system allowed the authors to selectively tweak spike timing or spike number. Altering the delay between virtual contact and photostimulation had no effect on the ability to fool mice with the virtual pole, suggesting that spike timing was not the critical factor. Conversely, when channelrhodopsin was expressed in inhibitory neurons, so that virtual pole contact reduced barrel cortex activity, mice were less likely to perceive a pole in the YES position even when one was present. Thus increasing spiking made animals more likely to perceive an object in the YES location and decreasing spiking made them less likely to, indicating that it is spike number upon which the brain bases its decision.
The effects of the virtual pole were strongly gated by attention. Stimulation of a barrel not corresponding to the whisker in use failed to alter perception; it seems that while learning the task, mice tuned in to the active whisker and learned to ignore activity in other barrels. Furthermore, mice were tricked only if stimulation occurred while they were actively whisking, suggesting circuit-level differences between intentional exploration of the world and passively received stimuli.
Investigators have used stimulation to alter animals’ perceptions for decades, but modern approaches allow targeting of specific cell populations; in this case, restricting stimulation to excitatory or inhibitory neurons in layer 4 of an individual cortical barrel. In a 2009 review, several authors of the present study advocated a reverse-engineering approach focusing on activity patterns in specific cell types and their role in behavior (O’Connor, Huber, & Svoboda, 2009). The present study demonstrates the power of such an approach to advance our understanding of complex neural processes at the level of cortical circuits.
O’Connor, D.H., Hires, A., Guo, Z.V., Li, N., Yu, J., Sun, Q., Huber, D. and Svoboda, K. Neural coding during active somatosensation revealed using illusory touch. Nature Neuroscience, advance online publication, June 2, 2013.
O’Connor, D.H., Huber, D. and Svoboda, K. Reverse engineering the mouse brain. Nature 461, 923–929 (2009).