Moving in the Dark: Primary Visual Cortex Encodes Information About 3D Orienting Movements of the Head

For the past half century, neuroscientists have seen the visual cortex as the part of the brain that processes imagery. Visual information enters the brain via the retina and is processed one stage at a time by dedicated neural circuits, much like a car is assembled out of discrete components on an assembly line. Such experiments have typically been performed by presenting visual stimuli to restrained or anesthetized animals. As such, vision is understood to be a passive sense in which information is absorbed by stationary animals.

However, animals are rarely passive observers of the world: instead, they make myriad movements to sample their environments, from sniffing to smell, whisking to feel, or licking to taste. Vision, too, is an active sense: animals ranging from insects to mammals make eye, head, and body movements to direct their gaze or follow moving objects. How such movements affect the activity of cortical visual circuits is mostly unknown.

To start exploring this question, my team—including Javier Masis, Steffen Wolff, and David Cox—and I set up a simple experiment. We placed rats in an open arena that also served as their home, and simultaneously recorded their spontaneous head movements and neuronal activity in primary visual cortex (V1) using implanted electrodes. We performed half of the recordings in complete darkness, and half in the light, reasoning that any neural activity in V1 in the dark would reflect movement signals, while activity in the light would be a mixture of visual and movement signals.

We first found that, on average, V1 neurons in freely moving rats were more active during movement than during rest, even in the dark. However, when we examined specific movements—orienting movements of the head, such as left or right turns, clockwise or counterclockwise tilts, or up and down nods—neurons were briefly suppressed during those movements in the dark and excited after the movements in the light. But the patterns of activity weren't just different in the light and dark: we could also predict the specific directions rats were turning their heads from the V1 firing patterns, even in the dark. This indicated to us that the visual system contains an internal representation of movement that is independent of actual visual inputs.
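To give a concrete sense of what such a decoding analysis can look like, here is a minimal sketch in Python using scikit-learn's logistic regression on binned spike counts. The simulated data, the numbers of neurons and turns, and the choice of classifier are illustrative assumptions, not the pipeline used in the study.

```python
# Illustrative sketch (not the study's actual pipeline): decode head-turn
# direction from binned V1 spike counts with a cross-validated classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical data: spike counts for 100 neurons on 500 head turns,
# plus a label for each turn (0 = left, 1 = right), recorded in the dark.
n_turns, n_neurons = 500, 100
spike_counts = rng.poisson(lam=5.0, size=(n_turns, n_neurons)).astype(float)
turn_direction = rng.integers(0, 2, size=n_turns)

# Cross-validated decoding accuracy; chance level is 0.5 for two directions.
decoder = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(decoder, spike_counts, turn_direction, cv=5)
print(f"decoding accuracy: {accuracy.mean():.2f} ± {accuracy.std():.2f}")
```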

What were individual neurons doing to contribute information about movement? Looking into the activity profiles of neurons during turns, we noticed that over half were tuned to a particular direction of movement, meaning that a neuron’s firing rate might increase during left turns and decrease during right turns.
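To make the notion of direction tuning concrete, the sketch below compares a hypothetical neuron's per-turn firing rates between left and right turns using a simple tuning index and a rank-sum test. The simulated rates and the specific statistics are assumptions for illustration, not the analysis reported in the paper.

```python
# Illustrative sketch: test whether one neuron's firing rate differs between
# left and right head turns (a simple notion of direction tuning).
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)

# Hypothetical per-turn firing rates (spikes/s) for a single neuron.
rates_left = rng.gamma(shape=4.0, scale=2.0, size=200)   # mean ~8 spikes/s
rates_right = rng.gamma(shape=3.0, scale=2.0, size=200)  # mean ~6 spikes/s

# Tuning index: normalized difference between mean rates in the two directions.
tuning_index = (rates_left.mean() - rates_right.mean()) / (
    rates_left.mean() + rates_right.mean()
)
stat, p_value = mannwhitneyu(rates_left, rates_right)
print(f"tuning index = {tuning_index:.2f}, p = {p_value:.3g}")
```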

We also wondered how a neuron’s direction tuning in the dark might relate to its tuning in the light. Would it prefer the same direction or the opposite? To our surprise, there was no relationship between direction tuning in the two lighting conditions. In fact, a cell that was tuned in the dark was likely to be untuned in the light, and vice versa. This suggests that there may be two functional populations in V1: one that is direction-tuned in the light, and one that is direction-tuned in the dark. We speculate that these two populations may represent different things: the direction of visual flow, and the direction of self-motion, respectively.

We were also curious how movements of the head affect the processing of visual information. To test this, we presented flashes of light while the rats were freely moving. In previous studies of head-fixed mice, V1 responses to visual stimuli were enhanced when the animals ran on a treadmill. We found that when the flashes were presented during orienting movements of the head, V1 responses were actually suppressed relative to responses during rest, suggesting that there may be mechanisms that dampen the effect of particular movements on sensory coding in V1.
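One simple way to summarize such an effect is a modulation index on spike counts in a fixed window after each flash, as in the hypothetical sketch below; the window, the simulated counts, and the index definition are illustrative assumptions rather than the study's exact quantification.

```python
# Illustrative sketch: compare flash-evoked V1 responses during head movement
# versus rest by averaging spike counts in a post-flash window.
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical spike counts in a 100 ms window after each light flash,
# split by whether the rat was turning its head or at rest.
counts_rest = rng.poisson(lam=4.0, size=300)
counts_moving = rng.poisson(lam=2.5, size=300)

# Modulation index < 0 means the flash response is suppressed during movement.
modulation = (counts_moving.mean() - counts_rest.mean()) / (
    counts_moving.mean() + counts_rest.mean()
)
print(f"mean rest = {counts_rest.mean():.2f}, "
      f"mean moving = {counts_moving.mean():.2f}, "
      f"modulation index = {modulation:.2f}")
```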

Finally, we wanted to know whether the specific movement signals in V1 originated in a motor area. We focused on an area called secondary motor cortex (M2), which sends extensive axonal projections to V1 and has been implicated in movement control and planning. Is it possible that M2 sends movement-direction information to V1? To answer this, we surgically lesioned M2 in several rats and implanted electrodes over V1. The lesioned animals behaved almost the same as the control rats, and their V1 neurons responded to visual stimulation, as expected. V1 neurons in the M2-lesioned rats were still more active during movement than during rest, but they no longer signaled the direction in which the rats were turning their heads.

Our data show that movement information in visual cortex is far richer than previously known, and that V1 represents specific movement directions even in the absence of visual inputs. It remains unknown exactly how movements affect image stabilization or the detection of object motion; to find out, researchers will need better ways of presenting precise visual stimuli to freely moving animals (perhaps via head-mounted LED displays or VR goggles). These types of experiments are in their infancy, but they demand our attention: to think we understand visual processing from data gathered in head-fixed or anesthetized animals is imprudent. Experiments in freely moving animals also remind us that the ultimate goal is to study brains in the context for which they evolved: behavior.

by Greg Guitchounts

Steffen Wolff (l) and Greg Guitchounts