Just as your brain filters out extraneous background sounds during a cocktail party so you can hear one conversation, the brain also sifts through extraneous visual information so you can focus on the person handing you another drink.
The brain eliminates much of the information available to the eyes because it simply cannot process all the data picked up by the 125 million photoreceptors in each eye.
Some of the areas in your field of vision to which you pay attention are selected by your own intentions; at the cocktail party you choose to attend to the person carrying your drink. Your brain ignores other partygoers, until your attention is involuntarily drawn away by another stimulus, such as the sudden arrival in the corner of your eye of men with shiny NYPD badges.
The question that interests neuroscientists who study vision about this party scenario is not why the police arrived, but how your brain decides to pay attention first to the drink-bearer and then to the new arrivals.
Now, new research from Dr. Michael Goldberg, the David Mahoney Professor of Brain and Behavior in the neurology and psychiatry departments, and his postdoctoral researcher, Dr. James Bisley, shows how one aspect of the attention process works. They found that a single region in the primate brain collects data on attention-worthy objects to maintain an ever-changing map of the salient regions in the visual world. The research appeared in the Jan. 3 Science.
To find out, Drs. Goldberg and Bisley trained two monkeys to look at the center spot on a screen while also paying attention to a peripheral target spot. The researchers knew the monkeys were paying attention to the target because their visual sensitivity at the target spot was better than it was at other areas of the screen.
Occasionally, during the time the monkey was looking at the center image and attending to the target spot, the researchers flashed a distractor dot elsewhere on the screen. The distraction captured the monkey's attention for about half a second, before attention returned to the target.
To see how intentional (target) and unintentional (distractor) stimuli were represented in the lateral intraparietal area (LIP), the researchers measured the activity of 41 individual neurons in the LIP of the two monkeys as they stared at the computer screen. Each neuron covered a different portion of the monkey's field of view.
Based on the prevailing paradigm in neuroscience, the researchers expected to see that the monkey's attention depended on the threshold activity of a single neuron. For example, if the neuron covering the space around the target reached a certain level of activity, the monkey would pay attention to the target spot. The monkey would stop paying attention to the target when activity in that neuron dropped below the threshold level.
Instead, the researchers found that they had to look at the activity of the whole ensemble of neurons in LIP to see where attention lay. The neuron with the greatest amount of activity, regardless of the absolute value of that activity, told the researchers whether the monkey was paying attention to the target or to the distractor.
"Our finding that only the greatest relative amount of activity matters for attention is almost heresy," Dr. Bisley says. "Most neurophysiologists insist that the absolute response of a neuron is important. But we show the location of attention depends on which cell shows a higher level of activity compared with the other cells in LIP."
"A level of activity capable of sustaining attention under one circumstance may be insufficient under other circumstances," adds Dr. Goldberg.
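The contrast between the two readout rules described above can be sketched in a few lines of code. This is a toy illustration with made-up firing rates, not a model of the actual experiment: an absolute-threshold rule gives different answers when overall activity levels change, while a winner-take-all rule (attend to whichever location's neuron is most active) gives a consistent answer.

```python
# Toy illustration (hypothetical numbers): two readout rules applied to the
# same relative pattern of LIP activity under low-gain and high-gain conditions.

THRESHOLD = 50.0  # arbitrary absolute firing threshold, in spikes/s

def threshold_readout(activity, threshold=THRESHOLD):
    """Attend to every location whose neuron exceeds a fixed absolute level."""
    return [loc for loc, rate in activity.items() if rate > threshold]

def winner_take_all_readout(activity):
    """Attend to the single location with the highest relative activity."""
    return max(activity, key=activity.get)

# Same relative pattern (target strongest), but overall gain differs.
low_gain = {"target": 40.0, "distractor": 30.0, "background": 10.0}
high_gain = {"target": 80.0, "distractor": 60.0, "background": 20.0}

# The absolute-threshold rule is inconsistent across conditions:
print(threshold_readout(low_gain))   # nothing crosses the threshold
print(threshold_readout(high_gain))  # both target and distractor cross it

# The winner-take-all rule picks the target in both conditions:
print(winner_take_all_readout(low_gain))
print(winner_take_all_readout(high_gain))
```

As Dr. Goldberg's remark suggests, a fixed threshold that works in one circumstance fails in another, whereas the relative comparison is robust to overall changes in activity.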
The data show that LIP combines information about a monkey's own attentional plans with information about unexpected events in the environment to build a map of the important parts of the visual world. The researchers think, however, that another part of the brain actually makes the attentional decision based on the data in LIP.
The research was supported by the National Eye Institute, the W.M. Keck Foundation, and the Human Frontier Science Program.