April 23, 2024

Paying attention to how we see: Nicholas Gaspelin receives NSF CAREER Award

Assistant Professor of Psychology Nicholas Gaspelin holds an electroencephalography cap. Image Credit: Jonathan Cohen.

Every day, our brain makes decisions that we only notice after the fact, if we notice them at all.

As we go about our day-to-day lives, our eyes collect massive amounts of visual data that our brain sifts through. Much of that data goes unnoticed, but some rises to immediate attention: for example, the red flash of a taillight that sends your foot quickly to the brake, or the wobbling of an item on a high shelf that prompts you to step away before it falls.

But how does your brain know which visual stimuli require immediate attention, and how does attention really work? That’s what Assistant Professor of Psychology Nicholas Gaspelin hopes to find out.

A cognitive neuroscientist who came to Binghamton in 2017, Gaspelin specializes in visual perception and attention. He recently received a $708,780 National Science Foundation CAREER Award for a project that explores the relationship of attention and eye movements using concurrent electroencephalography (EEG) and eye-tracking.

“We’re trying to understand how neural shifts of attention are used to coordinate eye movements during visual search,” he said.

As part of the NSF award, Gaspelin will host a workshop that will teach undergraduates the basic computer programming involved in cognitive neuroscience research. Coordinated through the Center for Learning and Teaching and the Ronald E. McNair Postbaccalaureate Achievement Program, the virtual workshop will be held during the summer of 2023 and open to students at multiple universities.

EEG and eye-trackers

Currently, much of the research in Gaspelin’s lab concerns whether certain kinds of visual stimuli can automatically distract us, such as brightly colored objects or flashing lights. While we tend to see distraction in a negative light, there are times when we need to notice sudden stimuli to avoid danger, he pointed out.

Whether brightly colored objects capture attention has been the subject of a decades-long debate in cognitive neuroscience. The answer: Probably not.

“They might initially capture attention or initially distract people, but then people learn as they get experience to suppress salient stuff in their environment,” he said. “We wouldn’t want to be attracted to every shiny or brightly colored thing; you probably wouldn’t survive very many trips to the grocery store.”

Gaspelin would like to know whether the brain processes behind attention are initiated even before eye movements are physically generated. We move our eyes all the time during visual search, whether glancing about for road hazards as we drive or looking for a can of soup in the store.

Measuring the brain processes of attention before eye movements, however, will require some technical innovations — namely, synchronizing an EEG machine with specialized infrared cameras called eye-trackers.
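In broad strokes, synchronizing two recording devices like these means mapping timestamps from one device's clock onto the other's, typically using trigger events that both devices record. The sketch below is a hypothetical illustration of that idea, not Gaspelin's actual pipeline; the function names and the linear clock model are assumptions for demonstration.

```python
import numpy as np

def align_clocks(eeg_trigger_times, tracker_trigger_times):
    """Estimate how the eye-tracker clock maps onto the EEG clock.

    Both devices record the same trigger events; a least-squares
    linear fit recovers the relationship
        eeg_time ~= slope * tracker_time + offset,
    which also absorbs any slow clock drift between the devices.
    """
    slope, offset = np.polyfit(tracker_trigger_times, eeg_trigger_times, 1)
    return slope, offset

def tracker_to_eeg_time(tracker_times, slope, offset):
    """Convert eye-tracker timestamps to the EEG timeline."""
    return slope * np.asarray(tracker_times, dtype=float) + offset
```

Once gaze samples live on the EEG timeline, an eye movement recorded by the tracker can be matched against the brain activity recorded in the moments just before it.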

In his lab, test subjects typically interact with what appears to be a simple video game. While they use the controller to react to stimuli on the screen, the eye-trackers measure their visual and behavioral responses; this data is simultaneously processed in a computer, giving researchers a clear picture as to what the subjects are looking at in real time.

The EEG, on the other hand, is a technology that has been around since the 1930s. Placed on the scalp with saline gel, electrode disks can measure an individual’s brain waves through tiny voltage fluctuations.

While EEG can pick up when someone begins to pay attention, it's a broad brush: you can tell whether the object of focus is on the left or right, but that's all. There is another obstacle: eye movement causes a major shift in voltage that makes it difficult to measure brain activity. Since the 1990s, scientists have compensated for this by requiring test subjects to look for something without actually moving their eyes — which isn't typically how people look for things in the real world.

Using these technologies together may help us better understand how attention works. That knowledge could potentially make a significant impact on our daily lives, according to Gaspelin.

“Understanding the basic mechanisms of visual attention really is crucial to making society function efficiently,” he explained. “If we could understand how to develop better visual warning signals — for example, if we could understand how to better control motor vehicle operators’ attention or better control children’s attention in a classroom — we could end up having a much better society as a whole.”