Electroencephalography (EEG): Opening New Windows Into the Developing Brain
By: Rebecca Hansen
How it Works
If you haven’t seen an EEG sensor cap before, you might wonder what such an odd little hat is for. But, as strange as it might look, the principle is fairly simple. Just like your doctor might use EKG sensors to measure your heart’s activity, researchers can use similar EEG sensors to pick up on the activity of your brain. The human brain contains billions of neurons, and the sensors in an EEG cap allow researchers to listen in on the electrical activity that occurs when these neurons communicate with one another. When researchers look at brain activity in response to a particular stimulus—a technique known as event-related potentials, or ERPs—they can identify the neural signatures, or waveforms, that correspond to certain perceptual and cognitive processes, such as face recognition, memory, and attention. Ultimately, researchers hope that, by identifying specific differences in these signatures among children on the spectrum, they can shed light on the underlying mechanisms of autism and help to create more targeted interventions. They also hope that, if they can learn to use such signatures to identify babies at high risk for developing autism, they may be able to help create tools for diagnosis and intervention at a much younger age than is currently possible.
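The averaging idea behind ERPs can be sketched in a few lines of code. A single trial’s EEG is dominated by ongoing background activity, but because the evoked response is time-locked to the stimulus, averaging many stimulus-locked trials cancels the noise and leaves the waveform behind. Here is a minimal simulation of that principle; every number (sampling rate, amplitudes, the 170 ms peak) is invented for illustration and not taken from any particular study:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                       # assumed sampling rate in Hz
t = np.arange(0, 0.6, 1 / fs)  # 600 ms epoch after stimulus onset

# A made-up evoked waveform: a small deflection peaking around 170 ms,
# loosely inspired by face-sensitive ERP components.
evoked = 2.0 * np.exp(-((t - 0.170) ** 2) / (2 * 0.02 ** 2))

# Each simulated trial is the same tiny signal buried in much larger,
# independent background "EEG" noise (values in microvolts).
n_trials = 800
noise = rng.normal(0, 10, size=(n_trials, t.size))
epochs = evoked + noise

# Averaging across trials cancels the noise but keeps the time-locked signal.
erp = epochs.mean(axis=0)

peak_ms = t[np.argmax(erp)] * 1000
print(f"Estimated peak latency: {peak_ms:.0f} ms")
```

Because the noise shrinks roughly with the square root of the number of trials, a response invisible on any single trial becomes a clear peak after a few hundred stimulus-locked repetitions.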
Here in Boston, at the Boston Children’s Hospital Labs of Cognitive Neuroscience, Rhiannon Luyster, PhD, is using EEG to study how children and adults with ASD process expressions of facial emotion. Luyster began the project as a postdoctoral fellow in the labs and continues her work there in addition to being an Assistant Professor in the Department of Communication Sciences & Disorders at Emerson College. Although many people may take it for granted, the ability to perceive and correctly identify emotions in faces is crucial to maintaining a successful social interaction. It is also often an area of difficulty for individuals with ASD. And, while researchers have previously explored behavioral and brain responses to ‘prototypical’ facial expressions (that is, very extreme examples of emotion), much less is known about how we respond to the more subtle expressions that characterize our daily interactions.
For a study related to social interactions, EEG is an ideal tool. “Social interactions are very much about timing,” says Luyster, and few measures can capture brain activity with the temporal precision of EEG and ERP. “For successful and fluid social interactions, we have to be sensitive to very subtle social cues, very rapidly,” says Luyster. “If you’re half a second late, you’ve missed the window in which your social partner is expecting you to respond.” The cumulative effect for people who have trouble with this timing, including many individuals with ASD, can be significant. By using EEG, researchers can measure down to the millisecond how long it takes an individual’s brain to respond to a social stimulus. By looking at whether there are group differences in the corresponding waveforms between individuals with and without an ASD, Dr. Luyster aims to learn more about how social and emotional processing may happen differently for some people with ASD.
A Powerful Tool in Combination with Other Techniques
Dr. Luyster also aims to see whether people’s behavioral responses to facial expressions of emotion are consistent with their brain’s response. For example, when it comes to more subtle expressions of emotion, does our brain detect an emotion in a face even when we don’t think we’ve seen one? To learn more about this, the study includes a sorting task in which participants are asked to categorize faces by emotion, such as anger, happiness, and sadness. By doing this in conjunction with EEG, Luyster aims to identify whether people’s brain activity is consistent or inconsistent with their higher-order awareness of facial expressions of emotion. “If there’s a disconnect between their low-level sensitivity to those expressions and their higher-order awareness,” she says, “we might be able to use that knowledge to design optimal intervention techniques.” That is, if the brain registers a facial expression but an individual isn’t aware of having seen one, it is possible that detailed training programs might help to bridge the gap between automatic recognition and conscious awareness.
Dr. Luyster is also combining eye tracking technology with EEG to learn whether where a person looks on a face (for instance, at the eyes or the mouth) plays a role in the response to emotional faces. A large body of previous research argues that individuals with ASD do not pay attention to the “right” parts of the face; for example, some people with ASD may spend more time looking at the mouth than at the eyes. But, says Dr. Luyster, this finding is controversial, as results have been inconsistent across studies. This research aims to shed light on that controversy and to deepen the discussion by connecting how a person looks at a face with the corresponding brain activity. By using eye tracking and EEG together, she can see not just where people are looking, but also whether a person’s level of brain response is associated with differences in looking patterns. If there is a difference in neural processing, says Luyster, it would suggest a very different target for treatment and intervention. Indeed, the ultimate aim is to shed light on specific areas of strength and difficulty for individuals with ASD, and to contribute to the design of more effective intervention programs and therapies.
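The logic of combining the two measures can be sketched simply: for each trial, pair the face region the participant was fixating at stimulus onset with that trial’s ERP amplitude, then compare amplitudes across looking patterns. The sketch below uses entirely simulated values and a hypothetical analysis, not the study’s actual data or methods; the larger response for eye fixations is built in purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 120

# Simulated gaze labels: which face region was fixated when the face appeared.
regions = rng.choice(["eyes", "mouth"], size=n_trials)

# Simulated per-trial ERP peak amplitudes (microvolts), with a larger
# mean response for eye fixations by construction.
amplitude = np.where(regions == "eyes",
                     rng.normal(4.0, 1.0, n_trials),
                     rng.normal(2.5, 1.0, n_trials))

# Compare mean brain response by looking pattern.
for region in ("eyes", "mouth"):
    mean_amp = amplitude[regions == region].mean()
    print(f"{region}: mean peak amplitude {mean_amp:.2f} µV")
```

In a real analysis the gaze stream and the EEG stream would be aligned trial by trial using shared event timestamps, but the grouping step at the end is the core of the question: does where you look change how strongly the brain responds?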
Breaking the Language Barrier
EEG can also be an ideal neuroimaging tool for working with individuals who may struggle with language or have no language at all. For children who may not be able to understand instructions on how to perform an elaborate task or to tolerate lying in an MRI scanner for extended periods of time, EEG and ERP allow researchers to track their brain activity while simply showing them pictures, playing language sounds, or presenting other simple stimuli. A tremendous amount of data can be gathered in just ten to fifteen minutes, and with a little creative coaching—researchers often put a cap on themselves or a parent before putting one on a child—many children forget that they are even wearing such an unusual “hat.”
For these same reasons, EEG is also an important tool for studies that aim to learn how to accurately identify infants at risk for ASD, such as the Infant Sibling Project, also taking place at the Labs of Cognitive Neuroscience in collaboration with researchers from Boston University. Long before a baby begins to develop language, researchers can use EEG and ERP to assess whether there are differences in brain activity that could eventually be used to identify a baby as particularly vulnerable to developing autism. Researchers also hope that, by looking directly at brain activity, they can eventually make such assessments before any behaviors associated with ASD are even present, which would have two big implications. First, rather than having to wait for a child to mature enough to start showing differences or delays in motor, language, or social development in order to make a diagnosis, the use of EEG could push the diagnostic window back to the first several months of a child’s life. This, in turn, would pave the way toward much earlier intervention than is currently possible, hopefully leading to better outcomes for children and their families. Second, if a diagnosis can be made before symptoms are even present, it may be possible to design interventions that would prevent those symptoms from ever appearing.
Whether to improve early identification, help design better therapies, or simply broaden our understanding of how children and adults with autism experience the world, EEG is a tool that holds tremendous potential. The better researchers understand the connections between the developing brain and the corresponding behaviors, the better we can support all individuals with autism in achieving their fullest potential.
For more information about Dr. Luyster’s study and other studies happening at the Boston Children’s Hospital Labs of Cognitive Neuroscience, or to sign up for their Research Registry, please click here. You can also view a video about their work with young infants and hear from parents and children who have participated in their studies by clicking here.