Novel Paradigms for Studying Brain Activity across the Autism Spectrum

Oral Presentation
Friday, May 11, 2018: 3:55 PM
Jurriaanse Zaal (de Doelen ICC Rotterdam)
A. Naples, Child Study Center, Yale University School of Medicine, New Haven, CT
Individuals with ASD experience profound difficulty in social interactions. These challenges are most evident in dynamic, interactive contexts; however, most brain research in ASD has studied neural response in unrealistic social contexts, such as passive viewing of static faces on a computer monitor. Furthermore, it is estimated that 30% of individuals with ASD are minimally verbal, a characteristic associated with poorer prognosis across academic, economic, and socio-emotional domains. Despite these individuals' significant need for support, most neuroscience research excludes them from participation.

We have developed methods to measure brain activity during reciprocal interactions. Integrating high-speed eye-tracking (ET; measuring where a person looks on a computer screen) and electroencephalographic recording (EEG; measuring brain response) enables simulation of social interactions with animated faces that respond to a participant's gaze (faces that "look back" or smile in response to gaze). Importantly, these methods allow us to characterize attention and brain activity in the absence of explicit instructions and task demands. The experimental paradigms are driven by non-verbal behavior, and participants are "self-paced" in their completion of the task. We have capitalized on these technological advances to explore brain activity and attention in individuals with autism with varying levels of cognitive and language abilities, including individuals who may have difficulty following explicit experimental instructions. Our goal is to evaluate the efficacy of these methods for minimally verbal individuals and to explore brain activity in this sample.
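The gaze-contingent logic described here can be illustrated with a minimal polling loop. This is a hypothetical sketch, not the authors' implementation: the region-of-interest bounds, dwell threshold, and gaze-sample format are all placeholder assumptions.

```python
# Minimal sketch of a gaze-contingent trial loop (hypothetical, for illustration).
# A region of interest (ROI) on the onscreen face is defined; once the
# participant's gaze dwells inside it, the face "responds" (e.g., smiles).

from dataclasses import dataclass


@dataclass
class ROI:
    """Rectangular screen region (pixel coordinates) covering, e.g., the eyes."""
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        return self.left <= x <= self.right and self.top <= y <= self.bottom


def run_trial(gaze_samples, face_roi, dwell_required=30):
    """Return the sample index at which the face should respond, or None.

    gaze_samples: iterable of (x, y) screen coordinates from the eye tracker.
    dwell_required: consecutive in-ROI samples needed before triggering,
                    guarding against single-sample fixation noise.
    """
    dwell = 0
    for i, (x, y) in enumerate(gaze_samples):
        dwell = dwell + 1 if face_roi.contains(x, y) else 0
        if dwell >= dwell_required:
            # Here a real system would animate the face and send an
            # event marker to the EEG recording for later co-registration.
            return i
    return None
```

Because the animation only fires after sustained fixation inside the ROI, attention to the relevant facial feature at movement onset is guaranteed by construction, which is the property the paradigm relies on.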

We collected co-registered EEG and high-speed eye-tracking data from individuals with ASD with IQ < 70 during simulated social interactions. Onscreen faces responded to participant gaze by opening their eyes or mouths when looked at, or by looking towards or away from the participant. Importantly, the gaze-contingent nature of the experiment guaranteed attention to the eyes or mouth of onscreen faces during all facial movement, ensuring that brain activity reflected the locus of visual attention. We explored the P100 and the N170, temporally early neural indices of face processing, and pupil diameter.

Preliminary data from four children with ASD showed modulation of brain activity in response to gaze-contingent facial movement such that reciprocal eye contact elicited more negative and earlier activity at the N170 than mouth movement (mean difference = -1.85 µV [Cohen's d=1.49], 21 ms [Cohen's d=1.38]). A marginally significant effect was observed for P100 latency, such that eye movement was processed faster than mouth movement (t=-2.49, p=.08). Data collection is ongoing, and analyses are exploring the effects of gaze change and modulation of pupil diameter.
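The reported N170 measures (peak amplitude and latency within an early time window) and within-subject effect sizes can be computed as sketched below. This is a generic illustration under assumed inputs (a single-channel epoch array and a matching time vector); the window bounds and data shapes are placeholders, not the authors' analysis pipeline.

```python
# Sketch of N170 peak extraction and paired Cohen's d (hypothetical inputs).
import numpy as np


def n170_peak(epoch, times, window=(0.13, 0.20)):
    """Most-negative deflection within a post-stimulus window.

    epoch: 1-D array of voltages (µV) from one occipitotemporal channel,
           time-locked to facial-movement onset.
    times: matching array of sample times in seconds.
    Returns (amplitude, latency) of the N170 peak.
    """
    mask = (times >= window[0]) & (times <= window[1])
    peak = np.flatnonzero(mask)[np.argmin(epoch[mask])]
    return epoch[peak], times[peak]


def paired_cohens_d(condition_a, condition_b):
    """Cohen's d for a within-subject contrast: mean difference / SD of differences."""
    diff = np.asarray(condition_a, float) - np.asarray(condition_b, float)
    return diff.mean() / diff.std(ddof=1)
```

With per-participant peaks extracted for the eye-contact and mouth-movement conditions, `paired_cohens_d` expresses the condition difference in standard-deviation units, matching the form of the effect sizes reported above.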

Our preliminary data show efficacy for interactive neuroscience methods in this sample, with expected modulation of brain activity to social stimuli. Innovative experimental designs that are accessible to participants with a wide range of developmental and functional levels will reduce barriers to participation in clinical neuroscience research.