International Meeting for Autism Research

ERP Measures of Facial Negative Emotional Expression Recognition in Autism and ADHD

Thursday, May 12, 2011
Elizabeth Ballroom E-F and Lirenta Foyer Level 2 (Manchester Grand Hyatt)
1:00 PM
G. Sokhadze1, A. S. El-Baz2, L. L. Sears3, J. M. Baruth4, E. M. Sokhadze5 and M. F. Casanova5, (1)Psychology Brain Sciences, University of Louisville, Louisville, KY, (2)Bioengineering, University of Louisville, Louisville, KY, (3)Pediatrics, University of Louisville, Louisville, KY, (4)University of Louisville, Louisville, KY, (5)Psychiatry & Behavioral Sciences, University of Louisville, Louisville, KY
Background: Disturbances of affective reactivity and an impaired ability to perceive and respond to social and emotional signals in a typical and appropriate manner are hallmark deficits of autism spectrum disorder (ASD). Children with attention deficit/hyperactivity disorder (ADHD) are characterized by early and persistent deficits in attention, impulse control, and motor control; most research on behavioral deficits in ADHD has focused on laboratory measures of attention and executive function rather than on emotion.

Objectives: A closer investigation of emotional reactivity and its underlying neural substrates in ASD and ADHD is warranted, since emotional deficits have been found to affect social functioning and peer interaction in children with either condition. This study aimed to investigate the ability of children with ASD and ADHD to recognize differences between negative facial emotional expressions during an affective task.

Methods: This study used event-related potential (ERP) measures of facial emotional expression recognition to test cognitive functions and emotional responsiveness in autism and ADHD. In a forced-choice task, subjects were instructed in two blocks (60 trials each) to differentiate the gender of faces with neutral and emotional expressions, and then in the following two blocks to differentiate facial emotional expressions (e.g., fear vs. sadness; anger vs. disgust). ERPs were recorded using a 128-channel EGI Net Station EEG system. ERPs and single-trial EEG responses at frontal, temporo-parietal, centro-parietal, and occipital recording sites were analyzed and compared across three groups (ASD, ADHD, and typically developing controls; N = 10 per group).
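The analysis described above rests on averaging single-trial epochs into condition-specific ERPs and then measuring component amplitude and latency. The following minimal sketch illustrates that general procedure with NumPy only; it is not the authors' pipeline, and the channel index, sampling rate, epoch window, and component window are assumptions chosen purely for illustration.

```python
import numpy as np

def erp_average(epochs):
    """Average single-trial epochs (trials x channels x samples) into an ERP."""
    return epochs.mean(axis=0)

def peak_amplitude_latency(erp, channel, t, window, polarity=-1):
    """Peak amplitude (uV) and latency (ms) of a component in a time window.
    polarity=-1 for negative-going peaks (N170, N200), +1 for positive (P3b)."""
    mask = (t >= window[0]) & (t <= window[1])
    segment = polarity * erp[channel, mask]
    idx = np.argmax(segment)
    return polarity * segment[idx], t[mask][idx]

# Simulated stand-in data: 60 trials, 128 channels, 250 Hz, -100..800 ms epochs.
rng = np.random.default_rng(0)
fs = 250
t = np.arange(-0.1, 0.8, 1 / fs) * 1000            # time axis in ms
epochs = rng.normal(0, 2, size=(60, 128, t.size))  # placeholder for baseline-corrected EEG

erp = erp_average(epochs)
amp, lat = peak_amplitude_latency(erp, channel=70, t=t, window=(140, 200), polarity=-1)
print(f"N170-range peak at channel 70: {amp:.2f} uV, {lat:.0f} ms")
```

In practice the same peak measures would be extracted per subject and per condition at the frontal, temporo-parietal, centro-parietal, and occipital sites before group comparison.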

Results: Both the ASD and ADHD groups showed higher error rates and slower reaction times in the emotion recognition tasks compared with typically developing children. ANOVA revealed group differences in the amplitude and latency of centro-parietal and occipital ERP components (N170, N200, P3b). Difference waves between the gender recognition and emotion recognition tasks were also calculated. At posterior topographies, these difference waves (e.g., N2d) showed statistically significant differences between the ASD and ADHD groups.
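The difference-wave and group-comparison steps reported above can be sketched as follows. This is an illustrative outline only: the per-subject N2d values are placeholders, not study data, and the one-way ANOVA stands in for whatever factorial design the authors actually used.

```python
import numpy as np
from scipy.stats import f_oneway

def difference_wave(erp_emotion, erp_gender):
    """N2d-style difference wave (channels x samples): emotion-task ERP minus gender-task ERP."""
    return erp_emotion - erp_gender

# Hypothetical per-subject N2d peak amplitudes (uV) at a posterior channel,
# one array per group (N = 10 each); values are placeholders for illustration.
asd  = np.array([-2.1, -1.8, -2.5, -1.6, -2.0, -1.9, -2.3, -1.7, -2.2, -1.5])
adhd = np.array([-3.0, -2.8, -3.4, -2.6, -3.1, -2.9, -3.3, -2.7, -3.2, -2.5])
ctrl = np.array([-4.2, -3.9, -4.5, -3.8, -4.1, -4.0, -4.4, -3.7, -4.3, -3.6])

# One-way ANOVA across the three groups, analogous to the group comparison reported above.
F, p = f_oneway(asd, adhd, ctrl)
print(f"F = {F:.2f}, p = {p:.4f}")
```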

Conclusions: Results are discussed using the “theory-of-mind” construct. Differences in the amplitude and latency of ERP waves and single-trial evoked EEG responses between the gender and emotion recognition conditions were sensitive to the effort associated with recognizing and categorizing emotional facial expressions. The study provides additional support for the functional diagnostic usefulness of emotional reactivity tests based on ERP and EEG biomarkers.
