Cognitive and Emotional Empathy in Autism: Explicit and Implicit Perspectives

Poster Presentation
Thursday, May 2, 2019: 11:30 AM-1:30 PM
Room: 710 (Palais des congrès de Montréal)
J. Quinde1, B. H. Heflin2, L. E. Mash3 and C. J. Cascio4, (1)Vanderbilt University, Nashville, TN, (2)Florida International University, Miami, FL, (3)Brain Development Imaging Laboratories, Department of Psychology, San Diego State University, San Diego, CA, (4)Vanderbilt University School of Medicine, Nashville, TN
Background: Individuals with autism spectrum disorders (ASD) show a mixed profile on task-based measures of empathy: impaired cognitive empathy (emotion recognition) but intact emotional empathy (self-ratings of emotional relatedness) (Dziobek et al., 2008). However, response bias may influence self-ratings of emotional empathy, and the role of emotional valence in self-rated empathy tasks is not well understood. Spontaneous facial mimicry (SFM) is a reflexive mirroring of emotional faces that reflects social reward (Sims, Van Reekum, Johnstone, & Chakrabarti, 2012), can be measured separately for positive and negative emotional valence, and may provide a bias-free index of emotional empathy. To date, no studies have used both self-ratings and SFM to assess empathy in individuals with ASD.

Objectives: Our study aimed to explore effects of autism diagnostic status and emotional valence on (1) task-based cognitive and emotional empathy, (2) SFM in response to emotionally charged stimuli, and (3) correlations between mimicry (implicit) and empathy (explicit) scores.

Methods: Fifty-one individuals (25 with ASD, 26 typically developing (TD)) performed the Multifaceted Empathy Test (MET), which consists of 32 static images depicting people in emotionally charged situations. Cognitive empathy was assessed by multiple-choice emotion recognition, while emotional empathy was assessed by self-rating on a scale of 0-9. Separate mixed effects models were used for cognitive and emotional empathy scores to assess the influence of the emotional valence of the stimuli and diagnostic group status. Participants' facial expressions were recorded during the task and analyzed with iMotions' FACET algorithm for emotion classification and scoring. Mean percent accuracy in emotion labeling (cognitive empathy), mean empathy ratings (emotional empathy), median SFM across time, and correlations between facial expression emotion and empathy scores were compared between groups.
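As a rough illustration of the mixed effects analysis described above, the sketch below shows how such a model might be specified in Python with statsmodels; the data file, column names (participant, group, valence, accuracy), and long-format layout are assumptions for illustration, not the authors' actual pipeline.

# Illustrative sketch only (not the authors' code): a linear mixed effects
# model of mean emotion-labeling accuracy with fixed effects of diagnostic
# group and stimulus valence and a random intercept per participant.
# The file name and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Long format: one row per participant x valence condition, e.g.
#   participant  group  valence   accuracy
#   P01          ASD    positive  0.81
df = pd.read_csv("met_cognitive_empathy.csv")

# Fixed effects: group, valence, and their interaction;
# random intercept grouped by participant (repeated measures)
model = smf.mixedlm("accuracy ~ group * valence", data=df,
                    groups=df["participant"])
result = model.fit()
print(result.summary())

An analogous model with the 0-9 emotional empathy rating as the outcome would correspond to the second analysis.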

Results: For cognitive empathy, there were significant main effects of diagnostic group (p < .05) and valence (p = .001), but no significant interaction between the two. Post-hoc comparisons revealed that the TD group was significantly more accurate than the ASD group (p < .001) and that accuracy was significantly lower for negative-valence stimuli than for positive-valence stimuli. There were no group differences in self-rated emotional empathy, consistent with previous findings (Dziobek et al., 2008). Facial expression analysis revealed no group differences in overall Joy or Sadness scores during the cognitive or emotional empathy presses. Between-group differences (TD > ASD, p < .01) were found in participants' Joy scores when responding to positive-valence images during both the cognitive and emotional presses. No group differences were found in participants' Sadness scores when responding to negative-valence images for either press. No correlations between SFM and empathy scores were significant.

Conclusions: These results replicate previous findings of impaired emotion recognition but intact self-ratings of empathy in individuals with ASD, and suggest the following new ideas: 1) emotional valence should be considered when assessing empathy; 2) spontaneous facial mimicry may be a more sensitive measure of emotional empathy, as it detected group differences for positive emotions; 3) given their lack of correlation, self-report and SFM may be indexing very different aspects of emotional empathy.