Exploring Attentional Strategies for Emotion Recognition in Autism Spectrum Disorders

Thursday, May 15, 2014
Atrium Ballroom (Marriott Marquis Atlanta)
E. Birmingham1, V. Kling1, N. Roberts1, D. A. Trevisan1, J. Tanaka2 and G. Iarocci3, (1)Faculty of Education, Simon Fraser University, Burnaby, BC, Canada, (2)Department of Psychology, University of Victoria, Victoria, BC, Canada, (3)Department of Psychology, Simon Fraser University, Burnaby, BC, Canada
Background: A key hypothesis in the field of emotion recognition is that children with autism spectrum disorders (ASD) use abnormal attentional strategies to detect emotions in faces, leading to impaired behavioral performance. Eye tracking studies that have tested this possibility, however, have yielded conflicting results. Moreover, because eye position can be dissociated from the allocation of attention (Klein & Pontefract, 1994), these previous studies may not have captured key information about where children with ASD attend when viewing facial expressions of emotion.

Objectives: We used the Moving Window Technique (MWT), in which the observer explores a blurred face through a mouse-controlled window of high-resolution information (Birmingham et al., Child Development, 2012). The MWT confers an advantage over eye tracking by directly revealing the attentional strategies used to decode basic facial emotions.

Methods: Typically developing (TD; N=82) and ASD (N=59) children aged 7-14 years were tested. Within the MWT paradigm, 40 face images depicting four basic expressions (happy, angry, fearful, disgusted) were presented in a four-alternative forced-choice design. Children had up to 15 seconds to explore each face.

Results:  

Accuracy & RT. Children with ASD were less accurate and slower than typically developing children at judging emotional expressions (accuracy: F(1,143)=9.46, p<.01; RT: F(1,143)=10.79, p<.01). Main effects of Emotion showed that accuracy and RT were best for Happy faces and worst for Angry faces, with intermediate performance for Disgusted and Fearful faces (accuracy: F(3,429)=46.45, p<.001; RT: F(3,429)=60.97, p<.001).
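For illustration, the following is a minimal sketch of a 2 (Diagnosis: between-subjects) x 4 (Emotion: within-subjects) mixed-design ANOVA on per-subject accuracy, of the kind reported above. The data are simulated and all variable names are hypothetical; this is not the study's analysis code.

```python
# Sketch only: mixed-design ANOVA (Diagnosis between, Emotion within) on accuracy.
# Simulated data; placeholder effect sizes chosen only to make the example run.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(0)
emotions = ["happy", "angry", "fearful", "disgusted"]
rows = []
for group, n in [("TD", 82), ("ASD", 59)]:
    for i in range(n):
        subject = f"{group}_{i}"
        for emo in emotions:
            base = 0.85 if group == "TD" else 0.78          # hypothetical group means
            shift = {"happy": 0.08, "angry": -0.08,
                     "fearful": 0.0, "disgusted": 0.0}[emo]  # hypothetical emotion effects
            acc = np.clip(base + shift + rng.normal(0, 0.08), 0, 1)
            rows.append({"subject": subject, "group": group,
                         "emotion": emo, "accuracy": acc})
df = pd.DataFrame(rows)

# Mixed ANOVA: Diagnosis (between) x Emotion (within), DV = accuracy.
aov = pg.mixed_anova(data=df, dv="accuracy", within="emotion",
                     subject="subject", between="group")
print(aov[["Source", "DF1", "DF2", "F", "p-unc"]])
```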

Exploration patterns.  Surprisingly, both groups of children explored the mouth more than any other region, followed by the nose, left eye, right eye, and finally the remainder of the face and the hairline region (main effect of Region: F(5, 715)=542.63, p<.001).  However, exploration patterns differed across emotions, with the eyes explored relatively more, and the mouth relatively less, on angry faces than on disgusted, fearful, and happy faces (Emotion x Region interaction, F(15, 2145)=23.66, p<.001).

Exploration patterns were remarkably similar in the ASD and TD groups (non-significant Diagnosis x Region interaction); however, the groups diverged for fearful and angry expressions (significant Diagnosis x Emotion x Region interaction, F(15, 2145)=1.99, p<.05). Specifically, for angry and fearful expressions, children with ASD tended to explore the mouth more than TD children; no such differences in exploration were found for happy and disgusted expressions.

Correlations between accuracy and exploration. Controlling for age, we found that in the ASD group, accuracy for detecting fear and anger was positively correlated with exploration of the left eye (angry: r = 0.39, p<.01; fear: r = 0.33, p<.01) and the right eye (angry: r = 0.26, p<.05; fear: r = 0.33, p<.01), and was negatively correlated with exploration of the mouth (angry: r = -0.26, p<.05; fear: r = -0.37, p<.01). No correlations between accuracy and exploration time were found for the typically developing children, or for happy and disgusted faces regardless of diagnosis.
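As a concrete illustration of an age-partialled correlation like those reported above, the sketch below computes the correlation between fear-recognition accuracy and time spent exploring the left eye while controlling for age. The data and variable names are hypothetical; this is not the study's analysis code.

```python
# Sketch only: partial correlation between exploration time and accuracy,
# controlling for age. Simulated data with arbitrary placeholder effects.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(1)
n = 59  # e.g., the ASD group
age = rng.uniform(7, 14, n)                                   # years
left_eye_time = rng.normal(1.5, 0.5, n) + 0.05 * age          # seconds on left eye
fear_accuracy = 0.5 + 0.1 * left_eye_time + 0.02 * age + rng.normal(0, 0.1, n)
df = pd.DataFrame({"age": age, "left_eye_time": left_eye_time,
                   "fear_accuracy": fear_accuracy})

# Pearson correlation between left-eye exploration and accuracy, partialling out age.
res = pg.partial_corr(data=df, x="left_eye_time", y="fear_accuracy", covar="age")
print(res[["n", "r", "p-val"]])
```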

Conclusions: Despite largely similar attentional strategies for recognizing emotions, we found subtle differences in the ASD group that may contribute to their reduced behavioral performance.