Determining the Target Postures of Affective Facial Expression in Autism Spectrum Disorder

Friday, May 12, 2017: 5:00 PM-6:30 PM
Golden Gate Ballroom (Marriott Marquis Hotel)
T. Sorensen1, R. B. Grossman2 and S. Narayanan3, (1)Linguistics, University of Southern California, Los Angeles, CA, (2)FACE Lab, Emerson College, Boston, MA, (3)University of Southern California, Los Angeles, CA
Background:  The generation of affective facial expressions involves deforming the face to achieve target postures. As a disorder of social communication, Autism Spectrum Disorder (ASD) may involve an atypical inventory of target postures for affective facial expression. Identifying target postures of the face and quantifying the size of target posture inventories may provide insight into how ASD perturbs the motor organization of communicative behavior. These perturbations to affective facial expression may be among the mechanisms that give rise to the perceived atypicality of affective facial expression in ASD.

Objectives:  We used a motion capture system to identify target postures in the affective facial expressions of children with high-functioning autism (HFA) and typically developing (TD) children. Specifically, we clustered the sampled motion capture marker positions into target postures of the face, and we quantified the difference in size between the HFA and TD target posture inventories.

Methods:  Subjects included 21 children with HFA and 16 TD children, aged 9-14. The experimental task was to watch and mimic the affective facial expressions shown in short video clips. Each facial expression exhibited one of six emotions: Anger, Sadness, Disgust, Surprise, Fear, or Happiness. Each child mimicked 18 expressions. Six motion capture cameras recorded the positions of 28 markers and 4 reference sensors on the face at 100 frames per second. Using the reference sensors, we chose a basis that expressed the positions of the 28 markers independently of rigid head motion. The sampled marker positions were clustered into target postures using the k-means algorithm, applied to each recording separately. The number of clusters k was chosen as the smallest number for which the clusters explained a threshold percentage of the variance; we varied the threshold over 70%, 80%, and 90% to assess the sensitivity of the method. Explained variance was calculated as 1 minus the ratio of the within-cluster sum of squares to the total sum of squares. Lower-face markers were clustered separately from upper-face markers. A sketch of this procedure appears below.
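
The following is a minimal sketch, not the authors' implementation, of the two computational steps described above: factoring out rigid head motion using the reference sensors (here via a Kabsch alignment, which is an assumption, since the abstract does not name the alignment method) and choosing the smallest k for which k-means clustering explains the target fraction of variance. Array shapes, the marker split between lower and upper face, and the scikit-learn dependency are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans


def head_fixed_coordinates(markers, refs):
    """Express face markers in a head-fixed frame (illustrative sketch).

    markers: (T, 28, 3) face-marker positions per frame (assumed shape).
    refs:    (T, 4, 3) reference-sensor positions per frame (assumed shape).
    Each frame is rigidly aligned (rotation + translation) so that the
    reference sensors match their configuration in the first frame,
    removing rigid head motion from the marker trajectories.
    """
    template = refs[0] - refs[0].mean(axis=0)      # head-fixed reference shape
    out = np.empty_like(markers)
    for t in range(markers.shape[0]):
        ref_t = refs[t] - refs[t].mean(axis=0)
        # Kabsch algorithm: rotation aligning frame-t references to the template
        U, _, Vt = np.linalg.svd(ref_t.T @ template)
        d = np.sign(np.linalg.det(U @ Vt))
        R = U @ np.diag([1.0, 1.0, d]) @ Vt
        out[t] = (markers[t] - refs[t].mean(axis=0)) @ R
    return out


def smallest_k(frames, threshold=0.8, k_max=20, seed=0):
    """Smallest k whose k-means clusters explain >= threshold of the variance.

    frames: (T, D) flattened marker coordinates per frame.
    Explained variance = 1 - (within-cluster sum of squares / total sum of squares).
    Returns the chosen k and the cluster centers, interpreted as target postures.
    """
    tss = ((frames - frames.mean(axis=0)) ** 2).sum()
    for k in range(1, k_max + 1):
        km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(frames)
        explained = 1.0 - km.inertia_ / tss        # inertia_ is the WCSS
        if explained >= threshold:
            break
    return k, km.cluster_centers_


# Hypothetical usage for one recording, clustering lower-face markers
# separately (the marker index split is an assumption):
# aligned = head_fixed_coordinates(markers, refs)        # (T, 28, 3)
# lower = aligned[:, :14, :].reshape(len(aligned), -1)   # (T, 42)
# k_lower, postures_lower = smallest_k(lower, threshold=0.8)
```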

Results:  In our sample, children with HFA differed from their TD peers in the number of target postures achieved for Anger, Sadness, Disgust, Surprise, Fear, and Happiness. This constitutes a quantifiable difference in the size of the target posture inventories of the HFA and TD groups.

Conclusions:  Children with HFA may develop target posture inventories for affective facial expression that differ from those of their TD peers. This reflects how ASD perturbs the motor organization of communicative behavior. Further analysis of motor atypicalities will shed light on the mechanisms underlying the perceived atypicality of affective facial expression in ASD.