Identification of Autism Spectrum Disorder Subgroups Based on Facial Expression Recognition Performance Profiles: Links to Clinical Symptoms and Neurofunctional Responses

Oral Presentation
Thursday, May 2, 2019: 1:42 PM
Room: 517C (Palais des congrès de Montréal)
H. Meyer-Lindenberg1, B. Oakley2, C. Moessnang3, J. Ahmad2, H. L. Hayward4, J. Cooke2, D. V. Crawley2, R. Holt5, S. Baron-Cohen6, H. Tost3, A. Meyer-Lindenberg3, D. G. Murphy7 and E. Loth2, (1)King's College London, London, United Kingdom, (2)Forensic and Neurodevelopmental Sciences, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, United Kingdom, (3)Department of Psychiatry and Psychotherapy, Central Institute of Mental Health, University of Heidelberg, Mannheim, Germany, (4)Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, United Kingdom, (5)University of Cambridge, Cambridge, United Kingdom, (6)Autism Research Centre, Department of Psychiatry, University of Cambridge, Cambridge, United Kingdom, (7)Department of Forensic and Neurodevelopmental Sciences, Institute of Psychiatry, Psychology and Neuroscience, King's College London, London, United Kingdom
Background:

Emotion recognition, a critical aspect of social communication, is underpinned by a well-characterised functional network, including the amygdala and fusiform gyrus (FG), that is altered in autism spectrum disorder (ASD).

Impairments in emotion recognition in ASD groups relative to controls, and differences in amygdala/FG activation, have been reported in some, but not all, studies. This inconsistency may reflect differences between the tests employed, which vary in stimulus features (e.g. basic vs. complex emotions, presentation times), and/or heterogeneity among autistic individuals.

Objectives:

Our first aim was to profile behavioural emotion recognition across three facial expression recognition tasks and to use these profiles as a basis for identifying subgroups. Our second aim was to ascertain whether these subgroups differed in clinical symptoms and neurofunctional activation.

Methods:

Study participants were 148-277 autistic individuals and 107-211 control participants (aged 6-30 years; sample sizes varied by task) with either typical development (TD) or mild intellectual disability, drawn from the EU-AIMS Longitudinal European Autism Project. Participants completed three emotion recognition tasks: the Karolinska Directed Emotional Faces task (KDEF; basic emotions, long presentation times), the Reading the Mind in the Eyes Test (RMET; complex emotions/mental states identified from the eye region only), and the Fleeting Films task (FF; naturalistic stimuli, short presentation times).

First, we investigated case-control differences on each task, using accuracy and accuracy-adjusted response time as dependent variables.
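
A minimal sketch of this first step, with illustrative data only. Welch's t-test and Cohen's d are standard choices; the inverse efficiency score shown is one common way to accuracy-adjust response time, and is an assumption, as the abstract does not specify the adjustment used:

```python
import numpy as np
from scipy import stats

def cohens_d(a, b):
    """Cohen's d with pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * np.var(a, ddof=1) +
                      (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2))
    return (np.mean(a) - np.mean(b)) / pooled

def inverse_efficiency(rt, acc):
    """Accuracy-adjusted RT (RT / proportion correct); an assumption,
    not necessarily the adjustment used in the study."""
    return rt / acc

# Illustrative per-participant accuracies on one task (not study data).
rng = np.random.default_rng(0)
td_acc = rng.normal(0.76, 0.10, 180)
asd_acc = rng.normal(0.70, 0.12, 200)

t, p = stats.ttest_ind(td_acc, asd_acc, equal_var=False)  # Welch's t-test
print(f"p = {p:.1e}, d = {cohens_d(td_acc, asd_acc):.2f}")
ies = inverse_efficiency(rt=1.8, acc=0.70)  # e.g. 2.57 s per correct response
```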

Second, we used hierarchical clustering to identify subgroups based on combined performance on all three tasks.
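
A sketch of how such subgrouping could be implemented. Ward linkage on z-scored task scores is an assumption; the abstract does not state the distance metric or linkage method:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

# perf: participants x 3 matrix of scores on KDEF, RMET and FF.
# Illustrative random data only (not study data).
rng = np.random.default_rng(1)
perf = rng.normal(size=(200, 3))

z = zscore(perf, axis=0)                  # standardise each task
Z = linkage(z, method="ward")             # Ward linkage: an assumption
labels = fcluster(Z, t=2, criterion="maxclust")  # cut into two clusters
```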

Third, we related the subgroups to measures of symptom severity (Social Responsiveness Scale-Revised; SRS) and adaptive behaviour (Vineland Adaptive Behavior Scales, 2nd Edition; VABS), as well as to functional activation in the amygdala and fusiform gyrus (defined using a region-of-interest approach) during a fearful face-matching task.
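
For the neuroimaging step, per-participant region-of-interest activation values might be extracted along these lines (a sketch using nilearn; the mask and contrast-map file names are hypothetical placeholders, not the study's actual data):

```python
from nilearn.maskers import NiftiMasker

# Hypothetical file names for one participant's fearful-face contrast
# map and a left-amygdala mask.
masker = NiftiMasker(mask_img="amygdala_left.nii.gz")
voxels = masker.fit_transform("sub-01_fearful_faces_beta.nii.gz")
print(voxels.mean())  # mean beta within the ROI for this participant
```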

Results:

Individuals with ASD differed significantly from TD participants across all tasks in terms of accuracy (KDEF: p = 9.8e-05, d = 0.36; RMET: p = 2.7e-06, d = 0.43; FF: p = 5.7e-03, d = 0.38). Accuracy-adjusted response times showed significant differences in tasks with shorter presentation times (RMET: p = 5.0e-05; FF: p = 0.015).

Hierarchical clustering yielded one higher-performing and one lower-performing cluster within both the ASD and TD groups; these clusters were validated by bootstrapping.
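
The abstract does not describe the bootstrap procedure. A standard approach is to re-cluster resampled data and score recovery of each original cluster with the Jaccard index (in the spirit of clusterboot in R's fpc package). A simplified sketch, reusing z and labels from the clustering sketch above:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def jaccard(a, b):
    """Jaccard similarity between two sets of participant indices."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

def cluster_stability(z, members, n_boot=1000, seed=2):
    """Mean Jaccard recovery of one original cluster across bootstrap
    resamples (a simplified stability check; the study's procedure
    may differ)."""
    rng = np.random.default_rng(seed)
    n = z.shape[0]
    scores = []
    for _ in range(n_boot):
        idx = rng.choice(n, size=n, replace=True)   # resample participants
        lab = fcluster(linkage(z[idx], method="ward"),
                       t=2, criterion="maxclust")
        # Best-matching bootstrap cluster for the original cluster.
        scores.append(max(jaccard(members, idx[lab == k]) for k in (1, 2)))
    return float(np.mean(scores))

# e.g. cluster_stability(z, np.flatnonzero(labels == 1))
```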

In external validation against clinical measures, the two ASD clusters differed significantly from one another in symptom severity (SRS: p = 0.011) and adaptive function (VABS: p = 0.045).

In functional brain imaging, the ASD cluster with impaired emotion recognition showed significantly lower bilateral amygdala activation than the cluster with relatively intact emotion recognition (left: p = 0.017, d = 0.69; right: p = 0.017, d = 0.845).

Conclusions:

We identified two ASD subgroups based on hierarchical clustering of facial expression recognition performance profiles. Preliminary external validation suggests that the impaired subgroup has more severe clinical symptoms as well as reduced recruitment of the amygdala, a critical node in the emotion recognition network. If these findings are replicated, facial expression recognition profiles may serve as a stratification biomarker for autism.