28313
Neural Representations of Facial Identity and Emotional Expressions in Young Adults with and without ASD

Poster Presentation
Saturday, May 12, 2018: 11:30 AM-1:30 PM
Hall Grote Zaal (de Doelen ICC Rotterdam)
M. H. Hendriks1,2, C. Dillen3, N. Daniels2,3, J. Bulthé1, F. Pegado1, J. Steyaert2,4, H. Op de Beeck1 and B. Boets2,4, (1)Department of Brain and Cognition, KU Leuven, Leuven, Belgium, (2)Leuven Autism Research Consortium (LAuRes), KU Leuven, Leuven, Belgium, (3)Department of Rehabilitation Sciences, KU Leuven, Leuven, Belgium, (4)Center for Developmental Psychiatry, KU Leuven, Leuven, Belgium
Background: “Why didn’t you say hello? One of your classmates just passed by.” Most people are experts in recognizing faces and facial expressions. However, individuals with autism spectrum disorders (ASD) often have difficulties processing faces, both in recognizing facial identity (especially when they do not expect to see someone) and in interpreting feelings based on facial expressions. Even though face processing difficulties are included in the diagnostic criteria of ASD, the empirical evidence is mixed.

Objectives: The goal of this study was to quantify the quality of neural representations of facial identity and emotional expression. Even though group differences may be small at the behavioral level, there may still be differences at the brain level; we therefore examined whether neural responses differ between individuals with and without ASD while they view faces varying in identity and facial expression.

Methods: Fifty-two young adults (aged 17-23 years) participated in this study. Data were acquired while participants lay in a 3T MRI scanner and watched short movies of a dynamic face that gradually transitioned from a neutral to an emotional facial expression. The movies comprised four different identities and six different emotions. Participants pushed a button whenever the current movie differed from the previous one, regardless of whether the change occurred in identity or expression. Forty-five individuals were included in the analyses (21 ASD, 24 control participants). We tested whether different emotions and identities could be consistently classified and differentiated based on neural activity patterns in a series of face-selective brain regions. More specifically, we performed multi-voxel pattern analyses (MVPA) using a support vector machine to investigate whether different emotions and identities can be decoded from their neural activation patterns.
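As a rough illustration of this decoding approach, the sketch below shows how a linear support vector machine could be trained and evaluated on region-of-interest voxel patterns with leave-one-run-out cross-validation, here using scikit-learn. The array names, placeholder data, number of runs, and the choice of scikit-learn are assumptions for illustration only; the abstract does not specify the authors' actual analysis pipeline.

# Minimal sketch of MVPA decoding with a linear SVM (illustrative only).
# Assumes per-trial voxel patterns (e.g., beta estimates) from one
# face-selective ROI have already been extracted; all names and data
# below are hypothetical placeholders, not the authors' pipeline.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
patterns = rng.standard_normal((96, 200))   # (n_trials, n_voxels) ROI patterns
labels = np.tile(np.arange(6), 16)          # condition label per trial, e.g. six emotions
runs = np.repeat(np.arange(8), 12)          # scanner-run index for leave-one-run-out CV

# Standardize voxel values, then classify with a linear SVM;
# cross-validate by holding out one run at a time.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, patterns, labels, groups=runs, cv=LeaveOneGroupOut())
print(f"Mean decoding accuracy: {scores.mean():.2f} (chance = {1/6:.2f})")

Above-chance cross-validated accuracy in such a scheme is what "reliable decoding" of identity or expression from a brain region refers to.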

Results: Initial analyses show that different facial identities can be reliably distinguished in posterior temporal cortex, superior temporal sulcus, inferior occipital cortex, and primary visual cortex, in both groups. Likewise, different facial expressions can be decoded in these same brain regions. Thus far, our analyses suggest that there are no significant differences in neural response patterns between these two groups.

Conclusions: Neural activation patterns can be used to reliably decode facial identity and facial expression in occipito-temporal brain regions, both in adults with ASD and in neurotypical controls. The initial findings lead us to conclude that there are no detectable differences in the quality of the neural representations and activation patterns between individuals with and without ASD. At the conference, we will present more fine-grained analyses, including associations with a large battery of behavioral tasks assessing face processing.