Neurobiological Markers of Vocal Emotion Recognition in the Broader Autism Phenotype

Poster Presentation
Friday, May 3, 2019: 5:30 PM-7:00 PM
Room: 710 (Palais des congrès de Montréal)
V. Yap1, A. Connelly2, I. E. Scheffer2,3,4 and S. J. Wilson1,2, (1)Melbourne School of Psychological Sciences, The University of Melbourne, Melbourne, VIC, Australia, (2)Florey Institute of Neuroscience and Mental Health, Melbourne, VIC, Australia, (3)Department of Medicine, The University of Melbourne, Austin Health, Melbourne, VIC, Australia, (4)Department of Paediatrics, The University of Melbourne, The Royal Children's Hospital Melbourne, Melbourne, VIC, Australia
Background: Relatives of individuals with Autism Spectrum Disorder (ASD) demonstrate atypical brain activation in regions of the “social brain” when judging emotional states from the face and eyes. These findings are consistent with an ASD profile and suggest the presence of neurobiological markers of the Broader Autism Phenotype (BAP), or endophenotypes of ASD, for emotion recognition. Although there is evidence of atypical activation for emotional voices in individuals with ASD, neuroimaging studies have not examined whether the BAP in relatives is also associated with atypical brain activation for the recognition of emotional voices.

Objectives: We aimed to examine the neural correlates of vocal emotion recognition in relatives of individuals with ASD, specifically using non-linguistic emotional vocalizations known as vocal affect bursts (e.g., laughter, cries, screams).

Methods: We assessed 13 adult family members of individuals with ASD and 13 adult controls without a family history of ASD (matched on age and IQ). The family group consisted of first-degree (n = 11) and second-degree (n = 2) relatives from seven different families. Prior to recruitment, all family members were determined to have clinical markers of the BAP on extensive cognitive and behavioural testing, and all controls were screened for BAP traits on a self-report measure. The block-design fMRI task consisted of (i) “emotion” blocks, in which participants identified basic emotions in vocal affect bursts, and (ii) “gender” (baseline) blocks, in which they identified gender (male/female) in non-linguistic neutral vocalizations. We used whole-brain, fixed-effects analyses to determine brain activation in each group (“emotion > gender” contrast) and to compare activation between groups at a cluster-corrected threshold.
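The abstract does not state which analysis package was used. As a minimal sketch only, the “emotion > gender” contrast for a single participant’s run could be computed with a nilearn-style GLM workflow; the block timings, TR, and filename below are hypothetical placeholders, not the study’s actual parameters.

import pandas as pd
from nilearn.glm.first_level import FirstLevelModel

# Hypothetical alternating 20-s emotion/gender blocks (placeholder timings).
events = pd.DataFrame({
    "onset": [0, 20, 40, 60, 80, 100],
    "duration": [20] * 6,
    "trial_type": ["emotion", "gender"] * 3,
})

# Fit a first-level GLM with a canonical HRF (assumed TR of 2 s).
model = FirstLevelModel(t_r=2.0, hrf_model="glover", smoothing_fwhm=5)
model = model.fit("sub-01_task-voice_bold.nii.gz", events=events)

# Whole-brain "emotion > gender" z-map for this run.
z_map = model.compute_contrast("emotion - gender", output_type="z_score")

Group-level fixed-effects maps and the cluster-corrected between-group comparison would then be computed from these contrast images; those steps are omitted here.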

Results: The family group and controls classified vocal affect bursts at ceiling levels, indicating that brain activation was reliably related to correct task performance. Both groups largely activated similar networks (e.g., social brain, mirror neuron system and memory structures; salience, cognitive control, motor and visual networks). However, while controls demonstrated bilateral activation in the superior temporal sulcus (STS) and adjacent middle temporal gyrus (MTG), activation in these regions was left-lateralised in the family group. Between-group analyses also revealed that the family group had significantly higher activation than controls in the left lateral occipital cortex, whereas no regions showed significantly lower activation in the family group.

Conclusions: Our findings suggest that the family group used neural compensatory mechanisms to successfully classify vocal affect bursts. In particular, they appeared to engage in more effortful processing of vocal affect bursts in left rather than right temporal voice areas (e.g., STS/MTG), which could have contributed to increased left occipital activation. This is consistent with previous research indicating that temporal voice areas are connected to the ipsilateral occipital cortex and that occipital activity is modulated by attention to auditory stimuli. Increased left occipital activation may also indicate more cross-modal processing (e.g., visual imagery) to perform the task. Overall, our findings suggest a neurobiological marker of the BAP for vocal emotion recognition.