Emotion in the Voice, Emotion in the Eyes, and Emotional Intelligence and Their Relationship with Autistic Traits

Friday, 3 May 2013: 14:00-18:00
Banquet Hall (Kursaal Centre)
14:00
M. E. Stewart1, M. Ota2, C. McAdam1, K. De Busk2, S. Peppe3 and J. Cleland3, (1)Applied Psychology, Heriot-Watt University, Edinburgh, United Kingdom, (2)University of Edinburgh, Edinburgh, United Kingdom, (3)Queen Margaret University, Edinburgh, United Kingdom
Background:

People with Autism Spectrum Condition (ASC) are known to exhibit emotional processing patterns that differ from those of typically developing individuals. Such differences have been observed primarily in two areas: speech prosody and facial expressions. A central question is whether these differences in performance reflect problems in modality-specific processing (e.g., of speech or faces) or a common underlying cognitive difference. We test this in a relatively homogeneous group of students who vary in autistic trait scores, in which linguistic skills and IQ can be assumed to be less variable than in the ASC population.

Objectives:

To examine 1) whether autistic traits have independent effects on emotional processing in speech prosody and facial expression, and 2) whether both of these domains can be related to models of general emotional understanding, in particular, Theory of Mind and Emotional Intelligence.

Methods:

A total of 116 students were recruited, consisting of 80 females and 36 males, with a mean age of 21.1 years (s.d.=3.4, range 17-37). Participants completed the Autism-Spectrum Quotient (AQ; Baron-Cohen, Wheelwright, Skinner, et al., 2001); the Swinburne University Emotional Intelligence Test (SUEIT; Palmer & Stough, 2001); Reading the Mind in the Eyes (RMET; Baron-Cohen et al., 2001); and tests of emotional prosody (Stewart et al., 2013). The emotional prosody test consisted of two parts. In one, the prosodic information was embedded in sentences whose semantic/lexical content was congruent with the emotion signalled by the prosody, incongruent with it, or neutral. In the other, the same items were presented as vocalisations (‘mmm’) so that only the prosodic cues were available to the listener.

Results:

The AQ correlates strongly with emotional intelligence (r=-0.57, p<0.01) and with performance in three of the conditions of the emotional prosody test (incongruent, r=-0.43, p<0.01; neutral, r=-0.26, p<0.05; and vocalisations (‘mmm’), r=-0.57, p<0.05), but not in the congruent condition. There is no correlation between the AQ and the RMET. There is also no relationship between emotional intelligence and the other emotional measures, nor between the RMET and the other emotional measures. However, verbal IQ correlates with the RMET (r=-0.29, p<0.05).

Conclusions:

First, the Autism-Spectrum Quotient in neurotypicals was related to the identification of emotions from speech prosody, but not to the ability to interpret emotional values in the semantic content of the linguistic stimuli. Second, a measure of general emotion processing capacity (EI) showed no connection with the identification of emotion in speech, even though it was correlated with the AQ. Third, in contrast to performance in the speech task, which correlated with the AQ, no connection was found between the AQ and performance on emotion recognition from the eyes. These results are suggestive of a modality-specific explanation of the emotional processing difficulties found in ASC. However, our study did not yield conclusive evidence against the view that a more general deficit in understanding emotional states underlies the processing of speech and faces in ASC.
