Examining How Varying Levels of Verbal and Visual-Spatial Skills Relate to Emotional and Cognitive Aspects of Music Perception

Poster Presentation
Thursday, May 10, 2018: 11:30 AM-1:30 PM
Hall Grote Zaal (de Doelen ICC Rotterdam)
E. M. Quintin1, S. Sivathasan1 and H. Dahary2, (1)Educational & Counselling Psychology, McGill University, Montreal, QC, Canada, (2)McGill University, Montreal, QC, Canada
Background: There is growing evidence that music perception is a strength for people with autism spectrum disorder (ASD; Boso et al., 2009; Heaton, 2008). Music represents a unique domain in which to assess both cognitive and emotional processing: people with ASD recognize music-evoked emotions (Quintin et al., 2011; Stephenson, Quintin, et al., 2015) and show enhanced pitch discrimination (Bonnel et al., 2003, 2010), musical memory (Stanutz et al., 2014), and melodic perception (Heaton, 2003). Previous research has also demonstrated that level of intellectual functioning is related to perception of musical structure (Quintin et al., 2012) and of music-evoked emotion (Quintin et al., 2011); however, direct comparisons between cognitive and emotional processing of music in individuals with varying levels of intellectual functioning have yet to be made.

Objectives: The aim of this research is to assess how variations in intellectual functioning within the autism spectrum affect cognitive and emotional aspects of music perception.

Methods: Twenty-three adolescents with ASD, aged 12-16 years, with varying levels of cognitive functioning (Wechsler Intelligence Scale for Children [WISC-V]: Verbal Comprehension Index [VCI] range: 50-111; Visual-Spatial Index [VSI] range: 61-144) completed two musical emotion recognition tasks in which they identified happy, sad, and fearful emotions in long excerpts (mean duration of 37 seconds) and short excerpts (mean duration of 4 seconds). Participants also completed musical working memory and rhythm perception tasks.

Results: Participants were grouped into low and high VCI and VSI groups using median splits of the index scores (VCI = 80; VSI = 95). Across emotion recognition tasks, performance accuracy (M = 92%) was not influenced by participants’ level of intellectual functioning (VCI or VSI). However, participants in the low VCI group rated the emotions evoked by the excerpts as more intense than did those in the high VCI group. Participants in the low VCI group were also slower at identifying emotions in the long excerpts than those in the high VCI group, but there were no group differences in response time for the short excerpts. Additionally, VCI only marginally predicted performance accuracy on the musical working memory task (M = 81%) and did not predict accuracy on the rhythm perception task (M = 64%); accuracy on both tasks was significantly influenced by participants’ level of visual-spatial skills.

Conclusions: Overall, results suggest that adolescents with ASD are able to recognize music-evoked emotions irrespective of level of intellectual functioning, though their intensity ratings and response times may be affected by their level of verbal cognitive skills. Furthermore, individuals with varying levels of verbal comprehension were able to process cognitive aspects of music perception (i.e., musical working memory and rhythm perception); however, accuracy was more strongly influenced by visual-spatial skills. Findings support the use of targeted, strengths-based music interventions adapted to varying cognitive skills within the spectrum.