Perceptual Binding and Audiovisual Speech Perception in Autism Spectrum Disorders

Friday, May 15, 2015: 11:30 AM-1:30 PM
Imperial Ballroom (Grand America Hotel)
R. A. Stevenson1, M. Segers2, B. L. Ncube3, S. Z. Sun1, N. D. Hazlett1, J. D. N. Ruppel1, S. Ferber1 and M. D. Barense1, (1)Department of Psychology, University of Toronto, Toronto, ON, Canada, (2)Psychology, York University, Toronto, ON, Canada, (3)York University, Toronto, ON, Canada
Background: Social communication, one of the core symptom domains of autism spectrum disorder (ASD), is intrinsically reliant on sensory perception. Communicating with others requires both auditory and visual perception, and recent work has led to the hypothesis that atypical sensory perception may contribute to difficulties in speech communication. Specifically, an individual's ability to integrate multiple sensory inputs into a single, coherent percept (i.e., perceptual binding) may affect their speech perception abilities.

Objectives: Our goals were to (A) measure differences in speech-perception abilities between individuals with and without ASD, (B) determine whether perceptual binding abilities were related to speech perception in ASD, (C) determine whether this relationship was specific to speech in ASD, and (D) determine whether this relationship was specific to multisensory binding or was also present for unisensory visual binding in ASD.

Methods: Participants (N(ASD)=19, N(TD)=36, data collection ongoing, mean age=12 years) completed four commonly used perceptual binding tasks (see figure) and a speech-in-noise perception task. The four binding paradigms were: the McGurk paradigm (audiovisual, socially communicative), the sound-induced flash illusion (audiovisual, non-socially communicative), the composite face task (visual-visual, socially communicative), and the global-local composite letter task (visual-visual, non-socially communicative). These binding paradigms thus varied according to modality (audiovisual or visual-visual binding) and social communicative content (socially communicative or non-socially communicative). In the speech-in-noise task, individuals performed an audiovisual single-word recognition task in which the auditory word was embedded in background noise (word=66 dB, noise=72 dB).
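For concreteness, the short Python sketch below (illustrative only, not the authors' materials) lays out the 2x2 structure of the four binding tasks described above and the nominal signal-to-noise ratio implied by the stated stimulus levels; all names in the code are placeholders.

```python
# Illustrative sketch (not the study's code): the 2x2 design of the four
# binding paradigms and the nominal SNR of the speech-in-noise task.

# Each task is characterized by the modality pairing it probes and whether
# its stimuli are socially communicative, as described in the Methods.
binding_tasks = {
    "McGurk paradigm":              {"modality": "audiovisual",   "social": True},
    "Sound-induced flash illusion": {"modality": "audiovisual",   "social": False},
    "Composite face task":          {"modality": "visual-visual", "social": True},
    "Composite letter task":        {"modality": "visual-visual", "social": False},
}

# Speech-in-noise task: target word at 66 dB embedded in 72 dB noise,
# i.e., a -6 dB signal-to-noise ratio.
word_level_db, noise_level_db = 66, 72
snr_db = word_level_db - noise_level_db  # -6 dB

if __name__ == "__main__":
    for task, attrs in binding_tasks.items():
        print(f"{task}: {attrs['modality']}, social={attrs['social']}")
    print(f"Speech-in-noise SNR: {snr_db} dB")
```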

Results: Results to date show a significant difference in audiovisual speech perception between individuals with and without ASD (ASD=40% accuracy, TD=52% accuracy, p<0.002). We then examined the degree to which performance on the speech-in-noise task could be predicted by performance on the four binding tasks. Individuals with ASD showed a strong relationship between audiovisual speech binding, measured with the McGurk effect, and speech perception accuracy (r=0.68, p=0.003). However, individuals with ASD did not show a significant correlation between binding of audiovisual non-speech stimuli, measured with the sound-induced flash illusion, and speech perception (r=0.11). Furthermore, neither of the visual-visual binding tasks was correlated with speech perception accuracy (rs<0.10).
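As an illustration of the analysis reported above, the sketch below computes a Pearson correlation (assumed here; the abstract reports r and p values but does not name the exact correlation method) between per-participant McGurk susceptibility and speech-in-noise accuracy. The scores are randomly generated placeholders sized to the reported ASD group of 19; none of the numbers are the study's data.

```python
# Illustrative sketch of the reported brain-behavior-style analysis:
# correlating McGurk susceptibility with speech-in-noise accuracy.
# Data are hypothetical placeholders, not the study's data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_asd = 19  # sample size reported for the ASD group

# Hypothetical per-participant scores:
#   mcgurk - proportion of McGurk trials yielding a fused (illusory) percept
#   speech - proportion of words correctly identified in noise
mcgurk = rng.uniform(0.0, 1.0, n_asd)
speech = np.clip(
    0.40 + 0.3 * (mcgurk - mcgurk.mean()) + rng.normal(0, 0.1, n_asd),
    0.0, 1.0,
)

r, p = pearsonr(mcgurk, speech)
print(f"McGurk susceptibility vs. speech-in-noise accuracy: r={r:.2f}, p={p:.3f}")
```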

Conclusions: These data provide evidence of impairments in speech perception in individuals with ASD relative to individuals without ASD. Importantly, the strong correlation between performance on the McGurk paradigm and speech recognition rates suggests that the reduced ability to accurately perceive audiovisual speech may be due, in part, to a reduced ability to perceptually bind the auditory and visual components of speech. The concurrent lack of a relationship between non-speech audiovisual binding and speech perception suggests that this effect may be specific to social-linguistic processing, reflecting the core symptomatology of autism. Finally, no relationship was seen between visual-visual binding abilities and speech perception, implying that the group difference underlying reduced speech perception accuracy is specifically multisensory. These data support the hypothesis that low-level differences in sensory perception, specifically perceptual binding, contribute to speech perception impairments, and that these effects are specific to speech and to perceptual binding across sensory modalities.