Audiovisual Integration Abilities in ASD Using Music-Based Stimuli

Poster Presentation
Friday, May 11, 2018: 5:30 PM-7:00 PM
Hall Grote Zaal (de Doelen ICC Rotterdam)
L. Silverman1, A. R. Canfield2, N. Gebhard3 and M. Schutz4, (1)University of Rochester Medical Center, Rochester, NY, (2)Psychological Sciences, University of Connecticut, Storrs, CT, (3)Massachusetts General Hospital, Cambridge, MA, (4)McMaster University, Hamilton, ON, Canada
Background: Individuals with autism spectrum disorder (ASD) have difficulty with audiovisual integration, which may contribute to some of the core symptoms of the disorder. However, the full extent of their sensory integration abilities is not yet well characterized. Studies using complex, language-based tasks have found that individuals with ASD do not use lip movements or hand gestures efficiently to improve speech perception. Conversely, research with simple non-social stimuli, such as auditory beeps and visual flashes, generally suggests intact integration abilities. These findings are hard to compare because most language-based stimuli involve biological motion, whereas non-social studies tend to use computer-generated stimuli with artificial synthesized sounds and no natural human movement. It is therefore unclear whether individuals with autism can integrate natural human audiovisual information in the absence of language demands.

The current study addresses this gap by examining integration of natural human audiovisual information without language demands. This was achieved using a musical illusion documented by Schutz and Lipscomb (2007).

Methods: Participants were 24 adolescents with high-functioning ASD and 24 typically developing (TD) controls, matched on age, gender, and IQ. They watched videos of an internationally acclaimed musician performing short and long notes on the marimba (a percussion instrument similar to a xylophone). Prior research suggests that features of a musician’s performance gestures (the height and trajectory of hand movements) affect listeners’ perception of note duration: longer gestures lead to the perception of longer-sounding notes. This illusion is well documented in typical adults. The current experiment included three conditions: audiovisual, audio-alone, and video-alone. Participants were told that they would complete a computer game in which some parts contained gestures, some contained sounds, and some contained both, and that in the audiovisual condition the auditory and visual stimuli were sometimes mismatched. They were asked to judge the duration of the auditory and visual components independently. Integration was determined from participants’ estimates of note duration and the presence of an audiovisual illusion.

Results: We assessed the visual influence of gesture on perceived note duration in the two groups using a 2 (visual gesture length) × 2 (group) × 6 (note duration) repeated-measures ANOVA. There was a significant effect of visual gesture length on judged note duration, F(1, 46) = 22.00, p < .0001 (partial η² = .0454), and no interaction between gesture length and group, F(1, 46) = .0004, p = .984. This indicates that individuals with ASD integrated auditory and visual information and experienced the illusion no differently than controls. In the unimodal conditions there was a significant effect of group in both the audio-only, F(1, 46) = 9.781, p = .0003 (partial η² = .0791), and video-only, F(1, 46) = 8.988, p = .0044 (partial η² = .0642), conditions, with the ASD group providing shorter relative duration judgments, but no group × visual gesture or group × pitch interaction.
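The key comparison above — whether the groups differ in how strongly gesture length shifts duration judgments — can be illustrated with a minimal sketch. The data below are simulated (not the study's data), and collapsing the design to per-participant difference scores with t-tests is a simplification of the full repeated-measures ANOVA; group sizes and the direction of the effect follow the abstract, all other numbers are arbitrary assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

n = 24  # participants per group, as in the study
# Simulated mean duration ratings (arbitrary units) for notes paired with
# long vs. short performance gestures. Both groups are given the same
# built-in visual-influence effect (+0.5), mirroring the reported absence
# of a group x gesture interaction.
asd_long = rng.normal(5.5, 1.0, n)
asd_short = rng.normal(5.0, 1.0, n)
td_long = rng.normal(5.5, 1.0, n)
td_short = rng.normal(5.0, 1.0, n)

# Within-participant visual-influence score: long-gesture minus
# short-gesture rating. A positive score means gestures lengthened
# perceived duration (the illusion).
asd_effect = asd_long - asd_short
td_effect = td_long - td_short

# Main effect of gesture: is the pooled visual-influence score nonzero?
t_gesture, p_gesture = stats.ttest_1samp(
    np.concatenate([asd_effect, td_effect]), 0.0
)

# Group x gesture interaction: do the groups differ in visual influence?
t_interact, p_interact = stats.ttest_ind(asd_effect, td_effect)

print(f"gesture main effect: t = {t_gesture:.2f}, p = {p_gesture:.4f}")
print(f"group x gesture:     t = {t_interact:.2f}, p = {p_interact:.4f}")
```

Under this simulation the first test should come out significant and the second should not, which is the qualitative pattern reported above: a robust illusion overall, with no group difference in its magnitude.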

Conclusions: The magnitude of the audiovisual illusion in the ASD group was comparable to the illusion experienced by controls. This suggests intact integration abilities in ASD for natural, human audiovisual information in the absence of language demands.