24466
The Relationship Between Audiovisual Statistical Learning and Autistic Traits

Friday, May 12, 2017: 5:00 PM-6:30 PM
Golden Gate Ballroom (Marriott Marquis Hotel)
R. A. Stevenson1,2, J. K. Toulmin3, S. Ferber4, A. Youm4, S. E. Schulz5 and M. D. Barense4, (1)Psychology, University of Western Ontario, London, ON, Canada, (2)Brain and Mind Institute, University of Western Ontario, London, ON, Canada, (3)University of Toronto, Toronto, ON, Canada, (4)Psychology, University of Toronto, Toronto, ON, Canada, (5)Psychology, University of Western Ontario, London, ON, Canada
Background:  

Sensory issues are a pervasive symptom associated with Autism Spectrum Disorder (ASD). One commonly reported issue is a decreased ability to integrate multisensory information, driven by a reduction in multisensory temporal processing (for review, see Stevenson et al., 2015, Autism Research). The predictive coding hypothesis suggests this is due to a decreased ability to learn statistical relationships between multiple sensory inputs.

Objectives:  

  1. Directly test the statistical learning ability of individuals with and without ASD
  2. Determine whether there is a relationship between ASD traits and statistical learning ability

Methods:  

To date, participants included 63 typically developing (TD) adults and 11 children with and without ASD. Participants completed a well-characterized statistical learning paradigm. Each of three counter-balanced runs began with a three-minute adaptation phase in which an audiovisual stimulus pair (a flash-beep) was presented repeatedly with a consistent temporal relationship: synchronized, audio-leading by 235 ms, or visual-leading by 235 ms (Figure A). Participants were then presented with flash-beep stimulus pairs at varying temporal offsets (auditory-leading by 400 ms to visual-leading by 400 ms) and performed a simultaneity judgement task: “Did the flash and beep occur at the same time?” (Figure B).

Participants’ mean responses at each offset were fit with a Gaussian curve, and the temporal offset at which they were most likely to perceive the stimuli as synchronous was extracted (the point of subjective simultaneity, PSS; Figure C). This paradigm typically shifts individuals’ PSS towards the offset to which they were adapted. For example, following visual-leading adaptation, a participant’s PSS shifts to more visual-leading offsets.
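The PSS extraction described above can be sketched as follows. This is a minimal illustration, not the authors’ analysis code: it fits a Gaussian to the proportion of “synchronous” responses at each stimulus-onset asynchrony (SOA) and takes the fitted mean as the PSS. The SOA grid and response proportions below are invented for illustration, not study data.

```python
# Hedged sketch of PSS extraction from simultaneity-judgement data.
# The data values here are illustrative only.
import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa, amplitude, pss, width):
    """Proportion of 'synchronous' responses as a function of SOA (ms).
    The fitted mean (pss) is the point of subjective simultaneity."""
    return amplitude * np.exp(-((soa - pss) ** 2) / (2 * width ** 2))

# Negative SOA = auditory-leading, positive = visual-leading (ms).
soas = np.array([-400, -300, -200, -100, 0, 100, 200, 300, 400], float)
p_sync = np.array([0.05, 0.15, 0.45, 0.80, 0.95, 0.90, 0.60, 0.25, 0.10])

# Initial guesses: full amplitude, peak near 0 ms, ~150 ms width.
params, _ = curve_fit(gaussian, soas, p_sync, p0=[1.0, 0.0, 150.0])
amplitude, pss, width = params
print(f"PSS = {pss:.1f} ms")
```

A per-participant PSS obtained this way in each adaptation condition is what the adaptation shifts below are computed from.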

Participants also completed the Autism-Spectrum Quotient (AQ), a measure of the severity of five traits associated with ASD: social skills, attention switching, communication, imagination, and attention to detail. We correlated shifts in PSS, a measure of statistical learning, with each subscale score, with the a priori hypothesis that a relationship would emerge for the attention-to-detail subscale.

Results:  

Given the current distribution of participants, the data reported here are from the adult TD group; preliminary results suggest the same pattern holds for both ASD and TD children. Mean PSS was calculated for each adaptation condition (Figure D): synchronous (mean = 4.1 ms, s.e. = 6.0 ms), audio-leading (mean = 4.4 ms, s.e. = 7.4 ms), and visual-leading (mean = 51.5 ms, s.e. = 8.9 ms). PSS following auditory-leading and visual-leading adaptation were each compared with PSS following synchronous adaptation using paired t-tests (auditory-leading: t = 0.05, p = 0.95; visual-leading: t = 6.23, p = 0.00000008). The significant PSS shift in the visual-leading adaptation condition was correlated with the attention-to-detail subscale of the AQ (r = -0.45, p = 0.0009; Figure E). Thus, less adaptation was related to greater severity of the attention-to-detail ASD trait.
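The statistical comparisons reported above can be sketched as below. This is a hedged illustration with simulated numbers, not the study data: only the structure of the analysis (paired t-tests of adapted vs. synchronous-baseline PSS, then a Pearson correlation between PSS shift and an AQ subscale score) matches the abstract.

```python
# Hedged sketch of the reported analyses on simulated data (not study data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 63  # adult TD sample size reported in the abstract

# Simulated per-participant PSS values (ms).
pss_sync = rng.normal(4, 45, n)                 # synchronous-adaptation baseline
pss_visual = pss_sync + rng.normal(47, 40, n)   # shifted toward visual-leading

# Paired t-test: visual-leading adaptation vs. synchronous baseline.
t, p = stats.ttest_rel(pss_visual, pss_sync)
print(f"paired t-test: t = {t:.2f}, p = {p:.2g}")

# Correlate each participant's adaptation shift with an AQ subscale score
# (the scores here are random placeholders).
shift = pss_visual - pss_sync
attention_to_detail = rng.integers(0, 11, n).astype(float)
r, p_r = stats.pearsonr(shift, attention_to_detail)
print(f"Pearson r = {r:.2f}, p = {p_r:.2g}")
```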

Conclusions:  

Results indicate that individuals showing greater severity of the ASD trait “attention to detail” were less able to adapt to the statistics of their sensory environment. This supports the predictive coding hypothesis: individuals with a greater focus on detailed, local aspects of sensory input appear less able to learn the statistical temporal relationship between audiovisual inputs. This reduced learning is in turn likely to impair the ability to integrate multisensory stimuli, which is driven in large part by the temporal relationships between sensory inputs.