Language, Gesture, and Looking Patterns during Viewing of Social Interactions in Children with Autism Spectrum Disorder: Results from the ABC-CT Interim Analysis

Poster Presentation
Friday, May 3, 2019: 11:30 AM-1:30 PM
Room: 710 (Palais des congrès de Montréal)
M. L. McNair1, A. Naples1, D. A. Trevisan2, R. Bernier3, C. Brandt4, K. Chawarska1,5, G. Dawson6, J. Dziura4, S. Jeste7, C. A. Nelson8, F. Shic9,10, C. Sugar7, S. J. Webb3 and J. McPartland1, (1)Child Study Center, Yale University School of Medicine, New Haven, CT, (2)Faculty of Education, Simon Fraser University, Burnaby, BC, Canada, (3)Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA, (4)Yale University, New Haven, CT, (5)Child Study Center, Yale School of Medicine, New Haven, CT, (6)Department of Psychiatry and Behavioral Sciences, Duke Center for Autism and Brain Development, Durham, NC, (7)University of California, Los Angeles, Los Angeles, CA, (8)Boston Children's Hospital, Boston, MA, (9)Center for Child Health, Behavior and Development, Seattle Children's Research Institute, Seattle, WA, (10)Pediatrics, University of Washington School of Medicine, Seattle, WA
Background: Autism spectrum disorder (ASD) is characterized by social communication difficulties, impacting both language and gesture. Decreased attention to faces when viewing social interactions is known to correlate with lower language abilities in ASD; however, it remains unclear how the presence of spoken language during viewed social interactions influences looking patterns.

Objectives: To investigate relationships between linguistic and gestural abilities and looking patterns during viewing of videos of social interactions with and without spoken language.

Methods: Eye-tracking data were collected at five sites with an SR Research EyeLink 1000 Plus from 161 children with ASD between the ages of 6 and 11 years (mean age=8.71 years, mean IQ=95.80) and 64 age-matched typically developing (TD) controls (mean age=8.73 years, mean IQ=114.64). Participants viewed videos in which two people engaged in a shared activity: in one paradigm the actors spoke to each other, and in a second paradigm they did not. Receptive and expressive language function was assessed by parent report on the Vineland Adaptive Behavior Scales, 3rd Edition (Vineland-3). Gesture was assessed with the Autism Diagnostic Observation Schedule, 2nd Edition (ADOS-2) gesture scores. Repeated-measures ANOVAs compared the log-ratio of percent looking at the activity to percent looking at faces between the two paradigms. Relationships between the log-ratio and Vineland-3 communication scores were analyzed using Pearson's correlations, and relationships with ADOS-2 gesture scores were analyzed using Spearman's correlations.
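
For clarity, the dependent measure described above can be written as

log-ratio = log(% looking at activity / % looking at faces),

so larger values indicate relatively more looking at the activity, and a negative correlation with an ability score corresponds to relatively more face-looking at higher ability. A minimal sketch of this computation in Python, assuming per-child looking percentages are available in a table (all column names and values below are hypothetical illustrations, not actual ABC-CT variables or data):

```python
import numpy as np
import pandas as pd
from scipy import stats

# Hypothetical per-child summary data; columns are illustrative only.
df = pd.DataFrame({
    "pct_activity": [62.0, 55.3, 70.1],    # % of looking time on the shared activity
    "pct_face": [30.5, 38.2, 21.4],        # % of looking time on faces
    "vineland_expressive": [88, 102, 75],  # Vineland-3 expressive language score
    "ados_gesture": [2, 1, 3],             # ADOS-2 gesture score (ordinal)
})

# Dependent variable: log of the activity-to-face looking ratio.
df["log_ratio"] = np.log(df["pct_activity"] / df["pct_face"])

# Continuous language scores: Pearson correlation.
r, p = stats.pearsonr(df["log_ratio"], df["vineland_expressive"])

# Ordinal gesture scores: Spearman rank correlation.
rho, p_rho = stats.spearmanr(df["log_ratio"], df["ados_gesture"])
```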

Results: Across both paradigms, children with ASD looked significantly less at faces relative to the activity than TD children (F(1,223)=7.625, p=0.006). While the TD group showed no significant difference in looking to faces between speech and non-speech videos, children with ASD looked significantly less at faces relative to the activity during videos with speech (F(1,223)=32.931, p=0.001). In children with ASD, higher Vineland-3 expressive language scores were significantly correlated with relatively more looking at faces during the videos with speech (i.e., a lower activity-to-face log-ratio; r(161)=-0.203, p=0.010). In TD children, higher Vineland-3 receptive scores were significantly correlated with relatively more looking at faces during the speech videos (r(63)=-0.269, p=0.033). Neither Vineland-3 expressive nor receptive scores correlated significantly with looking to faces during non-speech videos in either diagnostic group, and ADOS-2 gesture scores did not correlate significantly with looking to faces during speech or non-speech videos in either group.

Conclusions: In children with ASD, the presence of spoken language in videos of social interactions was associated with decreased attention to faces; however, greater expressive language functioning in this group was related to increased attention to faces. These findings suggest that speech may modulate preferential looking to faces in ASD and that eye-tracking studies should carefully consider the content of their stimuli. Future studies should investigate how associated features of ASD, such as social anxiety, affect attention to faces while viewing verbal and non-verbal interactions.