20424
A Computational Approach to Eye-Tracking Analysis Reveals Slower Orienting to Movements in Social Scenes in Toddlers with ASD

Thursday, May 14, 2015: 5:30 PM-7:00 PM
Imperial Ballroom (Grand America Hotel)
Q. Wang1, S. Macari1, K. Chawarska1 and F. Shic2, (1)Child Study Center, Yale University School of Medicine, New Haven, CT, (2)Yale Child Study Center, Yale University School of Medicine, New Haven, CT
Background: Complex, naturalistic, dynamic scenes are becoming increasingly prevalent in eye-tracking research on autism spectrum disorder (ASD). However, conventional analyses with predefined Areas of Interest (AOIs) have limitations: they are created subjectively, are hard to apply to dynamic stimuli, and impose a binary definition of looking (“in” or “out”). Computational approaches may provide alternative quantitative analytical frameworks that are more automated, less arbitrary, and that highlight more nuanced features of atypical scanning strategies in ASD.

Objectives: To develop a framework for examining motion-driven visual scanning during observation of naturalistic scenes, and to use this framework to compare the visual scanning patterns of toddlers with ASD and typically developing (TD) controls.

Methods: Participants included 20-month-old toddlers with ASD (n=99), developmental delays (DD, n=56), and TD (n=111). Toddlers freely viewed a dynamic 3-minute scene containing four probes (Dyadic Speech, Sandwich-Making, Joint Attention, and Animated Toys), which was broken down into contiguous 100 ms segments for analysis. We first conducted a Cohesion Analysis of TD scanning patterns (Shic et al., 2012), using three-fold cross-validation to compute a ‘cohesion score’ for each segment, calculated as the sum of the inverse distances between each participant’s gaze position and those of a reference subset of TD toddlers. Next, we computed the optical flow (Horn and Schunck, 1981; Sun et al., 2010), i.e., the relative motion at the pixel level between frames of the scene. We then conducted a Latency Analysis to find the temporal lag between gaze orienting and motion in the scene that maximized the average optical flow at the participant’s gaze positions (Maximal Optical Flow). Correlations were computed between cohesion scores, maximal optical flow magnitudes, and behavioral characteristics of toddlers with ASD.
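The cohesion computation can be illustrated with a short sketch (not the authors’ code; the array layout, the resampling of gaze into per-segment (x, y) coordinates, and the epsilon guard are assumptions). For each 100 ms segment, a participant’s score is the sum of inverse distances from that participant’s gaze point to the gaze points of a TD reference subset; under three-fold cross-validation, a TD participant’s own fold would be excluded from the reference.

```python
import numpy as np

def cohesion_score(gaze_xy, reference_xy, eps=1e-6):
    """Sum of inverse distances from one gaze point (2,) to a set of
    TD reference gaze points (n_ref, 2) for the same 100 ms segment."""
    dists = np.linalg.norm(reference_xy - gaze_xy, axis=1)
    return float(np.sum(1.0 / (dists + eps)))  # closer to the TD consensus -> higher score

def cohesion_per_segment(participant_gaze, td_reference_gaze):
    """participant_gaze: (n_segments, 2); td_reference_gaze: (n_td, n_segments, 2).
    Missing gaze samples are assumed to be coded as NaN."""
    n_segments = participant_gaze.shape[0]
    scores = np.full(n_segments, np.nan)
    for s in range(n_segments):
        if np.isnan(participant_gaze[s]).any():
            continue  # no valid gaze for this participant in this segment
        ref = td_reference_gaze[:, s]
        ref = ref[~np.isnan(ref).any(axis=1)]  # drop reference toddlers missing gaze here
        if len(ref):
            scores[s] = cohesion_score(participant_gaze[s], ref)
    return scores
```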

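The latency analysis can likewise be sketched, assuming per-segment optical-flow magnitude maps have already been computed from the stimulus video (with any dense optical-flow estimator, e.g., Horn-Schunck); the candidate lag range and nearest-pixel sampling are illustrative choices, not the authors’ pipeline. For each candidate lag, the flow magnitude at the scene location the participant fixates that many segments later is averaged, and the lag maximizing this average yields the orienting latency and the maximal optical flow.

```python
import numpy as np

def latency_and_maximal_flow(gaze_xy, flow_magnitude, max_lag_segments=20):
    """gaze_xy: (n_segments, 2) gaze positions in pixel coordinates per 100 ms segment.
    flow_magnitude: (n_segments, height, width) optical-flow magnitude per segment.
    Returns (best_lag_in_segments, maximal_optical_flow)."""
    n_seg, height, width = flow_magnitude.shape
    best_lag, best_flow = None, -np.inf
    for lag in range(max_lag_segments + 1):
        samples = []
        for t in range(lag, n_seg):
            x, y = gaze_xy[t]
            if np.isnan(x) or np.isnan(y):
                continue  # skip segments without valid gaze
            col = int(np.clip(round(x), 0, width - 1))
            row = int(np.clip(round(y), 0, height - 1))
            # motion at time t - lag, at the location gaze reaches at time t
            samples.append(flow_magnitude[t - lag, row, col])
        if samples and np.mean(samples) > best_flow:
            best_lag, best_flow = lag, float(np.mean(samples))
    return best_lag, best_flow
```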
Results: Cohesion Analysis: ASD scanning patterns deviated from both the DD and TD groups on cohesion scores (p<.01, p<.001) in all four probes, with no significant differences between the DD and TD groups. Latency Analysis: The ASD group responded more slowly to motion than the TD group (p<.05), but not the DD group; this effect was driven by the Joint Attention (p<.05) and Sandwich-Making (p<.05) probes. Maximal Optical Flow: On average, toddlers with ASD looked less at motion than the TD and DD groups (p<.05, p<.05), particularly in the Sandwich-Making probe (p<.001, p<.001). Correlations: Strong correlations between cohesion scores and maximal optical flow magnitudes in all groups (r>.76, p<.001) suggest that motion has a strong organizing influence on looking behavior. Toddlers with ASD with higher cohesion scores presented with higher verbal (r=.21, p<.05) and non-verbal (r=.22, p<.05) DQ and lower ADOS autism severity (r=-.21, p<.05); those with longer lags had higher ADOS autism severity (r=.22, p<.05).

Conclusions: Our results suggest that, as a group, toddlers with ASD orient more slowly to movement in social scenes, and that this behavior is associated with developmental level and social deficits. These techniques may help characterize the nature of atypical scanning patterns in ASD and lead to new, automated instruments for identifying early symptoms of ASD.