Tablet-Based Mobile Eye Tracking for Studying Visual Preference in Children with ASD: Proof-of-Concept and Feasibility Study

Poster Presentation
Thursday, May 10, 2018: 5:30 PM-7:00 PM
Hall Grote Zaal (de Doelen ICC Rotterdam)
A. Atyabi1, Q. Wang2, C. Foster2, E. Barney3, Y. A. Ahn4, M. Kim4, B. Li4,5, C. A. Paisley6, S. M. Abdullahi7, M. L. Braconnier7, J. Lei7, C. C. Kautz8, L. L. Booth8, M. Lyons2, P. E. Ventola9 and F. Shic3, (1)Seattle Children's Research Institute, University of Washington, Seattle, WA, (2)Child Study Center, Yale University School of Medicine, New Haven, CT, (3)Center for Child Health, Behavior and Development, Seattle Children's Research Institute, Seattle, WA, (4)Seattle Children's Research Institute, Seattle, WA, (5)Computer Science and Engineering, University of Washington, Seattle, WA, (6)The University of Alabama, Tuscaloosa, AL, (7)Yale Child Study Center, New Haven, CT, (8)Yale Child Study Center, Yale School of Medicine, New Haven, CT, (9)Yale Child Study Center, Yale University School of Medicine, New Haven, CT
Background: Eye-tracking has been used for over 15 years to study atypical visual social cognition in ASD. Current access to eye-tracking technologies is limited due to the expense of eye-tracking devices, the need for highly controlled laboratory settings, and the necessity of trained personnel. This study reports on the development of a tablet-based eye-tracking platform designed to examine the feasibility of mobile eye tracking, a step towards significantly increasing the accessibility of eye tracking for clinical applications.

Objectives: To (1) examine the feasibility of tablet-based (iPad) eye-tracking; (2) provide preliminary results regarding differences between children with and without ASD; (3) conduct a machine learning investigation of diagnostic prediction using acquired data.

Methods: Three 6-minute sessions comprising side-by-side presentations of age-appropriate social target movies with distractor movies (moving machinery, sporting events, animals, or random clips from another movie) were administered to 6-12-year-old children (ASD n=29, non-ASD n=33). Each session included five calibration sets (three 5-point calibrations and two animated smooth-pursuit trajectories). Sessions paused automatically when participants' faces were not detected for more than 5 seconds.
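The auto-pause behavior described above can be sketched as a small state machine driven by per-frame face-detection results. This is a hypothetical illustration, not the platform's actual implementation; the class and method names (`SessionPauser`, `on_frame`) are invented for this sketch, and only the 5-second threshold comes from the protocol.

```python
import time

FACE_LOSS_TIMEOUT_S = 5.0  # pause threshold from the study protocol


class SessionPauser:
    """Pauses a session once no face has been detected for longer
    than the timeout; resumes as soon as a face reappears."""

    def __init__(self, timeout_s=FACE_LOSS_TIMEOUT_S):
        self.timeout_s = timeout_s
        self.last_face_time = time.monotonic()
        self.paused = False

    def on_frame(self, face_detected, now=None):
        """Call once per camera frame; returns True while paused."""
        now = time.monotonic() if now is None else now
        if face_detected:
            self.last_face_time = now
            self.paused = False
        elif now - self.last_face_time > self.timeout_s:
            self.paused = True
        return self.paused
```

In a real tablet app the `face_detected` flag would come from the eye tracker's face-detection callback, and pausing would also stop the stimulus movie and data logging.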

Results: Data Quality: 92% of sessions had <2.5° of calibration error, and more than 80% valid tracking data was recorded in 83% of sessions. Across groups, mean valid looking time was 86% (SD=7%) and mean calibration error was 1.49° (SD=1.19°), indicating robust recording and participant attention. Data Quality Group Differences: Univariate ANOVAs on calibration quality indicated group differences in smooth-pursuit calibration: the non-ASD group followed both the left half (p<.01) and the right half (p=.056) of the pursuit trajectory better than the ASD group. No differences emerged for the 5-point calibrations. Looking Pattern Group/Condition Effects: A main effect of distractor condition on %Distractor (p<.001) was found, with a trend toward a diagnosis*condition interaction (p=.063). Pairwise comparisons indicated that children looked more at machine distractors than at animals (p<.001) and dynamic naturalistic scenes (p<.01), and more at sports than at animals (p<.001). Participants with ASD looked more at the animal distractors than the non-ASD group did (p<.05). A main effect of condition on %GazeOffscreen (p<.01) and a diagnosis*condition interaction (p<.05) were also observed. Pairwise comparisons showed that children looked offscreen more during animal distractor videos than during machine (p<.001) or sports (p<.01) videos. Participants with ASD looked offscreen less during animal distractors than the non-ASD group did (p<.05). Machine Learning: Using linear support vector machines (SVMs) on predictors (%GazeOffscreen, %Distractor, %Target, and associated timings) in a repeated (n=5) leave-one-out cross-validation with nested bootstraps (n=10, group balancing), we achieved 79% average group-membership prediction accuracy.
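The classification scheme described above (repeated leave-one-out cross-validation with group-balanced nested bootstraps and a linear SVM) can be sketched as follows. This is a minimal reconstruction using scikit-learn, not the authors' code: the synthetic feature matrix, the majority-vote aggregation across bootstraps, and the helper name `loo_bootstrap_accuracy` are all assumptions made for illustration.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Synthetic stand-in data: 62 participants x 6 gaze features
# (e.g., %GazeOffscreen, %Distractor, %Target and timings);
# labels 0 = ASD (n=29), 1 = non-ASD (n=33), matching the sample sizes.
X = rng.normal(size=(62, 6))
y = np.array([0] * 29 + [1] * 33)
X[y == 1] += 0.8  # shift one group so the sketch is non-trivial


def loo_bootstrap_accuracy(X, y, n_repeats=5, n_boot=10, seed=0):
    """Repeated leave-one-out CV; each training fold is resampled with
    group-balanced bootstraps and predictions are majority-voted."""
    rng = np.random.default_rng(seed)
    correct, total = 0, 0
    for _ in range(n_repeats):
        for i in range(len(y)):
            train = np.delete(np.arange(len(y)), i)  # leave one out
            votes = []
            for _ in range(n_boot):
                # Balanced bootstrap: equal samples from each group.
                g0 = train[y[train] == 0]
                g1 = train[y[train] == 1]
                n = min(len(g0), len(g1))
                idx = np.concatenate([rng.choice(g0, n), rng.choice(g1, n)])
                clf = LinearSVC().fit(X[idx], y[idx])
                votes.append(clf.predict(X[i:i + 1])[0])
            pred = 1 if np.mean(votes) > 0.5 else 0
            correct += int(pred == y[i])
            total += 1
    return correct / total
```

With the real gaze features, the analogous procedure yielded the 79% average accuracy reported above; on this synthetic data the number is arbitrary.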

Conclusions: These analyses provide evidence for tablet-based eye tracking as a viable option for remotely quantifying eye movements in children with and without ASD. We piloted a preferential looking task with social videos and smooth-pursuit trajectories. The results revealed that children with ASD showed smooth-pursuit tracking difficulties, along with nuanced differences in scanning patterns potentially reflecting attentional biases towards animal distractors relative to controls. Results will be presented in the context of design choices and challenges of tablet-based eye tracking.