Tablet-Based Mobile Eye Tracking for Studying Visual Preference in Children with ASD: Proof-of-Concept and Feasibility Study
Objectives: To (1) examine the feasibility of tablet-based (iPad) eye tracking; (2) provide preliminary results regarding differences between children with and without ASD; (3) conduct a machine learning investigation of diagnostic prediction using the acquired data.
Methods: Three 6-minute sessions comprising side-by-side presentations of age-appropriate social target movies with distractor movies (moving machinery, sporting events, animals, or random clips from another movie) were administered to 6- to 12-year-old children (ASD n=29, non-ASD n=33). Each session included five calibration sets (three 5-point calibrations and two smooth-pursuit trajectories with animations). Sessions paused automatically when participants' faces were not detected for more than 5 seconds.
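The automatic-pause rule described above (pause when no face is detected for more than 5 seconds, resume when a face reappears) can be sketched as a small state machine. This is an illustrative reconstruction, not the study's actual implementation; `SessionController`, `on_frame`, and the timestamp interface are hypothetical names.

```python
# Hypothetical sketch of the session auto-pause rule: playback pauses once no
# face has been detected for more than 5 seconds, and resumes on redetection.
PAUSE_THRESHOLD_S = 5.0  # threshold stated in the study protocol

class SessionController:
    def __init__(self, start_time=0.0):
        self.last_face_time = start_time  # timestamp of last detected face
        self.paused = False

    def on_frame(self, face_detected, now):
        """Update state for one camera frame; return whether playback is paused."""
        if face_detected:
            self.last_face_time = now
            self.paused = False           # resume as soon as the face reappears
        elif now - self.last_face_time > PAUSE_THRESHOLD_S:
            self.paused = True            # >5 s without a face -> pause session
        return self.paused

# Example: face lost after t=2 s; session pauses once 5 s have elapsed.
ctrl = SessionController()
print(ctrl.on_frame(True, 2.0))   # False (face present)
print(ctrl.on_frame(False, 6.0))  # False (only 4 s without a face)
print(ctrl.on_frame(False, 7.5))  # True  (5.5 s without a face)
print(ctrl.on_frame(True, 8.0))   # False (face redetected, playback resumes)
```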
Results: Data Quality: 92% of sessions had calibration error below 2.5°, and more than 80% valid tracking data was recorded in 83% of sessions. Across groups, mean valid looking time was 86% (SD=7%) and mean calibration error was 1.49° (SD=1.19°), indicating robust recording and participant attention. Data Quality Group Differences: Univariate ANOVAs on calibration quality indicated group differences in smooth-pursuit calibration: the non-ASD group followed both the left half (p<.01) and the right half (p=.056) of the pursuit trajectory better than the ASD group. No differences emerged in the 5-point calibrations. Looking Pattern Group/Condition Effects: A main effect of distractor condition on %Distractor (p<.001) was found, with a trend toward a diagnosis × condition interaction (p=.063). Pairwise comparisons indicated that children looked more at machine distractors than at animals (p<.001) and dynamic naturalistic scenes (p<.01), and more at sports than at animals (p<.001). Participants with ASD looked more at animal distractors than the non-ASD group (p<.05). A main effect of condition on %GazeOffscreen (p<.01) and a diagnosis × condition interaction (p<.05) were also observed. Pairwise comparisons showed that children looked offscreen more during animal distractor videos than during machine (p<.001) and sports (p<.01) videos. Participants with ASD looked offscreen less during animal distractors than the non-ASD group (p<.05). Machine Learning: Using linear support vector machines (SVM) on gaze predictors (%GazeOffscreen, %Distractor, %Target, and associated timings) in a repeated (n=5) leave-one-out cross-validation with nested bootstraps (n=10, with group balancing), we achieved 79% average accuracy in predicting group membership.
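The classification scheme reported above (a linear SVM evaluated with repeated leave-one-out cross-validation and nested, group-balanced bootstraps) can be sketched as follows. This is an assumed reconstruction on synthetic data, not the study's code: the function name, the majority-vote aggregation across bootstrap models, and the toy features are all illustrative.

```python
# Hypothetical sketch: linear SVM with repeated leave-one-out cross-validation
# (n_repeats=5) and nested group-balanced bootstraps (n_boot=10), as described
# in the abstract. Feature names and data here are synthetic stand-ins.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.svm import SVC

def loocv_accuracy(X, y, n_repeats=5, n_boot=10, seed=0):
    """Mean held-out accuracy over repeated LOOCV with balanced bootstraps."""
    rng = np.random.default_rng(seed)
    hits = []
    for _ in range(n_repeats):
        for train_idx, test_idx in LeaveOneOut().split(X):
            votes = []
            for _ in range(n_boot):
                # Group balancing: resample equal numbers from each class.
                classes, counts = np.unique(y[train_idx], return_counts=True)
                n = counts.min()
                idx = np.concatenate([
                    rng.choice(train_idx[y[train_idx] == c], size=n, replace=True)
                    for c in classes
                ])
                clf = SVC(kernel="linear").fit(X[idx], y[idx])
                votes.append(clf.predict(X[test_idx])[0])
            # Majority vote across bootstrap models for the held-out child.
            pred = max(set(votes), key=votes.count)
            hits.append(pred == y[test_idx][0])
    return float(np.mean(hits))

# Synthetic demo: two well-separated groups on four toy gaze features.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (20, 4)), rng.normal(2, 1, (20, 4))])
y = np.array([0] * 20 + [1] * 20)
print(loocv_accuracy(X, y))
```

In this design, each child is held out in turn, and the bootstrap resampling keeps the two diagnostic groups equally represented in every training set, which guards against the classifier simply favoring the larger group.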
Conclusions: These analyses provide evidence for tablet-based eye tracking as a viable option for remotely quantifying eye movements in children with and without ASD. We piloted a preferential-looking task using social videos and smooth-pursuit trajectories. The results revealed that children with ASD showed smooth-pursuit tracking difficulties, along with nuanced differences in scanning patterns, potentially reflecting an attentional bias toward animal distractors relative to controls. Results will be presented in the context of design choices and challenges of tablet-based eye tracking.