Scalable Gaze Analysis for ASD Is in Your Pocket
Eye tracking technology is widely used in autism research, as studies show that infants and toddlers with Autism Spectrum Disorder (ASD) tend to focus on non-social features (e.g., toys, objects). Conventional autism research studies are carried out in controlled clinical settings with expensive commercial eye/gaze trackers that require explicit user calibration. This setup is not scalable, introduces environmental bias, and is challenging for participants. New computer vision techniques can estimate a person's eye gaze from a single photo taken with a cellphone camera, without any explicit calibration, opening up screening methods that are feasible beyond clinical settings.
Objectives:
To incorporate eye tracking into mobile devices for large-scale, remote data collection and for the screening and monitoring of children with ASD in naturalistic environments.
Methods:
We developed an iPhone and desktop application that displays a stimulus consisting of dots appearing sequentially on the screen. The user is asked to tap each dot as it appears, during which an image of the user's face is recorded with the front-facing camera of the iPhone (or the desktop camera). The user's face and eye regions are automatically detected using computer vision algorithms and used as input to train a neural network for gaze estimation. The network outputs the screen region within which the dot falls.
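For illustration only, the sketch below outlines a pipeline of this kind; it is not the authors' implementation. It assumes OpenCV Haar cascades for face/eye detection and PyTorch for the classifier, and the crop size, network architecture, and region count are all placeholder choices.

import cv2
import torch
import torch.nn as nn

N_REGIONS = 2  # illustrative: left vs. right half of the screen in landscape

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def extract_eye_crops(frame_bgr, size=64):
    """Detect the face, then return two resized grayscale eye crops (or None)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]
    face = gray[y:y + h, x:x + w]
    eyes = eye_cascade.detectMultiScale(face)[:2]
    crops = [cv2.resize(face[ey:ey + eh, ex:ex + ew], (size, size))
             for ex, ey, ew, eh in eyes]
    return crops if len(crops) == 2 else None

class GazeRegionNet(nn.Module):
    """Small CNN mapping two eye crops (stacked as channels) to a region label."""
    def __init__(self, n_regions=N_REGIONS):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(2, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4), nn.Flatten())
        self.head = nn.Linear(32 * 4 * 4, n_regions)

    def forward(self, eyes):  # eyes: (batch, 2, 64, 64)
        return self.head(self.conv(eyes))

# Usage (with a trained model; `frame` is a BGR image from the front camera):
#   crops = extract_eye_crops(frame)
#   if crops is not None:
#       x = torch.stack([torch.from_numpy(c).float() for c in crops])
#       region = model(x.unsqueeze(0) / 255.0).argmax(dim=1)

Framing gaze estimation as classification over a few coarse screen regions, rather than regression to an exact gaze point, is what makes the calibration-free setting tractable, and it matches how the stimuli are defined (e.g., a social half vs. a non-social half of the screen).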
Results:
We tested our eye tracking algorithm on 9 naïve subjects (7 male, 2 female) of varying ethnicity and age. With the iPhone held in landscape orientation, we achieved 93% accuracy on average in discriminating whether the user was looking at the left or right half of the screen, without explicit calibration. We also achieved over 80% accuracy discriminating among 3 regions in landscape and between 2 regions in portrait.
Conclusions:
We incorporated eye tracking technology into a mobile (or desktop) application for remote, large-scale data collection from children with ASD. We demonstrate that we can estimate the user's focus of attention on the phone screen with sufficiently high accuracy, without the need for explicit calibration. The proposed region-based gaze analysis is sufficient for ASD and other behavioral analyses when the stimuli are properly defined, e.g., a social half vs. a non-social half of the screen. Calibration-free eye tracking on mobile devices could lead to scalable new behavioral imaging methods for detecting subtle neurologic processes related to attention, and may be useful for early autism screening and monitoring.