Probabilistic, Model-Based Eye-Tracking Using Machine Learning

Poster Presentation
Thursday, May 2, 2019: 5:30 PM-7:00 PM
Room: 710 (Palais des congrès de Montréal)
J. A. Olmstead, A. Klin, S. Shultz and W. Jones, Marcus Autism Center, Children's Healthcare of Atlanta and Emory University School of Medicine, Atlanta, GA
Background:

Eye-tracking is a popular research method for neuroscience and biomedical research in general, and for studying social development and autism spectrum disorder (ASD) in particular (Jones & Klin, 2013). In theory, the technology is applicable to participants of all ages and all levels of cognitive and adaptive ability. In actuality, however, traditional calibration techniques required by many model-based eye-tracking technologies (i.e., those that use models of infrared corneal reflection and pupillary geometry to determine gaze, such as EyeLink, SMI, and ISCAN) are difficult to complete for neonates and for individuals with behavioral or cognitive challenges, including some individuals with ASD. This limitation can prevent data collection from important community stakeholders.

Objectives:

To circumvent pre-session calibration procedures for model-based eye-tracking by using post-hoc machine learning techniques, while still enabling high-fidelity data collection for studies of social visual engagement in neonates and in individuals with ASD and comorbid intellectual disability.

Methods:

By using a set of accurately calibrated, high-quality eye-tracking sessions (N = 18, Table 1) as ground-truth estimates during supervised machine learning (TensorFlow 1.12.0 with the Keras API; Abadi et al., 2015), we empirically constructed a calibration transformation from 2-D eye-image space into 2-D screen-coordinate space (Figure 1a-b). We then used an independent set of accurately calibrated, high-quality eye-tracking sessions (N = 111, Table 1) to test the calibration transformation on novel data collected under the same laboratory conditions.
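
As an illustration only (not the authors' implementation), a supervised mapping of this kind can be sketched in tf.keras as a small regression network from cropped eye images to 2-D screen coordinates. The image size, architecture, hyperparameters, and placeholder arrays below are assumptions made for the sketch; the ground-truth targets would come from the calibrated training sessions.

    # Minimal sketch, assuming 64x64 grayscale eye-image crops (hypothetical size)
    # and ground-truth (x, y) screen coordinates from calibrated sessions as targets.
    import numpy as np
    import tensorflow as tf

    def build_calibration_model(image_shape=(64, 64, 1)):
        """Regression network: eye-image space -> 2-D screen-coordinate space."""
        model = tf.keras.Sequential([
            tf.keras.layers.Conv2D(16, 3, activation='relu', input_shape=image_shape),
            tf.keras.layers.MaxPooling2D(),
            tf.keras.layers.Conv2D(32, 3, activation='relu'),
            tf.keras.layers.MaxPooling2D(),
            tf.keras.layers.Flatten(),
            tf.keras.layers.Dense(64, activation='relu'),
            tf.keras.layers.Dense(2)  # predicted (x, y) point of gaze on screen
        ])
        model.compile(optimizer='adam', loss='mse')
        return model

    # Placeholder data standing in for frames from the calibrated training sessions.
    eye_images = np.random.rand(1000, 64, 64, 1).astype('float32')  # (n_frames, 64, 64, 1)
    screen_xy = np.random.rand(1000, 2).astype('float32')           # (n_frames, 2) gaze targets

    model = build_calibration_model()
    model.fit(eye_images, screen_xy, epochs=10, batch_size=32, validation_split=0.1)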

Results:

The median absolute point-of-gaze (POG) error (actual minus predicted POG) subtended 2.64 degrees of visual angle (Figure 1b-c). We then used the empirical error distribution to demonstrate potential future area-of-interest (AOI) analyses: in videos that are segmented into AOIs at each frame, we can probabilistically determine which AOI is being fixated.
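
A hedged sketch of such a probabilistic AOI analysis is given below. It assumes only an empirical sample of POG error vectors (actual minus predicted) from held-out calibrated sessions; the rectangular AOIs, pixel units, and error values shown are hypothetical placeholders, not the authors' data.

    # Estimate P(AOI | predicted POG) by displacing the predicted point of gaze
    # with empirically observed error vectors and counting AOI hits per frame.
    import numpy as np

    def aoi_probabilities(predicted_pog, error_samples, aois):
        """predicted_pog: (2,) screen coordinates; error_samples: (n, 2) empirical errors;
        aois: dict mapping AOI name -> (x0, y0, x1, y1) rectangle for this frame."""
        candidates = predicted_pog + error_samples  # plausible true gaze locations
        probs = {}
        for name, (x0, y0, x1, y1) in aois.items():
            inside = ((candidates[:, 0] >= x0) & (candidates[:, 0] <= x1) &
                      (candidates[:, 1] >= y0) & (candidates[:, 1] <= y1))
            probs[name] = inside.mean()
        return probs

    # Hypothetical example: placeholder error vectors and two AOI rectangles.
    error_samples = np.random.normal(0, 25, size=(5000, 2))  # placeholder errors (pixels)
    aois = {'eyes': (400, 150, 600, 250), 'mouth': (420, 300, 580, 380)}
    print(aoi_probabilities(np.array([500.0, 220.0]), error_samples, aois))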

Conclusions:

Data collected with model-based eye trackers can be accurately calibrated post-session using machine learning. The estimated gaze-position error can be minimized by quantifying and indexing covariates in order to discard data that are likely to be poorly estimated. Once an error distribution has been empirically determined, it can be used to probabilistically identify fixated elements in naturalistic social scenes. The strength of this eye-tracking approach is that it can be developed and deployed to collect data from participants for whom traditional eye-tracking experiments may not be possible: neonates, or individuals with ASD who have behavioral challenges or substantial comorbid intellectual disability. Probabilistic eye-tracking analyses hold promise for extending quantitative research methods to populations of great clinical and scientific interest, allowing insight into the earliest periods of development and into the consequences of its disruption.
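
As a purely illustrative sketch of such covariate-based screening (the covariate name and cutoff below are assumptions, not the authors' method), frames can be indexed by a per-frame quality covariate and discarded when that covariate falls in a range empirically associated with large gaze error.

    # Hypothetical screening step: drop frames whose quality covariate
    # (here, a single "sharpness" score) predicts poorly estimated gaze.
    import numpy as np

    def discard_poorly_estimated(gaze_xy, sharpness, min_sharpness=0.6):
        """Keep only frames whose quality covariate exceeds an empirically chosen cutoff."""
        keep = sharpness >= min_sharpness
        return gaze_xy[keep], keep

    gaze_xy = np.random.rand(100, 2)   # placeholder predicted gaze coordinates
    sharpness = np.random.rand(100)    # placeholder per-frame covariate
    filtered_xy, mask = discard_poorly_estimated(gaze_xy, sharpness)
    print(f"Retained {mask.sum()} of {mask.size} frames")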

References:

Jones, W., & Klin, A. (2013). Attention to eyes is present but in decline in 2-6-month-old infants later diagnosed with autism. Nature, 504, 427-431.

Abadi, M., Agarwal, A., Barham, P., Brevdo, E., Chen, Z., Citro, C., Corrado, G. S., et al. (2015). TensorFlow: Large-scale machine learning on heterogeneous systems. Retrieved from https://www.tensorflow.org/