Gaze Glasses for Outcome Assessment in ASD

Friday, May 12, 2017: 5:00 PM-6:30 PM
Golden Gate Ballroom (Marriott Marquis Hotel)
K. Zhou, B. Gutierrez, V. Minces, J. Townsend and L. Chukoskie, University of California, San Diego, La Jolla, CA
Background: Successful social interaction requires marshaling visual attention and gaze quickly and accurately. However, fast shifts of visual attention (Townsend et al., 1996) and accurate shifts of gaze are deficient in individuals with ASD (Miller et al., 2014). In fact, disordered visual orienting is among the earliest signs of ASD identified in prospective studies of infant siblings (Zwaigenbaum et al., 2005), and it persists across the lifespan (Miller et al., 2014). Gaze timing and accuracy are nonetheless malleable through intervention (Chukoskie et al., 2013, 2015). Although we have tools to assess changes in gaze on a computer screen, we lack similarly objective and reliable tools for measuring changes in gaze behavior during natural social interactions. Novel outcome measures are needed to test the efficacy of interventions that aim to improve social interaction.

Objectives: We evaluated a novel glasses-based eye-tracking tool for the objective assessment of social interactions with one or more partners. We also evaluated a suite of computer vision-based tools for analyzing gaze behavior during these interactions.

Methods: We tested inexpensive eye-tracking glasses in a structured interaction: a social game played by the research participant and two other players. The analysis suite includes face- and object-recognition tools that enable analysis of gaze with respect to facial features and objects in the scene video, as well as relative to trigger events such as the onset of a sound or the movement of an object.
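The core of such an analysis is mapping each gaze sample onto the regions returned by the face- and object-recognition step. A minimal sketch of that mapping is shown below; the function and label names are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch: classify each gaze sample by whether it falls inside
# a detected face (or object) bounding box in the scene-camera frame.
# Boxes are (label, x_min, y_min, x_max, y_max) in scene-camera pixels.

def classify_gaze(gaze_x, gaze_y, boxes):
    """Return the label of the first box containing the gaze point, else 'background'."""
    for label, x0, y0, x1, y1 in boxes:
        if x0 <= gaze_x <= x1 and y0 <= gaze_y <= y1:
            return label
    return "background"

# Example frame: one detected face and one game object (labels are assumed).
boxes = [("face_player1", 200, 100, 320, 240),
         ("game_token", 400, 300, 450, 350)]
classify_gaze(250, 150, boxes)  # -> "face_player1"
classify_gaze(10, 10, boxes)    # -> "background"
```

Running this classification per video frame yields a time series of gaze labels that downstream tools can summarize.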

Results: Twelve children and adolescents with ASD wore the glasses while engaged in a game and social conversation. The glasses were comfortable, quick to don and doff, and required minimal calibration time. Using computer vision tools, we identified looks to facial features and other objects in the scene, streamlining analysis of data that would otherwise have required labor-intensive “hand coding”. With these tools we quantified the number of looks to particular objects or people, the duration of those looks, and the latency to initiate a look following a trigger event.
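The three metrics named above (look counts, look durations, and latency from a trigger event) can be computed from a labeled gaze time series along these lines; this is a hedged sketch, and all names, the sampling rate, and the data layout are assumptions for illustration.

```python
# Hypothetical sketch: summarize a per-sample gaze-label time series into
# look counts, total look durations, and latency from a trigger event.
# samples: list of (timestamp_seconds, label); names are illustrative.

from itertools import groupby

def summarize_looks(samples, fps=30.0):
    """Return {label: (number_of_looks, total_duration_seconds)}.

    A 'look' is a maximal run of consecutive samples with the same label.
    """
    stats = {}
    for label, run in groupby(samples, key=lambda s: s[1]):
        n_samples = len(list(run))
        count, dur = stats.get(label, (0, 0.0))
        stats[label] = (count + 1, dur + n_samples / fps)
    return stats

def latency_to_target(samples, trigger_time, target):
    """Seconds from the trigger event to the first look at target, or None."""
    for t, label in samples:
        if t >= trigger_time and label == target:
            return t - trigger_time
    return None

samples = [(0.00, "background"), (0.10, "face_player1"), (0.20, "face_player1"),
           (0.30, "game_token"), (0.40, "face_player1")]
summarize_looks(samples)                        # face_player1 occurs as 2 separate looks
latency_to_target(samples, 0.05, "game_token")  # time from trigger to first look at token
```

In practice the same summaries would be computed per participant and per session, giving the objective, repeatable quantities needed for an outcome measure.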

Conclusions: We recommend this novel glasses-based eye-tracking tool and its accompanying software suite as a strong candidate for building objective outcome assessments of social engagement in ASD across a range of ages.