Does Eye-Tracking during Dynamic Videos Relate to Social Interactions in High-Risk Infants?

Saturday, May 13, 2017: 12:00 PM-1:40 PM
Golden Gate Ballroom (Marriott Marquis Hotel)
A. M. Kellerman, D. S. Robinson, B. A. Jameyfield and A. J. Schwichtenberg, Purdue University, West Lafayette, IN
Background: Poorly modulated eye contact is an early behavioral risk marker for autism spectrum disorder (ASD) and is linked to facial processing difficulties and atypical joint attention development (e.g., Dawson et al., 2004). Because quantifying eye contact within ongoing social interactions is difficult, researchers often use eye-tracking technology to assess where, and for how long, individuals look to the eyes (or mouth) of a speaker or perceived social partner (e.g., Chevallier et al., 2015; Chawarska & Shic, 2009). However, relatively little is known about how these behaviors with 2D faces/partners relate to ongoing, live social interactions.

Objectives: We aim to replicate previous studies (e.g., Jones et al., 2008) by assessing prospective associations between eye-tracking-indexed looking to the eyes or mouth, elevated autism risk, and later developmental concerns. Our novel contribution is assessing how these eye-tracking indices relate to concurrent eye-contact modulation/competence within ongoing social interactions.

Methods: As part of an ongoing prospective study, 21 infant siblings of children with autism (high-risk group; n = 12) or typical development (low-risk group; n = 9) completed an eye-tracking task and the Early Social Communication Scales (ESCS; Mundy et al., 2003) at 18 months of age. A dynamic video task was administered with iMotions software and a Tobii X2-60 eye-tracker. The coded video stimuli included three trials of a woman speaking to the observer with happy, neutral, and sad expressions and tones. Eye-tracking summary data included fixations and the amount of time each infant spent attending to the speaker's eyes and mouth. For the ESCS, initiations of joint attention (IJA) were totaled as frequencies of lower-level, higher-level, and overall bids. By 36 months of age, infants/toddlers completed a developmental evaluation and were assigned to outcome groups of ASD, other, or typical development. Due to the limited sample size in the current study, outcome group was dichotomized into typical (TYP; n = 11) and non-typical (Non-TYP; n = 10).
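As an illustration of how looking-time summaries of this kind are commonly derived, the following Python sketch aggregates a hypothetical fixation export into per-trial proportions of time on eye and mouth areas of interest (AOIs). This is not the authors' actual pipeline; the column names and values are assumptions for demonstration only.

```python
# Minimal sketch: summarizing looking time to eye/mouth AOIs per trial emotion
# from a hypothetical fixation export. Columns (trial_emotion, aoi, duration_ms)
# are illustrative assumptions, not the iMotions output format.
import pandas as pd

fixations = pd.DataFrame({
    "trial_emotion": ["happy", "happy", "neutral", "sad", "sad"],
    "aoi":           ["eyes",  "mouth", "eyes",    "eyes", "other"],
    "duration_ms":   [420,     310,     550,       280,    190],
})

# Total looking time per trial emotion and AOI
dwell = (fixations
         .groupby(["trial_emotion", "aoi"])["duration_ms"]
         .sum()
         .unstack(fill_value=0))

# Proportion of recorded looking time spent on each AOI within each trial
proportions = dwell.div(dwell.sum(axis=1), axis=0)
print(proportions[["eyes", "mouth"]])
```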

Results: Risk and outcome group differences across the dynamic video task were assessed using general linear models with gender as a covariate. The high-risk group spent significantly less time looking to the speaker's eyes during the happy trials (Table 1). When assessed by outcome, the Non-TYP group spent less time attending to the speaker's eyes during the neutral trials, and Non-TYP IJA scores were positively correlated with time spent looking to the speaker's eyes during the happy and sad trials and to the speaker's mouth during the neutral trials (Table 2). These associations were not present in the TYP group.
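To make the analysis described above concrete (a general linear model of looking time with group as the predictor and gender as a covariate), here is a minimal sketch using statsmodels. The variable names and toy data are illustrative assumptions, not the study data.

```python
# Minimal sketch of a general linear model of looking time with a group
# predictor and gender covariate, as described above. Toy data only.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "eyes_happy_pct": [0.42, 0.35, 0.51, 0.28, 0.46, 0.31, 0.39, 0.25],
    "risk_group":     ["low", "high", "low", "high", "low", "high", "low", "high"],
    "gender":         ["F", "M", "M", "F", "F", "M", "M", "F"],
})

# Ordinary least squares: proportion of time on eyes ~ risk group + gender
model = smf.ols("eyes_happy_pct ~ C(risk_group) + C(gender)", data=data).fit()
print(model.summary())
```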

Conclusions: These findings partially replicate previous research (e.g., Dawson et al., 2005) and demonstrate that, even in small samples, time spent looking to a 2D speaker's eyes can serve as a risk marker for later developmental concerns in children at elevated risk for autism. Additionally, time spent attending to the eyes in emotionally salient videos (happy or sad) may serve as a proxy for eye-contact modulation/competence within ongoing social interactions. With replication, this quantification could serve as an intervention metric and/or research tool.