
Markovian Dynamics of Visual Scanning Behavior in Toddlers with ASD

Saturday, 4 May 2013: 09:00-13:00
Banquet Hall (Kursaal Centre)
G. Ramsay1, D. Lin2, W. Jones3 and A. Klin1, (1)Marcus Autism Center, Children's Healthcare of Atlanta & Emory University School of Medicine, Atlanta, GA, (2)Harvard-MIT Division of Health Sciences and Technology, Boston, MA, (3)Department of Pediatrics, Marcus Autism Center, Children's Healthcare of Atlanta, Emory University, Atlanta, GA
Background: Research has shown that children with autism exhibit atypical patterns of visual attention to social scenes relative to typically developing peers. In previous studies presenting audiovisual stimuli comprising faces and shapes synchronized to varying degrees with speech and tones, we showed that infants with ASD are relatively insensitive to the social contingencies afforded by talking faces, focusing instead on physical contingencies in the form of audiovisual synchrony between light and sound. Viewing patterns of TD/DD controls indicated a preference for synchronous faces and speech that was absent in ASD participants, even though the TD/DD/ASD groups did not differ in baseline sensitivity to audiovisual synchrony. In those studies, measures of visual fixation were derived from summary statistics comparing mean fixation durations on different parts of the screen. Such measures do not capture patterns of temporal correlation in the eyetracking trajectories, which may contain information about behavioral responses specific to autism.

Objectives: Accordingly, the goal of this research is to develop a mathematical model for parameterizing the full spatiotemporal dynamics of visual scanning behavior, and to determine whether temporal dynamics distinguish ASD from TD children.

Methods: Drawing on our research developing stochastic models of goal-directed actions, we constructed a hidden Markov process to model our data. In our model, looking patterns are characterized by a Markov chain comprising a finite-state grammar of discrete events modeling the intention to look at regions of interest within a scene, with state-dependent duration distributions modeling event timing. Probability distributions of spatial targets associated with each state model the shape of those regions. A linear system models oculomotor dynamics, smoothing out each random sequence of spatial targets. Eyetracking trajectories are modeled by transforming the state space into screen coordinates. Using the Expectation-Maximization Algorithm, we derived maximum-likelihood estimates that allow us to recover the parameters of the Markov transition kernel from training data. We also derived optimal nonlinear smoothing algorithms that enable the hidden states of the model to be estimated for any given eyetracking trajectory. Finally, we derived likelihood-ratio tests to determine which of a set of trained models is most consistent with any observed test set of eyetracking trajectories. The result is a complete system for automatically quantifying, interpreting, and classifying the full spatiotemporal dynamics of visual scanning behavior. We applied the model to eyetracking data for 20 ASD and 20 TD toddlers from previous experiments using preferential looking paradigms to assess sensitivity to audiovisual synchrony. We trained models for each diagnostic group and stimulus type, and tested for significant differences in each of the model parameters.
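The final classification step described above can be illustrated in miniature. The sketch below (not the authors' implementation; all state labels, screen coordinates, and probabilities are hypothetical) scores a gaze trajectory under two competing hidden Markov models using the forward algorithm, then applies a likelihood-ratio test to pick the best-fitting model. It stands in for the full pipeline, omitting EM training, state-duration distributions, and the oculomotor smoothing stage.

```python
# Hypothetical sketch of HMM-based likelihood-ratio classification of a
# gaze trajectory. Hidden states are screen regions of interest; each
# state emits 2-D gaze coordinates from a Gaussian.
import numpy as np

def gaussian_logpdf(x, mean, cov):
    """Log-density of a 2-D gaze sample under a Gaussian region of interest."""
    d = x - mean
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d @ np.linalg.solve(cov, d) + logdet + len(x) * np.log(2 * np.pi))

def forward_loglik(gaze, pi, A, means, covs):
    """Total log-likelihood of a gaze trajectory via the forward algorithm."""
    log_A = np.log(A)
    log_alpha = np.log(pi) + np.array(
        [gaussian_logpdf(gaze[0], m, c) for m, c in zip(means, covs)])
    for x in gaze[1:]:
        log_b = np.array([gaussian_logpdf(x, m, c) for m, c in zip(means, covs)])
        # alpha_k(t) = b_k(x_t) * sum_j alpha_j(t-1) * A[j, k], in log space
        log_alpha = log_b + np.logaddexp.reduce(log_alpha[:, None] + log_A, axis=0)
    return np.logaddexp.reduce(log_alpha)

def classify(gaze, models):
    """Likelihood-ratio test: return the name of the best-fitting model."""
    scores = {name: forward_loglik(gaze, *p) for name, p in models.items()}
    return max(scores, key=scores.get)

# Two hypothetical 2-state models (state 0 = "face" region, state 1 =
# "object" region), differing only in how strongly they favor the face.
means = np.array([[200.0, 150.0],   # face region center (pixels)
                  [600.0, 400.0]])  # object region center
covs = [400.0 * np.eye(2), 400.0 * np.eye(2)]
models = {
    "TD":  (np.array([0.8, 0.2]), np.array([[0.9, 0.1], [0.4, 0.6]]), means, covs),
    "ASD": (np.array([0.2, 0.8]), np.array([[0.6, 0.4], [0.1, 0.9]]), means, covs),
}

def simulate(T, pi, A, means, covs, rng):
    """Draw a synthetic gaze trajectory from a given model."""
    s = rng.choice(len(pi), p=pi)
    gaze = []
    for _ in range(T):
        gaze.append(rng.multivariate_normal(means[s], covs[s]))
        s = rng.choice(len(A), p=A[s])
    return np.array(gaze)

rng = np.random.default_rng(0)
gaze_td = simulate(50, *models["TD"], rng)
label = classify(gaze_td, models)
```

In the real system the model parameters would be estimated from training data by EM rather than fixed by hand, and the likelihood ratio would be computed over a held-out set of trajectories rather than a single simulated one.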

Results: We found differences in Markov parameters across diagnostic groups reflecting the temporal sequencing and timing of saccades and fixations, and these differences depended on the social nature of the stimulus. Such differences are not captured by the summary fixation statistics used in our previous analyses.

Conclusions: Significant differences in visual scanning behavior exist between ASD and TD children that cannot be fully quantified without characterizing the detailed temporal unfolding of individual looking patterns, suggesting specific mechanisms of attention that may be crucial in identifying children at risk of autism.
