The Use of Computer Vision Analysis for Early Autism Symptom Detection and Monitoring

Panel Presentation
Thursday, May 2, 2019: 10:30 AM
Room: 517A (Palais des congrès de Montréal)
G. Sapiro1, J. Hashemi1, S. Espinosa1, Z. Chang1, K. L. Carpenter2 and G. Dawson2, (1)Department of Electrical and Computer Engineering, Duke University, Durham, NC, (2)Duke Center for Autism and Brain Development, Department of Psychiatry and Behavioral Sciences, Duke University, Durham, NC
Background: Early autism symptoms include reduced social attention, failure to orient to name, and atypical emotional expressions, among others. Screening for these symptoms has relied on caregiver-completed surveys, which perform poorly for caregivers with lower levels of education and less knowledge about child development; this contributes to disparities in early detection and access to services. Expert ratings require training, making such approaches challenging in low-resource settings. We demonstrate that computer vision applications can be used in primary care clinics to quantify autism symptoms in an efficient, objective, quantitative, and reliable manner.

Objectives: Provide an overview of a research program based on downloadable, closed-loop software applications for low-cost mobile devices. The applications elicit autism behavioral symptoms with neuroscience-informed stimuli, record the child's responses with the device's camera, and quantify them via computer vision analysis to yield precise measures of visual attention, emotional facial expressions, and motor behavior.

Methods: The sample comprised 104 toddlers aged 16–31 months; 22 had autism spectrum disorder (ASD) based on the ADOS, and 82 had typical development or developmental delay. After extensive pilot testing of both the stimuli (brief movies) and how to deliver them effectively in an exam room, a set of movies that reliably elicited autism risk behaviors was shown on an iPad while the embedded camera recorded the child's attention/gaze, orienting, affective expressions, and motor responses, which were quantified via computer vision analysis.
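
To make the computer vision step concrete, the following is a minimal sketch, in Python, of per-frame head-pose estimation of the kind commonly used to quantify orienting and head movement; it is not the authors' published pipeline. It assumes six facial landmarks have already been detected in each frame (e.g., with dlib or MediaPipe), and the 3D reference points, the focal-length guess, and the head_movement_rate helper are illustrative assumptions rather than details from the study.

import numpy as np
import cv2

# Generic 3D reference points (mm) for nose tip, chin, outer eye
# corners, and mouth corners; illustrative values, not calibrated
# to any participant.
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0),        # nose tip
    (0.0, -63.6, -12.5),    # chin
    (-43.3, 32.7, -26.0),   # left eye outer corner
    (43.3, 32.7, -26.0),    # right eye outer corner
    (-28.9, -28.9, -24.1),  # left mouth corner
    (28.9, -28.9, -24.1),   # right mouth corner
])

def head_yaw(image_points, frame_size):
    """Estimate the left-right head turn (degrees) from six 2D
    landmarks (shape (6, 2), same order as MODEL_POINTS) in one
    frame; a proxy for scoring orienting."""
    h, w = frame_size
    f = float(w)  # crude assumption: focal length ~ image width
    camera_matrix = np.array([[f, 0.0, w / 2.0],
                              [0.0, f, h / 2.0],
                              [0.0, 0.0, 1.0]])
    dist_coeffs = np.zeros(4)  # assume negligible lens distortion
    ok, rvec, _ = cv2.solvePnP(MODEL_POINTS, image_points,
                               camera_matrix, dist_coeffs,
                               flags=cv2.SOLVEPNP_ITERATIVE)
    rot, _ = cv2.Rodrigues(rvec)
    # Rotation about the camera's vertical axis (left-right turn)
    return np.degrees(np.arctan2(-rot[2, 0],
                                 np.hypot(rot[2, 1], rot[2, 2])))

def head_movement_rate(yaw_series, fps):
    """Mean absolute frame-to-frame yaw change (degrees/second),
    a simple proxy for the head-movement rate used to index
    postural sway."""
    return float(np.mean(np.abs(np.diff(yaw_series))) * fps)

Tracking yaw across each movie yields both a latency to orient (the first frame on which yaw crosses a threshold after a name call) and a head-movement time series from which a sway rate can be computed.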

Results: There was strong correspondence between computer and human coding of attention (ICC = .89), orienting (ICC = .84), and affective expression (ICC = .89–.90). Significant differences between toddlers with and without ASD were found for several known symptom behaviors: non-ASD children oriented significantly more often (B = 1.89, p = .02); computer coding detected differences in latency to orient that were not readily detectable by the clinician (p = .02); and children with ASD looked less at the social stimuli than non-ASD children (p < .05). A novel finding was a group difference in the rate of head movement (postural sway; significant p values ranged from .012 to < .0001). Toddlers with ASD exhibited higher rates of head movement, suggesting difficulties in maintaining a midline head position while engaging attentional systems. Preliminary analyses of the facial expression data indicate a higher proportion of time spent with "flat affect" among children with ASD compared to non-ASD children (p < .05).
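
As a worked illustration of the agreement statistic reported above, here is a short sketch of one common intraclass correlation form, ICC(2,1) (two-way random effects, absolute agreement, single rater; Shrout & Fleiss, 1979), applied to paired computer and human codes. The abstract does not state which ICC form was used, so this choice, and the example numbers, are assumptions for illustration only.

import numpy as np

def icc2_1(ratings):
    """ICC(2,1) for an (n_targets, k_raters) array, e.g.,
    column 0 = computer coding and column 1 = human coding
    of the same trials."""
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)  # per-target means
    col_means = x.mean(axis=0)  # per-rater means
    # Mean squares from the two-way ANOVA decomposition
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)  # targets
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)  # raters
    sse = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Illustrative use with made-up scores (not study data):
# paired = np.array([[3.0, 2.8], [1.2, 1.0], [4.5, 4.6], [2.2, 2.4]])
# print(icc2_1(paired))  # approaches 1 as the two coders agree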

Conclusions: We demonstrate the feasibility and face validity of digital behavioral assessments, and their reliability relative to human coding, for detecting and quantifying well-established early autism symptoms, including failure to orient to name, reduced social attention, and flat affect. In addition, we report that digital assessments can reveal novel biomarkers, such as postural sway, that are not readily detected with the naked eye. We will describe our current NIH Autism Center of Excellence research program, which is validating a new version of our technology platform in a large, population-based sample of infants and toddlers seen in Duke pediatric primary care and assessing its utility for symptom monitoring in clinical trials.