Scalable Mobile Apps for Active Closed-Loop Behavioral Coding in Autism Spectrum Disorders

Poster Presentation
Friday, May 3, 2019: 10:00 AM-1:30 PM
Room: 710 (Palais des congrès de Montréal)
J. Hashemi1,2, Z. Chang1, K. L. Carpenter3, S. Espinosa1, A. Kaniuka4, G. Dawson5 and G. Sapiro1, (1)Department of Electrical and Computer Engineering, Duke University, Durham, NC, (2)Department of Psychiatry and Behavioral Sciences, Duke University, Durham, NC, (3)Duke Center for Autism and Brain Development, Department of Psychiatry and Behavioral Sciences, Duke University, Durham, NC, (4)Duke Center for Autism and Brain Development, Duke University, Durham, NC, (5)Department of Psychiatry and Behavioral Sciences, Duke Center for Autism and Brain Development, Durham, NC
Background: Observational behavior analysis of children plays a key role in the evaluation, monitoring, and discovery of behaviors related to autism. To date, these analyses have been performed in person by a trained clinician who administers assessments and manually codes the child's behaviors. With emerging technology and methods in computer vision and machine learning, new scalable and objective methods of behavioral elicitation and analysis are feasible. These include designing and displaying specific stimuli (for example, visual, auditory, tactile, and memory stimuli) on mobile devices while simultaneously recording the child with the device and coding the child's behaviors as they interact with it.

Objectives: To develop and deploy an all-in-one mobile paradigm that incorporates questionnaires and movie stimuli designed to elicit behaviors relevant to ASD screening, while simultaneously capturing and coding the child's behavior in an unsupervised manner.

Methods: This work is an interdisciplinary collaboration between mental health professionals, computer vision and machine learning scientists, and app developers. Caregivers digitally complete demographic questionnaires and parent reports on the device. In addition, multiple short video stimuli (<1 minute each) designed to elicit specific social and emotional responses are displayed on the mobile device while the front-facing camera records a video of the child's face. Computer vision algorithms detect and track the child's face throughout the recording. Automatically computed facial landmarks are used to estimate the child's head pose, whether the child turns their head, the degree to which the child is attending, and the movement of the child's head. Image regions around key facial landmark points are used to estimate the child's facial affect. The facial landmarks are also used to locate the child's eye regions, which, along with the face image, are fed into a neural network that estimates the child's gaze locations on the screen.
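
As an illustration of the landmark-to-head-pose step described above (a minimal sketch, not the authors' implementation), the snippet below assumes MediaPipe FaceMesh for facial landmarks and OpenCV's solvePnP for pose; the 3D model points, landmark indices, camera intrinsics, and file names are illustrative assumptions.

```python
# Sketch: per-frame head-pose estimation from facial landmarks.
# Model points, landmark indices, and intrinsics are illustrative guesses.
import cv2
import numpy as np
import mediapipe as mp

# Generic 3D face model (arbitrary units): nose tip, chin, eye corners, mouth corners.
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0),            # nose tip
    (0.0, -330.0, -65.0),       # chin
    (-225.0, 170.0, -135.0),    # image-left eye outer corner
    (225.0, 170.0, -135.0),     # image-right eye outer corner
    (-150.0, -150.0, -125.0),   # image-left mouth corner
    (150.0, -150.0, -125.0),    # image-right mouth corner
], dtype=np.float64)

# Approximate MediaPipe FaceMesh indices for the same six points.
LANDMARK_IDS = [1, 152, 33, 263, 61, 291]

def head_pose_angles(frame_bgr, face_mesh):
    """Estimate Euler angles (degrees) for the first detected face, or None."""
    h, w = frame_bgr.shape[:2]
    results = face_mesh.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_face_landmarks:
        return None
    lm = results.multi_face_landmarks[0].landmark
    image_points = np.array([(lm[i].x * w, lm[i].y * h) for i in LANDMARK_IDS],
                            dtype=np.float64)
    # Rough pinhole intrinsics derived from the image size (no per-device calibration).
    camera_matrix = np.array([[w, 0, w / 2],
                              [0, w, h / 2],
                              [0, 0, 1]], dtype=np.float64)
    ok, rvec, _tvec = cv2.solvePnP(MODEL_POINTS, image_points, camera_matrix,
                                   np.zeros(4), flags=cv2.SOLVEPNP_ITERATIVE)
    if not ok:
        return None
    rotation, _ = cv2.Rodrigues(rvec)
    angles, *_ = cv2.RQDecomp3x3(rotation)   # Euler angles in degrees
    return angles                            # angles[1] approximates yaw (left/right turn)

if __name__ == "__main__":
    mesh = mp.solutions.face_mesh.FaceMesh(static_image_mode=False, max_num_faces=1)
    cap = cv2.VideoCapture("child_session.mp4")   # hypothetical recorded session
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        pose = head_pose_angles(frame, mesh)
        if pose is not None:
            print("yaw (deg):", pose[1])
```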

Results: The data collected with our mobile applications contain features related to ASD risk behaviors. For example, we are able to assess the total time the child attends to a given stimulus; the number of times the child disengages attention; detection of a name-call prompt; whether the child performs a head turn; how quickly the child performs the head turn; the probability that the child displays positive, negative, or neutral facial affect, and the range of facial affect; gaze fixation patterns; preferential gaze; and postural sway. Our apps have been used in both in-clinic and at-home studies.
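
To make concrete how a feature such as the name-call head turn and its latency could be derived from the per-frame head-pose signal, the sketch below uses hypothetical thresholds and a hypothetical response window; it is not the authors' algorithm.

```python
# Hypothetical sketch: flag a head turn after a name-call prompt and report latency.
# The 30-degree yaw threshold and 3-second response window are illustrative only.
import numpy as np

def name_call_response(yaw_deg, fps, prompt_frame,
                       turn_threshold_deg=30.0, window_s=3.0):
    """Return (turned, latency_s) for a name-call prompt at prompt_frame.

    yaw_deg      : per-frame head yaw in degrees (0 = facing the screen)
    fps          : frame rate of the recorded video
    prompt_frame : frame index at which the name call occurs in the stimulus
    """
    yaw_deg = np.asarray(yaw_deg, dtype=float)
    end = min(len(yaw_deg), prompt_frame + int(window_s * fps))
    window = np.abs(yaw_deg[prompt_frame:end])
    turned_frames = np.flatnonzero(window > turn_threshold_deg)
    if turned_frames.size == 0:
        return False, None          # no head turn within the response window
    return True, turned_frames[0] / fps

# Example: a 30 fps recording with the name call 15 s into the clip (frame 450).
# turned, latency = name_call_response(yaw_series, fps=30, prompt_frame=450)
```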

Conclusions: We developed mobile applications that display visual and interactive stimuli designed to elicit behaviors relevant to ASD and that incorporate digital questionnaires. The applications also record the child's face with the front-facing camera, and with our computer vision and machine learning algorithms we automatically extract ASD-relevant behaviors from these recordings. Applications such as the one presented here could lead to new or refined behavioral risk-marker assessments, and to novel screening and monitoring methods outside of clinical settings.