
Using Computational Tools to Measure Social Communication and Engagement in Young Children

Thursday, 2 May 2013: 16:00
Auditorium (Kursaal Centre)
G. D. Abowd1, A. Rozga1, J. M. Rehg1 and M. Clements2, (1)School of Interactive Computing, Georgia Institute of Technology, Atlanta, GA, (2)School of Electrical and Computer Engineering, Georgia Institute of Technology, Atlanta, GA
Background: In light of recent advances in early screening and diagnosis of autism, there is a growing push for interventions targeting the early signs of autism. Many of these parent- and therapist-mediated interventions target social communication behavior and engagement, including social and joint attention, affective engagement, and nonverbal communication skills (e.g., Kasari et al., 2010; Carter et al., 2011; Casenhiser, Shanker, & Stieben, 2011; Kaale, Smith, & Sponheim, 2012). There is a clear need to develop reliable, objective measures of these skills to better assess the effectiveness of such early interventions and to identify their active ingredients.

Objectives: To develop a suite of computational tools that enable automated, quantitative measurement of key social communicative behaviors and engagement in young children during dyadic social interactions.

Methods: We are collecting rich sensor data (high-quality video and audio recordings, on-body sensing of electrodermal activity and movement) from toddlers aged 15-30 months engaged in a semi-structured play interaction with an adult examiner. The interaction consists of a series of presses for specific social communicative behaviors of interest. Data from seventy-four toddlers have been collected to date, and 24 participants were assessed a second time approximately two months after their initial visit. We use the sensor data to develop computer algorithms that automatically detect and quantify individual social communicative behaviors (e.g., attention to people and objects, smiling, gestures, vocalizations) and predict ratings of the child's engagement in the interaction. We compare the performance of the automated measurement tools against human coding of these behaviors from video.
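
To illustrate how automated detections might be compared against human coding, the short Python sketch below computes event-level precision, recall, and F1 between detector output and human-coded intervals. The interval representation, the 0.5-second matching tolerance, and all function names are illustrative assumptions, not the study's actual evaluation pipeline.

    # Hypothetical sketch: comparing automated behavior detections against
    # human-coded intervals using overlap-based precision/recall/F1.
    # Interval format, the matching tolerance, and all names are assumptions.

    def overlaps(a, b, tol=0.5):
        """True if intervals a and b (start, end in seconds) overlap within tol."""
        return a[0] <= b[1] + tol and b[0] <= a[1] + tol

    def event_agreement(auto, human, tol=0.5):
        """Match each automated detection to at most one human-coded event."""
        unmatched = list(human)
        true_pos = 0
        for det in auto:
            hit = next((h for h in unmatched if overlaps(det, h, tol)), None)
            if hit is not None:
                unmatched.remove(hit)
                true_pos += 1
        precision = true_pos / len(auto) if auto else 0.0
        recall = true_pos / len(human) if human else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        return precision, recall, f1

    # Example: automated smile detections vs. human-coded smile intervals.
    auto_smiles = [(3.1, 4.0), (12.5, 13.2), (20.0, 21.5)]
    human_smiles = [(3.0, 4.2), (19.8, 21.0)]
    print(event_agreement(auto_smiles, human_smiles))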

Results: Using commercially available software and hardware, as well as research prototypes, we have developed tools to automatically parse an interaction into its constituent parts and to detect whether the child has made eye contact, smiled, or vocalized within a given period of time. Using overhead cameras, we have developed algorithms to track the child's and examiner's heads and the objects involved in the interaction, and to determine when the child directs attention to objects or to the examiner (or shifts gaze between the two). Using a camera worn by the examiner, we have developed algorithms to detect when the child makes direct eye contact with the examiner. Finally, we have developed algorithms that use the child's speech/vocalization data to predict the examiner's ratings of the child's engagement in the interaction.
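
As a minimal sketch of how an attention target could be inferred from overhead tracking (not the algorithms developed in this work), one can compare the tracked head-facing direction against the direction from the child's head to each candidate target. The geometry, the 30-degree threshold, and the names below are assumptions for illustration only.

    # Minimal sketch (assumptions throughout): infer which target a child is
    # attending to by comparing the tracked head facing direction with the
    # direction from the head to each candidate target in the overhead view.
    import math

    def angle_to(origin, facing_deg, target):
        """Absolute angle (degrees) between facing direction and direction to target."""
        dx, dy = target[0] - origin[0], target[1] - origin[1]
        bearing = math.degrees(math.atan2(dy, dx))
        diff = (bearing - facing_deg + 180) % 360 - 180
        return abs(diff)

    def attention_target(head_pos, facing_deg, targets, max_angle=30.0):
        """Return the closest-in-angle target within max_angle, else None."""
        name, pos = min(targets.items(),
                        key=lambda kv: angle_to(head_pos, facing_deg, kv[1]))
        return name if angle_to(head_pos, facing_deg, pos) <= max_angle else None

    # Example frame: child at the origin, facing roughly toward the examiner.
    targets = {"examiner": (1.0, 0.1), "toy_truck": (-0.5, 0.8)}
    print(attention_target((0.0, 0.0), 5.0, targets))  # -> "examiner"

Run frame by frame over the tracked positions, a rule of this kind yields a time series of attention targets from which gaze shifts between objects and the examiner can be read off.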

Conclusions: Our preliminary results suggest that the building blocks of social engagement (visual attention, joint attention, affect, and vocalization) can be detected automatically from video and audio recordings. We believe these tools, if further developed, will enable researchers and clinicians to gather more objective, repeatable measures of treatment progress for children enrolled in early interventions, and to do so more efficiently and at greater densities and longer time scales than current human-based observation and measurement allow.
