25119
Robots Teaching Autistic Children to Mind Read: A Feasibility Study of Child-Robot Interaction during Emotion-Recognition Training

Friday, May 12, 2017: 10:00 AM-1:40 PM
Golden Gate Ballroom (Marriott Marquis Hotel)
A. M. Alcorn1, T. T. Tavassoli1, S. Babović Dimitrijevic2, S. Petrović2, S. Skendzic2, V. Petrović2 and E. Pellicano3, (1)UCL Institute of Education, University College London, Centre for Research in Autism and Education (CRAE), London, United Kingdom, (2)Serbian Society of Autism, Belgrade, Serbia, (3)Centre for Research in Autism and Education (CRAE), UCL Institute of Education, University College London, London, United Kingdom
Background: Autistic children often have difficulty recognising emotions and facial expressions relative to typically developing children. Several existing projects have shown promise in using robot-assisted interventions to teach social and academic skills to autistic children, including emotion recognition. Interactions with robots can be more predictable and less complex than interactions with humans, and may be more “comfortable” for autistic children. Little is known, however, about the levels of language, cognitive skill, or sensory tolerance that are necessary or desirable for robot-assisted interventions to be implemented effectively with autistic children.

Objectives: This project tested the feasibility of a robot-assisted emotion-recognition training programme for autistic children in Serbia and the UK, as a step toward developing the potential of robot-assisted interventions for this population.

Methods: Forty-two autistic children, aged between 5 and 12 years, have been assessed thus far (testing is ongoing): 23 children (3 girls) in the robot-assisted training condition and 19 children (3 girls) in the researcher-only comparison condition. The majority of children have additional intellectual disabilities and limited spoken communication. In both conditions, and over a number of sessions, we implemented steps 1-4 of the emotion training programme “Teaching Children with Autism to Mind Read” (Howlin, Baron-Cohen, & Hadwin, 1999), which is designed to teach recognition of emotions in photographic and schematic faces and identification of emotions in stories. Critically, in the robot-assisted condition, a Robokind R25 humanoid robot (“Zeno”) with realistic facial expressions helped to deliver the programme (controlled covertly by the adult). All sessions were recorded with audio, video, and depth sensors (Kinect).

Results: In each condition, children took between 3 and 8 sessions to complete the training steps or to reach a ceiling: 28 children completed all steps, 6 were unable to complete any steps, and 8 reached an intermediate step. Two of the 23 children in the robot-assisted condition were unable to engage due to sensory sensitivity. Overall, task scores and researchers’ qualitative reports suggest that participating children gained knowledge of emotions and interacted successfully and positively with the robot. Some more able participants found the current tasks and robot overly simple, and were better engaged when asked to take on a ‘teaching’ role toward him (e.g., asking the robot to give an answer after the child had responded). Analysis is ongoing.

Conclusions: A humanoid robot can be a feasible, meaningful, and engaging tool in an emotion-recognition teaching programme for autistic children with limited spoken communication. Indeed, more children successfully engaged with the robot and the tasks than researchers and parents initially predicted. Robot-assisted intervention may be particularly valuable for children who cannot easily access other interventions because of their language or cognitive abilities. This study was the first phase of data collection for this large-scale project, with further feasibility studies running in Serbia and the UK. The resulting data will form part of a large “benchmark” annotated dataset of the behaviour, gestures, and speech of autistic children, which will be made available to the wider research community. Future analyses will also compare the effectiveness of robot-assisted and researcher-only training in teaching emotional and social skills.