28106
Design of a Robot-Based Emotion-Mirroring Game to Engage Autistic Children with Emotional Expressions

Poster Presentation
Friday, May 11, 2018: 10:00 AM-1:30 PM
Hall Grote Zaal (de Doelen ICC Rotterdam)
B. R. Schadenberg1, P. Chevalier1, J. J. Li1, E. Ainger2, A. M. Alcorn2, S. Babović Dimitrijevic3, V. Charisi1, S. Petrović3, D. Reidsma1, E. Pellicano4, D. K. Heylen1 and V. Evers1, (1)Human-Media Interaction, University of Twente, Enschede, Netherlands, (2)Centre for Research in Autism and Education, University College London, London, United Kingdom, (3)Serbian Society of Autism, Belgrade, Serbia, (4)Macquarie University, Sydney, Australia
Background: Autistic children often have more difficulty recognising the emotional facial expressions of others than typically developing children do. One possible way for children to learn to recognise facial expressions may be to observe their own faces being imitated by another party. Game-like interactions that include imitation and mirroring can help young autistic children to attend to and “play” with emotional facial expressions, as a building block for recognising those expressions and grasping emotional concepts. A social robot could be particularly useful in this regard, offering a less complex, more predictable, and potentially less threatening form of interaction than a human.

Objectives: We developed a novel emotion-mirroring game in which autistic children play with a social robot that mirrors their facial expressions in real time. This study sought to determine whether autistic children understand and enjoy playing this game with the robot.

Methods: Data collection is ongoing. One autistic child (aged 8, male) from a regular elementary school in the Netherlands participated in this pilot study. The child played the emotion-mirroring game for 4 minutes, followed by another game for an additional 4 minutes. The emotion-mirroring game is set up as a triadic interaction between the child, an adult, and the robot, the Robokind R25 humanoid robot “Zeno”. Computer vision is used to track the child's facial features through a webcam, and these features are translated onto the robot's face. The game aims to gradually familiarise the child with Zeno and to prepare the child to (1) pay attention to Zeno’s facial features specifically, (2) generate facial expressions in response to Zeno, and (3) understand the cause-and-effect nature of mirroring. To that end, the game starts with the adult and child making faces in an ordinary mirror while Zeno is out of sight. Once the child understands the concept of a mirror, the adult introduces the robot. Zeno then mirrors the child's face while the adult scaffolds cause-and-effect understanding. The game ends with the child being asked to imitate Zeno's happy and sad facial expressions.
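
For illustration only, a minimal sketch of what such a webcam-to-robot mirroring loop could look like is given below. This is not the authors' implementation; the expression estimator (estimate_expression) and the robot interface (ZenoFace, with set_smile, set_brow_raise, and set_mouth_open) are hypothetical names standing in for whatever tracking and servo-control software the system actually uses.

```python
import time
import cv2  # OpenCV, used here only for webcam capture

# Hypothetical helpers -- illustrative names, not the authors' code.
from expression_tracker import estimate_expression  # assumed to return expression intensities in [0, 1]
from zeno_interface import ZenoFace                  # assumed wrapper around the robot's facial servos


def mirror_loop(camera_index: int = 0, fps: float = 10.0) -> None:
    """Continuously map the child's facial expression onto the robot's face."""
    capture = cv2.VideoCapture(camera_index)
    robot = ZenoFace()
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                continue  # skip dropped frames
            # Estimate expression intensities (e.g. smile, brow raise, mouth opening).
            expression = estimate_expression(frame)
            # Translate intensities to robot facial poses; a simple linear mapping is assumed.
            robot.set_smile(expression.get("smile", 0.0))
            robot.set_brow_raise(expression.get("brow_raise", 0.0))
            robot.set_mouth_open(expression.get("mouth_open", 0.0))
            time.sleep(1.0 / fps)  # keep the mirroring close to real time
    finally:
        capture.release()


if __name__ == "__main__":
    mirror_loop()
```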

Results: The child completed the game with ease. After playing with the mirror and being told that “now Zeno is the mirror”, the child spontaneously started making facial expressions without being explicitly asked to do so. In the last step, Zeno asked the child to imitate its facial expressions. The child imitated the happy expression by opening the mouth and raising the eyebrows, but missed the smile. The sad expression was imitated perfectly. Overall, the child appeared very interested in the robot, called it the “best visitor ever”, and was reluctant to leave at the end. Additionally, the child made one spontaneous initiation towards the robot and two towards the adult.

Conclusions: The game design appeared successful in explaining the emotion-mirroring game to the autistic child and in creating an enjoyable interaction. The next step is to assess the effectiveness of this and similar games in teaching emotion recognition to autistic children in controlled experiments.