27795
Are Some Emotions Harder Than Others? A Study of Autistic Children Recognising Human and Robot Facial Emotion Expressions in the UK and Serbia

Poster Presentation
Friday, May 11, 2018: 10:00 AM-1:30 PM
Hall Grote Zaal (de Doelen ICC Rotterdam)
D. Girard1, A. M. Alcorn2, E. Ainger2, T. Tavassoli3, S. Babović Dimitrijevic4, S. Skendžić4, S. Petrović4, V. Petrović4, E. Pellicano5 and C. De-Enigma6, (1)Psychology, University of Quebec in Montreal, Montréal, QC, Canada, (2)Centre for Research in Autism and Education, University College London, London, United Kingdom, (3)Centre for Autism, School of Psychology & Clinical Language Sciences, University of Reading, Reading, United Kingdom, (4)Serbian Society of Autism, Belgrade, Serbia, (5)Centre for Research in Autism and Education (CRAE), UCL Institute of Education, University College London, London, United Kingdom, (6)DE-ENIGMA project consortium, Enschede, Netherlands
Background: Although difficulties in accurately recognising human facial expressions are well established in autism, recent findings indicate that autistic individuals are just as good at recognising robot faces as neurotypical individuals. Several studies further highlight the potential of robot-based technology as an assistive tool in interventions for autistic children, who tend to have intact or superior abilities to comprehend and manipulate closed, rule-based, predictable systems. It is unclear, however, whether some emotions are more difficult to recognise than others in robot- versus therapist-assisted interactions.

Objectives: This study sought to compare autistic children’s ability to recognise static and dynamic facial emotion expressions (angry, afraid, happy, sad) between robot- and therapist-assisted interactions. These comparisons were made in two distinct cultural contexts (Serbia and the UK).

Methods: 128 autistic children (UK: n=62; Serbia: n=66) aged 5-12 years were assessed in a 6-step facial emotion recognition training programme for 4 basic emotions (fear, anger, happiness, sadness) based on Howlin, Baron-Cohen, and Hadwin’s (1999) approach. Steps 1-2 involved correctly matching static emotional images. Steps 3-5 involved correctly matching or identifying dynamic, real-life emotional displays. Performance on the task was operationalised as the percentage of correct answers for each emotion, separately for steps 1-2 and steps 3-5. Children were randomly assigned either to a robot-assisted condition (n=64, 10 female), in which a Robokind R25 humanoid robot (‘Zeno’) helped to deliver the programme (controlled covertly by an adult), or to a therapist-assisted comparison condition (n=64, 9 female). Researchers assessed autism symptomatology using the CARS2-ST (Schopler et al., 2010) based on direct observation and parent information.

Results: All subgroups were closely matched on age (p=.67) and had, on average, mild-to-moderate autism symptom severity. We used Mann-Whitney U tests to compare the performance of our groups, as the performance distribution for each emotion was highly skewed. For static facial expressions (steps 1-2), there were no significant differences in children’s recognition performance between robot- and therapist-assisted conditions, with the exception of fearful expressions, which were more easily recognised by Serbian children in the robot-assisted condition (U=373, p=.010). For dynamic facial expressions (steps 3-5), Serbian children in the robot-assisted condition were significantly more accurate in recognising fearful (U=398, p=.043) and sad faces (U=403, p=.043) compared to the therapist-assisted condition. There were no significant differences, however, between UK children’s recognition accuracy in robot- and therapist-assisted conditions (all ps>.05).
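The group comparisons above can be sketched with a non-parametric test of this kind. The snippet below is a minimal illustration only: the accuracy scores are synthetic (the abstract does not report per-child data), and the subgroup size of 33 is an assumption based on the reported condition sizes.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)

# Hypothetical, synthetic per-child accuracy scores (% correct) for one
# emotion; skewed distributions motivate a rank-based test.
robot_assisted = rng.beta(8, 2, size=33) * 100
therapist_assisted = rng.beta(6, 3, size=33) * 100

# Two-sided Mann-Whitney U test comparing the two conditions
u_stat, p_value = mannwhitneyu(robot_assisted, therapist_assisted,
                               alternative="two-sided")
print(f"U = {u_stat:.0f}, p = {p_value:.3f}")
```

A rank-based test is used because the percentage-correct distributions are highly skewed, which violates the normality assumption of a t-test.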

Conclusions: These results indicate that children’s performance for static and dynamic facial expressions was broadly similar across robot- and therapist-assisted interactions. For Serbian children, implementation of the emotion-training programme within the robot-assisted interaction appeared to benefit their recognition accuracy for static and dynamic fearful faces, as well as dynamic sad faces. These results suggest that the extent to which a robot-assisted training programme can facilitate emotion recognition in autistic children may vary as a function of cultural context. Moreover, the use of robot technology appears to aid recognition of only certain types of facial expression. These findings will inform the next phase of this large-scale project.