Feasibility, Enjoyment, and Effectiveness of a Robot Social Skills Intervention for Children with ASD

Poster Presentation
Friday, May 11, 2018: 10:00 AM-1:30 PM
Hall Grote Zaal (de Doelen ICC Rotterdam)
J. B. Lebersfeld, C. J. Brasher, C. D. Clesi, C. E. Stevens, F. J. Biasini and M. I. Hopkins, University of Alabama at Birmingham, Birmingham, AL
Background: Children with Autism Spectrum Disorder (ASD) demonstrate difficulty with social skills, including understanding emotional facial expressions. Children with ASD are often intrinsically motivated by technology. To capitalize on this, a robotic monkey (SAM: Socially Animated Machine) was developed to lead a social skills intervention. Prior research found that this intervention was feasible and enjoyable for children with ASD and average cognitive ability, and that participants improved in complex emotion recognition following the intervention.

Objectives: Given this prior success, the current study aimed to investigate the effectiveness of the intervention for children with ASD across all levels of cognitive ability. This study examined whether the robot intervention 1) improves emotion matching, 2) improves facial recognition, 3) improves social skills, and 4) is enjoyable for this population.

Methods: Fifteen children with ASD, ages 5-14, with well below average to average cognitive skills participated in the study. ASD diagnosis was confirmed using the Autism Diagnostic Observation Schedule, Second Edition (ADOS-2). Participants were assigned to the intervention group (n=8) or the control group (n=7). Both groups completed eight weekly sessions with the robot: the intervention group played emotion games with SAM, and the control group played non-emotion games. Emotion matching accuracy, facial recognition (Benton Facial Recognition Test), and social skills (Social Responsiveness Scale, Second Edition [SRS-2], parent and teacher forms) were compared across groups pre- and post-intervention. After completion, parents and children rated the child's enjoyment on a 0-10 scale.

Results: Univariate ANCOVAs adjusting for pre-intervention scores were conducted. One outlier was excluded from the emotion matching accuracy analysis. Accuracy improved from pre- to post-intervention in both groups (Intervention: M=81.0% to M=95.2%; Control: M=72.6% to M=83.9%), and overall post-intervention accuracy did not differ between groups. Emotions were separated into simple emotions (happy, sad) and complex emotions (anger, fear, surprise, disgust). Although the groups did not differ on simple emotions, the analysis of complex emotions trended toward significance (F(1,11)=4.516, p=.057; Intervention: M=92.2%, Control: M=79.2%). For individual emotions, the intervention group significantly outperformed the control group in matching facial expressions displaying fear (F(1,11)=6.637, p<.05; Intervention: M=96.4%, Control: M=75.0%) and disgust (F(1,11)=9.239, p<.05; Intervention: M=97.5%, Control: M=66.8%), but not anger or surprise. Analyses indicated no significant group differences in facial recognition or social skills. Children in both groups enjoyed talking with SAM (M=9.43, SD=1.51) and wanted to interact with the robot again (M=8.93, SD=2.13). Parents indicated that their children enjoyed the sessions (M=8.86, SD=1.56), were motivated to attend (M=8.36, SD=1.99), and would like additional interactions (M=8.64, SD=1.55).

Conclusions: The SAM robot intervention is feasible, enjoyable, and motivating for children with ASD across a range of cognitive skills. Given that children with ASD acquire skills more effectively when engaged and motivated in the learning process, this study provides further evidence for the use of robots with this population. The SAM robot intervention improved emotion recognition for some complex emotions, but not all, and improvements did not generalize to other measures; it is possible that the chosen outcome measures did not adequately capture acquired skills. Aspects of the SAM robot intervention, including the content, dose, frequency, and duration of sessions, will be explored further to maximize effects and generalizability.