How Children with Autism Spectrum Disorder Recognize Facial Expressions Displayed By a Rear-Projection Humanoid Robot

Poster Presentation
Friday, May 11, 2018: 5:30 PM-7:00 PM
Hall Grote Zaal (de Doelen ICC Rotterdam)
F. Askari1, H. Feng2, A. Gutierrez3, T. Sweeny4 and M. Mahoor2, (1)Electrical and Computer Engineering, University of Denver, Denver, CO, (2)University of Denver, Denver, CO, (3)University of Miami, Miami, FL, (4)Psychology, University of Denver, Denver, CO
Background:

Children with Autism Spectrum Disorder (ASD) have a reduced ability to perceive crucial nonverbal communication cues such as eye gaze, gestures, and facial expressions. Recent studies suggest that social robots can be effective tools for improving communication and social skills in children with ASD. One explanation put forward by several studies is that children with ASD feel more comfortable and motivated in systemized, predictable environments, such as interactions with robots.

Objectives:

A few studies have evaluated how children with ASD perceive facial expressions on humanoid robots, but none has examined facial expression perception on a rear-projected (i.e., animation-based) facially expressive humanoid robot, which provides more life-like expressions. This study evaluates how children with high-functioning autism (HFA) differ from their typically developing (TD) peers in recognizing facial expressions displayed by a life-like rear-projected humanoid robot; such a platform is also more adjustable and flexible in the expressions it can display, which benefits further studies.

Methods:

Seven HFA and seven TD children and adolescents aged 7-16 participated in this study. The study uses Ryan, a rear-projection, life-like humanoid robot. Six basic emotional facial expressions (happiness, sadness, anger, disgust, surprise, and fear) at four intensities (25%, 50%, 75%, and 100%, presented in ascending order) were shown on Ryan's face. Participants were asked to choose the expression they perceived from seven options (the six basic emotions and "none"). Responses were recorded by a research assistant. Results were analyzed to obtain the accuracy of facial expression recognition by ASD and TD children on the humanoid robot's face.
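To make the design concrete, the recognition-accuracy computation can be sketched in Python as follows. This is a minimal illustration of the protocol described above, not the authors' analysis code; the trial records and the function recognition_accuracy are hypothetical.

```python
from itertools import product

EXPRESSIONS = ["happy", "sad", "angry", "disgust", "surprised", "fear"]
INTENSITIES = [25, 50, 75, 100]      # percent, presented in ascending order
CHOICES = EXPRESSIONS + ["none"]     # the seven response options

# Full stimulus set: every expression at every intensity (24 stimuli).
stimuli = list(product(EXPRESSIONS, INTENSITIES))

# Hypothetical trial records (shown expression, intensity, participant response);
# in the study itself, responses were recorded by a research assistant.
trials = [
    ("happy", 50, "happy"),
    ("happy", 25, "none"),
    ("fear", 100, "surprised"),
    ("disgust", 75, "disgust"),
]

def recognition_accuracy(trials):
    """Fraction of correct responses for each (expression, intensity) pair."""
    correct, total = {}, {}
    for shown, intensity, response in trials:
        key = (shown, intensity)
        total[key] = total.get(key, 0) + 1
        correct[key] = correct.get(key, 0) + (response == shown)
    return {key: correct[key] / total[key] for key in total}

for (expression, intensity), acc in sorted(recognition_accuracy(trials).items()):
    print(f"{expression:>9} @ {intensity:3d}%: {acc:.0%}")
```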

Results:

We evaluated the expression intensity at which participants reached peak recognition accuracy. Recognition was best for the happy and angry expressions, for which a peak accuracy of 100% was reached at an intensity of only 50%. The same peak accuracy was reached for the surprised and sad expressions at intensities of 75% and 100%, respectively. Recognition accuracy for fear and disgust, however, never exceeded 75%, even at maximum intensity. The experiment is still in progress for TD children; results will be compared to the TD sample, and implications for intervention and clinical work will be discussed.
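As an illustration of this "peak accuracy" measure, the sketch below finds, for each expression, the lowest intensity at which its maximum accuracy is attained. The accuracy table and the helper peak_intensity are hypothetical; the numbers only echo the qualitative pattern reported above and are not the study's data.

```python
# Hypothetical per-intensity accuracy curves (expression -> {intensity: accuracy});
# these values mimic the reported pattern for illustration only.
accuracy = {
    "happy":     {25: 0.60, 50: 1.00, 75: 1.00, 100: 1.00},
    "angry":     {25: 0.55, 50: 1.00, 75: 1.00, 100: 1.00},
    "surprised": {25: 0.40, 50: 0.80, 75: 1.00, 100: 1.00},
    "sad":       {25: 0.30, 50: 0.60, 75: 0.90, 100: 1.00},
    "fear":      {25: 0.20, 50: 0.40, 75: 0.60, 100: 0.70},
    "disgust":   {25: 0.25, 50: 0.45, 75: 0.60, 100: 0.70},
}

def peak_intensity(curve):
    """Return (lowest intensity reaching peak accuracy, peak accuracy)."""
    peak = max(curve.values())
    return min(i for i, a in curve.items() if a == peak), peak

for expression, curve in accuracy.items():
    intensity, peak = peak_intensity(curve)
    print(f"{expression:>9}: peak accuracy {peak:.0%} first reached at {intensity}%")
```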

Conclusions:

Overall, these results show that children with ASD recognize negative expressions such as fear and disgust with somewhat lower accuracy than other expressions. At the same time, the children showed engagement and excitement toward the robot during the test. Moreover, most expressions were recognized well at higher intensities, suggesting that Ryan, a rear-projected life-like robot, can successfully communicate facial expressions to children, although further investigation and improvement are needed. These results serve as a basis for advancing the promising field of socially assistive robotics for autism therapy.