Positive Bias for Eye Contact in Adolescents with Autism Spectrum Disorders during Conversation with an Android Robot

Friday, May 15, 2015: 5:30 PM-7:00 PM
Imperial Ballroom (Grand America Hotel)
Y. Yoshikawa1, H. Kumazaki2,3, Y. Matsumoto4, Y. Wakita4, S. Mizushima5, S. Nemoto6, M. Miyao7, M. Nakano7, M. Mimura2 and H. Ishiguro1, (1)Graduate School of Engineering Science, Osaka University, Toyonaka, Japan, (2)Department of Neuropsychiatry, School of Medicine, Keio University, Tokyo, Japan, (3)Research Center for Child Mental Development, University of Fukui, Yoshida-gun, Fukui Prefecture, Japan, (4)The National Institute of Advanced Industrial Science and Technology, Tsukuba, Japan, (5)Research Center for Child Mental Development of University of Fukui, University of Osaka, Yoshida-gun, Fukui Prefecture, Japan, (6)Donguri clinic for developmental disorders, Tokyo, Japan, (7)National Center for Child Health and Development, Tokyo, Japan
Background: Adolescents with autism spectrum disorder (ASD) often fail to make eye contact, one of the most important social communication cues. Researchers have recently considered using humanoid robots to treat ASD-associated deficits in social communication. Although small humanoid robots have been programmed to teach social cues such as head-gaze and hand-pointing, the skills learned in these interactions have not generalized to interactions with humans. An “android” is another type of humanoid robot, one that has the appearance of a real person. Because androids closely resemble humans, they may be useful partners for adolescents with ASD to practice social interaction with humans.

Objectives: It is important to determine whether adolescents with ASD find it easier to establish eye contact, or gaze more at another’s eyes, when facing an android than when facing a person. We therefore analyzed subjects’ eye-gaze patterns as they talked with human and android partners.

Methods: Four adolescents with ASD, diagnosed according to DSM-5, and six adolescents with typical development (TD) participated in the experiment. All were of high-school age. They were asked to talk alternately with two interlocutors (a woman and a female-type android) for five sessions in total, with the first and last sessions conducted with the human interlocutor. Both interlocutors’ utterances were scripted in a deliberately ambiguous way so that the conversation remained appropriate for the subjects’ various possible replies. The timing of the android’s speech and the direction of its gaze were controlled by tele-operation, and its voice was based on recordings of the same human interlocutor. An eye-tracking device in each booth detected the subjects’ fixation points during the conversations. Areas of interest (AOIs) were identified around the interlocutors’ faces by manual registration, and the characteristics of fixations inside the AOIs were analyzed.

Results: All participants were able to talk with both interlocutors, spending about 11.5 minutes on average across the five sessions. The “looking-eye bias” was calculated as the ratio of the time the subjects’ fixations stayed in the upper region of the AOI (i.e., approximately on the eyes) to the time they stayed anywhere within the AOI (the face). The average looking-eye bias of adolescents with ASD in the android sessions (M=.61, SD=.29) was significantly higher than in the human sessions (M=.17, SD=.17) (t[3]=3.78, p<.05). There was no significant difference between the android and human sessions for TD adolescents (M=.65, SD=.26 versus M=.75, SD=.35, respectively; t[5]=.83, p>.05).
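As a minimal sketch, the bias measure and the within-subject comparison reported above could be computed as follows. The per-subject fixation times shown are hypothetical placeholders, not the study’s data, and the analysis code itself is not part of the original abstract.

    import scipy.stats as stats

    def looking_eye_bias(upper_aoi_time, total_aoi_time):
        # Ratio of fixation time in the upper (eye) region of the AOI
        # to fixation time anywhere within the whole face AOI.
        return upper_aoi_time / total_aoi_time

    # Hypothetical per-subject biases for the four ASD subjects,
    # one value per condition (android vs. human sessions).
    android = [looking_eye_bias(54.0, 60.0), looking_eye_bias(30.0, 60.0),
               looking_eye_bias(42.0, 60.0), looking_eye_bias(18.0, 60.0)]
    human = [looking_eye_bias(12.0, 60.0), looking_eye_bias(6.0, 60.0),
             looking_eye_bias(18.0, 60.0), looking_eye_bias(6.0, 60.0)]

    # Paired (within-subject) t-test across conditions, df = n - 1 = 3.
    t, p = stats.ttest_rel(android, human)
    print(f"t({len(android) - 1}) = {t:.2f}, p = {p:.3f}")

A paired test is appropriate here because each subject contributes a bias value in both conditions; the reported degrees of freedom (t[3] for the ASD group, t[5] for the TD group) match this within-subject design.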

Conclusions: Unlike TD adolescents, those with ASD appeared to avoid human eyes but not android eyes. This result encourages future applications of androids for improving eye contact and social communication. It will therefore be important to identify the aspects of android behavior and of subjects’ symptoms that underlie this tendency, and to confirm whether and how skills acquired in interactions with androids generalize to interactions with humans.