Affect and Social Behaviors of School-Aged, High Functioning Children with ASD during Robot Interaction

Friday, May 15, 2015: 10:00 AM-1:30 PM
Imperial Ballroom (Grand America Hotel)
C. Daniell1, E. S. Kim1, C. Makar1, J. Elia1, B. Scassellati2 and F. Shic3, (1)Yale University, New Haven, CT, (2)Computer Science and Mechanical Engineering, Yale University, New Haven, CT, (3)Yale Child Study Center, Yale University School of Medicine, New Haven, CT
Background: Recent group studies confirm earlier case study observations that social robots both elicit and mediate adaptive social behaviors in children with autism spectrum disorders (ASD). Robots have the potential to serve as powerful reinforcers in communication and social skills interventions. Previously, we found that a social robot mediates greater speech production than an adult in a cross-sectional group of school-aged children with high-functioning autism spectrum disorders (HFA) (Kim et al., 2013).

Objectives: To enrich our understanding of the increased speech production mediated by interaction with the social robot, by examining participants’ affect and engagement behaviors.

Methods: Participants were school-aged children with high-functioning autism (N = 26; age: M = 9.4, SD = 2.4; FSIQ: M = 94.2, SD = 11.7; diagnoses confirmed by clinical best estimate using the ADOS). Each child completed brief, semi-structured, triadic interactions with a confederate and either a social robot or another adult, in randomized crossover order. Two independent raters used qualitative, Likert-scale measures to judge video recordings of these interactions on the adaptive quality of social, regulatory, and affective behaviors. Measured behaviors included 1) affective valence; 2) quality of eye contact; 3) willingness to engage in social physical contact (e.g., shaking hands with the adult or petting the robot); 4) acknowledgement of the confederate’s or interaction partner’s spoken or nonverbal bids; and 5) the number of increasingly restrictive cues required to elicit a response to a question or an instruction to act. Interactions with both the confederate and the interaction partner were judged on each behavior. Where ratings differed by more than one point, raters discussed their ratings and reached consensus.

Results: Children with HFA exhibited more positive affect during interaction with the robot than during interaction with the adult (p < .001, d = 0.97). When interacting with the robot, children also exhibited better eye contact (p < .001, d = 0.97), greater willingness to engage in the physical-touch aspects of the structured interaction (p < .001, d = 0.71), and better acknowledgment of the interaction partner’s social overtures (p < .001, d = 0.98) than when interacting with the human adult. Previously reported increases in utterance production during robot interaction, relative to interaction with the adult partner, were positively correlated with eye contact ratings (r = .467, p < .05) and positivity of affect (r = .402, p = .052) during interaction with the robot, but not during interaction with the adult (r = −.02 and r = .08, respectively).

Conclusions: Our exploratory analyses suggest that children with HFA who enjoyed interacting with the robot, or who demonstrated better eye contact with it, showed greater robot-mediated behavioral improvement (as measured by speech production). We also found that children with HFA enjoyed interacting with the robot more than with the adult interaction partner. These results shed light on the mechanisms by which robots may facilitate communication in children with ASD. Further work will need to examine whether these increases are sustained over extended use, en route to the development of more automated tools that provide additional learning opportunities to individuals with ASD.