Virtual Humans Simulating Joint Attention Based on Real-Time Eye-Tracking

Friday, May 16, 2014
Meeting Room A601 & A602 (Marriott Marquis Atlanta)
O. Grynszpan1, B. Han2, M. Courgeon3, J. C. Martin4 and J. Nadel5, (1)University Pierre et Marie Curie, Paris, France, (2)University of Paris 8, Saint-Denis Cedex, France, (3)Lab-STICC, University of South Brittany, Brest, France, (4)LIMSI, CNRS / Université Paris-Sud, Orsay, France, (5)French National Centre for Scientific Research (CNRS), Paris, France
Background:  Gaze plays a pivotal role in social interchanges, and its use is considered altered in Autism Spectrum Disorders (ASD). The literature reports a diminished propensity to use the gaze of a social partner as a way to derive a representation of her/his mental states. Joint attention has drawn considerable research interest in the field of ASD. Indeed, the emergence of joint attention behaviours is reported to be delayed in the developmental course of ASD, and this delay predicts poorer later acquisition of social and communicative skills. Until now, most of the research devoted to assessing joint attention competencies has been based on experiments where the participant is expected to follow the gaze of another person (passive joint attention). The active situation, where the participant initiates and leads a joint attention episode, is much harder to implement, as it requires an experimental setup that can detect and react to the participant’s gaze.

Objectives:  The goal of this project is to devise and evaluate a technology enabling the study of gaze-leading paradigms. We endeavoured to create a controllable yet ecologically valid experimental setup.

Methods:  This project required merging technologies from the fields of eye-tracking and embodied virtual agents. We designed a platform that displays virtual humans endowed with the ability to follow the user’s gaze in real time. An eye-tracker detects the direction of the user’s eyes, and this signal determines the eye and head orientation of a virtual human (a simplified sketch of such a loop is given below). To test the platform, participants were placed face-to-face with two virtual humans and had to carry out a task that required selecting an object among several choices. We compared two conditions: in the experimental condition, one of the virtual humans continuously followed the direction of the participant’s eyes, while in the control condition the gazing behaviour of both virtual humans was independent of the participant.
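
The abstract does not specify how the platform was implemented; the following Python sketch only illustrates one plausible shape for such a real-time gaze-following loop. All names here (StubEyeTracker, StubAgent, the angle mapping, trial timings) are hypothetical placeholders, not the authors’ actual system.

    import math
    import time

    # Stand-ins for illustration only: a real deployment would wrap an
    # eye-tracker SDK and the rendering engine driving the virtual humans.
    class StubEyeTracker:
        """Returns a fixed on-screen gaze point; a real tracker would be polled."""
        def get_gaze_point(self):
            return (960.0, 540.0)  # (x, y) in pixels, or None if tracking is lost

    class StubAgent:
        """Stands in for a rendered virtual human."""
        def set_gaze(self, yaw, pitch):
            print(f"agent orients to yaw={yaw:+05.1f} deg, pitch={pitch:+05.1f} deg")

    def gaze_to_angles(x, y, screen_w=1920.0, screen_h=1080.0,
                       max_yaw=30.0, max_pitch=15.0):
        """Map an on-screen gaze point to head/eye yaw and pitch in degrees,
        relative to the centre of the screen."""
        yaw = ((x / screen_w) - 0.5) * 2.0 * max_yaw
        pitch = (0.5 - (y / screen_h)) * 2.0 * max_pitch
        return yaw, pitch

    def run_trial(tracker, agent, contingent, duration_s=5.0, hz=60.0):
        """contingent=True (experimental condition): the agent continuously
        follows the participant's gaze. contingent=False (control): the agent
        sweeps its gaze on a schedule independent of the participant."""
        t0 = time.time()
        while (t := time.time() - t0) < duration_s:
            if contingent:
                point = tracker.get_gaze_point()
                if point is not None:
                    agent.set_gaze(*gaze_to_angles(*point))
            else:
                # Scripted sweep, decoupled from the participant's gaze.
                agent.set_gaze(20.0 * math.sin(0.5 * t), 0.0)
            time.sleep(1.0 / hz)

    if __name__ == "__main__":
        run_trial(StubEyeTracker(), StubAgent(), contingent=True)

The key design point, reflected in the run_trial flag, is that the experimental and control conditions differ only in whether the agent’s gaze is contingent on the participant’s, keeping everything else about the display identical.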

Results:  Technical tests showed that the platform correctly simulated gaze-following behaviours. The experimental protocol was administered to 15 typical adults. Eye-tracking data revealed that the experimental manipulation influenced the gazing behaviour of typical participants, even though they were not aware of leading the gaze of the virtual human (an illustrative analysis of such viewing patterns is sketched below). Case studies with participants having ASD will be presented.
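
The abstract does not detail how the viewing patterns were analysed. As a hedged illustration of one common approach, the sketch below computes the fraction of gaze samples falling inside an area of interest (AOI) around the gaze-following agent, per condition. The AOI bounds and the uniformly random samples are placeholders standing in for the platform’s logged data, not the study’s results.

    import numpy as np

    rng = np.random.default_rng(0)

    def fraction_in_aoi(gaze_xy, aoi):
        """Fraction of gaze samples (an (n, 2) array of screen coordinates)
        falling inside a rectangular area of interest (x0, y0, x1, y1)."""
        x0, y0, x1, y1 = aoi
        x, y = gaze_xy[:, 0], gaze_xy[:, 1]
        inside = (x >= x0) & (x <= x1) & (y >= y0) & (y <= y1)
        return float(inside.mean())

    # Hypothetical AOI covering the gaze-following virtual human's face,
    # with synthetic gaze samples in place of the platform's logs.
    agent_face_aoi = (300.0, 100.0, 700.0, 500.0)
    experimental = rng.uniform([0, 0], [1920, 1080], size=(3000, 2))
    control = rng.uniform([0, 0], [1920, 1080], size=(3000, 2))

    print("experimental:", fraction_in_aoi(experimental, agent_face_aoi))
    print("control:     ", fraction_in_aoi(control, agent_face_aoi))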

Conclusions:  We developed a novel platform that opens new avenues for studying joint attention in ASD. It addresses the lack of relevant tools for examining active joint attention, which is believed to be the critical component of joint attention impairments in ASD. The eye-tracking data yielded by the platform enable precise analyses of the viewing patterns associated with joint attention. Additionally, the platform holds great potential for assessing joint attention skills and could thus be proposed, in the long term, for clinical evaluation.