Real-Time Eye Contact Detection System

Friday, May 16, 2014
Meeting Room A601 & A602 (Marriott Marquis Atlanta)
Y. Liu (1), Y. Li (1), Z. Ye (1), F. De la Torre (2), A. Rozga (1) and J. Rehg (1), (1)School of Interactive Computing, Georgia Institute of Technology, Atlanta, GA, (2)Robotics Institute, Carnegie Mellon University, Pittsburgh, PA
Background: Eye contact is a key cue for regulating social interactions. Eye tracking and behavioral studies have shown that children with autism are less likely to look at the eye region of the face and to use eye contact while engaging with others. Thus, eye contact is a key variable that clinicians consider when evaluating children for possible signs of autism. However, in the course of such assessments it can be difficult for clinicians to collect quantitative data in real time, such as the frequency with which the child makes eye contact. Although automated eye contact detection has been explored in recent years, none of the existing systems can operate in real time.

Objectives: Our goal is to automatically detect moments of eye contact between two interaction partners in real time and to quantify these moments over the course of an interaction. We will present an interactive demo of our real-time system.

Methods: Our system uses commercially available eye-tracking glasses (SensoMotoric Instruments, SMI). The glasses record what the wearer sees via an embedded, outward-facing scene camera while simultaneously measuring the wearer's gaze fixations in real time. In our proposed use case, an adult wears the eye-tracking glasses while interacting with a child. The video of the child recorded by the outward-facing camera is processed with the Omron OKAO Vision library and the IntraFace system [Xiong and De la Torre, 2013] to estimate the child's head orientation and gaze direction. By combining these estimates with the adult's gaze fixations reported by the SMI glasses, we developed an algorithm to detect moments of mutual eye contact. We evaluated the accuracy of our eye contact detection algorithm in the context of a brief (2-3 minute) table-top interaction between an examiner and 9 children ranging in age from 15 to 27 months. Children were seated in their parent's lap. The examiner brought out a number of toys and allowed the child to explore and play with them without explicitly guiding the interaction. Two human coders used the video recorded by the SMI glasses to mark the onset and offset of each event in which the child made eye contact with the examiner.
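The abstract does not specify the detection rule itself, so the following is only a minimal sketch of one plausible per-frame rule consistent with the description above: eye contact is flagged when the adult's fixation lands on the child's face in the scene-camera frame and the child's estimated gaze points back toward the head-mounted camera. All field names, helper structures, and the 10-degree threshold are hypothetical illustrations, not the authors' implementation or the OKAO/IntraFace/SMI APIs.

```python
# Hypothetical per-frame eye contact rule; a sketch, not the authors' method.
from dataclasses import dataclass
import math

@dataclass
class FrameMeasurements:
    adult_fixation: tuple    # (x, y) SMI gaze point in scene-camera pixels (assumed)
    child_face_box: tuple    # (x_min, y_min, x_max, y_max) child face box (assumed)
    child_gaze_yaw: float    # degrees; 0 = child looking straight at the camera
    child_gaze_pitch: float  # degrees; 0 = child looking straight at the camera

def is_eye_contact(m: FrameMeasurements, gaze_thresh_deg: float = 10.0) -> bool:
    """Flag mutual eye contact on a single frame.

    Condition 1: the adult's fixation falls within the child's face region,
    i.e., the adult is looking at the child.
    Condition 2: the child's estimated gaze direction points toward the
    head-mounted camera, and hence toward the adult's face.
    """
    fx, fy = m.adult_fixation
    x0, y0, x1, y1 = m.child_face_box
    adult_on_child = x0 <= fx <= x1 and y0 <= fy <= y1
    # Angular distance of the child's gaze from the camera axis.
    child_on_adult = math.hypot(m.child_gaze_yaw, m.child_gaze_pitch) <= gaze_thresh_deg
    return adult_on_child and child_on_adult
```

In practice, such per-frame decisions would presumably be smoothed over short windows to produce onset/offset events comparable to the human coding described above.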

Results: We took the union of the eye contact events independently identified by the two coders and used it as ground truth to compare the human coding to the automated eye contact detections generated by our system. The overall accuracy of our system was 93.5%, with a precision of 63.6% and a recall of 69.2%. We note that across the 9 sessions the average agreement between the two human coders was 75%, and for individual sessions it was as low as 61.2%.
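The abstract does not spell out the scoring procedure behind these numbers. One common reading is frame-by-frame scoring against the union of the two coders' events; the sketch below illustrates that scheme, with made-up intervals and a hypothetical [onset, offset) frame-index convention.

```python
# Illustrative frame-level scoring; one plausible scheme, not the authors' code.
def rasterize(events, n_frames):
    """Convert (onset, offset) frame intervals into a per-frame boolean track."""
    track = [False] * n_frames
    for onset, offset in events:
        for t in range(onset, min(offset, n_frames)):
            track[t] = True
    return track

def evaluate(coder_a, coder_b, detections, n_frames):
    # Ground truth: a frame is positive if either coder marked it (the union).
    truth = [a or b for a, b in zip(rasterize(coder_a, n_frames),
                                    rasterize(coder_b, n_frames))]
    pred = rasterize(detections, n_frames)
    tp = sum(t and p for t, p in zip(truth, pred))
    fp = sum((not t) and p for t, p in zip(truth, pred))
    fn = sum(t and (not p) for t, p in zip(truth, pred))
    tn = n_frames - tp - fp - fn
    accuracy = (tp + tn) / n_frames
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return accuracy, precision, recall

if __name__ == "__main__":
    # Toy example; the intervals are invented for illustration only.
    coder_a = [(10, 40), (100, 130)]
    coder_b = [(12, 45)]
    detections = [(15, 42), (98, 125)]
    print(evaluate(coder_a, coder_b, detections, n_frames=200))
```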

Conclusions: We have developed a real-time eye contact detection system and demonstrated its accuracy in a setting that resembles many current screening and diagnostic assessments for autism. In the future, we plan to extend our system to detect a wider range of the child's gaze targets (e.g., objects). We will also explore whether our system could provide useful real-time feedback in therapeutic settings.