Live Internal State Interaction Monitor Using Google Glass + EDA

Friday, May 16, 2014
Meeting Room A601 & A602 (Marriott Marquis Atlanta)
I. Riobo1, A. Parnami1, J. Hernandez2 and G. D. Abowd1, (1)School of Interactive Computing, Georgia Institute of Technology, Atlanta, GA, (2)MIT Media Lab, Cambridge, MA
Background:  

A core competency in establishing any successful social interaction is the ability to adequately self-regulate and co-regulate our internal emotional arousal while interacting with other people. When two neurotypical (NT) individuals engage in an interaction, both can externalize and regulate their own emotional arousal and identify the internal state of their communication partner as the interaction unfolds. This co-regulation happens naturally, with both partners modulating their internal state throughout the exchange. This is often not the case when interacting with individuals on the autism spectrum, who may face challenges in communicating intentions or desires and in externalizing their internal state. Consequently, maximizing the flow of communication and helping the individual with autism regulate requires the ability to read their internal state, a skill that takes considerable time and training to acquire. This learning process is extremely costly for individuals with autism and for everyone involved in their daily lives.

Objectives:  

We are exploring the potential benefits of making the internal state of individuals with autism visible to the clinician, therapist, or parent interacting with them. In particular, we combine comfortable, ambulatory, non-obtrusive Electrodermal Activity (EDA) biosensors with Google Glass to provide live feedback on the emotional arousal level of the individual with autism as the interaction unfolds in any given context.

Methods:  

Our demonstration places an Affectiva Q™ Electrodermal Activity (EDA) sensor, sampling at 32 Hz, on either the wrist or the ankle of a child with autism; the sensor wirelessly transmits the raw EDA data to a Google Glass worn by an adult interacting with the child. The Glass applies validated physiological filters to the raw EDA data to remove noise and render it meaningful to the adult during the interaction. As the adult engages the child, he or she can use the Glass to annotate events of interest. The video, EDA data, and annotations are then transferred to a laptop, enabling clinicians, therapists, or parents to review, reflect on, and analyze these pre-annotated events of interest in greater depth.
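To make the preprocessing step concrete, the sketch below shows one common way to denoise a 32 Hz raw EDA stream with a low-pass Butterworth filter. This is a minimal Python sketch for illustration only: the cutoff frequency, filter order, and function names are our assumptions, not the specific validated filters used in the demonstration.

import numpy as np
from scipy.signal import butter, filtfilt

FS = 32.0  # sampling rate of the wrist/ankle EDA sensor, in Hz

def smooth_eda(raw_eda, cutoff_hz=1.0, order=4):
    # Illustrative low-pass filter: EDA is a slow-moving signal, so
    # frequencies well above ~1 Hz are treated as motion/sensor noise
    # here. The demonstration's actual validated filters may differ.
    nyquist = FS / 2.0
    b, a = butter(order, cutoff_hz / nyquist, btype="low")
    return filtfilt(b, a, raw_eda)  # zero-phase filtering avoids lag

# Hypothetical usage: one minute of streamed raw EDA samples
raw = np.random.rand(60 * int(FS))  # placeholder for real sensor data
clean = smooth_eda(raw)

Note that zero-phase filtering with filtfilt requires the full signal, so it fits the post-hoc review step on the laptop; for the live feed on Glass, a causal filter (e.g., scipy.signal.lfilter) would be the natural substitute.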

Results: N/A  

Conclusions:  

We hope this demo will open up two opportunities. First, this technology might be used to gain deeper insight into the autistic experience of social interaction. Second, this approach could make it easier for an adult to learn how best to interact with a child by “seeing” how the child reacts to their social overtures in a given environment.