Jemime, a Serious Game to Teach Emotional Facial Expressiveness for Individuals with Autism Spectrum Disorders.

Friday, May 12, 2017: 10:00 AM-1:40 PM
Golden Gate Ballroom (Marriott Marquis Hotel)
S. Hun-Billiaut1,2, S. Serret1,2,3, J. Bourgeois1, P. Foulon4, D. Cohen5,6, C. Grossard5,6, O. Grynszpan6, F. Askenazy1,3, A. Dapogny6, S. Dubuisson6, L. Chen7 and K. Bailly6, (1)Cognition-Behaviour-Technology (CoBTeK), EA 7276, University of Nice Sophia Antipolis, Nice, France, (2)Autism Resource Center, Lenval Foundation, Nice, France, (3)University Department of Child and Adolescent Psychiatry, Children’s Hospitals of Nice CHU-Lenval, Nice, France, (4)Genious group, Colombes, France, (5)Department of Child and Adolescent Psychiatry, AP-HP Groupe Hospitalier Pitié-Salpêtrière, PARIS, France, (6)CNRS UMR 7222, Institute of Intelligent Systems and Robotics, University Pierre et Marie Curie, PARIS, France, (7)Liris laboratory UMR CNRS 5205, Ecole Centrale of Lyon, Ecully, France
Background: Poor social cognition is a core problem in Autism Spectrum Disorders (ASD). Individuals with ASD have difficulties in emotional processing (recognition and analysis of social situations) as well as in emotional production, with reduced emotional facial expressiveness. Some remediation programs using the support of new technologies have been developed to train people with ASD to produce emotions. However, none of them trains emotional production within social situations using imitative models such as avatars.

Objectives: The objective of the JEMImE project is to develop a serious game in which people with ASD are first trained to imitate emotional expressions displayed by an avatar and then taught to produce emotional facial expressions in social scenes.

Methods: Players with ASD are asked to produce emotional expressions (facial and vocal) of joy, anger or sadness. Two technical innovations have been introduced. First, the software performs a real-time analysis of the players' emotional productions. Second, an innovative algorithm implemented in the software performs an automatic evaluation, comparing the players' productions with those of typically developing children (aged 6-12 years) contained in a built-in emotional expression database.

Results: The game is composed of two phases, both using these technical innovations (real-time automatic detection and automatic evaluation). The first phase aims to teach players how to produce joy, anger or sadness through games based on 1) imitation and 2) production on request, with or without social situations (Fig 1). The software detects each emotion produced in real time and gives the player feedback on the quality of the emotion. The second phase, set in a 3D virtual environment, aims to train spontaneous emotional production depending on the social context: progression through the social scenes presented to the player depends on the quality of the player's emotional productions, and the scene changes if a production is not relevant to the current context. In this phase, the software detects each produced emotion and gives feedback on the accuracy of the emotional expression according to the context.

Conclusions: In summary, the JEMImE software offers personalized training in the production of emotional expressions for patients with ASD. Future research should investigate the efficacy of the algorithm for training emotion production in players with ASD. The JEMImE game is expected to benefit individuals with ASD, their families and caregivers by offering them an easily accessible and engaging tool to train emotional expressiveness.