Social Personalized Human-Machine Interaction for People with Autism: A Close Look at Proprioceptive and Visual Orientation Integration

Friday, May 15, 2015: 10:00 AM-1:30 PM
Imperial Ballroom (Grand America Hotel)
P. Chevalier1, A. Tapus2, J. C. Martin3, C. Bazile4 and B. Isableu5, (1)ENSTA ParisTech, Palaiseau, France, (2)Computer Science and System Engineering, ENSTA ParisTech, Palaiseau, France, (3)LIMSI, CNRS/Université Paris-Sud, Orsay, France, (4)FAM-La Lendemaine, Les Molières, France, (5)Université Paris-Sud, Orsay, France
Background: Research in socially assistive robotics (SAR) is expanding and makes it possible to offer more personalized social learning environments for individuals with Autism Spectrum Disorder (ASD) (Feil-Seifer, 2005; Tapus, 2012), addressing their impaired communication and social interaction skills while building on their affinity for robots (Hart, 2005). Individuals with ASD also present visual and sensorimotor impairments (Haswell, 2009; Greffou et al., 2012).

Objectives: We work in collaboration with two care facilities (MAIA Autisme; FAM-La Lendemaine) with the ultimate goal of proposing a personalized human-robot environment for social learning. We hypothesize that an individual's reliance on proprioceptive and kinematic visual cues affects the way s/he interacts with a robot. In individuals with ASD, hyporeactivity to visual motion of the scene and overreliance on proprioceptive information have been linked to difficulties in integrating social cues and engaging in successful interactions. The present research defines each participant's perceptivo-cognitive and sensorimotor profile with respect to the integration of visual inputs.

Methods: Our subject pool is composed of 7 autistic adults (26.1 ± 7.9 years), 6 autistic children (10.9 ± 1.8 years), and 7 typically developed adults (21.8 ± 10.3 years). We evaluated the participants' profiles with two methodologies. First, the Sensory Profile and the Adolescent/Adult Sensory Profile (AASP) developed by Dunn (1999, 2002) were completed according to each participant's age. To obtain comparable scores across age groups for Movement, Visual, Touch, and Auditory processing, we matched and adapted the items, paying particular attention to their assignment to neurological threshold and behavioral response/self-regulation categories. Second, we designed an experimental setup to assess the effect of a moving virtual visual scene (VVS) on postural control and the individual's ability to use the proprioceptive inputs provided by the dynamics of balance to reduce visual dependency (Isableu et al., 2011). Participants were asked to stand quietly in postural conditions of increasing balance difficulty (normal vs. tandem Romberg stance) in front of a virtual scene rolling at 0.25 Hz with an inclination of ±10°.
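To make the stimulus and the "postural coupling with the VVS" measure concrete, here is a minimal sketch, not the authors' code: the scene roll is a 0.25 Hz sinusoid of ±10° (values from the abstract), and visual coupling is estimated as the gain of postural sway at the stimulus frequency. The sampling rate, trial length, and the FFT-based gain measure are assumptions for illustration only.

```python
import numpy as np

FS = 100.0       # assumed sampling rate (Hz)
DURATION = 60.0  # assumed trial length (s)
F_STIM = 0.25    # scene roll frequency (Hz), from the abstract
AMP_DEG = 10.0   # scene roll amplitude (deg), from the abstract

t = np.arange(0.0, DURATION, 1.0 / FS)
scene_roll = AMP_DEG * np.sin(2.0 * np.pi * F_STIM * t)  # VVS roll angle (deg)

def coupling_gain(sway: np.ndarray, stim: np.ndarray, fs: float, f0: float) -> float:
    """Ratio of sway amplitude to stimulus amplitude at the stimulus frequency f0."""
    freqs = np.fft.rfftfreq(len(stim), d=1.0 / fs)
    k = np.argmin(np.abs(freqs - f0))            # FFT bin closest to f0
    sway_amp = np.abs(np.fft.rfft(sway))[k]
    stim_amp = np.abs(np.fft.rfft(stim))[k]
    return float(sway_amp / stim_amp)

# Toy example: synthetic sway of a participant only weakly driven by the VVS
rng = np.random.default_rng(0)
sway = 0.2 * scene_roll + rng.normal(0.0, 0.5, t.size)  # deg, synthetic data
print(f"visual coupling gain ~ {coupling_gain(sway, scene_roll, FS, F_STIM):.2f}")
```

A gain near 0 would correspond to visual independence from the scene motion, while a gain near 1 would indicate that sway closely follows the VVS.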

Results: We observed a correspondence between AASP patterns and postural behaviors. Higher movement sensitivity on the AASP was associated with greater postural stability and with postural sway that was less driven by the VVS. Visual sensation-seeking behavior was also associated with smaller postural responses to the VVS, whereas higher visual sensitivity was associated with stronger postural coupling with the VVS. A relation between age and postural instability was found, but not between age and postural response to the VVS. Clustering analyses allowed us to identify 3 groups with significantly different behavioral responses: (G1) strong visual independence from the VVS, suggesting an overreliance on proprioceptive input; (G2) moderate reactivity to the VVS, suggesting reliance on both visual and proprioceptive input; and (G3) hyperreactivity to the VVS, suggesting weak proprioceptive integration and strong visual dependency.
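The abstract does not specify the clustering algorithm; the following sketch shows one plausible way such 3-group profiles could be obtained, assuming k-means over standardized per-participant features (an AASP score and the VVS coupling gain). Feature names and values are placeholders, not the study's data.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Each row: [AASP movement-sensitivity score, coupling gain with the VVS] -- toy data
features = np.array([
    [45, 0.05], [48, 0.08], [44, 0.06],   # G1-like: visually independent
    [35, 0.40], [33, 0.45], [36, 0.38],   # G2-like: moderate reactivity
    [25, 0.90], [27, 0.95],               # G3-like: strong visual dependence
])

X = StandardScaler().fit_transform(features)  # put features on a common scale
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print("cluster label per participant:", labels)
```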

Conclusions: Our work allowed us to characterize 3 groups among our participants. These results will help us model customized human-robot interaction sessions and adapt the robot's behaviors as a function of each participant's profile (dependence on visual vs. proprioceptive input).