Design of a Multi-Sensory Stimulation and Data Capture System for Investigating Multi-Sensory Trajectories in Infancy

Poster Presentation
Friday, May 11, 2018: 10:00 AM-1:30 PM
Hall Grote Zaal (de Doelen ICC Rotterdam)
D. Bian1, Z. Zheng2, A. Swanson3, A. S. Weitlauf4, Z. Warren5 and N. Sarkar2, (1) Electrical Engineering & Computer Science, Vanderbilt University, Nashville, TN, (2) Vanderbilt University, Nashville, TN, (3) Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN, (4) Vanderbilt Kennedy Center, Vanderbilt University Medical Center, Nashville, TN, (5) Vanderbilt University Medical Center, Nashville, TN
Background: Differences in behavioral responses to basic sensory stimuli are hallmark features of ASD and may contribute, over time, to the deficits in social responsivity that characterize the disorder. While the neural basis of complex social and communicative behaviors develops throughout childhood, brain responses to more basic sensory stimuli are in place much earlier, making sensory processing differences a potential target for very early detection. Characterization of these sensory differences, and prospective study of their developmental impact, has been limited by a scarcity of methods for assessing them in infants. Most current studies examine a single sensory modality, in contrast with the rich multisensory environment that human infants inhabit; tools and paradigms for charting this more meaningful, complex processing are sorely lacking.

Objectives: To investigate the feasibility and tolerability of a multi-sensory stimulation paradigm that includes the sense of touch.

Methods: We developed a precisely controlled system for multi-sensory (auditory, visual, and tactile) stimulus delivery with the capacity to simultaneously monitor eye gaze, peripheral physiological signals, and electroencephalogram (EEG) data. We also designed a mechatronic device that simulates affective touch and can be synchronized with the other sensory modalities. All data streams are captured with user-defined event markers for later segmentation. We conducted a feasibility study with 10 participants aged 3 to 20 months. All participants watched a series of videos in which an actress was seen and heard reciting a prepared English or Spanish monologue. To determine how much time participants spent gazing at core areas of the face, we defined three principal regions of interest (ROIs) on the talker's face, around the eyes, the nose, and the mouth. Each video was played twice, in random order; during one of the two identical presentations, the affective touch device stroked the participant's forearm.
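The abstract does not describe how the user-defined event markers are implemented; the minimal Python sketch below illustrates one plausible scheme, in which markers are timestamped against a shared monotonic clock and stored for offline segmentation of the gaze, physiological, and EEG streams. The class name, method, and marker labels are illustrative assumptions, not the authors' actual software.

```python
import time
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class EventMarkerLog:
    """Collects user-defined event markers on a shared monotonic clock so
    that gaze, physiological, and EEG streams can be segmented offline."""
    t0: float = field(default_factory=time.monotonic)
    markers: List[Tuple[float, str]] = field(default_factory=list)

    def mark(self, label: str) -> float:
        """Record a labeled marker at the current elapsed session time (s)."""
        t = time.monotonic() - self.t0
        self.markers.append((t, label))
        return t

# Hypothetical session: two identical video presentations, one with touch.
log = EventMarkerLog()
log.mark("video_A_onset")        # audio-visual stimulus begins
log.mark("touch_stroke_onset")   # tactile device begins stroking the forearm
log.mark("touch_stroke_offset")
log.mark("video_A_offset")
```

Because every marker shares one clock origin, any recorded stream that is stamped against the same clock can later be cut into per-stimulus segments by label.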

Results: Seven of the 10 participants completed all experimental procedures. Eye gaze data, peripheral physiological data (photoplethysmogram, PPG, and galvanic skin response, GSR), and EEG data were all properly recorded with event markers. Three participants pulled their arms out of the tactile device partway through the experiment, likely because the current design has difficulty holding small arms in place; data collection continued for these participants despite the absence of the tactile stimulus. Participants looked at the stimulus screen 27% of the time and directed their gaze toward the demarcated ROIs for 57% of that on-screen time. No notable differences were observed between the two tactile conditions (with and without affective touch).
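For illustration, the on-screen and ROI percentages reported above could be computed from fixed-rate gaze samples roughly as follows. The rectangular ROI representation and the function names (in_rect, gaze_percentages) are assumptions for this sketch; the abstract does not describe the actual analysis code.

```python
from typing import List, Tuple

Rect = Tuple[float, float, float, float]  # (x_min, y_min, x_max, y_max) in pixels

def in_rect(x: float, y: float, r: Rect) -> bool:
    """True if the gaze point (x, y) falls inside rectangle r."""
    return r[0] <= x <= r[2] and r[1] <= y <= r[3]

def gaze_percentages(samples: List[Tuple[float, float]],
                     screen: Rect,
                     rois: List[Rect]) -> Tuple[float, float]:
    """Return (% of samples on screen, % of on-screen samples in any ROI).

    Assumes a fixed eye-tracker sampling rate, so sample counts are
    proportional to looking time; invalid samples (blinks, track loss)
    should be filtered out before calling this.
    """
    if not samples:
        return 0.0, 0.0
    on_screen = [(x, y) for x, y in samples if in_rect(x, y, screen)]
    if not on_screen:
        return 0.0, 0.0
    in_roi = sum(1 for x, y in on_screen if any(in_rect(x, y, r) for r in rois))
    return (100.0 * len(on_screen) / len(samples),
            100.0 * in_roi / len(on_screen))
```

Under this reading, the reported figures correspond to 27% of valid samples falling inside the screen rectangle and 57% of those on-screen samples falling inside the eyes, nose, or mouth ROIs.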

Conclusions: This work describes a novel multi-sensory stimulus delivery system that records multi-dimensional data in real time. The feasibility study demonstrated that the system is tolerated by infants aged 3 to 20 months, with eye gaze, peripheral physiological, and EEG data all successfully recorded. Such a system could be used to study multisensory processing differences over time in both high-risk (e.g., infant siblings of children with ASD) and low-risk infants.