Neural Signature of Dynamic Facial Processing in Children with ASD

Friday, May 12, 2017: 5:00 PM-6:30 PM
Golden Gate Ballroom (Marriott Marquis Hotel)
R. Ma1, C. M. Hudac2, A. Kresse3, A. Naples4, S. Faja5, J. McPartland6 and R. Bernier7, (1)Department of Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA, (2)Psychiatry and Behavioral Sciences, University of Washington, Seattle, WA, (3)Seattle Children's Research Institute, Seattle, WA, (4)Child Study Center, Yale University School of Medicine, New Haven, CT, (5)Boston Children's Hospital, Boston, MA, (6)Child Study Center, Yale School of Medicine, New Haven, CT, (7)University of Washington Autism Center, Seattle, WA
Background:

Faces are highly complex visual stimuli and fundamentally dynamic; thus, interpretation of facial movement requires frequent updating of visual representations in tandem with higher-order integration of social knowledge (Jemel et al., 2006; Webb et al., 2010; Naples et al., 2015). Impaired interpretation of facial expression is a striking social deficit commonly observed among children with autism spectrum disorder (ASD; Mottron et al., 2006; Webb et al., 2010; Webb et al., 2012). EEG mu rhythm attenuation has been associated with infants’ responses to emotional facial motion (Rayson et al., 2016), as well as with understanding and interpretation of others’ actions more generally (Muthukumaraswamy & Johnson, 2004). Prior research indicates that children with ASD demonstrate atypical mu rhythm attenuation when attending to dynamic social information (i.e., biological movement; Bernier et al., 2007; Hudac et al., 2015), although findings are inconsistent (Oberman et al., 2012; Bernier et al., 2013). However, little is known about this neural signature during the observation of dynamic facial expressions in ASD.

Objectives:

The focus of the current work is twofold: 1) to examine spectral power in the mu rhythm during the observation of dynamic facial expressions in ASD and 2) to relate this neural activity to social behavior.

Methods:

Children with ASD (n=37, mean age=11.6 years, SD=3.24) and typically developing (TD) children (n=42, mean age=11.5 years, SD=2.21) participated in an EEG study during which they were presented with photorealistic, computer-generated faces. As reported in Naples et al. (2015), the faces displayed either fearful movement or affect-free movement (puffed cheeks). Power spectra relative to resting baseline across scalp electrode clusters surrounding C3 and C4 were averaged across trials for each condition. Following Bernier et al. (2013), mu attenuation was calculated for each individual as the log of the ratio of 8-13 Hz power during observation to power in the same frequency band during resting baseline. Scores on the Benton Facial Recognition Test (BFRT; Benton, 1983) and the Reading the Mind in the Eyes Task – Revised (RMET-R; Baron-Cohen et al., 2001) served as social behavior outcomes.
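For concreteness, the mu-attenuation measure can be sketched in a few lines of Python. This is a minimal illustration of the log-ratio computation described above, not the study's analysis code: the sampling rate, array layout, and function names are assumptions, and negative values indicate attenuation (suppression) relative to rest.

```python
# Minimal sketch of the mu-attenuation computation, assuming per-condition
# and baseline EEG segments are available as NumPy arrays of shape
# (channels, samples). All parameters here are illustrative assumptions.
import numpy as np
from scipy.signal import welch

FS = 500           # sampling rate in Hz (assumed; not stated in the abstract)
MU_BAND = (8, 13)  # mu frequency range, per Bernier et al. (2013)

def band_power(segment, fs=FS, band=MU_BAND):
    """Mean power in `band`, averaged over channels and band frequencies."""
    freqs, psd = welch(segment, fs=fs, nperseg=fs)  # 1-second windows
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[:, mask].mean()

def mu_attenuation(observation, baseline):
    """Log ratio of mu-band power during observation vs. resting baseline.

    Values below zero indicate mu attenuation relative to rest.
    """
    return np.log10(band_power(observation) / band_power(baseline))
```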

Results:

Univariate ANOVA indicated that both groups displayed mu attenuation in response to dynamic facial movement relative to the resting-state baseline. However, there was no significant effect of group (p=.196) or condition (p=.654) on mu attenuation. Consistent with previous literature, one-way ANOVA indicated that children with ASD scored significantly lower than TD children (by 2.46 points) on the RMET-R [F(1,77)=11.57, p=.001]. Group differences were not observed on the BFRT (p=.09).

Conclusions:

Despite group differences on a behavioral measure of social information processing based on static images of emotionally expressive eyes, as indexed by the RMET-R, children with ASD and TD children in this study exhibited similar patterns of mu attenuation in response to facial movement. These results are consistent with previous research failing to find differences in mu attenuation across diagnostic groups, and they extend those findings to the perception of dynamic facial movement.