Bayesian Meta-Analysis of Multiple Interventions and Outcomes for the Treatment of Autism Spectrum Disorder

Thursday, May 14, 2015: 5:30 PM-7:00 PM
Imperial Ballroom (Grand America Hotel)
C. Fonnesbeck1, A. S. Weitlauf2, N. A. Sathe3, M. McPheeters4 and Z. Warren5, (1)Biostatistics, Vanderbilt University, Nashville, TN, (2)Vanderbilt Kennedy Center, Nashville, TN, (3)Vanderbilt Evidence-based Practice Center, Institute for Medicine and Public Health, Vanderbilt University, Nashville, TN, (4)Vanderbilt University, Nashville, TN, (5)Vanderbilt University, Nashville, TN
Chris Fonnesbeck, Zachary Warren, Amy Weitlauf, Nila Sathe, Melissa McPheeters

Background: Applied early intervention for young children with ASD often involves multi-component interventions (e.g., speech-language intervention, developmental preschools, occupational therapies, and involvement with ABA programs). Providers and parents must often decide which components to include in an intervention, based both on resource availability and on the scientific understanding of studied interventions. Unfortunately, few studies have directly compared the effects of well-controlled treatment approaches, instead evaluating interventions in isolation or against non-specific “treatments as usual.” Such comparisons provide insufficient resolution as to which components of intervention primarily drive developmental progress, and at what rate.

Objectives: In the current work, we present a novel meta-analytic approach to understanding ASD interventions, with the potential to discriminate among components of treatment response across the range of intervention classes. Specifically, we present a network (or mixed treatment comparison) meta-analysis of multi-component ASD interventions. Network meta-analysis generalizes standard meta-analysis to allow simultaneous evaluation of a set of treatments, rather than a single pairwise comparison. By constructing a network of studies, this approach considers both direct evidence from studies that compare the same interventions and indirect evidence from studies that share one intervention, but not all. Including indirect evidence can improve the precision of meta-estimates relative to using direct evidence alone by borrowing strength from the indirect comparisons, and can also serve to mitigate biases that may exist in direct comparisons.
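The idea of borrowing strength from indirect evidence can be illustrated with the classic Bucher adjusted indirect comparison, a simple precursor to full network meta-analysis. The sketch below uses entirely hypothetical effect sizes and standard errors (it is not the authors' model): two interventions A and C are never compared head-to-head, but both are compared to a common comparator B, yielding an indirect A-vs-C estimate that can then be pooled with a direct estimate by inverse-variance weighting.

```python
import numpy as np

# Hypothetical direct estimates (IQ-point differences) and standard errors:
# trials compare A vs B and C vs B, but few (or no) trials compare A vs C.
d_AB, se_AB = 4.0, 1.5   # A vs B (assumed values, for illustration only)
d_CB, se_CB = -2.0, 2.0  # C vs B

# Indirect estimate of A vs C via the common comparator B
d_AC_indirect = d_AB - d_CB
se_AC_indirect = np.sqrt(se_AB**2 + se_CB**2)

# If a (noisy) direct A-vs-C estimate also exists, pool direct and indirect
# evidence by inverse-variance weighting
d_AC_direct, se_AC_direct = 1.0, 3.0
w_dir, w_ind = 1 / se_AC_direct**2, 1 / se_AC_indirect**2
d_AC_pooled = (w_dir * d_AC_direct + w_ind * d_AC_indirect) / (w_dir + w_ind)
se_AC_pooled = np.sqrt(1 / (w_dir + w_ind))

print(d_AC_indirect, se_AC_indirect)   # → 6.0 2.5
print(round(se_AC_pooled, 3))          # → 1.921, smaller than either source
```

Note that the pooled standard error (about 1.92) is smaller than that of either the direct (3.0) or indirect (2.5) estimate alone, which is the precision gain referred to above.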

Methods: We implemented a Bayesian mixed-effects model that accounts for multi-component interventions and the reporting of multiple outcomes within a unified framework. Critically, this allows individual studies to contribute partial information, and allows missing quantities to be readily imputed or predicted.
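The multi-component structure can be sketched as a Bayesian linear model in which each study arm's expected outcome is the sum of the effects of the treatment components it includes, encoded in a binary design matrix. The toy example below uses simulated data and a conjugate Gaussian posterior with known observation variance; the component labels, values, and priors are illustrative assumptions, and the authors' actual model additionally includes study-level random effects and multiple correlated outcomes.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical component design: rows = study arms, columns = treatment
# components (e.g. ABA, speech-language, preschool). A 1 means the arm's
# intervention includes that component.
X = np.array([
    [1, 0, 0],
    [1, 1, 0],
    [0, 1, 1],
    [0, 0, 1],
    [1, 0, 1],
], dtype=float)
beta_true = np.array([5.0, 1.0, -2.0])  # true additive component effects (IQ points)
sigma = 2.0                             # known outcome standard deviation
y = X @ beta_true + rng.normal(0.0, sigma, size=X.shape[0])

# Conjugate Bayesian linear regression with a beta ~ N(0, tau^2 I) prior:
# posterior covariance and mean have closed forms.
tau = 10.0
prior_prec = np.eye(X.shape[1]) / tau**2
post_cov = np.linalg.inv(prior_prec + X.T @ X / sigma**2)
post_mean = post_cov @ (X.T @ y / sigma**2)

print(np.round(post_mean, 2))  # posterior means of the component effects
```

Because each arm contributes a row regardless of which components or outcomes it reports, a study observing only some quantities still informs the shared component effects, which is what allows partial information to be used and missing quantities to be predicted.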

Results: The meta-analysis included 19 independent studies (extracted from the 2014 AHRQ report on behavioral interventions for ASD) comprising 27 different interventions (classified into 7 multi-component treatment categories) that reported one or more of three IQ scores (verbal, nonverbal, composite) as outcomes. We estimated the most effective intervention to be high-intensity applied behavior analysis (ABA), with a posterior median effect size of 5.0 IQ points (95% Bayesian credible interval [1.0, 8.8]). In contrast, eclectic school programs had a posterior mean effect of -6.0 (95% BCI [-10.5, -1.6]), while the effects of the remaining classes were equivocal.

Conclusions: Our approach is general and flexible enough to provide comparative-effectiveness inference for a range of disorders for which a heterogeneous mixture of interventions warrants comparison. This suggests that Bayesian mixed-effects modeling may represent an important methodology for making sense of the large and often confusing body of ASD intervention research.