Beyond Fidelity: Measuring Implementation in a Multi-Faceted School-Based Intervention

Poster Presentation
Friday, May 3, 2019: 5:30 PM-7:00 PM
Room: 710 (Palais des congrès de Montréal)
J. R. Steinbrenner1, S. Odom1, L. J. Hall2 and K. Hume1, (1)Frank Porter Graham Child Development Institute, University of North Carolina at Chapel Hill, Chapel Hill, NC, (2)Special Education, San Diego State University, San Diego, CA
Background: Measuring implementation of interventions is a critical but understudied area in autism research. Many intervention studies report fidelity, but some types of interventions, such as complex service interventions (CSIs; Komro, 2018), warrant a more robust measure of implementation that captures the process and content of these multi-faceted interventions (Cordrey et al., 2013). The Center on Secondary Education for Students with ASD (CSESA) developed a comprehensive treatment model for high school students with ASD that is a CSI (Hume & Odom, under review). The CSESA model involves professional development (training and coaching) to support a process of assessment, planning, implementation, and evaluation at the school and student levels. The model addresses four domains (academics, social, independence and behavior, and transition) using ten intervention components.

Objectives: The objectives of this study are to: (1) describe a multi-featured approach to measuring implementation for CSIs, (2) examine the variability in implementation profiles within the CSESA group, and (3) examine differences in implementation between the CSESA group and services-as-usual (SAU) group.

Methods: This randomized controlled trial included 60 schools randomly assigned to CSESA or SAU, with 547 student participants and 579 school staff participants across the 60 schools. The CSESA implementation index measured implementation at three levels and assessed seven features across those levels: (1) delivery of the CSESA intervention to schools by CSESA research staff (training, coaching), (2) implementation of the CSESA intervention by school staff (intervention quality, teaming, school-level planning), and (3) reception of CSESA interventions by students (intervention dosage, student-level planning). The seven features were measured using a variety of tools (e.g., coaching logs, fidelity rating scales, planning artifacts), which together provided an implementation profile for each school (see Table 1 for further description). Feature scores were converted to scaled scores (0-3 range) based on a priori decisions about implementation quality for CSESA, and the scaled scores were then weighted and combined into a single implementation index score for each school.
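
The sketch below illustrates the general scoring logic described above: mapping raw feature scores to 0-3 scaled scores via a priori cut points and combining them into a weighted index per school. The cut points, weights, and feature names here are illustrative assumptions, not the actual CSESA values or measures.

```python
# Minimal sketch of a scaled-score and weighted-index computation.
# Cut points and weights below are hypothetical placeholders.

FEATURES = ["training", "coaching", "intervention_quality", "teaming",
            "school_planning", "dosage", "student_planning"]

# Assumed raw-score thresholds for scaled scores of 1, 2, and 3.
CUT_POINTS = {f: (0.25, 0.50, 0.75) for f in FEATURES}

# Assumed equal weights summing to 1.0.
WEIGHTS = {f: 1 / len(FEATURES) for f in FEATURES}


def scale_score(raw: float, cuts: tuple[float, float, float]) -> int:
    """Map a raw feature score to a 0-3 scaled score using a priori cut points."""
    minimal, good, ideal = cuts
    if raw >= ideal:
        return 3
    if raw >= good:
        return 2
    if raw >= minimal:
        return 1
    return 0


def implementation_index(raw_scores: dict[str, float]) -> float:
    """Weighted combination of scaled feature scores for one school (0-3 range)."""
    return sum(WEIGHTS[f] * scale_score(raw_scores[f], CUT_POINTS[f])
               for f in FEATURES)


# Example: one school's raw feature scores (proportions, for illustration only).
school = {"training": 0.9, "coaching": 0.8, "intervention_quality": 0.6,
          "teaming": 0.7, "school_planning": 0.55, "dosage": 0.4,
          "student_planning": 0.85}
print(round(implementation_index(school), 2))
```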

Results: The implementation profile scores for the CSESA schools showed variability both within and across features for the raw and scaled scores (see Table 2 for descriptive statistics). Six of the seven features had scaled scores spanning either 0 to 3 or 1 to 3, and the raw scores showed similarly wide ranges. Nevertheless, most schools scored in the good (2) or ideal (3) implementation range on most features. For the single implementation index scores, the mean for the CSESA schools (2.07) was significantly higher than for the SAU schools (0.47), t = 28.13, p < .001.
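
For readers reproducing this type of group comparison, the sketch below shows an independent-samples t-test on school-level index scores. The score vectors and the 30/30 group split are simulated placeholders, not the study data.

```python
# Illustrative t-test on simulated implementation index scores (not study data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
csesa_scores = rng.normal(loc=2.07, scale=0.25, size=30)  # assumed 30 CSESA schools
sau_scores = rng.normal(loc=0.47, scale=0.15, size=30)    # assumed 30 SAU schools

t_stat, p_value = stats.ttest_ind(csesa_scores, sau_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")
```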

Conclusions: The multi-featured CSESA implementation index was successfully used in the context of a school-based RCT to capture the range of implementation within the intervention group as well as differences between the intervention and control groups. The development and use of implementation profiles and an index that capture information about both the process and content of a CSI are critical to supporting intervention research, implementation science, and the dissemination of comprehensive treatment models and other similarly complex interventions for individuals with ASD.