20416
Examining Treatment Implementation in Secondary Education Settings

Friday, May 15, 2015: 5:30 PM-7:00 PM
Imperial Ballroom (Grand America Hotel)
S. L. Odom1, K. Hume2, J. R. Dykstra Steinbrenner3, E. Carter4, L. E. Smith5, C. K. Reutebuch6, D. Test7, D. Browder7,8, S. Vaughn9 and S. J. Rogers10, (1)University of North Carolina, Chapel Hill, NC, (2)University of North Carolina at Chapel Hill, Chapel Hill, NC, (3)Frank Porter Graham Child Development Institute, University of North Carolina at Chapel Hill, Carrboro, NC, (4)Special Education, Vanderbilt University, Nashville, TN, (5)University of Wisconsin-Madison, Madison, WI, (6)The Meadows Center for Preventing Educational Risk, The University of Texas at Austin, Austin, TX, (7)University of North Carolina at Charlotte, Charlotte, NC, (8)Department of Special Education, University of North Carolina at Charlotte, Charlotte, NC, (9)The Meadows Center, University of Texas at Austin, Austin, TX, (10)University of California at Davis, Sacramento, CA
Background: Program implementation is a key feature of experimental research. In the early stages of any research project, it is necessary to develop and evaluate measures of implementation fidelity. The Center on Secondary Education for Students with ASD (CSESA) has developed a complex treatment model consisting of four features (Academic, Social, Independence, and Transition). In the current study, researchers examined the construct validity of the implementation fidelity measures for each of the CSESA components, as well as students' progress on goals related to each component.

Objectives: (Research Question) Do CSESA fidelity measures discriminate between sites implementing and not implementing program features?  

Methods: As part of model development, investigators created implementation fidelity measures. Using a variation of the classic multitrait-multimethod matrix process that Campbell and Fiske (1959) designed for establishing the construct validity of assessments, the authors created an evaluation design to assess the construct validity of the fidelity measures. In this design, staff in each high school implemented two features of the CSESA program (see Table 1) and served as controls for the other two features. In most cases a feature had more than one measure (e.g., the social feature had peer support, peer network, and Social Competence Intervention components). Research staff collected implementation fidelity data on all features/components in all settings. The rationale for the design is that a measure should yield high fidelity ratings in schools implementing the corresponding feature and low fidelity ratings in schools not implementing it. All combinations of the four components were implemented across the six schools (e.g., transition and academics, social competence and independence, academics and independence), with the exception of one school that was scheduled to implement the academic feature but did not. The fidelity measures were comparable in format, consisting of four-point Likert rating scales (0-3) that documented the degree to which individual practices were implemented with fidelity. Six high schools across the country (CA, two in NC, TN, TX, WI) were enrolled in the study, involving 6-8 students with ASD at each school (N=43). Research staff monitored fidelity of all components through observations of students during instruction; the observed students were randomly selected from the pool of study students receiving a given intervention.
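
Note: The abstract does not report analysis code; the following Python sketch is purely illustrative, with hypothetical school names and ratings that are not CSESA data. It only shows how the crossed design described above could be summarized: for each feature, mean 0-3 fidelity ratings from schools implementing that feature are compared with ratings from schools serving as controls for it.

# Illustrative sketch only; school names and ratings are hypothetical.
from statistics import mean

FEATURES = ["Academic", "Social", "Independence", "Transition"]

# Hypothetical assignment: each fictional school implements two features
# and serves as a control for the other two.
implements = {
    "School A": {"Academic", "Social"},
    "School B": {"Independence", "Transition"},
}

# Hypothetical mean fidelity ratings (0-3 Likert scale) per school and feature,
# averaged over observed practices; values are made up for illustration.
ratings = {
    ("School A", "Academic"): 2.4, ("School A", "Social"): 2.1,
    ("School A", "Independence"): 0.6, ("School A", "Transition"): 0.4,
    ("School B", "Academic"): 0.5, ("School B", "Social"): 0.7,
    ("School B", "Independence"): 2.3, ("School B", "Transition"): 2.6,
}

def summarize(feature):
    """Mean fidelity for schools implementing vs. not implementing a feature."""
    impl = [r for (school, f), r in ratings.items()
            if f == feature and feature in implements[school]]
    ctrl = [r for (school, f), r in ratings.items()
            if f == feature and feature not in implements[school]]
    return (mean(impl) if impl else None, mean(ctrl) if ctrl else None)

for feature in FEATURES:
    impl_mean, ctrl_mean = summarize(feature)
    print(f"{feature}: implementing={impl_mean}, control={ctrl_mean}")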

Results: The small number of schools precludes statistical analysis; descriptive comparisons of mean fidelity ratings appear in Table 2. As hypothesized, the fidelity measures were sensitive to implementation occurring in intervention schools and appeared to discriminate them from schools serving as controls for a given feature.

Conclusions: This poster presentation provides a case example of a program development process that focuses on the measurement of implementation fidelity. Implications for iterative program development, measurement of fidelity, and the use of implementation science will be discussed at the poster.