FIDELITY TO THE ACT SMART TOOLKIT: AN ASSESSMENT OF IMPLEMENTATION STRATEGY FIDELITY

By

Jessica Tschida

A THESIS

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Psychology – Master of Arts

2022

ABSTRACT

FIDELITY TO THE ACT SMART TOOLKIT: AN ASSESSMENT OF IMPLEMENTATION STRATEGY FIDELITY

By

Jessica Tschida

Although evidence-based practices (EBPs) have been shown to improve a variety of outcomes for autistic children, they are often inconsistently implemented, or not implemented at all, in the community settings where many autistic children primarily receive care. One multi-faceted implementation strategy that researchers have developed and tested in a pilot study to support the implementation of EBPs for ASD in community settings is the Autism Community Toolkit: Systems to Measure and Adopt Research-Based Treatments (ACT SMART Toolkit). Here, we used a case study approach to assess fidelity to the toolkit during its pilot study (implementation strategy fidelity) using measures of adherence, dose, and participant responsiveness, and we examined the relationship between implementation strategy fidelity and EBP use in an exploratory analysis. Overall, we found that adherence, dose, and participant responsiveness to the ACT SMART Toolkit were high, with some variability by toolkit phase and activity. However, our exploratory analysis was ultimately unequipped to evaluate the relationship between increased fidelity and increased EBP use given the limited sample size of the pilot study. Our case study evaluation provides one of the first models of considering fidelity in the context of multi-faceted implementation strategies, as well as important insights into potential core and peripheral components of the ACT SMART Toolkit.

ACKNOWLEDGMENTS

I would like to profoundly thank those who have made this work possible. Foremost, I would like to express my deepest gratitude to Dr. Amy Drahota for her invaluable mentorship. I would also like to thank my committee members, Drs. Brooke Ingersoll and Ignacio D. Acevedo-Polakovich, for their expertise and insightful feedback. Additionally, I would like to give my sincere thanks to my fellow lab member Aksheya Sridhar for her guidance and support over the past two years. Lastly, I am incredibly grateful for the support of my parents, Brenda and Steve, my brother, Delton, my partner, Jack, and my dog, Seymour.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
INTRODUCTION
    Background and Significance
    ACT SMART Implementation Toolkit
    Implementation Strategy Fidelity
    Present Study
METHOD
    Participants
    Materials & Procedure
    Analysis Plan
RESULTS
    Overall Fidelity to the ACT SMART Toolkit
    By Agency Fidelity to the ACT SMART Toolkit
    Differences in Fidelity to the ACT SMART Toolkit by Toolkit Phase
    Fidelity-EBP Use Relationships
DISCUSSION
    Fidelity to the ACT SMART Toolkit
    Implementation Strategy Fidelity Theory
    Strengths
    Limitations
    Conclusion & Future Directions
APPENDICES
    Appendix A. ACT SMART Implementation Milestones Form
    Appendix B. ACT SMART Activity Fidelity Form
    Appendix C. ACT SMART Implementation Team Engagement Rating Scale
    Appendix D. ASD Strategies and Interventions Survey
    Appendix E. Exploratory Analysis of Implementation Strategy Fidelity and EBP Use
REFERENCES

LIST OF TABLES
Table 1. Demographic and discipline information across implementation teams
Table 2. Demographic and discipline information across direct providers
Table 3. Adherence, dose, and participant responsiveness to the ACT SMART Toolkit calculated overall, by phase, and by activity across ASD community agency implementation teams
Table 4. Means, standard deviations, and Spearman correlations with 95% confidence intervals for implementation strategy fidelity and proportion of direct providers using video modeling
Table 5. Adherence to the ACT SMART Toolkit calculated overall, by phase, and by activity for each ASD community agency implementation team
Table 6. Dose to the ACT SMART Toolkit calculated overall, by phase, and by activity for each ASD community agency implementation team
Table 7. Participant responsiveness to the ACT SMART Toolkit calculated overall and by phase for each ASD community agency implementation team
Table 8. Beta regression results predicting proportion of direct providers using video modeling post-toolkit
Table 9. Firth-type multilevel logistic GEE results predicting odds of direct providers using video modeling post-toolkit

LIST OF FIGURES

Figure 1. Phases and steps of the ACT SMART Toolkit

INTRODUCTION

Background and Significance. Autism spectrum disorder (ASD) affects approximately 1 in 44 children in the United States and has been identified as a public health concern estimated to cost $461 billion a year for services and treatment by 2030 (Blaxill et al., 2021; Leigh & Du, 2015; Maenner et al., 2021). ASD is characterized by core social and communication difficulties as well as restricted and repetitive behaviors and interests (RRBIs) and commonly co-occurs with other disorders such as anxiety, obsessive compulsive disorder (OCD), attention deficit hyperactivity disorder (ADHD), and oppositional defiant disorder (ODD) (American Psychiatric Association, 2013; Lai et al., 2014; Simonoff et al., 2008). In addition, children on the autism spectrum have higher rates of behaviors such as self-injury, aggression, tantrums, and property destruction compared to neurotypical peers (Hattier et al., 2011; Horner et al., 2002; Stevens et al., 2017). Moreover, both the core features and the associated diagnoses and behaviors of ASD have been found to predict unsatisfactory outcomes in quality-of-life factors such as peer relationships, educational attainment, employment, and independent living as an adult (Kim & Bottema-Beutel, 2019; Lai et al., 2014; Mason et al., 2019). These associations between autistic¹ characteristics and unsatisfactory quality-of-life outcomes are also maintained by systemic barriers to inclusion of autistic individuals, such as societal stigma and lack of appropriate accommodations in education, employment, and housing opportunities (Bottema-Beutel et al., 2020; Pitney, 2020; Robertson, 2009).

¹ I use "identity-first" language in some instances due to recent studies showing that identity-first language is preferred by some autistic individuals (Bury et al., 2020; Kenny et al., 2016) and a recent review highlighting potentially ableist language in autism research (Bottema-Beutel et al., 2020).
The lack of systemic accommodations for autistic individuals also exacerbates public health costs and primarily burdens autistic individuals and their families (Bottema-Beutel et al., 2020; Pitney, 2020). The prevalence rate for ASD continues to grow dramatically as practices for diagnosis improve (King & Bearman, 2009; Maenner et al., 2020). However, despite the potential of early identification and intervention to improve outcomes for autistic youth and reduce individual and societal costs (Eapen et al., 2013; Horlin et al., 2014; Vinen et al., 2018), barriers to community-level identification and intervention remain (Elder et al., 2016; Maenner et al., 2020). Although evidence-based practices (EBPs) have been shown to improve a variety of outcomes for autistic children, they are often inconsistently implemented or not implemented in community settings where many autistic children receive services (Drahota et al., 2020; Paynter et al., 2016; Pickard et al., 2017; Wong et al., 2015; Wood et al., 2015). As a result, a considerable number of children on the autism spectrum are not receiving, as part of their usual care, the practices empirically demonstrated to improve outcomes. Thus, there is a need to identify, develop, and evaluate strategies to support the implementation of EBPs for ASD within community settings.

ACT SMART Implementation Toolkit. Drahota and colleagues (2014, 2017) developed one multi-faceted implementation strategy to support the implementation of EBPs for ASD in community settings: the Autism Community Toolkit: Systems to Measure and Adopt Research-Based Treatments (ACT SMART Toolkit). These researchers developed the ACT SMART Toolkit based on a review of existing evidence and by incorporating insight from a community-academic partnership. The ACT SMART Toolkit involves facilitation meetings led by trained ACT SMART facilitators and a web-based interface to guide the ASD community agency leaders, supervisors, and providers that comprise agency implementation teams through phases of implementing an EBP (Drahota et al., 2012, 2014, 2017). Drahota and colleagues (2017, 2020) designed the ACT SMART Toolkit to have steps and activities that align with an adapted implementation model – the Exploration, Adoption, Preparation, Implementation, Sustainment (EAPIS) model (Aarons et al., 2011; Drahota et al., 2020). Overall, implementation teams from ASD community agencies use the toolkit to explore their agency's receptivity to implementing a new EBP, identify and decide upon an EBP that meets their needs, prepare prospectively to implement the EBP, implement the EBP, and finally evaluate implementation and develop a plan for sustainment (See Figure 1; Drahota et al., 2017).
Figure 1. Phases and steps of the ACT SMART Toolkit

Pre-Implementation (Recruitment): Agency first contacted; Agency interest indicated; Agency recruitment meeting; Orientation workshop
Phase 1 (Agency Exploration): Meeting at agency to recruit for agency assessment; Emails sent to agency staff for agency assessment; ACT SMART agency assessment
Phase 2 (Treatment Selection and Adoption Decision): Treatment selection; Evaluate treatment fit; Evaluate treatment feasibility; Evaluate clinical value and research validity; Evaluate training requirements; Evaluate funding source; Evaluate benefit-cost estimator; Make an adoption decision
Phase 3 (Planning for Implementation): Gather treatment materials; Evaluate prospective treatment adaptations; Develop an adaptation plan; Develop a training plan; Develop an implementation and sustainment plan
Phase 4 (Implementation): Carry out adaptation plan; Carry out training plan; Carry out implementation and sustainment plan

Importantly, the ACT SMART Toolkit has been tested in a pilot study with six ASD community agencies. Evaluating the ACT SMART Toolkit's use and associated outcomes from the pilot study can provide insight into the effectiveness of its current design and inform needed adaptations. The results of preliminary work by Drahota and colleagues (in preparation) suggest that the toolkit is feasible, acceptable, and useful to agency implementation teams. In addition, Sridhar and Drahota (2020; in review) have reported that the ACT SMART Toolkit facilitates clinically meaningful changes in agency provider- and supervisor-reported EBP use. Moreover, Sridhar and colleagues (2021) have identified salient facilitators (i.e., facilitation teams, facilitation meetings, and phase-specific activities) and salient barriers (i.e., website issues, perceived lack of resources, and contextual factors within ASD community agencies such as time constraints and funding) to the utilization of the ACT SMART Toolkit in the pilot study. Therefore, the next incremental, yet crucial, step in evaluating initial use of the ACT SMART Toolkit is to assess implementation team fidelity to the toolkit: implementation strategy fidelity.

Implementation Strategy Fidelity. Fidelity is a construct that assesses the extent to which individuals (e.g., providers) deliver a strategy as planned (Allen et al., 2018; Mowbray et al., 2003; Slaughter et al., 2015). Researchers have proposed that the components contributing to fidelity include: (1) adherence to the outlined procedures, (2) proportion of the strategy received (i.e., dose), (3) extent of individual responsivity to the strategy (i.e., participant responsiveness), (4) quality of implementation, and (5) differentiation from unspecified procedures (Dusenbury et al., 2003; Teague, 2013). Researchers have also proposed that quality and differentiation primarily capture the characteristics of an EBP being implemented, whereas adherence, dose, and participant responsiveness hold relevance for implementation strategy fidelity (Century et al., 2010; Slaughter et al., 2015). Dusenbury et al. (2003) define adherence as the extent to which activities are consistent with the way a strategy is proposed, dose as the amount of strategy content received by participants, and participant responsiveness as the extent to which participants are engaged by and involved in the strategy.
In relation to the ACT SMART Toolkit, participants would refer to the agency implementation teams (i.e., a group of individuals within an agency responsible for facilitating EBP implementation). Fidelity is also considered dynamic and may be influenced by factors such as provider characteristics, the setting, and/or complexity of the strategy (Cross & West, 2011; Slaughter et al., 2015).

Assessing implementation strategy fidelity can help implementation strategy developers further understand which components of an implementation strategy may be core functions needed to produce desired outcomes and which components may be adapted to account for varying contextual characteristics (Kirk et al., 2019; Mihalic, 2004; Perez Jolles et al., 2019). Of course, this is contingent upon an ability to determine whether implementation of the strategy remained consistent with its underlying theory (Haynes et al., 2015; Moore et al., 2015). Notably, increasing understanding about how implementation strategies work has been identified as an important research priority within the field of dissemination and implementation science (Akiba et al., 2021; Powell et al., 2019).

Despite the importance of examining implementation strategy fidelity, fidelity to implementation strategies has rarely been assessed, and then only in a limited manner, as the focus of research has often been only on fidelity to the EBPs being implemented (Berry et al., 2021; Slaughter et al., 2015). Indeed, Slaughter et al. (2015) conducted a scoping review indicating that no articles reporting fidelity to implementation strategies included definitions or conceptual frameworks for assessing implementation strategy fidelity. To our knowledge, only one recent study has used a theoretical framework to evaluate fidelity to an implementation strategy (Berry et al., 2021).

Present Study. Using an instrumental case study approach to assess fidelity to the ACT SMART Toolkit during its pilot study can provide important insights into the use of the toolkit as well as the phenomenon of implementation strategy fidelity more broadly (Crowe et al., 2011). Examining implementation strategy fidelity can provide insight into the overall potential for ASD community agencies to use the toolkit as planned and ultimately report greater use of EBPs. Further, examination of implementation strategy fidelity can inform which of the ACT SMART Toolkit's specific components may be most needed or most adaptable in relation to its desired outcome of EBP use in ASD community agencies. This can increase implementation strategy sustainability by informing which toolkit components should be prioritized for completion and which aspects may be beneficial but not critical (i.e., demand optimization). This information may be particularly useful for ASD community agencies given potential competing priorities and identified contextual barriers to completing the toolkit in its entirety (Sridhar et al., 2021). Moreover, fidelity to the ACT SMART Toolkit can reflect its potential for other desired outcomes such as acceptability, appropriateness, and feasibility (Proctor et al., 2011; Weiner et al., 2017). This will be critical information for further development of the ACT SMART Toolkit as an implementation strategy supporting delivery of EBPs in ASD community agencies. Further, the process of assessing implementation strategy fidelity will also provide one of the first models of assessing fidelity to a comprehensive implementation strategy.
This model may then inform a broader understanding of implementation strategy fidelity and contribute to underlying theory.

We addressed two key questions:

1. What was fidelity to the ACT SMART Toolkit, according to adherence, dose, and participant responsiveness, during its pilot study?
2. Does implementation strategy fidelity to the toolkit predict direct provider-reported EBP use in ASD community agencies after controlling for pre-use reports?

We hypothesized that increased adherence, dose, and participant responsiveness would each significantly predict an increase in the proportion of direct provider-reported EBP use in an exploratory analysis.

METHOD

Participants. A total of six ASD community agencies located in Southern California were included in the pilot study of the ACT SMART Toolkit. Four of the ASD community agencies were Applied Behavior Analysis (ABA) organizations, one was an ABA and mental health organization, and one was a speech and language pathology organization. Five of the six ASD community agencies chose to adopt the EBP of video modeling and complete all phases of the ACT SMART Toolkit; one ASD community agency, an ABA organization, chose not to adopt an EBP at the end of the adoption decision phase of the toolkit.

Each ASD community agency developed implementation teams composed of agency staff (see Table 1 for implementation team demographic and discipline information). At least one agency leader was required for each implementation team. Eligibility criteria for agency leaders were: (1) holding the role of CEO, director, or leading decision-maker regarding treatment use at an ASD community agency eligible to participate in the ACT SMART pilot study, (2) willingness to participate in the pilot study for 1 year, and (3) agreement to provide feedback after completing each phase of the pilot study. The agency leader for each participating agency then invited up to four other agency staff members (i.e., supervisors and direct providers) to complete their agency's implementation team. The eligibility criterion for all members of each implementation team was commitment to providing feedback about the feasibility, acceptability, and utility of the ACT SMART Toolkit.

Table 1. Demographic and discipline information across implementation teams
(values are percentages for Agency Leaders, n=7 | Supervisors, n=8 | Direct Providers, n=1)

Sex assigned at birth (female): 100 | 100 | 100
Race
    White: 100 | 25 | 100
    Mixed race: - | 25 | -
    Prefer not to answer: - | 12.5 | -
    Missing: - | 37 | -
Education level
    Master's degree: 42.9 | 50 | 100
    Doctorate: 57.1 | 12.5 | -
    Missing: - | 37 | -
Discipline
    Psychology: 28.6 | 25 | -
    Behavior specialist: 28.6 | 25 | 100
    Speech/Language/Communication: 28.6 | 12.5 | -
    Education: 14.3 | - | -
    Missing: - | 37 | -

In addition to the implementation teams, direct providers within each of the ASD community agencies were surveyed about their use of EBPs pre- and post-implementation of the ACT SMART Toolkit (n=79 pre-implementation, n=80 post-implementation; see Table 2 for direct provider demographic and discipline information).
Table 2. Demographic and discipline information across direct providers
(values are percentages, shown pre-/post-toolkit, for Agency 1: n=21/13 | Agency 2: n=8/18 | Agency 3: n=12/11 | Agency 4: n=10/8 | Agency 5: n=6/6 | Agency 6: n=22/24)

Sex assigned at birth (female): 60/84.6 | 100/83.3 | 83.3/81.8 | 100/100 | 100/83.3 | 86.4/87.5
Ethnicity
    Spanish/Hispanic/Latinx: 30/38.5 | 25/22.2 | 25/27.3 | 10/- | 16.7/- | 72.7/58.3
    Not Spanish/Hispanic/Latinx: 60/38.5 | 75/72.2 | 75/72.7 | 90/100 | 83.3/100 | 22.7/25
    Prefer not to answer: 10/15.4 | -/5.6 | -/- | -/- | -/- | 4.5/16.7
    Missing: -/7.7 | -/- | -/- | -/- | -/- | -/-
Race
    White: 76.2/46.2 | 50/44.4 | 58.3/54.5 | 90/100 | 83.3/100 | 63.6/45.8
    Black or African American: 6.3/- | -/- | 8.3/9.1 | -/- | -/- | -/4.2
    Asian: -/7.6 | 25/33.3 | 8.3/18.2 | -/- | -/- | -/-
    American Indian or Alaskan Native: -/- | -/- | -/- | -/- | -/- | -/-
    Native Hawaiian or Pacific Islander: 6.3/- | -/5.5 | -/- | -/- | -/- | 4.5/-
    Mixed race: -/- | -/- | -/- | 10/- | 16.7/- | -/8.3
    Prefer not to answer: 12.5/30.8 | 25/16.8 | 33.3/27.3 | -/- | -/- | 27.3/29.2
    Missing: -/15.4 | -/- | -/- | -/- | -/- | -/12.5
Education level
    Some college: 10/23.1 | 12.5/27.8 | -/- | -/- | -/- | 9.1/8.3
    Associate's degree: 30/15.4 | -/- | -/- | -/- | -/- | 4.5/12.5
    Bachelor's degree: 5/38.5 | 50/50.0 | 83.3/72.7 | 30/12.5 | -/- | 77.3/62.5
    Master's degree: 50/15.4 | 25/11.1 | -/27.3 | 60/75.0 | 50/50 | 4.5/8.3
    Doctorate: 5/- | 12.5/5.6 | -/- | -/- | 50/50 | -/-
    Other: -/- | -/5.6 | 16.7/- | 10/12.5 | -/- | 4.5/8.3
    Missing: -/7.7 | -/- | -/- | -/- | -/- | -/-
Discipline
    Psychology: 47.4/46.2 | 50/55.6 | 58.3/54.5 | -/- | 66.7/66.7 | 50/62.5
    Education: -/7.7 | 25/22.3 | -/- | 10/12.5 | -/- | -/-
    Behavior specialist: 42.1/23.1 | -/5.6 | 33.3/27.3 | -/- | 33.3/33.3 | 9.1/16.7
    Speech/Language/Communication: 5.3/- | -/- | 8.3/9.1 | 90/87.5 | -/- | -/-
    Social work: -/- | -/- | -/- | -/- | -/- | 13.6/8.3
    Marriage and family therapy: -/- | -/- | -/- | -/- | -/- | 4.5/-
    Other: 5.3/15.4 | 12.5/16.7 | -/9.1 | -/- | -/- | 22.7/12.5
    Missing: -/7.7 | 12.5/- | -/- | -/- | -/- | -/-

Materials & Procedure. As part of the pilot study, a research assistant served as an independent observer and evaluated implementation teams' fidelity using the ACT SMART Implementation Milestones form, adapted with permission from the Stages of Implementation Completion (Chamberlain et al., 2011; See Appendix A), and the ACT SMART Activity Fidelity form that was created by the toolkit developers (See Appendix B). The ACT SMART Implementation Milestones form required the independent observer to record a Yes or No answer for whether activities during pre-implementation and phase 1 through phase 4 of the ACT SMART Toolkit were completed. In addition, the form also required the independent observer to note the date initiated and date completed for each activity. The ACT SMART Activity Fidelity form presented more detailed questions regarding completion of activities during Phase 2: Treatment Selection and Adoption Decision; Phase 3: Planning for Implementation; and Phase 4: Implementation. The independent observer recorded a Yes or No answer for whether implementation teams completed the form for each activity and then rated the amount of the form completed using a 4-point Likert scale where 0 = "Nothing Completed", 1 = "Minimally Completed (1-2 items)", 2 = "Moderately Completed (3-4 items)", and 3 = "Mostly/All Completed (5-6 items)".
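To make the dose rubric concrete, the following minimal sketch shows one way the number of completed items on a 6-item activity worksheet could be mapped onto the 4-point scale just described. The function name and the idea of scoring from raw item counts are illustrative assumptions, not the study's actual scoring code.

```python
# Illustrative sketch only (hypothetical helper, not the study's scoring code):
# map the number of completed items on a 6-item activity worksheet onto the
# 4-point dose rubric used by the ACT SMART Activity Fidelity form.

def dose_rating(items_completed: int) -> int:
    """Return the 0-3 dose rating for a 6-item worksheet."""
    if items_completed <= 0:
        return 0  # "Nothing Completed"
    if items_completed <= 2:
        return 1  # "Minimally Completed (1-2 items)"
    if items_completed <= 4:
        return 2  # "Moderately Completed (3-4 items)"
    return 3      # "Mostly/All Completed (5-6 items)"

assert [dose_rating(n) for n in range(7)] == [0, 1, 1, 2, 2, 3, 3]
```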
In addition to the observational data collected using the ACT SMART Implementation Milestones form and the ACT SMART Activity Fidelity form, ACT SMART facilitators rated implementation team engagement using the ACT SMART Implementation Team Engagement Rating Scale that was created by the toolkit developers (See Appendix C). Immediately after each facilitation meeting, the ACT SMART facilitator(s) rated implementation team engagement in ACT SMART activities and facilitation meetings since the last facilitation meeting occurred. Engagement ratings were completed using a 5-point Likert scale where 1 = "Not at all engaged", 2 = "Slightly Engaged", 3 = "Moderately Engaged", 4 = "Very Engaged", and 5 = "Extremely Engaged".

In the present study, we used the operational definitions from Dusenbury et al. (2003) and an overall scoring rubric for implementation strategy fidelity developed in Slaughter et al. (2015) as the basis for using the ACT SMART Implementation Milestones form, ACT SMART Activity Fidelity form, and ACT SMART Implementation Team Engagement Rating Scale to assess implementation strategy fidelity via adherence, dose, and participant responsiveness, respectively.

To assess EBP use, direct providers within the ASD community agencies self-reported EBP use via the ASD Strategies and Interventions Survey (ASD-SIS; Pickard et al., 2018) both before and after their agency used the ACT SMART Toolkit (See Appendix D). Providers were asked to rate the extent to which they agreed with the following statement, "I feel competent in my delivery of this practice," for each practice and strategy they reported using from a list of 55 intervention practices and strategies commonly used with youth on the autism spectrum. Providers were also asked to list any additional practices or strategies they currently used with their clients and to again rate the extent to which they agreed with the statement, "I feel competent in my delivery of this practice." Agreement was rated on a 5-point Likert scale where 1 = "Disagree Strongly", 2 = "Disagree", 3 = "Uncertain", 4 = "Agree", and 5 = "Agree Strongly." Developers of the ASD-SIS determined whether intervention practices and/or strategies were EBPs based on service reviews from the National Standards Project and the National Professional Development Center on Autism Spectrum Disorders at the time of the study (National Autism Center, 2009; Pickard et al., 2018; Wong et al., 2015).

Analysis Plan. We used an instrumental case study approach to explore both fidelity to the ACT SMART Toolkit and potential generalizations to a broader underlying theory of implementation strategy fidelity. First, we assessed adherence, dose, and participant responsiveness for the ACT SMART Toolkit overall as well as for each phase and activity of the toolkit. Utilizing the ACT SMART Implementation Milestones form, we assessed adherence via a Yes/No answer to whether implementation milestones were completed; overall, by phase, and by activity, we calculated the average percentage of "Yes" answers for required toolkit activities. We assessed dose by analyzing the Likert scales on the ACT SMART Activity Fidelity form evaluating how much of each activity was completed; overall, by phase, and by activity, we calculated the mean dose rating. Finally, we assessed participant responsiveness by analyzing the Likert scales on the ACT SMART Implementation Team Engagement Rating Scale, using dates of completion to confirm phase; overall and by phase, we calculated the mean participant responsiveness rating. We did not calculate the mean participant responsiveness rating by activity, as ratings for engagement were only given by phase. We also calculated an average percent agreement on participant responsiveness ratings from facilitation meetings in which multiple facilitators were present. Lastly, we calculated overall, by phase, and by activity adherence, dose, and participant responsiveness separately for each agency implementation team.
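As an illustration of these descriptive calculations, the sketch below aggregates form-level records into overall, by-phase, and by-agency summaries. The data frame layout, column names, and example values are assumptions for illustration rather than the study's actual data or code.

```python
import pandas as pd

# Hypothetical form-level records; column names and values are illustrative.
records = pd.DataFrame({
    "agency":    ["A1", "A1", "A1", "A2"],
    "phase":     ["Phase 2", "Phase 2", "Phase 3", "Phase 2"],
    "activity":  ["Evaluate treatment fit", "Evaluate funding source",
                  "Develop a training plan", "Evaluate treatment fit"],
    "adherence": [1, 1, 0, 1],     # Milestones form: 1 = "Yes", 0 = "No"
    "dose":      [3, 2, None, 3],  # Activity Fidelity rating (0-3), if observed
})

# Adherence as the average percentage of "Yes" answers; dose as the mean rating.
overall = {"adherence_pct": records["adherence"].mean() * 100,
           "dose_mean": records["dose"].mean()}
by_phase = records.groupby("phase")[["adherence", "dose"]].mean()
by_agency = records.groupby(["agency", "phase"])[["adherence", "dose"]].mean()

# Percent agreement for engagement ratings (1-5) from meetings rated by two
# facilitators, using exact agreement as one plausible definition.
ratings = pd.DataFrame({"rater1": [4, 5, 3], "rater2": [4, 5, 4]})
pct_agreement = (ratings["rater1"] == ratings["rater2"]).mean() * 100
```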
To evaluate whether adherence, dose, or participant responsiveness significantly differed by toolkit phase, we conducted repeated measures ANOVAs with toolkit phase as a within-subjects factor. We also conducted Bonferroni post-hoc tests and calculated effect sizes using local error terms. It should be noted that dose was not observed during phase 1 of the toolkit. Further, the one ASD community agency that chose not to adopt an EBP at the end of the adoption decision phase (phase 2) of the toolkit did not have any implementation strategy fidelity variables observed during phase 3 or phase 4 of the toolkit. Additionally, only three of the remaining five ASD community agencies had engagement ratings collected during phase 4 of the toolkit.

Second, we conducted an exploratory analysis to determine whether adherence, dose, and participant responsiveness for the ACT SMART Toolkit each significantly predicted direct provider-reported use of the implementation team selected EBP (i.e., video modeling). We conducted analyses both at the agency level and at the level of direct providers nested within agencies. For analysis at the agency level, we used a series of beta regressions to evaluate whether adherence, dose, or participant responsiveness predicted the proportion of direct providers reportedly using video modeling post-toolkit beyond the proportion of direct providers reportedly using video modeling pre-toolkit.

For analysis at the level of direct providers within agencies, there was one binary observation of video modeling use post-toolkit per direct provider, and direct providers were nested within each of the ASD community agencies participating in the pilot study. Due to the multilevel, and therefore potentially correlated, data, a multilevel logistic generalized estimating equation (GEE) was modeled for each implementation strategy fidelity variable (i.e., adherence, dose, and participant responsiveness) with pre-toolkit reported video modeling use as a covariate. An exchangeable correlation structure was specified based on prior research (Teerenstra et al., 2010) and a Firth-type penalization was added to address data separation (Heinze & Schemper, 2002; Mondol & Rahman, 2019).
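For readers who want a concrete starting point, the sketch below shows how the agency-level beta regression and the provider-level logistic GEE with an exchangeable correlation structure could be specified using statsmodels in Python. The data frames and column names are hypothetical, and the Firth-type penalization used in the present analysis is not shown, since statsmodels does not provide it; a penalized estimator would need to be substituted.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.othermod.betareg import BetaModel

# --- Agency-level beta regression (hypothetical data) ---
# One row per agency; outcomes must lie strictly inside (0, 1) for BetaModel.
agencies = pd.DataFrame({
    "prop_vm_post": [0.35, 0.60, 0.10, 0.55, 0.80],  # outcome
    "prop_vm_pre":  [0.20, 0.30, 0.05, 0.25, 0.40],  # covariate
    "dose":         [2.69, 2.65, 2.48, 2.27, 2.25],  # one fidelity predictor
})
X = sm.add_constant(agencies[["prop_vm_pre", "dose"]])
beta_fit = BetaModel(agencies["prop_vm_post"], X).fit()

# --- Provider-level logistic GEE with exchangeable correlation ---
# One binary row per direct provider, nested within agencies (hypothetical).
providers = pd.DataFrame({
    "vm_post": [1, 0, 1, 1, 1, 0, 0, 0, 1, 1, 0, 1],
    "vm_pre":  [0, 0, 1, 1, 0, 0, 0, 0, 0, 1, 0, 1],
    "dose":    [2.69] * 3 + [2.65] * 3 + [2.48] * 3 + [2.27] * 3,
    "agency":  ["A1"] * 3 + ["A2"] * 3 + ["A4"] * 3 + ["A5"] * 3,
})
Xg = sm.add_constant(providers[["vm_pre", "dose"]])
gee_fit = sm.GEE(providers["vm_post"], Xg, groups=providers["agency"],
                 family=sm.families.Binomial(),
                 cov_struct=sm.cov_struct.Exchangeable()).fit()

# Note: plain GEE can give unstable estimates under data separation; the
# analysis described above added a Firth-type penalization to address this.
print(beta_fit.summary())
print(gee_fit.summary())
```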
RESULTS

Overall Fidelity to the ACT SMART Toolkit. Agency implementation teams adhered to an overall average of 87% (SD = 6%) of required ACT SMART Toolkit activities. Average adherence ranged from 72% (SD = 27%) completion of required toolkit activities during the planning for implementation phase of the toolkit to 92% (SD = 10%) completion of required toolkit activities during the treatment selection and adoption decision phase of the toolkit (See Table 3). While completion rates for individual activities within phases were also relatively high across agencies, there was some variability. There were lower average completion rates for activities such as evaluating a benefit-cost estimator, gathering treatment materials, developing an adaptation plan, and carrying out an implementation and sustainment plan, compared with higher average completion rates for activities related to treatment evaluation, funding, and training.

Table 3. Adherence, dose, and participant responsiveness to the ACT SMART Toolkit calculated overall, by phase, and by activity across ASD community agency implementation teams
(cells are Adherence M(SD), 0-100% | Dose M(SD), 0-3 | Participant Responsiveness M(SD), 1-5)

Overall: 87 (6.00) | 2.42 (.50) | 3.91 (.56)
Pre-Implementation (Recruitment): 100 (0.00) | - | -
    Agency first contacted: 100 (0.00) | - | -
    Agency interest indicated: 100 (0.00) | - | -
    Agency recruitment meeting: 100 (0.00) | - | -
    Orientation workshop: 100 (0.00) | - | -
Phase 1 (Agency Exploration): 83 (18.00) | - | 3.79 (0.71)
    Meeting at agency to recruit for agency assessment: 83 (40.82) | - | -
    Emails sent to agency staff for agency assessment: 100 (0.00) | - | -
    ACT SMART agency assessment (75% staff response rate): 67 (51.64) | - | -
Phase 2 (Treatment Selection and Adoption Decision): 92 (10.00) | 2.48 (0.60) | 2.33 (1.03)
    Treatment selection: 100 (0.00) | - | -
    Evaluate treatment fit: 100 (0.00) | 3.00 (0.00) | -
    Evaluate treatment feasibility: 100 (0.00) | 3.00 (0.00) | -
    Evaluate clinical value and research validity: 83 (40.82) | 2.50 (1.22) | -
    Evaluate training requirements: 100 (0.00) | 2.33 (1.00) | -
    Evaluate funding source: 100 (0.00) | 2.33 (1.03) | -
    Evaluate benefit-cost estimator: 60 (54.78) | 1.75 (1.50) | -
    Make an adoption decision: 83.33 (40.82) | 2.00 (1.55) | -
Phase 3 (Planning for Implementation): 72 (27.00) | 1.72 (0.59) | 3.50 (1.87)
    Gather treatment materials: 60 (54.78) | 0.60 (0.55) | -
    Evaluate prospective treatment adaptations: 80 (44.72) | 2.40 (1.34) | -
    Develop an adaptation plan: 25 (50.00) | 1.00 (1.73) | -
    Develop a training plan: 100 (0.00) | 2.60 (0.89) | -
    Develop an implementation and sustainment plan: 80 (44.72) | 1.60 (1.14) | -
Phase 4 (Implementation): 83 (24.00) | 2.98 (0.05) | 3.33 (1.63)
    Carry out adaptation plan: 100 (0.00) | - | -
    Carry out training plan: 100 (0.00) | - | -
    Carry out implementation and sustainment plan: 60 (55.00) | - | -

In terms of dose, the independent observer gave agency implementation teams an overall average rating falling between "Moderately Completed" and "Mostly/All Completed" (M = 2.42, SD = .50). The lowest average dose rating was between "Minimally Completed" and "Moderately Completed" (M = 1.72, SD = .59) during the planning for implementation phase, whereas the highest average dose rating was between "Moderately Completed" and "Mostly/All Completed" (M = 2.98, SD = .05) during the implementation phase of the toolkit (See Table 3). Consistent with observations of adherence, there were lower average dose ratings for activities such as evaluating a benefit-cost estimator, gathering treatment materials, developing an adaptation plan, and developing an implementation and sustainment plan compared to higher average dose ratings for activities related to treatment evaluation, funding, and training. Here, it should be noted that average dose ratings by activity could not be calculated for the implementation phase given that evaluation surveys during this phase were designed to be dynamic and capture completion of different sets of tasks by agency.

For participant responsiveness, ACT SMART facilitators gave agency implementation teams an overall average rating between "Moderately Engaged" and "Very Engaged" (M = 3.91, SD = .56).
The lowest average participant responsiveness rating was between "Slightly Engaged" and "Moderately Engaged" (M = 2.33, SD = 1.03) during the treatment selection and adoption decision phase of the toolkit. The highest average participant responsiveness rating was between "Moderately Engaged" and "Very Engaged" (M = 3.79, SD = .71) during the agency exploration phase (See Table 3). For facilitation meetings with multiple ACT SMART facilitators present, there was a 90.74% average agreement on participant responsiveness ratings.

Spearman correlations among the implementation strategy fidelity variables and video modeling use variables are presented in Table 4. We found the only significant correlation to be between dose and participant responsiveness, r(4) = .83, p < .05, indicating that as dose increased, so too did participant responsiveness.

Table 4. Means, standard deviations, and Spearman correlations with 95% confidence intervals for implementation strategy fidelity and proportion of direct providers using video modeling

1. Proportion of direct providers using video modeling pre-toolkit (M = 22.0, SD = 22.0)
2. Adherence (M = 87.0, SD = 6.0): with 1, .15 [-0.75, 0.86]
3. Dose (M = 2.42, SD = 0.50): with 1, -.29 [-0.89, 0.68]; with 2, .60 [-0.41, 0.95]
4. Participant responsiveness (M = 3.91, SD = 0.56): with 1, -.44 [-0.92, 0.58]; with 2, .54 [-0.48, 0.94]; with 3, .83* [0.05, 0.98]
5. Proportion of direct providers using video modeling post-toolkit (M = 42.0, SD = 36.0): with 1, .09 [-0.78, 0.84]; with 2, .43 [-0.59, 0.92]; with 3, .49 [-0.54, 0.93]; with 4, .77 [-0.11, 0.97]

Note. *p < .05
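As an illustration, Spearman coefficients like those in Table 4 can be computed as follows. The Fisher-transformation confidence interval shown is a common approximation and is an assumption here, since the interval method used for Table 4 is not specified; the input values are hypothetical.

```python
import numpy as np
from scipy import stats

def spearman_with_ci(x, y, alpha=0.05):
    """Spearman's r with an approximate Fisher-transform CI (assumed method)."""
    r, p = stats.spearmanr(x, y)
    z = np.arctanh(r)                 # Fisher z-transform of r
    se = 1.0 / np.sqrt(len(x) - 3)    # approximate standard error
    crit = stats.norm.ppf(1 - alpha / 2)
    return r, p, (np.tanh(z - crit * se), np.tanh(z + crit * se))

# Hypothetical agency-level fidelity summaries (n = 6), for illustration only.
dose = [2.7, 2.6, 1.5, 2.5, 2.3, 2.2]
responsiveness = [3.8, 4.6, 3.3, 3.8, 4.5, 3.4]
print(spearman_with_ci(dose, responsiveness))
```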
By Agency Fidelity to the ACT SMART Toolkit. Across agencies, there was generally high adherence to toolkit activities, with the agency implementation team with the lowest overall adherence rating adhering to an average of 76% (SD = 17%) of required toolkit activities (See Table 5). While there was some variability in adherence across phases and activities by agency, there was no readily identifiable pattern of agencies consistently having lower or higher adherence compared to other agencies. Consistent with other results, the planning for implementation phase appeared to have the lowest adherence ratings across agencies.

Table 5. Adherence to the ACT SMART Toolkit calculated overall, by phase, and by activity for each ASD community agency implementation team
(Adherence M(SD), 0-100%; columns are Agency 1 | Agency 2 | Agency 3 | Agency 4 | Agency 5 | Agency 6)

Overall: 84.17 (16.77) | 93.33 (14.91) | 91.67 (14.43) | 92.00 (17.89) | 88.00 (26.83) | 76.17 (16.86)
Pre-Implementation (Recruitment): 100 (0.00) | 100 (0.00) | 100 (0.00) | 100 (0.00) | 100 (0.00) | 100 (0.00)
    Agency first contacted: 100 | 100 | 100 | 100 | 100 | 100
    Agency interest indicated: 100 | 100 | 100 | 100 | 100 | 100
    Agency recruitment meeting: 100 | 100 | 100 | 100 | 100 | 100
    Orientation workshop: 100 | 100 | 100 | 100 | 100 | 100
Phase 1 (Agency Exploration): 66.67 (57.74) | 66.67 (57.74) | 100 (0.00) | 100 (0.00) | 100 (0.00) | 66.70 (57.74)
    Meeting at agency to recruit for agency assessment: 100 | 0 | 100 | 100 | 100 | 100
    Emails sent to agency staff for agency assessment: 100 | 100 | 100 | 100 | 100 | 100
    ACT SMART agency assessment (75% staff response rate): 0 | 100 | 100 | 100 | 100 | 0
Phase 2 (Treatment Selection and Adoption Decision): 87.50 (35.36) | 100 (0.00) | 75.00 (46.29) | 100 (0.00) | 100 (0.00) | 87.50 (35.36)
    Treatment selection: 100 | 100 | 100 | 100 | 100 | 100
    Evaluate treatment fit: 100 | 100 | 100 | 100 | 100 | 100
    Evaluate treatment feasibility: 100 | 100 | 100 | 100 | 100 | 100
    Evaluate clinical value and research validity: 100 | 100 | 0 | 100 | 100 | 100
    Evaluate training requirements: 100 | 100 | 100 | 100 | 100 | 100
    Evaluate funding source: 100 | 100 | 100 | 100 | 100 | 100
    Evaluate benefit-cost estimator: 0 | 100 | 0 | - | 100 | 100
    Make an adoption decision: 100 | 100 | 100 | 100 | 100 | 0
Phase 3 (Planning for Implementation): 100.00 (0.00) | 100 (0.00) | - | 60.00 (54.77) | 40.00 (54.77) | 60.00 (54.77)
    Gather treatment materials: 100 | 100 | - | 0 | 0 | 100
    Evaluate prospective treatment adaptations: 100 | 100 | - | 100 | 0 | 100
    Develop an adaptation plan: 100 | - | - | 0 | 0 | 0
    Develop a training plan: 100 | 100 | - | 100 | 100 | 100
    Develop an implementation and sustainment plan: 100 | 100 | - | 100 | 100 | 0
Phase 4 (Implementation): 66.67 (57.74) | 100 (0.00) | - | 100 (0.00) | 100 (0.00) | 66.67 (57.74)
    Carry out adaptation plan: 100 | - | - | - | - | 100
    Carry out training plan: 100 | - | - | 100 | 100 | 100
    Carry out implementation and sustainment plan: 0 | 100 | - | 100 | 100 | 0

Agencies also all had generally high dose ratings for toolkit activities, except for the one agency (Agency 3) that chose not to adopt an EBP at the end of the adoption decision phase of the toolkit (See Table 6). Like the ratings of adherence by agency, there was variability in dose ratings but no consistent identifiable pattern. Further, the planning for implementation phase had the lowest dose ratings across agencies.
Table 6. Dose to the ACT SMART Toolkit calculated overall, by phase, and by activity for each ASD community agency implementation team
(Dose M(SD), 0-3; columns are Agency 1 | Agency 2 | Agency 3 | Agency 4 | Agency 5 | Agency 6)

Overall: 2.69 (0.43) | 2.65 (0.38) | 1.50 (0.00) | 2.48 (0.76) | 2.27 (1.27) | 2.25 (0.66)
Phase 2 (Treatment Selection and Adoption Decision): 3.00 (0.00) | 2.71 (0.76) | 1.50 (1.64) | 2.83 (0.41) | 3.00 (0.00) | 2.00 (1.29)
    Treatment selection: - | - | - | - | - | -
    Evaluate treatment fit: 3 | 3 | 3 | 3 | 3 | 3
    Evaluate treatment feasibility: 3 | 3 | 3 | 3 | 3 | 3
    Evaluate clinical value and research validity: 3 | 3 | 0 | 3 | 3 | 3
    Evaluate training requirements: - | 3 | - | 2 | - | 3
    Evaluate funding source: 3 | 1 | 3 | 3 | 3 | 1
    Evaluate benefit-cost estimator: - | 3 | 0 | - | 3 | 1
    Make an adoption decision: 3 | 3 | 0 | 3 | 3 | 0
Phase 3 (Planning for Implementation): 2.20 (1.10) | 2.25 (0.96) | - | 1.60 (1.52) | 0.80 (1.30) | 1.75 (1.50)
    Gather treatment materials: 1 | 1 | - | 0 | 0 | 1
    Evaluate prospective treatment adaptations: 3 | 3 | - | 3 | 0 | 3
    Develop an adaptation plan: 3 | - | - | 0 | 0 | -
    Develop a training plan: 3 | 3 | - | 3 | 1 | 3
    Develop an implementation and sustainment plan: 1 | 2 | - | 2 | 3 | 0
Phase 4 (Implementation): 2.89 (0.44) | 3.00 (0.00) | - | 3.00 (0.00) | 3.00 (0.00) | 3.00 (0.00)
    Carry out adaptation plan: - | - | - | - | - | -
    Carry out training plan: - | - | - | - | - | -
    Carry out implementation and sustainment plan: - | - | - | - | - | -

Consistent with both observations of adherence and dose ratings across agencies, all agencies also had relatively high ratings of participant responsiveness (See Table 7). The agency with the lowest average participant responsiveness rating was rated between "Moderately Engaged" and "Very Engaged" (M = 3.33, SD = 0.11). However, in contrast to observations of adherence and dose ratings, agencies did not appear to have lower participant responsiveness during the planning for implementation phase compared to other toolkit phases.

Table 7. Participant responsiveness to the ACT SMART Toolkit calculated overall and by phase for each ASD community agency implementation team
(Participant Responsiveness M(SD), 1-5; columns are Agency 1 | Agency 2 | Agency 3 | Agency 4 | Agency 5 | Agency 6)

Overall: 3.75 (0.25) | 4.63 (0.48) | 3.33 (0.11) | 3.75 (0.29) | 4.47 (0.39) | 3.37 (0.32)
Phase 1 (Agency Exploration): 4 | 4 | 3.25 | 3.5 | 5 | 3
Phase 2 (Treatment Selection and Adoption Decision): 3.5 | 5 | 3.40 | 3.50 | 4.50 | 3.50
Phase 3 (Planning for Implementation): 3.75 | 4.50 | - | 4.0 | 4.13 | 3.60
Phase 4 (Implementation): - | 5.0 | - | 4.0 | 4.25 | -

Differences in Fidelity to the ACT SMART Toolkit by Toolkit Phase. Our repeated measures ANOVAs comparing implementation strategy fidelity variables (i.e., adherence, dose, and participant responsiveness) across phases revealed a significant main effect of toolkit phase for dose (F(2,8) = 11.38, MSE = .190, p = .005, η² = .74, 95% CI [.16, .84]). However, there was not a significant main effect of toolkit phase for either adherence (F(3,12) = 1.11, MSE = .041, p = .384, η² = .22, 95% CI [0, .43]) or participant responsiveness (F(3,6) = .19, MSE = .211, p = .902, η² = .09, 95% CI [0, .25]).

Using Bonferroni post-hoc tests with local error terms to further examine the significant main effect of toolkit phase on dose, we found that the average dose rating during the planning for implementation phase (phase 3) of the toolkit was significantly lower than the average dose rating during the implementation phase (phase 4) of the toolkit (d = 3.98, 95% CI [1.05, 6.88]).
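The dose-by-phase comparison can be reproduced in outline as follows. This sketch uses the by-agency dose means from Table 6 (Agency 3 is excluded because it has no phase 3 or phase 4 observations); note that the post-hoc step shown uses simple Bonferroni-corrected paired t-tests, whereas the analysis above used Bonferroni tests with local error terms.

```python
import pandas as pd
from itertools import combinations
from scipy import stats
from statsmodels.stats.anova import AnovaRM

# By-agency phase-level dose means from Table 6; AnovaRM requires complete,
# balanced data, so Agency 3 (no phase 3 or 4 observations) is excluded.
long = pd.DataFrame({
    "agency": ["A1"] * 3 + ["A2"] * 3 + ["A4"] * 3 + ["A5"] * 3 + ["A6"] * 3,
    "phase":  ["P2", "P3", "P4"] * 5,
    "dose":   [3.00, 2.20, 2.89,  2.71, 2.25, 3.00,  2.83, 1.60, 3.00,
               3.00, 0.80, 3.00,  2.00, 1.75, 3.00],
})
res = AnovaRM(long, depvar="dose", subject="agency", within=["phase"]).fit()
print(res.anova_table)

# Bonferroni-corrected pairwise comparisons via paired t-tests (simplified).
pairs = list(combinations(["P2", "P3", "P4"], 2))
for a, b in pairs:
    x = long.loc[long["phase"] == a, "dose"].to_numpy()
    y = long.loc[long["phase"] == b, "dose"].to_numpy()
    t, p = stats.ttest_rel(x, y)
    print(f"{a} vs {b}: t = {t:.2f}, Bonferroni p = {min(p * len(pairs), 1.0):.3f}")
```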
Fidelity-EBP Use Relationships. After conducting our exploratory beta regressions and series of exploratory Firth-type multilevel logistic GEE models with exchangeable correlation structures to examine the relationship between implementation strategy fidelity and EBP use, we found the results to be uninterpretable due to the limited sample size available. For completeness, these results are reported in Appendix E.

DISCUSSION

Fidelity to the ACT SMART Toolkit. Our present investigation used an instrumental case study approach to evaluate implementation strategy fidelity to the ACT SMART Toolkit by assessing observational descriptive ratings of adherence, dose, and participant responsiveness, and explored whether greater implementation strategy fidelity could predict increases in the desired outcome of EBP use within ASD community agencies. Our evaluation provided one of the first models of assessing fidelity to a multi-faceted implementation strategy and important insights into both the potential for ASD community agencies to use the toolkit most effectively and implementation strategy fidelity more broadly. Given that EBPs for ASD are often inconsistently or not implemented in community settings despite their potential to improve outcomes for a growing clinical population, understanding effective use of the toolkit could contribute to addressing an important research-to-practice gap (Drahota et al., 2020; Paynter et al., 2016; Pickard et al., 2017; Wong et al., 2015; Wood et al., 2015).

Overall, we found that adherence, dose, and participant responsiveness to the ACT SMART Toolkit were relatively high, which supports the potential for the toolkit to be used with fidelity in ASD community agencies. We also found a significant positive correlation between dose and participant responsiveness, which may indicate that completing a greater amount of the toolkit allows for greater engagement. Consistent with this observation, researchers evaluating intervention fidelity have found that increased dose may influence the quality of participant responsiveness, as completing more of an intervention leads to higher frequency of interaction and greater engagement (Hulleman & Cordray, 2009; Knoche et al., 2010). However, the significant positive correlation between dose and participant responsiveness may also reflect the existence of a latent variable. Carroll and colleagues (2007) highlight the concept of "reaction evaluation," or the judgments made by recipients about the relevance and outcomes of an intervention, as important in considering intervention fidelity. If more positive reactions lead to a greater willingness to complete more components and engage more with an intervention or implementation strategy, "reaction evaluation" could potentially underlie both dose and participant responsiveness in intervention fidelity and implementation strategy fidelity.

Although we found adherence, dose, and participant responsiveness to the ACT SMART Toolkit to be high overall, there was some variability in implementation strategy fidelity by toolkit phase. Specifically, we found that dose was significantly lower in the planning for implementation phase (phase 3) compared to the implementation phase (phase 4). One possible rationale for this finding is that there were substantial differences in demands for toolkit activities by phase.
Indeed, the planning for implementation phase required gathering materials, evaluating prospective adaptations, and developing training, adaptation, and sustainment plans, whereas the implementation phase required carrying out and evaluating the developed plans. Consistent with this, there were both lower adherence and dose ratings for toolkit activities such as developing adaptation and implementation and sustainment plans compared to toolkit activities related to evaluating treatments, funding, and training. Thus, the lower dose in the planning for implementation phase may reflect the need to lower the amount or intensity of toolkit activities required to better align with ASD community agencies' ability to plan for implementation. Considering recently identified context-specific barriers and facilitators to the ACT SMART Toolkit would also likely be critical to enhancing the planning for implementation phase (Powell et al., 2020; Sridhar et al., 2021).

Another potential rationale for significantly lower dose during the planning for implementation phase compared to the implementation phase may be that ASD community agencies perceived greater value in implementing the chosen EBP than in planning for its implementation. While agency implementation teams were rated as moderately to very engaged during the planning for implementation phase, it is unclear how well facilitators were able to emphasize the important relationship between planning and implementation. However, researchers have recently proposed that fostering this understanding is necessary to support successful and sustainable implementation (Leal Filho et al., 2019). Thus, the ACT SMART Toolkit may also benefit from incorporating a greater focus on the practical importance of planning for implementation of EBPs.

Our present investigation was able to adequately assess overall implementation strategy fidelity to the ACT SMART Toolkit and consider implications for effective use of the toolkit within ASD community agencies. However, our exploratory analysis was ultimately unequipped to evaluate the relationship between increased fidelity and increased EBP use given the limited sample size of the pilot study (See Appendix E). While the present findings from our exploratory analysis cannot be interpreted with certainty, we contend that they highlight potential for significant effects of dose and participant responsiveness on EBP use, while the relationship between adherence and EBP use remains non-significant.

Assuming these findings can be replicated and interpreted in future investigations, there may be the possibility that the ACT SMART Toolkit is composed of both core and peripheral components (Damschroder et al., 2009; Stirman et al., 2012, 2019). Specifically, adherence to all toolkit activities may not be necessary to achieve a preliminary impact on increasing direct provider-reported EBP use. Given that the lowest adherence was observed during the planning for implementation phase (phase 3), this phase may be particularly likely to include activities peripheral to the core components of the ACT SMART Toolkit. However, lower adherence during this phase may also simply reflect fatigue with the toolkit and preparing to implement the chosen EBP.
In addition, because dose was also lowest during the planning for implementation phase yet showed a potentially significant effect on provider-reported EBP use, adjusting the demands of the planning for implementation phase may nonetheless be important to enhance both feasibility and potential impact on provider-reported EBP use.

Another potential explanation for the lack of an identified linear relationship between adherence to the toolkit and EBP use may be that a curvilinear relationship exists instead, such that modest adherence is associated with the greatest EBP use. Consistent with this possibility, researchers have suggested curvilinear relationships between intervention adherence and desired outcomes (Barber et al., 2006; Hogue et al., 2008; McHugo et al., 2007). Thus, assessing both linear and curvilinear relationships between adherence and provider-reported EBP use could allow for greater insight into whether both core and peripheral components of the ACT SMART Toolkit exist.

To sufficiently evaluate each of these hypotheses regarding core and peripheral components of the ACT SMART Toolkit, larger sample sizes in future studies will be required. This is consistent with the phases of intervention implementation studies proposed by Hamilton and Mittman (2018). They propose that initial studies evaluate implementation programs during a pilot study to develop preliminary evidence surrounding feasibility, acceptability, and potential effectiveness of implementation strategies, and that subsequent studies focus on fidelity and adaptation in efficacy-oriented small-scale trials (Hamilton & Mittman, 2018).

Implementation Strategy Fidelity Theory. Taken together, our instrumental case study assessment of fidelity to the ACT SMART Toolkit and exploration of the potential relationship between fidelity and EBP use within ASD community agencies notably provide one of the first models of assessing implementation strategy fidelity. Although a considerable amount of research has been conducted on intervention fidelity, few researchers have explored implementation strategy fidelity (Berry et al., 2021; Slaughter et al., 2015). Further, Slaughter et al. (2015) found that no studies reporting on fidelity to implementation strategies included a specific definition or theoretical framework for assessing implementation strategy fidelity. To our knowledge, only Berry and colleagues (2021) recently used an adapted Conceptual Framework for Implementation Fidelity to guide their evaluation of fidelity to practice facilitation as a strategy to improve primary care practices' adoption of evidence-based guidelines for cardiovascular disease.

Despite limited research, evaluating and understanding implementation strategy fidelity has important implications and is identified as a research priority within dissemination and implementation science (Akiba et al., 2021; Haynes et al., 2015; Moore et al., 2015; Powell et al., 2019). High fidelity to an implementation strategy may be reflective of other important implementation outcomes, such as high acceptability, appropriateness, and feasibility (Proctor et al., 2011; Weiner et al., 2017). Further, implementation strategy fidelity can inform determination of which components of a strategy are required to produce change and which can be removed or adapted to account for varying contextual characteristics (Kirk et al., 2019; Mihalic, 2004; Perez Jolles et al., 2019).
In turn, this knowledge can allow for demand optimization when the implementation strategy is being used, which may be particularly important when users of an implementation strategy have competing priorities or contextual factors that make completing the entirety of a multi-faceted implementation strategy difficult (Sridhar et al., 2021).

From our instrumental case study of fidelity to the ACT SMART Toolkit, we have demonstrated that assessing fidelity to multi-faceted, multi-phased implementation strategies is possible. Further, we have highlighted that implementation strategy fidelity may vary according to differing components of a strategy, such as components focusing on planning for implementation versus components focusing on implementation itself. We have also observed that implementation strategy fidelity may vary by context. Here, implementation strategy fidelity was observed to vary across different ASD community agencies using the ACT SMART Toolkit. Taken together, these findings suggest that a next step to further understand implementation strategy fidelity may be researching its potential dynamic shifts across both strategy content and context. Importantly, increasing this understanding could then also inform commonly needed adaptations to improve implementation strategy fidelity.

Strengths. We propose that a main strength of our investigation is that we demonstrate one of the first instrumental case studies to consider fidelity to a multi-faceted, multi-phased implementation strategy. Importantly, our assessment of fidelity to the ACT SMART Toolkit may be able to provide a framework for other evaluations of implementation strategy fidelity and inform the underlying theory of implementation strategy fidelity. Within our evaluation, we also found overall high fidelity to the toolkit within ASD community agencies and identified potential ways in which to optimize demands of the toolkit and increase sustainability.

Limitations. In contrast, important limitations of our investigation include potential issues with measurement of specific implementation strategy fidelity variables. For example, we may have been capturing a latent variable underlying dose and participant responsiveness given their significant positive correlation. Further, Berry and colleagues (2021) recently considered participant responsiveness as a moderator of implementation strategy fidelity rather than a component of fidelity itself, as it was considered in our analysis. Moreover, the potential issues with measurement may have been compounded by the fact that standard measures were not used for dose and participant responsiveness. However, as an emerging field, implementation science often faces issues related to measurement, and standard measures specific to implementation strategy fidelity have not yet been developed (Lewis & Dorsey, 2020). Researchers have developed some standard measures for intervention fidelity, and these may be able to be adapted to assess implementation strategy fidelity in the future (Ibrahim & Sidani, 2015).

Another potential limitation in our investigation is that there were different raters for adherence, dose, and participant responsiveness. While an independent observer rated adherence and dose for each implementation team, participant responsiveness was rated by a facilitator following implementation teams' facilitation meetings.
Although this presents potential for bias, direct observation by independent observers, and even by implementers, has been found to be more accurate than collecting reports directly from participants (Ibrahim & Sidani, 2015). Further, when two facilitators independently rated participant responsiveness, there were high rates of agreement (a computation of such agreement is sketched at the end of this section).

Despite the strength of assessing fidelity to a multi-faceted, multi-phased implementation strategy, there were also some notable limitations. While we were generally able to assess implementation strategy fidelity by toolkit phase and activity, we were unable to assess all variables for all activities or by toolkit facet (i.e., website versus facilitation meetings). Thus, we cannot draw conclusions about all activities or about the impact of the multi-faceted nature of the toolkit on implementation strategy fidelity. Further, our results may not generalize to discrete implementation strategies, which may benefit from their own instrumental case studies.

Lastly, the most important limitation of our assessment of fidelity to the ACT SMART Toolkit was the limited sample size, which rendered us underpowered to fully evaluate relationships between implementation strategy fidelity and EBP use. Moreover, our limited sample size also precluded us from considering additional factors, such as implementation team and provider demographics and organizational climate within ASD community agencies. While we were able to observe variable implementation strategy fidelity across ASD community agencies, we were not yet able to identify consistent patterns related to higher or lower implementation strategy fidelity. However, there is evidence that some of the aforementioned factors may moderate the relationship between fidelity to the ACT SMART Toolkit and increased EBP use (Hasson et al., 2012).
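The facilitator agreement check noted above can be made concrete with a simple computation. The sketch below uses invented paired ratings (not the study's data) to calculate exact percent agreement and a linearly weighted Cohen's kappa for two facilitators rating implementation team engagement on the 1-5 scale in Appendix C.

```python
# Minimal sketch: inter-rater agreement for two facilitators' engagement
# ratings (1-5 scale from Appendix C). The paired ratings below are
# hypothetical examples, not data from the present study.
import numpy as np

def weighted_kappa(r1, r2, n_levels=5):
    """Linearly weighted Cohen's kappa for two raters on an ordinal scale."""
    r1 = np.asarray(r1) - 1  # shift 1-5 ratings to 0-based indices
    r2 = np.asarray(r2) - 1
    observed = np.zeros((n_levels, n_levels))
    for a, b in zip(r1, r2):
        observed[a, b] += 1
    observed /= observed.sum()
    # Linear agreement weights: full credit on the diagonal, partial nearby.
    levels = np.arange(n_levels)
    weights = 1 - np.abs(levels[:, None] - levels[None, :]) / (n_levels - 1)
    expected = np.outer(observed.sum(axis=1), observed.sum(axis=0))
    po = (weights * observed).sum()  # weighted observed agreement
    pe = (weights * expected).sum()  # weighted chance agreement
    return (po - pe) / (1 - pe)

facilitator_a = [5, 4, 4, 5, 3, 4, 5, 4]  # hypothetical meeting-by-meeting ratings
facilitator_b = [5, 4, 3, 5, 3, 4, 5, 5]

exact = np.mean(np.array(facilitator_a) == np.array(facilitator_b))
print(f"Exact agreement: {exact:.2f}")
print(f"Weighted kappa:  {weighted_kappa(facilitator_a, facilitator_b):.2f}")
```

Reporting coefficients of this kind alongside percent agreement would let future evaluations quantify the reliability of facilitator-rated participant responsiveness rather than describing it only narratively.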
Conclusion & Future Directions. In summary, using an instrumental case study approach, we increased understanding of effective use of the ACT SMART Toolkit as well as of the theory of implementation strategy fidelity more broadly. We found that the ACT SMART Toolkit can be used with high fidelity in ASD community agencies. However, we also found some variability in fidelity among toolkit phases, which points to possible adaptations that could further improve the toolkit's usability in ASD community agencies. Although we were not able to fully evaluate the relationship between fidelity to the ACT SMART Toolkit and the desired outcome of EBP use, our findings highlight that further investigation of this relationship with larger samples may provide important insight into the existence of potential core and peripheral components of the toolkit (Damschroder et al., 2009; Stirman et al., 2012; Stirman et al., 2019). In turn, this understanding may also guide the selection of specific adaptations to the toolkit. Considering such adaptations may be critical, as our findings may reflect that fidelity to multi-faceted, multi-phased implementation strategies is dynamic and affected by both strategy content and context.

Future research would benefit from the exploration of both linear and curvilinear relationships between adherence and EBP use, consideration of potential moderators of implementation strategy fidelity, and use of both standard measures and independent raters (Barber et al., 2006; Hasson et al., 2012; Hogue et al., 2008; Ibrahim & Sidani, 2015; Lewis & Dorsey, 2020; McHugo et al., 2007). In addition, future studies may benefit from a design intended to systematically evaluate fidelity to all components and facets of a strategy. These lines of research may provide further insight into both effective use of the ACT SMART Toolkit and the inner workings of implementation strategy fidelity more broadly. Taken together, our findings and suggestions for future research are critically important given the strong need for consistent implementation of EBPs for ASD in community settings to improve care for autistic youth. Moreover, our findings advance the field of implementation science by providing a systematic evaluation of implementation strategy fidelity that may inform the theory and evaluation of discrete as well as multi-faceted implementation strategies within other mental and behavioral service systems. By increasing the use of and fidelity to effective implementation strategies that facilitate EBP adoption, utilization, and sustainment within community-based settings, there is potential to improve overall public health.

APPENDICES

Appendix A. ACT SMART Implementation Milestones Form

For each phase and activity, record whether the activity was completed (Yes/No) and the date completed.

Pre-Implementation: Recruitment
- Agency first contacted
- Agency interest indicated
- Agency recruitment meeting
- Orientation workshop

Phase 1 (Date initiated: ________)
- Meeting at agency to recruit for agency assessment
- Emails sent to agency staff for agency assessment
- ACT SMART agency assessment (75% staff response rate)

Phase 2 (Date initiated: ________)
- Treatment selection (Phase 2, Step 1, Activity 1)
- Evaluate treatment fit (Phase 2, Step 2, Activity 1)
- Evaluate treatment feasibility (Phase 2, Step 2, Activity 2)
- Evaluate clinical value and research validity (Phase 2, Step 2, Activity 3)
- Evaluate training requirements (Phase 2, Step 2, Activity 4)
- Evaluate funding source (Phase 2, Step 2, Activity 5)
- Evaluate benefit-cost estimator (Phase 2, Step 2, Activity 6)
- Make an adoption decision (Phase 2, Step 3, Activity 1)

Phase 3 (Date initiated: ________)
- Gather treatment materials (Phase 3, Step 1, Activity 1)
- Evaluate prospective treatment adaptations (Phase 3, Step 1, Activity 2)
- Develop an adaptation plan (Phase 3, Step 1, Activity 3)
- Develop a training plan (Phase 3, Step 2, Activity 1)
- Develop an implementation and sustainment plan (Phase 3, Step 3, Activity 1)

Phase 4 (Date initiated: ________)
- Carry out adaptation plan (Phase 4, Step 1, Activity 1)
- Carry out training plan (Phase 4, Step 2, Activity 1)
- Carry out implementation and sustainment plan (Phase 4, Step 3, Activity 1)

Agency ID: ________  Start Date: ________  AS Facilitator: ________

Appendix B. ACT SMART Activity Fidelity Form

ACT SMART Activity Fidelity

Phase 2: Treatment Selection and Adoption Decision

Phase 2, Step 2, Activity 1: Treatment Fit (6 items; 1 area)
a. Was the form completed? Yes (1) / No (0)
b. How much of the worksheet was completed?
   0 = Nothing Completed; 1 = Minimally Completed (1-2 items); 2 = Moderately Completed (3-4 items); 3 = Mostly/All Completed (5-6 items)

Phase 2, Step 2, Activity 2: Treatment Feasibility (6 items; 1 area)
a. Was the form completed? Yes (1) / No (0)
b. How much of the worksheet was completed?
   0 = Nothing Completed; 1 = Minimally Completed (1-2 items); 2 = Moderately Completed (3-4 items); 3 = Mostly/All Completed (5-6 items)
Phase 2, Step 2, Activity 3: Clinical Value and Research Validity (10 items; 2 areas)
a. Was the form completed? Yes (1) / No (0)
b. How much of the worksheet was completed?
   0 = Nothing Completed; 1 = Minimally Completed (1-3 items); 2 = Moderately Completed (4-7 items); 3 = Mostly/All Completed (8-10 items)
c. How many areas were attended to?
   0 = None; 1 = Some (1 area); 2 = All (2 areas)

Phase 2, Step 2, Activity 4: Training Requirements (25 items; 9 areas)
a. Was the form completed? Yes (1) / No (0)
b. How much of the worksheet was completed? (If no training requirements were identified, circle N/A.)
   0 = Nothing Completed; 1 = Minimally Completed (1-10 items); 2 = Moderately Completed (11-19 items); 3 = Mostly/All Completed (20-25 items)
c. How many areas were attended to? (If no training requirements were identified, circle N/A.)
   0 = None; 1 = Minimal (1-3 areas); 2 = Some (4-7 areas); 3 = Mostly or All (8-9 areas)

Phase 2, Step 2, Activity 5: Funding Source (1-3 items; 1 area)
a. Was the form completed? Yes (1) / No (0)
b. How much of the worksheet was completed? (Use judgment if fewer sections were completed.)
   0 = Nothing Completed; 1 = Minimally Completed (e.g., 1 item); 2 = Moderately Completed (e.g., 2 items); 3 = Mostly/All Completed (e.g., 3 items)

Phase 2, Step 2, Activity 6: Benefit-Cost Estimator (46 items; 7 areas)
a. Was the form completed? Yes (1) / No (0)
b. How much of the worksheet was completed?
   0 = Nothing Completed; 1 = Minimally Completed; 2 = Moderately Completed; 3 = Mostly/All Completed
c. How many areas were attended to?
   0 = None; 1 = Minimal (1-2 areas); 2 = Some (3-5 areas); 3 = Mostly or All (6-7 areas)

Phase 2, Step 3: Adoption Decision (7 items; 2 areas)
a. Was the form completed? Yes (1) / No (0)
b. How much of the worksheet was completed?
   0 = Nothing Completed; 1 = Minimally Completed (1 item); 2 = Moderately Completed (2-3 items); 3 = Mostly/All Completed (4-5 items)
c. How many areas were attended to?
   0 = None; 1 = Some (1 area); 2 = All (2 areas)

Phase 3: Planning for Implementation

Phase 3, Step 1, Activity 1: Gathering Materials (1 item; 0 areas)
a. Was the form completed? Yes (1) / No (0)
b. How much of the worksheet was completed?
   0 = Nothing Completed; 1 = All Completed (1 item)

Phase 3, Step 1, Activity 2: Evaluating Prospective Adaptations (17 items; 2 areas)
a. Was the form completed? Yes (1) / No (0)
b. How much of the worksheet was completed?
   0 = Nothing Completed; 1 = Minimally Completed (1-5 items); 2 = Moderately Completed (6-12 items); 3 = Mostly/All Completed (13-17 items)
c. How many areas were attended to?
   0 = None; 1 = Some (1 area); 2 = All (2 areas)

Phase 3, Step 1, Activity 3: Adaptation Plan (variable items; 5 areas)
a. Was the form completed? Yes (1) / No (0) / N/A (2)
b. How many areas were attended to?
   0 = None; 1 = Minimal (1 area); 2 = Some (2-3 areas); 3 = Mostly or All (4-5 areas)
c. How detailed were the Agency Leader/Team's comments, when made?
   0 = No Comments; 1 = Minimal Detail; 2 = Some Detail; 3 = Very Detailed

Phase 3, Step 2, Activity 1: Training Plan (variable items; 7 areas)
a. Was the form completed? Yes (1) / No (0)
b. How many areas were attended to?
   0 = None; 1 = Minimal (1-2 areas); 2 = Some (3-5 areas); 3 = Mostly or All (6-7 areas)
c. How detailed were the Agency Leader/Team's comments, when made?
   0 = No Comments; 1 = Minimal Detail; 2 = Some Detail; 3 = Very Detailed

Phase 3, Step 3, Activity 1: Implementation and Sustainment Plan (5 areas)
a. Was the form completed? Yes (1) / No (0)
b. How detailed were the Agency Leader/Team's comments, when made?
   0 = No Comments; 1 = Minimal Detail; 2 = Some Detail; 3 = Very Detailed
Phase 4: Implementation

Evaluation Surveys 1-20 (each survey is rated identically)
a. Was the form completed? Yes (1) / No (0)
b. How much of the worksheet was completed?
   0 = Nothing Completed; 1 = Minimally Completed; 2 = Moderately Completed; 3 = Mostly/All Completed
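Ratings from the form above could be rolled up into phase-level adherence scores in several ways. The sketch below shows one hypothetical scoring rule (the mean of each activity's completion rating divided by its maximum); this is an illustration only, not the scoring computation documented in the Method section.

```python
# Hypothetical sketch: rolling Activity Fidelity Form ratings up into a
# phase-level adherence proportion. The scoring rule (mean of each
# activity's "how much completed" rating over its maximum) is an
# illustration only, not the study's documented computation.
from statistics import mean

# Example ratings for Phase 3 activities: (rating given, maximum possible).
phase3_ratings = {
    "Gathering Materials": (1, 1),
    "Evaluating Prospective Adaptations": (2, 3),
    "Adaptation Plan": (3, 3),
    "Training Plan": (2, 3),
    "Implementation and Sustainment Plan": (3, 3),
}

phase3_adherence = mean(score / maximum for score, maximum in phase3_ratings.values())
print(f"Phase 3 adherence: {phase3_adherence:.2f}")  # 0.87 on a 0-1 scale
```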
Appendix C. ACT SMART Implementation Team Engagement Rating Scale

ACT SMART Facilitation Meeting Implementation Team Engagement Rating Scale – Facilitator Report

Provide the following engagement ratings for the period of time from the last facilitation meeting to the current facilitation meeting:

5 – Extremely Engaged
• The implementation team displays great willingness to discuss progress on the ACT SMART Toolkit and upcoming goals with the facilitator (i.e., the team fully initiates discussion topics and/or appears fully open to discussing progress and goals).
• The implementation team contributes detailed information to identify meeting agenda topics; there is a sense of true collaboration with the facilitator.
• The implementation team is extremely willing and capable of implementing the ACT SMART Toolkit based on the phase they are in and topics from the facilitation meeting.

4 – Very Engaged
• The implementation team displays much willingness to discuss progress on the ACT SMART Toolkit and upcoming goals with the facilitator (i.e., the team mostly initiates discussion topics and/or appears mostly open to discussing progress and goals).
• The implementation team contributes much information to identify meeting agenda topics; there is a sense of collaboration with the facilitator.
• The implementation team is very willing and capable of implementing the ACT SMART Toolkit based on the phase they are in and topics from the facilitation meeting.

3 – Moderately Engaged
• The implementation team displays some willingness to discuss progress on the ACT SMART Toolkit and upcoming goals with the facilitator (i.e., the team is responsive to discussion topics and appears somewhat open to discussing progress and goals).
• The implementation team contributes adequate information to identify meeting agenda topics; there is a sense of consultation with the facilitator rather than collaboration.
• The implementation team is somewhat willing and capable of implementing the ACT SMART Toolkit based on the phase they are in and topics from the facilitation meeting.

2 – Slightly Engaged
• The implementation team displays minimal willingness to discuss progress on the ACT SMART Toolkit and upcoming goals with the facilitator (i.e., the team is minimally responsive to discussion topics and appears minimally open to discussing progress and goals).
• The implementation team contributes minimally to identifying meeting agenda topics; there is a sense of indifference toward facilitation meetings.
• The implementation team appears indifferent and minimally capable of implementing the ACT SMART Toolkit based on the phase they are in and topics from the facilitation meeting.

1 – Not at All Engaged
• The implementation team is not willing to discuss progress on the ACT SMART Toolkit and upcoming goals with the facilitator (i.e., the team is not responsive to discussion topics and not open to discussing progress and goals).
• The implementation team does not contribute to identifying meeting agenda topics; there is a sense of not wanting to participate in facilitation meetings.
• The implementation team is not willing or capable of implementing the ACT SMART Toolkit based on the phase they are in and topics from the facilitation meeting.

Appendix D. ASD Strategies and Interventions Survey

Practice/Strategies Assessment (ASD-SIS)

Practices/Strategies Currently Used

1. Think about the intervention practices and strategies that you use with all of your clients. From the following list, please check all that you use. In addition, please rate the extent to which you agree with the following statement for each intervention strategy: "I feel competent in my delivery of this practice." (1 = Strongly Disagree; 2 = Disagree; 3 = Uncertain; 4 = Agree; 5 = Strongly Agree)

☐ Academic interventions
☐ Addressing parent/family issues
☐ Alternative Communication System
☐ Articulation/Phonology-based Therapy (e.g., PROMPT)
☐ Assigning or reviewing homework
☐ Augmented and Alternative Communication Device
☐ Case management
☐ Cognitive Behavioral Therapy (CBT)
☐ Cognitive restructuring
☐ Delivering positive reinforcement/Rewards
☐ Delivering punishment
☐ Developmental Relationship-based treatment (e.g., Denver Model, DIR/Floortime)
☐ Dietary Changes
☐ Differential reinforcement
☐ Discrete Trial Technique
☐ Emotion identification and regulation
☐ Establishing/reviewing treatment goals or agenda
☐ Exercise
☐ Exploring client/family past
☐ Exposure (with or without response modification)
☐ Expressive language based therapy (e.g., HANEN)
☐ Extinction
☐ Facilitated Communication
☐ Functional Behavior Assessment
☐ Identifying/addressing client's strengths
☐ Imitation-based intervention/Reciprocal imitation training
☐ Independent work systems
☐ Joint-attention intervention/instruction
☐ Limit-setting
☐ Massage/Touch Therapy
☐ Modeling
☐ Modifying antecedents
☐ Music Therapy
☐ Naturalistic intervention/Naturalistic teaching strategies (e.g., pivotal response training)
☐ Parent-implemented intervention
☐ Peer Mediated Instruction
☐ Picture Exchange Communication System
☐ Play Therapy
☐ Positive Behavior Support (PBS)
☐ Problem solving
☐ Prompting
☐ Psychoanalysis
☐ Response interruption/Redirecting
☐ Schedules (e.g., visual supports, structured work systems)
☐ Scripting
☐ Self-management
☐ Sensory Diet
☐ Sensory Integration (e.g., auditory integration)
☐ Social Communication Intervention (e.g., SCERTS, Project ImPACT)
☐ Social Skills Training
☐ Social Stories/Narratives
☐ Structured play groups
☐ Task analysis
☐ Theory of Mind Training
☐ Video modeling

2. List any additional practices or strategies that you currently use with clients. In addition, please rate the extent to which you agree with the following statement for each intervention strategy: "I feel competent in my delivery of this practice." (1 = Strongly Disagree; 2 = Disagree; 3 = Uncertain; 4 = Agree; 5 = Strongly Agree)

☐ ______________________  1 2 3 4 5
☐ ______________________  1 2 3 4 5
☐ ______________________  1 2 3 4 5
☐ ______________________  1 2 3 4 5
Appendix E. Exploratory Analysis of Implementation Strategy Fidelity and EBP Use

Table 8. Beta regression results predicting proportion of direct providers using video modeling post-toolkit

Model 1 (Pseudo R² = .45)
  Intercept: β = -11.82, SE = 5.96*
  Proportion of direct providers using video modeling pre-toolkit: β = 0.77, SE = 1.61
  Adherence: β = 13.02, SE = 6.76†

Model 2 (Pseudo R² = .54)
  Intercept: β = -4.43, SE = 2.02*
  Proportion of direct providers using video modeling pre-toolkit: β = 0.25, SE = 1.54
  Dose: β = 1.69, SE = 0.81*

Model 3 (Pseudo R² = .80)
  Intercept: β = -10.22, SE = 2.61***
  Proportion of direct providers using video modeling pre-toolkit: β = 3.49, SE = 1.35**
  Participant responsiveness: β = 2.35, SE = 0.60***

Note. † p < .10. * p < .05. ** p < .01. *** p < .001.

Table 9. Firth-type multilevel logistic GEE results predicting odds of direct providers using video modeling post-toolkit

Model 1
  Intercept: b = -14.81, SE = 12.90, Wald = 1.31
  Direct provider use of video modeling pre-toolkit: b = 0.85, SE = 1.10, Wald = 0.60
  Adherence: b = 16.07, SE = 13.90, Wald = 1.34

Model 2
  Intercept: b = -14.13, SE = 3.11, Wald = 20.72***
  Direct provider use of video modeling pre-toolkit: b = 1.40, SE = 0.89, Wald = 2.48
  Dose: b = 5.32, SE = 1.15, Wald = 21.46***

Model 3
  Intercept: b = -18.28, SE = 8.22, Wald = 4.95*
  Direct provider use of video modeling pre-toolkit: b = 3.30, SE = 1.69, Wald = 3.82
  Participant responsiveness: b = 4.12, SE = 1.84, Wald = 5.02*

Note. † p < .10. * p < .05. ** p < .01. *** p < .001.
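As a companion to these tables, the following minimal sketch shows how a logistic GEE of this general form might be specified. Everything here is hypothetical: the variable names, sample, and effect values are invented, and statsmodels fits a standard GEE rather than the Firth-type bias-reduced GEE (Mondol & Rahman, 2019) that a small, sparse sample like the pilot study's would require.

```python
# Minimal sketch: a logistic GEE of provider-level EBP use clustered by
# agency, in the spirit of Table 9. Data are simulated and hypothetical;
# note that statsmodels provides a standard GEE, not the Firth-type
# bias-reduced GEE (Mondol & Rahman, 2019) used in the thesis analysis.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
agencies = np.repeat(np.arange(12), 20)               # 12 hypothetical agencies x 20 providers
adherence = np.repeat(rng.uniform(0.6, 1.0, 12), 20)  # agency-level adherence
use_pre = rng.binomial(1, 0.3, agencies.size)         # pre-toolkit video modeling use
logit_p = -4 + 1.5 * use_pre + 4 * adherence          # hypothetical effects
use_post = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
df = pd.DataFrame({"agency": agencies, "use_pre": use_pre,
                   "use_post": use_post, "adherence": adherence})

# Exchangeable working correlation accounts for providers clustered in agencies.
gee = smf.gee("use_post ~ use_pre + adherence", groups="agency", data=df,
              family=sm.families.Binomial(),
              cov_struct=sm.cov_struct.Exchangeable())
print(gee.fit().summary())
```

With only six agencies and sparse outcomes, a standard GEE of this kind can suffer from separation, which is precisely why a bias-reduced (Firth-type) estimator was warranted in the original analysis.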
REFERENCES

Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38(1), 4–23. https://doi.org/10.1007/s10488-010-0327-7

Akiba, C. F., Powell, B. J., Pence, B. W., Nguyen, M. X., Golin, C., & Go, V. (2021). The case for prioritizing implementation strategy fidelity measurement: Benefits and challenges. Translational Behavioral Medicine, ibab138. https://doi.org/10.1093/tbm/ibab138

Allen, J. D., Shelton, R. C., Emmons, K. M., & Linnan, L. A. (2018). Fidelity and its relationship to implementation effectiveness, adaptation, and dissemination. In R. C. Brownson, G. A. Colditz, & E. K. Proctor (Eds.), Dissemination and implementation research in health: Translating science to practice. Oxford Scholarship Online. https://doi.org/10.1093/oso/9780190683214.003.0016

American Psychiatric Association. (2013). Autism spectrum disorder. In Diagnostic and statistical manual of mental disorders (5th ed.). https://doi.org/10.1176/appi.books.9780890425596

Barber, J. P., Gallop, R., Crits-Christoph, P., Frank, A., Thase, M. E., Weiss, R. D., & Beth Connolly Gibbons, M. (2006). The role of therapist adherence, therapist competence, and alliance in predicting outcome of individual drug counseling: Results from the National Institute Drug Abuse Collaborative Cocaine Treatment Study. Psychotherapy Research, 16(2), 229–240. https://doi.org/10.1080/10503300500288951

Berry, C. A., Nguyen, A. M., Cuthel, A. M., Cleland, C. M., Siman, N., Pham-Singer, H., & Shelley, D. R. (2021). Measuring implementation strategy fidelity in HealthyHearts NYC: A complex intervention using practice facilitation in primary care. American Journal of Medical Quality, 36(4), 270–276. https://doi.org/10.1177/1062860620959450

Blaxill, M., Rogers, T., & Nevison, C. (2021). Autism tsunami: The impact of rising prevalence on the societal cost of autism in the United States. Journal of Autism and Developmental Disorders, 1–17. https://doi.org/10.1007/s10803-021-05120-7

Bottema-Beutel, K., Kapp, S. K., Lester, J. N., Sasson, N. J., & Hand, B. N. (2020). Avoiding ableist language: Suggestions for autism researchers. Autism in Adulthood, 3(1), 18–29. https://doi.org/10.1089/aut.2020.0014

Bury, S. M., Jellett, R., Spoor, J. R., & Hedley, D. (2020). "It defines who I am" or "It's something I have": What language do [autistic] Australian adults [on the autism spectrum] prefer? Journal of Autism and Developmental Disorders. https://doi.org/10.1007/s10803-020-04425-3

Carroll, C., Patterson, M., Wood, S., Booth, A., Rick, J., & Balain, S. (2007). A conceptual framework for implementation fidelity. Implementation Science, 2(1), 1–9. https://doi.org/10.1186/1748-5908-2-40

Century, J., Rudnick, M., & Freeman, C. (2010). A framework for measuring fidelity of implementation: A foundation for shared language and accumulation of knowledge. American Journal of Evaluation, 31(2), 199–218. https://doi.org/10.1177/1098214010366173

Chamberlain, P., Brown, C. H., & Saldana, L. (2011). Observational measure of implementation progress in community based settings: The stages of implementation completion (SIC). Implementation Science, 6(1), 116. https://doi.org/10.1186/1748-5908-6-116

Cross, W., & West, J. (2011). Examining implementer fidelity: Conceptualising and measuring adherence and competence. Journal of Children's Services, 6(1), 18–33. https://doi.org/10.5042/jcs.2011.0123

Crowe, S., Cresswell, K., Robertson, A., Huby, G., Avery, A., & Sheikh, A. (2011). The case study approach. BMC Medical Research Methodology, 11(1), 1–9. https://doi.org/10.1186/1471-2288-11-100

Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(1), 1–15. https://doi.org/10.1186/1748-5908-4-50

Drahota, A., Aarons, G. A., & Stahmer, A. C. (2012). Developing the autism model of implementation for autism spectrum disorder community providers: Study protocol. Implementation Science, 7(1), 85. https://doi.org/10.1186/1748-5908-7-85

Drahota, A., Chlebowski, C., Stadnick, N., Baker-Ericzén, M. J., & Brookman-Frazee, L. (2017). The dissemination and implementation of behavioral treatments for anxiety in ASD. In C. M. Kerns, P. Renno, E. A. Storch, P. C. Kendall, & J. J. Wood (Eds.), Anxiety in children and adolescents with autism spectrum disorder: Evidence-based assessment and treatment (pp. 231–249). Atlanta, GA: Elsevier.

Drahota, A., Meza, R., & Martinez, J. I. (2014). The Autism-Community Toolkit: Systems to Measure and Adopt Research-Based Treatments. www.actsmartoolkit.com

Drahota, A., Meza, R. D., Bustos, T. E., Sridhar, A., Martinez, J. I., Brikho, B., Stahmer, A. C., & Aarons, G. A. (2020). Implementation-as-usual in community-based organizations providing specialized services to individuals with autism spectrum disorder: A mixed methods study. Administration and Policy in Mental Health and Mental Health Services Research, 48(3), 482–498. https://doi.org/10.1007/s10488-020-01084-5

Drahota, A., Meza, R., Martinez, J. I., Sridhar, A., Bustos, T. E., Tschida, J., Stahmer, A., & Aarons, G. A. (in preparation). Feasibility, acceptability, and utility of the ACT SMART Implementation Toolkit.
Dusenbury, L. (2003). A review of research on fidelity of implementation: Implications for drug abuse prevention in school settings. Health Education Research, 18(2), 237–256. https://doi.org/10.1093/her/18.2.237

Eapen, V., Črnčec, R., & Walter, A. (2013). Clinical outcomes of an early intervention program for preschool children with autism spectrum disorder in a community group setting. BMC Pediatrics, 13(1), 3. https://doi.org/10.1186/1471-2431-13-3

Elder, J. H., Brasher, S., & Alexander, B. (2016). Identifying the barriers to early diagnosis and treatment in underserved individuals with autism spectrum disorders (ASD) and their families: A qualitative study. Issues in Mental Health Nursing, 37(6), 412–420. https://doi.org/10.3109/01612840.2016.1153174

Hamilton, A. S., & Mittman, B. S. (2018). Implementation science in health care. In Dissemination and implementation research in health: Translating science to practice (2nd ed., pp. 385–400). https://doi.org/10.1093/oso/9780190683214.001.0001

Hattier, M. A., Matson, J. L., Belva, B. C., & Horovitz, M. (2011). The occurrence of challenging behaviours in children with autism spectrum disorders and atypical development. Developmental Neurorehabilitation, 14(4), 221–229. https://doi.org/10.3109/17518423.2011.573836

Haynes, A., Brennan, S., Redman, S., Williamson, A., Gallego, G., Butow, P., & The CIPHER team. (2015). Figuring out fidelity: A worked example of the methods used to identify, critique and revise the essential elements of a contextualised intervention in health policy agencies. Implementation Science, 11(1), 23. https://doi.org/10.1186/s13012-016-0378-6

Heinze, G., & Schemper, M. (2002). A solution to the problem of separation in logistic regression. Statistics in Medicine, 21(16), 2409–2419. https://doi.org/10.1002/sim.1047

Hogue, A., Henderson, C. E., Dauber, S., Barajas, P. C., Fried, A., & Liddle, H. A. (2008). Treatment adherence, competence, and outcome in individual and family therapy for adolescent behavior problems. Journal of Consulting and Clinical Psychology, 76(4), 544–555. https://doi.org/10.1037/0022-006X.76.4.544

Horlin, C., Falkmer, M., Parsons, R., Albrecht, M. A., & Falkmer, T. (2014). The cost of autism spectrum disorders. PLoS ONE, 9(9), e106552. https://doi.org/10.1371/journal.pone.0106552

Horner, R. H., Carr, E. G., Strain, P. S., Todd, A. W., & Reed, H. K. (2002). Problem behavior interventions for young children with autism: A research synthesis. Journal of Autism and Developmental Disorders, 32(5), 423–446. https://doi.org/10.1023/A:1020593922901

Hulleman, C. S., & Cordray, D. S. (2009). Moving from the lab to the field: The role of fidelity and achieved relative intervention strength. Journal of Research on Educational Effectiveness, 2(1), 88–110. https://doi.org/10.1080/19345740802539325

Ibrahim, S., & Sidani, S. (2015). Fidelity of intervention implementation: A review of instruments. Health, 7(12), 1687. https://doi.org/10.4236/health.2015.712183

Kenny, L., Hattersley, C., Molins, B., Buckley, C., Povey, C., & Pellicano, E. (2016). Which terms should be used to describe autism? Perspectives from the UK autism community. Autism, 20(4), 442–462. https://doi.org/10.1177/1362361315588200

Kim, S. Y., & Bottema-Beutel, K. (2019). A meta-regression analysis of quality of life correlates in adults with ASD. Research in Autism Spectrum Disorders, 63, 23–33. https://doi.org/10.1016/j.rasd.2018.11.004

King, M., & Bearman, P. (2009). Diagnostic change and the increased prevalence of autism.
International Journal of Epidemiology, 38(5), 1224–1234. https://doi.org/10.1093/ije/dyp261

Kirk, M. A., Haines, E. R., Rokoske, F. S., Powell, B. J., Weinberger, M., Hanson, L. C., & Birken, S. A. (2019). A case study of a theory-based method for identifying and reporting core functions and forms of evidence-based interventions. Translational Behavioral Medicine, ibz178. https://doi.org/10.1093/tbm/ibz178

Knoche, L. L., Sheridan, S. M., Edwards, C. P., & Osborn, A. Q. (2010). Implementation of a relationship-based school readiness intervention: A multidimensional approach to fidelity measurement for early childhood. Early Childhood Research Quarterly, 25(3), 299–313. https://doi.org/10.1016/j.ecresq.2009.05.003

Lai, M.-C., Lombardo, M. V., & Baron-Cohen, S. (2014). Autism. The Lancet, 383(9920), 896–910. https://doi.org/10.1016/S0140-6736(13)61539-1

Leal Filho, W., Skanavis, C., Kounani, A., Brandli, L. L., Shiel, C., do Paco, A., ... & Shula, K. (2019). The role of planning in implementing sustainable development in a higher education context. Journal of Cleaner Production, 235, 678–687. https://doi.org/10.1016/j.jclepro.2019.06.322

Leigh, J. P., & Du, J. (2015). Brief report: Forecasting the economic burden of autism in 2015 and 2025 in the United States. Journal of Autism and Developmental Disorders, 45(12), 4135–4139. https://doi.org/10.1007/s10803-015-2521-7

Lewis, C. C., & Dorsey, C. (2020). Advancing implementation science measurement. In B. Albers, A. Shlonsky, & R. Mildon (Eds.), Implementation science 3.0 (p. 227). Springer Nature.

Maenner, M. J., Shaw, K. A., Bakian, A. V., Bilder, D. A., Durkin, M. S., Esler, A., ... & Cogswell, M. E. (2021). Prevalence and characteristics of autism spectrum disorder among children aged 8 years—Autism and Developmental Disabilities Monitoring Network, 11 sites, United States, 2018. MMWR Surveillance Summaries, 70(11), 1. https://doi.org/10.15585/mmwr.ss7011a1

Mason, D., Mackintosh, J., McConachie, H., Rodgers, J., Finch, T., & Parr, J. R. (2019). Quality of life for older autistic people: The impact of mental health difficulties. Research in Autism Spectrum Disorders, 63, 13–22. https://doi.org/10.1016/j.rasd.2019.02.007

McHugo, G. J., Drake, R. E., Whitley, R., Bond, G. R., Campbell, K., Rapp, C. A., Goldman, H. H., Lutz, W. J., & Finnerty, M. T. (2007). Fidelity outcomes in the National Implementing Evidence-Based Practices Project. Psychiatric Services, 58(10), 1279–1284. https://doi.org/10.1176/ps.2007.58.10.1279

Mihalic, S. (2004). The importance of implementation fidelity. Emotional and Behavioral Disorders in Youth, 4(4), 83–105. http://www.incredibleyears.com/wp-content/uploads/fidelity-importance.pdf

Mondol, M. H., & Rahman, M. S. (2019). Bias-reduced and separation-proof GEE with small or sparse longitudinal binary data. Statistics in Medicine, 38(14), 2544–2560. https://doi.org/10.1002/sim.8126

Moore, G. F., Audrey, S., Barker, M., Bond, L., Bonell, C., Hardeman, W., Moore, L., O'Cathain, A., Tinati, T., Wight, D., & Baird, J. (2015). Process evaluation of complex interventions: Medical Research Council guidance. BMJ, 350, h1258. https://doi.org/10.1136/bmj.h1258

Mowbray, C. T., Holter, M. C., Teague, G. B., & Bybee, D. (2003). Fidelity criteria: Development, measurement, and validation. American Journal of Evaluation, 24(3), 315–340. https://doi.org/10.1177/109821400302400303

National Autism Center. (2009). National standards project findings and conclusions.
Paynter, J. M., Ferguson, S., Fordyce, K., Joosten, A., Paku, S., Stephens, M., Trembath, D., & Keen, D. (2017). Utilisation of evidence-based practices by ASD early intervention service providers. Autism, 21(2), 167–180. https://doi.org/10.1177/1362361316633032

Perez Jolles, M., Lengnick-Hall, R., & Mittman, B. S. (2019). Core functions and forms of complex health interventions: A patient-centered medical home illustration. Journal of General Internal Medicine, 34(6), 1032–1038. https://doi.org/10.1007/s11606-018-4818-7

Pickard, K., Meza, R., Drahota, A., & Brikho, B. (2018). They're doing what? A brief paper on service use and attitudes in ASD community-based agencies. Journal of Mental Health Research in Intellectual Disabilities, 11(2), 111–123. https://doi.org/10.1080/19315864.2017.1408725

Pitney, J. (2020, February 6). Lifetime social cost. Autism Politics and Policy. http://www.autismpolicyblog.com/2020/02/lifetime-social-cost.html

Powell, B. J., Fernandez, M. E., Williams, N. J., Aarons, G. A., Beidas, R. S., Lewis, C. C., McHugh, S. M., & Weiner, B. J. (2019). Enhancing the impact of implementation strategies in healthcare: A research agenda. Frontiers in Public Health, 7, 3. https://doi.org/10.3389/fpubh.2019.00003

Powell, B. J., Haley, A. D., Patel, S. V., Amaya-Jackson, L., Glienke, B., Blythe, M., ... & Weinberger, M. (2020). Improving the implementation and sustainment of evidence-based practices in community mental health organizations: A study protocol for a matched-pair cluster randomized pilot study of the Collaborative Organizational Approach to Selecting and Tailoring Implementation Strategies (COAST-IS). Implementation Science Communications, 1(1), 1–13. https://doi.org/10.1186/s43058-020-00009-5

Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., Griffey, R., & Hensley, M. (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research, 38(2), 65–76. https://doi.org/10.1007/s10488-010-0319-7

Robertson, S. M. (2010). Neurodiversity, quality of life, and autistic adults: Shifting research and professional focuses onto real-life challenges. Disability Studies Quarterly, 30(1). https://doi.org/10.18061/dsq.v30i1.1069

Simonoff, E., Pickles, A., Charman, T., Chandler, S., Loucas, T., & Baird, G. (2008). Psychiatric disorders in children with autism spectrum disorders: Prevalence, comorbidity, and associated factors in a population-derived sample. Journal of the American Academy of Child & Adolescent Psychiatry, 47(8), 921–929. https://doi.org/10.1097/CHI.0b013e318179964f

Slaughter, S. E., Hill, J. N., & Snelgrove-Clarke, E. (2015). What is the extent and quality of documentation and reporting of fidelity to implementation strategies: A scoping review. Implementation Science, 10(1), 129. https://doi.org/10.1186/s13012-015-0320-3

Sridhar, A., & Drahota, A. (2020). Facilitating EBP implementation in community-based ASD agencies: Clinical effectiveness of the ACT SMART Implementation Toolkit [Poster presentation]. INSAR 2020 Annual Meeting, Seattle, WA, United States (conference converted to virtual format).

Sridhar, A., & Drahota, A. (in review). Brief report: Preliminary effectiveness of the ACT SMART Implementation Toolkit to facilitate implementation in community-based ASD organizations.

Sridhar, A., Drahota, A., & Walsworth, K. (2021).
Facilitators and barriers to the utilization of the ACT SMART Implementation Toolkit in community-based organizations: A qualitative study. Implementation Science Communications, 2, 55. https://doi.org/10.1186/s43058-021-00158-1

Stevens, E., Atchison, A., Stevens, L., Hong, E., Granpeesheh, D., Dixon, D., & Linstead, E. (2017, December). A cluster analysis of challenging behaviors in autism spectrum disorder. In 2017 16th IEEE International Conference on Machine Learning and Applications (ICMLA) (pp. 661–666). IEEE.

Stirman, S. W., Baumann, A. A., & Miller, C. J. (2019). The FRAME: An expanded framework for reporting adaptations and modifications to evidence-based interventions. Implementation Science, 14(1), 1–10. https://doi.org/10.1186/s13012-019-0898-y

Stirman, S. W., Kimberly, J., Cook, N., Calloway, A., Castro, F., & Charns, M. (2012). The sustainability of new programs and innovations: A review of the empirical literature and recommendations for future research. Implementation Science, 7(1), 1–19. https://doi.org/10.1186/1748-5908-7-17

Teague, G. B. (2013). Fidelity. Implementation Research Institute presentation.

Teerenstra, S., Lu, B., Preisser, J. S., van Achterberg, T., & Borm, G. F. (2010). Sample size considerations for GEE analyses of three-level cluster randomized trials. Biometrics, 66(4), 1230–1237. https://doi.org/10.1111/j.1541-0420.2009.01374.x

Vinen, Z., Clark, M., Paynter, J., & Dissanayake, C. (2018). School age outcomes of children with autism spectrum disorder who received community-based early interventions. Journal of Autism and Developmental Disorders, 48(5), 1673–1683. https://doi.org/10.1007/s10803-017-3414-8

Weiner, B. J., Lewis, C. C., Stanick, C., Powell, B. J., Dorsey, C. N., Clary, A. S., Boynton, M. H., & Halko, H. (2017). Psychometric assessment of three newly developed implementation outcome measures. Implementation Science, 12(1), 108. https://doi.org/10.1186/s13012-017-0635-3

Wong, C., Odom, S. L., Hume, K. A., Cox, A. W., Fettig, A., Kucharczyk, S., Brock, M. E., Plavnick, J. B., Fleury, V. P., & Schultz, T. R. (2015). Evidence-based practices for children, youth, and young adults with autism spectrum disorder: A comprehensive review. Journal of Autism and Developmental Disorders, 45(7), 1951–1966. https://doi.org/10.1007/s10803-014-2351-z

Wood, J., McLeod, B. D., Klebanoff, S., & Brookman-Frazee, L. (2015). Toward the implementation of evidence-based interventions for youth with autism spectrum disorders in schools and community agencies. Behavior Therapy, 46(1), 83–95.