BARRIERS AND FACILITATORS TO THE UTILIZATION OF THE ACT SMART IMPLEMENTATION TOOLKIT IN COMMUNITY AGENCIES: A QUALITATIVE STUDY

By

Aksheya Sridhar

A THESIS

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Psychology - Master of Arts

2020

ABSTRACT

BARRIERS AND FACILITATORS TO THE UTILIZATION OF THE ACT SMART IMPLEMENTATION TOOLKIT IN COMMUNITY AGENCIES: A QUALITATIVE STUDY

By Aksheya Sridhar

Evidence-based practices (EBPs) have been shown to improve outcomes for children diagnosed with autism spectrum disorder (ASD). Research suggests that the utilization of these practices in community settings is varied; however, the utilization of implementation guides may bridge the gap between research and practice. The Autism Community Toolkit: Systems to Measure and Adopt Research-Based Treatments (ACT SMART Toolkit) is a web-based implementation toolkit developed to guide implementation teams through the phases of EBP implementation in ASD community agencies. This study examined the barriers and facilitators (collectively termed "determinants") to the utilization of this toolkit, based on the perspectives of implementation teams at six ASD community agencies. Two independent coders utilized the adapted EPIS model and the Technology Acceptance Model 3 to guide thematic analyses of participant interviews. Salient determinants were identified, and analyses highlighted two themes: (a) Inner Context Determinants (e.g., funding) and (b) Innovation Determinants (e.g., facilitation meetings) to use of the toolkit. Finally, determinants that differed across phases of the toolkit were identified. Findings highlight areas of improvement for the ACT SMART Implementation Toolkit, as well as factors that facilitate the use of this implementation guide. Additionally, findings may inform the development, refinement, and utilization of implementation guides with the aim of increasing the uptake of evidence-based practices in community agencies providing services to children with autism spectrum disorder.

ACKNOWLEDGEMENTS

I am extremely grateful to the people who helped make this work possible. My advisor, Dr. Amy Drahota, has been a supportive and dedicated mentor to me as I worked on this project, and continues to help me grow as a researcher, writer, and academic. I am also grateful to my committee members, Drs. Brooke Ingersoll and Ignacio D. Acevedo-Polakovich, for providing their expertise as I embarked on this project. I could not have completed this work without the help of Kiersten Walsworth, previously an undergraduate volunteer in our lab, who dedicated hours of her time to help with the thematic analyses in this project. Additionally, I am thankful to the other members of my lab, particularly Tatiana Bustos, who has provided me with guidance and mentorship over the past two years. Finally, I am incredibly grateful for the unwavering support of my parents, Vidya and Sridhar, my sibling, Smriti, my husband, Dylan, and my dog, Otis.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
INTRODUCTION
    Background
    Implementation Frameworks
    ACT SMART Implementation Toolkit
    Determinants to EBP Implementation
    Barriers
    Facilitators
    Present Study
    Aims
    Standards for Reporting Qualitative Research
METHOD
    Researcher Characteristics
    Reflexivity
    Context
    Participants
    Procedure
    Measures
    Data Analysis
        Thematic analysis
        Coding
        Theming the data
        Trustworthiness
RESULTS
    Frequent Codes
    Salient Codes
    Thematic Analysis Findings
        Inner Context Determinants
        Inner Context Determinants Differing by Phase
        Innovation Determinants
        Innovation Determinants that Differed by Phase
        Phase Specific Activities as Innovation Determinants
        Website Factors as Innovation Determinants
DISCUSSION
    Salient Determinants
    Inner Context Determinants
    Innovation Determinants
    Determinants Across Phases
    Limitations
    Future Directions
    Conclusion
APPENDICES
    Appendix A. End of Phase Interview Guide
    Appendix B. End of Pilot Study Interview Guide
    Appendix C. Code book
REFERENCES

LIST OF TABLES

Table 1. Phases of the ACT SMART Toolkit
Table 2. IT Demographics
Table 3. Themes, Descriptions, Frequencies, and Illustrative Quotes

LIST OF FIGURES

Figure 1. Adapted EPIS Model
Figure 2. ACT SMART Toolkit Determinants
Figure 3. Inner Context Determinants
Figure 4. Innovation Determinants

INTRODUCTION

Background.
Autism spectrum disorder (ASD) is a lifelong neurodevelopmental disorder affecting 1.5% of the United States' population (Xu et al., 2018). ASD is characterized by deficits in social communication and interactions, and the presence of restricted and/or repetitive behaviors, interests, or activities (American Psychiatric Association, 2013). Importantly, research indicates that the utilization of evidence-based practices (EBPs) for ASD leads to improvement in the core deficits associated with ASD, including joint attention, play skills, language skills, and cognitive functioning (National Autism Center, 2009; Wong et al., 2015; Reichow et al., 2018). Additionally, the receipt of EBPs leads to both immediate and long-term improvements in a number of other areas, including social functioning, adaptive functioning, and language and communication skills (Dawson et al., 2010; Wong et al., 2015; Estes et al., 2015).

However, there is a well-known gap between research and community-based practices for children with ASD (Paynter et al., 2016; Paynter & Keen, 2014; Brookman-Frazee, Baker-Ericzén, Stadnick & Taylor, 2012; Dingfelder & Mandell, 2011). Specifically, research indicates that the utilization of EBPs in community settings is varied, such that both evidence-based intervention strategies and strategies lacking an evidence base continue to be applied (Paynter & Keen, 2015; Pickard, Meza, Drahota, & Brikho, 2018). Furthermore, children with ASD typically require access to intensive amounts of care; indeed, the recommended number of intervention hours for this population is 25 hours per week, year-round (National Research Council, 2011). However, studies indicate inconsistent and low-intensity delivery of EBPs in community-based organizations, suggesting that these practices are often not utilized within these settings due to a multitude of factors (e.g., lack of funding, lack of EBP fit within the organization) (Brookman-Frazee et al., 2010; Aarons et al., 2011). Additionally, a number of studies highlight the variety in services provided, provider background and discipline, and variations in organization structure within community-based organizations, resulting in individuals with ASD receiving fragmented care within these settings (Christon et al., 2015; Cidav, Lawer, Marcus, & Mandell, 2013). Overall, research indicates that although many children with ASD receive their care within community-based organizations, best practices are often not utilized in these settings, resulting in limited access to evidence-based interventions for many children diagnosed with ASD (Stahmer et al., 2019).

Implementation Frameworks.

The utilization of dissemination and implementation (D&I) science is an effective way of examining factors impacting the adoption and utilization of EBPs in community settings (Proctor et al., 2009). Dissemination refers to the active process of spreading knowledge regarding EBPs to various audiences utilizing specific strategies, while implementation refers to the process of utilizing or integrating EBPs within these settings (Tabak et al., 2012). Importantly, research suggests that the "fit," or compatibility, of an existing EBP and the service setting in which it may be implemented appears to be an important factor in the adoption and utilization of EBPs in various settings (Proctor et al., 2011).
The utilization of implementation frameworks allows researchers to focus on maximizing "fit" by understanding contextual and other factors impacting the compatibility of the EBP and the service setting. This may allow for greater facilitation of EBP implementation in these settings (Drahota et al., 2017). Additionally, implementation frameworks may provide agencies with a systematic way of adopting, implementing, and using these interventions (Tabak et al., 2012). Currently, there are a number of implementation frameworks that may inform the understanding of factors that facilitate or hinder the implementation of EBPs or innovations in community settings. Two relevant implementation frameworks are described below. These frameworks can also be classified as determinant frameworks, as they are utilized to highlight barriers and facilitators to implementation and implementation outcomes (Nilsen, 2015). Both frameworks were utilized to guide analysis in this study.

The Technology Acceptance Model 3 (TAM3; Venkatesh & Bala, 2008) is utilized to understand factors impacting the adoption and utilization of information technology (IT) across settings. The TAM3 is the most widely used model in research examining IT adoption; importantly, this model has been found to be highly predictive of IT adoption and use across settings (Venkatesh & Davis, 2000; Venkatesh & Morris, 2000). The two major constructs in this model are the perceived ease of use (e.g., the extent to which staff believe the technological product will require no effort to use) and the perceived usefulness (e.g., the extent to which staff believe the technological product will enhance job performance) of the IT product. Furthermore, research has identified determinants of these two constructs, including individual differences (e.g., personality traits), system characteristics (e.g., aspects within a system that impact individuals' perceptions regarding the usefulness or ease of use of a product), social influence (e.g., social processes that guide individuals' perceptions of an IT product), and facilitating conditions (e.g., organizational support that enables IT use) (Venkatesh & Bala, 2008). The TAM3 posits that the perceived ease of use and perceived usefulness of the product first influence staff attitudes towards the IT product, which then impact staff behavior related to use of the product. Importantly, the TAM3 can be utilized in tandem with other implementation frameworks in an effort to understand all factors influencing implementation (Sanayei et al., 2010).

The other determinant framework involved in the current study was the Exploration, Preparation, Implementation, Sustainment (EPIS) framework (Aarons et al., 2011), a multi-level, multi-step framework utilized to understand outer and inner context factors, as well as innovation and bridging factors, impacting the implementation of EBPs in community settings (Aarons et al., 2011; Novins et al., 2013). Outer context factors refer to leadership, the service environment and policies, funding, patient or client characteristics, patient or client advocacy, and the inter-organizational environment. Inner context factors include organizational and individual characteristics, leadership and staffing within the organization, and the self-efficacy, values, and fidelity of providers (Aarons et al., 2009; Drahota et al., 2017).
Bridging factors include those that highlight the complexity and interaction between outer and inner factors, such as community-academic partnerships (Moulin et al., 2019). Finally, innovation factors include factors related to the fit of an EBP or innovation at an agency, and characteristics of the EBP or innovation itself (Moulin et al., 2019). Studies indicate that the EPIS framework may be utilized across a number of different settings in order to examine contextual, bridging, and innovation factors as determinants to the implementation of an EBP or innovation (Moulin et al., 2019; Stahmer et al., 2019).

Overall, these determinant frameworks allow for a better understanding of staff perceptions regarding the utilization of technological products at their agency, as well as important contextual factors that may facilitate or hinder the implementation of EBPs across settings (Venkatesh & Bala, 2008; Moulin et al., 2019). Importantly, previous research has not utilized these frameworks in tandem; however, given the utilization of an IT product within the ACT SMART Toolkit and the important role of contextual factors in implementation, it was imperative to utilize both frameworks in order to gain a clear understanding of determinants to utilization of this toolkit. As a result, this paper is the first to utilize both the TAM3 and the EPIS framework to inform the exploration of determinants to the implementation of an innovation within community-based settings.

ACT SMART Implementation Toolkit.

An adapted version of the EPIS model (Figure 1; Drahota et al., 2017) was developed and utilized to guide the development of the ACT SMART Implementation Toolkit (Drahota et al., 2014; Drahota et al., 2017), in collaboration with an ASD community-academic partnership (for description, see Gomez, Drahota, & Stahmer, 2018).
Figure 1. Adapted EPIS Model. The figure depicts the five adapted EPIS phases (Exploration, Adoption Decision, Preparation, Implementation, and Sustainment). In each phase, the outer context comprises socio-political factors, funding, client advocacy organizations, interorganizational networks, and EBP and training repositories; the inner context comprises phase-specific organizational, provider, EBP, organization-EBP fit, and implementation team factors (e.g., leadership, capacity, resources, culture, climate, and readiness during Exploration; implementation and sustainment strategies, communication, planning, and evaluation during Implementation and Sustainment). Adapted from Aarons, Hurlburt & Horwitz (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38, 4-23.

The ACT SMART Toolkit is a comprehensive, web-based interface guiding agency leaders and providers (who form an agency implementation team) through the process of EBP implementation in ASD community-based organizations (ASD-CBOs), with the aim of addressing known barriers to EBP implementation within this service setting. Specifically, the ACT SMART Toolkit is an evidence-informed, multi-phased, and systematic toolkit that aims to facilitate the implementation of EBPs in ASD-CBOs. When utilizing the ACT SMART Toolkit, Implementation Teams (ITs) from ASD-CBOs are guided through the five phases of implementation, according to the adapted EPIS framework, using the web-based interface. Table 1 aligns the adapted EPIS phases with the ACT SMART Toolkit's steps and activities. For example, agencies begin with an Exploration phase, in which an agency needs assessment is conducted, next steps are outlined, and goals are prioritized. Importantly, the receptivity of staff and agency leaders to implementing a new EBP is evaluated during this phase by the ACT SMART facilitators. Following this phase, the agency ITs work with ACT SMART facilitators to identify an appropriate EBP that matches the needs of the agency staff and their clients, and to make an informed decision about whether or not to adopt the EBP (Phase 2, Adoption Decision). Upon making the adoption decision, the agency ITs begin phase 3 of the adapted EPIS (Preparation), and work to develop adaptation, training, and implementation plans.
During phase 4 (Implementation), the agency ITs and ACT SMART facilitators carry out the plans that were developed and track the progress of these plans by utilizing an ACT SMART task evaluation survey. Finally, during the final phase of the adapted EPIS (Sustainment), the agency ITs and ACT SMART facilitators evaluate the success of the implementation and develop a sustainment plan (Drahota et al., 2017). Additionally, each IT works closely with a facilitation team from ACT SMART as it progresses through each phase of implementation, including participating in monthly facilitation meetings with ACT SMART facilitation teams; importantly, facilitation has been found to be a key component in helping staff plan and implement new interventions across settings (Harvey et al., 2018).

Table 1. Phases of the ACT SMART Toolkit: Adapted EPIS Implementation Framework with ACT SMART Implementation Toolkit Steps and Activities

Phase 1: Exploration
    Step 1: Conduct agency assessment
        Activity 1: Encourage staff participation in the organizational needs assessment
        Activity 2: Form implementation team, if needed
    Step 2: Evaluate receptivity to implementing new EBP

Phase 2: Adoption Decision
    Step 1: Identify appropriate EBP(s)
        Activity 1: Identify EBP(s) to meet agency need
    Step 2: Evaluate EBP and provider factors
        Activity 1: EBP fit
        Activity 2: EBP feasibility
        Activity 3: Clinical value and research validity
        Activity 4: Training requirements
        Activity 5: Funding sources
        Activity 6: Benefit-cost estimator
    Step 3: Adoption decision
        Activity 1: Formally make an adoption decision

Phase 3: Preparation
    Step 1: Develop a prospective adaptation plan
        Activity 1: Gather EBP materials
        Activity 2: Evaluate possible adaptations for EBP
        Activity 3: Adaptation planning worksheet
    Step 2: Develop training plan
        Activity 1: Training plan worksheet
    Step 3: Develop implementation plan
        Activity 1: Implementation plan worksheet

Phase 4: Implementation
    Step 1: Conduct adaptation plan
    Step 2: Conduct training plan
    Step 3: Conduct implementation plan
        Activity 1: Develop concrete tasks and establish due dates
    Step 4: Task evaluation
        Activity 1: Evaluate tasks from Steps 1-3

Phase 5: Sustainment
    Step 1: Evaluate implementation success
        Activity 1: Synthesize task evaluations
    Step 2: Evaluate current sustainment
        Activity 1: Identify current sustainment practices
    Step 3: Develop sustainment plan
        Activity 1: Sustainment planning

Facilitation meetings (across all phases): 12 monthly 30-60 minute meetings in which the agency implementation team and ACT SMART facilitator collaborate to move through ACT SMART phases and activities. Structured facilitation meetings review steps, phases, and activities; troubleshoot previous action items; introduce next steps and phases; and plan for future steps.

Determinants to EBP Implementation.

In order to develop the ACT SMART Implementation Toolkit, research examining the barriers and facilitators to EBP implementation in ASD-CBOs was considered, and activities were developed to address contextual factors impacting the process of implementation (Drahota et al., in review). Within the context of implementation research, barriers can be understood as factors that hinder the implementation of an EBP in specific settings, while facilitators are factors that enable the implementation of an EBP in these settings (Aarons et al., 2009).
Drahota and colleagues (in review) found that barriers and facilitators may occur at multiple levels that impact implementation, including the patient, provider, organizational, and policy levels. That is, barriers as well as facilitators to the implementation of EBPs in community settings may be related to outer and inner context factors, bridging factors, and innovation factors (Moulin et al., 2019).

Importantly, studies indicate myriad barriers and facilitators to the implementation of EBPs in healthcare settings. Aarons and colleagues (2009) identified 14 factors perceived as either barriers or facilitators to EBP implementation in public mental healthcare settings, with funding identified as the most important factor impacting EBP implementation. In addition, organizational readiness, which involves the culture within the organization to support adoption and utilization of an EBP, as well as organizational and innovation-specific capacities, serve as inner context factors that can either impede or facilitate the adoption of EBPs in ASD-CBOs (Scaccia et al., 2015; Aarons et al., 2011).

Barriers.

Within ASD-CBOs specifically, inner context factors impeding EBP implementation include a lack of provider knowledge about EBPs for ASD, limited resources and access to EBPs, lack of specialized training for providers, the time intensity and rigid delivery format of EBPs, and the complexity of ASD interventions (Dingfelder & Mandell, 2011; Pickard et al., 2016; Paynter & Keen, 2015; Brookman-Frazee et al., 2012; Drahota et al., in review). For example, EBPs designed or adapted specifically for children with ASD are often complex and multifaceted, and these innovation characteristics can serve as a significant barrier to their implementation in usual care, community-based settings (Pickard et al., 2016). Moreover, the complexity of these interventions typically requires a significant amount of training that may not be available or may be too costly, therefore limiting the feasibility of implementation and utilization of these EBPs in community-based settings (Dingfelder & Mandell, 2011; Pickard et al., 2016; Wood et al., 2015; Drahota et al., in review). Finally, challenges may also include the limited capability of practitioners to master a variety of EBPs developed specifically for children with ASD, and difficulty implementing complex EBPs effectively across a variety of settings (Aarons et al., 2011; Wood et al., 2015; Dingfelder & Mandell, 2011; Pickard et al., 2016).

Further, Wood and colleagues (2015) described three common inner context factors that serve as barriers to the implementation of EBPs. First, the motivation of practitioners to adopt and utilize new interventions has been identified as a barrier to EBP implementation, particularly as these interventions tend to be complex and require a significant amount of training; as a result, practitioners may be less motivated to implement ASD-specific EBPs in their agencies. Relatedly, a lack of training and supervision for ASD practitioners on the use of ASD EBPs serves as a barrier to the implementation of these services in community settings. Finally, organizational and service setting factors, such as funding and a lack of providers, have been identified as barriers to EBP implementation in these settings (Wood et al., 2015).

Facilitators.

Research examining facilitators of EBP implementation in ASD-CBOs remains limited.
Studies indicate a greater need for comprehensive provider training and the inclusion of parents and families in the delivery of EBPs for children with ASD in an effort to facilitate EBP implementation in ASD-CBOs (Brookman-Frazee et al., 2012; Pickard et al., 2016). However, studies examining facilitators to EBP implementation in public healthcare settings rather than ASD service settings provide some possible variables of interest, including inner context factors such as staff perceptions regarding the benefits of the EBP to an agency, EBP compatibility with the agency, organizational leaders, and the value of the EBP for both clients and practitioners (Aarons et al., 2009). Further, studies that have taken place in school settings have identified facilitators to EBP implementation that include support from school administration and leadership, and greater buy-in from teachers (Langley et al., 2010). Inner context factors found to facilitate EBP implementation in youth mental health settings include activities such as fidelity monitoring and staff supervision (Novins et al., 2011). Overall, facilitators to EBP implementation in community settings include provider training, staff buy-in, EBP compatibility with the agency, and organizational support. Further investigation examining facilitators to EBP implementation for children with ASD specifically is needed.

Present Study.

Based on prior research examining determinants to EBP implementation, the ACT SMART Implementation Toolkit was developed in an effort to address unique client, provider, and contextual factors impacting EBP implementation in ASD-CBOs (Drahota et al., 2017). Importantly, the ACT SMART Toolkit was developed alongside community-academic partners, which allows for maximizing the fit of EBPs within these agencies (Gomez, Drahota, & Stahmer, 2018; Drahota et al., 2017). A pilot study was conducted to evaluate the feasibility, acceptability, utility, and fidelity of the toolkit. Upon completion of the ACT SMART Toolkit pilot study, effect sizes were calculated to assess the preliminary effectiveness of this toolkit within ASD-CBOs in increasing the use of the chosen EBP (video modeling). Overall, effect sizes indicate clinical significance between pre- and post-pilot in the reported utilization of video modeling. Importantly, these findings support the utilization of the ACT SMART Toolkit in producing behavioral changes in EBP utilization as reported by supervisors and direct providers, and suggest that this toolkit may be an effective strategy to facilitate the uptake of EBPs in ASD-CBOs (Sridhar & Drahota, 2020).

Overall, the utilization of practice-based implementation guides such as the ACT SMART Toolkit has the potential to bridge the gap between research and practice (Drahota et al., 2017; Drahota et al., in review; Eisenman et al., 2018). Furthermore, determinant frameworks allow agencies to identify and address unique factors impacting the utilization of EBPs in their setting, with the aim of facilitating efficient and effective adoption and implementation of these interventions in community agencies (Drahota et al., 2017). However, an understanding of the barriers and facilitators to the utilization of this toolkit is integral to facilitating its broader use and impact in community settings.

Aims.

This qualitative paper evaluated the barriers and facilitators to the utilization of the ACT SMART Toolkit.
Two implementation frameworks, the adapted EPIS model (Drahota et al., 2017) and the Technology Acceptance Model 3 (Venkatesh & Bala, 2008), guided this process. The adapted EPIS model (Drahota et al., 2017) was utilized in order to develop an understanding of inner context factors acting as determinants to the utilization of the ACT SMART Toolkit in these settings. Additionally, because the toolkit is primarily web-based, the TAM3 (Venkatesh & Bala, 2008) was utilized as well, in order to understand determinants specific to the web-based component of the toolkit. As previously stated, prior research has not utilized these frameworks in tandem; as a result, this paper allows for the exploration of how the two frameworks may interact to explore determinants to EBP implementation in ASD-CBOs. This paper utilized thematic analysis of qualitative interview data to address the following research questions:

a) What are the inner context factors acting as barriers and facilitators to the overall utilization of the ACT SMART Toolkit at six community agencies, as reported by agency implementation teams?

b) Did the identified inner context factors acting as barriers and facilitators to the utilization of the ACT SMART Toolkit differ by adapted EPIS phase?

Standards for Reporting Qualitative Research.

Importantly, this project followed the Standards for Reporting Qualitative Research (SRQR) as outlined by O'Brien and colleagues (2014). These guidelines outline a number of topics recommended for inclusion in qualitative papers, in order to improve the quality and transparency of reporting qualitative research. These topics, which include the qualitative approach and research paradigm; researcher characteristics and reflexivity; context; data collection, processing, and analysis; trustworthiness; interpretations; and limitations of findings, will be discussed throughout the paper.

METHOD

The data involved in the present study was collected as part of the ACT SMART Toolkit pilot study, which aimed to evaluate the feasibility, acceptability, utility, and fidelity of the Toolkit's use with a small sample of ASD-CBOs in Southern California. All study procedures were approved by institutional review boards, and the secondary data analysis involved in the current study was approved through the IRB at Michigan State University.

Researcher Characteristics.

Researcher characteristics were considered based on the SRQR guidelines. Both independent coders are university-affiliated individuals; one is an undergraduate research assistant at the university, and the other a doctoral student in the Clinical Psychology program. Both coders were provided reading and training materials prior to conducting thematic analysis. Additionally, one coder (AS) had previous experience in coding interviews and utilized similar coding techniques in the analysis of this data. Neither coder was associated with the development of the ACT SMART Toolkit or the pilot study. As well, the coders had no relationship to the participating Implementation Teams or community agencies taking part in the pilot study. These were important steps towards maintaining objectivity, in order to identify perceived determinants to utilization of the ACT SMART Toolkit.

Reflexivity.
Common assumptions and biases held by both independent coders included the following: (a) all children diagnosed with ASD would benefit from the utilization of ASD-EBPs; (b) there are serious disparities in access to both diagnosis and effective treatments for children with ASD from low-SES, racial/ethnic minority, rural, and other marginalized backgrounds; (c) there exists a research-to-practice gap in the utilization of ASD-EBPs in community settings; and (d) research investigating the implementation of EBPs in community-based settings is an important step in narrowing this research-to-practice gap. As a result, both coders strongly believe these findings will have important implications in ASD research and practice. Finally, both coders held common assumptions regarding the utilization of thematic analysis, and of the two frameworks (EPIS and TAM3), as being suitable for exploring and understanding determinants to the utilization of the ACT SMART Toolkit.

Context.

Six ASD community agencies (ASD-CBOs) located in Southern California participated in the ACT SMART Toolkit pilot study. Agencies were selected based on the following criteria: (a) existing social and/or research collaborations with other agencies, researchers, and collaborative groups; (b) existing efforts to receive additional training for their staff; and (c) discussions about interest in implementing new EBPs within their agency. As well, agencies were already providing services to children with ASD at the time of recruitment; specifically, four of the participating agencies provided Applied Behavior Analysis (ABA) to clients, one agency provided both ABA and mental health care, and one agency provided Speech and Language Pathology (SLP) services. Furthermore, all six agencies reported a need for one of the three EBPs selected for the ACT SMART Toolkit pilot study (i.e., social narratives, video modeling, and self-management) during agency recruitment; once the pilot study began, all six agencies selected video modeling to be implemented at their agency.

Participants.

Recruitment was completed based on both the timeline of the study and the feasibility of providing facilitation teams, made up of ACT SMART staff, to all six participating agencies. Five agencies completed all phases of the implementation process using the ACT SMART Toolkit; one agency chose not to adopt an EBP at the end of Phase 2 (Adoption Decision) of the implementation process. Implementation Teams (ITs, N = 6) at each agency were made up of 1-4 agency staff, and included supervisors, agency leaders, and direct providers. Demographic information for IT members is shown in Table 2; demographic information for three supervisors was not collected. Agency ITs were required to have at least one agency leader on the team. ASD agency leaders were eligible if: (a) they had a role of director, CEO, or leading decision-maker regarding treatment use at their agency; (b) their agency was eligible to participate in the pilot study; (c) they were willing to commit 1 year of study engagement; and (d) they agreed to provide feedback following each phase and at the end of the pilot. The remainder of the IT was made up of agency site staff members and determined by the IT agency leader. In addition, eligibility criteria required that all members of each IT would commit to providing information and feedback about the feasibility, acceptability, and utility of the ACT SMART Toolkit.
Table 2. IT Demographics

Demographics                        Supervisors (n = 8)   Agency Leaders (n = 7)   Direct Providers (n = 1)
Sex (Female)                        100%                  100%                     100%
Ethnicity
    White                           25%                   100%                     100%
    Mixed Race                      25%                   -                        -
    Prefer Not to Answer            12.5%                 -                        -
Education Level
    Master's Degree                 50%                   42.9%                    -
    Doctorate                       12.5%                 57.1%                    -
Discipline
    Psychology                      25%                   28.6%                    -
    Behavior Specialist             25%                   28.6%                    -
    Speech/Language/Communication   12.5%                 28.6%                    100%
    Education                       -                     14.3%                    100%
Missing                             37% (Supervisors)     -                        -

Procedure.

Individual semi-structured interviews were conducted with each participating IT across five time points. Interviews with ITs were conducted in person, at a location convenient to the IT members, or by phone, and took place after each phase was completed and at the end of the pilot study. Interviews began with questions using a Likert scale format that measured IT perceptions of the feasibility, acceptability, and utility of each ACT SMART Toolkit phase. Participant responses were followed up with open-ended questions about why a score was given, areas for improvement, and factors impacting their perceptions of feasibility, acceptability, and utility. IT interviews were audio recorded with the participants' consent, and ITs were compensated $100 at the end of each interview. End of phase interviews took 10.12 minutes on average (SD = 4.32), and end of pilot interviews lasted 22.79 minutes on average (SD = 6.86).

Measures.

End of phase interviews (see Appendix A for the interview guide) focused on the IT's perspectives on the practicality, acceptability, and utility of both the activities and facilitation meetings that occurred throughout each phase. Additionally, ITs were asked to report any changes that had been observed at their agency from the beginning of the study to the date of the interview (whether or not the change was attributed to the utilization of the ACT SMART Toolkit) and recommendations for revisions to the ACT SMART Toolkit, website, and/or facilitation meetings, specific to that phase of the toolkit.

Post-study qualitative interviews (see Appendix B for the interview guide) were also conducted with each IT. These interviews focused on the feasibility, acceptability, and utility of the toolkit and facilitation meetings overall. Implementation team members were asked about their satisfaction with the toolkit, challenges their agency faced while using the toolkit, perspectives on the toolkit and facilitation meetings in general, views on the impact of ACT SMART in their agency (i.e., success, value), future utilization of ACT SMART at their agency, and recommendations for toolkit and facilitation revisions.

Data Analysis.

Data processing: Recorded interviews were transcribed and verified by undergraduate research assistants who were unfamiliar with the aims of the ACT SMART pilot study as well as the current study. The majority of interviews were transcribed in California, where the data was collected; however, the end of pilot interviews were primarily transcribed and verified at Michigan State University. Interview data were anonymized such that specific members of implementation teams were referred to as "participant" or "respondent." In cases of multiple respondents, each respondent was randomly assigned a number (e.g., Participant 1, Participant 2) for each transcription. Importantly, although each Implementation Team was made up of 1-4 agency staff members, not all members of each IT were present at every interview.
As a result, data was processed utilizing each IT as one unit of analysis, rather than analyzing each participant within the IT as a separate unit of analysis (Onwuegbuzie, Dickinson, Leech, & Zoran, 2009).

Thematic analysis: Following interview transcription and verification, thematic analysis (Clarke & Braun, 2014) was utilized in order to develop a clear understanding of the specific research questions this paper sought to examine, based on relevant emergent codes, categories, and themes across the entire data set. Specifically, thematic analysis allowed for the exploration of barriers and facilitators to the utilization of the ACT SMART Toolkit, both by phase and at post-study. Finally, thematic analysis allowed for the utilization of multiple coding methods, such as the utilization of both an inductive and a deductive approach to coding (Braun & Clarke, 2012). An inductive approach to coding involves the analysis of emergent data, while deductive approaches allow the researcher to utilize existing ideas or concepts when interpreting data; as a result, this method aligned with the utilization of two implementation frameworks, as a number of codes were decided a priori based on these frameworks (Braun & Clarke, 2012). Both inductive and deductive methods were utilized in the thematic analysis for this project. This project followed the six-phase approach to thematic analysis, as outlined by Braun & Clarke (2012). Importantly, this process involved both a sequential progression through the phases of analysis, as well as a repetitive process that involved moving back and forth between the six phases of analysis. Analysis began with the independent coders familiarizing themselves with the data by reading all interview transcripts twice. Following this, the coding process, outlined in detail in the following subsection, began. Upon completion of coding, the independent coders generated initial themes by examining the codes and identifying patterns in the data. Subsequently, the coders discussed and compared emergent themes and, together, identified two final themes. Following this, definitions for each theme were developed by both coders and codes were organized by theme. Finally, the writing process began and themes were contextualized as they related to previous literature (Braun & Clarke, 2012).

Coding: A number of specific coding methods were utilized in this analysis (Miles, Huberman & Saldaña, 2013). Firstly, provisional coding was utilized. Provisional coding refers to the utilization of codes developed prior to analysis, based on the research questions and existing frameworks guiding analysis. These codes included: Barriers to the Use of ACT SMART, Facilitators to the Use of ACT SMART, Phase Specific Barriers to the Use of ACT SMART, Phase Specific Facilitators to the Use of ACT SMART, Inner Context Factors as Barriers to ACT SMART, and Inner Context Factors as Facilitators to ACT SMART. Following this, first cycle (or initial) coding was utilized; this involved line-by-line open coding to identify emergent codes. During this process, a codebook (Appendix C) was developed and regularly revised as codes emerged. As well, subcoding was utilized to identify second-order codes nested under a primary code. This method allows for more specific coding and details about a main code (e.g., main code: Barriers to ACT SMART; subcodes: "Website Issues," "Perceived Ease of Use," etc.).
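For illustration, the sketch below shows one way the nested code structure just described might be represented programmatically. It is a minimal, hypothetical sketch: the code and subcode names are drawn from the text above, but the dictionary structure, the grouping of subcodes under main codes, and the assign_code helper are illustrative and do not reproduce the study's actual codebook (Appendix C).

    # A hypothetical representation of a nested codebook; the names come from the
    # provisional codes and subcodes described above, but this structure is
    # illustrative only and is not the study's actual codebook (Appendix C).
    from typing import Optional

    codebook = {
        "Barriers to the Use of ACT SMART": [
            "Website Issues",
            "Perceived Ease of Use",
            "Perceived Lack of Resources",
        ],
        "Facilitators to the Use of ACT SMART": [
            "Facilitation Team",
            "FT Meetings",
            "Phase Specific Activities",
        ],
        "Inner Context Factors as Barriers to ACT SMART": [
            "Scheduling",
            "Time Constraints",
            "Funding",
        ],
    }

    def assign_code(segment: str, main_code: str, subcode: Optional[str] = None) -> dict:
        """Attach a main code, and optionally a nested subcode, to a transcript segment."""
        if main_code not in codebook:
            raise KeyError(f"Unknown main code: {main_code!r}")
        if subcode is not None and subcode not in codebook[main_code]:
            raise KeyError(f"{subcode!r} is not nested under {main_code!r}")
        return {"segment": segment, "main_code": main_code, "subcode": subcode}

    # Example: coding one interview segment with a main code and a nested subcode.
    coded_segment = assign_code(
        "it was very hard to navigate",
        "Barriers to the Use of ACT SMART",
        "Perceived Ease of Use",
    )

A nested representation of this kind makes it straightforward to check that every subcode is attached to a recognized main code, mirroring how subcoding provides additional detail about a primary code.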
As this process continued, axial coding, the process of relating codes to each other, was utilized to group similar codes together to form larger categories based on concepts that emerged from the data. The entire process was iterative in nature, such that the codebook was often revised in order to add or remove codes and to group codes together. Once the codebook was finalized, the independent coders conducted a final coding of all interviews to ensure consistency in codes across the data. Throughout the process, consensus coding was utilized to handle any coding discrepancies. Specifically, the independent coders discussed their rationale for the code selected, and chose a final code together based on this discussion.

Coding was completed first using Microsoft Word. Following this, all interviews and codes were entered into MAXQDA, a qualitative and mixed-methods computer software. MAXQDA was utilized in order to examine the frequency with which codes were discussed, and to facilitate the review of illustrative quotes by code, subcode, or theme.

Salient codes were identified as well. Salient codes were chosen based on the independent coders' perspective regarding the codes they believed to be most impactful in either facilitating or hindering the utilization of the ACT SMART Implementation Toolkit, based on IT responses. In addition, the frequency of the codes across all transcripts was considered to support the saliency of codes (Landrum & Garza, 2015). Salient codes were selected by each coder individually; specifically, each coder selected the codes that they believed were salient facilitators to the ACT SMART Implementation Toolkit, and the codes they believed were salient barriers to the ACT SMART Implementation Toolkit. Following this, the coders selected the final salient codes and utilized consensus coding to address any discrepancies.

Theming the data: The coders developed two main themes that aligned with the frameworks utilized in data analysis, following the process outlined by Braun & Clarke (2012). Additionally, using MAXQDA, the coders examined each code to determine whether the content of these codes differed by adapted EPIS phase.

Trustworthiness: A number of steps were taken in order to ensure trustworthiness in the analysis of interview data, in alignment with the SRQR guidelines. Firstly, coding was conducted by two independent coders, who regularly assessed agreement in their codes. Secondly, as coding was conducted iteratively, the codebook was updated on a weekly basis throughout the process, with a total of 10 iterations of the codebook. The coders utilized an audit trail to keep a record of any changes made to the codebook, as well as to describe the rationale for changes made. Finally, neither coder was associated with the development of the ACT SMART Toolkit, ensuring that the current study's procedures and coding were conducted in an objective manner. However, due to the small sample size and limited time frame of this pilot study, thematic saturation could not be established.

RESULTS

Frequent Codes.

Qualitative data was quantized to determine frequency counts (i.e., the number of times a code was assigned across all interview transcripts; see Table 3). The most frequent code was Phase Specific Activities (subcode: as a facilitator) to the utilization of ACT SMART (frequency: 77). The second most frequent code discussed by IT members was Facilitation Team (FT) Meetings (frequency: 76).
Finally, Phase Specific Activities (subcode: as a barrier) to the utilization of ACT SMART was coded 44 times. All frequencies, descriptions, and illustrations of each code, organized by theme, are reported in Table 3.

Table 3. Themes, Descriptions, Frequencies, and Illustrative Quotes

Frequencies are reported as total frequency, with the number of agencies endorsing the code in parentheses.

Theme: Inner Context Determinants*. Characteristics within the organization, including leadership, organizational structures, and staffing (Moulin et al., 2019); inner determinants from within the ASD-CBO.
    Barrier: 10 (6). "I feel like the real-life factors like our boss is what prevents it from being feasible—the toolkit"
    Facilitator: 3 (2). "…the intervention I was choosing I pretty much already knew I was choosing so this this phase of it um you know was pretty easy for me to just oh go through the motions and know that I made the right decision"

Scheduling. Comments regarding scheduling being difficult as a barrier to ACT SMART.
    Barrier: 15 (3). "maybe the other hard part just to consider is when you're in a busy clinical practice how you make it at a time everybody can meet"
    Facilitator: 0 (0). n/a

Time Constraints. Comments regarding staff members being too busy or not having time to complete tasks associated with ACT SMART.
    Barrier: 10 (4). "it also took a really long time and its just hard kind of being in the position we're in to set aside that much time you know which is why it's taking so long because we have so much work that we do. So we both just kind of found it difficult in phase 2 to complete everything because of how lengthy the process was"
    Facilitator: 0 (0). n/a

Funding. Comments regarding issues related to funding as impeding the ability to complete ACT SMART.
    Barrier: 8 (1). "I would say it was pretty difficult because we didn't have funding so we couldn't get passed a lot of the steps"
    Facilitator: 0 (0). n/a

Organizational Characteristics. Comments regarding the characteristics of the agency that help or impede the ability to complete activities for ACT SMART.
    Barrier: 11 (2). "I think that people were feeling that they had other things that they needed to do."
    Facilitator: 3 (2). "Our team is awesome, so everybody kind of supports each other"

Individual Characteristics. Characteristics of the staff members that facilitate use of the toolkit.
    Barrier: 0 (0). n/a
    Facilitator: 3 (2). "Maybe 'cause I have more kind of background in that, application"

Theme: Innovation Determinants. Factors related to the innovation itself (Moulin et al., 2019); determinants directly from the ACT SMART Toolkit.

Phase Specific Activities*. Phase-specific factors facilitating or hindering the use of ACT SMART. This code is further broken down into each of the five phases.
    Barrier: 44 (6). "Phase 3 really does depend on whether or not there's availability of materials out there, which I don't think are covered in phase 2" [Phase 3, EBP Implementation]
    Facilitator: 77 (6). "Having it presented um with our team was was…it generated a lot of useful discussion and um I thought that was very…it was very useful" [Phase 1, Agency Assessment]

Perceived Lack of Resources*. Comments regarding the lack of appropriate or helpful resources that agencies believed would be provided through the ACT SMART Toolkit.
    Barrier: 11 (5). "They gave a couple of links and resources but I didn't feel like they were that helpful, we sorta had to go do our own."
    Facilitator: 0 (0). n/a

ACT SMART Toolkit not tailored to agency. Responses discussing ACT SMART being too general/not tailored to specific agencies.
    Barrier: 7 (2). "I think because it's, you know you're trying to make it fit for all these different agencies doing all these different things they had to be a little more general and not tailored…"
    Facilitator: 0 (0). n/a

Lack of Responsiveness from FT. Comments regarding a lack of responsiveness from the FT once the IT has reported an issue/asked for assistance.
    Barrier: 3 (2). "We run into hiccups which we communicate and they don't end up getting corrected"
    Facilitator: 0 (0). n/a

Additional Resources. Additional resources provided by the FT that were not initially provided by ACT SMART.
    Barrier: 0 (0). n/a
    Facilitator: 7 (3). "In the beginning we had trouble locating a program to implement. Remember we couldn't find the modeling, we um but they helped us a lot by pairing us with another agency who was doing the same thing so that was very useful."

Facilitation Team*. Responses regarding the FT members or their responsiveness as facilitators.
    Barrier: 0 (0). n/a
    Facilitator: 22 (5). "They were able to bring um some materials that supported maybe some of my questions that I had about the website and the it was much easier um when they brought them… to the facilitation meetings for me to understand what was needed"

FT Meetings*. Responses focused on the FT meetings specifically as being helpful.
    Barrier: 0 (0). n/a
    Facilitator: 76 (6). "They were my favorite because I felt they kept us on track."

Perceived Ease of Use of Website*. Comments regarding difficulty/ease of use of the website.
    Barrier: 26 (6). "It's just the ease of use of the website. I don't know why for me it was not something that that seemed like it didn't- I don't know I didn't use it. It didn't seem that user friendly. I just didn't log into it as much maybe as I do other websites that I have to use"
    Facilitator: 11 (5). "I think its just its laid out in a very logical a logical way, kind of walks you through the steps so I thought it was very practical"

Perceived Usefulness of Website. Comments regarding the usefulness of the website.
    Barrier: 9 (3). "But at the end, it didn't provide any needed structure"
    Facilitator: 23 (6). "The web-based activities were nice because they were at least at the beginning of the project, too, in those phases because…umm… they were kind of a reminder of… umm… the… you know the… the activities that I shh… I was supposed to be doing in my deadlines and things like that"

Website Issues*. Comments about any issues with the website (i.e., technological issues).
    Barrier: 27 (6). "The web-based activities were challenging because it didn't work."

General Website Comments. General comments about aspects of the website that facilitated the use of ACT SMART.
    Facilitator: 2 (2). "It was very detailed"

Note. *Denotes salient determinants.

Salient Codes.

Salient barriers were identified as factors, discussed by implementation teams, that appeared to be the most significant in impeding the use of the ACT SMART Implementation Toolkit. Four factors were selected by the independent coders: (a) website issues, (b) perceived ease of use of the website, (c) perceived lack of resources, and (d) inner context factors.
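The frequency values reported in Table 3 (total frequency, with the number of endorsing agencies in parentheses) reflect a simple tally over coded transcript segments: how many times each code was assigned across all interviews, and how many implementation teams' transcripts contained it. A minimal sketch of such a tally is shown below; the (agency, code) records are hypothetical examples, and in the study these counts were examined with MAXQDA rather than computed by hand.

    # Minimal sketch of quantizing coded segments into Table 3-style counts.
    # The records below are hypothetical examples, not study data.
    from collections import defaultdict

    coded_segments = [
        ("IT1", "FT Meetings"),
        ("IT1", "Website Issues"),
        ("IT2", "FT Meetings"),
        ("IT2", "FT Meetings"),
        ("IT3", "Website Issues"),
    ]

    totals = defaultdict(int)     # code -> total number of times assigned
    endorsers = defaultdict(set)  # code -> set of ITs endorsing the code

    for agency, code in coded_segments:
        totals[code] += 1
        endorsers[code].add(agency)

    for code in sorted(totals, key=totals.get, reverse=True):
        # e.g., "FT Meetings: 3 (2 agencies)"
        print(f"{code}: {totals[code]} ({len(endorsers[code])} agencies)")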
Website issues (frequency: 27), described as "a couple glitches that need to be […] worked out, which is typical with technology," was selected as a salient code based on IT responses regarding technological issues when using the website. This was especially significant given that the majority of the ACT SMART Implementation Toolkit activities took place on the website; as a result, technological issues with the website often made it difficult for agencies to complete web-based activities and tasks. This code aligned with the adapted EPIS model as an innovation factor, as it was specific to the ACT SMART Toolkit website.

Secondly, the perceived ease of use (frequency: 26) of the website was identified as a salient barrier to the use of the toolkit. Implementation team members discussed the difficulty they had in navigating the website, and provided specific comments regarding aspects of the website that were not easy to use. For example, IT members stated "it was very hard to navigate" and "it wasn't […] user friendly." These difficulties led several IT members to avoid use of the website during the pilot study. The perceived ease of use code aligned with the TAM3 framework that guided data analysis. The TAM3 framework highlights this factor as a determinant of the adoption and use of technology products during the implementation of innovations in community settings (Venkatesh & Bala, 2008).

Thirdly, IT members discussed the perceived lack of access to resources (frequency: 11) they believed would have been helpful. For example, IT members expressed a need for access to behavior analytic journals specifically, as well as the ability to communicate with other agencies that had implemented the same EBP, in order to discuss the process of implementation with agencies who underwent a similar process. In addition, agencies reported a perceived lack of resources related to the EBP chosen for implementation as a barrier to progressing through ACT SMART phases. One IT agency leader stated, "so um we sort of had to do our own research, identify our own materials […] when we were gathering the materials together [for video modeling]." The perceived lack of resources code aligned with the EPIS framework, highlighting the importance of accessibility to resources in the implementation of EBPs in community settings (Moulin et al., 2019). Specifically, this code fell under the EPIS framework's "innovation factors," as IT members believed these resources would be provided through the toolkit. Importantly, resources such as access to specific journal articles or resources that were needed for the EBP itself (e.g., cameras, training videos) were not included in the ACT SMART Toolkit. As a result, this barrier appears to stem from a possible miscommunication or misunderstanding regarding the resources included as part of the ACT SMART Implementation Toolkit.

Finally, inner context factors (frequency: 10) emerged as a salient code, and included specific issues related to scheduling and funding within the agency. In terms of scheduling, IT members discussed staff being very busy, leading to difficulties in setting up meetings focused on ACT SMART activities. Additionally, IT members discussed difficulty accessing resources needed for EBP implementation due to a lack of funding within the agency. Both barriers align with the EPIS framework, under the category of "inner context factors" (Moulin et al., 2019).
Although these barriers were not the most frequently coded, they appeared to be the most salient in impeding the utilization of the ACT SMART Implementation Toolkit, based on IT perspectives.

Salient facilitators were identified from the codes discussed by implementation teams that appeared to be the most important and useful factors in facilitating ACT SMART use. The three salient codes identified were: (a) facilitation teams (FT), (b) FT meetings, and (c) phase specific activities.

Facilitation teams (frequency: 22) were identified as a salient code based on IT members' discussion of various characteristics of the teams that were supportive as they utilized the ACT SMART Implementation Toolkit. One IT stated, "she's been really communicative and we haven't run into anything where we felt like we couldn't get support um, it's been good." Overall, responses focused on facilitation teams being responsive to agency needs, providing support to IT members, and being flexible in terms of scheduling and meeting with the agencies.

Secondly, the FT meetings (frequency: 76) were also identified as a salient code. IT members regularly discussed the FT meetings in particular as being a facilitator to ACT SMART Toolkit use, with one IT commenting, "they were very very helpful. I would say they were probably one of the most helpful… um… uh… aspects of the ACT SMART." IT members cited several reasons for FT meetings being helpful, including that FT meetings helped ITs "stay on track", provided ITs with accountability, allowed IT members to ask FT members questions, and provided IT members with helpful information.

Finally, phase specific activities (frequency: 77) were identified as a salient facilitator to the utilization of the ACT SMART Implementation Toolkit. In particular, activities from Phase 1 (Exploration) and Phase 2 (Adoption Decision) were often discussed by agencies. In Phase 1, the agency assessment and the ACT SMART orientation meeting were both discussed as being particularly helpful. These activities were perceived to provide ITs with valuable information regarding agency needs, current resources at the agency, and the toolkit itself. In Phase 2, the cost estimator worksheet was reported as helpful to ITs when they considered the costs of implementing the EBP, including staffing, resources, and training costs. As stated by one member of an IT, "overall I think the cost benefit analysis is useful and is, is necessary." IT members discussed needing this information in order to move forward in utilizing the ACT SMART Implementation Toolkit.

All three salient facilitators aligned with the EPIS framework's "innovation factors" (Moulin et al., 2019). These three components were included in the ACT SMART Toolkit and provided to the agencies as part of the implementation process. Overall, these three facilitators appeared to be most salient to agencies by supporting the utilization of the ACT SMART Implementation Toolkit across all phases.

Thematic Analysis Findings. Finally, codes were grouped into two themes that were developed utilizing both the EPIS and TAM3 frameworks and based on salient and frequent codes (Figure 2). Once the themes were selected, researchers examined whether each code under the two themes differed by adapted EPIS phase, using MAXQDA. Specifically, IT responses were examined in order to determine whether the content of specific codes varied by adapted EPIS phase.
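The phase comparison just described was carried out within MAXQDA. As a purely illustrative sketch of the underlying operation, the following Python snippet cross-tabulates coded segments by adapted EPIS phase; the data structures, the example records, and the phase labels (taken from the adapted EPIS phase names used in this study) are assumptions for illustration, not study data.

```python
# Illustrative only: cross-tabulating codes by adapted EPIS phase to see
# whether a code's occurrences concentrate in particular phases. The study
# performed this step in MAXQDA; these records are hypothetical.
from collections import Counter

PHASES = ["Exploration", "Adoption Decision", "Preparation", "Implementation"]

# One tuple per coded segment: (code, adapted EPIS phase of the interview).
segments = [
    ("Funding", "Preparation"),
    ("Individual characteristics", "Implementation"),
    ("Time constraints", "Exploration"),
    ("Time constraints", "Preparation"),
]

by_code = {}
for code, phase in segments:
    by_code.setdefault(code, Counter())[phase] += 1

for code, counts in sorted(by_code.items()):
    profile = ", ".join(f"{p}: {counts.get(p, 0)}" for p in PHASES)
    print(f"{code} -> {profile}")
```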
Figure 2. ACT SMART Toolkit Determinants

Inner Context Determinants: The first theme, "Inner Context Determinants" (total frequency: 63), was developed a priori based on the adapted EPIS model, which highlights inner context factors as an important aspect of EBP implementation in community settings; thus, this portion of data analysis followed a deductive approach (Aarons et al., 2009; Drahota et al., 2017). Additionally, this theme was developed in order to answer the first research question, regarding inner context factors acting as determinants to the overall utilization of the ACT SMART Implementation Toolkit. A number of specific codes (Figure 3) were identified as inner context factors that either facilitated or hindered the utilization of the toolkit. Barriers to the utilization of the ACT SMART Toolkit included funding (frequency: 8), time constraints (frequency: 10), and scheduling (frequency: 15). For example, one IT explained that "finding a time that people were all available at one time was challenging." Facilitators to the utilization of the toolkit included individual-level (i.e., staff) characteristics (frequency: 3), such as "having that [staff training] background." Additionally, organizational-level characteristics (frequency: 14) were found to act as either a barrier or a facilitator. For example, IT members perceived their agencies to have other priorities over use of the ACT SMART Toolkit, making it difficult for IT members to prioritize completing ACT SMART activities. However, organizational characteristics such as staff buy-in from numerous members of the agency (e.g., "we've got a really good staff who was- they were very eager to do it") facilitated the utilization of the toolkit.

Inner Context Determinants Differing by Phase: Identified Inner Context Determinants were then examined further, in order to explore whether these determinants differed by adapted EPIS phase. Two Inner Context Determinants were found to differ by adapted EPIS phase. While funding was found to be a barrier across all phases for one agency, this barrier played a particularly influential role in phase 3 (Preparation), during which agencies were purchasing materials and resources needed for the EBP implementation. A lack of funding at this agency prevented the purchase of materials and resources for a portion of this phase. Notably, funding was not discussed by any other agency during any phase of the ACT SMART Implementation Toolkit. In terms of inner context facilitators, individual characteristics differed across phases, such that prior experience and familiarity with leading staff trainings was an important facilitator for two agencies in phase 4 (Implementation), when staff training took place. For example, in an end of pilot study interview, one IT member explained, "there was times that [staff member] was really key in the implementation, because she was leading the training" when discussing perceptions of barriers and facilitators to phase 4 of the ACT SMART Toolkit. No other Inner Context Determinants were found to differ by phase.

Figure 3. Inner Context Determinants. Barriers: funding*, time constraints, scheduling. Barriers and/or facilitators: organizational characteristics. Facilitators: individual characteristics*. (*Indicates determinants that differed by phase.)
Innovation Determinants: The second theme, "Innovation Determinants" (total frequency: 345), was developed post hoc, as it emerged directly from the data; thus, this portion of analysis followed an inductive approach. "Innovation Determinants" aligned with the EPIS model, which highlights factors specific to the EBP or innovation itself as salient determinants to implementation (Moulin et al., 2019). Codes under this theme included any factor that specifically came from the ACT SMART Toolkit, such as facilitation teams and meetings (total frequency: 98), phase-specific activities developed for the toolkit (total frequency: 120), resources provided to agencies by the ACT SMART staff, and all comments related to the ACT SMART website (total frequency: 98). All codes associated with Innovation Determinants are displayed in Figure 4.

Figure 4. Innovation Determinants. Barriers: perceived lack of resources*, AS not tailored to agency*, lack of responsiveness from FT, website issues. Barriers and/or facilitators: phase specific activities*, perceived ease of use of website, perceived usefulness of website. Facilitators: Facilitation Team (FT), FT meetings, additional resources from FT, general website comments. (*Indicates determinants that differed by phase.)

Innovation Determinants that Differed by Phase: Three innovation barriers were found to differ by adapted EPIS phase, including specific phase specific activities; no innovation facilitators were found to differ by phase. Firstly, the ACT SMART Implementation Toolkit was noted by respondents as not tailored to specific agencies (frequency: 7). This was found to be particularly influential in phases 2 (Adoption Decision) and 4 (Implementation). Specifically, IT members reported a lack of access to tailored resources to match specific agency needs. For example, one agency described a need for behavior analytic journals specifically, to match their needs as an agency providing Applied Behavior Analysis (ABA) to clients, while another explained that "it was just difficult for us, it-it wasn't difficult it was just challenging for us to adapt s-some of these materials to our particular population or therapy targets." Additionally, the perceived lack of resources (frequency: 11), in general, was found to be influential in phases 2 (Adoption Decision) and 3 (Preparation), where IT members described needing specific materials for the chosen EBP and had difficulty accessing these resources (e.g., training videos, training manuals). Indeed, one IT member stated, "we sort of had to do our own research, identify our own materials." Importantly, IT members seemed to believe these resources were to be provided from the ACT SMART Implementation Toolkit; however, these resources were not included within the toolkit, as it was not developed to be tailored to a specific agency or EBP.

Phase Specific Activities as Innovation Determinants: A number of phase specific activities were identified as barriers and/or facilitators, and were found to differ by implementation phase. Specifically, the agency assessment in phase 1 (Exploration) was described as "confusing" and "time-consuming" (frequency: 14); therefore, implementation team members felt this activity hindered the utilization of the toolkit during this phase. Yet, this activity was also reported to facilitate the use of the toolkit by some respondents (frequency: 16).
For example, one IT member stated, "the useful part is getting the data" and explained that the agency assessment allowed agencies to gather information on training needs and resources.

Additionally, the cost estimator worksheet presented logistical barriers (frequency: 18) in phase 2 (Adoption Decision) because "there was the glitch, it didn't work" on the ACT SMART website, and the worksheet was described as "time-consuming" (frequency: 4) and "confusing" (frequency: 4) by different IT members. However, similar to the agency assessment in phase 1, the cost estimator worksheet was helpful in facilitating the development of a budget for EBP implementation for each IT (frequency: 15). As stated by one IT member, "it does help you plan out and also helps you to anticipate […] what the funding is gonna be." Finally, phase 3 (Preparation) included an implementation strategies exercise and the development of a staff training plan; both activities were perceived as being helpful to the agencies and facilitated the utilization of the toolkit during this phase. No other innovation barriers or facilitators were found to differ by phase.

Website Factors as Innovation Determinants: Importantly, a number of codes under the Innovation Determinants theme were identified post hoc and were related to the website. These codes were not found to differ by phase, but are important to highlight, as the website made up a significant portion of the ACT SMART Toolkit. Codes included both the perceived ease of use (e.g., "really user friendly") and perceived usefulness (e.g., "not useful, we didn't use it") of the website, based on the Technology Acceptance Model 3 (TAM3; Venkatesh & Bala, 2008). These codes acted as either barriers or facilitators to the utilization of the ACT SMART Toolkit and highlight the importance of IT members' perceptions of the utility and feasibility of using a technological product to increase uptake of EBPs at their agencies. In addition, codes related to other comments regarding the ACT SMART website were identified, including website issues as a barrier (e.g., "glitches with the website") and general comments regarding the website (e.g., "it did not take that much time"). While these codes did not align with the TAM3 specifically, they highlight other important factors to consider when using a technological product to support the application of implementation strategies in community agencies.

DISCUSSION

Salient Determinants. This study aimed to examine (a) inner context factors as barriers and facilitators to the utilization of the ACT SMART Implementation Toolkit, and (b) whether these inner context determinants differed by adapted EPIS phase. This study identified a number of salient barriers and facilitators to the implementation of EBPs in ASD-CBOs. Salient barriers included website issues, the perceived lack of ease of use of the website, a perceived lack of resources, and inner context factors. These barriers suggest that, while the website provided helpful resources and was otherwise useful to the participating agencies, technological problems (e.g., "glitches") and difficulty navigating the website led most ITs to avoid using the website during the pilot study. Furthermore, these findings highlight the need for future iterations of the ACT SMART Toolkit to address technological concerns with the website, as well as to focus on making the website more user-friendly and easy to navigate.
Additionally, these salient barriers suggest a possible miscommunication between facilitation and implementation teams, such that IT members believed that the ACT SMART Implementation Toolkit would be responsible for providing certain resources (e.g., access to journals, training videos for the EBP). However, the toolkit did not include these resources, as it was not developed for specific agencies or EBPs. As a result, future implementation of this toolkit should ensure clear communication regarding the resources provided by the toolkit, as well as the resources that agencies are responsible for accessing (e.g., resources for the chosen EBP). Finally, inner context factors, such as scheduling and funding, appeared to hinder the use of the toolkit, such that ITs were unable to progress to a subsequent phase when lacking adequate funding or when unable to schedule meetings related to the ACT SMART Toolkit. Overall, these findings suggest that inner context organizational factors continued to hinder EBP implementation in ASD-CBOs, even when agencies utilized a systematic and structured implementation toolkit.

However, there were salient facilitators to the use of the ACT SMART Toolkit as well, including the ACT SMART facilitation teams and facilitation meetings, as well as phase specific activities. These findings highlight the importance of facilitation as a component of the ACT SMART Implementation Toolkit, as IT members consistently stated that this aspect of the toolkit was the most helpful in facilitating the implementation of the selected EBP. Finally, the phase specific activities included in the toolkit were found to provide agencies with helpful and necessary information as they progressed through the adapted EPIS phases of the toolkit.

Inner Context Determinants. Based on thematic analysis of interviews with implementation teams at participating ASD-CBOs, five Inner Context Determinants were identified. These findings support previous research indicating that funding is a significant determinant of the implementation of innovations across various settings (Aarons et al., 2009). Additionally, staff buy-in, or the motivation of practitioners to implement an intervention, is a significant facilitator of the implementation of interventions in community settings (Wood et al., 2015; Langley et al., 2010). Furthermore, findings indicated additional inner context determinants not previously discussed in the literature. Specifically, scheduling, time constraints, and individual provider characteristics were important inner context determinants of ACT SMART use. Findings suggest the need for further research investigating the impact of these inner context determinants on the implementation of innovations in community-based organizations.

Innovation Determinants. Although this paper sought to identify inner context determinants to the utilization of the toolkit, thematic analysis highlighted several determinants associated with the toolkit as a whole (Innovation Determinants) and with the website specifically. Findings support previous literature indicating that a perceived lack of resources for specific EBPs is often a barrier to implementation in community-based settings (Dingfelder & Mandell, 2011; Pickard et al., 2016). Furthermore, facilitators and facilitation team meetings were perceived to be among the most, if not the most, helpful components of the ACT SMART Implementation Toolkit.
This finding aligns with previous literature illustrating the importance of facilitation in the implementation of interventions in various settings (Harvey et al., 2018). Finally, both the perceived ease of use and the perceived usefulness of the website were found to be important to the utilization of the ACT SMART Implementation Toolkit. This finding is consistent with previous literature suggesting that these two factors are highly predictive of technology use in various settings (Venkatesh & Bala, 2008). Importantly, these findings indicate an interaction between the two implementation frameworks, such that salient determinants within the TAM3 were found to fall under the "innovation factors" component of the EPIS. This study is the first to utilize both frameworks in tandem, and to highlight ways in which these implementation frameworks may interact when technology products are utilized within an innovation (i.e., an implementation toolkit).

Determinants Across Phases. In addition, this study identified a number of determinants that differed across adapted EPIS phases. Two phase specific activities were found to act as both a barrier and a facilitator: the agency assessment (phase 1) and the cost estimator worksheet (phase 2). Both activities were described as being time-consuming and confusing to complete, but ITs also reported that the information these activities provided to agencies was extremely helpful in preparing for EBP implementation. Importantly, these findings indicate an interaction between various innovation factors, such that the modality in which these activities were delivered was found to impede use of the toolkit, but the content of the activities was found to facilitate use of the toolkit. These findings are an important contribution to the understanding of determinants to the implementation of innovations within community agencies.

In addition, a number of determinants were found to be particularly salient in specific phases, based on the purpose and goals of the phase. For example, in phase 2 (Adoption Decision), barriers such as the perceived lack of resources and the ACT SMART Toolkit not being tailored to agency needs appeared to be more salient to implementation team members. Based on IT member responses, these factors played a significant role during this phase due to the need for specific and tailored resources that matched agency needs, in order to assist the agency in making its adoption decision. In phase 3 (Preparation), the perceived lack of resources and a lack of funding were found to be the greatest barriers to use, as agencies began to purchase resources and materials needed for the implementation phase. Finally, in phase 4 (Implementation), individual characteristics, such as prior experience leading staff trainings, were found to be a significant facilitator, as agencies began to implement training plans prior to implementation of the EBP. Overall, these findings indicate that a number of determinants were particularly significant depending on the purpose and activities of specific phases of the ACT SMART Implementation Toolkit. Conversely, other determinants, such as time constraints and staff buy-in, were present across all phases of the toolkit, suggesting that certain factors acted as determinants to toolkit use regardless of the adapted EPIS phase.

Limitations. There are several limitations to consider in this study.
Firstly, due to the small sample size of the ACT SMART pilot study, the generalizability of these data is limited. Additionally, although six agencies took part in the pilot study, only five agencies completed all five phases of the toolkit. Interviews were conducted with the agency that chose not to implement an EBP; however, this agency only completed two phases of the pilot study, thus limiting the amount of data from one particular agency. Future studies examining the utilization of implementation guides should aim to include a larger sample size, when possible. Importantly, two interview recordings (agency 1, end of phase 4 and end of pilot) are missing from the analysis, further limiting the amount of data utilized in this paper.

Additionally, data from this study were collected a few years prior to this analysis, and in a different state. As a result, researchers did not have the ability to conduct data checking with implementation teams at the participating agencies following qualitative analysis. Due to the subjective nature of qualitative analysis, the lack of data checking may lead to biased interpretations of interview data. However, by utilizing the methods outlined as part of the Standards for Reporting Qualitative Research (SRQR; O'Brien et al., 2014), the researchers aimed to mitigate the impact of their biases on this qualitative analysis.

With regard to the implementation frameworks utilized, it is important to note that coders were not able to examine factors impacting perceived usefulness and perceived ease of use within the TAM3. The TAM3 posits that factors such as individual differences, system characteristics, social influence, and facilitating conditions influence perceptions regarding the ease of use and usefulness of a technology product. However, these factors were not explored within the end of phase or end of pilot interviews. As a result, factors impacting these perceptions of the website were not explored in detail, thus limiting the understanding of IT perspectives regarding the website. Future research utilizing technology products within an implementation guide should aim to explore factors impacting perceptions of the technology product.

Future Directions. Overall, findings from this study may be utilized in future research examining the utilization of the ACT SMART Implementation Toolkit, specifically, and implementation guides, generally. In terms of the ACT SMART Implementation Toolkit, these findings highlight significant barriers to the utilization of this toolkit (i.e., website issues, time constraints) that may be addressed in future studies examining its use. As well, findings indicate several salient facilitators (i.e., facilitation teams, phase-specific activities) that should be included in future iterations of the toolkit. Finally, the independent coders analyzing these data identified a number of suggestions for improving the ACT SMART Toolkit, based on implementation team member perspectives. Future directions may include a study to further examine these suggestions, and findings from such studies may be utilized to inform changes and improvements to the ACT SMART Implementation Toolkit. Future research may also utilize findings from this study to inform the development or improvement of other implementation guides utilized in community-based agencies, or other web-based implementation toolkits.

Conclusion.
This paper illustrates frequent and salient barriers and facilitators to the utilization of the ACT SMART Implementation Toolkit within ASD for-profit, community-based agencies. Although these findings are specific to the ACT SMART Implementation Toolkit, a number of the factors identified as barriers and facilitators in the present study are consistent with previous studies that have illustrated similar determinants to the implementation of innovations in community-based settings. As such, these findings may inform the development, refinement, and broader utilization of implementation guides in such settings. Overall, these findings illustrate numerous areas for improving the ACT SMART Implementation Toolkit that may be addressed in future studies, with the ultimate aim of increasing the uptake of evidence-based practices in community agencies providing services to children with autism spectrum disorder.

APPENDICES

Appendix A. End of Phase Interview Guide

ACT SMART End of Phase Interview Guide

I will be asking you a few questions today about your perceptions of the feasibility, acceptability, and usefulness of phase _______ of the ACT SMART toolkit.

A. ACT SMART Toolkit
1. How practical was it to complete phase ____ of the ACT SMART toolkit? By Phase ____, I mean reading the website content, completing activities, and how the website functioned.
(1 = Not at all; 2 = Slightly; 3 = Moderately; 4 = Very; 5 = Extremely)
2. Why did you give it that score? (Optional: What would need to be different to give it a higher score? Was there anything that wasn't practical or could be improved?)
3. How useful was this phase of the ACT SMART toolkit? Again, I mean reading the website content, completing activities, and how the website functioned.
(1 = Not at all; 2 = Slightly; 3 = Moderately; 4 = Very; 5 = Extremely)
4. Why did you give it that score? (Optional: What would need to be different to make this phase more useful? Was there anything that wasn't useful or could be improved?)
5. How satisfied were you with this phase of the ACT SMART toolkit?
(1 = Strongly Dissatisfied; 2 = Dissatisfied; 3 = Neutral; 4 = Satisfied; 5 = Strongly Satisfied)
6. Why did you give that score? (Optional: Is there anything that can be done to improve your satisfaction score? Why do you say that? Can you tell me why that is?)

B. ACT SMART Training Model
7. What did you think about the facilitation meetings that took place during this phase? (What made you satisfied with them? What made them practical or useful?)

C. Impact of ACT SMART on agency
8. What kind of changes have you seen at your agency since the beginning of ACT SMART? (Optional: None, why is that?)

D. Recommendations
9. Are there any changes that you would recommend for the toolkit and the facilitation meetings for this phase?

Appendix B. End of Pilot Study Interview Guide

ACT SMART End of Study Interview Guide

I will be asking you a few questions today about how feasible, acceptable and useful the ACT SMART toolkit and facilitation meetings were to you, overall.

A. ACT SMART Toolkit
a. How useful was the ACT SMART toolkit? (Probe for website overall & web-based activities for each of these questions)
  a. How feasible was the ACT SMART toolkit?
  b. How satisfied were you with the ACT SMART toolkit?
b. How challenging was the ACT SMART toolkit? (Probe for website overall & web-based activities)
  a. Was there anything about your agency/agency site that made it challenging to use?
  b. Was there anything about your implementation team that made it challenging to use?

B. ACT SMART Training Model
c. How useful was the ACT SMART orientation training?
d. What were your thoughts about the facilitation meetings?
e. How challenging were the ACT SMART facilitation meetings?
  a. How difficult was it to complete the facilitation meeting action steps?
  b. How difficult was it for you to schedule and attend the orientation training and facilitation meetings?

C. Impact of ACT SMART on agency
f. How would you know if ACT SMART were successful at your agency site? How would you measure success?
g. Has there been any value to using ACT SMART, to you or your agency site? In what way?
h. What changes have you observed at your agency site since you began using the ACT SMART toolkit?
  a. How has your agency's process changed when you are thinking about doing something new?
  b. How has your knowledge and skills to adopt new research-based treatments changed?
  c. How have your skills to adapt research-based treatments for use within your agency changed?
  d. How has your ability to identify specific strategies to implement the use of new research-based treatments changed?
  e. How has your ability to use specific strategies to support the use of new treatments changed?

D. Future use of ACT SMART Toolkit
i. Would you be interested in continuing to use ACT SMART at your agency?
j. Would you buy ACT SMART for your agency, if it were for purchase?
  a. How much would your agency site be willing to pay, if it were for purchase?

E. Recommendations
k. What changes would you recommend for the ACT SMART toolkit and facilitation meetings? (Probe for website overall & web-based activities)

Appendix C. Code book

Code/Subcode: Description/Examples

1. Facilitators to Use of ACT SMART: General factors or characteristics facilitating the use of ACT SMART. May be related to factors/activities specific to ACT SMART that are not phase-specific. E.g., ACT SMART is intuitive, functional, straightforward.
1a. Website: Factors specifically related to the website that facilitated the use of ACT SMART.
1ai. Perceived Ease of Use: Comments regarding how easy it was to use the website (e.g., comments like "user-friendly" or "easy to navigate").
1aii. Perceived Usefulness: Comments about how useful the website was (e.g., the website contents were really useful/helpful, the resources on the website were useful).
1b. Facilitation Team: Any responses regarding the FT members or their responsiveness as facilitators to ACT SMART. Includes responses re: FT being responsive to requests from the agency, or needs of the agency, and that the responsiveness to these requests/needs were facilitators to ACT SMART.
1bi. FT Meetings: Comments specific to the FT meetings. E.g., perceived as useful, scheduling was flexible, agenda prepared ahead of time, provided agency with information, content, ideas.
1c. Phase Specific Facilitators to Use of ACT SMART: Phase specific factors facilitating the use of ACT SMART. This includes any activities specific to a single phase of ACT SMART. E.g., Phase 1: agency assessment (embedded into activities, useful, mapped out next steps, aligned with staff vision, easy to complete); Phase 3: developing training plan was helpful to agency.
1ci. Phase 1; 1cii. Phase 2; 1ciii. Phase 3; 1civ. Phase 4; 1cv. Phase 5
1d. Inner Context Factors as Facilitators to Use of ACT SMART: Specific inner context factors that facilitated ACT SMART use (Moulin et al., 2019).
1di. Individual Characteristics: Characteristics of the staff members that facilitate AS use (e.g., previous experience in a related field, such as with implementation work, with the EBP, with autism, or with research).
1dii. Organizational characteristics: Staff working well together or other comments regarding agency culture that facilitate AS use.
1e. Additional Resources from ACT SMART: Additional resources provided from ACT SMART (e.g., access to other agencies, literature). Do not code if FT provides agency with AS resources that were supposed to be provided (e.g., paper version of budget form).
2. Barriers to Use of ACT SMART: General factors hindering the use of ACT SMART. May be related to factors/activities specific to ACT SMART.
2a. Website Issues: Factors specifically related to the website that impeded the use of ACT SMART. E.g., tech glitches.
2ai. Perceived Ease of Use: Comments regarding how hard it was to use the website (e.g., comments like "not user-friendly," "difficult to navigate," or "not easy to access"; includes comments re: not remembering password).
2aii. Perceived Usefulness: Comments about the website not being useful/needed (e.g., the website contents were not helpful, the resources on the website were not useful).
2b. Perceived Lack of Resources: Comments regarding the lack of appropriate or helpful resources that IT members believed would be provided from ACT SMART. This should be distinct from resources that come from inside the agency or are specific to the agency, and focus on resources that agencies believed ACT SMART was responsible for providing or did not provide but would have facilitated ACT SMART use. E.g., ACT SMART did not provide access to related journals.
2c. Phase Specific Barriers: Factors specific to a phase that impeded ACT SMART use. E.g., issues with budget worksheet (phase 2); comments re: staff training (the activity) as a barrier to completing ACT SMART, specific to a phase.
2ci. Phase 1; 2cii. Phase 2; 2ciii. Phase 3; 2civ. Phase 4; 2cv. Phase 5
2d. Inner Context Factors as Barriers: Specific inner context factors that impeded ACT SMART use. E.g., agency members too busy, limited access to resources needed that ACT SMART is not responsible for providing, agency being located far away from AS.
2di. Time Constraints: Comments regarding staff members being too busy or not having time to complete any aspect of AS.
2dii. Scheduling: Comments regarding scheduling being difficult as a barrier to AS.
2diii. Funding: Comments regarding issues related to funding as impeding ability to complete AS (e.g., funding limiting access to EBP materials, thus slowing down AS progression).
2div. Organizational characteristics: Comments regarding the characteristics of the agency that impede ability to complete activities for AS (i.e., productivity expectations, location, IT problems at agency).
2e. ACT SMART not tailored to agencies: Responses discussing ACT SMART being general or not tailored to agencies and their specific needs as a barrier to ACT SMART.
2f. Lack of Responsiveness from FT: Comments from IT members regarding asking FT for resources or giving them feedback about an issue (e.g., website) but not hearing back/issues not getting fixed.
3. General Inner Context Factors: General inner context factors that neither facilitated nor impeded the use of ACT SMART and are not a result of ACT SMART use. May include changes to agency since ACT SMART began. Examples: staff unaware of ACT SMART; staff excited about ACT SMART; less turnover; new trainings; at beginning of pilot/already existing (consultants felt they didn't have enough support; agency undergoing leadership changes).
4. Barriers to EBP Implementation: Any comments on factors hindering EBP implementation, including inner context factors.
5. Facilitators to EBP Implementation: Any comments on factors facilitating EBP implementation, including inner context factors (EBP factors: adaptability & flexibility).
6. Suggestions to Improve ACT SMART Toolkit: Comments re: suggestions to improve the toolkit overall.
6a. Suggestions to improve FT Meetings: Suggestions to improve facilitation meetings. E.g., spacing out facilitation meetings; FT meetings to be based on agency needs (higher frequency of FT meetings if needed for that specific phase; FT sends summary to agency instead of in-person meeting, or other ways to disseminate information).
6b. Suggestions to improve resources: Suggestions to improve resources. E.g., access to relevant literature.
6c. Suggestions to improve website: Suggestions to improve website.
6d. Phase Specific Suggestions: Suggestions to improve ACT SMART that are specific to a phase and/or its activities.
6di. Phase 1; 6dii. Phase 2; 6diii. Phase 3; 6div. Phase 4; 6dv. Phase 5
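Although the study applied this codebook by hand within MAXQDA, a hierarchical codebook of this kind can also be represented as a simple data structure, for example to check that labels applied during coding actually exist in the codebook. The Python sketch below is hypothetical, shows only a subset of the codes above, and is not part of the thesis materials.

```python
# Illustrative only: a partial, hypothetical representation of the codebook
# above, used here to validate applied code labels. Codes 3-6 and the
# phase-specific subcodes would be added in the same way.
CODEBOOK = {
    "1. Facilitators to Use of ACT SMART": {
        "1a. Website": ["1ai. Perceived Ease of Use", "1aii. Perceived Usefulness"],
        "1b. Facilitation Team": ["1bi. FT Meetings"],
    },
    "2. Barriers to Use of ACT SMART": {
        "2a. Website Issues": ["2ai. Perceived Ease of Use", "2aii. Perceived Usefulness"],
        "2d. Inner Context Factors as Barriers": [
            "2di. Time Constraints", "2dii. Scheduling",
            "2diii. Funding", "2div. Organizational characteristics"],
    },
}

def all_labels(book: dict) -> set:
    """Flatten the nested codebook into the set of valid code labels."""
    labels = set()
    for top_code, subcodes in book.items():
        labels.add(top_code)
        for subcode, leaves in subcodes.items():
            labels.add(subcode)
            labels.update(leaves)
    return labels

valid = all_labels(CODEBOOK)
for label in ["2dii. Scheduling", "2x. Not a real code"]:
    print(label, "->", "ok" if label in valid else "NOT IN CODEBOOK")
```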
REFERENCES

American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders (5th ed.). Arlington, VA: American Psychiatric Association.

Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research, 38(1), 4-23. doi:10.1007/s10488-010-0327-7

Aarons, G. A., Wells, R. S., Zagursky, K., Fettes, D. L., & Palinkas, L. A. (2009). Implementing evidence-based practice in community mental health agencies: A multiple stakeholder analysis. American Journal of Public Health, 99(11), 2087-2095. doi:10.2105/ajph.2009.161711

Baio, J., Wiggins, L., Christensen, D. L., Maenner, M. J., Daniels, J., Warren, Z., … & Dowling, N. F. (2018). Prevalence of autism spectrum disorder among children aged 8 years - autism and developmental disabilities monitoring network, 11 sites, United States, 2014. MMWR Surveillance Summary, 67, 1-23.

Braun, V., & Clarke, V. (2012). Thematic analysis. In H. Cooper, P. M. Camic, D. L. Long, A. T. Panter, D. Rindskopf, & K. J. Sher (Eds.), APA handbook of research methods in psychology, Vol. 2: Research designs: Quantitative, qualitative, neuropsychological, and biological (pp. 57-71). Washington, DC: American Psychological Association. doi:10.1037/13620-004

Brookman-Frazee, L., Baker-Ericzén, M., Stadnick, N., & Taylor, R. (2012). Parents' perspectives on community-based mental health services for children with autism spectrum disorders. Journal of Child and Family Studies, 21(4), 533-544.

Brookman-Frazee, L., Drahota, A., Stadnick, N., & Palinkas, L. A. (2012). Therapist survey on community mental health services for autism spectrum disorders. PsycTESTS Dataset. doi:10.1037/t38826-000

Brownson, R. C., Colditz, G. A., & Proctor, E. K. (2018). Dissemination and implementation research in health: Translating science to practice. Oxford: Oxford University Press.

Christon, L. M., Arnold, C. C., & Myers, B. J. (2015). Professionals' reported provision and recommendation of psychosocial interventions for youth with autism spectrum disorder. Behavior Therapy, 46(1), 68-82.

Cidav, Z., Lawer, L., Marcus, S. C., & Mandell, D. S. (2013). Age-related variation in health service use and associated expenditures among children with autism. Journal of Autism and Developmental Disorders, 43(4), 924-931.

Clarke, V., & Braun, V. (2014). Thematic analysis. Encyclopedia of Critical Psychology, 1947-1952. doi:10.1007/978-1-4614-5583-7_311

Dawson, G., Rogers, S., Munson, J., Smith, M., Winter, J., Greenson, J., … Varley, J. (2009). Randomized, controlled trial of an intervention for toddlers with autism: The Early Start Denver Model. Pediatrics, 125(1). doi:10.1542/peds.2009-0958

Dingfelder, H. E., & Mandell, D. S. (2011). Bridging the research-to-practice gap in autism intervention: An application of diffusion of innovation theory. Journal of Autism and Developmental Disorders, 41(5), 597-609. doi:10.1007/s10803-010-1081-0

Drahota, A., Chlebowski, C., Stadnick, N., Baker-Ericzén, M. J., & Brookman-Frazee, L. (2017). The dissemination and implementation of behavioral treatments for anxiety in ASD. In C. M. Kerns, P. Renno, E. A. Storch, P. C. Kendall, & J. J. Wood (Eds.), Anxiety in children and adolescents with autism spectrum disorder: Evidence-based assessment and treatment. Atlanta, GA: Elsevier.

Drahota, A., Meza, R., & Martinez, J. I. (2014). The Autism-Community Toolkit: Systems to Measure and Adopt Research-Based Treatments. www.actsmarttoolkit.com

Drahota, A., Meza, R. D., Bustos, T. E., Sridhar, A., Martinez, J. I., Brikho, B., Stahmer, A. C., & Aarons, G. A. (in review). Implementation-as-usual in community-based organizations providing specialized services to individuals with autism spectrum disorder: A mixed methods study. Administration and Policy in Mental Health and Mental Health Services Research.

Eisenman, D. P., Adams, R. M., Lang, C. M., Prelip, M., Dorian, A., Acosta, J., … Chinman, M. (2018). A program for local health departments to adapt and implement evidence-based emergency preparedness programs. American Journal of Public Health, 108(S5). doi:10.2105/ajph.2018.304535

Estes, A., Munson, J., Rogers, S. J., Greenson, J., Winter, J., & Dawson, G. (2015). Long-term outcomes of early intervention in 6-year-old children with autism spectrum disorder. Journal of the American Academy of Child & Adolescent Psychiatry, 54(7), 580-587. doi:10.1016/j.jaac.2015.04.005

Gomez, E., Drahota, A., & Stahmer, A. C. (2018). Choosing strategies that work from the start: A mixed methods study to understand effective development of community-academic partnerships. Action Research. doi:10.1177/1476750318775796

Harvey, G., Llewellyn, S., Maniatopoulos, G., Boyd, A., & Procter, R. (2018). Facilitating the implementation of clinical technology in healthcare: What role does a national agency play? BMC Health Services Research, 18(1). doi:10.1186/s12913-018-3176-9

Landrum, B., & Garza, G. (2015). Mending fences: Defining the domains and approaches of quantitative and qualitative research. Qualitative Psychology, 2, 199-209. doi:10.1037/qup0000030

Langley, A. K., Nadeem, E., Kataoka, S. H., Stein, B. D., & Jaycox, L. H. (2010). Evidence-based mental health programs in schools: Barriers and facilitators of successful implementation. School Mental Health, 2(3), 105-113. doi:10.1007/s12310-010-9038-1

Miles, M. B., Huberman, A. M., & Saldaña, J. (2013). Qualitative data analysis: A methods sourcebook. Los Angeles: SAGE.

Moulin, J. C., Dickson, K. S., Stadnick, N. A., Rabin, B., & Aarons, G. A. (2019). Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implementation Science, 14(1). doi:10.1186/s13012-018-0842-6

National Autism Center. (2015). Findings and conclusions: National standards project, phase 2. Randolph, MA: Author.

National Research Council. (2001). Educating children with autism. National Academies Press.

Nilsen, P. (2015). Making sense of implementation theories, models and frameworks. Implementation Science, 10(1). doi:10.1186/s13012-015-0242-0

Novins, D. K., Green, A. E., Legha, R. K., & Aarons, G. A. (2013). Dissemination and implementation of evidence-based practices for child and adolescent mental health: A systematic review. Journal of the American Academy of Child & Adolescent Psychiatry, 52(10). doi:10.1016/j.jaac.2013.07.012

O'Brien, B. C., Harris, I. B., Beckman, T. J., Reed, D. A., & Cook, D. A. (2014). Standards for reporting qualitative research. Academic Medicine, 89(9), 1245-1251. doi:10.1097/acm.0000000000000388

Onwuegbuzie, A. J., Dickinson, W. B., Leech, N. L., & Zoran, A. G. (2009). A qualitative framework for collecting and analyzing data in focus group research. International Journal of Qualitative Methods, 1-21. doi:10.1177/160940690900800301

Paynter, J. M., & Keen, D. (2014). Knowledge and use of intervention practices by community-based early intervention service providers. Journal of Autism and Developmental Disorders, 45(6), 1614-1623. doi:10.1007/s10803-014-2316-2

Paynter, J. M., Ferguson, S., Fordyce, K., Joosten, A., Paku, S., Stephens, M., … Keen, D. (2016). Utilisation of evidence-based practices by ASD early intervention service providers. Autism, 21(2), 167-180. doi:10.1177/1362361316633032

Pickard, K. E., Kilgore, A. N., & Ingersoll, B. R. (2016). Using community partnerships to better understand the barriers to using an evidence-based, parent-mediated intervention for autism spectrum disorder in a Medicaid system. American Journal of Community Psychology, 57(3-4), 391-403. doi:10.1002/ajcp.12050

Pickard, K., Meza, R., Drahota, A., & Brikho, B. (2018). They're doing what? A brief paper on service use and attitudes in ASD community-based agencies. Journal of Mental Health Research in Intellectual Disabilities, 11(2), 111-123. doi:10.1080/19315864.2017.1408725

Proctor, E. K., Landsverk, J., Aarons, G., Chambers, D., Glisson, C., & Mittman, B. (2008). Implementation research in mental health services: An emerging science with conceptual, methodological, and training challenges. Administration and Policy in Mental Health and Mental Health Services Research, 36(1), 24-34. doi:10.1007/s10488-008-0197-4

Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., … Hensley, M. (2011). Outcomes for implementation research: Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and Mental Health Services Research, 38(2), 65-76. doi:10.1007/s10488-010-0319-7

Reichow, B., Barton, E. E., Boyd, B. A., & Hume, K. (2012). Early intensive behavioral intervention (EIBI) for young children with autism spectrum disorders (ASD). Cochrane Database of Systematic Reviews, 10, Article CD009260. doi:10.1002/14651858.CD009260.pub2

Saldaña, J. (2013). The coding manual for qualitative researchers (2nd ed.). London: Sage Publications.

Sanayei, A., Ansari, A., & Ranjbarian, B. (2010). A hybrid technology acceptance approach for using the E-CRM information system in clothing industry. International Journal of Information Science and Management, 15-25.

Scaccia, J. P., Cook, B. S., Lamont, A., Wandersman, A., Castellow, J., Katz, J., & Beidas, R. S. (2015). A practical implementation science heuristic for organizational readiness: R = MC2. Journal of Community Psychology, 43(4), 484-501. doi:10.1002/jcop.21698

Shin, K. R., Kim, M. Y., & Chung, S. E. (2009). Methods and strategies utilized in published qualitative research. Qualitative Health Research, 19(6), 850-858. doi:10.1177/1049732309335857

Sridhar, A., & Drahota, A. (2020, May). Facilitating EBP implementation in community-based ASD agencies: Clinical effectiveness of the ACT SMART Implementation Toolkit. Poster submitted to International Society for Autism Research (INSAR), Seattle, WA.

Stahmer, A. C., Dababnah, S., & Rieth, S. R. (2019). Considerations in implementing evidence-based early autism spectrum disorder interventions in community settings. Pediatric Medicine, 2, 18. doi:10.21037/pm.2019.05.01

Strauss, A., & Corbin, J. (1990). Basics of qualitative research: Grounded theory procedures and techniques. Newbury Park, CA: Sage.

Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks, CA: Sage.

Tabak, R. G., Khoong, E. C., Chambers, D. A., & Brownson, R. C. (2012). Bridging research and practice: Models for dissemination and implementation research. American Journal of Preventive Medicine, 43(3), 337-350. doi:10.1016/j.amepre.2012.05.024

Thematic analysis: A reflexive approach. (n.d.). Retrieved from https://www.psych.auckland.ac.nz/en/about/our-research/research-groups/thematic-analysis.html

Venkatesh, V., & Bala, H. (2008). Technology Acceptance Model 3 and a research agenda on interventions. Decision Sciences, 39(2), 273-315. doi:10.1111/j.1540-5915.2008.00192.x

Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46, 186-204.

Venkatesh, V., & Morris, M. G. (2000). Why don't men ever stop to ask for directions? Gender, social influence, and their role in technology acceptance and usage behavior. MIS Quarterly, 24, 115-139.

Wicks, D. (2017). [Review of the book The coding manual for qualitative researchers (3rd ed.), by J. Saldaña]. Qualitative Research in Organizations and Management: An International Journal, 12(2), 169-170. doi:10.1108/qrom-08-2016-1408

Wong, C., Odom, S. L., Hume, K. A., Cox, A. W., Fettig, A., Kucharczyk, S., … Schultz, T. R. (2015). Evidence-based practices for children, youth, and young adults with autism spectrum disorder: A comprehensive review. Journal of Autism and Developmental Disorders, 45(7), 1951-1966. doi:10.1007/s10803-014-2351-z

Wood, J. J., McLeod, B. D., Klebanoff, S., & Brookman-Frazee, L. (2015). Toward the implementation of evidence-based interventions for youth with autism spectrum disorders in schools and community agencies. Behavior Therapy, 46(1), 83-95. doi:10.1016/j.beth.2014.07.003

Xu, G., Strathearn, L., Liu, B., & Bao, W. (2018). Corrected prevalence of autism spectrum disorder among US children and adolescents. JAMA, 319(5), 505. doi:10.1001/jama.2018.0001