IMPLEMENTATION STRATEGY MAPPING METHODS: PILOTING CONCEPT MAPPING WITHIN COMMUNITY MENTAL HEALTH AGENCIES PROVIDING SERVICES TO AUTISTIC YOUTH

By

Aksheya Sridhar

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Psychology – Doctor of Philosophy

2024

ABSTRACT

Community mental health (CMH) agencies in Michigan are integral to providing services to autistic children experiencing socioeconomic disadvantage. However, CMH agencies utilize evidence-based interventions, such as Project ImPACT, at a significantly lower frequency and intensity than is recommended to improve outcomes. There is a critical need to investigate methods to systematically increase the adoption and delivery of interventions such as Project ImPACT within CMH agencies. Concept mapping has been identified as an Implementation Strategy Mapping Method (ISMM) to elicit stakeholder perspectives, identify context-specific implementation determinants, and select and tailor implementation strategies that map on to each determinant, in an effort to facilitate implementation. This study aimed to evaluate (a) the impact of concept mapping on organizational readiness to change and (b) the feasibility, acceptability, appropriateness, and usability of concept mapping as an ISMM in CMH agencies. This study followed a sequential explanatory (quan → QUAL) mixed methods design. Four CMH agencies across Michigan participated; five staff members (agency leaders, clinical supervisors, direct providers) within each agency participated in pre-concept mapping questionnaires, concept mapping, and post-concept mapping questionnaires. Questionnaire data included demographic information, implementation barriers and facilitators, organizational readiness to change, and end-user evaluations of concept mapping. The concept mapping process included brainstorming, sorting, and ranking implementation strategies on their importance and feasibility in addressing agency-specific implementation barriers. Lastly, 15 participants completed a semi-structured interview to further describe perspectives on the impact of concept mapping on organizational readiness as well as the feasibility, acceptability, appropriateness, and usability of concept mapping. Paired samples t-tests did not indicate significant changes in organizational motivation or capacity to change at any of the participating agencies. Concept mapping results highlighted implementation strategies that were ranked as important and feasible at each of the participating agencies. The majority of implementation strategies were selected from the Expert Recommendations for Implementing Change (ERIC) list of implementation strategies. Common strategies involved training, supervision, developing an implementation plan, and engaging patients/consumers in the process. Lastly, average ratings of end-user evaluations indicated high levels of acceptability, feasibility, appropriateness, and usability of concept mapping. Qualitative findings indicated that participants most often discussed the feasibility, acceptability, and usability of concept mapping. Three themes were identified: End-user Evaluations, Organizational Readiness, and Mapping Strategies. Qualitative codes explained factors that influenced perceptions of ISMM end-user evaluations, factors that impacted organizational readiness, and beliefs regarding how implementation strategies mapped on to agency-specific barriers.
Quantitative and qualitative data were merged in a joint display to illustrate how perceptions of organizational readiness and ISMM end-user evaluations converged or diverged across both data strands. Overall, study findings indicate that concept mapping is a promising method for selecting and tailoring implementation strategies within CMH agencies serving autistic youth, in an effort to facilitate successful implementation and increase service equity for this population.

ACKNOWLEDGMENTS

I am extremely grateful to the people who helped make this work possible. My advisor, Dr. Amy Drahota, has been a supportive and dedicated mentor to me as I worked on this project, and continues to help me grow as a researcher, writer, and academic. I am also grateful to my committee members, Drs. Brooke Ingersoll, James Dearing, and Blair Burnette, as well as my advisors/consultants, Drs. Gregory Aarons and Miles McNall, for providing their expertise as I embarked on this project. I could not have completed this work without the help of Ola Olusegun and Claudia Seiler, previously undergraduate volunteers in our lab, who dedicated hours of their time to this project. Additionally, I am grateful to Autism Speaks for funding this project. Finally, I am incredibly grateful for the unwavering support of my friends and family. This work is dedicated to my Amma & Appa, Smurf, Dylan, and Otis.

TABLE OF CONTENTS

Introduction
Method
Quantitative Results
Qualitative Results
Discussion
REFERENCES
APPENDIX A. PRE MEASURE
APPENDIX B. POST MEASURE
APPENDIX C. CODEBOOK
APPENDIX D. FINAL IMPLEMENTATION STRATEGIES SELECTED

Introduction

Autism Spectrum Disorder (ASD) is a pervasive neurodevelopmental disorder that is estimated to impact 1.8% of the U.S. population (American Psychiatric Association, 2013). Core symptoms of ASD include restricted and/or repetitive behaviors as well as social communication deficits, such as difficulty with social-emotional reciprocity, nonverbal communication, and developing and maintaining relationships (American Psychiatric Association, 2013). Social communication skills in particular are fundamental for the later development of language and other important developmental skills (Ingersoll, 2011). For example, foundational skills such as joint attention (i.e.
use of gaze, gestures, language to share information with others) and imitation are critical for social engagement, learning, and social acceptance (Schreibman et al., 2015). Moreover, these skills are associated with long-term outcomes, such as independence in adulthood (Howlin, 2004). Overall, core symptoms of ASD, specifically social communication skills, are a key area for intervention in young children on the autism spectrum in order to support the later development of important social and developmental skills.

In addition to the importance of improving these core symptoms of ASD, individuals on the autism spectrum often face significant health disparities in access to care. Social determinants of health, such as racial/ethnic minority status and socioeconomic disadvantage, are key predictors of the receipt of evidence-based practices for autistic children (Bishop-Fitzpatrick & Kind, 2017). There is a vast literature that consistently highlights disparities in access to early and accurate diagnoses, specialist services, and evidence-based, high-quality care for autistic individuals from marginalized backgrounds (Constantino et al., 2020; Dallman et al., 2020; Magaña et al., 2013; Mandell et al., 2009; Smith et al., 2020; Zeleke et al., 2019). These disparities are particularly significant for racial and ethnic minorities who are also experiencing socioeconomic disadvantage. For example, Medicaid-enrolled children who are Black, Native American/Pacific Islander, or Asian receive fewer outpatient autism services compared to white autistic children (Bilaver et al., 2021). Furthermore, families experiencing socioeconomic disadvantage and families without insurance report less early and continuous access to care (Liptak et al., 2008). These social determinants of health continue to impact autistic individuals throughout their lives and are associated with limited access to healthcare and worse physical health outcomes in adulthood (Bishop-Fitzpatrick & Kind, 2017). Overall, it is vital to address health disparities for autistic populations by advancing equity in access to interventions that improve core symptoms, long-term outcomes, and ultimately, quality of life for autistic individuals.

Autism Interventions

Currently, there are over twenty evidence-based interventions (EBIs) that aim to improve core and co-occurring symptoms for autistic individuals. EBIs are defined as interventions with evidence to indicate that the intervention yields positive outcomes or results, and that these findings have been demonstrated through high-quality research studies (Steinbrenner et al., 2020; National Autism Center, 2015). Many of these interventions utilize applied behavior analysis (ABA) strategies to teach young autistic children a range of skills across settings. ABA interventions typically focus on improving language, social, and academic skills, as well as decreasing the frequency of challenging behaviors. However, researchers began to note limitations to early forms of ABA, as these interventions did not always lead to generalizability of skills across environments and did not include autistic children as active participants in these interventions (Schreibman & Koegel, 2005). Thus, more recent versions of ABA often involve caregivers of young children utilizing behavioral strategies in order to improve their child's behavior within the home and other settings (Schreibman et al., 2015).
Nonetheless, there continued to be a need to engage children as active participants within these interventions, rather than utilizing adult-led interventions alone. Naturalistic Developmental Behavioral Interventions (NDBIs) are a group of efficacious interventions that utilize behavioral principles, integrate these principles with developmental psychology, and engage children as active participants in their learning and intervention experiences (Schreibman et al., 2015). NDBIs focus on supporting foundational skill development (e.g. joint attention, imitation, play) through the use of behavioral strategies within the context of natural environments and daily routines. These strategies are used during child-led or child-preferred activities or routines to increase the child’s motivation and to utilize natural reinforcements (Schreibman et al., 2015). Additionally, NDBIs often include a caregiver-training component that provides families with the opportunity to co-develop goals for their children and to develop greater independence in utilizing a range of behavioral strategies to support their autistic children (Dueñas et al., 2023). In addition to improving child and caregiver-related outcomes, NDBIs that incorporate caregiver training also provide children with increased intervention dosage and greater generalization of skills, as caregivers are able to practice using behavioral strategies across a number of different settings (Green et al., 2010; Ingersoll & Dvortcsak, 2010). Project ImPACT is one example of a manualized, evidence-based, parent-mediated NDBI that has been found to improve social communication skills in young autistic children. This intervention utilizes both developmental and naturalistic behavioral techniques to improve social engagement, language, imitation, and play skills. Research indicates that this intervention leads to significant improvements in communication skills, such as greater language acquisition over 3 the duration of the intervention. Additionally, Project ImPACT involves coaching caregivers to utilize these strategies with their children across different settings and during a variety of routines. Caregivers involved in this intervention show high adherence to the strategies included in Project ImPACT, and report less parental stress after utilizing this intervention (Ingersoll et al., 2016). Although NDBIs, including Project ImPACT, show promising signs of effectiveness for improving social communication, play, and language skills for autistic children, evidence illustrates a significant research-to-practice gap in the use of parent-training interventions for autistic children within community settings (Straiton et al., 2020). Specifically, interventions developed in lab settings are implemented and utilized with lower frequency and intensity in community-based settings than is recommended by intervention developers to improve outcomes (Brookman-Frazee et al., 2012; Straiton et al., 2021). There are several barriers that prevent NDBIs like Project ImPACT from being utilized more broadly in community settings, including provider-level barriers (e.g. lack of training), intervention-level characteristics (e.g. intervention complexity), organizational-level barriers (e.g. lack of funding) and systemic barriers (e.g. insurance coverage for NDBIs) (Dueñas et al., 2023; Straiton et al., 2021). 
Additionally, research indicates that Board Certified Behavior Analysts and other service providers for this population are unfamiliar with and receive limited training in NDBIs (Dueñas et al., 2023; Hampton & Sandbank, 2022). Overall, NDBIs have been found to improve a range of key developmental skills for autistic children and empower caregivers and families in supporting their children's development. Therefore, there is an ethical need to prioritize efforts to overcome barriers to implementing these effective interventions in community-based settings; these efforts may enhance the wide-scale implementation and broaden the availability of NDBIs that support immediate and long-term growth of autistic children (D'Agostino et al., 2023; Estabrooks et al., 2018; Gopichandran et al., 2016).

Community Mental Health Agencies

State-based care systems are essential in providing services to autistic individuals, particularly through policies such as the Medicaid Home and Community-based Services (HCBS) waivers. These waivers were developed in order to expand eligibility criteria and coverage for home- and community-based autism services, given the high health care costs associated with an ASD diagnosis and the limited insurance coverage for many autism-related services (Barry et al., 2017). A scoping review of the impact of these policies provides preliminary evidence that Medicaid HCBS waivers have several benefits, including state and federal economic benefits, reduced unmet healthcare needs, increased likelihood of caregivers continuing to work, and reduced racial healthcare disparities (Leslie et al., 2017; McLean et al., 2021). Indeed, research indicates that families utilizing autism service mandates reported greater use of autism-related interventions than families who did not utilize these policies (Barry et al., 2017). Furthermore, HCBS waivers have cut unmet service needs of Black autistic individuals nearly in half by expanding autism intervention coverage to individuals who do not qualify for the typical Medicaid income cutoff (LaClair et al., 2019). Overall, state-based systems and the use of insurance mandates and Medicaid waivers are integral in addressing the impact of social determinants of health, such as low socioeconomic status, on the receipt of autism interventions.

Although state-based systems are key in providing autism services to families experiencing socioeconomic disadvantage, the research-to-practice gap impacting the use of NDBIs within community-based settings means that these interventions may not be implemented or sustained within this context. For example, community mental health (CMH) agencies in Michigan are the primary system for providing behavioral services to children on the autism spectrum who receive interventions via the Michigan Medicaid Autism Benefit. Notably, "all youth enrolled in the Medicaid Autism Benefit have a household income that is at or below 133% of the federal poverty level" (p. 3; Straiton et al., 2021). However, parent-training interventions such as Project ImPACT continue to have limited uptake within the context of CMH agencies in Michigan (Straiton et al., 2021). As a result, children experiencing socioeconomic disadvantage (e.g. children relying on the Medicaid Autism Benefit for intervention access) may receive these interventions at a substantially lower rate compared to children from less disadvantaged backgrounds (Bishop-Fitzpatrick & Kind, 2017; Straiton et al., 2021).
Therefore, in order to reduce the research-to-practice gap for NDBIs, there is a critical need to investigate methods that systematically increase the adoption and implementation of interventions, such as Project ImPACT, within community settings in an effort to increase service equity for autistic children experiencing socioeconomic disadvantage. Implementation Science Implementation science aims to increase the utilization of EBIs in community settings to promote equitable access to high quality care (Brownson et al., 2012). However, there is limited guidance on effective and systematic processes to implement EBI use and sustainment across settings. Moreover, mental health providers report a need for more specific and tailored implementation support for their organizations, particularly when implementing complex mental health EBIs for autistic youth (Stadnick et al., 2022). This is particularly important, as understanding end-user (i.e. mental health providers, organization staff, and other individuals involved in the implementation process) perspectives may allow for greater engagement during implementation processes (Bustos et al., 2021; Williams et al., 2020). Although studies indicate 6 that end-users report utilizing a range of implementation strategies to implement autism interventions, these strategies are not always used systematically, and may not be an appropriate fit for the specific setting and implementation determinants within that setting (Bustos et al., 2021; Drahota et al., 2021). There are myriad factors (i.e. determinants) that influence the implementation of autism interventions across settings and organizations, including implementation barriers (e.g. lack of provider knowledge or experience with autism interventions) as well as facilitating factors (e.g. therapist flexibility in tailoring interventions to meet client needs) (Adams & Young, 2020). Yet, there is a paucity of literature that guides researchers and practitioners to effectively address implementation determinants (Cheron et al., 2019), especially as these determinants are thought to be unique to a specific setting (Waltz et al., 2019). Implementation strategies (i.e. techniques that increase use and sustainment of interventions within a given setting) are utilized to facilitate the implementation process, ideally by addressing the determinants impacting implementation. Additionally, implementation strategies are purported to improve a range of outcomes, including: implementation- (e.g. increased intervention use within a setting), service- (e.g. greater service equity), and patient- outcomes (e.g. improved functioning) (Figure 1; Proctor et al., 2011). Figure 1. Conceptual model of implementation research 7 Implementation strategies are purported to address implementation determinants to facilitate EBI implementation in various settings and to have a cascading positive impact on organizations, services, and clients (Lau et al., 2015; E. Proctor et al., 2011). A large number of implementation strategies have been identified in extant literature. For example, the Expert Recommendations for Implementing Change (ERIC; Powell et al., 2012, 2015) is a commonly used list comprised of 73 distinct implementation strategies. Strategies address a range of domains, including financial (e.g. access new funding), supporting clinicians (e.g. conduct ongoing training), education (e.g. develop educational materials) and implementation process (e.g. 
develop formal implementation blueprint, develop tools for quality monitoring). However, research indicates that there may be difficulties when utilizing this comprehensive list of strategies with non-implementation scientists (e.g. healthcare providers involved in implementation within their organizations). For example, non-implementation scientists reported confusion due to the wording of implementation strategies, and difficulties with understanding concepts within implementation science due to the heavy use of jargon (Dorsey et al., 2023; Yakovchenko et al., 2023). Moreover, due to the large number of implementation strategies and length of the strategy list, participant burden has been identified as an additional barrier to utilizing the ERIC (Yakovchenko et al., 2023). In order to facilitate the use of implementation strategies, researchers posit that selecting strategies that are tailored to address the determinants within a specific setting may be particularly effective; tailored implementation strategies may be most likely to successfully support the implementation of a specific intervention within a given setting and timeline (Waltz et al., 2019). Thus far, research indicates varying levels of effectiveness when tailored implementation strategies are used in health care settings (Baker et al., 2015), and there is a continued need to evaluate the use of tailored implementation strategies, 8 as well as processes to select and tailor strategies to address implementation determinants (Powell et al., 2015, 2017). Studies indicate that CMH agencies providing services to children on the autism spectrum experience a range of context-specific implementation barriers (e.g. intervention complexity, lack of provider training, lack of resources) and facilitators (e.g. provider continuity and motivation) (Aarons et al., 2009; Adams & Young, 2020; Pickard et al., 2018; Stahmer & Aarons, 2009). However, research illustrating which implementation strategies may be most relevant and effective for increasing NDBI use within these settings remains limited. Moreover, processes regarding how to best select, generate, or identify tailored implementation strategies have not been evaluated or established within these settings (Sridhar, Olusegun, & Drahota, 2023). Overall, there is a lack of consensus and guidance in the literature regarding systematic methods for implementation processes within CMH agencies, including a lack of understanding around best practices for selecting and tailoring implementation strategies to address context- specific implementation determinants. Indeed, methods to select and tailor implementation strategies – Implementation Strategy Mapping Methods – to context-specific determinants remains an understudied, but high priority area, within the field of implementation science (Powell et al., 2019). Implementation Strategy Mapping Methods Implementation Strategy Mapping Methods (ISMMs) have been identified as one pre- implementation approach to: (1) elicit perspectives of individuals involved in implementation; (2) identify context-specific implementation determinants; and (3) select and tailor implementation strategies that map on to each determinant, in an effort to facilitate implementation. In a scoping review conducted to identify and describe ISMMs utilized within 9 child mental-health service delivery settings, six distinct methods were found (Sridhar et al., 2023). 
Common across methods, all six ISMMs involved a variety of participants, such as service providers, agency staff, and end-users, in activities to: identify implementation barriers, select or generate implementation strategies, sort and rank implementation strategies, and tailor or adapt implementation strategies for their needs. For further information and descriptions of these ISMMs, please refer to Table 1.

Table 1. ISMMs in Child Mental Health Settings, Scoping Review Findings

Innovation Tournament, Behavioral therapy for ADHD in CMH agencies in the U.S. [Sibley et al., 2021]
- Participants/end-users: Clinical staff (therapists, supervisors), organization staff (admin, office staff), and adolescent clients and their parents. Member checking was conducted with parents to confirm IS.
- Identifying determinants: Participants were asked to list barriers to implementation at their agency.
- Selecting IS: Participants were asked to "list as many (ideas) as you can think of to improve [X] barrier" and to generate ideas for IS.
- Tailoring IS: Participants were asked to generate ideas for strategies that use the identified change methods, so that they are tailored to specific determinants.
- Outcomes: 39 strategies were identified; 18 were ranked as important and feasible.

Concept Mapping, Speech and language therapy in preschools in Canada [Kwok et al., 2020]
- Participants/end-users: Clinicians, program representatives, and research team. Member checking was completed after the IS list was finalized.
- Identifying determinants: Barriers identified in a previous study.
- Selecting IS: Participants generated IS during the "brainstorming" phase, then ranked strategies on feasibility and importance.
- Tailoring IS: Researchers mapped IS onto behavior change techniques. Participants identified which barriers would be addressed by each strategy and identified relevant TDF domains.
- Outcomes: 282 strategies were generated; 13 were identified as important, feasible, and with evidentiary support for mechanism of action.

Modified Conjoint Analysis, Mental health interventions in secure and non-secure youth residential settings in the U.S. [Lewis et al., 2018]
- Participants/end-users: Operations staff, therapists, and directors/managers.
- Identifying determinants: Participants completed a needs assessment to identify and rank implementation barriers on importance and feasibility.
- Selecting IS: Strategies were selected using the ERIC and ranked by feasibility and importance.
- Tailoring IS: Each implementation strategy was matched with one or more barriers based on the implementation strategy's "potential mechanism of action".
- Outcomes: 36 strategies were matched to barriers and rated as important and feasible. Implementation teams and a blueprint were developed. No behavioral changes were found.

Focus Group, Current practices within U.S. pediatric community settings [Radovic et al., 2020]
- Participants/end-users: Primary care providers, practice managers, adolescents, and young adults.
- Identifying determinants: Determinants were identified during Timepoint 1 focus group discussions.
- Selecting IS: Researchers provided participants with strategy ideas.
- Tailoring IS: Participants were asked for feedback on each strategy during the Timepoint 2 focus group.
- Outcomes: Participants provided feedback. A blueprint and relevant materials were developed.

COAST-IS, Protocol to implement EBIs for youth with trauma-related emotional/behavioral difficulties in U.S. CMH agencies and child advocacy centers [Powell et al., 2020]
- Participants/end-users: Organization leaders and clinicians.
- Identifying determinants: A needs assessment will be used to identify determinants in alignment with EPIS.
- Selecting IS: Strategies will be selected using the ERIC and ranked by feasibility and importance.
- Tailoring IS: Participants will be asked to explain which barriers would be addressed by each strategy and why. Change methods will be identified and linked to implementation determinants and outcomes.
- Outcomes: Acceptability, appropriateness, feasibility, and utility will be evaluated. Implementation teams and coaches will develop an implementation plan.

Intervention Mapping, Study protocol to implement firearm safety as a suicide prevention strategy in the U.S. [Wolk et al., 2017]
- Participants/end-users: Parents, providers, and leaders of primary care practices.
- Identifying determinants: A needs assessment will be used to identify determinants in alignment with CFIR.
- Selecting IS: Researchers will develop a strategy menu based on CFIR and the identified determinants; the menu will be used to select final IS.
- Tailoring IS: Researchers will translate IS into practical strategies based on the literature.
- Outcomes: Outcomes to be measured were not described.

Note: IS = implementation strategies; TDF = Theoretical Domains Framework; EPIS = Exploration, Preparation, Implementation, Sustainment framework.

Moreover, a number of these ISMMs (e.g. Conjoint Analysis, Intervention Mapping, and Concept Mapping) have been studied within other contexts (e.g. adult mental health care settings, behavioral health care settings; Powell et al., 2017). Overall, the scoping review findings indicate that these ISMMs may be helpful in facilitating the identification and prioritization of implementation barriers, as well as the selection and tailoring of implementation strategies to address these context-specific determinants. However, although ISMMs may facilitate systematic implementation processes in various settings, no singular ISMM has been identified as being most effective in yielding behavioral change, and end-user perspectives of these methods have yet to be evaluated in an empirical manner (Sridhar et al., 2023). Findings from this scoping review also highlighted numerous areas for future work to enhance our understanding of systematic methods to select and tailor implementation strategies (Sridhar et al., 2023). For example, implementation research has consistently identified organizational readiness to change as an important barrier to implementation within community settings (Aarons et al., 2009, 2011; Scaccia et al., 2015). Organizational readiness captures the motivation of individuals within an organization to implement an intervention, as well as the capacities of the organization in general and capacities that are specific to the innovation or intervention being implemented (Scaccia et al., 2015). Additionally, implementation frameworks highlight several factors that may impact these three areas. For example, influences on motivation can include the innovation's complexity, priority, and compatibility. These factors are in alignment with the Consolidated Framework for Implementation Research (CFIR), a framework that is often utilized to guide and understand factors influencing implementation processes (Damschroder et al., 2009; Scaccia et al., 2015). Understanding the extent to which an organization is "ready" for implementation, as well as the factors that impact readiness, is believed to be a key component in the implementation process that is likely to influence the success of implementation efforts (Scaccia et al., 2015). Nevertheless, ISMM scoping review findings revealed that studies investigating methods to select and tailor implementation strategies included limited evaluations of the effectiveness of ISMM processes on organizational readiness (Sridhar et al., 2023). Overall, it remains unknown whether ISMMs may be more impactful compared to other approaches to selecting implementation strategies (e.g. researcher-selected strategies).
Given the numerous steps involved and the amount of time required to utilize ISMMs, evaluating the effectiveness of such methods on an organization's readiness and ability to support implementation is an essential next step. Additional gaps in ISMM research include the evaluation of end-user perspectives regarding these processes. Specifically, the feasibility (i.e. the extent to which an innovation can be used within a setting), acceptability (i.e. end-users' perception that an innovation is satisfactory or agreeable), appropriateness (i.e. the perceived fit or compatibility of an innovation for a specific setting, population, etc.), and usability (i.e. the degree to which an innovation can be utilized by specific individuals to achieve particular goals) of ISMMs remain understudied (Lengnick-Hall et al., 2022; Lyon et al., 2021). These constructs are often used as implementation outcomes to evaluate implementation processes and the success of an implementation effort (E. Proctor et al., 2011). Moreover, previous research illustrated that end-users are involved in utilizing implementation strategies within their organizations, and suggested that an increased understanding of end-user perspectives regarding implementation strategy use may increase end-users' engagement in the implementation process (Bustos et al., 2021). Despite this, findings from the scoping review conducted within the context of child mental health service settings found that only one ISMM study planned to evaluate these end-user perspectives (Sridhar et al., 2023). Overall, understanding end-user perspectives may be particularly valuable, as researchers and practitioners aim to utilize methods that are appropriate, feasible, acceptable, and useful for the settings in which they are employed, and for the individuals who use them (Sridhar et al., 2023).

ISMMs may offer a solution to the limited availability of NDBIs in CMH settings by providing end-users involved in EBI implementation with systematic steps to appropriately select and tailor relevant implementation strategies to address context-specific implementation barriers. As a result, the use of ISMMs may have significant clinical and service implications if these methods successfully improve implementation efforts and sustain utilization of NDBIs for autistic children who receive their services in community settings.

Concept Mapping

Concept mapping was one ISMM identified in the scoping review (Sridhar et al., 2023). Broadly, concept mapping is a mixed methods approach that leads to the development of a conceptual framework representing participating end-users' views. This method stems from the work of cognitive psychologists, who theorized that learning occurs when new concepts are assimilated into existing concepts and frameworks held by those involved in the learning process (Ausubel, 1968). This method has been utilized in a range of contexts within the field of dissemination and implementation science, as it involves engaging end-users in the process of developing a conceptual framework that represents their views (Green et al., 2012). The concept mapping process typically involves the following steps: (1) identifying end-users, (2) developing a focus question, and (3) engaging end-users in group brainstorming sessions related to the focus question and asking end-users to (a) sort/group ideas together and (b) rank the ideas based on various constructs of interest (e.g. importance, feasibility).
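To make the structure of these data concrete, the sketch below is a minimal, hypothetical illustration (written in Python; it is not the software or analysis code used in this study, and all inputs and variable names are invented). It shows how sorting and rating data from such a session might be aggregated: pile sorts become a statement-by-statement co-occurrence matrix that can be clustered so that ideas sorted together tend to share a cluster, and importance and feasibility ratings are averaged per statement to flag ideas rated highly on both.

```python
# Minimal, hypothetical illustration of concept mapping data aggregation.
# `sorts` holds each participant's pile sort (lists of statement indices);
# `importance` and `feasibility` hold 0-5 ratings (participants x statements).
# None of this is study data; values and names are invented.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

n_statements = 6
sorts = [
    [[0, 1], [2, 3], [4, 5]],   # participant 1's piles
    [[0, 1, 2], [3], [4, 5]],   # participant 2's piles
]
importance = np.array([[5, 4, 3, 2, 5, 4],
                       [4, 4, 2, 3, 5, 5]], dtype=float)
feasibility = np.array([[3, 5, 4, 2, 4, 4],
                        [4, 4, 3, 2, 5, 3]], dtype=float)

# 1. Statement-by-statement co-occurrence: how often two statements were
#    placed in the same pile, as a proportion of participants.
co = np.zeros((n_statements, n_statements))
for piles in sorts:
    for pile in piles:
        for i in pile:
            for j in pile:
                co[i, j] += 1
similarity = co / len(sorts)

# 2. Cluster statements so ideas that were sorted together tend to share a
#    cluster (a simplified stand-in for the MDS + hierarchical cluster
#    analysis used by dedicated concept mapping software).
distance = 1.0 - similarity
condensed = distance[np.triu_indices(n_statements, k=1)]
clusters = fcluster(linkage(condensed, method="average"),
                    t=3, criterion="maxclust")

# 3. Average each statement's ratings and flag "go-zone" statements rated
#    above the overall mean on both importance and feasibility.
imp_mean = importance.mean(axis=0)
fea_mean = feasibility.mean(axis=0)
go_zone = (imp_mean > imp_mean.mean()) & (fea_mean > fea_mean.mean())

for s in range(n_statements):
    print(f"statement {s}: cluster {clusters[s]}, "
          f"importance {imp_mean[s]:.1f}, feasibility {fea_mean[s]:.1f}, "
          f"go-zone {bool(go_zone[s])}")
```

Dedicated concept mapping software typically applies multidimensional scaling to the sort data before hierarchical clustering and plots importance against feasibility as a "go-zone" graph; the sketch above collapses those steps for brevity.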
Concept mapping analysis involves developing a concept map in which similar ideas are represented together and analyzing end-user rankings of the generated ideas/statements. These data are then used to develop a conceptual framework to address and understand the focus question (Green et al., 2012). This approach can be used to guide planning, implementation, and evaluation for various types of projects. For example, concept mapping has been utilized in previous implementation studies to elicit end-user perspectives regarding context-specific implementation determinants and the perceived feasibility of addressing those determinants (Green et al., 2012). In that study, participants were asked to: (1) brainstorm "factors that influence the acceptance and use of evidence-based practices in publicly funded mental health programs for families and children", and then (2) sort generated statements into categories based on similarity and rate factors based on importance and changeability (e.g. "how important is this factor to the implementation of evidence-based practice?", "how changeable is this factor?"). Concept mapping analysis included visualizing the cluster (average) rating for each factor as well as comparing statements on importance and changeability using pattern matching. Participants utilized these results to develop a framework for an implementation plan, which included a plan for evaluation and tracking implementation progress (Green et al., 2012).

More recently, a study used concept mapping as an ISMM within the context of child mental health service delivery as a demonstration project. Specifically, this method was utilized to elicit end-user perspectives regarding implementation determinants as well as to identify relevant implementation strategies and then rank those strategies based on their perceived importance and feasibility (Kwok et al., 2020). This study took place within the context of preschool speech and language therapy programs. Findings indicated that the participating end-users were able to generate over 200 implementation strategy ideas, and then narrow this selection down to 13 strategies that were determined to be important and feasible and had evidentiary support. End-users also described how each strategy would address the identified implementation barriers in their organization. Overall, this study revealed promising findings related to the use of concept mapping as a method to select and tailor implementation strategies based on end-user perspectives. However, concept mapping has yet to be evaluated as an ISMM within other child mental health service delivery settings, including CMH agencies providing services to autistic youth.

Current Study

The current study aimed to further our understanding of methods to select and tailor implementation strategies by evaluating the use of a specific ISMM (concept mapping) within a novel context. Specifically, this was the first study to evaluate the impact of concept mapping on organizational readiness for change, as well as explore end-user perspectives and end-user evaluations of this method when utilized in CMH agencies serving autistic youth. Moreover, this study focused on implementation efforts within CMH agencies providing services to autistic children enrolled in Medicaid benefits and that were interested in using Project ImPACT in their agency.
This project substantially advances our knowledge of effective implementation practices in community settings by: (a) identifying determinants of Project ImPACT implementation in community mental health settings and (b) selecting and mapping implementation strategies to address identified determinants. Moreover, study findings provide an understanding of end-user evaluations (feasibility, acceptability, appropriateness, usability) of concept mapping and of its impact on motivation, capacity, and readiness for change. Lastly, exploring methods to improve implementation efforts within CMH agencies that contract with the Michigan Medicaid Autism Benefit may support increased service equity (i.e., service outcomes [Figure 1]) for autistic children experiencing socioeconomic disadvantage, and ultimately improve patient outcomes for this population.

Aims

This mixed-methods study aimed to pilot concept mapping as a method for selecting and tailoring implementation strategies onto determinants within four CMH agencies providing services to autistic youth who receive their services via the Michigan Medicaid Autism Benefit. The project's specific aims were to:
a. Examine the impact of concept mapping on organizational readiness (i.e. capacity and motivation) to change in CMH agencies serving autistic youth.
b. Explore end-user evaluations (i.e. feasibility, acceptability, appropriateness, and usability) of concept mapping as an ISMM in CMH agencies.

Method

A sequential explanatory (quan → QUAL) mixed methods design (Figure 2) was used. Study procedures were approved by the Michigan State University Institutional Review Board.

Figure 2. Study Design

Phase: Quantitative Data Collection
• Procedure: Survey: ORCA subscales at pre and post; needs assessment; site demographic data; surveys: AIM, IAM, FIM, ISUS upon completion of concept mapping
• Product: Raw data on organizational readiness and implementation determinants; aggregated site demographics on staff, clients, and service settings; raw data on end-user evaluations

Phase: Quantitative Data Analysis
• Procedure: Paired sample t-test to explore ORCA changes from pre to post; calculate frequencies, means, and SDs for AIM, FIM, IAM, ISUS
• Product: Frequency counts; comparative analyses of ORCA from pre to post; average end-user evaluations regarding concept mapping

Phase: Connecting Quantitative & QUALITATIVE Phases
• Procedure: Develop qualitative interview questions and prompts (informed by QUAN data analysis)
• Product: Interview protocol

Phase: QUALITATIVE Data Collection
• Procedure: Individual semi-structured interviews with subsample of participants at all sites; transcribe and de-identify data
• Product: Recordings from interviews; researcher memos

Phase: QUALITATIVE Data Analysis
• Procedure: Coding, comparison, and consensus method; thematic analysis
• Product: Codebook; salience of codes; frequency of codes; emergent themes

Phase: Integration of Quantitative & QUALITATIVE Results
• Procedure: Merging data for analysis and comparison; evaluating convergence of data; interpretation and explanation of results
• Product: Joint display; discussion

Participants

Context and Agencies

Four Community Mental Health (CMH) agencies located across Michigan participated in this study. All four agencies provide ABA and other autism-related services (e.g. occupational therapy, speech and language services).
Eligibility criteria included that the agencies: (a) provide services to children on the autism spectrum who are enrolled in Medicaid benefits, (b) identify a need for implementing Project ImPACT within their agency, and (c) endorse an interest in utilizing systematic implementation strategies to facilitate this process. Agency Staff Five staff members at each eligible agency participated (N = 20). Specifically, agency leaders/directors (N = 4), direct providers (N = 8) and clinical supervisors (N = 8) participated in this study. Directors/agency leaders (referred to as leaders hereafter) were eligible if they fulfilled the role of director or leading decision-maker regarding interventions provided within their agency. At least 1 leader was required to participate from each agency. Supervisors (e.g. Board Certified Behavior Analysts) were eligible to participate in the study if they delivered or oversaw direct providers who delivered interventions to autistic children who receive their services via the Medicaid Autism Benefit. Lastly, direct providers (e.g. behavior technicians) were eligible if they delivered interventions to autistic children who receive their services via the Medicaid Autism Benefit. Agency staff who do not read or speak in English were not eligible. Two participants did not complete the entire study; one participant (Agency 1) did not complete the concept mapping process or post-concept mapping questionnaire. Data from this participant was not included in the final analyses. A second participant (Agency 2) did not complete the 19 final questionnaire, but data from their pre-concept mapping questionnaire and concept mapping processes were included in the final analyses. Participant demographics are included in Table 2. Table 2. Participant Demographics Demographics Age (years) Gender Identity Man Woman Sex Assigned at Birth Assigned male at birth Assigned female at birth Racial Identity White Black/African American Latinx/Hispanic Native American/Alaskan/Indigenous Education Level High School Some college Associate’s degree Bachelor’s degree Master’s Degree Discipline Psychology Social Work Education Behavior Specialist Other Duration of Employment (months) Employment status Full time Part time Recruitment Procedures Supervisors (n = 8) Leaders (n = 4) Direct Providers (n = 8) 37.5 12.5% 87.5% 12.5% 87.5% 75% 12.5% - 12.5% - - - 12.5% 87.5% 50% - - 25% 25% 139 100% - 29.9 25% 75% 25% 75% 100% - - - - - - 100% 25% 50% 25% - - 51.5 100% - 29.3 37.5% 62.5% 37.5% 62.5% 75% 12.5% 12.5% - 12.5% 25% 25% 12.5% 25% 37.5% - - 25% 37.5% 42.1 87.5% 12.5% Purposive nonprobability sampling was utilized. This sampling method was selected as it is appropriate for studies that seek to include specific members of a population (e.g. CMH 20 agency staff) and when utilizing specific inclusion criteria based upon the characteristics of interest (e.g. staff who deliver or oversee delivery of services to autistic children enrolled in Medicaid benefits) (Davis et al., 2016; Rea & Parker, 2014). This study leveraged existing collaborative relationships between Dr. Ingersoll (Dissertation committee member and study Consultant) and CMH agencies in Michigan to identify eligible agencies. Additionally, I utilized the Michigan Department of Health and Human Services (DHHS) contact list of CMH agencies to reach out to leaders. Leaders were contacted via email for initial recruitment of the agency. Information about the study purpose, benefits, and participation activities was included in the email. 
Additionally, the email included information about Project ImPACT, including a video overview of the intervention and a link to the intervention website (https://www.project- impact.org/). Leaders who reported an interest in implementing Project ImPACT within their agency and consented to participate in the study then identified four additional prospective staff members to participate. Prospective participants attended a brief (approximately 15-20 minute) informational session via Zoom. During this meeting, I presented an overview of the study purpose, details of participation, and anticipated timeline. Additionally, I presented a brief overview of Project ImPACT. At the end of the meeting, I provided agency staff members with a link to complete a screening questionnaire. Individuals who were eligible for the study were then asked to review and complete a consent form via Qualtrics. Participants were provided a $50 honorarium for completing the pre- and post- questionnaires and concept mapping activities, after completing the entire study. Additionally, each participating agency was compensated $100, and all participants received access to online introductory training modules for Project ImPACT upon completion of their participation in the study. 21 Procedure The study procedure included pre-and post-concept mapping questionnaires (see Measures, below), the use of concept mapping to identify, select, and prioritize implementation strategies, and follow-up participant interviews. Quantitative Method First, participants completed a series of online questionnaires via Qualtrics, to collect demographics information as well as participants’ perspectives regarding organizational readiness and motivation to change (Appendix A). Based on previous ISMM studies (Powell et al., 2020; Lewis et al., 2018; Wolk et al., 2020), a strengths and needs assessment was also included in this questionnaire to gather quantitative data on implementation determinants. Specifically, participants were asked to identify factors that they believe would both hinder and facilitate the implementation of Project ImPACT at their organization. The strengths and needs assessment inquired about different levels of determinants (e.g. implementation barriers and facilitators), and reflected constructs (i.e. Intervention Characteristics, Outer and Inner Setting, Individual Characteristics, and Process) from the Consolidated Framework for Implementation Research (CFIR; Damschroder et al., 2009). The CFIR was selected to guide this project given its strong evidence base and utility in facilitating the understanding of an implementation context through the identification of implementation barriers and facilitators (Damschroder et al., 2009; Nilsen & Bernhardsson, 2019). After identifying relevant determinants, participants were asked to rate each determinant on the importance of addressing/enhancing the determinant and the feasibility of addressing/enhancing the determinant on a scale from 0 (not at all) to 5 (very). Finally, participants responded to questions inquiring about organizational readiness (described in further detail under Measures). The results of the strengths and needs assessment were shared 22 with participants during a brief virtual meeting with the research team prior to beginning the concept mapping. 
Participants were then provided a pre-concept mapping written report that included findings from the strengths and needs assessment, as well as the complete ERIC list of discrete implementation strategies, and their corresponding definitions (Table 3). Table 3. Expert Recommendations for Implementing Change (ERIC) – Discrete Implementation Strategies List (Powell et al., 2012; Powell et al., 2015) Access new funding: Access new or existing money to facilitate the implementation. Alter incentive/allowance structures: Work to incentivize the adoption and implementation of the clinical innovation. Alter patient/consumer fees: Create fee structures where patients/consumers pay less for preferred treatments (the clinical innovation) and more for less-preferred treatments. Assess for readiness and identify barriers and facilitators: Assess various aspects of an organization to determine its degree of readiness to implement, barriers that may impede implementation, and strengths that can be used in the implementation effort. Audit and provide feedback: Collect and summarize clinical performance data over a specified time period and give it to clinicians and administrators to monitor, evaluate, and modify provider behavior. Build a coalition: Recruit and cultivate relationships with partners in the implementation effort. Capture and share local knowledge: Capture local knowledge from implementation sites on how implementers and clinicians made something work in their setting and then share it with other sites. Centralize technical assistance: Develop and use a centralized system to deliver technical assistance focused on implementation issues. Change accreditation or membership requirements: Strive to alter accreditation standards so that they require or encourage use of the clinical innovation. Work to alter membership organization requirements so that those who want to affiliate with the organization are encouraged or required to use the clinical innovation. Change liability laws: Participate in liability reform efforts that make clinicians more willing to deliver the clinical innovation. Change physical structure and equipment: Evaluate current configurations and adapt, as needed, the physical structure and/or equipment (e.g., changing the layout of a room, adding equipment) to best accommodate the targeted innovation. Change record systems: Change records systems to allow better assessment of implementation or clinical outcomes. Change service sites: 23 Table 3 (cont’d) Change the location of clinical service sites to increase access. Conduct cyclical small tests of change: Implement changes in a cyclical fashion using small tests of change before taking changes system-wide. Tests of change benefit from systematic measurement, and results of the tests of change are studied for insights on how to do better. This process continues serially over time, and refinement is added with each cycle. Conduct educational meetings Hold meetings targeted toward different end-user groups (e.g., providers, administrators, other organizational end-users, and community, patient/consumer, and family end-users) to teach them about the clinical innovation. Conduct educational outreach visits Have a trained person meet with providers in their practice settings to educate providers about the clinical innovation with the intent of changing the provider’s practice. 
Conduct local consensus discussions Include local providers and other end-users in discussions that address whether the chosen problem is important and whether the clinical innovation to address it is appropriate. Conduct local needs assessment Collect and analyze data related to the need for the innovation. Conduct ongoing training Plan for and conduct training in the clinical innovation in an ongoing way. Create a learning collaborative Facilitate the formation of groups of providers or provider organizations and foster a collaborative learning environment to improve implementation of the clinical innovation. Create new clinical teams Change who serves on the clinical team, adding different disciplines and different skills to make it more likely that the clinical innovation is delivered (or is more successfully delivered). Create or change credentialing and/or licensure standards Create an organization that certifies clinicians in the innovation or encourage an existing organization to do so. Change governmental professional certification or licensure requirements to include delivering the innovation. Work to alter continuing education requirements to shape professional practice toward the innovation. Develop a formal implementation blueprint Develop a formal implementation blueprint that includes all goals and strategies. The blueprint should include: 1) aim/purpose of the implementation; 2) scope of the change (e.g., what organizational units are affected); 3) timeframe and milestones; and 4) appropriate performance/progress measures. Use and update this plan to guide the implementation effort over time. Develop academic partnerships Partner with a university or academic unit for the purposes of shared training and bringing research skills to an implementation project. Develop an implementation glossary Develop and distribute a list of terms describing the innovation, implementation, and the end- users in the organizational change. Develop and implement tools for quality monitoring 24 Table 3 (cont’d) Develop, test, and introduce into quality-monitoring systems the right input—the appropriate language, protocols, algorithms, standards, and measures (of processes, patient/consumer outcomes, and implementation outcomes) that are often specific to the innovation being implemented. Develop and organize quality monitoring systems Develop and organize systems and procedures that monitor clinical processes and/or outcomes for the purpose of quality assurance and improvement. Develop disincentives Provide financial disincentives for failure to implement or use the clinical innovations. Develop educational materials Develop and format manuals, toolkits, and other supporting materials in ways that make it easier for end-users to learn about the innovation and for clinicians to learn how to deliver the clinical innovation. Develop resource sharing agreements Develop partnerships with organizations that have resources needed to implement the innovation. Distribute educational materials Distribute educational materials (including guidelines, manuals and toolkits) in person, by mail, and/or electronically. Facilitate relay of clinical data to providers Provide as close to real-time data as possible about key measures of process/outcomes using integrated modes/channels of communication in a way that promotes use of the targeted innovation. 
Facilitation A process of interactive problem solving and support that occurs in a context of a recognized need for improvement and a supportive interpersonal relationship. Fund and contract for the clinical innovation Governments and other payers of services issue requests for proposals to deliver the innovation, use contracting processes to motivate providers to deliver the clinical innovation, and develop new funding formulas that make it more likely that providers will deliver the innovation. Identify and prepare champions Identify and prepare individuals who dedicate themselves to supporting, marketing, and driving through an implementation, overcoming indifference or resistance that the intervention may provoke in an organization. Identify early adopters Identify early adopters at the local site to learn from their experiences with the practice innovation. Increase demand Attempt to influence the market for the clinical innovation to increase competition intensity and to increase the maturity of the market for the clinical innovation. Inform local opinion leaders Inform providers identified by colleagues as opinion leaders or ‘educationally influential’ about the clinical innovation in the hopes that they will influence colleagues to adopt it. Intervene with patients/consumers to enhance uptake and adherence 25 Table 3 (cont’d) Develop strategies with patients to encourage and problem solve around adherence. Involve executive boards Involve existing governing structures (e.g., boards of directors, medical staff boards of governance) in the implementation effort, including the review of data on implementation processes. Involve patients/consumers and family members Engage or include patients/consumers and families in the implementation effort. Make billing easier Make it easier to bill for the clinical innovation. Make training dynamic Vary the information delivery methods to cater to different learning styles work contexts, and shape the training in the innovation to be interactive. Mandate change Have leadership declare the priority of the innovation and their determination to have it implemented. Model and simulate change Model or simulate the change that will be implemented prior to implementation. Obtain and use patients/consumers and family feedback Develop strategies to increase patient/consumer and family feedback on the implementation effort. Obtain formal commitments Obtain written commitments from key partners that state what they will do to implement the innovation. Organize clinician implementation team meetings Develop and support teams of clinicians who are implementing the innovation and give them protected time to reflect on the implementation effort, share lessons learned, and support one another’s learning. Place innovation on fee for service lists/formularies Work to place the clinical innovation on lists of actions for which providers can be reimbursed (e.g., a drug is placed on a formulary, a procedure is now reimbursable). Prepare patients/consumers to be active participants Prepare patients/consumers to be active in their care, to ask questions, and specifically to inquire about care guidelines, the evidence behind clinical decisions, or about available evidence-supported treatments. Promote adaptability Identify the ways a clinical innovation can be tailored to meet local needs and clarify which elements of the innovation must be maintained to preserve fidelity. 
Promote network weaving Identify and build on existing high quality working relationships and networks within and outside the organization, organizational units, teams, etc. to promote information sharing, collaborative problem-solving, and a shared vision/goal related to implementing the innovation. Provide clinical supervision Provide clinicians with ongoing supervision focusing on the innovation. Provide training for clinical supervisors who will supervise clinicians who provide the innovation. 26 Table 3 (cont’d) Provide local technical assistance Develop and use a system to deliver technical assistance focused on implementation issues using local personnel. Provide ongoing consultation Provide ongoing consultation with one or more experts in the strategies used to support implementing the innovation. Purposely reexamine the implementation Monitor progress and adjust clinical practices and implementation strategies to continuously improve the quality of care. Recruit, designate, and train for leadership Recruit, designate, and train leaders for the change effort. Remind clinicians Develop reminder systems designed to help clinicians to recall information and/or prompt them to use the clinical innovation. Revise professional roles Shift and revise roles among professionals who provide care, and redesign job characteristics. Shadow other experts Provide ways for key individuals to directly observe experienced people engage with or use the targeted practice change/innovation. Stage implementation scale up Phase implementation efforts by starting with small pilots or demonstration projects and gradually moving to a system wide rollout. Start a dissemination organization Identify or start a separate organization that is responsible for disseminating the clinical innovation. It could be a for-profit or non-profit organization. Tailor strategies Tailor the implementation strategies to address barriers and leverage facilitators that were identified through earlier data collection. Use advisory boards and workgroups Create and engage a formal group of multiple kinds of end-users to provide input and advice on implementation efforts and to elicit recommendations for improvements. Use an implementation advisor Seek guidance from experts in implementation. Use capitated payments Pay providers or care systems a set amount per patient/consumer for delivering clinical care. Use data experts Involve, hire, and/or consult experts to inform management on the use of data generated by implementation efforts. Use data warehousing techniques Integrate clinical records across facilities and organizations to facilitate implementation across systems. Use mass media Use media to reach large numbers of people to spread the word about the clinical innovation. Use other payment schemes Introduce payment approaches (in a catch-all category). 27 Table 3 (cont’d) Use train-the-trainer strategies Train designated clinicians or organizations to train others in the clinical innovation. Visit other sites Visit sites where a similar implementation effort has been considered successful. Work with educational institutions Encourage educational institutions to train clinicians in the innovation. After completing the concept mapping phase (described in the following section), participants completed a post-concept mapping questionnaire. This questionnaire included the same measures evaluating organizational readiness, and inquired about end-user evaluations (described further in the Measures section). 
For both the pre- and post-concept mapping questionnaires, please see Appendices A and B. Measures The quantitative data collection included investigator-developed as well as established measures to collect information about the respondent and variables of interest (Table 4). Table 4. Quantitative Measures
Demographics (Sridhar, 2022, unpublished measure). Constructs measured: Provider variables: age, gender identity, sex assigned at birth, racial identification, highest level of education, primary disciplines (e.g. psychology, social work, behavior specialist), title, duration of employment in years and months, and employment status (e.g. full-time, part-time). Client variables: number of clients between ages 0-21 served, number of clients aged 0-21 on the autism spectrum, number of autistic clients within specific age groups (i.e. 0-5, 6-10, 11-15, 16-18, 19-21, and over 21 years old). Organizational variables: number of providers delivering services to autistic clients, typical caseload per provider, settings in which interventions are delivered (e.g. clinic, community, school, home), sources of funding for the organization (e.g. insurance, private pay, Medicaid, etc.), and description of the organizational structure. Data collection timepoint: Pre-concept mapping.
ASD-SIS (Pickard, Meza, Drahota & Brikho, 2018). Constructs measured: Client presenting problems (e.g. communication, social skills, behavior, etc.) as well as ratings on how effectively the current interventions used at the organization address each presenting problem. Data collection timepoint: Pre-concept mapping.
Strengths and Needs Assessment (adapted from CFIR; Damschroder et al., 2009). Constructs measured: Intervention Characteristics, Outer Setting, Inner Setting, Characteristics of Individuals, and Process. Data collection timepoint: Pre-concept mapping.
Organizational Readiness to Change (ORCA; Helfrich, Li, Sharp, & Sales, 2009). Constructs measured: Leadership and staff culture related to ability to support and use new interventions. Data collection timepoint: Pre- and post-concept mapping.
Organizational Readiness for Implementing Change (ORIC; Shea et al., 2014). Constructs measured: Change commitment and change efficacy. Data collection timepoint: Pre- and post-concept mapping.
Organizational Readiness for Change (ORC; Lehman et al., 2002). Constructs measured: Motivational needs (i.e. program needs for improvement, immediate training needs) and pressure for change. Data collection timepoint: Pre- and post-concept mapping.
Feasibility of Intervention Measure (FIM; Weiner et al., 2017). Constructs measured: Extent to which concept mapping is implementable, possible, doable, easy to use. Data collection timepoint: Post-concept mapping.
Acceptability of Intervention Measure (AIM; Weiner et al., 2017). Constructs measured: Extent to which concept mapping meets participant's approval, is appealing, is liked, is welcomed. Data collection timepoint: Post-concept mapping.
Intervention Appropriateness Measure (IAM; Weiner et al., 2017). Constructs measured: Extent to which concept mapping is fitting, suitable, applicable, a good match. Data collection timepoint: Post-concept mapping.
Implementation Strategy Usability Scale (ISUS; Lyon et al., 2021a). Constructs measured: Extent to which concept mapping would be used frequently, was complex, was easy to use, requires technical support, components were well integrated, was inconsistent, could be learned quickly, was cumbersome, participant felt confident using it, participant needed to learn a lot before using concept mapping. Data collection timepoint: Post-concept mapping.
Demographics. The unpublished demographics questionnaire was administered to collect provider, client, and organizational demographics (Sridhar, 2022). Provider variables that were collected include: age, gender identity, sex assigned at birth, racial identification, highest level of education, primary disciplines (e.g.
psychology, social work, behavior specialist), title, duration of employment in years and months, and employment status (e.g. full-time, part-time). Client variables included: number of clients between ages 0-21 served, number of clients aged 0-21 on the autism spectrum, number of autistic clients within specific age groups (i.e. 0-5, 6-10, 11-15, 16-18, 19-21, and over 21 years old). Organizational variables included: number of providers delivering services to autistic clients, typical caseload per provider, settings in which interventions are delivered (e.g. clinic, community, school, home), sources of funding for the organization (e.g. insurance, private pay, Medicaid, etc.), and description of the organizational structure. Demographics data are reported in Table 2. Additionally, the demographics questionnaire included questions derived from the ASD Strategies and Interventions Survey (ASD-SIS; Pickard et al., 2018). This survey was developed based on ASD services specifically and was utilized to gather data on client presenting problems (e.g. communication, social skills, behavior, etc.) as well as ratings on how effectively the current interventions used at the organization address each presenting problem. Strengths and needs assessment. A strengths and needs assessment was developed based on the CFIR. Participants were asked to identify the extent to which CFIR determinants were true and important to address within their organization on a 4-point scale (1- “disagree” to 4- “agree”). CFIR constructs include: Intervention Characteristics (e.g. relative advantage, adaptability), Outer Setting (e.g. patient needs and resources, external policy and incentives), Inner Setting (e.g. structural characteristics, culture, tension for change), Characteristics of 30 Individuals (e.g. knowledge and beliefs about the intervention, self-efficacy), and Process (e.g. planning). Readiness and Motivation to Change Questionnaires. Three quantitative measures were utilized prior to and following the completion of concept mapping, in order to evaluate whether engaging in the method influenced agency staff’s perceptions on their organization’s capacity and motivation to facilitate and support change. All three measures have robust psychometric properties and have been utilized across contexts. a. Organizational Readiness for Change (ORC; Lehman et al., 2002). Broadly, the ORC measures organizational climate and staff attributes. Participants completed the Motivational Needs/Pressures for Change Scales (α = 0.64), which focuses on readiness for change within an organization. More specifically, this scale measures perceived needs for change, including improving upon programs, perceptions regarding the needs for training across areas, as well as internal and external pressures for change (Lehman et al., 2002; Billsten et al., 2018). This scale utilizes a 5-point Likert scale (1 - “disagree strongly” to 5 - “agree strongly). b. Organizational Readiness for Implementing Change (ORIC; Shea et al., 2014). The ORIC was developed based upon the theory of organizational readiness for change, which posits that readiness for change involves both change commitment (ie. organizational member’s perspectives regarding staff’s shared resolve to implement change) and change efficacy (i.e. organizational member’s perspectives that there is a shared belief in the collective capability to implement a change within the organization). 
The organizational readiness for change theory suggests that increased change commitment and change efficacy are associated with effective implementation, due to an increased likelihood of initiating change, greater effort and persistence, and more cooperative behavior among organizational staff (Weiner, 2009). Participants completed the ORIC, which reflects two subscales: 1. change commitment and 2. change efficacy. Cronbach's alphas for these subscales were 0.92 and 0.88, respectively (Shea et al., 2014). This scale utilizes a 5-point Likert scale (1 - "disagree" to 5 - "agree"). c. Organizational Readiness to Change (ORCA; Helfrich et al., 2009). The ORCA was developed based on the PARIHS framework, which highlights important determinants of implementation. This survey was first developed to evaluate organizational readiness and identify implementation barriers. Participants completed the context scale of the ORCA; this scale explores the perceived quality of the organizational context for implementation, based on respondents' perspectives of their organization's ability to support and facilitate the use of new interventions. Specifically, the context scale assesses organizational culture for leadership and staff, leadership practice, and evaluation. This scale utilizes a 5-point Likert scale (1 - "disagree strongly" to 5 - "agree strongly") and has robust reliability (α = 0.85; Helfrich et al., 2009). Implementation Outcome Questionnaires. Participants completed four additional surveys at the end of the study; these questionnaires asked about ISMM end-user evaluations following the concept mapping process. The first three surveys comprise 4 items each and utilize a 4-point Likert scale (1 - "completely disagree" to 4 - "completely agree") (Weiner et al., 2017). The last survey comprises 10 items and uses a 5-point scale (strongly disagree to strongly agree; Lyon et al., 2021). All four measures have robust psychometric properties and have been utilized across healthcare and educational settings, as well as with a range of end-user groups (e.g. administrators, healthcare professionals) (Adrian et al., 2020; Finch et al., 2012; Kien et al., 2021; Swindle et al., 2021; Taboada et al., 2021). a. Feasibility of Intervention Measure (FIM). This measure examined participant perspectives on the extent to which concept mapping could be successfully used within the agency. This scale has acceptable reliability (α = 0.89; Weiner et al., 2017). b. Acceptability of Intervention Measure (AIM). This measure examined the perceptions among participants that concept mapping was agreeable or satisfactory. This scale has acceptable reliability (α = 0.85; Weiner et al., 2017). c. Intervention Appropriateness Measure (IAM). This measure examined the perceived fit or compatibility of concept mapping within the agency. This scale has acceptable reliability (α = 0.91; Weiner et al., 2017). d. Implementation Strategy Usability Scale (ISUS). This measure examined the perceived usability of the concept mapping method. This scale was adapted from the System Usability Scale (α = 0.84; Lyon et al., 2021b) and examines overall usability and compares usability across different strategies (Lyon et al., 2021a). Quantitative Data Analysis Quantitative data (i.e. questionnaire data) were analyzed in three phases. First, I utilized descriptive analyses (i.e. means, frequencies, distributions) to report demographic data.
Second, I aggregated mean responses to the AIM, IAM, FIM, and ISUS by agency to evaluate ISMM end- user evaluations of the concept mapping process for each organization. Lastly, given the limited sample (N = 20), I ran paired sample t-tests on the ORCA, ORC, and ORIC between pre- and post- data collection time point for each agency to explore changes in perceptions of organizational readiness following the concept mapping process. 33 Concept Mapping Procedure Participants engaged in three steps of concept mapping: Step 1. Brainstorming, Step 2. Sorting, and Step 3. Rating via the online platform, GroupWisdom. GroupWisdom is an online tool used for data collection, analysis, and visualization of data from concept mapping studies. GroupWisdom allows participants to brainstorm, organize, and rate ideas from most devices to facilitate data collection. This tool also allows researchers to manage participants and their activities on the website, as well as communicate directly with participants. Lastly, GroupWisdom allows researchers to conduct visual analysis efficiently using point maps, cluster maps, cluster rating maps, pattern matches, and go-zone graphs (https://groupwisdom.com/groupwisdom). During the brainstorming phase, participants were provided the following focus prompt: “For the concept mapping activity, please select which implementation strategies you believe will help address your agency’s strengths and needs during implementation of Project ImPACT. Remember to use the ERIC list of implementation strategies, which is included in your agency's pre-concept mapping report. Additionally, you may enter additional implementation strategies that are not included in the ERIC”. Participants then utilized the ERIC list of strategies, and entered the implementation strategies they believed would be helpful for their agency. Upon completing this step, the research team removed redundant implementation strategies, and participants completed the Sorting phase during which they grouped similar implementation strategies together. Specifically, participants were provided the following instructions via GroupWisdom: “Sort each card into a pile as you create your own version of how these ideas are related. You'll give each pile a name that describes its theme or contents. You can start naming 34 the piles or groups right away, or name them as you go. You'll have the chance to check all your piles when you are finished. Don’t create piles according to priority or value, like “Important” or “Hard To Do”. Avoid piles that group dissimilar statements, like “Miscellaneous” or “Other”. Put a statement alone in its own pile if it is unrelated to the other statements”. Finally, participants completed the Rating phase to rank strategies on (a) how important and (b) how feasible the strategy is in addressing identified barriers to the implementation of Project ImPACT at the organization. Ratings ranged from 0 (not at all important/feasible) to 5 (very important/feasible). After participants completed concept mapping, the research team provided agencies with a final list of recommended implementation strategies based on strategies identified as both most important and most feasible by participants. After receiving this information, participants completed a post- concept mapping questionnaire to report on perceptions of organization readiness and motivation to change, as well as their perspectives on the feasibility, acceptability, usability, and appropriateness of the concept mapping method. 
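To make the final step of the procedure concrete, the following is a minimal sketch, under simplified assumptions, of how a "most important and most feasible" recommendation list can be derived from participants' ratings. The study itself relied on GroupWisdom's built-in go-zone analysis; the data structures and the go_zone function below are hypothetical illustrations, not the platform's procedure.

```python
# Illustrative sketch only: the study used the GroupWisdom platform for these
# computations. Ratings are assumed to be dictionaries keyed by strategy name,
# with one 0-5 rating per participant for importance and for feasibility.
from statistics import mean, median

def go_zone(importance_ratings, feasibility_ratings):
    """Return strategies whose mean importance AND mean feasibility fall at or
    above the respective medians (the top-right 'go-zone' quadrant)."""
    imp_means = {s: mean(r) for s, r in importance_ratings.items()}
    feas_means = {s: mean(r) for s, r in feasibility_ratings.items()}
    imp_cut = median(imp_means.values())
    feas_cut = median(feas_means.values())
    return sorted(
        (s for s in imp_means
         if imp_means[s] >= imp_cut and feas_means[s] >= feas_cut),
        key=lambda s: (imp_means[s], feas_means[s]),
        reverse=True,
    )

# Hypothetical example with three strategies rated by four participants
importance = {
    "Conduct ongoing training": [5, 5, 5, 5],
    "Provide clinical supervision": [5, 4, 5, 5],
    "Develop disincentives": [2, 1, 2, 2],
}
feasibility = {
    "Conduct ongoing training": [4, 4, 4, 4],
    "Provide clinical supervision": [4, 4, 5, 4],
    "Develop disincentives": [2, 2, 1, 2],
}
print(go_zone(importance, feasibility))
# -> ['Conduct ongoing training', 'Provide clinical supervision']
```

Strategies whose average importance and average feasibility both fall at or above the respective medians correspond to the top-right quadrant of the go-zone graphs described in the following sections.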
Concept Mapping Analysis The concept mapping analyses plan was developed based on steps outlined by Kwok and colleagues (2020). After participants selected and rated implementation strategies, I utilized GroupWisdom software to generate a concept map of implementation strategies. First, I developed a point map to visualize the full list of implementation strategies and illustrate strategies that were closely related based on proximity to each other. Second, I developed a cluster map to finalize strategy categories based on the participant’s sorting data. Third, I created a Go-Zone graph to visualize strategies based on both importance and feasibility; findings from this graph informed the identification of strategies that were rated as most important and most 35 feasible by participants at the agency. Lastly, pattern match graphs were used to show the importance and feasibility of the final strategy categories developed. The final list of recommended implementation strategies was then added to each agency’s report and provided to each agency prior to the completion of the post-concept mapping questionnaire and interviews. Qualitative Method Finally, a subset of participants (N = 15, 75% of participants) completed semi-structured virtual interviews. All participants were provided the opportunity to participate in the interviews, and those interested in participating in the interview provided their consent during the initial consenting period. Interviews focused on understanding participants’ experience with the concept mapping steps, including their perspectives regarding the feasibility, acceptability, usability, and appropriateness of the ISMM, and the perceived impact of concept mapping on their organization’s readiness to change. Interviews were audio recorded with the participant’s consent. Participants who completed the interview were provided an additional honorarium of $20 to thank them for their time and willingness to participate in this portion of the study. Measure Semi-Structured Interview. After completing quantitative data collection and analysis, I developed a semi-structured interview protocol to further explore perspectives on the impact of concept mapping on the organization’s readiness to change, ISMM end-user evaluations related to concept mapping, and suggestions to improve the method. Interview questions aligned with constructs from the CFIR (See ISMM Interview Guide below). Participants were asked about perspectives on concept mapping characteristics, outer and inner settings factors, and individual characteristics that influenced perspectives regarding concept mapping. 36 ISMM Interview Guide Introduction Thank you for taking the time to meet with me today and for participating in the Implementation Strategy Mapping Methods study. Today I will be asking you a few questions about how feasible, acceptable, and appropriate the concept mapping method was for your agency. There are no right or wrong answers, we just want to understand your experience with the concept mapping. I am only here to get your opinion, so I won’t be giving my own. I’ll be writing notes to keep track of your answers. We may not get to all of the questions and that is ok. Do you have any questions before we get started? Is it ok if I turn on the recorder on Teams? ISMM end-user evaluations 1. How feasible was the concept mapping? a. What made the concept mapping feasible or not feasible? b. Was there anything that made the concept mapping challenging to complete? 2. 
How acceptable or satisfactory was the concept mapping? a. What made the concept mapping acceptable or unacceptable? 3. How appropriate was the concept mapping for your agency? a. What made the concept mapping appropriate or inappropriate? Perceived Effectiveness 1. How do you think the selected implementation strategies will address the implementation barriers your team identified? [provide list of identified IS and identified barriers to as references for participant] a. Why or why not? (Probe for examples) b. Which strategies will address which barriers and why? 37 2. What specific steps or aspects of the concept mapping were most helpful and why? 3. Do you think the concept mapping helped your organization prepare for implementation of a new intervention? a. Did the concept mapping impact your organizations capacity? to support implementation efforts? i. Motivation? b. Was this process different than your organizations typical processes to prepare for implementation? How was it similar? How was it different? Suggestions to improve the concept mapping process 1. Do you have any ideas or suggestions for improving the concept mapping process? 2. What would you have changed about the concept mapping process? Data processing An undergraduate research assistant first transcribed all interview data, which I then verified and reviewed to increase my familiarity with the interview data. Interview data were anonymized such that participant ID numbers were utilized in the transcripts. Analysis I utilized thematic analysis (Braun & Clarke, 2006) to analyze the qualitative interviews. This analysis approach was selected as it allows for the use of both inductive (i.e. emergent codes) and deductive (i.e. codes developed a priori based on research questions, frameworks, etc.) coding methods in order to explore the research questions outlined. Thematic analysis typically involves six phases for coding qualitative data. First, I reviewed all interview transcripts in detail, to increase my familiarity with the interview data. I then developed a coding schema based on the research questions of the project 38 (e.g. end-user evaluations of concept mapping) as well as constructs from the CFIR (e.g. innovation characteristics) to facilitate the identification of factors influencing perspectives regarding this approach. I trained an undergraduate research assistant in both inductive and deductive coding methods. We independently coded each interview, and utilized consensus coding meetings throughout the process to address coding discrepancies. During these meetings, the independent coders provided their rationale for the code selected, and selected a final code together based on this discussion. Thematic analysis involves both a sequential progression through these phases of analysis, and utilizes an iterative approach such that coders moved back and forth between the six phases of analysis. Several coding methods were utilized during analysis. Provisional coding was utilized often; this approach involves using codes developed a priori based on the frameworks and research questions guiding the project. Categories were developed based on the CFIR framework and included: Inner Setting Factors, Outer Setting Factors, Innovation Characteristics, Process, and Characteristics of Individuals. I also included codes based on the CFIR framework nested within these categories (e.g. Inner Setting Factors (cid:0) Structural Characteristics). 
Based on the research questions, the codebook also included ISMM end-user evaluations, Motivation of Individuals, Organizational Readiness, and Implementation Strategies. First cycle or initial coding was utilized as well; this process involves line-by-line open coding to identify emergent codes. Additionally, we utilized subcodes to provide more specific details about a primary code, by identifying second-order codes nested under a primary code (e.g. Inner Setting → Structural Characteristics → IT Infrastructure). Lastly, we utilized axial coding to group similar codes together to form larger categories based on concepts that emerged from the data. All codes were iteratively added to the codebook during the course of the coding process (Appendix C). Once the codebook was finalized, the independent coders conducted a final coding of all interviews to ensure consistency in codes across the data. After coding was completed, I examined the frequency with which each code was assigned. Qualitative analysis was conducted across all participants and not compared at the staff role or agency level. However, codes that captured differences across roles were included in the codebook, based on the CFIR framework; as a result, qualitative findings highlighted different perspectives based on role when coded. Codes were then grouped into broader categories followed by overarching themes in order to summarize patterns identified in the interview data. Subsequently, the coders discussed and identified three final themes. Following this, codes and categories were organized by theme. Finally, the writing process began and themes were contextualized within the CFIR framework (Braun & Clarke, 2012). Due to the small sample size and limited time frame of this pilot study, thematic saturation was unable to be established. All data were analyzed with MAXQDA software. The Standards for Reporting Qualitative Research (SRQR; O'Brien et al., 2014) were adhered to in order to ensure transparency and accuracy throughout qualitative data collection and analyses. Trustworthiness. Several steps were taken in order to ensure trustworthiness in the analysis of interview data, in alignment with the SRQR guidelines: (a) coding was conducted by two independent coders with no relationship to the development of the concept mapping method, (b) coders regularly assessed consensus in their codes, and (c) an audit trail was used to track changes and rationale for changes made during the iterative coding process. Researcher Characteristics. In line with SRQR guidelines, the consideration of researcher characteristics was an important step towards maintaining objectivity during the coding process. Both independent coders are university-affiliated individuals; the lead coder is a doctoral student in the Clinical Science program and the second coder is an undergraduate research assistant. Both coders read coding training materials prior to conducting thematic analysis. Additionally, the lead coder (AS) had previous experience in coding interviews and has used similar coding techniques to analyze qualitative data. Neither coder was associated with the development of the concept mapping process. Lastly, neither coder had prior relationships with the participating CMH agencies. Reflexivity. Coders engaged in self-reflection regarding commonly held biases and assumptions when engaging in data analysis.
Additionally, both coders endorsed believing: (a) many children on the autism spectrum may benefit from the receipt of NDBIs, including Project ImPACT, (b) there are serious disparities in access to autism interventions for children experiencing socioeconomic disadvantage, (c) research seeking to improve NDBI implementation in community settings is an important step in narrowing the research-to-practice gap for autism services, and (d) the use of tailored implementation strategies may be particularly effective and impactful in overcoming implementation determinants and supporting and sustaining implementation processes. Overall, both coders believe that findings from this study will have important implications for autism and implementation research and practice. Mixed Methods Analyses Each data strand was first analyzed independently, and then merged using a joint display (Guetterman et al., 2015). Joint displays (side-by-side comparison tables; see Table 12) are used to integrate findings from both quantitative and qualitative data, understand where participants' perspectives may converge or diverge, and contextualize the quantitative findings. Further quantitative analyses were conducted in order to evaluate changes in organizational readiness and average ratings of end-user evaluations across all four agencies. For example, quantitative data on the average acceptability of concept mapping were explored across all agencies, and then further explored through utilization of the qualitative findings, when contrasting and comparing these findings in the joint display. Overall, merging these data strands allowed for a deeper understanding of the specific components that participants found acceptable, and factors influencing their perspectives on acceptability. Quantitative Results Organizational Readiness Paired samples t-tests were used to evaluate changes in organizational readiness at each agency following the concept mapping process (Table 5). Results did not indicate statistically significant changes in the ORC at any agency: Agency 1 (t(3) = -0.11, p = 0.92), Agency 2 (t(3) = -0.31, p = 0.78), Agency 3 (t(4) = -0.59, p = 0.59), or Agency 4 (t(4) = -0.86, p = 0.44). These results indicate no significant changes in motivational needs and pressures for change as a component of organizational readiness for change. Similarly, paired samples t-tests did not indicate statistically significant changes in the ORIC at Agency 1 (t(3) = -1.6, p = 0.21), Agency 2 (t(3) = 1.1, p = 0.37), Agency 3 (t(4) = 0.90, p = 0.42), or Agency 4 (t(4) = -0.97, p = 0.39). These results indicate no significant changes from pre- to post-concept mapping in change commitment and change efficacy to support implementing Project ImPACT. Lastly, results from the paired samples t-tests did not indicate statistically significant changes in the ORCA at Agency 1 (t(3) = 0.83, p = 0.47), Agency 2 (t(3) = 0.56, p = 0.61), Agency 3 (t(4) = 0.10, p = 0.92), or Agency 4 (t(4) = -1.25, p = 0.28). These results indicate no significant changes from pre- to post-concept mapping in the organizational culture for leadership and staff to support the implementation of Project ImPACT. Table 5.
Paired Samples T-Test, Organizational Readiness from Pre- to Post-Concept Mapping by Agency
ORC
Agency 1 (n = 4): Pre-CM Mean (SD) = 2.9 (1.4), Post-CM Mean (SD) = 3.1 (1.0), t = -0.11, p = 0.92
Agency 2 (n = 4): Pre-CM = 2.4 (.78), Post-CM = 2.5 (.55), t = -0.31, p = 0.78
Agency 3 (n = 5): Pre-CM = 2.8 (.73), Post-CM = 2.9 (.69), t = -0.59, p = 0.59
Agency 4 (n = 5): Pre-CM = 2.3 (.93), Post-CM = 2.5 (.75), t = -0.86, p = 0.44
ORIC
Agency 1 (n = 4): Pre-CM = 4.6 (.47), Post-CM = 4.8 (.24), t = -1.6, p = 0.21
Agency 2 (n = 4): Pre-CM = 3.7 (.32), Post-CM = 3.4 (.49), t = 1.1, p = 0.37
Agency 3 (n = 5): Pre-CM = 2.9 (.73), Post-CM = 2.5 (.69), t = 0.90, p = 0.42
Agency 4 (n = 5): Pre-CM = 4.1 (.28), Post-CM = 4.3 (.39), t = -0.97, p = 0.39
ORCA
Agency 1 (n = 4): Pre-CM = 4.8 (.12), Post-CM = 4.7 (.51), t = 0.83, p = 0.47
Agency 2 (n = 4): Pre-CM = 4.4 (.45), Post-CM = 4.2 (.43), t = 0.56, p = 0.61
Agency 3 (n = 5): Pre-CM = 4.4 (.44), Post-CM = 4.4 (.47), t = 0.10, p = 0.92
Agency 4 (n = 5): Pre-CM = 4.4 (.48), Post-CM = 4.5 (.39), t = -1.25, p = 0.28
**All measures utilize a 5-point Likert scale.
Concept Mapping Results Brainstorming Upon completion of the brainstorming phase, Agency 1 had identified 58 implementation strategies as relevant to addressing determinants to implementing Project ImPACT; 22 of these strategies were duplicated and removed during the analysis process. However, due to researcher error, these duplicates were included in the sorting and ranking steps. All strategies identified by Agency 1 were selected using the ERIC list of implementation strategies; no additional strategies were generated. Agency 2 identified 31 implementation strategies from the ERIC list, three of which were duplicates and were thus removed. Additionally, Agency 2 generated two additional strategies: (a) flowchart for incorrect responses and (b) flowchart for behaviors. Similar to Agency 1, due to researcher error, duplicates were included in the sorting and ranking steps but removed for the final analysis. Agency 3 identified 57 implementation strategies in total; of these strategies, 3 were generated by the participants themselves and the remaining 54 strategies were identified using the ERIC. The three generated strategies were: (a) view presentation on what Project ImPACT is, (b) view data on outcomes, and (c) robust training. Lastly, Agency 4 identified 29 implementation strategies using the ERIC list, and did not generate additional strategies. Of the 150 total strategies identified across all four agencies, 11 strategies were identified by all four of the agencies and 21 strategies were identified by only one agency. The remaining 118 strategies were identified by two to three agencies. A complete list of the implementation strategies identified during the brainstorming phase, and the number of agencies that selected a given strategy, is shown in Table 6. Table 6.
Implementation Strategies Identified During Brainstorming
Implementation strategy (number of agencies that selected the strategy):
Conduct ongoing training (4)
Develop a formal implementation blueprint (4)
Prepare patients/consumers to be active participants (4)
Involve patients/consumers/family members (4)
Provide clinical supervision (4)
Purposely reexamine the implementation (4)
Conduct educational meetings (4)
Fund and contract for the clinical innovation (4)
Model and simulate change (4)
Shadow other experts (4)
Organize clinician implementation team meetings (4)
Alter incentive/allowance structures (3)
Develop educational materials (3)
Access new funding (3)
Create a learning collaborative (3)
Distribute educational materials (3)
Use an implementation advisor (3)
Assess for readiness and identify barriers and facilitators (3)
Identify and prepare champions (3)
Conduct local needs assessment (3)
Increase demands (3)
Conduct cyclical small tests of change (3)
Build a coalition (3)
Develop academic partnerships (3)
Work with educational institutions (3)
Tailor strategies (2)
Provide ongoing consultation (2)
Develop and organize quality monitoring systems (2)
Facilitation (2)
Use train-the-trainer strategies (2)
Obtain and use patients/consumers and family feedback (2)
Capture and share local knowledge (2)
Conduct educational outreach visits (2)
Develop resource sharing agreements (2)
Make training dynamic (2)
Develop and implement tools for quality monitoring (2)
Place innovation on fee for service lists/formularies (2)
Mandate change (2)
Create or change credentialing and/or licensure standards (2)
Change record systems (2)
Alter patient/consumer fees (2)
Promote network weaving (2)
Facilitate relay of clinical data to providers (2)
Develop implementation glossary (2)
Remind clinicians (2)
Promote adaptability (1)
Revise professional roles (1)
Flowchart for incorrect responses* (1)
Flowchart for behaviors* (1)
Robust training* (1)
View data on outcomes* (1)
View presentation on what Project ImPACT is* (1)
Visit other sites (1)
Change accreditation or membership requirement (1)
Group and individual trainings (1)
Change physical structure and equipment (1)
Create new clinical teams (1)
Identify early adopters (1)
Recruit, designate, and train for leadership (1)
Stage implementation scale up (1)
Inform local opinion leaders (1)
Intervene with patients/consumers to enhance uptake and adherence (1)
Make billing easier (1)
Place innovation on fee for service lists/formularies (1)
Use data experts (1)
Note: * indicates an agency-generated implementation strategy
Sorting During the sorting phase, participants identified and labeled categories, and sorted strategies into participant-generated categories. In the analysis of concept mapping, I compared several cluster solutions and selected a cluster solution that further synthesized the strategies into categories. Agency 1 identified 5 categories: planning, all hands on deck, clinical needs, attempting to get new clients, and one additional unnamed pile that included miscellaneous strategies (Figure 3). Figure 3. Agency 1 Cluster Map Participants in Agency 2 did not develop labels for the categories and sorted strategies into 15 piles, with some piles including one strategy alone. For example, one unnamed pile included strategies such as: remind clinicians and provide clinical supervision, while categories with a single strategy included: (a) develop a formal implementation blueprint, (b) purposely reexamine the implementation, and (c) develop and implement tools for quality monitoring.
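As background for the cluster solutions compared above, the following is a minimal, hypothetical sketch of how individual sorting piles are commonly aggregated into a strategy-by-strategy co-occurrence matrix and then hierarchically clustered into candidate cluster solutions. The actual point and cluster maps in this study were produced by GroupWisdom, so the data structures and variable names below (e.g. sorts, co) are illustrative assumptions rather than the platform's procedure.

```python
# Illustrative sketch only: GroupWisdom produced the actual point and cluster
# maps. This shows, under simplified assumptions, how sorting piles can be
# turned into a co-occurrence matrix and compared across cluster solutions.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

# Hypothetical sorting data: each participant's piles, as lists of strategy names
sorts = [
    [["Conduct ongoing training", "Provide clinical supervision"],
     ["Develop educational materials", "Distribute educational materials"]],
    [["Conduct ongoing training", "Develop educational materials"],
     ["Provide clinical supervision"],
     ["Distribute educational materials"]],
]

strategies = sorted({s for piles in sorts for pile in piles for s in pile})
index = {s: i for i, s in enumerate(strategies)}

# Co-occurrence: proportion of participants who placed two strategies together
co = np.zeros((len(strategies), len(strategies)))
for piles in sorts:
    for pile in piles:
        for a in pile:
            for b in pile:
                co[index[a], index[b]] += 1
co /= len(sorts)

# Convert similarity to distance and cluster; compare 2- vs 3-cluster solutions
distance = 1.0 - co
np.fill_diagonal(distance, 0.0)
tree = linkage(squareform(distance, checks=False), method="average")
for k in (2, 3):
    labels = fcluster(tree, t=k, criterion="maxclust")
    print(k, dict(zip(strategies, labels)))
```

Printing the two- and three-cluster assignments mirrors, in simplified form, the comparison of cluster solutions described above before a final solution is selected for each agency.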
Agency 3 identified 6 categories: prepare, lead RBTs/RBT impact experts bulk of this step done by, Pre and early intervention, least important, things we cannot adjust or are not needs, and one unnamed pile (Figure 5). 47 Figure 4. Agency 2 Cluster Map Figure 5. Agency 3 Cluster Map Agency 4 identified 7 categories: training, train/implement, fidelity, involve/feedback, money/funding/incentives, preparation, and rapport (Figure 6). 48 Figure 6. Agency 4 Cluster Map Ranking Go-Zone graphs revealed the implementation strategies ranked as most important and feasible for each participating agency. The strategies located in the top-right quadrant of the go- zone graphs were reviewed and included as recommended strategies in the post- concept mapping report provided to each agency. All four agencies identified the following three strategies as important and feasible: (a) conduct ongoing trainings, (b) provide clinical supervision, and (c) develop educational materials. After removing duplicates, Agency 1 ranked 15 implementation strategies as most important and feasible for addressing context-specific determinants related to implementing Project ImPACT. Ratings of all strategies are shown in Figure 7, including duplicate strategies. Average ratings for top-rated implementation strategies (with duplicates removed) are shown in Table 7. For a complete list of identified implementation strategies and their importance and feasibility ratings, please refer to Appendix D. 49 Figure 7. Agency 1 Go-Zone graph Table 7. Agency 1 Importance and Feasibility Ratings Full Map Go-Zone Importance [2.2500]- [5.0000] Median = 2.5 Feasibility [1.7500]- [4.3333] Median = 2.16665 # 1 2 4 7 10 13 15 19 20 21 22 24 30 33 46 Implementation Strategy Average Rating Promote adaptability Conduct ongoing trainings Develop educational materials Create a learning collaborative Tailor strategies Involve patients/consumers/family members Assess for readiness and identify barriers and facilitators Provide clinical supervision Prepare patients/consumers to be active participants Obtain and use patients/consumers and family feedback Model and simulate change Develop a formal implementation blueprint Conduct educational meetings Purposely reexamine the implementation Distribute educational materials 4.2500 5.0000 4.5000 4.2500 4.2500 4.7500 4.2500 4.6667 4.2500 4.5000 4.2500 4.5000 4.5000 4.5000 4.5000 3.6667 4.0000 3.5000 3.7500 3.7500 3.5000 3.7500 4.2500 3.7500 3.5000 4.0000 3.5000 4.0000 3.5000 3.7500 50 Agency 2 ranked 12 implementation strategies as most important and feasible during the ranking phase (Table 8). All strategies that were ranked (including duplicates) are shown in Figure 8. All 12 strategies identified in the ranking process were from the ERIC list of implementation strategies; participant-generated strategies were not ranked as highly important and feasible. Figure 8. Agency 2 Go-Zone graph Table 8. 
Agency 2 Importance and Feasibility Ratings Full Map Go-Zone Implementation Strategy # 3 Remind clinicians 5 Develop educational material 7 Develop a formal implementation blueprint 8 Conduct ongoing training 15 Group and Individual trainings 19 Provide clinical supervision Importance [3.0000]- [4.8000] Median = 2.4 Feasibility [2.0000]- [4.8000] Median = 2.4 Average Rating 4.0000 4.4000 4.4000 4.6000 4.6000 4.6000 4.8000 4.2000 3.8000 4.0000 3.8000 4.2000 51 Table 8 (cont’d) 23 Prepare patients/consumers to be active participants 21 22 Organize clinician implementation team meetings Obtain and use patients/consumers and family feedback 25 Involve patients/consumers and family members 29 Develop and implement tools for quality monitoring 31 Conduct educational meetings 4.2000 4.4000 4.2000 4.8000 4.4000 4.2000 3.8000 4.2000 3.6000 4.0000 3.4000 3.6000 Upon completion of the ranking phase, Agency 3 had ranked 24 implementation strategies as important and feasible; of these strategies, 3 were generated by the participants themselves and the remaining 21 strategies were identified using the ERIC. The three generated strategies were: (a) view presentation on what Project ImPACT is, (b) view data on outcomes, and (c) robust training. All ranked strategies are shown in Figure 9. Average ratings for top-rated implementation strategies are shown in Table 9. Figure 9. Agency 3 Go-Zone graph 52 Table 9. Agency 3 Importance and Feasibility Ratings Full Map Go-Zone Implementation Strategy # 1 Robust training 2 View data on oncomes 3 View presentation on what project impact is 4 Develop a formal implementation blueprint 6 Distribute educational materials 8 14 Purposely reexamine the implementation Assess for readiness and identify barriers and facilitators 22 Facilitation 29 Use data experts 31 Tailor strategies 33 Remind clinicians 34 Recruit, designate, and train for leadership 35 Provide clinical supervision 36 Organize clinician implementation team meetings 37 Model and simulate change 39 Make training dynamic 43 44 Facilitate relay of clinical data to providers 45 Develop resource sharing agreements 46 Develop educational materials 47 Develop and organize quality monitoring systems 48 Develop and implement tools for quality monitoring 52 Conduct ongoing training Identify and prepare champions Importance [1.5000]- [4.5000] Median = 2.25 Feasibility [1.5000]- [4.2000] Median = 2.1 Average Rating 3.4000 4.2000 4.5000 3.4000 3.4000 3.6000 3.4000 3.4000 3.4000 3.8000 3.0000 3.0000 4.5000 3.4000 3.6000 3.6000 3.0000 3.6000 3.0000 3.0000 4.0000 4.0000 4.2500 3.4000 3.6000 4.0000 3.4000 3.8000 3.6000 3.2000 3.0000 3.2000 3.8000 3.6000 3.2000 4.2000 3.8000 4.0000 3.8000 3.6000 3.6000 3.0000 3.6000 3.6000 3.6000 3.6000 Lastly, Agency 4 ranked 13 ERIC implementation strategies as important and feasible. All ranked strategies are shown in Figure 10. Average ratings for top-rated implementation strategies are shown in Table 10. 53 Figure 10. Agency 4 Go-Zone graph Table 10. 
Agency 4 Importance and Feasibility Ratings
Full Map Go-Zone: Importance 3.2000 to 5.0000, median = 2.5; Feasibility 2.8000 to 4.0000, median = 2.
Implementation strategy (average importance rating, average feasibility rating):
3. Purposely reexamine the implementation (4.4000, 3.8000)
5. Prepare patients/consumers to be active participants (4.6000, 3.6000)
6. Organize clinician implementation team meetings (4.8000, 3.6000)
9. Develop and organize quality monitoring systems (4.2000, 3.6000)
11. Develop educational materials (4.4000, 3.6000)
12. Conduct ongoing training (5.0000, 3.6000)
15. Conduct educational meetings (4.2000, 3.6000)
18. Organize clinical implementation team meetings (4.4000, 3.6000)
19. Create a learning collaborative (4.2000, 3.6000)
23. Purposely reexamine the implementation (4.4000, 3.8000)
24. Provide clinical supervision (5.0000, 4.0000)
25. Model and simulate change (4.4000, 3.8000)
26. Involve patients/consumers and family members (5.0000, 3.6000)
27. Identify and prepare champions (4.2000, 3.6000)
31. Conduct local needs assessment (4.2000, 3.8000)
ISMM End-User Evaluations Acceptability Regarding the acceptability of the concept mapping process, respondents at Agencies 1, 2, and 4 reported liking and approving of the concept mapping process, and that the process of concept mapping was appealing and welcome (M = 3.88, SD = .25; M = 3.81, SD = .13; and M = 3.95, SD = .10, respectively). Agency 3 provided more neutral responses on average related to the acceptability of the concept mapping process (M = 3.05, SD = .45). Results are displayed in Table 11. Feasibility Overall, the agencies perceived the concept mapping process to be feasible. Agency 1 reported agreement that the concept mapping process was implementable, possible, doable, and easy to use (M = 3.44, SD = .52). Similarly, Agency 2 reported a high level of agreement with the feasibility of the concept mapping process for their organization (M = 4.0, SD = .00). Agency 3 also provided agreement with the feasibility of this process for their organization (M = 3.4, SD = .42). Lastly, Agency 4 reported high agreement that the concept mapping process was implementable, possible, doable, and easy to use (M = 3.95, SD = .10). Results are displayed in Table 11. Appropriateness In terms of appropriateness, Agency 1 reported agreement that the concept mapping process was fitting, suitable, applicable, and a good match with the organization (M = 3.56, SD = .52). Similarly, Agency 2 reported a high level of agreement with the appropriateness of concept mapping (M = 4.0, SD = .00). Agency 3 also provided agreement with the appropriateness of this process for their organization (M = 3.4, SD = .45). Lastly, Agency 4 reported high agreement that the concept mapping process was fitting, suitable, applicable, and a good match with the organization (M = 3.95, SD = .10). Results are displayed in Table 11. Usability Lastly, Agency 1 reported agreement with the usability of the concept mapping process (M = 3.45, SD = .21). Agency 2 reported similar levels of agreement with the usability of this process (M = 3.4, SD = .12), while Agency 3 reported lower levels of agreement (M = 2.72, SD = .38) that the concept mapping process was usable. Of the four agencies, Agency 4 reported the highest level of agreement with the usability of concept mapping (M = 3.64, SD = .50). Results are displayed in Table 11. Table 11.
ISMM End-User Evaluations, Means and Standard Deviations by Agency
Acceptability, Mean (SD): Agency 1 (n = 4): 3.88 (.25); Agency 2 (n = 4): 3.81 (.13); Agency 3 (n = 5): 3.05 (.45); Agency 4 (n = 5): 3.95 (.01)
Feasibility, Mean (SD): Agency 1 (n = 4): 3.44 (.52); Agency 2 (n = 4): 4 (.00); Agency 3 (n = 5): 3.4 (.42); Agency 4 (n = 5): 3.95 (.01)
Appropriateness, Mean (SD): Agency 1 (n = 4): 3.56 (.52); Agency 2 (n = 4): 4 (.00); Agency 3 (n = 5): 3.4 (.45); Agency 4 (n = 5): 3.95 (.01)
Usability, Mean (SD): Agency 1 (n = 4): 3.45 (.21); Agency 2 (n = 4): 3.4 (.12); Agency 3 (n = 5): 2.72 (.38); Agency 4 (n = 5): 3.64 (.50)
**The first three surveys utilize a 4-point Likert scale (1 - "completely disagree" to 4 - "completely agree"). The last survey uses a 5-point scale (strongly disagree to strongly agree).
Qualitative Results Frequency of Codes Qualitative data were quantitized to identify the frequency with which each code was assigned across all interview transcripts. Overall, codes related to end-user evaluations were coded most often. The most frequent codes were: (a) Feasibility (frequency: 92), (b) Acceptability (frequency: 56), and (c) Usability (frequency: 45). All codes and frequency counts are included in Appendix C. Qualitative Themes Upon completion of the coding process, codes and categories were reviewed and further grouped together by thematic similarity. Based on the codes identified during qualitative analysis, three emergent themes were identified: (1) Organizational Readiness, (2) ISMM End-User Evaluations, and (3) Mapping Strategies. Codes and categories within each of these themes aligned with the CFIR framework. Therefore, coded text was grouped into the following CFIR categories: Innovation Characteristics, Inner Setting Factors, Individual Characteristics, and Process. Some of these categories occurred within multiple themes (i.e. Individual Characteristics and Inner Setting Factors emerged as categories under both ISMM End-User Evaluations and Mapping Strategies). Although the CFIR also includes an "Outer Setting Factors" domain, this construct did not emerge as a category or theme in the qualitative analyses. External pressure was not discussed as a factor important to the concept mapping process or impacting organizational readiness. It may be that external pressure factors more into dissemination and implementation decisions (e.g. during pre-implementation activities). Finally, within each category, a priori and emergent codes represented (1) factors that influenced end-users' evaluations of the concept mapping process, (2) factors that impacted the perceived organizational readiness of participating agencies after completing concept mapping, and (3) how implementation strategies mapped on to agency-specific implementation barriers. Theme 1: Organizational Readiness The first theme illustrates participants' perspectives regarding their organization's readiness to implement Project ImPACT, with a focus on two categories: organizational capacity and motivation. In terms of organizational readiness broadly, some participants expressed that the concept mapping process increased their knowledge about their organization and helped develop a foundation for the implementation of Project ImPACT. For some agencies, this process led to conversations regarding funding and encouraged participants to think more flexibly about ways to address barriers and make improvements.
One participant stated "… it's definitely started that dialogue and that conversation [about] 'how can we implement things like this?' And like I mentioned, like, 'how can we increase the budget for 2023 to allow additional things?' So, it definitely sparked that conversation, like, 'How do we implement something new and not just keep on reinventing the wheel over and over again with the same strategies?'" However, other participants reported that this process did not improve organizational readiness to implement Project ImPACT in particular. Nevertheless, those participants noted that if they chose to implement interventions at their agency in the future, they would consider utilizing concept mapping or a similar process to guide implementation preparation and planning. Organizational Capacity. In terms of organizational capacity to implement an intervention, participants expressed that the concept mapping process helped staff identify agency-level implementation barriers, provided agencies with a plan for implementation, increased participants' confidence about implementation, and set agencies up for success for future implementation efforts. Organizational capacity was discussed in relation to the agency more broadly, as well as in relation to implementing Project ImPACT specifically. In terms of the agencies more broadly, participants described the impact of the post-Concept Mapping report that was provided to them following the completion of the final questionnaire. One participant explained "especially the post-report, I think it will definitely provide some more insight on what…we as a company all like kind of value. And again I feel like we do already implement all these strategies, but I think it kind of sets us up to like what things we should probably focus on. And I feel like we're pretty good at implementing strategies, but it will help us know which barriers to implement these strategies on". Although this step is not inherently part of the concept mapping process, these perspectives highlight how a needs assessment improved knowledge regarding which barriers to address when utilizing an ISMM. Indeed, innovation-specific knowledge, skills, and abilities are an important subcomponent when evaluating the organizational capacity to change. Qualitative analysis indicated that participants identified lack of knowledge about Project ImPACT as a significant barrier to implementation. Findings suggest that providing participants with a detailed post-concept mapping report provided organizations with additional knowledge that may ultimately improve capacity for change. In terms of organizational capacity to implement Project ImPACT specifically, participants explained that this process helped increase staff's confidence and belief in the chance of implementation success. However, inner contextual factors such as lack of time and opportunity to complete implementation steps were expected to continue to impede the implementation of Project ImPACT. Overall, although some aspects of staff capability (i.e. confidence) improved as a result of concept mapping, other aspects of capability (i.e. time and opportunity) continued to pose a barrier to organizational capacity to implement Project ImPACT now or in the future. Motivation. In terms of organizational motivation to implement Project ImPACT, participants reported increases in individual-level motivation to implement this intervention as concept mapping allowed participants to visualize a step-by-step plan.
Additionally, one participant described how specific implementation strategies may increase motivation, “if you have like a learning collaborative in place like it's one thing that's mentioned here, people are gonna share their successes and sharing the success […] it's like a social reinforcer, you know, like, you know, if you're like, oh, I had this success. And they're like, oh, yeah, I did this. And everyone's sharing what's working well, we're much more likely to stick to it”. However, one participant felt a lack of motivation throughout and following the process, and attributed this to limited knowledge and information regarding Project ImPACT. Qualitative analysis also highlighted how intervention specific characteristics were relevant to organizational motivation. For example, one participant noted a lack of perceived relative advantage related to Project ImPACT, as a result of limited information about the intervention “lack of information… We did the study, we participated, you know, again without making that sale. There's no reason that we wanna do it or research it or find it”. Overall, qualitative analysis regarding organizational motivation indicated that these changes occurred primarily at the individual-level, with most participants stating that concept mapping led to increased individual motivation. However, participants also noted continued barriers (e.g. lack of information, relative advantage) to increased organizational motivation. 60 Theme 2: ISMM End-User Evaluations This theme highlights factors that were discussed as influencing end-users’ evaluations of concept mapping. These factors were organized into three categories included: Innovation Characteristics, Inner Setting Factors, and Individual Characteristics. All three categories align with the CFIR framework. Innovation Characteristics. When discussing innovation characteristics that influenced the perceived feasibility, acceptability, usability, and appropriateness of concept mapping, participants mentioned the concept mapping steps, as well as four specific characteristics of the innovation: adaptability, complexity, relative advantage, and design. Overall, participants reported that the three concept mapping steps (i.e. brainstorming, sorting, and ranking) were acceptable, and noted that they appreciated the anonymity provided by completing the process online. Some participants stated that the steps were feasible in terms of being “pretty quick” to complete while other participants felt the steps were time-consuming. Participants also identified factors that made the process less feasible, such as: redundancy in the pre-questionnaire questions and in the strategies selected, an overwhelming number of strategies to brainstorm/select and rank, and feeling that the process was stretched out due to waiting for all participants within an agency to complete a step before moving to the next phase. In terms of usability, participants reported that the brainstorming and ranking steps were most useful in order to select and prioritize strategies that were relevant to their organization. For example, one participant noted that the process was “more tailored to our facility… because of our opinion”. Conversely, others noted that it was difficult to identify the purpose of this step and that it "wasn't as helpful or informative as the ranking”. 
Furthermore, the sorting step was considered less feasible to complete, as participants were unsure of what kinds of labels or categories to create in order to organize the strategies. As a result, some participants recommended the use of pre-determined categories to guide participants in the sorting phase during future concept mapping processes. Four codes that aligned with the CFIR's Innovation Characteristics domain were identified when participants discussed their perspectives regarding the acceptability, usability, and feasibility of concept mapping, but were not discussed in relation to the appropriateness of this method. Participants noted that the brainstorming and ranking steps were acceptable as they influenced the perceived adaptability of the innovation; specifically, these steps allowed organizations to select and prioritize strategies most relevant to their context. Related to acceptability, participants also noted the relative advantage of the concept mapping process as it allowed for collaboration across staff levels/roles in identifying implementation barriers and relevant strategies. Overall, the relative advantage of concept mapping influenced the perceived acceptability and usability of this process. One participant explained, "I think there were several times where we've discussed implementing new strategies and new implementations and I think this was the most effective way that we've done it as opposed to the past where we've just maybe talked about it and then that was just kind of that". Additionally, the design of the concept mapping website allowed participants to engage in this method anonymously, which participants found both acceptable and useful. The website's design also influenced the perceived feasibility of concept mapping, as participants reported that having specific tools within the website (i.e. the ability to use a computer, click and drag, copy and paste) made it easy to use as well as "easy to navigate". Lastly, in terms of innovation complexity, participants noted that the pre-concept mapping questionnaire was less feasible for staff members to complete if they were not as familiar with organizational needs or were unable to answer implementation-related survey items. Additionally, the concept mapping steps were described as more complex than participating organizations' implementation-as-usual process, which led participants to perceive concept mapping as less feasible than their current organizational processes. Inner Setting Factors. Overall, participants reported that concept mapping was useful in providing organizations with strategies to focus on and learn more about. Further, concept mapping was considered acceptable as the process allowed agencies to elicit perspectives across staff levels. As one participant stated, "It felt like everything was very acceptable to everyone and they understood it, and they were kind of seeing even people in various positions were feeling the same way". While most participants stated that concept mapping was appropriate for their agency, one participant noted that the staff "don't see the [current] situation as intolerable". This participant felt that undergoing this process was not a priority and did not seem necessary to all staff at the agency. In terms of feasibility, some participants reported that rating the feasibility of implementation strategies was difficult, due to inner contextual factors that would impede implementation efforts.
Specifically, a participant stated, "some of [the feasibility ratings] just ties back to my center. I just think we'd struggle in a lot of ways. And so it was kind of hard to rate what I think would be better or worse". In addition to these broader inner contextual factors, participants also described two specific inner contextual factors that influenced evaluations of concept mapping: structural characteristics and access to knowledge and information. In line with the CFIR framework, participants described IT infrastructure issues related to structural characteristics of their organizations. Due to security settings (i.e. organizational firewall), accessing the concept mapping website from work was less feasible for participants at one organization; this was not a barrier that was noted by participants at other agencies. Access to knowledge at the organizational level was also a barrier to completing concept mapping. Specifically, participants reported that completing the pre-concept mapping questionnaire and ranking implementation strategies was less feasible for staff who may have less information regarding agency-level needs and capabilities. For example, one participant stated "I think that's the biggest match is the lack of information... I mean, it's more of a black box to us that alright... I have this therapy we'd like to have you implement and it's like, well, what do we need to? We don't know what we don't know, so it's really hard to say what we're missing and what we're lacking". While this code aligned with the CFIR framework, one additional code was generated during the analysis process: limited knowledge of the intervention being considered for use, Project ImPACT. This code specifically captured participants' responses regarding a lack of background information and understanding of the intervention itself; this inner context factor impacted the feasibility of completing the brainstorming and ranking steps of concept mapping. One participant noted, "I'm not like an expert in Project ImPACT by any means. I'm knowledgeable of it and so I don't know how many people that participated are, so I felt like some of the questions might be might have been difficult for them to answer, like stuff about like how would you know most of the employees perceive like Project ImPACT like, oh, I don't, they may not know how to answer that because if they're not super knowledgeable".
Individual Characteristics. Finally, four CFIR individual characteristics were discussed in relation to end-user evaluations: roles, knowledge, opportunity, and individual staff characteristics more broadly. Notably, roles and knowledge were often double-coded. Qualitative data highlighted a pattern such that participants in the behavior technician role reported less knowledge about both their organizational needs as well as Project ImPACT itself. As a result, these participants reported that completing the needs assessment was challenging, and perceived concept mapping as less feasible as a result of both their role in the organization and their knowledge regarding their organization and the intervention. Overall, both role and lack of knowledge at the individual level impacted the feasibility of completing the concept mapping process, particularly for those in the behavior technician role.
Similarly, these factors impacted the acceptability of this process, as technicians reported feeling frustrated due to limited relevant knowledge, as well as not understanding why their organization was completing this process. In addition, high-level leaders (e.g. organizational directors) at the participating agencies stated that aspects of this process were less feasible, as they tried to consider how their staff might select and rank implementation strategies. However, participants across all staff levels (leaders, supervisors, direct providers) felt that including staff across multiple roles was "nice because it gives everyone a sense of like involvement on like what we're gonna do and … this is the route we're gonna take because majority felt this way". Participants across all staff levels reported that concept mapping was usable, as they gained a greater understanding of their organization's current barriers and considered how this process would impact the innovation recipients (i.e. clients, families) of Project ImPACT. For example, participants highlighted the usability of identifying implementation strategies (e.g. engaging patients/consumers) that would increase buy-in and participation from innovation recipients. In addition to roles and knowledge, opportunity (i.e. lack of time) to complete concept mapping was highlighted as a factor that impacted the feasibility of this process. This code also overlapped with roles. As one supervisor stated, "I think that that is the biggest thing is like that time aspect of being prepared, being able to teach and implement it and being able to as like supervisors have that mastery of it".
Theme 3: Mapping Strategies
During the interviews, participants reviewed the post-concept mapping report provided to them, and described their thoughts regarding whether and how the recommended implementation strategies might address barriers. Participants were not asked about specific strategies and were given the opportunity to describe any strategies that were particularly salient to them. There were three categories that detailed participants' discussions of how implementation strategies mapped on to identified context-specific implementation determinants. All three categories align with the CFIR framework: Individual Characteristics, Inner Setting Factors, and Process.
Individual Characteristics. Individual staff characteristics, including motivation, capability, and rigidity, were identified as salient barriers to implementation that could be addressed by implementation strategies. Participants felt that ongoing training may address motivational barriers by increasing enthusiasm and buy-in for implementation while also reducing negative attitudes towards implementation. Specifically, participants explained that strategies such as training, access to educational meetings, materials, and resources, and clinical supervision would likely increase staff's beliefs about their capability to support implementation. Lastly, participants across a number of agencies reported that staff have a tendency to be rigid around interventions used and may prefer not to introduce new interventions. Participants felt that implementation strategies such as consistent reminders and the use of an implementation blueprint may help to reduce rigidity around implementation.
One participant explained "I think reminding us would be, definitely […] since […] we've already been doing things in, like a certain way that it'd be easy for a lot of people to forget to, like, implement Project ImPACT, you know, so I think that reminder to just be like hey guys you know try this out would definitely beneficial. So that way we don't just get like you know tunnel vision on one specific thing". Overall, although a number of individual characteristics were identified as barriers to implementing Project ImPACT, participants believed that the strategies they ranked as important and feasible would be helpful in overcoming those barriers.
Inner Setting. One inner setting barrier was discussed by participants at two of the four agencies: limited access to knowledge about Project ImPACT. This barrier impacted end-user evaluations (i.e. feasibility, appropriateness) of the concept mapping process, but was also cited as a barrier to implementation more broadly. However, participants felt that several strategies would help to address this barrier, including all education-related strategies, clinical supervision, and ongoing training.
Process. Lastly, two codes aligned with the CFIR model's process domain: assessing needs and planning. Additionally, the remaining sub-codes fell under a broader "strategies" code. Participants mentioned the importance of assessing needs prior to undergoing the concept mapping or implementation processes. Participants reported that both the pre-concept mapping questionnaire and report increased the usability of the concept mapping process by giving organizations "some the knowledge of what we need and what we currently don't have". In terms of planning, participants identified one implementation process-related barrier: an implementation scheme/sequence of tasks was not already developed. During interviews, participants highlighted that using an implementation blueprint would be an important and feasible strategy for addressing this barrier. Finally, participants detailed several implementation strategies that would address context-specific barriers identified in the pre-concept mapping questionnaire. Implementation strategies mapped on to a range of barriers. The most commonly discussed strategies during interviews included:
(a) Engaging/involving patients and consumers: participants expressed that engaging or involving the families of their clients is often a challenge, but an area that staff find important and valuable for supporting the generalization of interventions at home and in other settings, as well as to inform the understanding of clients' progress. As a result, participants identified engaging/involving patients and consumers and obtaining patient/consumer feedback as an important step to better involve families and address this barrier.
(b) Developing an implementation blueprint: participants believed that this strategy would be helpful for identifying implementation goals and outcomes, as well as to address the individual-level barrier of rigidity among staff. One participant stated, "I think having a blueprint, having you know, a checklist to kind of go down and make sure we're accomplishing that will keep us in line and keep us doing things we're supposed to be doing instead of, yeah um, becoming too rigid in one sort of way that we're doing things".
(c) Reminders: similarly, reminders were another implementation strategy that participants felt would help address barriers around individual rigidity, as "[having] that constant reminder would help us to be thinking more critically about how we're doing things and not get too rigid in the way that we're implementing interventions".
(d) Resources/materials: participants discussed the importance of preparing and having access to resources/materials related to the intervention as a relevant implementation strategy, with participants noting "I've seen interventions fail because no one has the time to put the time into making the materials, so like pre prepared stuff is important".
(e) Quality monitoring: a number of participants mentioned quality monitoring as a relevant strategy, and specifically highlighted the importance of collecting data to monitor progress during implementation. A participant explained "I think that was a big thing that I think is important is making sure that with any kind of new program or system you're implementing being able to do that like quality assurance to make sure that it's being implemented correctly, that those strategies are individualized for each client that's being implemented with, and then being able to follow up to make sure that staff are continuously implementing that correctly".
Moreover, several implementation strategies were related to education and training:
(f) Clinical supervision: the majority of participants who completed interviews mentioned clinical supervision as a relevant strategy for their agencies. Participants felt this strategy would help with providing consistent and quality intervention, making sure staff receive ongoing training, and ensuring that staff are able to have questions answered by someone who is knowledgeable about the intervention. One participant stated "I think if you're implementing something new, if someone hasn't done it before, everybody… you're not gonna have effective implementation without supervision".
(g) Training: most participants also mentioned "ongoing training" and "making training dynamic" as important strategies to ensure the provision of consistent and high-quality treatment delivery. Participants discussed the relevance of both individual and group trainings, and suggested that training could also occur within the context of supervision. Overall, participants felt that training would help increase and sustain motivation, knowledge, and enthusiasm: "all the individual characteristics, so they don't have confidence in their capabilities to execute their action and then satisfaction and commitment to the organization… training will help with that".
(h) Educational strategies: lastly, participants emphasized using educational strategies (e.g. conduct educational meetings, develop and distribute educational materials) to cover a variety of topics such as family engagement, Project ImPACT, implementation processes, and quality or progress monitoring. Participants believed that educational strategies would address barriers around provider knowledge and enthusiasm, and could potentially increase buy-in from leadership to provide funding for the intervention. A participant explained "the educational meeting would probably be the biggest thing.
I'm really taking that up to the higher ups and discussing with them, showing them all the benefits of it, but I feel like that's something where because we're in the field of ABA, we need to have that data collection on our side, then, to show that here are the differences that were made with Project ImPACT and this is how beneficial it is. And then once we can show that difference from a point of not implementing Project ImPACT to a point of implementing it and being able to show that contrast, then at that point then we would have a stronger foot to stand on in those educational meetings outside of showing research from other companies… So I feel like that's kind of what I see as the barrier right now". In addition to the funding barrier, across supervision, training, and educational strategies, participants still noted that staff turnover at the behavior technician level may continue to hinder implementation, and that these strategies would need to be utilized repeatedly as a result. Other implementation strategies identified in concept mapping were mentioned less often during interviews: clinician team meetings, "oversight" or ongoing support, identify and prepare champions, outreach to other organizations, fidelity checks, facilitate relay of data to providers, and create a learning collaborative. Overall, participants reported that the strategies they identified were relevant to their organization's current needs and that "these were really good strategies that are important to start with… I think this incorporates like what needs to happen before, what needs to happen during, and then like the continuous follow up to make sure that the it's being implemented correctly". Overall, qualitative themes illustrate factors that influenced perspectives around the feasibility, acceptability, usability, and appropriateness of concept mapping, as well as the impact of concept mapping on organizational readiness to implement Project ImPACT. Lastly, themes highlight participant perspectives related to how implementation strategies map on to and address context-specific barriers to implementing Project ImPACT.
Discussion
This mixed-methods study aimed to pilot the use of concept mapping as an implementation strategy mapping method within the context of community mental health agencies serving autistic children experiencing socioeconomic disadvantage. Specifically, this project aimed to (a) examine the impact of concept mapping on organizational readiness to change (i.e. capacity and motivation) in CMH agencies serving autistic youth, and (b) evaluate end-user evaluations (i.e. feasibility, acceptability, appropriateness, and usability) of concept mapping as an ISMM in CMH agencies.
Aim 1: Organizational Readiness
Quantitative and qualitative strands were merged into a joint display (Table 12) illustrating how organizational readiness changed after completing the concept mapping method (quan strand), as well as participants' perspectives of the impact of concept mapping on their organization's capacity and motivation to implement Project ImPACT (QUAL strand). The joint display highlighted how the qualitative codes (organizational readiness, motivation, capacity) complemented constructs measured by the ORC, ORIC, and ORCA. Overall, there were no significant improvements in organizational readiness across all four agencies. These data converged with qualitative findings, which indicated that organizational-level motivation did not change significantly after completing concept mapping.
However, some participants expressed changes in individual-level motivation after engaging in this process, highlighting some convergence of the two data strands. Quantitative data also converged with qualitative results, which indicated that while participants believed there were some improvements in their capacity to implement this intervention, several barriers (e.g. time, opportunity) that may hinder implementation efforts continued to exist. Lastly, participants explained that the concept mapping process improved readiness by encouraging conversations regarding training and funding needs at the participating organizations. However, quantitative data revealed no significant improvements in overall organizational readiness, indicating divergence across the two data strands. Although qualitative findings revealed that none of the agencies plan to implement Project ImPACT in the near future, participants expressed that this process increased overall readiness for future implementation efforts by providing organizations with a structured process to plan for implementation, as well as relevant strategies to support implementation. This study advances our understanding of ISMMs, and is, to our knowledge, the first study to evaluate changes in organizational readiness following the use of a method to select and tailor implementation strategies. Research has shown that organizational readiness is key to achieving successful implementation and relevant outcomes of implementation efforts (Scaccia et al., 2020). Although quantitative findings did not indicate significant improvements in motivation and capacity to change, qualitative data indicate some promise that concept mapping may influence the critical component of organizational readiness for implementation. Importantly, studies have shown that there is minimal evidence to suggest that certain support strategies can change perceptions of subcomponents related to motivation (Scaccia et al., 2020). Support strategies include tools, training, technical assistance or coaching, and quality improvement or assurance. These strategies demonstrate the various types of support that may be needed during implementation efforts, particularly for implementation teams and support practitioners (Leeman et al., 2015). Yet, some extant research indicates that these strategies may have little to no impact on perceptions regarding the relative advantage, compatibility, complexity, trialability, and observability of an intervention. Rather, these subcomponents may influence motivation to implement an intervention. Additionally, research has shown limited evidence that support strategies improve several subcomponents of general and organizational capacity, including innovation knowledge, skills, and abilities, implementation climate, interorganizational relationships, organizational culture, leadership, and staff capacity. Indeed, several of these factors were discussed during qualitative interviews, including the relative advantage and complexity of Project ImPACT, as well as staff's knowledge, skills, and abilities to utilize this intervention. These findings highlighted the importance of potentially intervening upon or addressing these factors when engaging in the concept mapping process. Although ISMMs have not been categorized as a "support strategy" per se, this area of research highlights the importance of evaluating which methods or strategies may influence subcomponents of both organizational motivation and capacity.
Indeed, it is possible that a process for selecting and tailoring implementation strategies may have limited impact on these aspects of organizational readiness. Further research investigating whether and how concept mapping or other ISMMs improve components of organizational readiness may be beneficial to better understand how to influence these outcomes. Overall, these findings indicated that there were no significant changes in organizational readiness after completing concept mapping. Qualitative findings provided further descriptions of whether and how the concept mapping process impacted organizational readiness. Together, these findings indicated some potential changes in motivation and capacity to change; however, none of the participants expressed intention to implement Project ImPACT in the near future.
Aim 2: ISMM End-User Evaluations
The two data strands were also merged in the joint display to illustrate end-users' evaluations of the concept mapping process as an ISMM (Table 12). Overall, quantitative data indicated that most participants felt the concept mapping process was highly acceptable (M= 3.65), feasible (M= 3.69), and appropriate (M= 3.72), and reported that the process was usable (M= 3.37). The quantitative data converged with the qualitative findings, as participants frequently commented on the feasibility, acceptability, and usability of this process. While participants also discussed the appropriateness of concept mapping for their organizations, they did so less frequently compared to discussing other evaluations during the interviews. However, appropriateness was highly rated across all organizations based on quantitative findings. Several factors appeared to influence end-users' evaluations of concept mapping, including knowledge/familiarity with Project ImPACT, staff role in the organization, and the relationship between staff role and knowledge. Specifically, participants explained that for direct providers (i.e. behavior technicians) in particular, staff had limited knowledge and understanding of both agency-level needs and barriers to implementation, as well as knowledge related to the Project ImPACT intervention itself. The inclusion of participants with varying roles and knowledge presented a barrier to the feasibility of the concept mapping process. Additionally, the use of materials such as the ERIC list of implementation strategies may have also posed a barrier. Indeed, research has highlighted that non-implementation researchers report confusion and a lack of understanding of implementation strategy terminology and jargon (Yakovchenko et al., 2023). Yet, previous research has highlighted the importance of including staff who represent different roles within an organization, in order to facilitate effective implementation (Bustos et al., 2021; Drahota et al., 2020; Schultes et al., 2018). In this study, staff reported high levels of acceptability related to the inclusion of staff across different levels (i.e. leaders, supervisors, technicians). Findings suggest that although staff knowledge may have presented a barrier to engaging in concept mapping, it remains important to include staff across various levels in order to increase buy-in for implementation efforts. Therefore, future research should consider methods to increase staff knowledge of relevant implementation terminology, as well as efforts to address staff knowledge related to the intervention itself.
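As an illustration of the quantitative strand summarized above, the sketch below shows one way the aggregate end-user evaluation ratings and the pre/post paired-samples t-tests could be computed. It is a minimal sketch, assuming a per-participant data file and hypothetical column names (agency, acceptability, feasibility, appropriateness, usability, orc_motivation_pre, orc_motivation_post); it is not the study's actual dataset or analysis script.

# Illustrative sketch only: hypothetical file and column names,
# not the study's actual data or analysis code.
import pandas as pd
from scipy import stats

# Each row holds one participant's pre/post questionnaire responses.
df = pd.read_csv("readiness_ratings.csv")  # hypothetical file

# Aggregate end-user evaluation ratings (1-4 scale) per agency and overall,
# analogous to the reported means for acceptability, feasibility,
# appropriateness, and usability.
eval_cols = ["acceptability", "feasibility", "appropriateness", "usability"]
per_agency = df.groupby("agency")[eval_cols].agg(["mean", "std"])
overall = df[eval_cols].agg(["mean", "std"])
print(per_agency.round(2))
print(overall.round(2))

# Paired-samples t-test comparing pre- and post-concept mapping readiness
# scores (e.g., an ORC motivation subscale), run within each agency.
for agency, group in df.groupby("agency"):
    t_stat, p_value = stats.ttest_rel(
        group["orc_motivation_pre"], group["orc_motivation_post"]
    )
    print(f"Agency {agency}: t = {t_stat:.2f}, p = {p_value:.3f}")

With only five respondents per agency, such paired tests are underpowered, which is consistent with the small-sample limitation discussed in the Limitations section below.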
While researchers have begun to measure end-user evaluations of other ISMMs in different settings (Powell et al., 2020), the current study's findings provide a deeper understanding of how participants experience one specific ISMM: concept mapping. A recent scoping review highlighted the importance of obtaining further information on these constructs within a greater variety of settings, including settings that serve diverse communities with a high level of intersecting needs (Proctor et al., 2023). The current study provided an evaluation of ISMM end-user evaluations within the novel context of CMH agencies serving a diverse and marginalized population: autistic children experiencing socioeconomic disadvantage. In addition, this scoping review noted that research has often focused on reporting quantitative data on acceptability, feasibility, and appropriateness specifically. In contrast, the use of mixed methods in this project provided a deeper evaluation of these constructs and allowed for an understanding of how quantitative and qualitative data converged and diverged. However, further measurement of less-studied end-user perspectives (e.g. sustainability, cost, penetration) in relation to ISMMs continues to be an important area for future research. Overall, findings indicate that the concept mapping process had positive ISMM end-user evaluations across all four agencies based on both quantitative and qualitative results. Importantly, this is the first study to explore evaluations of concept mapping as an ISMM. Understanding these constructs may be integral to the implementation process, given the importance of end-user buy-in and motivation in implementation efforts.
Table 12. Joint Display of ISMM End-User Evaluations and Organizational Readiness
Construct: ORC (Motivation). Quan strand: p = 0.92, 0.78, 0.59, 0.44, 0.56, 0.21, 0.37, 0.42, 0.39, 0.67. QUAL strand illustrative quote: "At this point we have no reason to want to do Project ImPACT… The sale wasn't made because we have no clue what it is or why we need that, so there's no reason to buy".
Construct: ORIC (Motivation and Capacity). Quan strand (Agency 1, Agency 2, Agency 3, Agency 4, all agencies): p = 0.47, 0.61, 0.92, 0.28, 0.65. QUAL strand illustrative quote: "If we were to use this process for a certain project or intervention later down the road, we would definitely be setting ourselves up to be more prepared because again we would be looking at all things like ahead of time and kind of prioritizing and figuring out what's most important".
Construct: ORCA/Organizational Readiness. QUAL strand illustrative quote: "I'm not sure as an organization, but I can see if we were to do that, that some of us would think towards a process like this of like maybe we should implement one of these strategies where we're… using a system..just that knowledge of you could use a system like this to rank and categorize and brainstorm".
Construct: Acceptability. Quan strand, M (SD) (Agency 1, Agency 2, Agency 3, Agency 4, all agencies): 3.88 (.25), 3.81 (.13), 3.05 (.45), 3.95 (.01), 3.65 (.40). QUAL strand illustrative quote: "…it was pretty satisfactory or acceptable just because everyone would talk about it as well, like in person. So that was that was kind of nice and I think it helps to having everything kind of laid out and seeing like what everyone's thoughts were, especially when we were creating those concepts and stuff".
Construct: Feasibility and Appropriateness. Quan strand, M (SD): 3.44 (.52), 3.56 (.52), 4.00 (.00), 3.40 (.42), 4.00 (.00), 3.40 (.45), 3.95 (.01), 3.95 (.01); Feasibility, all agencies: 3.69 (.42); Appropriateness, all agencies: 3.72 (.14). QUAL strand illustrative quotes: Feasibility: "I found it like feasible, like…we can do it. The process was like a little bit confusing, but…I feel like some of it was just cause it's it was all over like…knowledge"; Appropriateness: "A really great fit, super generalizable across multiple levels of employment too, like even staff members but also clinicians, but also stakeholders and everything. So super simple for our company especially".
Construct: Usability. Quan strand, M (SD) (Agency 1, Agency 2, Agency 3, Agency 4, all agencies): 3.45 (.21), 3.40 (.12), 2.72 (.38), 3.64 (.50), 3.37 (.77). QUAL strand illustrative quote: "Very useful. Yeah, I think it kind of brought together like again like people's priorities versus others and importance to people".
Limitations
There were several limitations to this study. Firstly, this study involved a small sample of organizations; as a result, the generalizability of these results is limited. Furthermore, we were not able to conduct quantitative analyses beyond evaluating aggregate responses for end-user evaluations and paired samples t-tests to explore changes in organizational readiness. Future studies that seek to evaluate the effectiveness and impact of ISMMs will likely benefit from the inclusion of larger and more representative samples. In addition, participants raised an important limitation during the semi-structured interviews related to their lack of information and motivation to utilize Project ImPACT. An eligibility criterion for participation in this study was that organizations expressed an interest in implementing this intervention. However, this interest was endorsed by agency leaders and did not necessarily represent the perspective of other staff members. Furthermore, familiarity with and knowledge of this intervention were often discussed in association with staff role, as direct providers and clinical supervisors reported a lack of information, while leaders did not endorse this barrier. Overall, these responses indicated that, although we provided educational resources and an overview of Project ImPACT to all agencies, further information and resources may have been beneficial to increase knowledge of the intervention prior to engaging in an ISMM process. It may also be possible that ISMMs are particularly effective for organizations that have already selected an intervention to implement; future research in this area may provide a greater understanding of how ISMM end-user evaluations and organizational readiness may vary depending on the selected intervention. Participants also highlighted how the modality of concept mapping may have been a limitation. Specifically, while some participants reported acceptability around completing this process anonymously, online, and in their own time, other participants stated that they may have preferred to complete this process in person. Of note, completing this process in person may have allowed the process to move faster, as participants would have completed each step simultaneously and in collaboration with one another. Additionally, some participants reported being forgetful and relying on reminders from the research team to complete concept mapping steps. Other participants expressed that having a member of the research team present during the concept mapping steps may have been helpful in order to address any questions. Overall, the modality of this process likely impacted participant perspectives regarding end-user evaluations, such as acceptability and feasibility.
However, this study did not compare the use of concept mapping across in-person and virtual modalities; as a result, how modality impacts end-user evaluations remains unknown. In addition, researcher errors were made during the concept mapping phase of the study. Specifically, after completing the brainstorming phase of concept mapping, duplicate strategies were removed for agencies 3 and 4, but not for agencies 1 and 2. As a result, agencies 1 and 2 sorted and ranked duplicated strategies, which likely increased participant burden and length of time to complete these steps. Additionally, duplicates for agencies 3 and 4 were deleted on the concept mapping website immediately after the brainstorming phase was completed; information regarding the number of duplicated strategies was not tracked. Finally, although this study sought to identify strategies that map onto implementation determinants, the use of the CFIR needs assessment may have elicited responses related to implementation barriers alone, rather than barriers and facilitators. This limitation is common among ISMM studies and remains an important area for further investigation (Sridhar et al., 2023). The use of this measure may therefore have limited the identification and prioritization of implementation strategies that seek to enhance facilitating factors within participating organizations. As a result, these findings primarily represent implementation strategies to overcome context-specific barriers, rather than implementation strategies that map on to context-specific determinants more broadly.
Future Directions
This study highlighted several areas for future research related to the use of concept mapping and ISMMs more broadly.
ISMM Active Ingredients
Firstly, further research is needed in order to identify and understand the mechanisms of action underlying how ISMMs can increase motivation and capacity for change. While this study revealed promising findings regarding concept mapping and organizational readiness, there is a need to better understand the active ingredients of concept mapping and other ISMMs, in order to utilize such processes efficiently. Previous research has highlighted common steps across various ISMMs (Sridhar et al., 2022), including the use of a needs or agency assessment to identify implementation determinants, utilizing the ERIC list when selecting implementation strategies, and engaging participants in rating the feasibility and importance of implementation strategies. This study included all of these common steps either prior to or during the concept mapping process. Indeed, participants often discussed the impact of the needs assessment and pre- and post-concept mapping reports on organizational readiness, and their perspectives on concept mapping. However, these steps are not included in the traditional concept mapping process. As a result, it is unclear to what extent these findings were influenced by the inclusion of these steps, or whether they are truly representative of concept mapping as an ISMM alone. Therefore, further research investigating the impact of the various steps involved in ISMMs, including an agency assessment report, may be valuable in determining the active ingredients in these methods.
ISMM Effectiveness
Secondly, further evaluation of ISMMs, beyond end-user evaluations, is needed in order to determine whether these processes are effective in facilitating implementation.
Specifically, a focus on the effectiveness of ISMMs in improving organizational readiness, as well as the feasibility, acceptability, and appropriateness of a given intervention within a setting, is necessary. The ultimate goal of utilizing ISMMs is to increase the implementation and use of evidence-based interventions across settings in an effort to reduce disparities in access to services. While this study revealed positive end-user evaluations related to the use of concept mapping in CMH agencies serving autistic children, further research investigating long-term outcomes such as equity in service access and clinical improvement in clients remains an important and necessary step in ISMM research. Additionally, this study indicated that participants were able to identify context-specific determinants, and select and prioritize important and feasible implementation strategies to address those determinants. However, future studies should investigate the effectiveness of selected implementation strategies; specifically, studies should seek to evaluate whether participant-selected strategies are effective in reducing barriers and enhancing facilitators to implementation. These findings will be important in understanding which ISMMs may be most effective. Furthermore, the effectiveness of concept mapping may vary by setting and population. Further research investigating the use of concept mapping and other ISMMs may be beneficial in order to identify which ISMMs are most effective in specific settings, and further tailor the use of such processes across different contexts.
Conclusion
This study sought to evaluate whether concept mapping, when utilized as an ISMM, increased organizational readiness, motivation, and capacity to implement a novel evidence-based intervention within CMH agencies serving autistic youth. Findings indicated that concept mapping did not significantly improve organizational readiness, motivation, and capacity to implement Project ImPACT within this context. While participants expressed that they are not likely to begin implementation in the near future due to other priorities and continued low motivation, they believed that this process was a helpful first step in planning for future implementation efforts. Moreover, participants generally reported that concept mapping is a feasible, acceptable, useful, and appropriate method for identifying feasible and important implementation strategies that are tailored to an organization's specific needs. These findings suggest that concept mapping may be a promising implementation strategy mapping method. Further examination of the impact of ISMMs on implementing NDBIs, such as Project ImPACT, in CMH agencies is needed in order to understand the value of utilizing such approaches to improve implementation processes and implementation, service-, and client-level outcomes.
REFERENCES
Aarons, G. A., Hurlburt, M., & Horwitz, S. M. (2011). Advancing a Conceptual Model of Evidence-Based Practice Implementation in Public Service Sectors. Adm Policy Ment Health, 20.
Aarons, G. A., Wells, R. S., Zagursky, K., Fettes, D. L., & Palinkas, L. A. (2009). Implementing Evidence-Based Practice in Community Mental Health Agencies: A Multiple Stakeholder Analysis. American Journal of Public Health, 99(11), 2087–2095. https://doi.org/10.2105/AJPH.2009.161711
Adams, D., & Young, K. (2020).
A Systematic Review of the Perceived Barriers and Facilitators to Accessing Psychological Treatment for Mental Health Problems in Individuals on the Autism Spectrum. Review Journal of Autism and Developmental Disorders. https://doi.org/10.1007/s40489-020-00226-7 Adrian, M., Coifman, J., Pullmann, M. D., Blossom, J. B., Chandler, C., Coppersmith, G., Thompson, P., & Lyon, A. R. (2020). Implementation Determinants and Outcomes of a Technology-Enabled Service Targeting Suicide Risk in High Schools: Mixed Methods Study. JMIR Mental Health, 7(7), e16338. https://doi.org/10.2196/16338 Ausubel, D. P. (1968). Educational psychology: A cognitive view. New York. Baker, R., Camosso-Stefinovic, J., Gillies, C., Shaw, E. J., Cheater, F., Flottorp, S., Robertson, N., Wensing, M., Fiander, M., Eccles, M. P., Godycki-Cwirko, M., van Lieshout, J., & Jäger, C. (2015). Tailored interventions to address determinants of practice. The Cochrane Database of Systematic Reviews, 4, CD005470. https://doi.org/10.1002/14651858.CD005470.pub3 Barry, C. L., Epstein, A. J., Marcus, S. C., Kennedy-Hendricks, A., Candon, M. K., Xie, M., & Mandell, D. S. (2017). Effects Of State Insurance Mandates On Health Care Use And Spending For Autism Spectrum Disorder. Health Affairs, 36(10), 1754–1761. https://doi.org/10.1377/hlthaff.2017.0515 Bilaver, L. A., Sobotka, S. A., & Mandell, D. S. (2021). Understanding Racial and Ethnic Disparities in Autism-Related Service Use Among Medicaid-Enrolled Children. Journal of Autism and Developmental Disorders, 51(9), 3341–3355. https://doi.org/10.1007/s10803-020-04797-6 Bishop-Fitzpatrick, L., & Kind, A. J. H. (2017). A Scoping Review of Health Disparities in Autism Spectrum Disorder. Journal of Autism and Developmental Disorders, 47(11), 3380–3391. https://doi.org/10.1007/s10803-017-3251-9 Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa 84 Brookman-Frazee, L., Drahota, A., Stadnick, N., & Palinkas, L. A. (2012). Therapist Perspectives on Community Mental Health Services for Children with Autism Spectrum Disorders. Administration and Policy in Mental Health and Mental Health Services Research, 39(5), 365–373. https://doi.org/10.1007/s10488-011-0355-y Brownson, R. C., Colditz, G. A., & Proctor, E. K. (2012). Dissemination and Implementation Research in HealthTranslating Science to Practice. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199751877.001.0001 Bustos, T. E., Sridhar, A., & Drahota, A. (2021). Implementation evaluation of an early intensive behavioral intervention program across three agencies serving young children with Autism: A mixed methods study. Children and Youth Services Review, 122, 105871. https://doi.org/10.1016/j.childyouth.2020.105871 Cheron, D. M., Chiu, A. A. W., Stanick, C. F., Stern, H. G., Donaldson, A. R., Daleiden, E. L., & Chorpita, B. F. (2019). Implementing Evidence Based Practices for Children’s Mental Health: A Case Study in Implementing Modular Treatments in Community Mental Health. Administration and Policy in Mental Health, 46(3), 391–410. https://doi.org/10.1007/s10488-019-00922-5 Constantino, J. N., Abbacchi, A. M., Saulnier, C., Klaiman, C., Mandell, D. S., Zhang, Y., Hawks, Z., Bates, J., Klin, A., Shattuck, P., Molholm, S., Fitzgerald, R., Roux, A., Lowe, J. K., & Geschwind, D. H. (2020). Timing of the Diagnosis of Autism in African American Children. Pediatrics, 146(3), e20193629. 
https://doi.org/10.1542/peds.2019- 3629 D’Agostino, S. R., Dueñas, A. D., Bravo, A., Tyson, K., Straiton, D., Salvatore, G. L., Pacia, C., & Pellecchia, M. (2023). Toward deeper understanding and wide-scale implementation of naturalistic developmental behavioral interventions. Autism, 27(1), 253–258. https://doi.org/10.1177/13623613221121427 Dallman, A. R., Artis, J., Watson, L., & Wright, S. (2020). Systematic Review of Disparities and Differences in the Access and Use of Allied Health Services Amongst Children with Autism Spectrum Disorders. Journal of Autism and Developmental Disorders. https://doi.org/10.1007/s10803-020-04608-y Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation of health services research findings into practice: A consolidated framework for advancing implementation science. Implementation Science, 4(1), 50. https://doi.org/10.1186/1748-5908-4-50 Davis, J. M., Finke, E., & Hickerson, B. (2016). Service Delivery Experiences and Intervention Needs of Military Families with Children with ASD. Journal of Autism and Developmental Disorders, 46(5), 1748–1761. https://doi.org/10.1007/s10803-016-2706-8 85 Dorsey, S., Johnson, C., Soi, C., Meza, R. D., Whetten, K., & Mbwayo, A. (2023). Implementation science in plain language: The use of nonjargon terms to facilitate collaboration. Implementation Research and Practice, 4, 26334895231177474. https://doi.org/10.1177/26334895231177474 Drahota, A., Meza, R., Bustos, T., Sridhar, A., Martinez, J., Brikho, B., Stahmer, A., & Aarons, G. (2020). Implementation-as-Usual in Community-Based Organizations Providing Specialized Services to Individuals with Autism Spectrum Disorder: A Mixed Methods Study. Administration and Policy in Mental Health. https://doi.org/10.1007/s10488-020- 01084-5 Dueñas, A. D., D’Agostino, S. R., Bravo, A., Horton, E., Jobin, A., Salvatore, G. L., Straiton, D., Tyson, K., & Pellecchia, M. (2023). Beyond the Task List: A Proposed Integration of Naturalistic Developmental Behavioral Interventions to BCBA Training. Behavior Analysis in Practice. https://doi.org/10.1007/s40617-023-00795-z Estabrooks, P. A., Brownson, R. C., & Pronk, N. P. (2018). Dissemination and Implementation Science for Public Health Professionals: An Overview and Call to Action. Preventing Chronic Disease, 15, E162. https://doi.org/10.5888/pcd15.180525 Finch, T. L., Mair, F. S., O’Donnell, C., Murray, E., & May, C. R. (2012). From theory to “measurement” in complex interventions: Methodological lessons from the development of an e-health normalisation instrument. BMC Medical Research Methodology, 12(1), 69. https://doi.org/10.1186/1471-2288-12-69 Gopichandran, V., Luyckx, V. A., Biller-Andorno, N., Fairchild, A., Singh, J., Tran, N., Saxena, A., Launois, P., Reis, A., Maher, D., & Vahedi, M. (2016). Developing the ethics of implementation research in health. Implementation Science, 11(1), 161. https://doi.org/10.1186/s13012-016-0527-y Green, A. E., Fettes, D. L., & Aarons, G. A. (2012). A Concept Mapping Approach to Guide and Understand Dissemination and Implementation. 13. Green, J., Charman, T., McConachie, H., Aldred, C., Slonims, V., Howlin, P., Le Couteur, A., Leadbitter, K., Hudry, K., Byford, S., Barrett, B., Temple, K., Macdonald, W., & Pickles, A. (2010). Parent-mediated communication-focused treatment in children with autism (PACT): A randomised controlled trial. Lancet, 375(9732), 2152–2160. https://doi.org/10.1016/S0140-6736(10)60587-9 Guetterman, T. 
C., Fetters, M. D., & Creswell, J. W. (2015). Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays. Annals of Family Medicine, 13(6), 554–561. https://doi.org/10.1370/afm.1865 Hampton, L. H., & Sandbank, M. P. (2022). Keeping up with the evidence base: Survey of behavior professionals about Naturalistic Developmental Behavioral Interventions. Autism, 26(4), 875–888. https://doi.org/10.1177/13623613211035233 86 Ingersoll, B. (2011). Recent Advances in Early Identification and Treatment of Autism. Current Directions in Psychological Science, 20(5), 335–339. https://doi.org/10.1177/0963721411418470 Ingersoll, B., & Dvortcsak, A. (2010). Teaching social communication to children with autism: A practitioner’s guide to parent training and a manual for parents. Guilford Press. Ingersoll, B., Wainer, A. L., Berger, N. I., Pickard, K. E., & Bonter, N. (2016). Comparison of a Self-Directed and Therapist-Assisted Telehealth Parent-Mediated Intervention for Children with ASD: A Pilot RCT. Journal of Autism and Developmental Disorders, 46(7), 2275–2284. https://doi.org/10.1007/s10803-016-2755-z Kien, C., Griebler, U., Schultes, M.-T., Thaler, K. J., & Stamm, T. (2021). Psychometric Testing of the German Versions of Three Implementation Outcome Measures. Global Implementation Research and Applications, 1(3), 183–194. https://doi.org/10.1007/s43477-021-00019-y Kwok, E. Y. L., Moodie, S. T. F., Cunningham, B. J., & Oram Cardy, J. E. (2020). Selecting and tailoring implementation interventions: A concept mapping approach. BMC Health Services Research, 20(1), 385. https://doi.org/10.1186/s12913-020-05270-x LaClair, M., Mandell, D. S., Dick, A. W., Iskandarani, K., Stein, B. D., & Leslie, D. L. (2019). The effect of Medicaid waivers on ameliorating racial/ethnic disparities among children with autism. Health Services Research, 54(4), 912–919. https://doi.org/10.1111/1475- 6773.13176 Lau, R., Stevenson, F., Ong, B. N., Dziedzic, K., Treweek, S., Eldridge, S., Everitt, H., Kennedy, A., Qureshi, N., Rogers, A., Peacock, R., & Murray, E. (2015). Achieving change in primary care—effectiveness of strategies for improving implementation of complex interventions: Systematic review of reviews. BMJ Open, 5(12), e009993. https://doi.org/10.1136/bmjopen-2015-009993 Leeman, J., Calancie, L., Hartman, M. A., Escoffery, C. T., Herrmann, A. K., Tague, L. E., Moore, A. A., Wilson, K. M., Schreiner, M., & Samuel-Hodge, C. (2015). What strategies are used to build practitioners’ capacity to implement community-based interventions and are they effective?: A systematic review. Implementation Science, 10(1), 80. https://doi.org/10.1186/s13012-015-0272-7 Lengnick-Hall, R., Gerke, D. R., Proctor, E. K., Bunger, A. C., Phillips, R. J., Martin, J. K., & Swanson, J. C. (2022). Six practical recommendations for improved implementation outcomes reporting. Implementation Science, 17(1), 16. https://doi.org/10.1186/s13012- 021-01183-3 Leslie, D. L., Iskandarani, K., Dick, A. W., Mandell, D. S., Yu, H., Velott, D., Agbese, E., & Stein, B. D. (2017). The Effects of Medicaid Home and Community-based Services 87 Waivers on Unmet Needs Among Children With Autism Spectrum Disorder: Medical Care, 55(1), 57–63. https://doi.org/10.1097/MLR.0000000000000621 Liptak, G. S., Benzoni, L. B., Mruzek, D. W., Nolan, K. W., Thingvoll, M. A., Wade, C. M., & Fryer, G. E. (2008). 
Disparities in diagnosis and access to health services for children with autism: Data from the National Survey of Children’s Health. Journal of Developmental and Behavioral Pediatrics: JDBP, 29(3), 152–160. https://doi.org/10.1097/DBP.0b013e318165c7a0 Magaña, S., Lopez, K., Aguinaga, A., & Morton, H. (2013). Access to Diagnosis and Treatment Services Among Latino Children With Autism Spectrum Disorders. Intellectual and Developmental Disabilities, 51(3), 141–153. https://doi.org/10.1352/1934-9556-51.3.141 Mandell, D. S., Wiggins, L. D., Carpenter, L. A., Daniels, J., DiGuiseppi, C., Durkin, M. S., Giarelli, E., Morrier, M. J., Nicholas, J. S., Pinto-Martin, J. A., Shattuck, P. T., Thomas, K. C., Yeargin-Allsopp, M., & Kirby, R. S. (2009). Racial/Ethnic Disparities in the Identification of Children With Autism Spectrum Disorders. American Journal of Public Health, 99(3), 493–498. https://doi.org/10.2105/AJPH.2007.131243 McLean, K. J., Hoekstra, A. M., & Bishop, L. (2021). United States Medicaid home and community-based services for people with intellectual and developmental disabilities: A scoping review. Journal of Applied Research in Intellectual Disabilities, 34(3), 684–694. https://doi.org/10.1111/jar.12837 National Autism Center (2015). Findings and conclusions: National standards project, phase 2. Randolph, MA: Author. Nilsen, P., & Bernhardsson, S. (2019). Context matters in implementation science: A scoping review of determinant frameworks that describe contextual determinants for implementation outcomes. BMC Health Services Research, 19(1), 189. https://doi.org/10.1186/s12913-019-4015-3 O’Brien, B. C., Harris, I. B., Beckman, T. J., Reed, D. A., & Cook, D. A. (2014). Standards for Reporting Qualitative Research: A Synthesis of Recommendations. Academic Medicine, 89(9), 1245–1251. https://doi.org/10.1097/ACM.0000000000000388 Pickard, K., Meza, R., Drahota, A., & Brikho, B. (2018). They’re Doing What? A Brief Paper on Service Use and Attitudes in ASD Community-Based Agencies. Journal of Mental Health Research in Intellectual Disabilities, 11(2), 111–123. https://doi.org/10.1080/19315864.2017.1408725 Powell, B. J., Beidas, R. S., Lewis, C. C., Aarons, G. A., McMillen, J. C., Proctor, E. K., & Mandell, D. S. (2017). Methods to Improve the Selection and Tailoring of Implementation Strategies. The Journal of Behavioral Health Services & Research, 44(2), 177–194. https://doi.org/10.1007/s11414-015-9475-6 88 Powell, B. J., Fernandez, M. E., Williams, N. J., Aarons, G. A., Beidas, R. S., Lewis, C. C., McHugh, S. M., & Weiner, B. J. (2019). Enhancing the Impact of Implementation Strategies in Healthcare: A Research Agenda. Frontiers in Public Health, 7, 3. https://doi.org/10.3389/fpubh.2019.00003 Powell, B. J., Haley, A. D., Patel, S. V., Amaya-Jackson, L., Glienke, B., Blythe, M., Lengnick- Hall, R., McCrary, S., Beidas, R. S., Lewis, C. C., Aarons, G. A., Wells, K. B., Saldana, L., McKay, M. M., & Weinberger, M. (2020). Improving the implementation and sustainment of evidence-based practices in community mental health organizations: A study protocol for a matched-pair cluster randomized pilot study of the Collaborative Organizational Approach to Selecting and Tailoring Implementation Strategies (COAST- IS). Implementation Science Communications, 1(1), 9. https://doi.org/10.1186/s43058- 020-00009-5 Powell, B. J., McMillen, J. C., Proctor, E. K., Carpenter, C. R., Griffey, T., Bunger, A. C., Glass, J. E., & York, J. L. (2012). 
A Compilation of Strategies for Implementing Clinical Innovations in Health and Mental Health. 37. Powell, B. J., Waltz, T. J., Chinman, M. J., Damschroder, L. J., Smith, J. L., Matthieu, M. M., Proctor, E. K., & Kirchner, J. E. (2015). A refined compilation of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project. Implementation Science, 10(1), 21. https://doi.org/10.1186/s13012-015-0209-1 Proctor, E. K., Bunger, A. C., Lengnick-Hall, R., Gerke, D. R., Martin, J. K., Phillips, R. J., & Swanson, J. C. (2023). Ten years of implementation outcomes research: A scoping review. Implementation Science, 18(1), 31. https://doi.org/10.1186/s13012-023-01286-z Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A., Griffey, R., & Hensley, M. (2011). Outcomes for Implementation Research: Conceptual Distinctions, Measurement Challenges, and Research Agenda. Administration and Policy in Mental Health and Mental Health Services Research, 38(2), 65–76. https://doi.org/10.1007/s10488-010-0319-7 Rea, L. M., & Parker, R. A. (2014). Designing and Conducting Survey Research: A Comprehensive Guide. John Wiley & Sons. Scaccia, J. P., Cook, B. S., Lamont, A., Wandersman, A., Castellow, J., Katz, J., & Beidas, R. S. (2015). A Practical Implementation Science Heuristic for Organizational Readiness: R = MC2. Journal of Community Psychology, 18. Scaccia, J. P., Cook, B., & Wandersman, A. (2020). Building Organizational Readiness (Capacities x Motivation) for Implementation: A Research Synthesis of the Empirical Evidence. SocArXiv. https://doi.org/10.31235/osf.io/84cjq Schreibman, L., Dawson, G., Stahmer, A. C., Landa, R., Rogers, S. J., McGee, G. G., Kasari, C., Ingersoll, B., Kaiser, A. P., Bruinsma, Y., McNerney, E., Wetherby, A., & Halladay, A. 89 (2015). Naturalistic Developmental Behavioral Interventions: Empirically Validated Treatments for Autism Spectrum Disorder. Journal of Autism and Developmental Disorders, 45(8), 2411–2428. https://doi.org/10.1007/s10803-015-2407-8 Schreibman, L., & Koegel, R. L. (2005). Training for Parents of Children With Autism: Pivotal Responses, Generalization, and Individualization of Interventions. In Psychosocial treatments for child and adolescent disorders: Empirically based strategies for clinical practice, 2nd ed (pp. 605–631). American Psychological Association. https://doi.org/10.1037/10196-000 Schultes, M.-T., Kollmayer, M., Mejeh, M., & Spiel, C. (2018). Attitudes toward evaluation: An exploratory study of students’ and stakeholders’ social representations. Evaluation and Program Planning, 70, 44–50. https://doi.org/10.1016/j.evalprogplan.2018.06.002 Shea, C. M., Jacobs, S. R., Esserman, D. A., Bruce, K., & Weiner, B. J. (2014). Organizational readiness for implementing change: A psychometric assessment of a new measure. Implementation Science, 9(1), 7. https://doi.org/10.1186/1748-5908-9-7 Smith, K. A., Gehricke, J.-G., Iadarola, S., Wolfe, A., & Kuhlthau, K. A. (2020). Disparities in Service Use Among Children With Autism: A Systematic Review. Pediatrics, 145(Supplement 1), S35–S46. https://doi.org/10.1542/peds.2019-1895G Stadnick, N. A., Aarons, G. A., Martinez, K., Sklar, M., Coleman, K. J., Gizzo, D. P., Lane, E., Kuelbs, C. L., & Brookman-Frazee, L. (2022). Implementation outcomes from a pilot of “Access to Tailored Autism Integrated Care” for children with autism and mental health needs. Autism, 13623613211065801. https://doi.org/10.1177/13623613211065801 Stahmer, A. C., & Aarons, G. (2009). 
Attitudes Toward Adoption of Evidence-Based Practices: A comparison of Autism Early Intervention Providers and Children’s Mental Health Providers. Psychological Services, 6(3), 223–234. https://doi.org/10.1037/a0010738 Straiton, D., Groom, B., & Ingersoll, B. (2020). Parent Training for Youth with Autism Served in Community Settings: A Mixed-Methods Investigation Within a Community Mental Health System. Journal of Autism and Developmental Disorders. https://doi.org/10.1007/s10803-020-04679-x Straiton, D., Groom, B., & Ingersoll, B. (2021). A mixed methods exploration of community providers’ perceived barriers and facilitators to the use of parent training with Medicaid- enrolled clients with autism. Autism, 136236132198991. https://doi.org/10.1177/1362361321989911 Swindle, T., McBride, N. M., Selig, J. P., Johnson, S. L., Whiteside-Mansell, L., Martin, J., Staley, A., & Curran, G. M. (2021). Stakeholder selected strategies for obesity prevention in childcare: Results from a small-scale cluster randomized hybrid type III trial. Implementation Science, 16(1), 48. https://doi.org/10.1186/s13012-021-01119-x 90 Taboada, A., Ly, E., Ramo, D., Dillon, F., Chang, Y.-J., Hooper, C., Yost, E., & Haritatos, J. (2021). Implementing Goal Mama: Barriers and Facilitators to Introducing Mobile Health Technology in a Public Health Nurse Home-Visiting Program. Global Qualitative Nursing Research, 8, 23333936211014497. https://doi.org/10.1177/23333936211014497 Waltz, T. J., Powell, B. J., Fernández, M. E., Abadie, B., & Damschroder, L. J. (2019). Choosing implementation strategies to address contextual barriers: Diversity in recommendations and future directions. Implementation Science, 14(1), 42. https://doi.org/10.1186/s13012- 019-0892-4 Weiner, B. J. (2009). A theory of organizational readiness for change. Implementation Science, 4(1), 67. https://doi.org/10.1186/1748-5908-4-67 Williams, N., Candon, M., Stewart, R., Byeon, Y. V., Bewtra, M., Buttenheim, A., Zentgraf, K., Comeau, C., Shoyinka, S., & Beidas, R. S. (2020). Community Stakeholder Preferences for Evidence-Based Practice Implementation Strategies in Behavioral Health: A Best- Worst Scaling Choice Experiment [Preprint]. In Review. https://doi.org/10.21203/rs.3.rs- 55991/v1 Yakovchenko, V., Chinman, M. J., Lamorte, C., Powell, B. J., Waltz, T. J., Merante, M., Gibson, S., Neely, B., Morgan, T. R., & Rogal, S. S. (2023). Refining Expert Recommendations for Implementing Change (ERIC) strategy surveys using cognitive interviews with frontline providers. Implementation Science Communications, 4(1), 42. https://doi.org/10.1186/s43058-023-00409-3 Zeleke, W. A., Hughes, T. L., & Drozda, N. (2019). Disparities in Diagnosis and Service Access for Minority Children with ASD in the United States. Journal of Autism and Developmental Disorders, 49(10), 4320–4331. https://doi.org/10.1007/s10803-019- 04131-9 91 APPENDIX A. PRE MEASURE DEMOGRAPHICS This survey asks questions about your demographics background, implementation supports at your organization, and client information. . Your responses will be completely confidential. Individual answers will not be shared with anyone. Your responses will be combined with responses given by other participants. A. Participant and Agency Demographics 1. What is your current age? _______ 2. What best describes your current gender identity?: a. b. c. d. e. f. g. h. i. j. k. 
a. Man
b. Woman
c. Trans Man
d. Trans Woman
e. Nonbinary
f. Genderqueer
g. Gender Nonconforming
h. Agender
i. Gender fluid
j. Not Listed - Please specify: __________________________
k. Prefer not to answer

3. What was your sex assigned at birth?
a. Male assigned at birth
b. Female assigned at birth
c. Intersex
d. Not Listed - Please specify: __________________________
e. Prefer not to answer

4. Although the categories listed below may not represent your full identity or use the language you prefer, for the purpose of this survey, please indicate which group or groups below most accurately describe your racial identification (check all that apply).
a. White
b. Black or African American
c. Asian or Asian American
d. Middle Eastern/North African
e. Latinx/Hispanic
f. Native American/American Indian/Alaskan Native/Indigenous
g. Native Hawaiian/Pacific Islander
h. Multiracial (please specify) _____________
i. Not listed (please specify) _____________
j. Prefer not to answer

5. Highest level of education:
a. High School Diploma
b. Some College
c. Associate's degree
d. Bachelor's degree
e. Master's degree
f. Doctorate
g. Other - Please specify: __________

6. Primary discipline / educational background:
a. Psychology
b. Marriage and Family Therapy
c. Social Work
d. Speech/Language/Communication
e. Occupational Therapy
f. Physical Therapy
g. Education
h. Behavior Specialist
i. Other - Please specify: _________________________________

7. What is your title at this organization? (e.g., Executive director) _____________________________________

8. What is your duration of employment at your current organization? ________ Years, ________ Months

9. What is your employment status at your current organization?
a. Full-time
b. Part-time
c. Per diem
d. Temporary
e. Other

10. How many clients 0-21 years old does your organization CURRENTLY serve? __________

11. How many of your organization's current 0-21 year old clients are on the autism spectrum? ________

12. How many of your organization's autistic clients belong to the following age groups?
_________ 0-5 years __________ 6-10 years __________ 11-15 years __________ 16-18 years __________ 19-21 years __________ over 21 years

13. How many providers currently see clients on the autism spectrum? _________

14. What is a typical caseload (i.e., number of clients in general) per provider in your organization? ______________ clients

15. In what setting(s) does your organization provide intervention to youth (0-21 years old) on the autism spectrum? Please select all that apply:
a. Clinic
b. Community
c. School
d. Home
e. Other (please describe): _________________________________________________________

16. Which source(s) of funding does your organization currently receive? Please select all that apply:
a. Insurance
b. Private Pay
c. Medicaid
d. State funding
e. Employment Support Services
f. Other (please describe): _________________________________________________________

17. Please describe the governance/leadership/organizational structure of your organization. You may include a link to an organizational chart on your organization's website, if available.
___________________________________________________________________________

Client Needs: This section refers to clients with Autism Spectrum Disorder, 0-21 years old.

1. What are the typical presenting problems of clients with ASD (0-21 years old) at this organization? Please select all that apply.
o Communication problems
o Social skills problems
o Stereotyped behaviors, repetitive and/or restricted behaviors
o Trauma-related problems
o ADHD
o Behavior problems (e.g., aggression, oppositionality, conduct)
o Mood (e.g., depression, bipolar) or anxiety problems
o Psychosis (e.g., schizophrenia)
o Academics/Learning problems
o Other (please specify):

2. Please rate how effective you believe the current interventions or strategies are in addressing the presenting problems of clients with ASD (0-21 years old) at this organization. Response options for each problem area: Not being addressed, Not effective, Somewhat effective, Very effective. Problem areas rated: Communication problems; Social skills problems; Stereotyped behaviors, repetitive and/or restricted behaviors; Trauma-related problems; ADHD; Behavior problems (e.g., aggression, oppositionality, conduct); Mood (e.g., depression, bipolar) or anxiety problems; Psychosis (e.g., schizophrenia); Academics/Learning problems; Other.

3. What problems or challenges experienced by clients at this organization are not being addressed? What types of client problems or challenges would you like to address?
______________________________________________________________________________

Implementation Determinants

For each barrier/facilitator statement below, please indicate (a) the extent to which you agree this is true for your organization (1 = disagree, 2 = somewhat disagree, 3 = somewhat agree, 4 = agree) and (b) how important it is to address this factor (1 = not at all important, 2 = somewhat important, 3 = important, 4 = very important).

INTERVENTION CHARACTERISTICS
Intervention Source: Staff have a negative perception of the innovation because of the entity that developed it and/or where it was developed.
Evidence Strength & Quality: Staff have a negative perception of the quality and validity of evidence supporting the intervention.
Relative Advantage: Staff do not see the advantage of implementing the innovation compared to an alternative solution or keeping things the same.
Adaptability: Staff do not believe that the innovation can be sufficiently adapted, tailored, or re-invented to meet local needs.
Trialability: Staff believe they cannot test the innovation on a smaller scale within the organization or undo implementation if needed.
Complexity: Staff believe that the innovation is complex based on their perception of duration, scope, radicalness, disruptiveness, centrality, and/or intricacy and number of steps needed to implement.
Design Quality and Packaging: Staff believe the innovation is poor quality based on the way it is bundled, presented, and/or assembled.
Cost: Staff believe the innovation costs and/or the costs to implement (including investment, supply, and opportunity costs) are too high.

OUTER SETTING
Patient Needs & Resources: Patient needs, including barriers and facilitators to meet those needs, are not accurately known and/or this information is not a high priority for the organization.
Cosmopolitanism: The organization is not well networked with external organizations.
Peer Pressure: There is little pressure to implement the innovation because other key peer or competing organizations have not already implemented the innovation, nor is the organization doing this in a bid for a competitive edge.
External Policy & Incentives: External policies, regulations (governmental or other central entity), mandates, recommendations or guidelines, pay-for-performance, collaboratives, or public or benchmark reporting do not exist or they undermine efforts to implement the innovation.

INNER SETTING
Structural Characteristics: The social architecture, age, maturity, and size of the organization hinder implementation.
Networks & Communications: The organization has poor quality or non-productive social networks and/or ineffective formal and informal communications.
Culture: Cultural norms, values, and basic assumptions of the organization hinder implementation.
Implementation Climate: There is little capacity for change, low receptivity, and no expectation that use of the innovation will be rewarded, supported, or expected.
Tension for Change: Staff do not see the current situation as intolerable or do not believe they need to implement the innovation.
Compatibility: The innovation does not fit well with existing workflows nor with the meaning and values attached to the innovation, nor does it align well with Staff's own needs, and/or it heightens risk for Staff.
Relative Priority: Staff perceive that implementation of the innovation takes a backseat to other initiatives or activities.
Organizational Incentives & Rewards: There are no tangible (e.g., goal-sharing awards, performance reviews, promotions, salary raises) or less tangible (e.g., increased stature or respect) incentives in place for implementing the innovation.
Goals and Feedback: Goals are not clearly communicated or acted upon, nor do Staff receive feedback that is aligned with goals.
Learning Climate: The organization has a climate where (a) leaders do not express their own fallibility or need for Staff assistance or input; (b) Staff do not feel that they are essential, valued, and knowledgeable partners in the implementation process; (c) Staff do not feel psychologically safe to try new methods; and (d) there is not sufficient time and space for reflective thinking or evaluation.
Readiness for Implementation: There are few tangible and immediate indicators of organizational readiness and commitment to implement the innovation.
Leadership Engagement: Key organizational leaders or managers do not exhibit commitment and are not involved, nor are they held accountable for implementation of the innovation.
Available Resources: Resources (e.g., money, physical space, dedicated time) are insufficient to support implementation of the innovation.
Access to Knowledge and Information: Staff do not have adequate access to digestible information and knowledge about the innovation nor how to incorporate it into work tasks.

CHARACTERISTICS OF INDIVIDUALS
Knowledge & Beliefs about the Intervention: Staff have negative attitudes toward the innovation, they place low value on implementing the innovation, and/or they are not familiar with facts, truths, and principles about the innovation.
Self-efficacy: Staff do not have confidence in their capabilities to execute courses of action to achieve implementation goals.
Individual Stage of Change: Staff are not skilled or enthusiastic about using the innovation in a sustained way.
Individual Identification with Organization: Staff are not satisfied with and have a low level of commitment to their organization.

PROCESS
Planning: A scheme or sequence of tasks necessary to implement the intervention has not been developed or the quality is poor.
Opinion Leaders: Opinion leaders (individuals who have formal or informal influence on the attitudes and beliefs of their colleagues with respect to implementing the intervention) are not involved or supportive.
Formally Appointed Internal Implementation Leaders: A skilled implementation leader (coordinator, project manager, or team leader), with responsibility to lead implementation of the innovation, has not been formally appointed or recognized within the organization.
Champions: Individuals acting as champions who support, market, or ‘drive through’ implementation in a way that helps to overcome indifference or resistance by key Staff are not involved or supportive.
External Change Agents: Individuals from an outside entity formally facilitating decisions to help move implementation forward are not involved or supportive.
Key Staff: Multi-faceted strategies to attract and involve key Staff in implementing or using the innovation (e.g., through social marketing, education, role modeling, training) are ineffective or non-existent.
Patients/Customers: Multi-faceted strategies to attract and involve patients/customers in implementing or using the innovation (e.g., through social marketing, education, role modeling, training) are ineffective or non-existent.
Executing: Implementation activities are not being done according to plan.
Reflecting & Evaluating: There is little or no quantitative and qualitative feedback about the progress and quality of implementation nor regular personal and team debriefing about progress and experience.

Start of Block: Implementation Climate

Each item below is rated: 1 = Strongly disagree, 2 = Disagree, 3 = Neither agree nor disagree, 4 = Agree, 5 = Strongly agree, 6 = Don't know/not applicable.

Senior leadership/clinical management in this organization:
…reward clinical innovation and creativity to improve client care.
…solicit opinions of direct providers regarding decisions about client care.
…solicit opinions of supervisors regarding decisions about client care.
…seek ways to improve client/family education and increase client/family participation in intervention services.

Clinical staff members in this organization:
…have a sense of personal responsibility for improving client care and outcomes.
…cooperate to maintain and improve effectiveness of client care.
…are willing to innovate and/or experiment to improve clinical procedures.
…are receptive to change in clinical processes.

Senior leadership/clinical management in this organization:
…provide effective management for continuous improvement of client care.
…clearly define areas of responsibility and authority for supervisors and clinical staff.
…promote team building to solve clinical care problems.
…promote communication among clinical services and units, if applicable.
…provide direct providers with information on performance measures and guidelines.
…provide supervisors with information on performance measures and guidelines.
…establish clear goals for client care processes and outcomes.
…provide direct providers with feedback/data on effects of clinical decisions.
…provide supervisors with feedback/data on effects of clinical decisions.
…hold direct providers accountable for achieving results.
…hold supervisors accountable for achieving results.

Opinion leaders in this organization:
…believe that the current intervention strategies can be improved.
…encourage and support changes in intervention strategies to improve client care.
…are willing to try new intervention strategies.
…work cooperatively with senior leadership/clinical management to make appropriate changes.

End of Block: Implementation Climate

Start of Block: ORIC

Please rate the extent to which you agree or disagree with the following statements about using Project ImPACT at your organization. Each item is rated: 1 = Disagree, 2 = Somewhat disagree, 3 = Neither agree nor disagree, 4 = Somewhat agree, 5 = Agree.

People who work here feel confident that the organization can get people invested in implementing Project ImPACT.
People who work here are committed to implementing Project ImPACT.
People who work here feel confident that they can keep track of progress in implementing Project ImPACT.
People who work here will do whatever it takes to implement Project ImPACT.
People who work here feel confident that the organization can support people as they adjust to Project ImPACT.
People who work here want to implement Project ImPACT.
People who work here feel confident that they can keep the momentum going in implementing Project ImPACT.
People who work here feel confident that they can handle the challenges that might arise in implementing Project ImPACT.
People who work here are determined to implement Project ImPACT.
People who work here feel confident that they can coordinate tasks so that implementation goes smoothly.
People who work here are motivated to implement Project ImPACT.
People who work here feel confident that they can manage the politics of implementing Project ImPACT.

End of Block: ORIC

Start of Block: ORC

Each item below is rated: 1 = Strongly disagree, 2 = Disagree, 3 = Neither agree nor disagree, 4 = Agree, 5 = Strongly agree.

This agency needs guidance in:
...defining its mission
...setting specific goals for improving services
...assigning or clarifying staff roles
...establishing accurate job descriptions for staff
...evaluating staff performance
...improving relationships among staff
...improving communications among staff
...improving record keeping and information systems
...improving billing/financial/accounting procedures

At this agency, you need more training in:
...ASD-related evidence-based strategies or interventions
...specialized computer applications (e.g., assessments, progress tracking)
...new equipment or procedures being used or planned
...maintaining/obtaining certification or other credentials
...new laws or regulations

End of Block: ORC

APPENDIX B. POST MEASURE

Thank you very much for your participation in this study! This questionnaire will ask about your organization, as well as your perspectives on the Concept Mapping method that you participated in during the study. The questionnaire is expected to take approximately 30 minutes to complete.

The following questions ask about your feelings about leadership and staff culture at this organization.
Please indicate the extent to which you agree with each item. Each item below is rated: 1 = Strongly disagree, 2 = Disagree, 3 = Neither agree nor disagree, 4 = Agree, 5 = Strongly agree, 6 = Don't know/not applicable.

Senior leadership/clinical management in this organization:
…reward clinical innovation and creativity to improve client care.
…solicit opinions of direct providers regarding decisions about client care.
…solicit opinions of supervisors regarding decisions about client care.
…seek ways to improve client/family education and increase client/family participation in intervention services.

Clinical staff members in this organization:
…have a sense of personal responsibility for improving client care and outcomes.
…cooperate to maintain and improve effectiveness of client care.
…are willing to innovate and/or experiment to improve clinical procedures.
…are receptive to change in clinical processes.

Senior leadership/clinical management in this organization:
…provide effective management for continuous improvement of client care.
…clearly define areas of responsibility and authority for supervisors and clinical staff.
…promote team building to solve clinical care problems.
…promote communication among clinical services and units, if applicable.
…provide direct providers with information on performance measures and guidelines.
…provide supervisors with information on performance measures and guidelines.
…establish clear goals for client care processes and outcomes.
…provide direct providers with feedback/data on effects of clinical decisions.
…provide supervisors with feedback/data on effects of clinical decisions.
…hold direct providers accountable for achieving results.
…hold supervisors accountable for achieving results.

Opinion leaders in this organization:
…believe that the current intervention strategies can be improved.
…encourage and support changes in intervention strategies to improve client care.
…are willing to try new intervention strategies.
…work cooperatively with senior leadership/clinical management to make appropriate changes.

End of Block: Implementation Climate

Start of Block: ORIC

Please rate the extent to which you agree or disagree with the following statements about using Project ImPACT at your organization. Each item is rated: 1 = Disagree, 2 = Somewhat disagree, 3 = Neither agree nor disagree, 4 = Somewhat agree, 5 = Agree.

People who work here feel confident that the organization can get people invested in implementing Project ImPACT.
People who work here are committed to implementing Project ImPACT.
People who work here feel confident that they can keep track of progress in implementing Project ImPACT.
People who work here will do whatever it takes to implement Project ImPACT.
People who work here feel confident that the organization can support people as they adjust to Project ImPACT.
People who work here want to implement Project ImPACT.
People who work here feel confident that they can keep the momentum going in implementing Project ImPACT.
People who work here feel confident that they can handle the challenges that might arise in implementing Project ImPACT.
People who work here are determined to implement Project ImPACT.
People who work here feel confident that they can coordinate tasks so that implementation goes smoothly.
People who work here are motivated to implement Project ImPACT.
People who work here feel confident that they can manage the politics of implementing Project ImPACT.

End of Block: ORIC

Start of Block: ORC

Each item below is rated: 1 = Strongly disagree, 2 = Disagree, 3 = Neither agree nor disagree, 4 = Agree, 5 = Strongly agree.

This agency needs guidance in:
...defining its mission
...setting specific goals for improving services
...assigning or clarifying staff roles
...establishing accurate job descriptions for staff
...evaluating staff performance
...improving relationships among staff
...improving communications among staff
...improving record keeping and information systems
...improving billing/financial/accounting procedures

At this agency, you need more training in:
...ASD-related evidence-based strategies or interventions
...specialized computer applications (e.g., assessments, progress tracking)
...new equipment or procedures being used or planned
...maintaining/obtaining certification or other credentials
...new laws or regulations

End of Block: ORC

Start of Block: AIM

Acceptability of Concept Mapping. Each item is rated: 1 = Completely disagree, 2 = Disagree, 3 = Neither agree nor disagree, 4 = Agree, 5 = Completely agree.
Concept Mapping meets my approval
Concept Mapping is appealing to me
I like Concept Mapping
I welcome Concept Mapping

End of Block: AIM

Start of Block: IAM

Appropriateness of Concept Mapping. Each item is rated: 1 = Completely disagree, 2 = Disagree, 3 = Neither agree nor disagree, 4 = Agree, 5 = Completely agree.
Concept Mapping seems fitting
Concept Mapping seems suitable
Concept Mapping seems applicable
Concept Mapping seems like a good match

End of Block: IAM

Start of Block: FIM

Feasibility of Concept Mapping. Each item is rated: 1 = Completely disagree, 2 = Disagree, 3 = Neither agree nor disagree, 4 = Agree, 5 = Completely agree.
Concept Mapping seems implementable
Concept Mapping seems possible
Concept Mapping seems doable
Concept Mapping seems easy to use

End of Block: FIM

Start of Block: ISUS

Usability of Concept Mapping. Each item is rated: 1 = Strongly disagree, 2 = Somewhat disagree, 3 = Neither agree nor disagree, 4 = Somewhat agree, 5 = Strongly agree.
I think that I would like to use Concept Mapping frequently
I found Concept Mapping unnecessarily complex
I thought Concept Mapping was easy to use
I think that I would need the support of a technical person to be able to use Concept Mapping
I found the various components of Concept Mapping were well integrated
I thought there was too much inconsistency in Concept Mapping
I would imagine that most people would learn to use Concept Mapping very quickly
I found Concept Mapping very cumbersome to use
I felt very confident using Concept Mapping
I needed to learn a lot of things before I could get going with Concept Mapping

End of Block: ISUS

APPENDIX C.
CODEBOOK Memo Frequency 771 Categories/Codes Code System Suggestions to improve Organizational Readiness Impact/effect on readiness Org. Readiness > Capacity Org. Readiness > Motivation Implementation Process Implementation Process > Implementation strategies Team is now motivated to use implementation strategies or to implement project impact The activities and strategies used to implement the innovation. Implementation Process > Implementation strategies > Engage or involve patients/consumers Implementation Process > Implementation strategies > Blueprint Implementation Process > Implementation strategies > Clinical supervision Implementation Process > Implementation strategies > Resources/Materials Implementation Process > Implementation strategies > Quality monitoring Implementation Process > Implementation strategies > Training Implementation Process > Implementation strategies > Educational 27 19 20 15 5 37 16 7 17 5 9 18 25 117 Implementation Process > Implementation strategies > Reminders Implementation Process > IS and CFIR/Assessing Needs Collect information about priorities, preferences, and needs of people Implementation Process > IS and CFIR/Assessing Needs > Planning Identify roles and responsibilities, outline specific steps and milestones, and define goals and measures for implementation success in advance. -Talking about general next steps Implementation Process > IS and CFIR/Assessing Needs > IS and CFIR/Tailoring Strategies Choose and operationalize implementation strategies to address barriers, leverage facilitators, and fit context. Implementation Process > IS and CFIR/Assessing Needs > Assessing Context Collect information to identify and appraise barriers and facilitators to implementation and delivery of the innovation. Individuals Individuals > Characteristics Individuals > Characteristics > Rigidity/Doing things the same way Individuals > Characteristics > Capability Individuals > Characteristics > Capability > Knowledge The individual(s) has interpersonal competence, knowledge, and skills to fulfill Role. Individuals > Characteristics > Motivation The individual(s) is committed to fulfilling Role. Individuals > Characteristics > Opportunity The individual(s) has availability, scope, and power to fulfill Role. Individuals > Characteristics > Need The individual(s) has deficits related to survival, well-being, or personal fulfillment, which will be addressed 6 4 4 2 0 0 7 9 5 21 8 10 0 118 by implementation and/or delivery of the innovation. People with decision making power Individuals > Roles Individuals > Roles > Innovation recipients Individuals > Roles > High- level leaders Inner Setting Inner Setting > Staffing issues, turnover Inner Setting > Available Resources Inner Setting > Available Resources > Funding Funding is available to implement and deliver the innovation. Inner Setting > Culture Inner Setting > Structural Characteristics Inner Setting > Structural Characteristics > IT Infrastructure Inner Setting > Access to knowledge & information Inner Setting > Access to knowledge & information > Familiarity with ImPACT Inner Setting > Relative priority Inner Setting > Incentive systems Outer Setting Guidance and/or training is accessible to implement and deliver the innovation. Implementing and delivering the innovation is important compared to other initiatives. Tangible and/or intangible incentives and rewards and/or disincentives and punishments support implementation and delivery of the innovation. 
10 6 6 18 6 3 16 1 1 2 15 24 1 1 0 119 Outer Setting > Financing Outer Setting > External Pressure Innovation Characteristics Funding from external entities (e.g., grants, reimbursement) is available to implement and/or deliver the innovation. External pressures drive implementation and/or delivery of the innovation. Note: Use this construct to capture themes related to External Pressures that are not included in the subconstructs below - societal, market, performance- measurement May impact perspectives on feasibility, appropriateness, acceptability, usability, effectiveness Based on CFIR constructs Codes may focus on: Innovation Source Innovation evidence-based etc Innovation Characteristics > Feasibility- amount of time Innovation Characteristics > CM Steps Innovation Characteristics > CM Steps > Brainstorming Innovation Characteristics > CM Steps > Sorting Innovation Characteristics > CM Steps > Ranking Innovation Characteristics > Adaptability Innovation Characteristics > Complexity The innovation can be modified, tailored, or refined to fit local context or needs. The innovation is complicated, which may be reflected by its scope and/or the nature and number of connections and steps. 0 1 10 4 20 15 24 18 2 4 120 Innovation Characteristics > Relative advantage Innovation Characteristics > Design Outcomes Outcomes > Feasibility Outcomes > Usability Outcomes > Appropriateness Outcomes > Acceptability The innovation is better than other available innovations or current practice. The innovation is well designed and packaged, including how it is assembled, bundled, and presented. General comments related to implementation outcomes - Don't capture factors/characteristics that influenced perspectives on outcomes here (those should go under the specific CFIR construct. For example, if participants talk about CM being feasible to use because the steps were clear and easy to complete, that should go under Innovation- Complexity, rather than under this set of codes. If participants say "it was really feasible" don't provide further explanation, it would be coded here) Extent to which an innovation can be used by specific users to achieve specific goals with effectiveness, efficiency, and satisfaction 16 30 0 92 45 28 56 121 APPENDIX D. 
FINAL IMPLEMENTATION STRATEGIES SELECTED Agency 1 Importance and Feasibility Ratings Full Map Go-Zone ImportanceScale [2.2500]- [5.0000] Median = 2.5 FeasibilityScale [1.7500]- [4.3333] Median = 2.16665 n = 4 n = 4 # 3 9 12 17 32 36 37 39 40 43 44 48 50 54 56 57 59 5 6 8 14 23 42 45 53 60 11 25 26 34 55 1 2 4 Implementation Strategy Average Rating Revise professional roles Use an implementation advisor Shadow other experts Revise professional roles Use train-the-trainer strategies Place innovation on fee for service lists/formularies Organize clinician implementation team meetings Mandate change Make training dynamic Increase demand Fund and contract for the clinical innovation Develop academic partnerships Create or change credentialing and/or licensure standards Conduct educational outreach visits Change record systems Build a coalition Alter patient/consumer fees Develop academic partnerships Access new funding Develop academic partnerships Develop educational materials Develop educational materials Identify and prepare champions Develop resource sharing agreements Conduct local needs assessment Access new funding Provide clinical supervision Build a coalition Provide clinical supervision Promote adaptability Conduct cyclical small tests of change Promote adaptability Conduct on-going trainings Develop educational materials 3.5000 4.0000 3.7500 3.2500 4.0000 3.0000 4.0000 4.0000 4.0000 3.7500 4.0000 4.0000 3.2500 3.7500 3.5000 4.0000 2.2500 4.2500 4.5000 4.2500 4.7500 4.7500 4.2500 4.2500 4.2500 4.5000 4.0000 4.0000 4.0000 3.7500 4.0000 4.2500 5.0000 4.5000 2.5000 3.2500 2.7500 2.7500 3.0000 3.2500 3.0000 3.0000 3.0000 2.2500 2.5000 3.0000 1.7500 2.7500 2.7500 3.0000 2.2500 3.0000 2.2500 3.2500 3.2500 3.0000 3.0000 3.2500 3.2500 2.2500 3.7500 3.5000 4.2500 3.7500 3.5000 3.6667 4.0000 3.5000 122 7 10 13 15 16 18 19 20 21 22 24 27 28 29 30 31 33 35 38 41 46 47 49 51 52 58 Create a learning collaborative Tailor strategies Involve patients/consumers/family members Assess for readiness and identify barriers and facilitators Create a learning collaborative Conduct on-going trainings Provide clinical supervision Prepare patients/consumers to be active Obtain and use patients/consumers and family feedback Model and simulate change Develop a formal implementation blueprint Develop educational materials Develop a formal implementation blueprint Conduct ongoing training Conduct educational meetings Assess for readiness and identify barriers and facilitators Purposely reexamine the implementation Prepare patients/consumers to be active participants Obtain and use patients/consumers and family feedback Involve patients/consumers and family members Distribute educational materials Develop educational materials Develop a formal implementation blueprint Create a learning collaborative Conduct ongoing training Assess for readiness and identify barriers and facilitators 4.2500 4.2500 4.7500 4.2500 4.5000 4.5000 4.6667 4.2500 4.5000 4.2500 4.5000 5.0000 4.2500 4.5000 4.5000 4.2500 4.5000 4.7500 4.7500 4.5000 4.5000 4.7500 4.2500 4.5000 4.7500 4.2500 3.7500 3.7500 3.5000 3.7500 3.7500 4.2500 4.2500 3.7500 3.5000 4.0000 3.5000 3.7500 4.2500 4.3333 4.0000 3.5000 3.5000 3.6667 4.2500 4.2500 3.7500 3.7500 3.5000 3.7500 4.0000 3.5000 Agency 2 Importance and Feasibility Ratings Full Map Go-Zone Implementation Strategy # 2 Use an implementation advisor 6 Develop academic partnerships 10 Build a coalition 11 Alter incentive/allowance structures ImportanceScale [3.0000]- [4.8000] Median = 2.4 FeasibilityScale 
[2.0000]- [4.8000] Median = 2.4 n = 5 n = 5 Average Rating 3.4000 3.0000 3.6000 3.8000 3.0000 2.0000 2.8000 2.2000 123 16 Access new funding 17 Work with educational institutions 18 Shadow other experts 20 Place innovation on fee for service lists/formularies 26 32 Alter incentive/allowance structures 12 Access new funding 27 Fund and contract for the clinical innovation 33 Access new funding Increase demand 3.8000 3.4000 3.4000 3.0000 3.4000 3.8000 4.2000 4.0000 4.0000 2.2000 2.5000 3.0000 2.6000 3.0000 2.2000 2.4000 2.4000 2.2000 1 Remind clinicians, Develop educational materials, conduct ongoing training, alter incentive/allowance structures, flow chart for incorrect responses, Group and individual trainings Purposely reexamine the implementation 4 9 Capture and share local knowledge 13 Flow chart for incorrect responses 14 Flow chart for behaviors 24 Model and simulate change 3 Remind clinicians 5 Develop educational material 7 Develop a formal implementation blueprint 8 Conduct ongoing training 15 Group and Individual trainings 19 Provide clinical supervision 21 Prepare patients/consumers to be active participants 22 Organize clinician implementation team meetings Obtain and use patients/consumers and family feedback Involve patients/consumers and family members 25 28 Develop educational materials 29 Develop and implement tools for quality monitoring 30 Develop a formal implementation blueprint 31 Conduct educational meetings 23 3.7500 3.5000 3.4000 3.6000 3.6000 3.6000 3.8000 4.0000 4.4000 4.4000 4.6000 4.6000 4.6000 4.2000 4.4000 4.2000 4.8000 4.0000 4.4000 4.2000 4.2000 3.8000 3.6000 4.2000 4.4000 3.4000 4.8000 4.2000 3.8000 4.0000 3.8000 4.2000 3.8000 4.2000 3.6000 4.0000 4.2000 3.4000 4.0000 3.6000 Agency 3 Importance and Feasibility Ratings Full Map Go-Zone Implementation Strategy # 10 Promote network weaving 12 Alter incentive/allowance structures 13 Alter patient/consumer fees 15 Build coalition ImportanceScale [1.5000]- [4.5000] Median = 2.25 FeasibilityScale [1.5000]- [4.2000] Median = 2.1 n = 5 n = 5 Average Rating 1.8000 2.2000 1.8000 2.0000 2.6000 1.5000 2.2000 2.6000 124 17 Change accreditation or membership requirement 18 Change physical structure and equipment 19 Conduct educational meetings 20 Create or change credentialing and/or licensure standards 23 Fund and contract for the clinical innovation 25 Inform local opinion leaders Intervene with patients/ consumers to enhance uptake and adherence 26 Inform local opinion leaders Shadow other experts Provide ongoing consultation 28 Place innovation on fee for service lists/ formularies 32 Stage implementation scale up 41 50 Create new clinical teams 53 Conduct local needs assessment 54 Conduct educational outreach visits 56 Change record systems 5 Visit other sites 7 9 11 Prepare patients/consumers to be active participants 16 Capture and share local knowledge 27 Make billing easier 38 Mandate change 40 57 Access new funding 21 Develop and implementation glossary 24 30 Work with educational institutions 42 49 Develop academic partnerships 51 Create a learning collaborative 55 Conduct cyclical small tests of change 1 Robust training 2 View data on oncomes 3 View presentation on what project impact is 4 Develop a formal implementation blueprint 6 Distribute educational materials 8 Involve patients/consumers and family members Identify early adopters Increase demands 14 Purposely reexamine the implementation Assess for readiness and identify barriers and facilitators 22 Facilitation 29 Use data experts 31 Tailor strategies 
33 Remind clinicians 34 Recruit, designate, and train for leadership 35 Provide clinical supervision 1.6000 2.6000 2.6000 1.5000 2.8000 2.4000 1.7500 2.4000 2.4000 2.4000 2.6000 2.6000 2.4000 2.6000 3.2500 3.4000 4.0000 3.4000 3.2000 3.0000 3.0000 3.4000 3.6000 2.6000 1.8000 2.0000 2.8000 2.6000 2.7500 2.7500 3.4000 4.2000 4.5000 3.4000 3.4000 3.6000 3.4000 3.4000 3.4000 3.8000 3.0000 3.0000 4.5000 1.6000 2.2000 2.6000 2.0000 2.6000 2.6000 2.6000 2.4000 2.8000 2.8000 2.2000 2.4000 2.4000 2.0000 1.8000 2.6000 2.8000 2.6000 2.8000 1.8000 2.7500 2.8000 2.6000 3.2000 3.0000 3.0000 3.2000 3.2000 3.0000 3.8000 3.4000 3.6000 4.0000 3.4000 3.8000 3.6000 3.2000 3.0000 3.2000 3.8000 3.6000 3.2000 4.2000 125 Identify and prepare champions 36 Organize clinician implementation team meetings 37 Model and simulate change 39 Make training dynamic 43 44 Facilitate relay of clinical data to providers 45 Develop resource sharing agreements 46 Develop educational materials 47 Develop and organize quality monitoring systems 48 Develop and implement tools for quality monitoring 52 Conduct ongoing training Agency 4 Importance and Feasibility Ratings 3.4000 3.6000 3.6000 3.0000 3.6000 3.0000 3.0000 4.0000 4.0000 4.2500 3.8000 4.0000 3.8000 3.6000 3.6000 3.0000 3.6000 3.6000 3.6000 3.6000 Full Map Go-Zone # 2 Use an implementation advisor 13 Alter incentive/allowance structures Implementation Strategy 14 Develop implementation glossary/educational materials 16 Promote network weaving 20 Work with educational institutions 32 Conduct cyclical small tests of change 1 Shadow other experts 4 Provide ongoing consultation 8 17 Assess for readiness and identify barriers and facilitators Access new funding/Fund and contract for the clinical innovation 22 Shadow other experts 30 Develop a formal implementation blueprint 7 Facilitation 10 Use advisory boards and workgroups 21 Use train-the-trainer strategies 28 Facilitate relay of clinical data to providers 29 Distribute educational materials 3 Purposely reexamine the implementation 5 Prepare patients/consumers to be active participants 6 Organize clinician implementation team meetings 9 Develop and organize quality monitoring systems 11 Develop educational materials 12 Conduct ongoing training 15 Conduct educational meetings 18 Organize clinical implementation team meetings 19 Create a learning collaborative ImportanceScale [3.2000]- [5.0000] Median = 2.5 FeasibilityScale [2.8000]- [4.0000] Median = 2 n = 5 n = 5 Average Rating 3.8000 3.4000 4.0000 3.4000 3.8000 4.0000 4.4000 4.2000 4.6000 4.2000 4.2000 4.4000 3.4000 3.2000 4.0000 3.4000 3.4000 4.4000 4.6000 4.8000 4.2000 4.4000 5.0000 4.2000 4.4000 4.2000 3.0000 3.4000 3.4000 3.2000 3.0000 3.2000 3.4000 3.4000 3.4000 2.8000 3.4000 3.0000 3.8000 3.6000 3.6000 4.0000 3.8000 3.8000 3.6000 3.6000 3.6000 3.6000 3.6000 3.6000 3.6000 3.6000 126 23 Purposely reexamine the implementation 24 Provide clinical supervision 25 Model and simulate change 26 Involve patients/consumers and family members 27 Identify and prepare champions 31 Conduct local needs assessment 4.4000 5.0000 4.4000 5.0000 4.2000 4.2000 3.8000 4.0000 3.8000 3.6000 3.6000 3.8000 127
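Note on the go-zone displays above: a common convention in concept mapping, consistent with the importance and feasibility scales and medians reported for each map, is to place a strategy in the go-zone when its average importance rating and its average feasibility rating both fall above the respective cut points. The minimal Python sketch below illustrates that computation under this assumption; the strategy names and rating values are hypothetical placeholders rather than data from the agency tables, and the concept mapping software used in the study may compute its cut points differently.

# Illustrative sketch only (not the study's analysis code): flag strategies whose
# mean importance and feasibility ratings both exceed the medians across strategies.
from statistics import mean, median

# Hypothetical participant ratings per strategy: ([importance ratings], [feasibility ratings]).
ratings = {
    "Conduct ongoing training": ([5, 4, 5, 4], [4, 4, 4, 4]),
    "Develop a formal implementation blueprint": ([4, 5, 4, 4], [4, 3, 4, 4]),
    "Provide clinical supervision": ([5, 5, 4, 4], [4, 4, 3, 4]),
    "Alter patient/consumer fees": ([2, 3, 2, 2], [2, 2, 3, 2]),
}

# Strategy-level means (the "Average Rating" columns in the tables above).
means = {name: (mean(imp), mean(feas)) for name, (imp, feas) in ratings.items()}

# Medians of the strategy-level means on each dimension define the cut points.
imp_median = median(m[0] for m in means.values())
feas_median = median(m[1] for m in means.values())

# A strategy falls in the go-zone when it exceeds both medians.
for name, (imp_mean, feas_mean) in means.items():
    in_go_zone = imp_mean > imp_median and feas_mean > feas_median
    print(f"{name}: importance={imp_mean:.2f}, feasibility={feas_mean:.2f}, go-zone={in_go_zone}")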