A MULTIDIMENSIONAL TREATMENT INTEGRITY ASSESSMENT OF PARENT COACHING IN A TELEHEALTH PARENT TRAINING PROGRAM FOR AUTISM SPECTRUM DISORDER

By

Shannon Quyen Tran

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of School Psychology – Doctor of Philosophy

2018

ABSTRACT

A MULTIDIMENSIONAL TREATMENT INTEGRITY ASSESSMENT OF PARENT COACHING IN A TELEHEALTH PARENT TRAINING PROGRAM FOR AUTISM SPECTRUM DISORDER

By

Shannon Quyen Tran

An important principle of evidence-based practice (EBP) is using interventions with strong empirical support for their effectiveness, commonly known as evidence-based interventions (EBIs). Evidence of an intervention's effectiveness is strongest when supported by treatment integrity data. Treatment integrity refers to the degree to which an intervention is implemented as intended by the original design. The assessment's purpose is to provide researchers and practitioners with data about the implementation process to enable valid conclusions to be drawn about an intervention's effectiveness. The present study focused on the treatment integrity assessment of Project ImPACT (Improving Parents as Communication Teachers; Ingersoll & Dvortcsak, 2010), a parent training program that aims to improve parents' competence in teaching social communication skills to children diagnosed with autism spectrum disorder (ASD). The parent coaching portion of the training program was the focus of this study. Treatment integrity assessment occurred at two stages: The coaching delivery and the treatment delivery. This study used videos of coaching sessions from two randomized controlled trial (RCT) studies that examined the effectiveness of delivering Project ImPACT via telehealth with and without parent coaching. Dane and Schneider's (1998) treatment integrity conceptual framework was used to guide the assessment. For the coaching delivery, the assessment focused on the therapists' adherence to the coaching procedure, provision of feedback, and quality of coaching delivery, and the parents' responsiveness during the coaching session. For the treatment delivery, the assessment focused on the parents' adherence to the intervention strategies and quality of the treatment delivery. Descriptive statistics provided a general overview of the therapists' coaching performance and the parents' teaching performance. Multilevel regression analysis determined which components of the coaching delivery best predicted how parents used the intervention techniques and structured the play session for their child during the coaching sessions. Overall, the therapists consistently completed the essential steps of the coaching process. They frequently provided comprehensive feedback, attention, and reassurance. They did not provide as many opportunities for the parents to engage in collaborative problem-solving or to reflect on their implementation progress. In turn, the parents fully participated in the coaching session and demonstrated sufficient capacity to implement the intervention techniques and structure a meaningful play session for their child. Results from a multilevel regression analysis indicated that none of the treatment integrity components of the coaching delivery significantly predicted the parents' treatment adherence. The quality of coaching delivery did, however, significantly predict the parents' structure of the play segment, albeit in a negative direction.
The study's results, along with its limitations, provided a platform for continuing the conversation about treatment integrity assessment in intervention studies. In particular, the study concluded with new questions about the conceptualization and operationalization of different parent coaching aspects for parent-implemented interventions. Seeking to understand the concept and improve the measurement of these parent coaching aspects can lead to a more accurate identification of the active ingredients of parent coaching in ASD parent-implemented interventions.

This dissertation is dedicated to my parents, Tu Tran and Hanh Ngo, whose unconditional love and unwavering support consistently served as my guiding light throughout the past seven years.

ACKNOWLEDGMENTS

I would like to thank my advisor, Dr. Evelyn Oka, for her teaching and guidance in the last seven years. Your patience, compassion, and steady faith in me were the crucial ingredients to my success. I feel incredibly blessed to have had the opportunity to grow as a person and a professional under your guidance.

I would like to thank Dr. Brooke Ingersoll, Dr. Jana Aupperlee, and Dr. Kristin Rispoli for their commitment to my dissertation process. Thank you for sharing your time, resources, and expertise in an effort to help me bring this study to fruition. Thank you for your acceptance and patience as I attempted to explore an intricate research topic. Dr. Ingersoll, thank you for supporting my growth as a researcher by giving me access to the rich Project ImPACT data. Dr. Aupperlee, thank you for agreeing to be on my committee despite having other demanding responsibilities. Dr. Rispoli, thank you for listening to me talk about my concerns throughout this process and helping me think critically about them.

I would like to thank Kelly, Sean, Chloe, and Hayden for loving me, supporting me, remembering me, and never letting me forget that I will always have a home with them. Your presence, though far away, kept me going whenever I became overwhelmed by the demands of graduate school.

I would like to thank Felicia for playing an instrumental role in my graduate school journey. Thank you for believing in me when I was not strong enough to do the same. Thank you for using your unparalleled editing skills to help me grow as a writer.

Lastly, I would like to thank Carl for being loving, supportive, and patient. God answered my prayer when He gave me you five years ago. I feel incredibly blessed to be doing this life with you.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
CHAPTER 1: INTRODUCTION
    Statement of the Problem
    Purpose and Significance of the Study
    Conceptual Framework: Treatment Integrity
    Research Questions
    Hypothesis
CHAPTER 2: LITERATURE REVIEW
    Conceptualization of Treatment Integrity: Past to Present
    The Importance of Treatment Integrity Assessment
    Barriers to Treatment Integrity Assessment
    Characteristics of Autism Spectrum Disorder (ASD)
    Parent-Implemented Interventions: The Use of Coaching
    Treatment Integrity Assessment of Parent Coaching
    The Present Study: A Focus on Project ImPACT
CHAPTER 3: METHOD
    The Present Study
    Delivery of Project ImPACT via Telehealth: RCT Studies
    Project ImPACT: Target Teaching Domains
    Components of Treatment Integrity: Coaching Delivery
    Components of Treatment Integrity: Treatment Delivery
    Training Procedure for Observational Coding
    Procedure for Observational Coding
    Inter-Observer Reliability
    Data Analysis
CHAPTER 4: RESULTS
    Descriptive Summary of the Data
    Missing Data
    Confirmatory Factor Analysis
    Test of Assumptions
    Correlation Between the Independent Variables and the Dependent Variables
    Research Question 1
    Research Question 1a
    Research Question 2
    Multilevel Regression Model Selection
    Research Question 3
    Research Question 4
CHAPTER 5: DISCUSSION
    Research Questions 1 and 1a
    Research Question 2
    Research Questions 3 and 4
    Limitations
    Implications
APPENDICES
    Appendix A: Coaching Delivery – Adherence
    Appendix B: Coaching Delivery – Exposure
    Appendix C: Coaching Delivery – Quality of Coaching Delivery
    Appendix D: Coaching Delivery – Participant Responsiveness
    Appendix E: Treatment Delivery – Adherence
    Appendix F: Treatment Delivery – Quality of Treatment Delivery
REFERENCES

LIST OF TABLES

Table 1. Inter-Observer Reliability for the Therapist Variables
Table 2. Inter-Observer Reliability for the Parent Variables
Table 3. Multicollinearity of the Independent Variables
Table 4. Correlations Between the Independent and Dependent Variables
Table 5. Descriptive Statistics for the Treatment Integrity Components of the Coaching Delivery
Table 6. Descriptive Statistics for the Dimensions of the Quality of Coaching Delivery
Table 7. Descriptive Statistics for the Treatment Integrity Components of the Treatment Delivery
Table 8. Model of Covariance Structures for Parents' Treatment Adherence
Table 9. Model of Covariance Structures for Parents' Structure of the Play Segment
Table 10. Two-Level Multiple Regression Coaching Variables Predicting Parents' Treatment Adherence
Table 11. Two-Level Multiple Regression Coaching Variables Predicting Parents' Structure of the Play Segment
Table 12. Correlations Between the Quality of Coaching Delivery Dimensions and the Parents' Structure of the Play Segment

LIST OF FIGURES

Figure 1. Normal P-P Plot for PAdherence
Figure 2. Normal P-P Plot for PStructure
Figure 3. Normal P-P Plot for PSupport

CHAPTER 1: INTRODUCTION

Statement of the Problem

Evidence-based practice (EBP) has become the standard of practice in psychology and education (APA Presidential Task Force, 2006; Spring, 2007).
The APA Task Force defined EBP in psychology as "…the integration of the best available research with clinical expertise in the context of patient characteristics, culture, and preferences" (APA Presidential Task Force, 2006, p. 273). Along with clinical skill and attention to cultural and contextual characteristics, a defining feature of EBP is the use of the "best available" research-based practices. Treatments that have rigorous empirical evidence supporting their potential to produce positive outcomes for the intended population are commonly known as evidence-based interventions (EBIs) (APA Presidential Task Force, 2006). The number of EBIs has increased dramatically over the past decade due to the advancement in knowledge of different disorders and disabilities, and the growing resources to develop treatments for them. However, the science of assessing how these treatments are implemented, otherwise known as treatment integrity assessment, has not made as much progress in the same amount of time (Southam-Gerow & McLeod, 2013).

Treatment integrity refers to the degree to which an intervention is implemented as intended by the developer (Yeaton & Sechrest, 1981). The purpose of a treatment integrity assessment is to provide researchers and practitioners with data pertaining to different aspects of the implementation process that can help them draw valid conclusions about the intervention's effectiveness (Sanetti & Kratochwill, 2011). Researchers in the field of implementation science have raised concern about the science-to-practice gap in the EBP movement, where treatment integrity assessment has consistently been under-addressed in both efficacy and effectiveness studies of EBIs (Ogden & Fixsen, 2014).

Ideally, EBIs are evaluated through two streams of research to determine their utility. Efficacy research is conducted in a controlled lab or clinic setting under standardized conditions, and focuses on empirically evaluating whether or not an intervention works. Effectiveness research is typically conducted in applied settings under non-standardized conditions, and focuses on evaluating the feasibility of transporting an intervention from research to practice as well as the outcome of the intervention (Merrell & Buchanan, 2006; Kratochwill & Shernoff, 2004). These types of studies exist to ensure that the best evidence-based interventions can be disseminated for widespread use. In both types of research, the practice of high-quality treatment integrity assessment has largely been absent (Sanetti & Collier-Meek, 2014).

The lack of treatment integrity assessment can be observed across a variety of treatments for different disabilities and disorders (Sanetti & Kratochwill, 2008). For example, Dane and Schneider (1998) conducted a systematic review of primary and secondary prevention programs for behavioral, social, and academic problems from 1980-1994, and found that 39 out of 162 (24%) reviewed studies reported treatment integrity procedures. Of those 39 reviewed studies, 21 studies (54%) reported on exposure, 18 studies (46%) reported on adherence, 11 studies (28%) reported on quality of delivery, 10 studies (26%) reported on program differentiation, and 3 studies (8%) reported on participant responsiveness.
Similarly, Sanetti, Gritter, and Dobey (2011) conducted a systematic review of school- and home-based interventions from 1995-2008, and found that half of the coded studies (n = 112; 50.2%) reported quantitative treatment integrity data, while 29 studies (13%) mentioned monitoring treatment integrity but did not report any quantitative data.

The present study focused on the treatment integrity assessment of interventions for children with autism spectrum disorder (ASD) that are delivered by parents (Ostrosky, Zaghlawan, & Yu, 2009; Barton & Fettig, 2013). Research on treatments for autism has consistently endorsed early intensive intervention as the key to optimal, long-term outcomes (Schreibman, 2000). Moreover, parents are viewed as an integral part of the intervention process, which makes parent training and/or coaching a primary intervention strategy (Wolery & Garfinkle, 2002). Many studies of parent-implemented interventions have demonstrated significant positive results for children with ASD and their parents (Kasari et al., 2014; Wetherby et al., 2014; Shire et al., 2015; Gillett & LeBlanc, 2007). At the same time, some studies were not able to detect significant improvements in children and their parents who received a parent-implemented intervention either alone or in addition to their usual treatment (Carter et al., 2011; Green et al., 2010; Oosterling et al., 2010; Rogers et al., 2012). Although researchers agree on the importance of involving parents in the intervention process, the reasons for the mixed results with parent-implemented interventions are still largely unknown. One possible explanation for the inconsistent findings is that there may be variability in the treatment integrity of the coaching delivery and the treatment delivery. By examining different aspects of treatment integrity, researchers may be able to better understand the reasons for an intervention's success or failure (Nelson, Cordray, Hulleman, Darrow, & Sommer, 2012). The current study examined the treatment integrity of the parent coaching portion of a telehealth parent training program, with the coaching delivery examined critically from a multidimensional perspective.

Purpose and Significance of the Study

The primary purpose of this study was to conduct a multidimensional assessment of the treatment integrity of parent coaching in Project ImPACT (Improving Parents as Communication Teachers; Ingersoll & Dvortcsak, 2010), a parent training program that aims to improve parents' competence in teaching social communication skills to children diagnosed with autism spectrum disorder (ASD). The treatment integrity assessment was conducted at two stages to accurately reflect the indirect service delivery model of parent-implemented interventions: 1) coaching delivery, where the therapist coaches the parent, and 2) treatment delivery, where the parent works with the child (Frank & Kratochwill, 2008; Noell, 2008). In contrast to previous studies, the current study used a multidimensional approach to assess the treatment integrity of the therapist's coaching delivery and the parent's treatment delivery. For the coaching delivery, the assessment focused on the therapists' adherence to the coaching procedure, provision of feedback, and quality of coaching delivery, and the parents' responsiveness during the coaching session.
For the treatment delivery, the assessment focused on the parents' adherence to the intervention strategies and quality of the treatment delivery (Dane & Schneider, 1998). Assessing multiple aspects of treatment integrity for both the coaching and the treatment deliveries increased the possibility of identifying specific coaching delivery components that may be helpful in improving parents' understanding and utilization of the intervention techniques.

This study has the potential to contribute to research and practice in several ways. First, it can enrich the ways in which treatment integrity is conceptualized and assessed through the use of a multidimensional approach to treatment integrity assessment (Dane & Schneider, 1998). Although this concept has been discussed extensively in the implementation science literature, the actual practice of measuring different aspects of treatment integrity has remained scarce (McIntyre, Gresham, DiGennaro, & Reed, 2007; Perepletchikova, Treat, & Kazdin, 2007). Second, it can lead to more effective interventions by illuminating the role of implementation integrity in outcomes. The study attempted to use the construct of treatment integrity as a lens to examine the potential relationship between aspects of the coaching delivery and aspects of the treatment delivery. In doing so, it was possible to explore ways in which parent training models could be improved to benefit parents' learning.

Conceptual Framework: Treatment Integrity

This study used Dane and Schneider's (1998) multidimensional model of treatment integrity. Their systematic review of over one hundred primary and secondary prevention programs for behavioral, social, and academic problems revealed five components of treatment integrity: adherence, exposure, quality of delivery, participant responsiveness, and program differentiation. The authors stressed the importance of measuring these components in a comprehensive treatment integrity assessment to get an accurate picture of an intervention's effectiveness. In the present study, all of the components were measured except for program differentiation. This component would require the presence of alternative treatments for comparison purposes, which was not the case for the Project ImPACT studies. The parents in the studies were only given one treatment package, which was Project ImPACT. Therefore, program differentiation was left out of the treatment integrity assessment. Thus, four key components of treatment integrity were examined for the coaching delivery: Therapists' adherence, exposure, quality of coaching delivery, and participant responsiveness. Two key components of treatment integrity were examined for the treatment delivery: Parents' adherence and quality of treatment delivery.

There were two reasons for using a multidimensional conceptualization of treatment integrity. First, interventions have become increasingly complex over the years, particularly the intervention selected for this study (Schulte, Easton, & Parker, 2009). Multi-component intervention packages consisting of different methodological approaches to treating a disorder or a disability are now more common. They are typically delivered in a chronological sequence, with each preceding component laying the foundation for the next one. Second, there has been a shift in the role of the primary intervention provider.
For many ASD interventions, parents are now required to take on this important role because of their inherent knowledge of the child and established bond with the child (Wolery & Garfinkle, 2002). Given these recent changes, there is a need for more critical and comprehensive treatment integrity assessment to better understand the intervention's effectiveness and the influence of parent coaching.

Research Questions

The current study conducted a two-level treatment integrity assessment of a parent training model for an ASD social communication intervention, and analyzed the data to determine which aspect(s) of the coaching delivery best predicted how parents used the intervention techniques to work with their child during the live coaching sessions. The selected intervention for this study was the online version of the Project ImPACT training curriculum for parents of children with ASD (Ingersoll & Dvortcsak, 2010). The online coaching component of Project ImPACT was an appropriate context for this study because it fit within the indirect service delivery model that is commonly used in parent training programs. The multidimensional approach to treatment integrity assessment had the potential to highlight different aspects of the telehealth coaching format that may require modification in order to make this model of parent training more feasible to implement with success. This study aimed to answer the following research questions:

1. What is the average level achieved for each treatment integrity component of the coaching delivery: Therapists' adherence, exposure, quality of coaching delivery, and participant responsiveness?
   a. What is the average level achieved for each dimension of the quality of coaching delivery?
2. What is the average level achieved for each treatment integrity component of the treatment delivery: Parents' adherence and quality of treatment delivery?
   a. What is the average level achieved for each dimension of the quality of treatment delivery?
3. Which treatment integrity components of the coaching delivery predict parents' treatment adherence?
4. Which treatment integrity components of the coaching delivery predict parents' quality of treatment delivery?

Hypothesis

The goal of questions 1 and 2 was to provide an overview of the coaching process through descriptive statistics. The measurement of multiple aspects of treatment integrity was intended to characterize the therapists' coaching performance and the parents' teaching performance. The goal of questions 3 and 4 was to explore the potential relation between the coaching components and the treatment components. It was hypothesized that quality of coaching and participant responsiveness would be unique predictors of parents' treatment adherence and quality of treatment delivery during the coaching session. Learning contexts that are supportive, collaborative, and relevant are more likely to produce positive learning outcomes for adult learners (Dunst & Trivette, 2009). Moreover, opportunities for engagement and skills application are conducive to the overall goal of skills acquisition for adult learners (Knowles, Holton, & Swanson, 2005).

CHAPTER 2: LITERATURE REVIEW

This chapter begins by providing a background to treatment integrity by discussing the different conceptualizations of treatment integrity, the importance of assessing treatment integrity in treatment outcome research, and the barriers to addressing it in many intervention studies.
The second section briefly reviews the characteristics of ASD and the development of language and social communication skills, followed by a rationale for early intervention. The next section examines parent coaching and the current state of treatment integrity assessment for this particular treatment delivery model. Finally, the chapter closes with a review of previous studies that specifically examined the efficacy and effectiveness of Project ImPACT.

Conceptualization of Treatment Integrity: Past to Present

The concept of treatment integrity first emerged in the psychotherapy literature in the 1980s, when researchers raised a concern over a lack of data documenting how treatment programs were being delivered in authentic settings (Billingsley, White, & Munson, 1980). At the time, treatment integrity was described as "the degree to which treatment is delivered as intended" (Yeaton & Sechrest, 1981, p. 161). Along with intervention strength and effectiveness, integrity was viewed as a dimension that could determine the impact of a treatment (Yeaton & Sechrest, 1981). Despite the simple conceptualization of the construct and a widespread recognition of its importance, treatment integrity was still not consistently assessed in treatment evaluation research (Gresham, Gansle, & Noell, 1993; McIntyre et al., 2007). Researchers have suggested that the unitary conceptualization of the construct may have contributed to the low percentage of treatment integrity assessment across different disciplines because it does not provide sufficient details regarding what should be measured and how it should be measured (DiGennaro Reed & Codding, 2014).

As the interest in treatment integrity slowly increased in the following decades, researchers began to shift their view of this construct towards a multidimensional perspective in recognition of the fact that treatment implementation is a complex process (DiGennaro Reed & Codding, 2014). The reconceptualization of treatment integrity has resulted in different conceptual frameworks, all of which share similar components that address the intervention content and the implementation process (Sanetti & Kratochwill, 2009).

Conceptual framework #1. Gresham (1989) viewed treatment integrity as having two complementary categories – content variables and process variables. The content variables are treatment complexity, treatment operationalization, and treatment adaptation. They highlight the importance of knowing the original intervention design and how possible changes may have an effect on treatment implementation. The process variables are time, resources, and implementers' competence and motivation. They touch on the importance of acknowledging the organizational and individual factors that can have an impact on treatment implementation. Intervention studies that fail to consider these variables miss the opportunity to understand how an intervention is implemented, and how that implementation can affect the outcomes. Overall, this perspective of treatment integrity was helpful in bringing attention to the complexity of treatment implementation. The acknowledgement of different content and process variables was an important contribution to the treatment integrity literature because the discussion about this construct was minimal at the time (Gresham et al., 1993).

Conceptual framework #2. Moncher and Prinz (1991) also recognized the multifaceted aspects of treatment implementation in their conceptualization of treatment integrity.
The conceptual framework addressed important prerequisites to consider prior to beginning the treatment, specific features to measure during the implementation process, and ways in which the integrity data could be collected and used to help interpret the outcome data. According to this view, before entering the implementation phase, steps should be taken to operationalize the treatment components and provide training to the implementers. These prerequisites are set in place to ensure accuracy in the implementation process. Specific treatment features need to be measured during the implementation process. First, treatment adherence and treatment differentiation should be measured to ensure that (1) no non-prescribed procedures are implemented, and (2) the selected intervention is different from other interventions. Second, the therapist's professional characteristics (e.g., skills and knowledge) and personal characteristics (e.g., personality) should be evaluated to determine the role that human characteristics play in the implementation process. Third, the duration of each session and the overall treatment, and the frequency and intensity of exposure to specific components, should be measured to determine the extent to which the degree of exposure to treatment is related to the level of effectiveness. Given the wide range of treatment features, different measurement procedures are needed to collect the integrity data. Those procedures include self-reports (client and implementer), progress notes or permanent products, interviews, and direct observations. Treatment integrity data could be used to help interpret the outcomes by serving as a check for adherence, highlighting treatment components that are the most critical, and providing insight into the influence of treatment exposure.

Overall, this conceptual framework expanded prior conceptualizations of treatment integrity by proposing additional dimensions and providing greater depth to the aspects that were previously identified. Like Gresham's (1989) conceptual framework, this was helpful in expanding the current understanding of the construct and highlighting the significance of treatment integrity assessment in intervention research. Perhaps the most important contribution of this conceptual framework was the discussion of measurement procedures and how the data could potentially be used to draw valid claims about treatment effectiveness.

Conceptual framework #3. Waltz, Addis, Koerner, and Jacobson (1993) proposed that treatment integrity has two distinctive components: adherence and competence. Adherence refers to the therapist's accuracy in following the treatment protocol. Competence refers to the therapist's ability to deliver the intervention in a way that fits with the specific client and the current context. Both components are deemed to be significant parts of treatment integrity assessment because the data could be used to interpret treatment outcome data and improve therapist training as needed. Historically, it was common in psychotherapy to assume that high levels of adherence automatically translated to superior competence. This is an erroneous view, however, and a potentially dangerous one to adopt. A therapist could follow a treatment protocol and still deliver it poorly because he or she did not consider the client's progress, characteristics, or life stressors.
Therefore, adherence and competence should be measured separately because each dimension provides different information about the intervention delivery. Despite the theoretical significance of these components as part of treatment integrity assessment, they have not been widely represented in treatment outcome research, mainly due to the lack of a reliable measurement system and a shortage of resources in most authentic settings. Recommendations for measurement procedures include occurrence and non-occurrence checks for adherence, and rating scales for competence. However, systemic limitations could continue to hinder their presence in treatment integrity assessment.

In summary, this conceptual framework provided further support for treatment integrity assessment by addressing two specific components that hold valuable information regarding treatment implementation. Data concerning implementation accuracy and implementation competence could assist in revising the treatment or training the therapist to attain the desired treatment outcomes. The recognition of adherence and competence as distinctive dimensions helped clarify the longstanding misconception that adherence equates to competence. With this article, Waltz and colleagues (1993) successfully argued that they are independent but complementary components.

Conceptual framework #4. In their conceptualization of treatment integrity, Dane and Schneider (1998) followed previous works by distinguishing between content and process in treatment implementation. The content dimension consists of adherence, exposure, and program differentiation. The process dimension consists of quality of delivery and participant responsiveness. Adherence is defined as the degree to which the implementer follows the treatment protocol. Exposure is defined as the rate at which the participant is exposed to the treatment components (e.g., frequency and duration of each treatment session, and the duration of the overall intervention). Program differentiation is defined as the degree to which the selected intervention is distinguishable from other interventions. Quality of delivery is a qualitative index that measures the implementer's behaviors during the implementation process (e.g., degree of enthusiasm, preparedness, and general attitudes towards the treatment). Participant responsiveness is also a qualitative index that measures the participant's response to the treatment (e.g., degree of engagement and enthusiasm). Individually, each component measures a different aspect of treatment implementation. Altogether, they represent the view that treatment effectiveness is dependent on both content and process.

Dane and Schneider's (1998) framework shared similar features with existing frameworks, but also added clarity to the components to make assessment more feasible. The present study used this conceptual framework to guide the treatment integrity assessment of a parent training model for an ASD social communication intervention. In addition to the common components of treatment integrity (e.g., adherence and exposure), this framework takes into consideration the rarely assessed, but equally important, components that are integral to the indirect service delivery model of the intervention (e.g., quality of delivery and participant responsiveness).
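To make the multidimensional framework concrete, the sketch below shows one hypothetical way a single coded session could be represented as a record with one field per Dane and Schneider (1998) component. The field names, scales, and example values are illustrative assumptions only and are not the coding scheme used in the present study.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SessionIntegrityRecord:
    """Hypothetical coding record for one session, mirroring Dane and
    Schneider's (1998) five treatment integrity components."""
    adherence: float                    # proportion of protocol steps completed (0-1)
    exposure: float                     # e.g., minutes of coaching delivered
    quality_of_delivery: float          # rating-scale score (e.g., 1-5)
    participant_responsiveness: float   # rating-scale score (e.g., 1-5)
    program_differentiation: Optional[float] = None  # not assessed in the present study

# Illustrative example: a session coded on the four components examined here.
example_session = SessionIntegrityRecord(
    adherence=0.90,
    exposure=30.0,
    quality_of_delivery=4.0,
    participant_responsiveness=4.5,
)
print(example_session)
```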
The Importance of Treatment Integrity Assessment

An important message that cuts across the different conceptual frameworks is the significance of treatment integrity assessment in efficacy and effectiveness research, both of which are integral to the dissemination and implementation of EBIs (McLeod, Southam-Gerow, Tully, & Rodriguez, 2013; O'Donnell, 2008). Treatment integrity assessment plays an important role in those studies because the implementation process is composed of different aspects that could contribute to treatment effectiveness, whether in isolation or in combination (Perepletchikova & Kazdin, 2005). Without a comprehensive assessment of how an intervention was implemented, it would be difficult to determine the potential reason(s) for the successful or failed outcomes (Allen, Linnan, & Emmons, 2012). The significance of treatment integrity assessment could be viewed from both a practical and a methodological perspective.

Practical perspective. First, the assessment of treatment integrity is important for detecting errors in the implementation process that could undermine the effectiveness of an intervention (Moncher & Prinz, 1991). Therapists could use the data to provide the implementer with additional training in the intervention components where errors were detected. In turn, this prompt remediation could reduce the costs related to implementing the intervention, prevent long-term harm to the client, and increase the likelihood of the client benefitting from the intervention. Second, the assessment of treatment integrity is important for identifying effective and ineffective treatment components (DiGennaro Reed & Codding, 2014). With the available data, appropriate steps could be taken to modify the components that do not align with the client's needs, or intensify the components that are particularly effective to better serve him or her. Third, the assessment of treatment integrity is important for pinpointing the active ingredients of an intervention (Schulte et al., 2009). In the pursuit of transporting EBIs into authentic settings, some interventions will require a slight adaptation in order for them to be adopted (Durlak & DuPre, 2008). Knowing the active ingredients of an intervention would allow researchers to find the appropriate balance between adhering to the treatment protocol and adapting the program to meet the needs and circumstances of a setting or a client. In general, treatment integrity assessment is necessary to help researchers and practitioners make informed decisions about how to best implement an intervention so that it could benefit the intended client (Collier-Meek, Fallon, Sanetti, & Maggin, 2013; Borrelli, 2011).

Methodological perspective. In efficacy research, the primary goal is to preserve internal validity to ensure accurate claims about the causal role of the treatment in the outcomes, whereas effectiveness research is concerned with enhancing external validity to increase the possibility of successfully generalizing the intervention to other settings (Bellg et al., 2004). Treatment integrity has been theorized to have an effect on internal and external validity (Moncher & Prinz, 1991; Lane, Bocian, MacMillan, & Gresham, 2004).

Internal validity is concerned with implementing an intervention as intended by the protocol (Allen et al., 2012). Maintaining high internal validity is essential in an experimental study.
It allows researchers to make accurate claims about the causal relationship between the independent variable (e.g., intervention) and the dependent variable (e.g., behavior changes). In studies that assess the effects of an intervention, confounding variables are dangerous because they could alter the relationship between the intended independent variable and the dependent variable (Lane et al., 2004). A comprehensive treatment integrity assessment would give researchers the best chance at monitoring how the independent variable is bringing changes to the dependent variable, while looking for potential confounding variables that could limit their interpretation of the treatment outcomes. If there were significant results, researchers would be better able to determine whether they were due to the intervention or to confounding variable(s). Similarly, if there were non-significant results, researchers would know whether they were due to poor implementation or an ineffective intervention (Moncher & Prinz, 1991; Hohmann & Shear, 2002).

External validity is concerned with the generalizability of the treatment effectiveness to individuals with similar conditions (Allen et al., 2012). In the era of EBP, it is essential that interventions are transferable to serve different populations in different settings. One way to support a successful replication of an intervention is by providing documentation of the implementation process. A comprehensive treatment integrity assessment would help in providing that support (Moncher & Prinz, 1991). Furthermore, it could help to identify the active ingredients of an intervention that cannot be compromised when adaptations are made, thereby preserving the inherent internal validity of the intervention and maintaining a high external validity at the same time. Overall, poor treatment integrity data, or a lack of treatment integrity data altogether, could hinder future replications of an intervention, which in turn could result in negative consequences for the individuals receiving the treatment (Lane et al., 2004).

Barriers to Treatment Integrity Assessment

There are many factors contributing to the limited assessment of treatment integrity. Perepletchikova, Hilt, Chereji, and Kazdin (2009) used the Barriers to Treatment Integrity Implementation Survey (BTIIS) to assess possible barriers that make it challenging for researchers to include treatment integrity assessment in their studies. Psychotherapy researchers who participated in the survey rated the lack of a unified conceptualization of treatment integrity and specific guidelines for measurement procedures as the biggest barriers, followed by high demand for resources (e.g., time, cost, and labor), lack of general knowledge of treatment integrity, and lack of editorial requirement from journals. Lack of appreciation for treatment integrity was not viewed as a barrier among the participants. These findings suggest that the importance of treatment integrity assessment is widely recognized, but that methodological and systemic barriers prevent it from becoming a common practice in treatment outcome research (Liaupsin, Ferro, & Umbreit, 2012; Perepletchikova et al., 2009).

Characteristics of Autism Spectrum Disorder (ASD)

Diagnostic criteria. Autism spectrum disorder (ASD) is a neurodevelopmental disorder marked by a significant impairment in social communication and interaction, and the presence of restricted and repetitive behaviors (American Psychiatric Association, 2013).
In the social domain, children diagnosed with ASD have deficits in social-emotional reciprocity (e.g., failure to initiate and maintain a conversation; lack of shared interests, emotions, and affect), nonverbal communicative behaviors (e.g., lack of joint attention, poor understanding of nonverbal gestures, and poor integration of verbal and nonverbal communication), as well as developing, maintaining, and understanding relationships (e.g., lack of interest in same-age peers, difficulty engaging in imaginative play, and difficulty adjusting to different social contexts). In the behavior domain, children diagnosed with ASD typically exhibit restricted and repetitive behaviors (e.g., motor movements, speech, and use of objects), have an affinity for consistency in routines (e.g., difficulty dealing with small changes and transitions), and have highly restricted interests (e.g., abnormally strong attachment to an object). Additionally, they can be hypersensitive or hyposensitive to sensory input (e.g., fixation with lights and movement in the environment) (American Psychiatric Association, 2013).

Prevalence. The current prevalence data suggest that 1 in 68 children has a diagnosis of autism in the United States, with boys (1 in 42) being more commonly diagnosed than girls (1 in 189) (Centers for Disease Control and Prevention [CDC], 2014). However, these numbers should be interpreted with caution due to the methodology used in the study. The CDC primarily employed a record review method, where CDC clinicians reviewed the medical and education records of 8-year-old children from eleven sites in the country. One of the most glaring concerns was the sizable variation in the prevalence estimates across study sites (e.g., 1 in 46 in New Jersey and 1 in 175 in Alabama) (CDC, 2014). Critics of the CDC study noted that a true "prevalence" estimate should not be based on a review of records, but rather on a clinical assessment of children in a population-based sample. Until a more rigorous assessment-based method is adopted to measure the prevalence rate, it is difficult to determine the true prevalence of ASD in the country (Mandell & Lecavalier, 2014).

The early signs of autism. Parents of children with autism have reported seeing early warning signs at around the 1-year mark, although it took them an average of six months following this discovery to seek professional advice (Guinchat et al., 2012; Webb & Jones, 2009; Howlin & Moore, 1997). Of all the red flags associated with autism, the symptoms that raised the most concerns for parents were related to social communication development and language development (Charman & Stone, 2006; Chawarska et al., 2007). In the area of social communication, the atypical behaviors that caught parents' attention were avoidance (or total lack) of eye contact, social withdrawal, lack of response to social stimuli and initiations, lack of gesture, joint attention, and imitation, and lack of shared enjoyment and interests (Guinchat et al., 2012). In the area of language, the concerning behaviors were minimal (or complete absence of) language development, lack of response to social inputs (e.g., name, demands, questions), and lack of imaginative play. It is not surprising to find that children with autism experience the most deficits in these two areas since they are intertwined entities. For the most part, language development is acquired through social interactions.
By not engaging in social interactions, children inevitably lose out on many opportunities to learn language (Rogers, Hepburn, Stackhouse, & Wehner, 2003).

Developmental trajectory in typically developing children. In order to understand the social communication and language deficits experienced by children with ASD, it is important to take a look at the expected trajectory of social communication and language development. The path to becoming active social partners begins at the infancy stage, where the first social partner for children is typically their caregiver (Connell & Prinz, 2002). Children need to demonstrate the ability to share attention, affect, and intention at an early age in order to develop appropriate social communication skills (Woodward, 2003). Shared attention is characterized as children's ability to shift their gaze towards their social partner or a shared object of interest during an interaction. Shared affect is characterized as children's ability to share their current emotional state with their social partner, which also teaches them to interpret others' emotional states. Shared intention is characterized as children's ability to direct their social partner's attention towards them in order to meet a specific need. These skills are meant to set the stage for children to acquire appropriate social skills needed to engage in social interactions and form meaningful relationships (Flom, Burmeister, & Pick, 1998; Charman, 2003).

Children need to demonstrate the ability to use symbols at an early age, as it is a prerequisite for language development. Prior to acquiring language, children use a variety of functional behaviors to communicate their needs, thoughts, and shared interests during a dyadic interaction (e.g., reaching, grasping, pointing, waving). They come to learn through social interactions that these nonverbal behaviors could be used for effective communication. The capacity to communicate through symbols, coupled with the emerging development of speech perception, paves the way for both receptive and expressive language development (Gervain & Werker, 2008). Speech perception is still developing during the first year; therefore, children rely on nonverbal cues (e.g., gestural, situational, and intonation) prior to having developed an understanding of the words attached to them. The development of speech perception progresses after the first year. Children begin to learn their first words and use them in a wider variety of contexts through listening and observing others (Morgenstern, Leroy-Collombel, & Caet, 2013).

Rationale for early intervention. In comparison to typically developing children, children with ASD demonstrate marked deficits in both social communication and language skills. Treatment outcome research for ASD interventions suggests that children who receive intervention support for 2-3 years starting at preschool age stand to make substantial gains in these developmental areas (Strauss et al., 2012; Sallows & Graupner, 2005; Smith, Groen, & Wynn, 2000). A brief discussion about the interplay between the brain and the environment is necessary to understand this support for early intensive intervention for children with ASD. Along with genetic factors, the environment has a tremendous influence on the development of the brain (Sale, Berardi, & Maffei, 2009).
The brain has the ability to reorganize neural pathways based on environmental input, meaning that what individuals experience in their environment can change the architecture of the brain (Sale, Berardi, & Maffei, 2014). If an individual performs an action enough times, the brain will commit to certain neural organizations to help the individual remember that particular action. For example, if an infant is given ample opportunities to learn how to walk, specific neural pathways that are assigned to this behavior will become strengthened over time, thereby enabling the child to become proficient at walking. The same pattern of neuroplasticity can be applied to other developmental areas, such as language and social communication skills.

Although learning occurs across the lifespan, there are skills that need to be learned at a particular period in order for the individual to achieve optimal development. Neuroscientists refer to this as the "sensitive period" (Thomas & Johnson, 2008). The sensitive period is a window of opportunity where experience can have a profound effect on the architecture of the brain due to the flexibility of the neural circuits. Learning that takes place during this period is known to have lifelong adaptive benefits for the individual. Even though the brain can continue to reorganize itself throughout the lifespan, any changes in the neuro-connections that occur after the sensitive period will be constrained by the architecture that was set up during the sensitive period (Knudsen, 2004). The theory of neuroplasticity and the principle of the "sensitive period" suggest a need for providing early intensive interventions for children with ASD. Aside from the genetic influences, early exposure to developmentally appropriate experiences can greatly alter the child's cognitive functioning and behavioral patterns (Dawson, 2008; Ben Itzchak, Lahat, Burgin, & Zachor, 2008).

Parent-Implemented Interventions: The Use of Coaching

Parent-implemented interventions typically follow an indirect service delivery model, in which the therapist serves as the coach and the parent serves as the primary intervention provider (Friedman, Woods, & Salisbury, 2012). Rush and Shelden (2011) defined coaching as "an adult learning strategy in which the coach promotes the learner's (coachee's) ability to reflect on his or her actions as a means to determine the effectiveness of an action or practice and develop a plan for refinement and use of the action in immediate or future situations" (p. 8). Parent coaching is a graduated process of helping parents gain competence and confidence in implementing intervention strategies in the therapist's absence (Brown & Woods, 2016). This pedagogical approach is oriented around adult learning theory, in that parents are seen as self-directed learners who are actively involved in the learning process (Fleming, Sawyer, & Campbell, 2011). In this coaching context, the therapist-parent relationship is often characterized as equal, collaborative, respectful, and active (Knoche, Kuhn, & Eum, 2013). Parent coaching gained popularity in the first decade of the 21st century, with its influence appearing in professional policies regarding family involvement in children's education (e.g., National Early Childhood Technical Assistance Center, 2008; Individuals with Disabilities Education Act, 2004) (Kemp & Turnbull, 2014).
The concept emerged in the early intervention field as a response to the increasing dissatisfaction among some researchers with the traditional conceptualization of parent involvement in early intervention programs, in which parents typically receive formal training or education from a professional provider. This formal pedagogical practice was not deemed compatible with the family-centered vision that many researchers had in mind for early intervention programs (Winton, Sloop, & Rodriguez, 1999). Proponents who called for a reconceptualization of parent involvement in early intervention programs believed that the parent-therapist relationship should be marked by equal collaboration from both individuals (Rush, Shelden, & Hanft, 2003).

Telehealth delivery of ASD interventions. Parent coaching has become a staple component of ASD parent-implemented interventions, specifically programs that are delivered using telecommunication technologies (Gibson, Pennington, Stenhoff, & Hopper, 2010; Vismara, Young, Stahmer, Griffith, & Rogers, 2009). Under this telehealth approach, parents independently review the intervention contents online and afterward receive live coaching on their implementation practice through a telecommunication program (e.g., Skype) (Boisvert, Andrianopoulos, & Boscardin, 2010). This approach to providing parents with support is meant to increase parents' use of the intervention techniques in socially meaningful settings and routines outside of the clinic laboratory (Meadan et al., 2016). The parent coaching format for a telehealth-based ASD intervention program typically has three components: (1) an update on home practices, (2) a short parent-child play segment, and (3) a post-play feedback segment (Vismara, McCormick, Young, Nadhan, & Monlux, 2013; Ingersoll, Wainer, Berger, Pickard, & Bonter, 2016).

There is evidence to suggest that parent coaching is a beneficial practice in telehealth-based ASD intervention programs. Results from multiple studies suggested that both parents and their children demonstrated improvement following the conclusion of the intervention. The common areas of improvement for parents were increased frequency of implementation and fidelity in using the intervention techniques, self-efficacy in being their child's teacher, and comfort in talking about their concerns regarding their children. Parents also reported a positive change in how they viewed their child and a greater appreciation for home practices (McDuffie et al., 2013; Ingersoll et al., 2016; Hepburn, Blakeley-Smith, Wolff, & Reaven, 2015). The common area of improvement for children was social communication skills (Simacek, Dimian, & McComas, 2017; Wainer & Ingersoll, 2015).

Treatment Integrity Assessment of Parent Coaching

Based on the coaching model of many ASD parent-implemented interventions, treatment integrity assessment should occur at two stages: The coaching delivery and the treatment delivery (Frank & Kratochwill, 2008). Existing literature reviews show an acceptable rate of treatment integrity assessment for the treatment delivery, but not for the coaching delivery. For example, Barton and Fettig (2013) analyzed 24 parent-implemented interventions for general disabilities from 1972-2012. They found that 19 (80%) studies reported treatment integrity data for the treatment delivery, but only 7 (30%) studies reported treatment integrity data for the coaching delivery.
Similarly, Meadan, Ostrosky, Zaghlawan, and Yu (2009) evaluated 12 parent-implemented autism interventions for social communication from 1997-2007. They found that 9 (75%) studies reported treatment integrity data for the treatment delivery, but only 2 (16%) studies reported treatment integrity data for the coaching delivery. Schultz, Schmidt, and Stichter (2011) found a comparable trend when they evaluated 30 parent education programs for parents of children with ASD. They found that none of the studies reported treatment integrity data for the coaching delivery. This pattern of low treatment integrity assessment for the coaching delivery poses an ongoing challenge for researchers and practitioners to better understand the triadic nature of the collaborative, family-centered intervention approach to many ASD treatments.

Under the indirect service delivery model, conducting a comprehensive treatment integrity assessment of the coaching delivery is important for several reasons. First, it affords the opportunity to examine how coaching is demonstrated in this indirect service delivery model. Autism researchers who study the effectiveness of parent training programs have been grappling with the challenges of determining the most effective elements of coaching that best serve the purpose of supporting parents. The treatment integrity data of the coaching delivery could potentially provide insight into this question. Second, it serves as an important source of support to interpret treatment outcomes related to the parent and the child. While coaching does not automatically ensure successful treatment outcomes, it is still necessary to monitor how the strategies are used to identify problems that may arise during the coaching session. The lack of treatment integrity data for the coaching delivery would limit the interpretation of the treatment outcome data in light of the potential problems.

The Present Study: A Focus on Project ImPACT

Theoretical foundation. Project ImPACT (Improving Parents as Communication Teachers; Ingersoll & Dvortcsak, 2010) is a parent training curriculum that focuses on helping parents use a combination of developmental and behavioral strategies to teach their child important social communication skills. The curriculum is guided by four principles. First, the curriculum takes on a naturalistic approach to teaching social communication skills. Parents are taught to use daily interactions and routines as teaching opportunities for the child. This teaching approach has several benefits, such as a higher likelihood of intervention acceptability by the parent(s) and a higher likelihood of generalization and maintenance of skills by the child. Second, the curriculum reflects the typical developmental sequence for social communication. Treatment targets are selected based on the understanding that nonverbal communicative behaviors typically develop before language skills. This developmental framework has several benefits, such as allowing the child to learn at an age-appropriate pace and building a solid foundation for more complex behaviors. Third, the curriculum places a strong emphasis on the parent-child relationship. It utilizes the ongoing interactions between the parent and the child as the context for enhancing social responsiveness in the child. One benefit of this teaching approach is the increase in the parent's responsiveness to the child, which serves as the catalyst for the child's reciprocal responsiveness. Fourth, the curriculum is guided by ABA principles.
This teaching approach focuses on manipulating antecedent variables to trigger the target behavior and systematically applying reinforcement to increase the behavior’s occurrences. The common teaching techniques include prompting, chaining, and fading. The ABA-based teaching approach, along with the specific techniques, are beneficial to the curriculum because they are an appropriate complement to the behavioral naturalistic teaching approach. Content. Project ImPACT concentrates on four core skills of social communication: Social engagement, language, social imitation, and play. Social engagement. Many children with ASD have difficulty initiating and maintaining social interactions. This impairment in social engagement stems from the absence of a key social behavior: Joint attention. Joint attention is the ability to coordinate your attention between an object and another person during an interaction. It preserves the purpose of social interactions, which is to show and share your interest(s) with another person. Joint attention is believed to be one of the prerequisite skills for language acquisition, and therefore, is considered an important behavioral target in this program. Language. Many children with ASD have deficits in language development. First, they experience difficulty in understanding the language content (e.g., vocabulary) and its form (e.g., grammar and structure). Second, they experience difficulty in using language as a means of communication (e.g., to request, to protest, to gain and maintain attention, and to express feelings and needs). Due to this deficit, they also struggle with deciphering social rules for communication (e.g., reading verbal and nonverbal cues, maintaining physical proximity, and gauging the listener’s responses and needs). The program primarily focuses on teaching the 25 children expressive language skills, and, teaching the parents strategies to help their child understand how to use language properly and why language is important for communication. Social imitation. Many children with ASD lack social imitation skills, which makes it difficult for them to learn new skills and engage in social interactions. Without the ability to imitate, these children have a harder time picking up new behaviors because most behavioral programs rely on modeling as a teaching tool. Similarly, the lack of ability to imitate prevents these children from connecting with others in their environment. The program places an emphasis on teaching social imitation skills because they are considered a prerequisite for more complex communication skills, such as play, language, and joint attention. Play. Many children with ASD have difficulty engaging in meaningful play. Instead, they are known to engage in nonfunctional and repetitive play that does not serve a true purpose. They also have difficulty engaging in symbolic or pretend play. Meaningful play skills, particularly ones that involve symbolic thinking, play a crucial role in other developmental domains, such as language, pretend play, fine and gross motor skills, problem-solving skills, perspective-taking skills, and imaginative skills. Furthermore, play serves as a vehicle for social interactions in the early stages of life. Because having play skills is necessary for developing relationships and learning essential skills, the program has an emphasis on teaching them in a natural environment. Evidence of the intervention effectiveness. 
To date, two studies have evaluated the efficacy and effectiveness of Project ImPACT (Ingersoll & Dvortcsak, 2010). Results from both studies suggested that Project ImPACT increases children's communication skills and improves parents' use of the intervention techniques with high fidelity. Efficacy study. Ingersoll and Wainer (2013) studied the initial efficacy of Project ImPACT with eight mother-child dyads. The children qualified for the study by meeting the DSM-IV criteria for autism or pervasive developmental disorder-not otherwise specified. There were seven males and one female with an average chronological age of 53 months. The average level of cognitive development was 25.9 months, and the average level of language development was 22.9 months. The study used a single-case, multiple baseline design across participants. Most of the baseline and treatment sessions occurred in a research setting, with an average of two baseline sessions and three treatment sessions occurring in the home to test for generalization. Participants were split into two groups: one group (n = 3) received training 1 day per week for a total of 12 sessions, and one group (n = 5) received training 2 days per week for a total of 24 sessions. The intervention techniques included in Project ImPACT fall into four categories: (1) Make Play Interactive, (2) Models and Expands Language, (3) Creates Opportunities for Initiations, and (4) Helps Increase Complexity of Initiations. The fifth category, Pace the Interaction, summarizes how well the parents used all of the techniques. During the baseline phase, the mother and the child were given a set of toys and instructed to engage in free play for 10 minutes. At the first session, the parent and the therapist worked together to identify goals in the areas of social engagement, language, imitation, and play. Once the goals were established, the subsequent sessions followed the same format: (1) homework review, (2) therapist teaches the new intervention technique, (3) therapist models the technique, (4) mother practices the technique with the child while receiving immediate feedback, and (5) both therapist and mother discuss the next homework assignment. At the end of each session, the mother and the child were asked to engage in a 10-minute free play again. These 10-minute sessions were used by blinded observers to code child spontaneous language and parent treatment fidelity. Additionally, parents were given a goal achievement score, which was calculated by dividing the number of language goals achieved by the number of initial language goals. Treatment fidelity. At baseline, parents' average fidelity ratings across the five dimensions hovered around the 1-3 range on a scale of 1 to 5. During the treatment implementation phase, parents reached adequate fidelity of implementation (a score of 4 or above) for each dimension in the sequential order in which the dimensions were introduced in the program. However, their average ratings slightly decreased with the introduction of new intervention techniques. During generalization, all but one parent had an increasing trend in their average fidelity ratings. One month after the treatment phase, the average fidelity ratings of all parents remained higher than the average fidelity ratings achieved during the baseline phase.
In comparison to the average fidelity ratings at baseline (M = 1.87, SE = 0.18, p < 0.01), parents demonstrated significantly higher average fidelity ratings during treatment (M = 3.32, SE = 0.12, d = 15.24) and at follow-up (M = 3.60, SE = 0.33, d = 18.18), with large effect sizes in both cases. Child spontaneous language. During baseline, five children displayed low to moderate levels of spontaneous language, two children demonstrated little to no spontaneous language, and one child demonstrated higher levels of spontaneous language. During treatment, four children demonstrated an increase in their spontaneous language and maintained those gains throughout generalization and follow-up. One child demonstrated initial gains, fluctuated slightly during treatment, and eventually declined towards the end of treatment. He did not demonstrate generalization of spontaneous language at home, but did demonstrate an increase at the 1-month follow-up. Another child experienced immediate gains at the onset of treatment and maintained the high level of spontaneous language during generalization and follow-up. Two children demonstrated no changes in their spontaneous language. In comparison to their baseline performance, the children's spontaneous language gains were significantly higher during treatment and follow-up. In comparison to the rate per minute of spontaneous language at baseline (M = 0.67, SE = 0.26, p < 0.01), children demonstrated a significantly higher rate per minute of spontaneous language during treatment (M = 1.00, SE = 0.25, d = 0.48) and at follow-up (M = 1.66, SE = 0.33, d = 1.44), with medium to large effect sizes in both cases. In addition, there was a significant difference between the rate per minute of spontaneous language during treatment and the rate per minute of spontaneous language at follow-up, characterized by a large effect size (d = 0.96). Lastly, a significant relationship was found between parents' average fidelity ratings and children's spontaneous language, b = 0.11, t(125) = 2.73, p < 0.01. In particular, the Make Play Interactive dimension (b = 0.12, t(136) = 2.71, p < 0.01) and the Helps Increase Complexity of Initiations dimension (b = 0.13, t(122) = 3.09, p < 0.01) were unique predictors of children's spontaneous language use. Goal achievement. Across the eight dyads, 12 out of 17 (71%) initial language goals were reached at the conclusion of treatment. All but two children met their predetermined goals. Conclusion. Project ImPACT appeared to have a positive effect on the parents' teaching ability and their children's spontaneous language skills. Parents reached adequate fidelity ratings during treatment and maintained them at follow-up. All but two children experienced significant gains in their spontaneous language. Lastly, the Make Play Interactive and Helps Increase Complexity of Initiations dimensions were found to be the most active ingredients of the program.
Effectiveness study. Stadnick, Stahmer, and Brookman-Frazee (2015) studied the effectiveness of Project ImPACT in a community setting with 30 mother-child dyads. There were twenty-four boys and six girls with an average chronological age of 54.83 months. The children qualified for the study by having an official diagnosis of autism or being considered "at risk" for ASD by a community mental health professional. Sixteen pairs were assigned to the experimental group (Project ImPACT), and fourteen pairs were assigned to the control group (community-based services).
As in the previous study, the intervention techniques of Project ImPACT fell into four categories: (1) Make Play Interactive, (2) Models and Expands Language, (3) Creates Opportunities for Initiations, and (4) Helps Increase Complexity of Initiations. The fifth category, Pace the Interaction, summarizes how well the parents used all of the techniques. During the baseline phase, the mother and the child were given a set of toys and instructed to engage in free play for 10 minutes. The program was condensed into 12 sessions, with participants receiving 1 hour of training each week for 12 weeks. Session 1 was reserved for the intake assessments and the introduction to the program. Session 2 focused on helping parents set up the home environment to accommodate the practice sessions. Session 3 focused on working with the parents to set specific goals for the child. Sessions 4-11 focused on teaching parents the intervention techniques outlined in the program. Session 12 was reserved for the post assessments and the development of a plan to help parents continue using the intervention techniques. At the end of each session, the mother and the child were asked to engage in a 10-minute free play again. These 10-minute sessions were used to code for parent treatment fidelity by blinded observers. Child communication skills. There was a significant group × time interaction for this child outcome, F(1, 27) = 5.70, p < 0.05, η² = 0.17. Children in the experimental group saw a significant increase in their score on the communication domain of the Vineland-II at the conclusion of treatment. The standard score was 72.06 at baseline and 81.38 at week 12. Meanwhile, for the control group, the standard score was 72.07 at baseline and 73.08 at week 12. Child social skills. There was not a significant group × time interaction for this child outcome, F(1, 27) = 1.43, p = 0.24, η² = 0.05. For the experimental group, the children's score on the social skills domain of the Vineland-II increased by week 12, but the increase was not significant. The standard score was 69.31 at baseline and 74.81 at week 12. Meanwhile, for the control group, the standard score was 67.29 at baseline and 67.54 at week 12. Parent stress. There was not a significant group × time interaction for this parent outcome, F(1, 24) = 1.62, p = 0.22, η² = 0.06. Even though parents in the experimental group scored lower at week 12, the decrease was not significant. The total raw score was 93.38 at baseline and 84.56 at week 12. Meanwhile, for the control group, the total raw score was 100 at baseline and 100.62 at week 12. Parent depression symptoms. There was not a significant group × time interaction for this parent outcome, F(1, 27) = 0.83, p = 0.37, η² = 0.03. Even though parents in the experimental group scored lower at week 12, the decrease was not significant. The total raw score was 11.06 at baseline and 7.81 at week 12. Meanwhile, for the control group, the total raw score was 12.79 at baseline and 13 at week 12. Parent intervention adherence. Parents in the experimental group demonstrated a strong upward trend in their treatment adherence rating, F(1, 22) = 4.14, p = 0.05, η² = 0.16. The average adherence rating was 3.51 at baseline and 3.98 at week 12. Meanwhile, for the control group, the average adherence rating was 3.51 at baseline and 3.24 at week 12. Conclusion.
Parents demonstrated an increase in their ability to use the intervention techniques with accuracy. In comparison to children in the control group, the children in the Project ImPACT group demonstrated substantial gains in their communication skills. However, the same result was not replicated for the social skills domain. Additional analyses showed that children whose parents reported a higher stress level at baseline demonstrated minimal gains in their social skills at week 12. Implications for future research. Both studies had a number of strengths that make them a valuable contribution to the autism intervention literature. The efficacy study observed parents' learning under two coaching schedules: once a week and twice a week. Parents who received coaching once a week also saw an improvement in their implementation skills. This finding suggested that it is possible to condense the program without compromising the parents' learning. Community settings that have limited time and resources can opt to use the shortened coaching schedule and expect similar results. Furthermore, the study detected specific active ingredients of Project ImPACT. This information could help therapists decide what kind of support to give parents who have difficulty using these intervention techniques. The effectiveness study evaluated Project ImPACT in a community setting, which supported the ongoing agenda to make parent-implemented interventions easily accessible outside of the laboratory setting. Additionally, the study looked at parent stress and depression symptoms, which are significant mental health factors that commonly affect parents who have children with autism. Most importantly, the study examined how these mental health factors could predict children's outcomes, namely their social and communication skills. At the same time, the studies had several limitations. The primary limitations of the efficacy study were the limited measurement of generalization outside of the clinic setting and the lack of data regarding parents' practices at home. The effectiveness study did not use random assignment to groups (due to ethical constraints) and relied on parent-report data for both the parent and child outcomes. Finally, the studies shared a limitation that has been an ongoing concern for ASD parent-implemented interventions. Both studies addressed integrity at the parent level by measuring treatment adherence, but did not examine the integrity of the coaching that parents received. Although parents improved in their teaching ability, the lack of treatment integrity data on the coaching practices makes it difficult to conclude that their improvement is related to the support that they received. This is an important limitation to address in future studies because coaching is a prominent feature of Project ImPACT.
CHAPTER 3: METHOD
The Present Study
The current study analyzed videos of parent coaching sessions from two randomized controlled trial (RCT) studies that looked at the effectiveness of delivering Project ImPACT online (Ingersoll, Wainer, Berger, Pickard, & Bonter, 2016). Each parent received twelve coaching sessions corresponding to the twelve lessons in the program. The first three lessons focused on introducing the parent to the program, creating goals for the child, and setting up the home to promote success during treatment implementation.
The next eight lessons taught parents a variety of techniques to use when working with the child to improve their language and play skills. The last lesson provided guidance for parents to incorporate all the techniques into their interactions with the child. Treatment integrity was evaluated for the eight coaching sessions that provided the core lessons of the program. The coaching sessions for the first three lessons were not included in the analysis because they were part of the preparation stage, and thus, did not involve direct coaching of the teaching techniques that parents must learn. The coaching session for the last lesson was also not included in the analysis because the topic was a review of the previous eight lessons. A total of 130 coaching sessions from 19 parents were coded for treatment integrity. Ten parents had a complete set of eight coaching sessions. Nine parents had varying numbers of missing coaching sessions, ranging from 1 to 5. A total of 127 coaching sessions from 18 parents were used in the final analysis. One set of coaching sessions was dropped due to the cluster size being too small for a multilevel analysis (minimum requirement: at least 5 data points per cluster).
Delivery of Project ImPACT via Telehealth: RCT Studies
Study description. The online version of Project ImPACT is a modified version of the original Project ImPACT curriculum. There are twelve lessons in total. The first two lessons focus on helping parents become familiar with the program and the typical trajectory of social communication development. One lesson focuses on helping parents set up their home to ensure a successful experience when working with their child. The following eight lessons focus on different intervention techniques within each teaching domain. The last lesson focuses on helping parents incorporate all the intervention techniques into their interactions with the child. The randomized controlled trial (RCT) studies looked at two different formats of delivering Project ImPACT online: self-directed and therapist-assisted. The families were randomly assigned to each format. Both groups had access to the online program for six months. The parents were encouraged to complete one lesson each week and to practice the intervention techniques with their child between lessons. Parents in the self-directed group were instructed to complete the online training independently and to contact project staff members with any questions or concerns regarding the technology. In contrast, parents in the therapist-assisted group received two 30-minute coaching sessions each week through the telecommunication program Skype. The first session focused on helping parents understand the lesson content and how they could incorporate the intervention techniques into their daily interactions with the child. The second session focused on providing parents with live feedback on their use of the intervention techniques. Participants. The parents and children in this study were recruited from different agencies that serve children with ASD. To qualify for the study, the parents had to be proficient in English, and the children had to meet criteria for ASD based on the DSM-IV and the ADOS-G or ADOS-2. The average age of the children in the first RCT study was 3.5 years; the average age of the children in the second RCT study was 4.4 years. Therapists. All of the therapists were part of the Autism Lab in the Psychology Department at a Midwestern university.
Four therapists were advanced graduate students, and one therapist was the lab manager. All of the therapists received formal training on Project ImPACT. Project ImPACT: Target Teaching Domains Domain 1: Make play interactive and modeling and expanding language (lesson and coaching sessions 4-6). Parents are first taught how to make play interactive for the child in order to boost their child’s interest, attention, and motivation. The three recommended techniques are: Follow Your Child’s Lead, Imitate Your Child, and Animation. With the Follow Your Child’s Lead technique, parents take on a meaningful, supportive role by following the child’s preferred play scheme. Questions and directions are replaced by simple comments of the child’s play that indicate interest and engagement. With the Imitate Your Child’s technique, parents imitate the child’s play with toys, movements, and vocalizations. The child continues to take the lead, but the parent can correct any physical or verbal behaviors that are inappropriate. With the Animation technique, parents demonstrate their enthusiasm towards the activity by exaggerating their gestures, facial expressions, and vocal quality. Parents are taught specific strategies to model and expand the child’s language during play. One strategy is to provide meaning to the child’s vocalizations and gestures to help the child understand that his or her behaviors carry a meaningful purpose. Another strategy is to use parallel play and self-talk to place meaning on the parent’s and the child’s actions. 36 Domain 2: Create opportunities for initiations (lesson and coaching session 7). Parents are taught how to create opportunities for the child to initiate communication with them during play. The three techniques are: Playful Obstruction, Balanced Turns, and Communicative Temptations. With the Playful Obstruction technique, parents use multiple tricks to interrupt their child’s play with the intention of getting the child to communicate his or her desire to have the obstruction removed in order to continue with the activity. With the Balanced Turns technique, parents engage the child in a “back and forth” interaction during play to teach the child important early social skills. With the Communicative Temptations technique, parents use a variety of strategies (e.g., leaving toy items out of reach, controlling access to the toys, choosing toys that require assistance, and withholding important parts of a toy) to set up opportunities for the child to initiate an interaction. Domain 3: Increase complexity of receptive and expressive language (lesson and coaching sessions 8-9). Parents are taught the structure of prompting and providing reinforcement to teach the child expressive and receptive language. There are eight levels of prompts for expressive language, and three levels of prompts for receptive language. Each level of prompt signifies a degree of support given to the child, ranging from the least intrusive support (e.g., verbal instructions) to the most intrusive support (e.g., physical prompt). With regard to prompting, parents are taught to give clear, relevant, and developmentally appropriate prompts. They are instructed to monitor the child’s motivation to respond, and use the 3-prompt rule to prevent the child from losing interest in the interaction. In regard to providing reinforcement, parents are taught to give natural, immediate reinforcement only when the child demonstrates a good attempt at providing the correct response. 
37 Domain 4: Increase complexity of imitation and play (lesson and coaching sessions 10-11). Parents are taught to use verbal and nonverbal prompts to teach the child social imitation and play skills. In regard to social imitation, parents start out by imitating the child’s play actions even if they are nonfunctional. This step helps the child understand that imitation is a back-and- forth interaction. Next, parents move into modeling play actions that have a high interest value and can be easily replicated for the child. The modeling should be paired with a verbal description of the actions to help the child learn them so that he or she can imitate spontaneously. In regard to play skills, parents start out by using the child’s selected toy to model play actions that are different and more complex than the child’s current play repertoire. When modeling a new play action, parents can use a play-related prompt to get the child to engage in the new way of playing. Similar to the language lesson, reinforcement is only given when the child demonstrates a good attempt at providing the correct response. Components of Treatment Integrity: Coaching Delivery Using Dane and Schneider’s (1998) treatment integrity framework, the following treatment integrity components were coded: Adherence, exposure, quality of coaching delivery, and participant responsiveness. According to this framework, adherence and exposure characterize the content of the parent training model, while quality of coaching delivery and participant responsiveness characterize the process of the parent training model. Adherence. Adherence was defined as the degree to which the therapist followed the coaching procedure (Wainer & Ingersoll, 2013; Dane & Schneider, 1998). This component was measured using a modified version of a pre-existing adherence self-report checklist from the initial RCT study (Ingersoll et al., 2016). Several action items related to the therapist’s preparation work prior to meeting with the parent were removed from the modified version 38 because they could not be viewed from the videos (e.g., reviewing the parent’s reflection responses). This component was coded holistically by looking back over the whole session. The observer marked the therapist’s implementation of each action item as either “Observed,” “Not Observed,” or “Not Available.” An adherence percentage was calculated by dividing the Observed marks by the sum of the Observed and Not Observed marks, and multiplying it by 100 (see Appendix A). Exposure. Exposure was defined as the amount of constructive feedback that parents received during the coaching session. Constructive feedback took two forms: corrective and reinforcement. Corrective feedback was defined as comments that were meant to improve the parent’s use of the strategies (Shanley & Niec, 2010). For example, the therapist could say, “Because Johnny is not paying attention to you, why don’t you try to get down on his level and make eye contact with him?”. Reinforcement feedback was defined as comments that were meant to encourage and reinforce the parent’s use of the strategies (Shanley & Niec, 2010). For example, the therapist could say, “You did a wonderful job adjusting your communication tone when you saw that Johnny was feeling overwhelmed!”. This component was coded in 5-minute intervals. The observer counted the number of corrective and reinforcement comments in each interval. The overall score was the combined number of comments across all of the intervals. 
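As a concrete illustration of these two scoring rules, the short R sketch below computes a therapist adherence percentage from the checklist marks and an exposure total from interval-level comment counts. The marks and counts are hypothetical examples, not data from the study, and the object names are illustrative only.

# Hypothetical checklist marks for one coaching session:
# "O" = Observed, "N" = Not Observed, "NA" = Not Available
checklist <- c("O", "O", "N", "O", "NA", "O", "N", "O")

observed     <- sum(checklist == "O")
not_observed <- sum(checklist == "N")

# Adherence percentage = Observed / (Observed + Not Observed) * 100
adherence_pct <- observed / (observed + not_observed) * 100   # 71.4 for the marks above

# Hypothetical counts of corrective and reinforcement comments
# in each 5-minute interval of the same session
corrective    <- c(2, 1, 3, 0, 2, 1)
reinforcement <- c(4, 3, 2, 5, 3, 4)

# Exposure = total number of constructive comments across all intervals
exposure_total <- sum(corrective) + sum(reinforcement)         # 30 comments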
A comment was counted as an occurrence if it was specific (e.g., “You did a great job gaining his attention!”). A comment was not counted as an occurrence if it was vague (e.g., “That’s great.” or “That’s awesome.”) (see Appendix B). Quality of coaching delivery. Quality of coaching delivery was defined as the therapist’s effectiveness in strengthening the parents’ teaching capacity through the use of 39 specific collaborative consultation strategies (Dane & Schneider, 1998; Knoche, Sheridan, Edwards, & Osborn, 2010). This component was considered a higher order construct. It encapsulated four different coaching behaviors: 1) responsiveness to the parent’s needs and concerns, 2) encouragement of reflection, 3) presence of support, and 4) quality of feedback. The therapists were evaluated across the four quality indicators on a 5-point Likert scale (1=Very Low, 2=Low, 3=Moderate, 4=High, 5=Very High). This component was coded in 5- minute intervals. In each interval, the observer assigned a score for each of the four quality indicators. The average of the interval scores was counted as the final score for each quality indicator. An overall score for quality of coaching delivery was computed based on the mean of the four quality indicators (see Appendix C). Responsiveness to the parents’ needs and concerns. This item reflected the therapist’s responsiveness to the parents’ needs and concerns throughout the session. Responsiveness from the service provider looks different in parent-implemented intervention services than in traditional therapist-lead intervention services (Campbell & Sawyer, 2007). In this intervention model, the service provider is seen as a coach. This role has a different set of responsibilities, all of which are designed to facilitate learning for parents within a collaborative approach (Foster, Dunn, & Lawson, 2013). From this perspective, the therapist is not expected to solve all the problems for parents. Instead, the therapist is a facilitator who guides parents through analyzing problems and generating potential solutions. This consultative practice helps parents understand their own strengths and contributions to the child’s development (Rush, Shelden, & Hanft, 2003). In this study, responsiveness was defined as addressing areas of difficulty that the parents may bring up during the discussion segment or experience during the play segment. It was not 40 enough for the therapist to only acknowledge the problem. Rather, the therapist needed to facilitate a joint problem-solving opportunity. Low level of responsiveness was characterized as the therapist completely ignoring the parents’ needs and concerns, or merely acknowledging them without jointly coming up with a solution. High level of responsiveness was characterized as the therapist actively working with the parents to come up with a solution. Areas that may have required additional support included: difficulty understanding or implementing an intervention technique, difficulty in addressing implementation problems at home, and difficulty in providing supportive social behaviors (e.g., deciding what step to take next or what to say to the child during the play segment). Encouragement of reflection. This item reflected the extent to which the therapist created opportunities for the parents to reflect on their implementation progress. Parent- implemented intervention services place parents at the center of the learning experience (Cambray-Engstrom & Salisbury, 2010). 
The key to successful learning is that parents are given opportunities to evaluate and reflect on current strengths and limitations (Friedman, Woods, & Salisbury, 2012). As a coach, the therapist is expected to create a learning context that promotes self-assessment so that parents are able to refine their knowledge and skills (Rush et al., 2003). In this study, the therapist could have encouraged the parents to reflect by: 1) asking questions about the daily routines, the use of strategies, or the child’s developmental progress outside of the coaching session, and 2) asking for a self-evaluation of their play segment with the child. Low level of encouragement was characterized as the therapist allowing the parents to go through the coaching session without stopping to reflect on their implementation progress and the impact that it may have on the child’s skills development. High level of encouragement was 41 characterized as the therapist posing specific questions to get the parents thinking about their implementation progress and the impact that it may have on the child’s skills development. Presence of support. This item reflected the therapist’s efforts in empowering and encouraging the parents. It also reflected the extent to which the therapist presented as caring and approachable. The shift towards a parent-focused approach indicates that parents create their own learning experience while the therapist provides the necessary support to enhance their competence and confidence in being the child’s teacher (Friedman et al., 2012). Parents have reported being appreciative of support that comes in the form of empowering and encouraging words. Moreover, they have noted that therapists who present as caring and approachable are deemed as the most effective (Knoche et al., 2013). Low presence of support was characterized by behaviors such as criticizing the parents’ mistakes in ways that make them lose confidence in themselves as the child’s teacher, and failing to check if they needed any support. For the specific parent-child play segment, low presence of support was characterized as being attentive without verbally acknowledging the parents’ success or mistakes. High presence of support was characterized by behaviors such as using positive words to help the parents see themselves as a competent teacher for their child, and regularly checking on their need for support. For the specific parent-child play segment, high presence of support was characterized as being attentive through the verbal provision of support and reassurance to the parents. Quality of feedback. This dimension reflected the qualitative characteristics of the therapist’s feedback. Providing feedback is an essential part of supporting adult learners (Dunst & Trivette, 2009). Parents have noted the importance of receiving feedback throughout the learning process. It lets them know what is going well and what needs further improvement 42 (Knoche et al., 2013; Koh & Neuman, 2009). Given that feedback is often used to guide parents, it is important that feedback solely focuses on the target behaviors and does not drift beyond the content (Meade, Dozier, & Bernard, 2014). Low quality feedback did not change (or improve) parents’ role as the child’s teacher. The feedback was considered “off target” -- such that it was irrelevant (unrelated to the target behaviors), vague (leaving room for confusion), and short (brief and incomplete). High quality feedback improved parents’ role as the child’s teacher. 
The feedback was considered “on target” and relevant (pertinent to the target behaviors), explicit (no room for confusion), and concise (brief yet complete and informative). Participant responsiveness. Participant responsiveness was defined as the extent to which parents were engaged with the therapist during the coaching session (Dane & Schneider, 1998; Knoche et al., 2010). The parents were evaluated on their level of engagement on a 5-point Likert scale (1=Very Low, 2=Low, 3=Moderate, 4=High, 5=Very High). This component was coded in 5-minute intervals. In each interval, the observer assigned a score for the parents’ engagement. The average of the interval scores was counted as the final score for participant responsiveness (see Appendix D). Level of engagement. This item reflected the parents’ level of engagement during the coaching session. Adult learning theory posits that adults learn best when they are actively engaged with the content (Dunst & Trivette, 2009). This theory is the foundation for parent- implemented intervention practices where coaching takes center stage (Friedman et al., 2012). The coaching approach requires parents and therapists to rely on each other’s expertise and knowledge of the child to create a productive and collaborative experience that meets the family’s learning goals. Hence, parents are expected to participate by sharing information about 43 the child, reflecting on their implementation progress, and working with the therapist to resolve issues related to the intervention or the child’s progress (Knoche et al., 2010). Verbal indicators of low engagement include: Parents provided a surface-level reflection on their implementation progress at home and during the play segment. They did not take the initiative to ask questions or share concerns. They did not actively resolve challenges and barriers with the therapist. Nonverbal indicators of low engagement include: Parents had their head or body turned away from the computer screen. Parents physically attended to other stimuli in their environment (e.g., child running around). Verbal indicators of high engagement include: Parents provided a detailed reflection on their implementation progress at home and during the play segment. Parents took the initiative to ask questions or share concerns. Parents actively resolved challenges and barriers with the therapist. Nonverbal indicators of high engagement include: Parents had their head or body facing the computer screen. Parents nodded in agreement to the therapist’s feedback. Components of Treatment Integrity: Treatment Delivery Adherence. Adherence was defined as the degree to which parents implemented the intervention techniques as described in the lesson. The operationalization of this component was guided by Wainer and Ingersoll’s (2013) conceptual model of treatment fidelity for ASD parent training interventions. The component was measured using a modified version of a pre-existing adherence checklist for Project ImPACT (Ingersoll & Dvortcsak, 2010). The parents were evaluated on their implementation of each intervention technique on a 5-point Likert scale, with each point representing the frequency at which they implemented the step during the play segment. This component was coded in 5-minute intervals. In each interval, the observer assigned a score for each intervention technique. The average of the interval scores 44 was the final score for each intervention technique. 
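A minimal sketch of this interval-averaging rule is given below, using hypothetical ratings (the study's actual coding sheets appear in the appendices, which are not reproduced here). Each technique's final score is the mean of its interval ratings; the overall adherence score described in the next sentence is then the mean across the observed techniques.

# Hypothetical 5-point ratings for two intervention techniques,
# rated in each 5-minute interval of one play segment
ratings <- data.frame(
  interval    = 1:3,
  follow_lead = c(4, 5, 4),   # "Focus on Your Child" / Follow Your Child's Lead
  adjust_comm = c(3, 4, 4)    # "Adjust Your Communication"
)

# Final score for each technique = mean of its interval ratings
technique_scores <- colMeans(ratings[, c("follow_lead", "adjust_comm")])  # approx. 4.33 and 3.67

# Overall treatment adherence = mean of the observed technique scores
overall_adherence <- mean(technique_scores)                               # 4.00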
An overall score for treatment adherence was computed based on the mean of the observed intervention techniques (see Appendix E). Quality of treatment delivery. Quality of treatment delivery was defined as the manner in which parents structured the play segment for the child, and the manner in which they interacted with the child during the play segment (McCollum, Gooler, Appl, & Yates, 2001). This component was considered a higher order construct. It encapsulated two different teaching behaviors: 1) structure of the play segment, and 2) presence of support. The parents were evaluated across the two quality indicators on a 5-point Likert scale (1=Very Low, 2=Low, 3=Moderate, 4=High, 5=Very High). In every interval, the observer assigned a score for each of the two quality indicators. The average of the interval scores was the final score for each quality indicator. An overall score for quality of treatment delivery was computed based on the mean of the two quality indicators (see Appendix F). Structure of the play segment. This item reflected the structural quality of the play segment. Play is commonly used in most ASD parent-implemented interventions because it is often the place where children acquire language and social skills. The bidirectional interactions between children and adults during play create natural opportunities for the child to learn skills that are essential for establishing and maintaining positive relationships (Reagon & Higbee, 2009; Gillett & LeBlanc, 2007; Kaiser, Hancock, & Nietfeld, 2000). Therefore, it is important to evaluate the play structure because the segment serves as the foundation for the parent to practice using the intervention strategies and the child to learn new language and play skills. When play is used as a framework for treatment, the environment should be structured in a way that promotes learning and development (Wolfberg & Schuler, 1993). 45 A low quality play segment had the following characteristics: The selected activities did not offer opportunities for the child to learn new language and play skills. Parents created few opportunities to use the intervention techniques to help the child learn new language and play skills. Parents were lax about setting limits to keep the child from engaging in inappropriate behaviors. The environment was physically disorganized such that both used and unused items could be seen scattered around the area. A high quality play segment had the following characteristics: The activities were conducive to helping the child learn new language and play skills. Parents created enough opportunities to use the intervention techniques to help the child learn new language and play skills. Parents were vigilant about setting limits to keep the child from engaging in inappropriate behaviors. The environment was physically organized such that only used items were present while unused items were stored away. Presence of support. This item reflected the parents’ level of warmth, encouragement, and patience. Research has shown that parent-child interactions that are marked with sensitivity, warmth, and positive affect can foster positive developmental outcomes in children (Magill- Evans & Harrison, 2001). It is important to consider the nature of parent-child interactions when evaluating the quality of a parent-implemented intervention because the unique bond between the parent and the child is the crux of many family-centered intervention programs (McCollum et al., 2001; Mahoney & Wheeden, 1997). 
Low presence of support was described as: Parents appeared distant and bored, withheld positive support from the child (especially during challenging and frustrating moments), and appeared frustrated or mad when the child failed to respond successfully to a teaching opportunity. High presence of support was described as: Parents maintained a positive affect and warmth towards the child, provided positive support to the child (especially during challenging and frustrating moments), and remained calm and persistent when the child was not able to respond successfully to a teaching opportunity.
Training Procedure for Observational Coding
The individuals responsible for the observational coding were the researcher and three graduate research assistants. The first part of training involved the researcher providing an overview of the study, the online Project ImPACT curriculum, and the treatment integrity frameworks. The remainder of the training involved the researcher and the research assistants reviewing videos of the coaching sessions to practice coding elements of treatment integrity at both stages. The training continued until there was an 80% agreement between the researcher and each research assistant for a minimum of three coaching sessions. Percentages of agreement for the therapists' and parents' adherence, quality of coaching and treatment deliveries, and participant responsiveness were calculated by dividing the number of agreements by the combined number of agreements and disagreements, and multiplying it by 100. For the exposure component, percentage of agreement was calculated by dividing the smaller value by the larger value between the researcher and each research assistant, and multiplying it by 100.
Procedure for Observational Coding
Video set-up. The observers reviewed the recorded coaching sessions on a desktop computer. In the video, the parent was in the main frame, while the therapist was in the smaller frame, which was located at the bottom right side of the screen. The observer had a complete view of the therapist and the parent-child dyad. Occasionally, during the parent-child play segment, the parent and the child would briefly move to locations that were outside of the camera's view. However, the parents were quick to adjust their laptop/computer based on their child's movement around the room. Coding instructions. The researcher was responsible for coding 7 parents' sets of coaching sessions (34 videos; the odd total reflects missing sessions), and each research assistant was responsible for 4 parents' sets (32 videos; 8 per parent). The observers completed the coding process independently. Each coaching session was reviewed in 5-minute intervals during two passes. The first pass focused on the treatment integrity components of the coaching delivery. The observers watched the entire coaching session to evaluate the therapist's coaching practices during the parent-therapist discussion segment and the parent-child play segment. The second pass focused on the treatment integrity components of the treatment delivery during the parent-child play segment. The observers only watched the play segment to evaluate the parent's teaching approach and interaction style. The child's outcomes were not coded for this study. During each pass, the observers were encouraged to replay each interval as often as needed to review parts of the session that remained unclear.
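The two agreement formulas used during training, and again for the inter-observer reliability analysis that follows, can be illustrated with a brief R sketch. The rating vectors and exposure totals below are hypothetical examples.

# Interval-level Likert ratings from the researcher and one research assistant
researcher <- c(4, 4, 5, 3, 4, 5)
assistant  <- c(4, 3, 5, 3, 4, 5)

# Agreement = agreements / (agreements + disagreements) * 100
agreement_pct <- sum(researcher == assistant) / length(researcher) * 100   # 83.3%

# Exposure agreement = smaller total / larger total * 100
researcher_exposure <- 26   # total comments tallied by the researcher
assistant_exposure  <- 30   # total comments tallied by the assistant
exposure_agreement <- min(researcher_exposure, assistant_exposure) /
  max(researcher_exposure, assistant_exposure) * 100                       # 86.7%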
In an effort to minimize drift from the coding definitions, the researcher conducted bi-weekly check-ins to review the treatment integrity frameworks and used examples to maintain clarity across all observers.
Inter-Observer Reliability
Inter-observer reliability was computed by double coding all of the coaching sessions from four randomly selected parents. Two parents had 8 coaching sessions. Two parents had 6 coaching sessions. Each parent's set of coaching sessions was coded twice by two observers. Treatment integrity data across the four observers were combined to compute inter-observer reliability. Percentages of agreement for the therapist's adherence and the treatment integrity components that used a 5-point Likert scale were calculated by dividing the number of agreements by the combined number of agreements and disagreements, and multiplying it by 100. Percentage of agreement for exposure was computed in two steps. First, a percentage of agreement was calculated for each coaching session by dividing the smaller value by the larger value between the two observers, and multiplying it by 100. Next, all of the percentages were added together and divided by the total number of coaching sessions to get an average percentage of agreement across the four observers. The percentages of adjacent agreement and exact agreement between observers on the 5-point Likert scale were recorded for the parents' adherence, quality of coaching and treatment delivery, and participant responsiveness. Inter-observer reliability data are displayed in Table 1 for the therapist variables and in Table 2 for the parent variables. All of the observers met the recommended 80% threshold for reliable coding. Disagreements between the paired observers were discussed and resolved by coming to a consensus on the final codes.

Table 1. Inter-Observer Reliability for the Therapist Variables
                                              Agreement   Adjacent agreement   Exact agreement
Adherence                                     86%
Exposure                                      85%
Quality of Coaching
  Responsiveness to the parent's concerns                 87%                  82%
  Encouragement of reflection                             93%                  82%
  Presence of support                                     92%                  85%
  Quality of feedback                                     92%                  81%
Participant Responsiveness                                94%                  86%

Table 2. Inter-Observer Reliability for the Parent Variables
                                                        Adjacent agreement   Exact agreement
Adherence
  Lesson 4 – Focus on your child                        100%                 83%
  Lesson 5 – Adjust your communication                  98%                  81%
  Lesson 6 – Make play interactive                      99%                  89%
  Lesson 7 – Encourage your child to initiate           99%                  88%
  Lesson 8 – Teach language through prompting           96%                  91%
  Lesson 9 – Expand language through prompting          98%                  82%
  Lesson 10 – Teach imitative play through prompting    95%                  84%
  Lesson 11 – Expand imitative play through prompting   90%                  82%
                                                        Adjacent agreement   Perfect agreement
Quality of Treatment
  Structure of play                                     92%                  81%
  Presence of support                                   100%                 95%

Data Analysis
Missing data. The missing data (~5%-8%) were resolved by using the Expectation-Maximization algorithm (Dempster, Laird, & Rubin, 1977). This procedure relied on the present values of each variable to predict what the missing values would most likely be through repeated imputations. Confirmatory factor analysis. The independent variable, Quality of Coaching Delivery for the therapists, and the dependent variable, Quality of Treatment Delivery for the parents, were measured using Likert rating scales developed for this study.
The Quality of Coaching Delivery variable focused on the quality of the therapist's coaching performance with the parents, and the Quality of Treatment Delivery variable focused on the quality of the parent's teaching performance with their child. The conceptualization of both scales was based on the research literature on parent coaching (Friedman et al., 2012; Rush et al., 2003), adult learning (Dunst & Trivette, 2009; Knowles, Holton, & Swanson, 2005), autism parent-implemented interventions (Gibson et al., 2010; Vismara et al., 2013), and parent-child interactions (Gillett & LeBlanc, 2007; Kaiser et al., 2000). Given the self-constructed approach to creating the Likert scales, it was necessary to test whether the scale items were an accurate representation of the proposed latent constructs. A confirmatory factor analysis (CFA) was considered the most appropriate statistical method to test the model fit for both latent constructs because there was a strong theoretical rationale behind the selection of the scale items. The statistical program, R, was used to run the CFA. Test of assumptions for a multilevel model. A preliminary test of assumptions was conducted to verify the appropriateness of the current dataset for a regression analysis. Specific conditions had to be met in order for the drawn inferences to be valid. There were six assumptions that required checking: Sample size, outliers, multicollinearity, normality of the observed standardized residuals, linearity of the model, and homoscedasticity. The minimum sample size for a regression analysis is 50; however, a sample size of 100 is recommended for data that are not normally distributed (Wilson VanVoorhis & Morgan, 2007). Outliers are observation points that deviate markedly from the rest of the observations. They need to be identified because they can strongly affect how well the data fit the regression line. The multicollinearity assumption states that there must not be any redundancy between the predictor variables, as it would affect the accuracy of the regression coefficient estimates for them. The normality assumption specifies that the observed standardized residuals in a regression model need to be normally distributed to ensure randomness and unpredictability. The linearity assumption requires that a linear relationship must exist between the independent variable and the dependent variable. The homoscedasticity assumption denotes that the variance of the residuals must be constant across the regression line (Ruginski, 2016). Research questions 1 and 2. What is the average level achieved for each treatment integrity component of the coaching delivery: Therapists' adherence, exposure, quality of coaching delivery, and participant responsiveness? What is the average level achieved for each treatment integrity component of the treatment delivery: Parents' adherence and quality of treatment delivery? Descriptive statistics analysis was used to report the average level achieved for each treatment integrity component for the coaching delivery and the treatment delivery, along with the sub-dimensions of the quality of coaching and treatment deliveries. Research questions 3 and 4. Which treatment integrity components of the coaching delivery predict parents' treatment adherence? Which treatment integrity components of the coaching delivery predict parents' quality of treatment delivery? Multilevel model (MLM) analysis was used to explore the predictive relationship between the coaching delivery variables and the treatment delivery variables.
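A minimal sketch of how such a two-level model, described in detail in the next section, might be specified in R is shown below. The lme4 package and the data-frame and variable names are assumptions made for illustration; the document does not state which software or package was used to fit the multilevel models.

# Assumed data layout: one row per coaching session, with a parent
# identifier and the session-level predictors and outcome
library(lme4)

# d contains: parent, PAdherence, CAdherence, Exposure, QualityofCD, ParResp
# Level 1: session-level predictors entered as fixed effects
# Level 2: a random intercept for each parent
model_adherence <- lmer(
  PAdherence ~ CAdherence + Exposure + QualityofCD + ParResp + (1 | parent),
  data = d
)

summary(model_adherence)  # fixed-effect estimates correspond to the beta1-beta4 coefficients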
MLM is highly recommended for nested data structures because it does not assume independence between the observations (Hoffman & Rovine, 2007). One type of nested data structure is repeated measures data for each subject. The grouping of multiple observations within each subject makes them dependent upon each other (Peugh, 2010). In this study, each parent had up to eight coaching sessions. The treatment integrity data for the coaching delivery and the treatment delivery were coded for all coaching sessions for each parent. A 2-level regression model was used to determine which treatment integrity components of the coaching delivery were unique predictors of the parents' treatment adherence and quality of treatment delivery. The treatment integrity components of the coaching delivery served as the predictors in the model: Adherence, exposure, quality of coaching delivery, and participant responsiveness. The treatment integrity components of the treatment delivery served as the dependent variables: Adherence and quality of treatment delivery. The multilevel model consisted of the following equations:
Level 1: Yij = β0j + β1(Adherence) + β2(Exposure) + β3(Quality of Coaching Delivery) + β4(Participant Responsiveness) + eij
Level 2: β0j = γ00 + μ0j
The Level 1 equation showed the repeated measurements of treatment integrity data for the coaching delivery and the treatment delivery for each parent. It accounted for the variation within each parent. Yij represented the dependent variable outcome at each session (i) for each parent (j). β0j represented the Y-intercept, where j indicated that it would be different for each parent. β1 through β4 represented the regression coefficients for the predictor variables. eij represented the variation that was not accounted for in the regression model. Level 1 was treated as a fixed effects model. The Level 2 equation represented the Y-intercept (β0j). It accounted for the variation between the parents. γ00 represented the grand mean; in other words, it was the mean of the parents' means. μ0j represented the random error. Level 2 was treated as a random effects model, given that each parent was expected to have a different Y-intercept. In this study, a 2-level regression model was selected over a 3-level regression model because the number of clusters (5 coaches) did not meet the recommended cluster size (>20 groups) for the highest level in a multilevel model (Hoffman, 1997). Including this highest level in the model could potentially lead to biased results with a large standard error, which could increase the uncertainty regarding the precision of the predictors' coefficient estimates. Given this concern, the best solution for the current study was a 2-level regression model.
CHAPTER 4: RESULTS
Descriptive Summary of the Data
The parent coaching sessions that were used in the analysis came from two RCT studies that examined the telehealth delivery of Project ImPACT. A total of 130 coaching sessions from 19 parents were used in the current study. Treatment integrity data were coded for the coaching delivery and the treatment delivery. Data from 127 coaching sessions across 18 parents were used in the final analysis. One set of coaching sessions from a parent was dropped due to the cluster size being too small for a multilevel analysis (minimum requirement: at least 5 data points per cluster). Four observers completed the coding process.
Missing Data
There were missing values across all six variables.
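As noted in the Method chapter and reiterated below, the Expectation-Maximization algorithm was used to handle these missing values. A minimal sketch of one way such an EM-based imputation could be carried out in R follows; the use of the norm package and the data object name are assumptions for illustration, not a description of the study's actual script, and imp.norm fills the missing cells with a single imputation conditional on the EM parameter estimates rather than the literal expected values.

# Assumed: 'd' is a numeric matrix of the six treatment integrity variables,
# one row per coaching session, with NA marking the missing values
library(norm)

s        <- prelim.norm(as.matrix(d))   # sort the data and compute sufficient statistics
thetahat <- em.norm(s)                  # EM estimates of the mean vector and covariance matrix
rngseed(12345)                          # seed is required before imputation
d_imputed <- imp.norm(s, thetahat, as.matrix(d))  # fill the missing cells given the EM estimates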
They were determined to be missing completely at random due to technical difficulties with recording (e.g., poor sound quality) and situational barriers (e.g., child was not present and/or unable to cooperate). The Expectation-Maximization algorithm was selected to generate an expected value for each missing value (Dempster, Laird, & Rubin, 1977). This procedure relied on the present values of each variable to predict what the missing values would most likely be through repeated imputations.
Confirmatory Factor Analysis
Quality of coaching delivery (QualityofCD). The latent construct of coaching quality was conceptualized in terms of four variables: (1) Responsiveness to the parents' needs and concerns, (2) Encouragement of reflection, (3) Presence of support, and (4) Quality of feedback. Initial examination of the preliminary statistics indicated that a factor analysis was recommended for the current sample. The Bartlett's Test of Sphericity result was χ² = 59.631, df = 6, p < 0.001. This result indicated that there was a patterned relationship between the four variables. The Kaiser-Meyer-Olkin Measure (KMO) of Sampling Adequacy value was 0.627, which was above the minimum cut-off value of 0.600 (Kaiser, 1974). This result indicated that the current sample met the minimum criterion for a factor analysis. Different goodness-of-fit statistics were used to verify the model fit. The Tucker-Lewis Index (TLI) value was 1.000 and the Comparative Fit Index (CFI) value was 1.083. Both values were above the cut-off value of 0.90. The Root Mean Square Error of Approximation (RMSEA) value was 0.000 and the Standardized Root Mean Square Residual (SRMR) value was 0.015. Both values were below the cut-off value of 0.05 (Hu & Bentler, 1999). Altogether, these goodness-of-fit statistics indicated a good 1-factor model fit. Quality of treatment delivery (QualityofTD). The latent construct of treatment quality was conceptualized in terms of two variables: (1) Structure of the play segment and (2) Presence of support. Initial examination of the preliminary statistics indicated that a factor analysis was not recommended for the current sample. The Bartlett's Test of Sphericity result was χ² = 7.083, df = 1, p < 0.008. This result indicated that there was a patterned relationship between the two variables; however, the Kaiser-Meyer-Olkin Measure (KMO) of Sampling Adequacy value was 0.500, which fell under the recommended cut-off value of 0.600 (Kaiser, 1974). This result indicated that the current sample did not meet the minimum criterion for a factor analysis. Any interpretation from the factor analysis would not be considered meaningful (Yong & Pearce, 2013). For this reason, Quality of Treatment Delivery was not treated as a latent construct; instead, the variables that were conceptualized to make up this construct were analyzed as separate observed variables. They were Parents' Structure of the Play Segment and Parents' Presence of Support.
Test of Assumptions
Sample size. The appropriateness of the sample size was determined by following recommendations in the existing literature. The recommended minimum sample size for a regression analysis is 50, but a sample size of 100 is recommended for data that are not normally distributed (Wilson VanVoorhis & Morgan, 2007). The current dataset had 127 cases for each predictor variable. Therefore, the assumption of sample size was met.
Outliers
Parents' adherence (PAdherence). This outcome variable did not have any outliers.
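A minimal sketch of how the residual and influence diagnostics reported in this section (standardized residuals, Cook's distance, and the Shapiro-Wilk test) might be obtained in R is given below. The single-level regression shown here is an illustrative assumption for the assumption checks; the document does not state exactly how these diagnostic values were generated.

# Assumed: 'd' holds the session-level data used in the assumption checks
fit <- lm(PAdherence ~ CAdherence + Exposure + QualityofCD + ParResp, data = d)

std_resid <- rstandard(fit)       # standardized residuals; values beyond +/-3 flag potential outliers
cooks     <- cooks.distance(fit)  # Cook's distance; values near or above 1 flag influential points

range(std_resid)                  # compare with the -3 to 3 guideline
max(cooks)                        # compare with the Cook's distance guideline of 1

shapiro.test(std_resid)           # Shapiro-Wilk test of residual normality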
Test of Assumptions

Sample size. The appropriateness of the sample size was determined by following recommendations in the existing literature. The recommended minimum sample size for a regression analysis is 50, but a sample size of 100 is recommended for data that are not normally distributed (Wilson VanVoorhis & Morgan, 2007). The current dataset had 127 cases for each predictor variable. Therefore, the assumption of sample size was met.

Outliers

Parents' adherence (PAdherence). This outcome variable did not have any outliers. The standardized residual values ranged from -2.721 to 2.349, falling well within the recommended -3 to 3 range, thereby indicating a lack of influential points in the dataset (Fox, 1991).

Parents' structure of the play segment (PStructure). There was one mild outlier for this outcome variable. The standardized residual values ranged from -3.129 to 1.724, falling just outside of the recommended -3 to 3 range (Fox, 1991). However, the Cook's Distance values ranged from 0.000 to 0.072, falling under the recommended value of 1 (Cook, 1977). These values indicated that the identified outlier was not largely influential.

Parents' presence of support (PSupport). There were three outliers for this outcome variable. The standardized residual values ranged from -5.353 to 1.313, falling far outside of the recommended -3 to 3 range, thereby indicating the presence of influential points in the dataset (Fox, 1991).

Multicollinearity of the independent variables. In the current dataset, the predictor variables were uncorrelated with each other, as evidenced by the Pearson r values being less than 0.5 (see Table 3). Therefore, the assumption of no multicollinearity was met, such that there was a lack of redundancy between the predictor variables.

Table 3. Multicollinearity of the Independent Variables

              CAdherence   Exposure   QualityofCD   ParResp
CAdherence    1.000        -          -             -
Exposure      0.037        1.000      -             -
QualityofCD   -0.251       -0.078     1.000         -
ParResp       -0.148       -0.111     0.090         1.000

Normality of the Observed Standardized Residuals and Linearity of the Model

Parents' adherence (PAdherence). Figure 1 showed that most of the data points fell along or near the regression equation line. Results of the Shapiro-Wilk normality test indicated that the null hypothesis (the residuals are normally distributed) could not be rejected (p = 0.713). It was concluded that the observed standardized residuals for this outcome variable were normally distributed. Additionally, it was concluded that a linear relationship between the independent variables and the dependent variable existed. Hence, the assumptions of normality and linearity were met.

Figure 1. Normal P-P Plot for PAdherence

Parents' structure of the play segment (PStructure). Figure 2 showed that most of the data points fell along or near the regression equation line, but the Shapiro-Wilk normality test indicated that the null hypothesis (the residuals are normally distributed) could be rejected (p < 0.001). The observed standardized residuals for this outcome variable were, therefore, not normally distributed. However, failure to meet the normality assumption is acceptable because linear and mixed models have been found to be relatively robust in the presence of a non-normal distribution of the observed standardized residuals (Winter, 2013). At the same time, it was concluded that a linear relationship between the independent variables and the dependent variable existed. Hence, the assumption of normality was not met, but the assumption of linearity was met.

Figure 2. Normal P-P Plot for PStructure
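The residual diagnostics reported in this section (standardized residuals, Cook's Distance, and the Shapiro-Wilk test) can be generated with standard regression tooling. The sketch below is a minimal, hypothetical illustration built around an ordinary least squares fit; the file and variable names are placeholders, and this is not the study's analysis script.

# Illustrative sketch: outlier, influence, and normality diagnostics for regression residuals.
# File and column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.outliers_influence import OLSInfluence
from scipy.stats import shapiro

data = pd.read_csv("coaching_sessions_long.csv")
fit = smf.ols("p_structure ~ c_adherence + exposure + quality_cd + par_resp", data=data).fit()

influence = OLSInfluence(fit)
std_resid = influence.resid_studentized_internal   # standardized residuals (flag values beyond +/-3)
cooks_d, _ = influence.cooks_distance              # Cook's Distance (values near 1 suggest influence)
print("Standardized residual range:", std_resid.min(), "to", std_resid.max())
print("Maximum Cook's Distance:", cooks_d.max())

w_stat, p_value = shapiro(fit.resid)               # Shapiro-Wilk test of residual normality
print(f"Shapiro-Wilk W = {w_stat:.3f}, p = {p_value:.3f}")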
Parents' presence of support (PSupport). Figure 3 showed that most of the data points did not fall along or near the regression equation line. Results from the Shapiro-Wilk normality test revealed that the null hypothesis (the residuals are normally distributed) could be rejected (p < 0.001). It was concluded that the observed standardized residuals for this outcome variable were not normally distributed. Additionally, there was not a linear relationship between the independent variables and the dependent variable. Hence, the assumptions of normality and linearity were not met.

Figure 3. Normal P-P Plot for PSupport

Homoscedasticity

Parents' adherence (PAdherence). The Koenker (1981) homoscedasticity test yielded a value of 2.557 with a p-value of 0.634, which meant that the null hypothesis (homoscedasticity is present) could not be rejected. The failure to reject the null hypothesis indicated that the variance of the residuals is constant (homoscedastic) across the range of the independent variables. Hence, the assumption of homoscedasticity was met.

Parents' structure of the play segment (PStructure). The Koenker (1981) homoscedasticity test had a value of 2.809 with a p-value of 0.590, which meant that the null hypothesis (homoscedasticity is present) could not be rejected. The failure to reject the null hypothesis indicated that the variance of the residuals is constant (homoscedastic) across the range of the independent variables. Hence, the assumption of homoscedasticity was met.

Parents' presence of support (PSupport). The Koenker (1981) homoscedasticity test resulted in a value of 7.007 with a p-value of 0.136, which meant that the null hypothesis (homoscedasticity is present) could not be rejected. The failure to reject the null hypothesis indicated that the variance of the residuals is constant (homoscedastic) across the range of the independent variables. Hence, the assumption of homoscedasticity was met.

Correlation Between the Independent Variables and the Dependent Variables

The correlations between the independent variables and the dependent variables are relatively low (see Table 4), suggesting a lack of relationship between the variables. However, there is a moderate negative relationship between the quality of coaching delivery (QualityofCD) and the parents' structure of the play segment (PStructure), as evidenced by the Pearson r value of -0.419.

Table 4. Correlations Between the Independent and Dependent Variables

              PAdherence           PStructure           PSupport
CAdherence    0.154 (p = 0.042)    0.148 (p = 0.049)    -0.011 (p = 0.452)
Exposure      -0.013 (p = 0.441)   0.041 (p = 0.322)    -0.240 (p = 0.003)
QualityofCD   -0.123 (p = 0.084)   -0.419 (p < 0.001)   -0.045 (p = 0.307)
ParResp       0.008 (p = 0.466)    0.003 (p = 0.486)    -0.077 (p = 0.194)

Research Question 1

What is the average level achieved for each treatment integrity component of the coaching delivery (therapists' adherence, exposure, quality of coaching delivery, and participant responsiveness)? Mean scores for each treatment integrity component of the coaching delivery are presented in Table 5.

Table 5. Descriptive Statistics for the Treatment Integrity Components of the Coaching Delivery

                                       Unit of Measurement              Minimum   Maximum   Mean    Standard Deviation
Adherence                              Percentage of completed steps    63        100       94.23   8.24
Exposure                               Number of comments               0         42        13.08   8.47
Overall quality of coaching delivery   Likert scale 1-5                 2.75      4.90      3.85    0.38
Participant responsiveness             Likert scale 1-5                 2.25      5.00      4.35    0.63

Research Question 1a

What is the average level achieved for each dimension of the quality of coaching delivery? Mean scores for each dimension of the quality of coaching delivery are presented in Table 6.
Table 6. Descriptive Statistics for the Dimensions of the Quality of Coaching Delivery

                                                Unit of Measurement   Minimum   Maximum   Mean   Standard Deviation
Responsiveness to parents' needs and concerns   Likert scale 1-5      2.00      5.00      3.26   0.54
Encouragement of reflection                     Likert scale 1-5      1.00      5.00      3.60   0.81
Presence of support                             Likert scale 1-5      2.20      5.00      4.19   0.63
Quality of feedback                             Likert scale 1-5      3.00      5.00      4.37   0.43

Research Question 2

What is the average level achieved for each treatment integrity component of the treatment delivery (parents' adherence, parents' structure of the play segment, and parents' presence of support)? Mean scores for each treatment integrity component of the treatment delivery are presented in Table 7.

Table 7. Descriptive Statistics for the Treatment Integrity Components of the Treatment Delivery

                                Unit of Measurement   Minimum   Maximum   Mean   Standard Deviation
Adherence                       Likert scale 1-5      1.96      4.84      3.40   0.57
Structure of the play segment   Likert scale 1-5      1.67      5.00      4.05   0.79
Presence of support             Likert scale 1-5      3.00      5.00      4.86   0.34

Multilevel Regression Model Selection

A 2-level regression model was used to determine which treatment integrity components of the coaching delivery were unique predictors of the parents' treatment adherence and structure of the play segment. The multilevel model consisted of the following equations:

Level 1: Yij = β0j + β1(Adherence) + β2(Exposure) + β3(Quality of Coaching Delivery) + β4(Participant Responsiveness) + eij

Level 2: β0j = γ00 + μ0j

The Level 1 equation showed the repeated measurements of treatment integrity data for the coaching delivery (β1-β4) and the treatment delivery for each parent (Yij). The Level 2 equation represented the Y-intercept (β0j), where γ00 symbolized the grand mean, and μ0j symbolized the random error.

Three models with different covariance structures were tested to determine the best fit for the current dataset. Covariance structures describe the dependence among the repeated measurements for an individual. Scaled identity assumes that the repeated measurements are not correlated. Compound symmetry assumes that the correlation between the repeated measurements is constant over time. First-order autoregressive assumes that the correlation between the repeated measurements is different, such that two adjacent measurements would have a higher correlation than two measurements that are farther apart in time (Roy & Khattree, 2005). Goodness-of-fit indices are used to determine the best fitting model. The criterion is that a smaller value would suggest a better fit (Keselman, Algina, Kowalchuk, & Wolfinger, 1997).
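For reference, these assumptions can be written compactly. The expressions below are the standard textbook forms of the three structures, shown in LaTeX for clarity; they are illustrative rather than notation taken from the study's materials. Here eij and ei'j denote the residuals from sessions i and i' for the same parent j, and ρ is the correlation parameter.

\[
\text{Scaled identity: } \mathrm{Corr}(e_{ij}, e_{i'j}) = 0, \qquad
\text{Compound symmetry: } \mathrm{Corr}(e_{ij}, e_{i'j}) = \rho, \qquad
\text{First-order autoregressive: } \mathrm{Corr}(e_{ij}, e_{i'j}) = \rho^{|i - i'|}, \qquad i \neq i'.
\]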
Table 8. Model of Covariance Structures for Parents' Treatment Adherence
Goodness-of-fit indices: -2 Restricted Log Likelihood (-2LL) and Akaike's Information Criterion (AIC)

Type of Covariance Structure   Assumption of Correlation between Measurements   -2LL        AIC
Scaled Identity                No correlation                                    214.006     218.006
Compound Symmetry              Constant correlation across time                  214.006     220.006
First-Order Autoregressive     Correlation decreases with time                   199.425**   205.425**

** Denotes best fitting model

Table 9. Model of Covariance Structures for Parents' Structure of the Play Segment
Goodness-of-fit indices: -2 Restricted Log Likelihood (-2LL) and Akaike's Information Criterion (AIC)

Type of Covariance Structure   Assumption of Correlation between Measurements   -2LL        AIC
Scaled Identity                No correlation                                    284.656     288.656
Compound Symmetry              Constant correlation across time                  284.656     290.656
First-Order Autoregressive     Correlation decreases with time                   281.372**   287.372**

** Denotes best fitting model

Research Question 3

Which treatment integrity components of the coaching delivery predict parents' treatment adherence?

The 2-level regression model was fitted using restricted maximum likelihood estimation (RMLE). Various measures of goodness-of-fit indicated that a fixed-slope, random-intercept model with a first-order autoregressive covariance structure was the best fit for the current dataset (see Table 8). Based on the results, none of the treatment integrity components of the coaching delivery was a significant predictor of the parents' treatment adherence. The intraclass correlation coefficient (ICC) was 0.363, which suggested that 36% of the total variance in the parents' treatment adherence stemmed from between-cluster differences (e.g., parents' characteristics), and 64% stemmed from within-cluster differences (e.g., intervention sessions' characteristics) (see Table 10).

Table 10. Two-Level Multiple Regression of Coaching Variables Predicting Parents' Treatment Adherence

Fixed effects                  β         SE      df        t value   Sig.
Intercept                      4.183     0.932   120.825   4.486     0.000
Coach adherence                0.006     0.006   118.661   1.075     0.285
Exposure                       -0.014    0.007   111.396   -2.004    0.067
Quality of coaching delivery   -0.228    0.155   108.464   -1.475    0.143
Participant responsiveness     -0.067    0.084   121.220   -0.797    0.427

Random effects                 Variance   SE      Wald Z   Sig.
Repeated measures variance     0.216      0.030   7.254    0.000
Participant variance           0.123      0.055   2.233    0.026

Research Question 4

Which treatment integrity components of the coaching delivery predict parents' structure of the play segment?

The 2-level regression model was fitted using restricted maximum likelihood estimation (RMLE). Various measures of goodness-of-fit indicated that a fixed-slope, random-intercept model with a first-order autoregressive covariance structure was the best fit for the current dataset (see Table 9). The results indicated that quality of coaching delivery was a significant predictor of parents' structure of the play segment. When the quality of coaching delivery score increased by 1 point, the parents' structure of the play segment score decreased by 0.607 points, holding all the other predictors constant. The intraclass correlation coefficient (ICC) was 0.280, which suggested that 28% of the total variance in the parents' structure of the play segment stemmed from between-cluster differences (e.g., parents' characteristics) and 72% stemmed from within-cluster differences (e.g., intervention sessions' characteristics) (see Table 11).

Table 11. Two-Level Multiple Regression of Coaching Variables Predicting Parents' Structure of the Play Segment

Fixed effects                  β         SE      df        t value   Sig.
Intercept                      6.751     1.250   121.852   5.402     0.000
Coach adherence                -0.0005   0.008   120.245   -0.058    0.954
Exposure                       -0.003    0.009   95.266    -0.303    0.763
Quality of coaching delivery   -0.607    0.202   92.531    -2.996    0.004
Participant responsiveness     -0.059    0.112   116.187   -0.524    0.601

Random effects                 Variance   SE      Wald Z   Sig.
Repeated measures variance     0.400      0.056   7.166    0.000
Participant variance           0.155      0.083   1.859    0.063
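The reported ICC values follow directly from the variance components in Tables 10 and 11 (between-parent variance divided by total variance). A minimal check in plain Python, using the estimates reported above:

# ICC = participant (between-parent) variance / (participant variance + repeated measures variance)
def icc(participant_var: float, repeated_var: float) -> float:
    return participant_var / (participant_var + repeated_var)

print(round(icc(0.123, 0.216), 3))  # treatment adherence model: 0.363
print(round(icc(0.155, 0.400), 3))  # structure of the play segment model: 0.279 (reported as 0.280)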
Given that quality of coaching delivery was a significant predictor of parents' structure of the play segment, it was necessary to explore the correlations between each quality dimension and the outcome variable. As shown in Table 12, encouragement of reflection had the strongest negative correlation. Responsiveness to parents' concerns and presence of support had moderate negative correlations. Quality of feedback had a positive but weak correlation.

Table 12. Correlations Between the Quality of Coaching Delivery Dimensions and the Parents' Structure of the Play Segment

                                                Parents' Structure of the Play Segment
Responsiveness to parents' needs and concerns   -0.356 (p < 0.001)
Encouragement of reflection                     -0.433 (p < 0.001)
Presence of support                             -0.302 (p < 0.001)
Quality of feedback                             0.205 (p = 0.021)

CHAPTER 5: DISCUSSION

An essential part of evidence-based practice (EBP) is utilizing interventions that have sound evidence supporting their efficacy and effectiveness (APA Presidential Task Force, 2006). One way to better understand an intervention's utility, in both clinical and non-clinical settings, is through a treatment integrity assessment. At its core, this scientific method is designed to capture the implementation process of an intervention (Yeaton & Sechrest, 1981). The significance of treatment integrity assessment cannot be overstated in intervention outcome research. Evidence of how an intervention is implemented plays a critical role in allowing scientists and practitioners to draw valid conclusions about its ability to produce favorable outcomes (Sanetti & Kratochwill, 2011).

Historically speaking, treatment integrity has not been consistently addressed in treatment outcome research (Sanetti & Collier-Meek, 2014; Sanetti & Kratochwill, 2008). For instance, this pattern of inconsistency is present in ASD parent-implemented interventions, where treatment integrity assessment is often conducted for the treatment delivery but not the coaching delivery (Barton & Fettig, 2013; Meadan et al., 2009; Schultz et al., 2011). This practice poses a concern for the dissemination of current and future programs. Parent coaching plays a significant role in ASD parent-implemented interventions, particularly programs that are delivered online (Gibson et al., 2010; Vismara et al., 2009). This approach transfers the teaching responsibility from the therapist to the parent. Hence, it is necessary to evaluate the process in order to identify the potential active ingredients that make parent coaching successful (Wainer & Ingersoll, 2013).

The primary goal of this study was to assess the treatment integrity of parent coaching in Project ImPACT. In this study, treatment integrity was conceptualized as a multidimensional construct. Different aspects of the coaching delivery and the treatment delivery were evaluated to explore any potential association between the therapist's coaching efforts and the parent's utilization of the intervention strategies.

Research Questions 1 and 1a

What is the average level achieved for each treatment integrity component of the coaching delivery (therapists' adherence, exposure, quality of coaching delivery, and participant responsiveness)? What is the average level achieved for each dimension of the quality of coaching delivery?

Coaching adherence.
In the current study, coaching adherence adopted a common definition set forth by Waltz and colleagues (1993), along with Dane and Schneider (1998). It was conceptualized as the degree to which the therapists followed the action items listed in the coaching protocol. Some of the most pertinent action items included setting the session's agenda, reviewing the Reflection Questions assignment, supporting parents during the play segment, and addressing barriers to the implementation process. The measurement also followed a common course, in that the evaluation was a matter of confirming the presence or absence of specific action items (Wainer & Ingersoll, 2013). This coaching protocol was used by the original authors of the RCT study on the delivery of Project ImPACT via telehealth (Ingersoll et al., 2016).

In the current study, the therapists' adherence to the coaching procedure was high, at 94.23% with a standard deviation of 8.24. This adherence level is consistent with the adherence level reported by the original authors, which was 99.6%. Further, it is consistent with the adherence level reported in different studies of ASD parent-implemented interventions, which ranges from 95% to 100% (Dunlap, Ester, Langhans, & Fox, 2006; Mobayed, Collins, Strangis, Schuster, & Hemmeter, 2000; Hester, Alpert, & Whiteman, 1995; Randolph, Stichter, Schmidt, & Connor, 2011).

The action items listed in the coaching protocol embodied a parent coaching model that valued the parents' learning experience. Hence, the high percentage of adherence signified the therapists' strong investment in the parents' learning. It underscored their effort to provide parents with a rich learning experience filled with both knowledge and support. Moreover, it portrayed a multifaceted approach to parent coaching, meaning the therapists covered logistical steps (e.g., reviewing the session's agenda, recording the data, and timing the sessions) as well as coaching steps (e.g., encouraging parents to practice, providing feedback, and facilitating reflections). Hence, the measurement of adherence continues to be a necessity because these essential steps should be covered in order for the intervention to be evaluated in greater depth.

Exposure. At a broad level, Dane and Schneider (1998) defined exposure as the rate at which the participant is exposed to the treatment components (e.g., frequency and duration of each treatment session, and the duration of the overall intervention). In the context of ASD parent-implemented interventions, Wainer and Ingersoll (2013) defined exposure as the amount of time that the therapists spend on coaching parents within a session. In the current study, exposure was conceptualized in the context of feedback provision. Feedback was an important feature of the online delivery of Project ImPACT. The parents relied heavily on the therapists' feedback to gauge their learning progress. Exposure to feedback was measured by counting the number of constructive comments (corrective and reinforcement) provided to the parents.

The average number of comments across all the coaching sessions was 13, with a standard deviation of 8.47. The moderate data dispersion could be attributed to factors specific to the therapists and the parents in the study. Some therapists provided in vivo feedback while others elected to wait until after the parent-child play segment. Therapists who opted to wait essentially gave fewer comments.
Parents' skill level also played a role, such that parents who were highly proficient in their use of the intervention techniques did not require as much feedback as parents who experienced more difficulty with the implementation. Both of these observations brought up questions that have remained largely unanswered in the ASD parent-implemented interventions literature.

The first question revolved around the comparison between in vivo feedback and delayed feedback. Despite both practices being a common aspect of parent coaching, limited attention has been given to comparing their individual effectiveness (Shanley & Niec, 2010). At most, it has been suggested that in vivo feedback is conducive to the parents' acquisition of skills (Wyatt Kaminski, Valle, Filene, & Boyle, 2008).

The second question revolved around the notion that parents' knowledge of autism might be a moderating factor for the relationship between constructive feedback and parents' acquisition of skills. There is sufficient empirical evidence to suggest that parents often benefit from receiving constructive feedback during the course of the intervention (Lyon & Budd, 2010; Graziano et al., 2015; Shanley & Niec, 2010; Oliver & Brady, 2014). However, in the spirit of striving to build parents' teaching capacity through the use of coaching strategies, it may be helpful to consider how feedback can be modified to fit parents' individual strengths and needs. It is possible that less proficient parents may benefit from a higher dosage of feedback, while more proficient parents may benefit from a lower dosage of feedback.

Although the concept of exposure has been written and talked about at length, there is very little evidence of its measurement in experimental studies, particularly in the ASD parent-implemented intervention literature (Schultz et al., 2011; Wainer & Ingersoll, 2013). The unconventional conceptualization of exposure in this study highlighted the deeper nuances of parent coaching. It is possible that covert factors such as the timing of the feedback and the parents' pre-existing knowledge of autism may determine the effectiveness of the provided feedback. It is recommended that future studies consider exploring these factors through the treatment integrity lens, particularly through a reconceptualization of exposure.

Quality of coaching delivery. In this study, four quality variables were identified as impactful factors of parent coaching: Responding to parents' needs and concerns in a collaborative way, encouraging parents to reflect on their progress, empowering parents through support, and providing high quality feedback. These four coaching strategies were individually evaluated. Their scores were combined and averaged to make up a global quality score.

At a broad level, the therapists achieved an average rating of 3.85 out of 5 for the quality of their coaching delivery. This score suggested that the therapists implemented all four coaching strategies to some level. Further, it implied that the researchers behind Project ImPACT were mindful of the unique behaviors that constitute the practice of educating adult learners. However, this global score could only provide limited insight into the coaching process of Project ImPACT. A deeper exploration of the individual coaching strategies was necessary.

Examining the quality of coaching delivery by sub-dimensions, the therapists achieved the highest average scores on their presence of support (4.19 out of 5) and quality of feedback (4.37 out of 5).
The first score indicated that the therapists provided parents with positive affirmation, reassurance, and support most of the time. They empowered the parents by providing positive praise when the parents experienced success, reassuring words when the parents experienced challenges, and overt attention during the parent-child play segment. These behaviors were in line with what parents have reported to be valuable aspects of the parent coaching experience (Knoche et al., 2013). The second score indicated that the therapists often provided meaningful and descriptive feedback; however, the feedback tended to be lengthy. There may be a trade-off when it comes to providing feedback, in that high quality feedback takes longer, but can also run the risk of overwhelming or disengaging the parent. Again, the characteristic of the therapists' feedback was similar to what parents have described in other studies. In other words, they valued feedback that comprehensively delineated their current progress, including what they were doing well and what they needed to work on (Koh & Neuman, 2009).

The therapists achieved lower average scores on their responsiveness to parents' needs and concerns (3.26 out of 5) and encouragement of reflection (3.60 out of 5). The first score indicated that the therapists strictly provided parents with solutions to their needs and concerns rather than engaging them in a collaborative problem-solving process. When concerns or needs were raised, it was common for the therapists to take the lead in providing alternative solutions. While the therapists did engage the parents by having them clarify and analyze the presented concern or need, collaborative problem-solving was noticeably absent. This practice stands in stark contrast to the proposed parent coaching model in the literature. Parent coaching requires the therapist to guide the parent through analyzing problems and generating alternative solutions. It is designed to help parents build the capacity to shape their child's learning and development (Campbell & Sawyer, 2007).

The second score indicated that the therapists occasionally created opportunities for parents to reflect on their progress. Encouragement of reflection occurred more regularly during the discussion segment at the beginning of the coaching session. The specific task of reviewing the Reflection Questions assignment provided an easy opportunity for the parents to talk about their implementation progress at home. However, encouragement of reflection was noticeably absent during the post-play discussion segment. This time could have been a prime opportunity for the therapists to engage the parents in a post-play reflection. As stated in the parent coaching literature, parents are more likely to achieve success if they are regularly given the opportunity to reflect on their current strengths and limitations in order to refine their knowledge and skills as they move through the intervention (Kaiser & Hancock, 2003).

Participant responsiveness. In the current study, participant responsiveness referred to the parent's engagement with the therapist during the coaching session. The expectation for parents to be actively involved stems from the parent coaching philosophy, which states that the relationship between the therapist and the parent should be equal and collaborative (Dinnebeil, McInerney, Roth, & Ramaswamy, 2001). Therefore, the parent is required to be as present as the therapist during the coaching session.
The parents achieved an average score of 4.35 out of 5 for their responsiveness during the coaching session. This score represented a high level of engagement, as evidenced by verbal and nonverbal indicators. The parents were observed to be critically reflective of their implementation progress at home and during the play segment. They were observed to be committed to their learning based on the questions and the concerns that they brought to the session. Nonverbal indicators, such as sitting in clear view of the therapist and acknowledging the therapist's comments with a nod, also provided evidence of the parents' engagement in the coaching session.

These behavioral indicators of engagement closely mirror the ones identified by Knoche and her colleagues (2010). In their study, a high level of parent engagement was characterized as the presence of a bi-directional discussion between the parent and the therapist. The discussion was child-oriented, consisting of parents elaborating and reflecting on their questions and concerns. This shared conceptualization of parent engagement between the two studies further reinforced the notion that there is an explicit expectation for the parent-therapist relationship to be active, equal, and collaborative (Rush et al., 2003).

Research Question 2

What is the average level achieved for each treatment integrity component of the treatment delivery (parents' adherence, parents' structure of the play segment, and parents' presence of support)?

Parents' treatment adherence. The evaluation of parents' treatment adherence was completed by using a modified version of a pre-existing adherence checklist for Project ImPACT (Ingersoll & Dvortcsak, 2010). This adherence checklist outlined the essential steps of every lesson in the intervention. The early steps were designed to set the stage for the later steps. Some of the early steps included facing the child, letting the child lead, adjusting the voice and language, and using different interactive techniques to engage the child. Some of the later steps included providing a teaching prompt, correcting the child's response when needed, and reinforcing the correct response. Parents were evaluated on the frequency at which they completed the essential steps during the play segment with their child.

On average, the parents achieved a score of 3.40 out of 5 for their adherence to the intervention procedure, which suggested that they implemented the intervention strategies as instructed half of the time. This average adherence level is consistent with the findings from previous studies on Project ImPACT. In the initial efficacy study, the parents achieved an average fidelity score of 3.32 out of 5 (Ingersoll & Wainer, 2013). In the effectiveness study, the parents achieved an average fidelity score of 3.98 out of 5 (Stadnick et al., 2015). In the pilot RCT telehealth study, the parents achieved an average fidelity score of 3.39 out of 5 (Ingersoll et al., 2016).

Across the board, the parents demonstrated a high capacity to learn the intervention techniques, as evidenced by their implementation of all the essential steps needed to create meaningful learning opportunities for the child. This high capacity remained consistent across different teaching modalities (e.g., in-person vs. online).
Although the adherence score suggested that the parents only implemented the strategies correctly half of the time, it still served as evidence that they were able to translate what they had learned from the modules into practice with their children. For example, the parents set the stage for learning by facing the child, giving the child choices, adjusting their voice and language, and creating opportunities for the child to initiate communication. In the later stage, they taught the child new social communication skills by adhering to the teaching procedure, which included prompting, correcting, and reinforcing. This finding further reinforced the notion that parents could be successfully trained as "co-therapists" (McConachie & Diggle, 2007; Rocha, Schreibman, & Stahmer, 2007), and provided additional support for parent-implemented interventions to be an evidence-based practice (Odom, Collet-Klingenberg, Rogers, & Hatton, 2010). Their well-established bond with the child, coupled with their capacity to execute the intervention strategies, makes parents the most valuable stakeholders in the intervention process.

Parents' structure of the play segment. Research has suggested that providing interventions in the natural environment can increase the child's ability to maintain and generalize their newly learned social communication skills (Gale, Eikeseth, & Rudrud, 2011; Ingersoll, 2011). Setting up the environment for a meaningful learning experience is an imperative part of ASD parent-implemented interventions (Perera, Jeewandara, Seneviratne, & Guruge, 2016; Sanefuji & Ohgami, 2013). In this study, the way in which parents structured the play segment during the coaching session was deemed a quality indicator of their treatment delivery. A high quality play segment was characterized as structurally neat and structurally meaningful.

On average, the parents achieved a score of 4.05 out of 5 for their effort to structure the play segment, which suggested that they maintained adequate control over the environment and selected activities that were conducive to the child's engagement and learning most of the time. This average score represented a neat play environment, where only needed items were present for a specific activity while unused items were stored away to minimize the child's distraction. Also, it represented a meaningful play environment, where the selected activities provided ample opportunities for the child to learn new language and play skills. Further, the high score meant that the parents were vigilant about setting limits to keep their child from engaging in inappropriate behaviors.

In general, the parents demonstrated that they were able to support their child in ways that appeared comparable to a trained therapist. They structured the environment in a meaningful way to facilitate learning for their child. This finding highlighted the parents' inherent expertise in the intervention context. They utilized their knowledge of the child's strengths and weaknesses to create a practical learning experience. They used the home environment to normalize the experience for their child. When these aspects were combined, the child's learning experience became much more rich and meaningful. Altogether, these findings further reinforced the importance of making parents an integral part of the intervention process, along with the fact that interventions can be implemented with success in a natural environment (Wolery & Garfinkle, 2002; Gillett & LeBlanc, 2007).
Parents' presence of support. There is evidence to suggest that children's development can be affected by the interactions that they have with their caregivers (Topping, Dekhinet, & Zeedyk, 2012). In particular, positive interactions that are marked with sensitivity, warmth, and positive affect can foster positive developmental outcomes in children (Magill-Evans & Harrison, 2001). For this reason, another quality indicator of treatment delivery was the parents' presence of support during the play segment. Parents were evaluated on different dimensions of support, such as affect, warmth, encouragement, calmness, and persistence.

On average, the parents achieved a score of 4.86 out of 5 for their provision of support during the play segment, which indicated that parents were highly supportive of their child for the majority of the time. This average score portrayed parents as positive, warm, encouraging, calm, and persistent. The parents were observed to be providing support with relative ease. One possibility is that these behaviors may come naturally for parents as part of the parenting role. Another possibility is that being observed by the therapist may unknowingly influence the parents to present their "best" self. In any case, the high presence of support from parents contributed to the current belief that parent-implemented interventions are powerful because of the inherent parental support for the child to receive and process the target skills.

Research Questions 3 and 4

Which treatment integrity components of the coaching delivery predict parents' treatment adherence and parents' structure of the play segment?

In ASD parent-implemented interventions, examining the treatment integrity of the parent coaching process is important (Lane et al., 2004). This middle link facilitates the transfer of knowledge from the therapist to the child. It is equally necessary to examine this portion of the intervention to identify the active ingredients of parent coaching. Information drawn from the examination could cultivate an initial understanding of the mechanisms of change for parent-implemented interventions (Kazdin, 2007). In this study, a multilevel regression analysis was used to explore the potential relationship between the therapists' coaching performance and the parents' teaching performance through the use of treatment integrity data.

Multilevel regression analysis failed to find a significant predictive relationship between any of the treatment integrity components of the coaching delivery and the parents' treatment adherence. A similar result was found for the parents' structure of the play segment, with the exception of the quality of coaching delivery variable. There was a significant relation between this predictor variable and the parents' structure of the play segment, albeit in an unexpected direction. Contrary to the expectation, the analysis showed that quality of coaching delivery and parents' structure of the play segment had a negative relationship, such that a 1-point increase in the therapists' quality of coaching delivery score would result in a 0.607-point decrease in the parents' structure of the play segment score. This is an unexpected finding, as a higher quality of coaching delivery was theorized to have a positive impact on the parents' quality of treatment delivery. One possible explanation for this finding is that some coaching behaviors may be disruptive rather than helpful for the parents.
For example, if the parents are asked to stop frequently during the play segment to reflect on their teaching performance or to work out a solution to an implementation challenge, it may take away from their capacity to maintain a proper play structure for the child. Additionally, the therapists may be distracting the parents through their commenting during the play segment. The provision of feedback may disrupt the flow of the interaction, thereby prohibiting the parents from executing behaviors that are meant to maintain the play structure. At the same time, it is also worth considering how the child's behaviors could affect the overall structure of the play segment, and therefore, drive the coaching behaviors. For example, a cooperative child would be less likely to disrupt the flow of the play segment, thereby making it less likely that the therapist would need to interrupt. On the other hand, a non-cooperative child may cause disruption that requires the therapist to intervene.

Given these data, it cannot be determined which coaching quality dimension(s) may have the strongest impact on the parents' structure of the play segment. Statistically speaking, it does not make sense to separate the four coaching quality dimensions because evidence from the CFA indicated that they are uniformly representative of the latent construct. Descriptive statistics reporting the correlations can provide an idea of how each quality dimension may be related to the parents' structure of the play segment. These data suggested that responsiveness to parents' concerns, encouragement of reflection, and presence of support appear to be negatively correlated with the parents' structure of the play segment.

There are two possible explanations for the non-significant findings in this study. First, the correlation between variables tends to be attenuated when there is limited variability in the data, which is the case for the current study. This is a common dilemma for research studies that rely on treatment integrity data to predict outcomes. Often, there is an implicit interest in the relation between treatment integrity components and treatment outcomes. Yet, this relationship cannot be accurately explored when scores for particular treatment integrity components are expected to be consistently high (Schulte et al., 2009). A few variables in this study had restricted ranges, with scores being on the higher end of the spectrum. While the high scores implied a good coaching or teaching performance, they limited the ability to analyze any potential effect that the predictor variables may have on the outcome variables.

Second, there may have been a weak alignment between the operationalization of the treatment integrity components and the actual behaviors observed in the coaching sessions. The operationalization of the treatment integrity components may have had an impact on how the relations between the predictor variables and the outcome variables were analyzed. It may be possible that the operationalization of the components did not accurately capture the unique coaching process of Project ImPACT. In particular, exposure, quality of coaching delivery, and participant responsiveness posed the greatest challenges. Exposure was measured by counting the number of constructive comments that the therapists provided to the parents during the coaching session.
In theory, this operationalization made sense because exposure to feedback was deemed an important part of the coaching experience for parents (Shanley & Niec, 2010). In practice, however, the measurement posed a unique challenge. The comments were clear enough to distinguish between corrective and reinforcing feedback most of the time. However, counting the number of comments became a challenge when observing therapists who often made lengthy comments that were a mixture of corrective and reinforcing feedback. In these instances, it was difficult to dissect the comments into their individual corrective and reinforcing parts. This particular challenge may have influenced the observers' counting accuracy to a certain extent.

Quality of coaching delivery was measured by evaluating four different aspects of parent coaching: Responding to parents' needs and concerns in a collaborative way, encouraging parents to reflect on their progress, empowering parents through support, and providing high quality feedback. In theory, these coaching behaviors were considered an integral part of educating adult learners, and therefore should be present in the coaching sessions (Koh & Neuman, 2009; Friedman et al., 2012; Foster et al., 2012). In actuality, it was challenging to maintain a balance between the ideal model of parent coaching and the realistic presentation of it in the telehealth delivery of Project ImPACT. One example was the measurement of the therapists' presence of support. They were expected to be actively present during the parent-child play segment. This expectation emanated from the notion that the parents would benefit from in vivo guidance and encouragement (Shanley & Niec, 2010; Barnett, Niec, & Acevedo-Polakovich, 2014). This coaching behavior was largely missing for most of the therapists. Instead, the therapists were observed to be watching intently and taking notes during the parent-child play segment, often saving their feedback for later. Nonetheless, the appearance of passiveness did not take away from the overall richness of their support for the parents. Another example was the measurement of responding to parents' needs and concerns in a collaborative way and encouraging parents to reflect on their progress. While these coaching behaviors were identified as critical components of adult learning, the context of the coaching session did not warrant their presence at times. For instance, the parents may not have had any need or concern to discuss with the therapists at every coaching session. Or, the need for the parents to reflect on their progress may have been limited due to their lack of practice at home.

Participant responsiveness was measured by evaluating the parents' verbal and nonverbal behaviors to determine their level of engagement. In theory, the operationalization of this variable appeared to be straightforward. The parents were expected to show their engagement by communicating with the therapists and positioning their body to face the therapists at all times (Knoche et al., 2010). In actuality, one unforeseen challenge of measuring this component was differentiating between intentional and unintentional disengagement. The reality for most parents was that they often had to shift their attention based on the demands of their current environment. The operationalization of this variable may not have been able to accurately capture the reality of many home-based interventions.
Broadly speaking, the overarching challenge associated with conceptualizing treatment integrity as a multidimensional construct was attempting to translate theoretical concepts into measurable components with limited empirical guidance from prior studies (Mowbray, Holter, Teague, & Bybee, 2003). As it turned out, what made sense in theory did not always align well with what was going on during the coaching sessions. The additional challenge was having to envision the application of the parent coaching model to an online intervention. Currently, the literature on the evaluation of parent coaching is limited to home-based interventions where the therapists visit the family home (Basu, Salisbury, & Thorkildsen, 2010; Foster et al., 2012). The difference in how intervention services are delivered can affect how some treatment integrity components are conceptualized and operationalized for measurement.

Limitations

One limitation of the study is the small sample size. The multilevel regression model has two levels. Level 1 consists of the repeated measurements of the treatment integrity data for the coaching and treatment deliveries. Level 2 consists of the parents. There are 19 clusters at level 2, with each parent representing one cluster. The cluster size varies from 5 to 8 data points at level 1, but when combined, there are a total of 130 data points across all parents. Multilevel modeling requires an adequate sample size in order for the regression coefficient estimates to be unbiased (McNeish & Stapleton, 2016). Further, an adequate sample size is required for the model to reach the appropriate level of statistical power, 0.80 (Scherbaum & Ferreter, 2009). Statistical power is the likelihood that an effect is detected when there is one to be detected in a study (Cohen, 1988). In an illustrative simulation study, Bell and colleagues (2010) found that models with limited sample size at level 1 and level 2 never attained the desired level of statistical power, 0.80, for both levels. For future studies, several guidelines exist for determining the sample size. Common recommendations have been 20-30 clusters with a minimum of 30 data points for each cluster (Kreft, 1996; Snijders & Bosker, 2012).

The second limitation of the study is the over-reliance on the theoretical literature to conceptualize the treatment integrity components. At present, there is a shortage of empirical evidence supporting the conceptualization of treatment integrity assessment as a multidimensional construct. Although much has been written about the concept from a theoretical standpoint, the application of this treatment integrity assessment model in intervention outcome studies has remained scarce (Zvoch, 2012). There are reasons for this ongoing trend in research. Conducting a multidimensional treatment integrity assessment is a challenging endeavor, primarily due to the constraints on time, cost, and resources (Perepletchikova et al., 2009). Further, exploring the relationship between treatment integrity components and treatment outcomes requires advanced statistical analysis, which is not readily available to many organizations (Mowbray et al., 2003). One particular challenge of the current study was mapping the multidimensional treatment integrity assessment model onto both stages of the intervention service delivery (e.g., coaching delivery and treatment delivery). Limited guidance was available to create the assessment framework.
Therefore, the conceptualization of most treatment integrity components was primarily based on the different theories that make up the overarching concepts of parent coaching and ASD parent-implemented interventions. The theoretical literature was helpful in that it provided a vision for each treatment integrity component. At the same time, there was a risk associated with following this vision, such that it may not always align with the behaviors observed in the coaching sessions.

The third limitation of the study is in the operationalization of the treatment integrity components, which could have contributed to the low correlations between the predictor variables and the outcome variables. For instance, exposure was operationalized as the number of corrective and reinforcement comments provided to parents in each session. This operationalization may not have been the best representation of exposure for a few reasons. First, the therapists had different preferences for providing feedback. Some therapists chose to provide feedback throughout the session while others chose to withhold feedback during the parent-child play segment. Second, it was difficult to disentangle convoluted comments at times, thereby compromising the observers' accuracy in counting. Similarly, the quality of coaching delivery and treatment delivery were operationalized into sub-dimensions of coaching (for the therapists) and teaching (for the parents). Measuring these dimensions on a Likert scale may have underestimated the complexity of the quality construct. More specifically, assigning a single number to an intricate behavior could have restricted the interpretation of that behavior.

Implications

Research. The current study was one of the few studies that attempted to use the multidimensional approach to assessing treatment integrity. Moreover, this study provided an initial look at how treatment integrity assessment could be accomplished for interventions that utilize an indirect service delivery model. The findings indicated that a multidimensional approach to assessing treatment integrity could be beneficial. At the descriptive analysis level, the findings provided a basic overview of the therapists' coaching performance and the parents' teaching performance. At the regression analysis level, the findings left room for future discussions on how the various aspects of parent coaching can be conceptualized and measured with more precision.

When appropriate, researchers and practitioners are strongly encouraged to measure multiple aspects of treatment integrity. A comprehensive assessment of the implementation process provides an opportunity to uncover strong and weak parts of the delivery, contextualize treatment outcomes, and differentiate between design errors and implementation errors (Rossi, Lipsey, & Freeman, 2004; Mowbray et al., 2003). In particular, complex interventions with different parts could benefit from this approach, especially when the goal is to identify the active ingredients of an intervention (Schulte et al., 2009). For future studies that seek to better understand parent coaching, a few steps are recommended to improve upon the current study. First, it would be beneficial to modify the conceptualization and/or operationalization of certain treatment integrity components for both coaching and treatment deliveries. In the current study, exposure was operationalized as the number of feedback comments provided to parents in each session.
A different way of operationalizing exposure could be looking at the coaching session length (Wainer & Ingersoll, 2013). Differences in session length could have an impact on various aspects of parents' treatment delivery (e.g., adherence to the intervention protocol and maintenance of the play segment). Quality of coaching and treatment deliveries are also in need of a new conceptualization and operationalization. For both components, it would be helpful to reconsider the appropriateness of the selected sub-dimensions, and the feasibility of measuring them accurately in a telehealth service delivery model. It is possible that there are distinctive differences in the parent coaching process between an in-person delivery and an online delivery of an intervention (Vismara et al., 2013). Additional research is needed to identify the unique aspects of the online delivery, so that the conceptualization of quality could better align with what is observed in practice. It would also improve the accuracy of the measurements.

Lastly, one overarching notion to consider is that parent coaching is a dyadic process (Foster et al., 2013). The parent and the therapist are often synchronized in their communication. Responses given by both individuals are expected to align with each other in order for the coaching process to be considered effective. For example, the therapist's responses to the parent's needs or concerns should be an appropriate match in order for the parent to feel validated and supported. Similarly, the parent's responses to the therapist's suggestions, whether it is a change in implementation or perspective, should also be an appropriate match in order for the therapist to know that support is being properly provided. In a nutshell, the interaction within the parent coaching process relies on both the therapist and the parent. There may be value in assessing this qualitative aspect of the coaching process because synchrony in the communication is the glue that holds this collaborative relationship together.

Second, it would be beneficial to increase the study sample size at level 1 and level 2. The recommended cluster size is 20-40 data points and the recommended number of clusters is >30 (Bell et al., 2010). Increasing the sample size at both levels would strengthen the statistical power and provide better control over the type-I error rate. If a small sample size is unavoidable, it would be helpful to use the Kenward-Roger adjustment to protect against type-I error by adjusting the F statistic and the df value to obtain a more accurate p-value (Kenward & Roger, 1997), or bootstrapping techniques to improve the accuracy of the inference about the population through multiple resampling of the sample data (Butar & Lahiri, 2003).

Practice. As a whole, the coaching session layout of Project ImPACT had several elements that were characteristic of the proposed parent coaching model. The bidirectional interaction between the therapists and the parents was apparent. It could be best described as equal, collaborative, respectful, and active (Knoche et al., 2013). The therapists were observed to be competent at supporting the parents in different ways, such as giving feedback, providing affirmation, and being attentive to their needs. These coaching behaviors portrayed the therapists as caring and approachable, both qualities that were noted by parents in previous studies as important and effective (Friedman et al., 2012; Rush et al., 2003).
Additionally, the therapists were observed to be mindful of their role in the coaching session. They presented themselves as a facilitator rather than a leader in their relationship with the parents. This coaching behavior implied that the therapists were intentional in their effort to place the parents at the center of the coaching experience (Cambray-Engstrom & Salisbury, 2010). The parents, in turn, were observed to be competent at combining their knowledge of the child and knowledge of the intervention techniques to teach their child. They created a meaningful learning experience that supported their child's acquisition of social communication skills that are necessary for daily functioning. Similar to the therapists, the parents appeared to be mindful of their role in the coaching session as well. They presented themselves as independent and active learners who had something to contribute to the coaching experience (Dunst & Trivette, 2009).

Moving forward, it would be worthwhile to review specific areas of the parent coaching process that need improvement. If engaging parents in a collaborative problem-solving dialogue and encouraging parents to reflect on their progress are truly active ingredients of parent coaching, then efforts should be made to incorporate these elements into the coaching session without taking time away from the other tasks on the agenda. It is likely that the inclusion of these coaching behaviors would not only enrich the coaching experience for the parents, but for the therapists as well. They would likely feel more empowered through these engagements due to the natural promotion of equality, respect, and collaboration in the relationship (Blue-Banning, Summers, Frankland, Nelson, & Beegle, 2004).

APPENDICES

APPENDIX A: Coaching Delivery – Adherence

Instruction: Evaluate the coach's adherence to the coaching protocol during the coaching session. Then, calculate a fidelity percentage based on the number of observances and non-observances. Each procedure is scored as Observed, Not Observed, or N/A.

Procedures:
The coach greets the family warmly.
The coach provides an agenda for the current session.
The coach checks in with the parent about their understanding of the lesson content, and provides clarification as needed.
The coach uses the Reflection Questions and the Homework Plan to discuss how practicing went at home.
The coach helps the parent come up with solutions to improve the practice at home. Sometimes this activity will be completed in session if the parent did not get a chance to complete it beforehand.
The coach encourages the parent to practice the techniques with the child.
The coach invites comments, questions, and concerns.
The coach provides positive and corrective feedback to the parent regarding her use of the technique(s) with the child during the play segment.
The coach helps the parent work through any obstacles in the implementation of the technique(s).
The coach assigns the next lesson for the following session.
The coach addresses concerns unrelated to the current lesson that the parent raises.
The coach helps the parent work through any difficulties with the technology.

Fidelity = [Observed / (Observed + Not Observed)] x 100
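As a quick arithmetic illustration of the formula above (a hypothetical helper, not part of the study's coding materials), items marked N/A are simply excluded from both the numerator and the denominator:

# Hypothetical helper illustrating the fidelity calculation above; N/A items are not counted.
def coaching_fidelity(observed: int, not_observed: int) -> float:
    return 100.0 * observed / (observed + not_observed)

print(coaching_fidelity(observed=11, not_observed=1))  # 11 of 12 applicable steps -> about 91.7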
Constructive feedback takes two forms: corrective and reinforcement.

Corrective feedback: Comments that are meant to improve the parent's use of the strategies. For example, the coach could say, "Because Johnny is not paying attention to you, why don't you try to get down on his level and make eye contact with him?"

Reinforcement feedback: Comments that are meant to encourage and reinforce the parent's use of the strategies. For example, the coach could say, "You did a wonderful job adjusting your communication tone when you saw that Johnny was feeling overwhelmed!"

The component will be measured by totaling the number of corrective and reinforcement comments given during the coaching session. A comment is counted as an occurrence if it is specific (e.g., "You did a great job gaining his attention!"). A comment is not counted as an occurrence if it is vague (e.g., "That's great." or "That's awesome.").

Frequency of Corrective Feedback: ________
Frequency of Reinforcement Feedback: ________
Total: ________

APPENDIX C: Coaching Delivery – Quality of Coaching Delivery

Instruction: Review the coaching session in 5-minute intervals. In every interval, assign a score for each quality indicator. The average of the interval scores will be the final score for each quality indicator. At the end, the overall score for quality of coaching will be the mean value of the four quality indicators.

Responsiveness to the Parents' Needs and Concerns

This item reflects the coach's responsiveness to parents' needs and concerns throughout the session. Responsiveness is defined as addressing areas of difficulty that parents may bring up during the reflection segment or experience during the play segment. It is not enough for the coach to only acknowledge the issue. Rather, the coach needs to facilitate a joint problem-solving opportunity in which parents are actively involved in generating potential solutions.

Mark "Not Observed (N/O)" for the following scenarios:
1. When there isn't any need or concern to be addressed during the reflection segment.
2. When the coach does not provide in vivo feedback during the play segment.

Parents may need support in the following areas: difficulty understanding or implementing an intervention technique, difficulty addressing implementation problems at home, and difficulty providing supportive social behaviors (e.g., deciding what step to take next or what to say to the child during the play segment).

A low level of responsiveness is characterized by the coach completely ignoring parents' needs and concerns, or merely acknowledging them without jointly coming up with a solution. A high level of responsiveness is characterized by the coach actively working with parents to come up with a solution.

1 – Very Low: The coach never acknowledged parents' needs and concerns.
2 – Low: The coach only acknowledged parents' needs and concerns, but did not work towards a solution.
3 – Moderate: The coach simply provided parents with a solution rather than working towards a solution with them.
4 – High: The coach worked towards a solution with parents at times, but simply provided them with a solution at other times.
5 – Very High: The coach acknowledged parents' needs and concerns and worked towards a solution with them.

Encouragement of Reflection

This item reflects the extent to which the coach creates an opportunity for parents to reflect on their implementation progress.
The coach can create an opportunity for reflection by: 1) asking questions about the daily routines, the use of strategies, or the child's developmental progress outside of the coaching session, and 2) asking for input and feedback on what was observed during the play segment.

Mark "Not Observed (N/O)" during the parent-child play segment, when the parent is not expected to reflect while engaging in play with the child.

A low level of encouragement is characterized by the coach allowing parents to go through the coaching session without stopping to reflect on their implementation progress. A high level of encouragement is characterized by the coach using leading questions to get parents thinking about their implementation progress.

1 – Very Low: The coach never created opportunities for parents to reflect on their progress during the interval.
2 – Low: For the majority of the interval, the coach did not create opportunities for parents to reflect on their progress.
3 – Moderate: For approximately half of the interval, the coach created opportunities for parents to reflect on their progress.
4 – High: For the majority of the interval, the coach created opportunities for parents to reflect on their progress.
5 – Very High: The coach created opportunities for parents to reflect on their progress throughout the interval.

Presence of Support

This item reflects the coach's efforts in empowering and encouraging parents. It also reflects the extent to which the coach presents as caring and approachable.

A low presence of support is characterized by a coach who criticizes parents' mistakes in ways that make them lose confidence in themselves as the child's teacher. This is a coach who does not check to see if parents need any support. Additionally, this is a coach who may be attentive during the parent-child play segment but does not verbally acknowledge parents' successes or errors. A high presence of support is characterized by a coach who uses positive words to help parents see themselves as a competent teacher for their child. This is also a coach who regularly checks to see if parents need any support. Additionally, this is a coach who demonstrates attentiveness during the parent-child play segment by providing reassurance and support to parents.

1 – Very Low: During the interval, the coach criticized parents' errors, did not acknowledge parents' successes, or did not check to see if parents needed any support. Also, the coach may have appeared attentive during the parent-child play segment but did not provide in vivo feedback during the interval.
2 – Low: For the majority of the interval, the coach criticized parents' errors, did not acknowledge parents' successes, or did not check to see if parents needed any support. Also, the coach may have appeared attentive during the parent-child play segment but provided in vivo feedback only 1-2 times during the interval.
3 – Moderate: For approximately half of the interval, the coach used positive affirmations to empower parents, provided reassurance and support when parents experienced challenges, or regularly checked to see if parents needed any support.
4 – High: For the majority of the interval, the coach used positive affirmations to empower parents, provided reassurance and support when parents experienced challenges, or regularly checked to see if parents needed any support.
5 – Very High: During the interval, the coach used positive affirmations to empower parents, provided reassurance and support when parents experienced challenges, or regularly checked to see if parents needed any support.

Quality of Feedback

This item reflects the qualitative characteristics of the coach's feedback. Mark "Not Observed (N/O)" when the coach does not provide in vivo feedback during the play segment.

Low quality feedback does not change (or improve) parents' role as the child's teacher. The feedback is considered "off target": irrelevant (unrelated to the target behaviors), vague (leaving room for confusion), and short (brief and incomplete). High quality feedback improves parents' role as the child's teacher. The feedback is considered "on target": relevant (pertinent to the target behaviors), explicit (no room for confusion), and concise (brief yet complete and informative).

1 – Very Low: The feedback was completely irrelevant, vague, and short.
2 – Low: The feedback was somewhat relevant, but still vague and short.
3 – Moderate: The feedback was relevant, but vague and lengthy (comprehensive but long).
4 – High: The feedback was relevant and explicit, but lengthy (comprehensive but long).
5 – Very High: The feedback was relevant, explicit, and concise.

Final score: ____

APPENDIX D: Coaching Delivery – Participant Responsiveness

Instruction: Review the coaching session in 5-minute intervals. In every interval, assign a score for the level of parents' engagement. The average of the interval scores will be the overall score for participant responsiveness.

Level of engagement. This item reflects the parent's level of engagement during the coaching session.

Verbal indicators of low engagement include: Parents provide a surface-level reflection on their implementation progress at home and during the play segment. They do not take the initiative to ask questions or share concerns. They do not actively resolve challenges and barriers with the coach. Nonverbal indicators of low engagement include: Parents have their head or body turned away from the computer screen. Parents physically attend to other stimuli in their environment (e.g., the child running around).

Verbal indicators of high engagement include: Parents provide a detailed reflection on their implementation progress at home and during the play segment. Parents take the initiative to ask questions or share concerns. Parents actively resolve challenges and barriers with the coach. Nonverbal indicators of high engagement include: Parents have their head or body facing the computer screen. Parents nod in agreement with the coach's feedback.

1 – Very Low: The parent was brief, passive, and uninvolved throughout the interval.
2 – Low: For the majority of the interval, the parent was brief, passive, and uninvolved.
3 – Moderate: For approximately half of the interval, the parent was thorough, active, and involved.
4 – High: For the majority of the interval, the parent was thorough, active, and involved.
5 – Very High: The parent was thorough, active, and involved throughout the interval.

Final score: _____

APPENDIX E: Treatment Delivery – Adherence

Instruction: Review the parent-child play segment in 5-minute intervals. In every interval, assign a score for each intervention technique. The average of the interval scores will be the final score for each intervention technique. At the end, the overall score for treatment adherence will be the mean value of the final scores of all the observed intervention techniques.
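The aggregation rule just described (interval ratings averaged within each technique, then averaged across the observed techniques) can be illustrated with a minimal sketch. The code below is written in Python for illustration only; the ratings and technique names are hypothetical, and treating N/O intervals as excluded missing values is an assumption of the sketch rather than a rule stated in the manual.

```python
from statistics import mean

def technique_score(interval_scores):
    """Average the 1-5 interval ratings for one intervention technique.

    Intervals marked N/O are represented here as None and are simply
    excluded from the average (an assumption of this sketch).
    """
    observed = [s for s in interval_scores if s is not None]
    return mean(observed) if observed else None

def treatment_adherence(scores_by_technique):
    """Overall adherence = mean of the final scores of all observed techniques."""
    finals = [technique_score(v) for v in scores_by_technique.values()]
    finals = [f for f in finals if f is not None]
    return mean(finals) if finals else None

# Hypothetical ratings for three techniques across four 5-minute intervals.
ratings = {
    "face_to_face":      [4, 5, 4, 5],
    "lets_child_lead":   [3, 3, 4, None],   # one N/O interval
    "joins_childs_play": [5, 4, 4, 4],
}
print(round(treatment_adherence(ratings), 2))
```

The same two-step averaging applies, with different indicators, to the quality-of-coaching and quality-of-treatment-delivery scores described in the other appendices.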
1 - Never: No implementation throughout the segment OR attempted implementations are incorrect
2 - Rarely: Implemented as instructed less than half of the time
3 - Occasionally: Implemented as instructed half of the time
4 - Frequently: Implemented as instructed more than half of the time
5 - Always: Implemented as instructed throughout the play segment

Lesson 4, Session 8 - FOCUS ON YOUR CHILD
(For each intervention technique: operationalization, examples, non-examples, notes, and score.)

Intervention technique 1. Stays face to face at the child's eye level
Operationalization:
- Parent positions herself (sit or stand) to be in the child's visual field and to be in close proximity to the child.
- Parent sits directly or diagonally across from the child on the floor or at the table.
- Parent changes her position to stay in the child's visual field (and remain close to him) if he moves around.
- Parent rearranges the child's sitting position to have him face her.
**Exception: Physical activities that require being next to the child
Non-examples:
- Parent sits behind or next to the child when the activity provides an opportunity to be face to face.
- Parent doesn't change her position to stay in the child's visual field.
- Parent allows the child to wander away from her.
- Parent doesn't rearrange the child's sitting position to have him face her.

Intervention technique 2. Lets the child lead the activity
Operationalization:
- Parent presents the child with activity options or with choices of what to do within an activity.
- Parent goes along with the child's play style (appropriately).
Examples:
- Parent presents the child with two activity choices: Play ball or do a puzzle.
- Parent asks if the child wants to feed the doll or bathe the doll.
- Parent follows the child by lining up the cars.
Non-examples:
- Parent decides that they will play ball.
- Parent decides that they will feed the doll.
- Parent corrects the child's choice to line up the cars.

Intervention technique 3. Joins the child's play
Operationalization:
- Parent plays with the child through commenting, assisting, and expanding on his play actions.
Examples:
- Comment on the child's play: Parent says, "Your train track is so long!"
- Assist the child during play: Parent gives the child one block at a time to build a tower.
- Expand on the child's play: Parent places a miniature doll in the car that the child is pushing.
- Join in sensory play: Parent spins the child around in a chair, or engages in rough and tumble play.
Non-examples:
- Ask questions: Parent asks, "What are you going to do with the blocks?"
- Give directions: Parent says, "Put the same-color blocks together."
- Take over the lead: Parent finishes building a tower for the child.

Intervention technique 4. Imitates the child's play
Operationalization:
- Parent follows the child's play, gestures (e.g., running, laying down), or body movements, and vocalization.
Examples:
- Follow child's actions: If the child drinks out of a cup, parent does the same.
- Follow child's gestures or body movements: If the child lies down, parent does the same.
- Follow child's vocalization: If the child is preverbal, parent replicates any sounds he makes. If the child is verbal, parent repeats the appropriate words to help him learn them.
Non-examples:
- Parent does an action that does not align with the child's action. For example, pour water in a cup instead of drink from it.
- Parent corrects the child's action instead of imitating it. For example, parent makes the child stand up when he wants to lie down.

Lesson 5, Session 10 - ADJUST YOUR COMMUNICATION
(For each intervention technique: operationalization, examples, non-examples, notes, and score.)

Intervention technique 5. Exaggerates communication with the child
Operationalization:
- Parent exaggerates gestures, facial expressions, and vocal quality.
- Parent uses attention-getting phrases (e.g., “Uh oh!” “Oh, no!”). - Parent changes verbal and nonverbal gestures based on the child’s arousal level. 6. Adjusts animation to help the child stay regulated 7. Use developmentally appropriate language for the child - Parent uses language that is at or slightly above the child’s developmental level. - Parent makes a sad face, a happy face, or a surprised face in the appropriate context. - Parent keeps a straight face, and talks in a monotonous tone. - Parent uses a calmer, less excited voice when the child is too revved up in order to calm him down. - Skill-related response: Parent prompts the child to use single words when he’s at the pointing stage. - Parent maintains overexcited despite the child being overwhelmed by the situation. - Skill-related response: Parent prompts the child to use full sentences when he’s at the pointing stage. 99 8. Give meaning to the child’s behaviors 9. Expands the child’s language - Parent narrates the child’s actions. - Parent responds appropriately to the child’s behavior. - Parent adds on to the child’s communicative attempts. - Child grabs dino toy. Parent describes his action, “You’re choosing the blue dino.” - Child grabs dino toy. Parent teaches him to say, “I want.” - Parent does not narrate the child’s behaviors. - When the child grabs the ball, the parent hands it over without teaching him to say, “I want.” Operationalization Examples Lesson 6, Session 12 - MAKE PLAY INTERACTIVE Intervention technique 10. Use playful obstruction or balanced turns to create an interactive play style PLAYFUL OBSTRUCTION 1. Parent gives an anticipatory phrase to signal their entrance into the child’s play (e.g., Ready, set stop. I’m going to get you. Here I come). 2. Parent presents the playful obstruction (e.g., put your hand over his hand, cover a part of the toy that he is using, stand in his way, stop his movement). 3. Parent waits for the child to respond (e.g., eye contact, vocalization, facial expression, gestures). 4. Parent responds to the child’s PLAYFUL OBSTRUCTION 1. Parent says, “Here I come!” 2. Parent places hand in front of the child’s car to block his play. 3. Parent leaves her hand there and waits for the child to respond. 4. Parent immediately removes her hand once the child makes eye contact. 5. Parent teaches the child to say, “Mom, move please.” BALANCED TURNS 1. Parent taps her chest and says, “My turn!” and Non-examples Notes Score PLAYFUL OBSTRUCTION 1. Parent unexpectedly places her hand in front of the child’s car without a verbal warning. 2. Parent removes her hand without waiting for the child to make a request. 3. Parent fails to teach the child to say, “Mom, move please.” BALANCED TURNS 1. Parent takes the car toy without saying, “My turn.” 2. Parent returns the car toy to the 100 child to stop him from being upset. 3. Parent demonstrates symbolic play skills (puts man in the car and pushes it) while the child only knows exploratory play skills (putting the car toy in his mouth). 4. Parent returns the car toy without waiting for the child to ask for it. puts out her hand to signal that it’s her turn. 2. Parent takes the car toy and pushes it across the floor. 3. Because the child knows functional play skills (pushes car), parent models the next level play skills (puts man in the car and pushes it). 4. Parent plays with the car toy until the child makes an attempt to initiate his turn. 5. 
Upon the child’s initiation, parent returns the car toy to him and models the phrase, “Your turn.” communication (e.g., get out of his way, give him access to the hidden toy, remove your hand from his hand). 5. Model appropriate language for the child. BALANCED TURNS 1. Parent gives an anticipatory phrase, “My turn!” and puts out her hand to signal that it’s her turn. 2. Parent takes the toy and plays with it for a short time. If the child gets upset, offer him a similar toy to play with while waiting for you. 3. Parent models developmentally appropriate play skills. 4. Parent waits for the child to initiate his turn (e.g., eye contact, facial expressions, gestures, vocalization). 5. Parent responds to the child’s communication by returning the toy to him, and model the phrase, “Your turn.” 101 Lesson 7, Session 14 - ENCOURAGE YOUR CHILD TO INITIATE (see Table A for communicative temptation strategies) Intervention Operationalization technique 11. Sets up an opportunity for communication - Parent hands the ball to the child. Examples Non-examples Notes Score - Parent uses one of the communicative temptation strategies to encourage communication from the child. - Parent makes eye contact with the child and wait for him to make an initiation. - [In sight/out of reach] - Parent places the ball in the child’s line of sight but out of reach. - Parent looks for the child to make eye contact, vocalize, or point. - Parent gives the child access to the ball. - Parent does not make eye contact with the child while waiting for an initiation. - Parent ignores the child’s request. 12. Waits for the child to initiate 13. Responds to the child’s behavior as meaningful - Following the child’s initiation, parent responds by granting access to the preferred item. NOTE (for lesson 8-11): Parent should introduce teaching opportunities 1/3 of the time for each interval (about 1.5 minutes). The rest of the time should still be devoted to free play. Notes Score Non-examples Operationalization Examples Lesson 8 & 9, Session 16 & 18 - TEACH AND EXPAND LANGUAGE THROUGH PROMPTING (see Table B for different language prompts) Intervention technique 14. Prompts for communication related to the child’s goals **In lesson 9, parents are asked to use different language prompts from Table B. - Preverbal communication: If the child points at the ball, and parent models a full sentence, “I want the ball, please.” - Preverbal communication: If the child points at the ball, parent can model how to say, “Ball.” - Parent prompts for a response that is at or slightly above the child’s current skill level. Prompts include modeling, giving choices, or asking a question. ** In lesson 9, refer to Table B for different language prompts. - Parent allows at least 5 seconds after presenting a prompt 15. Provides sufficient opportunity for - Parent verbally models, “Want ball.” - Parent verbally models, “Want ball.” 102 the child to respond to give the child time to respond. **Gives the child 5 seconds to respond before giving another prompt. 16. Follows through on the third prompt (when the child needs it) **Mark N/O if a third prompt is not needed - Start with the least supportive prompt. If the child cannot produce the correct response after two prompts, parent uses physical guidance as the third prompt to help him be successful. - Parent asks the question, “Open jar?” while physically guiding the child’s finger to tap the jar lid. **Gives the child less than 5 seconds to respond before giving another prompt. 
Or, waits for more than 10 seconds before giving another prompt. - Parent continues to ask, “Do you want me to open the jar?” Operationalization Examples Non-examples Notes Score Lesson 10 & 11, Session 20 & 22 - TEACH AND EXPAND IMITATIVE PLAY THROUGH PROMPTING (see Table C for different play prompts) Intervention technique 17. Models a play action or gesture for your child **In lesson 11, parents are asked to use different play prompts from Table C. 18. Provides sufficient opportunity for the child to respond - Parent models a play action or gesture, and describes it for the child. ** In lesson 11, refer to Table C for different play prompts. - Parent allows approximately 10 seconds after modeling an action to give the child time to respond. - Parent pours water into a cup without describing the action. - Parent models the wave gesture without labeling the action. - Parent pours water into a cup and says, “Pouring water!” - Parent models the wave gesture and says “Bye!” - Parent pushes a car and says, “Push!” **Gives the child 10 seconds to respond before - Parent pushes a car and says, “Push!” **Gives the child less than 10 seconds to respond before 103 modeling it a second time. - Parent physically guides the child’s hand to push the car. modeling it a second time. - Parent continues to verbally instruct the child to push the car. 19. Follows through on the third prompt (when the child needs it) **Mark N/O if a third prompt is not needed - Start with the least supportive prompt. If the child cannot produce the correct response after two prompts, parent uses physical guidance as the third prompt to help him be successful. Operationalization Examples Lesson 8-11; Session 16, 18, 20, 22 – REINFORCEMENT Intervention technique 20. Provides reinforcement when appropriate - Parent grants the child access to the desired item or activity. - Parent praises the child. Non-examples Notes Score LANGUAGE - Parent gives a different item instead of the desired item as reinforcement. For example, parent gives the child a sticker for saying, “Ball,” instead of giving him/her a ball. - Parent withholds access to the desired item/activity or won’t praise when the child gives the correct response or makes a good attempt at responding. PLAY - Parent ignores the child’s correct response LANGUAGE - Grant access: Parent gives the child the ball if he says, “Ball.” - Praise: Parent says, “Good job saying, “Ball!”” PLAY - Grant access: Parent lets the child play with the car on his own for a few seconds. - Praise: Parent says, “Good job pushing the car!” 104 (pushing the car) and moves onto a new play action. - Parent ignores the child’s correct response (pushing the car) and makes him do it again. OR…parent grants the child access to the desired item even though he ignores the demand or provides an incorrect response. FINAL SCORE (average): __________ 105 TABLE A - COMMUNICATIVE TEMPTATION STRATEGIES Strategy In sight/out of reach Operationalization - Parent places the preferred toy directly in the child’s line of sight, but out of reach. Parent makes eye contact with the child, and waits for him to make a request for the toy. - Parent initiates an activity using toys that require assistance (e.g., balloons, tops, bubbles, wind up toys, and remote control toys). Parent makes eye contact with the child, and waits for him to ask for more. - Parent provides the child one piece at a time of an item with multiple parts (e.g., blocks, trains, or puzzles). 
- Parent provides the child one piece of an item with multiple parts while withholding the other “missing” pieces (e.g., gives the child the train tracks without the trains). - Parent makes a small change in a play or daily routine (e.g., intentionally place a car out of line). Assistance Inadequate portions Sabotage Protest Examples - Parent places a car toy in the child’s line of sight, and looks for the child to make eye contact, vocalize, or point before handing over the toy. - Parent blows a few bubbles, and waits for the child to ask for more via making eye contact, pointing, or vocalizing. Non-examples - Parent allows the child access to the toy without making a request. - Parent continues to blow bubbles without giving the child an opportunity to ask for more. - Parent provides the child one block, and waits for the child to ask for more via making eye contact, pointing, or vocalizing. - Parent provides the child the train tracks without the trains, and waits for the child to ask for the missing item(s) via making eye contact, pointing, or vocalizing. - Parent intentionally places a car out of line. As soon as the child gets upset, parent models the appropriate language or gestures (“Stop, please), and places the car back in line. - Parent intentionally puts the child’s shoes on his hands instead of his feet, and waits for the child to react via making eye contact, - Parent continuously provides the child with many blocks without giving the child an opportunity to ask for more. - Parent provides the child all of the items without giving the child an opportunity to ask for the missing pieces. - Parent intentionally places a car out of line. As soon as the child gets upset, parent places the car back in line without modeling the appropriate language or gestures. - Parent intentionally puts the child’s shoes on his hands instead of his feet, but immediately points out the silliness 106 Silly situations - Parent intentionally does a routine incorrectly in a silly way (e.g., put the child’s shoes on his hands instead of his feet). pointing, or vocalizing. If the child doesn’t react, the parent says, “That’s silly!” instead of waiting for the child to react first. 107 TABLE B - DIFFERENT PROMPTS FOR LANGUAGE (in order from most to least supportive) Prompt Physical prompt Examples - Parent uses her hand to guide the child’s hand in pointing to the water bottle. - Parent taps the jar lid and say, “Open.” - Parent says, “Ready, set, ___,” and gives the child an opportunity to say, “Go!” - Parent models the phrase, “More, please,” and gives the child an opportunity to say it back. - Parent asks, “What fruit do you want, a banana or an orange?” and gives the child an opportunity to respond. - Parent says, “The baby doll is in the ____,” and gives the child an opportunity to respond. - Parent asks, “Where do you want to be tickled?” and gives the child an opportunity to respond. - Parent places the car toy in front of the child, and gives him an expected look as Non-examples - Parent points at the water bottle and asks the child to do the same. - Parent uses her hand to guide the child’s hand in tapping the jar lid. - Parent says, “Ready, set, go!” without giving the child an opportunity to finish the phrase. - Parent models the phrase, “More, please,” without giving the child an opportunity to say it back. - Parent asks, “What fruit do you want, a banana or an orange?” then makes a choice for him. 
- Parent says, “The baby doll is in the bathtub!” without giving the child an opportunity to respond. - Parent asks, “Where do you want to be tickled?” then chooses a spot for him. - Parent places the car toy in front of the child, and allows him to take it without Operationalization - Parent guides the child’s hand with her hand; also known as “hand over hand” support. - Parent models a gesture or physical action for the child. - To use with a familiar routine: Parent starts saying the verbal phrase but leaves off the last word for the child to complete. - Parent provides a word or phrase for the child to imitate. - Parent provides the child with two choices to let him practice speaking independently without the modeling aspect. - Parent starts saying the verbal phrase but leaves off the last word for the child to complete. However, there isn’t always one right answer. - Parent asks specific questions about the current activity. Gesture prompt Verbal routine Verbal model Choice Cloze procedure Direct question Time delay - Parent gives an expected look to cue the child that he needs to respond in some way. 108 she waits for his response. initiating communication. 109 Verbal instruction - Parent makes a suggestion for a new play action with the current toy. Choice - Parent gives the child new options to play with his current toy. Leading question - Parent asks the child what he’d like to do with his current toy. Leading comment Cooperative play - Parent makes a comment to help the child decide what to do next with his current toy. - Parent takes a supportive role in the child’s play TABLE C - DIFFERENT PROMPTS FOR PLAY SKILLS (in order from most to least supportive) Prompt Imitative play Operationalization - Parent models a play action for the child to imitate. Non-examples - If the child is playing with a car, and parent instructs him to do a race with two cars. - If the child is playing with a car, and parent physically takes the car to model the play action. - If the child is holding a doll, and parent makes him play with it a different way. - If the child is holding a doll, and parent makes him play with it a different way. - If the child is holding a doll, and parent instructs him to put it to bed. - If the parent and the child do separate activities instead of playing together. Examples - If the child is playing with a car, parent can take two cars and models how to make them race against each other. - If the child is playing with a car, parent can give him another car and say, “Make them race!” - If the child is holding a doll, parent can ask whether he wants to feed the doll or gives her a bath. - If the child is holding a doll, parent can ask, “What should the baby do now, eat or nap?” - If the child is holding a doll, parent can give him a blanket and say, “Your baby looks sleepy,” - If the child is playing with his doctor kit, parent can take the role of a patient. 110 APPENDIX F: Treatment Delivery – Quality of Treatment Delivery Instruction: Review the parent-child play segment in 5-minute intervals. In every interval, assign a score for each quality indicator. The average of the interval scores will be the final score for each quality indicator. At the end, the overall score for quality of treatment delivery will be the mean value from the two quality indicators. Structure of the Play Segment This item reflects the structural quality of the play segment. 
A low quality play segment lacks structure that is beneficial to creating meaningful engagement and learning opportunities for the child. The following aspects are evaluated: • During a specific activity, both used and unused items can be seen scattered around the area, which makes it difficult for the child to pay attention. • The selected activities do not offer opportunities for the child to learn new language and play skills. • The parent creates very little opportunities to use the intervention techniques to help the child learn new language and play skills. • The parent is lax about setting limits to keep the child from engaging in inappropriate behaviors. A high quality play segment is structured to ensure consistent engagement from the child in order to make learning possible. The following aspects are evaluated: • During a specific activity, only used items are present. Unused items are stored away to decrease the child’s distraction. • The activities are conducive to helping the child learn new language and play skills. • The parent creates enough opportunities to use the intervention techniques to help the child learn new language and play skills. • The parent is vigilant about setting limits to keep the child from engaging in inappropriate behaviors. 1 – Very Low: The activities were not properly structured to support meaningful engagement and new learning opportunities during the interval. 2 – Low: For the majority of the interval, the activities were not properly structured to support meaningful engagement and new learning opportunities. 3 – Moderate: For approximately half the interval, the activities were not properly structured to support meaningful engagement and new learning opportunities. 4 – High: For the majority of the interval, the activities were properly structured to support meaningful engagement and new learning opportunities. 5 – Very High: The activities were properly structured to support meaningful engagement and new learning opportunities during the interval. 111 Presence of Support This item reflects the parent’s level of warmth, encouragement, and patience. Low presence of support is described as: Appearing distant and bored, withholding positive support from the child (especially during challenging and frustrating moments), and appearing frustrated or mad when the child fails to respond successfully to a teaching opportunity. High presence of support is described as: Maintaining a positive affect and warmth towards the child, providing positive support to the child (especially during challenging and frustrating moments), and remaining calm and persistent when the child isn’t able to respond successfully to a teaching opportunity. 1 – Very Low: During the interval, the parent appeared distant and bored, withheld positive support from the child, and appeared frustrated or mad when the child failed to respond successfully. 2 – Low: For the majority of the interval, the parent appeared distant and bored, withheld positive support from the child, and appeared frustrated or mad when the child failed to respond successfully. 3 – Moderate: For approximately half of the interval, the parent maintained a positive affect and warmth towards the child, provided the child with positive support, and remained calm and persistent when the child failed to respond successfully. 
4 – High: For the majority of the interval, the parent maintained a positive affect and warmth towards the child, provided the child with positive support, and remained calm and persistent when the child failed to respond successfully. 5 – Very High: During the interval, the parent maintained a positive affect and warmth towards the child, provided the child with positive support, and remained calm and persistent when the child failed to respond successfully. Final score: ____ 112 REFERENCES 113 REFERENCES Allen, J. D., Linnan, L. A., & Emmons, K. M. (2012). Fidelity and its relationship to implementation effectiveness, adaptation, and dissemination. Dissemination and Implementation Research in Health: Translating Science to Practice, 281-304. doi: 10.1093/acprof:oso/9780199751877.003.0014 American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders (5th ed.). Washington, DC. APA Presidential Task Force on Evidence-Based Practice (2006). Evidence-based practice in psychology. The American psychologist, 61(4), 271-285. doi: 10.1037/0003- 066X.61.4.271 Barnett, M. L., Niec, L. N., & Acevedo-Polakovich, I. D. (2014). Assessing the key to effective coaching in parent-child interaction therapy: The therapist-parent interaction coding system. Journal of Psychopathology of Behavioral Assessment, 36(2), 211-223. doi: 10.1007/s10862-013-9396-8 Barton, E. E., & Fettig, A. (2013). Parent-implemented interventions for young children with disabilities: A review of fidelity features. Journal of Early Intervention, 35(2), 194-219. doi: 10.1177/1053815113504625 Basu, S., Salisbury, C. L., & Thornkildsen, T. A., (2010). Measuring collaborative consultation practices in natural environments. Journal of Early Intervention, 32(2), 127-150. doi: 10.1177/1053815110362991 Bell, B., Morgan, G., Schoeneberger, J., Loudermilk, B., Kromrey, J., & Ferron, J. (2010). Dancing the Sample Size Limbo with Mixed Models: How Low Can You Go? SAS Global Forum. 4. Bellg, A. J., Borrelli, B., Resnick, B., Hecht, J., Minicucci, D. S., Ory, M., . . . (2004). Enhancing treatment fidelity in health behavior change studies: Best practices and recommendations from the NIH behavior change consortium. Health Psychology, 23(5), 443-451. doi: 10.1037/0278-6133.23.5.443 Ben Itzchak, E., Lahat, E., Burgin, R., & Zachor, A. D. (2008). Cognitive, behavior and intervention outcome in young children with autism. Research in Developmental Disabilities,29(5), 447-458. doi: 10.1016/j.ridd.2007.08.003 Billingsley, F. F., White, O. R., & Munson, R. (1980). Procedural reliability: A rationale and an example. Behavioral Assessment, 2, 229-241 114 Blue-Banning, M., Summers, J. A., Frankland, H. C., Nelson, L. L., & Beegle, G. (2004). Dimensions of family and professional partnerships: Constructive guidelines for collaborations. Exceptional children, 70(2), 167-184. doi: 10.1177/001440290407000203 Borrelli, B. (2011). The assessment, monitoring, and enhancement of treatment fidelity in public health clinical trials. Journal of Public Health Dentistry, 71, S52-S63. doi: 10.1111/j.1752-7325.2011.00233.x. Brown, J. A., & Woods, J. J. (2015). Effects of a triadic parent-implemented home-based communication intervention for toddlers. Journal of Early Intervention, 37(1), 44-68. doi: 10.1177/1053815115589350 Butar, F. B., & Lahiri, P. (2003). On measures of uncertainty of empirical Bayes small-area estimators. Journal of Statistical Planning and Inference, 112(1), 63–76. Cambray-Engstrom, E., & Salisbury, C. (2010). 
An exploratory case study of providers' collaborative consultation practices with Latina mothers during home visits. Infants & Young Children, 23(4), 262-274. doi: 10.1097/IYC.0b013e3181f21f6d Campbell, P. H., & Sawyer, L. B. (2007). Supporting learning opportunities in natural settings through participation-based services. Journal of Early Intervention, 29(4), 287-305. doi: 10.1177/105381510702900402 Carter, A. S., Messinger, D. S., Stone, W. L., Celimli, S., Nahmias, A. S., & Yoder, P. (2011). A randomized controlled trial of hanen's 'more than words' in toddlers with early autism symptoms. Journal of Child Psychology and Psychiatry, and Allied Disciplines, 52(7), 741-752. doi: 10.1111/j.1469-7610.2011.02395.x Center for Disease Control and Prevention. (2014). Autism Spectrum Disorder (ASD). Retrieved from https://www.cdc.gov/ncbddd/autism/data.html Charman, T. (2003). Why is joint attention a pivotal skill in autism? Philosophical Transactions of the Royal Society of London. Series B: Biological Sciences, 358(1430), 315-324. doi: 10.1098/rstb.2002.1199 Charman, T., & Stone, W. L. (2006). Social and communication development in autism spectrum disorders: Early identification, diagnosis, and intervention. New York: Guilford Press. Chawarska, K., Paul, R., Klin, A., Hannigen, S., Dichtel, L. E., & Volkmar, F. (2007). Parental recognition of developmental problems in toddlers with autism spectrum disorders. Journal of Autism and Developmental Disorders, 37(1), 62–72. Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum 115 Collier-Meek, M. A., Fallon, L. M., Sanetti, L. M., & Maggin, D. M. (2013). Focus on implementation: Assessing and promoting treatment fidelity. Teaching Exceptional Children, 45(5), 52-59. Connell, C. M., & Prinz, R. J. (2002). The impact of childcare and parent–child interactions on school readiness and social skills development for low-income African American children. Journal of School Psychology, 40(2), 177-193. doi: 10.1016/S0022- 4405(02)00090-0 Cook, R. (1977). Detection of Influential Observation in Linear Regression. Technometrics, 19(1), 15-18. doi: 10.2307/1268249 Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18(1), 23-45. doi: 10.1016/S0272-7358(97)00043-3 Dawson, G. (2008). Early behavioral intervention, brain plasticity, and the prevention of autism spectrum disorder. Development and Psychopathology, 20(3), 775-803. doi: 10.1017/S0954579408000370 Dempster, A., Laird, N., & Rubin, D. (1977). Maximum Likelihood from Incomplete Data via the EM Algorithm. Journal of the Royal Statistical Society. Series B (Methodological), 39(1), 1-38. Retrieved from http://www.jstor.org/stable/2984875 DiGennaro Reed, F. D., & Codding, R. S. (2014). Advancements in procedural fidelity assessment and intervention: Introduction to the special issue. Journal of Behavioral Education, 23(1), 1-18. doi: 10.1007/s10864-013-9191-3 Dinnebeil, L. A., McInerney, W. F., Roth, J., & Ramaswamy, V. (2001). Itinerant early childhood special education services: Service delivery in one state. Journal of Early Intervention, 24(1), 35-44. doi: 0.1177/105381510102400106 Dunlap, G., Ester, T., Langhans, S., & Fox, L. (2006). Functional communication training with toddlers in home environments. Journal of Early Intervention, 28(2), 81-96. doi: 10.1177/105381510602800201 Dunst, C. J., & Trivette, C. M. (2009). 
Letʼs be PALS: An evidence-based approach to professional development. Infants & Young Children, 22(3), 164-176. doi: 10.1097/IYC.0b013e3181abe169 Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3), 327-350. doi: 10.1007/s10464-008-9165-0 116 Fleming, J. L., Sawyer, L. B., & Campbell, P. H. (2011). Early intervention providers’ perspectives about implementing participation-based practices. Topics in Early Childhood Special Education, 30(4), 233-244. doi: 10.1177/0271121410371986 Flom, R. A., Burmeister, R. C., & Pick, A. D. (1998). An ecological approach to joint attention and early language. Infant Behavior and Development, 21, 61-61. doi: 10.1016/S0163- 6383(98)91276-0 Foster, L., Dunn, W., & Lawson, L. M. (2013). Coaching mothers of children with autism: A qualitative study for occupational therapy practice. Physical & Occupational Therapy in Pediatrics, 33(2), 253-263. doi: 10.3109/01942638.2012.747581 Fox, J. (1991). Quantitative Applications in the Social Sciences: Regression diagnostics. Thousand Oaks, CA: SAGE Publications Ltd Frank, J. L., & Kratochwill, T. R. (2008). School-based problem-solving consultation: Plotting a new course for evidence-based research and practice in consultation. In W. P. Erchul & S. M. Sheridan (Eds.), Handbook of research in school consultation (pp. 13–30). New York: Lawrence Erlbaum Associates Friedman, M., Woods, J., & Salisbury, C. (2012). Caregiver coaching strategies for early intervention providers: Moving toward operational definitions. Infants & Young Children, 25(1), 62-82. doi: 10.1097/IYC.0b013e31823d8f12 Gale, C. M., Eikeseth, S., & Rudrud, E. (2011). Functional assessment and behavioural intervention for eating difficulties in children with autism: A study conducted in the natural environment using parents and ABA tutors as therapists. Journal of Autism and Developmental Disorders, 41(10), 1383-1396. doi: 10.1007/s10803-010-1167-8 Gervain, J., & Werker, J. F. (2008). How infant speech perception contributes to language acquisition. Language and Linguistics Compass, 2(6), 1149-1170. doi: 10.1111/j.1749- 818X.2008.00089.x Gibson, J. L., Pennington, R. C., Stenhoff, D. M., & Hopper, J. S. (2010). Using desktop videoconferencing to deliver interventions to a preschool student with autism. Topics in Early Childhood Special Education, 29(4), 214-225. doi: 10.1177/0271121409352873 Gillett, J. N., & LeBlanc, L. A. (2007). Parent-implemented natural language paradigm to increase language and play in children with autism. Research in Autism Spectrum Disorders, 1(3), 247-255. doi: 10.1016/j.rasd.2006.09.003 Graziano, P. A., Bagnet, D. M., Slavec, J., Hungerford, G., Kent, K., Babinski, D., …Pasalich, D. (2015). Feasibility of intensive parent-child interaction therapy (I-PCIT): Results from an open trial. Journal of Psychopathology and Behavioral Assessment, 37(1), 38-49. doi: 10.1007/s10862-014-9435-0 117 Green, J., Charman, T., McConachie, H., Aldred, C., Slonims, V., Howlin, P., . . . PACT Consortium. (2010). Parent-mediated communication-focused treatment in children with autism (PACT): A randomised controlled trial. The Lancet, 375(9732), 2152-2160. doi: 10.1016/S0140-6736(10)60587-9 Gresham, F. M. (1989). Assessment of treatment integrity in school consultation and prereferral intervention. School Psychology Review, 18(1), 37-50 Gresham, F. M., Gansle, K. A., & Noell, G. 
H. (1993). Treatment integrity in applied behavior analysis with children. Journal of Applied Behavior Analysis, 26(2), 257-263. Guinchat, V., Chamak, B., Bonniau, B., Bodeau, N., Perisse, D., Cohen, D., & Danion, A. (2012). Very early signs of autism reported by parents include many concerns not specific to autism criteria. Research in Autism Spectrum Disorders, 6(2), 589-601. Guinchat, V., Thorsen, P., Laurent, C., Cans, C., Bodeau, N., & Cohen, D. (2012). Pre‐, peri‐and neonatal risk factors for autism. Acta obstetricia et gynecologica Scandinavica, 91(3), 287-300. doi: 10.1111/j.1600-0412.2011.01325.x Hester, P. P., Kaiser, A. P., Alpert, C. L., & Whiteman, B. (1996). The generalized effects of training trainers to teach parents to implement milieu teaching. Journal of Early Intervention, 20(1), 30-51. doi: 10.1177/1362361315575164 Hoffman, L., & Rovine, M. J. (2007). Multilevel models for the experimental psychologist: Foundations and illustrative examples. Behavior Research Methods, 39(1), 101-117. doi: 10.3758/BF03192848 Hohmann, A. A., & Shear, M. K. (2002). Community-based intervention research: Coping with the “noise” of real life in study design. American Journal of Psychiatry, 159(2), 201-207 Howlin, P., & Moore, A. (1997). Diagnosis in autism. Autism, 1(2), 135-162. Hu, L-T., & Bentler, P. M. (1999) Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1-55. doi: 10.1080/10705519909540118 Ingersoll, B. (2011). The differential effect of three naturalistic language interventions on language use in children with autism. Journal of Positive Behavior Intervention, 13(1), 109-118. doi: 10.1177/1098300710384507 Ingersoll, B., & Dvortcsak, A. (2010). Teaching social communication to children with autism. New York: Guilford. Ingersoll, B., & Wainer, A. (2013). Initial efficacy of Project ImPACT: A parent-mediated social communication intervention for young children with ASD. Journal of Autism and Developmental Disorders, 43(12), 2943-2952. doi: 10.1007/s10803-013-1840-9 118 Ingersoll, B., Wainer, A. L., Berger, N. I., Pickard, K. E., & Bonter, N. (2016). Comparison of a self-directed and therapist-assisted telehealth parent-mediated intervention for children with ASD: A pilot RCT. Journal of Autism and Developmental Disorders, 46(7), 2275- 2284. Itzchak, E. B., Lahat, E., Burgin, R., & Zachor, A. D. (2008). Cognitive, behavior and intervention outcome in young children with autism. Research in Developmental Disabilities, 29(5), 447-458. doi: 10.1016/j.ridd.2007.08.003 Kaiser, A. P., & Hancock, T. B., (2003). Teaching parents new skills to support their young children’s development. Infants & Young Children, 16(1). doi: 10.1097/00001163- 200301000-00003 Kaiser, A. P., Hancock, T. B., & Nietfeld, J. P. (2000). The effects of parent-implemented enhanced milieu teaching on the social communication of children who have autism. Early Education and Development, 11(4), 423-446. doi: 10.1207/s15566935eed1104_4 Kaiser, H. (1974). An index of factor simplicity. Psychometrika, 39(1), 31-36. Kasari, C., Siller, M., Huynh, L. N., Shih, W., Swanson, M., Hellemann, G. S., & Sugar, C. A. (2014). Randomized controlled trial of parental responsiveness intervention for toddlers at high risk for autism. Infant Behavior & Development, 37(4), 711-721. doi: 10.1016/j.infbeh.2014.08.007 Kazdin, A. E. (2007). Mediators and mechanisms of change in psychotherapy research. 
Annual Review of Clinical Psychology, 3(1), 1-27. doi: 10.1146/annurev.clinpsy.3.022806.091432 Kemp, P., & Turnbull, A. P. (2014). Coaching with parents in early intervention: An interdisciplinary research synthesis. Infants & Young Children, 27(4), 305-324. doi:10.1097/IYC.0000000000000018 Kenward, M. G., & Roger, J. H. (1997). Small sample inference for fixed effects from restricted maximum likelihood. Biometrics, 53(3), 983-997. Keselman, H. J., Algina, J., Kowalchuk, R. K. & Wolfinger, R. D. (1998). A comparison of two approaches for selecting covariance structures in the analysis of repeated measurements. Communications in Statistics - Simulation and Computation, 27(3), 591-604. doi: 10.1080/03610919808813497 Knoche, L. L., Kuhn, M., & Eum, J. (2013). “More time. More showing. More helping. That's how it sticks”: The perspectives of early childhood coachees. Infants & Young Children, 26(4), 349-365. doi: 10.1097/IYC.0b013e3182a21935 119 Knoche, L. L., Sheridan, S. M., Edwards, C. P., & Osborn, A. Q. (2010). Implementation of a relationship-based school readiness intervention: A multidimensional approach to fidelity measurement for early childhood. Early Childhood Research Quarterly, 25(3), 299-313. doi: 10.1016/j.ecresq.2009.05.003 Knowles, M. S., Holton, E. F., & Swanson, R. A. (2005). The adult learner: the definitive classic in adult education and human resource development (6th ed.). London, England: Elsevier. Knudsen, E. I. (2004). Sensitive periods in the development of the brain and behavior. Journal of Cognitive Neuroscience, 16(8), 1412-1425. doi:10.1162/0898929042304796 Koenker, R. (1981). A note on studentizing a test for heteroscedasticity. Journal of Econometrics, 17(1), 107-112. doi: 10.1016/0304-4076(81)90062-2. Koh, S., & Neuman, S. B. (2009). The impact of professional development in family child care: A practice-based approach. Early Education and Development, 20(3), 537-562. doi: 10.1080/10409280902908841 Kratochwill, T. R., & Shernoff, E. (2004). Evidence-based practice: Promoting evidence-based interventions in school psychology. School Psychology Quarterly, 18(1), 389-408. doi:10.1521/scpq.18.4.389.27000 Kreft, I. G. G. (1996). Are multilevel techniques necessary? An overview, including simulation studies. Unpublished manuscript, California State University, Los Angeles. Lane, K. L., Bocian, K. M., MacMillan, D. L., & Gresham, F. M. (2004). Treatment integrity: An essential—but often forgotten—component of school-based interventions. Preventing School Failure: Alternative Education for Children and Youth, 48(3), 36-43. doi: 10.3200/PSFL.48.3.36-43 Liaupsin, C. J., Ferro, J. B., & Umbreit, J. (2012). Treatment integrity in intervention research: Models, measures, and future directions. (pp. 301-322) Emerald Group Publishing Limited. doi:10.1108/S0735-004X(2012)0000025015 Lyon, A. R., & Budd, K. S. (2010). A community mental health implementation of parent-child interaction therapy (PCIT). Journal of Child and Family Studies, 19(5), 654-668. doi: 10.1007/s10826-010-9353-z Magill-Evans, J., & Harrison, M. J. (2001). Parent-child interactions, parenting stress, and developmental outcomes at 4 years. Children's Health Care, 30(2), 135-150. doi: 10.1207/S15326888CHC3002_4 Mahoney, G., & Wheeden, C. A. (1997). Parent-child interaction—The foundation for family- centered early intervention practice: A response to Baird and Peterson. Topics in Early Childhood Special Education, 17(2), 165-184. 120 Mandell, D., & Lecavalier, L. (2014). 