ADVANCEMENTS IN APPLIED BEHAVIOR ANALYSIS SERVICE DELIVERY, SUPERVISION, AND FEEDBACK

By

Emma Thomas

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

Special Education – Doctor of Philosophy

2022

ABSTRACT

Supervision is critical to the field of Applied Behavior Analysis (ABA) because it improves the quality of services provided to the recipients of behavioral services, which in turn increases client protection and helps portray the field of ABA as one that is committed to socially significant behavior change (Britton & Cicoria, 2019; Brodhead & Higbee, 2012; Hartley et al., 2016; LeBlanc & Luiselli, 2016; LeBlanc et al., 2020; Turner et al., 2016). Without effective supervision, the quality of services may decrease and negatively impact treatment outcomes for clients (Britton & Cicoria, 2019; Dixon et al., 2016; Eikeseth, 2009; LeBlanc & Luiselli, 2016; Shapiro & Kazemi, 2017). Given the rapid growth of the field of ABA, supervision will continue to play a critical role in training, fostering the growth and development of professionals, and ensuring those professionals uphold the high standards of the profession (Hajiaghamohseni et al., 2020; Turner et al., 2016; Turner, 2017). The current dissertation addressed and evaluated supervision of behavior analytic services provided to individuals with autism spectrum disorder (ASD) in three different contexts: (a) supervision provided during the implementation of behavioral interventions, (b) supervision provided via Telehealth, specifically evaluating barriers and the strategies used to address and/or mitigate those barriers, and (c) supervision provided via Telehealth in the form of email performance-based feedback.
Collectively, these chapters sought to address gaps in the current behavior analytic supervision literature and identify additional areas of study. Chapter 1 provides an overall introduction connecting the three separate, but related, chapters (Chapters 2-4) and includes a supervision logic model. Chapter 2 is a systematic literature review that evaluated the extent to which recently published articles included information regarding supervision and staff training of the individuals implementing behavioral interventions with young children with ASD. The results of Chapter 2 revealed that there is little to no consensus on reporting supervision and staff training characteristics in the current behavioral intervention literature. Research implications and reporting recommendations will be discussed. Chapter 3 is a survey study that evaluated the barriers Board Certified Behavior Analysts (BCBAs) experienced and the strategies BCBAs used to address and/or mitigate the barriers that arose when providing supervision via Telehealth. The results of Chapter 3 revealed that BCBAs who provide supervision via Telehealth are not exempt from experiencing barriers. Research and practical implications will be discussed. Chapter 4 is a single-case research design study that evaluated the extent to which email performance-based feedback increased the procedural fidelity of teacher candidates’ implementation of a multiple stimulus without replacement (MSWO) preference assessment. The results revealed that email performance-based feedback was effective in increasing procedural fidelity of MSWO preference assessment implementation. These results support previous literature suggesting that email performance-based feedback alone is effective in increasing target behavior(s). Research implications will be discussed.
Finally, Chapter 5 provides an overall discussion of the findings of the three chapters (Chapters 2-4), recommendations for research and practice, and considerations for the future direction of supervision in the field of ABA.

ACKNOWLEDGEMENTS

First and foremost, I would like to thank my PhD advisor and dissertation committee chair, Dr. Matthew Brodhead. His thoughtfulness, expertise, and dedication throughout my time at Michigan State University have made a profound, positive impact on my life. I would also like to thank my dissertation committee members, Dr. Emily Bouck, Dr. Amy Drahota, and Dr. Joshua Plavnick. Their feedback and support throughout the dissertation process have been invaluable. I would like to thank my parents for their continuous love and support, always having open ears and genuinely asking about what I am working on and/or conducting research on, though it may not have always made sense! I would also like to thank my incredible husband for supporting us throughout this entire journey; without you, this would not have been possible. The sacrifices you made to be by my side every step of the way have allowed me to get to where I am today. To our dog, Chico, thank you for reminding me how important it is to take breaks, to relax, and to play. To my siblings, thank you for always believing in me and supporting me. To my nieces and nephew, thank you for providing me with laughter and joy and for your curiosity and questions about what I do each day in school and why auntie has been in school for so long. To the rest of my family, near and far, thank you for the calls and check-ins; your support has meant the world to me. Finally, I would like to thank everyone who helped me conduct the three studies in this dissertation; without you this dissertation could not have happened.
Thank you, Suzanne Hemwall, Kassidi Krzykwa, Gracie Medlin, Isaac Melanson, David Ray Miranda, Brianna Smith, Alexandria Thomas, Ashley Walker, Sichao Wang, and Allison White.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
KEY TO ABBREVIATIONS
CHAPTER 1 Introduction
    Supervision Literature
        Supervision Logic Model
            When is Supervision Provided?
            How is Supervision Provided?
            What Does Supervision Consist of?
        Purpose of the Present Dissertation
APPENDIX
REFERENCES
CHAPTER 2 A Systematic Literature Review of Supervision and Staff Training within Behavioral Intervention Research with Children with Autism Spectrum Disorder
    Method
        Extensive Literature Search and Inclusion Criteria (Step 1)
        Abstract Screening (Step 2)
        Full Article Screening (Step 3)
        In-Depth Review (Step 4)
        Inter-Rater Reliability
        Data Analysis
    Results
        Comprehensive Intervention Articles
            Overall Participant Diagnoses
            Participants with ASD Characteristics
            Settings
            Duration and Dosage of Interventions
            Treatment Fidelity
            Direct Intervention Staff
            Supervision
                Supervisor Staff
                Lead Supervisor Staff
                Supervision Meeting Characteristics
            Staff Training
        Skill-Based Intervention Articles
            Overall Participant Diagnoses
            Participants with ASD Characteristics
            Settings
            Duration and Dosage of Interventions
            Treatment Fidelity
            Direct Intervention Staff
            Supervision
                Supervisor Staff
                Lead Supervisor Staff
                Supervision Meeting Characteristics
            Staff Training
    Discussion
        Supervision and Staff Training
        Participant Characteristics
        Settings
        Research on Behavioral Interventions
        Reporting Recommendations
        Limitations
APPENDIX
REFERENCES
CHAPTER 3 A Survey of Barriers Behavior Analysts Experience While Providing Supervision Via Telehealth
    Method
        Participants
            Inclusion Criteria
        Materials
        Procedure
        Data Analysis
    Results
        Respondent Demographics
        Supervision Load and Supervision Meeting Logistics
        Experiences When Providing Supervision Via Telehealth
        Barriers Experienced
        Strategies Used to Address and/or Mitigate Barriers Experienced
    Discussion
        Training on How to Provide Supervision Via Telehealth
        Number of Years as a BCBA
        Practical Implications
            Individual Level
        Limitations
APPENDIX
REFERENCES
CHAPTER 4 An Evaluation of Email Performance-Based Feedback on Teacher Candidates’ Multiple Stimulus Without Replacement Preference Assessment Implementation
    Method
        Participants
        Confederate
        Primary Data Collector
        Setting and Materials
            Participant Materials
            Primary Researcher/Confederate Materials
                MSWO Training Video
                MSWO Excerpt
                Session Script
                Confederate Response Data Sheet
            Primary Data Collector Materials
        Measurement
            Dependent Variable
            Interobserver Agreement
            Primary Researcher/Confederate Procedural Fidelity
        Experimental Design
        Procedure
            Initial Research Meeting and Training
            Baseline
            Intervention
        Data Analysis
    Results
        Participant Procedural Fidelity
            Nonconcurrent Set of Participants
                Riley
                Olivia
                Ava
            Concurrent Set of Participants
                Layla
                Ellie
                Kennedy
        Tau-U
    Discussion
        Extension of Previous Literature
        Limitations
APPENDICES
    APPENDIX A Tables 4.1 - 4.2 and Figures 4.1 - 4.3
    APPENDIX B Multiple Stimulus Without Replacement (MSWO) Data Sheet
    APPENDIX C Multiple Stimulus Without Replacement (MSWO) Excerpt from DeLeon & Iwata (1996)
    APPENDIX D Session Script
    APPENDIX E Multiple Stimulus Without Replacement (MSWO) Confederate Response Data Sheet Example
    APPENDIX F Multiple Stimulus Without Replacement (MSWO) Procedural Fidelity Data Sheet Page 1
    APPENDIX G Researcher Procedural Fidelity Data Sheet Page 1
    APPENDIX H Baseline Email Example
    APPENDIX I Intervention Email Example
REFERENCES
CHAPTER 5 Discussion
    Supervision Process
        When is Supervision Provided?
            Research Recommendations
        How is Supervision Provided?
            Research Recommendations
            Practice Recommendations
        What Does Supervision Consist of?
            Research Recommendations
        Supervision in ABA
REFERENCES

LIST OF TABLES

Table 2.1. Journal Distribution of Articles
Table 2.2. Article Distribution by Focus Category and Research Design
Table 2.3. Percentage of Articles that Reported on Supervision and Staff Training Variables
Table 3.1. Demographics of Survey Participants
Table 3.2. Demographic Comparison of Survey Respondents with BACB Data
Table 3.3. Supervision Specific Demographics of Survey Participants
Table 3.4. Supervision Load and Meeting Logistics
Table 3.5. Frequency of Individuals Typically Present/Available During Supervision Meetings
Table 3.6. Modality Used for Supervision Meetings
Table 3.7. Pairwise Comparisons for Supervisee Barriers
Table 3.8. Pairwise Comparisons for Supervisor Barriers
Table 3.9. Strategies Used to Address and/or Mitigate Supervisee Barriers
Table 3.10. Most Frequent Strategy Used to Address and/or Mitigate Each Supervisee Barrier
Table 3.11. Strategies Used to Address and/or Mitigate Supervisor Barriers
Table 3.12. Most Frequent Strategy Used to Address and/or Mitigate Each Supervisor Barrier
Table 3.13. Recommendations to Address and/or Mitigate Supervisee and Supervisor Barriers
Table 4.1. Confederate Responses During MSWO Research Sessions
Table 4.2. Average IOA Scores for Each Participant Across the Two Conditions

LIST OF FIGURES

Figure 1.1. Supervision Logic Model
Figure 2.1. Article Distribution by Publication Year
Figure 2.2. Intervention Setting for Comprehensive Intervention Articles
Figure 2.3. Intervention Setting for Skill-Based Intervention Articles
Figure 3.1. Percentage of Each Supervisee Barrier Experienced Across Participants
Figure 3.2. Number of Supervisee Versus Supervisor Barriers Experienced Across Participants
Figure 3.3. Percentage of Each Supervisor Barrier Experienced Across Participants
Figure 4.1. Percentage of Correct Implementation for Three Participants (Riley, Olivia, and Amy) Across Conditions
Figure 4.2. Percentage of Correct Implementation for Three Participants (Riley, Olivia, and Amy) Across Conditions Depicting the Nonconcurrent Session Schedule
Figure 4.3. Percentage of Correct Implementation for Three Participants (Layla, Ellie, and Kennedy) Across Conditions

KEY TO ABBREVIATIONS

ABA: Applied Behavior Analysis/Applied Behavior Analytic
ADHD: Attention Deficit Hyperactivity Disorder
ASAP: Advancing Social-Communication and Play
ASD: Autism Spectrum Disorder
BACB: Behavior Analyst Certification Board
BCaBA: Board Certified Assistant Behavior Analyst
BCBA: Board Certified Behavior Analyst
BCBA-D: Board Certified Behavior Analyst with a doctoral designation
EIBI: Early Intensive Behavioral Intervention
ESDM: Early Start Denver Model
FERPA: Family Educational Rights and Privacy Act
HIPAA: Health Insurance Portability and Accountability Act
IOA: Interobserver Agreement
IRR: Inter-Rater Reliability
MSWO: Multiple Stimulus Without Replacement
PDD-NOS: Pervasive Developmental Disorder - Not Otherwise Specified
PRT: Pivotal Response Training
RBT: Registered Behavior Technician
SLP: Speech Language Pathologist

CHAPTER 1

Introduction

Applied Behavior Analysis (ABA) is a scientific approach that focuses on understanding and improving socially
important human behaviors through the application and analysis of behavioral principles (Cooper et al., 2020). ABA interventions, also known as behavioral interventions, have been found to improve intellectual and social functioning, language development, and daily living skills in children with autism spectrum disorder (ASD; LeBlanc & Gillis, 2012; National Autism Center, 2015; Virués-Ortega, 2010). Behavioral interventions involve observing, measuring, and analyzing human behavior in order to change the behavior of interest (Baer et al., 1968). Supervision and training of the individuals implementing those interventions are required for behavioral interventions to be most effective (Shapiro & Kazemi, 2017). Within the field of ABA, supervision is defined as “improving and maintaining the behavior-analytic, professional and ethical repertoires of the supervisee and facilitating the delivery of high-quality behavior analytic services to the supervisee’s clients” (BACB, 2018).1 Individuals who provide supervision under the umbrella of this definition (i.e., Board Certified Behavior Analysts [BCBAs], Board Certified Assistant Behavior Analysts [BCaBAs]) must be credentialed through the Behavior Analyst Certification Board (BACB). The BACB was established in 1998 and is a nonprofit organization that provides and oversees behavior analyst certifications, establishes practice standards (e.g., supervision standards), administers credential examinations, and describes ethics requirements (BACB, 2021). The BACB was created to provide a credential that identifies an individual as a qualified behavior analyst (i.e., the individual has met standards established for the profession and has a minimal level of competency) and to increase the quality and the amount of behavior analytic services provided to clients (Shook, 1993; Shook & Favell, 2008). Additionally, the creation of the BACB has provided the field of ABA the ability to receive funding from government agencies and health insurance plans for services provided, as many funding sources may not fund individuals who are not credentialed (Green & Johnston, 2009).

1 Throughout this dissertation, the term client is used to refer to the recipient of behavioral services to remain consistent with BACB terminology. However, it is recognized that this term has medical connotations and may not fully capture the broad range of individuals who often receive and benefit from behavioral services, such as special education or general education students.

Typically, oversight of ABA interventions is provided by a supervisor (e.g., a BCBA). A supervisor is defined as someone who oversees individuals who are providing behavior analytic services and/or individuals who are accruing fieldwork hours to become certified (BACB, n.d.). The supervised individual is commonly referred to as the supervisee (e.g., registered behavior technician [RBT], BCaBA, trainee). A supervisee is defined as “any individual whose behavioral service delivery is overseen by a behavior analyst within the context of a defined, agreed upon relationship” (BACB, 2020). The BACB indicates that there are five main purposes of supervision: (a) guide the supervisee in order to increase the quality of services provided to the client, (b) facilitate the improvement and maintenance of the supervisee’s behavior analytic skills, (c) develop the supervisee’s professional and ethical behavior analytic skills, (d) teach the supervisee conceptual skills (e.g., decision making, problem solving skills), and (e) model high quality and effective supervision practices to the supervisee (BACB, 2019).
The BACB also indicates there are nine components that make up effective supervision and supervisory practices: (a) continuous monitoring of the supervisee, (b) informing the supervisee of performance expectations, (c) using behavioral skills training, (d) observing the supervisee with clients and providing feedback, (e) modeling professional and ethical behavior, (f) guiding the development of behavioral case conceptualization, (g) reviewing written materials the supervisee has developed, (h) overseeing the supervisee’s delivery of services, and (i) continuously evaluating the effects of supervision (BACB, 2021). Indeed, providing effective supervision is critical in the field of ABA, as it ensures quality control and improves the quality of services provided to recipients of behavioral services (Britton & Cicoria, 2019; LeBlanc & Luiselli, 2016; LeBlanc et al., 2020; Turner et al., 2016). Additionally, effective supervision practices directly benefit the organization in which those services are provided because they improve the quality of behavioral services available to all clients and reduce the potential for consumer harm and mistreatment (Brodhead & Higbee, 2012; Hartley et al., 2016). Without appropriate supervision of behavioral services, the quality of those services may decrease and, as a result, may negatively impact treatment outcomes (Britton & Cicoria, 2019; Dixon et al., 2016; Eikeseth, 2009; LeBlanc & Luiselli, 2016; Shapiro & Kazemi, 2017). In addition to the positive effects that quality supervisory practices have on the recipients of behavioral services, the supervisor and supervisee also benefit from the supervisory relationship. The supervisee benefits from having a model (i.e., the supervisor) who demonstrates necessary behavior analytic skills, assistance in making decisions, and mentorship and guidance for how to improve their performance (LeBlanc et al., 2020).
Additionally, the supervisee benefits from a supervisory relationship because interactions with their supervisor will help them develop stronger interpersonal skills and gain appropriate competencies to create socially significant behavior change with clients (Brodhead et al., 2018; LeBlanc et al., 2012; LeBlanc & Luiselli, 2016; Sellers et al., 2016b; Turner et al., 2016). The supervisor is also a beneficiary of the supervisory relationship. For example, the supervisor benefits from learning new skills (e.g., how to train a supervisee). Additionally, the supervisor benefits because interactions with the supervisee will help them develop stronger communication and interpersonal skills (LeBlanc et al., 2020). Furthermore, by frequently discussing ABA topics with the supervisee, the supervisor will develop a more in-depth understanding of those topics and benefit from the intellectual stimulation. Finally, with each supervisee they oversee, supervisors expand their professional network and contribute to the field of ABA by training future practitioners (Britton & Cicoria, 2019; LeBlanc et al., 2020). In conclusion, supervision improves quality of care and consumer protection for clients, and both supervisors and supervisees benefit from quality supervisory practices.

Supervision Literature

Supervision has been a critical component in a variety of human service fields for centuries. The field of education has employed supervision strategies since the 1600s (Burnham, 1976; Tracy, 1995). The fields of psychology, counseling, and social work have emphasized the importance of supervision since the early 1900s (Leddick & Bernard, 1980; Tsui, 1997; Wheeler & Cushway, 2012). Compared to the aforementioned fields, the field of ABA is young, and several of its areas (e.g., supervision) are underdeveloped and need to be further evaluated.
In response, researchers have begun to increase the availability and publication of supervision literature within the field of ABA. Examples include an increase in supervision books (e.g., Britton & Cicoria, 2019; Kazemi et al., 2019; LeBlanc et al., 2020) and supervision articles (e.g., Garza et al., 2018; Hajiaghamohseni et al., 2020; Sellers et al., 2019; Simmons et al., 2021). Additionally, in 2016, a special issue on conducting high quality supervision was published in the journal Behavior Analysis in Practice (see for examples: Dixon et al., 2016; Hartley et al., 2016; Sellers et al., 2016a; Sellers et al., 2016b; Sellers et al., 2016c; Turner et al., 2016; Valentino et al., 2016). Following the special issue, the number of articles published on the topic of supervision within the field of ABA increased from two in 2015 to eight in 2016, and seven between 2018 and 2019 (Valentino, 2021). Despite this increase in the supervision literature, there remain several areas in which supervision within ABA can be further evaluated and improved.

Supervision Logic Model

A supervision logic model was developed to depict the supervision process within the field of ABA (see Figure 1.1). The logic model is based on a systems approach in which an organization or process is comprised of interrelated parts (Rummler & Brache, 2013). In this logic model, the supervision process is conceptualized as comprising three main components: (a) when supervision is provided, (b) how supervision is provided, and (c) what supervision consists of. These three components are incorporated into the supervision that the supervisor provides to the supervisee, who then provides ABA services to clients.
Throughout this process, the supervisee may provide feedback to the supervisor, such as through a supervisor evaluation form, indicating what aspects of the supervision process are going well and what aspects may need further adjustment or improvement. Additionally, the client’s data provide information about the effectiveness of the supervision. Using the feedback from the supervisee and the client’s data, the supervisor may make any necessary changes to the supervision components to further improve the delivery of behavioral services. The three main components within the supervision process are discussed in more detail below.

When is Supervision Provided?

Within the field of ABA, individuals are supervised when implementing behavioral interventions with individuals with ASD, conducting behavior analytic assessments, and developing and/or selecting behavior-change procedures to be implemented with a client. Though supervision of all activities is critical, the present dissertation will focus on supervision during the implementation of behavioral interventions. Evidence suggests that supervision of behavioral interventions has a positive impact on treatment outcomes (Dixon et al., 2016; Shapiro & Kazemi, 2017), and research has found that the amount of supervision is positively correlated with stronger child outcomes (Eikeseth et al., 2009). Furthermore, supervision can promote the supervisee’s and supervisor’s professional development and can help the field of ABA by developing future practitioners who can create successful and socially significant behavior change (Brodhead et al., 2018; LeBlanc et al., 2012; LeBlanc & Luiselli, 2016; Sellers et al., 2016b; Turner et al., 2016). Despite the benefits of supervision, there is limited research investigating the role of supervision during the implementation of behavioral interventions (Dixon et al., 2016).
In 2014, Romanczyk and colleagues conducted a systematic review evaluating the research base on behavioral interventions, including information regarding the supervision provided during the implementation of ABA interventions. Romanczyk and colleagues (2014) found that information regarding the supervision process was not consistently reported in the research literature. Given the limited research and lack of reporting on the supervision process during the implementation of behavioral interventions, it is unclear to what extent recently published articles (i.e., within the last eight years) include information regarding the supervision process of the behavioral interventions being evaluated.

How is Supervision Provided?

Supervision can be provided through a variety of modalities, including in person (i.e., face-to-face), remotely via Telehealth (i.e., two-way audio-video communication), or a combination of the two (i.e., hybrid). Face-to-face supervision can be provided in a variety of settings such as schools, clinics, clients’ homes, and university-based centers. During face-to-face supervision, both the supervisor and the supervisee are present in the same physical location. Alternatively, remote supervision, or supervision provided via Telehealth, does not require the supervisor and supervisee to be in the same physical location (Turner et al., 2016). Hybrid supervision consists of a combination of face-to-face and remote supervision at separate times. Though all three modalities are important, the present dissertation will focus on supervision provided via Telehealth. Telehealth is defined as “the use of electronic information and telecommunication technologies to support long-distance clinical health care, patient and professional health-related education, public health, and health administration” (American Telemedicine Association, 2017).
Recently, there has been an increase in the application and evaluation of behavioral interventions for ASD treatment via Telehealth (Ferguson et al., 2019). Telehealth has been found to be an acceptable service delivery mechanism that reduces the costs associated with behavior analytic services (Ferguson et al., 2019; Horn et al., 2016; Lindgren et al., 2016). Additionally, Telehealth has led to increases in the procedural fidelity of individuals implementing behavioral interventions and to positive outcomes for individuals with ASD (Ferguson et al., 2019; Neely et al., 2016; Unholz-Bowden et al., 2020). Despite the benefits of Telehealth, the individuals implementing behavioral interventions must be adequately supervised for the intervention to be most effective (Shapiro & Kazemi, 2017). However, supervision, regardless of the modality, does not occur without barriers. Supervision barriers have been defined as anything that hinders the supervision of the supervisee and the quality of services provided to the supervisee’s clients. Though researchers have explored barriers supervisors experience during face-to-face interactions (Sellers et al., 2019), little is known about the barriers that occur when providing supervision via Telehealth. This is problematic because if barriers cannot be prevented, or strategies are not used to address and/or mitigate the barriers that arise, the quality of the supervision provided may suffer, which in turn may decrease the quality of ABA services (Sellers et al., 2019). Therefore, one specific gap in the literature is that it is unclear what barriers are experienced, and what strategies are used to address and/or mitigate them, when providing supervision via Telehealth.

What Does Supervision Consist of?
Supervision may consist of a variety of activities the supervisor engages in, such as observing, training, monitoring, and providing feedback to the supervisee (BACB, 2019; LeBlanc et al., 2016; LeBlanc et al., 2020). A supervisor will observe their supervisee’s performance when implementing behavioral interventions with clients (Paquet et al., 2017). Supervisors will also train the supervisee on behavior analytic skills and continuously monitor the skills of the supervisee throughout the supervision process (BACB, 2019). Finally, the supervisor will deliver performance-based feedback to the supervisee based on their implementation of the target skills (BACB, 2019). Though all supervision activities are critical to the overall success of the supervision process, the present dissertation will focus on providing feedback to the supervisee using email performance-based feedback. Performance-based feedback consists of the supervisor collecting data while observing a supervisee engage in an activity (e.g., implementing an intervention) and using those data to inform the feedback provided to the supervisee in order to change the supervisee’s behavior (Barton et al., 2016; Barton et al., 2020; Hemmeter et al., 2011). Several literature reviews have established performance-based feedback as an evidence-based practice (Cornelius & Nagro, 2014; Fallon et al., 2015). Additionally, performance-based feedback has been found to increase procedural fidelity of intervention implementation, which is a critical component of effective treatment and supervision (Codding et al., 2005; Cornelius & Nagro, 2014; Rosenberg & Huntington, 2021; Solomon et al., 2012). Performance-based feedback can be delivered in a variety of ways, such as in person through verbal or written forms (e.g., handwritten notes), or remotely through the use of technology (e.g., bug-in-ear devices, text messages, or emails; Barton & Wolery, 2007; Coogle et al., 2016; Hemmeter et al., 2011).
An emerging body of literature has evaluated the effects of email performance-based feedback in fields outside of behavior analysis, such as education (e.g., Hemmeter et al., 2011; Gomez et al., 2021; Gorton et al., 2021). However, that previous research has several limitations: (a) lack of control over the amount or quality of the email feedback, (b) confounding variables (e.g., providing participants with a training between baseline and intervention conditions), (c) influence of the students on the participants’ behavior, and (d) lack of replication. Therefore, Chapter 4 aims to address the aforementioned limitations by evaluating the effects of email feedback while controlling the form of feedback, isolating the effects of email feedback, and scripting confederate (student) behavior to decrease the likelihood that improvements in participant performance are a function of student behavior.

Purpose of the Present Dissertation

Given the gaps identified within the supervision research literature and the importance of supervision, it is critical that researchers evaluate effective supervision strategies to further improve outcomes for individuals with ASD. This three-part dissertation will address and evaluate supervision of behavior analytic services provided to individuals with ASD in three different contexts: (a) supervision provided during the implementation of behavioral interventions, (b) supervision provided via Telehealth, specifically evaluating barriers and the strategies used to address and/or mitigate those barriers, and (c) supervision provided via Telehealth in the form of email performance-based feedback. The next three chapters, Chapters 2-4, will provide more information about the background literature informing each study, along with the research questions, methodological descriptions, findings, and discussions for each study.
The final chapter, Chapter 5, will provide an overall discussion of the findings of the three studies, the implications for research and practice, and considerations for the future direction of supervision.

APPENDIX

Figure 1.1. Supervision Logic Model. [Figure depicting the supervision process: three supervision components feed into the supervision the supervisor provides to the supervisee, who delivers services to the client; supervisee feedback and client data flow back to the supervisor. When is supervision provided? Individuals are typically supervised when implementing behavioral interventions (Study 1), conducting behavior analytic assessments, and developing and/or selecting behavior-change procedures to be implemented with a client. How is supervision provided? Supervision can be provided in person (i.e., face-to-face), remotely via Telehealth (Study 2), or a combination of the two (i.e., hybrid). What does supervision consist of? Supervision may consist of a variety of activities the supervisor engages in, such as observing, training, monitoring, and providing feedback to the supervisee (Study 3).]

REFERENCES

American Telemedicine Association (2017). Telehealth: Defining 21st century care. https://www.americantelemed.org/resource/why-telemedicine/

Baer, D. M., Wolf, M. M., & Risley, T. R. (1968). Some current dimensions of applied behavior analysis. Journal of Applied Behavior Analysis, 1(1), 91-97. https://doi.org/10.1901/jaba.1968.1-91

Barton, E. E., Fuller, E. A., & Schnitz, A. (2016). The use of email to coach preservice early childhood teachers. Topics in Early Childhood Special Education, 36(2), 78-90. https://doi.org/10.1177/0271121415612728

Barton, E. E., Velez, M., Pokorski, E. A., & Domingo, M. (2020). The effects of email performance-based feedback delivered to teaching teams: A systematic replication. Journal of Early Intervention, 42(2), 143-162. https://doi.org/10.1177/1053815119872451

Barton, E. E., & Wolery, M. (2007).
Evaluation of e-mail feedback on the verbal behaviors of pre-service teachers. Journal of Early Intervention, 30(1), 55-72. https://doi.org/10.1177/105381510703000105

Behavior Analyst Certification Board. (2018). Standards for supervision of BCaBAs. https://www.bacb.com/wp-content/uploads/2020/05/Standards-for-Supervision-of-BCaBAs_180924.pdf

Behavior Analyst Certification Board. (2019). BCBA fieldwork requirements. https://www.bacb.com/wp-content/uploads/2020/05/2022-BCBA-Fieldwork-Requirements_200828.pdf

Behavior Analyst Certification Board. (2020). Ethics code for behavior analysts. https://www.bacb.com/wp-content/uploads/2022/01/Ethics-Code-for-Behavior-Analysts-220316-2.pdf

Behavior Analyst Certification Board. (2021). Board certified behavior analyst handbook. https://www.bacb.com/wp-content/uploads/2022/01/BCBAHandbook_220110.pdf#Experience%20Requirements

Behavior Analyst Certification Board. (n.d.). Supervision, assessment, training, and oversight. https://www.bacb.com/supervision-and-training/

Britton, L. N., & Cicoria, M. J. (2019). Remote fieldwork supervision for BCBA trainees. Academic Press.

Brodhead, M. T., & Higbee, T. S. (2012). Teaching and maintaining ethical behavior in a professional organization. Behavior Analysis in Practice, 5(2), 82-88. https://doi.org/10.1007/BF03391827

Brodhead, M. T., Quigley, S. P., & Wilczynski, S. M. (2018). A call for discussion about scope of competence in behavior analysis. Behavior Analysis in Practice, 11(4), 424-435. https://doi.org/10.1007/s40617-018-00303-8

Burnham, R. M. (1976). Instructional supervision: Past, present, and future perspectives. Theory Into Practice, 15(4), 301-305. https://www.jstor.org/stable/1476050

Codding, R. S., Feinberg, A. B., Dunn, E. K., & Pace, G. M. (2005). Effects of immediate performance feedback on implementation of behavior support plans. Journal of Applied Behavior Analysis, 38(2), 205-219. https://doi.org/10.1901/jaba.2005.98-04

Coogle, C. G., Rahn, N.
L., Ottley, J. R., & Storie, S. (2016). ECoaching across routines to enhance teachers’ use of modeling. Teacher Education and Special Education, 39(4), 227-245. https://doi.org/10.1177/0888406415621959

Cooper, J. O., Heron, T. E., & Heward, W. L. (2020). Applied behavior analysis (3rd ed.). Pearson Education.

Cornelius, K. E., & Nagro, S. A. (2014). Evaluating the evidence base of performance feedback in preservice special education teacher training. Teacher Education and Special Education, 37(2), 133-146. https://doi.org/10.1177/0888406414521837

Dixon, D. R., Linstead, E., Granpeesheh, D., Novack, M. N., French, R., Stevens, E., Stevens, L., & Powell, A. (2016). An evaluation of the impact of supervision intensity, supervisor qualifications, and caseload on outcomes in the treatment of autism spectrum disorder. Behavior Analysis in Practice, 9(4), 339-348. https://doi.org/10.1007/s40617-016-0132-1

Eikeseth, S., Hayward, D., Gale, C., Gitlesen, J., & Eldevik, S. (2009). Intensity of supervision and outcome for preschool aged children receiving early and intensive behavioral interventions: A preliminary study. Research in Autism Spectrum Disorders, 3(1), 67-73. https://doi.org/10.1016/j.rasd.2008.04.003

Fallon, L. M., Collier-Meek, M. A., Maggin, D. M., Sanetti, L. M., & Johnson, A. H. (2015). Is performance feedback for educators an evidence-based practice? A systematic review and evaluation based on single-case research. Exceptional Children, 81(2), 227-246. https://doi.org/10.1177/0014402914551738

Ferguson, J., Craig, E. A., & Dounavi, K. (2019). Telehealth as a model for providing behavior analytic interventions to individuals with autism spectrum disorder: A systematic review. Journal of Autism and Developmental Disorders, 49(2), 582-616. https://doi.org/10.1007/s10803-018-3724-5

Garza, K. L., McGee, H. M., Schenk, Y. A., & Wiskirchen, R. R. (2018).
Some tools for carrying out a proposed process for supervising experience hours for aspiring board certified behavior analysts. Behavior Analysis in Practice, 11(1), 62-70. https://doi.org/10.1007/s40617-017-0186-8

Gomez, L., Barton, E. E., Winchester, C., & Locchetta, B. (2021). Effects of email performance feedback on teachers’ use of play expansions. Journal of Early Intervention, 43(3), 235-254. https://doi.org/10.1177/1053815120969821

Gorton, K., Allday, R. A., Lane, J. D., & Ault, M. J. (2021). Effects of brief training plus electronic feedback on increasing quantity and intonation of behavior specific praise among preschool teachers. Journal of Behavioral Education. Advance online publication. https://doi.org/10.1007/s10864-020-09427-w

Green, G., & Johnston, J. M. (2009). A primer on professional credentialing: Introduction to invited commentaries on licensing behavior analysts. Behavior Analysis in Practice, 2(1), 51-52. https://doi.org/10.1007/bf03391737

Hajiaghamohseni, Z., Drasgow, E., & Wolfe, K. (2020). Supervision behaviors of board certified behavior analysts with trainees. Behavior Analysis in Practice, 14(1), 97-109. https://doi.org/10.1007/s40617-020-00492-1

Hartley, B. K., Courtney, W. T., Rosswurm, M., & LaMarca, V. J. (2016). The apprentice: An innovative approach to meet the behavior analysis certification board’s supervision standards. Behavior Analysis in Practice, 9(4), 329-338. https://doi.org/10.1007/s40617-016-0136-x

Hemmeter, M. L., Snyder, P., Kinder, K., & Artman, K. (2011). Impact of performance feedback delivered via electronic mail on preschool teachers’ use of descriptive praise. Early Childhood Research Quarterly, 26(1), 96-109. https://doi.org/10.1016/j.ecresq.2010.05.004

Horn, B. P., Barragan, G. N., Fore, C., & Bonham, C. A. (2016). A cost comparison of travel models and behavioural telemedicine for rural, Native American populations in New Mexico. Journal of Telemedicine and Telecare, 22(1), 47-55.
https://doi.org/10.1177/1357633X15587171

Kazemi, E., Rice, B., & Adzhyan, P. (2019). Fieldwork and supervision for behavior analysts: A handbook. Springer Publishing Company.

LeBlanc, L. A., & Gillis, J. M. (2012). Behavioral interventions for children with autism spectrum disorders. The Pediatric Clinics of North America, 59(1), 147-164. https://doi.org/10.1016/j.pcl.2011.10.006

LeBlanc, L. A., & Luiselli, J. K. (2016). Refining supervisory practices in the field of behavior analysis: Introduction to the special section on supervision. Behavior Analysis in Practice, 9(4), 271-273. https://doi.org/10.1007/s40617-016-0156-6

LeBlanc, L. A., Sellers, T. P., & Ala’i, S. (2020). Building and sustaining meaningful and effective relationships as a supervisor and mentor. Sloan Publishing.

Leddick, G. R., & Bernard, J. M. (1980). The history of supervision: A critical review. Counselor Education and Supervision, 19(3), 186-196. https://doi.org/10.1002/j.1556-6978.1980.tb00913.x

Lindgren, S., Wacker, D., Suess, A., Schieltz, K., Pelzel, K., Kopelman, T., Lee, J., Romani, P., & Waldron, D. (2016). Telehealth and autism: Treating challenging behavior at lower cost. Pediatrics, 137, S167-S175. https://doi.org/10.1542/peds.2015-28510

National Autism Center (2015). Findings and conclusions: National standards project, phase 2. http://www.nationalautismcenter.org/090605-2/

Neely, L., Rispoli, M., Gerow, S., & Hong, E. R. (2016). Preparing interventionists via telepractice in incidental teaching for children with autism. Journal of Behavioral Education, 25(4), 393-416. https://doi.org/10.1007/s10864-016-9250-7

Paquet, A., Dionne, C., Joly, J., Rousseau, M., & Rivard, M. (2017). Supervision of large-scale community-based early intensive behavioral intervention programs in Quebec: Description of practices. Journal on Developmental Disabilities, 23(1), 54-63. https://www.proquest.com/docview/1991893643

Romanczyk, R.
G., Callahan, E. H., Turner, L. B., & Cavalari, R. N. S. (2014). Efficacy of behavioral interventions for young children with autism spectrum disorders: Public policy, the evidence base, and implementation parameters. Review Journal of Autism and Developmental Disorders, 1(4), 276-326. https://doi.org/10.1007/s40489-014-0025-6

Rosenberg, N., & Huntington, R. N. (2021). Distance bug-in-ear coaching: A guide for practitioners. Behavior Analysis in Practice, 14, 523-533. https://doi.org/10.1007/s40617-020-00534-8

Rummler, G. A., & Brache, A. P. (2013). Improving performance: How to manage the white space on the organization chart (3rd ed.). John Wiley & Sons, Inc.

Sellers, T. P., Alai-Rosales, S., & MacDonald, R. P. F. (2016a). Taking full responsibility: The ethics of supervision in behavior analytic practice. Behavior Analysis in Practice, 9(4), 299-308. https://doi.org/10.1007/s40617-016-0144-x

Sellers, T. P., LeBlanc, L. A., & Valentino, A. L. (2016b). Recommendations for detecting and addressing barriers to successful supervision. Behavior Analysis in Practice, 9(4), 309-319. https://doi.org/10.1007/s40617-016-0142-z

Sellers, T. P., Valentino, A. L., & LeBlanc, L. A. (2016c). Recommended practices for individual supervision of aspiring behavior analysts. Behavior Analysis in Practice, 9(4), 274-286. https://doi.org/10.1007/s40617-016-0110-7

Sellers, T. P., Valentino, A. L., Landon, T. J., & Aiello, S. (2019). Board certified behavior analysts’ supervisory practices of trainees: Survey results and recommendations. Behavior Analysis in Practice, 12(3), 536-546. https://doi.org/10.1007/s40617-019-00367-0

Shapiro, M., & Kazemi, E. (2017). A review of training strategies to teach individuals implementation of behavioral interventions. Journal of Organizational Behavior Management, 37(1), 32-62. https://doi.org/10.1080/01608061.2016.1267066

Shook, G. L. (1993). The professional credential in behavior analysis. The Behavior Analyst, 16(1), 87-101.
https://doi.org/10.1007/bf03392614

Shook, G. L., & Favell, J. E. (2008). The behavior analyst certification board and the profession of behavior analysis. Behavior Analysis in Practice, 1(1), 44-48. https://doi.org/10.1007/bf03391720

Simmons, C. A., Ford, K. R., Salvatore, G. L., & Moretti, A. E. (2021). Acceptability and feasibility of virtual behavior analysis supervision. Behavior Analysis in Practice, 14(4), 927-943. https://doi.org/10.1007/s40617-021-00622-3

Solomon, B. G., Klein, S. A., & Politylo, B. C. (2012). The effect of performance feedback on teachers’ treatment integrity: A meta-analysis of the single-case literature. School Psychology Review, 41(2), 160-175. https://doi.org/10.1080/02796015.2012.12087518

Tracy, S. J. (1995). How historical concepts of supervision relate to supervisory practices today. The Clearing House, 68(5), 320-325. https://www.jstor.org/stable/30189094

Tsui, M. (1997). The roots of social work supervision. The Clinical Supervisor, 15(2), 191-198. https://doi.org/10.1300/j001v15n02_14

Turner, L. B., Fischer, A. J., & Luiselli, J. K. (2016). Towards a competency-based, ethical, and socially valid approach to the supervision of applied behavior analytic trainees. Behavior Analysis in Practice, 9(4), 287-298. https://doi.org/10.1007/s40617-016-0121-4

Unholz-Bowden, E., McComas, J. J., McMaster, K. L., Girtler, S. N., Kolb, R. L., & Shipchandler, A. (2020). Caregiver training via telehealth on behavioral procedures: A systematic review. Journal of Behavioral Education, 29(2), 246-281. https://doi.org/10.1007/s10864-020-09381-7

Valentino, A. L. (2021). Supervision and mentoring. In J. K. Luiselli, R. M. Gardner, F. L. Bird, & H. Maguire (Eds.), Organizational behavior management approaches for intellectual and developmental disabilities (pp. 141-164). Routledge.

Valentino, A. L., LeBlanc, L. A., & Sellers, T. P. (2016). The benefits of group supervision and a recommended structure for implementation.
Behavior Analysis in Practice, 9(4), 320-328. https://doi.org/10.1007/s40617-016-0138-8

Virués-Ortega, J. (2010). Applied behavior analytic intervention for autism in early childhood: Meta-analysis, meta-regression and dose-response meta-analysis of multiple outcomes. Clinical Psychology Review, 30(4), 387-399. https://doi.org/10.1016/j.cpr.2010.01.008

Wheeler, S., & Cushway, D. (2012). Supervision and clinical psychology: History and development. In I. Fleming & L. Steen (Eds.), Supervision and clinical psychology: Theory, practice and perspectives (pp. 11-22). Routledge.

CHAPTER 2

A Systematic Literature Review of Supervision and Staff Training within Behavioral Intervention Research with Children with Autism Spectrum Disorder

Autism spectrum disorder (ASD) is a developmental disability that affects approximately one in 54 children in the United States (Maenner et al., 2020). The current prevalence of ASD represents an increase over previous years: ASD affected one in 69 children in 2012, one in 88 in 2008, one in 125 in 2004, and one in 150 in 2000 (CDC, 2020). Individuals with ASD often display deficits in social communication and social interaction, repetitive behaviors, and restricted interests (American Psychiatric Association, 2013). Without intervention, severe impairments in social skills often persist through adulthood (Shattuck et al., 2007), serve as a major barrier to meaningful employment (e.g., Hendricks, 2010; Levy & Perry, 2011), and may result in the need for long-term care and support (Rogge & Janssen, 2019). One way to address the deficits in social communication and social interaction, repetitive behaviors, and restricted interests is to use behavioral interventions (Makrygianni & Reed, 2010). Behavioral interventions, or interventions based on Applied Behavior Analysis (ABA), are well-established and effective interventions for individuals with ASD (LeBlanc & Gillis, 2012; National Autism Center, 2015).
ABA interventions involve systematic procedures that focus on increasing or decreasing behaviors through observing, measuring, and analyzing human behavior (Baer et al., 1968). ABA, by definition, involves producing socially significant behavior change; examples include promoting independence and increasing language and social skills (Cooper et al., 2020). Early Intensive Behavioral Intervention (EIBI; see Reichow, 2012), the Early Start Denver Model (ESDM; see Vivanti & Stahmer, 2020), Discrete Trial Training (see Leaf et al., 2019), and video modeling (see Bellini & Akullian, 2007) are some examples of behavioral interventions that have been evaluated and used with individuals with ASD (Romanczyk et al., 2014). There are two general categories of behavioral interventions that may produce positive outcomes for children with ASD: comprehensive behavioral interventions and focused (also referred to as skill-based) behavioral interventions. Comprehensive interventions are designed to address multiple skills across the developmental domains (e.g., cognitive, communicative, social behaviors) of an individual with ASD (BACB, 2014). Comprehensive interventions can be delivered in various settings (e.g., home, school), typically consist of 30 to 40 hr of treatment per week, can be implemented in a one-on-one or small group format, and are typically implemented for an extended period of time (e.g., a year; BACB, 2014; Odom et al., 2010). Alternatively, skill-based interventions are typically designed to focus on one skill at a time or a related set of skills (e.g., self-care and personal hygiene; Wong et al., 2015). Similar to comprehensive interventions, skill-based interventions may be delivered in a variety of settings and can be implemented in an individual or small group format, but they are typically implemented for a shorter length of time (e.g., 1 month; BACB, 2014; Wong et al., 2015).
One critical component to evaluate and include within comprehensive and skill-based behavioral interventions is supervision of the delivery of those interventions. Within the field of ABA, supervision is defined as “improving and maintaining the behavior-analytic, professional and ethical repertoires of the supervisee and facilitating the delivery of high-quality behavior analytic services to the supervisee’s clients” (BACB, 2018). Supervision is typically provided by the supervisor staff (i.e., the individual providing direct supervision) to the direct intervention staff/supervisee (i.e., the individuals implementing the intervention). Depending on the organizational structure, the lead supervisor staff (i.e., the individuals serving as the lead supervisor) may oversee the entire supervision process. In behavioral interventions for children with ASD, supervision comprises several activities: observing and providing feedback to an individual teaching or engaging with a child, developing programming to be implemented with a child, analyzing observation data, and adjusting a child’s programming as needed (Paquet et al., 2017). Emerging evidence suggests supervision of behavioral interventions may affect treatment outcomes (Dixon et al., 2016; Shapiro & Kazemi, 2017). Specifically, preliminary research has found that intensity of supervision is positively correlated with stronger child outcomes (Eikeseth, 2009). Another important and related variable to evaluate within comprehensive and skill-based behavioral interventions is staff training. Training can be defined as “teaching an individual so as to make them fit, qualified, or proficient” (Merriam-Webster, n.d.). Staff training is critical in behavioral interventions for at least a few reasons. First, improper intervention implementation could have harmful effects (e.g., physical or psychological harm) on recipients of behavioral services (Carroll et al., 2013; Shapiro & Kazemi, 2017; St.
Peter et al., 2016). Second, staff training results in higher levels of procedural fidelity, which is the degree to which an intervention is implemented as intended (Gast & Ledford, 2014; Stahmer et al., 2015). Higher procedural fidelity generally produces stronger child outcomes (Groskreutz et al., 2011; Suess et al., 2014). Conversely, poor procedural fidelity often correlates with poor student outcomes (Allen & Warzak, 2000; DiGennaro Reed & Codding, 2014; Ledford & Gast, 2014; Symes et al., 2006). As a result, staff training that results in high procedural fidelity should be a necessary benchmark for professionals to reach when implementing behavioral interventions with children with ASD (Strain et al., 2021). Supervision and staff training are both critical elements that need to be evaluated in order to realize the benefits of comprehensive and skill-based behavioral interventions. In 2014, Romanczyk and colleagues conducted a systematic review evaluating the research on behavioral interventions for children with ASD, five years of age or younger, published from 2000-2013, which resulted in 144 articles that met their inclusion criteria. Of the 144 articles reviewed, Romanczyk et al. (2014) found 19 comprehensive studies and 125 skill-based intervention studies. Of the 19 comprehensive studies, 15 studies reported the supervisors’ professional qualifications; however, information regarding the amount of supervision was consistently not reported. Of the 125 skill-based intervention studies, only 28 studies reported the professional qualifications of the supervisors and only five studies provided information regarding the amount, or dosage, of supervision. Though Romanczyk and colleagues (2014) evaluated supervision variables within behavioral interventions, staff training variables were not evaluated in the review. The lack of reporting about supervisor qualifications and the lack of evaluation of staff training variables are problematic.
In order for ABA interventions to be effective, it is imperative that supervision is provided during the implementation of the behavioral intervention and that staff are trained to implement the intervention with fidelity (Shapiro & Kazemi, 2017; Strain et al., 2021). Despite the importance of providing supervision and staff training, it is unclear if recently conducted studies (i.e., within the last eight years) evaluating behavioral interventions with children with ASD between infancy and five years of age included information about supervision and staff training. Determining the current state of the research literature pertaining to both supervision and training will allow us to provide the field of ABA with information about gaps in the literature and how supervision and staff training can be improved, in order to improve the quality of research conducted and services provided to individuals with ASD. To understand the current state of the published research literature pertaining to supervision, a systematic review to update the results (2013-2020) of Romanczyk et al. (2014) was conducted. Also, to expand upon the results obtained by Romanczyk et al. (2014), the review evaluated the current nature of the research base using two additional variables: supervision and staff training. Specifically, the current review asked the following research question: To what extent does the published research literature report information about supervision and staff training in behavioral intervention research with children with ASD? Method This literature review replicated and expanded the method used in Romanczyk et al. (2014). It involved a four-step process: an extensive literature search (Step 1), an abstract screening (Step 2), a full article screening (Step 3), and an in-depth review (Step 4).
Throughout the literature review, the first author served as the primary researcher, and six graduate students served as research assistants under the supervision of the first author. Extensive Literature Search and Inclusion Criteria (Step 1) The extensive literature search was conducted using three databases: MEDLINE, PsycINFO, and ERIC. These were the same electronic databases used to collect literature for the Romanczyk et al. (2014) study. The following search terms were used in obtaining the articles: (1) “autis*” OR “ASD” OR “PDD-NOS” OR “asperger*” AND (2) “behavioral interventions” OR “behavior modification” OR “behavior therapy” OR “applied behavior analysis”. The date range was from April 2013 to January 2021. In addition, the search was limited to peer-reviewed articles in English. The initial round of the extensive literature search was conducted between March 7, 2020 and March 10, 2020, and it was updated on February 8, 2021 to include articles published between March 2020 and January 2021. The search yielded 7,907 articles to be included in the abstract screening. Abstract Screening (Step 2) Articles were screened for inclusion criteria by reviewing the abstracts of the 7,907 articles. The inclusion criteria for the abstract screening were: (a) the article was published between April 2013 and January 2021, (b) the article appeared in an English peer-reviewed/academic publication, (c) the article focused on an evaluation of a behavioral intervention, (d) the article had at least one participant with an ASD diagnosis consisting of either ASD, PDD-NOS, and/or Asperger’s, and (e) the article had at least one participant with an ASD diagnosis who was five years of age or younger. The criterion requiring at least one participant with an ASD diagnosis who was five years of age or younger was selected because it was the same age inclusion criterion used in the Romanczyk et al. (2014) study.
Based on the inclusion criteria, the primary researcher and four research assistants indicated whether each article met criteria in one of three ways: yes (i.e., the article met inclusion criteria), no (i.e., the article did not meet inclusion criteria), and unclear (i.e., the primary researcher and research assistants were unsure, based on the abstract alone, if the article met inclusion criteria or not). To complete the abstract screening, the primary researcher and research assistants used the programs Zotero to access the abstracts and Qualtrics to answer the survey questions. The Qualtrics survey was created by the primary researcher and contained ten questions. The first four questions consisted of the coder information and the article information: (1) coder name, (2) the article year, (3) the author name(s) of the article, and (4) the name of the journal the article was published in. Questions five through nine consisted of the inclusion criteria questions (listed above). Question 10 asked if the article met criteria for inclusion in Step 3: Screening. For this question, the primary researcher and research assistants could answer one of three ways: (1) Yes, if answers to questions 5-9 were “yes”, the article was considered to meet criteria and was included in Step 3: Screening, (2) No, if an answer to questions 5-9 was “no”, the article was considered not to meet criteria and was not included in Step 3: Screening, and (3) Unclear, if an answer to questions 5-9 was “unclear”, it was unclear whether the article met criteria based on the abstract alone, and the article was included in Step 3: Screening. Prior to research assistants beginning the abstract screening, the primary researcher provided a training and completed reliability checks.
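The Question 10 decision rule described above amounts to a simple aggregation over the five criteria answers. A minimal sketch in Python, assuming each answer is recorded as the string “yes”, “no”, or “unclear” (the function name and data representation are illustrative, not part of the study’s materials):

```python
# Hypothetical sketch of the Question 10 decision rule used in the abstract
# screening: five inclusion-criteria answers (Questions 5-9), each recorded
# as "yes", "no", or "unclear".

def screening_decision(answers):
    """Return the Question 10 outcome for one abstract.

    "no"      -> excluded from Step 3 (a criterion was not met)
    "unclear" -> included in Step 3 for a closer look
    "yes"     -> included in Step 3 (all criteria met)
    """
    # Assumption: a definite "no" on any criterion excludes the article
    # even if other answers are "unclear".
    if any(a == "no" for a in answers):
        return "no"
    if any(a == "unclear" for a in answers):
        return "unclear"
    return "yes"

print(screening_decision(["yes"] * 5))                              # yes
print(screening_decision(["yes", "unclear", "yes", "yes", "yes"]))  # unclear
print(screening_decision(["yes", "no", "unclear", "yes", "yes"]))   # no
```

Note that only a “no” answer excludes an article at this stage; “unclear” abstracts are deliberately carried forward so the full article screening can resolve them.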
The training consisted of a step-by-step guide of how to access the article abstracts on Zotero and the Qualtrics survey, a task analysis of the abstract screening process including a figure that indicated where the information needed to answer the questions on Qualtrics could be found, a table of definitions, and a demonstration of how to complete an abstract screening of an article. After the research assistants received the training, they were required to independently screen five articles by an agreed upon date. The primary researcher reviewed their answers and calculated reliability. If a research assistant received a 90% or greater average reliability score, they were considered to pass the reliability checks and were assigned articles to officially start screening. Four research assistants took part in the training and reliability checks. All four research assistants received a score of 90% or greater average reliability the first time. The abstract screening yielded 1,740 articles to be included in the full article screening. Full Article Screening (Step 3) Following the abstract screening, the 1,740 articles were reviewed, in full, to identify whether or not they met inclusion criteria. This screening was conducted as a layer of redundancy to ensure articles identified in the abstract screening actually met criteria for Step 4: In-Depth Review. To complete the full article screening, the primary researcher and research assistants used OneDrive to access the articles and Qualtrics to answer the survey questions. A Qualtrics survey that consisted of 20 questions was created by the primary researcher. The first four questions required coder and article information: (1) coder name, (2) the article year, (3) the authors of the article, and (4) the name of the journal the article was published in.
The next eight questions reflected the inclusion criteria for the full article screening: (5) the article evaluated a behavioral intervention method or technique, (6) used a group or single-subject research design, (7) appeared in an English peer-reviewed/academic publication, (8) provided original data about efficacy of an intervention method for ASD, (9) was a domestic or international based study, (10) provided an adequate description of the assessment or intervention methods evaluated, or provided a reference where such a description could be found, (11) had at least one participant with an ASD diagnosis consisting of either ASD, PDD-NOS, and/or Asperger’s, and (12) had at least one participant with an ASD diagnosis who was five years of age or younger. If an article did not report specific ages of each participant in the study and instead only reported a mean age or age range, the article did not meet inclusion criteria unless the article indicated all participants were five years of age or younger. Questions 13 through 18 involved the inclusion criteria questions that were relevant to the type of reported study design. If the article reported a group research design, the specific inclusion criteria included: (13) evaluated functional outcomes that were important to a child’s overall health or development or were important to family or society, (14) a controlled trial was used to evaluate a group receiving the intervention compared to a group receiving no intervention or a different intervention, (15) assigned participants to groups either randomly or using a method that did not appear to significantly bias results, and (16) used equivalent methods for measuring baseline participant characteristics and outcomes for all groups studied.
If the article reported a single-subject research design, additional inclusion criteria included: (17) reported on a functional outcome important for the child or the family (or some immediate outcome demonstrated to be related to a functional outcome), and (18) used an acceptable single-subject research design. If the article reported both a group research design and a single-subject research design, all of Questions 13 through 18 were answered. Question 19 asked what skill the intervention focused on. Once a skill was identified, the skill was placed into a focus category. There were 11 different focus categories to choose from that were identical to the categories used in Romanczyk et al. (2014): academic, behavior reduction, cognitive, communication, comprehensive, daily living, feeding, play, sleep, social, and toileting. Question 20 asked if the article met criteria for inclusion in Step 4: In-Depth Review. For this question, a researcher could answer one of three ways: (a) Yes, if answers to questions 6-15 were “yes”, the article was considered “adequate evidence for efficacy” and met criteria for Step 4; (b) Yes, if answers to questions 6-11 and 16-19 were “yes”, the article was considered “adequate evidence for efficacy” and met criteria for Step 4; and (c) No, if any answer to questions 6-19 was “no”, the article was not considered “adequate evidence for efficacy” and did not meet criteria for Step 4: In-Depth Review. Prior to research assistants beginning the full article screening, the primary researcher provided a training and completed reliability checks. The training consisted of a step-by-step guide of how to access the articles on OneDrive and the Qualtrics survey, a task analysis of the full article screening process, a table of definitions, and a demonstration of how to complete a full article screening of an article.
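The Question 20 decision rule combines the design-independent criteria with one (or both) of the design-specific criterion sets. A minimal sketch, assuming answers are recorded as “yes”/“no” strings and an unused design’s answers are simply absent (the names `core`, `group`, and `single` are illustrative, not the study’s own labels):

```python
# Hypothetical sketch of the Question 20 eligibility rule for the full
# article screening. `core` holds the design-independent criteria answers,
# `group` the group-design answers, and `single` the single-subject answers;
# a design the article did not use is passed as None.

def meets_step4(core, group=None, single=None):
    def all_yes(answers):
        return answers is not None and len(answers) > 0 and all(
            a == "yes" for a in answers
        )

    # The design-independent criteria must all be met.
    if not all_yes(core):
        return False
    # The article then qualifies through either (or both) design-specific paths.
    return all_yes(group) or all_yes(single)

print(meets_step4(["yes"] * 8, group=["yes"] * 4))     # group path met -> True
print(meets_step4(["yes"] * 8, single=["yes", "no"]))  # single path failed -> False
```

An article reporting both designs would pass both `group` and `single`, and qualifies if either path is fully “yes”, mirroring the two “Yes” options described above.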
After the research assistants received the training, they were required to independently screen five articles by an agreed upon date. Reliability was calculated in an identical manner as it was in the abstract screening. Four research assistants took part in the training and reliability checks. None of the four research assistants received a score of 90% or greater average reliability in the first round of reliability checks. The four research assistants then met with the primary researcher to review the answers to the first five reliability articles, the primary researcher demonstrated the process again, and then the four research assistants completed five additional article screenings. All four research assistants received a score of 90% or greater average reliability the second time. The full article screening yielded 475 articles to be included in the in-depth review data extraction. In-Depth Review (Step 4) Following the full article screening, the 475 articles were reviewed further for the in-depth review (Step 4), and 59 variables were collected in Qualtrics to obtain specific information about each article. It is important to note that the primary researcher was in contact with Dr. Romanczyk (first author of the Romanczyk et al., 2014 study) and received components of the original data sheets (i.e., worksheets) used in their study. The copies of available data sheets were used when developing the variables for the in-depth review. The variables collected during the in-depth review were based on the variables used in the Romanczyk et al. (2014) study (n = 21) and additional variables of interest created by the primary researcher of the current study (n = 38). The 59 variables were divided into four categories: article level variables, group level variables, supervision level variables, and staff training level variables (see Brodhead, 2022). The article and group level variables were categorized in a manner identical to that in Romanczyk et al. (2014).
The article level variables consisted of variables that applied to the entire article (e.g., study design, participant characteristics). The group level variables consisted of variables that applied specifically to the treatment group in the study (e.g., number of hours of intervention received). The supervision level variables consisted of variables that applied to the supervision provided during the implementation of the behavioral intervention (e.g., qualifications of the individual providing direct supervision, length of time of the supervision). The staff training level variables consisted of variables that applied to the training provided to individuals who implemented the behavioral intervention (e.g., description of the training, how much training was received). To complete the in-depth review for each article, the primary researcher and research assistants used Qualtrics and OneDrive. When the primary researcher and research assistants were assigned articles to review, they accessed the full article through OneDrive and then answered the questions on Qualtrics. Prior to research assistants beginning the in-depth review, the primary researcher provided a training and completed reliability checks. The training consisted of a step-by-step guide of how to access the articles on OneDrive and the Qualtrics survey, a task analysis of the in-depth review process, a table of definitions, and a demonstration of how to complete an in-depth review of an article. After the research assistants received the training, they were required to independently review five articles by an agreed upon date. Reliability was calculated in an identical manner as it was in the abstract and full article screening. Four research assistants took part in the training and reliability checks. All four research assistants received a score of 90% or greater average reliability the first time.
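Across all three steps, a research assistant passed the reliability check at an average score of 90% or greater over the five practice articles. A minimal sketch of that arithmetic, assuming reliability is computed as point-by-point percentage agreement between a trainee’s answers and the primary researcher’s (a hypothetical illustration, not the study’s actual scoring materials):

```python
# Hypothetical sketch of the training reliability check: per-article
# point-by-point agreement, averaged across the five practice articles
# and compared against the 90% pass criterion.

def point_by_point(coder_a, coder_b):
    # Percentage of items on which the two coders' answers match.
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100 * matches / len(coder_a)

def passes_reliability(article_scores, threshold=90.0):
    # Average the per-article scores and compare to the pass criterion.
    return sum(article_scores) / len(article_scores) >= threshold

print(point_by_point(["yes", "no", "yes", "yes"],
                     ["yes", "no", "no", "yes"]))          # 3 of 4 items -> 75.0
print(passes_reliability([100.0, 90.0, 80.0, 100.0, 100.0]))  # mean 94.0 -> True
```

The same item-level comparison underlies the inter-rater reliability percentages reported in the next section; only the sampling of articles differs.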
Inter-Rater Reliability Inter-rater reliability (IRR) was obtained by three research assistants who served as second reviewers for the abstract screening (Step 2), full article screening (Step 3), and in-depth review (Step 4). The results for each step were compared using a point-by-point method for obtaining IRR (Cooper et al., 2020). Given that each step consisted of multiple reviewers, the primary researcher ensured that IRR was collected on an equal number of articles across all reviewers. After identifying the number of articles needed for each reviewer, a random number generator was used to select the articles that the research assistants would review. IRR was collected on 5% of abstract screening articles, across five reviewers. The IRR for abstract screening was 96.5%. IRR was collected on 10% of full article screening articles, across five reviewers. The IRR for full article screening was 98.7%. Finally, IRR was collected on 15% of in-depth review articles, across two reviewers. The average IRR for in-depth review was 99.4% (range: 96.6% to 100%). Data Analysis Following the completion of the in-depth review (Step 4), variables were analyzed using descriptive statistics, specifically focusing on frequencies, percentages, and measures of central tendency. In addition, the results from the overlapping 21 variables (i.e., the 21 variables that both the present review and the Romanczyk et al. [2014] review coded for) were compared to the results of the Romanczyk et al. (2014) review to identify whether there were changes in how the literature reported information regarding behavioral interventions. Results Four hundred and seventy-five articles published between 2013 and 2020 met inclusion criteria for the present literature review. Of the 475 articles, the largest proportions were published in 2016 (n = 74, 15.6%) and 2014 (n = 69, 14.5%). Figure 2.1 displays the total number of articles that were published across all eight years (2013-2020).
Additionally, the articles were published in 64 different journals (Table 2.1), with the most published in the Journal of Applied Behavior Analysis (n = 129, 27.2%) and the Journal of Autism and Developmental Disorders (n = 38, 8.0%). Finally, 399 (84.0%) of the articles were conducted in the United States and 76 (16.0%) were conducted internationally (e.g., Ireland, Spain, Turkey). Across the 11 possible focus categories, the articles evaluated 10 of the focus categories (i.e., all categories except cognitive). Overall, the focus categories with the highest number of articles were communication (n = 165, 34.7%) and behavior reduction (n = 88, 18.5%), while sleep (n = 4, 0.8%) and toileting (n = 4, 0.8%) had the fewest articles. Within the 10 categories included, one of the categories (i.e., comprehensive) was characterized as meeting the criteria for comprehensive interventions, while the remaining nine categories (i.e., academic, behavior reduction, communication, daily living, feeding, play, sleep, social, and toileting) were characterized as skill-based interventions. Table 2.2 displays the article distribution by focus category and research design. Comprehensive Intervention Articles A total of 30 (6.3%) comprehensive intervention articles met the inclusion criteria for this review. Of the 30 articles, 10 (33.3%) used a single-subject design and 20 (66.7%) used a group design.
The behavioral interventions provided included Advancing Social-Communication and Play (ASAP; n = 1, 3.3%), Caregiver-Mediated Module (n = 1, 3.3%), Comprehensive Autism Program (n = 1, 3.3%), Early Start Denver Model (ESDM; n = 5, 16.7%), Joint Attention Mediated Learning (n = 1, 3.3%), Joint Attention Symbolic Play Engagement and Regulation (n = 5, 16.7%), Milieu Teaching (n = 1, 3.3%), Parent Child Interaction Therapy (n = 1, 3.3%), Pivotal Response Training (PRT; n = 2, 6.7%), Preschool Peer Social Intervention (n = 1, 3.3%), Prevent-Teach-Reinforce Model (n = 1, 3.3%), Promoting the Emergence of Advanced Knowledge Relational Training System (n = 1, 3.3%), Responsive Teaching (n = 1, 3.3%), Self-Management/Pivotal Response Treatment (n = 1, 3.3%), Superhero Social Skills Program (n = 5, 16.7%), Therapy Outcomes By You (n = 1, 3.3%), and Verbal Behavior Approach/Pivotal Response Treatment (n = 1, 3.3%). Within the articles, six (20.0%) identified the behavioral intervention provided as “comprehensive”. Overall Participant Diagnoses For single-subject design articles, there were 29 participants with ASD, two participants with PDD-NOS, one participant with ASD and mixed receptive-expressive language disorder, and two typically developing participants. For group design articles, there were 1,585 participants with ASD, 16 with PDD-NOS, six with Down Syndrome, four with intellectual disability, five with language delays, one with cerebral atrophy, and two with global developmental delay. Participants with ASD Characteristics Within the comprehensive intervention articles that used a single-subject design, the average number of participants that were diagnosed with ASD and were five years of age or younger was 1.9 participants (range: 1-3 participants), and the mean age of the participants was 54.2 months (range: 36-70 months).
Of the articles that reported the gender (n = 9, 90.0%), race (n = 7, 70.0%), and ethnicity (n = 7, 70.0%) of the participants, the majority of participants were male (88.9%), white (84.6%), and not Latinx, Hispanic, or of Spanish origin (92.3%). Alternatively, for group design articles, the average number of participants that were diagnosed with ASD and were five years of age or younger was 78.8 participants (range: 1-302 participants). Nineteen of the 20 group design articles did not report specific ages of the participants with ASD that were five years of age or younger; instead, the articles included the mean age and age range of all participants or of the participants with ASD. As a result, the mean age could not be calculated. Of the articles that reported participant gender (n = 16, 80.0%), three reported only the percentage or overall number of participants of each gender and did not provide specific participant gender characteristics. Of the articles that reported race (n = 14, 70.0%), five reported only the percentage or overall number of participants of each race and did not provide specific participant race characteristics. Finally, of the articles that reported ethnicity (n = 12, 60.0%), four reported only the percentage or overall number of participants of each ethnicity and did not provide specific participant ethnicity characteristics. Of the articles that reported specific participant characteristics, the majority were male (81.3%), white (46.7%), and not Latinx, Hispanic, or of Spanish origin (77.8%). Settings Twenty-eight (93.3%) of the total comprehensive intervention articles reported the setting in which the intervention was implemented. Settings were categorized using the same locations described in Romanczyk et al.
(2014) (i.e., community, day treatment center, home, hospital, inpatient unit, outpatient clinic, private intervention agency, research lab, school, university-based center, and workshop setting) when possible. However, additional categories were created to incorporate new settings that have emerged in research since the publication of Romanczyk et al.’s study. These categories included clinic, unspecified clinic, camp, recreation center, unspecified setting, and not reported. Article settings that met the clinic category included those that indicated what type of clinic the intervention was implemented in, such as an ABA clinic, EIBI clinic, or ASD clinic. Article settings that met the unspecified clinic category included those that stated that the intervention was implemented in a clinic but did not provide additional information regarding what type of clinic it was. Unspecified settings included articles that did not name the setting but provided some description (e.g., reported the intervention was implemented in a therapy room, but no additional information was provided). Finally, articles that did not report the setting of intervention implementation were categorized as not reported. Of the 28 articles that reported the setting the intervention was implemented in, seven (23.3%) articles reported that the study was conducted across multiple settings. Overall, the studies were conducted primarily in the home (n = 12, 40.0%), school (n = 11, 36.7%), clinic (n = 4, 13.3%), and a university-based center (n = 4, 13.3%). Figure 2.2 displays the total number of articles that implemented the intervention in each setting. Duration and Dosage of Interventions For single-subject design articles, the duration (length of the study) of the intervention was reported in eight (80.0%) articles and the intensity (hours per week) was reported in nine (90.0%) articles.
Three (30.0%) articles reported a range for the duration of the intervention and were not included in the overall average duration calculation below. Four (40.0%) articles reported partial components (e.g., number of sessions, length of sessions) but did not report all of the necessary components (i.e., number of sessions per day, number of sessions per week, length of sessions) to calculate the intensity of the intervention. Of the articles that reported all of the necessary components, the average length of the intervention was four months with a range of 1.25 months to 12 months, and the average intensity was 1.9 hr per week with a range of 30 min to 4 hr per week. For group design articles, the duration of the intervention and the intensity of the intervention were each reported in 16 (80.0%) articles. One (5.0%) article reported a range for the duration of the intervention and was not included in the overall average duration calculation below. Six (30.0%) of the articles reported partial components (e.g., number of sessions total) but did not report all of the necessary components (i.e., number of sessions per day, number of sessions per week, length of sessions) to calculate the intensity of the intervention. Of the articles that reported all of the necessary components, the average length of the intervention was 5.75 months with a range of two months to 27 months, and the average intensity was 3.8 hr per week with a range of 1 hr to 10 hr per week. Treatment Fidelity We asked the following two questions to evaluate treatment fidelity: (1) did the article report treatment fidelity and (2) if yes, did the article meet the minimum standards for treatment fidelity (i.e., treatment fidelity was collected for a minimum of 20% of sessions across conditions and the scores reached an acceptable level of 80% or better).
These minimum standards were selected based on treatment fidelity literature that reported these specific requirements as the current standards (e.g., Ganz & Ayres, 2018; Strain et al., 2021). Of the 10 comprehensive intervention articles that used a single-subject design, eight (80.0%) reported treatment fidelity. All eight met the minimum standards for treatment fidelity. Of the 20 comprehensive intervention articles that used a group design, 17 (85.0%) reported treatment fidelity. Of those 17 articles, only eight (47.1%) met the minimum standards for treatment fidelity. The articles that did not meet the minimum standards either did not provide enough information (e.g., the exact percentage of sessions treatment fidelity was collected on) or did not achieve the 80% level. Direct Intervention Staff The direct intervention staff (i.e., the individuals implementing the intervention) were reported for all 10 articles that used a single-subject design. However, the descriptions provided varied across articles. Two (20.0%) articles used generic terms such as research assistants and students. Zero articles indicated that more than one type of direct intervention staff implemented the intervention. Four (40.0%) articles reported that parents/caregivers of the participants implemented the interventions. Four (40.0%) articles reported the professional role or credentials of the direct intervention staff. The direct intervention staff who were identified by their professional role or credentials consisted of graduate students (n = 3) and a psychologist (n = 1). Though 10 articles reported who the direct intervention staff were, no studies reported direct intervention staff characteristics such as age, gender, race, and ethnicity. The direct intervention staff were reported for all 20 articles that used a group design. However, four (20.0%) articles used generic terms such as experimenter, therapist, and researcher.
Seven (35.0%) articles indicated more than one type of direct intervention staff implemented the intervention. Six (30.0%) articles reported parents/caregivers of the participants implemented the intervention. Eleven (55.0%) articles, including articles that reported multiple direct intervention staff, reported the professional role or credentials of the direct intervention staff, which included graduate students (n = 3), teachers (n = 5), a paraprofessional (n = 1), speech language pathologists (SLP; n = 2), an occupational therapist (n = 1), psychologists (n = 2), a BCBA (n = 1), a family therapist (n = 1), and an early childhood educator (n = 1). Though 20 articles reported who the direct intervention staff were, only eight (40.0%) reported direct intervention staff characteristics such as age, gender, race, and ethnicity. Of the eight articles that provided direct intervention staff characteristics, two (10.0%) only provided information regarding gender, one (5.0%) only provided information regarding age and gender, five (25.0%) only provided information regarding gender, race, and ethnicity, and one (5.0%) provided information on all four characteristics. Supervision Of the 10 total single-subject articles, only five (50.0%) articles reported information regarding supervision during implementation of the intervention. Alternatively, of the 20 group design articles, 17 (85.0%) articles reported information regarding supervision during implementation of the intervention. Supervisor Staff. The supervisor staff (i.e., the individuals providing direct supervision) were reported for five (50.0%) articles that used a single-subject design. Three (60.0%) articles used generic terms such as research assistants and researcher. Two (40.0%) articles indicated that more than one type of supervisor staff provided supervision.
Two (40.0%) articles, including articles that reported multiple supervisor staff, reported the professional role or credentials of the supervisor staff, which included a psychologist (n = 1), an SLP (n = 1), and a graduate student (n = 1). Though five articles reported who the supervisor staff were, no studies reported supervisor staff characteristics such as age, gender, race, and ethnicity. Of the 17 group design articles that reported information regarding supervision, eight (47.1%) articles used generic terms such as researchers and interventionists. One (5.9%) article indicated that more than one type of supervisor staff provided supervision. Nine (52.9%) articles reported the professional role or credentials of the supervisor staff, which included speech language pathologists (n = 2), graduate students (n = 2), a master's level professional (n = 1), ESDM trainers (n = 2), an ASAP coach (n = 1), an allied health professional (n = 1), an occupational therapist (n = 1), and a social worker (n = 1). Though 17 articles reported who the supervisor staff were, only three (17.6%) reported supervisor staff characteristics such as age, gender, race, and ethnicity. Of the three articles that provided supervisor staff characteristics, two (66.7%) only provided information regarding gender, race, and ethnicity, and one (33.3%) only provided information regarding race and ethnicity.

Lead Supervisor Staff. Of the five single-subject design articles that reported information regarding supervision, zero reported who the lead supervisor staff (i.e., the individuals serving as the lead supervisor) were, and as a result, zero reported lead supervisor staff characteristics such as age, gender, race, and ethnicity. Of the 17 group design articles that reported information regarding supervision, five (29.4%) reported who the lead supervisor staff were. Of those five articles, one (20.0%) article used generic terms such as first author.
One (20.0%) article indicated that more than one type of lead supervisor staff provided supervision. Four (80.0%) articles reported the professional role or credentials of the lead supervisor staff, which included master's level professionals (n = 3), a psychologist (n = 1), and a BCBA (n = 1). Though five articles reported who the lead supervisor staff were, only two (40.0%) reported lead supervisor staff characteristics such as age, gender, race, and ethnicity. Both articles that provided lead supervisor staff characteristics only provided information regarding gender, race, and ethnicity.

Supervision Meeting Characteristics. Of the five articles that used a single-subject design and reported on supervision, only two (40.0%) indicated the length of supervision, which ranged from 1 hr to 10 hr. Additionally, only two (40.0%) indicated the frequency of supervision, which was weekly for both articles. Only two (40.0%) articles reported the mode (e.g., in person, video) and the setting (e.g., home) the supervision was provided in. Of the 17 articles that used a group design and reported on supervision, only two (11.8%) indicated the length of supervision, which ranged from 30 min to 4 hr. However, nine (52.9%) indicated the frequency of supervision, which ranged from weekly to every three months. Finally, eight (47.1%) articles reported the mode (e.g., in person, video, phone) the supervision was provided in and only six (35.3%) reported the setting (e.g., classroom, home) the supervision was provided in.

Staff Training

Of the 10 comprehensive intervention articles that used a single-subject design, six (60.0%) articles reported on the training that was provided on the intervention. Of the articles that reported on training provided, four (66.7%) articles used generic terms such as primary researcher and research assistants to report who the trainer was.
One (16.7%) article reported the professional role or credentials of the trainers, which consisted of a speech language pathologist (n = 1). One (16.7%) article that indicated training was provided did not provide information about who the trainer was. Additionally, although five articles reported who the trainer was, zero reported trainer characteristics such as age, gender, race, and ethnicity. All six of the articles that reported training occurred reported who received the training, including parents/caregivers (n = 4), undergraduate students (n = 1), and a psychologist (n = 1). Additionally, all six articles provided a description of the training that was provided; however, only four (66.7%) indicated how much training was received (i.e., ranging from 30 min to 10 hr). Only three (50.0%) articles reported the mode (e.g., in person, video) and four (66.7%) reported the setting (e.g., university-based center, home) the training was provided in. Alternatively, of the 20 comprehensive intervention articles that used a group design, 19 (95.0%) articles reported on the training that was provided on the intervention. Of the articles that reported on training provided, 10 (52.6%) articles used generic terms such as researchers and experienced trainers to report who the trainer was. Six (31.6%) articles reported the professional role or credentials of the trainer, which consisted of an ASAP coach (n = 1), ESDM trainers (n = 2), a graduate student (n = 1), a master's level professional (n = 1), a psychologist (n = 1), a BCBA (n = 1), and a PRT clinician (n = 1). Three (15.8%) articles that indicated training was provided did not provide information about who the trainer was. Additionally, although 16 articles reported who the trainer was, only one (6.3%) article reported trainer characteristics such as age, gender, race, and ethnicity.
The one article that provided trainer characteristics only provided information regarding the ethnicity of the trainers, which was reported as eight trainers of non-Latinx, Hispanic, or of Spanish origin and one trainer of Latinx, Hispanic, or of Spanish origin. All 19 of the articles that reported training occurred also reported who received the training, including parents/caregivers (n = 7), therapists (n = 1), graduate students (n = 1), intervention staff (n = 1), teaching assistants (n = 1), clinicians (n = 1), teachers (n = 2), and multiple individuals (e.g., paraprofessionals, teachers, therapists; n = 4). Additionally, all 19 articles provided a description of the training that was provided and 16 (84.2%) indicated how much training was received, which ranged from 30 min to 32 hr. However, only 11 (57.9%) articles reported the mode (e.g., in person) and only seven (36.8%) reported the setting (e.g., school, homes) the training was provided in.

Skill-Based Intervention Articles

A total of 445 skill-based intervention articles met inclusion criteria for this review. Of those 445 articles, 435 used a single-subject design and 10 used a group design.

Overall Participant Diagnoses

Within the skill-based intervention articles, participant diagnoses were collected for all participants, regardless of age and diagnosis. For single-subject design articles, there were 1,354 participants with ASD, five participants with Down syndrome, 32 with developmental delay, 13 with Attention Deficit Hyperactivity Disorder (ADHD), 52 with PDD-NOS, 10 with speech language impairments, and nine with intellectual disability. Additional diagnoses consisted of disruptive behavior disorder, Rett syndrome, other health impairments, obsessive compulsive disorder, oppositional defiant disorder, Tourette syndrome, feeding disorders, and social pragmatic disorder. There were 90 participants that had dual or multiple diagnoses along with ASD.
Lastly, there were 51 participants that were typically developing. For group design articles, there were 240 participants with ASD.

Participants with ASD Characteristics

Within the skill-based intervention articles that used a single-subject design, the average number of participants who were diagnosed with ASD and were five years of age or younger was 2.39 participants (range: 1-15 participants). Seven (1.6%) of the single-subject articles did not report specific ages of the participants with ASD who were five years of age or younger; instead, the articles included the mean age and/or age range of all participants or of the participants with ASD. As a result, the mean age was calculated using the specific participant characteristics reported in the remaining (98.4%) articles; the mean age of these participants was 50.4 months (range: 20-71 months). Of the articles that reported gender (n = 424, 97.5%), all reported specific participant gender characteristics. Of the articles that reported race (n = 57, 13.1%), all reported specific participant race characteristics. Finally, of the articles that reported ethnicity (n = 49, 11.3%), all reported specific participant ethnicity characteristics. Of the articles that reported specific participant characteristics, the majority of participants were male (82.6%), white (56.3%), and non-Latinx, Hispanic, or of Spanish origin (85.3%). Alternatively, for group design articles, the average number of participants who were diagnosed with ASD and were five years of age or younger was 19.0 participants (range: 2-63 participants). Six (60.0%) of the 10 group design articles did not report specific ages of the participants with ASD who were five years of age or younger; instead, the articles included the mean age and/or age range of all participants or of the participants with ASD.
As a result, the mean age was calculated using the specific participant characteristics reported in the remaining four articles; the mean age of these participants was 52.9 months (range: 36-71 months). Of the articles that reported gender (n = 8, 80.0%), one article reported the overall number of participants for gender but did not provide specific participant gender characteristics. Of the articles that reported race (n = 3, 30.0%), all three reported specific participant race characteristics. Finally, the two articles that reported ethnicity (20.0%) reported specific participant ethnicity characteristics. Of the articles that reported specific participant characteristics, the majority of participants were male (76.4%), white (71.3%), and non-Latinx, Hispanic, or of Spanish origin (90.6%).

Settings

Four hundred and twenty-three (97.2%) of the total skill-based intervention articles reported the setting the intervention was implemented in. Seventy-five (17.7%) articles reported that the study was conducted across multiple settings. Overall, the studies were conducted primarily in schools (n = 148, 35.0%), homes (n = 113, 26.7%), clinics (n = 68, 16.1%), and university-based centers (n = 63, 14.9%). Figure 2.3 displays the total number of articles that implemented the intervention in each setting.

Duration and Dosage of Interventions

For single-subject design articles, the duration of the intervention was reported in only 63 (14.5%) articles and the intensity was reported in 302 (69.4%). Twenty (31.7%) articles reported a range or average for the duration of the intervention and were not included in the overall average duration calculation below. Two hundred and seventy-one (89.7%) articles reported partial components but did not report all of the necessary components to calculate the intensity of the intervention.
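To make concrete why partial reporting prevents intensity from being calculated, the computation can be sketched as follows. This is an illustrative sketch only: the function name, parameter names, and numeric values are hypothetical and are not drawn from the review's coding protocol; intensity is simply hours of intervention per week derived from session length and session frequency.

```python
def intervention_intensity_hr_per_week(session_length_min=None, sessions_per_week=None):
    """Return intervention intensity in hours per week, or None when any
    component is missing (i.e., only partial components were reported)."""
    if session_length_min is None or sessions_per_week is None:
        return None  # partial reporting: intensity cannot be calculated
    return (session_length_min * sessions_per_week) / 60.0

# Hypothetical article reporting 90-min sessions held twice per week:
print(intervention_intensity_hr_per_week(90, 2))                  # 3.0 hr per week
# Hypothetical article reporting session length but not frequency:
print(intervention_intensity_hr_per_week(session_length_min=90))  # None
```

An article that reports only one of the two components, as 271 of the single-subject articles did, leaves the calculation undefined, which is why those articles were excluded from the average intensity reported below.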
Of the articles that reported all of the necessary components, the average length of the intervention was 2.9 months, with a range of one week to 12 months, and the average intensity was 2.9 hr per week, with a range of less than 15 min to 11.25 hr. For group design articles, the duration of the intervention was reported in seven (70.0%) articles and the intensity was reported in eight (80.0%) articles. Three (42.9%) articles reported a range or average for the duration of the intervention and were not included in the overall average duration calculation below. Of the articles that reported the duration of the intervention, the average length was 4.7 months, with a range of less than one month to 12 months. All eight articles reported only partial components; as a result, the intensity of the intervention could not be calculated.

Treatment Fidelity

Skill-based intervention articles were evaluated for treatment fidelity in the same manner as comprehensive intervention articles. Of the 435 skill-based intervention articles that used a single-subject design, 284 (65.3%) reported treatment fidelity. Of those 284 articles, 194 (68.3%) met the minimum standards for treatment fidelity (i.e., collected for a minimum of 20% of sessions across conditions and the scores reached an acceptable level of 80% or better). Alternatively, of the 10 skill-based intervention articles that used a group design, six (60.0%) reported treatment fidelity. Of those six articles, only two (33.3%) met the minimum standards for treatment fidelity.

Direct Intervention Staff

The direct intervention staff (i.e., the individuals implementing the intervention) were reported for 416 (95.6%) articles that used a single-subject design. However, though 260 (62.5%) articles indicated who the direct intervention staff were, they used generic terms such as experimenter, instructor, therapist, and researcher.
Sixty-two (14.9%) articles indicated more than one type of direct intervention staff implemented the intervention. Forty-three (10.3%) articles reported that parents/caregivers of the participants implemented the interventions. One hundred and thirty (31.3%) articles, including articles that reported multiple direct intervention staff, reported the professional role or credentials of the direct intervention staff, with the majority consisting of graduate students (n = 46), teachers (n = 38), BCBAs (n = 13), and master's level professionals (n = 10). Though 416 (95.6%) articles reported who the direct intervention staff were, only 50 (12.0%) reported direct intervention staff characteristics such as age, gender, race, and ethnicity. Of the 50 articles that provided direct intervention staff characteristics, two (4.0%) only provided information regarding age, 26 (52.0%) only provided information regarding gender, 14 (28.0%) only provided information regarding age and gender, two (4.0%) only provided information regarding gender, race, and ethnicity, and six (12.0%) provided information on all four characteristics. The direct intervention staff were reported for all 10 articles that used a group design. However, six (60.0%) articles used generic terms such as coach, therapist, and researcher. Three (30.0%) articles indicated that more than one type of direct intervention staff implemented the intervention. One (10.0%) article reported that parents/caregivers of the participants implemented the intervention. Three (30.0%) articles, including articles that reported multiple direct intervention staff, reported the professional role or credentials of the direct intervention staff, which included master's level professionals (n = 3), a graduate student (n = 1), and doctoral level professionals (n = 1).
Though 10 articles reported who the direct intervention staff were, only one (10.0%) reported direct intervention staff characteristics, which only included information regarding gender. No other direct intervention staff characteristics were provided.

Supervision

Of the 435 single-subject articles, only 70 (16.1%) articles reported information regarding supervision during implementation of the intervention. Alternatively, of the 10 group design articles, zero articles reported information regarding supervision during implementation of the intervention.

Supervisor Staff. The supervisor staff (i.e., the individuals providing direct supervision) were reported for 70 articles that used a single-subject design. However, 34 (48.6%) articles used generic terms such as author, experimenter, and researcher. Six (8.6%) articles indicated that more than one type of supervisor staff provided supervision. Thirty-eight (54.3%) articles, including articles that reported multiple supervisor staff, reported the professional role or credentials of the supervisor staff, with the majority consisting of BCBAs (n = 11), graduate students (n = 9), BCBA-Ds (n = 4), and psychologists (n = 4). Though 70 (16.1%) articles reported who the supervisor staff were, only eight (11.4%) reported supervisor staff characteristics such as age, gender, race, and ethnicity. Of the eight articles that provided supervisor staff characteristics, all eight only provided information regarding gender. No other supervisor staff characteristics were provided.

Lead Supervisor Staff. Of the 70 single-subject design articles that reported information regarding supervision, only 10 (14.3%) reported who the lead supervisor staff (i.e., the individuals serving as the lead supervisor) were. Of those 10 articles, two (20.0%) articles used generic terms such as experimenter and researcher. Zero articles indicated that more than one type of lead supervisor staff provided supervision.
Eight (80.0%) articles reported the professional role or credentials of the lead supervisor staff, with the majority consisting of BCBA-Ds (n = 4), BCBAs (n = 2), a psychologist (n = 1), and a doctoral level professional (n = 1). Though 10 (14.3%) articles reported who the lead supervisor staff were, only three (30.0%) reported lead supervisor staff characteristics, which only included information regarding gender. No other lead supervisor staff characteristics were provided.

Supervision Meeting Characteristics. Of the 70 articles that used a single-subject design and reported on supervision, only five (7.1%) indicated the length of supervision, which ranged from 30 min to 105 min. Additionally, only 14 (20.0%) indicated the frequency of supervision, which ranged from daily to every two weeks. Finally, only 13 (18.6%) articles reported the mode (e.g., in person, video) the supervision was provided in and only 11 (15.7%) reported the setting (e.g., therapy room, home) the supervision was provided in.

Staff Training

Of the 435 skill-based intervention articles that used a single-subject design, 92 (21.1%) reported on training that was provided on the intervention. Of the articles that reported on training provided, 46 (50.0%) articles used generic terms such as experimenter and investigator to report who the trainer was. Twenty-eight (31.1%) articles reported the professional role or credentials of the trainers, which consisted of graduate students (n = 7), BCBAs (n = 5), BCBA-Ds (n = 5), master's level professionals (n = 4), doctoral level professionals (n = 4), school principals (n = 4), a teacher (n = 1), and an SLP (n = 1). Twenty-three articles (25.6%) that indicated training was provided did not provide information about who the trainer was. Additionally, although 67 articles reported who the trainer was, only seven (10.4%) reported trainer characteristics, which only included information regarding gender.
Eighty-eight (97.8%) of the articles that reported training occurred reported who received the training, with the majority including parents/caregivers (n = 38) and teachers (n = 14). Eighty-eight (97.8%) of the articles that reported training occurred provided a description of the training that was provided; however, only 27 (30.7%) indicated how much training was received, which ranged from 30 min to 174 hr. Furthermore, only 37 (42.0%) articles reported the mode (e.g., in person, video) the training was provided in and only 24 (27.3%) reported the setting (e.g., school, home) the training was provided in. Of the 10 skill-based intervention articles that used a group design, six (60.0%) reported on training that was provided on the intervention. Of the articles that reported on training provided, one (16.7%) article used generic terms such as experimenter to report who the trainer was. Five (83.3%) articles that indicated training was provided did not provide information about who the trainer was. Additionally, although one article reported who the trainer was, zero articles reported trainer characteristics such as age, gender, race, and ethnicity. All six of the articles that reported training occurred reported who received the training, including undergraduate students (n = 2), feeders (n = 1), coaches (n = 1), teachers (n = 1), and therapists and assistants (n = 1). Five (83.3%) of the articles that reported training occurred provided a description of the training that was provided; however, only one (16.7%) indicated how much training was received, which consisted of 10 hr. Furthermore, only one (16.7%) article reported the mode (e.g., in person) and the setting (e.g., agency) the training was provided in.

Discussion

The present literature review evaluated the extent to which the published research literature reports information about supervision and staff training within behavioral interventions provided to children with ASD.
Additionally, the present literature review updated and extended the Romanczyk et al. (2014) review by evaluating both supervision and staff training variables reported in behavioral interventions over the past eight years. The remainder of the discussion will focus on comparing the findings of the Romanczyk et al. (2014) review with the present literature review, specifically focusing on key findings (i.e., supervision, staff training, participant characteristics, settings, and research on behavioral interventions). Research and practice implications will also be discussed.

Supervision and Staff Training

The Romanczyk et al. (2014) review and the present review both found that less than 30.0% of articles reported the professional qualifications of the supervisors who provided supervision to individuals implementing behavioral interventions. Furthermore, both reviews found that information regarding the amount of supervision was not consistently reported. Romanczyk et al. (2014) did not evaluate staff training variables; therefore, a comparison could not be made. However, given their findings regarding supervision, it is unlikely that staff training variables were reported at a higher frequency than in the present review. It is important to note that the present review found that only 35 (7.4%) articles reported the professional qualifications of the trainers. Furthermore, only 48 (10.1%) articles indicated the amount of training received. This finding indicates that, similar to supervision characteristics, staff training characteristics were not consistently reported in the present review. Our findings suggest there is little-to-no consensus on reporting supervision and staff training characteristics (see Table 2.3 for the percentage of articles that reported on supervision and staff training variables in the present review). If articles do not report information regarding supervision and staff training, it is unclear whether supervision or staff training occurred.
This is problematic because it poses issues for replication. In order for studies to be replicated in both research and applied settings and for intervention effectiveness to continue to be evaluated, it is imperative that researchers provide adequate descriptions of supervision and staff training and include the associated materials and/or protocols (Gormley et al., 2020). Additionally, the lack of reporting poses issues with research translating to practice. If it is unclear whether staff training and supervision occurred in research articles, individuals working outside of research settings may not be able to properly train and supervise individuals because the necessary information (e.g., supervision dosage, how staff were trained to implement the intervention) may not be provided. Furthermore, if only partial supervision or training information is provided in research articles (e.g., training was provided for 5 hr across two days, but there was no description of the training provided), individuals may not be able to identify the components necessary to achieve optimal supervision and training outcomes (Gormley et al., 2020). As a result, the intervention may be implemented incorrectly or be identified as an intervention that is not usable in their setting (Dingfelder & Mandell, 2011). We strongly encourage future research to begin reporting information regarding both supervision and staff training characteristics. With this information, researchers may begin to evaluate how much supervision (e.g., 1 hr for every 10 hr of intervention) or training (e.g., 3 hr, 15 hr) is needed in order to increase intervention effectiveness. Additionally, future research can begin to evaluate different models of supervision (e.g., apprenticeship supervision model; Hartley et al., 2016) and training (e.g., behavior skills training; DiGennaro Reed et al., 2018; Slane & Lieberman-Betz, 2021) that lead to the greatest client outcomes (Valentino, 2021).
As a result, individuals in applied settings will be provided with the necessary information to properly supervise and train individuals implementing behavioral interventions with children with ASD. Finally, consistent reporting of supervision and staff training variables will lead to improved literature reviews and meta-analyses that can further inform research and practice (Roth et al., 2010). If supervision and staff training variables are consistently reported, researchers can begin to conduct more in-depth analyses of these variables and ask additional questions that are currently limited by inconsistent reporting. Future research could analyze the impact when adequate supervision and staff training are not achieved (e.g., a study reports that a brief 15 min training was provided and supervision was provided one time throughout the duration of the study) and/or when supervision drifts over time (e.g., initially supervision was provided 1 hr per day and at the end of the study supervision was only provided 1 hr per week). Additionally, future research may consider asking questions such as: Were criteria used to determine how or the extent to which supervision was provided? Do certain behavioral interventions across different focus areas (e.g., academic, behavior reduction) require more or less supervision or training? Was procedural fidelity collected during training, and if so, what level of procedural fidelity was achieved? Do different levels of procedural fidelity during training impact treatment outcomes? Having the ability to ask these additional questions and identify the possible answers will provide the field of ABA with more informative research and, as a result, positively impact treatment outcomes for clients.

Participant Characteristics

Romanczyk and colleagues (2014) found inconsistencies with articles reporting participant characteristics, and it was clear there was no consensus on what participant characteristics should be reported.
The present review found similar results. However, with regard to age and gender, several articles reported the overall age and gender of all participants and did not provide individual participant characteristics. Furthermore, in the articles we reviewed, reporting of individual participants' race and ethnicity remained inconsistent, and articles frequently did not report these participant characteristics at all. There is a long history (i.e., since the mid-1980s) of researchers and reporting standards (e.g., Begg et al., 1996; APA Publications and Communications Board Working Group on Journal Article Reporting Standards, 2008) calling for consistent reporting of participant characteristics (e.g., demographic variables) in both group design articles and single-subject articles (Jones et al., 2020). However, given the results of the present review and the Romanczyk et al. (2014) review, reporting inconsistencies still remain. If participant characteristics, specifically demographic variables (e.g., gender, race, ethnicity), are not reported in behavioral intervention articles, it is difficult to determine whether the intervention is suitable for the specific individuals being served, which in turn may hinder the practitioner's ability to make an appropriate choice about which interventions to implement (Jones et al., 2020). Additionally, the lack of reporting of participant characteristics poses issues for replication of the behavioral interventions (Li et al., 2017). If articles do not report participant characteristics, it may be difficult for additional studies to evaluate the efficacy of the intervention with participants who have similar or different characteristics (Tincani & Travers, 2019).
Reporting specific participant characteristics will allow researchers to gather evidence from multiple studies to evaluate intervention effectiveness, assess generality to participants with various characteristics, and ultimately aid in producing better client outcomes (Tincani & Travers, 2019).

Settings

Romanczyk et al. (2014) found that for both comprehensive intervention and skill-based intervention articles, the majority of interventions were implemented in the participants' schools and homes. Our results indicate that the settings in which behavioral interventions are implemented and evaluated have changed over the past eight years, from interventions primarily being implemented in participants' schools and homes to now also being implemented in additional settings such as clinics, camps, and recreation centers. In the present review, 30 articles did not report the setting the intervention was implemented in or did not provide specific setting information. Because Romanczyk et al. (2014) did not code for clinics, camps, recreation centers, unreported settings, or unspecified settings, it is unclear whether earlier articles consistently reported settings or whether Romanczyk and colleagues did not evaluate settings at this level of specificity (e.g., unspecified setting). Future research should report specific setting information to allow researchers and practitioners to determine whether they provide interventions in similar settings or whether additional research is needed to identify whether the intervention is effective in their practice setting (Dingfelder & Mandell, 2011; Smith et al., 2007; Strain et al., 2021). Future research should also continue to conduct studies in settings where the majority of children with ASD receive services (e.g., community-based settings).
Previous research has found that procedural fidelity of behavioral interventions is often high when interventions are implemented in university-based settings (e.g., university-based centers, research labs; Peters-Scheffer et al., 2013). However, those settings may not replicate the conditions (e.g., community-based settings, clinics, private agencies) where a majority of children with ASD receive services (Brookman-Frazee et al., 2010; Kasari & Smith, 2013). As a result, there remains a significant research-to-practice gap between university-based settings and community-based settings (Dingfelder & Mandell, 2011; Nahmias et al., 2019).

Research on Behavioral Interventions

Over the past eight years, the frequency of research on behavioral interventions provided to individuals with ASD has increased. Romanczyk et al. (2014) found 144 articles (19 comprehensive and 125 skill-based) that met inclusion criteria, while the present literature review found 475 articles (30 comprehensive and 445 skill-based). Two potential reasons for this increasing trend are the increased prevalence of ASD (i.e., one in 54 children; CDC, 2020) and the rapid growth of the field of ABA (i.e., the number of BCBAs increased from 12,625 in 2013 to 54,223 in 2021; BACB, n.d.). The increase in research on behavioral interventions is important because it demonstrates that researchers are continuing to evaluate the effectiveness of interventions and respond to the urgent demand for disseminating evidence-based interventions. However, the present review illuminated the need for additional research (e.g., on the dosage of supervision and staff training required to increase treatment outcomes) and showed that the field of ABA needs to come to a consensus on reporting supervision and staff training characteristics.
Reporting Recommendations

Based on the findings from the present review and Romanczyk and colleagues' (2014) review, we strongly encourage future research to begin reporting information regarding both supervision (see Shire et al., 2019, for an example) and staff training characteristics (see Boyd et al., 2018, for an example). Reporting these characteristics will help address issues of replication and of research translating to practice in the field of ABA. Furthermore, it will provide researchers with the information necessary to evaluate supervision and staff training as independent variables and will allow for improved literature reviews and meta-analyses, which in turn will help further improve the field of ABA. We also strongly encourage future research to begin reporting participant characteristics, specifically demographic variables (see Cariveau et al., 2019, and Rogers et al., 2019b, for examples), and to conduct studies in the settings where the majority of children with ASD receive services.

Limitations

Several limitations of the present literature review should be noted. First, our search was limited to peer-reviewed publications and did not include gray literature (e.g., dissertations, theses) or unpublished studies. As a result, this review does not include all research that has been conducted on behavioral interventions for children with ASD. Second, although a comprehensive search was conducted, some articles may have been missed because of the search terms used. To mitigate this limitation, we conducted the search across three databases (i.e., MEDLINE, PsycINFO, ERIC) in order to capture as many articles as possible. Third, a hand search was not conducted. As a result, some behavioral intervention research may have been missed if it was not detected by the search terms used. Fourth, the date range for our search was limited to April 2013 to January 2021.
As a result, the present review does not include behavioral intervention research published since January 2021. Future reviews may consider updating these results to include research published in 2021. Finally, only randomized controlled trial group design articles were included in the present review; as a result, evaluations of behavioral interventions using other group design types were not evaluated. Future literature reviews may consider including gray literature, conducting a search with additional search terms, conducting a hand search, and including all group design articles to ensure all articles evaluating behavioral interventions with children with ASD are identified. Although the present literature review uncovered limitations in behavioral intervention research, it brings attention and awareness to the need to standardize reporting conventions, specifically for supervision and staff training characteristics.

APPENDIX

Table 2.1. Journal Distribution of Articles

Journal Name | n | %
American Journal of Speech-Language Pathology | 1 | 0.2%
Art Therapy | 1 | 0.2%
Augmentative and Alternative Communication | 2 | 0.4%
Australian Journal of Special Education | 1 | 0.2%
Autism | 5 | 1.1%
Autism and Developmental Language Impairments | 1 | 0.2%
Autism Research | 1 | 0.2%
Behavior Analysis in Practice | 33 | 6.9%
Behavior Analysis: Research and Practice | 5 | 1.1%
Behavior Modification | 21 | 4.4%
Behavioral Development | 5 | 1.1%
Behavioral Development Bulletin | 7 | 1.5%
Behavioral Disorders | 1 | 0.2%
Behavioral Interventions | 27 | 5.7%
Behaviour Change | 1 | 0.2%
British Journal of Special Education | 1 | 0.2%
Canadian Journal of School Psychology | 1 | 0.2%
Child and Family Behavior Therapy | 3 | 0.6%
Developmental Neurorehabilitation | 17 | 3.6%
Early Childhood Education Journal | 1 | 0.2%
Education and Training in Autism and Developmental Disabilities | 18 | 3.8%
Education and Treatment of Children | 6 | 1.3%
Educational Sciences: Theory and Practice | 3 | 0.6%
Exceptionality | 1 | 0.2%
Exceptionality Education International | 1 | 0.2%
Focus on Autism and Other Developmental Disabilities | 8 | 1.7%
Health Psychology Report | 1 | 0.2%
Infants and Young Children | 1 | 0.2%
International Journal of Child-Computer Interaction | 1 | 0.2%
International Journal of Developmental Disabilities | 2 | 0.4%
International Journal of Disability, Development, and Education | 1 | 0.2%
International Journal of Psychology and Psychological Therapy | 1 | 0.2%
International Journal of Speech-Language Pathology | 1 | 0.2%
Journal of Applied Behavior Analysis | 129 | 27.2%
Journal of Autism and Developmental Disorders | 38 | 8.0%
Journal of Behavioral Education | 9 | 1.9%
Journal of Child and Family Studies | 1 | 0.2%
Journal of Child Psychology and Psychiatry | 2 | 0.4%
Journal of Developmental and Behavioral Pediatrics | 1 | 0.2%
Journal of Developmental and Physical Disabilities | 24 | 5.1%
Journal of Early Intervention | 2 | 0.4%
Journal of Intellectual and Developmental Disability | 1 | 0.2%
Journal of Intellectual Disability Research | 2 | 0.4%
Journal of Pediatric Psychology | 1 | 0.2%
Journal of Positive Behavior Interventions | 3 | 0.6%
Journal of Research in Special Education Needs | 2 | 0.4%
Journal of the American Academy of Audiology | 1 | 0.2%
Journal of the American Academy of Child and Adolescent Psychiatry | 1 | 0.2%
Journal of the Experimental Analysis of Behavior | 4 | 0.8%
Language, Speech, and Hearing Services in Schools | 2 | 0.4%
Learning and Motivation | 6 | 1.3%
Mexican Journal of Behavior Analysis | 1 | 0.2%
Pediatrics | 2 | 0.4%
Psychology in the Schools | 2 | 0.4%
Remedial and Special Education | 1 | 0.2%
Research in Autism Spectrum Disorders | 24 | 5.1%
Research in Developmental Disabilities | 3 | 0.6%
School Psychology Quarterly | 2 | 0.4%
Speech, Language and Hearing | 1 | 0.2%
Support for Learning | 1 | 0.2%
The Analysis of Verbal Behavior | 23 | 4.8%
The Journal of Special Education | 1 | 0.2%
The Psychological Record | 1 | 0.2%
Topics in Early Childhood Special Education | 4 | 0.8%

Table 2.2.
Article Distribution by Focus Category and Research Design

Focus Category | Group | Single-Subject
Academic | 0 | 54
Behavior Reduction | 0 | 88
Cognitive | 0 | 0
Communication | 4 | 161
Comprehensive | 20 | 10
Daily Living | 2 | 15
Feeding | 2 | 20
Play | 0 | 19
Sleep | 0 | 4
Social | 2 | 70
Toileting | 0 | 4
Total articles by research design | 30 | 445

Table 2.3. Percentage of Articles that Reported on Supervision and Staff Training Variables

Variables | Comprehensive Single-Subject | Comprehensive Group | Skill-Based Single-Subject | Skill-Based Group
Supervision | 50.0% | 85.0% | 16.1% | 0.0%
Staff Training | 60.0% | 95.0% | 21.1% | 60.0%

Figure 2.1. Article Distribution by Publication Year
[Bar chart of the number of articles published per year, 2013-2020.]

Figure 2.2. Intervention Setting for Comprehensive Intervention Articles
[Bar chart of the number of articles by intervention setting.]
Note. The sum of articles across all settings does not equal 100% because each setting number is based on all settings the intervention was implemented in, including articles that reported multiple settings.

Figure 2.3. Intervention Setting for Skill-Based Intervention Articles
[Bar chart of the number of articles by intervention setting.]
Note.
The sum of articles across all settings does not equal 100% because each setting number is based on all settings the intervention was implemented in, including articles that reported multiple settings.

REFERENCES

Achmadi, D., Sigafoos, J., van der Meer, L., Sutherland, D., Lancioni, G. E., O'Reilly, M. F., Hodis, F., Green, V. A., McLay, L., & Marschik, P. B. (2014). Acquisition, preference, and follow-up data on the use of three AAC options by four boys with developmental disability/delay. Journal of Developmental and Physical Disabilities, 26(5), 565-583. https://doi.org/10.1007/s10882-014-9379-z

Ackerlund Brandt, J. A., Winkauf, S., Zeug, N., & Klatt, K. P. (2016). An evaluation of constant time delay and simultaneous prompting procedures in skill acquisition for young children with autism. Education and Training in Autism and Developmental Disabilities, 55(1), 55-66. https://www.jstor.org/stable/26420364

Agius, M. M., & Vance, M. (2016). A comparison of PECS and iPad to teach requesting to preschoolers with autistic spectrum disorders. Augmentative and Alternative Communication, 32(1), 56-68. https://doi.org/10.3109/07434618.2015.1108363

Aguirre, A. A., LeBlanc, L. A., Reavis, A., Shillingsburg, A. M., Delfs, C. A., Miltenberger, C. A., & Symer, K. B. (2019). Evaluating the effects of similar and distinct discriminative stimuli during auditory conditional discrimination training with children with autism. The Analysis of Verbal Behavior, 35(1), 21-38. https://doi.org/10.1007/s40616-019-00111-3

Akers, J. S., Higbee, T. S., Gerencser, K. R., & Pellegrino, A. J. (2018). An evaluation of group activity schedules to promote social play in children with autism. Journal of Applied Behavior Analysis, 51(3), 553-570. https://doi.org/10.1002/jaba.474

Akers, J. S., Higbee, T. S., Pollard, J. S., Pellegrino, A. J., & Gerencser, K. R. (2016).
An evaluation of photographic activity schedules to increase independent playground skills in young children with autism. Journal of Applied Behavior Analysis, 49(4), 954-959. https://doi.org/10.1002/jaba.327

Akmanoglu, N. (2015). Effectiveness of teaching naming facial expression to children with autism via video modeling. Educational Sciences: Theory & Practice, 15(2), 519-537. https://doi.org/10.12738/estp.2015.2.2603

Akmanoglu, N., Kurt, O., & Kapan, A. (2015). Comparison of simultaneous prompting and constant time delay procedures in teaching children with autism the responses to questions about personal information. Educational Sciences: Theory & Practice, 15(3), 723-737. https://doi.org/10.12738/estp.2015.3.2654

Akmanoglu, N., Yanardag, M., & Batu, E. S. (2014). Comparing video modeling and graduated guidance together and video modeling alone for teaching role playing skills to children with autism. Education and Training in Autism and Developmental Disabilities, 49(1), 17-31. https://www.jstor.org/stable/23880652

Aldosari, M. S. (2016). Efficacy of a systematic process for developing function-based treatment for young children with disabilities. Education and Training in Autism and Developmental Disabilities, 51(4), 391-403. https://www.jstor.org/stable/26173866

Allen, K. D., & Warzak, W. J. (2000). The problem of parental nonadherence in clinical behavior analysis: Effective treatment is not enough. Journal of Applied Behavior Analysis, 33(3), 373-391. https://doi.org/10.1901/jaba.2000.33-373

Alzrayer, N. M. (2020). The impact of an intraverbal webbing procedure on the emergence of advanced intraverbal skills in children with autism spectrum disorder. Behavior Analysis in Practice, 13, 914-923. https://doi.org/10.1007/s40617-020-00410-5

Alzyoudi, M., Sartawi, A., & Almuhiri, O. (2014). The impact of video modelling on improving social skills in children with autism. British Journal of Special Education, 42(1), 53-68.
https://doi.org/10.1111/1467-8578.12057

American Psychiatric Association (2013). Diagnostic and statistical manual of mental disorders (5th ed.). American Psychiatric Publishing.

Anderson, E., Barretto, A., McLaughlin, T. F., & McQuaid, T. (2016a). Case report: Effects of functional communication training with and without delays to decrease aberrant behavior in a child with autism spectrum disorder. Journal on Developmental Disabilities, 22(1), 101-110.

Anderson, S., Bucholz, J. L., Hazelkorn, M., & Cooper, M. A. (2016b). Using narrated literacy-based behavioral interventions to decrease episodes of physical aggression in elementary students with disabilities. Support for Learning, 31(2), 92-103. https://doi.org/10.1111/1467-9604.12118

APA Publications and Communications Board Working Group on Journal Article Reporting Standards (JARS, 2008). Reporting standards for research in psychology. American Psychologist, 63(9), 839-851. https://doi.org/10.1037/0003-066X.63.9.839

Ardic, A., & Cavkaytar, A. (2014). Effectiveness of the modified intensive toilet training method on teaching toilet skills to children with autism. Education and Training in Autism and Developmental Disabilities, 49(2), 263-276. https://www.jstor.org/stable/23880609

Armendariz, V., & Hahs, A. D. (2019). Teaching leisure activities with social initiations through video prompting. Journal of Behavioral Education, 28(4), 479-792. https://doi.org/10.1007/s10864-019-09320-1

Artman-Meeker, K., Rosenberg, N., Badgett, N., Yang, X., & Penney, A. (2017). The effects of bug-in-ear coaching on pre-service behavior analysts' use of functional communication training. Behavior Analysis in Practice, 10(3), 228-241. https://doi.org/10.1007/s40617-016-0166-4

Baer, D. M., Wolf, M. M., & Risley, T. R. (1968). Some current dimensions of applied behavior analysis. Journal of Applied Behavior Analysis, 1(1), 91-97. https://doi.org/10.1901/jaba.1968.1-91

Bao, S., Sweatt, K. T., Lechago, S. A., & Antal, S. (2017).
The effects of receptive and expressive instructional sequences on varied conditional discriminations. Journal of Applied Behavior Analysis, 50(4), 775-788. https://doi.org/10.1002/jaba.404

Barkaia, A., Stokes, T. F., & Mikiashvili, T. (2017). Intercontinental telehealth coaching of therapists to improve verbalizations by children with autism. Journal of Applied Behavior Analysis, 50(3), 582-589. https://doi.org/10.1002/jaba.391

Barry, L., Holloway, J., & Gunning, C. (2019). An investigation of the effects of a parent delivered stimulus-stimulus pairing intervention on vocalization of two children with autism spectrum disorder. The Analysis of Verbal Behavior, 35(1), 57-73. https://doi.org/10.1007/s40616-018-0094-1

Bauminger-Zviely, N., Eytan, D., Hoshmand, S., & Ben-Shlomo, O. R. (2020). Preschool peer social interaction (PPSI) to enhance social play, interaction, and conversation: Study outcomes. Journal of Autism and Developmental Disorders, 50(3), 844-863. https://doi.org/10.1007/s10803-019-04316-2

Begg, C., Cho, M., Eastwood, S., Horton, R., Moher, D., Olkin, I., Pitkin, R., Schulz, K. F., Simmel, D., & Stroup, D. F. (1996). Improving the quality of reporting of randomized controlled trials. Journal of the American Medical Association, 276(8), 637-639. https://doi.org/10.1001/jama.276.8.637

Behavior Analyst Certification Board (2014). Applied behavior analysis treatment of autism spectrum disorder: Practice guidelines for healthcare funders and managers (2nd ed.). Behavior Analyst Certification Board, Inc.

Behavior Analyst Certification Board (2018). Standards for supervision of BCaBAs. https://www.bacb.com/wp-content/uploads/2020/05/Standards-for-Supervision-of-BCaBAs_180924.pdf

Behavior Analyst Certification Board (n.d.). BACB certificant data. https://www.bacb.com/BACB-certificant-data

Bejnö, H., Johansson, S., Ramnerö, J., Grimaldi, L., & Cepeda, R. (2018).
Emergent language responses following match-to-sample training among children with autism spectrum disorder. International Journal of Psychology and Psychological Therapy, 18(1), 1-14.

Belisle, J., Dixon, M. R., Alholai, A., Ellenberger, L., Stanley, C., & Galliford, M. (2020). Teaching children with autism to tact the private events of others. Behavior Analysis in Practice, 13, 169-173. https://doi.org/10.1007/s40617-019-00334-9

Belisle, J., Huggins, K., Doherty, M., Stanley, C. R., & Dixon, M. R. (2020). Generalized reflexive responding and cross-modal tactile transfer of stimulus function in children with autism. The Analysis of Verbal Behavior, 36(2), 233-250. https://doi.org/10.1007/s40616-020-00137-y

Bellini, S., & Akullian, J. (2007). A meta-analysis of video modeling and video self-modeling interventions for children and adolescents with autism spectrum disorders. Exceptional Children, 73(3), 264-287. https://go-gale-com.proxy2.cl.msu.edu/ps/i.do?p=ITOF&u=msu_main&id=GALE%7CA160331973&v=2.1&it=r&sid=summon

Benson, S. S., Dimian, A. F., Elmquist, M., Simacek, J., McComas, J. J., & Symons, F. J. (2018). Coaching parents to assess and treat self-injurious behaviour via telehealth. Journal of Intellectual Disability Research, 62(12), 1114-1123. https://doi.org/10.1111/jir.12456

Bergstrom, R., Najdowski, A. C., Alvarado, M., & Tarbox, J. (2016). Teaching children with autism to tell socially appropriate lies. Journal of Applied Behavior Analysis, 49(2), 405-410. https://doi.org/10.1002/jaba.295

Betz, A. M., Fisher, W. W., Roane, H. S., Mintz, J. C., & Owen, T. M. (2013). A component analysis of schedule thinning during functional communication training. Journal of Applied Behavior Analysis, 46(1), 219-241. https://doi.org/10.1002/jaba.23

Bishop, M. R., Kenzer, A. L., Coffman, C. M., Tarbox, C. M., Tarbox, J., & Lanagan, T. M. (2013). Using stimulus fading without escape extinction to increase compliance with toothbrushing in children with autism.
Research in Autism Spectrum Disorders, 7(6), 680-686. https://doi.org/10.1016/j.rasd.2013.02.004

Bishop, S. K., Moore, J. W., Dart, E. H., Radley, K., Brewer, R., Baker, L., Quintero, L., Litten, S., Gilfeather, A., Newborne, B., & Toche, C. (2020). Further investigation of increasing vocalizations of children with autism with a speech-generating device. Journal of Applied Behavior Analysis, 53(1), 475-483. https://doi.org/10.1002/jaba.554

Bloom, S. E., Lambert, J. M., Dayton, E., & Samaha, A. L. (2013). Teacher-conducted trial-based functional analyses as the basis for intervention. Journal of Applied Behavior Analysis, 46(1), 208-218. https://doi.org/10.1002/jaba.21

Borgan, K. M., Rapp, J. T., Sennott, L. A., Cook, J. L., & Swinkels, E. (2018). Further analysis of the predictive effects of a free-operant competing stimulus assessment on stereotypy. Behavior Modification, 42(2), 543-583. https://doi.org/10.1177/0145445517741476

Boutain, A. R., Sheldon, J. B., & Sherman, J. A. (2020). Evaluation of a telehealth parent training program in teaching self-care skills to children with autism. Journal of Applied Behavior Analysis, 53(3), 1259-1275. https://doi.org/10.1002/jaba.743

Boyd, B. A., Watson, L. R., Reszka, S. S., Sideris, J., Alessandri, M., Baranek, G. T., Crais, E. R., Donaldson, A., Gutierrez, A., Johnson, L., & Belardi, K. (2018). Efficacy of the ASAP intervention for preschoolers with ASD: A cluster randomized controlled trial. Journal of Autism and Developmental Disorders, 48(9), 3144-3162. https://doi.org/10.1007/s10803-018-3584-z

Bremer, E., Balogh, R., & Lloyd, M. (2015). Effectiveness of a fundamental motor skill intervention for 4-year-old children with autism spectrum disorder: A pilot study. Autism, 19(8), 980-991. https://doi.org/10.1177/1362361314557548

Brian, J. A., Smith, I. M., Zwaigenbaum, L., & Bryson, S. E. (2017).
Cross-site randomized control trial of the Social ABCs caregiver-mediated intervention for toddlers with autism spectrum disorder. Autism Research, 10(10), 1700-1711. https://doi.org/10.1002/aur.1818

Brodhead, M. T. (2022, April 11). Supervision and training literature review. osf.io/46wpd

Brodhead, M. T., Courtney, W. T., & Thaxton, J. R. (2018). Using activity schedules to promote varied application use in children with autism. Journal of Applied Behavior Analysis, 51(1), 80-86. https://doi.org/10.1002/jaba.435

Brodhead, M. T., Higbee, T. S., Gerencser, K. R., & Akers, J. S. (2016). The use of a discrimination-training procedure to teach mand variability to children with autism. Journal of Applied Behavior Analysis, 49(1), 34-48. https://doi.org/10.1002/jaba.280

Brodhead, M. T., Higbee, T. S., Pollard, J. S., Akers, J. S., & Gerencser, K. R. (2014). The use of linked activity schedules to teach children with autism to play hide-and-seek. Journal of Applied Behavior Analysis, 47(3), 645-650. https://doi.org/10.1002/jaba.145

Brookman-Frazee, L. I., Taylor, R., & Garland, A. F. (2010). Characterizing community-based mental health services for children with autism spectrum disorders and disruptive behavior problems. Journal of Autism and Developmental Disorders, 40(10), 1188-1201. https://doi.org/10.1007/s10803-010-0976-0

Brown, K. R., Zangrillo, A. N., & Gaynor, R. (2020). Effects of caregiver-implemented group contingencies on siblings' destructive behavior. Behavioral Development, 25(1), 30-39. https://doi.org/10.1037/bdb0000093

Budzińska, A., Lubomirska, A., Wójcik, M., Krantz, P. J., & McClannahan, L. (2014). Use of scripts and script-fading procedures and activity schedules to develop spontaneous social interaction in a three-year-old girl with autism. Health Psychology Report, 2(1), 67-71. https://doi.org/10.5114/hpr.2014.42791

Buggey, T., Crawford, S. C., & Rogers, C. L. (2018).
Self-modeling to promote social initiations with young children with developmental disabilities. Focus on Autism and Other Developmental Disabilities, 33(2), 111-119. https://doi.org/10.1177/1088357616667591

Bui, L. T. D., Moore, D. W., & Anderson, A. (2013). Using escape extinction and reinforcement to increase eating in a young child with autism. Behaviour Change, 30(1), 48-55.

Cagliani, R. R., Ayers, K. M., Whiteside, E., & Ringdahl, J. E. (2017). Picture exchange communication system and delay to reinforcement. Journal of Developmental and Physical Disabilities, 29(6), 925-939. https://doi.org/10.1007/s10882-017-9564-y

Cardon, T. (2013). Video modeling imitation training to support gestural imitation acquisition in young children with autism spectrum disorder. Speech, Language and Hearing, 16(4), 227-238. https://doi.org/10.1179/2050572813Y.0000000018

Cariveau, T., Shillingsburg, M. A., Alamoudi, A., Thompson, T., Bartlett, B., Gillespie, S., & Scahill, L. (2019). Brief report: Feasibility and preliminary efficacy of a behavioral intervention for minimally verbal girls with autism spectrum disorder. Journal of Autism and Developmental Disorders, 49(5), 2203-2209. https://doi.org/10.1007/s10803-018-03872-3

Cariveau, T., Shillingsburg, M. A., Alamoudi, A., Thompson, T., Bartlett, B., Gillespie, S., & Scahill, L. (2020). A structured intervention to increase response allocation to instructional settings for children with autism spectrum disorder. Journal of Behavioral Education, 29, 699-716. https://doi.org/10.1007/s10864-019-09340-x

Carnerero, J. J., & Pérez-González, L. A. (2014). Induction of naming after observing visual stimuli and their names in children with autism. Research in Developmental Disabilities, 35(10), 2514-2526. https://doi.org/10.1016/j.ridd.2014.06.004

Carnett, A., Ingvarsson, E. T., Bravo, A., & Sigafoos, J. (2020). Teaching children with autism spectrum disorder to ask "where" questions using a speech-generating device.
Journal of Applied Behavior Analysis, 53(3), 1383-1403. https://doi.org/10.1002/jaba.663

Carroll, R. A., Joachim, B. T., St. Peter, C. C., & Robinson, N. (2015). A comparison of error-correction procedures on skill acquisition during discrete-trial instruction. Journal of Applied Behavior Analysis, 48(2), 257-273. https://doi.org/10.1002/jaba.205

Carroll, R. A., & Kodak, T. (2015). Using instructive feedback to increase response variability during intraverbal training for children with autism spectrum disorder. The Analysis of Verbal Behavior, 31(2), 183-199. https://doi.org/10.1007/s40616-015-0039-x

Carroll, R. A., Kodak, T., & Adolf, K. J. (2016). Effect of delayed reinforcement on skill acquisition during discrete-trial instruction: Implications for treatment-integrity errors in academic settings. Journal of Applied Behavior Analysis, 49(1), 176-181. https://doi.org/10.1002/jaba.268

Carroll, R. A., Kodak, T., & Fisher, W. W. (2013). An evaluation of programmed treatment-integrity errors during discrete-trial instruction. Journal of Applied Behavior Analysis, 46(2), 379-394. https://doi.org/10.1002/jaba.49

Cengher, M., & Fienup, D. M. (2020). Presession attention affects the acquisition of tacts and intraverbals. Journal of Applied Behavior Analysis, 53(3), 1742-1767. https://doi.org/10.1002/jaba.657

Cengher, M., Shamoun, K., Moss, P., Roll, D., Feliciano, G., & Fienup, D. M. (2016). A comparison of the effects of two prompt-fading strategies on skill acquisition in children with autism spectrum disorders. Behavior Analysis in Practice, 9(2), 115-125. https://doi.org/10.1007/s40617-015-0096-6

Centers for Disease Control and Prevention (2020). Data & statistics on autism spectrum disorder. https://www.cdc.gov/ncbddd/autism/data.html

Chang, Y., Shire, S. Y., Shih, W., Gelfand, C., & Kasari, C. (2016). Preschool deployment of evidence-based social communication intervention: JASPER in the classroom.
Journal of Autism and Developmental Disorders, 46(6), 2211-2223. https://doi.org/10.1007/s10803-016-2752-2

Chezan, L., Drasgow, E., Legg, J., & Hollborn, A. (2016a). Effects of conditional discrimination training and choice opportunities on manding for two young children with autism spectrum disorder and language delays. Journal of Developmental and Physical Disabilities, 28(4), 557-579. https://doi.org/10.1007/s10882-016-9493-1

Chezan, L. C., Drasgow, E., Martin, C. A., & Halle, J. W. (2016b). Negatively-reinforced mands: An examination of resurgence to existing mands in two children with autism and language delays. Behavior Modification, 40(6), 922-953. https://doi.org/10.1177/0145445516648664

Chezan, L. C., Drasgow, E., McWhorter, G. Z., Starkey, K. I. P., & Hurdle, B. M. (2019). Discrimination and generalization of negatively-reinforced mands in young children with autism spectrum disorder. Behavior Modification, 43(5), 656-687. https://doi.org/10.1177/0145445518781957

Chohan, M., & Jones, E. A. (2019). Initiating joint attention with a smile: Intervention for children with autism. Behavioral Development, 24(1), 29-41. https://doi.org/10.1037/bdb0000087

Cihon, J. H., Ferguson, J. L., Leaf, J. B., Leaf, R., McEachin, J., & Taubman, M. (2019a). Use of a level system with flexible shaping to improve synchronous engagement. Behavior Analysis in Practice, 12(1), 44-51. https://doi.org/10.1007/s40617-018-0254-8

Cihon, J. H., Ferguson, J. L., Leaf, J. B., Milne, C. M., Leaf, R., & McEachin, J. (2020). A randomized clinical trial of three prompting systems to teach tact relations. Journal of Applied Behavior Analysis, 53(2), 727-743. https://doi.org/10.1002/jaba.617

Cihon, J. H., Ferguson, J. L., Milne, C. M., Leaf, J. B., McEachin, J., & Leaf, R. (2019b). A preliminary evaluation of a token system with a flexible earning requirement. Behavior Analysis in Practice, 12(3), 548-556. https://doi.org/10.1007/s40617-018-00316-3

Cividini-Motta, C., Garcia, A.
R., Livingston, C., & MacNaul, H. L. (2019). The effect of response interruption and redirection with and without a differential reinforcement of alternative behavior component on stereotypy and appropriate response. Behavioral Interventions, 34(1), 3-18. https://doi.org/10.1002/bin.1654

Clark, R. J., Wilder, D. A., Kelley, M. E., & Ryan, V. (2020). Evaluation of instructions and video modeling to train parents to implement a structured meal procedure for food selectivity among children with autism. Behavior Analysis in Practice, 13(3), 674-678. https://doi.org/10.1007/s40617-020-00419-w

Conallen, K., & Reed, P. (2017). Children with autism spectrum disorder: Teaching conversation involving feelings about events. Journal of Intellectual Disability Research, 61(3), 279-291. https://doi.org/10.1111/jir.12339

Conine, D. E., Vollmer, T. R., Barlow, M. A., Grauerholz-Fisher, E., Dela Rosa, C. M., & Petronelli, A. K. (2020a). Assessment and treatment of response to name for children with autism spectrum disorder: Toward an efficient intervention model. Journal of Applied Behavior Analysis, 53(4), 2024-2052. https://doi.org/10.1002/jaba.737

Conine, D. E., Vollmer, T. R., & Bolívar, H. A. (2020b). Response to name in children with autism: Treatment, generalization, and maintenance. Journal of Applied Behavior Analysis, 53(2), 744-766. https://doi.org/10.1002/jaba.635

Contreras, B. P., & Betz, A. M. (2016). Using lag schedules to strengthen the intraverbal repertoires of children with autism. Journal of Applied Behavior Analysis, 49(1), 3-16. https://doi.org/10.1002/jaba.271

Cook, J. L., Rapp, J. T., Mann, K. R., McHugh, C., Burji, C., & Nuta, R. (2017). A practitioner model for increasing eye contact in children with autism. Behavior Modification, 41(3), 382-404. https://doi.org/10.1177/0145445516689323

Cooper, J. O., Heron, T. E., & Heward, W. L. (2020). Applied behavior analysis (3rd ed.). Pearson Education.

Couper, L., van der Meer, L., Schäfer, M. C.
M., McKenzie, E., McLay, L., O'Reilly, M. F., Lancioni, G. E., Marschik, P. B., Sigafoos, J., & Sutherland, D. (2014). Comparing acquisition of and preference for manual signs, picture exchange, and speech-generating devices in nine children with autism spectrum disorder. Developmental Neurorehabilitation, 17(2), 99-109. https://doi.org/10.3109/17518423.2013.870244

Curiel, E. S. L., Sainato, D. M., & Goldstein, H. (2016). Matrix training of receptive language skills with a toddler with autism spectrum disorder: A case study. Education and Treatment of Children, 39(1), 95-109. https://doi.org/10.1177/1053815118788060

D'Agostino, S., Douglas, S. N., & Horton, E. (2020). Inclusive preschool practitioners' implementation of naturalistic developmental behavioral intervention using telehealth training. Journal of Autism and Developmental Disorders, 50(3), 864-880. https://doi.org/10.1007/s10803-019-04319-z

Daneshvar, S. D., Charlop, M. H., & Malmberg, D. B. (2019). A treatment comparison study of a photo activity schedule and social stories for teaching social skills to children with autism spectrum disorder: Brief report. Developmental Neurorehabilitation, 22(3), 209-214. https://doi.org/10.1080/17518423.2018.1461947

Daou, N. (2014). Conducting behavioral research with children attending nonbehavioral intervention programs for autism: The case of Lebanon. Behavior Analysis in Practice, 7(1), 78-90. https://doi.org/10.1007/s40617-014-0017-0

Dass, T. K., Kisamore, A. N., Vladescu, J. C., Reeve, K. F., Reeve, S. A., & Taylor-Santa, C. (2018). Teaching children with autism spectrum disorder to tact olfactory stimuli. Journal of Applied Behavior Analysis, 51(3), 538-552. https://doi.org/10.1002/jaba.470

Delemere, E., & Dounavi, K. (2018). Parent-implemented bedtime fading and positive routines for children with autism spectrum disorders. Journal of Autism and Developmental Disorders, 48(4), 1002-1019. https://doi.org/10.1007/s10803-017-3398-4

Delfs, C. H., Conine, D.
E., Frampton, S. E., Shillingsburg, M. A., & Robinson, H. C. (2014). Evaluation of the efficiency of listener and tact instruction for children with autism. Journal of Applied Behavior Analysis, 47(4), 793-809. https://doi.org/10.1002/jaba.166

Delmolino, L., Hansford, A. P., Bamond, M. J., Fiske, K. E., & LaRue, R. H. (2013). The use of instructive feedback for teaching language skills to children with autism. Research in Autism Spectrum Disorders, 7(6), 648-661. https://doi.org/10.1016/j.rasd.2013.02.015

DeQuinzio, J. A., Ruch, S. A., & Taylor, B. A. (2020). Teaching children with autism to respond to joyful and fearful expressions within social referencing. Behavioral Development, 25(1), 17-29. https://doi.org/10.1037/bdb0000091

Derosa, N. M., Fisher, W. W., & Steege, M. W. (2015). An evaluation of time in establishing operation on the effectiveness of functional communication training. Journal of Applied Behavior Analysis, 48(1), 115-130. https://doi.org/10.1002/jaba.180

Deshais, M. A., Phillips, C. L., Wiskow, K. M., Vollmer, T. R., & Donaldson, J. M. (2020). A comparison of imitation training using concurrent versus delayed prompting. Behavior Analysis: Research and Practice, 20(3), 132-147. https://doi.org/10.1037/bar0000174

Deshais, M. A., & Vollmer, T. R. (2020). A preliminary investigation of fixed and repetitive models during object imitation training. Journal of Applied Behavior Analysis, 53(2), 973-996. https://doi.org/10.1002/jaba.661

DeSouza, A. A., Fisher, W. W., & Rodriguez, N. M. (2019). Facilitating the emergence of convergent intraverbals in children with autism. Journal of Applied Behavior Analysis, 52(1), 28-49. https://doi.org/10.1002/jaba.520

Dickes, N. R., & Kodak, T. (2015). Evaluating the emergence of reverse intraverbals following intraverbal training in young children with autism spectrum disorder. Behavioral Interventions, 30(3), 169-190. https://doi.org/10.1002/bin.1412

DiGennaro Reed, F. D., Blackman, A. L., Erath, T.
G., Brand, D., & Novak, M. D. (2018). Guidelines for using behavioral skills training to provide teacher support. Teaching Exceptional Children, 50(6), 373-380. https://doi.org/10.1177/0040059918777241
DiGennaro Reed, F. D., & Codding, R. S. (2014). Advancements in procedural fidelity assessment and intervention: Introduction to the special issue. Journal of Behavioral Education, 23(1), 1-18. https://doi.org/10.1007/s10864-013-9191-3
Dingfelder, H. E., & Mandell, D. S. (2011). Bridging the research-to-practice gap in autism intervention: An application of diffusion of innovation theory. Journal of Autism and Developmental Disorders, 41(5), 597-609. https://doi.org/10.1007/s10803-010-1081-0
DiSanti, B. M., Eikeseth, S., Eldevik, S., Conrad, J. M., & Cotter-Fisher, K. L. (2020). Comparing structured mix and random rotation procedures to teach receptive labeling to children with autism. Behavioral Interventions, 35(1), 38-56. https://doi.org/10.1002/bin.1694
Dixon, D. R., Linstead, E., Granpeesheh, D., Novack, M. N., French, R., Stevens, E., Stevens, L., & Powell, A. (2016). An evaluation of the impact of supervision intensity, supervisor qualifications, and caseload on outcomes in the treatment of autism spectrum disorder. Behavior Analysis in Practice, 9(4), 339-348. https://doi.org/10.1007/s40617-016-0132-1
Dixon, D. R., Miyake, C. J., Nohelty, K., Novack, M. N., & Granpeesheh, D. (2020). Evaluation of an immersive virtual reality safety training used to teach pedestrian skills to children with autism spectrum disorder. Behavior Analysis in Practice, 13(3), 631-640. https://doi.org/10.1007/s40617-019-00401-1
Dixon, M. R., Peach, J., Daar, J. H., & Penrod, C. (2017). Teaching complex verbal operants to children with autism and establishing generalization using the PEAK curriculum. Journal of Applied Behavior Analysis, 50(2), 317-331. https://doi.org/10.1002/jaba.373
Doenyas, C., Şimdi, E., Özcan, E. Ç., Çataltepe, Z., & Birkan, B. (2014).
Autism and tablet computers in Turkey: Teaching picture sequencing skills via a web-based iPad application. International Journal of Child-Computer Interaction, 2(1), 60-71. https://doi.org/10.1016/j.ijcci.2014.04.002
Doherty, A., Bracken, M., & Gormley, L. (2018). Teaching children with autism to initiate and respond to peer mands using picture exchange communication system (PECS). Behavior Analysis in Practice, 11(4), 279-288. https://doi.org/10.1007/s40617-018-00311-8
Drasgow, E., Martin, C. A., Chezan, L. C., Wolfe, K., & Halle, J. W. (2016). Mand training: An examination of response-class structure in three children with autism and severe language delays. Behavior Modification, 40(3), 347-376. https://doi.org/10.1177/0145445515613582
Drysdale, B., Lee, C. Y. Q., Anderson, A., & Moore, D. W. (2015). Using video modeling incorporating animation to teach toileting to two children with autism spectrum disorder. Journal of Developmental and Physical Disabilities, 27(2), 149-165. https://doi.org/10.1007/s10882-014-9405-1
Dueñas, A. D., Plavnick, J. B., & Bak, M. Y. S. (2019a). Effects of joint video modeling on unscripted play behavior of children with autism spectrum disorder. Journal of Autism and Developmental Disorders, 49(1), 236-247. https://doi.org/10.1007/s10803-018-3719-2
Dueñas, A. D., Plavnick, J. B., & Maher, C. E. (2019b). Embedding tact instruction during play for preschool children with autism spectrum disorder. Education and Treatment of Children, 42(3), 361-384. https://doi.org/10.1353/etc.2019.0017
Dufour, M., & Lanovaz, M. J. (2017). Comparing two methods to promote generalization of receptive identification in children with autism spectrum disorders. Developmental Neurorehabilitation, 20(8), 463-474. https://doi.org/10.1080/17518423.2016.1211191
Dufour, M., & Lanovaz, M. J. (2020). Increasing compliance with wearing a medical device in children with autism. Journal of Applied Behavior Analysis, 53(2), 1089-1096.
https://doi.org/10.1002/jaba.628
Dupere, S., MacDonald, R. P. F., & Ahearn, W. H. (2013). Using video modeling with substitutable loops to teach varied play to children with autism. Journal of Applied Behavior Analysis, 46(3), 662-668. https://doi.org/10.1002/jaba.68
Eby, C. M., & Greer, R. D. (2017). Effects of social reinforcement on the emission of tacts by preschoolers. Behavioral Development Bulletin, 22(1), 23-43. https://doi.org/10.1037/bdb0000043
Edwards, C. K., Landa, R. K., Frampton, S. E., & Shillingsburg, M. A. (2018). Increasing functional leisure engagement for children with autism using backward chaining. Behavior Modification, 42(1), 9-33. https://doi.org/10.1177/0145445517699929
Eikeseth, S., Hayward, D., Gale, C., Gitlesen, J., & Eldevik, S. (2009). Intensity of supervision and outcome for preschool aged children receiving early and intensive behavioral interventions: A preliminary study. Research in Autism Spectrum Disorders, 3(1), 67-73. https://doi.org/10.1016/j.rasd.2008.04.003
Elliott, C., & Dillenburger, K. (2016). The effect of choice on motivation for young children on the autism spectrum during discrete trial teaching. Journal of Research in Special Educational Needs, 16(3), 187-198. https://doi.org/10.1111/1471-3802.12073
Ergenekon, Y., Tekin-Iftar, E., Kapan, A., & Akmanoglu, N. (2014). Comparison of video and live modeling in teaching response chains to children with autism. Education and Training in Autism and Developmental Disabilities, 49(2), 200-213. https://www.jstor.org/stable/23880605
Ertel, H., Wilder, D. A., Hodges, A., & Hurtado, L. (2019). The effect of various high-probability to low-probability instruction ratios during the use of the high-probability instructional sequence. Behavior Modification, 43(5), 639-655. https://doi.org/10.1177/0145445518782396
Ezzeddine, E. W., DeBar, R. M., Reeve, S. A., & Townsend, D. B. (2020). Using video modeling to teach play comments to dyads with ASD.
Journal of Applied Behavior Analysis, 53(2), 767-781. https://doi.org/10.1002/jaba.621
Falcomata, T. S., & Gainey, S. (2014). An evaluation of noncontingent reinforcement for the treatment of challenging behavior with multiple functions. Journal of Developmental and Physical Disabilities, 26(3), 317-324. https://doi.org/10.1007/s10882-014-9366-4
Falcomata, T. S., Muething, C. S., Roberts, G. J., Hamrick, J., & Shpall, C. (2016). Further evaluation of latency-based brief functional analysis methods: An evaluation of treatment utility. Developmental Neurorehabilitation, 19(2), 88-94. https://doi.org/10.3109/17518423.2014.910281
Falcomata, T. S., Shpall, C. S., Ringdahl, J. E., Ferguson, R. H., Wingate, H. V., & Swinnea, S. B. (2017). A comparison of high- and low-proficiency mands during functional communication training across multiple functions of problem behavior. Journal of Developmental and Physical Disabilities, 29(6), 983-1002. https://doi.org/10.1007/s10882-017-9571-z
Falcomata, T. S., Wacker, D. P., Ringdahl, J. E., Vinquist, K., & Dutt, A. (2013). An evaluation of generalization of mands during functional communication training. Journal of Applied Behavior Analysis, 46(2), 444-454. https://doi.org/10.1002/jaba.37
Falligant, J. M., & Pence, S. T. (2017). Preschool life skills using the response to intervention model with preschoolers with developmental disabilities. Behavior Analysis: Research and Practice, 17(3), 217-236. https://doi.org/10.1037/bar0000056
Ferguson, R. H., Falcomata, T. S., Ramirez-Cristoforo, A., & Londono, F. V. (2019). An evaluation of the effects of varying magnitudes of reinforcement on variable responding exhibited by individuals with autism. Behavior Modification, 43(6), 774-789. https://doi.org/10.1177/0145445519855615
Ferguson, J. L., Majeski, M. J., McEachin, J., Leaf, R., Chon, J. H., & Leaf, J. B. (2020). Evaluating discrete trial teaching with instructive feedback delivered in dyad arrangement via telehealth.
Journal of Applied Behavior Analysis, 53(4), 1876-1888. https://doi.org/10.1002/jaba.773
Fettig, A., Barton, E. E., Carter, A. S., & Eisenhower, A. S. (2016). Using e-coaching to support an early intervention provider’s implementation of a functional assessment-based intervention. Infants & Young Children, 29(2), 130-147. https://doi.org/10.1097/iyc.0000000000000058
Fettig, A., Schultz, T. R., & Sreckovic, M. A. (2015). Effects of coaching on the implementation of functional assessment-based parent intervention in reducing challenging behaviors. Journal of Positive Behavior Interventions, 17(3), 170-180. https://doi.org/10.1177/1098300714564164
Fisher, W. W., Felber, J. M., Phillips, L. A., Craig, A. R., Paden, A. R., & Niemeier, J. J. (2019a). Treatment of resistance to change in children with autism. Journal of Applied Behavior Analysis, 52(4), 974-993. https://doi.org/10.1002/jaba.588
Fisher, W. W., Greer, B. D., Fuhrman, A. M., & Querim, A. C. (2015). Using multiple schedules during functional communication training to promote rapid transfer of treatment effects. Journal of Applied Behavior Analysis, 48(4), 713-733. https://doi.org/10.1002/jaba.254
Fisher, W. W., Greer, B. D., Fuhrman, A. M., Saini, V., & Simmons, C. A. (2018). Minimizing resurgence of destructive behavior using behavioral momentum theory. Journal of Applied Behavior Analysis, 51(4), 831-853. https://doi.org/10.1002/jaba.499
Fisher, W. W., Pawich, T. L., Dickes, N., Paden, A. R., & Toussaint, K. (2014). Increasing the saliency of behavior-consequence relations for children with autism who exhibit persistent errors. Journal of Applied Behavior Analysis, 47(4), 738-748. https://doi.org/10.1002/jaba.172
Fisher, W. W., Retzlaff, B. J., Akers, J. S., DeSouza, A. A., Kaminski, A. J., & Machado, M. A. (2019b). Establishing initial auditory-visual conditional discriminations and emergence of initial tacts in young children with autism spectrum disorder.
Journal of Applied Behavior Analysis, 52(4), 1089-1106. https://doi.org/10.1002/jaba.586
Fiske, K. E., Cohen, A. P., Bamond, M. J., Delmolino, L., LaRue, R. H., & Sloman, K. N. (2014). The effects of magnitude-based differential reinforcement on the skill acquisition of children with autism. Journal of Behavioral Education, 23(4), 470-487. https://doi.org/10.1007/s10864-014-9211-y
Flores, M. M., & Ganz, J. B. (2014). Comparison of direct instruction and discrete trial teaching on the curriculum-based assessment of language performance of students with autism. Exceptionality, 22(4), 191-204. https://doi.org/10.1080/09362835.2013.865533
Fragale, C., Rojeski, L., O’Reilly, M., & Gevarter, C. (2016). Evaluation of functional communication training as a satiation procedure to reduce challenging behavior in instructional environments for children with autism. International Journal of Developmental Disabilities, 62(3), 139-146. https://doi.org/10.1080/20473869.2016.1183957
Frampton, S. E., Robinson, H. C., Conine, D. E., & Delfs, C. H. (2017). An abbreviated evaluation of the efficiency of listener and tact instruction for children with autism. Behavior Analysis in Practice, 10(2), 131-144. https://doi.org/10.1007/s40617-017-0175-y
Frampton, S. E., & Shillingsburg, M. A. (2018). Teaching children with autism to explain how: A case for problem solving? Journal of Applied Behavior Analysis, 51(2), 236-254. https://doi.org/10.1002/jaba.445
Frampton, S. E., Thompson, T. M., Bartlett, B. L., Hansen, B., & Shillingsburg, M. A. (2019). The use of matrix training to teach color-shape tacts to children with autism. Behavior Analysis in Practice, 12(2), 320-330. https://doi.org/10.1007/s40617-018-00288-4
Frampton, S. E., Wymer, S. C., Hansen, B., & Shillingsburg, M. A. (2016). The use of matrix training to promote generative language with children with autism. Journal of Applied Behavior Analysis, 49(4), 869-883. https://doi.org/10.1002/jaba.340
Franco, J. H., Davis, B.
L., & Davis, J. L. (2013). Increasing social interaction using prelinguistic milieu teaching with nonverbal school-age children with autism. American Journal of Speech-Language Pathology, 22(3), 489-502. https://doi.org/10.1044/1058-0360(2012/10-0103)
Fritz, J. N., Jackson, L. M., Stiefler, N. A., Wimberly, B. S., & Richardson, A. R. (2017). Noncontingent reinforcement without extinction plus differential reinforcement of alternative behavior during treatment of problem behavior. Journal of Applied Behavior Analysis, 50(3), 590-599. https://doi.org/10.1002/jaba.395
Fuhrman, A. M., Greer, B. D., Zangrillo, A. N., & Fisher, W. W. (2018). Evaluating competing activities to enhance functional communication training during reinforcement schedule thinning. Journal of Applied Behavior Analysis, 51(4), 931-942. https://doi.org/10.1002/jaba.486
Gadaire, D. M., Bartell, K., & Villacorta, J. (2018). Evaluating group activity schedules to promote social play in children with autism. Learning and Motivation, 64(1), 18-26. https://doi.org/10.1016/j.lmot.2017.11.004
Gadaire, D. M., Fisher, W. W., & Steege, M. (2014). The effects of presenting delays before and after task completion on self-control responding in children with behavior disorders. Journal of Applied Behavior Analysis, 47(1), 192-197. https://doi.org/10.1002/jaba.104
Galizio, A., Higbee, T. S., & Odum, A. L. (2020). Choice for reinforced behavioral variability in children with autism spectrum disorder. Journal of the Experimental Analysis of Behavior, 113(3), 495-514. https://doi.org/10.1002/jeab.591
Ganz, J. B., & Ayres, K. M. (2018). Methodological standards in single-case experimental design: Raising the bar. Research in Developmental Disabilities, 79, 3-9. https://doi.org/10.1016/j.ridd.2018.03.003
Ganz, J. B., Goodwyn, F. D., Boles, M. M., Hong, E. R., Rispoli, M. J., Lund, E. M., & Kite, E. (2013a). Impacts of a PECS instructional coaching intervention on practitioners and children with autism.
Augmentative and Alternative Communication, 29(3), 210-221. https://doi.org/10.3109/07434618.2013.818058
Ganz, J. B., Hong, E. R., & Goodwyn, F. D. (2013b). Effectiveness of the PECS phase III app and choice between the app and traditional PECS among preschoolers with ASD. Research in Autism Spectrum Disorders, 7(8), 973-983. https://doi.org/10.1016/j.rasd.2013.04.003
Ganz, J. B., Hong, E. R., Goodwyn, F., Kite, E., & Gilliland, W. (2015). Impact of PECS tablet computer app on receptive identification of pictures given a verbal stimulus. Developmental Neurorehabilitation, 18(2), 82-87. https://doi.org/10.3109/17518423.2013.821539
Garcia, D., Dukes, C., Brady, M. P., Scott, J., & Wilson, C. L. (2016). Using modeling and rehearsal to teach fire safety to children with autism. Journal of Applied Behavior Analysis, 49(3), 699-704. https://doi.org/10.1002/jaba.331
Garcia-Albea, E., Reeve, S. A., Reeve, K. F., & Brothers, K. J. (2014). Using audio script fading and multiple-exemplar training to increase vocal interactions in children with autism. Journal of Applied Behavior Analysis, 47(2), 325-343. https://doi.org/10.1002/jaba.125
Gast, D. L., & Ledford, J. R. (2014). Single case research methodology: Applications in special education and behavioral sciences (2nd ed.). Routledge.
Genc-Tosun, D., & Kurt, O. (2017). Effects of video modeling on the instructional efficiency of simultaneous prompting among preschoolers with autism spectrum disorder. Education and Training in Autism and Developmental Disabilities, 52(3), 291-304. https://www.jstor.org/stable/10.2307/26420401
Gengoux, G. W., Abrams, D. A., Schuck, R., Millan, M. E., Libove, R., Ardel, C. M., Phillips, J. M., Fox, M., Frazier, T. W., & Hardan, A. Y. (2019). Pivotal response treatment package for children with autism spectrum disorder: An RCT. Pediatrics, 144(3), e20190178. https://doi.org/10.1542/peds.2019-0178
Gerencser, K. R., Higbee, T. S., Akers, J. S., & Contreras, B. P. (2017).
Evaluation of interactive computerized training to teach parents to implement photographic activity schedules with children with autism spectrum disorder. Journal of Applied Behavior Analysis, 50(3), 567-581. https://doi.org/10.1002/jaba.386
Gerow, S., Radhakrishnan, S., Davis, T. N., Hodges, A., & Feind, A. (2020). A comparison of demand fading and a dense schedule of reinforcement during functional communication training. Behavior Analysis in Practice, 13(1), 90-103. https://doi.org/10.1007/s40617-019-00403-z
Gerow, S., Rivera, G., Akers, J. S., Kirkpatrick, M., & Radhakrishnan, S. (2019). Parent-implemented treatment for automatically maintained stereotypy. Behavioral Interventions, 34(4), 466-474. https://doi.org/10.1002/bin.1689
Gevarter, C., & Horan, K. (2019). A behavioral intervention package to increase vocalizations of individuals with autism during speech-generating device intervention. Journal of Behavioral Education, 28(1), 141-167. https://doi.org/10.1007/s10864-018-9300-4
Gevarter, C., O’Reilly, M. F., Kuhn, M., Mills, K., Ferguson, R., Watkins, L., Sigafoos, J., Lang, R., Rojeski, L., & Lancioni, G. E. (2016). Increasing the vocalizations of individuals with autism during intervention with a speech-generating device. Journal of Applied Behavior Analysis, 49(1), 17-33. https://doi.org/10.1002/jaba.270
Gevarter, C., O’Reilly, M. F., Rojeski, L., Sammarco, N., Sigafoos, J., Lancioni, G. E., & Lang, R. (2014). Comparing acquisition of AAC-based mands in three young children with autism spectrum disorder using iPad® applications with different display and design elements. Journal of Autism and Developmental Disorders, 44(10), 2464-2474. https://doi.org/10.1007/s10803-014-2115-9
Ghaemmaghami, M., Hanley, G. P., & Jessel, J. (2016). Contingencies promote delay tolerance. Journal of Applied Behavior Analysis, 49(3), 548-575. https://doi.org/10.1002/jaba.333
Gibbs, A. R., Tullis, C. A., Thomas, R., & Elkins, B. (2018).
The effects of noncontingent music and response interruption and redirection on vocal stereotypy. Journal of Applied Behavior Analysis, 51(4), 899-914. https://doi.org/10.1002/jaba.485
Giles, A., & Markham, V. (2017). Comparing book- and tablet-based picture activity schedules: Acquisition and preference. Behavior Modification, 41(5), 647-664. https://doi.org/10.1177/0145445517700817
Gilliam, A., Weil, T. M., & Miltenberger, R. G. (2013). Effects of preference on the emergence of untrained verbal operants. Journal of Applied Behavior Analysis, 46(2), 523-527. https://doi.org/10.1002/jaba.34
Glodowski, K. R., & Rodriguez, N. M. (2019). The effects of scenic picture prompts on variability during the acquisition of intraverbal categorization for children with autism. The Analysis of Verbal Behavior, 35(2), 134-148. https://doi.org/10.1007/s40616-019-00120-2
Gomes, S. R., Reeve, S. A., Brothers, K. J., Reeve, K. F., & Sidener, T. M. (2020). Establishing a generalized repertoire of initiating bids for joint attention in children with autism. Behavior Modification, 44(3), 394-428. https://doi.org/10.1177/0145445518822499
Goods, K. S., Ishijima, E., Chang, Y., & Kasari, C. (2013). Preschool based JASPER intervention in minimally verbal children with autism: Pilot RCT. Journal of Autism and Developmental Disorders, 43(5), 1050-1056. https://doi.org/10.1007/s10803-012-1644-3
Gorgan, E. M., & Kodak, T. (2019). Comparison of interventions to treat prompt dependence for children with developmental disabilities. Journal of Applied Behavior Analysis, 52(4), 1049-1063. https://doi.org/10.1002/jaba.638
Gormley, L., Healy, O., Doherty, A., O’Regan, D., & Grey, I. (2020). Staff training in intellectual and developmental disability settings: A scoping review. Journal of Developmental and Physical Disabilities, 32, 187-212. https://doi.org/10.1007/s10882-019-09683-3
Green, D. R., Ferguson, J. L., Cihon, J.
H., Torres, N., Leaf, R., McEachin, J., Rudrud, E., Schulze, K., & Leaf, J. B. (2020). The teaching interaction procedure as a staff training tool. Behavior Analysis in Practice, 13(2), 421-433. https://doi.org/10.1007/s40617-019-00357-2
Greenberg, A. L., Tomaino, M. E., & Charlop, M. H. (2014). Adapting the picture exchange communication system to elicit vocalizations in children with autism. Journal of Developmental and Physical Disabilities, 26(1), 35-51. https://doi.org/10.1007/s10882-013-9344-2
Greer, B. D., Fisher, W. W., Briggs, A. M., Lichtblau, K. R., Phillips, L. A., & Mitteer, D. R. (2019). Using schedule-correlated stimuli during functional communication training to promote the rapid transfer of treatment effects. Behavioral Development, 24(2), 100-119. https://doi.org/10.1037/bdb0000085
Greer, B. D., Neidert, P. L., & Dozier, C. L. (2016). A component analysis of toilet-training procedures recommended for young children. Journal of Applied Behavior Analysis, 49(1), 69-84. https://doi.org/10.1002/jaba.275
Greer, R. D., & Han, H. S. (2015). Establishment of conditioned reinforcement for visual observing and the emergence of generalized visual-identity matching. Behavioral Development Bulletin, 20(2), 227-252. https://doi.org/10.1037/h0101316
Groskreutz, N. C., Groskreutz, M. P., & Higbee, T. S. (2011). Effects of varied levels of treatment integrity on appropriate toy manipulation in children with autism. Research in Autism Spectrum Disorders, 5(4), 1358-1369. https://doi.org/10.1016/j.rasd.2011.01.018
Groskreutz, M. P., Peters, A., Groskreutz, N. C., & Higbee, T. S. (2015). Increasing play-based commenting in children with autism spectrum disorder using a novel script-frame procedure. Journal of Applied Behavior Analysis, 48(2), 442-447. https://doi.org/10.1002/jaba.194
Grow, L. L., Kodak, T., & Carr, J. E. (2014). A comparison of methods for teaching receptive labeling to children with autism spectrum disorders: A systematic replication.
Journal of Applied Behavior Analysis, 47(3), 600-605. https://doi.org/10.1002/jaba.141
Gruber, D. J., & Poulson, C. L. (2016). Graduated guidance delivered by parents to teach yoga to children with developmental delays. Journal of Applied Behavior Analysis, 49(1), 193-198. https://doi.org/10.1002/jaba.260
Gunby, K. V., & Rapp, J. T. (2014). The use of behavioral skills training and in situ feedback to protect children with autism from abduction lures. Journal of Applied Behavior Analysis, 47(4), 856-860. https://doi.org/10.1002/jaba.173
Gunby, K. V., Rapp, J. T., & Bottoni, M. M. (2018). A progressive model for teaching children with autism to follow gaze shift. Journal of Applied Behavior Analysis, 51(3), 694-701. https://doi.org/10.1002/jaba.479
Gunby, K. V., Rapp, J. T., Bottoni, M. M., Marchese, N., & Wu, B. (2017). Teaching children with autism to follow gaze shift: A brief report on three cases. Behavioral Interventions, 32(2), 175-181. https://doi.org/10.1002/bin.1465
Gunning, C., Holloway, J., & Grealish, L. (2020). An evaluation of parents as behavior change agents in the preschool life skills program. Journal of Applied Behavior Analysis, 53(2), 889-917. https://doi.org/10.1002/jaba.660
Guðmundsdóttir, K., Sigurðardóttir, Z. G., & Ala’i-Rosales, S. (2017). Evaluation of caregiver training via telecommunication for rural Icelandic children with autism. Behavioral Development Bulletin, 22(1), 215-229. https://doi.org/10.1037/bdb0000040
Gureghian, D. L., Vladescu, J. C., Gashi, R., & Campanaro, A. (2020). Reinforcer choice as an antecedent versus consequence during skills acquisition. Behavior Analysis in Practice, 13(2), 462-466. https://doi.org/10.1007/s40617-019-00356-3
Gutierrez Jr., A., Bennett, K. D., McDowell, L. S., Cramer, E. D., & Crocco, C. (2016). Comparison of video prompting with and without voice-over narration: A replication with young children with autism. Behavioral Interventions, 31(4), 377-389.
https://doi.org/10.1002/bin.1456
Hanley, G. P., Jin, C. S., Vanselow, N. R., & Hanratty, L. A. (2014). Producing meaningful improvements in problem behavior of children with autism via synthesized analyses and treatments. Journal of Applied Behavior Analysis, 47(1), 16-36. https://doi.org/10.1002/jaba.106
Hanley, N. M., Carr, J. E., & LeBlanc, L. A. (2019). Teaching children with autism spectrum disorder to tact auditory stimuli. Journal of Applied Behavior Analysis, 52(3), 733-738. https://doi.org/10.1002/jaba.605
Hannula, C., Jimenez-Gomez, C., Wu, W., Brewer, A. T., Kodak, T., Gilroy, S. P., Hutsell, B. A., Alsop, B., & Podlesnik, C. A. (2020). Quantifying error of bias and discriminability in conditional-discrimination performance in children diagnosed with autism spectrum disorder. Learning and Motivation, 71, 101659. https://doi.org/10.1016/j.lmot.2020.101659
Hansen, B., DeSouza, A. A., Stuart, A. L., & Shillingsburg, M. A. (2019). Clinical application of a high-probability sequence to promote compliance with vocal imitation in a child with autism spectrum disorder. Behavior Analysis in Practice, 12(1), 199-203. https://doi.org/10.1007/s40617-018-00280-y
Hansen, S. G., Raulston, T. J., Machalicek, W., & Frantz, R. (2018). Caregiver-mediated joint attention intervention. Behavioral Interventions, 33(2), 205-211. https://doi.org/10.1002/bin.1523
Haq, S. S., Kodak, T., Kurtz-Nelson, E., Porritt, M., Rush, K., & Cariveau, T. (2015). Comparing the effects of massed and distributed practice on skill acquisition for children with autism. Journal of Applied Behavior Analysis, 48(2), 454-459. https://doi.org/10.1002/jaba.213
Hart, S. L., & Banda, D. R. (2018). Examining the effects of peer mediation on the social skills of students with autism spectrum disorder as compared to their peers. Education and Training in Autism and Developmental Disabilities, 53(2), 160-175. https://www.jstor.org/stable/26495267
Hartley, B. K., Courtney, W. T., Rosswurm, M., & LaMarca, V. J. (2016).
The apprentice: An innovative approach to meet the Behavior Analyst Certification Board’s supervision standards. Behavior Analysis in Practice, 9(4), 329-338. https://doi.org/10.1007/s40617-016-0136-x
Heffernan, L., & Lyons, D. (2016). Differential reinforcement of other behavior for the reduction of severe nail biting. Behavior Analysis in Practice, 9(3), 253-256. https://doi.org/10.1007/s40617-016-0106-3
Hendricks, D. (2010). Employment and adults with autism spectrum disorders: Challenges and strategies for success. Journal of Vocational Rehabilitation, 32(2), 125-134. https://doi.org/10.3233/JVR-2010-0502
Henrickson, M. L., Rapp, J. T., & Ashbeck, H. A. (2015). Teaching with massed versus interspersed trials: Effects on acquisition, maintenance, and problem behavior. Behavioral Interventions, 30(1), 36-50. https://doi.org/10.1002/bin.1396
Herman, C., Healy, O., & Lydon, S. (2018). An interview-informed synthesized contingency analysis to inform the treatment of challenging behavior in a young child with autism. Developmental Neurorehabilitation, 21(3), 202-207. https://doi.org/10.1080/17518423.2018.1437839
Hodges, A. C., Betz, A. M., Wilder, D. A., & Antia, K. (2019). The use of contingent acoustical feedback to decrease toe walking in children with autism. Education and Treatment of Children, 42(2), 151-160. https://doi.org/10.1353/etc.2019.0007
Hodges, A. C., Wilder, D. A., & Ertel, H. (2018). The use of a multiple schedule to decrease toe walking in a child with autism. Behavioral Interventions, 33(4), 440-447. https://doi.org/10.1002/bin.1528
Hoffmann, A. N., Sellers, T. P., Halversen, H., & Bloom, S. E. (2018). Implementation of interventions informed by precursor functional analyses with young children: A replication. Journal of Applied Behavior Analysis, 51(4), 879-889. https://doi.org/10.1002/jaba.502
Holzinger, D., Laister, D., Vivanti, G., Barbaresi, W. J., & Fellinger, J. (2019).
Feasibility and outcomes of the Early Start Denver Model implemented with low intensity in a community setting in Austria. Journal of Developmental & Behavioral Pediatrics, 40(5), 354-363. https://doi.org/10.1097/dbp.0000000000000675
Hong, E. R., Neely, L., Rispoli, M. J., Trepinski, T. M., Gregori, E., & Davis, T. (2015). A comparison of general and explicit delay cues to reinforcement for tangible-maintained challenging behavior. Developmental Neurorehabilitation, 18(6), 395-401. https://doi.org/10.3109/17518423.2013.874378
Hu, X., & Lee, G. (2019). Effects of PECS on the emergence of vocal mands and the reduction of aggressive behavior across settings for a child with autism. Behavioral Disorders, 44(4), 215-226. https://doi.org/10.1177/0198742918806925
Hu, X., Zheng, Q., & Lee, G. (2018). Using peer-mediated LEGO play intervention to improve social interactions for Chinese children with autism in an inclusive setting. Journal of Autism and Developmental Disorders, 48(7), 2444-2457. https://doi.org/10.1007/s10803-018-3502-4
Humphreys, T., Polick, A. S., Howk, L. L., Thaxton, J. R., & Ivancic, A. P. (2013). An evaluation of repeating the discriminative stimulus when using least-to-most prompting to teach intraverbal behavior to children with autism. Journal of Applied Behavior Analysis, 46(2), 534-538. https://doi.org/10.1002/jaba.43
Hundert, J., Rowe, S., & Harrison, E. (2014). The combined effects of social script training and peer buddies on generalized peer interaction of children with ASD in inclusive classrooms. Focus on Autism and Other Developmental Disabilities, 29(4), 206-215. https://doi.org/10.1177/1088357614522288
Ibañez, V. F., Piazza, C. C., & Peterson, K. M. (2019). A translational evaluation of renewal of inappropriate mealtime behavior. Journal of Applied Behavior Analysis, 52(4), 1005-1020. https://doi.org/10.1002/jaba.647
Ingersoll, B., & Wainer, A. (2013).
Initial efficacy of Project ImPACT: A parent-mediated social communication intervention for young children with ASD. Journal of Autism and Developmental Disorders, 43(12), 2943-2952. https://doi.org/10.1007/s10803-013-1840-9
Ingersoll, B. R., Wainer, A. L., Berger, N. I., & Walton, K. M. (2017). Efficacy of low intensity, therapist-implemented Project ImPACT for increasing social communication skills in young children with ASD. Developmental Neurorehabilitation, 20(8), 502-510. https://doi.org/10.1080/17518423.2016.1278054
Ishizuka, Y., & Yamamoto, J. (2016). Contingent imitation increases verbal interaction in children with autism spectrum disorders. Autism, 20(8), 1011-1020. https://doi.org/10.1177/1362361315622856
Jeffries, T., Crosland, K., & Miltenberger, R. (2016). Evaluating a tablet application and differential reinforcement to increase eye contact in children with autism. Journal of Applied Behavior Analysis, 49(1), 182-187. https://doi.org/10.1002/jaba.262
Jessel, J., Hanley, G. P., & Ghaemmaghami, M. (2016). A translational evaluation of transitions. Journal of Applied Behavior Analysis, 49(2), 359-376. https://doi.org/10.1002/jaba.283
Jessel, J., Ingvarsson, E. T., Metras, R., Kirk, H., & Whipple, R. (2018a). Achieving socially significant reductions in problem behavior following the interview-informed synthesized contingency analysis: A summary of 25 outpatient applications. Journal of Applied Behavior Analysis, 51(1), 130-157. https://doi.org/10.1002/jaba.436
Jessel, J., Ingvarsson, E. T., Metras, R., Whipple, R., Kirk, H., & Solsbery, L. (2018b). Treatment of elopement following a latency-based interview-informed, synthesized contingency analysis. Behavioral Interventions, 33(3), 271-283. https://doi.org/10.1002/bin.1525
Jimenez-Gomez, C., Rajagopal, S., Nastri, R., & Chong, I. M. (2019). Matrix training for expanding the communication of toddlers and preschoolers with autism spectrum disorder. Behavior Analysis in Practice, 12(2), 375-386.
https://doi.org/10.1007/s40617-019-00346-5
Joachim, B. T., & Carroll, R. A. (2018). A comparison of consequences for correct responses during discrete-trial instruction. Learning and Motivation, 62(1), 15-28. https://doi.org/10.1016/j.lmot.2017.01.002
Jones, J., Lerman, D. C., & Lechago, S. (2014). Assessing stimulus control and promoting generalization via video modeling when teaching social responses to children with autism. Journal of Applied Behavior Analysis, 47(1), 37-50. https://doi.org/10.1002/jaba.81
Jones, S. H., St. Peter, C. C., & Ruckle, M. M. (2020). Reporting of demographic variables in the Journal of Applied Behavior Analysis. Journal of Applied Behavior Analysis, 53(3), 1304-1315. https://doi.org/10.1002/jaba.722
Julien, H. M., & Reichle, J. (2016). A comparison of high and low dosages of a component of milieu teaching strategies for two preschool-age learners with autism spectrum disorder. Language, Speech, and Hearing Services in Schools, 47(1), 87-98. https://doi.org/10.1044/2015_LSHSS-15-0035
Jung, S., & Sainato, D. M. (2015). Teaching games to young children with autism spectrum disorder using special interests and video modelling. Journal of Intellectual and Developmental Disability, 40(2), 198-212. https://doi.org/10.3109/13668250.2015.1027674
Kadey, H. J., Roane, H. S., Diaz, J. C., & McCarthy, C. M. (2013). Using a Nuk® brush to increase acceptance of solids and liquids for two children diagnosed with autism. Research in Autism Spectrum Disorders, 7(11), 1461-1480. https://doi.org/10.1016/j.rasd.2013.07.017
Kang, S., O’Reilly, M., Rojeski, L., Blenden, K., Xu, Z., Davis, T., Sigafoos, J., & Lancioni, G. (2013). Effects of tangible and social reinforcers on skill acquisition, stereotyped behavior, and task engagement in three children with autism spectrum disorders. Research in Developmental Disabilities, 34(2), 739-744. https://doi.org/10.1016/j.ridd.2012.10.007
Karaaslan, O., Diken, I. H., & Mahoney, G. (2013).
A randomized control study of responsive teaching with young Turkish children and their mothers. Topics in Early Childhood Special Education, 33(1), 18-27. https://doi.org/10.1177/0271121411429749 Karabekir, E. P., & Akmanoğlu, N. (2018). Effectiveness of video modeling presented via smartboard for teaching social response behavior to children with autism. Education and Training in Autism and Developmental Disabilities, 53(4), 363-377. https://www.jstor.org/stable/26563479 Kasari, C., Lawton, K., Shih, W., Barker, T. V., Landa, R., Lord, C., Orlich, F., King, B., Wetherby, A., & Senturk, D. (2014). Caregiver-mediated intervention for low-resourced preschoolers with autism: An RCT. Pediatrics, 134(1), 72-79. Kasari, C., & Smith, T. (2013). Interventions in schools for children with autism spectrum disorder: Methods and recommendations. Autism, 17(3), 254-267. https://doi.org/10.1177/1362361312470496 Kassardjian, A., Leaf, J. B., Ravid, D., Leaf, J. A., Alcalay, A., Dale, S., Tsuji, K., Taubman, M., Leaf, R., McEachin, J., & Oppenheim-Leaf, M. L. (2014). Comparing the teaching interaction procedure to social stories: A replication study. Journal of Autism and Developmental Disorders, 44(9), 2329-2340. https://doi.org/10.1007/s10803-014-2103-0 Kassardjian, A., Taubman, M., Rudrud, E., Leaf, J. B., Edwards, A., McEachin, J., Leaf, R., & Schulze, K. (2013). Utilizing teaching interactions to facilitate social skills in the natural environment. Education and Training in Autism and Developmental Disabilities, 48(2), 245-257. Katz, E., & Girolametto, L. (2013). Peer-mediated intervention for preschoolers with ASD implemented in early childhood education settings. Topics in Early Childhood Special Education, 33(3), 133-143. https://doi.org/10.1177/0271121413484972 Katz, E., & Girolametto, L. (2015). Peer-mediated intervention for preschoolers with ASD: Effects on responses and initiations. International Journal of Speech-Language Pathology, 17(6), 565-576.
https://doi.org/10.3109/17549507.2015.1024166 Kautz, M. E., DeBar, R. M., Vladescu, J. C., & Graff, R. B. (2018). A further evaluation of choice of task sequence. The Journal of Special Education, 52(1), 16-28. https://doi.org/10.1177/0022466917735655 Keen, D., & Pennell, D. (2015). The use of preferred items in a word-learning task: Effects on on-task behavior and learning outcomes of children with autism spectrum disorder. Australasian Journal of Special Education, 39(1), 56-66. https://doi.org/10.1017/jse.2014.16 Kelly, L., & Holloway, J. (2015). An investigation of the effectiveness of behavioral momentum on the acquisition and fluency outcomes of tacts in three children with autism spectrum disorder. Research in Autism Spectrum Disorders, 9(1), 182-192. https://doi.org/10.1016/j.rasd.2014.10.007 Kim, S., & Clarke, E. (2015). Case study: An iPad-based intervention on turn-taking behaviors in preschoolers with autism. Behavioral Development Bulletin, 20(2), 253-264. https://doi.org/10.1037/h0101314 Kim, S. Y., Chung, K., & Jung, S. (2018). Effects of repeated food exposure on increasing vegetable consumption in preschool children with autism spectrum disorder. Research in Autism Spectrum Disorders, 47(1), 26-35. https://doi.org/10.1016/j.rasd.2018.01.003 King, M. L., Takeguchi, K., Barry, S. E., Rehfeldt, R. A., Boyer, V. E., & Mathews, T. L. (2014). Evaluation of the iPad in the acquisition of requesting skills for children with autism spectrum disorder. Research in Autism Spectrum Disorders, 8(9), 1107-1120. https://doi.org/10.1016/j.rasd.2014.05.011 Kisamore, A. N., Karsten, A. M., & Mann, C. C. (2016). Teaching multiply controlled intraverbals to children and adolescents with autism spectrum disorders. Journal of Applied Behavior Analysis, 49(4), 826-847. https://doi.org/10.1002/jaba.344 Klaus, S., Hixson, M. D., Drevon, D. D., & Nutkins, C. (2019). A comparison of prompting methods to teach sight words to students with autism spectrum disorder.
Behavioral Interventions, 34(3), 352-365. https://doi.org/10.1002/bin.1667 Knight, R. M., & Johnson, C. M. (2014). Using a behavioral treatment package for sleep problems in children with autism spectrum disorders. Child & Family Behavior Therapy, 36(3), 204-221. https://doi.org/10.1080/07317107.2014.934171 Kobari-Wright, V. V., & Miguel, C. F. (2014). The effects of listener training on the emergence of categorization and speaker behavior in children with autism. Journal of Applied Behavior Analysis, 47(2), 431-436. https://doi.org/10.1002/jaba.115 Kodak, T., Campbell, V., Bergmann, S., LeBlanc, B., Kurtz-Nelson, E., Cariveau, T., Haq, S., Zemantic, P., & Mahon, J. (2016). Examination of efficacious, efficient, and socially valid error-correction procedures to teach sight words and prepositions to children with autism spectrum disorder. Journal of Applied Behavior Analysis, 49(3), 532-547. https://doi.org/10.1002/jaba.310 Kodak, T., Clements, A., & LeBlanc, B. (2013). A rapid assessment of instructional strategies to teach auditory-visual conditional discriminations to children with autism. Research in Autism Spectrum Disorders, 7(6), 801-807. https://doi.org/10.1016/j.rasd.2013.02.007 Kodak, T., Clements, A., Paden, A. R., LeBlanc, B., Mintz, J., & Toussaint, K. A. (2015). Examination of the relation between an assessment of skills and performance on auditory-visual conditional discriminations for children with autism spectrum disorder. Journal of Applied Behavior Analysis, 48(1), 52-70. https://doi.org/10.1002/jaba.160 Kodak, T., Halbur, M., Bergmann, S., Costello, D. R., Benitez, B., Olsen, M., Gorgan, E., & Cliett, T. (2020). A comparison of stimulus set size on tact training for children with autism spectrum disorder. Journal of Applied Behavior Analysis, 53(1), 265-283. https://doi.org/10.1002/jaba.553 Kodak, T., & Paden, A. R. (2015). A comparison of intraverbal and listener training for children with autism spectrum disorder.
The Analysis of Verbal Behavior, 31(1), 137-144. https://doi.org/10.1007/s40616-015-0033-3 Koegel, R. L., Bradshaw, J. L., Ashbaugh, K., & Koegel, L. K. (2014a). Improving question-asking initiations in young children with autism using pivotal response treatment. Journal of Autism and Developmental Disorders, 44(4), 816-827. https://doi.org/10.1007/s10803-013-1932-6 Koegel, L. K., Park, M. N., & Koegel, R. L. (2014b). Using self-management to improve the reciprocal social conversation of children with autism spectrum disorder. Journal of Autism and Developmental Disorders, 44(5), 1055-1063. https://doi.org/10.1007/s10803-013-1956-y Kohler, K. T., & Malott, R. W. (2014). Matrix training and verbal generativity in children with autism. The Analysis of Verbal Behavior, 30(2), 170-177. https://doi.org/10.1007/s40616-014-0016-9 Kourassanis, J., Jones, E. A., & Fienup, D. M. (2015). Peer-video modeling: Teaching chained social game behaviors to children with ASD. Journal of Developmental and Physical Disabilities, 27(1), 25-36. https://doi.org/10.1007/s10882-014-9399-8 Kreibich, S. R., Chen, M., & Reichle, J. (2015). Teaching a child with autism to request breaks while concurrently increasing task engagement. Language, Speech, and Hearing Services in Schools, 46(3), 256-265. https://doi.org/10.1044/2015_LSHSS-14-0081 Krstovska-Guerrero, I., & Jones, E. A. (2016). Social-communication intervention for toddlers with autism spectrum disorder: Eye gaze in the context of requesting and joint attention. Journal of Developmental and Physical Disabilities, 28(2), 289-316. https://doi.org/10.1007/s10882-015-9466-9 Kryzak, L. A., Bauer, S., Jones, E. A., & Sturmey, P. (2013). Increasing responding to others’ joint attention directives using circumscribed interests. Journal of Applied Behavior Analysis, 46(3), 674-679. https://doi.org/10.1002/jaba.73 Kryzak, L. A., & Jones, E. A. (2015).
The effect of prompts within embedded circumscribed interests to teach initiating joint attention in children with autism spectrum disorders. Journal of Developmental and Physical Disabilities, 27(3), 265-284. https://doi.org/10.1007/s10882-014-9414-0 Kuo, N., & Plavnick, J. B. (2015). Using an antecedent art intervention to improve the behavior of a child with autism. Art Therapy, 32(2), 54-59. https://doi.org/10.1080/07421656.2015.1028312 Kurt, O., & Kutlu, M. (2019). Effectiveness of social stories in teaching abduction-prevention skills to children with autism. Journal of Autism and Developmental Disorders, 49(9), 3807-3818. https://doi.org/10.1007/s10803-019-04096-9 LaLonde, K. B., Dueñas, A. D., Neil, N., Wawrzonek, A., & Plavnick, J. B. (2020a). An evaluation of two tact-training procedures on acquired tacts and tacting during play. The Analysis of Verbal Behavior, 36, 180-192. https://doi.org/10.1007/s40616-020-00131-4 LaLonde, K. B., Jones, S., West, L., & Santman, C. (2020b). An evaluation of a game-based treatment package on intraverbals in young children with autism. Behavior Analysis in Practice, 13, 152-157. https://doi.org/10.1007/s40617-019-00397-8 Lang, R., Machalicek, W., Rispoli, M., O’Reilly, M., Sigafoos, J., Lancioni, G., Peters-Scheffer, N., & Didden, R. (2014a). Play skills taught via behavioral intervention generalize, maintain, and persist in the absence of socially mediated reinforcement in children with autism. Research in Autism Spectrum Disorders, 8(7), 860-872. https://doi.org/10.1016/j.rasd.2014.04.007 Lang, R., van der Werff, M., Verbeek, K., Didden, R., Davenport, K., Moore, M., Lee, A., Rispoli, M., Machalicek, W., O’Reilly, M., Sigafoos, J., & Lancioni, G. (2014b). Comparison of high and low preferred topographies of contingent attention during discrete trial training. Research in Autism Spectrum Disorders, 8(10), 1279-1286. https://doi.org/10.1016/j.rasd.2014.06.012 Lanovaz, M. J., Rapp, J.
T., Maciw, I., Dorion, C., & Prégent-Pelletier, É. (2016). Preliminary effects of parent-implemented behavioural interventions for stereotypy. Developmental Neurorehabilitation, 19(3), 193-196. https://doi.org/10.3109/17518423.2014.986821 Lanovaz, M. J., Rapp, J. T., Maciw, I., Prégent-Pelletier, É., Dorion, C., Ferguson, S., & Saade, S. (2014). Effects of multiple interventions for reducing vocal stereotypy: Developing a sequential intervention model. Research in Autism Spectrum Disorders, 8(5), 529-545. https://doi.org/10.1016/j.rasd.2014.01.009 Laprime, A. P., & Dittrich, G. A. (2014). An evaluation of a treatment package consisting of discrimination training and differential reinforcement with response cost and a social story on vocal stereotypy for a preschooler with autism in a preschool classroom. Education & Treatment of Children, 37(3), 407-430. https://www.jstor.org/stable/45153587 Larkin, W., Hawkins, R. O., & Collins, T. (2016). Using trial-based functional analysis to design effective interventions for students diagnosed with autism spectrum disorder. School Psychology Quarterly, 31(4), 534-547. https://doi.org/10.1037/spq0000158 Leaf, J. B., Alcalay, A., Leaf, J. A., Tsuji, K., Kassardjian, A., Dale, S., McEachin, J., Taubman, M., & Leaf, R. (2016a). Comparison of most-to-least to error correction for teaching receptive labelling for two children diagnosed with autism. Journal of Research in Special Educational Needs, 16(4), 217-225. https://doi.org/10.1111/1471-3802.12067 Leaf, J. B., Aljohani, W. A., Milne, C. M., Ferguson, J. L., Cihon, J. H., Oppenheim-Leaf, M. L., McEachin, J., & Leaf, R. (2019). Training behavior change agents and parents to implement discrete trial teaching: A literature review. Review Journal of Autism and Developmental Disorders, 6(1), 26-39. https://doi.org/10.1007/s40489-018-0152-6 Leaf, J. B., Cihon, J. H., Alcalay, A., Mitchell, E., Townley-Cochran, D., Miller, K., Leaf, R., Taubman, M., & McEachin, J. (2017).
Instructive feedback embedded within group instruction for children diagnosed with autism spectrum disorder. Journal of Applied Behavior Analysis, 50(2), 304-316. https://doi.org/10.1002/jaba.375 Leaf, J. B., Cihon, J. H., Ferguson, J. L., McEachin, J., Leaf, R., & Taubman, M. (2018). Evaluating three methods of stimulus rotation when teaching receptive labels. Behavior Analysis in Practice, 11(4), 334-349. https://doi.org/10.1007/s40617-018-0249-5 Leaf, J. B., Cihon, J. H., Ferguson, J. L., Milne, C. M., Leaf, R., & McEachin, J. (2020). Comparing error correction to errorless learning: A randomized clinical trial. The Analysis of Verbal Behavior, 36, 1-20. https://doi.org/10.1007/s40616-019-00124-y Leaf, J. B., Cihon, J. H., Townley-Cochran, D., Miller, K., Leaf, R., McEachin, J., & Taubman, M. (2016b). An evaluation of positional prompts for teaching receptive identification to individuals diagnosed with autism spectrum disorder. Behavior Analysis in Practice, 9(4), 349-363. https://doi.org/10.1007/s40617-016-0146-8 Leaf, J. B., Dale, S., Kassardjian, A., Tsuji, K. H., Taubman, M., McEachin, J. J., Leaf, R. B., & Oppenheim-Leaf, M. L. (2014a). Comparing different classes of reinforcement to increase expressive language for individuals with autism. Education and Training in Autism and Developmental Disabilities, 49(4), 533-546. https://www.jstor.org/stable/24582349 Leaf, J. A., Leaf, J. B., Milne, C., Townley-Cochran, D., Oppenheim-Leaf, M. L., Cihon, J. H., Taubman, M., McEachin, J., & Leaf, R. (2016c). The effects of the cool versus not cool procedure to teach social game play to individuals diagnosed with autism spectrum disorder. Behavior Analysis in Practice, 9(1), 34-49. https://doi.org/10.1007/s40617-016-0112-5 Leaf, J. B., Leaf, R., Taubman, M., McEachin, J., & Delmolino, L. (2014b). Comparison of flexible prompt fading to error correction for children with autism spectrum disorder. Journal of Developmental and Physical Disabilities, 26(2), 203-224.
https://doi.org/10.1007/s10882-013-9354-0 Leaf, J. B., Taubman, M., Leaf, J., Dale, S., Tsuji, K., Kassardjian, A., Alcalay, A., Milne, C., Mitchell, E., Townley-Cochran, D., Leaf, R., & McEachin, J. (2015). Teaching social interaction skills using cool versus not cool. Child & Family Behavior Therapy, 37(4), 321-334. https://doi.org/10.1080/07317107.2015.1104778 Leaf, J. B., Taubman, M., Milne, C., Dale, S., Leaf, J., Townley-Cochran, D., Tsuji, K., Kassardjian, A., Alcalay, A., Leaf, R., & McEachin, J. (2016d). Teaching social communication skills using a cool versus not cool procedure plus role-playing and a social skills taxonomy. Education & Treatment of Children, 39(1), 44-63. https://www.muse.jhu.edu/article/611999 Leaf, J. B., Townley-Cochran, D., Mitchell, E., Milne, C., Alcalay, A., Leaf, J., Leaf, R., Taubman, M., McEachin, J., & Oppenheim-Leaf, M. L. (2016e). Evaluation of multiple-alternative prompts during tact training. Journal of Applied Behavior Analysis, 49(2), 399-404. https://doi.org/10.1002/jaba.289 LeBlanc, L. A., & Gillis, J. M. (2012). Behavioral interventions for children with autism spectrum disorders. The Pediatric Clinics of North America, 59(1), 147-164. https://doi.org/10.1016/j.pcl.2011.10.006 Ledbetter-Cho, K., Lang, R., Davenport, K., Moore, M., Lee, A., Howell, A., Drew, C., Dawson, D., Charlop, M. H., Falcomata, T., & O’Reilly, M. (2015). Effects of script training on the peer-to-peer communication of children with autism spectrum disorder. Journal of Applied Behavior Analysis, 48(4), 785-799. https://doi.org/10.1002/jaba.240 Ledford, J. R., & Gast, D. L. (2014). Measuring procedural fidelity in behavioral research. Neuropsychological Rehabilitation, 24(3-4), 332-348. https://doi.org/10.1080/09602011.2013.861352 Ledford, J. R., Lane, J. D., Shepley, C., & Kroll, S. M. (2016). Using teacher-implemented playground interventions to increase engagement, social behaviors, and physical activity for young children with autism.
Focus on Autism and Other Developmental Disabilities, 31(3), 163-173. https://doi.org/10.1177/1088357614547892 Ledford, J. R., & Wehby, J. H. (2015). Teaching children with autism in small groups with students who are at-risk for academic problems: Effects on academic and social behaviors. Journal of Autism and Developmental Disorders, 45(6), 1624-1635. https://doi.org/10.1007/s10803-014-2317-1 Lee, A., Lang, R., Davenport, K., Moore, M., Rispoli, M., van der Meer, L., Carnett, A., Raulston, T., Tostanoski, A., & Chung, C. (2015). Comparison of therapist implemented and iPad-assisted interventions for children with autism. Developmental Neurorehabilitation, 18(2), 97-103. https://doi.org/10.3109/17518423.2013.830231 Lee, G. T., Feng, H., Xu, S., & Jin, S. (2019a). Increasing “object-substitution” symbolic play in young children with autism spectrum disorders. Behavior Modification, 43(1), 82-114. https://doi.org/10.1177/0145445517739276 Lee, G. T., Hu, X., Liu, Y., Zou, C., Cheng, X., Zhao, Q., & Huang, J. (2020). Increasing response diversity to intraverbals in children with autism spectrum disorder. Journal of Autism and Developmental Disorders, 50(1), 292-307. https://doi.org/10.1007/s10803-019-04250-3 Lee, S. H., & Lee, L. W. (2015). Promoting snack time interactions of children with autism in a Malaysian preschool. Topics in Early Childhood Special Education, 35(2), 89-101. https://doi.org/10.1177/0271121415575272 Lee, S. Y., Lo, Y., & Lo, Y. (2017). Teaching functional play skills to a young child with autism spectrum disorder through video self-modeling. Journal of Autism and Developmental Disorders, 47(8), 2295-2306. https://doi.org/10.1007/s10803-017-3147-8 Lee, G. T., Xu, S., Guo, S., Gilic, L., Pu, Y., & Xu, J. (2019b). Teaching “imaginary objects” symbolic play to young children with autism. Journal of Autism and Developmental Disorders, 49(10), 4109-4122. https://doi.org/10.1007/s10803-019-04123-9 Lee, G. T., Xu, S., Zou, H., Gilic, L., & Lee, M.
W. (2019c). Teaching children with autism to understand metaphors. The Psychological Record, 69(4), 499-512. https://doi.org/10.1007/s40732-019-00355-4 Lee, J., Vargo, K. K., & Porretta, D. L. (2018). An evaluation of the effects of antecedent exercise type on stereotypic behaviors. Journal of Developmental and Physical Disabilities, 30(3), 409-426. https://doi.org/10.1007/s10882-018-9593-1 Lepper, T. L., & Petursdottir, A. I. (2017). Effects of response-contingent stimulus pairing on vocalizations of nonverbal children with autism. Journal of Applied Behavior Analysis, 50(4), 756-774. https://doi.org/10.1002/jaba.415 Lepper, T. L., Petursdottir, A. I., & Esch, B. E. (2013). Effects of operant discrimination training on the vocalizations of nonverbal children with autism. Journal of Applied Behavior Analysis, 46(3), 656-661. https://doi.org/10.1002/jaba.55 Levin, D. S., Volkert, V. M., & Piazza, C. C. (2014). A multi-component treatment to reduce packing in children with feeding and autism spectrum disorders. Behavior Modification, 38(6), 940-963. https://doi.org/10.1177/0145445514550683 Levy, K. M., Ainsleigh, S. A., & Hunsinger-Harris, M. L. (2017). Let’s go under! Teaching water safety skills using a behavioral treatment package. Education and Training in Autism and Developmental Disabilities, 52(2), 186-193. https://www.jstor.org/stable/26420389 Levy, A., & Perry, A. (2011). Outcomes in adolescents and adults with autism: A review of the literature. Research in Autism Spectrum Disorders, 5(4), 1271-1282. https://doi.org/10.1016/j.rasd.2011.01.023 Li, A., Wallace, L., Ehrhardt, K. E., & Poling, A. (2017). Reporting participant characteristics in intervention articles published in five behavioral-analytic journals, 2013-2015. Behavior Analysis: Research and Practice, 17(1), 84-91. https://doi.org/10.1037/bar0000071 Lin, C. E., & Koegel, R. (2018). Treatment for higher-order restricted repetitive behaviors (H-RRB) in children with autism spectrum disorder.
Journal of Autism and Developmental Disorders, 48(11), 3831-3845. https://doi.org/10.1007/s10803-018-3637-3 Lin, F. Y., & Zhu, J. (2020). Comparison of two discrimination methods in teaching Chinese children with autism. Journal of Applied Behavior Analysis, 53(2), 1145-1152. https://doi.org/10.1002/jaba.652 Lipschultz, J. L., Wilder, D. A., Ertel, H., & Enderli, A. (2018). The effects of high-p and low-p instruction similarity on compliance among young children. Journal of Applied Behavior Analysis, 51(4), 866-878. https://doi.org/10.1002/jaba.482 Lorah, E. R. (2018). Evaluating the iPad Mini® as a speech-generating device in the acquisition of a discriminative mand repertoire for young children with autism. Focus on Autism and Other Developmental Disabilities, 33(1), 47-54. https://doi.org/10.1177/1088357616673624 Lorah, E. R., Crouser, J., Gilroy, S. P., Tincani, M., & Hantula, D. (2014a). Within stimulus prompting to teach symbol discrimination using an iPad® speech generating device. Journal of Developmental and Physical Disabilities, 26(3), 335-346. https://doi.org/10.1007/s10882-014-9369-1 Lorah, E. R., Gilroy, S. P., & Hineline, P. N. (2014b). Acquisition of peer manding and listener responding in young children with autism. Research in Autism Spectrum Disorders, 8(1), 61-67. https://doi.org/10.1016/j.rasd.2013.10.009 Lorah, E. R., & Karnes, A. (2016). Evaluating the Language Builder™ application in the acquisition of listener responding in young children with autism. Journal of Developmental and Physical Disabilities, 28(2), 255-265. https://doi.org/10.1007/s10882-015-9464-y Lorah, E. R., & Parnell, A. (2017). Acquisition of tacting using a speech-generating device in group learning environments for preschoolers with autism. Journal of Developmental and Physical Disabilities, 29(4), 597-609. https://doi.org/10.1007/s10882-017-9543-3 Lorah, E. R., Parnell, A., & Speight, D. R. (2014c).
Acquisition of sentence frame discrimination using the iPad™ as a speech generating device in young children with developmental disabilities. Research in Autism Spectrum Disorders, 8(12), 1734-1740. https://doi.org/10.1016/j.rasd.2014.09.004 Lorah, E. R., Tincani, M., Dodge, J., Gilroy, S., Hickey, A., & Hantula, D. (2013). Evaluating picture exchange and the iPad™ as a speech generating device to teach communication to young children with autism. Journal of Developmental and Physical Disabilities, 25(6), 637-649. https://doi.org/10.1007/s10882-013-9337-1 Loughrey, T. O., Betz, A. M., Majdalany, L. M., & Nicholson, K. (2014). Using instructive feedback to teach category names to children with autism. Journal of Applied Behavior Analysis, 47(2), 425-430. https://doi.org/10.1002/jaba.123 Lugo, A. M., Mathews, T. L., King, M. L., Lamphere, J. C., & Damme, A. M. (2017). Operant discrimination training to establish praise as a reinforcer. Behavioral Interventions, 32(4), 341-356. https://doi.org/10.1002/bin.1485 Lui, C. M., Moore, D. W., & Anderson, A. (2014). Using a self-management intervention to increase compliance in children with ASD. Child & Family Behavior Therapy, 36(4), 259-279. https://doi.org/10.1080/07317107.2014.967613 MacManus, C., MacDonald, R., & Ahearn, W. H. (2015). Teaching and generalizing pretend play in children with autism using video modeling and matrix training. Behavioral Interventions, 30(3), 191-218. https://doi.org/10.1002/bin.1406 Madzharova, M. S., & Sturmey, P. (2015). Effects of video modeling and feedback on mothers’ implementation of peer-to-peer manding. Behavioral Interventions, 30(3), 270-285. https://doi.org/10.1002/bin.1414 Maenner, M. J., Shaw, K. A., Baio, J., Washington, A., Patrick, M., DiRienzo, M., Christensen, D. L., Wiggins, L. D., Pettygrove, S., Andrews, J. G., Lopez, M., Hudson, A., Baroud, T., Schwenk, Y., White, T., Rosenberg, C.
R., Lee, L., Harrington, R., Huston, M., Hewitt, A., Esler, A., Hall-Lande, J., Poynter, J. N., Hallas-Muchow, L., Constantino, J. N., Fitzgerald, R. T., Zahorodny, W., Shenouda, J., Daniels, J. L., Warren, Z., Vehorn, A., Salinas, A., Durkin, M. S., & Dietz, P. M. (2020). Prevalence of autism spectrum disorder among children aged 8 years – Autism and Developmental Disabilities Monitoring Network, 11 sites, United States, 2016. Morbidity and Mortality Weekly Report, 69(4), 1-12. https://doi.org/10.15585/mmwr.ss6904a1 Maich, K., Hall, C. L., van Rhijn, T. M., & Quinlan, L. (2015). Developing social skills of summer campers with autism spectrum disorder: A case study of Camps on TRACKS implementation in an inclusive day-camp setting. Exceptionality Education International, 25(2), 27-41. https://ir.lib.uwo.ca/eei/vol25/iss2/1 Majdalany, L. M., Wilder, D. A., Greif, A., Mathisen, D., & Saini, V. (2014). Comparing massed-trial instruction, distributed-trial instruction, and task interspersal to teach tacts to children with autism spectrum disorder. Journal of Applied Behavior Analysis, 47(3), 657-662. https://doi.org/10.1002/jaba.149 Majdalany, L., Wilder, D. A., Smeltz, L., & Lipschultz, J. (2016). The effect of brief delays to reinforcement on the acquisition of tacts in children with autism. Journal of Applied Behavior Analysis, 49(2), 411-415. https://doi.org/10.1002/jaba.282 Makrygianni, M. K., & Reed, P. (2010). A meta-analytic review of the effectiveness of behavioural early intervention programs for children with autism spectrum disorders. Research in Autism Spectrum Disorders, 4(4), 577-593. https://doi.org/10.1016/j.rasd.2010.01.014 Mancil, G. R., Lorah, E. R., & Whitby, P. S. (2016). Effects of iPod touch™ technology as communication devices on peer social interactions across environments. Education and Training in Autism and Developmental Disabilities, 51(3), 252-264. https://www.jstor.org/stable/24827522 Mandak, K., Light, J., & McNaughton, D. (2019).
Digital books with dynamic text and speech output: Effects on sight word reading for preschoolers with autism spectrum disorder. Journal of Autism and Developmental Disorders, 49(3), 1193-1204. https://doi.org/10.1007/s10803-018-3817-1 Markham, V., Giles, A., & May, R. (2020). Evaluating efficacy and preference for prompt type during discrete-trial teaching. Behavior Modification, 44(1), 49-69. https://doi.org/10.1177/0145445518792245 Marsteller, T. M., & St. Peter, C. C. (2014). Effects of fixed-time reinforcement schedules on resurgence of problem behavior. Journal of Applied Behavior Analysis, 47(3), 455-469. https://doi.org/10.1002/jaba.134 Martinez, C. K., Betz, A. M., Liddon, C. J., & Werle, R. L. (2016). A progression to transfer RIRD to the natural environment. Behavioral Interventions, 31(2), 144-162. https://doi.org/10.1002/bin.1444 Masse, J. J., McNeil, C. B., Wagner, S., & Quetsch, L. B. (2016). Examining the efficacy of parent-child interaction therapy with children on the autism spectrum. Journal of Child and Family Studies, 25(8), 2508-2525. https://doi.org/10.1007/s10826-016-0424-7 Matsuda, S., & Yamamoto, J. (2013). Intervention for increasing the comprehension of affective prosody in children with autism spectrum disorder. Research in Autism Spectrum Disorders, 7(8), 938-946. https://doi.org/10.1016/j.rasd.2013.04.001 Matsuda, S., & Yamamoto, J. (2014). Computer-based intervention for inferring facial expressions from the socio-emotional context in two children with autism spectrum disorders. Research in Autism Spectrum Disorders, 8(8), 944-950. https://doi.org/10.1016/j.rasd.2014.04.010 McDowell, L. S., Gutierrez, A., & Bennett, K. D. (2015). Analysis of live modeling plus prompting and video modeling for teaching imitation to children with autism. Behavioral Interventions, 30(3), 333-351. https://doi.org/10.1002/bin.1419 McKeel, A. N., Dixon, M. R., Daar, J. H., Rowsey, K. E., & Szekely, S. (2015).
Evaluating the efficacy of the PEAK relational training system using a randomized controlled trial of children with autism. Journal of Behavioral Education, 24(2), 230-241. https://doi.org/10.1007/s10864-015-9219-y McLay, L., Church, J., & Sutherland, D. (2016). Variables affecting the emergence of untaught equivalence relations in children with and without autism. Developmental Neurorehabilitation, 19(2), 75-87. https://doi.org/10.3109/17518423.2014.899649 McLay, L. K., France, K. G., Knight, J., Blampied, N. M., & Hastie, B. (2019). The effectiveness of function-based interventions to treat sleep problems, including unwanted co-sleeping, in children with autism. Behavioral Interventions, 34(1), 30-51. https://doi.org/10.1002/bin.1651 McLay, L., Schäfer, M. C. M., van der Meer, L., Couper, L., McKenzie, E., O’Reilly, M. F., Lancioni, G. E., Marschik, P. B., Sigafoos, J., & Sutherland, D. (2017). Acquisition, preference and follow-up comparison across three AAC modalities taught to two children with autism spectrum disorder. International Journal of Disability, Development and Education, 64(2), 117-130. https://doi.org/10.1080/1034912X.2016.1188892 McLay, L., van der Meer, L., Schäfer, M. C. M., Couper, L., McKenzie, E., O’Reilly, M. F., Lancioni, G. E., Marschik, P. B., Green, V. A., Sigafoos, J., & Sutherland, D. (2015). Comparing acquisition, generalization, maintenance, and preference across three AAC options in four children with autism spectrum disorder. Journal of Developmental and Physical Disabilities, 27(3), 323-339. https://doi.org/10.1007/s10882-014-9417-x Meadan, H., Snodgrass, M. R., Meyer, L. E., Fisher, K. W., Chung, M. Y., & Halle, J. W. (2016). Internet-based parent-implemented intervention for young children with autism: A pilot study. Journal of Early Intervention, 38(1), 3-23. https://doi.org/10.1177/1053815116630327 Merriam-Webster. (n.d.). Training. In Merriam-Webster.com dictionary.
Retrieved December 4, 2020, from https://www.merriam-webster.com/dictionary/training Miguel, C. F., & Kobari-Wright, V. V. (2013). The effects of tact training on the emergence of categorization and listener behavior in children with autism. Journal of Applied Behavior Analysis, 46(3), 669-673. https://doi.org/10.1002/jaba.62 Miller, S. A., Rodriguez, N. M., & Rourke, A. J. (2015). Do mirrors facilitate acquisition of motor imitation in children diagnosed with autism? Journal of Applied Behavior Analysis, 48(1), 194-198. https://doi.org/10.1002/jaba.187 Miltenberger, C. A., & Charlop, M. H. (2015). The comparative effectiveness of portable video modeling vs. traditional video modeling interventions with children with autism spectrum disorders. Journal of Developmental and Physical Disabilities, 27(3), 341-358. https://doi.org/10.1007/s10882-014-9416-y Mitteer, D. R., Luczynski, K. C., McKeown, C. A., & Cohrs, V. L. (2020). A comparison of teaching tacts with and without background stimuli on acquisition and generality. Behavioral Interventions, 35(1), 3-24. https://doi.org/10.1002/bin.1702 Moore, D. W., Anderson, A., Treccase, F., Deppeler, J., Furlonger, B., & Didden, R. (2013). A video-based package to teach a child with autism spectrum disorder to write her name. Journal of Developmental and Physical Disabilities, 25(5), 493-503. https://doi.org/10.1007/s10882-012-9325-x Muething, C. S., Falcomata, T. S., Ferguson, R., Swinnea, S., & Shpall, C. (2018). An evaluation of delay to reinforcement and mand variability during functional communication training. Journal of Applied Behavior Analysis, 51(2), 263-275. https://doi.org/10.1002/jaba.441 Mullen, S., Dixon, M. R., Belisle, J., & Stanley, C. (2017). Establishing auditory-tactile-visual equivalence classes in children with autism and developmental delays. The Analysis of Verbal Behavior, 33(2), 283-289. https://doi.org/10.1007/s40616-017-0092-8 Murdock, L. C., Ganz, J., & Crittendon, J. (2013).
Use of an iPad play story to increase dialogue of preschoolers with autism spectrum disorders. Journal of Autism and Developmental Disorders, 43(9), 2174-2189. https://doi.org/10.1007/s10803-013-1770-6 Muzammal, M. S., & Jones, E. A. (2017). Social-communication intervention for toddlers with autism spectrum disorder: Effects on initiating joint attention and interactions with mother. Journal of Developmental and Physical Disabilities, 29(2), 203-221. https://doi.org/10.1007/s10882-016-9519-8 Nahmias, A. S., Pellecchia, M., Stahmer, A. C., & Mandell, D. S. (2019). Effectiveness of community-based early intervention for children with autism spectrum disorder: A meta-analysis. Journal of Child Psychology and Psychiatry, 60(11), 1200-1209. https://doi.org/10.1111/jcpp.13073 Najdowski, A. C., St. Clair, M., Fullen, J. A., Child, A., Persicke, A., & Tarbox, J. (2018). Teaching children with autism to identify and respond appropriately to the preferences of others during play. Journal of Applied Behavior Analysis, 51(4), 890-898. https://doi.org/10.1002/jaba.494 National Autism Center. (2015). Findings and conclusions: National standards project, phase 2. http://www.nationalautismcenter.org/090605-2/ Neely, L., Graber, J., Kunnavatana, S., & Cantrell, K. (2020a). Impact of language on behavior treatment outcomes. Journal of Applied Behavior Analysis, 53(2), 796-810. https://doi.org/10.1002/jaba.626 Neely, L., Hong, E. R., Kawamini, S., Umana, I., & Kurz, I. (2020b). Intercontinental telehealth to train Japanese interventionists in incidental teaching for children with autism. Journal of Behavioral Education, 29, 433-448. https://doi.org/10.1007/s10864-020-09377-3 Neely, L., Rispoli, M., Camargo, S., Davis, H., & Boles, M. (2013). The effect of instructional use of an iPad® on challenging behavior and academic engagement for two students with autism. Research in Autism Spectrum Disorders, 7(4), 509-516. https://doi.org/10.1016/j.rasd.2012.12.004 Neufeld, V., Law, K.
C. Y., & Lucyshyn, J. M. (2014). Integrating best practices in positive behavior support and clinical psychology for a child with autism and anxiety-related problem behavior: A clinical case study. Canadian Journal of School Psychology, 29(3), 258-276. https://doi.org/10.1177/0829573514540603 Ninci, J., Lang, R., Davenport, K., Lee, A., Garner, J., Moore, M., Boutot, A., Rispoli, M., & Lancioni, G. (2013). An analysis of the generalization and maintenance of eye contact taught during play. Developmental Neurorehabilitation, 16(5), 301-307. https://doi.org/10.3109/17518423.2012.730557 Ninci, J., Rispoli, M., Neely, L. C., & Guz, S. (2018). Transferring picture exchange requests to receptive identification for children with ASD. Developmental Neurorehabilitation, 21(3), 178-187. https://doi.org/10.1080/17518423.2018.1437840 Nopprapun, M., & Holloway, J. (2014). A comparison of fluency training and discrete trial instruction to teach letter sounds to children with ASD: Acquisition and learning outcomes. Research in Autism Spectrum Disorders, 8(7), 788-802. https://doi.org/10.1016/j.rasd.2014.03.015 Nottingham, C. L., Vladescu, J. C., Kodak, T., & Kisamore, A. N. (2017). Incorporating multiple secondary targets into learning trials for individuals with autism spectrum disorder. Journal of Applied Behavior Analysis, 50(3), 653-661. https://doi.org/10.1002/jaba.396 O’Brien, M., Mc Tiernan, A., & Holloway, J. (2018). Teaching phonics to preschool children with autism using frequency-building and computer-assisted instruction. Journal of Developmental and Physical Disabilities, 30(2), 215-237. https://doi.org/10.1007/s10882-017-9581-x 101 O’Connor, E., Cividini-Motta, C., & MacNaul, H. (2020). Treatment of food selectivity: An evaluation of video modeling of contingencies. Behavioral Interventions, 35(1), 57-75. https://doi.org/10.1002/bin.1693 O’Hara, M., & Hall, L. J. (2014). Increasing engagement of students with autism at recess through structured work systems. 
Education and Training in Autism and Developmental Disabilities, 49(4), 568-575. https://www.jstor.org/stable/24582352 Odom, S. L., Boyd, B. A., Hall, L. J, & Hume, K. (2010). Evaluation of comprehensive treatment models for individuals with autism spectrum disorders. Journal of Autism and Developmental Disorders, 40(4), 425-436. https://doi.org/10.1007/s10803-009-0825-1 Olaff, H. S., & Holth, P. (2020). The emergence of bidirectional naming through sequential operant instruction following the establishment of conditioned social reinforcers. The Analysis of Verbal Behavior, 36, 21-48. https://doi.org/10.1007/s40616-019-00122-0 Olaff, H. S., Ona, H. N., & Holth, P. (2017). Establishment of naming in children with autism through multiple response-exemplar training. Behavioral Development Bulletin, 22(1), 67-85. https://doi.org/10.1037/bdb0000044 Owen, T. M., Fisher, W. W., Akers, J. S., Sullivan, W. E., Falcomata, T. S., Greer, B. D., Roane, H. S., & Zangrillo, A. N. (2020). Journal of Applied Behavior Analysis, 53(3), 1494- 1513. https://doi.org/10.1002/jaba.674 Paden, A. R., & Kodak, T. (2015). The effects of reinforcement magnitude on skill acquisition for children with autism. Journal of Applied Behavior Analysis, 48(4), 924-929. https://doi.org/10.1002/jaba.239 Paquet, A., Dionne, C., Joly, J., Rousseau, M., & Rivard, M. (2017). Supervision of large-scale community-based early intensive behavioral intervention programs in Quebec: Description of practices. Journal on Developmental Disabilities, 23(1), 54-63. https://search-proquest-com.proxy1.cl.msu.edu/docview/1991893643?pq- origsite=summon Pellegrino, A. J., Higbee, T. S., Becerra, L. A., & Gerencser, K. R. (2020). Comparing stimuli delivered via tablet versus flashcards on receptive labeling in children with autism spectrum disorder. Journal of Behavioral Education, 29, 606-618. https://doi.org/10.1007/s10864-019-09329-6 Perez, B. C., Bacotti, J. K., Peters, K. P., & Vollmer, T. R. (2020). 
An extension of commonly used toilet-training procedures to children with autism spectrum disorder. Journal of Applied Behavior Analysis, 53(4), 2360-2375. https://doi.org/10.1002/jaba.727 Pérez-González, L. A., Pastor, A., & Carnerero, J. J. (2014). Observing tacting increases uninstructed tacts in children with autism. The Analysis of Verbal Behavior, 30(1), 62-68. https://doi.org/10.1007/s40616-013-0003-6 102 Persicke, A., Jackson, M., & Adams, A. N. (2014). Brief report: An evaluation of TAGteach components to decrease toe-walking in a 4-year-old child with autism. Journal of Autism and Developmental Disorders, 44(4), 965-968. https://doi.org/10.1007/s10803-013-1934- 4 Persicke, A., St. Clair, M., Tarbox, J., Najdowski, A., Ranick, J., Yu, Y., de Nocker, Y. L. (2013). Teaching children with autism to attend to socially relevant stimuli. Research in Autism Spectrum Disorders, 7(12), 1551-1557. https://doi.org/10.1016/j.rasd.2013.09.002 Peters, L. C., & Thompson, R. H. (2015). Teaching children with autism to respond to conversation partners’ interest. Journal of Applied Behavior Analysis, 48(3), 544-562. https://doi.org/10.1002/jaba.235 Peters-Scheffer, N., Didden, R., Mulders, M., & Korzilius, H. (2013). Effectiveness of low intensity behavioral treatment for children with autism spectrum disorder and intellectual disability. Research in Autism Spectrum Disorders, 7(9), 1012-1025. https://doi.org/10.1016/j.rasd.2013.05.001 Peterson, K. M., Piazza, C. C., Ibañez, V. F., & Fisher, W. F. (2019a). Randomized controlled trial of an applied behavior analytic intervention for food selectivity in children with autism spectrum disorder. Journal of Applied Behavior Analysis, 52(4), 895-917. https://doi.org/10.1002/jaba.650 Peterson, S. P., Rodriguez, N. M., & Pawich, T. L. (2019b). Effects of modeling rote versus varied responses on response variability and skill acquisition during discrete-trial instruction. Journal of Applied Behavior Analysis, 52(2), 370-385. 
https://doi.org/10.1002/jaba.528 Phillips, C. L., Vollmer, T. R., & Porter, A. (2019). An evaluation of textual prompts and generalized textual instruction-following. Journal of Applied Behavior Analysis, 52(4), 1140-1160. https://doi.org/10.1002/jaba.649 Plaisance, L., Lerman, D. C., Laudont, C., & Wu, W. (2016). Inserting mastered targets during error correction when teaching skills to children with autism. Journal of Applied Behavior Analysis, 49(2), 251-264. https://doi.org/10.1002/jaba.292 Plavnick, J. B., MacFarland, M. C., & Ferreri, S. J. (2015). Variability in the effectiveness of a video modeling intervention package for children with autism. Journal of Positive Behavior Interventions, 17(2), 105-115. https://doi.org/10.1177/1098300714548798 Plavnick, J. B., Mariage, T., Englert, C. S., Constantine, K., Morin, L., & Skibbe, L. (2014). Promoting independence during computer assisted reading instruction for children with autism spectrum disorders. Mexican Journal of Behavior Analysis, 40(2), 85-105. https://www.redalyc.org/articulo.oa?id=59335811008 103 Plavnick, J. B., & Vitale, F. A. (2016). A comparison of vocal mand training strategies for children with autism spectrum disorders. Journal of Positive Behavior Interventions, 18(1), 52-62. https://doi.org/10.1177/1098300714548800 Pisman, M. D., & Luczynski, K. C. (2020). Caregivers can implement play-based instruction without disrupting child preference. Journal of Applied Behavior Analysis, 53(3), 1702- 1725. https://doi.org/10.1002/jaba.705 Popovic, S. C., Starr, E. M., & Koegel, L. K. (2020). Teaching initiated question asking to children with autism spectrum disorder through a short-term parent-mediated program. Journal of Autism and Developmental Disorders, 50(10), 3728-3738. https://doi.org/10.1007/s10803-020-04426-2 Préfontaine, I., Lanovaz, M. J., McDuff, E., McHugh, C., & Cook, J. L. (2019). Using mobile technology to reduce engagement in stereotypy: A validation of decision-making algorithms. 
Behavior Modification, 43(2), 222-245. https://doi.org/10.1177/0145445517748560 Rader, L., Sidener, T. M., Reeve, K. F., Sidener, D. W., Delmolino, L., Miliotis, A., & Carbone, V. (2014). Stimulus-stimulus pairing of vocalizations: A systematic replication. The Analysis of Verbal Behavior, 30(1), 69-74. https://doi.org/10.1007/s40616-014-0012-0 Radley, K. C., Dart, E. H., Moore, J. W., Lum, J. D. K., & Pasqua, J. (2017a). Enhancing appropriate and variable responding in young children with autism spectrum disorder. Developmental Neurorehabilitation, 20(8), 538-548. https://doi.org/10.1080/17518423.2017.1323973 Radley, K. C., Hanglein, J., & Arak, M. (2016). School-based social skills training for pre-school aged children with autism spectrum disorder. Autism, 20(8), 938-951. https://doi.org/10.1177/1362361315617361 Radley, K. C., Jenson, W. R., Clark, E., & O’Neill, R. E. (2014). The feasibility and effects of a parent-facilitated social skills training program on social engagement of children with autism spectrum disorders. Psychology in the Schools, 51(3), 241-255. https://doi.org/10.1002/pits.21749 Radley, K. C., McHugh, M. B., Taber, T., Battaglia, A. A., & Ford, W. B. (2017b). School-based social skills training for children with autism spectrum disorder. Focus on Autism and Other Developmental Disabilities, 32(4), 256-268. https://doi.org/10.1177/1088357615583470 Radley, K. C., O’Handley, R. D., Battaglia, A. A., Lum, J. D. K., Dodakhodjaeva, K., Ford, W. B., & McHugh, M. B. (2017c). Effects of a social skills intervention on children with autism spectrum disorder and peers with shared deficits. Education and Treatment of Children, 40(2), 233-262. https://doi.org/10.1353/etc.2017.0011 104 Rakap, S., & Balikci, S. (2017). Using embedded instruction to teach functional skills to a preschool child with autism. International Journal of Developmental Disabilities, 63(1), 17-26. https://doi.org/10.1080/20473869.2015.1109801 Rapp, J. T., Cook, J. 
L., McHugh, C., & Mann, K. R. (2017). Decreasing stereotypy using NCR and DRO with functionally matched stimulation: effects on targeted and non-targeted stereotypy. Behavior Modification, 41(1), 45-83. https://doi.org/10.1177/0145445516652370 Rapp, J. T., Cook, J. L., Nuta, R., Balagot, C., Crouchman, K., Jenkins, C., Karim, S., & Watters-Wybrow, C. (2019). Further evaluation of a practitioner model for increasing eye contact in children with autism. Behavior Modification, 43(3), 389-412. https://doi.org/10.1177/0145445518758595 Redhair, E. I., McCoy, K. M., Zucker, S. H., Mathur, S. R., & Caterino, L. (2013). Identification of printed nonsense words for an individual with autism: A comparison of constant time delay and stimulus fading. Education and Training in Autism and Developmental Disabilities, 48(3), 351-362. https://www.jstor.org/stable/23880992 Reeves, L. M., Umbreit, J., Ferro, J. B., & Liaupsin, C. J. (2017). The role of the replacement behavior in function-based intervention. Education and Training in Autism and Developmental Disabilities, 52(3), 305-316. https://www.jstor.org/stable/26420402 Reichle, J., Byiers, B. J., & Reeve, A. (2018). Conditional use of a request for assistance: Considering generalization. Focus on Autism and Other Developmental Disabilities, 33(2), 80-90. https://doi.org/10.1177/1088357616647349 Reichow, B. (2012). Overview of meta-analyses of early intensive behavioral intervention for young children with autism spectrum disorders. Journal of Autism and Developmental Disorders, 42(4), 512-520. https://doi.org/10.1007/s10803-011-1218-9 Ribeiro, D., Miguel, C. F., & Goyos, C. (2015). The effects of listener training on discriminative control by elements of compound stimuli in children with disabilities. Journal of the Experimental Analysis of Behavior, 104(1), 48-62. https://doi.org/10.1002/jeab.161 Richard III, P. R., & Noell, G. H. (2019). 
Teaching children with autism to tie their shoes using video prompt-models and backward chaining. Developmental Neurorehabilitation, 22(8), 509-515. https://doi.org/10.1080/17518423.2018.1518349 Richardson, A. R., Lerman, D. C., Nissen, M. A., Luck, K. M., Neal, A. E., Bao, S., & Tsami, L. (2017). Can pictures promote the acquisition of sight-word reading? An evaluation of two potential instructional strategies. Journal of Applied Behavior Analysis, 50(1), 67-86. https://doi.org/10.1002/jaba.354 105 Ringdahl, J. E., Berg, W. K., Wacker, D. P., Crook, K., Molony, M. A., Vargo, K. K., Neurnberger, J. E., Zabala, K., & Taylor, C. J. (2018). Effects of response preference on resistance to change. Journal of the Experimental Analysis of Behavior, 109(1), 265-280. https://doi.org/10.1002/jeab.308 Rispoli, M., Camargo, S., Machalicek, W., Lang, R., & Sigafoos, J. (2014a). Functional communication training in the treatment of problem behavior maintained by access to rituals. Journal of Applied Behavior Analysis, 47(3), 580-593. https://doi.org/10.1002/jaba.130 Rispoli, M., Camargo, S. H., Neely, L., Gerow, S., Lang, R., Goodwyn, F., & Ninci, J. (2014b). Pre-session satiation as a treatment for stereotypy during group activities. Behavior Modification, 38(3), 392-411. https://doi.org/10.1177/0145445513511631 Rispoli, M., O’Reilly, M., Lang, R., Machalicek, W., Kang, S., Davis, T., & Neely, L. (2016). An examination of within-session responding following access to reinforcing stimuli. Research in Developmental Disabilities, 48(1), 25-34. https://doi.org/10.1016/j.ridd.2015.10.013 Rivas, K. M., Piazza, C. C., Roane, H. S., Volkert, V. M., Stewart, V., Kadey, H. J., & Groff, R. A. (2014). Analysis of self-feeding in children with feeding disorders. Journal of Applied Behavior Analysis, 47(4), 710-722. https://doi.org/10.1002/jaba.170 Rivero, A. M., & Borrero, C. S. (2020). 
Evaluation of empirical pretreatment assessments for developing treatments for packaging in pediatric feeding disorders. Behavior Analysis in Practice, 13(1), 137-151. https://doi.org/10.1007/s40617-019-00372-3 Roberts, M. Y., Rosenwasser, J., Phelan, J., & Hampton, L. H. (2020). Improving pediatric hearing testing for children with developmental delays: The effects of video modeling on child compliance and caregiver stress. Journal of the American Academy of Audiology, 31(5), 310-316. https://doi.org/10.3766/jaaa.18070 Rodriguez, N. M., Levesque, M. A., Cohrs, V. L., & Niemeier, J. J. (2017). Teaching children with autism to request help with difficult tasks. Journal of Applied Behavior Analysis, 50(4), 717-732. https://doi.org/10.1002/jaba.420 Rodriguez, P. P., & Guiterrez, A. (2017). A comparison of two procedures to condition social stimuli to function as reinforcers for children with autism. Behavioral Development Bulletin, 22(1), 159-172. https://doi.org/10.1037/bdb0000059 Rodriguez, C. N., & Jackson, M. L. (2020). A safe-word intervention for abduction prevention in children with autism spectrum disorders. Behavior Analysis in Practice, 13(4), 872-882. https://doi.org/10.1007/s40617-020-00418-x 106 Rogalski, J. P., Roscoe, E. M., Fredericks, D. W., & Mezhoudi, N. (2020). Negative reinforcer magnitude manipulations for treating escape-maintained problem behavior. Journal of Applied Behavior Analysis, 53(3), 1514-1530. https://doi.org/10.1002/jaba.683 Rogers, S. J., Estes, A., Lord, C., Munson, J., Rocha, M., Winter, J., Greenson, J., Colombi, C., Dawson, G., Vismara, L. A., Sugar, C. A., Hellemann, G., Whelan, F., & Talbott, M. (2019a). A multisite randomized controlled two-phase trial of the early start denver model compared to treatment as usual. Journal of the American Academy of Child & Adolescent Psychiatry, 58(9), 853-865. https://doi.org/10.1016/j.jaac.2019.01.004 Rogers, S. 
J., Estes, A., Vismara, L., Munson, J., Zierhut, C., Greenson, J., Dawson, G., Rocha, M., Sugar, C., Senturk, D., Whelan, F., & Talbott, M. (2019b). Enhancing low-intensity coaching in parent implemented early start denver model intervention for early autism: A randomize comparison treatment trial. Journal of Autism and Developmental Disorders, 49(2), 632-646. https://doi.org/10.1007/s10803-018-3740-5 Rogge, N., & Janssen, J. (2019). The economic cost of autism spectrum disorder: A literature review. Journal of Autism and Developmental Disorders, 49(7), 2873-2900. https://doi.org/10.1007/s10803-019-04014-z Rollins, P. R., Campbell, M., Hoffman, R. T., & Self, K. (2016). A community-based early intervention program for toddlers with autism spectrum disorders. Autism, 20(2), 219- 232. https://doi.org/10.1177/1362361315577217 Romanczyk, R. G., Callahan, E. H., Turner, L. B., & Cavalari, R. N. S. (2014). Efficacy of behavioral interventions for young children with autism spectrum disorders: Public policy, the evidence base, and implementation parameters. Review Journal of Autism and Developmental Disorders, 1(4), 276-326. https://doi.org/10.1007/s40489-014-0025-6 Romani, P. W., Ringdahl, J. E., Wacker, D. P., Lustig, N. H., Vinquist, K. M., Northup, J., Kane, A. M., Carrion, D. P. (2016). Relations between rate of negative reinforcement and the persistence of task completion. Journal of Applied Behavior Analysis, 49(1), 122-137. https://doi.org/10.1002/jaba.252 Roncati, A. L., Souza, A. C., & Miguel, C. F. (2019). Exposure to a specific prompt topography predicts its relative efficiency when teaching intraverbal behavior to children with autism spectrum disorder. Journal of Applied Behavior Analysis, 52(3), 739-745. https://doi.org/10.1002/jaba.568 Rosales, R., Maderitz, C., & Garcia, Y. A. (2014). Comparison of simple and complex auditory- visual conditional discrimination training. Journal of Applied Behavior Analysis, 47(2), 437-442. 
https://doi.org/10.1002/jaba.121 Roth, A. D., Pilling, S., & Turner, J. (2010). Therapist training and supervision in clinical trials: Implications for clinical practice. Behavioural and Cognitive Psychotherapy, 38(3), 291- 302. https://doi.org/10.1017/S1352465810000068 107 Rubio, E. K., Volkert, V. M., Farling, H., & Sharp, W. G. (2020). Evaluation of a finger prompt variation in the treatment of pediatric feeding disorders. Journal of Applied Behavior Analysis, 53(2), 956-972. https://doi.org/10.1002/jaba.658 Rudy, N. A., Betz, A. M., Malone, E., Henry, J. E., & Chong, I. M. (2014). Effects of video modeling on teaching bids for joint attention to children with autism. Behavioral Interventions, 29(4), 269-285. https://doi.org/10.1002/bin.1398 Russell, S. M., & Reinecke, D. (2019). Mand acquisition across different teaching methodologies. Behavioral Interventions, 34(1), 127-135. https://doi.org/10.1002/bin.1643 Saini, V., & Fisher, W. W. (2016). Evaluating the effects of discriminability on behavioral persistence during and following time-based reinforcement. Journal of the Experimental Analysis of Behavior, 106(3), 195-209. https://doi.org/10.1002/jeab.225 Saini, V., Fisher, W. W., & Pisman, M. D. (2017). Persistence during and resurgence following noncontingent reinforcement implemented with and without extinction. Journal of Applied Behavior Analysis, 50(2), 377-392. https://doi.org/10.1002/jaba.380 Saini, V., Greer, B. D., Fisher, W. W., Lichtblau, K. R., DeSouza, A. A., & Mitteer, D. R. (2016). Individual and combined effects of noncontingent reinforcement and response blocking on automatically reinforced problem behavior. Journal of Applied Behavior Analysis, 49(3), 693-698. https://doi.org/10.1002/jaba.306 Saini, V., Gregory, M. K., Uran, K. J., & Fantetti, M. A. (2015). Parametric analysis of response interruption and redirection as treatment for stereotypy. Journal of Applied Behavior Analysis, 48(1), 96-106. https://doi.org/10.1002/jaba.186 Sanberg, S. 
A., Kuhn, B. R., & Kennedy, A. E. (2018). Outcomes of a behavioral intervention for sleep disturbances in children with autism spectrum disorder. Journal of Autism and Developmental Disorders, 48(12), 4250-4277. https://doi.org/10.1007/s10803-018-3644- 4 Sani-Bozkurt, S., & Ozen, A. (2015). Effectiveness and efficiency of peer and adult models used in video modeling in teaching pretend play skills to children with autism spectrum disorder. Education and Training in Autism and Developmental Disabilities, 50(1), 71- 83. https://www.jstor.com/stable/24827502 Schertz, H. H., Odom, S. L., Baggett, K. M., & Sideris, J. H. (2018). Mediating parent learning to promote social communication for toddlers with autism: Effects from a randomized controlled trial. Journal of Autism and Developmental Disorders, 48(3), 853-867. https://doi.org/10.1007/s10803-017-3386-8 108 Schieltz, K. M., Romani, P. W., Wacker, D. P., Suess, A. N., Huang, P., Berg, W. K., Lindgren, S. D., & Kopelman, T. G. (2018). Single-case analysis to determine reasons for failure of behavioral treatment via telehealth. Remedial and Special Education, 39(2), 95-105. https://doi.org/10.1177/0741932517743791 Schnell, L. K., Vladescu, J. C., Kisamore, A. N., DeBar, R. M., Kahng, S., & Marano, K. (2020). Assessment to identify learner-specific prompt and prompt-fading procedures for children with autism spectrum disorder. Journal of Applied Behavior Analysis, 53(2), 1111-1129. https://doi.org/10.1002/jaba.623 Schnell, L. K., Vladescu, J. C., Kodak, T., & Nottingham, C. L. (2018). Comparing procedures on the acquisition and generalization of tacts for children with autism spectrum disorder. Journal of Applied Behavior Analysis, 51(4), 769-783. https://doi.org/10.1002/jaba.480 Schreibman, L., & Stahmer, A. C. (2014). A randomized trial comparison of the effects of verbal and pictorial naturalistic communication strategies on spoken language for young children with autism. 
Journal of Autism and Developmental Disorders, 44(5), 1244-1251. https://doi.org/10.1007/s10803-013-1972-y Sears, K. M., Blair, K. C., Iovannone, R., & Crosland, K. (2013). Using the prevent-teach- reinforce model with families of young children with ASD. Journal of Autism and Developmental Disorders, 43(5), 1005-1016. https://doi.org/10.1007/s10803-012-1646-1 Sellers, T. P., Kelley, K., Higbee, T. S., & Wolfe, K. (2016). Effects of simultaneous script training on use of varied mand frames by preschoolers with autism. The Analysis of Verbal Behavior, 32(1), 15-26. https://doi.org/10.1007/s40616-015-0049-8 Shalev, R. A., Milnes, S. M., Piazza, C. C., & Kozisek, J. M. (2018). Treating liquid expulsion in children with feeding disorders. Journal of Applied Behavior Analysis, 51(1), 70-79. https://doi.org/10.1002/jaba.425 Shamlian, K. D., Fisher, W. W., Steege, M. W., Cavanaugh, B. M., Samour, K., & Querim, A. C. (2016). Evaluation of multiple schedules with naturally occurring and therapist-arranged discriminative stimuli following functional communication training. Journal of Applied Behavior Analysis, 49(2), 228-250. https://doi.org/10.1002/jaba.293 Shapiro, M., & Kazemi, E. (2017). A review of training strategies to teach individuals implementation of behavioral interventions. Journal of Organizational behavior Management, 37(1), 32-62. https://doi.org/10.1080/01608061.2016.1267066 Shattuck, P. T., Seltzer, M. M., Greenberg, J. S., Orsmond, G. I., Bolt, D., Kring, S., Lounds, J., & Lord, C. (2007). Change in autism symptoms and maladaptive behaviors in adolescents and adults with an autism spectrum disorder. Journal of Autism and Developmental Disorders, 37(9), 1735-1747. https://doi.org/10.1007/s10803-006-0307-7 109 Shawler, L. A., & Miguel, C. F. (2015). The effects of motor and vocal response interruption and redirection on vocal stereotypy and appropriate vocalizations. Behavioral Interventions, 30(2), 112-134. 
https://doi.org/10.1002/bin.1407 Shepley, C., Lane, J. D., & Shepley, S. B. (2016). Teaching young children with social- communication delays to label actions using videos and language expansion models: A pilot study. Focus on Autism and Other Developmental Disabilities, 31(4), 243-253. https://doi.org/10.1177/1088357614552189 Shillingsburg, M. A., Bowen, C. N., & Shapiro, S. K. (2014a). Increasing social approach and decreasing social avoidance in children with autism spectrum disorder during discrete trial training. Research in Autism Spectrum Disorders, 8(11), 1443-1453. https://doi.org/10.1016/j.rasd.2014.07.013 Shillingsburg, M. A., Bowen, C. N., & Valentino, A. L. (2014b). Mands for information using “how” under EO-absent and EO-present conditions. The Analysis of Verbal Behavior, 30(1), 54-61. https://doi.org/10.1007/s40616-013-0002-7 Shillingsburg, M. A., Cariveau, T., Talmadge, B., & Frampton, S. (2017). A preliminary analysis of procedures to teach children with autism to report past behavior. The Analysis of Verbal Behavior, 33(2), 275-282. https://doi.org/10.1007/s40616-017-0085-7 Shillingsburg, M. A., & Frampton, S. E. (2019). The effects of the interspersal of related responses on the emergence of intraverbals for children with autism spectrum disorder. The Analysis of Verbal Behavior, 35(2), 172-195. https://doi.org/10.1007/s40616-019- 00110-4 Shillingsburg, M. A., Frampton, S. E., Cleveland, S. A., & Cariveau, T. (2018). A clinical application of procedures to promote the emergence of untrained intraverbal relations with children with autism. Learning and Motivation, 62(1), 51-66. https://doi.org/10.1016/j.lmot.2017.02.003 Shillingsburg, M. A., Frampton, S. E., Schenk, Y. A., Bartlett, B. L., Thompson, T. M., & Hansen, B. (2020). Evaluation of a treatment package to increase mean length of utterances for children with autism. Behavior Analysis of Practice, 13(3), 659-673. https://doi.org/10.1007/s40617-020-00417-y Shillingsburg, M. A., Gayman, C. 
M., & Walton, W. (2016). Using textual prompts to teach mands for information using “who?”. The Analysis of Verbal Behavior, 32(1), 1-14. https://doi.org/10.1007/s40616-016-0053-7 Shillingsburg, M. A., Hansen, B., & Wright, M. (2019a). Rapport building and instructional fading prior to discrete trial instruction: Moving from child-led play to intensive teaching. Behavior Modification, 43(2), 288-306. https://doi.org/10.1177/0145445517751436 110 Shillingsburg, M. A., Marya, V., Bartlett, B., Thompson, T., & Walters, D. (2019b). Teaching children with autism spectrum disorder to report past behavior with the use of a speech- generating device. The Analysis of Verbal Behavior, 35(2), 258-269. https://doi.org/10.1007/s40616-019-00112-2 Shire, S. Y., Chang, Y., Shih, W., Bracaglia, S., Kodjoe, M., & Kasari, C. (2017). Hybrid implementation model of community-partnered early intervention for toddlers with autism: A randomized trial. The Journal of Child Psychology and Psychiatry, 58(5), 612- 622. https://doi.org/10.1111/jcpp.12672 Shire, S. Y., Gulsrud, A., & Kasari, C. (2016). Increasing responsive parent-child interactions and joint engagement: Comparing the influence of parent-mediated intervention and parent psychoeducation. Journal of Autism and Developmental Disorders, 46(5), 1737- 1747. https://doi.org/10.1007/s10803-016-2702-z Shire, S. Y., Shih, W., Chang, Y., Bracaglia, S., Kodjoe, M., & Kasari, C. (2019). Sustained community implementation of JASPER intervention with toddlers with autism. Journal of Autism and Developmental Disorders, 49(5), 1863-1875. https://doi.org/10.1007/s10803-018-03875-0 Shrestha, A., Anderson, A., & Moore, D. W. (2013). Using point-of-view video modeling and forward chaining to teach a functional self-help skill to a child with autism. Journal of Behavioral Education, 22(2), 157-167. https://doi.org/10.1007/s10864-012-9165-x Siegel, E. B., & Lien, S. E. (2015). 
Using photographs of contrasting contextual complexity to support classroom transitions for children with autism spectrum disorders. Focus on Autism and Other Developmental Disabilities, 30(2), 100-114. https://doi.org/10.1177/1088357614559211 Sigafoos, J., Lancioni, G. E., O’Reilly, M. F., Achmadi, D., Stevens, M., Roche, L., Kagohara, D. M., van der Meer, L., Sutherland, D., Lang, R., Marchik, P. B., McLay, L., Hodis, F., & Green, V. A. (2013). Teaching two boys with autism spectrum disorders to request the continuation of toy play using an iPad®-based speech-generating device. Research in Autism Spectrum Disorders, 7(8), 923-930. https://doi.org/10.1016/j.rasd.2013.04.002 Silbaugh, B. C., & Falcomata, T. S. (2017). Translational evaluation of a lag schedule and variability in food consumed by a boy with autism and food selectivity. Developmental Neurorehabilitation, 20(5), 309-312. https://doi.org/10.3109/17518423.2016.1146364 Silbaugh, B. C., & Falcomata, T. S. (2019). Effects of a lag schedule with progressive time delay on sign mand variability in a boy with autism. Behavior Analysis in Practice, 12(1), 124- 132. https://doi.org/10.1007/s40617-018-00273-x 111 Silbaugh, B. C., Falcomata, T. S., & Ferguson, R. H. (2018a). Effects of a lag schedule of reinforcement with progressive time delay on topographical mand variability in children with autism. Developmental Neurorehabilitation, 21(3), 166-177. https://doi.org/10.1080/17518423.2017.1369190 Silbaugh, B. C., & Swinnea, S. (2019). Clinical evaluation of a behavioral intervention for packing. Behavior Analysis: Research and Practice, 19(1), 60-71. https://doi.org/10.1037/bar0000150 Silbaugh, B. C., Swinnea, S., & Falcomata, T. S. (2018b). Clinical evaluation of physical guidance procedures in the treatment of food selectivity. Behavioral Interventions, 33(4), 403-413. https://doi.org/10.1002/bin.1645 Silbaugh, B. C., Swinnea, S., & Falcomata, T. S. (2020). 
CHAPTER 3

A Survey of Barriers Behavior Analysts Experience While Providing Supervision Via Telehealth

Telehealth is defined as “the use of electronic information and telecommunication technologies to support long-distance clinical health care, patient and professional health-related education, public health, and health administration” (American Telemedicine Association, 2017). Recently, an emerging body of literature has evaluated the use of Telehealth to provide applied behavior analytic (ABA) services to individuals with autism spectrum disorder (ASD; Ferguson et al., 2019).
A majority of the ABA procedures delivered via Telehealth have been implemented in home-based settings by caregivers while a researcher trained and supervised the caregiver in implementing the procedures (Ferguson et al., 2019). Telehealth is an acceptable service delivery mechanism for ABA interventions and has led to positive outcomes for decreasing problem behavior and increasing skill acquisition in individuals with ASD (Ferguson et al., 2019; Unholz-Bowden et al., 2020). In addition, Telehealth reduces the cost associated with ABA services (Ferguson et al., 2019; Horn et al., 2016; Lindgren et al., 2016), is an effective platform for parent training (e.g., increasing implementation skills; Meadan & Daczewitz, 2015), and increases procedural fidelity in teachers, therapists, and parents implementing behavioral interventions with individuals with ASD (Neely et al., 2016). Behavioral interventions are most effective when the professionals implementing them are adequately supervised (Shapiro & Kazemi, 2017). Within the field of behavior analysis, supervision is defined as “improving and maintaining the behavior-analytic, professional and ethical repertoires of the supervisee and facilitating the delivery of high-quality behavior analytic services to the supervisee’s clients” (BACB, 2018). Supervision of ABA services is critical because it can increase the quality of behavior analytic services (LeBlanc & Luiselli, 2016), which in turn can positively impact treatment outcomes. In addition, supervision can promote professional development (e.g., establishing professional values and increasing interpersonal skills) of the supervisor and supervisee and can help the field of ABA by developing future practitioners who have the appropriate competencies to create successful and socially significant behavior change (Brodhead et al., 2018; LeBlanc et al., 2012; LeBlanc & Luiselli, 2016; Sellers et al., 2016a; Turner et al., 2016).
Finally, supervision can increase the likelihood of ethical employee behavior and result in greater consumer protection (Brodhead & Higbee, 2012). Supervision does not occur without barriers, however. For the purposes of this manuscript, we define a supervision barrier as anything that hinders the supervision of the supervisee and the quality of services provided to the supervisee’s clients. One example of a supervision barrier is a supervisor (hereafter referred to as the BCBA) who does not devote ample time to the supervisee (e.g., the BCBA has only 15 min to meet with the supervisee when the situation demands 30 min of supervision) and, as a result, does not provide an appropriate amount of feedback. In another example, the supervisee could struggle with interpersonal skills (e.g., is rude), making it difficult for them to receive and subsequently implement feedback (Sellers et al., 2016b). When a barrier occurs during supervision (e.g., the BCBA does not provide feedback), that barrier may impact the organization where the BCBA and/or supervisee work, because staff may be dissatisfied with the supervision provided and, as a result, leave the organization (DiGennaro Reed & Henley, 2015; Sellers et al., 2016b). In addition, if a barrier occurs during supervision (e.g., lack of access to materials), that barrier may put clients at risk if it impedes the provision of high-quality behavior analytic services (Sellers et al., 2016b). In 2019, Sellers and colleagues conducted a survey to gather information from BCBA supervisors about supervision practices and to identify any barriers respondents might experience while providing supervision. The goal of the survey was to identify areas of success for supervisors and areas that should be targeted for improvement within the supervision process.
Participants for the survey were recruited through the Behavior Analyst Certification Board (BACB) mass email service and through various social media sites (e.g., the Facebook site for the Association for Behavior Analysis International). A total of 284 BCBAs completed the survey in its entirety and were included in the data analysis. From the 284 responses, Sellers et al. (2019) found that, within a face-to-face context, the most common barrier was lack of time to adequately prepare for supervision meetings and to develop a tracking system to monitor skills and knowledge the supervisee has mastered. Additional barriers consisted of the cost of materials, lack of access to resources (e.g., supervision curricula), lack of access to examples (e.g., systems for guiding supervision activities), uncertainty about supervision requirements (e.g., the need to have a contract), and uncertainty about how to teach and measure certain supervisee skills (e.g., responding to feedback, time management, organization skills). Though Sellers et al. (2019) provided information on barriers BCBAs face when providing supervision face-to-face, the implications of that study are limited to face-to-face contact, and the extent to which the findings generalize to supervision provided via Telehealth is unknown. Without information about barriers encountered when providing supervision via Telehealth, BCBAs are less likely to identify and subsequently address those barriers. If not addressed, barriers may decrease the quality of supervision and ultimately decrease the quality of behavioral interventions (Sellers et al., 2016b). If BCBAs are aware of potential barriers that could occur during supervision, they may take proactive steps to address or mitigate them prior to beginning the supervision process.
For example, the BCBA may meet with the supervisee to set clear expectations (e.g., how feedback will be provided) and review information necessary for supervision via Telehealth to be successful (e.g., how to use the technology or videoconferencing software). Second, knowing the barriers and the strategies used to address them can provide BCBAs with practical tools to address and/or mitigate barriers whose onset cannot be prevented. For example, if a supervisee is frequently absent, the BCBA may implement a self-management strategy with the supervisee. In addition, identifying barriers BCBAs experience during ABA service delivery via Telehealth can inform the development of needed resources and inform trainings and practice based on real, field-specific issues (Sellers et al., 2019). Resources and trainings based on real, field-specific issues may increase the quality of supervision provided to supervisees and, as a result, the quality of services provided to clients (Sellers et al., 2019). Finally, identifying barriers can guide further research in the emerging area of remote supervision. For example, if BCBAs encounter barriers for which strategies have not been used or were ineffective, researchers could begin to evaluate which strategies may be most effective in addressing those barriers. BCBAs who are considering delivering ABA services via Telehealth would benefit from information about common barriers that may arise when providing these services (Lerman et al., 2020).
Identifying the current barriers BCBAs experience and the strategies BCBAs use to address and/or mitigate those barriers will allow us to provide the field of ABA with information about how to: (a) potentially prevent the barriers from occurring; (b) address and/or mitigate the barriers if they cannot be prevented; (c) inform the development of needed resources; (d) inform trainings and practice based on real, field-specific issues; and (e) guide further research in this emerging area. This information will increase the quality of the supervision BCBAs provide to supervisees and, as a result, increase the quality of services provided to clients and improve ethical employee behavior (Brodhead & Higbee, 2012; Sellers et al., 2019). Therefore, to better understand the current barriers BCBAs experience when providing supervision via Telehealth, a survey study was conducted, and recommendations at the organizational and individual levels are provided based on the survey findings. Specifically, Chapter 3 asked the following research questions: (a) What are the barriers BCBAs experience when providing supervision to other BCBAs or to graduate students who are providing behavioral services via Telehealth to individuals with ASD in the United States? and (b) What strategies do BCBAs use to address and/or mitigate the barriers that arise during supervision of behavioral services via Telehealth to individuals with ASD in the United States?

Method

Participants

Participants were recruited through the BACB mass email service, which is an email contact list of all registered certificants. To use the BACB mass email service, the researchers were required to pay a fee for the initial email and the one-week reminder email to be sent to potential participants. Participants were recruited using voluntary sampling, which consisted of explicitly calling for volunteers (Remler & Van Ryzin, 2011).
All of the potential participants resided within the United States and held a BCBA credential or a BCBA credential with a doctoral designation (BCBA-D). According to the BACB, a maximum of 42,405 individuals qualified to potentially receive the email invitation. However, the BACB indicated individuals could independently opt out from receiving emails from the BACB at any time; therefore, the number of individuals who received the survey was lower and fluctuated over time (i.e., the initial email was sent to 18,983 individuals, and the one-week reminder email was sent to 19,154 individuals).

Inclusion Criteria

Potential participants were screened for the following criteria at the beginning of the survey: (a) whether the potential participant held a BCBA credential or a BCBA credential with a doctoral designation (BCBA-D) in good standing (i.e., active certification status); (b) whether the potential participant currently, or in the past six months, provided supervision to another BCBA or an individual pursuing a BCBA credential; (c) whether the potential participant currently, or in the past six months, provided supervision via Telehealth to another BCBA or an individual pursuing a BCBA credential; and (d) whether the potential participant provided supervision to someone who provided behavior analytic services to individuals with ASD. Potential participants consented to participate in the study.

Materials

The survey was created by the primary researcher. Expert and content reviews of the survey were then completed. The expert reviews were completed by two doctoral-level professionals who held a BCBA credential and had substantial experience in either supervision or Telehealth. The expert reviews resulted in the rewording of six questions within the survey to improve question clarity.
In addition, the format of one question was changed from a multiple-choice format to a side-by-side format in order to gain information regarding how often (i.e., never, rarely, usually, always, not applicable) individuals were typically present/available during supervision Telehealth meetings. Finally, one question was added to ask whether participants had received training on how to provide supervision in the past six months, instead of only asking whether participants had received training on how to provide supervision via Telehealth in the past six months. The content reviews were completed by five BCBAs who provided supervision via Telehealth to another BCBA or an individual pursuing a BCBA credential. The content reviews resulted in one change to the survey: the addition of a not-applicable choice for the question regarding how often individuals were typically present/available during supervision Telehealth meetings. The final survey (see Brodhead, 2022) was created in Qualtrics and included 34 multiple-choice, side-by-side, rank, and fill-in-the-blank questions. Four questions consisted of initial screening questions to determine whether participants met the inclusion criteria mentioned above. Six questions asked about the participant’s supervision load and supervision meeting logistics. Twelve questions asked about the participant’s experiences when providing supervision via Telehealth. Finally, 12 questions asked for demographic information (e.g., age, years worked as a BCBA, current organization setting). For the questions regarding supervision and Telehealth, definitions of each were displayed on the screen. Telehealth was defined as “the use of electronic information and telecommunication technologies to support long-distance clinical health care, patient and professional health-related education, public health, and health administration” (American Telemedicine Association, 2017).
Supervision was defined as “improving and maintaining the behavior-analytic, professional and ethical repertoires of the supervisee and facilitating the delivery of high-quality behavior analytic services to the supervisee’s clients” (BACB, 2018). Additionally, for the questions regarding supervision and Telehealth, participants who worked at multiple organizations were asked to think about the organization in which they primarily worked when answering. For the questions regarding barriers experienced, the barriers were broken into two areas: supervisee barriers (19 options) and supervisor barriers (25 options). Finally, for the questions regarding strategies used to address and/or mitigate the barriers, strategies were broken into two areas: strategies used for supervisee barriers (24 options) and strategies used for supervisor barriers (18 options). The final survey included skip logic, display logic, carry-forward choices, and branches throughout. Skip logic was used with the four initial screening questions. If at any point during those questions the participant indicated they did not meet the inclusion criteria, skip logic pushed the participant past all of the survey questions to the end of the survey. Display logic was used based on how participants responded to the four initial screening questions. If the participant indicated they met all inclusion criteria, a message was displayed stating they were eligible to participate in the survey. If the participant indicated that they did not meet the inclusion criteria, a message was displayed stating that they were not eligible to participate in the survey.
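The screening flow described above can be sketched as a small function. This is a hypothetical illustration only: Qualtrics implements this behavior through its own skip and display logic, and the parameter names below are ours, not the survey's wording.

```python
def eligible(active_bcba: bool,
             supervised_past_6mo: bool,
             supervised_via_telehealth: bool,
             supervisee_serves_asd: bool) -> bool:
    """Mirror the four screening questions: a participant is eligible
    only if every criterion is met; the first 'no' ends the survey."""
    for criterion in (active_bcba, supervised_past_6mo,
                      supervised_via_telehealth, supervisee_serves_asd):
        if not criterion:
            return False  # skip logic: jump to the end-of-survey message
    return True  # display logic: show the eligibility message

# e.g., a BCBA who has not supervised via Telehealth is screened out
print(eligible(True, True, False, True))
```

The order of checks mirrors the survey: a disqualifying answer at any screening question skips the respondent past all remaining questions rather than evaluating the other criteria.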
Carry-forward choices were used for two questions: depending on the answers the participant selected in the previous question (e.g., the supervisee barriers they had experienced), only those answers appeared in the subsequent question (e.g., the participant ranked the selected supervisee barriers by how frequently they occurred). Carry-forward choices for supervisor barriers were programmed in a manner identical to those for supervisee barriers. Finally, branches were used to display the corresponding end-of-survey message (i.e., whether or not the participant was interested in entering a drawing to receive financial compensation). Amazon.com gift cards in the amount of $10 were used as incentives for participants to complete the survey. After completing the survey, participants who were interested in entering a drawing to receive financial compensation for their time were directed to email their contact information to a university-affiliated email address that the researchers did not have access to.

Procedure

The initial survey email was sent to potential participants directly through the BACB mass email service. Potential participants had access to the survey for two weeks after the initial survey email was distributed. Each potential participant was able to access the survey only one time using the link provided in the email in order to prevent individuals from submitting multiple responses. One week after the initial email was sent, a reminder email, identical to the initial email, was sent to potential participants. Recruitment and data collection were conducted from May 4, 2021 to May 18, 2021.
Data Analysis

Following the distribution of the survey, participant data were analyzed using descriptive statistics, specifically frequencies, percentages, and measures of central tendency, in ways consistent with similar studies (e.g., Hajiaghamohseni et al., 2020; Sellers et al., 2019). Descriptive statistics were used for respondent demographics, supervision load and supervision meeting logistics, barriers experienced (first research question), and strategies used to address and/or mitigate barriers that arose (second research question). Data for the first research question (i.e., barriers experienced) were also analyzed using Cochran’s Q test and pairwise post-hoc Dunn tests with Bonferroni adjustments (Sheskin, 2011). Cochran’s Q test was used to determine whether there were differences in the dichotomous dependent variable (i.e., whether the participant experienced the barrier or not) across three or more related groups (i.e., multiple barriers). Because the Cochran’s Q test found differences between the barriers, pairwise post-hoc tests were also conducted to identify which comparisons were significant. All statistical analyses were conducted using the Statistical Package for the Social Sciences (SPSS) Version 26.

Results

According to the BACB, 42,405 individuals qualified to potentially receive the email invitation. Ultimately, the metrics provided by the BACB after the completion of data collection indicated that the initial email was sent to 18,983 individuals. Of those individuals, 2,115 opened the email and 121 clicked the survey link included in the email. The reminder email was sent to 19,154 individuals (this number was higher because the number of certificants who subscribe to the listserv is fluid). Of those individuals, 2,246 opened the email and 151 clicked the survey link included in the email.
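The analyses above were run in SPSS. As a rough illustration of what Cochran’s Q computes, the statistic can be sketched in plain Python. This is a minimal sketch with made-up data, not the study’s dataset, and it omits the post-hoc Dunn tests with Bonferroni adjustments.

```python
def cochrans_q(data):
    """Cochran's Q for n participants (rows) x k related binary
    measures (columns), e.g., experienced-barrier indicators.
    Q = (k - 1) * (k * sum(G_j^2) - T^2) / (k * T - sum(R_i^2)),
    where G_j are column totals, R_i are row totals, and T is the
    grand total. Q is approximately chi-square with k - 1 df."""
    k = len(data[0])
    col_totals = [sum(row[j] for row in data) for j in range(k)]
    row_totals = [sum(row) for row in data]
    grand = sum(row_totals)
    numerator = (k - 1) * (k * sum(g * g for g in col_totals) - grand ** 2)
    denominator = k * grand - sum(r * r for r in row_totals)
    return numerator / denominator

# Hypothetical yes/no responses: 5 participants x 3 barriers
responses = [
    [1, 1, 0],
    [1, 0, 0],
    [1, 1, 1],
    [1, 1, 0],
    [0, 0, 0],
]
q = cochrans_q(responses)  # compare against chi-square with k - 1 df
```

Because each participant answers about every barrier, the measures are related, which is why Cochran’s Q (rather than an ordinary chi-square test of independence) is the appropriate omnibus test here.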
Of the 272 people who clicked on the survey link, 150 responses were collected. Of those 150 responses, 23 participants (15.3%) did not complete the entire survey, 2 (1.3%) did not currently hold a BCBA credential in good standing, 42 (28.0%) had not provided supervision in the past six months, 10 (6.7%) had not provided supervision via Telehealth in the past six months, and 4 (2.7%) did not supervise individuals who provided behavior analytic services to individuals with ASD. Therefore, 81 of the original 150 responses were discarded because they did not meet our a priori inclusion criteria. Sixty-nine participants (46.0%) completed the entire survey and met the inclusion criteria, and were therefore included in the data analyses. The overall survey response rate was 0.8% (150 out of 19,154 individuals).

Respondent Demographics

Table 3.1 contains specific demographic information for the 69 participants included in the data analysis. The mean age of the participants was 36.8 years (range, 25-65), the median age was 35 years, and the mode was 36 years (n = 7, 10.1%). A majority of the participants were female (n = 60, 87.0%), were White (n = 57, 82.6%), and held a master's degree (n = 55, 79.7%). A total of 25 states were represented, with the most participants working in California (n = 14, 20.3%) and the second most in Michigan (n = 10, 14.5%). When the demographics of the 69 participants were compared to the BACB's certificant data (BACB, n.d.; see Table 3.2), the demographics of the present study were reflective of the demographics of the profession at that time. When participants were asked how many years they had worked as a BCBA, the most frequently selected answer was 6 or more years (n = 29, 42.0%) and the second most was 2 years (n = 11, 15.9%; see Table 3.3).
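The sample attrition reported above reduces to simple arithmetic; the sketch below (the dictionary keys are our labels, not the survey instrument's) reproduces the reported counts:

```python
collected = 150
excluded = {
    "did not complete the entire survey": 23,
    "no current BCBA credential in good standing": 2,
    "no supervision in past six months": 42,
    "no Telehealth supervision in past six months": 10,
    "did not supervise ASD service providers": 4,
}
analyzed = collected - sum(excluded.values())
print(analyzed)                           # 69 participants entered the analyses
print(round(100 * collected / 19154, 1))  # 0.8 (% overall response rate)
```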
When participants were asked how many years they had worked at their current organization, the most frequently selected answer was 6 or more years (n = 17, 24.6%) and the second most was tied between less than one year (n = 11, 15.9%) and one year (n = 11, 15.9%). When asked to estimate how many employees worked at their current organization, the most frequently selected answer was 1-25 employees (n = 19, 27.5%) and the second most was 251 or more employees (n = 15, 21.7%). When participants were asked about their current organization setting, the most frequently selected answer was multiple settings (e.g., ABA agency clinic-based, ABA agency home-based, school; n = 23, 33.3%) and the second most selected setting was ABA agency home-based (n = 20, 29.0%). Finally, 47 participants (68.1%) reported they had not provided supervision via Telehealth prior to the COVID-19 pandemic, while 59 participants (85.5%) reported that they intend to continue providing supervision via Telehealth after the COVID-19 pandemic.

Supervision Load and Supervision Meeting Logistics

A majority of the participants indicated that they had been providing supervision via Telehealth over the entire past six months (n = 48, 69.6%; see Table 3.4). Additionally, a majority of participants indicated they were supervising one (n = 18, 26.1%), two (n = 16, 23.2%), or three (n = 15, 21.7%) individuals. When participants were asked to best define the population of the individuals with ASD receiving services via Telehealth, the most frequently selected answer was elementary school (n = 18, 26.1%) and the second most was early intervention (n = 8, 11.6%). When asked how frequently supervision meetings occurred on average, a majority of the participants indicated that they held meetings one time per week (n = 42, 60.9%) and that meetings were typically 40-60 min (n = 28, 40.6%) or 61-75 min (n = 17, 24.6%) long.
Finally, when participants were asked to indicate which individuals (e.g., supervisor, client) were typically present/available during supervision meetings, a majority of participants indicated that the supervisor (themselves) was always (n = 63, 91.3%) present/available, a BCBA or individual pursuing a BCBA credential was always (n = 55, 79.7%) present/available, a behavior technician was always (n = 27, 39.1%) or usually (n = 22, 31.9%) present/available, a client was usually (n = 31, 44.9%) present/available, and the client's caregiver was rarely (n = 26, 37.7%) or usually (n = 23, 33.3%) present/available (see Table 3.5).

Experiences When Providing Supervision Via Telehealth

When participants were asked if they had received training on how to provide supervision in the past six months, 33 participants (47.8%) indicated that they had. Thirty-two participants (46.4%) indicated that they had not received training on supervision in the past six months but had received training before. Four participants (5.8%) indicated that they had not received training in the past six months or before. When participants were asked if they had received training on how to provide supervision via Telehealth in the past six months, 26 participants (37.7%) indicated that they had. Twenty-eight participants (40.6%) indicated that they had not received training on supervision via Telehealth in the past six months but had received training before. Fifteen participants (21.7%) indicated that they had not received such training in the past six months or before. When participants were asked to indicate what modalities they used to provide supervision via Telehealth, all 69 participants (100%) indicated that they used a video conferencing software (e.g., Zoom; see Table 3.6).
Twelve participants (17.4%) indicated they used both a video conferencing software and phone calls, and another twelve (17.4%) indicated that they used a video conferencing software, emails, phone calls, and text messages.

Barriers Experienced

When asked which supervisee barriers they had experienced, the most frequent supervisee barrier was internet connectivity issues (n = 46, 66.7%) and the second most frequent was distractions during the supervision meeting (n = 25, 36.2%; see Figure 3.1). A majority of participants experienced one (n = 10, 14.5%), two (n = 14, 20.3%), three (n = 12, 17.4%), or four (n = 10, 14.5%) supervisee barriers, with a mean of 3.3 barriers (see Figure 3.2). Six participants (8.7%) indicated they had not experienced any supervisee barriers. Cochran's Q test indicated there were differences between the proportions among the 19 barriers, χ2(18, N = 69) = 214.05, p < .001. The overall effect of barrier on the results was relatively weak, R = .144. Finally, the pairwise post-hoc Dunn test with Bonferroni adjustments was significant for 35 comparisons (see Table 3.7 for a list of pairwise comparisons). When asked which supervisor barriers they had experienced, the most frequent supervisor barrier was the ability to model or demonstrate strategies (n = 29, 42.0%) and the second most frequent was obstruction of view or supervisee out of lens view (n = 28, 40.6%; see Figure 3.3). A majority of participants experienced one (n = 12, 17.4%) or two (n = 16, 23.2%) supervisor barriers, with a mean of 2.9 barriers (see Figure 3.2). Nine participants (13.0%) indicated they had not experienced any supervisor barriers. Cochran's Q test indicated there were differences between the proportions among the 25 barriers, χ2(24, N = 69) = 245.5, p < .001. The overall effect of barrier on the results was relatively weak, R = .125.
Finally, the pairwise post-hoc Dunn test with Bonferroni adjustments was significant for 69 comparisons (see Table 3.8 for a list of pairwise comparisons).

Strategies Used to Address and/or Mitigate Barriers Experienced

Overall, the most frequently used strategies to address and/or mitigate each supervisee barrier varied among setting clear expectations for the supervisee, clarifying expectations for the supervisee, the supervisor providing a training on technology and/or video conferencing software to the supervisee, and the supervisee gaining access to internet services (see Table 3.9 for the strategies used to address and/or mitigate each supervisee barrier and Table 3.10 for the most frequent strategy used). Overall, the most frequently used strategies to address and/or mitigate each supervisor barrier varied among the supervisor not having used a strategy to address and/or mitigate the barrier that arose, setting clear expectations for themselves, and developing and implementing a self-management strategy for themselves (see Table 3.11 for the strategies used to address and/or mitigate each supervisor barrier and Table 3.12 for the most frequent strategy used).

Discussion

The purpose of the present study was to identify barriers BCBAs experienced while providing supervision via Telehealth and to identify strategies BCBAs used to address and/or mitigate those barriers. The results indicate that the barriers experienced most often when providing supervision via Telehealth differed from the barriers experienced most often when providing supervision in a face-to-face context (as reported in Sellers et al., 2019). As a result, a key finding of this study is that supervision (face-to-face, Telehealth, and hybrid) and training on how to provide supervision should be tailored to the context in which services are provided.
Sellers and colleagues (2019) found the most common barrier was a lack of time to adequately prepare for supervision meetings and to develop a tracking system to monitor the skills and knowledge the supervisee had mastered. Additional barriers consisted of the cost of materials, lack of access to resources, lack of access to examples, uncertainty about supervision requirements, and uncertainty about how to teach and measure certain supervisee skills. Within the Telehealth context, however, we found the most common barriers were internet connectivity issues (supervisee barrier) and the ability to model or demonstrate strategies to the supervisee (supervisor barrier). Additional barriers consisted of distractions during the supervision meeting (supervisee barrier), scheduling conflicts (supervisee barrier), obstruction of view or supervisee out of lens view (supervisor barrier), and internet connectivity issues (supervisor barrier). A logical explanation for these differences is the context in which supervision was provided (i.e., Telehealth vs. face-to-face); however, more information is needed to fully determine this. Future research could ask supervisors who have provided supervision in both face-to-face and Telehealth contexts what barriers they have experienced and evaluate the reported barriers for similarities and differences. This comparison would help inform the creation of supervision resources for both face-to-face and Telehealth supervision and could help inform trainings for supervisors who will provide supervision in either or both of these contexts.

Training on How to Provide Supervision Via Telehealth

Seventy-eight percent of participants reported they had received training on how to provide supervision via Telehealth in the past six months or before. This number may initially appear high; however, in previous research, Hajiaghamohseni and colleagues (2020) found that 99.1% of their participants had prior supervision training.
Alternatively, it is alarming that 21.7% of participants reported that they had not received training on how to provide supervision via Telehealth in the past six months or before. This finding is concerning because, regardless of the level of expertise a BCBA has when providing supervision or services in person, it is important not to assume that supervision skills will transfer to a Telehealth context without specific training in that context (Lerman et al., 2020). Additionally, this finding is problematic because poor training practices may become professional habits and negatively impact treatment outcomes for clients (Sellers et al., 2016a). Though our findings are concerning, we would be remiss to ignore the fact that the present survey was administered in the context of a public health emergency (i.e., COVID-19). Therefore, training specific to supervision via Telehealth may not have been possible at that time and, in some cases, may not have been ethically justifiable to require (see Cox et al., 2020). Given the time and context in which the survey was administered, the results regarding training may need to be interpreted with caution. Future research could conduct a follow-up survey to determine whether the present results were in part due to that context or whether there is an issue with training more broadly. For participants who reported having at least some training in providing supervision via Telehealth, the type, dosage, and quality of the training, and when the training was received, are unknown; the present study did not ask these specific questions. Future research could gather specific information about the amount of training received, when it was received, and its type. Additionally, future research could evaluate the amount and type of training supervisors need to potentially prevent barriers from occurring and to identify strategies that can be used to address, in a timely manner, the barriers that do occur.
This information could then inform employers about the amount and type of training supervisors should receive to ensure they provide effective supervision.

Number of Years as a BCBA

Our initial findings suggested that receiving training did not appear to prevent reported barriers from occurring. Therefore, we conducted a post-hoc analysis to evaluate whether there was a relationship between the number of years a participant had been a BCBA and the number of barriers they experienced. A linear regression established that the number of years a participant had been a BCBA did not result in differences in the number of supervisee or supervisor barriers they experienced. This finding is important because it underscores that supervisor resources and trainings should not focus solely on newly credentialed or inexperienced BCBAs and instead should address all BCBAs, regardless of the number of years they have been certified or their level of experience (Lerman et al., 2020).

Practical Implications

To reduce the probability of barriers occurring, an organization should provide training to all BCBAs on how to provide supervision via Telehealth prior to beginning the supervision process (see Table 3.13 for recommendations). When creating this training, the organization should follow the evidence-based practice process, which consists of using the best available research (e.g., effective training methods) while considering the values of the client, the context (e.g., supervision via Telehealth), and the expertise of the individual providing the training to identify the practices to use (Brodhead et al., 2018; Slocum et al., 2014). Additionally, the organization should ensure it specifically tailors the training to the needs of providing supervision via Telehealth. However, it may not always be possible to completely eliminate or prevent barriers.
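The post-hoc analysis described under "Number of Years as a BCBA" was a simple linear regression (run in SPSS). As an illustration on hypothetical data (the values below are invented, not the study's), the slope of barriers experienced on years certified can be estimated with ordinary least squares:

```python
def ols_fit(x, y):
    """Ordinary least squares fit of y = intercept + slope * x.
    Here x = years certified as a BCBA and y = number of barriers
    experienced; a slope near zero is consistent with the null
    finding reported above. Returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical sample: barrier counts unrelated to years certified
years = [1, 2, 3, 4, 5, 6]
barriers = [3, 4, 3, 4, 3, 4]
slope, intercept = ols_fit(years, barriers)
```

With these invented data the estimated slope is close to zero, mirroring (but not reproducing) the pattern the regression in the study found.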
If the onset of a barrier cannot be prevented, an organization should track the barriers its employees commonly experience and use that information to inform revisions of future trainings. Additionally, an organization may consider providing a resource of potential strategies that can be used to address and/or mitigate barriers that arise. For example, an organization could create, or modify an existing (see Lee et al., 2015 for an example), troubleshooting guide that includes a table or decision-making tree of steps the supervisor can take to try to address and/or mitigate issues that arise. Furthermore, an organization may consider investing in and providing appropriate and adequate resources, especially resources related to technology, to its employees to help offset barriers they may experience. Finally, we recommend the organization continuously track the barriers its employees experience in order to inform quality improvement and revisions to trainings and/or resources.

Individual Level

Communicating the expectations of supervision from the beginning of the supervision process may increase the effectiveness of supervision (Sellers et al., 2019) and lead to continued growth and development for both the supervisee and supervisor (Valentino, 2021). To potentially prevent barriers from occurring, there are steps we recommend individuals take prior to beginning supervision via Telehealth. First, both the supervisor and supervisee should acquire internet access and ensure that their internet connections are strong. If either the supervisor or the supervisee is unable to acquire internet access, or a strong connection, in their current setting, they may consider upgrading their modem/internet service and/or reducing the number of devices connected to the internet (Lee et al., 2015).
If acquiring internet access is still unsuccessful, they may consider changing their location to one where internet is available and the connection is strong, using an asynchronous modality, and/or using a different modality (e.g., a phone instead of a computer; Neely et al., 2022). Second, the supervisor should clarify and set clear expectations (e.g., how and when feedback will occur, how to receive feedback) for both the supervisee and themselves. When clarifying and setting expectations, the supervisor should also include information about how supervisees are to engage in professional development activities (e.g., conferences; see Becerra et al., 2020), complete assignments by predetermined deadlines (e.g., prior to weekly supervision meetings), and apply what they learned during supervision to their practice (e.g., how to implement feedback; Valentino, 2021). Additionally, supervisors should consider having a formal conversation with the supervisee about appropriate places to hold supervision meetings (e.g., consider avoiding public places, or wearing headphones if public places cannot be avoided; Britton & Cicoria, 2019). Third, if the supervisor is unfamiliar with using technology and/or video conferencing software, they should first obtain training on that technology and/or software (Ninci et al., 2021). In addition to reading or reviewing any manuals or instructions accompanying the technology or software, the supervisor should ask for guidance from someone within their organization who may be familiar with it. Fourth, if the supervisee is unfamiliar with using technology and/or video conferencing software, the supervisor may consider providing a training to the supervisee (Ninci et al., 2021).
Additionally, if the supervisor provides a training to the supervisee on using technology and/or video conferencing software, the supervisor may consider including a discussion of how to ensure ethical considerations and practices (e.g., how to manage data appropriately) are adhered to when providing supervision via Telehealth (Britton & Cicoria, 2019; Cavalari et al., 2015; Quigley et al., 2019). This is especially important given the results of the present study, in which twelve participants indicated they used a variety of technologies when providing supervision. The supervisor and supervisee should ensure privacy and confidentiality are protected by only using technology that is HIPAA or FERPA compliant (Pollard et al., 2017). Finally, the supervisor and supervisee may consider scheduling an initial troubleshooting session prior to beginning supervision meetings to ensure they have strong internet connections, go over expectations and answer any questions the supervisee may have, and ensure that both are familiar with the technology and/or video conferencing software (Lerman et al., 2020). During this meeting, the supervisor may also consider providing the supervisee with additional resources (e.g., a troubleshooting guide for the video conferencing software, a task analysis for how to set up and tear down a webcam; see Zoder-Martell et al., 2020 for an example) that the supervisee can use during the initial meeting and throughout the supervision process.

Limitations

Several limitations of the present study should be noted. The first limitation is that the reliability and validity of the survey were not evaluated, although expert and content reviews of the survey were conducted. Future survey research should consider evaluating the reliability and validity of the survey prior to distributing it, to further minimize measurement error (Alwin, 2010). A second limitation was the low response rate of 0.8%.
Having a low response rate impacts having an adequate sample size and a representative sample (Krezmien et al., 2017). As a result, the low response rate and small sample size limit the generalization of the results to all BCBAs and BCBA-Ds providing supervision via Telehealth. Additionally, the low response rate and small sample size limit the ability to conduct additional statistical analyses (e.g., logistic regression of barriers and strategies) to further explore and analyze the research questions (Rogelberg & Stanton, 2007). A potential cause of the low response rate in the present study could be survey fatigue experienced during the COVID-19 pandemic. Research has shown that during a public health emergency (e.g., the COVID-19 pandemic), over-exposure to online and/or telephone surveys results in individuals becoming fatigued and, as a result, not taking part in surveys (Field, 2020; Patel et al., 2020). Given that the survey was distributed during the COVID-19 pandemic, it is possible that potential participants had been receiving numerous survey requests, which resulted in survey fatigue. A third limitation of the study was that the survey was open and available for participants to complete for only two weeks; therefore, it is possible that potential participants were missed, which may also have contributed to the low response rate. A fourth limitation was that potential participants were able to access the survey only one time using the link provided in the email, in order to prevent individuals from submitting multiple responses. Therefore, it is possible that potential participants were missed if they had exited out of the link and were unable to access the survey again. A fifth limitation of the present study was that participants consisted only of individuals who held a BCBA credential, or a BCBA credential with a doctoral designation (i.e., BCBA-D), and who were providing supervision via Telehealth to supervisees.
Therefore, we only received responses from supervisors; the extent to which supervisee and supervisor responses positively correlate with one another is unknown. Finally, a sixth limitation was that, when participants were asked to indicate what strategies (if any) they used to address and/or mitigate each barrier, they simply selected the strategies used. As a result, it is unclear whether the strategies used were effective in addressing and/or mitigating the barriers. Future research could evaluate the effectiveness of strategies used to address and/or mitigate supervisee and supervisor barriers to help inform the development of trainings and resources for individuals providing supervision via Telehealth.

APPENDIX

Table 3.1. Demographics of Survey Participants

Item  n  %
Age
  25  1  1.4
  26  1  1.4
  27  3  4.3
  28  5  7.2
  29  6  8.7
  30  4  5.8
  31  1  1.4
  32  5  7.2
  33  4  5.8
  34  2  2.9
  35  3  4.3
  36  7  10.1
  37  2  2.9
  38  5  7.2
  39  4  5.8
  41  3  4.3
  42  3  4.3
  46  2  2.9
  47  1  1.4
  50  1  1.4
  54  1  1.4
  61  1  1.4
  62  1  1.4
  63  1  1.4
  64  1  1.4
  65  1  1.4
Gender
  Male  9  13.0
  Female  60  87.0
  Transgender  0  0.0
  Non-Binary/Agender  0  0.0
  A different identity  0  0.0
  Prefer not to answer  0  0.0
Race
  American Indian or Alaska Native  0  0.0
  Asian  2  2.9
  Black or African American  3  4.3
  Native Hawaiian or Other Pacific Islander  0  0.0
  White  57  82.6
  Multiple answers  1  1.4
  Other  3  4.3
  Prefer not to answer  3  4.3
Ethnicity
  Latinx, Hispanic, or Spanish Origin  7  10.1
  Not Latinx, Hispanic, or Spanish Origin  57  82.6
  Prefer not to answer  5  7.2
Highest Level of Education
  Master's  55  79.7
  Doctorate  14  20.3
  Prefer not to answer  0  0.0
Note. For age, we only listed the ages that at least one participant reported; if an age is not listed, zero participants reported that age.

Table 3.2.
Demographic Comparison of Survey Respondents with BACB Data

Item  BACB  Telehealth Supervision Survey
Gender
  Female  86.16%  86.96%
  Male  12.02%  13.04%
  Non-binary  0.21%  0%
  Other  0.02%  0%
  No Answer  1.59%  0%
Race/Ethnicity
  White  71.82%  82.60%
  American Indian/Alaska Native  0.30%  0%
  Asian  5.99%  2.90%
  Black  3.60%  4.35%
  Hispanic/Latinx  9.34%  1.45%
  Native Hawaiian/Pacific Islander  0.38%  0%
  Other  -  2.90%
  Multiple answers  -  1.45%
  No Answer  8.57%  4.35%
Note. The BACB demographic information is as of July 1, 2021.

Table 3.3. Supervision Specific Demographics of Survey Participants

Item  n  %
State Worked In
  Arizona  1  1.4
  Arkansas  1  1.4
  California  14  20.3
  Connecticut  1  1.4
  Florida  3  4.3
  Georgia  7  10.1
  Illinois  1  1.4
  Indiana  2  2.9
  Kansas  1  1.4
  Massachusetts  2  2.9
  Michigan  10  14.5
  Minnesota  1  1.4
  Missouri  1  1.4
  Nevada  1  1.4
  New Hampshire  1  1.4
  New Jersey  3  4.3
  New York  4  5.8
  North Carolina  3  4.3
  Ohio  1  1.4
  Pennsylvania  2  2.9
  Texas  1  1.4
  Utah  1  1.4
  Virginia  4  5.8
  Washington  2  2.9
  Wisconsin  1  1.4
Years Worked as BCBA
  Less than 1 year  5  7.2
  1 year  5  7.2
  2 years  11  15.9
  3 years  5  7.2
  4 years  8  11.6
  5 years  5  7.2
  6+ years  29  42.0
  Prefer not to answer  1  1.4
Years Worked at Organization
  Less than 1 year  11  15.9
  1 year  11  15.9
  2 years  9  13.0
  3 years  7  10.1
  4 years  9  13.0
  5 years  4  5.8
  6+ years  17  24.6
  Prefer not to answer  1  1.4
Number of Employees
  1-25  19  27.5
  26-50  9  13.0
  51-75  4  5.8
  76-100  11  15.9
  101-125  3  4.3
  126-150  3  4.3
  151-175  3  4.3
  176-200  2  2.9
  201-225  0  0.0
  226-250  0  0.0
  251 or more  15  21.7
Note. For state worked in, we only listed the states that at least one participant reported; if a state is not listed, zero participants reported working in that state.

Table 3.4.
Supervision Load and Meeting Logistics

Item  n  %
Number of Months Supervision was Provided
  Less than one month  2  2.9
  One month  5  7.2
  Two months  4  5.8
  Three months  2  2.9
  Four months  6  8.7
  Five months  2  2.9
  Six months  48  69.6
  Prefer not to answer  0  0.0
Number of Supervisees
  One individual  18  26.1
  Two individuals  16  23.2
  Three individuals  15  21.7
  Four individuals  5  7.2
  Five individuals  4  5.8
  Six or more individuals  11  15.9
  Prefer not to answer  0  0.0
Client Population
  Early intervention  8  11.6
  Early intervention and Pre-school  1  1.4
  Early intervention and Elementary school  3  4.3
  Early intervention, Pre-school, and Elementary school  5  7.2
  Early intervention, Pre-school, Elementary school, and Junior high  5  7.2
  Early intervention, Elementary school, and Junior high  2  2.9
  Early intervention, Pre-school, Elementary school, Junior high, and High school and above  2  2.9
  Early intervention, Pre-school, Elementary school, and High school and above  2  2.9
  Pre-school  2  2.9
  Pre-school and Elementary school  4  5.8
  Pre-school, Elementary school, and Junior high  4  5.8
  Pre-school, Elementary school, Junior high, and High school and above  2  2.9
  Elementary school  18  26.1
  Elementary school, Junior high, and High school and above  2  2.9
  Elementary school and High school and above  1  1.4
  Junior high  1  1.4
  Junior high and High school and above  2  2.9
  High school and above  5  7.2
Average Frequency of Supervision Meetings
  One time per week  42  60.9
  Two times per week  19  27.5
  Three times per week  2  2.9
  Four or more times per week  1  1.4
  Prefer not to answer  5  7.2
Average Length of Supervision Meetings
  0-15 min  0  0.0
  16-30 min  4  5.8
  31-45 min  7  10.1
  40-60 min  28  40.6
  61-75 min  17  24.6
  76-90 min  8  11.6
  91+ min  5  7.2
  Prefer not to answer  0  0.0

Table 3.5.
Frequency of Individuals Typically Present/Available During Supervision Meetings

Item  n  %
Supervisor
  Always  63  91.3
  Usually  5  7.2
  Rarely  1  1.4
  Never  0  0.0
  Not applicable (N/A)  0  0.0
BCBA or Individual Pursuing a BCBA Credential
  Always  55  79.7
  Usually  14  20.3
  Rarely  0  0.0
  Never  0  0.0
  Not applicable (N/A)  0  0.0
Behavior Technician
  Always  27  39.1
  Usually  22  31.9
  Rarely  4  5.8
  Never  4  5.8
  Not applicable (N/A)  12  17.4
Client
  Always  24  34.8
  Usually  31  44.9
  Rarely  3  4.3
  Never  7  10.1
  Not applicable (N/A)  4  5.8
Client's Caregiver
  Always  5  7.2
  Usually  23  33.3
  Rarely  26  37.7
  Never  9  13.0
  Not applicable (N/A)  6  8.7
Other Individual
  Always  0  0.0
  Usually  2  2.9
  Rarely  5  7.2
  Never  8  11.6
  Not applicable (N/A)  54  78.3

Table 3.6. Modality Used for Supervision Meetings

Item  n  %
Video Conferencing Software (e.g., Zoom)  31  44.9
Video Conferencing Software and Bluetooth Device  2  2.9
Video Conferencing Software and Email  3  4.3
Video Conferencing Software and Phone Call  12  17.4
Video Conferencing Software, Email, and Phone Call  3  4.3
Video Conferencing Software, Email, and Text Message  2  2.9
Video Conferencing Software, Phone Call, and Text Message  2  2.9
Video Conferencing Software, Email, Text Message, and Bluetooth Device  1  1.4
Video Conferencing Software, Email, Phone Call, and Text Message  12  17.4
Video Conferencing Software, Email, Phone Call, Text Message, and Bluetooth Device  1  1.4

Table 3.7. Pairwise Comparisons for Supervisee Barriers

Sample 1 - Sample 2  Test Statistic  Std. Error  Std. Test Statistic  Sig.  Adj. Sig.a
Internet Connectivity Issues vs.:
  Distractions during supervision meeting  .304  .064  4.779  .000  .000*
  Scheduling conflicts  .377  .064  5.917  .000  .000*
  Access to clients  .406  .064  6.372  .000  .000*
  Organizational skills  .435  .064  6.827  .000  .000*
  Implementation of feedback provided  .435  .064  6.827  .000  .000*
  Familiarity with technology or video conferencing software  .435  .064  6.827  .000  .000*
  Access to therapy or assessment materials  .449  .064  7.055  .000  .000*
  Access to internet services  .449  .064  7.055  .000  .000*
  Non responsiveness  .536  .064  8.420  .000  .000*
  Absenteeism  .536  .064  8.420  .000  .000*
  Professionalism  .565  .064  8.875  .000  .000*
  I have not experienced supervisee barriers  -.580  .064  -9.103  .000  .000*
  Access to computer, tablet, or smart phone with a webcam or an external webcam  .594  .064  9.330  .000  .000*
  Access to video conferencing software that is compliant with HIPAA  .609  .064  9.558  .000  .000*
Internet Connectivity Issues vs. (continued):
  Access to encrypted computer, tablet, or smart phone  .609  .064  9.558  .000  .000*
  Access to computer, tablet, or smart phone with a microphone or an external microphone  .623  .064  9.785  .000  .000*
  Prefer not to answer  -.652  .064  -10.241  .000  .000*
  Other  -.652  .064  -10.241  .000  .000*
Distractions During Supervision Meeting vs.:
  Scheduling conflicts  -.072  .064  -1.138  .255  1.000
  Access to clients  -.101  .064  -1.593  .111  1.000
  Organizational skills  .130  .064  2.048  .041  1.000
  Implementation of feedback provided  .130  .064  2.048  .041  1.000
  Familiarity with technology or video conferencing software  -.130  .064  -2.048  .041  1.000
  Access to therapy or assessment materials  -.145  .064  -2.276  .023  1.000
  Access to internet services  -.145  .064  -2.276  .023  1.000
  Non responsiveness  .232  .064  3.641  .000  .046*
  Absenteeism  .232  .064  3.641  .000  .046*
  Professionalism  .261  .064  4.096  .000  .007*
  I have not experienced supervisee barriers  -.275  .064  -4.324  .000  .003*
Statistic Statistic Sig.a Distractions During Supervision Meeting continued Access to computer, tablet, -.290 .064 -4.551 .000 .001* or smart phone with a webcam or an external webcam Access to video -.304 .064 -4.779 .000 .000* conferencing software that is compliant with HIPAA Access to encrypted -.304 .064 -4.779 .000 .000* computer, tablet, or smart phone Access to computer, tablet, -.319 .064 -5.006 .000 .000* or smart phone with a microphone or an external microphone Prefer not to answer -.348 .064 -5.462 .000 .000* Other -.348 .064 -5.462 .000 .000* Scheduling Conflicts Access to clients -.029 .064 -.455 .649 1.000 Organizational skills .058 .064 .910 .363 1.000 Implementation of feedback .058 .064 .910 .363 1.000 provided Familiarity with technology -.058 .064 -.910 .363 1.000 or video conferencing software Access to therapy or .072 .064 1.138 .255 1.000 assessment materials Access to internet services -.072 .064 -1.138 .255 1.000 Non Responsiveness .159 .064 2.503 .012 1.000 158 Table 3.7 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. 
Statistic Statistic Sig.a Scheduling Conflicts continued Absenteeism .159 .064 2.503 .012 1.000 Professionalism .188 .064 2.958 .003 .529 I have not experienced -.203 .064 -3.186 .001 .247 supervisee barriers Access to computer, tablet, -.217 .064 -3.414 .001 .110 or smart phone with a webcam or an external webcam Access to video -.232 .064 -3.641 .000 .046* conferencing software that is compliant with HIPAA Access to encrypted -.232 .064 -3.641 .000 .046* computer, tablet, or smart phone Access to computer, tablet, -.246 .064 -3.869 .000 .019* or smart phone with a microphone or an external microphone Prefer not to answer -.275 .064 -4.324 .000 .003* Other -.275 .064 -4.324 .000 .003* Access to Clients Organizational skills .029 .064 .455 .649 1.000 Implementation of feedback .029 .064 .455 .649 1.000 provided Familiarity with technology -.029 .064 -.455 .649 1.000 or video conferencing software Access to therapy or .043 .064 .683 .495 1.000 assessment materials 159 Table 3.7 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. 
Statistic Statistic Sig.a Access to Clients continued Access to internet services -.043 .064 -.683 .495 1.000 Non Responsiveness .130 .064 2.048 .041 1.000 Absenteeism .130 .064 2.048 .041 1.000 Professionalism .159 .064 2.503 .012 1.000 I have not experienced -.174 .064 -2.731 .006 1.000 supervisee barriers Access to computer, tablet, -.188 .064 -2.958 .003 .529 or smart phone with a webcam or an external webcam Access to video -.203 .064 -3.186 .001 .247 conferencing software that is compliant with HIPAA Access to encrypted -.203 .064 -3.186 .001 .247 computer, tablet, or smart phone Access to computer, tablet, -.217 .064 -3.414 .001 .110 or smart phone with a microphone or an external microphone Prefer not to answer -.246 .064 -3.869 .000 .019* Other -.246 .064 -3.869 .000 .019* Organization Skills Implementation of feedback .000 .064 .000 1.000 1.000 provided Familiarity with technology .000 .064 .000 1.000 1.000 or video conferencing software 160 Table 3.7 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. 
Statistic Statistic Sig.a Organization Skills continued Access to therapy or -.014 .064 -.228 .820 1.000 assessment materials Access to internet services -.014 .064 -.228 .820 1.000 Non responsiveness .101 .064 1.593 .111 1.000 Absenteeism .101 .064 1.593 .111 1.000 I have not experienced -.145 .064 -2.276 .023 1.000 supervisee barriers Access to computer, tablet, -.159 .064 -2.503 .012 1.000 or smart phone with a webcam or an external webcam Access to video -.174 .064 -2.731 .006 1.000 conferencing software that is compliant with HIPAA Access to encrypted -.174 .064 -2.731 .006 1.000 computer, tablet, or smart phone Access to computer, tablet, -.188 .064 -2.958 .003 .529 or smart phone with a microphone or an external microphone Prefer not to answer -.217 .064 -3.414 .001 .110 Other -.217 .064 -3.414 .001 .110 Familiarity with Technology or Video Conferencing Software Access to therapy or .014 .064 .228 .820 1.000 assessment materials Access to internet services -.014 .064 -.228 .820 1.000 161 Table 3.7 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. 
Statistic Statistic Sig.a Familiarity with Technology or Video Conferencing Software continued Non responsiveness .101 .064 1.593 .111 1.000 Absenteeism .101 .064 1.593 .111 1.000 Professionalism .130 .064 2.048 .041 1.000 I have not experienced -.145 .064 -2.276 .023 1.000 supervisee barriers Access to computer, tablet, -.159 .064 -2.503 .012 1.000 or smart phone with a webcam or an external webcam Access to video -.174 .064 -2.731 .006 1.000 conferencing software that is compliant with HIPAA Access to encrypted -.174 .064 -2.731 .006 1.000 computer, tablet, or smart phone Access to computer, tablet, -.188 .064 -2.958 .003 .529 or smart phone with a microphone or an external microphone Prefer not to answer -.217 .064 -3.414 .001 .110 Other -.217 .064 -3.414 .001 .110 Access to Therapy or Assessment Materials Access to internet services .000 .064 .000 1.000 1.000 Non responsiveness .087 .064 1.365 .172 1.000 Absenteeism .087 .064 1.365 .172 1.000 162 Table 3.7 (cont’d) Sample 1 – Sample 2 Test Std. Error Std. Test Sig. Adj. 
Statistic Statistic Sig.a Access to Therapy or Assessment Materials continued Professionalism .116 .064 1.821 .069 1.000 I have not experienced -.130 .064 -2.048 .041 1.000 supervisee barriers Access to computer, tablet, -.145 .064 -2.276 .023 1.000 or smart phone with a webcam or an external webcam Access to video -.159 .064 -2.503 .012 1.000 conferencing software that is compliant with HIPAA Access to encrypted -.159 .064 -2.503 .012 1.000 computer, tablet, or smart phone Access to computer, tablet, -.174 .064 -2.731 .006 1.000 or smart phone with a microphone or an external microphone Prefer not to answer -.203 .064 -3.186 .001 .247 Other -.203 .064 -3.186 .001 .247 Implementation of Feedback Provided Familiarity with technology .000 .064 .000 1.000 1.000 or video conferencing software Access to therapy or -.014 .064 -.228 .820 1.000 assessment materials Access to internet services -.014 .064 -.228 .820 1.000 Non responsiveness .101 .064 1.593 .111 1.000 163 Table 3.7 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. 
Statistic Statistic Sig.a Implementation of Feedback Provided continued Absenteeism .101 .064 1.593 .111 1.000 Access to computer, tablet, -.159 .064 -2.503 .012 1.000 or smart phone with a webcam or an external webcam Access to video -.174 .064 -2.731 .006 1.000 conferencing software that is compliant with HIPAA Access to computer, tablet, -.188 .064 -2.958 .003 .529 or smart phone with a microphone or an external microphone Prefer not to answer -.217 .064 -3.414 .001 .110 Other -.217 .064 -3.414 .001 .110 Access to Internet Services Non responsiveness .087 .064 1.365 .172 1.000 Absenteeism .087 .064 1.365 .172 1.000 Professionalism .116 .064 1.821 .069 1.000 I have not experienced -.130 .064 -2.048 .041 1.000 supervisee barriers Access to computer, tablet, -.145 .064 -2.276 .023 1.000 or smart phone with a webcam or an external webcam Access to video -.159 .064 -2.503 .012 1.000 conferencing software that is compliant with HIPAA 164 Table 3.7 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. Statistic Statistic Sig.a Access to Internet Services continued Access to encrypted -.159 .064 -2.503 .012 1.000 computer, tablet, or smart phone Access to computer, tablet, -.174 .064 -2.731 .006 1.000 or smart phone with a microphone or an external microphone Prefer not to answer -.203 .064 -3.186 .001 .247 Other -.203 .064 -3.186 .001 .247 Absenteeism Non responsiveness .000 .064 .000 1.000 1.000 Professionalism -.029 .064 -.455 .649 1.000 I have not experienced -.043 .064 -.683 .495 1.000 supervisee barriers Access to computer, tablet, -.058 .064 -.910 .363 1.000 or smart phone with a webcam or an external webcam Access to video -.072 .064 -1.138 .255 1.000 conferencing software that is compliant with HIPAA Access to encrypted -.072 .064 -1.138 .255 1.000 computer, tablet, or smart phone Access to computer, tablet, -.087 .064 -1.365 .172 1.000 or smart phone with a microphone or an external microphone 165 Table 3.7 (cont’d) Sample 1 - Sample 2 Test Std. 
Error Std. Test Sig. Adj. Statistic Statistic Sig.a Absenteeism continued Prefer not to answer -.116 .064 -1.821 .069 1.000 Other -.116 .064 -1.821 .069 1.000 Non Responsiveness Professionalism -.029 .064 -.455 .649 1.000 I have not experienced -.043 .064 -.683 .495 1.000 supervisee barriers Access to computer, tablet, -.058 .064 -.910 .363 1.000 or smart phone with a webcam or an external webcam Access to video -.072 .064 -1.138 .255 1.000 conferencing software that is compliant with HIPAA Access to encrypted -.072 .064 -1.138 .255 1.000 computer, tablet, or smart phone Access to computer, tablet, -.087 .064 -1.365 .172 1.000 or smart phone with a microphone or an external microphone Prefer not to answer -.116 .064 -1.821 .069 1.000 Other -.116 .064 -1.821 .069 1.000 Professionalism I have not experienced -.014 .064 -.228 .820 1.000 supervisee barriers Access to computer, tablet, -.029 .064 -.455 .649 1.000 or smart phone with a webcam or an external webcam 166 Table 3.7 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. 
Statistic Statistic Sig.a Professionalism continued Access to video -.043 .064 -.683 .495 1.000 conferencing software that is compliant with HIPAA Access to encrypted -.043 .064 -.683 .495 1.000 computer, tablet, or smart phone Access to computer, tablet, -.058 .064 -.910 .363 1.000 or smart phone with a microphone or an external microphone Prefer not to answer -.087 .064 -1.365 .172 1.000 Other -.087 .064 -1.365 .172 1.000 I Have Not Experienced Supervisee Barriers Access to computer, tablet, .014 .064 .228 .820 1.000 or smart phone with a webcam or an external webcam Access to video .029 .064 .455 .649 1.000 conferencing software that is compliant with HIPAA Access to encrypted .029 .064 .455 .649 1.000 computer, tablet, or smart phone Access to computer, tablet, .043 .064 .683 .495 1.000 or smart phone with a microphone or an external microphone Prefer not to answer .072 .064 1.138 .255 1.000 167 Table 3.7 (cont’d) Sample 1 – Sample 2 Test Std. Error Std. Test Sig. Adj. Statistic Statistic Sig.a I Have Not Experienced Supervisee Barriers continued Other .072 .064 1.138 .255 1.000 Access to Computer, Tablet, or Smart Phone with a Webcam or an External Webcam Access to video -.014 .064 -.228 .820 1.000 conferencing software that is compliant with HIPAA Access to encrypted .014 .064 .228 .820 1.000 computer, tablet, or smart phone Access to computer, tablet, -.029 .064 -.455 .649 1.000 or smart phone with a microphone or an external microphone Prefer not to answer -.058 .064 -.910 .363 1.000 Other -.058 .064 -.910 .363 1.000 Access to Encrypted Computer, Tablet, or Smart Phone Access to video .000 .064 .000 1.000 1.000 conferencing software that is compliant with HIPAA Access to computer, tablet, -.014 .064 -.228 .820 1.000 or smart phone with a microphone or an external microphone Prefer not to answer -.043 .064 -.683 .495 1.000 Other -.043 .064 -.683 .495 1.000 168 Table 3.7 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. 
Statistic Statistic Sig.a Access to Video Conferencing Software that is Compliant with HIPAA Access to computer, tablet, .014 .064 .228 .820 1.000 or smart phone with a microphone or an external microphone Prefer not to answer -.043 .064 -.683 .495 1.000 Other -.043 .064 -.683 .495 1.000 Access to Computer, Tablet, or Smart Phone with a Microphone or an External Microphone Prefer not to answer -.029 .064 -.455 .649 1.000 Other -.029 .064 -.455 .649 1.000 Other Prefer not to answer .000 .064 .000 1.000 1.000 Note. Each row tests the null hypothesis that the Sample 1 and Sample 2 distributions are the same. Asymptotic significances (2-sided tests) are displayed. The significance level is .05. a. Significance values have been adjusted by the Bonferroni correction for multiple tests. * Indicates a pairwise post-hoc Dunn test with Bonferroni adjustments was significant. 169 Table 3.8. Pairwise Comparisons for Supervisor Barriers Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. Statistic Statistic Sig.a Ability to Model or Demonstrate Strategies Obstruction of view or .014 .054 .266 .790 1.000 supervisee out of lens view Internet connectivity -.101 .054 -1.862 .063 1.000 issues Distractions during .130 .054 2.395 .017 1.000 supervision meeting Providing feedback in a .217 .054 3.991 .000 .020* timely manner Time constraints -.232 .054 -4.257 .000 .006* Scheduling conflicts -.246 .054 -4.523 .000 .002* Providing reinforcement .246 .054 4.523 .000 .002* in a timely manner I have not experienced -.290 .054 -5.321 .000 .000* supervisor barriers Access to therapy or .304 .054 5.587 .000 .000* assessment materials Access to resources -.319 .054 -5.853 .000 .000* and/or examples of providing supervision via Telehealth Difficulties with remote .319 .054 5.853 .000 .000* record keeping and paperwork Supervisory volume -.348 .054 -6.385 .000 .000* 170 Table 3.8 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. 
Statistic Statistic Sig.a Ability to Model or Demonstrate Strategies continued Familiarity with -.362 .054 -6.652 .000 .000* technology or video conferencing software Access to internet -.362 .054 -6.652 .000 .000* services Non responsiveness .377 .054 6.918 .000 .000* Obtaining evaluation of -.377 .054 -6.918 .000 .000* supervision activities Professionalism .377 .054 6.918 .000 .000* Absenteeism .391 .054 7.184 .000 .000* Other -.406 .054 -7.450 .000 .000* Access to video -.406 .054 -7.450 .000 .000* conferencing software that is compliant with HIPAA Prefer not to answer -.406 .054 -7.450 .000 .000* Access to computer, -.406 .054 -7.450 .000 .000* tablet, or smart phone with a webcam or external webcam Access to encrypted -.406 .054 -7.450 .000 .000* computer, tablet, or smart phone Access to computer, -.420 .054 -7.716 .000 .000* tablet, or smart phone with a microphone or an external microphone 171 Table 3.8 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. Statistic Statistic Sig.a Obstruction of View or Supervisee Out of Lens View Internet connectivity -.087 .054 -1.596 .110 1.000 issues Distractions during .116 .054 2.128 .033 1.000 supervision meeting Providing feedback in a .203 .054 3.725 .000 .059 timely manner Time constraints -.217 .054 -3.991 .000 .020* Scheduling conflicts -.232 .054 -4.257 .000 .006* Providing reinforcement .232 .054 4.257 .000 .006* in a timely manner I have not experienced -.275 .054 -5.055 .000 .000* supervisor barriers Access to therapy or .290 .054 5.321 .000 .000* assessment materials Access to resources -.304 .054 -5.587 .000 .000* and/or examples of providing supervision via Telehealth Difficulties with remote .304 .054 5.587 .000 .000* record keeping and paperwork Supervisory volume -.333 .054 -6.119 .000 .000* Familiarity with -.348 .054 -6.385 .000 .000* technology or video conferencing software Access to internet -.348 .054 -6.385 .000 .000* services Non responsiveness .362 .054 6.652 .000 .000* 172 Table 
3.8 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. Statistic Statistic Sig.a Obstruction of View or Supervisee Out of Lens View continued Obtaining evaluation of -.362 .054 -6.652 .000 .000* supervision activities Professionalism .362 .054 6.652 .000 .000* Absenteeism .377 .054 6.918 .000 .000* Other -.391 .054 -7.184 .000 .000* Access to video -.391 .054 -7.184 .000 .000* conferencing software that is compliant with HIPAA Prefer not to answer -.391 .054 -7.184 .000 .000* Access to computer, -.391 .054 -7.184 .000 .000* tablet, or smart phone with a webcam or external webcam Access to encrypted -.391 .054 -7.184 .000 .000* computer, tablet, or smart phone Access to computer, -.406 .054 -7.450 .000 .000* tablet, or smart phone with a microphone or an external microphone Internet Connectivity Issues Distractions during .029 .054 .532 .595 1.000 supervision meeting Providing feedback in a .116 .054 2.128 .033 1.000 timely manner Time constraints .130 .054 2.395 .017 1.000 173 Table 3.8 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. 
Statistic Statistic Sig.a Internet Connectivity Issues continued Scheduling conflicts .145 .054 2.661 .008 1.000 Providing reinforcement .145 .054 2.661 .008 1.000 in a timely manner I have not experienced -.188 .054 -3.459 .001 .163 supervisor barriers Access to therapy or .203 .054 3.725 .000 .059 assessment materials Access to resources .217 .054 3.991 .000 .020* and/or examples of providing supervision via Telehealth Difficulties with remote .217 .054 3.991 .000 .020* record keeping and paperwork Supervisory volume .246 .054 4.523 .000 .002* Familiarity with .261 .054 4.789 .000 .001* technology or video conferencing software Access to internet .261 .054 4.789 .000 .001* services Non responsiveness .275 .054 5.055 .000 .000* Obtaining evaluation of .275 .054 5.055 .000 .000* supervision activities Professionalism .275 .054 5.055 .000 .000* Absenteeism .290 .054 5.321 .000 .000* Other -.304 .054 -5.587 .000 .000* 174 Table 3.8 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. Statistic Statistic Sig.a Internet Connectivity Issues continued Access to video .304 .054 5.587 .000 .000* conferencing software that is compliant with HIPAA Prefer not to answer -.304 .054 -5.587 .000 .000* Access to computer, .304 .054 5.587 .000 .000* tablet, or smart phone with a webcam or external webcam Access to encrypted .304 .054 5.587 .000 .000* computer, tablet, or smart phone Access to computer, .319 .054 5.853 .000 .000* tablet, or smart phone with a microphone or an external microphone Distractions During Supervision Meeting Providing feedback in a -.087 .054 -1.596 .110 1.000 timely manner Time constraints -.101 .054 -1.862 .063 1.000 Scheduling conflicts -.116 .054 -2.128 .033 1.000 Providing reinforcement -.116 .054 -2.128 .033 1.000 in a timely manner I have not experienced -.159 .054 -2.927 .003 1.000 supervisor barriers Access to therapy or -.174 .054 -3.193 .001 .423 assessment materials 175 Table 3.8 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. 
Test Sig. Adj. Statistic Statistic Sig.a Distractions During Supervision Meeting continued Access to resources -.188 .054 -3.459 .001 .163 and/or examples of providing supervision via Telehealth Difficulties with remote -.188 .054 -3.459 .001 .163 record keeping and paperwork Supervisory volume -.217 .054 -3.991 .000 .020* Familiarity with -.232 .054 -4.257 .000 .006* technology or video conferencing software Access to internet -.232 .054 -4.257 .000 .006* services Non responsiveness .246 .054 4.523 .000 .002* Obtaining evaluation of -.246 .054 -4.523 .000 .002* supervision activities Professionalism .246 .054 4.523 .000 .002* Absenteeism .261 .054 4.789 .000 .001* Other -.275 .054 -5.055 .000 .000* Access to video -.275 .054 -5.055 .000 .000* conferencing software that is compliant with HIPAA Prefer not to answer -.275 .054 -5.055 .000 .000* 176 Table 3.8 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. Statistic Statistic Sig.a Distractions During Supervision Meeting continued Access to computer, -.275 .054 -5.055 .000 .000* tablet, or smart phone with a webcam or external webcam Access to encrypted -.275 .054 -5.055 .000 .000* computer, tablet, or smart phone Access to computer, -.290 .054 -5.321 .000 .000* tablet, or smart phone with a microphone or an external microphone Providing Feedback in a Timely Manner Time constraints -.014 .054 -.266 .790 1.000 Scheduling conflicts -.029 .054 -.532 .595 1.000 Providing reinforcement -.029 .054 -.532 .595 1.000 in a timely manner I have not experienced -.072 .054 -1.330 .183 1.000 supervisor barriers Access to therapy or -.087 .054 -1.596 .110 1.000 assessment materials Access to resources -.101 .054 -1.862 .063 1.000 and/or examples of providing supervision via Telehealth Difficulties with remote -.101 .054 -1.862 .063 1.000 record keeping and paperwork 177 Table 3.8 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. 
Statistic Statistic Sig.a Providing Feedback in a Timely Manner continued Supervisory volume -.130 .054 -2.395 .017 1.000 Familiarity with -.145 .054 -2.661 .008 1.000 technology or video conferencing software Access to internet -.145 .054 -2.661 .008 1.000 services Non responsiveness .159 .054 2.927 .003 1.000 Obtaining evaluation of -.159 .054 -2.927 .003 1.000 supervision activities Professionalism .159 .054 2.927 .003 1.000 Absenteeism .174 .054 3.193 .001 .423 Other -.188 .054 -3.459 .001 .163 Access to video -.188 .054 -3.459 .001 .163 conferencing software that is compliant with HIPAA Prefer not to answer -.188 .054 -3.459 .001 .163 Access to computer, -.188 .054 -3.459 .001 .163 tablet, or smart phone with a webcam or external webcam Access to encrypted -.188 .054 -3.459 .001 .163 computer, tablet, or smart phone Access to computer, -.203 .054 -3.725 .000 .059 tablet, or smart phone with a microphone or an external microphone 178 Table 3.8 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. Statistic Statistic Sig.a Time Constraints Scheduling conflicts .014 .054 .266 .790 1.000 Providing reinforcement .014 .054 .266 .790 1.000 in a timely manner I have not experienced -.058 .054 -1.064 .287 1.000 supervisor barriers Access to therapy or .072 .054 1.330 .183 1.000 assessment materials Access to resources -.087 .054 -1.596 .110 1.000 and/or examples of providing supervision via Telehealth Difficulties with remote .087 .054 1.596 .110 1.000 record keeping and paperwork Supervisory volume -.116 .054 -2.128 .033 1.000 Familiarity with -.130 .054 -2.395 .017 1.000 technology or video conferencing software Access to internet -.130 .054 -2.395 .017 1.000 services Non responsiveness .145 .054 2.661 .008 1.000 Obtaining evaluation of -.145 .054 -2.661 .008 1.000 supervision activities Professionalism .145 .054 2.661 .008 1.000 Absenteeism .159 .054 2.927 .003 1.000 Other -.174 .054 -3.193 .001 .423 179 Table 3.8 (cont’d) Sample 1 – Sample 2 Test Std. 
Error Std. Test Sig. Adj. Statistic Statistic Sig.a Time Constraints continued Access to video -.174 .054 -3.193 .001 .423 conferencing software that is compliant with HIPAA Prefer not to answer -.174 .054 -3.193 .001 .423 Access to computer, -.174 .054 -3.193 .001 .423 tablet, or smart phone with a webcam or external webcam Access to encrypted -.174 .054 -3.193 .001 .423 computer, tablet, or smart phone Access to computer, -.188 .054 -3.459 .001 .163 tablet, or smart phone with a microphone or an external microphone Providing Reinforcement in a Timely Manner Scheduling conflicts .000 .054 .000 1.000 1.000 I have not experienced -.043 .054 -.798 .425 1.000 supervisor barriers Access to therapy or -.058 .054 -1.064 .287 1.000 assessment materials Access to resources -.072 .054 -1.330 .183 1.000 and/or examples of providing supervision via Telehealth Difficulties with remote -.072 .054 -1.330 .183 1.000 record keeping and paperwork 180 Table 3.8 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. 
Statistic Statistic Sig.a Providing Reinforcement in a Timely Manner continued Supervisory volume -.101 .054 -1.862 .063 1.000 Familiarity with -.116 .054 -2.128 .033 1.000 technology or video conferencing software Access to internet -.116 .054 -2.128 .033 1.000 services Non responsiveness .130 .054 2.395 .017 1.000 Obtaining evaluation of -.130 .054 -2.395 .017 1.000 supervision activities Professionalism .130 .054 2.395 .017 1.000 Absenteeism .145 .054 2.661 .008 1.000 Other -.159 .054 -2.927 .003 1.000 Access to video -.159 .054 -2.927 .003 1.000 conferencing software that is compliant with HIPAA Prefer not to answer -.159 .054 -2.927 .003 1.000 Access to computer, -.159 .054 -2.927 .003 1.000 tablet, or smart phone with a webcam or external webcam Access to encrypted -.159 .054 -2.927 .003 1.000 computer, tablet, or smart phone Access to computer, -.174 .054 -3.193 .001 .423 tablet, or smart phone with a microphone or an external microphone 181 Table 3.8 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. 
Statistic Statistic Sig.a Scheduling Conflicts I have not experienced -.043 .054 -.798 .425 1.000 supervisor barriers Access to therapy or .058 .054 1.064 .287 1.000 assessment materials Access to resources -.072 .054 -1.330 .183 1.000 and/or examples of providing supervision via Telehealth Difficulties with remote .072 .054 1.330 .183 1.000 record keeping and paperwork Supervisory volume -.101 .054 -1.862 .063 1.000 Familiarity with -.116 .054 -2.128 .033 1.000 technology or video conferencing software Access to internet -.116 .054 -2.128 .033 1.000 services Non responsiveness .130 .054 2.395 .017 1.000 Obtaining evaluation of -.130 .054 -2.395 .017 1.000 supervision activities 17 Professionalism .130 .054 2.395 .017 1.000 Absenteeism .145 .054 2.661 .008 1.000 Other -.159 .054 -2.927 .003 1.000 Access to video -.159 .054 -2.927 .003 1.000 conferencing software that is compliant with HIPAA Prefer not to answer -.159 .054 -2.927 .003 1.000 182 Table 3.8 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. Sig.a Statistic Statistic Scheduling Conflicts continued Access to computer, -.159 .054 -2.927 .003 1.000 tablet, or smart phone with a webcam or external webcam Access to encrypted -.159 .054 -2.927 .003 1.000 computer, tablet, or smart phone Access to computer, -.174 .054 -3.193 .001 .423 tablet, or smart phone with a microphone or an external microphone I Have Not Experienced Supervisor Barriers Access to therapy or .014 .054 .266 .790 1.000 assessment materials Access to resources .029 .054 .532 .595 1.000 and/or examples of providing supervision via Telehealth Difficulties with remote .029 .054 .532 .595 1.000 record keeping and paperwork Supervisory volume .058 .054 1.064 .287 1.000 Familiarity with .072 .054 1.330 .183 1.000 technology or video conferencing software Access to internet .072 .054 1.330 .183 1.000 services Non responsiveness .087 .054 1.596 .110 1.000 183 Table 3.8 (cont’d) Sample 1 – Sample 2 Test Std. Error Std. Test Sig. Adj. 
Sig.a Statistic Statistic I Have Not Experienced Supervisor Barriers continued Obtaining evaluation of .087 .054 1.596 .110 1.000 supervision activities Professionalism .087 .054 1.596 .110 1.000 Absenteeism .101 .054 1.862 .063 1.000 Other .116 .054 2.128 .033 1.000 Access to video .116 .054 2.128 .033 1.000 conferencing software that is compliant with HIPAA Prefer not to answer .116 .054 2.128 .033 1.000 Access to computer, .116 .054 2.128 .033 1.000 tablet, or smart phone with a webcam or external webcam Access to encrypted .116 .054 2.128 .033 1.000 computer, tablet, or smart phone Access to computer, .130 .054 2.395 .017 1.000 tablet, or smart phone with a microphone or an external microphone Access to Therapy or Assessment Materials Access to resources -.014 .054 -.266 .790 1.000 and/or examples of providing supervision via Telehealth 184 Table 3.8 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. Sig.a Statistic Statistic Access to Therapy or Assessment Materials continued Difficulties with remote .014 .054 .266 .790 1.000 record keeping and paperwork Supervisory volume -.043 .054 -.798 .425 1.000 Familiarity with -.058 .054 -1.064 .287 1.000 technology or video conferencing software Access to internet -.058 .054 -1.064 .287 1.000 services Non responsiveness .072 .054 1.330 .183 1.000 Obtaining evaluation of -.072 .054 -1.330 .183 1.000 supervision activities Professionalism .072 .054 1.330 .183 1.000 Absenteeism .087 .054 1.596 .110 1.000 Other -.101 .054 -1.862 .063 1.000 Access to video -.101 .054 -1.862 .063 1.000 conferencing software that is compliant with HIPAA Prefer not to answer -.101 .054 -1.862 .063 1.000 Access to computer, -.101 .054 -1.862 .063 1.000 tablet, or smart phone with a webcam or external webcam Access to encrypted -.101 .054 -1.862 .063 1.000 computer, tablet, or smart phone 185 Table 3.8 (cont’d) Sample 1 - Sample 2 Test Std. Error Std. Test Sig. Adj. 
Table 3.8 (cont'd). Values in each row: test statistic, std. error, std. test statistic, sig., adj. sig.a

Access to Therapy or Assessment Materials (cont'd)
  Access to computer, tablet, or smart phone with a microphone or an external microphone: -.116, .054, -2.128, .033, 1.000

Difficulties with Remote Record Keeping and Paperwork
  Access to resources and/or examples of providing supervision via Telehealth: .000, .054, .000, 1.000, 1.000
  Supervisory volume: -.029, .054, -.532, .595, 1.000
  Familiarity with technology or video conferencing software: -.043, .054, -.798, .425, 1.000
  Access to internet services: -.043, .054, -.798, .425, 1.000
  Non responsiveness: .058, .054, 1.064, .287, 1.000
  Obtaining evaluation of supervision activities: -.058, .054, -1.064, .287, 1.000
  Professionalism: .058, .054, 1.064, .287, 1.000
  Absenteeism: .072, .054, 1.330, .183, 1.000
  Other: -.087, .054, -1.596, .110, 1.000
  Access to video conferencing software that is compliant with HIPAA: -.087, .054, -1.596, .110, 1.000
  Prefer not to answer: -.087, .054, -1.596, .110, 1.000
  Access to computer, tablet, or smart phone with a webcam or external webcam: -.087, .054, -1.596, .110, 1.000
  Access to encrypted computer, tablet, or smart phone: -.087, .054, -1.596, .110, 1.000
  Access to computer, tablet, or smart phone with a microphone or an external microphone: -.101, .054, -1.862, .063, 1.000

Access to Resources and/or Examples of Providing Supervision Via Telehealth
  Supervisory volume: -.029, .054, -.532, .595, 1.000
  Familiarity with technology or video conferencing software: -.043, .054, -.798, .425, 1.000
  Access to internet services: -.043, .054, -.798, .425, 1.000
  Non responsiveness: .058, .054, 1.064, .287, 1.000
  Obtaining evaluation of supervision activities: -.058, .054, -1.064, .287, 1.000
  Professionalism: .058, .054, 1.064, .287, 1.000
  Absenteeism: .072, .054, 1.330, .183, 1.000
  Other: -.087, .054, -1.596, .110, 1.000
  Access to video conferencing software that is compliant with HIPAA: -.087, .054, -1.596, .110, 1.000
  Prefer not to answer: -.087, .054, -1.596, .110, 1.000
  Access to computer, tablet, or smart phone with a webcam or external webcam: -.087, .054, -1.596, .110, 1.000
  Access to encrypted computer, tablet, or smart phone: -.087, .054, -1.596, .110, 1.000
  Access to computer, tablet, or smart phone with a microphone or an external microphone: -.101, .054, -1.862, .063, 1.000

Supervisory Volume
  Familiarity with technology or video conferencing software: -.014, .054, -.266, .790, 1.000
  Access to internet services: -.014, .054, -.266, .790, 1.000
  Non responsiveness: .029, .054, .532, .595, 1.000
  Obtaining evaluation of supervision activities: -.029, .054, -.532, .595, 1.000
  Professionalism: .029, .054, .532, .595, 1.000
  Absenteeism: .043, .054, .798, .425, 1.000
  Other: -.058, .054, -1.064, .287, 1.000
  Access to video conferencing software that is compliant with HIPAA: -.058, .054, -1.064, .287, 1.000
  Prefer not to answer: -.058, .054, -1.064, .287, 1.000
  Access to computer, tablet, or smart phone with a webcam or external webcam: -.058, .054, -1.064, .287, 1.000
  Access to encrypted computer, tablet, or smart phone: -.058, .054, -1.064, .287, 1.000
  Access to computer, tablet, or smart phone with a microphone or an external microphone: -.072, .054, -1.330, .183, 1.000

Familiarity with Technology or Video Conferencing Software
  Access to internet services: .000, .054, .000, 1.000, 1.000
  Non responsiveness: .014, .054, .266, .790, 1.000
  Obtaining evaluation of supervision activities: .014, .054, .266, .790, 1.000
  Professionalism: .014, .054, .266, .790, 1.000
  Absenteeism: .029, .054, .532, .595, 1.000
  Other: -.043, .054, -.798, .425, 1.000
  Access to video conferencing software that is compliant with HIPAA: -.043, .054, -.798, .425, 1.000
  Prefer not to answer: -.043, .054, -.798, .425, 1.000
  Access to computer, tablet, or smart phone with a webcam or external webcam: -.043, .054, -.798, .425, 1.000
  Access to encrypted computer, tablet, or smart phone: -.043, .054, -.798, .425, 1.000
  Access to computer, tablet, or smart phone with a microphone or an external microphone: -.058, .054, -1.064, .287, 1.000

Access to Internet Services
  Non responsiveness: .014, .054, .266, .790, 1.000
  Obtaining evaluation of supervision activities: .014, .054, .266, .790, 1.000
  Professionalism: .014, .054, .266, .790, 1.000
  Absenteeism: .029, .054, .532, .595, 1.000
  Other: -.043, .054, -.798, .425, 1.000
  Access to video conferencing software that is compliant with HIPAA: -.043, .054, -.798, .425, 1.000
  Prefer not to answer: -.043, .054, -.798, .425, 1.000
  Access to computer, tablet, or smart phone with a webcam or external webcam: -.043, .054, -.798, .425, 1.000
  Access to encrypted computer, tablet, or smart phone: -.043, .054, -.798, .425, 1.000
  Access to computer, tablet, or smart phone with a microphone or an external microphone: -.058, .054, -1.064, .287, 1.000

Professionalism
  Non responsiveness: .000, .054, .000, 1.000, 1.000
  Obtaining evaluation of supervision activities: .000, .054, .000, 1.000, 1.000
  Absenteeism: .014, .054, .266, .790, 1.000
  Other: -.029, .054, -.532, .595, 1.000
  Access to video conferencing software that is compliant with HIPAA: -.029, .054, -.532, .595, 1.000
  Prefer not to answer: -.029, .054, -.532, .595, 1.000
  Access to computer, tablet, or smart phone with a webcam or external webcam: -.029, .054, -.532, .595, 1.000
  Access to encrypted computer, tablet, or smart phone: -.029, .054, -.532, .595, 1.000
  Access to computer, tablet, or smart phone with a microphone or an external microphone: -.043, .054, -.798, .425, 1.000

Non Responsiveness
  Obtaining evaluation of supervision activities: .000, .054, .000, 1.000, 1.000
  Absenteeism: .014, .054, .266, .790, 1.000
  Other: -.029, .054, -.532, .595, 1.000
  Access to video conferencing software that is compliant with HIPAA: -.029, .054, -.532, .595, 1.000
  Prefer not to answer: -.029, .054, -.532, .595, 1.000
  Access to computer, tablet, or smart phone with a webcam or external webcam: -.029, .054, -.532, .595, 1.000
  Access to encrypted computer, tablet, or smart phone: -.029, .054, -.532, .595, 1.000
  Access to computer, tablet, or smart phone with a microphone or an external microphone: -.043, .054, -.798, .425, 1.000

Obtaining Evaluation of Supervision Activities
  Absenteeism: .014, .054, .266, .790, 1.000
  Other: -.029, .054, -.532, .595, 1.000
  Access to video conferencing software that is compliant with HIPAA: -.029, .054, -.532, .595, 1.000
  Prefer not to answer: -.029, .054, -.532, .595, 1.000
  Access to computer, tablet, or smart phone with a webcam or external webcam: -.029, .054, -.532, .595, 1.000
  Access to encrypted computer, tablet, or smart phone: -.029, .054, -.532, .595, 1.000
  Access to computer, tablet, or smart phone with a microphone or an external microphone: -.043, .054, -.798, .425, 1.000

Absenteeism
  Other: -.014, .054, -.266, .790, 1.000
  Access to video conferencing software that is compliant with HIPAA: -.014, .054, -.266, .790, 1.000
  Prefer not to answer: -.014, .054, -.266, .790, 1.000
  Access to computer, tablet, or smart phone with a webcam or external webcam: -.014, .054, -.266, .790, 1.000
  Access to encrypted computer, tablet, or smart phone: -.014, .054, -.266, .790, 1.000
  Access to computer, tablet, or smart phone with a microphone or an external microphone: -.029, .054, -.532, .595, 1.000

Access to Encrypted Computer, Tablet, or Smart Phone
  Other: .000, .054, .000, 1.000, 1.000
  Access to video conferencing software that is compliant with HIPAA: .000, .054, .000, 1.000, 1.000
  Prefer not to answer: .000, .054, .000, 1.000, 1.000
  Access to computer, tablet, or smart phone with a webcam or external webcam: .000, .054, .000, 1.000, 1.000
  Access to computer, tablet, or smart phone with a microphone or an external microphone: -.014, .054, -.266, .790, 1.000

Access to Computer, Tablet, or Smart Phone with a Webcam or External Webcam
  Other: .000, .054, .000, 1.000, 1.000
  Access to video conferencing software that is compliant with HIPAA: .000, .054, .000, 1.000, 1.000
  Prefer not to answer: .000, .054, .000, 1.000, 1.000
  Access to computer, tablet, or smart phone with a microphone or an external microphone: -.014, .054, -.266, .790, 1.000

Access to Video Conferencing Software that is Compliant with HIPAA
  Other: .000, .054, .000, 1.000, 1.000
  Prefer not to answer: .000, .054, .000, 1.000, 1.000
  Access to computer, tablet, or smart phone with a microphone or an external microphone: .014, .054, .266, .790, 1.000

Other
  Prefer not to answer: .000, .054, .000, 1.000, 1.000
  Access to computer, tablet, or smart phone with a microphone or an external microphone: .014, .054, .266, .790, 1.000

Prefer Not to Answer
  Access to computer, tablet, or smart phone with a microphone or an external microphone: .014, .054, .266, .790, 1.000

Note. Each row tests the null hypothesis that the Sample 1 and Sample 2 distributions are the same. Asymptotic significances (2-sided tests) are displayed. The significance level is .05.
a. Significance values have been adjusted by the Bonferroni correction for multiple tests.
* Indicates a pairwise post-hoc Dunn test with Bonferroni adjustments was significant.

Table 3.9. Strategies Used to Address and/or Mitigate Supervisee Barriers. Each entry reports the strategy used, followed by n (%).

Absenteeism (n = 9)
  Implemented a self-management strategy for supervisee: 2 (22.2)
  Increased monitoring of supervisee: 3 (33.3)
  Observed supervisee with clients: 1 (11.1)
  Provided immediate feedback to supervisee: 4 (44.4)
  Role-played with supervisee: 2 (22.2)
  Clarified expectations for supervisee: 4 (44.4)
  Set clear expectations for supervisee: 5 (55.6)
  Taught professionalism skills to supervisee: 2 (22.2)
  Supervisee attended a training on technology and/or video conferencing software: 2 (22.2)
  Supervisee removed distractions during supervision meetings: 1 (11.1)
  Supervisee completed an evaluation on supervision activities: 1 (11.1)
  Supervisee gained access to therapy or assessment materials: 2 (22.2)
  Other: 1 (11.1)
  Prefer not to answer: 1 (11.1)

Non Responsiveness (n = 9)
  Implemented a self-management strategy for supervisee: 3 (33.3)
  Increased monitoring of supervisee: 4 (44.4)
  Observed supervisee with clients: 1 (11.1)
  Provided immediate feedback to supervisee: 4 (44.4)
  Clarified expectations for supervisee: 5 (55.6)
  Set clear expectations for supervisee: 5 (55.6)
  Taught professionalism skills to supervisee: 1 (11.1)
  Supervisee was terminated: 1 (11.1)
  Supervisee gained access to computer, tablet, or smartphone with a webcam and/or microphone: 1 (11.1)

Professionalism (n = 7)
  Implemented a self-management strategy for supervisee: 2 (28.6)
  Increased monitoring of supervisee: 4 (57.1)
  Observed supervisee with clients: 3 (42.9)
  Utilized prompt-fading with supervisee: 1 (14.3)
  Provided immediate feedback to supervisee: 6 (85.7)
  Role-played with supervisee: 1 (14.3)
  Clarified expectations for supervisee: 3 (42.9)
  Set clear expectations for supervisee: 7 (100.0)
  Taught professionalism skills to supervisee: 2 (28.6)
  Supervisee was terminated: 1 (14.3)
  Supervisee gained access to internet services: 1 (14.3)
  Supervisee gained access to computer, tablet, or smartphone with a webcam and/or microphone: 1 (14.3)
  Supervisor provided a training on technology and/or videoconferencing software to the supervisee: 1 (14.3)
  Supervisee removed distractions during supervision meetings: 2 (28.6)
  Supervisee completed an evaluation on supervision activities: 1 (14.3)
  Supervisee gained access to therapy or assessment materials: 1 (14.3)

Organizational Skills (n = 16)
  Implemented a self-management strategy for supervisee: 10 (62.5)
  Increased monitoring of supervisee: 8 (50.0)
  Observed supervisee with clients: 8 (50.0)
  Utilized prompt-fading with supervisee: 3 (18.8)
  Provided immediate feedback to supervisee: 7 (43.8)
  Role-played with supervisee: 3 (18.8)
  Clarified expectations for supervisee: 12 (75.0)
  Set clear expectations for supervisee: 10 (62.5)
  Taught professionalism skills to supervisee: 3 (18.8)
  Supervisee gained access to computer, tablet, or smartphone with a webcam and/or microphone: 2 (12.5)
  Supervisee gained access to an external webcam and/or external microphone: 1 (6.3)
  Supervisee removed distractions during supervision meetings: 2 (12.5)

Implementation of Feedback (n = 16)
  Implemented a self-management strategy for supervisee: 4 (25.0)
  Increased monitoring of supervisee: 9 (56.3)
  Observed supervisee with clients: 12 (75.0)
  Utilized prompt-fading with supervisee: 5 (31.3)
  Provided immediate feedback to supervisee: 13 (81.3)
  Role-played with supervisee: 11 (68.8)
  Clarified expectations for supervisee: 14 (87.5)
  Set clear expectations for supervisee: 9 (56.3)
  Taught professionalism skills to supervisee: 1 (6.3)
  Supervisee gained access to internet services: 1 (6.3)
  Supervisee removed distractions during supervision meetings: 1 (6.3)
  Supervisee completed an evaluation on the supervisor: 1 (6.3)
  Supervisee completed an evaluation on supervision activities: 1 (6.3)

Distractions During Supervision Meeting (n = 25)
  Implemented a self-management strategy for supervisee: 6 (24.0)
  Increased monitoring of supervisee: 1 (4.0)
  Observed supervisee with clients: 5 (20.0)
  Utilized prompt-fading with supervisee: 1 (4.0)
  Provided immediate feedback to supervisee: 9 (36.0)
  Role-played with supervisee: 8 (32.0)
  Clarified expectations for supervisee: 11 (44.0)
  Set clear expectations for supervisee: 11 (44.0)
  Taught professionalism skills to supervisee: 10 (40.0)
  Supervisee was reassigned to another supervisor: 1 (4.0)
  Supervisee gained access to internet services: 2 (8.0)
  Supervisee gained access to computer, tablet, or smartphone with a webcam and/or microphone: 1 (4.0)
  Supervisee gained access to an external webcam and/or external microphone: 1 (4.0)
  Supervisee attended a training on technology and/or video conferencing software: 1 (4.0)
  Supervisee attended a training on Telehealth: 3 (12.0)
  Supervisee removed distractions during supervision meetings: 12 (48.0)
  Supervisee completed an evaluation on the supervisor: 1 (4.0)
  Other: 4 (16.0)
  I have not used strategies to address and/or mitigate barriers that arose: 3 (12.0)

Access to Therapy or Assessment Materials (n = 15)
  Implemented a self-management strategy for supervisee: 1 (6.7)
  Increased monitoring of supervisee: 1 (6.7)
  Observed supervisee with clients: 1 (6.7)
  Utilized prompt-fading with supervisee: 2 (13.3)
  Provided immediate feedback to supervisee: 3 (20.0)
  Clarified expectations for supervisee: 2 (13.3)
  Set clear expectations for supervisee: 2 (13.3)
  Supervisee completed an evaluation on the supervisor: 1 (6.7)
  Supervisee completed an evaluation on supervision activities: 1 (6.7)
  Supervisee gained access to therapy or assessment materials: 8 (53.3)
  Other: 3 (20.0)
  I have not used strategies to address and/or mitigate barriers that arose: 1 (6.7)

Scheduling Conflicts (n = 20)
  Implemented a self-management strategy for supervisee: 4 (20.0)
  Increased monitoring of supervisee: 2 (10.0)
  Observed supervisee with clients: 5 (25.0)
  Utilized prompt-fading with supervisee: 1 (5.0)
  Provided immediate feedback to supervisee: 7 (35.0)
  Role-played with supervisee: 1 (5.0)
  Clarified expectations for supervisee: 9 (45.0)
  Set clear expectations for supervisee: 8 (40.0)
  Taught professionalism skills to supervisee: 3 (15.0)
  Supervisee was reassigned to another supervisor: 1 (5.0)
  Supervisee was terminated: 1 (5.0)
  Supervisee gained access to computer, tablet, or smartphone with a webcam and/or microphone: 1 (5.0)
  Supervisee attended a training on technology and/or video conferencing software: 1 (5.0)
  Supervisee attended a training on Telehealth: 1 (5.0)
  Supervisee removed distractions during supervision meetings: 1 (5.0)
  Supervisee completed an evaluation on supervision activities: 1 (5.0)
  Supervisee gained access to therapy or assessment materials: 2 (10.0)
  Other: 3 (15.0)
  Prefer not to answer: 3 (15.0)
  I have not used strategies to address and/or mitigate barriers that arose: 3 (15.0)

Access to Clients (n = 18)
  Implemented a self-management strategy for supervisee: 1 (5.6)
  Increased monitoring of supervisee: 2 (11.1)
  Observed supervisee with clients: 8 (44.4)
  Utilized prompt-fading with supervisee: 1 (5.6)
  Provided immediate feedback to supervisee: 2 (11.1)
  Role-played with supervisee: 6 (33.3)
  Clarified expectations for supervisee: 3 (16.7)
  Set clear expectations for supervisee: 1 (5.6)
  Taught professionalism skills to supervisee: 1 (5.6)
  Supervisee gained access to internet services: 1 (5.6)
  Supervisee gained access to computer, tablet, or smartphone with a webcam and/or microphone: 1 (5.6)
  Supervisee gained access to an external webcam and/or external microphone: 1 (5.6)
  Supervisor provided a training on technology and/or videoconferencing software to the supervisee: 1 (5.6)
  Supervisee attended a training on technology and/or video conferencing software: 1 (5.6)
  Supervisee attended a training on Telehealth: 2 (11.1)
  Supervisee gained access to therapy or assessment materials: 1 (5.6)
  Other: 2 (11.1)
  I have not used strategies to address and/or mitigate barriers that arose: 1 (5.6)

Familiarity with Technology or Video Conferencing Software (n = 16)
  Increased monitoring of supervisee: 1 (6.3)
  Provided immediate feedback to supervisee: 2 (12.5)
  Role-played with supervisee: 1 (6.3)
  Clarified expectations for supervisee: 3 (18.8)
  Taught professionalism skills to supervisee: 1 (6.3)
  Supervisee gained access to internet services: 3 (18.8)
  Supervisee gained access to computer, tablet, or smartphone with a webcam and/or microphone: 4 (25.0)
  Supervisee gained access to an external webcam and/or external microphone: 1 (6.3)
  Supervisor provided a training on technology and/or videoconferencing software to the supervisee: 10 (62.5)
  Supervisee attended a training on technology and/or video conferencing software: 4 (25.0)
  Supervisee attended a training on Telehealth: 3 (18.8)
  Other: 2 (12.5)
  I have not used strategies to address and/or mitigate barriers that arose: 1 (6.3)

Access to Internet Services (n = 15)
  Implemented a self-management strategy for supervisee: 1 (6.7)
  Utilized prompt-fading with supervisee: 1 (6.7)
  Supervisee gained access to internet services: 6 (40.0)
  Supervisee gained access to computer, tablet, or smartphone with a webcam and/or microphone: 1 (6.7)
  Supervisee gained access to an external webcam and/or external microphone: 1 (6.7)
  Supervisor provided a training on technology and/or videoconferencing software to the supervisee: 1 (6.7)
  Supervisee attended a training on technology and/or video conferencing software: 1 (6.7)
  Supervisee attended a training on Telehealth: 1 (6.7)
  Supervisee completed an evaluation on supervision activities: 1 (6.7)
  Other: 4 (26.7)
  Prefer not to answer: 1 (6.7)
  I have not used strategies to address and/or mitigate barriers that arose: 4 (26.7)

Access to Encrypted Computer, Tablet, or Smart Phone (n = 4)
  Increased monitoring of supervisee: 1 (25.0)
  Supervisee gained access to internet services: 1 (25.0)
  Supervisee gained access to computer, tablet, or smartphone with a webcam and/or microphone: 2 (50.0)
  Supervisee gained access to an external webcam and/or external microphone: 1 (25.0)
  Supervisor provided a training on technology and/or videoconferencing software to the supervisee: 3 (75.0)
  Supervisee attended a training on Telehealth: 1 (25.0)
  Supervisee completed an evaluation on the supervisor: 1 (25.0)

Access to Encrypted Computer, Tablet, or Smart Phone with a Webcam or an External Webcam (n = 5)
  Clarified expectations for supervisee: 1 (20.0)
  Supervisee gained access to internet services: 2 (40.0)
  Supervisee gained access to computer, tablet, or smartphone with a webcam and/or microphone: 5 (100.0)
  Supervisee gained access to an external webcam and/or external microphone: 2 (40.0)
  Supervisor provided a training on technology and/or video conferencing software to the supervisee: 1 (20.0)
  Supervisee attended a training on technology and/or video conferencing software: 1 (20.0)

Access to Encrypted Computer, Tablet, or Smart Phone with a Microphone or an External Microphone (n = 3)
  Supervisee gained access to computer, tablet, or smartphone with a webcam and/or microphone: 2 (66.7)
  Supervisee gained access to an external webcam and/or external microphone: 1 (33.3)
  Supervisor provided a training on technology and/or video conferencing software to the supervisee: 1 (33.3)
  Other: 1 (33.3)

Access to Video Conferencing Software that is Compliant with HIPAA (n = 4)
  Taught professionalism skills to supervisee: 1 (25.0)
  Supervisee gained access to internet services: 2 (50.0)
  Supervisor provided a training on technology and/or video conferencing software to the supervisee: 2 (50.0)
  Other: 1 (25.0)

Internet Connectivity Issues (n = 46)
  Implemented a self-management strategy for supervisee: 1 (2.2)
  Observed supervisee with clients: 3 (6.5)
  Provided immediate feedback to supervisee: 3 (6.5)
  Set clear expectations for supervisee: 1 (2.2)
  Supervisee was reassigned to another supervisor: 1 (2.2)
  Supervisee gained access to internet services: 18 (39.1)
  Supervisee gained access to computer, tablet, or smartphone with a webcam and/or microphone: 5 (10.9)
  Supervisee gained access to an external webcam and/or external microphone: 1 (2.2)
  Supervisor provided a training on technology and/or videoconferencing software to the supervisee: 4 (8.7)
  Supervisee attended a training on technology and/or videoconferencing software: 1 (2.2)
  Other: 4 (8.7)
  Prefer not to answer: 3 (6.5)
  I have not used strategies to address and/or mitigate barriers that arose: 14 (30.4)

Other (n = 1)
  Prefer not to answer: 1 (100.0)

Note. For strategies used to address and/or mitigate each barrier, we only listed the strategies participants indicated that they used. If a strategy was not listed, zero participants indicated that strategy was used. For the % column, the sum of percentages under each barrier does not equal 100% because they are based on questions where the participants could select multiple answers.

Table 3.10. Most Frequent Strategy Used to Address and/or Mitigate Each Supervisee Barrier

  Absenteeism: Set clear expectations for supervisee
  Non responsiveness: Clarified expectations for supervisee; Set clear expectations for supervisee
  Professionalism: Set clear expectations for supervisee
  Organizational skills: Clarified expectations for supervisee
  Implementation of feedback: Clarified expectations for supervisee
  Distractions during supervision meeting: Supervisee removed distractions during supervision meetings
  Access to therapy or assessment materials: Supervisee gained access to therapy or assessment materials
  Scheduling conflicts: Clarified expectations for supervisee
  Access to clients: Observed supervisee with clients
  Familiarity with technology or video conferencing software: Supervisor provided a training on technology and/or videoconferencing software to the supervisee
  Access to internet services: Supervisee gained access to internet services
  Access to encrypted computer, tablet, or smart phone: Supervisor provided a training on technology and/or videoconferencing software to the supervisee
  Access to encrypted computer, tablet, or smart phone with a webcam or an external webcam: Supervisee gained access to computer, tablet, or smartphone with a webcam and/or microphone
  Access to encrypted computer, tablet, or smart phone with a microphone or an external microphone: Supervisee gained access to computer, tablet, or smartphone with a webcam and/or microphone
  Access to video conferencing software that is compliant with HIPAA: Supervisee gained access to internet services; Supervisor provided a training on technology and/or video conferencing software to the supervisee
  Internet connectivity issues: Supervisee gained access to internet services

Table 3.11. Strategies Used to Address and/or Mitigate Supervisor Barriers. Each entry reports the strategy used, followed by n (%).

Absenteeism (n = 2)
  Developed and implemented a self-management strategy for myself: 1 (50.0)
  Gained access to computer, tablet, or smartphone with a webcam and/or microphone: 1 (50.0)
  Attended a training on technology and/or videoconferencing software: 1 (50.0)
  Attended a training on Telehealth: 1 (50.0)
  Gained access to therapy or assessment materials: 1 (50.0)
  Other: 1 (50.0)

Non Responsiveness (n = 3)
  Developed and implemented a self-management strategy for myself: 1 (33.3)
  Clarified expectations for myself: 2 (66.7)
  Set clear expectations for myself: 2 (66.7)
  Removed distractions during supervision meetings: 2 (66.7)
  Contacted a friend in the field of behavior analysis: 1 (33.3)
  Read journal articles on the issue: 1 (33.3)

Professionalism (n = 3)
  Developed and implemented a self-management strategy for myself: 1 (33.3)
  Clarified expectations for myself: 1 (33.3)
  Set clear expectations for myself: 1 (33.3)
  Attended a training on Telehealth: 2 (66.7)
  Attended a training on how to provide supervision via Telehealth: 1 (33.3)
  Completed an evaluation on the supervisee: 1 (33.3)
  Completed an evaluation on supervision activities: 1 (33.3)
  Contacted a friend in the field of behavior analysis: 1 (33.3)

Distractions During Supervision Meeting (n = 20)
  Developed and implemented a self-management strategy for myself: 7 (35.0)
  Clarified expectations for myself: 2 (10.0)
  Set clear expectations for myself: 2 (10.0)
  Attended a training on technology and/or videoconferencing software: 1 (5.0)
  Attended a training on Telehealth: 1 (5.0)
  Attended a training on how to provide supervision via Telehealth: 1 (5.0)
  Removed distractions during supervision meetings: 10 (50.0)
  Other: 7 (35.0)
  I have not used strategies to address and/or mitigate supervisor barriers that arose: 1 (5.0)

Providing Feedback in a Timely Manner (n = 14)
  Developed and implemented a self-management strategy for myself: 4 (28.6)
  Clarified expectations for myself: 4 (28.6)
  Set clear expectations for myself: 6 (42.9)
  Gained access to computer, tablet, or smartphone with a webcam and/or microphone: 1 (7.1)
  Completed an evaluation on the supervisee: 3 (21.4)
  Read journal articles on the issue: 3 (21.4)
  Other: 3 (21.4)
  I have not used strategies to address and/or mitigate supervisor barriers that arose: 1 (7.1)

Providing Reinforcement in a Timely Manner (n = 12)
  Developed and implemented a self-management strategy for myself: 4 (33.3)
  Clarified expectations for myself: 2 (16.7)
  Set clear expectations for myself: 6 (50.0)
  Attended a training on technology and/or videoconferencing software: 1 (8.3)
  Attended a training on Telehealth: 3 (25.0)
  Attended a training on how to provide supervision via Telehealth: 1 (8.3)
  Completed an evaluation on the supervisee: 1 (8.3)
  Completed an evaluation on supervision activities: 5 (41.7)
  Gained access to therapy or assessment materials: 1 (8.3)
  Contacted a friend in the field of behavior analysis: 1 (8.3)
  Read journal articles on the issue: 3 (25.0)
  Other: 2 (16.7)

Difficulties with Remote Record Keeping and Paperwork (n = 7)
  Developed and implemented a self-management strategy for myself: 3 (42.9)
  Clarified expectations for myself: 1 (14.3)
  Set clear expectations for myself: 2 (28.6)
  Completed an evaluation on the supervisee: 1 (14.3)
  Completed an evaluation on supervision activities: 1 (14.3)
  Gained access to therapy or assessment materials: 2 (28.6)
  Contacted a friend in the field of behavior analysis: 2 (28.6)
  Read journal articles on the issue: 1 (14.3)

Access to Therapy or Assessment Materials (n = 8)
  Developed and implemented a self-management strategy for myself: 1 (12.5)
  Set clear expectations for myself: 1 (12.5)
  Attended a training on Telehealth: 1 (12.5)
  Attended a training on how to provide supervision via Telehealth: 1 (12.5)
  Gained access to therapy or assessment materials: 4 (50.0)
  Contacted a friend in the field of behavior analysis: 1 (12.5)
  Read journal articles on the issue: 2 (25.0)
  Other: 2 (25.0)
  I have not used strategies to address and/or mitigate supervisor barriers that arose: 1 (12.5)

Obstruction of View or Supervisee Out of Lens View (n = 28)
  Gained access to computer, tablet, or smartphone with a webcam and/or microphone: 1 (3.6)
  Gained access to an external webcam and/or an external microphone: 2 (7.1)
  Attended a training on technology and/or videoconferencing software: 2 (7.1)
  Attended a training on Telehealth: 1 (3.6)
  Attended a training on how to provide supervision via Telehealth: 3 (10.7)
  Removed distractions during supervision meetings: 1 (3.6)
  Completed an evaluation on the supervisee: 1 (3.6)
  Completed an evaluation on supervision activities: 1 (3.6)
  Other: 1 (3.6)
  I have not used strategies to address and/or mitigate supervisor barriers that arose: 1 (3.6)

Ability to Model or Demonstrate Strategies (n = 29)
  Developed and implemented a self-management strategy for myself: 4 (13.8)
  Clarified expectations for myself: 5 (17.2)
  Set clear expectations for myself: 6 (20.7)
  Attended a training on Telehealth: 1 (3.4)
  Attended a training on how to provide supervision via Telehealth: 1 (3.4)
  Removed distractions during supervision meetings: 1 (3.4)
  Completed an evaluation on the supervisee: 3 (10.3)
  Completed an evaluation on supervision activities: 1 (3.4)
  Gained access to therapy or assessment materials: 5 (17.2)
  Contacted a friend in the field of behavior analysis: 4 (13.8)
  Read journal articles on the issue: 2 (6.9)
  Other: 9 (31.0)
  Prefer not to answer: 1 (3.4)
  I have not used strategies to address and/or mitigate supervisor barriers that arose: 3 (10.3)

Scheduling Conflicts (n = 12)
  Developed and implemented a self-management strategy for myself: 6 (50.0)
  Clarified expectations for myself: 5 (41.7)
  Set clear expectations for myself: 5 (41.7)
  Contacted a friend in the field of behavior analysis: 1 (8.3)
  Other: 2 (16.7)

Time Constraints (n = 13)
  Developed and implemented a self-management strategy for myself: 6 (46.2)
  Clarified expectations for myself: 2 (15.4)
  Set clear expectations for myself: 4 (30.8)
  Attended a training on technology and/or videoconferencing software: 1 (7.7)
  Removed distractions during supervision meetings: 1 (7.7)
  Prefer not to answer: 1 (7.7)
  I have not used strategies to address and/or mitigate supervisor barriers that arose: 1 (7.7)

Access to Resources and/or Examples of Providing Supervision Via Telehealth (n = 7)
  Attended a training on technology and/or videoconferencing software: 2 (28.6)
  Attended a training on Telehealth: 1 (14.3)
  Attended a training on how to provide supervision via Telehealth: 1 (14.3)
  Gained access to therapy or assessment materials: 3 (42.9)
  Other: 1 (14.3)
  Prefer not to answer: 1 (14.3)

Supervisory Volume (n = 5)
  Set clear expectations for myself: 1 (20.0)
  Other: 1 (20.0)
  I have not used strategies to address and/or mitigate supervisor barriers that arose: 3 (60.0)

Obtaining Evaluation of Supervision Activities (n = 3)
  Set clear expectations for myself: 1 (33.3)
  Other: 1 (33.3)
  Prefer not to answer: 1 (33.3)
  I have not used strategies to address and/or mitigate supervisor barriers that arose: 1 (33.3)

Familiarity with Technology or Video Conferencing Software (n = 4)
  Attended a training on technology and/or videoconferencing software: 2 (50.0)
  Contacted a friend in the field of behavior analysis: 1 (25.0)
  Read journal articles on the issue: 1 (25.0)
  I have not used strategies to address and/or mitigate supervisor barriers that arose: 1 (25.0)

Access to Internet Services (n = 4)
  Developed and implemented a self-management strategy for myself: 1 (25.0)
  Gained access to internet services: 1 (25.0)
  Gained access to an external webcam and/or an external microphone: 1 (25.0)
  Other: 1 (25.0)
  I have not used strategies to address and/or mitigate supervisor barriers that arose: 2 (50.0)

Access to Encrypted Computer, Tablet, or Smart Phone (n = 1)
  I have not used strategies to address and/or mitigate supervisor barriers that arose: 1 (100.0)

Access to Computer, Tablet, or Smart Phone with a Webcam or External Webcam (n = 1)
  I have not used strategies to address and/or mitigate supervisor barriers that arose: 1 (100.0)

Access to Video Conferencing Software that is Compliant with HIPAA (n = 1)
  Gained access to computer, tablet, or smartphone with a webcam and/or microphone: 1 (100.0)
  Gained access to internet services: 1 (100.0)

Internet Connectivity Issues (n = 22)
  Gained access to computer, tablet, or smartphone with a webcam and/or microphone: 3 (13.6)
  Gained access to internet services: 11 (50.0)
  Gained access to an external webcam and/or an external microphone: 1 (4.5)
  Other: 2 (9.1)
  I have not used strategies to address and/or mitigate supervisor barriers that arose: 6 (27.3)

Other (n = 1)
  Prefer not to answer: 1 (100.0)

Prefer Not to Answer (n = 1)
  Prefer not to answer: 1 (100.0)

Note. For strategies used to address and/or mitigate each barrier, we only listed the strategies participants indicated that they used. If a strategy was not listed, zero participants indicated that the strategy was used. For the % column, the sum of percentages under each barrier does not equal 100% because they are based on questions where the participants could select multiple answers.
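As a brief illustration of the adjustment reported in the "Adj. Sig." columns above (not the analysis code used for this study), the Bonferroni correction multiplies each unadjusted p-value by the number of pairwise tests and caps the result at 1.0. The raw p-values and the test count of 136 (17 response options taken two at a time) below are assumed for illustration only:

```python
def bonferroni_adjust(p_values, num_tests=None):
    """Return Bonferroni-adjusted p-values, capped at 1.0.

    If num_tests is not given, the number of supplied p-values is used
    as the size of the test family.
    """
    m = num_tests if num_tests is not None else len(p_values)
    return [min(1.0, p * m) for p in p_values]

# Hypothetical unadjusted significances in the range reported above;
# with a large family of pairwise tests, each adjusted value saturates
# at 1.000, matching the Adj. Sig. column.
raw = [0.033, 0.110, 0.595]
adjusted = bonferroni_adjust(raw, num_tests=136)
print(adjusted)  # [1.0, 1.0, 1.0]
```

This is why a comparison with an unadjusted significance of .033 can still show an adjusted significance of 1.000: the correction trades power for control of the family-wise error rate across the many pairwise comparisons.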
Table 3.12. Most Frequent Strategy Used to Address and/or Mitigate Each Supervisor Barrier

Barrier / Strategy Used Most Frequently

Absenteeism
  Developed and implemented a self-management strategy for myself
  Gained access to computer, tablet, or smartphone with a webcam and/or microphone
  Attended a training on technology and/or videoconferencing software
  Attended a training on Telehealth
  Gained access to therapy or assessment materials
  Other
Non responsiveness
  Clarified expectations for supervisee
  Set clear expectations for supervisee
  Removed distractions during supervision meetings
Professionalism
  Attended a training on Telehealth
Distractions during supervision meeting
  Removed distractions during supervision meetings
Providing feedback in a timely manner
  Set clear expectations for myself
Providing reinforcement in a timely manner
  Set clear expectations for myself
Difficulties with remote record keeping
  Developed and implemented a self-management strategy for myself
Access to therapy or assessment materials
  Gained access to therapy or assessment materials
Obstruction of view or supervisee out of lens view
  Attended a training on how to provide supervision via Telehealth
Ability to model or demonstrate strategies
  Other
Scheduling conflicts
  Developed and implemented a self-management strategy for myself
Time constraints
  Developed and implemented a self-management strategy for myself
Access to resources and/or examples of providing supervision via Telehealth
  Gained access to therapy or assessment materials
Supervisory volume
  I have not used strategies to address and/or mitigate supervisor barriers that arose
Obtaining evaluation of supervision activities
  Set clear expectations for myself
Other
  Prefer not to answer
  I have not used strategies to address and/or mitigate supervisor barriers that arose
Familiarity with technology or videoconferencing software
  Attended a training on technology and/or videoconferencing software
Access to internet services
  I have not used strategies to address and/or mitigate supervisor barriers that arose
Access to encrypted computer, tablet, or smart phone
  I have not used strategies to address and/or mitigate supervisor barriers that arose
Access to computer, tablet, or smart phone with a webcam or external webcam
  I have not used strategies to address and/or mitigate supervisor barriers that arose
Access to video conferencing software that is compliant with HIPAA
  Gained access to computer, tablet, or smartphone with a webcam and/or microphone
  Gained access to internet services
Internet connectivity issues
  Gained access to internet services

Table 3.13. Recommendations to Address and/or Mitigate Supervisee and Supervisor Barriers

Absenteeism (supervisee and supervisor; Frayne, 1991; LeBlanc et al., 2020; Sellers et al., 2016b)
- Develop and implement a self-management strategy
- Clarify and set clear expectations (e.g., provide proof of reason for absence)
- Teach the use of organizational tools (e.g., electronic calendar with built-in reminders)

Non responsiveness (supervisee and supervisor; Cooper et al., 2020; LeBlanc et al., 2020; Sellers et al., 2016b; Southall & Gast, 2011)
- Clarify and set clear expectations (e.g., responding to emails within 24 hr of receiving the email)
- Develop a self-monitoring system
- Remove distractions from room/area

Professionalism (supervisee and supervisor; Andzik & Kranak, 2021)
- Clarify and set clear expectations (e.g., collaborating with coworkers)
- Teach professionalism skills using behavioral skills training

Organizational skills (supervisee; LeBlanc et al., 2020; Sellers et al., 2016b)
- Clarify and set clear expectations (e.g., take notes during meetings)
- Teach the use of organizational tools/apps (e.g., electronic to-do list with built-in reminders, flag important emails, create folders for supervision related emails)

Providing feedback and reinforcement in a timely manner (supervisor; LeBlanc et al., 2020)
- Set clear expectations (e.g., provide feedback immediately, provide feedback within 30 min)
- Practice/role play how to provide feedback effectively
- Develop a self-monitoring system

Implementation of feedback provided (supervisee; Ehrlich et al., 2020; Kazemi et al., 2018; Sellers et al., 2016b)
- Clarify and set clear expectations (e.g., must implement one feedback suggestion in the next session with the client)
- Practice/role play how to implement feedback that was provided
- Monitor the implementation of the feedback provided
- Evaluate if feedback needs to be changed or modified (e.g., frequency of feedback, immediacy of feedback, type of feedback)
- Assign relevant readings regarding accepting feedback (e.g., Chapter 6 in Kazemi et al., 2018; Chapter 20 in Bailey & Burch, 2010)

Distractions during supervision meeting (e.g., family members, pets) (supervisee and supervisor; Neely et al., 2022)
- Remove distractions from room/area
- Minimize all nonrelevant windows on device
- Mute all sounds and notifications
- Move session location
- Change session time to when other individuals are not around

Difficulties with remote record keeping and paperwork (supervisor; LeBlanc et al., 2020)
- Develop and implement a self-management strategy

Access to therapy or assessment materials (e.g., data sheets) (supervisee and supervisor; Simmons et al., 2021)
- Ask organization how to gain access to therapy or assessment materials
- Consider creating virtual materials/resources

Obstruction of view or supervisee out of lens view (e.g., could not see the supervisee working with client) (supervisor; Lerman et al., 2020; Neely et al., 2022)
- Move camera or change camera position
- Use multiple cameras

Ability to model or demonstrate strategies (supervisor; DiGennaro Reed & Henley, 2015; Simmons et al., 2021)
- Create a video model library of specific skills

Scheduling conflicts (supervisee and supervisor; LeBlanc et al., 2020)
- Develop and implement a self-management strategy

Time constraints (supervisor; LeBlanc et al., 2020)
- Develop and implement a self-management strategy

Access to clients (supervisee and supervisor; Frederick et al., 2022; Pollard et al., 2017; Romani & Schieltz, 2017)
- Make schedule changes
- Assess if services can be provided via Telehealth

Access to resources and/or examples of providing supervision via Telehealth (supervisor; Neely et al., 2022)
- Receive training on providing supervision via Telehealth
- Ask for guidance from a supervisor who has provided supervision via Telehealth

Supervisory volume (supervisor; Irwin Helvey et al., 2022; Sellers et al., 2016a)
- Ask organization about changing supervisory volume

Obtaining evaluation of supervision activities (supervisor; LeBlanc et al., 2020; Sellers et al., 2016b)
- Create a supervision evaluation form

Familiarity with technology or videoconferencing software (supervisee and supervisor; Neely et al., 2022; Zoder-Martell et al., 2020)
- Attend/provide a training on how to use technology and/or video conferencing software
- Create a task analysis of how to use technology and/or video conferencing software

Access to internet services (supervisee and supervisor; Lee et al., 2015; Pollard et al., 2017)
- Develop an IT department/team, hire an IT specialist, or train one staff member to serve as IT support

Access to encrypted computer, tablet, or smart phone with a webcam/microphone or external webcam/microphone (supervisee and supervisor; Lee et al., 2015; Lerman et al., 2020)
- Create lending library

Access to video conferencing software that is compliant with Health Insurance Portability and Accountability Act (HIPAA) (supervisee and supervisor; Rios et al., 2018)
- Purchase video conferencing software that is HIPAA compliant (e.g., VSee, Breakthrough)

Internet connectivity issues (supervisee and supervisor; Lee et al., 2015; Lerman et al., 2020; Neely et al., 2022; Pollard et al., 2017)
- Upgrade modem/internet service
- Reduce number of devices using the internet
- Have a back-up device (e.g., phone)
- Develop an IT department/team,
hire an IT specialist, or train one staff member to serve as IT support

Figure 3.1. Percentage of Each Supervisee Barrier Experienced Across Participants
[Bar graph; y-axis: Percentage of Respondents (0-100); x-axis: Barriers Experienced (1-19).]
1 Absenteeism; 2 Non Responsiveness; 3 Professionalism; 4 Organizational Skills; 5 Implementation of Feedback Provided; 6 Distractions During Supervision Meeting; 7 Access to Therapy or Assessment Materials; 8 Scheduling Conflicts; 9 Access to Clients; 10 Familiarity with Technology or Videoconferencing Software; 11 Access to Internet Services; 12 Access to Encrypted Computer, Tablet, or Smart Phone; 13 Access to Encrypted Computer, Tablet, or Smart Phone with a Webcam or an External Webcam; 14 Access to Encrypted Computer, Tablet, or Smart Phone with a Microphone or an External Microphone; 15 Access to Video Conferencing Software that is Compliant with HIPAA; 16 Internet Connectivity Issues; 17 Other; 18 Prefer Not to Answer; 19 I Have Not Experienced Supervisee Barriers

Figure 3.2. Number of Supervisee Versus Supervisor Barriers Experienced Across Participants
[Scatterplot; y-axis: Number of Supervisee Barriers (0-11); x-axis: Number of Supervisor Barriers (0-13).]
Note. For number of barriers, we only listed the number of barriers participants indicated that they experienced. If a number was not listed, zero participants indicated that was the number of barriers they experienced.

Figure 3.3. Percentage of Each Supervisor Barrier Experienced Across Participants
[Bar graph; y-axis: Percentage of Respondents (0-100); x-axis: Barriers Experienced (1-25).]
1 Absenteeism; 2 Non Responsiveness; 3 Professionalism; 4 Distractions During Supervision Meeting; 5 Providing Feedback in a Timely Manner; 6 Providing Reinforcement in a Timely Manner; 7 Difficulties with Remote Record Keeping and Paperwork; 8 Access to Therapy or Assessment Materials; 9 Obstruction of View or Supervisee Out of Lens View; 10 Ability to Model or Demonstrate Strategies; 11 Scheduling Conflicts; 12 Time Constraints; 13 Access to Resources and/or Examples of Providing Supervision via Telehealth; 14 Supervisory Volume; 15 Obtaining Evaluation of Supervision Activities; 16 Familiarity with Technology or Videoconferencing Software; 17 Access to Internet Services; 18 Access to Encrypted Computer, Tablet, or Smart Phone; 19 Access to Encrypted Computer, Tablet, or Smart Phone with a Webcam or an External Webcam; 20 Access to Encrypted Computer, Tablet, or Smart Phone with a Microphone or an External Microphone; 21 Access to Video Conferencing Software that is Compliant with HIPAA; 22 Internet Connectivity Issues; 23 Other; 24 Prefer Not to Answer; 25 I Have Not Experienced Supervisor Barriers

REFERENCES

Alwin, D. F. (2010). How good is survey measurement? Assessing the reliability and validity of survey measures. In P. V. Marsden & J. D. Wright (Eds.), Handbook of survey research (pp. 405-436). Emerald Group Publishing Limited.
American Telemedicine Association (2017). Telehealth: Defining 21st century care. https://www.americantelemed.org/resource/why-telemedicine/
Andzik, N. R., & Kranak, M. P. (2021). The softer side of supervision: Recommendations when teaching and evaluating behavior-analytic professionalism. Behavior Analysis: Research and Practice, 21(1), 65-74. https://doi.org/10.1037/bar0000194
Bailey, J., & Burch, M. (2010).
25 essential skills & strategies for the professional behavior analyst. Routledge.
Barretto, A., Wacker, D. P., Harding, J., Lee, J., & Berg, W. K. (2006). Using telemedicine to conduct behavioral assessments. Journal of Applied Behavior Analysis, 39(3), 333-340. https://doi.org/10.1901/jaba.2006.173-04
Bearss, K., Burrell, L. T., Challa, S. A., Postorino, V., Gillespie, S. E., Crooks, C., & Scahill, L. (2018). Feasibility of parent training via telehealth for children with autism spectrum disorder and disruptive behaviour: A demonstration pilot. Journal of Autism and Developmental Disorders, 48(4), 1020-1030. https://doi.org/10.1007/s10803-017-3363-2
Becerra, L. A., Sellers, T. P., & Contreras, B. P. (2020). Maximizing the conference experience: Tips to effectively navigate academic conferences early in professional careers. Behavior Analysis in Practice, 13, 479-491. https://doi.org/10.1007/s40617-019-00406-w
Behavior Analyst Certification Board (n.d.). BACB certificant data. https://www.bacb.com/BACB-certificant-data
Behavior Analyst Certification Board. (2018). Standards for supervision of BCaBAs. https://www.bacb.com/wp-content/uploads/2020/05/Standards-for-Supervision-of-BCaBAs_180924.pdf
Britton, L. N., & Cicoria, M. J. (2019). Remote fieldwork supervision for BCBA trainees. Academic Press.
Brodhead, M. T. (2022, April 11). Supervision via telehealth survey. osf.io/d2j9z
Brodhead, M. T., & Higbee, T. S. (2012). Teaching and maintaining ethical behavior in a professional organization. Behavior Analysis in Practice, 5(2), 82-88. https://doi.org/10.1007/BF03391827
Brodhead, M. T., Quigley, S. P., & Wilczynski, S. M. (2018). A call for discussion about scope of competence in behavior analysis. Behavior Analysis in Practice, 11(4), 424-435. https://doi.org/10.1007/s40617-018-00303-8
Cavalari, R. N., Gillis, J. M., Kruser, N., & Romanczyk, R. G. (2015). Digital communication and records in service provision: Regulation and practice.
Behavior Analysis in Practice, 8(2), 176-189. https://doi.org/10.1007/s40617-014-0030-3
Cooper, J. O., Heron, T. E., & Heward, W. L. (2020). Applied behavior analysis (3rd ed.). Pearson Education.
Cox, D. J., Plavnick, J. B., & Brodhead, M. T. (2020). A proposed process for risk mitigation during the COVID-19 pandemic. Behavior Analysis in Practice, 13(2), 299-305. https://doi.org/10.1007/s40617-020-00430-1
DiGennaro Reed, F. D., & Henley, A. J. (2015). A survey of staff training and performance management practices: The good, the bad, and the ugly. Behavior Analysis in Practice, 8(1), 16-26. https://doi.org/10.1007/s40617-015-0044-5
Ehrlich, R. J., Nosik, M. R., Carr, J. E., & Wine, B. (2020). Teaching employees how to receive feedback: A preliminary investigation. Journal of Organizational Behavior Management, 40(1-2), 19-29. https://doi.org/10.1080/01608061.2020.1746470
Ferguson, J., Craig, E. A., & Dounavi, K. (2019). Telehealth as a model for providing behavior analytic interventions to individuals with autism spectrum disorder: A systematic review. Journal of Autism and Developmental Disorders, 49(2), 582-616. https://doi.org/10.1007/s10803-018-3724-5
Field, A. (2020). Survey fatigue and the tragedy of the commons: Are we undermining our evaluation practice? Evaluation Matters, 6, 1-11. https://doi.org/10.18296/em0054
Frayne, C. A. (1991). Reducing employee absenteeism through self-management training: A research-based analysis and guide. Quorum Books.
Frederick, J. K., Rogers, V. R., & Raabe, G. R. (2022). Commitment, collaboration, and problem resolution to promote and sustain access to multifaceted applied behavior analytic services utilizing telepractice. Behavior Analysis in Practice, 15(1), 347-369. https://doi.org/10.1007/s40617-020-00550-8
Hajiaghamohseni, Z., Drasgow, E., & Wolfe, K. (2020). Supervision behaviors of board certified behavior analysts with trainees. Behavior Analysis in Practice, 14(1), 97-109.
https://doi.org/10.1007/s40617-020-00492-1
Heitzman-Powell, L. S., Buzhardt, J., Rusinko, L. C., & Miller, T. M. (2014). Formative evaluation of an ABA outreach training program for parents of children with autism in remote areas. Focus on Autism and Other Developmental Disabilities, 29(1), 23-38. https://doi.org/10.1177/1088357613504992
Higgins, W. J., Luczynski, K. C., Carroll, R. A., Fisher, W. W., & Mudford, O. C. (2017). Evaluation of a telehealth training package to remotely train staff to conduct a preference assessment. Journal of Applied Behavior Analysis, 50(2), 238-251. https://doi.org/10.1002/jaba.370
Horn, B. P., Barragan, G. N., Fore, C., & Bonham, C. A. (2016). A cost comparison of travel models and behavioural telemedicine for rural, Native American populations in New Mexico. Journal of Telemedicine and Telecare, 22(1), 47-55. https://doi.org/10.1177/1357633X15587171
Irwin Helvey, C., Thuman, E., & Cariveau, T. (2022). Recommended practices for individual supervision: Considerations for the behavior-analytic trainee. Behavior Analysis in Practice, 15, 370-381. https://doi.org/10.1007/s40617-021-00557-9
Kazemi, E., Rice, B., & Adzhyan, P. (2019). Fieldwork and supervision for behavior analysts: A handbook. Springer Publishing Company.
Krezmien, M. P., Lauterbach, A., Harrington, K., & Yakut, A. (2017). Developing and conducting international school counseling survey research. In J. C. Carey, B. Harris, S. M. Lee, & O. Aluede (Eds.), International handbook for policy research on school-based counseling (pp. 59-70). Springer International Publishing.
Kuravackel, G. M., Ruble, L. A., Reese, R. J., Ables, A. P., Rodgers, A. D., & Toland, M. D. (2018). Compass for hope: Evaluating the effectiveness of a parent training and support program for children with ASD. Journal of Autism and Developmental Disorders, 48(8), 404-416. https://doi.org/10.1007/s10803-017-3333-8
LeBlanc, L. A., Sellers, T. P., & Ala'i, S. (2020).
Building and sustaining meaningful and effective relationships as a supervisor and mentor. Sloan Publishing.
LeBlanc, L. A., Heinicke, M. R., & Baker, J. C. (2012). Expanding the consumer base for behavior-analytic services: Meeting the needs of consumers in the 21st century. Behavior Analysis in Practice, 5(1), 4-14. https://doi.org/10.1007/BF03391813
LeBlanc, L. A., & Luiselli, J. K. (2016). Refining supervisory practices in the field of behavior analysis: Introduction to the special section on supervision. Behavior Analysis in Practice, 9(4), 271-273. https://doi.org/10.1007/s40617-016-0156-6
Lee, J. F., Schieltz, K. M., Suess, A. N., Wacker, D. P., Romani, P. W., Lindgren, S. D., Kopelman, T. G., & Padilla Dalmau, Y. C. (2015). Guidelines for developing telehealth services and troubleshooting problems with telehealth technology when coaching parents to conduct functional analyses and functional communication training in their homes. Behavior Analysis in Practice, 8(2), 190-200. https://doi.org/10.1007/s40617-014-0031-2
Lerman, D. C., O’Brien, M. J., Neely, L., Call, N. A., Tsami, L., Schieltz, K. M., Berg, W. K., Graber, J., Huang, P., Kopelman, T., & Cooper-Brown, L. J. (2020). Remote coaching of caregivers via Telehealth: Challenges and potential solutions. Journal of Behavioral Education, 29(2), 195-221. https://doi.org/10.1007/s10864-020-09378-2
Lindgren, S., Wacker, D., Suess, A., Schieltz, K., Pelzel, K., Kopelman, T., Lee, J., Romani, P., & Waldron, D. (2016). Telehealth and autism: Treating challenging behavior at lower cost. Pediatrics, 137, S167-S175. https://doi.org/10.1542/peds.2015-28510
Machalicek, W., O’Reilly, M., Chan, J. M., Rispoli, M., Lang, R., Davis, T., Shogren, K., Sorrells, A., Lancioni, G., Sigafoos, J., Green, V., & Langthorne, P. (2009). Using videoconferencing to support teachers to conduct preference assessments with students with autism and developmental disabilities. Research in Autism Spectrum Disorders, 3(1), 32-41.
https://doi.org/10.1016/j.rasd.2008.03.004
Meadan, H., & Daczewitz, M. E. (2015). Internet-based intervention training for parents of young children with disabilities: A promising service-delivery model. Early Child Development and Care, 185(1), 155-169. https://doi.org/10.1080/03004430.2014.908866
Merriam-Webster (n.d.). Barriers. In Merriam-Webster.com dictionary. Retrieved December 4, 2020, from https://www.merriam-webster.com/dictionary/barrier
Neely, L., Rispoli, M., Gerow, S., & Hong, E. R. (2016). Preparing interventionists via telepractice in incidental teaching for children with autism. Journal of Behavioral Education, 25(4), 393-416. https://doi.org/10.1007/s10864-016-9250-7
Neely, L., Tsami, L., Graber, J., & Lerman, D. C. (2022). Towards the development of a curriculum to train behavior analysts to provide services via Telehealth. Journal of Applied Behavior Analysis, 55(2), 395-411. https://doi.org/10.1002/jaba.904
Ninci, J., Čolić, M., Hogan, A., Taylor, G., Bristol, R., & Burris, J. (2021). Maintaining effective supervision systems for trainees pursuing a behavior analyst certification board certification during the COVID-19 pandemic. Behavior Analysis in Practice, 14(4), 1047-1057. https://doi.org/10.1007/s40617-021-00565-9
Patel, S. S., Webster, R. K., Greenberg, N., Weston, D., & Brooks, S. H. (2020). Research fatigue in COVID-19 pandemic and post-disaster research: Causes, consequences and recommendations. Disaster Prevention and Management, 29(4), 445-455. https://doi.org/10.1108/DPM-05-2020-0164
Pollard, J. S., Karimi, K. A., & Ficcaglia, M. B. (2017). Ethical considerations in the design and implementation of a telehealth service delivery model. Behavior Analysis: Research and Practice, 17(4), 298-311. https://doi.org/10.1037/bar0000053
Quigley, S. P., Blevins, P. R., Cox, D. J., Brodhead, M. T., & Kim, S. Y. (2019). An evaluation of explicit ethical statements in telehealth research with individuals with autism spectrum disorder.
Behavior Analysis: Research and Practice, 19(2), 123-135. http://dx.doi.org/10.1037/bar0000094
Remler, D. K., & Van Ryzin, G. G. (2011). Research methods in practice: Strategies for description and causation. Sage Publications.
Rios, D., Kazemi, E., & Peterson, S. M. (2018). Best practices and considerations for effective service provision via remote technology. Behavior Analysis: Research and Practice, 18(3), 277-287. https://doi.org/10.1037/bar0000072
Rogelberg, S., & Stanton, J. (2007). Understanding and dealing with organizational survey non-response. Organizational Research Methods, 10(2), 195-209. https://doi.org/10.1177/1094428106294693
Romani, P. W., & Schieltz, K. M. (2017). Ethical considerations when delivering behavior analytic services for problem behavior via telehealth. Behavior Analysis: Research and Practice, 17(4), 312-324. https://doi.org/10.1037/bar0000074
Sellers, T. P., Alai-Rosales, S., & MacDonald, R. P. F. (2016a). Taking full responsibility: The ethics of supervision in behavior analytic practice. Behavior Analysis in Practice, 9(4), 299-308. https://doi.org/10.1007/s40617-016-0144-x
Sellers, T. P., LeBlanc, L. A., & Valentino, A. L. (2016b). Recommendations for detecting and addressing barriers to successful supervision. Behavior Analysis in Practice, 9(4), 309-319. https://doi.org/10.1007/s40617-016-0142-z
Sellers, T. P., Valentino, A. L., Landon, T. J., & Aiello, S. (2019). Board certified behavior analysts’ supervisory practices of trainees: Survey results and recommendations. Behavior Analysis in Practice, 12(3), 536-546. https://doi.org/10.1007/s40617-019-00367-0
Shapiro, M., & Kazemi, E. (2017). A review of training strategies to teach individuals implementation of behavioral interventions. Journal of Organizational Behavior Management, 37(1), 32-62. https://doi.org/10.1080/01608061.2016.1267066
Sheskin, D. J. (2011). Handbook of parametric and nonparametric statistical procedures (5th ed.). CRC Press.
Simmons, C.
A., Ford, K. R., Salvatore, G. L., & Moretti, A. E. (2021). Acceptability and feasibility of virtual behavior analysis supervision. Behavior Analysis in Practice, 14(4), 927-943. https://doi.org/10.1007/s40617-021-00622-3
Slocum, T. A., Detrich, R., Wilczynski, S. M., Spencer, T. D., Lewis, T., & Wolfe, K. (2014). The evidence-based practice of applied behavior analysis. The Behavior Analyst, 37(1), 41-56. https://doi.org/10.1007/s40614-014-0005-2
Southall, C. M., & Gast, D. L. (2011). Self-management procedures: Comparison across the autism spectrum. Education and Training in Autism and Developmental Disabilities, 46(2), 155-171. https://www.jstor.org/stable/23879688
Turner, L. B., Fischer, A. J., & Luiselli, J. K. (2016). Towards a competency-based, ethical, and socially valid approach to the supervision of applied behavior analytic trainees. Behavior Analysis in Practice, 9(4), 287-298. https://doi.org/10.1007/s40617-016-0121-4
Unholz-Bowden, E., McComas, J. J., McMaster, K. L., Girtler, S. N., Kolb, R. L., & Shipchandler, A. (2020). Caregiver training via telehealth on behavioral procedures: A systematic review. Journal of Behavioral Education, 29(2), 246-281. https://doi.org/10.1007/s10864-020-09381-7
Valentino, A. L. (2021). Supervision and mentoring. In J. K. Luiselli, R. M. Gardner, F. L. Bird, & H. Maguire (Eds.), Organizational behavior management approaches for intellectual and developmental disabilities (pp. 141-164). Routledge.
Vismara, L. A., Young, G. S., & Rogers, S. J. (2012). Telehealth for expanding the reach of early autism training to parents. Autism Research and Treatment, 2012, 1-12. https://doi.org/10.1155/2012/121878
Wacker, D. P., Lee, J. F., Dalmau, Y. C., Kopelman, T. G., Lindgren, S. D., Kuhle, J., Pelzel, K. E., Dyson, S., Schieltz, K. M., & Waldron, D. B. (2013). Conducting functional analyses of problem behavior via telehealth. Journal of Applied Behavior Analysis, 46(1), 31-46.
https://doi.org/10.1002/jaba.29
Wainer, A. L., & Ingersoll, B. R. (2015). Increasing access to an ASD imitation intervention via a telehealth parent training program. Journal of Autism and Developmental Disorders, 45(12), 3877-3890. https://doi.org/10.1007/s10803-014-2186-7
Wilczynski, S. M., Labrie, A., Baloski, A., Kaake, A., Marchi, N., & Zoder-Martell, K. (2017). Web-based teacher training and coaching/feedback: A case study. Psychology in the Schools, 54(4), 433-445. https://doi.org/10.1002/pits.22005
Zoder-Martell, K. A., Markelz, A. M., Floress, M. T., Skriba, H. A., & Sayyah, L. E. N. (2020). Technology to facilitate telehealth in applied behavior analysis. Behavior Analysis in Practice, 13(3), 596-603. https://doi.org/10.1007/s40617-020-00449-4

CHAPTER 4

An Evaluation of Email Performance-Based Feedback on Teacher Candidates' Multiple Stimulus Without Replacement Preference Assessment Implementation

Performance feedback is a critical component of professional development (Barton et al., 2020; Miltenberger, 2012). Performance feedback involves the use of data, derived from an observation occurring during supervision, to inform the delivery of feedback in order to change and sustain the individual's behavior (Barton et al., 2016; Barton et al., 2020; Hemmeter et al., 2011; Novak et al., 2019). Within school settings, researchers have found that performance feedback increases procedural fidelity and maintains teachers' use of effective practices, which in turn increases the quality of instruction provided and improves child learning outcomes (Barton et al., 2020; Schles & Robertson, 2019). Without additional support (e.g., performance feedback), new and returning teachers may implement evidence-based practices with low or variable levels of fidelity, negatively impacting child learning outcomes for students with disabilities (Schles & Robertson, 2019).
Performance feedback can be delivered in many forms, such as verbal or written formats, and it can be provided during or after a supervisory observation of the target individual implementing an intervention or engaging in an activity (Barton & Wolery, 2007). In addition, performance feedback can be delivered through a variety of modalities, such as bug-in-ear devices, visible counters, public wall postings, and personal interactions (Coogle et al., 2016; Coogle et al., 2017; Warrilow et al., 2020). As technology has evolved and become more available, so have modalities for delivering performance feedback, such as computer displays, text messages, video conferencing, social media communications, and emails (e.g., Barton & Wolery, 2007; Hemmeter et al., 2011; Krick Oborn & Johnson, 2015; Zhu et al., 2021).

One technology-based modality that has been shown to have several advantages for delivering performance feedback is email (Gorton et al., 2021). First, email feedback allows supervisors to save time by sending an email following the observation rather than scheduling a time to meet in person to review that feedback (Warrilow et al., 2020). Relatedly, the observer can send the email feedback immediately after the observation is completed, without interrupting the individual or the activity they are engaged in (e.g., implementing an intervention with a child; Barton & Wolery, 2007; Gorton et al., 2021; Zhu et al., 2021). Second, verbal and some forms of written feedback (e.g., handwritten notes) may be seen as obsolete and ineffective by the individuals receiving the feedback, whereas wireless communication forms of feedback (e.g., email, video, bug-in-ear) are seen as more current or up-to-date (Barton et al., 2020; Gomez et al., 2021; Zhu et al., 2021) and may be seen as more socially acceptable (Barton & Wolery, 2007).
Additionally, using emails results in an electronic record of the feedback provided (Zhu et al., 2021), which can be reviewed more than once and can be used for future performance reviews. Overall, email feedback may be a strategy for interacting with individuals quickly and more efficiently (Barton et al., 2016; Warrilow et al., 2020). A limitation of previous email feedback research is that the studies did not isolate the effects of email feedback, because the email feedback was always provided in conjunction with a training (Artman-Meecker & Hemmeter, 2013; Gomez et al., 2021; Gorton et al., 2021; Hemmeter et al., 2011; Martinez Cueto et al., 2021). Researchers then sought to isolate and evaluate the effects of email feedback alone (Barton et al., 2016; Barton et al., 2020) and also compared email feedback alone to other forms of feedback, such as immediate bug-in-ear feedback (Coogle et al., 2020) and videoconference feedback (Zhu et al., 2021). Overall, the studies found that email feedback alone was effective in increasing teachers' behaviors.

Though the aforementioned studies evaluated providing performance feedback via email, they had several limitations that the present study seeks to address. First, within the recent studies evaluating email feedback, researchers did not control for the amount or quality of the performance feedback provided in each email. This lack of experimental control presents problems because researchers have found that more frequent and specific feedback produces more significant changes in performance (Park et al., 2019; Sleiman et al., 2020). It is unclear if variation in the amount or quality of performance feedback in previously published research differentially affected participant outcomes.
Second, several of the aforementioned studies included a training component between the baseline condition and the intervention condition (i.e., Artman-Meecker & Hemmeter, 2013; Gomez et al., 2021; Gorton et al., 2021; Hemmeter et al., 2011; Martinez Cueto et al., 2021). Thus, it is unclear if improvements in instructional behaviors were a result of email feedback alone, training alone, or a combination of the email feedback and the training. Third, the previous literature did not control for variation in the instructional environment. Specifically, the studies reviewed above involved classroom settings where teachers engaged with students. Teacher participants' opportunities to respond were thus at least partially determined by student behavior in those settings, and it is unclear to what extent participant behavior was affected by student behavior (e.g., the frequency of teacher descriptive praise depends on the frequency of student engagement).

In summary, the purpose of the present study is to extend previous research on performance feedback delivered via email by evaluating the effects of email feedback alone on teacher candidates' implementation of a multiple stimulus without replacement (MSWO) preference assessment. Specifically, researchers evaluated the following research question: Given email feedback, to what extent did teacher candidates implement an MSWO preference assessment with fidelity?

Method

Participants

Participants were recruited from three undergraduate courses offered through a special education program at a Midwestern university. The researchers contacted the professors of the three courses and provided them with a recruitment email to send to their students enrolled in the courses. A total of six participants (all identified as female) were recruited and agreed to participate in the study. Participants were between 20 and 23 years of age.
Participants were included in the study if they: (a) were enrolled at the university at the start of the study, (b) had no prior experience implementing an MSWO preference assessment, (c) had access to an email service, (d) had access to a computer with audio and video capabilities, and (e) were willing to read and respond to emails daily. Participants were screened for the inclusion criteria through a questionnaire prior to the start of the study. In addition, each participant was required to respond to three non-study related emails sent over three days prior to the start of the study.

Riley was a 20-year-old female who identified as White and not of Latinx, Hispanic, or Spanish origin. Riley had been enrolled at the university for less than one year, was a transfer student, and was majoring in Special Education.

Olivia was a 23-year-old female who identified as Asian and not of Latinx, Hispanic, or Spanish origin. Olivia had been enrolled at the university for four years and was majoring in Special Education.

Ava was a 20-year-old female who identified as White and not of Latinx, Hispanic, or Spanish origin. Ava had been enrolled at the university for three years and was majoring in Special Education.

Layla was a 20-year-old female who identified as White and not of Latinx, Hispanic, or Spanish origin. Layla had been enrolled at the university for two years, was a transfer student, and was majoring in Special Education.

Ellie was a 21-year-old female who identified as White and not of Latinx, Hispanic, or Spanish origin. Ellie had been enrolled at the university for four years and was majoring in Special Education and Elementary Education.

Kennedy was a 20-year-old female who identified as Black or African American and Hispanic, and of Latinx, Hispanic, or Spanish origin. Kennedy had been enrolled at the university for three years and was majoring in Special Education.
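As background on the task participants were later asked to implement: an MSWO preference assessment is conventionally scored as a selection percentage, the number of trials on which an item was selected divided by the number of trials on which it was available in the array. A minimal sketch of that scoring rule (item names are hypothetical illustrations, not the stimuli participants used):

```python
# Conventional MSWO scoring: an item's preference value is the number of
# trials on which it was selected divided by the number of trials on which
# it was available, expressed as a percentage. Items chosen later remained
# in the array across more trials, so they earn lower percentages.

def mswo_selection_percentages(sessions):
    """sessions: list of selection orders, each a list of items in the
    order they were chosen (the first item was picked on trial 1, etc.)."""
    selected, available = {}, {}
    for order in sessions:
        for trial, item in enumerate(order):
            # Every item still unchosen at this trial was available on it.
            for remaining in order[trial:]:
                available[remaining] = available.get(remaining, 0) + 1
            selected[item] = selected.get(item, 0) + 1
    return {i: round(100 * selected[i] / available[i], 1) for i in selected}

# One session with five hypothetical stimuli, "ball" chosen first.
session = [["ball", "car", "book", "pen", "stapler"]]
print(mswo_selection_percentages(session))
# {'ball': 100.0, 'car': 50.0, 'book': 33.3, 'pen': 25.0, 'stapler': 20.0}
```

The resulting percentages rank the stimuli into a preference hierarchy across one or more sessions.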
Confederate

The primary researcher, a doctoral candidate who held a BCBA credential and had prior experience conducting research on preference assessments with children with ASD, served as the confederate throughout the duration of the study. The confederate played the role of a student (i.e., the learner) and engaged in specific responses during each research session (see Table 4.1 and the Confederate Response Data Sheet section for further description).

Primary Data Collector

One graduate student served as the primary data collector and measured the dependent variable (participant procedural fidelity) for all research sessions across all conditions for each participant. The graduate student was trained by the primary researcher on the data collection process prior to the start of the study. The training was conducted across two different days, for 1 hr each day. During the training sessions, the primary researcher reviewed each step of the participant procedural fidelity data sheet (see Dependent Variable section for further description), virtually displayed the MSWO training video that was presented to the participants (see Participant Materials section for further description), and practiced scoring a research session. Following the training sessions, both the primary researcher and the primary data collector scored three research sessions independently. If the primary data collector achieved 100% accuracy across all three research sessions, they were considered to have passed the reliability checks and were assigned to score the remaining research sessions. The primary data collector achieved 100% accuracy across the three research sessions on the first attempt.

Setting and Materials

Due to the COVID-19 pandemic, all research sessions, data collection, and feedback were conducted or provided remotely. For each research session, the participant and the primary researcher/confederate were present.
Additionally, each research session was recorded for data collection purposes; no data were collected during the session itself. Instead, research sessions were reviewed later by the primary data collector.

Participant Materials

Participant materials consisted of a computer with audio and video capabilities, an email service, video conferencing software (e.g., Zoom), a pen or pencil, a calculator, a timer (e.g., phone), five leisure stimuli, and data sheets. The five leisure stimuli (e.g., ball, stapler) for the preference assessment were arbitrarily selected by each participant based on the stimuli they had available in their household for each research session. Twenty data sheets (see Appendix B) for participants to use for data collection while conducting the MSWO preference assessment were mailed to the participant's household or emailed to the participant (based on each participant's preference) prior to the start of the study.

Primary Researcher/Confederate Materials

Primary researcher/confederate materials consisted of a computer with audio and video capabilities, video conferencing software (e.g., Zoom), an email service, a timer (e.g., digital stopwatch), and a second computer monitor. Additional materials consisted of an MSWO training video (presented once during the initial research meeting), an MSWO excerpt, a session script, a list of responses (hereafter referred to as the confederate response data sheet) to perform during each research session (see Table 4.1), and a pool of specific responses for each component of the email feedback. During each research session, the primary researcher/confederate used the second monitor to display the assigned confederate response data sheet, the timer, and the session script. The primary monitor was used to display the video conferencing software in gallery view, allowing both the primary researcher/confederate and the participant to be visible side-by-side. MSWO Training Video.
The MSWO training video was developed to emulate a typical MSWO training that may be provided at an organization to teach instructors how to conduct preference assessments with individuals with ASD. The purpose of the training video was to approximate an "onboarding" or initial training a professional may receive on the topic and to provide basic information about how to implement an MSWO preference assessment, so that each participant would have baseline-level knowledge of the MSWO prior to the start of the study. Because the purpose of this study was to evaluate the effects of performance feedback on participant behavior, this pre-experiment training ensured participants engaged in at least some level of accurate behavior, in order to provide opportunities for performance-based feedback. The MSWO training video was informed by an MSWO video created by researchers at Vanderbilt University (Chazin & Ledford, 2016) and by the primary researcher's eight years of experience in ABA and in implementing preference assessments. The training video was created by the primary researcher and included a step-by-step guide to conducting an MSWO preference assessment within a remote context. When an MSWO preference assessment is conducted in person, the learner may be able to engage with the physical item following a selection response. However, when an MSWO preference assessment is conducted within a remote context, the learner may not be able to engage with the physical item because they may not have access to it. Instead, the implementer may engage with the item while the learner observes. The training video included specific steps that were tailored to the unique aspects of conducting an MSWO preference assessment within a remote context.
The video included a video model in which the primary researcher played the role of the participant and a research assistant played the role of the learner (i.e., confederate), written text that indicated the target behavior for each step, and a voice-over narration completed by the primary researcher. The training video (see Brodhead, 2022) was 21 min in length and was displayed virtually to the participants by the primary researcher one time prior to the start of the study (i.e., during the initial research meeting that occurred one week before each participant's first research session). The participants did not have access to the training video outside of the initial research meeting. MSWO Excerpt. The MSWO excerpt was derived from a relevant peer-reviewed article (DeLeon & Iwata, 1996) and consisted of sections that outlined correct implementation of the MSWO preference assessment (see Appendix C). The purpose of the MSWO excerpt was to provide the participants with a brief description of the MSWO procedures in order to mimic a description someone would find if they were to conduct an internet search or access notes on MSWO preference assessments. This approach is identical to that used in previous preference assessment literature (e.g., O'Handley et al., 2021; Rosales et al., 2015). The MSWO excerpt was displayed virtually to each participant at the beginning of each research session for up to 15 min. During the 15 min, the participants could read and reread the entire excerpt or specific sentences. For one participant, due to participant characteristics that cannot be disclosed for confidentiality reasons, the MSWO excerpt was read aloud to the participant by the primary researcher for up to 15 min. At any time during the 15 min, the participants could indicate to the primary researcher that they were finished reviewing the excerpt, and the primary researcher would stop displaying it.
The participants did not have access to the excerpt outside of the research sessions. Session Script. The session script consisted of step-by-step instructions for the primary researcher to follow during each research session. The script consisted of two sections that prescribed how the primary researcher should engage with each participant prior to the session and during the session. It included directions for the primary researcher to share the MSWO excerpt and how to respond if a participant asked a question, which ensured consistency in researcher responses within and across participants. See Appendix D for a complete description of the session script. Confederate Response Data Sheet. Each confederate response data sheet consisted of a sequence of responses for the confederate to engage in during the research session (see Appendix E for an example). Specifically, each data sheet included 15 responses, one for each trial of the MSWO. There were a total of 10 variations of the confederate response data sheet. The sequence of responses on each confederate response data sheet was randomly generated. There were seven different responses the confederate could engage in (see Table 4.1) throughout a research session. Given that each research session consisted of 15 total trials, some of the responses were included in the sequence more than once and some were included only one time. To ensure that each of the seven responses was included at least once within each confederate response data sheet variation, the primary researcher used a random number generator (Maple Tech International LLC, 2022) to identify a trial number in which each potential response would occur. Once all seven responses were included in the sequence, the primary researcher then used the random number generator to determine how many additional times each response would be included in the data sheet, with no response being repeated more than three times.
Once the frequency of each response was determined, the primary researcher continued to use the random number generator to determine the trial in which each response would occur until all trials had a response. For each research session, the assigned confederate response data sheet was randomly selected using a random number generator (Maple Tech International LLC, 2022). Once a confederate response data sheet (e.g., assigned confederate response data sheet #8) was used for a participant, it was not replaced into the potential selections for future research sessions until all 10 confederate response data sheets had been used for that participant. Once all 10 confederate response data sheets had been used for that participant, all 10 had the opportunity to be selected again.

Primary Data Collector Materials

The primary data collector materials consisted of a pen or pencil, a timer, the video recording of the research session, an email service, and data sheets. The primary data collector used three different data sheets. The first data sheet was used to record the confederate's stimulus selections and was identical to the participant's data sheet. The second data sheet was used to record the participant's adherence to the steps of the MSWO preference assessment, the dependent variable (i.e., procedural fidelity). Finally, the third data sheet was the assigned confederate response data sheet, so the primary data collector would know the sequence of responses the primary researcher/confederate engaged in during the research session.

Measurement

Dependent Variable

The primary dependent variable was fidelity of participant implementation (procedural fidelity). Procedural fidelity was the degree to which the participant implemented the MSWO as intended (Cooper et al., 2007; Gast & Ledford, 2014). The dependent variable comprised component responses derived from a task analysis that depicted the instructional behaviors for the MSWO preference assessment (see Appendix F).
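As an illustration, the confederate response data sheet randomization described in the Confederate Response Data Sheet section (each of the seven responses included at least once per 15-trial sequence, no response repeated more than three times, and sheets selected without replacement across sessions) can be sketched in Python. This is a minimal sketch only: the response labels and function names are hypothetical, and the original procedure used a web-based random number generator (Maple Tech International LLC, 2022) rather than code.

```python
import random
from collections import Counter

# Hypothetical labels for the seven confederate responses (see Table 4.1).
RESPONSES = ["one_item", "two_item_simultaneous", "two_item_sequential",
             "no_selection", "response_5", "response_6", "response_7"]
TRIALS_PER_SESSION = 15
MAX_REPEATS = 3

def generate_response_sheet(rng):
    """Build one 15-trial sequence: every response type appears at least
    once, and no type appears more than three times."""
    counts = Counter({r: 1 for r in RESPONSES})       # each type at least once
    while sum(counts.values()) < TRIALS_PER_SESSION:  # fill remaining trials
        candidate = rng.choice(RESPONSES)
        if counts[candidate] < MAX_REPEATS:
            counts[candidate] += 1
    sequence = [r for r, n in counts.items() for _ in range(n)]
    rng.shuffle(sequence)                             # random trial placement
    return sequence

def sheet_selector(n_sheets, rng):
    """Yield sheet numbers without replacement; once all sheets have been
    used for a participant, every sheet becomes selectable again."""
    pool = []
    while True:
        if not pool:
            pool = list(range(1, n_sheets + 1))
            rng.shuffle(pool)
        yield pool.pop()
```

The without-replacement cycling in `sheet_selector` mirrors the rule that a used data sheet could not be selected again until all 10 variations had been used for that participant.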
The task analysis was developed using the MSWO steps provided in DeLeon and Iwata (1996), DeLeon et al. (1997), and Sipila-Thomas et al. (2020) and was modified for delivery in a remote context. Each step in the task analysis was coded as a whole occurrence (i.e., always occurred) or a whole nonoccurrence (i.e., never occurred). In order for a participant's response to be scored as a whole occurrence, the participant had to correctly engage in the response every time it was required (e.g., engaged in the response correctly seven out of seven times) throughout the research session. If at any point the participant did not engage in the response correctly (e.g., engaged in the response correctly five out of seven times), the entire response was scored as a whole nonoccurrence for that session. The percentage of occurrences for each session was calculated by dividing the total number of occurrences (steps implemented correctly in the task analysis) by the sum of occurrences and nonoccurrences. The quotient was then multiplied by 100 to yield a percentage (Cooper et al., 2007). The task analysis for the MSWO contained 24 total steps. The first two steps were to occur at the beginning of each session. Specifically, the steps evaluated whether the participant: (1) gathered all materials (i.e., five stimuli, data sheet, pencil/pen, timer, computer) and (2) recorded item names on the MSWO data sheet. The next four steps evaluated participant performance prior to each trial. The steps evaluated whether the participant: (3) set up the computer on the table so the confederate could see all of the items, (4) placed all items randomly in a straight line within the confederate's view, (5) ensured that the confederate was attending to the items (looking at the participant or the items), and (6) said "pick one" or "choose one".
The next six steps evaluated participant behavior after the primary researcher/confederate engaged in a one-item selection response (i.e., when the confederate pointed with one finger and selected one item), if applicable. Specifically, the steps evaluated whether the participant: (7) engaged with the selected leisure item for 30 sec, (8) removed all other items from the table or pulled them back/to the side (when applicable), (9) recorded data (indicated the item selected) on the data sheet, (10) placed the selected item out of the confederate's view after 30 sec, (11) re-presented the remaining items and rotated them by taking the item at the left end of the line, moving it to the right end, and then shifting the other items so they were equally spaced on the table, and (12) repeated Steps 5 and 6 until all items were selected or a 30-sec period elapsed with no selection. The next three steps evaluated participant responding in the event the primary researcher/confederate engaged in a two-item (simultaneous) selection response (i.e., when the confederate pointed with two fingers and selected two items at the same time). Specifically, the steps evaluated whether the participant: (13) removed all items from the table or pulled them back/to the side and waited 5 sec, (14) re-presented the items in the same lineup (without rotating the items), and (15) repeated Steps 5 and 6. The next six steps evaluated participant responding in the event the primary researcher/confederate engaged in a two-item (sequential) selection response (i.e., when the confederate pointed with one finger and selected one item, paused, then pointed with one finger again and selected another item).
The steps evaluated whether the participant: (16) ignored the second selected item and continued to engage with the first selected leisure item for 30 sec, (17) removed all other items from the table or pulled them back/to the side, (18) recorded data (indicated the item selected) on the data sheet, (19) placed the selected item out of the confederate's view after 30 sec, (20) re-presented the remaining items and rotated them by taking the item at the left end of the line, moving it to the right end, and then shifting the other items so they were equally spaced on the table, and (21) repeated Steps 5 and 6 until all items were selected or a 30-sec period elapsed with no selection. Step 22 evaluated participant responding in the event the primary researcher/confederate did not point to or select an item. Specifically, the step evaluated whether the participant ended the session and recorded the remaining item(s) as not selected ("NS"). The last two steps evaluated participant performance after the MSWO trials were completed. The steps evaluated whether the participant: (23) calculated the top preferred item and (24) recorded the data accurately. For one participant, due to participant characteristics that cannot be disclosed for confidentiality reasons, the task analysis for the MSWO was modified to contain 23 total steps. Step 5, which was to ensure that the confederate was attending to the items (looking at the participant or the items), was eliminated, and the participant was not required to engage in this step. All other steps remained the same.

Interobserver Agreement

A second graduate student served as a second observer and measured participant procedural fidelity for at least 30% of sessions for all baseline and intervention conditions across all participants. The graduate student was trained by the primary researcher on the data collection process in a manner identical to that of the primary data collector.
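The whole occurrence scoring rule and the percentage-of-occurrences calculation described in the Dependent Variable section can be sketched as follows. This is a minimal illustration with hypothetical function names, not the data sheet used in the study.

```python
def whole_occurrence(opportunities):
    """A step is scored as a whole occurrence only if the participant
    performed it correctly on every opportunity in the session."""
    return all(opportunities)

def percent_occurrence(step_scores):
    """Occurrences divided by the sum of occurrences and nonoccurrences,
    multiplied by 100 (Cooper et al., 2007)."""
    return sum(step_scores) / len(step_scores) * 100

# A step performed correctly on only 5 of 7 opportunities is scored as
# a whole nonoccurrence for the session.
step = whole_occurrence([True] * 5 + [False] * 2)

# 19 whole occurrences out of 24 task analysis steps yields 79.2%.
fidelity = percent_occurrence([True] * 19 + [False] * 5)
```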
Interobserver agreement (IOA) was calculated for the dependent variable for all participants and met standards for single-case research (Kratochwill et al., 2013). When calculating IOA for participant procedural fidelity, an agreement was scored if the primary data collector and the second observer recorded the same behavior in the task analysis as a whole occurrence or a whole nonoccurrence. For example, an agreement was scored if both the primary data collector and the second observer recorded "always occurred" for Step 1 on the task analysis. A disagreement was recorded if the primary data collector and the second observer did not record the same behavior in the task analysis as a whole occurrence or a whole nonoccurrence. For example, a disagreement was recorded if the primary data collector recorded "always occurred" for Step 1, but the second observer recorded "never occurred" for Step 1. IOA was calculated by dividing the total number of agreements by the sum of agreements and disagreements. The quotient was then multiplied by 100 to yield a percentage (Cooper et al., 2007). Total IOA across all conditions was 96.7% for Riley (range: 87.5% to 100.0%), 99.0% for Olivia (range: 95.8% to 100.0%), and 99.3% for Ava (range: 95.8% to 100.0%). Total IOA across all conditions was 98.6% for Layla (range: 95.8% to 100.0%), 98.3% for Ellie (range: 95.8% to 100.0%), and 95.8% for Kennedy (range: 91.7% to 100.0%). Average IOA scores for each participant, across the two conditions (i.e., baseline and intervention), are displayed in Table 4.2.

Primary Researcher/Confederate Procedural Fidelity

A third graduate student measured primary researcher/confederate procedural fidelity for at least 30% of sessions for all baseline and intervention conditions across all participants. The graduate student was trained by the primary researcher on the data collection process prior to the start of the study.
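The point-by-point IOA calculation described above can be sketched as a short example. The function name is hypothetical, and each observer's record is represented as a list of whole occurrence codes (True = whole occurrence, False = whole nonoccurrence).

```python
def interobserver_agreement(primary, second):
    """Agreements divided by the sum of agreements and disagreements,
    multiplied by 100 (Cooper et al., 2007)."""
    agreements = sum(p == s for p, s in zip(primary, second))
    disagreements = len(primary) - agreements
    return agreements / (agreements + disagreements) * 100

# Observers agree on 23 of 24 task analysis steps: IOA = 95.8%.
primary_record = [True] * 24
second_record = [True] * 23 + [False]
ioa = interobserver_agreement(primary_record, second_record)
```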
The training was conducted across two different days, for 1 hr each day. During the training sessions, the primary researcher reviewed each step of the researcher procedural fidelity data sheet, virtually displayed the training video that was presented to the participants (see Primary Researcher/Confederate Materials section for further description), and practiced scoring a research session. Following the training sessions, both the primary researcher and the graduate student scored three research sessions independently. If the graduate student achieved 100% accuracy across all three research sessions, they were considered to have passed the reliability checks and were assigned to score the randomly selected research sessions. The graduate student achieved 100% accuracy across the three research sessions on the first attempt. Primary researcher/confederate procedural fidelity was the degree to which the primary researcher/confederate implemented the independent variable (i.e., email feedback), followed the session script, and engaged in the predetermined sequence of behaviors (i.e., the assigned confederate response data sheet) as intended (Cooper et al., 2007; Gast & Ledford, 2014). Primary researcher/confederate procedural fidelity was derived from a task analysis that depicted the behaviors the primary researcher/confederate engaged in before, during, and after each research session (see Appendix G). Each step in the task analysis was scored "yes" if the primary researcher/confederate implemented that step correctly and "no" if the step was implemented incorrectly or was omitted. Primary researcher/confederate procedural fidelity was calculated by dividing the sum of "yes" scores by the sum of "yes" plus "no" scores. The quotient was then multiplied by 100 to yield a percentage (Cooper et al., 2007). Procedural fidelity across all conditions was 100% for all participants.
Experimental Design

A multiple probe across participants design was used to evaluate the effects of the MSWO preference assessment with email feedback on the participants' procedural fidelity (Gast et al., 2014). A multiple probe design systematically introduces the independent variable (i.e., email feedback) to evaluate its effects on the dependent variable (i.e., procedural fidelity) and controls for threats to internal validity. The independent variable was introduced on one occasion for each participant, for a total of six opportunities to demonstrate experimental control and treatment effect across all six participants (Cooper et al., 2007). The multiple probe design consisted of two experimental conditions: (a) baseline and (b) intervention. Participants moved from the baseline condition to the intervention condition once visual analysis of the data suggested a steady state of responding had been achieved (see Sidman, 1960). The intervention condition for each participant ended once at least five research sessions were completed and a steady state of responding had been achieved. Additionally, the multiple probe design was conducted either nonconcurrently or concurrently across participants, based on the order in which participants enrolled in the study. For the first set of participants to enroll in the study, Riley, Olivia, and Ava (see Figure 4.1 and Figure 4.2 for an alternative figure depicting the nonconcurrent session schedule), the multiple probe design was nonconcurrent, as they began the study on different days (i.e., Olivia started two days after Riley, and Ava started six days after Olivia).
For the second set of participants to enroll in the study, Layla, Ellie, and Kennedy (see Figure 4.3), the multiple probe design was concurrent, as they all began the study on the same day (see Slocum et al., 2022 for a primer on concurrent and nonconcurrent multiple baseline designs and their variations).

Procedure

Each research session lasted approximately 15-20 min. Baseline sessions were conducted two to three times per week, as close together in days as possible, with the exception of baseline probe sessions, which were separated from the previous baseline session or baseline probe session by at least five days (based on recommendations from Gast & Ledford, 2014). Identical to baseline sessions, intervention sessions were conducted two to three times per week, as close together in days as possible. All research sessions occurred at times and dates convenient for the participant and when the primary researcher was available. Research sessions always began when the primary researcher gave the instruction to begin (i.e., "Okay, now you can begin implementing the assessment. Once you are finished, please let me know"). All research sessions continued until: (a) the participant indicated they had completed the assessment or (b) the participant did not engage in a target response from the task analysis for 2 min. At the end of each session, the participant was asked to display their data sheet on the screen, and the primary researcher thanked the participant for attending. All research sessions were conducted via Zoom and were recorded for data collection purposes. The recording started when the participant joined Zoom and concluded immediately after the participant displayed their data sheet on the screen and the primary researcher thanked them for attending.

Initial Research Meeting and Training

Prior to the start of the first baseline session, all participants attended an initial research meeting with the primary researcher. The initial meeting was approximately 40 min in length.
During the initial research meeting, the primary researcher reviewed the informed consent, shared a demographic questionnaire with the participant, displayed the MSWO training video in its entirety, gathered information about data sheet preference (i.e., mailed or emailed), and obtained the participant's availability for session scheduling purposes.

Baseline

The purpose of this condition was to measure participant behavior prior to the introduction of the email performance-based feedback. At the beginning of each research session, participants were given up to 15 min to review the MSWO excerpt derived from DeLeon and Iwata (1996) described above. After 15 min, or when the participant reported they had finished reviewing the excerpt, the participant was asked to implement the MSWO preference assessment with the primary researcher/confederate. During implementation of the MSWO preference assessment, the primary researcher/confederate engaged in the sequence of responses determined by the assigned confederate response data sheet (see Appendix E for an example). However, if the participant did not correctly engage in the steps following a specific type of selection, the primary researcher/confederate systematically adjusted their next selection response. For example, following a two-item sequential selection response, the participant should have engaged in Steps 16-21 of the task analysis. However, occasionally, instead of ignoring the second selected item and continuing to engage with the first selected item for 30 sec (Step 16), participants engaged with both of the selected items and then re-presented the remaining non-selected items. When this occurred, the primary researcher/confederate skipped the next predetermined trial (e.g., Trial #2) and moved on to the subsequent trial (e.g., Trial #3) to match the number of items that were re-presented in the array.
As another example, following a no-selection response, if the participant continued to re-present the remaining items instead of ending that round of trials (i.e., one round equals five trials, one trial for each of the five items) and recording them as not selected (Step 22), the primary researcher/confederate continued to engage in no-selection responses until the next round of trials began. Finally, as a third example, if the participant engaged in errors such as those noted above and continued to re-present items in the array after the primary researcher/confederate had completed all of the predetermined trials, the primary researcher/confederate engaged in no-selection responses until the participant indicated that they had completed the MSWO. After the participant indicated they had completed the MSWO (e.g., stating "I'm done"), the primary researcher/confederate provided the participant time to calculate their results. Once the participant reported they had finished calculating their results, the primary researcher/confederate instructed the participant to display their data sheet on the screen for a few seconds. No further instructions were given, and no feedback was provided. If a participant asked the primary researcher/confederate a question during the research session, they were informed that the primary researcher/confederate could not answer the question at that time and that they should do the best they could. After each baseline research session ended, the primary researcher sent the participant an email without any feedback components. The components included in the baseline emails were: (a) a general positive opening statement, (b) a request for a response (i.e., posing a scheduling question and asking for a reply), and (c) a positive closing statement.
Emails were sent to participants during the baseline condition to ensure that participants were accessing, reading, and responding to emails and to further isolate the effects of feedback during the subsequent condition.

Intervention

The purpose of this condition was to evaluate participant behavior when the participant was provided with email feedback. Experimental procedures in the intervention condition were identical to those of the baseline condition, with the exception of the email sent to participants following each research session. Emails followed a format similar to that of baseline but also included components specific to supportive feedback (i.e., the number of steps the participant engaged in correctly and comments about the participant's implementation) and corrective feedback (i.e., the steps the participant did not engage in correctly). The components included in the intervention emails were: (a) a general positive opening statement, (b) supportive feedback, (c) corrective feedback, (d) a request for a response (i.e., posing a scheduling question and asking for a reply), and (e) a positive closing statement (see Appendix I for an example intervention email).

Data Analysis

Following each research session, data for each participant were graphed and reviewed by the primary researcher for trend, level, and variability to evaluate intervention effects (Cooper et al., 2007). Following the completion of the study, Tau-U was calculated to supplement visual analysis of the data and to provide a secondary measure of treatment effect. Tau-U is a statistical analysis that combines nonoverlap analysis between phases with trend from within the intervention phase. Tau-U is dynamic in that it can calculate trend only, nonoverlap between phases only, or combinations of trend and overlap between multiple phases (e.g., baseline and intervention conditions; Parker et al., 2011).
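To illustrate the nonoverlap core of this analysis, the sketch below computes simple Tau (nonoverlap between phases only, without Tau-U's correction for baseline trend) by comparing every baseline data point with every intervention data point. The function name and example data are hypothetical; the study itself used a web-based Tau-U calculator rather than code.

```python
def tau_nonoverlap(baseline, intervention):
    """Simple Tau: (improving pairs - deteriorating pairs) / (nA * nB),
    where every baseline point is paired with every intervention point.
    Note: this omits Tau-U's adjustment for baseline trend."""
    pos = sum(b > a for a in baseline for b in intervention)
    neg = sum(b < a for a in baseline for b in intervention)
    return (pos - neg) / (len(baseline) * len(intervention))

# Every intervention point exceeds every baseline point, so Tau = 1.
tau = tau_nonoverlap([41.7, 41.7, 33.0], [87.5, 95.8, 100.0])
```

When all intervention values exceed all baseline values, as was the case for every participant in this study, both simple Tau and Tau-U equal 1.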
Results

Participant Procedural Fidelity

Nonconcurrent Set of Participants

Riley. Riley's percentage of correct implementation data are displayed in Figure 4.1, top panel. In the baseline condition, Riley's percentage of correct implementation ranged from 12.5% to 33.0% (M = 27.3%), with the last three sessions remaining stable at 33.0%. During the first intervention session, Riley's percentage of correct implementation increased to 66.7% and then varied from 62.5% to 91.7%, until it remained stable at 100% for two sessions and 95.8% in the last session (M = 86.5%; range: 62.5% to 100.0%). Olivia. Olivia's percentage of correct implementation data are displayed in Figure 4.1, middle panel. In the baseline condition, Olivia's percentage of correct implementation was 54.0% in the first session and then decreased and remained stable at 41.7% for the last four sessions (M = 43.8%). During the first intervention session, Olivia's percentage of correct implementation increased to 87.5% and then remained stable between 95.8% and 100% for the last four sessions (M = 96.7%; range: 87.5% to 100.0%). Ava. Ava's percentage of correct implementation data are displayed in Figure 4.1, bottom panel. In the baseline condition, Ava's percentage of correct implementation ranged from 41.7% to 54.2% (M = 48.8%). During the first intervention session, Ava's percentage of correct implementation increased to 70.8% and then varied from 70.8% to 87.5%, until it remained stable between 95.8% and 100.0% for the last three sessions (M = 86.9%; range: 70.8% to 100.0%).

Concurrent Set of Participants

Layla. Layla's percentage of correct implementation data are displayed in Figure 4.3, top panel. In the baseline condition, Layla's percentage of correct implementation ranged from 0.0% to 12.5% (M = 3.34%).
During the first intervention session, Layla's percentage of correct implementation increased to 33.3% and then varied from 16.6% to 79.2%, until it remained stable between 87.5% and 95.8% for the last three sessions (M = 69.3%; range: 16.6% to 95.8%).

Ellie. Ellie's percentage of correct implementation data are displayed in Figure 4.3, middle panel. In the baseline condition, Ellie's percentage of correct implementation began at 30.4% in the first session and then varied between 17.4% and 21.7% across three sessions, with the last two sessions at 13.0% (M = 19.5%). During the first intervention session, Ellie's percentage of correct implementation increased to 86.9% and then varied from 86.9% to 95.6%, until it remained stable between 95.6% and 100.0% for the last four sessions (M = 94.5%; range: 86.9% to 100.0%).

Kennedy. Kennedy's percentage of correct implementation data are displayed in Figure 4.3, bottom panel. In the baseline condition, Kennedy's percentage of correct implementation ranged from 25.0% to 41.6% (M = 32.7%). During the first intervention session, Kennedy's percentage of correct implementation increased to 75.0% and then varied from 79.2% to 91.6%, until it remained stable at 95.8% for the last three sessions (M = 88.9%; range: 75.0% to 95.8%).

Tau-U

The researchers calculated Tau-U using a web-based Tau-U calculator (http://singlecaseresearch.org/calculators/tau-u) for the participants. The Tau-U for Riley was 1 (p < 0.05, z = 2.93), the Tau-U for Olivia was 1 (p < 0.05, z = 2.74), the Tau-U for Ava was 1 (p < 0.05, z = 3.13), the Tau-U for Layla was 1 (p < 0.05, z = 3.12), the Tau-U for Ellie was 1 (p < 0.05, z = 3.10), and the Tau-U for Kennedy was also 1 (p < 0.05, z = 3.00). Based on the weighted Tau-U for all participants, the intervention had a large or strong effect, with 100% of participants' data showing significant improvement (p < 0.001, z = 7.33) from baseline to intervention, with 95% CIs [0.73, 1].
Discussion

Overall, the findings indicate that email performance-based feedback was effective in increasing procedural fidelity of MSWO preference assessment implementation in the pre-service teacher participants. All six participants implemented the MSWO preference assessment with high levels of procedural fidelity following email performance-based feedback. These results extend previous findings (i.e., Artman-Meeker & Hemmeter, 2013; Gomez et al., 2021; Gorton et al., 2021; Hemmeter et al., 2011; Martinez Cueto et al., 2021) by isolating the effects of email feedback. Additionally, these results support previous findings (i.e., Barton et al., 2016; Barton et al., 2020; Coogle et al., 2020) suggesting that email performance-based feedback alone is effective in increasing target behavior(s).

In the present study, we used a conservative "all or nothing" whole occurrence measure (a similar approach was used in Sipila-Thomas et al., 2020) that was biased towards deflating participant performance because a single error resulted in the entire step being marked as incorrectly implemented. An alternative would have been a per opportunity measure; however, that measure can be biased towards inflating participant performance (Ledford et al., 2014). For example, in this study there were 15 total trials and each trial consisted of 24 different behaviors the participant could engage in. Therefore, the participant would have 24 opportunities included in the calculation of a whole occurrence measure and 360 opportunities included in the calculation of a per opportunity measure. When calculating the whole occurrence measure, if the participant incorrectly engaged in one specific behavior one time during the 15 trials, and incorrectly engaged in four other specific behaviors one time each, all five behaviors would be marked incorrect and would result in a score of 79.2%, or 19 out of 24 correct.
In contrast, if a per opportunity measure were used for each trial, the participant would have scored 98.6%, or 355 out of 360 correct. Future research may evaluate the extent to which the whole occurrence measure deflates participant performance and determine if practitioners should proceed with caution in using the whole occurrence measure over the per opportunity measure in practice.

Although the email performance-based feedback increased procedural fidelity for all six participants and Tau-U results indicated the intervention had a large or strong effect, the accuracy of responding varied across participants. Layla and Kennedy did not reach 100% procedural fidelity, though they both reached 95.8%, which corresponded to implementing 23 out of the 24 steps correctly. Additionally, the number of sessions required to reach high levels of fidelity varied across participants. Olivia, Ava, Ellie, and Riley reached 100% procedural fidelity after being exposed to two to six intervention research sessions, while Kennedy and Layla were exposed to six and 11 intervention research sessions, respectively, but neither reached 100% procedural fidelity. Previous research on training individuals to implement behavior analytic procedures found similar results when using written or vocal instructions (i.e., instruction-based methods) alone (Iwata et al., 2000; Roscoe et al., 2006; Shapiro & Kazemi, 2017; Vladescu et al., 2012). This variation in responding (i.e., participants do not always achieve 100% fidelity) is an important finding because it demonstrates that rates of performance improvement are idiosyncratic across participants. Therefore, some participants may require additional support (e.g., video models of correct implementation) beyond email performance-based feedback in order to reach 100% procedural fidelity.
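The contrast between the whole occurrence and per opportunity measures can be sketched in a few lines of Python. The error pattern below is the hypothetical one from the worked example above (five of the 24 steps each implemented incorrectly on one of the 15 trials); it is illustrative, not participant data.

```python
# Sketch of the two fidelity metrics contrasted above (hypothetical error
# pattern: 5 of the 24 steps each implemented incorrectly on 1 of 15 trials).

TRIALS, STEPS = 15, 24
errors_per_step = [1] * 5 + [0] * 19          # one error each on five steps

# Whole occurrence: a step counts as correct only if it was correct on
# every trial ("all or nothing"), which deflates apparent performance.
whole_occurrence = sum(e == 0 for e in errors_per_step) / STEPS

# Per opportunity: every trial-by-step opportunity counts separately,
# which can inflate apparent performance (Ledford et al., 2014).
per_opportunity = (TRIALS * STEPS - sum(errors_per_step)) / (TRIALS * STEPS)

print(round(whole_occurrence * 100, 1))  # 79.2
print(round(per_opportunity * 100, 1))   # 98.6
```

The same five errors thus yield 79.2% under the conservative measure and 98.6% under the lenient one, which is why the choice of measure matters when interpreting fidelity scores.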
Previous researchers have found that using multiple procedures (e.g., video modeling, written self-instruction packages, feedback) to train individuals to implement an intervention is effective in increasing procedural fidelity (Shapiro & Kazemi, 2017). Future research could conduct a component analysis to evaluate what additional procedures alongside email feedback may be needed to achieve 100% procedural fidelity. For example, a study could evaluate the effects of (a) email feedback alone, (b) email feedback with video modeling, and (c) email feedback, video modeling, and roleplaying on procedural fidelity to determine if one or all components are necessary to achieve 100% procedural fidelity. Future researchers may also consider evaluating whether 100% procedural fidelity is necessary in order to improve client outcomes (see Groskreutz et al., 2011). If it is determined that 100% procedural fidelity is not necessary, future research could conduct a component analysis to evaluate additional procedures alongside email feedback in order to achieve the ideal procedural fidelity percentage.

In a post hoc analysis of participant errors, the two steps most frequently implemented incorrectly were (1) calculating stimulus rankings (Step 23) and (2) accurately transcribing data (Step 24). Incorrectly implementing these two steps when conducting an MSWO preference assessment can be a major issue for two reasons. First, if the implementer used an item that was miscalculated as a top item when providing treatment to an individual with ASD, there may be noticeably different effects in treatment delivery because for treatment delivery to be successful (i.e., increase desired behavior or decrease undesired behavior), an effective reinforcer is required (Bottini & Gillis, 2021; Cooper et al., 2020).
Second, if the implementer did not accurately record data during the implementation of the MSWO, the inaccurate data would result in incorrect decision-making (see Cox & Brodhead, 2021) and may impact the effects of treatment delivery. However, it is unclear why these two specific steps were the two most common errors. One potential cause could be that the feedback was less effective in explaining how to correctly implement these steps. A second potential cause could be session fatigue, as these two steps came at the end of the research session. Future research could evaluate if participants experience similar difficulty when calculating results and recording data when implementing other preference assessment or behavior analytic procedures. For example, a study could evaluate the use of email performance-based feedback on the implementation of a multiple stimulus with replacement preference assessment (DeLeon & Iwata, 1996), paired stimulus preference assessment (Fisher et al., 1992), or free operant preference assessment (Roane et al., 1998) and assess if participants engage in incorrect responses when calculating stimulus rankings or recording data. Additionally, future research could evaluate if another form of feedback (e.g., a video model; see DiGennaro-Reed et al., 2010) is required in order to correctly calculate results.

Extension of Previous Literature

The present study extends previous research in at least three ways. First, we standardized the email responses provided to each of the participants to control for variations in the email feedback. In previous studies, emails were not standardized beyond their general frameworks (e.g., general positive opening statements, supportive feedback, corrective feedback, request for response; Artman-Meeker & Hemmeter, 2013; Barton et al., 2016; Barton et al., 2020; Coogle et al., 2020; Gorton et al., 2021; Hemmeter et al., 2011; Martinez Cueto et al., 2021; Zhu et al., 2021).
As a result, it was unclear how researchers composed the email feedback for each participant and whether the feedback was specific or general, making replication of these studies difficult. In the present study, we used a pool of specific responses for each component of the email that the primary researcher pulled from. As a result, we were able to rule out researcher variation as a potential source of influence in the present study and demonstrated that providing standardized email responses can improve target behaviors. However, comparisons between customized and standardized email performance-based feedback would still need to be conducted. Additionally, future researchers should consider evaluating other feedback components (e.g., specific vs. general feedback, frequent vs. infrequent feedback, group vs. individual feedback, and source of feedback; Novak et al., 2019; Sleiman et al., 2020) within email feedback in order to further refine email performance-based feedback interventions.

Second, we implemented the training component (i.e., the MSWO training video) prior to the baseline condition in order to isolate the effects of the email feedback and to ensure that any changes that occurred between the baseline and intervention conditions were likely a result of the email feedback alone. The results of the present study demonstrated that providing email performance-based feedback alone can improve implementation of an MSWO preference assessment. Future research could evaluate if email feedback would increase individuals' procedural fidelity when implementing procedures such as match-to-sample, manding, imitation, and other behavioral interventions.
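The standardized email-composition approach described above (fixed components drawn from predetermined response pools) might be sketched as follows. The pool contents, function name, and templates here are illustrative placeholders, not the study's actual response pool.

```python
# Hypothetical sketch of assembling a standardized intervention email from
# fixed component pools; all templates below are illustrative assumptions.
import random

POOLS = {
    "opening": ["Thank you for joining me on Zoom and letting me observe you today!"],
    "closing": ["I hope you have a great rest of your day!"],
}

def compose_intervention_email(correct_steps, total_steps,
                               supportive, corrective, scheduling_question,
                               rng=random):
    """Assemble the five components: (a) positive opening, (b) supportive
    feedback, (c) corrective feedback, (d) request for a response,
    (e) positive closing."""
    return "\n\n".join([
        rng.choice(POOLS["opening"]),
        f"You completed {correct_steps} out of {total_steps} steps correctly! "
        + supportive,
        corrective,
        scheduling_question,
        rng.choice(POOLS["closing"]),
    ])

email = compose_intervention_email(
    23, 24,
    supportive="You rotated the remaining items correctly on every trial.",
    corrective="On one trial, the selected item was not removed from view after 30 seconds.",
    scheduling_question="Are you still available to meet Friday at 12:30pm?",
)
print(email)
```

Drawing each component from a fixed pool, rather than writing free-form feedback, is what allows researcher variation to be ruled out as an influence on participant behavior.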
Additionally, future researchers could continue to evaluate the use of email feedback with behavior analytic procedures that consist of varying numbers of steps and increasing difficulty (e.g., functional analyses, functional communication training) to understand the conditions under which email feedback may or may not have functional value in improving employee performance.

Third, we used a confederate with a specific list of responses to perform during each research session. Previous literature (i.e., Artman-Meeker & Hemmeter, 2013; Barton et al., 2016; Barton et al., 2020; Coogle et al., 2020; Gomez et al., 2021; Gorton et al., 2021; Hemmeter et al., 2011; Martinez Cueto et al., 2021) did not control for variation in the instructional environment, as those studies involved classroom settings where teachers engaged with students. As a result, opportunities for participants to respond may have been at least partially affected by student behavior within those settings. By using a confederate, we eliminated or reduced variation in the instructional environment by holding errors and correct responses constant across all research sessions. Additionally, participants were exposed to each confederate response during each research session, which mimicked the responses a child with ASD may engage in during an MSWO preference assessment. As a result, we were able to evaluate participant behavior in the presence of responses likely encountered in an applied setting across the entire experiment. However, in the present study we did not assess if high procedural fidelity in the presence of a confederate would generalize to a child with ASD. Future research could evaluate if a participant's fidelity of implementation would generalize to implementing an MSWO preference assessment with a child with ASD after receiving email feedback based on their performance implementing the MSWO with a confederate.

Limitations

Several limitations of the present study should be noted.
The first limitation of this study was that it was unclear what specific components (e.g., supportive feedback, corrective feedback) of the email performance-based feedback were responsible for increases in participant procedural fidelity. Second, though all participants achieved high levels of procedural fidelity, the extent to which these gains maintain or persist long-term is unknown because we only evaluated the immediate effects the email performance-based feedback had on implementation of the MSWO preference assessment. Third, we evaluated participant responding in the presence of a confederate instead of an individual with ASD. Although the use of a confederate allowed the researchers to control for variation in the instructional environment and provided opportunities for participants to respond to multiple learner responses, the extent to which high levels of procedural fidelity would generalize to individuals with ASD is unclear. Previous training studies (e.g., Lipschultz et al., 2015) that used a confederate and then evaluated participant performance in the presence of a child found that high levels could be achieved. However, future research could evaluate the effects of email performance-based feedback on the implementation of an MSWO preference assessment with individuals with ASD in person or within a remote context. Finally, to our knowledge, this was the first study to evaluate the effects of email performance-based feedback on a procedure that consists of more than eight discrete steps. Future research could continue to evaluate email performance-based feedback on other multiple-step behavior analytic procedures (e.g., implementing a gross motor imitation program, conducting a functional analysis) in order to understand the extent to which email feedback can be used. Though email performance-based feedback appears to be helpful, until future research is conducted, we urge caution in viewing it as a substitute for in-person feedback.
Instead, email may be considered another tool to delivering high-quality feedback to help teachers improve fidelity of implementation of behavioral procedures. 263 APPENDICES 264 APPENDIX A Tables 4.1 - 4.2 and Figures 4.1 - 4.3 Table 4.1. Confederate Responses During MSWO Research Sessions Response After the Participant Says, “Pick One or Choose One” Immediately select one stimulus from the array presented (one item) Wait 7 s and then select one stimulus from the array presented (one item) Immediately select two stimuli simultaneously (simultaneous) Wait 7 s and then select two stimuli simultaneously (simultaneous) Immediately select one stimulus and then select another stimulus from the array presented (sequential) Wait 7 s and then select one stimulus and then select another stimulus from the array presented (sequential) Do not select a stimulus from the array presented (no response) 265 Table 4.2. Average IOA scores for each participant across the two conditions. Participant Baseline Intervention Riley 91.7% 100.0% Olivia 97.9% 100.0% Ava 98.6% 100.0% Layla 97.9% 99.0% Ellie 97.9% 98.6% Kennedy 94.5% 97.9% 266 Figure 4.1. Percentage of Correct Implementation for Three Participants (Riley, Olivia, and Amy) Across Conditions Baseline Intervention 100 90 80 70 60 50 40 30 20 10 Riley Percentage of Correct Implementation 0 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 100 90 80 70 60 50 40 30 20 10 Olivia 0 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 100 90 80 70 60 50 40 30 20 10 Ava 0 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 Sessions 267 Figure 4.2. 
Percentage of Correct Implementation for Three Participants (Riley, Olivia, and Amy) Across Conditions Depicting the Nonconcurrent Session Schedule Baseline Intervention 100 90 80 70 60 50 40 30 20 10 Riley Percentage of Correct Implementation 0 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 100 90 80 70 60 50 40 30 20 10 Olivia 0 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 100 90 80 70 60 50 40 30 20 10 Ava 0 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 Days 268 Figure 4.3. Percentage of Correct Implementation for Three Participants (Layla, Ellie, and Kennedy) Across Conditions Baseline Intervention 100 90 80 70 60 50 40 30 20 10 Layla Percentage of Correct Implementation 0 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 100 90 80 70 60 50 40 30 20 10 Ellie 0 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 100 90 80 70 60 50 40 30 20 10 Kennedy 0 0 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 Sessions 269 APPENDIX B Multiple Stimulus Without Replacement (MSWO Data Sheet) Multiple Stimulus Without Replacement (MSWO) Data Sheet Item A: __________________________________ Item B: __________________________________ Participant #: ________________________ Item C: __________________________________ Item D: __________________________________ Date: _____________________________ Item E: __________________________________ 1 2 3 Trial Selected Item Trial Selected Item Trial Selected Item 1 1 1 2 2 2 3 3 3 4 4 4 5 5 5 Results = # of times item was selected / Results: Rank: # of times item was available during three A: ______________ D. ______________ __. _____________ __. _____________ sessions x 100% B: ______________ E. ______________ __. _____________ __. _____________ C: ______________ __. 
_____________ 270 APPENDIX C Multiple Stimulus Without Replacement (MSWO) Excerpt from DeLeon & Iwata (1996) Response Measurement A selection response was recorded when the participant made physical contact with one of the presented items. The participant had 10 s to select an item. If the participant made contact with more than one item, the first item contacted was recorded as the selection. If no item was selected within the 10-s period, the trial ended. The procedures following a no-selection trial varied across presentation methods (see below). When a selection was made, the trial ended after the participant received 30-s access to the item (leisure stimuli). MSWO Description For this assessment procedure, each session began with all items sequenced randomly in a straight line on the table, about 5 cm apart. While a participant was seated at the table approximately 0.3 m from the stimulus array, the experimenter instructed the participant to select one item. After a selection was made, the item was removed from the immediate area (leisure item). Prior to the next trial, the sequencing of the remaining items were rotated by taking the item at the left end of the line and moving it to the right end, then shifting the other items so that they were again equally spaced on the table. The second trial then followed immediately. This procedure continued until all items were selected or until a participant made no selection within 10 s from the beginning of a trial. In the latter case, the session ended and all remaining items were recorded as “not selected”. 271 APPENDIX D Session Script Prior to Session • Have Zoom meeting set up at least five minutes before the scheduled meeting time During Session (once participant joins) Step of Session Script Initial Greeting “Hi, how are you today?” Assessment Excerpt “Now, I will share the assessment excerpt with you. 
You have up to 15 minutes to read it, but feel free to let me know at any time once you are done reading it.” “ Now I will share the assessment excerpt with you. You have up to 15 minutes to review it, but feel free to let me know at any time once you are done reviewing it. If you would like me to re-read sentences please let me know.” Label Items “Okay, before we begin can you tell me what your items are?” Begin Assessment “Okay, now you can begin implementing the assessment. Once you are finished, please let me know.” Participant Asks Question “I am unable to answer that question at this time. Please do your best.” Participant Indicates They Have “Okay, once you are done with the results, please let me Completed the Assessment know.” “Can you please display your data sheet on the screen for a few seconds?” or “Can you please send a picture or the document to my email?” “We are all set, thank you for attending the session today! I will be sending a follow up email within 24 hours.” 272 APPENDIX E Multiple Stimulus Without Replacement (MSWO) Confederate Response Data Sheet Example Multiple Stimulus Without Replacement (MSWO) Confederate Response Data Sheet Procedural Fidelity Confederate Response Data Sheet #1 Confederate: _______________________ Data Collector: _______________________ Date: _____________________________ 1 Trial Response Yes No Wait 7 seconds, select 1, then select 1 another (sequential) Immediately select 2 at the same time 2 (simultaneous) 3 Wait 7 seconds, select 1 (one item) 4 Immediately select 1 (one item) 5 Do not select (no response) 2 Trial Response Yes No Wait 7 seconds, select 1, then select 1 another (sequential) Immediately select 2 at the same time 2 (simultaneous) 3 Wait 7 seconds, select 1 (one item) 4 Immediately select 1 (one item) 5 Do not select (no response) 3 Trial Response Yes No 1 Immediately select 1 (one item) Immediately select 1, then select another 2 (sequential) 3 Immediately select 1 (one item) Wait 7 seconds, select 
2 at the same time 4 (simultaneous) 5 Immediately select 1 (one item) 273 APPENDIX F Multiple Stimulus Without Replacement (MSWO) Procedural Fidelity Data Sheet Page 1 Multiple Stimulus without Replacement (MSWO) Procedural Fidelity Participant: ______________ Session #:_______________ Date: ___________________ Researcher:______________ Assigned Confederate Response Data Sheet #:________ Data Collector: ___________ Main data / IOA data Always Sometimes Never Occurred Occurred Occurred Prior to Session 1. Gathers all materials (i.e., five stimuli, data sheet, pencil/pen, timer, computer) 2. Records item names on MSWO data sheet Prior to Each Trial 3. Sets up computer on the table, so the confederate can see all of the items 4. Places all items randomly in a straight line within the confederate’s view 5. Ensures confederate is attending to the items (looking at you or the items) 6. Says “pick one” or “choose one” Following a One Item Selection Response 7. Engages with the selected leisure item for 30 seconds 8. Removes all other items from the table or pulls them back/to the side 9. Records data (indicates item selected) on data sheet 10. Places selected item out of the confederate’s view after 30 seconds 274 Multiple Stimulus Without Replacement (MSWO) Procedural Fidelity Data Sheet Page 2 Multiple Stimulus without Replacement (MSWO) Procedural Fidelity 11. Represents remaining items and rotates by taking the item at the left end of the line and moving to the right end, then shifting the other items so they are equally spaced on the table 12. Repeats steps 5 and 6 until all items have been selected or a 10 second period has elapsed with no selection Following a Two Item (Simultaneously) Selection Response 13. Removes all items from the table or pulls them back/to the side and waits 5 seconds 14. Represents items in the same line up (without rotating items) 15. Repeats steps 5 and 6 Following a Two Item (Sequentially) Selection Response 16. 
Ignores second selected item and continues to engage with first selected item for 30 seconds 17. Removes all other items from the table or pulls them back/to the side 18. Records data (indicates item selected) on data sheet 19. Places selected item out of the confederate’s view after 30 seconds 20. Represents remaining items and rotates by taking the item at the left end of the line and moving to the right end, then shifting the other items so they are equally spaced on the table 21. Repeats steps 5 and 6 until all items have been selected or a 10 second period has elapsed with no selection Following No Response 22. If the confederate does not make a selection response after 10 seconds, end the block and record remaining items as not selected “NS” 275 Multiple Stimulus Without Replacement (MSWO) Procedural Fidelity Data Sheet Page 3 Multiple Stimulus without Replacement (MSWO) Procedural Fidelity After MSWO Blocks 23. Calculates top preferred item 24. Data are accurate Notes: Score 276 APPENDIX G Researcher Procedural Fidelity Data Sheet Page 1 Session #:_______________ Individual Observed:______________ Date: ___________________ Data Collector: ___________ Assigned Confederate Response Data Sheet #: __________ Main data / IOA data Researcher/Confederate Procedural Fidelity Yes No (Always (Sometimes/Never Occurred) Occurred) Prior to Session 1. Has Zoom meeting set up at least five minutes before the scheduled meeting time Beginning of Session 2. Shares the MSWO excerpt via Zoom with the participant 3. Allows the participant to read the excerpt for up to 15 minutes 4. Asks the participant to identify what items they are using 5. Tells the participant they can now begin implementing the MSWO During Session 6. Engages in correct selection response, according to assigned confederate response data sheet for each trial (see assigned confederate response data sheet) a. 
For sequential and/or simultaneous confederate responses, if the participant engages with both of the items back to back without representing the items, skipped the trial immediately following and moved onto the next trial. i. Example: Trial #1 was sequential, participant engaged with two items without representing, then presented only three items, the confederate skipped trial #2 and moved onto trial #3. 277 Researcher Procedural Fidelity Data Sheet Page 2 b. For no response confederate responses that occur in trials #1-4, if the participant continues to represent remaining items, instead of recording them as not selected, continue to engage in no responses. c. If the participant engages in errors (such as those noted above) and the confederate has completed all of the predetermined trials, continue by engaging in no response trials until the participant indicates they have completed the MSWO. 7. Does not provide feedback during the session End of Session 8. Once the participant indicates they have completed the MSWO or the participant does not engage in a response for two minutes, tells the participant to let them know once they are done with their results. 9. Once the participant has calculated their results, asks the participant to display their data sheet on the screen 10. Thanks the participant for attending the session Following a Baseline Session 11. Sends baseline email following the session. The email includes all of the necessary components a. General positive opening statement b. Request for a response c. Positive closing statement Following an Intervention Session 12. Sends intervention email following the session. The email includes all of the necessary components a. General positive opening statement b. Supportive feedback c. Corrective feedback d. Request for a response e. Positive closing statement Notes: Score 278 APPENDIX H Baseline Email Example Hi Lauren, Thank you for joining me on Zoom and letting me observe you today! 
Are you still available to meet on Friday, December 4th at 12:30pm? We hope you have a great rest of your day! Best, Emma 279 APPENDIX I Intervention Email Example Hi Lauren, Thank you for joining me on Zoom and letting me observe you today! You completed 23 out of 24 steps correctly during the session! Two examples of correct steps were: (1) following a two-item selection (simultaneous), you removed all items from the table or pulled them back/to the side and waited 5 seconds and (2) the data were accurate! In each email I’m going to also give you some feedback about steps that you may not have completed correctly. One step was not completed correctly (see below). Following a One Item Selection Response 1. Be sure to rotate the remaining items by taking the item at the left end of the line and moving it to the right end and then shifting the other items, so they were equally spaced on the time. On one trial, you did not rotate the items. Are you still available to meet today (12/11) at 4:00pm? I hope you have a great morning! Best, Emma 280 REFERENCES 281 REFERENCES Artman-Meeker, K. M., & Hemmeter, L. M. (2013). Effects of training and feedback on teachers’ use of classroom preventative practices. Topics in Early Childhood Special Education, 33(2), 112-123. https://doi.org/10.1177/02711214122447115 Barton, E. E., Fuller, E. A., & Schnitz, A. (2016). The use of email to coach preservice early childhood teachers. Topics in Early Childhood Special Education, 36(2), 78-90. https://doi.org/10.1177/0271121415612728 Barton, E. E., Velez, M., Pokorski, E. A., & Domingo, M. (2020). The effects of email performance based feedback delivered to teaching teams: A systematic replication. Journal of Early Intervention, 42(2), 143-162. https://doi.org/10.1177/1053815119872451 Barton, E. E., & Wolery, M. (2007). Evaluation of e-mail feedback on the verbal behaviors of pre-service teachers. Journal of Early Intervention, 30(1), 55-72. 
https://doi.org/10.1177/105381510703000105 Bottini, S., & Gillis, J. (2021). A comparison of the feedback sandwich, constructive-positive feedback, and within session feedback for training preference assessment implementation. Journal of Organizational Behavior Management, 41(1), 83-93. https://doi.org/10.1080/01608061.2020.1862019 Brodhead, M. T. (2022, March 24). Training video. osf.io/xhdzw Chazin, K.T. & Ledford, J.R. (2016). Multiple stimulus without replacement (MSWO) preference assessment. In Evidence-based instructional practices for young children with autism and other disabilities. Retrieved from http://ebip.vkcsites.org/multiple-stimulus- without-replacement Coogle, C. G., Ottley, J. R., Storie, S., Rahn, N. L., & Kurowski-Burt, A. (2017). ECoaching to enhance special educator practice and child outcomes. Infants and Young Children, 30(1), 58-75. https://doi.org/10.1097/IYC.0000000000000082 Coogle, C. G., Ottley, J. R., Storie, S., Rahn, N. L., & Kurowski-Burt, A. (2020). Performance- based feedback to enhance preservice teachers’ practice and preschool children’s expressive communication. Journal of Teacher Education, 7(2), 188-202. https://doi.org/10.1177/0022487118803583 Coogle, C. G., Rahn, N. L., Ottley, J. R., & Storie, S. (2016). ECoaching across routines to enhance teachers’ use of modeling. Teacher Education and Special Education, 39(4), 227-245. https://doi.org/10.1177/0888406415621959 282 Cooper, J. O., Heron, T. E., & Heward, W. L. (2007). Applied behavior analysis (2nd ed.). Pearson Education. Cooper, J. O., Heron, T. E., & Heward, W. L. (2020). Applied behavior analysis (3rd ed.). Pearson Education. Cox, D. J., & Brodhead, M. T. (2021). A proof of concept analysis of decision-making with time-series data. The Psychological Record, 71(2), 349-366. https://doi.org/10.1007/s40732-020-00451-w DeLeon, I. G., & Iwata, B. A. (1996). Evaluation of a multiple-stimulus presentation format for assessing reinforcer preferences. 
Journal of Applied Behavior Analysis, 29(4), 519-533. https://doi.org/10.1901/jaba.1996.29-519

DeLeon, I. G., Iwata, B. A., & Roscoe, E. M. (1997). Displacement of leisure reinforcers by food during preference assessments. Journal of Applied Behavior Analysis, 30(3), 475-484. https://doi.org/10.1901/jaba.1997.30-475

DiGennaro Reed, F. D., Codding, R., Catania, C. N., & Maguire, H. (2010). Effects of video modeling on treatment integrity of behavioral interventions. Journal of Applied Behavior Analysis, 43(2), 291-295. https://doi.org/10.1901/jaba.2010.43-291

Fisher, W. W., Piazza, C. C., Bowman, L. G., Hagopian, L. P., Owens, J. C., & Slevin, I. (1992). A comparison of two approaches for identifying reinforcers for persons with severe and profound disabilities. Journal of Applied Behavior Analysis, 25(2), 491-498. https://doi.org/10.1901/jaba.1992.25-491

Gast, D. L., & Ledford, J. R. (2014). Single case research methodology: Applications in special education and behavioral sciences (2nd ed.). Routledge.

Gast, D. L., Lloyd, B. P., & Ledford, J. R. (2014). Multiple baseline and multiple probe designs. In D. L. Gast & J. R. Ledford (Eds.), Single subject research methodology: Applications in special education and behavioral sciences (pp. 252-296). Routledge.

Gomez, L., Barton, E. E., Winchester, C., & Locchetta, B. (2021). Effects of email performance feedback on teachers' use of play expansions. Journal of Early Intervention, 43(3), 235-254. https://doi.org/10.1177/1053815120969821

Gorton, K., Allday, R. A., Lane, J. D., & Ault, M. J. (2021). Effects of brief training plus electronic feedback on increasing quantity and intonation of behavior specific praise among preschool teachers. Journal of Behavioral Education. Advance online publication. https://doi.org/10.1007/s10864-020-09427-w

Groskreutz, N. C., Groskreutz, M. P., & Higbee, T. S. (2011). Effects of varied levels of treatment integrity on appropriate toy manipulation in children with autism.
Research in Autism Spectrum Disorders, 5(4), 1358-1369. https://doi.org/10.1016/j.rasd.2011.01.018

Hemmeter, M. L., Snyder, P., Kinder, K., & Artman, K. (2011). Impact of performance feedback delivered via electronic mail on preschool teachers' use of descriptive praise. Early Childhood Research Quarterly, 26(1), 96-109. https://doi.org/10.1016/j.ecresq.2010.05.004

Iwata, B. A., Wallace, M. D., Kahng, S., Lindberg, J. S., Roscoe, E. M., Conners, J., Hanley, G. P., Thompson, R. H., & Worsdell, A. S. (2000). Skill acquisition in the implementation of functional analysis methodology. Journal of Applied Behavior Analysis, 33(2), 181-194. https://doi.org/10.1901/jaba.2000.33-181

Kratochwill, T. R., Hitchcock, J. H., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2013). Single-case intervention research design standards. Remedial and Special Education, 34, 26-38. https://doi.org/10.1177/0741932512452794

Krick Oborn, K. M., & Johnson, L. D. (2015). Coaching via electronic performance feedback to support home visitors' use of caregiver coaching strategies. Topics in Early Childhood Special Education, 35(3), 157-169. https://doi.org/10.1177/0271121415592411

Ledford, J. R., Wolery, M., & Gast, D. L. (2014). Controversial and critical issues in single case research. In D. L. Gast & J. R. Ledford (Eds.), Single subject research methodology: Applications in special education and behavioral sciences (pp. 377-396). Routledge.

Lipschultz, J. L., Vladescu, J. C., Reeve, K. F., Reeve, S. A., & Dipsey, C. R. (2015). Using video modeling with voiceover instruction to train staff to conduct stimulus preference assessments. Journal of Developmental and Physical Disabilities, 27(4), 505-532. https://doi.org/10.1007/s10882-015-9434-4

Maple Tech International LLC (2022, February). Random number generator. https://www.calculator.net/random-number-generator.html

Martinez Cueto, A. P., Barton, E. E., & Bancroft, J. C. (2021).
The effects of training and performance feedback on preservice teachers' use of statements that promote preschool children's social interactions. Journal of Positive Behavior Interventions. Advance online publication. https://doi.org/10.1177/1098300721994200

Miltenberger, R. G. (2012). Behavior skills training procedures. In R. G. Miltenberger (Ed.), Behavior modification: Principles and procedures (5th ed., pp. 217-235). Wadsworth.

Novak, M. D., DiGennaro Reed, F. D., Erath, T. G., Blackman, A. L., Ruby, S. A., & Pellegrino, A. J. (2019). Evidence-based performance management: Applying behavioral science to support practitioners. Perspectives on Behavior Science, 42(4), 955-972. https://doi.org/10.1007/s40614-019-00232-z

O'Handley, R. D., Pearson, S., Taylor, C., & Congdon, M. (2021). Training preservice school psychologists to conduct a stimulus preference assessment. Behavior Analysis in Practice, 14(2), 445-450. https://doi.org/10.1007/s40617-020-00537-5

Park, J., Johnson, D. A., Moon, K., & Lee, J. (2019). The interaction effects of frequency and specificity of feedback on work performance. Journal of Organizational Behavior Management, 39(3-4), 164-178. https://doi.org/10.1080/01608061.2019.1632242

Parker, R. I., Vannest, K. J., Davis, J. L., & Sauber, S. B. (2011). Combining nonoverlap and trend for single case research: Tau-U. Behavior Therapy, 42(2), 284-299. https://doi.org/10.1016/j.beth.2010.08.006

Roane, H. S., Vollmer, T. R., Ringdahl, J. E., & Marcus, B. A. (1998). Evaluation of a brief stimulus preference assessment. Journal of Applied Behavior Analysis, 31(4), 605-620. https://doi.org/10.1901/jaba.1998.31-605

Rosales, R., Gongola, L., & Homlitas, C. (2015). An evaluation of video modeling with embedded instructions to teach implementation of stimulus preference assessments. Journal of Applied Behavior Analysis, 48(1), 209-214. https://doi.org/10.1002/jaba.174

Roscoe, E. M., Fisher, W. W., Glover, A. C., & Volkert, V. M. (2006).
Evaluating the relative effects of feedback and contingent money for staff training of stimulus preference assessments. Journal of Applied Behavior Analysis, 39(1), 63-77. https://doi.org/10.1901/jaba.2006.7-05

Schles, R. A., & Robertson, R. E. (2019). The role of performance feedback and implementation of evidence-based practices for preservice special education teachers and student outcomes: A review of the literature. Teacher Education and Special Education, 42(1), 36-48. https://doi.org/10.1177/0888406417736571

Shapiro, M., & Kazemi, E. (2017). A review of training strategies to teach individuals implementation of behavioral interventions. Journal of Organizational Behavior Management, 37(1), 32-62. http://dx.doi.org/10.1080/01608061.2016.1267066

Sidman, M. (1960). Tactics of scientific research: Evaluating experimental data in psychology. Authors Cooperative.

Sipila-Thomas, E. S., Foote, A. J., White, A. N., Melanson, I. J., & Brodhead, M. T. (2020). A replication of preference displacement research in children with autism spectrum disorder. Journal of Applied Behavior Analysis, 54(1), 403-416. https://doi.org/10.1002/jaba.775

Sleiman, A. A., Sigurjonsdottir, S., Elnes, A., Gage, N. A., & Gravina, N. E. (2020). A quantitative review of performance feedback in organizational settings (1998-2018). Journal of Organizational Behavior Management, 40(3-4), 303-332. https://doi.org/10.1080/01608061.2020.1823300

Slocum, T. A., Pinkelman, S. E., Joslyn, P. R., & Nichols, B. (2022). Threats to internal validity in multiple-baseline design variations. Perspectives on Behavior Science. Advance online publication. https://doi.org/10.1007/s40614-022-00326-1

Vladescu, J. C., Carroll, R., Paden, A., & Kodak, T. M. (2012). The effects of video modeling with voiceover instruction on accurate implementation of discrete-trial instruction. Journal of Applied Behavior Analysis, 45(2), 419-423. https://doi.org/10.1901/jaba.2012.45-419

Warrilow, G. D., Johnson, D. A., & Eagle, L. M.
(2020). The effects of feedback modality on performance. Journal of Organizational Behavior Management, 40(3-4), 233-248. https://doi.org/10.1080/01608061.2020.1784351

Zhu, J., Bruhn, A., Yuan, C., & Wang, L. (2021). Comparing the effects of videoconference and email feedback on treatment integrity. Journal of Applied Behavior Analysis, 54(2), 618-635. https://doi.org/10.1002/jaba.810

CHAPTER 5

Discussion

Effective supervision is critical to the field of ABA because it improves the quality of services provided to the recipients of behavioral services (Britton & Cicoria, 2019; LeBlanc & Luiselli, 2016; LeBlanc et al., 2020; Turner et al., 2016). Effective supervision also benefits the organization in which behavioral services are provided because it improves the quality of services that all clients receive, which in turn increases client protection and helps portray the field of ABA as one that is committed to socially significant behavior change (Brodhead & Higbee, 2012; Hartley et al., 2016). Without effective supervision, the quality of services may decrease and negatively impact treatment outcomes for clients (Britton & Cicoria, 2019; Dixon et al., 2016; Eikeseth, 2009; LeBlanc & Luiselli, 2016; Shapiro & Kazemi, 2017). Given the rapid growth of the field of ABA, supervision will continue to play a critical role in training, fostering the growth and development of professionals, and ensuring those professionals uphold the high standards of the profession (Hajiaghamohseni et al., 2020; Turner et al., 2016; Turner, 2017).
The current dissertation addressed and evaluated supervision of behavior analytic services provided to individuals with ASD in three different contexts: (a) supervision provided during the implementation of behavioral interventions, (b) supervision provided via Telehealth, specifically evaluating barriers and strategies used to address and/or mitigate those barriers, and (c) supervision provided via Telehealth in the form of email performance-based feedback. Collectively, these chapters sought to address gaps in the current behavior analytic supervision literature and identify additional areas of study.

Supervision Process

As mentioned in Chapter 1, the supervision process typically consists of three main components: (a) when supervision is provided, (b) how supervision is provided, and (c) what supervision consists of. Future research and practice recommendations regarding the three components of the supervision process will be discussed below.

When is Supervision Provided?

Supervision is provided to supervisees when they are implementing behavioral interventions with individuals with ASD, conducting behavior analytic assessments, and developing and/or selecting behavior-change procedures to be implemented with a client (BACB, 2019). In the present dissertation, Chapter 2 focused on supervision provided during the implementation of behavioral interventions. Specifically, Chapter 2, a systematic literature review, evaluated the extent to which recently published articles included information regarding supervision and staff training of the individuals implementing behavioral interventions with young children with ASD. The results of Chapter 2 revealed that less than 30.0% of articles that evaluated behavioral interventions with children five years of age or younger with ASD reported information regarding supervision.
This finding supports Romanczyk and colleagues' (2014) finding that supervision characteristics are not typically reported in articles evaluating behavioral interventions. The results of Chapter 2 also revealed that less than 11.0% of articles indicated the amount of training individuals received prior to implementing the behavioral interventions, and less than 7.0% reported the professional qualifications of the trainers. This finding extends and addresses a limitation of the Romanczyk et al. (2014) review, as they did not evaluate staff training variables. Our findings agree with Romanczyk et al. (2014) that there is little to no consensus on reporting supervision and staff training characteristics in the current behavioral intervention literature. These findings also align with related staff training and behavior analytic literature showing that training content, training protocols, and demographic variables are not consistently reported in scholarly literature (Gormley et al., 2020; Jones et al., 2020). The lack of reporting about supervision and training is problematic because it could indicate that supervision or staff training did not occur at all. However, we find this to be unlikely because supervision and training are necessary during behavioral intervention implementation (Shapiro & Kazemi, 2017); therefore, it is likely that some type of supervision and/or training occurred. Nevertheless, the omission of these details leaves questions unanswered (e.g., how much supervision or training occurred, who provided the supervision or training, what type of supervision and training occurred). Below, key findings are summarized and research recommendations are provided. Although the recommendations are tailored to the behavioral intervention literature, they can and should be applied to the other contexts in which supervision is provided (i.e., conducting behavior analytic assessments, developing and/or selecting behavior-change procedures to be implemented with a client).
Research Recommendations

Based on the findings from Chapter 2, we strongly encourage future researchers to report information regarding both the supervision and staff training characteristics of their studies. Reporting supervision and staff training characteristics (e.g., dosage, frequency, who provided supervision and/or training, what the supervision process included, what the training consisted of) and providing adequate descriptions of both, including the materials and/or protocols used, will help address issues of replication and of research translating to practice in the field of ABA (Gormley et al., 2020). Furthermore, reporting supervision and staff training characteristics will provide researchers with the information (e.g., models of supervision that have been evaluated) necessary to evaluate supervision and staff training as independent variables (see Dixon et al., 2016 for an example). The evaluation of supervision and staff training as independent variables will provide the field of ABA with information regarding how much supervision or training is required (e.g., 3 hr) and which supervision and staff training models (e.g., behavioral skills training; DiGennaro Reed et al., 2018; Slane & Lieberman-Betz, 2021) are most effective in producing the greatest client outcomes (Valentino, 2021). Finally, reporting supervision and staff training variables in the literature will result in improved literature reviews and meta-analyses that will help further improve the field of ABA by informing research and practice (Roth et al., 2010). Consistent reporting of supervision and staff training variables will allow for in-depth analyses of these variables (e.g., the impact on treatment outcomes when adequate supervision and/or staff training is not achieved) and allow additional questions to be asked (e.g., do different levels of procedural fidelity during training impact treatment outcomes; Strain et al., 2021).
Having the ability to ask, and answer, additional questions about supervision and staff training variables will provide the field of ABA with more informative research that can be disseminated to practitioners and, as a result, positively impact treatment outcomes for clients. Future research may consider extending this literature review by evaluating supervision and staff training characteristics within behavioral interventions implemented with individuals with ASD over the age of five. Additionally, future research should consider evaluating the extent to which supervision and staff training variables are reported in the research base on conducting behavior analytic assessments. Extending the present literature review to include articles evaluating behavioral interventions implemented with individuals with ASD over the age of five and articles on conducting behavior analytic assessments will provide the field of ABA with additional insight into the extent to which supervision and staff training characteristics go unreported. If similar results are found in the extended literature reviews, the field of ABA may consider making a larger call to action for researchers to report supervision and staff training characteristics in their studies. Future research may also consider evaluating studies that included supervision or staff training as independent variables (e.g., Dixon et al., 2016; Eikeseth et al., 2009; Rios et al., 2020) to assess the effects of supervision and staff training on treatment outcomes and to identify which aspects of supervision and staff training are critical to improving treatment outcomes. Lastly, future research may consider evaluating the extent to which supervision and staff training characteristics are reported or evaluated in behavioral interventions implemented in a variety of contexts (e.g., Telehealth, hybrid).
This additional information will help researchers make informed decisions about which supervision or staff training variables to consider based on the context they are in, or help researchers identify areas that require further evaluation of supervision and staff training.

How is Supervision Provided?

Supervision can be provided in a variety of modalities, including face-to-face, remotely via Telehealth, or hybrid (i.e., a combination of the two). Previous supervision literature has primarily focused on supervision within face-to-face contexts (e.g., Sellers et al., 2019). Alternatively, previous Telehealth literature has primarily focused on evaluating Telehealth when providing services to clients (e.g., Gibson et al., 2010; Suess et al., 2014) and training individuals to conduct assessments (e.g., Alnemary et al., 2015; Lindgren et al., 2016; Neely et al., 2016). However, the research base on providing supervision via Telehealth is limited. Chapter 3 described a study that evaluated the barriers BCBAs experienced and the strategies BCBAs used to address and/or mitigate the barriers that arose when providing supervision via Telehealth. Unsurprisingly, the results of Chapter 3 revealed that BCBAs who provide supervision via Telehealth are not exempt from experiencing barriers. However, the results of Chapter 3 suggest that the barriers supervisors face when providing supervision via Telehealth differ from those faced in a face-to-face context (as reported in Sellers et al., 2019). This finding seems reasonable given that a different context (e.g., Telehealth) may demand its own unique considerations. The results of Chapter 3 also revealed that only 78.0% of participants reported that they had received training on how to provide supervision via Telehealth.
Compared with Hajiaghamohseni and colleagues' (2020) finding that 99.1% of participants had prior supervision training, this number is alarmingly low, with 21.7% of participants reporting that they did not receive training on how to provide supervision via Telehealth. Regardless of the level of expertise a BCBA has when providing supervision, it is critical that they receive training in the relevant context, as skills from a face-to-face context may not transfer to a remote context (Fischer et al., 2017; Lerman et al., 2020). Below, research and practice recommendations are provided; though the focus is on supervision via Telehealth, the recommendations can also be applied to other contexts (i.e., face-to-face, hybrid).

Research Recommendations

Though we identified barriers experienced during supervision via Telehealth, it is unclear how often the reported barriers occurred, as we did not ask specific questions regarding the frequency of barriers. Based on this finding, the first research recommendation is to evaluate the frequency with which reported barriers occur and whether that frequency changes over time (e.g., is the barrier more likely to occur at the beginning of the supervision process, or does it occur consistently throughout). Knowing the frequency of barriers will provide additional insight to researchers and organizations about barriers that could potentially occur, and they could begin to explore ways to prevent the barriers from occurring altogether. Though participants reported strategies used to address and/or mitigate barriers that arose, we did not assess the effectiveness of those strategies. As a result, it remains unclear whether the strategies used were effective in eliminating barriers or decreasing their frequency.
Future research should consider evaluating the effectiveness of strategies by conducting single case research design studies that use various strategies to address and/or mitigate barriers that typically occur when providing supervision via Telehealth. Once effective strategies are identified, future research may consider evaluating supervisors' and supervisees' perceptions of the strategies and how often they would be willing to implement each strategy throughout the supervision process. This is critical information, as it will help bridge the research-to-practice gap within supervision and, in turn, identify effective strategies that supervisors would be willing to implement. Finally, as technology continues to rapidly change (Fischer et al., 2017), future research may consider evaluating and comparing various types of technology and software for providing supervision via Telehealth in order to inform practitioners about which technology and software they should or should not consider using. Additionally, future research may consider evaluating the various formats (e.g., video conferencing, phone calls) supervisors use to provide supervision via Telehealth in order to gain a better understanding of which features of Telehealth are most important to providing supervision. Overall, researchers should continue to evaluate the supervision process provided via Telehealth to understand the full scope and limitations of using Telehealth.

Practice Recommendations

Given the results of Chapter 3, we also created a list of practice recommendations to help improve supervisory practices. It is our hope that this dissertation adds to the growing body of supervision literature that provides practical advice behavior analysts can use throughout the supervision process (e.g., Garza et al., 2018; Sellers et al., 2016; Valentino et al., 2016).
The first practice recommendation is that supervision, and training in how to provide supervision, should be tailored to the context in which services are provided. The training should include a review of the research base for that particular context (e.g., the research base on remote supervision), practical implications, and ethical considerations (e.g., using HIPAA-compliant technology and software; Fischer et al., 2017). Second, an organization should provide training to all BCBAs on how to provide supervision within a given context (e.g., Telehealth, hybrid) prior to beginning the supervisory process. Third, the organization should evaluate each supervisor's technical competence (i.e., the supervisor's knowledge and ability to utilize all necessary technology and software, establish and maintain connectivity, and ensure consumer protection) and determine any areas that may require additional training prior to beginning the supervision process (Barnett & Kolmes, 2016; Fischer et al., 2017). Fourth, throughout the supervision process, the organization should track the barriers their employees commonly experience and use that information to guide future iterations of the trainings that are provided. Once the organization has identified frequently occurring barriers, it could develop a resource of potential strategies that can be used to address and/or mitigate barriers that arise (see Table 3.13 and Lee et al., 2015 for examples). The organization should also invest in and make available resources that reduce barriers, especially resources related to technology (if applicable).

What Does Supervision Consist of?

Though there are multiple activities that are critical to the success of the supervision process, the present dissertation focused on the supervisor providing performance-based feedback to the supervisee.
Performance-based feedback can be delivered in a variety of formats, including verbal or written forms (e.g., visible counters, public wall postings, verbal interactions; Coogle et al., 2017; Warrilow et al., 2020), or remotely through the use of technology (e.g., bug-in-ear devices, text messages, or emails; Barton & Wolery, 2007; Coogle et al., 2016; Hemmeter et al., 2011). Additionally, performance-based feedback can vary along a number of characteristics, including the feedback source (e.g., supervisor), feedback frequency (e.g., daily), feedback privacy (e.g., public posting, private feedback), and feedback content (e.g., comparison of an individual's performance to their previous performance; Alvero et al., 2001; Sleiman et al., 2020). Several literature reviews have established performance-based feedback (using a variety of the aforementioned forms and characteristics) as an evidence-based practice (Cornelius & Nagro, 2014; Fallon et al., 2015). However, there is a limited research base evaluating the effects of performance-based feedback on behaviors relevant to supervision (Turner, 2017). As technology continues to evolve and become more available, supervisors can use technology to further enhance their supervision and ultimately improve client outcomes (Fischer et al., 2017; Zhu et al., 2021). As a result, supervisors may consider combining performance-based feedback with technology (e.g., email) when working with their supervisees. In the present dissertation, Chapter 4 focused on email performance-based feedback, evaluating whether this specific way of providing remote feedback (i.e., email) can be used to produce behavior change. The results of Chapter 4 revealed that email performance-based feedback was effective in increasing the procedural fidelity of MSWO preference assessment implementation.
These results support previous findings (i.e., Barton et al., 2016; Barton et al., 2020; Coogle et al., 2020) suggesting that email performance-based feedback alone is effective in increasing target behavior(s). However, further research is needed to understand the extent to which email performance-based feedback is effective with other behavior analytic procedures and interventions. Below, research recommendations are provided to help guide future research on performance-based feedback within the field of ABA.

Research Recommendations

Given that remote forms of performance-based feedback will likely remain as technology continues to develop and evolve, the present dissertation provides an example of how to evaluate a form of remote performance-based feedback (i.e., email feedback) while addressing specific limitations in previous research (e.g., controlling for confounding variables). Future researchers should continue to evaluate email performance-based feedback and other modalities of feedback for increasing the procedural fidelity of implementation of behavior analytic interventions and procedures. Future research may first consider conducting a literature review or meta-analysis of studies that have evaluated performance-based feedback, regardless of format or modality, within the behavior analytic literature. Additionally, researchers may consider specifically identifying studies that evaluated performance-based feedback within the context of behavior analytic supervision. Conducting a literature review or meta-analysis of performance-based feedback within the behavior analytic literature, and within the context of behavior analytic supervision, will provide insight into how the field of ABA has evaluated performance-based feedback, which aspects of feedback (e.g., frequency of feedback, feedback source) have been evaluated and how effective they may be in changing behavior, and which areas need further refinement or evaluation.
Future research should consider comparing customized and standardized email feedback and evaluating other feedback characteristics to further refine email performance-based feedback interventions. Additionally, future research is needed to determine the extent to which the results of performance-based feedback interventions would generalize to other behavior analytic procedures and interventions (e.g., functional analyses and function-based treatments). Further evaluation of performance-based feedback is important because, in order to use it effectively, we must understand why and how the feedback changes behavior (Alvero et al., 2001). Researchers should continue to evaluate performance-based feedback using a variety of formats (e.g., emails, video conferencing) and characteristics (e.g., feedback frequency) in order to determine which modalities, what amount (e.g., dosage of feedback), and what type (e.g., customized vs. standardized) are required to increase procedural fidelity during the implementation of behavioral services, which in turn will improve client treatment outcomes.

Supervision in ABA

Despite continuous growth in the supervision research literature, several gaps remain. Researchers need to begin reporting supervision and staff training characteristics in their published articles. Additionally, future researchers need to further evaluate supervision and staff training as independent variables to inform the dosage of supervision and staff training required to improve treatment outcomes with clients. When evaluating how supervision is provided (e.g., via Telehealth), researchers and organizations should ensure that supervisors providing supervision via Telehealth are properly trained to provide supervision within that context and that the trainings are developed using an evidence-based process to eliminate or reduce the barriers that may arise.
Finally, when evaluating what supervision consists of, future researchers should continue to refine email performance-based feedback interventions to determine whether this form of feedback can be effective in increasing the procedural fidelity of implementation of multiple behavior analytic procedures and interventions. Though the supervision literature in the field of ABA still needs to grow and be further evaluated, we hope the present dissertation brings awareness to the gaps in the literature and provides guidance to future researchers seeking to fill those gaps.

REFERENCES

Alnemary, F. M., Wallace, M., Symon, J. B. G., & Barry, L. M. (2015). Using international videoconferencing to provide staff training on functional behavioral assessment. Behavioral Interventions, 30(1), 73-86. https://doi.org/10.1002/bin.1403

Alvero, A. M., Bucklin, B. R., & Austin, J. (2001). An objective review of the effectiveness and essential characteristics of performance feedback in organizational settings (1985-1998). Journal of Organizational Behavior Management, 21(1), 3-29. https://doi.org/10.1300/J075v21n01_02

Barnett, J. E., & Kolmes, K. (2016). The practice of tele-mental health: Ethical, legal, and clinical issues for practitioners. Practice Innovations, 1(1), 53-66. https://doi.org/10.1037/pri0000014

Barton, E. E., Fuller, E. A., & Schnitz, A. (2016). The use of email to coach preservice early childhood teachers. Topics in Early Childhood Special Education, 36(2), 78-90. https://doi.org/10.1177/0271121415612728

Barton, E. E., Velez, M., Pokorski, E. A., & Domingo, M. (2020). The effects of email performance-based feedback delivered to teaching teams: A systematic replication. Journal of Early Intervention, 42(2), 143-162. https://doi.org/10.1177/1053815119872451

Barton, E. E., & Wolery, M. (2007). Evaluation of e-mail feedback on the verbal behaviors of pre-service teachers. Journal of Early Intervention, 30(1), 55-72.
https://doi.org/10.1177/105381510703000105 Behavior Analyst Certification Board. (2019). Supervisor training curriculum outline. https://www.bacb.com/wp- content/uploads/2020/05/Supervision_Training_Curriculum_190813.pdf Britton, L. N., & Cicoria, M. J. (2019). Remote fieldwork supervision for BCBA trainees. Academic Press. Brodhead, M. T., & Higbee, T. S. (2012). Teaching and maintaining ethical behavior in a professional organization. Behavior Analysis in Practice, 5(2), 82-88. https://doi/org/10.1007/BF03391827 Coogle, C. G., Ottley, J. R., Storie, S., Rahn, N. L., & Kurowski-Burt, A. (2017). ECoaching to enhance special educator practice and child outcomes. Infants and Young Children, 30(1), 58-75. https://doi.org/10.1097/IYC.0000000000000082 300 Coogle, C. G., Ottley, J. R., Storie, S., Rahn, N. L., & Kurowski-Burt, A. (2020). Performance- based feedback to enhance preservice teachers’ practice and preschool children’s expressive communication. Journal of Teacher Education, 7(2), 188-202. https://doi.org/10.1177/0022487118803583 Coogle, C. G., Rahn, N. L., Ottley, J. R., & Storie, S. (2016). ECoaching across routines to enhance teachers’ use of modeling. Teacher Education and Special Education, 39(4), 227-245. https://doi.org/10.1177/0888406415621959 Cornelius, K. E., & Nagro, S. A. (2014). Evaluating the evidence base of performance feedback in preservice special education teacher training. Teacher Education and Special Education, 37(2), 133-146. https://doi.org/10.1177/0888406414521837 DiGennaro Reed, F. D., Blackman, A. L., Erath, T. G., Brand, D., & Novak, M. D. (2018). Guidelines for using behavioral skills training to provide teacher support. Teaching Exceptional Children, 50(6), 373-380. https://doi.org/10.1177/0040059918777241 Dixon, D. R., Linstead, E., Granpeesheh, D., Novack, M. N., French, R., Stevens, E., Stevens, L., & Powell, A. (2016). 
An evaluation of the impact of supervision intensity, supervisor qualifications, and caseload on outcomes in the treatment of autism spectrum disorder. Behavior Analysis in Practice, 9(4), 339-348. https://doi.org/10.1007/s40617-016-0132-1 Eikeseth, S., Hayward, D., Gale, C., Gitlesen, J., & Eldevik, S. (2009). Intensity of supervision and outcome for preschool aged children receiving early and intensive behavioral interventions: A preliminary study. Research in Autism Spectrum Disorders, 3(1), 67-73. https://doi.org/10.1016/j.rasd.2008.04.003 Fallon, L. M., Collier-Meek, M. A., Maggin, D. M., Sanetti, L. M., & Johnson, A. H. (2015). Is performance feedback for educators an evidence-based practice? A systematic review and evaluation based on single-case research. Exceptional Children, 81(2), 227-246. https://doi.org/10.1177/0014402914551738 Fischer, A. J., Clark, R., Askings, D., & Lehman, E. (2017). Technology and telehealth applications. In J .K. Luiselli (Eds.), Applied behavior analysis advanced guidebook: A manual for professional practice (pp. 135-163). Elsevier Inc. Garza, K. L., McGee, H. M., Schenk, Y. A., & Wiskirchen, R. R. (2018). Some tools for carrying out a proposed process for supervising experience hours for aspiring board certified behavior analysts. Behavior Analysis in Practice, 11(1), 62-70. https://doi.org/10.1007/s40617-017-0186-8 Gibson, J. L., Pennington, R. C., Stenhoff, D. M., & Hopper, J. S. (2010). Using desktop videoconferencing to deliver interventions to a preschool student with autism. Topics in Early Childhood Special Education, 29(4), 214-225. https://doi.org/10.1177/0271121409352873 301 Gormley, L., Healy, O., Doherty, A., O’Regan, D., & Grey, I. (2020). Staff training in intellectual and developmental disability settings: A scoping review. Journal of Developmental and Physical Disabilities, 32, 187-212. https://doi.org/10.1007/s10882- 019-09683-3 Hajiaghamohseni, Z., Drasgrow, E., & Wolfe, K. (2020). 
Supervision behaviors of board certified behavior analysts with trainees. Behavior Analysis in Practice, 14(1), 97-109. https://doi.org/10.1007/s40617-020-00492-1 Hartley, B. K., Courtney, W. T., Rosswurm, M., & LaMarca, V. J. (2016). The apprentice: An innovative approach to meet the behavior analysis certification board’s supervision standards. Behavior Analysis in Practice, 9(4), 329-338. https://doi.org/10.1007/s40617- 016-0136-x Hemmeter, M. L., Snyder, P., Kinder, K., & Artman, K. (2011). Impact of performance feedback delivered via electronic mail on preschool teachers’ use of descriptive praise. Early Childhood Research Quarterly, 26(1), 96-109. https://doi.org/10.1016/j.ecresq.2010.05.004 Jones, S. H., St. Peter, C. C., & Ruckle, M. M. (2020). Reporting of demographic variables in the journal of applied behavior analysis. Journal of Applied Behavior Analysis, 53(3), 1304- 1315. https://doi.org/10.1002/jaba.722 LeBlanc, L. A., & Luiselli, J. K. (2016). Refining supervisory practices in the field of behavior analysis: Introduction to the special section on supervision. Behavior Analysis in Practice, 9(4), 271-273. https://doi.org/10.1007/s40617-016-0156-6 LeBlanc, L. A., Sellers, T. P., & Ala’i, S. (2020). Building and sustaining meaningful and effective relationships as a supervisor and mentor. Sloan Publishing. Lee, J. F., Schieltz, K. M., Suess, A. N., Wacker, D. P., Romani, P. W., Lindgren, S. D., Kopelman, T. G., & Padilla Dalmau, Y. C. (2015). Guidelines for developing telehealth services and troubleshooting problems with telehealth technology when coaching parents to conduct functional analyses and functional communication training in their homes. Behavior Analysis in Practice, 8(2), 190-200. https://doi.org/10.1007/s40617-014-0031-2 Lerman, D. C., O’Brien, M. J., Neely, L., Call, N. A., Tsami, L., Schieltz, K. M., Berg, W. K., Graber, J., Huang, P., Kopelman, T., & Cooper-Brown, L. J. (2020). 
Remote coaching of caregivers via Telehealth: Challenges and potential solutions. Journal of Behavioral Education, 29(2), 195-221. https://doi.org/10.1007/s10864-020-09378-2 Lindgren, S., Wacker, D., Suess, A., Schieltz, K., Pelzel, K., Kopelman, T., Lee, J., Romani, P., & Waldron, D. (2016). Telehealth and autism: Treating challenging behavior at lower cost. Pediatrics, 137, S167-S175. https://doi.org/10.1542/peds.2015-28510 302 Neely, L., Rispoli, M., Gerow, S., & Hong, E. R. (2016). Preparing interventionists via telepractice in incidental teaching for children with autism. Journal of Behavioral Education, 25, 393- 416. https://doi.org/10.1007/s10864-016-9250-7 Rios, D. Schenk, Y. A., Eldridge, R. R., & Peterson, S. M. (2020). The effects of remote behavioral skills training on conducting functional analyses. Journal of Behavioral Education, 29(2), 449-468. https://doi.org/10.1007/s10864-020-09385-3 Romanczyk, R. G., Callahan, E. H., Turner, L. B., & Cavalari, R. N. S. (2014). Efficacy of behavioral interventions for young children with autism spectrum disorders: Public policy, the evidence base, and implementation parameters. Review Journal of Autism and Developmental Disorders, 1(4), 276-326. https://doi.org/10.1007/s40489-014-0025-6 Roth, A. D., Pilling, S., & Turner, J. (2010). Therapist training and supervision in clinical trials: Implications for clinical practice. Behavioural and Cognitive Psychotherapy, 38(3), 291- 302. https://doi.org/10.1017/S1352465810000068 Sellers, T. P., LeBlanc, L. A., & Valentino, A. L. (2016). Recommendations for detecting and addressing barriers to successful supervision. Behavior Analysis in Practice, 9(4), 309- 319. https://doi.org/10.1007/s40617-016-0142-z Sellers, T. P., Valentino, A. L., Landon, T. J., & Aiello, S. (2019). Board certified behavior analysts’ supervisory practices of trainees: Survey results and recommendations. Behavior Analysis in Practice, 12(3), 536-546. 
https://doi.org/10.1007/s40617-019- 00367-0 Shapiro, M., & Kazemi, E. (2017). A review of training strategies to teach individuals implementation of behavioral interventions. Journal of Organizational behavior Management, 37(1), 32-62. https://doi.org/10.1080/01608061.2016.1267066 Slane, M., & Lieberman‐Betz, R. (2021). Using behavioral skills training to teach implementation of behavioral interventions to teachers and other professionals: A systematic review. Behavioral Interventions, 36(4) 984-1002. https://doi.org/10.1002/bin.1828 Sleiman, A. A., Sigurjonsdottir, S., Elnes, A., Gage, N. A., & Gravina, N. E. (2020). A quantitative review of performance feedback in organizational settings (1998-2018). Journal of Organization Behavior Management, 40(3-4), 303-332. https://doi.org/10.1080/01608061.2020.1823300 Strain, P., Fox, L., & Barton, E. E. (2021). On expanding the definition and use of procedural fidelity. Research and Practice for Persons with Severe Disabilities, 46(3), 173-183. https://doi.org/10.1177/15407969211036911 303 Suess, A. N., Romani, P. W., Wacker, D. P., Dyson, S. M., Kuhle, J. L., Lee, J. F., Lindgren, S. D., Kopelman, T. G., Pelzel, K. E., & Waldron, D. B. (2014). Journal of Behavioral Education, 23(1), 34-59. https://doi.org/10.1007/s10864-013-9183-3 Turner, L. B. (2017). Behavior analytic supervision. In J .K. Luiselli (Eds.), Applied behavior analysis advanced guidebook: A manual for professional practice (pp. 3-20). Elsevier Inc. Turner, L. B., Fischer, A. J., & Luiselli, J. K. (2016). Towards a competency-based, ethical, and socially valid approach to the supervision of applied behavior analytic trainees. Behavior Analysis in Practice, 9(4), 287-298. https://doi.org/10.1007/s40617-016-0121-4 Valentino, A. L. (2021). Supervision and mentoring. In Luiselli, J. K., Gardner, R. M., Bird, F. L., & Maguire, H. (Eds.). Organizational behavior management approaches for intellectual and developmental disabilities (pp. 141-164). Routledge. 
Valentino, A. L., LeBlanc, L. A., & Sellers, T. P. (2016). The benefits of group supervision and a recommended structure for implementation. Behavior Analysis in Practice, 9(4), 320-328. https://doi.org/10.1007/s40617-016-0138-8 

Warrilow, G. D., Johnson, D. A., & Eagle, L. M. (2020). The effects of feedback modality on performance. Journal of Organizational Behavior Management, 40(3-4), 233-248. https://doi.org/10.1080/01608061.2020.1784351 

Zhu, J., Bruhn, A., Yuan, C., & Wang, L. (2021). Comparing the effects of videoconference and email feedback on treatment integrity. Journal of Applied Behavior Analysis, 54(2), 618-635. https://doi.org/10.1002/jaba.810