AN EVALUATION OF THE AUTISM FOCUSED INTERVENTION RESOURCES & MODULES (AFIRM) TIME DELAY MODULE

By

Kate Cavataio

A THESIS

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Applied Behavior Analysis—Master of Arts

2023

ABSTRACT

Paraprofessionals are situated within a school setting to provide support to certified educators. They are often tasked with providing instructional support services to students with disabilities in a one-to-one or small group context. To provide effective instruction, paraprofessionals should be trained in the use of evidence-based practices (EBPs). The Autism Focused Intervention Resources and Modules (AFIRM) aim to provide online, asynchronous training of EBPs to paraprofessionals at no cost. The initial study used the AFIRM-provided Time Delay Checklist-Multi-Use to evaluate whether the AFIRM Time Delay Module was related to improved implementation fidelity of time delay procedures by paraprofessionals while assisting their students with their goals. Upon conducting the initial study, several shortcomings were identified and data collection was put on hold. The current paper discusses the difficulties with the study and data collection procedures and identifies areas for improvement within both the AFIRM Time Delay Module and the AFIRM-provided Time Delay Checklist-Multi-Use.

TABLE OF CONTENTS

LIST OF FIGURES
LIST OF ABBREVIATIONS
Introduction
    Purpose of the Current Study
Method and Results
    Participants
    Setting and Materials
        Setting
        Materials
    Independent Variable
    Dependent Variable and Response Measurement
        Dependent Variable
        Response Measurement
        Interobserver Agreement (IOA)
    Experimental Design
    Complications with the Time Delay Checklist
    Complications with the Module
Discussion
    Strengths of the AFIRM Training Module
    Areas for Improvement
    Recommendations
Conclusion
REFERENCES

LIST OF FIGURES

Figure 1 Example of Page 2 of the Completed Classroom Planning Guide
Figure 2 Completed Time Delay Checklist-Multi-Use From Observers 1 and 2 During Baseline

LIST OF ABBREVIATIONS

IEP Individualized Education Program
IDEA Individuals with Disabilities Education Act
EBP Evidence-based Practice
AFIRM Autism Focused Intervention Resources and Modules
IOA Interobserver Agreement
N/A Not Applicable
DTT Discrete Trial Training
SUS System Usability Scale

Introduction

Paraprofessionals are direct support staff who work in an education setting to assist certified educators (Indeed.com, n.d.). Paraprofessionals are often tasked with providing one-on-one or small group instruction; assisting with classroom management, such as organizing instructional and other materials; and providing instructional support services under the direct supervision of a teacher (Michigan Department of Education, 2022). Special education paraprofessionals are assigned to specifically support students with disabilities who need modified instruction or assistance, as described in their Individualized Education Program (IEP). According to the Core Competencies for Special Education Paraeducators, this support includes tasks such as providing instruction, data collection, prompting, prompt fading, and reinforcement (Council for Exceptional Children, 2023). In providing these supports, 97% of paraprofessionals have reported that they spend at least some part of the work day providing one-to-one instruction to students with disabilities (Carter et al., 2009). The Individuals with Disabilities Education Act (IDEA) allows for paraprofessionals to engage in instruction, but only if they are appropriately trained and supervised (IDEA, 2004).

Given that special education paraprofessionals often engage in instruction with students with disabilities and the IDEA requirements for appropriate training, it is critical that paraprofessionals are effectively trained to implement evidence-based practices (EBPs) that promote student progress (Brock & Carter, 2013). Evidence-based practices are interventions or teaching strategies that are supported by empirical evidence of their effectiveness (Cook & Odom, 2013). Research indicates that proper training is related to increased fidelity in paraprofessional implementation of EBPs which, in turn, has yielded improved student outcomes (Brock & Carter, 2013). Unfortunately, teachers and paraprofessionals both indicate that lack of proper paraprofessional training is a considerable barrier to effectively fulfilling their responsibilities (Mason et al., 2021).
Most paraprofessional training takes place in a workshop, class, or lecture-based setting (Walker & Smith, 2015). The content of these trainings rarely includes instruction in effective practices (Brock & Anderson, 2020; Massafra et al., 2020). Instead, these trainings often focus on district policies, reporting protocols, first aid, and crisis management techniques (Hughes & Valle-Riestra, 2008). Another common form of paraprofessional training is on-the-job coaching provided by the classroom teachers. Teachers have indicated they do not have adequate time to appropriately train paraprofessionals, which leads to either less coaching or coaching with divided attention (Biggs et al., 2019; Mason et al., 2021). These limitations in trainings for paraprofessionals create obstacles to effectively performing their duties.

Given that EBPs must be implemented systematically, recent research has begun to address the need for the development and evaluation of paraprofessional trainings to teach implementation of EBPs (Brock & Anderson, 2020). Brock and Anderson (2020) recently reviewed the literature to identify and evaluate paraprofessional trainings that focused on teaching paraprofessionals to implement interventions (i.e., EBPs). The authors identified 36 studies that evaluated paraprofessional trainings. The authors reported important characteristics of effective trainings, including providing performance feedback and combining that feedback with an implementation checklist and modeling. In fact, the authors found that the only studies that failed to produce a functional relation between the training and fidelity of paraprofessional implementation were those that did not provide a combination of an implementation checklist, modeling, and performance feedback. Brock and Anderson (2020) found planning and role play to be two additional training strategies that may lead to improved efficacy.

Brock and Anderson (2020) also identified improvements and specific advances within recent years compared to earlier research on paraprofessional training. For example, an earlier review (Brock & Carter, 2013) found that the majority of paraprofessional trainings were delivered in person in a one-to-one format, which often leads to an increase in time and costs for organizations or teachers. The more recent review (Brock & Anderson, 2020) found that the use of technology to deliver training outside of an in-person format (e.g., online format, hybrid format) was effective. Still other studies implemented effective group-format trainings. Although Brock and Anderson (2020) found evidence to indicate alternative formats may be a feasible and effective approach to train paraprofessionals to implement EBPs, they concluded that additional studies are needed to further evaluate their effectiveness.

The Autism Focused Intervention Resources & Modules (AFIRM; Sam et al., 2020) is an asynchronous training website that provides free trainings and resources related to EBPs, targeting paraprofessionals and other stakeholders. These modules aim to help individuals learn to plan for, use, and monitor EBPs with learners with autism ranging from birth to 22 years of age (AFIRM Team, 2019) by providing examples, conducting knowledge checks, and presenting information through videos and text. Each module focuses on a single EBP and consists of four lessons: basics (introduction), planning, using, and monitoring progress (AFIRM Team, 2019).
The purpose of AFIRM is to provide easily accessible knowledge and training of EBPs, reducing the resources necessary to produce effective training. Some research has been conducted to evaluate the effectiveness of the AFIRM modules. Knowles et al. (2022) conducted a review of 19 asynchronous online trainings freely available to paraprofessionals, in which the AFIRM modules were included. The authors applied quality indicators to evaluate several aspects of these trainings, including features, alignment with federal legislation and professional standards, active engagement features, and usability. Although AFIRM was recognized as containing all or most of the quality indicators of online learning opportunities for paraprofessionals, the authors found that AFIRM lacks inclusion of supervisors in providing feedback and coaching to facilitate behavior change of paraprofessionals. These components have been identified as important considerations for training in previous research (Brock & Carter, 2015). Another study used the AFIRM-provided Time Delay Checklist-Multi-Use to assess whether a functional relation exists between AFIRM program learning cycles and paraprofessionals' implementation fidelity (Sam et al., 2023). This study demonstrated an immediate improvement in fidelity following intervention; however, the learning cycles included support from the research team to teachers and then from teachers to paraprofessionals.

Although the AFIRM modules incorporate some of the effective strategies identified by Brock and Anderson (2020), such as an implementation checklist, modeling (through video examples), and planning materials, the modules omit other strategies, such as opportunities for role play and performance feedback. Still, these modules might be an important first step toward improving paraprofessional training and enhancing implementation of EBPs with students with disabilities. Unfortunately, because previous research evaluating the AFIRM modules has incorporated additional components (e.g., learning cycles and teacher coaching; Sam et al., 2023), it remains unclear whether the AFIRM training modules without additional instruction or components lead to improved paraprofessional implementation of EBPs.

Purpose of the Current Study

The current study was initially developed to evaluate whether there was a functional relation between completion of the AFIRM Time Delay Module (Sam et al., 2020) and effective implementation of time delay procedures for three paraprofessionals working with students with disabilities in a self-contained classroom. Time delay is an EBP that transfers stimulus control from contrived response prompts to naturally existing stimuli (Cooper et al., 2020). Implementation of time delay includes presenting the discriminative stimulus, inserting either no delay or a specified delay, delivering the controlling prompt when appropriate, delivering reinforcement, and collecting data. Specifically, time delay involves the addition of some amount of time between the presentation of the discriminative stimulus and the controlling prompt. Initial trials in a time delay procedure typically begin with a zero-second (0-sec) delay, then move on to inserting a specified time delay following a number of correct responses (Cooper et al., 2020). The purpose of the 0-sec delay is to aid in skill acquisition, while the purpose of the subsequent delay is to support skill maintenance.
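To make this sequence concrete, the sketch below walks through a single trial. It is a minimal illustration written in Python; the function and parameter names are hypothetical, and the three response categories follow the general description above rather than the AFIRM module's exact wording.

```python
# Minimal sketch of one time delay trial; names and structure are illustrative only.
from dataclasses import dataclass
import time


@dataclass
class TrialRecord:
    delay_seconds: float  # 0 on initial (0-sec) trials, otherwise the specified delay
    response: str         # "unprompted correct", "prompted correct", or "error"
    reinforced: bool


def run_trial(present_sd, deliver_prompt, reinforce, learner_responds, delay_seconds):
    """Run a single time delay trial and return what happened.

    present_sd, deliver_prompt, reinforce: callables that carry out each step.
    learner_responds(prompted): callable returning True if the learner responds
    correctly, either during the delay (prompted=False) or after the controlling
    prompt (prompted=True).
    """
    present_sd()                                   # present the discriminative stimulus
    time.sleep(delay_seconds)                      # wait 0 sec or the specified delay
    if delay_seconds > 0 and learner_responds(prompted=False):
        reinforce()                                # unprompted correct response
        return TrialRecord(delay_seconds, "unprompted correct", True)
    deliver_prompt()                               # deliver the controlling prompt
    if learner_responds(prompted=True):
        reinforce()                                # prompted correct response
        return TrialRecord(delay_seconds, "prompted correct", True)
    return TrialRecord(delay_seconds, "error", False)  # error: no reinforcement
```

On a 0-sec trial (delay_seconds=0), the controlling prompt immediately follows the target stimulus, so only the last two outcomes are possible.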
The amount of delay can either remain constant throughout trials (i.e., constant time delay) or systematically and progressively increase (i.e., progressive time delay). A review by Dogoe and Banda (2009) indicated that the number of trials which use a 0-sec delay, as well as the length of the delay, varies across studies, indicating there is no consensus in the field on when time delay should be implemented or how long time delay intervals should be.

The current study was conducted to address the following research question:

1. Is completion of the AFIRM Time Delay Module related to improved implementation fidelity of time delay procedures by paraprofessionals while assisting their students with their goals?

Method and Results

Participants

Participants included three female paraprofessionals ranging in age from 24 to 43 years. No participants reported having received previous training in how to implement time delay procedures. The paraprofessionals were supporting students with moderate intellectual disability who were between the ages of 12 and 16 years.

Setting and Materials

Setting

The participants were employed in a classroom for students with moderate intellectual disability at a center-based school in a midwestern state. The initial meeting and all observations took place within the classroom. The participants completed the AFIRM Time Delay Module on their own computer outside of the work setting, requiring access to a working computer and a high-speed internet connection. The AFIRM trainings offer self-study in the form of simulated e-learning.

Materials

To ensure that the researchers observed a sufficient number of trials, the primary researcher created a 10-minute video training on how to create opportunities and showed this video during an initial meeting prior to baseline data collection. The researcher completed the AFIRM-provided Classroom Planning Guide (Figure 1) with the sections "Time Delay Procedure" and "Response Interval/Wait Time" omitted so as not to inform the participants of the skill being observed. Observers collected implementation fidelity data using the AFIRM-provided Time Delay Checklist-Multi-Use. Internet access and a computer or smartphone were required of each participant.

Figure 1. Example of Page 2 of the Completed Classroom Planning Guide

Independent Variable

AFIRM is an online training tool that offers simulated e-learning and professional development options. These trainings teach stakeholders how to implement EBPs as determined by the National Clearinghouse on Autism Evidence and Practice. Modules offered include an Introduction to Autism, Selecting an EBP, AFIRM for Paraprofessionals: Simulated E-Learning, AFIRM for Toddlers, and supplemental modules. The option to earn continuing education credits can be fulfilled by completing a pre- and post-assessment as part of the certificate track. The AFIRM website also provides additional resources to supplement each of the modules. These resources include documents such as implementation checklists, decision trees, tip sheets, parent guides, and more.

The Time Delay Module covers an overview of time delay and when and how to use it. The AFIRM Time Delay Module for Paraprofessionals is a 1.5- to 2-hour asynchronous training. The introduction presents a definition of time delay and explains the three basic rules for its use. First, always start with a 0-sec delay. Second, each student will require a different order and type of response.
Third, reinforcement should be provided following a correct response. Embedded within the explanation of these rules are some multiple-choice, procedurally based questions. The module then goes on to provide three case examples. Each example contains a video in which a paraprofessional uses a time delay procedure with their student and provides some practice opportunities in the form of multiple-choice and select-all-that-apply questions. Supplemental resources specifically for the Time Delay Module include a Classroom Time Delay Planning Guide and Domain Goals for Time Delay to assist with planning for the use of time delay. For implementation, the resources provided are the Time Delay Checklist and a Time Delay Decision Tree. Additional materials provided include Key Terms for Time Delay, a Step-by-Step Guide to Time Delay, and a Time Delay Companion Guide for Families.

Dependent Variable and Response Measurement

Dependent Variable

To evaluate skill acquisition based only on the materials provided through the AFIRM website, data were collected using the Time Delay Checklist-Multi-Use (Sam et al., 2020). As such, the dependent variable was the participant's procedural fidelity in implementing time delay. In accordance with Ledford and Gast (2018), procedural fidelity was calculated as the number of checklist items scored as implemented divided by the total number of items on the checklist, multiplied by 100 to yield a percentage.

Response Measurement

As derived from the AFIRM Time Delay Module, time delay was defined as "A prompting procedure that systematically fades the use of prompts. Time delay includes the use of a target cue/stimulus, a controlling prompt, and a reinforcer to increase opportunities to demonstrate a skill or behavior and decrease opportunities for error" (Sam et al., 2020). Observers used the Time Delay Checklist-Multi-Use (Figure 2) to record performance. An observational period was defined as a time when the participant was working with their student toward a specific goal identified on the Classroom Planning Guide. Each observation lasted until the participant conducted five learning trials with a student.

Interobserver Agreement (IOA)

The researcher was the primary observer, and a second observer was planned to be present for at least 30% of observational periods for each condition. The second observer was trained by the researcher to collect data by (1) reviewing the definition of time delay, (2) reviewing the participants' Classroom Planning Guides and student goals, and (3) practicing data collection during one observation in the classroom. An agreement was defined as matching data records for an opportunity; a disagreement was defined as non-matching data records for an opportunity.

Experimental Design

The initial study used a multiple baseline design replicated across participants (Ledford & Gast, 2018). All participants received initial training about the study at the same time, and data collection began with all three participants within one week. Participants were originally scheduled to complete the AFIRM Time Delay Module on a staggered schedule, based on stability of baseline performance. Unfortunately, during baseline data collection, several complications immediately became apparent with the Time Delay Checklist-Multi-Use data collection sheet.
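For reference in the complications described next, both measures defined above reduce to simple proportions. The formulation below is a minimal sketch consistent with those definitions; the example counts are hypothetical, and the note about not-applicable (n/a) items reflects common practice rather than guidance stated on the checklist.

```latex
% Procedural fidelity: proportion of checklist items implemented, as a percentage.
\text{Procedural fidelity} = \frac{\text{checklist items scored as implemented}}{\text{total checklist items}} \times 100
% How items marked n/a enter the denominator is not specified by the checklist;
% excluding versus including them changes the denominator and thus the percentage.

% Point-by-point interobserver agreement across scored opportunities (a common
% formulation; e.g., agreement on 4 of 5 opportunities gives (4/5) x 100 = 80%).
\text{IOA} = \frac{\text{agreements}}{\text{agreements} + \text{disagreements}} \times 100
```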
Given these complications, the decision was made to cancel the study and to simply provide the participants with training on how to implement time delay through completion of the AFIRM module and one coaching session. These complications are described below.

Complications with the Time Delay Checklist

In addition to the online training, the AFIRM website provides several additional resources for paraprofessionals to use when implementing time delay. One resource was the Time Delay Checklist-Multi-Use (Figure 2). Similar to Sam and colleagues (2023), this checklist was used for data collection when observing the participants working with the students in the current study. During observations it became immediately apparent that there were several ambiguities within the checklist that led to difficulties with coding.

First, both the Time Delay Checklist and the Time Delay Checklist-Multi-Use account for only one Trial per data collection sheet or per day, but time delay is often taught in a sequence of several consecutive trials. Additionally, the Time Delay Checklists include several additional implementation checklist items related to planning the procedure (Section 1: Plan), ensuring all items are available (Section 2: Use), and monitoring student performance (Section 5: Monitor). For the current study, it was decided to use the Time Delay Checklist-Multi-Use (Figure 2) but to consider each data collection column as one Trial (as opposed to one Date). As a result, the additional implementation fidelity items would inflate the implementation fidelity of the actual time delay procedures if they were marked for every Trial that was implemented. During data collection, therefore, items 1-5 in Section 1 and item 1 in Section 2 were marked only for the first trial, and no data were taken on the participant's data collection behaviors (item 6 in Section 1 and Section 5).

Second, the actual implementation of time delay procedures is recorded in Sections 3 and 4 of the Time Delay Checklist-Multi-Use. Wall and Gast (1997) identified five types of student responses that may occur while implementing a time delay procedure: an unprompted correct response, a prompted correct response, an unprompted incorrect response, a prompted incorrect response (error), and no response. The Time Delay Checklist-Multi-Use does not allow for documentation of each of these student responses, nor does it provide guidance on how the paraprofessional should adjust their procedures based on student responding. As a result, it was not clear whether a participant's behavior should be counted as correct, incorrect, or not applicable (n/a). Further compounding the issue, clear instructions were not provided on how to use (and how to code) the Time Delay Checklist-Multi-Use. This led to uncertainty about how to record participant implementation and affected the denominator (the total number by which correctly implemented items were divided), leading to highly variable outcomes and poor interobserver agreement. Consequently, it was impossible to accurately code the participants' performance of time delay during baseline.

To illustrate this issue, Figure 2 displays data collected by two observers during a baseline observation. Data for 0-sec delay trials should be recorded in Section 3; data for trials with a specified delay should be recorded in Section 4.
During Trial 1, the participant first delivered the discriminative stimulus (target cue) and the controlling prompt with a 0-sec delay, indicating data should be recorded in Section 3. The prompt was unsuccessful, however, resulting in an error. Section 3 does not provide space for the observer to mark an unsuccessful prompt, nor are there instructions for how the paraprofessional should respond if the student responds incorrectly. For Trial 1, then, the observers were unclear how to record performance. Although the Time Delay Checklist-Multi-Use instructions indicate data should only be recorded in Section 4 after the student responds accurately twice in a row with a 0-sec delay in Section 3, neither the module nor the data sheet addresses what the paraprofessional should do in the case of an error in Section 3. As a result, both observers marked step 4 as correct (the participant gave the controlling prompt with a 0-sec delay), but Observer 1 marked all other steps as n/a and considered the participant's response as the next Trial (Trial 2), whereas Observer 2 recorded the participant's subsequent behavior as a continuation of Trial 1 and marked performance in Section 4 of the data collection sheet (the section reserved for time delay).

As another example, in Trial 4 the participant inaccurately provided a delay when there had not yet been two consecutive correct responses with a 0-sec delay. The student, however, responded correctly and the participant then provided reinforcement. In their data collection, both observers coded Section 3, step 4 as incorrect, but Observer 1 marked the rest of the steps as n/a, whereas Observer 2 marked Section 4, step 7a as correct (because the participant provided reinforcement). Given the student's correct performance, it is then unclear whether the participant should continue to proceed with a delay procedure or revert to an immediate prompt procedure. Since the checklist did not account for these differences, it was impossible to use it as a way to code participant behaviors. These difficulties may also indicate that the Time Delay Checklist-Multi-Use would not be useful for paraprofessionals or their supervisors to use as an implementation checklist.

Figure 2. Completed Time Delay Checklist-Multi-Use From Observers 1 and 2 During Baseline

Given the difficulties with data collection during baseline, the researcher returned to the AFIRM Time Delay Module to identify whether the module itself provided additional clarification on the use of the Time Delay Checklist-Multi-Use and how to respond to various student responses. Instead, this review identified additional areas in need of clarification.

Complications with the Module

Upon further evaluation of the AFIRM Time Delay Module, several shortcomings were identified. These concerns relate to a lack of specification and coaching provided by the module. First, the module lacks clarification on how to establish an appropriate length of delay. Although the module suggests a delay of 3 to 5 seconds for constant time delay and up to 10 seconds for progressive time delay, it does not provide instructions or criteria for how to select the appropriate length of delay for a student. Previous research has suggested that a 4-sec delay is most commonly used in studies (Dogoe & Banda, 2009); however, the length of delay should be individualized for each student. Too brief a delay could lead to over-prompting, whereas too long a delay could result in student frustration or errors.
For example, if a student with a musculoskeletal condition is learning a skill that requires motor movement, this student's response time might be slower, which would indicate that the paraprofessional should choose a longer length of delay.

Second, the module does not describe how to determine whether to use constant or progressive time delay procedures. In fact, there appears to be little information addressing this question across the field.

Third, there is an absence of guidance informing the trainee on how to conduct a subsequent trial if the student responds incorrectly or makes an error. A common practice in behavior analysis is to revert to a more intrusive prompt, or in this case a 0-sec delay, following two consecutive errors by a student. Failure to revert to a 0-sec delay could result in stagnant student progress. For effective implementation, clarification on how to proceed with trials following one or multiple errors should be provided within the AFIRM Time Delay Module.

Fourth, Brock and Anderson (2020) reported that the combined use of modeling, implementation checklists, and performance feedback is critical for effective trainings. While the AFIRM Time Delay Module includes video models and a supplemental implementation checklist, the format of the module does not include explicit performance feedback as a training component. Because it is an asynchronous, self-paced training, the AFIRM Time Delay Module only provides "practice" in the form of multiple-choice questions that follow a brief description of a scripted scenario, which is not indicative of real-world performance and does not provide sufficient feedback.

A final complication with the content of the module is the lack of multiple exemplars throughout the training. The module includes three example videos, all of which were recorded in a one-to-one setting during discrete trial training (DTT). Time delay procedures have demonstrated effectiveness when used for acquisition of a multitude of skills, such as vocational skills (Wall & Gast, 1999), cooking skills (Graves et al., 2005), spontaneous speech (Ingenmey & Van Houten, 1991), and swimming skills (Rogers et al., 2010). To more accurately reflect classroom scenarios and student goals, examples of time delay use outside of DTT should be included in the training. Further, the video examples only display the student responding accurately. There were no examples included in which the student engaged in any response other than accepting a prompt or responding correctly after a delay. There is value in providing model examples; however, to represent a more applied situation, examples that include divergent student responses and subsequent paraprofessional responses should be embedded into the training videos as well.

Given the concerns with data collection and the content within the AFIRM Time Delay Module, the decision was made to halt baseline data collection and to have all three participants complete the Time Delay Module during the same one-week period. The researcher then observed implementation following completion of the module. After one week, the researcher met with each participant to obtain their feedback on the module and then provided in vivo coaching to improve implementation of the time delay procedures.
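To summarize the procedural decision points that the module and checklist leave unspecified, the sketch below encodes the criteria mentioned above as a simple session-level rule: begin with a 0-sec delay, introduce the specified delay after two consecutive correct 0-sec trials, and revert to a 0-sec delay after two consecutive errors. It is an illustration only; the function name and the 4-sec default delay (drawn from Dogoe & Banda, 2009) are assumptions, prompted and unprompted correct responses are treated alike, and none of this is stated as a rule by AFIRM.

```python
# Illustrative sketch of session-level decision rules for constant time delay.
# The criteria are the common practices described above, not AFIRM guidance, and
# prompted and unprompted correct responses are both recorded simply as "correct".

def next_delay(trial_history, target_delay=4.0):
    """Choose the delay (in seconds) for the upcoming trial.

    trial_history: list of (delay_seconds, response) tuples for completed trials,
    where response is "correct" or "error".
    target_delay: the individualized delay selected for the student (hypothetical).
    """
    if not trial_history:
        return 0.0                               # always start with a 0-sec delay
    last_two = trial_history[-2:]
    if len(last_two) == 2 and all(resp == "error" for _, resp in last_two):
        return 0.0                               # two consecutive errors: revert to 0 sec
    current_delay = trial_history[-1][0]
    if current_delay == 0.0:
        if len(last_two) == 2 and all(d == 0.0 and resp == "correct" for d, resp in last_two):
            return target_delay                  # two consecutive correct 0-sec trials: add the delay
        return 0.0                               # otherwise stay at 0 sec
    return current_delay                         # constant time delay: keep the same delay


# Example: two correct 0-sec trials earn the delay; two errors at the delay revert to 0 sec.
history = [(0.0, "correct"), (0.0, "correct"), (4.0, "error"), (4.0, "error")]
print(next_delay(history))  # prints 0.0
```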
Discussion

It is necessary that paraprofessionals be trained in the use of EBPs in order to carry out the responsibilities of their role in the classroom effectively; yet the content of current trainings received by paraprofessionals rarely provides instruction on the use of EBPs (Brock & Anderson, 2020; Massafra et al., 2020). To address this, recent research has aimed to create and evaluate trainings for paraprofessionals. The AFIRM modules are intended to provide an asynchronous, flexible, and easily accessible training at no cost. The current study was conducted to evaluate the effectiveness of the AFIRM modules in teaching paraprofessionals to implement a time delay procedure with students with disabilities in a special education classroom. The attempt to evaluate the AFIRM Time Delay Module highlighted multiple strengths and areas for improvement. Upon initial data collection, it was determined that the AFIRM Time Delay Module paired with the AFIRM-provided Time Delay Checklist-Multi-Use was not sufficient to gather meaningful data on fidelity of implementation. After examining the issues, we identified both strengths and areas for improvement within the AFIRM Time Delay Module. Below, we discuss these aspects and end with recommendations for future research and practice.

Strengths of the AFIRM Training Module

Creating effective training materials and supporting resources is no simple feat. The AFIRM modules set a precedent for more feasible dissemination of EBPs to school-based professionals. The online format allows paraprofessionals, teachers, and other practitioners to access these modules at any time and at their own pace, providing asynchronous flexibility. These modules are offered at no cost, reducing the resources necessary to train employees and, therefore, increasing the likelihood that they will be adopted by an organization. The modules are easy to access and navigate, reducing response effort. In fact, Knowles et al. (2022) used the System Usability Scale (SUS; Bangor et al., 2008; Lewis & Sauro, 2018), a technology-agnostic measure of perceived usability, to evaluate the ease of use of the AFIRM modules. They found that the AFIRM modules yielded a SUS score above 80, which indicates excellent usability. Together, these characteristics offer an efficient arrangement that allows paraprofessionals and organizations to reduce the resources necessary to provide training.

Areas for Improvement

Despite these strengths, the current study also identified several areas for improvement. First, although the AFIRM Time Delay Module included the use of modeling and an implementation checklist (a relative strength), Brock and colleagues (Brock & Carter, 2017; Brock et al., 2017, 2020) found that these strategies are most effective when used in combination with performance feedback. Because of the asynchronous, self-paced aspect of the module, it is not possible for paraprofessionals to receive performance feedback when implementing time delay directly with students. The module somewhat addressed this need by incorporating practice opportunities in the form of multiple-choice and select-all-that-apply questions. Of the 29 such questions, only 14 were procedurally based, with procedurally based defined as any question requiring the paraprofessional to choose the next step in the time delay procedure to implement based on a brief description or a video model.
When answering these questions, if the learner chose an incorrect answer, they were simply told the correct answer and then allowed to move on. Another form of practice provided by the module was an exercise in using the Time Delay Checklist to evaluate a video model of another paraprofessional implementing time delay. Only one opportunity to complete the fidelity checklist was available. Feedback for this activity was provided by a comparison between the paraprofessional's completed checklist and one provided by the AFIRM team. The video model, however, included eight trials, yet the Time Delay Checklist only provided space to collect data on one trial. Again, following an incorrect response, a correct answer was provided and the paraprofessional was allowed to continue on with the module. Although this practice and feedback are better than nothing, passive practice and feedback are not as effective as practice and feedback provided in the applied setting (Brock et al., 2020).

Second, the AFIRM Time Delay Module did not explain several important aspects of the time delay procedure. For example, there was no description of how to properly select the length of delay for individual students, how to determine whether constant or progressive time delay is more appropriate to implement, or how to respond to divergent student responses such as errors or incorrect responses. Without this instruction, the paraprofessional is left without guidelines to determine the best way to implement time delay for their student.

Third, although the AFIRM Time Delay Module provides video models to demonstrate the procedure, these videos were recorded under very specific and controlled conditions, the sessions were taught using DTT, and the student responses were exemplary. Conditions such as these are dissimilar to classroom conditions, in which a paraprofessional is typically assigned to support multiple students at once, goals may not be attainable through DTT, and student responses vary widely. Without appropriate video models or multiple exemplars of student performance, the paraprofessional will enter the classroom setting without adequate preparation to implement time delay and respond to student performance.

Finally, these concerns within the module also led to ambiguities when collecting data on fidelity of implementation. The Time Delay Checklist-Multi-Use did not account for all possible responses in which a student can engage, posing difficulties in determining how to use it. To add to this, no training exists for use of the Time Delay Checklist, resulting in further complications. Specific suggestions to address these concerns are provided below.

Recommendations

The AFIRM Time Delay Module provides a strong framework for disseminating information and training to school-based personnel in a cost- and time-efficient manner. To expand upon this framework and enhance efficacy, future trainings should aim to address the limitations identified in this article. Below we list specific recommendations for addressing these limitations.

First, the module should include more diverse examples, including examples of time delay implementation in different settings, with different goals, and with more varied student responses. Empirical evidence demonstrates that time delay procedures have been successful in the acquisition of many skills (Dogoe & Banda, 2009). Trainings should use this breadth to their advantage by creating multiple-exemplar video models.
As an example, one scenario could show a paraprofessional teaching a student to complete a chained task such as hand washing or brushing their teeth. Another example could include a paraprofessional teaching a student to mand for "pass" in gym class. This example could also include a situation in which the student responds incorrectly following the controlling prompt so that the learner can see an example of how to respond in this situation.

Second, performance feedback is paramount to the delivery of effective training. The asynchronous format of the training poses difficulty in providing meaningful feedback; however, a feedback component is important to maintain given its efficacy. Brock and colleagues (2020) examined the effectiveness of delayed video feedback in which trainees video-recorded themselves implementing a procedure and, two to three days later, received performance feedback from the researcher in the form of direct corrective feedback, behavior-specific praise, and role play. The delayed direct feedback was found to be successful in increasing fidelity of implementation. To maintain the asynchronous aspect of the training, the makers of the AFIRM modules could consider including a component similar to delayed feedback. For example, the paraprofessional could film themselves implementing time delay, upload the video to the website, and receive written or audio-recorded feedback from a trained reviewer. Alternatively, future research could explore the effectiveness of delayed feedback on short-answer responses completed while paraprofessionals work through the training modules. Thus, rather than responding to multiple-choice questions, they could provide a written text response that would be read and evaluated by an AFIRM staff member, who could then provide delayed feedback on the response. This would allow a trainee to obtain practice while responding to the provided video models in short-answer form rather than multiple choice. A third option would be to include a companion teacher coaching component to provide instruction to teachers on how to provide appropriate feedback to the paraprofessional when implementing the time delay procedure in the classroom. Sam and colleagues (2023) did this in their evaluation of the AFIRM Time Delay Module and found an immediate level change followed by a stable or increasing trend in paraprofessional implementation.

Finally, revisions to the module should include additional resources on how to appropriately use and score the Time Delay Checklists. Sam and colleagues (2023) also used the AFIRM-provided Time Delay Checklist, but their observers received a two-hour training from the checklist developers on how to collect data on implementation fidelity. Prior to rating participants, observers were required to meet a minimum of 80% agreement with scores that had been generated by the lead researchers. Supplemental resources with information on how to collect data are essential to maintaining consistent and reliable reporting. Additionally, to reduce confusion in data collection, the Time Delay Checklists should be revised to provide clearer instructions and spaces to code paraprofessional behavior in response to divergent student responses.

Conclusion

In response to the need for efficient trainings for paraprofessionals, the AFIRM modules provide an easily accessible and easy-to-use online training at no cost.
While this is a good start to the dissemination of EBPs, the current study identified areas for improvement in the AFIRM Time Delay Module, which include the absence of practice and performance feedback, failure to include detailed explanations of important procedural aspects, restricted examples, and ambiguous data collection procedures. Addressing these barriers will aid in assessing the effectiveness of such modules. Given the current lack of training on EBPs provided to paraprofessionals, effective and efficient trainings are essential to promoting their role of supporting students in the classroom.

REFERENCES

AFIRM Team. (2019). Components of the Autism Focused Intervention Resources & Modules (AFIRM). Chapel Hill, NC: National Professional Development Center on Autism Spectrum Disorder, Frank Porter Graham Child Development Center, University of North Carolina. Retrieved from https://afirm.fpg.unc.edu/afirmmodules

Bangor, A., Kortum, P. T., & Miller, J. T. (2008). An empirical evaluation of the system usability scale. International Journal of Human–Computer Interaction, 24(6), 574–594. https://doi.org/10.1080/10447310802205776

Biggs, E. E., Gilson, C. B., & Carter, E. W. (2019). "Developing that balance": Preparing and supporting special education teachers to work with paraprofessionals. Teacher Education and Special Education, 42(2), 117-131. https://doi.org/10.1177/0888406418765613

Brock, M. E., & Anderson, E. J. (2020). Training paraprofessionals who work with students with intellectual and developmental disabilities: What does the research say? Psychology in the Schools, 58(4), 702-722. https://doi.org/10.1002/pits.22386

Brock, M. E., Barczak, M. A., & Dueker, S. A. (2020). Effects of delayed video-based feedback and observing feedback on paraprofessional implementation of evidence-based practices for students with severe disabilities. Focus on Autism and Other Developmental Disabilities, 35(3), 153-164. https://doi.org/10.1177/1088357620902492

Brock, M. E., Cannella‐Malone, H. I., Seaman, R. L., Andzik, N. R., Schaefer, J. M., Page, E. J., ... Dueker, S. (2017). Findings across practitioner training studies in special education: A comprehensive review and meta‐analysis. Exceptional Children, 84(1), 7–26. https://doi.org/10.1177/0014402917698008

Brock, M. E., & Carter, E. W. (2013). A systematic review of paraprofessional-delivered educational practices to improve outcomes for students with intellectual and developmental disabilities. Research and Practice for Persons with Severe Disabilities, 38(4), 211-221.

Brock, M. E., & Carter, E. W. (2015). Effects of a professional development package to prepare special education paraprofessionals to implement evidence-based practice. The Journal of Special Education, 49(1), 39–51. https://doi.org/10.1177/0022466913501882

Brock, M. E., & Carter, E. W. (2017). A meta‐analysis of educator training to improve implementation of interventions for students with disabilities. Remedial and Special Education, 38(3), 131–144. https://doi.org/10.1177/0741932516653477

Carter, E., O'Rourke, L., Sisco, L. G., & Pelsue, D. (2009). Knowledge, responsibilities, and training needs of paraprofessionals in elementary and secondary schools. Remedial and Special Education, 30(6), 344-359. https://doi.org/10.1177/0741932508324

Cook, B. G., & Odom, S. L. (2013). Evidence-based practices and implementation science in special education. Exceptional Children, 79(2), 135-144. https://doi.org/10.1177/001440291307900201

Cooper, J. O., Heron, T. E., & Heward, W. L. (2020). Applied behavior analysis. Pearson UK.
E., & Heward, W. L. (2020). Applied behavior analysis. Pearson UK. Council for Exceptional Children (2022). Paraeducator Competencies. exceptionalchildren.org/paraeducators/core-competencies-special-education- paraeducators. Accessed 24 Apr. 2023. Dogoe, M., & Banda, D. R. (2009). Review of recent research using constant time delay to teach chained tasks to persons with developmental disabilities. Education and Training in Developmental Disabilities, 44(2), 177-186. https://www.jstor.org/stable/24233492 Graves, T. B., Collins, B. C., Schuster,J. W., & Kleinert, H. (2005). Using video prompting to teach cooking skills to secondary students with moderate disabilities. Education and Training in Developmental Disabilities, 40(1), 34-46. https://www.jstor.org/stable/23879770 Hughes, M. T., & Valle‐Riestra, D. M. (2008). Responsibilities, preparedness, and job satisfaction of paraprofessionals: Working with young children with disabilities. International Journal of Early Years Education, 16(2), 163-173. https://doi.org/10.1080/09669760701516892 Individuals with Disabilities Education Improvement Act (2004). U.S.C. § 1400. https://sites.ed.gov/idea/statute-chapter-33/subchapter-i/1400 Indeed.com. (n.d.). What Is a Paraprofessional? Indeed Career Guide. https://www.indeed.com/career-advice/finding-a-job/what-is-a-paraprofessional Ingenmey, R., & Van Houten, R. (1991). Using time delay to promote spontaneous speech in an autistic child. Journal of Applied Behavior Analysis, 24(3), 591-596. https://doi.org/10.1901/jaba.1991.24-591 Knowles, C. L., D’Agostino, S. R., Kunze, M. G., Uitto, D. J., & Douglas, S. N. (2022). A systematic review of asynchronous online learning opportunities for paraeducators. The Journal of Special Education, 56(3), 168-178. https://doi.org/10.1177/002246692210853 Ledford, J. R., & Gast, D. L. (Eds.). (2018). Single case research methodology. New York, NY: Routledge. 25 Lewis, J. R., & Sauro, J. (2018). Item benchmarks for the system usability scale. Journal of Usability Studies, 13(3), 158–167. Massafra, A., Gershwin, T., & Gosselin, K. (2020). Policy, preparation, and practice... Oh my! Current policy regarding the paraprofessional role and preparation for working with students with disabilities. Journal of Disability Policy Studies, 31(3), 164-172. https://doi.org/10.1177%2F1044207320920004 Mason, R. A., Gunersel, A. B., Irvin, D. W., Wills, H. P., Gregori, E., An, Z. G., & Ingram, P. B. (2021). From the frontlines: Perceptions of paraprofessionals’ roles and responsibilities. Teacher Education and Special Education, 44(2), 97-116. https://doi.org/10.1177/08884064198966 Michigan Department of Education (2022). Requirements for Instructional Paraprofessionals in Title I Schools . Retrieved April 25, 2023, from https://www.michigan.gov/- /media/Project/Websites/mde/OES/Programs/Title-I- A/Title_I_Paraprofessional_Requirements.pdf?rev=2a75a136294946f8ad19bdad61462b8 8 Rogers, L., Hemmeter, M. L., & Wolery, M. (2010). Using a constant time delay procedure to teach foundational swimming skills to children with autism. Topics in Early Childhood Special Education, 30(2), 102-111. https://doi.org/10.1177/0271121410369708 Sam, A., Savage, M., Steinbrenner, J., Morgan, W., Chin, J., & AFIRM for Paras Team. (2020). Time Delay: Introduction & Practice. FPG Child Development Institute, University of North Carolina. https://afirm.fpg.unc.edu/time-delay-introduction-practice Sam, A. M., Steinbrenner, J. R., Odom, S. L., Nowell, S. W., Waters, V., Perkins, Y., ... & Rogers, H. J. (2023). 
Walker, V. L., & Smith, C. G. (2015). Training paraprofessionals to support students with disabilities: A literature review. Exceptionality, 23(3), 170-191. https://doi.org/10.1080/09362835.2014.986606

Wall, M. E., & Gast, D. L. (1997). Caregivers as teachers: Using constant time delay to teach adults how to use constant time delay. Education and Training in Mental Retardation and Developmental Disabilities, 32(3), 213-228. https://www.jstor.org/stable/23879151

Wall, M. E., & Gast, D. L. (1999). Acquisition of incidental information during instruction for a response-chain skill. Research in Developmental Disabilities, 20(1), 31-50. https://doi.org/10.1016/S0891-4222(98)00030-4