PROCESS EVALUATION OF IMPLEMENTING THE EAT HEALTHY, YOUR KIDS ARE WATCHING. A PARENT'S GUIDE TO RAISING A HEALTHY EATER PROGRAM

By

Jamie S. Karp

A THESIS

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

Human Nutrition - Master of Science

2014

ABSTRACT

PROCESS EVALUATION OF IMPLEMENTING THE EAT HEALTHY, YOUR KIDS ARE WATCHING. A PARENT'S GUIDE TO RAISING A HEALTHY EATER PROGRAM

By Jamie S. Karp

The objective of this study was to evaluate the factors that enhanced or impeded implementation fidelity and educational delivery (extent to which topics were delivered as designed and duration of topic delivery) by educators of Eat Healthy (EH), Your Kids are Watching. A Parent's Guide to Raising a Healthy Eater. Both quantitative and qualitative data were collected and used to evaluate implementation fidelity and the educational delivery of EH. Educators (n=20) delivered 1-6 lessons to 107 SNAP-Ed eligible parents in four Michigan Counties using a combination of home visits and phone calls. The educators' Attention, Relevance, Confidence and Satisfaction (ARCS) factor scores from training and their demographic characteristics, as well as parent engagement and reactions to key messages of EH, were also assessed to check for associations with implementation fidelity and educational delivery. Data were analyzed using a mixed-method approach and findings were triangulated. ARCS factor scores from the training did not relate to educational delivery. Both age of educators and parent engagement predicted the extent to which EH topics were delivered as designed, a finding substantiated by the qualitative data. Quantitative and qualitative findings consistently supported parent engagement as the strongest facilitator, and distractions by children and technology issues as the most common barriers, to educational delivery and implementation fidelity. This study adds to the sparse literature on process evaluation in nutrition education.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
KEY TO ABBREVIATIONS

CHAPTER I. INTRODUCTION
    Objective
    Specific aim
    Hypothesis 1
    Hypothesis 2
    Hypothesis 3

CHAPTER II. REVIEW OF LITERATURE
    Background of Eat Healthy, Your Kids are Watching- A Parent's Guide to Raising a Healthy Eater
    Fidelity of Implementation
        Prevalence of published studies on fidelity of implementation
        Measuring fidelity of implementation
        Moderators of Fidelity
            Quality of educator training
            Educator characteristics
            Educator selection
            Complexity of intervention
    Educational Delivery
        Barriers of educational delivery
        Facilitators of educational delivery

CHAPTER III. METHODS
    Study Sites and Nutrition Educators
    Design of the Educational Program for which this Process Evaluation Occurs
    Eat Healthy Process Evaluation Concept Model
    Procedures
        Nutrition Educator Training
        Instruments
            Instruments for delivery of education
            Instruments for fidelity
    Data Analysis
        Statistical analysis
        Testing construct reliability (α) and validity
        Model assessment and selection
        Qualitative analysis
        Triangulation

CHAPTER IV. RESULTS
    Educator demographics
    Hypothesis 1
        Attention, Relevance, Confidence, Satisfaction (ARCS)
        Nutrition Educator Report of Topic Delivery (NERTD)
        Ho1b
    Hypothesis 2
        Ho2a
        Ho2b
    Hypothesis 3
        Periodic debriefing notes
        Nutrition Educator Report of Topic Delivery (NERTD)
        Home visit observations
            Descriptive statistics of educator's NERTD scores compared with observer NERTD scores
            Qualitative data from Home Visit Observations
        Exit interviews
            Educator training
            Data collection
            Outside EH evaluator
            Delivery protocol
            Barriers
            Facilitators

CHAPTER V. DISCUSSION
    Hypothesis 1
    Hypothesis 2
    Hypothesis 3
    Instrument design
    Data collection issues
    Study strengths
    Study Limitations

CHAPTER VI. CONCLUSIONS and IMPLICATIONS for FUTURE RESEARCH
    Conclusions
    Implications for Future Research

APPENDICES
    Appendix A. MSU IRB Human Research approval letters
    Appendix B. Nutrition educator consent form
    Appendix C. PowerPoint slides for Phase 1 training
    Appendix D. PowerPoint slides for Phase 2 training and curriculum grid
    Appendix E. Nutrition educator demographic questionnaire
    Appendix F. ARCS questionnaires for nutrition educator training
    Appendix G. Nutrition Educator Report of Topic Delivery (NERTD) questionnaire
    Appendix H. Coding tree for summary of nutrition educator reports (NERTD) responses for barriers and facilitators of topic delivery
    Appendix I. Coding tree for exit interviews with nutrition educators

REFERENCES

LIST OF TABLES

Table 1. Timeline of the process evaluation of the Eat Healthy intervention.

Table 2. Process evaluation components and four of seven instruments.

Table 3. Debriefing items queried weekly of site coordinators and educators as indicated by month for all sites but Lansing.
Table 4. Exit interview items queried of site coordinators and educators at each site.

Table 5. Generalized linear mixed effects model comparisons for 'Extent to deliver lesson as designed' from educators' evaluations of parent engagement and parent reaction to key messages.

Table 6. Generalized linear mixed effects model comparisons for 'Duration of topic delivery' from educators' evaluations of parent engagement, parent reaction to key messages, and time spent on lessons.

Table 7. Descriptive characteristics of nutrition educators (n=20) for all sites (n=4) and then by site.

Table 8. Subscale scores for Attention, Relevance, Confidence, and Satisfaction (ARCS) subconstructs from 20 educators for the training session on educational delivery.

Table 9. Scores for Nutrition Educator Report of Topic Delivery (Ho2) overall and by site and item for 495 observations.

Table 10. Frequency (%) of responses by item on Nutrition Educator Report of Topic Delivery for all sites (n=495 observations nested in 107 families with 20 educators).

Table 11. Generalized linear mixed effects model of association between quality of nutrition educator training (scores on ARCS: Attention, Relevance, Satisfaction) and 'extent to deliver lessons as designed', family level.

Table 12. Topic delivery duration in minutes (Mean±SD) for all sites (n=5), then for each participating site.

Table 13. Percentage of lesson delivery durations (a priori and post-hoc) for too short, optimal, and too long, by delivery method and topic, all locations combined.

Table 14. Generalized linear mixed effects model of association between quality of nutrition educator training (scores on ARCS: Attention, Relevance, Satisfaction) and 'duration of topic delivery', family level.

Table 15. Inter-item correlations for educators' (n=20) scores from NERTDs on parents' reactions and their own delivery of 495 lessons given to 107 families.

Table 16. Pearson's correlations between nutrition educator characteristics and average scores per educator for NERTD items (n=495 observations nested within 107 families with 20 educators).

Table 17. Generalized linear mixed effects model for association of nutrition educator characteristics and Q4a (Extent to deliver topic as designed). Estimated odds ratios, standard errors, 95% CI of 'Extent to deliver lesson as designed.'
Table 18. Generalized linear mixed effects model for association of nutrition educator characteristics and Q5 (Duration of topic delivery). Estimated odds ratios, standard errors, 95% CI of duration of topic delivery.

Table 19. Emergent themes coded from the periodic debriefings with outlying sites.

Table 20. Summary of barriers and facilitators (potential moderators) to implementation fidelity (lesson delivery), as listed on NERTDs that queried, "Please explain any things that have helped or made it difficult for you to deliver this week's topic as designed."

Table 21. Mean (SD) scores and agreement for four items between educator and home visit observer (i.e., implementation fidelity) on Nutrition Educator Report of Topic Delivery (n=7 observations).

Table 22. Emergent themes coded from seven home visit observations.

Table 23. Topics queried in exit interviews with educators regarding perceptions of training, data collection, implementation fidelity, study design and timeline, with selected quotes by site.

Table 24. Curriculum grid for Eat Healthy.

Table 25. Coding tree for summary of Nutrition Educator Reports (NERTD) responses for barriers and facilitators of topic delivery, including selected quotes.

Table 26. Coding tree for exit interviews with nutrition educators from each site.

LIST OF FIGURES

Figure 1. Eat Healthy process evaluation concept model adapted from Lee et al., 2013.

Figure 2. Confirmatory Factor Analysis for ARCS questionnaire.

Figure 3. Equation for final generalized linear mixed effects models.

Figure 4. PowerPoint slides for Phase 1 training.

Figure 5. PowerPoint slides for Phase 2 training and curriculum grid.

KEY TO ABBREVIATIONS

ARCS for Nutrition Educators- 5-point Likert-scale instrument measuring four motivational constructs: attention, relevance, confidence, and satisfaction

Author of this thesis- study coordinator, Jamie Karp

Debriefing form- weekly or bi-monthly guide to assess fidelity of implementation across sites

Education Delivery- assessment of time spent delivering each topic, whether all topics were delivered as designed, and level of delivery consistency among nutrition educators and sites

EH- Eat Healthy, Your Kids are Watching! A Parent's Guide to Raising a Healthy Eater
Fidelity of Implementation- the extent to which the intervention was implemented as planned; facilitators or barriers encountered and overcome

GLMM- generalized linear mixed effects model (logistic regression)

MNN- Michigan Nutrition Network, funder of the Eat Healthy research project

IRB- Institutional Review Board

Nutrition Educator Demographic Form- captures sex, age, race, ethnicity, education level, number of years teaching nutrition education, nutrition education background (including formal nutrition classes), number of years employed as a home visitor, and level of interest in teaching Eat Healthy materials

Nutrition Educator Report of Topic Delivery (NERTD)- mixed-method questionnaire measuring education delivery

CHAPTER I. INTRODUCTION

To determine the success of nutrition interventions, investigators typically perform only outcome evaluation, but process evaluation is also essential in order to better assess the intervention's effectiveness (Saunders et al., 2005; Young et al., 2008). Focusing on outcome evaluation alone, without process evaluation, can lead to erroneous conclusions about a program's true impact, a Type III error (Steckler, 2002; Harachi et al., 1999). Process evaluation assists in understanding the reasons an intervention succeeds or fails and sheds light on the mechanisms or moderators that affect outcomes (Wilson et al., 2009).

A nutrition education program's success hinges on several elements of process evaluation, including quality of program design, fidelity of implementation, education delivery, education received, and efficacy in reaching the target population. Of these, fidelity of implementation is most important, as there is strong evidence that the level of fidelity of intervention implementation affects how well a program succeeds (Durlak, 2008). Factors that can influence fidelity include educators' motivations, skill and expertise, and the complexity of the intervention. Monitoring of implementation is done via ongoing process and summative evaluation, providing opportunities for adjustments to maximize success.

Few published studies could be identified on process evaluation of nutrition education interventions (Dour et al., 2013; Dollahite et al., 1998; Levine et al., 2002). Although disciplines like health promotion, drug abuse prevention, and mental health assess process evaluation more frequently than does nutrition education, few studies even in these disciplines have published findings about process evaluation in much detail (Dusenbury et al., 2003; Saunders et al., 2006).

Eat Healthy (EH), Your Kids are Watching--A Parent's Guide to Raising a Healthy Eater was developed over two years and underwent formative evaluation (Reznar et al., 2014). As the complexity of EH's intervention increased, monitoring of implementation was imperative to determine whether delivery of the material was being conducted in a manner consistent with EH's key concepts and objectives. Such evaluation provided valuable information about barriers and facilitators of intervention components, and aided in understanding the internal and external dynamics of the EH intervention (Schneider, 2009). Desired process outcomes for this study included minimal variability among sites, adherence to the education delivery design, and completion of topics as intended.
Objective

The objective of my research was to evaluate factors that enhanced or impeded implementation and delivery of Eat Healthy nutrition education intervention materials by trained educators.

Specific aim

The specific aim of this study was to identify the factors that affected education delivery and fidelity of implementation. Of specific interest were the extent to which all topics were delivered as designed, the consistency of delivery among nutrition educators and locations, and the amount of time spent delivering each topic. Problems or barriers encountered by nutrition educators were also evaluated.

Hypothesis 1

Ho1a: There will be a positive association between quality of nutrition educator training [higher scores on the 13-item, 5-point Likert-scale Attention, Relevance, Confidence, and Satisfaction (ARCS) instrument] and the extent to which the lesson was delivered as designed (Q4a on the 5-point Likert-scaled Nutrition Educator Report of Topic Delivery).

Ho1b: There will be a positive association between quality of nutrition educator training [higher scores on the ARCS (13 items, 5-point Likert scale)] and duration of topic delivery (Q5: Nutrition Educator Report of Topic Delivery).

Hypothesis 2

Ho2a: The characteristics of nutrition educators, such as age, number of years as a home visitor, number of years delivering nutrition education, nutrition education background, and level of interest in teaching Eat Healthy (EH) materials, will relate to the extent to which the lesson was delivered as designed.

Ho2b: The characteristics of nutrition educators, such as age, number of years as a home visitor, number of years delivering nutrition education, nutrition education background, and level of interest in teaching EH materials, will relate to duration of topic delivery.

Hypothesis 3

There will be a difference in fidelity of implementation among sites, as assessed from periodic debriefing notes, the Nutrition Educator Report of Topic Delivery (Q4b: explain any things that helped or made it difficult for you to deliver this week's topic as designed), home visit observations, and exit interviews.

CHAPTER II. REVIEW OF LITERATURE

In this section, the literature reviewed relates to the rationale for and importance of process evaluation in determining the effectiveness of nutrition intervention programs. An overview of process evaluation components is followed by brief reviews of fidelity of program implementation and delivery of the intervention.

Most process evaluation literature comes from disciplines outside of nutrition education, such as health promotion, health education research, prevention science, and psychology, but even then, few papers have been published on process findings (Saunders et al., 2006). When evaluation emphasizes only program outcomes, there is room for misinterpretation of the actual findings. There also remains a lack of consistent, standardized vocabulary and definitions across disciplines like psychology, health promotion, education, mental health, and evaluation, making evaluation of the process evaluation literature challenging (Durlak and DuPre, 2008; Saunders et al., 2006; McGraw et al., 2000). For the purpose of this research, the process terms selected were fidelity of implementation and education delivery, as those most useful for assessing the effectiveness of the Eat Healthy program. For this literature review, each term is defined and then reviewed in terms of its usefulness in the program evaluation of EH.
Quality of training of nutrition educators is reviewed as a moderator of implementation fidelity and education delivery. Education received, reach, and context are other components of process evaluation; however, they are not reviewed here because they are not part of the approved thesis hypotheses.

Relevant peer-reviewed articles were identified using a search strategy that included electronic databases (ProQuest, Web of Science, PsycInfo, CINAHL, Google Scholar), citations from review articles, and other references from selected articles published between 1999 and 2014. Key search words included process evaluation, nutrition education interventions, fidelity, program fidelity, implementation fidelity, and dose delivery. Search results yielded approximately 400 articles, and of these about 10% were relevant.

Background of Eat Healthy, Your Kids are Watching- A Parent's Guide to Raising a Healthy Eater

The Eat Healthy (EH) program was developed over two years and underwent extensive formative evaluation (Reznar et al., 2014). EH is a six-week, home-based nutrition education intervention program aimed at low-income families with preschool children (ages 2½-5). For the present study, the EH intervention was implemented with over 100 participants, and conducting process evaluation was key. The level of complexity increased when the intervention transitioned from a single, pilot-study site to a multi-site (n=4) design in order to recruit enough participants to achieve adequate power for program outcomes. Para-professional educators delivered the education at each site, thus requiring monitoring of the program's implementation regarding adherence to EH's key nutrition concepts and objectives. Process evaluation was performed to provide valuable information about protocol adherence, consistency of delivery, barriers and facilitators to the delivery of the program, and fidelity of implementation, and will be beneficial when analyzing behavior outcomes (Schneider, 2009).

Fidelity of Implementation

The first step in ensuring fidelity is a well-designed study. Without working from a logic model or theory, and without a clear understanding of what constitutes complete and acceptable delivery of an intervention, researchers cannot assume high levels of implementation fidelity (Saunders et al., 2006; Baranowski et al., 2000). This becomes especially important when replication of the intervention is a goal (Carroll et al., 2007).

Prevalence of published studies on fidelity of implementation

Fidelity of implementation was selected for examination in this study because it anchors a program's success. It is defined as 'the degree to which an intervention is delivered as designed' (Saunders et al., 2005), and failure to monitor a program's implementation can lead to erroneous conclusions about its behavior outcomes. Therefore, measuring this component is essential in evaluating the internal and external validity of program or behavior outcomes, as well as understanding how and why an intervention works.

In a 1998 review by Dane and Schneider of 162 published mental health prevention studies on implementation (1980-1994), only 24% reported procedures and/or the results of implementation (Durlak and DuPre, 2008). Of these 39 studies, only 33% measured the impact of fidelity on program outcomes (Dusenbury et al., 2003). Durlak (1997) reported that by the end of 1995, fewer than 5% of 1200 studies published in mental and physical health and education reported any documentation of program implementation.
Measuring fidelity of implementation

Measuring fidelity has been as inconsistent as its definition. According to Dusenbury et al. (2003), prior to the early 1990s there was a lack of systematic methods to measure fidelity. To my knowledge, a standardized, inter-disciplinary tool to measure fidelity does not exist. This largely stems from the need to develop instruments that are specific to each program's intervention.

Despite the lack of standardized, validated measures, there are two primary methods used to measure program implementation fidelity: self-reports by educators and independent observations. In either case, these reports can be unstructured or collected on a quantitative instrument. In a review of 59 multi-disciplinary studies, Durlak and DuPre (2008) found that 16 used a combination of these methods. Well-designed process evaluations of intervention programs use both quantitative and qualitative data sources. Multiple data sources provide a range of data, which facilitates triangulation to substantiate process findings (Helitzer et al., 2000; Rossi and Freeman, 1993). Specific examples of how these methods have been applied in six studies are provided in the following paragraphs. Of these six studies, only the last two linked process data to behavioral outcomes.

Saunders et al. (2006) developed a set of seven "essential elements" to guide the instructional implementation of a physical activity intervention (Lifestyle Education for Activity Program, LEAP) by physical education teachers. Saunders then examined the association of fidelity of implementation with behavior outcomes using two sources of implementation data. First, the teachers logged records of their adherence to each of the seven elements, for example, keeping physical education (PE) classes separated by gender, emphasizing enjoyable lifelong activity, and having students engaged in physical activity at least half of the time. Researchers then reviewed the logs using a content-validated, 35-item instrument to rate each teacher's fidelity. Next, investigators conducted two observations of each PE class and rated teacher adherence to the elements using a content-validated, 25-item instrument. Both instruments had an objective rating system in which data were summed and averaged to develop index scores. Schools were then ranked from highest to lowest based on these index scores and placed into one of two groups, high and low implementers. Schools that consistently ranked in the bottom third of both data sources were classified as low implementers. Outcome data on vigorous physical activity showed that only the girls in high-implementing schools reported a significant improvement at post-test.

Lee et al. (2003) evaluated "faithfulness to the curriculum" by teachers (fidelity of implementation) for a middle school obesity risk-reduction nutrition education curriculum using both teacher self-report surveys and classroom observations. Trained research staff measured only process, not behavioral, outcomes: lesson completion, fidelity of implementation, and barriers to delivering the curriculum as designed. Research staff evaluated teachers' implementation using a quantitative classroom observation form with a 5-point Likert scale. Teachers received an initial score of 5 points, but lost one point for each change, omission, or addition to the designed program that occurred.
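This deduction-style rubric is straightforward to operationalize. The sketch below is illustrative only: it assumes one point is deducted per observed deviation and that scores are later expressed as percentages, as Lee et al. did; the function names are hypothetical, not from the study.

    def lesson_fidelity_score(n_deviations, start=5, floor=1):
        """Score one observed lesson: start at 5 points and deduct one
        point per change, omission, or addition to the designed program."""
        return max(start - n_deviations, floor)

    def score_as_percentage(score, max_score=5):
        """Express a rubric score as a percentage of the maximum."""
        return 100.0 * score / max_score

    # Example: a lesson with two observed deviations scores 3/5, i.e. 60%.
    print(score_as_percentage(lesson_fidelity_score(2)))  # 60.0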
Independent implementation coordinators assessed lesson completion using a non-validated, lesson-specific form with a 5-point Likert scale where 1=none, 3=half, and 5=all. Both implementation and lesson completion scores were converted into percentages. Ranges of implementation fidelity and lesson completion were 62-93% (mean=76%) and 60-93% (mean=70%), respectively. A score <33% was deemed low, 33-67% medium, and >67% high.

Active by Choice, a 3-year after-school program (Wilson et al., 2009), examined how fidelity of implementation could be improved over time. Trained independent process evaluators observed intervention staff during the 2-hour program each program day (3 days per week for two weeks), six times at three schools. Evaluators used a quantitative checklist with 4-point Likert-scale questions to assess fidelity, as well as yes/no questions to assess education delivered. With a fidelity goal of 3 or higher on the 4-point scale, the averages ranged between 2.5 and 3.8 across the six weeks of observation. This checklist was not validated, and the authors did not cite this as a limitation. Ongoing process data and feedback to teachers resulted in strategies to improve levels of fidelity. Changes included visual and design changes to the curriculum manuals, development and delivery of core training for each school's team leaders prior to program start dates, mid-year booster trainings, and constructive feedback given based on internal evaluations conducted by the project director. The researchers did not relate these process data to behavior outcomes.

A school-based community nutrition intervention provided nutrition education to school children in a low-income rural area of Arkansas (Dollahite et al., 1998). Process evaluation included teacher-completed checklists indicating the parts of the nutrition curriculum used and what was added. The investigators also held focus groups with the teachers to assess barriers and facilitators of implementation. Though researchers used validated instruments for behavioral and knowledge outcomes, the instruments developed for process evaluation were not validated. Teachers' fidelity to the curriculum ranged from 12-89% (mean=40%). Investigators did not relate their process evaluation data to the behavioral outcome data.

In the two-phase, two-year Team Nutrition Pilot Study (Levine et al., 2002), both process and behavior outcomes were assessed by evaluating whether the program was implemented as designed and whether changes in children's eating behaviors resulted from social marketing techniques. Data collection sources included activity logs completed by teachers for lesson completion and lesson duration, observations of teachers in classrooms and cafeterias to assess adherence to the protocol, and self-reported surveys completed by teachers and parents to evaluate, respectively, attitudes toward teaching the material and perceived changes in child eating habits. Data were compiled, reviewed, and synthesized for each of the sites. Evaluators tracked changes in implementation between the two phases. The changes appeared to relate to increased teacher experience and additional planning and development of the program. While fidelity data were collected via self-reported teacher activity logs and teacher observations by researchers, few data were provided on the fidelity findings. Investigators did report that the teachers used the lesson materials approximately 66% of the recommended times. Validation of instruments was not discussed.
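A recurring pattern across these studies is to reduce delivery or completion to a percentage and then band that percentage, as Lee et al. did with the <33%, 33-67%, and >67% cut points. A minimal, illustrative sketch of that step (the cut points come from Lee et al.; the function names are mine):

    def percent_delivered(units_delivered, units_recommended):
        """Share of recommended lessons (or lesson uses) actually delivered,
        e.g. the ~66% of recommended uses reported by Levine et al. (2002)."""
        return 100.0 * units_delivered / units_recommended

    def fidelity_band(pct):
        """Band a fidelity/completion percentage per Lee et al.'s cut points."""
        if pct < 33:
            return "low"
        if pct <= 67:
            return "medium"
        return "high"

    print(fidelity_band(percent_delivered(2, 3)))  # ~66.7% -> "medium"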
The 5-a-Day Power Plus study of fourth and fifth grade students (Story et al., 2000) examined both behavioral (fruit and vegetable intake) and process outcomes. Process evaluation used teachers' weekly self-report curriculum checklists querying specifics of what was covered during the lesson, to what extent they adhered to the curriculum, and lesson duration. Teachers' self-reported checklists were evaluated weekly by trained evaluation staff. Trained evaluation staff also conducted classroom observations of lessons using a standardized instrument of 17-21 items. All teachers were observed twice during the 8-week intervention. Investigators also examined the association between behavioral (increased fruit and vegetable intake) and process outcomes. Findings showed that among fifth grade classes, high-implementing schools had a significant increase in fruit and vegetable intake compared to the low-implementing schools. No significant results were found among fourth grade classes. Teachers' self-reported data showed that they implemented the curriculum as designed between 82-92% of the time. High levels of implementation were also found among classroom observation findings, which showed that teachers implemented greater than 90% of classroom curricula and activities as designed.

From these six studies, there is consensus that observational data are more objective than self-reports, but each yields valuable information on implementation fidelity. Some studies show that observational data had a higher association with program outcomes than did data from self-reports (Byrnes et al., 2010; McGraw et al., 2000; Saunders et al., 2006). Although the intent of this current study was not to evaluate the association of process elements with program outcomes, it is worth noting that several studies have found positive associations between higher levels of implementation fidelity and program outcomes (Levine et al., 2002; Story et al., 2000; Cho et al., 2005; Spoth et al., 2002).

From this review of studies on fidelity of implementation, another salient point is that there has been little validation of instruments to assess fidelity. The best methods to date use a combination of educators' self-reports compared with investigators' observations of classroom experience, and this combination was selected for this study. Although this in itself is a type of concurrent validation, there still is no single indicator of good fidelity. The expectation of perfect implementation is unrealistic, and no data of perfect implementation have been reported to my knowledge. An assessment of 80% fidelity as good is the one selected for use in this study (Dollahite et al., 1998).

Moderators of Fidelity

Several factors might influence or moderate an intervention's level of fidelity, such as quality of educator training, educator characteristics, and complexity of the intervention. Identifying and controlling for the contribution of these factors is essential so that adjustments can be made to overcome barriers and improve fidelity of program implementation in a timely manner.

Quality of educator training

Many factors have been hypothesized to affect the quality of educator training. They include standardization of training, training sufficiency, self-efficacy to deliver the intervention, clarity of the delivery process, and timely feedback. Ideal implementation cannot exist without adequate and standardized educator training (Bellg et al., 2004).
Researchers conducting the trainings must monitor and evaluate how educators receive the training. Ensuring that all educators are trained in a standardized manner is thought to enhance implementation fidelity (Bellg et al., 2004). How well education materials are delivered often hinges on an educator's belief that they have been adequately trained, as well as on their self-confidence that they can deliver the material as designed. Helitzer et al. (2000) reported that despite educators' self-reports that they felt they had received sufficient training, implementation outcomes, as assessed by an observer using a form that included the same list of content areas and rating scale completed by the teachers, did not always support the educators' reports. Some educators did not entirely complete all lessons, specifically those activities that were more abstract and time consuming.

Training should also adequately cover how process data collection forms are to be completed. Consistency in how to score the forms is paramount. Helitzer et al. (2000) found that there was not a clear, cohesive evaluation system linking the self-report educator checklist and the observer checklist in a study on adolescent health. Scoring methods were not identical nor well designed, resulting in poor inter-rater reliability and inconsistencies. This was not brought to the researchers' attention until the study ended and the data were evaluated. Timely and effective feedback with and by educators throughout the intervention helps to minimize such problems.

These findings suggest that support strategies, such as training manuals, guidelines, and ongoing technical support, should provide additional ways to facilitate higher levels of implementation. Likewise, adequate training should provide the educators with competence in intervention skills, acknowledge their expectations and motivations, and increase their sense of self-efficacy (Durlak & DuPre, 2008).

To address these concerns regarding the quality of educator training in this Eat Healthy study, two standardized trainings were conducted at each of four delivery sites. The two trainings included: 1) clearly outlined data collection procedures and expectations, and 2) detailed instructions on educational delivery and reporting. Educator self-efficacy to deliver the education was measured using a self-reported but validated instrument for the motivational effectiveness of materials (Keller, 1987). These findings were then triangulated using home visit observation field notes as well as completion of the NERTD (Nutrition Educator Report of Topic Delivery) by the observer, i.e., the author of this thesis. To keep barriers or problems from escalating, timely feedback between educators, the principal investigator of the Eat Healthy study, and the author of this thesis occurred via prompt responses to educator emails plus weekly or bi-monthly phone debriefings with site coordinators and educators.

Educator characteristics

Educator characteristics including ethnicity, age, years of teaching, level of education, opinions about the intervention materials, level of interest in the program or motivation, self-confidence to deliver the materials, quality of working alliances, and participant-educator match can all influence implementation fidelity. Educators with higher levels of self-efficacy and engagement with or ownership of the program were more likely to have higher levels of implementation (Barr et al., 2002; Cooke, 2000; Kallestad & Olweus, 2003). Yet, in the study by Barr et al. (2002), self-reported data were the sole means of measuring implementation fidelity. The authors acknowledged this limitation and their inability to draw credible conclusions. In a study by Lee et al. (2013), there were no significant correlations between educator characteristics and implementation.

Young et al. (2008) reported that teachers delivering the Trial of Activity for Adolescent Girls intervention in schools had lower levels of implementation compared to the intervention program staff who delivered the intervention to teachers and community workers during the second and third intervention years. Specifically, staff members who took ownership of the program were recruited, trained, and identified as "program champions". These staff members then directed the intervention in the third year. Instruments were validated after the intervention pilot study. The authors offered that teachers might have had limited interest in delivering the education, possibly due to requirements to alter their teaching practices and time constraints to complete district-mandated curricula, which likely impacted educator motivation.

Researchers have examined the quality of the working alliance between therapist and patient in mental health studies. This relationship is also one of the most consistently supported predictors of patient outcomes (Heinonen et al., 2013). Researchers have evaluated characteristics such as a therapist's level of education, skill, post-graduate training, and experience as potential predictors of patient outcomes. In a study by Hersoug et al. (2001), 59 therapists and 270 patients completed the self-rated Working Alliance Inventory questionnaire, comprising three subscales: Bond, Task, and Goal. These subscales assessed the extent of the patient-therapist bond, the patient's capacity to work earnestly in therapy, and the level of patient-therapist agreement on the patient's goals. Findings differed between therapists' and patients' perspectives on their working alliance. While training and skill were positively related to working alliances as reported by therapists, no associations have been reported between the therapists' experience, training, or skill and alliances as rated by patients. Heinonen et al. (2013) reported that therapists' self-rated interpersonal skills, such as engagement and encouragement, appeared to foster stronger alliances in short-term (less than 8 months) versus long-term (three years) therapy interventions. Likewise, therapists reporting early formation of alliances experienced more confidence and enjoyment in their work.

While similarities in race and ethnicity between therapist and patient, or interviewer and client, have been presumed to foster improved alliances and outcomes, findings have been inconsistent, as reported next. Cabral and Smith (2011) conducted a meta-analytic review of racial/ethnic matching and reported the effects of matching to be highly variable. Racial/ethnic matching mattered most among African-American participants (statistically significant) but not among White/European Americans. Although there are differences between the relationships of therapists and patients versus educators and program participants, similar associations might be pertinent in this study as well. It was not, however, possible to match on race/ethnicity for educator-participant dyads in this study, and neither was it possible to match based on educational level. As such, these are potential study limitations.
To assess the effects of educator characteristics on implementation fidelity, some of these factors were captured on two instruments at the educators' training: the Nutrition Educator Demographic Questionnaire and the ARCS (Attention, Relevance, Confidence and Satisfaction). This permitted determination of whether and how some educators' characteristics relate to fidelity of implementation.

Educator selection

Educator selection is a critical component of implementation fidelity. Researchers must consider who is qualified to deliver the program and how educators or practitioners will be selected (Fixsen et al., 2009). Educator characteristics are often as important as academic qualifications or level of experience. Some characteristics are unchangeable or difficult to teach, such as attitude, common sense, basic professional skills, willingness to learn, good judgment, and empathy.

Educator selection hinges largely on the study design. Simpler interventions may only require volunteers, while more complex interventions may have more specific requirements (Baker, Gersten & Keating, 2000; Schoenwald et al., 2004). It also matters whether educators are assigned or chosen from a current work environment, or whether they are recruited or volunteer. Weider et al. (2007) found that recruited or volunteer educators were observed to be more motivated, enthusiastic, and open to program changes compared to assigned educators. In the Integrated Dual Disorders Treatment (mental health) evidence-based program, eight states participated in implementing the intervention. Fidelity of implementation data were collected by trained PhD researchers. Across sites, there was considerable variation in the staff selection process. One issue that emerged was the limited pool of prospective staff. All of the sites were challenged by the time frame in which to recruit staff, resulting in hiring for convenience in some instances. The interaction of staff selection with other implementation fidelity components, such as staff training and ongoing supervision, was essential for minimizing large deviations from the intervention's intended design.

Complexity of intervention

Maintaining high levels of implementation fidelity becomes more challenging as the sample size, number of participating sites, geographic coverage (local, state, national), or number of intervention components increases. As program size and complexity increase, clear, well-defined procedures, key messages, and outcome goals become more crucial. Likewise, as all of these increase, so too will the number of resources needed to monitor and measure implementation fidelity (Saunders et al., 2005). Hill et al. (2007) found that by reinforcing essential program elements at the beginning of the intervention and having open lines of communication between the 12 sites, the program developers, and the administrators, the negative impacts on implementation fidelity were minimized. Time constraints juxtaposed against the breadth of data collection, particularly in multi-site studies, can present challenges for timely feedback and adjustments (Wilson et al., 2009). In a study by Schneider et al. (2009) evaluating nutrition and physical activity behaviors in middle schoolers, differences in how schools were structured impacted implementation, specifically the data collection procedures. Through timely, ongoing feedback of process data, however, program developers were able to modify data collection methods immediately.
Specificity of intervention key messages is also imperative as complexity increases. Dusenbury et al. (2005) reviewed implementation fidelity studies for drug abuse prevention in schools and found that the highest levels of implementation resulted from adequate teacher training and staff development, as well as implementation protocols and clearly defined key program messages. The more clearly defined and transparent the program is, the greater the likelihood of good implementation fidelity and program success.

The study design of the Eat Healthy intervention is complex. Across four counties, initially there were 24 nutrition educators delivering materials to approximately 140 participants in their respective homes. Timely monitoring via weekly debriefing calls with site coordinators, home visit observations by the study coordinator, and timely responses to educator phone calls and email inquiries was essential to make appropriate adaptations to facilitate high levels of implementation fidelity.

Educational Delivery

Researchers often evaluate the delivery of education, commonly referred to as "dose or dosage", as a subcategory of implementation fidelity. A lack of consistency within the body of literature made it difficult at times to separate delivery from implementation. Assessing adherence to education delivery protocols is essential and is often captured by the length of time spent delivering each component of an intervention. Comparing reported or observed data to benchmarks defined in the curriculum protocols allows researchers to evaluate possible barriers to delivery adherence, provide feedback, and make adjustments as deemed necessary. For example, Faw et al. (2005) found that intervention delivery time logs of intervention providers differed considerably from what was expected in the protocols.

In the WIC 5 a Day Promotion Program study by Anliker et al. (1999), inconsistent delivery was found among peer educators. Despite clear, standardized procedures, researchers found that educator factors such as tardiness, absenteeism, turnover, and time burden negatively affected consistency of delivery. Use of validated instruments to measure delivery was not reported. Rather, each nutrition session delivered by an educator was evaluated by the project manager and project nutritionist via an interview eliciting the educator's experience.

In the 3-year Active by Choice Trial, delivery of education was measured using a 17-item observation checklist with a binary yes/no format. Researchers found improvements in delivery between year one (32-80%) and year three (91-100%), suggesting that these improvements were likely due to formative process evaluation leading to changes in educator training, teaching manuals, and data collection (Wilson et al., 2009).

Byrnes et al. (2010) assessed delivery in terms of adherence in a study examining the relationship between program fidelity and family engagement in two family-based adolescent drug and alcohol abuse programs. To ensure objectivity, all sessions were videotaped and phone calls recorded. The researchers adapted the instruments used to evaluate program adherence from the original program manuals and then trained raters in both program delivery and scoring. Kappa and consensus scores measured inter-rater reliability; average scores were a Kappa of 0.67 with 92% agreement for program SFP, and a Kappa of 0.76 with 90% agreement for program FM.
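For reference, Cohen's kappa corrects raw percent agreement for the agreement two raters would reach by chance. A minimal sketch of how both statistics are typically computed (illustrative only; this is not the Byrnes et al. code):

    from collections import Counter

    def agreement_stats(rater_a, rater_b):
        """Percent agreement and Cohen's kappa for two raters on the same items."""
        assert len(rater_a) == len(rater_b) and rater_a
        n = len(rater_a)
        # Observed agreement: share of items where the raters match.
        p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Chance agreement: from each rater's marginal category frequencies.
        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        p_chance = sum(counts_a[c] * counts_b.get(c, 0) for c in counts_a) / n ** 2
        kappa = (p_obs - p_chance) / (1 - p_chance)
        return 100 * p_obs, kappa

    # Example with binary yes/no adherence codes from two raters.
    pct, kappa = agreement_stats(list("yyynnyyn"), list("yyynyyyn"))
    print(f"{pct:.0f}% agreement, kappa={kappa:.2f}")  # 88% agreement, kappa=0.71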
The instrument asked questions about completion of specific tasks and the time it took to complete each activity, using a binary (yes/no) scoring system. Average percentage adherence scores were calculated for each program by summing the items and dividing by the total possible score for each session. Researchers found that educators' adherence to delivery was 78% and 82%, respectively. Interestingly, in one program that had more parent discussions and a tight timeline, educators reduced discussion time in order to deliver all materials and achieve higher adherence scores. The researchers emphasized the importance of training educators on how to redirect parents and maximize delivery efforts in the face of time constraints.

Levine et al. (2002) found that teachers delivered approximately 66% of Team Nutrition lesson material, as captured by self-report teacher activity logs. Teachers cited time constraints and copy expenses as the primary reasons for reduced delivery.

For the purpose of this study, evaluation of the educational delivery included: 1) the self-report score for ability to deliver the topics as designed, and 2) the time spent delivering each topic. To assess consistency among educators and sites, the researcher used the Nutrition Educator Report of Topic Delivery (NERTD) and home visit observation field notes to measure these variables.

Barriers of educational delivery

Researchers consistently cite time constraints as a barrier to achieving optimal delivery of intervention materials (Hill et al., 2007; Dariotis et al., 2008). There are additional barriers to delivery due to characteristics of the educators, such as lack of interest in the intervention and poor teaching skills (Botvin, 2004). Potential barriers for the Eat Healthy study include disruptions in educators' schedules due to vacations, other programs, scheduling conflicts, and time constraints. Nutrition educators reported on facets of education delivery, such as duration of each topic delivery, barriers and facilitators affecting delivery of the material, and levels of participant and child engagement, using the NERTDs. During the exit interview, the author of this thesis probed educators with open-ended questions to elicit deeper responses on what aided or impeded their ability to deliver the materials as intended.

Facilitators of educational delivery

In the studies reviewed, researchers have cited as important factors for successful implementation the adequate and timely training of educators, support for educators by site coordinators and administrators, and educator and coordinator support by program developers (Anliker et al., 1999; Story et al., 2000; Hall et al., 2011). Dusenbury et al. (2005) reported that in addition to educator training, program characteristics, educator characteristics, and organizational characteristics all facilitate high-fidelity implementation. Positive program characteristics include detailed implementation instructions for the curriculum, as well as ease of administering program materials. This provides a way for researchers (program developers) to measure adherence or delivery as intended (Sanchez et al., 2006). Educator characteristics such as confidence, enthusiasm, respect for participants, and good interpersonal skills also facilitate higher levels of program delivery and fidelity (Forman et al., 2009).
Likewise, effective and ongoing consultation, leadership, and morale between program developers and participating sites are critical organizational characteristics for successful implementation (Dusenbury et al., 2003).

For this study, the literature supports the use of both quantitative and qualitative data sources to measure fidelity of implementation and education delivery. Nutrition educators completed self-reports, and I conducted several home visit observations, at least one per site, using a mixed-methods approach (NERTD and field notes). Collectively, these data provided a detailed and comprehensive picture of how the implementation of Eat Healthy unfolded. My goal for fidelity of implementation and education delivery was 80% or greater (a score of 4 or 5 on the 5-point Likert-scaled Nutrition Educator Report of Topic Delivery, NERTD) both within and among sites; however, the literature has shown that positive outcomes have resulted with scores as low as 60%.
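One way to operationalize this goal is to compute, per educator or per site, the share of NERTD lesson reports scored 4 or 5 on the 5-point scale. A minimal sketch under that reading of the criterion (the function name and example data are illustrative):

    def pct_meeting_goal(nertd_scores, threshold=4):
        """Share of NERTD lesson reports (1-5 Likert) at or above the threshold."""
        meeting = sum(s >= threshold for s in nertd_scores)
        return 100.0 * meeting / len(nertd_scores)

    # Example: seven lesson reports from one site; 5 of 7 scored 4 or 5,
    # i.e. about 71%, which would fall short of the 80% goal.
    print(round(pct_meeting_goal([5, 4, 3, 5, 4, 2, 4]), 1))  # 71.4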
CHAPTER III. METHODS

Study Sites and Nutrition Educators

There were four participating counties in this multi-site study, with a total of 20 nutrition educators and a designated site coordinator for each of the four sites. In Ingham County (the greater Lansing area), the Principal Investigator of the Eat Healthy study and the author of this thesis (Karp) interviewed 10 senior students in Dietetics at Michigan State University and hired four to work with Capital Area Community Services (CACS) Head Start families. The author of this thesis was the study coordinator, as well as a nutrition educator, in Ingham County. The Michigan Nutrition Network (1), of the Michigan Fitness Foundation, the project funder, designated Genesee, Van Buren, and Kent Counties as additional delivery sites. The Intermediate School Districts (ISD) in these three counties employed all the outlying nutrition educators as home visitors working with SNAP-Ed eligible families in various parenting programs for parents of preschoolers. Van Buren comprised two teams of educators, Van Buren ISD (n=4) and Project L.E.A.N. (n=2). Where appropriate and possible, each team was analyzed separately.

The selection of educators for Eat Healthy (EH) occurred differently among study sites. Researchers matched the Ingham County participants with five Michigan State University senior dietetic or graduate students as their educators, i.e., a convenience match. In the Kent County Intermediate School District (ISD), the site coordinator selected the three educators based on how well they attended to details; the site coordinator stated that this attribute would facilitate adherence to the intended delivery protocol. Participants from each educator's particular caseload then volunteered for this EH study on process evaluation. In Genesee County ISD, the six ISD educators wanted to participate, and their participant matches were made from each educator's established caseload. For Van Buren County ISD, four educators wanted to participate and recruited participants from their assigned geographic areas of responsibility, which were based on each educator's ties and relationships within the community. Van Buren ISD hired two additional educators from Project L.E.A.N. (2) specifically to recruit and educate new SNAP-Ed eligible families for this study. This EH process evaluation study had five educators in Lansing, six each in Genesee and Van Buren, and three in Kent.

(1) Michigan Fitness Foundation, Michigan Nutrition Network, 1213 Center Street, Suite D, Lansing, Michigan 48906. www.michigannutritionnetwork.org, Marci K. Scott, PhD RD; Deborah Harris, MPH RD CDE.
(2) Project L.E.A.N. (Linking Education, Activity and Nutrition). Van Buren ISD oversees Project L.E.A.N.

Design of the Educational Program for which this Process Evaluation Occurs

Eat Healthy was field tested in Fiscal Year 2013 using a randomized, quasi-experimental, multi-site intervention study design with approximately 150 participants. Michigan Nutrition Network (MNN) randomly assigned participants to either the intervention group or the control group. The intervention consisted of five booklets, one per topic, written at a third- to fifth-grade reading level, with 24 accompanying DVD segments of 2-3 minutes each featuring real SNAP-eligible parents of preschoolers in their own homes. Nutrition educators delivered the intervention over a six-week period, doing three home visits alternating with three phone calls. This delivery method was selected because experience has shown that low-income parents cannot or will not attend a series of lessons. The control group was given a booklet of general preschool health guidelines, Hip on Health® (3). A washout week followed the completion of the first intervention arm. Following the washout week, the educators then delivered the intervention to the control group for the subsequent six weeks. A follow-up assessment was completed for each group three months following completion of the intervention.

(3) Hip on Health® (Health Information for Parents). Parent and family resource materials.

Eat Healthy Process Evaluation Conceptual Model

The conceptual model for this study was adapted from one used by Lee et al., 2013. For the purposes of this study, only fidelity of implementation and education delivery were examined. The effects of barriers, facilitators, and moderators on these two components, as well as process outcomes, were also evaluated.

Figure 1. Eat Healthy process evaluation concept model adapted from Lee et al., 2013.

Procedures

Prior to the start of the process evaluation study, Institutional Review Board (IRB) approval was obtained from Michigan State University's Committee for Research Involving Human Subjects (see Appendix A). Potential nutrition educators for this multi-site study were recruited in one of two ways. Michigan State University (MSU) nutrition educators were recruited from senior dietetic students in the greater Lansing area. Genesee, Kent, and Van Buren County nutrition educators were recruited from their respective Intermediate School Districts (ISD). Once recruitment of nutrition educators was complete for each site, the Principal Investigator of the Eat Healthy study and the author of this thesis (Karp) delivered the first training, which covered participant recruitment, data collection, and measurements. Nutrition educators completed the ARCS for training (Attention, Relevance, Confidence and Satisfaction) and a nutrition educator demographic questionnaire. Nutrition educators also received a copy of the MSU Informed Consent for Nutrition Educators (see Appendix B for the consent form). Prior to recruitment of participants, all nutrition educators completed MSU IRB Human Research Protection training.
Following certification of IRB training, nutrition educators recruited participants, obtained MSU participant consent forms, and collected data on participants, including the demographic questionnaire, a NutritionQuest Block Food Frequency Questionnaire for the child's food intake, and a parent feeding behaviors questionnaire. Educators both faxed and later mailed to MSU the original participant demographic questionnaires and consent forms.

After commencement of participant recruitment, nutrition educators were next trained on delivery of the Eat Healthy materials (see Appendix C for the PowerPoint slides). Following this training on education delivery, nutrition educators completed a validated form assessing how motivational and confidence-building they perceived the materials to be. This form is called the ARCS, short for the four subconstructs Attention, Relevance, Confidence and Satisfaction (Keller, 1987). Once fully trained, nutrition educators began delivery of education to those participants assigned to the first intervention group. Following completion of each topic, nutrition educators faxed the Nutrition Educator Report of Topic Delivery (NERTD) to MSU. Participants assigned to the control group received Hip on Health® materials.

The author of this thesis (Karp) conducted weekly conference calls with each outlying site coordinator and nutrition educators and held weekly team meetings with MSU nutrition educators to inquire about facilitators, barriers or problems encountered in recruitment, education delivery or fidelity of implementation. Based on feedback received, the author of this thesis (Karp) and the Principal Investigator made adjustments to delivery or implementation as deemed necessary. Throughout the intervention, the author of this thesis (Karp) conducted one to two home visit observations per site, taking qualitative field notes on educational delivery as well as fidelity of implementation. Post-intervention, the author of this thesis (Karp) conducted exit interviews with nutrition educators from each site, asking questions about the quality of nutrition educator training and delivery of education, as well as barriers and facilitators to delivery. The timeline for these procedures is found in Table 1.

Table 1. Timeline of the process evaluation of the Eat Healthy intervention.
April 2013: Review of thesis proposal by Graduate Advising Committee
February-April 2013: MSU IRB approval obtained for instruments, methods, protocols
June 2013: Delivered standardized trainings at each county
March-September 2013: Data collection and recruitment
May 2013-March 2014: Education delivery
June-July 2013: Thesis proposal approved by Graduate Advising Committee; ongoing recruitment in Genesee, Van Buren and Ingham Counties
December 2013-January 2014: Weekly/biweekly debriefings with site coordinators; data collection; education delivery; home visit observations in each county; analysis of nutrition educator characteristics, NERTD, and ARCS for educator training data
February 2014: Conduct exit interviews with educators in each county
March 2014: Complete quantitative analysis for hypotheses 1 and 3; analyze qualitative data (transcripts from periodic debriefings, NERTD Q4b: extent to which the lesson was delivered as designed, home visit observations, exit interviews)

Nutrition Educator Training

As mentioned in Procedures, the nutrition educator training occurred in two phases. Each is described in the following paragraphs.
The first phase (120 minutes) covered MSU IRB training procedures, recruitment of participants, data collection, and training and practice of height and weight measures. The training emphasized completeness and accuracy of data collection and anthropometric measurements.

The second training phase (90 minutes), on the delivery of the Eat Healthy curriculum, emphasized using the adult learning approach of Anchor, Add, Apply and Away (AAAA) to reinforce key messages throughout all topics. This AAAA approach acknowledges the adult participant's experience or knowledge while providing "just in time" information (Goetzman, 2012). The author of this thesis (Karp) provided a curriculum grid outlining the adult learning components for each topic, topic overviews with participant learning objectives, and detailed lesson plans to all nutrition educators. Trainers and nutrition educators worked through the first two EH topics in detail to provide a solid foundation for curriculum delivery. The investigators reviewed with educators the process evaluation instruments and data collection timeline for the Nutrition Educator Report of Topic Delivery and the ARCS for participants. To assess quality of training, nutrition educators completed the ARCS (Attention, Relevance, Confidence, Satisfaction) questionnaire following each training session. Training sessions were both didactic and interactive, providing opportunities for nutrition educators to practice skills and ask relevant questions (see Appendix D for the PowerPoint presentation and curriculum grid).

Instruments

There were seven instruments to collect data for process evaluation. Data from the first three provided insight on education delivery (Nutrition Educator Demographics, ARCS for nutrition educators, and the Nutrition Educator Report of Topic Delivery, NERTD). Data from the last four provided insight primarily on fidelity of implementation, but also captured information on education delivery. These were periodic debriefing forms, home visit observations, NERTD Q4b, and exit interviews. Table 2 shows the two process evaluation components and the corresponding instruments.

Table 2. Process evaluation components and four of seven instruments.
Education delivered:
- Nutrition Educator's Report of Topic Delivery (NERTD); type of data: mixed-method questions using NERTDs; completed by: nutrition educators; timing: weekly
Fidelity of implementation:
- Home visit observations (1-2 per site); type of data: NERTDs and qualitative field notes; completed by: author of this thesis; timing: during intervention
- Periodic debriefing forms; type of data: qualitative field notes; completed by: Principal Investigator of EH and author of this thesis; timing: weekly/bimonthly
- Exit interviews (semi-structured); type of data: qualitative field notes; completed by: author of this thesis; timing: post-intervention

Instruments for delivery of education

Nutrition educators completed a 10-item Nutrition Educator Demographic Form capturing sex, age, race, ethnicity, education level, number of years teaching nutrition education, nutrition education background (including formal nutrition classes), and number of years employed as a home visitor. In addition, each educator's level of interest in teaching the EH materials was measured on a 5-point Likert scale (1=no interest, 5=very interested) (see Appendix E for this instrument). Nutrition educators also completed the ARCS for nutrition educators after each of the two training sessions.
The ARCS, also referred to as the Instructional Materials Motivation Survey, originally developed by Keller (1987), assesses the learner's motivation, or lack thereof, with respect to instructional materials design. The ARCS measures four subconstructs: Attention, Relevance, Confidence and Satisfaction. The ARCS for nutrition educators was adapted from the original validated and reliable 36-item questionnaire, in which respondents scored items from 1=not true to 5=very true. Based on advice from Huang et al. (2006), the original scale was reduced to 13 and 16 items for the respective trainings: 1) recruitment, data collection and measures; and 2) training on delivery of education, previously described under Nutrition Educator Training. Principal component analysis of the original scale to reduce the number of items was completed on a sample of about 300 young adults (Reznar et al., 2014). Cronbach's α for each construct was greater than 0.60: for Attention, four items, α=0.80; for Relevance, four items, α=0.69; for Confidence, four items, α=0.74; for Satisfaction, four items, α=0.82. The ARCS for the education delivery training contained three additional items that were not relevant to the Phase 1 training: "It was hard to pick out and recall important points to teach participants," "The activities in this training were too difficult for me to teach," and "I could not understand how I can teach this material." (See Appendix F for the ARCS.)

Nutrition educators completed a Nutrition Educator's Report of Topic Delivery (NERTD) questionnaire after delivery of each lesson for the five topics delivered in six contacts. The NERTD had six items on a half page. A 5-point Likert scale was used to score four of the six items. Nutrition educators scored the extent to which the parent was engaged in each week's topic and the extent to which the child was engaged in each week's activity; scores ranged from 1=not at all engaged to 5=very engaged. Perceived parent reactions to each topic's key messages were scored from 1=strongly unfavorable to 5=strongly favorable. The extent to which the nutrition educator was able to deliver the topic as designed was scored from 1=not at all able to 5=completely able. An open-ended item (Question 4b) elicited responses regarding problems or barriers, as well as facilitators, that impacted delivering each week's topic as designed. Finally, the number of minutes spent on topic delivery was recorded (see Appendix G for the instrument and coding tree).

Instruments for fidelity

To assess the level of implementation fidelity, including any barriers or facilitators encountered, the Principal Investigator of the Eat Healthy study and the author of this thesis (Karp) completed Periodic Debriefing Notes via weekly or bi-monthly conference calls with each site's coordinator throughout the course of the intervention, using an interview guide developed from the research objectives and aim (Table 3). Data from these debriefings were recorded and transcribed for qualitative analysis. Weekly team meetings were held with Lansing educators, but these were not coded. As the intervention unfolded, the interview guide transitioned to an open-ended format, so that site-specific needs and concerns regarding topic delivery guided the debriefing.
Thus, these weekly debriefings served two functions: 1) a source of information on fidelity of program delivery; and 2) an avenue for timely feedback to educators and troubleshooting of problems as they arose.

Table 3. Debriefing items queried of site coordinators and educators, by month (Month 1, Month 2, Month 3, Months 4-8), for all sites but Lansing.
1. How are things going overall?
2. What problems have been encountered in data collection?
3. What barriers have been encountered during home visits?
4. What barriers have been encountered during phone calls?
5. What positive feedback/facilitators have educators experienced in delivering the EH curriculum?
6. Have educators been using the Teacher Guide, including the curriculum grid?
7. Have educators been referencing participant status spreadsheets from the outside evaluator?
8. Educators were leaving question 4b on the NERTD blank; has this been corrected?
9. Have any educators had to modify lesson delivery?
10. General comments or concerns from educators?

In the home visit observations, the author of this thesis took qualitative field notes and completed additional NERTDs evaluating the educator's delivery of a lesson. Both educator and observer completed a NERTD, and the mean score and standard deviation for each NERTD item were calculated. The goal for fidelity was 80% agreement on each item (observer vs. educator). Approximately 25-30% of educators (n=7) were observed delivering one lesson during the intervention. The author of this thesis also took qualitative field notes on educational delivery during these home visit observations (1-2 per site, n=7). These typed observational field notes were used to evaluate key elements of education delivery and implementation fidelity, including nutrition educator characteristics that could potentially impact both process components, such as motivation and engagement in teaching the EH materials and the relationship dynamic between educator and participant (see Table 22 for home visit themes).

The author of this thesis (Karp) conducted exit interviews with 13 nutrition educators across the four sites. Interview items consisted of five types of questions, including open, introductory, transition, key and ending questions (Krueger & Casey, 2000). Table 4 shows the items generated for the exit interview based on the process evaluation concept model (Figure 1). In addition, Q4b from the NERTD on barriers and facilitators was used. These items inquired about facilitators and barriers impacting an educator's ability to deliver the lesson as designed. The researcher also queried the educators about their perceptions of the effectiveness of the educational delivery training. These interviews, lasting from 35-75 minutes, were audio recorded and transcribed verbatim.

Table 4. Exit interview items queried of site coordinators and educators at each site.
1. How satisfied were you with the training you received on delivering the lessons? Why?
2. How sufficient was the training?
3. How could the training be more effective?
4. What might improve the data collection process?
5. What were the largest barriers to being able to deliver the lessons as designed?
6. What were the largest facilitators to being able to deliver the lessons as designed?
7. If you were not able to deliver the lessons as intended, how did you modify the protocol?
8. In response to your participants' responsiveness, when did you have to omit, modify or add to the lesson's content or activities to better reach the participant, and what did you do?
9. Did it become easier or more difficult, as you progressed, to maintain fidelity to delivering lessons as designed?
10. How frequently did you refer to the survey status spreadsheets?
11. How did you find these spreadsheets helpful? If not helpful, why?
12. Who did you reach out to when you encountered confusion about the status of a participant or questions regarding data collection?
13. How satisfied were you with the timeliness of and responses you received to your questions (of the author of this thesis)?
14. How effective were the PI of the Eat Healthy study and the author of this thesis at answering questions during the bi-monthly debriefings?
15. How helpful did you find these ongoing debriefings?
16. How would you describe the amount of time and effort that was involved in working as a nutrition educator for Eat Healthy?
18. Do you have any general comments or feedback?

Data Analysis

Both quantitative and qualitative data sources were used for data analysis. This mixed-methods approach was used to enhance the ability to triangulate the data and strengthen the validity of the findings.

Statistical analysis. Descriptive analyses and Cronbach's α for subconstructs were conducted using SPSS (version 20, Armonk, NY: IBM Corp), and each variable was assessed for skewness and kurtosis. Parent engagement, parent reaction to key messages, extent to which the lesson was delivered as designed, years teaching nutrition education, and interest in teaching Eat Healthy had high values for skewness and kurtosis. Missing values for NERTD variables (n=3) were replaced with the mode for that lesson and item from all educators' NERTDs (n varied: 90, 81, 69). Sample characteristics were described using mean±standard deviation for continuous variables and counts or frequencies for categorical variables. Logistic regression analysis was conducted using the statistical program Stata (version 13 SE, StataCorp LP, College Station, TX, 2013) for generalized linear and mixed effects models. Mplus (version 7.11, Muthén & Muthén, Los Angeles, CA, 2013) was used for the confirmatory factor analysis.

Testing construct reliability (α) and validity. Construct reliability was tested for the ARCS questionnaires. The ARCS consisted of 13 or 16 items when administered. Relevance originally had five items; however, when analyzed with all five, Cronbach's α was low. The removal of the questions "I didn't learn anything new in this training, because I already knew it" and "The content of this material is relevant to my interests" improved the α value. Cronbach's α values for Attention, Relevance and Satisfaction were 0.92, 0.59, and 0.81, respectively. The final questionnaire used for analysis was an 11-item, 5-point Likert scale (1=not true, 5=very true). Confirmatory factor analysis (CFA) was used to test the validity of the measurement and generate factor scores for three of the four sub-constructs (Attention, Relevance, and Satisfaction) for each completed questionnaire (n=23) (Bentler, 1990). Factor scores, computed with the relevance items noted above deleted to achieve the highest Cronbach's α, were generated for each of the 20 educators and then added to the generalized linear mixed effects models as independent variables.
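As an illustration of the reliability computation described above (a minimal sketch, not the SPSS procedure actually used), the following computes Cronbach's α for one subconstruct; the item names and responses are hypothetical.

# Sketch: Cronbach's alpha for a multi-item subconstruct (hypothetical data).
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Four hypothetical 5-point Likert items for the Attention subconstruct,
# one row per completed questionnaire.
attention = pd.DataFrame({
    "att1": [5, 4, 5, 3, 4, 5, 4, 5],
    "att2": [4, 4, 5, 3, 4, 5, 4, 4],
    "att3": [5, 5, 5, 2, 4, 4, 4, 5],
    "att4": [5, 4, 4, 3, 3, 5, 4, 5],
})
print(f"Cronbach's alpha (Attention): {cronbach_alpha(attention):.2f}")

Dropping an item (as was done for the two Relevance questions) and recomputing α in this way shows directly whether removal improves internal consistency.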
Figure 2 illustrates the structural equation model with the sub-construct factors. The CFA showed good model fit for the three-factor structure: Chi-square=33.522, df=32, p=0.393; CFI=0.999; RMSEA=0.045.

Figure 2. Confirmatory factor analysis for the ARCS questionnaire.4

4 Chi-square: p-value >0.30 is ideal. CFI (comparative fit index): value >0.90 indicates an acceptable model. RMSEA (root mean square error of approximation): value <0.05 is considered adequate.

Model assessment and selection. Generalized linear mixed effects models (glmm) were created to develop the best model of multi-level associations between dependent and independent variables for both Hypothesis 1 and Hypothesis 2. Independent variables from the NERTD were tested for their random effects across different families, while educator characteristics from the demographic form were tested as fixed effects. Both random intercept and random intercept-random slope models were assessed. Tables 5 and 6 illustrate how the glmm models were developed to assess model fit. The first model was a three-level model (Level 1 units: observations of each topic delivery; Level 2 units: responses nested within families; Level 3 units: families nested within educators). Level 3 was collapsed into Level 2, as there was extremely small variance (1.04e-16) at the level of educators. Models 4 and 5 were used to test Hypothesis 2, part a. Model estimation, or "goodness of fit," was determined using maximal log-likelihood, numerical integration and Newton-Raphson adaptive quadrature (Rabe-Hesketh & Skrondal, 2012).

Table 5. Generalized linear mixed effects model comparisons for 'Extent to deliver lesson as designed' from educators' evaluations of parent engagement and parent reaction to key messages.
Model 1: Random intercept (Level 3); log-likelihood (model fit) -188.21
Model 2: Random intercept (Level 2); log-likelihood -188.21
Model 3: Random intercept and slope (Level 2), parent engagement; log-likelihood -188.20
Model 4: Random intercept and slope (Level 2), parent reaction to key messages (without 5 sites); log-likelihood -186.83
Model 5: Random intercept and slope (Level 2), parent reaction to key messages (with 5 sites); log-likelihood -179.55

Table 6. Generalized linear mixed effects model comparisons for 'Duration of topic delivery' from educators' evaluations of parent engagement, parent reaction to key messages and time spent on lessons.
Model 1: Random intercept (Level 3); log-likelihood (model fit) -326.76
Model 2: Random intercept (Level 2); log-likelihood -344.90
Model 3: Random intercept and slope (Level 2), parent engagement; log-likelihood -344.40
Model 4: Random intercept and slope (Level 2), parent reaction to key messages (without sites); log-likelihood -344.20
Model 5: Random intercept and slope (Level 2), extent to deliver lesson as designed; log-likelihood -344.33
Model 6: Random intercept and slope (Level 2), parent reaction to key messages (with sites); log-likelihood -328.67

Model 1 (Level 1): Y(extent to deliver) = β0 + β1(parent engagement) + β2(parent reaction to key messages) + β3(minutes spent) + β4(minutes²) + ε. Level 2: β0 = τ00 + τ01(educator age) + τ02(education level) + τ03(formal nutrition coursework) + τ04(years teaching nutrition education) + τ05(years as home visitor) + τ06(interest in teaching Eat Healthy) + ε.

Figure 3. Equation for the final generalized linear mixed effects models.
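To make the two-level structure in Figure 3 concrete, here is a rough, hypothetical Python sketch of fitting such a model (lesson observations nested within families). The thesis's actual models were ordinal generalized linear mixed effects models estimated in Stata; below, the 5-point outcome is treated as continuous purely for illustration, and all column names and data are simulated.

# Sketch: two-level mixed-effects model (Level 1: lesson observations,
# Level 2: families), loosely mirroring the Figure 3 equation.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_families, n_lessons = 107, 5
n = n_families * n_lessons
df = pd.DataFrame({
    "family_id": np.repeat(np.arange(n_families), n_lessons),
    "parent_engagement": rng.integers(3, 6, n),
    "parent_reaction": rng.integers(3, 6, n),
    "minutes": rng.normal(45, 15, n),
    "educator_age": np.repeat(rng.integers(21, 60, n_families), n_lessons),
})
# Simulated outcome with a family-level random effect, then bounded to 1-5.
family_effect = np.repeat(rng.normal(0, 0.3, n_families), n_lessons)
df["extent_delivered"] = (
    2.0 + 0.4 * df["parent_engagement"] + 0.2 * df["parent_reaction"]
    + 0.01 * df["educator_age"] + family_effect + rng.normal(0, 0.4, n)
).clip(1, 5)

# Random intercept for family; a random slope could be added via re_formula.
model = smf.mixedlm(
    "extent_delivered ~ parent_engagement + parent_reaction + minutes + educator_age",
    df, groups="family_id",
)
print(model.fit().summary())

Comparing the log-likelihoods of nested variants of such a model (random intercept only vs. random intercept and slope) parallels the model comparisons shown in Tables 5 and 6.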
For Ho1a, the outcome or dependent variable of interest from the Nutrition Educator Report of Topic Delivery was the extent to deliver the topic as designed, a rank-ordered variable (Q4a). The independent variables of interest included the factor scores described earlier for three of the four sub-constructs in the ARCS (rank-ordered variables), as well as the average ARCS scores both for each site and for all educators.

For Ho1b, the dependent variable was duration of topic delivery, in minutes. Delivery of topics was divided into two categories: home visits and phone calls. Prior to data collection, the researcher determined cut-off points in minutes, by topic, for 'too short,' 'optimal,' and 'too long' lesson duration. These a priori optimal ranges for lesson delivery were 50-70 minutes for Topic 1a and 45-60 minutes for Topics 2 and 5. Optimal phone call duration (Topics 1b, 3, and 4) was between 10 and 20 minutes. For the number of minutes recorded by each nutrition educator, duration was categorized (rank-order scale) as 'too short,' 'optimal,' or 'too long.' These data were averaged for each site, with educators teaching anywhere from 1-30 families, and then for all educators. Also, a post-hoc determination of optimal lesson duration was completed using the observed mean±SD by topic for all sites. Duration was assessed post hoc in two ways: 1) minutes² (a quadratic relationship); and 2) using the mean±1SD as the optimal range, with a score below -1SD considered 'too short' and a score above +1SD 'too long.' The post-hoc categories (rank-order scale) were used in the models. The second post-hoc definition was used for ease of interpretation.
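As a small illustration of the two duration rules just described, the following sketch applies both the a priori windows (e.g., 50-70 minutes for Topic 1a, 10-20 minutes for Topic 1b) and the post-hoc mean±1SD rule; the duration values and column names are hypothetical.

# Sketch: classify lesson durations as too short / optimal / too long.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "topic": ["1a"] * 6 + ["1b"] * 6,
    "minutes": [35, 50, 62, 70, 88, 120, 6, 10, 15, 22, 40, 70],
})

# Post-hoc rule: optimal = within mean +/- 1 SD for that topic.
df["posthoc"] = (
    df.groupby("topic")["minutes"]
      .apply(lambda g: pd.cut(
          g,
          bins=[-np.inf, g.mean() - g.std(ddof=1), g.mean() + g.std(ddof=1), np.inf],
          labels=["too short", "optimal", "too long"]))
      .reset_index(level=0, drop=True)
)

# A priori rule, using the windows stated in the text (minutes).
APRIORI = {"1a": (50, 70), "1b": (10, 20)}

def apriori_category(topic: str, minutes: float) -> str:
    lo, hi = APRIORI[topic]
    if minutes < lo:
        return "too short"
    if minutes > hi:
        return "too long"
    return "optimal"

df["apriori"] = [apriori_category(t, m) for t, m in zip(df["topic"], df["minutes"])]
print(df)

Comparing the two resulting columns, as in Table 13 later, shows how often the a priori and post-hoc definitions agree.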
For both Ho1a and Ho1b, the independent variables of interest included the factor scores for three of the four sub-constructs in the ARCS (rank-ordered variables). The average ARCS scores, both for each site and for all educators, were also of interest.

For Ho2, the two dependent variables of interest from the Nutrition Educator Report of Topic Delivery were the extent to deliver the topic as designed (Ho2a) and the duration of topic delivery in minutes (Ho2b), as just described. The independent variables of interest were degree of parent engagement (Q1) and degree of parent reaction to key messages (Q3), both rank-ordered variables. These were averaged for each site and then for all educators. The relationships between nutrition educator characteristics and the NERTD items just described were evaluated using a correlation matrix and then generalized linear mixed effects model regressions.

Qualitative analysis. As described previously, four instruments collected the qualitative data for this process evaluation: a) weekly or bi-monthly periodic debriefings with site coordinators; b) the NERTD open-ended item probing for barriers and facilitators to delivering the lesson as designed; c) home visit observations of lesson delivery; and d) exit interviews with educators at each site. Both the open-ended items and those for the interviews were guided by the constructs from the conceptual model for process evaluation shown earlier in Figure 1. It was important to examine several moderators in the model, such as barriers and facilitators to implementation, adherence to the designed protocol, and nutrition educator characteristics.

A research aide transcribed verbatim the periodic debriefing notes, the open-ended responses for the NERTD item, the field notes from each home visit observation, and the audio-recorded exit interviews with educators. The author of this thesis then reviewed each of the transcripts for accuracy. For the exit interviews, the audio recordings and transcripts served as reliable records of the interactions between the interviewer (the author of this thesis) and the interviewees (educators) (Seale & Silverman, 1997).

The author of this thesis (Karp) gained knowledge of and experience with qualitative research methods in a graduate course at Michigan State University, HDFS 982 (Qualitative Research Methods), taught by a professor who is an expert in this area. Karp was one independent coder and trained a senior research aide as the second independent coder of these four sets of data. Karp trained the research aide using an overview of coding guidelines as outlined by Richards (2009), as well as by reviewing the constructs in the concept model. Multiple coding provides the equivalent of "inter-rater reliability" in quantitative research, which helps ensure rigor by reducing the effects of subjectivity (Barbour, 2001; Popay et al., 1998). It also strengthens the findings through shared interpretation and understanding of the data's context (Weston et al., 2001).

Coding was an iterative process that used a thematic analysis approach, as both anticipated themes and new ones from the data were generated from the concept model (Barbour, 2001; Momin et al., 2014; Weston et al., 2001). Karp and the trained research aide independently completed initial coding for each set of data by underlining words or phrases that aligned with constructs from the concept model. The two coders then met to discuss their findings of emergent themes. Disagreements in data interpretation between the coders resulted in further evaluation of the data to come to "coding consensus" (Mauthner et al., 1998; Seale & Silverman, 1997). The researchers then developed a coding tree for each data source based on the final list of codes. Using the coding tree, the trained research aide coded each set of texts. A third independent, trained researcher then rechecked each set of text for any omissions. Thematic analysis was used to identify unanticipated themes not tied to the concept model or previously coded. The researchers then revised the coding trees to reflect these additions. This "realist" approach provided richness, as it illuminated the reality of the educators' experiences beyond the researcher's pre-existing views (Braun & Clarke, 2006).

The coding tree for the NERTD open-ended item consisted of two main themes (barriers and facilitators), 11 second-level subthemes and 18 third-level subthemes. Exit interviews were also coded to three levels, consisting of four main themes, 18 second-level subthemes, and 64 third-level subthemes. Periodic debriefing notes were coded to two levels with five main themes and 23 subthemes. Home visit observations were coded to two levels with five main themes and 19 subthemes. The researchers then calculated frequencies for each theme and subtheme, often by site, in order to assess how widespread particular themes or patterns were, as well as to elucidate deviant patterns or outliers (Seale, 1995). All data sources were evaluated as individual sites but also reviewed collectively across sites to examine emergent patterns (see Appendix H for coding trees).

Triangulation. Emergent themes from the exit interviews and home visit observations were compared with the self-reported quantitative data from the educators (i.e., parent engagement, parent reaction to key messages, and the extent to which the lesson was delivered as designed).
The themes were also compared with the quantitative findings from the logistic regression models to triangulate and verify the data regarding fidelity and education delivery. Such multiple sources of data are essential for triangulating and validating findings in process evaluation (Helitzer et al., 2000; Rossi & Freeman, 1993). The author of this thesis triangulated data from the NERTDs (both quantitative and qualitative), the qualitative field notes from home visit observations, and the in-depth exit interview responses to illuminate the overall fidelity to the Eat Healthy program design among educators and sites.

CHAPTER IV. RESULTS

Educator demographics

Table 7 shows that educators averaged 39.4 years of age (range 21-59), with 39% between 50-59 years. Van Buren and Kent educators were significantly older than those from Lansing, while Project L.E.A.N. educators were significantly younger than those from Van Buren. Most were White, with the remainder Black. Most (60%) reported having some college education, and about 50% reported having had formal nutrition coursework. Lansing had significantly higher rates of formal nutrition coursework as compared to Van Buren. Forty percent of all educators reported between 2-5 years' experience teaching nutrition education, while 55% reported between 6-10 years' experience as a home visitor. Van Buren had significantly greater experience as a home visitor than Lansing, Project L.E.A.N., and Genesee. Kent also had significantly greater experience as a home visitor than Lansing. The majority (87%) reported they were very interested in teaching Eat Healthy. Van Buren was divided into two different groups for this analysis, as the educator characteristics and families differed.

Table 7. Descriptive characteristics of nutrition educators (n=20) for all sites (n=4) and then by site.*
All sites (n=20), Mean±SD or n:
- Age (y): 39.4±13.7 (21-23 y: n=4; 24-29 y: n=3; 30-39 y: n=2; 40-49 y: n=2; 50-59 y: n=9)
- Race/ethnicity: White, non-Hispanic, n=19; Black, non-Hispanic, n=1
- Gender, female: n=20
- Education level: some college, n=11; college graduate, n=9
- Years teaching nutrition education: 3.7±4.00 (0-1.9 y: n=7; 2-5 y: n=9; 6-10 y: n=2; ≥11 y: n=2)
- Years as home visitor: 6.4±6.5 (0-5 y: n=10; 6-10 y: n=2; ≥11 y: n=8)
- Interest in teaching Eat Healthy:c 4.87±0.34 (somewhat interested, n=2; very interested, n=18)
- Formal nutrition education, yes: n=11
By site (Mean±SD):
- Age (y): Lansing 27.5±0.7a; Project L.E.A.N. 24.8±8.0a; Genesee 43.5±12.0b; Van Buren 54.3±5.2b; Kent 51.3±6.4b
- Years teaching nutrition education: Lansing 1.0±1.7; Project L.E.A.N. 1.8±0.4; Genesee 4.8±7.5; Van Buren 6.0±3.7; Kent 4.3±0.6
- Years as home visitor: Lansing 0.2±0.5d; Project L.E.A.N. 1.8±1.8d; Genesee 7.2±4.3d; Van Buren 15.0±4.8e; Kent 12.3±2.5e
- Interest in teaching Eat Healthy: Lansing 5.00±0.00; Project L.E.A.N. 5.00±0.00; Van Buren 5.00±0.00; Genesee 4.83±0.41; Kent 4.67±0.58
*Van Buren was divided into two groups (Van Buren and Project L.E.A.N.).
a,b Lansing and Project L.E.A.N. educators were significantly younger than Van Buren; Lansing was significantly younger than Genesee and Kent.
c 5-point Likert scale (1=no interest, 5=very interested).
d,e Van Buren educators had significantly more years of experience as a home visitor than Lansing, Project L.E.A.N. and Genesee; Kent had significantly more years of experience as a home visitor than Lansing.

Hypothesis 1

H01a: There will be a positive association between quality of nutrition educator training [higher scores on the ARCS (13 items), 5-point Likert scale] and extent to deliver the lesson as designed (Q4a on the 5-point Likert-scaled NERTD).
H01b: There will be a positive association between quality of nutrition educator training [higher scores on the ARCS (13 items), 5-point Likert scale] and lesson duration (Q5: NERTD).

Attention, Relevance, Confidence, Satisfaction (ARCS)

Table 8 shows mean±standard deviation and Cronbach α scores for the educational delivery training, for all sites as well as by site. For each sub-construct, on average, educators across all sites reported positively, between "mostly true" and "very true." Mean scores for all four sub-constructs tended to be lowest in Van Buren, though not significantly so. Van Buren was the first site to be trained.

Table 8. Subscale scores for the Attention, Relevance, Confidence, and Satisfaction (ARCS) subconstructs from 20 educators for the training session on educational delivery. Values are Mean±SD, with Cronbach α for all sites; site columns show n educators.
- Attention (4 items): all sites 4.02±0.89, α=0.923; Lansing (5) 4.40±0.58; Van Buren (6) 3.42±1.01; Genesee (6) 4.28±0.88; Kent (3) 3.83±0.80
- Relevancea (3 items): all sites 4.50±0.46, α=0.509; Lansing 4.75±0.25; Van Buren 4.25±0.74; Genesee 4.50±0.33; Kent 4.59±0.14
- Confidence (1 item): all sites 4.22±0.74, α=n/a; Lansing 4.40±0.55; Van Buren 3.83±0.98; Genesee 4.33±0.71; Kent 4.33±0.58
- Satisfaction (3 items): all sites 4.33±0.73, α=0.810; Lansing 4.73±0.37; Van Buren 3.72±0.83; Genesee 4.44±0.73; Kent 4.56±0.38
a The Relevance construct initially consisted of five items; removal of Relevance3 and Relevance4 improved the Cronbach α. The ARCS consisted of 13 items when administered. When analyzed with all five Relevance items, Cronbach's α was low; removal of the questions "I didn't learn anything new in this training, because I already knew it" and "The content of this material is relevant to my interests" resulted in the Cronbach α listed above. The final questionnaire used for analysis was an 11-item, 5-point Likert scale (1=not true, 5=very true).

Nutrition Educator Report of Topic Delivery (NERTD)

Table 9 shows mean±standard deviation scores for extent to which the lesson was delivered as designed, parent engagement, and parent reaction to key messages, for all sites and for each site. Mean scores for extent to which the lesson was delivered as designed averaged high, between "mostly able" and "completely able." Mean scores for each item tended to be lowest among Kent educators (non-significant).

Table 9. Scores (Mean±SD) for the Nutrition Educator Report of Topic Delivery (Ho2), overall and by site and item, for 495 observations.
1. Extent to deliver lesson as designed:1 all sites (n=20) 4.83±0.51; Lansing (n=5) 4.86±0.45; Van Buren (n=4) 4.63±0.85; Project L.E.A.N. (VB) (n=2) 4.86±0.41; Genesee (n=6) 4.93±0.33; Kent (n=3) 4.25±0.92
2. Parent engagement:2 all sites 4.77±0.54; Lansing 4.57±0.70; Van Buren 4.90±0.31; Project L.E.A.N. 4.84±0.43; Genesee 4.85±0.48; Kent 4.56±0.72
3. Parent reaction to key messages:3 all sites 4.73±0.53; Lansing 4.59±0.59; Van Buren 4.67±0.80; Project L.E.A.N. 4.82±0.48; Genesee 4.79±0.42; Kent 4.38±0.66
Scores are based on a 5-point Likert scale. 1 Extent to deliver lesson as designed: 1=not at all able, 5=completely able. 2 Parent engagement: 1=not at all engaged, 5=very engaged. 3 Parent reaction to key messages: 1=strongly unfavorable, 5=strongly favorable. Topics delivered by site: Lansing had 5 educators who delivered 101 lessons; Van Buren, including Project L.E.A.N., had 6 educators who delivered 196 lessons; Genesee had 6 educators who delivered 160 lessons; Kent had 3 educators who delivered 38 lessons.

Table 10 shows that overall, most educators reported being "completely able" to deliver the lesson as designed (87%), that parents were "very engaged" (82%), and that parent reaction to key messages was "strongly favorable" (76%).
Table 10. Frequency (%) of responses by item on the Nutrition Educator Report of Topic Delivery for all sites (n=495 observations nested in 107 families with 20 educators).4
Extent to deliver lesson as designed:1 not at all able, 0.2; a little, 1.0; neutral, 1.4; somewhat able, 10.6; completely able, 86.8
Parent engagement:2 not at all engaged, 0.2; not very engaged, 0.6; neutral, 2.6; somewhat engaged, 14.8; very engaged, 81.8
Parent reaction to key messages:3 strongly unfavorable, 0.2; somewhat unfavorable, 0.4; neutral, 2.0; somewhat favorable, 21.0; strongly favorable, 76.4
Scores are based on a 5-point Likert scale. 1 Extent to deliver lesson as designed: 1=not at all able, 5=completely able. 2 Parent engagement: 1=not at all engaged, 5=very engaged. 3 Parent reaction to key messages: 1=strongly unfavorable, 5=strongly favorable. 4 Topics delivered by site: Lansing had 5 educators who delivered 101 lessons; Van Buren, including Project L.E.A.N., had 6 educators who delivered 196 lessons; Genesee had 6 educators who delivered 160 lessons; Kent had 3 educators who delivered 38 lessons.

Table 11 shows that parent engagement had a significant positive effect on the educator's extent to which the lesson was delivered as designed (p<0.01). As parent engagement increased by one unit, the odds of an educator being better able to deliver the lesson as designed (e.g., a score change from 4 to 5) increased by a factor of 3.32, holding all other covariates constant. In this model, parent reaction to key messages and age were also positively significant. Duration of topic delivery was not significant. Years teaching nutrition education and years as home visitor had significant inverse relationships with extent to which the lesson was delivered as designed.

Factor scores for Attention, Relevance and Satisfaction were used as independent variables. Their odds ratio values were extreme and not statistically significant, thus providing no explanatory power for an educator's extent to deliver the lessons as designed. It had been hypothesized that higher ARCS scores would result in higher scores on extent to which the lesson was delivered as designed. The Confidence subscale comprised only one item, so a factor score could not be generated.

The cut-points, or thresholds, indicate where the dependent variable was "cut" to make the threshold categories for a 1-point change in the Likert scale for extent to which the lesson was delivered as designed. The thresholds represent the expected value at which a respondent would be most likely to transition from a value of 0 to a value of 1, from 1 to 2, and so on, on the Likert outcome variable.
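In the standard proportional-odds formulation (shown here as a general sketch consistent with this description, not an equation reproduced from the thesis), each cut-point τk is the log-odds boundary separating the lower k response categories from the higher ones:

\[
\Pr(Y \le k \mid X) \;=\; \frac{\exp(\tau_k - X\beta)}{1 + \exp(\tau_k - X\beta)}, \qquad k = 1, 2, 3, 4,
\]

so the reported cut-points are thresholds on the log-odds scale, with more negative values indicating that the pooled lower categories were rarely chosen relative to the categories above them.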
When looking at cut-point 1, this means that the predicted odds of reporting 'not at all able' to deliver the lesson as designed were 10.003 times lower compared to reporting 'a little,' 'neutral,' 'mostly able' or 'completely able.' Likewise, the predicted odds of reporting 'not at all able' or 'a little' were 7.926 times lower compared to reporting 'neutral,' 'mostly able' or 'completely able.' The predicted odds of reporting 'not at all able,' 'a little,' or 'neutral' were 6.945 times lower compared to reporting 'mostly able' or 'completely able.' The predicted odds of reporting 'not at all able,' 'a little,' 'neutral,' or 'mostly able' were 4.605 times lower compared to reporting 'completely able.'

Table 11. Generalized linear mixed effects model of the association between quality of nutrition educator training (scores on the ARCSa: Attention, Relevance, Satisfaction) and 'extent to deliver lessons as designed,'1 family level. Values are odds ratio, standard error, and 95% CI.
NERTD
- Parent engagement:2 3.32, 0.869, 1.865-5.436**
- Parent reaction to key messages:3 1.59, 0.477, 0.903-2.881*
- Duration of topic delivery,4 too short: 1.49, 0.825, 0.506-4.408
- Duration of topic delivery, too long: 0.46, 0.196, 0.201-1.060
Educator characteristics
- Educator age: 1.12, 0.050, 1.031-1.23**
- Educator education level:5 2.30, 2.256, 0.398-15.719
- Formal nutrition coursework:6 0.56, 0.394, 0.143-2.219
- Years teaching nutrition education: 0.84, 0.063, 0.726-0.974*
- Years as home visitor: 0.78, 0.067, 0.659-0.923**
- Educator interest in teaching Eat Healthy:7 0.06, 0.093, 0.003-1.223
ARCS subconstruct scores
- Attention: 5.69e62, 1.42e65, 1.4e-150-2.2e275
- Relevance: 2.1e299, 3.0e302, 0
- Satisfaction: 8.0e-116, 4.0e-113, 0
Variance of random effects: 0.11, 0.298
Threshold levels for the dependent variable (Likert score <5): cut-point 1 (Likert score 1 vs. 2-5) -10.00; cut-point 2 (1-2 vs. 3-5) -7.93; cut-point 3 (1-3 vs. 4-5) -6.95; cut-point 4 (1-4 vs. 5) -4.61
Log-likelihood (model fit): -180.45
*Significant at the p<0.05 level. **Significant at the p<0.01 level. Level 2: 495 observations nested in 107 families from educator reports of each family's responses across lessons. a ARCS (no factor score for Confidence). 1 Extent to deliver lesson as designed (reference Likert score: 5=completely able). 2 Parent engagement (1=not at all engaged, 5=very engaged). 3 Parent reaction to key messages (1=strongly unfavorable, 5=strongly favorable). 4 Duration of topic delivery (rank ordered: too short, optimal, too long; optimal is the reference (mean±1SD); too short is <-1SD; too long is >+1SD). 5 Educator education level (2=some college (ref), 3=college graduate). 6 Formal nutrition coursework (1=no, 2=yes (ref)). 7 Educator interest in teaching Eat Healthy (1=no interest, 5=very interested (ref)).

Ho1b

Table 12 shows "raw" duration presented by topic and delivery method, with mean±standard deviation and ranges for all sites, then mean±standard deviation for each site. The first home visit (Topic 1a) was the longest across sites (62.9±20.65 minutes) and the first phone call (Topic 1b) the shortest (24.8±17.25 minutes).
Lansing had the shortest average duration for all topics, while Van Buren (both groups) had the longest delivery times for home visits.

Table 12. Topic delivery duration in minutes (Mean±SD) for all sites combined, then for each participating site (n=5 site groups).
Home visits:
- Topic 1a: all sites 62.9±20.65 (range 30-120); Lansing 46.8±14.71; Van Buren 74.4±11.48; Project L.E.A.N. (VB) 77.1±19.37; Genesee 53.3±12.91; Kent 64.7±20.91
- Topic 2: all sites 55.1±17.14 (range 20-110); Lansing 43.3±10.85; Van Buren 68.0±13.04; Project L.E.A.N. 65.9±13.02; Genesee 47.1±18.44; Kent 55.5±7.04
- Topic 5: all sites 49.1±12.45 (range 20-75); Lansing 41.7±11.35; Van Buren 57.5±18.93; Project L.E.A.N. 54.3±8.11; Genesee 45.7±14.20; Kent 54.0±8.22
Phone calls:*
- Topic 1b: all sites 24.8±17.25 (range 6-70); Lansing 10.7±3.89; Van Buren 30.0±17.80; Project L.E.A.N. 20.0±11.52; Genesee 40.3±17.67; Kent 18.3±7.68
- Topic 3: all sites 27.2±17.37 (range 7-60); Lansing 13.5±9.26; Van Buren 33.0±12.55; Project L.E.A.N. 21.5±13.70; Genesee 40.0±17.27; Kent 17.5±6.45
- Topic 4: all sites 31.7±19.13 (range 5-65); Lansing 10.7±4.05; Van Buren 32.5±13.23; Project L.E.A.N. 35.2±19.12; Genesee 41.6±16.57; Kent 21.0±8.94
*Not all phone calls were delivered as designed; longer phone call durations reflect an educator delivering the topic as a home visit.

Table 13 shows that by the a priori ranking, overall, for home visits, approximately 50% of educators' lesson durations for Topics 1a and 2 were 'optimal,' while 'too short' and 'too long' were nearly evenly split for those topics. Topic 5 had approximately 70% agreement between the a priori definition and the second, post-hoc definition. For a priori-evaluated phone calls, most were 'optimal,' while 'too short' was less frequent. The post-hoc results show that for phone calls, the majority were likewise 'optimal,' while 'too short' was less frequent.

Table 13. Percentage of lesson delivery durations (a priori/post hoc) categorized as too short, optimal, and too long, by delivery method and topic, all locations combined.
Home visits:
- Topic 1a: too short 25/18; optimal 44/65; too long 31/17
- Topic 2: too short 25/18; optimal 52/71; too long 23/11
- Topic 5: too short 43/20; optimal 70/76; too long 3/4
Phone calls:
- Topic 1b: too short 12/4; optimal 57/78; too long 32/18
- Topic 3: too short 7/7; optimal 51/68; too long 41/25
- Topic 4: too short 7/13; optimal 38/65; too long 55/22
Duration in minutes: (1) a priori, where optimal by topic was defined as 50-70 min (Topic 1a), 45-60 min (Topics 2, 5), and 10-20 min (Topics 1b, 3, 4); (2) post-hoc categories created from mean±1SD for each topic, where optimal was 42.3-83.6 min (Topic 1a), 7.6-42.1 min (Topic 1b), 38.0-72.2 min (Topic 2), 9.8-44.6 min (Topic 3), 12.6-50.8 min (Topic 4), and 36.7-61.6 min (Topic 5).

Table 14 shows that only educator education level had a significant positive effect on duration of topic delivery (p<0.05). As educator education level increased from 'some college' to 'college graduate,' the odds of an educator delivering the lesson in the post-hoc optimal time increased by a factor of 2.33, holding all other covariates constant. In this model, formal nutrition coursework and educator interest in teaching Eat Healthy had significant inverse relationships with duration of topic delivery. Factor scores for Attention, Relevance and Satisfaction were used as independent variables. Their odds ratio values were extreme and not statistically significant, thus providing no explanatory power for duration of topic delivery, although it had been hypothesized that higher ARCS scores would relate to optimal lesson duration. Cut-point 1, at the threshold level, means that the predicted odds of delivering the lesson 'too long' were 18.13 times lower compared to delivering the lesson in the 'optimal' time range.
Likewise, the predicted odds of delivering the lesson in the 'too short' range were 13.00 times lower compared to delivering the lesson in the 'optimal' range.

Table 14. Generalized linear mixed effects model of the association between quality of nutrition educator training (scores on the ARCSa: Attention, Relevance, Satisfaction) and 'duration of topic delivery,'1 family level. Values are odds ratio, standard error, and 95% CI.
NERTD
- Parent engagement:2 1.25, 0.331, 0.745-2.103
- Parent reaction to key messages:3 1.37, 0.360, 0.816-2.293
- Extent to deliver lesson as designed:4 0.52, 0.128, 0.319-0.842
Educator characteristics
- Educator age: 1.03, 0.025, 0.977-1.076
- Educator education level:5 2.33, 0.989, 1.013-5.355*
- Formal nutrition coursework:6 0.17, 0.085, 0.063-0.450**
- Years teaching nutrition education: 0.90, 0.058, 0.791-1.019
- Years as home visitor: 0.91, 0.060, 0.796-1.032
- Educator interest in teaching Eat Healthy:7 0.05, 0.030, 0.015-0.163*
ARCS subconstruct scores
- Attention: 6.37e-28, 6.21e-26, 7.2e-111-5.65e55
- Relevance: 3.9e-180, 2.1e-177, 0-6.7e284
- Satisfaction: 6.77e59, 1.33e62, 3.6e-108-1.3e227
Variance of random effects: 0.443, 0.270
Threshold levels for the dependent variable: cut-point 1 (too long) -18.13; cut-point 2 (too short) -13.00
Log-likelihood (model fit): -328.83
*Significant at the p<0.05 level; **significant at the p<0.01 level. Level 2: 495 observations nested in 107 families from educator reports of each family's responses across lessons. a ARCS (no factor score for Confidence). 1 Duration of topic delivery (rank ordered: too short, optimal, too long; optimal is the reference (mean±1SD); too short is <-1SD; too long is >+1SD). 2 Parent engagement (1=not at all engaged, 5=very engaged). 3 Parent reaction to key messages (1=strongly unfavorable, 5=strongly favorable). 4 Extent to deliver lesson as designed (reference Likert score: 5=completely able). 5 Educator education level (2=some college (ref), 3=college graduate). 6 Formal nutrition coursework (1=no, 2=yes (ref)). 7 Educator interest in teaching Eat Healthy (1=no interest, 5=very interested (ref)).

Hypothesis 2

H02a: The characteristics of nutrition educators, such as age, number of years as a home visitor, number of years delivering nutrition education, nutrition education background, and level of interest in teaching the EH materials, will relate to the extent to deliver the lesson as designed.

H02b: The characteristics of nutrition educators, such as age, number of years as a home visitor, number of years delivering nutrition education, nutrition education background, and level of interest in teaching the EH materials, will relate to lesson duration.

Ho2a

Table 15 shows the inter-item correlations for the NERTD items. Extent to which the lesson was delivered as designed was moderately associated with parent engagement and parent reaction to key messages. Table 16 shows the correlations of educator characteristics and NERTD items. Years as a home visitor had a strong inverse association with extent to deliver the lesson as designed.

Table 15. Inter-item correlations for educators' (n=20) scores from NERTDs on parents' reactions and their own delivery of 495 lessons given to 107 families.
- Parent engagement and parent reaction to key messages: .606**
- Parent engagement and extent to deliver lesson as designed: .338**
- Parent reaction to key messages and extent to deliver lesson as designed: .332**
- Parent engagement and minutes spent: .087
- Parent reaction to key messages and minutes spent: .047
- Extent to deliver lesson as designed and minutes spent: -.102*
**Correlation is significant at the 0.01 level (2-tailed). *Correlation is significant at the 0.05 level (2-tailed).
Table 16. Pearson's correlations between nutrition educator characteristics and average scores per educator for NERTD items (n=495 observations nested within 107 families with 20 educators).
Items: 1. Educator age; 2. Education level; 3. Years as nutrition educator; 4. Years as home visitor; 5. Interest in teaching Eat Healthy; 6. Formal nutrition coursework; 7. Parent engagement;1 8. Parent reaction to key messages;1 9. Extent to deliver lesson as designed.1
Correlations (row item with column item):
2 with 1: .068
3 with 1: .561*; 3 with 2: -.257
4 with 1: .773**; 4 with 2: .130; 4 with 3: .298
5 with 1: -.265; 5 with 2: .308; 5 with 3: -.143; 5 with 4: -.132
6 with 1: -.351; 6 with 2: -.105; 6 with 3: -.240; 6 with 4: -.637**; 6 with 5: -.140
7 with 1: .118; 7 with 2: .332; 7 with 3: -.134; 7 with 4: -.164; 7 with 5: .056; 7 with 6: .180
8 with 1: -.031; 8 with 2: .372; 8 with 3: -.137; 8 with 4: -.378; 8 with 5: .137; 8 with 6: .209; 8 with 7: .670**
9 with 1: -.325; 9 with 2: .183; 9 with 3: -.458*; 9 with 4: -.607**; 9 with 5: .159; 9 with 6: .341; 9 with 7: .470*; 9 with 8: .753**
*Correlation is significant at the 0.05 level (2-tailed). **Correlation is significant at the 0.01 level (2-tailed). 1 NERTD scores were averaged for each educator (n=20) over all the lessons (n=495) delivered to 107 families.

Table 17 shows the estimated odds of being completely able to deliver the lesson as designed (Likert score=5) by NERTD items and educator characteristics, first without sites, then with sites. Model 1 shows that as educators' scores for parent engagement and parent reaction to key messages increased by one unit, the odds of scoring a 5 for extent to deliver the lesson as designed increased by factors of 3.05 and 2.47, respectively, holding all other covariates constant. Duration of topic delivery was inversely related to extent to which the lesson was delivered as designed (not significant). Of the educator characteristics, for each year's increase in age, the odds of scoring a 5 for extent to which the lesson was delivered as designed increased slightly, by a factor of 1.11. For each year's increase in years as a home visitor, the odds of scoring a 5 on extent to which the lesson was delivered as designed were multiplied by 0.71. When looking at cut-point 1, the predicted odds of reporting 'not at all able' to deliver the lesson as designed were 1.20 times lower compared to reporting 'a little,' 'neutral,' 'mostly able' or 'completely able.' Likewise, the predicted odds of reporting 'not at all able' or 'a little' were 0.94 times higher compared to reporting 'neutral,' 'mostly able' or 'completely able.' The predicted odds of reporting 'not at all able,' 'a little,' or 'neutral' were 1.98 times higher compared to reporting 'mostly able' or 'completely able.' The predicted odds of reporting 'not at all able,' 'a little,' 'neutral,' or 'mostly able' were 4.38 times higher compared to reporting 'completely able.'

A comparison of results from Models 1 and 2 suggests that after controlling for site differences, the odds of Lansing educators scoring a 5 for extent to which the lesson was delivered as designed were greater than for educators from other sites, but only significantly different from Kent. When educators were categorized by site, parent engagement remained the most significant predictor of extent to which the lesson was delivered as designed. Study sites had some explanatory influence regarding the effects of years teaching nutrition education and years as home visitor on ability to deliver the lessons as designed.
When looking at cut-point 1 in Model 2, the predicted odds of reporting 'not at all able' to deliver the lesson as designed were 15.95 times lower compared to reporting 'a little,' 'neutral,' 'mostly able' or 'completely able.' Likewise, the predicted odds of reporting 'not at all able' or 'a little' were 13.89 times lower compared to reporting 'neutral,' 'mostly able' or 'completely able.' The predicted odds of reporting 'not at all able,' 'a little,' or 'neutral' were 12.90 times lower compared to reporting 'mostly able' or 'completely able.' The predicted odds of reporting 'not at all able,' 'a little,' 'neutral,' or 'mostly able' were 10.49 times lower compared to reporting 'completely able.'

Table 17. Generalized linear mixed effects model for the association of nutrition educator characteristics and Q4a (extent to deliver topic as designed).1 Estimated odds ratios, standard errors, and 95% CIs of 'extent to deliver lesson as designed,' for Model 1 (without sites) and Model 2 (with sites).
NERTD
- Parent engagement:2 Model 1: 3.05, 0.834, 1.784-5.212**; Model 2: 3.28, 0.905, 1.910-5.633**
- Parent reaction to key messages:3 Model 1: 2.47, 0.993, 1.124-5.43*; Model 2: 2.12, 0.887, 0.930-4.801
- Duration of topic delivery,4 too short: Model 1: 1.69, 0.944, 0.564-5.051; Model 2: 1.41, 0.796, 0.467-4.262
- Duration of topic delivery, too long: Model 1: 0.51, 0.224, 0.214-1.207; Model 2: 0.44, 0.196, 0.187-1.057
Educator characteristics
- Educator age: Model 1: 1.11, 0.041, 1.032-1.194**; Model 2: 1.12, 0.057, 1.016-1.240*
- Educator education level:5 Model 1: 0.79, 0.386, 0.307-2.058; Model 2: 3.09, 3.625, 0.311-30.778
- Formal nutrition coursework:6 Model 1: 0.27, 0.173, 0.074-0.949*; Model 2: 0.91, 1.023, 0.101-8.222
- Years teaching nutrition education: Model 1: 0.85, 0.068, 0.731-0.998; Model 2: 0.83, 0.067, 0.707-0.973*
- Years as home visitor: Model 1: 0.71, 0.061, 0.603-0.842**; Model 2: 0.74, 0.084, 0.590-0.921**
- Educator interest in teaching Eat Healthy:7 Model 1: 0.76, 0.551, 0.184-3.144; Model 2: 0.01, 0.035, 0.000-1.710
Site location (Model 2 only)
- Kent: 0.02, 0.038, 0.000-0.851*
- Genesee: 0.84, 1.320, 0.039-18.068
- Van Buren: 0.87, 1.950, 0.010-71.545
- Project L.E.A.N. (VB): 0.17, 0.230, 0.013-2.339
Variance of random effects: Model 1: (1) 5.010, (2) 0.406, covariance (2,1) -1.423 (standard errors 8.641, 0.583, 2.246); Model 2: (1) 3.242, (2) 5.343, covariance (2,1) -1.094 (standard errors 0.369, 0.430, 1.532)
Threshold levels for the dependent variable: cut-point 1 (Likert score 1 vs. 2-5): Model 1 -1.20, Model 2 -15.95; cut-point 2 (1-2 vs. 3-5): Model 1 0.94, Model 2 -13.89; cut-point 3 (1-3 vs. 4-5): Model 1 1.98, Model 2 -12.90; cut-point 4 (1-4 vs. 5): Model 1 4.38, Model 2 -10.49
Log-likelihood (model fit): Model 1 -186.83; Model 2 -179.55
*Significant at the p<0.05 level; **significant at the p<0.01 level. Level 2: 495 observations nested in 107 families from educator reports of each family's responses across lessons. Model 1: random slope for parent reaction to key messages. Model 2: random intercept and slope (parent reaction to key messages with sites). 1 Extent to deliver lesson as designed (reference Likert score: 5=completely able). 2 Parent engagement (1=not at all engaged, 5=very engaged). 3 Parent reaction to key messages (1=strongly unfavorable, 5=strongly favorable). 4 Duration of topic delivery (rank ordered: too short, optimal, too long;
post-hoc optimal is the reference (mean±1SD); too short is <-1SD; too long is >+1SD). 5 Educator education level (2=some college (ref), 3=college graduate). 6 Formal nutrition coursework (1=no, 2=yes (ref)). 7 Educator interest in teaching Eat Healthy (1=no interest, 5=very interested (ref)).

Ho2b

Table 18 shows the estimated odds of optimal duration of topic delivery (rank score=1) by NERTD items and educator characteristics, first without sites, then with sites. Model 1 shows that as educators' scores for extent to which the lesson was delivered as designed increased by one unit, the odds of delivering the lesson in the optimal range changed by a factor of 0.48 (i.e., decreased), holding all other covariates constant. Of the educator characteristics, for educators who were college graduates, the odds of being able to deliver the lesson within the optimal range were 7.71 times those of educators with some college. There was a significant inverse relationship between formal nutrition coursework and duration of topic delivery. When looking at cut-point 1, the predicted odds of delivering the lesson 'too long' were 12.88 times lower compared to delivering the lesson within the 'optimal' time range. The predicted odds of delivering the lesson 'too short' (cut-point 2) were 7.95 times lower than delivering the lesson within the 'optimal' time range.

A comparison of results from Models 1 and 2 suggests that after controlling for site differences, the predicted odds of Project L.E.A.N. educators delivering the lessons within the 'optimal' time range were higher compared to the Lansing educators. When educators were categorized by site, educator education level remained the most significant predictor of delivering the lesson within the 'optimal' time range. Sites had some explanatory influence regarding the effect of educator interest in teaching Eat Healthy on ability to deliver the lessons within the 'optimal' range. In Model 2, when looking at cut-point 1, the predicted odds of delivering the lesson 'too long' were 17.42 times lower compared to delivering the lesson within the 'optimal' time range, and the predicted odds of delivering the lesson 'too short' (cut-point 2) were 12.30 times lower than delivering the lesson within the 'optimal' time range.

Table 18. Generalized linear mixed effects model for the association of nutrition educator characteristics and Q5 (duration of topic delivery). Estimated odds ratios, standard errors, and 95% CIs of duration of topic delivery (post hoc),1 for Model 1 and Model 2.
Table 18. Generalized linear mixed effects model for association of nutrition educator characteristics and Q5 (duration of topic delivery). Estimated odds ratios (OR), standard errors (SE), and 95% CIs of duration of topic delivery (post-hoc).1

                                              Model 1                            Model 2
Item                                          OR      SE      95% CI             OR      SE      95% CI
NERTD
  Parent engagement2                          1.26    0.351   0.728-2.172        1.24    0.331   0.734-2.093
  Parent reaction to key messages3            1.55    0.437   0.890-2.695        1.38    0.368   0.818-2.327
  Extent to deliver lesson as designed4       0.48    0.124   0.293-0.800**      0.52    0.129   0.316-0.841
Educator characteristics
  Educator age                                1.03    0.026   0.977-1.079        1.02    0.032   0.956-1.081
  Educator education level5                   7.71    2.860   3.729-15.953**     2.54    1.169   1.031-6.261*
  Formal nutrition coursework6                0.29    0.133   0.117-0.711**      0.21    0.144   0.054-0.808*
  Years teaching nutrition education          0.93    0.068   0.802-1.069        0.90    0.060   0.787-1.024
  Years as home visitor                       0.91    0.059   0.804-1.034        0.90    0.061   0.798-1.023
  Educator interest in teaching Eat Healthy7  0.06    0.037   0.020-0.200        0.05    0.029   0.014-0.158**
Site location
  Kent                                        -       -       -                  0.84    0.917   0.098-7.144
  Genesee                                     -       -       -                  3.37    2.549   0.765-14.839
  Van Buren                                   -       -       -                  19.03   26.199  1.281-282.704
  Project L.E.A.N. (VB)                       -       -       -                  9.68    5.612   3.111-30.164**
Variance of random effects
  1                                           0.92    3.089                      5.79    0.10
  2                                           0.00    0.0                        6.30    0.17
Threshold levels for dependent variable
  Cut-point 1 (too long)                    -12.88                              -17.42
  Cut-point 2 (too short)                    -7.95                              -12.30
Log-likelihood (model fit)                  -344.20                            -328.67

*Significant at p<0.05 level; **significant at p<0.01 level. Level 1: 495 NERTD observations (multiple per family). Level 2: 495 observations nested in 107 families from educator report of each family's responses across lessons. Model 1: random slope for parent reaction to key messages. Model 2: random intercept and slope (parent reaction to key messages with sites). Superscripted items are defined as in the footnotes to Table 17.

Hypothesis 3: There will be a difference in fidelity of implementation among sites, as assessed from periodic debriefing notes, the Nutrition Educator Report of Topic Delivery (Q4b: explain any things that helped or made it difficult for you to deliver this week's topic as designed), home visit observations, and exit interviews. In the subsequent pages, I discuss findings from each of these four sources of data. Home visit observations, as described in Methods, used the NERTD completed by the observer while she monitored the educator who delivered the lesson.

Periodic debriefing notes

Table 19 shows the main emergent themes and subthemes from the ongoing periodic debriefings with three sites, excluding Lansing, regarding study implementation of Eat Healthy. Because only seven subthemes regarding barriers and facilitators emerged relating to fidelity, only these seven were examined by site for frequency. Across all three outlying sites, educators cited participants missing or avoiding the outside evaluator's phone calls, due to the participant's caller ID showing "restricted number" or the participant's inability to reach the evaluator during the evaluator's specified hours. Confusion regarding the timeline of the study was common among all three sites. Educators at all sites mentioned two barriers to delivery of education: the curriculum was geared more toward parents than children, and it was difficult to engage both parent and child during home visits. Some barriers were particular to one or two sites. Educators in Genesee and Van Buren cited inconsistencies in key messages between the written materials and the DVD clips. Additionally, Van Buren educators cited low literacy of some participants as a barrier to teaching; the use of the DVD was beneficial in overcoming this barrier in that site. Across all outlying sites, the two most common facilitators of fidelity were the high level of parent interest in the materials and the nutrition educators' appreciation for the simplicity and attractiveness of the materials. No questions about lesson duration were queried in these periodic debriefings.
Table 19. Emergent themes coded from the periodic debriefings with outlying sites.1

Theme: Recruitment and Data Collection
  Subtheme: Outside Evaluator
    - Evaluator's "restricted" phone number caused some participants not to answer (frequency 5)
    - Identification of evaluator unclear to families who receive phone calls from multiple agencies (1)
    - Participant unable to reach evaluator during designated times (1)
  Subtheme: Data Collection
    - Educator unsure how to interpret spreadsheet throughout study (1)
    - Dropped participants still on spreadsheet (3)
    - Specific questions about paired foods on FFQ (1)
    - Educators often lack answer for 4B on NERTD (3)
  Subtheme: Weight Measurements
    - Re-measure participant due to long time span between recruitment and 1st phone call with evaluator
  Subtheme: Recruitment
    - Recruitment numbers: trouble deciding on a realistic number of participants
    - Uneasiness about recruiting and working with families [they] do not know
    - Over-income families: not able to recruit
    - Typographical flaws: incorrect topic choices on ARCS form; scale missing from page 2 of PFBQ form

Theme: Study Design
    - Confusion about timeline/study progression2: V(3), G(4), K(1); frequency 8
    - Educator not referring to curriculum grid
    - Confusion regarding number of phone calls to complete with evaluator for control and intervention groups
    - Inconsistencies between key messages in DVD and written materials2: V(1), G(1); frequency 2
    - Confusion about timeline of 3-month follow-up
    - Curriculum geared more to caregiver than child2: V(1), G(1), K(1); frequency 3
    - Difficulty engaging parent and child; child served as a distraction2: G(2), V(4); frequency 6
    - DVD was a great way to reach families with low literacy2: V(1); frequency 1

Theme: Eat Healthy Materials
    - Nutrition educators love the materials2: V(2), G(2), K(2); frequency 6
    - Parents excited about curriculum2: V(1), G(1), K(1); frequency 3

1 Information in parentheses after site letters is the number of mentions by site, where V=Van Buren, G=Genesee, K=Kent; the Lansing site was not included.
2 These seven items (shaded in the original table) were the seven evaluated for frequency by site.

Nutrition Educator Report of Topic Delivery (NERTD-Q4b)

Table 20 shows the four most common barriers and five most common facilitators reported in response to the query about extent to deliver the lesson as designed, which impacted overall fidelity of implementation. Across all sites, educators cited distractions most frequently (n=82), although Lansing reported these most often. These distractions ranged from disruptive children (4-5 times more often in Lansing than in other sites) to phone calls during the lesson, visitors to the home, and environmental noise, i.e., traffic, fans, machinery, etc. (see Appendix G for coding details). For example, "There were many distractions in the house. Oldest child somewhat participated while 3 year old played with other toys. Mom talked over children which made things louder and woke baby. Also a hot day with no AC and a dog inside." Educators in Lansing reported barking dogs inside the home on several occasions. Technology issues were the second most reported barrier (n=43). Predominantly, these citations related to DVD complications, such as a non-working DVD player and a damaged or defective DVD. Educators generally had back-up DVDs with them, as instructed, and the Lansing educators brought a laptop to each home visit to overcome this challenge. The proportional frequency of reporting technology issues appeared to be consistent across three of the four sites (Lansing, Van Buren, and Kent), while Genesee reported the fewest.
These frequencies remained proportionally constant when the numbers of reports by site were adjusted for the total number of topics delivered (data not shown). Findings from the home visit observations support this finding regarding technology issues. Only the educators in Van Buren cited a lack of nutrition knowledge or low reading levels among participants, supporting a finding from the debriefings. Lansing and Van Buren equally cited problems with lack of participant follow-through for phone call lessons; many participants forgot their appointments, causing the educators to reschedule at a later time. Unique to Kent, one educator twice cited confusion over the protocol for delivering education during a home visit. For example, "Confused on how much information to go over in the home visit, was I supposed to do all of it?" This confusion is further supported by the quantitative findings in Table 17, which showed Kent had significantly lower odds of being able to deliver the lessons as designed compared to Lansing. Exit interview data further supported this.

Facilitators of education delivery were likewise fairly consistent across sites. Parent engagement was the most cited in all sites (n=108), and Lansing and Van Buren reported it most. One educator in Kent reported, "I could tell that mom had worked on the 'away' activities, because the kids were able to share some with me. They were excited to show me the snack options they would choose." Both Lansing and Genesee educators conducted a few visits at a neutral site, such as a school or a restaurant, rather than at home. These educators found that the neutral site helped to keep parents engaged, especially with reduced distractions. For example, "Both Mom and Dad were very engaged in the visit. We had it at the school which worked out well. The child did different activities, while I conducted the visit with Mom and Dad." Despite frequently reported DVD problems, educators in each site also praised the positive impact the DVD clips had on parent engagement (n=10). "Mom admired the videos of parents working together as a united front. Says it's hard to do the same in their household, but want to practice it more." These qualitative data are supported by the logistic regressions in Tables 11 and 17, which show parent engagement as the strongest predictor for educators to deliver the lesson as designed.
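The frequency-by-site counts in Table 20 below are, in effect, tallies of coded NERTD free-text responses. A minimal Python sketch of such a tally follows; the site-theme pairs are invented placeholders, not actual study codes:

    from collections import Counter

    # Invented (site, theme) codes standing in for coded NERTD responses.
    coded = [("Lansing", "distraction"), ("Lansing", "technology"),
             ("Van Buren", "distraction"), ("Genesee", "parent engagement"),
             ("Lansing", "distraction"), ("Kent", "technology")]

    by_site_theme = Counter(coded)
    by_theme = Counter(theme for _, theme in coded)
    print(by_site_theme[("Lansing", "distraction")])  # 2
    print(by_theme["technology"])                     # 2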
Table 20. Summary of barriers and facilitators (potential moderators) to implementation fidelity (lesson delivery), as listed on the NERTD, which queried, "Please explain any things that have helped or made it difficult for you to deliver this week's topic as designed."1

Barrier themes (frequency by site; selected quotes):
1. Participant interrupted/distracted (child, phone call, visitors, dog, environmental noise): L(36), V(25), G(15), K(6).
   "Kids were active, in and out of house, getting Mom's attention."
   "Many distractions in the house. Mom talked over children which made things louder & woke the baby. It was also a hot day, with no A/C, and a dog inside."
   "The parent and I had a hard time talking because the child kept interrupting."
2. Technology (TV or DVD problems): L(10), V(20), G(7), K(6).
   "DVD player would not read DVD. Went through questions & activities."
3. Lack of participant knowledge (nutrition knowledge, reading level): V(2).
   "Parent not exactly sure what is healthy."
   "Very low reading levels."
4. Lack of participant follow through (forgot appointment): L(2), V(9), G(1), K(1).
   "Parent forgot I was coming and had to leave for an appointment."

Facilitator themes (frequency by site; selected quotes):
1. Delivery at a neutral site (school, restaurant, etc.), not home: L(5), G(4).
   "Both Mom and Dad were very engaged in the visit. We had it at the SKIP site which worked out well. The child did different activities while I conducted the visit with Mom and Dad."
2. Children occupied during lesson (coloring sheets, supplemental activities, second educator): L(28), V(9), G(22), K(4).
   "Children were very loud. The second educator distracted the child during the lesson."
   "The coloring pages helped us from being interrupted."
3. Technology (extra DVD, laptop): L(11), V(3), G(21), K(1).
   "Bringing the laptop to watch DVD segments makes it much more convenient to be able to set up wherever is best for the family."
4. Parent engagement: L(30), V(58), G(17), K(3).
   "I could tell that mom had worked on the 'away' activities because the kids were able to share some with me. They were excited to show me the snack options they would choose."
   "Mom was engaged, asking daughter questions about her favorite foods. Reflected on her childhood and how it has influenced her as a mother."
   "Mom, dad, and son watched the videos and went through the questions together. Everyone enjoyed it."
5. Participant able to relate to families in the video clips: L(2), V(3), G(3), K(2).
   "Mom admired the videos of parents working together as a united front. Says it's hard to do the same in their household, but want to practice it more."

1 Information in parentheses is the number of mentions by site, where L=Lansing, V=Van Buren, G=Genesee, K=Kent.

Home visit observations

Descriptive statistics of educators' NERTD scores compared with observer NERTD scores

Table 21 shows that, across all sites, there was 63% agreement on scores for parent engagement, 50% for parent reaction to key messages, 75% for extent to deliver the lesson as designed, and 71% for duration of topic delivery, as defined post hoc. Therefore, validity of educator scores was best for extent to which the lesson was delivered as designed.

Table 21. Mean (SD) scores and agreement for four items between educator and home visit observer (i.e., implementation fidelity) on the Nutrition Educator Report of Topic Delivery (n=7 observations). Columns: educator mean±SDa / observer mean±SDa / % agreement for all sites; Lansing (n=2); Project L.E.A.N. (n=1); Genesee (n=2); Kent (n=2).

1. Parent engagement:                      5.00±0.00 / 4.71±0.49 / 63; 50; 0; 100; 100
2. Parent reaction to key messages:        4.57±0.54 / 4.43±0.54 / 50; 50; 0; 50; 100
3. Extent to deliver lesson as designed:   4.43±0.79 / 4.43±0.79 / 75; 100; 100; 100; 50
4. Duration of topic delivery, agreement
   for optimal (post hoc):                 N/A / N/A / 71; 100; 100; 50; 50

a Scores on a 5-point Likert scale. Duration of topic delivery categories: 1=too short, 2=optimal, 3=too long.
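The agreement percentages in Table 21 can be read as simple exact-match proportions between paired educator and observer scores. A minimal Python sketch, with invented score pairs standing in for the seven observations:

    # Hypothetical paired Likert scores for one item across seven visits.
    educator = [5, 5, 5, 4, 5, 5, 5]
    observer = [5, 4, 5, 4, 5, 4, 5]

    # Exact-match percent agreement between the two raters.
    agreement = sum(e == o for e, o in zip(educator, observer)) / len(educator)
    print(f"Percent agreement: {agreement:.0%}")  # 5/7, i.e., 71%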
Qualitative data from Home Visit Observations

Table 22 shows the emergent themes and subthemes from the seven home visit observations, two at each site except Van Buren, where only one occurred due to a participant cancellation. Although these themes were similar to those in the NERTDs for barriers and facilitators, the home visit observations were the first time that "educator rapport" was noted. The author of this thesis observed positive participant-educator rapport in four home visits from two sites, Genesee and Kent. Each of these four educators was experienced as a home visitor. These educators were better able than the other educators to tune out peripheral distractions, facilitating the participant's engagement. One young educator in Lansing struggled with keeping a participant on task and was distracted by the observer. As found in the NERTDs, parent engagement was common across sites. Parents liked the materials and messages and actively participated by watching the DVD clips, as well as answering topic questions. Two educators, one each in Kent and Van Buren, were especially skilled at eliciting deeper responses and at redirecting participants when appropriate. Child distractions and technology or DVD issues were present in three observations (Lansing, Kent, and Van Buren) but not in Genesee, because in one Genesee observation the family's children were engaged throughout the lesson and the other Genesee educator delivered the lesson at the school, where the child was otherwise occupied.

Table 22. Emergent themes coded from seven home visit observations.1

Theme 1. Parent engagement
  F: Participant engaged with materials [G(2), K(2), L(1)]
  F/B: Participant somewhat engaged [L(1), V(1)]
Theme 2. Educator-participant rapport
  F: Good rapport between educator and participant [G(2), K(2)]
  F: Established as a home visitor with the family [G(2), K(2)]
  B: No rapport established (first visit) [L(2), V(1)]
Theme 3. Adherence to curriculum (fidelity and delivery)
  B: Educator did not show the introductory DVD clip [G(1)]
  F: Educator provided additional information when appropriate [G(1), L(1)]
  F: Participant completed each question during the lesson [K(1)]
  F: Educator provided a thorough overview of the study [V(1)]
  F: Educator probed for deeper responses [K(1), V(1)]
Theme 4. Distractions
  B: Kids were busy and noisy, interrupting [K(1), V(1), L(1)]
  B: Child crawled on floor, acting like a dog [L(1)]
  B: Participant answered the phone [L(1)]
  B: Participant required redirecting [L(1), V(1)]
Theme 5. Technology issues
  B: DVD player not working [K(1), L(2)]
  B: Participant lost DVD, used educator DVD [L(1)]
  B: Used game box to play DVD [L(1)]
  F: Used laptop [G(1), L(1)]
  B: Poor TV quality [K(1)]

1 Information in parentheses is the number of mentions by site, where L=Lansing, V=Van Buren, G=Genesee, K=Kent. F=facilitator, B=barrier.

Exit interviews

Results are reported by the six main themes from the coding tree, which is found in Appendix I.

Educator training

Table 23 shows a summary of topics, by site, queried in the exit interviews. On average, the educators reported satisfaction with the amount of training but desired improved coverage of training concepts. Educators in two sites (Lansing and Kent) suggested role playing the delivery of a lesson to improve their confidence. Some sites wanted the data collection procedures interfaced with the EH delivery. One site (Van Buren) found the training to be overwhelming, citing the lack of materials to review ahead of time as the main reason. Mean ARCS scores for the training session on educational delivery were shown in Table 6 and help to support these findings; Van Buren tended to have the lowest mean ARCS scores for all sub-constructs.

Table 23. Topics queried in exit interviews with educators regarding perceptions of training, data collection, implementation fidelity, study design, and timeline, with selected quotes by site.1

1. Satisfaction with training: positive comments (8), negative comments (6) [V, L, G, K]
   "I was satisfied. I remember having to go back and ask questions, but my questions were always answered."
   "I think that if we could have gotten the stuff ahead of time and gotten a chance to look through it and then maybe ask questions that weren't answered at a particular time, this would have been more helpful."
2. Training improvement: role play (2) [K, L]
   "…maybe doing some role playing."
3. Improvement of data collection: pre-made packets (15) [V, L, K]
   "I think it would have been helpful to stock each folder ahead of time for each family with everything we needed so we could just grab the folder and go."
   "Provide a timeline with every piece of paperwork and when to do it…there are just so many things to keep track of."
4. Facilitators of data collection: premade mailing labels (5), teams (1) [V, G, L, K]
   "One thing I thought was really helpful was having those preprinted labels on the envelope."
   "Going in teams was really helpful, too."
5. Barriers for ability to deliver lessons as designed: child distractions (5) [L, V, G, K], scheduling of visits (10) [V, L], miscellaneous (10) [V, L, G, K]
   "I think that it is really geared toward the parent, so when the children were there it was sometimes difficult because they were distracting."
   "…families would run out of minutes on their phones or the phone was disconnected."
6. Facilitators to being able to deliver the lessons as designed: DVD clips (3), parent engagement (3), other (3) [L, V, G]
   "Watching the DVD clips was helpful."
   "Parent interest in the material. Some of them were really on the ball. They wanted to learn it."
7. Protocol modifications: home visits in place of phone calls (4) [V, G, K]
   "All of my families preferred home visits rather than phone calls, so I didn't do phone calls….I did a visit each time."
8. Participant survey status spreadsheets: confusing (8), useful (1) [L, V, K]
   "They were confusing, I didn't find it helpful. They dropped one of my participants and would drop the call and then they were on one week and gone the next week. It was too confusing. I was checking all the time."
   "I would definitely say it was a helpful tool but rather confusing and frustrating at times."
9. Clarification contact/availability: author of this thesis (12) [L, V, G, K]
   "It was the study coordinator (Jamie). Very timely. Communication was really good."
10. Usefulness of ongoing debriefings: helpful (7) [L, V, G, K]
   "The first time it was more confusing than helpful but I think that is because all of us were trying to figure out who was doing what…but when we were talking it was clarified as a list of things to ask and they were answered."
   "We are so busy doing so many other things it's easy to let it slide and be like 'oh yeah, I need to get back to that.' It was a reminder to keep the fire going and keep moving forward."

1 Site: L=Lansing, V=Van Buren (includes Project L.E.A.N.), G=Genesee, K=Kent.

Data collection

Confusion about the data collection process was common across sites. Some educators found it challenging to keep track of which forms were filled out for home visits or phone calls, both by the participant and the educator.
One Kent educator stated, "It was confusing to me, the way the forms were labeled and called certain things, to me, it just got confusing." Another educator, from Van Buren, stated, "Not until after the conference call [periodic debriefing] did I realize that they [forms] were required after the phone call, too." Educators across sites collectively suggested that pre-made form packets, organized by lesson for each family, would help to reduce confusion and keep them on track: "It would have been nice to have a packet of forms [labeled by topic] for each family…then you'd have everything there you needed, rather than having loose forms. Maybe divide the packet by topic." Confusion surrounding calculating the three-month follow-up dates was also present across sites. For example, "And maybe the part about when we complete a family, we are supposed to email you and send. I don't know if there would be an easier way to do that. There was a lot of confusion." In contrast, Lansing educators referred to the participant spreadsheets daily because of their dual roles as educators and research aides. Despite the general confusion, educators stated that once they understood how to interpret the spreadsheets, they found them helpful for knowing just where their participants were in the course of the study. One educator in Kent, however, stated that she stopped looking at the spreadsheet: "They were confusing, I didn't find it helpful. They [external evaluator] dropped one of my participants and would drop the call and then they [participant] were on one week and gone the next week. It was too confusing."

Educators liked the educators' binders and curriculum grids, as well as the prompt email responses from the author of this thesis; all of these facilitated the data collection process. Van Buren educators stated that having pre-printed mailing labels made the mailing process more efficient.

Outside EH evaluator

The educators cited an initial lack of understanding about the participants' survey status spreadsheets from the study's outside evaluator as one reason for frustration with the study's progression. One educator in Van Buren said, "Sometimes I would be confused, so I would ask [site coordinator] and she would be very helpful. She showed me how to look at them properly and I would understand them more, but at first it was confusing. Remembering who was treatment and control, that was the main confusing part for me." Confusion and the need for clarification were common across sites. For example, "I would definitely say it was a helpful tool, but rather confusing and frustrating at times." Reasons for lack of understanding and frustration with these spreadsheets included delayed explanation of spreadsheet interpretation, incorrect coding, and participant identification numbers that changed to fit the outside evaluator's needs. Lansing participants had considerable difficulties with completing the survey calls after each phase of the study. Lansing educators attributed this difficulty to a combination of lack of follow-through on the participants' part and miscommunication between the outside evaluator and participants regarding the window of availability to complete the calls. Educators cited weekly debriefings, timely responses from the author of this thesis, and collaboration among team educators as most beneficial in overcoming these initial barriers.
"The first time [debriefing] it was more confusing than helpful, but I think that is because all of us were trying to figure out who was doing what…but when we were talking, it was clarified. We had a list of things to ask and they were answered." Van Buren (including Project L.E.A.N.) and Kent educators did problem solving as a team before contacting the author of this thesis. One educator said, "Having the team here to be able to collaborate and bounce questions, 'do you understand this better than I?' That was very helpful. And of course having the backup of you [the author of this thesis] to clarify if needed."

Delivery protocol

Educators in each site reported modifying the protocol for lesson delivery. A few educators in Genesee and one in Kent delivered all six lessons as home visits at their families' requests. "I did it with all four families. They didn't want the phone calls. They wanted the visits, because I could tell, especially one, she said, 'I don't know if I'll have a chance'…when I told her I could come out, she preferred that." Findings on phone call durations in minutes (Table 12) support these reports, showing that Genesee tended to have the longest mean phone call delivery times because phone calls were changed to home visits. Educators made other modifications to the delivery protocol to improve participants' level of engagement and responsiveness, such as reducing the time spent on certain lessons. For example, one educator in Kent stated that one parent did not have a picky eater, so she skimmed over that part of the lesson but spent more time on a different part. Educators also adapted lessons to a family's resources or situation while still delivering the key messages in each lesson. One educator stated, "Some families didn't even have tables, so I tried to think about the little steps the family could take. If they don't have a table, at least bring the coffee table over to put plate on…. so reaching the families where they're at and trying to either do small steps or let's go beyond this and see what else can you do to help with this?"

One delivery protocol modification, unique to Lansing and Van Buren educators, was teaching more than one lesson per visit. This modification came about to meet the intervention deadline set by the outside evaluator. Educators did not find collapsing two lessons into one to be a barrier, but rather a benefit for both the participant and themselves.

Barriers

Barriers to implementation fidelity were consistent across sites. Child distractions were the most commonly cited barrier for both home visits and phone calls. One educator stated, "The phone visit for me was the hardest thing, because I could hear kids screaming in the background while I'm trying to ask questions. A majority of the time there were two scenarios, I knew the mom was not being completely honest that she watched it (DVD), just giving answers off the top of her head. Or there was just too much going on in the background that she wasn't giving me answers that she would have." Specific reasons for distracting children included the child seeking the parent's attention, the parent tending to the child's needs, and lack of child engagement with the lesson component. Additional barriers mentioned more than once included the need to reschedule phone calls for lesson delivery and phone-related issues such as the participant running out of phone minutes, not answering, and disconnected numbers.
Van Buren educators reported phone issues more often than other sites. For example, "I had a lot of participants, quite a few, who wouldn't answer their phones and I couldn't contact them for a week, especially with the interview questions. When it [the lesson] is on the phone, it is a lot easier for them not to be there." Technology issues, such as a defective DVD or participants lacking a working DVD player, were also cited. Other barriers, unique to Lansing, included "parents just going through the motions" for the gift card incentive and lack of parent engagement.

Lansing educators collectively cited their age as a barrier: "The fact that we're students and a lot younger than a lot of the participants. I don't think a lot of them took us that seriously. We had one participant's mom tell us that we looked like babies and after that visit we never heard from them again." These educators felt they were at a disadvantage compared to the outlying sites, where most educators worked with an established client base. The logistic regressions in Tables 11 and 17 reinforced that educator age was significant in predicting an educator's ability to deliver the lessons as designed; the older, or more "life experienced," the educator, the more likely they were to be completely able to deliver the lesson as designed. This finding was significant for both Ho1 and Ho2.

Facilitators

Facilitators of implementation fidelity included parent engagement, being an established home visitor, and participants' trust in the educator. One educator stated, "Some of them [parents] were really on the ball. They wanted to learn it; they were interested in it…quite a few were that way." The home visit observations by the researcher, reported earlier, helped to confirm these findings: the majority of parents were engaged with the materials, and most educators were established home visitors who had built trusting relationships with their families. Across sites, educators cited not having children present as a facilitator. For example, "I could see this (EH) presented to parents like in a parent meeting, where we provide child care…I think that would be a great way to present it (EH) to parents too." The DVD clips that showcased the participants' peers or other parents were cited as an effective strategy to engage parents further. One educator stated, "I think whenever the parent really got involved, [it] was when they were watching the video and hearing from their own peers."

All educators, in all sites, stated that maintaining fidelity in delivering the lessons and collecting data improved and became easier over time. Educators consistently cited the weekly debriefings and responses to emerging questions or concerns as helpful and motivating. "We are so busy doing so many other things, it's easy to let it slide and be like 'oh yeah, I need to get back to that.' It was a reminder to keep the fire going and keep moving forward." Overall, the periodic debriefings and exit interviews were unique in that they were the only sources of qualitative data that illustrated the breadth of data collection issues throughout the study. Although the qualitative data sources together produced a large volume of text to code, each source served a different purpose, eliciting some unique data not found elsewhere that were supported by either quantitative data or other qualitative data sources.

CHAPTER V.
DISCUSSION

The purpose of this research was to evaluate factors that enhanced or impeded the educational delivery and implementation by trained educators of the Eat Healthy nutrition education intervention. This study evaluated, in depth, the dose of education delivered, defined as "delivery of education," and fidelity, defined as "implementation fidelity." The study is important in that it adds to the very small body of literature on process evaluation of nutrition education interventions.

Hypothesis 1

The scores for the ARCS sub-constructs from the educators' training session were assessed as predictors of an educator's ability to deliver lessons as designed, but they provided no explanatory power. Van Buren served as the pilot site for training, likely explaining its poorer scores relative to educators in other sites. In addition, these ARCS scores might reflect more than just the educators' perceptions. Initially, some of the educators and the site coordinator at Van Buren were unaware of their role in the Eat Healthy intervention, and some appeared hesitant to participate. Even though trainings were standardized, minor adjustments were made, and these changes were reflected in the slightly better scores from other sites. In the exit interviews, across sites, positive and negative comments about satisfaction with training were reported in equivalent amounts, providing support for the variation in scores.

Despite good mean ARCS scores for the training, Hypotheses 1a and 1b were not supported. As revealed in the exit interviews, the phrasing of some of the ARCS sub-construct items caused educators to question how to answer them effectively. Though Hypothesis 1 was not supported, it is important to acknowledge that supplemental training was ongoing in several ways. Open lines of communication, via periodic debriefings, one-on-one phone calls, and emails between the research team and educators, facilitated the educators' self-efficacy for delivery of the materials and helped to maintain good levels of implementation fidelity.

Quantitative results for extent to deliver the lesson as designed from the NERTDs supported the findings of Barr et al. (2002), Cooke (2000), and Kallestad and Olweus (2003). These studies reported that educators with higher levels of self-efficacy were more likely to have higher levels of implementation. Educators who felt they were adequately and sufficiently trained tended to score better on whether they believed they could deliver the intervention as intended. In this study, Kent educators were significantly less likely than Lansing educators to score a five on extent to which the lesson was delivered as designed. Exit interview feedback helped to substantiate these findings: Kent educators reported that their training could have been improved by role playing lesson delivery. Their response supported the findings of Durlak and DuPre (2008) that adequate training should provide educators with competence in intervention skills, acknowledge their expectations and motivations, and increase their sense of self-efficacy.

Hypothesis 2

A variety of educator characteristics have been found to predict fidelity of implementation and educational delivery, specifically an educator's extent to deliver lessons as designed (Dusenbury et al., 2003). Rohrbach et al. (1993) reported that educators with more training and self-efficacy, despite less field experience, were more likely to maintain good levels of implementation fidelity, and the exit interview feedback from the Lansing educators in the present study supported these findings. The Lansing educators were the youngest and tended to have the least experience in delivering education. They were also more closely involved in the study, as they served dual roles: 1) nutrition educators and 2) research aides. Quantitative findings also showed that the longer an educator had been a home visitor, the lower the odds of being completely able to deliver the lesson as designed. This inverse relationship between length of time as a home visitor and delivering the lesson as designed might be related to longer delivery times or to having established relationships with participants. Interestingly, age was the most significant predictor of an educator's delivering the lessons as designed, even in light of the inverse relationship between years as a home visitor and extent to which the lesson was delivered as designed. This contrast might suggest that life experience facilitates rapport more than length of time working as a home visitor.

Parent (participant) engagement was both a facilitator of implementation fidelity and the strongest predictor of an educator scoring a five (completely able) for extent to which the lesson was delivered as designed, a finding supported by research in preventive medicine and psychology (Prado et al., 2006). In the present study, educator responses on the NERTDs, home visit observations, and exit interviews substantiated this finding. Initially, young educators in Lansing tended to experience lower parent engagement levels; however, these levels rose over time. Lansing educators cited building trust and rapport, as well as self-confidence in delivering the materials, as contributing factors. These findings echo those of Heinonen et al. (2013) and Prado et al. (2002), in that therapists' self-rated interpersonal skills, such as engagement and encouragement, appeared to foster stronger alliances with their clients.

Across sites, excluding Lansing, most educators worked with an established participant base within home-visiting programs, which likely facilitated parent engagement and fidelity of implementation. Educator-participant alliances also facilitate rapport, which can result in greater participant engagement as well as greater positive behavior changes (Leach, 2005). Establishing rapport and alliance is a time-intensive process in which trust, empathy, and mutual respect are essential for participant success in a program. Though not considered in this present study, alliance can be an important variable to consider when interpreting behavior outcomes (Elvins & Green, 2008). Specifically, an educator has the potential to be the central force behind a participant's engagement and success in an intervention (Blow et al., 2007). While it may not be possible or appropriate to match educator to participant based on age or other characteristics, there remain "common factors" that educators should be aware of to facilitate alliance or rapport with new participants.
These common factors include an educator possessing a working knowledge of a participant's living situation, being mindful of how the educator dresses, complimenting a participant about something in their home, mirroring the participant's body language, and using reflective listening (Sprenkle & Blow, 2003). One item not queried on the educator demographic form was the number of children each educator had. Whether the educator had children might have been important to consider, given the quantitative finding that age affected an educator's ability to deliver the lesson as designed. Having children could also affect rapport with other parents, as observed during two home visit observations in which the older educators themselves had children. Just as educator characteristics can moderate process and behavior outcomes, family characteristics can also impact outcomes. Family characteristics, such as parent age, number of children, and education level, were not used to match participants to educators, nor were differences in family characteristics examined by county. These items could potentially aid in accurately interpreting outcome differences among sites.

Duration of topic delivery was not a significant predictor of an educator's ability to deliver the lesson as designed, perhaps due to the large range in minutes for lesson delivery. This variation was possibly a result of delivery modifications, such as delivering phone call lessons as home visits, or of numerous distractions and redirections during a lesson. It is important to note that several educators, particularly in Genesee, adapted the delivery method to do all home visits and no phone calls to best meet the needs of their participants. This resulted in some topics lasting as much as three times the upper a priori limit. Faw et al. (2005) found similar results, where logged delivery times differed from the expected times outlined in delivery protocols. Unlike a previously cited study (Byrnes et al., 2010), Eat Healthy educators did not omit portions of the lesson to maintain time delivery adherence, as evidenced by longer duration times and educator feedback. One educator in Kent reported skimming a particular portion of a topic that did not relate to the family, i.e., the participant's child was not a picky eater. The educator adapted the lesson by applying the adult learning approach of Anchor, Add, Apply, Away (AAAA), in which "just in time" information is shared to best meet the needs of the parent (Goetzman, 2012).

With regard to logged times for lesson duration, not all educators noted the start time as when they actually began to deliver the lesson; some noted when they entered the home. This only came to light during the home visit observations. It is important to consider whether establishing a priori optimal duration times or cut-points is appropriate, as these predetermined cut-points assume that all participants have the same knowledge and ability to learn at the same speed. In the future, educators should at least be trained on how to log lesson durations.

Hypothesis 3

Intervention complexity can challenge the maintenance of implementation fidelity, reinforcing the necessity of ongoing monitoring and timely feedback for questions or concerns that arise. Similar to other studies, this study found that consistent and timely support from the author of this thesis was fundamental for educators to stay focused and adhere to the intended design (Hill et al., 2007).
Ongoing, periodic debriefings allowed site coordinators and educators to ask questions and discuss concerns that arose, such as protocol modifications, questions on data collection, and difficulties understanding the spreadsheets on participant status from the outside evaluator. In the exit interviews at each site, educators reported that the periodic debriefings were helpful, and this support may have contributed to the good implementation fidelity levels, i.e., scores of four to five on the NERTDs. Of the four sites, Kent's participation was the most sporadic, and this difference likely played a part in Kent's reduced odds of being able to deliver the lesson as designed.

Consistent with other studies (Dusenbury et al., 2003), educators tended to score higher than the observer across all home visit observations. The highest level of agreement occurred for extent to which the lesson was delivered as designed, the primary dependent variable, suggesting that educators tended not to inflate their responses. Greater variability in other independent variables, such as participant engagement and reaction to key messages, could be affected by an educator's own perceptions of each participant's characteristics and level of rapport. Despite this variability, mean scores for all items except duration were good, between four and five. High levels (80-90%) of implementation fidelity have been linked to positive behavior outcomes (Story et al., 2000); however, lower fidelity ranges, between 60-80%, have also had similar results (Cho et al., 2005; Spoth et al., 2002). In the present study, there were not enough home visit observations to make a statement on percent agreement.

Common barriers to implementation fidelity cited by Botvin (2004) include adaptations to the intended protocol, limited or insufficient training, lack of ongoing support, and external demands on educators, such as time constraints. In the present study, however, none of these were barriers, and the few adaptations made to lesson delivery were facilitators. Adaptations to the delivery protocol are commonly reported across studies. Dusenbury et al. (2003) noted that modifications or adaptations are common and that researchers need to anticipate how and why educators might modify the protocol and to develop guidelines that maximize fidelity and meet program goals. Based on exit interviews and periodic debriefings, many educators reported modifying the lesson delivery method when necessary. Griffin et al. (2010) found similar results: although modifications to the intended protocol can be interpreted as a barrier to fidelity, they served a facilitating role in this study for parent engagement and meeting the needs of some participants. Educators collectively reported that phone calls were not as effective as home visits in delivering the lessons; participants' lack of follow-through in keeping appointments and returning educators' phone calls substantiates their reports. All four sites cited technology issues as a barrier. Defective DVDs and non-working DVD players or TVs were of most concern, as the DVD clips were a key element of delivering the material. An alternate method of delivering the video, such as downloadable files for tablets, is key for future implementation.
Instrument design

In contrast to clinical studies, implementers of community-based interventions have lacked standardized measurements for fidelity of implementation and delivery and have instead used self-report, observations, and audio recordings to collect such data (Breitenstein et al., 2010). Most instruments have been either generic or specific to a particular intervention (Hogue et al., 2008). The present study likewise used an instrument specific to the research questions, the NERTD, as well as self-reports and observations. Generalizability of this instrument is possible, given the phrasing and content of the items. The instrument might be more robust if there were multiple items per concept to create constructs and if its validity were tested. To this end, confirmatory factor analysis was conducted to examine whether a combination of two or three NERTD items formed a construct. Despite the strong correlations between these items, there was no relationship between the latent factors.

Data collection issues

Both debriefings and exit interviews revealed the frustrations educators encountered throughout the study surrounding data collection and the timeline of the study's progression, similar to Wilson et al. (2009), who reported that the breadth of data collection in multi-site interventions can increase frustration and negatively impact levels of fidelity. In the present study, some educators were not aware of, or had not implemented, new forms that resulted from mandatory revisions or clarifications. This supports the educators' suggestion that pre-formed packets, by topic, and clearer directions on data collection during training would help to alleviate such frustration and confusion.

Unforeseen issues with keeping track of participant status, i.e., moving from intervention phase to control phase and vice versa, and with completing the outside evaluator's phone call surveys remained a challenge throughout this study. Despite weekly spreadsheet updates, continuity of participant identification codes was lacking, as were real-time status changes. This commonly resulted in a delayed onset of the next phase of the study. Timely feedback from the author of this thesis to the educators helped to minimize the spread of confusion and to maintain good levels of implementation fidelity. This action supports the findings of Hill et al. (2007), where reinforcing essential program elements and maintaining open lines of communication helped to reduce such negative impacts.

Study Strengths

To the author's knowledge, few in-depth process evaluation studies have been published on nutrition education interventions. This study used both quantitative and qualitative data collection methods. Triangulation of data collected from different sources, i.e., NERTDs, self-reports, observations, debriefings, and exit interviews, substantiated findings and provided a comprehensive overview of what transpired during the EH implementation. The process evaluation instrument, the self-report NERTD, was created specifically to evaluate the research questions. It was also used during home visit observations, by both the educator and the observer, providing rater comparison. Qualitative analysis was conducted by two independent coders using a systematic process, allowing a thorough examination of the data. Standardized trainings of all educators were also a strength, allowing for comparisons among sites for educational delivery and implementation fidelity.
Another strength of this study was the ongoing educator support via periodic debriefings and timely feedback by the author of this thesis. Other strengths include the use of a validated instrument (ARCS) and the large number of educators.

Study Limitations

There were several limitations of this Eat Healthy process evaluation. First, the NERTD was not validated and had only one-item constructs, making it difficult to know whether the items accurately measured what was intended. Another limitation of the NERTD is that there may have been social desirability bias in the educators' scores. Based on the home visit observations, the educators tended to score a bit higher than the observer; however, the potential for observer subjectivity should also be considered, because the observer was not an independent rater. Inter-rater reliability could not be assessed between educator and observer scores, as there were only seven home visits. Additional home visit observations and an additional observer would have helped to improve the reliability of the findings.

The non-significant findings regarding the association of educators' self-reported training scores with their ability to deliver the lessons as designed may imply that the instrument was not the most appropriate tool. Originally, the ARCS was designed to explore the motivational ability of education materials and has been used with psychology students, computer programmers, and college students who evaluated an online health website (Huang et al., 2006; Keller & Suzuki, 1998; Dour et al., 2012). While the Principal Investigator of the Eat Healthy study and the author of this thesis (Karp) thought this was an appropriate quantitative tool, this assumption did not hold true.

Another limitation is that the selection of both the sites and the outlying educators could be biased, because they were selected by the funding agency. Adaptations to the delivery protocol were also a limitation; delivering phone call lessons as home visits made it difficult to accurately evaluate duration of topic delivery. Another limitation was that the weekly team meetings with Lansing educators were not coded, unlike the periodic debriefings from the outlying sites. The lack of consideration of educator-participant matching, or alliance, on process outcomes was a further limitation, one that also has the potential to impact behavior change outcomes. Another limitation was that exit interviews were conducted by the author of this thesis (Karp), which could have introduced bias in the educators' responses. Finally, associations of process outcomes with behavioral outcomes were not examined in this present study, as not all behavioral data had been collected prior to publication of this thesis.

CHAPTER VI. CONCLUSIONS and IMPLICATIONS for FUTURE RESEARCH

Conclusions

This study was unique in adding to the very small body of literature on process evaluation in nutrition education that assists in understanding the mechanisms or moderators that affect outcomes. In this study, both the age of the educators and parent engagement predicted the extent to deliver EH topics as designed, a finding substantiated with the qualitative data. Quantitative and qualitative findings consistently supported parent engagement as the strongest facilitator and distractions by children and technology issues as the most common barriers to educational delivery and implementation fidelity.
This study also found that ongoing and timely support was important to educators for maintaining good levels of implementation fidelity. Finally, the use of mixed methods for data collection provided concurrent validation of the findings.

Implications for Future Research

The finding that age, rather than years as a home visitor, significantly predicted an educator's ability to deliver the lessons as designed suggests that life experience is important for parent engagement, the ability to tune out distractions, and establishing trust and rapport with participants. The author therefore recommends using role play during educator training sessions to build and strengthen these skills. This can be especially helpful with diverse groups of educators, where skills or strengths unique to a few can be taught to others to improve levels of self-efficacy. Because process evaluation is so important for determining a program's success, developing validated instruments remains a critical need in order to improve the generalizability of outcomes. The three single items from the NERTD can be a starting place for other researchers to move this field forward, toward a more valid and reliable instrument than presently exists. Finally, findings from this study suggest that using an a priori definition for duration of delivery may not be appropriate; instead, an instrument might use a rank-ordered range of times for educators to check.

APPENDICES

APPENDIX A
MSU IRB Human Research approval letters

APPENDIX B
Nutrition educator consent form

Location: ____________

Nutrition Educator Consent Form

You are being asked to participate in a research study as a "nutrition educator." Researchers are required to provide a consent form to inform you about the research study, to convey that participation is voluntary, to explain risks and benefits of participation, and to empower you to make an informed decision. You should feel free to ask the researchers any questions you may have.

Study Title: Eat Healthy: A Parent's Guide to Raising a Healthy Eater
Researcher and title: Sharon L. Hoerr, PhD, RD, Professor
Institution and Dept: Michigan State University, Department of Food Science and Human Nutrition
Address and Contact: 204 Trout Food Science Building, 469 Wilson Rd, East Lansing, MI 48824; 517-355-8474 x 156 (lab); 517-355-7713 x110 (office); hoerrs@msu.edu

1. PURPOSE OF RESEARCH
We are asking you to participate in this study as a 'nutrition educator' to gain insight from you about how well you think the parent education sessions are going, and to ask your opinion about the training sessions to become a nutrition educator.

2. REQUIREMENTS
You must be in training to become a nutrition educator and be at least 18 years of age or older to participate.

3. WHAT YOU WILL DO
We are asking you to fill out an anonymous demographic survey, a topic survey after teaching each topic, and a questionnaire after each nutrition training class (two). We are also asking you to participate in an interview at the end of the research study.

4. POTENTIAL INDIRECT BENEFITS
We don't anticipate any direct benefit to you.

5. POTENTIAL RISKS
We don't anticipate any risk with participating in this study, as all responses will be anonymous.

6. PRIVACY AND CONFIDENTIALITY
We would like to collect data from the 'nutrition educators' anonymously. We will keep all information about you anonymous and confidential to the maximum extent allowable by law.
Data will be kept in a locked cabinet in the GM Trout Building at Michigan State University, East Lansing, for at least three (3) years following closure of the study. Michigan Nutrition Network is funding this project and may need to have access to data records. The MSU Human Research Protection Program may also be given access to data in the event of an audit. Although research team members may have access to information about some parts of the study, only the PI will have access to any information provided by the nutrition educators. The results of this study may be published or presented at professional meetings, but your identity will remain anonymous.

7. YOUR RIGHTS TO PARTICIPATE, SAY NO, OR WITHDRAW
Participation in this research project is completely voluntary. You have the right to say no. You may change your mind at any time and withdraw. You may choose not to answer specific questions or to stop at any time. Whether you choose to participate or not will not make any difference in the quality of the training sessions you receive or your evaluation as part of the research team. Discontinuing or choosing not to participate will involve no penalty or loss of benefits to which you are otherwise entitled.

8. COSTS AND COMPENSATION FOR BEING IN THE STUDY
There are no costs associated with participation and no compensation offered to you for participating in this portion of the study.

9. CONTACT INFORMATION FOR QUESTIONS AND CONCERNS
If you have concerns or questions about this study, such as scientific issues, how to do any part of it, or to report an injury, please contact the researcher Dr. Sharon Hoerr (2110 Anthony Building, Michigan State University, East Lansing, MI 48824; hoerrs@msu.edu, 517-355-8474, ext. 156 or 110). If you have questions or concerns about your role and rights as a research participant, would like to obtain information or offer input, or would like to register a complaint about this study, you may contact, anonymously if you wish, Michigan State University's Human Research Protection Program at 517-355-2180, fax 517-432-4503, email irb@msu.edu, or regular mail at 408 W. Circle Dr., Room 207 Olds Hall, MSU, East Lansing, MI 48824. You will be given a copy of this form to keep.

10. DOCUMENTATION OF INFORMED CONSENT
Filling out and returning the surveys and questionnaires and participating in an interview indicates my voluntary agreement to participate in this research study. You may keep a copy of this form for your records.

APPENDIX C
Power point slides for Phase 1 training

Figure 4. Power point slides for Phase 1 Training

Slide 1. Van Buren ISD Educator training for Eat Healthy, Your Kids are Watching! A Parent's Guide to Raising a Healthy Eater
By: Dept of Food Science & Human Nutrition, Michigan State University; Sharon L. Hoerr, RD, PhD, FACN, MOM (principal investigator); Jamie Karp, RD, LN (study coordinator); Nutrition Educators; and Michigan Nutrition Network

Slide 2. Educator Motivations
1. First priority is to help families
Slide 3. Objectives, for educators:
1. Explain an overview of the Eat Healthy program.
2. Explain the timeline for data collection.
3. Discuss recruiting strategies, demographics, and the consent form.
4. Complete the MSU IRB training online.
5. Discuss debriefing times with MSU/MNN.

Slide 4. Agenda:
I. Introductions
II. Eat Healthy short overview
III. Recruitment & coding
IV. Data collection
V. Anthropometric training
VI. MSU Institutional Review Board (IRB) completion
VII. Schedules

Slide 5. II. Eat Healthy, Your Children are Watching, A Parent's Guide to Raising a Healthy Eater:
- More bullets, less text
- More interactive
- 3rd to 5th grade reading level
- Extensive parental interviews
- 5 topic booklets in a binder

Slide 6. A. Key messages:
1. Keep healthy foods in the home; more "Anytime" than "Sometime" foods.
2. Be a good role model with food and drinks.
3. Make positive family mealtimes a priority.
4. Reward children with attention and family activities.
5. Use mealtime rules to reduce struggles.

Slide 7. [Image: cover for the 5 topics and the 23 two- to three-minute DVD clips of real families talking about feeding children.]

Slide 8. B. Order of events:
- April: (1) recruit, obtain 2 consent forms, measure height and weight, FFQ, and parent feeding behaviors; (2) fax both sides of the demographic information to MSU; (3) mail 2 consent forms (MSU & Altarum), FFQ, and parent feeding behaviors; (4) MNN assigns participants to treatment or control groups and sends information to the external evaluator, Altarum.
- April/May: (5) Altarum conducts a 20-minute phone call for assessments.
- April-July: (6) educators conduct the Eat Healthy intervention for the TR and then the CON groups.
- May-July: (7) educators conduct post-tests at the end of EH.
- Aug-Oct: (8) conduct 3-month follow-up measures.
Slide 9. Program delivery:
- Treatment group (TR), Eat Healthy materials: Week 1, home visit, Topic 1 (part 1 of 2); Week 2, phone call, Topic 1 (part 2 of 2); Week 3, home visit, Topic 2; Week 4, phone call, Topic 3; Week 5, phone call, Topic 4; Week 6, home visit, Topic 5; Week 7, catch-up week.
- Control group, health education materials: Week 1, Hip on Health® materials; Week 4, call and send reminder card; Week 7, catch-up week/schedule visits.

Slide 10. Nutrition Education Reinforcement Items (NERI):
- Small food-related incentives given to participants at recruitment and at each home visit.
- Examples: kid-friendly snack recipes, fruit/vegetable strainer, measuring spoons, potato scrubber, cutting board, etc.

Slide 11. Measures and incentives by week:
- Recruitment: consent forms; height and weight of parent and child; FFQ (child); feeding behaviors (FB); incentive: NERI.
- Week 1, Topic 1 home visit (part 1 of 2): incentives: Topics 1 and 2 booklets & DVD ($50 value); NERI.
- Week 2, Topic 1 phone call (part 2 of 2): ARCS.
- Week 3, Topic 2 home visit: ARCS; incentives: NERI; Topics 3, 4, and 5 booklets.
- Week 4, Topic 3 phone call: ARCS; NERI.
- Week 5, Topic 4 phone call: ARCS; NERI.
- Week 6, Topic 5 home visit: ARCS, FFQ, FB; NERI.
- Week 19, follow-up: height and weight of parent and child; FFQ (child); feeding behaviors; incentive: $25 gift card & fruit/vegetable playing cards.

Slide 12. ARCS: Attention, Relevance, Confidence, Satisfaction.

Slide 13. III. Recruitment & coding, outline:
A. Recruitment flyer (add contact information)
B. Consent form
C. Assigning participant ID
D. Data collection, ~20-30 min: (a) demographic sheet, two-sided; (b) FFQ; (c) parent feeding questionnaire; (d) anthropometrics, on the demographic form

Slide 14. A. Recruitment flyer. [Image of the flyer.]

Slide 15. B. Consent forms (MSU & Altarum):
1. Overview.
2. Collect at recruitment; plan 2 copies of each per person: 2 for MSU, 2 for Altarum.
3. Each participant must sign both for themselves and for their child.
4. Assign the ID code carefully and add it to all forms.
5. Leave a copy of each consent with the participant.
6. Fax the demographic form to MSU the same day (517-353-6343).
7. Make copies and mail the originals to MSU.
Slide 16. C. Assigning participant ID:
- Base the ID on the child's name; begin with the consent form; be consistent on all documents.
- Example: "Jamie Sue Marshall," born February 8th, 2009, at location V: V (location), 02 (child's birth month), 08 (child's birth day), J (child's first initial), S (child's middle initial).
- Loss of data is an expensive loss. We cannot teach until the demographic form is received and educators complete the IRB training.
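The coding rule on Slide 16 concatenates the location letter, the child's two-digit birth month and day, and the child's first and middle initials. The following is a minimal sketch of that rule; the function name and the use of Python are illustrative, not part of the study materials.

    from datetime import date

    def make_participant_id(location, first_name, middle_name, birth_date):
        """Build a participant ID per Slide 16: location letter, two-digit
        birth month, two-digit birth day, then first and middle initials."""
        return (location.upper()
                + f"{birth_date.month:02d}{birth_date.day:02d}"
                + first_name[0].upper()
                + middle_name[0].upper())

    # Slide 16 example: "Jamie Sue Marshall," born February 8th, 2009, at location V.
    print(make_participant_id("V", "Jamie", "Sue", date(2009, 2, 8)))  # V0208JS

IDs of this shape also appear in the educator quotes in Appendix H (e.g., K0625IC), consistent with this scheme.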
Slide 17. D. Home visit equipment checklist:
- 4 consent forms per family: 2 for MSU and 2 for Altarum
- FFQ and portion-size flip chart
- Demographic/height/weight form
- Parent feeding behaviors form
- Scale/stadiometer
- NERI
- Pencils and pens

Slide 18. E. Care of data forms:
- Completeness is essential.
- Education cannot begin until Altarum does the phone interview.
- Missing data slows the process.

Slide 19. IV. Data collection:
A. Demographic sheet
B. FFQ
C. Parent feeding behaviors
D. Anthropometrics (back of the demographic sheet)

Slide 20. A. Participant demographic sheet. [Screenshots of the form on Slides 20-23.]

Slide 24. B. FFQ, with participant ID code box. [Screenshot of the form.]

Slide 25. C. Parent feeding behaviors. [Screenshot of the form.]

Slide 26. D. Anthropometrics, on the back of the demographic form. [Screenshot of the form.]

Slide 27. V. Training for height and weight.

Slide 28. General items:
- Inter-rater reliability must be performed among all those who measure.
- All measurements are taken twice.
- Record in the correct units on the equipment: weight in kg, height in cm.

Slide 29. A. Weight assessment, general:
1. Weigh participants at the same time of day for baseline and follow-up assessments.
2. Ask participants not to eat a heavy meal or drink a lot of water within 3 hours of the assessment.
3. Face the participant away from the instrument readout.
4. Do not comment on the participant's weight!
5. Use the same equipment for each measurement.

Slide 30. Weight assessment equipment: a digital scale (Tanita BWB-800S) and a stool or chair to permit the participant to remove shoes and socks.
Slide 31. Weight assessment protocol:
1. Zero the scale on a hard, flat surface (no carpet).
2. Be sure the participant removes excess clothing and empties his or her bladder.
3. Both feet of the participant must be centered and completely on the scale; the participant stands still.
4. Record weight to the nearest 0.1 kg.
5. Repeat until two measures are within 0.2 kg.
6. Record the average to two decimal places.

Slide 32. B. Height assessment equipment: a portable stadiometer and a step-stool or sturdy chair; practice assembly and disassembly.

Slide 33. Height assessment protocol:
1. The participant removes shoes and hair ornaments.
2. Check for 4 points of body contact with the wall or stadiometer: (a) head, (b) shoulders, (c) buttocks, (d) heels. At least 2 points must touch the wall.
3. The participant looks straight ahead with eye gaze level with the floor (note that the chin is slightly tucked, as in "military posture").
4. The participant takes a deep breath and stands tall.

Slide 34. Height assessment protocol, continued:
5. Move the stadiometer slide and fix it in place so that the participant can step away.
6. Keep your eyes at the level of the stadiometer readout and record to 0.1 cm.
7. Repeat until two measures are within 0.2 cm.
8. Record to 2 decimal places.
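Slides 31 and 34 apply the same acceptance rule to duplicate measurements: record each reading to the nearest 0.1 unit, repeat until two readings agree within 0.2 (kg for weight, cm for height), and record the average to two decimal places. Below is a minimal sketch of that rule; it assumes "repeat until within 0.2" refers to the two most recent readings, and the simulated readings are illustrative only.

    def accept_duplicate(measure, tolerance, max_readings=6):
        """Call `measure()` until the two most recent readings agree within
        `tolerance`, then return their mean rounded to two decimal places."""
        readings = [round(measure(), 1), round(measure(), 1)]  # nearest 0.1
        while abs(readings[-1] - readings[-2]) > tolerance:
            if len(readings) >= max_readings:
                raise RuntimeError(f"no two readings within {tolerance}: {readings}")
            readings.append(round(measure(), 1))
        return round((readings[-1] + readings[-2]) / 2, 2)

    # Simulated scale readings in kg: 19.3 and 19.8 differ by more than
    # 0.2 kg, so a third reading is taken; 19.8 and 19.7 agree.
    trial = iter([19.3, 19.8, 19.7])
    print(accept_duplicate(lambda: next(trial), tolerance=0.2))  # 19.75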
Slide 35. Height errors:
1. Failure to determine where the height is read.
2. Improper body position (e.g., head in the wrong position).
3. Children trying to stand on their toes.
4. Hair ornaments interfering.

Slide 36. VI. Training for FFQ. Practice doing this on a partner: How long does it take? What questions do you have? What was missed?

Slide 37. FFQ general items. Don't forget:
1. The parent must sign the consent form before filling out the FFQ.
2. Fill out the FFQ only in pencil, NOT pen.
3. Make sure the participant's ID number is written in the box.
4. Don't forget the back side and both columns.

Slide 38. Don't forget (continued):
5. Check that the bottom of the back side is complete.
6. Do your best to answer questions about foods not on the FFQ that might correspond with categories on the FFQ, or tell participants to use their best judgment.

Slide 39. Portion size reference binders. [Photographs.]

Slide 40. Don't forget (continued):
7. Check over the front and back of the form for completeness. Check for a shaded region in the frequency column without corresponding shading in the portion-sizes column; parents tend to forget to answer the last 3-4 questions on the back.
8. The last questions about sex/age on the back of the form are about the child, NOT the parent.
9. Make sure bubbles are completely shaded in, not marked with an "X," a check mark, or half shading.
10. Place the completed form back in the participant's packet!

Slides 41-42. FFQ. [Screenshots of the form.]

Slide 43. Each educator completes an FFQ on themselves: What did you learn? What did you miss?

Slide 44. VII. MSU Institutional Review Board (IRB) training:
A. Each home visitor/educator must complete the MSU IRB online training BEFORE the first home visit.
B. Plan on ~30 minutes to complete the training.
C. http://www.humanresearch.msu.edu/requiredtraining.html

Slide 45. VII. IRB: how to get IRB access:
1. Fill out the IRB sign-up sheet (first and last name, email address) today.
2. We email your information to the IRB staff.
3. The IRB contacts you directly, via email, with instructions on how to register as an MSU Community Member (guest ID).
4. Respond to the IRB emails as soon as possible.
5. Then complete the IRB training.
Slide 46. IRB training is required:
- Anyone involved in human subject research who has contact with people (or their identifiable data) must have current human research protection training.
- Training consists of completing the initial educational requirement and renewing before the training expiration date in 2 years.

Slides 47-49. MSU IRB web site. [Screenshots of the registration pages.]

Slide 50. IRB certificate: print the certificate and mail it to MSU in the provided addressed envelopes.

Slide 51. VIII. Schedules:
- Training on delivery of Eat Healthy: ~1-2 hours, in 2 weeks.
- Interview scripts for the phone interviews.

Slide 52. Nutrition Educator (NE) forms, process evaluation. Fill out today:
1. Nutrition Educator Demographic Form
2. ARCS for Educator/Investigators

APPENDIX D

Power point slides for Phase 2 training and curriculum grid

Figure 5. Power point slides for Phase 2 training and curriculum grid.

Slide 1. Educator training for Eat Healthy, Your Kids are Watching! A Parent's Guide to Raising a Healthy Eater. Michigan State University (MSU), Dept of Food Science & Human Nutrition: Sharon L. Hoerr, RD, PhD, FACN, MOM (principal investigator); Jamie Karp, RD, LN (study coordinator).

Slide 2. Agenda: I. Objectives; II. Eat Healthy; III. Forms & follow-up.
Slide 3. I. Objectives, for educators:
1. Review the teaching guide for Eat Healthy.
2. Practice and use Anchor, Add, Apply, Away (AAAA).
3. List the forms to be faxed and mailed to MSU.

Slide 4. Motivations for educators:
- Train for consistent education delivery; learn what works best and why.
- Help families be healthy and reduce mealtime struggles.
- Use techniques for adult learning. What training have you already had on adult education?

Slide 5. II. Eat Healthy:
A. Background: (1) child food behaviors; (2) parent feeding practices; (3) developing the Eat Healthy intervention.
B. Adult learning.
C. Review of each chapter.

Slide 6. 1. Child food behaviors:
a. Children's poor diet quality relates to health problems and high rates of child obesity.
b. Children in low-income families are more likely to eat more high-fat and sweet foods and drinks, and fewer whole grains, low-fat/nonfat dairy products, and vegetables.
c. The parent's role is to teach children how to eat healthy.
d. Low-income children are underserved, despite high rates of obesity and poor diet quality.

Slide 7. 2. Many parent factors relate to child diet and weight: the parent's weight status, dietary intake, and food preferences; parental child-feeding practices; responsibility for child feeding; family meals vs. eating away from the table; genetics (taste preferences); food available at home; food accessibility; time of consumption; family income; portion size; and physical activity.
Slide 8. There is NO research consensus to support feeding advice for preschool children. These factors likely relate to pediatric obesity:
- Permissive parenting
- A poor home food environment
- Frequent, unsupervised snacking
- Using food as a reward
- Very controlling food pressures

Slide 9. Research suggests that any permissive parenting relates to poor diet and obesity. Feeding styles, by demands on the child and responsiveness to the child:
- Uninvolved (low demands, low responsiveness): make few demands on children to eat and are unsupportive.
- Indulgent (low demands, high responsiveness): make few demands to eat, but those demands are supportive.
- Authoritarian (high demands, low responsiveness): encourage eating using highly directive behaviors and are unsupportive.
- Authoritative (high demands, high responsiveness): actively encourage eating using non-directive and supportive behaviors.
(Hughes et al., Appetite, 2006; Hughes et al., JDBP, 2008)

Slide 10. What kind of "control" is good or bad? Early scholars defined control as pressure, intrusiveness, and domination, and considered it detrimental, but they also viewed no control, or allowing children free rein, as bad, because children require some guidance. Psychological control (pressure, intrusiveness, dominance) is contrasted with structure, or behavioral control (supervision, guidance, limit setting, availability). (Grolnick & Pomerantz, Child Devel Pers, 2009)

Slide 11. 3. Developing Eat Healthy: a 2010 study with 330 mother-child pairs in mid-Michigan informed the aim of Eat Healthy. [Findings and aim are shown graphically on the slide.]

Slide 12. [Image: cover of the Eat Healthy binder.]
Slide 13. Desired CHILD outcomes for the project:
- A diet with more fruits/vegetables and whole grains, fewer sweet drinks, and less whole or 2% milk.
- Healthier weight status.
- More willingness to try new foods.

Slide 14. Desired PARENT outcomes:
- Increased modeling of healthy foods.
- More child-centered feeding.
- Offering the child more fruits/vegetables and fewer sweet beverages.
- Reduced mealtime distractions, like TV.
- Healthier weight status.

Slide 15. B. Adult learning is different:
1. It recognizes current knowledge and experience.
2. It uses just-in-time information.
3. It relates through the use of stories.
4. Anchor, Add, Apply, Away is a helpful mnemonic or paradigm to use.

Slide 16. The adult learning approach (AAAA):
1. Anchor: anchor the content within the learner's experience. How do you do this?
2. Add: add new information; key points.

Slide 17. AAAA, continued:
3. Apply: invite the learner to apply the content in a new way or situation; use the Activities and Questions in the booklets.
4. Away: the learner decides what to take away and use the next week, usually an activity around meals and often with the child.

Slide 18. For more on adult learning: Goetzman D. Dialogue Education Step by Step: A Guide for Designing Exceptional Learning Events. Global Learning Partners, Inc., 2012. Available at: http://globallearningpartners.com/service/courses/professional-development-opportunities/dialogue-education-step-by-step

Slide 19. Timeline. Before any teaching, FOUR things must occur:
A. IRB training is completed and the certificate faxed to MSU.
B. Recruitment and measurements are done: (1) fax both sides of the demographic form and the signed page of the Altarum consent form to MSU; (2) mail all the forms to MSU.
C. MNN assigns control or treatment, and the external evaluator calls all participants for 15-20 minutes of questions.
D. We let you know when, and with whom, to start the lessons.

Slide 20. C. Review of EH. The Nutrition Educator Guide has learning objectives that are not in the participant's materials.

Slide 21. See the lesson plans in the notebook. Each topic has a labeled header:
1. DVD clips (2-3 min)
2. Activities
3. Important to Know
4. Tips (what parents really want are tips and tricks to feed their child)

Slide 22. Find the Choose MyPlate guide under the DVD on the back of the cover.

Slide 23. Key messages:
1. Keep healthy foods in the home, visible and available; "Sometimes" and "Anytime" foods.
2. Be good role models with food and drinks.
3. Make positive family mealtimes a priority.
4. Reward children with attention and family activities.
5. Use mealtime rules to reduce struggles.
Messages are woven throughout all topics.

Slide 24. Intro clip for Eat Healthy:
- What parents think about Eat Healthy.
- Educators review the intro clip with the parent.
Now, view the curriculum grid in the notebook and then the topic overviews.

Slide 25. Topic 1a: Kids are what they eat.
- Key concepts: keep healthy foods in the home, visible and available; healthy foods are "anytime" foods, whereas "sometimes" foods are only now and then.
- Anchor: Clip 1.1; discuss foods kept on hand at home.
- Add: "Sometimes" vs. "Anytime" foods, pp 4, 5, 8.
- Apply: parents rank six types of food choices from most to least healthy, pp 9-11.
- Away: the parent answers the question on p 7 and switches a "sometimes" food for an "anytime" food.

Slide 26. Key message: keep healthy foods in the home, visible and available.
- Learning objectives for parents: (1) describe the types of foods they have on hand; (2) define what makes a food healthy.
- 1. Anchor: Clip 1.1; discuss; do the types-of-foods-at-home inventory, pp 4-5.

Slide 27. Topic 1, Part 1.1, key points/activity: parents complete a home food inventory, pp 4-5.

Slide 28. 2. Add: "anytime" vs. "sometimes" foods, p 8.

Slide 29. 3. Apply: rank six types of food choices, pp 9-11.

Slide 30. 4. Away: the parent switches a "sometimes" food for an "anytime" food, p 7. Plus, a child activity. The NERI for this topic are the fridge magnets.

Slide 31. For the parent at the end of the first visit:
- Give instructions to work on the Topic 1 "Away" by the next visit.
- Schedule a time for the phone call.
- Finish; see the topic overviews for more information.

Slide 32. Topic 2: Be a good role model.
- Key concepts: be a good role model with food and drinks; new foods can take time.
- Anchor: Clip 2.1; discuss who modeled food habits for the parent.
- Add: everything a parent does is a lesson for their child, pp 3, 5; it can take up to 15 tastes to learn to like a new food, p 10.
- Apply: Activity 2.1 on p 4; healthy snack activity, p 7.
- Away: the parent lets the child pick a new fruit/vegetable at the store and tastes it, pp 10-11.

Slide 33. Clip 2.1, "See the big picture, your kids do":
1. Anchor: parents watch Clip 2.1.
2. Add: everything a parent does is a lesson for their child.
Slide 34. Topic 2, Part 2.3: Add. [Screenshot of the booklet page.]

Slide 35. Apply, Activity 2.1: parents complete the activity on p 4.

Slide 36. Topic 2, Apply: Activity 2.2, p 7.

Slide 37. Away and child activity, pp 10-11.

Slide 38. Topic 3: Ways to praise at meals.
- Key concept: labeled praise helps the child understand what he/she does right.
- Anchor: Clip 3.1; discuss how parents encourage and praise their child.
- Add: praise the action, not the person; be specific with praise.
- Apply: activity on p 7.
- Away: the parent keeps track of the labeled praise they use for the next 2-3 meals, p 4.

Slide 39. Key message: reward children with attention and family activities.
- Learning objectives: (1) list 3 reasons why food should not be used as a reward; (2) identify 4 good alternatives to using food rewards.

Slide 40. Topic 4: Making mealtime fun.
- Key concepts: make positive family mealtimes a priority; family meals benefit everyone.
- Anchor: Clip 4.1; discuss typical food commands that parents use.
- Add: a positive feeding pattern allows children choice within structure; an indulgent/uninvolved feeding pattern leads to poor diet and weight problems, pp 3-5.
- Apply: the parent determines his or her own feeding pattern and lists some mealtime rules.
- Away: the parent chooses 2 non-food rewards from a list and uses them within the next week; parents track 2 mealtime rules.
Slide 41. Key message 3: make positive family mealtimes a priority.
- Learning objectives: (1) recall the kinds of praise they give their child for eating behaviors; (2) give two examples of labeled praise at meals; (3) track their use of labeled praise for 2-3 meals.

Slide 42. Topic 5: Learning to eat healthy.
- Key concept: use mealtime rules to reduce struggles.
- Anchor: Clip 5.1; discuss food struggles the parent has with their child; discuss a time when the parent got their child to eat without pressure.
- Add: 90 minutes between meals and snacks; rules reduce struggles, p 4.
- Apply: the parent completes the chart on learning without pressure, p 4; the parent completes the checklist of tips for sometimes foods, p 9, and the activity on p 10.
- Away: the parent plays a sensory game with the child to encourage tasting new food, p 6.

Slide 43. Key message: use mealtime rules to reduce food struggles.
- Learning objectives: (1) explain how having some rules can help reduce mealtime struggles; (2) state the recommended amount of time between meals and snacks; (3) list two or more mealtime rules their family will follow.

Slide 44. Program delivery:
- Intervention group (Eat Healthy materials): Week 1, home visit, Topic 1, part 1; Week 2, phone call, Topic 1, part 2; Week 3, home visit, Topic 2; Week 4, phone call, Topic 3; Week 5, phone call, Topic 4; Week 6, home visit, Topic 5; Weeks 7-8, ISD educator phone call (fruit/vegetables, feeding strategies) and external evaluator phone call (food availability, offering fruit/vegetables, willingness, time in physical activity and screen time).
- Control group (health education materials): Week 1, Hip on Health® materials; Week 4, call and send reminder card; Weeks 7-8, ISD educator schedules visits, and external evaluator phone call (food availability, offering fruit/vegetables, willingness, time in physical activity and screen time).

Slide 45. See the scripts for the phone calls. Scripts are used for the calls in weeks 1b, 3, and 4; there are also scripts for weeks 1a, 2, and 5, which can help but need not be completed.
Slide 46. Appointment/reminder cards:
- Sent to the control participants at Week 4: "Only 3 more weeks before Eat Healthy starts. For more information call ____________."
- Make your own reminder cards.

Slide 47. Lessons, measurements, and incentives by week:
- Week 1: Topic 1 home visit (part 1 of 2); ARCS on Topic 1; incentives: Topics 1 and 2 booklets & DVD, NERI.
- Week 2: Topic 1 phone call (part 2 of 2).
- Week 3: Topic 2 home visit; ARCS for 3; incentives: NERI, Topics 3, 4, and 5 booklets.
- Week 4: Topic 3 phone call.
- Week 5: Topic 4 phone call.
- Week 6: Topic 5 home visit; ARCS for 5; incentive: NERI.
- Week 19: ~12-week follow-up; ISD educator at home: weight, height, FFQ, feeding strategies; external evaluator by phone: food availability, offering fruit/vegetables, willingness, time in physical activity and screen time; incentive: $25 gift card.

Slide 48. Questions about Eat Healthy?

Slide 49. III. Forms and follow-up:
A. ARCS (electronic; will be sent)
B. Educator's Report of Topic Delivery (in the notebook)
C. Mailing/faxing forms
D. Weekly phone call

Slide 50. A. ARCS for each topic. ARCS is a useful instructional model for motivation: Attention, Relevance, Confidence, Satisfaction. Each question is answered on a 5-point scale, 1 = Not True to 5 = Very True.

Slide 51. Attention: gain the learner's attention and keep it; encourage questions. Example item: "There was something interesting at the beginning of the material that got my attention."

Slide 52. Relevance: the learner believes that learning this material is relevant and meaningful. Example item: "It is clear to me how the content of this material is related to things I already know."

Slide 53. Confidence: learners should feel that they can apply what they learn. Example item: "When I first looked at this booklet, I had the impression that I would be able to use it."

Slide 54. Satisfaction: learners should feel rewarded for doing the activities. Example item: "Completing the exercises in Topic 4 gave me a satisfying feeling of accomplishment."

Slide 55. Changes in ARCS scores by topic:

Topic   2011   2012
1       4.05   4.45
2       3.42   4.67
3       3.64   4.22
4       3.72   --
5       3.23   4.55

Scores above 3.50 are considered acceptable. ARCS only on weeks 1, 3, and 5.
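Slides 50 and 55 together imply a simple scoring rule: items are rated 1-5, responses are averaged within each ARCS factor, and a mean above 3.50 is considered acceptable. The sketch below illustrates that rule; the item-to-factor assignment is a hypothetical example (the appendices do not print the scoring key), and item 12 is left out because it may require reverse-coding.

    from statistics import mean

    # Hypothetical assignment of the 13 training-ARCS items (Appendix F)
    # to the four factors; illustrative only.
    ARCS_FACTORS = {
        "attention":    [2, 3, 4, 5],
        "relevance":    [9, 10, 11, 13],
        "confidence":   [1],
        "satisfaction": [6, 7, 8],
    }
    ACCEPTABLE = 3.50  # Slide 55: scores above 3.50 are considered acceptable

    def arcs_factor_scores(responses):
        """Average the 1-5 responses for the items in each factor."""
        return {factor: round(mean(responses[i] for i in items), 2)
                for factor, items in ARCS_FACTORS.items()}

    # Example: one educator's responses, keyed by item number.
    responses = {i: 4 for i in range(1, 14)}
    for factor, score in arcs_factor_scores(responses).items():
        status = "acceptable" if score > ACCEPTABLE else "below cutoff"
        print(f"{factor}: {score} ({status})")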
Slide 56. B. Educator's Report of Topic Delivery. Items:
1. To what extent was the parent engaged in this week's topic? (1 = not at all engaged, 2 = not very engaged, 3 = neutral, 4 = somewhat engaged, 5 = very engaged)
2. To what extent was the child engaged in this week's topic activity? (1-5, same scale)
3. How did the parent react to this topic's key message? (1 = strongly unfavorable, 2 = somewhat unfavorable, 3 = neutral, 4 = somewhat favorable, 5 = strongly favorable)
4a. To what extent were you able to deliver the topic as designed? (1 = not at all, 2 = a little, 3 = neutral, 4 = mostly able, 5 = completely able)
4b. Please explain any things that helped or made it difficult for you to deliver this week's topic as designed.
5. How much time was spent on delivering this week's topic? __________ minutes.
Please circle one topic: 1a, 1b, 2, 3, 4, 5. Reports are completed on each topic but mailed only with topics 2 and 5.

Slide 57. C. Mailing/faxing forms: there will be 2 forms per participant per week. Please make copies of everything first and mail the originals to MSU. Send at the end of each week.

Slide 58. D. Weekly phone call:
1. At a set time with each coordinator, so be sure to convey any issues or questions.
2. What is the best time for your ISD?

Slide 59. Summary:
1. Reviewed the format of each topic.
2. Provided a mnemonic to teach each topic.
3. Reviewed the ARCS and the educator's topic report for each topic.
4. Identified materials to be sent to MSU.
5. Set a weekly time for a troubleshooting phone call with the site coordinator.
6. Complete the Nutrition Educator Demographic form and the Nutrition Educator ARCS.

Slide 60. Contact info:
1. Sharon Hoerr, hoerrs@msu.edu; P: 517-353-7713 x110 or 517-490-1554; F: 517-353-6343.
2. Jamie Karp/Imhoff, karpjami@msu.edu; P: 517-355-8474 x156 or 406-260-8970.

Table 24. Curriculum grid for Eat Healthy.

Topic 1a (visit): Kids are what they eat.
- Key concept: keep healthy foods in the home, visible and available.
- Anchor: Clip 1.1; discuss the types of foods that the parent keeps available; the parent does the food inventory, pp 4-5.
- Add: healthy foods are "anytime foods," but "sometimes" foods are only for now and then, pp 5, 8. Fruits and vegetables do not need to be fresh; frozen and canned are good options. Be sure to rinse canned veggies to remove added sodium.
- Apply: parents rank six types of food choices from most to least healthy, pp 9-11; discuss.
- Away: the parent switches a "sometimes" food for an "anytime" food, p 7.
- Child activity: the child selects photos of healthy snack choices that can substitute for sometimes foods, p 14.

Topic 1b (phone): Kids are what they eat, continued.
- Key concept: portion sizes for preschoolers are smaller than for adults.
- Anchor: Clip 1.4; discuss changes the parents have seen in the child's appetite.
- Add: children need less food than adults, pp 19-20; growth spurts make appetite erratic; eating behavior milestones, p 23.
- Apply: the parent answers questions on the amount the child needs, p 19.
- Away: the parent waits 90 minutes between meals and snacks; the parent serves only small portions at first, p 22.

Topic 2 (visit): Be a good role model.
- Key concepts: be a good role model with food and drinks; new foods can take time.
- Anchor: Clip 2.1; discuss who modeled food habits for the parent.
- Add: everything a parent does is a lesson for their child; it can take up to 15 tastes to learn to like a new food.
- Apply: the parent does Activity 2.1, food behaviors your child learned from others, p 4; the parent does the healthy snack Activity 2.2, p 7.
- Away: the parent lets the child pick a new fruit or vegetable at the store and tastes it, pp 10-11.
- Child activity: the child chooses a new fruit or vegetable to try and places a "super taster" sticker in the activity book after trying it, p 11.
It can take up to 15 tastes to learn to like a new food Be specific with praise Positive feeding pattern allows children choice within structure The 90 minute rule between meals & snacks can reduce struggles Indulgent/uninvolved feeding leads to poor diet and weight problems p3-5 Reward children with attention and family activities, not food Parent does Activity 2.1 food behaviors your child learned from others, p4 Parent does Healthy snack activity 2.2, p7 144 Parent selects best way to praise their child p6 Parent does Activity on how to respond to mealtime issues, p7 Parent determines own feeding pattern p4-5 Parent lists some mealtime rules p8 Parent completes chart on learning without pressure, p4 Parent completes checklist on sometimes food, p9 and activity on p10 Table 24 (cont’d) 3 hole punch on this edge Title Away 1a visit Kids are what they eat Parent switches a “sometimes” food for an “anytime” food, p7 1b phone Be a good role model Parent waits 90 minutes between meals & snacks Parent serves only small portions at first, p22 Child Activity Child selects photos of healthy snack choices that can substitute for sometimes foods, p14 Handouts EH binder; Topics 1&2 2 visit Ways to praise at meals Parent let’s child pick a new fruit or vegetable at the store and tastes it, p10-11 3 phone Making mealtime fun Parent keeps track of labeled praise they use for next 2-3 meals, p4 4 phone Learning to eat healthy Parents choose two non-food rewards to use the next week, p11 Child chooses a new fruit or vegetable to try. Child places a super taster sticker in activity book after trying it, p11 Topics 3,4,5 Parent enhances praise of child with touch, eye contact and smiles, p15 Parent and child choose and do a fun mealtime activity, p13 145 5 visit Parent plays a sensory game with child to en courage tasting new food p6 Parent helps child pick mealtime rules and track for several days, p13 Table 24 (cont’d) NERI Mail to MSU Fax before mailing Eat Healthy Magnets Supertaster stickers ARCS for 1 NERTDs for 1a, 1b, 2 ARCS for 2 Interview scripts 1b Healthy Snacks recipes or Fruit/Veg playing cards, MSU extension recipe book ARCS for 5 NERTDs for 3, 4, 5 Interview scripts 3,4 NOTE:     Child does not need to be present during home visits. There will be an activity for the parent to do with his/her child after each lesson. If the child is present during the home visit, have coloring sheets and crayons or appropriate child activity on hand, should child interrupt lesson delivery. At the start of each session, be sure to review the parent-child “Away” activity from the previous session. Bring an extra DVD as well as DVD player or laptop to all home visits, in case the participant’s DVD player or laptop does not work. 146 Appendix E Nutrition educator demographic questionnaire 147 Location: ______________ Nutrition Educator Demographic Questionnaire FY13 Sex  Male  Female Age _____ Hispanic, Latino, or Spanish origin?  Yes  No Race (check all that apply)  White  Black/African American  American Indian  Asian Indian  Asian non-Indian (Chinese, Japanese, etc.)  Other Education  High school diploma or GED  Some college  College graduate Number of years teaching Nutrition Education? _________ Number of years working as a home visitor? ________ Have you taken formal nutrition classes? Yes____ No_____ Please explain your nutrition education background. 148 What is your interest level in teaching the Eat Healthy curriculum? 
Appendix E

Nutrition educator demographic questionnaire

Nutrition Educator Demographic Questionnaire, FY13
Location: ______________

Sex: [ ] Male [ ] Female
Age: _____
Hispanic, Latino, or Spanish origin? [ ] Yes [ ] No
Race (check all that apply): [ ] White [ ] Black/African American [ ] American Indian [ ] Asian Indian [ ] Asian non-Indian (Chinese, Japanese, etc.) [ ] Other
Education: [ ] High school diploma or GED [ ] Some college [ ] College graduate
Number of years teaching nutrition education? _________
Number of years working as a home visitor? ________
Have you taken formal nutrition classes? Yes ____ No ____
Please explain your nutrition education background.
What is your interest level in teaching the Eat Healthy curriculum? [ ] No interest [ ] Little interest [ ] Neutral [ ] Somewhat interested [ ] Very interested

Appendix F

ARCS questionnaires for nutrition educator training

Eat Healthy FY13 Nutrition Educator Training. Location: _______ Training Date: __________

Recruitment and Data Collection ARCS. Each item is rated on a five-point scale: 1 = Not True, 2 = Slightly True, 3 = Moderately True, 4 = Mostly True, 5 = Very True.
1. After this training, I was confident I will be able to complete this material.
2. There was something interesting at the start that got my attention.
3. The presentation style helped to hold my attention.
4. The way the information was laid out helped keep my attention.
5. The variety of activities and pictures helped keep my attention.
6. I enjoyed this training and would like to know more about it.
7. I liked attending this training.
8. I felt satisfied by the feedback after the activities.
9. The content of this training relates to things I already know.
10. There were pictures or examples that showed me how this material was important.
11. The content of this material is relevant to my interests.
12. I didn't learn anything new in this training, because I already knew it.
13. This material relates to things I have seen or thought about.

Education Delivery ARCS. Identical to the questionnaire above, with the same five-point scale, except for item 1: "After this training, I was confident I will be able to teach this material."
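The items above yield the Attention, Relevance, Confidence, and Satisfaction factor scores analyzed in this thesis. As a minimal sketch of how such subscale scores can be computed, assuming an illustrative item-to-factor assignment and reverse-coding inferred from item wording (the study's actual mapping is not reproduced here):

```python
# Minimal sketch: ARCS factor scores from the 13 items in Appendix F.
# The item-to-factor assignment and reverse-coding are assumptions made for
# illustration, inferred from item wording; they are not the study's own mapping.
from statistics import mean

ARCS_FACTORS = {
    "attention":    [2, 3, 4, 5],        # getting and holding attention
    "relevance":    [9, 10, 11, 12, 13], # ties to prior knowledge and interests
    "confidence":   [1],                 # confident to complete/teach the material
    "satisfaction": [6, 7, 8],           # enjoyment and satisfaction
}
REVERSE_CODED = {12}  # "I didn't learn anything new..." is worded negatively

def factor_scores(responses):
    """responses: dict mapping item number (1-13) to a 1-5 rating."""
    adjusted = {i: (6 - r) if i in REVERSE_CODED else r
                for i, r in responses.items()}
    return {factor: mean(adjusted[i] for i in items)
            for factor, items in ARCS_FACTORS.items()}

# Example: all items rated 4, except the reverse-coded item 12 rated 2
# (which recodes to 4), so every factor scores 4.
example = {i: 4 for i in range(1, 14)}
example[12] = 2
print(factor_scores(example))
```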
Appendix G

Nutrition Educator Report of Topic Delivery (NERTD) questionnaire

Nutrition Educator's Report of Topic Delivery - Eat Healthy (NERTD) Questionnaire, FY13

1. To what extent was the parent engaged in this week's topic? (1 = Not at all engaged, 2 = Not very engaged, 3 = Neutral, 4 = Somewhat engaged, 5 = Very engaged)
2. To what extent was the child engaged in this week's topic activity? (same scale as item 1)
3. How did the parent react to this topic's key message? (1 = Strongly unfavorable, 2 = Somewhat unfavorable, 3 = Neutral, 4 = Somewhat favorable, 5 = Strongly favorable)
4a. To what extent were you able to deliver the topic as designed? (1 = Not at all, 2 = A little, 3 = Neutral, 4 = Mostly able, 5 = Completely able)
4b. Please explain any things that helped or made it difficult for you to deliver this week's topic as designed.
5. How much time was spent on delivering this week's topic? __________ minutes
Topic (please circle one): 1a 1b 2 3 4 5
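One NERTD is completed per session, so each response can be treated as a record and summarized per topic, for example by averaging item 4a, the "extent delivered as designed" rating central to Hypothesis 3. The sketch below is illustrative only; the record type and field names are assumptions, not the study's analysis code.

```python
# Minimal sketch: one NERTD response as a record, plus a per-topic mean of
# item 4a ("extent delivered as designed"). Field names are illustrative.
from dataclasses import dataclass
from collections import defaultdict
from statistics import mean

@dataclass
class NERTD:
    topic: str             # "1a", "1b", "2", "3", "4", or "5"
    parent_engaged: int    # item 1, rated 1-5
    child_engaged: int     # item 2, rated 1-5
    key_msg_reaction: int  # item 3, rated 1-5
    as_designed: int       # item 4a, rated 1-5
    minutes: int           # item 5, duration of topic delivery

def mean_as_designed(reports):
    """Average item 4a within each topic across all educator reports."""
    by_topic = defaultdict(list)
    for r in reports:
        by_topic[r.topic].append(r.as_designed)
    return {topic: mean(vals) for topic, vals in by_topic.items()}

reports = [NERTD("1a", 5, 3, 4, 5, 40), NERTD("1a", 2, 1, 3, 4, 25)]
print(mean_as_designed(reports))  # {'1a': 4.5}
```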
Appendix H

Coding tree for summary of nutrition educator reports (NERTD) responses for barriers and facilitators of topic delivery

Table 25. Coding tree for summary of Nutrition Educator Reports (NERTD) responses for barriers and facilitators of topic delivery, including selected quotes. Frequencies of coded responses are in parentheses.

1. Barriers
  1.1 Technology
    1.1.1 Difficulty playing DVD (29)
  1.2 Participant interrupted/distracted
    1.2.1 Child (32)
    1.2.2 Phone (6)
    1.2.3 Visitor (3)
    1.2.4 Dog (7)
    1.2.5 Rushed (2)
    1.2.6 Environmental noise (4)
  1.3 Participant's negative attitude
    1.3.1 Skeptical (2)
  1.4 Nutrition educator (5)
    1.4.1 Confused or unprepared
    1.4.2 Had to pry for answers
  1.5 Participant's lack of knowledge
    1.5.1 Nutrition knowledge (3)
    1.5.2 Reading level (2)
  1.6 Lack of participant follow-through
    1.6.1 Forgot appointment (3)

2. Facilitators
  2.1 Neutral site
    2.1.1 Fewer distractions (5)
  2.2 Kids occupied
    2.2.1 Coloring sheets and crayons (10)
    2.2.2 Supplemental activities (18)
    2.2.3 By second nutrition educator (4)
    2.2.4 By another caregiver (4)
  2.3 Participant engaged in topic/key messages (35)
  2.4 Alternative DVD player/laptop (8)
  2.5 Participant relatability (30)

Selected quotes by case (code, unique ID):
(1.1.1) K0625IC: "DVD player would not read DVD. Went through questions and activities. Mom will watch DVD when Dad brings laptop home. Will call in two weeks to review Topics Two and Three."
(1.1.1) G0317PJ: "The DVD kept skipping on 4.4 so it was hard to complete this section to its fullest."
(1.2.1) L1124WJ: "Kids were active: in and out of house and getting mom's attention"
(1.2.1) V0827NE: "Her children kept interrupting but not so much that we couldn't handle it"
(1.2.2) L0418MO: "Phone calls interrupting dialogue a couple of times"
(1.2.3) V0329SP: "Mom needed to get up several times to attend to daughter, and to get husband up for a visitor."
(1.2.4) L0306AW: "There were a lot of noisy kids in the house and dogs running around so Mom was pretty distracted."
(1.2.5) L0108JD: "Mom seemed tired and very short in her response. Seemed in a rush to get off the phone."
(1.2.6) L0805CK: "Car traffic and fans made it difficult to hear the DVD."
(1.3.1) L0122RA: "Participant's attitude (skepticism and defensive) about current food habits. Didn't feel we could teach her anything and didn't know in what way we would benefit from her participation"
(1.4.1) K0830HG: "I was confused what I was to cover and what this part was and what to do this week."
(1.4.2) L0108JD: "Mom was very short in her responses, not very willing to converse deeper about questions."
(1.5.1) V0926DP: "There are many distractions in this home. The parent also has very limited knowledge about what constitutes 'healthy.' She does ask a lot of questions."
(1.5.2) V0918AL: "Very low reading levels. Older child was able to read most of the material to mom which kept both children mostly engaged. Children enjoyed picking healthy foods to replace sometimes foods; pleasantly surprised mom and dad."
(1.6.1) V1231AB: "The parent forgot I was coming and had to leave for an appointment. So I gave her 1.3-1.6 as homework this week (she didn't tell me she had to leave until we had only 10 minutes left before she had to go)"
(2.1.1) G0311AJ: "Mom and Dad were both present, and both answered questions. The visit was at the SKIP site; child liked to play in playroom while I went over the visit with mom and dad."
(2.2.1) V0710DM: "The coloring pages helped us from being interrupted."
(2.2.2) G1121HD: "It helps to bring an activity for the child."
(2.2.3) L1115LS: "Children were very loud so second nutrition educator distracted the child during the lesson."
(2.2.4) L1124WJ: "Family member took kids to pool, it helped us focus on lesson"
(2.3) V0202AA: "The parent was very engaged and talked freely which helped."
(2.4) V1230CB: "Bringing laptop to watch the DVD segments makes it much more convenient to be able to set up wherever is best for family"
(2.5) V1128KJ: "Mom admired the videos of the parents working together as a united front. Says it's hard to do the same in their household but wants to practice that more!"
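Frequencies like those in Table 25 can be tallied mechanically once each excerpt is tagged with its hierarchical code. The sketch below is illustrative only: the (code, case ID, text) layout mirrors how quotes are keyed in Table 25, but the tallying function is an assumption, not the software used in the study.

```python
# Minimal sketch: tallying coded NERTD excerpts into the Table 25 hierarchy.
# The excerpt layout mirrors Table 25's (level1.level2.level3, case ID, quote)
# keying; the tallying itself is an illustrative assumption.
from collections import Counter

# Each coded excerpt: (hierarchical code, unique case ID, excerpt text).
excerpts = [
    ("1.1.1", "K0625IC", "DVD player would not read DVD."),
    ("1.2.1", "L1124WJ", "Kids were active: in and out of house..."),
    ("1.2.1", "V0827NE", "Her children kept interrupting..."),
    ("2.3",   "V0202AA", "The parent was very engaged and talked freely."),
]

def tally(coded, prefix=""):
    """Count excerpts whose code starts with `prefix` ('' counts everything)."""
    return Counter(code for code, _, _ in coded if code.startswith(prefix))

print(tally(excerpts))                      # per-code frequencies, e.g. '1.2.1': 2
print(sum(tally(excerpts, "1.").values()))  # all barrier excerpts: 3
```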
Appendix I

Coding tree for exit interviews with nutrition educators

Table 26. Coding tree for exit interviews with nutrition educators from each site (site codes: G, K, MSU, VB).

Training on Education Delivery

1. How satisfied were you with the training you received on delivering the lessons? Why?
- Visual and written materials helpful: "It was helpful to have the written as well as the visual because I have never done the program; until you do it, it is easier to understand what they are talking about, but it was great to have the manual too along with the parts" (VB)
- Satisfied: "I was satisfied, I felt like I was reasonably prepared to present the material; it was rocky when I first started doing it. Everyone gets better with practice, but I felt prepared" (MSU)
- Thorough: "I thought it was thorough" (VB)
- Straightforward: "…there didn't seem to be any problems with it, it was pretty straightforward. It wasn't until after when questions came up" (G)
- Discussion of purpose helpful: "I think mainly it was the purpose of the study that we talked about in the training to really help, because the materials, once we opened it up and watched the videos and kind of tinkered with it a little on our own prior to going into the homes, that all fell together for us" (VB)
- Behavioral changes to look for, and the job as an educator, well explained: "when we are going into these homes and what kinds of behavior changes or changes in the home are we looking to make throughout this whole study"; "…as far as just the purpose of what are we doing, why are we doing it and what is our desired outcome. I think that was helpful." (VB)
- Overwhelming amount of material: "I thought it was overwhelming to cover all that material in one setting, but after looking at all the materials, it was like okay…I can teach it fine, it is easy to follow, but the training in general was a little overwhelming." (VB)
- Need more focus on main points of each lesson: "More so what points do we need to hit? What is the big main point we need to get across and what kind of changes should we look for? Basics of each lesson, going through it like that." (MSU)
- Would like to receive material ahead of time to prepare questions and become familiar with it: "I am thinking that maybe if we could have gotten the stuff ahead of time and gotten a chance to look through it and then maybe ask questions that weren't answered at the particular time, that maybe that would have been more helpful" (G)

2. Was one training session adequate? If no, why?
- Prefer more than one training; less overwhelming: "…two trainings would be helpful but maybe not practical, but if we could get the binders ahead of time and instructions to review a topic and watch some of the videos prior to the training…so we are a little familiar with it and it's not brand new and everything all at once" (VB)
- Yes, one training was adequate (MSU, G, K)

3. Is there anything that would have made the training more effective? If yes, what would that be?
- Role play of lesson delivery: "Maybe if we had done a practice round…" (MSU); "…maybe doing some role-playing?" (K)
- More time spent on how to deliver lessons and on forms to be completed after each lesson: "I had more questions on the sequence…I was really confused on the phone interview piece, on what I was supposed to be covering in that phone interview. So more training in the delivery of it…not what was in each lesson but the sequence of what you do in each visit" (K)

4. Relevance sub-construct (ARCS): recall answers
- Item 15 stood out as negative (different from the others): "Reading this now again, that was more like a 1, you know, so a lot of these are answered in the positive until you got to that one (number 15)"
- "Anything" is too exhaustive: "I can see that one because we definitely learned things that are new but also touched on a lot of things we know. It reinforced things we are currently teaching; then again it didn't not teach anything."

Data Collection

1. How did the data collection process go for you? What might improve it?
- Confusion about completing the NERTD after phone calls: "I think I forgot one thing; after the phone calls I wouldn't do them (NERTD). I went blank; I didn't make the connection that I should've, only thinking that we do it at the home" (VB)
- Post PFBQ confusing (intervention vs. control): "I don't think we were informed that there were going to be different forms…I'm not sure I ever got that down, paperwork for control and intervention" (G)
- Acronyms and terminology confusing: "…the abbreviations didn't have any relevance to stick in my head to know what it was 3 months down the road, so it would need to be more relevant…" (K)
- Transmittal of documents to MSU confusing: "I redid it a couple times, because I don't know what happened at the other end…I did everything, sent everything in and had to refax things." (G)
- Overwhelming amount of paperwork: "Overwhelming, very overwhelming…there was a lot to keep track of, and what was supposed to come after each visit was a little bit hard to keep track of and frustrating" (K)
- Improve it by providing a packet with all forms for each participant, broken down by topic: "I think it would have been helpful to stock each folder ahead of time with everything we needed throughout education…stocking each folder right when we got their consent form so we could just grab the folder and go" (MSU)
- Educator made own chart to organize: "It was a matter of getting into the routine of which paperwork they needed at each lesson and whether they were treatment or control and what that meant for paperwork. I think we finally got it figured out…I wish I could have had a little packet of it all together…this is what the family needs, all in one place, or a little chart…I made myself one" (VB)

2. How could directions be improved on when particular forms were to be completed and sent to MSU?
- Provide an outline of which data were to be faxed and/or mailed in: "I think if there was a timeline with every piece of paperwork and when to do it, that would help, because there are just so many things to keep track of"
- Better directions on what to do once a participant has completed: "And maybe the part about when we complete a family, we are supposed to email you and send. I don't know if there would be an easier way to do that. There was a lot of confusion…"

3. Barriers and facilitators
- (+) Preprinted mailing labels: "One thing I thought was really helpful was to have those preprinted labels on the envelope; that was very helpful" (VB)
- (+) Educators created and posted checklists of necessary items and forms to bring: "We made checklists all over the room, so we never forgot anything" (MSU)
- (+) Educator binder: "In our nutrition educator binder we always kept backup forms in case we ever forgot any" (MSU)
- (+) Home visiting in teams: "Going in teams was helpful too" (MSU)
- (+) Study coordinator quickly emailed forms: "As far as getting the forms though, you were great; anytime we needed something we knew we could send a quick email, and if one of us had already gotten the forms we knew we could get it from each other and I liked that, it was awesome! I don't think getting the forms was a barrier by any means" (VB)
- (-) Lack of clear directions on how forms were to be delivered to MSU
- (-) Educator had to print own forms sent via email

Fidelity of Implementation

1. What was/were the largest barriers to being able to deliver the lessons to your families?
- Rescheduling phone call delivery: "For me it was the phone calls; we would kind of set up to call on this day and sometimes they forgot to do it and would ask to call back in an hour or tomorrow. I had that a couple times. One time I'm sure she really did do it, but then there was a little lag of time and she couldn't remember. She asked me to call her back so she could review it." (VB)
- Families ran out of phone minutes, or the phone was disconnected: "As far as phone calls, families would run out of minutes on their phones or the phone was disconnected." (VB)
- Participants did not answer the phone; lack of follow-through/accountability: "I had a lot of participants, quite a few, who wouldn't answer their phones, and I couldn't contact them for weeks, especially with the interview questions. When it is on the phone, it is a lot easier for them not to be there." (VB)
- School starting: "When school started, some of the little ones or siblings were bringing home illness and that made visits difficult"
- Lessons a lower priority due to life issues: "So many other things to deal with…being evicted, trying to find somewhere else to live, so whatever the kids ate, they ate, kind of thing."; "There were quite a few that might have been interested but had so many other issues going on that it made it hard for them to concentrate." (VB)
- Parent felt "above it," not learning anything, a waste of their time: "There were maybe two or three that I think kind of felt like they were above it." (VB)
- Child distracting (wanting attention, lesson not designed for child engagement, parent tending to child's needs): "I think that it is really geared toward the parent, so when the children were there it was sometimes difficult because they were distracting" (G)
- Parent just going through the motions for the gift card: "The attitude was like, what do I have to do?" (VB)
- DVD not working: "What happened with some people is that they couldn't get the DVD to work at one point. I did have a lot of trouble with that." (VB)
- Parent excuses for poor habits (too many kids, no resources, too expensive): "They would be talking about how the world is against them, there are no resources, healthy food is so expensive, they have so many kids, and they can't make it happen, so they won't make it happen." (VB)
- Being a younger educator: "The fact that we're students and a lot younger than a lot of the participants, I don't think a lot of them took us that seriously. We had one participant's mom tell us that we looked like babies, and after that visit we never heard from them again. I think the fact that we were younger was a barrier" (MSU)
- Parents rushing through material: "A lot of the parents weren't really interested or engaged, would rush through it just to get you out" (MSU)
- Social bias: "I think a lot of the times parents would just say what they think we want to hear" (MSU)
- Activities too advanced for some of the children: "Some of the activities are great for, like, 4 and 5 year olds, but for some of the younger kids…getting engaged was a challenge…" (G)
- Parents not willing to engage in the material: "The parent's willingness to engage: are they willing to have a deep conversation, or will they just say yes or no to our questions?" (MSU)

2. What was/were the largest facilitators to being able to deliver the lessons to your families?
- Parent with interest in the material: "Some of them were really on the ball. They wanted to learn it; they were interested in it…quite a few were that way." (VB)
- Material's focus on nutrition and the mealtime environment: "It wasn't just about nutrition and food, it was about the whole area, and I liked that about the curriculum" (G)
- Having DVD clips: "I think that watching the video was helpful" (G)
- DVD showcasing peers, making it relatable: "I think whenever the parents really got involved was when they were watching the video and hearing from their own peers" (G)
- Having a second educator: "The second educator definitely (was the largest facilitator). I did a couple on my own and it was not very easy; you need someone else there" (MSU)
- Not having children present at the lesson: "I could see this presented to parents, like in a parent meeting where we provide child care…I think that would be a great way to present it to parents too" (K)
- Parent watching clips ahead of time (G)
- Parent and educator having separate books (G)

3. If you were not able to deliver the lessons as intended, how did you modify the protocol? 3a. How frequently, or with how many families, did you deviate from the original design? 3b. When thinking of your participants' responsiveness or level of engagement in the lessons, when did you have to omit, modify, or add to the topic's content or activities to better "reach" the participant, and what did you do?
- Delivering all topics as home visits: "Some didn't have a DVD player, so we had to do home visits for everything" (MSU); "All my families did prefer the home visits rather than the phone calls, so I didn't do the phone calls. I went out each time for each topic and did a visit, and it worked very well that way" (G)
- Making lessons shorter to keep the parent engaged: "If a certain section didn't pertain to them, if they didn't have picky eaters, I couldn't spend as much time on that section. So, like ** said, listening to their situation and spending more time on certain issues and breezing over others" (MSU)
- For unengaged parents or parents with a negative outlook, reaching families where they are (small steps): "There were many times that I expanded if I had an unengaged parent. They would be talking about how the world is against them…and they can't make it happen so they won't make it happen. We would talk about options…There are many times we expanded on things…So, in addition to eating healthy, we also talked about other resources that are available and how you can get help and resources throughout the community…It got us off track from the materials, but it seemed to really resonate with the family, so it did help. The level of engagement improved immensely by our last lesson together, so that was really cool." (VB)

4. How would you describe your relationship with your participants/families?
- Established home visitor: participant is more trusting: "I think our participants, since we follow them anyway, we have built trust which is continuing. This would be different than if you were just going in for a five-lesson thing." (VB)
- Initially not trusting, but relationships improved: "They are not people we have been working with, as at the other sites; we meet them just for this study, so we don't have a personal relationship, but surprisingly it becomes personal pretty fast with a lot of them" (MSU)
- Sense of connection outside of the Eat Healthy research (community): "really great relationships built with the families, and now, seeing them again in their community settings, I have had a couple of the parents actually getting more involved with their community" (VB)

5. Would you say it was easier or more difficult, as you progressed, to maintain fidelity to delivering lessons and completing data collection?
- Easier over time: "I think it was easier as it went on; once you got everything clear, it got easier and more clear, what forms and when to wait for the phone calls and everything" (G)

Timeline of Study Progression/Study Design

1. How frequently did you refer to the Altarum spreadsheets? Did you find them helpful? Why or why not?
- Looked at the spreadsheet weekly: "When I got them, I would look at them right away." (VB)
- Initial confusion and lack of understanding: "Sometimes I would be confused, so I would ask ** and she would be very helpful; she showed me how to look at them properly and I would understand them more, but at first it was confusing. Remembering who was treatment and control, that was the main confusing part for me" (VB)
- No explanation of how to read them, the RKM numbers, or waves: "They were confusing; I didn't find them helpful. They dropped one of my participants and would drop the call, and then they were on one week and gone the next week. It was too confusing. I was checking all the time" (K); "the RKM numbers were really confusing, but they changed over time, so we had to call and tell them their different RKM number, and it was really confusing, but it was helpful to keep on looking" (VB)
- Time consuming: "It's very time consuming, but whatever, it's not that bad." (VB)
- Once learned, helpful for tracking participant study progression: "I would definitely say it was a helpful tool, but rather confusing and frustrating at times" (VB)
- Frustration with incorrect coding of participants and RKM numbers: "Useful and frustrating at the same time." (VB)

2. Who did you reach out to when you encountered confusion about the status of a participant or questions regarding data collection?
- Site coordinator: "I always went first to Teresa" (VB)
- Team educators (colleagues): "I felt like we were a little family…we could get help" (MSU)
- Study coordinator: "I think it was either these guys (other educators) or you (Jamie)" (K)

2a. How satisfied were you with the timeliness and responses you received to your questions?
- Great timeliness and responses: "I'd say that was great!" (VB)

2b. Explain how you felt supported by your team and MSU.
- Good collaboration among team educators and the study coordinator to bounce questions off: "I was satisfied; I think the communication was good" (K)

3. Some educators sat in on weekly debriefings between MSU and each site coordinator. If you participated in these conference calls, how effective were the study PI and study coordinator at answering your questions and clarifying what was not clear?
- Very effective, with some initial confusion: "The first time it was more confusing than helpful, but I think that is because all of us were trying to figure out who was doing what…but when we were talking, it was clarified as a list of things to ask, and they were answered." (K)

3a. How helpful did you find these ongoing debriefings?
- Helpful in relating to common struggles: "I thought it was really fun because it was a fun collaboration; we got to have a little bit more time to interact with each other and say, 'Oh yeah, I'm struggling with this,' and they would be like, 'I know how that is.' The phone calls were helpful."
- Motivating: "When we are so busy doing so many other things, it is so easy to let it slide and be like, oh yeah, I need to get back to that…It was a reminder to keep the fire going and keep moving forward." (VB)
- "I thought they were very helpful. I felt that once school started and I got into teaching during the day, I couldn't make some of the conference calls, but I still was able to catch up if I wasn't able to make the call." (VB)

3b. If you weren't present during these calls, how were your questions addressed and by whom?
- Information was effectively relayed by the site coordinator (VB)

Overall Questions

1. How would you describe the amount of time and effort that was involved in working as a nutrition educator for Eat Healthy?
- More time consuming due to paperwork, rescheduling, and competing with the school schedule
- Wasn't much different than regular visits: "I don't think, as far as time, it was different than my regular visits. As far as going there, there was a little extra paperwork, but other than that, and we did it every week vs. every month, which actually was kind of nice" (G)
- Workload was manageable: "A lot of paperwork, but it all made sense; we knew why we had so much paperwork and why it was helpful"

2. Do you have any comments or feedback in general?
- Being able to listen to and validate participants' struggles: "It is definitely important to listen to what they say their issues are and what they don't have issues with" (MSU)
- Excited to do this, but without the paperwork: "I would be excited to do this without having to do all the paperwork part…answering the interview calls and getting notes sent in…" (VB)
- Would have been great for a parent night, due to more open discussion and no children: "I made a comment how this would be an excellent program for parent night, where the kids are not there and they can go home and do it with the kids and come back. I think you would get a lot more open discussion with a group of people, and one-on-one is sometimes good" (G)
- Confusion about study duration: "I think more communication with the participants about what the course of the research was" (MSU)
- Several participants dropped due to difficulty with the external evaluator: "They dropped one of my participants and would drop the call, and then they were on one week and gone the next week. It was too confusing. I was checking all the time" (K)