This is to certify that the dissertation entitled CONSTRUCTION, IMPLEMENTATION, AND EVALUATION OF AN UNDERGRADUATE BIOLOGY LABORATORY TEACHING MODEL presented by TODD M. TARRANT has been accepted towards fulfillment of the requirements for the Ph.D. degree in the Department of Zoology, Michigan State University.

Major Professor's Signature
Date

CONSTRUCTION, IMPLEMENTATION, AND EVALUATION OF AN UNDERGRADUATE BIOLOGY LABORATORY TEACHING MODEL

By

Todd M. Tarrant

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

Department of Zoology

2005

ABSTRACT

CONSTRUCTION, IMPLEMENTATION, AND EVALUATION OF AN UNDERGRADUATE BIOLOGY LABORATORY TEACHING MODEL

By Todd M. Tarrant

This dissertation documents a time series study in which an undergraduate non-majors biology laboratory was revised, leading to the development of a new teaching model. The course model was developed at a large Midwestern university enrolling about 827 students in 32 sections per semester and using graduate teaching assistants as primary instructors. The majority of the students were freshmen and sophomores, with the remainder juniors and seniors. This dissertation explains the rationale leading to the development and implementation of this educational model, which uses graduate teaching assistants as the primary course instructors, and presents embedded course assessment as evidence of its success. The model comprises six major components: learning community, course design, GTA professional development, course delivery, assessment, and the filter.
The major aspects of this model include clear links between instruction, GTA professional development, embedded assessment (student and GTA), course revision, student perceptions, and performance. The model includes the following components: formal and informal discourse in the learning community, teaching assistant professional development, the use of multiple assessment tools, a filter to guide course evaluation, and redirection and delivery of course content based on embedded formal course assessment. Teaching assistants receive both initial and ongoing professional development in effective instructional pedagogy throughout the semester from an instructor of record. Results for three years of operation show a significant increase in student biology content knowledge and in the use of scientific process/critical thinking skills, with mean improvements in student performance of 25.5% and 18.9%, respectively. Mean attendance for ISB 208L is 95% for the six semesters of this study, showing that students regularly attend the laboratory classes and remain in the course, with a completion rate of 93%. Additionally, grade point averages have remained high (mean of 3.3 for ISB 208L) while the cognitive level of questions used on course assignments and tests has increased each semester and assignment weight has remained constant. Students indicate through SIRS data that the course is relevant to their lives, emphasizes understanding of ideas and concepts, and encourages them to think about the relationship between science and society. This study is significant in that it provides a description of a field-tested working model for all aspects of a course that has large student enrollment, uses graduate teaching assistants as primary instructors, incorporates embedded course assessment and ongoing professional development, and has general applicability in being transferable to other courses.

Copyright by
Todd M. Tarrant
2005

ACKNOWLEDGEMENTS

Committee: Dr. James Smith (Committee Chair): Zoology Department; Dr.
Larry Besaw: Entomology Department; Dr. Merle Heidemann: Division of Science and Mathematics Education; Dr. Mark Reckase: Department of Education; Dr. Richard Snider: Zoology Department

This project and its success could not have occurred without contributions from many people. While I laid out the program, many people contributed ideas, support, feedback, and cooperation to make the program work. These people include Dr. Duncan Sibley, who provided time, resources, and feedback to aid in the entire process. Dr. Larry Besaw helped organize the initial lab manual, acted as a resource person to evaluate ideas, and provided feedback on past successes and failures with lab programs. Mr. Gabe Ording helped by coordinating daily lab operations, supervising teaching assistants, and providing feedback and ideas. Dr. James Smith served as my committee chairperson and provided feedback on all aspects of the project, critical review of this document, and the impetus to keep me moving along. Dr. Mark Reckase provided invaluable feedback regarding experimental design, test and assessment construction, and statistical analysis. Dr. Merle Heidemann reviewed laboratory exercises, assessed the quality and difficulty of questions, acted as a resource person, and provided critical review of this document. Dr. Richard Snider provided feedback on course content and experimental design, and reviewed this document. Dr. Fran Eckem, Dr. Joyce Parker, Dr. Diane Ebert-May, teaching assistants, students, and many others helped in editing the lab manual, providing feedback regarding course materials and delivery, grading tests, quizzes, and essays, and with data storage. Marsha Walsh helped the lab coordinators by working with teaching assistants, typing documents, helping students get organized, acting as a university resource person, and performing many other tasks. Amanda Field and Holly Scott helped input and organize data.
The point of these acknowledgements is that it becomes obvious that a project of this scope will not work without the cooperation and participation of a large number of people, the entire learning community associated with the course program. It is also important to note that our Center does not have turf issues that interfere with the design, implementation, or function of the lab program. All parties involved in this program, from the Center Director to the students, feel comfortable providing suggestions and constructive criticism to the coordinators. The fact that all members of the learning community are a part of the process makes the process much more effective and, in my opinion, the program would never have worked without the cooperation and feedback from these people. I would also like to thank all members of my doctoral committee for their ongoing support, suggestions, help, patience, and friendship throughout this entire project.

TABLE OF CONTENTS

LIST OF TABLES ......................................................................................... x
LIST OF FIGURES ....................................................................................... xiii
INTRODUCTION .......................................................................................... 1

CHAPTER 1
BACKGROUND AND RATIONALE LEADING TO NEW COURSE MODEL
    Introduction ......................................................................................... 5
    What is Critical Thinking ..................................................................... 11
    What is Scientific Reasoning .............................................................. 23
    Experiential Learning and Labs .......................................................... 30
    Laboratory Types ................................................................................ 37
    Laboratory Course Models ................................................................. 45
    Conclusion .......................................................................................... 47
        Link to ISB 208L Course Model ................................................ 51

CHAPTER 2
A FIELD TESTED TEACHING MODEL FOR LARGE UNDERGRADUATE NON-MAJOR GENERAL BIOLOGY LABORATORY COURSES
    Introduction ......................................................................................... 54
    Course Design: Objectives ................................................................. 56
    Model Description ............................................................................... 58
        Introduction ............................................................................... 58
    Model Components ............................................................................ 69
        Learning Community ................................................................. 69
        Course Design .......................................................................... 71
        Graduate Teaching Assistant Professional Development ......... 73
        Course Delivery ........................................................................ 74
        Assessment ............................................................................... 75
        The Filter ................................................................................... 76
        Course Evaluation .................................................................... 78
    Methods ............................................................................................... 79
        Overview ................................................................................... 79
    Assessment Description ..................................................................... 82
        Pre/Post Course Multiple-Choice Tests .................................... 82
        Multiple-Choice Control Questions ........................................... 91
        Pre/Post Course Essay Tests ................................................... 94
        Rubrics ..................................................................................... 98
    Results and Discussion ...................................................................... 105
        Pre/Post Course Multiple-Choice Tests .................................... 105
        Pre/Post Course Essay Tests ................................................... 109
        Additional Assessments ............................................................ 111
        Informal Assessment ................................................................ 114
    Conclusions ........................................................................................ 114

CHAPTER 3
GRADUATE TEACHING ASSISTANT PROFESSIONAL DEVELOPMENT: KEY TO PROGRAM SUCCESS
    Introduction ......................................................................................... 127
    Methods ............................................................................................... 136
        Two-day Orientation Meeting .................................................... 136
        Weekly Laboratory Meetings .................................................... 146
        GTA Surveys ............................................................................. 150
    Results and Discussion ...................................................................... 151
        Two-day Orientation and Lab Meetings .................................... 151
        Lab Coordinators ...................................................................... 152
        Administration ........................................................................... 152
        Technology and Facilities ......................................................... 153
        Lab Manual ............................................................................... 154
    Conclusion .......................................................................................... 154
CHAPTER 4
COMPARISON OF PREVIOUS LABORATORY COURSES (ISB 202L & 204L) WITH THE CURRENT LABORATORY COURSE (ISB 208L)
    Introduction ......................................................................................... 162
        ISB 202L and 204L Course Descriptions .................................. 163
        ISB 208L Course Description .................................................... 167
        Laboratory Model Summary ...................................................... 174
    Materials and Methods ....................................................................... 176
    Description of Methods and How Used ............................................. 179
        Attendance ................................................................................ 179
        Course Completion/Enrollment/Withdraw Rate ........................ 179
        Mean Grades ............................................................................. 181
        Course Syllabi ........................................................................... 182
        Test Question Cognitive Level Rankings .................................. 183
        SIRS (Student Instructional Rating System) ............................. 183
    Results and Discussion ...................................................................... 186
        Attendance ................................................................................ 186
        Course Completion/Enrollment/Withdraw Rate ........................ 187
        Mean Grades ............................................................................. 192
        Test Question Cognitive Level Comparison .............................. 193
        SIRS ........................................................................................... 194
    Conclusions and Recommendations .................................................. 199

CHAPTER 5
SUMMARY AND FUTURE WORK .............................................................. 205
    Future Work ........................................................................................ 217

APPENDICES .............................................................................................. 221
    A. Types of Laboratory and Inquiry Experiences ............................... 222
    B. Course Descriptions ....................................................................... 223
    C. SIRS (Student Instructional Rating System) Form ........................ 225
    D. ISB 208L Pre/Post Lab Assessment ............................................. 226
    E. Pre-Lab Assessment ...................................................................... 234
    F. ISB 208L Pre-Lab Assessment ...................................................... 241
    G. ISB 208L Persuasive Essay .......................................................... 246
    H. Sample Pre/Post Essay Questions ............................................... 248
    I. Persuasive Essay Grading Rubric .................................................. 250
    J. Teaching Assistant Job Description for ISB 208L ......................... 252
    K. ISB 208L T.A. Fall Semester 2002 Evaluation .............................. 254
    L. ISB 208L T.A. Evaluation ............................................................... 256
    M. UCHRIS Form ............................................................................... 258

BIBLIOGRAPHY .......................................................................................... 260

LIST OF TABLES

Table I-1  CISGS objectives for undergraduate non-majors general biology education ..........
Table 1-1  Attitudes identified by Ennis (1987) as characteristic of individuals who are using critical thinking .......... 13
Table 1-2  Critical thinking behaviors as described by Swartz (1987) .......... 19
Table 1-3  Behaviors and skills described by Lewicki (1998) exhibited by life-long learners .......... 33
Table 1-4  Questions used to determine the type of laboratory exercise .......... 43
Table 1-5  Summary of effective teaching practices to engage students in learning .......... 51
Table 2-1  The criteria used for the development of the ISB 208L course model .......... 57
Table 2-2  The Filter: a set of questions designed to determine if a laboratory exercise will work within the constraints of the University framework, while achieving the objectives for undergraduate biology education .......... 60
Table 2-3  Bloom's Taxonomy (Modified) .......... 85
Table 2-4  Sample questions used on pre/post course multiple-choice tests .......... 87
Table 2-5  Essay questions used on ISB 208L pre/post course exams to assess student use of critical thinking skills .......... 97
Table 2-6  ISB 208L Pre/Post Course Multiple-Choice Test Results: pre/post course multiple-choice results shown with their corresponding significance as determined using a two-tailed t-test assuming equal variances. All differences between pre and post course multiple-choice tests are significant, indicating substantial increases in test scores between the beginning and end of the course .......... 106
Table 2-7  ISB 208L Pre/Post Course Essay Results: pre/post course essay test results for all semesters where the test was administered. The pre/post course essay test results show significant improvement in test scores, with mean improvement of 19.2%. All scores are reported as raw score followed by percentage score.
The number of students evaluated is shown for each semester as (n) .......... 108
Table 2-8  Shifts in course implementation and procedures that occurred over the course of this study as we worked to improve the course and align it with CISGS educational objectives .......... 123
Table 3-1  Principles of effective teaching as described in "Science for All Americans," AAAS, 1989 (cited in Lawson et al., 2002) .......... 133
Table 3-2  The areas covered in the ISB 208L GTA professional development program (course management, teaching suggestions, pedagogy, and administration) .......... 139
Table 3-3  Graduate teaching assistant survey results for SS03, FS03, and SS04 .......... 142
Table 4-1  Course Format Comparison: comparison of ISB 202L, 204L, and 208L, showing various parameters for each course. Note that ISB 202L/204L were identical in format, while ISB 208L differed in numbers of enrolled students, course focus, alignment with CISGS objectives, types of assessments used, and the inclusion of a lecture as part of the laboratory program .......... 169
Table 4-2  SIRS questions used for course comparison .......... 184
Table 4-3  SIRS comparisons for ISB 202L/204L and 208L: mean SIRS ratings for the six questions of interest for course model comparisons, plus/minus standard deviations. p values shown for two-tailed t-tests assuming equal variances. I used "1" for choice "a" and "5" for "e" answers to questions 9 and 10 to provide consistency in viewing results .......... 185
Table 4-4  ISB 208L/ISP 203L Course Comparisons.
Data obtained from the Michigan State University Registrar's Office. Beginning enrollments are from the first day of classes. Final enrollments are the number of students receiving a grade .......... 190
Table 4-5  Mean grade comparison between ISB 202L, 204L, and 208L .......... 193
Table 4-6  Course Question Cognitive Level Assessment: comparison of mean test item cognitive level for ISB 202L, 204L, and 208L, based on Bloom's Taxonomy .......... 193
Table 4-7  Mean student survey (SIRS) results for ISB 202L/204L Fall 1997 to Spring 2000 and for ISB 208L Fall 2000 to Spring 2003. Significance was determined using two-tailed t-tests assuming equal variances. All values shown are means for each SIRS question plus or minus the standard deviation. *denotes values that are significant .......... 197

LIST OF FIGURES

Figure 2-1  Course Design Model for ISB 208L .......... 59
Figure 2-2  Questions 5-8 are sample higher cognitive level control questions used on pre/post course tests. These questions cover material not directly taught during the course .......... 92
Figure 2-3  Initial assignment and rubric presented to GTAs during a Friday afternoon laboratory meeting for their feedback. GTAs made several suggestions for improving the assignment. The modified assignment is shown in Figure 2-4 .......... 101
Figure 2-4  The modified assignment produced following a detailed discussion with the GTAs, showing incorporation of their suggestions. Note that the rubric was omitted from this version of the assignment, as suggested by the GTAs .......... 104
Figure 4-1  Center for Integrative Studies in General Science Organizational Chart .......... 171
Figure 4-2  Beginning course enrollments for ISB 208L and ISP 203L .......... 191
Figure 4-3  Final course enrollments for ISB 208L and ISP 203L .......... 191
Figure 4-4  ISB 208L and ISP 203L percent course completion comparison .......... 192

Introduction

This dissertation documents the development and evaluation of a course model that arose while trying to improve an undergraduate non-majors laboratory biology program at Michigan State University (MSU). The new laboratory course was designed to align course delivery with the educational objectives of the Center for Integrative Studies in General Science (CISGS). During this process it became clear that a formal process was needed to determine whether changes made in various course aspects translated into increased student achievement. The desire was to test, using quantitative data rather than opinions or feelings, whether changes being made in the course improved student achievement. Therefore, embedded assessments were added to the course to provide the necessary data. In the process of making these course changes it became clear that we were following a series of steps, which are identified in this thesis as the new course model for undergraduate non-majors biology laboratories. This analysis consists of a summary of educational research findings leading to model development, a description of the course laboratory model, assessment results used for model evaluation, a comparison of previous and current laboratory programs, a detailed description of graduate teaching assistant professional development, and recommendations for future research. This project began in Fall 2000 when the Director of the Center for Integrative Studies in General Science (CISGS) determined that the existing undergraduate laboratory courses were not meeting the CISGS objectives (Table I-1) for undergraduate non-majors general biology education.
Table I-1  CISGS objectives for undergraduate non-majors general biology education.
1. To improve student use of critical thinking skills
2. To improve student use of scientific reasoning
3. To improve student abilities to interpret data
4. To improve student understanding of the importance of science to society and their lives
5. To improve student argument formulation

Several educational areas were identified as needing revision to align the courses with CISGS objectives. These included the need to align lecture with laboratory course material, to increase student understanding of how science affects their lives and society, and to improve student use of critical thinking, scientific reasoning, and data to formulate logical arguments. A lab coordinator was hired to develop a new course, based on CISGS objectives, to replace the two existing laboratory courses. The new course was called Applications in Biological Science Laboratory and was given a new course number, ISB 208L. This document describes how laboratory course development led to changes in Graduate Teaching Assistant (GTA) professional development, the use of embedded assessments, the creation of a graduate teaching assistant survey, and a formal means of course evaluation, culminating in a new course model that provides a formal process to drive continuous course improvement. This analysis also documents attempts to assess student learning, future goals, and directions for undergraduate non-majors general biology laboratories. Each topic is addressed separately in the following five chapters.

Chapter 1 provides an overview and rationale for the case analysis. It also provides the theoretical underpinnings leading to course model development, including a summary of relevant literature on teaching process skills and their role in instruction, as well as experiential learning. These ideas were used to design a generic course model that served as the template for the new ISB 208L laboratory course.
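The embedded pre/post assessments described above are evaluated in Chapter 2 with two-tailed t-tests assuming equal variances (see Tables 2-6 and 2-7). As an illustrative aside only, the score data below are hypothetical (not course results), and the sketch shows the pooled-variance t statistic underlying that kind of comparison:

```python
import math
from statistics import mean, stdev

def pooled_t(pre, post):
    """Two-sample t statistic assuming equal variances, the test form
    cited for the pre/post course comparisons in this study."""
    n1, n2 = len(pre), len(post)
    # Pooled variance across the two samples
    sp2 = ((n1 - 1) * stdev(pre) ** 2 + (n2 - 1) * stdev(post) ** 2) / (n1 + n2 - 2)
    return (mean(post) - mean(pre)) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

# Hypothetical percentage scores for one lab section (illustration only)
pre_scores = [48, 52, 55, 50, 47, 53, 49, 51]
post_scores = [70, 76, 80, 72, 69, 78, 71, 75]

gain = mean(post_scores) - mean(pre_scores)  # mean improvement, percentage points
t_stat = pooled_t(pre_scores, post_scores)
print(f"mean improvement: {gain:.2f} points; t = {t_stat:.2f}")
```

In practice a statistics package would also report the corresponding two-tailed p value; the sketch shows only the mean gain and the test statistic.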
Chapter 2 provides a description of the new course model used for the development of an undergraduate non-majors laboratory program. The description includes the rationale for the new model, a detailed description of the laboratory components, and an explanation of how the new model was used to drive continuous course improvement based on assessment. The model is based on the following questions: (1) Can an effective laboratory course be developed using teaching assistants as primary instructors and incorporating embedded assessments to drive continuous course improvement? (2) Do the instructional interventions (a shift from an instructor-centered confirmatory laboratory exercise program to a student-centered inquiry/problem-based program) improve student achievement toward university objectives for undergraduate biology education?

Chapter 3 describes the GTA professional development program used for the ISB 208L course model. This chapter describes the reasons for GTA professional development, the creation and use of a GTA survey form, and how professional development is used to improve classroom instruction. This section addresses the question: Does instructor professional development, used in association with the course model, improve student achievement on assessment tools, showing progress toward achieving CISGS undergraduate education objectives?

Chapter 4 provides an analysis and comparison of the previous laboratory courses (ISB 202L/204L) with the new course (ISB 208L) to determine whether one model better addresses CISGS objectives. The hypothesis for this comparison was that the new ISB 208L course model helped students achieve CISGS objectives better than the previous course models (ISB 202L/204L). Finally, Chapter 5 summarizes the findings from this study and provides recommendations for future work.
It includes an overview of this study, including the background leading to model development, the course model, the conclusions of the course comparisons, and recommendations for future work to expand on this research.

CHAPTER 1
BACKGROUND AND RATIONALE LEADING TO NEW COURSE MODEL

Introduction

This chapter describes the background and justification for changing how we teach biology laboratories, leading to a new laboratory course model. It begins with a discussion of critical thinking and scientific reasoning, the model's link to experiential learning, and a comparison of the types of laboratory experiences used in education, followed by a description of how these elements were combined to build a new laboratory course model. This project arose when the Director of the Center for Integrative Studies in General Science (CISGS) at Michigan State University determined there was a need for an improved general education undergraduate biology laboratory course. He determined that there was no clear link between CISGS objectives and the non-majors biology lecture and laboratory courses, requiring that changes be made to provide a better educational experience for undergraduate students. The Director hired a laboratory coordinator during Spring 2000 to develop a new laboratory course that better aligned with the CISGS objectives for undergraduate education (Table I-1) and to show evidence of student achievement of those objectives. The lab coordinator developed the new laboratory course, Applications in Biological Science Laboratory (ISB 208L), over several semesters and used it to create a laboratory course model. The course model was developed based on the author's experience as an instructor and on educational research and theory, with the goals of better aligning lecture, laboratory, and CISGS objectives and assessing student achievement.
The laboratory model was developed using the CISGS objectives as a guide, along with a review of the available literature defining critical thinking, scientific reasoning, and various laboratory types, and the pedagogies recommended to develop these skills. From this course development the new laboratory model was created, which became the template for future course improvements. The power of this model is that it provides a formal mechanism to obtain assessment data as the foundation for course evaluation, which in turn was used to determine subsequent course improvements. The following is a description and summary of the analysis leading to the ISB 208L course and the subsequent new laboratory course model. Currently, the educational system in the United States is under ever-increasing pressure to improve student education by better preparing students to enter the work force and be informed decision makers. The CISGS objectives closely align with the recommendations of national educational and science organizations regarding the type and level of science education students should receive during their educational experiences. Several areas have been identified as essential for educational improvement, as presented in Project 2061: Science for All Americans, published by the American Association for the Advancement of Science (AAAS, 1989). Employers have indicated that students are being trained to perform recall tasks and to follow directions, but that the most valuable employees are those who can think on their own to solve problems, are self-directed, know how to acquire and use information, and are life-long learners (DOE, 1983; Kyle, 1997; Hobson, 2000; Garkov, 2002; NSTA, 2004). Based on the findings of the National Research Council, AAAS, and other organizations, it is apparent that changes in pedagogy are necessary. The National Science Teachers Association (NSTA) recently published the following summary on-line regarding American educational perspectives.
Americans overwhelmingly believe public science and technology literacy are crucial to the nation's future prosperity and security, but they do not think students are being adequately prepared in these fields. In a recent poll, titled Bayer Facts of Science Education IX: Americans' Views on the Role of Science and Technology in U.S. National Defense, respondents said improvements in K-12 math and science education should be a national priority in a post-September 11 world. In the survey, commissioned by Bayer Corporation as part of its Making Science Make Sense (MSMS) program, 90 percent of respondents said a strong national science and technology capability is critical to U.S. security at home and abroad; 80 percent said science and technology will be "very important" in meeting future terrorist threats; and 75 percent said new homeland security needs will create new science- and technology-dependent jobs for today's students. Support for these beliefs, the poll shows, comes from the underlying notion that science and technology are "critical components" of the nation's military, intelligence, and law enforcement capabilities. "With Americans nearly unanimous in their views that improving pre-college science and math education is a national priority, and that teaching science in a hands-on, inquiry-based way is preferable in today's complex scientific and technological world, they're sending a very strong message to those who have a stake in strengthening the nation's education infrastructure," said Rebecca Lucore, Bayer Foundation executive director and manager of Bayer Corporation's Community Affairs department. "Now it's up to each of the stakeholders (parents, educators, legislators, and business leaders) to get involved, support reform efforts, and see to it that change really does occur" (NSTA Express, 2003).

This survey indicates that there is a need to reevaluate the way in which we teach science at the university level.
The first step in this process is to describe what the educated, scientifically literate university student should be able to do. According to AAAS (1989) and the NRC (1997), students who are scientifically literate should possess the following skills: the ability to think critically, use scientific reasoning, interpret various types of data, use facts and logic to solve problems, formulate arguments, and understand the world in which they live. This last skill helps students be global citizens and practice environmental stewardship. At the university level, students are not regularly using higher-level thinking skills to solve problems. In order to address this apparent deficiency in student skills, it was necessary to define higher-level thinking skills, what a student would be able to do if they were using these skills, and methods to assess student progress toward scientific literacy. There are several current theories and definitions regarding critical thinking and scientific reasoning, and suggested methods for teaching and assessing student use of these skills. The author's own hypothesis, one supported by educational research, as to how students develop higher cognitive skills is to put students into situations of varying complexity in many disciplines. They must be provided with adequate coaching and relevant and timely feedback, forced to critically analyze both their own and others' work, and have the opportunity to fail and repeat exercises or activities until the desired outcome is reached (Posner et al., 1982; Presseisen, 1987; Moore, 1998; Zimmerman, 2000; Hart et al., 2001). This process of learning also requires clearly defined learning objectives and ways to assess student learning. There are more similarities than differences between people in innate abilities to think critically or reason scientifically, but a large difference in willingness or desire to use these skills unless required or provided the motivation to do so (Hart et al.
2001). Some students have an innate urge to understand the world around them, while others have less interest unless it directly influences their lives (Hart et al., 2000; NRC, 2001). This means, as educators, we can teach critical thinking and scientific reasoning skills to all students, but some will excel and others will struggle, due to differences caused by a lack of interest, desire, and/or innate ability (Ennis, 1987; Hart et al., 2000; Taconis et al., 2001). The author's personal observation, one supported by the work of Payne (1996), is that students who come out of families where they have had a wide range of experiences using various thought processes and have been challenged to think on their own are more prone to use higher-level thinking skills. These tend to be families that travel, own businesses, have moved around the world, or have in some other way exposed their child to a range of experiences. Students who consistently use higher-order thinking skills are those exposed to situations where they are asked, forced, required, or encouraged to seek out answers to complex questions on their own, with support and guidance from others (Duit, 1987; Barr and Tagg, 1995; Payne, 1996; Moore, 1998; Dimaculangan et al., 2000; Karplus, 2003). Several studies support the idea that teachers need to provide various methods and situations to aid students in linking course content from multiple disciplines using higher-level process skills (Hart et al., 2000; NRC, 2001). Examples of exposure to these types of thinking experiences include a parent working with their son or daughter to rebuild a car, working in the family store or business, traveling as part of a military family, and living or interacting in a family that socializes with highly educated or diverse people.
In the scenario where the parent allows the child to work with them to rebuild a car, the parent would gradually provide the child with more complex tasks until the child can do the work completely on their own. Another example would be the storeowner who takes their child into the workplace, starts them with simple tasks, and slowly increases the workload and difficulty until the child can run the business. In both of these examples the parent acts as a coach, intervening as necessary to prevent total frustration on the part of the child, while still providing a challenge to allow the child to grow mentally. This may imply that all children exposed to this type of experience will develop good critical or scientific thinking skills; however, many factors can prevent children from fully developing these skills. Other things that affect skill development are their inherent curiosity, willingness to persevere to achieve a goal, desire or willingness to accept a challenge (Hart et al., 2001; Taconis et al., 2001), innate intellectual ability, and developmental level at the time of exposure (Piaget, 2003). If the child is coached through the process of analyzing the situation, allowed to arrive at an answer with encouragement on simple tasks, and the level of the task is gradually increased, the student (child) should develop a wide range of thinking skills. The point is that the instructor, be they parent, teacher, boss, etc., must be aware of the abilities of their pupil and provide tasks of the appropriate difficulty level to allow the pupil to be challenged and successful in completing the task (Hart et al., 2001; Piaget, 2003). Additionally, the student/child needs exposure to a wide range of situations of varying complexity and thinking skill sets in order to fully develop these higher-level cognitive skills (Popp, 1999; Hart et al., 2001; NRC, 2001).
The following section will examine the theories of critical thinking and scientific reasoning as provided in the current educational literature. What is Critical Thinking? Educational researchers and practitioners are focusing on defining, understanding, and determining methods to teach critical thinking and scientific reasoning. Most people recognize the person who can use critical thinking or scientific reasoning, but have a difficult time defining either concept. Many researchers provide a list of functional characteristics to aid in defining both concepts, and some researchers and practitioners actually use the phrases/concepts (critical thinking and scientific reasoning) interchangeably. According to Paul (2003), critical thinking is "the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action." Other authors argue for slightly different definitions, and many of them use varying characteristics or behaviors to explain critical thinking. For example, Ennis (in Heiman, 1987) explains that critical thinking is a set of principles of thought that cross disciplines and are commonly used in any subject. These principles include (1) conflict of interest, (2) the straw person fallacy, (3) denial of consequent, and (4) the ability of a hypothesis to explain an event. According to Ennis (1987), individuals who use "conflict of interest" exhibit the willingness to view a person's claim with greater suspicion than would otherwise be appropriate. The "straw person fallacy" is when a person misdescribes another person's position and then attacks the position as if it actually were the person's position.
Individuals who exhibit the "denial of consequent" show that they can deny the antecedent in an argument, being able to show why the argument does not follow. And finally, a critically thinking person is one showing the "ability to express a hypothesis to explain an event" unless otherwise disqualified by additional evidence. Ennis (1987) indicates that these principles transfer across disciplines and therefore provide bridges to aid students in transferring thinking skills to new areas or situations. He also notes that these principles are often learned within a specific context and many students have difficulty transferring the process to a new situation, especially in a new subject area. Ennis (1987) also makes it clear that students need guided practice in multiple domains to train them in the use of these techniques. Ennis (1987) claims critical thinkers have a set of characteristic attitudes that guide their thought processes. These characteristic attitudes are shown in Table 1-1.

Table 1-1. Attitudes identified by Ennis (1987) as characteristic of individuals who are using critical thinking.

1. Seek a clear statement of the thesis or question
2. Seek reasons
3. Are well informed
4. Use credible sources and mention them
5. Take into account the total situation
6. Remain relevant to the main point
7. Keep in mind the original and/or basic concern
8. Look for alternatives
9. Be open-minded
   a. Seriously consider points of view other than one's own ("dialogical thinking")
   b. Reason from premises with which one disagrees, without letting the disagreement interfere with one's reasoning ("suppositional thinking")
   c. Withhold judgment when the evidence and reasons are insufficient
10. Take a position (and change a position) when the evidence and reasons are sufficient
11. Seek as much precision as the subject permits
12. Deal in an orderly manner with the parts of a complex whole
13. Be sensitive to the feelings, level of knowledge, and degree of sophistication of others

Perkins (1987) explains that students need to be reflective thinkers who internalize thought processes. One way to allow this to occur is to have students talk out loud and have an outside observer help guide their thought process to understand what they are doing. This process allows students to become more systematic problem solvers. Lochhead (1987) makes the following comment: "Novice students, particularly poor ones, do not need to be taught methods which they can only follow in a mindless fashion. Rather they need to be taught to think about whatever problem solving method they happen to choose." Swartz (1987) explains that critical thinking takes in creativity as well as being systematic, describing the importance of a child being able to intuitively combine disparate ideas in novel ways to solve problems or provide critical analyses of situations. Presseisen (1987) indicates that good thinking skills can be taught and include the so-called higher-order cognitive processes of critical thinking, decision making, problem solving, and creative thinking. Presseisen (1987) also indicates that these higher-level cognitive processes cut across disciplines. Ideally the student becomes aware (metacognitive) of the processes used to carry out critical thinking and learns to apply them in all situations where it would be of value. Halpern (1987) writes that metacognition is the highest of the critical thinking skills. In her view, students must develop metacognition and recognize their own limitations in order to develop basic thinking skills. Other researchers, especially educational psychologists, discuss levels of thinking and the growth of cognitive abilities based on a series of levels or stages the individual goes through in order to become a critical thinker.
Snow (1989) describes the first level of cognitive thinking as that of "component processes and skills." These are the basic rules, processes, and skills needed in order to take on a given task. Students need to become aware of these processes in order to improve their ability to use them and make them their own. This is what most educators tell students to do, but educators often fail to provide the instruction and practice in how to go about building the conceptual framework necessary to develop these skills (Ennis, 1987; Hart et al., 2001; NRC, 2001). Thus, educators often teach vocabulary, concepts, or formulas for a particular process but fail to show students how these processes link together or how they should go about thinking of the interrelationships of the parts to the whole. For example, a coach may tell a baseball player to focus on a good swing, but never explain the components of a good swing or how to recognize when the brain and body are not coordinated in order to execute a good swing. In areas such as geometry we often teach the proofs or the formulas for a particular aspect of geometry, such as the importance of understanding that the diagonals of a square are equal, yet fail to show students how this concept applies in real-world contexts. Students may memorize that a square has four equal sides and two diagonals of equal length, and may even memorize the formula for determining the area of the square, but when asked to use the concept of a square for a practical purpose they cannot apply it. If students understand geometry, it is a simple step for instructors to have students apply these concepts to real-world situations. In biology we often show students how carbon atoms cycle through the environment using a carbon-oxygen cycle diagram, but never help students connect this molecular movement to nutrition, photosynthesis, respiration, or other interconnected processes in an understandable way (Pers. Obs.).
Instead, instructors teach these separately and assume students will fit all the interconnections together at some point in their education. Often students must take several science courses before they are exposed to each of these processes and may never connect them to one another (Pers. Obs.). In the case of the non-major, the student may never take enough science to make these connections. Instructors need to provide the links in a simple manner in multiple contexts in order for students to understand the concepts. The more practice students are provided in applying their knowledge to real-world situations, the better they become at applying these concepts to new scenarios. The concepts move from being items to memorize to useful information. Beyer (1987) suggests we need to make students aware of what they are learning by having them recite out loud, to another student or the instructor, what they are thinking as they do the steps in a skilled operation. He indicates that this helps students become aware of how and what they are thinking and allows instructors to guide students in correct thinking patterns. Beyer (1987) also indicates that modeling critical thinking by the instructor is of utmost importance in helping students understand and become aware of what goes into higher-level thinking. The author's personal experiences teaching chemistry, physics, and human physiology showed that it was important to have students discuss, role play, or in other ways work with course materials under instructor direction. Additionally, it is enlightening to ask students to complete parts of an assignment, such as a physics problem, showing the work along with an explanation as to why they chose the analysis path they did.
This type of student-teacher interaction allows one to observe where students are having difficulties understanding complex solution processes or concepts, thus allowing an opportunity to guide students in alternative, but correct, thought processes (Lochhead, 1987; Perkins, 1987). Once students were given guided practice, they were observed doing self-analysis (metacognition) of their solution schemes. This self-reflection brings us back to the idea of metacognition: students must become aware of what they are thinking and how the process works in order for them to improve their thought processes (Halpern, 1987; Snow, 1989). Heiman and Slomianko (1987) compiled educational researchers' observations showing that good learners demonstrate a specific set of behaviors while engaged in complex academic tasks. These behaviors include the ability to break down complex tasks into less complex ideas or parts, asking questions and relating new material to previous materials, developing methods of self-assessment to measure their own learning, and focusing on instructional objectives to direct their studies. One other important aspect of critical thinking is time management. Karmos and Karmos (1987) point out that effective critical thinkers account for the time available to accomplish a task and plan accordingly. Critical thinkers need time for incubation, where thoughts and ideas have time for evaluation to derive an acceptable or best answer according to the individual assigned the task. Additionally, critical thinkers need to have the willingness, mental discipline, and perseverance or tenacity to remain on a task and to endure the frustration of doing iterations of possible solutions until the best solution (in the opinion of the individual) is determined (Pers. Obs.). Many students will not take time to work on a problem because it is too difficult or of no interest to them, and they simply pick a solution, not necessarily the best one (Hart et al., 2001).
Swartz (1987) writes that critical thinking focuses on questions about what we believe and do. He provides a list of behaviors that identifies people using critical thinking skills, but indicates this list is not inclusive. Swartz's (1987) list of critical thinking behaviors is meant to provide an idea of the scope of skills that may be incorporated into the classroom. According to Swartz (1987), people who are critical thinkers should possess the specific abilities listed in Table 1-2.

Table 1-2. Critical thinking behaviors as described by Swartz (1987).

1. Discriminate reliable from unreliable sources of information
2. Make accurate observations
3. Recognize the difference between good and bad reasoning
4. Understand and use evidence and sound reasoning to support conclusions
5. Develop a set of viable standards used for assessment by the individual
6. Categorize, classify, and recognize patterns
7. Be open-minded
8. Consider other points of view
9. Look for other sources of information
10. Possess the critical thinking attitudes and dispositions

Swartz's list of critical thinking behaviors is similar to, but varies from, Ennis's (1987) list (Table 1-1) in several ways. Ennis (1987) and Swartz (1987) both indicate critical thinkers should seek reasons, use credible sources of information, consider others' views, seek alternatives, remain open-minded, have an organized thought pattern, and seek alternative explanations. They differ in that Ennis (1987) indicates the critical thinker should have a clear thesis statement, remain relevant to the original point, take a position and change that position based on the evidence, be as precise as possible, and be sensitive to others' feelings. Swartz (1987) adds that critical thinkers should make accurate observations, recognize good from bad reasoning, develop a set of standards the individual uses for assessment, and possess the critical thinking attitudes and dispositions.
Other researchers add to or modify these sets of critical thinking behaviors. For example, Sanders (1987) provides the following definition of critical thinking: "A precise and useful definition of the phrase is that it includes all the thought processes beyond the memory category. A teacher who offers his students appropriate experiences in translation, interpretation, application, analysis, synthesis, and evaluation can be assured he is providing instruction in every intellectual aspect of critical thinking." Finally, the Center for Critical Thinking provides the following definition of critical thinking: "We understand critical thinking to be purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based. ...habitually inquisitive, well-informed, trustful of reason, open minded, flexible, fair-minded in evaluation, honest in facing personal biases, prudent in making judgments, willing to reconsider, clear about issues, orderly in complex matters, diligent in seeking relevant information, reasonable in the selection of criteria, focused in inquiry, and persistent in seeking results which are as precise as the subject and the circumstances of inquiry permit" (Critical Thinking: What It Is and Why It Counts, 1998). We can summarize the educational researchers' findings about critical thinking by placing their results in three categories: defining, teaching, and assessing. According to these researchers, critical thinkers have the ability to break down problems into manageable parts, relate them to previous material, focus on task objectives, and develop methods to assess their learning (Heiman and Slomianko, 1987).
Ennis (1987) indicates critical thinkers should be able to formulate hypotheses based on credible evidence and to alter the hypothesis as additional information becomes available. Individuals should also demonstrate the ability to transfer these skills across disciplines. He also states that critical thinkers try to remain relevant to the main point, keep in mind the original and/or basic concern, look for alternatives, be open-minded, and reason from premises with which one disagrees without letting the disagreement interfere with reasoning. They should take a position and be willing to change their position when the evidence and reasons are sufficient. Additionally, they should seek as much precision as the subject permits, deal in an orderly manner with the parts of a complex whole, and be sensitive to the feelings, level of knowledge, and degree of sophistication of others. Paul (2003) adds that critical thinkers should actively and skillfully conceptualize, apply, analyze, synthesize, and/or evaluate information gathered from or generated by observation, experience, reflection, reasoning, or communication as a guide to belief and action. Swartz (1987) adds that a critical thinker should be capable of being creative and of using information to analyze novel situations or arrive at novel solutions to problems. Swartz (1987) also notes that critical thinkers should be able to develop a set of criteria for evaluating situations, problems, and hypotheses. Finally, Halpern (1987) and Snow (1989) indicate critical thinkers are individuals who develop self-awareness (metacognition), recognizing how they learn. They also can assess their own learning, recognize their own limitations, and have the ability to express their thoughts. Educational researchers provide a list of techniques that help students develop critical thinking skills. Ennis (1987), Hart et al. (2001), and the NRC (2001) indicate that educators must build the conceptual framework necessary to develop these skills.
We teach the parts, vocabulary, concepts, or formulas for a particular process but fail to show students how these processes link together or how they should go about thinking of the interrelationships of the parts to the whole. Students need practice applying their knowledge to real-world situations. Lochhead (1987) indicates that students need to be taught to think about whatever problem solving method they happen to choose. Students need to see instructors model critical thinking by talking aloud about the thinking process while relating material across disciplines. Instructors need to teach students how to be reflective thinkers who internalize thought processes by talking aloud while an outside observer guides their thought process to understand what they are doing (Perkins, 1987). This is quite an extensive list and is not a useful definition for practitioners. In summarizing this information, it was decided that the working definition of critical thinking in the ISB 208L course model was: the ability to formulate an argument/solution to a problem based on the use of logic, the scientific process, and facts to support one's position. This definition was chosen because it supports the CISGS objectives for student learning, allows us to design appropriate assignments for reinforcement, and can be reasonably assessed. Finally, it was important to assess whether or not the course was actually teaching critical thinking skills. Therefore, worksheet questions, in-class discussion questions, and test questions were written based on the ISB 208L definition of critical thinking. There was an attempt to use evaluations that tested students' ability to formulate arguments using facts and logic. These evaluations also had students apply concepts to novel situations or engage in discussions that encouraged them to use, defend, and argue both their own and others' positions to arrive at acceptable conclusions.
They were also encouraged to make hypotheses or formulate viable solutions to real-world problems under the guidance of the classroom instructors. What is Scientific Reasoning? How does critical thinking compare with scientific reasoning? According to Zimmerman (2000), scientific reasoning is also called scientific discovery and scientific thinking. Scientific reasoning/thinking relates to critical thinking, and depending on the author these processes may be considered synonymous. However, most authors would differentiate between critical thinking and scientific reasoning in the following manner. Scientific reasoning is the process of trying to understand and explain the world in a systematic way, using empirical evidence, facts, or models to formulate explanations, make predictions, or draw conclusions about observations in the natural world. Scientific facts are based on objective reality and require peer verification. The scientific definition of fact and peer verification separates critical thinking and scientific reasoning (Bronowski, 1965; AAAS, 1990; NRC, 1996; Giere, 1997; Hogan, 1999; Wycoff, 2001; Gallucci, 2004). Scientific reasoning is a process whereby an individual examines a situation and systematically works to achieve an adequate solution to explain the situation. Everyone uses scientific reasoning, not just scientists, and they do not always follow the scientific method in the lock-step fashion portrayed in science textbooks. Scientists use many approaches to find solutions to problems or answers to questions. In fact, scientists will use various approaches to help them explain an observation, but all observations and explanations in science require that answers be verified by another person using the same techniques and conditions. Critical thinking could provide a solution to a problem using pure logic or opinions that would not require verification using models or empirical evidence, as is generally required for scientific reasoning.
Scientific reasoning is different from critical thinking in that scientific reasoning has limitations as to what areas it will examine. Critical thinking can be used to address issues that scientific reasoning does not generally address, such as ethical, moral, legal, ideological, or religious questions. These issues often do not have empirical evidence to support a position, solution, or explanation, but rather use thought models, logic, faith, or other types of evidence to arrive at conclusions. Many authors find it difficult to separate scientific reasoning from the process of doing science, generally called the scientific method, which is a process that scientists use in an attempt to understand the world. This process is characterized by a systematic approach whereby the scientist makes observations of the world, attempts to explain them or form predictions as to why or how they occur, and then draws conclusions about the observations based on evidence. The findings are verified by peers, and the results are published. One important point is that the process of reasoning may take place without going through all of these steps. Reasoning is attempting to be systematic in the approach to understanding and explaining the observation. In some cases the scientist may go from the data to the hypothesis (Zimmerman, 2000). Most science textbooks provide a series of steps and indicate that scientists doggedly follow these steps in the same order to arrive at their conclusions. Zimmerman (2000) explains that the scientific discovery process includes both reasoning and problem-solving skills with the ultimate goal of generating a viable hypothesis about a causal or categorical relationship. Scientific investigation includes numerous procedural and conceptual activities, including asking questions, hypothesizing, designing experiments, using apparatus, observing, measuring, predicting, recording and interpreting data, evaluating evidence, and forming or inferring models.
Zimmerman (2000) claims that most of the focus in science has been on two main types of knowledge: domain-specific and domain-general strategies. Domain-specific strategies mean that we teach specific information that fits into our field, often heavily jargon-laden, instead of forming links to other disciplines. Domain-general strategies provide students techniques for functioning within a specific discipline, such as science, but may not apply outside the discipline. Klahr and Dunbar (1988) examined the scientific reasoning used by professional scientists, where they noted strategy differences between theorists and experimenters. Theorists, individuals who take a theory-driven approach, tend to generate hypotheses and then test the predictions of the hypotheses or, as Simon (1986) described, "...draw out the implications of the theory for experimenters or observations, and gather and analyze data to test inferences." "Experimenters tend to make data-driven discoveries, by generating data and finding the hypothesis that best summarizes or explains that data" (Zimmerman, 2000). Feyerabend (1975) states that there is no such thing as the scientific method and that scientists are better described by the following: "...the idea of a fixed method, or fixed theory of rationality, rests on too naive a view of man and his social surroundings... there is only one principle that can be defended under all circumstances and in all stages of human development. It is the principle: anything goes." This means that scientists do not follow the lock-step method often provided in science textbooks to do scientific research. Instead scientists go through a systematic process of iterations of possible hypotheses, observations, or data analyses until they derive an acceptable explanation for the problem being examined.
Lochhead (1987) states that "the scientific process is not the result of strict adherence to a set of rigid logical procedures." Rather, "good science is a search for methods that have the widest conceivable application." "Good scientists do not follow a prescribed method; they discover methods that work." These authors feel that scientific reasoning contains elements of the scientific method. Scientific reasoning should include a systematic study of the world where evidence is used to draw conclusions with regard to observations. Critical thinking encompasses the scientific process as one of its possible elements. Critical thinkers may use scientific reasoning as one of the advanced thinking processes during problem resolution. According to Giere (1997), most people think of scientific reasoning as the kind of thinking used by scientists in the process of making great discoveries. Giere (1997) takes the approach that the layperson does not need to understand scientific reasoning as used by scientists any further than to allow them to understand current information provided in non-technical news sources. Giere (1997) states, "...learning to understand scientific reasoning is a matter of learning how to understand and evaluate reports of scientific findings we find in popular magazines, national newspapers, news magazines, and some general professional publications. This requires very little knowledge of what really goes on in scientific laboratories. And it does not require the kinds of skills that are necessary for laboratory research." Based on this recommendation we reduced content coverage, reduced the focus on teaching the use of laboratory equipment, and moved to laboratory exercises that required greater student engagement in the process of doing science, more classroom discussions, and fewer confirmatory-type laboratories.
Instead, the focus shifted to helping students organize their thoughts, recognize the validity of facts, and evaluate what they observe using critical thinking and scientific reasoning. For example, they do not need to be able to use a microscope in order to recognize a microscopic image, or to perform gel electrophoresis in order to understand the importance of a DNA test. There is a gap between everyday and scientific ways of thinking, and between the processes of reasoning that scientists and nonscientists use to build new knowledge (Hawkins and Pea, 1987; Reif and Larkin, 1991). Klahr and Simon (1999) explain that "although scientists and lay people use similar thinking processes (building inferences, making arguments, critiquing claims), they coordinate the subcomponents of everyday thought processes to different levels of specificity and rigor." This indicates that non-scientists and scientists reason differently, even when given similar analytical situations. Schunn and Anderson (1999) indicate that "such differences in the two groups' reasoning processes are evident even after accounting for the effect that domain-specific knowledge has on reasoning." The two groups, scientists and non-scientists, process and analyze problems in different ways, indicating that educators must help students build the scientific reasoning skills used by scientists if we want them to use these skills. Scientific reasoning is not only used for science; rather, it is a systematic way of understanding and explaining the world. Every educated person should be trained and adept at using scientific reasoning to analyze an observation in a systematic way and to determine a reasonable explanation for an observation, a viable solution to a problem, or a reasonable interpretation of scientific evidence based on valid evidence/data (AAAS, 1989; NRC, 1997; NSTA, 2003). Scientific reasoning is only one part of doing science.
"Doing science" includes making and following procedures, memorizing and using terminology, manipulating equipment and tools, making observations, and disseminating information; these activities are part of conducting professional science, but they are not part of scientific reasoning. Many laypersons have added to the scientific field by using scientific reasoning without following the usual procedures of a scientific study. Dmitri Mendeleev, who developed the periodic table, originally set out to determine the properties of colored glass; though not originally trained as a scientist, he used scientific reasoning to develop the periodic table. Others include Beatrix Potter, who made major contributions to the study of lichens, and Theodore Roosevelt, who helped establish conservation biology and set up the United States National Park system by using his observations to recognize that we needed to manage our resources in order to preserve them. There are two other areas that need clarification: scientific reasoning and everyday/general reasoning. How are scientific reasoning and everyday reasoning different? Webster's Collegiate Dictionary (10th edition) provides the following definition for reasoning: the use of reason; the drawing of inferences or conclusions through the use of reason. Reason is a statement offered in explanation or justification, a rational ground or motive, a sufficient ground of explanation or logical defense, something that supports a conclusion or explains a fact, or the thing that makes a fact intelligible. Therefore, everyday reasoning is broader in scope than scientific reasoning in that it allows a wider range of information or evidence to be used to justify or present an argument (Pazzani and Flowers, 1990). Scientific reasoning requires all evidence presented in support of an argument/solution to be fact (empirically) based. Facts must be based on objective reality.
In everyday reasoning, the information can be presented based on logic or even supposition, so long as it leads toward a logical argument. Thus, everyday reasoning is more encompassing than scientific reasoning, with critical thinking being more general than scientific reasoning and able to encompass both. In summary, there is a difference between scientific reasoning and critical thinking skills, but there is clear overlap between the two processes. Scientific reasoning could best be described as a systematic way of trying to explain observations using evidence obtained through the use of objective reality and peer review. Critical thinking, on the other hand, is a group of processes that allow an individual to use a wide array of information and techniques to arrive at a conclusion about a particular topic. Therefore, critical thinking is a broader, more general group of process skills used for analysis and understanding than is scientific reasoning. The difference is not relevant for the educational practitioner, as many of the recommended pedagogies to teach these skills are the same for both process skills. The main goal for the practitioner is to teach students how to use various analytic skills in all subject areas and in as varied a set of contexts as possible. The wider the range of contexts and levels of difficulty, the better the student will be able to use these skills when needed. The research (Ennis, 1987; Taconis et al., 2001) shows that students need help in transferring thought processes/skills across disciplines. The practitioner must help students bridge this gap through practice, discussion of the process (helping students gain metacognition), and exposure to a diverse problem set (Popp, 1999; Hart et al., 2001; Levin, 2001).
Therefore, this study reviewed the relevant educational literature related to experiential, outdoor, environmental, and study abroad experiences to develop the most effective means of teaching critical thinking and scientific reasoning skills.

Experiential Learning and Labs

Throughout history, people have learned by imitating peers, parents, and masters of an activity or body of knowledge, and through practice while actively engaged in the experience. This form of learning, where the student is actively engaged in the activity, is referred to as experiential learning. Experiential learning in formal education has historically been accomplished via apprenticeships (on-the-job training), internships, student mentorship programs, etc., both formally and informally. Experiential learning is defined by the Association for Experiential Education (AEE) as "a process through which a learner constructs knowledge, skill, and value from direct experiences" (Adkins and Simmons, 2002). Basically, this means that pupils learn by doing (Goldenberg, 2001). According to Freeberg and Taylor (1963, in Goldenberg, 2001), "Through direct experiences with nature, people, objects, things, places, and by actually learning by doing, scientific evidence has shown that the learning process is faster, what is learned is retained longer, and there is greater appreciation and understanding for those things that are learned first hand." An example of an experiential educational setting is a study abroad experience, which provides a program that totally immerses a student in learning by actively engaging in the environment while in a close one-on-one relationship with an instructor. Experiential learning leads to the greatest retention of course content and is the way the most learning takes place (Cash, 1993; personal observation). Many types of experiential education are easy to assess.
Those activities that are skill-based, such as operating a boat, flying an airplane, safely whitewater rafting, or performing proper cardiopulmonary resuscitation, have very clear procedures with specific outcomes, making it easy for an instructor to determine if the pupil has met the performance criteria. Also, many of these activities provide students with a low student-instructor ratio (20:1 or lower), allowing an instructor to critically evaluate a student on an individual basis. When the educational activity moves from knowledge-based performance to higher cognitive level skills, and from a low to a high student-instructor ratio, assessments become more difficult. For example, comparisons between on- and off-campus activities are extremely difficult to make; objectives for programs may not be comparable; and there is often an unclear set of easily quantified objectives, a lack of good assessment tools, or doubt as to whether or not a student actually learned the desired information. Numerous studies have attempted to assess and compare experiential, outdoor, environmental, and on-campus (regular in-class) education as to their effectiveness in initiating student learning (Cash, 1993; Hattie et al., 1997; Lewicki, 1998; Andersen et al., 2001; Goldenberg, 2001; Adkins and Simmons, 2002). Lewicki (1998) indicates that the best way to achieve the greatest learning gain is to use experiential learning. He explains that students learn the most when engaged in as many ways as possible, as described in the following:

A pedagogy of place brings school and community together on a common pathway dedicated to stewardship and life-long learning. It is teaching by using one's landscape, family, and community surroundings as the educational foundation. Significant learning takes place outdoors and in the community. This community expands outward from local landscape and home, to regional realities, to international issues.
In coming to know one's place, one comes to know what is fundamental to all places. Respect and reverence for one's immediate place, land stewardship, gives one respect and reverence for all places (Lewicki, 1998).

Lewicki (1998) states that nature helps to teach by helping students understand how place is important to community, that where and how a student learns is important to what a student learns, and that respect is integral to learning. Lewicki (1998) goes on to describe the goal of education as developing the behaviors, skills, and enthusiasm of life-long learners (Table 1-3).

Table 1-3. Behaviors and skills described by Lewicki (1998) as exhibited by life-long learners.

1. Observe, record, and analyze data, ever evaluating appropriateness, reliability, and validity
2. Exhibit tenacity as a learner
3. Demonstrate effective collaboration skills in pursuit of questions that are pertinent, insightful, and reveal deep understanding
4. Demonstrate a recognition and utilization of dynamic systems and structures
5. Develop the intellectual habits of skepticism and openness
6. Utilize the discipline of deduction
7. Develop the power of intuition
8. Demonstrate the ability to cooperate through a shared dilemma
9. Select problem-solving processes appropriate to a shared dilemma
10. Recognize, allow, and seek alternative problem-solving strategies
11. Draw conclusions independent of authority
12. Tolerate ambiguity and the potential for more than one correct "answer"
13. Develop mathematical relationships based upon empirical data involving multiple variables

Whittmer (in Lewicki, 1998) states, "Experiential learning is the type of learning which has the quality of personal involvement. Consequently, it is more significant and meaningful. Through experiential learning, knowledge is gained primarily from one's own actions, practices, and perceptions.
In other words, learning is acquired through having an experience and talking about that experience." Lewicki (1998) points out that an authentic learning experience has the following qualities: it is fun, provides a sense of belonging, allows opportunities for freedom, encourages the wise use of power, empowers students to select, create, implement, and evaluate their work, provides an experience of community caring and compassion through acts of praise and encouragement, and creates momentum that influences school climate. If these statements are correct, the best way to enhance learning is to engage students in real-world experiences. Experiential learning and study abroad experiences are specifically designed to engage students by placing them in a real-world learning environment. Most of the education students receive does not provide them with real-world experiences in terms of what is being asked of them with regard to content or performance, or in how they should go about arriving at solutions to problems or situations. Students are often not allowed to ask questions, provided real-world problems to solve, allowed to try multiple iterations to find problem solutions, encouraged to think about novel ways to solve problems, or given any of the other contexts in which they will work when they leave the classroom (Hart et al., 2001; Zimmerman, 2000; Posner et al., 1982; Presseisen, 1987; Moore, 1998). The value of experiential learning is indicated by the comments students make regarding their education, such as "class didn't help me at all," "what I learned in school was great in theory but has no real application in the work place," or "I do not see how the material we memorized in class can be used for solving this problem" (Hogan and Maglienti, 2001). The central task of education is to develop students' abilities to think and to become life-long learners. Kuhn et al. (1995) indicate that people who think only with their theories cannot consider alternative theories.
This means that if students have a theory of how something works and are not willing or able to look for alternative explanations, they have closed minds. An emerging alternative to emphasizing what nonscientists cannot do or do not know centers on recognizing that limitations in reasoning processes may be due as much to what people choose not to do. For instance, adults and adolescents who demonstrate sophisticated reasoning skills in one context do not necessarily display them in other contexts, depending on their motivations, objectives, and the task specifications (Klaczynski, Gordon, and Fauth, 1997; Klaczynski and Narasimham, 1998). This leads to the need to provide students with exercises that encourage or force them to consider alternative explanations (theories) to their own and to transfer this understanding to other disciplines. Experiential learning combined with integrative studies helps students transfer these skills across disciplines and examine information from multiple perspectives. The need to engage students in thinking processes and interest them in new ways of thinking points to a need to broaden frameworks for interpreting the results of scientific reasoning studies from skills- and knowledge-based explanations to explanatory frameworks that include contextual and motivational dimensions of cognition (Klaczynski, Gordon, and Fauth, 1997; Klaczynski and Narasimham, 1998). Renner, Abraham, and Birnie (1985a, 1985b) examined ways of making the laboratory an active learning environment (experiential learning) for students and found that discussions are pivotal in engaging students in the thought processes used to derive knowledge. The importance of this finding is that a large number of science teachers struggle with incorporating discussion in laboratory work, yet it is an essential part of helping students develop critical thinking skills.
Students generally enjoy discussions, and this often provides a motivational tool to engage students in the thinking process. Discussions provide a way to help students engage in dialogical thinking, an important component of developing critical thinking skills (Ennis, 1987). Additionally, Watts and Ebutt (1988) found that many students preferred laboratory work that offers them opportunities to direct their own inquiries. Inquiry and problem-based laboratory exercises are designed to allow students to direct their own learning/investigations. Clearly, discussion in concert with experiential exercises is important in helping students to clarify their thinking, understand the importance of scientific inquiry, and appreciate the relevance of laboratory exercises to society and their lives. According to Gunstone and Champagne (1990), students should have small qualitative laboratory talks to promote conceptual change. They also indicate that students should spend less time interacting with apparatus, instructions, and recipes, and more time on discussion, reflection on learning, and active engagement in laboratory exercises. Additionally, time needs to be included in laboratory programs for students to process, through discussion, what is to be learned from the exercise. Hart and colleagues (2000) indicate that students need to understand the aim and purpose of laboratory exercises and how the exercises relate both to other course materials and to other disciplines. The work of many researchers (Hart, 2000; NRC, 2000; Levin, 2001; Duch et al., 2001) suggests that there is a need to change the type of laboratory exercises in which we require students to be engaged.
An analysis of the educational research findings on critical thinking and scientific reasoning indicates a need to examine the types of laboratory exercises used in undergraduate education to ensure they provide the types of experiences that help achieve the objectives desired for undergraduate science education. There are different types of laboratory exercises and course models. The literature often lists exercise types as course models and models as exercises; these may overlap depending on how they are used. The following describes the laboratory exercise types, and the next section describes laboratory course models, leading to the ISB 208L laboratory model.

Laboratory Types

There are four basic types of laboratory exercises, although this number varies depending on the author. The major types are confirmatory (traditional), inquiry, case-based, and problem-based (see Appendix A: A comparison of each laboratory type). The distinction between these types of exercises is often blurred, as they tend to blend into one another as people adapt them to achieve a specific purpose. All of these laboratory exercise types have their value, and which is used in teaching course content depends on the objectives for student learning. Most laboratory exercises currently in use are of the confirmational type and are task oriented. According to research (Johnstone and When, 1982; Hodson, 1990; Edmondson and Novak, 1993; Berry et al., 1999a and 1999b), students describe science laboratory work as dull, boring, unengaging, teacher-centered, and the kind of work whose main objective is completion of the task. If educators are going to improve student learning, and specifically develop higher cognitive level skills, there is a need to change the type of laboratory exercises used in our classrooms. Each laboratory type has its merits, but some work better than others for specific skills.
The following is a description of the most commonly used laboratory types and their strengths and weaknesses, followed by the rationale for the types of laboratory exercises chosen for use in the new undergraduate biology laboratory. Confirmational laboratory exercises are often referred to as cookbook or traditional, as they have definite procedures students follow to complete an exercise/task. These laboratories can be useful when teaching skill-based material, such as the use of a microscope, where students need to master a skill in a prescribed manner. In confirmational laboratories, the instructor helps students rediscover the steps that the investigator who designed the lab, or the original researcher, used to arrive at a known answer. Hart and associates (1997) indicate that students working through confirmational laboratory activities fail to develop critical thinking or higher cognitive thinking skills from the activity. They also indicate that unless students are forced/encouraged/required to engage in the thought process behind the laboratory exercise, they will not understand why they are doing the exercise, and they will not improve their higher-level thinking abilities. These findings show that one of the best methods to engage students in this type of course material is to encourage them to be involved in discussions of the purpose, the aims, and the links between the exercise, course content, and real-world relevance. Inquiry-based laboratories, or inquiry instruction, are activities that require the student to make their own decisions regarding a course of action needed to answer a question or to explain an observation. These activities can be variably structured depending on the objectives of the unit or course.
Inquiry-based laboratories lie along a continuum between confirmational laboratories, which require a student to follow a fairly structured path to arrive at a predetermined answer, and problem-based laboratories (PBL), exercises that are based on real-world problems that are open-ended without predetermined answers. According to the NRC (National Research Council, 2000), "inquiry-based learning occurs when students seek information by questioning and, in doing so, construct new knowledge and resolve issues. Students are involved in the discovery as well as the process leading up to the discovery. It is this involvement that results in understanding and the development of critical thinking and problem solving skills" (NRC, 2000; DiPasquale et al., 2003). In the inquiry approach, students are provided with a problem and a set of skills, methods, or processes, and are directed to find an answer. Usually there is a correct answer in inquiry-based laboratories, but the student is allowed and encouraged to try different techniques/processes/methods to answer the question or arrive at a solution. Depending on who wrote the exercise, inquiry-based laboratories may have elements of confirmational and/or PBL exercises. The main difference between confirmational and inquiry labs is that students do not just blindly follow a series of steps to arrive at a predetermined answer. A skilled inquiry-based laboratory or lesson requires the students to be engaged in the process of doing science, that is, the thought processes involved in science. This requires that students be placed in situations where they must use critical thinking and/or scientific reasoning skills in order to find a problem solution. For most inquiry-based laboratories or educational problems there is a "correct" answer, which the exercise writer is attempting to help students discover.
Additionally, educational researchers indicate that students need to discuss the concepts with one another and with the instructor (Gunstone and Champagne, 1990; Renner, Abraham, and Birnie, 1985a and 1985b), need to be guided through the thought processes, and need to understand the relevance and purpose of the exercise (Ennis, 1987; Popp, 1999; Hart et al., 2000; Levin, 2001; Taconis et al., 2001). According to Levin (2001), "PBL is an instructional method that encourages learners to apply critical thinking, problem-solving skills, and content knowledge to real-world problems and issues." The goal of a PBL exercise is to put students in real-world situations where they must find a logical solution to the problem, supported by valid evidence. PBL is an instructional process in which students are presented with an open-ended problem and provided minimal information as to the path they should follow to arrive at an answer or solution. In PBL, there may not be one correct answer; instead, there may be many "best" answers. Also in PBL, no one involved in the project has a predetermined answer, so student and teacher alike are engaged in finding an answer to the problem posed. Some authors consider PBL to be just a less-directed form of inquiry laboratory. Barrows (1998), who is credited with the definition of PBL, states that for an exercise to be "authentic PBL" it must have the following characteristics (PBL Insight, 2000; Newman, 2005):

1. Be student centered
2. Exercise must be real-world
3. There must be a problem to solve
4. Exercise must self-direct learning
5. Involve collaboration
6. Integrate across disciplines
7. Must allow reiteration
8. Allow time for student reflection
9. Require peer and self-assessment
10. Provide motivation
11. Must be an authentic problem
12. Instructor acts as a facilitator
13. Assessments must match exercise or course objectives
Since Barrows (1998) proposed the PBL pedagogical method, educational practitioners have been modifying the original idea to make it more practical and generally applicable. The fourth type of laboratory exercise is case-based. Case-based exercises are widely used in medical, business, engineering, and law schools to prepare students for the types of situations they will encounter in the real world. Case-based exercises are a form of inquiry exercise that usually provides students with a real-world situation that they must analyze to determine an appropriate solution. Many of these exercises are actual cases that are compiled for students by the instructor, and the solution or solutions are known. Students work through these exercises under the guidance of the instructor, with varying amounts of direction, until they derive an answer to the case. These types of laboratories or exercises usually provide students with a real-world scenario, and the students are asked to analyze the scenario and draw multiple conclusions, write one or more solutions, generate hypotheses, or be prepared to discuss the case during the next class. Case-based exercises usually have fairly well-defined answers that the instructors are looking for, or a specific type of thought process to be used in their solution. Because they are based on real-world situations, there is usually an answer to examine, whether it is correct or not, and then the students can discuss whether there was a better answer. Case studies can be more open-ended, and thus closer to a problem-based exercise, or more directed, and therefore more of an inquiry-based exercise. It is important to have a clear set of objectives for each laboratory exercise, and different types of laboratories are better at teaching different cognitive levels. It is also important to choose the correct laboratory exercise to accomplish the desired objectives.
Therefore, the ISB 208L lab coordinators use the questions listed in Table 1-4 to determine the type of laboratory exercise.

Table 1-4. Questions used to determine the type of laboratory exercise.

1. Is the goal of the exercise to teach a specific skill or task, not to generate thought or creativity?
2. Can students complete the exercise by simply following the steps provided in the laboratory exercise?
3. Can students look up the correct procedure or answer in a text?
4. Is there a predetermined path or answer for the exercise?
5. Is the question or problem open-ended, having multiple right answers?
6. Is the student actively engaged in the thought process, that is, are they being asked to use critical thinking skills or scientific reasoning in order to complete the exercise?
7. Does the problem/observation/solution/explanation require original thought to complete?
8. Are there multiple paths for the investigation or inquiry to follow?
9. Is the problem to be analyzed multidimensional, or does it cross multiple disciplines?
10. Is the problem/exercise based on real-life, novel problems, or is it simply a reenactment of a past researcher's path of inquiry?

"Yes" answers to questions 1-4 would generally indicate a confirmatory exercise, not an inquiry or problem-based exercise. "No" answers to questions 1-4 and "yes" answers to questions 5-10 would generally indicate exercises that are inquiry, case-based, or problem-based. It is important to keep in mind that the determination of the type of exercise depends on the objectives/purpose and execution of the exercise by the instructor. If students are engaged in higher cognitive exercises where answers are not predetermined and paths are not clearly defined, they are engaged in inquiry, case-based, or problem-based exercises. Therefore, it is important to note that all four types of exercises are important in instruction.
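The Table 1-4 heuristic amounts to a small decision procedure. As an illustrative sketch only (the function name, return labels, and boolean encoding are assumptions for illustration, not part of the course materials), it could be expressed as:

```python
def classify_exercise(answers_1_to_4, answers_5_to_10):
    """Illustrative sketch of the Table 1-4 heuristic (assumed encoding).

    answers_1_to_4 and answers_5_to_10 are lists of booleans (True = "yes")
    for questions 1-4 and 5-10 respectively.
    """
    if all(answers_1_to_4):
        # "Yes" answers to questions 1-4 generally indicate a
        # confirmatory exercise.
        return "confirmatory"
    if not any(answers_1_to_4) and all(answers_5_to_10):
        # "No" to 1-4 combined with "yes" to 5-10 generally indicates
        # an inquiry, case-based, or problem-based exercise.
        return "inquiry/case-based/problem-based"
    # Mixed answers: as the text notes, the final determination depends
    # on the objectives/purpose and execution by the instructor.
    return "mixed: judge by objectives and execution"
```

The mixed branch reflects the caveat in the text that the question list is a guide, not a strict classifier.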
Confirmational laboratory exercises provide students who are learning to work with laboratory apparatus, power tools, and mechanical devices, such as cars, planes, or boats, with the steps to follow to use the equipment in a safe and acceptable manner. These types of learning are rote memory activities, not higher cognitive thinking processes. These types of skills may need to be memorized and practiced in the original form and then modified later, after the student gains experience with the process. These memory or low-level cognitive skills very often lay the foundations for more advanced skills, or are used in conjunction with advanced higher cognitive level skills for explanations or solutions to real problems (Ennis, 1987; Feyerabend, 1987; Halpern, 1987; Moore, 1998; Popp, 1999; NRC, 2001). Inquiry, case-based, and problem-based exercises help students engage in higher-order thought processes. These three laboratory types are similar in that they provide students with more open-ended or real-world problems that must be solved. Students are then encouraged to find solutions or develop methods of solving the problems with guidance from the instructor. Problem-based exercises tend to be the most open, in that they give students a real-world problem, little direction as to how to solve the problem, and many possible solutions. Case-based exercises tend to be more structured, with a clearer explanation of what the student is to do to provide an answer. There are often only a few answers to a case-based exercise. Inquiry-based exercises can cover everything from problem-based to confirmational-type exercises, depending on how the instructor writes the exercise. Inquiry exercises allow the student to make some decisions about how they will solve the problem, but there are usually only one to a few acceptable answers to the exercise.
Educators should use any combination of these exercises to provide students the widest range of skills based on their educational objectives. Because the primary educational goal of CISGS is to develop higher cognitive level skills, inquiry and problem-based exercises are used in the ISB 208L course, as educational research indicates these are best for developing these skills. Additionally, because many students are not familiar with inquiry and problem-based laboratory exercises, it was decided that laboratory exercises would scaffold from confirmatory laboratories through inquiry to problem-based over the course of the semester. Educational researchers indicate this laboratory design is effective in developing higher-level process skills (Felder, 1993; Lord and Marks, 2005).

Laboratory Course Models

There are several types of laboratory course models currently used to teach undergraduate science courses. These include problem-based learning; support for lecture; separate from lecture but skills based; a separate course including both skills and lecture; and linked lecture and lab, concept based and taught together (Leonard, 1993; Avery et al., 1998; White, 1998; Barrows, 1998; Lilly and Sirochman, 2000; Nageswari et al., 2003; Harrington, 2005; Lord and Marks, 2005). Michigan State University's Integrative Studies Biology Laboratory is unique in that it is not linked to any lecture course and is designed to teach process skills, not basic science skills. It appears the most common laboratory teaching models are designed to correlate with the lecture class of the same number and with overlapping content. Most of these labs have a series of exercises that guide students through activities that teach basic skills and/or specific concepts. Laboratory courses are often taught without an overall focus or set of objectives for the course, but rather as a set of laboratory exercises that teach students a specific skill or reinforce lecture concepts or material.
These exercises can be any of the four types of laboratory exercises, confirmational, inquiry, case-based, or problem-based, or some combination of all four. A course can also be taught using the laboratory exercises as a guide for the entire course. Some laboratories are constructed to have students do science. These laboratory models have students design, implement, interpret, and draw conclusions regarding a specific exercise or problem. Other laboratory models present students with a series of laboratory exercises that teach a skill, or that do limited inquiry, with laboratory exercises often not linked in any coherent way from week to week or through the entire course. Other laboratory models are some mix of these extremes. The ISB 208L model is designed to have a clear set of objectives to which all laboratory exercises are aligned to help students achieve the desired learning outcomes. The ISB 208L program uses confirmational, inquiry, and problem-based laboratory exercises to develop student process skills and increase student achievement of CISGS course objectives (Glase, 1981; Harris, 1984; Wilson and Stensvold, 1988; Lawson et al., 1990; Glasson and McKenzie, 1998; McGraw, 1999; Sundberg et al., 2000; Brahmia and Etkina, 2001; Henderson and Buising, 2001; Boersma et al., 2002; Russell and French, 2002). The ISB 208L course uses the recommendations of educational research from many of these sources. The research findings suggest there need to be clearly stated objectives for the course. The course should engage the students in real-world problems, and, as Bybee (1993) suggests, it should incorporate the learning cycle. In the learning cycle students are placed into a learning environment where they are engaged through a set activity, explore the topic, offer explanations, elaborate on their findings and the problem, and evaluate their findings to draw a conclusion.
The ISB 208L laboratory design incorporated this cycle into each lesson and into the overall flow of the course. The idea was to use each laboratory exercise to develop the desired skills, and to build the overall process skills by the end of the semester. For example, students worked on the scientific process in labs one and two and then used these skills in the rest of the laboratory exercises in more sophisticated ways each laboratory period. They were required to use all of the process skills and the knowledge obtained from the laboratory exercises to solve a real-world problem on the final exam at the end of the semester.

Conclusion

The review presented in this chapter shows that the most effective teaching occurs when students are provided with work that actively engages them in laboratory exercises that are self-directed, are based on real-world problems, require the use of critical and scientific thinking skills, allow time for self-reflection, and cross disciplines. One goal of science education is to produce citizens who are scientifically literate members of a global community. This means students must understand how science impacts society and their lives and be able to gather and use evidence/information to evaluate the world around them. Further, to be truly scientifically literate (AAAS, 1989; NRC, 2001), students should be able to use higher-level thinking skills, understand data and facts, and formulate arguments or solutions using valid information and logic. If the goal of undergraduate science education is to improve student ability in using higher-level thinking skills to become scientifically literate world citizens, it is necessary to provide effective means of engaging students in the learning process. The analysis of the educational literature produced a list of characteristics for critical thinking and scientific reasoning, and some general principles to follow in developing the use of these skills in students.
I believe a critical thinker to be a person who can consciously make decisions to live effectively in a global world society and environment. This means the person chooses what actions to take, and does not passively interact with their surroundings. The critical thinking individual determines their destiny through careful deliberation and conscious effort, making the best choices as to how they will live. Educators need to provide students the tools with which to make these decisions. Providing intellectual experiences in a wide range of disciplines is one primary reason for Integrative Studies education. Ennis (1987) and Taconis et al. (2001) suggest that the best way to develop students into critical thinkers is to put them into a wide range of intellectually challenging situations and then coach them to find answers or paths to follow to achieve their objectives. Ennis (1987) indicates students have trouble transferring thinking skills across disciplines, meaning that instructors need to provide methods and practice in transferring thinking skills across disciplines. By exposure to complicated problems and issues in multiple disciplines, both in and out of the academic environment, we help students transfer these skills across disciplines, making them into critical thinkers who can effectively function in a global community. In 1916 John Dewey, in his work Democracy and Education, discussed the aims of education, stating, "The aim of education is to enable individuals to continue their education...the object and reward of learning is continued capacity for growth..." If Dewey (1916) is correct, then instructors should develop the educational opportunities and environment that provide students with the tools to continue their learning.
So, what should the educational practitioner do to increase student use of critical thinking and the scientific process, link science to society and student lives, improve student use of facts to formulate arguments, and help students become life-long learners, according to the work of the educational researchers analyzed in this chapter? Table 1-5 provides a list and brief description of practices educators can use to foster learning. To teach thinking process skills (scientific reasoning and critical thinking), practitioners should choose a wide range of activities from multiple disciplines (Integrated Studies), of the appropriate level of difficulty and at various levels of complexity, and then guide students through the activities (Halpern, 1987; Hart et al., 2001; NRC, 2001; Hewson, 2003). The practitioner should model, coach, and instruct students in the various types of analytical skills that could be used to solve or analyze specific problems, while asking them to use these skills, beginning with guided practice and ending when students have attained metacognitive awareness and proficiency in the use of these thinking skills (NRC, 1997; Popp, 1999; Zimmerman, 2000). Content should be used to teach higher-level thinking skills, as skills and content are interconnected and should not be separated (Millar and Driver, 1987). It is also imperative that laboratory exercises be clearly linked to educational purpose and content (Hart et al., 2000). Further, Hart et al. show that students race through confirmational laboratory exercises with more concern for completing activities than for understanding how the laboratory exercise relates to lecture or the real world. They indicate that students need a framework for the laboratory exercise and discussions with the instructor to fully achieve the instructional objectives of laboratory exercises.
This suggests that instructors should use a wide range of laboratory types that meet the desired objectives for student learning. If the instructional goal is for the students to master a lower cognitive level skill, then a confirmational laboratory exercise may be appropriate. However, if the instructional goal is for students to use the highest cognitive level thinking skills, inquiry, case-based, or problem-based exercises that require students to solve real-world problems should be used (Levin, 2000; NRC, 1997; Duch et al., 2000). Experiential, environmental, and outdoor educational research shows the greatest learning gains and long-term retention occur with low student to instructor ratios, engaging students in analyzing real-world examples/problems that cross disciplines, and using multiple teaching modalities. Additionally, this means activities should be student-centered, not instructor-centered, to accomplish the greatest learning gains (Barrows, 1986; Cash, 1993; Popp, 1999; Zimmerman, 2000; Wyckoff, 2001; Stronge, 2002). These types of activities help students connect content with thought processes across disciplines (Ennis, 1987; Lochhead, 1987; Cash, 1993; Hart, 2000).

Table 1-5 Summary of effective teaching practices to engage students in learning.

Effective Teaching Practices
1. Provide students with real-world contexts for information.
2. Present information at the appropriate levels to challenge students, while avoiding excessive frustration by providing appropriate support.
3. Couch or frame concepts in such a way that students can apply them across disciplines.
4. Use content to build higher-level thinking skills.
5. Present information/tasks in ways that force/encourage students to solve problems in their own way using various resources, with guidance from the instructor/mentor.
6. Provide students consistent and clear reinforcement.
7. Provide students the opportunity to do multiple iterations to find a problem solution and to compare and contrast multiple viable solutions to determine the best solution choice for the problem at hand.
8. Encourage students to critique their own and others' work (develop metacognition).
9. Provide students practice in analyzing others' successes and failures.
10. Provide background and exposure to different environments and ideas.
11. Provide time and encouragement to critically review efforts made to analyze or solve problems.
12. Provide training and practice in explaining concepts, processes, and ideas to others in a precise and understandable manner.
13. Provide training and practice in writing for a specific audience.
14. Provide students practice reading, reviewing, critiquing, discussing, and drawing conclusions from written technical materials.
15. Provide exercises that are student-centered. Avoid teacher-centered activities.

Link to ISB 208L Course Model

Table 1-5 shows a list of the kinds of educational practices that educational research indicates should be included in lessons to aid students in their quest of becoming scientifically literate citizens. Therefore, as many of these suggested educational practices as possible were incorporated into the ISB 208L course model to achieve the greatest level of student learning. To implement these practices we needed a program that aligned instruction, course objectives, and assessment. In order to achieve the desired level of student achievement it was necessary to provide students information as to why they are doing an exercise, link it to real-world contexts, provide many means of presenting/discovering information, encourage self-reflection, provide immediate and relevant feedback on progress, and assess student progress.
The new ISB 208L course used as much experiential learning as possible, to create the best learning situation for understanding and long-term retention of course material (Cash, 1993; Adkins and Simmons, 2002; Andersen et al., 2001). Chapter 2 will describe the course model developed for ISB 208L (Applications of Biology Laboratory) based on the educational research presented in this chapter. Based on the educational research summarized here and the author's own experiences as an instructor, a decision was made that future laboratory programs need to provide for the following:
1. Time for qualitative discussions about laboratory exercises to foster conceptual changes.
2. Less time learning to use apparatus, following directions and recipes, and more time discussing and reflecting on what was to be learned from the exercise.
3. More opportunity for students to direct their investigations under the guidance of a knowledgeable instructor/mentor.
4. More time for instructors to help students understand what is to be learned from exercises and how the material relates to society and their lives (objectives).
5. A clear link between laboratory course materials and other disciplines.
In order to make these kinds of changes it was necessary to use more inquiry type laboratory exercises, insure graduate teaching assistants (GTAs) were knowledgeable in both course content and facilitation skills, and develop laboratory exercises that encouraged students to focus on the concepts, with minimal time spent following directions or learning how to use equipment. With these considerations in mind, the ISB 208L laboratory course model was developed.

CHAPTER 2
A FIELD TESTED TEACHING MODEL FOR UNDERGRADUATE NON-MAJOR GENERAL BIOLOGY LABORATORY COURSES

Introduction

This study is important in that it provides a description of a field-tested working laboratory course model.
It examines many aspects of a course that has large student enrollment, uses graduate teaching assistants as primary instructors, bases course evaluation and improvement on embedded course assessment, and provides ongoing professional development. This chapter documents the development of a non-major general biology laboratory course at Michigan State University that led to the creation of a general laboratory-teaching model. This model is based on the author's experience in teaching various courses, educational research studies that provide recommendations for effective laboratory designs, and suggestions from students, GTAs, and faculty. Our course is different from many other laboratory courses in that it is a stand-alone laboratory program, not associated with a lecture. This laboratory course had a mean enrollment of 827 students in 32 sections per semester and used GTAs as primary instructors. Included is an introduction, a description of the course model development and components, evidence of model success, and conclusions. The new course model addresses several issues related to the teaching of undergraduate non-major biology classes. These issues include: the often unclear link between lecture and laboratory topics; the perceived irrelevance of science and course material to student lives; the failure of students to use critical thinking skills, the scientific process, and facts in logical argument formulation; and inconsistencies in course delivery among course sections (AAAS, 1989; NRC, 1997). This course model was developed while attempting to address these issues and assess the effectiveness in accomplishing course and CISGS objectives for undergraduate education. The CISGS objectives (Table 1-1) are based on the recommendations of educational researchers as to what students should be able to do to be scientifically literate (AAAS, 1989; Boggs, 1995; Barr and Tagg, 1995; BSCS, 1995; NRC, 1997; Brahmia and Etkina, 2001).
One of the goals of this study was to determine if the new ISB 208L course, and subsequently the new course model, improved student achievement. The study's primary hypotheses were:
Hypothesis I: Use of the course model leads to improved laboratory course delivery, leading to increased student learning.
Hypothesis II: Use of the course model provides an effective mechanism to continuously improve course delivery and student achievement.
Hypothesis III: GTA professional development improves GTA effectiveness, leading to increased student achievement.
To show that the model was working it was important to determine if the model application helped improve student achievement, that GTA professional development led to improved GTA effectiveness and increased student achievement, that GTA professional development improved teaching competence and reduced course problems, and that the application of the model improved course delivery over time.

Course Design: Objectives

This study began in Fall 2000 with the introduction of the new laboratory program and manual that combined many of the laboratory exercises from the previous ISB 202L (Applications of Environmental and Organismal Biology Laboratory) and 204L (Applications of Biomedical Sciences Laboratory) courses. Initially, a set of course criteria was compiled to develop the course using the CISGS objectives, previous course objectives, instructor feedback, and educational research findings (Table 2-1) (AAAS, 1989; NRC, 1997; Wiggins and McTighe, 2000; NRC, 2001). Once the objectives were clearly defined, this study attempted to determine if the laboratory course was achieving the course objectives by using formal course assessment. There was also a desire to know if teaching methodologies, course objectives, and assessment tools matched.
According to educational research, the greatest gains in learning take place with careful alignment of educational objectives, instruction, and assessment (Angelo and Cross, 1993; Wiggins and McTighe, 2000; NRC, 2001; Stronge, 2002). The design and implementation of a new course was based on a set of criteria provided by the Director of the Center for Integrative Studies. The new ISB 208L course was called "Applications in Biological Science Laboratory" and was designed to address the criteria for undergraduate science education (Table 2-1).

Table 2-1 The criteria used for the development of the ISB 208L course model.

Criteria
1. Increase relevance of material to students' lives.
2. Increase student use of critical thinking skills and the scientific process.
3. Increase student use of logic and facts to formulate arguments.
4. Improve student data interpretation skills.
5. Increase student basic biology knowledge.
6. Stand alone without any intended connection to the non-major general biology lecture courses.
7. Include assessment to evaluate effectiveness in achieving university objectives for undergraduate education and provide a mechanism to drive continuous course improvement.
8. Increase student enjoyment (satisfaction or appreciation) of science.

The development of the ISB 208L course led to the creation of a new course model (Figure 2-1) that has been used to guide continuous course improvement. Based on the course criteria provided by the Director of CISGS, the ISB 208L course attempted to accomplish the following six goals:
1. Provide a clear link between CISGS science course objectives (Table 1-1) and instruction.
2. Provide a clear link between lecture and laboratory content.
3. Insure course content consistency across all sections.
4. Provide emphasis of instruction on process skills (critical thinking, scientific method, use of logic and facts in argument formulation, etc.).
5.
Include methods of assessment to determine the degree to which students achieve CISGS course objectives.
6. Provide a mechanism by which assessment drives continuous course improvement.

Model Description

Introduction

The new course model arose from an attempt to design a laboratory course to improve student learning based on the CISGS criteria for the new biology laboratory course, the recommendations of educational researchers, and the author's teaching experiences. The new generic course model (Figure 2-1) includes the following components: (1) the learning community, (2) graduate teaching assistant professional development, (3) course design, (4) course delivery, (5) assessment, (6) a filter (a set of questions used to evaluate the practicality of suggested course changes), and (7) evaluation. This model was developed while trying to improve the ISB 208L course and became the framework for the ISB 208L course. The most important features of this course model are:
- The use of assessment results to drive continuous course improvement.
- The incorporation of a process to insure an open line of communication among all members of the learning community.
- The incorporation of an extensive teaching assistant professional development program to improve course delivery and consistency of instruction across all course sections.
- The incorporation of a standardized curriculum to improve consistency of instruction across all course sections.
- The use of a filter to insure we can implement any recommended changes. The "Filter" is a set of questions used by the lab coordinators to determine the usefulness and practicality of laboratory exercises (Table 2-2).

[Figure 2-1 diagram: the Course Design Model for ISB 208L, showing GTA Professional Development, Course Design, Course Delivery, Course Evaluation, Assessment, and the Filter surrounding Learning Community Interaction.]
Figure 2-1: Course model for undergraduate non-major biology courses.
The key features of this model include interaction among all members of the learning community to provide feedback as to course effectiveness (center). The items surrounding the learning community interaction form a unidirectional continuous loop. The learning community is central to this model, providing constant evaluation and review of all elements of the model. There is a continuous two-way flow of information between the learning community and each of the model components to analyze any suggested changes based on assessment data, to achieve the greatest gains in student progress toward University objectives for undergraduate education.

Table 2-2 The Filter: A set of questions designed to determine if a laboratory exercise will work within the constraints of the university framework, while achieving the objectives for undergraduate biology education.

Question
1. Does the course component work toward the CISGS objectives for undergraduate science education?
2. What have instructors indicated as important student knowledge?
3. Does this proposed change help produce a scientifically literate individual?
4. Do the logistics of implementing the program based on time, space, numbers of students involved, location, time of day, etc. work?
5. What are the time constraints based on the number of times the course meets, the sequence that makes sense and builds toward the desired outcome, days missed due to holidays, and the start and end of the course?
6. What is the time frame needed to implement changes to the lab manual, obtain materials, and provide adequate instructor professional development?
7. Is the laboratory exercise of interest to students?
8. Are there safety issues associated with the exercise, both the safety in conducting the laboratory and the safety of students traveling to the laboratory or other location where the exercise is planned?
9. Can we assess student progress?
10. Does the laboratory exercise conflict with or violate any University policies?

The model is a loop that includes clear links between CISGS and course objectives, open communication among all members of the learning community, course design, GTA professional development, course delivery, embedded formal and informal assessment, and a filter for activity and course evaluation. The goal was to build the new ISB 208L laboratory course to include a formal mechanism to drive continuous course improvement based on data, rather than on feelings or beliefs. Additionally, we implemented a major change in course focus between the previous laboratory courses ISB 202L (Applications of Environmental and Organismal Biology Laboratory) and ISB 204L (Applications of Biomedical Sciences Laboratory) and the new ISB 208L (Applications in Biological Science Laboratory) course. This change was from course materials that emphasized confirmatory (traditional) laboratory exercise design, where students worked through a series of activities leading to a predetermined answer, to guided inquiry-based exercises, where students began with more confirmational laboratory exercises and progressed through the semester to exercises that required more original thought (critical thinking) with more open-ended, real-life problems (exercise scaffolding). This shift in pedagogy is based on research findings of the AAAS Project 2061 (AAAS, 1989), the NRC 1997 handbook Science Teaching Reconsidered, and other reports showing that students need to be actively engaged in real-world exercises to develop critical thinking and scientific reasoning skills, and to see the relevance of science to their lives (AAAS, 1989; Felder, 1993; NRC, 1997; Lord, 1997; Travis; Magill et al., 1988; King; Leonard and Penick, 2005).
Therefore, the course model combined effective pedagogies from those suggested by educational researchers and from personal experiences teaching in both on-campus classrooms and study abroad programs, with suggestions from the learning community (students, GTAs, and faculty). The pedagogies were incorporated within the practical limitations of teaching large classes on campus using teaching assistants as primary instructors, with all of its inherent constraints. It is important to note that previous ISB courses were content driven, in the traditional sense, with emphasis on content coverage without a clear plan to develop concept understanding. ISB 208L began with the same content emphasis as the previous lab courses, but evolved over several semesters to emphasize process skills. At about the same time, there was an apparent change in the philosophy of the university administration toward the teaching of undergraduate non-major biology laboratories. This philosophical change moved the emphasis in undergraduate science courses from content knowledge to process skills, such as the use of the scientific method and critical thinking skills, and away from breadth toward depth of coverage of specific biology terminology and concepts. The new model was to focus on the improvement of critical thinking skills, use and understanding of the scientific method/process, use of logic and facts in argument formulation, data interpretation, and relevant laboratories with a clear link to the everyday world (personal communication from Dr. Duncan Sibley, Director, Center for Integrative Studies in General Science, and Dr. Larry Besaw, former laboratory coordinator). Educational researchers indicate that the greatest educational gains occur when content is used to teach and develop process skills (NRC, 1997; Lawson et al., 2000; Taconis et al., 2001).
We implemented this philosophical shift in writing the ISB 208L lab manual and other course materials to focus on CISGS objectives for science education (Table 1-1). The ISB 208L lab manual, course materials, and instructor professional development provide a clear focus on using, rather than simply memorizing, information. Additionally, the new course was to demonstrate its effectiveness in teaching these skills through embedded assessments. A lab manual was designed and written to clearly link the course objectives to course delivery and assessment for the new ISB 208L course. It incorporated the instructional methodologies necessary to accomplish a shift in pedagogy from confirmational to inquiry-based laboratory exercises and from an instructor-centered to a more student-centered classroom. A formal GTA professional development process was designed and implemented to provide instructors the skills and tools necessary to implement the desired pedagogical shift in laboratory exercises. One of the major shifts needed in GTA professional development was from instructor/lecturer to classroom facilitator. Chapter 3 provides a detailed description of the GTA professional development program used in the ISB 208L instructional model. A major strength of the ISB 208L course was the inclusion of a process to identify any problems with the course, by using embedded course assessments and feedback from all members of the learning community. All members of the learning community (students, GTAs, and University faculty) were encouraged to provide feedback at any time about all aspects of the course via e-mail, during weekly afternoon lab meetings, on mid-semester surveys, during coordinator visits to classrooms, and through easy access to coordinators due to their proximity to the laboratories and willingness to listen to comments.
Changes to the ISB 208L program occurred every semester from program implementation in Fall 2000, as assessment identified areas needing modification to improve instruction. For example, feedback from teaching assistants indicated students did not understand plagiarism. GTAs were receiving student papers that were almost entirely one long quote with a couple of sentences written by the student. During several laboratory meetings the coordinators and the GTAs determined several ways to help students understand and avoid plagiarism, methods for dealing with students who still plagiarized, and better ways to identify when students were plagiarizing. This effort resulted in fewer students plagiarizing, less confusion as to how to deal with infractions, and better student writing. In the previous laboratory courses, there was supposed to be a clear link between lecture and laboratory content for correspondingly numbered courses. For example, laboratory exercises covered a wide range of environmental science topics in the environmental science lab (ISB 202L). However, many different instructors teach the ISB 202 courses, and they cover varying content. Some instructors teach with a heavy emphasis on entomology, others on plant physiology, and still others on general ecology. ISB 204 and 204L had similar problems, with some instructors teaching general human physiology or anatomy, while others taught parasitology, human nutrition, genetics, or some blend of all these topics. Additionally, laboratory exercises might reinforce lecture content, but were often out of sequence with the lecture covering the same material. Having 12-14 laboratory exercises that align with these varying topics was impossible. The link between lecture and laboratory content was often missing; therefore, the ISB 208L course was made stand-alone, with a clear link between an internal lecture and the laboratory (Gough, 1987; Hart et al., 2000; NRC, 2001; Stronge, 2002).
Students want a clear link between lecture and laboratory (Hart et al., 2000; NRC, 2001; Stronge, 2002; and personal observation). The lack of a clear link between lecture and laboratory was both beneficial and detrimental. It was detrimental in that without the lecture students did not receive background information for the laboratory, requiring the use of laboratory time for concept introduction. The lack of a link between the lecture courses and the laboratory was beneficial in allowing us to design an entire course around university and course objectives without outside influences. This allowed us to assess and change any element of the course at any time to improve course delivery and student learning. It also allowed us to use any laboratory exercise, regardless of content, which is a common constraint in a majors course where students are expected by instructors to have been exposed to a specific content set. The course number was changed to ISB 208L to help students understand that this course was not intended to correlate with the ISB 202 or 204 lecture courses. In addition, at least initially, there was a problem in communicating that the ISB 202L and 204L courses were no longer available and that students would need to fulfill their biology laboratory requirement by taking ISB 208L. Increasing numbers of students enrolled in the ISB 208L course once this confusion was eliminated. A major problem encountered with not having the laboratory course aligned with a lecture course was that students do not receive any background for the laboratory. We found the lack of background required the incorporation of a brief lecture to acquaint students with the concepts they would be studying in the laboratory exercises.
To link lecture to laboratory, Microsoft PowerPoint presentations were developed to provide teaching assistants the necessary theory and background to understand and complete the laboratory exercises and to insure content consistency across all course sections. The desire was to emphasize laboratory activities, so introductory lectures in the ISB 208L course were reduced to less than thirty minutes of the 170-minute lab period. Lectures were reduced in the ISB 208L course based on educational research showing average student knowledge retention rates are greatest when kinesthetic and other activities actively engage students in instruction (AAAS, 1989; Barr and Tagg, 1995; Taconis et al., 2001; Boersma et al., 2002; Hewson, 2003; Jordan, 2003). According to the National Training Laboratories (2003), students show the greatest retention of course materials when they are actively engaged by teaching others. They report that students retain 5% of material they hear in lecture, 10% from reading, 20% from audio-visual presentation, 30% from demonstration, 50% from discussion groups, 75% from practice by doing, and 90% from teaching others. Therefore, educational research findings support the decision to design the laboratory program to emphasize student engagement in activities that foster learning through as many learning styles and methodologies as possible, and that are clearly linked to CISGS objectives for undergraduate education (Gough, 1987; Heiman and Slomianko, 1987; Halpern, 1987; Karmos and Karmos, 1987; Cash, 1993; Gardner, 1993; Barr and Tagg, 1995; NRC, 1997; Hart et al., 2000; Boersma et al., 2002; Brahmia, 2001; Duch et al., 2001; Hattie et al., 2001; Krockover et al., 2001; Manner, 2001; DiPasquale et al., 2003; Hewson and Hewson, 2003). One problem in obtaining course consistency is that GTAs teach differently, have different backgrounds, and therefore emphasize different materials in classes.
Another problem that appeared in trying to maintain consistency across all course sections is that, because sections are offered over five days, students taking a section later in the week have a greater opportunity to cheat by obtaining answers from students who take the course earlier in the week. One way to remedy this is to have five or more versions of each assignment, ideally 20 or more versions so that each is used only once every four or five years. Having this many versions of all assignments helps, but it does not prevent cheating or students obtaining information from former students, and it is excessively time consuming for the coordinators to create. Additionally, to ensure course content consistency, the lab coordinators describe and discuss the objectives for ISB 208L in the context of CISGS objectives for undergraduate education with the GTAs during course orientation and weekly laboratory meetings. This discussion includes the course emphasis on activities, the use of logical argument, the scientific process, and other skills. GTAs receive background on course design and are asked to provide feedback on course effectiveness. This gives GTAs knowledge of what and how they should teach in the course. They are encouraged to review their SIRS form results, when available, to determine whether patterns occur. Course coordinators suggest that GTAs keep doing those things SIRS data indicate they are doing well and modify areas indicated as ineffective. Coordinators also suggest GTAs revise how they conduct laboratory exercises to align with the CISGS objectives. CISGS and course objectives for undergraduate education are regularly discussed to ensure all efforts are clearly focused on achieving them. The discussion of course objectives allows for continuous feedback regarding the course, allowing identification and implementation of modifications to improve overall course design and delivery.
Assessment tools were incorporated into every part of the ISB 208L laboratory program to evaluate progress in achieving CISGS objectives for undergraduate science education. Assessments provided the data for course evaluation and supported the decisions for course improvement. Combining assessment with an open discussion among all members of the learning community allowed identification of mismatches in course objectives, delivery, materials, GTA professional development, and instruction. Once problems were identified, the course was modified to rectify shortcomings in course design. Another, and extremely important, aspect of this discourse was that all members of the learning community took ownership of the course and were willing to provide invaluable suggestions for course improvements. An additional part of continuous course improvement is the incorporation of GTA professional development to ensure GTAs are prepared to teach and can address the objectives of our laboratory course program. GTAs are the backbone of the laboratory program, and if they are inadequately prepared to teach the program, the program will fail to meet expectations. A description of GTA professional development used in ISB 208L is included in Chapter 3, under "Graduate teaching assistant professional development." The initial goals were to redesign and improve the undergraduate non-majors biology laboratory course. Over the course of several semesters it became clear that course design was following a well defined series of steps to ensure that assessments were guiding course improvements. It also became clear that even the assessments needed to be evaluated to determine whether they were providing the desired data to guide course evaluation and redesign of all course elements. This process formed the basis for a formal course model that used assessment to drive continuous course improvement.
The following is a description of the model developed in trying to improve the undergraduate non-majors biology laboratory program.

Model Components

Learning Community

Many elements are included in this model to assure that the entire course program undergoes constant assessment and revision to improve course delivery and therefore student learning. The ISB 208L course laboratory model (Fig. 2-1) is a loop obtaining continuous feedback on all aspects of the course from assessments and all members of the learning community. All elements of the course program are linked, and assessment tools have been utilized or developed, where needed, in order to evaluate the entire program. The outside loop is unidirectional, progressing from one component to the next. All outside loop components are linked to the learning community to allow constant input to direct course improvement along the entire loop and at any time. This model starts by clearly stating all CISGS and course objectives. We communicate all aspects of the course to all members of the learning community in the course syllabus, on Blackboard or LON-CAPA (Internet sites), by GTAs during the first laboratory session, and in the lab manual. LON-CAPA is an online educational database allowing instructors to post resource materials and course information, provide tutorials, and provide assignment scoring and a mechanism for other student-instructor interactions. The next step is that lab coordinators allow and encourage an open discussion of course objectives and methodologies based on student, teaching assistant, coordinator, director, or colleague feelings, perceptions, and assessments of current course procedures, without judging the feedback. Coordinators receive information by formal means from evaluations, such as SIRS forms, exams, quizzes, enrollment figures, teaching assistant evaluations, and discussions with the center director, teaching assistants, colleagues, and students.
Informal means of assessment include e-mails and informal conversations. These discussions lead to changes in the lab course where necessary and practical, in an ongoing attempt to improve the laboratory program. For all of this communication to take place, it is essential that the coordinators be easily accessible to all interested parties, and that they provide positive reinforcement to those who communicate information to them. Both lab coordinators and the center director maintain an open-door policy with abundant office hours, frequently enter laboratory classrooms, and review e-mail messages, SIRS forms, and other information, with prompt response and positive feedback where appropriate. An example of how this feedback loop works is illustrated by a situation that occurred with one of our foreign GTAs. The GTA spoke proper English, but with a heavy accent and at a very high speed, making it difficult for students to understand her. The laboratory coordinator identified this problem through feedback from students and discussed the situation with the GTA. Discussions throughout the semester between the lab coordinator and the GTA led to several solutions, resulting in students receiving excellent instruction, as indicated on this GTA's SIRS forms and by comments students made to the coordinator at the end of the semester. The GTA felt that she had received positive mentoring, which helped her become a better instructor and more comfortable in teaching classes. This feedback illustrates how the learning community interacted to resolve a problem, allowing for improved instruction, the development of a better instructor, and improved student learning.

Course Design

The ISB 208L course was designed to focus on CISGS objectives for undergraduate education. Review of course model components occurred at every step of the design process to ensure laboratory exercises, embedded lectures, and assessments helped students work toward achieving CISGS objectives.
The ISB 208L course underwent several design changes since implementation to better align it with CISGS objectives. The data provided by the assessments were used to evaluate the course and to rewrite, adapt, or redesign course delivery to increase course effectiveness in achieving CISGS objectives for science education. An example of how this interactive process designed into the model worked is illustrated by revisions made to the course lab manual over several semesters. The first lab manual was published locally, due to the limited time allotted for development of the new course and to allow for ongoing course revisions. Eventually, a custom publisher began publishing the lab manual to improve its quality. The original lab manual (Fall 2000) attempted to include all of the topics covered in the previous laboratory courses in one semester. The original lab manual covered so much material that it prevented the desired interactions between students and GTAs from taking place, as the students rushed to complete the exercises (SIRS written responses, GTA comments in laboratory meetings). Additionally, GTAs indicated they were unable to adequately introduce and summarize the laboratories, which is an essential part of our program and recommended by educational research findings (Hart et al., 2000). There were several reasons for the breadth of lab manual coverage. These included the need to ensure coverage of all course content, keep students engaged in the laboratory exercise, improve student use of critical and scientific reasoning, provide students practice in data interpretation and argument formulation based on facts, and prevent students from becoming bored or confused during the lab session. Several problems occur when laboratory exercises use less than the entire laboratory session.
These problems include students not asking questions, knowing that they can leave if they finish early, and GTAs rushing to finish the lab by not engaging students in discourse so they can leave to do their own research work (personal observation). SIRS data and informal comments from students and GTAs indicated a need to further clarify course objectives. Based on these assessments, the lab manual was redesigned to begin with a more confirmational approach to science laboratory courses and scaffold to the less confirmatory approaches of inquiry and/or problem-based exercises. The ISB 208L lab manual was edited each semester to improve alignment between CISGS objectives and laboratory content, to incorporate any suggested changes for the next edition, and to provide an academic challenge to students without being overwhelming. There were two problems with this constant revision process: it was difficult to prevent typographic and content errors in the lab manuals, and rewriting the manual was time consuming.

Graduate Teaching Assistant Professional Development

The course model incorporated extensive GTA professional development to ensure the GTAs are prepared to teach the laboratory program. GTAs received professional development in pedagogy, course organization, lab manual features, the course syllabus, lab room arrangements, supplies, and the other resources, support, and help available to aid faculty, staff, and students. Additionally, they were provided extensive training in the use of current educational technologies, such as computers, overheads, computer projectors, Microsoft PowerPoint presentations, and specific biological and chemical equipment. The GTA professional development is described in detail in Chapter 3.

Course Delivery

Course delivery is viewed as the final step where course model elements come together to provide students the most seamless and clear presentation of course material.
It is our belief, as Bill Spady stated, that students should engage in "mastery learning, not mystery learning." This means students should not have to guess what the objectives of class/course work are, wait for teaching assistants to discover how to use equipment or look up general content, or feel they cannot get answers to questions regarding course requirements, university policies, etc. (NRC, 1997; Lewicki, 1998; Hart et al., 2000; NRC, 2001). Educational research findings indicate that a course should be clearly organized for the students, whereby they attend class, do their work, receive timely feedback on their performance, and are evaluated in a fair and consistent manner. At the same time, students must be held accountable for attendance, doing quality work, and learning to be self-sufficient and resourceful, while feeling free to comment and interact with all course delivery persons to help educators provide them with a high quality education. This means course delivery is a complex process using all of the other elements included in this course model. Course delivery is a well-orchestrated process whose beginning step is to ensure teaching assistants are prepared to teach the course. A typical laboratory session began with GTAs passing back student work and explaining the objectives of the laboratory session. GTAs then gave a brief Microsoft PowerPoint presentation to introduce the concepts being studied during the class period. Students were then instructed to complete the laboratory exercise under the guidance of their GTAs, who acted as facilitators to help students complete the laboratory exercises, but without giving them definitive answers. The GTAs ended each laboratory period with a brief summary of the day's activities, emphasizing exercise objectives, answering student questions, and setting the stage for the next week's laboratory activity.
In addition to this basic routine, students may take a short quiz, write a short paper, or participate in a discussion of a current event. The entire laboratory program was arranged to build concepts from the beginning to the end of the semester. The ISB 208L course focused on the CISGS objectives for undergraduate education and worked to thread them throughout all laboratory activities over the entire semester. The semester begins with students collecting data, observing, and positing hypotheses. Later laboratory exercises focused on experimental design, drawing conclusions based on data, and technical writing to explain results. Students were given quizzes, writing assignments, and exams to assess their progress toward achievement of CISGS objectives.

Assessment

The objectives of redesigning the laboratory course were to improve the quality of instruction, using assessment to provide feedback to show course effectiveness in student achievement. The assessment provided the data for course evaluation and was used to guide continuous course improvement, as suggested by Eberly (2001) and others. Educational research findings support that proper alignment in all aspects of instruction, from objectives through course materials and instruction to assessment tools, results in the highest quality education (AAAS, 1989; Wiggins and McTighe, 2000; NRC, 2001; Stronge, 2002). In order to accomplish this alignment it was necessary to develop effective and usable assessment tools. Course assessments included pre/post course multiple-choice and essay tests, SIRS, GTA surveys, mean grade point averages, percent course completion, and attendance data. Course evaluation directed changes in GTA professional development and course delivery to improve student performance in subsequent semesters. The pre/post course multiple-choice and essay test results and the GTA survey results are described in Chapter 3 in support of the ISB 208L model.
The other assessments are described in Chapter 4, in comparing the previous laboratory course models with the current model.

The Filter

The filter is a set of questions developed to determine the applicability of laboratory exercises to the ISB 208L course, covering course logistics, space in rooms, laboratory arrangements, technological requirements, alignment with CISGS objectives, time, safety, and other issues that affect laboratory operations (Table 2-2). All laboratory exercises were reviewed using this filter to determine whether they met the requirements for effective laboratory exercises. The filter helped identify potential problems, needs for professional development, safety issues, course delivery concerns, and the assessment and alignment of material with course objectives. The filter helped determine what, how, and when to teach a particular item, and how to assess what was taught. It provided information about the applicability of laboratory exercises based on all of the constraints inherent in a laboratory program of this size and complexity. The filter is a mechanism used to proactively manage possible problems with laboratory exercises, and it helped ensure they met our goals for the course and operated within the constraints of the university environment. Even a small problem can lead to hundreds of e-mails from students asking for clarification of a situation. The filter helped identify and guide changes in laboratory exercises. The problems with this process are that it takes time, effort, an adequate assessment tool, and flexibility on the part of all participants. It is important not to make large changes to the program during the semester, because no matter how the coordinators tried to accomplish dissemination of information, part of it is lost in translation to the graduate teaching assistants, leading to confusion among both GTAs and students.
Table 2-2 provides the filter questions used by the coordinators to determine whether an exercise would work within the university framework. The filter is the final step before implementing course changes in the teaching model. To improve a large laboratory program it is important to use the "filter" and feedback from all members of the learning community to avoid pitfalls in course design or delivery. In some cases, a laboratory exercise was selected that would achieve many of the objectives of undergraduate education, but failed to meet logistic, safety, or other concerns, making it unacceptable. Some of these laboratory exercises are useful with smaller classes or under a different set of constraints. For example, GTAs suggested the course use a comparative behavior or anatomy study of organisms to be done at the local zoo. The laboratory seemed to meet the CISGS objectives and could be assessed, but failed to overcome logistic problems in transportation, safety, and student engagement. The lab was rejected based on all of these considerations for our purposes.

Course Evaluation

The final part of this model loop is course evaluation, where available assessment information is used to determine course effectiveness. During this evaluation, the assessments are examined to identify elements that are working and those in need of revision to improve their effectiveness. Assessments included pre/post tests, GTA surveys, SIRS, grade averages, enrollment, withdrawal rate, and comments from students, GTAs, and other members of the learning community. Course evaluations directed changes in course delivery/materials, teaching assistant professional development, assessments, or communication procedures to improve the overall program. All assessment and analysis were openly discussed with the learning community as to time, costs/benefits, logistics, educational research, university policies, and other concerns.
Proposed changes that arose from this discussion were incorporated into the course design to improve overall course operation. The following example illustrates how the assessment feedback was used to modify our course. A problem was identified in GTA grading of student essay answers when reviewing scores given on student work. Some GTAs had given scores as low as 20 and others as high as 45 out of 50 points on essays for the same student paper, which is an unacceptable spread for scoring student work. In addition, students complained about the inconsistent grading to the laboratory coordinators. Further, there were comments from students on the SIRS forms that they did not feel the written assignments were clearly explained, resulting in confusion as to what they were supposed to do to complete the assignment and how they would be graded. Based on all of this assessment data it was decided that changes were necessary. Therefore, a discussion was added to the GTA professional development program on the use of rubrics in scoring papers. When GTAs discussed and practiced scoring student work using examples of excellent, average, and poor student papers, it became possible to obtain scores for student papers that were within ±5 points. Students were also provided rubrics at the beginning of the assignment. Providing rubrics helped GTAs determine the best ways to explain the assignment, the grading criteria, and the reason the assignment was important to students. The process of course evaluation identified the mismatch, allowing formulation and implementation of acceptable approaches to improve course delivery.

Methods Overview

The ISB 208L course model provided a mechanism to improve the teaching of higher-level process skills by using course materials, GTA professional development, assessment, and an open line of communication to discuss methods of instruction to accomplish a pedagogical shift to higher cognitive level instruction.
Mechanisms to teach process skills and assess student learning were incorporated throughout the course and are continuously evaluated to determine whether the ISB 208L course is effective in helping students achieve CISGS objectives (Table 1-1). These assessments provided the data to support decisions for course improvements. Combining assessment with an open discussion among all members of the learning community allowed identification of mismatches in course objectives, materials, and instruction, allowing modifications to rectify shortcomings in course design. The ISB 208L course used a combination of laboratory exercises that scaffold from confirmational exercises at the beginning to inquiry and PBL-type exercises at the end of the course (Duch, Groh, and Allen, 2001; Levin, 2001). See Appendix A and Chapter 1 for descriptions of these laboratory types. The CISGS objectives (Table 1-1) for undergraduate education most closely aligned with the inquiry and PBL-type laboratory exercises, where students receive a real-world problem, situation, or case, are given as little background information as necessary to describe both the problem and the desired outcome, and are then asked to develop one or more answers/solutions. Instructors acted as facilitators to help guide students to solve problems. A major characteristic of the type of problems used in PBL is that they usually have multiple solutions, and therefore students must analyze the situation using all available information to decide on the best solution. Students develop answers to the problem using facts, logic, the scientific process, and critical thinking skills. It was important to determine if students were achieving the educational objectives set for them.
Therefore, assessment tools were incorporated into instruction to allow immediate feedback to students regarding their progress toward these objectives, as well as long-term evaluations allowing for continuous course improvements (Facione, 1990; Angelo and Cross, 1993; Cash, 1993; Loacker and Mentkowski, 1993; Hendricks, 1994; Brookfield, 1997; VonSecker and Lissitz, 1999; Hart et al., 2000; Heady, 2000; Wiggins and McTighe, 2000; Zachos et al., 2000; Heady, 2001; NRC, 2001; Ostiguy and Haffer, 2001; Russell and French, 2001; Haladyna et al., 2002; Lawson et al., 2002; O'Sullivan and Cooper, 2003). There is no one best type of assessment to use for student achievement and overall course evaluation. All assessment tools have their strengths and weaknesses. Most large institutions tend to rely on multiple-choice objective tests, due to large class sizes and the requirement to rapidly report grades at the end of a semester. Angelo and Cross (1993) describe various forms of assessment tools and descriptions for their use. Other authors describe assessments developed for laboratory, lecture, discussion, experiential learning, internships, and other educational situations (Cash, 1993; Lewicki, 1998; Lawson, 2000). All of these authors emphasize the necessity of aligning assessment with course objectives to ensure the assessments actually measure student progress toward educational objectives. The assessor should have a clear reason for doing the assessment and know what information they desire the assessment to provide. These reasons may include a need to determine what students have gained from taking the course, to determine if the course is achieving the educational objectives for student education, or to help guide alterations in course delivery to improve the course in subsequent semesters. This work has progressed in steps over the course of three years, beginning in Fall of 2000.
It is a work in progress, undergoing continuous improvements/modifications as we learn more about how to build and use assessment tools. This study included a variety of assessment tools in an attempt to determine the effectiveness of the ISB 208L course model. Where tools did not exist, we acquired or developed what we thought were effective means of assessment. We chose to use the following assessment tools based on information available in the educational literature (Loacker, Cromwell, and O'Brien, 1986; Loacker, 1988; Angelo and Cross, 1993; Loacker and Mentkowski, 1993; Heady, 2000; Heady, 2001; NRC, 2001). The tools used in this study were pre/post multiple-choice and essay tests, question cognitive level, rubrics, SIRS data, GTA surveys, grade point averages, rate of course completion, attendance, enrollment, and course syllabi. Pre/post course tests, question cognitive level, and rubric use are discussed in detail in this chapter; all other forms of assessment are covered in Chapter 3 (GTA survey) or Chapter 4 (SIRS, grades, course completion, attendance, enrollment, and syllabi).

Assessment Description

Pre/Post Course Multiple-choice Tests

During implementation of the ISB 208L course program, pre/post course multiple-choice and essay tests were added to student assessment to determine progress in helping students achieve CISGS objectives. The original test was a set of 30 multiple-choice questions drawn from semester exams, quizzes, and worksheets, and from the previous laboratory courses' (ISB 202L/204L) exams. The pre/post course multiple-choice tests were a collection of multiple-choice questions randomly picked from a pool of fifty questions each semester and given to the students on the first day the lab meets. Students took the pre/post course tests with minimal directions, given only instructions to do their best in completing the test. See Appendices D-H for examples of multiple-choice pre/post-tests used.
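The random draw described above (a fresh selection from the fifty-question pool each semester) can be sketched as follows. This is an illustrative sketch only: the question IDs are hypothetical, and the draw size of 30 is an assumption based on the original test length.

```python
import random

# Hypothetical question pool of 50 IDs, as described in the text.
pool = [f"Q{i:02d}" for i in range(1, 51)]

def draw_test(pool, n_questions=30, seed=None):
    """Randomly select one semester's test questions, without replacement."""
    rng = random.Random(seed)
    return rng.sample(pool, n_questions)

fall_test = draw_test(pool, seed=2000)
print(len(fall_test))                          # 30 questions drawn
print(len(set(fall_test)) == len(fall_test))   # no duplicate questions
```

Seeding the generator makes a semester's draw reproducible; sampling without replacement guarantees no question appears twice on the same test.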
The multiple-choice pre-course test addressed knowledge, often vocabulary and simple content questions. This portion of the pre-course test provided an indication of what discipline-specific information students knew coming into the course. We gave the students the same questions at the end of the semester, embedded in the final course test. Students do not keep the pre-course tests and are not provided with test questions or answers, preventing them from studying the answers to the questions used on the pre-test. The test questions used in the three laboratory courses (ISB 202L, 204L, and 208L) were compared to determine whether they assessed student learning of CISGS objectives for undergraduate education (Table 1-1). To address critical thinking, the scientific method, and the use of logic, course materials and assessment questions needed to be at the higher levels of Bloom's scale of question cognitive level (Bloom and Krathwohl, 1956). We compressed Bloom's six cognitive levels into three to simplify the process of analyzing course materials (Table 2-3). The lowest cognitive level (one) comprises straight recall questions (knowledge and comprehension), application/analysis questions are level 2, and evaluation/synthesis questions are level 3. The desire was to determine if course test questions were at the appropriate cognitive level. To examine test item question cognitive level among the three courses, questions were randomly picked from the 50 questions used on exams and quizzes from each of the three courses. A chart using an Excel spreadsheet was constructed that listed questions previously and currently used in the courses. Two groups of instructors were given copies of this chart for question cognitive level evaluation and asked to rank all the test questions.
The two groups consisted of three faculty not connected with any of the three courses and the ISB 208L GTAs, but who are familiar with using Bloom’s taxonomy of question cognitive level to rank test questions. The average question rankings were used as the final indicator of question cognitive level. The variation in ranking within the sample to that between rankers was evaluated to determine if there were any inconsistencies in question ranking. All current GTAs were asked to do the same ranking as the professional educators, after providing them an explanation of Bloom’s taxonomy and practice ranking questions. The two groups determined that our Fall 2000 test questions had an approximate mean level of 1.2-1.4 on a three-point scale. To address the CISGS objectives for undergraduate science education (Table l- 1) coUrse materials and assessment questions needed to be at the higher levels of Bloom’s cognitive scale of question difficulty (Bloom and Krathwohl, 1956). Table 2-3: Bloom’s Taxonomy (Modified) Level Competence Skills demonstrated Level 1 Knowledge and o Observation and recall of information comprehension 0 Knowledge of dates, events, places 0 Knowledge of major ideas a Mastery of subject matter Question cues: list, define, tell, describe, identify, show, label, collect, examine, tabulate, quote, name, who, when, where, etc. Understanding information Grasp meaning Translate knowledge into new context Interpret facts, compare, contrast Order, group, infer causes a Predict consequences Questions cues: summarize, describe, interpret, contrast, predict, associate, distinguish, estimate, differentiate, discuss, extend Level 2 Application and 0 Use information analysis o Use methods, concepts, theories in new situations . 
Solve problems using required skills or knowledge Question cues: apply, demonstrate, calculate, complete, Illustrate, show, solve, examine, modify, relate, change, classify, experiment, discover Seeing patterns Organization of parts Recognition of hidden meanings 0 Identification of components Question cues: analyze, separate, order, explain, connect, classify, arrange, divide, compare, select, explain, infer LGVGI 3 SWWOSIS and 0 Use old ideas to create new ones evaluation . Generalize from given facts 0 Relate knowledge from several areas 0 Predict, draw conclusions Question cues: combine, integrate, modify, rearrange, substitute, plan, create, design, Invent, what if?, compose, formulate, prepare, generalize, rewrite Compare and discriminate between ideas Assess value of theories, presentations Make choices based on reasonable argument Verify value of evidence . Recognize subjectivity Question cues: assess, decide, rank. grade, test, measure, recommend, convince, select, judge, explain, discriminate, support, conclude, compare, summarize From “The Classification of educational Objectives: Handbook I”, cognitive domain. New York; Toronto: Longmans, Green 85 Table 2-4 (numbers 1 and 2) provides examples of the lower cognitive level general knowledge multiple-ch0ice questions used on the pre/post course exams from Fall 2000 through Spring 2001 in the ISB 208L courses and the previous ISB 202L/204L course models. Table 2-4 (numbers 3 and 4) show samples of higher cognitive level questions used in the lSB 208L course from Fall 2001 through Spring 2003. 86 Table 2-4 Sample questions used on pre/post course multiple-choice tests. # Examples of lower cognitive level questions 1. Which one of the following shows the products of respiration? a. glucose, carbon dioxide, and water b. glucose, oxygen, and energy c. carbon dioxide, energy, and water (I. carbon dioxide, energy, and oxygen e. none of the above is correct 2. Which one of the following best describes a biome? a. 
the place where organisms live.
b. an area that has specific communities on a continent with similar climatic, physical, and chemical parameters
c. an area where one is most likely to encounter a specific type of organism.
d. the entire living area of the earth.
e. the largest area that has the same types of organisms present.

Examples of higher cognitive level questions

3. If you and your mate are both heterozygous for cystic fibrosis, and cystic fibrosis is a homozygous recessively expressed trait, what is the chance of you and your partner having a child afflicted with cystic fibrosis?
a. 0%
b. 25%
c. 50%
d. 75%
e. 100%

4. If you were a wildlife manager, which one of the following descriptions expresses what your objectives for building a sustainable fishing industry should be?
a. construct a food chain based on the most desirable fish.
b. construct a food chain with as many large marketable fish as possible.
c. construct a balanced food web using as many organisms as possible.
d. construct a food chain with a good mix of all size classes of organisms.
e. construct a food chain or web that will create the most income for the fisherman.

After determining the cognitive level of our existing baseline questions, it was decided that our overall course objectives required students to function at modified Bloom's levels two and three. The CISGS Director and course coordinators decided that the mean cognitive level of questions used on assessments should be just over 2, which would (given the study assumption that the assessment tools were adequate to measure student achievement) move students toward achieving CISGS objectives. Over the course of three semesters, course materials, pedagogies, and assessments were changed to achieve a mean cognitive ranking of 2.2. Changing the types of questions and assessment tools was relatively easy in comparison to changing pedagogies.
The lab coordinators searched for additional higher cognitive level questions to use on tests and in course materials. The desire was to find and use additional questions at the modified Bloom's cognitive levels 2 and 3. Some of these questions were already in use on course tests, but few were included in course materials or covered during classroom instruction. Students are used to memorizing material, and level two and three questions go beyond this type of study technique. We felt we could not ask a significant number of higher-level cognitive questions without including these types of questions in all aspects of the course. This meant these kinds of questions were included in the course lab manual, Microsoft PowerPoint presentations, questions asked by course instructors, and other areas of the course where applicable or possible. Questions 3 and 4 (Table 2-4) are examples of level 2-3 questions added to course assessments.

There was a desire to ensure that pre/post course test questions were properly written, valid, reliable, and of the appropriate level to assess student achievement. An analysis of all the test data provided by the University scoring office was performed to determine if the course tests were meeting these requirements. The Computer Center scoring office provided the following indicators of test validity and reliability: mean Kuder-Richardson index of test reliability, mean item difficulty, mean item discrimination, and standard error of measurement. Additionally, the information provided data to allow a determination of whether the course test questions were using good distractors. Educational research findings suggest that the ideal score for the Kuder-Richardson index and standard error is close to 1, mean item difficulty is 30, and item discrimination is 20-50 (Mehrens and Lehmann, 1997).
Using this information as a guide, all of the pre/post course multiple-choice test questions in the test bank (a bank of 50 questions) were examined to determine if they met the above criteria. Questions that did not fit the criteria were discarded, and new questions that fit the guidelines were added to produce assessment tests of the desired composition. The pre/post course tests were revised each semester to improve their fit with CISGS course objectives based on the test question criteria, while ensuring the questions aligned with CISGS goals for undergraduate biology education and with what was actually taught in the laboratory course. The printout from the computer scoring office showed the number of students answering a question correctly, the answers chosen by students, and the numbers of students from the top group who got the answer correct and those who got it incorrect. The printout also showed the spread of student answers across the test foils. This information allowed an examination of each question to see if the top students were answering the question correctly and the bottom students incorrectly. Test data were also examined to determine if there was an even spread of answers across the foils. Foils drawing no student responses are poor distractors and were changed in the test bank and on subsequent tests. Tests were routinely evaluated with the goal of ensuring Kuder-Richardson, index of discrimination, mean degree of difficulty, and cognitive level scores of ≥ 0.70, 20-50, 30, and 2.2 respectively, that the foils were functioning as distractors, and that all questions pertained to CISGS objectives for undergraduate education (Facione, 1990; Mehrens and Lehmann, 1997). The mean pre/post multiple-choice test K-R is > 0.75 for all six semesters of data analysis, with the highest reliability of 0.85 for Spring 2003 and our current mean cognitive level of 2.2.
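The item statistics described above can be sketched in code. The following is a minimal illustration with made-up data and our own function names, not the scoring office's actual software: it computes Kuder-Richardson 20 reliability, per-item difficulty (proportion correct), and upper-minus-lower group discrimination from a 0/1 response matrix (rows are students, columns are items).

```python
# Hedged sketch of classical item analysis; data and names are illustrative,
# not the University scoring office's software.

def kr20(matrix):
    """Kuder-Richardson formula 20 reliability for dichotomous (0/1) items."""
    k = len(matrix[0])                              # number of items
    totals = [sum(row) for row in matrix]           # per-student total scores
    n = len(totals)
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n  # population variance of totals
    pq = 0.0
    for j in range(k):
        p = sum(row[j] for row in matrix) / n       # item difficulty (prop. correct)
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var)

def item_stats(matrix, frac=0.27):
    """Per-item (difficulty, discrimination) using top/bottom score groups."""
    n = len(matrix)
    ranked = sorted(matrix, key=sum, reverse=True)  # students ranked by total
    g = max(1, int(n * frac))                       # group size (27% is a common choice)
    upper, lower = ranked[:g], ranked[-g:]
    stats = []
    for j in range(len(matrix[0])):
        p = sum(row[j] for row in matrix) / n
        d = (sum(r[j] for r in upper) - sum(r[j] for r in lower)) / g
        stats.append((p, d))
    return stats
```

For the real test bank, each row would hold one student's 50 scored responses; items whose difficulty or discrimination fell outside the criteria cited above would be candidates for revision or replacement.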
Developing these items required six semesters of writing test questions and course materials, improving the graduate teaching assistant professional development program, analyzing test questions, consulting experts on how to achieve the desired outcomes, and listening to feedback from all members of the learning community. It is important to note that this work was worth the effort, as it provided positive reinforcement for what we were doing by allowing coordinators, GTAs, and students to take ownership of the course and their learning and to show results based on the course assessments. This process identified another problem: most of the teaching assistants were not instructed in using higher level questioning and therefore did not have the skills necessary to teach students how to ask and answer higher cognitive level questions (Pers. Obs.). This meant that there was a need to provide additional GTA professional development in order for them to be effective instructors at the desired higher cognitive level. The coordinators also became aware of their own tendency to teach and assess at the lowest cognitive levels, so they also needed to rethink how they approached GTA professional development. If GTAs were expected to use skills that encouraged higher order thinking from students, the lab coordinators needed to model this behavior for the GTAs (Druger, 1997; Lawson, 2002). Chapter 3 provides a description of the ISB 208L professional development program (Tarrant, 2005).

Multiple-Choice Control Questions

Control questions were added to course tests to determine if the instructional strategies used were causing shifts in student learning or if there was an outside influence affecting this change. The questions included were from the Lawson test of cognitive ability (Questions 5-8; Figure 2-2) (Lawson, 1978) and covered material not taught in the course.

5. Twenty fruit flies are placed in each of four glass tubes.
The tubes are sealed. Tubes I and II are partially covered with black paper; Tubes III and IV are not covered. The tubes are placed as shown. Then they are exposed to red light for five minutes. The number of flies in the uncovered part of each tube is shown in the drawing.

[Figure 2-2 drawing: four tubes (I-IV) under red light, with the number of flies shown in the uncovered part of each tube.]

This experiment shows that flies respond to (respond means move to or away from):
a. neither red light nor gravity
b. red light but not gravity
c. gravity but not red light
d. both red light and gravity

6. because
a. most flies are in the upper end of Tube III but spread about evenly in Tube II.
b. most flies did not go to the bottom of Tubes I and III.
c. the flies need light to see and must fly against gravity.
d. the majority of flies are in the upper ends and in the lighted ends of the tubes.
e. some flies are in both ends of each tube.

How could a person determine which conclusion is best? (5 pts)

[Figure 2-2 drawing: two tubes under blue light.]

Figure 2-2

7. Farmer Brown was observing the mice that live in his field. He discovered that all of them were either fat or thin. Also, all of them had either black tails or white tails. This made him wonder if there might be a link between the size of the mice and the color of their tails. So he captured all of the mice in one part of his field and observed them. Below are the mice that he captured. Do you think there is a link between the size of the mice and the color of their tails?
a. appears to be a link
b. appears not to be a link
c. can not make a reasonable guess

8. because
a. there are some of each kind of mouse.
b. there may be a genetic link between mouse size and tail color.
c. there were not enough mice captured.
d. most of the fat mice have black tails while most of the thin mice have white tails.
e. as the mice grew fatter, their tails became darker.
Figure 2-2 (cont'd) Questions 5-8 are sample higher cognitive level control questions used on pre/post course tests. These questions cover material not directly taught during the course.

The assumption was that mean scores on the general science knowledge control questions would remain the same on pre/post course tests, while scores on the content knowledge questions taught in the course would increase between pre and post course tests. The expectation was that students would guess on these questions and that the mean scores for these questions would be about 25% (four foils per question). Initially, there were 30 questions on the pre/post tests, Fall 2000 (Appendix D), which was reduced to 20 questions beginning Spring 2002, when essay questions were included on the test (Appendices E and F). With the addition of essay questions to the pre/post course tests, it was important to reduce the time spent on assessment, as testing was taking up too much class time. The essay portion of the pre/post test was added to evaluate higher level thinking skills (Appendices G and H). Students received 10 points (until Spring 2003) for completing the pre/post course tests. Students received credit for the tests so they would take them more seriously and do a better job answering the questions. In the Spring of 2003 the credit for completing the test was dropped. It was immediately observed that many students did not complete the tests, especially the essay portion of the pre-test.

Pre/Post Course Essay Tests

There is considerable discourse (DiPasquale et al., 2003; Fowler, 1996; Giere, 1997; Halpern, 1987; Hogan and Maglienti, 2001; Karplus, 2003; Lawson et al., 2000; Presseisen, 1987; Scriven and Paul, 2003; Zimmerman, 2000) on the definition of critical thinking (see Chapter One) and no clear method of assessing student use of this skill.
Although there is not a clear consensus on a definition of critical thinking, there are many similarities among interested scholars. Summarizing these definitions, it was decided that the ISB 208L course working definition of critical thinking was the ability to formulate an argument or solution to a problem based on the use of logic, the scientific process, and facts to support one's position. This definition was chosen because it supports the CISGS objectives for student learning (Table 1-1), guides the design of appropriate assignments for reinforcement, and allows for reasonable assessment of student learning. This definition guided the design of essay questions of several types. Students practiced critical thinking, scientific method/process, data interpretation, and the use of facts to formulate logical arguments in exercises and discussions throughout the semester. Based on course assessments, essay test questions were added to pre/post course exams during the Spring 2002 semester to assess student use of higher level thinking skills (Bloom's cognitive question ranks of 2 and 3). Questions 9 and 10 in Appendix H show examples of the type of essay questions used on pre/post course tests. One of the major difficulties in using essay questions is that university courses have 700-1000 (mean = 827) students per semester, and having graduate teaching assistants read and accurately evaluate all of these essays is time consuming. These questions asked students to develop a scientific experiment, to evaluate experimental designs or research data, or to critique other students' or researchers' work (the highest cognitive level questions) during Spring 2002 (Bloom and Krathwohl, 1956; Swartz, 1987; Newell, 1990; Zimmerman, 2000; Stronge, 2002; Shymansky et al., 2003).
The essay portion of the pre-test assessed students' use of critical thinking, scientific method/process skills, and logic and facts to formulate arguments, as well as their ability to interpret data and their understanding of how science affects society and their lives. Students answered one to several questions that required the use of background knowledge provided in the course and critical thinking skills in order to provide an adequate answer. The major problems with higher-level (modified Bloom's levels 2 and 3) questions are that they are not readily available, are very content specific, and are very difficult to write and validate. Areas such as genetic engineering, environmental problems, and health issues were addressed in these questions. Essay questions did not always require students to use course content, but rather skills taught during the course. Essay questions were modified every semester in an attempt to better evaluate student understanding and ability to use critical thinking, logic, facts, and scientific reasoning skills. The original questions used on pre/post essay tests tended to rely on background information provided in the course and may have more accurately assessed how well students memorized course material rather than process skills. More recent questions have posed new problems from areas not specifically covered in the course, but require the use of critical thinking, scientific process, logic, and evaluation of facts/evidence to provide adequate answers. Examples of essay questions used on the Spring 2003 pre/post course essay tests are shown in Table 2-5.

Table 2-5 Essay questions used on ISB 208L pre/post course exams to assess student use of critical thinking skills.

Essay Question 1. Write a one- to two-page persuasive letter. This letter should be appropriately written to convince legislators to take your side on one of the listed environmental or biological issues.
To be most persuasive you will need to support your opinion using scientific evidence. This letter should be based on facts (i.e., environmental impacts, economic concerns, health and safety issues, etc.). Your letter should clearly state the issue and provide the reader with enough information that they recognize you are an informed individual. You should clearly define the issue, supply at least 3 pieces of supporting evidence for your opinion using specific examples, and give a clear direction or course of action you want the elected individual to follow with regard to the issue. This letter should be concise and have a logical flow, from the position, through the evidence to support the position, to the course of action to be followed. If you use facts, be sure to give credit to the individual or organization from which you obtained those facts. This lends credibility to your argument. Hopefully, it will be a letter you would wish to send to your local, state, or national officials; although you will not be required to do so, it should be written in the appropriate manner so that you could send it. Choose from one of these biological topics:

• Genetic Manipulation / Genetic Engineering
• Global Warming
• Human Population
• Alternative Energy Sources
• Groundwater / Water Pollution
• Habitat Destruction
• Biodiversity
• Threatened / Endangered Species

Essay Question 2. Scientific Investigation and Experimental Design

In order to help meet the global need for increased agricultural production, the United States Department of Agriculture has proposed converting much of the Boundary Waters National Park lakes region into a large scale national agricultural sector. This region has thousands of pristine oligotrophic lakes that are among the best habitat for trout. These lakes and the trout fishing are pivotal to the local economy through the tourism they attract. You have been hired by the Environmental Protection Agency to conduct an environmental impact survey.
You have been given five years and access to a large section of the park, containing many lakes, with which to conduct an investigation. Your task is to determine whether there would be significant impacts to the local lakes that would negatively impact tourism and the local economy. Propose a correctly stated hypothesis as to what, if any, changes would occur to each of the three water quality parameters listed below if this agricultural proposal were implemented. (5 pts each) For each parameter, describe in detail why the proposed land use changes would have these predicted effects. (5 pts)

DO (Dissolved Oxygen): Hypothesis; Justification
Temperature: Hypothesis; Justification

Table 2-5 (cont'd)

Fish species composition: Hypothesis; Justification

Remember, you have been given five years and access to a large section of the park, containing many lakes, with which to conduct your investigation. Your task is to determine whether there would be significant impacts to the local lakes that would negatively impact tourism and the local economy. Given these constraints, describe an appropriately designed scientific investigation that could be conducted to test each of your hypotheses (from the previous page) and that would provide the data necessary to support or refute these hypotheses. Be sure to include all appropriate methods and aspects of a well-designed scientific investigation and also the results you would expect in each of your treatments. (Note: do not just outline the scientific method.) Also, identify alternative explanations that could result in the same predicted outcomes and/or identify problems that might arise that could lead to different results. (20 pts)

Rubrics

Rubrics were not used in previous laboratory courses and were implied but not detailed during the first two semesters of the ISB 208L course. However, there was a desire to improve student work and grading consistency among teaching assistants.
Rubrics were developed; however, that was not enough to obtain the desired level of scoring consistency across sections. To improve consistency, professional development sessions were conducted in which the lab coordinators worked with teaching assistants to illustrate how to use rubrics for grading, practiced grading essays, and engaged GTAs in detailed discussions regarding the most important aspects of each assignment. This additional training improved consistency and made the teaching assistants much more confident in assessing student work. Based on these discussions, more detailed rubrics were developed that made assessing student work more effective and consistent. This process was modified every semester as student survey data, test results, and teaching assistant surveys were collected. In Fall 2003, students were provided detailed rubrics at the beginning of the course and no rubrics at the end of the course. There were several reasons for this. First, you cannot ask students for excellent work if you do not provide a description of what is considered "excellent work." Secondly, the rubrics were so detailed there was a concern as to whether students were learning to think critically or just blindly following the steps detailed in the rubric. Thirdly, based on the recommendations from the GTAs, the length and complexity of the rubrics were reduced at the end of the course to see if the students had mastered the course objectives. The use of rubrics is continually revised as part of the continuous course improvement program. For example, during a weekly afternoon laboratory meeting, a discussion of a new laboratory assignment on "Training for a Marathon" occurred between the coordinators and the GTAs. The discussion was to determine if GTAs were satisfied with the clarity of the assignment and rubric (Figure 2-3). The GTAs indicated that it was not clear what the students were to accomplish with the laboratory exercise.
After a lengthy discussion of the objectives of the laboratory exercise and how best to convey that information to students while leaving them the freedom to be as creative as possible, a suitable assignment and rubric were developed. The coordinators and GTAs felt the rubric should effectively evaluate students without being too vague or too restrictive. The original assignment and rubric are shown below (Figure 2-3), followed by the final version (Figure 2-4) as it was used in the laboratory. Note that the points were deliberately left off the rubric, giving the GTAs the freedom to assign points for each part of the exercise, as they requested during the discussion. Also, only the problem content and the "Your plan should contain the following" section were provided to students. The GTAs used the original rubric (Figure 2-3) without point designations for grading to help maintain consistency in student project evaluation. Another feature of the rubric is the inclusion of a student self-evaluation. The intent is to help students evaluate their work to determine if they have included the necessary components of the assignment before submitting it to their instructor for grading. The redesigned assignment and rubric illustrate the power of including the learning community in course materials and methods development. The original assignment and rubric would have worked, but the revised versions worked better, and the GTAs, having had their say in the design, were more willing to use it and thus bought into the process.

FS03 ISB 208L Marathon Assignment    Name    PID

Problem content: You want to train or help someone train for a marathon for the summer of 2004. You have asked friends and relatives how you should train for this event and what you should eat now, just before, during, and just after the event. They have given you a mixed set of answers. You are now totally confused.
What should you eat and how should you train for this event to be successful? You don't care whether you win or not, simply that you can complete the event.

To complete this laboratory exercise, students should provide a summary report, as if they were a trainer, that will:

• Provide background on whom your training program is designed for.
• Provide a one-month training program of ≥ 1 page in length for either running or walking a 26.2 mile marathon, giving specific details on what your trainee needs to do every day to prepare for the event.
• Provide a diet plan for your trainee that includes how and specifically what they should eat and drink every day in preparation for the event.
• Provide a written report (≥ 2 pages in length) justifying the program and giving physiological evidence to support the effectiveness of your training program.
• Provide your hand-written proposal statement and training program developed in class.

Include a written report justifying (providing evidence for) why the program you developed will work for the trainee. Your summary should explain why your trainee should eat a specific food or do a particular exercise. This summary should also explain why your final proposal is better than your original plan (the one developed in class). Use the rubric below to guide you in report preparation.
Rubric (columns: Item | Possible points | Self evaluation | Earned points)

Hand-written proposal & training program: included / missing (0)

Typed proposal and training program: included / missing

Written Summary
Number of articles & other resources (interviews) used: ≥ 3 / 2 / 1 / used only opinion (0)

Figure 2-3 (cont'd)

Appropriate use of Evidence
• Enough evidence used to support position: more than one source; no opinion, only facts; academically accepted sources
• Some evidence used to support position: may not be completely clear how the evidence supports the position; missing one of the above items
• Unclear use of evidence, not enough evidence to support the position, or use of opinion or a source that would not be regarded as academically valid
• No evidence presented

Discussion
• Clearly details the argument, uses evidence to support the position, and presents arguments against potential counter-arguments; clearly shows the link between observation, hypothesis, evidence, and conclusion (includes the physiological evidence to support the diet and training regimen: specifically, aerobic and anaerobic respiration and how different exercises relate to these, food types based on carbohydrates, lipids, or proteins and why to eat those, etc.)
• Somewhat details the argument, doesn't link evidence to the argument, or barely addresses counter-arguments
• Doesn't clearly explain the rationale for the argument, leaves out evidence for the argument, or doesn't address counter-arguments
• No discussion, disorganized, or unclear

Conclusion
• Conclusion follows a logical path from the evidence, directly addresses the proposal, and is clearly stated
• Conclusion somewhat follows from the evidence
• No conclusion presented, or the conclusion doesn't follow from the evidence

Diet Plan
• Gives a complete diet and hydration plan for one month (includes all meals, foods, numbers of meals per day, what and how much to drink, etc.)
Figure 2-3 (cont'd)

Training Regimen
• Details a complete exercise program for one month (includes a schedule of what the trainee will do each day: how far they will run/walk, weight lifting, cross-training, etc.)

Formatting
• Typed using 12 point font, Times New Roman or Arial, single spaced, 2 pages in length, with page margins of 1 inch on all sides; name, section #, and PID included
• Any of the above items missing (0)

Figure 2-3 Initial assignment and rubric presented to GTAs during a weekly afternoon laboratory meeting for their feedback. GTAs made several suggestions for improving the assignment. The modified assignment is shown in Figure 2-4.

FS03 ISB 208L Marathon Assignment    Name    PID

Problem content: You want to train or help someone train for a marathon for the summer of 2004. You have asked friends and relatives how you should train for this event and what you should eat now, just before, during, and just after the event. They have given you a mixed set of answers. You are now totally confused. What should you eat and how should you train for this event to be successful? You don't care whether you win or not, simply that you can complete the event. Your goal is to develop a complete training plan so you can successfully complete the marathon. Your plan should contain the following:

• Include your hand-written proposal statement and training program developed in class (your unresearched plan). Include background on whom you planned the training program for.
• A revised training plan that is based on your research:
• Include a one-month training program for either running or walking a 26.2 mile marathon, giving specific details on what your trainee needs to do every day to prepare for the event.
• Include a diet plan for your trainee that includes how and specifically what they should eat and drink every day in preparation for the event.
• Include a written report justifying (providing evidence for) why the program you developed will work for the trainee. Your summary should explain why your trainee should eat a specific food or do a particular exercise. This summary should also explain why your final proposal is better than your original plan (the one developed in class).

Figure 2-4 The modified assignment produced following a detailed discussion with the GTAs, showing incorporation of their suggestions. Note that the rubric was omitted from this version of the assignment, as suggested by the GTAs.

Results and Discussion

Pre/Post Course Multiple-Choice Tests

Scores on the pre/post course multiple-choice tests have improved over the time of this study (Table 2-6). Test results show significant improvement, using a two-tailed t-test assuming equal variances (p < 0.001), in student performance for every semester. The Fall 2000 pre-test mean is 46.8 ± 17.1 and the post-test mean is 76.6 ± 11.4, showing a 29.8% improvement during the course. Spring 2001 results show a significant improvement of 15.4% in student performance, with pre- and post-test means of 44.0 ± 18.6 and 59.4 ± 14.8, respectively. Fall 2001 results show a 20.0% increase between pre- and post-test means of 38.9 ± 12.1 and 58.9 ± 16.1, respectively. Mean pre/post test scores for Spring 2002, Fall 2002, and Spring 2003 show results of 50.8 ± 13.0 and 79.5 ± 13.1, 45.9 ± 12.3 and 73.9 ± 8.7, and 54.6 ± 13.6 and 86.6 ± 10.4; improvements of 28.7, 28.0, and 32.0%, respectively (Table 2-6). Overall, the six semesters for which data were collected for this study show significant (p < 0.001) improvement on pre/post course multiple-choice tests, with a weighted mean of 26.4%. The mean scores for the control questions indicated students were not simply guessing on these questions, but there was no improvement in scores on these questions between pre and post tests.
The mean scores for these control questions showed no significant differences for the two semesters they were included on the pre/post course tests (at the p ≤ 0.05 level), with a mean of 3.0 ± 0.1 on pre/post course tests. Additional control questions could have been included on the pre/post tests to increase the statistical power to detect differences, but were not, because too much class time was already being used for testing. Additionally, students understandably complain about questions included on tests that are not part of the curriculum or that might affect their grade.

Table 2-6 ISB 208L pre/post course multiple-choice test results shown with their corresponding significance as determined using a two-tailed t-test assuming equal variances. All differences between pre and post course multiple-choice tests are significantly different, indicating substantial increases in test scores between the beginning and end of the course. n = number of students.

Semester        Pre-test (%) ± SD   Post-test (%) ± SD   Difference (%)   n Pre   n Post
Fall 2000       46.8 ± 17.1         76.6 ± 11.4          29.8             660     641
Spring 2001     44.0 ± 18.6         59.4 ± 14.8          15.4             842     782
Fall 2001       38.9 ± 12.1         58.9 ± 16.1          20.0             755     700
Spring 2002     50.8 ± 13.0         79.5 ± 13.1          28.7             856     895
Fall 2002       45.9 ± 12.3         73.9 ± 8.7           28.0             825     881
Spring 2003     54.6 ± 13.6         86.6 ± 10.4          32.0             734     919
Weighted Mean   46.8                73.2                 26.4             4672    4818

We predicted that if the model was working, the test scores between pre- and post-tests would improve or remain constant every semester. If the model was not working, pre/post test scores were expected to decrease or to be inconsistent across semesters. Test results show significant increases between pre and post course test scores. The first semester, Fall 2000, had a very large increase of 29.8%. This result was expected because the pre/post tests used lower cognitive level test questions and the GTAs were teaching to the test.
For much of the early testing, 93.6% of pre-test and 78.4% of post-test questions were level one. The Fall 2000 pre-test included 3 of 47 questions that were Bloom's level two questions and no level three questions. The Fall 2000 post-test included 21 of 97 level two questions and no level three questions. The mean cognitive levels for these tests were 1.1 and 1.2 for the pre- and post-tests, respectively. The following semester, tests were changed to include a greater number of higher cognitive level questions, but there was still much work to be done on course materials and GTA professional development to align course content, objectives, and assessment. Over the next semester, Fall 2001, course materials were aligned with the assessment, and we began to build a more formal GTA professional development program. Between Fall 2001 and Spring 2002, the number of multiple-choice questions was reduced from 30 to 20 to decrease testing time so essay questions could be used on the pre-test. There was a change in test scores based on this shift, but the differences between pre/post test results remained constant (see Table 2-7). The Spring 2003 pre/post tests used the same questions. These tests consisted of 23 questions: 10 level 1, 9 level 2, and 4 level 3 questions, with a mean of 2.2. GTAs were instructed to teach course material to ensure it aligned with test questions. Test scores initially declined between Fall 2000 and Spring 2001 and then continuously improved through Spring 2003. The initial decrease in test scores was probably due to the attempts to align course objectives and teaching materials and to the inclusion of a greater percentage of higher cognitive level questions on the tests without adequate course alignment.
These deficiencies in course alignment were addressed by improving course materials and GTA professional development every semester from Spring 2001 through Spring 2003, resulting in better alignment between GTA professional development, course delivery, and assessment.

Table 2-7 ISB 208L pre/post course essay test results for all semesters in which the test was administered. The pre/post course essay test results show significant improvement in test scores, with a mean improvement of 18.9%. All scores are reported as raw score followed by percentage score. The number of students evaluated is shown for each semester as (n).

Semester        Pre-test raw score (%)   Post-test raw score (%)   Difference (%)   P value     n
Spring 2002     75.4 ± 10.8              85.4 ± 9.0                10.0             p < 0.001   61
Fall 2002       64.4 ± 14.2              88.8 ± 9.4                24.4             p < 0.001   58
Spring 2003     70.8 ± 18.2              91.2 ± 11.0               20.4             p < 0.001   181
Weighted Mean                                                      19.2

Additionally, test questions were examined for their validity and their ability to discriminate using appropriate distractors. This was done by having various members of the learning community examine the questions to ensure they aligned with course objectives and were appropriately worded to obtain the desired assessment. The Kuder-Richardson index was used to determine test reliability. There were modifications to exams and changes to course materials each semester in an attempt to increase the test question cognitive level, the validity of test questions, and the alignment between course delivery and test questions. If our assumptions were correct and the pre/post course tests assessed student ability to use process skills, the results indicate the course improved student process skills. However, changes to both the overall test question cognitive level (from a mean of 1.1 to 2.2) and the improved Kuder-Richardson reliability (from 0.58 to 0.85), along with other parameters, might offset the gains across semesters.
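The Kuder-Richardson reliability index mentioned above (typically KR-20 for dichotomously scored items) can be computed directly from a matrix of item responses. The sketch below is a minimal illustration with made-up data, not the actual ISB 208L item responses; it assumes the KR-20 variant of the index.

```python
import numpy as np

def kr20(responses: np.ndarray) -> float:
    """Kuder-Richardson 20 reliability for a students x items
    matrix of dichotomous (0/1) item scores."""
    k = responses.shape[1]                   # number of items
    p = responses.mean(axis=0)               # proportion correct per item
    q = 1.0 - p                              # proportion incorrect per item
    total_var = responses.sum(axis=1).var()  # variance of students' total scores
    return (k / (k - 1)) * (1.0 - (p * q).sum() / total_var)

# Toy example: 5 students, 4 items (made-up data)
scores = np.array([
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
])
print(f"KR-20 = {kr20(scores):.2f}")  # → KR-20 = 0.80
```

Values closer to 1 indicate higher internal consistency, which is the direction of the 0.58 to 0.85 improvement reported in the text.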
Lab manual assignments, out-of-class assignments, and GTA professional development were changed to better align content taught with objectives. However, mean grades issued for the course remained high, at 3.3. In fact, pre/post course test differences increased in subsequent semesters (see Tables 2-6 and 2-7). The initial tests indicate students increase their content knowledge by taking the ISB 208L course. This suggests the course increased student learning (use of process skills) by clearly aligning course objectives, instruction, and assessments, which is the goal established for this course. Improving test scores across semesters supports the hypothesis that iterations of the course model helped improve student achievement.

Pre/Post Course Essay Tests

Pre/post course essay tests were implemented during Spring 2002 in an attempt to better evaluate student use of critical thinking. By changing course materials, GTA professional development, and other parameters, it was predicted there would be improved essay test performance, indicating students were improving their critical thinking, scientific process, and use of facts and logic to formulate arguments. A comparison of pre/post course test means for the essays shows a significant improvement for all semesters, with a weighted mean of 19.2%, using a two-tailed t-test assuming equal variances (p < 0.001) (Table 2-7). Table 2-7 shows that in Spring 2002 students improved their overall scores by 10% between pre/post tests, from a mean of 75.4 ± 10.8 to 85.4 ± 9.0%. In Fall 2002, student scores improved by 24.4%, from 64.4 ± 14.2 to 88.8 ± 9.4%. In Spring 2003, student scores improved by 20.4%, from 70.8 ± 18.2 to 91.2 ± 11.0%. Overall, the pre/post essay test scores showed a 19.2% improvement for the three semesters in which essay test questions were used in this study, with the later semesters showing larger gains than the first.
These test scores showed an improvement over the three semesters, from 10% to 20.4%, suggesting that student performance improved as course materials, delivery, and GTA professional development were improved. Data for the third semester may be less reliable because the policy of giving students points for taking the essay pre-test was discontinued, and many students did not complete the essay pre-test. Additionally, there may actually be a greater student test score improvement than the data indicate, because 10-15% of students were able to earn a near-perfect score on the essay pre-test; this group could not show pre/post test score improvement, reducing the overall mean differences. These data suggest that the course model process improved our ability to assess student achievement through better use of rubrics and consistency in grading. The increasing differences between pre/post course mean test scores in subsequent semesters suggest that there is better alignment of instruction, assessments, and objectives. One possible counterargument to these results is that by providing the rubrics, instructors are teaching to the test. There was an attempt to counter this argument by scaffolding our rubrics from the beginning of the course to the end of the course: giving a detailed rubric with the first assignment and reducing the rubric with each subsequent assignment, until no rubric was given on the final exam. Improved results between pre/post tests indicated students had learned how to evaluate their work and were not simply following rubric directions.

Additional Assessments

Additional assessments that support the effectiveness of this model are described in detail in Chapters 3 and 4. These assessments are SIRS (Student Instructional Rating System), GTA surveys, mean course grade issued, course completion rate, attendance, enrollment, and course syllabi analysis. SIRS ratings for the ISB 208L course improved every semester during the study period.
We examined only six SIRS questions (see Chapter 4, Tables 4-4, 4-5, and 4-6). Ideal SIRS ratings for all six questions examined would have a value close to 1.0. These questions asked students to rate various aspects of the course: the relevance of science to the non-scientist, instruction focused on concepts, the relationship of science to society, the intellectual rigor of course work, the effectiveness of the GTA, and an overall course rating. As Table 4-7 shows, all of these rankings moved closer to 1.0 over the course of this study. This indicates that course improvements were being made every semester based on evaluation of assessment data, and implies that the model was helping to improve the course. One problem with SIRS results is that they are a very coarse measure, and therefore even small consistent changes may be due to normal fluctuations in measurement and may not actually indicate a trend. This does not appear to be the case here, in that an examination of the SIRS data for the previous laboratory courses (ISB 202L and 204L), Table 4-7, shows fluctuations rather than the consistent trend toward lower numbers seen for the ISB 208L course. Additionally, when used in conjunction with the other forms of assessment, the results show a similar trend toward overall course improvement in subsequent semesters. GTA surveys (see Chapter 3, Table 3-3) indicate that the course was effective and that the professional development provided by the lab coordinators helped GTAs teach and deal with issues raised during instruction. This suggests that improvements made in GTA professional development led to greater GTA confidence in the classroom, which translated into improved instruction and greater student achievement. The GTA survey questions were developed to be similar to the SIRS questions, and therefore a low number, closer to 1, is the best rank.
GTAs indicated on Questions 3, 5-9, 17, 19-24, and 27 that the professional development they received aided them in teaching the course. The question ranks stayed consistent over the study period, with means of 1.6, 1.6, and 1.7 for the three semesters the survey was used. The GTAs also indicated on Questions 10-12 that the improved lab manual was easier to use, more relevant to student lives, and addressed the CISGS objectives for undergraduate biology education, with means of 2.3, 1.9, and 1.8, respectively. There were no grade adjustments, implying that even though we increased the cognitive levels of test questions and course materials, student achievement remained high. These results suggest that students are working at a higher cognitive level while achieving the same grade, due to better alignment of course instruction with assessments. An analysis of attendance, rate of course completion, and enrollment shows that students attend class 95% of the time. There were several reasons for such high attendance. There was a strict make-up policy preventing students from making up missed work, and there was at least one assignment and/or quiz given each laboratory session. This means that if a student does not come to class and misses an assignment, it will have an impact on the course grade. GTAs aided lab coordinators in developing these policies to encourage students to attend class. This supports the model in that changes affecting attendance were suggested by the learning community and were incorporated each semester, improving the process. Students complete the course at a rate of 93%. This may actually understate the completion rate, as it may be skewed because we used enrollments from the Registrar’s office class lists on the first day of classes as initial enrollments. These numbers change as students switch sections and courses. Final enrollments were based on the number of grades issued.
Students tended to complete the course, indicating that it meets their needs for fulfilling university requirements. Enrollment figures suggest that students viewed the course favorably: they could have taken other courses to fulfill their university science laboratory requirement, but enrolled in increasing numbers in the ISB 208L course. Alternative explanations for course enrollments and completion were that there are many sections of the lab offered over the entire week, from 8 A.M. to 6 P.M., or that the course is easy and students can earn a high grade. These were probably not the reasons for the high enrollments or course completion, because other lab courses are offered at the same times as this course, and some of those lab courses were considered easier than ISB 208L based on student comments.

Informal Assessment

An additional form of course evaluation was based on informal conversations with GTAs, students, and faculty to determine feelings about the course and course materials. These discussions took place in the classroom, during teaching assistant evaluations, and during weekly lab meetings. As feedback was received, it was evaluated to determine whether it had merit and could aid in overall course evaluation. It was important to know if either form of assessment indicated a need for course changes in pedagogy, assessments, or other areas. For example, these discussions provided ongoing information regarding test and quiz questions, lab book typographical errors, student problems with course materials, conceptual teaching problems, and many other issues. These discussions helped identify a missed means of assessment: a formal quantitative GTA assessment of the course (previously described as the GTA Survey).

Conclusions

This project’s goal was to determine if the ISB 208L course, and subsequently the new course model, improved student achievement.
Our hypotheses were:

Hypothesis I: The use of the course model leads to improved laboratory course delivery, leading to increased student learning.

Hypothesis II: Use of the course model provides an effective mechanism to continuously improve course delivery and student achievement.

Hypothesis III: GTA professional development improves GTA effectiveness, leading to increased student achievement.

In order to show the model was working we needed to show that:

1. Iterations of the application of the model help to improve student achievement.
2. GTA professional development leads to improved GTA effectiveness and increased student achievement.
3. GTA professional development improves teaching competence and reduces course problems, and
4. Application of the model improves course delivery over time.

Because this type of research has a great deal of noise in the data, it is difficult to establish direct cause-and-effect relationships between each of the new model components and their effect on student achievement. Because of this noise, many types of assessments were employed to act as indicators of model success. This study collected a wide range of assessment data in an attempt to determine the effectiveness of the ISB 208L course in helping students achieve the CISGS goals for undergraduate biology education. To test these hypotheses, this study needed to show that iterations of applying the model and GTA professional development improve course delivery by improving GTA instructional confidence, leading to increased student achievement. The evidence collected and explained in this study supports the three hypotheses. The evidence for model success consists of pre/post course multiple-choice and essay tests, SIRS, mean grade issued, attendance, rubrics, course completion, course enrollments, student and GTA comments, and other anecdotal data.
Hypotheses I and II, that the course model led to improved course delivery and was an effective mechanism to continuously improve course delivery and increase student learning, are supported by this study. The assessments used in this study showed significant increases in pre/post test scores for every semester. There was a clear increase in test performance across semesters with regard to the CISGS goals for undergraduate biology education, in that the mean test question cognitive level was increased from 1.1 to 2.2 over the six semesters of this study while test scores remained high. Pre/post multiple-choice and essay tests showed significant improvement during each semester, indicating students were achieving the goals of undergraduate biology education. The data also showed a general increase over the six semesters of this study when the increase in question cognitive level is included. There was an expectation that test scores should decrease with increasing test question cognitive level, but this did not happen. Instead, this study showed significant increases in test scores: 26.4% for the multiple-choice tests and 19.2% for the essay tests. The essay tests required students to do two things: first, formulate a logical argument based on data, and second, analyze or design a scientific investigation. These are both CISGS goals for undergraduate education. Other factors could explain these results. It is possible that these students were learning to write in this manner in other courses they were taking. However, the course had a mean enrollment of 827 students per semester with a wide range of majors, almost exclusively non-science majors. The majority of the students were freshmen and sophomores, with a few juniors and seniors, so it is unlikely that they would consistently be learning the same material at the same time to improve their writing during the same semester.
The GTAs also indicated that they felt the students were learning the material and that they could see the difference in the classroom in the way students formulated ideas and asked questions. GTAs also indicated on their surveys that they felt the course was achieving the CISGS goals for undergraduate education. Additionally, it was important to show that iterations of the application of the model led to increased student achievement. The pre/post tests were changed every semester to better align them with CISGS goals for undergraduate biology education. The test questions, course materials, and instructor questions asked in class were changed to increase cognitive levels based on assessment data from previous semesters. GTA professional development was also changed to align instructor skills with the desired level of course delivery; GTAs did not come to the ISB 208L course with the skills necessary to teach at the level desired by the university. Additionally, rubrics were added to writing assignments to improve student evaluation. The rubrics underwent many revisions based on the recommendations of the GTAs, lab coordinators, and students. Members of the learning community provided feedback about such things as the lab manual, tests, in-class assignments, and attendance that led to course changes. Each semester the course improved according to SIRS results, with all SIRS ranks moving closer to 1.0. It was also observed that mean grades issued remained high (mean of 3.3) without grade adjustments. There was great course consistency across sections in all respects. Additionally, the rubrics helped improve grading consistency among sections, decreasing standard deviations in essay test grading scores from plus or minus 15 points to 5 points. This meant that essay scores issued by three different GTAs varied much less after feedback and modification of the rubrics led to both improved rubrics and improved GTA professional development.
Students came to class with 95% attendance and remained in the course, with 93% earning a grade. This is also an indication that the course improved due to iterations of the course model, in that if students did not view the course in a positive manner they would not attend class or would drop the course, as there are other courses available to fulfill their undergraduate laboratory requirement. Other explanations for why students remain in the course are that the course is easy or that in-class assignments force them to come to class. These are probably not the dominant reasons for attendance or course completion, in that students can pass the course without attending all classes and, as previously stated, they could take another course to fulfill their requirement. Anecdotal evidence from conversations with students who have taken the course indicates they liked the course emphasis on relevant topics and the classroom discussions. These comments were not quantified. Another group of comments, which decreased in number over the time of this study, indicated there was too much busy work in the in-class assignments and far too many errors in the course materials. The lab coordinators, with the help of the GTAs, have modified the lab manual, quizzes, tests, papers, and other course materials to make them more effective at teaching the material without being busy work or containing an undue number of errors. The iterative process built into this model supports these changes by formally assessing the course against all of these criteria. GTAs indicated on the GTA survey and in informal interviews that they liked to teach this course and felt it is well aligned with the CISGS objectives for undergraduate education.
They also indicated that the professional development provided by the lab coordinators helped them understand the emphasis for instruction and aided them in being prepared and comfortable to teach the course. GTAs provided invaluable feedback on every aspect of this course over the time of this study. GTAs were encouraged to report any shortcomings in any aspect of the course, from the lab manual to delivery to gaps in their professional development. GTAs indicated that the ongoing mentoring they received from the coordinators and interactions with other GTAs aided them in developing as instructors. The GTA surveys, pre/post multiple-choice and essay test scores, and informal conversations supported Hypothesis III, that improved GTA professional development leads to increased student achievement. This process was begun Fall 2000 and continues today. Test questions, pedagogy, and overall alignment are improved at the end of every semester, as assessment data become available suggesting course improvements. Course improvement is an ongoing process based on formal assessment and learning community comments. It was observed that everyone involved in this process, from instructors to students, grew from the continuous course improvement process. It provided everyone in the educational community ownership in, and a reason for learning, the skills and material presented in the course. The lab coordinator and GTA professional development materials used in this process are described in Chapter 3, “GTA Professional Development: Key to Program Success” (Tarrant, 2004). Handouts used for this are available from the author. Additionally, improved GTA professional development was predicted to improve student achievement of process skills.
Because GTAs were not used to teaching higher-level process skills, they needed professional development to assist them in teaching these skills and in being comfortable in the classroom and competent to teach them. The amount and depth of GTA professional development was increased each semester and worked to help GTAs develop facilitation skills, questioning techniques, and other skills necessary to move the classroom from instructor-centered to student-centered and to better coach students in the development of process skills. Increased student test scores and positive GTA survey results were predicted, indicating GTAs felt they were receiving helpful professional development. The GTA survey results are covered in detail in Chapter 3. The GTA survey results support that GTAs felt the professional development helped them better teach the higher cognitive level skills necessary to be successful on the course essay tests. Also, it was expected that students would rank GTAs as very effective on SIRS. The results of the GTA survey and SIRS confirm these predictions. This process continues as mismatches are identified through course evaluation between instruction, assessment, course materials, GTA professional development, student perceptions, and coordinator knowledge. The greatest problems with this process are the time required for all assessment components, including University approval of the study, data collection, grading of essays and other student materials, test construction, and data analysis. The strength of this process is that it allows for continuous course improvement, linking assessment, instruction, and the latest educational research to provide the best student instruction. The course assessments suggest that the ISB 208L teaching model for undergraduate non-majors biology is improving student knowledge on many levels and is meeting the CISGS objectives.
The assessments indicate that student general biology knowledge, use of critical thinking skills and the scientific process, and awareness of the importance of biology and science in their lives and to the world increase by taking ISB 208L. Based on test question cognitive level, pre/post course multiple-choice and essay tests, and GTA surveys, the results of this study suggest that the ISB 208L teaching model improves course delivery and is effective in increasing student achievement toward the CISGS objectives for undergraduate integrative studies science education. Results for three years of assessments show a statistically significant (p < 0.001) increase in student biology content knowledge, with a mean improvement in student performance of 26.4% on pre/post multiple-choice tests. The results for the pre/post course essay/problem test assessment, used to determine student use of critical thinking skills and the scientific process, show a mean increase in test scores of 19.2% for all semesters. Mean attendance for ISB 208L is 95%, with a course completion rate of 93% (these data are described in detail in Chapter 4). Additionally, grade point averages have remained high (mean of 3.3 for ISB 208L) while the mean question cognitive level used on course assignments has increased from 1.1 in Fall 2000 to 2.2 in Spring 2003, moving from 78.4% low cognitive level questions on the early pre/post tests to 56.5% level 2 and 3 questions on the later tests. Students indicate, through SIRS data, that the course is relevant to their lives, emphasizes understanding of ideas and concepts, and encourages thinking about the relationship between science, society, and their lives. Multiple-choice tests were improved each semester. By analyzing the previous semesters’ tests, the lab coordinators were able to improve the tests, moving the Kuder-Richardson reliability index closer to 1, from 0.58 to 0.85, over six semesters.
Additionally, test questions were changed to give a better spread of student responses over all distractors and to ensure that the top students were getting the questions correct while the low-scoring students were getting them incorrect. This is an iterative process that allows test improvement over time. All SIRS results show continuous improvement, moving closer to the ideal score of 1. Student course ranking on SIRS has improved every semester as work continues to align course content, delivery, and assessment. (A detailed description of SIRS results is provided in Chapter 3.) Additionally, the course model uses assessment to identify mismatches in teaching, assessment, objectives, and other areas, so that improvements in overall course design and delivery can be made to obtain the greatest student learning gains, achieving CISGS objectives for undergraduate education. Other assessment results show continuous improvement in subsequent semesters. Students have also indicated that course work and assessments are challenging and interesting. Additionally, the process of using embedded assessment has increased awareness of course limitations in providing instruction; identified educational bias; revealed mismatches between objectives, instruction, and assessment; identified additional needs for GTA professional development; and helped coordinators improve overall course operation and student achievement. Table 2-8 summarizes the shifts observed over three years of using the course model and embedded assessment to drive the continuous course improvement process.

Table 2-8 Over the course of this study we found that the following shifts in course implementation and procedures occurred as we worked to improve the course and align it with CISGS educational objectives.

1. From one of us generating questions to a group effort.
2. From using the lowest cognitive level questions to the use of higher cognitive level questions, as measured using Bloom’s Taxonomy, and from multiple-choice to essay questions.
3. From very simple to very detailed rubrics for scoring of student work.
4. From guessing question effectiveness to validating question effectiveness and reliability.
5. From an unclear match between objectives, instruction, and assessment to a clear link between objectives, assessment, and instruction.
6. From minimal or no use of assessment to modify GTA professional development and course materials to extensive use of assessment to drive all aspects of course improvement.

It is important to note that there was a focus on only one or two changes per semester, due to time constraints, the potential for confusion, our knowledge, and our ability to implement changes. Using embedded assessment, it was determined that there was not a clear match between objectives, pedagogy, and assessments. This meant it was necessary to determine how to assess our current position and then build a plan for aligning the various course elements. This study supports all three hypotheses, showing that the course model improves laboratory course delivery and that GTA professional development increases student learning. This study indicates that an effective laboratory course model can be used in a large undergraduate course, using embedded assessments, student-centered laboratory activities, and instructor professional development, leading to improved student achievement. This study demonstrated that the new course model can achieve increased student learning through clear alignment of objectives, instruction, and assessments. Accomplishing this requires several types of assessments and continuous evaluation of assessment data.
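The check described earlier, that top students answer an item correctly while low-scoring students do not, is commonly summarized with an item discrimination index: the proportion correct in the top-scoring group minus the proportion correct in the bottom-scoring group. The sketch below uses made-up data, not the study's actual item analysis, and the upper/lower 27% split is a conventional choice rather than one stated in the text.

```python
import numpy as np

def discrimination_index(responses: np.ndarray, item: int, frac: float = 0.27) -> float:
    """Item discrimination: proportion correct on `item` among the
    top-scoring students minus the proportion correct among the
    bottom-scoring students (conventional upper/lower 27% groups)."""
    totals = responses.sum(axis=1)       # total score per student
    order = np.argsort(totals)           # students sorted low to high
    g = max(1, int(len(totals) * frac))  # size of each comparison group
    low, high = order[:g], order[-g:]
    return responses[high, item].mean() - responses[low, item].mean()

# Toy data: 10 students x 3 items. Item 0 tracks overall ability;
# item 2 was answered correctly by everyone, so it cannot discriminate.
scores = np.array([
    [0, 0, 1], [0, 1, 1], [0, 0, 1], [0, 1, 1], [1, 0, 1],
    [1, 1, 1], [1, 0, 1], [1, 1, 1], [1, 1, 1], [1, 1, 1],
])
print(discrimination_index(scores, item=0))  # → 1.0 (strong discriminator)
print(discrimination_index(scores, item=2))  # → 0.0 (no discrimination)
```

Items with indices near zero or negative are the ones an iterative revision process like the one described here would flag for rewording or replacement.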
Additionally, for maximum effectiveness, there is an indication that GTAs need professional development to ensure excellent instruction, which supports the research findings of Druger (1997) and Lawson (2002). Additionally, this course implemented a shift from a predominantly confirmatory laboratory program to a mix of confirmatory and inquiry-type laboratories. The pre/post course multiple-choice and essay test results suggest that students have significant learning gains in the new course, even with the addition of higher cognitive level questions. This suggests that students are using higher-level process skills, critical thinking, scientific reasoning, and facts to formulate logical arguments and answer complex essay questions. Based on the assessments used in this study, it is unclear whether the same learning gains would occur in a laboratory program that used all confirmatory laboratory exercises. Previous research by other authors (NRC, 1997; Von Secker and Lissitz, 1999; Lawson, 2002) suggests that confirmatory exercises are not as effective in teaching higher cognitive level process skills. Therefore, the findings of this study suggest that the use of this course model can significantly increase student use of process skills using a mix of confirmatory and inquiry-type laboratory exercises. This study supports other educational research studies indicating it is important to have a clear link between objectives and assessments. Research findings (Hart et al., 2000) support that it is important for all members of the learning community to understand the objectives of the course and the assessment procedures used for student and course evaluation. It is important to assess achievement, and to use student achievement as a means to evaluate the course and make improvements to increase student learning (Eberly et al., 2001).
Finally, our course model achieved all of the objectives of this project by using assessment to identify mismatches in teaching, assessment, objectives, and other areas, so that improvements in overall course design and delivery could be made to obtain the greatest gains in student learning, thus achieving CISGS objectives for undergraduate education. The new laboratory course model was developed during the revision and implementation of the ISB 208L laboratory course. This generic course model is applicable to other courses, providing a formal means to drive continuous course improvement, as demonstrated by the research findings in the ISB 208L laboratory course.

CHAPTER 3

GRADUATE TEACHING ASSISTANT PROFESSIONAL DEVELOPMENT: KEY TO PROGRAM SUCCESS

Introduction

This chapter provides a description of the graduate teaching assistant (GTA) professional development methodologies used for improving teaching assistant instruction in a non-majors general biology laboratory course. It is important in that it provides a description of, and the materials used for, professional development and has general applicability to other science courses using GTAs as primary instructors. The CISGS department employed GTAs from several departments at Michigan State University: Plant Biology/Pathology, Zoology, Entomology, Anthropology, Criminal Justice, and others. Teaching assistants were masters or doctoral graduate students, or upper-level undergraduate students. There were 15-20 teaching assistants per semester. Teaching assistants arrived to teach with varying backgrounds in biological content knowledge and vast differences in teaching experience. This meant many GTAs were under-prepared to teach the course. It was common to have last-minute changes in GTA teaching assignments and about 50% turnover of GTAs per semester, requiring a mechanism by which GTAs could be prepared to teach our course quickly.
This diverse group of teaching assistants needed to become instructors in as little as two days to teach one to three laboratory sections. The hypothesis for this study was that well-prepared instructors increased GTA effectiveness and student learning. According to educational researchers' findings, several areas must be addressed in order to adequately prepare GTAs for their role as primary course instructors. These include increasing GTA classroom and instructional competence and comfort levels, familiarizing them with all aspects of course and institutional procedures, and providing instruction in pedagogy to expand their instructional toolbox. Each of these areas is discussed in the following paragraphs, including descriptions of recommendations for accomplishing this professional development and how we incorporated these ideas into the ISB 208L professional development program. Many institutions encounter similar problems with graduate teaching assistants not being prepared and in need of professional development for their role as course instructors (Worthen, 1992; Edwards, 1993; Mandeville, 1993; Takalkar et al., 1993; Main, 1994; Hendrix, 1995; Druger, 1997; Savage and Sharpe, 1998; Von Secker and Lissitz, 1999; Guthrie, 2000; Hayward, 2002; Luft, 2003; Lumsden and Morgan, 2003). Worthen and other educational researchers indicate that one of the major problems in using GTAs as course instructors is obtaining qualified staff (Worthen, 1992; Frymier, 1993; Mandeville, 1993; Takalkar et al., 1993; Main, 1994; Hendrix, 1995; Boyle and Boyce, 1998; Savage and Sharpe, 1998; Chinn and Hilgers, 2000; Byrnes, 2001; Crawford, 2000; Roehrig et al., 2003). Worthen explains that GTAs are inexperienced and not prepared to act as primary course instructors. He indicates that it is important to develop and implement effective GTA orientation programs in order to provide the greatest achievement for undergraduate students.
He points out that most studies and GTA programs are developed by faculty based on their perceived GTA needs, rather than on what GTAs feel they need in order to be effective instructors. He found through surveying GTAs that they felt they needed professional development because they experienced frustration due to a lack of preparation for duties and responsibilities; a lack of direction and feedback from coordinators; difficulty working with apathetic students; a lack of faculty support; a lack of methods to motivate and involve students; and a lack of methods to cope with their dual role as student and instructor. Worthen (1992) cautions that it is important to provide GTA professional development at a reasonable pace so GTAs are not overwhelmed by the information. Additionally, the National Science Foundation Standards recommend a reform effort involving a pedagogical shift from teacher-centered to student-centered instruction if the goal of education is to be the teaching of higher-level cognitive skills with less emphasis on rote memorization of facts (NRC, 1996). The impetus for GTA professional development is that educational researchers' findings indicate that student achievement is linked to teacher preparedness (Rutherford and Ahlgren, 1990; Yager, 1992; Darling-Hammond, 1996; NRC, 1996; Von Secker and Lissitz, 1999; Stronge, 2002). Therefore, if the goal is to increase student achievement, then it is important to provide GTA professional development. Graduate teaching assistants were originally used to help regular instructors teach courses and were mentored by these instructors. During the 1970s, with increasing college enrollments and reduced university budgets, GTAs were moved into the role of course instructors (Hendrix, 1995). Once GTAs began to move into these new roles, it became apparent that many of them were ill-prepared for them. In 1991, Syracuse University initiated a comprehensive GTA training program called "The Future Professorate Project."
The goal of this project was to prepare GTAs for their role as future teachers (Druger, 1997). This project was subsequently expanded to include other universities and was named "Preparing Future Professors: A New York State Consortium Project" (Druger, 1997). The Consortium Project included items deemed important by professors and lab coordinators for GTAs to know as they began their assignments as instructors. Their recommendations were for GTA training to include a mandatory university-wide orientation that covered videotaping and critiquing, teaching at the university level, ideas for effective presentations, the dual role of the GTA as teacher and student, diversity issues, encouraging active learning, sexual harassment, dealing with commonly encountered instructional problems, enhancing student-teacher interactions, and the professional portfolio. The training also included concurrent sessions covering laboratory instruction, testing and grading, critiquing student work, leading recitations and discussions, designing a syllabus, and using technology in the classroom. Full-time faculty and graduate teaching fellows led all of these professional development sessions. Following the general orientation, three days were devoted to departmental training. These sessions focused on departmental responsibilities, professional development seminars, language instruction for international GTAs, mid-course feedback surveys, a consultation survey, and eligibility to compete for Outstanding GTA awards. Many of these issues relate to teaching biology. Most beginning GTAs in the biological sciences are assigned to the introductory course, which served as the training ground for developing competency in teaching. These courses enrolled over 600 students per semester, mostly first-year non-science students, and were taught using audiotapes, lab work, lectures, and recitations.
The GTAs' role was to teach two recitation sections per week, supervise students in the audiolabs for four hours per week, proctor exams, conduct review sessions, and contribute to the course in informal ways. The goal of this training was to transform the GTAs into confident, knowledgeable, effective teachers who enjoy the opportunity to help students learn (Hendrix, 1995; Druger, 1997; Lawson, 2002; Worthen, 2002). Most of these GTA professional development strategies have been recognized as important by other institutions. There appears to be a great deal of agreement among institutions that use GTAs as instructors as to what should be included in GTA professional development, but there is a lack of quantitative empirical evidence showing that professional development translates into increased student achievement. Most research, including this study, used opinion questionnaires given to GTAs, students, or faculty to determine professional development effectiveness (Worthen, 1992; Frymier, 1993; Mandeville, 1993; Takalkar et al., 1993; Main, 1994; Hendrix, 1995; Boyle and Boyce, 1998; Savage and Sharpe, 1998; Chinn and Hilgers, 2000; Byrnes, 2001; Crawford, 2000; Roehrig et al., 2003). GTAs also suggested they should receive a core of lecture notes, workshops describing course elements, and department goals and objectives. They also felt they needed to read the course textbooks and have all materials used for teaching, an organized activity file, and earlier notice of their assignment in the summer. Additionally, they needed instruction on how to gain respect, and ready access to coordinators and faculty (Worthen, 1992; Druger, 1997; Savage and Sharpe, 1998).
Qualitative data suggest that improved GTA professional development results in greater student achievement (Worthen, 1992; Frymier, 1993; Mandeville, 1993; Takalkar et al., 1993; Main, 1994; Hendrix, 1995; Boyle and Boyce, 1998; Savage and Sharpe, 1998; Chinn and Hilgers, 2000; Byrnes, 2001; Crawford, 2000; Hampton and Reiser, 2002; Roehrig et al., 2003). According to Lawson (2002) and his colleagues at Arizona State University, there is a direct correlation between teacher preparation and student achievement. Lawson's findings showed that teachers who received professional development in how to incorporate the recommendations of the American Association for the Advancement of Science for teaching science (Table 3-1) significantly increased student learning. Their research also showed that teachers need experience and clarification on the nature of science to improve classroom instruction. They indicate that most of the teachers with whom they work do not fully understand the nature of science, which they describe as the processes used to "do science": the scientific method and critical thinking. This means GTAs needed professional development to prepare them to be effective classroom instructors.

Table 3-1. Principles of effective teaching as described in "Science for All Americans" (AAAS, 1989; cited in Lawson et al., 2002).

Teaching should be consistent with the nature of scientific inquiry:
- Start with questions about nature
- Actively engage students
- Concentrate on the collection and use of evidence
- Provide historical perspective
- Insist on clear expression
- Use a team approach
- Do not separate knowing from finding out
- De-emphasize the memorization of technical vocabulary

Teaching should reflect scientific values:
- Welcome curiosity
- Reward creativity
- Encourage a spirit of healthy questioning
- Avoid dogmatism
- Promote aesthetic responses

Teaching should aim to counteract learning anxieties:
- Build on success
- Provide abundant experience in using tools
- Support the role girls, women, and minorities have in science
- Emphasize group learning

Science teaching should extend beyond the school.

Teaching should take its time.

Additionally, McComas and Cox-Peterson (1999) indicate that many teaching assistants have limited teacher preparation (American Association of Colleges, 1985; Nyquist and Wulff, 1996). Allen and Rueter (1990) and our own observations indicated that many first-year graduate students are apprehensive about taking on the role of instructor. Other researchers have documented the lack of GTA preparedness for teaching (Monaghan, 1989; Weimer, 1990). Additionally, Druger (1997) recommends that GTAs receive instruction and practice in pedagogy because they will be the future professors teaching undergraduates. This project developed from the experiences of others and my own observations that GTAs are not prepared to take on the role of instructors without professional development and ongoing mentoring from qualified staff. The ISB 208L course is a stand-alone laboratory course, not associated with a lecture course. This course (Applications in Biological Science Laboratory) included an embedded lecture and recitation, with the laboratory in one three-hour block that met once per week. The course moved from confirmatory laboratory exercises to inquiry and problem-based exercises.
This required graduate teaching assistants to shift from being lecturers to facilitators, a role for which they usually had no experience or training. Therefore, they required professional development to be effective in this role, especially given the short time available to prepare for their teaching responsibilities. The teaching assistants were the primary instructors for the course and the people with whom the students interacted on a daily basis. GTAs were the backbone of the laboratory program, and if they were under-prepared to teach the program, the program would fail to meet expectations (Druger, 1997; February, 1997; Heady, 2001; Levin, 2001; French and Russell, 2002; Lawson, 2002). This professional development was also important to the GTAs for their growth toward an academic career. Many of the elements of this program are similar to those presented by Worthen (1992) and Druger (1997) and used by other institutions (Worthen, 1992; Edwards, 1993; Mandeville, 1993; Takalkar et al., 1993; Main, 1994; Hendrix, 1995; Druger, 1997; Savage and Sharpe, 1998; Von Secker, 1999; Guthrie, 2000; Hayward, 2002; Luft, 2003; Lumsden and Morgan, 2003). These programs, and our program at Michigan State University, included orientation, open communication, resources and support, GTA meetings, evaluation, and videotaping. The ISB 208L GTA professional development used all of these items and added instruction and practice in pedagogy. Druger (1997) also recommended providing rewards for excellent instruction, having GTAs take a teaching methods course, and having them attend a GTA lecture series where they practice presenting their research. A GTA teaching award was provided by the CISGS to reward GTAs for excellence in contributing to the ISB 208L course. It was suggested that GTAs take a teaching methods course, but it was not required. The ISB 208L GTAs also were not required to take a teaching methods course, but we felt such instruction was important for GTA professional development.
Many of the GTAs did not have the time, University support, or desire to take another course (personal observation and communication with GTAs); therefore, pedagogy was added to both the two-day orientation and the weekly lab meetings.

Methods

Based on the recommendations of educational research, a GTA professional development program was developed to provide GTAs with all of the tools and information they needed to teach the laboratory course. Our program included a two-day orientation program, weekly laboratory meetings, classroom observations, mentoring, and detailed course pedagogical practice and materials. Professional development provided orientation to university and course philosophies and objectives; instructor responsibilities; modeling of, and practice in using, various teaching methods; development and use of assessment tools; formative and summative teacher evaluations; and an open-door policy for immediate and frequent contact with experienced instructors. The ISB 208L course coordinators implemented a professional development program consisting of the following elements: (1) a two-day in-service orientation, (2) weekly lab meetings throughout the semester, (3) classroom observations, (4) mentoring (access to and support from experienced instructors), (5) detailed course and pedagogical practice and materials, and (6) course assessment. Each of these components is described in the following sections. We developed and used a GTA survey to determine whether the professional development was meeting expectations and to identify areas for improvement.

Two-day Orientation Meeting

The two-day orientation session provided GTAs with a wide range of background materials needed to teach the laboratory course effectively. Included in this orientation were the objectives of CISGS, GTA rights and responsibilities as instructors, safety concerns, use of technology, teaching methods, grading, and much more.
Copies of orientation materials are available from the author. The goal was to help GTAs be comfortable, capable, and prepared to handle any commonly encountered situation as the primary classroom instructor. Course coordinators provided ongoing mentoring and professional development by observing classes, conducting weekly laboratory meetings, maintaining an open-door policy, and encouraging GTAs to discuss any problems or concerns at any time. Many teaching assistants were teaching for the first time and needed help getting started and feeling comfortable managing the demands of a classroom. Some of these GTAs had been undergraduate students just one semester before they were assigned to teach. GTA professional development began prior to the start of the semester with a two-day in-service orientation, where teaching assistants were provided with a great deal of information, including University, Integrative Studies, and course objectives, philosophies, policies, and procedures. Additionally, GTAs were provided with an overview of course design, various educational pedagogies, and introductions to course laboratory exercises and equipment. The laboratory coordinators also worked to develop a rapport with the GTAs that encouraged them to seek help and provide feedback at any time throughout the semester. Role-playing exercises were added to the orientation to aid GTAs in dealing with the most common situations they were likely to encounter during the semester. GTAs were provided an orientation manual, formal training in all aspects of the course, and an opportunity to ask any questions and express any concerns regarding teaching, the course, expectations, or evaluation. The two days of orientation concluded with an overview of the first two laboratory exercises. The lab coordinators presented lab exercises in which they modeled various ways to conduct the laboratory, explained and had the teaching assistants use lab equipment, and explained course content.
This professional development was more extensive than that previously provided in laboratory courses, according to Dr. Larry Besaw, the previous lab coordinator/instructor of record. The ISB 208L GTA professional development covered all of the areas from the previous laboratory courses, but extended this professional development to include more experience in four major categories: course management, teaching suggestions, pedagogy, and administration, as listed in Table 3-2.

Table 3-2. Areas covered in the ISB 208L GTA professional development program (course management, teaching suggestions, pedagogy, and administration).

Course Management:
- Review and discussion of course objectives
- Lab preparation
- Teaching preparation lists
- ORCBS (health and safety)
- Student privacy issues
- Make-up policies
- Weekly lab meeting requirements
- Lesson plans for teaching the laboratories
- Course syllabus
- Use of classroom technology
- Study skills

Teaching Suggestions:
- Dressing for success
- Reasons for getting to know their students
- How to obtain and use class exercises, Microsoft PowerPoint presentations, etc.
- Attitude toward the course and students
- Self-introductions
- Video consultation
- Qualities of good instruction
- Qualities of good teaching
- Learning styles
- Field lecture techniques
- Use of rubrics

Pedagogy:
- Appropriate ways to deal with problems
- How to teach the labs
- How to rewrite lab exercises
- Use of Blackboard
- How to deal with classroom conflicts
- How to deal with academic dishonesty
- Sexual harassment guidelines
- Classroom management
- Dealing with disruptive students
- Evaluation procedures and tools

Administration:
- Discussion of job requirements
- Formal and informal evaluation tools
- The University Teaching Assistant Handbook
- How to reach the coordinators
- Emergency procedures
- How to suggest course improvements
- When and how to communicate with the coordinators
- Maintenance of grades and attendance records
- Union cards
- Drops and adds
- Incompletes

GTAs gained the greatest respect from their students when they were able to answer frequently asked student questions, such as the reasons for course content and objectives. Answers and training for all questions frequently asked by GTAs were included in the orientation manual. Each of the teaching techniques and issues was presented, discussed, and modeled whenever possible and appropriate by the laboratory coordinators. Michigan State University provides a campus-wide GTA orientation program, similar to those at other universities (Future Professorate Project, 1995; New York State Consortium Project, 1996; Auxier et al., 2000; Lumsden and Morgan, 2003; Tervalon and Breslow, 2003), that all GTAs are encouraged to attend. This orientation is meant to provide GTAs with general university information, resources, and other items of a general nature. It also provides some pedagogy and other classroom management suggestions and techniques. It was initially assumed that most, if not all, GTAs would participate in the University-wide general teaching assistant training sessions for all beginning and returning GTAs.
After surveying teaching assistants, it was apparent that this was an incorrect assumption and that not all teaching assistants attended these general University training sessions. Only about one third of the GTAs attended the University orientation sessions during Fall 2002 and 2003. Therefore, the ISB 208L GTA professional development program incorporated general university information into the two-day orientation program to ensure the GTAs were adequately prepared to teach the laboratory course. GTAs were given a detailed job description (Appendix J) to help them understand their responsibilities, coordinator expectations, and how they would be evaluated. This job description was developed with the implementation of the ISB 208L program to conform to the requirements of the GTA Union rules, which required university departments to provide clearly stated job descriptions, consistency in teacher evaluation, and ongoing mentoring. The teaching assistants had varying teaching backgrounds, from none to several semesters (average overall teaching experience 3.3 semesters, 2.5 for ISB 208L; GTA surveys, Table 3-3). It is important to note that some GTAs taught for as many as eight semesters for the ISB program, while most taught for only one or two semesters. Also, these GTAs ranged from those who had taught at other universities or for other departments that provided little professional development to those whose departments provided a great deal of professional development.

Table 3-3. Graduate teaching assistant survey results for SS03, FS03, and SS04.

Graduate Teaching Assistant Survey Summary (SS03-SS04)

Directions: Please rate the following questions as honestly as you can on a scale from 1 to 5, with 1 being strongly agree and 5 being strongly disagree. At the end of this document, you will find space for additional comments. Please feel free to add any constructive comments about the course, training, and coordinators as desired.

Respondent characteristics:
- Number of GTA respondents: SS03 16; FS03 12; SS04 9
- Semesters teaching (mean): SS03 3.6; FS03 4.8; SS04 3.3
- Semesters teaching ISB 208L (mean): SS03 2.2; FS03 2.8; SS04 2.1
- Current status (M = master's, P = PhD): SS03 7M, 9P; FS03 6M, 6P; SS04 7M, 2P
- Gender: SS03 7F, 9M; FS03 9F, 3M; SS04 6F, 3M
- Age (mean of categories: (1) 18-22; (2) 23-30; (3) 31+ years old): SS03 2.1; FS03 2.1; SS04 2.0

Questions (each followed by Ave; SS03; FS03; SS04 mean ratings):
1. The two-day teaching assistant orientation prepared me to teach ISB 208L. (1.9; 2.1; 1.8; 1.7)
2. The two-day teaching assistant orientation was too long. (2.6; 2.3; 2.8; 2.9)
3. The weekly lab meetings helped prepare me to teach the next week's laboratory exercise. (1.5; 1.6; 1.4; 1.4)
4. The lab coordinators were approachable. (1.1; 1.0; 1.0; 1.3)
5. The lab coordinators helped me feel comfortable teaching the lab exercises. (1.3; 1.2; 1.1; 1.6)
6. The lab meetings helped me to be more comfortable in teaching the next week's lab exercises. (1.5; 1.6; 1.3; 1.6)
7. The lab meetings helped me to develop/improve my teaching skills. (2.1; 2.2; 2.0; 1.9)
8. The lab coordinators helped me avoid/cope with difficult situations in teaching the laboratory exercises. (1.4; 1.4; 1.5; 1.4)
9. The lab coordinators helped me cope with difficult students in teaching the laboratory exercises. (1.5; 1.4; 1.6; 1.4)
10. The lab manual was easy to use. (2.3; 2.6; 2.0; 2.3)
11. The lab manual covered relevant science information. (1.9; 2.3; 1.5; 1.7)
12. The lab manual addressed the objectives of the course and the Center for Integrative Studies. If not, please explain why and how it could be improved to do so in the space below. (1.8; 1.9; 1.7; 1.7)
13. The coordinators' office locations made it convenient and encouraged me to ask questions and obtain feedback from the coordinators. (1.2; 1.1; 1.4; 1.1)
14. The grading format of the course was easy to understand and explain to students. (1.4; 1.2; 1.3; 1.7)
15. The make-up policies for the course were fair and easy to enforce. (2.0; 2.1; 1.8; 2.1)
16. The assignments helped force students to attend the laboratories. (1.9; 1.9; 1.9; 1.7)
17. The Microsoft PowerPoint presentations helped me to teach the course. (1.3; 1.4; 1.3; 1.3)
18. The Microsoft PowerPoint presentations helped me understand to what depth material should be covered when teaching. (1.5; 1.5; 1.3; 1.7)
19. The Microsoft PowerPoint presentations helped me feel more comfortable in teaching the labs. (1.5; 1.4; 1.6; 1.4)
20. The use of Blackboard has helped me teach students. (2.0; 1.9; 1.9; 2.4)
21. Feedback from peers during lab meetings has increased my comfort level in teaching the labs. (1.8; 2.1; 1.7; 1.6)
22. Training for use of technology equipment, such as computers, computer projectors, VCRs, overheads, etc., has been adequate. (1.4; 1.4; 1.4; 1.4)
23. Grade and attendance forms have helped reduce my workload and aided in keeping accurate records of student performance. (1.5; 1.4; 1.3; 2.0)
24. The coordinators' open-door policy has helped me to do my job, by allowing me to get answers to questions and solutions to problems as needed. (1.1; 1.1; 1.1; 1.1)
25. The lab rooms are adequate to teach the course. If you answered no, how should these rooms be modified to allow for better instruction? (1.8; 1.6; 1.8; 2.1)
26. Supplies and resource materials were always available for each laboratory exercise. (1.3; 1.3; 1.4; 1.2)
27. Teaching itineraries/lesson plans helped me improve my instruction. (1.8; 1.4; 1.9; 2.2)
28. Having common assignments is a good idea and helps maintain consistency across all course sections. (1.6; 1.4; 1.4; 2.3)

Overall mean: SS03 1.6; FS03 1.6; SS04 1.7

Two methods were used to evaluate GTAs. The lab coordinators conducted the GTA evaluations using a formative review form (Appendix K) to provide feedback that helped the instructors grow as professional educators, and a summative evaluation form used for final GTA evaluation at the end of each semester. The two lab coordinators observed classes on a regular basis to provide feedback to GTAs throughout the semester.
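The "Ave" column in Table 3-3 appears to pool the three semesters weighted by their respondent counts (16, 12, and 9) rather than averaging the semester means directly; this reading is inferred from the published values (e.g., question 2 rounds to 2.6 only under weighting), not stated in the survey itself. A minimal sketch of that arithmetic, under this assumption:

```python
# Sketch of the arithmetic behind the Table 3-3 "Ave" column. Per-semester
# values are means of 1-5 Likert ratings (1 = strongly agree, 5 = strongly
# disagree); pooling by respondent count is an inferred reading, not a
# documented procedure.

RESPONDENTS = {"SS03": 16, "FS03": 12, "SS04": 9}

def pooled_average(semester_means):
    """Respondent-weighted mean across semesters, rounded to one decimal."""
    total = sum(semester_means[sem] * n for sem, n in RESPONDENTS.items())
    return round(total / sum(RESPONDENTS.values()), 1)

# Question 2 ("orientation was too long"), semester means from Table 3-3
q2 = {"SS03": 2.3, "FS03": 2.8, "SS04": 2.9}
print(pooled_average(q2))  # reproduces the published Ave of 2.6
```

The same computation reproduces the published averages for questions 1 (1.9) and 28 (1.6), which is why the weighted reading is shown here.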
The coordinators attempted to make GTAs comfortable with these evaluations and made it clear that they were intended to help them grow as instructors. GTAs were informed that they would be evaluated on how well they fulfilled their teaching assignment as outlined in the job description. Evaluation was further explained by providing GTAs, during the two-day course orientation, with copies of the evaluation forms used during classroom and post-semester evaluations. Aspects of teaching evaluated on both the formative and summative forms included punctuality, grading, presentation of materials, classroom management, professionalism, and promptness in returning graded materials. The post-semester evaluation was a summative evaluation that rated GTA overall performance for the semester and became a part of the teaching assistant's permanent file. The specific requirements used for summative evaluation of GTAs are detailed on the teaching assistant evaluation form (Appendix L). The only problem with this approach was that there was insufficient time to adequately evaluate all 15-20 GTAs each semester. This was a time-consuming process and needs reconsideration to determine another way to accomplish the task. One way to solve this problem would be to use excellent, experienced GTAs as mentors to help evaluate inexperienced GTAs in addition to coordinator evaluations. Other educational researchers recommend having GTAs evaluate other GTAs and themselves to achieve the greatest learning gains and understanding of the art of teaching (Druger, 1997; Levin, 2001; NRC, 2001; French and Russell, 2002; Lawson et al., 2002). Additionally, GTAs passed out an informal evaluation to their students during the fifth or sixth week of classes asking for feedback on how they were doing as instructors.
The informal evaluations were not viewed by the course coordinators and were only to help the GTAs determine how students perceived their instruction to that point in the semester. This feedback allowed them to make changes to improve the course and their teaching during the last half of the course. Beginning Spring 2003, the lab coordinators began videotaping GTA performance in the classroom for self-evaluation. GTAs were encouraged to meet and discuss their teaching performance with a coordinator or other experienced instructor. Videotaping was offered to GTAs on a voluntary basis, and the videotapes became the property of the teaching assistants. In past semesters, GTAs could request videotaping by The Center for Teacher Assisted Training at Michigan State University, if desired. Videotaping by the training service was a voluntary, confidential, formal service, which provided the teaching assistants with information as to what they could expect, followed by an analysis and discussion of their overall instructional performance with an experienced instructor. Providing videotaping on a voluntary basis encouraged GTAs to be taped, making it a low-risk form of assessment. Many institutions required their GTAs to be videotaped and critiqued by supervisors and peers to help improve instruction (New York Consortium Project, 1996). We offered this as a training option, but did not require it, as many GTAs were very uncomfortable being taped and we simply lacked the time to spend taping and critiquing these sessions.

Weekly Laboratory Meetings

All teaching assistants were required to attend a two-hour laboratory meeting each week, where they were introduced to the upcoming week's laboratory exercise, given time to review past performance, provided additional pedagogical methodologies, given time to interact with their peers to evaluate and solve problems, and able to provide feedback to the lab coordinators about the course and other concerns.
During the weekly meetings, GTA discussions with the coordinators covered the previous week's lab, addressed concerns, shared things that worked, helped identify problems, and generated suggestions for course, laboratory, or instructor improvement. The ISB 208L meetings included other techniques for dealing with specific teaching problems. For example, meetings addressed the disruptive student, various student learning styles, facilitation techniques, rubric use, etc. There was a tradition that weekly lab meetings would begin at 3:00 p.m. and end by 5:00 or 5:30 p.m. The final part of these sessions evolved over the six semesters of this study. Initially, the lab coordinator(s) presented the next week's laboratory exercise and modeled how best to teach it, with very little required participation from the teaching assistants. In subsequent semesters, GTAs were required to team-teach the next laboratory exercise to the other GTAs during the weekly laboratory meeting. The intent was to pair an experienced GTA with an inexperienced GTA, one having never taught the laboratory. This sounded like a very good idea, but in practice it was only minimally better than having one of the coordinators do the teaching. Even when GTAs were required to meet to prepare their lesson, they often failed to meet. The primary factors that prevented GTAs from meeting and preparing were having different schedules, not knowing one another, waiting until the weekend just prior to teaching to prepare for the next laboratory exercise, and the low priority they placed on preparing to teach for their fellow GTAs (personal communication from GTAs). Another problem with this approach was that when the GTAs knew which laboratory they were going to teach, they prepared only for their assigned laboratory exercise.
To overcome these drawbacks, a lottery system was implemented to determine which GTAs would teach on a given day: each teaching assistant's name was placed into a hat, and two names were randomly drawn on the day of the laboratory meeting. This required all of the GTAs to prepare and be ready to teach when they arrived at the weekly laboratory meeting, and it was much more effective in compelling GTAs to be ready to teach (personal observation). However, there were still problems with this system, as it was time consuming, making it harder to cover other pedagogical and procedural materials in a two-hour laboratory meeting. The system helped alleviate some of the GTAs' anxiety about teaching a lab for the first time, but it was difficult for the GTAs to coordinate their teaching in a short time. It was also difficult for a new teaching assistant to effectively teach the laboratory exercise to their peers; GTAs were not as familiar with the laboratory exercises as the lab coordinators, making them less efficient in presenting materials to the other GTAs. To ensure content consistency across all sections of the ISB 208L course, GTAs were originally given a Microsoft PowerPoint presentation that they were required to use, with no modifications, when teaching the laboratories. In subsequent semesters, GTAs were allowed and encouraged to alter the PowerPoint presentations, but within the following parameters: they could change the order of the slides or add to the presentation so long as they maintained the concepts in the original presentation; they could not go into excessive content depth; they had to keep the lecture under thirty minutes; and any added, changed, or deleted information had to support the CISGS objectives for science education. The freedom to change the PowerPoint presentations encouraged GTAs to take ownership of the course (Table 3-3), and they began to provide substantial input for course improvement.
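The meeting-day lottery described above is simply sampling two names without replacement from the roster: because no one knows in advance who will be drawn, every GTA has the same incentive to prepare. A minimal sketch (roster names are hypothetical placeholders):

```python
import random

def draw_presenters(roster, k=2):
    """Randomly select k distinct GTAs from the roster to teach the
    practice lab at the weekly meeting (sampling without replacement,
    as with names drawn from a hat on the day of the meeting)."""
    return random.sample(roster, k)

# Hypothetical roster for illustration; the course had 15-20 GTAs per semester
roster = ["GTA-A", "GTA-B", "GTA-C", "GTA-D", "GTA-E"]
presenters = draw_presenters(roster)
```

With a roster of n GTAs, any individual faces a 2/n chance of presenting in a given week, which is what made deferring preparation risky.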
They were allowed to present course content in a way that fit their individual personalities, and to add information based on their expertise, which helped make the presentations better for them and their students. GTAs were encouraged to share their PowerPoint changes with their fellow teaching assistants. Observations made during laboratory meetings showed that GTAs had a difficult time determining how long it would take to cover a specific topic, the strategies to use to convey ideas and concepts, and the equipment or other resources needed to accomplish the task. To aid GTAs in preparing to teach, the lab coordinators required GTAs to prepare an itinerary/lesson plan for the next lab period. These were submitted to the coordinators before the related lab meeting, allowing the coordinators to offer feedback on the teaching plan and help guide the GTAs in preparing to teach the laboratory. This requirement improved instruction to some extent, but failed to meet the expectations of the coordinators, as there was insufficient time for the coordinators to provide feedback to all GTAs before the laboratory meeting. Another essential component of GTA professional development was the lab coordinators' open-door policy, which encouraged teaching assistants to walk in and informally discuss any concerns or provide feedback on the laboratory materials, lab manual, or instructional methodology. Because the lab coordinators were readily accessible, GTAs provided a great deal of feedback on instructional materials, as well as student comments or problems regarding the course, and other comments or concerns. GTAs were encouraged to look at their SIRS forms (Appendix C) and evaluate their teaching based on student comments. These comments usually provided more useful information than the multiple-choice questions, as they indicated specific areas that worked well or that were problematic and needed improvement.
Teaching assistants evaluated comments to determine whether they were justified and warranted attention. If there was a pattern to student comments, teaching assistants were to examine their teaching methods and take steps to improve them, and the lab coordinators offered specific suggestions for improvement. This discourse between teaching assistants and coordinators allowed for continuous course improvement by altering instruction, course materials, requirements, or administration. Some comments addressed a specific situation in which a student liked or disliked something, and these were sometimes useful. Such comments included complaints about the way a teaching assistant talked with students, how the GTA presented course materials, or how a student grade was determined. These types of comments indicated that a different approach should be used during student interactions, that a change was needed in the information provided in the course syllabus to improve clarity, or that additional professional development was needed. The lab coordinators reviewed all SIRS to determine what students thought about the course. This information was used to identify specific problem areas associated with course materials, delivery, or other course components related to specific GTA issues, and was used to construct or modify GTA professional development for the next semester. Additionally, trends in the SIRS data were used to determine whether course modifications were moving the course in the desired direction.

GTA Surveys

One area where assessment was inadequate (until Fall 2002) was in obtaining formal feedback from teaching assistants on the effectiveness of their pre-semester and ongoing laboratory professional development, their evaluation of the lab manual and PowerPoint presentations, and their overall perceptions of the course.
A new GTA evaluation survey tool was developed and first used (Table 3-3) during Fall 2002 to remedy this oversight. We developed our own GTA survey because those in the educational literature did not provide the information desired for course improvement (Tufts University, 1993; University of Washington, 1996; Montana State University, 2000; Brown University, 2003; Redwine, Trevalon, and Breslow, 2003). For example, these forms asked GTAs to provide their level of experience or interest in holding office hours, assisting in large lecture classes, working with students with diverse backgrounds, or teaching laboratories. Others asked for information on the kind and quality of training GTAs received, how they felt about this training, or how often they attended staff meetings (Angelo and Cross, 1993; Sawada et al., 2000; Wright, 2000). While this information may be important for understanding GTAs, more specific information was needed, such as answers to questions like: did the GTA professional development provided increase your comfort level in teaching laboratories, and were the materials needed to teach laboratory exercises available? (See GTA Survey Table 3-3 for additional questions.)

Results and Discussion

The GTA survey form focused on six specific areas of interest for improving the ISB 208L course: the two-day GTA orientation; weekly laboratory meetings; laboratory coordinators; administration; technology, materials, and facilities; and the lab manual. These aspects were the most critical in helping improve the overall course. Survey questions were scored from 1 to 5, with 1 being the best score and 5 being the worst (Table 3-3).

Two-day Orientation and Lab Meetings

Responses to GTA Survey Question 1 indicated that the two-day orientation meeting was effective in helping GTAs become prepared to teach the laboratories, with a three-semester mean of 1.9.
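The per-question means reported here can be reproduced by straightforward averaging; a minimal Python sketch with hypothetical response data (only the 1 = best, 5 = worst scale comes from the survey itself):

```python
from statistics import mean

# Hypothetical responses to one survey question across three semesters;
# 1 = best, 5 = worst, matching the scale described above.
responses_by_semester = {
    "Fall 2002":   [1, 2, 2, 1, 3],
    "Spring 2003": [2, 1, 2, 2, 1],
    "Fall 2003":   [1, 2, 1, 3, 2],
}

# Mean rating per semester, then the three-semester mean of those means
semester_means = {sem: mean(r) for sem, r in responses_by_semester.items()}
three_semester_mean = round(mean(semester_means.values()), 1)
```

Averaging the semester means (rather than pooling all responses) matches the "three semester mean" phrasing used in the results below, though the dissertation does not state which of the two was computed.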
However, GTAs indicated on Question 2 that the orientation meeting was a little too long, giving it a mean of 2.6. GTA survey Questions 3, 6, and 7 indicate GTAs felt the weekly afternoon lab meetings helped them prepare to teach the next week's laboratories, with three-semester means of 1.5, 1.5, and 2.1, respectively. Question 21 results indicated that GTAs found the feedback from peers during weekly lab meetings valuable, giving it a mean rating of 1.8.

Lab Coordinators

Teaching assistants indicated on GTA survey Questions 4, 5, 8, 9, 13, and 24 that the lab coordinators were approachable, helped them to be comfortable teaching, and aided them in dealing with difficult situations and students. Moreover, the proximity and availability of the laboratory coordinators to the laboratory rooms encouraged GTAs to talk with the coordinators. GTAs gave these questions mean ratings of 1.1, 1.3, 1.4, 1.5, 1.2, and 1.1, respectively.

Administration

Questions 14, 15, 16, 23, 27, and 28 asked GTAs about course administration. These questions addressed the course grading format, make-up policies, assignments, grading, itineraries, attendance policies, and having common assignments for all students. GTAs rated course administration highly, giving these questions mean ratings of 1.4, 2.0, 1.9, 1.5, 1.8, and 1.6, respectively.

Technology and Facilities

GTAs felt that the PowerPoint presentations helped them to be comfortable teaching and to understand the depth of course coverage, as indicated in Questions 17, 18, and 19, with means of 1.3, 1.5, and 1.5, respectively. They also explained during laboratory meetings and informal conversations that the PowerPoint presentations helped them become comfortable teaching and knowing to what depth they should teach material. Additionally, they said they liked being able to add or delete material from these presentations, and to share their presentations with their fellow instructors.
GTAs indicated that they felt Blackboard (a web-based instructional site) helped students learn, giving Question 20 a mean rating of 2.0. However, in discussions, GTAs told the coordinators that they would prefer Blackboard not be used, as they felt it was too much work for them and the students. GTAs indicated on Question 22 that they received adequate instruction in the use of the technology equipment used in the course, giving it a mean of 1.4. GTAs indicated that the lab rooms were adequate for teaching this course (Question 25, mean rating of 1.8), and that supplies and other materials were available for them to use in teaching (Question 26, mean rating of 1.3). Overall, the ratings for technology and facilities show that GTAs were provided with adequate technology training and materials.

Lab Manual

The lab manual received an overall good rating from the GTAs. They indicated on Question 10 that the lab manual was easy to use, with a mean rating of 2.3. They felt the lab manual covered relevant science information (mean rating for Question 11 of 1.9) and addressed the CISGS objectives (mean rating for Question 12 of 1.8). The lab manual probably received an overall lower rating than other elements of the course for several reasons, including typographical errors, changes to worksheets and activities, and the inclusion of more lab exercises than were covered in the course. Comments from students and GTAs indicated they wanted to use the lab manual as printed, with no errors, substitutions, or deletions. As previously stated, this course has undergone many changes, and the manual has been rewritten every semester in an attempt to keep up with them. A major problem has been that the lab manual had to be sent to the publisher in May in order to be available by the first day of classes for fall semester.
This meant that any changes discovered late in the semester were not incorporated into the next semester's lab manual.

Conclusion

The teaching assistants were the linchpins of the success or failure of the ISB program. They made or broke the program through their training, enthusiasm, classroom remarks, body language, and other behaviors. Students consistently rated GTAs as very good to excellent on SIRS forms (1.4 on a scale from 1 to 5, with 1 being a high and 5 a low score) even when they felt the course was lacking in some other element. The ISB 208L professional development program consisted of an extensive list of activities, discussions, materials, and procedures used to prepare GTAs for their role as primary classroom laboratory instructors. Included were such items as university policies, CISGS objectives, various pedagogies, laboratory safety, and other items detailed in Table 3-2. We evaluated the effectiveness of this professional development program using GTA surveys, SIRS (see Chapter 4, Tables 4-3 and 4-7), pre/post test scores (Tables 2-6 and 2-7), and informal feedback from conversations with GTAs and students. The GTA course survey provided invaluable feedback regarding all aspects of the course. The survey helped guide efforts to improve the overall ISB 208L program and to determine whether GTA professional development increased student achievement of CISGS objectives for undergraduate education. Based on the survey used to evaluate professional development (GTA Survey, Table 3-3), teaching assistant professional development improved the operation of the undergraduate non-major general biology program. The GTAs rated the overall ISB 208L program highly. According to the survey results, the GTAs believed this professional development helped them become more comfortable in teaching the lab (GTA survey Questions 1, 2, 3, 6, 7, and 21; Table 3-3).
GTA survey results indicated it is important that any future lab program incorporate an open-door policy and proximity of the coordinators to the lab rooms in its design. In place of a lab coordinator, an experienced mentor could be available to help guide GTAs in teacher preparation. It was important that the lab coordinators encouraged GTAs to interact with them to solve both individual and course problems (GTA survey Questions 4, 5, 8, 9, 13, and 24; Table 3-3). GTAs also indicated that the lab manual, PowerPoint presentations, and course administration were working and helped them to teach the course (GTA survey Questions 10, 11, 12, 14, 15, 16, 17, 18, 19, 20, 23, and 27; Table 3-3). The GTAs also indicated they felt they received sufficient training in the use of technology equipment (GTA survey Question 22; Table 3-3). It was important to keep a balance to ensure GTAs received all the information they needed to be effective the first day of class without being overwhelmed. GTAs left the orientation meetings feeling confident to teach, able to answer any question asked of them by students, and knowing who the resource people were who could help them with any problem they encountered throughout the semester. The coordinators tried to ensure that, by the end of the orientation session, GTAs knew the coordinators well enough to be comfortable asking any question or requesting whatever assistance they required. Based on the GTA surveys and informal discussions with GTAs throughout the semester, modifications to the GTA professional development orientation program were made to address issues of concern to the coordinators or GTAs. The coordinators attempted to proactively manage all aspects of the course so that students received the best possible education. Because GTAs were the key to the success of the program, a great deal of time and effort was placed on ensuring the GTAs were prepared to teach the course.
Lawson (2002) determined that student achievement correlated with GTA preparedness. The results cited here support the conclusion that GTA professional development succeeded in preparing the GTAs to teach the ISB 208L course. This study indicates that the GTAs viewed the ISB 208L course favorably. The GTA survey rankings combined with the pre/post multiple-choice and essay test results indicated that students improved their use of process skills, the primary goal of this course. However, even though student pre/post course multiple-choice and essay test scores improved both within and across semesters, it is not clear that graduate teaching assistant professional development caused this improvement in student achievement. These results suggest that GTAs felt prepared to teach, that they effectively delivered course materials, and that students achieved CISGS objectives, indicating that GTA professional development is a worthwhile experience and important in subsequent student achievement. Additionally, these results support our hypothesis that course iterations improve the laboratory course. Teaching assistants indicated they felt better prepared to teach, more comfortable in the classroom, and had a better understanding of CISGS objectives and expectations for undergraduate science education as a result of professional development (GTA survey forms, Table 3-3). The feedback from teaching assistants was that they felt very well prepared to teach each laboratory exercise (GTA surveys, Table 3-3; informal conversations during laboratory meetings). They received both initial and ongoing professional development throughout the semester in effective educational pedagogy from an instructor of record. This study supports the findings of other researchers that GTAs require professional development (Worthen, 1992; Hendrix, 1995; Druger, 1997; Lawson, 2002; Worthen, 2002). Professional development appeared to increase their comfort and effectiveness in teaching the laboratories.
The data collected in the GTA survey indicate the GTAs desired ready access to, and feedback from, the lab coordinators and their peers. GTAs indicated they felt anxiety about taking over the classroom and assuming the role of instructor. Providing PowerPoint presentations, course materials, itineraries, helpful teaching suggestions, and other information during orientation and weekly laboratory meetings prepared them to teach the course. This study and other educational research studies support the use and effectiveness of GTA professional development in preparing GTAs to take on the role of college instructors. Other educational researchers have suggested that GTAs be videotaped, critique themselves and their peers, and be required to take a science methods course. Although this study did not address these recommendations, the lab coordinators and many GTAs involved in this study would recommend these aspects of professional development be addressed in the future. This study also supports other educational research findings that it is important to provide GTAs with a wide array of professional development in order to achieve the greatest gains in instructor effectiveness. One area not addressed in this study, and clearly lacking from the educational literature, is the use of empirical evidence to show that GTA professional development translates into effective instructors and increased student achievement. It is difficult to establish a direct link between GTA professional development and student achievement. One possible research study that might provide quantitative empirical evidence would use two groups of inexperienced GTAs, providing one group extensive professional development including pedagogy and the other group only a brief introduction to the laboratory exercises. Student achievement could be measured using pre and post course tests, student interviews, and GTA observations.
The two groups could then be compared to see whether there is greater student achievement in the classrooms where GTAs put the professional development into practice, as shown by differences in pre/post test scores. A second study could provide all GTAs the same extensive professional development. Educational researchers could make classroom observations to determine whether the GTAs put the recommended best practices into use in the classroom. GTA perceptions, student interviews, and student achievement on pre/post tests could then be compared to determine whether GTAs' use of best practices affected student achievement. A third study would provide one group of GTAs minimal professional development, basically just enough to familiarize them with the lab exercises and the lab program, and the other group extensive professional development, including pedagogy. The GTAs would be tracked over several semesters to see if there is any difference in student achievement and in GTA performance and perceptions. This would help identify whether it is experience, professional development, or a combination of both that improves instruction. If GTA professional development is worthwhile, one would expect GTAs receiving it to become comfortable and to improve their teaching in fewer semesters than those not receiving it. It would also be expected that student achievement would be more similar across classrooms where GTAs received professional development than across those where they had not. There are many qualitative studies showing the importance of GTA professional development, but few quantitative studies linking GTA professional development to student achievement.
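The first proposed study could be scored as a simple difference in mean pre-to-post gains between the two GTA groups; a minimal sketch in which the group labels and all scores are invented for illustration, not data from this study:

```python
from statistics import mean

def mean_gain(pre, post):
    """Mean per-student improvement from pre-test to post-test."""
    return mean(b - a for a, b in zip(pre, post))

# Hypothetical percent-correct scores for students taught by each GTA group
pd_group = {"pre": [42, 55, 48, 60], "post": [70, 78, 69, 81]}   # extensive PD
control  = {"pre": [44, 53, 50, 58], "post": [58, 64, 61, 70]}   # brief intro only

gain_pd = mean_gain(pd_group["pre"], pd_group["post"])
gain_control = mean_gain(control["pre"], control["post"])
difference = gain_pd - gain_control
```

A real comparison would of course also need a significance test and matched student populations; this sketch only shows the gain-score arithmetic the proposed design rests on.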
This is probably due to the difficulty of finding an assessment tool that clearly links GTA professional development with student achievement, and to the ethical issues of having a control group of GTAs provided with minimal professional development. Based on these considerations, the most reasonable of the suggested studies is the second, in which researchers would provide professional development, assess whether GTAs put these best practices into use in the classroom, and compare student achievement using multiple-choice or essay tests and student interviews. This study would have the fewest ethical problems, but would require a large investment of researcher time. There is still a need for quantitative studies demonstrating that GTA professional development increases student achievement. Our study provided some indication that there is a correlation between GTA professional development and student achievement. It also indicates that professional development produces greater consistency between sections of the same course. Therefore, the recommendation from this study is to continue GTA professional development and to develop additional means of assessment to determine the role GTA professional development plays in student achievement.

CHAPTER 4

COMPARISON OF PREVIOUS LABORATORY COURSES (ISB 202L AND 204L) WITH THE CURRENT LABORATORY COURSE (ISB 208L)

Introduction

This chapter is a comparison of three non-major general biology laboratory courses at Michigan State University having mean course enrollments around 800 students in 32 sections per semester and using graduate teaching assistants as primary instructors.
This was a time-series study that compared the previous laboratory courses, ISB 202L (Applications of Environmental and Organismal Biology) and 204L (Applications of Biomedical Sciences), with the current course, ISB 208L (Applications in Biological Science Laboratory), to determine if one course format was better than the other at achieving CISGS objectives for integrative studies undergraduate education. The six criteria used for analysis/comparison were:

1. Attendance
2. Withdrawal/drop/completion rate
3. Mean grades
4. Syllabus analysis
5. Test question cognitive levels
6. SIRS

This chapter examines the hypothesis that the ISB 208L course, based on the laboratory model (Chapter 2), is more effective than the previous laboratories (ISB 202L/204L) in helping students achieve the CISGS objectives for undergraduate science education. Attendance, course completion rate, enrollment, SIRS, average grade issued, average test question cognitive level, course syllabi, and pre/post course test results were compared to determine if there were significant differences between these courses. Inherent in these items are the following hypotheses:

1. Attendance is higher in ISB 208L than in the previous laboratory courses.
2. The course completion rate for ISB 208L is greater than for the previous laboratory courses.
3. Enrollments will increase in subsequent semesters if the new ISB 208L course is positively received by students.
4. SIRS questions 1, 2, 3, 7, 9, and 10 rankings will improve (move closer to 1) for the ISB 208L course in subsequent semesters.
5. Average grades issued for the ISB 208L course will remain high and be better than those for the previous lab courses.
6. ISB 208L test and assignment question cognitive levels will be higher than those of previous laboratory courses and will improve over subsequent semesters.
7. There are no significant differences in course syllabi (specifically in determination of course grades) between ISB 208L and previous laboratory courses.
8. Pre/post multiple-choice and essay test scores will improve over subsequent semesters.

ISB 202L and 204L Course Descriptions

ISB 202L (Applications of Environmental and Organismal Biology) and 204L (Applications of Biomedical Sciences) were taught through Spring 2000 and were replaced by ISB 208L (Applications in Biological Science Laboratory) in Fall 2000. ISB 202L and 204L were similar in their design and delivery. ISB 202L traditionally had an average beginning enrollment of 557; ISB 204L had an average enrollment of 379. Both courses were linked to lecture courses of the same number. One problem was that lecture courses of the same number varied significantly in content coverage, making alignment with the laboratory activities difficult or impossible. For example, one section of ISB 202 could be taught by a plant physiologist who emphasized plant physiology, while another section taught by a zoologist might emphasize vertebrate taxonomy. GTAs from three departments (Plant Biology, Entomology, and Zoology) taught the ISB 202L and 204L laboratory courses. The teaching assistants were usually graduate students working toward masters or doctoral degrees, or upper-level undergraduate students. Quarter-time GTAs were required to teach one lab section, half-time GTAs two sections, and three-quarter-time GTAs three sections. Each section enrolled up to 32 students. ISB 202L and 204L taught content with a heavy emphasis on vocabulary, with only an implied intent of helping students develop process skills to link course concepts. A senior faculty member supervised teaching assistants and coordinated all laboratory activities in conjunction with their normal university teaching and research load.
The coordinator ensured the smooth operation of the laboratory, was the faculty of record for the course, provided GTA professional development, ordered supplies, and dealt with student and GTA issues. The coordinator was assisted by senior GTAs who acted as assistant coordinators to conduct laboratory meetings, help prepare laboratory exercises, and supervise daily lab operations. The GTA coordinators sometimes had an undergraduate or graduate student to assist them in laboratory preparations. Over time, two teaching assistant lab coordinators managed the ISB 202L and 204L lab programs, respectively, with a senior faculty member acting as advisor and instructor of record for the courses. The lab coordinator, in conjunction with the assistant coordinators, wrote quizzes, tests, and laboratory exercises, and determined lab activities. The instructor of record worked closely with the coordinators to ensure delivery and consistency of course materials, and to resolve grade and student issues related to the courses. There were two separate lab manuals for ISB 202L and 204L. These lab manuals were written by the lab coordinator, copied in the Center for Integrative Studies main office, and distributed as a course pack/lab manual. GTAs taught course material as they wanted, using the guidelines for instruction provided by the coordinators. In the ISB 202L and 204L courses there had always been a weekly afternoon meeting that presented what teaching assistants would be teaching in the next week's laboratory session. The GTAs were required to participate in the laboratory meetings, one for each course, lasting about two hours each. Various topics were covered during laboratory meetings.
During these laboratory meetings the laboratory coordinator or one of the assistant coordinators explained the key points to address in the upcoming laboratory, discussed potential problems, explained how to use laboratory equipment, provided detailed information regarding lab safety, discussed teaching strategies, and addressed other questions or concerns. The ISB 202L/204L GTAs gave a brief lecture to provide students with background explaining the concepts and reasons for doing the laboratory exercises. Teaching pedagogy was not emphasized, and GTAs were allowed to alter what they did within the classrooms so long as they covered the key points indicated by the faculty of record/lab coordinator. This sometimes led to discrepancies in what was covered in each laboratory section. Some GTAs provided excellent instruction, while others spent much of their time lecturing and did not complete laboratory exercises, or completed the laboratory exercises in the shortest possible time and let their students leave early. Additionally, GTAs often wrote their own quizzes and worksheets, which varied greatly in difficulty. Laboratory classes met once a week for two hours. In the ISB 202L and 204L courses, all students worked on a single activity for the entire laboratory period. In each lab session, students worked through some combination of a laboratory exercise, a quiz, and writing a lab report/worksheet. Students were given a mid-term and final exam that included a laboratory practical portion. There were weekly quizzes designed to determine if students had learned the previous week's material and if they were prepared to complete the current lab exercise. In addition, students were assigned position papers to encourage them to read current literature about course topics. For each position paper, students were required to do a short write-up consisting of a worksheet that asked a few questions.
Each of these components varied among sections, as GTAs varied in what and how they taught each laboratory. GTAs attempted to return papers and other work in a timely manner. Most of the laboratory exercises were of the confirmatory type, leading students to a specific laboratory answer. The lab coordinator provided GTAs with guidelines as to what content they were supposed to cover each week, but there was no formal means of assessment used to determine if the GTAs covered what was assigned other than the mid-term and final exams, which were common among all sections. The major difference between the ISB 202L/204L and ISB 208L course models is that there was only informal intent placed on meeting or assessing the CISGS objectives in ISB 202L/204L, but a clear focus on objectives and assessment in ISB 208L (personal communication from Dr. Larry Besaw, ISB 202L/204L Coordinator). Overall, apart from inconsistency across course sections and teaching with a focus on lower cognitive level thinking skills (a focus on narrative that does not require critical thinking), the ISB 202L/204L model appeared to work at teaching biology content knowledge (personal communication from Dr. Larry Besaw, ISB 202L/204L Coordinator). The ISB 202L/204L course model used the CISGS objectives (Table I-1) and did not have separate written course objectives as the ISB 208L model did.

ISB 208L Course Description

ISB 208L (Applications in Biological Science Laboratory) was developed during Spring 2000 and first taught Fall 2000. It was designed to address several issues in replacing the two existing laboratory courses, ISB 202L and ISB 204L (see Appendix B for course descriptions). The major driving factors in developing the new ISB 208L laboratory course were alignment of CISGS objectives with assessments, content and grading consistency, and working to increase course cognitive level.
The goal of the new course was to align course content with CISGS objectives for undergraduate education and to assess course effectiveness (see Table 4-1 for a comparison of course and CISGS objectives). Educational research findings (AAAS, 1989; Wiggins and McTighe, 2000; NRC, 2001; Stronge, 2002) support linking lecture to laboratory exercises/activities as the best way to accomplish these objectives. At the time the course was developed, there was concern about a lack of correlation between material presented in the lab and lecture of the same course number. The 208L model uses real-world examples to explain science concepts, embedded assessment to evaluate student achievement of CISGS objectives, and course assessments to evaluate and continuously improve course delivery to achieve the greatest gains in student learning (AAAS, 1989; NRC, 1997). The ISB 208L course design provided a clear link between the embedded course lecture and the laboratory materials and allowed for immediate and relevant feedback from the instructors. The resulting 208L lab manual, PowerPoint presentations, and instructor professional development provided a clear focus on using rather than memorizing information.

Table 4-4. Course Format Comparison: Comparison of ISB 202L, 204L, and 208L, showing various parameters for each course. Note that ISB 202L/204L were identical in format, while ISB 208L differed in number of enrolled students, course focus, alignment with CISGS objectives, types of assessments used, and the inclusion of a lecture as part of the laboratory program.
Parameter | ISB 202L | ISB 204L | ISB 208L
Credits | 1 | 1 | 2
# hours meeting | 2 | 2 | 3
Mean # students | 557 | 379 | 827
Management | Faculty of record and two assistant GTA coordinators | Faculty of record and two assistant GTA coordinators | Faculty of record and co-coordinators
Link lecture to lab | Lecture and lab with same course number; content and objectives not necessarily linked | Lecture and lab with same course number; content and objectives not necessarily linked | Stand-alone; no link to any lecture course; embedded lecture
Objectives | Vocabulary driven | Vocabulary driven | Concept/process driven
Focus | Instructor centered | Instructor centered | More student centered
Assessments used | SIRS, GPA, % course completion, mean course enrollment | SIRS, GPA, % course completion, mean course enrollment | SIRS, GPA, course completion, pre & post multiple-choice and essay tests, GTA surveys
Student attendance records | Unavailable or not kept | Unavailable or not kept | Kept and available
Design to focus on CISGS objectives | No | No | Yes
Number of students per section | 32 | 32 | 32

ISB 208L had a mean enrollment of 827 for Fall 2000 through Spring 2003, with increased enrollments every semester since its implementation in Fall 2000. The last enrollment for ISB 208L included in this study was Spring 2003, with just over 900 students. There were major changes in the 208L course design to meet the stated CISGS objectives. First, the number of credits earned was increased from one for ISB 202L or 204L to two for ISB 208L. Secondly, ISB 202L and 204L each met for two hours, while 208L was designed to meet for three hours once a week, combining a two-hour lab activity period with a one-hour recitation period. The expectation was that by including the recitation with the lab, students would gain from immediate and relevant feedback on laboratory work and questions, which significantly increases student learning and long-term retention of course material (Cash, 1993; Taconis et al., 2001).
The ISB 208L program employed a full-time laboratory coordinator whose job was to manage all aspects of the laboratory program and to teach at least one lab section. The laboratory coordinator/faculty of record worked closely with the CISGS Director to ensure the ISB 208L program aligned with CISGS objectives for undergraduate science education. Teaching one lab section each semester helped identify problem areas in course design and delivery. The lab coordinator also directed policy, wrote curriculum, provided teaching assistant professional development, set up laboratory exercises, maintained laboratories, dealt with student and teaching assistant issues, assessed the laboratory program, and worked jointly with the assistant coordinator to ensure smooth laboratory operation. An assistant coordinator was hired to be responsible for daily operations of the laboratory, to help in teaching assistant professional development, and to help resolve student issues. Teaching assistants did the majority of teaching and student assessment. Graduate teaching assistants collected and kept records of student grades and attendance for the ISB 208L course. See Figure 4-1 for an organizational chart for the CISGS undergraduate non-majors biology program.

[Figure 4-1: Organizational chart for Integrative Studies in General Science, showing the Director of the Center for Integrative Studies in General Science above the Lab Coordinator & Faculty of Record and the Assistant Lab Coordinator.]

In implementing the ISB 208L course, a new lab manual was designed, clearly linking the previously stated course objectives to course delivery and assessment. Originally, the intent of the ISB 208L course design and lab manual was simply to incorporate as much of the ISB 202L and 204L course material into the 208L course as possible. The first ISB 208L lab manual was locally published to allow for ongoing course revisions.
During the first semester of use, teaching assistants and students made it clear that the lab manual covered far too much material. Therefore, the lab manual was modified to reduce content coverage, and a custom publisher was used to improve its overall quality. There also was an apparent change in the philosophy of the university administration regarding undergraduate non-major biology science education (personal communication from Dr. Duncan Sibley, Director, Center for Integrative Studies). Science content was no longer the major driving force in the development of laboratory exercises. ISB 208L began with the same content emphasis as the previous lab courses, but evolved to emphasize process skills, with less breadth and more depth of content coverage. Therefore, the ISB 208L course model intentionally taught process skills using content, as opposed to the traditional instructional method of teaching content with the expectation of improving process skills. The ISB 208L course shifted focus from content knowledge to the improvement of critical thinking skills, the use and understanding of the scientific method/process, the use of facts to formulate logical arguments, data interpretation, relevant laboratories with a clear link to the real world, and increased student appreciation of and enjoyment in studying science. The shift from emphasis on content to process skills is supported by several studies showing that the greatest student learning and improvement of process skills occur when students use content to explain and work with conceptual material (Hart et al., 2000; Taconis et al., 2001; NRC, 2001). In contrast to the ISB 202L/204L course model, the 208L course explicitly included real-world examples to help students connect course content to other disciplines and their lives.
Modifications to the initial design occurred during implementation of the course to include an approximately twenty-minute introductory lecture on course material, twenty minutes of recitation, twenty minutes of summary, and two hours of laboratory exercises during each laboratory period. Additionally, in an attempt to make the course more interesting and to accommodate all students with minimal supplies, the laboratory sessions were divided into two to four mini-lab exercises. ISB 208L students worked on several exercises during the class period. This change increased the management workload on ISB 208L teaching assistants, forcing them to be well organized in order to help students complete all laboratory exercises in the allotted time. Multiple laboratory exercises helped hold students' interest by providing a changing focus during the class period (personal observation). Teaching assistants received instruction during the weekly lab meetings on how to effectively manage such a multi-faceted program, in terms of organizing students and acting as facilitators to keep the labs moving without overwhelming either themselves or the students. Constant feedback from the teaching assistants and classroom observations by the lab coordinators allowed identification of problems and revisions to the laboratory program. The major strengths of the ISB 208L Course Model are the inclusion of a process to identify problems with the course, formal GTA professional development, embedded course assessments, and a formal method to obtain feedback from all members of the learning community. Based on educational research (Angelo and Cross, 1993; Wiggins and McTighe, 2000; Lawson et al., 2002) and the author's own classroom experiences as a high school and college science teacher, embedded course assessments were developed to evaluate both student and course performance.
Modifications of the ISB 208L program have occurred every semester since program implementation in Fall 2000, as assessment has identified areas needing modification to improve instruction.

Laboratory Model Summary

The ISB 202L, 204L, and 208L laboratory course models approached undergraduate biology education in different ways. The three courses began with different overall course objectives, requirements, and instructional pedagogy. ISB 202L and 204L met for two hours per week for 1 credit; ISB 208L met for three hours per week for 2 credits. All three course syllabi show that the weighting of course assignments/assessments was similar, with about 60% of the student grade based on tests and the other 40% on in-class quizzes, lab reports, and position papers. The one exception in these requirements is that ISB 202L/204L used a practical exam, which is not included in the 208L program. ISB 208L incorporated course assessment to drive a continuous course improvement process. ISB 208L changed course focus from emphasizing content coverage and vocabulary to process skills and transferability to the real world. The ISB 208L course also focused instruction on the CISGS objectives, with emphasis on improving student understanding and use of higher cognitive level skills, such as scientific reasoning, critical thinking, and the use of facts to formulate logical arguments. Accomplishing these objectives required improved GTA professional development. ISB 208L provided more extensive graduate teaching assistant professional development than did ISB 202L/204L. The three courses were similar in the number of staff who designed, implemented, assessed, and provided graduate teaching assistant professional development. (See Table 4-1 for a comparison of the three courses.) All three laboratory courses (ISB 202L, 204L, and 208L) had associated logistical problems making course planning and implementation a challenge.
These courses enrolled a large number of students each semester in 32 sections, with mean enrollments of 827 for ISB 208L, 557 for ISB 202L, and 379 for ISB 204L. ISB 208L enrollments increased every semester since its implementation in Fall 2000. The most recent enrollment in this study, for ISB 208L in Spring 2003, was just over 900 students. This meant that developers had to consider room availability and space when designing any activity. Safety and make-up sessions were two additional considerations when designing and implementing the laboratory exercises. Because the 32 sections of the course were offered from 8:00 A.M. until 8:50 P.M. Monday through Thursday, and from 8:00 to 11:40 A.M. weekly, it was important to include the safety of students walking across campus in the dark in the evaluation of laboratory exercises. Additionally, if an outside laboratory exercise was scheduled and rain, snow, or sleet caused cancellation of one or more laboratory sessions, there was no way to reschedule due to time, space, and student and instructor availability. Logistical problems often dictated the laboratory exercises used in the laboratory courses. With the previous comparison of the three courses in mind, an evaluation was attempted to determine if one of the course models was better at meeting CISGS objectives for undergraduate education (Table I-1).

Materials and Methods

This comparison is an attempt to determine whether the traditional vocabulary/content-driven method of instruction is better than, equal to, or worse than the concept-driven method of instruction at achieving CISGS objectives (Table I-1) for undergraduate education. The findings of this course comparison should allow instructors to determine which laboratory teaching approach is best for a non-major general education laboratory biology course.
The following section describes the data collected and used for course comparison to see if the pedagogical differences in the two course models made any difference in student achievement of CISGS objectives for undergraduate science education. The following data were collected to evaluate the ISB 208L course and to make course comparisons: attendance, initial and final enrollment figures, SIRS data, grade point averages, and pre/post course multiple-choice and essay test question data for three years (six semesters), Fall 2000 to Spring 2003 (the last semester for which data are available). Data collected for this study are of two types. The first is pre-existing data collected from students by the College of Natural Science and Registrar's offices. These data consist of SIRS form information, mean course grades, enrollment/withdrawal rates, course syllabi and test questions, and course completion figures. All of these data are aggregate data and are used and reported as pooled data. The second type consists of data collected from the ISB 208L students and GTAs in the recent past, including GTA survey results (detailed description provided in Chapter 3) and pre/post multiple-choice and essay test results (detailed description provided in Chapter 2). The assessment tools fall into two broad categories: formal quantitative and informal qualitative. More weight was placed on formal quantitative means of assessment, and where these were not available, subjective informal means of assessment were included. Over the course of this study, assessment tools were added as needed to move away from informal qualitative tools and toward formal quantitative tools to obtain the desired information. The tools used in this study were: attendance; withdrawal/drop/course completion rates; mean grades; cognitive ranking of test questions; SIRS; student feedback via e-mail; and informal interviews. Course assessment is a very challenging and complex task.
Course coordinators tracked all of the elements listed above and evaluated the various assessment tools to determine their validity and usefulness in providing the information needed for course evaluation. Course assessments provided a formal means to determine whether the course was achieving the desired educational objectives for the ISB 208L course. Modifications were made to course delivery, GTA professional development, assessments, and other components based on data obtained from the formal and informal assessments. A description of the assessment tools and the methods used to obtain data follows. This comparison used available data for these courses and inferred other data from information provided by the previous lab coordinator and graduate teaching assistants familiar with both programs. A major problem encountered in comparing ISB 202L, 204L, and 208L is that many of the assessment tools employed for evaluation of ISB 208L were not used for either ISB 202L or 204L. Therefore, all available information was gathered for the three courses to see if it was possible to make an adequate comparison of student achievement. Data were collected from the following sources and years. SIRS data, grade point averages, and initial and final enrollment figures were collected for ISB 202L and 204L for Spring 1997 through Spring 2000 (the last year they were taught) from records kept by the University Registrar's office. Course syllabi, test questions, and quiz questions used in ISB 202L and 204L were collected from the previous laboratory coordinators' records for Spring 1997 through Spring 2000. For ISB 208L, attendance, initial and final enrollment figures, SIRS data, grade point averages, test and quiz questions, and pre- and post-course multiple-choice and essay question data were collected for all semesters from course inception in Fall 2000 through Spring 2003 (the last semester for which data are available).
Because ISB 208L replaced both ISB 202L and 204L, data from ISB 202L and 204L were combined for comparison with the data from ISB 208L.

Description of Methods and How Used

Attendance

Percent student attendance was calculated by multiplying the number of students in each section by the number of class meetings per semester (13) to derive the number of student-days per section per semester. Then the total number of days missed by any student per section was subtracted from the possible total days. Finally, the total number of days attended was divided by the possible days of attendance and multiplied by 100 to arrive at the percent attendance. Because there appears to be a clear link between student achievement and class attendance, policies were implemented to encourage student attendance in the laboratories. Students were given weekly in-class assignments that were collected during each lab period. They were not allowed to make up missed assignments, except in extenuating circumstances. Therefore, in-class assignments appeared to encourage student class attendance and increased student learning, helping achieve Integrative Studies objectives.

Course Completion/Enrollment/Withdraw Rate

An examination of the rate at which students withdrew from, dropped, or completed each course was another indicator of student satisfaction with the courses. The assumption was that students tend to remain in courses that fit their semester schedule and requirements, in which they feel they can be successful by earning a decent grade for the work done, in which the course material is of special interest to them, or in which they learn the most information. Therefore, student retention is an indicator of student satisfaction with the course. Additionally, if students respond positively to a course, enrollments will increase or remain constant over time as students communicate their satisfaction to other students. Enrollment figure data were obtained for this analysis from the University Registrar's class lists.
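The attendance and completion calculations described in this section reduce to simple arithmetic; the following sketch illustrates them. All counts below (section size, missed student-days, enrollment figures) are hypothetical illustrations, not data from the study.

```python
# Sketch of the percent-attendance and percent-completion calculations
# described in this section. All numbers below are hypothetical.

MEETINGS_PER_SEMESTER = 13

def percent_attendance(n_students: int, student_days_missed: int) -> float:
    """Attended student-days divided by possible student-days, times 100."""
    possible_days = n_students * MEETINGS_PER_SEMESTER
    attended_days = possible_days - student_days_missed
    return attended_days / possible_days * 100

def percent_completion(first_day_enrolled: int, grades_issued: int) -> float:
    """Students receiving a final grade relative to first-day enrollment."""
    return grades_issued / first_day_enrolled * 100

# A 32-student section in which students missed 21 student-days in total:
print(round(percent_attendance(32, 21), 1))
# A course enrolling 700 students on day one and issuing 650 final grades;
# the withdraw/drop rate is the complement of the completion rate:
completion = percent_completion(700, 650)
print(round(completion, 1), round(100 - completion, 1))
```

The same arithmetic applied per section and averaged across all 32 sections yields the semester-level percentages reported in the Results.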
Beginning enrollments were taken as those on the first day of class, and final enrollments as the number of students issued a grade at the end of the course. The course completion rate was determined as the ratio of the number of students completing the course (having received a final grade) to the number enrolled on the first day of the class, times 100. The course withdraw/drop rate was determined by subtracting the percent completion rate from 100%. One caveat of using first-day-of-class enrollment is that there is normal withdrawal from courses as students adjust their course loads, major course requirements, work schedules, and other factors unrelated to our course. Students also added into laboratory sections during the first three weeks of the semester, changing overall course enrollment figures. This shifting was consistent across semesters. Another possible explanation for changing enrollment that was not investigated was the effect of student schedules on course enrollment. Semester to semester, therefore, a higher completion (lower withdrawal) rate in ISB 208L than in ISB 202L or 204L may indicate better course satisfaction, the ability to earn a better grade, or the ability to get a better schedule due to the number of sections offered at all times of the day.

Mean Grades

To make this comparison it was important to determine if the courses were using similar methods of assigning student grades.
Mean course grades were collected from the University Registrar's Office grade reporting forms for all three courses (ISB 202L/204L and 208L) for all semesters ISB 208L had been taught and for the last five years that ISB 202L/204L were taught. The expectation was that by comparing the grades for each course to the types of test questions asked, a comparison could be made of course cognitive level and student achievement. This is a less precise measure for comparing the three courses than if pre/post course test results were available for both course models, as many things are used to determine a course grade. Average grades for the courses for each semester were compared to determine both course difficulty and student achievement as we changed the types of test questions and laboratory exercises. Although course grades are not necessarily a good indicator of course knowledge or achievement (Cross and Frary, 1996; deBeaugrande, 1997; Guskey, 2002), they are used as a standard in education and are required by the University. Therefore, average course grades were used as an indicator of student achievement and to show possible problems with the course, such as misalignment between instruction and assessment tools, i.e., pre/post multiple-choice tests and course content. Average student grades were calculated for each course for each semester using the grades reported to the Registrar's office.

Course Syllabi

Course syllabi were analyzed for all three courses (ISB 202L, 204L, and 208L), showing that all three courses weighted assessments approximately the same, with 60% of the grade determined by tests/exams and 40% of the grade from quizzes, in-class assignments, and group participation. The only difference found between the three courses was that ISB 202L/204L incorporated a laboratory practical exam, which was not used in ISB 208L.
The points for the practical exam in 202L/204L are incorporated into the total exam points in 208L as essay and scientific design questions. There is no way to determine the effect the practical exam had on course cognitive level, and subsequently its effect on student course grades. The practical exam questions in 202L/204L consisted primarily of lower cognitive level questions (recall, identification, and concept recognition). I suggest that higher grades in ISB 208L may be due to a better match between course objectives, delivery, and assessment, and not due to lower cognitive level questions used on assessments. All three courses used similar weighting of course assignments to determine a course grade. There is no way to compare in-class assignments to determine whether ISB 208L gave significantly lower cognitive level in-class assignments, possibly resulting in grade inflation, because those assignments are not available.

Test Question Cognitive Level Rankings

The next comparison among the three courses was to determine whether the test questions addressed student learning based on CISGS objectives for undergraduate education (Table I-1). The results showed that the ISB 202L/204L courses used lower cognitive level questions (mean of 1.4) on class assignments, quizzes, and tests. ISB 208L began in Fall 2000 using questions with a cognitive level mean of 1.4, and over several semesters raised this mean to 2.2 by Spring 2002. This question level mean suggests that the ISB 208L course was addressing the CISGS objectives for undergraduate science education (Table I-1) better than the ISB 202L/204L courses.

SIRS (Student Instructional Rating System)

SIRS forms (Appendix E) are given at the end (usually in the last two weeks) of all courses at Michigan State University and compiled by the Registrar's office. These data are available to faculty to aid in determining student perceptions regarding the effectiveness of various courses.
Student survey (SIRS) data were collected from the Registrar's records for all semesters of ISB 208L, and it was decided to focus on the data available for ISB 202L/204L for the last five years preceding the implementation of the ISB 208L program. The assumption was that the last five years of ISB 202L/204L SIRS data would adequately assess these courses and provide a fair means of comparison between ISB 202L/204L and 208L. Teaching assistants are required to leave the room during the administration of this survey, and a student from the class takes the forms to the main office. Course coordinators and teaching assistants do not see these forms until after grade submission to the Registrar's office. The focus was on six of the 13 SIRS questions (1, 2, 3, 7, 9, and 10; Table 4-2), as these are the most important in helping evaluate the laboratory course in meeting CISGS objectives and improving the ISB 208L course, and because these are of most concern to the Director of the CISGS. SIRS results for the six semesters of this study are shown in Table 4-3.

Table 4-2 SIRS questions used for course comparison.

1.  The laboratory exercises regularly emphasized the significance of science to the non-scientist.
2.  The laboratory exercises emphasized understanding ideas and concepts.
3.  The laboratory exercises encouraged students to think about the relationship between science and society.
7.  The laboratory exercises were intellectually challenging.
9.  I would rate this teaching assistant on a grading scale of 4.0 (very good) to 0.0 (very poor) as follows: (a) 4.0 (b) 3.0 (c) 2.0 (d) 1.0 (e) 0.0
10. I would rate this course (lab only, not lecture) on a grading scale of 4.0 (very good) to 0.0 (very poor): (a) 4.0 (b) 3.0 (c) 2.0 (d) 1.0 (e) 0.0

Table 4-3 SIRS Comparisons for ISB 202L/204L and 208L: Mean SIRS ratings (plus/minus standard deviations) for the six questions of interest for course model comparisons.
p values shown are for two-tailed t-tests assuming equal variances. Choice "a" was scored as 1 and choice "e" as 5 for questions 9 and 10 to provide consistency in viewing results. * designates significant values.

SIRS Item   ISB 202L/204L   ISB 208L           p
1           2.3 ± 0.1       2.3 ± 0.2          0.90
2           2.2 ± 0.1       2.2 ± 0.2          0.50
3           2.4 ± 0.2       2.2 ± (illegible)  0.10
7           2.6 ± 0.1       2.7 ± 0.4          (illegible)
9           1.7 ± 0.2       1.5 ± 0.1          (illegible)
10          2.4 ± 0.1       2.6 ± 0.1          (illegible)

SIRS data were simply tallied and averaged to provide an overall score for each semester and each targeted question (Table 4-3). A low score, close to 1, for a SIRS question indicates that a student strongly agreed with the statement, and a high score, closer to 5, indicates strong disagreement. Questions 1, 2, 3, and 7 are rated on this scale. Questions 9 and 10 are based on a grading scale from 0.0 to 4.0, with 4.0 (letter choice "a") being the best rating for the question. In an attempt to maintain consistency among all SIRS question rankings, the following scale was used for SIRS questions 9 and 10: 1 for "a", 2 for "b", 3 for "c", 4 for "d", and 5 for "e". Using this ranking scale, an ideal score would be 1.0, which would give the teaching assistant or course a 4.0. Differences between SIRS mean values were determined using two-tailed t-tests assuming equal variances.

Results and Discussion

Attendance

For the first two semesters of ISB 208L (Fall 2000 and Spring 2001), students earned attendance points in an attempt to encourage them to attend the lab. These were dropped for the Fall 2001 semester, as there was increased in-class work. Because in-class work was given every class period, students attended class and attendance points were no longer necessary. Having in-class assignments and a strict make-up policy kept mean overall attendance at about 95% for the six semesters of this study, showing that students regularly attended the laboratory classes. Unfortunately, attendance data for ISB 202L and 204L were not available, as attendance was either not taken or the records not kept.
According to the former ISB 202L and 204L coordinator, Dr. Larry Besaw, these courses always had fairly high attendance throughout the time they were taught, at over 90%. Several informal assessments compared student test performance between those students who regularly attended class and those who did not. These assessments showed a clear correlation between attendance and test performance: students who missed only one or two laboratory periods earned 1 to 1.5 grade points more than those who missed more than two periods (unpublished data). These results corroborate the findings of Moore (2003), who showed a direct correlation between student attendance and learning as measured on course assessment tools. An important point is that missed laboratory classes were a minor problem because attendance was so high. In-class assignments appeared to encourage student class attendance and therefore increased student learning, which helped achieve CISGS objectives.

Course Completion/Enrollment/Withdraw Rate

On average over the six semesters, 93% of the students completed and earned a grade for the ISB 208L course. The mean percent completion rates for the two course models are significantly different (p < 0.001) for all semesters examined, with means of 88.4% for ISB 202L/204L and 93.4% for ISB 208L. Mean course enrollments were 557 for ISB 202L and 379 for ISB 204L over the ten semesters examined (Spring 1997 through Spring 2000), and 827 for ISB 208L over the six semesters of this study. ISB 208L enrollments increased every semester since implementation, from 660 in Fall 2000 to 997 students in Spring 2003. It is important to realize that even though the course completion rate and enrollment increased for ISB 208L over ISB 202L or 204L, there are many other factors, such as course familiarity, lack of other courses, or scheduling, that could explain these changes.
It cannot be concluded that these changes are due to student satisfaction with the course. The withdraw/drop rate was 100 - 88.4 = 11.6% for ISB 202L/204L and 100 - 93.4 = 6.6% for ISB 208L. A comparison was made between the ISB 208L course and another general education science course, ISP 203L (Geology of the Human Environment Laboratory), to determine if there were differences in enrollment, rate of course completion, and mean grade issued. This comparison was made to determine whether the trends shown in the ISB 208L course were due to the course model or to some other factor. Table 4-4 and Figures 4-2, 4-3, and 4-4 show a comparison of these data. ISB 208L had high and consistent enrollments, both beginning and final. Both courses show some increase in enrollments, but the ISB 208L course had a very steady increase over the six semesters, while the ISP 203L course varied over the six semesters, until Spring 2003 when it had a huge increase in enrollment. The large increase in 203L enrollment was probably due to several factors. First, the ISB 208L course fills to capacity every semester, and the University increased overall freshman enrollment in Spring 2003. Secondly, ISP 203L changed its course format and aligned it more closely to ISB 208L. Thirdly, students were being encouraged to take the ISP 203L course in order to balance the instructional load between ISB 208L and ISP 203L. The data show that the ISB 208L course is more consistent in enrollments than the ISP 203L course. When linear regression was performed on the enrollment and course completion data, the trend lines explained the data for the ISB 208L course extremely well, with R² values of 0.92, 0.93, and 0.81, while the trend lines did not explain the data nearly as well for the ISP 203L course, with values of 0.46, 0.51, and 0.65, for beginning enrollment, final enrollment, and course completion, respectively.
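The R² values quoted above come from ordinary least-squares trend lines fitted to per-semester figures. A minimal sketch of that computation follows; the enrollment series below is hypothetical, chosen only to show that a near-linear increase yields an R² close to 1 (the study's actual figures appear in Table 4-4).

```python
# Least-squares trend line and coefficient of determination (R²) for a
# per-semester series. The enrollment numbers below are hypothetical.

def r_squared(xs, ys):
    """R² for the least-squares line y = a + b*x fitted to (xs, ys)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    b = sxy / sxx                 # slope of the trend line
    a = mean_y - b * mean_x       # intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - mean_y) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

semesters = [1, 2, 3, 4, 5, 6]                # six semesters of the study
enrollments = [660, 700, 760, 810, 850, 900]  # hypothetical steady growth
print(round(r_squared(semesters, enrollments), 2))
```

Statistical packages (for example, scipy.stats.linregress or numpy.polyfit) would give the same fit; the sketch is dependency-free to make the arithmetic explicit.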
These values indicate there is more inconsistency in the values for the ISP course than for the ISB course, which lends support to the hypothesis of this study that iterations of the course model improve the course in subsequent semesters. The trends shown by the data are the same even though the absolute numbers differ. A comparison of the slopes of the regression lines for these three parameters shows no significant differences, with p values of 0.44 for beginning enrollment, 0.50 for final enrollment, and 0.47 for course completion. These results indicate that the ISB 208L course is not showing increased rates of students dropping the course, failing to enroll in the course, or failing to complete the course, suggesting that students are not avoiding the course. Course completion rates were good for both courses, but ISB 208L had a higher mean completion rate of 93%, compared to ISP 203L with a mean of 85.5%. Figures 4-2 and 4-3 indicate that both courses have good enrollment and course completion. Both courses issued consistent mean grades, with means of 3.3 for ISB 208L and 3.2 for ISP 203L. Note that there appears to be a discrepancy in the number of students enrolled between the data used for this course comparison and that used for model success in Chapter 2. These differences are due to when the enrollment data were collected. The Registrar's office sent data from the first day of enrollment (beginning enrollment) and the number of students issued a grade (final enrollment). The data discussed in Chapter 2 used the first day classes met as beginning enrollment and the number of grades issued as final enrollment.

Table 4-4 ISB 208L/ISP 203L Course Comparisons. Data obtained from the Michigan State University Registrar's Office. Beginning enrollments are from the first day of classes. Final enrollments are the number of students receiving a grade.
ISP 203L
Semester   Beginning enrollment   Final enrollment   % Completion   Mean Grade
FS00       384                    306                79.7           3.2
SS01       506                    439                86.8           3.2
FS01       409                    339                82.9           3.1
SS02       509                    441                86.6           3.1
FS02       427                    377                88.3           3.3
SS03       791                    701                88.6           3.2
Mean       504.3                  433.8              85.5           3.2

ISB 208L
Semester   Beginning enrollment   Final enrollment   % Completion   Mean Grade
FS00       660                    641                97.1           3.4
SS01       878                    792                90.2           3.3
FS01       755                    700                92.7           3.3
SS02       817                    930                113.8          3.3
FS02       846                    950                112.3          3.3
SS03       870                    1013               116.4          3.4
Mean       804.3                  837.7              103.8          3.3

[Figure 4-2 Beginning course enrollments for ISB 208L and ISP 203L.]

[Figure 4-3 Final course enrollments for ISB 208L and ISP 203L.]

[Figure 4-4 ISB 208L and ISP 203L percent course completion comparison.]

Mean Grades

Overall, average student grades, based on a 4.0 scale, were 3.0 ± 0.2 for ISB 202L, 3.0 ± 0.3 for ISB 204L, and 3.3 ± 0.0 for ISB 208L (Table 4-5).
ISB 208L (Table 4-5) grade averages remained constant while the question cognitive level used on course assignments increased each semester. ISB 202L and 204L grade averages are not significantly different (p = 0.64), but both are significantly different from ISB 208L, with p values of 0.02 (ISB 204L vs. ISB 208L) and 0.001 (ISB 202L vs. ISB 208L), respectively.

Table 4-5 Mean grade comparison between ISB 202L, 204L, and 208L.

Semester      ISB 202L    ISB 204L    Semester      ISB 208L
Spring 1997   2.7         na          na            na
Fall 1997     2.6         2.7         na            na
Spring 1998   3.0         3.0         Fall 2000     3.4
Fall 1998     3.1         3.0         Spring 2001   3.3
Spring 1999   3.1         2.8         Fall 2001     3.3
Fall 1999     3.1         3.1         Spring 2002   3.3
Spring 2000   3.1         3.5         Fall 2002     3.3
Mean          3.0 ± 0.2   3.0 ± 0.3   Mean          3.3 ± 0.0

Test Question Cognitive Level Comparison

One hundred fifty questions were ranked, 50 from each course, and the overall averages determined. Subsequent analysis (Table 4-6) showed that it did not matter who evaluated the test questions, as teaching assistant and professional educator rankings of questions were identical, giving a mean question rank of 1.4. This analysis indicates that ISB 202L, ISB 204L, and ISB 208L were similar in the question cognitive level used for student evaluation, with means of 1.3 ± 0.1, 1.5 ± 0.2, and 1.4 ± 0.2, respectively.

Table 4-6 Course Question Cognitive Level Assessment: comparison of mean test item cognitive level for ISB 202L, 204L, and 208L, based on Bloom's Taxonomy.

Course     Rank 1   Rank 2   Rank 3   Mean   Standard Deviation
ISB 202L   1.2      1.4      1.2      1.3    0.1
ISB 204L   1.3      1.7      1.5      1.5    0.2
ISB 208L   1.4      1.6      1.3      1.4    0.2

However, there appeared to be a mismatch between the course objectives and the types of questions asked on the assessments and used in course materials.
The CISGS objectives for undergraduate education are to teach at the intermediate to upper levels of Bloom's scale of cognitive levels (levels 2 and 3), and yet the courses were teaching and assessing students at the lowest cognitive level (level 1). Following this question analysis, it was decided that course tests and materials were not asking enough higher-level thinking questions, and therefore the question cognitive level in ISB 208L was increased. For Fall 2002 and Spring 2003, the mean question cognitive level used on course assessments was raised to 2.2 by choosing questions of the appropriate level from the 150 used in our question analysis and from other sources. Mean course grades issued remained high (x̄ = 3.3 ± 0.0) while the question cognitive level was increased (x̄ = 2.2). These data suggest that the increased average student grades are due not to decreased course requirements or lowered expectations, but to improved course design and implementation. I suggest this increase in student grades is due to better alignment and clearly stated course objectives and assessments, not to reduced requirements or the use of lower cognitive level test questions.

SIRS

One of the CISGS objectives (Table 1-1) and course goal number 5 is to ensure that students leave the course aware of the role science plays in society and their daily lives. As can be seen in Table 4-3, there is no significant difference between these courses in emphasizing the importance of science to the non-scientist. There was also no significant difference (p = 0.60) between the two course models for SIRS question 2, which asked students to indicate whether they feel the course exercises emphasize understanding of ideas and concepts, with means of 2.2 ± 0.2 and 2.3 ± 0.1, respectively.
Student responses also indicated no significant difference (p = 0.1) between the course models for question 3, which asked if they felt the exercises encouraged students to think about the relationship between science and society, with means of 2.4 ± 0.2 and 2.2 ± 0.1, respectively. Table 4-3 shows a significant difference (p = 0.02) between the two course models initially, but they are almost identical after six semesters, with means of 2.6 ± 0.1 and 2.7 ± 0.2. The results for question 7 were significantly different (p < 0.05) between ISB 208L and ISB 202L/204L. This indicates that students who were in 202L/204L thought they were intellectually challenged more than did students who were in 208L. The mean value for the first semester ISB 208L was taught was 3.0, and it has moved closer to one each semester, with the most recent value, for Spring 2003, being 2.7 (Table 4-7). Question 9 indicates that students rated the GTAs in both course models as very good to excellent. The ISB 208L GTAs were rated significantly higher (p < 0.05) than the ISB 202L/204L GTAs, with means of 1.5 ± 0.1 and 1.7 ± 0.1, respectively. This difference between the two course models, even though it is significant, is difficult to interpret. One possible difference shown by the data is that there appears to be less fluctuation in GTA ranking among semesters in ISB 208L than in the ISB 202L/204L programs (Table 4-7). These data indicate that there is a perceived difference between the two course models, but there is no indication of the cause of this difference. Data were not collected that would explain why students felt the ISB 208L GTAs were better than the 202L/204L GTAs. There does appear to be greater consistency of GTA rankings for the ISB 208L model than for the 202L/204L model. I suggest this consistency may be due to the more extensive GTA professional development in the ISB 208L course model.
Additional research is needed to support this conclusion.

Table 4-7 Mean student survey (SIRS) results for ISB 202L/204L, Fall 1997 to Spring 2000, and for ISB 208L, Fall 2000 to Spring 2003. Significance was determined using two-tailed t-tests assuming equal variances. All values shown are means for each SIRS question plus or minus the standard deviation. * denotes values that are significant.

Semester   Course      Q1    Q2    Q3    Q7    Q9    Q10   Students surveyed
FS97       202L/204L   2.4   2.2   2.4   2.5   1.9   2.6   na
SS98       202L/204L   2.3   2.2   2.4   2.7   1.6   2.3   na
FS98       202L/204L   2.2   2.1   2.3   2.6   1.5   2.4   na
SS99       202L/204L   2.5   2.4   2.7   2.7   1.8   2.4   na
FS99       202L/204L   2.3   2.2   2.4   2.6   1.8   2.4   na
SS00       202L/204L   2.1   2.0   2.2   2.6   1.5   2.2   na
Mean                   2.3   2.2   2.4   2.6   1.7   2.4   na

FS00       208L        2.6   2.6   2.5   3.0   1.5   2.9   641
SS01       208L        2.3   2.3   2.2   2.7   1.4   2.7   792
FS01       208L        2.2   2.2   2.2   2.7   1.4   2.6   700
SS02       208L        2.2   2.2   2.2   2.8   1.5   2.5   933
FS02       208L        2.2   2.1   2.1   2.7   1.6   2.5   899
SS03       208L        2.1   2.1   2.1   2.7   1.5   2.6   997
Mean*                  2.2   2.2   2.2   2.7   1.5   2.6   827
Overall Mean           2.3   2.2   2.2   2.7   1.5   2.6

* These means do not include the first semester (Fall 2000).

Analysis of question 10 showed that students who were in the ISB 202L/204L courses ranked them significantly higher (p < 0.05) than the 208L course. These data show a steady improvement in overall course rankings for both course models. The SIRS assessment tool does not provide information as to the reason for student course rankings. Based on the course assessments, there is no way to know the underlying reasons for the differences between the two course models. One possible explanation for this result is that the ISB 202L/204L model is familiar, while the ISB 208L model, with more emphasis on inquiry and problem-based laboratories, is less familiar to students.
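The significance tests for Table 4-7 were two-tailed t-tests assuming equal variances. A minimal sketch of the pooled-variance t statistic, applied to the per-semester question 9 means from the table (the function name `pooled_t` is illustrative; computing the p-value would additionally require the t distribution with n1 + n2 - 2 degrees of freedom):

```python
import math

def pooled_t(a, b):
    """Two-sample t statistic assuming equal variances."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    # sample variances of each group
    sa2 = sum((x - ma) ** 2 for x in a) / (na - 1)
    sb2 = sum((x - mb) ** 2 for x in b) / (nb - 1)
    # pooled variance, weighted by degrees of freedom
    sp2 = ((na - 1) * sa2 + (nb - 1) * sb2) / (na + nb - 2)
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Per-semester means for SIRS question 9 (Table 4-7)
q9_202 = [1.9, 1.6, 1.5, 1.8, 1.8, 1.5]   # ISB 202L/204L, FS97-SS00
q9_208 = [1.5, 1.4, 1.4, 1.5, 1.6, 1.5]   # ISB 208L, FS00-SS03
t = pooled_t(q9_202, q9_208)
```

A positive t here reflects the higher (worse, on this scale) mean ranking for the 202L/204L GTAs.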
An evaluation of the effect familiarity would have on student course ratings was not performed, but should be future work. Another possibility is that the constant course revisions taking place in the ISB 208L course may have caused some student and GTA confusion. Additional explanations for the differences include the difficulty of assignments, GTA attitudes, the quantity of work assigned, and many others. Which of these explanations, or which combination of them, accounts for the differences in course ranking cannot be distinguished with the SIRS rankings or the other assessment tools used in this study. Even though the course ranks are significantly different, the SIRS assessment tool is such a coarse measuring instrument that it is not possible to conclude that one course is better than the other based on these data. The SIRS assessment tool was used because it is the standard assessment used for all courses at Michigan State University and the results of the survey are readily available. It is not clear how to interpret differences between SIRS rankings because there are so many factors that affect course rankings. Slight differences, even if statistically significant, indicate there is a difference between course models or semesters, but give no indication of the cause of the differences. For example, is a difference of 2.4 vs. 2.6 for an overall course ranking between two course models due to course difficulty, GTA ability, types of labs used, time in class, another factor, or some combination of factors? The SIRS data do not provide any indication of why students rank a course a particular way. As another example, what does a difference between GTA rankings of 1.5 and 1.4 mean? There is no way to determine whether the GTAs are better, the course is better liked, or some other factor accounts for the differences in the GTA rankings. Absolute values of these rankings do not tell us much, but when compared across semesters for the same course, they give an indication of trends in the course.
The ISB 208L SIRS data in Table 4-7 show slight mean improvements each semester, moving overall question rankings closer to the ideal score of 1. There appears to be more variability in the ISB 202L/204L rankings, but the changes are small and therefore may simply be an indication of the types of GTAs assigned to teach the laboratories in different semesters. There is no clear trend in the SIRS results, with questions 1, 2, and 3 showing no significant difference, questions 7 and 10 being better for ISB 202L/204L, and question 9 being better for ISB 208L. Therefore, the SIRS results do not show a clear pattern indicating that one course model is better than the other.

Conclusions and Recommendations

In summary, the ISB 202L/204L and 208L laboratory course models approach undergraduate biology education in different ways. The three courses began with different overall course objectives, requirements, and instructional pedagogy. The ISB 202L/204L and 208L courses used the CISGS objectives for undergraduate education as a guide for designing the course activities. However, ISB 202L/204L did not have separate written course objectives used to guide all aspects of the course, while ISB 208L had clearly stated course objectives linked directly to the CISGS objectives (Table 1-1). Having stated course objectives allowed all members of the learning community (students, GTAs, and course coordinators) to clearly focus on the CISGS objectives. ISB 202L/204L met for two hours per week and was worth 1 credit; ISB 208L met for three hours per week and was worth 2 credits. ISB 208L incorporated course assessment to drive a continuous course improvement process, and it changed the course focus from emphasizing content coverage and vocabulary to process skills and transferability to the real world. The ideal score for all SIRS questions used in this study would be a low score, close to 1.
Student survey data showed no significant differences between the previous laboratory courses (ISB 202L/204L) and ISB 208L on SIRS questions 1, 2, and 3 (mean values of 2.3, 2.2, and 2.4 for ISB 202L/204L, and 2.3, 2.3, and 2.2 for ISB 208L, respectively). There were significant differences (p < 0.001) between the SIRS means for questions 7, 9, and 10 (mean values of 2.6, 1.7, and 2.4 for ISB 202L/204L, and 2.8, 1.5, and 2.6 for ISB 208L, respectively). The SIRS responses indicated that students did not perceive a difference between the ISB 202L/204L program and the ISB 208L program in the significance of science to the non-scientist, in the laboratory exercises emphasizing understanding of ideas and concepts, or in encouraging students to think about the relationship between science and society. Comparing student SIRS responses indicates that ISB 202L/204L and 208L course work was about equally intellectually challenging (2.6 vs. 2.8 for ISB 202L/204L and 208L, respectively). Students consistently indicated that all teaching assistants deserved high marks, with ISB 208L teaching assistants receiving the highest scores (1.7 and 1.5 for ISB 202L/204L and 208L, respectively). Students indicated they would rank all of the courses in this study in the middle of the range, with means of 2.4 (ISB 202L/204L) and 2.6 (ISB 208L) on a 4.0 scale. The ISB 208L course model showed an initial low rating (SIRS question 10) of 2.9, but has consistently improved (as shown by decreasing numbers in Table 4-7) each semester as improvements have been made to the course. Student attendance has averaged about 95% for all semesters ISB 208L has been taught. No attendance data are available for the ISB 202L/204L course model, although the former coordinator indicates attendance was always good, over 90%. Course completion has also been very good in all courses, but has improved each semester for ISB 208L, with a mean completion rate of 93%, versus 89% for ISB 202L/204L.
Because of the difficulty in interpreting SIRS rankings for the two course models, and because the rankings are so close, there is not enough difference between the two models to conclude that one is better than the other for any of the six SIRS questions. Even though the data indicate significant differences between the courses in some areas, there is no clear pattern showing one model is better than the other. Therefore, based on SIRS it cannot be concluded that either course model is better at achieving the CISGS objectives for undergraduate non-majors biology education.

The weighting of course assignments is similar among the three laboratory courses, with about 60% of requirements coming from exam scores and 40% from lab reports, in-class assignments, and homework. The major difference between the ISB 202L/204L and 208L requirements is that a laboratory practical exam was part of the assessments for 202L/204L but was not included in the 208L program. However, the mean question cognitive level used on assessments is higher in ISB 208L than in the other courses, while the mean grades earned in the ISB 208L course are higher (3.3) than for ISB 202L (2.9) and 204L (3.0). Mean test question cognitive level began about the same as in the previous courses, but was increased in the ISB 208L course from 1.4 in Fall 2000 to 2.2 in Spring 2003. Based on this analysis, it appears that increasing student performance is due to improved course design, not to lower level questions being asked on tests. In fact, the number of higher cognitive level questions asked on tests was increased every semester ISB 208L was taught, and there were still consistent gains in student performance on all tests and assessments used for evaluation, resulting in overall higher mean student grades (see Chapter 2 for a detailed description of pre/post course assessments).
It is not possible to directly compare the level of difficulty among the three courses, as the assessments used were different, even though assessment weighting was very similar. The results of this study show significant improvements between the beginning and end of the course for each semester of the ISB 208L course model on pre/post course multiple-choice and essay test scores. These scores indicate the ISB 208L program is causing significant gains in student achievement toward the CISGS objectives for undergraduate non-major biology education. Unfortunately, because the same assessments were not used in the ISB 202L/204L courses as in 208L, direct comparisons are impossible. Even though some of the data collected in this analysis show statistically significant differences, there is no clear pattern indicating one course format is better than the other. Therefore, the overall conclusion is that the minor differences found in the current study between the two course models are insufficient to conclude that one course model is better than the other at moving students toward achievement of the CISGS objectives for undergraduate non-majors biology education. This study could have been improved by running both course models (ISB 202L/204L and 208L) in parallel, using the same assessment tools in both models, with a random mix of students and GTAs. However, this type of comparison was impossible because the ISB 202L/204L course was replaced by the ISB 208L course. The findings of this study suggest the ISB 208L course model should continue to be used for teaching undergraduate biology, because the pre/post course test results and GTA surveys indicate the course model is helping students achieve the CISGS objectives. In addition to the formal and informal assessments, conversations with GTAs and students, and our own observations, give us an indication that students are improving their process skills and seeing the relevance of science to society and their lives.
Additionally, this course model shows continuous improvement driven by assessment data. The 208L course was based on the generic course model (Figure 2-1). I suggest the generic model has general applicability to other courses where the goal is to increase student learning through a formal process of course evaluation based on embedded course assessments.

CHAPTER 5 SUMMARY AND FUTURE WORK

This dissertation documents the development and evaluation of a laboratory course model used to improve an undergraduate non-majors laboratory biology program at Michigan State University (MSU). This program had a mean enrollment of 827 students for the six semesters of this study. Additionally, this dissertation compares the new course model with previous laboratory models used to teach undergraduate non-majors biology. The intent of the new laboratory course was to align course delivery with the Center for Integrative Studies in General Science (CISGS) educational objectives, and to show, using quantitative data rather than opinions or feelings, that changes in the course resulted in improved student achievement. Embedded assessments provided data for course evaluation and helped to identify course improvements. This study consists of a summary of educational research findings leading to model development (Chapter 1), a description of the course laboratory model (Chapter 2), assessment results used for model evaluation (Chapters 2 and 3), a detailed description of graduate teaching assistant professional development (Chapter 3), a comparison of previous with current laboratory programs (Chapter 4), and recommendations for future research (Chapter 5). This project was a time series study spanning six semesters that examined two overarching questions:

1. Can an effective laboratory course be developed using teaching assistants as primary instructors and incorporating embedded assessments to drive continuous course improvement?

2.
Do the instructional interventions (a shift from an instructor-centered confirmatory laboratory program to a student-centered inquiry/problem-based exercise program) improve student achievement toward university objectives for undergraduate biology education?

From these questions the following hypotheses were developed and tested during this study:

I. The use of the course model leads to improved laboratory course delivery, resulting in increased student learning.

II. Use of the course model provides an effective mechanism to continuously improve course delivery and student achievement.

III. GTA professional development improves GTA effectiveness, leading to increased student achievement.

IV. The new ISB 208L course model is better than the previous course models at improving student achievement toward the CISGS objectives.

This study answered the two overarching research questions and supports three of the four hypotheses by showing that (1) iterations of the application of the model helped to improve student achievement, (2) GTA professional development leads to improved GTA effectiveness and increased student achievement, (3) GTA professional development improved teaching competence and reduced course problems, (4) application of the model improved course delivery over time, and (5) GTA professional development increases student learning.

This study indicates that an effective laboratory course model can be used in a large undergraduate course, combining embedded assessments, student-centered laboratory activities, and instructor professional development to improve student achievement. The data from this study support the conclusion that the iterative process built into this model is effective in improving the laboratory course and increasing student achievement. This study demonstrated that the new course model achieved increased student learning through clear alignment of objectives, instruction, assessments, and continuous course improvement.
Hypotheses I and II, that the course model leads to improved course delivery and is an effective mechanism to continuously improve course delivery and increase student learning, are supported by this study. The assessments used in this study show significant increases with regard to the CISGS goals for undergraduate biology education. There is a clear increase in test performance within every semester and across semesters. The assessments indicate that student general biology knowledge, use of critical thinking skills and the scientific process, and awareness of the importance of biology and science in their lives and to the world increase by taking ISB 208L. Results for three years of assessments show a statistically significant (p < 0.001) increase in student biology content knowledge, with a weighted mean improvement in student performance on pre/post multiple-choice tests of 26.4%. The pre/post tests were changed every semester to better align them with the CISGS goals for undergraduate biology education. The test questions, course materials, and instructor questions asked in class were changed to increase cognitive levels based on assessment data from previous semesters. The results for the pre/post course essay/problem test assessment, used to determine student use of critical thinking skills and the scientific process, show a weighted mean increase of 19.2% for all semesters. Mean pre/post differences increased from 15.4% to 32% from FS00 through SS03, showing continuous improvement (Table 2-6). Additionally, this course implemented a shift from a predominantly confirmatory laboratory program to a mix of confirmatory and inquiry-type laboratories. This suggests that students are using higher-level process skills (critical thinking, scientific reasoning, and facts to formulate logical arguments) to answer complex essay questions.
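A weighted mean improvement like the figures above weights each semester's pre/post gain by its enrollment, so large sections count proportionally more. A minimal sketch: only the first and last gains (15.4% and 32%) are stated in the text, the four interior gains are hypothetical placeholders, and the weights are the ISB 208L final enrollments from Table 4-4.

```python
def weighted_mean(values, weights):
    """Enrollment-weighted mean: sum(value * weight) / sum(weight)."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# Pre/post essay gains per semester (%): FS00 and SS03 values are from the
# text; the four interior values are hypothetical placeholders.
gains = [15.4, 17.0, 18.0, 20.0, 25.0, 32.0]
# Final enrollments, FS00 through SS03 (Table 4-4)
enrollments = [641, 792, 700, 933, 899, 1013]
wm = weighted_mean(gains, enrollments)
```

Because later semesters had both larger enrollments and larger gains, the weighted mean sits above the simple average of the per-semester gains.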
Additionally, grade point averages have remained high (mean of 3.3 for ISB 208L) while the mean question cognitive level used on course tests has increased from 1.1 in Fall 2000 to 2.2 in Spring 2003, moving from 78.4% low cognitive level questions on the early pre/post tests to 56.5% level 2 and 3 questions on the later tests. There would be an expectation that test scores should decrease with increased test question cognitive level, but this did not happen. Instead, this study showed significant increases in test scores. What other factors could explain these results? It is possible that these students were learning to write in this manner in other courses they were taking. However, the course had a mean enrollment of 827 students per semester with a wide range of majors, almost exclusively non-science majors. The majority of the students were freshmen and sophomores, with a few juniors and seniors, so it is unlikely that they would consistently be learning identical material and improving their writing during the same semester. The GTAs also indicated that they felt the students were learning the material and that they could see the difference in the classroom in the way students formulated ideas and asked questions. GTAs also indicated on their surveys that the course was achieving the CISGS goals for undergraduate education. Student course ranking on SIRS has improved every semester as work continues to align course content, delivery, and assessment. Students indicate, through SIRS data, that the course is relevant to their lives, emphasizes understanding of ideas and concepts, and encourages thinking about the relationship between science, society, and their lives. By analyzing the previous semesters' tests it was possible for the lab coordinators to improve the tests, moving the Kuder-Richardson reliability index closer to 1, from 0.58 to 0.85, over six semesters. This is an iterative process that allows test improvement over time.
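The Kuder-Richardson index mentioned above is, presumably, the standard KR-20 statistic for tests scored item-by-item as right/wrong. A minimal sketch with made-up 0/1 response data (the function name and the data are illustrative, not from the study):

```python
def kr20(responses):
    """Kuder-Richardson 20 reliability for dichotomous (0/1) item scores.

    responses: one list per student, each containing a 0/1 score per item.
    """
    n = len(responses)       # number of students
    k = len(responses[0])    # number of test items
    # sum of p*q over items, where p = proportion answering the item correctly
    pq = 0.0
    for i in range(k):
        p = sum(r[i] for r in responses) / n
        pq += p * (1 - p)
    # population variance of students' total scores
    totals = [sum(r) for r in responses]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n
    return (k / (k - 1)) * (1 - pq / var)

# Hypothetical results: 4 students x 4 items
data = [[1, 1, 1, 1], [1, 1, 1, 0], [1, 0, 0, 0], [0, 0, 0, 0]]
reliability = kr20(data)
```

Values closer to 1 indicate that the items discriminate consistently, which is the sense in which moving the index from 0.58 to 0.85 represents test improvement.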
Students have indicated that course work and assessments are challenging and interesting. The process of using embedded assessment has increased awareness of the course's limitations in providing instruction; identified educational bias and mismatches among objectives, instruction, and assessment; revealed additional needs for GTA professional development; and helped the coordinators improve overall course operation and student achievement. It is important to note that this process focused on only one or two changes per semester, due to time constraints, the potential for confusion, and our knowledge and ability to implement changes. This study used the following working definition of critical thinking: the ability to formulate an argument/solution to a problem based on the use of logic, the scientific process, and facts to support a position. Scientific reasoning is the process of trying to understand/explain the world in a systematic way, using empirical evidence, facts, or models to formulate explanations, make predictions, or draw conclusions about observations in the natural world. Scientific facts are based on objective reality and require peer verification. Chapter 1 included a comparison of the pedagogies used to teach these processes, including the laboratory types and course models most commonly used in classroom situations. These elements were initially used to develop a new course, which over several semesters evolved into a new laboratory course model. The course model was developed based on educational research, learning community suggestions, and the author's experience as an instructor, to better align lecture, laboratory, and CISGS objectives, and to assess student achievement. The new laboratory model became the template for future course improvements. The power of this model is that it provides a formal mechanism to obtain assessment data as the foundation for course evaluation.
This project described the new course model, the rationale for the new model, evidence of model effectiveness, and an explanation of how the new model was used to drive continuous course improvement. The new generic course model (Figure 2-1) includes the following components: (1) the learning community, (2) graduate teaching assistant professional development, (3) course design, (4) course delivery, (5) assessment, (6) a filter (a set of questions used to evaluate the practicality of suggested course changes), and (7) evaluation. The most important features of this course model are:

- The use of assessment results to drive continuous course improvement.
- The incorporation of a process to ensure an open line of communication among all members of the learning community.
- The incorporation of an extensive teaching assistant professional development program to improve course delivery and consistency of instruction across all course sections.
- The incorporation of a standardized curriculum to improve consistency of instruction across all course sections.
- The use of a filter to ensure that any recommended changes can be implemented. The "filter" is a set of questions used by the lab coordinators to determine the usefulness and practicality of laboratory exercises (Table 2-1).

Many GTAs did not come to the ISB 208L course with the skills necessary to teach at the level desired by the university. Therefore the ISB 208L course included GTA professional development to prepare GTAs for their teaching assignments. GTA professional development was changed to align instructor skills with the desired level of course delivery. Rubrics were added to writing assignments to improve student evaluation. The rubrics underwent many revisions based on the recommendations of the GTAs, lab coordinators, and students. The rubrics helped improve grading consistency among sections, decreasing scoring discrepancies in essay test grading from plus or minus 15 points to plus or minus 5 points.
This means that essay scores issued by three different GTAs varied much less after feedback and modification of the rubrics, which led to both improved rubrics and improved GTA professional development. The new ISB 208L course model included a GTA professional development program to ensure GTAs were prepared to teach on the first day of classes. This document describes the reasons for GTA professional development, the creation and use of a GTA survey form, and how professional development was used to improve classroom instruction. This section addressed the question: Does instructor professional development, used in association with the course model, improve student achievement on assessment tools showing progress toward the achievement of the CISGS objectives for undergraduate education? GTAs indicate on the GTA survey and in informal interviews that they like to teach this course and feel it is well aligned with the CISGS objectives for undergraduate education. They also indicate that the professional development provided by the lab coordinators helps them to understand the emphasis for instruction and to be prepared and comfortable to teach the course. GTAs provided invaluable feedback on every aspect of this course over the time of this study. GTAs indicate that the ongoing mentoring they receive from the coordinators and their interactions with other GTAs aid them in developing as instructors. The GTA surveys, pre/post multiple-choice and essay test scores, and informal conversations support Hypothesis III, that improved GTA professional development leads to increased student achievement. Additionally, improved GTA professional development was predicted to improve student achievement of process skills. It was also expected that students would rank GTAs as very effective on SIRS. The results of the GTA survey and SIRS confirm this prediction.
Because GTAs were not used to teaching higher-level process skills, they needed professional development to assist them in teaching these skills and in being comfortable in the classroom and competent to teach them. The amount and depth of GTA professional development were increased each semester and worked to help GTAs develop facilitation skills, questioning techniques, and the other skills necessary to move the classroom from instructor-centered to student-centered and to better coach students in the development of process skills. The GTA survey results support that GTAs felt the professional development helped them to better teach the higher cognitive level skills necessary for success on the course essay tests. Members of the learning community provided feedback about such things as the lab manual, tests, in-class assignments, and attendance that led to course changes. There was great course consistency across sections in all respects. Students came to class, with 95% attendance, and remained in the course, with 93% earning a grade. Additionally, Figures 4-2 and 4-3 show that in the later semesters more students finished the course than were enrolled on the first day of classes. If students did not receive the course positively, there would be an expectation of fewer enrolled students at the end of the course than at the beginning. The ISP 203L course showed the normal drop in enrollment from the beginning to the end of the course, with a mean of 86% course completion. The ISB 208L course had increasing enrollments each semester and larger enrollments at the end of the course than at the beginning, with a mean of 104%. This is an indication that the course has improved due to iterations of the course model, in that if students did not view the course in a positive manner they would not attend class or would drop the course, as there are other courses available to fulfill their undergraduate laboratory requirement (i.e., ISP 203L).
Other explanations for why students remain in the course are that the course is easy or that in-class assignments force them to come to class. These are probably not the reasons for attendance or course completion, in that students can pass the course without attending all classes and, as previously stated, they could take another course to fulfill their requirement. Additionally, many students indicated in the comment section of the SIRS that the course was too much work for the number of credits earned. Information from conversations with students who have taken the course indicates they liked the course emphasis on relevant topics and the classroom discussions. These comments were not quantified. Another group of comments that decreased over the time of this study are those indicating there was too much busy work in the in-class assignments and far too many errors in the course materials. The lab coordinators, with the help of the GTAs, have modified the lab manual, quizzes, tests, papers, and other course materials. The last part of this analysis compared the previous laboratory courses (ISB 202L/204L) with the new course (ISB 208L) to determine if one model is better at helping students achieve CISGS objectives. Hypothesis IV states that the new ISB 208L course model helps students achieve the CISGS objectives better than the previous course models (ISB 202L/204L). Unfortunately, because the same assessments were not used in the ISB 202L/204L courses as in ISB 208L, direct comparisons are impossible. Even though some of the data collected in this analysis show statistically significant differences, they do not indicate a clear pattern showing that one course format is better than the other.
Therefore, the overall conclusion is that the minor differences found in the current study between the two course models are insufficient to conclude that one course model is better than the other in moving students toward achievement of CISGS objectives for undergraduate non-majors biology education. However, there is an indication, in the form of pre/post course multiple-choice and essay test data, GTA survey results, SIRS, mean grades, attendance, and mean question cognitive level, that the ISB 208L course is effectively helping students achieve the CISGS objectives for undergraduate education. Additionally, the ISB 208L course uses higher cognitive level questions (mean of 2.2) to improve process skills (critical thinking, scientific reasoning, and data interpretation). There is also an indication that the ISB 208L course model may be better at identifying problem areas in the course program and in alignment with CISGS objectives, based on GTA surveys, pre/post course test scores, and improving SIRS rankings. This process continues as course evaluation identifies mismatches between instruction, assessment, course materials, GTA professional development, student perceptions, and coordinator knowledge. The greatest problems with this process are the time required for all assessment components, including University approval of the study, data collection, grading of essays and other student materials, test construction, and data analysis. The strength of this process is that it allows for continuous course improvement linking assessment, instruction, and the latest educational research to provide the best student instruction. Course improvement is an ongoing process based on formal assessment and learning community comments. Test questions, pedagogy, and overall alignment are improved at the end of every semester, as assessment data become available suggesting course improvements.
It was observed that everyone involved in this process, from instructors to students, grows from the experience of continuous course improvement. It gives everyone in the educational community ownership in, and a reason for, learning the skills and material presented in the course. Therefore, the findings of this study suggest that use of the ISB 208L course model can increase student use of process skills through a mix of confirmatory and inquiry-type laboratory exercises. Finally, the course model achieves all of the objectives of this project by using assessment to identify mismatches in teaching, assessment, objectives, and other areas, so that improvements in overall course design and delivery can be made to obtain the greatest gains in student learning, thus achieving CISGS objectives for undergraduate education. These results support the conclusion that iterations of the model lead to increased student achievement. The new laboratory course model is applicable to other courses, providing a formal means to drive continuous course improvement, as demonstrated by the research findings in the ISB 208L laboratory course. The findings of this study suggest the ISB 208L course model should continue to be used for teaching undergraduate biology because it helps students achieve CISGS objectives. Our future objectives for the program are to move SIRS mean ranks closer to 1 for the six SIRS questions used in this study and to further improve student achievement on pre/post course multiple-choice and essay tests through better alignment of course objectives and assessments. The ISB 208L course teaching model for non-majors biology laboratory classes can be used as the framework to guide future changes in ISB 208L, as well as in other laboratory courses.

Future Work

There are many areas for further study based on this research project. First, the actual contribution of each part of the model in improving student achievement needs to be determined.
For example, it is important to know how much GTA professional development is warranted, based on the amount of time spent preparing a GTA to teach the course versus the student achievement gained by this work. Additionally, it is important to formally assess the role of learning community interactions in course improvements. This study did not formally assess this interaction, but it appears to be vitally important to overall course improvement. One method to accomplish this would be to interview students and GTAs both pre and post course to determine how they felt about the course and what they gained from teaching or taking it. Student interviews should incorporate both a series of questions asking students for their perceptions of the course and a group of questions that assess learning. Another suggestion is to add more control questions to the pre/post tests to determine the effect this course has on student learning. Test questions should continue to be analyzed for their ability to assess student development of process skills and for question validity. Test questions and GTA survey questions should receive further evaluation and additional formal validation. Another goal is to improve the pre/post multiple-choice and essay tests to better assess student achievement. The ISB 208L model will continuously strive to improve GTA professional development by including more pedagogical modeling in lab meetings, improving the GTA survey, and identifying areas in need of additional time. An addition to future ISB 208L courses will be questions encouraging student engagement with exercise material and review of laboratory exercises posted online in the University's LON-CAPA system. The goals will be to reduce time spent lecturing during laboratory sessions, to better engage students in the learning process, and to improve retention of course materials. The effect of these questions on student achievement will need to be determined.
Other forms of assessment, such as posters, presentations, or other alternative assessments, should be added in place of tests to provide a better measure of student ability to use process skills rather than memorization. Another suggested improvement is to create a generic test that could be used across disciplines and administered vertically over the time a student is at MSU, and even after graduation, to determine if the course is teaching the desired process skills, whether students are learning these skills in other courses, and if the course is increasing material retention rates. Course evaluation needs to incorporate student and GTA interviews to determine if all the course assessments match interview assessments and actual student perceptions. Additionally, videotaping student-instructor interactions several times during the semester and then analyzing these interactions would help to determine whether the frequency and types of questions asked by GTAs and students change between the beginning and end of the course. Quantifying student comments would be another area for improved future course evaluation. There is also a need to determine the effect that familiarity with lab type has on student achievement and acceptance. For example, do students do better in, and/or are they more receptive to, confirmatory, inquiry, or PBL laboratories if they have previous exposure to any of these laboratory types? Finally, there is a need to scientifically evaluate the major laboratory types to determine which one, or which combination, provides the greatest student achievement of process skills. To determine whether the ISB 208L course was encouraging students to enroll and complete the course, a comparison was made with another general education science course, ISP 203L.
Beginning and final enrollment, mean grade issued, and percent course completion were compared between ISB 208L and ISP 203L to determine if the trends shown in the ISB 208L course were due to the course model or to some other factor. Figure 4-4 shows a comparison of these data. Both courses were very similar in consistency of mean grades. ISB 208L had high and consistent enrollments, both beginning and end. Both courses show some increase in enrollments, but the ISB 208L course had a very steady increase over the six semesters, while the ISP 203L course varied over the six semesters, remaining fairly constant until Spring 2003, when it had a large increase in enrollment. The large increase in ISP 203L enrollment was probably due to several factors. First, the ISB 208L course fills to capacity every semester, and the University increased overall freshman enrollment in Spring 2003. Second, ISP 203L changed its course format and aligned it more closely with ISB 208L. Third, students were being encouraged to take the ISP 203L course in order to balance the instructional load between ISB 208L and ISP 203L. Course completion rates were good for both courses, but ISB 208L had a higher mean completion rate of 93% compared to ISP 203L's mean of 85.5%. Figures 4-2 and 4-3 indicate that both courses have good enrollment and course completion. Both courses issue consistent mean grades. The data show that the ISB 208L course is more consistent in enrollments than the ISP 203L course. When linear regression was performed on the enrollment and course completion data, the trend lines explained the ISB 208L data extremely well, with R² values of 0.921, 0.931, and 0.8048, while the trend lines did not explain the ISP 203L data nearly as well, with values of 0.461, 0.5107, and 0.6531, for beginning enrollment, final enrollment, and course completion respectively.
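The R² comparison above can be sketched in a few lines of code. The enrollment figures below are hypothetical placeholders chosen only to illustrate the pattern (a steady trend versus a flat series with one jump), not the study's actual per-semester counts; only the least-squares R² computation itself is standard.

```python
def linear_r2(ys):
    """Fit a least-squares line to (semester index, value) pairs and
    return the coefficient of determination R^2."""
    xs = range(1, len(ys) + 1)
    n = len(ys)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

# Hypothetical six-semester enrollment series for illustration:
steady = [720, 760, 790, 810, 830, 860]   # steady growth, ISB-208L-like
erratic = [400, 380, 410, 390, 405, 520]  # flat, then one jump, ISP-203L-like

# The steady series gives a high R^2 (comparable to the 0.92-0.93 reported
# for ISB 208L), while the erratic series gives a low R^2 (comparable to
# the ~0.46 reported for ISP 203L).
print(round(linear_r2(steady), 3), round(linear_r2(erratic), 3))
```

A high R² here simply means the semester-to-semester values sit close to a single straight trend line, which is why it serves as a measure of enrollment consistency in this comparison.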
These values indicate that there is more inconsistency in the ISP 203L values than in the ISB 208L values, which lends support to the hypotheses of this study that iterations of the course model improve the course in subsequent semesters.

APPENDICES

APPENDIX A

Types of Laboratory and Inquiry Experiences

Labs that are not scientific inquiry

1. Confirmation labs: students follow directions.
   Purposes: Practicing lab techniques; confirming the accuracy of laws and theories.
2. "Consumer reports" labs: students compare products or practices to find the best one.
   Purposes: Reasoning about and developing experimental techniques; practicing evidence-based argumentation and decision making.
3. Application labs: students observe phenomena, then use models and theories to explain what they saw.
   Purposes: Connecting representations at different levels of abstraction; practicing detailed explanations of real-world examples.
4. Design labs (engineering inquiry): students use scientific principles to design systems that accomplish specific purposes (e.g., egg drop lab, building bridges, maximizing crop yield).
   Purposes: Applying scientific theories to practical design problems; building engineering skills.

Activities that include scientific inquiry (finding and explaining patterns in experience)

1. Naturalistic or field inquiry: students look for patterns in observations that they make.
   Examples: Geological or ecological field work; astronomical observations, such as of the sun and moon.
2. Experimental inquiry: students create new experience in the lab, often with planned variation.
   Examples: Systematically observing the products of different reactions; comparing plant growth under different conditions.
3. Data analysis: students look for and explain patterns in "experientially real" data sets that are given to them.
   Examples: Looking for patterns in weather or geographic data; explaining reported results of dangerous experiments.
4. Simulations: students look for patterns and explain results in "virtual worlds" that imitate reality.
   Examples: Models of moving objects or electrical circuits.

APPENDIX B

Course Descriptions

ISB 202L: Applications of Environmental and Organismal Biology Laboratory. Fall, Spring, Summer. 1 credit. C: ISB 202 concurrently. Problem solving activities based on observation and the analysis of empirically derived data from environmental and organismal biology.

ISB 204L: Applications of Biomedical Science Laboratory. Fall, Spring, Summer. 1 credit. C: ISB 204 concurrently. Problem solving activities based on observation and interpretation of selected biological systems in relation to medical science.

ISB 208L: Applications in Biological Science Laboratory. Fall, Spring, Summer. 2 credits. C: ISB 202 or 204 concurrently. Problem solving activities based on observation and interpretation of selected biological systems.

Center for Integrative Studies Academic Objectives**

1. Become more familiar with the ways of knowing in the arts and humanities, the biological and physical sciences, and the social sciences.
2. Develop a range of intellectual abilities, including critical thinking, logical argument, appropriate uses of evidence, and interpretation of varied kinds of information (quantitative, qualitative, text, image).
3. Become more knowledgeable about other times, places, and cultures as well as key ideas and issues in human experience.
4. Learn more about the role of the scientific method in developing a more objective understanding of the natural and social worlds.
5. Appreciate the role of knowledge, and of values and ethics, in understanding human behavior and solving social problems.
6. Recognize the responsibilities and opportunities associated with citizenship in a democratic society and an increasingly interconnected, interdependent world.
** Copied from the Integrative Studies Bulletin, p. 5.

Integrative Studies Courses in the Sciences

ISB/ISP courses emphasize:
- The importance of science for individuals and society and the power of science for the life of the mind.
- Understanding of the methods, results, and limitations of scientific inquiry.
- Exploration of the ways in which science investigates and draws conclusions, the results of such inquiry, using examples, the social impact of science in historical context, and the kinds of issues that science cannot answer.

APPENDIX C

CENTER FOR INTEGRATIVE STUDIES/LABORATORY - FALL 2003

Please rate the course thoughtfully using the following scale:
STRONGLY AGREE 1-2-3-4-5 STRONGLY DISAGREE

1. The laboratory exercises regularly emphasized the significance of science to the non-scientist.
2. The laboratory exercises emphasized understanding of ideas and concepts.
3. The laboratory exercises encouraged students to think about the relationship between science and society.
4. The laboratory manual presented the material clearly.
5. The reading complemented the laboratory exercises.
6. Testing required more than the memorization of facts.
7. The laboratory exercises were intellectually challenging.
8. The laboratory was conducted in a manner consistent with the Code of Teaching Responsibility (below).
9. I would rate this teaching assistant on a grading scale of 4.0 (very good) to 0.0 (very poor) as follows: (a) 4.0 (b) 3.0 (c) 2.0 (d) 1.0 (e) 0.0
10. I would rate this course (lab only, not lecture) on a grading scale of 4.0 (very good) to 0.0 (very poor): (a) 4.0 (b) 3.0 (c) 2.0 (d) 1.0 (e) 0.0

GENERAL RATING. Use the following scale: 1 - Superior; 2 - Good; 3 - Average; 4 - Fair; 5 - Poor

How would you rate the overall quality of the:
11. Laboratory Manual?
12. Reading in the textbook?
13. Laboratory exercises?

PLEASE MAKE WRITTEN COMMENTS ON THE REVERSE SIDE. THESE ARE PARTICULARLY HELPFUL IN EVALUATING THE COURSE.
TRY TO DIRECT YOUR COMMENTS AT ANSWERING THE FOLLOWING QUESTION: WHAT CAN WE DO TO IMPROVE THE QUALITY OF THE LABORATORY?

The Michigan State University CODE OF TEACHING RESPONSIBILITY holds all instructors to certain obligations with respect to, e.g., course content consistent with approved descriptions, timely statement of course objectives and grading criteria, regular class attendance, published office hours, and timely return of examinations and papers. This code is printed in full in the SCHEDULE OF COURSES AND ACADEMIC HANDBOOK. It includes specifics about complaint procedures available to students who believe there have been violations of the code. It is the policy of Michigan State University to keep faculty members' SIRS ratings confidential to the extent possible. However, certain SIRS ratings may be subject to disclosure under state or federal law.

[NOTE: These questions are placed on the SIRS scan-tron template]

APPENDIX D

ISB 208L Pre/Post Lab Assessment

Directions: Do not write on these papers. Write all answers on the scan-tron forms supplied in the lab rooms. Your assignment is to respond to the following questions or statements using only your own current knowledge. This assignment will not be graded as to correct answers, only on whether it was completed by you, using appropriate comments. We wish to know what knowledge you currently possess about the topics you will study in this course, so please take this survey seriously; answer the questions off the top of your head, without help from anyone else, and as accurately as possible.

1. Which one of the following is the single most important factor in the destruction of the environment?
   a. tropical rain forest destruction
   b. the use of fossil fuels
   c. human population growth
   d. pollution (air, water, and land)
   e. using non-recyclable resources
   f. other

2. Which one of the following is the single most important factor in causing the extinction of plants and animals?
   a. habitat destruction
   b. over hunting
   c. pollution
   d. human population growth
   e. other

3. Who is most responsible for pollution in the world today?
   a. heavy industry
   b. oil companies
   c. individuals (like you and me)
   d. loggers
   e. business owners
   f. other

4. The main function of mitotic cell division is
   a. to produce two identical daughter cells
   b. to produce four haploid daughter cells
   c. to produce gametes for sexual reproduction
   d. to produce diploid gametes for cellular reproduction
   e. other

5. The main function of meiotic cellular division is to
   a. produce two identical daughter cells
   b. produce four diploid daughter cells
   c. produce gametes for sexual reproduction
   d. produce diploid gametes for cellular reproduction
   e. other

6. Which one of the following shows the basic differences between plants and animals?
   a. plants contain chloroplasts, are heterotrophic, photosynthesize, and have cell walls
   b. there are no differences between plant and animal cells
   c. plants contain mitochondria, are autotrophic, photosynthesize, and have cell membranes
   d. plants contain chloroplasts, are autotrophic, photosynthesize, and have cell walls
   e. plants contain mitochondria, are heterotrophic, photosynthesize, and have cell walls, and animals do not possess these features

7. Which one of the following shows the products of respiration?
   a. glucose, carbon dioxide, and water
   b. glucose, oxygen, and energy
   c. carbon dioxide, energy, and water
   d. carbon dioxide, energy, and oxygen
   e. none of the above is correct

8. Which one of the following statements best describes the use of energy by organisms on earth?
   a. Energy is collected and produced by organisms
   b. Energy is made and stored by green plants
   c. Energy is borrowed, stored, and released to space
   d. Energy is made by plants and used by animals
   e. Energy is made by the sun, stored by animals, and made by plants

9. Which one of the following best describes the risks of sexual reproduction over asexual reproduction?
   a. Sexual reproduction is safer than asexual reproduction, as more offspring can be produced in sexual than in asexual reproduction.
   b. Asexual reproduction is safer than sexual reproduction, as more offspring can be produced in asexual than in sexual reproduction.
   c. The risks associated with both types of reproduction are approximately equal.
   d. Sexual reproduction increases the chances of species survival due to increased genetic variability, but has a lower chance of individual organism survival than does asexual reproduction.

10. Which one of the following best describes your risk of contracting HIV from having sexual intercourse?
   a. My risk of contracting HIV is very low if I use a latex condom.
   b. My risk of contracting HIV is very low if I only have oral sex.
   c. My risk of contracting HIV is very low if I shower with a disinfecting shampoo following sexual intercourse.
   d. My risk of contracting HIV is very low if I am monogamous my entire life.
   e. My risk of contracting HIV is very low if I am monogamous with my current partner.

11. If you and your mate are both heterozygous for cystic fibrosis, and cystic fibrosis is a homozygous recessively expressed trait, what are the odds of you and your partner having a child afflicted with cystic fibrosis?
   a. 0%
   b. 25%
   c. 50%
   d. 75%
   e. 100%

12. Mutations in the genetic make-up of an organism are best described by which one of the following statements?
   a. Mutations are an important source of genetic variation.
   b. Mutations increase the chance of survival of the species.
   c. Most mutations are lethal.
   d. Mutations may be beneficial in one situation, but harmful in others.
   e. More than one of the above is true about mutations.

13. The basic mechanism, as proposed by Charles Darwin, thought to affect evolutionary change is
   a. genetic mutations
   b. phenotypic mutations
   c. natural selection
   d. sexual reproduction
   e. asexual reproduction

14. The best way to lose weight is
   a. take Metabolife or some other form of dietary supplement to control appetite and increase metabolism.
   b. have a diet high in protein and drink lots of water.
   c. have a diet low in fats, drink lots of water, and exercise regularly.
   d. have a diet where you take in fewer calories than you burn, drink lots of water, and exercise regularly.
   e. eat a small amount of food at a time all day long, drink lots of water, and exercise regularly.

15. The importance of understanding the extent of a watershed is
   a. it allows run-off to be controlled.
   b. it allows the study of fish to know where and how to fish.
   c. it allows managers to predict the effects of any activity on the watershed ecosystem.
   d. it allows managers to set up clean areas to protect the integrity of the water to ensure it is safe to use for human activities.
   e. it allows planners to know how to set up effective plans for increasing urban development of the area.

16. Which one of the following best describes clean, drinkable water that is safe for human consumption?
   a. Clean water is water that you can see through to great depths.
   b. Clean water is water that you cannot see through due to the presence of cyanobacteria (blue-green algae).
   c. Clean water is water that is muddy looking and that has few visible organisms present.
   d. Clean water is water that is cold, clear, and has a high coliform bacterial count.
   e. Clean water is water that is cool, clear, and has many visible organisms present.

17. Which one of the following best describes a biome?
   a. the place where organisms live.
   b. an area that has specific communities on a continent with similar climatic, physical, and chemical parameters.
   c. an area where one is most likely to encounter a specific type of organism.
   d. the entire living area of the earth.
   e. the largest area that has the same types of organisms present.

18. Which one of the following would be the best choice as a resource to use in order to build a sustainable activity?
   a. finite, nonrenewable, recyclable resource
   b. finite, non-recyclable resource
   c. recyclable, non-renewable resource
   d. renewable resource

19. If you were a wildlife manager, which one of the following descriptions expresses what your objectives for building a sustainable fishing industry should be?
   a. construct a food chain based on the most desirable fish.
   b. construct a food chain with as many large marketable fish as possible.
   c. construct a balanced food web using as many organisms as possible.
   d. construct a food chain with a good mix of all size classes of organisms.
   e. construct a food chain or web that will create the most income for the fishermen.

20. Which one of the following best describes the importance of a wetland to the earth's biosphere?
   a. wetlands are wastelands that have very little economic value and are home to undesirable species, such as snakes and mosquitoes.
   b. wetlands are important to the survival of many species of organisms.
   c. wetlands are important because they help prevent flooding, cleanse water, and act as habitat for a wide diversity of organisms.
   d. wetlands are important for their economic value to communities, as they may be developed for recreation, waterfront property, tourism, and other human activities.
   e. wetlands are not important, as they can be recreated easily in other areas without loss of function or importance.

21. The greatest biodiversity of organisms occurs in
   a. temperate forest
   b. tropical rain forest
   c. temperate wetlands
   d. arctic tundra
   e. equatorial deserts

22. The reason forest fragmentation is considered to be a problem is
   a. that organisms may die due to isolation.
   b. that the area does not provide for rapid recovery following disturbance.
   c. that the area increases biodiversity due to edge effects.
   d. that the area does not allow for wide biodiversity.

23. Biodiversity is considered important because it
   a. increases the stability of the ecosystem
   b. allows greater numbers of individuals to survive adverse conditions
   c. helps prevent organism extinction
   d. allows the greatest genetic variability to occur in the ecosystem
   e. all of the above are correct

24. The most important group responsible for the continued health of this planet is
   a. government officials
   b. environmental protection agency personnel
   c. Greenpeace activists
   d. local governments
   e. individuals, such as you and me.

25. Which one of the following is true about science?
   a. Science and scientists can solve any problem.
   b. Science and scientists work to understand, explain, and offer solutions to solvable problems.
   c. Science and scientists use the scientific method to solve world problems and, given enough time, can find a solution to almost any problem.
   d. Science and scientists often only solve politically correct problems, with optimum solutions that make sense for the health of the planet.
   e. Science and scientists make a best guess and prediction as to problem solutions solely based on their own beliefs, opinions, and practices.

26. Most materials used by cells are moved by
   a. photosynthesis
   b. respiration
   c. active transport
   d. diffusion
   e. facilitated transport

27. Which one of the following is the single most important thing you can do to help maintain a healthy environment?
   a. walk wherever you need or want to go.
   b. recycle all of the materials that you use.
   c. turn down the heat in your home.
   d. become an ecologically informed consumer.
   e. minimize your energy consumption.

28. The major reason for naming organisms is
   a. to prevent their extinction.
   b. to aid in communication about the organisms.
   c. to be able to control them, to either get rid of them or to protect them.
   d. to allow for proper scientific investigation.
to allow for recreational and economic activities associated with the organism. 29. Which of the following is the most important to do if you wished to prevent the extinction of an organism, such as the Grizzly bear. EDP-PS7?“ ban DDT. protect its habitat prevent pollution prevent hunting of the organism improve its food sources 30. The best way to prevent contracting cancer is PPPP 5“ eat only organically grown foods. consume large quantities of antioxidants, such as Vitamin C. drink only filtered water. reduce your exposure to sunlight, or keep your exposure to a minimum. eat a balanced diet, drink safe water, avoid overexposure to sunlight, and get plenty of exercise. 233 APPENDIX E Pro-Lab Assessment Directions: Do Not write on these papers. Write all answers on the scan- tron forms that have been provided for you. Put your section number on the scan-tron form. Your assignment is to respond to the following questions or statements using only your own current knowledge. This assignment will not be graded as to correct answers, only on whether it was completed by you, using appropriate comments. We wish to know what knowledge you currently possess about the t0pics you will study in this course, so please take this survey seriously, answer the questions off the top of your head and without help from anyone else, and as accurately as possible. 1. The correct sequence of the scientific method is best represented by: a. hypothesis formation 9 theory 9 observation 9 experimentation 9 conclusion b. observation 9 hypothesis formation 9 experimentation 9 conclusion 9 publish c. experimentation 9 observation 9 hypothesis formation 9 publish 9 conclusion d. observation 9 experimentation 9 hypothesis formation 9 conclusion 9 publish e. experimentation 9 conclusion 9 observation 9 publish 9 hypothesis formation Which one of the following best represents the relationship between water temperature and concentration of dissolved oxygen in a water body? a. 
increased water temperatures cause higher levels of dissolved oxygen b. decreased water temperatures cause lower levels of dissolved oxygen c. decreased water temperatures allow for higher levels of dissolved oxygen d. there is no relationship between water temperature and levels of dissolved oxygen Which one of the following organic molecules is the body able to break down and utilize as energy most quickly? a. Nucleic Acids b. Proteins c. Lipids d. Carbohydrates 234 APPENDIX E (cont'd) 4. The hydrologic cycle is ultimately driven by P9957!” climate ozone solar radiation geothermal heat transpiration 5. Which one of the following best describes the ecosystem function carried out by a Venus-flytrap? a b. c d e Producer Consumer Decomposer Two of the above None of the above 6. The process of environmental pressure allowing the best adapted individuals to survive and reproduce and the less adapted individuals to decrease in number over time is called: 9.0-PST!” adaptive radiation gradualism punctuated equilibrium mutations natural selection 7. Environmental impacts caused by humans are called a b c d Anthropophagous Anthropogenic Humanistic Humanitarian 8. Which one of the following would not be a good hypothetical (hypothesis) statement? a. b. C. The sun rises over the horizon every morning. The fly on the wall can walk on the wall because it has little hairs on its legs that fit into crevices in the wall. Bees find flowers using their ability to see using ultraviolet light. I believe the ability of a person to consume alcohol without becoming inebriated is due to their body weight. Why do birds fly? 235 APPENDIX E (cont’d) 9. 10. 11. 
9. Controls in experiments are important because they
a. are the item being tested in the experiment
b. are the known used for comparison
c. are the unknown which varies in the experiment
d. are used to control the experiment to ensure you get the results you expect
e. none of the above is correct

10. At the right are drawings of three strings hanging from a bar. The three strings have metal weights attached to their ends. String 1 and String 3 are the same length. String 2 is shorter. A 10 unit weight is attached to the end of String 1. A 10 unit weight is also attached to the end of String 2. A 5 unit weight is attached to the end of String 3. The strings (and attached weights) can be swung back and forth and the time it takes to make a swing can be timed.

[Drawing: three weighted strings, labeled 1, 2, and 3, hanging from a bar]

Suppose you want to find out whether the length of the string has an effect on the time it takes to swing back and forth. Which strings would you use to find out?
a. only one string
b. all three strings
c. 2 and 3
d. 1 and 3
e. 1 and 2

11. Because?
a. you must use the longest strings.
b. you must compare strings with both light and heavy weights.
c. only the lengths differ.
d. to make all possible comparisons.
e. the weights differ.

12. Farmer Brown was observing the mice that live in his field. He discovered that all of them were either fat or thin. Also, all of them had either black tails or white tails. This made him wonder if there might be a link between the size of the mice and the color of their tails. So he captured all of the mice in one part of his field and observed them. Below are the mice that he captured.

[Drawing: the captured mice, fat and thin, with black or white tails]

13. Do you think there is a link between the size of the mice and the color of their tails?
a. appears to be a link
b. appears not to be a link
c. can not make a reasonable guess

14. Because
a. there are some of each kind of mouse.
b. there may be a genetic link between mouse size and tail color.
c. there were not enough mice captured.
d. most of the fat mice have black tails while most of the thin mice have white tails.
e. as the mice grew fatter, their tails became darker.

15. Twenty fruit flies are placed in each of four glass tubes. The tubes are sealed. Tubes I and II are partially covered with black paper; Tubes III and IV are not covered. The tubes are placed as shown. Then they are exposed to red light for five minutes. The number of flies in the uncovered part of each tube is shown in the drawing.

[Drawing: the four tubes, I-IV, with red light at the ends and the positions of the flies shown]

This experiment shows that flies respond to (respond means move to or away from):
a. red light but not gravity
b. gravity but not red light
c. both red light and gravity
d. neither red light nor gravity

16. Because
a. most flies are in the upper end of Tube III but spread about evenly in Tube II.
b. most flies did not go to the bottom of Tubes I and III.
c. the flies need light to see and must fly against gravity.
d. the majority of flies are in the upper ends and in the lighted ends of the tubes.
e. some flies are in both ends of each tube.

17. Six square pieces of wood are put into a cloth bag and mixed about. The six pieces are identical in size and shape; however, three pieces are red and three are yellow. Suppose someone reaches into the bag (without looking) and pulls out one piece. What are the chances that the piece is red?
a. 1 chance out of 6
b. 1 chance out of 3
c. 1 chance out of 2
d. 1 chance out of 1
e. can not be determined

18. Because
a. 3 out of 6 pieces are red.
b. there is no way to tell which piece will be picked.
c. only 1 piece of the 6 in the bag is picked.
d. all 6 pieces are identical in size and shape.
e. only 1 red piece can be picked out of the 3 red pieces.

19. Which one of the following is NOT an organic molecule that is typically listed on a food wrapper?
a. Calorie
b. Lipid
c. Protein
d. Carbohydrate
e. All of the above are organic molecules

20. Removal of all of the dead logs from a forest floor will decrease the complexity of that natural system. This in turn will result in which of the following?
a. an increase in biodiversity
b. a decrease in the ecosystem stability
c. an increase in the rate of soil moisture evaporation from the forest floor
d. a rapid increase in rates of photosynthesis
e. none of the above

21. Which of these characterize all birds, reptiles, and mammals?
a. stable body temperatures
b. giving birth to live young
c. internal skeletal structures
d. predatory habits

22. If you get chickenpox when you are young, you are not likely to get it again. Why is this?
a. The chickenpox virus cannot enter your body again.
b. The medicines you took for chickenpox are stored in your body.
c. Chickenpox is not as harmful now as it used to be.
d. Your body has formed antibodies against the chickenpox virus.

23. Which of these is a negative side effect of using pesticides on crops?
a. Crop pests are destroyed.
b. The crop takes longer to mature.
c. The size of the crop is reduced.
d. Beneficial insects may be destroyed.

24. As water is cooled down, the molecule and/or its atoms undergo a
a. Physical change as they move faster.
b. Physical change as they move slower.
c. Chemical change as they move faster.
d. Chemical change as they move slower.

APPENDIX F

ISB 208L Pre-Lab Assessment

Directions: Do NOT write on these papers. Write all answers on the scan-tron forms that have been provided for you. Put your section number on the scan-tron form. Your assignment is to respond to the following questions or statements using only your own current knowledge. This assignment will not be graded as to correct answers, only on whether it was completed by you, using appropriate comments.
We wish to know what knowledge you currently possess about the topics you will study in this course, so please take this survey seriously, answer the questions off the top of your head, without help from anyone else, and as accurately as possible.

1. The correct sequence of the scientific method is best represented by
a. hypothesis formation → theory → observation → experimentation → conclusion
b. observation → hypothesis formation → experimentation → conclusion → publish
c. experimentation → observation → hypothesis formation → publish → conclusion
d. observation → experimentation → hypothesis formation → conclusion → publish
e. experimentation → conclusion → observation → publish → hypothesis formation

2. The scientific name given to sex-cells is
a. somatic cells
b. stem cells
c. gametes
d. zygotes

3. The biggest cause of habitat loss in the grassland biome is
a. urban sprawl
b. conversion to agriculture
c. global warming
d. pollution
e. strip mining

4. A food chain is less desirable in an ecosystem than a food web because
a. Food chains do not provide for ecosystem stability.
b. There are more organisms involved in a food chain.
c. Food chains prevent natural selection from occurring.
d. Mutations are more common in a food chain system.

5. Which one of the following best represents the relationship between water temperature and concentration of dissolved oxygen in a water body?
a. increased water temperatures cause higher levels of dissolved oxygen
b. decreased water temperatures cause lower levels of dissolved oxygen
c. decreased water temperatures allow for higher levels of dissolved oxygen
d. there is no relationship between water temperature and levels of dissolved oxygen

6. Which one of the following organic molecules is the body able to break down and utilize as energy most quickly?
a. Nucleic Acids
b. Proteins
c. Lipids
d. Carbohydrates

7. Which one of the following equations best represents photosynthesis?
a. CO2 + H2O → O2 + Sugar
b. Sugar + H2O → CO2 + O2
c. O2 + Sugar → CO2 + H2O
d. O2 + H2O → CO2 + Sugar

8. The hydrologic cycle is ultimately driven by
a. gravity
b. ozone
c. solar radiation
d. geothermal heat
e. transpiration

9. Which one of the following best describes the ecosystem function carried out by a Venus flytrap?
a. Producer
b. Consumer
c. Decomposer
d. Two of the above
e. None of the above

10. Which one of the following is NOT true of mutations?
a. Mutations are usually lethal
b. Mutations are random events
c. The environment influences what types of mutations occur
d. The environment influences what types of mutations survive

11. Which one of the following is the most serious problem facing our biosphere?
a. Ozone depletion
b. Human population growth
c. Global warming
d. Habitat destruction
e. None of the above

12. The process of environmental pressure allowing the best adapted individuals to survive and reproduce and the less adapted individuals to decrease in number over time is called
a. adaptive radiation
b. gradualism
c. punctuated equilibrium
d. mutations
e. natural selection

13. Cell division where two identical daughter cells are produced is called
a. meiosis
b. dissection
c. differentiation
d. mitosis

14. Environmental impacts caused by humans are called
a. Anthropophagous
b. Anthropogenic
c. Humanistic
d. Humanitarian

15. A way of mapping relationships and traits across several generations is called a
a. gel electrophoresis
b. teratogen tracing
c. pedigree
d. DNA map
e. karyotype

16. Which one of the following would not be a good hypothetical (hypothesis) statement?
a. The sun rises over the horizon every morning.
b. The fly on the wall can walk on the wall because it has little hairs on its legs that fit into crevices in the wall.
c. Bees find flowers using their ability to see using ultraviolet light.
d. I believe the ability of a person to consume alcohol without becoming inebriated is due to their body weight.
e. Why do birds fly?
17. Controls in experiments are important because they
a. are the item being tested in the experiment
b. are the known used for comparison
c. are the unknown which varies in the experiment
d. are used to control the experiment to ensure it works properly
e. none of the above is correct

18. Which one of the following is a correct statement with regard to chromosomes in a human?
a. there are multiple alleles located on a chromosome for each trait.
b. there are many genes located on each chromosome in the human body.
c. there are always three alleles for every trait in the human genome.
d. there are four haploid chromosomes for each trait in the human genome.
e. two identical daughter cells are always present in each chromosome.

19. Which one of the following is an example of a density dependent factor affecting a population?
a. Altitude
b. Lightning
c. Disease
d. Old age

20. Which one of the following is the best way to prevent contracting an STD and/or becoming pregnant?
a. practicing abstinence
b. being monogamous with the one you are with
c. using a condom or other protective device
d. having your partner checked for STDs prior to having sex and using birth control, such as an IUD, the pill, or a similar device

APPENDIX G

ISB 208L Persuasive Essay

Write a one- to two-page persuasive letter. This letter should be appropriately written to convince legislators to take your side on one of the listed environmental or biological issues. To be most persuasive you will need to support your opinion using scientific evidence. This letter should be based on facts (i.e., environmental impacts, economic concerns, health and safety issues, etc.). Your letter should clearly state the issue and provide the reader with enough information that they recognize that you are an informed individual.
You should clearly define the issue, supply at least 3 pieces of supporting evidence for your opinion using specific examples, and provide a clear direction or course of action you want the elected individual to follow with regard to the issue. This letter should be concise and have a logical flow, from the position, through the evidence to support the position, to the course of action to be followed. If you use facts, be sure to give credit to the individual or organization from which you obtained those facts. This lends credibility to your argument. Hopefully, it will be a letter you would wish to send to your national, state, or local officials; although you will not be required to do so, it should be written in the appropriate manner, so you could send it.

Choose from one of these biological topics:
Genetic Manipulation / Genetic Engineering
Global Warming
Human Population
Alternative Energy Sources
Groundwater / Water Pollution
Threatened / Endangered Species
Habitat Destruction
Biodiversity

Rubric

Biological topic and background information (10 pts)
Excellent (10 pts): An appropriate biological topic is clearly described, including background information and specific details or examples.
Fair (7.5 pts): A biological topic is described with very little background information or specific examples.
Poor (0-6 pts): No biological topic is described.

Your opinion on the biological issue and a suggested course of action (10 pts)
Excellent (10 pts): A clearly stated opinion, justification for the opinion, and a clear course of action to be followed by the reader on the biological issue is provided.
Fair (7.5 pts): There is a vague opinion of the issue provided with little justification and no suggested course of action provided to the reader.
Poor (0-6 pts): No opinion or course of action is provided on the issue.
Three supporting arguments and specific examples (30 pts: 10 per example with evidence)
Excellent (10 pts): A clearly described argument based upon scientific evidence, including specific examples, is provided in support of your opinion.
Fair (7.5 pts): An argument supporting your opinion is provided, either based on opinion or without use of specific examples.
Poor (0-6 pts): No supporting argument provided.

APPENDIX H

Sample Pre/Post Test Essay Questions

Questions 9 and 10 are sample essay questions used to test student understanding of scientific reasoning, critical thinking, and experimental design. Students are asked to write an essay that clearly answers each of the questions using proper technical writing format.

Sample problem:

9. Scientific Investigation and Experimental Design

In order to help meet the global need for increased agricultural production, the United States Department of Agriculture has proposed converting much of the Boundary Waters National Park lakes region into a large-scale national agricultural sector. This region has thousands of pristine oligotrophic lakes that are among the best habitat for trout. These lakes and the trout fishing are pivotal to the local economy through the tourism they attract. You have been hired by the Environmental Protection Agency to conduct an environmental impact survey. You have been given five years and access to a large section of the park, containing many lakes, with which to conduct an investigation. Your task is to determine whether there would be significant impacts to the local lakes that would negatively impact tourism and the local economy.

Propose a correctly stated hypothesis as to what, if any, changes would occur to each of the three water quality parameters listed below if this agricultural proposal were implemented. (5 pts each) For each parameter, describe in detail why the proposed land use changes would have these predicted effects. (5 pts)

DO (Dissolved Oxygen): Hypothesis; Justification
Temperature: Hypothesis; Justification
Fish species composition: Hypothesis; Justification

Remember, you have been given five years and access to a large section of the park, containing many lakes, with which to conduct your investigation. Your task is to determine whether there would be significant impacts to the local lakes that would negatively impact tourism and the local economy. Given these constraints, describe an appropriately designed scientific investigation that could be conducted to test each of your hypotheses (from the previous page) and that would provide the data necessary to support or refute these hypotheses. Be sure to include all appropriate methods and aspects of a well-designed scientific investigation and also the results you would expect in each of your treatments. (Note: do not just outline the scientific method.) Also, identify alternative explanations that could result in the same predicted outcomes and/or identify problems that might arise that could lead to different results. (20 pts)

10. Experimental Analysis and Critique (50 pts)

Toivo Pakala, a single copper miner, lives in a cabin in the Upper Peninsula of Michigan with his cat Chester, who is strictly a house cat, and his large hunting dog Bowser, who sleeps out back behind the cabin. One evening in June, after eating venison steaks broiled in butter and lots of fresh garlic, Toivo realizes that the mosquitoes that normally plague him every night are not bothering him at all. He wonders if the large amounts of garlic that he ate could be the explanation for this. He wonders if, by adding fresh chopped garlic cloves to his pets' canned food, he could also help rid them of their most common insect pests. Chester is always infested with fleas and Bowser always has ticks. For the next month, Toivo decides to conduct an experiment.
Each day he puts one clove of chopped garlic in Chester's dinner, two cloves of garlic in Bowser's dinner, and three cloves of garlic in his own dinner. Two days before the end of the experiment, Toivo chooses to sleep outside on the open-air front porch. In the morning, Toivo realizes that overnight he was not bitten by a single mosquito. He also checks his pets for fleas and ticks. Bowser has no ticks, but Chester is still infested with fleas. Toivo concludes that only high concentrations of garlic in an animal's diet can help to deter insect pests. He writes up his findings and submits them to the Marquette Mining Journal. Many of the readers find various faults with Toivo's experimental design. In addition, due to his bad breath, he never gets a date.

Clearly state the hypothesis that Toivo is testing. (5 pts)

Clearly describe three problems with the experimental design that Toivo has utilized. (5 pts) For each problem identified, clearly describe what could be done to improve upon it. (5 pts)

Problem / Improvement
Problem / Improvement
Problem / Improvement

Clearly describe two alternative conclusions that could be made based upon this scenario. (5 pts)

APPENDIX I

Persuasive Essay Grading Rubric

Write a one- to two-page persuasive letter. This letter should be appropriately written to convince legislators to take your side on one of the listed environmental or biological issues. To be most persuasive you will need to support your opinion using scientific evidence. This letter should be based on facts (i.e., environmental impacts, economic concerns, health and safety issues, etc.). Your letter should clearly state the issue and provide the reader with enough information that they recognize that you are an informed individual. You should clearly define the issue, supply at least 3 pieces of supporting evidence for your opinion using specific examples, and provide a clear direction or course of action you want the elected individual to follow with regard to the issue.
This letter should be concise and have a logical flow, from the position, through the evidence to support the position, to the course of action to be followed. If you use facts, be sure to give credit to the individual or organization from which you obtained those facts. This lends credibility to your argument. Hopefully, it will be a letter you would wish to send to your national, state, or local officials; although you will not be required to do so, it should be written in the appropriate manner, so you could send it.

Choose from one of these biological topics:
Genetic Manipulation / Genetic Engineering
Global Warming
Human Population
Alternative Energy Sources
Groundwater / Water Pollution
Threatened / Endangered Species
Habitat Destruction
Biodiversity

Biological topic and background information (10 pts)
Excellent (10 pts): An appropriate biological topic is clearly described, including background information and specific details or examples.
Fair (8 pts): A biological topic is described with very little background information or specific examples.
Poor (6 pts): No biological topic is described.
Unacceptable (0 pts)

Your opinion on the biological issue and a suggested course of action (10 pts)
Excellent (10 pts): A clearly stated opinion, justification for the opinion, and a clear course of action to be followed by the reader on the biological issue is provided.
Fair (8 pts): There is a vague opinion of the issue provided with little justification and no suggested course of action provided to the reader.
Poor (6 pts): No opinion or course of action is provided on the issue.
Unacceptable (0 pts)

Three supporting arguments and specific examples (30 pts: 10 per argument with evidence and specific examples)

Argument I / II / III
Excellent (10 pts): A clearly described argument based upon scientific evidence, including specific examples, is provided in support of your opinion.
Fair (8 pts): An argument supporting your opinion is provided, either based on opinion or without use of specific examples.
Poor (6 pts): No supporting argument provided.
Unacceptable (0 pts)

Total Points Earned

APPENDIX J

Teaching Assistant Job Description for ISB 208L

Assignment type: Teaching Assistant (¼, ½, ¾ time)

Qualifications: The Center for Integrative Studies in General Science hires teaching assistants who are knowledgeable in general biology to teach non-major general biology laboratory courses. Individuals for these assignments should have a strong background in biology, having completed some upper-level training courses at the 300 and/or 400 level. Additionally, teaching assistants should have good communication skills, with an ability to communicate effectively orally and in writing with students and faculty. Preference is given to graduate students in departments whose faculty participate in teaching ISB lecture sections.

Job description: Teaching assistants are assigned as ¼, ½, or ¾ time by their major department. ¼ time TAs teach 1 lab section, ½ time TAs teach 2 lab sections, and ¾ time TAs teach 3 lab sections. Teaching assistants are provided support by their departments for 18 weeks each semester, and the total number of work hours required is based on this eighteen-week period. A ¼ time TA is expected to work 10 hours per week, a ½ time TA 20 hours per week, and a ¾ time TA 30 hours per week. This means the total time required per semester of each teaching assistant is 180 hours for ¼ time, 360 hours for ½ time, and 540 hours for ¾ time. Teaching assistants are assigned to teach general biology laboratories by the laboratory coordinator. Teaching assistants will be supervised and evaluated by the lab coordinator.

Job Duties: Teaching will include:
- Preparation of the lab room.
- Maintaining office hours (both scheduled and arranged).
- Grading student assignments (tests, quizzes, written assignments).
- Answering student questions in the classroom, via e-mail, telephone, and before and after class.
- Preparing to teach class. The teaching assistant will read lab materials and become familiar with course content, all laboratory and teaching equipment, and review questions, and will establish a teaching plan/itinerary to ensure all supplies are in the classroom and they are prepared to teach the laboratory.
- Grade and return student assignments in a timely manner, usually by the next class period.
- Submit grades on a biweekly basis or as indicated by the lab coordinator.
- Maintain accurate accounts of student grades and attendance as specified by the lab coordinator.
- Keep all student correspondence and copy any correspondence to the attention of the lab coordinator as needed.
- Be aware of, instruct, know how to use, and practice all safety procedures, equipment, and protocols for the laboratory assignments and the room/building in which classes meet.
- Follow the University guidelines and procedures as explained in the code of teaching responsibilities.
- Conduct themselves in a professional manner at all times.
- Attend an orientation meeting prior to the start of each semester and weekly laboratory meetings as indicated by the lab coordinator.
- Proctor for faculty as required and assigned by the lab coordinator.
- Respond to student, faculty, and staff correspondence in a timely manner.
- Provide personal contact information to Center staff, including the Center Director, lab coordinator, and office staff.
- Notify the lab coordinator, in a timely manner, of conflicts that prevent the teaching assistant from carrying out duties as specified in this description. This includes illness, professional meetings, deaths in the immediate family, etc.
- Maintain and check a mailbox, e-mail account, voice mail, and other forms of communication as indicated by the lab coordinator.
- Provide course information regarding grading, attendance, and other policies and procedures from the syllabus or other sources to students as required by the lab coordinator and CISGS.
- Resolve student issues in an appropriate and timely fashion and notify the lab coordinator or Director of CISGS of problems, as necessary.
- Respect students, faculty, and staff at all times.
- Accommodate special student needs.
- Follow the procedures for copying as indicated during orientation.
- Obtain, return, and maintain room keys and materials in a safe and secure manner.
- Meet with the lab coordinator/supervisor to review work performance and set teaching/work objectives.

Conflicts: Conflicts with this assignment should be discussed with your immediate supervisor and, if no resolution is reached, then with the Director of CISGS or your Department Chair.

APPENDIX K

ISB 208L T.A. Fall Semester 2002 Evaluation

T.A. Name    Lab Section Numbers Taught

Rating columns: Meets or Exceeds Expectations / Needs Improvement / Notes

Professionalism
Attends scheduled lab meetings, is on time, and stays throughout
Makes themselves available to cover proctoring assignments
Actively participates in lab meetings
Comes prepared to lab meetings, having read and worked through the lab
Participates in "teaching" a lab at lab meetings
Is engaged in the course (ownership) such that they are willing to make suggestions for course improvement
Returns graded materials to students promptly (normally within 1 week)
Maintains scheduled office hours
Encourages students to come to office hours for help
Answers e-mails and other communications in a timely fashion

Good Instruction Qualities
Shows concern for student learning
Speaks with authority on subject matter
Presents themselves as a professional to students
Arrives at class organized and prepared
Takes the subject matter seriously
Is approachable for students
Maintains an enthusiastic and positive attitude
Speaks and covers materials at a reasonable pace
Is flexible in teaching for students with different learning styles
Is friendly with students while maintaining a certain level of authority
Speaks audibly and with clarity
Provides students with challenging, thought-provoking questions
Makes the course material relevant to the students' lives
Is patient with students
Makes themselves accessible to students
Is consistent with all students

APPENDIX L

ISB 208L T.A. Evaluation

T.A. Name    Lab Section    Lab Topic    Date

Rating columns: Meets or Exceeds Expectations / Needs Improvement / Notes

Arrives to class on time (15 minutes before scheduled lab time)
Appropriate dress
Verifies that lab room and materials are ready
Begins class in a timely fashion
Returns graded assignments promptly
Attends to any necessary informational "business"
Clearly outlines objectives for the day
Is prepared to teach, having read and worked through materials before class
Is well organized
Speaks loudly, clearly, and at a reasonable rate
Attempts to communicate with all students
Makes the lab information relevant for students through the use of real-world examples, personal narrative, analogies, etc.
Covers material at an appropriate pace
Moves through the classroom assessing understanding and progress of all students
Interacts with students in a professional manner
Is helpful and encouraging to all students
Encourages students to seek assistance and ask questions
Is approachable for students
Is patient when explaining difficult concepts
Is able to quickly assess student understanding (or lack thereof) and modify/adjust teaching strategies as necessary
When asking questions, allows for a sufficient "wait" time
When asking questions, encourages whole-class participation
Provides challenging, thought-provoking questions to the class
Holds students accountable throughout the class period to pay attention
Uses class time wisely/efficiently
Is proficient in the use of all audiovisual equipment
Is proficient in the use of all lab materials and equipment
Makes eye contact with students around the classroom
Appears comfortable and knowledgeable with subject matter
Displays and maintains a positive attitude

Observer name    Signature    Date

APPENDIX M

Project Title: General Science in Biology Curriculum Assessment and Comparison of Inquiry-Based Education with Traditional Education Approaches

Dear Student,

We are requesting your participation in and help with evaluating three instructional methods used to teach the ISB 208L course. Each section will be using one of three instructional approaches to determine which is most effective in meeting the University objectives for this subject (see project description below), as judged using the following criteria. Please note that you may change sections of this laboratory course if you do not agree with, or do not wish to use, one of the instructional methods. All changes between sections should be completed by the end of the second week of classes. You may also view the research proposal and protocols for this study by contacting Todd Tarrant at 517-432-9285 or by e-mail: tarrantt@msu.edu.
We are asking for your cooperation and participation in this study, and for your permission to use the following information as a means to evaluate the success of the educational approaches: SIRS responses, class attendance, class grade averages, pre- and post-test scores, and your personal essays from the pre-test and final exams. Your name and student number will be removed, in order to protect your privacy to the maximum extent allowable by law, prior to the use of any data for external publication. Compiled class information and your individual essays (pre- and post-course) may be published in professional journals or presented at conferences to aid other institutions in their assessment procedures. Your agreement to the use of this information for possible external publication (in educational or other professional journals) is voluntary, and your refusal will in no way impact your grade. Refusal will mean that your data will not be included in the collective data for the study. Completion of all assignments and SIRS will still be required as per the course syllabus and University policy.

Research Description: This research project compares traditional lab activities (the cookbook approach), a blend of traditional activities moving toward a problem-based laboratory (PBL) approach, and pure problem-based instruction in the laboratories. Student performance and perceptions will be assessed to determine whether there is a difference between the three instructional approaches (traditional, a mix of traditional progressing to problem-based, and problem-based instruction) employed in this study, as measured on a standardized knowledge test and essay, and on student perceptions via SIRS, attendance, and willingness to take additional science courses. The objectives of this study are to increase student satisfaction with biology laboratory courses, general biology knowledge, and willingness to take additional science courses, and to improve student use of critical thinking skills.
The findings of this study should allow instructors to determine which laboratory teaching approach is best for a non-major general education laboratory biology course. If you have any questions about this study, please contact the investigator (Dr. Duncan Sibley, 100 N. Kedzie Lab, Michigan State University, East Lansing, MI 48824, 517-353-4572, e-mail: sibley@msu.edu). If you have questions or concerns regarding your rights as a study participant, or are dissatisfied at any time with any aspect of this study, you may contact, anonymously if you wish, Ashir Kumar, M.D., Chair of the University Committee on Research Involving Human Subjects (UCRIHS), by phone: (517) 355-2180, fax: (517) 432-4503, e-mail: ucrihs@msu.edu, or regular mail: 202 Olds Hall, East Lansing, MI 48824. By signing this release of information, you are giving permission for the use of data collected during the instructional process of the ISB 208L course. Thank you for your cooperation.

Signature    Date    Print Name

BIBLIOGRAPHY

-----. February 1997. The Many Paths to Success: Biology Education Is Flourishing at Many Campuses as New Approaches to Teaching and Learning Take Hold. Journal of College Science Teaching. Vol. 26(4). p 247-252. ERIC EJ540058.

Adkins, C., and Simmons, B. 2002. Outdoor, Experiential, and Environmental Education: Converging or Diverging Approaches? ERIC Digest. ED467713. 4 p.

Allen, R. R. and Rueter, T. 1990. Teaching Assistant Strategies: An Introduction to College Teaching. Dubuque, IA: Kendall/Hunt Publishing Company. 158 p.

American Association for the Advancement of Science. 1989. Project 2061: Science for All Americans. AAAS, Washington, DC. 217 p.

American Association for the Advancement of Science. 1990. The Liberal Art of Science. AAAS, Washington, DC. 121 p.

Association of American Colleges. 1985. Integrity in the College Curriculum: A Report to the Academic Community. Washington, DC: Association of American Colleges. 233 p.

Andersen, C. B., Worthen, W.
B., and Polkinghorn, B. 2001. Humanism in the Environmental Sciences: A Reevaluation. Journal of College Science Teaching. Vol. XXXI(3). p 202-206.

Angelo, T. A., and Cross, K. P. 1993. Teaching Objectives Inventory: Self-Scorable Version. In Classroom Assessment Techniques: A Handbook for College Teachers (2nd ed.). Jossey-Bass Inc., Publishers. p 393-397.

Arce, J., and Betancourt, R. November 1997. Student-Designed Experiments in Scientific Lab Instruction. Journal of College Science Teaching. 27(2). p 114-118.

Auxier, J., Brown, D., Brown, M. L., Chaput, C., Gross, M., LeFevre, K., Parrish, J., Roberts, L., and Velez, M. T. 2000. The University of Arizona: Teaching Assistant Taskforce Report. University of Arizona. Tempe, AZ. 15 p.

Avery, J. P., Chang, J. L., Piket-May, M. J., Sullivan, J. F., Carlson, L. E., Davis, S. C. 1998. The Integrated Teaching and Learning Lab. Frontiers in Education Conference, IEEE. FIE Conference Session F4D. 5 p.

Barr, R. B., and Tagg, J. 1995. From Teaching to Learning: A New Paradigm for Undergraduate Education. Change. 27(6). p 13-25.

Barrows, H. S. 1998. The Essentials of Problem-Based Learning. Journal of Dental Education. Vol. 62, No. 9. p 630-633.

Berry, A., Mulhall, P., Loughran, J. J., and Gunstone, R. F. 1999a. Helping students learn from laboratory work. Australian Science Teachers' Journal. 45(1). p 27-31.

Berry, A., Mulhall, P., Loughran, J. J., and Gunstone, R. F. 1999b. Exploring students' approaches to laboratory work. Australian Science Teachers' Journal. (under review).

Beyer, B. K. 1987. Practice Is Not Enough. In Thinking Skills Instruction: Concepts and Techniques. Washington, DC: National Education Association of the United States. p 77-86.

Biological Sciences Curriculum Study (BSCS). 1993. Developing Biological Literacy: A Guide to Developing Secondary and Post-secondary Biology Curricula.
Dubuque, IA: Kendall/Hunt Publishing Company. 128 p.

Bloom, B. S. and Krathwohl, D. R. 1956. Taxonomy of Educational Objectives: The Classification of Educational Goals, by a Committee of College and University Examiners. New York: Longmans, Green. 2 v. Handbook I: Cognitive Domain.

Boersma, S., Hluchy, M., Godshalk, G., Crane, J., DeGraff, D., Blauth, J. March 2001. Student-Designed Interdisciplinary Science Projects: Placing Students in the Role of Teacher. Journal of College Science Teaching. Vol. XXX, No. 6. p 397-402.

Boggs, G. R. Dec.-Jan. 1995. The Learning Paradigm. Community College Journal. Washington, DC: American Association of Community Colleges. Vol. 66, No. 3. p 24-27.

Bond, L., Jaeger, R., Smith, T. and Hattie, J. 2001. Defrocking the National Board: The Certification System of the National Board for Professional Teaching Standards. Education Matters. v1 n2. p 79-82.

Bondeson, S. R., Brummer, J. G. and Wright, S. M. November 2002. Rubric for Content Classification: Helping Instructors Make Classroom Decisions. Journal of College Science Teaching. Vol. XXXII, No. 3. p 182-187.

Boyle, P. and Boyce, B. 1998. Systematic Mentoring for New Faculty Teachers and Graduate Teaching Assistants. Innovative Higher Education. Vol. 22, No. 3, Spring. p 157-179.

Brahmia, S. and Etkina, E. November 2001. Switching Students On to Science: An Innovative Course Design for Physics Students. Journal of College Science Teaching. Vol. XXXI, No. 3. p 183-187.

Brookfield, S. 1997. Assessing Critical Thinking. New Directions for Adult and Continuing Education. Wiley Publications, Inc. No. 75. p 17-29.

Brown University, Department of Sociology. TA Survey Questionnaire: SOC 112, Fall Semester 2003. Date of citation: July 10, 2005. Available from: http://www.brown.edu/Departments/sociology/courses/sem2_04-05/soc112/TAQuestionnaire.pdf

Bronowski, J. 1965. Science and Human Values. Chapter 1: What Is Science? By F.
Wilson. New York: Harper & Row, Publishers. 119 p.

Buchwald, C. E. 1997. Undergraduate Geology Education: The Carleton College Experience. Journal of College Science Teaching. March/April. p 325-328.

Bybee, R. W. 1993. An Instructional Model for Science Education. In Developing Biological Literacy: A Guide to Developing Secondary and Post-Secondary Biological Curricula. Biological Sciences Curriculum Study (BSCS). Colorado Springs, CO.

Bybee, R. W. and Champagne, A. B. Jan. 2000. The National Science Education Standards. Science Teacher. Vol. 67, No. 1. ERIC EJ617007. p 54-55.

Byrnes, H. 2001. Reconsidering Graduate Students' Education as Teachers: "It Takes a Department!" The Modern Language Journal. Vol. 85, iv. p 512-530.

Byrnes, H. 2001. Toward a Comprehensive Conceptualization of TA Education: Contents, Commitments, Structures. ED 456651. ERIC. p 3-24.

Cash, R. W. May 1993. Assessment of Study-Abroad Programs Using Surveys of Student Participants. AIR 1993 Annual Forum Paper. ED 360925. ERIC issue: RIEJAN1994. p 3-24.

Chinn, P. W. U., Hilgers, T. L. 2000. From Corrector to Collaborator: The Range of Instructor Roles in Writing-Based Natural and Applied Science Classes. Journal of Research in Science Teaching. Vol. 37, No. 1. p 3-25.

Coverdale, G. A. November 2003. Tracking Development: Pursuing Scientific Inquiry on the Lytle Creek. Journal of College Science Teaching. Vol. XXXII, No. 3. p 188-193.

Crawford, B. A. 2000. Embracing the Essence of Inquiry: New Roles for Science Teachers. Journal of Research in Science Teaching. Vol. 37, No. 9. p 916-937.

Cross, L. H. and Frary, R. B. 1996. Hodgepodge Grading: Endorsed by Students and Teachers Alike. Paper presented at the Annual Meeting of the National Council on Measurement in Education, New York, NY, April 9-11, 1996. ED 398262. ERIC. 15 p.
Curriculum Solutions: Performance Objectives/ITIP Worksheet. Training Resource Center, Eastern Kentucky University, 300 Stratton Building, 521 Lancaster Avenue, Richmond, KY 40475. www.trc.eku.edu/itip/pcda.htm

Darling-Hammond, L. 1996. The right to learn and the advancement of teaching: Research, policy, and practice for democratic education. Educational Researcher. 25. p 5-17.

de Beaugrande, R. 1997. New Foundations for a Science of Text and Discourse. Greenwich, CT: Ablex. 11 p.

Dewey, J. 1916. Democracy and Education. Middle Works. 9 p.

Dimaculangan, D. D., Mitchell, P. L., Rogers, W., Schmidt, J. M., Chism, J. L. and Johnston, J. W. March/April 2000. A Multidimensional Approach to Teaching Biology: Injecting Analytical Thought into the Scientific Process. Journal of College Science Teaching. Vol. 29(5). p 330-336.

DiPasquale, D. M., Mason, C. L. and Kolkhorst, F. March/April 2003. Exercise in Inquiry: Critical Thinking in an Inquiry-based Exercise Physiology Laboratory Course. Journal of College Science Teaching. Vol. XXXII, No. 6. p 388-393.

DOE. U.S. Department of Education. 1983. A Nation at Risk, The Imperative for Educational Reform: A Report to the Nation and the Secretary of Education. Washington, DC: DOE.

Domin, D. S. 1997. The Transition from a Traditional to a Problem-based Laboratory Curriculum. In Traditional Approaches Toward Innovation. The Society for College Science Teachers. Chapter 9. p 55-59.

Driver, R., and E. Scanlon. 1989. Conceptual Change in Science. Journal of Computer Assisted Learning. ERIC EJ392371. Vol. 5, No. 1. p 25-36.

Druger, M. May 1997. Preparing the Next Generation of College Science Teachers: Offering Pedagogical Training to Graduate Teaching Assistants as Part of the College Reform Agenda. Journal of College Science Teaching. Vol. 26(5). p 424-427.

Druger, M. May 2002. It All Depends. Journal of College Science Teaching. Vol. XXXI, No. 7. p 493-494.

Duch, B. J., Groh, S. E. and Allen, D. E. 2001.
The Power of Problem-Based Learning: A Practical "How To" for Teaching Undergraduate Courses in Any Discipline. Stylus Publishing, LLC. Sterling, VA. 274 p.

Duit, R. 1987. Research on students' alternative frameworks in science: topics, theoretical frameworks, consequences for science teaching. In Proceedings of the Second International Seminar on Misconceptions and Educational Strategies in Science and Mathematics (Vol. 1), ed. Novak, J. Ithaca, NY: Cornell University. p 151-162.

Eberly, M. B., Newton, S. E., and Wiggins, R. A. 2001. The Syllabus as a Tool for Student-Centered Learning. Journal of General Education. v50 n1. p 56-74.

Ebert-May, D., Brewer, C. and Allred, S. October 1997. Innovation in large lectures: teaching for active learning. BioScience. 47(9). p 601-607.

Edmondson, K. M., and Novak, J. D. 1993. The interplay of scientific epistemological views, learning strategies, and attitudes of college students. Journal of Research in Science Teaching. 36(1). p 107-116.

Edwards, G. 1993. When Teaching Includes Training. ED 444448. ERIC. p 2-16.

Ennis, R. H. 1987. Critical Thinking and the Curriculum. In Thinking Skills Instruction: Concepts and Techniques. Washington, DC: National Education Association of the United States. p 40-48.

Facione, P. A. 1990. Purposes of Educational Assessment and Instruction. Principal investigator. The California Academic Press, Millbrae, CA.

Facione, P. A. 1998. Critical Thinking: What It Is and Why It Counts. 1998 update.

Feyerabend, P. 1975. Against Method. Atlantic Highlands, NJ: Humanities Press. In Thinking Skills Instruction: Concepts and Techniques. Washington, DC: National Education Association of the United States. 1987. 174 p.

Felder, R. 1993. "Reaching the Second Tier: Learning and Teaching Styles in College Science Education." Journal of College Science Teaching. 23(5). p 286-290.

Feldman, S., Anderson, V. and Mangurian, L. May 2001.
Teaching Effective Scientific Writing: Refining Students' Writing Skills Within the Towson Transition Course. Journal of College Science Teaching. Vol. XXX, No. 7. p 446-449.

Fowler, B. 1996. Critical Thinking Across the Curriculum Project. Longview Community College, Lee's Summit, Missouri, USA. Available from: http://www.kcmetro.cc.mo.us/longview/ctac/reading.htm. Accessed Aug. 2005.

French, D. and Russell, C. November 2002. Do Graduate Teaching Assistants Benefit from Teaching Inquiry-Based Laboratories? BioScience. Vol. 52, No. 11. p 1036-1041.

Friedline, M., Phipps, M., Moore, T. and Versteeg, J. 2000. Proceedings of the Annual International Conference on Outdoor Recreation and Education (ICORE). Oxford, OH. 145 p.

Frymier, A. B. 1993. The Impact of Teacher Immediacy on Students' Motivation over the Course of a Semester. ED 367020. ERIC. p 3-29.

Future Professorate Project. 1995. Syracuse, NY: The Graduate School, Syracuse University.

Gallucci, K. February 2004. Prayer Study: Science or Not? Journal of College Science Teaching. Vol. 33, No. 4. p 32-35.

Gardner, H. 1993. Frames of Mind: The Theory of Multiple Intelligences. Basic Books. New York. 440 p.

Garkov, V. March/April 2002. Is the American Approach to Science Education the Best in the World? Market Forces-Driving Science Education in the United States. Journal of College Science Teaching. Vol. XXXI, No. 6. p 399-401.

Giere, R. N. 1997. Understanding Scientific Reasoning. 4th ed. Philadelphia, PA: Holt, Rinehart and Winston, Inc. 309 p.

Glase, J. C. 1981. Tested Studies for Laboratory Teaching: Proceedings of the Second Workshop/Conference of the Association for Biology Laboratory Education (ABLE). Kendall/Hunt Publishing Company, Dubuque, IA. 263 p.

Glaser, S. R. 1981. Oral Communication Apprehension and Avoidance: The Current Status of Treatment Research. Communication Education. v30 n4. p 321-341.

Glasson, G. E. and McKenzie, W. L. December 1997/January 1998. Investigative Learning in Undergraduate Freshman Biology Laboratories.
Journal of College Science Teaching. p 189-193.

Goldenberg, M. 2001. Outdoor and Risk Educational Practices. University of Minnesota. Rochester, MN. EDRS: ED 463939. 15 p.

Gough, P. B. May 1987. The Key to Improving Schools: An Interview with William Glasser. Phi Delta Kappan. p 656-662.

Grisé, D. J., and Kenney, A. M. November 2003. Nonmajors' Performance in Biology: Effects of Student-Based Initiatives and Class Size. Journal of College Science Teaching. p 18-21.

Gunstone, R. F., and Champagne, A. B. 1990. Promoting conceptual change in the laboratory. In E. Hegarty-Hazel (ed.), The Student Laboratory and the Science Curriculum. London: Routledge. 30 p.

Guskey, T. R. 2002. Perspectives on Grading and Reporting: Differences among Teachers, Students, and Parents. University of Kentucky, Lexington, KY. ED 464113. 11 p.

Guthrie, E. 2000. New Paradigms, Old Practices: Disciplinary Tensions in TA Training. ED 481001. ERIC. p 19-39.

Haladyna, T. M., Downing, S. M. and Rodriguez, M. 2002. A Review of Multiple-Choice Item-Writing Guidelines for Classroom Assessment. Applied Measurement in Education. 15(3). p 309-334.

Hampton, S. and Reiser, R. 2002. From Theory to Practice: Using an Instructional Theory to Provide Feedback and Consultation to Improve College Teaching, Learning, and Motivation. ED 465787. ERIC. p 2-40.

Harrington, R. 2005. A New Model for a Physics Course for Non-Science Majors: Using an Introductory Physics Course as a Laboratory for Training Pre-Service and In-Service Teachers to Teach Physics. University of Maine. http://perlnet.umephy.maine.edu/bio/harrington/ICUPE.html

Harris, C. L. 1984. Tested Studies for Laboratory Teaching: Proceedings of the Fourth Workshop/Conference of the Association for Biology Laboratory Education (ABLE). Kendall/Hunt Publishing Company, Dubuque, IA. 134 p.

Hart, C. 2001. Examining Relations of Power in a Process of Curriculum Change: The Case of VCE Physics. Research in Science Education. v31 n4. p 525-551.
ERIC EJ641945.

Hart, C., Mulhall, P., Berry, A., Loughran, J. and Gunstone, R. 2000. What Is the Purpose of This Experiment? Or Can Students Learn Something from Doing Experiments? Journal of Research in Science Teaching. Vol. 37, No. 7. p 655-675.

Hart, P. D. 1997. The Bayer Facts of Science Education III: A U.S. Student Report Card on Science Education. Executive Summary. US; Pennsylvania. 13 p.

Hattie, J., Marsh, H. W., Neill, J. T., Richards, G. E. Spring 1997. Adventure education and Outward Bound: Out-of-class experiences that make a lasting difference. Review of Educational Research. Washington. Vol. 67. p 43-87.

Hawkins, J., Pea, R. D. 1987. Tools for Bridging the Cultures of Everyday and Scientific Thinking. Journal of Research in Science Teaching. v24 n4. p 291-307.

Hayward, P. 2002. A Comparative Analysis of the Instructional Behaviors Used by Highly Effective and Highly Ineffective Instructors on the First Day of Class. ED 466782. ERIC. p 2-32.

Heady, J. E. 2001. Gauging Students' Learning in the Classroom: An Assessment Tool to Help Refine Instructors' Teaching Techniques. Journal of College Science Teaching. 31(3). p 157-161.

Heady, J. E. 2000. Assessment: a new way of thinking about learning, now and in the future. Journal of College Science Teaching. 29(6). p 415-421.

Hedges, K., and Mania-Farnell, B. November 2003. Mentoring Students in an Introductory Science Course. Journal of College Science Teaching. Vol. XXXII, No. 3. p 194-198.

Heiman, M., and Slomianko, J. 1987. Thinking Skills Instruction: Concepts and Techniques. Washington, DC: National Education Association of the United States. 312 p.

Halpern, D. 1987. Thinking Across the Disciplines: Methods and Strategies to Promote Higher-Order Thinking in Every Classroom. In Thinking Skills Instruction: Concepts and Techniques. Washington, DC: National Education Association of the United States. p 69-76.

Henderson, L. and Buising, C.
February 2001. A Research-Based Molecular Biology Laboratory: Turning Novice Researchers into Practicing Scientists. Journal of College Science Teaching. Vol. XXX, No. 5. p 322-327.

Henderson, K. and Fox, K. 1994. Methods, measures, and madness: Possibilities for outdoor education research. In McAvoy, L., Stringer, L. and Ewert, A. (Eds.), Coalition for Education in the Outdoors Second Research Symposium Proceedings. Cortland, NY: Coalition for Education in the Outdoors. p 9-13.

Hendricks, B. 1994. Improving Evaluation in Experiential Education. ERIC Digest. ED376998. ERIC issue: RIEAPR1995. 6 p.

Hendrix, K. G. 1995. Preparing Graduate Teaching Assistants (GTAs) to Effectively Teach the Basic Course. ED 384933. ERIC. 24 p.

Heppert, J., Ellis, J., Robinson, J., Wolfer, A. and Mason, S. February 2002. Problem Solving in the Chemistry Laboratory. Journal of College Science Teaching. Vol. 31(6). p 322-326.

Herbert, T. 1995. Experience and the Curriculum: Experiential Learning: A Teacher's Perspective. Boulder, CO: Association for Experiential Education. 678 p.

Herbert, T. 1995. Experiential Learning: A Teacher's Perspective. ED 398024. ERIC issue: RIEDEC1996. Chapter 3. p 19-35.

Hewson, M. G. and Hewson, P. W. 2003. Effect of Instruction Using Students' Prior Knowledge and Conceptual Change Strategies on Science Learning. Journal of Research in Science Teaching. Vol. 40, Supplement. p S86-S98.

Hobson, A. December 2000/January 2001. Teaching Relevant Science for Scientific Literacy. Journal of College Science Teaching. Vol. 30(4). p 238-243.

Hodson, D. 1990. A Critical Look at Practical Work in School Science. School Science Review. 71(256). p 33-40.

Hogan, K. and Maglienti, M. 2001. Comparing the Epistemological Underpinnings of Students' and Scientists' Reasoning about Conclusions. Journal of Research in Science Teaching. Vol. 38, No. 6. p 663-687.

Hogan, K. 1999. Thinking Aloud Together: A Test of an Intervention to Foster Students' Collaborative Scientific Reasoning.
Journal of Research in Science Teaching. Vol. 36, No. 10. p 1085-1109.

Holliday, W. G. 2003. Influential Research in Science Teaching: 1963-Present. Journal of Research in Science Teaching. Vol. 40, Supplement. p v-x.

Hwang, Dae-Yeop, and Henson, R. K. 2002. A Critical Review of the Literature on Kolb's Learning Style Inventory with Implications for Score Reliability. Paper presented at the Southwest Educational Research Association, Austin, TX.

Johnson, M. C. and Malinowski, J. C. November 2001. Navigating the Active Learning Swamp: Creating an Inviting Environment for Learning. Journal of College Science Teaching. Vol. XXXI, No. 3. p 172-177.

Johnstone, A. H. and Wham, A. J. B. May 1982. The demands of practical work. Education in Chemistry. 19(3). p 71-73.

Jordan, J., and Haines, B. Summer 2003. Fostering Quantitative Literacy: Clarifying Objectives, Assessing Student Progress. AAC&U. p 16-19.

Journet, A. R. P. 1991. Science as Investigation: A First Majors Course Teaching the Process. Tested Studies for Laboratory Teaching, Vol. 12 (C. A. Goldman, Editor). Proceedings of the 12th Workshop/Conference of the Association for Biology Laboratory Education (ABLE). 218 p.

Julian, R. 1998. Evaluation of Learning Experiences Outside the Classroom (LEOTC) Programme. NZCER Distribution Services, Wellington, New Zealand. 27 p.

Kahle, J. B. and Lakes, M. K. 2003. The Myth of Equality in Science Classrooms. Journal of Research in Science Teaching. Vol. 40, Supplement. p S58-S67.

Karmos, J. S. and Karmos, A. H. 1987. Strategies for Active Involvement in Problem Solving. In Thinking Skills Instruction: Concepts and Techniques. Washington, DC: National Education Association of the United States. p 99-110.

Karplus, R. 2003. Science Teaching and the Development of Reasoning. Journal of Research in Science Teaching. Vol. 40, Supplement. p S51-S57.

Karukstis, K. K.
November 2003. Examining Technology's Impact on Society: Using Case Studies to Introduce Environmental and Economic Considerations. Journal of College Science Teaching. p 36-40.

Kincaid, B. W. and Johnson, M. A. Our Tortuous Path to Inquiry-oriented Instruction in an Introductory Biology Course. In Traditional Approaches Toward Innovation (M. Caprio, Editor). An affiliate of NSTA. p 61-66.

Klaczynski, P. A. 1997. Bias in Adolescents' Everyday Reasoning and Its Relationship with Intellectual Ability, Personal Theories, and Self-Serving Motivation. Developmental Psychology. v33 n2. p 273-283.

Klaczynski, P. A. 1998. Development of Scientific Reasoning Biases: Cognitive versus Ego-Protective Explanations. Developmental Psychology. v34 n1. p 175-187.

Klahr, D. 1988. Cognitive Objectives in a Logo Debugging Curriculum: Instruction, Learning, and Transfer. Cognitive Psychology. v20 n3. p 362-404.

Knabb, M. T. December 1997/January 1998. Creating a Research Environment in an Introductory Cell Physiology Course. Journal of College Science Teaching. Vol. 27(3). p 205-209.

Kokkala, I. December 2002/January 2003. Writing Science Effectively: Biology and English Students in an Author-Editor Relationship. Journal of College Science Teaching. Vol. XXXII, No. 4. p 252-257.

Krockover, G., Adams, P., Eichinger, D., Nakhleh, M. and Shepardson, D. February 2001. Action-Based Research Teams: Collaborating to Improve Science Instruction. Journal of College Science Teaching. Vol. XXX, No. 5. p 313-317.

Kuhn, D. 1995. Scientific Thinking and Knowledge Acquisition: Reply. Monographs of the Society for Research in Child Development. v60 n4. p 152-157.

Kyle, W. C. 1997. The imperative to improve undergraduate education in science, mathematics, engineering, and technology. Journal of College Science Teaching. 34(6). p 547-549.

Lawson, A. E. 1978. The development and validation of a classroom test of formal reasoning. Journal of Research in Science Teaching. 15(1). p 11-24.

Lawson, A. E. and Weser, J. 1990.
The rejection of nonscientific beliefs about life: Effects of instruction and reasoning skills. Journal of Research in Science Teaching. 27(6). p 589-606.

Lawson, A. E. August 2000. Arizona State University. Based on: Lawson, A. E. 1978. Development and validation of the classroom test of formal reasoning. Journal of Research in Science Teaching. 15(1). p 11-24.

Lawson, A. E., Alkhoury, S., Benford, R., Clark, B. and Falconer, K. 2000. What Kinds of Scientific Concepts Exist? Concept Construction and Intellectual Development in College Biology. Journal of Research in Science Teaching. Vol. 37, No. 9. p 996-1018.

Lawson, A. E., Clark, B., Cramer-Meldrum, E., Falconer, K. A., Sequist, J. M., and Kwon, Y. J. 2000. The development of scientific reasoning in college biology: Do two levels of general hypothesis-testing skills exist? Journal of Research in Science Teaching. 37(1). p 81-101.

Lawson, A. E., Rissing, S. W. and Faeth, S. H. May 1990. An Inquiry Approach to Nonmajors Biology: A big picture, active approach for long-term learning. Journal of College Science Teaching. p 340-346.

Lawson, A. E. and Wollman, W. T. 2003. Encouraging the Transition from Concrete to Formal Cognitive Functioning: An Experiment. Journal of Research in Science Teaching. Vol. 40, Supplement. p S33-S50.

Lawson, A. E., Benford, R., Bloom, I., Carlson, M., Falconer, K., Hestenes, D., Judson, E., Piburn, M., Sawada, D., Turley, J. and Wyckoff, S. March/April 2002. Evaluating College Science and Mathematics Instruction: A Reform Effort That Improves Teaching Skills. Journal of College Science Teaching. Vol. XXXI, No. 6. p 388-398.

Lavoie, D. R. 1999. Effects of Emphasizing Hypothetico-Predictive Reasoning within the Science Learning Cycle on High School Students' Process Skills and Conceptual Understandings in Biology. Journal of Research in Science Teaching. Vol. 36, No. 10. p 1127-1147.

Lavoie, D. and Backus, A. May 1990. Students Write to Overcome Learning Blocks. Journal of College Science Teaching. p 353-358.

Lederman, N. G. 1999.
Teachers' understanding of the nature of science and classroom practice: Factors that facilitate or impede the relationship. Journal of Research in Science Teaching. 36(8). p 916-929.

Leonard, W. H. 1993. A Laboratory Teaching Model Which Trains Students to Exercise Discretion. Tested Studies for Biology Laboratory Teaching, Volume 7/8 (C. A. Goldman and P. L. Hauta, Editors). Proceedings of the 7th and 8th Workshop/Conferences of the Association for Biology Laboratory Education (ABLE). 187 p.

Leonard, W. H. and Penick, J. E. February 2005. Assessment of Standards-Based Biology Teaching. The American Biology Teacher. 67(2). Education Module. p 73-76.

Levin, B. B. 2001. Energizing Teacher Education and Professional Development with Problem-Based Learning. Association for Supervision and Curriculum Development. Alexandria, VA. 140 p.

Lewicki, J. 1998. Cooperative Ecology & Place: Development of a Pedagogy of Place Curriculum. Westby, WI. ED 461461. ERIC issue: RIEJUL2002. p 2-35.

Lewis, K. G. 1996-2003. Training Focused on Postgraduate Teaching Assistants: The North American Model. James Rhem & Associates, LLC. http://www.ntlf.com/html/lib/bib/lewis.htm. 16 p.

Lewis, E. L. and Linn, M. C. 2003. Heat Energy and Temperature Concepts of Adolescents, Adults, and Experts: Implications for Curricular Improvements. Journal of Research in Science Teaching. Vol. 40, Supplement. p S155-S175.

Libarkin, J. C., and Mencke, R. December 2001/January 2002. Students Teaching Students. Journal of College Science Teaching. Vol. 31(4). p 235-239.

Lilly, J. E. and Sirochman, R. F.
December 2000. An Innovative College Curriculum Model for Teaching Physical Science to Pre-Service Elementary Teachers. Electronic Journal of Science Education. Vol. 5, No. 2. ERIC EJ651171. 10 p.

Loacker, G. and Mentkowski, M. 1993. Creating a Culture Where Assessment Improves Learning. In Making a Difference: Outcomes of a Decade of Assessment in Higher Education, ed. T. Banta. San Francisco: Jossey-Bass.

Loacker, G. 1988. Faculty as a force to improve instruction through assessment. In Assessing Students' Learning. New Directions for Teaching and Learning, ed. M. H. McMillan. San Francisco: Jossey-Bass. No. 34. p 19-32.

Loacker, G., Cromwell, L. and O'Brien, K. 1986. Assessment in Higher Education: To Serve the Learner. In Assessment in American Education: Issues and Contexts, ed. C. Adelman. Washington, DC: U.S. Department of Education. Report No. OR86-301. p 46-62.

Lochhead, J. 1987. Thinking About Learning: An Anarchist Approach to Teaching Problem Solving. In Thinking Skills Instruction: Concepts and Techniques. Washington, DC: National Education Association of the United States. p 174-182.

Lord, J., and Marks, J. 2005. Creating College Opportunity for All: Prepared Students and Affordable Colleges. Challenge to Lead Series. Southern Regional Education Board, Atlanta, GA. ERIC ED 485273. 28 p.

Lord, T. May 1994. Using Constructivism to Enhance Student Learning in College Biology. Journal of College Science Teaching. Vol. 23(6). p 346-348.

Lord, T. 1997. A Comparison Between Traditional and Constructivist Teaching in College Biology. Innovative Higher Education. Vol. 21, No. 3. p 197-217.

Lord, T. 1999. A Comparison Between Traditional and Constructivist Teaching in Environmental Science. The Journal of Environmental Education. Vol. 30, No. 3. p 22-28.

Luft, J. A. 2003. Induction Programs for Science Teachers: What Research Says.
Science Teacher Retention: Mentoring and Renewal. Issues in Science Education; see SE 066 056. 11 p.

Luft, J. A., Kurdziel, J. P., Roehrig, G. H. and Turner, J. 2004. Growing a Garden Without Water: Graduate Teaching Assistants in Introductory Science Laboratories at a Doctoral/Research University. Journal of Research in Science Teaching. Vol. 41, No. 3. p 211-233.

Lumsden, A. S. and Morgan, J. G. 2003. Effective Methods of Training Biology Laboratory Teaching Assistants IV: The Use of Skits, a Teaching Plan, and Dealing with Plagiarism and Grading. Association for Biology Laboratory Education (ABLE). http://www.zoo.utoronto.ca/able. p 223-237.

Lunsford, E. December 2002/January 2003. Inquiry in the Community College Biology Lab: A Research Report and a Model for Making It Happen. Journal of College Science Teaching. Vol. XXXII, No. 4. p 232-235.

MacGregor, J., Cooper, J. L., Smith, K. A. and Robinson, P. Spring 2000. Strategies for Energizing Large Classes: From Small Groups to Learning Communities. New Directions for Teaching and Learning, No. 81. Jossey-Bass Publishers, San Francisco. p 1-25.

Magill, M. K., and Others. 1988. A Computer-Assisted System for Analysis of Interaction in Problem-Based Learning Groups. Evaluation and the Health Professions. v11 n3. ERIC EJ383278. p 318-332.

Main, E. C. 1994. Teaching Assistant Training and Teaching Opportunity (TATTO) Program. ED 416758. ERIC. p 2-25.

Mandeville, M. Y. 1993. The Effects of Teaching Assistants' Public Speaking Anxiety and the Evaluation Results of Classroom Interventions. ED 366033. ERIC. p 2-18.

Mangurian, L., Feldman, S., Clements, J. and Boucher, L. May 2001. Analyzing and Communicating Scientific Information: A Towson Transition Course to Hone Students' Scientific Skills. Journal of College Science Teaching. Vol. XXX, No. 7. p 440-445.

Manner, B. M. March/April 2001. Learning Styles and Multiple Intelligences in Students: Getting the Most Out of Your Students' Learning. Journal of College Science Teaching. Vol. XXX, No. 6. p 390-393.

Marbach-Ad, G. and Sokolove, P. G.
November 2000. Good Science Begins with Good Questions: Answering the Need for High-Level Questions in Science. Journal of College Science Teaching. Vol. XXX, No. 3. p 192-195.

McMillan, V. and Huerta, D. December 2002/January 2003. Eye on Audience: Adaptive Strategies for Teaching Writing. Journal of College Science Teaching. Vol. XXXII, No. 4. p 241-245.

McComas, W. F., Cox-Peterson, A. M. November 1999. Enhancing Undergraduate Science Instruction: The G-Step Approach: Capitalizing on the Pedagogical Strengths of Science Educators and the Content Expertise of Science TAs. p 120-125.

McGraw, J. B. March/April 1999. The Total Science Experience Laboratory for Sophomore Biology Majors: Transcending the Laboratory to Learn Much More Than Just Biology. Journal of College Science Teaching. Vol. 28(5). p 325-330.

Mehrens, W. A. and Lehmann, I. J. 1997. Measurement and Evaluation in Education and Psychology. 4th ed. Harcourt Brace College Publishers, Fort Worth. 592 p.

Millar, R., and Driver, R. 1987. Beyond processes. Studies in Science Education. 14. p 22-62.

Monaghan, P. 1989. Unusual Washington State Clearinghouse Develops Teaching and Learning Innovations for Colleges. Chronicle of Higher Education. v35 n39. p A11, 13.

Montana State University. 2000. Teaching Assistant Survey. http://www.montana.edu/teachlearn.

Moorcroft, T. A., Desmarais, K. H. and Hogan, K. 2000. Authentic Assessment in the Informal Setting: How It Can Work for You. The Journal of Environmental Education. v31 n3. 8 p.

Moore, K. D. 1998. Patterns of Inductive Reasoning: Developing Critical Thinking Skills. Kendall/Hunt Publishing Company. Dubuque, IA. 266 p.

Moore, R. March/April 2003. Attendance and Performance: How Important Is It for Students to Attend Class? Journal of College Science Teaching. Vol. XXXII, No. 6. p 367-371.

Myers, S. 1995. Exploring the Assimilation Stage of GTA Socialization: A Preliminary Investigation. ED 389016. ERIC. p 2-16.

Nageswari, K. S., Malhotra, N.
K., and Kaur, G. 2003. Pedagogical effectiveness of innovative teaching methods initiated at the Department of Physiology, Government Medical College, Chandigarh. Adv. Physiol. Educ. 28: p. 51-58.

National Academy of Sciences (NAS). 1996. National Science Education Standards. Washington, DC: National Science Foundation.

National Research Council. 2001. Knowing What Students Know: The Science and Design of Educational Assessment. National Academy Press. Washington, DC. 366 p.

National Research Council. 1996. National Science Education Standards. National Academy Press. Washington, DC. 262 p.

National Research Council. 1997. Science Teaching Reconsidered: A Handbook. National Academy of Sciences. Washington, DC. APS Science Teaching Forum-July 2003.

National Research Council. 2000. Science Teaching Reconsidered. National Academy Press, Washington, DC.

National Research Council. 2000. Inquiry and the National Science Education Standards: A Guide for Teaching and Learning. National Academy Press. Washington, DC. 202 p.

National Training Laboratories. 2003. Learning Pyramid. APS Science Teaching Forum. Bethel, Maine.

Newell, S. May 1990. Collaborative Learning in Engineering Design. Journal of College Science Teaching. p 359-362.

Newman, M. J. 2005. Problem-Based Learning: An Introduction and Overview of the Key Features of the Approach. Journal of Veterinary Medical Education. AAVMC. 32(1). p 12-20.

Novak, J. D. 2003. A Preliminary Statement on Research in Science Education. Journal of Research in Science Teaching. Volume 40. Supplement. p S1-S7.

NSTA Express. May 26, 2003. Americans Say Science, Technology Skills Are Essential, But Not Taught Well. http://science.nsta.org/nstaexpress/nstaexpress_2003_05_26_extra.htm

Nyquist, J. D. and Wulff, D. H. 1996. Working effectively with graduate assistants. Jossey-Bass, Inc., Thousand Oaks, CA.
Osborne, J., Collins, S., Ratcliffe, M., Millar, R. and Duschl, R. 2003. What "Ideas-about-Science" Should Be Taught in School Science? A Delphi Study of the Expert Community. Journal of Research in Science Teaching. Vol. 40. No. 7. p 692-720.

Osborne, R. E., Norman, J. and Basford, T. 1997. Utilizing Undergraduate Teaching Assistants: An Untapped Resource. ED 448640. ERIC. p. 2-33.

Ostiguy, N. and Haffer, A. March/April 2001. Assessing Differences in Instructional Methods: Uncovering How Students Learn Best. Journal of College Science Teaching. Vol. XXX. No. 6. p 370-374.

O'Sullivan, D. W. and Copper, C. L. May 2003. Evaluating Active Learning. Journal of College Science Teaching. Vol. 32(7). p 448-452.

Palmer, J. and Neal, P. 1994. The Handbook of Environmental Education. Routledge Publishers, Inc. New York, New York. 350 p.

Payne, R. K. 1996. Framework for Understanding Poverty. Aha! Process, Inc. Highlands, TX. 207 p.

Pazzani, M. J. and Flowers, M. 1990. Creating a memory of causal relationships: an integration of empirical and explanation-based learning methods. Hillsdale, NJ: L. Erlbaum Associates. 350 p.

PBL Insight: to solve, to learn, together. 2000. Samford University, Indiana University-Purdue University. Vol. 3. No. 3. 9 p.

Perkins, D. N. 1987. Conversations With David N. Perkins. Thinking Skills Instruction: Concepts and Techniques. Washington, DC. National Education Association of the United States. p 49-57.

Piaget, J. 2003. Cognitive Development in Children: Piaget: Development and Learning. Journal of Research in Science Teaching. Vol. 40. Supplement. p S8-S18.

Popp, J. A. 1999. Cognitive Science and Philosophy of Education. Caddo Gap Press. San Francisco, CA. 240 p.

Posner, G., Strike, K., Hewson, P. and Gertzog, W. 1982. Accommodation of a scientific conception: Toward a theory of conceptual change. Science Education 66(2): p 211-227.

Preparing Future Professors: A New York State Consortium Project. 1996.
Syracuse, N.Y.: The Graduate School, Syracuse University.

Presseisen, B. 1987. Thinking and Curriculum: Critical Crossroads for Educational Change. Thinking Skills Instruction: Concepts and Techniques. Washington, DC. National Education Association of the United States. p 31-39.

Redwine, R., Tervalon, C. D. and Breslow, L. 2003. Report on TA Training and Development at MIT. http://web.mit.edu/tll/edresearch/reports/ta_survey.pdf.

Reif, F. and Larkin, J. H. 1991. Cognition in Scientific and Everyday Domains: Comparison and Learning Implications. Journal of Research in Science Teaching. v. 28, n2. p. 733-760.

Renner, J. W., Abraham, M. R. and Birnie, H. H. 1985a. The importance of the form of student acquisition of data in physics learning cycles. Journal of Research in Science Teaching, 22, p. 303-325.

Renner, J. W., Abraham, M. R. and Birnie, H. H. 1985b. Secondary school students' beliefs about the physics laboratory. Science Education, 69, p. 649-663.

Rhem, J. December 1998. Problem-Based Learning: An Introduction. Vol. 8. No. 1. p 1-7. Available from: http://www.ntlf.com/html/pi/9812/pbl_1.htm

Rice, R. E. Overcoming Traditional Opposition to Nontraditional Courses. From Traditional Approaches Toward Innovation. Caprio, M., Editor. p. 79-85.

Roehrig, G. H., Luft, J. A., Kurdziel, J. P. and Turner, J. A. 2003. Graduate Teaching Assistants and Inquiry-Based Instruction: Implications for Graduate Teaching Assistant Training. Research: Science and Education. Journal of Chemical Education. October. Vol. 80. No. 10. p. 1206.

Roth, W.-M. and Roychoudhury, A. 2003. Physics Students' Epistemologies and Views About Knowing and Learning. Journal of Research in Science Teaching. Vol. 40. Supplement. p S114-S139.

Rowe, M. B. 2003. Wait-Time and Rewards as Instructional Variables, Their Influence on Language, Logic, and Fate Control: Part One: Wait-Time. Journal of Research in Science Teaching. Vol. 40. Supplement. p S19-S32.

Rowley, E. N. 1993.
Keeping the Faith: Teaching Assistants and the Pursuit of Teaching Excellence. ED 360656. ERIC. p. 2-12.

Russell, C. P. and French, D. P. September 2001. Assessing Student Participation in Biology Laboratories: A Technique Borrowed From Ethology. The American Biology Teacher. Volume 63. No. 7. p 491-497.

Russell, C. P. and French, D. P. December 2001/January 2002. Factors Affecting Participation in Traditional and Inquiry-Based Laboratories: A Description Based on Observations and Interviews. Journal of College Science Teaching. Vol. XXXI. No. 4. p 225-229.

Russell, I. J., Hendricson, W. D. and Herbert, R. J. November 1984. Effects of lecture information density on medical student achievement. Journal of Medical Education. 59. p 881-889.

Rutherford, F. J. and Ahlgren, A. 1990. Science for All Americans. New York: Oxford University Press. 217 p.

Sanders, N. 1987. Classroom Questions: What Kinds? New York: Harper & Row, 1966. Thinking Skills Instruction: Concepts and Techniques. Washington, DC. National Education Association of the United States. 134 p.

Savage, M. P. and Sharpe, T. 1998. Demonstrating the Need for Formal Graduate Student Training in Effective Teaching Practices. The Physical Educator. V. 55. no. 3. p. 130-137.

Savitz, F. and Gardner, H. 1999. Meet Benjamin Bloom: Strategies for the Future Enliven Methods from the Past. Paper presented at the Annual Meeting of the Pennsylvania Council for the Social Studies, Pittsburgh, PA, October 14-15, 1999.

Sawada, D., Piburn, M., Falconer, K., Turley, J., Benford, R., Bloom, I. and Judson, E. 2000a. Reformed teaching observation protocol (RTOP). ACEPT Technical Report No. IN00-1. Tempe, AZ: Arizona Collaborative for Excellence in the Preparation of Teachers.

Sawada, D., Piburn, M., Falconer, K., Turley, J., Benford, R., Bloom, I. and Judson, E. 2000b. Reformed teaching observation protocol (RTOP) training guide. ACEPT Technical Report No. IN00-2. Tempe, AZ: Arizona Collaborative for Excellence in the Preparation of Teachers.
Schunn, C. D. and Anderson, J. R. 1999. The Generality/Specificity of Expertise in Scientific Reasoning. Cognitive Science. Vol. 23(3). p 337-370.

Scriven, M. and Paul, R. 2003. Defining Critical Thinking. National Council for Excellence in Critical Thinking. Available from: http://www.criticalthinking.org/University/univclass/Defining.html.

Shipman, H. L. February 2001. Hands-On Science, 680 Hands at a Time: Shrinking the Large Lecture with a Collapsing Can Experiment. Journal of College Science Teaching. Vol. XXX. No. 5. p 318-321.

Shymansky, J. A., Kyle, Jr., W. C. and Alport, J. M. 2003. The Effects of New Science Curricula on Student Performance. Journal of Research in Science Teaching. Vol. 40: S1. p S68-S85.

Simon, H. A. 1986. Understanding the process of science: The psychology of scientific discovery. In T. Gamelius (Ed.), Progress in science and its social conditions. Oxford: Pergamon Press. p. 159-170.

Smith, G. R. May 2001. Guided Literature Explorations: Introducing Students to the Primary Literature. Journal of College Science Teaching. Vol. XXX. No. 7. p 465-469.

Smith, K. A. 1999. Inquiry in Large Classes. Sigma Xi Conference: Reshaping Undergraduate Science and Engineering Education: Tools for Better Learning. 11 p.

Snow, R. E. and Snow, D. F. 1989. Implications of cognitive psychology for educational measurement. In R. L. Linn (Ed.), Educational Measurement (3rd edition). Macmillan Publishing Company, New York. p 263-330.

Stephans, J. 1994. Targeting Students' Science Misconceptions: Physical Science Activities Using the Conceptual Change Model. Idea Factory, Riverview, FL.

Streeter, J. and Bowdoin, H. 1997. Place-Based Education: Two Views from the Past. ED 421310. ERIC issue: RIEDEC1998. p 8-20.

Streeter, J. and Bowdoin, H. 1997. Place-Based Education: Two Views from the Past. Proceedings of the 1997 Forum. Selborne, England. 14 p.

Stronge, J. 2002. Qualities of Effective Teachers. Association for Supervision and Curriculum Development.
Alexandria, Virginia. p 129.

Sundberg, M. D., Armstrong, J. E., Dini, M. L. and Wischusen, E. W. March/April 2000. Some Practical Tips for Instituting Investigative Biology Laboratories. Journal of College Science Teaching. p 353-359.

Swartz, A. M. 1987. Critical Thinking Attitudes and The Transfer Question. Thinking Skills Instruction: Concepts and Techniques. Washington, DC. National Education Association of the United States. p 58-68.

Swartz, R. J. 1987. Restructuring What We Teach To Teach For Critical Thinking. Thinking Skills Instruction: Concepts and Techniques. Washington, DC. National Education Association of the United States. p 111-118.

Taconis, R. M., Ferguson-Hessler, G. M. and Broekkamp, H. 2001. Teaching Science Problem Solving: An Overview of Experimental Work. Journal of Research in Science Teaching. Vol. 38. No. 4. p 442-468.

Takalkar, P., Micceri, T. and Eison, J. 1993. Tomorrow's Professors: Helping University Teaching Assistants Develop Quality Instructional Skills. ED 453703. ERIC. p. 2-12.

Tarrant, T. 2005. Doctoral Dissertation. Zoology Department. Michigan State University. College of Natural Science.

Taylor, J. A., Lunetta, V. N., Dana, T. M. and Tasar, M. F. March/April 2002. Bridging Science and Engineering: An Integrated Course for Nonscience Majors. Journal of College Science Teaching. Vol. XXXI. No. 6. p 378-383.

Tervalon, C. D. and Breslow, L. September 10, 2003. Report on TA Training and Development at MIT. Prepared for the Committee on TA Development and Dean Robert Redwine. Massachusetts Institute of Technology. Available from: http://web.mit.edu/tll/edresearch/reports/ta_survey.pdf

Tobin, K. and Gallagher, J. J. 2003. The Role of Target Students in The Science Classroom. Journal of Research in Science Teaching. Vol. 40. Supplement. p S99-S113.

Trowbridge, J. E. and Wandersee, J. H. 2003. Identifying Critical Junctures in Learning in a College Course on Evolution. Journal of Research in Science Teaching. Vol. 40. Supplement. p S140-S154.
Tufts University. 1993. TA Survey. http://ase.tufts.edu/cae/main/ta.htm

U.S. Department of Education (DOE). 1983. A Nation at Risk: The Imperative for Educational Reform. A Report to the Nation and the Secretary of Education. Washington, DC: DOE. 65 p.

University of Washington, School of Oceanography. 1996. TA Questionnaire. Date of citation: August 22, 2005. Available from: http://www.ocean.washington.edu/people/faculty/mcmanus/questionnaire.html

Vass, E., Schiller, D. and Nappi, A. J. 2000. The Effects of Instructional Intervention on Improving Proportional, Probabilistic, and Correlational Reasoning Skills among Undergraduate Education Majors. Vol. 37. No. 9. p 981-995.

VonSecker, C. E. and Lissitz, R. W. 1999. Estimating the Impact of Instructional Practices on Student Achievement in Science. Journal of Research in Science Teaching. Vol. 36. No. 10. p 1110-1126.

Washington State University. 1996. http://www.ocean.washington.edu/people/faculty/mcmanus/questionnaire.html.

Watts, M. 1988. From Concept Maps to Curriculum Signposts. Physics Education. v. 23, n2. p. 74-79.

Weimer, M. 1990. Improving College Teaching: Strategies for Developing Instructional Effectiveness. The Jossey-Bass Higher Education Series. Jossey-Bass, Inc., San Francisco, CA. 232 p.

Weinbaum, A., Gregory, L., Wilkie, A., Hirsch, L. and Fancsali, C. 1996. Expeditionary Learning Outward Bound. Summary Report. ED 462456. ERIC issue: RIEAUG2002. p 2-45.

Westerlund, J. F. and Stephenson, A. L. November 2003. Phasing in Future Teachers: A Multidisciplinary Teaching Technique. Journal of College Science Teaching. Vol. XXXII. No. 3. p 171-175.

White, B. May 1998. A Curriculum for Recitation Sections in Introductory Biology: Designing Lesson Plans for TAs that Emphasize Open Communication and Spirited Interaction with Students. Journal of College Science Teaching. Vol. 27. Iss. 6. p 407-410.

Wiggins, G. and McTighe, J. 2000. An Introduction to Understanding By Design. Presentation.
Jay McTighe. 6581 River Run, Columbia, MD 21044-6066. p 1-86.

Williamson, L. K. 2001. The Pedagogy of Pedagogy: Teaching GTAs To Teach. ED 461900. ERIC. p. 2-20.

Wilson, J. T. and Stensvold, M. S. 1988. Improving laboratory instruction: An interpretation of research. Journal of College Science Teaching 20(6): 350 p.

Worthen, T. K. 1992. The Frustrated GTA: A Qualitative Investigation Identifying the Needs within the Graduate Teaching Assistant Experience. ED 355598. ERIC. p. 2-30.

Wright, V., Marsh, G. and Miller, M. T. 2000. A Critical Comparison of Graduate Student Satisfaction in Asynchronous and Synchronous Course Instruction. Planning and Changing. Vol. 31. No. 1-2. ERIC EJ 652039. p 107-118.

Wyckoff, S. February 2001. Changing the Culture of Undergraduate Science Teaching: Shifting from Lecture to Interactive Engagement and Scientific Reasoning. Journal of College Science Teaching. Vol. XXX. No. 5. p 306-312.

Yager, R. E. 1992. What we did not learn from the 60s about science curriculum reform. Journal of Research in Science Teaching. 29. p. 905-910.

Yehudit, J. D. and Herscovitz, O. 1999. Question-Posing Capability as an Alternative Evaluation Method: Analysis of an Environmental Case Study. Journal of Research in Science Teaching. Vol. 36. No. 4. p 411-430.

Zachos, P., Hick, T. L., Doane, W. E. J. and Sargent, C. 2000. Setting Theoretical and Empirical Foundations for Assessing Scientific Inquiry and Discovery in Educational Programs. John Wiley & Sons, Inc. Vol. 37. No. 9. p 938-962.

Zimmerman, C. 2000. The Development of Scientific Reasoning Skills. Developmental Review. 20. p 99-149.

Zohar, A. and Nemet, F. 2002. Fostering Students' Knowledge and Argumentation Skills Through Dilemmas in Human Genetics. Journal of Research in Science Teaching. Vol. 39. No. 1. p 35-62.