IMPROVING ENGINEERING STUDENTS' NON-TECHNICAL PROFESSIONAL SKILLS AND ATTITUDES TO ENGINEERING THROUGH INQUIRY BASED LAB LEARNING

By

Ying Huang

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of Higher, Adult, and Lifelong Education-Doctor of Philosophy

2014

ABSTRACT

IMPROVING ENGINEERING STUDENTS' NON-TECHNICAL PROFESSIONAL SKILLS AND ATTITUDES TO ENGINEERING THROUGH INQUIRY BASED LAB LEARNING

By

Ying Huang

This study examines the effectiveness of inquiry-based instructional labs in improving engineering students' non-technical professional skills and attitudes to engineering by answering two sets of research questions: 1) Do inquiry-based labs enhance students' professional skills and attitudes toward engineering? 2) How does the complexity of learning tasks influence student learning outcomes in inquiry-based labs?

This study adopted a multiple-case design to address the research questions in naturalistic settings and a mixed-methods approach with an emphasis on the quantitative component. The final cases comprised three lab courses offered at two large research universities in the Midwest: Engineering Design Lab, Chemistry Lab, and Bio-system Lab. To address whether students' professional skills and attitudes toward engineering changed after the inquiry-based lab learning experience, individual-level survey data were collected from students enrolled in the three sample courses using the Inquiry-based Lab Learning Inventory (ILLI) developed for the current study. Additionally, I interviewed seven students who had taken the Engineering Design Lab within the previous five years. Paired sample tests were conducted to test the null hypothesis that students' skills and attitudes to engineering did not change from before to after the sample course. Then, estimates of effect sizes and confidence intervals were employed to measure the magnitude of change. The second research question was addressed through quantitative analysis of the survey data. Two survey items asked about students' perceived difficulty and workload of the course. Multivariate regression analysis and separate regression analyses were performed to examine the association between the difficulty and workload of the course and student learning outcomes.

Across sample cases, students showed positive improvement in self-perceived skills in conducting lab work, attitudes toward teamwork and communication, and use of active coping strategies when facing stress or difficulty in problem solving. The magnitude of change varied across sample cases; students taking the Engineering Design Lab showed the greatest improvement in these areas. Additionally, regression results showed a positive association between the perceived difficulty of the course and students' attitudes toward teamwork and communication and their coping strategies. However, the benefit of increased difficulty for these two learning outcomes began to decline beyond an upper-middle difficulty level. Therefore, excessive difficulty in a lab experiment or project could compromise the benefits of inquiry-based learning.

Findings about how inquiry-based instructional labs influence students' self-perceived skills in conducting cooperative inquiry-based projects and attitudes to engineering have several implications. 1) Inquiry-based lab instruction is a potentially powerful instructional method for developing students' attributes as engineering professionals.
It can be utilized as a complement to the current engineering curriculum to achieve a balance between cognitive and affective learning. 2) Engineering programs should engage students in inquiry-based lab work early on rather than only in upper-division courses. 3) Faculty, staff, or teaching assistants who design, oversee, or facilitate instructional labs should be provided with teaching support so that they have the capacity and confidence to engage students in inquiry-based lab work. 4) Facilitators of inquiry-based labs should check students' perceived difficulty of learning tasks periodically and adjust the complexity of learning tasks or provide scaffolding accordingly to achieve better learning outcomes.

This dissertation is dedicated to my parents, who are taking care of my grandpa in China and could not attend my graduation ceremony.

ACKNOWLEDGEMENTS

I would like to thank my committee members, Jim Fairweather, Daina Briedis, John Dirkx, and Ann Austin, who helped improve the design of my study at the proposal stage, provided valuable information and support during data collection, and generously offered time and effort to move me forward to a successful completion of my doctoral dissertation. I am also indebted to the many faculty members in engineering programs at several institutions who allowed me to observe their classes and tried their best to help me collect good data. I would like to especially thank the faculty members who taught the three sample courses; without their enormous support and assistance, collecting data in these courses would have been impossible. I am equally grateful to the students who returned survey forms or participated in interviews. Additionally, many engineering faculty members did whatever they could to help my study: they patiently introduced their lab courses to me in person or via email, introduced me to others who might be able to help, discussed their previous research on inquiry-based learning to inform my conceptualization, or offered their courses as data collection sites, even when practical issues ultimately prevented data collection there.

This research was supported by a dissertation fellowship ($8,000) from the College of Education at Michigan State University. This support made it financially possible for me to travel back and forth between New Jersey and Michigan last year to complete data collection at two institutions from the spring to the fall semester. I really appreciate this generous funding support.

Finally, I would like to thank my husband, Weimin, and my parents for encouraging and "pushing" me throughout the dissertation process. Last but not least, my church friends' prayers were an important support as I walked through difficult times.

TABLE OF CONTENTS
LIST OF TABLES
LIST OF FIGURES
CHAPTER I: INTRODUCTION
    Challenges in Current Engineering Education
    Instructional Labs in Engineering Curricula
    Purpose Statement and Research Questions
CHAPTER II: LITERATURE REVIEW
    Learning in a Lab Context
    Conventional Engineering Labs
    Innovative Approaches to Lab Instruction
    The Status of Inquiry-based Lab Instruction
        Types of Inquiry-based Labs
            Category 1—inquiry before lab work
            Category 2—inquiry before and after lab work
            Category 3—partially inquiry-based lab work
            Category 4—fully inquiry-based lab experiments
            Summary
        Inquiry-based Lab Courses at Different Levels
            Capstone design course
            Introductory level course
            Upper division lab course
    Controversy over Innovative Approaches
    Conceptual Framework
        Learning outcomes
        Learning approach
        Learning context
        Students' characteristics
CHAPTER III: STUDY DESIGN AND SAMPLING
    Multiple-Case Design
    Sample Selection
    Case Context
        Case A. Engineering Design Course
            Lab projects
            Lab setting
            Course instructors and teaching assistants
        Case B. Bio-system Engineering Lab
            Lab project
            Lab setting
            Course instructor
        Case C. Chemistry Lab
            Lab experiments
            Lab setting
            Course instructor and teaching assistants
        Nature of the Lab Learning in Sample Cases
    Data Collection Procedures
        Identify Changes in Professional Skills and Attitudes to Engineering
            Quantitative data
            Qualitative data
        Role of Difficulty and Workload in Inquiry-based Labs
CHAPTER IV: CONSTRUCTION AND VALIDATION OF THE SURVEY INSTRUMENT
    Construction of Measures
        Creativity
        Learn from Failure
        Teamwork and Communication
        Interrelationships among Learning Outcomes
        Learning approach
        Students' Characteristics
    Reliability and Validity
        Parallel Analysis
        Exploratory Factor Analysis (EFA)
            Adequacy of sample size
            Sampling adequacy
            Items deleted
            Variance explained
            Factor loadings
        Confirmatory Factor Analysis (CFA)
    Factor Scores
    Demographic Data
    Exploratory Factor Analysis Results
        Number of Factors to Retain
        Factor Solution
            Deleted items
            Eigenvalues of pre- and post-test factor structure
            Selecting items for each factor
        Internal Reliability
    Confirmatory Factor Analysis Results
        Statistical Assumptions and Corrections
        Fit Indices
        Model Fit
        Factor Correlations
    Summary and Discussion
CHAPTER V: DATA ANALYSIS AND LIMITATIONS
    Missing Data
    Changes in Professional Skills and Attitudes to Engineering
    The Relationship between Task Complexity and Learning Outcomes
    Limitations
CHAPTER VI: CHANGES IN SKILLS AND ATTITUDES
    Quantitative Assessment Results
    Qualitative Data from the Engineering Design Lab
    Discussion
CHAPTER VII: TASK COMPLEXITY AND LEARNING OUTCOMES
    Descriptive Results
    Polynomial Terms in the Model
    Assumption of Normality
    Results for the Chemistry Lab
        Difficulty
        Learning Approach
    Post Estimation Analysis
        Normality of Residuals
        Homoscedasticity of Residuals
    Results for the Engineering Design Lab
    Discussion
CHAPTER VIII: IMPLICATIONS AND RECOMMENDATIONS
    Translating Research Findings to Initiatives
    Challenges in Innovating Instructional Labs
    Policy Recommendations
    Research Recommendations
APPENDICES
    Appendix A: Inquiry-based Lab Learning Inventory (ILLI)
    Appendix B: Interview Protocol
REFERENCES

LIST OF TABLES

Table 1: Comparison of Four Levels of Inquiry (Banchi & Bell, 2008)
Table 2: Potential Cases Identified at the First Step of Sample Selection
Table 3: Case Summary
Table 4: Summary of Interviewees
Table 5: Number of Items of Each Sub-scale
Table 6: Participant Demographics
Table 7: Preliminary Parallel Analysis Results for the First Five Factors (30 items; N=481)
Table 8: Items Deleted Based on Preliminary PCA and the Rationale
Table 9: Parallel Analysis Results for the First Five Components (24 items; N=481)
Table 10: Eigenvalues for Case A. Pre-course and Post-course Assessment (N=486)
Table 11: Component Correlation Matrix from Orthogonal Solution
Table 12: Rotated Factor Pattern and Communalities (N=486)
Table 13: Cronbach's Alpha Reliability Estimates for Each Component of Each Case
Table 14: Fit Indexes for Each Model, with Comparisons Between Models, for Pre- and Post-course Response Separately (N=781)
Table 15: Correlation Matrix of Factors based on Model Four
Table 16: Descriptive Information of Factor Scores in Three Sample Courses
Table 17: Signrank Test Results and Effect Sizes of Change
Table 18: Description of Students' Perceptions on Task Complexity, Relevant Learning Experience, and Learning Approaches
Table 19: Multivariate Multiple Regression Results for Case C. Chemistry Lab with Standard Error in Parentheses (N=760)
Table 20: Separate Multiple Regression Results for Engineering Design Lab (N=174)

LIST OF FIGURES

Figure 1: Presage-process-product (3Ps) Model of Student Learning (adapted from Prosser & Trigwell, 1999)
Figure 2: Scree Plot of Eigenvalues from EFA
Figure 3: Changes in Factor Scores from Pre- to Post-course Response by Case
Figure 4: Curve Fit for Outcome Variables on Perceived Difficulty of the Course (Chemistry Lab)
Figure 5: Curve Fit for EXPLORE and COPING on Perceived Workload of the Course (Chemistry Lab)
Figure 6: Relationship Between the Level of Difficulty and Its Effects on ATTITUDE and COPING
Figure 7: Normality and Homoscedasticity of Residuals (left to right: Kernel Density Estimate, P-P plot, and Q-Q plot)

CHAPTER I: INTRODUCTION

Challenges in Current Engineering Education

Policy makers worldwide consider the adequate preparation of engineering graduates fundamental to national economic competitiveness. The qualities that employers most seek in new engineering hires go beyond technical knowledge to include the ability to work on teams, communication skills, engagement in lifelong learning, and problem-solving ability (Lattuca, Terenzini, & Volkwein, 2006).
The National Academy of Engineering (NAE) pointed out that the characteristics needed by the engineers of the future include "strong analytical skills; practical ingenuity; creativity; communication skills; principles of business and management; leadership; high ethical standards; professionalism; dynamism; agility; resilience; flexibility; and life-long learning" (The Engineer of 2020, 2004). Some of these attributes, such as analytical skills and practical ingenuity, have long been familiar goals of engineering programs. Others, like creativity and communication, are gaining importance because of the need for engineers to work across disciplines in today's engineering practice (Sheppard, Macatangay, & Colby, 2009).

Since the late 1980s, business leaders and policy makers in the U.S. have raised concerns about the relevance of the undergraduate engineering curriculum to the changes that have taken place in engineering practice (Prados, Peterson, & Lattuca, 2005). Criticisms leveled at U.S. engineering education include: graduates often lack design capability, creativity, understanding of manufacturing or quality processes, communication skills, the ability to work in teams (Todd et al., 1993, cited in Prados, Peterson, & Lattuca, 2005), concern for the social impact of engineering work (Astin, 1993), and awareness of, as well as the ability to handle, ethical issues in engineering work (McGinn, 2003). Despite several decades of curricular and instructional changes, the literature suggests that undergraduate engineering programs remain more successful in imparting knowledge to students than in preparing students adequately for the practice of engineering (Sheppard et al., 2009).

An important step taken to better prepare undergraduate students to enter the engineering profession was the adoption of a new set of standards, the Engineering Criteria 2000 (EC2000), by the U.S. Accreditation Board for Engineering and Technology (ABET Inc.) (ASEE, 1998; Kanter, Smith, McKenna, Rieger, & Linsenmeier, 2003). Aligned with the expectations of engineering practice, the new ABET criteria retained the emphasis on students' technical capability while attaching equal importance to other attributes of engineering professionals, such as the ability to communicate effectively, understand professional and ethical responsibility, and conduct engineering design within realistic constraints. The new criteria also shifted the basis for accreditation from what is taught to what students learn, requiring engineering programs to identify course or program objectives that address the ABET outcomes as well as specify the plans used to address the objectives and outcomes in the program curriculum (Felder & Brent, 2003; Lattuca, Terenzini, & Volkwein, 2006).

Instructional Labs in Engineering Curricula

In the midst of curriculum innovation in engineering education, engineering educators and researchers began to reconsider the role of instructional labs in the curriculum and the approaches used to achieve desired learning outcomes (Feisel & Peterson, 2002; Feisel & Rosa, 2005; Sheppard et al., 2009). The instructional lab is usually a place for students to "learn something that practicing engineers are assumed to already know" (Feisel & Rosa, 2005, p. 121). The instructional lab experience usually involves "personal interaction with equipment/tools leading to the accumulation of knowledge and skills required in a practice-oriented profession" (Feisel & Peterson, 2002, p. 5).
The undergraduate engineering curriculum usually includes lab-enhanced engineering courses, which combine lecture and lab sessions, and stand-alone lab courses, which draw on material from one or several pre- or co-requisite courses. Introductory or fundamental courses in the engineering sciences, such as mechanics, thermodynamics, and materials, normally incorporate instructional labs to enhance students' acquisition of theories and their ability to apply theories or measurement skills to practice (Sheppard et al.). Another typical use of instructional labs is for students to complete senior capstone projects. Especially in fields like civil engineering or chemical engineering, where it is usually not realistic to engage students in authentic situations with outside clients, instructional labs allow students to perform simulations or create and test prototypes (Dutson, Todd, Magleby, & Sorensen, 1997). A growing trend is to offer freshman-level laboratory courses. The major purposes of exposing students early to laboratory work usually include familiarizing students with methods, techniques, and principles related to engineering practice at an introductory level, enhancing students' interest in engineering, and retaining students in engineering fields (Sheppard et al.; Sheppard & Jenison, 1997; Wuersig, 2007).

Nevertheless, it is difficult to categorize lab courses. Some lab courses can be found in most engineering schools. Other instructional labs are unique, depending on specific disciplinary, departmental, or institutional needs or cultures. A case in point is the Multifaceted Engineering Systems Lab (MEL) course offered at the Colorado School of Mines, which engages students from different engineering fields in multidisciplinary collaboration (King et al., 1999).

Although the ABET criteria do not identify the kinds of instruction (e.g., lecture, labs, or discussion) employed by engineering programs to achieve certain learning outcomes, the EC2000 program outcomes related to students' experimental skills or ability to use modern tools suggest the important role of lab learning in the undergraduate engineering curriculum (Feisel & Rosa, 2005). The existing literature has also revealed a general consensus that instructional labs have the potential to afford students the opportunity to work on real problems with actual hardware, explore professional practice, and develop the attributes of engineering professionals (Balamuralithara & Woods, 2008; Feisel & Rosa; Ma & Nickerson, 2006). Although it is increasingly common to have an instructional lab that involves physical, computer-assisted, and simulated tools, virtual lab experiences only approximate a "hands-on" experience; interaction with physical tools and materials remains irreplaceable in many engineering disciplines, such as chemical, biomedical, and mechanical engineering (Feisel & Peterson, 2002; Krivickas & Krivickas, 2007).

In 2002, ABET and the Sloan Foundation facilitated a discussion on the expected learning outcomes of engineering lab instruction. The colloquy was a remarkable step taken to expand the understanding of instructional engineering labs. Around 50 distinguished engineering educators from different institutions and disciplines in the U.S. participated in the discussion (hereafter called the ABET/Sloan Colloquy) of the fundamental objectives of engineering instructional labs. The colloquy generated the following 13 fundamental lab objectives, irrespective of the method of delivery.
Objective 1. Instrumentation: Apply appropriate sensors, instrumentation, and/or software tools to make measurements of physical quantities.

Objective 2. Models: Identify the strengths and limitations of theoretical models as predictors of real-world behaviors. This may include evaluating whether a theory adequately describes a physical event and establishing or validating a relationship between measured data and underlying physical principles.

Objective 3. Experiment: Devise an experimental approach, specify appropriate equipment and procedures, implement these procedures, and interpret the resulting data to characterize an engineering material, component, or system.

Objective 4. Data Analysis: Demonstrate the ability to collect, analyze, and interpret data, and to form and support conclusions. Make order of magnitude judgments, and know measurement unit systems and conversions.

Objective 5. Design: Design, build, or assemble a part, product, or system, including using specific methodologies, equipment, or materials; meeting client requirements; developing system specifications from requirements; and testing and debugging a prototype, system, or process using appropriate tools to satisfy requirements.

Objective 6. Learn from Failure: Recognize unsuccessful outcomes due to faulty equipment, parts, code, construction, process, or design, and then re-engineer effective solutions.

Objective 7. Creativity: Demonstrate appropriate levels of independent thought, creativity, and capability in real-world problem solving.

Objective 8. Psychomotor: Demonstrate competence in selection, modification, and operation of appropriate engineering tools and resources.

Objective 9. Safety: Recognize health, safety, and environmental issues related to technological processes and activities, and deal with them responsibly.

Objective 10. Communication: Communicate effectively about lab work with a specific audience, both orally and in writing, at levels ranging from executive summaries to comprehensive technical reports.

Objective 11. Teamwork: Work effectively in teams, including structure individual and joint accountability; assign roles, responsibilities, and tasks; monitor progress; meet deadlines; and integrate individual contributions into a final deliverable.

Objective 12. Ethics in the Lab: Behave with highest ethical standards, including reporting information objectively and interacting with integrity.

Objective 13. Sensory Awareness: Use the human senses to gather information and to make sound engineering judgments in formulating conclusions about real-world problems. (Feisel & Peterson, 2002, p. 367)

Six (objectives 6, 7, 9, 10, 11, and 12) of the 13 lab objectives include both cognitive outcomes and affective processes that deal with motivation, attitudes, enthusiasm, and the like (Feisel & Rosa, 2005; Krathwohl, Bloom, & Masia, 1973). These six lab objectives are sometimes referred to as "nontechnical professional skills and attitudes" (Sheppard et al., 2009, p. 62) or professional goals (Lynch, Russell, Evans, & Sutterer, 2009). These objectives provided a foundation for scholars to further illustrate and evaluate expected lab learning outcomes on a disciplinary basis. The emphasis given to nontechnical professional skills and attitudes addresses the current call for improving engineering graduates' attributes to meet the requirements of engineering practice.
Despite the widely acknowledged significance of instructional labs in engineering education, current engineering programs have, in general, underused instructional labs to achieve ABET learning outcomes or to prepare practically trained individuals for the engineering profession (Feisel & Peterson, 2002; Feisel & Rosa; Felder & Brent, 2003; Kanter et al., 2003; Watai, Brodersen, & Brophy, 2007). Compared to the rest of the undergraduate engineering curriculum, the instructional lab, especially the pedagogy used in this unique learning environment, is "a missed opportunity" (Sheppard et al., 2009, p. xxii) that has received minimal attention both in research and in practice (Feisel & Rosa, 2005; Feisel & Peterson).

Recent engineering education literature has documented a few examples of using inquiry-based lab instruction as an innovative way of promoting active learning (Flora & Cooper, 2005; Linsenmeier, Kanter, Smith, Linsenmeier, & McKenna, 2008). Although a substantial literature on inquiry-based instruction exists in science education, the evidence of the effectiveness of this increasingly popular lab instructional approach in engineering is not yet strong (Litzinger, Lattuca, Hadgraft, & Newstetter, 2011). Inquiry-based approaches are less common in engineering than in science education. The existing literature has documented inquiry-based instructional labs in several disciplines, such as transportation materials (Buch & Wolff, 2000), biomedical engineering (Kanter, Smith, McKenna, Rieger, & Linsenmeier, 2003), and environmental engineering (Flora & Cooper, 2005).

Unlike "cookbook" labs, where instructors or protocols pre-determine the learning process, an inquiry-based lab course empowers students to take the initiative to observe, question, collect information, conduct experiments, or generate knowledge through trial and error (Feletti, 1993; Colburn, 2000; Khan & O'Rourke, 2005; Spronken-Smith et al., 2008). The literature suggests that inquiry-based lab instruction is more advantageous than the cookbook approach because it enhances students' learning experiences by engaging them in doing science and engineering (Flora & Cooper, 2005). Like conventional and various other innovative teaching approaches, however, inquiry-based instruction has the potential to produce better learning experiences but does not guarantee them. Understanding the effectiveness of inquiry-based lab instruction requires more evidence-based research addressing the implementation of the inquiry-based approach in lab courses, how students approach inquiry-based lab activities, and the impact on learning outcomes (Ellis, Marcus, & Taylor, 2005). Despite the recent clarification of lab learning objectives in engineering by the ABET/Sloan Colloquy, few empirical studies exist that explore the assessment of learning outcomes in engineering instructional labs. In addition, prior studies that addressed learning outcomes in engineering instructional labs largely employed untested instruments or instruments constructed for non-lab learning settings.

Purpose Statement and Research Questions

Advocates of the inquiry-based approach and of evidence-based practice call attention to the link between inquiry-based instruction and student outcomes. This dissertation adds to research on teaching and learning in undergraduate engineering education by examining the effectiveness of an alternative to traditional lab instruction--inquiry-based instruction--in preparing undergraduate students for the engineering profession.
This study focused on nontechnical professional skills and attitudes in engineering labs. The central questions examined were whether and how students in inquiry-based instructional labs achieved the aspired lab outcomes. Two groups of sub-questions, presented below, address these central questions.

1. Do inquiry-based labs enhance students' professional skills and attitudes toward engineering?

The advantages of the inquiry-based approach have been widely acknowledged. The current study hypothesized that students in inquiry-based labs would show significant improvement in the aspired lab learning outcomes. The first set of sub-questions is therefore quantitative and summative (1.1-1.4), focusing on numeric trends in several learning outcomes related to engineering students' professional skills and attitudes. Each question addresses a specific learning area in a particular learning domain. At the end of inquiry-based lab experiment(s)/project(s)…

1.1. have students' self-rated skills in team-based problem solving improved?
1.2. have students changed their attitudes toward teamwork and the role of oral and written communication in engineering practice?
1.3. have students increased their curiosity about solving novel and challenging engineering problems?
1.4. have students become more likely to use positive coping strategies and less likely to use avoidance coping strategies when facing failure or a state of being "stuck" in solving engineering problems?

2. How does the complexity of learning tasks influence student learning outcomes in inquiry-based labs?

Inquiry-based lab instruction engages students in complex problem solving. The current study hypothesized that the complexity of the learning tasks would facilitate the achievement of learning outcomes, but that excessive complexity may have an adverse effect on learning. Relevant here are two questions focusing on student learning outcomes in relation to students' perceptions of the learning context.

2.1. Does the perceived difficulty of the course influence student learning outcomes?
2.2. Does the perceived workload of the course influence student learning outcomes?
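Read together, the two hypotheses suggest a concrete analysis plan: a paired pre/post comparison for each outcome (question 1) and a regression that allows the effect of perceived difficulty to rise and then decline (question 2). The sketch below illustrates one way such analyses could be coded. It is a minimal illustration under stated assumptions, not the study's actual procedure: the simulated data, the variable names (pre, post, difficulty, outcome), and the 7-point difficulty scale are hypothetical, while the signed-rank test and the quadratic difficulty term are chosen to parallel the methods named in the abstract and in Chapters VI and VII.

```python
# Illustrative sketch only: data, variable names, and scale are hypothetical.
import numpy as np
from scipy.stats import wilcoxon
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200

# Research question 1: paired pre/post comparison of one outcome factor score.
pre = rng.normal(3.2, 0.6, n)            # pre-course factor scores (simulated)
post = pre + rng.normal(0.25, 0.5, n)    # post-course factor scores (simulated)
stat, p = wilcoxon(post, pre)            # Wilcoxon signed-rank test on paired scores
print(f"signed-rank statistic = {stat:.1f}, p = {p:.4f}")
# The median pre-to-post difference is one simple magnitude summary;
# Chapter VI reports formal effect sizes alongside the signrank tests.
print(f"median change = {np.median(post - pre):.3f}")

# Research question 2: regress an outcome on perceived difficulty and its
# square, so the estimated benefit of difficulty can rise and then decline.
difficulty = rng.integers(1, 8, n).astype(float)   # 7-point scale (assumed)
outcome = 0.4 * difficulty - 0.04 * difficulty**2 + rng.normal(0, 0.5, n)
X = sm.add_constant(np.column_stack([difficulty, difficulty ** 2]))
fit = sm.OLS(outcome, X).fit()
b0, b1, b2 = fit.params
# With b2 < 0, the fitted benefit of difficulty peaks at -b1 / (2 * b2),
# one way to locate the "upper-middle" turning point described in the abstract.
print(f"turning point on the difficulty scale = {-b1 / (2 * b2):.2f}")
```

A nonparametric signed-rank test is shown because Table 17 reports signrank results; the study's actual models include additional predictors and post-estimation diagnostics, as described in Chapters V and VII.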
CHAPTER II: LITERATURE REVIEW

Learning in a Lab Context

Like educators in many other disciplines that prepare practitioners, engineering educators are constantly trying to find a balance between theory learning and practice (Stark & Lattuca, 1997). Instructional labs provide a physical context for students to integrate theory and practice through exploration. This approach holds to the philosophy of experiential learning, i.e., that it is insufficient to know or fully understand without doing (Dewey, 1916). Engaging students in various forms of learning activities is believed to be superior to instructor-dominated pedagogy in facilitating deep learning (Tagg, 2003), which is associated with enjoyable learning experiences, better grades, and higher rates of retaining, integrating, and transferring knowledge (Laird, Shoup, Kuh, & Schwarz, 2008). Nevertheless, students' participation in hands-on activities per se is insufficient to determine the efficacy of a lab learning environment. More critical to promoting learning is the way in which students participate (Modell et al., 2000).

As Smith (2002) notes, "the lab is a tool that involves the human hand, and when used appropriately, extends the hand and deepens understanding" (p. 1). A student could closely follow the instructions, survive the practice, and obtain a good assessment result with minimal questioning or reasoning about why the task is done in certain ways. A student could also complete a task while relating his or her learning experience to previous knowledge, comprehending the concepts and theories embedded in the activity, and gaining the capacity to do similar tasks even if the context changes. Although any student should benefit from the hands-on learning experience in labs to some extent, the intent of inquiry-based labs is to help students develop deep learning about engineering concepts. Students engaged in deep learning are more likely to have intrinsic motivation to learn, be confident in learning, and experience personal development and learning gains (Biggs, 1999; Laird, Shoup, Kuh, & Schwarz, 2008; Ramsden, 1992).

The performance context of the learning activities affects the type of learning that occurs (Prins et al., 2008). A student may be required to carry out some tasks repeatedly and become efficient in specific practical skills. This practice-makes-perfect approach is useful for making a skill automatic (Kohn, 1999). Yet Dewey (1916) suggested that students "attain efficiency in outward doing without the use of intelligence" (p. 137) in this type of hands-on activity. Fitts and Posner (1967) further explained that intellectualizing the task and identifying strategies to accomplish it is the first phase of acquiring a new skill, after which the required concentration begins to decline until the autonomous phase, when finishing the task becomes automatic. Being able to apply certain skills automatically, such as using particular equipment for measurement, is important for engineers to work efficiently. Automaticity is also necessary in the simple components of a complex activity because it helps free one's attention for more complicated processes (Patterson, Rak, Chermonte, & Roper, 1992). However, as reasoning becomes inactive in the automatic phase (Fitts & Posner, 1967), deep-level learning, which involves integrating and synthesizing information with prior knowledge, is less likely to occur when students are familiar with the steps required to accomplish a learning task (Ramsden, 2003; Tagg, 2003). To enhance how students think, perceive, and approach new engineering phenomena, it is necessary to engage students in activities that challenge their existing perceptions and understanding and expose them to complex engineering problem solving close to engineers' working context.

Conventional Engineering Labs

A tendency in conventional engineering lab courses is to give students a protocol, which predefines experimental objectives, methods, and equipment and provides explicit step-by-step instructions (Gindy, 2006). Typically, the instructor first demonstrates the application of a device or an engineering concept; students then replicate the experimental procedures demonstrated by the instructor or described in the protocol under the supervision of the instructor or teaching assistants. Students are expected to generate predetermined results. The benefits of this type of structured lab experiment include helping students validate and confirm their conceptual learning by observing how a concept manifests under certain conditions.
It also helps students understand complex core engineering concepts and can foster students' ability to solve problems involving these concepts. Practicing how to use basic devices also builds essential skills for students' future engineering practice (Sheppard et al., 2008). However, in conventional labs, students tend to follow the instructions in the protocol without paying attention to the reasons for following them (McCreary, Golde, & Koeske, 2006). Additionally, structured lab instruction usually focuses on a few related engineering concepts and skills (Sheppard et al., 2008). This approach conflicts with real-world engineering problems, which are usually ill-structured, requiring the application of different concepts and skills within or beyond engineering fields, including social and economic concerns, multidisciplinary collaboration, and the use of trial and error in optimizing conditions or solutions (Lyons & Plisga, 2005). The single-answer problems used in the "cookbook" approach to lab instruction also discourage students from taking into account broader issues and perspectives beyond the hints given. Solving single-answer problems does not help students build the core competencies required to handle more complex situations in their future practice (Sheppard et al., 2008). As a result, structured instructional labs are ineffective in facilitating deep learning and in linking learning with engineering practice. The engineering education literature has criticized the "cookbook" approach for providing students little opportunity to critically examine the task (Gindy, 2006), regimenting students' thinking processes (Pushkin, 1997), disconnecting the mode of education from the work environment of practicing engineers (Buch & Wolff, 2000), and leaving students with an inaccurate view of real-world practice (Leonard, 1991).

Innovative Approaches to Lab Instruction

Efforts have long been made in science and engineering education to "decookbook" (Shiland, 1997) or "cure cookbook" (Lochhead & Collura, 1981; Leonard, 1991) labs. Some engineering educators have employed advanced learning models in teaching lab courses, including inquiry-based, problem-based or project-based, and cooperative or collaborative learning (Hu, Scheuch, Schwartz, Gayles, & Li, 2008; Seymour, Hunter, Laursen, & Deantoni, 2004; Smith, Sheppard, Johnson, & Johnson, 2005). Although these terms emphasize different aspects of learning, some have alternative and overlapping meanings, and some are combined in practice. These innovative teaching and learning approaches share a common goal: to engage students in active learning as active agents.

The defining feature of inquiry-based (sometimes called open-ended) learning is the emphasis given to learners' innate curiosity for discovering knowledge (Barell, 2007). Inquiry-based instruction engages learners in framing experimental questions, procedures, or solutions in a way that is meaningful to the learner (Hu, Scheuch, Schwartz, Gayles, & Li, 2008; Seago, 1992). Inquiry is a process by which all individuals approach new situations or solve problems in life (Marek & Cavallo, 1997). As a teaching method, the inquiry-based approach emphasizes students' investigations, shifting the focus from "learning about it" to "doing it" (Falk & Drayton, 2000).
The physical activity does not define this type of inquiry (Haury, 2001); rather, the essence of inquiry is to empower learners to think, actively restructuring their own meaning and understanding in the learning process, including gathering sources of information, working to understand the information, and applying information to learning tasks (Grabe & Grabe, 2000; Smith, 2000). Students' prior knowledge and their own interests are the starting point of the inquiry-based learning process. The subject matter, the content of learning in thematic learning, becomes the tool for students to explore their own questions in inquiry-based learning (Short, Harste, & Burke, 1996).

The National Science Education Standards identified five essential features of inquiry-based learning (National Research Council, 1996; Olson & Loucks-Horsley, 2000). Based on how scientists solve problems, these five features highlight learners' engagement in learning activities beyond listening, which is a defining feature of active learning (Bonwell & Eison, 1991). The five essential features are:

1. Learners are engaged by scientifically oriented questions.
2. Learners give priority to evidence, which allows them to develop and evaluate explanations that address scientifically oriented questions.
3. Learners formulate explanations from evidence to address scientifically oriented questions.
4. Learners evaluate their explanations in light of alternative explanations, particularly those reflecting scientific understanding.
5. Learners communicate and justify their proposed explanations (Haury, 2001, p. 1).

Classrooms are seldom fully inquiry-based, nor are inquiry-based classrooms equally active or engaging. The level of active inquiry may also vary within a single class. Depending on the amount of learner self-direction and of direction from teacher or material, variations in these five features exist between and within inquiry-based classes. The National Research Council (2002) listed the possible variations in the five essential features of classroom inquiry, ranging from more learner self-direction with less direction from teacher or material to less learner self-direction with more direction from teacher or material. Banchi and Bell (2008) also ranked inquiry-based classes or activities into four levels (see Table 1): confirmation inquiry, structured inquiry, guided inquiry, and open inquiry. Confirmation inquiry involves a minimal amount of inquiry, because teachers provide students with questions, procedures, and results in advance, and students simply follow the directions to confirm an idea. At the structured inquiry level, teachers provide students with questions and procedures, but students generate explanations based on the information they have gathered. At the guided inquiry level, teachers provide students with only a research question; students are responsible for designing the procedure to address the question and explaining the results. The final level, open inquiry, allows the maximum amount of inquiry, because students are responsible for the whole learning process, constructing their own questions, procedures, and solutions.
Table 1. Comparison of Four Levels of Inquiry (Banchi & Bell, 2008)

Level of inquiry        Question            Procedure           Solution
Confirmation inquiry    Provided            Provided            Provided
Structured inquiry      Provided            Provided            Students construct
Guided inquiry          Provided            Students construct  Students construct
Open inquiry            Students construct  Students construct  Students construct

An inquiry-based approach overlaps with many other approaches, including case study, problem-based learning (PBL), project-based learning, and cooperative learning, each of which emphasizes particular aspects of active learning (Eick & Reed, 2002; Khan & O'Rourke, 2004; Prince & Felder, 2006). PBL and inquiry-based learning are used interchangeably and are often indistinguishable in practice despite their different origins. Originating in medical education, PBL emphasizes a solution-oriented learning process driven by problems or challenges that learners could encounter in real life. The purpose of PBL is to foster the hypothetical-deductive reasoning ability required for medical expertise (Barrows & Tamblyn, 1980). Learners construct meaning as their previous experiences interact with the learning material. Deep learning occurs when learners go beyond what they already know and apply the course material to solve problems within a meaningful context (Barrows et al., 1991; Hu et al., 2008). Originating from the practices of scientific inquiry, inquiry-based learning gives emphasis to asking questions and collecting and interpreting data (Hmelo-Silver, Duncan, & Chinn, 2007). Both PBL and inquiry-based learning situate learning in complex tasks, engaging students in sense-making, exploring and analyzing data, articulating and reflecting on learning, consulting various resources in the problem-solving process, and constructing arguments based on evidence (Hmelo-Silver et al.; Kuhn, Black, Keselman, & Kaplan, 2000; Krajcik & Blumenfeld, 2006; Quintana et al., 2004). Additionally, cooperative learning is a significant component of problem-based and inquiry-based classes, where students usually work in groups, communicating and negotiating their ideas in the problem-solving process (Hmelo-Silver et al.; Smith, 2000).

Problem-based, project-based, open-ended, and inquiry-based approaches are used interchangeably in the engineering education literature. Therefore, the review of prior studies includes problem-based, project-based, and open-ended lab courses that have the features of inquiry-based learning.

The Status of Inquiry-based Lab Instruction

Types of Inquiry-based Labs

The definition of an inquiry-based lab varies in the literature. Some researchers employed Domin's (1999) categorization of lab instruction styles, in which inquiry-based lab instruction allows students to determine the procedures and outcomes of the experiment. This definition corresponds to the guided or open inquiry levels in Banchi and Bell's (2008) four-level continuum (e.g., Sheppard et al., 2009). Others adopted a broader definition of inquiry that includes structured inquiry in Banchi and Bell's categorization (e.g., Prince, Vigeant, & Nottis, 2009; Smith, 2002). As inquiry-based learning can take place in hands-on and non-hands-on learning settings alike, the application of the inquiry-based approach in lab courses may take place before, during, or after hands-on experiments.
Based on how inquiry-based components are combined with experiments and the amount of structure and guidance an instructor builds into a classroom activity, the engineering education literature has documented at least four types of inquiry-based lab instruction.

Category 1—inquiry before lab work. The first approach incorporates the inquiry-based component in the pre-lab session, followed by conventional lab experiments guided by lab protocols to reinforce students' conceptual learning from the pre-lab activity (e.g., Kanter et al., 2003). Unlike conventional pre-lab sessions, in which instructors demonstrate a concept or experimental skills and students listen passively, the inquiry-based pre-lab engages students in a critical thinking process by asking them to relate their prior knowledge to the learning material and discuss possible solutions to the problems given before applying the concepts to experiments. The inquiry-based activity in the pre-lab complements what the "cookbook" lab lacks but is separate from the physical experiments.

Category 2—inquiry before and after lab work. The second approach combines inquiry-based learning modules with lab protocols to address particular conceptual changes, especially in knowledge areas where students easily develop misconceptions. Guided by specific conceptual learning goals, instructors provide students with questions and experimental procedures. Students construct solutions by predicting what would happen in given scenarios based on their prior knowledge, conducting experiments to test their predictions, and comparing their predictions with the experimental results (e.g., Lyons, Young, & Morehouse, 2000; Prince, Vigeant, & Nottis, 2009). Inquiry occurs as students actively construct their mental models, relating their prior knowledge to their current learning experiences and generating new knowledge. In this approach, although the physical experiment has become part of the inquiry-based learning process, the implementation of the experiment exercise per se is not altered.

Category 3—partially inquiry-based lab work. Compared to the first two categories of inquiry-based lab courses, the third offers students more challenge and flexibility in conducting experiments. A lab course in this category usually comprises several stages of increasing complexity, beginning with confirmation inquiry using protocols with clear instructions and gradually moving to open-ended problem solving in the final or major course project. Variations exist within this type of lab course. Some employ experiments or design projects that involve open inquiry (Banchi & Bell, 2008), allowing students to define their own engineering problems, evaluate different available approaches, design their own experiments or products, and construct arguments for their solutions (e.g., Gindy, 2006). Others engage students in guided inquiry (Banchi & Bell, 2008) by providing students with specific questions and asking them to construct the procedure and solution (e.g., Komives, 2006).

Category 4—fully inquiry-based lab experiments. The last category of inquiry-based lab gives students a great deal of freedom in conducting experiments (e.g., Miller & Olds, 1994). All conceptual teaching and technical training takes place in the pre-lab session or prerequisite courses.
Once students get satisfactory assessment results on their pre-lab preparation or get approval of their experiment plans, they start working in the lab with minimal direction from faculty or teaching assistants unless safety issues emerge or students proactively seek advice. This type of lab instruction also falls into the category of open inquiry (Banchi & Bell, 2008), in which students take ownership of the experiment, constructing their own questions, experiment procedures, and solutions.
Summary. Inquiry-based labs in the above four categories offer students the chance to struggle with making sense of new information and constructing their own answers to the questions encountered in learning. Cooperative learning is usually a component of these lab courses, where students are expected to support, help, and learn from each other in the learning process. The deliverables of these lab courses can be lab reports, designs on paper or computer, or physical products.
Labs in the first two categories use structured inquiry before or after the experiment, allowing students to construct a solution to the problem based on their prior knowledge or a combination of their prior knowledge and the experimental results. The major purpose of employing structured inquiry is to facilitate cognitive learning of core engineering concepts and theories, and the inquiry-based activity is independent of the hands-on experiments. In the last two categories, inquiry-based learning is woven into the hands-on activities. Students need to draw on their previous knowledge, make sense of the new knowledge taught in class, and use hands-on experiments to test their designs or solutions, learning through trial and error. Major distinctions between the fully and partially inquiry-based labs include the level of sophistication of the lab work and the guidance provided. The fully inquiry-based lab course assumes that previous courses or pre-lab sessions have adequately prepared students for designing and conducting certain experiments with little direction from teachers or materials. The partially inquiry-based lab course gradually reduces the amount of direction from teacher or materials in guiding students' experiments, assuming that the course progressively helps students build up their capability of designing and implementing certain types of experiments.

Inquiry-based Lab Courses at Different Levels
Promoting inquiry-based lab instruction is not new. Since 1990, NSF has funded several engineering education coalitions to stimulate reform of undergraduate curricula, including implementing and assessing innovative approaches that emphasize hands-on laboratory activities and open-ended problem solving at all course levels. Innovations by the Engineering Coalition of Schools for Excellence in Education and Leadership (ECSEL) (1991-2001) included providing engineering design experiences for first- or second-year students from different departments, expanding design-based experience to discipline-specific upper-level courses, and providing industry-driven interdisciplinary design experiences for upper-level students to work on actual engineering jobs (Kalonji, Regan, & Walker, 1996). The central strategy taken by the Southeastern Universities and Colleges Coalition for Engineering Education (SUCCEED) (1992-2003) was exposing undergraduate students to engineering design activities early in their programs to retain their interest in engineering (Ohland, Zhang, Brawner, & Miller, 2003).
Similarly, the Synthesis Engineering Education Coalition (1990-2001) used approaches that allowed freshman and sophomore students to explore engineering products by disassembling and reassembling machines (Sheppard, 1992; Sheppard & Jenison, 1997). The Greenfield Coalition (1994-2005) integrated academic work with experiential manufacturing activities. These national efforts instilled student-centered and problem-based engineering design activities throughout the undergraduate curriculum in participating institutions. To date, some hands-on and lab experiences take place outside traditional brick-and-mortar settings, and some break and bridge disciplinary boundaries. Inquiry-based labs offered to particular grade levels usually have specific features and serve particular purposes.
Capstone design course. Inquiry-based lab instruction is most likely to be found in capstone design courses. As required by ABET, all undergraduate engineering curricula include capstone projects, typically embodied in the last one or a few courses in the program, for senior students to synthesize what they have learned in the first three years of their study (ABET, 2007; Sheppard et al., 2009; Quinn & Albano, 2008). Most of these projects are student-directed, inquiry-based engineering design projects involving a wider range of subject matter, technical skills, and professional skills than design projects in other courses. Capstone design courses in different engineering programs, disciplines, and institutions vary in course duration, format, content, expected outcomes, evaluation methods, faculty involvement, and industry involvement (Dutson, et al., 1997). Two national surveys were conducted in 1994 and 2005 to examine the implementation of engineering capstone design courses in the U.S. (Howe & Willbarger, 2005; Todd et al., 1995). Although capstone design courses can bring forth many positive changes in students' conceptual understanding, experimental skills, and professional preparation (Howe & Willbarger; Todd et al.), capstone projects can appear too late in engineering programs to be "effective motivators or curriculum integrators" (Dym, Rossmann, & Sheppard, 2004). Some scholars called for "a sharp departure from conventional classroom pedagogy and solitary learning methods" (Duderstadt, 2010, p.28) to employ intellectual activities and engineering design throughout the engineering curriculum.
Introductory level course. The use of inquiry-based labs in introductory level courses for freshmen and sophomores is growing. Since most engineering freshmen and sophomores are still making decisions about their majors, increasing their motivation and retention in science and engineering fields is a major concern for freshman and sophomore level courses (e.g. Goeser, Coates, & Johnson, 2011; Karam & Mounsef, 2011; Knight, Carlson, & Sullivan, 2007; Carlson, Schoch, Kalsher, & Racicot, 1997; VanAntwerp, VanAntwerp, Vander, & Wentzheimer, 2004; Wuersig, 2007). Usually offered to students who are broadly interested in engineering, introductory level engineering lab courses are broad-based, giving emphasis to acquainting students with the engineering profession and different disciplines, connecting students with each other and with professors, and enhancing students' interest in engineering (Schoch, et al.; Wuersig).
Although open-ended approaches are less common at the introductory level than in advanced courses, some faculty members have incorporated creative open-ended design projects in lower division courses (Sheppard et al., 2009; Sheppard & Jenison, 1997). Additionally, engineering students are normally required to take general science courses to gain the fundamentals of science and to meet general education requirements. For many students, general science courses also provide an early exposure to lab work. At some institutions, specific science courses were designed and offered to engineering students, while at other institutions engineering students took general science classes with other majors. Scientific inquiry and inquiry in engineering design involve many common elements, such as problem statement, information gathering, designing the process, and analyzing and presenting results. Their major difference is that scientific inquiry includes hypothesis testing, whereas engineering design involves a design statement and criteria for successful design. In an inquiry-based science lab, students may need to test hypotheses multiple times by conducting experiments. In an inquiry-based engineering design lab, students may test the prototype multiple times and improve the design in order to meet the criteria for successful design. Prior studies showed that inquiry-based introductory courses promoted students' interest in learning, critical thinking, longer retention of content, creativity, engagement, autonomy, positive attitudes toward scientific investigation, and responsibility (Benford & Lawson, 2001; Lord & Orkwiszewski, 2006; Sheppard et al., 2009). Involving students in engineering design experiences also reduced student attrition and increased persistence in engineering programs (Bransford, Brown, & Cocking, 2000; Knight, Carlson, & Sullivan, 2007; Sheppard et al.).
Upper division lab course. Retention is less of a concern for engineering juniors and seniors. Upper division engineering courses for junior and senior students are more focused on preparing students for the profession in particular engineering disciplines. Open-ended lab work is also more typical at the upper level than in lower division courses, in part because ABET accreditation criteria require graduates to have design experiences (Sheppard et al., 2009). Nonetheless, design and synthesis remain minor components of most engineering programs even in upper division courses (Duderstadt, 2008). Some upper division instructional labs attract students from different disciplines, such as the Multidisciplinary Engineering Lab (MEL) courses at the Colorado School of Mines that bring in students from mechanical, bio, and nuclear engineering. A major goal of these multidisciplinary lab courses is to help students organize their knowledge and understand the relationships among various concepts. Most multidisciplinary lab courses mentioned in the literature were elective courses that facilitated collaboration among students from different fields in open-ended projects (Allen, Muscat, & Green, 1996; DeLyser, Rullkoetter & Armentrout, 2002; King, et al., 1999; Sheppard, et al., 2009). Although high-level coordination among disciplines is not always achieved (Sheppard, et al., 2009), compared to traditional cookbook labs, inquiry-based multidisciplinary lab courses are more reliant on effective teamwork and require students to use teamwork in a more complicated fashion (Allen, et al.; King, et al.).
Assessments conducted in a few multidisciplinary lab courses also revealed positive improvements in students' thinking ability, motivation in learning, and awareness of the challenges in doing engineering. However, this teaching approach created a higher level of frustration among students and required more teacher time (Allen, et al.; King, et al.). Students from the department that offered the lab course usually benefited more from these instructional labs than those from other engineering disciplines or non-engineering fields (King, et al.; Komives, 2006). Some other upper division engineering instructional labs are mainly for students in particular engineering disciplines or in one area of an engineering discipline. These are usually required courses offered to juniors or seniors, introducing them to discipline-specific fundamental measurement or other lab techniques, data analysis methods, report writing, and engineering design (e.g. Flora & Cooper, 2005; Miller & Olds, 1994; Munson-McGee, 2000).

Controversy over Innovative Approaches
Innovative approaches are not without new challenges and criticism. Empirical studies on the effectiveness of problem-based, inquiry-based, cooperative, and experiential learning showed mixed results, indicating that exposing students to more complex problem-solving activities does not ensure more fruitful learning experiences. One of the most controversial articles, written by Kirschner, Sweller, and Clark (2006), indicated that "minimal guidance during instruction does not work" (p.75). They grouped various approaches, including problem-based learning, inquiry-based learning, and experiential learning, under the umbrella term "minimally guided approach". Based on how memory structures work and influence each other, they posited that learning activities with minimal guidance give students "a heavy working memory load" (p.80) of searching for information and distract them from "manipulating information in ways that are consistent with a learning goal, and storing the result in long-term memory" (p.77). Based on their argument, complex learning activities run the risk of distracting learners from major learning goals to focus on trivial tasks. Norman and Schmidt's (2000) meta-analysis of the effects of several components of problem-based learning on learning outcomes also revealed small negative effects associated with self-paced and self-directed learning. However, their opponents argued that at least some of the approaches mentioned by Kirschner et al. (2006), "in particular, problem based learning and inquiry based learning, are not minimally guided instructional approaches but rather provide extensive scaffolding and guidance to facilitate student learning" (Hmelo-Silver, et al., 2007, p.99). These approaches may also include direct instruction and presentation of key information to students as part of the scaffolding (Hmelo-Silver, et al.). Thus, the amount of scaffolding provided to students does not distinguish problem-based or inquiry-based learning from conventional structured instruction; rather, how and when the scaffolding is provided has a bearing on student learning. Overall, a hands-on learning activity is not automatically superior to structured lecturing. Although learning by doing affords great potential for students to achieve deep learning, both the design of the learning activity and the scaffolds provided throughout the learning process can affect students' learning approaches and their learning outcomes.
When learning tasks are too simple, students may take a surface learning approach without being mentally and cognitively involved in thinking and reasoning. When the task is too difficult and scaffolds are insufficient, students may lose confidence and withdraw from learning. Therefore, effective instruction should include both elements: learning activities of appropriate complexity and scaffolding that meets students' learning needs. In addition to task difficulty, the complexity of inquiry-based lab work is usually accompanied by increased time and effort for students and teachers alike. However, no matter how lab sessions are implemented, they normally carry fewer credits than courses of other formats. As a result, the discrepancy between the workload of inquiry-based lab work and the number of credits given for lab sessions is another obstacle that usually negatively influences students' enthusiasm for inquiry-based lab work (Sheppard, et al., 2009). Despite the influential role of task complexity in inquiry-based learning, prior studies that evaluated the effectiveness of inquiry-based or similar instructional labs in engineering seldom took it into account. Schaffer, Chen, Zhu, and Oakes's (2012) study on self-efficacy for cross-disciplinary learning in project-based teams was one of the few that took task complexity into account. Task complexity in their study was a team-level variable measured by the number of design stages that a team went through. The results of this study were informative but need further examination and clarification to inform practice. First, the study found a negative association between task complexity and a team's average change score, but the influence of task complexity at the individual level was unknown. Moreover, the number of design stages, as a proxy variable, may reflect workload, difficulty of the project, or both, so it is unclear which of these two aspects of task complexity influenced students' self-efficacy. To fill this gap in the literature, the current study used student-level data to examine how students' perceptions of the difficulty and workload of the lab course influenced their changes in skills and attitudes after taking the course (Research questions 2.1 and 2.2). Students are usually not equally familiar and comfortable with inquiry-based learning or lab learning. As a result, the perceived difficulty or workload is likely to vary even among students who perform similar tasks. Therefore, using self-rated difficulty and workload that accommodates these individual differences is more informative than using proxy measures of task complexity at the group level.

Conceptual Framework
The conceptual framework for the current study is based on the Presage-process-product (3Ps) model of student learning (Figure 1) (Prosser & Trigwell, 1999; Prosser, Trigwell, Hazel, & Gallagher, 1994). First formalized by Biggs (1978), the 3Ps model suggests that student learning outcomes are a result of dynamic interactions among students' characteristics, the instructional context, and students' learning approaches. The original 3Ps model did not suggest causal relationships among the different components of the model. Biggs and subsequent researchers (Biggs, 1978; Prosser & Trigwell, 1999) emphasized the interaction among all the components of the model. The model assumes that students choose learning behaviors based on expected success and value (Wigfield, Tonks & Eccles, 2004).
Another underlying assumption of the 3Ps model is that "deep learning approaches to learning were more likely to be associated with higher quality learning outcomes" (Prosser & Trigwell, 1999, p.12). This model entails considering the role of learners when evaluating an instructional approach. The learning situation, namely inquiry-based instructional labs in the current study, is a presage factor whose effect on learning outcomes relies on how students perceive and approach it.

Figure 1. Presage-process-product (3Ps) model of student learning (adapted from Prosser & Trigwell, 1999)

Learning outcomes. As the product of a learning process, students' learning outcomes are straightforward indicators of the effectiveness of an instructional strategy. Recent discussion and literature about inquiry-based and lab learning in engineering point to non-technical professional outcomes such as creativity, engagement, autonomy, persistence in learning, teamwork, and communication (e.g. Benford & Lawson; Feisel & Rosa, 2005; Sheppard, et al., 2009). To construct an instrument to measure learning outcomes, the current study drew on literature in science, engineering, and psychology relevant to divergent thinking (e.g. Guilford, 1967), curiosity (e.g. Kashdan, Rose, & Fincham, 2004), coping (e.g. Brandtstädter & Renner, 1992; McGrath, 1999), and teamwork, communication, and problem solving skills (e.g. Chiu, 2003; Lattuca, Terenzini, & Volkwein, 2006; Lingard, 2010; Lynch, et al., 2009). Based on previous research that decomposed the concept of human development into various domains, the learning outcomes measured in the current study involved both cognitive and affective components. Teamwork and communication skills and divergent thinking ability were mainly cognitive outcomes, whereas attitude to teamwork and communication, curiosity, and coping were mainly affective outcomes (Bloom, et al., 1956; Feisel & Rosa, 2005; Krathwohl, Bloom, & Masia, 1973). Curiosity and coping, referring to the intention to explore new things and the intention to use specific strategies when confronted with difficulties, were each a combination of two components: attitude to a given behavior (e.g. exploring or active coping) and one's interpretation of subjective norms (e.g. Am I expected to explore?) (Fishbein & Ajzen, 1975). This body of literature helped the current study clarify the nature and underlying meaning of the learning areas examined. Moreover, the engineering education literature has called for an integrated perspective on human development, positing that acquiring knowledge and skills and internalizing the values of the engineering profession are two inseparable facets. Prior research that explored ways to bridge across learning domains and different outcomes (Anderson et al., 2001; Hauenstein, 1998; Kraiger, Ford, & Salas, 1993; Marzano & Kendall, 2007) informed the current study's decision to take into account the correlations among different aspects of learning.
Learning approach. Marton and Säljö (1976) and Biggs (1987) dichotomized learning into superficial and deep learning. While superficial learning leads to strategic acceptance of information as unlinked pieces of facts, deep learning leads to understanding of concepts, long-term retention of information, and application of the knowledge learned to life. No student has a fixed learning approach; rather, one may use deep or superficial approaches alternately or in combination throughout a course.
Based on the 3Ps model, adoption of a learning approach takes place during the learning process and is shaped by the learning situation, students' characteristics, and students' perceptions.
Learning context. Inquiry-based instructional elements in lab settings are the focal aspects of learning context in the current study. Based on the five essential features of inquiry-based learning identified by the National Science Education Standards (National Research Council, 1996), the elements involved in inquiry-based instruction include generating questions, collecting and analyzing evidence, formulating explanations, and justifying solutions. Existing studies also revealed that learning assessment, complexity of the learning task, and learning support are important contextual variables that influence students' learning approaches and learning outcomes (Prins, et al., 2008; Ramsden, 1992; Tang, 1994; Tippin, Lafreniere, & Page, 2012; Trigwell & Prosser, 1991). Perceiving learning assessment as focused on memorization and recall and perceiving a high workload in a problem-solving environment were reported to be negatively associated with students' adoption of deep learning (Ramsden). Perceptions of high quality teaching, clear goals, and support received were suggested to be positively related to students' adoption of deep learning approaches (Prosser & Trigwell, 1998; Trigwell & Prosser; Ramsden). Consistent with the 3Ps model, these studies posit that the adoption of learning approaches is contextually based and is a dynamic process.
Students' characteristics. Student characteristics in the 3Ps model refer to learning-related predispositions, such as previous knowledge and experiences related to the subject matter, abilities, expectations, and learning styles. Students bring these presage factors into the classroom, where they influence students' experiences in the current learning situation (Biggs, 1989; Prosser & Trigwell, 1999). In the context of the current study, students' previous experiences in inquiry-based learning settings and familiarity with solving ill-structured problems are likely to influence how they react to inquiry-based lab instruction. Deignan's (2009) research findings raised the possibility that inquiry-based learning may not be suitable for everyone. Houlden, Collier, Frid, John, and Pross (2001) also suggested that students who lack sufficient problem-solving and interpersonal skills can hardly benefit from problem-based learning. Walker and Lofton (2003) suggested that exposing students to unfamiliar instructional methods could reduce students' self-perception of their learning abilities. Additionally, science and engineering education research has found that gender and ethnicity predicted students' learning experiences and academic achievement (e.g. Hoffer, Rasinski, & Moore, 1995; Madigan, 1997; Mason & Kahle, 1989). These studies suggested that individual differences may explain some variation in students' learning achievement in inquiry-based classes, and thus these individual factors should be considered when assessing the effectiveness of instructional methods.

CHAPTER III: STUDY DESIGN AND SAMPLING

Multiple-Case Design
The current study adopted a multiple-case design to address the research questions in naturalistic settings. Each sample lab course chosen for analysis was a case. A multiple-case design was preferred for several reasons.
First, the growing emphasis on inquiry-based approaches in STEM fields is a contemporary phenomenon in which "the boundaries between phenomenon and context are not clearly evident" (Yin, 2009, p.18). Second, the case study design allowed the current study to observe and examine participants' exposure and outcome status in real-life classroom settings (Yin, 2009). Finally, the multiple-case design allowed replication of results by comparing and contrasting different inquiry-based sample cases, a logic analogous to that used in experimental studies with replication of experiments (Hersen & Barlow, 1976). Compared to a single case study, a multiple-case design leads to more compelling evidence and is more robust (Herriott & Firestone, 1983). Units of analysis included quantitative assessments of students' characteristics, perceptions of the learning situation, learning approaches, and learning outcomes, as well as students' qualitative explanations of their learning experiences.

Sample Selection
The target sample cases in the current study were inquiry-based instructional lab courses offered to engineering undergraduate students at four-year public universities. The search was narrowed to lab courses that involved partially or fully inquiry-based lab experiments or projects, excluding those that used the inquiry-based approach only in pre-lab or post-lab sessions. Senior capstone courses were also excluded, because substantial studies had already been done in this area (e.g. Howe & Willbarger, 2005; Todd et al., 1995). Additionally, previous research had called for an earlier appearance of student-driven hands-on activities similar to capstone projects in engineering curricula (Dym, Rossmann, & Sheppard, 2004). The first step of sample selection was to identify engineering lab courses that met these criteria, offered by four-year public universities within a drivable distance in the states of Michigan, Pennsylvania, Maryland, Virginia, Delaware, and New Jersey. Since inquiry-based labs are not common in undergraduate engineering curricula, several strategies were used to seek qualified cases: 1) reviewing journal papers and conference proceedings relevant to inquiry-based instruction or lab courses; 2) reviewing undergraduate curricula or course introductions on university websites to identify lab courses that might use an inquiry-based approach; 3) contacting program or lab coordinators for course information; and 4) consulting researchers who had studied the inquiry-based approach or instructional labs in engineering. Eight instructional labs with at least some inquiry-based components were identified as a result of this step (see Table 2). The next step of case selection was based on practical constraints. The IDEAS Lab was excluded at this step because the course started in fall 2013 and ended in spring 2014, whereas the data collection for the current study ran from spring 2013 to fall 2013. The Measurement Lab for Physio-system and the Thermal Lab were also excluded due to a lack of access to course-related information and course participants. The final decision on sample selection was guided by the two purposes of multiple case studies: "a) predicts similar results or b) predicts contrasting results but for anticipatable reasons" (Yin, 2009, p.54). To achieve these two purposes, the first four courses listed in Table 2 were chosen as sample cases. These four courses all had several years of history of including significant inquiry-based components in the lab section.
The instructors of these four courses were also knowledgeable about inquiry-based or problem-based learning and explicitly addressed the inquiry-based nature of the lab work in course syllabi or other course materials. Finally, the Engineering Design Lab, Chemical Lab, and Fluid Mechanics Lab were all large-scale courses with several hundred to over a thousand students. They had similar course formats, including a large-class lecture or pre-lab section taught by a professor or lecturer and multiple lab sections led by teaching assistants. These similarities offered the potential for the current study to replicate findings. On the other hand, the four cases chosen had several variations, allowing the current study to examine whether findings could still be duplicated given these differences. First, two of them were upper division courses offered to juniors and seniors, and the other two were lower division courses (see Table 2). In practice, the inquiry-based approach is more often used in upper division courses, as reflected by the potential cases identified. However, it is unknown whether inquiry-based labs in upper division courses are more effective in improving students' non-technical professional skills and attitudes to engineering than those in lower division courses. Second, lab work in the Bio-system Lab and the Engineering Design Lab consisted of design activities in which students designed, created, and tested a product or prototype. The Chemical Lab and Fluid Mechanics Lab were both science labs where students conducted scientific experiments, including formulating, testing, and drawing conclusions about scientific hypotheses. More evidence from the literature supports the effectiveness of inquiry-based instruction in science than in engineering (Litzinger, et al., 2011). The variation in the nature of the lab work would allow the current study to reveal whether inquiry-based lab instruction could be equally or more effective in improving non-technical professional skills and attitudes for engineering students.

Table 2. Potential Cases Identified at the First Step of Sample Selection
Potential Case | Institution | Grade | Discipline | Term | Level of Inquiry | Practical Constraints
Bio-system Lab | I | Upper division | Bio- and Agricultural Engineering | SS | Guided inquiry | Small class size (around 30)
Engineering Design Lab | I | Lower division | All engineering students | SS, US, FS | Guided inquiry | Large class size (around 400-800) with several sections, so coordinating data collection is challenging
Chemical Lab | II | Lower division | Chemistry; offered to engineering students | SS, FS | Guided inquiry | Normally half of the students are in non-engineering, so the final sample may not have enough engineering students; large class size (around 1200) with several sections, so coordinating data collection is challenging
Fluid Mechanics Lab | III | Upper division | Mechanical Engineering | SS, FS | Guided inquiry | Course coordinator withdrew from the study before data collection started
Lab practice and statistical analysis | I | Upper division | Chemical Engineering | SS | Guided inquiry | Inquiry-based lab work was not a significant component in this course
IDEAS Lab Course | IV | Upper division | Biomedical Engineering | Year-long | Guided inquiry | The course starts in FS and ends in SS; required additional permission from the institution
Measurement Lab for Physio-system | V | Upper division | Biomedical Engineering | SS, US, FS | Open inquiry | Required additional IRB approval from the institution
Thermal Lab | VI | Upper division | Mechanical Engineering | SS, US, and FS | Guided inquiry | Limited accessibility to participants
Notes: Institutions I and II are public research universities in the Midwest. Institutions III-VI are public research universities on the East Coast.

However, in the final data collection, the Fluid Mechanics Lab was dropped due to a lack of administrative support from the course coordinator or instructors to ensure successful data collection in this large-scale course. Thus, the final cases were the Bio-system Lab at Institution I, the Engineering Design Lab at Institution I, and the Chemistry Lab at Institution II.

Case Context
I gained an understanding of the sample cases through syllabi and other learning materials used in the sample courses, previous studies of these courses (Berger, Kerner, & Lee, 1999; Hinds, Wolff, Buch, Idema, & Helman, 2009; Matz, Rothman, Krajcik, & Banaszak Holl, 2012; Walton, et al., 2013), interviews and email correspondence with instructors or course coordinators, and observation of labs. Table 3 presents a summary of the three sample courses examined in the current study.

Table 3. Case Summary
 | Case A. Engineering Design Course | Case B. Bio-system Lab | Case C. Chemistry Lab
Enrollment | SS13: 535; US13: 40; FS13: 784 | SS13: 32 | FS13: 1166
Course credit | 2 | 3 | 2
Hours | Lecture and discussion: 50 minutes; Lab: 110 minutes | Lecture and discussion: 2 hours; Lab: 2 hours | Lecture: 1 hour; Discussion: 1 hour; Lab: 2 hours
Lab facilitator | Teaching assistant | Professor | Teaching assistant
Arrangement of lab experiment(s)/project(s) | Comprised of several independent and team-based projects; the final project was guided-inquiry-based | A single guided-inquiry-based project started in the middle of the semester | Six guided-inquiry-based experiments with an increase in complexity

Case A. Engineering Design Course
"This course utilizes a combination of weekly lectures and hands-on project labs to introduce students to the design process, critical thinking, and problem solving while developing effective written and oral communication skills. All of the above are essential to success in future engineering courses and ultimately a career in engineering (Online course description, 2013)."
The engineering design lab was embedded in a two-credit college-level introductory course at a large, public, Midwestern university. First piloted in 2007 and offered on a large scale since spring 2008, the course usually enrolls about 400 students in spring, 800 in fall, and 60 in summer. The large-class lecture session, 50 minutes per week, introduced students to basic concepts and skills related to different engineering disciplines. The class was divided into several lab sections, each of which enrolled about 30-40 students and met 110 minutes per week. Most students enrolled in this course were freshmen or sophomores. It is a required course that students need to take before being officially admitted to the College of Engineering. For many students taking this course, it was their first lab experience or their first engineering course.
Lab projects.
To enable students interested in different disciplines and at different levels to contribute to the lab work, the assigned projects involved less professional or discipline-specific knowledge and skills and were more about offering students a basic understanding of the engineering profession and a taste of how engineers work. The hands-on project labs started with exercises using protocols with detailed instructions at the beginning of the semester. The complexity and the level of self-directedness gradually increased. The final project, which usually started in the middle of the semester, was open-ended and inquiry-based. At the end of each semester, each student team presented its final project to the class. In spring and fall semesters, the best design was also chosen from each lab section to attend the college-wide Design Day showcase, which was historically attended by upper-level students. Design Day offered an opportunity for students to present their academic achievements and interact with industry professionals and faculty members in engineering (Matz, et al., 2012). In 2013, when the data were collected, students conducted two major projects. In the first project, students used Lego parts to make robots following a protocol and then programmed the robots to perform certain tasks. In the final project, student groups chose from three options: design and make a Lego robot that could perform more complicated tasks, design a solar energy system, or design a greenhouse system that could solve real-life problems. Overall, the final project engaged students in solving ill-structured engineering problems, offering them a great deal of flexibility in tailoring their own projects. This course gave substantial emphasis to developing students' generic skills, including but not limited to task management, teamwork, communication, and problem solving skills. Allowing students to experience how engineers work was of more importance in this course than improving students' discipline-specific engineering knowledge.
Lab setting. The Engineering design lab course used two types of labs for students to complete their design projects. The first type, such as the Project Assembly Lab or Project Testing Lab, offered space for student groups to assemble and test products. This type of lab was equipped with a variety of basic tools and supplies required for constructing the design projects in this course, such as drill presses, saws, sanders, benches, and hand tools. Additionally, computer labs were available for students to work on their course projects. These labs were equipped with high-end technology and allowed students to access engineering technology resources. Both types of labs were located in residence halls, providing easy access for most of the students taking this course to construct their design projects during lab sections or in open hours after class. Each lab was staffed with trained employees during open hours to supervise the safe use of machines and tools (Hinds, et al., 2009). Students worked in groups of three or four to complete cooperative lab projects in this course. Team assignment largely depended on the teaching assistants (TAs) who facilitated the lab sections. In some sections, TAs allowed students to find teammates by themselves; in others, TAs assigned teams. Students usually worked with the same team members throughout the course, but in some sections, TAs adjusted team arrangements based on student groups' performance.
Course instructors and teaching assistants. The coordinator of this course was a practicing engineer with over ten years of industry experience, specialized in computer-aided engineering and manufacturing. Additionally, faculty members from each of the nine programs in the College of Engineering served as an instructional team. Each faculty member was responsible for at least one lecture session during the semester, introducing students to different aspects of the engineering profession, including communication, time management, design methods, problem solving, and different engineering disciplines (Hinds, et al., 2009). Graduate TAs helped facilitate lab sessions. Additionally, each lab section had three junior or senior engineering students who served as undergraduate mentors, assisting student groups with homework and design projects. Graduate TAs and undergraduate mentors also helped ensure lab safety during lab sessions. They were trained in the proper operation of lab equipment before facilitating lab sections. TAs inspected equipment setups before students used them (Hinds, et al., 2009). TAs and undergraduate mentors were not directly involved in the design projects, but when students had questions about course projects, TAs and undergraduate mentors became the major source of advice.

Case B. Bio-system Engineering Lab
The bio-system lab was part of a three-credit junior-level course required for students in bio-system engineering. This course was offered at the same university as the Engineering design lab. Since 2009, the Bio-system lab course has been offered each spring semester, enrolling about 30 students per semester. The course focused on "design and optimization techniques applied to engineering problems with biological constraints (Course curriculum, 2013)." Upon completion of this course, students were expected to be able to:
1. Apply optimization techniques to engineering design problems with biological constraints;
2. Apply project management tools in the context of a team-based design project;
3. Conduct engineering economic analyses; and
4. Describe and discuss relevant national and global political issues, particularly as related to major challenges in bio-systems engineering.
This course was comprised of lectures and small-group lab sections. The lecture section, which met two hours weekly, covered a range of topics on the engineering design process, project management, and engineering economics. During the two-hour lab section each week, students conducted a team-based, hands-on design project to apply the knowledge and skills taught in the lecture. The final project asked students to solve a practical problem related to bio-systems engineering. Although the design process could go beyond the content covered in the lecture section and involve knowledge from other engineering fields, students had to apply their knowledge and skills specific to bio-system engineering to complete the lab project. In addition to discipline-specific knowledge and skills, considerable emphasis was given to generic skills. When asked to rate the emphasis on each learning area in a pre-interview questionnaire, the instructor of the Bio-system lab course indicated that this course gave strong emphasis to problem solving (i.e.
defining a design problem, generating and evaluating a variety of ideas, and recognizing unsuccessful outcomes and then re-engineering effective solutions), but even greater emphasis was given to creativity and innovation, written and oral communication, and teamwork.
Lab project. In the semester when the data were collected, the major lab project was designing, testing, and making a prototype of a food dehydrator. Students were expected to design a small-scale food dehydration system that operated only on renewable resources and was suitable for drying fruits in a home-scale application. In the middle of the semester (March and April), each group submitted three memos that reported the problem statement, criteria for success, project management plan, critical technical information required, at least three general design concepts, the selected design concept, justification, and technical or management concerns. Then, student groups implemented their project plans based on the instructor's feedback. Each group was provided with $100 in funding, planks, and tools in the work station that they could utilize. The final product could be a physical prototype of the design and/or a fully functional computer simulation of the system in operation. Most student groups chose to make a functional physical prototype. At the end of the semester, each student group presented its project and demonstrated the final product to the class.
Lab setting. Similar to the labs used in the Engineering design course, the Bio-system lab course used two types of labs. The first one, called the work station, offered space for students to construct and test the prototype. The work station was equipped with basic tools and materials that students needed for constructing the project. Trained professional staff were available during lab sessions to help students operate machines such as saws and drill presses. Additionally, a computer lab was available for students to gather information or construct computer simulations.
Course instructor. Both the lecture and lab sections were taught by a professor in bio-system engineering. The course changed instructors in spring 2013, when the data for the current study were collected. A new faculty member, who joined the department in 2012, taught the 2013 section, so it was this faculty member's first time teaching the course. However, the course curriculum, content, and design of the lab sections followed those of previous years.

Case C. Chemistry Lab
The third sample case was a two-credit chemistry lab offered at another large, public, Midwestern university. It was a university-wide general science course for students not intending to major in chemistry. This course is designed to 1) acquaint students with several approaches used by chemists to investigate chemical phenomena, 2) let students learn several laboratory techniques, and 3) familiarize students with the properties of a variety of chemicals. In addition to conceptual and technical learning, this course facilitates the development of scientific thinking and interpersonal skills via team-based problem solving (Course website, 2014). This course usually enrolls about 1200 students each semester. Each lab section had about 20 students. The majority of the students in this course were from engineering and science fields or were pre-medical students. The course met three hours weekly, including a one-hour pre-lab lecture and two hours for experiments and discussion.
Lab experiments.
Students conducted six experiments during the semester, progressing from simple to more complex ones. Each experiment began with a pre-lab lecture on key concepts and a pre-lab report, submitted by each team, that tested their conceptual understanding. Each experiment was guided-inquiry-based. Students generated hypotheses and figured out the procedure of the experiment to address a given question that could be solved by more than one position or solution. Each experiment was followed by discussion sections during which student groups presented their experimental procedure, the rationale for the methods used, the data collected, the data analysis procedures, results, and conclusions concerning the given question. If the experiment was not successful and thus unable to fully address the given question, students needed to explain the reason and address how the problem solving process could be improved. For instance, the last experiment that students did in fall 2013 was to analyze reactions. Three reactions were announced and assigned to teams when students arrived at the lab. The protocol asked students to mix reagents, conduct reference blank tests to identify reacting species (or, if no reaction occurred, conduct quantitative tests to support that no reaction was occurring), and draw conclusions about the reactions. However, the protocol did not provide detailed procedures for testing the properties of products or for confirming the type of reaction or the fact that no reaction happened. To address what kind of reaction the mixture had, or whether there was indeed no reaction, students needed to generate one or multiple hypotheses, choose reagents from a list of available shelf reagents for comparing the products, and draw conclusions based on the experimental results. The lab assistants were asked not to design the experiment or identify results for students in this process. This course was intellectually challenging. As indicated by the instructor, most students were not familiar or comfortable with inquiry-based lab experiments. Moreover, similar to the first two sample courses, the course curriculum of the chemistry lab identified learning goals related to teamwork and communication skills. For students to succeed in accomplishing the team-based experiments, they were provided with guidance on teamwork and rubrics for assessing teamwork results.
Lab setting. Lab experiments and discussion of experimental results were conducted in chemistry labs. Each lab offered space for about six student groups to conduct experiments. Labs were equipped with the basic reagents and supplies required by the lab experiments assigned to students. Students in each lab section were divided into groups of three or four in the first week of class. In the first class, students were also asked to discover team members' strengths, based on which each individual performed one or two roles among team manager, recorder, chemist/safety officer, and technologist. Team members worked together in conducting all the lab experiments throughout the course, during which they shared observations, data, and perspectives, reached team consensus for each experiment, and collaboratively prepared written and oral team reports (Berger, et al., 1999).
Course instructor and teaching assistants. Lecture sections were taught by the lecturer who initiated and designed this inquiry-based chemistry lab course. The instructor specialized in inquiry-based science labs and designed the learning materials used in the lab sessions to guide the experiments.
Graduate TAs, called graduate student instructors (GSIs), taught, facilitated, and graded the lab sections. Each GSI was responsible for teaching two lab sections during a semester. GSIs were categorized into three groups: new GSIs who were teaching the course for the first time, experienced GSIs who had taught the course before, and mentor GSIs who served as role models and offered guidance to other GSIs. Students were encouraged to ask any GSI for assistance during the course.

Nature of the Lab Learning in Sample Cases
Based on Banchi and Bell's categorization of four levels of inquiry, all sample courses used guided inquiry: instructors provided students with problems to be solved or questions to be answered, and students constructed the procedures and solutions. However, the boundary between open and guided inquiry is not always clear-cut in practice. First, the questions or problems provided, especially in the final projects, were all, to some extent, ill-structured, with multiple possible solutions, and required students to further clarify the problem statement or generate hypotheses. For instance, in the engineering design lab, students could choose to design equipment useful in the dorm. To solve a problem as general as this one, students had to evaluate the constraints on the problem, identify the objectives that they intended to achieve, narrow the scope, and re-define the problem. The questions provided to students in the Chemistry lab, although well-defined, still required students to generate more specific hypotheses so that each could be answered definitively. Moreover, in the sample cases, generating questions, procedures, and solutions was a cycle that students engaged in repeatedly rather than a linear process that they followed. The extent to which students were engaged in each step of problem solving (generating the question, procedure, and solution) could vary enormously, depending on the level of detail involved in the broad problem given, the range of approaches available to students, and the position taken. For example, the project in the Bio-system lab asked student teams to design a food dehydrator machine using materials that students collected by themselves, that were available in the workshop, or that could be purchased within the budget (i.e. $100/team). A common question that each team had to answer was what energy source the food dehydrator could utilize. As student teams generated different design ideas and took different procedures, they faced different questions that emerged in the process. A solution to one question brought forth new questions. In the example given above, different choices of energy source led to new questions concerning the choice of materials or equipment that could make the machine powerful enough to function properly. For a student group who decided to purchase a battery to power their product, a subsequent question was what type of battery they needed. Another student group who decided to use solar energy delved into a different set of questions, such as what materials could be used to absorb solar energy and what shape of energy panel would fit their device. Finally, the level of inquiry involved in the lab work also relied on the level of detail and complexity that student groups chose to engage in. The inquiry-based projects in the sample courses engaged students in self-directed exploration and allowed flexibility for student teams to complete the projects in the ways they aspired to.
Therefore, it was, to some extent, up to students to manipulate task complexity. In the Lego robot project, for instance, the final products could vary dramatically. Designing a robot that met the minimum requirements would be sufficient for project completion, yet student groups could choose to tackle detailed design issues and manipulate the level of complexity, such as how to make the movement smoother and more efficient. The assessment of student learning might encourage some student teams to improve their products and go beyond the minimum requirements, because only student groups with outstanding design products could participate in the college-wide Design Day to showcase their products.

Data Collection Procedures

Identify Changes in Professional Skills and Attitudes to Engineering
Quantitative data. To address whether students' professional skills and attitudes towards engineering changed after the inquiry-based lab learning experience, individual level survey data were collected from students enrolled in the three sample courses. The inquiry-based lab projects employed in these courses provided a hands-on learning context for students to solve ill-structured problems. To complete these lab projects, students needed to further define the problem or hypotheses based on the question given by the instructor, determine a process to solve the problem or test the hypotheses, and generate their own solutions. These features of the lab work in the sample courses align with the definition of guided inquiry (Banchi & Bell, 2008). Assessing students' professional skills and attitudes before and after the sample course provided quantitative evidence for the current study to answer the first group of research questions by examining the numeric trends in the desired lab learning outcomes. Survey data were collected in three successive semesters from spring 2013 to fall 2013, using the Inquiry-based Lab Learning Inventory (ILLI) developed for the current study. The engineering design lab was sampled in all three semesters, including summer 2013. The Bio-system lab was sampled in spring 2013. The chemistry lab was sampled in fall 2013. To ensure the best possible response rate and quality of responses, course instructors were involved in planning the data collection, including determining the survey instructions, the section of class during which surveys could be administered, the format of the survey that was convenient for students, and the procedure for administering the survey. The retrospective post-then-pre design was used to measure changes in students' self-reported professional skills and attitudes necessary for achieving the desired learning outcomes. The survey instrument (ILLI) first asked students to rate their current skills and attitudes and then to reflect back and rate the same skills and attitudes before taking the sample course. Compared to the traditional pre-post design, the post-then-pre test is better at controlling for response shift bias, which occurs when participants have a different frame of understanding when taking the pre- and post-tests (Howard, 1980; Rockwell & Kohn, 1989). Participants in the current study were undergraduate students whose understanding of the engineering profession was likely to shift within a semester, so the post-then-pre test was more appropriate.
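With this design, each outcome for each student reduces to a difference between the post rating and the retrospective pre rating, which can then be tested and sized with standard paired-samples statistics. The formulation below is one common way to express this idea; the notation is generic and shown only for illustration, not taken from the instrument itself:

\[
d_i = x_i^{\text{post}} - x_i^{\text{pre}}, \qquad
t = \frac{\bar{d}}{s_d / \sqrt{n}}, \qquad
d_z = \frac{\bar{d}}{s_d},
\]

where \(x_i^{\text{post}}\) and \(x_i^{\text{pre}}\) are student \(i\)'s post and retrospective-pre ratings on an outcome, \(\bar{d}\) and \(s_d\) are the mean and standard deviation of the \(n\) difference scores, \(t\) is the paired-samples test statistic, and \(d_z\) is a standardized effect size commonly used for paired data.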
Since the goal was to capture students' perceptions of the changes they made in skills and attitudes by taking the sample course, the post-then-pre method allowed participants to reflect on the course's effect on particular learning outcomes and was adequate for collecting self-reported assessment data. The survey, which took about 10-15 minutes, was administered on the last day of each sample course. The engineering design lab used web-based surveys, and the other two sample courses used paper surveys. In the Bio-system lab course, I (the investigator) administered and collected survey forms in person. In the other two sample courses, which included multiple sections running at the same time, the course instructor communicated the survey administration procedure to the lab TAs, who helped administer the surveys. In the Engineering design lab, students were in computer labs in the last class and were provided the link to the ILLI during class. Students were given time to complete the ILLI in class, but they could also access and fill out the survey after class. In the Chemistry lab, each TA read the survey instructions to students in his/her section, administered the paper survey, and collected and returned the surveys to me (the investigator) right after class. Data collection and analysis were conducted simultaneously, with emphasis given to the quantitative data. Based on the quantitative and qualitative data collected during the spring and summer semesters, the survey protocols were revised. A few items that received substantially more "NA" responses than others, meaning students lacked relevant experience in the sample courses, were removed from the instrument. The revised survey instrument (see Appendix 1) was used in part of the lab sections of the engineering design lab and all lab sections of the chemistry lab in fall 2013.
Qualitative data. Qualitative data from semi-structured interviews were a small component of the current study, collected and analyzed at the early stage of data collection during spring and summer 2013. The purpose of the interviews was to confirm and cross-validate the quantitative assessment of learning outcomes and to determine whether survey items needed revision. Seven interviews were conducted concurrently while the quantitative data were collected and analyzed. Interviewees were recruited from students who had taken the Engineering design lab course within the previous five years. The survey instrument administered to the Engineering design lab in spring and summer 2013 also included an interview invitation on the last page. Additionally, several faculty members and presidents of engineering student organizations were contacted to help circulate the interview invitation. Snowball sampling was used to identify potential participants. Interviews were conducted by phone or Skype. Each interview lasted 40 to 60 minutes. In each interview, the participant first described the projects that they conducted in this course and the procedures that they took to accomplish the projects. Then, interviewees answered two major questions: how they felt about this course, especially the lab projects, and what they had learned most in this course. If participants mentioned learning experiences or outcomes relevant to the lab learning objectives addressed in the current study, they were asked follow-up questions to provide more detailed information. Table 4 summarizes interviewees' backgrounds. Pseudonyms were used in place of their real names.
My subjectivities as an international student from China may have influenced the recruitment of interviewees. Among the seven students who participated in the interviews, six were international students from Mainland China or Taiwan and only one was a domestic student. The response rate from the spring and summer 2013 classes was low. Among students who accepted the invitation, only one student from these two classes participated in the interview. This low response rate to the interview invitation might be a function of the difficulty of reaching students during the summer break.

Table 4. Summary of Interviewees

Li (male, Taiwan): took the course in 2008 as a freshman. Major projects: 1) use food to make a cart that can function; 2) design something that can be used in the dorm.
Qing (female, Taiwan): took the course in 2008 as a freshman. Major projects: 1) use food to make a cart that can function; 2) design something that can be used in the dorm.
Lei (male, Mainland China): took the course in 2013 as a sophomore. Major project: design Lego robots.
Chang (male, Taiwan): took the course in 2010 as a sophomore. Major projects: 1) design a Lego robot; 2) design a water container that can keep water temperature.
Zhe (male, Mainland China): took the course in 2010 as a freshman. Major project: design Lego robots.
Chi (male, Mainland China): took the course in 2010 as a sophomore. Major projects: 1) design a Lego robot; 2) design a solar water heater.
Karen (female, U.S.): took the course in 2009 as a freshman. Major projects: 1) use food to make a cart that can function; 2) design something that can be used in the dorm.

Role of Difficulty and Workload in Inquiry-based Labs

In the original research design, the examination of the second group of research questions concerning the influence of difficulty and workload on student learning would have employed a qualitative approach. However, the small number of interviewees reached at the early stage of data collection made a quantitative approach more feasible for the sample courses. Therefore, two items that measured students' perceived difficulty and workload of the sample course were added to the survey instrument. These two items asked students to rate the sample course, as compared to other courses, from 1 (very easy) to 10 (very difficult) and from 1 (very light workload) to 10 (very heavy workload). The instrument with these two items was used in 17 out of the 21 sections of the Engineering Design Lab and all 52 sections of the Chemistry Lab in fall 2013. Student-rated difficulty and workload, coupled with other items in the survey, allowed quantitative analysis of the association between the difficulty and workload of an inquiry-based lab course and student learning outcomes, controlling for student learning approaches and background information.

CHAPTER IV: CONSTRUCTION AND VALIDATION OF THE SURVEY INSTRUMENT

This chapter reports the construction and evaluation of the survey instrument used in the current study. To address the construct validity of the outcome measures, this section first specifies the theoretical concepts relevant to each construct, followed by findings from exploratory and confirmatory factor analyses. The two steps of factor analysis used 486 survey responses from the Engineering Design Lab and 781 responses from the Chemistry Lab respectively, to reduce the probability of sample-specific (chance) variation. Factor analysis results revealed the degree to which the outcome measures clustered into latent variables consistent with the non-technical lab learning objectives on which the instrument was based.
Based on the factor structure from the factor analysis, the reliability of the measures was evaluated by Cronbach's alpha estimates.

Construction of Measures

Empirically tested instruments that measure student learning outcomes in engineering labs did not exist in the literature. Therefore, part of the current project was developing and testing a survey instrument that measured changes in students' professional skills and attitudes to engineering after an inquiry-based lab learning experience. Based on the 3Ps model (Prosser & Trigwell, 1999), the Inquiry-based Lab Learning Inventory (hereafter called ILLI) was designed to measure students' self-reported professional skills and attitudes to engineering, the learning approaches used during the lab course, and their demographic and academic backgrounds.

The perceived skill level, attitude, and learning approach were all latent constructs (also called factors) that could not be observed, counted, or measured directly. Instruments measuring a latent construct usually need to include multiple indicators to reflect an inherently abstract construct (Dillon, Madden, & Mulani, 1983). Specification of the dimension(s) or indicators that represent each latent factor drew on existing instruments and relevant concepts and theories. Table 5 specifies the dimensions and the number of items used to measure each construct.

Table 5. Number of Items in Each Sub-scale

Perceived skill level: Creativity 4; Teamwork and communication 8.
Positive attitude: Creativity 3; Learn from failure 5; Teamwork and communication 4.
Negative attitude: Creativity 1; Learn from failure 5; Teamwork and communication 4*.
Deep approach: Learning approach 4.
Superficial approach: Learning approach 4.
*Not included in the following analyses.

The four items measuring negative attitude to teamwork and communication were reversely worded items. They were mainly used to identify extreme cases that had the same ratings for all items. The meaning of these four items overlapped with that of the four items measuring positive attitudes. Keeping similar items in factor analysis would mask the true underlying factor structure and lead to an inferior factor solution. Therefore, the four reversely worded items were removed from the analysis.

Creativity

Creativity in the current study included two dimensions: divergent thinking and exploration. Some psychological studies approached creativity as a domain-general skill, but other scholars suggested that creativity is discipline-relevant, involves domain-specific skills, and can only be judged by someone knowledgeable about that particular domain (Hoegl & Parboteeah, 2007; Jeffries, 2007; Wilpert, 2008). Each discipline defines creativity differently and gives emphasis to different traits associated with creativity. Creativity in engineering is usually related to problem finding and problem solving in an original and useful manner (Charyton & Merrill, 2009; Cropley & Cropley, 2005; Nickerson, 1999). Relevant is Guilford's (1967) categorization of thinking into "divergent" and "convergent" thinking. Divergent thinking, the ability to generate multiple solutions to a problem, has been used as an indicator of creative potential in the literature (McCrae, 1987; Runco & Acar, 2012). Divergent thinking ability is particularly important for solving ill-structured problems, which require problem solvers to identify the various perspectives and opinions relevant to the problem and to reconcile the uncertainty in the inquiry process (Kitchener & King, 1981).
Based on Guilford's conceptualization of divergent thinking, four items were created to measure students' perceived creativity skills situated in the context of problem solving. Another concept associated with creativity and relevant to problem solving is curiosity, an emotional-motivational state that facilitates creative thinking (Kashdan, Rose, & Fincham, 2004). Like the cognitive component of creativity, which includes divergent and convergent thinking, the affective domain of curiosity has also been dichotomized, into diversive and specific curiosity. Diversive curiosity facilitates the exploration of various sources of novelty and challenge, whereas specific curiosity fosters further enjoyment by going in depth with a particular stimulus or activity (Berlyne, 1960, 1967, 1971; Day, 1971; Krapp, 1999). Although high curiosity does not always result in high creativity, it is necessary for creative thinking to occur (Kashdan & Fincham, 2004). Therefore, researchers often treat curiosity as a construct interrelated with creativity (Amato-Henderson, Kemppainen, & Hein, 2011; Kashdan & Fincham, 2004). The Curiosity and Exploration Inventory (CEI) (Kashdan et al., 2004) was developed to measure exploration (inclination for novelty and challenges) and absorption (engagement in a specific activity) among college students. The current study adapted four items from the exploration dimension of the CEI to address the creativity objective.

Learn from Failure

Failure in learning usually brings stress. For learning to occur through failure, learners must persevere when confronted with it. As novices in the field of engineering, undergraduate students often see their failures more readily than they recognize their successes (Burleson & Picard, 2004). Learning from failure and moving forward also requires students to perceive failed tasks as vehicles for testing assumptions, to see failure as feedback on those assumptions, and to redirect resources to alternative approaches (McGrath, 1999). It takes time and effort for students to reflect on failure and build plausible explanations (Shepherd, Patzelt, & Wolfe, 2011).

Existing studies have examined the relationship between individuals' coping strategies in a variety of adverse situations and their self-efficacy (McQuiggan & Lester, 2006), attitudes to learning tasks (Combs & Onwuegbuzie, 2012), and curiosity (Cavalieri, 1996). Only positive strategies (adaptive or constructive coping), which improve one's functioning in solving problems or overcoming difficulties, will enable students to learn from failure and move forward in accomplishing learning tasks. In contrast, negative coping techniques (maladaptive coping or non-coping) prevent an individual from unlearning the association between stress and the situation and discourage the individual from learning. One of the most commonly used maladaptive coping strategies is to avoid, by all means, the situation that brings anxiety. The avoidance coping strategy can result in students' withdrawal from learning.

Many other ways of categorizing coping exist in the literature. Brandtstädter (1992) distinguished assimilative and accommodative coping, with the former aiming at changing the environment to fit oneself and the latter at altering oneself to fit the environment. Lazarus and Folkman (1984) categorized coping into problem-focused and emotion-focused: the former refers to taking instrumental action to change the situation, and the latter aims at managing emotional distress.
Items used to measure engineering students' intention to learn from failure in the current study were adapted from the Stress Coping Inventory (SCI) developed and tested by Lin and Chen (2010). Although their study focused on students in universities and colleges of technology, the context of their study is similar to that of the current study. The SCI was created to examine university students' problem- and emotion-focused coping strategies used to overcome stress in learning, particularly in problem solving. The ILLI used ten items from the SCI, including three active problem-focused, three passive problem-focused, two active emotion-focused, and two passive emotion-focused items.

Teamwork and Communication

Prior studies often addressed engineering students' domain-specific communication skills and teamwork capacity simultaneously (e.g., Adams, 2001; Artemeva & Logie, 2003). Students can practice oral or written communication skills in almost every course through conducting lab projects, writing reports, presenting results, and participating in classroom discussion. Communication skills have generic aspects that are transferable from context to context, but certain aspects of communication skills are domain, context, and case specific (e.g., Baig, Violato, & Crutcher, 2009; Morrow et al., 2003). For instance, individuals who can effectively communicate in technical terms may not be equally effective in communicating with a lay audience. Similarly, working in different kinds of teams requires different skills. In a multidisciplinary team, in particular, the ability to judge what kind of message is appropriate for a specific audience becomes more important than when team members are from similar academic backgrounds (Lynch et al., 2009).

The instrument included eight items that measured perceived teamwork and communication skills, including assessment of collaboration and communication with people in the same field or from different fields. Communication skills included oral and written communication with different audiences. These eight items were adapted from an instrument developed for a research project funded by the National Science Foundation, Prototype to Production: Conditions and Processes for Educating the Engineer of 2020 (NSF DUE-0618712), co-directed by Drs. Lisa R. Lattuca and Patrick T. Terenzini. Similarly, eight items were used to measure the attitude towards collaboration and effective communication in engineering. Items measuring attitudes to communication were adapted from an instrument measuring medical students' attitudes towards communication skills (Wright et al., 2009).

Interrelationships among Learning Outcomes

Learning outcomes are often related rather than discrete. Marzano and Kendall's (2007) new taxonomy of educational objectives indicated the interaction among attitudes, beliefs, emotions, motivation, and attention in learning. The achievement of one learning outcome may suggest changes in other learning outcomes and vice versa (Kraiger, Ford, & Salas, 1993). For instance, a learning environment fostering creativity and an environment that encourages learning from failure share similar features, such as being challenging and supportive. Achieving the goals of being creative and being able to learn from failure also involves similar aspects of learning, as both involve tolerance of uncertainty and persistence when confronted with difficulty (Caffarella, 1993; Cavalieri, 1996).
Learning Approach

Booth (2004) related the distinction between deep and surface approaches to engineering education: "deep and surface approaches are generic terms and individual types of task (reading, solving problems, doing labs, undertaking projects in industry) all have their own peculiar forms which can be observed and analyzed in situ" (p. 17). Based on the dichotomization of learning approaches into superficial and deep approaches (Biggs, 1987; Marton & Säljö, 1976), the current study modified eight items from the Shortened Experiences of Teaching and Learning Questionnaire (SETLQ) (Hounsell et al., 2005) to fit the context of a lab course, with half of these items measuring the deep approach and the other half measuring the superficial approach. Both approaches were driven by students' intention: to understand the material, or to cope with the tasks or assessment with minimum personal engagement (Marton, Hounsell, & Entwistle, 1997).

Students' Characteristics

Students' characteristics taken into account in the analysis included students' previous learning experiences in lab or inquiry-based settings, race, sex, grade, and GPA. In the context of the proposed study, students' previous experiences in inquiry-based learning settings and familiarity with solving ill-structured problems are likely to influence how they react to inquiry-based lab instruction. Houlden and his colleagues (2001) suggested that students who lack sufficient problem-solving and interpersonal skills can hardly benefit from problem-based learning. Walker and Lofton (2003) suggested that exposing students to unfamiliar instructional methods could reduce students' self-perception of their learning abilities. Research on inquiry-based lab instruction showed similar results: students who had prior exposure to inquiry-based lab experiments valued this learning approach more and accepted it more readily.

Gender and race are important individual factors considered in teaching and learning research (e.g., Hoffer, Rasinski, & Moore, 1995; Madigan, 1997; Mason & Kahle, 1989). Research also showed that male and female students prefer different learning styles (Dee, Nauman, Livesay, & Rice, 2002; Philbin, Mejer, Huffman, & Boverie, 1995). Secker's (2002) study examined whether the effectiveness of inquiry-based instruction varies across different demographic features. Her study showed that learning styles and demographic status explained some of the variation in students' learning achievement in inquiry-based science classes, and thus these individual factors should be controlled for when assessing the effectiveness of instructional methods.

The student survey asked a series of questions concerning students' backgrounds, including race, sex, and GPA. Male and White were used as the reference groups for sex and race respectively, because males and White students are the dominant sex and racial groups in the field of engineering, as well as in the current dataset. GPA is a 6-point ordinal variable representing overall GPA in the engineering program, ranging from 1.49 or below (1) to 3.50-4.00 (6).
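As a minimal sketch of the coding scheme just described (with hypothetical data and column names), the reference-group and ordinal coding might look as follows; the binary race recode anticipates the regression analyses reported in Chapter V.

```python
import pandas as pd

# Hypothetical background data following the coding scheme above.
students = pd.DataFrame({
    "sex":  ["Male", "Female", "Male", "Female"],
    "race": ["White", "Asian", "Hispanic", "White"],
    "gpa":  [6, 4, 5, 3],   # ordinal: 1 = "1.49 or below" ... 6 = "3.50-4.00"
})

# Male and White are the reference groups, so they are coded 0.
students["female"]    = (students["sex"] != "Male").astype(int)
students["non_white"] = (students["race"] != "White").astype(int)
```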
Reliability and Validity

A major goal of the current study was measuring changes in students' self-rated skills and attitudes to engineering after taking the sample course, so reliable and valid measures of the outcome variables were essential for addressing the research questions. Exploratory and confirmatory factor analyses were employed to examine the construct validity of the measurements. Based on the factor structures from the factor analysis, Cronbach's alpha was employed in the reliability estimates of each factor. Evaluation of reliability and validity involved parallel analysis, which determined the number of factors to retain; exploratory factor analysis (EFA), which reduced the outcome measures to latent factors; and confirmatory factor analysis (CFA), which cross-validated the factor structure from the EFA. Following common practice in evaluating instruments, the current study used separate datasets for the EFA and the CFA. The EFA used complete responses from the Engineering Design Lab (N=486) and the CFA used complete responses from the Chemistry Lab (N=781).

Parallel Analysis

An important decision in conducting factor analysis is the number of factors to retain (Hayton, Allen, & Scarpello, 2004; Zwick & Velicer, 1986). The current study employed one of the most recommended methods in the recent literature, parallel analysis (PA), performed with O'Connor's (2000) SPSS syntax (rawpar.sps), to determine the number of components to retain. Parallel analysis parsimoniously simplifies structure and reduces the analysis of noise by identifying significant variable loadings for each factor (Buja & Eyuboglu, 1992; Franklin et al., 1995). A factor is considered significant when its associated eigenvalue is greater than the mean of those from random uncorrelated data.
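The following is a minimal Python sketch of Horn's parallel analysis. Unlike O'Connor's rawpar.sps, which can also permute the raw data, this version draws normally distributed random data of the same dimensions; the input matrix here is a synthetic placeholder, not the study's responses.

```python
import numpy as np

def parallel_analysis(data, n_iter=1000, seed=0):
    """Compare observed eigenvalues with those of random uncorrelated
    data of the same size (cases x items)."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.empty((n_iter, p))
    for i in range(n_iter):
        x = rng.standard_normal((n, p))
        rand[i] = np.linalg.eigvalsh(np.corrcoef(x, rowvar=False))[::-1]
    return obs, rand.mean(axis=0), np.percentile(rand, 95, axis=0)

# Synthetic stand-in for the 481 x 30 response matrix.
data = np.random.default_rng(1).standard_normal((481, 30))
obs, rand_mean, rand_p95 = parallel_analysis(data)
# Retain a factor when its observed eigenvalue exceeds the mean random one.
n_factors = int((obs > rand_mean).sum())
```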
Exploratory Factor Analysis (EFA)

The survey instrument employed multi-item scales, so the variation in the observed variables was likely to reflect variation in a smaller number of latent factors. Exploratory factor analysis (EFA) was conducted to uncover the underlying structure of a group of variables based on the pattern of correlations among them. The purpose of exploratory factor analysis is to indicate how strongly each item reflects a factor (Stevens, 1996). Additionally, the survey instrument drew on literature and existing instruments from various fields without strong hypotheses about the relationships among its items within the context of engineering labs, so an exploratory analysis was more appropriate than a confirmatory approach to begin with. Since the outcome variables were measured by Likert-scale questions, the EFA used polychoric correlation matrices, which are suitable for ordinal data (Garson, 2013). The EFA was performed with Stata using the factormat command.

Adequacy of sample size. The EFA included 30 outcome variables that measured pre-course and post-course self-rated skill levels and attitudes to engineering. The subject-to-item ratio for each group of factor analysis was 16:1, above the normally acceptable ratios (i.e., 5:1 or 10:1) used in previous research (Costello & Osborne, 2003; Hatcher, 1994; Mundfrom, Shaw, & Ke, 2005).

Sampling adequacy. Kaiser's measure of sampling adequacy estimated the degree of inter-correlation among the variables (Sharma, 1996). The Kaiser-Meyer-Olkin measure of sampling adequacy was over .7 and the Bartlett's test of sphericity result was statistically significant for pre- and post-course responses respectively, suggesting that it was appropriate to conduct factor analysis.

Items deleted. Both factor and reliability analyses specified the variables that measured a similar concept (or factor) according to the extent to which variables within a factor shared variation. An item was excluded from subsequent analysis if 1) its loading on each factor extracted was lower than 0.4, 2) the item had similar loadings on different factors, 3) the item functioned differently in the pre-lab and post-lab assessments (e.g., an item had high loadings on different factors in the pre- and post-lab scores), 4) deleting the item improved the reliability of a scale, or 5) the variance of the variable accounted for by the factor within which it was embedded (i.e., its communality) was less than 0.4. The analysis was repeated after each deletion until a satisfactory factor solution was achieved.

Variance explained. In factor analysis, the total variance in observed scores is partitioned into common variance shared among a group of variables and specific variance, which includes unshared variance and measurement error (Kline, 2004). A major goal of factor analysis is to have a set of variables' common variance accounted for by the factor(s). A rule of thumb for determining the percentage of total variance explained in a factor solution does not exist in the literature; the criterion used by researchers can be as high as 90% or as low as 50%. In the humanities, it is common to have 50-60% of the total variance explained (Hair, Anderson, Tatham, & Black, 1995; Williams, Onsma, & Brown, 2010).

Factor loadings. A factor loading indicates the strength of the association between the original variable and its factor. Direct oblimin (oblique) rotation was used to identify items that loaded significantly on four factors. A rule of thumb is to retain an item only when the absolute value of its factor loading is at least 0.3. Factor loadings with absolute values equal to or greater than 0.4 are considered more important, and those equal to or greater than 0.5 are considered practically significant. A stricter rule takes sample size into account in determining the significance level for interpreting factor loadings; in a sample of 200, a factor loading of 0.4 is required for practical significance (Hair et al., 1998).
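Criteria 1, 4, and 5 above lend themselves to an automated pruning loop. The sketch below uses the third-party factor_analyzer package, which fits the EFA on Pearson rather than polychoric correlations, so it only approximates the Stata procedure used here; the cross-loading and pre/post-consistency checks (criteria 2 and 3) still require judgment.

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # pip install factor_analyzer

def prune_items(df, n_factors=4, min_loading=0.4, min_communality=0.4):
    """Refit the EFA after each deletion, dropping the weakest item until
    every retained item meets the loading and communality criteria."""
    items = list(df.columns)
    while True:
        fa = FactorAnalyzer(n_factors=n_factors, rotation="oblimin")
        fa.fit(df[items])
        main = np.abs(fa.loadings_).max(axis=1)   # main loading per item
        comm = fa.get_communalities()
        flagged = [i for i in range(len(items))
                   if main[i] < min_loading or comm[i] < min_communality]
        if not flagged:
            return items, fa
        items.pop(min(flagged, key=lambda i: main[i]))  # drop weakest, refit
```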
Confirmatory Factor Analysis (CFA)

Since EFA is largely data-driven and involves subjective decisions, a commonly used way to cross-validate the factor structure from an EFA is to conduct confirmatory factor analysis (CFA) using a different dataset (Byrne, 1989; Pedhazur & Schmelkin, 1991). However, the research literature has revealed discrepancies between EFA and CFA results that often occur when using CFA to confirm a factor structure from an EFA (e.g., Church & Burke, 1994; Rao & Sachs, 1999). Van Prooijen and van der Kloot (2001) compared different ways of constructing CFA models based on exploratively obtained factor structures and suggested less constrained models that free the correlations between factors or between factors and variables, rather than fixing the pattern coefficients to the values obtained from the EFA. The present study used CFA to determine the degree to which the pattern of self-rated skills and attitudes to engineering was consistent with the factor structure resulting from the EFA. To test this hypothesis, four models, all with freed parameter coefficients, were examined and compared with a null model in which all relations were constrained to be zero (Byrne, 1989). These four models included 1) a four-uncorrelated-factor model that grouped items based on the EFA results but with uncorrelated factors, 2) a four-correlated-factor model that added correlations among factors to model 1, 3) a model that added cross loadings (0.2 < factor loading < 0.4), and 4) a model with covariances among error terms.

Factor Scores

The present study used two types of factor scores to serve the purposes of the two groups of analyses. First, the average-score method was used to address the first group of research questions, concerning students' changes in self-perceived skills and attitudes to engineering. For the outcome measures, the factor score was calculated by summing the raw scores corresponding to all items loaded on a factor and then computing the average score to retain the scale metric. The average-score method is useful for conducting comparisons across factors or between datasets and is acceptable for most exploratory studies (DiStefano, Zhu, & Mindrila, 2009; Tabachnick & Fidell, 2001). However, a major limitation of this method is that all items on a factor are given equal weight, as loading values are not taken into account. Additionally, Bartlett scores were used in the analysis that addressed the second group of research questions, concerning the relationship between students' perceptions of the learning context and learning outcomes. Bartlett scores were generated with Stata based on the fourth CFA model. Using maximum likelihood estimates, the Bartlett approach produces unbiased estimates of the true factor scores (Hershberger, 2005). Bartlett scores are also highly correlated with the latent factor estimated (Gorsuch, 1983), indicating high validity.
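As an illustration, here is a sketch of the two scoring approaches in Python; the item-to-factor assignment is hypothetical, and the Bartlett formula shown is the standard weighted least squares estimator rather than Stata's implementation.

```python
import numpy as np
import pandas as pd

def average_scores(df, factor_items):
    """Average-score method: unweighted mean of the raw items loading on
    a factor, which retains the original scale metric."""
    return pd.DataFrame({f: df[cols].mean(axis=1)
                         for f, cols in factor_items.items()})

def bartlett_scores(z, loadings, uniqueness):
    """Bartlett scores: (L' Psi^-1 L)^-1 L' Psi^-1 z', where z holds
    standardized item scores, L the loadings, Psi the uniquenesses."""
    psi_inv = np.diag(1.0 / uniqueness)
    m = loadings.T @ psi_inv @ loadings
    return np.linalg.solve(m, loadings.T @ psi_inv @ z.T).T

# Hypothetical assignment of items to two of the four factors.
factor_items = {"SKILLS": ["var1", "var2", "var3"],
                "COPING": ["var23", "var25"]}
```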
Demographic Data

Of the 765 surveys collected from the Engineering Design Lab, 155 were judged unusable due to extensive omissions or extreme responses (i.e., the same rating throughout the survey). In the final dataset, the spring, summer, and fall sessions had 176, 34, and 400 responses respectively, with response rates of 32.9%, 85.0%, and 51.0%. The Bio-system Lab was a small class with 32 students enrolled when the data were collected; 28 (87.5%) students completed and returned the survey. The Chemistry Lab enrolled 1166 students in fall 2013, and its response rate was 77.2%. The 900 usable surveys returned were from 461 (51.2%) students in engineering, 264 (29.3%) in science majors, 92 (10.2%) taking the course to meet pre-medical requirements, 60 (6.7%) from social science majors, and 23 freshmen who indicated that they had not yet decided on a field of study.

After scale-wise deletion of incomplete responses and of extreme responses that assigned the same value to all items within a scale, including the reversely stated items, the sample sizes for each sub-scale ranged from 526 to 538 for the Engineering Design Lab and from 810 to 836 for the Chemistry Lab. The 28 responses from the Bio-system Lab were retained.

Across cases, a majority of participants were White, followed by Asians; the participants of the senior-level Bio-system Lab in particular were predominantly White (see Table 6). The three sample labs also enrolled many more male students than female students, especially the two lab courses offered on campus I. The Chemistry Lab on campus II, as a fundamental science lab offered to different majors across the university, had a more even division between male and female students. The gender distribution in the sample courses aligned with national data: public universities generally had an even male-female student ratio (Ginder & Kelly-Reid, 2013), but more male students (81.8%) than female students (18.2%) were enrolled in undergraduate engineering programs (Yoder, 2011).

Table 6. Participant Demographics

A. Eng. Design Lab (Campus I; 610 responses): Female 18.5%, Male 81.5%; Freshman 69.3%, Sophomore 23.4%, Junior or above 7.3%; White 67.3%, Black 6.0%, Hispanic or Chicano 2.6%, Asian 21.1%, Others 3.0%; median GPA 3.00-3.49 (B to A-).
B. Bio-system Lab (Campus I; 28 responses): Female 29.6%, Male 70.4%; Junior or above 100%; White 92.3%, Asian 7.7%; median GPA 3.00-3.49 (B to A-).
C. Chemistry Lab (Campus II; 900 responses): Female 47.4%, Male 52.6%; Freshman 83.0%, Sophomore 12.2%, Junior or above 4.8%; White 66.4%, Black 4.6%, Hispanic or Chicano 4.3%, Asian 20.7%, Others 4.0%; median GPA 3.00-3.49 (B to A-).

Exploratory Factor Analysis Results

Number of Factors to Retain

The PA results (see Table 7) suggested the extraction of four factors for both pre- and post-course responses. The EFA scree plots from SPSS confirmed a four-factor solution for pre- and post-course responses (see Figure 2). The eigenvalues in the plots condense the variance in a correlation matrix; the larger the eigenvalue, the more variation a factor accounts for (Tabachnick & Fidell, 2001). In both plots, the first three factors were on the steep slope, and the last big drop occurred between the third and fourth factors. The eigenvalues of the fourth factors were 1.036 and 1.060.

Table 7. Preliminary Parallel Analysis Results for the First Five Factors (30 items; N=481)

Pre-course, roots 1-4: raw data eigenvalues 8.788, 3.320, 2.273, 1.520; means of random data eigenvalues 1.489, 1.423, 1.372, 1.329; percentiles of random data eigenvalues 1.553, 1.474, 1.412, 1.363.
Post-course, roots 1-4: raw data eigenvalues 8.816, 3.221, 2.761, 1.537; means of random data eigenvalues 1.491, 1.424, 1.374, 1.330; percentiles of random data eigenvalues 1.559, 1.472, 1.419, 1.367.

Figure 2. Scree plots of eigenvalues from the EFA, plotting eigenvalue against the number of factors for the pre-course (N=481) and post-course (N=481) responses.

Factor Solution

The preliminary results from the EFA, performed with Stata using the factormat command and polychoric correlation matrices, identified four factors: self-perceived skills (SKILLS), attitude to teamwork and communication (ATTITUDE), intention to explore (EXPLORE), and coping strategies (COPING). The factor structures for the pre- and post-course assessments were consistent. The grouping of the items was largely aligned with the conceptual framework based on which the survey items were selected, except for the first factor, which included three types of self-rated skills: communication, teamwork, and divergent thinking. Items measuring these three different kinds of skills clustered together and all had high loadings (0.736-0.818) on this factor and low loadings (<0.2) on the other three factors.

Deleted items. In the preliminary factor solution, the six items listed in Table 8 had consistently low communality (<0.5) for both pre- and post-course responses. As communality is the percentage of variance in a specific variable explained by all the factors jointly, low communalities indicated that the factor solution did not work well for these variables. First, the main loading of item 1 (notint) was low (0.473). Including this variable in the factor about exploration also reduced the reliability of that factor. Since notint is a reversely worded item, it added cognitive complexity in answering survey questions and thus may have reduced reliability (Marsh, 1996). Additionally, items 2-6 were all from the same subscale, Active Coping.
These five items also had low loadings (<0.6) on the primary dimension and weak to modest loadings (0.2-0.4) on other dimensions. Since evidence from the factor structures suggested that the factors extracted did not adequately represent the Active Coping subscale, the five items in this subscale were analyzed separately by comparing students' adoption of active coping strategies before and after taking the sample course.

Several other items also had communality or factor loadings below the normally accepted criterion. However, for theoretical reasons, these items were retained. First, the three items measuring the intention to explore had communality slightly below 0.5; however, they measured different aspects of exploration. Deleting more items from this scale would leave too few items within this factor, or would reduce the comprehensibility of the factor structure if the factor about exploration were removed. Similarly, several items on coping strategies had communalities slightly below 0.5. Since the variables mentioned above all had satisfactory main loadings and contributed to the comprehensibility of the factor structure, these items were retained in the analysis.

Table 8. Items Deleted Based on the Preliminary PCA and the Rationale

1 (notint) Find myself not interested in probing deeply into new situations or things (var19). Rationale: low communality; low loading; decreased reliability.
2 (disc) Discuss issues with teachers, seniors or classmates and ask for their opinions (var21). Rationale: low communality; low loading.
3 (try) Learn to live with it and keep trying (v26). Rationale: low communality.
4 (persev) Tell myself to persevere (var28). Rationale: low communality.
5 (simp) Simplify the question and make it easy to solve (var22). Rationale: low communality.
6 (calm) Use a calm and optimistic attitude to think about how to cope with the problem (var24). Rationale: low communality; low loading.

Eigenvalues of pre- and post-test factor structures. A parallel analysis was performed after item deletion. The raw data eigenvalues of the first four factors were greater than those of the random data, so a four-factor solution was still preferred (see Table 9). The final factor structures for pre- and post-course responses were similar. The first factor accounted for over 60% of the variation in the total sample, and the proportions of variation accounted for by the second and third factors were around 18% and 10% respectively.

Table 9. Parallel Analysis Results for the First Five Components (24 items; N=481)

Pre-course, roots 1-4: raw data eigenvalues 7.723, 2.871, 1.912, 1.440; means of random data eigenvalues 1.426, 1.357, 1.305, 1.262; percentiles of random data eigenvalues 1.492, 1.406, 1.344, 1.299.
Post-course, roots 1-4: raw data eigenvalues 7.838, 2.957, 2.173, 1.427; means of random data eigenvalues 1.425, 1.357, 1.306, 1.262; percentiles of random data eigenvalues 1.489, 1.407, 1.346, 1.300.
However, the eigenvalue for the fourth component fell slightly below one and accounted for only 6-7% of the variation in the sample (see Table 10). The low eigenvalues of the last three factors, especially the fourth factor, COPING, may be a function of the high eigenvalue of the first factor. Because the sum of the eigenvalues equals the total variance, the presence of the large eigenvalue associated with the first factor could be compensated for by depressed eigenvalues of the last three (Buja & Eyuboglu, 1992). Despite the small eigenvalue of COPING, the items loaded on this factor all contributed to a well-defined concept of passive coping consistent with the literature. Since the interpretation of factor loadings must be made in light of theory (Garson, 2013), the items loaded on coping strategies were retained at the exploratory stage of analysis.

Table 10. Eigenvalues for Case A, Pre-course and Post-course Assessment (N=486)

Pre-course: factor 1, eigenvalue 7.864 (difference 5.480, proportion 0.613); factor 2, eigenvalue 2.384 (1.049, 0.186); factor 3, eigenvalue 1.335 (0.383, 0.104); factor 4, eigenvalue 0.952 (0.260, 0.074).
Post-course: factor 1, eigenvalue 8.585 (difference 5.982, proportion 0.610); factor 2, eigenvalue 2.603 (0.930, 0.185); factor 3, eigenvalue 1.673 (0.778, 0.119); factor 4, eigenvalue 0.895 (0.113, 0.064).

Selecting items for each factor. Table 11 shows the correlations between the rotated factors. Based on the most heavily loaded indicators in each column of Table 12, the four factors were labeled SKILLS, EXPLORE, ATTITUDE, and COPING. The factor structures for both pre- and post-course responses were simple structures, with the majority of main loadings above 0.6 and only a few weak cross loadings below 0.4 (Garson, 2013).

Items with major loadings on EXPLORE and ATTITUDE were largely consistent with the original design of these two subscales. Not surprisingly, similar to the measure of perceived professional skills, items measuring the attitude to communication and teamwork loaded on the same factor. Theoretically, teamwork and communication skills are not separable in classroom settings (Lingard, 2010). Contextually, in the sample lab courses, written or oral communication usually took place in a teamwork context, such as discussing the problem-solving process, writing a lab report, and presenting the final product.

Table 11. Factor Correlation Matrix from the Oblique Solution

Pre-course: ATTITUDE-SKILLS 0.107; EXPLORE-SKILLS 0.034, EXPLORE-ATTITUDE 0.320; COPING-SKILLS 0.263, COPING-ATTITUDE 0.533, COPING-EXPLORE 0.394.
Post-course: ATTITUDE-SKILLS 0.134; EXPLORE-SKILLS 0.034, EXPLORE-ATTITUDE 0.432; COPING-SKILLS 0.213, COPING-ATTITUDE 0.415, COPING-EXPLORE 0.327.

In terms of COPING, only items measuring passive coping strategies were retained. The current factor structure worked better for the post-course assessment than for the pre-course assessment of coping strategies. The uniqueness values (uniqueness = 1 - communality) of the five items loaded on COPING were all above the 0.5 threshold for the pre-course assessment, but were within or only slightly above 0.5 for the post-course assessment. The loadings on COPING for the post-course assessment were also higher, indicating a stronger association between COPING and the five items loaded on this factor. Additionally, COPING better explains the variation in passive emotional coping than in passive problem-focused coping. In both the pre- and post-course assessments, the two items measuring passive emotional coping (v27 and v29) loaded higher on this factor and had smaller uniqueness than the three items measuring passive problem-solving coping (v23, v25, and v30). In other words, a considerable amount of variation in passive problem-solving coping was not explained by the current factor structure.
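For reference, the quantities discussed above can be computed directly from a factor solution. In this sketch the loading vector is hypothetical; note that the sum-of-squared-loadings formula for communality holds exactly only for orthogonal solutions, and the proportions below normalize by the retained eigenvalues only, so they differ slightly from Table 10, where Stata normalizes by the total variance.

```python
import numpy as np

# Hypothetical loadings of one item on the four factors.
item_loadings = np.array([0.72, 0.15, 0.10, 0.05])
communality = float(np.sum(item_loadings ** 2))  # variance explained jointly
uniqueness = 1.0 - communality                   # unexplained (unique) variance

# Proportion of variance per factor from the retained eigenvalues
# (pre-course values from Table 10).
eigenvalues = np.array([7.864, 2.384, 1.335, 0.952])
proportion = eigenvalues / eigenvalues.sum()
```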
Table 12. Rotated Factor Pattern and Communalities (N=486). [The table reports, for each of the 24 retained items (var1-var30), the rotated loadings on SKILLS, EXPLORE, ATTITUDE, and COPING and the uniqueness value, for the pre- and post-course assessments separately. Coefficients less than 0.2 were suppressed, and the most heavily loaded indicators in each column were highlighted.]

COPING was a seemingly minor factor for both the pre-course and post-course assessments. However, COPING had modest correlations with all the other factors (see Table 11), which are supported by the existing literature. Prior studies have revealed associations between learning from failure and self-efficacy (McQuiggan & Lester, 2006), attitudes to learning tasks (Combs & Onwuegbuzie, 2012), and curiosity (Cavalieri, 1996). These correlations provided evidence of convergent validity, meaning that two measures of constructs that are theoretically related are indeed related in the data. Therefore, it was theoretically legitimate to keep this factor for the subsequent learning outcome assessment (Garson, 2013).

Internal Reliability

Based on the factor structures from the EFA, Cronbach's alpha was employed in the reliability estimates of each factor. Generally, a rule of thumb of 0.7 is used as an acceptable alpha, and 0.8 is considered good (George & Mallery, 2003; Nunnally, 1978).
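Cronbach's alpha has a closed form that is easy to verify by hand: alpha = k/(k-1) x (1 - sum of item variances / variance of the summed scale). Below is a minimal implementation, with a synthetic input as a stand-in for the actual item responses.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_cases x k_items) array of scores on one sub-scale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Synthetic example: four items correlated through a shared component.
rng = np.random.default_rng(0)
common = rng.normal(size=(500, 1))
scale = common + rng.normal(scale=0.8, size=(500, 4))
print(round(cronbach_alpha(scale), 3))
```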
Results showed that the reliability of ATTITUDE was satisfactory for the Engineering Design Lab (Case A) and the Chemistry Lab (Case C), but not for the Bio-system Lab (Case B) (see Table 13). Since the Bio-system Lab had a very small sample size, much lower than the normally accepted minimum sample size of 300 for reliability estimates (Kline, 1986; Nunnally & Bernstein, 1994), results on internal consistency for the Bio-system Lab should be interpreted with caution.

The internal consistency of EXPLORE was higher than that revealed in previous evaluations. Kashdan, Rose, and Fincham (2004) evaluated the Curiosity and Exploration Inventory (CEI) using data from undergraduate students at a large Northeastern university; the internal reliability of the exploration subscale in their study ranged from 0.63 to 0.74. The difference may be a function of the removal of a reversely stated item (var19, notint) from this sub-scale in the current study, as negatively worded items tend to reduce the internal consistency of measures (Marsh, 1996).

The reliability estimate of COPING was slightly below 0.7 for the Chemistry Lab, but was acceptable for the other two labs. The reliability estimates of the original Stress Coping Inventory reported in a previous study were over 0.87 for passive emotional coping, measured by six items, and 0.86 for passive problem coping, measured by eight items (Lin & Chen, 2010). The five items measuring passive coping in the ILLI were a combination of two items measuring passive emotional coping (v27 and v29) and three items measuring passive problem coping. Combining these two different aspects of passive coping in the current survey instrument might have reduced the reliability estimates. The smaller number of items included in this scale, as compared to the original inventory, might also have contributed to the less satisfactory reliability estimates, since the number of items in a scale is positively associated with Cronbach reliability scores.

Table 13. Cronbach's Alpha Reliability Estimates for Each Component of Each Case

Case A (Engineering Design Lab): SKILLS N=536, pre .929, post .937; EXPLORE N=537, pre .825, post .791; ATTITUDE N=526, pre .766, post .780; COPING N=538, pre .760, post .801.
Case B (Bio-system Lab): SKILLS N=28, pre .939, post .916; EXPLORE N=28, pre .833, post .794; ATTITUDE N=28, pre .411, post .465; COPING N=28, pre .772, post .747.
Case C (Chemistry Lab): SKILLS N=835, pre .903, post .900; EXPLORE N=836, pre .848, post .846; ATTITUDE N=810, pre .754, post .747; COPING N=833, pre .663, post .689.

Confirmatory Factor Analysis Results

Statistical Assumptions and Corrections

Most extraction methods, such as maximum likelihood (ML), assume univariate and multivariate normality. Kline (2011) suggested that skewness > 3 and kurtosis > 10 indicate extreme non-normality, and that distributions within these bounds are adequate for the assumption of multivariate normality. The exploration of the univariate normality of individual variables (skewness < 2 and kurtosis < 4) did not discover extreme non-normality. However, Mardia's coefficients of multivariate skewness and kurtosis from the Stata module mvsktest (Mardia, 1970; Mardia, Kent, & Bibby, 1979) revealed evidence of a violation of multivariate normality.

A non-normal distribution tends to inflate chi-square and underestimate standard errors when using ML, leading to Type I errors or the rejection of good-fitting models. Moreover, data transformation may lead to spurious structure (Buja, 1990). Instead, a bootstrapping method was adopted in the current study to correct standard errors. Based on simulation studies, with a large sample size (500-1000), bootstrapping works well in correcting standard errors (Fouladi, 1998; Hancock & Nevitt, 1999; Nevitt & Hancock, 2001).
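The case-resampling idea behind bootstrapped standard errors is straightforward, although Stata's SEM bootstrap is more involved; the generic sketch below uses synthetic data as a placeholder.

```python
import numpy as np

def bootstrap_se(data, stat_fn, n_boot=2000, seed=0):
    """Standard error of stat_fn(data): resample rows (cases) with
    replacement and take the standard deviation across resamples."""
    rng = np.random.default_rng(seed)
    n = data.shape[0]
    stats = np.array([stat_fn(data[rng.integers(0, n, size=n)])
                      for _ in range(n_boot)])
    return stats.std(ddof=1)

# Example: bootstrap standard error of a column mean.
data = np.random.default_rng(1).normal(size=(800, 1))
print(bootstrap_se(data, lambda d: d[:, 0].mean()))
```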
Fit Indices

A range of statistics was used to evaluate the extent to which the four CFA models derived from the EFA results reproduced the sample covariance matrix of the measured outcome variables. The purpose was to identify the most parsimonious factor solution that fit the patterns of the observed outcome variables satisfactorily. The conventionally used chi-square test for evaluating model fit tends to reject otherwise good-fitting models when the sample size is large (N > 400), as in the current study. The alternative fit indices used in the current study to evaluate model fit included the relative chi-square, the root mean squared error of approximation (RMSEA), the standardized root mean square residual (SRMR), the Akaike information criterion (AIC), and the comparative fit index (CFI). The first three are absolute measures of fit; the latter two are comparative measures and are meaningful only when two models are estimated and compared (Hooper, Coughlan, & Mullen, 2008).

The relative chi-square (χ2/df) minimizes the impact of sample size. Although there is no consensus concerning the cutoff value of this statistic, recommendations range from 2 (Tabachnick & Fidell, 2007) to 5 (Wheaton, Muthen, Alwin, & Summers, 1977). RMSEA and SRMR are robust for models with large df and high N. RMSEA is based on a chi-square to df ratio and penalizes model complexity. SRMR is the standardized difference between the observed and the predicted correlations and does not penalize model complexity. For both indices, 0 indicates perfect fit. MacCallum, Browne, and Sugawara (1996) used RMSEA values of 0.01, 0.05, and 0.08 to indicate excellent, good, and mediocre fit; others used 0.07 (Steiger, 2007) or 0.06 (Hu & Bentler, 1999) as the cutoff for poor model fit. For SRMR, 0.08 is generally used as the cutoff for poor-fitting models, and a smaller SRMR means better fit (Hu & Bentler, 1999). Unlike the other fit indices, AIC does not have a normally accepted cutoff value; rather, the model with the lowest AIC is the best-fitting model. CFI examines baseline comparisons and is largely dependent on the average size of the correlations in the data. A cutoff criterion of CFI >= 0.90 has been widely accepted. Both AIC and CFI penalize every parameter added to the estimation (Hooper, Coughlan, & Mullen, 2008).
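The absolute indices have simple closed forms given the model and null-model chi-squares. The sketch below uses the standard formulas (one common convention divides by N - 1 in the RMSEA); plugging in the model 4 pre-course values from Table 14 approximately reproduces the reported RMSEA and CFI.

```python
import numpy as np

def fit_indices(chi2, df, n, chi2_null, df_null):
    """Relative chi-square, RMSEA, and CFI from chi-square statistics."""
    rel_chi2 = chi2 / df
    rmsea = np.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
    cfi = 1 - max(chi2 - df, 0.0) / max(chi2_null - df_null, chi2 - df, 0.0)
    return rel_chi2, rmsea, cfi

# Model 4, pre-course (Table 14): chi2 = 4.7 * 238, null chi2 = 27.7 * 276.
print(fit_indices(4.7 * 238, 238, 781, 27.7 * 276, 276))
# -> (4.7, ~0.069, ~0.881), close to the reported 0.068 and 0.882
```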
Model Fit

The first comparison model assumed that items clustered within the factors proposed by the EFA, but that the factors were not related to each other; all major factor pattern coefficients were estimated as free parameters. The second model assumed that items would load on the exploratively obtained factors in the same pattern as suggested by the EFA and that each item would have sufficient variance explained. The third model added the cross loadings that were common to the pre- and post-course responses (0.2 < factor loading < 0.4), based on the factor structure from the EFA. Model three was expected to fit the patterns of the outcome variables better than the first two models. Ideally, model three would fit the data satisfactorily; otherwise, more paths between variables and factors or among error terms would be necessary for the factor solution to fit the patterns in the observed outcome variables.

The first, uncorrelated-factor model fitted the data poorly: the relative chi-square was above 5, RMSEA and SRMR were both above 0.08, and CFI was much lower than 0.9. Although model 1 was significantly better than the null model, all of these fit statistics indicate that it did not sufficiently explain the sample correlations among the observed outcome variables. The second model, which allowed the factors to be correlated, fit the patterns in the data much better: its SRMR suggested a good model fit, and its RMSEA was close to the cutoff point for poor-fitting models (i.e., 0.08). However, the relative chi-square for model 2 was still above the suggested cutoff value of 5. Model three, which added the cross loadings, provided a better fit than the first two models; RMSEA and SRMR were both below 0.08, suggesting that this model was acceptable. However, the relative chi-square was still slightly above 5, and CFI did not meet the cutoff criterion of CFI >= 0.90.

To further improve model fit, model four, which included correlated error terms, was considered. The fourth model included covariances among error terms that were conceptually meaningful. Adding correlations among error terms is controversial, because it indicates the omission of relevant latent variables from the model (Gerbing & Anderson, 1984; James, Mulaik, & Brett, 1982). Moreover, attempts to improve model fit by trying different combinations of error terms would reduce the confirmatory approach to an exploratory step. However, for an instrument that uses a multiple-item design, correlated errors are common among items that use similar wording or even appear close to each other on the questionnaire (Bollen & Lennox, 1991). Following the commonly accepted rule that correlated error terms should be used conservatively and guided by concepts and theories (Hooper, Coughlan, & Mullen, 2008; Jöreskog & Long, 1993), the final CFA model only allowed error covariances among a few items within the same factor that clearly overlapped in meaning. Adding error covariances was expected to improve model fit, yet it was intended to be theoretically driven rather than statistically driven.

As mentioned earlier in the EFA results, the items loaded on the SKILLS factor included at least three types of skills: teamwork, communication, and divergent thinking. While the factor SKILLS explained their commonality, it was likely that items within each of these three subscales had variance unexplained by the common factors and were correlated. The final CFA model included error covariances among three items within the SKILLS factor: v3 (communicate effectively with teammates and instructors about lab experiments/projects), v5 (collaborate with others to accomplish a lab experiment/project), and v8 (work in teams of people with a variety of skills and backgrounds). These three items clearly overlapped in meaning: all measured self-rated teamwork skills, with emphases on communication, task accomplishment, and working in a diverse team respectively.

The final model with correlated error terms showed a mediocre model fit, better than that of the first three models. The relative chi-square was below 5 for both the pre- and post-course assessments, and RMSEA decreased from 0.079 and 0.074 to 0.068 and 0.063. However, the CFIs (0.882 and 0.898) were still slightly lower than the desired value (i.e., 0.9). One possible explanation is that the current dataset had low average correlations, as CFI depends on the average size of the correlations in the data (Kenny, 2014). Although most items within each subscale had modest to high correlations, items across subscales had weak or near-zero correlations. The average correlation among all the items was only around 0.2, which may explain the low CFI values of the models tested.

Table 14. Fit Indices for Each Model, for Pre- and Post-course Responses Separately (N=781)

Pre-course: Null model, df=276, relative χ2=27.7, RMSEA=0.185, SRMR=0.252, AIC=59726, CFI=0. Model 1 (uncorrelated), df=252, relative χ2=7.2, RMSEA=0.089, SRMR=0.120, AIC=53954, CFI=0.789. Model 2 (factor covariances), df=246, relative χ2=6.2, RMSEA=0.081, SRMR=0.061, AIC=53684, CFI=0.826. Model 3 (cross loadings), df=241, relative χ2=5.9, RMSEA=0.079, SRMR=0.057, AIC=53584, CFI=0.841. Model 4 (error covariances), df=238, relative χ2=4.7, RMSEA=0.068, SRMR=0.055, AIC=53287, CFI=0.882.
Post-course: Null model, df=276, relative χ2=27.3, RMSEA=0.184, SRMR=0.250, AIC=58932, CFI=0. Model 1 (uncorrelated), df=252, relative χ2=6.9, RMSEA=0.087, SRMR=0.118, AIC=53173, CFI=0.796. Model 2 (factor covariances), df=246, relative χ2=6.1, RMSEA=0.081, SRMR=0.058, AIC=52952, CFI=0.827. Model 3 (cross loadings), df=241, relative χ2=5.3, RMSEA=0.074, SRMR=0.053, AIC=52744, CFI=0.856. Model 4 (error covariances), df=238, relative χ2=4.1, RMSEA=0.063, SRMR=0.053, AIC=52443, CFI=0.898.
Note: Relative χ2 = χ2/df; RMSEA = root mean squared error of approximation; SRMR = standardized root mean square residual; AIC = Akaike information criterion; CFI = comparative fit index.

Factor Correlations

The correlations among the factors for the pre- and post-course assessments were close (see Table 15). COPING was weakly correlated with SKILLS and ATTITUDE, but modestly correlated with EXPLORE.
The correlation between COPING and EXPLORE is consistent with the existing literature, which implies that curiosity is a source of intrinsic motivation for one to persist in working on a problem when a solution is not obvious (Caffarella, 1993; Cavalieri, 1996). SKILLS, ATTITUDE, and EXPLORE were also modestly correlated, as expected. Teamwork, communication, and creativity are learning domains that involve both cognitive and affective components: SKILLS measured the cognitive component, and ATTITUDE and EXPLORE measured the affective component of these learning areas. Their correlations were consistent with the current trend of integrating cognitive and affective learning to stimulate learning outcomes for sustainable development (Shephard, 2007).

Table 15. Correlation Matrix of Factors Based on Model Four

SKILLS-ATTITUDE: pre 0.207, post 0.201. SKILLS-EXPLORE: pre 0.277, post 0.298. ATTITUDE-EXPLORE: pre 0.323, post 0.258. SKILLS-COPING: pre 0.186, post 0.183. ATTITUDE-COPING: pre 0.087, post 0.079. EXPLORE-COPING: pre 0.374, post 0.359.

Summary and Discussion

The purpose of this chapter was to develop and validate a measure of students' self-perceived skills in conducting lab work and their attitudes to engineering before and after taking an inquiry-based lab course. Analyses of the construct validity and reliability of the Inquiry-based Lab Learning Inventory (ILLI) demonstrated that this instrument can be adapted to engineering students in inquiry-based lab courses at the lower-division level. Since a large majority of the survey participants were from the two lower-division sample courses in the current study, a larger sample from upper-division courses is required to estimate the construct validity and reliability of the instrument among junior or senior students.

The factor analysis results provide evidence of the construct validity of the learning outcome measures, but also revealed that the measurement can be further improved. The four factors extracted by the exploratory factor analysis explained a satisfactory portion (>90%) of the total variance in observed scores for both pre- and post-course responses. The constructs identified by the factor analyses overlap, to some extent, with the learning domains that the ILLI was intended to measure. The three attitudinal constructs identified were relevant to the four ABET/Sloan lab learning objectives that have a significant affective component: teamwork, communication, creativity, and learning from failure. The clustering of attitudes to teamwork and communication was also theoretically meaningful (Adams, 2001; Artemeva & Logie, 2003; Lynch et al., 2009).

The confirmatory factor analysis indicated that the exploratively obtained factor structure was acceptable, but correlating the errors of three items that measured self-perceived teamwork skills further improved the fit of the model to the patterns of the measured outcome variables, suggesting a method effect, which refers to additional covariation resulting from similarly worded test items. The method effects were also reflected in the clustering of all items measuring self-perceived skills. Items measuring the cognitive aspects of teamwork, communication, and creativity skills all loaded high on a single factor. The first two are personal skills required in team-based lab work, and the last is related to problem finding and solving in an original and useful manner (Charyton & Merrill, 2009; Cropley & Cropley, 2005; Nickerson, 1999). The clustering of items measuring three types of skills raised a concern about the occurrence of a difficulty factor.
Categorical variables with similar splits tend to cluster together in factor analysis, resulting in "difficulty factors" (Gorsuch, 1983). Self-perceived skills were measured on a 6-point Likert scale from 0 to 5, and the rest of the sub-scales used a 7-point Likert scale from -3 to 3, including 0. However, this type of difficulty factor usually occurs when dichotomies are included. The polychoric correlation, on which the EFA was based, was designed for ordinal data and can reduce the possibility of difficulty factors occurring when data are ordinal (Gorsuch, 1974). Another possible explanation is that the placement of the survey items on the instrument might have influenced the factor structure (Bollen & Lennox, 1991). Although questions were displayed in a random order in the web-based survey, items measuring perceived skills appeared together at the beginning of the survey, followed by the attitudinal scales.

Statistically speaking, the ILLI may be improved by employing the same Likert scale for both the cognitive and affective measures. Additionally, more complicated CFA models can be used to take into account the method variance among indicators caused by different designs of survey items, such as multitrait-multimethod (MTMM) models. Models taking method effects into account may provide stronger evidence of construct validity (Campbell & Fiske, 1959; Kenny & Kashy, 1992). However, since CFA model 3 (cross loadings) and model 4 (error covariances) had acceptable model fit, the current study did not further examine the construct validity of the outcome measures.

Theoretically speaking, the clustering of items measuring teamwork and communication skills and divergent thinking ability indicated that these three types of skills may be related in inquiry-based learning tasks. The existing literature on inquiry-based learning has usually addressed these learning outcomes individually (Lord & Orkwiszewski, 2006; Feisel & Rosa, 2005; Sheppard et al., 2009), and evidence concerning how these different domains of learning interact with each other in an inquiry-based learning context is scant. Future research may examine whether better self-perceived communication and teamwork skills in conducting inquiry-based lab projects are associated with stronger self-perceived divergent thinking abilities.

Finally, several limitations of the ILLI may influence the interpretation of results about certain learning outcomes. First, the reliability of ATTITUDE was not acceptable for the Bio-system Lab, and the sample size of this course was too small for calculating Cronbach reliability estimates. Second, the measurement of coping strategies for the Chemistry Lab had reliability estimates slightly below the rule of thumb of 0.7, as well as below the reliability estimates reported in a previous study. Finally, COPING explained variation in passive emotional coping better than variation in passive problem-focused coping.

CHAPTER V: DATA ANALYSIS AND LIMITATIONS

Missing Data

First, responses that did not complete any sub-scale of the outcome measures were removed. Next, extreme cases were also removed, as they would lead to biased results. The instrument included negatively worded items that overlapped in meaning with other items; therefore, responses that assigned the same scores to both positively and negatively worded items were likely to be extreme cases in which participants answered survey questions without paying attention to the meaning of the questions.
Changes in Professional Skills and Attitudes to Engineering

To test the null hypothesis that students' skills and attitudes to engineering did not change before and after the sample course, paired sample tests were conducted. Paired sample tests use each individual as his or her own control in identifying significant differences between pre- and post-course scores. Estimates of effect sizes and confidence intervals were also employed (Kline, 2004). These estimates provide informative evidence for educators to examine how large the effects of instruction on learning outcomes are. The current study used Somers' D, a nonparametric procedure, to protect against non-normality of the measures. Somers' D is a function of the difference between the conditional probabilities of concordant and discordant pairs obtained by comparing pre- and post-course responses. As 0.1, 0.3, and 0.5 are used as cutoff points for weak, moderate, and strong relationships, they were used as the criteria for small, medium, and large effect sizes. Fisher's z transformation (Fisher, 1921) was employed to convert the association estimate to a normal distribution and provide a more robust and outlier-resistant confidence interval (Newson, 2006). The paired sample test and the calculation of effect sizes of change were performed for each sample case separately. Results from the three sample cases were then compared with each other to reveal similar patterns, which indicated replications of results.
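The paired test and effect-size machinery described above were run with Stata (the signrank command and Newson's somersd routines). As a rough stand-alone illustration, the Python sketch below assumes the paired reading D = P(post > pre) - P(post < pre), which matches the interpretation given in the next chapter, and computes a jackknife standard error on Fisher's z scale; the data are simulated.

    import numpy as np
    from scipy import stats

    def paired_somers_d(pre, post):
        """D = P(post > pre) - P(post < pre): the paired reading of Somers' D
        assumed here, matching the 'more likely to score higher afterwards'
        interpretation used in the text."""
        return np.sign(np.asarray(post, float) - np.asarray(pre, float)).mean()

    def somers_d_ci(pre, post, alpha=0.05):
        """Jackknife SE on the Fisher-z (arctanh) scale, back-transformed to
        give an asymmetric, outlier-resistant interval (after Newson, 2006)."""
        pre, post = np.asarray(pre, float), np.asarray(post, float)
        n = len(pre)
        loo = np.array([paired_somers_d(np.delete(pre, i), np.delete(post, i))
                        for i in range(n)])
        z_loo = np.arctanh(np.clip(loo, -0.999, 0.999))
        se_z = np.sqrt((n - 1) / n * np.sum((z_loo - z_loo.mean()) ** 2))
        z_d = np.arctanh(np.clip(paired_somers_d(pre, post), -0.999, 0.999))
        half = stats.norm.ppf(1 - alpha / 2) * se_z
        return np.tanh(z_d), np.tanh(z_d - half), np.tanh(z_d + half)

    rng = np.random.default_rng(1)
    pre = rng.integers(0, 6, size=200).astype(float)    # 0-5 skill ratings
    post = np.clip(pre + rng.integers(-1, 3, size=200), 0, 5)
    w, p = stats.wilcoxon(post, pre)                    # signed-rank test
    d, lo, hi = somers_d_ci(pre, post)
    print(f"Wilcoxon p = {p:.4f}, Somers' D = {d:.3f}, "
          f"95% CI [{lo:.3f}, {hi:.3f}]")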
For the Chemistry lab, where about half of the students were engineering students and the other half were science or pre-medical students, analyses were conducted both on the complete dataset and among engineering students only, to examine whether engineering students showed different learning outcomes. Finally, the qualitative data were used to triangulate the quantitative results. All interviews were recorded and transcribed. Interview data were coded drawing from the pre-specified list of learning outcomes (i.e., teamwork, communication, problem solving, creativity, persistence). Finally, codes were summarized into themes and compared with the quantitative assessment results.

The Relationship between Task Complexity and Learning Outcomes

Regression analyses were performed to address the controversy over the effectiveness of the inquiry-based learning approach. Some researchers suggested that moderate complexity of learning tasks may facilitate learning, yet too much complexity may discourage learning (Kirschner et al., 2006; Prince et al., 2008). Therefore, the current study hypothesized that increased complexity of learning tasks in inquiry-based labs, measured by difficulty and workload, benefits learning outcomes, but that this positive effect follows the law of diminishing returns. In other words, at some point, adding complexity yields smaller gains in learning outcomes or even reduces them. To begin, incremental F tests and Wald tests were used to test the hypothesis that a linear term is sufficient in the regression model. If this hypothesis was rejected, indicating that the influence of task complexity on learning outcomes should be modeled as a second-degree polynomial curve rather than a line, the statistical model included squared scores of difficulty and workload. Otherwise, analyses were performed without polynomial terms.

Based on the 3Ps model of student learning (Prosser & Trigwell, 1999), the analysis controlled for students' relevant learning experience outside the sample course, learning approaches, and student characteristics including gender and race. Male and White students were used as the reference groups. White students were predominant in each sample course, followed by Asians; other minority students, such as African American or Hispanic students, accounted for a small portion of the sample. Therefore, race was re-coded into a binary variable (White=0, non-White=1). The analysis for the Chemistry lab also controlled for major. Since the focus of the current study is on engineering students, non-engineering students in this course were coded 0 as the reference group. Students' previous learning experience was represented by two indicators, problem solving experience and experience in the measured learning areas, consisting of four and five items respectively. Learning approach was also represented by two indicators, deep and superficial approaches, each of which consisted of four items. Each of these indicators represented a sub-score, the average value of the items designed to measure a particular dimension of the learning experience or approach to learning.

I used multivariate regression analysis (MRA), which accounts for the between-equation covariances and allows the current study to test coefficients of interest across equations (StataCorp, 2013). However, performing a MRA on the relationship between four outcome variables and more than ten independent variables requires a large sample size.
Data collected in the Chemistry lab had a sufficient sample size after including the difficulty and workload measures (N=780), but the sample size of the Engineering Design lab after adding these two variables dropped to 203, which was too small for multivariate analysis with four dependent variables and multiple independent variables. Therefore, MRA was performed with STATA to examine the relationship between students' perceptions of difficulty and workload and the four outcome variables using the data collected in the Chemistry lab. Separate regression analyses were then performed to examine the relationship between perceived difficulty and workload and each of the four learning outcome variables.

Limitations

A major limitation of the current study is related to the scope of the method and analysis. Time and accessibility constraints prevented the collection of data on teaching assistants' (TAs') perceptions and performance in the sample courses. Students taking the Engineering Design lab and Chemistry lab were embedded within lab sections facilitated by different teaching assistants. Although the instructors of the sample courses had expertise in inquiry-based instruction and designed inquiry-based lab activities, TAs might not have been equally skillful and comfortable in facilitating inquiry-based projects. Additionally, the instructors of both courses specified the role of the teaching assistant as a facilitator and supporter and required teaching assistants not to solve problems or design products for student groups. However, it is unknown how scaffolding was actually provided; there might be a gap between instructors' expectations and what actually happened in the lab. In the Chemistry lab, TAs fell into three groups: 1) new TAs who were teaching the course for the first time, 2) experienced TAs who had taught the course before, and 3) mentors, experienced TAs selected to mentor other TAs. This variable might have been used to control for between-section differences. However, as the course instructor indicated, those who had taught the course before were not necessarily better facilitators. Therefore, the data analyses conducted to address the research questions employed only student-level data. Future research may take into account variations in TAs' understanding of the inquiry-based approach and skills in facilitating inquiry-based lab activities and employ multilevel analysis to control for between-section differences.

Second, although the post-then-pre approach has advantages in controlling for response shift bias (Howard, 1980; Rockwell & Kohn, 1989), this design has several limitations, including participants' inability to recall their previous skills and attitudes and the tendency for participants to inflate perceived improvement (Hill & Betz, 2005; Shadish, Cook, & Campbell, 2002). Interviewing students who took the Engineering Design lab several years ago had a similar limitation. Although all interviewees could immediately recall the projects that they did when taking this course, those who took the course more than three years ago might not remember the details of the learning process. However, the benefit of collecting information from previous students is that they were in a better position than current students to report long-term effects of the sample course on their studies in engineering.

Despite best efforts, the sample included more lower-division than upper-division courses. The only upper-division sample course, the Bio-system lab, also had a small number of students enrolled (N=32).
Consequently, results from the Bio-system lab had large standard errors, and the confidence intervals of the effect sizes of change often ranged from small to large. Achieving a more accurate estimation of effect sizes of change and narrowing the confidence intervals would require a much larger sample from an upper-division inquiry-based lab course. Additionally, results in the current study might not be generalizable to inquiry-based lab courses with different student populations. Survey participants in the Engineering Design lab and Bio-system lab were predominantly White and male, and the sample for the qualitative analysis consisted mainly of international students from mainland China or Taiwan.

Finally, the response rates for the Engineering Design lab in the spring and fall semesters were not high (32.9% and 51% respectively), but this is typical for web-based surveys, which normally have lower response rates than the paper surveys used in the other two sample courses. The response rate for interviews was not high either, resulting in a small sample size (N=7). The low response rate may indicate non-response bias. The cross-verification process, including comparing themes from interview data with quantitative analysis results and comparing results across cases, may help minimize this problem of interpretation.

CHAPTER VI: CHANGES IN SKILLS AND ATTITUDES

Quantitative Assessment Results

The mean scores of SKILLS, ATTITUDE, and EXPLORE all increased from pre- to post-course assessment in the three sample cases (see Table 16), but the average score of COPING decreased in the Bio-system lab (Case B) and the Chemistry lab (Case C), indicating more use of passive coping strategies after taking the sample lab. Students in the Bio-system lab reported higher pre- and post-course scores in most areas than those in the other two courses, possibly because students taking the Bio-system lab were juniors while those in the other two courses were mainly freshmen and sophomores. It is likely that students at a higher grade level had better self-perceptions of their skills and attitudes than students at lower grade levels. The university offering the Chemistry lab was more selective than the one offering the Engineering Design lab (Case A), so it was not surprising that the Chemistry lab also had higher pre-course scores in SKILLS, ATTITUDE, and EXPLORE than the Engineering Design lab. However, after taking the sample course, students in the Engineering Design lab reported higher average post-course scores on ATTITUDE, EXPLORE, and COPING than students in the Chemistry lab.

Nonparametric paired tests were used to test for pre-post change. This approach uses each individual as his or her own control and thus increases the statistical power of identifying significant differences between pre- and post-course scores. It is also less influenced by the skewed distributions of ATTITUDE, EXPLORE, and COPING and less affected by the small sample size (N=28) of the Bio-system lab. Accordingly, I used the Wilcoxon signed rank test (signrank) with STATA to test the equality of paired observations (Wilcoxon, 1945). The null hypothesis was that the distributions of pre- and post-course scores were the same.

Table 16. Descriptive Information of Factor Scores in Three Sample Courses.

                    SKILLS (0 ~ 5)     ATTITUDE (-3 ~ 3)  EXPLORE (-3 ~ 3)   COPING (-3 ~ 3)
                    Pre      Post      Pre      Post      Pre      Post      Pre      Post
Case A.  Mean       3.187    3.769     1.102    1.749     1.512    1.916     0.445    0.472
         Std. Dev.  (0.698)  (0.735)   (1.009)  (1.033)   (1.016)  (0.876)   (1.132)  (1.306)
         N.         610      610       526      526       537      537       538      538
Case B.  Mean       3.845    4.065     1.789    2.013     1.727    1.854     0.393    0.253
         Std. Dev.  (0.658)  (0.561)   (0.696)  (0.782)   (0.927)  (0.837)   (1.069)  (1.172)
         N.         28       28        28       28        28       28        28       28
Case C.  Mean       3.422    3.898     1.395    1.714     1.572    1.635     0.424    0.302
         Std. Dev.  (0.700)  (0.652)   (1.001)  (1.080)   (1.066)  (1.084)   (1.078)  (1.201)
         N.         897      897       810      810       836      836       833      833

Note: Case A: Engineering Design Lab; Case B: Biosystem Lab; Case C: Chemistry Lab.
Figure 3 summarizes individual changes in factor scores from pre- to post-course response by categorizing individuals into three groups: positive change, negative change, and no change (i.e., zero). Engineering students taking the Chemistry lab were listed separately.

Figure 3. Changes in Factor Scores from Pre- to Post-course Responses by Case. [Stacked bar charts showing the percentage of positive, negative, and zero change in each factor score for Case A (Engineering Design Lab), Case B (Biosystem Lab), Case C (Chemistry Lab), and engineering students in the Chemistry Lab.]

In all three courses, a majority of students reported positive change in their skill levels (64.3%-77.1%) and a small portion (10.7%-15.1%) showed negative change from pre- to post-course assessment. Table 17 presents the signrank test results (Z scores) and the corresponding effect sizes (Somers' D). The difference between the distributions of pre- and post-course scores was statistically significant (P<0.01) across sample courses. The effect sizes of change in self-perceived skill level were 0.618, 0.536, and 0.635 for the three cases respectively, indicating that it was 61.8%, 53.6%, and 63.5% more likely for a student from these courses to score higher on skill level after taking the sample course than before it. Overall, among the four aspects of learning measured, perceived skill level showed the greatest improvement across cases, especially for the Engineering Design and Chemistry lab students, whose effect sizes of change in perceived skills were large with 95% confidence.

Attitude to teamwork and communication was another aspect of learning that showed significant improvement after the sample course across cases. Students taking the Engineering Design lab had the highest percentage (68.8%) reporting positive change in ATTITUDE (see Figure 3). It was 57.4% more likely for students taking this course to have an improved attitude to teamwork and communication after taking the sample course than before it (see Table 17). In the other two courses, about half of the students reported positive change in ATTITUDE, and the likelihood for a student to score higher on ATTITUDE in the post- than pre-course assessment was 46.4% and 35.2% higher respectively. In general, students in the Engineering Design lab showed a large improvement and those in the Chemistry lab showed a medium positive change in their attitudes about teamwork and communication. The Bio-system lab had a wide confidence interval, with the effect size of change in this factor score ranging from small to large for students taking this course.
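Read literally, the effect sizes just cited follow from the paired reading of Somers' D assumed in this chapter (a clarifying restatement; the exact estimator is Newson's):

    D = P(\text{post} > \text{pre}) - P(\text{post} < \text{pre})

so that D = 0.618 for SKILLS in Case A means that scoring higher after the course was 61.8 percentage points more probable than scoring lower.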
Changes in EXPLORE and COPING were comparatively less evident than those in SKILLS and ATTITUDE across cases. Half of the students in the Engineering Design lab reported increased intention to explore, but only about one third reported positive change in their intention to explore in the Bio-system and Chemistry labs (see Figure 3). With 95% confidence, the effect size of change was moderate in the Engineering Design lab (0.364-0.474) and small in the Chemistry lab (0.104-0.185) (see Table 17). The intervals covering values below 0.1 indicated that the intention to explore might not have changed after taking the Bio-system lab or, for engineering students, the Chemistry lab. Therefore, it was not possible to conclude that EXPLORE improved for the Bio-system lab or for engineering students in the Chemistry lab.

In terms of coping strategy, the Engineering Design lab and Chemistry lab had about one third of the students falling into each category of change, while 53.6% of the students taking the Bio-system lab reported no change in their coping strategy during problem solving. Change in coping strategy was not statistically significant in the Engineering Design lab. Yet students in the Bio-system lab (Z=-1.976, P<0.05, effect size=-0.25) and the Chemistry lab (Z=-3.043, P<0.01, effect size=-0.054) reported significant negative change in coping strategies. Items loaded on COPING measured passive coping strategies; negative change means that students became more likely to use passive coping strategies after taking the sample course, but the effect sizes of these negative changes were small. Individuals can use both types of strategies simultaneously, so greater use of passive coping strategies does not necessarily lead to less application of active coping strategies. As mentioned earlier, items measuring active coping strategies in the survey did not fit the factor structure resulting from the factor analysis, so items measuring active coping were listed separately (see Table 17). A comparison of pre-course and post-course score distributions for active coping showed a significant increase in students' intention to discuss issues with teachers and peers and to simplify the question to make it easier to solve. With 95% confidence, these two aspects of active coping had a small to medium positive change across sample courses. Overall, students in the Engineering Design lab showed greater improvement in using active coping strategies than those in the other two sample courses: all five aspects of active coping had small to medium positive changes as a result of the Engineering Design lab. Students taking the Chemistry lab also reported a small positive change in their intention to keep trying when facing stress or difficulty in solving science or engineering problems.

In summary, among the four outcome variables, students demonstrated the greatest improvement in perceived skills, followed by their attitude to teamwork and communication. Only students from the Engineering Design lab also showed significant improvement, with a moderate effect size of change, in their intention to explore new things; such improvement was not evident in the other two sample courses. Students' adoption of passive coping strategies did not show substantial change after the sample course. Although students in the Bio-system lab and Chemistry lab reported significantly more adoption of passive coping strategies in the post-course assessment, the effect size was either small or had a wide confidence interval.
As the range of a confidence interval is negatively associated with sample size, the Engineering Design lab and the Chemistry lab both had narrow 95% confidence intervals of Somers' D relative to the point estimate of each outcome variable, suggesting that the estimated effect sizes of change were relatively stable. However, the Bio-system lab had a small sample size (N=28) and thus had wide 95% confidence intervals of Somers' D.

Table 17. Signrank Test Results and Effect Sizes of Change.

Case           Factor/Variable   N.    Z            Somers' D   Std. Err.   95% CI of Somers' D
Case A.        SKILLS            610   15.955***    0.618***    0.048       0.556    0.673
               ATTITUDE          526   14.354***    0.574***    0.045       0.512    0.630
               EXPLORE           537   12.542***    0.421***    0.034       0.364    0.474
               COPING            538   1.343        0.045       0.035       -0.024   0.113
Active Coping  -Discuss          538   12.089***    0.377***    0.031       0.325    0.427
               -Simplify         538   10.787***    0.325***    0.029       0.273    0.375
               -Calm             538   11.199***    0.331***    0.029       0.279    0.381
               -Trying           538   10.148***    0.297***    0.029       0.250    0.363
               -Persevere        538   8.618***     0.236***    0.027       0.188    0.293
Case B.        SKILLS            28    3.287**      0.536**     0.184       0.234    0.743
               ATTITUDE          28    3.123**      0.464**     0.139       0.227    0.650
               EXPLORE           28    2.210*       0.250*      0.104       0.051    0.430
               COPING            28    -1.976*      -0.250*     0.130       -0.470   0.000
Active Coping  -Discuss          28    3.302***     0.392***    0.111       0.198    0.633
               -Simplify         28    2.072*       0.286*      0.123       0.052    0.536
               -Calm             28    1.380        0.143       0.101       -0.054   0.342
               -Trying           28    1.603        0.179       0.107       -0.029   0.390
               -Persevere        28    1.131*       0.107       0.095       -0.079   0.294
Case C.        SKILLS            897   19.583***    0.635***    0.040       0.587    0.680
All students   ATTITUDE          810   11.968***    0.352***    0.030       0.300    0.402
               EXPLORE           836   6.391***     0.145**     0.021       0.104    0.185
               COPING            833   -3.042**     -0.054*     0.027       -0.107   0.000
Active Coping  -Discuss          833   12.142***    0.293***    0.023       0.250    0.334
               -Simplify         833   9.127***     0.193***    0.020       0.156    0.236
               -Calm             833   5.912***     0.137***    0.022       0.095    0.178
               -Trying           833   9.591***     0.205***    0.021       0.167    0.250
               -Persevere        833   6.070***     0.118***    0.018       0.082    0.154
Case C.        SKILLS            459   13.578***    0.623***    0.055       0.553    0.684
Engineering    ATTITUDE          415   8.213***     0.345***    0.041       0.278    0.440
students       EXPLORE           424   3.531***     0.118**     0.030       0.059    0.177
               COPING            423   -2.128*      -0.045      0.039       -0.120   0.031
Active Coping  -Discuss          423   9.086***     0.301***    0.032       0.242    0.357
               -Simplify         423   6.347***     0.174***    0.027       0.123    0.228
               -Calm             423   4.267***     0.127***    0.028       0.072    0.182
               -Trying           423   7.269***     0.208***    0.029       0.155    0.268
               -Persevere        423   5.202***     0.132***    0.024       0.086    0.180

Notes: *P<0.05; **P<0.01; ***P<0.001. Z: Wilcoxon signed rank test result; Std. Err.: jackknife standard error; 95% CI of Somers' D: asymmetric confidence interval for untransformed D, calculated from symmetric confidence intervals for the transformed parameters (Newson, 2006).

Qualitative Data from the Engineering Design Lab

Major themes that emerged from the interview data were related to teamwork skills and the problem solving process. Unlike many traditional instructional labs that emphasize conceptual learning, this freshman course focused more on generic skills, particularly the process of solving problems in a team-based environment, so this finding was expected. When students were asked to describe their design projects, they said, "I don't think it was a type of engineering work.
We were just playing"; "it did not require much engineering knowledge"; or "compared to engineering specific knowledge and skills, I learned more about group work." Almost all interviewees acknowledged that the course helped them learn teamwork skills, collaboration, or communication in a team, or made them aware of the benefits of teamwork: "I realized how teammates can apply their different strengths to teamwork;" "to do projects in this course, it was very important to know how to express ideas accurately…First, be a good listener. And then, know how to give feedback to others' ideas."

Qing, who was a senior student when interviewed, took the course five years ago. She said the first thing that she learned from this course was how to coordinate a team: "everyone has his or her own talents. How to take advantage of different people's strengths was one thing to learn. For example, we had a mechanical engineering student who was not good at writing and another electrical engineering student who was not good at hands-on stuff."

Similar to Qing, Li also took the course five years ago. Reflecting on his learning experience in this course, he was not satisfied with the extent to which the course emphasized domain-specific knowledge and skills. He thought that the course had contributed little to his learning in chemical engineering in subsequent years. However, he admitted that the course helped him handle future teamwork. [Translated from Chinese] I learned how to do teamwork and how to compromise when you raised a lot of ideas, but your team members did not accept...I learned how to implement the project with team members. Sometimes, unexpected things happened, such as team members not showing up or not being reachable... In the past, I used to prefer doing projects by myself and thought that was more efficient. After this course, when I was confronted with similar situations in teamwork, I could handle them better emotionally or in terms of negotiating ideas with teammates.

Equally frequently mentioned was the problem solving process. Lei, who took the course when he was a sophomore, described the problem solving process in this course as follows. [Translated from Chinese] What has been practiced most in these projects was how to transfer ideas into products. Take the first project for example. First, we had a picture in mind of what the robot would be like. Then, we had to figure out how to apply these ideas to our product through programming and minor adjustments, so that the robot could function as we planned…In the second project, we proposed three plans. Then, we needed to decide which one was feasible and how to implement it.

Qing also found the design process taught in the lecture and practiced in lab projects helpful, and indicated that she later applied the design process to many other course projects. [Translated from Chinese] The instructor emphasized the engineering design process repeatedly on different occasions...We were taught a five-step method, including stating the problem, collecting information, generating multiple designs, selecting the best design, and testing it. This method really helped me do projects in the following 200 and 300 level courses. While I was doing projects in this course, I did not recognize that this method was so beneficial. However, in the following courses, I gradually noticed that this method was very helpful.

Interview data showed that these students paid attention to the learning process when doing the inquiry-based design projects.
Interviewees' descriptions of teamwork emphasized the interaction and communication among team members. The problem solving process acknowledged by interviewees was consistent with the characteristics of divergent thinking ability (Guilford, 1967). However, as mentioned earlier, a limitation of the qualitative data is that almost all interviewees were international students from mainland China; students of other national or racial groups might hold different views. Therefore, the interview results should not be generalized to all students taking the Engineering Design lab.

Discussion

Results in this chapter revealed several positive student learning outcomes in the sample courses. Certain learning outcomes stood out in all sample classes. In particular, self-perceived skills in conducting lab work and attitude to teamwork and communication improved across the three sample classes, with effect sizes ranging from small to large with 95% confidence. In the current study, skills in conducting lab work refer to teamwork and communication skills and divergent thinking ability, all of which are important skills involved in solving ill-structured problems. Interview data also reflected the same theme: the problem solving process and teamwork in the Engineering Design lab had impressed interviewees the most. These findings were consistent with prior studies on inquiry-based or problem-based learning that revealed a robust positive effect on skill development (Albanese & Mitchell, 1993; Gijbels et al., 2005; Sharp & Primrose, 2003; Vernon & Blake, 1993). As suggested by some researchers, generic skills such as teamwork and time management should be learned through first-hand experience rather than in lecture halls (Billet, Camy, & Coufort, 2010; Johnson & Johnson, 1999; Smith et al., 2005). Moreover, inquiry-based cooperative learning is more reliant on effective teamwork than cookbook lab learning (Allen et al., 1996; King et al., 1999). Findings from the current study lend support to these benefits of inquiry-based lab instruction in engineering, suggesting that utilizing inquiry-based teamwork in lab courses is an effective way of improving engineering graduates' creativity and personal skills.

Emphasizing learners' innate curiosity for discovering knowledge is an essential feature of inquiry-based learning (Barell, 2007). Empowering learners to explore through asking questions, gathering relevant information, and restructuring meaning is an important goal of the inquiry-based instructional approach (Grabe & Grabe, 2000; Short, Harste, & Burke, 1996; Smith, 2000). Assessment results for the changes in students' intention to explore (or curiosity) after the sample course were generally positive across sample cases. However, the effect sizes of change varied across sample cases: the effect size of positive change in the intention to explore was medium in the Engineering Design lab but small in the other two sample cases. The difference in the improvement of students' curiosity across sample cases raised a further question about which elements of an inquiry-based lab may contribute to encouraging students to explore new things. In the current case context, the Engineering Design lab left students more freedom to explore in designing and conducting lab projects. Compared to the other two courses, the Engineering Design lab gave less emphasis to conceptual learning and allowed students more freedom in choosing a topic of their own interest in the final project.
Lab projects in the other two courses, on the other hand, were more constrained by discipline-specific knowledge and concepts. In the Bio-system lab, student teams designed the same equipment, and the six experiments in the Chemistry lab strictly followed a logical sequence to help students make sense of key concepts that needed to be covered during the semester. The Engineering Design lab therefore allowed a greater extent of self-directedness in conducting lab projects, and self-directedness, when utilized appropriately, fosters creative thinking (Brockett & Hiemstra, 1991; Mishra et al., 2013).

Educators generally believe that the inquiry-based approach is more appropriate for upper-level courses. In the current engineering curriculum, inquiry-based experiments or projects are usually found in upper-division, especially senior capstone, courses (Sheppard et al., 2009; Sheppard & Jenison, 1997). The purpose of employing an inquiry-based approach in lower-division courses was usually to increase students' interest in science and engineering, connect students with professors and peers, or introduce students to different disciplines (Allen et al., 2002; Schoch et al., 1997; Wuersig, 2007). However, results from the current study revealed that the application of the inquiry-based approach in lower-division courses has a greater potential than conventionally believed. The two introductory courses in the current study, the Engineering Design lab and the Chemistry lab, had more students reporting positive improvement in each of the four areas of learning than the junior-level design course in bio-systems. Because of the small sample size of the Bio-system lab, the available data were insufficient to conclude whether the two lower-division courses in the sample were more effective than the junior-level lab in achieving the desired lab learning outcomes. Still, the effect sizes for the improvement achieved by the two introductory courses in the sample appeared to be at least comparable to the effect sizes of change for students in the junior-level Bio-system design lab. Nevertheless, assessment of self-perceived skill level in the Bio-system lab might also be subject to a measurement limitation. The descriptive statistics (see Table 16) showed that the average self-perceived skill level before taking this course was already high (3.845/5) and higher than the average pre-course skill levels in the other two sample courses. A ceiling effect might have occurred, as the pre-course scores of many individuals were already at or close to the highest possible score on the scale, decreasing the likelihood that the survey instrument could detect change in this domain.

CHAPTER VII: TASK COMPLEXITY AND LEARNING OUTCOMES

To address the second group of research questions, about whether difficulty and workload influenced learning outcomes, multivariate regression was performed with STATA to test the hypothesis that difficulty and workload had a polynomial relationship with learning outcomes, using data from the Chemistry lab. Since only half of the students taking the Engineering Design lab in fall 2013 were asked the questions about the difficulty and workload of the sample course, the sample size of the Engineering Design lab (N=170) was too small for conducting multivariate regression analysis.
This chapter first summarizes students' perceptions of task complexity, previous learning experience, and approaches to learning in the three sample courses, followed by multivariate regression analysis results based on data from the Chemistry lab, and ends with a discussion of the major findings.

Descriptive Results

The 3Ps model suggests that students' perception of the learning context, learning approaches, relevant learning experiences elsewhere, and demographic background all contribute to student learning outcomes in the sample course. A descriptive analysis of student-related variables revealed variations among the sample courses. Descriptive statistics (Table 18) showed that the average perceived difficulty of the Chemistry lab (Case C) (M=6.987, SD=1.609) was higher than that of the Engineering Design lab (Case A) (M=3.990, SD=2.054). The perceived workload of both sample courses was in the upper middle range (Case A: M=6.099, SD=2.372; Case C: M=6.877, SD=1.935). The large standard deviations of these two variables in both sample cases suggested considerable variation among students in their perceptions of the difficulty and workload of the course.

Table 18. Description of Students' Perceptions of Task Complexity, Relevant Learning Experience, and Learning Approaches.

                                      Case A.                 Case B.                Case C.
Variable             Min  Max   N.    Mean    Std. Dev.  N.   Mean    Std. Dev.  N.   Mean    Std. Dev.
Difficulty*           1   10    203   3.990   2.054      -    -       -          780  6.987   1.609
Workload*             1   10    212   6.099   2.372      -    -       -          780  6.877   1.935
Problem solving       0    4    581   2.337   1.055      27   3.000   0.600      775  2.530   0.799
experience
Outcomes-related      0    4    581   2.443   0.100      27   3.052   0.727      775  2.794   0.926
experience
Deep learning        -3    3    530   0.822   1.036      28   0.682   1.015      781  1.092   0.978
approach
Superficial learning -3    3    530   0.120   1.205      28   -0.543  1.062      781  -0.299  1.249
approach**

Notes: *Difficulty and workload were newly added items in the second wave of data collection; surveys conducted in half of the lab sections in Case A and all sections in Case C included these two questions. **The superficial learning approach was reversely coded, so that larger values stand for less application of the superficial learning approach.

In terms of students' relevant learning experiences outside of the sample course, the other courses that students had taken gave, on average, a moderate to strong emphasis to problem solving or to the non-technical professional skills and attitudes examined in the current study. It was within expectation that junior students taking the Bio-system lab (Case B) scored higher on both measures of experience outside the sample course (M=3.000 and 3.052, SD=0.600 and 0.727) than students from the other two lower-division courses: upper-division students had taken more courses in their degree programs than freshmen or sophomores, and upper-division courses were more likely to involve complicated problem solving than freshman courses.

Across the three sample courses, the expected extensive adoption of a deep learning approach and rejection of a superficial approach did not occur. The average use of a deep learning approach was 0.822, 0.682, and 1.092 in the three sample courses respectively, with standard deviations around 1. Students in the Engineering Design lab rated the superficial approach low, and the ratings for the Bio-system lab (M=-0.543, SD=1.062) and Chemistry lab (M=-0.299, SD=1.249) were negative. Since reverse coding was applied to the measures of the superficial approach, low and negative scores indicated that students had used the superficial approach to some extent.
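Before turning to the formal models, the overall shape of the analysis (four outcome equations estimated jointly, with multivariate tests of each predictor) can be sketched in Python with statsmodels. The data frame below is a simulated stand-in, not the study's data, and the column names are illustrative.

    import numpy as np
    import pandas as pd
    from statsmodels.multivariate.manova import MANOVA

    rng = np.random.default_rng(4)
    n = 760
    df = pd.DataFrame({"difficulty": rng.integers(1, 11, n).astype(float),
                       "workload": rng.integers(1, 11, n).astype(float)})
    for y in ["skills", "attitude", "explore", "coping"]:
        # Simulated concave response to difficulty, plus noise.
        df[y] = (0.10 * df["difficulty"] - 0.008 * df["difficulty"] ** 2
                 + rng.normal(0, 0.8, n))
    df["difficulty_sq"] = df["difficulty"] ** 2
    df["workload_sq"] = df["workload"] ** 2

    # Multivariate tests (Wilks' lambda and others) of each predictor
    # against the four outcomes jointly.
    mv = MANOVA.from_formula(
        "skills + attitude + explore + coping ~ "
        "difficulty + difficulty_sq + workload + workload_sq", data=df)
    print(mv.mv_test())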
Polynomial Terms in the Model

First, incremental F tests were used to test the hypothesis that a linear term is sufficient in the model; a non-significant result supports adding polynomial terms. Wald tests were also performed to examine whether polynomial terms of difficulty and workload (powers 1 through 4) were necessary in the model. Incremental F tests suggested including at least one polynomial term for difficulty in the models predicting SKILLS, ATTITUDE, and EXPLORE (F(1, 758)=0.30-3.78, p=0.052-0.582). Wald test results showed that polynomial terms were significant in the models predicting SKILLS, ATTITUDE, and COPING (F(3, 755)=2.92-7, p=0.0001-0.033). Therefore, the technical analysis, coupled with the theoretical reasons from the literature, supported a polynomial relationship between perceived difficulty and learning outcomes.

Figure 4 presents the curve fits for the outcome variables over different values of difficulty (1-10). The relationship between difficulty and SKILLS showed a convex curve with the apex at the bottom, and the rest of the curves in Figure 4 were concave with the apex at the top. The concave curves were aligned with the hypothesis that higher difficulty of the course initially increases student performance, but after a certain level of difficulty is reached, additional difficulty decreases student performance. The depth of the curves for SKILLS and EXPLORE was not large, indicating that the departure from linearity was not obvious, especially when ratings of difficulty fell in the mid-range. Polynomial terms for perceived workload were found significant only in the models predicting EXPLORE (incremental F: F(1, 758)=1.83, p=0.177; Wald test: F(3, 775)=3.28, p=0.021) and COPING (incremental F: F(1, 758)=2.64, p=0.105; Wald test: F(3, 775)=2.94, p=0.03). The curve fits for these two outcome variables on perceived workload showed concave curves with shallow depth (see Figure 5).

Figure 4. Curve Fit for Outcome Variables on Perceived Difficulty of the Course (Chemistry Lab). [Four panels plot fitted curves for SKILLS, ATTITUDE, EXPLORE, and COPING against difficulty ratings of 1-10.]

Figure 5. Curve Fit for EXPLORE and COPING on Perceived Workload of the Course (Chemistry Lab). [Two panels plot fitted curves for EXPLORE and COPING against workload ratings of 1-10.]

Overall, when values of perceived difficulty and workload were low (i.e., 1-4), difficulty and workload showed a stronger positive influence on learning outcomes, except for SKILLS, which had a negative relationship with difficulty. However, both difficulty and workload were slightly negatively skewed (skewness=-0.7 and -0.3), with few cases in the lower range. As a result, estimates in the lower range of difficulty and workload might have large standard errors and should be interpreted with caution; cases falling in the lower range of difficulty and workload might be outliers that distorted the findings. Additionally, the evidence for including polynomial terms of workload for predicting SKILLS and ATTITUDE was not strong, so a multivariate regression model that included a quadratic term might not fit well in predicting the effect of workload on these two outcome variables. To address these concerns, Wald tests were used in post-estimation to examine the significance of the linear and quadratic terms.
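The linear-versus-quadratic comparison just described can be illustrated with a short statsmodels sketch. The original tests were run in Stata; the data and coefficients below are simulated stand-ins chosen only to produce a visible concave effect.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(5)
    n = 760
    df = pd.DataFrame({"difficulty": rng.integers(1, 11, n).astype(float),
                       "attitude_pre": rng.normal(1.4, 1.0, n)})
    df["attitude_post"] = (0.9 * df["attitude_pre"] + 0.04 * df["difficulty"]
                           - 0.003 * df["difficulty"] ** 2
                           + rng.normal(0, 0.3, n))
    df["difficulty_sq"] = df["difficulty"] ** 2

    linear = smf.ols("attitude_post ~ attitude_pre + difficulty", df).fit()
    quad = smf.ols("attitude_post ~ attitude_pre + difficulty + difficulty_sq",
                   df).fit()

    # Incremental F test: does adding the squared term improve fit?
    print(anova_lm(linear, quad))
    # Wald test on the quadratic coefficient alone.
    print(quad.wald_test("difficulty_sq = 0", use_f=True))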
Additionally, the normality and homoscedasticity of residuals were examined to ensure that no substantial outliers existed and that the probability of a Type I error caused by heteroscedasticity was low (Osborne & Waters, 2002).

Assumption of Normality

The latent outcome variables, measured by four Bartlett scores (SKILLS, ATTITUDE, EXPLORE, and COPING), did not meet the univariate and multivariate normality assumptions. Thus, a Box-Cox transformation was performed with STATA using the mboxcox command, followed by a scaled power transformation using the mbctrans command. Using the scaled power transformation preserves the direction of association between the outcome variables and the independent variables, as well as the ordering of scores on each outcome variable (Lindsey & Sheather, 2010). A disadvantage of using transformed dependent variables, however, is that interpretation becomes more complicated: the regression coefficients no longer stand for the increase in the dependent variable per unit increase in the independent variable. The interpretation will therefore focus on the direction of each effect and its significance. The transformed dependent variables met the assumption of univariate normality, with skewness between -0.2 and 0.2 and kurtosis between 2 and 3. The test for Mardia multivariate kurtosis was not statistically significant (P=0.487), so the assumption of normality was not rejected. The test for Mardia multivariate skewness was still significant (P<0.01); however, for large samples, even a slight departure from normality will lead to statistical significance (Kline, 2009). The value of Mardia multivariate skewness was 0.347, which was still within the accepted range for multivariate normality (i.e., -3 to 3) (e.g., Siekpe, 2005). In general, the dependent variables were approximately normally distributed after transformation.
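A univariate analogue of this step can be sketched with scipy. The study used Stata's mboxcox and mbctrans for the multivariate, scaled version; the sketch below assumes simple shifted data, since Box-Cox requires strictly positive input, and the scores are simulated.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    # Hypothetical negatively skewed factor scores on a -3 to 3 style scale.
    scores = 3 - rng.gamma(shape=2.0, scale=0.8, size=800)

    shifted = scores - scores.min() + 0.01      # make strictly positive
    transformed, lam = stats.boxcox(shifted)

    print(f"lambda = {lam:.3f}")
    print(f"skewness: {stats.skew(scores):.2f} -> {stats.skew(transformed):.2f}")
    # Pearson kurtosis (normal = 3), the convention used in the text.
    print(f"kurtosis: {stats.kurtosis(transformed, fisher=False):.2f}")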
Results for the Chemistry Lab

The Wilks' Lambda multivariate test indicated a significant multivariate relation between the predictors and the four learning outcome measures, F(72, 2904)=52.46, p<0.0001. Univariate F tests for each dependent variable showed that the pre-course assessments of SKILLS (F(4, 738)=126.4, p<0.0001), ATTITUDE (F(4, 738)=112.27, p<0.0001), EXPLORE (F(4, 738)=216.2, p<0.0001), and COPING (F(4, 738)=227.98, p<0.0001), as well as difficulty (F(4, 738)=5.12, p<0.001), difficulty^2 (F(4, 738)=5.07, p<0.001), deep approach (F(4, 738)=28.89, p<0.0001), and superficial approach (F(4, 738)=14.55, p<0.0001), were significantly related to the four outcome measures. However, no univariate significance was found for workload (F(4, 738)=0.48), workload^2 (F(4, 738)=0.71), problem solving experience in other courses (F(4, 738)=0.50), experience related to the outcomes measured (F(4, 738)=1.40), major (F(4, 738)=1.27), race (F(4, 738)=1.22), or gender (F(4, 738)=0.77). The latter group of variables was retained in the final model because they were theoretically important for examining learning outcomes. Additionally, removing insignificant variables from the model proved to have minor influence on the output, indicating that the current analysis was robust to alternative specifications. As a supplement to the regression procedure, the joint significance of all four equations was tested. To be conservative and reduce Type I error, p-values greater than 0.05/60=0.00083 were declared non-significant at the 5% level, because 4 dependent variables and 15 independent variables resulted in 60 tests of significance. The F test result was significant, with a p-value smaller than 0.00083 (F(60, 744)=69.08, p<0.0001). Therefore, the four equations were jointly significant.

Difficulty

Table 19. Multivariate Multiple Regression Results for Case C. Chemistry Lab, with Standard Errors in Parentheses (N=760).

                        SKILLS             ATTITUDE           EXPLORE            COPING
Skills_pre              0.420*** (0.023)   -0.046*** (0.007)  -0.003 (0.002)     -0.003 (0.022)
Attitude_pre            0.317* (0.149)     0.915*** (0.046)   -0.008 (0.011)     -0.021 (0.138)
Explore_pre             -0.246 (0.334)     -0.183 (0.103)     0.674*** (0.026)   0.307 (0.309)
Coping_pre              -0.002 (0.026)     0.006 (0.008)      0.003 (0.002)      0.734*** (0.024)
Difficulty              -0.004 (0.045)     0.043*** (0.014)   0.006 (0.003)      0.132*** (0.042)
Difficulty^2            0.001 (0.003)      -0.003*** (0.001)  0.000 (0.000)      -0.010*** (0.003)
Workload                0.003 (0.043)      -0.007 (0.013)     0.002 (0.003)      0.046 (0.040)
Workload^2              -0.001 (0.003)     0.000 (0.001)      0.000 (0.000)      -0.005 (0.003)
Problem solving         0.019 (0.019)      0.005 (0.006)      -0.001 (0.001)     0.006 (0.018)
experience
Experience relevant     0.031 (0.016)      0.009 (0.005)      0.002 (0.001)      0.006 (0.015)
to outcomes measured
Deep approach           0.093*** (0.014)   0.024*** (0.004)   0.011*** (0.001)   0.046*** (0.013)
Superficial approach    0.031** (0.012)    -0.004 (0.004)     0.001 (0.001)      0.074*** (0.011)
Major (1-engineering,   -0.020 (0.028)     0.006 (0.008)      0.003 (0.002)      0.037 (0.025)
0-non-engineering)
Sex (1-female, 0-male)  -0.001 (0.027)     0.009 (0.008)      0.000 (0.002)      0.028 (0.025)
Race (1-non-White,      -0.042 (0.028)     -0.003 (0.008)     0.002 (0.002)      0.056* (0.026)
0-White)
_cons                   1.524*** (0.180)   0.227*** (0.055)   0.100*** (0.014)   1.869*** (0.166)
R^2                     0.444              0.453              0.672              0.696

* P<0.05; ** P<0.01; *** P<0.001

Figure 6. Relationship Between the Level of Difficulty and Its Effects on ATTITUDE and COPING. [Two panels plot the estimated effect of difficulty (1-10) on ATTITUDE and on COPING.]

The individual coefficients from the multivariate regression analysis presented in Table 19 are equal to those from separate regression analyses. Difficulty and difficulty^2 had significant effects on students' attitude to teamwork and on their coping strategies when confronted with obstacles in solving problems. In both equations, the coefficient of difficulty was positive and the coefficient of difficulty^2 was negative. This result is consistent with the descriptive analysis, showing that an increase in difficulty had a positive effect on these two learning outcomes but that, at some point, the influence began to decline. If the range of difficulty were larger, the effect could become negative when students' perception of difficulty is extremely high, but the current data were insufficient to draw this conclusion. Using the coefficients from the multivariate regression analysis, the curves in Figure 6 present the relationship between the perceived difficulty level and the effect of difficulty on students' attitude to teamwork and communication and on their coping strategies respectively, holding the other variables constant. In the current dataset, difficulty was measured on a scale from 1 to 10, shown on the horizontal axis. In general, increased difficulty led to a better attitude to teamwork and communication and better coping strategies. However, starting from difficulty level 7, the benefit of increasing the difficulty of the course started to decline for both learning outcomes.
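As a quick arithmetic check on that turning point, a quadratic effect of the form \(\beta_1 d + \beta_2 d^2\) peaks at \(d^* = -\beta_1 / (2\beta_2)\). Substituting the ATTITUDE and COPING coefficients from Table 19:

    d^*_{\text{ATTITUDE}} = 0.043 / (2 \times 0.003) \approx 7.2
    d^*_{\text{COPING}}   = 0.132 / (2 \times 0.010) = 6.6

both of which fall near the difficulty level of 7 at which the estimated benefit begins to decline.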
To examine whether difficulty and difficulty^2, individually or as a group, had significant effects on the four outcome variables, Wald tests were performed. Following the conservative rule, a p-value smaller than 0.05/4=0.0125 was declared statistically significant at the 5% level for individual effects, because there were four dependent variables. As two tested variables and four dependent variables resulted in 8 tests, the cutoff for combined significance was 0.05/8=0.0063. Results showed that difficulty (F(4, 741)=5.14, p=0.0004) and difficulty^2 (F(4, 741)=5.09, p=0.0005) were individually significant. However, for difficulty and difficulty^2 as a group, the reported significance level (F(8, 241)=2.64, p=0.0075) was slightly above the conservative criterion for significance at the 5% level. This may be attributed to the SKILLS and EXPLORE equations, in which difficulty and difficulty^2 did not show significant influence on the outcome variables.

Learning Approach

Consistent with the 3Ps model, results (see Table 19) revealed that greater employment of a deep approach to learning positively influenced each learning outcome. Less employment of a superficial approach in the sample course also positively influenced students' perceived skill level and their coping strategies. The deep approach (F(4, 741)=29.01, p<0.0001) and the superficial approach (F(4, 741)=14.61, p<0.0001) were individually significant in the model. The deep and superficial learning approaches, as a group, also had a significant effect on the outcome variables (F(8, 741)=24.24, p<0.0001).

Post Estimation Analysis

Normality of Residuals

Regression analysis requires residuals to be normally distributed, and several steps were taken to test this assumption. First, kernel density estimates, P-P plots, and Q-Q plots (see Figure 7) were used to check the distribution of the residuals. Residuals did not show departure from normality for the equations that predicted SKILLS, ATTITUDE, and COPING. However, a slight departure from normality was found in the equation that predicted EXPLORE. The kernel density estimate, which compares the probability density of the residuals with the normal density, showed that the residuals for the equation predicting EXPLORE deviated slightly from the normal density. The P-P plot for this equation also showed a slight departure from normality in the middle range of the data, and the Q-Q plot, which is sensitive to non-normality near the tails, indicated an obvious departure from normality in the higher range of the data. Since the normality of residuals influences the validity of hypothesis testing, the inter-quartile range (iqr) test, which assumes a symmetric distribution of residuals (Hamilton, 2008), was used to identify whether severe outliers existed. Only a few (fewer than 7) mild outliers were found in each equation, and no severe outliers were found. Overall, the residuals had an approximately normal distribution.

Homoscedasticity of Residuals

Another major assumption of multiple regression analysis is homogeneity of the variance of the residuals. The Breusch-Pagan test was used to test the null hypothesis that the variance of the residuals was homogeneous. The results for the four equations were not statistically significant, so there was no evidence of heteroscedasticity of the residuals.

Figure 7. Normality and Homoscedasticity of Residuals (left to right: Kernel Density Estimate, P-P Plot, and Q-Q Plot; rows: SKILLS, ATTITUDE, EXPLORE, COPING).
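These residual checks translate directly into a few lines of Python. The sketch below uses simulated, well-behaved data as a stand-in for the study's equations and relies on statsmodels' Breusch-Pagan and Q-Q plot utilities.

    import numpy as np
    import statsmodels.api as sm
    from statsmodels.stats.diagnostic import het_breuschpagan

    rng = np.random.default_rng(7)
    X = sm.add_constant(rng.normal(size=(760, 2)))     # hypothetical predictors
    y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(scale=0.4, size=760)
    fit = sm.OLS(y, X).fit()

    # Breusch-Pagan: H0 is homoscedastic residuals. A large p-value, as
    # reported for all four equations above, gives no evidence of
    # heteroscedasticity.
    lm_stat, lm_p, f_stat, f_p = het_breuschpagan(fit.resid, fit.model.exog)
    print(f"Breusch-Pagan LM = {lm_stat:.2f}, p = {lm_p:.3f}")

    # Q-Q plot of residuals against a fitted normal; sensitive to the tails.
    fig = sm.qqplot(fit.resid, line="s")
    fig.savefig("qq_residuals.png")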
Results for the Engineering Design Lab

Following a procedure similar to that described above, a Box-Cox transformation was performed with STATA to bring the outcome variables to approximately univariate normal distributions. After the transformation, the skewness of each variable was between -0.2 and 0.2 and the kurtosis was between 2 and 3. Neither the incremental F tests nor the Wald tests showed evidence of a polynomial relationship between difficulty and workload and the four learning outcomes; therefore, it was not necessary to include polynomial terms in the regression models. The different relationships between difficulty and learning outcomes in the Engineering Design lab and the Chemistry lab can be largely attributed to the difference in the overall difficulty of these two courses. Students taking the Engineering Design lab rated the difficulty low (M=3.990, SD=2.054); only about 10% of the students taking this course rated the difficulty above 6 out of 10, whereas in the Chemistry lab over 70% of the students rated the difficulty level above 6. Therefore, for a majority of students in the Engineering Design lab, the level of difficulty had not yet reached the point at which the benefit of increasing difficulty begins to decline.

Separate regression analyses did not show a significant relationship between difficulty and learning outcomes for the Engineering Design lab. As mentioned above, ratings of the difficulty of the Engineering Design lab largely ranged between low and medium levels. It is likely that when students largely found this course easy, perceived difficulty was only weakly associated with learning outcomes.

Similar to the results for the Chemistry lab, greater use of a deep learning approach was associated with better learning outcomes in SKILLS, ATTITUDE, and EXPLORE for Engineering Design lab students. Decreased application of superficial approaches to learning also showed robust positive effects on SKILLS and COPING in both samples.

Table 20. Separate Multiple Regression Results for the Engineering Design Lab (N=174).

                               SKILLS     ATTITUDE   EXPLORE    COPING
Skills_pre                     0.554***   -0.017     0.005      0.047
Attitude_pre                   0.791      0.765***   0.014      0.245
Explore_pre                    3.144      0.097      0.911***   -0.948
Coping_pre                     -0.066     -0.006     0.007*     1.079***
Difficulty                     0.013      0.001      0.001      -0.018
Workload                       0.025      0.004      0.001      0.010
Problem solving experience     -0.030     -0.005     0.005      0.066
Experience relevant to         0.102      0.019      -0.002     -0.100
outcomes measured
Deep learning approach         0.090*     0.018***   0.006**    0.032
Superficial learning approach  0.113**    0.013*     0.005**    0.120***
Sex (1-female, 0-male)         -0.224*    -0.028     -0.005     -0.080
Race (1-non-White, 0-White)    -0.135     -0.022     -0.012*    0.101
_cons                          2.034      0.220      0.090      0.047
Adj. R square                  0.554      -0.017     0.005***   0.245***

* P<0.05; ** P<0.01; *** P<0.001

Discussion

By integrating the 3Ps model with the ongoing debate over the effectiveness of inquiry-based and similar instructional approaches, the current study makes a contribution by providing more specific lenses with which to examine how the difficulty and workload of an inquiry-based course influence learning outcomes. The existing literature pointed out that complex learning activities in inquiry-based courses are two-sided. On the one hand, deep learning is likely to occur when inquiry-based activities challenge students' existing perceptions and understanding and engage students in collecting, integrating, and synthesizing information (Hu et al., 2008; Prins et al., 2008; Tagg, 2003). On the other hand, complex learning tasks and minimal guidance in an inquiry-based learning context may cause a heavy working memory load and distract students from the major learning goals (Kirschner et al., 2006).
Findings based on the data from the Chemistry lab partly support the benefit of exposing students to challenging learning tasks when improving students' attitudes to teamwork, communication, and failure is the learning objective, because perceived difficulty was, in general, positively associated with better attitudinal outcomes. However, the benefit of increased difficulty for attitude to teamwork and communication and for coping strategies starts to decline after an upper middle difficulty level. This finding suggests that the concern about the effect of the inquiry-based approach on conceptual learning (Kirschner et al., 2006) can also be applied to affective outcomes. Excessive difficulty of a lab experiment or project could compromise the benefit of inquiry-based learning. For instance, when teamwork in inquiry-based projects is overwhelmingly stressful, or when students are frequently confronted with failure but do not receive sufficient support, students may withdraw from learning.

This polynomial association between perceived difficulty and learning outcomes has at least two important implications for instructional labs. First, instructors may ask students to rate the difficulty of learning tasks periodically and adjust the complexity of the learning tasks or the amount of scaffolding provided accordingly. Instructors should be cautious when students find the course very difficult; for these students, participating in complicated inquiry-based lab projects may have minimal benefit or even negative effects. Second, it is likely that students will perceive the same tasks differently. In the current dataset, the perceived difficulty of the sample course ranged from very easy to very difficult, and the standard deviation was large, indicating variation among students. It may be infeasible to achieve a difficulty level desirable for every student in a class in the instructional design process, so how scaffolding is provided becomes essential in adjusting the difficulty level, especially for students who perceive the course difficulty to be above the upper middle level. Extra assistance from the instructor or TA could help reduce these students' cognitive workload. On the other side of the spectrum, additional challenge may be beneficial for students who perceive the learning tasks to be easy, because they may have already reached the automatic phase in solving the assigned problem and could depend on superficial approaches to complete the task (Fitts & Posner, 1960; Ramsden, 2003; Tagg, 2003).

Monitoring and adjusting the perceived difficulty level of a course requires instructors and TAs to be sensitive to students' responses to learning activities and to know how to make instructional adjustments in a student-centered manner. As mentioned earlier, a limitation of the current study is the absence of data about TAs' perceptions of inquiry-based learning and their skills in facilitating inquiry-based labs. Many lab courses in the engineering curriculum, especially lower-division courses, enroll a large number of students. As a result, instructors largely rely on TAs to facilitate lab sections and ensure that students achieve the lab learning objectives. Thus, TAs play an important role in tailoring the implementation of inquiry-based lab instruction. Future research may explore whether TAs facilitating inquiry-based labs are familiar and comfortable with inquiry-based instruction and what types of training may help TAs become better facilitators in inquiry-based labs.
Monitoring and adjusting the perceived difficulty level of a course requires instructors and TAs to be sensitive to students' responses to learning activities and to know how to make instructional adjustments in a student-centered manner. As mentioned earlier, a limitation of the current study is the absence of data about TAs' perceptions of inquiry-based learning and their skills in facilitating inquiry-based labs. Many lab courses in the engineering curriculum, especially lower-division courses, enroll large numbers of students. As a result, instructors largely rely on TAs to facilitate lab sections and to ensure that students achieve lab learning objectives. Thus, TAs play an important role in tailoring the implementation of inquiry-based lab instruction. Future research may explore whether TAs who facilitate inquiry-based labs are familiar and comfortable with inquiry-based instruction and what types of training may help them become better facilitators of inquiry-based labs.

CHAPTER VIII: IMPLICATIONS AND RECOMMENDATIONS

Translating Research Findings into Initiatives

The present study documents several positive learning outcomes in inquiry-based lab courses. Findings showed that students not only acquired professional skills in solving ill-structured problems in teams but also internalized values of the engineering profession, such as perceiving teamwork and communication as inevitable parts of engineering practice, exploring new phenomena, and using active coping strategies when facing stress or failure in the problem-solving process. These consistent findings across the three sample courses suggest that inquiry-based lab instruction is a potentially powerful instructional method for developing students' attributes as engineering professionals. It can be utilized as a complement to the current engineering curriculum to achieve a balance between cognitive and affective learning. Particularly when used in lower-division introductory courses, inquiry-based labs can give students an early start on familiarizing themselves with how engineers work and on developing the skills and attitudes that could benefit their learning experiences throughout the engineering program.

In engineering education, innovation in instructional labs has been slower than in other parts of the engineering curriculum (Sheppard, et al., 2009). The reform of instructional labs in engineering has also lagged behind that in science or medical education. The slow progress can be partly attributed to insufficient research attention and a lack of clear lab learning objectives. More importantly, changes in educational practice do not take place until research findings are translated into departmental, institutional, and national initiatives that renew the existing organizational and disciplinary culture, so that faculty members and curriculum committees design and deliver courses that address agreed-upon student learning outcomes (Watson, 2009).

The three cases in the current study are mature lab courses, each having evolved over several years. They all originated from local faculty members' motivation to enhance students' learning experiences and outcomes. The Engineering Design lab was part of a college-wide initiative to improve student retention and better prepare freshmen for later schoolwork in engineering. The initiative was partly supported by external funding and accompanied by institutional and departmental support. The design, implementation, and assessment of this course involved cross-departmental, interdisciplinary, and academic-industry collaborations. Over the years, this course has become a featured first-year course for students intending to major in engineering. The Bio-system lab and the Chemistry lab were each initiated by the course instructor, who had the intrinsic motivation to improve students' problem-solving skills through work on complex scientific or engineering problems. Changing a large general science course offered to students from various disciplines (i.e., the Chemistry lab) could be more complicated than designing a small departmental course only for students in the department (i.e., the Bio-system lab). The food dehydration project was implemented in the Bio-system lab from the first time the course was offered, but the innovation of the Chemistry lab was more gradual.
The instructor of the Chemistry lab said that she changed the course by incorporating the inquiry-based component gradually to minimize negative reactions from her colleagues in the department. An effective instructional leadership team may foster instructional change in a college. A teacher with the intrinsic motivation to innovate instruction may suffice to change a course when using appropriate strategies that fit the institutional, disciplinary, or departmental culture. However, only a coordination of top-down and bottom-up strategies can bring forth continuous and fundamental change in the field (Fullan, 1994; Hargreaves, 1994).

Challenges in Innovating Instructional Labs

The recent decade has witnessed several nationwide initiatives to reform the undergraduate engineering curriculum, including the new criteria for accrediting engineering programs, the establishment of several NSF-funded coalitions and research centers to discover best practices and change the way science and engineering are taught, and increased recognition of researchers in engineering education. Nevertheless, most instructional labs remain unaffected by the literature on inquiry-based instruction. Like any other type of instructional or curricular reform in higher education, innovation in instructional labs takes place within larger institutional cultures that may discourage change. The stagnation in lab instruction can also be attributed to a lack of the space, funding, and personnel support that is essential for conducting complex problem-solving activities in lab settings.

The reform of instructional labs can be adversely affected by existing institutional structures and cultures that discourage or even penalize effective teaching (Frost & Teodorescu, 2001). Prior researchers identified coherence between the institutional culture and the strategies taken to reform instruction and curriculum as the most important element of successful reform (Eckel, Green, & Hill, 2001; Kezar & Eckel, 2002). Moreover, instructional and curricular innovations are themselves changes of institutional culture because they indicate an enhanced value placed on teaching (Frost & Teodorescu, 2001). The strategy taken by the Chemistry lab instructor, slowly and unobtrusively changing a conventional science lab into an inquiry-based one, may to some extent reflect a disconnection between the culture of innovation promoted from the top at the national level and the departmental or institutional cultures that reinforce faculty members' beliefs on a daily basis. Particularly in science and engineering disciplines, teaching, and research on science and engineering education, are secondary to science and engineering research. As a result, incentives that encourage instructional innovation are often scarce (May, Susskind, & Shapiro, 2013). Given these impediments embedded within the instructional culture, it could be more difficult for change to occur from the grassroots in instructional labs than in other components of the curriculum, because lab courses, particularly those for lower-division students, are often taught by non-tenure-track lecturers or lab managers who have limited resources to facilitate change.

Second, innovation in lab instruction must begin with the individuals who directly interact with students and influence how students learn (Ambrose & Norman, 2006). Effective implementation of inquiry-based instructional labs sets higher requirements for the instructors and graduate assistants who facilitate lab sections.
Proficiency in the subject matter and familiarity with lab procedures may be sufficient for an instructor or teaching assistant to facilitate a cookbook lab; they will not suffice for an effective inquiry-based lab. In addition to the knowledge and skills needed to facilitate a conventional lab course, inquiry-based lab facilitators should also understand what inquiry-based learning means, how to design and implement an inquiry-based lab project, how students learn in an inquiry-based context, when and how to provide scaffolding to students based on individual needs without disrupting student inquiry, and how to evaluate learning outcomes beyond conceptual ones. Without proper resources for lab designers and instructors to address these issues, the conventional way of teaching lab courses is unlikely to change.

Finally, a lack of top-down support and an institution's unwillingness to upgrade lab infrastructure and equipment may influence how lab courses are taught. The setting of an inquiry-based lab differs from that of a cookbook lab in various ways. The level of self-directedness allowed in an inquiry-based lab partly depends on whether the available resources permit students to conduct a project in different ways. Additionally, managing an inquiry-based lab is more complicated than coordinating a cookbook lab. Instructors or lab managers need to make materials and tools available to students when they need them and, at the same time, ensure safety. Particularly in a design lab course, where student teams usually cannot complete designing and testing prototypes in class, labs or relevant resources must be accessible after class so that student groups are able to complete a project at their own pace. Because of the complexity involved in changing a lab course, it is not surprising that priority has been given to modifying other parts of the engineering curriculum.

Policy Recommendations

The adoption of EC2000 and the NSF-funded engineering education coalitions in the 1990s stimulated significant reform of undergraduate engineering education and invigorated engineering curricula (Foundation Coalition, 2006; Lattuca, Terenzini, Volkwein, & Peterson, 2006). However, instructional innovation and the scale-up of change have taken place slowly in engineering instructional labs (Sheppard, et al., 2009). Changing instructional labs in engineering or similar fields requires efforts to change institutional culture, improve the teaching expertise of science and engineering instructors, and upgrade lab infrastructure.

Institutional culture is the result of continuing negotiations about values between members (Seel, 2000; Watson, 2009). Conventional lab instruction will not change unless the role of instructional labs in the engineering curriculum is expanded to achieve equilibrium between enhancing conceptual learning and developing students' attributes as science and engineering professionals. The conceptualization of lab learning objectives (the ABET/Sloan objectives) offers a framework for curriculum committees and faculty members to rethink how to utilize labs to engage students in solving the kinds of ill-structured practical problems that they will likely encounter in future jobs. Additionally, substantial research on STEM education reform and organizational change points out the significance of faculty grassroots leadership in institutionalizing innovations (Henderson, Beach, & Finkelstein, 2008; Kezar, 2011; May, et al., 2013; Watson, 2009).
Instructional labs are more likely to change when the faculty, staff, and administrators who design, implement, and assess lab courses are engaged in the dialogue on curricular and instructional reform. Distributed leadership among these individuals, coupled with external support and incentives, will empower them to work collaboratively to plan and implement lab innovations that fit into the larger reform of STEM education (Kezar, 2011).

Improving faculty and staff expertise in designing and implementing innovative lab approaches is a time-consuming and collaborative endeavor. The first step is to recognize the scholarship of teaching and learning in the reward structure, so that faculty and staff in science and engineering fields will be willing to invest time and effort in improving teaching and learning. Centers for the Scholarship of Teaching and Learning (SoTL) or other teaching support centers established at many institutions are ideal platforms for connecting science and engineering educators with people who hold similar ideas about teaching or who have the knowledge and skills necessary to implement an instructional reform. Compared to isolated individuals, networks are more likely to sustain a change and to help individuals implement changes that fit the institutional context (Kezar, 2011).

A lack of funding to upgrade lab facilities, equipment, and materials is one of the largest obstacles to changing instructional labs. Incentives or funding opportunities should be provided to encourage creative design of lab assignments within limited budgets. Such expenses need not be high; in the three sample cases, the materials and equipment used in lab experiments or projects were not necessarily expensive, and all three labs made use of materials found in daily life. However, using alternative materials to reduce cost may not be a solution that fits all types of labs; for some instructional labs, changing the existing lab infrastructure relies heavily on external funding. Finally, consolidating instructional labs may be a strategy for upgrading instructional labs with limited funding. The separation of disciplines and a lack of cross-departmental and interdisciplinary collaboration result in wasted resources. Instructional labs in closely related science and engineering disciplines can be consolidated to utilize existing resources more efficiently and to reduce the cost of operating redundant labs. More importantly, building new labs or consolidating existing ones should be guided by explicit lab learning objectives and should engage the educators concerned, to ensure that lab settings support innovative teaching approaches.

Research Recommendations

The findings of the current study open up opportunities for future research on instructional labs. A better understanding of how students approach inquiry-based learning tasks would help achieve better learning outcomes in inquiry-based labs. The findings revealed that both increased use of a deep approach to learning and decreased use of a superficial approach were positively associated with better learning outcomes, but across the sample cases students did not always use deep approaches to learning; they also used superficial approaches to some extent. As a student may use both approaches on different occasions throughout a course, future research may examine the elements of an inquiry-based lab that lead to different learning approaches.
Such examination may also go beyond lab instruction to examine whether certain factors in the educational system lead students to use superficial learning approaches even in an inquiry-based context. The association between task difficulty and learning outcomes revealed in the current study suggests that scaffolding may function as a moderator that adjusts students' perceived difficulty of the learning tasks. Students across the sample cases also showed variation in their perceptions of course difficulty. Future research may explore ways in which scaffolding can be provided to students during inquiry-based labs based on individual needs, so that students perceive the tasks to be challenging but not excessively difficult. Future research may also examine different designs of inquiry-based lab assignments and identify models that allow lab instructors or facilitators to adjust task complexity based on students' feedback.

APPENDICES

APPENDIX A

INQUIRY-BASED LAB LEARNING INVENTORY (ILLI)

A. Self-assessment of Professional Skills: Please circle the number to the right that best describes your skill level in the following areas after and before taking Chemistry 125/126. If you are unfamiliar with, or have had no experience with, any of these statements, select the "N/A (Not applicable)" option. (0: N/A; 1: Weak; 2: Fair; 3: Good; 4: Very good; 5: Excellent)
1. Write a well-organized, coherent report.
2. Effectively present your solution to a science/engineering problem through writing.
3. Communicate effectively with teammates and instructors about lab experiments/projects.
4. Communicate effectively with non-technical audiences about lab experiments/projects.
5. Collaborate with others to complete a lab experiment/project.
6. Interpret experimental results for different audiences.
7. Deliver a well-organized oral presentation.
8. Work in teams of people with a variety of skills and backgrounds.

B. On a scale of 1-10, rate the overall difficulty of this course, compared to other courses.

C. On a scale of 1-10, rate the overall workload of this course, compared to other courses.

D. Disagree or Agree: Please circle the number to the right that best describes your level of agreement with the following statements. If you are unfamiliar with, or have had no experience with, any of these statements, select the "0 (Neither agree nor disagree)" option. (-3: Strongly disagree; -2: Disagree; -1: Disagree to some extent; 0: Neither agree nor disagree; 1: Agree to some extent; 2: Agree; 3: Strongly agree)

D.1. Exploration
1. Describe myself as someone who actively seeks as much information as I can in a new situation.
2. Always look for new opportunities to grow as a person (e.g., information, people, and resources).
3. Find myself NOT interested in probing deeply into new situations or things.
4. Always look for new things or experiences.

D.2. Coping strategy: When I face stress or difficulty in solving science/engineering problems, I would...
1. Discuss issues with teachers, seniors, or classmates and ask for their opinions.
2. Simplify the question and make it easy to solve.
3. Use a calm and optimistic attitude to think about how to cope with the problem.
4. Learn to live with it and keep trying.
5. Tell myself to persevere.
6. Passively let nature take its course.
7. Be used to leaving aside the problem and not handling it for the time being.
8. Become discouraged.
9. Generalize that I have bad luck.
10. Decrease my standards and try again with the new standards.

D.3. Communication and Teamwork
1. I prefer solving science/engineering problems in a team-based environment to working as an individual.
2. Teamwork accomplishes most science/engineering tasks more effectively than individual efforts.
3. In order to be a good practicing scientist/engineer, I must know how to effectively communicate my lab work with others orally.
4. In order to be a good practicing scientist/engineer, I must know how to effectively communicate my lab work in writing.
5. I prefer to work with team members who ask questions about information I provide.
6. Teams that do not communicate effectively significantly increase their risk of committing errors.

D.4. Learning Approach
1. I have generally put a lot of effort into the lab work in this course.
2. In an attempt to understand new ideas brought up in this course, I have often related them to practical or real-life contexts.
3. In the lab, I've looked at evidence carefully to reach my own conclusion about what I'm studying in this course.
4. To meet the requirements of the lab sections of this particular course, I often follow the argument, or try to see the reasons behind things.
5. I've often had trouble making sense of the things I have to memorize during lab sections.
6. In this course, I've just been going through the motions of studying without seeing where I'm going.
7. Much of what I've learned in the lab sections of this course seems no more than lots of unrelated bits and pieces in my mind.
8. I've tended to take what we've been taught in this course at face value without questioning it much.

E. Your previous learning experience: Overall, how much have the other courses that you've taken so far in your program emphasized each of the following areas? (0: Little; 1: Slight; 2: Moderate; 3: Strong; 4: Very strong)
1. Conducting hands-on experiments/projects
2. Defining a research problem
3. Designing, conducting, and analyzing data from experiments
4. Generating and evaluating a variety of ideas about how to solve a problem
5. Recognizing unsuccessful outcomes and then re-thinking effective solutions
6. Creativity and innovation
7. Written communication skills
8. Oral communication skills
9. Working effectively in teams

F. Your background: Please select the most appropriate alternative(s).
1. How do you describe yourself? (Choose all that apply) (Black or African American, White, Hispanic or Chicano, American Indian, Asian, Asian American or Pacific Islander, Other (please specify))
2. What is your major? (Engineering, Pre-med, Science, Other (please specify))
3. Was this course required in your program? (Yes, No)
4. What is your sex? (Female, Male)
5. What is your overall GPA? (below C-, C-/C, C/B-, B-/B, B/A-, A-/A)
6. What is your class level?
7. Are you an international student? (Yes, No)

APPENDIX B

INTERVIEW PROTOCOL

Introduction
Thanks for participating in this interview. During the interview, I will ask you to think about your learning experience in the Engineering Design Lab course, especially doing the hands-on design project in this course. Before we begin, I need to ask you if I can record our conversation for the purpose of note taking, so that I can go back to our conversation later if I missed anything important. If so, I'll excuse myself for a few seconds to start the recording; hold on.

Warm up
1. How do you like your undergraduate study so far?
2. Can you describe the hands-on design project in this course?
What did you do for this project?
3. Was this the first time you did this type of engineering project? If not, please describe your previous project experience.
4. How do you feel about doing this course project?

What did you learn?
1. What have you learned most from the hands-on project?
2. Which parts of the design project went well? Which parts didn't?
3. If you could do this project again, what would you do differently?
4. Is there anything you wish this course could do differently so that you could benefit more from it?
5. Have you decided your major? Engineering or a different one?
6. Has this course influenced your decision? How?
7. What have you learned about engineering from this course?
8. What have you learned about yourself as a future engineer?

How did you learn?
1. Compared with other courses that you've taken so far in the engineering program, how would you rate the difficulty of this course, from 1-10? Describe how you feel about its difficulty.
2. How about the workload of this course compared to other courses?
3. When you first learned that you would be doing a hands-on engineering design project in this course, what were your expectations for this project or for this course?
4. After doing these course projects, was anything different from your expectations?
5. When doing course projects, how do you know if your idea is right or wrong?
6. Working in a group, when your idea turns out to be right, what would you do?
7. During these projects, did you or your team meet any difficulties (times when things got stuck)? What did you do?
8. During these projects, when did you ask questions? Whom did you ask? How easy was it for you to ask questions? What types of answers did you get?
9. Please describe the role of instructors/TAs in your design projects.
10. Did your classmates help you learn things in this course?

REFERENCES

Adams, S.G. (2001). The effectiveness of the E-team approach to invention and innovation. Journal of Engineering Education, October, 597-600.
Albanese, M.A., & Mitchell, S. (1993). Problem-based learning: A review of literature on its outcomes and implementation issues. Academic Medicine, 68, 52-81.
Allen, E. L., Muscat, A. J., & Green, E. D. H. (1996). Interdisciplinary team learning in a semiconductor processing course. In Proceedings of the 26th Annual Frontiers in Education Conference (FIE '96) (Vol. 1, pp. 31-34).
Amato-Henderson, S., Kemppainen, A., & Hein, G. (2011). Assessing creativity in engineering students. In Frontiers in Education Conference (FIE), 2011 (pp. T4F-1).
Ambrose, S.A., & Norman, M. (2006). Preparing engineering faculty as educators. The Bridge, 36(2), 25-32.
Anderson, L. W., Krathwohl, D.R., Airasian, P.W., Cruikshank, K.A., Mayer, R.E., Pintrich, P.R., Raths, J., & Wittrock, M.C. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives (unabridged ed.). New York: Longman.
Astin, A. W. (1993). What matters in college: Four critical years revisited. San Francisco: Jossey-Bass.
Artemeva, N., & Logie, S. (2003). Introducing engineering students to intellectual teamwork: The teaching and practice of peer feedback in the professional communication classroom. Language & Learning Across the Disciplines, 6(1), 62-85.
Baig, L. A., Violato, C., & Crutcher, R. A. (2009). Assessing clinical communication skills in physicians: Are the skills context specific or generalizable? BMC Medical Education, 9(1), 22.
Balamuralithara, B., & Woods, P.C. (2008). Virtual laboratories in engineering education: The simulation lab and remote lab. Computer Applications in Engineering Education, 17(1), 108-118.
Banchi, H., & Bell, R. (2008). The many levels of inquiry. Science and Children, 46(2), 26-29.
Barell, J. (2007). Problem-based learning: An inquiry approach. Corwin Press.
Barrows, H.S., & Tamblyn, R.N. (1980). Problem-based learning: An approach to medical education. New York, NY: Springer.
Benford, R., & Lawson, A.E. (2001). Relationships between effective inquiry use and the development of scientific reasoning skills in college biology labs. Report to the National Science Foundation, Grant DUE 9453610.
Berger, C., Kerner, N., & Lee, Y. (1999). Understanding student perceptions of collaboration, laboratory and inquiry use in introductory chemistry. Retrieved April 9, 2014 from http://www-personal.umich.edu/~cberger/narst99folder/narst99.html
Berlyne, D. E. (1960). Conflict, arousal, and curiosity. New York: McGraw-Hill.
Berlyne, D. E. (1967). Arousal and reinforcement. In D. Levine (Ed.), Nebraska symposium on motivation (pp. 1-110). Lincoln: University of Nebraska Press.
Berlyne, D. E. (1971). Aesthetics and psychobiology. New York: Appleton-Century-Crofts.
Biggs, J. (1999). What the student does: Teaching for enhanced learning. Higher Education Research & Development, 18(1), 57-75.
Blumenfeld, P., Soloway, E., Marx, R., Krajcik, J., Guzdial, M., & Palincsar, A. (1991). Motivating project-based learning: Sustaining the doing, supporting the learning. Educational Psychologist, 26(3-4), 369-398.
Bonwell, C. C., & Eison, J. A. (1991). Active learning: Creating excitement in the classroom (1991 ASHE-ERIC Higher Education Reports). ERIC Clearinghouse on Higher Education.
Bransford, J.D., Brown, A. L., & Cocking, R. R. (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.
Brandtstädter, J. (1992). Personal control over development: Implications of self-efficacy. In R. Schwarzer (Ed.), Self-efficacy: Thought control of action (pp. 127-145). Washington, DC: Hemisphere.
Brandtstädter, J., & Renner, G. (1992). Coping with discrepancies between aspirations and achievements in adult development: A dual-process model. In Life crises and experiences of loss in adulthood (pp. 301-319).
Biggs, J. B. (1989). Approaches to the enhancement of tertiary teaching. Higher Education Research and Development, 8(1), 7-25.
Billet, A. M., Camy, S., & Coufort, C. (2010). Pilot-scale laboratory instruction for ChE: The specific case of the pilot-unit leading group. Chemical Engineering Education, 44(4), 246-252.
Bollen, K., & Lennox, R. (1991). Conventional wisdom on measurement: A structural equation perspective. Psychological Bulletin, 110(2), 305.
Booth, S. (2004). Engineering education and the pedagogy of awareness. In C. Baillie & I. Moore (Eds.), Effective learning and teaching in engineering (pp. 9-24). London: Routledge-Falmer.
Brockett, R. G., & Hiemstra, R. (1991). Self-direction in adult learning: Perspectives on theory, research, and practice. London: Routledge.
Buch, N. J., & Wolff, T. F. (2000). Classroom teaching through inquiry. Journal of Professional Issues in Engineering Education and Practice, 126(3), 105-109.
Buja, A. (1990). Remarks on functional canonical variates: Alternating least squares methods and ACE. Annals of Statistics, 18(3), 1032-1069.
Buja, A., & Eyuboglu, N. (1992). Remarks on parallel analysis. Multivariate Behavioral Research, 27(4), 509-540.
Burleson, W., & Picard, R. (2004, August). Affective agents: Sustaining motivation to learn through failure and a state of stuck. In Workshop on Social and Emotional Intelligence in Learning Environments.
Byrne, B. M. (1989). A primer of LISREL. New York: Springer.
Caffarella, R. S. (1993). Self-directed learning. New Directions for Adult and Continuing Education, 57, 25-35.
Campbell, D.T., & Fiske, D.W. (1959). Convergent and discriminant validation by the multitrait-multimethod matrix. Psychological Bulletin, 56, 81-105.
Carlson, B., Schoch, P., Kalsher, M., & Racicot, B. (1997). A motivational first-year electronics lab course. Journal of Engineering Education, 86(4), 357-362.
Cavalieri, L. A. (1996). The processes of adult learning: Failure as feedback for motivation. In Proceedings of the First Triennial Eastern Adult, Continuing, and Distance Research Conference. State College, PA: Penn State University Press.
Charyton, C., & Merrill, J. A. (2009). Assessing general creativity and creative engineering design in first year engineering students. Journal of Engineering Education, 98(2), 145-156.
Chiu, C. H. (2003). Exploring how primary school students function in computer supported collaborative learning. International Journal of Continuing Engineering Education and Life Long Learning, 13(3), 258-267.
Church, A. T., & Burke, P. J. (1994). Exploratory and confirmatory tests of the big five and Tellegen's three- and four-dimensional models. Journal of Personality and Social Psychology, 66(1), 93.
Colburn, A. (2000). An inquiry primer. Science Scope, 23(6), 42-44.
Combs, J. P., & Onwuegbuzie, A. J. (2012). Relationships among attitudes, coping strategies, and achievement in doctoral-level statistics courses: A mixed research study. International Journal of Doctoral Studies, 7, 349-375.
Costello, A. B., & Osborne, J. W. (2003). Exploring best practices in factor analysis: Four mistakes applied researchers make. Paper presented at the Annual Meeting of the American Educational Research Association, Chicago, IL.
Cropley, D., & Cropley, A. (2010). Recognizing and fostering creativity in technological design education. International Journal of Technology and Design Education, 20(3), 345-358.
Day, H. I. (1971). The measurement of specific curiosity. In H. I. Day, D. E. Berlyne, & D. E. Hunt (Eds.), Intrinsic motivation: A new direction in education (pp. 99-112). New York: Holt, Rinehart & Winston.
Dee, K. C., Nauman, E. A., Livesay, G. A., & Rice, J. (2002). Research report: Learning styles of biomedical engineering students. Annals of Biomedical Engineering, 30(8), 1100-1106.
Deignan, T. (2009). Enquiry-based learning: Perspectives on practice. Teaching in Higher Education, 14(1), 13-28.
DeLyser, R. R., Rullkoetter, P., & Armentrout, D. (2002, November). A novel multidisciplinary course: Measurement and automated data acquisition, an update. In Proceedings of the 2002 Frontiers in Education Conference (pp. S4A2-S4A6).
Dewey, J. (1916). Democracy and education: An introduction to the philosophy of education (1966 ed.). New York: Free Press.
Dillon, W., Thomas, M., & Narendra, M. (1983). Scaling models for categorical variables: An application of latent structure models. Journal of Consumer Research, 10(September), 209-224.
DiStefano, C., Zhu, M., & Mindrila, D. (2009). Understanding and using factor scores: Considerations for the applied researcher. Practical Assessment, Research & Evaluation, 14(20), 1-11.
Domin, D.S. (1999). A review of laboratory instruction styles. Journal of Chemical Education, 76(4), 543.
Duderstadt, J.J. (2010). Engineering for a changing world. In D. Grasso & M.B. Burkins (Eds.), Holistic engineering education (pp. 17-36). New York: Springer.
Dutson, A.J., Todd, R.H., Magleby, S.P., & Sorensen, C.D. (1997). A review of literature on teaching engineering design through project-oriented capstone courses. Journal of Engineering Education, 86(1), 17-28.
Dym, C. L., Rossmann, J. S., & Sheppard, S. D. (2004). On designing engineering education: Lessons learned at Mudd Design Workshop IV. International Journal of Engineering Education, 20(3), 470-474.
Eckel, P., Green, M., & Hill, B. (2001). On change V: Riding the waves of change: Insights from transforming institutions (Occasional paper). Washington, DC: American Council on Education.
Eick, C. J., & Reed, C. J. (2002). What makes an inquiry-oriented science teacher? The influence of learning histories on student teacher role identity and practice. Science Education, 86(3), 401-416.
Ellis, R. A., Marcus, G., & Taylor, R. (2005). Learning through inquiry: Student difficulties with online course-based material. Journal of Computer Assisted Learning, 21(4), 239-252.
Enders, C. K. (2001). The impact of nonnormality on full information maximum-likelihood estimation for structural equation models with missing data. Psychological Methods, 6(4), 352.
Enders, C. K., & Bandalos, D. L. (2001). The relative performance of full information maximum likelihood estimation for missing data in structural equation models. Structural Equation Modeling, 8(3), 430-457.
Falk, J., & Drayton, B. (2004). State testing and inquiry-based science: Are they complementary or competing reforms? Journal of Educational Change, 5(4), 345-387.
Feisel, L., & Peterson, G.D. (2002). A colloquy on learning objectives for engineering education laboratories. In ASEE Annual Conference and Exposition, Montreal, Ontario, Canada.
Feisel, L., & Rosa, J. (2005). The role of the laboratory in undergraduate engineering education. Journal of Engineering Education, 94, 121-130.
Felder, R.M., & Brent, R. (2003). Designing and teaching courses to satisfy the ABET engineering criteria. Journal of Engineering Education, 92(1), 7-25.
Feletti, G. (1993). Inquiry based and problem based learning: How similar are these approaches to nursing and medical education? Higher Education Research and Development, 12(2), 143-156.
Fishbein, M., & Ajzen, I. (1975). Formation of intention. In Belief, attitude, intention, and behavior: An introduction to theory and research (pp. 288-334). MA: Addison-Wesley.
Fisher, R. A. (1921). On the "probable error" of a coefficient of correlation deduced from a small sample. Metron, 1, 3-32.
Fitts, P. M., & Posner, M. I. (1967). Learning and skilled performance. In Human performance. Belmont, CA: Brooks/Cole.
Flora, J. R., & Cooper, A. T. (2005). Incorporating inquiry-based laboratory experiment in undergraduate environmental engineering laboratory. Journal of Professional Issues in Engineering Education and Practice, 131(1), 19-25.
Fouladi, R. T. (1998). Covariance structure analysis techniques under conditions of multivariate normality and nonnormality: Modified and bootstrap based test statistics. Paper presented at the Annual Meeting of the American Educational Research Association.
Franklin, S. B., Gibson, D. J., Robertson, P. A., Pohlmann, J. T., & Fralish, J. S. (1995). Parallel analysis: A method for determining significant principal components. Journal of Vegetation Science, 6(1), 99-106.
Frost, S. H., & Teodorescu, D. (2001). Teaching excellence: How faculty guided change at a research university. Review of Higher Education, 24(4), 397-415.
Fullan, M. (1994). Coordinating top-down and bottom-up strategies for educational reform. In Systemic reform: Perspectives on personalizing education (pp. 7-24).
Garson, G. D. (2013). Factor analysis. Asheboro, NC: Statistical Associates Publishers.
George, D., & Mallery, P. (2003). SPSS for Windows step by step: A simple guide and reference, 11.0 update (4th ed.). Boston: Allyn & Bacon.
Gerbing, D.W., & Anderson, J.C. (1984). On the meaning of within-factor correlated measurement errors. Journal of Consumer Research, 11, 572-580.
Gijbels, D., Dochy, F., Van den Bossche, P., & Segers, M. (2005). Effects of problem-based learning: A meta-analysis from the angle of assessment. Review of Educational Research, 75(1), 27-61.
Ginder, S. A., & Kelly-Reid, J. E. (2013). Enrollment in postsecondary institutions, fall 2012; financial statistics, fiscal year 2012; graduation rates, selected cohorts, 2004-09; and employees in postsecondary institutions, fall 2012 (NCES 2013-183).
Gindy, M. (2006). A new approach to undergraduate structural engineering laboratory instruction. In Proceedings of the ASEE New England Section Annual Conference.
Goeser, P. T., Coates, C. W., & Johnson, W. M. (2003). The role of an introduction to engineering course on retention. In Proceedings of the ASEE Southeast Section Conference.
Gorsuch, R. (1983). Factor analysis. Hillsdale, NJ: L. Erlbaum Associates.
Grabe, M., & Grabe, C. (2000). Integrating the internet for meaningful learning. Boston: Houghton Mifflin.
Guilford, J. P. (1967). The nature of human intelligence. New York: McGraw-Hill.
Hair, J. F., Anderson, R. E., Tatham, R. L., & Black, W. C. (1995). Multivariate data analysis. New York, NY: Macmillan.
Hamilton, L.C. (2008). Statistics with Stata. Belmont, CA: Nelson Education.
Hancock, G. R., & Nevitt, J. (1999). Bootstrapping and the identification of exogenous latent variables within structural equation models. Structural Equation Modeling: A Multidisciplinary Journal, 6(4), 394-399.
Hargreaves, A. (1994). Changing teachers, changing times. Toronto: OISE Press.
Hatcher, L. (1994). A step-by-step approach to using the SAS system for factor analysis and structural equation modeling. SAS Institute.
Hauenstein, A. D. (1998). A conceptual framework for educational objectives: A holistic approach to traditional taxonomies. New York: University Press of America.
Haury, D. L. (2001). Teaching science through inquiry. ERIC Clearinghouse for Science, Mathematics, and Environmental Education. Retrieved March 19, 2011 from https://www.msu.edu/course/te/804/Sp05Sec1819/Science05/Assets/402files/HauryArchivedData.pdf
Hayton, J. C., Allen, D. G., & Scarpello, V. (2004). Factor retention decisions in exploratory factor analysis: A tutorial on parallel analysis. Organizational Research Methods, 7(2), 191-205.
Herriott, R. E., & Firestone, W. A. (1983). Multisite qualitative policy research: Optimizing description and generalizability. Educational Researcher, 14-19.
Henderson, C., Beach, A., & Finkelstein, N. (2008). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952-984.
Hersen, M., & Barlow, D.H. (1976). Single case experimental design: Strategies for studying behavior change. New York: Pergamon.
Hershberger, S. L. (2005). Factor scores. In B. S. Everitt & D. C. Howell (Eds.), Encyclopedia of statistics in behavioral science (pp. 636-644). New York: John Wiley.
Hill, L. G., & Betz, D. L. (2005). Revisiting the retrospective pretest. American Journal of Evaluation, 26(4), 501-517.
Hinds, T., Wolff, T., Buch, N., Idema, A., & Helman, C. (2009). Integrating a first-year engineering program and a living-learning community. In Proceedings of the American Society for Engineering Education 2009 Annual Conference.
Hmelo-Silver, C. E., Duncan, R. G., & Chinn, C. A. (2007). Scaffolding and achievement in problem-based and inquiry learning: A response to Kirschner, Sweller, and Clark (2006). Educational Psychologist, 42(2), 99-107.
Hoegl, M., & Parboteeah, K. P. (2007). Creativity in innovative projects: How teamwork matters. Journal of Engineering and Technology Management, 24(1), 148-166.
Hoffer, T. B., Rasinski, K. A., & Moore, W. (1995). Social background differences in high school mathematics and science coursetaking and achievement (NCES 95-206). Washington, DC: U.S. Department of Education.
Hooper, D., Coughlan, J., & Mullen, M.R. (2008). Structural equation modeling: Guidelines for determining model fit. Electronic Journal of Business Research Methods, 6(1), 53-60.
Houlden, R. L., Collier, C. P., Frid, P. J., John, S. L., & Pross, H. (2001). Problems identified by tutors in a hybrid problem-based learning curriculum. Academic Medicine, 76(1), 81.
Hounsell, D., McCune, V., Litjens, J., & Hounsell, J. (2005). Biosciences: Subject overview report of the Enhancing Teaching-Learning Environments in Undergraduate Courses project. Retrieved May 29, 2014 from http://www.etl.tla.ed.ac.uk//docs/BiosciencesSR.pdf
Howard, G. S. (1980). Response-shift bias: A problem in evaluating interventions with pre/post self-reports. Evaluation Review, 4(1), 93-106.
Howe, S., & Wilbarger, J. (2005). National survey of engineering capstone design courses. In Proceedings of the 2006 ASEE Annual Conference and Exposition (pp. 18-21).
Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6, 1-55.
Hu, S., Scheuch, K., Schwartz, R. A., Gayles, J. G., & Li, S. (2008). Reinventing undergraduate education: Engaging college students in research and creative activities. Jossey-Bass.
Huisman, M. (2000). Imputation of missing item responses: Some simple techniques. Quality and Quantity, 34(4), 331-351.
James, L. R., Mulaik, S. A., & Brett, J. M. (1982). Causal analysis: Assumptions, models, and data. Beverly Hills, CA: Sage Publications.
Jeffries, K. K. (2007). Diagnosing the creativity of designers: Individual feedback within mass higher education. Design Studies, 28(5), 485-497.
Johnson, D. W., & Johnson, R. T. (1999). Making cooperative learning work. Theory into Practice, 38(2), 67-73.
Jöreskog, K.G., & Long, J.S. (1993). Introduction. In K.A. Bollen & J.S. Long (Eds.), Testing structural equation models (pp. 1-9). Newbury Park, CA: Sage.
Kalonji, G., Regan, T., & Walker, M. L. (1996, November). The evolution of a coalition: ECSEL's programs for years 6-10. In Proceedings of the 26th Annual Frontiers in Education Conference (FIE '96) (Vol. 3, pp. 1360-1365). IEEE.
Kanter, D., Smith, H.D., McKenna, A., Rieger, C., & Linsenmeier, R. (2003). Inquiry-based laboratory instruction throws out the "cookbook" and improves learning. In Proceedings of the American Society for Engineering Education.
Karam, L. J., & Mounsef, N. (2011, January). Increasing retention through Introduction to Engineering Design. In Digital Signal Processing Workshop and IEEE Signal Processing Education Workshop (DSP/SPE), 2011 IEEE (pp. 186-191).
Kashdan, T. B., & Fincham, F. D. (2004). Facilitating curiosity: A social and self-regulatory perspective for scientifically based interventions. In Positive psychology in practice (pp. 482-503).
Kashdan, T. B., Rose, P., & Fincham, F. D. (2004). Curiosity and exploration: Facilitating positive subjective experiences and personal growth opportunities. Journal of Personality Assessment, 82(3), 291-305.
Kenny, D.A., & Kashy, D.A. (1992). Analysis of the multitrait-multimethod matrix by confirmatory factor analysis. Psychological Bulletin, 112, 165-172.
Kenward, M. G., Goetghebeur, E. J. T., & Molenberghs, G. (2001). Sensitivity analysis of incomplete categorical data. Statistical Modelling, 1, 31-48.
Kezar, A. (2011). What is the best way to achieve broader reach of improved practices in higher education? Innovative Higher Education, 36(4), 235-247.
Kezar, A., & Eckel, P. (2002). The effect of institutional culture on change strategies in higher education: Universal principles or culturally responsive concepts? Journal of Higher Education, 73, 435-460.
Khan, P., & O'Rourke, K. (2005). Understanding enquiry-based learning. In T. Barrett, I. Mac Labhrainn, & H. Fallon (Eds.), Handbook of enquiry & problem based learning. Galway: CELT. Retrieved April 18 from http://www.aishe.org/readings/2005-2/chapter1.pdf
Khan, P., & O'Rourke, K. (2004). Guide to curriculum design: Enquiry-based learning. Higher Education Academy, 30-3.
Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86.
Kitchener, K.S., & King, P.M. (1981). Reflective judgment: Concepts of justifications and their relation to age and gender. Journal of Applied Developmental Psychology, 2(2), 89-116.
King, R., Parker, T., Grover, T., Gosink, J., & Middleton Venue, N. (1999). A multidisciplinary engineering laboratory course. Journal of Engineering Education, 88(3), 311-316.
Kline, P. (1986). A handbook of test construction: Introduction to psychometric design. New York: Methuen & Company.
Kline, R. B. (2004). Introduction (Chapter 1, prepublication version). In Beyond significance testing: Reforming data analysis methods in behavioral research. Washington, DC: American Psychological Association.
Kline, R. B. (2009). Measurement (Chapter 7). In Becoming a behavioral science researcher: A guide to producing research that matters. New York: Guilford Press.
Kline, R.B. (2011). Principles and practice of structural equation modeling. Guilford Press.
Knight, D. W., Carlson, L. E., & Sullivan, J. (2007, June). Improving engineering student retention through hands-on, team based, first-year design projects. In Proceedings of the International Conference on Research in Engineering Education.
Kohn, A. (1999). The schools our children deserve. Boston: Houghton Mifflin.
Komives, C. (2006). Biochemical engineering laboratory course for chemical engineering students. In Proceedings of the 9th International Conference on Engineering Education, San Juan, Puerto Rico, M3F 22-27.
Kraiger, K., Ford, J. K., & Salas, E. (1993). Application of cognitive, skill-based, and affective theories of learning outcomes to new methods of training evaluation. Journal of Applied Psychology, 78(2), 311.
Krapp, A. (1999). Interest, motivation, and learning: An educational-psychological perspective. European Journal of Psychology of Education, 14, 23-40.
Krathwohl, D. R., Bloom, B. S., & Masia, B. B. (1973). Taxonomy of educational objectives, the classification of educational goals. Handbook II: Affective domain. New York: David McKay.
Krivickas, R. V., & Krivickas, J. (2007). Laboratory instruction in engineering education. Global Journal of Engineering Education, 11(2), 191-196.
Kuhn, D., Black, J., Keselman, A., & Kaplan, D. (2000). The development of cognitive skills to support inquiry learning. Cognition and Instruction, 18(4), 495-523.
Laird, T. F. N., Shoup, R., Kuh, G. D., & Schwarz, M. J. (2008). The effects of discipline on deep approaches to student learning and college outcomes. Research in Higher Education, 49(6), 469-494.
Lattuca, L. R., Terenzini, P. T., & Volkwein, J. F. (2006). Engineering change: Findings from a study of the impact of EC2000, final report. Baltimore, MD: ABET, Inc.
Lattuca, L.R., Terenzini, P.T., Volkwein, J.F., & Peterson, G.D. (2006). The changing face of engineering education. The Bridge, 36(2), 5-13.
Lazarus, R. S., & Folkman, S. (1984). Coping. In W.D. Gentry (Ed.), Handbook of behavioral medicine (pp. 11-21).
Leonard, W. H. (1991). A recipe for uncookbooking laboratory investigations. Journal of College Science Teaching, 21(2), 84-87.
Lingard, B. (2010). Policy borrowing, policy learning: Testing times in Australian schooling. Critical Studies in Education, 51(2), 129-147.
Lin, Y., & Chen, F. (2010). A stress coping style inventory of students at universities and colleges of technology. World Transactions on Engineering and Technology Education, 8(1), 67-72.
Linsenmeier, R., Kanter, D., Smith, H., Linsenmeier, K., & McKenna, A. (2008). Evaluation of a challenge-based human metabolism laboratory for undergraduates. Journal of Engineering Education, 97, 213-222.
Litzinger, T., Lattuca, L. R., Hadgraft, R., & Newstetter, W. (2011). Engineering education and the development of expertise. Journal of Engineering Education, 100(1), 123-150.
Lochhead, J., & Collura, J. (1981). A cure for cookbook laboratories. The Physics Teacher, 19(1), 46-50.
Lord, T., & Orkwiszewski, T. (2006). Moving from didactic to inquiry-based instruction in a science laboratory. American Biology Teacher, 68(6), 342-345.
Lynch, D.R., Russell, J.S., Evans, J.C., & Sutterer, K.G. (2009). Beyond the cognitive: The affective domain, values and the achievement of the vision. Journal of Professional Issues in Engineering Education & Practice, 135(1), 47-56.
Lyons, J., & Young, E. F. (2001). Developing a systems approach to engineering problem solving and design of experiments in a racecar-based laboratory course. Journal of Engineering Education, 90(1), 109-112.
Lyons, W., & Plisga, B. S. (Eds.). Standard handbook of petroleum & natural gas engineering (2nd ed.). Burlington, MA: Elsevier.
Ma, J., & Nickerson, J.V. (2006). Hands-on, simulated, and remote laboratories: A comparative literature review. ACM Computing Surveys, 38(3), 7.
MacCallum, R. C., Browne, M. W., & Sugawara, H. M. (1996). Power analysis and determination of sample size for covariance structure modeling. Psychological Methods, 1(2), 130.
Madigan, T. (1997). Science proficiency and course taking in high school: The relationship of course-taking patterns to increases in science proficiency between 8th and 12th grades. Washington, DC: U.S. Department of Education.
Marek, E. A., Cavallo, A. M. L., & Renner, J. W. (1997). The learning cycle: Elementary school science and beyond. Portsmouth, NH: Heinemann.
Marsh, H.W. (1996). Positive and negative global self-esteem: A substantively meaningful distinction or artifactors? Journal of Personality and Social Psychology, 70(4), 810-819.
Marton, F., Hounsell, D., & Entwistle, N. (1997). The experience of learning: Implications for teaching and studying in higher education. Edinburgh: Scottish Academic Press.
Marton, F., & Säljö, R. (1976). On qualitative differences in learning: Outcome and process. British Journal of Educational Psychology, 46(1), 4-11.
Mason, C. L., & Kahle, J. B. (1989). Student attitudes toward science and science-related careers: A program designed to promote a stimulating gender-free learning environment. Journal of Research in Science Teaching, 26(1), 25-39.
Matz, R.L., Rothman, E.D., Krajcik, J.S., & Banaszak Holl, M.M. (2012). Concurrent enrollment in lecture and laboratory enhances student performance and retention. Journal of Research in Science Teaching, 49(5), 659-682.
Marzano, R. J., & Kendall, J. S. (2007). The new taxonomy of educational objectives. Corwin Press.
McCreary, C.L., Golde, M.F., & Koeske, R. (2006). Peer instruction in the general chemistry laboratory: Assessment of student learning. Journal of Chemical Education, 83(5), 804-810.
McCrae, R. R. (1987). Creativity, divergent thinking, and openness to experience. Journal of Personality and Social Psychology, 52(6), 1258.
McGinn, R.E. (2003). "Mind the gaps": An empirical approach to engineering ethics, 1997-2001. Science and Engineering Ethics, 9, 517-542.
McGrath, P. (1999). Findings from an educational support course for patients with leukemia. Cancer Practice, 7(4), 198-204.
McQuiggan, S. W., & Lester, J. C. (2006). Diagnosing self-efficacy in intelligent tutoring systems: An empirical study. In Intelligent tutoring systems (pp. 565-574). Berlin Heidelberg: Springer.
Miller, R. L., & Olds, B. M. (1994). A model curriculum for a capstone course in multidisciplinary engineering design. Journal of Engineering Education, 83(4), 311-316.
Mishra, P., Fahnoe, C., Henriksen, D., & the Deep-Play Research Group (2013). Creativity, self-directed learning, and the architecture of technology rich environments. TechTrends, 57(1), 10-13.
Modell, H. I., Michael, J. A., Adamson, T., Goldberg, J., Horwitz, B. A., Bruce, D. S., & Williams, S. (2000). Helping undergraduates repair faulty mental models in the student laboratory. Advances in Physiology Education, 23, 82-90.
Morrow, D. G., Ridolfo, H. E., Menard, W. E., Sanborn, A., Stine-Morrow, E. A., Magnor, C., ... & Bryant, D. (2003). Environmental support promotes expertise-based mitigation of age differences on pilot communication tasks. Psychology and Aging, 18(2), 268.
Mundfrom, D.J., Shaw, D.G., & Ke, T.L. (2009). Minimum sample size recommendations for conducting factor analyses. International Journal of Testing, 5(2), 159-168.
Munson-McGee, S. H. (2000). An introductory ChE laboratory incorporating EC 2000 criteria. Chemical Engineering Education, 34(1), 80-89.
Nickerson, R.S. (1999). Enhancing creativity. In R.J. Sternberg (Ed.), Handbook of creativity (pp. 392-430). UK: Cambridge University Press.
Norman, G., & Schmidt, H. (2000). Effectiveness of problem-based learning curricula: Theory, practice and paper darts. Medical Education, 34, 721-728.
National Research Council. (1996). National science education standards. Washington, DC: National Academy Press. Retrieved March 19, 2011 from http://www.nap.edu/readingroom/books/nses/
Nevitt, J., & Hancock, G. R. (2001). Performance of bootstrapping approaches to model test statistics and parameter standard error estimation in structural equation modeling. Structural Equation Modeling, 8(3), 353-377.
Newson, R. (2006). Confidence intervals for rank statistics: Somers' D and extensions. Stata Journal, 6(3), 309.
Nunnally, J. C. (1978). Psychometric theory (2nd ed.). New York: McGraw-Hill.
Nunnally, J.C., & Bernstein, I.H. (1994). Psychometric theory (3rd ed.). New York: McGraw-Hill.
O'Connor, B.P. (2000). SPSS and SAS programs for determining the number of components using parallel analysis and Velicer's MAP test. Behavior Research Methods, Instruments, & Computers, 32, 396-402.
Ohland, M. W., Zhang, G., Brawner, C. E., & Miller III, T. K. (2003). A longitudinal study of retention and grade performance of participants in an engineering entrepreneurs program. In Proceedings of the American Society for Engineering Education, Nashville, Tennessee.
Olson, S., & Loucks-Horsley, S. (Eds.). (2000). Inquiry and the National Science Education Standards: A guide for teaching and learning. Washington, DC: National Academy Press.
Osborne, J., & Waters, E. (2002). Four assumptions of multiple regression that researchers should always test. Practical Assessment, Research & Evaluation, 8(2). Retrieved March 20, 2014 from http://PAREonline.net/getvn.asp?v=8&n=2
Pedhazur, E. J., & Schmelkin, L.P. (1991). Measurement, design, and analysis: An integrated approach. New York: Taylor & Francis Group.
Philbin, M., Meier, E., Huffman, S., & Boverie, P. (1995). A survey of gender and learning styles. Sex Roles, 32(7-8), 485-494.
Prados, J.W., Peterson, G.D., & Lattuca, L.R. (2005). Quality assurance of engineering education through accreditation: The impact of Engineering Criteria 2000 and its global influence. Journal of Engineering Education, 94(1), 165-184.
Prince, M., & Felder, R. (2007). The many faces of inductive teaching and learning. Journal of College Science Teaching, 36(5), 14.
Prince, M. J., Vigeant, M. A., & Nottis, K. (2009). A preliminary study on the effectiveness of inquiry-based activities for addressing misconceptions of undergraduate engineering students. Education for Chemical Engineers, 4(2), 29-41.
Prins, F. J., Nadolski, R. J., Berlanga, A. J., Drachsler, H., Hummel, H. G., & Koper, R. (2008). Competence description for personal recommendations: The importance of identifying the complexity of learning and performance situations. Journal of Educational Technology & Society, 11(3), 141-152.
Prosser, M., & Trigwell, K. (1998). Teaching for learning in higher education. Buckingham: Open University.
Prosser, M., & Trigwell, K. (1999). Understanding learning and teaching: The experience in higher education. Milton Keynes: SRHE/Open University Press.
Prosser, M., Trigwell, K., Hazel, E., & Gallagher, P. (1994). Students' experiences of teaching and learning at the topic level. Research and Development in Higher Education, 16, 305-310.
Pushkin, D. (1997). Where do ideas for students come from? Journal of College Science Teaching, 26, 238-242.
Quintana, C., Reiser, B. J., Davis, E. A., Krajcik, J., Fretz, E., & Duncan, R. G. (2004). A scaffolding design framework for software to support science inquiry. Journal of the Learning Sciences, 13, 337-386.
Quinn, K. A., & Albano, L. D. (2008). Problem-based learning in structural engineering education. Journal of Professional Issues in Engineering Education and Practice, 134(4), 329-334.
Ramsden, P. (1992). Learning to teach in higher education. New York: Routledge.
Ramsden, P., & Moses, I. (1992). Associations between research and teaching in Australian higher education. Higher Education, 23(3), 273-295.
Rao, N., & Sachs, J. (1999). Confirmatory factor analysis of the Chinese version of the Motivated Strategies for Learning Questionnaire. Educational and Psychological Measurement, 59(6), 1016-1029.
Rockwell, S. K., & Kohn, H. (1989). Post-then-pre evaluation. Journal of Extension, 27(2), 19-21.
Runco, M. A., & Acar, S. (2012). Divergent thinking as an indicator of creative potential. Creativity Research Journal, 24(1), 66-75.
Schaffer, S. P., Chen, X., Zhu, X., & Oakes, W. C. (2012). Self-efficacy for cross-disciplinary learning in project-based teams. Journal of Engineering Education, 101(1), 82-94.
Seago, J. L. (1992). The role of research in undergraduate instruction. The American Biology Teacher, 401-405.
Secker, C. V. (2002). Effects of inquiry-based teacher practices on science excellence and equity. The Journal of Educational Research, 95(3), 151-160.
Seel, R. (2000). Culture and complexity: New insights on organizational change. Organizations and People, 7(2), 2-9.
Seymour, E., Hunter, A. B., Laursen, S. L., & DeAntoni, T. (2004). Establishing the benefits of research experiences for undergraduates in the sciences: First findings from a three-year study. Science Education, 88(4), 493-534.
Shadish, W.R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Wadsworth Cengage Learning.
Sharma, S. (1996). Factor analysis. In Applied multivariate techniques (pp. 90-130). New York: Wiley.
Sharp, D.M.M., & Primrose, C.S. (2003). The 'virtual family': An evaluation of an innovative approach using problem-based learning to integrate curriculum themes in a nursing undergraduate program. Nurse Education Today, 23, 219-225.
Sheppard, S., & Jenison, R. (1997). Freshmen engineering design experiences: An organizational framework. International Journal of Engineering Education, 13(3), 190-197.
Sheppard, S.D., Macatangay, K., Colby, A., & Sullivan, W.M. (2009). Educating engineers: Designing for the future of the field. San Francisco: Jossey-Bass.
Shepherd, D. A., Patzelt, H., & Wolfe, M. (2011). Moving forward from project failure: Negative emotions, affective commitment, and learning from the experience. Academy of Management Journal, 54(6), 1229-1259.
Shiland, T. W. (1997). Decookbook it! Science and Children, 35(3), 14-18.
Short, K., Harste, J., & Burke, C. (1996). Creating classrooms for authors and inquirers. Portsmouth, NH: Heinemann.
Siekpe, J.S. (2005). An examination of the multidimensionality of flow construct in a computer-mediated environment. Journal of Electronic Commerce Research, 6(1), 31-43.
Smith, K.A. (2000). Inquiry in large classes. In Sigma Xi Conference Proceedings: Reshaping undergraduate science and engineering education: Tools for better learning (pp. 53-64).
Smith, K.A. (2002). Inquiry and cooperative learning in the laboratory. Invited paper for the ABET/Sloan project to examine distance learning in the practice-oriented professions. In Proceedings of the ABET/Sloan Conference on Distance Learning in the Practice Oriented Professions.
Smith, K. A., Sheppard, S. D., Johnson, D. W., & Johnson, R. T. (2005). Pedagogies of engagement: Classroom-based practices. Journal of Engineering Education, 94(1), 87-101.
Stark, J. S., & Lattuca, L. R. (1997). Shaping the college curriculum: Academic plans in action. Boston: Allyn and Bacon.
StataCorp. (2013). Stata user's guide: Release 13. College Station, TX: StataCorp LP.
Steiger, J. H. (2007). Understanding the limitations of global fit assessment in structural equation modeling. Personality and Individual Differences, 42, 893-898.
Stevens, J. (1996). Exploratory and confirmatory factor analysis. In Applied multivariate statistics for the social sciences (pp. 362-428).
Tabachnick, B. G., & Fidell, L. S. (2001). Using multivariate statistics (4th ed.). Needham Heights, MA: Allyn & Bacon.
Tagg, J. (2003). The learning paradigm college. Bolton, MA: Anker Publishing Company.
Tal, T., Krajcik, J. S., & Blumenfeld, P. C. (2006). Urban schools' teachers enacting project-based science. Journal of Research in Science Teaching, 43(7), 722-745.
Tang, C. (1994). Effects of modes of assessment on students' preparation strategies. In G. Gibbs (Ed.), Improving student learning: Theory and practice (pp. 151-170). The Oxford Centre for Staff Development.
Tippin, G. K., Lafreniere, K. D., & Page, S. (2012). Student perception of academic grading: Personality, academic orientation, and effort. Active Learning in Higher Education, 13(1), 51-61.
Todd, R. H., Magleby, S. P., Sorensen, C. D., Swan, B. R., & Anthony, D. K. (1995). A survey of capstone engineering courses in North America. Journal of Engineering Education, 84(2), 165-174.
Trigwell, K., & Prosser, M. (1991). Improving the quality of student learning: The influence of learning context and student approaches to learning on learning outcomes. Higher Education, 22(3), 251-266.
VanAntwerp, J. J., VanAntwerp, J. G., Vander, G. D., & Wentzheimer, W. W. (2004). Chemistry and materials science for all engineering disciplines: A novel interdisciplinary team-teaching approach. In Proceedings of the 2004 American Society for Engineering Education Annual Conference & Exposition, Salt Lake City, UT.
Van Prooijen, J. W., & Van Der Kloot, W. A. (2001). Confirmatory analysis of exploratively obtained factor structures. Educational and Psychological Measurement, 61(5), 777-792.
Vernon, D. T. A., & Blake, R. L. (1993). Does problem-based learning work? A meta-analysis of evaluative research. Academic Medicine, 68, 550-563.
Walker, J. T., & Lofton, S. P. (2003). Effect of a problem based learning curriculum on students' perceptions of self directed learning. Issues in Educational Research, 13(2), 71-100.
Walton, S. P., Briedis, D., Urban-Lurain, M., Hinds, T. J., Davis-King, C., & Wolff, T. F. (2013). Building the whole engineer: An integrated academic and co-curricular first-year experience. In Proceedings of the 120th American Society for Engineering Education Annual Conference & Exposition, Paper ID #7410.
Watai, L. L., Brodersen, A. J., & Brophy, S. P. (2007, October). Designing effective laboratory courses in electrical engineering: Challenge-based model that reflects engineering process. In Proceedings of the 37th Annual Frontiers in Education Conference: Global Engineering: Knowledge Without Borders, Opportunities Without Passports (FIE '07), pp. F2C-7. IEEE.
Watson, K. (2009). Change in engineering education: Where does research fit? Journal of Engineering Education, 98(1), 3-4.
Wheaton, B., Muthén, B., Alwin, D. F., & Summers, G. F. (1977). Assessing reliability and stability in panel models. In D. R. Heise (Ed.), Sociological Methodology, 8, 84-136.
Wigfield, A., Tonks, S., & Eccles, J. S. (2004). Expectancy value theory in cross-cultural perspective. In Big theories revisited (pp. 165-198).
Wilcoxon, F. (1945). Individual comparisons by ranking methods. Biometrics Bulletin, 1(6), 80-83.
Williams, B., Onsman, A., & Brown, T. (2010). Exploratory factor analysis: A five-step guide for novices. Journal of Emergency Primary Health Care, 8(3), 1-13.
Wilpert, B. (2008). Psychology and human factors engineering. Cognition, Technology & Work, 10(1), 15-21.
Wood, A. M., White, I. R., & Thompson, S. G. (2004). Are missing outcome data adequately handled? A review of published randomized controlled trials in major medical journals. Clinical Trials, 1(4), 368-376.
Wright, M. C., Phillips-Bute, B. G., Petrusa, E. R., Griffin, K. L., Hobbs, G. W., & Taekman, J. M. (2009). Assessing teamwork in medical education and practice: Relating behavioural teamwork ratings and clinical performance. Medical Teacher, 31(1), 30-38.
Wuersig, K. (2007). Should engineering freshmen have an engineering laboratory in the first semester? In Proceedings of the International Conference on Engineering Education, Coimbra, Portugal.
Yin, R. K. (2009). Case study research: Design and methods. Thousand Oaks, CA: Sage Publications.
Yoder, B. L. (2011). Engineering by the numbers. Washington, DC: American Society for Engineering Education.
Zwick, W. R., & Velicer, W. F. (1986). Comparison of five rules for determining the number of components to retain. Psychological Bulletin, 99(3), 432-442.