INVESTIGATING STUDENT PERCEPTIONS OF THINKING AND LEARNING IN CHEMISTRY VIA A CULTURAL FRAMEWORK

By

Ryan Scott Bowen

A DISSERTATION

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

Chemistry – Doctor of Philosophy

2022

ABSTRACT

Various studies have found that transformed curricula in chemistry encourage students to reason more effectively and to retain these abilities longer. This has been particularly salient in transformed courses using three-dimensional learning, such as Chemistry, Life, the Universe and Everything (CLUE) and its organic chemistry counterpart, Organic Chemistry, Life, the Universe and Everything (OCLUE). Previous research within the context of these courses, as well as more traditional course experiences, primarily focused on how students used causal mechanistic reasoning or models while engaging with learning tasks. That is, CLUE and OCLUE had been characterized according to student reasoning abilities on prompts, but little work had been done to characterize the student experience of and perspective on CLUE and OCLUE. The work presented within sought to address this gap by employing open-ended questions about student perceptions of their classrooms to minimize prompting in responses. Student responses to these questions were then analyzed using qualitative methods. The primary goals were to characterize student perceptions of transformational intent and the elements of classroom cultures that socialize students into particular ways of doing and thinking. The qualitative methods employed were supported by theoretical frameworks that acknowledge the salience and importance of culture, including sociocultural theory/perspectives, Reinholz and Apkarian's four frames of culture, and Schein and Schein's framework of organizational culture. Using inductive thematic analysis, grounded theory, and narrative analysis, this work arrived at four main conclusions: 1) students in courses designed with three-dimensional learning perceive that the use of knowledge is expected and assessed; 2) the perception questions we used to collect data could be easily and reliably incorporated into other courses; 3) students in organic chemistry seem to perceive "critical thinking" to entail the use of knowledge, clarifying an amorphous, yet ubiquitous, term in science education; and 4) students' perceptions of thinking and learning are influenced by their previous experiences. These conclusions are all related via a sociocultural perspective on teaching, learning, and thinking in chemistry education. Aside from characterizing the student experience in the context of the transformed courses at Michigan State University (MSU), an underlying goal was to highlight how the frame of learning cultures can act as connective tissue between diverse, yet robust, studies that holistically characterize learning environments and the interactions within them. Of course, more work needs to be done, but it is my belief that student perceptions, viewed through such a social and cultural lens, can offer us invaluable insights into our course design, enactment, and assessment, as well as the factors that influence student thinking and learning more broadly.

This dissertation is dedicated to the thousands of students who are pushed out of STEM majors and courses against their will by persistent and embedded systemic inequities in science education.
I further dedicate this work to my family, friends, and colleagues who supported me throughout this process and understood that, for me, the 12- or 14-hour workdays were about more than just "finishing" graduate school; they were about helping students and giving them a voice in this educational enterprise.

ACKNOWLEDGEMENTS

The longest part of this dissertation could easily be the acknowledgments section. I would like to begin by thanking Dr. Melanie Cooper for her advice and guidance throughout my graduate training. She managed to convince a poor country boy like myself that he could be an expert in something. I would like to acknowledge and thank my guidance committee, Dr. Lynmarie Posey, Dr. Amelia Gotwals, and Dr. Heedeok Hong, for their guidance, questions, and feedback. I would be remiss not to thank and acknowledge Dr. Aishling (Ash) Flaherty at the University of Limerick, Ireland. Dr. Flaherty was an invaluable collaborator on my starting project at MSU, which turned into a published paper (Chapter IV) and a research agenda that blossomed into this very dissertation. I would also like to acknowledge and thank the Cooper Group at MSU at the time of writing this dissertation, which included: Dr. Keenan Noyes, (soon to be Dr.) Samantha Houchlei, Clare Carlson, Kriti Seth, Sewwandi Abeywardana, Veeda Scammahorn, Jacob Starkie, Dr. Paul Bergeron, Dr. Paul Nelson, and Dr. Liz Day (now at the University of Texas – El Paso). The support and feedback you all provided was truly a blessing. Last, but certainly not least, I would like to make a land acknowledgment. I acknowledge that Michigan State University (MSU) occupies lands that belong to the Anishinaabeg – Three Fires Confederacy of Ojibwe, Odawa, and Potawatomi Indigenous peoples. I further recognize and advocate for the sovereignty of all twelve federally recognized Indigenous nations located within modern-day Michigan. All of these peoples were forcibly removed from their homes and should be acknowledged as part of our history. By offering this acknowledgement, my aim is to voice my support and affirmation of the sovereignty and protections of these Indigenous peoples, and I will continue to work to hold MSU accountable to their needs.

TABLE OF CONTENTS

CHAPTER I: INTRODUCTION
REFERENCES
CHAPTER II: THEORETICAL FRAMEWORKS
REFERENCES
CHAPTER III: LITERATURE REVIEW
REFERENCES
CHAPTER IV: INVESTIGATING STUDENT PERCEPTIONS OF TRANSFORMATIONAL INTENT AND CLASSROOM CULTURE IN ORGANIC CHEMISTRY COURSES
REFERENCES
APPENDIX
CHAPTER V: INVESTIGATING STUDENT PERCEPTIONS OF TRANSFORMATIONAL INTENT AND CLASSROOM CULTURE IN GENERAL CHEMISTRY COURSES
REFERENCES
APPENDIX
CHAPTER VI: EXPLORATORY STUDIES ON STUDENT PERCEPTIONS OF ORGANIC CHEMISTRY CLASSROOM CULTURES
REFERENCES
CHAPTER VII: STUDENT PERCEPTIONS OF "CRITICAL THINKING": INSIGHTS INTO CLARIFYING AN AMORPHOUS CONSTRUCT
REFERENCES
APPENDIX
CHAPTER VIII: CONCLUSIONS, IMPLICATIONS, AND FUTURE WORK
REFERENCES

CHAPTER I: INTRODUCTION

Over the years, research on how people learn has highlighted that there are many factors that influence student learning in undergraduate science courses (National Academies of Sciences, Engineering, and Medicine, 2018; National Research Council, 2000). These factors can be further associated with the overarching cultures of learning, where implicit and explicit messages about how to think, talk, and act socialize and influence people to interact with the environment and content in specific ways. These cultures span a spectrum from the micro-level, such as the cultures of specific classrooms, departments, and research groups, to the macro-level, such as the cultures of entire nations or larger organizations (National Academies of Sciences, Engineering, and Medicine, 2018; Schein & Schein, 2016; Thoman et al., 2017). Even at the micro-level of an individual classroom, culture can be a complex and highly dynamic entity with many moving parts. For example, cultures consist of highly visible practices and ways of doing, referred to as structures or artifacts; however, they are also heavily informed by underlying beliefs, values, and assumptions, known as symbols, which tend to be less visible and more implicit (Reinholz & Apkarian, 2018; Schein & Schein, 2016). Furthermore, structures/artifacts and symbols are mediated, enacted, and interpreted by the people within the context, who have their own personal needs and goals, and by the power dynamics that attempt to direct, sustain, and control the culture and its members. Despite this complexity, by considering these frames of structures/artifacts, symbols, people, and power, researchers can engage in systematic and productive cultural analyses. The work presented in this dissertation focuses exclusively on the micro-level learning cultures of classrooms within chemistry and biology courses at Michigan State University. Throughout my work, I utilized student perceptions as a mechanism to explore these learning cultures, often employing open-ended, qualitative questionnaires and interview practices.
This work was heavily informed by a variety of cultural frameworks including sociocultural theory (John-Steiner & Mahn, 1996; Vygotsky, 1978), Reinholz and Apkarian's four frames of culture (Reinholz & Apkarian, 2018), and Schein's framework for organizational culture (Schein & Schein, 2016). Though these frameworks will be explained in more detail in the Theoretical Frameworks section, it is important to note that throughout all of the studies there is a core narrative of using student perceptions to speak to how the overarching cultures of classrooms influenced student perceptions of what was valued, how they should practice, and ultimately, how they should learn or think (and whether they would learn in the first place). Since this work focuses on interpreting learning cultures and student perceptions, I find it important to start with a positionality statement to highlight my own biases, perspective, and assumptions, and to clarify my personal experiences with culture in science education classrooms.

Positionality Statement

I am originally from rural Tennessee in the United States where I grew up in abject poverty. In college, I was a first-generation college student, and throughout my education I interfaced with a system that actively denied access to resources and opportunities due to my socioeconomic and first-generation status. However, I was able to overcome some of these obstacles, at least partially, due to my identity as a white, cisgender male, which afforded me power and privilege. Though many professionals in chemistry look like me, my experiences related to my upbringing and socioeconomic status have sensitized me to issues of science classroom culture and how instructors support (or do not support) students, particularly those from marginalized backgrounds. Despite my areas of marginalization (and due to my areas of privilege), I was able to go on and receive an education at two private universities. I currently hold a bachelor's degree in biochemistry, a master's in biophysical chemistry, and a master's in education from two separate private institutions. Prior to arriving at Michigan State, I had experience as a graduate research and teaching assistant, a graduate student teaching professional development coordinator and consultant, a middle and high school chemistry teacher, an adjunct professor in chemistry and biochemistry, and a full-time lecturer in chemistry and biochemistry. These experiences have exposed me to the overarching cultures of chemistry classrooms and research environments across multiple universities. Upon arriving at Michigan State, I was familiar with the work being done in chemistry education and the transformation efforts with three-dimensional learning. I believe three-dimensional learning to be a powerful transformative approach to science education; therefore, I always try to be careful that my work does not embody confirmation bias when I work with three-dimensional courses (as I do within this dissertation). I joined the team for this project due to my interest in the research goals and the methods being used. In my research, I have always preferred to use qualitative research methods, particularly open-ended, inductive coding approaches. My ultimate interests in research seek to continue this line of inquiry where I explore and characterize chemistry (and other science) course cultures to aid in the transformation of science courses and departments at universities.
I plan to continue to use qualitative methods coupled with historiographical approaches and critical theories to further understand the student experience and address issues pertinent to diversity, equity, inclusion, and persistence.

Situating My Work and Integrating Cognitive and Affective Domains of Learning

Learning takes on a variety of definitions, yet it is generally agreed to involve three "domains": cognitive, affective, and psychomotor, as shown in Figure 1.1 (Bloom et al., 1956; Krathwohl et al., 1965). The cognitive domain is focused on knowledge development and strategies such as reasoning, understanding, or the structure of knowledge. The affective domain, on the other hand, considers facets such as beliefs, values, attitudes, perceptions, motivation, and others, all of which are known as affective constructs. Finally, the psychomotor domain is associated with physical and manual skills or reflexes. All three of these domains therefore contribute to an overall, holistic learning experience for students.

Figure 1.1. The three domains of learning

While all domains of learning are important, the most commonly considered within the context of classroom learning are the cognitive and affective domains. Largely starting in the 1960s, there was a major focus on the cognitive domain involving studies of expert and novice chess players (Bramer, 1982; Charness, 1992; Chase & Simon, 1973; De Groot, 1965; Sala et al., 2017). Though cognitive science and chess may seem unrelated, the game of chess was referred to as cognitive science's Drosophila or E. coli; that is, it was considered the model system under which cognitive events could be studied (Larkin et al., 1980). Since then, however, various studies have been published characterizing expert- and novice-like knowledge structures and thinking in disciplinary courses, such as physics (Bunce et al., 1991; Chi et al., 1981; Hammer, 2000; Hardiman et al., 1989; Mason & Singh, 2011; Medin, 1989; Veldhuis, 1990; Walsh et al., 2007). Much of the early work on the cognitive domain was eventually synthesized into the consensus document How People Learn: Brain, Mind, Experience, and School (National Research Council, 2000). Within How People Learn, the research on expert-like knowledge had established a few major findings, including: 1) experts have highly organized and complex networks of knowledge; 2) experts are able to transfer knowledge and approaches from one situation to another more easily than novices; 3) experts notice implicit features of problems that novices often do not; 4) experts tend to have efficient retrieval mechanisms; and 5) experts are able to effectively integrate new knowledge into existing knowledge structures. Since then, various studies have sought to help students develop more expert-like dispositions when learning (Connor et al., 2021; Lenzer et al., 2020). Recently, the National Research Council revisited the literature on learning and published How People Learn II: Learners, Contexts, and Cultures (National Academies of Sciences, Engineering, and Medicine, 2018). Within this new consensus document, it was found that what was true regarding the cognitive domain of learning in 2000, when How People Learn was published, remained true in 2018 with the publication of How People Learn II.
However, the second book also considered the growing body of research on the affective domain of learning and asserted that the cognitive and affective domains are inseparable (Dewey, 1916; Immordino-Yang & Damasio, 2007; National Academies of Sciences, Engineering, and Medicine, 2018; National Research Council, 2012; Tomasello et al., 2005). That is, what a student believes, their attitude or disposition toward facets of a context, their perception of a classroom culture, and their perception of sense of belonging, among others, ultimately influence the student's cognitive functions. This sentiment was reiterated by the National Research Council in their consensus document Discipline-Based Education Research when they suggested that "researchers and instructors should not consider cognitive and affective development apart from each other," (National Research Council, 2012). At Michigan State University, the Cooper Research Group (of which I was a member) in chemistry education had primarily focused on the cognitive domain of learning in attempts to characterize transformed curricula and their impacts on student learning and reasoning. For example, various studies from my colleagues were published detailing students' reasoning abilities in the context of chemistry, particularly causal mechanistic reasoning (Becker et al., 2016; Crandell et al., 2019, 2020; Houchlei et al., 2021; Noyes & Cooper, 2019). While studies within the affective domain have certainly been published in chemistry education (Flaherty, 2020a), we had not focused on characterizing our transformative efforts in this domain or sought to support the cognitive research with affective work. To address the affective dimension, we opted to use student perceptions to explore their understanding of expectations and valued ways of doing and thinking in chemistry and biology courses at MSU. Since we had previously established an evidence base for what students were doing within their chemistry courses, we thought it would be productive to complement this work with a better understanding of what students perceived they were doing.

Significance of Perceptions in Research and Using the Frame of Learning Cultures

Perceptions, particularly patterns across different perspectives, offer valuable insight into nuanced and less salient features of a classroom. In the context of these studies, student perceptions of what to do and what is valued in their science courses are influenced by a variety of factors, including what instructors say and do, what other students say and do, the overall course structure, course assignments, course policies, and course assessments, among others (see Figure 1.2). That is, student perceptions act as an "intersection" of these factors, ultimately giving us insight into students' experiences, the practices they are using, their understanding of course expectations, and the interpretations they make about course policies.

Figure 1.2. Various factors that influence student perceptions of what to do and what is valued (Note: this is not an exhaustive representation)

When perceptions are used in conjunction with open-ended questioning, researchers can gain insight into many of the factors that are related to the overarching culture, such as the messages students receive from instructors and other students and their subsequent interpretation, practices (ways of doing), language (ways of talking), ways of thinking, values, expectations, policies, and many others.
Therefore, using learning cultures as a frame through which to study student perceptions of their courses acts as a powerful way to explore contexts and their influence on student learning. Of course, when one considers studying something as encompassing and dynamic as culture, it becomes important to establish a working definition. Certainly, there is no universal definition, and "no one view of culture… represents a thorough and complete understanding," (Parsons & Carlone, 2013). However, throughout this dissertation, when I refer to the word culture or a learning culture, I am referring to the micro-level cultures of courses. This view includes a constellation of structures/artifacts (policies, practices, etc.), symbols (beliefs, values, assumptions, etc.), and socializing mechanisms that are mediated by people and power and that directly and indirectly impact how people talk, act, and think (Calabrese Barton et al., 2008; Deng et al., 2021; Gutiérrez & Rogoff, 2003; Lemke, 2001; Miller & Goodnow, 1995; Nasir & Hand, 2006; Reinholz & Apkarian, 2018; Rogoff, 1990; Schein & Schein, 2016; Thoman et al., 2017). This working definition is informed by several cultural frameworks, which will be outlined in the theoretical framework section. Though my work never explored the explicit role of culture on cognitive outcomes, my working definition recognizes that culture has a strong influence on how people learn, as Bruner notes: "For you cannot understand mental activity unless you take into account the cultural setting and its resources, the very things that give mind its shape and scope," (Bruner, 1996).

Study Goals and Research Questions

Introducing the Studies

As stated throughout the introduction, the goal for the studies was to explore and investigate the cultures of learning in chemistry and biology courses. In Chapters IV, VI, and VII, comparative studies between courses with different pedagogical underpinnings were conducted using student perceptions and cultural frameworks. These comparisons helped highlight the differences in learning cultures that ultimately encouraged students to perceive different expectations and experiences. In describing the courses explored, the terms transformed and traditional will be used. Traditional courses are typically didactic, lecture-based courses with minimal interaction between peers and instructor, and they tend to follow the order of topics in a textbook. Transformed courses, on the other hand, are typically informed by the educational literature in some way, and they tend to be less lecture-based and more interactive. The transformed courses in these studies used three-dimensional learning, which will be described in more detail in the literature review. While the majority of the work takes place in organic chemistry courses, Chapters V and VI detail a series of pilot studies we conducted in other courses. Throughout the studies on student perceptions of what they were expected to do in their courses, my collaborators and I consistently had responses where students talked about engaging in "critical thinking" without much context or explanation as to what this meant for them. Therefore, I conceptualized the study presented in Chapter VII, which used sociocultural frameworks to explore student perceptions of "critical thinking".
Finally, in conversations with students about their perceptions of courses and "critical thinking", I came across student perspectives where they discussed being turned "off" to learning in chemistry, which prompted me to conduct the case study presented in Chapter VIII. In the final study, using narrative analysis, I explore how one student, Virginia (a pseudonym), navigated moments of discouragement and encouragement in learning (within and outside of chemistry).

Chapter IV: Investigating Student Perceptions of Transformational Intent and Classroom Culture in Organic Chemistry Courses

This study was conducted in the second semester of an organic chemistry sequence (CEM 252) in Spring 2018 and asked students in transformed and traditional organic chemistry courses three open-ended questions about their perceptions of how they were expected to think, the most difficult aspects of the course, and how they were assessed. The study was framed with sociocultural theory, and the questionnaire acted as our data collection tool. The questionnaire was administered at the end of the semester, and we employed an inductive thematic approach for data analysis and noted interesting patterns in student perceptions between the two courses that spoke to the overarching learning cultures. The research questions for this study were as follows:

1. In what ways do student perceptions of valued ways of doing and thinking align with the transformational intent?
2. How do elements of the course culture impact student perceptions of what is valued?

Chapter V: Investigating Student Perceptions of Transformational Intent and Classroom Culture in General Chemistry Courses

After the organic chemistry study, we conducted pilot studies in general chemistry I and II (CEM 141 and 142) and introductory biology (BS 161) at MSU. The pilots began with two Summer 2020 studies in general chemistry I (CEM 141) and general chemistry II (CEM 142) where we changed our open-ended questions to see if we could improve the clarity of responses. In Fall 2020, we used modified open-ended questions for the questionnaire and piloted them in general chemistry I (CEM 141) and cell and molecular biology (BS 161) to yield our largest data set, totaling over 14,000 responses across all questions. For the CEM 141 group, the questions were asked twice, once in the middle of the semester and once at the end. The BS 161 group only received the questions at the end. We continued to apply our cultural lens and used the inductive thematic approach we had used for the Spring 2018 CEM 252 study. Furthermore, by doing these pilots, we wanted to see if the questions could pick up on differences in other courses to further validate the questionnaire; however, it is important to note these studies were not comparative due to the lack of a relevant comparison group at MSU for these courses. Work with machine learning algorithms to automatically code the data will also be mentioned briefly (a schematic sketch of this kind of automated coding follows the research questions below). The research questions for these studies were as follows:

1. Can the perception questions be used in other courses reliably?
2. In what ways do student perceptions of valued ways of doing and thinking align with the transformational intent?
3. How do elements of the course culture impact student perceptions of what is valued?
4. How did student perceptions change from the middle of the semester to the end of the semester in CEM 141?
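To make concrete what automated coding of open-ended responses can involve, below is a minimal sketch of one common supervised approach, written in Python with scikit-learn: a text classifier is trained on a sample of human-coded responses and then used to code the remainder. This sketch is purely illustrative; the example responses and category labels are invented for illustration and do not represent the AACR models or our actual codebook.

    # A minimal, hypothetical sketch of supervised automated coding of
    # open-ended responses: train a text classifier on human-coded data,
    # check agreement, then code the rest. Labels and responses below are
    # invented; this is not the AACR system or our codebook.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import Pipeline

    # Toy human-coded training data (two examples per category).
    responses = [
        "We had to use what we learned to explain why reactions happen.",
        "Drawing mechanisms to predict the products of new reactions.",
        "Mostly memorize the reactions and do lots of practice exams.",
        "Memorizing pKa values and named reactions for the test.",
        "Just study hard and go over the lecture slides.",
        "Keep up with the homework and you will be fine.",
    ]
    codes = [
        "use_of_knowledge", "use_of_knowledge",
        "memorization", "memorization",
        "generalities", "generalities",
    ]

    # Bag-of-words (tf-idf) features feeding a linear classifier: a
    # common baseline for short constructed responses.
    model = Pipeline([
        ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
        ("clf", LogisticRegression(max_iter=1000)),
    ])

    # Estimate machine-human agreement with cross-validation before
    # trusting the model on responses no human has coded.
    scores = cross_val_score(model, responses, codes, cv=2)
    print(f"cross-validated accuracy: {scores.mean():.2f}")

    # Fit on all coded data, then code a new, unseen response.
    model.fit(responses, codes)
    print(model.predict(["We applied core ideas to explain new phenomena."]))

In practice, machine-human agreement on a held-out set (e.g., via Cohen's kappa) would be established before any machine-assigned codes were reported.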
Chapter VI: Additional Pilot Studies Investigating Student Perceptions of Organic Chemistry I and II Classroom Cultures

From the Fall 2020 CEM 141 data, we noted that the questions on the revised questionnaire were providing more detailed responses. Therefore, we opted to pilot the questions in transformed organic chemistry II (CEM 252; OCLUE) at the end of the Spring 2021 semester and in transformed and traditional organic chemistry I (CEM 251) at the end of the Fall 2021 semester. Through the frame of sociocultural theory and other cultural frameworks, such as organizational culture and Reinholz and Apkarian's four frames of culture, we employed an inductive thematic approach, and the responses were analyzed similarly to those in the previous studies. In this chapter, I will discuss my work with Automated Analysis of Constructed Responses (AACR) for the organic chemistry data and summarize my preliminary findings on the use of machine learning algorithms for automatic coding of our data sets. The research questions for these studies were as follows:

1. How does the shift to online instruction impact student perceptions in CEM 252 compared to in-person CEM 252?
2. How do OCLUE student perceptions in CEM 251 and CEM 252 compare?
3. What occurs if the questions are made less open-ended in an attempt to limit responses in the "Generalities" category?
4. In what ways do student perceptions of valued ways of doing and thinking align with the transformational intent?
5. How do elements of the course culture impact student perceptions of what is valued?

Chapter VII: Student Perceptions of "Critical Thinking": Insights into Clarifying an Amorphous Construct

This study is an outcome of the study in Chapter IV, where we had noted that many students in CEM 251 had perceived they were expected to engage in "critical thinking", yet many of these students did not explain what they meant, nor did they provide an example. The term "critical thinking" is also used by many instructors, yet its definition is taken for granted, and instructors often assume students know exactly what it means to think "critically". The literature has also not proved helpful, with no general agreement on what the term means or entails. The ambiguity in student responses from the first study made it difficult to determine how to code these responses. Therefore, to offer insights into what students perceive "critical thinking" to mean and entail, and to extend the literature, we conceptualized this study. Situated within cultural frameworks and using a constructivist grounded theory approach (Charmaz, 2006) with semi-structured interviews, we aimed to develop an analytical framework to help make sense of what students mean when they talk about "critical thinking", why they conceptualize the construct this way, and how they came to this understanding. The research questions for this study were as follows:

1. What are the commonalities across student perceptions of "critical thinking"?
2. What insights do student perceptions of "critical thinking" offer to help clarify the construct in instruction?

Summary

My hope for this introduction has been to weave threads of commonality between all of the studies contained herein. At the core of all of this work is a critical and curious eye toward learning cultures and their associated structures/artifacts and symbols.
My goal, then, is to illustrate how I have used this cultural lens to analyze classroom practices and expectations, the ideas, policies, and practices at work in broader science education, and the ways learning environments inform student perceptions of learning and engagement.

REFERENCES

1. Becker, N., Noyes, K., & Cooper, M. (2016). Characterizing Students' Mechanistic Reasoning about London Dispersion Forces. Journal of Chemical Education, 93(10), 1713–1724. https://doi.org/10.1021/acs.jchemed.6b00298
2. Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of Educational Objectives: The Classification of Educational Goals. David McKay Company.
3. Bramer, M. A. (1982). Pattern-based representations of knowledge in the game of chess. International Journal of Man-Machine Studies, 16, 439–448.
4. Bruner, J. S. (1996). The Culture of Education. Harvard University Press.
5. Bunce, D. M., Gabel, D. L., & Samuel, J. V. (1991). Enhancing chemistry problem-solving achievement using problem categorization. Journal of Research in Science Teaching, 28(6), 505–521.
6. Calabrese Barton, A., Tan, E., & Rivet, A. (2008). Creating Hybrid Spaces for Engaging School Science Among Urban Middle School Girls. American Educational Research Journal, 45(1), 68–103. https://doi.org/10.3102/0002831207308641
7. Charness, N. (1992). The impact of chess research on cognitive science. Psychological Research, 54, 4–9.
8. Chase, W. G., & Simon, H. A. (1973). Perception in chess. Cognitive Psychology, 4, 55–81.
9. Chi, M. T. H., Feltovich, P. J., & Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5(2), 121–152. https://doi.org/10.1207/s15516709cog0502_2
10. Connor, M. C., Glass, B. H., Finkenstaedt-Quinn, S. A., & Shultz, G. V. (2021). Developing Expertise in 1H NMR Spectral Interpretation. The Journal of Organic Chemistry, 86(2), 1385–1395. https://doi.org/10.1021/acs.joc.0c01398
11. Crandell, O. M., Kouyoumdjian, H., Underwood, S. M., & Cooper, M. M. (2019). Reasoning about Reactions in Organic Chemistry: Starting It in General Chemistry. Journal of Chemical Education, 96(2), 213–226. https://doi.org/10.1021/acs.jchemed.8b00784
12. Crandell, O. M., Lockhart, M. A., & Cooper, M. M. (2020). Arrows on the Page Are Not a Good Gauge: Evidence for the Importance of Causal Mechanistic Explanations about Nucleophilic Substitution in Organic Chemistry. Journal of Chemical Education, 97(2), 313–327. https://doi.org/10.1021/acs.jchemed.9b00815
13. De Groot, A. D. (1965). Thought and Choice in Chess. Mouton.
14. Deng, J. M., McMunn, L. E., Oakley, M. S., Dang, H. T., & Rodriguez, R. S. (2021). Toward Sustained Cultural Change through Chemistry Graduate Student Diversity, Equity, and Inclusion Communities. Journal of Chemical Education. https://doi.org/10.1021/acs.jchemed.1c00485
15. Dewey, J. (1916). Democracy and Education. Macmillan.
16. Flaherty, A. A. (2020a). A review of affective chemistry education research and its implications for future research. Chemistry Education Research and Practice, 21(3), 698–713. https://doi.org/10.1039/C9RP00200F
17. Gutiérrez, K. D., & Rogoff, B. (2003). Cultural ways of learning: Individual traits or repertoires of practice. Educational Researcher, 32(5), 19–25.
18. Hammer, D. (2000). Student resources for learning introductory physics. American Journal of Physics, 68(S1), S52–S59. https://doi.org/10.1119/1.19520
19. Hardiman, P. T., Dufresne, R., & Mestre, J. P. (1989). The relation between problem categorization and problem solving among experts and novices. Memory & Cognition, 17, 627–638.
20. Houchlei, S. K., Bloch, R. R., & Cooper, M. M. (2021). Mechanisms, Models, and Explanations: Analyzing the Mechanistic Paths Students Take to Reach a Product for Familiar and Unfamiliar Organic Reactions. Journal of Chemical Education, 98(9), 2751–2764. https://doi.org/10.1021/acs.jchemed.1c00099
21. Immordino-Yang, M. H., & Damasio, A. (2007). We Feel, Therefore We Learn: The Relevance of Affective and Social Neuroscience to Education. Mind, Brain, and Education, 1(1), 3–10. https://doi.org/10.36510/learnland.v5i1.535
22. John-Steiner, V., & Mahn, H. (1996). Sociocultural approaches to learning and development: A Vygotskian framework. Educational Psychologist, 31(3–4), 191–206. https://doi.org/10.1080/00461520.1996.9653266
23. Krathwohl, D. R., Bloom, B. S., & Masia, B. B. (1965). Taxonomy of Educational Objectives: Handbook II: Affective Domain. McKay Company, Inc.
24. Larkin, J., McDermott, J., Simon, D. P., & Simon, H. A. (1980). Expert and Novice Performance in Solving Physics Problems. Science, 208, 1335–1342.
25. Lemke, J. L. (2001). Articulating communities: Sociocultural perspectives on science education. Journal of Research in Science Teaching, 38(3), 296–316. https://doi.org/10.1002/1098-2736(200103)38:3<296::AID-TEA1007>3.0.CO;2-R
26. Lenzer, S., Smarsly, B., & Graulich, N. (2020). How do students become experts? An in-depth study on the development of domain-specific awareness in a materials chemistry course. International Journal of Science Education, 42(12), 2032–2054. https://doi.org/10.1080/09500693.2020.1810355
27. Mason, A., & Singh, C. (2011). Assessing expertise in introductory physics using categorization task. Physical Review Special Topics - Physics Education Research, 7, 020110.
28. Medin, D. L. (1989). Concepts and conceptual structure. American Psychologist, 44(12), 1469–1481.
29. Miller, P. J., & Goodnow, J. J. (1995). Cultural practices: Toward an integration of culture and development. New Directions for Child and Adolescent Development, 1995(67), 5–16. https://doi.org/10.1002/cd.23219956703
30. Nasir, N. S., & Hand, V. M. (2006). Exploring Sociocultural Perspectives on Race, Culture, and Learning. Review of Educational Research, 76(4), 449–475. https://doi.org/10.3102/00346543076004449
31. National Academies of Sciences, Engineering, and Medicine. (2018). How People Learn II: Learners, Contexts, and Cultures. The National Academies Press. https://doi.org/10.17226/24783
32. National Research Council. (2000). How People Learn: Brain, Mind, Experience, and School: Expanded Edition. National Academies Press. https://doi.org/10.17226/9853
33. National Research Council. (2012). Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering. The National Academies Press. https://doi.org/10.17226/13362
34. Noyes, K., & Cooper, M. M. (2019). Investigating Student Understanding of London Dispersion Forces: A Longitudinal Study. Journal of Chemical Education, 96(9), 1821–1832. https://doi.org/10.1021/acs.jchemed.9b0045
35. Parsons, E. C., & Carlone, H. B. (2013). Culture and science education in the 21st century: Extending and making the cultural box more inclusive. Journal of Research in Science Teaching, 50(1), 1–11. https://doi.org/10.1002/tea.21068
36. Reinholz, D. L., & Apkarian, N. (2018). Four frames for systemic change in STEM departments. International Journal of STEM Education, 5(1), 1–22. https://doi.org/10.1186/s40594-018-0103-x
37. Rogoff, B. (1990). Apprenticeship in Thinking: Cognitive Development in Social Context. Oxford University Press.
38. Sala, G., Foley, J. P., & Gobet, F. (2017). The Effects of Chess Instruction on Pupils' Cognitive and Academic Skills: State of the Art and Theoretical Challenges. Frontiers in Psychology, 8(238), 1–4.
39. Schein, E. H., & Schein, P. A. (2016). Organizational Culture and Leadership (5th ed.). Jossey-Bass.
40. Thoman, D. B., Muragishi, G. A., & Smith, J. L. (2017). Research Microcultures as Socialization Contexts for Underrepresented Science Students. Psychological Science, 28(6), 760–773. https://doi.org/10.1177/0956797617694865
41. Tomasello, M., Carpenter, M., Call, J., Behne, T., & Moll, H. (2005). Understanding and sharing intentions: The origins of cultural cognition. Behavioral and Brain Sciences, 28, 675–735.
42. Veldhuis, G. H. (1990). The use of cluster analysis in categorization of physics problems. Science Education, 74(1), 105–118.
43. Vygotsky, L. (1978). Mind in Society. Harvard University Press.
44. Walsh, L. N., Howard, R. G., & Bowe, B. (2007). Phenomenographic study of students' problem solving approaches in physics. Physical Review Special Topics - Physics Education Research, 3, 020108.

CHAPTER II: THEORETICAL FRAMEWORKS

Throughout this dissertation, a variety of theoretical frameworks informed my interest and perspectives. The first three theoretical frameworks I employed focused explicitly on culture and included: 1) sociocultural theory and perspectives; 2) Reinholz and Apkarian's four frames of culture; and 3) Schein's organizational culture framework. I will close out this section with a brief discussion of the use of theoretical frameworks with grounded theory studies, since Chapter VII is a study using constructivist grounded theory.

Sociocultural Theory and Perspectives

Sociocultural theory offers a robust theory for educational research and practice with widespread applicability. The theory has been used as a foundation for other theoretical frameworks in education that are concerned with the significance of the social context on learning. Broadly, sociocultural theory centers attention on the social, cultural, and historical factors that influence learning (John-Steiner & Mahn, 1996; Mantero, 2002; Vygotsky, 1978). While many early writings relevant to sociocultural theory pertain to child development, the theory is applicable at all levels of learning and development, and work has been done to demonstrate its use within college and university settings in chemistry education (Bowen et al., 2022; Petterson et al., 2022; Zotos et al., 2020). Aside from being able to address the overarching cultural factors of a learning environment, sociocultural theory offers a lens through which to understand the types of messages being conveyed and interpreted within a context. The theory also highlights the significance of "learning the discourses and social practices of scientific communities," (Mason, 2007), therefore offering a view of the cultural practices employed in learning environments and the extent to which students engage with those practices. The first aim of this literature review will be to detail sociocultural theory and demonstrate how it has been used as a foundation for supporting several other theoretical frameworks in education.
By comparing sociocultural theory to these other frameworks, the distinct characteristics of the theory can be better understood, which will allow me to demonstrate its applicability. The second aim of the review will be to provide examples of how sociocultural theory has been extended in the literature and in education to demonstrate the robust nature of this framework. Sociocultural theory and its approaches to learning and development are largely attributed to Lev Vygotsky and are based on his work in Russia in the 1920s and 1930s (John-Steiner & Mahn, 1996). Though Vygotsky introduced the theory, he did not get the chance to fully develop it due to an early death from tuberculosis. The slow adoption of his work within the U.S. was further exacerbated by the censorship of his work by the Communist Party of the Soviet Union (Cole & Scribner, 1978). However, according to Bruner (1986), despite the censorship of Vygotsky's work, it was still widely circulated within Russia. Although Vygotsky was considered a psychologist, he was deeply influenced by sociology, and he was credited with being one of the first psychologists to "suggest the mechanisms by which culture becomes a part of each person's nature," (Cole & Scribner, 1978). Sociocultural approaches to learning "are based on the concept that human activities take place in cultural contexts, are mediated by language and other symbol systems, and can be best understood when investigated in their historical development," (John-Steiner & Mahn, 1996). This definition communicates that learning is dependent on the environment, that messages sent via language and other symbol systems communicate cultural values and practices, and that human activity is best understood by investigating the process, or how the activity has changed. Although Vygotsky's work is considered within the constructivist school of thought more broadly, his attention to the social and cultural context set him apart from other constructivists at the time. Within constructivism, Vygotsky is largely considered to be a social constructivist and differed from cognitive constructivists such as Jean Piaget. For example, Vygotsky believed that due to constantly fluctuating historical conditions, human experience would constantly change. Piaget, on the other hand, ascribed to universal stages of development that were the same for all children depending on age, therefore placing less emphasis on sociohistoric facets (John-Steiner & Souberman, 1987). Furthermore, Vygotsky focused attention on sociocultural activities and interactions between individuals in a group and saw cognitive development as inseparable from the social context in which the learning occurs. In contrast, Piaget was more concerned with the individual, paying little attention to the effect of social aspects on development (though Piaget speculated that social aspects played some role) (Rogoff, 1990). Vygotsky also believed that learning preceded development, unlike Piaget, who believed that learning could only occur according to fixed developmental stages (Palincsar, 1998). The dichotomy between Vygotsky and Piaget illustrates that within sociocultural approaches "the unit of analysis is not the individual, but the situated collective activity constructed by individuals," (Mason, 2007).
Therefore, sociocultural theory is more socially oriented than other constructivist approaches and is concerned with how groups (not individuals) construct shared understandings and what knowledge is shared amongst group members (Alexander, 2007). Despite the differences between sociocultural theory and other constructivist approaches, scholars have conceptualized the relationship and integration of these theories (Mason, 2007). Scholars have discovered that most of Vygotsky's work converges on three themes that are relevant to sociocultural theory (Wertsch, 1991). These themes are as follows: 1) individual development, including higher mental functioning, originates from social sources; 2) human action is mediated by tools and signs; and 3) developmental analysis is the best tool to study the first two themes (John-Steiner & Mahn, 1996; Wertsch, 1991). By exploring these themes in depth, the ideas at the core of sociocultural theory are revealed. For the first theme, it should be noted that Vygotsky placed great importance on language and speech, particularly for the role they played in internalizing cultural practices: "...the most significant moment in the course of intellectual development, which gives birth to the purely human forms of practical and abstract intelligence, occurs when speech and practical activity, two previously independent lines of development, converge," (Vygotsky, 1978). Vygotsky believed that speech would transition from serving a strictly external function to an internal one: "Instead of appealing to the adult, children appeal to themselves; language thus takes on a [sic] intrapersonal function in addition to its interpersonal use," (Vygotsky, 1978). In Vygotsky's view, the cognitive and social aspects of learning were intertwined, constantly interacting, inseparable, and dependent on one another, and over time these mental processes would transform from being strictly for external purposes to being used for internal purposes as well (Brown et al., 1989; John-Steiner & Mahn, 1996; Rogoff, 1990; Vygotsky, 1978). Regarding the second theme, Vygotsky leveraged semiotics, the study of how signs develop and how those signs are used and interpreted. To Vygotsky, semiotic signs were "language; various systems of counting; mnemonic techniques; algebraic symbol systems; works of art; writing; schemes, diagrams, maps and mechanical drawings; all sorts of conventional signs and so on," (Vygotsky, 1978). Vygotsky believed that these semiotic signs sent messages to people about how to act in culturally influenced ways (Vygotsky, 1978). In relation to semiotics, scholars have further distinguished between what is classified as "tools" and "signs". "Tools" are considered outward-facing, and they are used to mediate the external, physical world. "Tools" carry no inherent meaning, and their sole purpose is to be used on the external world. "Tools" often include artifacts of the context such as language, notes, resources, and procedures, among others, as long as they carry no meaning to the tool user. However, "tools" can be negotiated into "signs" within the context. "Signs" are inward-facing, and they are used to mediate the mental world. "Signs" develop out of the process of interpretation and meaning-making. Furthermore, "signs" always reference something else outside of themselves. Examples of signs can include the examples used for tools if those tools have been interpreted and meaning ascribed (Mantero, 2002).
As Mantero (2002) notes in their example of tools and signs with language: "Words, and even language itself, is devoid of meaning. It is when language is placed into a social situation that signs and, hence, ideologies are created internally through external experience, which is mediated by and through tools," (Mantero, 2002). Brown et al. (1989) reiterate the sentiment by Mantero: "Words like I or now, for instance, can only be interpreted in the context of their use," (Brown et al., 1989). The use of tools and their negotiation into signs is important for meaning-making within the cultural context (Wertsch, 1991). This transformation of tools into signs has been suggested to assist in the transition of cultural activities from the interpersonal to the intrapersonal (Wang et al., 2011). For the third theme, Vygotsky defined developmental analysis in terms of history, noting that "to study something historically means to study it in the process of change," (Vygotsky, 1978). With this lens of studying process and change, sociocultural theory posits that the internalization of sociocultural activities and semiotics is transformative and not simply transmissive (John-Steiner & Mahn, 1996). All three of the themes in Vygotsky's work converge within one of his most famous ideas: the zone of proximal development. Vygotsky defined the zone of proximal development (ZPD) as "...the distance between the actual developmental level as determined by independent problem solving and the level of potential development as determined through problem solving under adult guidance or in collaboration with more capable peers," (Vygotsky, 1978). According to Vygotsky, navigating the ZPD is an inherently social process that only occurs when the learner is interacting with others within the environment (Vygotsky, 1978). Sociocultural theory posits that as learners cooperate with others within the ZPD, a transformation eventually occurs where cultural practices transition from interpersonal to intrapersonal via internalization of the practices within the learner (Vygotsky, 1978). Further, the ZPD utilizes "tools" and "signs" to help assist learners, and by crossing the ZPD, the learner has undergone a transformative process that situates them differently within the culture than where they started. Traversing the ZPD is often accomplished with "scaffolding". Although Vygotsky never used this term when describing the ZPD, "scaffolding" implies that with the appropriate supports and help, learners can cross the ZPD in order to engage in tasks they were not originally able to do on their own. The idea of "scaffolding" had been introduced by scholars in the 1970s, yet the term was attributed to Vygotsky's ZPD by Bruner (1986). The ZPD is an important aspect of Vygotsky's theory, as it facilitates learning within a cultural context, and it has become a pedagogical approach that assists learners in developing knowledge and engaging in authentic cultural practice. Rogoff further developed sociocultural theory and reframed learning as an "apprenticeship in thinking" (Rogoff, 1990). Rogoff references the ZPD in problem solving and highlights its shared nature: "Shared problem solving – with an active learner participating in culturally organized activity with a more skilled partner – is central to the process of learning in apprenticeship," (Rogoff, 1990).
To Rogoff, the ZPD acted as a "crucible" for learning and development where new members of the culture could participate and learn the practices from more able members (Rogoff, 1990). The ZPD is one way in which sociocultural theory permeates many theoretical frameworks and pedagogical practices. Sociocultural theory, as well as the use of the ZPD, highlights the significance of the social aspects of education and has influenced the development of other perspectives. By examining the relationships of sociocultural theory to other frameworks such as situated cognition, communities of practice, and conceptual change, the distinctive characteristics of sociocultural theory and how they apply to this study can be elucidated. Situated cognition is a framework that was proposed by Lave and Wenger (1991) and is rooted in sociocultural theory. However, situated cognition differs from sociocultural theory in that it is more focused on how individuals interact with their context: "Situated cognition posits that knowledge exists not as a separate entity in the mind of an individual, but that knowledge is generated as an individual interacts with his or her environment (context) to achieve a goal," (Orgill, 2007). Therefore, situated cognition is more focused on the extent to which the context influences learning in the individual (Orgill, 2007) rather than on how groups engage in sociocultural activities and negotiate meaning. Both situated cognition and sociocultural theory acknowledge the importance of culture on learning (Brown et al., 1989). The practices inherent in chemistry are informed by the culture which, as Brown et al. (1989) note, is formed by members past and present. Therefore, it is argued that if instructors want students to engage with the practices of science, then students must engage with the culture. Brown et al. (1989) assert that engaging students in cultural practices of the discipline offers a more authentic and stimulating way to teach, highlighting that the "school" culture (often concerned with memorizing facts and performing well on standardized exams) and the "disciplinary" culture (what practitioners of the discipline actually do) are different. Situated cognition stipulates that over time, as students engage in sociocultural activities, they progress into the role of practitioner and continue to develop through social engagements with other members of the community (Brown et al., 1989). Though situated cognition shares a focus on the social aspects of learning with sociocultural theory, its focus on the individual and on the individual interacting with the context makes it a distinct framework. Scholars have advocated for the idea that situated cognition connects sociocultural theory and other cognitive-focused theories (Billett, 1996). These authors also point out the connections between sociocultural theory, situated learning, communities of practice (mentioned later), and the importance of culture on learning (Billett, 1996; Lave, 1991; Lave & Wenger, 1991).
In contrast to these "bridging" perspectives, other scholars have taken the position that constructivist approaches are "problematic" and have situated sociocultural theory as a more appropriate framework by itself: "Despite the appeal of the notion that learners construct their understanding, I argue that constructivism is problematic because it ignores the subjectivity of the learner and the socially and historically situated nature of knowing; it denies the essentially collaborative and social nature of meaning making; and it privileges only one form of knowledge, namely, the technical rational," (O'Loughlin, 1992). Sociocultural theory has also been linked to communities of practice frameworks (Snowball & McKenna, 2017). The communities of practice framework emerged from Lave and Wenger's (1991) work on situated learning and was subsequently developed further by Wenger (1998). Sociocultural theory and the framework of communities of practice both involve communities of people engaged in the interpretation of practice, the negotiation of meaning, and the use of a "shared repertoire" (Macklin, 2007). According to Macklin, this "shared repertoire" includes "tangible tools and artifacts, such as manuals and documents, and intangible tools, such as common discourse or routine methods of accomplishing tasks..." (Lesh & Lehrer, 2003; Macklin, 2007). Therefore, to Macklin the "shared repertoire" in communities of practice embodies the tools of the context, a strong component of sociocultural theory as described by Vygotsky. The communities of practice theoretical framework "serves as a means to study how shared knowledge evolves and how groups learn," (Macklin, 2007). The focus on groups in communities of practice and the significance given to shared knowledge and understanding mirror the focus of sociocultural theory. The communities of practice framework emphasizes socialization into a culture by encouraging individuals to interact with the culture of a discipline and the experts in the field (Brown et al., 1989; Macklin, 2007). Likewise, the idea of socialization is inherent in sociocultural theory, which posits that over time people will internalize the practices of the culture. Despite the overlap between the two frameworks, a major distinction between sociocultural theory and communities of practice emerges when one considers the focus of each. Where sociocultural theory is focused on sociocultural activity within a group and the social interactions inherent in that activity, the communities of practice framework seems to place the social relationships among people secondary, with a larger emphasis on how identity within a community of practice develops as one negotiates competence within the discipline (Farnsworth et al., 2016). In contrast to situated cognition and communities of practice, the underpinnings of conceptual change research are strongly cognitive, yet connections to sociocultural theory have been made. Mason (2007) asserts that these connections can be traced back to Pintrich et al.'s 1993 article, in which the authors advocated for consideration of the affective domain of learning and contextual factors in conceptual change studies (Mason, 2007; Pintrich et al., 1993). Yet scholars in conceptual change had already attempted to bridge the two theories previously.
In a conceptual change study of how 18 university students constructed explanations about dissolving, it was found that the social environment of the science classroom “provided the students with opportunities to elaborate their explanations for dissolving, and reflected practical, theoretical and applied understanding,” (Kaartinen & Kumpulainen, 2002).

As can be seen, sociocultural theory has been used in a variety of ways to explain how the context and culture can shape learning. It has been considered distinct from more cognitively-focused frameworks, but as noted, some scholars have sought to draw connections between the strong social orientation of sociocultural theory and the internal construction of knowledge of cognitive constructivist theories. Work has been done to develop and extend sociocultural theory to adopt more sociopolitical perspectives that include critical discussions of how social constructs such as race, gender, and class influence a person’s experience (Skerrett, 2006). Furthermore, Mutegi (Mutegi, 2013) used sociocultural theory to discuss the social construction of race and comment on the systemic nature of racism. Mucherah and Owino (Mucherah & Owino, 2016) used sociocultural theory to situate their study on the perceptions of homosexuality among Kenyan and U.S. university students. They concluded that, due to the sociocultural differences between the two groups of students, Kenyan students adopted more negative dispositions toward homosexuality (Mucherah & Owino, 2016). This study therefore demonstrates that sociocultural theory can be helpful when investigating student perceptions. Shabani (Shabani, 2016) used sociocultural theory as the foundation of teacher professional development, stating that “what Vygotsky claimed about students’ learning in the school setting is applicable to the teachers...” (Shabani, 2016). Similar to Shabani, other scholars have extended sociocultural theory and applied it to teacher learning. Kelly (Kelly, 2006) developed a framework for sociocultural theory as applied to teacher learning by recognizing that the construction of professional knowledge within teaching is socioculturally influenced. Zotos et al. (Zotos et al., 2020) used the sociocultural framework of teacher learning put forth by Kelly to investigate chemistry graduate teaching assistants’ (GTAs) teaching knowledge and identity. This study found that “GTAs rarely view themselves as teachers, but more so as tutors in discussions or as managers in laboratories,” (Zotos et al., 2020). The authors then went on to use the sociocultural perspective to make a powerful statement about the culture of chemistry: “In accordance to sociocultural theory of teacher learning, GTAs learn to focus on research and neglect their teaching role if that is the culture of their department (Kelly, 2006). In order to improve participation in instruction, GTAs must learn to prioritize teaching, which requires a shift in departmental culture,” (Zotos et al., 2020). This study highlights the power of sociocultural theory to detect influential cultural factors.

Since Vygotsky, sociocultural theory has expanded to include a variety of perspectives. My purpose for covering sociocultural theory and related frameworks and extensions was to illustrate the point that sociocultural theory has been deeply influential in research across many fields.
Indeed, it has been so influential that it has led to the development of many adjacent theories and frameworks. To this end, the umbrella term sociocultural perspectives has been adopted to refer to the wide range of perspectives that acknowledge the influence of the social and cultural facets within a context. As Lemke (Lemke, 2001) notes: “Sociocultural perspectives include the social-interactional, the organizational, and the sociological; the social-developmental, the biographical, and the historical; the linguistic, the semiotic, and the cultural. For many researchers they also include the political, the legal, and the economic, either separately or as implicit in one of the others.” This breadth makes the theory incredibly robust, with a wealth of applications. Throughout the studies presented in this dissertation, I frame the research within the broad framework of sociocultural perspectives.

Reinholz and Apkarian’s Four Frames of Culture

In 2018, Reinholz and Apkarian defined culture as “a historical and evolving set of structures and symbols and the resulting power relationships between people,” (Reinholz & Apkarian, 2018). This definition succinctly summarizes their four frames of culture, which they proposed as a means of enacting systemic change. Structures are the first frame and include the “roles, responsibilities, practices, routines, and incentives that organize how people interact,” (Reinholz & Apkarian, 2018). Structures are therefore highly visible and more explicit within a given culture. Structures in the context of a classroom could include the curriculum, grading policies, practices and ways of doing, expectations for performance and behavior, routines, assignments, and assessments, among others. Symbols are the second frame and “constitute the cultural artifacts, language, knowledge, myths, values, and vision that [people] use to guide their reasoning,” (Reinholz & Apkarian, 2018). Symbols effectively underpin the highly visible structures and give meaning to them; however, symbols themselves can often be fairly invisible or implicit. In a classroom, symbols could be represented by the language used (or the way content is talked about), myths or assumptions instructors have about learning that are not supported by evidence, values that instructors instill in their practice, and the vision they have for student learning (Reinholz & Apkarian, 2018). The third frame is people, and it highlights that learning spaces are composed of a variety of individuals with their own goals and identities. While common experiences may exist between people in a given culture, this frame reminds culture workers that not all people will experience the culture in the same way. Furthermore, in the context of making change, this frame is important because it requires that change agents work with people and support their agency and independence as they engage with change efforts. The final frame is power, and it highlights that power dynamics are incredibly influential in mediating a culture and how people interact with one another. This frame reminds change agents that within any culture there are likely people who have been marginalized and who must be involved in change efforts to ensure their voices are heard (Reinholz & Apkarian, 2018). These four frames help make culture work more accessible and manageable given its complex nature. The frames were primarily proposed in order to address issues of diversity, equity, and inclusion, but they also apply broadly to other cultural considerations. Deng et al.
(Deng et al., 2021) used the frames to put forth strategies for developing and utilizing chemistry graduate student committees for diversity, equity, and inclusion.

Schein’s Framework of Organizational Culture

Originally intended to describe organizational and business cultures, Schein’s framework has considerable overlap with the other cultural frameworks previously presented. In Schein and Schein’s book Organizational Culture and Leadership, they defined culture as “learned patterns of beliefs, values, assumptions, and behavioral norms that manifest themselves at different levels of observability,” (Schein & Schein, 2016). Therefore, to Schein, culture is situated in what Reinholz and Apkarian would call symbols, the underlying components that give meaning to the other facets of a culture. Certainly, this view has also been shared by other scholars who have explored cultures in universities (Lee, 2007). Schein also notes that cultures can be micro-cultures, which include subcultures of small groups, classrooms, or departments, or macro-cultures, which include entire nations, international groups, and large organizations.

Within Schein’s framework, culture has three “levels” that parallel Reinholz and Apkarian’s frames. The first level is known as artifacts, the highly visible structures and processes of a culture: “We think of artifacts as the phenomena that you would see, hear, and feel when you encounter a new group with an unfamiliar culture,” (Schein & Schein, 2016). Schein goes on to suggest that while artifacts are easy to observe, those who study culture should be careful not to place too much emphasis on the practices people use as a way to characterize a culture, because these practices and structures tend not to explain why people do the things they do. The second level is described as the espoused beliefs and values and encompasses the values, ideologies, goals, and beliefs that are openly declared by the group. Schein described this level as relatively conscious amongst the group membership because members constantly receive explicit messages about these beliefs and values. Schein was also careful to note that while these beliefs and values may be purported by the group, they may not fully describe the behavior observed; for that, one would need to consider the third and final level (Schein & Schein, 2016). Assumptions are the third level within this cultural framework and are described as the mostly unconscious, taken-for-granted beliefs and values. Considering how unconscious they are, this level is also the most implicit and difficult to observe or elucidate. However, in some instances, it can be a driving force for a culture. Schein described assumptions as “generally non-confrontable and non-debatable and hence are extremely difficult to change,” (Schein & Schein, 2016). In many cases, these assumptions make up the “cultural DNA” or the “essence” of a group since they comprise the beliefs, values, and earliest shared learning experiences that characterize the identity of a group (Schein & Schein, 2016).

Synthesizing the Cultural Frameworks

The three cultural frameworks I used throughout my work included sociocultural theory (or sociocultural perspectives), Reinholz and Apkarian’s four frames for culture, and Schein’s framework for organizational culture. All three had considerable overlap and were used to make sense of the highly complex nature of culture, even at the micro-level of a course.
To further make sense of and apply these frameworks, I synthesized them to inform my research. I posit that the four frames proposed by Reinholz and Apkarian act as a strong foundation for exploring culture. Sociocultural theory, on the other hand, provides strong philosophical, theoretical, and empirical backing for the structures and artifacts of a culture; that is, it helps “fill out” the structure/artifacts frame. Perhaps more importantly, sociocultural theory also details the influence and role of socializing mechanisms in enculturating individuals into certain ways of doing, an important point not emphasized or highlighted as strongly in the other frameworks. Schein’s organizational culture framework provides strong theoretical support for the role of beliefs, values, and assumptions in a culture. For the purposes of my study, I combine these under the umbrella of symbols because they both underpin structures and artifacts and give them meaning. Furthermore, Schein’s framework also provides much background for understanding the stable yet dynamic nature of cultures, how all frames (or levels) are interconnected to create a powerful entity of influence, and the factors that influence a culture to evolve over time.

Figure 2.1. A depiction of my synthesis of the three frames of culture

Grounded Theory and the Use of Theoretical Frameworks

The grounded theory study on “critical thinking” adopts the sociocultural lens alongside the other cultural frameworks mentioned above, and therefore an in-depth explanation will not be repeated here. However, it is important to note that some scholars in grounded theory stress that no theoretical frameworks should be used when conducting grounded theory research (Corbin & Strauss, 2015). They claim that since grounded theory is meant for theory generation, it should not begin with previously published theories that frame the study, as this could “cloud” the generation of the theory. However, they also note that once the theory is established in the study, the grounded theory can then be compared to other theories. Said another way, grounded theory could be seen as a way to help develop certain theoretical frameworks (Corbin & Strauss, 2015). However, this view of grounded theory has been heavily critiqued and challenged (Charmaz, 2006; Timonen et al., 2018). Charmaz notes that scholars are not completely blank slates when they engage with a research project; they come in with a variety of assumptions, knowledge, and ideas relevant to the project (Charmaz, 2006). Instead, Charmaz supports the idea that researchers acknowledge this fact, work hard to identify assumptions and biases and foreground or eliminate them, and recognize that the researcher is “constructing” a theory, not simply finding one. This idea has been further supported by Timonen et al. (Timonen et al., 2018). Therefore, my grounded theory approach uses the constructivist grounded theory perspective as developed by Charmaz (Charmaz, 2006), which allows for the use of theoretical frameworks in research.

Summary

In sum, there were many frameworks that informed the work of this dissertation. Although some studies explicitly cited one framework or multiple, over time my understanding and beliefs about culture have evolved to be informed by an amalgamation of the frameworks previously mentioned. Each framework brought something to my understanding that I felt made it a substantial factor in my analyses and beliefs about culture in chemistry and biology classrooms.
Sociocultural theory offered a lens for understanding how practices and ways of doing are connected to an overarching culture and explained how a culture is sustained through socialization. The four frames from Reinholz and Apkarian acted as a foundation on which to develop a hybrid framework for understanding culture, while Schein’s work at the symbolic level of beliefs, values, and assumptions was incredibly influential. I closed out this chapter by addressing and defending my use of these frameworks in my grounded theory study, despite some authors stating that one should be careful when using frameworks to develop grounded theory.

REFERENCES

1. Alexander, P. A. (2007). Bridging Cognition and Socioculturalism Within Conceptual Change Research: Unnecessary Foray or Unachievable Feat? Educational Psychologist, 42(1), 67–73. https://doi.org/10.1080/00461520709336919
2. Billett, S. (1996). Situated learning: Bridging sociocultural and cognitive theorising. Learning and Instruction, 6(3), 263–280. https://doi.org/10.1016/0959-4752(96)00006-0
3. Bowen, R. S., Flaherty, A. A., & Cooper, M. M. (2022). Investigating student perceptions of transformational intent and classroom culture in organic chemistry courses. Chemistry Education Research and Practice.
4. Brown, J. S., Collins, A., & Duguid, P. (1989). Situated Cognition and the Culture of Learning. Educational Researcher, 18(1), 32–42. https://doi.org/10.4324/9780203990247
5. Bruner, J. (1986). The Inspiration of Vygotsky. In Actual Minds, Possible Worlds (pp. 70–78). Harvard University Press.
6. Charmaz, K. (2006). Coding in Grounded Theory Practice & Memo Writing. Sage.
7. Cole, M., & Scribner, S. (1978). Introduction. In M. Cole, V. John-Steiner, S. Scribner, & E. Souberman (Eds.), Mind in Society: The Development of Higher Psychological Processes (pp. 1–16). Harvard University Press.
8. Corbin, J., & Strauss, A. (2015). Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory (Fourth). Sage.
9. Deng, J. M., McMunn, L. E., Oakley, M. S., Dang, H. T., & Rodriguez, R. S. (2021). Toward Sustained Cultural Change through Chemistry Graduate Student Diversity, Equity, and Inclusion Communities. Journal of Chemical Education. https://doi.org/10.1021/acs.jchemed.1c00485
10. Farnsworth, V., Kleanthous, I., & Wenger-Trayner, E. (2016). Communities of Practice as a Social Theory of Learning: A Conversation with Etienne Wenger. British Journal of Educational Studies, 64(2), 139–160. https://doi.org/10.1080/00071005.2015.1133799
11. John-Steiner, V., & Mahn, H. (1996). Sociocultural approaches to learning and development: A Vygotskian framework. Educational Psychologist, 31(3–4), 191–206. https://doi.org/10.1080/00461520.1996.9653266
12. John-Steiner, V., & Souberman, E. (1987). Afterword. In M. Cole, V. John-Steiner, S. Scribner, & E. Souberman (Eds.), Mind in Society: The Development of Higher Psychological Processes (pp. 121–133). Harvard University Press.
13. Kaartinen, S., & Kumpulainen, K. (2002). Collaborative inquiry and the construction of explanations in the learning of science. Learning and Instruction, 12(2), 189–212. https://doi.org/10.1016/S0959-4752(01)00004-4
14. Kelly, P. (2006). What is teacher learning? A socio‐cultural perspective. Oxford Review of Education, 32(4), 505–519. https://doi.org/10.1080/03054980600884227
15. Lave, J. (1991). Chapter 4: Situated Learning in Communities of Practice. Perspectives on Socially Shared Cognition, 63–82.
16. Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge University Press.
17. Lee, J. J. (2007). The shaping of the departmental culture: Measuring the relative influences of the institution and discipline. Journal of Higher Education Policy and Management, 29(1), 41–55. https://doi.org/10.1080/13600800601175771
18. Lemke, J. L. (2001). Articulating communities: Sociocultural perspectives on science education. Journal of Research in Science Teaching, 38(3), 296–316. https://doi.org/10.1002/1098-2736(200103)38:3<296::AID-TEA1007>3.0.CO;2-R
19. Lesh, R., & Lehrer, R. (2003). Models and modeling perspectives on the development of students and teachers. Mathematical Thinking and Learning, 5, 109–129.
20. Macklin, A. (2007). Communities of Practice. In G. M. Bodner & M. Orgill (Eds.), Theoretical Frameworks for Research in Chemistry/Science Education (pp. 195–217).
21. Mantero, M. (2002). Scaffolding Revisited: Sociocultural Pedagogy within the Foreign Language Classroom. 1–31.
22. Mason, L. (2007). Introduction: Bridging the Cognitive and Sociocultural Approaches in Research on Conceptual Change: Is it Feasible? Educational Psychologist, 42(1), 1–7. https://doi.org/10.1080/00461520709336914
23. Mucherah, W., & Owino, E. (2016). Using the Sociocultural Theory to Explain the Perceptions of Homosexuality among Kenyan and U.S. University Students. Journal of Black Sexuality and Relationships, 3(2), 1–23. https://doi.org/10.1353/bsr.2016.0026
24. Mutegi, J. W. (2013). “Life’s first need is for us to be realistic” and other reasons for examining the sociocultural construction of race in the science performance of African American students. Journal of Research in Science Teaching, 50(1), 82–103. https://doi.org/10.1002/tea.21065
25. O’Loughlin, M. (1992). Rethinking science education: Beyond Piagetian constructivism toward a sociocultural model of teaching and learning. Journal of Research in Science Teaching, 29(8), 791–820. https://doi.org/10.1002/tea.3660290805
26. Orgill, M. (2007). Situated Cognition. In Theoretical Frameworks for Research in Chemistry and Science Education (pp. 179–194).
27. Palincsar, A. S. (1998). Social constructivist perspectives on teaching and learning. Annual Review of Psychology, 49(1), 345–375. https://doi.org/10.1146/annurev.psych.49.1.345
28. Petterson, M. N., Finkenstaedt-Quinn, S. A., Gere, A. R., & Shultz, G. V. (2022). The role of authentic contexts and social elements in supporting organic chemistry students’ interactions with writing-to-learn assignments. Chemistry Education Research and Practice, 23, 189–205. https://doi.org/10.1039/D1RP00181G
29. Pintrich, P. R., Marx, R. W., & Boyle, R. B. (1993). Beyond cold conceptual change: The role of motivational beliefs and classroom contextual factors in the process of conceptual change. Review of Educational Research, 63, 167–199.
30. Reinholz, D. L., & Apkarian, N. (2018). Four frames for systemic change in STEM departments. International Journal of STEM Education, 5(1), 1–22. https://doi.org/10.1186/s40594-018-0103-x
31. Rogoff, B. (1990). Apprenticeship in Thinking: Cognitive Development in Social Context. Oxford University Press.
32. Schein, E. H., & Schein, P. A. (2016). Organizational Culture and Leadership (5th ed.). Jossey-Bass.
33. Shabani, K. (2016). Applications of Vygotsky’s sociocultural approach for teachers’ professional development. Cogent Education, 3(1), 1252177–1252177. https://doi.org/10.1080/2331186X.2016.1252177
34. Skerrett, A. (2006). Looking Inward: The impact of race, ethnicity, gender, and social class background on teaching sociocultural theory in education. Studying Teacher Education, 2(2), 183–200. https://doi.org/10.1080/17425960600983213
35. Snowball, J. D., & McKenna, S. (2017). Student-generated content: An approach to harnessing the power of diversity in higher education. Teaching in Higher Education, 22(5), 604–618. https://doi.org/10.1080/13562517.2016.1273205
36. Timonen, V., Foley, G., & Conlon, C. (2018). Challenges when using grounded theory: A pragmatic introduction to doing GT research. International Journal of Qualitative Methods, 17(1). https://doi.org/10.1177/1609406918758086
37. Vygotsky, L. (1978). Mind in Society. Harvard University Press.
38. Wang, L., Bruce, C., & Hughes, H. (2011). Sociocultural Theories and their Application in Information Literacy Research and Education. Australian Academic & Research Libraries, 42(4), 296–308. https://doi.org/10.1080/00048623.2011.10722242
39. Wenger, E. (1998). Communities of Practice: Learning, Meaning, and Identity. Cambridge University Press.
40. Wertsch, J. V. (1991). Voices of the mind: A sociocultural approach to mediated action. Harvard University Press.
41. Zotos, E. K., Moon, A. C., & Shultz, G. V. (2020). Investigation of chemistry graduate teaching assistants’ teacher knowledge and teacher identity. Journal of Research in Science Teaching, 57(6), 943–967. https://doi.org/10.1002/tea.21618

CHAPTER III: LITERATURE REVIEW

The scope of this dissertation is wide, ranging from investigations of course expectations to feelings related to sense of belonging. However, all of the studies within use student perceptions and occur within the context of traditional or transformed courses. In some studies, a comparative analysis was done; in others, I opted to be more purposeful in my sampling and use case study methodologies. I explored the literature on cultural studies and “critical thinking” to lay the groundwork for these studies, and I provided an overview of the research on grounded theory to support my use of that method. Although the diversity within my work is clear, a sociocultural thread runs throughout all of the studies, acknowledging the significance of social interactions, overarching cultures of learning, and context on student learning and experience. In this chapter, I aim to provide additional context and background to help set the stage for each study.

Traditional and Transformed Courses

To begin, much of the work presented within this dissertation took place in the context of transformed and/or traditional courses at Michigan State University. Although a brief introduction to traditional and transformed courses was provided in the Introduction, I will expand on this discussion here. Several of the studies included involved a comparison between traditional and transformed courses, and it is therefore important to provide working definitions of these terms. Traditional courses in science derive their name from the fact that they align with how the courses have been traditionally taught. That is, they are typically didactic in nature, mostly relying on lecture as the content delivery mechanism with minimal interaction between peers and the instructor. In many cases, traditional courses are organized around discrete topics, often following the order of a standard textbook.
In my experience working with traditional courses, many of them primarily use high-stakes summative assessments, with fewer points allocated for effort and formative assessment. That is, a characteristic of these types of learning environments is that most of a student’s grade is determined by exams and a final. In the context of chemistry education research, it has been found that students in traditional courses utilize and/or value memorization and rote learning over reasoning when compared to students in transformed courses (described later) (Bowen et al., 2022; Crandell et al., 2019, 2020; Houchlei et al., 2021). These findings from transformed courses, as well as consensus reports on how people learn (National Academies of Sciences, Engineering, and Medicine, 2018; National Research Council, 2000), have highlighted the benefits of transforming courses to enhance pedagogy and encourage student learning.

Transformed courses are typically informed by the educational literature in some way. Though not always true, transformed courses tend to place less emphasis on lecturing and instead seek to employ more interactive and active approaches. As previously stated, it has been noted that more traditional approaches to learning are not as effective (Crandell et al., 2019, 2020; Talanquer, 2021; Talanquer & Pollard, 2010). The term transformed can take on many different meanings. For example, one of the easier transformation methods involves the implementation of one or more interventions (White et al., 2021). However, interventions are typically short-term solutions that tend to span only a few lessons (in some cases only one). While many interventions focus more on the content of the discipline, recent work has used interventions as a way to incorporate inclusive and equitable practices in chemistry education (White et al., 2021).

Perhaps the most popular form of transformation is the use of active learning. Though active learning has been poorly defined by the literature (Lombardi et al., 2021), various studies have been published claiming to use this transformative approach to pedagogy (Deslauriers et al., 2019; Eichler, 2022). For example, Deslauriers and colleagues compared a classroom using active approaches, such as in-class activities, interactive quizzes, or conceptual questions, along with interactive lecturing, to a more traditional, didactic class. The authors ultimately concluded that the students in the active approach learned “more” but perceived they had learned “less” (Deslauriers et al., 2019). While this may sound promising, the active learning approach used was poorly conceptualized (likely due to the lack of a consensus definition of active learning), and there were some confounding factors in the analysis (such as concerns over student perceptions being conflated with simply enjoying the course). A review on active learning asserted that the use of active learning could narrow “achievement gaps” for students from marginalized communities (Theobald et al., 2020). In this review, the authors compiled various studies that had used active learning and used Bayesian regression to demonstrate that active learning approaches could narrow such “gaps” in student performance. Similar to Deslauriers and colleagues, Theobald et al. did not offer an exhaustive review of the literature on active learning to define what this pedagogical transformation was.
In accordance with the theoretical frameworks that influenced the bulk of the work in this dissertation, some studies have provided evidence that communities of practice tended to be associated with active learning in STEM lecture courses (Tomkin et al., 2019). Readers will recall that communities of practice, when used as a framework, is a sociocultural approach. The authors of this paper stated that active learning approaches were more student-centered and posited that fostering communities of practice among instructors interested in transformational change would make them more inclined to adopt active learning approaches. In their analysis, they did find that “instructors who were members of a community of practice were much more likely to employ student-centric practices, such as asking questions, following up, and engaging in discussion, and much less likely to use instructor-centered practices, such as lecturing,” (Tomkin et al., 2019). That is, when instructors were surrounded by like-minded colleagues, they were more likely to adopt student-centered pedagogical strategies, hence the relationship to sociocultural perspectives.

Aside from one-shot interventions and the incorporation of active learning, other transformation approaches have sought to act as an entire overhaul of the curriculum and pedagogy. In 2019, McGill et al. published a paper on the development of a new four-year curriculum for undergraduate chemistry majors known as Chemistry Unbound (McGill et al., 2019). The new curriculum purportedly leveraged core ideas and scientific practices as defined by A Framework for K-12 Science Education (National Research Council, 2012) and three-dimensional learning (3DL4US, n.d.; NGSS Lead States, 2017), which will be discussed later. The curriculum begins with an exploration of structure and properties, progresses through the principles of reactivity (as well as advanced reactivity considerations), continues on to light, matter, and macromolecules, is followed by independent courses on analytical tools and techniques, quantum mechanics, and specialized subdisciplines (such as biophysical chemistry, materials and synthesis, etc.), and ends with a senior capstone. While the work inherent in such a curricular overhaul is commendable, expert perspectives on change acknowledge that it takes time and should be done systematically (rather than entirely at once) (Fleming, 2018; Reinholz et al., 2021; Reinholz & Apkarian, 2018; Schein & Schein, 2016). No further work has been published on the Chemistry Unbound transformation.

Other transformation efforts have sought to proceed one course at a time while still leveraging the core ideas and scientific practices of three-dimensional learning (3DL4US, n.d.; Cooper et al., 2019; Cooper & Klymkowsky, 2013; National Research Council, 2012). Chemistry, Life, the Universe, and Everything (CLUE) (Cooper & Klymkowsky, 2013) and Organic Chemistry, Life, the Universe, and Everything (OCLUE) (Cooper et al., 2019) are two transformed curricula for general and organic chemistry, respectively, that have been developed by Cooper and colleagues at Clemson University, Michigan State University, and the University of Colorado – Boulder. Both curricula are two-semester sequences that organize content around four core ideas in chemistry and engage students in learning via scientific practices.
Previous research on student reasoning in the context of CLUE and OCLUE has provided evidence that these curricula can be strong alternatives to more traditional approaches to instruction (Becker et al., 2016; Crandell et al., 2019, 2020; Houchlei et al., 2021; Noyes & Cooper, 2019). The work presented throughout this dissertation was aimed at characterizing student experience and perspectives within the context of CLUE and OCLUE. Therefore, before I can speak more to these course experiences, it is necessary to review the literature on the underlying curricular framework of three-dimensional learning.

Three-Dimensional Learning

Next, considering that most of my work takes place in transformed courses using three-dimensional learning, I will discuss this mode of transformation here. In the late 2000s and early 2010s, efforts in science education curricular reform attempted to shift from an overt focus on rote memorization toward embedding more authentic disciplinary practices. For example, in 2007, the National Research Council made the statement that “expectations of what it means to be competent in doing science and understanding science have also broadened. Beyond skillful performance and recall of factual knowledge, contemporary views of learning prize understanding and application, or knowledge in use,” (National Research Council, 2007). In 2012, this “contemporary view” led to the National Research Council’s A Framework for K-12 Science Education, henceforth known as the Framework. This document identified and clarified the scientific and engineering practices, crosscutting concepts, and disciplinary core ideas for science and engineering education at the primary and secondary levels (National Research Council, 2012). Although the Framework targeted both science and engineering, my work has focused on the “science” components. Within the Framework, there were eight scientific practices, which related to the ways in which expert scientists apply and use their knowledge. There were seven crosscutting concepts, which were initially situated as concepts that permeated all science disciplines (hence their name, indicating they “crosscut” fields). However, scholars have recently re-conceptualized the crosscutting concepts as tools or lenses that enable people to engage with problems (Cooper, 2020; Cooper et al., 2017; Laverty et al., 2016). Finally, the disciplinary core ideas originally combined major concepts for chemistry and physics into overarching, fundamental ideas for physical science. However, scholars in higher education have disaggregated the core ideas of the two disciplines to make them more discipline-specific (3DL4US, n.d.). Figure 3.1 graphically illustrates the scientific practices, crosscutting concepts, and disciplinary core ideas for chemistry.

Figure 3.1. Three Dimensions of Three-Dimensional Learning: The eight scientific practices, seven crosscutting concepts, and four disciplinary core ideas for chemistry

The Framework effectively established a vision for science education curricular reform by making practices, crosscutting concepts, and core ideas more explicit to instructors and students, ultimately referring to these three components of science education as dimensions. When all three dimensions are considered together in curricula, assessment task design, or learning task development, they give rise to Three-Dimensional Learning (3DL; Figure 3.2) (3DL4US, n.d.; NGSS Lead States, 2017).
It is worth noting that the original intent of the Framework and 3DL was for K-12 science education, and standards were developed for use in U.S. primary and secondary schools (NGSS Lead States, 2017). Despite this original intention, it has been argued that the Framework and 3DL are applicable to higher education, particularly to introductory courses such as general and organic chemistry (3DL4US, n.d.; Cooper et al., 2015, p. 3). This application of 3DL to higher education is important, particularly for introductory courses, because it enhances learning opportunities for students and allows for vertical coherence between K-12 science education and introductory science courses in colleges and universities.

Figure 3.2. Three-Dimensional Learning: Three-Dimensional Learning (3DL) utilizes the scientific practices, crosscutting concepts, and disciplinary core ideas in chemistry (the three dimensions) for curricular, assessment, and learning task design

Scientific Practices

As previously noted, in developing the practices, the Framework conceptualized practices for science and engineering. However, I will only be focusing on the science components of these practices. The authors of the Framework state that the use of the term “practice” was intentional: “...we use the term “practices,” instead of a term such as “skills,” to stress that engaging in scientific inquiry requires coordination both of knowledge and skill simultaneously,” (National Research Council, 2012). Therefore, the practices can be viewed as strategies for knowledge-in-use. The scientific practices are arguably the most widely researched and utilized dimension of 3DL due to their association with the application of knowledge, sensemaking, and even equity and inclusion (National Research Council, 2012; Schwarz et al., 2017). The practices also illustrate the point that engaging in science is not as linear as the scientific method may lead one to believe (Kind & Osborne, 2017; National Research Council, 2012). Practices such as “developing and using models”, “constructing explanations”, and “engaging in argumentation from evidence” have been commonly researched and explored in the literature (Baumfalk et al., 2019; Kararo et al., 2019; Lazenby & Becker, 2019; Merritt et al., n.d.; Osborne & Patterson, 2011; Passmore et al., 2014; Schwarz & White, 2005). Other practices, such as “asking questions”, have not been as closely studied, but researchers have attempted to reframe and clarify some of the more ill-defined practices (Phillips et al., 2018).

Crosscutting Concepts

Initially, the Framework envisioned that the crosscutting concepts would help students build an “organizational framework for connecting knowledge from various disciplines into a coherent and scientifically based view of the world,” (National Research Council, 2012). Therefore, the crosscutting concepts were one way to generate “connective tissue” between the various scientific domains and illustrate areas of overlap. Although the crosscutting concepts were situated as “crosscutting” across disciplines, many have argued that these concepts were poorly defined, implying that practitioners were unsure of what to do with them. In fact, some scholars have stated that “the crosscutting concepts have no scholarly basis for what the sciences have in common,” (Osborne et al., 2018).
In response to such critiques, researchers have reframed the crosscutting concepts in a variety of ways, such as epistemic heuristics that “can build students’ capacity to engage in making sense of phenomena in communities that extend beyond the classroom,” (C. W. A. Anderson et al., 2018). From the perspective of Anderson and colleagues, the crosscutting concepts (CCCs) were framed as tools that could ultimately be used to construct explanations. Other scholars have agreed with Anderson et al. and framed the crosscutting concepts as “lenses, tools, bridges, or “rules of the game” (epistemic heuristics), that can support sensemaking about chemical phenomena in the context of three-dimensional learning,” (Cooper, 2020; Rivet et al., 2016).

Core Ideas

As the name suggests, the core ideas were meant to capture the main disciplinary ideas that all other content details connect back to in some way. As previously noted, the Framework combined the core ideas for chemistry and physics into four core ideas. However, scholars in higher education have disaggregated the core ideas to allow for more discipline-specific activities and reform (3DL4US, n.d.; Cooper et al., 2017; Laverty et al., 2016). For example, in chemistry, there are four core ideas: Structure and Properties, Bonding and Interactions, Energy, and Change and Stability of Systems. There are different core ideas for biology and physics, each developed by content and pedagogical experts in their respective domains. Similar to the crosscutting concepts, explicit research studies on the core ideas are scant. However, the significance of identifying major ideas within a discipline has been recognized, leading others to develop their own “big ideas” in chemistry. For example, the ACS Examinations Institute has developed a series of “anchoring concepts content maps”, or ACCMs (Holme et al., 2018, 2020; Marek et al., 2018a, 2018b). The ACCMs, however, identify thirteen big ideas and are largely content-detail heavy.

Curricular Reform with 3DL in Higher Education

Given that much of my work has taken place in courses that have been transformed with 3DL, next I will review the literature on 3DL transformations in higher education. As noted, 3DL promotes the use and application of knowledge through the scientific practices. Though the Framework was not published until 2012, scholars had been advocating for scientific practices in education before the Framework clarified the eight practices. For example, research on learning progressions led scholars to publish learning progressions on argumentation (Berland & McNeill, 2010) and modeling (Schwarz et al., 2009). However, the Framework offered a consensus, guiding document to frame curricular reform.

In the context of chemistry, early reform efforts in higher education using 3DL sought to transform general chemistry first. Cooper and Klymkowsky (2013) developed the Chemistry, Life, the Universe, and Everything (CLUE) curriculum, a two-semester general chemistry sequence for non-chemistry majors (Cooper & Klymkowsky, 2013). Using 3DL, the authors noted that “the goal of the course is to help students develop a mechanistic understanding of chemistry,” (Cooper & Klymkowsky, 2013). This approach therefore goes deeper with core ideas instead of covering a broad array of topics as in more traditional course settings and is built on progressions of ideas.
Furthermore, in the context of CLUE, students are consistently encouraged to explain why chemical phenomena occur rather than focusing on building up declarative content knowledge (Cooper, 2015). A similar approach to CLUE has also been implemented in organic chemistry by the same authors, known as Organic Chemistry, Life, the Universe, and Everything (OCLUE) (Cooper et al., 2019).

With regard to CLUE, it was found that students who had taken the transformed curriculum were more likely than students from a traditional general chemistry sequence to engage in causal mechanistic reasoning when asked questions about acid-base reactions (Crandell et al., 2019) and were better able to draw Lewis structures and use them to predict a range of different phenomena (Cooper, Underwood, & Hilley, 2012; Cooper, Underwood, Hilley, et al., 2012). Research in CLUE has also characterized the ways in which students construct mechanistic explanations about London Dispersion Forces (LDFs) (Becker et al., 2016), and it was found that explicit scaffolding can improve student representations of these intermolecular forces (Noyes & Cooper, 2019). In order to speed up analysis of students’ constructed responses about London Dispersion Forces, authors from the previously mentioned studies have used the Automated Analysis of Constructed Response suite of algorithms to automatically code data on their LDF prompts using machine learning (Noyes et al., 2020). That is, there are now more resources to aid in characterizing student responses to these tasks that can be used to make comparisons between transformed courses, like CLUE, and more traditional approaches. Similarly, studies with OCLUE have found that students who undergo the transformed OCLUE sequence in organic chemistry are more likely to use causal mechanistic reasoning by the end of the second semester than students in a traditional organic chemistry course (Crandell et al., 2020). Furthermore, additional evidence from OCLUE showed that students enrolled in the transformed course were more likely to draw reasonable mechanistic steps for an unfamiliar reaction when compared to students in a traditional organic chemistry course (Houchlei et al., 2021). That is, the work emerging from CLUE and OCLUE seems to support the idea that courses designed using three-dimensional learning are enabling students to engage in causal mechanistic reasoning, allowing them to retain this ability longer, and helping them to develop more transferable knowledge.

Of course, course design also entails thinking about the types of assessment that will be used to measure student learning. In order to determine if assessments in science have three-dimensional items, scholars have developed the Three-Dimensional Learning Assessment Protocol (3D-LAP), which offers tables and approaches for determining if assessment items include core ideas, crosscutting concepts, and/or scientific practices (Laverty et al., 2016). Furthermore, a similar tool was developed for observing classroom instruction for three-dimensional occurrences and is known as the Three-Dimensional Learning Observation Protocol (3D-LOP) (Bain et al., 2020). These tools are meant to help researchers characterize three-dimensional approaches to assessment and instruction in order to provide formative feedback and guide transformation.
Sociocultural Studies in Chemistry Education

As I continued to work with 3DL courses, I began to notice that the consistent use of the scientific practices in the context of core ideas seemingly promoted a particular classroom culture of learning. As I thought about how to frame the work I wanted to do, I was drawn to sociocultural perspectives. Considering that sociocultural (and other cultural) perspectives make up my theoretical frameworks, I will explore some of the previous sociocultural work in chemistry and science education that was informative for my work. Although the study of culture in science education more broadly has been described as an important endeavor (Wood et al., 2013), surprisingly little work has been done in this area as it pertains to science education at colleges and universities. The Journal of Research in Science Teaching published a special issue on culture in 2013 (Parsons & Carlone, 2013); however, the articles included in the special issue were more positional pieces or focused on the role of macro-cultures on classroom learning (Mutegi, 2013; Tao et al., 2013). Some studies have used the framing of socioscientific issues to discuss culture (Lee et al., 2020), and other larger studies have commented on culture without explicitly stating it as one of their research goals (Seymour & Hewitt, 1997; Thiry et al., 2019). That is, though studies in science and chemistry education have not explicitly adopted the sociocultural perspective, they often speak to or reference social interactions and the overarching cultures of learning in their analyses in some way.

Considering the sociocultural lens applied to the work in this dissertation, a look into other applications of sociocultural theory in chemistry education was necessary. Early work in chemistry education utilized sociocultural perspectives to characterize and advocate for more context-based curricula in the field (King, 2012). King argues that context-based curricula are more effective at promoting student learning because they aim to help students connect their knowledge of chemistry to real-world phenomena. Here, King uses a sociocultural approach to highlight the context-dependency and significance of student learning. Further work in chemistry has targeted the area of “sociochemical norms” (Becker et al., 2013). Norms are facets of a culture that reflect shared ways of doing and thinking (Chang & Song, 2016; Reinholz & Apkarian, 2018; Schein & Schein, 2016). Becker and colleagues analyzed how students engaged in argumentative discourse and found that students had accepted the norm that particulate-level descriptions were necessary in their explanations of chemical phenomena. They concluded that knowledge of these norms was important for helping guide student learning and explanations (Becker et al., 2013). More recently, Zotos et al. used the sociocultural perspective of teacher learning (Kelly, 2006) to explore how chemistry graduate teaching assistants saw themselves with regard to their teaching (Zotos et al., 2020). Using this sociocultural lens, Zotos and colleagues found that chemistry GTAs positioned themselves more as a “tutor” or “lab manager” than a teacher. The authors situated this finding by discussing how there was limited social interaction on pedagogy and how chemistry departmental cultures were not designed in a way to encourage chemistry GTAs to pursue a teacher identity (Zotos et al., 2020).
Recently, sociocultural lenses have been applied to student writing, particularly within the context of writing-to-learn activities (Petterson et al., 2022). In this study, researchers were trying to better understand how the context and social interactions impacted how students engaged with a writing-to-learn assignment. The authors found that, by making the content explicit and relevant in the assignment and incorporating peer review, they had “promoted a positive affective learning experience while also allowing students to reflect on their explanations and understanding of the course material,” (Petterson et al., 2022). That is, the sociocultural lens helped them identify the significance of the context (for a learning task) as well as the social interactions that influence the overall activity. To my knowledge, beyond the studies mentioned, there have not been other studies in chemistry education that have used sociocultural perspectives explicitly, highlighting a gap within our understanding of the student experience (particularly at the higher education level). Therefore, the goal of the work presented herein was to fill this gap by using sociocultural perspectives to highlight student voices and experiences while characterizing transformational efforts in my context.

Instruments for Measuring Affective States in Science Education

In order to explore the classroom cultures of learning, I turned to the literature for potential instruments to use. This section details my review of the literature on various instruments related to my work on student perceptions. For Chapters IV, V, and VI, we collected and analyzed student perceptions of their courses. Therefore, it was important to explore the literature on instruments that have been used to measure affective states in science education and gather student perspectives. Upon exploring this literature, it was found that these previously published instruments often asked how students experienced entire courses or whole majors. These instruments often used more quantitative approaches, such as Likert and semantic differential scales. For example, one of the most prominent examples was the Maryland Physics Expectations (MPEX) survey, a Likert-scale instrument (Redish et al., 1998). The MPEX effectively paved the way for a similar survey in chemistry known as the Chemistry Cognitive Expectations (CHEMX) survey (Grove & Bretz, 2007). Both of these surveys utilized the Likert scale, the classic agree-disagree scale, which primarily gathers information on student assumptions, beliefs, and cognitive expectations. Both surveys ultimately compare student responses to those of experts. Another popular instrument has been the Colorado Learning Attitudes about Science Survey, or CLASS, instrument. The CLASS is also a Likert-scale instrument that was initially developed in physics (Adams et al., 2005), but it has since been adapted for chemistry (Barbera et al., 2008) and biology (Semsar et al., 2011). As stated in its name, the CLASS is focused on gathering information on student beliefs and attitudes about learning, the discipline, content, the structure of knowledge in the discipline, and “real world” connections. The primary difference between the CLASS and the MPEX and CHEMX is that the CLASS focuses on the discipline overall while the MPEX and CHEMX questions target a single course. Similar to the MPEX and CHEMX, the CLASS also compares student responses to expert responses.
Over time, these Likert-scale instruments have been called into question, and scholars have suggested that these instruments be re-analyzed. For example, Douglas et al. (2014) claimed that the CLASS instrument had poor psychometric properties and provided a revised version (Douglas et al., 2014). Furthermore, the MPEX has been questioned with regard to whether students selected the “neutral” option when they truly meant that the question did not apply to them (Saltzman et al., 2016). While Likert-scale instruments tend to be very popular, they are not the only instruments widely used. The semantic differential scale, for example, has students respond on a scale where the extremes have polar opposite adjectives; students respond to a sentence stem by rating how close they are to one of the adjectives (e.g., rating how close “chemistry” feels to “easy” versus “hard”). Two examples that utilize the semantic differential scale include the Chemistry Attitudes and Experiences Questionnaire (CAEQ) (Dalgety et al., 2003) and the Attitude towards the Subject of Chemistry Inventory (ASCI) (Bauer, 2008). Use of the ASCI seems to be popular according to the literature and has led to the production of the ASCIv2 (Xu & Lewis, 2011) and the ASCIv3 (Rocabado et al., 2019). As will be discussed in Chapter IV, after this review we were not convinced that these instruments would serve our purpose of collecting student perceptions in a way that we wanted or that would be productive for our research questions. Therefore, we opted to develop our own set of open-ended questions. Though the description of this instrument will be reserved for the next chapter (Chapter IV), we consistently noted that students often used vague terms to describe their perceptions or experiences in their courses. To generate insights on this, I turned to the literature on “critical thinking”.

“Critical Thinking” in Science Education

Throughout my projects, I continually noted that students used terms like “critical thinking” to describe their experience, yet it was not clear what students meant. I then turned to the literature to provide some insights on student perceptions of “critical thinking”, but I found that the term has held an amorphous meaning for decades. In preparation for my interview study on “critical thinking”, I reviewed this literature and outline it here.

Definitions of “Critical Thinking”

The term “critical thinking” is often thrown around in discussions of learning goals and objectives; however, to date, a common definition of what “critical thinking” means to broader education has not been agreed upon. Some scholars have referred to “critical thinking” as a component of higher-order thinking (Barak et al., 2007); however, others have challenged this view and asserted that “critical thinking” is a type of thinking and not higher-order thinking in general (Facione, 1990; Mulnix, 2012). For example, in the “Delphi Report”, Facione states that “[critical thinking] is one among a family of closely related forms of higher-order thinking, along with, for example, problem-solving, decision making, and creative thinking,” (Facione, 1990) (italics added). However, this statement by Facione uses other amorphous terms, such as “problem-solving”, to describe an already amorphous concept like “critical thinking”.
Since as early as the 1950s, “critical thinking” has been contrasted with an overt focus on rote memorization in education (Dunning, 1954; George, 1967, 1968; Rickert, 1967; Santos, 2017; Tsai, 2001), particularly in science education, where some scholars have commented on how assessments send strong messages about what instructors value in their courses (Momsen et al., 2013; Stowe et al., 2021). In some cases, this contrast has led some to suggest that we need “more critical thinking” instead (Charen, 1970; George, 1967, 1968; Siegel, 1989). This critique also extended to laboratory curricula, which primarily leverage “cookbook-style” approaches that often do not prompt the use of knowledge (Charen, 1970). In one such study, TAs for an undergraduate biology course believed that teaching critical thinking was more important than focusing on content in introductory courses (Barron et al., 2021); however, it was never clarified what was meant by “critical thinking”.

Despite these early calls for “critical thinking”, a common definition of “critical thinking” has not been agreed upon in the literature (Crenshaw et al., 2011; Moore, 2013; Mulnix, 2012; Stowe & Cooper, 2017). Certainly, many studies have defined the construct, and, as Crenshaw et al. (2011) note: “There are nearly as many definitions of critical thinking as there are publications on the topic,” (Crenshaw et al., 2011). Defining something as ubiquitous as the term “critical thinking” is important because “…it is essential that we know precisely what we mean when we refer to critical thinking or thinking skills, if the constructs are to be useful,” (Kuhn, 1999; Mulnix, 2012). Furthermore, ways of thinking such as “critical thinking” or “problem solving” are seemingly valued by many instructors, but they are often not operationalized, made explicit to students, or assessed accordingly (Stowe & Cooper, 2017). Whatever “critical thinking” is, some have stated that its development is important for national development (Forawi, 2016) or for solving complex problems such as climate change or pollution (Lau, 2011), while others have situated it as enhanced intellectual development in some way (Kogut, 1996).

Among the earliest definitions of “critical thinking” was that of Edward Glaser (George, 1967; Glaser, 1941), who defined “critical thinking” as: “(1) an attitude of being disposed to consider in a thoughtful way the problems and subjects that come within the range of one’s experiences, (2) knowledge of the methods of logical inquiry and reasoning, and (3) some skill in applying those methods. Critical thinking calls for a persistent effort to examine any belief or supposed form of knowledge in the light of the evidence that supports it and the further conclusions to which it tends. It also generally requires ability to recognize problems, to find workable means for meeting those problems, to gather and marshal pertinent information, to recognize unstated assumptions and values, to comprehend and use language with accuracy, clarity, and discrimination, to interpret data, to appraise evidence and evaluate arguments, to recognize the existence (or nonexistence) of logical relationships between propositions, to draw warranted conclusions and generalizations, to put to test the conclusions and generalizations at which one arrives, to reconstruct one’s patterns of beliefs on the basis of wider experience, and to render accurate judgments about specific things and qualities in everyday life,” (Glaser, 1941).
Such a definition, as provided by Glaser, implies the use of knowledge in some way, which would make sense given "critical thinking" and its early contrast with rote memorization. Related to this definition, Dunning described three abilities that they believed made up "critical thinking": 1) the ability to apply principles; 2) the ability to interpret data; and 3) the ability to associate with the "nature of proof" (Dunning, 1954). Once again, the use of knowledge is prevalent. One major deviation from Glaser's definition here was Dunning's focus on the cognitive aspects of "critical thinking", whereas Glaser had noted some affective resources, including having a disposition or attitude to engage in the act of "critical thinking". Later, Dunning extended their definition of "critical thinking" and framed it in terms of the three associated abilities along with the cognitive procedures people employ in unfamiliar problem contexts, the ability to sort relevant from irrelevant information pertinent to a problem, and the ability to estimate the impact of variables on a particular solution (Dunning, 1954, 1956). The mention of "unfamiliar problem contexts" in Dunning's definition would resurface in other definitions later.

In the 1960s, other scholars started to define "critical thinking" in terms of inference, deduction, interpretation, and evaluation (Smith, 1963) or as the "evaluation of assertions, assumptions, authorities, conclusions, and the like," (George, 1967, 1968). Around this time, some of the language used to describe "critical thinking" sounded reminiscent of Bloom's taxonomy (Bloom et al., 1956). For example, some have highlighted that "critical thinking" involves application and synthesis, two tiers of Bloom's taxonomy: "Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information," (Forawi, 2016; Gupta et al., 2015; Oliver-Hoyo, 2003).

Interestingly, some early articles situated "critical thinking" as a "scientific" way of thinking, and since students in science and math achieved "higher" performance on "critical thinking" evaluations in one study, the author concluded that these students were simply "more intelligent than the other groups of students…" (George, 1967). However, this study did not report the backgrounds of these students, and a critical evaluation of the instrument used was never mentioned, making these broad generalizations and statements problematic. The statements by George are also interesting given the debate over what critical thinking is amongst philosophers (Facione, 1990). George went on to conduct additional studies that correlated higher grades in science courses (biology in particular) with higher levels of "critical thinking" ability (George, 1968). It could be the case that these early studies and statements offered by George informed how "critical thinking" is perceived in science education today.

As literacy grew in importance in education, connections between it and "critical thinking" were made. It has been suggested that learning environments need to be more open to allow students to explore concepts, make mistakes, and reflect on what they are learning (Vieira et al., 2011). For decades, some have suggested sacrificing lecture time to allow for such discussions and learning activities to take place (Kogut, 1996).
Paul and Elder have described "critical thinking" as entailing close reading, writing, and ethical reasoning while also emphasizing that these facets need to be made more explicit to students (Paul & Elder, 2011). On the other hand, some have described "critical thinking" as a form of critiquing information and questioning (Osborne, 2014). Such a view may align well with the Framework and the scientific practices. Mulnix defined "critical thinking" as "a process, a skilled activity of thought. It includes a commitment to using reason in the formulation of our beliefs. It is not the same as creative, imaginative or emotion-based thinking. And, as with any skill, it can be possessed to a greater or lesser degree," (Mulnix, 2012). Ultimately, Mulnix summarizes by saying that "critical thinking is the same as thinking rationally or reasoning well" (Mulnix, 2012). However, some may argue that "reasoning well" is subjective.

Some articles have discussed encouraging "critical thinking" in chemistry by asking questions of students, using examples to demonstrate uncertainty in scientific knowledge and theories, promoting discussions, offering feedback, and modeling "critical thinking" for students (Kogut, 1996); however, Kogut never provided an explicit definition of "critical thinking" in this article. Similarly, Oliver-Hoyo (2003) highlighted that developing "critical thinking" requires repetition and that feedback is important for its development (Oliver-Hoyo, 2003). In a phenomenological study on perceptions of science education by scientists and K-12 science teachers, many of the participants (scientists and science teachers alike) declared that students needed more experience with "critical thinking", yet the interviewers did not ask the participants what exactly they meant (Taylor et al., 2008). As previously noted, other scholars have investigated the role of inquiry-based instruction or writing in laboratory work on the development of "critical thinking" in chemistry courses (Gupta et al., 2015; Weaver et al., 2016). Tseng et al. suggest that "supporting students in critical thinking, specifically by inducing critique of scientific claims, may be effective in raising epistemic vigilance against erroneous information," (Tseng et al., 2021), which shows connections to argumentation.

Despite all of the work to operationalize "critical thinking", others have highlighted the lack of coherence within research on "critical thinking" and criticized science education's conception of "critical thinking" as specific mental and physical procedures (Bailin, 2002). Bailin claimed that conceptualizing "critical thinking" as a "mental process" was problematic given that these "mental processes" are unobservable. Furthermore, Bailin also challenged framing "critical thinking" as procedures (such as applying, interpreting, analyzing, etc.) because individuals could follow procedures without ever thinking critically. Instead, Bailin proposed that "critical thinking" be framed according to criteria or norms and adopted a resources perspective, stating: "I am concerned, not with trying to determine the nature and boundaries of domains to which certain critical thinking skills transfer or the nature of the mental entities which are transferred, but rather with which resources apply to particular challenges and how widely applicable such resources are," (Bailin, 2002). Overall, Bailin asserted that criteria for "critical thinking" were more important, as they would help guide which procedures and heuristics would be productive in a given context.
Associating "Critical Thinking" with Other Amorphous Terms

In some cases, other ill-defined terms such as "problem-solving" have been associated with "critical thinking", making both terms all the more confusing. Some scholars have separated "critical thinking" from "problem-solving", ultimately concluding that "problem-solving" was a part of the "critical thinking" process (Rickert, 1967). However, others have assumed both constructs were the same skill (Charen, 1970). Similarly, the link between "critical thinking" and "inquiry" has also been considered (Byrne & Johnstone, 1987; Gupta et al., 2015; Weaver et al., 2016). In the same vein, "critical thinking" has also been situated broadly as what people do when they are uncertain of what exactly to do, for example, if they were operating in a new context (George, 1967; Vieira et al., 2011).

"Critical Thinking", Transfer, and Metacognition

As more scholars considered "critical thinking" in science education, they attempted to determine how well "critical thinking" ability could transfer between different domains of science (Charen, 1970; Ennis, 1962; Lau, 2011). However, transfer has always been a problematic idea (Bransford & Schwartz, 1999). Other scholars have promoted the idea that "critical thinking" is more discipline-specific and relies heavily on the content knowledge of the discipline (Byrne & Johnstone, 1987; McPeck, 1981). Yet others have asserted that "critical thinking" does not have to be entirely transferable or entirely discipline-specific (Mulnix, 2012; Siegel, 1989). That is, perspectives and evidence suggest that students are likely to transfer their knowledge to unfamiliar problems within the context of a course (near transfer); however, evidence of transfer between courses or to the outside world (far transfer) is scant. "Critical thinking" has also been asserted to involve metacognition (Kuhn, 1999; Tsai, 2001). The incorporation of new ideas and ways of processing into "critical thinking" further highlights that the term means many different things to many different people.

"Critical Thinking" and the Affective Domain

As readers may recall, Glaser invited affective considerations into the discussion on "critical thinking". The concern over affective dispositions in "critical thinking" was considered in the "Delphi Report", where Facione notes that: "[Experts] argue that these dispositions flow from, and are implied by, the very concept of [critical thinking], much as the cognitive dispositions are. These experts argue that being adept at [critical thinking] skills but habitually not using them appropriately disqualifies one from being called a critical thinker at all," (Facione, 1990). In an attempt to summarize some of the thoughts on "critical thinking", Mason identified its prevalent features as follows: 1) skills of critical reasoning; 2) a motivation for "critical thinking"; and 3) content knowledge (Mason, 2007). Therefore, the affective domain's role in "critical thinking", though mentioned in previous decades, began to receive more consideration (Byrne & Johnstone, 1987; Facione, 1990; Glaser, 1941; Mason, 2007; Siegel, 1989; Smith, 1963). The previously mentioned "Delphi Report" was influential in later definitions and conceptualizations of "critical thinking".
The "Delphi Report" used a panel of 46 experts on "critical thinking" to operationalize the term. Their consensus statement on "critical thinking" stated: "We understand critical thinking to be purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based... The ideal critical thinker is habitually inquisitive, well-informed, trustful of reason, open-minded, flexible, fair-minded in evaluation, honest in facing personal biases, prudent in making judgments, willing to reconsider, clear about issues, orderly in complex matters, diligent in seeking relevant information, reasonable in the selection of criteria, focused in inquiry, and persistent in seeking results which are as precise as the subject and the circumstances of inquiry permit," (Facione, 1990). This definition has been leveraged recently in chemistry education (Danczak et al., 2017, 2020); however, it is important to note that this panel of experts consisted of over 50% philosophers and may not be a representative sample of scholars. Furthermore, the panel did not reach full consensus on how best to operationalize "critical thinking", only a majority.

Connecting "Critical Thinking" to the 3DL Scientific Practices

As will be noted in Chapter VII, some of this language sounds similar to the more recent Framework for K-12 Science Education scientific practices (National Research Council, 2012). For example, some scholars' definitions of "critical thinking" relied heavily on the tenet of argumentation, one of the eight scientific practices in the Framework: "…a critical thinker must be able to assess reasons and their ability to warrant beliefs, claims, and actions properly. This means that the critical thinker must have a good understanding of, and the ability to utilize, principles governing the assessment of reasons," (Mulnix, 2012; Siegel, 1989). This focus on argumentation has led others to situate argumentation as a way to develop "critical thinking" in students (Hand et al., 2018; Osborne et al., 2004). The practice of questioning, another scientific practice in the Framework, has also been stated to develop "critical thinking" (Crenshaw et al., 2011). The connection between some of these definitions of "critical thinking" and the scientific practices in the Framework has led some to suggest that the eight practices established by the Framework could represent the "disaggregated parts of nebulous umbrella terms such as 'inquiry' or 'critical thinking'," (Cooper, 2015; Stowe & Cooper, 2017).

Measuring "Critical Thinking"

Attempts to measure "critical thinking" have led to the generation of several instruments, including the Watson-Glaser Critical Thinking Appraisal (WGCTA) (Assessment Day Ltd., n.d.; Watson & Glaser, 1964), the Critical Thinking Attribute Survey (CTAS) (Forawi, 2016; Wright & Forawi, 2000), the Cornell Critical Thinking (CCT) test (Ennis, 1962; The Critical Thinking Co., 2021), and the California Critical Thinking Disposition Inventory (CCTDI) (Banning, 2006; Insight Assessment, 2020), among others. Despite all of the work put into these assessments, some have argued that since "critical thinking" is so complex, a single form of assessment will not paint an accurate picture of the ability (by whatever definition) (Crenshaw et al., 2011).
Stowe and Cooper have also noted the difficulty with measuring something like "critical thinking": "…it is very hard to measure improvement in a construct of interest (like 'critical thinking') if we do not agree what it is or how to measure it…" (Stowe & Cooper, 2017). In work with the Science Writing Heuristic (SWH), Hand et al. found that interventions with the SWH improved third- to fifth-grade students' "critical thinking" scores on the Cornell Critical Thinking (CCT) test (Hand et al., 2018); ultimately, this implies that Hand et al. accepted the definitions offered by the authors of the CCT. Recently, the Danczak-Overton-Thompson Critical Thinking Test, also known as the DOT test, was published (Danczak et al., 2020). The authors acknowledged the many definitions of "critical thinking" present in the literature and operationalized "critical thinking" as including analysis, problem-solving, inference, and judgment (Danczak et al., 2020).

Student Perceptions of "Critical Thinking"

While studies have been done to gather student perceptions of "critical thinking", their aims differed from ours. For example, Scott (2008) gathered student perceptions of how the use of debate in technology courses enhanced their "critical thinking" (Scott, 2008). In this article, Scott described various practices, such as asking questions, researching information, testing, analyzing, and communicating, as all involving "critical thinking" (Scott, 2008). In 2013, Moore interviewed seventeen academic faculty in philosophy, history, and literary/cultural studies about their perceptions of "critical thinking" and identified a few major themes, including: 1) making judgments; 2) skeptical thinking; and 3) close reading (Moore, 2013). Other themes and discussion were included by Moore, but ultimately, they concluded that these faculty did have a notion of what they meant by "critical thinking" and that "critical thinking" is "far from being a largely 'buried' and 'ineffable' concept within university education…" (Moore, 2013). In one study with Master's-level teachers in the UK who identified as international students, researchers found that students had varying perceptions of "critical thinking", including that it entails asking questions or asking "why" about something, that it is a way to come up with a solution to a particular problem, or that the need to provide evidence to support an argument silenced an individual's voice (Hammersley-Fletcher & Hanley, 2016).

Teaching "critical thinking" was found to come with its own challenges. Tan noted that teaching "critical thinking" requires a consideration of "cultural compatibility" and the sociocultural context. In their study of teaching "critical thinking" in Singapore, the participants identified two major challenges: 1) the expectation that teachers will act more as knowledge transmitters rather than facilitators, and 2) the perception that "critical thinking" is adversarial (Tan, 2017). The participants in Tan's study recommended using cooperative learning approaches and spending time to develop a safe learning context (Tan, 2017).

One of the studies most similar to ours was conducted by Danczak et al. (2017). Danczak and colleagues investigated student, teaching staff (TAs and teaching faculty), and employers' definitions of "critical thinking".
Using qualitative analysis of data collected via an open-ended questionnaire, the authors summarized overarching definitions for each group based on the data. They concluded that, overall, the students defined "critical thinking" as "to analyze and critique objectively when solving a problem". The teaching staff's definition was "to analyze, critique, and evaluate through the logical and objective application of knowledge to arrive at an outcome when solving a problem". Finally, the employers defined "critical thinking" as "to analyze, critique and evaluate problems and opportunities through logical, systematic, objective and creative application of knowledge so as to arrive at an outcome and recognize the larger scale context in which these problems and opportunities occur" (Danczak et al., 2017). The authors ultimately concluded that there was limited agreement amongst the different groups and that employers had more to add to their definitions.

Summary

Some of the earlier conceptions of "critical thinking" certainly remain (Hunter, 2014), and in recent literature, "critical thinking" has continued to be distanced from rote memorization practices and has instead been associated more with how people think (Mulnix, 2012). With this in mind, connections between the cognitive and affective domains have been strengthened, as it has been pointed out that even if someone has the ability to think critically (by whatever definition), it does not mean they will employ it every time, or that they will not make mistakes (Facione, 1990; Mulnix, 2012). As previously noted, "critical thinking" has been associated with metacognition, and it has been further extended by some to involve creativity in order to solve complex problems such as climate change (Lau, 2011). Lau also outright states that "good critical thinking is a cognitive skill", ultimately challenging the research and theories associating the affective domain with "critical thinking" (Lau, 2011). Regardless of the definition used, most scholars have highlighted that, for their definitions, it cannot be assumed that students will develop these skills on their own; students will need direct instruction in order to develop "critical thinking" as defined by the instructor (Barak et al., 2007; Byrne & Johnstone, 1987; Facione, 1990; George, 1967, 1968; Mulnix, 2012; Rickert, 1967; Vieira et al., 2011). Yet, despite all of this, we note that it may not be entirely necessary to establish a single definition of "critical thinking" as long as people define exactly what they mean and are explicit about it with their students (Moore, 2013).

The purpose of this literature review on "critical thinking" was to highlight the amorphous nature of the term, yet how pervasive it is in research and instruction. My aim was to provide a rather exhaustive account of why a study on student perceptions of "critical thinking" was necessary. Although a similar study was conducted previously (Danczak et al., 2017), it left open questions about how students conceptualize such an amorphous term.

Grounded Theory in Chemistry Education

When it comes to grounded theory, some scholars suggest that literature reviews not be conducted until after the relevant project is done (Corbin & Strauss, 2015). However, others challenge this idea, saying that it does not match current research expectations and that researchers are not blank slates when they start a research project (Charmaz, 2006; Timonen et al., 2018).
Regardless, to my knowledge, grounded theory has not been utilized much in science education, let alone chemistry education, despite its robust methodological approach to qualitative data. In 2015, Randles and Overton conducted a grounded theory qualitative study in which they explored the approaches used by experts (academics), those transitioning from novice to expert (industrial chemists), and novices (undergraduates) to solve open-ended problems. The authors inductively developed a variety of codes, but they ultimately concluded that the faculty (experts) were more likely to find a successful solution to the open-ended problems, followed by the industrial chemists, and then the undergraduates. Furthermore, the expert faculty relied on more practices and analytical skills than the other two cohorts when approaching problems (Randles & Overton, 2015). Although the authors claimed this study was grounded theory, there were some large deviations from the method. For example, according to the article, the authors never engaged in a theoretical coding stage in which they would develop their overarching theory or analytical framework from the codes. Instead, the authors relied heavily on the codes and themes they initially developed. Another point of consideration was that the authors did not engage in theoretical sampling. Although grounded theory studies have been published without it (Dunn et al., 2019; Flaherty, 2020b), other scholars state that theoretical sampling is a necessary core principle for a study to be considered grounded theory (Timonen et al., 2018).

A second grounded theory study was conducted by Flaherty in 2020. In this study, Flaherty was interested in elucidating student perceptions of the structure and development of scientific knowledge. The students in the study were selected from a transformed organic chemistry course (OCLUE, to be exact). In this study, Flaherty found that students in this transformed organic chemistry course perceived that disciplines outside of science were more naïve, that science was not simply rote memorization and required knowing the underlying causes, and that in OCLUE they had to bring forth and question prior understandings to address new problems (Flaherty, 2020b). The Flaherty study concluded with a "core category", which would entail the theoretical coding stage; however, Flaherty also did not engage in theoretical sampling. This information was included to acknowledge that the "critical thinking" study, which employed a constructivist grounded theory approach, could also serve as a methodological article.

REFERENCES

1. 3DL4US. (n.d.). Three-Dimensional Learning for Undergraduate Science. https://3dl4us.org
2. Adams, W. K., Perkins, K. K., Dubson, M., Finkelstein, N. D., & Wieman, C. E. (2005). The Design and Validation of the Colorado Learning Attitudes about Science Survey. AIP Conference Proceedings, 790(April 2015), 45–48. https://doi.org/10.1063/1.2084697
3. Anderson, C. W. A., Gane, B., & Hmelo-Silver, C. E. (2018). CCCs as epistemic heuristics to guide student sense-making of phenomena. 51–65.
4. Assessment Day Ltd. (n.d.). Watson Glaser critical thinking appraisal. https://www.assessmentday.co.uk/watson-glaser-critical-thinking.htm
5. Bailin, S. (2002). Critical thinking and science education. Science and Education, 11(4), 361–375. https://doi.org/10.1023/A:1016042608621
6. Bain, K., Bender, L., Bergeron, P., Caballero, M. D., Carmel, J. H., Duffy, E. M., Ebert-May, D., Fata-Hartley, C. L., Herrington, D. G., Laverty, J. T., Matz, R. L., Nelson, P. C., Posey, L. A., Stoltzfus, J. R., Stowe, R. L., Sweeder, R. D., Tessmer, S. H., Underwood, S. M., Urban-Lurain, M., & Cooper, M. M. (2020). Characterizing college science instruction: The Three-Dimensional Learning Observation Protocol. PLoS ONE, 15(6). https://doi.org/10.1371/journal.pone.0234640
7. Banning, M. (2006). Nursing research: Perspectives on critical thinking. British Journal of Nursing, 15, 458–461.
8. Barak, M., Ben Chaim, D., & Zoller, U. (2007). Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Research in Science Education, 37(4), 353–369. https://doi.org/10.1007/s11165-006-9029-2
9. Barbera, J., Adams, W. K., Wieman, C. E., & Perkins, K. K. (2008). Modifying and Validating the Colorado Learning Attitudes about Science Survey for Use in Chemistry. Journal of Chemical Education, 85(10), 1435–1439. https://doi.org/10.1021/ed085p1435
10. Barron, H. A., Brown, J. C., & Cotner, S. (2021). The culturally responsive science teaching practices of undergraduate biology teaching assistants. Journal of Research in Science Teaching, 58(9), 1320–1358. https://doi.org/10.1002/tea.21711
11. Bauer, C. F. (2008). Attitude towards chemistry: A semantic differential instrument for assessing curriculum impacts. Journal of Chemical Education, 85(10), 1440–1445. https://doi.org/10.1021/ed085p1440
12. Baumfalk, B., Bhattacharya, D., Vo, T., Forbes, C., Zangori, L., & Schwarz, C. (2019). Impact of model-based science curriculum and instruction on elementary students' explanations for the hydrosphere. Journal of Research in Science Teaching, 56(5), 570–597. https://doi.org/10.1002/tea.21514
13. Becker, N., Noyes, K., & Cooper, M. M. (2016). Characterizing Students' Mechanistic Reasoning about London Dispersion Forces. Journal of Chemical Education, 93(10), 1713–1724. https://doi.org/10.1021/acs.jchemed.6b00298
14. Becker, N., Rasmussen, C., Sweeney, G., Wawro, M., Towns, M., & Cole, R. (2013). Reasoning using particulate nature of matter: An example of a sociochemical norm in a university-level physical chemistry class. Chemistry Education Research and Practice, 14(1), 81–94. https://doi.org/10.1039/c2rp20085f
15. Berland, L. K., & McNeill, K. L. (2010). A learning progression for scientific argumentation: Understanding student work and designing supportive instructional contexts. Science Education, 94(5), 765–793. https://doi.org/10.1002/sce.20402
16. Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of Educational Objectives: The Classification of Educational Goals. David McKay Company.
17. Bowen, R. S., Flaherty, A. A., & Cooper, M. M. (2022). Investigating student perceptions of transformational intent and classroom culture in organic chemistry courses. Chemistry Education Research and Practice. https://doi.org/10.1039/D2RP00010E
18. Bransford, J. D., & Schwartz, D. L. (1999). Rethinking transfer: A simple proposal with multiple implications. Review of Research in Education, 24, 61–100. https://doi.org/10.3102/0091732x024001061
19. Byrne, M. S., & Johnstone, A. H. (1987). Critical Thinking and Science Education. Studies in Higher Education, 12(3), 325–339. https://doi.org/10.1080/03075078712331378102
20. Chang, J., & Song, J. (2016). A case study on the formation and sharing process of science classroom norms. International Journal of Science Education, 38(5), 747–766. https://doi.org/10.1080/09500693.2016.1163435
21. Charen, G. (1970). Do laboratory methods stimulate critical thinking? Science Education, 54(3), 267–271. https://doi.org/10.1002/sce.3730540315
22. Charmaz, K. (2006). Coding in Grounded Theory Practice & Memo Writing. Sage.
23. Cooper, M. M. (2015). Why Ask Why? Journal of Chemical Education, 92(8), 1273–1279. https://doi.org/10.1021/acs.jchemed.5b00203
24. Cooper, M. M. (2020). The Crosscutting Concepts: Critical Component or "Third Wheel" of Three-Dimensional Learning? Journal of Chemical Education, 97(4), 903–909. https://doi.org/10.1021/acs.jchemed.9b01134
25. Cooper, M. M., & Klymkowsky, M. (2013). Chemistry, Life, the Universe, and Everything: A New Approach to General Chemistry, and a Model for Curriculum Reform. Journal of Chemical Education, 90(9), 1116–1122. https://doi.org/10.1021/ed300456y
26. Cooper, M. M., Posey, L. A., & Underwood, S. M. (2017). Core Ideas and Topics: Building Up or Drilling Down? Journal of Chemical Education, 94(5), 541–548. https://doi.org/10.1021/acs.jchemed.6b00900
27. Cooper, M. M., Stowe, R. L., Crandell, O. M., & Klymkowsky, M. W. (2019). Organic Chemistry, Life, the Universe and Everything (OCLUE): A Transformed Organic Chemistry Curriculum. Journal of Chemical Education, 96(9), 1858–1872. https://doi.org/10.1021/acs.jchemed.9b00401
28. Cooper, M. M., Underwood, S. M., & Hilley, C. Z. (2012). Development and validation of the implicit information from Lewis structures instrument (IILSI): Do students connect structures with properties? Chemistry Education Research and Practice, 13(3), 195–200. https://doi.org/10.1039/C2RP00010E
29. Cooper, M. M., Underwood, S. M., Hilley, C. Z., & Klymkowsky, M. W. (2012). Development and assessment of a molecular structure and properties learning progression. Journal of Chemical Education, 89(11), 1351–1357. https://doi.org/10.1021/ed300083a
30. Cooper, M. M., Williams, L. C., & Underwood, S. M. (2015). Student Understanding of Intermolecular Forces: A Multimodal Study. Journal of Chemical Education, 92(8), 1288–1298. https://doi.org/10.1021/acs.jchemed.5b00169
31. Corbin, J., & Strauss, A. (2015). Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory (Fourth Edition). Sage.
32. Crandell, O. M., Kouyoumdjian, H., Underwood, S. M., & Cooper, M. M. (2019). Reasoning about Reactions in Organic Chemistry: Starting It in General Chemistry. Journal of Chemical Education, 96(2), 213–226. https://doi.org/10.1021/acs.jchemed.8b00784
33. Crandell, O. M., Lockhart, M. A., & Cooper, M. M. (2020). Arrows on the Page Are Not a Good Gauge: Evidence for the Importance of Causal Mechanistic Explanations about Nucleophilic Substitution in Organic Chemistry. Journal of Chemical Education, 97(2), 313–327. https://doi.org/10.1021/acs.jchemed.9b00815
34. Crenshaw, P., Hale, E., & Harper, S. L. (2011). Producing Intellectual Labor In The Classroom: The Utilization Of A Critical Thinking Model To Help Students Take Command Of Their Thinking. Journal of College Teaching & Learning (TLC), 8(7), 13–13. https://doi.org/10.19030/tlc.v8i7.4848
35. Dalgety, J., Coll, R. K., & Jones, A. (2003). Development of chemistry attitudes and experiences questionnaire (CAEQ). Journal of Research in Science Teaching, 40(7), 649–668. https://doi.org/10.1002/tea.10103
36. Danczak, S. M., Thompson, C. D., & Overton, T. L. (2017). "What does the term Critical Thinking mean to you?" A qualitative analysis of chemistry undergraduate, teaching staff and employers' views of critical thinking. Chemistry Education Research and Practice, 18(3), 420–434. https://doi.org/10.1039/c6rp00249h
37. Danczak, S. M., Thompson, C. D., & Overton, T. L. (2020). Development and validation of an instrument to measure undergraduate chemistry students' critical thinking skills. Chemistry Education Research and Practice, 21(1), 62–78. https://doi.org/10.1039/c8rp00130h
38. Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., & Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proceedings of the National Academy of Sciences of the United States of America, 116(39), 19251–19257. https://doi.org/10.1073/pnas.1821936116
39. Douglas, K. A., Yale, M. S., Bennett, D. E., Haugan, M. P., & Bryan, L. A. (2014). Evaluation of Colorado Learning Attitudes about Science Survey. Physical Review Special Topics - Physics Education Research, 10(2), 1–10. https://doi.org/10.1103/PhysRevSTPER.10.020128
40. Dunn, A. H., Sondel, B., & Baggett, H. C. (2019). "I Don't Want to Come Off as Pushing an Agenda": How Contexts Shaped Teachers' Pedagogy in the Days After the 2016 U.S. Presidential Election. American Educational Research Journal, 56(2), 444–476. https://doi.org/10.3102/0002831218794892
41. Dunning, G. M. (1954). Evaluation of critical thinking. Science Education, 38(3), 191–211. https://doi.org/10.1002/sce.3730380304
42. Dunning, G. M. (1956). Critical thinking and research. Science Education, 40(2), 83–86. https://doi.org/10.1002/sce.3730400203
43. Eichler, J. F. (2022). Future of the Flipped Classroom in Chemistry Education: Recognizing the Value of Independent Preclass Learning and Promoting Deeper Understanding of Chemical Ways of Thinking During In-Person Instruction. Journal of Chemical Education, 99(3), 1503–1508. https://doi.org/10.1021/acs.jchemed.1c01115
44. Ennis, R. H. (1962). A concept of critical thinking. Harvard Educational Review, 32, 81–111.
45. Facione, P. A. (1990). Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. Executive Summary, "The Delphi Report". The California Academic Press, 423(c), 1–19. http://www.insightassessment.com/pdf_files/DEXadobe.PDF
46. Flaherty, A. A. (2020b). Investigating perceptions of the structure and development of scientific knowledge in the context of a transformed organic chemistry lecture course. Chemistry Education Research and Practice, 21, 570–581. https://doi.org/10.1039/c9rp00201d
47. Fleming, C. M. (2018). How to be Less Stupid About Race. Beacon Press.
48. Forawi, S. A. (2016). Standard-based science education and critical thinking. Thinking Skills and Creativity, 20, 52–62. https://doi.org/10.1016/j.tsc.2016.02.005
49. George, K. D. (1967). A comparison of the critical-thinking abilities of science and non-science majors. Science Education, 51(1), 11–18. https://doi.org/10.1002/sce.3730510103
50. George, K. D. (1968). The effect of critical-thinking ability upon course grades in biology. Science Education, 52(5), 421–426. https://doi.org/10.1002/sce.3730520504
51. Glaser, E. M. (1941). An experiment in the development of critical thinking. J. J. Little & Ives Company.
52. Grove, N., & Bretz, S. L. (2007). CHEMX: An instrument to assess students' cognitive expectations for learning chemistry. Journal of Chemical Education, 84(9), 1524–1529. https://doi.org/10.1021/ed084p1524
53. Gupta, T., Burke, K. A., Mehta, A., & Greenbowe, T. J. (2015). Impact of guided-inquiry-based instruction with a writing and reflection emphasis on chemistry students' critical thinking abilities. Journal of Chemical Education, 92(1), 32–38. https://doi.org/10.1021/ed500059r
54. Hammersley-Fletcher, L., & Hanley, C. (2016). The use of critical thinking in higher education in relation to the international student: Shifting policy and practice. British Educational Research Journal, 42(6), 978–992. https://doi.org/10.1002/berj.3246
55. Hand, B., Shelley, M. C., Laugerman, M., Fostvedt, L., & Therrien, W. (2018). Improving critical thinking growth for disadvantaged groups within elementary school science: A randomized controlled trial using the Science Writing Heuristic approach. Science Education, 102(4), 693–710. https://doi.org/10.1002/sce.21341
56. Holme, T. A., Bauer, C., Trate, J. M., Reed, J. J., Raker, J. R., & Murphy, K. L. (2020). The American Chemical Society Exams Institute Undergraduate Chemistry Anchoring Concepts Content Map V: Analytical Chemistry. Journal of Chemical Education, 97(6), 1530–1535. https://doi.org/10.1021/acs.jchemed.9b00856
57. Holme, T. A., Reed, J. J., Raker, J. R., & Murphy, K. L. (2018). The ACS Exams Institute Undergraduate Chemistry Anchoring Concepts Content Map IV: Physical Chemistry. Journal of Chemical Education, 95(2), 238–241. https://doi.org/10.1021/acs.jchemed.7b00531
58. Houchlei, S. K., Bloch, R. R., & Cooper, M. M. (2021). Mechanisms, Models, and Explanations: Analyzing the Mechanistic Paths Students Take to Reach a Product for Familiar and Unfamiliar Organic Reactions. Journal of Chemical Education, 98(9), 2751–2764. https://doi.org/10.1021/acs.jchemed.1c00099
59. Hunter, D. A. (2014). A Practical Guide to Critical Thinking. https://doi.org/10.1002/9781118839751
60. Insight Assessment. (2020). California critical thinking skills test (CCTST). https://www.insightassessment.com/article/california-critical-thinking-skills-test-cctst-2
61. Kararo, A. T., Colvin, R. A., Cooper, M. M., & Underwood, S. M. (2019). Predictions and constructing explanations: An investigation into introductory chemistry students' understanding of structure-property relationships. Chemistry Education Research and Practice, 20, 316–328. https://doi.org/10.1039/c8rp00195b
62. Kelly, P. (2006). What is teacher learning? A socio-cultural perspective. Oxford Review of Education, 32(4), 505–519. https://doi.org/10.1080/03054980600884227
63. Kind, P., & Osborne, J. (2017). Styles of Scientific Reasoning: A Cultural Rationale for Science Education? Science Education, 101(1), 8–31. https://doi.org/10.1002/sce.21251
64. King, D. (2012). New perspectives on context-based chemistry education: Using a dialectical sociocultural approach to view teaching and learning. Studies in Science Education, 48(1), 51–87. https://doi.org/10.1080/03057267.2012.655037
65. Kogut, L. S. (1996). Critical Thinking in General Chemistry. Journal of Chemical Education, 73(3), 218–221.
66. Kuhn, D. (1999). A Developmental Model of Critical Thinking. Educational Researcher, 28(2), 16–25.
67. Lau, J. Y. F. (2011). An introduction to critical thinking and creativity: Think more, think better. Wiley.
68. Laverty, J. T., Underwood, S. M., Matz, R. L., Posey, L. A., Carmel, J. H., Caballero, M. D., Fata-Hartley, C. L., Ebert-May, D., Jardeleza, S. E., & Cooper, M. M. (2016). Characterizing college science assessments: The three-dimensional learning assessment protocol. PLoS ONE, 11(9), 1–21. https://doi.org/10.1371/journal.pone.0162333
69. Lazenby, K., & Becker, N. M. (2019). A Modeling Perspective on Supporting Students' Reasoning with Mathematics in Chemistry. In ACS Symposium Series (Vol. 1316, pp. 9–24). American Chemical Society. https://doi.org/10.1021/bk-2019-1316.ch002
70. Lee, H., Lee, H., & Zeidler, D. L. (2020). Examining tensions in the socioscientific issues classroom: Students' border crossings into a new culture of science. Journal of Research in Science Teaching, 57(5), 672–694. https://doi.org/10.1002/tea.21600
71. Lombardi, D., Shipley, T. F., Astronomy Team, Biology Team, Chemistry Team, Engineering Team, Geography Team, Geoscience Team, and Physics Team, Bailey, J. M., Bretones, P. S., Prather, E. E., Ballen, C. J., Knight, J. K., Smith, M. K., Stowe, R. L., Cooper, M. M., Prince, M., Atit, K., Uttal, D. H., LaDue, N. D., McNeal, P. M., Ryker, K., St. John, K., van der Hoeven Kraft, K. J., & Docktor, J. L. (2021). The Curious Construct of Active Learning. Psychological Science in the Public Interest, 22(1), 8–43. https://doi.org/10.1177/1529100620973974
72. Marek, K. A., Raker, J. R., Holme, T. A., & Murphy, K. L. (2018a). The ACS Exams Institute Undergraduate Chemistry Anchoring Concepts Content Map III: Inorganic Chemistry. Journal of Chemical Education, 95(2), 233–237. https://doi.org/10.1021/acs.jchemed.7b00498
73. Marek, K. A., Raker, J. R., Holme, T. A., & Murphy, K. L. (2018b). Alignment of ACS Inorganic Chemistry Examination Items to the Anchoring Concepts Content Map. Journal of Chemical Education, 95(9), 1468–1476. https://doi.org/10.1021/acs.jchemed.8b00241
74. Mason, M. (2007). Critical thinking and learning. Educational Philosophy and Theory, 39(4), 339–349. https://doi.org/10.1111/j.1469-5812.2007.00343.x
75. McGill, T. L., Williams, L. C., Mulford, D. R., Blakey, S. B., Harris, R. J., Kindt, J. T., Lynn, D. G., Marsteller, P. A., McDonald, F. E., & Powell, N. L. (2019). Chemistry Unbound: Designing a New Four-Year Undergraduate Curriculum. Journal of Chemical Education, 96(1), 35–46. https://doi.org/10.1021/acs.jchemed.8b00585
76. McPeck, J. E. (1981). Critical thinking and education. Routledge.
77. Merritt, J. D., Shwartz, Y., & Krajcik, J. (n.d.). Middle school students' development of the particle model of matter. 29.
78. Momsen, J., Offerdahl, E., Kryjevskaia, M., Montplaisir, L., Anderson, E., & Grosz, N. (2013). Using Assessments to Investigate and Compare the Nature of Learning in Undergraduate Science Courses. CBE—Life Sciences Education, 12(2), 239–249. https://doi.org/10.1187/cbe.12-08-0130
79. Moore, T. (2013). Critical thinking: Seven definitions in search of a concept. Studies in Higher Education, 38(4), 506–522. https://doi.org/10.1080/03075079.2011.586995
80. Mulnix, J. W. (2012). Thinking Critically about Critical Thinking. Educational Philosophy and Theory, 44(5), 464–479. https://doi.org/10.1111/j.1469-5812.2010.00673.x
81. Mutegi, J. W. (2013). "Life's first need is for us to be realistic" and other reasons for examining the sociocultural construction of race in the science performance of African American students. Journal of Research in Science Teaching, 50(1), 82–103. https://doi.org/10.1002/tea.21065
82. National Academies of Sciences, Engineering, and Medicine. (2018). How People Learn II: Learners, Contexts, and Cultures. The National Academies Press. https://doi.org/10.17226/24783
83. National Research Council. (2000). How People Learn I: Brain, Mind, Experience, and School: Expanded Edition (p. 50). The National Academies Press. https://doi.org/10.17226/9853
84. National Research Council. (2007). Taking Science to School. https://doi.org/10.17226/11625
85. National Research Council. (2012). A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas (p. 385). The National Academies Press. https://doi.org/10.17226/13165
86. NGSS Lead States. (2017). Next Generation Science Standards At. September, 1–102.
87. Noyes, K., & Cooper, M. M. (2019). Investigating Student Understanding of London Dispersion Forces: A Longitudinal Study. Journal of Chemical Education, 96(9), 1821–1832. https://doi.org/10.1021/acs.jchemed.9b00455
88. Noyes, K., McKay, R. L., Neumann, M., Haudek, K. C., & Cooper, M. M. (2020). Developing Computer Resources to Automate Analysis of Students' Explanations of London Dispersion Forces. Journal of Chemical Education, 97(11), 3923–3936. https://doi.org/10.1021/acs.jchemed.0c00445
89. Oliver-Hoyo, M. T. (2003). Designing a Written Assignment To Promote the Use of Critical Thinking Skills in an Introductory Chemistry Course. Journal of Chemical Education, 80(8), 899–903.
90. Osborne, J. (2014). Teaching Critical Thinking? New Directions in Science Education. School Science Review, 95(352), 53–62.
91. Osborne, J., Erduran, S., & Simon, S. (2004). Enhancing the quality of argumentation in school science. Journal of Research in Science Teaching, 41(10), 994–1020. https://doi.org/10.1002/tea.20035
92. Osborne, J., & Patterson, A. (2011). Scientific argument and explanation: A necessary distinction? Science Education, 95(4), 627–638. https://doi.org/10.1002/sce.20438
93. Osborne, J., Rafanelli, S., & Kind, P. (2018). Toward a more coherent model for science education than the crosscutting concepts of the next generation science standards: The affordances of styles of reasoning. Journal of Research in Science Teaching, 55(7), 962–981. https://doi.org/10.1002/tea.21460
94. Parsons, E. C., & Carlone, H. B. (2013). Culture and science education in the 21st century: Extending and making the cultural box more inclusive. Journal of Research in Science Teaching, 50(1), 1–11. https://doi.org/10.1002/tea.21068
95. Passmore, C., Gouvea, J. S., & Giere, R. (2014). Models in Science and in Learning Science: Focusing Scientific Practice on Sense-making. In M. R. Matthews (Ed.), International Handbook of Research in History, Philosophy and Science Teaching (pp. 1171–1202). Springer Netherlands. https://doi.org/10.1007/978-94-007-7654-8_36
96. Paul, R., & Elder, L. (2011). Critical Thinking: Competency Standards Essential for the Cultivation of Intellectual Skills, Part 2. Journal of Developmental Education, 35(1), 36–36.
97. Petterson, M. N., Finkenstaedt-Quinn, S. A., Gere, A. R., & Shultz, G. V. (2022). The role of authentic contexts and social elements in supporting organic chemistry students' interactions with writing-to-learn assignments. Chemistry Education Research and Practice, 23, 189–205. https://doi.org/10.1039/D1RP00181G
98. Phillips, A. M. L., Watkins, J., & Hammer, D. (2018). Beyond "asking questions": Problematizing as a disciplinary activity. Journal of Research in Science Teaching, 55(7), 982–998. https://doi.org/10.1002/tea.21477
99. Randles, C. A., & Overton, T. L. (2015). Expert vs. novice: Approaches used by chemists when solving open-ended problems. Chemistry Education Research and Practice, 16(4), 811–823. https://doi.org/10.1039/C5RP00114E
100. Redish, E. F., Saul, J. M., & Steinberg, R. N. (1998). Student expectations in introductory physics. American Journal of Physics, 66(3), 212–224. https://doi.org/10.1119/1.18847
101. Reinholz, D. L., & Apkarian, N. (2018). Four frames for systemic change in STEM departments. International Journal of STEM Education, 5(1), 1–22. https://doi.org/10.1186/s40594-018-0103-x
102. Reinholz, D. L., White, I., & Andrews, T. (2021). Change theory in STEM higher education: A systematic review. International Journal of STEM Education, 8(1), 37. https://doi.org/10.1186/s40594-021-00291-2
103. Rickert, R. K. (1967). Developing critical thinking. Science Education, 51(1), 24–27. https://doi.org/10.1002/sce.3730510106
104. Rivet, A. E., Weiser, G., Lyu, X., Li, Y., & Rojas-Perilla, D. (2016). What Are Crosscutting Concepts in Science? Four Metaphorical Perspectives. In C. K. Looi, J. L. Polman, U. Cress, & P. Reimann (Eds.), Transforming Learning, Empowering Learners: The International Conference of the Learning Sciences (Vol. 2). International Society of the Learning Sciences.
105. Rocabado, G. A., Kilpatrick, N. A., Mooring, S. R., & Lewis, J. E. (2019). Can We Compare Attitude Scores among Diverse Populations? An Exploration of Measurement Invariance Testing to Support Valid Comparisons between Black Female Students and Their Peers in an Organic Chemistry Course. Journal of Chemical Education, 96(11), 2371–2382. https://doi.org/10.1021/acs.jchemed.9b00516
106. Saltzman, J., Price, M. F., & Rogers, M. B. (2016). Initial study of neutral post-instruction responses on the Maryland Physics Expectation Survey. Physical Review Physics Education Research, 12(1), 1–6. https://doi.org/10.1103/PhysRevPhysEducRes.12.013101
107. Santos, L. F. (2017). The Role of Critical Thinking in Science Education. Journal of Education and Practice, 8(20), 159–173.
108. Schein, E. H., & Schein, P. A. (2016). Organizational Culture and Leadership (5th ed.). Jossey-Bass.
109. Schwarz, C. V., Passmore, C., & Reiser, B. J. (Eds.). (2017). Helping Students Make Sense of the World: Using Next Generation Science and Engineering Practices.
110. Schwarz, C. V., Reiser, B. J., Davis, E. A., Kenyon, L., Achér, A., Fortus, D., Shwartz, Y., Hug, B., & Krajcik, J. (2009). Developing a learning progression for scientific modeling: Making scientific modeling accessible and meaningful for learners. Journal of Research in Science Teaching, 46(6), 632–654. https://doi.org/10.1002/tea.20311
111. Schwarz, C. V., & White, B. Y. (2005). Metamodeling Knowledge: Developing Students' Understanding of Scientific Modeling. Cognition and Instruction, 23(2), 165–205. https://doi.org/10.1207/s1532690xci2302_1
112. Scott, S. (2008). Perceptions of Students' Learning Critical Thinking through Debate in a Technology Classroom: A Case Study. The Journal of Technology Studies, 34(1), 39–44. https://doi.org/10.21061/jots.v34i1.a.5
113. Semsar, K., Knight, J. K., Birol, G., & Smith, M. K. (2011). The Colorado Learning Attitudes about Science Survey (CLASS) for Use in Biology. CBE—Life Sciences Education, 10(3), 268–278. https://doi.org/10.1187/cbe.10-10-0133
114. Seymour, E., & Hewitt, N. M. (1997). Talking About Leaving: Why Undergraduates Leave the Sciences. Westview Press.
115. Siegel, H. (1989). The Rationality of Science, Critical Thinking, and Science Education. Synthese, 80, 9–41.
116. Smith, P. M. (1963). Critical thinking and the science intangibles. Science Education, 47(4), 405–408. https://doi.org/10.1002/sce.3730470420
117. Stowe, R. L., & Cooper, M. M. (2017). Practicing What We Preach: Assessing "Critical Thinking" in Organic Chemistry. Journal of Chemical Education, 94(12), 1852–1859. https://doi.org/10.1021/acs.jchemed.7b00335
118. Stowe, R. L., Scharlott, L. J., Ralph, V. R., Becker, N. M., & Cooper, M. M. (2021). You Are What You Assess: The Case for Emphasizing Chemistry on Chemistry Assessments. Journal of Chemical Education, 98(8), 2490–2495. https://doi.org/10.1021/acs.jchemed.1c00532
119. Talanquer, V. (2021). Multifaceted Chemical Thinking: A Core Competence. Journal of Chemical Education, 98(11), 3450–3456. https://doi.org/10.1021/acs.jchemed.1c00785
120. Talanquer, V., & Pollard, J. (2010). Let's Teach How We Think Instead of What We Know. Chemistry Education Research and Practice, 11, 74–83.
121. Tan, C. (2017). Teaching critical thinking: Cultural challenges and strategies in Singapore. British Educational Research Journal, 43(5), 988–1002. https://doi.org/10.1002/berj.3295
122. Tao, Y., Oliver, M., & Venville, G. (2013). A comparison of approaches to the teaching and learning of science in Chinese and Australian elementary classrooms: Cultural and socioeconomic complexities. Journal of Research in Science Teaching, 50(1), 33–61. https://doi.org/10.1002/tea.21064
123. Taylor, A. R., Gail Jones, M., Broadwell, B., & Oppewal, T. O. M. (2008). Creativity, inquiry, or accountability? Scientists' and Teachers' perceptions of science education. Science Education, 92(6), 1058–1075. https://doi.org/10.1002/sce.20272
124. The Critical Thinking Co. (2021). Cornell critical thinking tests. https://www.criticalthinking.com/cornell-critical-thinking-tests.html
125. Theobald, E. J., Hill, M. J., Tran, E., Agrawal, S., Arroyo, E. N., Behling, S., Chambwe, N., Cintrón, D. L., Cooper, J. D., Dunster, G., Grummer, J. A., Hennessey, K., Hsiao, J., Iranon, N., Jones, L., Jordt, H., Keller, M., Lacey, M. E., Littlefield, C. E., … Freeman, S. (2020). Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proceedings of the National Academy of Sciences, 117(12), 6476–6483. https://doi.org/10.1073/pnas.1916903117
126. Thiry, H., Weston, T. J., Harper, R. P., Holland, D. G., Koch, A. K., Drake, B. M., Hunter, A.-B., & Seymour, E. (2019). Talking about Leaving Revisited (E. Seymour & A.-B. Hunter, Eds.). Springer. https://doi.org/10.1007/978-3-030-25304-2
127. Timonen, V., Foley, G., & Conlon, C. (2018). Challenges when using grounded theory: A pragmatic introduction to doing GT research. International Journal of Qualitative Methods, 17, 1–10. https://doi.org/10.1177/1609406918758086
128. Tomkin, J. H., Beilstein, S. O., Morphew, J. W., & Herman, G. L. (2019). Evidence that communities of practice are associated with active learning in large STEM lectures. International Journal of STEM Education, 6(1), 1–16. https://doi.org/10.1186/s40594-018-0154-z
129. Tsai, C. C. (2001). A review and discussion of epistemological commitments, metacognition, and critical thinking with suggestions on their enhancement in internet-assisted chemistry classrooms. Journal of Chemical Education, 78(7), 970–974. https://doi.org/10.1021/ed078p970
130. Tseng, A. S., Bonilla, S., & MacPherson, A. (2021). Fighting "bad science" in the information age: The effects of an intervention to stimulate evaluation and critique of false scientific claims. Journal of Research in Science Teaching, April. https://doi.org/10.1002/tea.21696
131. Vieira, R. M., Tenreiro-Vieira, C., & Martins, I. P. (2011). Critical thinking: Conceptual clarification and its importance in science education. Science Education International, 22(1), 43–54.
132. Watson, G., & Glaser, E. M. (1964). Watson-Glaser critical thinking appraisal manual. Harcourt, Brace & World.
133. Weaver, M. G., Samoshin, A. V., Lewis, R. B., & Gainer, M. J. (2016). Developing Students' Critical Thinking, Problem Solving, and Analysis Skills in an Inquiry-Based Synthetic Organic Laboratory Course. Journal of Chemical Education, 93(5), 847–851. https://doi.org/10.1021/acs.jchemed.5b00678
134. White, K. N., Vincent-Layton, K., & Villarreal, B. (2021). Equitable and Inclusive Practices Designed to Reduce Equity Gaps in Undergraduate Chemistry Courses. Journal of Chemical Education, 98(2), 330–339. https://doi.org/10.1021/acs.jchemed.0c01094
135. Wood, N. B., Erichsen, E. A., & Anicha, C. L. (2013). Cultural emergence: Theorizing culture in and from the margins of science education. Journal of Research in Science Teaching, 50(1), 122–136. https://doi.org/10.1002/tea.21069
136. Wright, A., & Forawi, S. (2000). Social challenges require critical and creative thinkers. The Forum, Calgary, Canada.
137. Xu, X., & Lewis, J. E. (2011). Refinement of a chemistry attitude measure for college students. Journal of Chemical Education, 88(5), 561–568. https://doi.org/10.1021/ed900071q
138. Zotos, E. K., Moon, A. C., & Shultz, G. V. (2020). Investigation of chemistry graduate teaching assistants' teacher knowledge and teacher identity. Journal of Research in Science Teaching, 57(6), 943–967. https://doi.org/10.1002/tea.21618

CHAPTER IV: INVESTIGATING STUDENT PERCEPTIONS OF TRANSFORMATIONAL INTENT AND CLASSROOM CULTURE IN ORGANIC CHEMISTRY COURSES

Preface

This study sought to complement previous work on student reasoning in the transformed Organic Chemistry, Life, the Universe, and Everything (OCLUE) curriculum. Research published by my colleagues had consistently demonstrated that students in OCLUE were able to engage in causal mechanistic reasoning more often and retain this ability longer when compared to students in the traditional course. However, work had not been done to better understand whether students perceived they were doing something different than in traditional courses. This study began as exploratory, and over time it not only commented on student perceptions of transformational intent, but the results also spoke to elements of classroom culture and their impact on learning. The work presented in this chapter is a foundational piece for the other studies reported in this dissertation.

This study was originally published in Chemistry Education Research and Practice and is reprinted here with permission: Bowen, R. S., Flaherty, A. A., & Cooper, M. M. (2022). Investigating Student Perceptions of Transformational Intent and Classroom Culture in Organic Chemistry Courses. Chemistry Education Research and Practice. https://doi.org/10.1039/D2RP00010E. Copyright 2022 The Royal Society of Chemistry. A copy of permissions is included in the Appendix alongside Supplemental Information for this manuscript.

Introduction

Chemistry education research (CER) has led to the development of a number of undergraduate course transformations with the goal of improving teaching and learning in chemistry (Talanquer and Pollard, 2010; Cooper and Klymkowsky, 2013; Sevian and Talanquer, 2014; Cooper et al., 2019; McGill et al., 2019).
These transformations have been characterized and supported by research on student performance and reasoning within the contexts of these courses (Banks et al., 2015; Becker et al., 2016; Cooper et al., 2016; Crandell et al., 2019, 2020; Noyes and Cooper, 2019; Houchlei et al., 2021; Talanquer, 2021); however, little work has been done to explore student perceptions of what they think they are doing. That is, there is scarce research on student perceptions of what is valued in courses and whether these perceptions align with transformational goals.

Within the CER and science education literature, there are many studies exploring student perceptions within the affective domain of learning and student experiences across entire courses or programs (Bauer, 2005, 2008; Galloway and Bretz, 2016; Galloway et al., 2016; Flaherty, 2020a). For example, longitudinal studies such as Talking About Leaving (Seymour and Hewitt, 1997) and Talking About Leaving Revisited (Thiry et al., 2019) have leveraged student perceptions and found that students perceive competitive, unsupportive class cultures in many of their STEM courses, including chemistry. According to students in these studies, the class cultures, in conjunction with many other factors, ultimately contributed to their decisions to switch out of their STEM majors (Seymour and Hewitt, 1997; Thiry et al., 2019). More recently, studies have explored student perceptions of their chemistry courses following the shift to online instruction during the 2020-2022 COVID-19 pandemic (Ramachandran and Rodriguez, 2020).

Outside of the affective domain, research on student perceptions of learning has also been common. For example, one study explored how students interpreted structure, property, and function relationships across biology and chemistry. The authors found that while students could discuss structure and properties in the context of both courses, students had more difficulty discussing function in the context of chemistry (Kohn et al., 2018). Such work is supported by previous research which found that students may miss crucial information during instruction that could aid their understanding, causing them to not perform as well as they intended despite their success in earlier chemistry courses (Anderson and Bodner, 2008). In the context of undergraduate laboratories, work with course-based undergraduate research experiences (CUREs) found that students demonstrated gains in their perceived knowledge, experience, and confidence with specific research-related abilities. The authors concluded that such perceptions could help instructors with course evaluation and assessment design (Irby et al., 2020).

Student perceptions have also been leveraged to better understand how students engage in critical thinking. For example, Scott studied student perceptions of critical thinking after they completed a technology course where debate was employed as a pedagogical tool and found that students perceived their critical thinking abilities had been enhanced (Scott, 2008). Similarly, Hammersley-Fletcher and Hanley used student perceptions to explore the ways that international students in the UK viewed critical thinking and concluded that students thought that certain approaches associated with critical thinking silenced their voices (Hammersley-Fletcher and Hanley, 2016).
Finally, in a study investigating student, teaching staff, and employer perceptions of the definition of critical thinking in chemistry communities, Danczak and colleagues found that definitions across the groups differed and that students perceived "critique", "objectivity", and "problem-solving" were all components of critical thinking (Danczak et al., 2017). All of these examples highlight the robust and insightful nature of student perceptions in chemistry education, making them a significant area of research.

In a study related to this work, co-author AAF employed constructivist grounded theory to investigate student perceptions of the structure and development of scientific knowledge within the transformed organic chemistry course discussed here. After interviewing twelve students in the transformed course, the findings indicated that students perceived memorization of content was not as effective as being able to reason, that students needed to critique information by interrogating prior knowledge, and that students recognized differences in explaining how and why chemical phenomena occur, among others (Flaherty, 2020b). Though this initial study was influential for our work discussed here, it is important to note that it asked fundamentally different research questions and was not comparative. That is, it focused on student perceptions of the structure and development of scientific knowledge and focused exclusively on students in the transformed course, without comparing their perceptions to those of students in other organic chemistry environments. Regardless, the findings pushed us to pursue this line of inquiry further.

Although student perceptions had been leveraged in a variety of ways, to our knowledge, they had not been used to further assess transformation efforts and to ascertain whether student understanding of course goals aligned with instructor expectations. Therefore, we found student perceptions of expectations and what was valued to be a significant area of study for four reasons: 1) it complemented our previous research on our transformational efforts at our institution (Crandell et al., 2019, 2020; Houchlei et al., 2021); 2) it afforded another perspective and way to characterize our transformation that did not focus on student reasoning; 3) it allowed us to explore alignment between our transformational intent, expectations, and student perceptions (and ascertain whether there was misalignment); and 4) considering our transformational efforts were informed by research on how people learn and think (National Research Council, 2000, 2012b, 2012a; National Academies of Sciences, 2018), this study would enable us to investigate whether student perceptions of what was expected and valued in courses aligned with the evidence base on effective ways of doing and thinking.

With these motivations in mind, we embarked on this exploratory study. However, as we will explain later, the study evolved as we interfaced with and interpreted the data. Influenced by the Talking About Leaving studies (Seymour and Hewitt, 1997; Thiry et al., 2019), we were reminded that student perceptions can be used to provide insights on elements of the classroom culture. Studies have shown that alignment between course goals and classroom practices can lead to a more productive learning experience and engagement with scientific practices (Sandoval et al., 2019).
Furthermore, we recognized that certain classroom norms communicate implicit and explicit messages to students about how to participate, think, and practice (Becker et al., 2013; Chang and Song, 2016; Reinholz and Apkarian, 2018). Considering that we were interested in knowing what students perceived they were doing in these courses and whether those perceptions aligned with our transformational intent, our interpretations and discussions evolved to consider the classroom cultures of the two organic chemistry courses in this study.

Our previously published research compared student performance on a variety of tasks, including constructing causal mechanistic explanations and the use of mechanistic arrows, across a two-semester sequence of a transformed organic chemistry course (Crandell et al., 2019, 2020; Houchlei et al., 2021). As a result, we have insights on student thinking and their approaches to such tasks. Therefore, we opted to engage in this exploratory study in which we investigated student perceptions of two organic chemistry courses (including our transformed course) that, in our opinion and from our previous research, employed different approaches to teaching and learning. One of the courses was transformed using three-dimensional learning (National Research Council, 2012a; 3DL4US, n.d.), and the other embodied a more traditional approach to organic chemistry (as further discussed below). Considering that our previous studies afforded us insights into how students responded to different types of organic chemistry tasks, the work presented here attempted to characterize the course experiences from student perspectives. Our motivations for this work were driven by an interest in complementing this previous work and characterizing our transformation efforts from a different perspective. Just as some CER scholars have argued that student perceptions can inform assessment design (Irby et al., 2020), we assert that student perceptions of what is expected and valued can inform course design and transformational efforts. Furthermore, this study enabled us to explore alignment among student perceptions of what they are expected to do, what is valued, our transformational intent, and the evidence base on effective ways of doing and thinking. As we will note later, given the research on the role that alignment between classroom practices, course goals, and norms of participation and practice plays within the classroom culture, we ultimately discuss and situate this work within a sociocultural perspective that is informed by culture scholars (Vygotsky, 1978; Rogoff, 1990; John-Steiner and Mahn, 1996; Carlone et al., 2011; Becker et al., 2013; Chang and Song, 2016; Schein and Schein, 2016; Reinholz and Apkarian, 2018; Sandoval et al., 2019; Zotos et al., 2020; Petterson et al., 2022).

In order to gather this data, we needed an instrument that would help us capture student perceptions in a robust way while minimizing external influences on student responses. Although there are a number of previously developed instruments for use in higher education that address student perceptions, expectations, and other affective states, none of them met the needs of this study; therefore, we opted to develop our own. Our instrument involved three open-ended questions, which will be discussed in more detail later. These questions specifically target student perceptions of how they were expected to think in organic chemistry, what they found most difficult in the course, and how they perceived they were assessed.
First, however, we find it important to review some of these instruments to justify the development of our own.

Previously Published Instruments

Many of the previously published instruments we reviewed relied on the use of Likert or semantic differential scales where students responded to prompts developed by researchers. One of the first wide-scale uses of Likert-scale instruments in higher education was the Maryland Physics Expectations (MPEX) survey, which was developed by Redish and co-workers (Redish et al., 1998). The MPEX later led to the development of the corresponding survey for chemistry known as the CHEMX (Grove and Bretz, 2007). Both the MPEX and the CHEMX have students respond to closed-ended questions on an agree-disagree Likert scale and are designed to gather information on student assumptions, beliefs, and cognitive expectations within physics and chemistry. According to Redish, cognitive expectations refer to students' "expectations about their understanding of the process of learning [physics] and the structure of [physics] knowledge rather than about the content of physics itself." The CHEMX survey has a similar guiding philosophy. Both surveys compare student responses to expert responses, and it is notable that students appear to become less "expert-like" in their expectations and understanding of how science is done over the course of two semesters of introductory physics and chemistry. The authors of these surveys ascribe this apparent regression to how the content of these introductory courses is structured and how the courses are taught.

The Colorado Learning Attitudes about Science Survey (CLASS) is also a Likert-scale instrument developed for physics (Adams et al., 2005) and adapted for chemistry (Barbera et al., 2008) and biology (Semsar et al., 2011). The CLASS instruments are primarily focused on gathering information from students on their beliefs and attitudes about learning within the specific discipline, the content of the discipline, the structure of the disciplinary knowledge, and connections to the "real world". In contrast to the MPEX and CHEMX, the CLASS asks about the discipline in general, while the MPEX and CHEMX instruments probe student beliefs about a specific course. Just as with the MPEX and CHEMX, results from the CLASS are reported as how well they align with expert responses, and typically there is no "improvement". That is, there is no movement toward more expert-like responses over a general chemistry sequence. However, these three instruments differ in that the MPEX and the CHEMX cluster responses using confirmatory factor analysis while the CLASS clusters according to exploratory factor analysis.

Although not immediately related to expectations, other instruments have been developed to explicitly measure student attitudes. Some instruments, such as the Chemistry Attitudes and Experiences Questionnaire (CAEQ) (Dalgety et al., 2003) and the Attitude towards the Subject of Chemistry Inventory (ASCI) (Bauer, 2008), have utilized a semantic differential format where students respond on a scale whose extremes are polar opposite adjectives. In the case of the ASCI, the instrument begins with a sentence stem such as "Chemistry is..." and students then respond by rating the stem on a 7-point semantic differential scale whose extremes represent the aforementioned polar opposite adjectives, such as "easy/hard", "comprehensible/incomprehensible", and "tense/relaxed", among others.
The ASCI has been further developed, producing the ASCIv2 (Xu and Lewis, 2011) and the ASCIv3 (Rocabado et al., 2019).

While the use of Likert- and semantic differential-scale instruments allows for quick diagnostics and analysis of the data, the questions within these instruments may prompt students to respond in a certain way, do not allow students to state their experience in their own words, and may not give students the opportunity to volunteer the information they deem most important or relevant to their experience. These restrictions signify that more open-ended questions coupled with qualitative methodologies could be helpful in discovering themes that capture a more accurate picture of the student experience. Though some of the items in previous instruments investigated ideas similar to those explored here, the potential for prompting inherent in the questions and the lack of opportunities for students to use their own words may not accurately capture student perceptions, beliefs, or attitudes. Furthermore, qualitative approaches to investigating perceptions, expectations, and other constructs are scarce (Flaherty, 2020a), and we believed this to be a great opportunity to explore student perceptions of their organic chemistry courses in an open-ended way.

With this said, previously published instruments in CER and other fields tended to address how students experienced a course or whole discipline and did not align with our study objectives. Our goal was rather different in that we were interested in how students perceived course/instructor expectations and what was valued. Since we wanted to use students' own words and perceptions to guide our investigation, minimize prompting in the questions, and use qualitative methodologies to analyze the responses, we opted to use our own instrument. Such an approach allowed for a combination of inductive and deductive coding, highlighted student voices, and provided students the opportunity to identify what they believed to be most important and relevant to their experiences.

Purpose

As noted throughout, the purpose of this study was to complement our previous work on student reasoning in these courses and to characterize our transformational efforts further. This was coupled with the goal of generating insights on whether student perceptions of what they were doing and what was valued aligned with our transformational intent and the underlying theories of learning in the transformed course. Considering that previously published instruments were not appropriate given our exploratory goals and interests, we opted to use our own instrument. As we engaged with the data, we began to address the areas of interest through the lens of classroom culture. Therefore, the research questions that guided our work were: 1) In what ways do student perceptions of valued ways of doing and thinking align with the transformational intent? 2) How do elements of the course culture impact student perceptions of what is valued?

Theoretical Framework

The work presented here began as an exploratory project; yet, as our findings began to take shape, we started to interpret and discuss the findings in terms of the classroom cultures. As we analyzed the data, we noted how certain classroom structures and practices, norms of participation, and messages about what was valued informed student perceptions (Becker et al., 2013; Chang and Song, 2016; Reinholz and Apkarian, 2018).
That is, we saw student responses speaking to interpretations of course expectations, perceptions of valued ways of doing and practicing, and the influence these expectations and ways of doing had on course difficulty. Therefore, our interpretations of this exploratory work drew upon sociocultural perspectives and studies (Vygotsky, 1978; Rogoff, 1990; John-Steiner and Mahn, 1996; Carlone et al., 2011; Zotos et al., 2020; Petterson et al., 2022), as well as other culture-related frameworks (Schein and Schein, 2016; Reinholz and Apkarian, 2018). We will speak more to this framework at the beginning of the discussion, after we have presented the results. Our rationale for this intentional writing decision is to highlight the initial exploratory nature of this study and how our analysis and interpretations evolved over time. By using student perceptions as a proxy for elements of organic chemistry classroom cultures, we aim to complement our previous research on student reasoning in the context of these courses and to demonstrate how student perceptions of what they were expected to do, what was most difficult, and how they were assessed can be insightful for the development and enactment of chemistry courses.

Considering that this study uses student perceptions of what is expected and valued and of ways of practicing, it is important to acknowledge that we (and students, for that matter) have assumptions and ideas about what it means to know and do. Broadly, our epistemological beliefs are informed by constructivist and sociocultural views of learning: we believe that students construct their own knowledge and are influenced by the contexts in which learning occurs and the interactions they have (Vygotsky, 1978; Bodner, 1986; Rogoff, 1990; John-Steiner and Mahn, 1996; National Research Council, 2000; Carlone et al., 2011; National Academies of Sciences, 2018; Zotos et al., 2020; Petterson et al., 2022). With this, we also subscribe to the resources perspective, which asserts that students have knowledge that is connected in various ways and that may or may not be activated when prompted, depending on their knowledge structure and how the task is scaffolded. Furthermore, it is acknowledged that the resources students have may be more or less productive on a given learning task, which offers a way to understand how students are connecting and applying concepts (Hammer, 2000). With all of this said, we subscribe to the idea that people learn best when they are in environments that provide them consistent opportunities to apply and use their knowledge (and resources). Considering that three-dimensional learning engages students in scientific practices around fundamental ideas in chemistry, it resonates with our epistemological beliefs and is the foundation for our course transformations, as will be discussed in the Methods section (National Research Council, 2012a; 3DL4US, n.d.). Coupled with our previous work on student reasoning in the context of the courses in this study, we acknowledge that these beliefs and previous research influenced our analysis and interpretations of student perceptions.

Methods

Context: Transformed and Traditional Organic Chemistry Courses

This research took place in the context of two types of organic chemistry courses: transformed and traditional. Both courses were taught at a large research-intensive midwestern university in the United States.
The transformed course used the Organic Chemistry, Life, the Universe and Everything (OCLUE) curriculum (Cooper et al., 2019), which uses the framework of three-dimensional learning to support knowledge in use. That is, it emphasizes core ideas, scientific practices, and crosscutting concepts as discussed in A Framework for K-12 Science Education (National Research Council, 2012a). In OCLUE, ideas are introduced and linked to the chemistry core ideas of Structure-Property Relationships, Bonding and Interactions, Energy, and Change and Stability in the context of scientific practices (Cooper et al., 2017). In particular, the development and use of models and explanations is combined with mechanistic reasoning to support students as they explain how and why organic phenomena occur. OCLUE students are routinely asked to construct mechanistic explanations for phenomena such as acid-base reactions (Crandell et al., 2019), nucleophilic substitutions (Crandell et al., 2020), mechanisms for electrophilic addition and other reactions (Houchlei et al., 2021), thermodynamic and kinetic control, and solvent effects.

Lectures in OCLUE are somewhat interactive. New topics are introduced by having students discuss what they already know, clicker questions are posed, students are encouraged to discuss the answers, and occasional group activities are incorporated (for example, groups build molecular models and compare them together). Students work in groups in OCLUE recitation sections to complete scaffolded worksheets which include a mixture of three-dimensional and more traditional questions, such as drawing a reaction mechanism or determining the identity of an unknown compound from spectroscopic data. Homework is assigned twice a week for credit upon completion rather than accuracy, to encourage students to try and practice without penalty, and it also includes three-dimensional prompts similar to the recitation activities. Therefore, a considerable proportion of a student's grade in OCLUE is determined by participating, practicing, and trying with a "good faith effort" and by explaining how and why something happens, thus allocating a smaller proportion of the grade to high-stakes testing. Examinations in OCLUE employ a mixture of multiple-choice and open-response items, some of which mirror traditional questions in an organic chemistry course (such as predicting products and drawing mechanisms). However, students are frequently asked to provide an explanation of how and why a given chemical phenomenon occurs, with about 50% of the points on exams focused on having students use core ideas in the context of scientific practices.

In contrast, traditional courses are usually organized by functional group. Rather than connecting a few types of reaction mechanisms to the core ideas, a traditional course tends to treat each type of reaction and functional group separately. By agreement among all instructors, the same topics are covered, and the course is primarily taught in a traditional expository lecture format. While students can (and do) ask questions, there is no expectation of peer interactions either in the lecture or in the recitation sections. Instead, the recitation sections for the traditional course typically consist of a quiz, followed by a question-and-answer period, or another short lecture from the graduate teaching assistant. Students may complete online homework, typically multiple-choice questions; however, the homework is not completed for a grade.
The examinations consist of open-response items where students must fill in the reactant, reagent, or products, draw a mechanism for a reaction, or design a synthesis, and are typical high-stakes summative assessments. These items are similar to those that we have found are prevalent in sophomore organic chemistry courses, and our prior analysis of these items indicates that students are typically not required to explicitly show evidence of reasoning but rather can answer questions by recall or pattern recognition (Stowe and Cooper, 2017).

It is worth noting that the overall assessment strategies for the two courses also differ significantly. In OCLUE, between 45% and 50% of the points are allocated through formative assessment strategies. That is, group work in recitation and homework are not graded for accuracy but on completion with a "good faith effort". The rest of the overall grade in OCLUE comes from three mid-terms and a final exam. In contrast, in the traditional sections all of the points toward the class grade come from summative exams (midterms and final). This difference may have significant consequences for students, since there is emerging evidence that allocating part of the course grade to completion of formative assessments is a more equitable strategy that can address differences in outcomes among various demographic groups (Tashiro and Talanquer, 2021).

In summary, the two types of courses cover the same material, but they have different pedagogical approaches, course requirements, and approaches to assessment. Given that the purpose of this study was to complement previous research by characterizing our transformational efforts from the student perspective and to generate insights on how student perceptions aligned with our transformational intent and the underlying theories of learning, we found this to be an informative study. As will be discussed in more detail later, our interpretive frame of classroom culture clarifies these purposes by helping us acknowledge that the implicit and explicit messages sent by the course and instructors communicate what are valued ways of knowing and doing. Alignment among classroom practices, course goals, the messages instructors send and students' interpretations of those messages, and classroom norms has been shown to be important for engaging students in learning practices (Becker et al., 2013; Chang and Song, 2016; Schein and Schein, 2016; Sandoval et al., 2019).

Participants

The study took place in the Spring semester of 2018 in organic chemistry II. Therefore, this course was entirely in-person and was completed before the COVID-19 pandemic moved classes online in 2020. The total number of participants in this study was 852 undergraduate students: 604 students were enrolled in a traditional organic chemistry course and 248 students were enrolled in OCLUE. Both are large-enrollment courses taught in lecture sections of 200-300 students that meet for approximately three hours per week. Each student is also enrolled in a one-hour recitation section of about 30 students that is taught by a graduate teaching assistant. Students are not aware of the differences in the two courses before they enroll, and the demographics and academic background of the students in each course section are similar (Crandell et al., 2020).
Students answered the three questions in our instrument for extra credit in each class, and all students were informed of their rights as research participants in accordance with the institutional review board. Participant demographics are included in Table 4.1, which is the demographic breakdown of all students enrolled in organic chemistry II in the Spring 2018 semester at the university in this study, as provided by the university registrar. From the demographic breakdown, it can be noted that the majority of students were life sciences majors and white. We have previously not noted major differences in demographics between the two types of courses.

Table 4.1. Participant demographics

Gender: Female, 693; Male, 310; Total, 1003
First-Generation: Yes, 202; No, 801; Total, 1003
Transfer: Yes, 152; No, 851; Total, 1003
Major: Life Sciences, 733; Lab Sciences, 61; Physical Sciences, 24; Engineering, 4; Animal Sciences and Veterinary, 58; Food and Nutritional Sciences, 36; Social Sciences, 27; Other, 60; Total, 1003
Ethnicity: American Indian/Alaskan Native, 1; Asian (non-Hispanic), 91; Black or African-American (non-Hispanic), 66; Hawaiian/Pacific Islander (non-Hispanic), 1; Hispanic, 37; International, 49; Not Reported, 11; Two or More Races (non-Hispanic), 37; White (non-Hispanic), 710; Total, 1003

Design of the Instrument Questions

To generate a manageable dataset for the 852 students in our study, our instrument included three open-ended questions. While similar instruments consist of many focused questions, our first goal was to ask open-ended questions so that students could respond in their own words. Second, we wanted to minimize prompting; that is, we wanted to avoid using highly specific questions that might make students respond a certain way. Third, we wanted to ask a few questions that addressed different but related aspects of the course, that could be answered in a few sentences at most, and that would enable us to collect data from small or large courses. Finally, we wanted to pose questions that were accessible and understandable to students. The question design occupied a useful analytic middle ground. That is, it was not as constrained as a quantitative questionnaire, yet it could capture insightful, rich responses from many students without conducting time-consuming interviews.

The first question stated: "If you met a student who is thinking about enrolling in (traditional or OCLUE) organic chemistry next year, how would you describe the ways students are expected to think about reaction mechanisms in organic chemistry?" The choice to include the language "if you met a student" was intended to help students frame their responses as if they were talking to a peer. By mentioning mechanisms, we intended to scaffold student responses and help them reflect on course expectations. Furthermore, there is ample research showing that students have great difficulty with thinking about mechanisms and often resort to memorization as a way to succeed (Bhattacharyya and Bodner, 2005). This question also related to our previous work, where we have shown that students in OCLUE are more likely to engage in causal mechanistic reasoning, to use mechanisms appropriately, and are significantly more likely than traditional peers to correctly predict products for unknown reactions (Houchlei et al., 2021).
The second question was the following: "What would you tell them is the most difficult thing about organic chemistry?" Previously published instruments often asked students about the difficulty of the overall course or specific content, and we thought this question would provide students the opportunity to identify the aspects they deemed most difficult without constraining their response. The CER literature has detailed various aspects of organic chemistry that students have difficulty with, and by investigating student perceptions of the most difficult aspect of the course in this open-ended way, we can gather insights into which facets of a course students struggle with the most, such as a certain way of thinking, a course policy, or instruction in general.

The third question was "How would you describe to them what is assessed in organic chemistry?" This question was designed to elicit whether students perceived that assessments aligned with how they perceived they were expected to think in the course. It is well recognized that assessments send strong messages to students about what is valued in a course (Momsen et al., 2013; Stowe et al., 2021). Considering that the approaches to summative and formative assessments were quite different between the two courses in this study, we believed this question would be insightful.

The design of these questions aligned with the goals of the study. We were interested in complementing our previous research on student reasoning and characterizing OCLUE from the student perspective, which all three questions would address. In addition, we were interested in investigating whether student perceptions aligned with our course goals and expectations and with the underlying theories of learning that informed the OCLUE curriculum. We anticipated that questions 1 (expectations of thinking) and 3 (assessment) would be an open-ended way to explore this alignment, while question 2 (most difficult thing) provided additional information on how the enactment of the course impacted difficulties encountered by students.

Data Collection

Student responses were collected in the form of a homework activity assigned through the beSocratic homework system (Bryfczynski, 2010; beSocratic, 2020). For both types of courses, students were given extra credit and an entire week to complete the questions. After the due date, the data was exported out of beSocratic into an Excel file, and the responses were then deidentified to protect the anonymity of students. Before beginning analysis, the responses were blinded and shuffled so that the coders did not know which course the responses were from (either traditional or OCLUE).

Data Analysis

The data collected for this study was analyzed with an inductive thematic analysis approach (Thomas, 2006) that allowed us to establish an analytical framework that we then applied deductively to the rest of the data. This form of data analysis facilitates the emergence of research findings from themes within the data without being restrained by structured methodologies (Boyatzis, 1998; Thomas, 2006). Unlike a grounded theory methodology, which produces theory, or phenomenology, which produces a description of lived experiences, inductive data analysis produces themes or categories which are relevant to the research objectives identified (Thomas, 2006).
The purposes of inductive data analysis involve (i) condensing text data into a brief, summary format, (ii) establishing links between the research objectives and the summary findings, and (iii) developing a model or theory about the underlying structure of experiences that are evident in the data (Thomas, 2006). Our analysis began with inductive thematic analysis, which allowed us to form categories that were prominent in the data and to develop a codebook. Although this study sought to complement our previous work, it is important to note that the language used to describe and name categories was pulled from student responses; that is, although our thinking about our categories may have been influenced by previous work, we attempted to use student words and perspectives to guide our analysis and name our categories. After our codebook was revised and developed, it was applied to the remainder of the data. That is, our analysis began inductively and proceeded to a deductive analysis once our codebook was developed (Merriam and Tisdell, 2016).

Inductive thematic analysis was deemed most suitable in the beginning because: 1) we wanted categories to emerge from student experiences at first to guide our analysis and highlight their voices; 2) we had highly open-ended questions; and 3) our initial stance toward this project was exploratory in nature, and we believed beginning with an inductive approach was appropriate. Therefore, analysis was conducted as described by Thomas (Thomas, 2006). First, the raw data files were formatted to promote ease of comprehension. Second, we familiarized ourselves with the nature of the data by reading the student responses. Third, categories were identified and defined based on actual phrases or meanings in specific text segments. Finally, each category was continually revised based on the ongoing analysis of data.

To establish the codebook, responses from 248 traditional students and all 248 OCLUE students were analyzed. Taken together, these 496 students answered the three open-ended questions mentioned earlier, yielding 1,488 responses across all three questions and representing over 58% of the total data. Two of the authors (RSB and AAF) went through several rounds of independent coding of the 1,488 responses, developing and revising the codebooks for each question each time. Upon settling on a semi-finalized codebook, the authors calculated percent agreement and found they had 86.1% agreement. After discussing the coding discrepancies and sharpening the code dimensions to yield the finalized codebook, the authors settled on 99.4% agreement. With such a high percent agreement, the authors concluded that any additional measure of inter-rater reliability would not be necessary. The remaining set of data was then split in half between RSB and AAF and coded to yield the full set of analyzed data (all 2,556 responses).

Throughout the coding process, mutually exclusive codes were identified and used. The decision to use mutually exclusive coding was based on the following reasons:
1. The overall majority of the responses could only be categorized by a single code.
2. An analysis by author RSB using non-mutually exclusive coding yielded almost identical overall patterns. This is provided in the Supplementary Information, Figures S1, S2, and S3.
3. The use of mutually exclusive codes allowed for quicker and more efficient coding of the 2,556 responses.
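To make the agreement calculation concrete, the short sketch below (in Python) shows one way percent agreement between two coders could be computed over a set of mutually exclusive codes. The category names come from the question 1 codebook reported below, but the coder labels and responses are hypothetical, invented purely for illustration; this is a sketch of the arithmetic, not the analysis scripts actually used in the study.

```python
# Minimal sketch of a percent-agreement calculation between two coders.
# The category names come from the question 1 codebook (Table 4.2); the
# assigned codes below are hypothetical and for illustration only.

def percent_agreement(coder_a, coder_b):
    """Return the percentage of responses assigned the same code by both coders."""
    if len(coder_a) != len(coder_b):
        raise ValueError("Both coders must code the same set of responses.")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return 100.0 * matches / len(coder_a)

# Hypothetical codes assigned independently by two coders to five responses:
coder_1 = ["Apply and Reason", "Memorization", "Generalities",
           "Identify and Describe", "Not Applicable"]
coder_2 = ["Apply and Reason", "Memorization", "Apply Heuristics",
           "Identify and Describe", "Not Applicable"]

print(f"{percent_agreement(coder_1, coder_2):.1f}% agreement")  # 80.0% agreement
```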
Results

The three open-ended questions of our instrument were analyzed separately, and a separate codebook was developed for each question. The results section will report on the nature of these codes.

Question 1: Expectations of Thinking

As a reminder, the first question asked: "If you met a student who is thinking about enrolling in (traditional or OCLUE) organic chemistry next year, how would you describe the ways students are expected to think about reaction mechanisms in organic chemistry?" Responses were classified into six categories, which are outlined in Table 4.2. As we have noted throughout, one of the motivations behind this study was to complement our previous work on student reasoning. Therefore, our thinking and approach to analysis may have been informed in some way by this previous work; however, we reiterate that the descriptions and naming of categories were based on student perspectives or the language students chose to use in their responses.

The "Apply and Reason" and the "Identify and Describe" categories differ with respect to whether students noted the significance of knowing why a mechanism occurs. For example, if a student mentioned the existence of forces and stabilization in their response, this was coded as "Identify and Describe." However, if the student mentioned the existence of forces and stabilization, and then expanded their response to include a discussion of how this helps explain why a reaction happens, then the response was coded as "Apply and Reason." While "Apply and Reason" responses were considered more sophisticated than "Identify and Describe", we acknowledge the complexity and potential understanding exhibited in "Identify and Describe" responses. Although we did not set out to develop a hierarchical model of categories during the analysis, the progression from "Memorization" to "Apply and Reason" does suggest a greater degree of sophistication in students' perceptions of how they were expected to think about organic chemistry mechanisms. Though it is possible that a student can memorize a heuristic, the "Apply Heuristics" and "Memorization" categories were considered separately because using a heuristic does require some form of application that simply memorizing does not.

The "Generalities" and the "Not Applicable" categories were also identified and included in the analysis. The "Generalities" category included responses where students explained the need to think about organic chemistry mechanisms using general or vague terms such as "critically", "conceptually", "creatively", "thoroughly", or "differently"; that is, they did not appear to have developed a specific vocabulary for what they were doing. Also included in this category were responses which referred to the need to think about mechanisms in a step-by-step manner, solving puzzles, or telling stories, since these responses often did not expand on what was meant. Responses that answered the question but did not seem to refer to chemistry concepts were also included here because their meaning and context were uncertain. The "Not Applicable" category included instances when students did not give any response at all, or when their response was unclear or unrelated to the question posed.

Table 4.2. Codebook for question 1: Perceptions of the expectations of thinking
Code: Apply and Reason
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think about mechanisms: (1) understanding "why" a reaction proceeds; (2) the use of knowledge, specifically with the use of fundamental or basic ideas; (3) the transfer of knowledge to new problems; (4) making connections between concepts, especially in order to apply them; (5) making predictions in order to solve a problem.
Example quotes — Traditional: "I would tell them not to memorize them but to actually think through each of them and the reasoning behind why what happens, happens." OCLUE: "Should expect to understand the molecular interactions of reactions and WHY these occur."

Code: Identify and Describe
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think about mechanisms: (1) understanding the "what" and "how" reactions proceed, particularly without mentioning the use of knowledge to understand "why" reactions proceed; (2) mentions understanding at the scalar or one scalar below levels, particularly through the recognition of concepts such as polarity and electronegativity and their significance to understanding; (3) when a student explicitly mentions understanding the mechanism instead of memorization; (4) when a student mentions "differentiating" between reactions with any further explanation; (5) responses include a discussion of forces, charges, or stabilization.
Example quotes — Traditional: "They have to think about the polarity of bonds and the nature of atoms when reacting with other atoms in regard to electronegativity and polarity" OCLUE: "The reaction mechanism are meant to show the transfer of electrons from one compound/atom to another. This helps show how these reactions occur."

Code: Apply Heuristics
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think about mechanisms: (1) focuses on the approach to solving problems rather than thinking about a problem; (2) mentions explicit use of arrow pushing without mentioning how knowledge is used to engage in the formalism; (3) provides descriptive statements without causal or mechanistic knowledge, such as "negatives attack positives" or "source goes to sink"; (4) mention of identifying patterns and trends without expanding on the significance of identifying these patterns and trends; (5) explicit mention of the movement or flow of electrons without further explanation of how the movement or flow of electrons influences reactions.
Example quotes — Traditional: "think about it in terms of Nu- attacks E+" OCLUE: "they need to think about mechanistic arrows as the movement of electrons from a source to a sink"

Code: Memorization
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of how they are expected to think about mechanisms: (1) memorization, remembering, recalling, or regurgitation of reactions, products, reagents, and/or mechanisms; (2) knowing reactions, products, reagents, and/or mechanisms, particularly with no explicit mention of understanding the "what", "how", or "why" a reaction proceeds.
Example quotes — Traditional: "Memorize, memorize, memorize." OCLUE: "memorize what reacts with what"
Code: Generalities
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think about mechanisms: (1) the total absence of any chemistry in the response; (2) thinking of reactions and/or mechanisms as "puzzles"; (3) thinking of reactions and/or mechanisms on a "step-by-step" basis; (4) using generic and unclear descriptors for thinking, such as thinking critically, conceptually, creatively, thoroughly, or differently, particularly if the student does not expand on what they mean; (5) stating general facts about reactions and mechanisms, such as "reactants go to products" or that there are many mechanisms for a given reaction; (6) seeing organic chemistry as a new or different language; (7) mentioning that organic chemistry focuses on the details; (8) mentioning that mechanisms are "like a story".
Example quotes — Traditional: "You have to think about would they would benefit from most if they reacted." OCLUE: "You need to think rationally rather than memorize."

Code: Not Applicable
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think about mechanisms: (1) the response does not answer the question, such as stating organic chemistry or mechanisms are "easy" and "straightforward"; (2) when a student provides no response at all; (3) when the response is unclear or interpretation is difficult, such as when students say "you must understand/know the material"; (4) mention of actions students must do in organic chemistry, such as "study a lot"; (5) mention of exam and/or course aspects, such as exam difficulty and the challenging nature of organic chemistry; (6) when a student is venting about the course, professor, or other aspects relevant to the course.
Example quotes — Traditional: "To do the practice problems in the text book and make flash cards." OCLUE: "good"

The analysis of student responses from the OCLUE and traditional courses revealed differences in the perceptions of expectations of thinking regarding reaction mechanisms. As noted in Figure 4.1, more OCLUE students perceived the need to engage in more sophisticated ways of thinking about mechanisms than students in the traditional course. For example, 30.6% (n=76) of OCLUE students perceived that they were expected to apply what they knew to navigate their way through new and unforeseen problems and to provide a reason for why these mechanisms proceed. This compares to 13.2% (n=80) of students from the traditional course perceiving the same. For the category "Identify and Describe", 21.4% (n=53) of OCLUE students and 17.4% (n=105) of students from the traditional course perceived this expectation. While the proportions of students from both types of courses who perceived the need to apply heuristics were quite similar (12.9%, n=78, of the traditional students and 10.9%, n=27, of OCLUE students), there was a much larger difference in the extent to which students perceived they had to memorize material. Of the traditional student cohort, 20.9% (n=126) perceived the need to memorize information on organic chemistry mechanisms, with just 2.8% (n=7) of OCLUE students perceiving the same. Finally, more students from the traditional course (24.5%, n=148) used general terms to describe how they were expected to think about organic chemistry mechanisms than OCLUE students (16.1%, n=40).
As noted above, the "Not Applicable" category included instances when students did not give any response at all (and there were very few of these responses across all three questions) or when their response was unclear or unrelated to the question posed. More students from the OCLUE course (18.1%, n=45) had responses categorized as "Not Applicable" than students from the traditional course (11.1%, n=67).

Figure 4.1. Percentages of student responses in each category/code for question 1

Question 2: Most Difficult Thing

The second question asked, "What would you tell them is the most difficult thing about organic chemistry?" Detailed explanations of each category as well as associated examples of student responses can be found in Table 4.3. The "Apply and Reason", "Identify and Describe", and "Memorization" categories align with the explanations of the same categories given for question 1 (expectations of thinking).

The only category identified that was unique to the responses to this question was the "Personal, Course, and/or Exam Aspects" category. Responses in this category typically reported personal actions or behaviors such as "staying motivated", "staying on top of the material", or "being patient", as well as referring to facets of the course (i.e., the professor and grading schemes) and exams (i.e., format). These types of responses received their own category due to their prevalence in the data, unlike in question 1 (expectations of thinking), where there were so few of these types of responses that they were assigned to the "Not Applicable" category.

In question 1 (expectations of thinking), "Memorization" had an entire category of its own; however, for question 2 (most difficult thing) many students, particularly in the traditional course, coupled their perception of memorization with a large workload that was "overwhelming" or included a "high speed of coverage". Initially for this question, there were separate "Memorization" and "Workload" categories, but since it was difficult to determine whether to classify these responses separately as "Memorization" or "Workload", we decided to combine the codes, as doing so still allowed us to make a broad comparison of the two courses. To further explain our rationale for this combination: 355 out of 604 responses in the traditional course discussed "Memorization", "Workload", or both, and over 25% of the traditional students mentioned "Memorization" and "Workload" simultaneously. Given that this sizable portion of the data in one of the courses mentioned both together, we opted to combine them, especially since the narrative we were interpreting did not change when the two categories were combined and the combination allowed us to note broad themes and patterns across all 852 responses.

Two other categories, namely "Specific Topic" and "Not Applicable", were also identified. The "Specific Topic" category included responses which listed discrete specific topics that students found difficult. Throughout these responses, students did not make any reference to ways of thinking used to interpret the content associated with these topics. The common topics mentioned by students included "acid-base reactions", "naming", "synthesis", and "spectroscopy".
In contrast to question 1 (expectations of thinking), the "Not Applicable" category for question 2 (most difficult thing) did not include references to the course or instructor, as those were coded separately, but it did include instances when students did not give any response at all or when the response was unclear or unrelated to the question posed.

Table 4.3. Codebook for question 2: Perceptions of the most difficult thing about organic chemistry

Code: Apply and Reason
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is the most difficult thing about organic chemistry: (1) understanding "why" a reaction proceeds; (2) the use of knowledge, specifically with the use of fundamental or basic ideas; (3) the transfer of knowledge to new problems; (4) making connections between concepts or piecing/linking concepts together, especially in order to apply them; (5) making predictions in order to solve a problem.
Example quotes — Traditional: "Understanding why mechanism happen the way they do" OCLUE: "Realizing that you are not going to memorize every reaction, you just need to worry about patterns and reasons why things happen a certain way"

Code: Identify and Describe
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is the most difficult thing about organic chemistry: (1) understanding the "what" and "how" reactions proceed, particularly without mentioning the use of knowledge to understand "why" reactions proceed; (2) mentions understanding at the scalar or one scalar below levels, particularly through the recognition of concepts such as polarity and electronegativity and their significance to understanding; (3) when a student explicitly mentions understanding the mechanism instead of memorization; (4) when a student mentions "differentiating" between reactions with any further explanation; (5) responses include a discussion of forces, charges, or stabilization. NOTE: responses that simply mention "knowledge" or "understanding" do not receive this code.
Example quotes — Traditional: "For me, it was rotating molecules around in my head and understanding how each reaction condition affects the products." OCLUE: "Understanding how each reagent indicates different mechanisms between structures."

Code: Specific Topic
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is the most difficult thing about organic chemistry: (1) listing off specific topics, particularly with no reference to understanding or approaches utilized; the most common specific topics mentioned include mechanisms, acid-base reactions, learning objectives, naming, synthesis, and spectroscopy; (2) explicit mention of "concepts" without expanding on what they mean (i.e., "understanding concepts" or "knowing a mechanism").
Example quotes — Traditional: "the synthesis problems" OCLUE: "The most difficult thing is CNMR and HNMR so if you can learn that you can learn anything"
Its a Workload regard to their perceptions of what is the lot of information.” most difficult thing about organic chemistry: (1) memorization, remembering, recalling, or regurgitation OCLUE: “The most difficult of reactions, products, reagents, and/or thing about organic chemistry is mechanisms; (2) knowing reactions, how many mechanism you products, reagents, and/or mechanisms, have to know. It can get a bit particularly with no explicit mention of overwhelming, but if you try to understanding the “what”, “how”, or practice once a day, and keep up “why” a reaction proceeds; (3) mention with your notes then it won't be of the large amount/volume of material, as bad.” the large amount of studying, and/or the amount of time the course requires; (4) explicit mention of feeling overwhelmed with the course; (5) mention of difficulty with keeping up with the class. 117 Table 4.3 (con’t) Personal, Student responses that include one or more of Traditional: “the most Course, the following dimensions regarding their difficult part is holding and/or Exam perceptions of what is the most difficult thing yourself accountable to Aspects about organic chemistry: (1) mention of continue studying personal action and/or behaviors that a student throughout the semester” must have such as “staying motivated” or “staying on top of the material” or “being patient”; (2) mention of how a student must OCLUE: “The most regulate actions and behaviors to complete the difficult part is the self course; (3) when a student discusses aspects of discipline that is required the course or exams such as overall difficulty or in order to make sure you time allotted to an exam. learn everything that is being offered to you in this course.” Not Student responses that include one or more of Traditional: “nothing” Applicable the following dimensions in regard to their perceptions of what is the most difficult thing about organic chemistry: (1) the response does OCLUE: “Literally all of it” not answer the question; (2) when a student provides no response at all; (3) when the response is unclear or interpretation of the response is difficult, such as when students say “understanding the material”; (4) when the response falls into no other category; (5) when a student is venting about the course, professor, or other aspects relevant to the course The analysis of student responses from the OCLUE and traditional courses again revealed differences in how students perceived the difficulty of learning organic chemistry. As shown in Figure 2, more OCLUE students perceived that more sophisticated ways of thinking such as “Apply and Reason” and “Identify and Describe” were the most difficult thing about learning organic chemistry compared to students in the traditional course. For example, 16.5% (n=41) and 16.1% (n=40) of OCLUE students perceived that the most difficult aspects of learning organic chemistry were applying and reasoning and identifying and describing, respectively. This compares to the 2.6% (n=16) and 5.1% (n=31) for students in the traditional course for those 118 same categories. More students from the traditional course listed specific topics (18.4%, n=111) and the memorization and/or workload aspect (58.8%, n=355) as the most difficult part of learning compared to OCLUE students (17.7%, n=44, and 17.7%, n=44, respectively). Figure 4.2. 
Question 3: Assessment

The third and final question asked, "How would you describe to them what is assessed in organic chemistry?" Detailed explanations of each category as well as associated examples of student responses can be found in Table 4.4. The codes "Apply and Reason", "Identify and Describe", and "Memorization" were explained for the previous questions.

A further three categories, namely "Specific Topic", "Exam Aspects", and "Not Applicable", were also identified and included in the analysis. The "Specific Topic" category is similar to the corresponding category in question 2 (most difficult thing) and included responses where students listed discrete specific topics as what gets assessed in organic chemistry. Once again, throughout these responses, students did not make any reference to ways of thinking used to interpret the content associated with these topics. The common topics mentioned by students here included "synthesis reactions", "naming", "NMR", and "spectroscopy". The "Exam Aspects" category included responses that referred to the format, length, time, and/or fairness of the exams in response to what gets assessed. These types of responses were noted in question 2 (most difficult thing) but were subsumed there into the "Personal, Course, and/or Exam Aspects" category. Responses which noted perceptions of what course materials are typically assessed (such as lecture notes, homework, practice exams, and/or recitation materials) were also included in the "Exam Aspects" category. The "Not Applicable" category included instances when students did not give any response at all or when their response was entirely unclear or unrelated to the question posed.

Table 4.4. Codebook for question 3: Perceptions of what was assessed

Code: Apply and Reason
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is assessed: (1) understanding "why" a reaction proceeds; (2) the use of knowledge, specifically with the use of fundamental or basic ideas; (3) the transfer of knowledge to new problems; (4) making connections between concepts or piecing concepts together, especially to apply them; (5) making predictions to solve a problem.
Example quotes — Traditional: "you don't just memorize; you understand why they are made like that so you can apply it to other reactions" OCLUE: "We were expected to know WHY things were happening, not just what was going on but the driving force behind those reactions"
Code: Identify and Describe
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is assessed: (1) understanding the "what" and "how" reactions proceed, particularly without mentioning the use of knowledge to understand "why" reactions proceed; (2) mentions understanding at the scalar or one scalar below levels, particularly through the recognition of concepts such as polarity and electronegativity and their significance to understanding; (3) when a student explicitly mentions understanding the mechanism instead of memorization; (4) when a student mentions "differentiating" between reactions with any further explanation; (5) responses include a discussion of forces, charges, or stabilization. NOTE: responses that simply mention "knowledge" or "understanding" do not receive this code.
Example quotes — Traditional: "You need to know how to classify and name molecules, know characteristics like acidity and aromaticity, and mostly know how bonds are formed and broken in different situations using different molecules." OCLUE: "You are required to think about reactions more about how electrons are moved in a system rather than what the begining and end products are. you have to know the steps of how to get there."

Code: Specific Topic
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is assessed: (1) listing off specific topics, particularly with no reference to understanding or approaches utilized; the most common specific topics mentioned include mechanisms, acid-base reactions, learning objectives, naming, synthesis, and spectroscopy; (2) explicit mention of "concepts" without expanding on what they mean (i.e., "understanding concepts" or "knowing a mechanism").
Example quotes — Traditional: "there is naming, mechanism, nmr, lots of reactions, and some bonus questions" OCLUE: "Different types of reactions and the classifications of structures."

Code: Memorization
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is assessed: (1) memorization, remembering, recalling, or regurgitation of reactions, products, reagents, and/or mechanisms; (2) knowing reactions, products, reagents, and/or mechanisms, particularly with no explicit mention of understanding the "what", "how", or "why" a reaction proceeds.
Example quotes — Traditional: "The majority of the exams are memorization of the reactions." OCLUE: "Mechanisms and if you can memorize 20 different types of problems with the same molecule everytime."

Code: Exam Format and Aspects
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is assessed: (1) the course materials leveraged on the exam, such as lecture notes, homework, practice exams, and/or recitation materials; (2) the format of the exam, such as stating the types of questions on the exam (i.e., multiple-choice or short answer); (3) the length, time, or fairness of the exam.
Example quotes — Traditional: "You need to go to lecture and take notes, because the exams cover pretty closely what we cover in lecture." OCLUE: "your ability to do them as fast as possible since the exam were only 50 minutes and crammed with material"
do them as fast as possible since the exam were only 50 minutes and crammed with material” Not Applicable Student responses that include one or more of Traditional: “Everything” the following dimensions in regard to their perceptions of what is assessed: (1) the response does not answer the question; (2) when a OCLUE: “don’t take 3 student provides no response at all; (3) when the other intens classes with response is unclear or interpretation of the it” response is difficult, such as when students say “your understanding of the material”; (4) when the response falls into no other category; (5) when the response focuses on student actions (i.e., “be sure to study hard”); (6) when a student is venting about the course, professor, or other aspects relevant to the course and the response focuses on the course or the professor such as frustrations they have with the course or professor. The analysis of student responses from the OCLUE and traditional courses to this question revealed differences in how students perceived what gets assessed in organic chemistry. 122 As shown in Figure 3, more OCLUE students perceived that more sophisticated ways of thinking were assessed in their course compared to students in the traditional course. For example, in relation to “Apply and Reason” and “Identify and Describe”, 35.5% (n=88) and 12.5% (n=31) of OCLUE students perceived these modes of thinking were assessed, respectively. This compared with 4.6% (n=28) and 6.5% (n=39), respectively, for students in the traditional course. More students from the traditional course listed specific topics (52.2%, n=315) and memorizing information (11.8%, n=71) as how they perceived they were assessed compared to OCLUE students (12.9%, n=32, and 2.0%, n=5, respectively). However, more OCLUE students noted responses coded to other categories such as “Exam Aspects” (27%, n=67) and “Not Applicable” (10.1%, n=25) than students from the traditional course (18.7%, n=113, and 6.3%, n=38, respectively). 123 Figure 4.3. Percentages of student responses in each category/code for question 3 Superordinate Themes While the analysis of the open-ended student responses was conducted separately for the three questions, each yielded similar results. To note broader trends across the results and more easily communicate and discuss the findings, we grouped the results into three superordinate themes for each question. In the first theme, which we refer to as “Use of Knowledge”, are “Apply and Reason” and “Identify and Describe”. The second theme encompasses student responses that are more rote, formulaic, or surface level, do not imply ways of thinking but rather the idea that topics must be memorized, or that students must refer to rote methods used to think through problems. This theme is therefore called “Rote Knowledge” and includes categories like “Memorization”, “Apply Heuristics”, and “Specific Topic”. Categories such as “Generalities”, “Personal, Course, and/or Exam Aspects”, “Exam Aspects”, and “Not Applicable” captured 124 responses that did not answer the question or were vague and uninformative. We refer to this theme as “Other” for our purposes. By condensing the codes in this way, we believe it is easier to see patterns in responses as related to how knowledge is used in these courses. In all three questions we saw marked differences between the “Use of Knowledge” and the “Rote Knowledge” themes for OCLUE and traditional students, while the responses coded as “Other” were more similar across the two cohorts. 
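To make the condensing step concrete, the short sketch below (Python, illustration only) rolls code-level counts up into superordinate-theme percentages. The mapping mirrors the theme definitions above; the example counts are the OCLUE code-level n values reported for question 3, so small rounding differences from the published figures are possible.

```python
from collections import Counter

# Mapping from category codes to superordinate themes, mirroring the
# groupings described above.
CODE_TO_THEME = {
    "Apply and Reason": "Use of Knowledge",
    "Identify and Describe": "Use of Knowledge",
    "Memorization": "Rote Knowledge",
    "Apply Heuristics": "Rote Knowledge",
    "Specific Topic": "Rote Knowledge",
    "Generalities": "Other",
    "Personal, Course, and/or Exam Aspects": "Other",
    "Exam Aspects": "Other",
    "Not Applicable": "Other",
}

def theme_percentages(code_counts):
    """Condense code-level counts into superordinate-theme percentages."""
    themes = Counter()
    for code, n in code_counts.items():
        themes[CODE_TO_THEME[code]] += n
    total = sum(themes.values())
    return {theme: round(100 * n / total, 1) for theme, n in themes.items()}

# OCLUE code-level counts for question 3, as reported above.
oclue_q3 = {"Apply and Reason": 88, "Identify and Describe": 31,
            "Specific Topic": 32, "Memorization": 5,
            "Exam Aspects": 67, "Not Applicable": 25}
print(theme_percentages(oclue_q3))
# -> {'Use of Knowledge': 48.0, 'Rote Knowledge': 14.9, 'Other': 37.1}
```

Note that the 48% "Use of Knowledge" value recovered here matches the OCLUE figure for question 3 reported below.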
Because responses coded as "Other" were, in general, not specific enough to make inferences about the course culture and the concomitant types of thinking required, we will not discuss them in detail here. We opted to focus on how knowledge was used in our superordinate themes to better complement our previous research on student reasoning in the context of these two courses, and it seemed to be the most prevalent and overarching way to organize our analysis based on how students were responding to the questions.

For question 1 (expectations of thinking), around 50% of OCLUE students believed that they were expected to reason with or use their knowledge in the context of drawing mechanisms, while 15% believed that this process was a more rote procedure. This split was more equal for traditional students, with around 30% in "Use of Knowledge" and around 34% in "Rote Knowledge", as noted in Figure 4.4. Responses from question 1 (expectations of thinking) such as "You're expected to not memorize the reactions but understand why moelcules [sic] react the way they do so you can draw your own reactions" and "You think about where electrons are going and what it's bonding with and why it bonds with one thing over another" were classified as "Apply and Reason" because they include the idea that students must not only use their knowledge to do something but also explain or understand why the phenomenon occurs. There is a subtle distinction between "Apply and Reason" responses and those that were classified as "Identify and Describe". For example, one "Identify and Describe" response states: "Energy flow, electron flow, ect. You need to be able to understand how electrons are moving and see relationships throughout the year." This response focuses more on "how" a reaction happens rather than "why", and it highlights the need to identify relationships; though it does not mention knowing why, it still implies the use of knowledge. In contrast, responses such as "you need to memorize all reactions given to you in lectures!" (classified as "Memorization") and "they need to think about mechanistic arrows as the movement of electrons from a source to a sink" (coded as "Apply Heuristics") indicate that students have not moved towards the use of knowledge and imply that they are relying on rote procedures.

Figure 4.4. Percentage of student responses in each superordinate theme for question 1

For question 2 (most difficult thing) there was an even more marked difference between the two cohorts, as noted in Figure 4.5. OCLUE students were evenly split on which aspects of the course they perceived as more difficult, whereas almost 80% of the traditional students believed that the focus on "Rote Knowledge" was the most difficult. Here, we recall Figure 4.2, where it can be noted that more students in the traditional course perceived that memorization, the workload, or the workload involved in memorizing a large amount of material was what made the course difficult. For example, one OCLUE student's response categorized as "Use of Knowledge" stated the following: "The most difficult thing in orgo [sic] is the mechanism and understanding where and how different molecules attack each other. If you know them well then it makes writing reactions easier". This student highlights that understanding the behaviors of different molecules can make writing reactions more approachable.
On the other hand, an example from the traditional course categorized as "Rote Knowledge" noted that: "There is a large amount of material that we have to know and memorize". In this case, the student not only perceives a large workload but also perceives that they are expected to memorize the material.

Figure 4.5. Percentage of student responses in each superordinate theme for question 2

The differences in perceptions continued in responses to question 3 (assessment; Figure 4.6), where once again a plurality (48%) of OCLUE students perceived an emphasis on the use of knowledge in course assessments, whereas 64% of traditional students perceived that they were being assessed on memorization and rote knowledge. For example, one OCLUE student's perspective on this question was the following: "Your knowledge not only of what is taught in class but your ability to apply it to various situations. Also, you [sic] knowledge of the CONCEPTS [sic] and underlying themes is heavily assessed." Here, the student describes how OCLUE assesses the ability to transfer concepts from one problem to another and to identify underlying themes, which correlated with "Use of Knowledge". On the other hand, responses correlated with "Rote Knowledge" included: "how well you can memorize the reactions" and "the exams mainly test reactions and naming of molecules". Here the responses in the traditional course cluster around memorization and the focus on discrete, specific topics, such as knowing reactions and nomenclature, rather than ways of thinking.

Figure 4.6. Percentage of student responses in each superordinate theme for question 3

To determine whether the differences between the OCLUE and traditional cohorts in the qualitative data were supported statistically, we conducted a Pearson's chi-square test of independence using an alpha of 0.05 within SPSS 27 (SPSS, 2020), with the data organized into superordinate themes. The analysis for each question yielded statistically significant results, with p < 0.001 in every case. Since all Pearson chi-square tests came back significant, we ran post-hoc analyses to further illustrate which theme(s) were primary drivers of the statistical significance in the initial chi-square tests. From the post-hoc analyses we found that the "Use of Knowledge" and "Rote Knowledge" themes were strong primary drivers of significance for each question. All of the calculations and a more in-depth write-up of these analyses can be found in the supplemental materials (Tables S1 and S2); an illustrative re-computation of the question 3 test is sketched below.
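Although the original tests were run in SPSS 27, the procedure is straightforward to reproduce. The following is a minimal re-implementation sketch in Python using scipy; the contingency table is reconstructed by summing the code-level n values reported above for question 3 (an assumption on our part, so small rounding differences from the published values are possible), and it recovers the question 3 statistics reported in the appendix (χ2 ≈ 203.6, Cramér's V ≈ 0.489).

```python
import numpy as np
from scipy.stats import chi2_contingency

# Superordinate-theme counts for question 3, reconstructed from the
# code-level n values in the results (rows: Traditional, OCLUE;
# columns: Use of Knowledge, Rote Knowledge, Other).
observed = np.array([
    [28 + 39, 315 + 71, 113 + 38],  # Traditional
    [88 + 31, 32 + 5, 67 + 25],     # OCLUE
])

chi2, p, dof, expected = chi2_contingency(observed)

# Cramer's V for an r x c table: sqrt(chi2 / (n * (min(r, c) - 1))).
n = observed.sum()
cramers_v = np.sqrt(chi2 / (n * (min(observed.shape) - 1)))

print(f"chi2 = {chi2:.1f}, df = {dof}, p = {p:.2e}, V = {cramers_v:.3f}")
# -> chi2 ~ 203.6, df = 2, p << 0.001, V ~ 0.489
```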
Discussion

Interpreting the Findings through the Lens of Classroom Cultures

The findings highlighted clear differences in the ways that students perceived knowledge use in the organic chemistry courses in this study. As noted throughout, the two organic chemistry courses had different pedagogical underpinnings; that is, the courses were designed, enacted, and assessed in different ways. Our previous research on student reasoning has demonstrated that students in OCLUE are better able to engage in causal mechanistic reasoning and retain this ability longer than students in traditional courses (Crandell et al., 2019, 2020). Therefore, we were aware of what students were doing; with this study, however, we wanted to know if students were aware of what they were doing. In other words, we wanted to know if students perceived the intent of the transformation, and we did not want to make assumptions without conducting this study.

In the beginning, our goals were exploratory. We aimed to complement our previous work and further characterize our transformational efforts from the student perspective. This also allowed us to generate insights on how student perceptions aligned with course goals and expectations, our transformational intent, and the theories of learning that informed our course design. As we began analysis, we sought a way to further make sense of the findings. Across all three questions, more students in OCLUE had perceptions aligned with the use of knowledge, while student perceptions in the traditional course aligned more with rote knowledge. As we noted these differences in student perceptions, our interpretations and discussions often centered on the classroom cultures of learning in each organic chemistry course. Certainly, learning is a social and cultural activity that is dependent on the context in which it occurs (Vygotsky, 1978; Rogoff, 1990; Calabrese Barton et al., 2008; Carlone et al., 2011). Therefore, our discussion and interpretation of our findings can be situated within sociocultural perspectives (Vygotsky, 1978; Rogoff, 1990; John-Steiner and Mahn, 1996; Carlone et al., 2011; Zotos et al., 2020; Petterson et al., 2022) and is informed by other scholars who have conceptualized culture (Schein and Schein, 2016; Reinholz and Apkarian, 2018).

Since the term "culture" can take on a variety of meanings, we find it important to provide a working definition prior to discussing our findings. To begin, it is important to note that "no one view of culture… represents a thorough and complete understanding" (Parsons and Carlone, 2013). However, throughout this discussion, when we refer to culture we are referring to a micro-level culture, or a subculture, that exists in the context of these organic chemistry classrooms, as opposed to macro-level cultures, which represent larger entities such as ethnic groups, nations, and international organizations (Schein and Schein, 2016; Thoman et al., 2017). Aside from sociocultural perspectives, our view of culture draws heavily on Reinholz and Apkarian's four frames for systemic change (Reinholz and Apkarian, 2018) and Schein and Schein's framework for organizational culture (Schein and Schein, 2016). Reinholz and Apkarian's four frames include structures, symbols, people, and power, which exhibit overlap with Schein and Schein's framework of artifacts, espoused beliefs and values, and taken-for-granted assumptions; both inform the working definition below.

Our working definition of culture includes a constellation of visible structures and artifacts, which encompass the visible course policies, course practices, expectations, and assessments, among other features. These structures are given meaning by an underlying system of symbols that include beliefs, values, and assumptions. Socializing mechanisms in a context enculturate people by encouraging them to adopt the symbols and participate in or interact with the structures and artifacts.
These socializing mechanisms are mediated by people and power that directly and indirectly impact how people talk, act, and think (Rogoff, 1990; Miller and Goodnow, 1995; Lemke, 2001; Gutiérrez and Rogoff, 2003; Nasir and Hand, 2006; Calabrese Barton et al., 2008; Schein and Schein, 2016; Reinholz and Apkarian, 2018; Deng et al., 2021). While our findings cannot speak to all frames (structures/artifacts, symbols, people, and power), this definition helps us suggest potential explanations for our findings.

For this study, we found the cultural frames of structures/artifacts and symbols most useful, particularly because, given the questions asked, most student responses were related to these frames. Structures, or artifacts, within a classroom could be elements such as the practices used, the learning and assessment tasks, and the established norms. That is, they are the visible features of the culture that are informed by the underlying symbols that give them meaning. The symbols could include the implicit and explicit messages that students receive and interpret and that communicate valued ways of knowing and doing (Schein and Schein, 2016; Reinholz and Apkarian, 2018). The other frames mentioned by Reinholz and Apkarian, such as people and power, are important but were difficult to address with this data. Therefore, we aimed to use the frames of structures/artifacts and symbols to discuss how students perceived they were expected to practice learning and what was valued, both of which will be linked to elements of their respective cultures of learning. Other studies have used sociocultural perspectives to explore different classroom cultures and found that when the use of certain practices, such as argumentation, aligns with the course goals, students engage more productively in the practice (Sandoval et al., 2019), while others have demonstrated how classroom norms (and their interpretation) can impact how students respond to learning tasks (Becker et al., 2013; Chang and Song, 2016). This suggests that better alignment between course goals, the practices students engage in, and clear and universally understood norms can lead to a more productive learning experience. Therefore, by investigating student perceptions of what is valued through the lens of classroom cultures, we can help identify potential mismatches between instructor expectations and what students are doing that may perturb learning.

Question 1: Expectations of Thinking

Question 1 (expectations of thinking) was included in our instrument for three main reasons: 1) it helped us address one of our research questions regarding the alignment of student perceptions with transformational intent; 2) it related to our previous work on student reasoning in these courses (Crandell et al., 2019, 2020; Houchlei et al., 2021); and 3) it was inspired by previous research that found students in organic chemistry often resorted to memorization (Bhattacharyya and Bodner, 2005). As noted in Figure 4.4, more OCLUE students perceived they were expected to use their knowledge, while more students in the traditional course perceived they were expected to rely on rote knowledge. Earlier we noted the most salient differences between the two courses in this study, and we highlighted that OCLUE consistently encourages students to construct scientific explanations and arguments about how and why something happens. In the context of question 1 (expectations of thinking), these enacted practices in the course were also noted in student perceptions.
That is, the expectations of and emphasis on the use of knowledge were perceived by many students, indicating that student perceptions were at least partially aligned with the transformation goals (Cooper et al., 2019).

Constructing explanations and engaging in argumentation are important classroom practices in OCLUE (National Research Council, 2012a; Cooper et al., 2019; 3DL4US, n.d.; Flaherty, 2020b). Their incorporation, coupled with the expectation that students will engage in them, acts as a structural feature of the overarching culture. It has been suggested that implementing the practice of constructing explanations gives students a better idea of how scientific knowledge is developed (McNeill et al., 2017) and that argumentation can move the focus of learning away from memorization (Berland and McNeill, 2010). Certainly, this data corroborates these claims. Structural features of the classroom culture, such as the incorporation and consistent use of scientific practices, may have helped students in OCLUE perceive expectations of how to think on a deeper level relative to students in the traditional course.

Question 2: Most Difficult Thing

Question 2 (most difficult thing) was incorporated into the study for two reasons: 1) previously published instruments asked about course difficulty; and 2) we believed it would be insightful to know which aspects of a course students found to be most difficult in case they needed to be addressed. For example, if most students found a course policy to be more difficult than a way of thinking or the content, then we would have viable feedback with which to address it. For question 2 (most difficult thing), an overwhelming majority of students in the traditional course perceived that "Rote Knowledge" (such as memorization, the workload, or the workload associated with memorizing) was the most difficult part of the course (as noted in Figure 4.5). In contrast, OCLUE students had a more even distribution of perceptions of which facets were most difficult, though more OCLUE students perceived that the "Use of Knowledge" was the most difficult aspect when compared to the traditional course. The OCLUE curriculum was designed in such a way as to discourage rote memorization of content (Cooper et al., 2019), and far fewer students in OCLUE perceived memorization and workload as the most difficult aspect of the course when compared to students in the traditional course. This highlights that student perceptions in OCLUE exhibit alignment with the transformational intent and implies that memorization and workload are stronger driving forces within the culture of the traditional course. Students have perceived that organic chemistry requires a great deal of memorization (Moran, 2013), which has also been noted as an approach that students take on organic chemistry exams (Webber and Flynn, 2018). Furthermore, instruments focused on gathering student perceptions have sought to collect information on whether students are memorizing in their courses (Grove and Bretz, 2007); yet, considering the previous exploration of the association of memorization with organic chemistry, a course centered on rote knowledge is almost certainly not the intent of the instructors.
In a qualitative study on student reasoning in organic chemistry, Anderson and Bodner (2008) found that students did not appreciate that mechanisms were used to understand how and why phenomena occur, despite the fact that this was the intent of the instructor in that course. In the same study, interviewed students stated that they wanted to understand the material on a deeper level but also mentioned that this was difficult given the volume and pace of the material (Anderson and Bodner, 2008), a perception we noted in our study for the students in the traditional course. That is, structural components of the traditional classroom culture, such as the amount of material covered and the pace of coverage, may coalesce with perceived expectations to drive the perception that students need to memorize large amounts of material. If instructors want students to be able to explain how and why chemical reactions happen, the findings from Anderson and Bodner and from our study make it clear that the purpose of mechanisms needs to be made explicit and leveraged consistently throughout the course, and that courses need to slow down and connect content back to fundamental principles so that students can develop a robust understanding; this may not have been clear in the traditional course. Both points are addressed in OCLUE by leveraging the scientific practices, crosscutting concepts, and core ideas, as further evidenced by the shift in perceived difficulties of students in the course toward the use of knowledge, relative to students in the traditional course.

Question 3: Assessment

Question 3 (assessment) was used in this study for three reasons: 1) considering that research has shown that assessment practices send strong messages to students about what is valued (Momsen et al., 2013; Stowe et al., 2021), we saw this as a useful question for ascertaining what students perceived to be valued ways of doing and knowing; 2) from our previous work and observations of the two types of courses in this study, we have known them to have different assessment approaches and wanted to explore student perceptions of these two approaches; and 3) the responses for question 3 (assessment) yielded similar patterns to question 1 where, in general, OCLUE students perceived they were assessed more on their "Use of Knowledge" while students in the traditional course perceived they were assessed more on "Rote Knowledge", such as memorization and discrete, specific topics (as can be seen in Figure 4.6). It is important to reiterate that assessments play a large role in the culture of a learning environment and send strong messages about valued ways of thinking and participating in the course (Snyder, 1973; Crooks, 1988; Entwistle, 1991; Scouller and Prosser, 1994; Scouller, 1998; Momsen et al., 2013; Stowe et al., 2021). Within OCLUE, much work is done to ensure alignment between learning goals, expectations, and assessments with regard to the use of knowledge. Student perceptions imply that this transformational goal may be (at least partially) accomplished, since 52% of students in OCLUE perceived they were expected to use their knowledge and 48% perceived they were assessed on their use of knowledge. In terms of culture, assessments act as one mechanism through which instructors reflect what is valued in the learning culture and what students are expected to do. As shown in this study, these messages can be perceived by students and influence how they participate in learning.
If the goal of the learning environment is to engage students in reasoning and disciplinary practices, then the culture and instructor expectations must support that goal (Bain et al., 2020). As Cooper and Stowe note: "...it is important for students to receive and respond to the message that both knowledge and the ways that knowledge is used are crucial aspects of learning chemistry" (Cooper and Stowe, 2018). Instructor expectations and assessments are intricately linked, and the ways in which courses communicate expectations, emphasize particular ways of doing, and place value on those ways of doing (by assessing them) become structures and symbols of the learning culture. The alignment of expectations and assessment in OCLUE, as noted in student perceptions, was an important component of the transformation effort, ensuring that what was expected of students was valued in the form of points on assessments.

The Impact of the Transformed Classroom Culture

We set out to investigate whether student perceptions of what they were expected to do and what was valued aligned with the transformational intent of OCLUE and the theories of learning that informed it. We saw this study as complementing our previous research on student reasoning in the context of OCLUE and a traditional organic chemistry course. Though the study does address these aims, it continued to evolve throughout data analysis. To make sense of the data and situate it within the literature, we discussed the results through the lens of elements of the classroom cultures. That is, the differences noted between student perceptions in these two organic chemistry courses could be attributed to the structures/artifacts (i.e., expectations, learning task design, assessment design, etc.) and the symbols (i.e., the intentional and/or unintentional valuing of certain ways of doing) within the course that are supported by the instructors. The differences between the design and enactment of these two types of courses not only have impacts on how students reason (as shown in our previous research), but they also impact how students perceive they are to engage in doing and learning organic chemistry, which may reflect classroom norms of engagement and learning that are more or less aligned with disciplinary practice (Becker et al., 2013; Schein and Schein, 2016; National Academies of Sciences, Engineering, and Medicine, 2018; Reinholz and Apkarian, 2018; Sandoval et al., 2019).

When considering the culture of a classroom, it becomes important to also consider the ways that culture socializes people. Instructional practices that focus on rote memorization and solving exercises will likely not introduce students to the authentic disciplinary culture nor encourage them to engage in "science-as-practice" (Nasir and Hand, 2006; Stroupe, 2014). Instead, if students are immersed in an environment where they are encouraged to use their knowledge, particularly with unfamiliar problems, and given the chance to make mistakes and learn from them, then students may develop perceptions of learning that are more aligned with authentic disciplinary ways of thinking (Brown et al., 1989). Furthermore, if a class culture's goals align with the practices students are expected to engage in, then it can lead to more productive engagement and learning (Sandoval et al., 2019).
By leveraging scientific practices in the context of fundamental core ideas, instructors can shift the culture of learning to expect, emphasize, and value the use of knowledge and provide students a route to connect their knowledge and make sense of phenomena rather than relying on memorization (Cooper, 2015).

One of the goals of our study was to explore the alignment between student perceptions and instructor expectations. Previous research has found that organic chemistry instructors do not list rote memorization as an important facet of learning organic chemistry (Duis, 2011); yet, students in the traditional organic chemistry course in this study largely perceived that they were expected to memorize and were assessed on their ability to do so. We imagine that the goal of the instructor was not to have students rely solely on rote memorization. Therefore, there seems to be a disconnect and misalignment between what the instructor values and expects students to do and the design of the learning and assessment tasks (Stowe and Cooper, 2017). That is, though instructors may expect and value the use of knowledge, students are still able to complete prompts and learning tasks by memorizing the material. In a recent interview study on student perceptions of "critical thinking" in organic chemistry courses, some students mentioned that they saw memorization as an "easier" method to achieve the results they wanted (a better grade) and that they were accustomed to memorizing in school (Bowen and Cooper, manuscript in preparation). This highlights that more attention should be given to the questions and prompts being asked of students and that learning and assessment task design should be intentional and reflective.

While there are a variety of ways to engage students in meaningful learning, pedagogical approaches such as those informed by A Framework for K-12 Science Education and three-dimensional learning have advocated for engaging students in authentic disciplinary practices in science (National Research Council, 2012a; Laverty et al., 2016; Matz et al., 2018; 3DL4US, n.d.). From these perspectives, learning involves introducing students to the disciplinary cultures of science by engaging them in the practices that scientists actually use, such as constructing explanations and using models to predict and explain. Our previous work on OCLUE, along with the findings here, demonstrates how three-dimensional learning can impact student performance and communicate clear expectations and values that are explicit for students and align with more expert-like practice. Certainly, there are many factors at work in a learning culture, and our study did not, and could not, address them all. However, we have highlighted how instructor expectations, whether implicit or explicit, along with what is emphasized in a course and on assessments, are related to the overarching classroom culture, and how these features send strong messages about how people should think and practice. While our previous work on student reasoning was certainly insightful, we needed evidence to better understand whether what students perceived they were doing aligned with the goals of the course. Put simply, the enactment of a course and the elements of its culture, such as instructor expectations, emphasis, and valued ways of doing, influence how students participate in learning, and more attention should therefore be given to these influences when designing and enacting instructional practice.
Limitations

To begin, our three open-ended questions, though interesting to us, were not all-encompassing. Though the questions were open-ended enough to provide students the opportunity to comment on their instructors, we did not have a question directly asking students about the role of the instructor in their perceptions. Additionally, the large number of responses allocated to the "Other" theme is in part due to the "Generalities" category, which was applied when students used vague, generalized language that was unclear. For example, many students mentioned they had to think "critically" but did not elaborate on what that entailed. It could be the case that some students did not have the vocabulary to explain what they meant and might not have been able to be more precise because they had not been exposed to the notion of using knowledge to predict and explain. Studies are underway to identify scaffolded approaches to help students answer the questions we intended and clarify future responses. However, finding the right level of scaffolding takes time, as this approach can "over-prompt" students, which is not desirable in the context of these studies. A third limitation is that our analytic approach relied on interpretation of written student perceptions. With any qualitative study, we must acknowledge that our interpretations are our own and contextual. Through the use of multiple coders, a unified codebook, and multiple cycles of coding and revision, we aimed to characterize student perceptions as best we could by noting broad and communicable themes across these two different organic chemistry course experiences. Finally, these perceptions may not be stable over time. Like other affective constructs, perceptions are subject to social and cultural influences and may therefore change throughout the semester; however, we do believe perceptions offer a snapshot of the student experience and are worth considering in our transformation efforts.

Implications

For teaching implications, our approach to exploring student perceptions could be useful to instructors who are interested in how their courses are being perceived by students. It is clear from the findings that students in the traditional organic chemistry course perceived that they must memorize a great deal of material, yet it is highly unlikely that instructors intended for this to be the case. Studies on what organic instructors believe is important do not mention rote memorization (Duis, 2011), and the need for taking organic chemistry is often supported by the assertion that it fosters forms of critical thinking and problem solving (Stowe and Cooper, 2017). If anything, these results imply that if instructors desire students to use and apply their knowledge and to recognize what they are doing, these expectations should be clear, emphasized, and valued by having students engage in these practices on course assignments and assessments. With regard to research implications, our instrument occupies a methodological middle ground: it is less constrained than previously published quantitative instruments and does not take as much time as conducting interviews, yet it still provides rich descriptions of student perceptions.
By studying perceptions in this way, we can provide valuable insight into how students are engaging with curricula and learning environments without relying on our assumptions, and more qualitative studies on student perceptions of learning and affective states would help expand the CER literature base. With all of this said, various cultural frameworks informed our interpretation and communication of the results. We posit that more work should be done within the realm of classroom cultures in chemistry courses. Culture influences how students talk, think, and act, and research focused on characterizing the different classroom cultures that support learning and foster student engagement would be productive and insightful.

Future Directions

This study was exploratory in nature; therefore, we are reworking the language of the questions in an attempt to minimize the number of responses in the "Other" theme while still ensuring that we are not prompting students. Furthermore, since many students used general or vague terms, such as "critical thinking", to describe their experience, an interview study is planned to explore what students mean by these terms in the context of these two courses. We are also expanding this work to other courses, such as introductory chemistry and biology, to determine how robust the instrument and data analysis are in different contexts. Another side to transformational efforts is instructor expectations and intent. Therefore, we are currently discussing plans to investigate instructor perceptions of what they want students to do.

The most time-consuming component of this project was the data analysis. We are currently working with the Automated Analysis of Constructed Response (AACR) tool (Automated Analysis of Constructed Response, n.d.) to train machine learning algorithms to automatically analyze data and categorize it according to the codebooks established in this study. Preliminary results are encouraging, and we believe there is great potential to use this approach as a supplement to classroom data gathering about teaching and learning. However, it is important to note that we do not support the use of this tool for faculty "evaluation" but rather for faculty development.
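The AACR pipeline itself is not described here, so the following is only a generic, hypothetical illustration of the underlying idea: train a supervised text classifier on responses that have already been hand-coded into the superordinate themes, then use it to label new responses. The scikit-learn pipeline (TF-IDF features with logistic regression) and the toy responses are stand-ins of our choosing, not AACR's actual method or our data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy hand-coded responses (invented examples, NOT actual student data).
responses = [
    "you have to understand why the reaction happens and apply your knowledge",
    "explain how the electrons move and why one product forms over another",
    "memorize all of the reactions and reagents from lecture",
    "know the list of reactions, naming, and nmr",
    "the exams are 50 minutes and multiple choice",
    "study hard and go to lecture",
]
themes = ["Use of Knowledge", "Use of Knowledge",
          "Rote Knowledge", "Rote Knowledge",
          "Other", "Other"]

# Fit TF-IDF features feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(responses, themes)

# Label a new, unseen response with a superordinate theme.
print(model.predict(["understand the driving force behind the mechanism"]))
```

In practice, a model like this would be trained on the full human-coded corpus and evaluated against held-out human codes before being used to supplement, not replace, qualitative analysis.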
Finally, more work needs to be done to understand the stability of affective constructs, including perceptions. While little work has been done in this area within science education, it is important to note that these constructs may be subject to change based on a variety of social and cultural factors. Therefore, in future studies we are planning to conduct multiple data collections in a single course. By investigating the perceptions of students within a course over time, some evidence could be provided as to which factors of course design help keep perceptions as stable as possible so that reliable measurements can be obtained.

Conclusions

This study was designed to investigate student perceptions in the context of transformed and traditional organic chemistry courses. The idea was that this study would complement our previous work on student reasoning and inform our transformation efforts. Using three open-ended questions and inductive thematic analysis, we noted significant differences in what students perceived they were expected to do, what was most difficult, and how they were assessed in the transformed and traditional courses. Our interpretation of the findings led us to discuss these differences in the context of the classroom cultures. Overall, we noted that more OCLUE students perceived that the use of knowledge was expected and assessed, while more students in the traditional course perceived that memorization was expected and assessed alongside discrete, specific topics. Differences in what students perceived as the most difficult aspect of organic chemistry were also noted, with more students in OCLUE perceiving the use of knowledge as most difficult. Using various frameworks, we discussed how the underlying cultures of these classrooms communicated expectations, emphasized the use of knowledge, and valued the use of knowledge differently. Student perceptions acted as a valued feedback mechanism for how course enactments were being experienced and perceived. Therefore, by using student perceptions as a proxy for elements of the classroom culture, we aimed to offer insights into the design and enactment of these courses.

Final Thoughts

There is evidence to support the idea that learning is enhanced or improved when there is alignment between expectations and practice. It is my belief that classroom culture is all-encompassing and, therefore, significant to characterize in order to address pedagogical concerns. My colleagues have published work on how students perform on cognitive tasks and prompts, yet we had not done much work to characterize the OCLUE curriculum from the student perspective. The work here sought to fill that gap and to provide more information on how we can support and guide students in learning organic chemistry. However, at the conclusion of this study, there were several unanswered questions, such as: 1) what do students in other courses, such as general chemistry and general biology, perceive they are expected to do, what do they find most difficult, and how do they perceive they are assessed (the focus of Chapter V); 2) can the results presented here be reproduced in the same course contexts (the focus of Chapter VI); and 3) what do students mean when they use words like "critical thinking" (the focus of Chapter VII)? All of these questions are addressed in other studies throughout this dissertation, illustrating the significance of this study as a cornerstone project for exploring student perceptions of transformational intent and classroom culture.

REFERENCES

1. 3DL4US, (n.d.), Three-Dimensional Learning for Undergraduate Science. https://3dl4us.org.
2. Adams W. K., Perkins K. K., Dubson M., Finkelstein N. D., and Wieman C. E., (2005), The Design and Validation of the Colorado Learning Attitudes about Science Survey. AIP Conf. Proc., 790, 45–48.
3. Anderson T. L. and Bodner G. M., (2008), What can we do about "Parker"? A case study of a good student who didn't "get" organic chemistry. Chem. Educ. Res. Pract., 9(2), 93–101.
4. Automated Analysis of Constructed Response, (n.d.), Home. https://beyondmultiplechoice.org.
5. Bain K., Bender L., Bergeron P., Caballero M. D., Carmel J. H., Duffy E. M., et al., (2020), Characterizing college science instruction: The Three-Dimensional Learning Observation Protocol. PLoS ONE, 15(6).
6. Banks G., Clinchot M., Cullipher S., Huie R., Lambertz J., Lewis R., et al., (2015), Uncovering Chemical Thinking in Students' Decision Making: A Fuel-Choice Scenario. J. Chem. Educ., 92(10), 1610–1618.
7. Barbera J., Adams W. K., Wieman C. E., and Perkins K. K., (2008), Modifying and Validating the Colorado Learning Attitudes about Science Survey for Use in Chemistry. J. Chem. Educ., 85(10), 1435–1439.
8. Bauer C. F., (2008), Attitude towards chemistry: A semantic differential instrument for assessing curriculum impacts. J. Chem. Educ., 85(10), 1440–1445.
9. Bauer C. F., (2005), Beyond "Student Attitudes": Chemistry Self-Concept Inventory for Assessment of the Affective Component of Student Learning. J. Chem. Educ., 82(12), 1864–1870.
10. Becker N., Noyes K., and Cooper M., (2016), Characterizing Students' Mechanistic Reasoning about London Dispersion Forces. J. Chem. Educ., 93(10), 1713–1724.
11. Becker N., Rasmussen C., Sweeney G., Wawro M., Towns M., and Cole R., (2013), Reasoning using particulate nature of matter: An example of a sociochemical norm in a university-level physical chemistry class. Chem. Educ. Res. Pract., 14(1), 81–94.
12. Berland L. K. and McNeill K. L., (2010), A learning progression for scientific argumentation: Understanding student work and designing supportive instructional contexts. Sci. Educ., 94(5), 765–793.
13. beSocratic, (2020), Home page. https://besocratic.com/home.
14. Bhattacharyya G. and Bodner G. M., (2005), "It Gets Me to the Product": How Students Propose Organic Mechanisms. J. Chem. Educ., 82(9), 1402–1407.
15. Bodner G. M., (1986), Constructivism: A theory of knowledge. J. Chem. Educ., 63(10), 873.
16. Bowen R. and Cooper M., Investigating Student Perceptions of Critical Thinking in Organic Chemistry. Manuscript in preparation.
17. Boyatzis R. E., (1998), Transforming qualitative information: Thematic analysis and code development, Sage.
18. Brown J. S., Collins A., and Duguid P., (1989), Situated Cognition and the Culture of Learning. Educ. Res., 18(1), 32–42.
19. Bryfczynski S. P., (2010), BeSocratic: An Intelligent Tutoring System for the Recognition, Evaluation, and Analysis of Free-Form Student Input. Dissertation. Clemson University, Clemson, South Carolina, US.
20. Calabrese Barton A., Tan E., and Rivet A., (2008), Creating Hybrid Spaces for Engaging School Science Among Urban Middle School Girls. Am. Educ. Res. J., 45(1), 68–103.
21. Carlone H. B., Haun-Frank J., and Webb A., (2011), Assessing equity beyond knowledge- and skills-based outcomes: A comparative ethnography of two fourth-grade reform-based science classrooms. J. Res. Sci. Teach., 48(5), 459–485.
22. Chang J. and Song J., (2016), A case study on the formation and sharing process of science classroom norms. Int. J. Sci. Educ., 38(5), 747–766.
23. Cooper M. and Klymkowsky M., (2013), Chemistry, Life, the Universe, and Everything: A New Approach to General Chemistry, and a Model for Curriculum Reform. J. Chem. Educ., 90(9), 1116–1122.
24. Cooper M. M., (2015), Why Ask Why? J. Chem. Educ., 92(8), 1273–1279.
25. Cooper M. M., Kouyoumdjian H., and Underwood S. M., (2016), Investigating Students' Reasoning about Acid-Base Reactions. J. Chem. Educ., 93(10), 1703–1712.
26. Cooper M. M., Posey L. A., and Underwood S. M., (2017), Core Ideas and Topics: Building Up or Drilling Down? J. Chem. Educ., 94(5), 541–548.
27. Cooper M. M. and Stowe R. L., (2018), Chemistry Education Research - From Personal Empiricism to Evidence, Theory, and Informed Practice. Chem. Rev., 118(12), 6053–6087.
28. Cooper M. M., Stowe R. L., Crandell O. M., and Klymkowsky M. W., (2019), Organic Chemistry, Life, the Universe and Everything (OCLUE): A Transformed Organic Chemistry Curriculum. J. Chem. Educ., 96(9), 1858–1872.
29. Crandell O. M., Kouyoumdjian H., Underwood S. M., and Cooper M. M., (2019), Reasoning about Reactions in Organic Chemistry: Starting It in General Chemistry. J. Chem. Educ., 96(2), 213–226.
30. Crandell O. M., Lockhart M. A., and Cooper M. M., (2020), Arrows on the Page Are Not a Good Gauge: Evidence for the Importance of Causal Mechanistic Explanations about Nucleophilic Substitution in Organic Chemistry. J. Chem. Educ., 97(2), 313–327.
31. Crooks T. J., (1988), The impact of classroom evaluation practices on students. Rev. Educ. Res., 58(4), 438–481.
32. Dalgety J., Coll R. K., and Jones A., (2003), Development of chemistry attitudes and experiences questionnaire (CAEQ). J. Res. Sci. Teach., 40(7), 649–668.
33. Danczak S. M., Thompson C. D., and Overton T. L., (2017), "What does the term Critical Thinking mean to you?" A qualitative analysis of chemistry undergraduate, teaching staff and employers' views of critical thinking. Chem. Educ. Res. Pract., 18(3), 420–434.
34. Deng J. M., McMunn L. E., Oakley M. S., Dang H. T., and Rodriguez R. S., (2021), Toward Sustained Cultural Change through Chemistry Graduate Student Diversity, Equity, and Inclusion Communities. J. Chem. Educ.
35. Duis J. M., (2011), Organic chemistry educators' perspectives on fundamental concepts and misconceptions: An exploratory study. J. Chem. Educ., 88(3), 346–350.
36. Entwistle N. J., (1991), Approaches to learning and perceptions of the learning environment. High. Educ., 22(3), 201–204.
37. Flaherty A. A., (2020a), A review of affective chemistry education research and its implications for future research. Chem. Educ. Res. Pract., 21(3), 698–713.
38. Flaherty A. A., (2020b), Investigating perceptions of the structure and development of scientific knowledge in the context of a transformed organic chemistry lecture course. Chem. Educ. Res. Pract., 21, 570–581.
39. Galloway K. R. and Bretz S. L., (2016), Video episodes and action cameras in the undergraduate chemistry laboratory: Eliciting student perceptions of meaningful learning. Chem. Educ. Res. Pract., 17(1), 139–155.
40. Galloway K. R., Malakpa Z., and Bretz S. L., (2016), Investigating Affective Experiences in the Undergraduate Chemistry Laboratory: Students' Perceptions of Control and Responsibility. J. Chem. Educ., 93(2), 227–238.
41. Grove N. and Bretz S. L., (2007), CHEMX: An instrument to assess students' cognitive expectations for learning chemistry. J. Chem. Educ., 84(9), 1524–1529.
42. Gutiérrez K. D. and Rogoff B., (2003), Cultural ways of learning: Individual traits or repertoires of practice. Educ. Res., 32(5), 19–25.
43. Hammer D., (2000), Student resources for learning introductory physics. Am. J. Phys., 68(S1), S52–S59.
44. Hammersley-Fletcher L. and Hanley C., (2016), The use of critical thinking in higher education in relation to the international student: Shifting policy and practice. Br. Educ. Res. J., 42(6), 978–992.
45. Houchlei S. K., Bloch R. R., and Cooper M. M., (2021), Mechanisms, Models, and Explanations: Analyzing the Mechanistic Paths Students Take to Reach a Product for Familiar and Unfamiliar Organic Reactions. J. Chem. Educ., 98(9), 2751–2764.
46. Irby S. M., Pelaez N. J., and Anderson T. R., (2020), Student Perceptions of Their Gains in Course-Based Undergraduate Research Abilities Identified as the Anticipated Learning Outcomes for a Biochemistry CURE. J. Chem. Educ., 97(1), 56–65.
47. John-Steiner V. and Mahn H., (1996), Sociocultural approaches to learning and development: A Vygotskian framework. Educ. Psychol., 31(3–4), 191–206.
48. Kohn K. P., Underwood S. M., and Cooper M. M., (2018), Connecting structure-property and structure-function relationships across the disciplines of chemistry and biology: Exploring student perceptions. CBE—Life Sci. Educ., 17(2).
49. Laverty J. T., Underwood S. M., Matz R. L., Posey L. A., Carmel J. H., Caballero M. D., et al., (2016), Characterizing college science assessments: The three-dimensional learning assessment protocol. PLoS ONE, 11(9), 1–21.
50. Lemke J. L., (2001), Articulating communities: Sociocultural perspectives on science education. J. Res. Sci. Teach., 38(3), 296–316.
51. Matz R. L., Fata-Hartley C. L., Posey L. A., Laverty J. T., Underwood S. M., Carmel J. H., et al., (2018), Evaluating the extent of a large-scale transformation in gateway science courses. Sci. Adv., 4(10), eaau0554.
52. McGill T. L., Williams L. C., Mulford D. R., Blakey S. B., Harris R. J., Kindt J. T., et al., (2019), Chemistry Unbound: Designing a New Four-Year Undergraduate Curriculum. J. Chem. Educ., 96(1), 35–46.
53. McNeill K. L., Berland L. K., and Pelletier P., (2017), Constructing explanations, in Helping Students Make Sense of the World Using Next Generation Science and Engineering Practices, Schwarz C., Passmore C., and Reiser B. J. (eds.), NSTA Press, pp. 205–227.
54. Merriam S. B. and Tisdell E. J., (2016), Qualitative Research: A Guide to Design and Implementation, Jossey-Bass.
55. Miller P. J. and Goodnow J. J., (1995), Cultural practices: Toward an integration of culture and development. New Dir. Child Adolesc. Dev., 1995(67), 5–16.
56. Momsen J., Offerdahl E., Kryjevskaia M., Montplaisir L., Anderson E., and Grosz N., (2013), Using Assessments to Investigate and Compare the Nature of Learning in Undergraduate Science Courses. CBE—Life Sci. Educ., 12(2), 239–249.
57. Moran B., (2013), How to get an A- in organic chemistry. N. Y. Times.
58. Nasir N. S. and Hand V. M., (2006), Exploring Sociocultural Perspectives on Race, Culture, and Learning. Rev. Educ. Res., 76(4), 449–475.
59. National Academies of Sciences, Engineering, and Medicine, (2018), How People Learn II, The National Academies Press.
60. National Research Council, (2012a), A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas, The National Academies Press.
61. National Research Council, (2012b), Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering, The National Academies Press.
62. National Research Council, (2000), How People Learn I: Brain, Mind, Experience, and School: Expanded Edition, The National Academies Press.
63. Noyes K. and Cooper M. M., (2019), Investigating Student Understanding of London Dispersion Forces: A Longitudinal Study. J. Chem. Educ., 96(9), 1821–1832.
64. Parsons E. C. and Carlone H. B., (2013), Culture and science education in the 21st century: Extending and making the cultural box more inclusive. J. Res. Sci. Teach., 50(1), 1–11.
65. Petterson M. N., Finkenstaedt-Quinn S. A., Gere A. R., and Shultz G. V., (2022), The role of authentic contexts and social elements in supporting organic chemistry students' interactions with writing-to-learn assignments. Chem. Educ. Res. Pract., 23, 189–205.
66. Ramachandran R. and Rodriguez M. C., (2020), Student Perspectives on Remote Learning in a Large Organic Chemistry Lecture Course. J. Chem. Educ., 97(9), 2565–2572.
67. Redish E. F., Saul J. M., and Steinberg R. N., (1998), Student expectations in introductory physics. Am. J. Phys., 66(3), 212–224.
68. Reinholz D. L. and Apkarian N., (2018), Four frames for systemic change in STEM departments. Int. J. STEM Educ., 5(1), 1–22.
69. Rocabado G. A., Kilpatrick N. A., Mooring S. R., and Lewis J. E., (2019), Can We Compare Attitude Scores among Diverse Populations? An Exploration of Measurement Invariance Testing to Support Valid Comparisons between Black Female Students and Their Peers in an Organic Chemistry Course. J. Chem. Educ., 96(11), 2371–2382.
70. Rogoff B., (1990), Apprenticeship in Thinking: Cognitive Development in Social Context, Oxford University Press.
71. Sandoval W. A., Enyedy N., Redman E. H., and Xiao S., (2019), Organising a culture of argumentation in elementary science. Int. J. Sci. Educ., 41(13), 1848–1869.
72. Schein E. H. and Schein P. A., (2016), Organizational Culture and Leadership, 5th ed., Jossey-Bass.
73. Scott S., (2008), Perceptions of Students' Learning Critical Thinking through Debate in a Technology Classroom: A Case Study. J. Technol. Stud., 34(1), 39–44.
74. Scouller K., (1998), The influence of assessment method on students' learning approaches: Multiple choice question examination versus assignment essay. High. Educ., 35, 453–472.
75. Scouller K. M. and Prosser M., (1994), Students' experiences in studying for multiple choice question examinations. Stud. High. Educ., 19(3), 267–279.
76. Semsar K., Knight J. K., Birol G., and Smith M. K., (2011), The Colorado Learning Attitudes about Science Survey (CLASS) for Use in Biology. CBE—Life Sci. Educ., 10(3), 268–278.
77. Sevian H. and Talanquer V., (2014), Rethinking chemistry: A learning progression on chemical thinking. Chem. Educ. Res. Pract., 15(1), 10–23.
78. Seymour E. and Hewitt N. M., (1997), Talking About Leaving: Why Undergraduates Leave the Sciences, Westview Press.
79. Snyder B., (1973), The Hidden Curriculum, The MIT Press.
80. SPSS, (2020), IBM Corp.
81. Stowe R. L. and Cooper M. M., (2017), Practicing What We Preach: Assessing "Critical Thinking" in Organic Chemistry. J. Chem. Educ., 94(12), 1852–1859.
82. Stowe R. L., Scharlott L. J., Ralph V. R., Becker N. M., and Cooper M. M., (2021), You Are What You Assess: The Case for Emphasizing Chemistry on Chemistry Assessments. J. Chem. Educ., 98(8), 2490–2495.
83. Stroupe D., (2014), Examining Classroom Science Practice Communities: How Teachers and Students Negotiate Epistemic Agency and Learn Science-as-Practice. Sci. Educ., 98(3), 487–516.
84. Talanquer V., (2021), Multifaceted Chemical Thinking: A Core Competence. J. Chem. Educ., 98(11), 3450–3456.
85. Talanquer V. and Pollard J., (2010), Let's Teach How We Think Instead of What We Know. Chem. Educ. Res. Pract., 11, 74–83.
86. Tashiro J. and Talanquer V., (2021), Exploring Inequities in a Traditional and a Reformed General Chemistry Course. J. Chem. Educ., 98(12), 3680–3692.
87. Thiry H., Weston T. J., Harper R. P., Holland D. G., Koch A. K., Drake B. M., et al., (2019), Talking about Leaving Revisited, Seymour E. and Hunter A.-B. (eds.), Springer.
88. Thoman D. B., Muragishi G. A., and Smith J. L., (2017), Research Microcultures as Socialization Contexts for Underrepresented Science Students. Psychol. Sci., 28(6), 760–773.
89. Thomas D. R., (2006), A general inductive approach for analyzing qualitative evaluation data. Am. J. Eval., 27(2), 237–246.
90. Vygotsky L., (1978), Mind in Society, The Harvard University Press.
91. Webber D. M. and Flynn A. B., (2018), How Are Students Solving Familiar and Unfamiliar Organic Chemistry Mechanism Questions in a New Curriculum. J. Chem. Educ., 95(9), 1451–1467.
92. Xu X. and Lewis J. E., (2011), Refinement of a chemistry attitude measure for college students. J. Chem. Educ., 88(5), 561–568.
93. Zotos E. K., Moon A. C., and Shultz G. V., (2020), Investigation of chemistry graduate teaching assistants' teacher knowledge and teacher identity. J. Res. Sci. Teach., 57(6), 943–967.

APPENDIX

Permissions to reproduce manuscript in its entirety

Figure 4.7. A screenshot of the notice for permissions to reproduce the manuscript in its entirety

Nonmutually Exclusive Coding

As mentioned in the article, the authors used a mutually exclusive coding scheme. However, in order to assess the viability of using mutually exclusive codes, one of the authors (RSB) did a quick analysis of the data using non-mutually exclusive codes to determine whether the analysis changed. It is important to note that most of the responses could only be categorized in one way; however, some responses mentioned multiple ideas that technically qualified for two different categories (even though in many cases these responses primarily focused on one category). Figures 4.8, 4.9, and 4.10 correspond to question 1 (expectations of thinking), question 2 (most difficult thing), and question 3 (assessment), respectively.

Figure 4.8. Percentage of student responses for each category/code for question 1 using nonmutually exclusive codes

Figure 4.9. Percentage of student responses for each category/code for question 2 using nonmutually exclusive codes

Figure 4.10. Percentage of student responses for each category/code for question 3 using nonmutually exclusive codes

In these figures, the percentages are calculated out of the total number of codes assigned to each course, either traditional or OCLUE. In the graph legends, we have listed the "n" values for the total number of students for traditional (Trad) and OCLUE; beside those, we have also included the total number of nonmutually exclusive codes assigned to the data set. As can be noted, the same patterns are recognized as when using mutually exclusive coding.

Pearson's Chi-Square Tests and Post-Hoc Analyses

It is important to note that the calculation of these statistics did include the data in the "Other" theme, which we largely do not discuss throughout the paper. Since each question yielded different patterns and codebooks, they will be discussed individually. For question 1 (expectations of thinking), it was noted in the qualitative results that over 21% more OCLUE students perceived they were expected to use their knowledge, while over 20% more traditional students perceived they were expected to rely on rote knowledge. The Pearson's chi-square test further supported the differences noted in the qualitative data at the alpha level of 0.05 (χ2 = 47.247, df = 2, p < 0.001; see Table 4.5). This indicates that there is a statistically significant difference in student perceptions of how they are expected to think between the two cohorts, albeit with a relatively small effect size. For question 2 (most difficult thing), almost 25% more students in OCLUE perceived the most difficult aspect was the use of knowledge, while almost 42% more traditional students perceived the most difficult aspect was related to the rote knowledge theme (i.e., memorization or a discrete specific topic).
For question 2 (most difficult thing), almost 25% more students in OCLUE perceived the most difficult aspect was the use of knowledge, while almost 42% more traditional students perceived the most difficult aspect was related to the rote knowledge theme (i.e., memorization or a discrete specific topic). Once again, the Pearson chi-square test supported these differences at an alpha level of 0.05 (χ2 = 144.220, df = 2, p < 0.001; see Table 4.5), indicating that there is a difference in student perceptions of the most difficult aspects between the OCLUE and traditional organic chemistry students with a medium-to-large effect size. Finally, for question 3 (assessment), almost 37% more students in OCLUE perceived they were assessed on their use of knowledge, while over 49% more students in traditional perceived they were assessed on their rote knowledge. The chi-square test supported the differences we noted in our coding (χ2 = 203.605, df = 2, p < 0.001; see Table 4.5), which indicates that the perceptions of what is assessed between the two cohorts are different with a medium-to-large effect size.

Table 4.5. Chi-square test values

Question                        Pearson Chi-Square Value    df    p-value (2-sided)    Cramer's V
Q1: Expectations of Thinking    47.247                      2     p < 0.001            0.235
Q2: Most Difficult Thing        144.220                     2     p < 0.001            0.411
Q3: Assessment                  203.605                     2     p < 0.001            0.489

For our chi-square analysis, we used an alpha of 0.05. This means that if the absolute value of a standardized residual from the post-hoc analysis is greater than the critical value of plus-or-minus 1.96, then that particular theme ("Use of Knowledge", "Rote Knowledge", and/or "Other") is contributing to the statistically significant result from the Pearson's chi-square test. That is, that particular theme is a driving force for the differences between the two cohorts of Traditional and OCLUE. Furthermore, the positive and negative values associated with each standardized residual allow us to determine whether the observed number of responses in a particular theme is greater than expected (a positive value) or lower than expected (a negative value). Finally, we can also consider the magnitude of the standardized residuals relative to one another to provide further insights.
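The post-hoc step can be sketched in the same hypothetical setup by computing standardized residuals, (observed - expected) / sqrt(expected), for each cell. Note that some statistical packages instead report adjusted standardized residuals, which include an additional row and column correction, so the values in Table 4.6 need not match this simpler formula exactly.

    import numpy as np
    from scipy.stats import chi2_contingency

    # Same hypothetical 2 x 3 table as above (cohort x theme).
    observed = np.array([[120, 200, 80],
                         [210, 110, 80]])
    chi2, p, df, expected = chi2_contingency(observed)

    # Standardized residuals: cells with |residual| > 1.96 (alpha = 0.05) are
    # driving the significant chi-square result; the sign indicates whether a
    # cell is over-represented (+) or under-represented (-) relative to expected.
    std_resid = (observed - expected) / np.sqrt(expected)
    print(np.round(std_resid, 1))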
For the first question (expectations of thinking), the standardized residual for the "Use of Knowledge" theme was -2.5 for Traditional and 3.9 for OCLUE (see Table 4.6). The absolute value of both standardized residuals is larger than 1.96, which indicates that this theme is influencing the Pearson's chi-square test. Furthermore, it can be noted that the Traditional standardized residual is negative while the OCLUE value is positive. This means that the number of responses categorized as "Use of Knowledge" for Traditional is lower than expected while it is higher than expected for OCLUE. For the "Rote Knowledge" theme, the standardized residual was 2.7 for Traditional and -4.2 for OCLUE. These values indicate that the "Rote Knowledge" theme is another driver of significance in the Pearson's chi-square test for question 1 (expectations of thinking). Since the absolute values of the standardized residuals for the "Other" theme are below 1.96, this theme is not a significant driver of the chi-square result for this question.

In terms of the second question (most difficult thing), we note similar patterns to question 1 (expectations of thinking), albeit more pronounced. For the "Use of Knowledge" theme, the standardized residuals were -4.6 for Traditional and 7.2 for OCLUE (see Table 4.6). On the other hand, the "Rote Knowledge" theme standardized residuals were 3.7 and -5.8 for Traditional and OCLUE, respectively. Similar to question 1 (expectations of thinking), these values indicate that both the "Use of Knowledge" and "Rote Knowledge" themes are influencing the outcome of the chi-square test. For this question, though, the standardized residuals for the "Other" theme appear to also be a driver of the chi-square test. However, as previously mentioned, the magnitude of the standardized residuals can provide further insights. For both cohorts (Traditional and OCLUE), the standardized residuals for "Use of Knowledge" and "Rote Knowledge" are larger in magnitude than the standardized residuals of the "Other" theme, which sit at -2.7 and 4.2 for Traditional and OCLUE, respectively. In particular, the standardized residual of 7.2 for "Use of Knowledge" in OCLUE is fairly large. This indicates that while the "Other" theme is an influence in the initial chi-square test, the "Use of Knowledge" and "Rote Knowledge" themes are likely stronger influences overall.

Finally, for the third question (assessment), we once again note the same patterns in the standardized residuals as for the other questions. For the "Use of Knowledge" theme, the standardized residuals were -5.6 for Traditional and 8.8 for OCLUE. Then, for the "Rote Knowledge" theme, the standardized residuals were 5.0 and -7.8 for Traditional and OCLUE, respectively. Here, the standardized residuals for "Other" were -1.6 and 2.5 for Traditional and OCLUE. Therefore, similar to the previous results, both the "Use of Knowledge" and "Rote Knowledge" categories are primary influences in the Pearson's chi-square test, though the standardized residual for OCLUE was higher than 1.96 for the "Other" theme. This indicates that the "Other" theme was also a driver of the chi-square result for OCLUE (alongside the other themes), but this was not the case for Traditional. As noted in question 2 (most difficult thing), although the "Other" theme in OCLUE was influencing the significant result of the chi-square test, it was far smaller in magnitude than "Use of Knowledge" or "Rote Knowledge".

Table 4.6. Post-hoc analyses (standardized residuals)

Question                               Course        Use of Knowledge Theme    Rote Knowledge Theme    Other Theme
Question 1: Expectations of Thinking   Traditional   -2.5                      2.7                     0.2
                                       OCLUE          3.9                     -4.2                    -0.2
Question 2: Most Difficult Thing       Traditional   -4.6                      3.7                    -2.7
                                       OCLUE          7.2                     -5.8                     4.2
Question 3: Assessment                 Traditional   -5.6                      5.0                    -1.6
                                       OCLUE          8.8                     -7.8                     2.5

CHAPTER V: INVESTIGATING STUDENT PERCEPTIONS OF TRANSFORMATIONAL INTENT AND CLASSROOM CULTURE IN GENERAL CHEMISTRY COURSES

Preface

The work presented in this chapter was conducted in CEM 141: General Chemistry I at Michigan State University. The impetus for this study was informed by our findings in CEM 252 (Chapter IV) and was subsequently guided by previous pilot studies in CEM 141 and 142 (General Chemistry II), which are only briefly mentioned here. The purpose of this study was to explore how the perception questions would perform in different courses. The pilot studies that took place prior to the study presented here were important for perception question development and refinement. As a corollary to this purpose, we also piloted the questions in BS 161: Cell and Molecular Biology. However, given the exploratory nature of that pilot, its data will not be discussed in detail.
All of these data were collected during the coronavirus pandemic, and the courses (including the pilots) were all taught online (in some cases for the first time), which is an important consideration when comparing to previous work. At the time of the study, there was not a relevant comparison course of comparable size at MSU for CEM 141. Therefore, a comparative analysis between course types was not possible. Regardless, a brief discussion of the culture of learning will be considered. All introductory and methodological information within Chapter IV of this dissertation applies to the data and discussion here.

Introduction

By the time this study was conducted, CEM 141 had been entirely transformed into a three-dimensional course reserved for non-chemistry majors at MSU. Prior to transformation, the general chemistry course mimicked the traditional courses previously described within the Introduction and Chapter IV, and the transformation was part of a large institutional effort to transform STEM gateway courses by making them more three-dimensional (Matz et al., 2018). At MSU, CEM 141 uses the Chemistry, Life, the Universe, and Everything (CLUE) curriculum (Cooper & Klymkowsky, 2013). Previous work in CLUE courses investigated student reasoning ability, and the evidence supports CLUE as an effective curriculum for improving student reasoning in chemistry (Becker et al., 2016; Cooper et al., 2016; Crandell et al., 2019; Kararo et al., 2019; Noyes & Cooper, 2019; Underwood et al., 2016; Williams et al., 2015).

Alongside the CEM 141 data, I will briefly discuss a series of pilot studies conducted in CEM 141, CEM 142, and BS 161. Similar to CEM 141, CEM 142 is also transformed using three-dimensional learning and serves non-chemistry majors. Although BS 161 had not been completely transformed, it had undergone some curricular and pedagogical changes that were influenced by three-dimensional learning at the time of this study. The course was team-taught by different instructors. The revamping of BS 161 worked to align the separate course sections by using the same learning objectives, recorded lectures, quizzes, in-class activities, and poll questions. However, individual instructors could choose how they wanted to deliver content during the first and last parts of each class period. The BS 161 instructional team also worked together to develop a common exam bank, but the instructors did attempt to make exams slightly different between the sections to reduce opportunities to cheat. One of the organizing instructors stated that the exams included some constructed-response questions focused on constructing explanations and arguments.

These pilot studies were meant to guide perception question development and refinement as well as to provide insights into how the perception questions performed in courses outside of organic chemistry. Therefore, this chapter will primarily focus on data collected from Fall 2020 CEM 141 after the pilots in CEM 141 and CEM 142 had completed. The BS 161 pilot took place at the same time as data collection for Fall 2020 CEM 141. Even though there were no comparison courses for CEM 141, CEM 142, or BS 161, a couple of the research questions are similar to the previous study in Chapter IV:

1. Can the perception questions be used in other courses reliably?
2. In what ways do student perceptions of valued ways of doing and thinking align with the transformational intent?
3. How do elements of the course culture impact student perceptions of what is valued?
4. How did student perceptions change from the middle of the semester to the end of the semester in CEM 141?

Methods

Participants

Table 5.1 lists the participants for the pilot studies in CEM 141, CEM 142, and BS 161, as well as the study in Fall 2020 CEM 141. In Fall 2020, the CEM 141 students received the same perception questions twice: once at the middle of the semester and a second time at the end. This allowed me to address the question of how stable perceptions were throughout the course (research question #4). Furthermore, the Fall 2020 CEM 141 students were the only cohort to receive an additional question regarding their perception of the significance of grades.

Table 5.1. Participant information for pilot studies

Semester      Course                                  Number of Participants    Notes
Summer 2020   CEM 141: General Chemistry I (CLUE)     74                        Data collected at end of semester
Summer 2020   CEM 142: General Chemistry II (CLUE)    76                        Data collected at end of semester
Fall 2020     CEM 141: General Chemistry I            1894                      Data collected at mid-semester
Fall 2020     CEM 141: General Chemistry I            1672                      End-of-semester follow-up for the mid-semester data in Fall 2020
Fall 2020     BS 161: Cell and Molecular Biology      400                       Data collected at end of semester

Question Development

The questions evolved over the course of each of the pilot studies listed in Table 5.1. Starting in Summer 2020, the perception questions asked of the CEM 141 students were similar to the questions asked of the CEM 252: Organic Chemistry II students in Chapter IV, with some minor modifications. The questions for Summer 2020 CEM 141 students are shown in Table 5.2.

Table 5.2. The open-ended questions asked to Summer 2020 CEM 141 students

Question #1 (Expectations of Thinking): How would you describe the ways students are expected to think in CEM 141?
Question #2 (Most Difficult Thing): What would you say is the most difficult thing about CEM 141?
Question #3 (Assessment): How would you describe what is assessed in CEM 141?

As can be noted in the table above, the questions changed from the organic chemistry study in Chapter IV by becoming more general and open-ended. That is, we were not asking specifically about mechanisms in question #1. With these new questions, however, we noted that many of the students in this small sample were offering vague responses that were difficult to characterize (data shown in the appendix). Therefore, we attempted to minimize these types of responses for the Summer 2020 CEM 142 data by developing "scaffold" questions to supplement the perception questions. These scaffold questions were not meant to elicit rich responses regarding student experience but instead were meant to minimize responses that were either too vague to analyze or that told us information that was not interesting from a research standpoint. All questions asked to Summer 2020 CEM 142 students are included in Table 5.3.

Table 5.3. The open-ended questions asked to Summer 2020 CEM 142 students

Scaffold question #1 (online): How would you describe your experience taking CEM 142 online?
Scaffold question #2 (student actions): How would you describe the things students should do in and outside of CEM 142?
Scaffold question #3 (course aspects): How would you describe the overall course structure and policies in CEM 142?
Analysis question #1 (expectations of thinking): How would you describe the ways students are expected to think in CEM 142? Please explain your response or provide an example.
Scaffold question #4 (exam and quiz aspects): How would you describe the overall format and aspects of exams and quizzes in CEM 142?
Analysis question #2 (most difficult thing): What would you say is the most difficult thing about CEM 142? Please explain your response.
Scaffold question #5 (course materials): Which course materials are most important to study in CEM 142?
Scaffold question #6 (specific topics): Which topics are assessed the most in CEM 142?
Analysis question #3 (assessment): How would you describe the ways of thinking that are assessed in CEM 142? Please explain your response.

The addition of the "Please explain your response…" piece of the question was meant to encourage students to elaborate on their responses and thereby eliminate additional vagueness. However, the scaffold questions were eventually removed, and the analysis questions were polished further for the Fall 2020 CEM 141 and BS 161 students (data for the Summer 2020 CEM 142 cohort can be found in the appendix). As a reminder, for the Fall 2020 CEM 141 students, we asked the perception questions twice in the semester; however, we did not do this for the BS 161 pilot. Instead, the BS 161 students received the questions at the end of the semester only. The questions asked to the Fall 2020 CEM 141 cohort and BS 161 pilot are included in Table 5.4.

Table 5.4. The open-ended questions asked to Fall 2020 CEM 141 and BS 161 students

Analysis question #1 (expectations of thinking): How would you describe the ways students are expected to think in CEM 141?
Follow-up to Analysis question #1: In order to help us better understand your response and perspective, please explain your response from question #1 and/or provide an example.
Analysis question #2 (most difficult thing): What would you say is the most difficult thing about CEM 141?
Follow-up to Analysis question #2: In order to help us better understand your response and perspective, please explain your response from question #3 and/or provide an example.
Analysis question #3 (assessment): How would you describe what is assessed in CEM 141?
Follow-up to Analysis question #3: In order to help us better understand your response and perspective, please explain your response from question #5 and/or provide an example.
Analysis question #4: In your own words, what do grades signify in CEM 141? (NOTE: only CEM 141 students received this question)

The "Please explain your response…" piece of the question was separated out from the original question and included in a follow-up question. The follow-up questions were separated to help students target their responses. As noted, the scaffold questions were dropped, as some of them seemingly confused students in the Summer 2020 CEM 142 pilot study. Furthermore, it is plausible that the addition of the scaffold questions may have induced question fatigue. The addition of "Analysis question #4" can be noted here. The question was included to collect pilot information on how students perceive the significance of grades within the context of the course, to further comment on the cultures of learning in each course. For clarity of Fall 2020 data collection, a timeline of question implementation is shown in Figure 5.1.
Figure 5.1. Timeline of questions asked in Fall 2020 (mid-semester: analysis questions and follow-up questions in CEM 141 only; end of semester: analysis questions and follow-up questions in CEM 141 and BS 161)

Data Collection

All data collection was done through beSocratic for CEM 141 and 142 (beSocratic, 2020; Bryfczynski, 2010), and it was similar to the data collection approach used in Chapter IV. For BS 161, on the other hand, the perception questions were added onto the final exam, which was administered online by the BS 161 instructional team via the course management system, D2L. The responses were then exported from the exam into Excel spreadsheets by the instructors and shared with me via Google Drive. All student identities were protected throughout this process, and the responses could only be tracked using anonymous beSocratic IDs or with random numbers generated by me for each response.
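A minimal sketch of this anonymization step is shown below; the file and column names are hypothetical placeholders rather than the actual exports.

    import random
    import pandas as pd

    # Hypothetical export of BS 161 final-exam responses (filename and
    # column names are assumptions for illustration).
    df = pd.read_excel("bs161_final_exam_responses.xlsx")

    # Assign a unique random study ID to each response and keep only the
    # response text, dropping any identifying columns before analysis.
    study_ids = random.sample(range(100000, 1000000), k=len(df))
    df_anon = pd.DataFrame({"study_id": study_ids,
                            "response": df["response_text"]})
    df_anon.to_csv("bs161_responses_anonymized.csv", index=False)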
Data Analysis

All data analysis for CEM 141, CEM 142, and BS 161 followed the same protocols as previously detailed in Chapter IV for the organic chemistry study. That is, it followed an inductive thematic analysis approach (Thomas, 2006) that was informed by grounded theory methodologies (Charmaz, 2006; Corbin & Strauss, 2015).

Results and Discussion

Summer 2020 CEM 141 and CEM 142 General Chemistry Pilots

The Summer 2020 pilot studies relied on smaller datasets and were explicitly focused on question development and refinement. Therefore, the results of these pilots will not be discussed in detail here but can be found in the appendices of this chapter. Many of the same categories that emerged in the organic chemistry data (in Chapter IV) also emerged in the Summer 2020 general chemistry data sets, with some minor differences. For the Summer 2020 CEM 141 pilot, the majority of responses were vague and difficult to interpret or conveyed information I already knew about the course and assignments, a similar occurrence to the CEM 252 data presented in Chapter IV. To clarify student responses to the perception questions, I developed the scaffold questions introduced in the methods section of this chapter. The goal of these questions was to clarify student responses by asking students explicitly about features of the course that I already knew about, with the hope that they would not regurgitate this information on the perception questions I wanted to analyze. I piloted these scaffold questions in Summer 2020 CEM 142. However, the scaffold questions did not seem to clarify responses, and the majority of responses were coded as "Generalities" or told me information I already knew. That is, this approach to scaffolding did not help. Furthermore, the scaffold questions seemed to induce question fatigue or confuse students. Moving forward into Fall 2020, I removed the scaffold questions and settled on asking students to provide examples in their responses to the perception questions, with the hope this would help me to interpret the meaning of their responses.

Fall 2020 CEM 141 General Chemistry Data

In Fall 2020, I removed the scaffold questions and returned to only asking the perception questions. The main difference in question structure here, though, was that I split the questions into two parts to encourage students to provide examples supporting their responses. That is, I asked about their perception in one question, then followed it up with a "Please explain your response…" question. The number of responses for this course constituted a much larger data set, and there was not a relevant comparison group of similar size. For this cohort, I decided to collect data at two time points in Fall 2020: the middle and end of the semester. My goal, then, was to compare student responses from the middle of the semester to the end to determine if their perceptions remained stable over time or if they changed. To begin, I will present the data as broad comparisons between the middle of the semester and the end of the semester.
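The broad comparison described here amounts to tallying the percentage of responses assigned each code at the two time points. A sketch, assuming hypothetical file and column names for the coded spreadsheets, is:

    import pandas as pd

    # Each file is assumed to hold one mutually exclusive code per response
    # in a "code" column; the filenames are hypothetical.
    mid = pd.read_csv("cem141_fall2020_mid_coded.csv")
    end = pd.read_csv("cem141_fall2020_end_coded.csv")

    # Percentage of responses per code at each time point (rows align by
    # code, e.g., "Apply and Reason", "Generalities").
    comparison = pd.DataFrame({
        "mid_semester_pct": mid["code"].value_counts(normalize=True) * 100,
        "end_semester_pct": end["code"].value_counts(normalize=True) * 100,
    }).round(1)
    print(comparison)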
Expectations of Thinking

This question asked students: "How would you describe the ways students are expected to think in CEM 141?" and had a follow-up question asking for an example. The codebook for this question was similar to the CEM 252 study (Chapter IV) with some minor differences. The codebook with example quotes for this question is shown in Table 5.5.

Table 5.5. Codebook for question 1: Perceptions of the expectations of thinking

Code: Apply and Reason
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think: (1) understanding "why" a reaction proceeds; (2) the use of knowledge, specifically with the use of fundamental or basic ideas; (3) the transfer of knowledge to new problems; (4) making connections between concepts, especially in order to apply them; (5) making predictions in order to solve a problem.
Example quotes:
"Students are expected to think and analytical. We are given knowledge and are expected from that knowledge to relate ideas and concepts back to the overall picture. In class we would be given a claim. Like opposite charges attract and like charges repel. So, when looking at a model of the atom and we know that the nucleus is positive and the electrons are negative. These two charges are opposite and we know that they will attract each other. With this attractive force pulling things together we know that the nucleus will pull electrons closer to it."
"they need to understand the why. When in the class for example you need to know what evidence and why light is considered both a wave and a partial. Not that just know that it is."

Code: Identify and Describe
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think: (1) understanding the "what" and "how" reactions proceed, particularly without mentioning the use of knowledge to understanding "why" reactions proceed; (2) mentions understanding at the scalar or one scalar below levels, particularly through the recognition of concepts such as polarity and electronegativity and their significance to understanding; (3) when a student explicitly mentions understanding instead of memorization; (4) when a student mentions "differentiating" between reactions with any further explanation; (5) responses include a discussion of forces, charges, or stabilization.
Example quotes:
"We are expected to think conceptually and theoretically. We are also to think both at a grand scale and at smaller scales. Many of the things that we study are not able to be visibly or physically proven to us in person, but rather through the experiments conducted by others, or through theoretical explanation. We also study how things work as a whole, but also on a very small, molecular scale."
"I think that in cem 141 we are expected to think more conceptually. I said conceptually because we have to think about how something happens instead a straight forward answer like a math question."

Code: Only Understand
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think: (1) when a student mentions "understanding"; (2) description of "understanding" as contrasted to repeating rules or practice problems.
Example quotes:
"I think students are expected to learn the concepts you teach, but then put it into a deeper perspective. If we just watched the lectures and did the homework we wouldn't do too good in this class. The quizzes and short answers are a lot harder than the homework, and make you think at a deeper level. In order to be prepared you have to be able to understand that different learning objects and process the information in a deeper way."
"I found the ways to be quite different from what I am used to in my country (India). The ways at MSU really force you to UNDERSTAND the material which I feel is the best way as this develops an interest in the subject and also the best part is no cramming. Also, no final exam lifts the pressure off one's mind enabling the student to study stress free. In my country, for a chapter let's say periodic trends, you are expected to cram a lot of trends and exceptions in a single chapter. A concept may not be explored in thorough detail which is not the case at MSU. At MSU even if lesser number of concepts are taught, they are done in a way so that the student can understand it thoroughly. This is a much better way to teach"

Code: Memorization
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of how they are expected to think: (1) memorization, remembering, recalling, or regurgitation; (2) retaining information.
Example quotes:
"I am expected to recall information that was previously stated in slides and or lessons just learned. we are expected to recall information from waves, particles, quantized numbers, etc...."
"Although I guarantee chemistry teachers would love to hear differently, students are expected to think based on memorization and from such a fine and small level that much of the information is a building block, but not detrimental to one's life. For example, in any of the possible careers of my interests (and I believe for others), will I have to know why there are stronger LDFs for some molecules versus others and why one has a higher boiling point for another. I believe chemistry is an interesting concept that can be built upon and does have some useful things, however, the overall way to think is only beneficial, in my opinion, for a bit of time."

Code: Solution-Centered and Heuristics
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of how they are expected to think: (1) focusing on accuracy, being right or wrong; (2) the use of rules, processes, or thinking in procedural ways without mention of what, how, and why.
Example quotes:
"We are expected to be very concise and accurate knowing the material. If you say the wrong word in your response it could ruin the whole answer. 100% accuracy is very important in this class"
"Students are expected to think in procedural ways during the class. It's not outside of the box thinking usually which is a good thing to me. I like when the way of thinking is the same. For example, when writing electron configurations, it's the same process for everyone."

Code: Student Responsibility
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of how they are expected to think: (1) actions related to what students do in a course, such as taking notes, applying themselves, finishing work, and more; (2) putting in effort and participating; and (3) learning on their own outside of class and being accountable.
Example quotes:
"Students are expected to think on their own and figure things out by learning with a little help from the teacher. The class itself is not particularly hard which gives students the opportunity to understand the material in their own way without the stress of knowing everything because its not hard if you apply yourself. This class is one of my easiest classes this semester and one of the ways I have noticed that is because I am not stressed about knowing the material. I can focus and almost have fun learning how I want to and think with creativity."
"I feel like we are expected to come to class everyday and actively participate. and if you think otherwise you are definitely going to fail for example, if i thought that i didnt have to go to class and still pass I would definitely fail because the way the class is structured you need to show up every single day"

Code: Course Aspects and Materials
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of how they are expected to think: (1) focusing on course organization, difficulty, policies, or other aspects related to the course itself; (2) focus on specific course materials, such as homework, lecture notes, or recitation worksheets.
Example quote:
"Students are expected to think based on objectives. All of the questions revolve around the objectives so if you understand the objectives you should be able to answer the majority of the questions asked on tests."

Code: Generalities
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think: (1) the total absence of any chemistry in the response; (2) using generic and unclear descriptors for thinking such as thinking critically, conceptually, creatively, thoroughly, or differently, particularly if the student does not expand on what they mean; (3) stating general facts about content such as "reactants go to products".
Example quotes:
"Scientifically Like you think logically pretty much. You have to do things the chemistry way, like thinking of light as a wave and particle."
"Students are expected to think with an open mind and consider the theory's not as fact, but what we currently know We have learned several different models of the atom so far and have been asked to compare whats different about them. They tell us that the information we are learning is what we currently know about the topics and things might change as time goes on"

Code: Not Applicable
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think: (1) the response does not answer the question; (2) when a student provides no response at all; (3) when the response is unclear or interpretation is difficult, such as when students say "you must understand/know the material"; (4) mention of exam and/or course aspects such as exam difficulty; (5) when a student is venting about the course, professor, or other aspects relevant to the course.
Example quotes:
"I took chemistry In high school so I knew it would be what I learned then but much more advanced. We also had many labs. From what I remember in high school we had lab partners in groups of 4 and had lab so many times a week were we would do all types of cool experiments."
"Students are expected to"

The addition of the "Please explain your response…" follow-up question was meant to limit the number of responses coded as "Generalities". Codes such as "Apply and Reason", "Identify and Describe", "Memorization", "Course Aspects and Materials", "Generalities", and "Not Applicable" had near-identical descriptions to the codes presented in the organic chemistry and Summer CEM 141/142 pilots. The "Only Understand" code was used to capture student responses that highlighted the importance of understanding the material without elaborating further. The "Solution-Centered and Heuristics" category was assigned when a response stated that the accuracy of the answer, or the process used to achieve the correct answer, was expected in the course. The "Student Responsibility" category was similar to the "Student Actions" category noted in previous datasets and was meant to capture responses that discussed what students must do to be successful in the course, such as paying attention, studying, and/or reviewing material (among others).

Figure 5.2. Breakdown of responses to question 1 for Fall 2020 CEM 141

A cursory investigation of the code distributions indicates that perceptions remained relatively consistent across most of the categories. However, statistical tests to ascertain whether there was a difference were only conducted for the superordinate themes, which will be presented later. With the mid-semester data, approximately 46% of the students mentioned they are expected to engage in "Apply and Reason". This accounts for approximately 873 students of this cohort, and a similar percentage of students perceived the same at the end of the semester. While these questions were able to detect other perceptions, it can be noted that the "Generalities" code is the second most assigned code for this question.
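Before testing, the disaggregated codes are collapsed into superordinate themes. The sketch below illustrates this step in the spirit of the "Use of Knowledge", "Rote Knowledge", and "Other" themes from Chapter IV; the mapping and file name here are illustrative assumptions, not the exact scheme as published.

    import pandas as pd

    # Illustrative mapping of codes to superordinate themes (assumed).
    theme_map = {
        "Apply and Reason": "Use of Knowledge",
        "Identify and Describe": "Use of Knowledge",
        "Memorization": "Rote Knowledge",
        "Specific Topics and Concepts": "Rote Knowledge",
    }

    coded = pd.read_csv("cem141_fall2020_mid_coded.csv")  # hypothetical filename
    coded["theme"] = coded["code"].map(theme_map).fillna("Other")
    print(coded["theme"].value_counts())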
The continued prevalence of the "Generalities" code may highlight the fact that students may not have the language to truly describe what they mean when they say terms like "critical thinking". It could also be the case that students believe that their definition of "critical thinking" and similar descriptors is universal; indeed, various studies in the literature have made this assumption, as I will explore in my "critical thinking" study in Chapter VII.

The results are similar to what I have found in the past. Many students in CEM 141 perceive they are expected to use their knowledge (as noted in the large number of responses in "Apply and Reason" and "Identify and Describe"). Considering that CEM 141 is transformed using three-dimensional learning, which seeks to get students to use their knowledge in the context of the scientific practices, these data demonstrate that many student perceptions are aligning with the transformational intent of the curriculum. However, there are still many responses in "Generalities".

Most Difficult Aspects

This question asked students: "What would you say is the most difficult thing about CEM 141?" with a follow-up question requesting an example to further explain their response. Similar to the "Expectations of Thinking" question, the codebook for this question had similarities to the CEM 252 study (Chapter IV) and is shown in Table 5.6.

Table 5.6. Codebook for question 2: Perceptions of the most difficult thing

Code: Apply and Reason
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is the most difficult thing: (1) understanding "why" a reaction proceeds; (2) the use of knowledge, specifically with the use of fundamental or basic ideas; (3) the transfer of knowledge to new problems; (4) making connections between concepts or piecing/linking concepts together, especially in order to apply them; (5) making predictions in order to solve a problem.
Example quotes:
"The most difficult thing I have come across is the idea of using knowledge from the readings and interpreting it in the zoom sessions. The retention of knowledge and the overall piecing together of concepts from each chapter to fully understand what primary things can be studied for exams would be an example of this."
"I think the most difficult thing for me so far about CEM 141 has been the keeping of knowledge and building on it as we go. I will sometimes get confused in a new chapter and forget to apply previous knowledge from other chapters because sometimes it is confusing to see the relation until I ask for clarification in breakout rooms or recitation to my peers."

Code: Identify and Describe
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is the most difficult thing: (1) understanding the "what" and "how" reactions proceed, particularly without mentioning the use of knowledge to understanding "why" reactions proceed; (2) mentions understanding at the scalar or one scalar below levels, particularly through the recognition of concepts such as polarity and electronegativity and their significance to understanding; (3) when a student explicitly mentions understanding instead of memorization; (4) when a student mentions "differentiating" between reactions with any further explanation; (5) responses include a discussion of forces, charges, or stabilization. NOTE: responses that simply mention "knowledge" or "understanding" do not receive this code.
Example quotes:
"Trying to understand how everything has an affect on what is happening on the microscopic level. It is hard to think about all the properties that have an affect on the size of an atom and how easily the atom can be changed so quickly."
"Understanding concepts on how atoms function. The charges of subatomic particles and their impact on the shape and size of the atom has always confused me. I find it hard to understand how a positive increasing and an electron being added to a shell could make the atom smaller. I asked Zach in my recitation to explain it to me and I think I have begun to understand better on a atomic level of why this happens. Atomic trends in general led me to struggle in AP chem in high school."

Code: Only Understand
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is the most difficult thing: (1) when a student mentions "understanding"; (2) description of "understanding" as contrasted to repeating rules or practice problems.
Example quotes:
"I think understanding all of the material is one of the hardest things to do. There is a lot thrown at us in each module and sometimes it gets hard to remember what we worked on during what day, but checking the class notes daily has helped me a lot with that lately."
"Understanding chemistry concepts. Explanation in detail of what kind of problems each concept is used."

Code: Memorization
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is the most difficult thing: (1) memorization, remembering, recalling, or regurgitation; (2) knowing content with no explicit mention of understanding the "what", "how", or "why" a reaction proceeds.
Example quotes:
"All the memorization type work. I'm a CS/Math student, I prefer concept based subjects, and I feel like chemistry relies a lot less on intuition and requires me to memorize rules and the such I like having to critically think about ways to use tools I have to solve a problem. In chemistry, I feel like once you find the right information on whatever compounds you're using, whether that be molar mass, or atomic structure, the questions can be a bit trivial"
"Learning new laws, new properties of certain things, and memorize them all. New things that you need to learn and memorizing those stuff."

Code: Specific Topics and Concepts
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is the most difficult thing: (1) listing off specific topics, particularly with no reference to understanding or the approaches utilized (most common specific topics mentioned include mechanisms, acid-base reactions, learning objectives, naming, synthesis, and spectroscopy); (2) explicit mention of "concepts" without expanding on what they mean (i.e., "understanding concepts" or "knowing a mechanism").
Example quotes:
"The math equations Stoichiometry"
"I would say the most difficult part is learning about ionization energy. It is just hard to grasp that as you increase the atomic number down the row that the radius of the element would decrease."

Code: Student Responsibility
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of the most difficult thing: (1) actions related to what students do in a course, such as taking notes, applying themselves, finishing work, and more; (2) putting in effort and participating; and (3) learning on their own outside of class and being accountable.
Example quotes:
"I would say that it is the trying to keep up with all the reading. It just would forget what need to read before classes but the check list helped"
"Time management but that on me though. I accidentally missed a few homework assignments because I got lazy."

Code: Quiz Aspects
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of the most difficult thing: (1) general mention of the quizzes being most difficult; (2) discussion of the format of the quizzes; (3) discussion of the grading approach to the quizzes.
Example quotes:
"The most difficult thing for me would be the writing portion of exams. I have a hard time explaining my way of thinking and putting it down on paper. It seems like it's difficult to use my words and images to represent my answer."
"The quizzes and test. Personally i do not feel like there is enough practice. In class we usually just talk about something once and its moves onto the next right after."

Code: Course Aspects and Materials
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of the most difficult thing: (1) focusing on course organization, difficulty, policies, or other aspects related to the course itself; (2) focus on specific course materials, such as homework, lecture notes, or recitation worksheets.
Example quotes:
"The Aleks definitely. It brings down my score and is just hard to finish because once you're stuck that's it. Especially since I never took chemistry in highschool."
"The fact that assignments in beSocratic are often due at 9:00 a.m. on the day of class, since just by glancing at the due-date you might be deceived into thinking you have until the end of that day to do the assignment, when to your horror, you will find out upon logging into beSocratic after class that that isn't the case. This has proven true of a few of the recent homeworks in beSocratic for me."

Code: Online and Accessibility
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of the most difficult thing: (1) being online; (2) accessibility of getting help in the course; (3) access to materials.
Example quotes:
"it being online hard to get help with quick questions which end up being on the quizzes"
"The most difficult thing for me is just focusing with online classes. I know I would be doing better if it was in person and that is really frustrating for me. It's not necessarily the material that is hard for me, but the motivation to learn the material when in the online environment. I mean exactly what I said above"

Code: Workload and Pace
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of the most difficult thing: (1) amount of material; (2) pace of course; (3) feeling overwhelmed.
Example quotes:
"The workload. I struggle with the amount of work while being online, however think this amount of work is efficient in in-person classes"
"Keeping up with the pace of the class The pace can sometimes be a little rough or rigorous"

Code: Generalities
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of the most difficult thing: (1) the total absence of any chemistry in the response; (2) using generic and unclear descriptors for thinking such as thinking critically, conceptually, creatively, thoroughly, or differently, particularly if the student does not expand on what they mean; (3) stating general facts about content such as "reactants go to products".
Example quotes:
"The most difficult thing about CEM 141 is the scientific thinking aspect of it. You must be really thorough and leaving out small details could be a big deal."
"Just in general, the critical thinking required for the class. It's just a lot of difficult concepts. There are a lot of concepts that are really hard to understand such as Coulomb's Law and even the equations used when understanding that light is a wave."

Code: Not Applicable
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is the most difficult thing: (1) the response does not answer the question; (2) when a student provides no response at all; (3) when the response is unclear or interpretation of the response is difficult, such as when students say "understanding the material"; (4) when the response falls into no other category; (5) when a student is venting about the course, professor, or other aspects relevant to the course.
Example quotes:
"I think the most difficult thing for me I r"
"I feel like nothing is really difficult nothing is difficult"

Similar codes such as "Apply and Reason" and "Identify and Describe" make an appearance here, along with others. However, new codes emerged, such as "Online and Accessibility", which was used for responses that discussed the difficulties of taking a course online during the COVID-19 pandemic. Yet, the most difficult thing identified by many of the students was the quizzes used in the course.

Figure 5.3. Breakdown of responses to question 2 for Fall 2020 CEM 141

In the case of this question, there was an uptick in the number of responses describing aspects of the quizzes as the most difficult aspect. However, statistical tests will be conducted for the superordinate themes, which are mentioned later. It is worth noting the "Generalities" code: students appear to have used less generalized language to describe their perception of the most difficult thing in CEM 141. While the majority of students in question #1 perceived that they are expected to apply and reason, more students were saying that the quizzes were the most difficult aspect of the course. It could be that the quizzes encourage students to engage in applying and reasoning, but that cannot be determined with this data set. Although the results here are comparable to previous data collections in CEM 141, CEM 142, and organic chemistry, I was still encountering the same issue of having many responses allocated to categories that told me information I already knew or that was not as insightful (particularly given this is not a comparative analysis). Throughout the perceptions project, the "most difficult aspects" question has had the most issues and has, in many cases, provided the least information regarding the culture of learning.
This finding, however, led to additional scaffolding that I piloted in organic chemistry, which will be discussed in more detail in the next chapter (Chapter VI).

Assessment

The "assessment" question asked students: "How would you describe what is assessed in CEM 141?" with a follow-up question asking students to provide an example. The codebook for this question is shown in Table 5.7. Once again, there are similarities between the codebook for this question and the previous study (Chapter IV).

Table 5.7. Codebook for question 3: Perceptions of what was assessed

Code: Apply and Reason
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is assessed: (1) understanding "why" a reaction proceeds; (2) the use of knowledge, specifically with the use of fundamental or basic ideas; (3) the transfer of knowledge to new problems; (4) making connections between concepts or piecing concepts together, especially to apply them; (5) making predictions to solve a problem.
Example quotes:
"In CEM 141, we are assessed based on the how we can apply basic chemistry knowledge to different laws and theories. It is our ability to solve application based questions in recitation/quizzes using our predisposed knowledge from the lectures. For example, after the classes about atoms and energy, we should be able to tell if an atom in the middle of a field of energy will go towards the high positively charged atom or the weak negatively charged atom. A student in CEM 141 should be able to deduce that this is dependent on the charge of the atom in the middle itself and how strong its charge is."
"we are assessed on ideas and concepts discussed in class, but they are questions to test if you've learned the material or memorized it. generally questions in chemistry involve some sort of why? portion if that's in supporting your answer in a diagram or writing about ways something can cause or effect things around it"

Code: Identify and Describe
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is assessed: (1) understanding the "what" and "how" reactions proceed, particularly without mentioning the use of knowledge to understanding "why" reactions proceed; (2) mentions understanding at the scalar or one scalar below levels, particularly through the recognition of concepts such as polarity and electronegativity and their significance to understanding; (3) when a student explicitly mentions understanding instead of memorization; (4) when a student mentions "differentiating" between reactions with any further explanation; (5) responses include a discussion of forces, charges, or stabilization. NOTE: responses that simply mention "knowledge" or "understanding" do not receive this code.
Example quotes:
"The things assessed in CEM141 are mainly theories based on multiple experiments and observation that scientists have gathered to conclude to specific facts about chemistry and the atomic level of substances. For example. CEM141 covers topics about atoms and how they interact with each other as well the basic qualities of atoms and their subatomic particles with an even deeper dive to how they can act in certain situations."
"The contents of atoms, what makes them, how they react to certain things, and specific characteristics of them. Right now we are learning about ionization energies and atomic radii. We have also learned about how electrons have wave properties."

Code: Only Understand
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is assessed: (1) when a student mentions "understanding"; (2) description of "understanding" as contrasted to repeating rules or practice problems.
Example quotes:
"A basic understanding of chemistry is assessed in CEM 141, as well as your understanding of specific chem related concepts. Quizzes are focused on the fundamentals of chemistry and not on the intricate details of how every single concept works. Each quiz focuses on only a few concepts/topics, but can be difficult when not every topic is discussed in detail because of the assumption that we wouldn't understand."
"Your knowledge and understanding of the material you are asked to read, what is discussed in lecture and examples from recitation. The stuff we go over is what is usually assessed."

Code: Memorization
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is assessed: (1) memorization, remembering, recalling, or regurgitation; (2) knowing content, particularly with no explicit mention of understanding the "what", "how", or "why" a reaction proceeds.
Example quotes:
"Students are assessed by what they have memorized. Tests are fact based. They only contain information that can be found via laws and formulas."
"I would describe it as mostly memory. Lots of the quizzes are based on memorization of facts rather than the ability to complete a problem. For example, on the last quiz I think we had two questions that test our ability to solve a problem and the rest were questions about facts."

Code: Specific Topic
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is assessed: (1) listing off specific topics, particularly with no reference to understanding or approaches utilized; (2) explicit mention of "concepts" without expanding on what they mean.
Example quotes:
"atoms and elements students are assessed on periodic trends and atomic concepts"
"periodic tables, compounds, and sig figs The content during lectures and quizes have been over these three concepts."

Code: Student Responsibility
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is assessed: (1) actions related to what students do in a course, such as taking notes, applying themselves, finishing work, and more; (2) putting in effort and participating; and (3) learning on their own outside of class and being accountable.
Example quotes:
"We are assessed on work ethic Besocratic grades off of work ethic and quality"
"Our participation and the amount of work we put in to the class. For example, if we don't participate or put work towards CEM 141, our grades will not be the best. During our recitations, if we don't put effort into our assignments, we will not succeed very well."

Code: Quiz Aspects
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is assessed: (1) information regarding the quizzes; (2) the format of the quiz, such as stating the types of questions on the quiz (i.e., multiple-choice or short answer); (3) the length, time, or fairness of the quizzes.
Example quotes:
"It is very difficult The quizzes are very difficult with very detailed questions"
"The assessments are a series of multiple choice questions followed by several short answer response questions. It is very helpful that our professors allow us to do open note, open book. Quiz 3 consisted of 10 multiple choice questions and an 8 part short answer question."

Code: Course Aspects and Materials
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what was assessed: (1) focusing on course organization, difficulty, policies, or other aspects related to the course itself; (2) focus on specific course materials, such as homework, lecture notes, or recitation worksheets.
Example quotes:
"I would say the lectures are most of what we are assessed by. I feel like most the information that we need from this class are based on the lectures. But attending the lectures helps because you cant know all the information by just looking at the slides."
"Reading the textbook (CLUE), beSocratic assignments, watching occasional short videos, and the mock quizzes. I feel like the workload is not too bad or too heavy, and I feel like studying for the quizzes are the most time consuming aspect."

Code: Generalities
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is assessed: (1) the total absence of any chemistry in the response; (2) using generic and unclear descriptors for thinking such as thinking critically, conceptually, creatively, thoroughly, or differently, particularly if the student does not expand on what they mean.
Example quotes:
"Knowledge on the unit that is being tested over. Each quiz is about the unit that was taught prior to the quiz."
"Our critical thinking is assessed in CEM 141. The short answer quizzes definitely require us to think critically and be specific in our answers."

Code: Not Applicable
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is assessed: (1) the response does not answer the question; (2) when a student provides no response at all; (3) when the response is unclear or interpretation of the response is difficult, such as when students say "your understanding of the material"; (4) when the response falls into no other category; (5) when a student is venting about the course, professor, or other aspects relevant to the course, such as frustrations they have with the course or professor.
Example quotes:
"I think f"
"I don't know I don't know"

Here we see some prominent codes in "Apply and Reason" and "Course Aspects and Materials". It can also be noted that the "Generalities" category was fairly low, similar to question #2.

Figure 5.4. Breakdown of responses to question 3 for Fall 2020 CEM 141 (N=1894)

Between the mid-semester and the end-of-semester data, responses shifted away from listing specific topics to using more generalized language. Similar to the other questions, statistical results will be reported for the superordinate themes instead of the disaggregated categories. According to this question, approximately a quarter or more of the students perceive they are assessed on "Apply and Reason", while approximately 20% of the students said they are assessed on a variety of course materials or made reference to how assessment factors into their grades (the "Course Aspects and Materials" code). With the "Expectations of Thinking", "Most Difficult Thing", and "Assessment" questions analyzed, it does appear that this approach to gathering student perceptions is more informative and will likely be used moving forward.
It is promising to see a larger share of responses perceiving they are assessed on the use of their knowledge, which implies student perceptions are aligning with the transformational intent of CLUE. Although I noted a major difference in “Specific Topic” and “Generalities” for the CEM 141 data when compared to the previous CEM 252 study, it is difficult to ascertain why without further information.

Grading

This question asked students: “In your own words, what do grades signify in CEM 141?” and is referred to as the “Grading” question. The codebook is shown below in Table 5.8. This question was not asked of CEM 252 students.

Table 5.8. Codebook for question 4: Perceptions of what grades signify

Code: Apply and Reason
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what grades signify: (1) understanding “why” things happen; (2) the ability to use knowledge, specifically with the use of fundamental or basic ideas; (3) the ability to transfer knowledge to new problems; (4) the ability to make connections between concepts or piecing concepts together, especially to apply them; (5) the ability to make predictions to solve a problem.
Example quotes: “I think from the things that are being assessed, grades represent how hard we are learning in this class and how much we're able to apply when we need to explain them to somebody else.” / “Grades signify not our intelligence, but what we learned and how well we can apply that information to familiar (or slightly unfamiliar) situations.”

Code: Identify and Describe
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what grades signify: (1) understanding “what” and “how” reactions proceed, particularly without mentioning the use of knowledge to understand “why” reactions proceed; (2) mentions that grades signify understanding at the scalar or one scalar below levels, particularly through the recognition of concepts such as polarity and electronegativity and their significance to understanding; (3) when a student explicitly mentions understanding instead of memorization; (4) when a student mentions “differentiating” between reactions with any further explanation; (5) responses include a discussion of forces, charges, or stabilization. NOTE: responses that simply mention “knowledge” or “understanding” do not receive this code.
Example quotes: “Grades in CEM 141 represents your knowledge and understanding of the topic on atoms, molecules, chemical equations, relationships, forces, and subatomic particles. The grade often reflects how hard and focused you are in CEM 141 in understanding a topic learned. If your grade is not well, then it means that you aren't paying enough attention or not asking enough questions. It means that you're not trying hard enough to actually understand the course material.” / “Since the homework assignments are not graded on accuracy, it gives me the impression that what you learn in this class is more important than just memorizing. Grades in CEM141 are just a guide point to how well you understand and can explain simply phenomenons. It is more important to the Chemistry Department at Michigan State University that kids are being able to learn chemistry and enjoy it rather than focus strictly on what they have to do to get a good grade.”

Code: Effort and/or Understanding
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what grades signify: (1) the effort or participation of the student; (2) a general mention of “understanding”. NOTE: these were combined into one code because they often co-occurred (similar to Memorization and/or Workload in the CEM 252 study).
Example quotes: “your understanding of the chapters and how much effort you put in during the besocratic homework and the recitation worksheets.” / “Grades signify hard work, dedication, and an understanding of the material in CEM 141. I think if you listen and take good notes in class, complete all of the homework and recitation assignments with full effort, and put full effort into Learning Objectives, a good grade can be achieved in CEM 141. I have felt really good about the course so far because I put in 100% effort and I am happy to see that it has been reflected in my grade.”

Code: Memorization
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what grades signify: (1) memorization, remembering, recalling, or regurgitation; (2) knowing content particularly with no explicit mention of understanding the “what”, “how”, or “why” a reaction proceeds.
Example quotes: “If you are able to memorize concepts.” / “In my opinion, grades signify our ability to regurgitate material on the fly.”

Code: Inaccuracy of Grades
Dimensions: Student responses that primarily focused on mentioning the general inaccuracy of grades.
Example quotes: “I do not think it accurately represents what we understand at students.” / “Grades do not signify anything in CEM141. The only thing a grade can really signify is a students ability to answer questions the way someone may want them answered. It does not show someone’s true abilities as many things can be lost in a way someone is graded. Grades signify a students ability to memorize and achieve the standards set by a professor.”

Code: Quiz Aspects
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what grades signify: (1) information regarding the quizzes; (2) grades being determined by quiz grades.
Example quotes: “I think grades are important, but I think the grading in CEM 141 with 60% of our grades coming from quizzes is a lot.” / “How well you can perform on tests and quizzes. How much knowledge you can remember.”

Code: Course Aspects and Materials
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what grades signify: (1) focusing on course organization, difficulty, policies, or other aspects related to the course itself; (2) focus on specific course materials, such as homework, lecture notes, or recitation worksheets.
Example quotes: “The grades that most matter are ALEKS, HW, and Quizzes.” / “CEM 141 grading process makes sense it contains a beginning project, homework, group work, short answer quizzes and quizzes. I am happy there are no really difficult assignments that needs to be covered and there are only quizzes instead of the addition of tests. I guess what the grade means is if you did poorly on a quiz, you could revisit the lessons of that chapter. As well there is no end year exam as well or mid-term exam just quizzes.”

Code: Generalities
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what grades signify: (1) the total absence of any chemistry in the response; (2) using generic and unclear descriptors for thinking such as thinking critically, conceptually, creatively, thoroughly, or differently, particularly if the student does not expand on what they mean.
Example quotes: “In CEM 141, grades signify how well a student is able to not only take in the given information but how well they can think critically and conceptually.” / “Grades signify not only your understanding, but your ability to think outside the box in order to solve problems we haven’t fully discussed in class.”

Code: Not Applicable
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what grades signify: (1) the response does not answer the question; (2) when a student provides no response at all; (3) when the response is unclear or interpretation of the response is difficult, such as when students say “your understanding of the material”; (4) when the response falls into no other category; (5) when a student is venting about the course, professor, or other aspects relevant to the course, such as frustrations they have with the course or professor.
Example quotes: “good,excellent.” / “I feel like my grade in this class could be better I just need to keep progressing on the quizzes.”

This was the first (and only) time this perception question was asked of students. The distribution of perceptions is shown below in Figure 5.5.

Figure 5.5. Breakdown of responses to question 4 for Fall 2020 CEM 141 (N=1894)

Similar patterns can be noted between the mid-semester and end-of-semester data. As can be seen in Figure 5.5, the majority of students perceived that grades signified effort and/or understanding. I combined the “effort” and “understanding” responses because many students mentioned both in the same response, making it difficult to disentangle the two codes. Many of the responses included in this category discussed the relationship of effort to understanding, often stating that the more effort put into a course, the more understanding a student will have. That is, this could indicate that students perceive both effort and understanding as being represented in their grades in CLUE. The “Effort and/or Understanding” category largely overshadowed all other categories. This result is not too surprising considering that many of the course points in CEM 141 are devoted simply to participation in and completion of formative assessments and that students are consistently encouraged to explain the underlying features of chemical phenomena. These findings may provide some evidence for the idea that point allocations and grade distributions send strong messages to students about what is valued, such as practice. However, more work would need to be done to untangle how students associate effort with understanding.

Superordinate Themes

Similar to the organic chemistry data in Chapter IV, I organized the findings from Fall 2020 CEM 141 into three superordinate themes of “Use of Knowledge”, “Rote Knowledge”, and “Other” (question #2 has an additional theme, “New and Contextual”, that I describe below).
This reorganization can be found in Figures 5.6, 5.7, and 5.8 for the expectations of thinking, most difficult thing, and assessment questions, respectively. The grading question was not included in this reorganization into superordinate themes simply because the “Effort and/or Understanding” category was so large on its own and because it was difficult to determine how this category could be reorganized into our current superordinate themes.

Figure 5.6. Breakdown of Fall 2020 CEM 141 responses to question 1 into superordinate themes

From this data, it can be noted that the majority of students in CEM 141 in Fall 2020 perceived they were expected to use their knowledge (which included the “Apply and Reason”, “Identify and Describe”, and “Only Understand” codes). Treated as a whole, the comparison between mid-semester and end-of-semester responses was found to be not statistically significant (χ² = 4.629, df = 2, α = 0.05, p = 0.099); that is, there was no significant difference in the overall perceptions from mid-semester to the end of the semester.

Figure 5.7. Breakdown of Fall 2020 CEM 141 responses to question 2 into superordinate themes

For question #2, I begin by highlighting the “New and Contextual” theme, which captures new categories for this question that I had not previously noted. In this case, the theme captures responses categorized as “Online and Accessibility” and “Workload and Pace”; that is, students in this theme discussed how the online aspects of the course, or the workload and pace, were the most difficult. These two categories were not noted in previous CEM 141 or 142 pilots (though Workload and Pace had been noted for CEM 252 data). As expected for this question, most responses fell into the “Other” theme (which included the “Student Responsibility”, “Quiz Aspects”, “Course Aspects and Materials”, “Generalities”, and “Not Applicable” codes). Unlike with question #1, the results here were found to be statistically significant (χ² = 39.894, df = 3, α = 0.05, p < 0.001). A look at the residuals shows that “Rote Knowledge”, “New and Contextual”, and “Other” were the driving forces of the significance; that is, these themes contributed most to the statistically significant result, while “Use of Knowledge” was not found to be a significant driver. Despite the significant result, the effect size was small, with a Cramer’s V of 0.106.

Figure 5.8. Breakdown of Fall 2020 CEM 141 responses to question 3 into superordinate themes

For the assessment question, a considerable number of students did perceive they were assessed on their ability to use their knowledge. In the case of this question, the rote knowledge theme grew considerably due to the addition of the “Specific Topics” category into the theme. As a reminder, “Specific Topics” is when a student perceives they are assessed on a particular topic rather than a way of thinking. However, at the end of the semester, we do see a shift away from this theme. The “Other” theme is still large for this question due to many students perceiving they were assessed on materials presented in class, which was a category included in this theme. However, we have run into the same issue as with previous questions (and datasets): a large number of students were allocated to categories that tell us little about underlying cultures of learning and how those cultures influence knowledge use.
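For readers unfamiliar with the statistics reported here, the sketch below shows how a chi-square test of independence, standardized residuals, and Cramer's V can be computed in Python. The counts are hypothetical and this is a minimal illustration, not the analysis code used in this work:

```python
# Illustrative sketch: chi-square test on mid- vs. end-of-semester theme
# counts, with standardized residuals and Cramer's V. Counts are hypothetical.
import numpy as np
from scipy import stats

# Rows: time points (mid, end); columns: superordinate themes
# ["Use of Knowledge", "Rote Knowledge", "New and Contextual", "Other"]
counts = np.array([[820, 95, 210, 769],
                   [845, 52, 280, 717]])

chi2, p, df, expected = stats.chi2_contingency(counts)
residuals = (counts - expected) / np.sqrt(expected)  # standardized residuals
n = counts.sum()
cramers_v = np.sqrt(chi2 / (n * (min(counts.shape) - 1)))

print(f"chi2 = {chi2:.3f}, df = {df}, p = {p:.3g}, Cramer's V = {cramers_v:.3f}")
print(residuals)  # cells with |residual| > ~2 drive the significance
```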
Similar to the “Most Difficult Thing” question, the results for this question were also found to be statistically significant between the middle and end of the semester (χ² = 73.477, df = 2, α = 0.05, p < 0.001). “Use of Knowledge” and the “Other” theme were found not to be drivers of significance for this test. Instead, “Rote Knowledge” was the most significant driver of this result according to the residuals (5.5 and -5.8 for the mid and end of semester, respectively). However, a look at Cramer’s V shows this was a small effect size (0.144).

Stability of Perceptions in CEM 141

Observations

To better address the stability of perceptions question (research question #4), I decided to conduct observations of the Zoom classroom to determine whether the classroom environment remained consistent over time and to gather additional insight into how online instruction occurred in this course, allowing me to further comment on how the shift to online learning may have influenced student perceptions. My rationale for the observations was based on the assumption that perceptions would be more stable in more consistent learning environments (at least regarding expectations). Over the course of the year, I attended 11 different class sessions for CEM 141. It is important to note that I had TA’d for CEM 141 in the past, and the structure and pedagogical approach had not changed despite the transition to online instruction. The class was very consistent from class period to class period in that it always began with announcements and what students needed to do before the next class period. Then the instructor always conducted a thorough review of the homework before jumping into new material. Throughout the class session, the instructor would ask students to respond to questions via the polling feature or the yes/no functions on Zoom. If a certain concept required further exploration, the instructor would put the students into breakout rooms. Occasionally, if many students did not answer a polling question correctly, the instructor would meticulously go through the question to model how to think through the problem. The instructor paused often to give students the chance to ask questions; however, students mostly used the Zoom chat feature instead of unmuting and asking their questions verbally. These questions were answered by the instructor, TAs, or other students. These facets of the CEM 141 classroom were noted in each observation and never changed. Therefore, I felt more comfortable concluding that the classroom seemed consistent. Furthermore, the classes were taught by veteran CEM 141 professors who had taught the CLUE curriculum for multiple years.

Question Data

As noted earlier, for many of the codes across questions, the overall code distributions remained fairly consistent in the courses included in the study. For question #1, the differences between the mid- and end-of-semester were found to be not statistically significant. In the case of questions #2 and #3, the findings were statistically significant but had rather small effect sizes. Although the observations revealed a consistent nature to the course, disaggregating the codes to track individual student responses from the middle to the end of the semester revealed a different finding.
As can be noted in Table 5.9, most students changed their response between the mid- and end-of-semester (except in the case of Q7: Grading, which still saw a large proportion of students changing their perception) despite overall trends seeming consistent.

Table 5.9. Mid-semester and end-of-semester perceptions comparison

Q1-2: Expectations of Thinking: 47.0% of responses remained the same; 53.0% changed
Q3-4: Most Difficult Thing: 36.1% remained the same; 63.9% changed
Q5-6: Assessment: 32.6% remained the same; 67.6% changed
Q7: Grading: 53.1% remained the same; 46.9% changed

To better explain the shift in perceptions, Sankey diagrams were used to visualize the flow of responses between the two time points; all of these diagrams are included in the appendix. Unfortunately, this analysis did not yield any additional insights, and in some cases responses seemed to flow between two major categories, such as “Apply and Reason” and “Generalities”. That is, this only pointed us in one direction: the need to spend more time characterizing responses that are vague and/or within the “Generalities” code (a conclusion I have consistently arrived at throughout all pilot studies).

Given the large number of responses that shifted over the course of the semester, some may question whether it is worth our time to study perceptions. Although it was my belief that perceptions would be more stable in more consistent learning environments (hence the observations), the majority of student perceptions changed. However, I did not go to every class session, and I am still uncertain of the exact impacts online instruction has on student perceptions. Given that perceptions are subject to sociocultural influences, I will always expect some change, and it is not always a negative outcome if perceptions change over time. For example, it could be the case that in a specific context, perceptions of expectations of thinking shift from “memorization” to “apply and reason”; it is my belief that instructors would view this as a positive shift. However, work would need to be done to determine whether students are speaking about their experience broadly or only have specific units of instruction in mind. Regardless, even if perceptions change over time, this kind of research may offer a snapshot of the valued ways of participating at a given time, though I acknowledge that additional methods are necessary to gauge the dynamics of the classroom more thoroughly. All of this is to say that though the overall trends showed little to no difference from the mid- to the end of the semester, most students changed their response, and disaggregation of the responses primarily pointed to the need to clarify responses in the “Generalities” category.

The Use of AACR with CEM 141 Data

At this point, we felt that the perception questions, though not perfect, could still be insightful to instructors and departments. However, the limiting factor associated with our approach was the time-intensive analysis of the qualitative responses. Analyzing all of the CEM 141 responses took months of dedicated effort, which would not be feasible for instructors and departments looking to get a quick measure of what students are perceiving in their courses. Therefore, we opted to employ the Automated Analysis of Constructed Response (AACR) machine learning algorithms to automatically code the data (Automated Analysis of Constructed Response, n.d.). In order to use AACR, one must train the algorithms to “pick up” on the codes assigned by the researchers.
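Schematically, the voting-and-agreement procedure described in the next paragraph can be illustrated as below. This is a minimal Python analogue with hypothetical classifiers and responses, not AACR's actual implementation; Cohen's kappa is computed with scikit-learn's standard routine:

```python
# Schematic sketch of majority-vote coding plus a Cohen's kappa check; this
# is NOT AACR's implementation. Classifiers and responses are hypothetical.
from collections import Counter
from sklearn.metrics import cohen_kappa_score

def majority_vote(predictions_per_algorithm):
    """Each inner list holds one algorithm's codes for all responses; the code
    with the most 'votes' per response wins (ties broken arbitrarily here)."""
    return [Counter(votes).most_common(1)[0][0]
            for votes in zip(*predictions_per_algorithm)]

# Hypothetical codes from three trained algorithms for five responses
predictions = [
    ["Apply and Reason", "Generalities", "Memorization", "Apply and Reason", "Other"],
    ["Apply and Reason", "Generalities", "Generalities",  "Apply and Reason", "Other"],
    ["Generalities",     "Generalities", "Generalities",  "Apply and Reason", "Other"],
]
human_codes = ["Apply and Reason", "Generalities", "Memorization",
               "Apply and Reason", "Other"]

machine_codes = majority_vote(predictions)
kappa = cohen_kappa_score(human_codes, machine_codes)  # agreement beyond chance
print(machine_codes, f"kappa = {kappa:.2f}")
```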
In practice, the data must already be coded, and then time must be spent training the algorithms to assign codes that agree with the codebook established by the human coders. Upon coding a dataset, the codebook and responses are fed into the AACR program. AACR then uses a suite of algorithms to analyze the data. Each algorithm in the suite assigns a code to each response, which is treated as a “vote” for a particular code, and the program then officially assigns to each response the code with the most “votes”. After this, agreement between the human codebook and the codes assigned by the program is assessed. In many cases, the model is not perfect, and work must be done to help the model address edge cases and disagreements with the human codebook. To judge whether a model is feasible, AACR uses Cohen’s kappa as one of the major measurements of agreement between the algorithms and the human codebook. Within the social sciences, a kappa value around 0.80 is considered near-perfect agreement. While there are other measures of agreement generated by AACR, I will only discuss the kappa values. Prior to combining the categories into superordinate themes, I noted the following kappa values in the data set.

Table 5.10. Kappa values for CEM 141 data prior to combining into superordinate themes

Question 1 – Expectations of Thinking: 0.77
Question 2 – Most Difficult Thing: 0.66
Question 3 – Assessment: 0.71

That is, the agreement between my codes and the algorithm was close to near-perfect in some cases and substantial overall. Organizing the data into superordinate themes for AACR did not prove productive. For one, there was limited data for the Rote Knowledge theme because many students in CEM 141 did not perceive they had to rely on memorization or other rote knowledge (particularly when compared to the “Use of Knowledge” theme). Although these models would seem feasible, particularly for questions #1 and #3, they would likely not be robust for use with other datasets, considering that the training data consisted primarily of students who fell into the “Apply and Reason” and “Generalities” categories and that only students in transformed courses were represented in this dataset. Considering that the kappa value was lower for Q2 and that the model did not include a diversity of responses to enhance its power, I opted not to train this model further until data from traditional courses in general chemistry were provided. With this said, with enough data, training AACR to automatically code open-ended responses certainly seems possible.

The Culture of Learning in CEM 141

The perception questions have been shown to produce reliable results in courses transformed with three-dimensional learning. That is, many students across these courses (CEM 252 and CEM 141) have consistently perceived that their course requires the use of knowledge while not perceiving an emphasis on rote knowledge. Put another way, these students are largely perceiving the transformational intent of these courses. In the context of the culture of learning, these three-dimensional courses are seemingly sending a message to students that the use of knowledge is a valuable and expected practice, as demonstrated by their responses to questions #1 and #3.
Although the perceptions were not stable from mid-semester to end-of-semester, this instability highlights that care must be taken in interpreting the results and that context is certainly important. When such questions are instituted in a course, students are likely relying on their most recent or most impactful experiences. These experiences are influenced by a variety of social influences (such as interactions with friends, family, instructors, and other peers) and cultural aspects (such as course policies and tasks). However, more work needs to be done to get more information from students to clarify responses so that we can generate more insights into the cultures of learning on a larger scale.

Fall 2020 BS 161: Cell and Molecular Biology Data

At the end of Fall 2020, alongside the CEM 141 data collection, I collected pilot data in BS 161: Cell and Molecular Biology. The students in this course received the same perception questions as the CEM 141 students (minus the grading question). However, this cohort only received the questions at the end of the semester via D2L on their final exam. Although BS 161 had not been fully transformed, some of the ongoing work on the curriculum and pedagogy was informed by three-dimensional learning. Upon analyzing the data, I found trends in student perceptions similar to those in the other transformed-course data. That is, for question #1, many students perceived they were expected to use their knowledge, with the majority also perceiving this was how they were assessed. Furthermore, although the “Generalities” category was smaller for this cohort, it still emerged alongside other categories that were telling me information I already knew. That is, I was still encountering some of the same issues with the perception questions even outside of chemistry courses. The data was also used to examine similarities and differences in student perceptions across different sections of the same course that were taught by different instructors. This data showed some qualitative differences in student perceptions between sections, but more data would be needed to make strong claims about what was happening in these sections. This data is not explored in detail within this chapter but can be found in the appendix. The rationale for this was: 1) the data was largely exploratory; 2) the goal was to see how the perception questions “held up” in other courses, which was already addressed by the CEM 141 data; and 3) nothing else was done with the BS 161 data moving forward; that is, it did not inform additional data collection or analysis.

Final Thoughts

In conjunction with the previously stated research questions of these pilots, it is imperative I note that these pilot studies were meant to: 1) help with question development; 2) explore ways in which the data could be used, such as with section comparisons in BS 161; and 3) serve as exploratory studies to note whether we saw anything interesting, particularly related to the culture of learning. What is clear, however, is that more work needs to be done to address the “Generalities” code, as it arose in all pilots as a major category. However, I would argue there is still power in this approach and its ability to capture student perceptions of classroom cultures. In many cases, we still saw many students perceiving they had to engage in applying and reasoning rather than memorization. This is an important artifact of the classroom culture, and I still believe findings like this in the pilot studies carry research and pedagogical utility.
Furthermore, I remind readers that the use of these questions does not constitute a perfect instrument; instead, it acts as a middle-ground compromise between heavily structured and constrained quantitative instruments and highly open-ended and time-consuming interviews. Of the pilots, the BS 161 data allowed us to illustrate an interesting point: the questions can be used to note differences in student perceptions of instruction across sections of the same course. However, given the lower N values and the changes in how this data was collected, it would be difficult to say much more. It does seem to be the case that there are some differences in student perceptions across sections despite attempts to make the different sections highly aligned. Furthermore, it is worth pointing out that section 3 was taught by an instructor who is trained and knowledgeable in three-dimensional learning. This could have influenced why more of the students in section 3 perceived that they had to engage in the use of their knowledge (i.e., through application and reasoning) more so than the students in the other sections.

REFERENCES

1. Automated Analysis of Constructed Response. (n.d.). https://beyondmultiplechoice.org
2. Becker, N., Noyes, K., & Cooper, M. M. (2016). Characterizing Students’ Mechanistic Reasoning about London Dispersion Forces. Journal of Chemical Education, 93(10), 1713–1724. https://doi.org/10.1021/acs.jchemed.6b00298
3. beSocratic. (2020). Home page. https://besocratic.com/home
4. Bowen, R. S., Flaherty, A. A., & Cooper, M. M. (2022). Investigating student perceptions of transformational intent and classroom culture in organic chemistry courses. Chemistry Education Research and Practice. https://doi.org/10.1039/D2RP00010E
5. Bryfczynski, S. P. (2010). BeSocratic: An Intelligent Tutoring System for the Recognition, Evaluation, and Analysis of Free-Form Student Input.
6. Charmaz, K. (2006). Constructing Grounded Theory: A Practical Guide through Qualitative Analysis. Sage.
7. Cooper, M. M., & Klymkowsky, M. (2013). Chemistry, Life, the Universe, and Everything: A New Approach to General Chemistry, and a Model for Curriculum Reform. Journal of Chemical Education, 90(9), 1116–1122. https://doi.org/10.1021/ed300456y
8. Cooper, M. M., Kouyoumdjian, H., & Underwood, S. M. (2016). Investigating Students’ Reasoning about Acid-Base Reactions. Journal of Chemical Education, 93(10), 1703–1712. https://doi.org/10.1021/acs.jchemed.6b00417
9. Corbin, J., & Strauss, A. (2015). Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory (4th ed.). Sage.
10. Crandell, O. M., Kouyoumdjian, H., Underwood, S. M., & Cooper, M. M. (2019). Reasoning about Reactions in Organic Chemistry: Starting It in General Chemistry. Journal of Chemical Education, 96(2), 213–226. https://doi.org/10.1021/acs.jchemed.8b00784
11. Kararo, A. T., Colvin, R. A., Cooper, M. M., & Underwood, S. M. (2019). Predictions and constructing explanations: An investigation into introductory chemistry students’ understanding of structure–property relationships. Chemistry Education Research and Practice, 20, 316–328. https://doi.org/10.1039/c8rp00195b
12. Matz, R. L., Fata-Hartley, C. L., Posey, L. A., Laverty, J. T., Underwood, S. M., Carmel, J. H., Herrington, D. G., Stowe, R. L., Caballero, M. D., Ebert-May, D., & Cooper, M. M. (2018). Evaluating the extent of a large-scale transformation in gateway science courses.
Science Advances, 4(10), eaau0554. https://doi.org/10.1126/sciadv.aau0554
13. Noyes, K., & Cooper, M. M. (2019). Investigating Student Understanding of London Dispersion Forces: A Longitudinal Study. Journal of Chemical Education, 96(9), 1821–1832. https://doi.org/10.1021/acs.jchemed.9b00455
14. Reinholz, D. L., & Apkarian, N. (2018). Four frames for systemic change in STEM departments. International Journal of STEM Education, 5(1), 1–22. https://doi.org/10.1186/s40594-018-0103-x
15. Schein, E. H., & Schein, P. A. (2016). Organizational Culture and Leadership (5th ed.). Jossey-Bass.
16. Thomas, D. R. (2006). A general inductive approach for analyzing qualitative evaluation data. American Journal of Evaluation, 27(2), 237–246.
17. Underwood, S. M., Reyes-Gastelum, D., & Cooper, M. M. (2016). When do students recognize relationships between molecular structure and properties? A longitudinal comparison of the impact of traditional and transformed curricula. Chemistry Education Research and Practice, 17(2), 365–380. https://doi.org/10.1039/C5RP00217F
18. Williams, L. C., Underwood, S. M., Klymkowsky, M. W., & Cooper, M. M. (2015). Are Noncovalent Interactions an Achilles Heel in Chemistry Education? A Comparison of Instructional Approaches. Journal of Chemical Education, 92(12), 1979–1987. https://doi.org/10.1021/acs.jchemed.5b00619

APPENDIX

Summer 2020 CEM 141 Pilot Data and Discussion

The first pilot study took place in the Summer 2020 CEM 141 course. This was the first time the course was offered entirely online due to the COVID-19 pandemic. Furthermore, the instructor of the course was a graduate student who had never been an instructor-of-record but had been a teaching assistant for CEM 141 many times in the past. The pilot study included 74 students, and the questions were asked in the same order: 1) expectations of thinking, 2) most difficult thing, and 3) assessment.

Expectations of Thinking

The “Apply and Reason”, “Identify and Describe”, “Memorization”, “Generalities”, and “Not Applicable” categories made an appearance in the Summer 2020 CEM 141 data set, and the dimensions of these codes were identical to the organic chemistry codebook descriptions. Here I also noted the emergence of a “Scientific Practice” category, which could have been subsumed into the “Apply and Reason” category as it implied the explicit application of various scientific practices such as modeling. I opted to keep it separate considering this was a pilot, and doing so still allowed me to analyze broad themes across student responses. Nine students offered responses that fell within the “Student Aspects” category, which was an original category for question 1 in the organic chemistry dataset but was eventually subsumed by the “Not Applicable” category. Figure 5.9 shows a breakdown of student responses.

Figure 5.9. Breakdown of student responses to question 1 for Summer 2020 CEM 141 students (N=74)

The majority of responses fell within the “Generalities” category which, similar to the organic chemistry codebook (in Chapter IV), was assigned when responses used vague, generalized language that made it difficult to discern the student’s perception (i.e., when a student says they must use “critical thinking”). Our questions seem to show promise for use in other courses, though they were originally intended for organic chemistry.
For the most part, we see patterns similar to the organic chemistry data presented in Chapter IV, which makes sense since this course has the same curricular and pedagogical underpinnings as OCLUE. Considering that CLUE (the curriculum for CEM 141) was entirely transformed using three-dimensional learning, we see slightly over a quarter of students perceiving that they must use their knowledge in some way. That is, over a quarter of student perceptions are seemingly aligning with the transformational intent of CLUE. However, there was still a large number of responses that were classified as vague and general. One potential explanation is that most of the students who take CEM 141 are freshmen in college and are still developing ways to describe and explain their experiences in courses at the university level. Although it was promising to see many more students perceiving they must use their knowledge than relying on rote knowledge, a pattern we also noted with OCLUE students in organic chemistry, this data highlights that more work was needed to further clarify the “Generalities” category, especially since we do not have a large enough comparison group at MSU. Regardless, it does illustrate that the expectations of thinking question could be used in other courses with the same or a similar coding scheme.

Most Difficult Aspects

For question 2, “Apply and Reason”, “Identify and Describe”, “Specific Topic”, and “Memorization” all emerged in this data, similar to the organic chemistry data in Chapter IV (although “Memorization” is now its own category). The “Scientific Practice” category emerged in this question as well but for only one response. The categories of “Exams, Quizzes, and Course Aspects” and “Student Aspects” were assigned when responses focused on discussing exams, quizzes, or the course overall, or on what behaviors students must engage in to be successful in the course. The “Student Aspects” category also emerged in the first question for the Summer 2020 CEM 141 data, and it is related to the “Personal, Course, and/or Exam Aspects” category in the organic chemistry data. The data is presented in Figure 5.10.

Figure 5.10. Breakdown of student responses to question 2 for Summer 2020 CEM 141 students (N=74)

Once again, most responses fell into categories that did not provide much insight into student perceptions of ways of thinking, or the responses were telling us what was already known about the course (such as the response saying that exams are multiple choice and short answer). We ran into this issue with the organic chemistry data in Chapter IV as well; however, there this question regained analytical value through the comparison between OCLUE and traditional organic chemistry, a comparison not possible here. There were some similarities between this data and patterns in the OCLUE data. However, one major deviation is the number of responses discussing exam, quiz, and course aspects. Once again, it is important to recall that this was an online summer course that moves rather quickly. Therefore, it is perhaps not so surprising that students felt the most difficult aspects were their assessments.

Assessment

The reemergence of the “Apply and Reason”, “Scientific Practice”, “Specific Topic”, “Memorization”, “Exam Format and Aspects”, “Student Aspects”, and “Generalities” categories was noted for this question, with dimensions identical to the organic chemistry data and previous categories in the Summer 2020 CEM 141 dataset. This question saw the emergence of a “Fundamentals” category, where the student response was focused on detailing the use of fundamental concepts; however, in a larger dataset it is likely this category would be subsumed by the “Apply and Reason” category. The category “Formative” captured student responses that described the formative nature of assessment and the value placed on practice. The “Course Materials” category was also in the organic chemistry dataset, but it was eventually subsumed by the “Not Applicable” category because it references which course materials were most referred to on assessments (i.e., lecture notes, homework, etc.). The breakdown of responses to this question can be seen in Figure 5.11.

Figure 5.11. Breakdown of student responses to question 3 for Summer 2020 CEM 141 students (N=74)

As noted for the previous questions, many responses were categorized in ways that do not provide much information in terms of student perceptions of ways of thinking, or the responses simply state what is already known about the course. We see that many students in the dataset perceive they are assessed on the use of knowledge, which provides some evidence that students in CEM 141 have perceptions that align with the transformational intent of the course. This is further corroborated by the lower number of responses in rote knowledge categories like “Memorization” and “Specific Topic”. Regardless, there are still many categories here (with many responses in them) that do not provide much information with regard to how students are thinking. To address this issue, I attempted to develop additional scaffold questions to help students explain their responses in detail, to eliminate vague, general wording, and to minimize the number of responses assigned to categories that are not providing much information. The new questions were asked of the Summer 2020 CEM 142 group of students.

Summer 2020 CEM 142 Pilot Data and Discussion

Similar to the Summer 2020 CEM 141 dataset, the Summer 2020 CEM 142 data was a pilot study. The main difference was the use of the scaffolded questions noted earlier in the chapter. There were 76 students in this pilot study, and the students were asked perception questions similar to those given to the organic chemistry and Summer 2020 CEM 141 students, with the addition of the scaffold questions. The scaffold questions were not analyzed in any detail, as their sole purpose was to clarify the responses to the perception questions.

Expectations of Thinking

The “Apply and Reason”, “Generalities”, “Course Aspects”, “Student Actions”, “Identify and Describe”, and “Memorization” categories had previously emerged in the other data sets in some form. The “Memorization is Not Enough” category would likely be subsumed by the “Identify and Describe” category in the other data sets, and the “Opportunity” category was used when student responses focused on the number of opportunities available for students to show what they have learned or to practice. The results for this question are shown in Figure 5.12.

Figure 5.12. Breakdown of responses to question 1 for Summer 2020 CEM 142 (N=76)

Even with the scaffolded questions that sought to minimize the “Generalities”, “Course Aspects”, and “Student Actions” categories, these categories still emerged (particularly “Generalities”). This potentially indicates that the scaffolded questions may not be effective at clarifying responses to the questions.
However, it is promising to see that around 40% of students perceived they had to use their knowledge. That is, a large number of students in this CEM 142 course have perceptions that align with the transformational intent. Furthermore, very few students discussed memorization in the context of this course. Similar to my previous work in organic chemistry and CEM 141, however, there were many responses in the “Generalities” category, indicating the scaffold questions were not as effective at providing more insight into these responses.

Most Difficult Aspects

For the second question, “Student Actions”, “Course Aspects”, “Exam and Quiz Aspects”, “Specific Topic”, “Apply and Reason”, and “Memorization” all made reappearances. No new codes were generated, and the results can be seen in Figure 5.13.

Figure 5.13. Breakdown of responses to question 2 for Summer 2020 CEM 142 (N=76)

The majority of responses to this question fell into categories that provided information that was already known or not useful for eliciting perceptions of ways of thinking, even though scaffolded questions had been provided to minimize the number of responses in “Student Actions”, “Course Aspects”, and “Exam and Quiz Aspects”. Despite the scaffolding, those were the top three categories for this question, and I noted patterns similar to those described above. Typically for the “most difficult aspects” question, we note a drop in categories like “Apply and Reason” and see increases in the more generalized categories like “Course Aspects”. That is, the scaffold questions did not seem to help minimize the number of responses in categories that do not provide much information.

Assessment

Finally, for the third question, “Apply and Reason”, “Generalities”, “Identify and Describe”, “Course and Exam/Quiz Aspects”, “Specific Topics”, and “Memorization” all appeared. The “Solution-Centered” category was used to capture a couple of responses that referred to the solution-centered nature (i.e., being right or wrong) of the class. The data for this question is shown in Figure 5.14.

Figure 5.14. Breakdown of responses to question 3 for Summer 2020 CEM 142 (N=76)

Similar to questions 1 and 2 for this dataset, many responses were still categorized as “Generalities” or placed in other categories that did not elicit useful information, despite the use of scaffold questions. However, I noted that the majority of students in CEM 142 perceived that they were expected to use their knowledge on their assessments, which implies that their perceptions align with the intent of the transformation. Even so, I still encountered many generalized, vague responses that were difficult to categorize despite having scaffold questions specifically targeting some of these types of responses. That is, the scaffold questions did not seem to help. Furthermore, throughout the analysis of this data set, I noted many students being confused about the scaffold questions and what they were asking. I also believe that the use of the scaffold questions overcomplicated the data collection and analysis procedure and may have induced question fatigue in students.

The Culture of Learning in Summer 2020 CEM 141 and CEM 142

Similar to the organic chemistry data, it does seem that many students in CEM 141 and CEM 142 have perceptions that align with the transformational intent. That is, many students in the transformed courses of CEM 141 and CEM 142 seemingly perceive that they are expected to use their knowledge and are assessed accordingly.
I noted patterns across the data similar to those in the organic chemistry OCLUE data (Bowen et al., 2022). Considering that both courses are transformed with three-dimensional learning, this is promising. The underlying culture of learning for CEM 141 and 142 does seem to be sending the message that the use of knowledge is valued and that memorization (and rote knowledge) will not be sufficient to succeed in the course. These messages are supported and emphasized by how students practice in the course and how they are assessed (Reinholz & Apkarian, 2018; Schein & Schein, 2016). There are some differences, however, in content, context, and patterns of responses for some categories (when compared to the OCLUE data). To begin, this data was collected from a summer cohort of students, and the course was offered online due to the COVID-19 pandemic. Although we can hypothesize and speculate with regard to comparisons to the CEM 252 study in Chapter IV, we must be careful given the differences in context. Furthermore, I saw some differences in student perceptions, notably the number of people whose responses were allocated to the “Exam, Quiz, and Course Aspects” category, which typically included data that discussed the format or difficulty of exams and quizzes or the course overall, or discussed general policies of the course (information I already knew). The scaffolding did not seem to help clarify the responses. Therefore, the scaffold questions were removed, and it was decided to move forward with the original analysis questions for the Fall 2020 CEM 141 students.

Student Perceptions of BS 161

The BS 161 data was somewhat different. For one, it did not include the “grading” question asked of CEM 141 students. Furthermore, the structure of the course was different. Although there were five sections of BS 161 in Fall 2020, I only received data from three instructors; therefore, this data does not represent all students who were enrolled in BS 161 at the time of this pilot study. Furthermore, although the data can be disaggregated by code, I opted to present the data comparatively by section, both to assess the questions’ capabilities in this regard and given the differences that could emerge in instruction between different professors.

Expectations of Thinking

Similar to CEM 141, the first question focused on student perceptions of expectations of thinking. The description of each code is the same as those outlined in Table 5.5 for the CEM 141 data. Here, the “Independent Learning and Student Actions” code is the same as “Student Responsibility”. Results and comparisons across sections can be noted in Figure 5.15.

Figure 5.15. Breakdown of responses to question 1 for BS 161

There are some fluctuations of responses across the sections. To begin, section 4 has more students perceiving they are expected to apply their knowledge and reason than the other two sections, notably more than section 5. Across many of the other categories, these differences are somewhat smaller. Given the exploratory nature of this study, no statistical tests were done to assess significance.

Most Difficult Aspects

The second question asked of BS 161 students was aimed at gathering their perceptions of the most difficult part of BS 161. The code descriptions are the same as those detailed in Table 5.6 for CEM 141. Here, “Assessment Aspects” corresponds to “Quiz Aspects” and “Student Aspects” corresponds to “Student Responsibility”. The results and section comparisons are included in Figure 5.16.
Figure 5.16. Breakdown of responses for question 2 for Fall 2020 BS 161

Once again, students in section 4 had perceptions aligned more with the “Apply and Reason” code. Students in section 3 spoke more to the format and types of assessments used. Finally, students in section 5 perceived that the workload and pace of the course were more difficult than did students in the other sections. Once again, these numbers are relatively small, but there do seem to be some differences in response patterns across sections.

Assessment

The final question asked of students focused on what they were assessed on. The codebook for this question is the same as Table 5.7 for CEM 141. The results and section comparisons are included in Figure 5.17.

Figure 5.17. Breakdown of responses to question 3 in Fall 2020 BS 161

For many of the responses, there seem to be relatively similar percentages in each category, with some minor exceptions. For example, more students in section 4 perceived they were assessed on course materials than students in the other sections. However, it should be noted that the instructors did use relatively common exams.

Fall 2020 BS 161 Superordinate Themes

Similar to the organic chemistry data in Chapter IV and the Fall 2020 CEM 141 data, I opted to reorganize the data from BS 161 into superordinate themes. Instead of separating the data into sections, I decided to aggregate all of the sections together to get a view of what BS 161 students were perceiving overall. The reorganization of data into superordinate themes can be noted in Figures 5.18, 5.19, and 5.20.

Figure 5.18. Breakdown of Fall 2020 BS 161 responses to question 1 into superordinate themes

By aggregating the responses together, it can be noted that a large majority of BS 161 students perceived they were expected to use their knowledge in the course. The categories included here were “Apply and Reason” and “Identify and Describe”. Furthermore, a very small proportion of students perceived they had to use more rote knowledge (which included the “Memorization” category). All other responses were included in the “Other” theme. In the case of BS 161, we see a large majority of responses falling into “Use of Knowledge” with less than a quarter falling into the “Other” category.

Figure 5.19. Breakdown of Fall 2020 BS 161 responses to question 2 into superordinate themes

As is typical for this question, the majority of responses fell into the “Other” theme, which captured responses such as “Assessment Aspects”, “Student Aspects”, “Generalities”, and others. This pattern is similar to data collected from other courses, with the majority of students perceiving that their exams/assessments are the most difficult aspect of the course.

Figure 5.20. Breakdown of Fall 2020 BS 161 responses to question 3 into superordinate themes

Finally, most students perceived they were assessed on their ability to use their knowledge. There is still a considerable number of students within the “Other” theme because their responses included aspects related to the course, including what materials were included on exams. All in all, similar patterns were noted between BS 161 and CEM 141. Given that BS 161 had recently undergone transformation to incorporate more three-dimensional learning, this is promising.

The Use of AACR with the BS 161 Data

Similar to CEM 141, some categories in this data set did not have many responses. Therefore, AACR struggled to code the data with any accuracy.
Before combining into superordinate themes, the kappa values were exceptionally low, as indicated in the table below.

Table 5.12. AACR kappa values prior to combining into superordinate themes

Question 1 – Expectations of Thinking: 0.23
Question 2 – Most Difficult Thing: 0.59
Question 3 – Assessment: 0.38

Although we had more diversity of responses, we had a lower number of responses overall here. Considering the open-ended nature of the questions, AACR struggled to pick up on any lexical patterns in the data. For example, although many students would use similar terminology to describe their experience, it was possible for two responses to use entirely different terminology yet still be coded into the same theme given the broad nature of our coding. Therefore, the algorithms struggled with coding responses. I attempted to combine the data into the superordinate themes to determine if we saw any improvements in kappa values. The results are presented in the table below.

Table 5.13. AACR kappa values after combining into superordinate themes

Question 1 – Expectations of Thinking: 0.51
Question 2 – Most Difficult Thing: 0.42
Question 3 – Assessment: 0.63

As can be seen, there was not much improvement in the kappa values. Once again, considering the open-endedness, the low number of responses (for AACR), and the complexity and diversity of responses, the lexical analytic procedures of the AACR suite of algorithms struggled to locate patterns in the responses. Upon closer examination, many students discussed their perceptions in the context of certain assignments or learning tasks in the course. That is, many responses used the same language (i.e., discussing the same assignments or tasks), but their perceptions may have differed: one student may have perceived they had to use their knowledge to navigate a learning activity, while another perceived the activity included specific topics or relied on other course materials (which would have been coded differently). That is, AACR struggled when students used the same language but the underlying meaning of their responses differed. Given that AACR relies primarily on the prevalence of certain words and phrases, this could be problematic for the use of AACR with some courses.

The Culture of Learning in BS 161

The data seems to imply that the culture of learning, as perceived by many students, is sending the message that the application of knowledge is important. More importantly, this pilot showcases that the questions can be transferred to other disciplines and used to make section comparisons across different instructors. Across some of the categories, there were similarities in student responses; however, students in section 3 perceived that “applying and reasoning” was expected. That is, the overarching culture seems to be consistent across these sections, at least at the end of the semester. These findings are certainly preliminary, and more work needs to be done.

Sankey Diagrams for Fall 2020 CEM 141

With regard to research question #4, the overall patterns of perceptions from the middle of the semester to the end of the semester seemed consistent. However, when I analyzed the data at a finer grain size, I noted that most students were changing their perceptions between the two time points. In order to see where responses were moving, I generated Sankey diagrams. As I noted above, this did not provide much insight.
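For illustration, the sketch below shows one way such a flow diagram can be assembled from paired mid- and end-of-semester codes. Plotly and all data here are assumptions made for the example, not necessarily the tooling actually used for the diagrams in the appendix:

```python
# Illustrative sketch: build one question's mid-to-end-of-semester Sankey
# diagram from paired codes. plotly is an assumption; the data are made up.
import plotly.graph_objects as go

# Hypothetical paired codes: (mid-semester code, end-of-semester code)
pairs = [("Apply and Reason", "Apply and Reason"),
         ("Apply and Reason", "Generalities"),
         ("Generalities", "Apply and Reason"),
         ("Generalities", "Generalities"),
         ("Memorization", "Apply and Reason")]

# Count each mid-to-end transition
flows = {}
for mid, end in pairs:
    flows[(mid, end)] = flows.get((mid, end), 0) + 1

labels = sorted({code for pair in flows for code in pair})
index = {label: i for i, label in enumerate(labels)}

# Duplicate the labels so mid-semester and end-of-semester nodes are distinct
fig = go.Figure(go.Sankey(
    node=dict(label=labels + labels),
    link=dict(source=[index[m] for m, e in flows],
              target=[len(labels) + index[e] for m, e in flows],
              value=list(flows.values()))))
fig.write_html("sankey_q1.html")  # hypothetical output file
```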
The largest finding from the Sankey diagrams was that the majority of perceptions were switching between use-of-knowledge and non-responsive themes. The Sankey diagrams are broken up into individual categories (not superordinate themes).

Question 1: Expectations of Thinking

Figure 5.21. Sankey diagram for Apply and Reason in Q1
Figure 5.22. Sankey diagram for Identify and Describe in Q1
Figure 5.23. Sankey diagram for Only Understand in Q1
Figure 5.24. Sankey diagram for Memorization in Q1
Figure 5.25. Sankey diagram for Solution-Centered and Heuristics in Q1
Figure 5.26. Sankey diagram for Student Responsibility in Q1
Figure 5.27. Sankey diagram for Course Aspects and Materials in Q1
Figure 5.28. Sankey diagram for Generalities in Q1
Figure 5.29. Sankey diagram for Not Applicable in Q1

Question 2: Most Difficult Thing

Figure 5.30. Sankey diagram for Apply and Reason in Q2
Figure 5.31. Sankey diagram for Identify and Describe in Q2
Figure 5.32. Sankey diagram for Only Understand in Q2
Figure 5.33. Sankey diagram for Memorization in Q2
Figure 5.34. Sankey diagram for Specific Topics and Concepts in Q2
Figure 5.35. Sankey diagram for Student Responsibility in Q2
Figure 5.36. Sankey diagram for Quiz Aspects in Q2
Figure 5.37. Sankey diagram for Course Aspects and Materials in Q2
Figure 5.38. Sankey diagram for Online and Accessibility in Q2
Figure 5.39. Sankey diagram for Workload and Pace in Q2
Figure 5.40. Sankey diagram for Generalities in Q2
Figure 5.41. Sankey diagram for Not Applicable in Q2

Question 3: Assessment

Figure 5.42. Sankey diagram for Apply and Reason in Q3
Figure 5.43. Sankey diagram for Identify and Describe in Q3
Figure 5.44. Sankey diagram for Only Understand in Q3
Figure 5.45. Sankey diagram for Memorization in Q3
Figure 5.46. Sankey diagram for Specific Topics and Concepts in Q3
Figure 5.47. Sankey diagram for Student Responsibility in Q3
Figure 5.48. Sankey diagram for Quiz Aspects in Q3
Figure 5.49. Sankey diagram for Course Aspects and Materials in Q3
Figure 5.50. Sankey diagram for Generalities in Q3
Figure 5.51. Sankey diagram for Not Applicable in Q3

Question 4: Grading

Figure 5.52. Sankey diagram for Apply and Reason in Q4
Figure 5.53. Sankey diagram for Identify and Describe in Q4
Figure 5.54. Sankey diagram for Effort and/or Understanding in Q4
Figure 5.55. Sankey diagram for Memorization in Q4
Figure 5.56. Sankey diagram for Inaccuracy of Grades in Q4
Figure 5.57. Sankey diagram for Quiz Aspects in Q4
Figure 5.58. Sankey diagram for Course Aspects and Materials in Q4
Figure 5.59. Sankey diagram for Generalities in Q4
Figure 5.60. Sankey diagram for Not Applicable in Q4

CHAPTER VI: EXPLORATORY STUDIES ON STUDENT PERCEPTIONS OF ORGANIC CHEMISTRY CLASSROOM CULTURES

Introduction

All previous introductory material from Chapter IV applies here. This chapter details exploratory studies in organic chemistry (CEM 252 and CEM 251) at Michigan State University. For the first study in CEM 252, the main impetus was to see if overall student perceptions demonstrated the same trends despite the course being online (unlike the study in Chapter IV, which also took place in CEM 252 but was in-person). This study was only conducted with students in OCLUE in Spring 2021.
The second study, conducted in Fall 2021 with CEM 251 students in OCLUE and traditional organic chemistry, attempted a different scaffolding approach to curtail the number of responses in the “Generalities” category and in other categories that provided information I already knew. Together, these studies offered the opportunity to compare in-person and online CEM 252, as well as CEM 251 and CEM 252 OCLUE students, to see if perceptions change over the course sequence. The research questions for these studies are as follows:

1. How does the shift to online instruction impact student perceptions in CEM 252 compared to in-person CEM 252?
2. How do OCLUE student perceptions in CEM 251 and CEM 252 compare?
3. What occurs if the questions are made less open-ended in an attempt to limit responses in the “Generalities” category?
4. In what ways do student perceptions of valued ways of doing and thinking align with the transformational intent?
5. How do elements of the course culture impact student perceptions of what is valued?

Methods

Participants
The table below details the participants for the studies in Spring 2021 CEM 252 and Fall 2021 CEM 251. All participants in these studies received the perception questions at the end of the semester.

Table 6.1. Participants for Spring 2021 CEM 252 and Fall 2021 CEM 251 pilot studies
Semester | Course | Number of Participants
Spring 2021 | CEM 252 – Organic Chemistry II (OCLUE) | 260
Fall 2021 | CEM 251 – Organic Chemistry I (OCLUE) | 440 (296 for Professor 1, 144 for Professor 2)
Fall 2021 | CEM 251 – Organic Chemistry I (traditional) | 436 (all with Professor 3)

Question Development
For Spring 2021, the perception questions were similar to those asked of the Fall 2020 CEM 141 students (except for the grading question) presented in Chapter V. For convenience, the questions are shown below:

Table 6.2. Perception questions asked to Spring 2021 CEM 252 students
Analysis question #1 (expectations of thinking): “How would you describe the ways students are expected to think in CEM 252?”
Follow-up to Analysis question #1: “In order to help us better understand your response and perspective, please explain your response from question #1 and/or provide an example.”
Analysis question #2 (most difficult thing): “What would you say is the most difficult thing about CEM 252?”
Follow-up to Analysis question #2: “In order to help us better understand your response and perspective, please explain your response from question #3 and/or provide an example.”
Analysis question #3 (assessment): “How would you describe what is assessed in CEM 252?”
Follow-up to Analysis question #3: “In order to help us better understand your response and perspective, please explain your response from question #5 and/or provide an example.”

The rationale behind this organization was discussed in Chapter V. However, for Fall 2021 CEM 251, I changed the questions slightly. As noted, we aimed to decrease the open-endedness slightly to curtail the number of responses that we were unable to categorize because they were too vague or told us information we already knew. Half of the students in OCLUE and traditional organic chemistry received the same questions shown above in Table 6.2. However, the other half received the modified questions below in Table 6.3:
Table 6.3. Modified questions asked to the other half of students in Fall 2021 CEM 251
Analysis question #1 (expectations of thinking): “How would you describe the ways students are expected to think in CEM 251? Please try not to use vague terms such as “critical thinking”, “problem solving”, “systematically”, “analytically”, etc. Instead, provide examples of the types of things you are expected to do and explain in question #2.”
Follow-up to Analysis question #1: “In order to help us better understand your response and perspective, please explain your response from question #1 and/or provide an example.”
Analysis question #2 (most difficult thing): “What would you say is the most difficult thing about CEM 251? Similar to the previous questions, please try not to use vague terms such as “critical thinking”, “problem solving”, etc. Instead, provide examples of what you believe the most difficult aspect(s) of the course is/are, and please explain why this aspect is the most difficult for you in question #4.”
Follow-up to Analysis question #2: “In order to help us better understand your response and perspective, please explain your response from question #3 and/or provide an example.”
Analysis question #3 (assessment): “How would you describe what is assessed in CEM 251? Similar to the previous questions, please try not to use vague terms such as “critical thinking”, “problem solving”, etc. Furthermore, please avoid restating what assessments look like since we already know that information. Instead, provide examples that support your perception of what was assessed in the course in question #6.”
Follow-up to Analysis question #3: “In order to help us better understand your response and perspective, please explain your response from question #5 and/or provide an example.”

The rationale for this scaffolding was informed by our previous work on student perceptions in these courses; I tried to scaffold the questions so that students would not provide ambiguous information or tell me information I already knew (such as what assessments look like). The scaffolding targeted particular categories from previous analyses. For example, for question #1, the “Generalities” category, which included responses where students said they must think “critically” or “systematically”, was a major category, and I sought to target it explicitly. Furthermore, for question #3, I had previously noted that many students would discuss the format of exams, which was information I already knew, so I added scaffolding to potentially curtail those responses. For the CEM 251 students, the goal was to explore the impact of such scaffolding on the open-ended questions, with the aim of determining whether it was possible to limit the number of ambiguous, unclear responses. Recalling the scaffolding from the Summer 2020 CEM 142 pilot, I opted to limit the amount of scaffolding and target only specific categories.

Data Collection
All data collection was done through beSocratic (beSocratic, 2020; Bryfczynski, 2010), as in Chapters IV and V. Once the deadline for data collection passed, I exported all responses from beSocratic to Excel spreadsheets, deidentified them, and then loaded them into MAXQDA for data analysis. As noted in Chapter V, student responses could still be “tracked”, if necessary, by using the numeric beSocratic ID attached to each response.

Data Analysis
All data analysis proceeded in the same way as in Chapters IV and V. That is, I used an inductive thematic approach (Thomas, 2006) coupled with grounded theory methodologies (Charmaz, 2006; Corbin & Strauss, 2015).
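As a concrete illustration of the export-and-deidentification step described under Data Collection, the sketch below shows one way such a pipeline could look. It is a minimal sketch under stated assumptions, not the actual workflow: the file names and identifying columns are hypothetical placeholders, and the only detail taken from the text is that the numeric beSocratic ID is retained so responses can still be tracked.

```python
# Minimal sketch of deidentifying an exported response spreadsheet before
# qualitative coding. File and column names are hypothetical placeholders.
import pandas as pd

raw = pd.read_excel("besocratic_export.xlsx")  # hypothetical export file

# Keep the numeric beSocratic ID so responses can still be "tracked" if
# necessary, but drop directly identifying columns before analysis.
identifying_columns = ["name", "email"]  # hypothetical column names
deidentified = raw.drop(columns=identifying_columns)

# Write the deidentified responses out for import into MAXQDA.
deidentified.to_excel("responses_deidentified.xlsx", index=False)
```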
Results and Discussion

Spring 2021 CEM 252 Organic Chemistry II Data

Expectations of Thinking
This question asked students: “How would you describe the ways students are expected to think in CEM 252?” with a follow-up asking them to provide an example. Categories similar to those in previous studies emerged from this data; therefore, the codebook for this question looks similar to previous codebooks. The codes and descriptions are listed in Table 6.4.

Table 6.4. Codebook for question 1: Perceptions of the expectations of thinking

Apply and Reason
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think: (1) understanding “why” a reaction proceeds; (2) the use of knowledge, specifically with the use of fundamental or basic ideas; (3) the transfer of knowledge to new problems; (4) making connections between concepts, especially in order to apply them; (5) making predictions in order to solve a problem.
Example quotes:
“I would say students are expected to think about why reactions occur rather than just memorize them. An example would be acid/base reactions because there are different mechanisms for different reactions. It's easier to learn why the reaction occurs than try to memorize all of the different reactions”
“Students are expected to think in an orderly, mechanical fashion. We are expected to follow the movement of electrons and how certain functional groups interact with each other in specific ways, which allows us to predict how other reactions will happen, even if we haven't seen them before.”

Identify and Describe
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think: (1) understanding the “what” and “how” reactions proceed, particularly without mentioning the use of knowledge to understand “why” reactions proceed; (2) mentions of understanding at the scalar or one scalar below levels, particularly through the recognition of concepts such as polarity and electronegativity and their significance to understanding; (3) when a student explicitly mentions understanding instead of memorization; (4) when a student mentions “differentiating” between reactions with any further explanation; (5) responses that include a discussion of forces, charges, or stabilization.
Example quotes:
“Students are expected to think mechanistically about things. In class you focus on drawing mechanisms and figuring out how molecules react together. You have to draw out full reactions, but to do that you need to understand what steps a reaction could take.”
“Electron movement and predictably You really need to develop the thought process of how different electrons and protons will behave”

Memorization
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of how they are expected to think: (1) memorization, remembering, recalling, or regurgitation; (2) retaining information.
Example quotes:
“I would say that students must think practically when taking this class. You do not need to memorize all the reactions but memorizing the mechanism will allow you to do all sorts of problems. Memorize the mechanisms rather than the contents of each reaction. If you know the general mechanism than you know how to do the problems generally.”
“I believe students are expected to think critically in CEM 252. Though there is a lot of memorization in organic chemistry, students should think critically when it comes to problem solving. For example, when trying to figure out how to synthesize a specific molecule, a student must think critically by using the processes they have memorized in order to solve the problem.”

Solution-Centered
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of how they are expected to think: (1) focusing on accuracy, being right or wrong; (2) focusing on the solution to a problem.
Example quotes:
“Student are expected to look at problems and determine a)all of the different kind of solutions there are, and b)which is best in most of the problems provided in this class, there are many answers, so determining which is best is important.”
“pretty linearly organic chemistry does not have much "grey area" Answers tend to be right or wrong. like which mechanism goes first.”

Course Aspects
Dimensions: Student responses that include the following dimension regarding their perceptions of how they are expected to think: (1) focusing on course organization, difficulty, policies, or other aspects related to the course itself.
Example quotes:
“Students are expected to use their resources to strengthen their understanding without being reminded. Students are given many resources such as office hours, notes, outside resources, youtube tutorials etc - that there is an underlying expectation for students to strengthen their understanding whenever or wherever there is confusion.”
“every thing you di in this class has your brain thinking. You have to be able to think during class, lecture, recitation, and homework. None of it is easy.”

Effort
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think: (1) putting in effort and trying; (2) taking initiative and choosing how to approach their studying.
Example quotes:
“In this class, I would say that students are guided to think individually about the topics that are in class. By that, I mean students are led to the proper content, and they are explained what is needed of them, but the ways that they are able to be successful is left in the hands of the students.”
“students were expected to put in full effort in every aspect of this class. It was a lot of out of class dedication to the class in order to fully understand the objective at hand.in this class you could not simply remember one rue and know how to do everything. understanding how one mechanism works and simply memorizing that mechanism will not always help you with every mechanism.”

Generalities
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think: (1) the total absence of any chemistry in the response; (2) using generic and unclear descriptors for thinking such as thinking critically, conceptually, creatively, thoroughly, or differently, particularly if the student does not expand on what they mean; (3) stating general facts about content such as “reactants go to products”.
Example quotes:
“Students are expected to think critically/logically and learn concepts as if they are math problems. Figuring out the products and synthesis reactions means thinking and solving each problem using similar formulas. Thinking in a critical and logical way may aide in successfully completing the course.”
“They are supposed to think outside the box For example sometimes they won't show u the path to the solution of a problem but they will give you all the tools for it”

Not Applicable
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of how they are expected to think: (1) the response does not answer the question; (2) when a student provides no response at all; (3) when the response is unclear or interpretation is difficult, such as when students say “you must understand/know the material”; (4) mention of exam and/or course aspects such as exam difficulty; (5) when a student is venting about the course, professor, or other aspects relevant to the course.
Example quotes:
“I think this class was very interesting and helped students to think about different topics throughout the class. While going through lectures, I learned a lot of new topics and sometimes I had to think multi topics for better understanding.”
“Tirelessly. I don't think there is a simple way to think in CEM 252. You (or at least, me) are constantly overthinking the way you do things and the answers you get. Nothing seems solidified or verified and it's hard to understand and know what youre doing. It gives me a constant headache. You can't learn from a 7-14 minute lecture video. Especially when it's nothing but examples and no explanations.”

The “Solution-Centered” category was used for responses that discussed how the course only sought “right” answers. The “Effort” category was reserved for responses that described the expectations of the course as being based on “trying” and giving a “good faith effort”.

Figure 6.1. Breakdown of student responses to question 1 for Spring 2021 CEM 252 students (N=260)

The majority of students in this course perceived that they were expected to apply their knowledge and reason. A very small percentage perceived they had to rely on memorization or find the “right” answer (Solution-Centered). The “Generalities” category is the second-largest, which has been a common occurrence with these perception questions and matches the pattern noted previously with CEM 252 data and transformed experiences overall. That is, the questions seem to generate reliable trends from semester to semester, despite this semester being online. A quarter of responses fell into the “Generalities” category, but with the majority in “Apply and Reason”, the data imply that students do perceive the transformational intent of OCLUE.

Most Difficult Aspects
This question asked students: “What would you say is the most difficult thing about CEM 252?” with a follow-up question requesting an example. The codebook for this question is shown in Table 6.5.

Table 6.5. Codebook for question 2: Perceptions of the most difficult thing

Apply and Reason
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is the most difficult thing: (1) understanding “why” a reaction proceeds; (2) the use of knowledge, specifically with the use of fundamental or basic ideas; (3) the transfer of knowledge to new problems; (4) making connections between concepts or piecing/linking concepts together, especially in order to apply them; (5) making predictions in order to solve a problem.
Example quotes:
“The most difficult thing is not just memorizing facts, but understand the why of it. It's easy to memorize facts like compound A is more acidic than Compound B. But on the exam, you need to understand why this happens.”
“The most difficult thing about CEM 252 is having to apply what is taught to reactions not know. Reacting two molecules together that I have never seen, for instance, is very challenging. I have to recognize what parts of them would react and how they would attack, where electrons go, and more. It takes a new type of thinking.”
Identify and Describe
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is the most difficult thing: (1) understanding the “what” and “how” reactions proceed, particularly without mentioning the use of knowledge to understand “why” reactions proceed; (2) mentions of understanding at the scalar or one scalar below levels, particularly through the recognition of concepts such as polarity and electronegativity and their significance to understanding; (3) when a student explicitly mentions understanding instead of memorization; (4) when a student mentions “differentiating” between reactions with any further explanation; (5) responses that include a discussion of forces, charges, or stabilization. NOTE: responses that simply mention “knowledge” or “understanding” do not receive this code.
Example quotes:
“I think the most difficult thing is learning reagents and finding where the electrons are moving. There were many different reactions and mechanisms proposed that students were expected to know. I think that by fully understanding where the electrons are moving and how certain structures react with differing reagents. By knowing what the end structure should contain, it is easier for students to know what mechanism to do.”
“Understanding how a problem goes from point A to B when it comes to writing out the mechanisms. CEM 252 is more about understanding the mechanisms behind how a molecules goes from reactant to product, rather than how to get the product from a reactant.”

Memorization
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is the most difficult thing: (1) memorization, remembering, recalling, or regurgitation; (2) knowing content with no explicit mention of understanding the “what”, “how”, or “why” a reaction proceeds.
Example quotes:
“The most difficult part was all of the mechanisms and compounds. There were many compounds and mechanisms that we had to memorize in order to be successful.”
“The most difficult thing was trying to memorize the different mechanisms. I had a lot of trouble trying to remember what type of reaction coincides with what type of mechanism. For example, I would mix up what a nucleophilic substitution would look like and what an electrophilic addition would look like.”

Specific Topics
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is the most difficult thing: (1) listing off specific topics, particularly with no reference to understanding or approaches utilized; the most common specific topics mentioned include mechanisms, acid-base reactions, learning objectives, naming, synthesis, and spectroscopy; (2) explicit mention of “concepts” without expanding on what they mean (i.e., “understanding concepts” or “knowing a mechanism”).
Example quotes:
“spectroscopy literally the entire subject of it confused me”
“condensation reactions I had a difficult time understanding the mechanisms of how these reactions were performed”

Exam Aspects
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of the most difficult thing: (1) general mention of the exams being most difficult; (2) discussion of the format of the exams; (3) discussion of the grading approach to the exams.
Example quotes:
“The most difficult thing about CEM 252 is the fact that doing well requires a clear and confident understanding of all the content. Since the exams are multiple choice, with relatively few questions that often build on one another, it can be easy to score poorly if you overthink a problem or two. More specifically, even if you know the content pretty well, if you ever second guess yourself on some of the questions, you can get many points off of your exam scores.”
“The exams. There are only 10-14 questions on the exams. I would get two wrong and receive a 79% because of the weighted questions, even when I felt like I knew the material.”

Course Aspects
Dimensions: Student responses that focus on course organization, difficulty, policies, or other aspects related to the course itself.
Example quotes:
“adjusting to the different structure and expectations I had a different professor for Orgo 1 so I wasn't really prepared. I felt like I was trying to catch up for the majority of the semester”
“Transitioning from a traditional professor to OCLUE. I succeeded in both orgo 1 with Vaseilou and orgo 2 with Dr. Cooper. In the future, I would not recommend switching curriculums. When I enrolled, the professors names were not available so I did not know I was switching curriculums. Overall, what helped me overcome this change was OFFICE HOURS. I lived in office hours for a LONG period of time and was able to perform very well in the course!”

Course Materials
Dimensions: Student responses that focus on specific course materials, such as homework, lecture notes, or recitation worksheets.
Example quotes:
“The most difficult thing for me was the recitation assignments. I would have trouble completing the assignment on my own if I didn't have a group to work on it with.”
“I think the most difficult thing about CEM 252 is the beSocratic homework. For the majority of beSocratic assignments, my notes were really useful and they helped me to understand the assignment. For some assignments, however, my notes weren't useful and so I couldn't really understand the material in the assignment.”

Student Aspects
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of the most difficult thing: (1) actions related to what students do in a course, such as taking notes, applying themselves, finishing work, and more; (2) putting in effort and participating; (3) learning on their own outside of class and being accountable; (4) actions pertaining to the student, such as keeping track of due dates.
Example quotes:
“Remembering to do the besocratic homework My memory is not good and I often forgot that I had to do the homework so I didn't complete the assignments”
“Remembering to do the the homework. I had a difficult time remembering to the the homework because it wasn't due at night but instead before class, for me personally I had to change my mind set that it was due the night before class in order to get it done.”

Online
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of the most difficult thing: (1) being online; (2) accessibility of getting help in the course; (3) access of materials.
Example quotes:
“For me, it was learning everything online, as I really struggle with this. As for the curriculum, I think really understanding the mechanisms and knowing which type of reaction is occurring. I perform much better when I am in an environment of learning/surrounded by people learning the same material. It seems that no matter how hard I try, I still feel there are too many distractions when standing class over zoom.”
“learning from the homeworks and learning during the pandemic beSocratic I find hard to learn from, and through the pandemic, with little break, etc. eventually I just burned out this semester trying to learn online”

Workload and Pace
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of the most difficult thing: (1) amount of material; (2) pace of course; (3) feeling overwhelmed.
Example quotes:
“possibly the bulk of the content While it was nice that CEM 251 covered half of the content for organic chemistry and CEM 252 covered the other half, there was still a lot of content to go through. considering that we had to follow lecture videos outside class, as well as attend weekly meetings and recitations,”
“I would say that the pace at which we learn is dificult. It is a lot of information, and largely new information in organic chem II, within a short period of time. Although I enjoyed the course, it would be great to have more time to practice real work example problems and discuss more about each topic.”

Generalities
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of the most difficult thing: (1) the total absence of any chemistry in the response; (2) using generic and unclear descriptors for thinking such as thinking critically, conceptually, creatively, thoroughly, or differently, particularly if the student does not expand on what they mean; (3) stating general facts about content such as “reactants go to products”.
Example quotes:
“How to think conceptually. It is really hard for me to take a molecule I have written on paper and imagine it in 3D. However, it is important to understand this because reactions actually happen in 3D. What helps me are the kits that allow you to build molecules.”
“The material The way Dr. Cooper constructed this class was absolutely amazing and I'm go glad I took CEM 252 this semester. With that in mind the material for CEM 252 in general is just difficult because of how advanced it is but having Dr. Cooper definitely made it easier to understand”

Not Applicable
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is the most difficult thing: (1) the response does not answer the question; (2) when a student provides no response at all; (3) when the response is unclear or interpretation of the response is difficult, such as when students say “understanding the material”; (4) when the response falls into no other category; (5) when a student is venting about the course, professor, or other aspects relevant to the course.
Example quotes:
“Everything It is overall a hard class there are a lot reactions that can react different ways with different compounds.”
“All of it. Chemistry has always been difficult for me and I'm not sure if i'll ever understand it. An example is all of my exam grades, except for exam 5 which was a miracle.”

The “most difficult thing” question often produces the most categories, as noted in Figure 6.2.

Figure 6.2. Breakdown of student responses to question 2 for Spring 2021 CEM 252 students (N=260)

Here, we still see a large portion of students perceiving “applying and reasoning” as the most difficult aspect. However, the majority of students perceived a particular specific topic as the most difficult. Furthermore, a small number of students mentioned that the most difficult aspect of the course was being online or the overall workload and pace. These data are not entirely different from what we have previously seen with transformed courses. In some cases, student responses were allocated to the “Specific Topics” category when they listed a topic but did not describe why it was the most difficult. Regardless, we see a similar pattern across student responses as we have historically for this course.

Assessment
This question asked students: “How would you describe what is assessed in CEM 252?” with a follow-up question asking them to provide an example. Similar to the other questions, the codebook here resembles those from previous studies. The codes and descriptions are listed in Table 6.6.

Table 6.6. Codebook for question 3: Perceptions of what was assessed

Apply and Reason
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is assessed: (1) understanding “why” a reaction proceeds; (2) the use of knowledge, specifically with the use of fundamental or basic ideas; (3) the transfer of knowledge to new problems; (4) making connections between concepts or piecing concepts together, especially to apply them; (5) making predictions to solve a problem.
Example quotes:
“I think our ability to critically think is assessed in CEM 252, we aren't taught to memorize but instead to think our way through problems. Rather than memorize reactions of functional groups, we are taught about the movements of bonds and WHY they interact the way they do.”
“Honestly, I think it's putting your knowledge together to think about what you're doing instead of just doing random stuff. When learning about certain topics, you can either memorize them, or actually figure it out and understand the WHY behind it.”
Identify and Describe
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is assessed: (1) understanding the “what” and “how” reactions proceed, particularly without mentioning the use of knowledge to understand “why” reactions proceed; (2) mentions of understanding at the scalar or one scalar below levels, particularly through the recognition of concepts such as polarity and electronegativity and their significance to understanding; (3) when a student explicitly mentions understanding instead of memorization; (4) when a student mentions “differentiating” between reactions with any further explanation; (5) responses that include a discussion of forces, charges, or stabilization. NOTE: responses that simply mention “knowledge” or “understanding” do not receive this code.
Example quotes:
“our understanding of the different functional groups most recitations and exams where on how certain functional groups interact with something in the environment or some other chemical group”
“I think again what is assed is not how will we memorize the reaction but how we understand how molecules behave, and how each reaction follows some rules. For example on a quiz we are not asked exactly what we are taught or memorized of the reaction but understatement of the reaction and it is applied. For example, being test in what order does a reaction occur when having carbonyl as a electrophile”

Understand
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is assessed: (1) when a student mentions “understanding”; (2) description of “understanding” as contrasted to repeating rules or practice problems.
Example quotes:
“Ability to understand organic reactions The homework and tests focused on solving and analyzing reactions”
“I think what is assessed is the ability to understand the reactions presented. I believe this because the most important part to understanding the concepts is to understand the reactions that underly them.”

Memorization
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is assessed: (1) memorization, remembering, recalling, or regurgitation; (2) knowing content, particularly with no explicit mention of understanding the “what”, “how”, or “why” a reaction proceeds.
Example quotes:
“Your level of memorization Being tested on your acknowledgement and remembrance of structures.”
“More than often memorization is assessed Most of the time we find ourselves memorizing reactions and examples rather than truly learning them because test scores heavily affect our futures”

Specific Topics
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is assessed: (1) listing off specific topics, particularly with no reference to understanding or approaches utilized; (2) explicit mention of “concepts” without expanding on what they mean.
Example quotes:
“Reaction mechanisms and concepts for each unit. Knowledge on how to do reactions (products, energy diagrams, intermediates, mechanisms) of certain types of organic compounds and occasionally conceptual questions are included.”
“different reactions and a more in depth study of many of the organic chemistry functional groups. we covered many different types of reactions in this class and went in depth of the many different functional groups which make up molecules.”

Exam Aspects
Dimensions: Student responses that include one or more of the following dimensions regarding their perceptions of what is assessed: (1) information regarding the exams; (2) the format of the exam, such as stating the types of questions on the exam (i.e., multiple-choice or short answer); (3) the length, time, or fairness of the exams.
Example quotes:
“I think the assessments are very fair and are a good representation of the material I think everything we learn, and do in recitiation always relates to what we are being tested on, nothing is ever a surprise.”
“The assessments aren't too much more difficult than the homework but they do require a bit more thinking and practice. I remember a mini quiz where two different topics were put together in one question.”

Course Aspects
Dimensions: Student responses that focus on course organization, difficulty, policies, or other aspects related to the course itself.
Example quotes:
“I think CEM252- especially online- does a good job at actually grading based on your knowledge and not soley on exam perfomance. This semester, grades like homework and recitation are nearly as important as the exams. Also by breaking the exams into 6 smaller ones, it feels more practical and easier to do well.”
“All the materials covered was fair I think we covered a good amount of material. Also CEM 252 has to follow a curriculum”

Course Materials
Dimensions: Student responses that focus on specific course materials, such as homework, lecture notes, or recitation worksheets.
Example quotes:
“I say that our knowledge of the lectures is assessed in CEM 252. The exams are very similar to the notes we take in class. I believe we are never asked to solve problems that are very different or more difficult than the class.”
“We are assessed on the things learned in the lecture videos and the homework A lot of the exam questions are similar to the homework and recitation, they are just much harder”

Effort
Dimensions: Student responses that perceive their overall effort, participation, and willingness to try as being assessed.
Example quotes:
“I believe our effort is assessed in CEM252. I really appreciate how homework and recitation was graded on effort, as learning means you will make mistakes, which is something I do not believe should be punished. Obviously our exams are not graded for effort, but if you put effort into the course, the exams were not difficult (at least from my perspective).”
“CEM 252 assesses how hard I am trying to learn the material, in addition to how well I am learning the material. I like that the homeworks and recitations are based on effort and not off correctness, it helps for students that have a bad test day, or a bad week to still get a good grade when they are trying their hardest. Also having the exams every 2 weeks helps to take the pressure off of a massive exam.”

Generalities
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is assessed: (1) the total absence of any chemistry in the response; (2) using generic and unclear descriptors for thinking such as thinking critically, conceptually, creatively, thoroughly, or differently, particularly if the student does not expand on what they mean.
Example quotes:
“Your ability to think critical and creatively about reactions. You want us to have many different types of reactions in mind, and use those examples to solve problems. i would spend more time reviewing from week to week.”
“Mastery of the subject The test questions were not easy. They were complex, and students really have to know their stuff to be successful with exams.”

Not Applicable
Dimensions: Student responses that include one or more of the following dimensions in regard to their perceptions of what is assessed: (1) the response does not answer the question; (2) when a student provides no response at all; (3) when the response is unclear or interpretation of the response is difficult, such as when students say “your understanding of the material”; (4) when the response falls into no other category; (5) when a student is venting about the course, professor, or other aspects relevant to the course, such as frustrations they have with the course or professor.
Example quotes:
“vbnjm vbnm”
“Im kind of confused on what this question is asking. I would say typical orgo 2 topics. amines, ethers, oxidation/reduction, nucleophiles/electrophiles, etc”

Once again, I saw the re-emergence of most categories, and the overall patterns matched what we have previously seen with transformed courses. The “Understand” category captured responses that focused exclusively on the idea that course assessments measured their understanding, without further explanation.

Figure 6.3. Breakdown of student responses to question 3 for Spring 2021 CEM 252 students (N=260)

The majority of students perceived they were assessed on their ability to apply and use their knowledge or to “Identify and Describe”. For the most part, other categories were small in comparison. It is promising that many students perceive they are assessed on their ability to apply and reason, which aligns with the overarching transformational intent of OCLUE. As in previous studies in CEM 252, the results showed similar trends in student responses.

Superordinate Themes
Similar to previous studies, I aggregated the data into superordinate themes to make the findings easier to discuss and communicate. For the first question:

Figure 6.4. Superordinate themes for Spring 2021 CEM 252 responses to question 1 (expectations of thinking) (N=260)

Most students perceived they were expected to use their knowledge in CEM 252, which is similar to our findings in Chapter IV in the same course. For this data, I coupled the “Understand” category with “Apply and Reason” and “Identify and Describe”, since many of these responses implied understanding beyond rote memorization. This particular semester was online due to the COVID-19 pandemic. It is therefore promising that student perceptions of what they are expected to do still (at least partially) align with the transformational intent of the course. The trends across OCLUE responses are very similar to previous studies. For the second question:

Figure 6.5. Superordinate themes for Spring 2021 CEM 252 responses to question 2 (most difficult aspects) (N=260)

Considering that this course was online, and the previous CEM 252 study presented in Chapter IV was not, I saw some differences here, specifically with responses in the “New and Contextual” theme. This theme captured responses specifically focused on the online experience or with an explicit focus on workload and pace.
This is slightly different from the study presented in Chapter IV, where “Workload and Pace” showed up in the “Other” theme. I opted to incorporate the “Workload and Pace” category into this theme alongside the “Online” category because, in some cases, these codes co-occurred. That is, students often mentioned that, because of the online environment, the class moved at a pace that was not ideal or involved more work. Although the general trend is similar to what we noticed in previous studies, the numbers differ, with more student responses falling into the “Rote Knowledge” theme than previously noted (since more students perceived the most difficult aspect to be specific topics). This may be explained by the shift to online instruction, but more information would be needed to draw any conclusions about these differences. For the third question:

Figure 6.6. Superordinate themes for Spring 2021 CEM 252 responses to question 3 (assessment) (N=260)

For the final question, I noted that most of the students perceived they had to use their knowledge on assessments, which aligns with previous findings across transformed courses. One major difference here is the gap between “Use of Knowledge” and “Other”. Fewer responses were categorized as “Other”, likely because more students offered examples (since they were explicitly prompted for them), which helped me determine how to categorize their responses. In the original CEM 252 study, students were not asked to provide an example, and some of them responded vaguely. The addition of examples certainly helps to clarify responses, enabling me to code them more specifically. However, almost 31% of responses still fall within this theme. Although we have been able to lower the “Generalities” code, there are still plenty of responses that do not provide much useful information.

Comparing CEM 252 Online to CEM 252 In-Person
As mentioned previously, this study offered a unique opportunity to compare student responses from CEM 252 in an online and an in-person environment. The data for this chapter were collected in Spring 2021, when CEM 252 was entirely online. They can be compared to the data collected in Spring 2018, when CEM 252 was entirely in-person (the same data presented in Chapter IV). For ease of comparison, I will focus exclusively on the superordinate themes. A comparison of CEM 252 in-person (Spring 2018) with CEM 252 online (Spring 2021) for question #1 (expectations of thinking) is shown below.

Figure 6.7. A comparison of CEM 252 in-person with CEM 252 online for question #1

There is an uptick in the number of responses stating they must use their knowledge in the online course, with a decrease in the percentage of students who perceived they were expected to use rote knowledge in OCLUE. This is interesting and unexpected given the shift to online instruction. These differences were also statistically significant (χ² = 30.475, df = 2, α = 0.05, p < 0.001), albeit with a small effect size (Cramér's V = 0.245). One potential explanation is that with the shift to online instruction, the OCLUE classroom mirrored a “flipped” classroom approach, in which content delivery primarily occurred outside of class via pre-recorded videos and class time was used to go over homework and additional examples. Although homework was also reviewed when the course was in-person, it was sometimes done more quickly in order to make time for lecture.
This shift could have further clarified the expectation that students apply and use their knowledge to explain why chemical phenomena happen, since more class time was devoted to working examples. A comparison of CEM 252 in-person (Spring 2018) with CEM 252 online (Spring 2021) for question #2 (most difficult thing) is shown below.

Figure 6.8. A comparison of CEM 252 in-person with CEM 252 online for question #2

For this question, I noted a new theme, labeled “New and Contextual”, in the online CEM 252 cohort. As a reminder, this theme captured responses that explicitly discussed the online nature of the course and/or the workload associated with the course being online. Since this did not apply to the in-person cohort (the course was not online), it did not show up in the Spring 2018 data. We see a small difference in the perceived use of knowledge as being most difficult, with more students in the in-person cohort perceiving that having to use their knowledge was the most difficult aspect. We see roughly comparable percentages of responses perceiving rote knowledge as the most difficult. For statistical testing, I opted to remove the “New and Contextual” responses, since they did not appear in the Spring 2018 data set, and to test the remaining responses. At α = 0.05, the results were not statistically significant (χ² = 5.017, df = 2, p = 0.081). A comparison of CEM 252 in-person (Spring 2018) with CEM 252 online (Spring 2021) for question #3 (assessment) is shown below.

Figure 6.9. A comparison of CEM 252 in-person with CEM 252 online for question #3

Similar to question #1, more students in the online cohort perceived they were assessed on their use of knowledge, with a larger proportion of students in the in-person cohort perceiving they were assessed on rote knowledge. These results were also statistically significant (χ² = 13.705, df = 2, α = 0.05, p = 0.001), albeit with a small effect size (Cramér's V = 0.164). Once again, a potential explanation is the shift in how the classroom was conducted, with class time used to go over homework and examples and answer student questions rather than for content delivery. By engaging in consistent practice throughout class time, it could be that the expectations and practices were better communicated to students in the online cohort.

The Use of AACR with the CEM 252 Data
There was a low number of responses in this dataset. This, coupled with the complexity of the responses, likely lowered the kappa values (shown in the table below).

Table 6.7. Spring 2021 CEM 252 AACR overall kappa values, disaggregated
Question 1 – Expectations of Thinking: 0.47
Question 2 – Most Difficult Thing: 0.47
Question 3 – Assessment: 0.43

The kappa values were fairly low, so I attempted to train the models using the superordinate themes, reasoning that this might make it easier for the models to categorize responses. Results are included below.

Table 6.8. Spring 2021 CEM 252 AACR overall kappa values, aggregated by superordinate themes
Question 1 – Expectations of Thinking: 0.40
Question 2 – Most Difficult Thing: 0.66
Question 3 – Assessment: 0.56

It appears that organizing the data into superordinate themes may have helped with Questions 2 and 3 but did not for Question 1.
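The agreement statistic used throughout these comparisons is Cohen's kappa between the human-assigned codes and the machine-assigned codes. The sketch below illustrates the before/after-aggregation comparison; the response codes and the (partial) category-to-theme mapping are illustrative placeholders rather than the study data or the full codebook.

```python
# Minimal sketch comparing machine-human agreement before and after
# aggregating categories into superordinate themes. The lists and the
# mapping below are illustrative placeholders, not the actual data.
from sklearn.metrics import cohen_kappa_score

human   = ["Apply and Reason", "Generalities", "Memorization", "Identify and Describe"]
machine = ["Apply and Reason", "Apply and Reason", "Memorization", "Apply and Reason"]

# Illustrative (partial) mapping from categories to superordinate themes.
theme = {
    "Apply and Reason": "Use of Knowledge",
    "Identify and Describe": "Use of Knowledge",
    "Memorization": "Rote Knowledge",
    "Generalities": "Other",
}

kappa_raw = cohen_kappa_score(human, machine)
kappa_theme = cohen_kappa_score([theme[c] for c in human],
                                [theme[c] for c in machine])
print(f"kappa (categories): {kappa_raw:.2f}")
print(f"kappa (superordinate themes): {kappa_theme:.2f}")
```

Aggregation can raise kappa simply because categories that the model confuses with one another (here, the two use-of-knowledge categories) collapse into the same theme, so those disagreements no longer count against agreement.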
Upon looking more closely at the data, I found that the models struggle with the complexity of responses because students often do not use the same language to describe their experience, a phenomenon similar to the one I described in the previous chapter (Chapter V). It therefore becomes difficult to classify responses via lexical analysis. For example, if a response stated, “we do not have to memorize, we have to apply and reason”, it was sometimes miscoded as “Memorization” due to the presence of the term “memorize”. Issues like these abounded throughout the dataset. It seems that asking students to explain their answers makes their responses much more complex (and longer), making it difficult for AACR to agree on a code. Furthermore, since I am not looking for the occurrence of words or phrases but instead interpreting the meaning underlying a response, it can be difficult for AACR to agree with the codebook I established, given the smaller number of responses. However, I did not test all parameters, and it may be that some set of parameters could raise the kappa values to acceptable levels; more responses would be needed. Given the highly open-ended nature of these questions and the complexity of the responses, AACR has historically struggled with these questions when “N” values are below 1,000 responses.

Fall 2021 CEM 251 Organic Chemistry I Data
Continuing the work of understanding the boundaries of the perception questions, I modified them to see if some scaffolding might help limit responses that are not as insightful. For these responses, half of the students across OCLUE and traditional CEM 251 received scaffolded questions that encouraged them to refrain from using terms like “critical thinking” or discussing the format of the exams. The other half received the same perception questions as before; this group is referred to as the “No Scaffold” group since their questions did not include the scaffolding clauses.

Expectations of Thinking
The data included two different modifications of the “expectations of thinking” question. The “No Scaffold” group received the following question: “How would you describe the ways students are expected to think in CEM 251?” with a follow-up requesting an example. The “Scaffold” group received the following modified prompt: “How would you describe the ways students are expected to think in CEM 251? Please try not to use vague terms such as “critical thinking”, “problem solving”, “systematically”, “analytically”, etc. Instead, provide examples of the types of things you are expected to do and explain in question #2.”

For the Fall 2021 CEM 251 data, I will start by discussing the Scaffold vs. No Scaffold results and then follow up with the OCLUE vs. Traditional results. We noted categories similar to those in the past, albeit in a somewhat different pattern. Because the codes match those previously described in Chapters IV, V, and VI, I have not provided an additional codebook here. Statistical tests are provided for the superordinate themes.

Figure 6.10. Breakdown of student responses to question 1 for Fall 2021 CEM 251 students (scaffold vs. no scaffold)

To begin, it is important to recognize that the scaffolding only targeted specific areas. That is, one should not assume that the scaffolding would lower responses in all of the less informative categories.
As I mentioned earlier, the scaffolding for question 1 for this cohort was primarily focused on the “Generalities” category, encouraging students to stop using vague terms, such as “critical thinking”, to describe their experience. Here, we see a slight drop in “Generalities” responses (which may have corresponded to the uptick in “Apply and Reason” responses). It is important to note that responses still used other types of vague language to describe the experience (e.g., “organic chemistry is a new language”), and it may be difficult to provide enough scaffolding to curb all the ways students can respond to this question. When we disaggregate the data into OCLUE vs. Traditional, this trend is similar.

Figure 6.11. Breakdown of student responses to question 1 for Fall 2021 CEM 251 students (OCLUE vs. traditional)

Once again, we note slight drops in the “Generalities” code for both OCLUE and Traditional students, albeit not by a large amount. It is worth noting that we see a large jump in the “Apply and Reason” code after scaffolding for traditional students that cannot be fully attributed to the drop in “Generalities”. We saw fewer responses in “Generalities” in previous studies (in CEM 252). Although we see some slight drops in the number of “Generalities” responses, it seems that even when students are encouraged not to use certain terms, they still rely on vague language or use other terms like “problem solving” to describe their experience without providing examples.

Most Difficult Aspects
The question for the “No Scaffold” group asked: “What would you say is the most difficult thing about CEM 251?” followed by a prompt asking for an example. The “Scaffold” group received the following question: “What would you say is the most difficult thing about CEM 251? Similar to the previous questions, please try not to use vague terms such as “critical thinking”, “problem solving”, etc. Instead, provide examples of what you believe the most difficult aspect(s) of the course is/are, and please explain why this aspect is the most difficult for you in question #4.” Once again, no codebook is provided here because the codes and their descriptions are identical to codes that emerged in the previous studies. Statistical tests are provided for the superordinate themes.

Figure 6.12. Breakdown of student responses to question 2 for Fall 2021 CEM 251 students (scaffold vs. no scaffold)

Similar to question #1, we wanted the scaffolding to target the “Generalities” code. That is, we wanted to discourage students from using vague language like “critical thinking” to describe what was most difficult about the course. A cursory examination of the data shows a steep drop-off in “Generalities” responses with the scaffolding. However, on many questions we see differences between the scaffolded and non-scaffolded groups (such as with “Specific Topics”). It certainly could be that the groups of students (who were randomly assigned to each group) do have different perceptions of what is most difficult. To help make more sense of the data, I turned to the disaggregated data.

Figure 6.13. Breakdown of student responses to question 2 for Fall 2021 CEM 251 students (OCLUE vs. traditional)

Here, we once again see the same trend: for both OCLUE and Traditional, the scaffolding did limit the “Generalities” responses.
However, we see differences across all categories that cannot be solely attributed to the steep drop in “Generalities” responses, particularly in the “Specific Topics” category for both groups. In both cases, the “Specific Topics” category sees an increase in responses after scaffolding. It could be that students interpreted the task as meaning they should list specific topics as examples, which would explain the sharp uptick in responses here.

Assessment
The “No Scaffold” group received the following question: “How would you describe what is assessed in CEM 251?” with a follow-up requesting an example. The “Scaffold” group, however, received the following modified question: “How would you describe what is assessed in CEM 251? Similar to the previous questions, please try not to use vague terms such as “critical thinking”, “problem solving”, etc. Furthermore, please avoid restating what assessments look like since we already know that information. Instead, provide examples that support your perception of what was assessed in the course in question #6.” No codebook is provided here given the similarities in codebooks across the studies and datasets. When looking at the scaffolded vs. non-scaffolded data for Fall 2021 CEM 251, I noted a pattern similar to question #2. Statistical tests are provided for the superordinate themes.

Figure 6.14. Breakdown of student responses to question 3 for Fall 2021 CEM 251 students (scaffold vs. no scaffold)

The scaffolding for this question specifically targeted “Generalities” and “Exam Aspects”. For the most part, both categories are comparable with and without the scaffolding: we see a very slight drop in “Generalities” and a very slight increase in “Exam Aspects” with scaffolding. However, the data seem to indicate that the scaffolding may have helped limit the number of responses in “Course Materials and Aspects”. In previous data collections, some students mentioned that the exams assessed material from the course (i.e., lectures, homework, etc.). This was still the case here, but the scaffolding may have helped make those responses more specific. When we disaggregate the data, however, we see that some courses contribute more to these differences than others.

Figure 6.15. Breakdown of student responses to question 3 for Fall 2021 CEM 251 students (OCLUE vs. traditional)

Here we note that the “Generalities” category for OCLUE is very similar before and after the scaffolding, while the traditional cohort shows a steeper drop with the scaffolding. With “Exam Aspects”, the number of responses for OCLUE actually increased after scaffolding, while it dropped slightly for the traditional group. For “Course Materials and Aspects”, both groups showed decreases after the scaffolding.

Superordinate Themes
Given the large number of responses split across many categories, I aggregated the categories into superordinate themes to better see the trends. To begin:

Figure 6.16. Superordinate themes showing how scaffold and no scaffold groups for OCLUE and traditional CEM 251 differed on question 1.

Here it is easier to see how the scaffolded and non-scaffolded groups compared, as well as the differences between OCLUE and traditional. For the most part, there was little difference between scaffolding and no scaffolding. However, we do see some differences between OCLUE and traditional, with more OCLUE students perceiving they have to use their knowledge relative to students in the traditional course.
However, we do not see as many students in the traditional course perceiving that the use of rote knowledge is expected, like we did in the CEM 252 study in Chapter IV. Granted, these are two different courses, and this is the first time we have used these questions in CEM 251.

There are several ways to calculate statistical differences within this data. To begin, a comparison between no scaffold and scaffold results yielded no significant differences when the data was aggregated into superordinate themes (χ² = 3.642, df = 2, α = 0.05, p = 0.162). That is, there is no statistical difference between the non-scaffolded and scaffolded results. This is further confirmed when I disaggregated the dataset into OCLUE and traditional. Comparing no scaffold vs. scaffold for just the OCLUE results, I found no statistical difference (χ² = 0.528, df = 2, α = 0.05, p = 0.768). A similar result was noted for the traditional cohort (χ² = 3.127, df = 2, α = 0.05, p = 0.209). However, there were statistical differences when comparing OCLUE to traditional (as was noted in Chapter IV). When comparing OCLUE and traditional non-scaffolded responses, there was a statistically significant difference (χ² = 22.336, df = 2, α = 0.05, p < 0.001, Cramer’s V = 0.227). A similar result was found when comparing OCLUE and traditional scaffolded responses (χ² = 15.261, df = 2, α = 0.05, p < 0.001, Cramer’s V = 0.186). Although both of these comparisons between OCLUE and traditional were significant, it’s important to note that their effect sizes were small (as evidenced by Cramer’s V).

Figure 6.17. Superordinate themes showing how scaffold and no scaffold groups for OCLUE and traditional CEM 251 differed on question 2.

For question 2, there was little difference between scaffolding and no scaffolding for the OCLUE group. However, I saw more differences with the traditional cohort. The general pattern here is that more students in OCLUE perceived that the most difficult aspect was the use of knowledge, while more students in the traditional course perceived that rote knowledge and other facets (such as exams) were the most difficult aspects. When the data is treated holistically as no scaffold vs. scaffold, similar to question #1, no statistical difference was found (χ² = 4.645, df = 3, α = 0.05, p = 0.200). Upon disaggregating the data by course type (OCLUE and traditional), there were no significant differences in OCLUE responses between no scaffold and scaffold (χ² = 0.589, df = 3, α = 0.05, p = 0.899); however, there was a significant difference for traditional no scaffold vs. scaffold (χ² = 11.015, df = 3, α = 0.05, p = 0.012, Cramer’s V = 0.159). Put simply, the presence of scaffolding appears to have been helpful for traditional students (albeit with a small effect size) but not for students in OCLUE.

Once again, statistical differences emerged when I compared the OCLUE and traditional cohorts to one another. Starting with the non-scaffolded data, there is a significant difference between OCLUE and traditional responses (χ² = 27.706, df = 3, α = 0.05, p < 0.001, Cramer’s V = 0.253). Similarly, there is a statistical difference with the scaffolded data between these two course types as well (χ² = 14.614, df = 3, α = 0.05, p = 0.002, Cramer’s V = 0.182). However, as was seen with the “expectations of thinking” question, these differences come with small effect sizes.
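All of the comparisons above are chi-square tests of independence on contingency tables of response counts (groups by superordinate themes), with Cramer’s V, computed as V = sqrt(χ² / (n(min(r, c) − 1))), reported as the effect size. As a minimal sketch of how such a test can be computed (the counts below are hypothetical placeholders, not data from this study):

```python
# Minimal sketch of the chi-square tests and Cramer's V reported above.
# The counts are hypothetical placeholders, not data from this study.
import numpy as np
from scipy.stats import chi2_contingency

def cramers_v(table):
    """Cramer's V effect size for a two-way contingency table."""
    chi2, _, _, _ = chi2_contingency(table)
    n = table.sum()
    r, c = table.shape
    return float(np.sqrt(chi2 / (n * (min(r, c) - 1))))

# Rows: no scaffold vs. scaffold; columns: three superordinate themes
# (e.g., "Use of Knowledge", "Rote Knowledge", "Other").
table = np.array([[60, 45, 30],
                  [72, 38, 25]])

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.3f}, V = {cramers_v(table):.3f}")
```

For two groups compared across three superordinate themes, df = (2 - 1)(3 - 1) = 2, matching the degrees of freedom reported for the question 1 and question 3 tests; the question 2 tests, which involved four themes, have df = 3.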
Figure 6.18. Superordinate themes showing how scaffold and no scaffold groups for OCLUE and traditional CEM 251 differed on question 3.

Finally, for question 3, we see a similar pattern: the scaffolding seems to make no difference in OCLUE responses, while it seems to help somewhat for the students in the traditional course. In terms of differences between OCLUE and traditional, more students in OCLUE perceived they were assessed on their use of knowledge, while more students in the traditional course perceived they were assessed on their rote knowledge or discussed other features such as how exams covered course material (i.e., lectures, homework, etc.). Broadly comparing no scaffold vs. scaffold, I did not note a significant difference for this question (χ² = 5.210, df = 2, α = 0.05, p = 0.074). When the no scaffold vs. scaffold data is disaggregated according to course type, I did not find a statistical difference for the OCLUE data (χ² = 0.147, df = 2, α = 0.05, p = 0.929), but I did for the traditional cohort (χ² = 7.254, df = 2, α = 0.05, p = 0.027, Cramer’s V = 0.129). That is, similar to question #2, the scaffolding seemed to help the traditional students clarify their responses more than the OCLUE students, but with a small effect size. When comparing OCLUE and traditional, there was a statistically significant result for the non-scaffolded (χ² = 19.403, df = 2, α = 0.05, p < 0.001, Cramer’s V = 0.211) and scaffolded responses (χ² = 16.706, df = 2, α = 0.05, p < 0.001, Cramer’s V = 0.194). However, both of these results yielded a small effect size.

Comparing CEM 251 and CEM 252 OCLUE Perceptions

The data from CEM 251 (Fall 2021) and CEM 252 (Spring 2021) offered an opportunity to compare student perceptions in OCLUE across the entire organic chemistry sequence. Data from OCLUE students in CEM 251 and CEM 252 were compared. It’s important to note that only the “No Scaffold” data from CEM 251 was used for this comparison because these students received the same questions as the CEM 252 students.

Figure 6.19. Superordinate themes showing CEM 251 and CEM 252 comparisons for question #1

For question #1, there were no statistical differences between CEM 251 and CEM 252 (χ² = 0.102, df = 2, α = 0.05, p = 0.950). These students received the same questions, both courses were informed by three-dimensional learning (they were part of the OCLUE sequence), and both were conducted online. That is, for this question, the data suggests that there was no shift in perceptions over the course sequence.

Figure 6.20. Superordinate themes showing CEM 251 and CEM 252 comparisons for question #2

Unlike with question #1, there were differences in question #2 (χ² = 52.914, df = 2, α = 0.05, p < 0.001, Cramer’s V = 0.335), with close to a moderate effect size. Differences were largely driven by “Use of Knowledge”, “Rote Knowledge”, and “Other”, with comparable results for “New and Contextual”. Historically, question #2 has been influenced by course content, with many students often discussing specific topics as being more difficult. For example, in CEM 252 many students perceived “synthesis” to be the most difficult aspect of the course; considering that synthesis problems are more pronounced in CEM 252 than in CEM 251, this is not too surprising.
Figure 6.21. Superordinate themes showing CEM 251 and CEM 252 comparisons for question #3

Using α = 0.05, the results for the third question were found to not be significant (χ² = 5.735, df = 2, p = 0.057, Cramer’s V = 0.110). That is, there wasn’t any statistical difference noted between CEM 251 and CEM 252 students in terms of what they perceived they were assessed on, potentially indicating no shift in student perceptions over the sequence with regard to this question.

The Culture of Learning in CEM 252 and CEM 251

Overall, the scaffolding may have helped slightly, though more work needs to be done here. In the cases where it seemed to help, it was mostly for the traditional students. Similar to the CEM 252 study presented in Chapter IV, I noted statistically significant differences between OCLUE and traditional, albeit with smaller effect sizes. In many cases, students still used vague or generalized language, even when the prompt explicitly stated not to use such terms. That said, in question 3, the scaffolding did seem to lower the number of responses in the “Other” category. However, I would recommend caution with these preliminary results. For example, previous scaffolding attempts have found that asking too many scaffolding questions may confuse students, and given the open-ended nature of these questions, it would be difficult to completely curtail all vague, generalized responses. As of right now, scaffolding has not proven to be as impactful as expected. In the cases where it did help, primarily with students in the traditional course, the effect size was small.

With regard to the culture of learning, I arrived at conclusions similar to previous studies: OCLUE seemingly sends the message to students that the application and use of knowledge is necessary in practice, and more students in the traditional course perceive that rote knowledge is expected and assessed when compared to OCLUE (Bowen et al., 2022; Reinholz & Apkarian, 2018; Schein & Schein, 2016). However, it’s important to point out that the differences between these two types of courses are not as strong as those I saw with CEM 252 in the original study, despite the differences between the courses.

Final Remarks

Overall, the Spring 2021 CEM 252 data did not show us anything new, particularly regarding student perceptions in transformed courses. This further highlights the reliability of the perception questions. For the Fall 2021 CEM 251 cohort, the scaffolding seemed to help slightly, and sometimes in different ways than anticipated. For question #1 (expectations of thinking), the scaffolding decreased the number of responses in “Generalities” overall by a small amount. For question #2, the scaffolding seems to have helped curb the number of “Generalities” responses, but it may have also encouraged students to list specific topics as examples, leading to the steep increase in that category after scaffolding. For question #3, I did not see a decrease in the categories we scaffolded around (and for some groups responses even increased), but we did see a decrease in the “Course Materials and Aspects” category. The data does suggest that students in CEM 251 and CEM 252 OCLUE perceive, at similar rates, that they are expected to use their knowledge and are assessed accordingly.
However, more work needs to be done to explore this finding and to better understand how the entire OCLUE sequence impacts student perceptions.

The project began with a focus on minimizing prompting (and, to an extent, that is still the focus). Of course, with open-ended questions and minimal prompting, one should plan to receive responses that span the spectrum. With this study I have attempted to curtail the number of irrelevant responses with some scaffolding. Given the open-ended nature of the questions, it would be very difficult to provide enough scaffolding to curtail all vague and irrelevant categories. It may be the case that if you scaffold for one, you will see a shift toward other irrelevant categories, rather than toward categories that provide us with useful information about the course’s culture of learning. For example, if I scaffold to curtail responses about “critical thinking”, I may see more responses discussing course materials without much explanation as to what those course materials have students doing. That is, the response does shift, but it shifts to a different vague, generalized response.

REFERENCES

1. beSocratic. (2020). Home page. https://besocratic.com/home
2. Bowen, R. S., Flaherty, A. A., & Cooper, M. M. (2022). Investigating student perceptions of transformational intent and classroom culture in organic chemistry courses. Chemistry Education Research and Practice. https://doi.org/10.1039/D2RP00010E
3. Bryfczynski, S. P. (2010). BeSocratic: An Intelligent Tutoring System for the Recognition, Evaluation, and Analysis of Free-Form Student Input.
4. Charmaz, K. (2006). Constructing Grounded Theory: A Practical Guide through Qualitative Analysis. Sage.
5. Corbin, J., & Strauss, A. (2015). Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory (4th ed.). Sage.
6. Reinholz, D. L., & Apkarian, N. (2018). Four frames for systemic change in STEM departments. International Journal of STEM Education, 5(1), 1–22. https://doi.org/10.1186/s40594-018-0103-x
7. Schein, E. H., & Schein, P. A. (2016). Organizational Culture and Leadership (5th ed.). Jossey-Bass.
8. Thomas, D. R. (2006). A general inductive approach for analyzing qualitative evaluation data. American Journal of Evaluation, 27(2), 237–246.

CHAPTER VII: STUDENT PERCEPTIONS OF “CRITICAL THINKING”: INSIGHTS INTO CLARIFYING AN AMORPHOUS CONSTRUCT

Preface

Throughout the perceptions studies of transformational intent and classroom cultures (detailed in previous chapters), I consistently found that a portion of the students were using vague and generalized language to describe their experiences in their courses. One of the most common vague responses centered on the mention of “critical thinking”. In some cases, students provided examples that would help me determine how to classify “critical thinking”, but not always. In the cases where they did provide examples, I noted that how they talked about “critical thinking” was not always the same, despite the assumption that it had a universal definition. Therefore, I conceptualized this study to generate further insights into student perceptions of “critical thinking” and to proffer a way forward on clarifying an amorphous term commonly used in education and society. This paper was originally published in Chemistry Education Research and Practice and is reprinted here with permission: Bowen, R. S. (2022).
Student Perceptions of “Critical Thinking”: Insights into Clarifying an Amorphous Construct. Chemistry Education Research and Practice. Advance Article. Copyright 2022 The Royal Society of Chemistry. A copy of permissions is included in the Appendix alongside Supplemental Information for this manuscript.

Introduction

Within chemistry education, it is not uncommon to see the term "critical thinking" in research articles, course assignments, and learning goals (Bernardi and Pazinato, 2022; Dixson et al., 2022; Hunter and Kovarik, 2022) or to hear it mentioned in conversations about pedagogy. The prevalence of "critical thinking" implies a particular value and importance, and the term's definition is seemingly taken for granted and assumed to be universally understood in many contexts. However, a common definition of what the construct means to chemistry education has not been agreed upon, with some advocating that we stop using the term entirely (Cooper, 2016; Stowe and Cooper, 2017). Scholars have certainly attempted to operationalize "critical thinking" (Facione, 1990); however, these efforts lacked discipline-based education research (DBER) perspectives and, in some cases, relied on other amorphous terms, such as "problem-solving" and "inquiry", which were poorly defined and did not explicitly detail what students must know and do (Rickert, 1967; Charen, 1970; Byrne and Johnstone, 1987; National Research Council, 2012b; Gupta et al., 2015; Cooper, 2016; Weaver et al., 2016; Stowe and Cooper, 2017).

In previous work by my colleagues and me, we found that students often used terms like "critical thinking" to describe their learning experiences in organic chemistry courses. However, for many of these responses, it was unclear what students were actually doing when they were "thinking critically" or how they would define the construct. At this point, I began to conceptualize this study and turned to the literature to generate insights on what "critical thinking" meant in the context of learning chemistry. Early attempts to operationalize this way of thinking have been attributed to Edward Glaser (Glaser, 1941; George, 1967; Abrami et al., 2015), where Glaser's definition leveraged the application of knowledge and the necessary affective dispositions to think "critically" (Glaser, 1941). Though other sources have situated "critical thinking" as the general application of knowledge like Glaser (Dunning, 1954; Gupta et al., 2015; Barron et al., 2021), other definitions diverged by including facets such as interpretation (Dunning, 1954), synthesis and/or evaluation (Smith, 1963; George, 1967, 1968; Oliver-Hoyo, 2003; Gupta et al., 2015; Forawi, 2016), metacognition (Kuhn, 1999; Tsai, 2001), literacy (Paul and Elder, 2011; Vieira et al., 2011), being an inherently "scientific" way of thinking (George, 1967), requiring consistent practice (Kogut, 1996; Oliver-Hoyo, 2003), leveraging practices such as argumentation or asking questions (Siegel, 1989; Osborne et al., 2004; Crenshaw et al., 2011; Mulnix, 2012; Hand et al., 2018), and the general approach taken when people are unsure of what to do next (Dunning, 1956; George, 1967; Vieira et al., 2011), amongst others. This lack of coherence within research on "critical thinking" in science education has been criticized, with Crenshaw and colleagues noting that "there are nearly as many definitions of critical thinking as there are publications on the topic" (Bailin, 2002; Crenshaw et al., 2011).
In 1990, Facione published the "Delphi Report", which was a major attempt at operationalizing "critical thinking" using a panel of 46 experts on the construct (Facione, 1990). Within the "Delphi Report", "critical thinking" was defined as "purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which that judgment is based... The ideal critical thinker is habitually inquisitive, well-informed, trustful of reason, open-minded, flexible, fair-minded in evaluation, honest in facing personal biases, prudent in making judgments, willing to reconsider, clear about issues, orderly in complex matters, diligent in seeking relevant information, reasonable in the selection of criteria, focused in inquiry, and persistent in seeking results which are as precise as the subject and the circumstances of inquiry permit" (Facione, 1990). Though the definition from the "Delphi Report" has been leveraged in chemistry education (Danczak et al., 2017, 2020), it's important to note that past and present conceptualizations of "critical thinking" have not always aligned with this definition and that the panel did not reach a full consensus. Furthermore, the panel of experts used in the report consisted primarily of philosophers and did not include discipline-based education experts at the time.

Aside from defining "critical thinking", research has also wrestled with whether the construct involves general skills that could transfer between domains (Ennis, 1962; Charen, 1970; Lau, 2011) or discipline-specific skills that are dependent on content knowledge (Siegel, 1989; Mulnix, 2012). In 2012, the National Academies considered "critical thinking" to be part of the 21st century competencies that were necessary for "deeper learning", which entailed transferring knowledge from one situation to another. However, this report from the National Academies concluded that common definitions had not been established for constructs like "critical thinking" and acknowledged that "research to date provides little guidance about how to help learners aggregate transferable competencies across disciplines" (National Research Council, 2012b).

The amorphous nature of "critical thinking" creates major problems for measuring and promoting whatever it is. Despite its nebulous nature, various instruments have been published to assess "critical thinking", all of which operate from their own conceptualization of the construct, indicating that one instrument may not be appropriate for all contexts (Ennis, 1962; Watson and Glaser, 1964; Wright and Forawi, 2000; Banning, 2006; Forawi, 2016; Danczak et al., 2020; Insight Assessment, 2020; The Critical Thinking Co., 2021; Assessment Day Ltd.). Furthermore, some have sought to explore strategies and develop frameworks to help promote and develop "thinking critically" (Abrami et al., 2015; Duncan et al., 2018). In their meta-analysis of "critical thinking" strategies, Abrami and colleagues concluded that individual practice, discussion, real-world examples, and mentoring could all be helpful for developing "critical thinking" (Abrami et al., 2015). Abrami and colleagues were careful to define "critical thinking" in their meta-analysis, but this very care implies that much of our research on "critical thinking" operates from conceptualizations of the construct that may differ from one another.
Despite these divergent perspectives on a seemingly important construct, there has been some overlap amongst definitions. For example, the application and use of knowledge (Glaser, 1941; Dunning, 1954; Gupta et al., 2015; Barron et al., 2021), the contrast of "critical thinking" to rote memorization (Dunning, 1954; George, 1967, 1968; Rickert, 1967; Facione, 1990; Tsai, 2001; Mulnix, 2012; Santos, 2017), and the idea that "critical thinking" must be explicitly taught to students (George, 1967, 1968; Rickert, 1967; Byrne and Johnstone, 1987; Facione, 1990; Barak et al., 2007; Vieira et al., 2011; Mulnix, 2012) are all notable commonalities. However, even these commonalities may be conceptualized differently across studies and perspectives. As noted earlier, given the divergence in what "critical thinking" is understood to be, researchers in chemistry education have advocated for abolishing the term altogether and being more explicit about what we want students to know and do (Cooper, 2016; Stowe and Cooper, 2017). These authors have also suggested that the scientific practices in A Framework for K-12 Science Education (National Research Council, 2012a) and three-dimensional learning (3DL) (3DL4US, n.d.) could act as the component parts of "critical thinking" (Cooper, 2016; Stowe and Cooper, 2017). However, this particular stance implies that systematic curricular and pedagogical overhaul may be necessary to have longitudinal impacts on how students approach learning, due to the need for consistent practice (Kogut, 1996; Oliver-Hoyo, 2003). Certainly, other scholars' definitions of "critical thinking" have relied on various scientific practices covered in the Framework, such as argumentation or asking questions (Siegel, 1989; Osborne et al., 2004; Crenshaw et al., 2011; Mulnix, 2012; Hand et al., 2018).

In our previous research on student perceptions of transformational intent and classroom cultures in organic chemistry, my colleagues and I noted that many students would use the term "critical thinking" to describe their experiences. However, these responses were often vague and unclear as to what students were doing when they engaged in this way of thinking, highlighting that the term had a taken-for-granted meaning amongst students (Bowen et al., 2022). In a related study on student perceptions of "critical thinking", Danczak and colleagues found that there was limited agreement amongst undergraduates, teaching assistants, teaching faculty, and chemical industry employers' definitions of "critical thinking" (Danczak et al., 2017), further highlighting the amorphous nature of the construct amongst different groups in chemistry.

With this in mind, I became more interested in what students perceived "critical thinking" to be and what it might entail. My aim for this study was to extend the literature base on student perceptions of this amorphous, yet pervasive, construct in science education (such as the work done by Danczak et al., 2017). I wanted to go beyond how students defined "critical thinking" and probe the experiences and factors that informed their understanding and use of this way of thinking. To assist in this endeavor, I employed a constructivist grounded theory approach and used semi-structured interviews with students across three different organic chemistry courses at a large, research-intensive Midwestern university in the United States.
My rationale for choosing students in organic chemistry was to align with our previous work in these courses, where students often used the term to describe their experiences (Bowen et al., 2022). My research questions for this study were as follows:

1. What are the commonalities across student perceptions of "critical thinking"?
2. What insights do student perceptions of "critical thinking" offer to help clarify the construct in instruction?

Theoretical Framework

Caution has been advised when using theoretical frameworks with grounded theory studies since they may interfere with the development of theory (Corbin and Strauss, 2015). However, other scholars have asserted that researchers are not “blank slates” and are thus influenced (and informed) by previous work and training (Charmaz, 2006; Timonen et al., 2018). As will be described in more detail later, I am adopting a constructivist grounded theory approach, which acknowledges the importance of theoretical frameworks in the context of the study (Charmaz, 2006). From the constructivist grounded theory perspective, the theoretical framework provides a lens through which to understand how the researcher interpreted and situated the data, and for this study I was informed by sociocultural perspectives (Vygotsky, 1978a; Rogoff, 1990; John-Steiner and Mahn, 1996; Lemke, 2001; Mantero, 2002; Zotos et al., 2020; Bowen et al., 2022).

Sociocultural perspectives have recently been employed in chemistry education research to explore graduate teaching assistants’ teaching identity (Zotos et al., 2020), how students identify the significance of course material they are learning through writing (Petterson et al., 2022), and how students perceive their courses’ learning cultures (Bowen et al., 2022). These perspectives emphasize the significance of social interactions and contextual factors in how people think, talk, and act. Within the context of this study, I work from the assumption that student conceptualizations of “critical thinking” have largely been informed by their sociocultural experiences. That is, students come to understand “critical thinking” via social interactions with instructors, peers, and family and via the ways of thinking and doing supported by their learning environments (Vygotsky, 1978b; Rogoff, 1990). Therefore, students’ understanding of “critical thinking” may be the product of socialization, where their understanding has been heavily shaped and informed by others who have promoted a certain way of conceptualizing “critical thinking” (Rogoff, 1990; Miller and Goodnow, 1995; Lemke, 2001; Gutiérrez and Rogoff, 2003; Nasir and Hand, 2006; Calabrese Barton et al., 2008). With the adoption of this sociocultural view, I believe “critical thinking” is informed by the expectations and norms of the learning environment (Becker et al., 2013; Chang and Song, 2016; Bowen et al., 2022). My aim was to use this theoretical perspective in conjunction with my methodological approach to identify emergent themes that not only provide insight into what students think “critical thinking” is, but also where they source this understanding, how they see it developing, and what factors are influential for encouraging them to engage in it.
Methodology

Course Contexts

I focused on organic chemistry students in this study for three reasons: 1) previous work related to this study was conducted alongside the same students (Bowen et al., 2022; Flaherty, 2020b), 2) the students were more accessible to me, and 3) organic chemistry has been situated as providing students with generalizable “critical thinking skills” (Dixson et al., 2022). Two types of organic chemistry courses were used in this study: one that had been transformed using three-dimensional learning (OCLUE) (National Research Council, 2012a; 3DL4US, n.d.) and a more traditional organic chemistry experience. These courses were taught during the Fall 2021 semester at a large, research-intensive, Midwestern university in the United States. Both courses involve a lecture and recitation component. The recitations meet once a week for 50 minutes and are a chance for groups of 30 students to meet in a smaller class environment to work on practice problems for the course under the guidance of a graduate teaching assistant in the department. Both courses were considered large lecture classes with 200-300 students. Due to the COVID-19 pandemic, both courses were offered online via Zoom.

I invited students in organic chemistry courses taught by three different professors. Two of the professors were teaching the transformed curriculum, Organic Chemistry, Life, the Universe, and Everything (OCLUE), which has been previously discussed elsewhere, and the third professor taught a more traditional organic chemistry course (Cooper et al., 2019; Bowen et al., 2022). OCLUE is informed by A Framework for K-12 Science Education (National Research Council, 2012a) and leverages three-dimensional learning (3DL4US, n.d.). In the context of this transformed curriculum, students are often asked to engage in scientific practices in the context of fundamental, core ideas in chemistry (Cooper et al., 2017). Students in OCLUE are frequently encouraged to engage in causal mechanistic reasoning (Crandell et al., 2019, 2020) and construct explanations (Houchlei et al., 2021), among other practices outlined in the Framework. On the other hand, the traditional course is typically organized by functional group rather than core ideas, and students are not often required to provide reasoning or engage in scientific practices. In a previous analysis of assessments between these two courses, it was found that the traditional course had more questions that could be answered via recall (Stowe and Cooper, 2017). Previous research from my colleagues and me in these two types of organic chemistry courses also found that more students in OCLUE perceived they were expected to use their knowledge throughout the course, while more students in the traditional course perceived they were expected to rely on rote memorization and knowledge (Bowen et al., 2022). However, in both types of courses, students used terms like “critical thinking” to describe their experience, which influenced me to initiate this study.

Participants

There were 14 interview participants, and all were informed of their rights as research participants in accordance with the IRB at my institution. Participants signed a consent form to explicitly communicate their acknowledgement of the study and consent to being audio recorded. All students were given the option to choose their own pseudonym or have one generated for them randomly.
Participants ranged from freshmen to returning students who had previously completed degrees at the same institution, and the students were majoring in a variety of disciplines, with most participants being interested in health-related professions. Four of the participants were enrolled in the traditional organic chemistry course, while the remaining students were enrolled in the transformed course (OCLUE) described above. Participant information is included below in Table 7.1. In order to collect a range of perspectives, I opted to recruit participants from OCLUE and traditional organic chemistry courses considering these courses had differing approaches to instruction. Furthermore, this project was motivated by a previous comparative analysis between these two courses (Bowen et al., 2022); thus, I found it important to incorporate perspectives from both learning environments here.

Table 7.1. Participant information

Professor 1 (Traditional):
• Damien: Senior, Earth and Environmental Science
• Clara: Sophomore, Human Biology
• Whitefox: Freshman, Physiology (Pre-Med)
• Milo: Senior, Kinesiology (Pre-PA)

Professor 2 (OCLUE):
• Reina: Freshman, Human Biology and Psychology (Pre-Med)
• Amanda: Sophomore, Human Biology (Pre-Med)
• Noelle: Sophomore, Human Biology (Pre-Med)
• Virginia: Sophomore, Biochemistry and Molecular Biology (Pre-Dental)
• Ember: Sophomore, Biochemistry (Pre-Med)
• Rebeka: Sophomore, Microbiology and Molecular Genomics
• Arisha: Junior, Interdisciplinary Studies of Social Science (Pre-Med)

Professor 3 (OCLUE):
• Adelynn: Sophomore, Physiology
• Hailee: Returning student (previously graduated with a Kinesiology degree), Pre-PA
• Ella: Sophomore, Secondary Education – Chemistry

Interview Question Development

To align with my methodological approach, my interview protocol structure was informed by Charmaz (Charmaz, 2006). The interview questions were designed to be open-ended, elucidate how students define “critical thinking”, highlight how and why they define “critical thinking” in that way, and generate insights into how students use (or don’t use) “critical thinking”. Early questions in the interview were influenced by my previous work on student perceptions of organic chemistry classroom cultures (Bowen et al., 2022). The full interview protocol, including the questions asked during the semi-structured interviews, is included as supplementary material. Upon drafting the questions, I discussed and refined the initial protocol with my colleagues. To ensure my questions would be understood by my target population (undergraduate students), I conducted three pilot interviews with undergraduates who had previously passed organic chemistry at the institution of my study. The undergraduates were encouraged to provide feedback on the questions, which was used to refine the protocol further.

Data Collection

Students were contacted toward the end of the Fall 2021 semester to determine if they were interested in participating in interviews for extra credit in their courses, which is common practice in my department. I received many requests for interviews, and as students volunteered, their names and information were added to an Excel spreadsheet. Given that I had access to course performance data for professors 2 and 3 (the OCLUE professors), I stratified students according to their course performance based on whether they had a 4.0, 3.5, 3.0, 2.5, or 1.5-2.0 in the course at the time of data collection. Then, one student was randomly selected from each stratum.
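To make this selection procedure concrete, the following is a minimal sketch of stratified random selection; the volunteer names and grouping are hypothetical placeholders, not the actual recruitment data:

```python
# Minimal sketch of the stratified random selection described above.
# Volunteer names are hypothetical placeholders, not actual participants.
import random

# Volunteers grouped by their course grade at the time of data collection.
volunteers_by_stratum = {
    "4.0":     ["Volunteer A", "Volunteer B", "Volunteer C"],
    "3.5":     ["Volunteer D", "Volunteer E"],
    "3.0":     ["Volunteer F", "Volunteer G"],
    "2.5":     ["Volunteer H"],
    "1.5-2.0": ["Volunteer I", "Volunteer J"],
}

random.seed(2021)  # seeded only so the illustration is reproducible

# Draw one volunteer at random from each performance stratum.
selected = {stratum: random.choice(names)
            for stratum, names in volunteers_by_stratum.items()}
print(selected)
```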
Since course performance data was not available for students in the traditional course (with professor 1), I opted to randomly select from the list of volunteers. To be fair, students who were not chosen, alongside students who did not volunteer, were given the option of completing another assignment for extra credit. Students were notified of being selected, and an interview time was scheduled.

Students were e-mailed a welcome document which included some of the major questions covered in the interview (this document is included as supplementary information). Interviews were semi-structured and conducted online via Zoom, with each interview lasting between 30-75 minutes. Interviews were recorded using the Zoom recording feature with live transcriptions to assist with the transcription process later on. At the end of each interview, participants were given the option to choose a pseudonym, e-mail one, or have one generated for them. After each interview, I reflected and took notes, detailing any questions or ideas that came up during the conversation. Most interviews were transcribed immediately after conducting them. Each interview underwent two rounds of transcription. First, I listened to each interview with the Zoom transcriptions and corrected them. Then, I listened to the audio once again with the transcript in hand, checking it for accuracy and making edits when necessary. Once a master transcript of each interview was made, the transcripts were further deidentified by redacting instances where participants referred to their specific professor or to themselves by name.

In grounded theory, the use of theoretical sampling is considered an important component of the method (Charmaz, 2006; Corbin and Strauss, 2015; Timonen et al., 2018); however, grounded theory articles have been published without the use of theoretical sampling across education more broadly, including in this journal (Randles and Overton, 2015; Dunn et al., 2019; Barron et al., 2021; Flaherty, 2020b). Given the timeframe of the study, the online nature of the course, and logistical issues, I was unable to engage in theoretical sampling as traditionally defined, and instead I relied more on convenience and random sampling.

Constructivist Grounded Theory

Grounded theory was co-developed by Barney Glaser and Anselm Strauss (Glaser and Strauss, 1967), and it has been further extended and developed into various schools of thought (Glaser, 1978; Charmaz, 2006; Corbin and Strauss, 2015); that is, there is not one approach to grounded theory. Though there are some commonalities between these different types of grounded theory (Timonen et al., 2018), I opted to use Charmaz’s constructivist grounded theory (CGT) for the purposes of this study (Charmaz, 2006). CGT was chosen as the methodology because it offered an approach to explore open-ended data while minimizing biases and assumptions, and it acknowledged the role of the researcher in the research process. That is, CGT acknowledges that the themes and the subsequent theory or framework developed by the researcher are co-constructed alongside the participants. Furthermore, there was a precedent for using Charmaz’s CGT to explore student perceptions in the context of the transformed organic chemistry course used in this study (Flaherty, 2020b). At its core, CGT can be viewed as principles and practices for engaging in qualitative work (Charmaz, 2006).
The methodology is a highly inductive approach, meaning that all concepts and ideas described emerge directly out of the data (Charmaz, 2006; Corbin and Strauss, 2015; Creswell and Poth, 2017). Like some other types of qualitative methods, constructivist grounded theory acknowledges that “objectivity” is largely unobtainable, and instead researchers can rely on their personal experiences to help make sense of the ideas emerging from the study (Charmaz, 2006; Corbin and Strauss, 2015). Although it is often said that themes “emerge” from qualitative data (language that I use here), to me this means that themes were derived via inductive methods; it is not meant to imply that the themes were independent of my perspectives and thoughts. That is, the themes I interpreted were generated by me in an inductive manner. Though some may argue that grounded theory must lead to a strong, well-supported theory, recent conceptualizations have instead argued that sometimes CGT leads to a conceptual, or analytical, framework which may be less comprehensive than a theory but still productive for making sense of the data (Timonen et al., 2018).

Data Analysis

All data was analyzed within the MAXQDA 2022 software (VERBI GmbH, 2022), and the data underwent multiple stages of coding in accordance with constructivist grounded theory (Charmaz, 2006; Corbin and Strauss, 2015; Creswell and Poth, 2017). Like other types of grounded theory, Charmaz relies on use of the constant comparison method and memo-writing throughout coding (Glaser and Strauss, 1967; Charmaz, 2006; Corbin and Strauss, 2015). I began by engaging in a process of open coding where I went line-by-line through the interview transcripts, writing down thoughts, notes, identified assumptions, questions, and potential initial categories; many memos and notes were made during this stage to guide analysis. Next, I engaged in initial coding, which involved line-by-line reading once again, but this time I used my notes from the open coding stage to further analyze for meaning. During the initial coding stage, I began to assign initial categories to coded data. These initial codes were largely descriptive, with initial attempts at locating meaning within the data. Once all available interview transcripts had undergone open and initial coding, I engaged in a focused and axial coding stage. Here, I identified codes that were relevant to the research questions (though I kept note of the other codes in case anything changed). These relevant codes were then analyzed further to develop larger and more encompassing emergent themes. At this point, the initial codes were combined, modified, or re-coded, if necessary. The final stage of coding is known as theoretical coding and is the stage that develops the theory or analytical framework of the study. This stage involves analyzing and establishing relationships between the themes and categories captured in earlier stages of coding (Timonen et al., 2018); the central idea that ties these relationships together is often referred to as the core category (Charmaz, 2006; Corbin and Strauss, 2015; Flaherty, 2020b). Once relationships were identified, a final write-up of the categories, themes, and the relationships between them was completed to establish the analytical framework that arose inductively from the data in this study on how students perceive “critical thinking”.
As I will show later, this core category is explained narratively and woven into the previous literature on what “critical thinking” means and entails, with remarks on how to move forward with “critical thinking” in chemistry education.

Reliability and Validity

While addressing reliability and validity in any study is important, it is more so here given that I am the sole author of this qualitative work (Merriam and Tisdell, 2016; Creswell and Poth, 2017). The underpinnings of constructivist grounded theory as a method are clear that the research process involves a co-construction of findings (i.e., interpretations) between the participants and the researcher. That is, the findings here are unique to the participants and me, considering that our perspectives and assumptions were interacting throughout the research process and are represented within this final write-up (Charmaz, 2006; Flaherty, 2020b). To offer some credibility to my interpretations, however, the findings presented here have also been noted in the literature previously discussed, offering a sort of theoretical triangulation (Merriam and Tisdell, 2016). To assist with this, I have opted to give a detailed methodological section to clarify my approach. However, just like with all qualitative work, the goal here was not to generate generalizable findings (in the statistical sense). Instead, the findings and subsequent discussion are meant to be transferable. That is, components (and in some cases, all) of my interpretations may transfer to different contexts and participants and should be seen as potential perspectives that may be operating in other spaces.

Reflexivity is an important part of qualitative work, and I engaged with it through the use of memo writing, which is also an important procedure in the grounded theory methodology. In the spirit of transparency, upon initial generation of the themes presented here, my thought was that I had essentially re-affirmed pieces of what was already known or theorized about the construct of “critical thinking” (as evidenced in the literature review). However, after discussions with other scholars and additional reflection, I concluded that not only had I extended the empirical evidence base of what people perceive about “critical thinking” but I had also established additional evidence that supports a way forward for such a nebulous construct. I have attempted to include thick descriptions of my methodological approach and as many quotes from the interviews as possible to allow readers to better understand my interpretations (Merriam and Tisdell, 2016). Finally, the peer review process offers a powerful way of supporting this work. Although I conducted this study by myself, it has been assessed by the expertise of my colleagues as part of its inclusion in this journal.

Results and Discussion

My colleagues have previously highlighted how “critical thinking” has been defined in various ways. The amorphous nature of this construct was also noted in this study, with half of the participants (seven out of fourteen) stating that there was not one definition of “critical thinking” while others struggled to formally define it.
Though previous work has explored student conceptualizations of “critical thinking” (Danczak et al., 2017), the goal of this study was to extend the literature base on student perceptions of the construct via qualitative methods that sought to go beyond how students defined the construct (though that is important and necessary to know) and ultimately generate insights on how students arrived at this understanding of “critical thinking” and how they motivated themselves to engage in it. As previously mentioned, this work was influenced by our previously published work (Bowen et al., 2022). Given the nebulous nature of “critical thinking”, my methodological approach (which focused on looking for common themes), and my initial interpretations at the start of data analysis, I found it more productive to focus on the commonalities of “critical thinking” shared amongst students in the sample. However, it’s important to note that student perceptions of “critical thinking” in this study were not identical; rather, the commonalities between them may offer an analytical and practical handle that can be leveraged to better understand how students conceptualize similar constructs and how we can support student learning.

From my analysis, I synthesized the findings into four major themes which sought to detail student perceptions of what “critical thinking” is (theme #1), what it is not (theme #2), the perceived origin of their conceptualization and how it develops (theme #3), and the motivational factors that encourage students to engage in it (theme #4). All four themes and associated subcategories are included in Table 7.2. In accordance with constructivist grounded theory, it is imperative to acknowledge that these themes were developed by me, and that someone else could have developed different themes from the same set of data (hence the influence of sociocultural factors). My approach was inductive, and my findings were based on how pervasive these themes were across student perspectives. That is, I wanted to highlight what I interpreted as the most salient factors. Prior to discussing these themes, I believe it important to mention some findings that highlight commonalities across students but were not explored in enough detail to warrant incorporation into a theme. I refer to these as minor findings.
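As the note beneath Table 7.2 explains, a participant can contribute to several subcategories within a single theme, so subcategory counts do not sum to 14. The following is a minimal sketch of that counting convention; the participant-to-subcategory mapping is a hypothetical placeholder, not the study’s actual coding:

```python
# Minimal sketch of how participant counts per subcategory can be tallied.
# The mapping below is a hypothetical placeholder, not the study's coding.
from collections import Counter

# Each participant may have coded segments in several subcategories.
coded_segments = {
    "Participant 1": ["applying and using knowledge",
                      "reasoning and understanding why"],
    "Participant 2": ["applying and using knowledge"],
    "Participant 3": ["building up and connecting knowledge",
                      "applying and using knowledge"],
}

# Count each participant at most once per subcategory.
counts = Counter(sub for subs in coded_segments.values() for sub in set(subs))
for subcategory, n in counts.most_common():
    print(f"{subcategory}: {n}")
```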
Table 7.2. Themes and subcategories (number of participants with responses in each theme or subcategory)

Theme #1: “Critical thinking” involves the application and use of knowledge: 14 (all participants)
• Applying and using knowledge: 11
• Building up and connecting knowledge: 10
• Reasoning and understanding why: 8
• Synthesizing knowledge and noting patterns: 1
• Analyzing questions: 2

Theme #2: “Critical thinking” is contrasted against more passive approaches to learning: 14 (all participants)
• Using rote memorization: 14 (all participants)
• Knowing the answer prior to practicing: 2
• Reading over notes: 2

Theme #3: Prior experiences inform “critical thinking” which is further developed through practice: 14 (all participants)

Theme #3.1: Students’ conceptualization of “critical thinking” is primarily based on prior experiences: 14 (all participants)
• Previous academics: 14 (all participants)
• Family and friends: 3
• Work: 1
• Social experiences: 2

Theme #3.2: Practice is necessary for developing “critical thinking”: 14 (all participants)
• Practicing in general: 12
• Reflecting and learning from mistakes: 7

Theme #4: Intrinsic and extrinsic factors motivate “critical thinking”: 14 (all participants)

Theme #4.1: Intrinsic factors motivate “critical thinking”: 12
• Having interest and curiosity: 6
• Being challenged and disciplined: 6
• Wanting to work with and help others: 2
• Relating content to self: 3
• Setting goals: 1
• Helping with learning: 5

Theme #4.2: Extrinsic factors motivate “critical thinking”: 11
• Prompting from learning tasks: 5
• Grading: 10
• Experiencing social pressures: 1

NOTE: Students could have had multiple responses in a single theme, coded as different subcategories; hence, subcategory numbers do not total up to 14.

Minor Findings

In my analysis, I dubbed these findings minor because students often did not provide additional information or spoke generally about their experience. However, given my focus on common themes across student perceptions, I found them to be informative and related to the study. All students in this study (that is, all fourteen participants) perceived that “critical thinking” was significant and important for their learning and life, that they enjoyed engaging in “critical thinking” (with some situating it as a love-hate relationship), and that it was applicable across all courses (including those outside of science). This final minor finding was interesting given that, historically, some have situated “critical thinking” as inherently “scientific” (George, 1967). Furthermore, in a more recent study, Flaherty found that undergraduate students in OCLUE perceived that science students would be more curious and questioning than history students regarding argumentative claims (Flaherty, 2020b), potentially indicating that these students might perceive science majors as being more “critical” than students in non-science degree paths. Furthermore, despite the different curricular and pedagogical approaches in the organic chemistry courses in this study, all students perceived that they were using “critical thinking” in their organic chemistry courses at the time of the interview. Unsurprisingly, all students were familiar with the term “critical thinking” despite its nebulous nature. Relatedly, most participants (eleven out of fourteen) noted that “critical thinking” had been mentioned (and even expected of them) in the past but had never been made explicit to them.
These minor findings provide additional context to the themes and further support my decision to explore commonalities across student responses.

Theme #1: “Critical thinking” involves the application and use of knowledge

Theme #1 focused on capturing commonalities across student conceptualizations of “critical thinking”. Though students discussed “critical thinking” in various ways, a point of convergence was that all students situated “critical thinking” as applying and using knowledge in some way, a perspective that was also noted in the literature (Glaser, 1941; Dunning, 1954; Barron et al., 2021). This theme included several subcategories: 1) applying and using knowledge (in general); 2) reasoning and understanding why; 3) building up and connecting knowledge; 4) synthesizing knowledge and noting patterns; and 5) analyzing questions. The subcategories and the number of participants with a response coded in each subcategory are shown in Table 7.2. It’s important to note that the subcategories were not mutually exclusive; that is, a student’s conceptualization of “critical thinking” could span multiple subcategories depending on how they discussed the construct.

Applying and Using Knowledge (In General). The subcategory of “applying and using knowledge” captured responses that discussed “critical thinking” as involving the general application of material. In some cases, this entailed applying previously learned knowledge to unfamiliar problems or new situations, as noted in the quotes from Amanda and Virginia below.

“…you’re presented pieces of information or like concepts like being able to absorb that and apply it in a new given situation and being able to like work through like a problem with what you already know I guess, is how I would, specifically like applying it. I think that’s how I would define [critical thinking]…” (Amanda 85; OCLUE)

“…I said in my opinion [critical thinking is] learning something new and using that knowledge to apply to future concepts and ideas” (Virginia 21; OCLUE)

For both Amanda and Virginia, the application of concepts and ideas to “future” problems was important for their understanding of “critical thinking”. The perspectives of these students were also noted in the previously mentioned literature (Glaser, 1941; Dunning, 1954, 1956; George, 1967; Vieira et al., 2011; Gupta et al., 2015; Barron et al., 2021). In total, eleven out of fourteen students discussed “critical thinking” as applying and using knowledge in general.

Building Up and Connecting Knowledge. The “building up and connecting knowledge” subcategory captured responses that detailed how students used their knowledge to build up or draw relationships between concepts. Initially, the “building up and connecting knowledge” subcategory was a category all to itself. I opted to incorporate this subcategory into the theme because 1) in order to relate concepts together, one must use their knowledge, and 2) the subcategory consistently co-occurred with “applying and using knowledge (in general)” or other subcategories already subsumed in the theme. Hailee and Damien provide examples of how students talked about this subcategory.
“…with organic chemistry there, we have different principles and theories that, um, we build upon, and that's like our foundational understanding of certain concepts like… um, in recitation it was like Le Chatelier’s principle so there's always like facts and scientific evidence and theories that are, build upon what we learned in class and those kind of tie back into what we, the reactions we do” (Hailee 51; OCLUE)

“…so critical thinking, and in my opinion, is just utilizing all these building blocks that intro classes and intermediate classes prepare you for, to be able to get to these more advanced classes…” (Damien 47; traditional)

Both quotes above highlight how previously learned information, sometimes from another course (i.e., introductory courses), would need to be used for new problems in organic chemistry or upper-level courses. Therefore, to these students, connecting concepts together and building off of previous knowledge was important for engaging in “critical thinking” and was how they situated it. In total, ten out of fourteen participants had responses in this subcategory.

Reasoning and Understanding Why. Some participants were more specific about what they meant by applying and using knowledge. That is, they not only conceptualized “critical thinking” as involving the application and use of knowledge, but they saw it as involving reasoning and being used to understand why something happens. Arisha and Milo offered exemplar responses for this subcategory.

“…You can really like apply what's going on to like a situation like you're not just doing it, like you're actually… like, like, get what's going on, versus just like going with the flow like what someone's telling you is happening. Like you can see why it's happening” (Arisha 55-57; OCLUE)

“So, just in general, critical thinking to me is… instead of asking, like, answering what something is, it's how something is, why something is” (Milo 39; traditional)

I argue here that in order for students to engage in reasoning and parse out why a phenomenon happens, they must apply concepts and use their knowledge in some way. In fact, in many cases (Arisha’s response being one example), responses in this subcategory mentioned “application” and “reasoning” or “understanding why” together. In our previous work we noted a similar co-occurrence in student perceptions of what they were expected to do, with many responses associating the application of knowledge with understanding why (Bowen et al., 2022). In total, eight out of fourteen students had responses in this subcategory.

Synthesizing Knowledge and Noting Patterns and Analyzing Questions. The vast majority of responses related to theme #1 were contained in the other subcategories. However, there were other perspectives that I posit are related to seeing “critical thinking” as the application and use of knowledge. Adelynn discussed how they often synthesized knowledge across multiple assignments and learning tasks, a process which involved applying knowledge and using it to note patterns between related concepts.
“I think of it as like, okay, from all the lectures, and all the notes, and even like, the beSocratic homeworks, and recitations like, there's just like, a bunch of information, and I just tried to see like, what, you know, I try to like, put it all together and really see patterns amongst that and, um, like, the general overarching like, takeaways I can think of...” (Adelynn 11; OCLUE)

On the other hand, a couple of students also discussed the process of analyzing questions with the ultimate goal of figuring out which information needed to be applied, as Noelle notes:

“…I guess just like, um, like you’re given a problem, you need to analyze it, you know, so I guess, analyzing it first, and then thinking back on what you know that can be applied in order to find a solution for it.” (Noelle 43; OCLUE)

Although the number of responses in these two subcategories was much smaller than in the others, with only three students across both subcategories having coded segments assigned to them, I find them important to include considering they represent ways students conceptualized “critical thinking” as involving the application and use of knowledge.

Theme #2: “Critical thinking” is contrasted against more passive approaches to learning

Theme #2 focused on capturing commonalities in student perceptions of what “critical thinking” is not. Within the interview protocol, I explicitly encouraged participants to contrast “critical thinking” with other ways of doing to further clarify their perspective. The rationale for this was based on previous interview experiences which found that some participants had an easier time articulating what was not representative of an abstract construct, rather than what it was. With this group of students, all of the participants contrasted “critical thinking” against passive approaches to learning, particularly rote memorization, an idea also noted in the literature (Dunning, 1954; George, 1967, 1968; Rickert, 1967; Tsai, 2001; Santos, 2017). Similar to theme #1, theme #2 includes multiple subcategories: 1) using rote memorization; 2) knowing the answer prior to practicing; and 3) reading over notes. However, it is worth noting that, in the case of this theme, the “rote memorization” subcategory was far larger and more prevalent than the others. A breakdown of the number of participants with responses in each subcategory is included in Table 7.2.

Using Rote Memorization. As the name suggests, responses that contrasted the use of rote memorization with “critical thinking” were included in this subcategory. Examples from Clara and Rebeka are included below:

“Just the ones where you kind of like copy notes from a board and the next day or a week later, you're tested on exactly what you, you know, word by word from what you copied. I think that doesn't allow for critical thinking to happen” (Clara 49; traditional)

“…not critical thinking is straight memorization without asking like the who, what, why, when, like, um, not exploring like the ideas that came before just like taking the baseline facts…” (Rebeka 29; OCLUE)

As can be seen, Clara, a student in the traditional organic chemistry course, mentions situations where students are tested on how well they can regurgitate copied information as being the opposite of what “critical thinking” is. Similarly, Rebeka contrasts “critical thinking” to memorization and further expands on this by stating that “critical thinking” involves the “who, what, why, when…”.
All fourteen participants contrasted their conceptualization of “critical thinking” with rote memorization, and it was, by far, the dominant subcategory in this theme, indicating strong commonalities across student conceptualizations. However, it’s worth noting that some participants were quick to say that this did not mean memorization was entirely “bad”, a perception which I plan to explore in a different study. Knowing the Answer Prior to Practicing. The “knowing the answer prior to practicing” subcategory captured the few responses that described situations where the answer was already known prior to starting practice problems. The students who had responses in this subcategory talked about how knowing the answer prior to engaging in a problem may instill false confidence in students and “trick” students’ brains into thinking they understood the material. Adelynn and Hailee provide examples below: “…like trying to do the problem but just for like, you kind of expect the answer or you know it already, and I think there's like, not a true test of what you actually know” (Adelynn 47; OCLUE) “…I think it's in those moments like if we were to just kind of sit there and guess, and think, oh, I think I've seen that answer before where I just, you know, a really vague, um, principle, you know something and then you just click the answer and you just keep going and you don't really understand why you chose it but you get it right and you just kind of move on and not knowing the deeper meanings for the answers or why the processes work the way they do and the foundational level” (Hailee 31; OCLUE) From the perspective of these students, knowing or recognizing an answer to a problem can hinder a student from looking beyond the answer into understanding why the answer is correct. Only Adelynn and Hailee had responses allocated to this subcategory; it is therefore a small but important part of their perception of what is not “critical thinking”. Reading Over Notes. Passively reading over notes was identified as another direct contrast to “critical thinking” by a couple of students. Although the students who mentioned this acknowledged that reading over notes could be helpful for studying, they saw that passively reading the notes by itself was not effective for learning. Whitefox and Adelynn provide examples below: “…reading things it has like, it has neither the, you’re not like, spending enough time in it, and also like, you're not even like, I guess, mentally challenging yourself, and there's no critical thinking involved in it whatsoever.” (Whitefox 43; traditional) “Another example that might apply is like, reading over your notes. Like, you might think that like that information is getting into your brain more because you're like, reading over it, but I just don’t think that’s like, very critical thinking because it’s not like, taking stuff you know and trying like, to apply it to something that you don’t know the answer to yet.” (Adelynn 47; OCLUE) Earlier in the interview, Whitefox noted that “…I find rereading notes is one of the most like, inefficient in terms of like, yield for studying,” (Whitefox 27). This perspective, coupled with their quote above, highlights how Whitefox contrasts this passive approach with “critical thinking”. Similarly, Adelynn also noted that rereading notes was not effective and connected it back to how it does not encourage one to apply knowledge.
Similar to the previous subcategory, only two students (Whitefox and Adelynn) had responses in this subcategory. Theme #3: Prior experiences inform “critical thinking” which is further developed through practice Theme #3 captured how students came to understand what “critical thinking” meant and how they got “better” at it. This theme sought to extend the insights generated from previous themes by exploring how students perceived they arrived at the conceptualizations of “critical thinking” captured by themes #1 and #2. Given my theoretical framework, I found this to be an important inclusion in this study since students’ past sociocultural experiences with academics, work, family, and friends impacted how they conceptualized “critical thinking”. Theme #3.1: Students’ conceptualizations of “critical thinking” are based on their prior experiences. A breakdown of responses in this subtheme is included in Table 2. Students in this study relied on previous academic experiences, family, friends, work, and social experiences to conceptualize “critical thinking”. That is, there was little to no evidence amongst these students to suggest that their organic chemistry experience informed their “critical thinking”. For example, Damien credited their hydrogeology course with helping them develop “critical thinking”, while Reina provided a broader explanation and stated that different environments “[play] a role into how you think”: “…I would say that one, one of the classes that really allowed me to see how things kind of pieced together, um, and from a variety of disciplines was actually my hydrogeology course, um, of Fall 2019, and that was really cool to see how chemistry, physics, geology biology, all combined to impact, you know, subsurface movement of water, and, and, and different pollutants that could travel to various areas and how it impacts, you know, agriculture or forest land or, or your drinking water…” (Damien 155; traditional) “…everyone grows up in a different environment, the way in the way like, parents teach you, the way that you, um, interact with your friends when you're younger too, it all just plays a role into how you think, and I think that it can be very different for people…” (Reina 67; OCLUE) All students in the study relied on their prior experiences in the context of “critical thinking”. I only show two examples above due to space limitations, but students also mentioned their families (i.e., mirroring approaches of family members), social pressures (i.e., engaging in “critical thinking” in order to compete in a class), and work (i.e., working in a hospital) as informing their understanding of “critical thinking”. I found it interesting that students did not discuss their organic chemistry course in the context of developing their “critical thinking” given that scholars and instructors assert that organic chemistry develops these generalized “critical thinking” skills (Dixson et al., 2022). According to the students in this sample, they arrived at organic chemistry already having some conceptualization of what “critical thinking” was and how to do it; that is, they were engaging in a practice they had learned and been using previously. Theme #3.2: Practice is necessary for developing “critical thinking”. A breakdown of responses in this subtheme is shown in Table 2. This subtheme contained responses that described how “critical thinking” develops.
All students had at least one coded segment that discussed how “critical thinking” developed through consistent practice and/or learning from mistakes, as noted in the quotes below by Clara and Whitefox: “I would say the more that you really, and like, authentically immerse yourself in the content and the material. And the more that you kind of want to be in the classroom setting. And the more effort, like I said, that you put in, I think practicing and putting in the effort is a really big thing. I think the more that you do that, the more that you'll be rewarded and that reward comes from critical thinking” (Clara 95; traditional) “I think you have to like be in that field and keep applying that critical thinking for that field over and over again that helps you make, become quicker, forming connections between concepts and also simply because the more concepts you have the easier it is to form connections between them. So I think the more exposure, you have to that field, you’re going to be able to form, be able to, be able to critically think in that field” (Whitefox 79; traditional) The idea that practice is necessary for the development of “critical thinking” has been noted in the literature (Kogut, 1996; Oliver-Hoyo, 2003; Abrami et al., 2015). In some cases, students’ perceptions went further, such as when they discussed reflecting on responses and learning from their mistakes in the context of practicing. This highlights that students do perceive that one must consistently practice to get better at something, a theme which could be pedagogically useful. Theme #4: Intrinsic and extrinsic factors motivate “critical thinking” Up until this point, the themes have covered how students conceptualized “critical thinking”, where students source their understanding of the construct, and how they perceive it develops. However, I also wanted to generate some insights on what motivates students to engage in this way of thinking. Theme #4 captured a variety of motivating factors that students thought encouraged them to think “critically”. Theme #4.1: Intrinsic factors motivate “critical thinking”. A breakdown of responses in this subtheme is shown in Table 2. Intrinsic factors were those that students internally leveraged to get themselves to engage in “critical thinking” (according to their definition). As one can imagine, there were many different factors mentioned by students, including: 1) having interest and curiosity, 2) being challenged, 3) being disciplined, 4) wanting to work with and help others, 5) relating the content to their major and life, 6) setting goals, and 7) helping with learning. For example: “…I think when I critical think in my classes I learn more, and I understand things more. And so, I think if we want students to succeed, or you yourself as a student, you want to succeed, pushing yourself to critically think about that is something that’s going to help…” (Ella 75; OCLUE) In Ella’s quote, they note that by engaging in “critical thinking” they ultimately learn more, highlighting how that way of thinking supports their learning. On the other hand, Clara (below) notes that when they are working with a problem at the appropriate challenge level, they find it fun and engaging, and this is what helps them think “critically” about the content.
“So, a question where I feel like I have some of the pieces, but I need to find the, the other ones, those types of questions I really like to critically think about because I feel like I have all the tools needed and I just have to kind of set it up. So that, that becomes fun. I, I want to say like when it’s a question that is the right amount of difficulty…” (Clara 119-121; traditional) In total, twelve out of fourteen participants leveraged intrinsic factors to motivate themselves to think critically. Theme #4.2: Extrinsic factors motivate “critical thinking”. A breakdown of responses in this subtheme is shown in Table 2. Intrinsic factors were not the only way that students encouraged themselves to think “critically”. In fact, some students were vocal about not enjoying chemistry, yet they still perceived they could engage in “critical thinking” with the material. In some of these cases, students relied more on extrinsic factors to motivate themselves. Similar to theme #4.1, there were a variety of factors mentioned by students, including: 1) prompting, 2) grading, and 3) experiencing social pressures. In the case of prompting, Noelle offers a great example: “…So, like on the [homework] how she has is like okay, like each slide is like one step for the problem. So it's like okay let's do like when it's like the multiple step reactions and she's like okay like what's, what's step one and then you click next and like okay now based on that, what’s step two, and you do that, draw it out whatever, so I like that because then I like gets me thinking about every step focuses on every step, versus like, there have been like some problems versus like okay here's this, here's a giant box like draw the whole reaction, it's like four steps, but it just like gets all mixed up, so I like it how she actually breaks it up sometimes, that helps me so” (Noelle 131; OCLUE) In Noelle’s experience, the homework questions that were broken up and scaffolded encouraged them to “critically think”. This was largely due to the prompting in the task, which encouraged students to think about each facet individually before bringing all of the information together. Aside from prompting, some students were motivated by their grades and performing well in the course: “…so I guess, to be motivated to think critically means you need to be motivated to be a high achiever in the class, and think that there's that kind of motivation come from both like past experiences, especially if you have like a tracker record of like it's doing good you want to keep the track record going, um, and also, I also know people who are like who have a track record of doing average, they have no incentive they're like, oh, I'm just aiming for a B or I'm just aiming for a C, I hear that quite a lot, people are like, not, I guess, uh, they're not aiming for an A” (Whitefox 101; traditional) From Whitefox’s perspective, one must be “motivated to be a high achiever” in order to “critically think”. However, Whitefox was not alone in this perspective, and various students talked about the role of grades as a motivating factor.
For example: “So, I mean, I think it is possible to, like, critically think even though you don’t have like an interest in it, if you want, if there’s like a different motivation behind it, I guess… Which I think for most people would be like being successful and like getting good grades,” (Amanda 129-131; OCLUE) Amanda also notes the role that grades can have in motivating students to engage in certain ways of doing, in this case “critical thinking”. Other students talked about how grades hinder them from engaging in “critical thinking”, with some situating grades as inaccurate reflections of a student’s “actual” learning. Regardless, for a handful of students, grades were a source of motivation. In total, eleven out of fourteen students discussed extrinsic factors and their role in “critical thinking”. Differences Between OCLUE and Traditional Organic Chemistry Much of the previous research on these two types of courses has been comparative and has found differences across student approaches to learning tasks and perceptions (Crandell et al., 2019, 2020; Houchlei et al., 2021; Bowen et al., 2022). In the case of this study, I noted that all students perceived they were engaging in “critical thinking” in their organic chemistry courses. Despite the differences in course design and enactment between OCLUE and the traditional course, there were strong commonalities amongst students when discussing “critical thinking”. As illustrated in theme #4, students may draw on a variety of ways to motivate themselves to think “critically”; that is, even if a course does not encourage it, students may motivate themselves to think in a certain way. Though student conceptualizations of “critical thinking” showed overlap, there were some differences in when and where students perceived they were thinking “critically” in their organic courses. In OCLUE, students primarily saw themselves “critically thinking” on their weekly homework assignments (as long as they took them seriously), recitations, and course assessments. For example, Arisha, Ember, and Hailee describe how OCLUE encouraged them to think “critically” on homework, recitations, and exams, respectively: “I think we definitely are expected to use like the term critical thinking like we're supposed to take the knowledge we learned in lectures and be able to like, apply it to our homework and exams.” (Arisha 9; OCLUE) “…I know one that we did in recitation was kind of like describe, like it was like explain this reaction, so you have to draw out the reaction, and then you have to explain why it happened like that. And sometimes its flipped where she'll ask a question like, um, why does why, like why is carbon… what am I thinking of?
Why are fats not soluble in like, why is oil not soluble in water, something like that, she'll tell you to explain it, and then she'll tell you to like, draw a picture that also aids the explanation…” (Ember 21; OCLUE) “…you have to really know what you're doing to do well on the test because it's not a multiple choice test and it's not just facts or, you know, like s-simple concepts, it's really broad concepts that all build off of each other so I think you would, you wouldn't be able to get by at all with just memorizing…” (Hailee 35; OCLUE) On the other hand, students in the traditional course perceived that they were primarily asked to engage in “critical thinking” on in-class activities (of which there were two for the semester), recitations, and homework (though for different reasons), and not as much on their course assessments. For example: “…I would say that the critical thinking, parts have been present throughout the whole semester, but most, mostly in, um, the you know the activity, like I mentioned… and I feel that critical thinking part comes out in recitation more than it does in lecture.” (Damien 93; traditional) “…I would say not as much critical thinking on the quizzes and exams because it’s multiple choice. I would say critical thinking is used much more on the homework since there's, uh, a good bit of questions that require like, you to draw the molecule or to type out the name, so.” (Milo 101; traditional) Here, Milo talks about how the multiple-choice assessments do not encourage “critical thinking” but notes that “critical thinking” on the homework involves more open-ended responses. Though Damien perceived that “critical thinking” was taking place throughout organic chemistry, they primarily discussed it in the context of the activities (and recitations), which students perceived had them relate multiple concepts together. Upon further digging into these activities in the traditional course, I found something quite interesting. As readers may recall, OCLUE is informed by three-dimensional learning and incorporates three-dimensional items on homework, recitations, and assessments, all of which students perceived as involving “critical thinking”; the traditional course, by contrast, is not transformed. In an effort to enhance and expand transformation efforts at my institution, there is a fellowship program for faculty members that acts as professional development to help interested faculty engage with and utilize three-dimensional learning in their courses. The faculty member who taught the traditional organic chemistry course in this study was part of this fellowship program in the past. Though their course looks largely traditional, they have attempted to incorporate more three-dimensional items into their instruction and assessments. One way they have done this is through the two activities in their course. That is, the two activities, which most of the students in the traditional course mentioned as getting them to use “critical thinking”, were developed in a professional development program that sought to instruct faculty on how to incorporate three-dimensional learning into their learning tasks. Though the degree to which the courses incorporate three-dimensional learning is quite different, it is interesting to see students mention “critical thinking” in the context of three-dimensional activities in both courses.
The Core Category The core category in grounded theory is developed in the theoretical coding stage and represents the theory that is generated based on interpretations of the data. The theory is based on the inductive categories and themes that have emerged and been interpreted in the data; that is, the theory is grounded in the data, hence the name (Glaser and Strauss, 1967; Charmaz, 2006; Corbin and Strauss, 2015). As I noted in the methods section, I have opted to use constructivist grounded theory, which further describes the core category as an analytical handle by which to interpret and situate the findings (Charmaz, 2006); however, the analysis was still informed by other schools of thought on grounded theory, such as the use of axial coding (Corbin and Strauss, 2015). Within this methodological approach, Charmaz stresses that researchers “should focus on meaning, action, and process” (Charmaz, 2006; Hallberg, 2006), which I have sought to do. In my experience, grounded theory approaches within the framing of Corbin and Strauss (Corbin and Strauss, 2015) encourage the generation and communication of a more formalistic theory with the core category; however, those within constructivist grounded theory may have a core category that is explained more “narratively” due to its inescapable connection to its context and sociocultural factors (Hallberg, 2006), which is the approach I take here. Since the core category should describe what the study was about, all four major themes, taken together, inform my core category of conceptualizing “critical thinking”. With the focus on how students perceived what “critical thinking” is (themes #1 and #2), what experiences influenced their perception and how “critical thinking” develops (theme #3), and what motivates them to do it (theme #4), my aim was to provide an analytical and practical handle on what students believe “critical thinking” to be, despite its amorphous nature noted by students in this study and throughout the literature. These themes therefore address my first research question of what commonalities exist across student perceptions. In a similar study, Danczak and colleagues noted similar themes across student responses, including the application of knowledge. They concluded that students primarily defined “critical thinking” as “to analyse and critique objectively when solving a problem” (Danczak et al., 2017). Although some students mentioned the idea of “objectivity”, it was not a major point of convergence. However, the focus on analyzing and solving problems was also noted in student responses in this study. Although the themes can be useful in their own right, my overarching goal was to clarify what students meant when they mentioned “critical thinking”. I argue all four themes represent student perspectives in this sample, and I believe they can extend the literature base on our understanding of how students conceptualize “critical thinking”. To address my second research question of what insights student perceptions can provide to help clarify the construct, I posit that students seemingly do not conceptualize “critical thinking” as thoroughly as some definitions, such as in the “Delphi Report” (Facione, 1990), nor do their definitions align across all facets. That is, students also recognize the amorphous nature of the construct.
Previously I mentioned that others have advocated for abolishing the term “critical thinking” and instead situating the scientific practices in three-dimensional learning as component parts of the construct to make explicit what we want students to know and do (Cooper, 2016; Stowe and Cooper, 2017). Using the themes from this study, I add credence to this assertion, and I will discuss the core category and subsequent themes in the context of two facets: 1) the amorphous nature of “critical thinking” and 2) the alignment of student perceptions with the scientific practices in 3DL. The Amorphous Nature of “Critical Thinking”. Research related to “critical thinking” has been going on for decades, and disciplines are still struggling to develop a consensus definition. The amorphous nature of the construct was also recognized by students in this sample. For example, Adelynn described “critical thinking” as merely a “buzzword” while Reina and Clara saw the definition of “critical thinking” changing based on the person defining it. Although both Reina and Clara were able to provide a definition for “critical thinking”, they were quick to recognize its nebulous nature. Some students, such as Milo, Amanda, and Whitefox, had a more difficult time defining “critical thinking” despite receiving the interview questions ahead of time. For example, in response to the question “how would you describe what ‘critical thinking’ is (like if you had to give a definition)?”, all three participants hesitated. Therefore, if students also recognize the amorphous nature of the construct or are confused by its meaning, mentioning it in instruction is not a useful practice. Despite having previous experience with “critical thinking” and stating they had come across the term before, students confirmed its amorphous nature in various ways throughout the interview. The different conceptualizations noted are likely rooted in the ways that students have been trained in their previous experiences. As theme #3 showed, students were drawing on a variety of experiences to conceptualize “critical thinking”, and the diversity across these experiences makes it even more difficult for a consensus definition to be established. Within CER, Stowe and Cooper have suggested that we completely avoid the term “critical thinking” and instead be more specific about what we want students to know and do (Cooper, 2016; Stowe and Cooper, 2017). Throughout the literature and this study, it is clear that “critical thinking” is something more than having declarative knowledge. Themes #1 and #2 highlight that students perceived that this knowledge must be put into practice and that rote memorization is not “thinking critically”. In the literature, I noted there was overlap amongst definitions that described “critical thinking” as the application of knowledge (Glaser, 1941; Dunning, 1954; Gupta et al., 2015; Barron et al., 2021), contrasting the construct against rote memorization (Dunning, 1954; George, 1967, 1968; Rickert, 1967; Facione, 1990; Tsai, 2001; Mulnix, 2012; Santos, 2017), and that practice is important for its development (Oliver-Hoyo, 2003; Abrami et al., 2015). All three commonalities were also noted in student perceptions of the construct and are represented in the major themes identified, indicating potential points of nucleation for clarifying the construct and what we want students to do when we say “critical thinking”.
That is, these general overarching commonalities may act as foundations of what “critical thinking” is and entails but will require more explicit and detailed descriptions of what students are expected to know and do, an idea I will discuss next. The Alignment of the Commonalities of “Critical Thinking” with the 3DL Scientific Practices. While the commonalities may offer a foundational scaffold for “critical thinking” in science education, I posit that these commonalities must be situated within the literature and theories of learning and have more explicit and detailed learning targets. Others have previously asserted that the 3DL scientific practices act as component parts of “critical thinking” (Cooper, 2016; Stowe and Cooper, 2017), and over time other scholars have also situated “critical thinking” as involving certain scientific practices (Siegel, 1989; Osborne et al., 2004; Crenshaw et al., 2011; Mulnix, 2012; Osborne, 2014; Hand et al., 2018). The scientific practices in 3DL define concrete and specific ways of doing that mirror the ways of thinking expert scientists employ (National Research Council, 2012a; 3DL4US, n.d.). Engagement in a scientific practice requires the use of scientific knowledge, where the knowledge is applied to come up with a solution. In some cases, multiple practices are needed, but they all require the application of relevant knowledge. As evidenced by the first theme, the students ultimately perceived that “critical thinking” involves the application and use of knowledge. This, in conjunction with the fact that students perceived passive approaches to learning (theme #2), especially rote memorization, as contradictory to “critical thinking”, illustrates that students do see the construct as being something more than regurgitation of declarative knowledge and facts. With regard to theme #3, the scientific practices offer explicit ways to engage students in the act of doing science. That is, in a three-dimensional environment leveraging the scientific practices, students have many opportunities and access points to engage in scientific thinking and practice (Bang et al., 2017). Although the findings from theme #3 imply that students are not relying on organic chemistry to inform their perception of “critical thinking”, there is more to consider. At the time of the interview, it may have been too early for students to reflect on their experiences in organic chemistry and recognize how the course had impacted their perception and understanding. Regardless, the data demonstrate that current student perceptions align well with the intended purposes of the scientific practices in three-dimensional learning. Other perspectives and studies have also found ideas like “application”, “use of knowledge”, and contrasts to rote memorization to be important for “critical thinking”, indicating that previous work, in conjunction with this study, points to a convergence for the construct that aligns with the scientific practices. While we have no control over the intrinsic factors that motivate students, we can leverage the extrinsic factors many students relied upon, such as prompting. For example, my colleagues have conducted research into how prompting on learning tasks impacts how students respond (Crandell et al., 2019; Noyes and Cooper, 2019; Noyes et al., 2022).
That is, through the lens of three-dimensional learning, they have found effective ways of prompting students to engage in causal mechanistic reasoning and have recognized its influence on student thinking. Though grades were mentioned as an extrinsic motivating factor, I have opted to explore this perception in more detail in another publication. I have noted alignment between what students perceive as “critical thinking” and the scientific practices, since the practices would clarify the meaning of “critical thinking” and explicitly communicate what students need to do. The use of three-dimensional learning does necessitate a curricular overhaul, however, and would not be accomplished with a simple intervention. Regardless, I posit that the use of three-dimensional learning and the scientific practices offers a potential way forward for engaging students in the work of “critical thinking” that aligns not only with the evidence presented in this study but also with the perspectives that have been noted in the literature. In relation to the alignment between student perceptions and the scientific practices, this work will likely require consistent instruction. All students in the study relied on past and present experiences outside of organic chemistry as influential for their understanding of “critical thinking”. That is, organic chemistry was not the source of “critical thinking” skills for any of these students, as some may imply. Regardless, within theme #3, I noted that students perceived they would need to practice consistently to get better at “critical thinking”. In some cases, this practice was described as “immersion” and involved reflection and learning from mistakes. As I have noted, the idea of consistent practice has also been suggested in the literature, including a meta-analysis of strategies related to developing “critical thinking” (Oliver-Hoyo, 2003; Abrami et al., 2015). That is, though “critical thinking” has historically assumed many definitions, it has consistently been suggested that it requires practice. This may suggest that one-off interventions are not as effective at providing students enough opportunities to practice and develop their thinking (Noyes et al., 2022). This point aligns with the use of the three-dimensional learning scientific practices in that the underlying goal of this curricular approach is to provide consistent opportunities throughout the course, often in the form of formative assessments, so that students can receive feedback on their thinking. Such an approach can further support the use of the three-dimensional scientific practices in instruction. By adopting a systematic and systemic approach rather than an intervention-based approach, instructors can better communicate that consistent practice and ways of doing (such as the application of knowledge) are valued. In previous research on student reasoning in chemistry courses, it was found that students in OCLUE were more likely to retain their reasoning ability over time (Crandell et al., 2020). Considering that OCLUE engages students in the scientific practices throughout the entire semester on homework, recitations, and assessments, I argue that students are given plenty of opportunities to engage in the practices, ultimately contributing to their ability to use them later. That is, OCLUE is a whole-course overhaul with intentional decisions to engage students, consistently, in the scientific practices of three-dimensional learning.
Similarly, theme #3 also illustrated that students are drawing on a variety of previous experiences to inform their view of “critical thinking”. Given the previous discussion on the amorphous and nebulous nature of the construct, it is difficult to imagine that a single intervention would shift how students conceptualize “critical thinking”. Limitations The first limitation I acknowledge is that the analysis was conducted solely by me. With this said, the findings represent not only interpretations of student responses but primarily my own interpretations. As mentioned earlier, my choice of method was intentional, and in the context of constructivist grounded theory, it is acknowledged that someone else could have analyzed the same data and developed different themes (i.e., perhaps they would not have focused on commonalities); indeed, this is a point that Charmaz makes clear (Charmaz, 2006). Therefore, I acknowledge that I am influenced by my work with three-dimensional learning and that this may have shaped the discussion of the findings. However, I have attempted to make the connections clear and still argue that three-dimensional learning offers a way forward in this decades-long conversation on “critical thinking”. A second limitation is that this study included a small number of students (fourteen). These students were offered extra credit to participate in the interview (though other students not randomly selected for interview received a separate extra credit activity), and though I tried to ensure I was randomly selecting from a range of experience, these students may have been self-selecting and not representative of the student population at the university. Furthermore, all students were from the same large, research-intensive university and may not represent perspectives across different institutional contexts. In conjunction with this, a third limitation is that students were aware of my position as a chemistry education researcher and my overall intentions of transforming curricula and pedagogy. Thus, it is possible students may have catered their responses to my interests or to “protect” their professors. For example, I noted earlier that students in the traditional course largely did not see themselves using “critical thinking” on exams; when I asked Milo why this was, they prefaced their response by saying that they really enjoyed the professor and course and did not want their response to cause any changes in the course. Implications for Research As I mentioned earlier, one implication is that smaller, intervention-based studies may not be effective in encouraging students to engage in certain ways of doing or thinking. Students noted that their understanding of “critical thinking” comes from a variety of experiences and that, in order to get better at “critical thinking”, they needed to practice. Therefore, to help socialize students into particular ways of doing and thinking, it is likely better to engage students in certain practices systematically over the course of one or more courses rather than through a single intervention. I have situated three-dimensional learning and its practices as one way to define the component parts of “critical thinking”; however, I also see three-dimensional learning as a framework for designing environments and assessments that systematically engage students in the practices around core ideas. That is, it has value beyond clarifying “critical thinking”.
To reiterate, student conceptualizations were not identical, though my themes highlight the major commonalities across student responses that could be leveraged. For example, studies could explore how students perceive certain scientific practices as being related to “critical thinking”. There may be practices that students already use due to their previous academic experiences; however, there may be other practices that are important but that students are less familiar with and have a more difficult time using. Furthermore, additional work is needed to understand why organic chemistry is situated as important for developing “critical thinking” even though students in this study had already conceptualized the construct before getting to this course. Future studies could comment on whether students need time to digest their experience before recognizing the impact it has on their perception, or whether their prior experiences simply dominate their organic chemistry experience. Implications for Teaching To begin, I assert that instructors should consider the nature of the term “critical thinking”. Its amorphous nature does not lend itself to designing effective learning tasks or goals. Though there are commonalities between student perceptions of what “critical thinking” is, this should not be taken to mean that students agreed entirely on what “critical thinking” was. Instead, I propose that 100% agreement may be an implausible goal. However, by focusing on the commonalities, we can seek out ways that student perceptions align with potential pedagogical approaches. In this case, I suggest that the commonalities can be addressed through the use of the scientific practices in three-dimensional learning. Regardless of whether an instructor chooses to learn more about three-dimensional learning, I recommend that instructors think about the things they want students to know and do and spend some time defining these aspects of their instruction, especially if they are using terms like “critical thinking”. Furthermore, instructors should make these definitions and learning goals very explicit to students. Constructs such as “critical thinking” and “problem solving” are often used in lectures, yet students may not be entirely clear on what is expected of them or what exactly they need to do. In some cases, students may assume they know, but their definition or understanding may be quite different from what the instructor intends. I recommend instructors concretely define what they want students to know and do instead of shrouding expectations in terms with many different definitions (without ever defining what they mean). Given the nebulous nature of the term, Stowe and Cooper have taken the position that the term “critical thinking” should not be used at all (Stowe and Cooper, 2017); however, the ubiquitous nature of “critical thinking” throughout education and society makes this difficult. Therefore, my position is that if instructors use terms like “critical thinking”, or other amorphous terms like “problem solving”, they should be explicit and specific about what students must know and do with regard to these terms. In terms of moving forward, I have posited that the scientific practices and three-dimensional learning offer a route to clarifying the seemingly important construct of “critical thinking”. In this study, I noted how, despite the differences in how “critical thinking” was conceptualized, there were some commonalities that seemed to align with the scientific practices.
This offers a potential access point for getting students to do something “more” than just memorize and to go deeper into how and why chemical phenomena occur. Final Remarks “Critical thinking” is a commonly used term throughout education and society that has a taken-for-granted meaning and an assumed universal definition. However, the research over the last few decades is clear: people cannot agree on exactly what “critical thinking” is and entails, something I also noted in my study. Regardless, I did note some commonalities across perceptions that aligned with previous perspectives and evidence in the literature. Although these themes could stand in their own right, as they provide valuable insights into what a ubiquitous term like “critical thinking” might mean to students, I extended my discussion to offer a way forward and clarify this amorphous construct. That is, I posited that the scientific practices of three-dimensional learning could effectively be used as elements of “critical thinking”, which is particularly important since this would align the construct with previous work on “critical thinking” and the findings in the study presented here. From a sociocultural perspective, students’ understanding of the construct had been developed through past experiences, and it did not seem that the cultures of either organic chemistry class had socialized students differently. However, the perceptions that many students had aligned very well with the transformed courses: that is, they aligned with the use of knowledge, which is accomplished through the use of scientific practices. REFERENCES 1. 3DL4US, (n.d.), Three-Dimensional Learning for Undergraduate Science. https://3dl4us.org. 2. Abrami P. C., Bernard R. M., Borokhovski E., Waddington D. I., Wade C. A., and Persson T., (2015), Strategies for Teaching Students to Think Critically: A Meta-Analysis. Rev. Educ. Res., 85(2), 275–314. 3. Assessment Day Ltd., Watson Glaser critical thinking appraisal. https://www.assessmentday.co.uk/watson-glaser-critical-thinking.htm. 4. Bailin S., (2002), Critical thinking and science education. Sci. Educ., 11(4), 361–375. 5. Bang M., Brown B., Barton A. C., Rosebery A., and Warren B., (2017), Toward More Equitable Learning in Science: Expanding Relationships Among Students, Teachers, and Science Practices, in Helping Students Make Sense of the World Using Next Generation Science and Engineering Practices, Schwarz C. V., Passmore C., and Reiser B. J. (eds.), NSTA Press, pp. 33–58. 6. Banning M., (2006), Nursing research: perspectives on critical thinking. Br. J. Nurs., 15, 458–461. 7. Barak M., Ben Chaim D., and Zoller U., (2007), Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Res. Sci. Educ., 37(4), 353–369. 8. Barron H. A., Brown J. C., and Cotner S., (2021), The culturally responsive science teaching practices of undergraduate biology teaching assistants. J. Res. Sci. Teach., 58(9), 1320–1358. 9. Becker N., Rasmussen C., Sweeney G., Wawro M., Towns M., and Cole R., (2013), Reasoning using particulate nature of matter: An example of a sociochemical norm in a university-level physical chemistry class. Chem. Educ. Res. Pract., 14(1), 81–94. 10. Bernardi F. M. and Pazinato M. S., (2022), The Case Study Method in Chemistry Teaching: A Systematic Review. J. Chem. Educ., 99(3), 1211–1219. 11. Bowen R. S., Flaherty A. A., and Cooper M. M., (2022), Investigating student perceptions of transformational intent and classroom culture in organic chemistry courses.
Chem. Educ. Res. Pract. https://doi.org/10.1039/D2RP00010E. 12. Byrne M. S. and Johnstone A. H., (1987), Critical Thinking and Science Education. Stud. High. Educ., 12(3), 325–339. 13. Calabrese Barton A., Tan E., and Rivet A., (2008), Creating Hybrid Spaces for Engaging School Science Among Urban Middle School Girls. Am. Educ. Res. J., 45(1), 68–103. 14. Chang J. and Song J., (2016), A case study on the formation and sharing process of science classroom norms. Int. J. Sci. Educ., 38(5), 747–766. 15. Charen G., (1970), Do laboratory methods stimulate critical thinking? Sci. Educ., 54(3), 267–271. 16. Charmaz K., (2006), Constructing Grounded Theory: A Practical Guide through Qualitative Analysis, Sage. 17. Cooper M. M., (2016), It Is Time To Say What We Mean. J. Chem. Educ., 93(5), 799–800. 18. Cooper M. M., Posey L. A., and Underwood S. M., (2017), Core Ideas and Topics: Building Up or Drilling Down? J. Chem. Educ., 94(5), 541–548. 19. Cooper M. M., Stowe R. L., Crandell O. M., and Klymkowsky M. W., (2019), Organic Chemistry, Life, the Universe and Everything (OCLUE): A Transformed Organic Chemistry Curriculum. J. Chem. Educ., 96(9), 1858–1872. 20. Corbin J. and Strauss A., (2015), Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory, Fourth Edition. Sage. 21. Crandell O. M., Kouyoumdjian H., Underwood S. M., and Cooper M. M., (2019), Reasoning about Reactions in Organic Chemistry: Starting It in General Chemistry. J. Chem. Educ., 96(2), 213–226. 22. Crandell O. M., Lockhart M. A., and Cooper M. M., (2020), Arrows on the Page Are Not a Good Gauge: Evidence for the Importance of Causal Mechanistic Explanations about Nucleophilic Substitution in Organic Chemistry. J. Chem. Educ., 97(2), 313–327. 23. Crenshaw P., Hale E., and Harper S. L., (2011), Producing Intellectual Labor In The Classroom: The Utilization Of A Critical Thinking Model To Help Students Take Command Of Their Thinking. J. Coll. Teach. Learn. TLC, 8(7), 13–13. 24. Creswell J. W. and Poth C. N., (2017), Qualitative Inquiry & Research Design, Fourth Edition. Sage. 25. Danczak S. M., Thompson C. D., and Overton T. L., (2020), Development and validation of an instrument to measure undergraduate chemistry students’ critical thinking skills. Chem. Educ. Res. Pract., 21(1), 62–78. 26. Danczak S. M., Thompson C. D., and Overton T. L., (2017), “What does the term Critical Thinking mean to you?” A qualitative analysis of chemistry undergraduate, teaching staff and employers’ views of critical thinking. Chem. Educ. Res. Pract., 18(3), 420–434. 27. Dixson L., Pomales B., Hashemzadeh Mehrtash, and Hashemzadeh Mehnroosh, (2022), Is Organic Chemistry Helpful for Basic Understanding of Disease and Medical Education? J. Chem. Educ., 99(2), 688–693. 28. Duncan R. G., Chinn C. A., and Barzilai S., (2018), Grasp of evidence: Problematizing and expanding the next generation science standards’ conceptualization of evidence. J. Res. Sci. Teach., 55(7), 907–937. 29. Dunn A. H., Sondel B., and Baggett H. C., (2019), “I Don’t Want to Come Off as Pushing an Agenda”: How Contexts Shaped Teachers’ Pedagogy in the Days After the 2016 U.S. Presidential Election. Am. Educ. Res. J., 56(2), 444–476. 30. Dunning G. M., (1956), Critical thinking and research. Sci. Educ., 40(2), 83–86. 31. Dunning G. M., (1954), Evaluation of critical thinking. Sci. Educ., 38(3), 191–211. 32. Ennis R. H., (1962), A concept of critical thinking. Harv. Educ. Rev., 32, 81–111. 33. Facione P.
A., (1990), Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction, Executive Summary of “The Delphi Report”. Calif. Acad. Press, 423(c), 1–19. 34. Flaherty A. A., (2020b), Investigating perceptions of the structure and development of scientific knowledge in the context of a transformed organic chemistry lecture course. Chem. Educ. Res. Pract., 21, 570–581. 35. Forawi S. A., (2016), Standard-based science education and critical thinking. Think. Ski. Creat., 20, 52–62. 36. George K. D., (1967), A comparison of the critical‐thinking abilities of science and non‐science majors. Sci. Educ., 51(1), 11–18. 37. George K. D., (1968), The effect of critical‐thinking ability upon course grades in biology. Sci. Educ., 52(5), 421–426. 38. Glaser B. G., (1978), Theoretical Sensitivity, The Sociology Press. 39. Glaser B. and Strauss A., (1967), The Discovery of Grounded Theory: Strategies for Qualitative Research, Sociology Press. 40. Glaser E. M., (1941), An experiment in the development of critical thinking, J. J. Little & Ives Company. 41. Gupta T., Burke K. A., Mehta A., and Greenbowe T. J., (2015), Impact of guided-inquiry-based instruction with a writing and reflection emphasis on chemistry students’ critical thinking abilities. J. Chem. Educ., 92(1), 32–38. 42. Gutiérrez K. D. and Rogoff B., (2003), Cultural ways of learning: Individual traits or repertoires of practice. Educ. Res., 32(5), 19–25. 43. Hallberg L. R.-M., (2006), The “core category” of grounded theory: Making constant comparisons. Int. J. Qual. Stud. Health Well-Being, 1(3), 141–148. 44. Hand B., Shelley M. C., Laugerman M., Fostvedt L., and Therrien W., (2018), Improving critical thinking growth for disadvantaged groups within elementary school science: A randomized controlled trial using the Science Writing Heuristic approach. Sci. Educ., 102(4), 693–710. 45. Houchlei S. K., Bloch R. R., and Cooper M. M., (2021), Mechanisms, Models, and Explanations: Analyzing the Mechanistic Paths Students Take to Reach a Product for Familiar and Unfamiliar Organic Reactions. J. Chem. Educ., 98(9), 2751–2764. 46. Hunter R. A. and Kovarik M. L., (2022), Leveraging the Analytical Chemistry Primary Literature for Authentic, Integrated Content Knowledge and Process Skill Development. J. Chem. Educ., 99(3), 1238–1245. 47. Insight Assessment, (2020), California critical thinking skills test (CCTST). https://www.insightassessment.com/article/california-critical-thinking-skills-test-cctst-2. 48. John-Steiner V. and Mahn H., (1996), Sociocultural approaches to learning and development: A Vygotskian framework. Educ. Psychol., 31(3–4), 191–206. 49. Kogut L. S., (1996), Critical Thinking in General Chemistry. J. Chem. Educ., 73(3). 50. Kuhn D., (1999), A Developmental Model of Critical Thinking. Educ. Res., 28(2), 16–25. 51. Lau J. Y. F., (2011), An introduction to critical thinking and creativity: think more, think better, Wiley. 52. Lemke J. L., (2001), Articulating communities: Sociocultural perspectives on science education. J. Res. Sci. Teach., 38(3), 296–316. 53. Mantero M., (2002), Scaffolding Revisited: Sociocultural Pedagogy within the Foreign Language Classroom (ED459623). ERIC. https://files.eric.ed.gov/fulltext/ED459623.pdf. 54. Merriam S. B. and Tisdell E. J., (2016), Qualitative Research: A Guide to Design and Implementation, Jossey-Bass. 55. Miller P. J. and Goodnow J. J., (1995), Cultural practices: Toward an integration of culture and development. New Dir. Child Adolesc. Dev., 1995(67), 5–16.
56. Mulnix J. W., (2012), Thinking Critically about Critical Thinking. Educ. Philos. Theory, 44(5), 464–479. 57. Nasir N. S. and Hand V. M., (2006), Exploring Sociocultural Perspectives on Race, Culture, and Learning. Rev. Educ. Res., 76(4), 449–475. 58. National Research Council, (2012a), A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas, The National Academies Press. 59. National Research Council, (2012b), Education for Life and Work: Developing Transferable Knowledge and Skills in the 21st Century, National Academies Press. 60. Noyes K., Carlson C. G., Stoltzfus J. R., Schwarz C. V., Long T. M., and Cooper M. M., (2022), A Deep Look into Designing a Task and Coding Scheme through the Lens of Causal Mechanistic Reasoning. J. Chem. Educ., 99(2), 874–885. 61. Noyes K. and Cooper M. M., (2019), Investigating Student Understanding of London Dispersion Forces: A Longitudinal Study. J. Chem. Educ., 96(9), 1821–1832. 62. Oliver-Hoyo M. T., (2003), Designing a Written Assignment To Promote the Use of Critical Thinking Skills in an Introductory Chemistry Course. J. Chem. Educ., 80(8), 899–903. 63. Osborne J., (2014), Teaching Critical Thinking? New Directions in Science Education. Sch. Sci. Rev., 95(352), 53–62. 64. Osborne J., Erduran S., and Simon S., (2004), Enhancing the quality of argumentation in school science. J. Res. Sci. Teach., 41(10), 994–1020. 65. Paul R. and Elder L., (2011), Critical Thinking: Competency Standards Essential for the Cultivation of Intellectual Skills, Part 2. J. Dev. Educ., 35(1), 36–36. 66. Petterson M. N., Finkenstaedt-Quinn S. A., Gere A. R., and Shultz G. V., (2022), The role of authentic contexts and social elements in supporting organic chemistry students’ interactions with writing-to-learn assignments. Chem. Educ. Res. Pract., 23, 189–205. 67. Randles C. A. and Overton T. L., (2015), Expert vs. novice: approaches used by chemists when solving open-ended problems. Chem. Educ. Res. Pract., 16(4), 811–823. 68. Rickert R. K., (1967), Developing critical thinking. Sci. Educ., 51(1), 24–27. 69. Rogoff B., (1990), Apprenticeship in Thinking: Cognitive Development in Social Context, Oxford University Press. 70. Santos L. F., (2017), The Role of Critical Thinking in Science Education. J. Educ. Pract., 8(20), 159–173. 71. Siegel H., (1989), The Rationality of Science, Critical Thinking, and Science Education. Synthese, 80, 9–41. 72. Smith P. M., (1963), Critical thinking and the science intangibles. Sci. Educ., 47(4). 73. Stowe R. L. and Cooper M. M., (2017), Practicing What We Preach: Assessing “Critical Thinking” in Organic Chemistry. J. Chem. Educ., 94(12), 1852–1859. 74. The Critical Thinking Co., (2021), Cornell critical thinking tests. https://www.criticalthinking.com/cornell-critical-thinking-tests.html. 75. Timonen V., Foley G., and Conlon C., (2018), Challenges when using grounded theory: A pragmatic introduction to doing GT research. Int. J. Qual. Methods, 17(1). 76. Tsai C. C., (2001), A review and discussion of epistemological commitments, metacognition, and critical thinking with suggestions on their enhancement in internet-assisted chemistry classrooms. J. Chem. Educ., 78(7), 970–974. 77. VERBI GmbH, (2022), MAXQDA. https://www.maxqda.com. 78. Vieira R. M., Tenreiro-Vieira C., and Martins I. P., (2011), Critical thinking: Conceptual clarification and its importance in science education. Sci. Educ. Int., 22(1), 43–54. 79.
Vygotsky L., (1978a), Chapter 6: Interaction between Learning and Development, in Mind in Society, Harvard University Press, pp. 79–91. 80. Vygotsky L., (1978b), Mind in Society, Harvard University Press. 81. Watson G. and Glaser E. M., (1964), Watson-Glaser critical thinking appraisal manual, Harcourt, Brace & World. 82. Weaver M. G., Samoshin A. V., Lewis R. B., and Gainer M. J., (2016), Developing Students’ Critical Thinking, Problem Solving, and Analysis Skills in an Inquiry-Based Synthetic Organic Laboratory Course. J. Chem. Educ., 93(5), 847–851. 83. Wright A. and Forawi A., (2000), Social responsibility requires critical thinking. Paper presented at the 2000 Forum at Calgary, Canada. 84. Zotos E. K., Moon A. C., and Shultz G. V., (2020), Investigation of chemistry graduate teaching assistants’ teacher knowledge and teacher identity. J. Res. Sci. Teach., 57(6), 943–967. APPENDIX Permissions to reproduce manuscript in its entirety Figure 7.1. Permissions to publish article in dissertation. Welcome Document Interview Participant Information Perceptions of Critical Thinking Study CEM 251 – Organic Chemistry I WELCOME AND INTRODUCTION Hello! To begin, I would like to thank you for your participation in this research study. Your input and perspective are incredibly valuable for helping us support more students like yourself. My name is Ryan, and I am a doctoral candidate in Dr. Melanie Cooper’s group at Michigan State University, and my research area is chemistry education. I will be the one conducting your interview, and I wanted to send this document and provide my email in case you needed to get in touch with me before the interview: bowenrya@msu.edu. This interview will seek to gather your perspective on critical thinking. If you are unfamiliar with this term, do not worry; you can still participate, and we can talk solely about your experience with how you are expected to think in CEM 251. In preparation for the interview, you can read through this document, which includes a brief heads-up of some of the questions I’ll be asking you. However, please do not look anything up, even if you are unsure about something. We provide the questions ahead of time in case you wanted to reflect on your experiences and write down some notes, though this is not required or necessary. There are no right answers to these questions, and we are just seeking your perspective and thoughts. INTERVIEW DETAILS AND LOGISTICS The interview will take place on a date and time of your choosing (you should have already received an email from me about scheduling it). You should expect the interview to last about an hour. Interviews will take place via Zoom, and you will receive Zoom room information a day before your scheduled time. Please note that the interview will be recorded, and you may choose to keep your video on or off during the entire process (the only thing I need is the audio). I (Ryan) will be the only person who ever sees this video or listens to the audio of the interview. After the interview, if anything was confusing to me, or I want to make sure I capture your perspective accurately, you may receive an e-mail from me to double-check. Any time after the interview, you may email me to be removed from the study or ask me questions. FINAL REMARKS Prior to showing you some of the interview questions on the back of this sheet, I want to reiterate that there are no right answers, and please do not look up anything on the internet.
If you need to look at your notes from CEM 251 for an example or look at transcripts from your previous courses, please do! However, we want to make sure that we capture your thoughts and opinions and not something from the internet. We really appreciate you being here and helping us! Please see the back for some of the interview questions I will be asking, and email me with any questions prior to the interview! Thanks again!

INTERVIEW QUESTIONS

NOTE: These are not all of the questions I will ask, but these are most of them!

1. What chemistry classes have you taken before (thinking back to high school)?
2. Who is your chemistry professor now for CEM 251?
3. How would you describe how you are expected to think in the course? Please provide an example.
4. What would you say is the most difficult aspect of the course? Please provide an example.
5. How would you describe how you are assessed in the course? Please provide an example.
6. How would you describe what "critical thinking" is (like if you had to give a definition)?
   a. NOTE: throughout the interview we will come back to your perception and definition of "critical thinking" and how it relates to your experiences and responses to the other questions.
7. Considering your definition of "critical thinking", what is not "critical thinking" then?
8. Can you explain if you believe "critical thinking" to be important, in general?
9. How did you come to this understanding and perception of "critical thinking"? What experiences informed your understanding?
10. Considering how you view "critical thinking", can you explain if it is what you do in your chemistry course currently?
11. Can you explain if your definition of "critical thinking" applies to all subjects, or if it's more relevant for one subject or field?
12. Would you enjoy being in a course that values "critical thinking" (as you have defined it) and encourages you to engage in it often? Why or why not?
13. Reflecting on your experiences, how do you think that "critical thinking" develops, or how does one get "better" at thinking "critically"?
14. What influences you to think "critically"? What gets your "critical thinking" gears turning?
15. Would you say there is a relationship between "critical thinking" and something like "motivation"? For example, is it possible to think "critically" if the person did not want to?

Interview Protocol – Grounded Theory Study

General information
• Unstructured to semi-structured interviews
• Research questions:
  o What do students mean when they use vague, generalized terms such as "critical thinking" to describe their learning experiences, and what do these types of thinking entail?
  o How did students arrive at their understanding of terms such as "critical thinking"?
  o Are the types of thinking associated with "critical thinking" similar across different courses?

Before interview
• Ask the participant to tell me about themselves by asking about the following:
  o Major
  o Hometown
  o Career interests
  o Hobbies
• Introduce myself
  o Name
  o Graduate student
  o Hometown
  o Research
• Explain the purpose of the research
  o We had asked students some questions and found that a lot of students were talking about ways of thinking that can mean a variety of things.
Therefore, the goal of this interview is to better understand how YOU might define how you perceive you're expected to think and to understand where your ideas came from.
  o Note: there are no right answers here; it's all about your perspective.
• At any point, if it is more comfortable to keep your video off, please do that; otherwise, feel free to keep it on.
• If anything I ask is confusing, or you need me to rephrase, just let me know.
• Please don't feel like you have to respond immediately; if you need to think about something, feel free to think!
  o I will ask follow-up questions.
• Inform the participant they can opt out at any time.
• Ask the participant if they looked anything up, just so I'm aware.
• Let the participant know that I will be recording the interview at this point.
• Give as much detail as possible!

START RECORDING AND TRANSCRIPTION

Interview – Part 1: Introductory Questions

16. What chemistry classes have you taken before (thinking back to high school)?
   a. Follow-up: Who were your previous chemistry professors at MSU?
17. Who is your chemistry professor now for CEM 251?
18. How would you describe how you are expected to think in the course?
   a. Follow-up: Why would you describe these as the expected ways of thinking?
   b. Follow-up: If necessary, encourage the student to provide an example.
19. What would you say is the most difficult aspect of the course?
   a. Follow-up: Why is this the most difficult?
   b. Follow-up: If necessary, encourage the student to provide an example.
20. How would you describe how you are assessed in the course?
   a. Clarifying statement: "Assessed" broadly means "what kind of learning is measured, and how is that learning measured". Basically, what does the instructor look for when they measure your learning, and how do they get that information?
   b. Follow-up: Why would you say you are assessed this way?
   c. Follow-up: If necessary, encourage the student to provide an example.

NOTE: What students say in response to these questions dictates whether I start with Part 2a or Part 2b; however, every student will be asked Part 2a. For example, if a student talks about "critical thinking" in the introductory questions above, then I will follow up the introductory questions with Part 2a and will NOT ask the student Part 2b. However, if a student talks about other generalities, such as "thinking with an open mind", I will use Part 2b, but I will circle back to Part 2a to get their thoughts on critical thinking and problem-solving.

TRANSITION: Thank you for sharing your experience with those questions; it's really helpful as we try to better understand what is most helpful for students. We are going to shift gears at this point, but in order to determine the direction we go, I want to ask first: Are you familiar with the term "critical thinking"? Have you heard it before?
• If yes: I'm going to ask you more specific questions about critical thinking, which may relate to some of the things you mentioned just now. If we have time, I may follow up on some other things you said, but we'll just see how we are doing on time. Start with question 21 in Part 2a.
• If no: Not a problem. Whether you have heard of it dictates the questions I ask next, so I just wanted to check. Next, then, I'm going to ask you some questions about some of your perceptions, just to get a better understanding of your experience in the course.
• NOTE: If the student says "no", has not heard of "critical thinking", and has not really said anything that gets me curious,
I could ask them about "problem solving" or something, or probe deeper into their responses from these first questions.

Interview – Part 2a: Critical Thinking Questions

21. How would you describe what "critical thinking" is (like if you had to give a definition)?
   a. Follow-up: How would you engage in "critical thinking"? For example, what are things you do when you think "critically"?
   b. Follow-up: If a student contrasts "critical thinking" with rote memorization, ask them the following: Is there any relationship between "critical thinking" and "memorization", or do you see them as opposite to one another?
22. Considering your definition of "critical thinking", what is not "critical thinking" then?
23. Can you explain if you believe "critical thinking" to be important, in general?
   a. Follow-up: Why (or why not) is it important?
24. How did you come to this understanding and perception of "critical thinking"? What experiences informed your understanding?
   a. Clarifying question: Did you have a course or experience, in the past or currently, that helped you think of "critical thinking" this way? Was it "taught" to you?
   b. Follow-up: Would you say that your definition of "critical thinking" changed from high school to college? Why or why not?
25. Considering how you view "critical thinking", can you explain if it is what you do in your chemistry course currently (they may have stated this in Part 1; if so, ask for an example if necessary)?
   a. If yes: Ask the student to provide an example from the course (may have stated in Part 1).
   b. If no: Ask the student what they do, and how they would describe that (may have stated in Part 1).
26. Can you explain if your definition of "critical thinking" applies to all subjects, or if it's more relevant for one subject or field?
   a. Follow-up: In your experience, have your courses more-or-less shared the same perspective of "critical thinking" that you have, or have they been different? Please explain.
27. Would you enjoy being in a course that values "critical thinking" (as you have defined it) and encourages you to engage in it often? Why or why not?
28. Reflecting on your experiences, how do you think that "critical thinking" develops, or how does one get "better" at thinking "critically"?
   a. Follow-up: What are the specific conditions or course cultures under which "critical thinking" is developed?
   b. Follow-up: Do you think people naturally know how to think "critically", or do they have to be taught this? Why or why not?
   c. Follow-up: If someone were to learn how to think "critically", would they always have the ability to think "critically", or does it change? Why or why not?
29. What influences you to think "critically"? What gets your "critical thinking" gears turning?
30. Would you say there is a relationship between "critical thinking" and something like "motivation"?
   a. Clarifying question: Could you think "critically" if you weren't motivated to do so, like if you simply did not want to?

Interview – Part 2b: Generalities

NOTE: I will ask these questions if: 1) the student is not familiar with "critical thinking" at all; or 2) we finish Part 2a early, but they also said some other things in Part 1 that I wish to follow up on.
• Even though "critical thinking" is the focus of the study, if a student is not familiar with it, I think that it's still important to capture their perspective. It's still good to know that some students may not have any clue what it means, yet faculty may assume they do.
31. How would you describe what __________ is (like if you had to give a definition)?
   a. Follow-up: Why did you describe it that way (why is that important for your definition)?
   b. Follow-up: How would you engage in this type of thinking? For example, what are things you do when you think this way?
32. Considering your definition of __________, what is not __________ then?

Clarify the student's definition and then state the following: If at any point you want to add something or take something away, feel free to jump in and say so as we reflect and think on this more.

33. Can you explain if you believe __________ to be important, in general?
   a. Follow-up: Why (or why not) is it important?
34. How did you come to this understanding and perception of this way of thinking? What experiences informed your understanding?
   a. Clarifying question: Did you have a course, in the past or currently, that helped shape how you think about this way of thinking?
35. Earlier you stated this was a way of thinking you use in your course, and I was curious if you could provide an example to further help me understand what you mean?
36. Can you explain if your definition of __________ applies to all subjects, or if it's more relevant for one subject or field?
37. Would you enjoy being in a course that values __________ (as you have defined it) and encourages you to engage in it often? Why or why not?
38. Reflecting on your experiences, how do you think that __________ develops, or how does one get "better" at thinking this way?
   a. Follow-up: What are the specific conditions or course cultures under which __________ is developed?
   b. Follow-up: Do you think people naturally know how to think this way, or do they have to be taught this? Why or why not?
   c. Follow-up: If someone were to learn how to think this way, would they always have the ability to think __________, or does it change? Why or why not?
39. What influences you to think in this way? What gets your __________ gears turning?

Clarify the student's definition with anything they have added and ask the following: As we wrap up, was there anything else you thought of that you would like to add to your definition, or anything else?

Interview – Part 3: Wrap-Up

40. Is there anything else you think I should know to understand your perspective better?
41. Is there anything you would like to ask me?

STOP RECORDING

Post-Interview
• Logistics
  o I will go through the interview, come up with some thoughts, and send them to you to make sure they capture what you believe.
• Give them my e-mail so they can contact me at any time (and get their e-mail if I don't already have it).
• As part of the analysis process, I may assign a pseudonym to your responses to keep your identity private and anonymous. Would you like to choose a pseudonym that I use, or would you rather I choose one later?

CHAPTER VIII: CONCLUSIONS, IMPLICATIONS, AND FUTURE WORK

The work presented in this dissertation included multiple qualitative studies conducted across three years at Michigan State University. Initially, I had set out (with my colleagues) to further characterize the CLUE and OCLUE curricula. Previous work by my peers had explored how students engaged in reasoning in the context of these courses across multiple tasks and had compared CLUE and OCLUE cohorts to students in traditional chemistry courses. We wanted to supplement this research by investigating the student perspective and whether what students perceived they were doing in their courses aligned with the transformational and instructor intent.
My work focused on undergraduate students enrolled in general chemistry (CEM 141, CEM 142), cells and molecules (a general biology course; BS 161), and organic chemistry (CEM 251, CEM 252). All of the work here sought to investigate student experiences and perceptions within chemistry learning environments (with one course in biology), particularly in transformed courses, with the common narrative connecting back to the overarching learning culture on thinking and learning. The key takeaways from these studies are as follows:

Takeaway 1: Students in courses designed with three-dimensional learning perceive the use of knowledge is expected and assessed

In Chapters IV, V, and VI, I discussed studies that asked students to provide their perceptions of how they were expected to think in their courses, what was the most difficult aspect, and how they were assessed. These questions were asked in a variety of courses, including general chemistry I and II (CEM 141 and CEM 142), cells and molecules (BS 161), and organic chemistry I and II (CEM 251 and CEM 252). Although I had modified the questions several times to see if I could clarify responses, I consistently arrived at a similar finding: many students in courses that were informed by three-dimensional learning in some way, such as CLUE (CEM 141 and CEM 142) and OCLUE (CEM 251 and CEM 252), perceived that they were expected to use their knowledge in their courses and build on that knowledge, and that they were subsequently assessed on how they applied their knowledge (Bowen et al., 2022). In BS 161, a general biology course where three-dimensional learning has some influence, many students did perceive they were expected to use their knowledge and were assessed accordingly. In cases where I could compare responses to student responses in traditional courses (such as with organic chemistry), I noted differences, particularly in organic chemistry II (CEM 252). Here, I saw that students in the traditional course had rather different perceptions: that they were primarily expected to rely on rote knowledge (such as memorization) and that they were assessed on seemingly disparate specific topics. Although the differences in perceptions between transformed and traditional students in this regard seemed to be less pronounced in organic chemistry I (CEM 251), these differences were still noted to a smaller extent.

To make sense of these findings, I have consistently situated them within the context of the overarching learning cultures and the messages these cultures send to students. That is, the use of the scientific practices and the connections back to core ideas within the discipline (as described by three-dimensional learning) seem not only to be significant for engaging students in causal mechanistic reasoning, as evidenced by my colleagues' work, but also to communicate to students the practice of applying knowledge. Since cultures partly emerge from valued practices and ways of doing (Miller & Goodnow, 1995; Reinholz & Apkarian, 2018; Rogoff, 1990; Schein & Schein, 2016), consistently doing this throughout a semester and over a course sequence seems to have widespread implications for student learning and their perception of said learning.
Takeaway 2: The perception questions used in this dissertation can be easily and reliably incorporated into other courses

In Chapters V and VI, I discussed pilot studies that attempted to modify the perception questions, incorporate the questions outside of organic chemistry, and even replicate previous findings in organic chemistry. From these studies, I learned that modifying the perception questions may defeat our original purpose (to keep them open-ended and minimize prompting) and not help us in the long run. In a couple of studies, I attempted to include scaffolding in the perception questions to curtail responses that were not informative or capable of deeper analysis. In both cases, the scaffolding did not help: it either confused students, potentially induced question fatigue, or was flat-out ignored by students, who described their experience without adhering to the guidance in the prompt. That is, although scaffolding may be helpful, I was not able to find an adequate way of scaffolding these questions to assist with analysis.

Initially, the perception questions were developed for organic chemistry. However, in Chapter V, I incorporated the perception questions into general chemistry (CEM 141 and CEM 142) and cells and molecules (BS 161) to determine if they could be easily transferred. Although these were pilot studies and I was modifying the questions throughout, these pilots provided preliminary evidence that the perception questions can be easily adapted to any course and still provide relevant formative feedback on instruction and elements of the overarching course cultures. For CEM 141 and CEM 142, I showed how the questions could pick up on a variety of perceptions, which could be organized into themes similar to ours, if desired. In the case of BS 161, I demonstrated how data from the questions could be used to look at potential differences across different sections of the same course, which could help characterize efforts to baseline a course experience across multiple instructors. Although perceptions may not remain consistent from one time point to another (due to sociocultural influences), I have posited that they still afford us insights and a "snapshot" of student perceptions of overarching course cultures.

Of course, the major downside of this approach is the time it takes to analyze the data. I attempted to use AACR to train machine learning algorithms to automatically code student responses according to our codebook; however, my preliminary results did not yield a viable model. My conclusions with AACR were that: 1) in some cases, dummy responses were needed to increase the kappa value, indicating that more responses were needed; 2) for some datasets, most of the responses were coded into a couple of categories, not providing enough responses in other categories to help the model pick up on the diversity of codes; 3) in several cases, only certain perspectives were represented (such as having only transformed data and not traditional data, as was the case for CEM 141, CEM 142, and BS 161); and 4) on a basic level, the model struggled with the complexity and open-endedness of the responses. Although this could be remedied with more responses, we would need to ensure that a variety of courses and perspectives were represented, and that all categories had a decent number of responses present to help the model pick up on them in other datasets.
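For readers unfamiliar with this kind of workflow, the sketch below illustrates the general idea: train a text classifier on human-coded responses and judge machine-human agreement with Cohen's kappa. This is a minimal, hypothetical example in Python (using scikit-learn), not the AACR pipeline itself; the file name, column names, and model choice are assumptions for illustration only.

# Minimal, hypothetical sketch (not the AACR pipeline): train a simple text
# classifier on human-coded perception responses and report Cohen's kappa.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Hypothetical dataset: one row per student response, human-coded per a codebook
data = pd.read_csv("perception_responses.csv")  # columns: "response", "code"

# Stratify so rare codes appear in both splits; this step fails outright when a
# code has too few responses, mirroring the sparse-category problem noted above
train_text, test_text, train_code, test_code = train_test_split(
    data["response"], data["code"],
    test_size=0.2, random_state=42, stratify=data["code"])

# Bag-of-words (TF-IDF) features feeding a multinomial logistic regression
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),
    LogisticRegression(max_iter=1000))
model.fit(train_text, train_code)

# Cohen's kappa = (p_o - p_e) / (1 - p_e): observed agreement corrected for
# agreement expected by chance; values near 1 indicate strong agreement
predicted = model.predict(test_text)
print("Machine-human Cohen's kappa:", cohen_kappa_score(test_code, predicted))

Even in this toy form, the sketch makes the failure modes described above concrete: a dataset dominated by one or two codes gives the classifier little signal for the remaining categories, and kappa (unlike raw accuracy) penalizes a model that simply guesses the majority code.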
Takeaway 3: Students in organic chemistry may perceive "critical thinking" to entail the use of knowledge, which provides insights into the "Generalities" code present throughout the perception studies

In Chapter VII, I discussed my interview study with students in CEM 251 (organic chemistry I). The impetus behind this interview study was to generate some insights into what students meant when they used amorphous terms like "critical thinking" in their responses. The ubiquitous nature of the term in education also piqued my interest, and I set out to collect the perceptions of fourteen students who, at the time of the study, were enrolled in CEM 251 at Michigan State University. I interviewed students from OCLUE and traditional organic chemistry and found a commonality across all students: they perceived "critical thinking" to entail the use, or application, of knowledge and directly contrasted it with passive approaches, such as rote memorization. Although these findings cannot and should not be generalized to the larger student population, they do provide an interesting insight: when students say "critical thinking", they are envisioning something along the lines of the application of knowledge. Certainly, more work could be done here to gather more perceptions of what "critical thinking" entails across a larger student cohort. Regardless, a corollary takeaway is that "critical thinking" could be framed within the context of the scientific practices to clarify the construct (Bowen, 2022).

Takeaway 4: Students come from a variety of backgrounds and previous experiences that impact their perceptions of thinking and learning

In Chapter VII, I used semi-structured interviews to explore the perceptions of students regarding "critical thinking". Although the research questions were different from those of the other studies presented here, they were connected by the sociocultural thread running throughout this dissertation. Although acknowledging the influence of context and sociocultural factors is certainly nothing new (Vygotsky was doing it in the early 1900s!) (Vygotsky, 1978), this study offered a reminder that instructors cannot assume all of their students are starting at the same place. By the time students have arrived at the university, they have been socialized by previous academics, work, family, friends, and society, where certain practices, ways of doing, ways of thinking, and ways of speaking have been ingrained in them (Miller & Goodnow, 1995; Nasir & Hand, 2006; Rogoff, 1990; Vygotsky, 1978). That is, students are arriving at the threshold of undergraduate chemistry classrooms with preconceptions about what certain things are, such as "critical thinking", or what it means to learn (and how to do it). Therefore, these studies highlight that one should not assume that students share the same perspective or way of doing and thinking as the instructor. Instead, in the context of these studies, I advocate that instructors be very explicit about what they want students to know and do and how they intend students to learn.

Implications for Teaching, Learning, and Research in Chemistry Education

The implications that emerged are widespread. Although more work is certainly needed, the work presented in this dissertation generates insights into the overarching cultures of learning and the student experience.
As throughout this dissertation, the implications are tied to one another via a sociocultural "connective tissue" that encourages the consideration and further investigation of the influence of sociocultural factors on student experiences and learning.

Implication 1: Design, Enact, and Refine Cultures of Learning in Chemistry

I have previously noted that the overarching culture is a network of structures, symbols, and people. That is, it involves a complicated array of course policies, assignment tasks, expectations, practices, and valued ways of doing and thinking (among others) that students interface with in explicit and implicit ways. Said another way: learning cultures are complex. However, I have argued that by situating the work we do within sociocultural frameworks, we can centralize the significance of the classroom culture on learning and the student experience. I have never advocated that one should focus on every aspect of a learning culture at once; however, by centralizing culture, we can systematically address most of the underlying factors that influence student learning, behavior, persistence, and dispositions.

In the context of these studies, I have explored how courses using three-dimensional learning have communicated a consistent message to students: that the application and use of knowledge is expected and assessed. That is, the use of knowledge, through the use of the scientific practices, has been communicated as a valued practice within the context of these classrooms. Furthermore, this practice, a component of the learning culture, is perceived by students, indicating that many students are perceiving the transformational intent, which aligns with previous theories of learning. That is, there is alignment between what was intended and what students perceived (in conjunction with alignment between what was practiced and what was assessed). In some of the studies, this practice and culture was contrasted with the perceived culture and valued ways of doing in a traditional organic chemistry course, which seemed to value students relying mostly on rote knowledge (such as memorization). That is, the way material is presented, the way students are expected to practice, and the way students are assessed all communicate valuable messages about what it means to learn and do organic chemistry. Previous research on organic chemistry faculty perspectives never mentioned that these faculty overtly valued memorization (Duis, 2011), indicating a potential misalignment between what the instructor of the traditional course intended and what students were actually doing. By centralizing our focus on the classroom culture, we can highlight how certain practices used in the classroom communicate to students what it means to "do" chemistry. Furthermore, we can also highlight how assessments communicate valued ways of knowing and doing. Although instructors may want students to reason and understand why certain phenomena occur, a look at the messages students receive implicitly through assessment tasks may illuminate why students are taking a different approach.

Implication 2: Acknowledge Prior Experience and Be Explicit

Acknowledgement of student prior experiences is nothing new in chemistry education (Bodner, 1986). Therefore, this work acts as a reminder of the significance and importance of heeding this advice. Students are coming to our classrooms with a diverse range of experiences, training, and ideas. They are not all arriving at the threshold from the same place.
Considering sociocultural influences, we can better acknowledge that no two students come in with the exact same experiences and backgrounds, thus highlighting that we should be more explicit about what we want students to know and do. I see the acknowledgement of prior experiences as a precursor to this point of being more explicit. Terms that are ubiquitous yet have taken-for-granted meanings, like "critical thinking" and "problem solving", should likely be removed from syllabi and learning goals (Cooper, 2016; Stowe & Cooper, 2017). The work presented here offers some clarity on this construct (from the student perspective) and highlights that instructors should take the time to define exactly what they want students to know and do and ensure these objectives and goals are made explicit to students. Otherwise, students may be operating from their own experience when they interface with tasks that ask them to "think critically" or "use problem solving", and their ideas may not entirely align with the instructor's perception. In terms of "critical thinking", I (along with others in the literature) posited that the scientific practices in three-dimensional learning offer one way to be more explicit about what is meant when the term "critical thinking" is used.

Future Work

Perhaps not so surprisingly, as I finish writing up this summary of years of research, and though I'm happy to have the insights from these studies, I can't help but feel I'm left with more questions than answers. To begin, it may be beneficial to explore other scaffolding approaches to the perception questions. Although it is expected that there will always be some responses that are vague or uninformative, particularly given the open-ended nature of the questions, it may be possible to further clarify responses. Additional work with the AACR program is also in order. My preliminary kappa values were not acceptable in most cases; however, I was able to identify potential areas to move forward with regard to this facet of the perception questions. If a model is trained to analyze the perception questions according to our codebook, it would be a first for the AACR program (as all other models focus on content questions), and it would allow faculty to employ these questions in their courses and get quick formative feedback about what students are perceiving about the course. It would also be interesting to pilot these questions in upper-level chemistry courses to see if we still see similar patterns in responses.

Throughout the perceptions project, students used many other terms alongside "critical thinking", such as "problem solving", thinking "systematically", and "analyzing" reactions, among others. It could be helpful to explore student perceptions of these terms, much like I did with "critical thinking". One approach that may be helpful is to ask students the perception questions through beSocratic, analyze them, and then, using students' beSocratic IDs, follow up with them via a second beSocratic activity if they had responses classified in "Generalities". The goal here would be to attempt to eliminate most of the vague responses and classify them in meaningful ways.

REFERENCES

1. Bodner, G. M. (1986). Constructivism: A theory of knowledge. Journal of Chemical Education, 63(10), 873. https://doi.org/10.1021/ed063p873
2. Bowen, R. S. (2022). Student Perceptions of "Critical Thinking": Insights into Clarifying an Amorphous Construct. Chemistry Education Research and Practice.
3. Bowen, R. S., Flaherty, A. A., & Cooper, M. M. (2022). Investigating student perceptions of transformational intent and classroom culture in organic chemistry courses. Chemistry Education Research and Practice. https://doi.org/10.1039/D2RP00010E
4. Cooper, M. M. (2016). It Is Time To Say What We Mean. Journal of Chemical Education, 93(5), 799–800. https://doi.org/10.1021/acs.jchemed.6b00227
5. Duis, J. M. (2011). Organic chemistry educators' perspectives on fundamental concepts and misconceptions: An exploratory study. Journal of Chemical Education, 88(3), 346–350. https://doi.org/10.1021/ed1007266
6. Miller, P. J., & Goodnow, J. J. (1995). Cultural practices: Toward an integration of culture and development. New Directions for Child and Adolescent Development, 1995(67), 5–16. https://doi.org/10.1002/cd.23219956703
7. Nasir, N. S., & Hand, V. M. (2006). Exploring Sociocultural Perspectives on Race, Culture, and Learning. Review of Educational Research, 76(4), 449–475. https://doi.org/10.3102/00346543076004449
8. Reinholz, D. L., & Apkarian, N. (2018). Four frames for systemic change in STEM departments. International Journal of STEM Education, 5(1), 1–22. https://doi.org/10.1186/s40594-018-0103-x
9. Rogoff, B. (1990). Apprenticeship in Thinking: Cognitive Development in Social Context. Oxford University Press.
10. Schein, E. H., & Schein, P. A. (2016). Organizational Culture and Leadership (5th ed.). Jossey-Bass.
11. Stowe, R. L., & Cooper, M. M. (2017). Practicing What We Preach: Assessing "Critical Thinking" in Organic Chemistry. Journal of Chemical Education, 94(12), 1852–1859. https://doi.org/10.1021/acs.jchemed.7b00335
12. Vygotsky, L. (1978). Mind in Society. Harvard University Press.