This is to certify that the dissertation entitled "Towards a Conceptual Framework for the Continuum of Clinical Competence Development in an Undergraduate Osteopathic Medical Education Program: An Exploratory Study," presented by Shirley A. Weaver, has been accepted towards fulfillment of the requirements for the Doctor of Philosophy degree in Educational Administration and Curriculum. Major professor: Dick Featherstone.

TOWARDS A CONCEPTUAL FRAMEWORK FOR THE CONTINUUM OF CLINICAL COMPETENCE DEVELOPMENT IN AN UNDERGRADUATE OSTEOPATHIC MEDICAL EDUCATION PROGRAM: An Exploratory Study

By
Shirley Anne Weaver

A DISSERTATION
Submitted to Michigan State University in partial fulfillment of the requirements for the degree of DOCTOR OF PHILOSOPHY, Department of Educational Administration, 1983

ABSTRACT

TOWARDS A CONCEPTUAL FRAMEWORK FOR THE CONTINUUM OF CLINICAL COMPETENCE IN AN UNDERGRADUATE OSTEOPATHIC MEDICAL EDUCATION PROGRAM: An Exploratory Study

By Shirley Anne Weaver

This study is an interpretation of interview data on clinical competence development. The interview data were acquired in the context of a cross-sectional study of the training of physicians at the Michigan State University College of Osteopathic Medicine. Students at three levels of training were asked to describe in detail what they were able to do in the clinical setting and why. The responses to these questions were analyzed towards answering two basic questions: (1) can unique definitions of competence be developed for each of the three levels of training? (2) what variables in the instruction/learning process should be considered when developing a competence-based medical education program?

Students described doing specific clinical tasks, medical history and physical examination and medical problem solving, in ways that were unique to a given level of training. Not only what and how they did the task, but the perspective from which they viewed the task, differed for each level of training. From these descriptions were drawn four (4) continuums of competence which could provide the basis for defining clinical competence at each level of training: philosophic perspective; four aspects of cognitive development; four aspects of psychomotor development; and three aspects of attitudinal orientation. From students' explanations for why they could or could not do certain tasks were drawn six variables in the clinical competence developmental process: student's accumulated knowledge and skills; clarity of program goals and philosophy; congruity of curriculum and instruction; integration of theory and practice; instruction/role modeling; and the context of learning. Recommendations for further research were presented and implications of the findings for administrators of osteopathic medical education programs were discussed.

DEDICATION

To Paul, Calien and Wynne, whose joy in the hard work and satisfaction of learning modeled the continuum.

ACKNOWLEDGEMENTS

Student researchers depend on the guidance, support and forbearance of many - mentors, peers, family and friends - to get them through their first modest effort in scholarship.
Novice researchers can hardly lay claim to full credit for whatever product emerges from their efforts. Such is my case. It is impossible to properly acknowledge the many people who contributed to this dissertation. However, it would be remiss to not publicly thank a few to whom I am especially indebted for having helped make the study possible and, I hope, useful.

I must thank Deans Magen and Greenman for providing the clinical laboratory in which to learn and work. Without their moral and financial support I could not have conducted the study, nor would I have had the opportunity to develop a greater understanding and respect for osteopathic medicine and its practitioners. The kindness and helpfulness of their faculty and staff will be long remembered and appreciated.

Special thanks and best wishes are extended to the students of MSU-COM who participated in this and previous studies. I know that their respect for their education and concern for the students who follow in their footsteps inspired them to thoughtfully and candidly share their experiences and insights. I appreciate their trust in my ability to understand their reality. I am humbled by these young people - their dedication to becoming responsible physicians and their willingness to sacrifice time, money, and normal patterns of living in order to become such physicians.

Heartfelt thanks must go to Margaret Lorimer for placing me in the tutelage of Richard Featherstone and Paul Dressel. Dick Featherstone did more than guide me through the labyrinth of doctoral study; he encouraged me to trust my own ability--a trust that often flagged. There are no adequate words to express to Paul Dressel my appreciation for his contribution to my learning. The opportunity to interact daily with such an individual is a rare gift. He modeled for me the process of constant inquiry and analysis. He challenged and gently dragged and pushed me to see new things in everyday words and events. Would that there were such a scholar-mentor for every student at every stage of learning!

And finally, but hardly least, a special note of thanks must go to Connie Burch, who expertly and cheerfully typed the many drafts, and to Calien Lewis, who edited the final draft of this dissertation.
TABLE OF CONTENTS

Introduction

CHAPTER ONE: THE PROBLEM
    Traditional Curriculum Design and Evaluation
    Medical Competence and Competence-Based Education
    Purpose of the Study
    Definition of Terms
    Assumptions
    Limitations
    Conduct of the Study
    Overview

CHAPTER TWO: REVIEW OF THE RELATED LITERATURE
    Fundamentals of Competency-Based Education
    Issues Related to Operationalizing CBE Programs
    Theories of Learning and Competence Development
    Medical Education and Physician Competence
    Structuring Medical Education Programs for Competency Development
    Summary

CHAPTER THREE: METHODOLOGY
    Study Design
    The Interview Process
    Data Analysis
    Summary

CHAPTER FOUR: STUDENTS' PERCEPTIONS OF THE CONTINUUM OF CLINICAL COMPETENCE DEVELOPMENT
    Educational and Experiential Background
    History and Physical Examination
    Diagnosis and Patient Management
    Descriptions of the Continuum of Clinical Competence Development
    Summary

CHAPTER FIVE: STUDENTS' PERCEPTIONS OF THE VARIABLES IN DEVELOPING CLINICAL COMPETENCE
    Explanations for Clinical Competence
    Integration of Theory and Practice
    Summary

CHAPTER SIX: INSIGHTS, CONCLUSIONS, AND RECOMMENDATIONS
    Elements and Processes in the Continuum of Clinical Competence Development
    Re-conceptualization of the Continuum of Clinical Competence Development
    Recommendations

APPENDICES
    A. Interview Schedules

EXHIBITS
    Unit I H/P Competence
    Unit II H/P Competence
    Unit III H/P Competence
    Unit II Diagnosis and Treatment Competence
    Unit III Diagnosis and Treatment Competence
    H/P Skills Development
    Unit II H/P Competence Described by Unit III Students
    Explanations for Unit II H/P Competence
    Explanations for Unit II Competence Described by Unit III Students
    Explanations for Unit II Competence in Diagnosis and Treatment
    Explanations for Unit III Competence
    Nature of Most/Least Productive Externship Rotations
    Effect of Knowledge and Experience on Professional Competence
    Explanations of Unit II Integration of Theory and Practice
    Associations of Explanations for Unit II H/P Competence
    Associations of Explanations for Unit II Competence Offered by Unit III Subjects
    Associations of Explanations for Unit II Deficiencies Offered by Unit II Subjects
    Associations of Explanations for Diagnosis and Patient Management Competence Offered by Unit II Students
    Associations for Explanations of Deficiencies in Unit II Diagnostic and Patient Management Competence Offered by Unit II Subjects
    Associations for Explanations for Unit III Competence Offered by Unit III Students

BIBLIOGRAPHY

LIST OF TABLES
    Preliminary Statement of the Relationship of the Level of Training to Variables in the Continuum of Clinical Competence Development
    Comparison of the Sample and the Population on Selected Demographic Characteristics
    Antecedent Professionally-Related Skills Offered by All Subjects
    Antecedent Skills of Unit I Subjects
    Antecedent Skills of Unit II Subjects
    Antecedent Skills of Unit III Subjects
    Antecedent Skills of All Study Subjects
    Descriptors of Unit I H/P Competence
    Descriptors of Unit I H/P Competence Offered by Unit II Subjects
    Descriptors of Unit I H/P Deficiencies
    Descriptors of Unit I H/P Deficiencies Offered by Unit II Subjects
    Descriptors of Unit II H/P Competence Offered by Unit II Subjects
    Descriptors of Unit II H/P Competence Offered by Unit II Subjects in Each Experience Category
    Descriptors of Unit II H/P Competence Offered by Unit III Subjects
    Descriptors of Unit II H/P Deficiencies Offered by Unit II Subjects
    Descriptors of Unit II H/P Deficiencies Offered by Unit II Subjects in Each Experience Category
    Descriptors of Unit II H/P Deficiencies Offered by Unit III Subjects
    Descriptors of Unit III H/P Competence
    Descriptors of Unit III H/P Deficiencies
    Descriptors of Diagnosis and Treatment Competence Offered by Unit II Subjects
    Descriptors of Diagnosis and Treatment Deficiencies Offered by Unit II Subjects
    Descriptors of Diagnosis and Treatment Competence Offered by Unit III Subjects
    Descriptors of Diagnosis and Treatment Deficiencies Offered by Unit III Subjects
    Self-Descriptors of Ideal Clinical Competence
    Descriptors of Ideal Clinical Competence
    Explanations for H/P Competence Offered by Unit I Subjects
    Explanations for Unit II H/P Competence Offered by Unit II and III Subjects
    Explanations for Unit II Competence Described by Category I-IV Subjects
    Explanations for Unit II Competence Described by Category V Subjects
    Explanations for Unit II Competence Offered by Category I-III Unit III Subjects
    Explanations for Unit II Competence Offered by Category IV-V Unit III Subjects
    Explanations for Unit II H/P Clinical Skills Deficiencies Offered by Unit II and Unit III Subjects
    Explanations for Unit II Deficiencies Offered by Category I-III Unit II Subjects
    Explanations for Unit II Deficiencies Offered by Category V Unit II Subjects
    Explanations for Unit II Deficiencies Offered by Unit III Subjects
    Associations of Explanations for Unit II Deficiencies Offered by Category I-III Unit III Subjects
    Associations of Explanations for Unit II Deficiencies Offered by Category IV-V Unit III Subjects
    Explanations for Diagnosis and Patient Management Competence Offered by Unit II Subjects
    Explanations for Diagnosis and Patient Management Deficiencies Offered by Unit II Subjects
    Associations of Explanations for Deficiencies in Unit II Diagnostic and Patient Management Competence Offered by Category II-IV Unit II Subjects
    Associations of Explanations for Deficiencies in Unit II Diagnostic and Patient Management Competence Offered by Category V Unit II Subjects
    Percentage of Unit II Students Seeking Certain Features of Preceptorship
    Percentage of Subjects Reporting Preceptorship Activity
    Explanations for Unit III Clinical Competence
    Comparison of Experienced and Inexperienced Subjects' Explanations for Unit III Competence
    Explanations for Unit III Competence Offered by Category I-III Subjects
    Explanations for Unit III Competence Offered by Category IV-V Subjects
    Comparison of Experienced and Inexperienced Subjects' Explanations for Unit III Deficiencies in Diagnostic and Management Competence
    Associations of Explanations for Unit III Deficiencies in Diagnostic and Management Competence Offered by Category I-III Unit III Subjects
    Associations of Explanations for Unit III Deficiencies in Diagnostic and Management Competence Offered by Category IV-V Unit III Subjects
    Areas of Perceived Insufficient Preparation for Unit III
    Least and Most Productive Unit III Clinical Rotations
    Explanations for a Unit III Clinical Rotation Being Very Supportive of Clinical Competence Development
    Explanations for a Unit III Clinical Rotation Being Unsupportive of Clinical Competence Development
    Relationship of Criteria for Productive Clinical Rotation to Specific Unit III Rotation
    Relationship of Criteria for Unproductive Clinical Rotation to Specific Unit III Rotation
    Medical Problem Solving Competence when Practice Precedes Theory
    Effect on Classroom Learning of Practice Preceding Theory
    The Effect of Having Theory Precede Practice
    Descriptors of the Effectiveness of the CPSS in Integrating Theory and Practice
    Explanations for Integration of Theory and Practice Offered by Unit II Subjects
    Preliminary Statement of the Relationship of the Level of Training to Variables in the Continuum of Clinical Competence Development

LIST OF FIGURES
    Preliminary Conceptualization of the Continuum of Clinical Competence Development
    Nature of Outcome Expectations
    Orientations to Student Opportunity Structures
    Flow-chart of Cognitive Processes
    Organizational Framework for Exploring Questions about Learning, Understanding, and Remembering
    Conceptual Systems and Reality
    Matrix for Specifying Competence Criteria
    MSU-COM Basic Curricular Model
    Elements in Defining Clinical Competence
    Defining Clinical Competence for an Individual at One Level of Training
    Preliminary Conceptualization of the Continuum of Clinical Competence Development
    Conceptualization of the Relationship of the Variables in the Process of Competence Development

Introduction

Michigan State University College of Osteopathic Medicine (MSU-COM) was the first college within the profession to be publicly supported and to become part of a public university.¹ As part of its commitment to furthering osteopathic medical education, the College, in 1974, initiated a study of its professional training program, the first of its kind within the profession, under the guidance of Paul L. Dressel, Ph.D. The study started from the premise that a professional training program should be evaluated by examining total outcomes or competencies.

    Instead of asking what courses a student in a professional program ought to take or what knowledge is essential, this approach concentrates on those who have almost completed their training, identifying what they think and feel about their program and profession. (Sharma and Dressel 1975, p. 2)
A series of studies have been undertaken as part of this on-going effort to carefully examine the MSU-COM curriculum and its relation to student professional development. The studies have variously described the values and concerns of interns and externs (Sharma 1975, 1976), program evaluation by students (J. Zines-7:91 1977; Weaver 1980), predictions of academic achievement (West 1979), issues in examining and grading (P. Dressel 1979), attitudes and values in osteopathic medical education (Greenman and P. Dressel 1980), and curriculum analysis (P. Dressel 1981). Each of the studies has contributed to a further understanding of the educational process in which MSU-COM students are engaged. These insights have been shared with other colleges of osteopathic medicine and the profession through the publication of Occasional Papers, and some have provided the basis for presentations at national conferences on osteopathic medical education.

¹ Public legislation passed in 1969 transferred the Michigan College of Osteopathic Medicine, chartered in 1964 at Pontiac, Michigan, to Michigan State University in 1971.

This study is an extension of the larger, longitudinal study of the MSU-COM curriculum and its students. It focuses specifically on the variables attending the cumulative process of developing professional competencies. As in past studies, students' perceptions provide the basis for the descriptive study. In contrast to previous studies, which considered students' perceptions of the external processes of their education (courses, teaching, grading, curriculum structure), the present study focuses on the internal processes of the students (the specific things that students know and are able to do in the clinical setting at each of the three major stages of the curriculum), and the relation of these internal processes to the external processes of the program. The study addresses the broad question: How do students accumulate and integrate the knowledge, skills and attitudes necessary to performing the role of an osteopathic physician?

While the study is a case study, concerning itself with a single educational program, its relevance to the remaining 14 colleges of osteopathic medicine is presumed. As the social demands for accountable and efficient educational systems become increasingly persistent, osteopathic medical educators must ask themselves the critical question: How can we be certain that we are producing physicians who competently practice according to the tenets of the profession? The current study is an important first step towards answering that question.

Chapter 1
THE PROBLEM

American higher education, including professional education, is now under more critical public scrutiny than perhaps at any time in its history (Riesman 1979). The professions have in the past enjoyed a considerable degree of autonomy in managing the means by which individuals enter and maintain their rank status, because society has assumed that exposure to a formal educational program results in the graduate developing the necessary professional skills and virtues. While the credentialing process has typically required graduates to demonstrate their knowledge of the curriculum content, it usually has not required their demonstrating ability to perform actual professional tasks. That is, the professional school curriculum was assumed to be relevant to competent practice (Olesen 1979; Olmstead undated). That assumption is now being challenged.
Traditional Curriculum Design and Evaluation

Medical education has been said to have evolved in three stages: the dogmatic era, the empiric era, and the scientific era (Flexner, 1910). Since the nearly universal implementation of the "Flexnerian Scientific" curriculum model, medical educators have continuously modified the curriculum content to accommodate new scientific and social demands. It has been expected that graduates will be knowledgeable of the latest and most advanced techniques for managing disease, if not proficient in their use (Armstrong 1977). This expectation is coupled with the belief that a thorough foundation in the basic and medical sciences is a fundamental prerequisite to such knowledge (McGaghie 1978).

The study of disciplines, i.e., cognitive knowledge, has been the focus of the traditional Flexnerian curriculum. Even the clinical experience, which follows the didactic years in this model, remains focused on the cognitive aspects of the clinical discipline and specific, often uncommon, disease. That is, medical school curricula have traditionally been designed to include the knowledge thought to be critical to the physician's professional performance. Recent modifications have expanded the curriculum to include new areas of knowledge in response to medical students' and patients' demands that the application of scientific and medical knowledge be tempered with knowledge of ethics, sociology, anthropology, psychology and epidemiology (Cope 1968; Krevans and Condliffe 1970; Jesse 1971; Shapiro and Lowenstein 1979).

It has become apparent in recent years that the traditional goal in medical education of gaining encyclopedic knowledge is no longer feasible. A significant turning point in medical education came with a report recommending that the educational process be directed more towards learning to problem solve, gaining skills to ensure life-long learning, and emphasizing M care (Coggeshall 1965). In response, new schools of medicine were developed, many taking these recommendations as their ideological starting point. Two features characterized the "new" schools: (1) a broader psycho-socio-physiologic paradigm for understanding health and illness, and (2) the integration of skills development and/or clinical experience throughout the curriculum (Lippard and Purcell 1972). Numerous curricular and pedagogical innovations were infused into these new medical educational programs, including: use of simulated patients, computer-assisted instruction, systems biology, behavioral objectives, and medical problem solving. Each innovation was informed by the then current thinking in educational psychology, management science or curriculum. Each was inspired by a specific instructional or research problem, and each was seen as a means to increasing the relevance and effectiveness of the educational program, i.e., increasing the student's professional competence. Despite these intense efforts, undergraduate medical education programs typically persist in emphasizing the acquisition of knowledge of disease (Armstrong 1977; Engel 1978; Jonas 1978; Weed 1978). The relationship of this knowledge to the acquisition of competencies necessary to perform in the professional role remains unaddressed.

Evaluation has been a central feature of the modern scientific curriculum. The earliest student evaluation efforts of medical schools were given to the student selection process.
Subsequent efforts, consistent with the emerging psychometric theory, were concerned with reliably measuring course achievement. Throughout the twentieth century, boards of examiners have assumed the social responsibility of determining the "competence" of graduates of medical schools. These boards have persistently attempted to improve the credentialing examination in keeping with current conceptions of "competence" and measurement theory (Hubbard 1971; Senior 1976). Evaluation of candidates for licensure, originally conducted at bedside by master-physicians, has thus become a pencil and paper examination of knowledge and medical problem solving.

More recently medical school evaluation efforts, concomitant with changes in curriculum, have focused on measuring clinical performance. The literature is replete with reports of methods for measuring clinical performance, including: the objective structured clinical examination; patient management problems; audits of medical records, supervisors' reports, and project work; and case studies (Harden 1979). Clinical performance evaluation has posed significant problems for educators. Since clinical situations differ for every student, equivalent testing conditions cannot be established for all students. And, since evaluators (especially in community-based programs) are busy, independent, idiosyncratic clinicians, standards for evaluation vary and comprehensive written reports are difficult to obtain. Educators have faced three major problems arising from the complex nature of clinical performance evaluation: reliability, validity and precision. Despite persistent and creative efforts to overcome these measurement problems, studies continue to find that there is little, if any, correlation between academic performance and professional performance (Price 1971; Wingard and Williamson 1973; Bunda and Saunder 1979).

Underlying the many and knotty problems of designing and evaluating medical education has been a fundamental problem: the lack of a clear and valid conception of medical competence and competence-based education.

Medical Competence and Competence-Based Education

Medical educators have gradually shifted their focus towards a competence development conception of medical education (Samph and Templeton 1979). A recurring theme in the discussions of medical education is the need to think of medical education as a lifelong continuum of professional competence development (McGaghie et al 1978; Taskforce on Graduate Osteopathic Education 1979; American Board of Medical Specialties 1979; Samph and Templeton 1979). The current concern for clinical competence calls for re-examination of educational policies and assumptions. Educational institutions have responded in various fashions to these new demands: modifying admissions criteria, employing professional educators and evaluators, including or increasing social and behavioral science subject content, framing new paradigms for distributing health care resources, utilizing community health care resources in the educational process, introducing students to clinical skills and clinical settings early in the program, and requiring students to demonstrate that they can perform in clinical situations. "Perform what?" and "how?" have been the central questions educators have attempted to answer during the past several decades (Burg and Lloyd 1979; McGaghie 1978). Nearly all efforts have been directed towards answering the first question, "perform what?"
Extensive efforts have been made, particularly by medical specialty boards, to define the role and tasks of physicians. Early efforts, which resulted in identifying nine broad task areas including history, physical examination, tests and procedures, etc., have been refined and expanded to include criteria for performance (to attain specialty certification). In addition, there have been proposed theoretical models by which to make more clear the desired clinical competence and the context in which it is to be demonstrated (De Luca et al 1965; Burg and Lloyd 1979). These recent efforts to clarify the construct of competence have gone a long way towards illuminating the deficiencies of traditional notions of professional competence and medical school curriculum design. But, although in at least one area of professional competence, medical problem solving, differences in trainee and professional-level competence have been shown (Elstein, Shulman and Sprafka 1976), there continues to be no specific definition (standards) of clinical competence for medical students. And, although advocated (McGaghie 1979), no effort appears to have been made to define differences in students' competence at different levels of medical school training.

Despite what appears to be a widespread concern for defining, developing and evaluating medical competence, competence-based medical education is only now being seriously considered as a curriculum model. That is, despite the inclusion of early clinical experiences in the medical curriculum and extensive efforts to improve clinical evaluation, medical education has continued to emphasize the acquisition of knowledge. The World Health Organization recently proposed that medical education programs be designed following the tenets of competence-based education (McGaghie et al 1979), thus emphasizing the attainment of functional competence. Unfortunately, no single definition of "competence," nor any single conception of the structure of CBE programs, has emerged from the studies of the teacher education CBE programs which have provided the basis for current theory (Burke et al 1975; Frahm and Covington 1979; Nickse 1981). The lack of a definition of competence and the lack of established standards by which to judge student competence are acknowledged as the major barriers to establishing competence-based educational programs (Senior 1976; Bunda and Saunder 1979; McGaghie et al 1979; Spady and Mitchell 1977; Monjam and Gassner 1979). It has also been pointed out that competence-based educational programs must also reflect the nature of the process by which students acquire the desired competence (McDonald 1974). Medical educators have increasingly attempted to reflect the nature of the physician's practice role in the content of their curricula, thus meeting the first criterion of a competency-based educational program. The other criterion, designing the program to reflect the nature of the acquisition process by which the student-physician acquires those professional competencies, has most frequently not been met. Osteopathic as well as allopathic medical education programs have neither established explicit standards of professional competence for graduates, nor characterized the process by which students acquire professional practice competence.
A conceptual framework of the continuum of professional competence development remains to be described in order that the necessary theory and definitions for a model of competency-based osteopathic medical education can be developed.

Purpose of the Study

The current study was undertaken to describe the continuum of clinical competence development, towards conceptualizing a competence-based educational program for osteopathic medical students. Consistent with the principles of CBE, the study was conducted from the perspective of the student rather than from the perspective of the goals of the curriculum or the objectives of the instructors: what students described themselves as able to do in the clinical setting and why.

Preliminary studies of the MSU-COM curriculum and extensive interactions with MSU-COM students had led the investigator to certain preliminary conceptions of clinical competence development in osteopathic undergraduate medical education. The study began with those preconceptions. Figure 1.1 uses a Guttman mapping sentence to describe the assumed relationships of the content of the program (Facet A) to the clinical conditions of practice (Facets C-F), and learners' activities (Facets B and G) and performance levels (Facets H-L). Certain systematic relationships, as revealed in Table 1.1, represent the differences thought to characterize Unit I, Unit II and Unit III student competencies and performance behaviors. Specifically, these preconceptions propose that students at each of the three levels of training are able to assume different tasks and to solve different medical problems, because they have different knowledge and are in different practice settings. The study also assumes, as do traditional clinical evaluation criteria, an idealized professional-level standard against which to judge independence, accuracy, efficiency and confidence. And yet, this preconception was not altogether satisfying, since students' anecdotal accounts of their performance did not always affirm its basic assumption: clinical competence follows didactic instruction.

This study, then, was designed to elucidate the process by which MSU-COM students acquire professional skills, and to confirm the presumptions regarding the competencies acquired at each of the three levels. Specifically, the study addressed two central questions:

. What clinical skills (competencies) have students acquired by the end of each of the three phases of the educational process?
. What conditions facilitate or inhibit competence development?

A related question guided the investigation:

. How does the formal coursework (theory) relate to the development of clinical competence (practice)?

The central purpose of the study was to describe, as accurately as possible, the complex process of clinical competence development in order to guide more definitive research. The immediate intent was to refine the initial statement to reflect both the variables that affect learners and the competence they describe. That is, it was thought that such an exploratory, descriptive study was a necessary first step towards developing a conceptual framework for competence-based osteopathic medical education. It was also anticipated that the descriptive statement of subjects would be of particular interest to administrators of the MSU-COM curriculum.

[Figure 1.1: Preliminary Conceptualization of the Continuum of Clinical Competence Development - a Guttman mapping sentence relating Facets A through L]
[Table 1.1: Preliminary Statement of the Relationship of the Level of Training to Variables in the Continuum of Clinical Competence Development - Units I, II, and III characterized across Facets A through L]

[Matrix relating clinical tasks to categories of ability - knowledge, problem solving, attitudes, interpersonal skills, technical skills - etc. (American Board of Medical Specialties 1979:24)]

Rather than attempt to state the subject matter for every clinical situation, they suggest that examples of important diseases or health maintenance activities could serve as representative samples for the domain. The Lloyd and Burg model takes into account the important issue of the context of the clinical problem, an issue which heretofore had not been considered in medical competence models, but provides little guidance in operationalizing the concept.

LaDuca et al. (1975) developed a performance situation model to select training objectives for several allied health occupations which provides such a guide. The LaDuca model proposes a three-dimensional universe for inventorying the clinical problems that can be encountered by the professional: the client, the problem, the setting. From this inventory, clinical situations are selected which have the highest priority, thus eliminating rarely encountered problems and
unnecessary duplication of learning experiences. Then for each critical training situation the professional tasks to be performed are specified. Together the Lloyd and Burg, and LaDuca models provide a promising place to start to define a performance model for physician competence. For example, existing inventories of cases managed by physicians in general practice (Bergman 1969; Baker 1978; Kroeger 1978; Weinberger 1976; Golden 1976; HEW 1978) could provide the universe from which to select the critical cases to be elucidated by the performance situation model. The Lloyd and Burg matrix could then guide the definition of abilities for each of the tasks associated with the clinical situation. Thus, by combining the two models, the limitations of each could be minimized.

For example, the Lloyd and Burg model uses "ill patient" and "health maintenance situation" to dichotomize the situational context, which is insufficient to distinguish the problem solving, available resources and management strategies that would be expected of the physician managing, for example, upper respiratory infection (URI) cases. It can be envisioned that a child with a URI seen as an outpatient in early stages of the infection would be managed very differently than if that child required hospitalization because of complications. One might also envision that a physician whose philosophy included the limited use of hospitalization and extensive use of alternative therapeutic modalities might consider very different management strategies from a physician whose philosophy endorsed rigorous drug therapy and use of hospital staff to provide the therapy.³

³ Ms. Wendy Page-Echols, a third-year osteopathic medical student, is credited with having provided this insight into the complex nature of the construct 'context' by describing differing approaches to management of these kinds of cases.

The LaDuca model also provides a means of analyzing psychosocial and physical dysfunctions that can be associated with a particular clinical situation, an elucidation which should be particularly accommodating for the wholistic philosophy of osteopathic medicine. On the other hand, the Lloyd and Burg model elaborates the abilities and behaviors essential to performing the specific professional tasks required of the selected clinical situation.

Jason (1979) cautioned that statements of competence should comply with standards of clarity, importance, difficulty and pertinence to outcomes. He proposed the following checklist by which to evaluate statements:

1. Is it clear what the Candidate needs?
   . to me?
   . to others?
   . to the Candidate?
2. Is it clear why the Candidate needs this competence?
   . to me?
   . to others?
   . how do I know?
   . how sure am I?
3. How regularly will the Candidate need this competence?
4. Under what conditions will the Candidate need this competence?
5. What are the consequences of the Candidate not having this competence? (1979:28)

In answering these questions one must, as Jason points out, consider the principles of learning previously outlined. For example, knowing something (competence) is bound to the situation in which it is learned; hence, knowledge gained in didactic courses isolated from the clinical setting is not readily translated to clinical problem solving and is, therefore, not necessarily a clinical competence.
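The combination outlined above - LaDuca's three-dimensional inventory of performance situations filtered down to high-priority cases, with tasks (and, following Lloyd and Burg, the abilities behind each task) attached to each retained situation - can be illustrated with a minimal sketch. Everything in it (the client, problem, and setting values, the priority weights, the threshold, and the task list) is invented for illustration and is not drawn from LaDuca et al or from Lloyd and Burg.

```python
from dataclasses import dataclass, field
from itertools import product

# Illustrative performance-situation inventory: every combination of client,
# problem, and setting is a candidate clinical situation.  All values,
# priority weights, and tasks below are hypothetical.

CLIENTS  = ["child", "adult", "elderly adult"]
PROBLEMS = ["upper respiratory infection", "low back pain", "hypertension"]
SETTINGS = ["ambulatory clinic", "emergency department", "inpatient ward"]

# Stand-in weights for how commonly or critically each situation is met in
# practice; a real inventory would draw these from practice data.
PRIORITY = {
    ("adult", "hypertension", "ambulatory clinic"): 0.9,
    ("child", "upper respiratory infection", "ambulatory clinic"): 0.8,
    ("elderly adult", "low back pain", "ambulatory clinic"): 0.6,
}

@dataclass
class TrainingSituation:
    client: str
    problem: str
    setting: str
    priority: float
    tasks: list = field(default_factory=list)   # professional tasks to be performed

def build_universe():
    """Enumerate the full client x problem x setting universe."""
    return [TrainingSituation(c, p, s, PRIORITY.get((c, p, s), 0.1))
            for c, p, s in product(CLIENTS, PROBLEMS, SETTINGS)]

def select_critical(universe, threshold=0.5):
    """Drop rarely encountered situations, keeping only high-priority ones."""
    return [sit for sit in universe if sit.priority >= threshold]

if __name__ == "__main__":
    for sit in select_critical(build_universe()):
        # For each retained situation the professional tasks would now be
        # specified, and abilities defined for each task.
        sit.tasks = ["history", "physical examination", "management plan"]
        print(f"{sit.client} / {sit.problem} / {sit.setting} -> {sit.tasks}")
```

The point of the sketch is only that the universe of situations is enumerable and that priority, rather than disciplinary tradition, determines what is retained for training.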
Defining competence as clinical judgment: When, in the 1950's and '60's, certification examinations were criticized for their subjectivity and lack of reliability, efforts were made to devise objective tests of competence. One definition of clinical competence was "taken to mean using knowledge to solve problems" (Senior 1976:16).⁴

⁴ See Elstein, Shulman and Sprafka, Medical Problem Solving, for a discussion of the theoretical and methodological issues of medical problem solving and their implications for medical education.

In an extensive study undertaken by the National Board of Medical Examiners and the American Board of Internal Medicine, it was shown that computer-based clinical management cases provide a means of offering all examinees equivalent tests and testing conditions, of objectively and reliably scoring test results, and of analyzing the examinees' clinical problem solving styles (Senior 1976). It also revealed that partially and fully-trained medical personnel (medical students and internists) could be distinguished on the basis of their clinical work-up style. This extensive study concluded, among other things, that knowledge is necessary but not sufficient for medical problem solving competence, which, in turn, is necessary but not sufficient for good performance in practice.

Evaluating Competence in the Clinical Setting

Despite vigorous efforts in the last several decades to improve educators' evaluation skills, to improve testing instruments, and to refine models of evaluation, evaluation of medical students in the clinical setting has remained a troublesome business. The single most important factor contributing to the problem is the insufficiency of definitions of competence. As Samph and Templeton (1979) have argued, definitions of competence and availability of reliable methods of evaluating competence are inextricably interwoven. Concerns for psychometric issues (particularly reliability) have shaped the kinds of testing done and, thus, the definitions of competence. For example, as was noted above, the National Board of Examiners defined professional competence in terms of clinical judgment, which lent itself to objective and reliable testing. Evaluating performance, however, has been more difficult. Samph and Templeton conclude that important aspects of professional competence, previously ignored through objective evaluation mechanisms, or assumed to have been reliably assessed in clinical training, will in the future have to be defined and evaluated by medical schools, if for no other reason than legal challenges to current certifying procedures. Demonstrating validity of indirect measures of professional ability and performance, they say, will continue to pose the greatest challenge.

A second, and related, contributing factor is the lack of standards for the competencies that are described. Reported studies of clinical evaluation have typically employed either: normative rating scales, where the student's performance is judged in one of a number of ways as "above average," "average," or "below average;" or criterion-referenced check lists, where specific behaviors are reported as being present or absent. Attending this problem is the lack of definition of reasonable expectations of students at various levels of training. While differences in ability to solve clinical problems have been shown for individuals at various levels of professional training (Senior 1976; Mazzuca, Cohen and Clark 1981; Elstein, Shulman and Sprafka 1976), there still remains little qualitative information by which to guide the establishment of standards of performance for students at different levels of medical education. And without such standards evaluators are left to their own judgments as to what is 'average' or what is reasonable for a student to omit or commit in his/her performance.
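The difference between the two reporting schemes just described can be made concrete with a short sketch; the student identifier, checklist items, and labels are invented rather than taken from any cited study.

```python
from dataclasses import dataclass

# Hypothetical records of one student's history-taking performance under the
# two reporting schemes described above; items and labels are invented.

@dataclass
class NormativeRating:
    student: str
    skill: str
    rating: str          # relative to peers: "above average" | "average" | "below average"

@dataclass
class CriterionChecklist:
    student: str
    skill: str
    behaviors: dict      # specific behavior -> observed (True) or not (False)

norm = NormativeRating("student_017", "history taking", "average")

crit = CriterionChecklist(
    "student_017",
    "history taking",
    {
        "elicits chief complaint in patient's own words": True,
        "obtains complete medication history": False,
        "screens for relevant family history": True,
    },
)

# The criterion-referenced record identifies exactly which expected behaviors
# were absent, so it can feed specific feedback; the normative rating cannot.
missed = [b for b, seen in crit.behaviors.items() if not seen]
print(f"{norm.student}: {norm.rating}; behaviors to remediate: {missed}")
```

The criterion-referenced form is the one a competence-based program needs, because it reports which expected behaviors were absent rather than how the student compares with peers.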
Another significant problem in clinical evaluation is the variability of testing conditions and ratings. Clinical situations differ for each student, making it difficult if not impossible to establish equivalent testing conditions for all students in a given clinical situation or in a given class. Field studies in clinical settings of students' reactions to their experiences and descriptions of their typical clinical performance (Schermerhorn 1979; Sachoff 1979; Gordon et al 1977) point up the problems of sampling attendant in measurement techniques using structured clinical performance examination, and the artificiality of such techniques. More recent efforts have been made to combine the realism of the qualitative (field evaluation) studies with the higher reliability of the measurement techniques; videotaping, in particular, has proved effective, though costly (Liu, Miller and Herr 1980). As was noted previously, the context of clinical performance is a critical variable in determining what can be performed and how students do perform; thus, the criteria for performance must to some extent be situation specific. It has also been shown that clinicians use idiosyncratic bases for judgments and, hence, reliable clinical evaluations are difficult to obtain (Littlefield et al 1981). It is time-consuming and costly to adequately sample students' clinical performances and to assess them reliably.

Yet another problem has been the difficulty in determining the validity of clinical performance standards. Some studies have approached the question of the validity of performance ratings by attempting to correlate them with other measures such as the written and oral cognitive examinations. This has proved to be an unproductive approach, resulting in the conclusion that performance ratings measure abilities significantly different from those measured in cognitive assessments (Willoughby, Gammon and Jonas 1979). Even fewer studies have undertaken to test the validity of training objectives. One such study was unable to justify the inclusion of interviewing skills in medical training, since interviewing skills were not significantly correlated with problem solving, data collection, or problem identification (Brockway 1979)! And Greenland et al. (1979) showed that increased knowledge of diagnostic test characteristics correlates poorly with increased selectivity of use of those tests in the diagnostic process. More important than raising questions about training to be included in the medical curriculum, these studies point out the complex problems attending defining and measuring competence.

Brockway raises another important evaluation issue: credibility. She points out that unless evaluation is seen to be relevant and valid, it will be perceived as a hurdle rather than as a benefit to the learner.
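The rater-idiosyncrasy problem noted above is usually quantified by comparing two evaluators' independent judgments of the same performances. The sketch below uses invented pass/fail ratings for ten students; Cohen's kappa corrects raw agreement for the agreement expected by chance. The numbers do not come from any of the studies cited.

```python
from collections import Counter

# Invented pass/fail ratings by two evaluators for ten students on one
# checklist item; the data are illustrative only.
rater_a = ["pass", "pass", "fail", "pass", "pass", "fail", "pass", "pass", "fail", "pass"]
rater_b = ["pass", "fail", "fail", "pass", "pass", "pass", "pass", "fail", "fail", "pass"]

def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters over the same cases."""
    n = len(a)
    categories = set(a) | set(b)
    p_observed = sum(x == y for x, y in zip(a, b)) / n
    marg_a, marg_b = Counter(a), Counter(b)
    p_expected = sum((marg_a[c] / n) * (marg_b[c] / n) for c in categories)
    return (p_observed - p_expected) / (1 - p_expected)

raw = sum(x == y for x, y in zip(rater_a, rater_b)) / len(rater_a)
print(f"raw agreement: {raw:.2f}")
print(f"Cohen's kappa: {cohens_kappa(rater_a, rater_b):.2f}")
```

With these invented numbers the raters agree on 7 of 10 cases, but kappa is only about 0.35, which is why raw percent agreement alone tends to overstate the reliability of clinical ratings.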
Traditionally student evaluation has focused on measuring what students are able to do in order to certify their successful (or unsuccessful) completion of a course of study. In recent years medical schools have endorsed the teaching-by-objectives approach to instruction, which, in turn, has encouraged evaluation-by-objectives. While this approach, in theory, allows for evaluation of instructional process as well as student achievement, it has exhibited operational weaknesses: the objectives frequently are written to accommodate measurement tools rather than to reflect the intent of the course or program; objectives tend to be individual faculty course objectives, which cumulatively may or may not adequately reflect the program goals and objectives; and evaluation tends to be a terminal process rather than providing faculty, students and administrators with on-going information by which to adjust instruction and learning (Cuba 1969). Jason and Westberg take the position that evaluation in the health professions has such important social and psychological consequences that it must be conducted through a democratic process. Evaluation, they say, should provide "trustable answers to worthy questions" (1978:23) posed by students, faculty, administrators, and the public. Their examples of questions are useful to consider. STUDENTS: -- "How much promise do Ihave as a health professional?" -- "How effective a learner am I?" -- "How should I modify my learning efforts at this time?" 58 FACULTY: -- "How effective an instructor am I?" —- "How should I modify my current efforts in behalf of my students?" -- "How successful was my course/ program in preparing my students for their careers?" ADMINISTRATORS: -- "How effective is our college/ department in preparing our students for their careers?" —- "How can we improve upon what we are now doing?" PUBLIC: —- "How well prepared are the graduates of this program to provide what society most needs?" -- "How efficiently has the program used the resources society has provided? (1978:23) Jason and Westberg make the important point that the typical evaluation mechanism in medical school, content-oriented examinations which make demands on fairly low level, passive, intellectual skills, are counter—productive to developing independent learners who are critical thinkers and skilled problem solvers. Also contributing to this failure to develop independent learners is the lack of a self- evaluation component of the evaluation system (Jason and Westberg 1978; Fuhrmann and Weissburg 1978). Ultimately the evaluation system elected and designed for a program depends on the philosophy, resources (money, time, and personnel), and receptivity of the 5 academic community. The evaluation system can be designed to focus at any Descriptions and typologies of theoretical models of evaluation can be found in numerous texts. See for example: Worthen and Sanders 1973; Anderson and Ball 1978; House 1979; Patton 1981. 59 aspect of the program, or the entire program; and be intended to provide on-going information for program planning purposes, such as the model proposed for osteopathic medical education by Dressel (1981); provide comprehensive information about students' clinical competence, such as the models outlined by Harden (1979), and Graham (1971); or describe the student performance and instructional process in the clinical setting, such as the model developed by Gordon, Hadac and Smith (1977). 
Structuring Medical Education Programs for Competency Development

Medical education program planners historically have employed one of several curriculum models and have used various rationales for the sequencing of curriculum content (courses).

Curriculum Models for Medical Education

Medical education programs have traditionally been subject-centered. In the traditional curriculum model, the first several years of the program consist of a series of discrete courses representing the science disciplines thought essential to gaining the theoretical basis of medical practice, such as anatomy, biochemistry, physiology, pathology, pharmacology, microbiology, biostatistics and embryology. The next several years consist of analogous discrete courses (rotations) in the clinical disciplines, such as gynecology, obstetrics, pediatrics, surgery, hematology and psychiatry. An implicit assumption of this model of medical education is not only that students become 'competent' in the disciplines as a result of such course work, but that students can and do apply the theory in practice. McGaghie et al (1979) contended that such programs have certain undesirable consequences for medical students and their future patients by:

. emphasizing factual knowledge of independent disciplines;
. limiting students to the same education, ignoring individual differences;
. attending to less common clinical problems;
. implicitly focusing on human disease rather than emphasizing health; and
. promoting insularity of the disciplines and their practitioners.

A curriculum model which emerged in the 1950's through the pioneering efforts of Case Western Reserve Medical School emphasizes the integration of the disciplines around a topic, such as an organ system (Sinclair 1972). In its ideal implementation clinical experiences are concurrent with the integrated didactic course work. This model assumes that learning has greater meaning and retention is increased if didactic and experiential learning are concurrent. The integrated curriculum model has obvious advantages over the subject-centered model; it does not, however, insure that theory is integrated into practice, since the emphasis remains on developing cognitive competence in the theory undergirding practice.

Only recently has there appeared in the medical education literature discussion proposing competence-based medical education (Hamilton 1976; Weed 1976; McGaghie et al 1979; Barondess 1981). And only one monograph was found which formulated an approach to designing competency-based medical education (CBME), the World Health Organization monograph edited by McGaghie et al. The curriculum model proposed in this monograph subscribes to the elements of CBE previously described:

. the curriculum is organized around functions required for the practice of medicine in a specified setting;
. instruction is based on the principles of mastery learning; i.e., entry level testing, stepwise instruction, flexible time scheduling, and frequent assessment.

These authors advocate a situation-specific definition of physician competence. That is, a definition of competence for physicians intending to practice in developing nations, they argue, is likely very different from one for physicians intending to practice in technologically sophisticated settings. It should be added that the philosophy of medicine upon which the educational program is based should also direct the definition of competence (Dressel 1983).
That is, when attempting to design a competency-based osteopathic medical education program, one must first describe the nature of the professional role of the osteopathic physician in the practice setting of the graduates. To date few studies of osteopathic physicians and their practice have been conducted, although helpful statistics which distinguish M.D.s and D.O.s appear to be emerging from the National Center for Health Statistics.⁶

⁶ See for example: Koch, H. "Office Visits to Doctors of Osteopathy: National Ambulatory Medical Survey, United States, 1975." DHEW, 1978, and Cypress, B.K. "Characteristics of Physician Visits for Back Syndrome: A National Perspective." American Journal of Public Health 73(4):389-395, April 1983.

Structuring Competency-Based Medical Education

McGaghie et al (1979) report that there is no evidence of an optimal sequence of courses for the traditional curriculum models, nor is there any convincing evidence that early courses are prerequisite to those that follow, since there is such rapid decay of unused knowledge. Posner and Strike (1974, 1976) proposed that curriculum structure can be analyzed at two levels: (1) the extent of relationships among intended learning outcomes, and (2) the kinds of relationships between curriculum elements. They categorize the kinds of structuring criteria used in curriculum designs as follows:

1. World-related. What are the empirically verifiable relationships between the phenomena (people, things or events) in the world about which the pupil is to learn and how can the curriculum be sequenced so that the organization is consistent with the way the world is?

2. Concept-related. What are the conceptual properties of the knowledge which the pupil is to learn and how can the curriculum be sequenced so that it is logically consistent in organization to the organization of the concepts?
Armstrong (1977) has highlighted the importance of the structure of the curriculum itself, suggesting that inherent in the traditional medical curriculum of pre-clinical and clinical courses is a disease-oriented perspective--a point which may be of particular concern to osteopathic medical educators. He concludes that the structure and content of traditional curricula ensures that medical students develop a "clinical gaze" that corresponds to hospital work and not to that of primary care. He argues that the pre-clinical courses emphasize reductionism, reformulation (not explanation) of the phenomenon, acquisition of states of knowledge, and controlled and bounded knowledge; which are inconsistent with the 63 nature of clinical work characterized by openness, knowing M knowledge is created, and abstract and implicitly controlled knowledge. The common integrator for these dichotomous conceptions of knowledge, Armstrong contends, is _d_i5£a_s_e_. Thus, the student comes to see "the task of diagnosis as essentially the approximation of the patient's pathology to an established disease category" (1977:246). Administrators of professional educational programs have, as Dressel has pointed out, a social obligation to ensure "that each individual undergoing a professional program be held to reasonably well defined and acceptable standards" (1979:4). Dressel has outlined six sequential steps for developing an osteopathic medical education program: (1) definition of the purposes of a college of osteopathic medicine and of any unique purposes of a particular college; (2) a statement of educational objectives, such that their attainment at or above a specified level indicates attainments deserving recognition by conferring the D.O. degree; (3) develop a continuous sequential, integrative, and individually adaptable set of experiences including: . formal courses in basic and clinical sciences, . clinical experiences, . discussions of professional, ethical, social, and philosophical problems, issues, and obligations, . continuing, constructive, and developmental individual evaluation; (4) conduct continuing or recurrent evaluation of individual faculty members to ascertain the extent of understanding of, commitment to, and performance in particular phases of the program in appropriate relation to the desired composite student experience; (5) conduct continuing or recurrent evaluation of the program and of its composite impact on student attitudes, values, knowledge, and developing porfessional competencey; (6) adjust and modify the program in reference to continuing changes and accumulation of knowledge about health and maintenance of it, improved technology and expanded range of health care technicians and specialists, changing social expectations and demands, new insights into learning and means of motivating and expediting it, change in and hence continuing need for orientation of new students and faculty. (1 981:2,3) 64 The model proposed by McGaghie gt 2.1 (1979) follows the classical principles of mastery learning with the important exception that they appear not to ascribe to the notion of hierarchical learning espoused in early mastery learning theory. The sequencing principle they propose is consistent with the Posner and Strike utilization-related principle, in that they propose that modules be arranged so as to facilitate the student's mastery of essential problem solving competence. 
Specifically, they propose that: First, a clear and precise listing of the components of competence be prepared; Second, components (units) be clustered into logical patterns related to the problem solving competence; Third, each component be developed into a self-contained instructional unit; Fourth, students proceed through the units at their own pace; Fifth, the sequence in which the instructional units are undertaken be determined by the individual student; and Sixth, criteria be established by which competence is measured.

McGaghie et al. point out that expectations (standards) for student performance should realistically reflect the student's level of training and the accumulating effect of training on competence, by establishing minimum criteria for performance at critical points in the training program. Evaluations, in their curriculum model, are intended to be used primarily to guide the professional development of the student. Assuming that standards of performance are clearly defined, the final judgment can only conclude that the function is mastered, for if it is not, instruction and learning (and evaluation) continue until mastery is achieved.

The examples provided by the authors to illustrate the mastery approach do not come from medical education programs nor from programs in which the clinical facilities are geographically removed from the educational institution, as would be the case with community-based medical education programs. Also, the authors provide no guidance with regard to the practical problems of limited resources (time and faculty) in managing an ideal mastery learning program. They do, however, point out that CBME programs will significantly alter the institution's operation and must be undertaken as an experiment in change. Teachers, they argue, will have to undertake new roles such as planning CBE strategies, managing instructional resources, and conducting diagnostic evaluations; while students will have to abandon their adversarial position toward curriculum and accept personal responsibility for learning.

McGaghie et al. describe the three strategies most frequently employed to effect curriculum/organizational change and some of their consequences:

1. Power: The person or persons (dean, department chairs, for example) with primary authority in the program identify a program goal and mobilize the resources to put the new program into effect. The change can be effected in a relatively short period of time; however, it may not be long lived. Ultimately it is the faculty who implement and sustain change, and without their understanding of and commitment to the new program it is likely not to be sustained.

2. Rationality: A change is hypothesized and an investigation is undertaken through which a rational proposal, complete with supporting data and data on alternatives, is developed and presented for discussion. While this method should appeal to academics it generally does not, because it does not deal with the special interests and psychological needs of faculty and students.

3. Re-education: Change is effected through systematic organizational efforts which will enhance changes in values, skills, and political realignments, as well as in knowledge of the intended change. This approach is frustratingly slow and can easily be sidetracked by conflicting priorities and territoriality, but it can be enhanced by rewarding teaching and education research.

Summary

Certain conclusions can be drawn from this cursory review of the literature.
First, definitions of medical competence have varied, but they have typically focused either on knowledge, in the sense of having the capacity to do something, or on performance, in the sense of taking appropriate actions in the performance of professional duties. Recently proposed conceptual models can guide the development of competence statements which include both ability and behavioral components and which define the environmental context for the competence. To date, the descriptions of medical competence have described professional level competence: most prevalent are descriptions of specialty level competence (completion of residency training). Evidence from studies of both knowledge and performance certifying exams suggests that cumulative knowledge and the practice situation affect competence. Implicit, but not explicitly stated, in the models for defining competence is the importance of the philosophy of medicine by which the training program is guided.

Second, the availability of conceptual and methodological tools for evaluation has to a considerable extent dictated definitions of competence. The recent emphasis on performance outcomes of medical education requires consideration of validity, as well as reliability issues, in evaluation and program content. Multiple approaches to evaluation will be necessary in order to effectively determine cognitive, psychomotor and affective aspects of professional competence. It will also be necessary to define reasonable competence standards for students at various stages of training at the pre-professional level.

And, finally, designing educational programs to facilitate development of professional competence requires new perspectives and skills on the part of all members of the academic community. Any attempt to change from a traditional curriculum model to a CBME model will be hampered by the lack of operationalized models and institutionalized resistance to change. Educators can expect difficulties when attempting to design and manage a non-traditional curriculum. Discipline-focused courses, use of discipline experts as faculty, and faculty/discipline autonomy are strong traditions in academia, paralleled by the traditions of specialty-oriented services, use of specialist and subspecialist consultants, and attending physician autonomy in hospital-based patient care. These traditions are woven into the fabric of what we think of as "medical education" and "medicine." Not only is it difficult to implement changes which counter these traditions, it is even more difficult to conceptualize alternatives. Without a clear understanding of the processes and effects of the current curriculum, proposed changes are likely to be ignored or co-opted. And without such insights it will be difficult to propose changes which truly reflect an alternative. Having envisioned and designed and even operationalized a non-traditional curriculum, however, does not ensure its long life.

Summary

This chapter has reviewed selected literature which points up the critical issues in competence-based education. The review of literature related to the general concepts of CBE pointed up the diversity of views and the central importance of philosophy in determining what one thinks "competence," "competency-based education," and "evaluation of competence" mean.
The review of basic concepts of learning theory presented some issues which are thought particularly pertinent to medical education and to conceptualizing a continuum of clinical competence development: how information is perceived, how one selects and attributes meaning to information, the social process of learning and defining reality, and the relationship of theory to practice. The brief review of the medical literature focused on how professional competence has been defined and evaluated, and how educational programs have been and can be structured, depending upon the planners' notion of competence.

CHAPTER III
METHODOLOGY

This chapter outlines the methods used to address the problem outlined in Chapter I: the need to elucidate the process by which osteopathic medical students acquire professional competence and to examine certain assumptions regarding competence at each of three levels of training. First, the study design and its rationale are described. Next, an overview of the content and process of the interviews is provided. And finally, the methods used to analyze the data are presented.

Study Design

This study is part of a series of studies initiated in 1974 by the Michigan State University College of Osteopathic Medicine to examine its curriculum and its relation to student professional development. A number of the studies consider students' perceptions of the program and employ the research interview methodology (Sharma 1975, 1976; J. Dressel 1977; Weaver 1980). This study continues the student interview methodology initiated by Paul L. Dressel for the study of MSU-COM. And, as with past research efforts, the study is intended to benefit curriculum administrators, students, and the profession. It is intended to be illuminative rather than critical.

The study began with the presumption that insufficient theory guides curriculum development and evaluation of so-called competence-based educational programs. It was assumed that planning or study of such programs could be improved by having a more thorough description of the manner in which participants are affected by the program. It has been argued that scientific inquiry must be grounded in theory inductively developed from social research (Glaser and Strauss 1967; Denzin 1978; Lofland 1971; Blumer 1969; Webb et al. 1966), with Blumer describing grounded theory as:

lying in the examination of the empirical social world. It is not to be achieved by forming and elaborating catchy theories, by devising ingenious models, by seeking to emulate the advanced procedures of the physical sciences, by adopting the newest mathematical and statistical schemes, by coining new concepts, by developing more precise quantitative techniques, or by insisting on adherence to the canons of research design. (1969:35)

The study, then, was intended to describe the phenomenon of clinical competence development of osteopathic medical students in one educational program. Its intent was to gain a better understanding of the complex nature of that developmental process, rather than to identify predictive cause and effect relationships of the variables. Specifically, it was directed towards understanding the students' perspective of what professional tasks students are able to do in the patient care setting and why. Seen from the perspective of evaluation, the study was concerned with both outcomes--intended and unintended--and processes (Stufflebeam 1973; Stake 1975; Parlett and Hamilton 1976; Dressel 1981).
The study used a single data base: the perceptions offered in in-depth research interviews by students who were currently completing or had just completed one of three phases of the education program.

Rationale for the Study Design

As MacDonald and Walker (1975) pointed out, curriculum evaluators are frequently faced with questions which simply do not lend themselves to conventional experimental research methods. Particularly those questions which focus on understanding the transactions in the teaching/learning milieu, they suggest, lend themselves better to descriptive methods of study. Stake (1978), in his review of the literature, argued that the case study lends itself particularly well to extending understanding, versus explaining. Typically the case study is a historic description of variables in their complex relationships, which is better suited to expanding one's view of the phenomenon than reducing it to a set of propositional statements. He also contends that case studies can lead to generalizations about the case under study and those similar to it. That is, "naturalistic generalization," in contrast to scientific law, is a product of experience, "from the tacit knowledge of how things are, why they are, how people feel about them and how these things are likely to be later or in other places with which this person is familiar" (1978:6). In order that such generalizations can be made about the case studies, however, it is essential that the target case be properly described and that the boundaries of the system be kept in focus.

Patton (1980), in his typology of evaluation methods, described the case method as compatible with evaluation methods intended to examine the educational processes themselves. Evaluation using the case method places the "emphasis on perception and knowing as a transactional process" (1980:54). House also contends that "one can study perceptions only by studying particular transactions in which the perceptions can be observed" (1978:9). Similarly, Parlett and Hamilton argue that studies, in order to be illuminative, must be descriptive and interpretive, and concern themselves with the transactions within the milieu being studied:

It [illuminative evaluation] aims to discover and document what it is like to be participating in the scheme, whether as a teacher or pupil, and, in addition, to discern and discuss the [program's] most significant features, recurring concomitants, and critical processes. In short, it seeks to illuminate a complex array of questions (1976:144).

Certain assumptions and a particular philosophic orientation, then, undergird this study. The assumptions underlying the study reported here are those of qualitative evaluation:

. . . the importance of understanding people and programs in context; a commitment to study naturally occurring phenomena without introducing external controls or manipulation; and the assumption that understanding emerges most meaningfully from an inductive analysis of open-ended, detailed, descriptive, quotive data gathered through direct contact with the program and its participants (Patton 1980:55).

It is also assumed that interviews can provide not only the perceptive data sought, but can be beneficial to the subjects. As Sanford (1982) argued, research interviews can and do benefit interviewees, by providing them:

a chance to say things for which there had not previously been an appropriate audience. They can put into words some ideas and thoughts that had been only vaguely formulated.
When these are met with attention and interest, self-esteem rises. People who are interviewed have a chance to reflect on their lives, to take stock, to think out loud about alternatives (1982:897).

And, finally, the philosophical guidelines of the study are similar to those proposed for "democratic evaluation" by MacDonald and Walker (1975):

1. Its aim is to find ways of encouraging participants to develop insight into their world of learning.

2. Rather than setting proof as its primary goal, the case study aims to increase understanding of the variables, parameters and dynamics of the program and learning processes of students. Therefore, cross-checking, rather than consistency, is the main strategy for validation. An implicit assumption is that there is no one true definition of the phenomenon under study.

3. Neither praise nor blame is intended or inferred by the research. Contingency relationships will be presented so that the audience can draw its own conclusions as to cause.

Study subjects: Two principles guided the sample selection decision making: (1) purposeful sampling, in contrast to random sampling, can increase the utility of the information obtained from a small sample, and (2) depth and breadth of information are both important to the study (Patton 1980). The nature of the program and the experience of the investigator further refined the selection process.

Cross-sectional sampling was used, in which subjects were selected from each of the three phases of the educational program. The MSU-COM curriculum is designed in three phases, designated numerically as Units I, II and III. Each phase offers didactic and skill development learning experiences, but each successive phase offers increased time committed to clinical skill development and proportionally less time in didactic instruction. The curriculum model can be visualized:

[Figure: Basic Curricular Model. Unit I (Terms 1-3), Unit II (Terms 4-8), and Unit III (Terms 9-13); didactic conference/seminar and systems biology instruction decreases while clinical problem solving and skills development (clinical skills labs through clinical clerkship rotations) increases from approximately 20% to 80% of curricular time.]

Specifically, the phases have been described:1

UNIT I includes
Basic science courses - to provide a foundation for the medical and clinical sciences; and
Clinical skills labs - to develop basic skills of physical examination, medical interviewing, and osteopathic manipulative diagnosis.

UNIT II includes
Systems biology courses - to provide a medical science foundation for each body system;
Osteopathic manipulative therapy skills labs - to develop and refine OMT diagnostic and therapeutic skills; and
Family practice preceptorships - to provide an opportunity to refine clinical skills in a private practice setting.

UNIT III consists of
Externship and Jr. Partnership rotations - to refine and extend clinical diagnostic and management skills in the practice setting.

1 MSU-COM 1982-83 Self-Study Graduate Survey

Based on the curriculum design, it was assumed that students at each phase of the program would have a level of competence different from counterparts in either of the other two phases; differences which could be described and which could serve as foci of analysis. Twenty (20) students from each phase of the program were invited to participate in the study.
Individual students were selected based on at least one of several factors: (1) the student's participation in informal discussion groups led by the investigator revealed that he/she was insightful about his/her educational experiences and was interested in providing constructive feedback to the program, and/or (2) it was thought that the student's pre-medical school life experience would broaden the range of variables to be considered in understanding competence at the various levels, particularly Unit I. No effort was made to assure that students included in the study were representative of their peer group with regard to age or academic achievement; neither was there any attempt to match students across groups. The primary intent was to get as broad a range of perspectives and experiences as possible, while assuring candid, insightful descriptions of individual perceptions of clinical competence and learning processes.

A total of thirty-seven (37) in-depth interviews were conducted: fifteen Unit I students, eleven Unit II students and eleven Unit III students. As Table 3.1 reveals, the sample group was representative of the population only with regard to age. The purposeful selection of equal numbers of males and females and those with and without training in an allied health occupation significantly diminished the representativeness of the sample group, while, it was thought, enhancing the opportunity to gain the insights of those subgroups.

TABLE 3.1
COMPARISON OF THE SAMPLE AND THE POPULATION ON SELECTED DEMOGRAPHIC CHARACTERISTICS

                                  UNIT I                UNIT II               UNIT III1
CHARACTERISTIC              Sample   Population    Sample   Population    Sample   Population
N=                            15        125          11        125          11         89
Age: range                  22-38      20-38       23-37      20-37       23-39      19-43
   : mean                    27.6       25.7        28.5       26          27.0       24.6
Sex: male                    40%        64%        45.4%       68%        45.4%      63.6%
   : female                  60%        36%        54.6%       32%        54.6%      36.4%
Graduate Degree             26.6%      10.3%       27.2%      22.3%         9%       24.3%
Health/Medical Occupation
  Certification              40%        9.6%       45.4%      17.6%       45.4%        6%

1 Unit III subjects include students from both the 1978 and 1979 entering classes. Population data represents the mean values for the combined classes.

The Interview Process

Cannell and Kahn describe the research interview as "a two-person conversation, initiated by the interviewer for the specific purpose of obtaining research-relevant information and focused by him on content specified by research objectives of systematic description, prediction or explanation" (1968:527). Patton elaborated that "the purpose of interviewing is to find out what is in and on someone else's mind. . . . We interview people to find out from them those things we cannot observe easily. . . . We cannot observe feelings, thoughts, and intentions. We cannot observe behaviors that took place at some previous point in time. . . . We cannot observe how people have organized the world and the meanings they attach to what goes on in the world. . . ." The purpose of interviewing "is to allow us to enter into the other person's perspective. The assumption is that the perspective is meaningful, knowable, and able to be made explicit" (1980:196).

The research interview also differs from the therapeutic interview in its intention to gather information that is measurable in some way. Borg and Gall (1979) outline four general steps for developing the interview process to ensure measurability of the results:

a. Define the purpose of the study--its background, theoretical basis, general goals, possible applications of results, and reasons for using the interview methods.
b. Translate the general goals into detailed and specific objectives which can be fitted to the particular interview pattern you plan to follow, constructing questions that yield useful information.

c. Develop a tentative guide to be used during the interview, exploiting the advantages of the interview technique.

d. Develop a satisfactory method of coding and/or recording responses.

Generally, responses can be pre-categorized in a pilot study to anticipate the most frequent response patterns. Only responses falling outside these general categories would need to be written out. Tape recording of the interview may offer advantages, unless the nature of the interview is highly personal and produces guarded responses. A generally poor technique is a written summary of information recorded during or following the interview. Because of the pace of an interview, the act of writing either slows the interview unnecessarily or causes the interviewer to be selective in the kind and amount of information he records, at the risk of introducing bias (cite ).

Of the three types of research interviews--unstructured, semi-structured and structured--a variation of the structured interview was selected for the study. This approach, using standardized, open-ended questions but pursuing individual issues in depth when appropriate, was intended to maximize the commonality of issues addressed by each of the three groups of subjects, while at the same time allowing the interviewer to gain greater insight into the variables of the learning process. It was assumed that the greatest depth of insight could be gained by following the lead of the interviewee on particular issues, but that the greatest breadth of insight would be gained by asking common questions of all subjects within a particular subject group.

The Interview Schedule

Three distinct interview schedules, one for each Unit under study, were developed to elicit information regarding the subject's pre-MSU-COM experience, the nature of the individual's current clinical experiences, a description of specific clinical skills at the end of the current and previous Unit, explanations for perceived deficiencies and proficiencies in clinical tasks, and insights regarding instructional/learning processes. The questions were designed to address the major physician tasks outlined in the program's various clinical course syllabi: conducting the medical history and physical examination (Units I, II, III), evaluating clinical data (Units I, II, III), proposing differential diagnoses (Units II and III) and developing management plans (Units II and III). (See Chapter I, Figure 1.1, (B) Task.) The questions were arranged so as to lead the interviewee chronologically through his/her clinical experiences, and to culminate in ad hoc assessments of those experiences. Each of the three different interview schedules followed the general outline:

A. Experiential background
B. Nature of the clinical experiences
C. Descriptions of clinical performance
D. Insights on clinical competence development

Experience in field-testing the Unit I interview revealed that students with medically-related pre-MSU-COM experiences were more articulate about their perceptions of the inexperienced students' competence and the adequacy of the teaching/learning process for those students than they were about exactly how and what they, themselves, did in the clinical setting. A separate interview schedule was, therefore, designed to elicit those insights from the experienced students.
Experienced students in Units II and III were willing and able to provide both types of information and, therefore, one interview schedule sufficed for both experienced and inexperienced students. Drafts of the interview schedules were critiqued by the investigator's research director and by educational administrators of the program. Appendix A presents the four interview schedules used in the study.

Interview Process

Once the subjects were selected from their respective class lists, they were sent a letter describing the project and soliciting their participation. (Appendix A) Unit I and II volunteers returned a post card indicating preferred dates and times for an interview, while Unit III students were called to confirm participation and to set up appointments. Interviews of Unit I and II students were conducted in the investigator's college office, while Unit III interviews were conducted at the clinical site, usually in the hospital library or a classroom. In all but one case, the interviews were conducted in one session, and all were planned to accommodate the interviewee's time constraints. Initial Unit I interviews were recorded in hand-written notes only. When the investigator became aware of the subtle differences in descriptions and foci of attention expressed by students of various levels of experience, all subsequent interviews were tape recorded.

Timing was a crucial factor in interviewing the medical students. An effort was made to interview students as close to the time of completion of their respective Unit as possible. In the case of Unit I and II students, attempts to interview near the end of the final term of the Unit were frustrated by impending examinations. Interviewing Unit III subjects was particularly problematic, due to the fact that they were located in hospitals around the state and were no longer under the direct supervision of the College. Authority for access to the Unit III subjects was finally gained from the College and the respective Directors of Medical Education post-graduation; hence, subjects were interviewed regarding their Unit III experience when they were actually involved in internship experiences. This posed several problems. First, the subjects were involved in a new level of responsibility - a phenomenon which likely put their Unit III experiences into a different perspective from that which might have been presented had they been interviewed while in the Unit III program. Second, the demands on interns' time and energy greatly complicated scheduling the interviews. One intern who agreed to participate rotated to an out-of-state hospital; one had just begun a rotation that was so short-staffed that he simply could not arrange an hour's time. Invariably, interviews were conducted at the end of twelve-hour work shifts; but, although tired, all of the interns, with one exception, engaged themselves in the interview process. Often the interviews were in excess of an hour in length; one was pursued for well over two hours at the intern's insistence even though she had been on duty for twenty-four hours.

Cannell and Kahn (1968) cite three factors which influence the quality of the interviewee's response: motivation, cognition, and accessibility. With the exception of one Unit III subject who was ill, all interviewees appeared to engage themselves in the process of the interview.
Historically, students in this program have been encouraged to express their opinions on the program in course evaluations and research surveys; they have been candid and articulate in expressing those opinions. Those volunteering to be interviewed and actually taking time out of hectic schedules for the interview showed motivation. Many subjects had had previous interaction with the investigator in informal discussions of their education, and were, therefore, well prepared to engage in the descriptive and introspective process required in the interviews. Many of the interviewees expressed appreciation for the opportunity to talk with someone about their experiences and to offer their opinions about ways to make the clinical experience more productive for students.

For some students the cognitive process of introspection and of verbalizing thoughts about what should be was difficult. Usually, such questions as "What would have been better for you, do you think?" "How did that make you feel?" "Why?" helped the interviewee express insights into his/her experiences. In some instances, the issue of clinical competence development (as posed in the question: "What skills should a student have developed in Unit I in order to be prepared for Unit II?") was illogical from the student's view, since his/her view was that the program required no such pre-requisite. Difficulty in making a distinction between the ideal and the real was usually resolved with other questions, but not always, as in the case of one Unit III subject who, despite various approaches to the question, maintained no particular competence should be expected of individuals being awarded the D.O. degree. On the other hand, the cognitive abilities of the interviewer were also critical to the outcome of the interview. It can be assumed that had the investigator had a physician's knowledge of professional tasks and medicine, much more specific questions regarding the clinical competence of the interviewee could have been pursued.

Subjects at all levels had what seemed to be remarkable recall of their clinical experiences. For example, Unit III subjects had little difficulty in remembering the order and the general nature of preceptorships that they had taken one and two years previously. Clinical experiences during the previous year were recalled by all subjects in vivid detail, including clinicians' names, critical incidents, patients and medical problems, and feelings and thoughts at the time of specific events. Situations that had been stressful or particularly rewarding seemed to be the most accessible.

In sum, many, but not all, of the problems attending interview research were minimized by the students' motivation to participate, their ability and willingness to candidly share their clinical experiences, and the investigator's familiarity with the program and the individual subjects. Any limitations in the data more likely reflect the investigator's lack of specific medical knowledge than subjects' lack of ability or willingness to provide the information.

Data Analysis

The general purpose of the data analysis in this study was to identify variables in the process of clinical competence development and to derive descriptors of competence at each level of the program. Content analysis (Krippendorff 1982) and contingency analysis (Osgood 1959) were used to analyze the information gathered from the in-depth interviews.
Design of the Analysis

As noted earlier, interview schedules were designed, based on an a priori conception of the continuum of clinical competence, to gain students' perceptions of clinical competence development. In this conception, clinical competence at each of the three levels of the educational program was presumed, based on program objectives, to parallel cognitive development (see Table 1.1, page 12). Descriptions of competence were further presumed to reflect the interaction of three major variables: the student, the didactic program and the practice conditions. Each of these variables subsumed other variables, such as:

Student                    Didactic Program            Clinical Setting
. Prior experience         . Content                   . Clinician's expectations
. Practice habits          . Skills training           . Patient's condition
. Attitude                 . Faculty expectations      . Patient population

Implicit in this conception was a further presumption that clinical competence would be different for each of the three levels of education, but that students at a given level would have similar overall competence.

The first phase of the analysis was intended to clarify and extend this preliminary conception of clinical competence development, by describing through content analysis students' perceptions of what they are able to do in the clinical setting at the end of each of the three phases of the educational program and why. According to Krippendorff, "content analysis could be characterized as a method of inquiry into symbolic meaning of messages" (1982:22). He cautions that given the symbolic nature of communication, one cannot claim to have analyzed the content of the communication, since the meaning depends on the perspective of the analysis. He adds the corollary caveat that intersubjective agreement on the meaning of messages and symbolic communications is unlikely, except in the most simple (and uninteresting) circumstance.

Krippendorff suggests the following conceptual framework for designing, conducting and evaluating content analyses:

1. it must be clear which data are analyzed, how they are defined, and from what population they are drawn;

2. the context relative to which data are analyzed must be made explicit;

3. the task is to make inferences from data to certain aspects of their context and to justify these inferences in terms of the knowledge about the stable factors in the system;

4. the kind of evidence needed to validate its results must be specified in advance or be sufficiently clear so as to make validations conceivable. (1982:26-28)

That is, the process of content analysis involves transforming the communicated information into data from which inferences can be made about how the data are related to the context.

The process of analyzing the content of the interviews gathered in this study was guided by the basic premise that the study was pre-theoretical and descriptive; i.e., it was intended to elaborate, rather than refine, the picture of the educational process by which osteopathic medical students developed clinical competence. It was, therefore, thought essential that the information remain as near as possible to the original form in which the student presented it, in order that readers could bring their own perspectives to the analysis of the data. For the same reason, it was thought important to describe the inquiry process used in coding and analyzing the information.
The analysis in this study focused on information students provided about their performance of specific professional tasks (history and physical examination, and medical problem solving), and the specific circumstances under which they performed those tasks (presumably, their antecedent knowledge and skills, and the conditions of practice). The frame of reference used to determine the data to be analyzed can be described, following Krippendorff (1980):

UNIT OF ANALYSIS    GENERAL DESCRIPTION                     STUDY UNIT
Sample Unit         Material to be studied                  Interviews of students at each of three levels of training
Recording Unit      Content category                        Major issues
Context Unit        Portion of material to be examined      Entire text of interview transcript

Specifically, the information from each interview was categorized into the following units of analysis:

RECORDING UNIT                                              SAMPLE UNIT
                                                            I    II   III
H/P Competence                                              X    X    X
H/P Deficiencies                                            X    X    X
Explanation for H/P Competence                              X    X    X
Diagnostic and Management Competence                             X    X
Diagnostic and Management Deficiencies                           X    X
Explanations for Diagnostic and Management Competence            X    X
Effect of Knowledge Prior to Experience                          X    X
Effect of Practice Prior to Knowledge                            X    X
Explanation of Integration of Theory and Practice                X    X
Antecedent Relevant Skills                                  X    X    X

Each step in the analytic process was undertaken to explore the nature of the phenomenon of the osteopathic medical student experience. Sometimes laborious analyses of the interview content revealed little that appeared to be significant. A description of the various analyses is provided regardless of its apparent value.

Descriptive Analysis

The analysis began with transcribing verbatim the recorded interviews; then each interview transcript was read several times in an attempt to understand the perspective of the subject. Third, all interviews within a given subject group (Unit) were read, in an attempt to gain insights into the experiences of students at that level of training and to identify issues that emerged. Finally, once a subjective "feel" for the interviews had been gained, a content analysis was initiated. This first level of analysis was descriptive, attempting to draw from each interview those statements which represented that subject's experiences, opinions, feelings and perceptions in regard to the central issues outlined by the interview schedule, as well as those that impressionistically arose in the researcher's mind from the initial readings of each interview. Short descriptive phrases representing the subject's statements served as descriptor codes regarding the particular issue under study; i.e., pre-medical school clinically-related skills, history and physical examination performance deficiencies, etc. The entire transcript was searched for responses to the issue in question. Phrase codes were recorded for each subject by experience category (see below), training level, and by issue, with care taken to not presume equivalency of similarly phrased statements. Once issue-related responses were recorded for all subjects in a particular training group, a numerical count was made of the subjects within the group offering that conceptual response; then the proportion of subjects using that code was calculated. A total of twenty-one analyses were conducted at this level. A series of content analyses was performed for each Unit, in turn. Some analyses were common to all levels of the program, whereas others were unique to a particular subject group.
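To make the tallying step just described concrete, the following is a minimal sketch in Python; the subject identifiers and descriptor codes are invented for illustration and are not the study's actual data or instruments. It simply counts how many subjects in one training group offered each coded response and converts the counts to proportions.

```python
from collections import Counter

# Hypothetical coded responses for one training group: each subject is
# mapped to the set of descriptor codes drawn from his or her transcript
# for a single issue (e.g., H/P deficiencies).
coded_responses = {
    "S01": {"A", "C"},
    "S02": {"C", "F"},
    "S03": {"F"},
    "S04": {"C"},
}

group_size = len(coded_responses)

# Count how many subjects in the group offered each conceptual response.
counts = Counter(code for codes in coded_responses.values() for code in codes)

# Proportion of subjects using each code (the bottom row of the data matrix).
for code in sorted(counts):
    proportion = counts[code] / group_size
    print(f"{code}: {counts[code]} of {group_size} subjects ({proportion:.2f})")
```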
Descriptive Analyses Common to All Units: A series of analyses was performed on interviews from all three groups of subjects. In each case, the content analysis was performed using the entire interview transcript as the unit of analysis, as described above.

1. Classification of Subjects. The first analysis was based on the initial presumption that students' pre-medical experiences might be important variables in the clinical competence developmental process. A codification for experience was devised, and all interview transcripts were coded according to the interviewee's premedical training and experience in health-related or medically-related service roles. Subjects were classified into five categories with regard to health/medical care experience and training:

I: No or extremely limited health-related or medical care experience or training.
II: Training and/or experience in a health-related occupation.
III: Training in a medically-related occupation but limited experience.
IV: Limited training but extensive experience in a medically-related occupation.
V: Training and extensive experience in a medically-related occupation.

These categories were established on the basis of certain definitions and assumptions.

Definitions:

a. Health-related occupations: refers to those human services roles, such as social worker, dietician, medical records administrator, counselor, administrator, which are supportive of but distinctly different from the physician role in terms of perspective, technical skills and/or critical knowledge base.

b. Medically-related occupations: refers to those roles, such as nurse, EMT, P.A., physical therapist, dental hygienist, radiology technologist, medical technologist, EKG technician, which are supportive of, and similar to, the physician role in perspective, technical skills or critical knowledge base.

Assumptions:

. Training for and/or experience in a health-related or medical care role provided students with certain knowledge and/or skill advantages over "naive" students.

. Training and/or experience in a medical care role was more analogous to the physician role, thus giving students advantages over those in the health-related care category.

. Extensive experience in a medically-related care role provided more advantages to students than did training alone.

Using this classification system, subjects were distributed across categories of experience and training, and the Units of the program, as follows:

EXPERIENCE                  PROGRAM UNIT
CATEGORY                I         II        III
I                       4          4         2
II                      2          4         4
III                     1          0         0
IV                      2          1         3
V                       6          2         2
(Females)               8          5         5
(Males)                 7          6         6

More specifically, in each Unit, subjects' clinical experiences ranged from none (other than visiting people in the hospital) to working for up to six years as a physician's assistant. Their education and training backgrounds ranged from students holding doctoral degrees (Ph.D.) but having no health care training or experience, to students holding baccalaureate degrees and being certified in one of a variety of allied health occupations (nurse, physician's assistant, emergency medical technician, respiratory therapist, EKG technician and medical technologist). This categorization scheme served as the basis for all subsequent analysis. That is, in all analyses transcripts were first sorted according to the experience category of the subject and the content analysis recorded by individual and experience category.

2. Pre-MSU-COM Experiences.
In this analysis antecedent skills and experiences related to the professional tasks under study were specifically identified for each subject. This analysis was intended to clarify the a priori experience codification, and to gain insight into subjects' perceptions of their preparation for clinical training and performance.

3. History and Physical Examination Competence. The medical history and physical examination are seen by the MSU-COM curriculum planners as critical tasks for osteopathic physicians. Training in the various skills related to these tasks is initiated in the first term of the program and is presumed to continue throughout the length of the program. These tasks, then, are obvious foci of the study of the developmental process of clinical competence. Of particular concern was what subjects perceived as the ideal competence at their particular level of training and what they perceived themselves to actually be able or unable to do and why.

4. Instructional Environment. All subjects were asked questions to elicit their perceptions of effective and ineffective clinical instruction. The content analysis was intended to identify variables related to clinical instruction and conditions of practice that affect competence development.

Descriptive Analyses Specific to Units: Given the program goals and the limited training and knowledge of Unit I students, it was assumed that more advanced professional performances, such as diagnosis and treatment, were not suitable issues to pursue with Unit I subjects. Hence, certain analyses grew out of the unique experiences of the Unit II and III students.

1. Diagnosis and Treatment Competence. Both Unit II and III students are taught and are expected to be able to engage in medical problem solving, including: identifying the chief complaint; evaluating clinical data; proposing a problem list; proposing relevant diagnostic procedures; developing a differential diagnosis; and planning and implementing therapeutic/management strategies. Content analyses were conducted to provide a description of the range of competence represented by the subjects and to propose a consensual description of optimal competence for both training levels.

2. Explanations for Integration of Theory and Practice. One of the assumed strengths of the MSU-COM curriculum is the fact of early and continuous clinical practice experiences for students. However, a recognized problem with the community-based educational model is the difficulty in structuring relevant experiences concurrent with didactic instruction. In an effort to gain insight into the consequences of discontinuous theory and practice, descriptive analyses were performed based on Unit II subjects' responses to inquiries about the effect on performance of theory preceding practice, and the effect on learning and performance of practice preceding theory. This inquiry was pursued through questioning subjects about how they felt and what they could actually do when confronted with a patient with a chief complaint related to the renal system, for example, when they had not yet had, and when they had had, the renal system biology course. The situations posed were those which the subject had actually encountered, rather than hypothetical situations. The questioning also pursued the specific skills and knowledge that the didactic program provided them in performing clinical tasks.
Contingency Analyses

The central impression the investigator gained from the descriptive analysis was that, while all subjects were involved in learning and practicing similar tasks, there were differences in subjects' perspectives and points of focus, as well as in descriptions of competency, depending on the subject's experience level. The experience variable seemed to be multi-dimensional. A method with which to explore these insights was sought; one which would be compatible with the exploratory nature of the study.

Exploratory studies such as the one reported here do not lend themselves to the rigorous quantitative methods of experimental and quasi-experimental research. As Cooley has argued:

Exploratory approaches [to statistical analysis] are particularly useful in current studies of the effects of educational programs because of the primitive state of relevant theoretical models. Needed at this stage are statistical procedures that allow us to see the relative usefulness of different predictors or sets of predictors, as well as what confounding is occurring among independent variables, and what differences there are among different possible models for the data. At the exploratory stage, the data analysis should suggest ways in which the theoretical model might be modified, how the measures might be combined or separated, or which variables might safely be deleted from the model (1978:14).

And as he further pointed out, statisticians and educational researchers have, by and large, ignored the challenge to develop appropriate statistical methods for "explanatory" studies. Somewhat in contrast, Patton described the analysis of data from qualitative studies as "arty and intuitive" (1980:313). He claimed there are no statistical tests to confirm that a perceived pattern in an observation is significant. In fact, Patton proposes no quantitative methods to be used in analyzing qualitative evaluation data.

In this study the sampling procedure, alone, violates the principles of statistical analyses designed to assist in establishing the plausibility of a theoretical model or to establish the relationship of dependent variables. However, certain patterns of students' descriptions of their perceptions seemed to emerge, and some method of examining those insights was sought. Contingency analysis was selected to explore the initial impression that cumulative experience influenced how the students thought about their clinical competence development. Contingency analysis was thought to be the most appropriate manner of examining the psychosocial data and to be most compatible with the investigator's
Contingency analysis is a correlational method that measures the probability of two symbols (statements, for example) being made within the context of a message. Based on association theory the contingency method assumes that greater-than—chance co-occurrence (contingencies) of the items in a message are indicative of the associations in the individual's thinking (Osgood 1959). The contingency assumes that: "(1) contirgencies in experience come to be represented in (2) an individual's association structure by patterns of association and dissociation of varying strengths, which help determine (3) the contingencies in messages produced by this individual." (Osgood 1959:57) In principle, then, the statements of individuals reflect a psychological structure of association resulting from specific life experiences; they are an index of those associations, but only a tenuous index of the actual historical events. This method provides a means of comparing how subjects at the three levels of the program "think about" how they perform--what their concerns are about particular tasks, and, thereby, provides another means of conceptualizing the continuum of competence development. 91 Contingency theory would suggest that students with similar clinical experiences would perceive and, thus, describe those experiences similarly, all other things being equal. Several questions arise in the case where descriptions do not appear similar: Are the experiences M similar? Are "other things" "go_t equal" _W_ha_t "things" are not equal? How are experiences dissimilar? Presumably answers to those questions would be suggested in differences in patterns of contingencies for different groups of subjects. The descriptive study sugested that "level of experience" is a rather complex notion, at least in the context of the study case. At least two major division of "level of experience" seemed to affect how students view the program and their experiences, and what they do: (1) their current level of training in the formal program; and (2) the cumulative, related experience they brought to the medical school program. Contingency analyses were conducted using the two levels of experience to subgroup the subjects. The Contingency Methodology: Each analysis began with the results of the descriptive analysis: the lists of descriptors for each of the major issues. First a raw data matrix was constructed, where each subject's use or non-use of each descriptor was recorded and the proportion of subjects using each descriptor calculated. Table 3.2 provides an example from the study. Table 3.2 DATA MATRIX UNIT I H/P DEFICIENCES Experience Category Subject Coded Descriptor A B C D E F I 1 + + - - + 2 - — - + + - 3 - — .. - .. + 4 - — + - — + N .09 .18 .45 .18 .09 .36 92 A contingency matrix was then calculated from the raw data. First, the expected (chance) probability of the co-occurrence of each pair of descriptors was calculated; e.g., the joint occurrence of AB = (.09) (.18), etc. Then the actual co- occurrence of each pair of descriptors was found by counting the number of subjects using both of the descriptors, and determining their proportion of the total subgroup. When a pair of descriptors co—occur more frequently than expected they are said to be contingent, and are assumed to represent an association within the thinking of those subjects. Such contingencies are indicated in bold type on the matrix. Table 3.3 provides an example from the study. 
Table 3.3
CONTINGENCY MATRIX: UNIT I H/P DEFICIENCIES

        A       B       C       D       E       F
A       -      .016    .04     .016    .008    .03      (above diagonal: expected contingencies)
B      .09      -      .08     .03     .016    .06
C      .09     .09      -      .08     .04     .16
D       0       0       0       -      .016    .06
E       0       0       0      .09      -      .03
F                                      .09      -       (below diagonal: obtained contingencies)

A contingency analysis was conducted on all but one of the sets of descriptive data (antecedent relevant skills). In fact, a series of such analyses was conducted on each set of descriptive data: (1) including all subjects at a given training level; (2) including only category I-III experience subjects at that level; (3) including only category IV and V subjects at that level of training; (4) including only category I-IV subjects; and (5) including only category V subjects. Each contingency matrix was compared with the intent of finding changes in patterns of associations. Two patterns emerged: (1) Unit II contingency matrices for category I-V, I-IV, and I-III subjects were similar, but dissimilar to the matrix for category V subjects; and (2) Unit III contingency matrices for category I-V and I-III subjects were similar, but dissimilar to those for category IV-V and V subjects. On the basis of these observations, presentation of the contingency analysis (Chapter V) focuses on contrasts between the two different subgroups within the Unit II and Unit III subjects, I-IV/V and I-III/IV-V respectively. Due to the differences in interview schedules for experienced and inexperienced subjects in Unit I, such comparisons were not possible for the Unit I descriptive data. It must be again pointed out that subdividing already small sample groups would be indefensible in anything but the most exploratory study.

Osgood (1959) proposed a standard error of a percentage test of the significance of contingencies, σ% = √(pq/N), where p is the expected value, q is equal to 1-p, and N is the total number of subjects. The standard error of percentage provides an estimate of how much an obtained percentage may be expected to vary about its expected value. Although initially proposed, the test of significance was abandoned because of the extremely small values for N in the study. For the purposes of this exploratory study, obtained contingencies exceeding the value for expected contingencies are considered to be "more significant" than those which do not. Selected contingency tables are presented in the text of the report and all others may be found in the Exhibit section of the report. Where inspection of the matrix revealed clusters of descriptors with common contingencies, those patterns are described and discussed in the report. It is emphasized that the sample unit and size do not fully support even this elementary statistical treatment of the data. The analyses are reported in full realization of their limitations and with the understanding that the exploratory nature of the study gives license to such methodological liberties.

Summary

This exploratory case study sought osteopathic medical students' perceptions of their clinical competence development through in-depth interviews. Content analysis of interview transcripts identified students' descriptions of clinical competence for each of the three phases of the educational program, and their ideas of what influenced their development of competence. Contingency analysis was used to further explore these descriptions and pursue insights gained from the descriptive analysis.
Twenty students were purposely selected from the approximately 125 students at each of the three levels of the program to participate in the study interviews. Thirty-seven (37) students who would shortly complete, or had just completed, one of the three units volunteered to participate in an interview. Analysis of the interview data involved performing content analyses to describe each interviewee's experience, competence and learning process, and contingency analyses to gain further insight into the descriptive data. The following descriptive (D) and correlational (C) analyses were performed:

ANALYSIS                                                    UNIT
                                                       I      II     III
H/P Competence                                        D,C    D,C    D,C
H/P Deficiencies                                      D,C    D,C    D,C
Explanation for H/P Competence                        D,C    D,C    D,C
Diagnostic and Management Competence                         D,C    D,C
Diagnostic and Management Deficiencies                       D,C    D,C
Explanations for Diagnostic and Management Competence        D,C    D,C
Effect of Knowledge Prior to Experience                      D,C
Effect of Practice Prior to Knowledge                        D,C
Explanation of Integration of Theory and Practice            D,C
Antecedent Relevant Skills                             D      D      D

The results of the content analysis are presented in Chapters IV and V. Chapter IV presents students' perceptions of their clinical competence, and Chapter V presents students' perceptions of the variables attending their development of competence.

CHAPTER IV
STUDENTS' PERCEPTIONS OF THE CONTINUUM OF CLINICAL COMPETENCE DEVELOPMENT

The current study was undertaken in an attempt to describe a continuum of clinical competence development for osteopathic medical students; specifically, to describe students' perceptions of what they are able to do in the clinical setting and how they achieved that competence. Chapter V will report students' explanations for their competence. In neither chapter does the investigator attempt to judge students' perceptions in terms of any program standards or any personally-held notions about osteopathic medical education. Such interpretations and conclusions will be presented in the final chapter.

In-depth interviews were conducted with 37 volunteer osteopathic medical students at the Michigan State University College of Osteopathic Medicine: fifteen Unit I students, eleven Unit II students and eleven Unit III students. Interviews were guided by interview schedules unique to each level of training with regard to the clinical experiences undertaken, but which emphasized common aspects of the students' clinical competence development:

. educational and experiential background;
. abilities in the performance of clinical tasks;
. notions of ideal clinical competence;
. how and why certain skills did and didn't develop; and
. how theory and practice were integrated.

This chapter reports students' perceptions on the first three of these issues. In an attempt to gain as specific information as possible, students were asked to describe their pre-medical school health/medical training and/or experience, and the details of their medical school clinical experiences, including: what they were expected to do by others; what they themselves sought to do; what they were permitted to do; and what they thought, in retrospect, they should have been able to do by the end of the unit. Since the clinical experiences at each level of the program are distinct, questions addressed the specific nature of the respective student's experience.
At the same time, an effort was made to focus on tasks which were common to students at all levels; hence, certain professional tasks served as focal points: the history and physical examination, and diagnosis and patient management problem solving (including differential diagnosis, and treatment planning and execution).

Subjects' responses to each of the various questions were first subjected to a content analysis, where pertinent responses were identified within the text of the interview transcript, capsulized and recorded on a chart, each chart revealing for each unit each student's descriptive responses to a particular issue. The complete charts are presented as Exhibits A-G. The capsulized responses were then formed into more generic responses. These so-called coded responses were listed and frequency counts made for each group of subjects. The coded responses are presented in the text of this chapter where appropriate.

Certain assumptions regarding the mediating effect of prior experience guided further analysis of the descriptive data. An a priori categorization based on health- and/or medically-related training and/or experience placed subjects into five categories:

I: no or extremely limited health or medical care experience or training
II: training and/or experience in a health-related occupation
III: training in a medically-related occupation but limited experience
IV: limited training but extensive experience in a medically-related occupation
V: training and extensive experience in a medically-related occupation

This chapter, then, presents and interprets each student's description of his/her clinical competence in either of the two focal professional tasks, on the basis of three levels of analysis: program level (Unit I, II or III); pre-medical experiential level (Category I, II, III, IV, or V); and specific descriptive response. Four topical issues provide the organizational framework for the presentation of the analysis: educational and experiential background; perceptions of history and physical examination competence; perceptions of diagnosis and patient management competence; and descriptions of the continuum of clinical competence development.

Educational and Experiential Background

The initial phase of the interview sought students' perceptions of pertinent pre-medical school training and/or experience; i.e., education and experience they thought advantageous in their clinical training/practice experiences in medical school. It was reasoned that such antecedent knowledge and skill would provide an important template for medical school clinical performance; it was not known what effect, if any, these antecedents would have on clinical competence development. Subjects at each of the three levels of training were asked to describe tasks they performed in any pre-medical experiences, including training programs, and to describe any skills they felt were useful in their training for the practice of osteopathic medicine. All subjects described skills they thought useful in medical training and/or similar to those used by physicians. Table 4.1 lists those antecedent skills.

Table 4.1
ANTECEDENT PROFESSIONALLY-RELATED SKILLS OFFERED BY ALL SUBJECTS

Interpersonal skills
Instruct/educate others
Make decisions/take responsibility
Appreciation of cross-cultural values
Interview
Knowledge of health care delivery system
Understand physician role
Had OMT training (limited)
Perform medical history
Perform partial physical examination
Perform medical history
Perform partial physical examination
Perform clinical procedures
Responsible for patient care
Perform treatment procedures
Do medical problem solving
Knowledge of medical records
Write prescriptions
Knowledge of clinical pathology
Know sterile technique
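Before turning to the patterns in these skills, the capsulize-code-tally step described above can be illustrated with a brief sketch. The phrases, codes, and category assignments below are invented for illustration and are not the study's data; the sketch only shows how capsulized responses are mapped to coded responses and tallied into the percent-response-by-category form used in the tables that follow (for example, Tables 4.8 and 4.10).

```python
from collections import Counter, defaultdict

# Hypothetical mapping from capsulized responses to generic coded responses.
code_of = {
    "talked easily with patients as a hospital volunteer": "interpersonal skills",
    "took histories as an emergency medical technician": "perform medical history",
    "interviewed clients in a social service agency": "interview",
}

# Each tuple: (experience category I-V, capsulized responses from one interview).
interviews = [
    ("I",  ["talked easily with patients as a hospital volunteer"]),
    ("II", ["interviewed clients in a social service agency"]),
    ("IV", ["took histories as an emergency medical technician",
            "talked easily with patients as a hospital volunteer"]),
]

# Tally coded responses within each experience category.
counts = defaultdict(Counter)
group_size = Counter()
for category, responses in interviews:
    group_size[category] += 1
    for response in responses:
        counts[category][code_of[response]] += 1

# Report percent response by category.
for category in sorted(counts):
    for code, n in sorted(counts[category].items()):
        pct = 100 * n / group_size[category]
        print(f"Category {category}: {code} = {pct:.0f}%")
```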
Students' descriptions of their antecedent skills were less affected by their levels of medical school training than by their pre-medical experience or training. The more closely related their pre-medical experience/training was to the physician role, the more specifically the skills resembled those students would learn in the medical curriculum. Conversely, the less related the experience to the role of the physician, the more general the reported skills. When the antecedent skills were examined in relation to the experience classification system and by level of program, patterns of skills emerged.

In Unit I subjects, antecedent skills differed depending upon whether their pre-medical education and experiences were health-related or medically-related, and whether they actually practiced or were trained but didn't practice in the occupation. Table 4.2 demonstrates the results of this analysis.

Table 4.2
ANTECEDENT SKILLS OF UNIT I SUBJECTS

UNIT I CATEGORY   A B C D E F G H I J K L M N O P Q R S
I (n=4)           + - - + - + + + - - - - - - - - - - -
II (n=2)          + - - + + + - - - - - - - - - + - - - -
III (n=1)         + - - - + + - - - + - - - - - - - - -
IV (n=2)          + - - + + + + - - + - - - - + - - - -
V (n=4)           - - - + + + + - + + + + + + + + + + +

These data reveal that Unit I students considered interpersonal relations skills, including interviewing, and some broad knowledge of the medical system to be useful to the medical student. A sharp distinction in reported skills is seen between Category V students (those with extensive training and experience in a medically-related occupation) and all others.

In Unit II, similar patterns of antecedent skills emerged across the five experience categories, as seen in Table 4.3:

Table 4.3
ANTECEDENT SKILLS OF UNIT II SUBJECTS

UNIT II CATEGORY  A B C D E F G H I J K L M N O P R S
I (n=4)           + - - + + + - - - - - - - - - - - -
II (n=4)          + - - + + + + + - + + - - - + - - -
III (n=0)
IV (n=1)          + - - - + + + - + + + - - - + - - +
V (n=2)           + - - + + + + - + + + + + + - + - +

And, again in Unit III, similar patterns of antecedent skills are seen across the five categories, except that the patterns appear more consistent with the initial rationale. Table 4.4 presents this analysis.

Table 4.4
ANTECEDENT SKILLS OF UNIT III SUBJECTS

UNIT III CATEGORY A B C D E F G H I J K L M N O P Q R S
I (n=2)           + + + - - - - - - - - - - - - - - - -
II (n=4)          + + - + + + - - + + + + - - - - - - -
III (n=0)
IV (n=3)          + + - + + + + - + + + + + + + + - - -
V (n=2)           + + - + + + + - + + + + + + + + + + -

One difference from students in the other two levels of training emerged here. The ability to instruct/educate (B) was seen by these clinical students to be an important skill. It should be noted that Unit III subjects were already interns when interviewed. As interns, these individuals were already involved in training clerks (Unit III students). It is not known whether this new (non-Unit III) role or the need to guide/instruct patients as Unit III clerks influenced this perception.
When all levels of the program are combined in this analysis, a pattern emerges that approximates the initial rationale, as seen in Table 4.5.

Table 4.5
ANTECEDENT SKILLS OF ALL STUDY SUBJECTS

EXPERIENCE CATEGORY  A B C D E F G H I J K L M N O P Q R S
I (n=10)             + + + + + + + - - - - - - - - - - -
II (n=10)            + + - + + + + + + + + + - - + - - - -
III (n=1)            + - - - + + - - - + - - - - - - - - -
IV (n=7)             + + + + + + + - + + + + + + + + - - +
V (n=8)              + + - + + + + - + + + + + + + + + + +

Six descriptors of antecedent skills were common to all subjects, regardless of the level of training: interpersonal skills (A), instruct/educate others (B), appreciation of cross-cultural values (D), interview (E), knowledge of health care delivery systems (F), and understand physician role (G). It can also be seen that any health- or medically-related training or experience (Categories II-V) provided some skills (and/or perceptions) that students without such experiences did not have. Students with medically-related training and experience (Categories IV and V) cited very specific physician role-related antecedent skills.

Insights

This initial analysis merely specifies and confirms what is generally known: medical school classes are extremely heterogeneous. Not only do students vary greatly in their academic backgrounds, but they are similarly varied in their clinically relevant skills upon entry into medical school.

It is interesting to note that the subjects trained in the allied health occupations were specific-skill oriented. They quickly enumerated particular clinical tasks, for example medical history taking, that they had performed prior to entry into MSU-COM. They needed to be prompted with probing questions in order to describe more general skills they had acquired through life experiences that could also be construed to be pertinent. On the other hand, the clinically naive students, particularly those with advanced educational backgrounds, more quickly identified analogous general skills, for example interviewing or knowledge of the health care system. There were, however, naive students who offered extremely limited insights and skills as descriptors of their relevant background, for example, "Comfortable talking with elderly people."

The subjects' responses may, as one reviewer suggested, reveal an empirically oriented perspective, or may reflect a "way of thinking" that they feel is appropriate for medical students. Currently, no account or accommodation for such antecedent skills is made in the curriculum. Whereas students may take a waiver examination for any basic science course in which they feel competent to test, students have no mechanism by which to waive out of or modify clinical skills training experiences. If these policies reflect a more general attitude of dismissal of pre-medical experiences, these subjects' responses may merely reflect a pragmatic disengagement from their past, except as it most specifically relates to what they were then learning to do. It may well be, for example, that faculty do not specifically attempt to build on the students' existing knowledge and experience base through the use of analogy or comparison to the students' own experiential reference points. Further insights regarding pre-medical clinical experience are described in each subsequent section of this chapter and in Chapters V and VI.
History and Physical Examination

Each unit of the program involves instruction and/or experience in the performance of the medical history and physical examination (H/P). Unit I provides training in the basic H/P and osteopathic manipulative diagnosis procedural skills, and provides a basic science foundation for understanding the rationale for interpreting anatomical, physiological, and psycho-social findings. Unit I students conclude this phase of their training by performing history and physical examinations in several ambulatory and hospital clinical settings. Unit II provides training in advanced osteopathic manipulative diagnosis and treatment, provides a medical science foundation for understanding the rationale for using clinical data in the medical problem solving process, and guides the students through a series of term-long clinical experiences in private, ambulatory care office practices. Unit III involves hospital-based and private practice-based clinical rotations, in which the students use and extend their skills by regularly performing history and physical examinations, as well as other professional tasks.

Based on their differences in accumulated knowledge and clinical experience, students at each of the three levels of the program were expected to have unique competence in the performance of the history and physical examination (H/P).

Unit I

Descriptors of competence at this level of the educational program were sought in interviews with two groups of subjects: those just completing the Unit (Unit I subjects) and those who had completed the unit during the previous year (Unit II subjects). Interviews of Unit I subjects differed depending upon the interviewee's prior experience. Those having medically-related training and/or experience were asked to describe what their less-experienced classmates were able to do in the clinical setting, and why, and to contrast that with their own performance. Therefore, descriptions of competence offered here refer primarily to those subjects who did not have prior medically-related training or experience. In fact, the experienced subjects did not engage in detailed discussions of their own performance and they seemed to disengage themselves from the training and clinical experience, taking a "big brother/sister" attitude towards the experiences of their more naive classmates.

Competence: Unit I subjects described their skills variously, as Table 4.6 reveals.

Table 4.6
DESCRIPTORS OF UNIT I H/P COMPETENCE

Had nine basic questions
Felt good about palpatory skills
Comfortable with interview
Zeroed in on the chief complaint
Could discern abnormals
Did thorough job
Had protocol for H/P
Comfortable handling instruments
Comfortable with procedures
Did excellent social history
Had basics of eye exam
Could ask more questions in pre-med area
Confident of H/P skills
Pursued clues around chief complaint
Comfortable with setting/environment

MSU-COM students at this level generally had learned a general protocol for the history and physical examination, had mastered the mechanics of the physical examination procedures, and could quite comfortably engage a patient in an interview process.
They had not yet acquired the medical knowledge base for pursuing either the history or physical examination investigation beyond the protocol, for interpreting the findings they collected, or for modifying procedures to accommodate specific situations patients present, for example, injured limbs which compromise the standard structural examination.

Individual Unit I subjects, because of pre-medical experience, had particular skills, knowledge or insights that exceeded this generalization. For example, a nutritionist was able to pursue pertinent history and physical examination data related to a patient's diabetes beyond that which naive students described themselves as being able to do. Subjects with Physician Assistant (P.A.) training described their medical history taking and physical examination training and performance as "more thorough than physicians." Exhibit A presents the specific statements that Unit I subjects used in describing their H/P skills.

Looking back to their first (Unit I) hospital history and physical examination, Unit II subjects described their competence somewhat differently than did Unit I subjects, as shown in Table 4.8.

Table 4.8
DESCRIPTORS OF UNIT I H/P COMPETENCE OFFERED BY UNIT II SUBJECTS

Percent response by experience category: I (N=4), II (N=4), III (N=0), IV (N=1), V (N=2)

A Took 2-3 hours to complete H/P          25   25   -   100    0
B Write-up extensive                      25   25   -     0    0
C Everything equally important            25   25   -     0    0
D Do history, then physical exam          25    0   -     0    0
E Sympathetic/concerned for patient       50   25   -     0    0
F Follow structured protocol              50  100   -   100    0
G Awkward with procedures                 25   25   -     0    0
H Quit procedures if taking too long      25   25   -     0    0
I Self/role-conscious                     25   25   -   100   50
J Do complete history and physical        25    0   -     0    0
K Confident                                0    0   -     0  100
L Competent history skills                 0    0   -     0   50
M Smooth manner                            0    0   -     0   50
N Get adequate information                 0    0   -     0  100
O Learned H/P prior to MSU-COM             0    0   -     0  100
P Tailor H/P to chief complaint            0    0   -     0  100

In light of their contemporaneous experiences in preceptor offices, Unit II subjects were more critical than complimentary of their Unit I competence. Even when they described their write-up of the H/P as being "extensive," they were critical of their inability to sort out the important from the unimportant.

Deficiencies: As one might expect at this level of training, subjects generally were unable to interpret historical or physical examination clues or to pursue them beyond the standard protocol. Certain procedures were particularly troublesome with regard to recognition of anatomical and physiological phenomena: the funduscopic examination, auscultation of the heart and lungs, and palpation; and, for more than half, the standard review of systems questions. Specifically, Unit I subjects used the descriptors for their deficiencies listed in Table 4.9.

Table 4.9
DESCRIPTORS OF UNIT I H/P DEFICIENCIES

Don't know how to do a pediatrics exam
Didn't follow protocol
Problem with eye exam
Didn't do physical examination
Couldn't palpate enlarged spleen
Uncertain about heart sounds
Uncertain about lung sounds
Problems with systems review
Uncertain about identifying abnormals
Can't label abnormal findings
Uncomfortable with OMT
Problem with physician role
Uncertain about verbalizing instructions
Not certain how much to demand of patient
P.E. skills "rusty"
Preoccupied with procedural routines

Unit II subjects, notably medically-inexperienced subjects, in reporting their deficiencies described a general angst about their Unit I history and physical examination experience and performance. As was noted, their descriptors of "competence" were phrased as critical evaluations of their performance, so it was to be expected that their descriptions of deficiency would be equally critical. Table 4.10 lists their descriptions of Unit I H/P deficiencies:

Table 4.10
DESCRIPTORS OF UNIT I H/P DEFICIENCIES OFFERED BY UNIT II SUBJECTS

Percent response by experience category: I (N=4), II (N=4), III (N=0), IV (N=1), V (N=2)

A Didn't know what was doing              25   25   -     0    0
B Didn't know terminology                 25    0   -     0    0
C Got no information                      25    0   -     0    0
D Anxious                                 25   25   -     0    0
E Not see or hear cues                    25   25   -     0    0
F Couldn't tie information together       25   50   -   100    0
G Couldn't identify normal or abnormal    25   25   -   100    0
H Eye exam mechanics                      25   25   -     0    0
I Couldn't tailor H/P to chief complaint   0    0   -   100    0
J None                                     0    0   -     0  100

Generalizations About Unit I H/P Performance: Clinical performance in training programs must be seen in the gestalt of the educational-practice environment, both from a historic and an immediate perspective. Each of the subjects perceived the experience differently, and brought to it different skills and expectations. All of the Unit I subjects concluded that "whatever I did was fine," a post-hoc view which, it is presumed, mediated their retrospection and descriptions of competence. Since these students perceived that accuracy and precision were not the faculty/supervisor's expectation, they were generally positive about their performance if they felt that they had gotten through the H/P without embarrassing themselves. A preoccupation with self and discomfort in the role of student physician were described by all but the most experienced subjects. The manner in which Unit I students rationalized their clinical performance is portrayed in these few representative excerpts of interviews:

I feel good about my palpatory skills. . . The chart said 'slight spleen enlargement,' but I didn't feel anything. But the chart said 'slight,' so I didn't worry about it.

I couldn't see anything on the funduscopic. I expected to see some diabetic retinopathy. It looked like grey bilateral areas that might have been cataracts. I didn't look on the chart about that. But I couldn't focus; decided it was me or a cataract. That's one place where I bluffed.

I am pretty comfortable with the eye exam. . . I could not see much in the eyes, but the room was so bright, so the pupils were constricted. The patient's chief complaint had nothing to do with eyes, so I didn't worry about that. If I had to do the examination for anything more than my own edification, I probably would have insisted. But as it was, I didn't. . . I was inconveniencing him so I just said, "It isn't important." If I had been on a clerkship it would have been important, because I would have been accountable for it.

It was acceptable to everybody that whatever I did was o.k.--it was no big deal!

In contrast, Unit II subjects, retrospectively viewing their Unit I performance through "re-ground lenses," were less sanguine. Their experience during Unit II had led them to conclude that it is "a big deal" if you don't feel, see, or hear what you should, or if you fail to perform an examination because of technical incompetence or inconvenience.
Unit I subjects saw themselves as intruders upon patients; intruders in the sense that they were not a part of the patient's health care team, nor were they competent to provide any particular service to the patient. Unit II subjects, on the other hand, saw themselves as providing care to patients, having responsibility to be the "eyes and ears" of the physician who was caring for the patient. This new sense of responsibility to the recipient of care requires a new level of accountability and competence. In fact, the investigator's experience talking with some of the Unit II subjects when they were first-year students leads her to conclude that their Unit I skills and attitudes were not dissimilar to those described by the current Unit I subjects.

The following generalizations were drawn from the statements offered by Unit I subjects:

. Students had various personal goals for the experience: to practice their skills; to meet assumed expectations of faculty (to get the "correct answer"); to meet requirements of the course.
. Students experienced more discomfort with the orientation and patient selection process than with the actual H/P experience.
. Students followed protocols, usually through use of "crib sheets," when performing both the history and the physical examination.
. Despite any inability to "see" and "hear" or to interpret what was seen or heard, students described their H/P skills as "good," "comfortable," "confident of skills," and even "thorough."
. With the exception of those with medically-related pre-medical school experience, students were unable to pursue chief complaint clues, due to their lack of medical science knowledge.
. Students perceived their role as non-physician learners and as such assumed no authority to impose time demands or discomfort on patients. Therefore, they eliminated procedures, especially structural examinations, when they had questions regarding their relevance or their safety for the patient.
. Students emphasized those aspects of the H/P for which their pre-medical school experience best prepared them.
. Students without medically-related pre-medical school experience described their clinical competence at the end of Unit I:

MEDICAL HISTORY
Can establish patient rapport
Can get patient to share information
Follow a routine protocol
Have basic questions to be asked
Can't pursue chief complaint with more specific systems questions

PHYSICAL EXAMINATION
Follow protocol
Can handle equipment
Comfortable with procedures
Usually recognize findings which aren't "normal"
Difficulty in visualizing anatomic structures in eye exam
Most difficulty recognizing abnormal findings in eye exam, heart and lung auscultation
Descriptions of findings usually framed in lay terms
Don't know what findings mean clinically
Know how to perform a competent H/P

WRITE-UP
Didn't do

Unit II

Descriptions of competence at this level of the educational program were sought in interviews with two groups of subjects: those just completing Unit II (Unit II subjects) and those who had completed the Unit more than a year earlier (Unit III subjects).

Competence: The history and physical examination described by Unit II subjects reflected increased confidence in their medical student role and skill, and a more in-depth medical science knowledge base.
They generally described themselves as being able to effectively establish rapport with office practice patients; to conduct good history and physical examinations and to distinguish between normal and abnormal physical findings; and to recognize clues from the H/P that they would have ignored in Unit I, and to pursue those clues with a more in-depth investigation. The H/P was now geared to the chief complaint, as compared with the comprehensive H/P of Unit I students, and was fairly well routinized. The specific descriptors used by Unit II subjects to describe their H/P performance are listed in Table 4.11.

Table 4.11
DESCRIPTORS OF UNIT II H/P COMPETENCE OFFERED BY UNIT II SUBJECTS

A Comfortable with procedures
B Good pelvic examination
C Good chest examination
D Good ear examination
E Good vitals (B/P)
F OMT is better
G Palpatory skills better
H More observant
I Good eye examination
J More objective in performing examinations
K More confident
L Can better distinguish abnormal from normal
M Get accurate information
N Good well-baby physical examination
O Comfortable with rectal examinations
P Get good histories
Q H/P skills were developed prior to MSU-COM
R Competent in interpretation of findings

When these descriptors were keyed to the subjects' pre-medical experience, certain patterns in responses emerged, as shown in Table 4.12. The least experienced subjects (Category I) used more specific descriptors of their competence than the most experienced subjects (Category V). Experienced subjects appeared to emphasize newly-acquired or expanded competence (OMT, palpatory skills, cardiovascular examination, and interpretation of findings), while embracing the technical skills of the history and physical examination under the single comment that their "H/P skills were developed prior to entering MSU-COM."

Table 4.12
DESCRIPTORS OF UNIT II H/P COMPETENCE OFFERED BY UNIT II SUBJECTS IN EACH EXPERIENCE CATEGORY

Percent response by experience category: I (n=4), II (n=4), III (n=0), IV (n=1), V (n=2)

A Comfortable with procedures                    75   25   -   100    0
B Good pelvic examination                        50   25   -   100    0
C Good chest examination                         25   25   -   100   50
D Good ear examination                           25    0   -   100   50
E Good vitals (B/P, etc.)                        25    0   -     -    -
F OMT better                                     75   50   -     0   50
G Palpatory skills better                        25    0   -   100   50
H More observant                                 25    0   -     0    0
I Good eye examination                           25    0   -     0    0
J More objective in performing examinations      25    0   -     0    0
K More confident                                  0   75   -     0    0
L Can better distinguish normal from abnormal    50    0   -     -   50
M Get accurate information                       25   50   -     0    0
N Good well-baby examination                     25   25   -     0    0
O Comfortable with rectal exam                   25   25   -   100    0
P Get good histories                              0   25   -   100    0
Q H/P skills developed prior to MSU-COM           0    0   -     0  100
R Competent in interpretation of findings         0    0   -     0  100

Exhibit B presents each of the statements Unit II subjects used to describe their H/P performance.

When asked to describe their Unit II H/P competence, Unit III subjects used an equal number of, but different, descriptors, as shown in Table 4.13. Unit III subjects' descriptors seem to focus on the meaning of data, while Unit II subjects focus on the means of gathering accurate data. However, inexperienced Unit III subjects' descriptors can also be interpreted as more focused on technical aspects of the H/P (knowing what questions to ask, knowing criteria for normal, organization of the H/P) than were those of their peers with medically-related experience.
Nonetheless, the language of Unit III subjects was distinct from that of the Unit II subjects, suggestive of their broader conception of the role of the history and physical examination in the medical problem solving process.

Table 4.13
DESCRIPTORS OF UNIT II H/P COMPETENCE OFFERED BY UNIT III SUBJECTS

Percent response by experience category: I (N=2), II (N=4), III (N=0), IV (N=3), V (N=2)

A Pre-natal checks                                50    0   -     0    0
B Begin to distinguish abnormal from normal       50   25   -     0    0
C Ask routine questions                            0   25   -     0    0
D Identify system of chief complaint               0   25   -  66.6    0
E Explain something about possible problems        0   25   -  33.3   50
F Gather accurate data                             0   25   -   100    0
G Have some labels for abnormalities               0   50   -     0    0
H H/P as good as average D.O.                      0   50   -     0    0
I Identify chief complaint                         0   25   -     0    0
J Make decent problem list                         0   25   -  33.3   50
K H/P tailored to chief complaint                  0   25   -     0   50
L Good basic systems review                        0   50   -     0   50
M Limited facility in proposing problem list       0   25   -     0    0
N Thorough history                                 0   25   -     0    0
O Good organization of H/P                         0    0   -  66.6   50
P Comfortable with pelvic exam                     0    0   -  33.3    0
Q H/P same as prior to MSU-COM                     0    0   -     0   50
R Broader, longer problem list                     0    0   -     0   50
S Comfortable with all routine office procedures   0    0   -     0   50

Deficiencies: Certain Unit I deficiencies in the history and physical examination persisted in Unit II: performing the funduscopic examination; interpreting auscultation of the heart and lungs; and palpation. In addition, at least a third described problems with pelvic and rectal examinations, and some noted problems in describing findings in proper medical terminology. The most experienced subjects denied having any problems with the history and physical examination at this level of training. Table 4.14 presents the range of descriptive statements Unit II subjects used to describe their H/P deficiencies:

Table 4.14
DESCRIPTORS OF UNIT II H/P DEFICIENCIES OFFERED BY UNIT II SUBJECTS

A Problems with eye exam
B Rectal exam not good
C Endoscopic exam not good
D Problems with chest sounds
E Psychiatric/neurologic evaluation
F Uncertain about palpatory diagnosis
G Uncomfortable with pelvic exam
H Need practice on entire H/P
I Not sure what to do for each patient
J Dictation of H/P
K Accuracy of description of findings
L History incomplete
M None

Again, inexperienced subjects described their H/P deficiencies differently from experienced subjects, as shown in Table 4.15. Although certain deficiencies, such as the eye exam, neurologic exam, palpation and medical terminology, persist across experiential groups, the most experienced subjects offered far fewer descriptors of their deficiencies. One experienced subject, a former physician's assistant, claimed to have no deficiencies.
Table 4.15
DESCRIPTORS OF UNIT II H/P DEFICIENCIES OFFERED BY UNIT II SUBJECTS IN EACH EXPERIENCE CATEGORY

Percent response by experience category: I (n=4), II (n=4), III (n=0), IV (n=1), V (n=2)

A Problems with eye exam                   75   50   -   100   50
B Rectal exam not good                     50   25   -     0    0
C Endoscopic exam not good                 25    0   -     0    0
D Problems with chest sounds               25   50   -     0    0
E Psychiatric/neurologic evaluation        25    0   -     0    0
F Uncertain about palpatory diagnosis      25   75   -     0   50
G Uncomfortable with pelvic exam            0   75   -     0    0
H Need practice on entire H/P               0   50   -     0    0
I Not sure what to do for each patient      0   25   -     0    0
J Dictation of H/P                          0   50   -     0    0
K Accuracy of description of findings       0   25   -     0   50
L History incomplete                        0    0   -   100    0
M None                                      0    0   -     0   50

Unit III subjects offered more descriptors of their Unit II history and physical examination deficiencies than did Unit II subjects. As shown in Table 4.16, subjects with medically-related training and/or experience used different descriptors from subjects without such training and experience. Inexperienced subjects described deficiencies both in technical skills and in data interpretation ability, while more experienced subjects tended to point up organizational and data interpretation deficiencies.

Table 4.16
DESCRIPTORS OF UNIT II H/P DEFICIENCIES OFFERED BY UNIT III SUBJECTS

Percent response by experience category: I (n=2), II (n=4), III (n=0), IV (n=3), V (n=2)

A Not able to evaluate patient             50    0   -     0    0
B Weak physical exam skills                50    0   -     0   50
C Not able to hone in on chief complaint   50   25   -     0    0
D Miss subtle findings                     50    0   -     0    0
E Weak in interpreting data                 0   25   -     0    0
F Heart sounds identification               0   25   -     0    0
G Miss things in PE                         0   25   -     0    0
H Difficulty developing problem list        0   25   -     0    0
I Inadequate pelvic exam                    0   25   -     0    0
J Didn't feel competent                     0   25   -     0    0
K Certain systems knowledge weak            0   25   -  33.3    0
L Tend not to know what looking for         0    0   -  33.3    0
M Disappointed in OMT skills                0    0   -  33.3    0
N Can't weed out unimportant                0    0   -  33.3    0
O Not always recognize abnormal             0    0   -  33.3   50
P Not able to label findings                0    0   -  33.3    0
Q Couldn't integrate H/P                    0    0   -  33.3   50
R Not get beyond general                    0    0   -  33.3    0
S Not efficient                             0    0   -     0  100

When one compares the descriptors used by Unit II subjects with those offered by Unit III subjects, further distinctions can be made. Although both subject groups point to certain technical deficiencies, Unit III subjects' various other descriptors go beyond the procedural aspects of the history and physical examination. As was noted in discussing the descriptions of competence, Unit III subjects had come to think about the H/P differently from Unit I and Unit II subjects, i.e., as an integral part of the diagnostic process, and had greater insight into the importance of knowledge of pathology and disease in performing the history and physical examination. Just as the Unit II subjects with prior medically-related experience tended to focus on their problem solving competencies and deficiencies, the Unit III subjects, looking back at their Unit II performance, viewed it with their current criteria in mind.

Generalizations about Unit II H/P performance: Unit II subjects, in contrast to Unit I students, performed the H/P as part of their preceptors' patient care services. Their performance was necessarily geared to the constraints of the practice environment.
As their descriptive statements show, they no longer attempted to perform a complete H/P as learned in their MSU-COM skills laboratory, but rather attempted to collect accurate data pertinent to the patient's chief complaint within the time constraints of the particular preceptor's office. Certain procedures, for example structural evaluation, may have been infrequently used, in which case subjects assumed these skills to have deteriorated.

This orientation towards professional practice and away from student practice forced a shift in perspective at several levels. First, there was a shift away from self towards attention to the validity of the data. Although there was self-consciousness in the student-physician role, there was an increasing effort to provide a service to the patient and to meet the expectations of the preceptor. This change in perspective, secondly, changed the tone of the subjects' self-evaluations. Most subjects appeared to have raised the standard by which they judged their performances. The Unit II subjects' perspectives on clinical performance are conveyed in the following selected excerpts:

Everything is geared to the chief complaint. You learn to take the short cuts and which short cuts to take from feedback from the doc. I sit down and talk directly with the patient and get the chief complaint, but I tend to do the systems review as I am doing the physical exam. The signs and symptoms direct the questions. In Unit I you just follow the protocol and worry about whether you left anything out.

In Unit I if I didn't visualize the ear drum, I'd chart 'couldn't see the ear drum.' In the preceptor's office [Unit II], I might have to ask the nurse to help me wash out the ear--then take a good look. I would be thinking otitis media in that case. I didn't realize the seriousness in Unit I. Now lots of things are going through my mind--I am not just thinking about the ear drum.

When I started to tailor-make the history [in Unit II], and when I talked with the physician about the patient, I would find out that I had omitted something critical, and then go back in. I would make more than one trip into the room with the patient. I think that is a healthy sign in the early stages.

There are some things that I almost always do: I look in the eyes because I need the experience; I always listen to the heart for the same reason and because I think everyone should have it; every kid I look in the ears, because ear problems are so diffuse and common in kids. Past that, it [the PE] is focused.

More specifically, the following generalizations can be drawn from Unit II subjects' statements:

. Students are concerned with looking and sounding good to preceptors and patients.
. Without relevant knowledge, in the form of prior clinical experience or having had the pertinent systems course(s), students feel uncomfortable, are insecure, and function slowly and methodically. Student competence in performing the H/P is related to the amount of accumulated knowledge and experience. Systems courses provide students with such functional knowledge as an understanding of the pathophysiology of the particular disorder at hand, questions to elicit pertinent data in the history, and criteria for recognizing and interpreting signs.
. Proficiency in conducting the history and physical examination increases from the end of Unit I to the end of Unit II, the amount depending upon the nature of pre-medical competence.
. Skills quickly diminish without constant practice.
. Many skills learned in the Unit I skills labs are not practiced in the clinical setting.
. The primary "competence" standard for Unit II students is to "feel comfortable" doing the task. The definition of "comfortable" varies with the student.
. Students specifically describe their H/P performance in the following way:

MEDICAL HISTORY
Can establish rapport--patient-oriented, but guided by preceptor's style
Focus on chief complaint
Try to conform to constraints of practice, i.e., time, process, protocol
Identify chief complaint and pertinent system(s)
Ask pertinent questions to explore chief complaint in more depth
Know meaning of questions and answers

PHYSICAL EXAMINATION
Focus on chief complaint
Perform pertinent procedures re chief complaint
Can handle equipment comfortably
Can transcend procedures to attend to patient
Can usually describe findings in medical terminology
Can pick up on and pursue clues with further history and/or physical examination
Can interpret findings--know criteria for normal
Nearly all procedures under control
Pursue collection of data as part of responsibility to patient

WRITE-UP
Follow procedure used by preceptor to chart findings

Unit III

Descriptors of competence at this level of the educational program were sought in interviews with individuals who had, several months earlier, completed the educational program and were, at the time of the interview, interns.

Competence: The history and physical examination performed by Unit III students is fundamentally different from that performed by Unit II students, both in structure and competence. Unit III students routinely perform complete admission H/Ps in hospital rotations, the results of which they dictate for inclusion in the patients' medical records. They describe being able to do an accurate, thorough, and efficient medical history, and being able to gain increased information from the physical examination. The most striking feature of the subjects' descriptions of their H/P performance was their confidence, as evidenced by their lack of detail in describing their H/P skills, either positively or negatively. As Table 4.17 reveals, all Unit III subjects expressed general satisfaction with their history and physical examination. Whereas relatively inexperienced subjects described increased skill in gaining subtle information through the H/P and more finesse in interpreting abnormal findings, students with prior medically-related experience described gaining confidence in organizing the data into more efficient systems for diagnosis, which can be considered a higher level of professional competence.

Table 4.17
DESCRIPTORS OF UNIT III H/P COMPETENCE

Percent response by experience category: I, II, III, IV, V

A Conduct good history                                          50  100   -  100  100
B Can ask more specific questions                                0   75   -   33   50
C Can distinguish normal from abnormal findings                  0   75   -    0    0
D Prepare an accurate write-up of H/P                            0    0   -   33   50
E Conduct thorough physical examination                         50   50   -  100  100
F Can identify systems involved in chief complaint               0    0   -   33    0
G Am confident                                                   0    0   -    0   50
H Can assess the cardiovascular status                           0    0   -    0   50
I Have identified pertinent signs and symptoms for generic
  problems in all systems                                        0    0   -    0   50

Exhibit C shows the descriptions of history and physical examination competence offered by each of the Unit III subjects.

Deficiencies: It is in their descriptions of deficiencies that one can make the clearest distinction between the experienced and inexperienced subjects.
The inexperienced students variously described specific weaknesses in their clinical data collection skills: not efficient, miss refined clues, cardiology evaluation weak, and so forth. On the other hand, experienced subjects either offered no deficiency descriptions or spoke of inefficiencies only in writing treatment orders. Specific descriptors of deficiency used by Unit III subjects are presented in Table 4.18.

Table 4.18
DESCRIPTORS OF UNIT III H/P DEFICIENCIES

Percent response by experience category: I (n=2), II (n=4), III (n=0), IV (n=3), V (n=2)

A Not efficient                                      0   25   -    0   50
B PE skills not as good as should be                 0   25   -    0    0
C Cardiology evaluation                              0   25   -   33    0
D Neurology evaluation                               0   25   -    0    0
E Respiratory evaluation                             0    0   -   33    0
F Refining H/P for detailed system evaluation        0    0   -   33    0
G Describing things not seen before                  0    0   -   33    0
H No deficiencies described                         50   25   -   33   50
I Lack in-depth medical knowledge                    0    0   -   33    0
J Miss subtle signs and symptoms                     0    0   -   33   50
K Uncomfortable with semi-independent responsibility    (values illegible in the source)

Generalizations about Unit III H/P performance: As previously noted, Unit III students, particularly when in hospital-based rotations, are expected on a routine basis to perform and dictate accurate, pertinent histories and physical examinations on newly admitted patients. The students thus gain a good deal of practice doing the complete H/P while at the same time increasing their base of knowledge of clinical medicine. Both skills and confidence in performing the H/P increase to a new level of competence during Unit III. The following excerpts demonstrate the changing perceptions of clinical competence offered by Unit III subjects:

I think that I thought I was pretty thorough in the history [in Unit III] because I thought I did a pretty good job of systems review and systematically went through from head to toe. We had some basic questions to ask for systems review. But [now] when you work with specialists you find out that each sub-category has to be pinned down--there are additional questions to ask if they say "yes" to one of those basic questions.

History and physical examination are the most important clinical skills. I include differential diagnosis under the H and P. You have to approach each patient as though you know nothing about them. A lot of times the physician will just say, "Direct your H/P towards the chief complaint." That's O.K. if you have 20 years of experience and you can pick up on things, but as an extern, intern or resident you have to get a system down so you can be pretty confident that you ask all the right questions and you have properly examined all the parts of the body that can be problems for the patient. You not only have to be able to do an adequate general history and physical examination on each area but you also have to be able to get in specific questions in each area in order to get in your diagnosis.

These comments reveal a different perspective on the role of the H/P and the degree of sophistication of knowledge of clinical medicine (pathology) necessary to perform an "adequate, thorough" H/P. Unit III subjects, in contrast to both Unit I and Unit II subjects, were involved in solving complex medical problems. They were beginning to understand the relationship of the H/P to their personal knowledge, and how the physician's diagnostic acumen hinges on that knowledge.
In that light, the deficiencies described by Unit III subjects cannot be equated with similar deficiencies described by Unit II or, surely, Unit I subjects. Certain generalizations were supported by Unit III subjects' statements:

. Students' confidence and competence in all areas of clinical skills are greatly increased over those of Unit II.
. Subjects perceive themselves to be more thorough in the performance of the medical history and physical examination than the average D.O. practitioner.
. Higher standards of performance are self-imposed, based on insight into the specialist's level of knowledge and skills.
. Efficiency in performing the H/P greatly increases through the course of Unit III.
. Competence at the Unit III level is generally described:

MEDICAL HISTORY
Thorough and accurate
Self-confident
Know important questions to ask
Can elicit pertinent information on all systems
Data collection focuses on differential
Data collection efficient

PHYSICAL EXAMINATION
Thorough and accurate
Much improved in distinguishing normal from abnormal
Get basic information for identification of problems
Do good assessment of the systems
Procedures and protocols are routinized
Data collection focuses on differential

The medical history and physical examination were integrated to the extent that the student thought appropriate or productive.

Diagnosis and Patient Management

Both Unit II and Unit III provided instruction and experience related to medical problem solving. Unit II provided, through the systems biology courses, the basis for understanding the normal structure and function of each system and the pathophysiology of system-related disease. Through the preceptorship experiences students were provided opportunities for applying didactic information to the practice of medical problem solving in the ambulatory care setting. In these settings patients most often presented with routine primary care problems, and students were involved in collecting clinical data pertinent to the presenting problem (chief complaint) and proposing a problem list, a diagnostic evaluation process and a treatment plan.

Unit III provided students with both ambulatory and hospital care clinical experiences. Generally, students extended their clinical medicine knowledge base by being directly involved in the diagnosis and management of patients on a case-by-case basis. By the nature of the medical problems of hospitalized patients in particular, students were involved in a problem solving process requiring more in-depth knowledge than they had acquired in Unit II.

Based on differences in their knowledge base and the type of problems they confront, students at the second and third levels of the program were expected to have unique competence in medical problem solving. It was assumed that Unit I students were not involved in diagnosis and management of medical problems.

Unit II

Competence: As shown in Table 4.19, subjects used a variety of descriptors of their problem solving abilities. On the basis of frequency of reference to each descriptor, students at this level have generally developed an ability to identify abnormal findings from the H/P, to accurately identify the chief complaint, to correctly identify the associated body system, to propose at least a short list (3 to 4) of possible problems which are congruent with the clinical data, and to recall protocols for diagnosing and treating routine problems with which they have had experience.
Subjects with medically-related experience who have worked in ambulatory settings have routinized protocols from which they diagnose and manage primary care problems.

Table 4.19
DESCRIPTORS OF DIAGNOSIS AND TREATMENT COMPETENCE OFFERED BY UNIT II SUBJECTS

Percent response by experience category: I (n=4), II (n=4), III (n=0), IV (n=1), V (n=2)

A Include osteopathic evaluation in differential    0    0   -    0   50
B Confident/comfortable                            25    0   -  100  100
C Can propose problem list                         25  100   -  100  100
D Will be in the "ball park" on differential       25    0   -  100   50
E Can do cardiovascular review                     25   25   -  100  100
F Know/use references for problem solving          50   25   -    0    0
G Look up drugs for treatment                      50   50   -  100    0
H Know routine drugs if have worked on case        25   50   -  100   50
I Can problem solve simple (one system) problems   25    0   -    0    0
J Learning list of routine G.P. cases              25    0   -  100    0
K Can integrate systems knowledge in diagnosis     25    0   -    0    0
L Use general terminology in problem solving        0   50   -    0   50
M Comfortable with gynecologic cases                0   50   -  100  100
N Can develop problem list for symptoms             0   25   -  100    0
O Can identify system of chief complaint            0   25   -  100  100
P Comfortable with respiratory problems             0   25   -  100  100
Q Comfortable with all systems                      0    0   -  100  100
R Can propose rationale for diagnostic tests        0    0   -    0   50

Exhibit D shows each Unit II subject's description of his/her diagnostic and treatment competence.

Deficiencies: Unit II subjects also used a variety of descriptors for their problem solving deficiencies, as shown in Table 4.20. The general deficiencies acknowledged by students at this level were limited knowledge of drugs of choice and inability to work up patients with multi-system problems. Additional observed deficiencies depended upon the individual student's experience. The ability to work up particular systems or problems was described as a deficiency when such a problem or system had not been worked up because it was not confronted, or when the subject was unable to work up the case when it was confronted.

Table 4.20
DESCRIPTORS OF DIAGNOSIS AND TREATMENT DEFICIENCIES OFFERED BY UNIT II SUBJECTS

Percent response by experience category: I (n=4), II (n=4), III (n=0), IV (n=1), V (n=2)

A Uncertain about renal problems                         25   25   -    0    0
B Uncertain about respiratory problems                   25    0   -    0    0
C Uncertain about neurology problems                      0    0   -  100   50
D Both old and young patients are a problem              50   25   -    0    0
E Don't know drugs of choice                             50   25   -    0    0
F Multi-systems problems difficult                       25   50   -  100    0
G Have problems with people who don't speak English      25    0   -    0    0
H Don't know treatments                                  50    0   -    0    0
I Uncertain about dermatology problems                    0    0   -  100    0
J Have trouble refining initial problem list              0   25   -  100    0
K Don't know diagnostic tests                             0   25   -    0    0
L Problem list not accurate much of the time              0   25   -    0    0
M Have trouble working up things not seen before          0   25   -    0   50
N Don't have medical terminology for describing findings  0   25   -    0   50
O Am slow                                                 0   25   -    0    0
P Forget important questions                              0    0   -  100    0

Generalizations about Unit II medical problem solving: Unit II students, in the absence of extensive knowledge of clinical medicine, had limited confidence and skills in medical problem solving. Students used their accumulating systems biology knowledge to assist them in interpreting clinical data. They used clinical texts, drug references and the preceptor to guide them through the problem solving process.
The following general statements were supported by the interview data:

. Without the relevant knowledge and experience with a particular clinical problem, the student feels uncomfortable and functions slowly and methodically.
. Approaches to problem solving and management are very much influenced by models presented by preceptors, unless the behavior modeled is counter to the student's philosophy and/or "scientific principles."
. Students gear their activity to the constraints of the practice in which they work; i.e., do things as the preceptor does them, reduce the time for procedures.

Unit III

Competence: Unit III students were intensely engaged in the process of learning how to diagnose and manage specific medical problems. Accordingly, their confidence and competence differed from those of Unit II subjects. Particular statements characterize Unit III subjects' descriptions of competence: can do initial work-up, can organize data into a few possible problems, know basic admitting orders, know the initial diagnostic tests to be performed, can start the initial treatment regimen, know drugs for certain problems, have an idea of how to initially manage an emergency. The interviews made clear, however, that these generic skills were limited to those types of clinical cases with which the student had had sufficient practice. Table 4.21 lists the descriptors used by all subjects in describing their problem solving skills.

Table 4.21
DESCRIPTORS OF DIAGNOSIS AND TREATMENT COMPETENCE OFFERED BY UNIT III SUBJECTS

Percent response by experience category: I (n=2), II (n=4), III (n=0), IV (n=3), V (n=2)

A Self-confident                                           0    0   -   33  100
B Can organize information into a few possible problems   50   50   -   66  100
C Can do initial work-up                                   50   25   -   66  100
D Know initial diagnostic tests                            50   25   -   66  100
E Can start initial treatment regimen                      50   25   -   33  100
F Can read chest and abdomen X-rays                        50   25   -    0   50
G Know drugs for certain problems                           0   50   -   66   50
H Know basic admitting orders                               0   50   -   66   50
I Have idea about managing emergency cases                  0   25   -  100   50
J Try to think holistically                                 0   25   -    0   50
K Know quick differential for emergency problems            0   25   -    0    0
L Can handle COPD cases from beginning to end               0        -    0    0
M Can write good progress reports                           0   25   -    0    0
N Can handle G.I. problems from beginning to end            0   50   -    0    0
O Can handle cardiovascular problems                        0   50   -    0    0
P Can work up hematology problems                           0   25   -    0    0
Q Can read EKGs                                             0    0   -   33    0

Exhibit E presents individual Unit III subjects' descriptions of their diagnostic and treatment competence.

Deficiencies: Unit III subjects cited a host of deficiencies in their ability to diagnose and treat medical problems. As seen in Table 4.22, three of these deficiencies were common to almost all students at this level of training: they had not memorized drug dosages; their differential diagnosis was limited and included broad, rather than specific, problems; and they had limited knowledge of specific diseases.
Table 4.22
DESCRIPTORS OF DIAGNOSIS AND TREATMENT DEFICIENCIES OFFERED BY UNIT III SUBJECTS

Percent response by experience category: I (n=2), II (n=4), III (n=0), IV (n=3), V (n=2)

A Can't manage complete patient work-up            50   25   -   33   50
B EKG interpretation                               50    0   -    0    0
C Can't manage "new" diabetic                      50    0   -    0    0
D No appreciation for pediatrics                   50    0   -    0    0
E Back pain evaluation/differential                50    0   -    0    0
F Treatment management for diabetic acidosis       50    0   -    0    0
G Work-up for anemia                               50    0   -    0    0
H Work-up for renal disease                        50    0   -    0    0
I Work-up for cirrhosis                            50    0   -    0    0
J Knowledge of medical problems limited            50   50   -   33   50
K Drug dosages                                      0   50   -   66    0
L Refinement of problem solving weak                0   50   -   66   50
M Admitting orders not routine                      0   25   -    0    0
N Distressing to work from PE alone                 0   25   -    0    0
O Write inadequate progress notes                   0   25   -    0    0
P Neurology                                         0   25   -    0   50
Q Orthopedics                                       0   25   -    0    0
R Handling respiratory problems                     0   25   -    0    0
S I.V. therapy                                      0    0   -   66    0
T Variations in treatment for common problems       0    0   -   33   50
U Lack of confidence in decision making             0    0   -   66   50

Generalizations about Unit III medical problem solving competence: The medical problem solving competence described by Unit III subjects was distinctly different from that described by Unit II subjects. However, in this small group of subjects there appeared to be a wide range of definitions of individual competence. The subjects who had medically-related experience prior to medical school continued to have more confidence in their clinical performance than those with no, or less relevant, medical experience. As one experienced subject stated:

I had everything down going into internship. It was just a matter of refining it. . . There were things that I did as a medical corpsman I haven't been allowed to do as a doc yet.

Competence was described in terms of specific types of medical problems, rather than generic skills. Hence, subjects who had not worked up certain cases described themselves as being deficient in diagnosing and managing those types of cases. It becomes apparent that students do not have uniform experiences with regard to the types of cases with which they work; hence the specific and wide-ranging descriptions of deficiencies. Too, it must be taken into consideration that the subjects of this part of the study were, at the time, involved in their internship. They were looking backward to Unit III from the perspective of a new level of responsibility, a fact important to what associations are made and how performance is evaluated by students. Certain other generalizations can be made based on students' comments:

. Students' confidence and competence in all areas of clinical performance are greatly increased over those of Unit II.
. Students' primary goal is to routinize the prescribed protocols for management of "common" medical cases that they will be responsible for as interns.
. In the absence of supervised responsibility for a patient's diagnosis and management--as in coat-tailing or attending lectures--the student perceives him/herself to be incompetent to work up and follow up the generic problem represented by the patient.
. Competence at the Unit III level is generally described:

DIFFERENTIAL DIAGNOSIS
Have identified "packages" of signs and symptoms for generic problems of each system
Can propose problem list of at least 3 or 4 specific possible problems for each system
Have good differential approach to systems in which there was good clinical instruction
Able to work up more complex (multi-system) problems
Know routine diagnostic procedures to employ to work up differential
Likely not to know esoteric diagnostic procedures
Can interpret EKGs and routine chest and abdominal films

MANAGEMENT
Know routine admitting orders for frequently encountered problems
Know first-level management of frequently encountered emergency situations
Have not routinized drug dosages except for frequently encountered emergency problems
Have limited knowledge of and familiarity with therapeutic regimens such as I.V. therapy, acute respiratory therapy, etc.
May or may not have performed normal delivery
Have insight into the complete history of a medical problem--out-patient and in-patient presentations
Have developed certain technical skills used in diagnosis and management, such as: suturing, vena puncture, arterial puncture, pelvic exam, rectal exam, intubation, catheterization, central venous line

Insights

The subjects of this exploratory study brought to their clinical experiences their cumulative life experiences, as well as skills and knowledge they had gained in the medical school program. Just as each individual's life experience was different, each had a different perspective, demonstrated unique skills, and gained different outcomes from each clinical experience. It is this investigator's impression that Unit I students drew heavily from their past experience when confronted with the initial challenges of patient interaction. They tended to operate from their "civilian" experiences, i.e., emphasizing those aspects of the history or physical examination for which they felt they had the most background. Students' pre-medical school experiential base can carry them through certain aspects of Unit II and Unit III experiences as well; however, it is health- and medically-related experiences which are significant at the more advanced stages of clinical training. With each succeeding Unit, medically-related experiences became the most supportive. These "special refuges of confidence" were important to students, for they provided them with at least one bit of knowledge, skill or insight on which they could depend, and, perhaps, even distinguish themselves from peers. Comments by subjects point up this dependence on a special skill/knowledge:

My goal when I came to medical school was when I got to the hospital to be able to do one thing better than people around me. (Unit II subject)

People out in internship report not being able to visualize vessels on the funduscopic exam. I guess I attribute it [skill in performing the funduscopic exam] to my mechanical background. If I don't see anything I will re-position myself or the instrument until I see what I am supposed to see. (Unit I subject)

When I do a medical history I think I emphasize the social aspect of one's health situation more so than other students. Although I think it is very important to look at the medical aspects, I am not as comfortable with it. I have noticed that my social histories are much more complete. I have also found that in my interactions with patients, emotional sharing often occurs that they say is unique for them.
(Unit I subject)

I can go into a room with a patient complaining of chest pain and symptomology--the things I learned in the ER (prior to MSU-COM)--of real patients. In school they teach classical presentation, but they didn't tell you how your patient was going to look: grey color. (Unit III subject)

However, this exploratory study suggests that typologies of experience, such as the one used, may serve a limited purpose. It is unlikely that such categories will predict performance, per se. However, further exploration of the ways in which students draw on their life experiences could serve as a basis for modifying clinical experiences for individuals and categories of students, and for advising students in their professional development.

These data also support a well known but often ignored or forgotten fact: medical school does not produce a competent physician. Individuals graduating from medical schools, like those of any other graduate educational program, are each unique in their competence. The standards for the definition of that competence may be self-designed, program-designed, or even undefined, depending upon the degree to which the program specifies its standards of performance and the degree to which the student pro-actively guides his/her professional development. This issue will be discussed further in Chapter V.

In general, it is possible to distinguish the clinical competence of subjects at each of the three levels of training. And further, the differences are not only in the professional tasks which students at each level can perform, but also in the very nature of how, why and with what a particular task is undertaken. Data collected through the H/P performed by Unit I subjects, for example, differ in substance and meaning from those collected by Unit II subjects; and Unit III subjects are able to collect even more specific, relevant, comprehensive information than Unit II subjects. There is a continuously accumulating body of knowledge which at each level of training increases the mechanical finesse with which students perform procedures, the meaning of data for students, and the students' ability to organize data into meaningful information with which to solve medical problems. The accumulating knowledge and skills, together with increased responsibility for patient care, mediate shifts in students' definitions of what they could and couldn't do at a particular level of training.

At the time of study within a particular unit, subjects were inclined to draw fairly positive descriptive images of their performances, even though they could identify many specific deficiencies. In retrospect, however, they were inclined to focus on what they were unable to do, in light of their new levels of understanding of what one should be able to do. This phenomenon is demonstrated in Exhibit F, where Unit II students contrast their Unit I and II H/P performance, and in Exhibit G, where Unit III subjects describe their Unit II H/P performance. In these comparisons, students across levels of training consistently described performance competence for a particular Unit, but tended to be more critical in assessing deficiencies at a former level of training. This is, of course, part of the socialization process of adopting a new role, shifting from a "lay" person's perspective to that of an "expert." It is not clear to what extent the lack of clear clinical performance standards influences students' performances at a particular level, or their descriptions of their performances.
As noted earlier, Unit I students concluded that the clinical experience was designed merely to allow them an opportunity to interact with patients in a clinical setting. This experience, they concluded, was designed neither to improve nor to evaluate their H/P skills. These students also commented that when upper class persons assured them that it would be a low-pressure experience, they did not particularly worry about or make a special effort to prepare for the experience.

It is likely that any description of competence for a given level of training offered by this study is unique to this program at this point in time; it may not be pertinent to the MSU-COM program if clinical experience goals change, nor are the specific descriptions likely to be relevant to any other program.

Descriptions of the Continuum of Clinical Competence Development

There are several possible approaches to empirically defining competence for each of the three levels of training at MSU-COM based on the data from this study. One approach is to draw together the composite descriptions of competence as defined by subjects' perceptions of their actual performances. Another approach is to describe the ideal image of performance which subjects proposed when reflecting on what they should have been able to do. Descriptions of the continuum of clinical competence development based on both of these approaches are proposed.

136

Comparisons of Competence Based on Performance

Descriptions of clinical performance were provided by subjects at each of the three levels of the training program. Those descriptions were presented above for each level. Table 4.23 draws together those descriptions. These descriptions, together with insights gained from the interviews, point up some distinguishing features of the continuum of clinical competence development, described below by feature and unit characteristic.

Role Perspective of Student
  Unit I: Student/lay-person
  Unit II: Student-physician (1)
  Unit III: Physician-student (2)

Knowledge Base
  Unit I: Basic science; Pre-medical experience
  Unit II: Basic medical science; Pre-medical experience
  Unit III: Basic clinical medicine; Basic medical science; Pre-medical experience

Focus of Clinical Learning
  Unit I: Patient interaction; Data collection skills
  Unit II: Refinement of data collection skills; Interpretation of clinical data; Treatment protocols
  Unit III: Refinement of data interpretation; Development of clinical management knowledge and skills

Professional Task Assignment
  Unit I: H/P
  Unit II: H/P; Diagnostic problem solving; Drug therapy management
  Unit III: H/P; Diagnostic and management problem solving

Level of Focus of Learning
  Unit I: Procedure mastery (mechanical); Self-conscious
  Unit II: Procedure mastery (protocol/process); Knowledge-conscious
  Unit III: Preparation for internship; Patient-management-conscious

Standard Bearer
  Unit I: Self (Clinical faculty)
  Unit II: Adjunct faculty (Self; Clinical faculty)
  Unit III: Specialists (Self)

1/2 Current discussions in medical ethics suggest the term "student-physician" to be ethically/legally inadvisable. The term, and its mirror image, are used here to emphasize a psychological/role orientation. The terms are not advocated for general use.

137
Table 4.23
SELF-DESCRIPTIONS OF ACTUAL CLINICAL COMPETENCE

TASK: HISTORY
  Unit I: Can establish rapport--patient-oriented; Comprehensive; Follows protocol; Slow, methodical; Can identify "chief complaint"; Describe C.C. in "lay" terms; Unable to interpret data; May not be able to "control" situation; Feel as though imposing on patient; Have few, standard questions for system review
  Unit II: Can establish rapport--patient-oriented but guided by preceptor's style; Focus on chief complaint; Try to conform to constraints of practice, i.e., time, process, protocol; Identify chief complaint and pertinent system(s); Ask pertinent questions to explore C.C. in more depth; Know meaning of questions and answers; Feel as though contributing to patient care
  Unit III: Thorough and accurate; Self-confident; Know important questions to ask; Can elicit pertinent information on all systems; Data collection efficient

TASK: PHYSICAL EXAMINATION
  Unit I: Complete, unfocused; Follow standard protocol; Slow, methodical; Self-conscious about techniques; Can "look like" performing procedure correctly; May be able to recognize but not identify abnormal findings; Certain procedures problematic: fundascopic, percussion, auscultation; Unwilling or uncomfortable imposing self on patient in pursuit of data
  Unit II: Focus on chief complaint; Perform pertinent procedures re C.C.; Can handle equipment comfortably; Can get beyond procedure to patient's problem; Can usually describe findings in medical terminology; Can pick up on and pursue cues with further history and/or PE; Can interpret findings--know criteria for normal; Nearly all procedures "under control"; need more experience with fundascopic and auscultation in order to feel comfortable in interpreting normal from abnormal; Pursue data as part of responsibility to patient
  Unit III: Thorough and accurate; Much improved in distinguishing normal from abnormal; Get basic information for identification of problems; Do good assessment of findings; Procedures and protocols are routinized; Data collection focuses on differential; Medical history and physical examination are integrated to the extent appropriate or productive

TASK: CLINICAL PROBLEM SOLVING
  Unit I: Unable to interpret findings, follow cues of history or physical exam, or propose problem list unless pre-medical experience/training provided necessary medical knowledge

  DIAGNOSIS
  Unit II: Generally "get in the ball park" of the problem; Pursue clues to collect pertinent data from appropriate system(s); Conscious and knowledgeable of the interrelatedness of systems; Propose problem list of limited number of general possible problems--may need to use reference; Usually can propose a plan to rule out possible problems--likely to use reference; May be able to propose diagnosis in general terms
  Unit III: Have identified "packages" of signs and symptoms for generic problems of systems; Can propose problem list of at least 3 or 4 specific possible problems for each system; Have symptoms-oriented differential for emergency problems; Have good differential approach to systems in which had good clinical instruction; Able to work-up more complex (multi-system) problems; Know routine diagnostic procedures to employ to work-up differential; Likely not to know esoteric diagnostic procedures; Can interpret EKG's and routine chest and abdominal films

  MANAGEMENT
  Unit II: May have general proposal for treatment, i.e., drug regimen, if can use reference; Except for OMT, only aware of drug treatment modality
  Unit III: Know routine admitting orders for frequently encountered problems, including nursing care; Know first-level management of frequently encountered emergency situations; Have not routinized drug dosages except for frequently encountered emergency situations; Have some knowledge and familiarity with therapeutic regimens such as I.V. therapy, acute respiratory therapy, etc.; May or may not have performed a normal delivery; Have insight into the complete history of a medical problem--out-patient and in-patient; Have developed certain technical skills used in diagnosis and management, such as: suturing, vena punctures, arterial puncture, pelvic exam, rectal exam, intubation, catheterization, central venous line
138

As the composites point out, clinical competence can be distinguished at the three levels of training at several levels of analysis. The descriptors presented here, based on the students' actual experiences, may or may not be what the program intends. They do, however, provide useful points of comparison for the ideal competence statements these same subjects proffered.

Comparisons of Competence Based on Ideal Statements

Subjects at each of the three levels of the program were asked to describe what a student should be able to do at a given level in order to be prepared for the next level of training. Statements of ideal performance at the Unit I level were provided by subjects who had had training and experience in a medically-related occupation and were then completing Unit I, and by Unit II subjects. The ideal statements for Unit II were elicited from subjects then completing that portion of the program; and Unit III ideal statements were provided by subjects who had completed the program.

A fairly circumscribed set of skills and knowledge was proposed for Unit I students:

A  Operate equipment properly
B  Talk comfortably with any patient
C  Elicit data, including chief complaint
D  Follow H/P protocol and hospital rules
E  Ask basic questions about the chief complaint
F  Be able to use palpatory diagnostic skills
G  Not understand what data means
H  Know process of rectal and pelvic examinations
I  Use medical terminology to describe findings
J  Recognize abnormals for each procedure
K  Know roles of hospital personnel
L  Know organization of hospital and ward
M  Be able to find information in patient's chart

It is interesting to note that experienced Unit I subjects and Unit II subjects proposed similar descriptors, but only the Unit I subjects proposed organizational skills and knowledge (K, L, M).

139

Unit II students proposed an extensive list of skills and knowledge that should be gained in Unit II:

Do PE exam properly
Do thorough ENT examination
Do thorough chest examination
Do proper pelvic examination
Know criteria of normal physical findings
Be able to perform OMT treatment for all regions of body
Identify systems pertinent to the chief complaint
Elicit more elaborate history of chief complaint (than Unit I)
Propose problem list for any system, using references
Propose limited problem list
Develop a diagnostic approach to problem list
Modify PE to history and physical findings
Effectively palpate and percuss
Know normals for routine diagnostic tests (lab and X-ray)
Write prescriptions
Distinguish normal from abnormal EKG
Read chest and abdomen X-rays
Give injections
Draw blood
Suture lacerations
Know drug and non-drug approaches to common problems
Instruct patient on follow-up routine

Here we note that Unit II subjects used the same procedural perspective to describe Unit II ideal competence that they used in describing the Unit I ideal competence.
140

Unit III subjects proposed an extremely long list of skills and knowledge to characterize the ideal hospital-based Unit III competence at the time of graduation:

Work effectively with staff
Get information from current and historic patient records
Get services from hospital resources
Admit patient with proper admitting orders
Write routine patient care (nursing) orders
Perform effective/thorough medical history
Perform effective/thorough physical examination
Write on-going care notes
Assist effectively in surgery
Perform normal delivery
Know normal delivery procedures
Know abnormal delivery procedures
Perform accurate H/P on newborn
Accurately evaluate the newborn
Know post-partum procedures
Know psychological evaluation procedures and definitions
Evaluate routine X-rays
Know routine laboratory tests and how to interpret them
Use I.V. antibiotic therapy properly
Know basic routine for Codes
Diagnose and manage basic medical cases
Administer local anesthetic
Diagnose and manage shock
Know surgical procedures in which have assisted
Be able to read and interpret journal articles
Know medical terminology
Know specific references for each specialty
Know basic drugs for cases: classification and uses
Read EKGs
Triage emergency cases
Be able to start I.V., catheterize, and intubate
Personally perform routine lab tests

141

In addition, Unit III subjects proposed a list of skills and knowledge that should be gained from the Junior Partnership (ambulatory care) experience:

Be familiar with routine business procedures
Understand patient care system of practice
Be able to counsel on birth control and apply I.U.D.s
Remove warts and moles
Remove toe nails
Know internal medicine
Diagnose and manage routine G.P. problems
Interpret EKGs
Interpret routine X-rays
Develop criteria/rationale for referring patients
Manage interpersonal relations
Develop definition of wellness and illness
See hospital care in the perspective of primary care
Learn follow-up care strategies
Perform effective pelvic examinations
Give injections
Perform effective rectal examinations
Learn how G.P. uses hospital services and personnel
Be able to perform triage of emergency cases

Table 4.24 presents composite ideal statements of clinical competence for each of the three levels of training.

Comparison of Actual and Ideal Statements of Competence

Two important differences between subjects' actual and ideal competence statements emerge. First, there is much more consensus on what "should be" than on "what is," particularly when talking about the technical, medical procedures. It appears that from the vantage point of entering the next level of responsibility, it becomes clear to students that there are some generic skills and knowledge that mark the entry to the next level of practice and development. Second, their perspectives appear both broader and more specific when the subjects reflected on what they should have gained at any particular level.
Some, but not the majority of, subjects cited, for example, skills and/or knowledge at the organizational level of both the health care facility and patient management: knowing the politics of the organization; being familiar with the office procedures; understanding the continuum of wellness-illness-disease; knowing non-drug management strategies for common health problems. These descriptions stand in sharp contrast to the disease management orientation they described themselves as actually taking. At the same time, the subjects had concluded that there were specific clinical problems and knowledge that they should be able to manage.

142

Table 4.24
DESCRIPTORS OF IDEAL CLINICAL COMPETENCE

TASK: ORIENTATION
  Unit I: Know roles of hospital personnel; Know organization of hospital and ward; Have basic understanding of organization of patient records
  Unit II: Be professional, including neat and clean; Know personal, ethical and legal limits
  Unit III (hospital): Know organizational politics of hospital; Understand own role in organization; Know medical record system
  Unit III (office): Know routine business procedures; Understand patient care system for private practice; Manage interpersonal relations; Understand definitions of illness and wellness

TASK: MEDICAL HISTORY
  Unit I: Establish rapport with patient; Communicate effectively; Do history following protocol; Use standard review of systems questions; Identify chief complaint
  Unit II: Interact effectively with all patients; Have general categories of questions for review of systems; Identify systems pertinent to chief complaint; Elicit more elaborate history of chief complaint (than Unit I)
  Unit III (hospital): Effectively communicate with people of all ages and circumstances; Perform thorough history

TASK: PHYSICAL EXAMINATION
  Unit I: Perform PE following routine protocol; Master equipment and technical procedures; Recognize gross abnormalities for each procedure
  Unit II: Perform PE adequately on persons of all ages; Proficiently perform reflexes, ear, eye and throat examinations, auscultation and percussion of heart, percussion of liver, palpation of abdomen; Tailor PE to medical history findings; Know criteria for physical findings; Perform effective pelvic and rectal examinations
  Unit III (hospital): Perform comprehensive PE; Accurately interpret findings

TASK: DIAGNOSTIC PROBLEM SOLVING
  Unit I: Unable to interpret findings, unless had previous clinical experience
  Unit II: Have an approach to problem solving all common G.P. clinical problems; Propose problem list for all systems, using reference; Know routine tests to order for cardiovascular, respiratory and renal problems
  Unit III (hospital): Convert knowledge into symptoms approach to problem solving; Work-up all basic medical cases; Know diagnostic protocol for common emergency cases: MI, angina, hypertension, diabetes, asthma, seizures; Read EKG's; Evaluate the newborn; Know routine diagnostic tests and how to evaluate them; Read chest and abdomen films; Diagnose shock
  Unit III (office): Interpret EKG's and X-rays; Know primary care management protocols; Coordinate out-patient and in-patient care

TASK: MANAGEMENT PROBLEM SOLVING
  Unit I: Unable to propose or implement treatment plans, unless had previous clinical experience
  Unit II: Write prescriptions; Use references to identify drugs and dosages; Perform OMT on all regions of the body; Know some drug and non-drug approaches for common G.P. problems; Be able to draw blood and give injections; Instruct patient on follow-up routine
  Unit III (hospital): Perform normal delivery and NB care; Know Code protocol; Know basic drugs for common cases: classification and use; Know references for specialties; Be able to read journals; Can start I.V.s, catheterize, and intubate; Can manage basic medical cases for each system
  Unit III (office): Counsel on birth control; I.U.D.s; Treat all common problems; Learn follow-up protocols; Give injections; Develop rationale for referring patients; Remove warts, nails, moles

TASK: RECORD MANAGEMENT
  Unit III (hospital): Write concise, accurate H/P report; Write good progress notes; Write proper admitting notes, including nursing care

143

From these ideal statements the following distinguishing features of the continuum of clinical competence development are concluded:

Role Perspective of Student
  Unit I: Student/physician
  Unit II: Student-physician
  Unit III: Physician-student

Knowledge Base
  Unit I: Basic science; Pre-medical experience
  Unit II: Basic science; Basic medical science; Pre-medical experience
  Unit III: Unit I and II; Basic clinical medicine; Pre-medical experience

Focus of Clinical Learning
  Unit I: Orientation to health care delivery; Confirming procedural skills of H/P
  Unit II: Orientation to private practice; Refining H/P skills; Establishing criteria for normal; Confirming skills in patient interaction; Initiating problem solving approach; Refining OMT skills
  Unit III: Orientation to hospital care; Refining interpretive skills; Developing problem solving skills; Learning management protocols; Learning technical skills

Professional Task Assignment
  Unit I: H/P
  Unit II: H/P; Diagnostic problem solving; OMT therapy
  Unit III: H/P; Diagnostic problem solving; Patient management problem solving

Level of Focus
  Unit I: Procedures
  Unit II: Patient interaction; Protocol mastery; Data accuracy
  Unit III: Disease diagnosis and treatment; Health care management

Standard Bearer
  Unit I: Curriculum; Clinical faculty
  Unit II: Curriculum; Clinical faculty; Adjunct faculty
  Unit III: Curriculum; Adjunct faculty; Clinical faculty

144

Explicit in the ideal statements was the imposition of standards for clinical experience outcomes; whereas, implicit in all subjects' statements concerning their actual experiences was a lack of uniformity of outcomes, and the subjects' own responsibility for determining what outcomes would be sought. Explanations for the variety of actual outcomes will be described in Chapter V, and implications of both the ideal and the actual competence statements will be explored in Chapter VI.

Summary

This chapter has presented the descriptive analysis of in-depth interviews of students at three levels of an osteopathic medical education program, with regard to three issues: the nature and influences of pre-medical clinical experiences and training on clinical competence development; abilities and deficiencies in performing professional tasks at each of the three levels; and descriptions of the continuum of clinical competence development. Presented for each level of the program were the results of a content analysis of each interview, regarding the specific description of competence and deficiency in performing two major professional tasks: the medical history and physical examination, and diagnosis and management problem solving. Frequency counts of the coded responses, based on the pre-medical experiential background of the subject, were presented, and coded responses for which there was substantial subject agreement were presented as a general statement of competence or deficiency for each level of training for both professional tasks. In a similar fashion, subjects' views of ideal competence were analyzed and presented.

In the cases of both the actual clinical performance and the ideal, there are clear distinctions in clinical competence between and among each of the three levels of training in the program studied. The analysis also showed differences within groups, primarily based on subjects' pre-medical clinical experiences, but also on the nature of individual subjects' clinical experiences in the medical school program.
The analysis also showed differences within 145 groups, primarily based on subjects' pre—medical clinical experiences, but also on the nature of individual subject's clinical experiences in the medical school program. And, finally, the comparison of ideal and actual competence revealed differences both in the breadth and specificity of skills and knowledge underlying students' clinical competence, and in the students' perspectives on their professional development. Chapter V will provide further insight into the continuum of clinical competence development by presenting students' perspectives of the explanations for their competence and deficiencies. 146 CHAPTER V STUDENTS) PERCEPTIONS OF THE VARIABLE IN DEVELOPING CLINICAL COMPETENCE This study had two primary purposes: (1) to describe, from the students' prespective clinical competence at each of three stages of an educational program for osteopathic medical students; and (2) to identify teaching/learning variables that influence the clinical competence developmental process. Chapter IV presented 3'1 osteopathic medical students' descriptions of their clinical competence. This chapter will present their insights into the process by which they developed that clinical competence. ‘ In an effort to gain as specific information as possible about the teaching/ learning process, subjects were asked in in-depth interviews to describe each formal clinical experience in which they participated in terms describing: what they did; what the clinical supervisory did; what characterized for them a "good" and "bad" clinical supervisor or clinical experience; what they could or could not do given certain conditions; what affect didactic preparation had on clinical performance; what affect clinical performance had on learning didactic information; and how they were evaluated. Subjects' responses to these various questions are presented here under two major headings: (1) explanations for what was and wasn't described in Chapter IV as clinical competence, and (2) insights into the relationship of theory to practice. As in Chapter IV, the findings presented in this chapter are intended to present students' perceptions on the learning process. The investigator's impressions and insights will'be presented in Chapter VI. 147 Unit I subject interviews were analyzed differently from those of the other two subject groups, due to the difference in interview process. Subjects having had medically-related training and experience (Category V) were asked to discuss the RIP skills of their less-experienced classmates and what they thought accounted for those skills. Category I—IV subjects were asked to describe their experiences, and reactions to them, in detail. Content analysis was used to identify (1) the key explanations offered by Category V subjects, and (2) supporting and/or refuting evidence in Category I-IV subjects' statements. The results of that analysis are presented in the text of the chapter. Two analytic methods were used in codifying and interpreting the information gained through the Unit II and III subjects interviews. First, as in Chapter IV, descriptors of each subject's responses were obtained through content analysis. 'lhat is, responses to a particular issue (e.g., eXplanation for history and physical examination (H/P) competence), were recorded and then coded for each subject categorized by level of training and category of pre-medical experience. 
The Unit level descriptive composites are presented as Exhibits H through M, and the coded descriptors are presented in the text.

Second, tabulations were made of each subject's use of particular descriptors, and contingency matrices were then constructed. The resulting contingency tables were studied to identify patterns of association among paired descriptors. The contingency tables were constructed for each issue using two sets of conditions. One table was constructed using all subjects within a given level of training. A second set of tables was constructed to determine whether pre-medical experience made any difference in the way subjects responded to questions. Experience in analyzing the interviews for Chapter IV led the investigator to suspect that the perspective of subjects having medically-related experiences would differ from that of other subjects. Initial examination of the Unit II tabulations showed that Category V subjects did use different descriptors and, therefore, contingency analyses for Unit II subjects compared Category V to Category I-IV subjects. Similar examination of Unit III tabulations suggested that Category IV and V subjects differed from Category I-III subjects in their responses, and, therefore, contingency analyses compared these two composite groups. Unfortunately, the small number of subjects makes the interpretations highly suspect. Given the exploratory nature of this study, interpretations are intended only to guide future research. It is in that spirit that highly speculative interpretations are included in this presentation. The Unit level contingency tables are presented as Exhibits O through T, and those comparing responses based on the students' pre-medical school experience are presented in the text where appropriate.

Explanations for Clinical Performance

Students at each of the three levels of the educational program under study were generally asked to describe what and how they performed in their clinical experiences, what they were expected and allowed to do by their preceptor, the nature of the clinical cases with which they worked, and how they would explain their level of performance. Questions were directed to the specific kinds of clinical experiences of the subject group. In addition, questions probed the particular experiences of the individual subjects and sought clarification of their responses to questions posed in the interview schedule.

Unit I

The focus of the interview of Unit I subjects was the series of three history and physical examinations performed in the final term of the Unit. Unit I students, as part of a three-course series in patient evaluation skills training, were assigned three experiences in which they performed a history and physical examination on individuals in a health care setting: two persons were hospitalized and one was participating in a senior citizen health screening project. Prior to these three experiences, Unit I students had performed history and physical examinations on their peers in a simulated laboratory. Although all Unit I subjects were asked to describe the details of the three H/P experiences, those subjects who had had medically-related training and experience prior to entering medical school were asked to reflect on the training program and the conditions of the three experiences, and to propose explanations for the performance of the typical Unit I student. The insights of these experienced students were then compared with the statements offered by the less experienced subjects.
Explanations for History and Physical Examination Competence: Experienced subjects proposed thirteen general explanations for H/P competence of the typical (medically-inexperienced) Unit Istudent, listed in Table 5.1. Table 5.1 EXPLANATIONS FOR H/P COMPETENCE OFFERED BY UNIT I SUBJECTS (A) Uncomfortable in patient setting (B) Certain skills not yet mastered (fundascopic, palpation, percussion, auscultation) (C) Inexperience with variations in normal findings (D) Inexperience with abnormal findings (E) Lack of knowledge of clinical medicine (F) H and PE too much for first experience (G) Unable to tailor H/ P to chief complaint (H) Lack of experience using medical terminology (I) Lack of training and experience writing up H/P (J) Lack of specific evaluation of performance (K) Lack of remediation of skills between experiences (L) Lack of specific training in medical history taking (M) Students don't take responsibility for professional development 150 The first seven of these explanations (A through G) can be considered to reflect conditions inherent in the initial level of clinical training. For example, students who demonstrate mastery of physical examination procedures in the simulation laboratory may very well appear "all thumbs" because of their uncomfortableness in clinical settings (A). On the other hand, experienced subjects unanimously concluded that the conditions under which their inexperienced classmates were expected to perform their hospital H/P exaccerbated this expected stress. For example, with the exception of one hospital, patients did not know that first year medical students would be performing history and physical examinations on them. Some patients refused and others had to be coaxed to participate. Also, most inexperienced students were unfamiliar with the organization of the hospital, wards and medical charts, and no orientation was provided at most hospitals. Too, in most instances, students were allowed to select patients from the admissions list. Some patients were inappropriate for inexperienced, unsupervised students-“notably the 8 month old infant one student selected because he "had never done a pediatric exam." And, as was pointed to repeatedly in Chapter IV, knowledge of clinical medicine (E) is essential to being able to direct the PU? towards the medical problem at hand (G). Students in the program under study would not be exposed to the basics of clinical medicine until Unit II. Lack of knowledge of medicine created other practical problems for students. For example, one student's patient was normal except for an injured knee which was propped up on a pillow. As the student remarked: I was perplexed totally. I had no idea if she could bear weight; no idea if I could touch it; no idea if I could move it; didn't know if I should spend my energy looking at her knee or doing the physical. It is like doing a physical on someone who is bedridden. That was a pretty frustrating situation. Also, these experienced subjects contend that inexperienced students are slow and, therefore, should not be expected to perform a complete H/P in one experience 151 (F)--at least for the sake of the patient. Experience, they say--both with a wide range of normal subjects (C) and those with pathology (D), is the key to being able to interpret the findings in the physical examination and gaining competence in the most troublesome procedures (B). 
That is, unless Unit I students have unlimited access to such a wide range of H/ P subjects (and the time to practice), they can be expected to lack proficiency with the fundascopic, palpatory, percussion and auscultory procedures, and to be unable to accurately assess findings. 0n the other hand, the remaining explanations (H through M) can be viewed, at least to some extent, as deficits of the training program. Specifically, subjects at all levels of the program reported that they had not been taught the principles of medical history taking (J), including systems review questions, although they had had extensive group work in principles of communication, interpersonal relations, and interviewing. In contrast, subjects who formerly were Physicians' Assistants (P.A.) claimed to have previously been taught a systematic H/P, including systems review which they continued to use throughout their medical training. Many inexperienced subjects reported seeking from these former P.A.s, advice on the conduct of the medical history. Similarly, experienced subjects' perceptions regarding Unit I students' lack of experience using medical terminology and performing the H/P write-up (F and G, respectively) were confirmed by comments of inexperienced subjects: There was no requirement for a write-up and I didn't turn one in. At the Civic Center there was such a requirement for a write-up. There was no feedback on those. I would have liked more follow-up on the write-up. . . Where I found Ididn't have to hand it in, Ididn't write it up. I can't use medical terminology and feel 0.x. about it» For example, [for] my second patient he [clinician] asked me what kinds of lung sounds I heard. I said, "I don't know--I know it was pathology;" and I described it in laymen's terms. And he said, "That's O.K. It's good you heard something." 152 Inexperienced subjects also confirmed that no one was immediately available to confirm findings (J). Experienced and inexperienced subjects recommended some mechanism for gaining immediate feedback. The following suggestion made by one inexperienced subject was typical: The supervisor needs to be immediately available--but not in the room. . . The supervisor needs to know the patient and be able to evaluate the accuracy of the findings. If significant findings are missed, the student and evaluator should discuss the findings and perhaps go into the patient's room and re-check findings and techniques. This expressed need for immediate feedback and remediation was a constant theme in statements of students at all levels of the program. Without continual evaluation and guided remediation some students reasoned that they "must be doing O.K. because no one said anything," while others failed to develop the confidence necessary to assertively pursue their professional development. While some subjects described the feedback on skills in performance labs as being "good and non-threatening," others described it as being "unavailable" or "inconsistent." No one described having had feedback on skills when in the hospital setting, although some reported having asked for and receiving immediate confirmation of findings when they performed the senior citizen H/P. It might be reasoned that the key to clinical competence development is the student's own responsible action (M), and that despite flaws in an educational program, the student, through his/her own sense of responsibility and initiative, can develop the essential competence. 
Whatever the degree of truth in that statement, the program can significantly constrain the student's best efforts to carry out that responsiblity. For example, Unit I subjects reported having considered a plan of preparation for their forthcoming clinical experiences. Most frequently, however, those plans were modified, often they were abandoned, because of the demands of the didactic program. 153 I would have read my Malasantos [H/ P textbook] if I hadn't had two exams that week. . I was excited--I really wanted to do it, but I also think I was consumed with the thought of studying for the immunology exam--it was an extremely difficult academic term. [The debriefing] was very open. No one wanted to be there--we had a big test the next day. He was very congenial about letting us go home, which was all we wanted to do. These students appear to have taken their cues for establishing learning priorities from the design of the program. As we noted in Chapter IV, Unit I subjects concluded that the clinical experiences "were no big deal." Here we see that they perceived their didactic courses to be most important. Only in retrospect, as experienced Unit I and advanced level subjects have shown, does the student understand the significance of those clinical experiences for his/her professional development. That is, naive students appeared not to have an a priori conceptual framework to guide their professional competence development. They assumed the educational program was designed to logically lead to professional competence if they met its (minimum) requirements. Unit]! The focus for examining clinical competence at the Unit II level was the four-term series of preceptorships in private practice, ambulatory care settings. Two subjects groups were asked to consider their preceptorship experiences: those who had just or would shortly complete Unit II (Unit 11 subjects), and those who currently were interns and had several years previously completed the unit (Unit III subjects). Several different types of information were sought: descriptions of each preceptorship experience, Opinions of what characterized a "good" and "bad" preceptor, and explanations for any discrepancies between what they felt they should be able to do and what they had described themselves as being able to do. 154 And since at the Unit II level it was presumed that students were engaged in performing history and physical examinations and also in diagnosis and treatment problem solving, questions were separately and specifically direced at each of these two major professional tasks. Explanations for History and Physical Examination Competence: Unit II students offered fourteen (14) specific explanations for their Unit 11 H/P competence, while interns offered eight (8), as here seen in Table 5.2. 
Table 5.2
EXPLANATIONS FOR UNIT II H/P COMPETENCE OFFERED BY UNIT II AND UNIT III SUBJECTS

Percent Response: UNIT II (n=11); UNIT III (n=11)

Got to do alot: Unit II (A) 82; Unit III (B) 45
Lots of cases with pathology: Unit II (B) 27; Unit III (A) 55
Was assertive/self-confident: Unit II (C) 73; Unit III -
Had self-teaching goals: Unit II (D) 64; Unit III -
Was taught "tricks of trade": Unit II (E) 64; Unit III (E) 18
Got positive reinforcement: Unit II (F) 55; Unit III -
Was honest about own skills: Unit II (G) 36; Unit III -
Was given critical feedback: Unit II (H) 45; Unit III (F) 36
Had personal support system: Unit II (I) 9; Unit III -
Supervisor had high expectations: Unit II (J) 9; Unit III (D) 9
Had requisite knowledge base: Unit II (K) 64; Unit III (C) 64
Non-threatening learning environment: Unit II (L) 9; Unit III -
Was given responsibility: Unit II (M) 27; Unit III -
Pre-medical experience/training: Unit II (N) 27; Unit III (H) 36
Supervisor had protocol for handling cases: Unit II -; Unit III (G) 9

(The explanations for competence offered by each Unit II subject are presented in Exhibit H, and those for Unit III subjects are presented in Exhibit I.)

Generally, Unit II subjects explained their H/P competence in terms of two major variables: their own attributes (assertiveness, goal-directedness, and having the requisite knowledge) and clinical instruction (given opportunity to practice, taught the "tricks of the trade," and given feedback). Specifically, their most common explanations were: (A) "got to do alot," (C) "was assertive/self-confident," (D) "had self-teaching goals," (F) "got positive reinforcement," and (K) "had requisite knowledge base." The majority of inexperienced subjects also cited (E) "taught tricks of the trade" and (H) "got critical feedback."

When one analyzes the contingency data for Category I-IV subjects (Table 5.3) by aligning co-occurring descriptors, one finds individual constellations of correlated statements. That is, there is no common set of descriptors offered by inexperienced students. (See Exhibit N for the Unit II subject contingency table.) This phenomenon points up the highly idiosyncratic nature of the preceptorship experiences, as was suggested in Chapter IV. Each subject was faced with a different set of learning experiences, not only in terms of different preceptors, but in what he/she wanted to do, was able to do, and was allowed and guided to do. Hence, there are different sets of associated explanations for competence. For example, one student's set of associations showed the student's personal support system (I) common to a set of five otherwise unassociated explanations (D,E,F,G,H); another student's explanation revealed the non-threatening nature of the learning environment (L) to be the common association of four other explanations (E,F,G,H); yet another student placed receiving positive reinforcement (F) at the center of a set of otherwise unassociated explanations (C,E,G,L,H,I). This may suggest that inexperienced students at this level feel a particular need for a supportive learning environment.

156

Table 5.3
EXPLANATIONS FOR UNIT II COMPETENCE DESCRIBED BY CATEGORY I-IV SUBJECTS

In contrast, alignment of the contingencies for experienced subjects (Table 5.4) reveals a set of student-centered statements (C,D,G) associated with themselves and with "the opportunity to do alot" (A).
Experienced subjects apparently view their competence to be dependent upon self-assertiveness, self— evaluation, and self-goals, in contrast to the inexperienced subjects' dependence on other-directed supportive learning environment. In fact, highly experienced students describe their clinical performance in the preceptorships as extensions of their former clinical role. And, in contrast to inexperienced students' descriptions of uncertainty in the clinical setting, the experienced students described considerable comfort and self-directedness, as revealed in this Unit II subject's statement: 157 Maybe it is my clinical experience that gave me security in knowing some things and how to handle myself. There were things I have been doing for five years in the ER--history taking--that I knew I could do and I could feel comfortable telling the preceptor that I could do that. Something in my experience and personality that made me comfortable that I had something to offer and that would be the place from which I would start. Table 5.4 EXPLANATIONS FOR UNIT II COMPETENCE D$CRIBED BY CATEGORY V SUBJECTS "‘ A B C D E F G H I J K L M N A - 0 .25 .25 0 0 .25 0 0 0 .25 0 .25 .50 B 0 - 0 0 0 0 0 0 0 0 0 0 0 0 C .50 0 - .25 0 0 .25 0 .0 0 .25 0 .25 .50 D .50 0 .50 - 0 0 .25 0 0 0 .25 0 .25 .50 E 0 0 0 - 0 0 0 0 0 F 0 0 0 0 - 0 0 0 0 G .50 0 .50 .50 0 0 - 0 0 0 .25 0 .25 .50 H 0 0 0 0 0 0 0 - 0 0 0 I 0 0 0 0 0 0 0 0 - 0 0 J 0 0 0 0 0 0 0 0 0 - 0 K 0 0 0 0 0 0 0 0 0 0 - 0 .25 .50 L 0 0 0 0 0 0 0 0 0 0 - 0 0 M .50 0 .50 .50 0 0 .50 0 0 0 0 - .50 N .50 0 .50 .50 0 0 .50 0 0 0 .50 0 .50 - Similarly, one intern, a former Physician's Assistant, described his Unit 11 preceptorship experiences as " welcomed relief" from school. " Bold type indicates actual correlation coefficients which exceed expected correlation coefficients. 158 As also shown in Table 5.2, when Unit III subjects (interns) reflected on their Unit 11 experiences and H/P performance, they used explanations that were similar to those of Unit II subjects. Their most common explanations were: "got to do alot" (B); "worked with lots of cases" (A); and "had the requisite knowledge" (C); more than a third also cited "was given critical feedback" (F). Only experienced subjects (Categories IV and V) cited "pre-MSU-COM training and experience" (H). All but one of the explanations offered by internshad been offered by Unit II subjects. Interns, however, did not offer certain subjective explanations ("was assertive," "had self-teaching goals," "was honest about skills," "non-threatening learning environment") offered by Unit 11 subjects. It must be kept in mind that, as we observed in Chapter IV, Unit III subjects tended to view the medical history and physical examination as an integral part of the medical problem solving process. Thus, their explanations should not be seen as explanations for having attained some level of technical competence in the H/P, but more likely for developing more finesse in organizing the WP to "fit the problem at hand" and interpreting results. Interns, like Unit II subjects, proffered explanations which differed depending upon the subject's pre-medical experience. (See Exhibit P for composite Unit III subject contingency table.) For Category I-III subjects, competence explanations were associated either with "had requisite knowledge" (C) or "got to practice alot" (B), which in turn were associated with each other: (See Table 5.5) C B A ‘ . 
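The chapter does not spell out the computation behind these contingency entries; one plausible reading, consistent with the note on bold type, is that each cell holds the observed proportion of subjects offering both descriptors, judged against the proportion expected if the two descriptors were independent (the product of their marginal proportions). The sketch below illustrates that comparison under those stated assumptions; the subject descriptor sets are illustrative, not the study's data.

```python
from itertools import combinations

# Each subject is represented by the set of descriptor codes he or she offered.
# These sets are illustrative only; they are not the study's data.
subjects = [
    {"A", "C", "D", "G"},
    {"A", "C", "G", "N"},
    {"B", "E", "F", "H"},
    {"A", "D", "K", "M", "N"},
]

def cooccurrence(subject_sets):
    """For every descriptor pair, return the observed proportion of subjects
    offering both codes and the proportion expected under independence
    (the product of the two marginal proportions)."""
    n = len(subject_sets)
    codes = sorted(set().union(*subject_sets))
    marginal = {c: sum(c in s for s in subject_sets) / n for c in codes}
    table = {}
    for a, b in combinations(codes, 2):
        observed = sum(a in s and b in s for s in subject_sets) / n
        expected = marginal[a] * marginal[b]
        table[(a, b)] = (round(observed, 2), round(expected, 2), observed > expected)
    return table

if __name__ == "__main__":
    for pair, (observed, expected, exceeds) in sorted(cooccurrence(subjects).items()):
        flag = "*" if exceeds else " "   # '*' marks pairs where observed exceeds expected,
                                         # the role the bold type plays in the printed tables
        print(pair, observed, expected, flag)
```

With groups as small as those studied here, such comparisons are, as the chapter itself cautions, only suggestive and are best treated as guides for future research.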
\/ G F E D These contingencies suggest that systems knowledge and practice associated with explicit clinical teaching and feedback are central to inexperienced students' development of H/P competence. On the other hand, for experienced subjects (Category IV and V) competence is associated with doing, seeing a lot of cases of a 159 type, and having an opportunity to practice, which is supported by prevous knowledge enhanced by systems biology courses, as Table 5.6 reveals. The significant associations can be aligned: H—B—F A—C That is, experienced students associated their clinical competence with personal goals and efforts, rather than with instructional guidance from preceptors. Table 5.5 EXPLANATIONS FOR UNIT II COMPETENCE OFFERED BY CATEGORY I-III UNIT III SUBJECTS A B C D E F G H A - .33 .42 .09 .17 .17 .09 0 B .33 - .55 .11 .22 .22 .11 0 C .33 .66 - .14 .27 .27 14 0 D .17 .17 .17 - 06 .06 .03 0 E .17 .33 .33 0 - 11 .06 0 F .17 .33 .33 0 .33 - 06 0 G .17 .17 0 .17 .17 - 0 H 0 0 0 0 0 0 0 160 Table 5.6 EXPLANATIONS FOR UNIT II COMPETENCE OFFERED BY CATEGORY IV-V UNIT III SUBJECTS A B c D E F G H A - 12 24 o 0 .24 o .43 B .20 - .03 o o .08 o .16 c .40 0 - 0 0 l6 0 .32 D o 0 o - o 0 E o 0 o - 0 F .20 .20 o 0 0 - 0 .32 G o 0 0 o o 0 I - o H .40 .20 o 0 0 .20 0 - Explanations for deficiencies in Unit II history and physical examination skills were equally as variable as the explanations for competence. Unit 11 subjects offered twelve (12) explanations for deficiencies, and interns also offered twelve explanations. Table 5.7 presents the explanations for both groups of subjects. Table 5.7 EXPLANATIONS FOR UNIT II H/P CLINICAL SKILL DEFICIENCIES OFFERED BY UNIT II AND UNIT III SUBJECTS Percent Response of Subjects Explanation UNIT II UNIT III Little pathology in available cases (A) 4596 (G) 45% No repetition or practice (B) 73 (A) 45 No patient follow-up (C) 9 (H) 9 Didn't have requisite knowledge (D) 36 * Psychological stress/intimidation (E) 45 ((1 Too few patients in practice (F) 9 - - Too many patients in practice (G) 45 (E) 9 No personal interest (H) 9 (D) 9 Not encouraged to do (I) 9 - No feedback from clinical supervisor (J) 27 (C) 9 No consistent modelling (K) 9 (I) 18 Poor MSU-COM training (L) 18 (F) 64 No previous training or experience - (B) 9 Systems course out of sync with practice - *(J) 82 No outside reading assigned - (K) 9 Supervisor didn't know student level - @(L) 9 161 Only one explanation, "no repetition or practice" (B) was offered by more than half of the Unit II subjects; however, at least one-third of all Unit II subjects offered explanations A,B,E, and G. Only Category V subjects in this group offered explanations K and L. Unit II subjects appear to pose two generic explanations for skill deficiencies, (1) insufficient practice, reflected in explanations B,E,F,G,H, and J; and (2) unproductive practice, reflected in explanations A,C,D,I,K, and L. When the associations between eXplanations for deficiencies in the H/P expressed by experienced and inexperienced Unit II subjects are compared, we can see different correlations. (See Exhibit R for correlations of all Unit II subjects.) As seen in Table 5.8, inexperienced subjects most frequently described their deficiencies with four key explanations: "no repetition/practice" (B); "no interest" (H); "no feedback" (J); and, "don't have requisite knowledge" (D). 
The relationship of significant correlations can be shown as follows: E I'D—A (a; \ / C/B\F These associations suggest that inexperienced students perceive a need for coherent, structured clinical instruction-~what subjects seem to be referring to when they describe a good clinical rotation as being "academic." This interpretation is consistent with the previous interpretation of these students explanation for their competence. 162 Table 5.8 EXPLANATIONS FOR UNIT II DEFICIENCIES OFFERED BY CATEGORY I-IV UNIT II SUBJECTS A B C D E F G H I J K L A - .34 .05 .15 .20 .05 .20 05 .05 05 0 0 B 22 - 09 . 26 . 34 . 09 34 09 09 09 0 0 C .11 .11 - .04 05 .01 05 01 01 01 0 0 D .22 .22 .11 - 15 .04 15 04 .04 04 0 0 E 11 22 .11 .11 - 05 20 .05 .05 .05 0 0 F 0 .11 0 0 .ll - .01 .01 .01 .01 0 0 G 11 .33 0 .22 .11 0 - 05 .05 .05 0 0 H 0 .ll 0 .11 .11 0 .11 - .01 .01 0 0 I 0 0 0 0 .11 0 0 0 - .01 0 0 J .22 .22 0 .ll 0 .22 0 0 - 0 0 K 0 0 0 0 0 0 - 0 L 0 0 0 0 0 0 0 - In contrast, Unit II subjects with prior medically-related training and experience expressed few significantly correlated explanations for their H/P deficiencies, as seen in Table 5.9. The relationship of the five explanations: "little pathology" (A), "no repetition/practice" (B), "lack of requisite knowledge" (D), "psychological stress" (E) and "lack of modeling" (K), can be described: D .K .42] \E/ Experienced subjects, in contrast to inexperienced subjects, expressed the need/ want for role-modeling, practice, cases with pathology and the requisite knowledge, but seemed not to need/want formal clinical instruction. This interpretation is compatible with their explanations for competence described earlier; i.e., "they were able to do a lot." These data suggest that experienced 163 students perceive themselves as needing limited guidance (in the form of role- modeling), but extensive practice with patients with disease. Table 5.9 EXPLANATIONS FOR UNIT II DEFICIENCIES OFFERED BY CATEGORY V UNIT II SUBJECTS A B C D E F G H I J K L A - .25 0 .25 .25 0 .25 0 0 0 .25 .50 B .50 - 0 .25 .25 0 .25 0 0 0 .25 .50 C 0 0 - 0 0 0 0 0 0 0 0 0 D .50 .50 0 - .25 0 .25 0 0 0 .25 .50 E .50 .50 0 .50 - 0 .25 0 0 0 25 .50 F 0 0 0 0 0 - 0 0 0 0 0 0 G 0 0 0 0 0 0 - 0 0 0 .25 .50 H 0 0 0 .0 0 0 0 - 0 0 0 I 0 0 0 0 0 0 0 0 - 0 0 J 0 0 0 0 0 0 0 0 0 - 0 K .50 .50 0 .50 0 0 0 0 0 0 - .50 L .50 .50 0 .50 0 0 .50 0 0 0 .50 - Although interns (Unit III subjects) cited very similar explanations for their Unit 11 clinical performance deficiencies, as previously shown in Table 5.7, their most common explanations differed from those of Unit II subjects. More than half of the interns faulted the MSU-COM training program's lack of differential diagnosis training (F) and the lack of syncrony of the systems courses with the cases they encountered (J). More than a third of these subjects also cited "little or no practice" (A) and "little pathology in available cases" (G) as explanations for deficiencies. Again, it should be pointed out that these explanations are consistent with these advanced students' perspective that the H/P is an integral part of the medical problem solving process. 
164 When their associations of explanations are examined, as revealed in Table 5.10, we see that Unit III subjects, in contrast with those currently completing Unit II, did not associate clinical supervisor behaviors (creating psychological stress or not encouraging students to perform) with their clinical performance deficiencies; instead, these advanced students appeared to hold the educational program on campus accountable: F F J40 “\H Experienced Unit 11 subjects also had cited "poor MSU-COM training" as an explanation, whereas none of the inexperienced subjects expressed that explanation. This may suggest that individuals experienced in the realities of clinical training don't make the same assumptions about what can be expected from of f—campus clinical instructors as do inexperienced students. Table 5.10 EXPLANATIONS FOR UNIT II DEFICIENCIES OFFERED BY UNIT III SUBJECTS A. B (3 ID E F (3 II I J K L A. - .04 .04 .04 .04 .29 .20 .04 .08 .37 .04 .04 B .09 - .008 .008 .008 .06 .04 .008 .016 .07 .008 .008 (3 .00 .00 - .008 .008 .06 .04 .008 .016 .07 .008 .008 I) .09 .09 .00 - .008 .06 .04 .008 .016 .07 .008 .008 E .09 .09 .00 .09 - .06 .04 .008 .016 .07 .008 .008 F .36 .09 .09 .09 .09 - .29 .06 .12 .52 .06 .06 C} .18 .00 .00 .00 .00 .18 - .04 .08 .37 .04 .04 II .00 .00 .00 .00 .00 .00 .09 - .016 .07 .008 .008 I .00 .00 .00 .00 .00 .00 .09 .09 - .15 .016 .016 J .45 .09 .00 .09 .09 .55 .45 .09 .09 - .07 .07 K .00 .00 .00 .00 .00 .00 .09 .09 .09 .09 - .008 L .09 .00 .00 .00 .00 .09 .00 .00 .00 .09 .00 - 165 Further analysis of the explanations for deficiencies offered by the Unit III subjects, based on the student's pre~medical experience, reveals that the previous interpretation may be somewhat misleading. As shown in Table 5.11, inexperienced (Category [-110 subjects did not associate the lack of syncrony of the systems courses with other explanations, although lack of training in differential diagnosis was associated with other explanations. The conditions for clinical learning: "too many patients" (E), "little practice" (A), "little pathology" (G), "no follow-up" (H), "inconsistent modeling" (1) and "no outside reading assigned" (K) were reported by these subjects, as they were by Unit II inexperienced subjects. The significant associations can be expressed: A B /G\ /L\ F H I A-——-F /\ \j7 , and D————-— E , Experienced (Category IV - V) Unit III subjects, on the other hand, associated fewer explanations for their deficiencies. As seen in Table 5.12, explanations A ("little practice"), F ("poor MSU-COM training"), G ("little pathology)) and J ("systems courses out of sync") were correlated with one another as follows: A—F-—-C EX.) These analyses suggest that pre—medical experience persists in distinguishing students regardless of their level of training in the formal program. While clinical experience within the educational program appeared to change the perspective from which students viewed their clinical performance, pre-medical school experience remained an important variable in students' perceptions of their medical school training program. It is likely that an actual "learning lag time" distinguishes the experiential groups. The most inexperienced students were still 166 refining skills which more experienced students mastered at the previous level of training. 
Table 5.11
ASSOCIATIONS OF EXPLANATIONS FOR UNIT II DEFICIENCIES OFFERED BY CATEGORY I-III UNIT III SUBJECTS

      A     B     C     D     E     F     G     H     I     J     K     L
A     -    .085   0    .085  .085  .25   .333  .085  .085  .50   .085  .085
B    .17    -     0    .029  .029  .085  .11   .029  .029  .17   .029  .029
C     0     0     -     0     0     0     0     0     0     0     0     0
D    .17   .17    0     -    .029  .085  .11   .029  .029  .17   .029  .029
E    .17   .17    0    .17    -    .085  .11   .029  .029  .17   .029  .029
F    .333  .17    0    .17   .17    -    .333  .085  .085  .50   .085  .085
G    .17    0    .17    -    .11   .11   .666  .11   .11
H     0     0    .17    -    .029  .17   .029  .029
I     0     0    .17   .17    -    .17   .029  .029
J    .50   .17    0    .17   .17   .50   .666  .17   .17    -    .17   .17
K     0     0     0    .17   .17   .17   .17    -    .029
L    .17    0    .17    0     0     0    .17    0     -

Table 5.12
ASSOCIATIONS OF EXPLANATIONS FOR UNIT II DEFICIENCIES OFFERED BY CATEGORY IV-V UNIT III SUBJECTS

      A     B     C     D     E     F     G     H     I     J     K     L
A     -     0    .03    0     0    .32   .08    0    .08   .24    0     0
B     0     -     0     0     0     0     0     0     0     0     0     0
C     0     0     -     0     0    .16   .04    0    .04   .12    0     0
D     0     0     0     -     0     0     0     0     0     0     0     0
E     0     0     0     0     -     0     0     0     0     0     0     0
F    .40    0    .20    0     0     -    .16    0    .16   .48    0     0
G    .20    0     0     0     0    .20    -     0    .04   .12    0     0
H     0     0     0     0     0     0     0     -     0     0     0     0
I     0     0     0     0     0     0     0     0     -    .12    0     0
J    .40    0     0     0     0    .60   .20    0     0     -     0     0
K     0     0     0     0     0     0     0     0     0     0     -     0
L     0     0     0     0     0     0     0     0     0     0     0     -

Explanations for Diagnosis and Patient Management Competence: Chapter IV presented data from which it was concluded that Unit II students develop a basic level of competence in diagnosis: they know the criteria for identifying normal and abnormal findings in the H/P; can identify the chief complaint; can correctly identify the associated body system(s); and can propose a short list of possible problems which are congruent with the clinical data. Their competence in management is primarily a matter of recalling and applying treatment protocols used by the preceptor. Unit II subjects offered seventeen explanations for having attained this competence, as shown in Table 5.13.

Table 5.13
EXPLANATIONS FOR DIAGNOSIS AND PATIENT MANAGEMENT COMPETENCE OFFERED BY UNIT II SUBJECTS
(Percent response by experience category: I (n=4), II (n=4), III (n=0), IV (n=1), V (n=2))

A  Some CPSS's provided problem-solving strategy
B  Some CPSS's provided therapy regimen
C  Quality of some systems courses
D  Personal effort to learn
E  Seeing patient-case increases memory
F  Use clinical medicine manual
G  Cumulative knowledge of systems courses
H  Assertiveness in clinical setting
I  Recency of pertinent systems course
J  Good clinical role model
K  Pre-MSU-COM training/experience
L  Help from patients with chronic illness
M  Worked up cases in particular system
N  Developed personal clinical notebook
O  Repetition of clinical cases increases memory
P  Can recognize abnormalities
Q  Patient follow-up

There is little common ground among these various explanations across subjects. Only (D), "personal effort to learn," was cited by at least one-half of the Unit II subjects; one-third cited (A), "some CPSS's provided problem-solving strategy," (C), "quality of some systems courses," and (E), "seeing patient-case increases memory."

When one examines the ways in which the Unit II subjects associated these explanations, differences are seen between experiential subject groups. As Table 5.14 shows, inexperienced subjects (Category I-III) associated systems knowledge (A,C) and personal efforts (D) with other explanations:

[association diagram linking explanations A, C, D, E, F, M and N]

(See Exhibit R for correlations of explanations for all subjects.)
Many of the inexperienced subjects noted that they tried to get the case at hand to "fit" their existing knowledge base, particularly the information being gained in the current systems course. For example, when they have had the relevant systems course they have more critical questions to pursue in the H/P and may have, depending upon the systems course in question, an approach to problem solving, including a short problem list and notions about applicable drug therapy. These inexperienced students' explanations for medical problem solving competence seem to differ in perspective from those they gave for their history and physical examination competence. In the case of H/P competence these subjects' explanations were very idiosyncratic, and their explanations for their deficiencies focused on the nature of the instructional environment. Here, in the case of medical problem solving competence, their explanations seem to center around their own didactic preparation and its congruence with the clinical cases they confronted. Subjects cited only one explanation that directly credited the clinical instructor for their competence, (J) "good clinical role model," and only two subjects cited this explanation.

In contrast, the Category V subjects associated competence with their empirical knowledge and personal effort:

[association diagram linking explanations G, H, K, N, P and Q]

That is, rather than beginning with systems knowledge in an attempt to understand the problem at hand, experienced students started with empirical knowledge they had previously gained, and elaborated it with information gained in the systems courses. Even if they had not had the relevant systems course, they had sufficient empirical knowledge to give them a sense of confidence and competence.

Table 5.14 shows that Unit II subjects' explanations for deficiencies in diagnosis and patient management were also highly variable.

Table 5.14
EXPLANATIONS FOR DIAGNOSIS AND PATIENT MANAGEMENT DEFICIENCIES OFFERED BY UNIT II SUBJECTS

                                                        Percent Response by Experience Category
                                                          I      II     III    IV     V     All
Explanation                                             (n=4)  (n=4)  (n=0)  (n=1)  (n=2)
A  Short or poor systems course                           25      0      -    100     50    27
B  Unconstructive clinical instruction                    25     25      -      0      0    18
C  Crammed for systems course exams                       25     25      -      0      0    18
D  Not taught to problem solve                            50     25      -    100     50    45
E  Ideologic conflicts                                    25      0      -      0      0     9
F  No good role model                                     25      0      -      0      0     9
G  Delay in applying systems knowledge                    25      0      -      0      0     9
H  Easier to look up drugs than memorize                  25     25      -      0      0    18
I  No patient follow-up                                    0     25      -      0      0     9
J  Lack of knowledge of disease                            0     25      -      0      0     9
K  No feedback on write-ups                                0     25      -      0      0     9
L  Lack of confidence in knowledge                         0     25      -      0      0     9
M  Common diseases not presented in systems courses        0     25      -      0      0     9
N  Program confused about its goals                        0     25      -      0      0     9
O  Finished preceptorships with only three systems courses 0     25      -      0      0     9
P  Fast pace of practice                                   0     25      -    100      0    18
Q  Did not work up case in that system                     0      0      -    100    100    27
R  Am passive in clinical situation                        0      0      -      0     50     9
S  Quality of CPSS's variable                              0      0      -      0    100    18
T  Selective learning                                     25     25      -      0     50    27

As can be seen, no explanation was cited by at least one-half of the subjects. "Not taught to problem solve" (D) was the most frequently cited explanation, being offered by 45 percent of the subjects. In fact, references to the program design (A,C,D,G,J,K,M,N,O,Q,S) were cited twice as often as explanations for deficiencies as were references to either the nature of the clinical instruction/experience (B, F, I, P) or personal behavior (E, H, L, R, P).
When one examines the associations of explanations offered by these Unit II students, numerous patterns emerge, most of which have a single explanation in common. Just as we saw in examining explanations for history and physical examination competence, inexperienced subjects appear to make individually unique associations, all of which differ from those of experienced subjects. Specifically, as Table 5.15 supports, inexperienced subjects make associations of their explanations which can be described by five sets of associations, including one interrelated set:

[five association diagrams involving explanations A, C, F, G, H, J, K, L, M, P, Q and T, with one interrelated set linking J, K, L and M]

All of these associations point up the centrality of systems course instruction in students' ability to diagnose and propose patient management plans. The experienced subjects (Category V) offer a single set of interrelated associations, as Table 5.16 reveals:

[association diagram linking explanations A, D, M and R]

This set also focuses on the importance of the systems biology courses in the students' ability to solve medical problems. (See Exhibit S for the correlations of explanations offered by all Unit II subjects.)

[Table 5.15]

[Table 5.16]

. . . working up a patient. When asked what he would have done if the first preceptor had made him go in alone and work up the patient, he replied:

    I would have panicked the first week. I would have started carrying around more books in the second week; and I would have started functioning. Even though I am passive I will adapt to anything.
    I would have felt insecure with such a limited knowledge base, but would have had a willingness to try.

The adaptability of the subjects was a predominant feature of their descriptions of their clinical performances. Students reported challenging clinicians on their therapeutic regimen or diagnosis, for example, only to the extent that it appeared to be acceptable or encouraged by the preceptor. Only one subject reported that, while she was merely trying to learn the clinician's treatment protocol, the preceptor expected her to propose a treatment plan of her own design, based on what she knew and could research. Some students, rather than seek an explanation from the clinician for his/her diagnosis or treatment, complied with what appeared to be established protocols and then made judgments about the competence of the preceptor based on information that had been presented in systems courses. This "do-what-they-do" approach to preceptorship experiences seemed to dominate student thinking throughout Unit II. It is in retrospect that students realize that the preceptorships had given them an opportunity to refine their skills and to apply theoretical knowledge--competence that they would shortly need in Unit III.

At the outset, Unit II students seem to have harbored the notion/expectation that preceptors were extensions of the campus faculty who would teach and refine students' H/P skills, and show them how to apply their classroom knowledge to the solving of medical problems. In fact, some preceptors did that, and they were described as "good" preceptors. The description of a "good" preceptor was consistent, as was that of a "bad" preceptor:

Good
. Like to teach
. Gear their expectations to the student's level of competence
. Let students know what and how they think about a case
. Provide supervision appropriate to the student's ability
. Verify student's findings and conclusions
. Teach "tricks of the trade"
. Press student to be accountable
. Give appropriate feedback, being increasingly critical as the student gains competence

Bad
. Don't let students do anything
. Supervise too closely or not enough
. Have personality traits that interfere with the student's performance (racism, sexism, intimidation)
. Have too few or too many patients
. Don't teach how they think or how they do things
. Don't give useful feedback
. Practice "poor" medicine for learning

Implicit in a number of these characteristics is the notion of a continuum of clinical confidence, if not competence. Students sought reassurance and positive feedback early in their preceptorship experiences, but as they gained knowledge, experience and confidence, they sought increasing responsibility and accountability. As several pointed out, positive feedback is a motivator to continue to try, and trying/practice increases one's skill. But there was a point at which the student sought more critical evaluation of skills.

The variability, or what one student called "lack of continuity," in clinical instruction created for students a considerable amount of confusion about the goals of the preceptorship program. Some concluded that the formal program failed to prepare them to perform adequately in the clinical setting. Some concluded that the experiences were only to give them exposure to the clinical setting and that any concrete knowledge or skills that they gained was a bonus. Some concluded that they would have better spent their time studying for the demanding examinations of the formal courses.
In conclusion, certain generalizations about Unit II students' preceptorship experiences are supported by subjects' comments:

1. Variations in preceptorship experiences can offer students a broad view of osteopathic general practice, something most students seek.

2. Students consider for selection those preceptors whom their peers have described as "being interested in teaching"; however, their actual selection preferences are based on the locale of the preceptor. Students do not know what criteria are used by the preceptorship coordinator to assign preceptors.

3. Neither selected nor assigned preceptorships necessarily coincide with a student's abilities and needs in terms of clinical competence development.

4. Clinical preceptorships vary greatly in terms of preceptors' commitment and ability to teach, what is expected or permitted of students, patient population, complexity of office organization, philosophy of medical care and patient load.

5. Preceptors are perceived as lacking understanding of both the students' individual stages of clinical competence development and their role in that process.

6. Passive students seem to learn less when they have a "bad" preceptor who does not make any demands on the student than they do when they have a "bad" preceptor who has too high expectations for student performance. Both assertive and passive students can be overwhelmed by intimidation or perceived antagonism on the part of the preceptor.

7. The co-occurrence of clinical cases in the preceptor's office with the relevant knowledge in the systems course under study is serendipitous and infrequent; hence, integration of theory and practice is inefficient.

8. Preceptors' expectations and evaluations tend to focus on the student's recall of specific information, such as drug regimens and criteria for disease, and the ability to propose the "correct" interpretation of the data presented in the case at hand.

Unit III

The focus for examining clinical competence at the Unit III level was the (then required) 48-week series of clinical rotations in ambulatory and hospital settings. The subjects of this study undertook the following clinical rotations:

     6 weeks   Jr. Partnership (private D.O. office)
    12 weeks   Internal Medicine         )
     6 weeks   Surgery/Anesthesiology    )
     6 weeks   Obstetrics/Gynecology     )  Base Hospital
     6 weeks   Pediatrics                )
     6 weeks   Psychiatry                )
     6 weeks   Selectives

Students could choose, depending upon availability, among approximately 150 Senior Partners, fifteen base hospitals, and virtually unlimited resources for selectives. Subjects were asked in interviews to consider their clinical rotation experiences and to describe those which were particularly productive and unproductive in developing their clinical competence; what they were able and not able to do and what they should have been able to do; and to describe how and why they developed the clinical competence and confidence that they had at the end of Unit III. It was presumed that Unit III students would be involved in the full range of basic professional tasks, and that they would be primarily preoccupied with refining their history and physical examination skills and with developing diagnostic and management problem solving skills. Questions were, therefore, directed to those two major professional tasks, with no particular effort being made to ascertain insight into technical/procedural skill competence development.
Explanations for History and Physical Examination Competence: As was described in Chapter IV, Unit III subjects described their History and Physical Examination (H/P) skills as being thorough, accurate and efficient, although they individually described specific system evaluations to be deficient in certain respects. It was noted that the striking feature in the subjects' discussions of their H/P skills was the lack of detail. The issues of concern for Unit II students--mechanisms of the procedures, patient interaction, and understanding the meaning of the data--no longer appeared to be of concern to these students who had completed Unit III. More subjects (36%) offered no descriptions of deficiencies than offered any single descriptor. The most frequently cited deficiency (27% of the subjects) was "not efficient." As one might expect, subjects offered little to specifically explain their competence and/or deficiency in performing the history and physical examination.

The Unit III H/P is an interesting issue, particularly in light of these subjects' taciturn response to questions regarding their H/P competence. Most H/Ps were performed as part of the clerks' responsibilities for processing admitted patients--which students less than genially refer to as "scut" work. Patients on whom the clerks performed the admitting H/P might or might not have been on the service to which the student was assigned and/or might or might not have been followed by the student. All subjects described having taken hours (as many as four, but no fewer than two) to do their initial Unit III H/Ps and having had literally hundreds of H/Ps assigned, but having had no demonstrations, guidance or supervision in the performance of the admitting history and physical examination. Only one subject specifically acknowledged an explanation for her H/P competence: "the house staff went over findings four or five times." The same subject explained her deficiencies in performing the H/P as resulting from the fact that "MSU-COM had no physical exam course," that she "had only done 4 or 5 H/Ps before entering Unit III," and that she "never got good feedback on the H/P."

The subjects' reported confidence and competence may be explained by the sheer volume of H/Ps they report having performed.1 That, no doubt, is a significant, though not altogether satisfying, explanation. Several subjects' comments may give a clue to another important variable: increased knowledge and skills gained from sub-specialists:

1One subject reported having kept a record of his admissions H/Ps and having done 350 in a five-month period, none of which were reviewed with him by a clinician.

    I don't want to learn a physical exam from a general practitioner. I want to learn lymph nodes from an oncologist, how to listen to the heart from a cardiologist . . . to be able to do a darn good physical exam.

    My rotation in G.I. with [specialist] . . . he was very organized and structured in his differentials . . . And to this day when I see an upper or lower G.I. bleed I have a differential in my head that is unshakeable and I have an approach that is flawless--at least in terms of my understanding--and it gets me through 95% of the time.

    Yesterday, I was working with an ophthalmologist and he pointed out how I might better use the ophthalmoscope and see better in the eye--that just happens all of the time . . . You need repetition with guidance and occasional refined guidance to fine tune what you are doing.
Here we get a clearer view of the distinction between Unit II and Unit III history and physical examination competence. Unit III students had both the need to know and the opportunity to learn the clinical medicine knowledge essential to performing an "accurate and thorough" history and physical examination. Faced with medical problems complex and/or serious enough to require hospitalization and/or referral to specialists and subspecialists, these students realized that their previous competence was insufficient, and that their knowledge of pathology and clinical medicine was the difference between adequate and inadequate performance of the history and physical examination. Sub-specialists are seen as knowing more and, therefore, being able to do a better examination and history, and, therefore, being the best source of instruction and feedback to refine one's H/P skills. And, as was concluded from their descriptions of their H/P competence, the history and physical examination was seen by Unit III subjects as an integral part of diagnostic problem solving. Most reported that in order to both recognize and pursue important cues in the H/P, one must be very knowledgeable of disease.

One gets the impression that students who are most cognizant of this relationship between "knowing and doing" are most likely to offer ambiguous descriptions of their own competence. Those students who sought advice from subspecialists described themselves as both being very competent--better even than the average professional--and needing a great deal of improvement in the H/P. The student's personal standards appear to determine whether he/she will seek competence at the "average doc" level (often described as meaning the family/G.P. D.O.) or at the specialist professional level.

Explanations for Diagnosis and Patient Management Competence: In Chapter IV, Unit III students were described as being intensely engaged in the process of learning how to diagnose and manage specific medical problems. The definition of diagnostic and patient management competence for Unit III students was vastly different from that of Unit II students, primarily because of the situational context of their learning and performance: the hospital. There are some interesting similarities and dissimilarities in the nature of the learning process and explanations for competence between the two groups. Unit III subjects offered twenty-four (24) explanations for having attained clinical competence, as shown in Table 5.19:

Table 5.19
EXPLANATIONS FOR UNIT III CLINICAL COMPETENCE

A  Staff check findings/given immediate feedback
B  House staff tell you what you need to know
C  In-depth knowledge from specialty rotations
D  M.D. institution had good teaching
E  Staff discuss problem solving process
F  Repetition with type of case
G  Rely on Unit II didactics
H  Good teaching at base hospital
I  Used clinical reference books
J  Got to do consults
K  Given patient care responsibility
L  Took responsibility for learning
M  Didactics congruent with cases
N  Pre-MSU-COM knowledge
O  Role-modeling by staff
P  Studied patient charts
Q  Self-confidence
R  Staff had protocol for case management
S  Varied clinical experience
T  Staff interested in students
U  Students made accountable
V  Sufficient pathology
W  Peer teaching
X  Clerks organized lectures and demonstrations

(See Exhibit K for the individual subjects' explanations for Unit III competence.)
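The percent-response figures reported throughout this chapter (for example, in Tables 5.14 and 5.20) could be tallied from subject-level codings along the following lines. The sketch is purely illustrative and is not presented as the tabulation actually used in the study; the subject records, category labels and explanation codes are hypothetical.

    # Illustrative tally of percent response by experience category.
    # Subject records (id, category, explanations cited) are hypothetical.
    from collections import defaultdict

    responses = [
        ("S1", "I-III", {"A", "F", "K"}),
        ("S2", "I-III", {"F", "H", "K", "N"}),
        ("S3", "IV-V", {"K", "N", "Q"}),
    ]

    totals = defaultdict(int)
    counts = defaultdict(lambda: defaultdict(int))
    for _, category, cited in responses:
        totals[category] += 1
        for code in cited:
            counts[category][code] += 1

    for category in sorted(counts):
        for code in sorted(counts[category]):
            percent = 100 * counts[category][code] / totals[category]
            print(f"{category:6s} {code}: {percent:.0f}%")

The same tally over all subjects, rather than within a category, would yield the "All" percentages that accompany several of the tables.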
Frequently cited explanations were: (K) "given patient care responsibility" (82%); (F) "repetition with type of case" (64%); (H) "good teaching" (54%); (J) "got to do consults" (54%); (N) "pre-MSU-COM knowledge" (54%); (A) "staff check findings/given immediate feedback" and (E) "staff discussed problem-solving process" (45%); (L) "took responsibility for learning" (36%); (Q) "self-confidence" (36%); and (T) "staff interested in students" (36%). The frequency with which certain explanations were offered by these Unit III subjects was unexpected given their highly variable descriptions of personal competence (Chapter IV, pp. 18-20), and was in sharp contrast to the virtual idiosyncrasy of Unit II subjects' explanations.

The explanations can be grouped into at least four arbitrary categories: clinical instruction (A,B,D,E,H,M,O,R,T,U); student knowledge base (C,G,N); clinical experience/practice (F,J,K,S,V); and self-instruction/effort (I,L,P,Q,W,X). Of the commonly cited explanations, the most frequently cited fall within the clinical experience category (F and K); the next most frequently cited fall into the clinical instruction (H), clinical experience (J) and student knowledge (N) categories; the third most frequently cited within the clinical instruction category (A,E); and the least frequently cited within the self-instruction/effort (L,Q) or clinical instruction categories.

Other insights emerge when one contrasts explanations offered by inexperienced subjects (Category I-III) with those of experienced subjects (Category IV-V), as shown in Table 5.20.

Table 5.20
COMPARISON OF EXPERIENCED AND INEXPERIENCED SUBJECTS' EXPLANATIONS FOR UNIT III COMPETENCE
(Percent response by category: I-III (n=6) and IV-V (n=5))

A  Staff check findings/given immediate feedback
B  House staff tell you what you need to know
C  In-depth knowledge from specialty rotations
D  M.D. institution had good teaching
E  Staff discuss problem solving process
F  Repetition with type of case
G  Rely on Unit II didactics
H  Good teaching at base hospital
I  Used clinical reference books
J  Got to do consults
K  Given patient care responsibility
L  Took responsibility for learning
M  Didactics congruent with cases
N  Pre-MSU-COM knowledge
O  Role-modeling by staff
P  Studied patient charts
Q  Self-confidence
R  Staff had protocol for case management
S  Varied clinical experience
T  Staff interested in students
U  Students made accountable
V  Sufficient pathology
W  Peer teaching
X  Clerks organized lectures and demonstrations

These data suggest that students, regardless of prior medical experience, attributed their competence to their MSU-COM didactic base, feedback on performance, good teaching (including the staff discussing their problem solving process), and opportunities to perform somewhat independently (doing consults). Inexperienced students gave more credit for their competence to being provided specific directions, protocols and instruction, repetition of specific tasks, and study of clinical references. Experienced students, on the other hand, were more likely to credit having been given or having taken responsibility and accountability for patient care, their prior knowledge and resulting self-confidence, varied experience and case pathology, role modeling by staff, didactics being congruent with the cases they were currently working up, self-instructional efforts, and recognition by the clinicians.
In sum, experienced students appeared more pragmatic and self-assertive in clinical learning, whereas inexperienced students appeared to require more formal teaching/guidance.

When one examines the contingency data, several interesting patterns of associations emerge. Table 5.21 reveals the probabilities of paired (associated) explanations for Category I-III (inexperienced) subjects. (See Exhibit T for associations of all Unit III subjects.)

[Table 5.21]

Several obvious patterns are seen. In the first, the cluster of clinical instruction-related explanations (A,B,C,D,E) is associated with a single clinical experience explanation (F):

[association diagram linking explanations A, B, C, D, E and F]

This may suggest that inexperienced students perceive the efficacy of "repetition with type of case" in developing clinical competence to be contingent on clinical instruction. A second pattern of associated explanations suggests that they also viewed the efficacy of clinical instruction (A,B), clinical experience (F,J,K) and self-instruction/effort (L,N) as being associated with affective aspects of the conditions of learning ("staff is interested in students" and "self-confidence") and student initiative ("studied patient charts"). A third pattern suggests that those students without medically-related experience who brought to their clinical training some health-related experience/knowledge associated competence development with receiving didactic material that was congruent with the clinical cases with which they were currently working. Under these conditions, clinical instruction (A,C,H) and clinical experience (F,K,L) were productive in developing clinical competence.

When one examines the associated explanations of experienced students, different patterns emerge, as revealed in Table 5.22. The first cluster of contingencies described for the inexperienced students does not emerge at all, and in the second and third clusters different patterns emerge. For example, self-instructional explanations, particularly W, X, L and Q, were significantly correlated with other explanations:

[association diagrams linking Q with M, N, T, U, W and X, and with S, T, U and X]

[Table 5.22]
These associations suggest that experienced students assumed more responsibility--or credit--than inexperienced students for their effective use of learning opportunities to develop their clinical competence. Other association clusters also distinguished experienced students:

[association diagrams linking T and U, and A, F, H, J, L, M, Q and S]

These various associations suggest that the experienced student did not place upon clinical instructors an expectation for didactic instruction--or what several subjects referred to as an "academic approach"--so much as expecting them to recognize the student's ability and to allow him/her to assume corresponding responsibility in caring for the patient. This is consistent with the previous interpretation that experienced subjects were more pragmatic in their approach to learning--an interpretation also posed for experienced students in Unit II.

Unit III subjects posed twenty-three (23) explanations for their deficiencies in diagnosis and patient care management, with some obvious differences between explanations of inexperienced and experienced subjects, as shown in Table 5.23.

Table 5.23
COMPARISON OF EXPERIENCED AND INEXPERIENCED SUBJECTS' EXPLANATIONS FOR UNIT III DEFICIENCIES IN DIAGNOSTIC AND MANAGEMENT COMPETENCE

                                                                   Percent Response by Category
Explanation                                                          I-III       IV-V
A  No differential diagnosis course (MSU-COM)                          33%         20%
B  Exam process (MSU-COM) doesn't require that you think               16           0
C  No orientation to hospital, procedures, goals                       16           0
D  Not allowed to do certain things as a clerk/not challenged          33          60
E  If I don't see it, I don't understand it                            16           0
F  Lectures not congruent with service on                              50           0
G  No personal interest                                                16          40
H  Differences in approach by clinicians/no quality control            33           0
I  No emphasis on self-teaching/reading at MSU-COM                     16           0
J  Teaching at too high level for clerk                                50          20
K  Insufficient patient base                                           33          20
L  Not good role models                                                16           0
M  No reinforcement                                                    33           0
N  No feedback                                                         16           0
O  Too much scut work                                                  33           0
P  Lack of clinical relevance in basic science courses (MSU-COM)       16           0
Q  Rotation at wrong time                                              16          20
R  Clerk's academic perspective not realistic in clinical learning     16           0
S  Lack of educational orientation of hospital/no teaching             33          80
T  No patient responsibility                                            0          60
U  Curriculum deficient in topic(s)                                     0          60
V  House staff not helpful                                             16          20
W  No sub-specialty rotation                                            0          20

Similar to Unit II subjects' explanations of deficiencies, these Unit III subjects more frequently cited explanations related to faculty program design (A,B,C,D,I,O,P,Q,S,T,U,W) than those related to clinical instruction (D,H,J,K,L,M,N) or personal behavior (E,G,R). Both experienced and inexperienced students were more likely to fault the program; however, inexperienced subjects offered three times as many explanations related to clinical instruction as did experienced subjects. Again it appears that inexperienced students harbor greater and more specific expectations of the program and clinical instructors. They appear to have expected that the clinical training program be "academic," whereas experienced students expected some basic, pragmatic instruction in conjunction with experience, and responsibility commensurate with their abilities. Explanations for personal deficiencies might be seen as expressions of unmet personal expectations.

When the correlations of paired explanations are examined, the difference between experienced and inexperienced subjects is further magnified.
The contingencies matrix for inexperienced subjects, Table 5.24, can be interpreted as describing individual subjects' associations, with little common ground for all subjects. Explanations (F), "lectures not congruent with service on," and (J), "teaching at too high level for clerk," are the only ones cited by at least half of the subjects, but each of them is associated with an idiosyncratic cluster of explanations and not with each other. That is, both (F) and (J) are common elements in nearly all subjects' sets of explanations, but contingent explanations were unique to the individual. This finding is consistent with previous findings of inexperienced students at other levels of training. Inexperienced students' learning appears to be a more privatized process, with each student arriving at different ends by different means.

In contrast, Table 5.25 shows a much more homogeneous set of correlations of the explanations offered by experienced subjects. Experienced subjects used fewer, more common explanations. Explanations (S), "lack of educational orientation of hospital/no teaching," (T), "no patient responsibility," and (U), "curriculum deficient in topics," appear as central to all other significantly correlated explanations:

[Table 5.24]

[Table 5.25]

[association diagram linking S, T and U with A, G, J, K, L, V and W]

These data appear consistent with previous interpretations that experienced students did not hold many expectations of clinical instructors, except to allow students patient care responsibility. The fundamentals of medicine were assumed to have been taught on campus in previous Units.

The analyses of the interviews with Unit III subjects regarding the explanations for their clinical competence support the following generalizations:

1. The situational context of clinical experience is an important variable in the development of clinical competence. Hospitals generally offer students experience with more serious medical problems and more sophisticated diagnostic and therapeutic resources than do ambulatory care facilities.

2. The amount of clinical experience the student has affects the amount of confidence and competence he/she has in managing his/her own clinical training and the expectations he/she has for clinical instruction.

3. The Unit III history and physical examination competence is extended well beyond that of Unit II because of the student's increased knowledge of and experience with serious medical problems and instruction by sub-specialists.
4. Inexperienced students, more than those who have extensive medically-related training and experience, associate their clinical competence development with structured clinical training; i.e., guided learning around clinical cases.

5. Experienced students, more than inexperienced individuals, associate their clinical competence development with being given, and held accountable for, responsibility in patient care.

6. Although the inexperienced are more dependent upon structured clinical teaching, both experienced and inexperienced students hold the formal program (Units I and II) accountable for their clinical skill deficiencies.

Further insight into these generalizations can be gained from students' perceptions of their preparation for Unit III and the nature of their clinical rotations.

The adequacy of preparation for Unit III: Students' perceptions of the adequacy of their preparation for Unit III within the formal (Units I and II) program provided further insight into the explanations for Unit III deficiencies. Table 5.26 lists the twenty-two (22) areas in which these subjects perceived themselves to be insufficiently prepared for Unit III. As these data reveal, the perceived inadequacies of the formal program are quite idiosyncratic, particularly within the inexperienced group, and there are differences between experienced and inexperienced subjects' perceptions.

Table 5.26
AREAS OF PERCEIVED INSUFFICIENT PREPARATION FOR UNIT III

Physical exam emphasis
Differential diagnosis course
Journal/reference reading
Specific clinical pharmacology
Pathology (to distinguish normal from abnormal)
Pelvic examinations
Reinforcement of EKG interpretation
Reinforcement of X-ray film interpretation
Pulmonary tests
Blood gases/electrolyte management
Facility in suturing
Respiratory system biology
Obstetrics system biology
Orthopedics system biology
Ophthalmology
Thrust manipulative skills
Laboratory medicine
Medical terminology
OMT skills emphasis
Common clinical medicine
Admitting orders
Practical nursing treatments

More important, however, is what this list implies regarding student expectations: the formal program should provide students with the knowledge and skills clinical instructors demand of students. That is, Unit III is not seen as an extension of a continuous instructional process of the College, and clinical instructors do not necessarily have responsibility for building on what the individual student brings to the clinical training setting. Instead, the formal program is presumed responsible for preparing the student to undertake whatever demands the clinical training program makes of him/her. Thus, since they had foreknowledge of the medical care process, the organization and function of the medical care system, and some technical and adaptive skills, medically-experienced students were less vulnerable to and more "realistic" about the nature of clinical training. Thus also, experienced students placed different demands on the formal program--in part because they possessed certain knowledge that inexperienced students might not have had, but also because they had different expectations of the clinical training programs. They also appear to have had more self-confidence and skills in managing their own learning.

Subjects' discussions of their preparation for the Unit III clinical externship support the following generalizations:
1. The teaching/learning process of Units I and II does not prepare students for the process used in the clinical setting.

2. The systems courses are an effective way to learn the basics of medical science.

3. Some systems courses are perceived as better than others in that they:
. emphasize knowledge and skills routinely used in clinical medicine
. provide a framework for developing a differential diagnosis for problems of the system
. test important, clinically relevant knowledge and skills

4. Systems courses are faulted for not preparing students with all of the clinical medicine they need to know for their extern/internships.

5. Students do not functionally retain all of the information given them in the systems biology courses, for a variety of reasons, most commonly:
. information wasn't reinforced by clinical experience
. students "learn" information for test purposes, not for practical application
. students with extensive pre-medical school general practice experience discount significant "esoteric" information

6. Students, with the possible exception of those with medically-related clinical experience, feel insufficiently prepared in physical examination skills, differential diagnosis/problem solving, and journal article interpretation to function effectively as a clinical student.

7. Perceived insufficiencies in preparation for Unit III reflect differences in the demands of clinical supervisors, students' standards of performance, pre-Unit III clinical competence, students' insight and professional goals.

Students' perceptions of their clinical rotation experiences, and why they did and did not productively contribute to their clinical competence development, further explain these interpretations.

The productivity of clinical rotations: Unit III subjects were asked to identify and describe the clinical rotations which they considered the most and least productive in terms of their clinical competence development, as a means of gaining more insight into the conditions of learning. (See Exhibit L for all subjects' descriptions of these rotations.) Table 5.27 shows that, in general, no particular pattern emerged for perception of any rotation as most or least productive. With the exception of the psychiatry rotation, subjects provided no evidence that the site of the rotation was a significant factor in determining the perception of productivity. In the instance of psychiatry, all those who reported it as the least productive rotation had been assigned to the same facility, whereas those reporting it to be the most productive had been assigned elsewhere. In several instances, subjects trained at the same base hospital reported the productivity of a given rotation differently.

Table 5.27
LEAST AND MOST PRODUCTIVE UNIT III CLINICAL ROTATIONS

                          Percent Response as
Rotation                   Most      Least
Jr. Partnership              9%       18%
Medicine                    18        27
Infectious Disease           9         -
G.I.                        27         -
Psychiatry                  18        27
Pulmonary Medicine          18         -
OB/Gyn                      27         9
Hematology                   -         9
Nephrology                   -         9
Pediatrics                   -         9
Surgery                      9        18
Radiology                    -         9
Emergency Room               9         -
All good/none bad            9         9

When one contrasts the descriptors of a most productive rotation with those for a least productive rotation, further understanding of the process of competence development can be gained. Tables 5.28 and 5.29 present those respective sets of descriptors.
Table 5.28
EXPLANATIONS FOR A UNIT III CLINICAL ROTATION BEING VERY SUPPORTIVE OF CLINICAL COMPETENCE DEVELOPMENT

Explanation                                                 All (n=11)
Student's cumulative knowledge                                  27
Got to manage patients/make decisions                           64
Made accountable for diagnostic knowledge                       18
Academic                                                        27
Lectures immediately followed by relevant case write-up          9
Lots of hands-on experience                                     36
Staff helpful                                                   18
Lots of feedback                                                27
Taught structured differentials                                 18
Taught to do procedures                                         27
Personal interest                                               18
Taught to be efficient                                           9
Felt competent                                                  18

Table 5.29
EXPLANATIONS FOR A UNIT III CLINICAL ROTATION BEING UNSUPPORTIVE OF CLINICAL COMPETENCE DEVELOPMENT

Personally not prepared for Unit III
Instruction above extern level
Too many students on service
First rotation
No teaching on service
No patient care responsibility assigned
Difference in G.P./hospital philosophy
Assigned to "bad" intern
High service demands
No house staff
Little pathology

Unit III subjects, irrespective of pre-medical experience, shared opinions of what helped them develop clinical competence: their having sufficient cumulative knowledge to benefit from the rotation; having hands-on experience with patient care responsibility; being taught to do, and being evaluated on, clinical procedures and problem solving; and having some degree of self-confidence with which to perform and take advantage of the learning opportunities. Table 5.30 reaffirms the centrality of these criteria. The five general descriptors account for most of the explanations for productivity in all rotations except surgery--which was explained as "was taught to be efficient with time." Conversely, when externs were not actively involved in the patient care process or clinicians did not (or could not) teach the clinical skills, students perceived themselves as being unable to optimally develop clinical competence, as shown in Table 5.31.

[Table 5.30]

[Table 5.31]

More than half of the subjects described, in emotion-laden terms, experiencing considerable discomfort at having to approach a problem for which they had neither a practical nor a theoretical background.

    I was very embarrassed. I was sent in to a fat woman; she has pain in her leg; first metatarsal swollen; pain at night; it hurts terribly. He [physician] says, "What is it?!" I says, "I haven't the slightest idea." I didn't know gout from shout.

For some subjects the embarrassment of the situation had a negative effect on their immediate learning:

    I feel very inadequate when I have to go and see people with no background. I am not comfortable with it. Comfortableness makes it an enjoyable experience, versus doing something to get it over with. I probably am more aggressive and assertive, and, consequently, can get much more out of it [experience] when I feel comfortable.
Not having the theoretical knowledge posed two problems for students: (1) they didn't have the knowledge to independently approach the problem solving challenge, so they had to depend on the preceptor to guide them step by step, to follow methodically an H/P protocol, and/or to look up everything, making their performance slow at best; and (2) they lacked the medical terminology to fully comprehend the explanations preceptors offered, and were unfamiliar with resources from which to efficiently retrieve the information they needed. Without the theory, students didn't even "see" what was so apparent to their preceptors, as one Unit II subject's recollection of his Unit I H/P experience points out.

    I think if I see the patient before I know anything, I don't know enough to see what is there. An example of that is the very first hospital history and physical that I did. It was on a lady who they thought had pheochromocytoma--a tumor of the adrenal gland--and I hadn't even taken endocrinology. As I sat there after taking the history and physical and the interns were debriefing me, trying desperately for me to make this association with something that wasn't even there, I realized after they had done all of that and they explained to me what the lady had, that I had missed seeing signs of the disease.

Some students reported drawing upon their backgrounds in science to help them understand new medical problems:

    I have encountered clinical problems before I had the systems course, but I could use my physiology knowledge to get me through. The systems courses give you a differential diagnosis to work from, but I think I have developed my own.

    Oh! I felt fear! Before I had G.I. [system] I had a woman with a gall stone. I figured it out from Anatomy--knowing where the pain was, it could only be a couple of things, unless it [pain] was referred.

The students with pre-medical school medically-related training and experience tended not to have the same reactions to these new situations, and for the most part these were not "new" situations in the ambulatory setting. These subjects reported recalling protocols for diagnosis and management they had used in the past.

The effect on learning: However, students were at a considerable advantage in the classroom having first had the clinical experience. When asked what they brought to the relevant systems biology course from the previous clinical experience, subjects were quite unanimous in their perceptions, as seen in Table 5.33.

Table 5.33
EFFECT ON CLASSROOM LEARNING OF PRACTICE PRECEDING THEORY

                                                       Percent Response
A  Conjure up mental image of patient                        73%
B  Case provides marker and reason for what to learn         63
C  Can remember theory longer                                45
D  Can build variations around case                           9
E  Case is basis for learning differential diagnosis         18
F  Case is motivator for learning                            18

Subjects conjured up mental images of their experiences when lecturers presented the theoretical information about the medical problem. The images were rich in clinical information.

    I can see that woman sitting there. I can see her fat leg throbbing. I can see her face. I can see both of them [physician and patient]. I can hear her telling how it hurts, that her husband was under a lot of stress--he's a farmer. Yes I can!

Projecting mental images of patients was reported by many Unit III subjects as well, some referring to events years before medical school:

    I can still remember the lady with Addison's disease. I will never forget her--I can't remember her name.
    I can see her standing by the door, telling me that she had to have hydrocortisone or she would die! . . . And then I saw her brown skin.

These cases can become the basis for learning the differential. As one earlier quoted Unit III subject reported, three or four patients continued to provide him the benchmarks of the differential in G.I. disease. Most Unit II subjects did independent study, primarily in clinical texts, around the particular cases, and several reported maintaining personal clinical case notebooks in which they recorded signs and symptoms, drug treatment and the like. However, these Unit II subjects did not make any special effort to teach themselves the theoretical aspect of the medical problems they confronted; nonetheless the recall of the particular patient was vivid and provided a basis for learning theory in the didactic course. Subjects also acknowledged remembering the theory longer when it was anchored to clinical experience.

    As far as retention and learning, for me going into the clinical and then going to the systems course works best. I didn't have to go back and look that stuff up--I remembered it . . . Less work, more efficient, a better educational process.

    I can't give you right off hand the 6 or 7 criteria for gout, but I can close my eyes and see that lady and then I can remember--and the treatment. That's the anchor for all that knowledge. I never thought about it like that, but it's true.

    I think if I see it first and then read about it, I remember it better. So the visual aids are really important.

But there are several possible disadvantages to having to perform before having the theoretical base. One, as was noted, is the inefficiency and ineffectiveness of performance. If the performance evaluation is predicated on an assumption of theoretical knowledge which determines a course grade, obviously, the student is placed in academic jeopardy. A second consequence was pointed up in an interview with a Unit III subject. This individual acknowledged that, as a former Physician's Assistant, he had had no problems meeting the performance expectations of his Unit II preceptors, but he had, he thought, as hard, and perhaps harder, a time as his classmates in trying to figure out where the theoretical information presented in systems biology courses was going to be relevant in the clinical situation:
In fact, the clinical experience did not make the difference of passing or not passing the systems courses, since students had learned how to study to pass examinations. It was only in retrospect, as Unit III subjects were able to understand, that students appreciated the value of all of the information that was presented in the systems course. For most students, then, the psychological stress of having to perform without knowledge and the possiblity of being rated as a poor student by the preceptor, were the deciding factors. But as one student observed: As long as you see the clinical experience as an opportunity to learn, rather than as a test of what you are supposed to know or be able to do, you can manage to learn something one way or another. The Effect of Theory Preceding Practice Unit II subjects described themselves as "having a place to begin" when they had had the relevant systems biology course for the medical problem being 218 confronted. As Table 5.34 reveals, systems biology courses provided students with a good deal of applicable information and skills, provided the student remembered the needed information. Table 5.34 THE EFFECT OF HAVING THEORY PRECEDE PRACTICE Percent Response A Have differential to work from 54% B Can key in on major signs 63 C Have few tentative working diagnoses 18 D have basic pharmacolgoy 18 E Can problem solve more efficiently 27 F Have criteria for disorders 18 G Know references to use 18 H Understand pathology of disorder 18 I Know lab tests for problem list 18 K Provides vocabulary/ medical terminology 27 L Know key questions for systems review 18 M More comfortable/assertive 18 N Have certain skills; e.g. read EKGs 9 0 Have basis for asking preceptor questions 9 P Remember theory when encounter patient 9 Q Learn and forget 9 R Current knowledge dominates thinking 9 The systems biology courses present a great deal of information important to understanding clinical medicine. In retrospect the systems courses were seen by students as having been thorough and practical, although that may not have been true at the time they were being studied, as one Unit II subject lamented: In retrospect, I wish I could go back and revisit some of those lectures. The most boring and "useless" lectures that we had, and that I didn't attend, the information is all there--it is organized well. Someone gave me a differential; someone gave me treatment and some general management tips. I didn't appreciate it as a second year student, never having been exposed to those things--never having seen someone with peptic ulcer. You can't appreciate all the things that you have to do in the therapies. 219 Despite their best efforts to learn the vast amount of presented information, Unit 11 students had difficulty integrating the theory and practice, primarily because the presentation of the theoretical information was incongruent with the presentation of the practical problem. The theory tended also to focus on serious medical problems which students seldom, if ever, encountered in the ambulatory care settings in which they practiced, and when such problems _wege encountered, they were not likely to coincide with the presentation of theory. This disjuncture in presentation tended to enhance the dichotomy of theory and practice: students attempting to learn, more or less rote, preceptors' protocols for diagnosis and treatment, and faculty's presentation of theory. 
Several other intervening variables also appeared to discourage integration of theory and practice: placement of the systems courses and the effectiveness of clinical problem solving study cases. Unit II subjects referred to the importance of the placement of the systems courses from several perspectives. First, the placement of the systems courses within the overall curriculum was seen as a factor in understanding the theory presented in the course.

The reason OB is so fuzzy for me is that I didn't know much about systems and I [have] learned a lot from that course to this one. If I were to take OB now, I would be much more comfortable with this issue. The knowledge is cumulative and it overlaps so much.

This insight is in contrast to the typical subject's analysis of his/her difficulty in working up a particular type of case; i.e., the systems course was inadequate or it was studied some time ago and they had forgotten much of the material. While these differences in perspective support what faculty have always contended--students are not good judges of course content--they also support the conclusion that faculty may not realize what (little) students actually understand after completing even the "best" course. Students in this and previous studies have suggested that an "integrating system," such as pediatrics, be offered at the end of the curriculum, so that courses presented early in Unit II can be re-visited. Students have also suggested that integration of theory could be enhanced by progressively placing individual systems biology courses so that each reinforces and extends the theory of the previous course(s).

Integration of theory and practice, then, depends on understanding the theory. It also depends on having the opportunity to apply the theory in a practical situation. Timing appears to be an important factor. Subjects reported having forgotten much of the theory if the presentation of the clinical case was delayed. To further compound the problem, students do make an effort to immediately reinforce theory with practice, by testing the clinical case against theory currently being studied or against clinical problems they have failed to properly diagnose.

If I was in a system--Cardiovascular--everything I was looking for was cardiovascular. Absolutely! I do a more thorough work-up on the system I've just completed.

A young man came in the other day. . . I thought he had a urinary tract infection. . . but he had mono. I missed it completely. And another patient came in and [the physician] thinks she has mono too. I missed that completely. Mono wasn't in my differential diagnosis; now everyone has mono until proven otherwise!

Faculty have attempted to ameliorate the discrepancy in co-occurrence of theory and practice by including "paper case" exercises in the systems courses and the preceptorship courses. Many of the systems biology courses have small group sessions in which hypothetical clinical cases (CPSSs) are presented for students to work up. Student reactions to these sessions vary, although almost all subjects of this study reported that the CPSSs provided them with a strategy for problem solving. Table 5.35 shows, however, that the effectiveness of the CPSS in facilitating the individual student's integration of theory and practice could be compromised by the instructional process.
Table 5.35
DESCRIPTORS OF THE EFFECTIVENESS OF THE CPSS IN INTEGRATING THEORY AND PRACTICE

Descriptor                                                          Percent Response
A  Provided a way to think through problem                                82%
B  Provided idea of therapeutics                                          36
C  Made student think and integrate                                       18
D  Concerned with esoteric problems                                       45
E  Not holistic in management approach                                    27
F  Quality varied by course and instructor                                63
G  Schedule already too heavy                                              9
H  Individual student not held accountable                                36
I  Not structured for optimal learning                                    27
J  Case geared to current system                                          36
K  No "grey" cases to make student think                                   9
L  No distinction between hospital and ambulatory care management          9

These statements suggest that the helpfulness of the CPSS to the individual student is dependent upon the CPSS instructional process, which is reported to vary from course to course and from instructor to instructor. Many of the subjects judged the quality of the CPSS on the basis of its practicality; i.e., its usefulness to them in their preceptorship experiences. When CPSS cases represented medical problems which were not seen in the ambulatory care setting, students perceived little was gained from them. Similarly, when the case related obviously to the system under study, they reported the CPSS as not "being like the real world." In these instances the students saw themselves going through yet another didactic exercise that didn't prepare them for the clinical situation. Students' attitudes were particularly negative when the time for CPSS sessions infringed on time thought better spent studying for examinations. Some subjects explained that the CPSS sessions didn't help them integrate theory and practice as much as they might have, because the process allowed them to renege on personal responsibility for thinking through the problem. Too, the "correct solution" to the medical problem was usually limited, failing to take into consideration many theoretical aspects of patient management that students had been taught in other courses.

Similarly, the clinical cases which students wrote up for the preceptorship course were reported to be of limited value, because they received no feedback on their reports. Feedback is a constant quest of students, whether it is in the clinical setting or in college courses. Once students realize that they will get no feedback, they reduce the value they place on the experience and the effort they put into the activity. These students' reactions suggest that exercises designed to help students develop clinical competence must be well conceived; i.e., congruent with students' total educational experiences and with program goals, and must provide each student with an opportunity to compare his/her thinking with the ideal.

Explanations for Integration of Theory and Practice

Unit II subjects offered a variety of explanations for their individual processes of integrating theory and practice, as seen in Table 5.36. (See Exhibit N for individual subjects' explanations.)
Table 5.36
EXPLANATIONS FOR INTEGRATION OF THEORY AND PRACTICE OFFERED BY UNIT II SUBJECTS

Explanation                                                                Percent Response
A  Use didactic information as basis for problem solving                         18
B  Cumulatively understand interrelationships                                    36
C  Systems courses give differential diagnosis                                   27
D  Study after see cases                                                         91
E  Clinical practice reinforces theory                                           45
F  Clinical practice makes learning theory more effective and efficient          54
G  Look for things currently studying in clinical cases                          18
H  There are gaps between theory and what is needed in practice                   9
I  Forget things if there is a time lapse between theory and clinical case        9
J  Building personal clinical case notebook                                      27
K  Need someone to help put theory and practice together                         45
L  Basic sciences assist in screening of signs and symptoms                      18
M  Need model for thinking                                                       27
N  Follow-up of patient helps evaluate problem solving skills                     9
O  Pediatrics course could be integrator                                          9
P  Study clinical texts                                                          45

These descriptors summarize much of what has been previously discussed. Students started with existing knowledge, either cumulative practical experience or such theory as they had gained in systems biology courses. Students then attempted to fill gaps in their existing knowledge in several ways. Inexperienced students attempted to make the patient "fit" their existing theoretical knowledge and read "around the case" in clinical texts. Experienced students depended upon their pre-medical knowledge and cumulative systems biology knowledge, and also used clinical texts, such as Harrison's. Both experienced and inexperienced students expressed the need for clinician role models who would both demonstrate how to approach the problem and discuss with the student how they think about the case. In either case, the clinical case provided the stimulus for integration.

Students' descriptions of the manner in which theory and practice related to their clinical competence suggested the following generalizations:

1. The integration of theory and practice is a cyclical process, requiring mutual reinforcement of theory with practice and practice with theory, such that at a given point in time:

(A) THEORY --> PRACTICE --> THEORY --> INTEGRATION
    (didactic)   (clinical practice)   (re-study)

or

(B) PRACTICE --> THEORY --> PRACTICE --> INTEGRATION
    (exposure)   (didactic)   (clinical practice)

a) The cumulative effect of increased knowledge from systems courses and continual clinical experience is a dynamic evolvement of clinical competence throughout Unit II, where:

[Diagram: theory and practice reinforce one another within each of Terms 1 through 4, with each term's theory and practice building on that of the preceding term.]

b) The effectiveness and efficiency of the integration of theory into practice is affected by the availability and timing of relevant clinical experiences, clinical instruction, and student ability and effort.

2. Integration of knowledge into clinical problem solving competence is dependent upon experience, personal insight and effort on the part of the student, and the quality of the pertinent systems course(s).

a) Information given in the systems courses is not functional knowledge until:
- the student has had to apply it;
- someone has guided the integration process--usually by providing an approach to problem solving in a particular case.

b) The systems courses provide important, relevant information for clinical problem solving.
Factors which influence transfer of that information to the clinical situation include:
- the temporal placement of the systems course in the curriculum;
- the student's confidence in his/her grasp of the information;
- the opportunity to apply the information in a clinical situation;
- the immediacy of the relevant clinical practice.

c) The patient, either in the immediate clinical situation or as a post-hoc conjured-up mental image of past experience, is the most powerful stimulator and motivator of learning and of organizing information into functional knowledge.

d) Students, in the absence of immediate, relevant clinical experience, are variously able to organize/integrate information into clinically-operable knowledge. Individual ability to do so seems to be idiosyncratic and may have more to do with general learning/studying styles and degree of dependence on clinical modeling than with premedical clinical experience or academic background.

3. Students use various means to increase their medical science knowledge in order to enhance clinical performance.

a) Students primarily use clinical reference texts to understand clinical problems and their management; those lacking confidence in their science background will review basic medical science texts as well.

b) Some students create "peripheral brains"--clinical case notebooks--as a mechanism for drawing together all of the pertinent features of clinical cases and as a quick reference for future case work-ups.

Summary

This chapter has presented the analysis of in-depth interviews of students at three levels of an osteopathic medical education program, intended to identify the variables in the process of developing clinical competence. Two central issues focused the interviews and the presentation of findings in this chapter: (1) explanations for competence in performing the history and physical examination and in diagnosing and managing medical problems; and (2) the integration of theory and practice. Presented for each level of the program were the results of a content analysis of each interview regarding the specific explanations for the clinical competence previously described in Chapter IV. Frequency counts of the coded responses, based on the pre-medical experience of the subject, were presented, as were analyses of statistical correlations of paired explanations. Similarly, the results of a content analysis of Unit II subjects' views of the relationship of theory and practice in developing clinical competence were also presented.

Variables in the teaching/learning process perceived by students at each of the three levels of the program were identified and discussed. The subject's pre-medical experience was seen to influence the manner and degree to which the variables affect the development of clinical competence, just as Chapter IV concluded that such experience affected the subject's clinical competence. In addition, students' perceptions of the relationship of theory and practice and their influence on clinical competence were presented and discussed. Conclusions and recommendations regarding the implications of these findings for osteopathic medical education and future research are presented in Chapter VI.

CHAPTER VI

INSIGHTS, CONCLUSIONS, AND RECOMMENDATIONS

The present study was intended to provide insight into the nature of clinical competence development through the perceptions of medical students at one college of osteopathic medicine.
The study sought, through in-depth interviews, students' perceptions towards answering two central questions:

- Can acquired clinical competence be described for students at the end of each of the three phases of the educational process?
- What factors influence clinical competence development?

The preceding two chapters presented the results of the data analyses as objectively as possible. No particular attempt was made to evaluate students' perceptions, to propose relationships among the issues that emerged, or to suggest any implications of the findings for curriculum planners. It was the investigator's intent to present as complete information as possible in order to facilitate the reader's own analysis and conclusions.

This chapter presents the investigator's analysis of the descriptions offered in Chapters IV and V. The analysis attempts to draw from the case study elements and processes of competence development which need to be considered when attempting to operationalize the theoretical construct of competence-based osteopathic medical education. The chapter is organized in three parts. The first part will present conclusions drawn from the analysis of students' descriptions of their competence and the conditions of competence development. The second part of the chapter will re-examine the preliminary conceptualization of the continuum of clinical competence development presented in Chapter I. And the third part will present recommendations: first, for additional research thought necessary to clarify and extend the conception of the competence-based curriculum model; second, for administrative considerations in developing the curriculum model.

Elements and Processes in the Continuum of Clinical Competence Development

As Chapter IV revealed, students' perceptions of their clinical knowledge and skills are very different at each of the three levels of the program. What they describe as their competence may or may not be what faculty intend, and it may or may not conform to what faculty actually see students do at a particular level. But what students describe are fairly consistent intra-group and inter-group perceptions of competence at a given level. Similarly, Chapter V revealed that students' perceptions of the process by which they did or didn't develop certain competence may or may not conform with faculty perceptions of the instructional process, but students' perceptions of their experiences are relatively consistent from class to class. It is these consistencies which provide the basis for the conclusions about elements of competence and processes of competence development which need to be considered in developing competence-based osteopathic medical education.

The Elements of Clinical Competence

This study did not specifically describe the clinical competence acquired by students during their undergraduate osteopathic medical education; rather, it identified concepts and issues to be considered in designing a competence-based osteopathic medical education program. Professional competence is multidimensional, at least as described in theory, reflecting the practitioner's cognitive skills, psychomotor skills, attitudes and values, and medical philosophy. One cannot in reality, of course, separate these various aspects of behavior, but examination of the separate elements facilitates discussion.
Cognitive Skills: Four elements of cognitive competence clearly emerge from the study interviews: medical/scientific knowledge, knowledge of the clinical environment, information processing skills, and self-evaluation skills.

The study subjects explicitly described their cognitive skills in terms of "medical knowledge," "knowledge of pathology," and "clinical knowledge." From this perspective there are obvious differences in the cognitive skills of students at each of the three levels--differences which, it turns out, made the primary difference in what data were collected and what sense could be made of the data. It seems that until knowledge is perceived as "medical," students do not use existing knowledge to solve problems. For example, few students except those with complete mastery of a particular discipline (such as individuals with graduate-level training) described using their knowledge of anatomy or physiology to contemplate the meaning of data or to resolve problems with technical procedures. While there was an assumption that the basic science courses are important to understanding "medical" courses, students seldom described using that information to enhance or analyze other skills, like physical examination procedures. First year students, for example, denied thinking through procedures they learned in skills laboratories in terms of anatomy or physiology. They no doubt did apply basic science knowledge in those instances, but not consciously.

A second kind of cognitive skill was implicit in students' descriptions: knowledge of health care delivery systems. With the exception of medically-experienced students, Unit I students had little frame of reference for their tasks when they were placed in the clinical context. They seemed to interpret the context they confronted in terms of existing knowledge, primarily as psychological matters of interpersonal relations. Few interpreted the context in terms of social systems; for example, few saw themselves as intervening in an existing dynamic of nurse-patient-doctor role relationships and sought assistance or approval from all parties. The breadth of perspective of health care systems can carry over into problem solving skills; for example, seeing patient behavior or illness as simply a psychological phenomenon limits the depth of understanding and constrains the problem solving perspective. It also may result in students misinterpreting their interaction with the patient, and thereby taking undue responsibility for problems encountered in attempting to carry out clinical tasks.

Unit II students seemed to start very nearly at the same point as Unit I students when they entered their first preceptorship. Inexperienced students initially could not anticipate the processes of the ambulatory care setting and their possible role in that setting. Much of their energy and time in the early preceptorships was spent finding a way to "fit" into the clinical context. With each succeeding experience (term) they gained more empirical information with which to describe for themselves the "ambulatory care delivery system," and as that schema developed they more quickly and effectively adapted to new situations. Unit III students were confronted with a similar learning challenge. At first they had, depending on prior experience, limited knowledge of the hospital setting, and although some aspects of their knowledge of ambulatory delivery systems transferred to the hospital setting, they had limited ability to anticipate what and how "to do."
And, again, with each succeeding rotation they gained more confidence and skill in interacting with the environment. Students who had previous experience through other occupational roles in any of these settings had a cognitive base (schema) that placed them at considerable advantage over their more naive peers. However, this knowledge was constructed from the perspective of that role, not the role of a physician. These students were often on the horns of a dilemma: they had a different perspective from which to view and develop professional skills--often more akin to what the patient and other health workers would have physicians be--but their perspective was often not valued and built upon by physician mentors. Students who continue to operate from this former-role perspective and who do not expand it to a new physician perspective can also hamper their professional development by being unable to consider information which is inconsistent with past experience or by being unable to assertively try tasks they previously were not allowed to do.

A third aspect of cognitive skills, that of the psychological processes of cognition, was less obvious in these students' perceptions. There is a disquieting feeling that some students are functioning at a very low level of cognition throughout their medical training. They used recall and recognition in the first two years of didactic course work, and they appeared to at least attempt to use a similar tactic in the clinical arena: they learned protocols for diagnosing and treating, and with repetition they were able to apply those protocols in future similar situations. There is the impression that students generally perceived that there were "right" and "wrong" diagnoses and treatments, and that one had only to recognize the signs and symptoms and to match them with some "correct" list of problems and first-step treatment--whether the problem was presented as a patient in the clinical setting or as a paper and pencil case in the classroom. This approach is reinforced in some measure by mentors who ask the student, immediately upon seeing a patient who presents in a classical way, "What is it?!"

It is assumed in medical education that the student is gaining medical problem solving skills. There is evidence that these students are increasingly able to solve medical problems, but how they solve them is not clear. There is no clear evidence that students gained skills in how to think about professional situations and patient problems, so much as they were being trained in what to think and do when confronted with specific "disease" problems.
Of particular interest and concern are individuals (often those with extensive medically-related past experience) who perceived themselves as good as or better than general practitioners in performing routine ambulatory care tasks, and therefore "coasted" through preceptorship experiences. Some of these students appeared not to formulate goals for themselves which went beyond that of their past role or that of their peers. Conclusions. These insights into students' descriptions of their cognitive development support the following conclusions. First, it is important for curriculum planners to keep in mind that there are different aspects of "cognitive competence," all of which are learned skills and require guided learning, reinforcement, evaluation and continual up—grading. Second, a continuum of clinical competence development should include increasing cognitive skills in thinking and learning, as well as information gathering. It is likely that students can more efficiently independently teach themselves the factual content of a medical curriculum than they can teach themselves how to think and problem solve, to set realistic personal development goals, or to plan and carry out 233 developmental strategies; yet, most instruction is pre-occupied with conveying great hordes of information. Research has consistently shown that, irrespective of the nature of the information, it is quickly forgotten if it isn't utilized in some way which is meaningful to the students. Third, students need some cognitive map of their medical education; one which includes both structure (what courses and experience they will encounter), and function (how those courses and experiences relate to professional competence development). Given only a map of the structure and left to their own devices to figure out how the curriculum can work for their personal professional development, students tend to let the structure dictate function: they develop recall and recognition skills and try to apply them in life situations; they follow course protocols and assume they have learned enough and what is essential; they study for exams rather than to learn. This suggests also that examinations are very influential aspects of the structure of a curriculum. Psyehomotor Skills: Four issues became apparent as students described the development of their technical skills: the dependence of technical skills on eegpitive competence; phases of skill development; importance of perspective in definingpersonal standards of performance; and the impact of the clinical mactice environment on competence development. Although this study did not investigate what technical procedures students develop at each of the levels of training, it did concern itself with the physical examination and student's perceptions of their competence in the performance of its individual procedures. Students described an increasing proficiency and comfortableness in the performance of the physical examination. But when one examines what students say, one finds that they are most often talking about the cognitive aspect of the physical examination: knowing what the findings mean; knowing the medical terminology to describe findings; knowing what examinations to do, in what depth, for a particular situation. 
Cognition is inextricably woven into the psychomotor performance; as was described above, it is medical knowledge that undergirds the concepts of "thoroughness," "accuracy," and "completeness" of a physical examination, whereas psychomotor skills are the basis of the "proficiency" of the performance of the procedures.

One interesting aspect of the development of proficiency in performing the physical examination is the focal point of the learner's attention--what might be called the "self-other" phenomenon. When learning a new procedure, students describe themselves as being preoccupied with the mechanics of the procedure itself, to the exclusion of considering either themselves or the person on whom they are performing. They appear to be able to attend either to the person, or to the procedures, or to how they are doing them. For example, while learning new manipulative skills, students are often unaware of the position of their own bodies until it is pointed out to them that their postures are inhibiting them from correctly or more efficiently performing the technique; or, in their intense effort to perform the funduscopic examination correctly, they may be inattentive to the patient's discomfort. With regard to the physical examination, this phase of learning is done in the simulation laboratory.

When, however, they perform the physical examination on a "real" patient, they become acutely aware of themselves and how they perceive themselves to be seen by the patient. They are more concerned with the overall organization of the procedure than with individual components of each task, which presumably have been mastered. In an effort to at least appear proficient, these subjects tended not to pursue a procedure if it inconvenienced or annoyed the patient, even if they had not collected sufficient or accurate data. While this is likely an artifact of the design of the learning experience (Unit I students assumed it did not matter whether they got accurate data or not), this attention to subjective matters does seem to be a common phenomenon of second-stage learners.

For most students this self-conscious stage of performance persists beyond the point at which they have mastered the technical procedures. Not until they have gained confidence in their knowledge and are able to interpret findings reasonably well can they transcend the procedures and their own concerns to truly attend to the objective features of the clinical situation presented by the individual patient. At this stage there is the risk of the student being preoccupied with gaining accurate and thorough information and dealing with the complications of the problem solving process, and therefore losing touch with the subjective aspects of the data collection process. Theoretically, as experience and knowledge are gained, the student will be better able to adjust procedures and processes to both the subjective and objective uniqueness of patients; however, most students need guidance as well as encouragement in order to develop this most advanced level of psychomotor skill. The opportunity to encounter clinical situations which require modifications of procedures is essential for developing that competence, but the opportunity must be presented at a time when the student is ready for that challenge. For example, first year students who were confronted with patients who presented complicating conditions (infants, retracted limbs, oxygen therapy, etc.) could not adjust to those situations and were frustrated by them.
These differences in stages of learning technical skills are important to keep in mind when establishing standards of performance and when designing clinical experiences. Students will likely be more frustrated than motivated by evaluations which include criteria for performance that are beyond their ken. Also, in the early stages of developing procedural proficiency, students are very slow and deliberate. In the clinical settings they are encouraged and rewarded for increasing their speed. Due to the patient-service orientation of the clinical setting, in contrast to an education orientation, physicians cannot take time to supervise students if they take an hour or longer to do a complete history and physical examination. Therefore, if students take the necessary time, they usually do not receive the supervision and feedback they need to improve their skills. If, on the other hand, they attempt to fit into the time constraints of the practice, they either develop strategies to expedite their examinations or perform only part of the total process. In either case the student may be jeopardizing his/her competence development.

The most common experience of these students, at all levels, was to have no one attend them while performing the history and physical examination in the clinical setting. Under these circumstances students seem to gain confidence in their ability to establish rapport with patients and no doubt successfully experiment in finding their own interpersonal approach. They do not, however, gain confidence in their own assessments of what and how they should be performing.

Insufficient data were collected regarding the development of competence in performing procedures which are less cognitively involved than the physical examination to contrast their development. A few students described their skills in such technical procedures as drawing blood, suturing, delivering babies, and performing rectal and pelvic examinations. The students often expressed an unexpected (and disturbing) readiness to claim proficiency after only several trials, even if they had been unsuccessful. One interesting difference in male and female perspectives on their proficiency in performing the pelvic examination may point up a variable in developing personal standards of performance. Women students expressed concern for the thoroughness of their performance and the patient's comfort, whereas male students tended to judge their competence in performing the pelvic examination on the basis of their own comfortableness in doing the procedures. This difference seems understandable, given that males have no cognitive schema for the experience from the patient's perspective. This may suggest that personal experience as the receiver of the professional service may be an important variable in setting personal standards of performance. It also suggests that instruction should attend to subjective aspects of the procedures and be particularly aware of the student's personal perspective. It further suggests that program performance standards must be carefully evaluated to ensure that they consider patient as well as provider perspectives.

Conclusions. These insights into students' descriptions of their performance support certain conclusions regarding the development of psychomotor competence. First, curriculum planners and instructors should keep in mind the distinction between and among the various aspects of task performance--thoroughness, completeness, accuracy and proficiency--and the role cognition plays in each.
Since cognitive competence is continuous, task performance must be an evolving competence and should be guided and evaluated according to different standards at each level. Second, the development of proficiency evolves through stages. Early stages of learning are probably best undertaken in safe, educationally-oriented environments where sufficient time and supervision can be provided; but later stages need supervision and continuous evaluation as well. Early stages of learning can be best supported with simulated patients, where the student does not have to be unnecessarily concerned with the welfare of the patient. Later stages of learning require situational challenges: different contexts and different patient circumstances which require the student to adapt himself/herself and the procedures to the extenuating circumstances. Third, self-confidence is an important aspect of competence development which requires independent responsibility for task performance in the clinical setting. However, while the development of self-confidence and personal style are essential components of professional competence, their development at the cost of accuracy, thoroughness, proficiency and adaptability cannot be supported in a competence-based program. If it is the case that there are stages of psychomotor development--stages that depend on accumulating knowledge, different clinical experiences, and training in the basic procedural technique--different instructional objectives and standards for performance are called for at each stage of the program. Without supervision, evaluation and remediation, students are not likely to successfully progress through all stages of psychomotor development, or to develop valid standards of performance by which to guide their own professional development.

Attitudes and Values: Several themes emerge from the changing attitudes of students: an increasing identification with the physician role; assumption of an orthodox view of medicine; an increasing need to rely on self-teaching; and increasing disassociation from the educational program.

One can sense from students' comments an increasing identification with the physician role and attendant belief systems. Unit I students are just that--students. For all intents and purposes they are lay-persons. They are uncomfortable with medical terminology; many resist using it for fear they will sound silly or will too quickly lose their lay-value orientation. If they had a previous occupational role, they continue to look upon themselves as a "biologist," "nurse," or "social worker" who is in medical school. In any case they view themselves as "pretenders." Much of their energies in their first clinical experience are devoted to dealing with this role conflict. Unit II students are also self-conscious, but as they gain knowledge and skills they begin to see themselves as "being able to be a physician." Unit III students become increasingly intent on learning skills with which to assume next year's responsibilities as an intern. If at the outset of Unit III they adopt a passive student role, they quickly become aware that all too soon they will be D.O.s and will be held personally accountable for being able to perform in that role. It must be kept in mind that Unit III students in this program were involved in a very unstructured clinical training program, one that encouraged, even required, that individual students assertively pursue their own learning through assuming increasing patient care responsibility.
In more academically structured and group-oriented clinical training programs, students may not so quickly identify with the physician role.

There also appear to be changing attitudes and values regarding what "medicine" is. Unit I and II students generally are able to explore and/or maintain certain personal ideologic and philosophical views about medical care. During Unit II students are challenged with competing perspectives: systems biology courses are disease and drug-treatment oriented; comprehensive patient evaluation and management courses emphasize osteopathic management of health and illness; and preceptors vary in their approach to health. Frequently theory and practice are strikingly different. Nonetheless, through selecting or encountering preceptors who approximate their ideal, students can attempt to reinforce their own views of osteopathic medicine. Unit III, on the other hand, is predominantly hospital-based and specialty-oriented, as will be the internship. It is here that students find out what "medicine" is really all about. Previously espoused notions about "wholism," "continuity of care," and "health maintenance" are no longer reported by the students. They appear preoccupied with learning what they need to survive in the hospital environment. And having become acutely aware of the enormous amount of knowledge one must have in order to "properly practice family medicine," and of their limited grasp of even the fundamentals, they begin to think about limiting the scope of their future practice in order to define a manageable body of knowledge. In other words, as sociological studies have consistently shown about medical students, osteopathic medical students conform to the concept of medicine prevailing in the practice setting. The students describe that prevailing value system as: defining, diagnosing and treating disease, using the tools of scientific, technological medicine.

Along with this demand to "learn medicine," students demonstrate increased motivation to learn and to manage their own professional development. On a passive-active continuum, Unit I students describe themselves as extremely passive learners--letting examinations define for them what and how they should learn. Unit II students are somewhere in the mid-ground: lectures and examinations in large measure define what they learn, but they exercise increasing self-direction in when and how they learn it, and they are confronted with a new learning challenge in the clinical setting. Unit III students are necessarily the most active learners; for the most part they have to define what to learn, where to find the information, and how to integrate it into the practice of medicine. Students in the Unit III clinical program are also required to develop these skills in isolation from their peers, whereas in Units I and II teaching was a group process and students generally involved themselves in study groups and/or social support groups. Little social interaction with peers was reported by the Unit III subjects. Some described the isolation as the most difficult aspect of adjusting to the clinical situation. Without peer interaction or the availability of faculty and staff with whom to discuss personal and professional development issues, students are left to their own resources to cope with the pressure of this most important phase of their medical school program. The isolation from college staff and faculty apparently has engendered a sense of alienation.
Unit III students express little loyalty to the college; many are excessively critical.

Conclusions. Changes in students' attitudes and values are significant issues for CBE program planners. First, students' ability and willingness to accept responsibility for their own learning and professional development is an important goal of competence-based education. Abrupt changes in demands for independent learning, as experienced when students progress from Unit II to Unit III, create problems for students--more for some than others. The instructional process should be designed to provide students with the skills needed to assume increasing responsibility for managing their own learning. At a certain point, however, self-reliance may become dysfunctional, since physicians' social role requires that they interact with a network of colleagues in providing care to their patients. The ability to learn from colleagues and to assess the expertise of referring physicians are skills which may best be developed through group learning activities in the medical school.

Second, as students become increasingly independent learners and self-identified as physicians, it can be expected that they will change their relationship with the college. However, it is expected that administrators would want graduates to leave with a sense of regard and loyalty to the college. Graduates' attitudes towards the college and their education are an important political issue for an institution, for their conception of what Baldridge (1975) called the "saga" of the school has a great deal to do with its public support, the quality of future candidates, the quality of future faculty and, ultimately, the quality of its program. The saga, to a large extent, evolves from the behavior of the graduates and the myths they perpetuate about the school, faculty and curriculum. A professional school, regardless of its curriculum design, is well-advised to be aware of what its graduates think and are saying about their alma mater.

Third, professional education programs can expect that both students and faculty will have varying value systems regarding health, illness, medical care and the physician's role in the health care delivery system. Yet there must be some agreed-upon standards of quality care, student and physician ethics, personal integrity, etc., in order to define standards of performance for competence. In a CBE program, failure to perform according to these standards is evidence of a student's lack of competence. By explicitly stating codes of behavior, standards of care, etc., potential students (and faculty) can make an informed choice about attending such a medical school. Implicit in their acceptance of a position is an agreement to be judged by those standards.

Fourth, program goals do reflect certain values and attitudes which must be sustained by the educational program. For instance, a program proposing to prepare osteopathic physicians for general practice should be designed so as to engender and support the values and attitudes, as well as the cognitive and psychomotor skills, that denote competence for such practice. Certain of these value issues are directly and inseparably related to the philosophy of medicine which forms the basis for the educational program.
Medical Philosophy: This study began with the assumption that the educational program was founded in the philosophic tenets of osteopathic medicine and, while it was not assumed that all entering students would be fully cognizant of and/or committed to those principles, it was assumed that students would develop skills in and appreciation for osteopathic medical practice through their course of study. Three impressions were formed from students' comments: Units I and II do increase students' awareness, interest and skills in osteopathic medical practice, but Unit III does not reinforce those values and skills; students' perceptions of an osteopathic medical philosophy are variable; and the student selection process is probably the most significant variable in determining who and how many graduates will ascribe to the philosophic tenets of osteopathic medicine.

Across all levels of the program there were subjects who had well thought out conceptions of an osteopathic medical philosophy--although these were not necessarily similar from student to student; there were students who distinguished osteopathic from orthodox medicine by its manipulative therapy modality; and there were yet others who saw no essential or important difference between osteopathic and orthodox medicine. The small percentage of individuals who had specifically sought osteopathic medical training because of its traditional principles appeared to persist in holding that philosophy across the years of training. Those who had developed, during the first two years, an appreciation for the potential of osteopathic manipulative therapy were generally persuaded by their clinical experiences that practice guided by those principles is economically impractical. The prevailing belief of students was that osteopathic physicians do not use a problem solving perspective that is different from orthodox physicians, but that they have a useful, ancillary treatment modality should they choose to use it. While Unit II students have opportunities to work with physicians who utilize manipulative therapy in their practice, Unit III students seldom encounter that opportunity. Generally that opportunity is offered in the Jr. Partnership experience, if at all. Lack of practice with OMT in Unit III causes students to doubt their proficiency, even if they had prided themselves on those skills in Unit II.

The issue of problem solving was a different matter. Unless there was clearly a structural problem and it was the chief complaint of the patient, students at both the Unit II and Unit III levels, with rare exception, could not describe how an osteopathic physician would approach a medical or health problem differently from an orthodox physician. There are several possible explanations for this phenomenon, including:

- the osteopathic medical problem solving approach is covertly taught and modeled, such that neither practitioners nor students are conscious of its distinctive features;
- clinical faculty do not explicate their problem solving paradigm, and therefore students cue on the outcomes of the process;
- there is currently little or no difference in these two medical philosophies;
- the essential difference between orthodox and osteopathic medical practice is the absence or presence of competence in managing structural dysfunction as a primary health problem.
Unit III students expressed considerable dissatisfaction with the political position of the profession with regard to graduate education, expressing the opinion that the profession should approve residency programs, regardless of the sponsoring medical profession, if they meet the criteria of a quality program. They defined quality in terms of: availability of academic teaching; provision of training in research; opportunities for learning innovative approaches to specialty care; the academic credentials of the teaching staff; and/or the opportunity to work with sufficient numbers and kinds of patients in the specialty area. These views seem to support the concluding generalization that students increasingly ascribe to an orthodox view of medicine and that, at least in hospital-based care, they cannot or do not distinguish between the practice of osteopathic medicine and other medical philosophies.

Conclusions. These insights into students' perceptions of osteopathic medical philosophy suggest several implications for competence-based osteopathic medical education (CBOME). First, competence-based education, by definition, must be guided by a philosophy, for the philosophy dictates the parameters of competence: skills, values, and knowledge. Without an explicit statement of philosophy a CBOME program cannot be properly designed or evaluated, nor can valid standards of student performance be established. Second, educational programs which prepare students for an occupation are to a large extent defined by the practitioners of that occupation and by societal expectations and pressures. An educational program can, of course, take a particular political or philosophic view with the intent of being an agent for change within that occupation, but planners of such programs must recognize that their students likely will be confronted with differing views when they undertake experiential learning in the community setting. If there is not sufficient reinforcement of the program's principles offered by these mentors, students will likely be socialized to conform to the value system of the practitioner community. Careful selection of both students and clinical faculty is probably essential when the program's principles differ from the societal or professional view. Third, instruction must guide students' attempts to articulate and operationalize a philosophy. If, for example, there is an osteopathic approach to problem solving which students are expected to master, faculty must be able to articulate and model the approach, and evaluate students' knowledge and skills in using it, in order for students to conceptualize that approach and to develop skills in problem solving using its principles.

Summary: Four continuums of competence which should be defined for each professional task and for each level of the educational program were identified. The relationship of these continuums to the task and training level is shown in Figure 6.1.

Figure 6.1
ELEMENTS IN DEFINING CLINICAL COMPETENCE
[Figure: a matrix relating professional tasks and levels of training to the continuums of competence--cognitive development, psychomotor development, attitudes/values, and philosophy.]

The four continuums of clinical competence were proposed to incorporate certain specific elements and/or issues:

1. Philosophic Perspective.
Not only should the particular medical philosophy guide the content and process of the curriculum, but students should be held accountable for acquiring increasing understanding of its implications for practice and for demonstrating increasing skill in performing professional tasks in accordance with its principles. The philosophy must be explicated and modeled by faculty, and clinical experiences should reinforce the philosophic principles and consciously guide students in their operationalization. It is expected that graduates will not only understand but endorse the professional philosophy.

2. Cognitive Development. Cognition is the foundation of professional behavior; its acquisition is guided by the individual's personal values and philosophy, as well as by learning experiences. There are at least four aspects of the continuum of cognitive development: formal (theoretical) knowledge acquisition; tacit (experiential/phenomenological) knowledge acquisition (including knowledge of learning/practice environments); information processing skill development; and metacognition skill development (skill in planning, managing and evaluating one's own cognitive development). Competence-based education must be designed to coordinate the development of all essential components of cognitive competence, without assuming that development of one aspect necessarily ensures the development of any other.

3. Psychomotor Development. There are at least four aspects of psychomotor performance: accuracy, completeness, thoroughness, and proficiency, all of which are dependent on cognitive development and practical experience. Each aspect can, in theory, be defined for a particular level of training, and such definitions would be expected to differ from those for the professional level of competence. Factors varying according to the level of training include: speed, manual dexterity, social-psychological orientation, adaptability, organization of the task, and cognitive competence.

4. Attitudinal Orientation. There are at least three attitudinal elements, in addition to philosophy: those related to students' attitudes and behaviors towards their own learning, towards those on whom they practice, and towards the educational program and its staff. Ethics, integrity, self-awareness, responsibility and respect are critical elements of professional competence. Students are presumed to enter medical school with those basic values and behaviors, but the educational program must have clear expectations and provide experiences which allow the individual student to integrate those entering values into his/her professional competence.

Variables in Developing Clinical Competence

It has been the perspective of this study that the design of CBE programs must be concerned with both content and process. The content should be determined by the definition of competence, which, in turn, is guided by the parameters of the professional role and by the philosophy of the program. The processes--instructional, learning and administrative--by which the content is organized and delivered and competence development is facilitated must be congruent with the program's definition of competence and its educational philosophy.
From students' perceptions of how and why they did or did not develop certain clinical competence, there appeared to emerge six variables in the teaching/learning processes which affect competence development: students' accumulated knowledge and skills; the clarity of the instructional goals and philosophy; the congruity of goals and philosophy with instructional design; integration of theory and practice; the context of learning; and instruction and role modeling. Each of the variables appears to have some direct and/or mediating effect on the development of clinical competence. Students' descriptions also suggest that the variables are interactive.

Students' Accumulated Knowledge and Skills: This study has persistently pointed to general differences in students' knowledge, skills and perspective at each of the three levels of training, and to specific differences within each group based on the individual student's pre-medical experiences. Cumulative knowledge and skills determined not only what students could do in the clinical situation, but affected how they thought about what they did, what personal standards they established, how they utilized learning resources, and what they expected of the educational program. It is assumed that all life experience influences what one is able and willing to see and do in the situation. In the case of these students, prior experience in an occupation which practices in the clinical setting and involves knowledge and/or skills similar to those of the physician was shown to also have a significant influence on what is learned and done. The better able students were to anticipate the practice setting to which they were assigned, the more comfortable they were and, it appears, the more rapidly they adapted to the environment and the better able they were to manage their learning. Similarly, the better able they were to match theory with past experience, the more efficient their cognitive development with regard to scientific knowledge acquisition. Prior medically-related experience provided students with both the "language of medicine" and the "syntax" of that language. That is, while naive students were learning the names of the parts of medical science, experienced students could arrange the parts into meaningful relationships. This also seems a plausible explanation for the difference in clinical competence at the various levels of training.

On the other hand, experienced students placed alongside naive peers in simulation laboratories or clinical experiences did not necessarily attempt to extend their competence. And extensive experience was even proposed as an explanation for inappropriately weighing the significance of information and valuing the source of that information; i.e., information that did not jibe with the student's experience was given lower priority for learning. One's life experience appeared not only to guide what was learned, but also to guide how one processed information. The impression gained from this study is that students attempt to understand and analyze new information and situations using the perspectives and cognitive skills with which they entered that circumstance. Hence, different students "see" different things and, in effect, learn different things.

Conclusion. These insights support the conclusion that a CBOME program must take into account students' prior experiences. Standards of performance for each of the three levels of the study curriculum should be different.
And, since competence is an individual matter, individual standards (within groups) should be different. In theory, differences among individuals should be accommodated by a CBE program. For example, simulation laboratories and clinical experiences might be designed differently for certain experienced students, taking into account the advantages and the limitations resulting from such experiences; or students might be allowed to test out of skills training courses and/or clinical experiences, just as students are allowed to test out of didactic courses. Ideally, the perspectives of students and the manner in which they solve problems would be analyzed so that instruction could enhance the individual's ability to solve medical problems. And ideally, students would be prepared for clinical experiences in such a way that they could anticipate the forthcoming experiences, thus enhancing the educational potential of the experience. It should be remembered that students do individualize their learning, whether the program intends for them to do so or not. CBE consciously directs that individual effort to ensure professional-level competence, as defined by the program.

Clarity of Curriculum Goals and Philosophy: Students in the study program described at least two areas of program goals and philosophy in which they perceived a lack of clarity: the osteopathic philosophy and the purpose of the clinical experiences. Presumably the osteopathic philosophy which provides the foundation for an educational program reflects the philosophy of the profession. However, study subjects offered a range of definitions of the osteopathic philosophy, and there appeared to be more uncertainty about its meaning at the later stage of training than at earlier stages. Students entered the training program with or without a clear conception of the osteopathic philosophy and apparently graduated holding a similar view. It appears that the view most commonly held at graduation is that what distinguishes osteopathic medicine is manipulative therapy, but that few practitioners employ the treatment modality.

Students also seem not to have a clear notion of the relationship of their clinical experiences to the rest of their educational program. The study supports students' perceptions that Unit I and II clinical experiences are "enrichers" in that the experiences do facilitate integration of theory and practice. The skills laboratories and preceptorships were pointed to as the most appealing aspect of the program--"what makes our program different"--yet students were uncertain as to the goals of those experiences. The didactic courses were consistently described as having priority over clinical experiences. It is, however, obvious to students that Unit III clinical experiences are intended to help develop clinical competence. Students define competence for themselves by anticipating what they will have to do as interns, not by examining program goals and objectives. It appears that students rather quickly take the view that the Unit III clinical program is not a part of the instructional program of the College, but rather is only administratively linked to the College.

Conclusions. These generalizations bring into focus several issues for CBE planners. It is the assumption of CBE that, while the definition of competence is grounded in the current state of the art, it need not be limited to it, and that it is the program goals which define graduates' competence and the curriculum structure and processes.
The program goals and processes are described within the particular philosophic framework adopted by the faculty; i.e., the faculty must define an osteopathic educational philosophy and develop program goals and instructional processes which are congruent with those goals. The more explicit those statements, the more effectively the program can be monitored and refined. Students, as well as faculty, should understand the rationale for the curriculum design.

The Congruity of the Curriculum with Program Goals and Instructional Design: The ambiguity regarding program goals and philosophy perceived by the study subjects was often explained by them in terms of curriculum design and instructional process. The structure of courses and/or the clinical faculty's perspectives were described as the reasons for students' views on osteopathic philosophy. For example, the separate and distinct courses and faculty which deal with osteopathic diagnosis and therapy were contrasted with courses and faculty that did not integrate or even endorse osteopathic principles. Few students described having experiences in Unit III which reinforced osteopathic principles.

The structure of the curriculum also influenced the way in which students perceived the significance of clinical experiences and what and how they learned. The demands of the didactic program were of primary concern to the students; since the clinical experiences in Units I and II took valuable time, but offered few penalties or rewards, students devoted most of their time and energy to the didactic program. For example, Unit I students reported conscientious plans to prepare for their first hospital H/P, but invariably they modified or eliminated those plans in favor of studying for exams; and Unit II students attempted to plan preceptorships to accommodate the demands of the didactic program.

Examinations appear to be a potent structural variable. Examinations provide students with the best indicator of what the curriculum intends for them to learn, how they should learn and what they should value. In Units I and II, examinations in didactic courses were given greater priority than were clinical experiences. The absence of feedback, such as was reported for preceptorship case write-ups and Unit I H/P write-ups, resulted in students reducing the effort they put into those activities. The absence of examinations in Unit III did not appear to diminish the value of those experiences, but it did appear to contribute to individuals' uncertainty about competence (theirs and that of certain of their peers). Evaluation, formal and informal, appears to be a critical aspect of competence development.

Students also attributed their difficulties in adapting to clinical learning to the structure of the systems courses. Even though students generally thought the systems approach a good one, and believed that the courses included what was needed for the clinical externship, they expressed frustration with their difficulties in making the transition from learning information organized according to the logic of the discipline to retrieving information required to solve real problems. Unit III students described themselves as being unable to efficiently retrieve and critique literature, to determine what they needed to study and to organize their learning effectively. Students also contended that adjunct clinical faculty were not fully cognizant of the program goals and the student's stage of professional development.
Despite being provided course outlines and objectives, clinical instructors, according to students, did not teach according to those objectives.

Conclusion. These insights add support to the previous conclusion that there are different aspects of the generic components of competence and that the CBE curriculum must provide opportunities to develop all aspects. It also is concluded that both the structure of the curriculum and the instructional processes must intentionally guide and reinforce program goals and philosophy. Evaluation is a critical aspect of CBE. For example, if a course in interviewing is included in the curriculum (and is well taught), but clinical faculty do not evaluate students' interviewing competence when they perform the history and physical examination, and according to criteria similar to those of the interviewing course, students are not likely to incorporate those theoretical principles within their clinical behavior. Similarly, if osteopathic principles and theory are taught separately from the principles of medicine, and faculty do not teach and model their integration and students are not evaluated on their ability to integrate them, it is likely that students will not develop competence in integrating osteopathic principles in their medical problem solving process. Whatever the program's definition of competence, instruction and evaluation must explicitly address those competence goals. This, of course, means that faculty must be supportive of the program goals and philosophy, and reinforce them through modeling, instruction, and evaluation. There is no expectation that all faculty will hold the same values or weigh equally those they do share. However, faculty must be able to endorse the philosophic statement of the educational program; hopefully, they will practice and teach in accordance with that philosophy.

Integration of Theory and Practice: Students described personal clinical experience as being the most effective means for integrating theory and practice. The integration of theory and practice was usually described in terms of integrating didactic material concerning a particular disease or body system into practice when confronting a clinical case representing that disease or system. The integration of such theory and practice appears to be a circular, rather than linear, process. On the one hand, students who had had the theory (of the urinary system, for example) described being able to perform more efficiently and confidently when confronted with a clinical problem (a urinary tract infection, for example). On the other hand, students who had first encountered the clinical case described better understanding and retention of theory when it was later presented. To complete the cycle of integration, the student needed to again encounter the clinical situation and be called upon to demonstrate understanding of the theory through performance. The more complex notion of integration with regard to the interrelationship of body systems in the expression of wellness and illness, as emphasized in osteopathic medicine, was also identified by students. They credited their accumulating knowledge with facilitating their increasing understanding of the theory and their efforts to integrate that theory into their diagnostic and management performance.
Students, however, expressed the need for more guidance and reinforcement in developing this approach to clinical problem solving; guidance which they described as totally or generally lacking, both in didactic courses and in clinical instruction. Students seldom described being held accountable for knowing theory when performing in the clinical setting, simulated and/or actual. Nor did they describe the clinical problem solving cases (CPSs) in systems courses as being cumulative and integrative.

Certain conditions enhance integration of theory and practice: the co-occurrence of their presentation; faculty modeling; explicit reinforcement of theory in didactic and clinical courses; increasing responsibility and accountability for (integrated) competence; and recall of patient images. Conversely, certain curriculum structures and instructional processes were described as inhibiting integration: skills courses not drawing on and evaluating knowledge of principles offered in basic and medical science courses; basic science courses not offering clinical examples; clinical instructors not reinforcing theory; clinical mentors modeling medical problem solving which is not consistent with theory; courses arranged without regard for their conceptual relationships; time lapses between theory and practice; and discontinuation of training and evaluation in basic skills.

Conclusion. The insights of study subjects commend a number of features to a CBE program in osteopathic medical education. First, the logic which prescribes the relationship of courses and experiences should derive from the competence the program intends to develop, competence being viewed as multidimensional and transdisciplinary. Neither courses/disciplines nor experiences can be considered ends in themselves. Nor is it satisfactory to assume that competence gained from each of a series of courses and/or experiences will be transferred to other courses and experiences, or that the sum total of that competence is professional competence as defined by the program. Both the structure of the curriculum and the instructional process must explicitly facilitate integration of accumulating theory, and students must be held accountable for integrating theory and practice in both the didactic and the clinical settings.

The Context of Learning: The environments in which students performed clinical tasks defined what students were expected and able to do. Each unit of the study curriculum presented students with increasingly complex medical problems, commensurate with their increasing knowledge and skills, thus facilitating continuing development of clinical competence.
Certain features of specific environments were described as inhibiting competence development: simulation laboratories provided limited opportunity for Unit I students to confront abnormal physical findings; non-ambulatory, hospitalized patients posed problems for Unit I students attempting to perform history and physical examinations; patients seen in preceptors' offices seldom presented with problems currently or previously studied in didactic courses; hospital-based patient care training in Unit III seldom concerned itself with certain issues presented in the on-campus program, such as comprehensive health care or osteopathic diagnosis and therapy; and any given clinical preceptorship or rotation might be unproductive for a student due to: a limited number of patients or limited types of medical problems; too high a patient service demand on the clinical instructor; inability or unwillingness of the clinician to teach to the student's level; personality or value conflicts between the student and the clinician and/or staff; patients' unwillingness to participate in the student's learning experience; and lack of a didactic component in the instruction. Learning experiences were also described as unproductive when the student was unable or unwilling to understand and adapt to the environment.

Conclusion. One of the precepts of CBE is that the learning environment must be supportive of the student's learning needs. These students' descriptions of their clinical experiences point up unique problems for competence-based, community-based osteopathic medical education programs. The number and kind of patients which will be seen by a clinician and the clinician's approach to care cannot be controlled by the College. Thus coordinating clinical and didactic experiences is difficult, if not impossible. However, the program administrators and faculty are responsible for ensuring that graduates have achieved the competence described by program goals; they must, therefore, anticipate and address problems inherent in the curriculum design and limitations of the resources available to the program. Curriculum planners can design strategies to enhance student learning, such as:

- enhancing simulation laboratory experiences through the use of such tools as A-V materials, trained simulated patients, and a patient bank;
- orienting students to clinical environments before they are expected to perform in them;
- explicating the types of patients on whom particular students should and should not perform;
- utilizing on-campus faculty for teaching in the clinical setting, particularly at the initial stages of each phase of training;
- developing protocols which outline preferred procedures for obtaining patient cooperation;
- making every effort to establish a cadre of informed, loyal and effective clinical adjunct faculty;
- providing students with explicit guidelines regarding their clinical competence development, standards of competence, and the purpose of the current experience;
- providing mechanisms by which students can be assured of a productive clinical experience;
- developing mechanisms for evaluating and remediating students' clinical competence deficiencies on an on-going basis;
- enhancing the theoretical content of clinical rotations through the use of such instructional strategies as developing CAI programs for each rotation, arranging group study projects, evaluating cognitive competence, offering instruction in particular clinical management problems using a workshop approach, and rotating clinical students through the campus program at strategic points.

Instruction/Role Modeling: Many of the issues related to the influence of instruction and role modeling on students' competence development have already been pointed out: consistency of personal philosophy with that of the program; familiarity with, commitment to and ability to reinforce the goals and objectives of the program; and skill in articulating, reinforcing and modeling integration of theory and practice. Students at all levels of the study program described dependence on instructors for their clinical competence development, although the degree and kind of instructional assistance varies according to the student's level of program, life experience and personal needs. Inexperienced students at each level described needing more specific instruction, direct supervision, and explicit feedback than did medically-experienced students. The focus of the instruction depended upon the students' level of training: Unit I students--understanding basic principles and techniques of the history and physical examination; Unit II students--integrating accumulating knowledge in the performance of the H/P and developing a medical problem solving paradigm; and Unit III students--refining the H/P and medical problem solving skills by integrating accumulating knowledge and skills of clinical medicine. Also, the amount of clinical experience appeared to affect the student's ability to deal with ambiguity and variations in instruction and clinical modeling.

Students described wide variations in instructors' awareness and skills in meeting students' different learning needs. Students viewed their competence development as being hampered if they were not: given clinical responsibility commensurate with their abilities; taught and held accountable for increasing skills in performing tasks, including the H/P; assigned learning/service tasks which enhanced their clinical competence development; offered explanations for clinical decisions made by mentors; given appropriate task assignments and/or given feedback which facilitated their improving performance; and provided sufficient time to gain the necessary knowledge to understand clinical problems. These factors in learning were described as applying to all levels of clinical training, and certain of them applied to didactic courses and assignments as well. And, as has been pointed out, the adjunct faculty are particularly influential in the students' socialization into the professional role.

Conclusion. These insights reinforce conclusions previously made. Program designers and faculty must have a clear conception of the differences in students' learning needs and competence at each level of the program. Faculty must be familiar with and support the goals of the program and the objectives which they are expected to teach. Individuals who teach in the clinical setting (both simulated and real) are probably the most important faculty within a professional program.
These individuals must be fully aware of and committed to meeting the special learning needs of the students with whom they interact, and they must understand and accept the responsibility for the competence of their students. Competence-based education program administrators, in turn, must take responsibility for ensuring that students have instructors who not only are appropriate role models but who can and do teach according to program objectives. It is doubtful that students can effectively learn, particularly at the first level and at early stages of succeeding levels, from either poor role models or poor teachers. Program planners should, therefore, take steps to optimize clinical instruction, for example: make program goals, objectives and philosophy clear to both faculty and students; establish standards of practice by which to evaluate potential instructors; conduct faculty development programs; devise reward systems for good instructors; carefully select/match instructors and students; mediate instructor-student difficulties; and make certain that students are prepared for the clinical learning situation.

Summary: Six variables in the clinical competence developmental process were identified. The relationship of these variables to the professional tasks studied and the continuums of competence previously described is shown in Figure 6.2.

[Figure 6.2: Defining Clinical Competence for an Individual at One Level of Training]

The six learning variables were proposed to incorporate certain specific issues:

1. Students' Accumulated Knowledge and Skills. Accumulated knowledge, skills, and behavior (competence) appear to be the most significant variables in learning. At each level of training, students gain knowledge and skill which enable them to "see and do" from a unique perspective. With each succeeding phase of training, knowledge and skills accumulate so as to change both the perspective and the competence with which professional tasks are undertaken. Although certain general differences in competence can be defined for each level of training, individual student differences transcend these general boundaries. Since competence is defined in terms of what students know and do, not what they have heard or seen, individual differences in competence are likely more significant than usually thought by planners of traditional programs. It is proposed that students' competence both affects and is affected by the other variables.

2. Clarity of Program Goals and Philosophy. The CBE program goals state the parameters of competence graduates must achieve. Medical students are highly motivated and capable, and given a clear conception of what it is they are to do, they can guide their learning towards those ends. If the program goals are not clear to faculty and/or students, it is unlikely that their energies will be directed towards program goals. When program goals are ambiguous, individuals pursue personal goals.

3. Congruity of Curriculum and Instructional Design with Program Goals. It is essential that the design of the curriculum and the instructional processes be congruent with the program goals and philosophy.
Certain structural features of the educational program (especially examinations and course relationships) and process features (especially modeling and reinforcement of theory in practice) direct what and how students learn, and therefore must be carefully designed in order that program goals be achieved.

4. Integration of Theory and Practice. Practice which reflects the integration of theory is the hallmark of professional competence. Curriculum designers must consider factors in curriculum design, course design, instruction and evaluation which influence how effectively students are able to integrate theory into practice. Modeling and verbal reinforcement of theory by clinical instructors, co-occurrence of theory and practice, and evaluation which holds students accountable for integrating theory into practice were shown to be important factors in students' development of professional competence.

5. Context of Learning. The clinical setting offers the most challenging and rewarding learning for medical students. Each clinical setting, including the simulation laboratory, presents the student with unique learning opportunities and challenges. Curriculum planners of CBE programs should provide an environment which optimizes students' learning. A number of factors appear to be involved in determining the appropriateness of the learning environment: the student's knowledge of the environmental context; the patient's acceptance of the student; the appropriateness of the medical problem for the student's level of competence; the clinical instructor's acceptance of and ability to carry out the faculty role; and the appropriateness of the clinical training for the program goals.

6. Instruction/Role Modeling. Regardless of how well a program is defined and designed, it is the instructional process which effects student learning. Through selection of instructional objectives, instructional techniques, role modeling, interpersonal interactions with students, and imposition of evaluation criteria, the faculty overtly and covertly convey to students what they should know and how they should behave. Practitioners (those in the community setting) appear to have the most influence on students in a community-based educational program. Curriculum planners must carefully select and train clinical faculty, and arrange students' experiences in order that program goals can be achieved.

Implications for Competence-Based Programs

Programs intentionally designed to produce professionally competent osteopathic physicians must carefully define what they intend, through both explicit statements of standards and criteria of student competence and statements of philosophy, goals and curriculum rationale. In order to effectively operationalize and maintain the intended program, certain issues related to the instructional process need to be considered. First, all constituents of the program must understand the program goals, philosophy and curriculum rationale. Second, program designers and faculty must have a clear conception of the differences in students' learning needs and competence at each level of the program. Third, every effort must be made to facilitate students' integration of theory and practice in all learning situations, through verbal reinforcement by instructors, co-occurrent practice and theory, modeling and evaluation mechanisms. Fourth, faculty must be familiar with and support the goals of the program and the objectives which they are expected to teach.
Fifth, individuals who teach in the clinical setting (both simulated and real) are probably the most important faculty within a professional program. These individuals must be fully aware of and committed to meeting the special learning needs of the students with whom they interact, and they must understand and accept the responsibility for the competence of their students. Sixth, competence-based education program administrators, in turn, must take responsibility for ensuring that students have instructors who not only are appropriate role models, but who can and do teach according to program objectives. It is doubtful that students can effectively learn from either poor role models or poor teachers, particularly at the first level and at early stages of succeeding levels. Program planners should, therefore, take steps to optimize clinical instruction by: making program goals, objectives and philosophy clear to both faculty and students; establishing standards of practice by which to evaluate potential instructors; conducting faculty development programs; devising merit systems for good instructors; carefully selecting/matching instructors and students; mediating instructor-student difficulties; and making certain that students are prepared for the clinical learning situation. Seventh, students' professional development must be monitored and guided by evaluation consistent with the program goals and philosophy.

Implicit in these notions of competence-based education is the single imperative that the structure and function of the program must be directed by and towards the program goal: the defined competence of graduates. Specific faculty, department and discipline educational goals must be consistent with and contributive towards that program goal.

Reconceptualization of the Continuum of Clinical Competence Development

The study began with an a priori conception of the continuum of clinical competence for the students in the case study program. The study intended both to test the viability of that conception and to identify variables which affect that developmental continuum. The study revealed certain inadequacies of the initial conceptualization and identified variables related to the student, the environment and instruction that affect the developmental process.

The Conception of Clinical Competence Development

Chapter I presented a preliminary conceptualization of the continuum of clinical competence development, and a statement of the relationship of the level of training to the elements which describe the continuum (presented here as Figure 6.3 and Table 6.1). While the conceptualization does reflect the apparent fact that there are differences in competence at each of the three program levels, the study revealed certain weaknesses in the preliminary statement: (1) it does not accurately reflect the complex nature of clinical competence; (2) it does not describe the nature of the differences in competence at each level; and (3) it does not clearly convey the difference in individual (actual) competence at a particular level of training. The study revealed that, without a clear statement of program goals framed within the program's medical philosophy to guide the development of clinical competence statements (course objectives), students inferred standards of performance from modeled behavior--which might or might not be consistent with program goals.
Thus, lists of professional tasks (as listed for element B in Figure 6.3) do not convey the multi-dimensional nature of competence, nor can they convey to students and faculty the specific definition of competence espoused by the educational program. Given a clear definition of terminal competence, however, the preliminary statement could provide a framework for conceptualizing an intended continuum of clinical competence for the study curriculum.

The differences in students' clinical competence at each of the three levels are not accurately conveyed in the preliminary statement due to the lack of: (1) the definition of terminal competence just mentioned, and (2) adequate descriptions of the students' response mode (elements H-M, Figure 6.3). The study revealed that students do the same tasks differently, as well as doing different tasks at each level; thus, responses such as accuracy (J), efficiency (K), confidence (L), and ethics (M) would need to be defined differently for each level. For example, the program should expect a high level of accuracy in student performance at all levels; however, accuracy for the Unit I student would probably be defined in terms of accurate placement of instruments, accurate descriptions of observed findings, etc., whereas for the Unit III student, accuracy might be defined in terms of recognizing and correctly pursuing pertinent clues, correctly interpreting findings, etc. In a similar fashion, the definition of the specific professional task (element B, Figure 6.3) would vary at each level of training, based on the intended accumulated competence of students at each level.

Such a conceptualization, even if properly defined, cannot adequately define or predict an individual student's competence at any point in time. The study suggests that learning conditions must be optimal in order for intended learning goals to be achieved. To the extent that the program cannot control the teaching/learning processes--which, of course, it cannot expect to do totally--individual competence will deviate from the program goals.

[Figure 6.3: Preliminary conceptualization of the continuum of clinical competence development. Table 6.1: Relationship of the level of training to the elements describing the continuum.]
Variables in the Continuum of Clinical Competence Development

The study revealed a number of factors which influence students' actual competence development. The relationship of these factors can be conceptualized using a Guttman mapping sentence (as in Figure 6.4). As in the conceptualization of the continuum of clinical competence development, this conceptualization provides curriculum planners with a way to think about the variables which affect competence development, but it does not offer any guidance in predicting their actual effect on individual students. At best, the conceptualization can make planners aware of the complexity of the process of competence development. As yet, too little is known about the student-environment interaction to suggest relationships among the variables more specifically.

[Figure 6.4: Guttman mapping sentence conceptualizing the variables in the continuum of clinical competence development]
[Exhibit B: Unit II H/P Competence]

[Exhibit C: Unit III H/P Competence]

[Exhibit D: Unit II Diagnosis & Treatment Competence]

[Exhibit F: H/P Skills Development]

[Exhibit G: Unit II H/P Competence Described by Unit III Students]

[Exhibit H: Explanations for Unit II H/P Competence]
.5 5:0. .3... .45.... ..i .80.. o. 22 5.; £580 0328.50 a: .. .58. .2: .. 59.2.... 2.. E: 28:80 £00.30. .0039... an. 0 003 8.3.8... .8... 5.; 0.3.8.5.. 00:00.0 .0. .00. 30. 00.3000... 00 0053053 .9... .0. 00.... 0.. 0. 0003 300v. #8. 022...... .......IM.. .8 0. 03.00.... 000.. 5:33 3.. .35. . 00...... 3:00... 082 .0.... c. - CO 00 0:00.. on 00.0.00... 000.55 0.2.000... ..n. .. 00.300. 3.0039... 00:000.. 0 0 I 0.0.000... 0000... 0.000 .0. 50:00.... 302.0. 50.0.0 00.50. 001 5.35.0.0. 0.0.0.000 .0». .0000... .2005... ..00 0. 002.0000 0... 305. £3. 02.... 5 5.2.6.... 20 20030.2 2 8.... .2308... 0.. 08.00. .31 50.3.. 0. 00.3.0.0. 3500.00. 00.. .1. 03.00.... - 0.0.0500 000. 50.0.1 .0.... .o 02.... 5.8 5 28:8,. 0.0. 0000.07.00: 5 00.00000. 0.... ..— a 629...... 2.830. 82 .00 000.... 00.300503: .0 .00... .0000 30.00000 ..0m .0 05.8. .2; 38.. ..80 .5. 8.... o. 5:80 38.. .55 I a. .59. N . E. 30.2.8 32.... 00 >. EXHIBIT I EXPLANATIONS FOR UNIT II COVIPETEDCE DESCRIBED BY UNIT III STUDENTS 0.0.30.0..6 00:000. £50.00... 90:09.5... 0000.33... 2:03; 00.80... .2025 c. 30.59:. 33 .5.) :95: 3.50.390 :3: a. 50:0: .0 820.530... 30.5.... 0:. £000.30 9.6.6000 53062.5 53.05 2.0.23 90:008.... 0:0... 2:03.... 0:5. 3.6 :23 00:00... 0. .5. > 52:... 2.0.0.: c0033.. «20:35.38. Eaton 0.009: 0.8 00:00... .3605 .09. 053 09.0.39... 62:... 5.0.25 .:...<00:0..0...0 £5.55 .2 a: :s o. . 5.0.8:... 30.5.0 5.) 0:... .o .3 3.0.0:. 050:; 05 ..cac so. .0..) .003... 5:30:50 :. 0......0 HQ :0 0.02050 .o .00. :00. 5... £922.83. .0 030.39... .0 .00.. 050.003 .12. 5..) 000300 050.23 .6 :0... 200.3 0..0.¢0U 9...... 0:... .609 33000:. 00.20.. 30.590003 00:022.: 0)....000. .o ..00. 0:30. 30...: 09.0.3.3. 080.35... >. 50:00 :. 09.020610 .0 .004 000...}... .0 080.32.. :32»: a :. 05:000. 2.2.00. .0 .30 . 70:01:93 2:60:35. 7.1.2.090 .02—.050... «00.2.. 050.1- 5.29.53; 303.9%. 0...... .o 8.5....05 .00....00. cc: :3: :0 E02... 955...: a. 9:3 3. 00:02.5... :5. ..8 8:02:01 .5303... :2 0... :0 3:30.50 1.00 55 3.33.300 0.2.5:: .o ..00 . 4..) .43) 0. 02.33.0950 -0... 3 9.3.3.031 ..3. . ....an\0..:020...0 5:... r... o. as 2 0. ..- . Z n. . 3 a . =— 30:00:. 5: 30.00.5005. .02... 05.50:! 851532: .303 z...) 00.. 00:. .o ..u... 909:5... 00:45.4 2:0..1w '05:... 00000.6 .0 0:30.00... 09:0...03-0 305050.... 00:00... 5.) 00:02.. 30.5.0 .50....- 82. 53.00 080.30.... 2.0.2.38. 5 02.00... 5:65.. .3800. .c 3.... ._ :00... 0.0.0.30 0. 0.00:3 0. 0300.230 30.2.2. 530:0 :03 .20. 5:03.. 2:0... :5 .o 50.0.3 :0. an .3: £009.80 050.0; :0 £593 0:0: ..fiI 3:5. ..56 03.50000... 3. I 8 o. 2030.... :5... .3. 53000.... 0000.39... 3.0.0; :0 no?!) 0.603... 30.5 .5323... £8.93 6000 002.0 30.... 5...! Us: v0.3.4.6“. 233.... 83.00033 5.3 .2 a a... o. .co 30.2.... 0...... .6 5:9.- 28 ..50... .c .3 .0833 .6223 .3. 3.551000... n £3050: 8.302: 09.0.39... 2.0.1.. 5:052. .c .a. 0 Jam 0.0..) 3.53.3.0... N as a. a. n. 0‘ m§ a. a. nanou 30.5.0 0.09505 32:03:... 5.! 0:: .a .3: .0333: 0239. 20.0.0; 3:0 25:90.“. 30.1.... :. 95:3... 02 6:50.... 00.33: :0 2005...; {30:0 .02 .01.. 2... .05. 300-0- $02.00... 3 Ivan. IS_U.C=U uCOuD—st. A“ GUCUQOIRKQ dCOhU...o a.£m3uafigha bar! as 9.0 8:32 0200:0335. 730:3 3:030... .52 050.00.: .3253. .o 09.0.3.0... :2 a. 0,0: 0. 2.9.5.3 0503?... 50...... low. 339:2. 530:0 :000 .02 00:0..090 5900:. :2 .0.. 503:... 3.0.0.. o. $0.009w . 0000.39... .0050... {030:0 .02 953300003 :0 :. $225k. V0.3.-.3U .0950: 2:0...- EA... 0:... 00:01.55. 05000.9:— 8.....u0~a 0:: :. 2:30 .0.-:02. GO .3 .0. 0 .20 ~\ = N. .. 
>UZu.U....8 >3wfi=b¢a >¢OOw~¢U mnghm :— 27: >0 0 5.1..58 gwhgu a :5 8k Sp2. EXHIBIT J EXPIANATIONS FOR UNIT II C(MPEI‘ENCE IN DIAGNOSIS AND TREA'DIENT 050.93 5 00.5.3.0 2.0.00 0... ..0I.|¢00.. 0. 50.5 .5 .0: 00>, .0055: .0... 5:50.. .2.) .000 .200 wmap .30.. 10.0.. 020 :55 mamau 2 CFC—.5... 00a. .3022; :73 0020—35.... 4030 .0.... ..caU 000... a: 033 ~3.00. 0. 00000: 05.. 000. 0. 50:00 .0000 00509.00 00 p 050.2... m 00... 3:0 051 w. 0000.39... 50...: 05:00: :. 5.00 000.... :0... 00.05 0.3 :0» .0. 9:00 A3.20.00. 050.53 00.00... 20500.0 .0..:0.0...0 02:. ..SO 52. 02.0505 0. :0... 50.0 00 ..00. 0. .50-u .0005 0.0.. 0000 0 05000 .02 0‘ :h 0030-0 :0 :60. 00.000 050.1... 00. 0000 C0» 0 00 ..:0.0 5.050.000 350.002 00.3.55 :0 005.000 meU .0 00.0) a .53 .0 5.0. .00. :50 005.50 .9. 00000.... 05300 50.00... 020000 0.. 0. 3:00.22 0 0,01 00.0.0.0 .0 505 00 0000.. 9. 000.01 ..:0>0I .2 8. 8 xi .04-.050. 9:00 0000 5 0000 xi .00 ..55 00....000... 5.00.3.0... 00.0-:0: 00: ..:00 0000 3.. ‘05:. 000. 0. 00.20 00.0000... :. 05.. 00!. 3:00 309...... 0.: 0.303 .0050... 03 500 .000 30.0.00: :. 0000,. 9. 05.5) .0: 02.1 050.03 5. 5000... .0: 200.00... 00550... 000.. 0000.35... 5 00:00:50.. .0 ..00 . 000.0...) 35 .2. :0 4000000. 02 0.000 :10 0.. .30.... 50.005... .0 £030.80 5. 720.051.300— S.) 0.0.3.30 5200.000. .05 0. 30.. 3» 05:000. .0: 81000.... at >.UZU.U_....3 040. TB. 55.0.3» 0:0 0000000... 00000.0 .0 0000.305. .0 .0.. . .0.... :0. 0. 00.055 0050.... .03. 50.0.0 .0 0.5.3.; 05.200 - 5030.0 .0 000.. as 500. 0. .0 000.05 .00. 50.1. 00. 00550.0 «0 0‘ 0.8... 800 .0 5.00. 5:00.58. .8200 .3 30:0. 0. 3.50.0000: 02 305.00.. .0 09.0 500. 0. C. ..55 a. 00:25 _0 a.) H: c. .00 3.082.. 550.50.. 20... 00 .000. d .05.. 00.000 050.05.“ .0 p2!..( .I— 94‘ m_mCZU(.C 2. 3.255090. 0:0 0056:0000... 05000.05 7.000.. 0>0I 0.00.000 :00 00 2.0300 00.000200... 8.9.00.0 00....05330 02:00.00. EU 00:03.09... 0:: 0000.305. .025... .309... 00.3.. $605.00.. .0 3:500:40530: 000.. .025... 000v. 000350 0 55.0000 0... :0 )9... 3:022. 0.0050 05030.0 EU 5...! 05.000 5 00:0..090 Bin. as 05030.... .0000 0:2. 0. .03 0 a00.>0.a ..0000. :0 050:0000 .mwau 050...: .0. 13:03.20 000.05 000.500 050m 0.03.05 050.00.: .32..0 c. :5... c. 0:03.000 5.09.00... _0..:0.0...0 :0 000:. 0. 05 .0 50300 0. ..000. .00 004.0... 8.0002Q 0:00 :03 050.9: 050m 05000. 0:... .0 .0. 0 500m .. 33:0 0 .02. 8 .50 .. 0.09:..00: d 050: 00 .5032. 00m .03 30.4.0.0 5. 7.0.2.... 0. 0:... 2.0.0.500. 3-5. «50.93 50 ..0: :. 0.00.030 .00.: 0.2.0.. .0 0.05... .2. 0. £00.50 0.5.00 . 0...... E. > 5 0530.. 5020.0 00...... - ..015 00..0..J< :0 a: 505.00.. 0. 505005.05 05.0000... ..0. 0000 00 050.1- 050.» 5050.: 00.00.05 «0000 .0 8.2.000”. 09.0.0.0. .0. 000.. 000. .05....0 0'0m0_0>|0.m 0.51 >— 050...0.a 55:5: .0. 0.50:... .0 505.00.. :30. 0. .820 3.03. 50.0.00: .0.. 5.0.2.0 :0 5 00000 9. 00.20) 000... o. :— 26058.30. 3:50.56000 3:00.04... 00.000 50.0». .0 55000! 5.35.0.5. .0051. 05......00 :. 0>....000< .. .5050. - .. 000 = = .. 3500,50 00000.0 2.500... 03.0 000020 :53 5000.0 02. mmau .095... 050.1- .0 .0005 0.0.. 0 001 0000.39... 02.0.0500 us 00 n0 3.0.0.. 500.0 .55.. .. .5050. .... 0... .— .0..0.05 «50.2.0 05.500... 0:... 500m 02.000050... 0500 d . 00.000 050.002. .0000 4:2. 050.10 0:. .0 3:006 0. x0! 0 00» 0.30 mmfiU n. w. : >U2,=U. 51.. >18 .»(U = :70. 3.: H.243. = :2. 10.. 2v3<2 09.010000 5.00:... .0— 3.5.89.0 .0 .00. 95.0000. . Sinuuiunn 00:00:50 08050060 505003.50. 0.....00... 00....00 0 70.00.5000 020:0...“ :. . a: 002......» 85.003 205:2? .90. E8 .n 5:2. 
8 3 $0 :0 a: 00:10:50...00 0:0: «:02... .0. 3.2500090. 02 05.390 S 00:07. ..0..0..=.:0. 0.... 05:000. 5 «.0.—0.0.... 5.000.; 8035 .0 5:23.20 .10! 53.8 c. .55.... 5.35.... 5.52.50 3:21.598- 0.=.E-._ .o 003.00....» Ens»! .- 0:30 0000...... G95... . .03-0:00 5:080:50. 5.3 00:000. 0000 05500. 0500... .020 02 60.0 .0...:0.0=_0 .00....3 02 000.. J... .. 50 ..0 5.005... .5000 5.) 0...~..0...0..9...0:m .0. 5:0. 0 003.00 :0 55:0. 5: 00.000 .0 0.00000 . .0.:- 505...} 501.300.305— .00.00...0.n_ E1 5030.. 2:000- 00:900( :0... 050.. 5 00.0.... 0.0.. 02 5:2. 00 0. 00)....0 .02 :0. 0. .46... 0.0. 0006 $505890. 3:50:50. 20:2. 0.0. 9.5.00. 05:000. 105:0 02 20.03. 00.0 :. €205.50 . 00... . 0.3.05. 0:50. 5 00.00.05. 1.0.0 .00 000. n .5000. 0>.I§( >— 0. 00.5.20 5.2.5050. 02 500:.- .0. 0.0. 02 . :0:- 0... 3000000. 00.00 .0 .60... .0....007. 00.0.00... 00.4.00. 02 0000 50:00 .0. 5003.0 .5... 00 .50.- 0. .0.... 02 0.0.0 0. 00.... $50.50.»... .0050... 350.0 0.0.0.0 9.530.. .005... 00:0» .03-:0.- u 00 . 0w 02.0.... E. :00- 00. 50.50.... 003.00 :0 0......000. 00.00 5:52.050. 50:00 00:0.l( 00.5.0..- .0 10.5.0.0 >0 0. I. ..0 =— E 5 2090...} .0 £500.... :00- 00. .0020... ..n. 5... c. 2.5... .e ..I. 8!. 1.0.00: .0 30:. 5:00 23.0500... 5.0.00.0. ..0>0 .52-0.60 .0 .00. 9.2000. 0000.0. 25.333 d .20 £1... .12 c. 83.. .033...” 00 5003.50. .0 .00.. 550...— :0 000800. 02 .0.! .0005... .0 :00.— 0..0..0 00:0..00-0 0.....000! 0000 000020.03 05500. 5.0.0 5.09.0 5.) 00100.0 0:... 1.00m 0000—30.... 30.0050...» 0 0.0.0.0 :00 05.000. ‘10.. 00.5.00. .0 .0>0.. 3 00.0.0. .5 E... .5530 SI. 00:05.0 {0.0 .0- 5,20 0.5.0.0000. 90 0000 9. 05.00) 09.02090 0.05:0 5.0.00... .005... .0 9.2000. 0...: . .0.. 550.5050! .000 305.! 005.00. 0.0.0.. 70.550 .0 3.0! .300 5 00.. 1.00.0.0 .00.... 00: 00.0.9.5: £00.50... 0.0...00 0...... 55,0...— 5030... 5.1 5....0001 :52 3:00. 5.] .0030 000500. 0000 .02 50:00 .50 5.5.0:... 30:... d .0.. 0.0! 60.0 1:00.050 10 .— 00:07.09; 3:0) :00 .0: €0.30. 505703.50. 02 9.0.0) 0.0.0:.- 000... .3. .0005 0.212003. .0. 350.5090! 00.3 005...... 50.... I )9... 900000 50.... 03.02.0000 u.:.0000< 0.000... 0.0.. 000—. 002 :00... 00 0. )2 30... 00:00:00“. 5005.80 @5500. 0.... = .05 50.. 0.03 09.0209..- S.0..0£ 02 0.0.0 8. .93. .0.-.009. 0000 5:00.! 0:04 .00 :0.- ..Iunu... 5:201 35:000. 9.! 00.5.0.3“. .0.—33:00.0. 05:000. 02 0 .0... 00. .0 95:000.. .3 .0.—50 3:5 02 .5300 250......— us...’ 05:000. 0000 3.09.0 5.00:0: .0 30. 00.0.0. :0 051600. 0000 .0.—00:00 00.00 00.5.09.“ 0. A. 0‘ A. 0. m\ 0. a. 050.005 .Qfld 5 10.0.5 02 :0 00......- :._3 .55.... .0: .2300 .— .. 0:03.005. .500. z 000 .100... 3.0.0 .0 00...... 50:00 20 0. 0010:- .02 522.10. I 000.. 5.) 5:0..040U £000 .580003 .0....5: a. 5.0 2.0.... 02 05>...- 50300.. =_ :5 52.0.. 03.00 0?. n .0 4 0:00 00... 00...: .8 .0. 00 0. 0590.0 E000 I :0.- 0033. .5?..< a}. :0 4000000. 000... .00 .0202 .2! 005...... 15.03.53 0.0m. .55-.9 000.. 00:20.01 05:000. 00.5 00039... 0.3.3.20... .0.! n 0.0.0 .0. .20. 5... 00. .0 05:000. 0:... :0. .0.: 0.500. 5.000 100.... .00-u 750 .- 05:000. 0000 .3.- B. .000. 0.0.3.0 05:03. ...0.005-.!m _ = .J 111 I... 00.500 5.9.00.0 .0.—00.0.50 I: 00000 .0005...- .0 .00....0601 .00.... 0. 000: 00> .23 20> :0. :01 03.. 000.01.... .0 5000.000 :. ..00:0.0=.D 05.0.. 50.. .. .0035: 02 . 00:000.... = :0... :0 0.01 5.0:: .000 0° :0.- 030: 0:... a .0 0 N. .0 2 .0 >:¥.U.U....Xu >82... 6%.. >80 u~...iaza Pm(u.<..§ b wSh(Z ........:§..E .0 :2. 60.0.0030 .09.. 00.0... .00... :0... 0.0:. - .0. 0 0.. 0. .0... .2000 09.0.65. .3000 000.00.. .8 9.5 03.2.3090. 0:30. 0. 00.00.03. 003 . 
30.0.00... c. 0. .3309. .0 09.2.0000 33:00:00 300800. .00 .0083. -0.-02.000.02.00 9.20000 30:0. ”00.0.00... .0 "8.0.00.2 .3020»... .0300 ..( o. ..s 0.0.8 050.00.... .3530 02.00.008.50... .53... 28.. .93....) 2:33.000! 000.0 80. 5 0032.... .=. :5 .o . .0... 20.0.0003... 0.0.00. "8.0.00.2...6 60:00.00 0.30:8 0.. 300000. .........0:o&0. 00...: 5.1 0.3-5-0.... 5.0.00.0 .0 0:0 .000... 0.0 .. .I. ...0...u.00 «00000.6 0.6.30.0. 0. m. 0.00.08 9.0... o. 00.. 00.0..00 000:0... 0. 39.5.7000 .0530 ..n €003.00. 09.0.0900... 05.0000 .00 0. .00. 000.. 009.0..0000 013.350.00.00.) .0 .00.. 02 a. > >— a. .— _ >§u»(u =. ..z. EXHIBITM EFFECT OF mum AND EXPERIENCE ON PROFESSIQIAL CCMPEI‘ENCE DAIOIIIIIOO 000.0. .5090 .0. 0.0.00. :00 0.: .0 0.00 5.3 0:00.... 00 ..i... 0.00.600. .00-0.... .30.. :0...» .0300 00....00. 0.0.). .2... 240.90... 0...... 00.0.1. 0. 00.0.0... .0. 02.0.20... 0......003. $020.00.. - 00...... 00.0: .003... .30.. 00...... .0... .00 0.. 0.0.0.... 00:. .20—0 Ed 00.5.02; 0.. ..c: . ..00. 3.00.0.5 .00....0 ..00 0. 0. 00.00 0.0.... 0000 4.0.0.... 0.0.0 .0... .0”. on 293 0000 0... 0:00... 0:0. .0.........0 0.5.... LwfiL 9.3.5005... 5:20.. .0 .50.... a». 000...... .070... 000.00.... 00...... .500... ..v... :9 :0... 0:... 00......2... 0.00 .c. 7.00.0.0 50:0. 00 0.950 .0.... :0 0000.00: 00.30. . 0......0... 75:5... 0. .0203... .3... . .0010... 000 - .0.... ..«a. . 000.. 00 30.0.09... .30: .0... 0000 7.0.0.... .00....00 0. 0.5.0000...— 2 .0... .. 000 . .. .02.... .0.... _ .5... .G.1..0...0~. 0......0 .000.) 9. .090... 000...: 0...... :0... .0...» .0.. c. 0...... ..000 n \ .0....300 0. 00.0.5.0 .0003 0.00 0.. .0..) 0.0.0. a. .0022. .3005... . E. 0...... .00. c. 0.0..0 .. .0000 0 00... 00.00.. 00:0 IIIIIIIIIIIIII OCIIIIIOIIIQOIOI 000.....0... 50.05: $0.... ..COC 00.28.5000: 3.00. .0.. 0 00 0.85:... ..E. 0. £000.00: 00.0.00. .0. 0.00.. ..00 .5030. 0......o.0...0....-:0.....0 00.0.0... - 0. .0 00.0.0... .0. 0.0.... 9:03.... .0. .500... 000.00.... 020......000... 0... 0.5.0000... .0... . 00.0000... .01 .7002: .: .00. 0.0.00. .0 .0. 0.0.... 0: .301 0.000 0.08... 0... :0 000.0 0.00.000. ..003 - 0000. 0...... 005 .. ..h w. ..0..0..0.0..0 00.000.20.50 .020: 0000. -30.... 50...... .0...F.:...C $0.... :30... 3...... .0. .0....0. 000.00.... 50.0... .0. 0000 0.0.0.... .0 000.0. .0.:0... 9. 0.0.00 0 >5......: 0. ix: 0. 00m...— 0......0¢.. C. 0.0.0:... .0123... 0.0:. 00.50... 50.0 .0 0:. .00.... 30.5.0 .0 00...... .5000. .3 .0000 0 - 0. .. 2...... £501.00... 00... .00....0. .00).... .000... ... £0. ..5. . 0000.25.50 .0... .03 00...... £0 .0... c. 0.5T. my .0 hm~..0 u... .(N... 00 7...... 0. x. ,2 .34.. . .0.. 2 CE... 00:... 000.0 00.000 .0.: .0.... 0.0.0.0 00.. .00. .0000. .0. 00000.0 0.3.00.0... 000?. .0.. 0.0.50... 0. 1.0.0.0... .> 3.003 .. .0 0.5.0 ..E :0 2.3.03 0.. ..3. 0:00.30...— 0).... ........0.... :50... 000.00... 050.90? a. 0:30 7.1.1... 3‘1... m2 ..m>m .5. =1 .07... ~2< #1....3CZX b b... .0.. I .3208 0:... . 0...... 0.3.00 3.00.5 340.200”. :0 «5 0000.05. 0.00. a... 0.0.. .0. 5.0:... 0.0.2.... .0. 2...... 30.. 00.20.0— 00 a. 3:39.30 2.0.0.... .22.. 3.0.5....3'. .05... .0.... 0.. .0500... 00...... 0.0.50.0 0. 0....0000 0.0.: 0000.000 0.00. .0: .0.. .00-.00... .09.: 0.... 03.0000 0.... . .00.... 00.. 0.0.. . 00005.0 30...... 05 a. .0050... .0 $5.92.... 0.320.533 00.000 3.00.05... 00:90.30. 300v. 0:00 0.07.0.0 .0. . 0.00.00... .00.. ..c > .0. 0.......0 0.0... 0. 50.00555... 0.00m. 0.0.. 4.02 0000.50... 0. .0..00.0..... .>..... 000.0 .00.... ..c .0! 0:30.... 00.00.05: 0000.39... .02.... 0.0.. 4.0... 
b .,....... - sat-0.5.500: 0:. 00.0.4 0.0.0.0 .0. 0002000.. .0v. 3.1.0.3.... In. 9......03 02.0.00. 30... 000.0 .00.... 00 .0! 0. 32:20:... 0...... mu .0 Z = p _z. EXHIBIT N EXPLANATIONS FOR UNIT II INTEGRATION OF THEORY AND PRACTICE 0000.89... 0.0.00.0. o. 0000 30.0.... .0002 000000000 .000 000000 00.0.0». .00.. ... 00 00000 3.00000... 3.2.0.3.... .0. 3.09.0 50.... 9000.008. 0:01 :94 00000 .0 .300 0.00 30.0.... 0 00.0000 031 .. 00.000 .33 80... 5000 000.... :55 a. 2!: 88 0000. 05.2.5 .0 .89.. :82 0.00.000... 0.. 0. 00;... 35820... 200.23.208.50 00.00.3000. .0. .0005 002.6... 00000....95 0.000 0. 000. 00.... 000 .0000 30.5.... 000 £30... 0000.. 0- ..u 0. 00000090 200-0... .0 0000.39... 00000.00 0.0... :5 $32. 05...: 0. use... .058. 0.2. 8030 2800. 00000.00 .5 00.009. 00.0.. 0000 ..0 .000 .000 0000.0»! 0.00 d 53050.0. fl 00.0.9.0 00......- .00-00.0. 03 .32.: a. .2000... = .1 50.... 000079. 0. .00. 320:... .000”. .0000 00.30.. 8 00.30 .05... 0.0.9.... 3 - .000... a .00.... 8 83.38.. .2..- - .80 in... 0.0: ..S. 0.3. 0 .z :8. 2.2 .05 0000 30.5.0 .33 600.01 = 0.5.0 4 300.00..- .0 05000.00 3.00.0000... 005.3 2.00.000. 0000.00 0.000 .000 0. 33.0.3.0. PI 0.00.0.0 0.... .3080... 5.... .0908... .. .3. PIE-0.. 0000 00.000 .0... 0.00.. .0 000.. .60.... c. .02 .2. .50... 0. 6...... a. ..8. n Q .0000... .330230 0 0. $50.10 0. 90.0000... .30... .3000 0.40.... 0... 3...... 8.0000... :00 :5. 00.5000 0.0.. n. d p 0.0.. .003 00.000 023.00.... 09.0.0.0. 30.5.0 .030 00... $00.30 30.50000 0003 00.30... 00.0.00. 0. 00.00...- .030... 000.0... 0. 00...: .0202 0... 003 0000.000 30.53 003.0008 000 05.00.0900. 0. Colt! 0. 000... 00.00... .00... 000 . .0.... 00000 00 .3 .0001 0.00000... 3200.0..6 00 0:00. .0... 00000.0.0. 00:06 30.0.... 00.0... 00.000. 09.0.00 .0 .0. 0 000.0 0.01 8.0.03... ..x. 2:8 832.000 0.033005 0. 0000.300! 00.200 30.3030022. 0. 0000.000 000.3 .3080... .0000 .3000 0.0. 3.000031. 30.0.3 00 .33 00 .00. 0000.00 0.00.. 0. 000.. 0000 0... 000 .33 0.00. 0... 00.00.0000. 5 000.0 .3090 00.3.00...— 0930. 30.0 0. 00.0.0.3... 0.0 60.... 0.00... o. 00200.3 .000 .00..0...0 0.00. 00.0.00. 00.30. 00.000... 300.5 0000 00.000 .33 300. 30.5.0 .00... 000.0 00 00 .00.. 00003.0. 0 :30 00:00... .000 0.0.0.0 0003.00 000 00... 0.0..) 50.... 0...: 0... .000... 001...... 0.00:8 .00.... 8 .3.- 8. ..8. «u HU:U<¢0. A2( >¢C tr 5 5:43.322. .. :5 5 2.:424 .01. 0‘ .8500. 00.0.. .03 .9. 0.3... 0000 00.003003. 05...... 50.8.. 0.3.0.. .9308... 0...... .50... S 9-3.3.. 9.38.. u. .803... .0- 00.00.3000. .3. .93.! 000.000.. 0000.02...“ 0000 000 00.3 00.00 0. 00.... 9 0.00.. 0.00.0.0 .0 000.000.03.000... 0... 05.0.0003 20>...03050 as 0.0.00.3 000 .00.0...0 0.00. 00.0000. 003.0 02.00... 30.0..U .30... 0008.50. 00.50... 30.0.... 0000 000 3:0 030 00 :35 33020:... 0020 00.300 0.00.03 0.02.0.9. .0.... o. 0.. 00.0...0E .000 0.030... .00. 000.200.! 0. .00—0.00.... 0. 930.0002. 003 .0030... 30. 5.) 0000 0. 000000 .0..) 0 8.3.0.0.... 0.00.0.0 0001.00 0000 .9 >— >50up(u .— :5 EXHIBIT O ASSCIIIATIONS OF EXPLANATIONS FOR UNIT II H/P COMPETENCE OFFERED BY UNIT II SUBJECTS 295 EXPLANATIONS FOR UNIT II H/P COMPETENCE OFFERED BY UNIT II SUBJECTS Got to do alot lots of cases/pathology Was assertive/ self confident Had self-teaching goals Was taught "tricks of trade" Got positive reinforcement Was honest about own skills Was given critical feedback Have personal support system Supervisor had high expectations Had requisite knowledge base Was in non-threatening learning environment Was given responsibility Pre-MSUCCM experience and training 296 a... .5. 