AN EVALUATION OF THE NORMATIVE AND PRESCRIPTIVE CONTENT OF THE U.S. DEPARTMENT OF ENERGY'S ENERGY INFORMATION ADMINISTRATION

By

Judith Lucille Brown

A THESIS

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

MASTER OF SCIENCE

Department of Agricultural Economics

1981

ABSTRACT

AN EVALUATION OF THE NORMATIVE AND PRESCRIPTIVE CONTENT OF THE U.S. DEPARTMENT OF ENERGY'S ENERGY INFORMATION ADMINISTRATION

By Judith Lucille Brown

In 1977, the Texas Energy Advisory Council (TEAC) questioned the fairness of federal energy proposals to Texans. TEAC's questions led it to challenge the objectivity, credibility and relevance of the Department of Energy's Energy Information Administration (EIA). TEAC's questions had to do with the nature of objective information and with the normative and prescriptive content of energy policies. This thesis was supported by TEAC to answer these questions. The study is institutional and methodological. It traces the history of EIA to highlight causes of EIA's strengths and weaknesses. The evaluation centers on interpretations of objectivity and on values as part of objective information. The data consisted of interviews of EIA and TEAC officials, pertinent legislation and legislative histories, and EIA and TEAC records. The conclusion is reached that both EIA and TEAC should adopt methodologies less specialized on positivism so as to include emphasis on pragmatic and normative methodologies. This would prevent EIA and TEAC from becoming isolated from governmental decision makers.

ACKNOWLEDGMENTS

I wish to thank the members of my committee for their assistance and criticisms. Dr. G. L.
Johnson recommended the project to me when TEAC first contacted him about doing an evaluation of the normative and prescriptive content of EIA. He oversaw and participated in the entire undertaking; much of the credit for making this such an interesting experience goes to Dr. Johnson. Dr. L. W. Libby and Dr. R. A. Solo, although not so intimately involved during the research phase, made valuable contributions to the writing of the thesis. I have learned much from each of them. Appreciation is also extended to the Texas Energy Advisory Council for making possible a very rewarding introduction to research. Steve's undying enthusiasm for my projects and his confidence in me have, as always, been of great help.

TABLE OF CONTENTS

LIST OF ACRONYMS ......................... v

CHAPTER

I. INTRODUCTION ......................... 1
   Origins of the FEA-DOE/Texas Conflict .......... 2
   Organization and Purposes of TEAC and TNEMP ....... 9
   Organization and Purposes of This Evaluation ...... 13
   Chapter I Endnotes ..................... 16

II. A METHODOLOGY FOR EVALUATING AN INFORMATION SYSTEM ... 19
   What is Information? .................... 19
   What is an Information System? .............. 25
      Steps in the Process of Generating Information ... 26
      Information System Functions vis-a-vis Types of Information ... 30
   Types of Research ...................... 30
   The Roles for Formal Models ................ 33
      The Time Dimension .................... 36
      The Actors ......................... 39
      The Institutional Context ................ 44
   The Agenda for Evaluating the Normative and Prescriptive Content of EIA ... 48
   Chapter II Endnotes ..................... 52

III. THE NORMATIVE AND PRESCRIPTIVE CONTENT OF DOE'S EIA AND MEFS ... 57
   Structural Impacts of Normative and Prescriptive Considerations on EIA, MEFS and Their Antecedents ... 59
      The Legislative History--Prescriptions and Implied Values ... 60
      Personnel and Values ................... 70
      The Structure of Decision Making vis-a-vis the Evolution of MEFS ... 72
   Normative and Prescriptive Influences on EIA as an Operating Organization and on MEFS as Operating Models ... 82
      The Selection of Topics to be Investigated with MEFS ... 83
      Decisions About the Development of the Models ..... 85
      Interactions Between EIA and Its Clients ........ 86
      Interactions Between EIA and Its Critics, Especially Evaluators ... 89
      The Four Values: Professionalism, Objectivity, Credibility, Relevance ... 92
   Chapter III Endnotes .................... 95

IV. IMPLICATIONS AND CONCLUSIONS ............... 101
   About EIA--Its Strengths, Weaknesses and Possible Futures ... 102
      Implications of the Legislative Mandate ........ 102
      Implications of EIA Operations ............. 104
      Institutional Alternatives ............... 109
   About TNEMP--Its Strengths, Weaknesses and Possible Futures ... 112
      Understanding the FEA-DOE/Texas Conflict ....... 112
      Implications for Institutionalizing TNEMP ....... 114
      Potential Contributions of TNEMP to DOE ........ 115
   Postscript--TEAC and TNEMP Become TENRAC and TEPP .... 116
   Chapter IV Endnotes .................... 119

BIBLIOGRAPHY .......................... 121
LIST OF ACRONYMS

DOE     Department of Energy
ECPA    Energy Conservation and Production Act
EIA     Energy Information Administration
FEA     Federal Energy Administration
FEO     Federal Energy Office
FOIA    Freedom of Information Act
MEFS    Mid-term Energy Forecasting System
NAB     National Advisory Board
NEO     National Energy Outlook
NEP     National Energy Plan
NEP II  National Energy Plan II
OEIA    Office of Energy Information and Analysis
OPEC    Organization of Petroleum Exporting Countries
OPPE    Office of Policy and Program Evaluation
PART    Professional Audit Review Team
PI      Project Independence
PE      Office of the Assistant Secretary for Policy and Evaluation
PIES    Project Independence Evaluation System
TEAC    Texas Energy Advisory Council
TENRAC  Texas Energy and Natural Resources Advisory Council
TEPP    Texas Energy Policy Project
TNEMP   Texas National Energy Modeling Project

CHAPTER I

INTRODUCTION

This evaluation was sparked by a bitter conflict between the state government of Texas on the one hand, and the Federal Energy Administration (FEA) and, later, the U.S. Department of Energy (DOE) on the other. Texan policy makers were dissatisfied with an unexpected change in federal energy policy. They questioned the fairness of the new policies to Texan energy producers and consumers. In order to understand the change in policy better, the Texans asked to examine supporting documentation for the new policy. At this point, the Texan policy makers and FEA staff locked horns over the former's access rights to the latter's energy information. Eventually the Texan policy makers achieved access to both the documentation and the tools used in developing the documentation, a system of computerized energy models. By this time, FEA had been replaced by DOE. The energy models were maintained by, and the documentation had been produced by, the Energy Information Administration (EIA), that part of DOE which is responsible for producing "objective" energy information.
The Texans began examining in detail individual EIA models. They checked the accuracy of the parameters, the data bases, and the theoretical foundations. They soon realized, however, that these procedures were not sufficient for answering the questions they were asking. These questions had to do with the nature of objective information and its role in policy making, and particularly with the value content of energy policies--with what the value content is and ought to be and by what process it is and ought to be woven into policies.1 This evaluation was supported by Texan decision makers, specifically the Texas Energy Advisory Council (TEAC),2 to answer these questions. An evaluation useful to TEAC could only result from a clear and thorough understanding of the causes of the FEA-DOE/Texas conflict. This introductory chapter explains the origins of the FEA-DOE/Texas conflict. It then describes the purposes and organizations of TEAC and the Texas National Energy Modeling Project (TNEMP),3 that part of TEAC responsible for evaluating EIA and its models. Finally, in view of the nature of the FEA-DOE/Texas conflict and the goals of TEAC and TNEMP, it describes more fully the purposes and organization of this study. Subsequent chapters develop a methodology for objectively evaluating an information system, use this methodology to evaluate EIA, and then pursue some implications for improving both EIA and TNEMP.

Origins of the FEA-DOE/Texas Conflict

The rise of energy as an important policy area has been a time of confusion, to say the least. The winter of 1973-74, with its OPEC oil embargo and the first long lines at the gasoline pumps, took the American public by surprise. The initial reaction was a collective spirit of camaraderie and volunteerism as the country united to meet a presumably momentary crisis imposed by hostile outsiders. As energy shortages and rising prices became chronic, however, the optimistic initial reaction changed to one of disbelief.
The popular press of the mid- and late '70s is filled with articles citing large percentages of the electorate who did not believe the "energy crisis" was real, who suspected it was a ploy of U.S. energy conglomerates, and/or who charged American policy makers with contriving the whole mess.4 Policy makers were no better prepared for understanding and addressing energy problems than were their constituents. To be sure, there were those in Washington who knew before 1973 that energy resources were dwindling rapidly. But in a representative democracy such as ours, articulating new problems, designing tools with which to address them, and gaining authority to use them are virtually impossible tasks in the absence of a crisis.5 Hence policy makers, too, were empty-handed when it came to ideas for solving energy problems. Slowly the political machinery began to shift gears. The transition from a mind set of energy abundance to one of energy scarcity required action on many fronts: the development of a reliable information system, the design of appropriate mechanisms for articulating different perceptions of energy problems and different preferences for energy solutions, and the implementation of appropriate incentives for managing energy supply and demand. In particular, 1976 Presidential hopeful James Carter promised during his campaign to make energy policy a basic issue of his Administration. In April 1977 President Carter, having been inaugurated three months earlier, delivered his comprehensive energy program to the Congress and the public in his "moral equivalent of war" speech.6 A written version of the energy program called The National Energy Plan (NEP) was published in April too.7 Carter's energy proposals and the steps he outlined for attaining them included a broad array of legislative, regulatory, administrative and budgetary initiatives.
Above all, President Carter stressed fairness as the guiding principle of his proposals; the advantages and the sacrifices of the NEP were to be shared equitably between producers and consumers and among regions of the country. Due to the importance of these energy proposals, analysts at state, national, and private institutions around the country immediately began to evaluate the predicted impacts of the NEP. As policy makers and their staff in Texas listened to and read about the Carter energy plan, they were particularly concerned about statements regarding natural gas pricing. The Ford Administration had been moving toward deregulation of natural gas prices.8 Carter, however, declared that "... proposals for immediate and total decontrol of ... natural gas prices would be disastrous for our economy and for working Americans, and would not solve long-range problems of dwindling supplies."9 He further noted that "... [e]xisting supplies are being wasted on nonessential industrial and utility uses."10 To remedy these ills, Carter proposed a combined incentive price-ceiling price of approximately $1.75 per thousand cubic feet on all new natural gas.11 This would simultaneously encourage exploration but prevent gas producers from reaping windfall gains. Carter also proposed a use tax for high volume industrial and utility users of natural gas to encourage conservation and conversion to coal. A use tax for nonessential uses of oil was included too.12 The Texans had several concerns about these proposals. Why had the Ford and Carter Administrations come to such different conclusions regarding natural gas pricing? Why was it assumed that the supply response dropped off above $1.75 per thousand cubic feet? Was this a sufficient incentive for exploration and new drilling? There were concerns about the supply and demand balance sheets contained in the NEP.13 The
Texan analysts were joined by analysts at the General Accounting Office, the Office of Technology Assessment, the Bureau of Labor Statistics and the Council of Economic Advisers, among others, in suspecting that predicted domestic production under the NEP had been overestimated (and hence imports underestimated), making the President's plan look more attractive. Finally, there were concerns about the equity of the pricing proposals for end uses. Texas, an oil and natural gas producer, was also a heavy user of oil and natural gas. The coal conversion requirements and the oil and gas use taxes would fall heavily on Texans. Texan analysts expected retail energy prices in Texas to rise more than the national average in almost all energy categories. Was this fair, especially in view of Carter's emphasis on the fairness of his program and his desire to eliminate regional discrepancies? Beginning shortly after President Carter's speech and continuing throughout the summer of 1977, the Texas Energy Advisory Council (TEAC), a staff agency of the Texas government, began making inquiries to obtain answers to these questions.14 TEAC staff attempted to identify the source of the NEP analysis. They assumed that the source was the Project Independence Evaluation System (PIES), an integrated set of computer models housed at the Federal Energy Administration (FEA). Numerous informal attempts were made to obtain documentation of the models and the specific parameters and assumptions used in preparing the NEP, with attention focused on the $1.75 ceiling price and the NEP balance sheets. None of these efforts produced satisfactory results. Discovery of the fact that the 1977 National Energy Outlook (NEO), an annual publication legislatively required of FEA, was circulated for criticism but then not published further aroused Texan suspicions.15 Because PIES was used for the technical analysis in the NEO, PIES, too, came under suspicion.
A reading of the enabling legislation for FEA revealed a requirement for public access to the models.16 It was decided, partly as a response to the mounting frustration and suspicion of TEAC staff, to expand the TEAC request from written information regarding documentation and parameters to actual use of the PIES models. Model access was first discussed with White House officials in July. Finally on September 29 a meeting of Texas state officials, including TEAC staff, and Carter Administration officials and consultants was held. It was confirmed that PIES was indeed the source of the technical analysis for the NEP. The Texas representatives were promised and given documentation of the models, but it proved to be very general and largely out-dated. They were also promised computer tapes of the models for their own use. Follow-up efforts during October all failed to procure the tapes, and on October 26, after seven months of frustration, TEAC filed a Freedom of Information Act (FOIA) request for the tapes promised earlier, plus tapes for additional portions of the PIES models and documentation of the NEP analysis results. Delivery of the tapes entailed additional misunderstandings. The first tapes received by TEAC in December were usable only on DOE's computers. The tape of another model turned out to be blank. On January 31, 1978, a meeting was held between Secretary of Energy Schlesinger and Texas Lieutenant Governor Hobby (chairman of TEAC), Dr. Milton Holloway (executive director of TEAC) and Mr. Harry McAdams (director of the State of Texas Office of State-Federal Relations) to discuss the FOIA request. Following this meeting the TEAC staff felt that DOE, and particularly that unit within DOE that housed PIES, the Energy Information Administration (EIA), became noticeably more cooperative. Dr.
Holloway turned his attention to establishing the Texas National Energy Modeling Project (TNEMP), which would conduct an independent review of PIES/MEFS.17 From the perspective of energy analysts in FEA and later DOE, there are several explanations for their reluctance if not inability to respond to TEAC's requests. First, 1977 was a year of transition. FEA had been reorganized in the fall of 1976. This was followed only a year later by another reorganization with the opening of DOE in October 1977. In addition to adjusting to new institutional set-ups and new legislative mandates, the staff was swamped with the analytic work required for the President's April 1977 energy address and then, following the address, with requests from the Congress as it set about evaluating NEP proposals. These pressures left little time to respond to requests like those from TEAC. Time pressures also left little opportunity to document the analytical work as it was done, which in turn made the TEAC request more difficult to handle; much of the material requested did not exist. Second, there was confusion over the relevance of the FOIA request. FOIA required that records in the possession of an agency be made available upon request. It did not require that records not yet existing be created upon request.18 Hence EIA staff felt they had responded adequately to TEAC's FOIA request, since they had produced all the documentation in existence, and that any further requests by TEAC should not come under the auspices of FOIA. Third, there was confusion over the bearing of the TEAC requests on the NEP policy implications versus the technical analysis undergirding the policy implications. EIA was an independent agency and did not make policy recommendations.
To be sure, the TEAC request called for EIA to justify its procedures for specifying assumptions and parameters -- but to the extent TEAC wished to take issue with the NEP policy recommendations, EIA staff suggested that TEAC requests should be directed elsewhere, namely to the Office of the Assistant Secretary for Policy and Evaluation (PE).19 Fourth, there was confusion over the legislated requirement for public access to the models. This was a new requirement, unique to the federal energy modeling effort, and it had not been challenged before. In particular, it was not clear if access could and/or should be handled directly by EIA or through a third party with minimal direct EIA involvement. EIA moved very carefully here to avoid setting an awkward precedent. Finally, EIA was under attack from a number of sources in addition to the Texas interests. Some of the harshest criticism came from the Professional Audit Review Team (PART), a panel created by the Energy Conservation and Production Act to evaluate annually the Office of Energy Information and Analysis (OEIA, EIA's direct predecessor in FEA). The first PART report was published in December 1977.20 It was based on an evaluation of OEIA's first ten months' activities. Like TEAC, PART challenged OEIA's independence from the policy function, criticized the poor documentation standards and suggested that OEIA data collection and modeling activities were not credible. The presence of other critics in addition to TEAC again put constraints on the amount of time EIA could devote to TEAC. Several of these roadblocks were fairly simple to clear up. The time pressures on OEIA/EIA staff eventually decreased as the flurry of activity following the release of the NEP died down and as additional staff was hired.21 One of the first activities undertaken was to begin improving documentation standards.
A detailed ex post justification of the assumptions made for NEP analyses was drawn up.22 Alvin Alm, Assistant Secretary for Policy and Evaluation in the new DOE, justified the optimistic balance sheets by noting "... [i]t had been our hope that ... by adopting conservative assumptions ... we could focus the current energy policy debate on fundamental policy issues of national concern, rather than on analytical details."23 Questions about the relevance of FOIA to the TEAC requests faded into the background as TEAC/EIA relations became more cooperative in 1978. More time too was devoted by EIA to the access issue. Above and beyond remedying specific grievances, however, the TEAC/EIA controversy had raised a number of fundamental issues regarding the meaning of independence, the relation of an independent agency to the policy process, the role of complex models in policy making, and the organization and conduct of independent reviews of federal modeling efforts. Weaving through all these issues was yet another, that of values as part of objective information, in the policy process, and at the nexus between research-oriented and action-oriented communities. These are issues which were addressed by TNEMP, of which this evaluation is a part.

Organization and Purposes of TEAC and TNEMP

The Texas government has had a continuous formal commitment to energy research and policy at least since May 1973.24 By the end of 1977, the energy staff agency had evolved into the Texas Energy Advisory Council (TEAC).
TEAC was a statutory agency with a tripartite structure of (1) the Council itself, consisting of twelve elected state officials, including the lieutenant governor as chairman and the speaker of the house as vice chairman, and two members of the Advisory Committee; (2) a staff of about 20, including economists, lawyers, and computer scientists, and headed by an appointed executive director (Milton Holloway) who served as liaison between the staff and Council; and (3) an Advisory Committee of approximately 70 citizens appointed by the Council chairman to represent various socioeconomic, geographic and other interest groups. TEAC structure also included, although not required by statute, a University Coordinating Committee composed of representatives of all Texas institutions with energy-related programs. This committee enhanced effective tapping of faculty expertise and was regarded as particularly important in defining research needs. Public hearings were held as needed for additional citizen input. TEAC's energy policy function was both participatory and advisory, including "... the planning, formulation, revision, recommendation, monitoring and comment on federal policy and legislative recommendations."25 Energy data bases and econometric models for determining impacts on the Texas economy were maintained by the staff in support of this policy function. TEAC and its predecessors had played two particularly important roles in the Texas policy process. One was that of problem identification. Due to its contacts with citizen groups, business interests, university researchers, high level executive and legislative state leadership, the Texas delegation in Washington, and its own professional staff, TEAC was in a unique position to unravel the complexities and controversies of energy problems. The second, and not unrelated, key role TEAC played was that of coordination.
Coordination is particularly important and particularly difficult in Texas because the executive branch is weak relative to the legislative branch.26 Hence there are many potential decision points and decision makers which must be coordinated for effective policy making. TEAC also provided valuable coordination between scientific research and policy making, between problem identification and problem solution, and among disciplines, as is crucial in any multidisciplinary policy area. TEAC's energy policy function encompassed more than the identification of policy problems and the coordination of policy making. Much effort was also directed at solving specific policy problems. The Texas National Energy Modeling Project (TNEMP) was one example of a TEAC problem-solving effort. Once access to PIES/MEFS was gained, it became necessary for TEAC to decide exactly what it intended to do with the models and how it intended to do it. As a result, TNEMP was set up "... with the primary intent ... to provide an independent, critical evaluation of the PIES model [sic]. The study will examine the data reliability, the assumptions, the structure relative to accepted theory and the actual behavior of the model.... [R]esults will be useful as recommendations to the Department of Energy for improving the model, as well as precautions to users concerning its proper use. Finally the study will produce specific recommendations to the State of Texas for maintaining a national modeling capability."27 TNEMP was research-oriented and provided a kind of staff support for TEAC. The initial PIES evaluation was designed as a six-month undertaking. TNEMP's funding became available in mid-1978 with most of the specific investigations of PIES components being completed by March 1979. However, from the very beginning TNEMP participants knew there was the potential for a more permanent function, that of maintaining some sort of enduring national energy modeling capability in Texas.
TNEMP's structure consisted of (1) a National Advisory Board (NAB), (2) an Analysis Team and (3) a group of Supporting Institutions.28 Dr. Holloway, in addition to being executive director of TEAC, was executive director of TNEMP. The NAB played two important roles: one of oversight with respect to the work of the Analysis Team and another of credibility promotion and maintenance. Establishing credibility was especially important for guaranteeing TNEMP's success. The long controversy preceding TEAC's acquisition of the PIES models had resulted not only in TEAC suspicions of EIA but also in EIA suspicions of TEAC. Completion of TNEMP's objectives would be much easier if TNEMP had the cooperation of EIA staff. Furthermore, TNEMP intended to produce constructive criticism for EIA, criticism which would surely not be heeded unless EIA's suspicions were quieted. The NAB promoted TNEMP's credibility in part by wise selection of the people who sat on it. NAB members were a careful mix of men with established reputations as large-scale modelers, both in academic circles and in federal energy modeling circles, and with distinguished careers as policy advisers. Careful NAB review of Analysis Team results before their public release was also intended to enhance credibility. The Analysis Team too was designed to promote credibility by involving universities in TNEMP and by drawing on competent, trained professionals. The Analysis Team was a team of about 20 academic economists, engineers and operations researchers at Texan institutions. They evaluated component models of PIES/MEFS, recommended alternatives to and improvements in them, and made recommendations with respect to the maintenance of a national modeling capability in Texas. The Supporting Institutions were institutions which were interested in TNEMP's activities, endorsed its objectives and were willing to provide review, data, facilities and funding support for TNEMP. The Supporting Institutions are not particularly relevant to this study.
Initially, the evaluation procedures used by TNEMP concentrated on the positive (value-free) and monetary (valuational but quantitative) content of the models in accordance with positive research methodologies common in the physical sciences, statistics, and even economics. Positivism maintains that science is value-free and that objectivity means value-free. Monetary values, however, are often accepted as positive because they can be quantified. Policy analysis necessarily involves non-monetary values too. At meetings of the NAB, concerns about the value content of models such as PIES/MEFS grew. Further, it was noted that the value content of such models is derived in part from the preconceptions of their builders and users and from the institutional contexts in which the models exist.29 It was realized that TNEMP's evaluation of PIES/MEFS would be neither credible nor constructive if it ignored values. This, then, led to a decision on the part of Executive Director Holloway to inaugurate a special study of the normative and prescriptive dimensions of EIA and MEFS. This is that special study.

Organization and Purposes of This Evaluation

The general purpose of this study, as indicated by its title, is to evaluate the normative and prescriptive, or value, content of DOE's EIA, including its PIES/MEFS models. More specifically, this purpose is fulfilled by the completion of four tasks: (1) determination of the value content of EIA's legislative mandate, (2) analysis of the value content of MEFS as operating subject-matter models, (3) analysis of agency-client interactions involving EIA and (4) development of some implications which follow from the analysis in tasks 1-3. The data consisted of interviews of EIA and TNEMP officials, the pertinent legislation and legislative histories, TEAC and EIA records, and some of the modeling literature. The time period covered stops around June 1979. The approach used in this case study is institutional and methodological.
The evaluation is institutional because it is historical and because it examines relationships between EIA's structure and the resulting behavior of EIA staff and clients.30 The emphasis is on process more than it is on the details of any particular model's performance. An historical approach was needed for several reasons. In any new policy area, considerable time must be devoted to identifying and prioritizing problems that should be addressed. Energy problems and policy analysts' perceptions of them have changed dramatically over the past six or seven years, with a number of implications for providing energy information. Furthermore, EIA has evolved through a series of institutional innovations from a succession of predecessor agencies. It was believed that tracing trends such as these would help to highlight causes of and potential cures for any present weaknesses EIA might show. The evaluation is methodological because the framework used for evaluating EIA is developed from the philosophy of science literature. The evaluation centers on interpretations and implications of objectivity and on the role of values in research and in policy making. These are topics in the philosophy of science. In terms of activities and principles of procedure, evaluation research is no different from any other research. The remaining chapters are organized as follows. Chapter II, which draws on literature by philosophers of science and technical modelers, develops a framework for evaluating the normative and prescriptive content of an information system. Chapter III presents the results of using this framework to perform tasks (1), (2) and (3) above with respect to EIA and its MEFS. Chapter IV addresses task (4). It gives some implications for understanding the FEA-DOE/Texas conflict and for shaping the futures of TNEMP, EIA and several other entities that are or could be involved in keeping energy information objective, relevant and accessible.
Because more than a year elapsed between the presentation of the preliminary results of this evaluation to TNEMP (June 1979) and the final writing of this thesis (August-September 1980), there has been time for TNEMP to act on some of the recommendations developed here. Chapter IV closes with a quick look at some of the changes made when TNEMP was reorganized in the fall of 1979. This evaluation, as mentioned above, was performed for TNEMP. Throughout, the emphasis has been on evaluating, objectively yet relevantly, those aspects and issues that would help TNEMP solve its problems with EIA and conduct a valuable critique of PIES/MEFS. TNEMP's objectives were fairly broad, including constructively criticizing EIA and making contributions to the art of model evaluation. This evaluation, too, hoped to contribute to improving EIA, to improving model evaluation, and to improving TNEMP, especially in the event that it be made into a more permanent institution. This evaluation is one of several evaluations emphasizing different aspects of EIA, TNEMP and modeling. The other evaluations supplement and strengthen this evaluation of the institutional context of EIA modeling, just as this evaluation supplements and strengthens the others. Any recommendations developed here for improving TNEMP would be strengthened by an evaluation of TNEMP itself along the lines used here to evaluate EIA. Such an evaluation is included in the study by Johnson and Brown.31 Recommendations for improving EIA and its MEFS would benefit from evaluations of specific component models. Evaluations of specific models were conducted by the TNEMP Analysis Team.32 Recommendations for improving EIA would also need to be supplemented by an understanding of the power relationships and personalities involved in setting EIA's budget and agenda. Such an evaluation was not included in TNEMP's plan of work.
Investigations of agenda- and budget-setting were considered beyond the scope of this and the other TNEMP evaluations, primarily because these are matters over which TNEMP and TEAC had no control. Such information would be needed, however, to develop the implications of this evaluation into workable options.

CHAPTER I ENDNOTES

1The title of this study refers to "normative and prescriptive content." The meaning of "normative and prescriptive content" will be discussed in detail in the next chapter; until then "value content" can be substituted for "normative and prescriptive content."

2TEAC was reorganized and renamed Texas Energy and Natural Resources Advisory Council (TENRAC) in September 1979. Except where appropriate in Chapter IV, TEAC, which was the correct name during the time period of this evaluation, will be used throughout.

3TNEMP was reorganized and renamed Texas Energy Policy Project (TEPP) in the fall of 1979. Again, except where appropriate in Chapter IV, TNEMP will be used throughout this evaluation.

4For example, quick comparison of the entries under "Energy and Power" in The New York Times Index reveals a note of optimism in 1975 versus one of confusion and apathy in 1976.

5See R. A. Solo, Economic Organizations and Social Systems (Indianapolis: Bobbs-Merrill, 1967), Chapter 2, for an explanation of why a democratic legislative system is reactive rather than proactive.

6"Text of Speech by Carter on His Energy Program to a Joint Session of Congress," New York Times, April 21, 1977, p. 46.

7Executive Office of the President, Energy Policy and Planning, The National Energy Plan (Washington, D.C.: Government Printing Office, April 1977).

8M. L. Holloway, "The National Energy Plan and the Texas Energy Advisory Council's Freedom of Information Request: Interpretations and Implications," Texas National Energy Modeling Project Records (Austin, Texas), January 1978, p. 6.

9"Text of Speech by Carter," p. 46.

10Executive Office of the President, Energy Policy and Planning, NEP, p. 52.

11Ibid., pp. 53-55.

12The following concerns are taken from a letter from Representative Billy Tauzin to President Jimmy Carter, March 29, 1978, TNEMP Records; and "Documentation of DOE/TEAC Communication Concerning TEAC's Freedom of Information Request, February 7-March 23, 1978," TNEMP Records.

13These are on pages 95 and 96 of the NEP.

14The following paragraph summarizes "Documentation of DOE/TEAC Communication."

15These complaints about cancellation of the 1977 NEO may or may not have been well-founded. They stemmed from EIA's general lack of credibility with TEAC. Cancellation of the 1977 NEO, its adequacies and inadequacies, are discussed in Professional Audit Review Team, Activities of the Office of Energy Information and Analysis (Washington, D.C.: Government Printing Office, December 1977), pp. 27-28 and 35-36.

16Energy Conservation and Production Act, P.L. 94-385, August 14, 1976, Sec. 113 and 31(3).

17PIES was renamed Mid-term Energy Forecasting System (MEFS) sometime in 1979. Both acronyms will be used, depending on the time being referred to.

18Letter, John Treanor (DOE Information Access Officer) to M. L. Holloway, March 9, 1978, in "Documentation of DOE/TEAC Communication."

19Personal interview with an EIA official, Washington, D.C., March 28, 1979.

20PART, Activities of the OEIA. OEIA became EIA when the Department of Energy opened in October 1977.

21For example, OEIA staff numbered around 350 at the time of the first PART report (PART, Activities of the OEIA, p. 28) and EIA staff numbered around 720 in August 1978 ("Summary of the Texas National Energy Modeling Project National Advisory Board Meeting, August 11, 1978," TNEMP Records, p. 2).

22Memorandum for the Record from E. C. MacRae and J. Solow, concerning differences between the draft NEO/77 reference case and the President's program base case for 1985, as revised July 25, 1977, Federal Energy Administration Records included in the TNEMP Records. Also, glancing through the publications lists in the Energy Insider, a DOE biweekly newspaper, reveals a marked increase in the publication of EIA Technical and Analysis Memoranda, which document EIA work, toward the end of 1978.

23Letter, Alvin Alm to Representative Billy Tauzin, July 11, 1978, TNEMP Records.

24For a history of Texas energy agencies, see M. L. Holloway, "The Texas Energy Advisory Council and the Texas National Energy Modeling Project" (materials for Stanford seminar, October 19, 1978), TNEMP Records; and M. L. Holloway, R. King and M. Stevens, "A Study of the Texas Energy Advisory Council: Its Structure and Functions Relative to State Science, Engineering and Technology Transfer" (paper prepared for the National Science Foundation, June 1979), TNEMP Records, especially the Executive Summary and Chapter 7.

25Holloway, "The TEAC and the TNEMP," p. 2.

26Holloway, King and Stevens, p. 126.

27M. L. Holloway, "Texas National Energy Modeling Project: Purpose, Organization and Work Plan" (Report No. 78-06-01, June 23, 1978), TNEMP Records, p. 1.

28TNEMP's structure is clearly described in Holloway, "The TEAC and the TNEMP," pp. 6-7.

29Letter, M. L. Holloway to L. Zerby, October 3, 1978, TNEMP Records; and "Summary... August 11, 1978," TNEMP Records, p. 10.

30For a definition of institutional analysis, see A. A. Schmid, "Analytical Institutional Economics: Challenging Problems in the Economics of Resources for a New Environment," American Journal of Agricultural Economics 54(5):893-901, December 1972.

31G. L. Johnson and J. L. Brown, An Evaluation of the Normative and Prescriptive Content of the Department of Energy Mid-term Energy Forecasting System (MEFS) and the Texas National Energy Modeling Project (TNEMP), Texas National Energy Modeling Project, Part III, ed. by M. L. Holloway (Austin: Texas Energy and Natural Resources Advisory Council, forthcoming).

32See M. L.
Holloway, ed., Texas National Energy Modeling Project: An Experience in Large-scale Model Transfer and Evaluation, Part II (Austin: Texas Energy and Natural Resources Advisory Council, 1980).

CHAPTER II
A METHODOLOGY FOR EVALUATING AN INFORMATION SYSTEM

Methodology is taken here to mean a generalized conceptual framework accounting for the ensemble of activities and principles of procedure involved in producing scientific (objective) knowledge. The purpose of methodology is to guide, coordinate and justify research. This chapter develops a methodological framework for evaluating an information system, and particularly its normative and prescriptive content. It points out when, how and why value content enters into information. First the nature of information, then of an information system, and lastly issues in evaluating an information system are discussed. It is neither possible nor necessary to explore all of the potential issues and presuppositions of information systems in this chapter. Throughout, those issues and presuppositions most relevant to evaluating EIA are emphasized.

What is Information?

Information is an input to decision making.1 It consists of data, theory and a context. All information can be regarded as purposive, designed to solve problems or to aid in understanding phenomena. A first step in producing information is to ask, "What are the purposes of this inquiry?" Nor do purposes of inquiry stand in isolation. People have purposes. Hence identifying the purposes of an inquiry also requires identifying known or potential users of information. Within this context of known or potential users and uses, appropriate theoretical concepts are used to select and interpret appropriate data. The result is information. Information is knowledge. Knowledge, too, results from using data and theory to produce understanding of some phenomenon by a knower.
The process by which information is generated in an information system is identical to the research process. It is the identity of information and knowledge that makes a methodological framework so appropriate for evaluating an information system. Information and knowledge can be classified various ways. Two interrelated classifications will be particularly useful here -- one having to do with the information's value content, another having to do with the information's use in the research process. With regard to value content, information can be classified according to whether it is normative, positive or prescriptive.2 Normative information has to do with goodness and badness, or with values. Positive information does not have to do with goodness or badness. Prescriptive information has to do with rightness and wrongness, or with what "ought" to be. Prescriptive information is necessarily developed from both positive information about reality and normative information about valuations of reality. It consists of information about goals and about decision rules for weighting and attaining goals. Rightness and wrongness and "ought" are action-oriented concepts whereas goodness and badness and the positive are not.3 To repeat, information can be regarded as purposive. Purposes have to do with reasons and motivations for actions or decisions. In short, purposes have much normative and prescriptive content. The purposeful context within which information is developed endows it with considerable normative and prescriptive content. The second useful classification of information has to do with whether it is descriptive, diagnostic, predictive or prescriptive.4 Information is intended to be a useful input to decision making. The process of decision making includes several steps. Each step both requires information and produces information in a cumulative, iterative process.
Decision making requires (1) identifying, prioritizing and selecting that about which decisions are to be made, (2) identifying alternative decision options, (3) selecting a decision rule, (4) selecting a particular decision option and (5) relevant background information. The relevant background information is descriptive. Information about identifying, prioritizing and selecting decision areas is diagnostic. Information about consequences of alternative decisions is predictive; it is of the "if... then" variety. Information recommending particular decision rules, decision options and strategies for implementation is prescriptive. Making a decision always requires descriptive, diagnostic, predictive and prescriptive information. These two classifications -- one according to normative and prescriptive content and the other according to use -- are interrelated also. Descriptive information is positive and/or normative but not prescriptive. Diagnostic information is, in a sense, pre-prescriptive; it is about decisions that "ought" to be made. A prediction is a statement of the consequences which will follow from a given action. Unlike description, prediction entails notions of causation or correlation. The actions and the consequences of which predictions are made may themselves have normative or prescriptive content. Predictions, however, make no claims about the goodness or badness, rightness or wrongness of either actions or consequences.5 Prescriptions go one step further. They do make claims about the rightness and wrongness of actions and consequences. Just as prescriptive information builds on or requires both positive and normative information, so, too, prescriptive information builds on or requires descriptive, diagnostic and predictive information. It could also be said that the needs for descriptive, diagnostic and predictive information are derived from the need for prescriptions.
For the purposes of this study, the important point is that descriptive and predictive information may have normative content, and predictive information may in addition have prescriptive content, while diagnostic and of course prescriptive information always have both prescriptive and normative content. Depending on the particulars of the information under consideration and the particular institutional arrangements for producing and using it, information may have several public good characteristics.6 For example, some information may be used over and over again by different decision makers without diminishing its value. Alternatively, broad access may reduce the value of information to some but not all, as in the case of trade secrets. Both of these cases raise questions of who ought to have access to information and of who ought to pay how much for it. Furthermore, information can be very expensive to produce -- too expensive for individuals to produce independently. This raises questions not only about mechanisms for group participation in the production of information but also about the advantages and disadvantages of duplicate information. Because of these and other public good characteristics, information provision is a common governmental function. Public provision of information calls for explicit, justifiable choices regarding what information to produce for whom for how much. Any information system and any evaluation of an information system must address these issues. Information is power. As an input to decision making, information is one source of power to affect if not direct decision making.7 Similarly, the ability to determine what information will be available to whom at what cost through which decision rules is a formidable source of power.
Bartlett has shown straightforwardly the significance of asymmetrical access to information in decision making.8 Indeed, some have claimed that freedom of speech is meaningless without fair access to information. Power is often assumed to be a necessary evil. This assumption is often accompanied by an implicit assumption that all games are zero-sum. No such assumptions are made here. Rather, power is viewed as a tool that may be either used or abused, a tool that may even unlock doors that otherwise would remain locked.9 Power is a substitute for infinitely expensive perfect knowledge and universal values. Information as power is raised here because design and evaluation of information systems must consider the issue of power. Information -- positive, normative and prescriptive -- may or may not be objective. Objective information meets the criteria of clarity, coherence, correspondence and workability.10 The criterion of clarity means that information is understandable and unambiguous. The test of clarity is a prerequisite to the other three tests. Information that is unclear or ambiguous can be interpreted in more than one way. One interpretation might pass the tests of coherence, correspondence and workability while another interpretation might fail. The criterion of coherence requires that information be logically consistent. The test of correspondence is sometimes called a test of experience. It requires that information "match up" well with other knowledge of reality. Reality is always selectively perceived not only through the five senses but also with the use of theories and other preconceptions. Hence, information can never be compared directly with reality but only with other concepts of and knowledge about reality. The test of correspondence means, for example, that descriptions are accurate statements of what is known about reality and that predictions lead to the consequences they predict.
The criterion of workability applies directly to prescriptive information and indirectly to other types of information. In a sense it is a special case of correspondence. It requires that prescriptions lead to the consequences they predict and that the consequences be desirable or right. Emphasis on consequences implies, of course, that prescriptions be action-oriented and capable of being implemented. Workability also implies that information be relevant and comprehensive -- i.e., that there be no omissions of data or theory relevant to making the decision. Since prescriptions are necessarily preceded by description, diagnosis and prediction, the test of workability also implies that description, diagnosis and prediction be relevant and comprehensive. The criteria for objectivity apply both to knowledge or information and to researchers. Objective researchers are those who submit, resubmit, and allow others to submit their research to the four tests of objectivity. Repeated testing may be necessary because, as knowledge about reality grows, information that is objective at one point in time may later become unobjective. Thus objectivity is based on a belief in the tentative, social nature of knowledge and of information.11 Objective information and objective investigators are better than unobjective because they are credible. Credible information will be trusted enough by others to be used by them in their decision making. Objectivity and credibility both presume honesty and integrity on the part of those who have produced the information. Some minimum level of trust is required to make knowledge transferable and usable by persons other than those who produced it. This discussion of objectivity is getting away from the nature of information per se.
Up to now the discussion of information has included its development from data, theory and context, its use as an input to decision making, its similarity to knowledge, two useful classifications of information, its public good characteristics, its relationship to power, and the requirements for objectivity and credibility. Evaluation of an information agency like EIA, however, requires evaluating not just the product -- the information itself -- but also the process by which that information is generated.12 This requires discussing information systems.

What is an Information System?

A model of an information system will be developed by (1) outlining the steps or functions involved in the overall process of generating information, (2) relating these steps to the types of information discussed above, (3) categorizing types of research or of information provision according to two classifications drawn from the literature, (4) detailing potential roles for formal models in an information system, (5) commenting on the importance of the (implicit) time dimension in the process, (6) identifying the various actors in the system according to their roles and philosophical orientations and (7) discussing the importance of the institutional context. The design and evaluation of an information system is related to the decision rules the decision makers use and to the system's power components. This goes beyond decisions about the information system per se to the problems the decision makers use the information system to help solve; the design and evaluation of information systems cannot be separated from the system's context of purposes, users and problems. The model of an information system presented here is both a generic model and a model designed to highlight those features of EIA that are to be evaluated -- namely, stances toward the normative and the prescriptive, roles for formal models, and the institutional context.
Steps in the Process of Generating Information

The steps involved in generating information are the steps of the research process. They include identification and selection, definition, analysis, synthesis, decision making and action taking, and evaluation.13 These steps are not so much things that must be done in sequential order as they are functions that any viable information system must be able to perform. Each step or function and criteria for evaluating it will be commented on briefly. The criteria are all aspects of objectivity as it was discussed above. Identification and selection in the context of an information system means determining the need for and purposes of the information. Information needs exist for a multitude of purposes and decision makers at any given time. Given scarce resources and costly information, they cannot all be satisfied. This function of an information system includes identifying, prioritizing and selecting those needs that are to be met. Criteria for objective identification and selection include clarity, coherence and workability, especially relevance and comprehensiveness. Information needs that cannot be clearly articulated cannot be adequately compared with other information needs. It is also difficult to continue with the process of designing an information system when information needs are ambiguous. In multi-purpose information systems, it is essential that the various goals be logically consistent or coherent with each other. Inconsistent goals will ultimately be unworkable. Relevance is important both because of the purposive nature of information and because of the opportunity cost of information. Comprehensiveness is important because omissions in information can be just as serious as not having any information. Comprehensiveness also means avoiding excessive duplication.
Overall, the function of identification and selection determines the potential usefulness of an information system and is an important source of the normative and prescriptive content of an information system. Definition involves clarifying and decomposing a somewhat general or broad information need into a series of specific requirements. Information requires both theory and data. (This is in addition to the context of purposes and users as identified in problem selection.) Generating information may require theoretical research, data collection and/or the development of new research techniques -- or it may only require using theories, data and techniques already available, albeit to serve a new purpose. The task of definition is to identify theory, data and technique needs. This demands careful coordination. Hence one criterion for objective definition is comprehensiveness. Another is relevance -- relevance to the information needs at hand. Clarity is always essential. Again, definition also endows an information system with normative and prescriptive content. Analysis14 begins the transition from identifying, selecting and defining information needs to generating the information required. Definition results in an information need decomposed into a series of component theory, data and technique needs. The task of analysis is to provide alternative ways of individually meeting each of these needs. The criteria for evaluating analysis vary somewhat depending on the nature of individual needs. In all cases, workability is applicable to analysis insofar as it must be kept in mind that the information is intended ultimately to be used in making decisions or prescriptions. Thus, relevance and comprehensiveness, with their implications for normative and prescriptive content, underlie analysis too.
Synthesis reverses the decomposition of analysis but continues the addition of normative and prescriptive content.15 It juxtaposes the individual information components so as to form a coordinated body of information and cross-compares the components for compatibility. Since synthesis immediately precedes use of the information in decision making, the tests for it are all-encompassing -- including coherence, correspondence, workability and, of course, clarity. Coherence is a check for consistency or compatibility among the theoretical foundations of different components of the information. Correspondence checks for consistency among the data inputs and empirical outputs of the different components. Comparing data and theories for consistency also implicitly compares different techniques used in analysis for compatibility. Comprehensiveness is a check for gaps or excessive duplications in the information relative to its intended purposes. Comprehensiveness, an aspect of workability, is a particularly important criterion for synthesis. The tests of coherence, correspondence and comprehensiveness are only useful, however, if overall the information system is relevant and otherwise workable. Decision making and action taking involve using the information to make and implement decisions or prescriptions. The test for an objective prescription is workability, which judges a prescription to be right or wrong according to the consequences it produces. Evaluating a prescription includes evaluating not only the prescription itself but also the decision rule used to produce it and the manner in which it was implemented. An unworkable prescription might be unworkable because it was based on unworkable information, an unworkable decision rule, or unworkable implementation. The normative and prescriptive aspects of workability have already been discussed sufficiently. Evaluation is the final necessary function in an information system.
Evaluation provides feedback (1) among the various components of an information system and (2) over time. For example, evaluation of synthesis may reveal a gap in the data and hence call for better definition and additional analysis. An evaluation of decision making may reveal that the information need has changed over time, calling for renewed identification and selection and adjustment of the whole information system. Evaluation helps to highlight the interdependence of all the steps in the process. It also illustrates that the steps in the process of generating information need not take place in any particular order -- other than that, in general, first there must be some information need. Actually, evaluation has been included as a component of each of the functions above; for each function criteria were given for evaluating whether that function had been performed adequately or inadequately -- that is, objectively. In evaluating an information system, however, it is also necessary to evaluate the system's own evaluation function and feedback mechanisms. Much more will be said about evaluation of information systems, and particularly about this evaluation of EIA, toward the end of this chapter. By now it should be clear that satisfactorily performing each component function of an information system requires considerations of the normative and the prescriptive. It should be obvious also that each step both requires and produces information. The types of information used and produced along the way vary, however.

Information System Functions vis-a-vis Types of Information

Identification and selection use descriptive information about what is and prescriptive information about what should be to produce a diagnosis. A diagnosis is a prescription about an information need that ought to be met as well as a description about what the information need is.
Definition breaks the diagnosis into a series of component diagnoses and may require additional descriptive information. Analysis and synthesis generate predictions from diagnostic and descriptive information. Decision making generates prescriptions (decisions). Action taking requires more prescriptions about the best way to implement a decision. Again, both may require supplementing information produced by the previous step with new descriptive information to produce new prescriptions. Evaluation also is both description and prescription. It is description of consequences and prescription of ways and means of improving consequences. In addition to the different types of information involved at each step, there are also different types of research processes by which information is generated.

Types of Research

In each step of the information system or, equivalently, of the research process, research has a slightly different orientation. It has already been described how identification and selection are descriptive and diagnostic, decision making is descriptive and prescriptive and so on. There are many different schemata for classifying research. Such classifications are only important if they help the researcher to be clear about the purposes of his research. Another classification developed by Johnson has useful implications for designing and evaluating information systems. This classification differentiates research according to whether it is disciplinary or multidisciplinary and according to the clarity and specificity of the research's or the information's potential uses. Johnson classifies research into the categories disciplinary, subject matter and problem solving.16 Disciplinary research has to do with improving the theories, data, and techniques of a discipline. Disciplinary research may be related to solving a specific problem or set of problems and hence of known relevance or purpose, or it may be of unknown relevance.
Disciplinary research is not as relevant as the other two types for discussing information systems because (1) information systems are always of known relevance, (2) although information systems may apply the results of disciplinary research, little disciplinary research is done within the context of providing information per se, (3) to the extent that disciplinary research is done in the course of developing information, the disciplinary findings often can be considered spin-offs of either subject-matter or problem-solving research and (4) information systems are multidisciplinary and hence not disciplinary. Subject-matter and problem-solving research are both multidisciplinary. Subject-matter research is concerned with information relevant to sets of problems. The decision makers to whom the subject-matter findings should be useful may be more or less well identified. Subject-matter research does not provide answers to specific problems. Much of the work that goes on within many practical information systems is subject-matter research; the system's overall purpose rather than specific problems and specific decision makers provide direction. Problem-solving research deals with a particular problem of a well-identified decision maker (or decision makers). Problem-solving research is carried out with the intention of generating a solution to the specific problem under investigation. The functions of decision making and action taking are problem-solving functions. Other steps of the information system may involve problem-solving research as well, depending on the specifics of individual cases. Ignoring disciplinary research, Johnson's subject-matter and problem-solving classes basically divide Shaffer's four categories into eight. Descriptive research may be either problem solving or subject matter, depending on whether the description is being done to solve a particular problem or to provide information for a more general problem area.
Diagnostic research can also fall under either subject-matter or problem-solving headings. Diagnostic research to select, identify and define problem areas is subject-matter research, whereas diagnostic research to define a specific problem is problem solving. Similarly, predictions and prescriptions can be either subject matter or problem solving. The problem solving/subject matter distinction is an important one for information provision. Each places a different emphasis on the importance of identifying, selecting and defining problems or information needs. Problem-solving research frequently takes the problem as given and concentrates on proposing solutions and operationalizing performance goals. Subject-matter research, on the other hand, focuses to a greater extent on identifying and selecting problems, with some attention directed at developing general solutions and no attention directed at developing specific solutions. Each also places different emphasis on the range of applicability of information. Problem-solving information is expected to be of definite but limited applicability, whereas subject-matter information is of less definite but presumably wider applicability. It has been stated that theory and data are interactive in generating information and that the ultimate usefulness of information in decision making depends on the earlier functions of identification, selection, analysis and synthesis. What exactly are the roles for formal models in all these processes?

The Roles for Formal Models

Cohen and Cyert define a model simply as a set of assumptions and conclusions which are logically derived from the assumptions.17 Greenberger, et al., describe models and modeling as resting on a troika of theory, data and technique and add that a reference system (knowledge about the aspects of reality being modeled) is also a necessity.18 Models, after all, are models of something.
Links between reference system and model include (i) the nature of the problem, which guides the choice of a model;19 (ii) the data; (iii) any undefined primitive terms used; (iv) any assumptions made in the model, including implicit assumptions about the institutional framework of the reference system; (v) theory which is used to derive conclusions and which, importantly, incorporates knowledge about how things happen, about processes, in the reference system; and finally (vi) any conclusions drawn, which may be checked against the reference system. Thus clarity, coherence, correspondence and workability all pertain to evaluating models and modeling too. Modeling is an attempt at being systematic. Computers are a tool for storing and retrieving information and for performing computations. Integrated computerized systems of models can be devices for organizing and coordinating interconnected activities. Furthermore, with their capacity for detail, their need for explicitness, their ability to illuminate assumptions, and their flexibility allowing for the gradual accumulation of knowledge, systems of models appropriately used can enhance the objectivity of a research process. More specifically, contributions computerized models and model systems can make include:20

1. Organizing data and being able to recall historical data in meaningful arrays. This can lead to large reductions in computation and bookkeeping chores.

2. Defining and even helping to prioritize data needs. Model structures will highlight data deficiencies whereas sensitivity analyses will help indicate which data needs are most important. Furthermore, data series which monitor the development of the reference system will indicate needed adjustments in the data over time.

3. Clarifying assumptions.

4. Structuring discussion by integrating issues, improving understanding of interdependencies and counterintuitive results, and highlighting mutually inconsistent policies.
5. Organizing and presenting the analysis and the results.

6. Working through decision processes by asking "what if" questions.

7. Educating modelers and decision makers.

As Hogan notes, the process of model construction and models' serendipitous uses often prove more fruitful than their intended results.21 In view of the expense (both dollar-wise and time-wise) of modeling, Hogan further prioritizes the roles of modeling in the following order of increasing usefulness: data reporting, data development, answering ad hoc questions, answering recurring questions, and analyses for periodic reports.22 Benefits from modeling, especially with respect to potential contributions of lesser priority, can be increased by careful design and coordination of models for multiple uses. In terms of the functions performed by an information system, useful applications of modeling fall primarily under analysis and synthesis. Models may be problem solving, subject matter or disciplinary. In terms of different kinds of information, models mainly produce descriptions (e.g. when used for data reporting) and predictions (e.g. when used for answering questions). In view of earlier discussion regarding the normative and prescriptive content of analysis and synthesis and of prediction and prescription, it follows that models too have normative and prescriptive content. In addition to understanding the potential contributions of modeling, it is crucial to understand the limitations of modeling. Except in the case of some disciplinary models, models are developed in response to problems and information needs. Models perform poorly the functions of identification and selection. Modeling generally is preceded by these functions.
Greenberger, et al., note that the partnership between modeling (and the power of knowledge) and political power shifts in favor of modeling with a shift toward systematic testing of policy options (i.e., analysis and synthesis) as opposed to the invention of policy options (i.e., identification, selection and definition).23 Furthermore, many models are specific to a problem (or problem set) or information need. Modeling designed to address one set of information needs may have limited ability to address even a seemingly similar set of needs. For example, macro models built from aggregate data and relationships cannot answer questions calling for regional or sectoral detail, even though the latter generally require the former. Models must be changed both in response to the inevitably changing nature of any problem area or information need over time and in response to changes in the agenda of important problems and information needs. Hogan emphasizes that useful modeling in the context of changing information needs requires generalized confidence or credibility in models and modelers.24 Another set of issues potentially limiting the usefulness of modeling revolves around the ultimate purpose of models in an information system, which is the generation of information useful to decision makers and executives. This calls for good communication between modelers and users and channeling of modeling activities to contribute to users' understanding and insight and ultimately to their making prescriptions. Iterative interaction must be an essential part of the modeling process. These and other contributions and limitations of modeling are contingent also on the time element, the modelers and other people involved, and the particulars of the institutional context.

The Time Dimension25

The provision of information has been broken into a series of steps or functions. Performing each of these functions takes time.
Different functions require different amounts of time; even the same function requires different amounts of time when performed for different purposes. Time requirements and the temporal ordering of the various steps are important aspects of understanding and evaluating an information system. It is the time element that makes an historical approach to information systems so important. The various steps may be expected to fall into certain sequences and cycles. Bonnen notes that initially many information systems are largely descriptive and that only after a descriptive foundation has been laid does a demand for information in a "learning or developmental mode" (i.e., diagnostic, predictive and prescriptive) develop.26 Similarly, subject-matter research, with its emphasis on diagnosing sets of problems, might be expected in many instances to precede problem-solving research. Other cycles and sequences may become obvious in the context of particular information systems. The various functions in an information system may require different amounts of time. This can result in bottlenecks and tensions between functions. Two obvious processes in an information system which have different appetites for time are the actual production of information versus its use in decision making.27 This is especially true when decision makers include participants in the public policy process. Identification, selection, analysis and synthesis have the potential to consume inordinate amounts of time. Information needs can be difficult to scout out and prioritize. The data can always be improved, theories can always be refined, and issues can always be analyzed from one more perspective. Decisions, however, often must be made within a definite time frame, whether or not complete information is available to aid decision makers. Failure to coordinate the use of information in decision making and action taking with the production of information can be to the detriment of both.
Information that is not timely can result in uninformed decisions and actions with unintended and undesired -- unworkable -- consequences and/or in the use of information for ex post facto justification of decisions and actions. Similarly, attempts to speed up information production too much will result in information of poor quality and/or in the appearance if not the fact of gathering only that information useful for an advocacy position. Damaging an information system's reputation for objectivity and credibility damages its usefulness. Other sources of tension due to differing time requirements may come to light in the context of particular information systems. Another interesting aspect of the time dimension, particularly for this study, concerns the timing of evaluation. Evaluation, like the other functions of an information system, is intended to be useful. It must be timed, and relevant performance criteria must be selected, so as to enable improving the information system. Suchman notes that many programs -- which include information systems -- go through a life cycle from planning (to design a program) to demonstration (to develop a prototype experimental program) and finally to operation of an on-going program.28 The potential for change is different at each stage and evaluations should be designed to take this into account. In a similar vein, Scriven distinguishes between formative evaluation, which evaluates an on-going activity, and summative evaluation, which evaluates a completed activity.29 Formative evaluation might be expected to place relatively more emphasis on the process by which information is provided, whereas summative evaluation might be expected to emphasize end results.
Hatry, et al., recommend that evaluation of the diagnostic function, which they call "effectiveness status monitoring," take place regularly but that evaluation of the decision-making function be performed on an "as needed" basis to guarantee relevance.30 In addition to choosing an appropriate time for evaluation, it is necessary to know for whom an evaluation is being performed. This raises the issue of identifying the actors involved in an information system.

The Actors

Actors can be conveniently classified according to the roles they play in an information system and according to their individual philosophical orientations with respect to the research and policy processes, and particularly with respect to the meaning of objectivity. Such classifications are useful and important for recognizing what gets done by whom and how in an information system. The roles participants play in an information system include, for example, information system designers; agenda setters (those who work on identification, selection and definition); analysts and synthesists, including modelers; users, or decision makers and executives; and evaluators. The role influences the behavior of the participant and hence the performance of the system. For example, Bonnen observes that the initial designers of an information system usually understand and take into consideration the whole of the system.31 Later, however, specialization tends to increase, and data collection or modeling may come to be an end in itself. This is especially likely to occur in the absence of mechanisms whereby user groups express their preferences for information. Similarly, evaluators will be more likely to select relevant criteria for evaluation if they are aware of uses for and users of the evaluation.
The need for effective coordination of the various roles involved in an information system is captured by Bauer's notion of "boundary roles."32 Boundary roles include those roles that are involved in transitions from one function to another. Information systems must be designed and operated so that the various roles complement each other, resulting in an objective, credible information system. Philosophical orientations of information system participants also influence their behavior and the system's performance. The classification to be used here groups participants according to their views on the meaning of objectivity. This classification, although not necessarily the interpretation of it given here, has been widely used by Johnson.33 It includes as interpretations of objectivity the following five positions:

1. positivism, which maintains that scientific research is and should be value-free and that objective, empirical knowledge of the normative is impossible. Objectivity is taken to mean clarity, coherence and correspondence with respect to positive knowledge.

2. conditional normativism,34 which is a partial adaptation of positivism to allow for some normative and prescriptive content in scientific research. It maintains the claim that objective empirical knowledge of the normative is impossible but allows for the inclusion of values and goals by assumption. Conditional normativism treats values and goals as arbitrary; they are not researched or questioned, just made explicit and assumed. Objectivity means clarity, coherence and correspondence with respect to the positive and clarity with respect to assumptions about the normative.

3. normativism, which maintains that empirical knowledge of the normative is possible. Objective normativism takes objectivity to mean clarity, coherence and correspondence with respect to the normative.
Objective normativists also frequently accept the possibility of objective positive knowledge, in which case they may also include workability as a component of objectivity vis-a-vis prescriptive knowledge.

4. pragmatism, which holds (i) that positive and normative knowledge are interdependent and inseparable in the context of a problem and (ii) that knowledge should be accepted or rejected according to the consequences it produces -- i.e., that workability is the criterion for truth. Objectivity is taken to mean workability, with the usefulness of the tests of clarity, coherence and correspondence derived from the criterion of workability.

5. eclecticism, which is an attempt to draw on the strengths and overcome the weaknesses of each of the above positions. It accepts the possibility of objective knowledge of the normative, the positive and the prescriptive and suggests that the positive and the normative are sometimes interdependent and sometimes not. Eclecticism recognizes that each of the above positions can be very useful in particular instances. Basically, eclecticism takes the stand that the problem determines the method, and hence appropriate philosophic positions are determined on a case-by-case basis. The eclectic's test for objectivity includes the tests of clarity, coherence, correspondence and workability, with different aspects being emphasized in different instances.

Eclecticism was just described as an attempt to overcome the weaknesses and build on the strengths of each of the other philosophic positions -- yet nothing has been said yet about the strengths and weaknesses of each position. Individuals with particular philosophic orientations will be better at performing some functions in an information system than others.
Positivists will in all likelihood not be well trained to perform those functions that most intimately involve the normative or the prescriptive -- this includes identification and selection, decision making and action taking, and evaluation. On the other hand, positivists usually are trained to be very systematic and hence may be skilled at analysis and synthesis with respect to questions about the positive nature of reality. Of course, analysis and synthesis do entail normative and prescriptive content. Many positivists partially skirt the proscription on studying the normative by using definitions different from those used here of what is and is not value-free and of what is and is not science.35 Many positivists consider quantitative normative information (e.g., incomes and prices) to be positive. Many also consider description and prediction to be purely positive. Similarly, positivists often do not consider identification and selection to be scientific activities, and although they do participate in them they do not evaluate them for objectivity. Nevertheless, positivism severely constrains the ability to make prescriptions and it leaves unclear the criteria for evaluating the normative and the prescriptive. Conditional normativists will be better than positivists at evaluation and perhaps at identification and selection, especially in those cases where criteria for evaluation, identification and selection can be clearly specified. They will be able to make conditional prescriptions -- i.e., predictions with normative or prescriptive content. Conditional normativists, however, will not be adept at tasks that require setting priorities, since they treat values and goals as arbitrary. Only those who believe in the possibility of objective policy making -- this includes objective normativists, pragmatists and eclectics -- will be capable of responsible, accountable, informed decision making and action taking.
Objective normativists and eclectics are likely to be better than pragmatists at identification and selection; pragmatists tend to emphasize solution of problems and information needs more than their selection. Those who emphasize workability -- which includes eclectics, pragmatists, and those objective normativists who also believe in the possibility of objective positive knowledge -- will recognize the need for evaluation. Eclectics are particularly well suited for administrative and supervisory tasks, since they are more likely to recognize the strengths and weaknesses of various individuals. Different philosophical orientations are differently suited also for problem-solving, subject-matter and disciplinary work. Pragmatists and eclectics are particularly suited for multidisciplinary research, including problem-solving and subject-matter research. Some disciplines tend to be positivistic, for example the "hard" sciences and statistics. Others tend to be normativistic, for example the study of ethics. Still other disciplines, like economics and psychology, are able to accommodate a mix of philosophies. The disciplinary background of an individual is often a clue to his or her research philosophy. As with mixing roles in an information system, so it is important to mix philosophical orientations so as to maintain a system with the capacity to objectively identify, select, define, analyze, synthesize, use and evaluate information needs and information. The behavior and performance of an information system, however, are shaped not only by the philosophical orientations of the participants but also by the manner in which the system is instituted.

The Institutional Context

The conceptual framework developed so far postulates an information system as a system of interacting, interdependent functions and actors.
It is a major premise of this evaluation that the manner in which the system is instituted shapes the behavior and performance of the system and has important bearing on the system's normative and prescriptive content. Schmid defines institutions as "... sets of ordered relationships among people which define their rights, exposure to the rights of others, privileges and responsibilities."36 In short, institutions direct interdependencies. The structure and character of an institution are determined not only by formal laws and regulations but also informally by interpretations of the formal structure. Examples of the kinds of factors to be considered in institutional design and institutional evaluation will be discussed briefly. Roberts suggests that institutions have ideologies, sets of beliefs about the institution's philosophy, values and goals.37 Individual members of an institution may or may not interpret the institutional ideology differently, and institutional performance may or may not reflect a variety of individuals' interpretations of the ideology. Institutions in which many individuals share a common interpretation of the institutional ideology are said to have a strong ideology. Institutions in which the ideology of top management alone dominates are said to have a strong system of controls and incentives. Roberts then develops a number of hypotheses about institutional performance under stronger and weaker ideologies and stronger and weaker systems of control.
For example, Roberts hypothesizes that (i) under stronger systems of control, institutional performance will be more likely to reflect a consistent set of values and goals than under weaker; (ii) under weaker systems of control, more reliance will be placed on ideology to guide behavior and the consistency of institutional performance will depend on the strength of the ideology; and (iii) under weaker systems of control and in the absence of a strong ideology, individuals will be delegated authority with little supervision and the values and goals implicit in institutional performance will vary. The scope for an individual to influence institutional performance depends on the resources he controls, his influence on others, how much he tries to exert influence, and whether or not individuals with similar institutional goals group together to pursue their common goals jointly. Another important aspect of an institution is its boundaries. Institutional boundaries determine who and what are included within an institution's span of control.38 They may delineate both geographical area and, more importantly for this study, lines of authority. An information system may -- indeed usually will -- encompass more than one institution. Users alone are likely to be participants in many different institutions. Crossing boundaries requires careful coordination and communication. Arrow notes that the need for communication mutually understandable to various participants imposes a uniformity requirement on the behavior of the participants and is one means by which an institution molds behavior.39 Wallich observes that different types of
communication are appropriate for different audiences; anonymous or generally authored publications may be appropriate when the audience is the general public, whereas authored releases of restricted distribution may be more appropriate for information that is a direct input to policy making.40 Bonnen writes that guaranteeing the objectivity of information calls for definition, analysis, synthesis and evaluation to be independent from but accountable to users;41 Wallich adds that independence implies relying for effectiveness on respect and technical competence -- or in the terms of this study, on credibility.42 The ease and effectiveness with which transfers across boundaries are made depend on the magnitude of transactions costs and who bears them.43 Bauer notes that unsolicited feedback is normally provided only by organized interests and only when they are hurt, and furthermore that spontaneous feedback is likely to be perceived highly selectively.44 In the absence of explicit arrangements to redirect transactions costs, those who are rarely heard from include the unorganized, the powerless, the inarticulate, the unaware, and those to whom it may never occur to respond.45 The latter group often includes friends of the institution, whose criticism is more likely to be favorable and/or constructive. Roberts hypothesizes that an institution's own system of control will have more influence on institutional behavior and performance when transactions costs to actors outside the institution are high.46 Bauer sees boundary roles as very important for the flow of information through an information system.47 Boundary roles include positions which "represent" an institution and feed information from it to external actors, and "contact positions" that reach out to solicit information from the institution's environment. Boundary roles are sources of institutional stress and cross-institutional mediation.
Bonnen and Bauer have tackled the issue of appropriate institutional boundaries in relation to the scope of an institution's purpose.48 Bonnen notes that most problem areas are multidisciplinary and hence that information systems must be able to synthesize information from different disciplines. He also notes that subject-matter information, which is relevant to a range of problems, is often put to unanticipated uses by unanticipated actors. This can but does not always result in unobjective use of information, especially in the absence of adequate evaluation. Bauer realizes that an institution's scope may be either too broad or too narrow. A broad scope will make it impractical to provide detailed information and a narrow scope will make it difficult to provide macro information. These merge into issues of how an institution changes its scope. Finally, the manner in which an institution evolves or changes over time is an important aspect of institutional behavior and performance. Systems of control, communication mechanisms and boundaries are important here too.
As noted above, institutions carefully select feedback; in part this is because becoming aware of and being known to be aware of a problem generates an expectation that the institution will do something about it.49 Roberts' hypotheses in this area include (i) that change will be slow and performance more predictable when systems of control are weak and ideology is strong, (ii) that an important source of change is turn-over among participants, and (iii) that structural changes often lead to unanticipated changes in behavior and performance.50 Arrow adds that the desirability of predictability versus flexibility depends in part on the availability of corrective mechanisms and the consequences of making a mistake.51 Arrow also notes that it is very difficult to change the structure of an institution because of the many participants involved and that one structure may be appropriate for one purpose and inappropriate for another.52 The latter poses problems not only for multi-function institutions but also for changing functions. Wallich realizes the importance of informal practices in facilitating change, for example whether implementation is based on the letter of the law or the spirit of current preferences.53 Bonnen sums up many of the issues involved in institutional change when he categorizes ways in which information systems may become obsolete or fail to evolve.54 These include conceptual obsolescence caused by changes in information needs, empiric failure to design and collect appropriate data, and institutional obsolescence or failure to re-design institutions in response to changed functions. This conceptual framework sets quite an agenda for evaluating an information system. The intent was to enable an understanding of an information system as a system. Not all of these issues can or should be addressed by this evaluation. It remains in this chapter to more clearly demarcate the boundaries of this study.
The Agenda for Evaluating the Normative and Prescriptive Content of EIA

The agenda for this evaluation was set by (1) the recognition by TNEMP's NAB of the importance of TNEMP's initial oversight of the normative and the prescriptive, (2) an examination of the conflict between TEAC and EIA to determine the nature of the problem, (3) a clear specification of TNEMP's goals and (4) a determination of what EIA is supposed to do, keeping in mind all the while TNEMP's complaints and goals and the focuses of the other TNEMP evaluations. The results of the first three steps in the agenda-setting process have already been presented in Chapter I. The procedure with respect to the determination of what EIA is supposed to do, as well as the actual evaluation of what EIA does do, was as follows. EIA's values and goals were identified by studying its enabling legislation. The values and goals of EIA's predecessors were identified too, because one aspect of determining whether an institution has the capacity to stay relevant is determining the mechanisms by which it changes. The next step was to determine how EIA adds meaning to the generalities in the legislation. Characteristics of EIA's personnel; decision-making processes within EIA, including decisions about the development of the models and the selection of topics on which EIA provides information;55 and interactions between EIA and its clients and critics were examined. Finally, EIA was evaluated (1) for the objectivity of the values and goals themselves as specified in the laws, (2) for the objectivity of EIA's interpretations of and operations with respect to its legislative mandate and (3) for the objectivity of TEAC/TNEMP's expectations of EIA. These final claims and suggestions for improvement are in Chapter IV. Evaluation can be said to consist of two components: (1) description to determine what has been done and (2) judgment or prescription as to whether what has been done was right or wrong.
For the most part, Chapter III presents the description and Chapter IV the judgments. This is not an evaluation of an entire information system. It is just an evaluation of the normative and prescriptive content of the Energy Information Administration. In terms of the functions of an information system, the emphasis is on the functions of identification, selection, definition and evaluation. Analysis and synthesis were evaluated by the TNEMP Analysis Team, and it would have been more worthwhile to evaluate the TNEMP evaluations than to do what they had done over again.56 There is some mention nonetheless of the processes by which analysis and synthesis occur. The functions of decision making and action taking are not evaluated at all, since these functions are not performed by EIA personnel. EIA is, however, responsible for producing information useful to decision makers and executives, and hence there is considerable examination of the interactions between EIA and its clients. Power relationships are not evaluated either. In particular, the politics involved in designing or re-designing the EIA in its annual budget appropriation would have to be considered before any of the recommendations for improving EIA made in Chapter IV could be implemented. Bauer writes that there are two kinds of evaluation, one kind that specifies particular anticipated effects and determines whether those effects occurred and another kind that starts with an examination of values and goals and assesses performance relative to them.57 This evaluation is of the second kind. There are several reasons why the latter was more appropriate. First, EIA's mandate clearly specifies values and goals and EIA activities clearly have been shaped with them in mind. Also, energy policy has been an area of such confusion that it would be difficult indeed to specify particular anticipated effects and determine whether they had occurred.
Second, much normative and prescriptive content results from the cumulative selection of problems, performance and evaluation criteria, and feedback. Bauer's value- and goal-based evaluation is more process-oriented. Third, the other TNEMP evaluations were primarily evaluations of models per se and of EIA publications. It seemed better to complement those product evaluations with a process evaluation. Finally, this evaluation is a formative one. It was desirable to produce results that were general enough to be useful in re-shaping EIA and/or TNEMP. An evaluation that focused on values, goals and processes seemed most suitable.

It seems appropriate to close this chapter with an explicit statement of the research philosophy underlying this evaluation. This evaluation is an attempt at objective pragmatism. It is pragmatic rather than eclectic because the author thinks there is a right way and a wrong way to conceptualize about research, and she thinks that positivism and conditional normativism are not adequate for this purpose and that eclecticism lacks cohesiveness and strength for general conceptualizing. The philosophy is nonetheless pluralistic. It is recognized that other researchers have been trained and think differently, that they have a right to their views and to practice their views, and that they can make valuable contributions providing there are appropriate coordinating and oversight institutions. This evaluation is pragmatic rather than objectively normative because the interdependence of the positive and the normative is accepted. However, the author does not think that accepting the interdependence of the positive and the normative precludes emphasizing one or the other in the context of a particular problem in a sort of bottleneck approach.

Many of the methodological issues raised and philosophical pronouncements made in this chapter are subject to much controversy.
But rather than go off on esoteric philosophical tangents to justify them, it is better to justify them by seeing if they lead to useful insights about the strengths and weaknesses of the EIA.

CHAPTER II ENDNOTES

1For a discussion of what information is and is not, see J. T. Bonnen, "Assessment of the Current Agricultural Data Base: An Information System Approach," in A Survey of Agricultural Economics Literature, Vol. II, ed. by L. R. Martin and G. J. Judge (St. Paul: American Agricultural Economics Association, University of Minnesota Press, 1977), especially pp. 395-400.

2This classification has been widely used by Johnson. See for example his "Basis for Evaluation," in The Overproduction Trap, ed. by G. L. Johnson and C. L. Quance (Baltimore: Resources for the Future, The Johns Hopkins University Press, 1972), especially pp. 44-48. There is, however, a slight difference between Johnson's use of this classification and my own; Johnson is a realist with respect to the normative and I am an anti-realist.

3From this point on, "value content" and "normative and prescriptive content" will no longer be used as synonyms. Values are normative and goals and decision rules are prescriptive.

4This classification comes from J. D. Shaffer, "On the Concept of Subsector Studies" (paper presented at the Technical Seminar on Subsector Modeling of Food and Agricultural Industries, Department of Agricultural Economics, University of Florida, March 30, 1970). What Shaffer calls projection, I call prediction.

5In terms of terminology to be developed later in this chapter, predictions with normative or prescriptive content are conditionally normative.

6Schmid discusses information as power in Power, Property and Public Choice: An Inquiry Into Law and Economics (New York: Praeger, 1978). He would, however, object to use of the term "public good."

7Other types of power include market power, military and police power, political power, etc.

8R.
Bartlett, Economic Foundations of Political Power (New York: The Free Press, 1973). See also Samuels' review of Bartlett's book in the Journal of Economic Issues X(1):181-185, March 1976.

9See C. Sower and P. A. Miller, "The Changing Power Structure in Agriculture: An Analysis of Negative Versus Positive Organization Power," in Our Changing Rural Society: Perspectives and Trends, ed. by J. H. Copp (Ames: Iowa State University Press, 1964), especially pp. 127-131.

10This definition of objectivity follows G. L. Johnson and L. K. Zerby, What Economists Do About Values: Case Studies of Their Answers to Questions They Don't Dare Ask (East Lansing: Department of Agricultural Economics, Center for Rural Manpower and Public Affairs, Michigan State University, 1973), p. 12.

11Knowledge that is not social is subjective. Subjectivity has nothing to do with whether knowledge has normative or prescriptive content. According to the definitions used here, positive, normative and prescriptive knowledge can all be objective.

12The product/process distinction is a common one in the philosophy of science literature, where science is often considered to be both the product of research and the process of research. See for example, E. McMullin, "The History and Philosophy of Science: A Taxonomy," in Historical and Philosophical Perspectives of Science, Minnesota Studies in the Philosophy of Science, Vol. V, ed. by R. H. Stuewer (Minneapolis: University of Minnesota Press, 1970), pp. 15-16.

13These steps are adapted from Johnson's problem-solving process. See for example, G. L. Johnson, "Philosophic Foundations: Problems, Knowledge and Solutions," in Agricultural Change and Economic Method, authored by the Transatlantic Committee on Agricultural Change (The Hague: European Review of Agricultural Economics, 1976), p. 226. Above, on page 20, description, diagnosis, prediction and prescription were also referred to as steps in the research process. The relation between the two lists will be discussed below.

14In this thesis, analysis and synthesis are used in the sense of "breaking down" and "building up" rather than in the philosopher's sense of the analytic/synthetic distinction.

15The function of synthesis is frequently and unfortunately overlooked by methodologists. For example, H. A. Simon's The Sciences of the Artificial (Cambridge: Massachusetts Institute of Technology Press, 1969) is a delightful essay on decomposition to the complete neglect of recomposition.

16This is another classification widely used by Johnson. See for example, Johnson and Zerby, What Economists Do About Values, pp. 8-9.

17K. J. Cohen and R. M. Cyert, Theory of the Firm: Resource Allocation in a Market Economy (2nd ed.; Englewood Cliffs, New Jersey: Prentice-Hall, 1975), p. 18.

18M. Greenberger, M. A. Crenson and B. L. Crissey, Models in the Policy Process (New York: Russell Sage Foundation, 1976), Chapter 3, especially pp. 64-76. As is common among modelers, the authors refer to specific research methods as "methodologies." Methodology as used in this thesis, however, has a much broader meaning.

19Primitive terms are used to transform abstract statements into empirical statements. For example, x + y = z is abstract while blue + yellow = green is empirical; blue, yellow and green are primitives.

20This list has benefited from readings of W. W. Hogan, "Energy Modeling: Building Understanding for Better Use" (paper presented at the Second Lawrence Symposium on the Systems and Decision Sciences, Berkeley, California, October 3, 1978); W. W. Hogan, "The Role of Models in Energy Information Activities" (paper presented at the Stanford Workshop on Energy Information, Palo Alto, California, December 15-16, 1976); and W. E. Gahr, "Can Models Help Congress Make Decisions" (statement to the Rockefeller Foundation Conference on Large-scale Systems Analysis, System Research Center, Cleveland, Ohio, January 7, 1977).

21Hogan, "The Role of Models," p. 19.

22Ibid., pp. 27-33.

23Greenberger, et al., Models in the Policy Process, p. 23.

24Hogan, "Energy Modeling," p. 5.

25Of course, information systems have many dimensions -- geographical, financial, distributional, legal, in addition to temporal, to name just a few. Some of these dimensions are covered in other sections of this chapter. The time dimension in particular is singled out because of its relevance to evaluating EIA.

26Bonnen, "Assessment," p. 400.

27Greenberger, et al., discuss the tensions between the knowledge function and the decision function in their Chapter 2.

28E. A. Suchman, "Action for What? A Critique of Evaluative Research," in Evaluating Action Programs: Readings in Social Action and Evaluation, ed. by C. H. Weiss (Boston: Allyn and Bacon, 1972), pp. 56-64.

29M. Scriven, "The Methodology of Evaluation," in Evaluating Action Programs, pp. 123-126.

30H. P. Hatry, R. E. Winnie and D. M. Fisk, Practical Program Evaluation for State and Local Government Officials (Washington, D.C.: The Urban Institute, 1973), pp. 3, 12, and 108.

31Bonnen, "Assessment," p. 403.

32R. A. Bauer, "Detection and Anticipation of Impact: The Nature of the Task," in Social Indicators, ed. by R. A. Bauer (Cambridge: Massachusetts Institute of Technology Press, 1966), pp. 61-62.

33See for example, Johnson, "Philosophic Foundations," pp. 213-221.

34Gunnar Myrdal is the most famous spokesman for conditional normativism. See Appendix 2, "Note on Facts and Valuations" of The American Dilemma (New York: Harper Brothers, 1944).

35See M. Friedman's Essays in Positive Economics (Chicago: University of Chicago Press, 1953) for one example of alternative definitions.

36Schmid, "Analytical Institutional Economics," p. 893.

37This paragraph summarizes M. J.
Roberts, "An Evolutionary and Institutional View of the Behavior of Public and Private Companies," American Economic Review LXV(2):415-427, May 1975.

38J. D. Shaffer and A. A. Schmid, "Community Economics: A Framework for Analysis of Community Economic Problems," 5th ed., materials for PAM 201, Department of Agricultural Economics, Michigan State University, p. 8 (mimeographed).

39K. J. Arrow, The Limits of Organization (New York: W. W. Norton, 1974), pp. 56-57.

40Wallich compares the communication channels of two similar yet different institutions in "The Council of Economic Advisers and the German Sachverstaendigenrat, A Study in the Economics of Advice," The Quarterly Journal of Economics LXXXII(3):349-379, August 1968.

41Bonnen, et al., "Improving," pp. 204 and 207.

42Wallich, "The CEA," pp. 351 and 366.

43Transaction costs are the costs arising from human interactions, including contractual costs, information costs and policing costs. See Schmid, Power, Property and Public Choice, p. 88.

44Bauer, "Detection," pp. 11 and 62. Of course, selection always entails a danger of bias and must be carefully controlled to maintain objectivity.

45Bauer, "Detection," p. 59.

46Roberts, "An Evolutionary and Institutional View," p. 423.

47Boundary points as transition points between functions in an information system were previously mentioned above, on page 39. See also Bauer, "Detection," pp. 61-62.

48See Bonnen, et al., "Improving," p. 198; and Bauer, "Detection," p. 48.

49Bauer, "Detection," p. 12.

50Roberts, "An Evolutionary and Institutional View," pp. 420, 424 and 425.

51Arrow, The Limits, pp. 63 and 75.

52Ibid., p. 57.

53Wallich, "The CEA," p. 356.

54Bonnen, "Assessment," pp. 387-395.

55This decision making within EIA is not to be confused with the information system function called decision making (and action taking) above.
It is necessary to distinguish between the decision to produce information and the decisions for which that information becomes an input.

56Such an evaluation of TNEMP was indeed included in Johnson and Brown, An Evaluation of the Normative and Prescriptive.

57Bauer, "Detection," p. 2.

CHAPTER III

THE NORMATIVE AND PRESCRIPTIVE CONTENT OF DOE'S EIA AND MEFS

The main interest in this study of federal energy information and modeling systems is in how normative knowledge and prescriptive choices are produced. Normative research has to do with values, with good and bad. Prescriptive research has to do with goals or objectives and actions for attaining them, with right and wrong. There is a link between values, which are normative, and goals, which are prescriptive and attainable; values are weighed in decision making and pursued through goal-oriented actions. In this chapter the values and attendant goals of the various federal energy information agencies are identified by examining their structures, including their legislative mandates, and their operations. The aim is to evaluate the normative and prescriptive content of the present-day Energy Information Administration (EIA) and its Mid-term Energy Forecasting System (MEFS).

For a statutory agency such as EIA, the values and broadly outlined goals are specified in the enabling legislation. In searching for EIA's legislative mandate, two considerations must be kept in mind. First, since EIA is an agency nested within another agency (the Department of Energy, DOE), the values and goals of the larger agency must also be considered. Second, executive agencies focused on energy have evolved directly and over a fairly short period of time. The historical and institutional contexts of EIA and changes in them over EIA's short life span -- and hence in its matrix of relevant values and goals -- are important for understanding EIA.
This chapter then begins with the beginning of formal modeling by federal executive agencies for energy policy purposes, and traces the historical and institutional evolution of the present-day EIA and its MEFS. Only a simple identification in chronological order of the (normative) values and (prescriptive) goals important for EIA and DOE and their most important antecedents will be attempted here. The implications are attended to in Chapter IV.

The approach adopted toward studying an institution such as EIA is that its legislative mandate establishes parameters for the ideal performance of the institution, whereas the extent to which the ideal is attained is determined by the actual operations of individual actors within the institution.1 Our method, therefore, has been to accept as parameters the legislatively specified structure, values, and general goals and to view as control variables those variables used in operationalizing and adding detail to the general goals. The parameters have been taken directly from the pertinent legislation. The control variables have been identified through studying EIA's operating procedures.

A broad outline of what follows is, first, a description of the institutional and historical evolution of EIA and, second, a description of the operations of energy information agencies as they have pertained to energy modeling. It will be assumed throughout this chapter that the reader is familiar with the definitions and points of view presented in Chapter II.

Structural Impacts of Normative and Prescriptive Considerations on EIA, MEFS and Their Antecedents

Normative and prescriptive considerations have an impact on the structure of an organization. This is because the organization must be structured so as to make its goals attainable, and goals are in turn influenced by both values (or normative knowledge) and positive knowledge.
As already mentioned, values, some general goals and broad organizational structure are specified by the enabling legislation. An additional aspect of an organization's structure is the people who make it up. Normative and prescriptive considerations have an impact here too; one of the criteria by which members are selected (or select themselves) is the values and goals they hold. A third area where structural impacts of normative and prescriptive considerations are important is that of decision making by the organization within its legislative mandate. The decision-making process will influence whose values and which goals are emphasized. As normative knowledge is acquired in the prescriptive process of solving problems, both structure and decisions are influenced.

This section accordingly has three subsections. The first is an historical inventory of the institutions, values, and goals that have been and are important influences on the structure of EIA. The second is a general survey of the associated personnel and their values. The third and final subsection describes the normative and prescriptive content of decision making (mainly by EIA's Office of Applied Analysis) about the development of specific goals and of MEFS.

The Legislative History -- Prescriptions and Implied Values

Identifying the beginning of concern in the federal government with energy policy and energy information is easily as difficult -- and would probably take as long -- as identifying the source of the Nile. The concern in this evaluation, however, is with the normative and prescriptive content of an integrated system of computerized energy models now used as a source of information for policy making. This activity has a more definite starting point. The "energy crisis" of 1973-74 posed a series of serious practical problems for the United States.
It created huge trade deficits and brought home the realization that petroleum, basic to our economy, is a non-renewable resource for which no low cost alternative is available. These were problems that required immediate responses. President Nixon's initial (pre-legislative) action was to establish in the Executive Office a Federal Energy Office (FEO) to give the President policy advice on all matters pertaining to energy. FEO also had some data collection and analysis duties and some authority over energy conservation and emergency fuel allocation and pricing.2 It was within FEO that a system of energy models for energy policy purposes began to be assembled. It is from the establishment of FEO in December 1973 that the history of the present-day EIA and its MEFS is traced.

FEO was set up under Presidential authority granted by the Economic Stabilization Act of 1970, the Emergency Petroleum Allocation Act of 1973, and the Defense Production Act of 1950.3 As such, the national values it was intended to protect included national security, common defense and the general welfare as these were perceived by the Administration. FEO was to safeguard these values particularly with respect to emergency preparedness and energy shortages. Goals were prescribed for FEO in many functional areas: the production, conservation, use, control, distribution, and allocation of energy, including the impact of these activities on stabilizing the economy, reducing inflation, minimizing unemployment, improving the competitive position of the U.S. in the world and protecting the purchasing power of the dollar. These values and goals helped shape Project Independence, the federal energy project initiated in March 1974 to explore policy alternatives, a project in which FEO was heavily involved.

FEO was from its conception a temporary, problem-solving office. It concentrated on diagnosing the problems posed by the "energy crisis" and prescribing some immediate solutions.
It was the Administration's initial reaction to the energy crisis. The day after President Nixon signed Executive Order 11748 creating FEO, his proposal to replace it with a statutory agency was introduced in Congress. The proposed replacement for FEO was, of course, the Federal Energy Administration (FEA). The FEA Act of 1974 was signed into law in May and FEA replaced FEO in June.

FEA bore many similarities to FEO but also differed in some respects. Again it was a temporary problem-solving agency, due to terminate in July 1976. Its emphasis was less on diagnosis and more on prescribing responses to the OPEC oil embargo and tactics to ensure that such an energy crisis not recur. The enabling legislation identified the same societal values (the general welfare, common defense and national security with respect to emergency preparedness and energy shortages) as had been specified for FEO.4 FEA however consolidated into a single agency some important federal energy policy functions, and its policy goals were somewhat narrower and more specific than those of FEO.5 Goals were laid out for FEA in many functional areas, but four were deemed especially important: allocation, pricing, federal-state coordination and rationing.6 In the area of allocation, there were instructions to manage the fair and efficient distribution of energy supplies and to take action to conserve scarce supplies and expand sources. With respect to pricing, FEA was to promote fair, reasonable and stable consumer prices. The coordination of federal and state energy policies and planning was to be overseen by FEA so as to ensure the participation of the states in federal policy making.
And finally, in the event of energy shortages, FEA was to provide coordinated and effective rationing, ensuring equitable sharing of burdens while meeting all priority needs.7 A primary concern of FEA then was dealing with energy shortages in an equitable fashion.8 Such stress on fairness, reasonableness, participation, coordination, etc., was all in recognition of differences in the normative foundations and prescriptive preferences among individuals and categories of individuals, and among states and regions of the country.

In contrast to FEO, which was located within the Executive Office, FEA was an independent agency. Its independence had several implications for its role in the policy-making process. In contrast to FEO, whose policy-making responsibilities were entirely to the Administration, FEA's legislative mandate included advising both the President and Congress, keeping the public informed, arranging for the participation of state and local governments in energy policy formation, and making legislative initiatives.9 Thus its perceptions of national values and goals could differ more from the Administration's than could the FEO's. Attempting simultaneously to serve diverse interest groups, with such divergent values and goals, started FEA on a path toward specializing on positive and conditionally normative methodologies.

In addition to policy-making responsibilities, FEA also had administrative, programmatic, regulatory and information collection and analysis responsibilities.10 The inherent conflicts among the normative foundations and prescriptive implications of these various roles led to a number of changes in FEA's organization over its lifetime, particularly in relation to its policy-making and information-gathering activities. Under FEA's original structure, both of these activities were performed by the Office of Policy and Analysis. This was an integral part of FEA, located at the assistant administrator level.
It was responsible for collecting and disseminating data, forecasting and analyzing the impacts of energy shortages, developing policy alternatives, and integrating (or synthesizing) FEA program, policy and legislative initiatives.11 Simultaneous data collection and policy formulation led to claims that FEA was unobjective and that its data were gathered ex post to justify what was regarded as an out-dated bias toward energy self-sufficiency, the original goal of Project Independence.

Subsequent reorganizations of energy information agencies have focused on separating data collection and analysis on the one hand from the making of policy recommendations on the other, with the intent of separating "objective" positivistic activities from "unobjective" normative and prescriptive activities. In August 1976, as provided for by the Energy Conservation and Production Act, the Office of Policy and Analysis was split into an Office of Policy and Program Evaluation (OPPE) and an Office of Energy Information and Analysis (OEIA).12 The former was responsible for developing, analyzing, and evaluating energy plans and policies and for integrating FEA program, policy and legislative initiatives.13 Its value and goal orientation was expected to be that of the Administration. OEIA was responsible for collecting, publishing and disseminating information and for providing technical support for national energy programs and plans, including short- and long-term forecasts and analyses.14 It did not make policy recommendations and generally maintained a conditionally normative posture.15 Thus policy making and information provision were separated. Some overlap remained in policy analysis, with the difference being that OPPE was required to assess or evaluate information while OEIA was required to produce but not evaluate information.
OPPE continued FEA's problem-solving focus on energy shortages whereas OEIA's activities were a shift away from a problem-solving toward a subject-matter orientation, that of providing information relevant to energy problems in general.

The values and goals of OPPE remained those mandated in the FEA Act of 1974. Some additional values and goals, however, were separately specified for OEIA. It was to be objective and professional, not political. It was to provide information to meet the needs not only of FEA and the Administration, but also of Congress, the states and the general public. The discussion in the legislation and in the legislative history boils down to the specification of three principal values for OEIA: objectivity, professionalism and relevance. Objectivity and relevance were to be guaranteed by the legislatively mandated responsibility to a broad spectrum of interest groups, by considerable authority to gather its own data directly, including the power of subpoena, by the provision for an annual evaluation by the Professional Audit Review Team (PART), and by provisions for public access to information and to the Project Independence Evaluation System (PIES).16 Professionalism was to be guaranteed by the stipulations that the director of OEIA have a professional background and experience in managing information systems and that he establish and maintain a professional capability in his organization.17

Whether or not the professional/political distinction mentioned above is a meaningful one depends on the attitude adopted toward the possibility of objective normative research. The implication in OEIA's value mandate is that professionals are disinterested analysts who give impartial analyses by making their client's value judgments explicit and/or by looking at a situation from a value-free conditionally normative or positive perspective.
Politicians, on the other hand, are implied to be interested parties who view a situation from a much less "objective" perspective. Because they are normativists or pragmatists, they are not regarded as "objective." If, however, one believes that the normative can be studied objectively, then (1) politicians are capable of being objective without being value-free, and (2) professionals do not have to be positive or conditionally normative in order to be objective. This implies that the professional/political distinction cannot be made in terms of objectivity or the lack thereof. The emphasis in OEIA's value mandate on professionalism in the conditionally normative sense was one structural indication that it would adopt conditional normativism, which it did.

As time passed and energy problems stayed with us, the need for a permanent federal subject-matter focus on energy was recognized.18 This ultimately resulted in the establishment of the Department of Energy (DOE) by the Department of Energy Organization Act of 1977.19 As might be expected given the changed interpretation of energy problems from temporary to enduring, DOE's mandate regarding its values and goals is much broader than was FEA's. DOE is instructed to promote the public interest and the general welfare in a manner consistent with overall national economic, social, and environmental goals (however arbitrarily the differences between economic, social, and environmental values are defined).20 DOE further centralizes energy policy functions, making the locus of accountability and responsibility more identifiable. The ultimate goal is the provision of a stable energy policy framework. Goals are specified in a long list of functional areas, including data collection, conservation, transportation, environmental protection, pricing, research and development, regulation and the maintenance of competition.21 DOE's mandate encompasses that of FEA and then some.
Whereas FEA's prime responsibility was short-term crisis management, DOE is intended to be an institution for comprehensive and balanced national energy planning.22 It has responsibilities for both short-term and long-term integrated policy making. In further contrast to FEA's short-term emphasis on reducing national vulnerability through emergency preparedness, DOE's short-term emphasis is on conservation.23 Whereas FEA was an independent agency, DOE is a cabinet agency and is designed to serve the President. Its objectives encompass both problem-solving and subject-matter activities; it is to make policy proposals as well as to identify policy problem areas for the President.

In DOE, the separation of policy making and information analysis is even more complete than it was in FEA. The Office of the Assistant Secretary for Policy and Evaluation (PE) develops, recommends, and integrates national energy policy.24 PE maintains a problem-solving (and hence prescriptive) focus. Information on the other hand is provided by an agency with a subject-matter focus, EIA. The EIA is an independent agency and does not make policy recommendations or prescriptions.25 Like OEIA, which was its direct antecedent, it has an explicit conditionally normative philosophic orientation, with similar implications for its interpretation of objectivity.26 EIA, like OEIA, collects and analyzes data. The analyses include both statistical tracking of energy trends and special purpose analyses using energy modeling systems. Again like OEIA, EIA is instructed to meet the needs of DOE, the Congress, the states, and the public. There is no change in the legislatively mandated values of objectivity, professionalism, and relevance, although discussion over the legislation also underscored a need for credibility.27

FEA was an independent agency and OEIA was a component of it. Thus, the values identified specifically for OEIA were in addition to the values specified for FEA.
DOE, however, is not an independent agency, whereas EIA is. There seems to be some question as to whether the specific values and purposes mandated for EIA are in addition to or in place of those specified for DOE.28 For example, DOE has a federal-state coordination role similar to FEA's: "... to provide for the cooperation of federal, state and local governments in the development and implementation of national energy policies and programs ...."29 EIA has not actively sought such cooperation and it is not clear it is required to.30 In particular, this has implications for EIA's stance of conditional normativism and for how it interacts with its clients.31 Aside from this subtlety, objectivity is again provided for by independence, by broad data-gathering abilities, by an annual PART evaluation, by provisions for public access, and by the specific preclusion of policy recommendations. Appropriate professional backgrounds are again required for the EIA director and staff. The means by which credibility and relevance are to be guaranteed are unspecified and have posed serious problems for EIA. Questions of credibility stimulated this study and the other TNEMP studies.

There are several trends running through the evolution of energy information agencies. Perhaps the most important is that the nature of perceived energy problems has changed over time. FEO was a crisis response and tried to diagnose and prescribe some answers to the problems posed by the OPEC embargo. FEA's task was to prevent such a crisis from happening again and, in the event that it should happen, to be prepared to deal with it equitably. DOE's task is to develop a stable energy policy framework for addressing enduring energy problems. Accordingly the focus of federal energy agencies has shifted from problem solving (diagnosing and prescribing solutions to specific problems or crises) to subject matter (identifying, defining and clarifying a less specific problem area).
Another trend has been the increasing separation of policy making and information provision. Simultaneously there has been an increasing separation of job positions deemed political and those deemed professional. These have been responses to guaranteeing objectivity (always in the positive or conditionally normative sense). FEO was an Administration agency and as such was not required to be overly concerned about research objectivity. FEA was an independent agency. Its diverse functions and its responsibilities to various interest groups led FEA down a path toward conditional normativism. Originally, the Office of Policy and Analysis combined policy making and information provision. Later it was split into OPPE, with problem-solving (policy-making) responsibilities, and OEIA, with subject-matter responsibilities. OEIA in particular tried to be objective and professional. In DOE, PE has assumed OPPE's problem-solving focus. EIA has assumed OEIA's data collection and analysis functions and OEIA's concern with objectivity and professionalism. Throughout this evolution, there have been continuing conflicts among the demands for credibility, relevance and conditionally normative objectivity. This has resulted, for example, in complaints about unobjective data collection by FEA,32 and in suspicions about the cancellation of the 1977 NEO by FEA and about lack of access to information and to models in EIA.33 Attempts have been made to provide simultaneously for credibility, objectivity and relevance by experimenting with various sorts of independent agencies and, within agencies, by separating controversial activities. In FEO, which was not independent, relevance was perhaps maintained at the expense of objectivity and credibility. Relevance at the expense of objectivity and credibility continued in the early FEA.
After the separation of OPPE and OEIA and OEIA's adoption of conditional normativism, a particular kind of objectivity gained in emphasis, while the separation of policy making and data analysis introduced at least the potential for loss of relevance and, hence, credibility. This trend of separating and isolating controversial activities in trying to preserve objectivity in less controversial activities continued in EIA. As conflicts between (1) data collection for use in regulations, litigation, and prosecution and (2) data analysis to answer broader economic questions have been detected, there has been a separation of data collection, including the statistical analysis of data series, from data analysis, including modeling and forecasting.34 Credibility, particularly in the use of large, complex models, has been and remains a problem. Other differences between the various institutions for energy information result from the particular value constellations of their members, and from the operating procedures of the institutions.

Personnel and Values

Each individual or organization has constellations of national, institutional, professional, political and personal philosophic values. The match or mis-match between the values of the individual members and the value mandate of an institution influences the performance of the institution. In this section the value constellations emphasized during EIA's evolution are surveyed. FEO's structure was less explicit than its successors' structures. Those members closest to its director, William Simon, emphasized national and political values in proposing solutions to the energy crisis.35 The office, however, also employed a number of statisticians and modelers who emphasized the values of academicians and researchers and who were interested in making a permanent place for energy models in the policy process rather than in solving "temporary" energy problems.36 Interaction between policy makers, inside as well as outside FEO, and modelers influenced both the design of legislation for successive energy agencies and the structuring of model systems. Modelers, who tend to be trained as positivists and conditional normativists, most easily fit into an institution with a conditionally normative philosophy. Similarly, the type of models developed is influenced by the policy issues they will be used to study. The politics of the interactions between policy makers and modelers will not be discussed here; suffice it to say that interaction between the structuring of institutions and the structuring of models has continued over the years. In the Office of Policy and Analysis in the early FEA, members were selected on the basis of, first, sensitivity to policy issues and, second, professional training.37 That is, the "characteristic employee" was a policy analyst (who must deal with normative and prescriptive issues) first and a modeler (who tends to be trained as a positivist or conditional normativist) second. This relative emphasis on the importance of political issues over the "professional" values of conditional normativists began to reverse with the splitting of the Office of Policy and Analysis into the Office of Policy and Program Evaluation (OPPE) and OEIA. With the stipulation of professional qualifications for the director and staff of OEIA, as well as with the avoidance of policy recommendations, the characteristic member of OEIA became a positive or conditionally normative modeler first and a policy analyst second. Conditional normativism became the dominant philosophy. This is true of EIA, too.
EIA Administrator Lincoln Moses, who has an academic background in statistics, reputedly has favored making a clear distinction between political work and professional work.38 EIA staff generally have professional degrees in statistics, operations research, economics or systems science; the necessary training in policy analysis can be obtained on the job.39 The greater relative importance of professional than political values is further illustrated by the tendency for EIA employees to go on to subsequent jobs in modeling rather than in policy analysis.40 The reverse is true of staff members in PE, where political values continue to be emphasized over professional.41 EIA is the longest-lived of the energy information agencies. A new value seems to be increasingly common among its members: that of valuing EIA as an institution, as a way of doing things.42 The development of this value seems to have been heavily influenced by enhanced opportunities for the staff to do professional (in the conditionally normative sense) work. Partly as a result of the gradual shift from a problem-solving to a subject-matter focus, time constraints on staff work are now less pressing, staff turnover has slowed somewhat, documentation standards are more exacting, and there is much less of the last-minute rush atmosphere that predominated when assignments had to be finished "by the break o' day." This value is reinforced by and reinforces the EIA commitment to doing "good" -- i.e., objective and professional -- work. Its impact on relevance is less clear because of the conflict noted above between relevance and objectivity as defined by conditional normativists. Overall, EIA seems to have established a reputation for conditionally normative objectivity and honesty at the cost of relevance and credibility with decision makers.
The Structure of Decision Making vis-a-vis the Evolution of MEFS

At the time of the oil embargo in 1973, the federal government needed estimates of the shortages and predicted consequences of policy prescriptions to lessen them.43 The only tool the government had readily available for making such estimates was judgmental forecasting.44 The resulting estimates of the petroleum shortage were large enough to stimulate the development of institutions and tools to manage the "energy crisis." FEO was one such institution. However, analyses produced by judgmental forecasting were not sufficiently detailed and documented for confident policy selection and implementation. Policy makers needed a finer level of detail with respect to both the positive and normative components of the energy crisis and with respect to the prescriptive implications of policy proposals. The first step taken by the federal government, and particularly by the Department of Interior, to make such detail available quickly was to use an accounting structure. In that crisis environment, the choice of a policy-making tool was contingent upon its being readily available; an accounting framework was adopted with only slight changes from common industry procedures. This framework enabled description and classification of sources of supply and of end uses. It was an efficient way of summarizing historical data and it made possible a finer level of product detail. Its assumptions were more explicit than were those of judgmental forecasting and it permitted limited sensitivity analysis. However, the accounting scheme still had serious limitations on its usefulness in policy making. Component supplies and demands -- details about the accounting scheme's categories -- were still estimated with experts' judgmental forecasts. Also, the historical data were not analyzed explicitly as a basis for predictions and sensitivity tests were clumsy.
This crude "model," as developed by the Department of Interior, nonetheless became the basis for the first formal computerized models at FEO, where the locus of short-run federal energy modeling shifted in December 1973. FEO evaluation of the estimates produced with the accounting system indicated that demand estimates were the most sensitive components, and hence a good place to begin making improvements. After considerable discussion about whether it would be most productive to build a formal model to improve these estimates or to continue to rely on existing reports, the decision was made to build an econometric model of the demands for individual petroleum products. Apparently this decision was influenced to a large extent by several FEO staff members who had professional interests in modeling.45 The econometric demand model was developed by FEO staff in cooperation with fuel industry experts and was then used to make monthly projections for 1974. A parallel effort on the supply side built a simple simulation model of fuel inventory adjustments and refinery output changes. FEO combined these two models, producing new estimates of aggregate shortfalls and much more extensive sensitivity analyses. The combined models were used during the first quarter of 1974 to make predictions about the effects of the OPEC oil embargo. FEO's first rulings on fuel allocations and conservation policies were backed up by these models. In March 1974 the intensive effort known as Project Independence was initiated to develop a framework for national energy policy. The policy goal for Project Independence was specified by President Nixon: energy independence by 1980.46 Project Independence was an interagency effort, ultimately involving more than 500 professionals. The central responsibility for Project Independence passed from FEO to FEA when the latter replaced the former in June 1974.
Within FEA the models continued to be developed and extended to meet the needs of Project Independence. As described above, FEA was an independent agency responsible for serving the needs of the executive and legislative branches and the general public. Responsibility to such diverse interests was handled with a conditionally normative stance. The explicit policy of Project Independence was to recommend no specific policy actions but to concentrate on developing a data base, policy analysis tools and a conceptual framework within which individual issues could be studied.47 The goal was to put policy alternatives and their impacts in perspective and to focus public debate on the issues and choices. Conditional normativism implies one of two courses of action for an information agency with such a goal. One option is to require clients to clearly, explicitly and completely specify their values and goals before any analysis takes place so that the conditionally normative analyst need only supplement the normative assumptions and constraints with positive information in order to produce prescriptions. Otherwise, a second option is for the analyst to specify a broad array of possible values and goals, use them to produce an array of possible prescriptions or policies, and then let the client choose from among the array after all analysis has been done. Under the first option, the client says "My goal is ____" and the analyst suggests alternative means of attaining it. Under the second option, the analyst says "Potential goals are ____ and alternative means of attaining each goal are ____" and the client chooses an appropriate goal-means combination. Clearly the first option is not appropriate in situations where clients are not clearly identified, where goals are not well articulated, or where there is conflict among interdependent clients and/or goals.
As energy policy was an area of confusion and ignorance, not of consensus and clarity, FEA and Project Independence chose the second option. "Energy independence" was a prescription that went through several definitions.48 It was originally interpreted to mean self-sufficiency. Analyses with this goal in mind indicated that self-sufficiency would result in domestic energy prices even higher than prices during the embargo. This was unacceptable to the Administration. Energy independence was finally taken to mean a reduction in vulnerability to import disruptions rather than the elimination of imports per se. With this goal in mind, and consistent with the second approach of the conditional normativist, four broad strategic options, designed to overlap as little as possible, were compared and contrasted to illustrate the very different energy futures available. The four strategies chosen were (1) to increase domestic supply, (2) to conserve and manage energy demand, (3) to establish standby emergency programs, and (4) to do nothing.49 Each strategy was evaluated at three- to five-year intervals in terms of its impact on the development of alternative energy sources, on the nation's vulnerability to import disruptions, on the domestic economy (GNP, inflation, unemployment, balance of payments, and distributive effects by region, sector and income), on the environment, on the world price of oil, and on the degree of federal intervention required for implementation.50 Each of these performance criteria differs in normative and prescriptive content. The computer models were considerably altered and extended to provide support for performing these detailed analyses. The structure of the Project Independence Evaluation System (PIES) evolved in response to the development of Project Independence and the specification of the policy options to be considered.
A number of goals were kept in mind while assembling PIES so that the resulting framework would be capable of performing analyses of alternative strategies in terms of the impacts described above.51 The goals in assembling PIES included developing capabilities to analyze (1) price sensitivity, (2) substitution between fuels, (3) impacts of technological change, (4) resource limitations, (5) externalities, (6) economic impacts, (7) regional variations, and (8) lead times and other time-dependent conditions. Additional specifications for PIES were that it be modular and that it be capable of incorporating expert judgments. The requirement for modularity expanded the breadth of application of the models since not all problems require the use of the complete system of models. Capacity to use approximations and informed estimates reduced the severity of the problems posed by lack of data or extreme complexity, also expanding PIES's usefulness. The resulting system of models was a flexible and somewhat eclectic system. Its components included econometric, linear programming, simulation, accounting and optimization models. Since one of the key applications of PIES was predicting supply, demand and equilibrium prices at different future dates, the integrating model was a particularly crucial component of the system. In this respect, PIES was -- and MEFS still is -- a relatively primitive modeling system; the integrating model is a linear program, making it expensive and clumsy to run analyses through time. The last Project Independence report was published in early 1975. PIES, which was originally developed to evaluate the possibility of energy independence by forecasting prices and imports under alternative conditions, was sufficiently flexible and broad enough in application to continue to be used to evaluate energy policy proposals. The separate components were refined and extended individually in response to analysis needs.
The PIES which evolved was capable of dealing with such issues as "... the impacts of price regulations, electric power generation mixes, utilization of Alaskan oil and the measurement of energy capital requirements."52 It was less capable of dealing with present versus future values of fossil fuels; trade-offs between environmental quality and fossil fuel production; the relationships among balance of payments, fuel price controls, deficit financing, monetary practices and inflation; the costs and benefits associated with rationing; and similar considerations.53 The modeling framework provided a mechanism for accumulating information about the energy system. PIES became a key resource in the production of the National Energy Outlook (NEO), a publication required of FEA by its legislative mandate, and subsequently in the production of The Annual Report, an annual publication required of EIA by its legislative mandate. The post-Project Independence FEA and EIA continued the tradition of conditional normativism. The interpretation of conditional normativism, however, changed over the years. FEA began by contrasting four different policy strategies (emergency, supply, demand, and do nothing). In preparing the NEO, FEA continued to contrast broad policy directions, although the number was increased to six.
The six scenarios were: "(1) Reference, with gradual oil and gas price decontrol and minimal governmental intervention in energy supply and demand markets; (2) Conservation, with policies to reduce demand; (3) Acceleration, with conservation combined with aggressive measures to increase domestic supply; (4) Regulation, with oil and gas producers' prices controlled and consumers paying a weighted average of domestic and import prices; (5) Electrification, with increased conversion to electricity on the demand side and increased use of coal for generation; and (6) Regional limitation, with environmental restrictions on the production and use of energy goals."54 The emergency focus shifted to a focus on regulation and conservation, and increased emphasis was put on environmental impacts and coal conversion.55 This was one step in the transition from the perception of energy problems as temporary to enduring. While OEIA had continued to interpret conditional normativism according to the second option described above (producing an array of goals and means), as had the early FEA, EIA opted instead for the first option (clients specify goals before analysis).56 All EIA analyses are performed by clearly specifying the assumptions and then following their implications through to prescriptions.57 In the case of analyses performed for The Annual Report and the general public, analyses are based on the assumption that current policies will continue into the future. This shift from one conditionally normative option to the other may sound like a major procedural change. Actually, it was part of a gradual, pragmatic learning process that permitted continuity in the use of PIES/MEFS. The early FEA was a problem-solving institution. The second option for conditional normativism is easiest to pursue in a problem-solving context since the problem itself provides guidelines for proposing a broad range of policies. EIA, however, has a subject-matter focus. In this context,
the second option for conditional normativism easily becomes confusing: in the absence of a specific problem, selecting an array of goals to analyze is itself a formidable and time-consuming problem. It is simpler to specify some assumptions and follow them through to their conclusions -- i.e., to adopt option one. Furthermore, when FEA was established there were very few explicit energy policies to provide a basis for the assumptions needed for option one conditional normativism. The opposite is the case for EIA; there is a growing body of energy legislation on the books. Lastly, the energy legislation used by EIA as a source of assumptions was in part developed on the basis of policy analyses that involved PIES or MEFS. Hence the generic types of legislation now on the books could be expected to afford opportunities for the type of analyses MEFS is capable of providing. Overall, the evolution of the models and of the institutions they are embedded in has followed similar trends. The models reflect the commitments of the institutions and the individuals involved. Federal energy modeling began as a problem-solving response to the oil embargo but has evolved, in response to new normative, positive and prescriptive information, into subject-matter research. The resulting system of models is a flexible, somewhat eclectic mix, with component models using methodologies deemed most appropriate for each particular component of the energy system. Although MEFS may be eclectic, EIA is not; EIA adheres to conditional normativism, a variant of positivism. In conjunction with the shift from problem-solving to subject-matter research, federal energy modeling has also evolved from low priority model applications to high priority applications in rough conformance with Hogan's priorities for modeling.58 FEO's initial models were descriptive, used to report and develop an energy data base.
As the models and data base improved, the models were put to predictive uses, first answering ad hoc questions such as determining the effect of fuel allocation proposals. With the use of the models to make monthly supply, demand and price predictions and to analyze the Project Independence scenarios, use of the models advanced to answering recurring questions. OEIA and particularly EIA have also used the models extensively in preparing periodic reports, the NEOs and the Annual Reports. In EIA, the use of models for prediction rather than description is made more likely by the separation of data collection from data analysis.59 The Office of Energy Data, the subdivision within EIA which collects data, is staffed primarily by statisticians rather than modelers. All of these model applications have emphasized analysis over synthesis. As Hogan observes, the models are most useful for analyzing problems within the energy sector and of limited use in making trade-offs among sectors.60 A problem with the integrating model has been and continues to be that its predictions for one time period may not be consistent with predictions for another time period, making it difficult to compare policy proposals through time. Perhaps one reason for the emphasis on analysis is that the function of integration or synthesis has been delegated to the policy-making agency -- OPPE in the case of FEA and PE in the case of DOE -- by legislative mandate. In any case, energy policy making requires that impacts on different sectors and across time be compared, with or without the use of models. Three shifts have been noted in the structure of decision making vis-a-vis the evolution of MEFS: a shift from one type of conditional normativism to another, a shift from problem-solving to subject-matter work, and a shift from descriptive to predictive modeling.
There is a potential conflict between the shift from the second option to the first option for conditional normativism and the shift from problem-solving to subject-matter work, and this conflict is a source of irrelevance. The shift from one type of conditional normativism to another implies that energy information needs have become better defined. In contrast, the shift from focusing on a specific set of problems to focusing on a more general subject-matter area implies that new and probably poorly defined information needs have been added to EIA's domain. Both shifts together imply either that (1) EIA is concentrating on improving its ability to address some information needs while ignoring others or that (2) an additional unit, perhaps outside EIA, with the capability to articulate new energy problems and information needs, has developed. Identifying how EIA has dealt with this tension requires looking more closely at its operating procedures. This concludes the description of the structural impacts of normative and prescriptive considerations on EIA, MEFS and their antecedents. The description has shown that the "official" EIA philosophical position is conditional normativism. Normative and prescriptive considerations also have important impacts on operating procedures. Are EIA's operating procedures those of the conditional normativist too?

Normative and Prescriptive Influences on EIA as an Operating Organization and on MEFS as Operating Models

Output is a function of more than structure. Within any given structure, there is leeway for a variety of behavior patterns and hence outcomes. This is true of both institutions and models. The structure of EIA discussed above had to be interpreted and used by the members of EIA before it became operational. Similarly, a system of models such as MEFS can be used to help analyze any number of problems; each mix of problems selected will lead to a different performance record.
In this section normative influences on the operating procedures of EIA with respect to MEFS are examined. Operating procedures and their consequences are influenced both by explicit agency rule making, which establishes ideal operating procedures, and by the actual decisions and activities of the staff members.

The Selection of Topics to be Investigated With MEFS

EIA has the authority to accept or reject requests. The Office of Applied Analysis, the division within EIA responsible for managing the analytical programs, has a well-defined set of priorities for performing analyses. The clarity and explicitness of the priorities for accepting requests is intended to permit much of the interaction between client and EIA to occur at a fairly low staff level while maintaining consistency in overall request acceptance policy.61 In descending order of importance, the explicit or ideal priorities are (1) analyses required for EIA legislatively mandated publications, such as The Annual Report, (2) requests from the Secretary of Energy, (3) requests from Congress, (4) requests from other parts of DOE, and (5) requests from other sources and analyses EIA undertakes on its own initiative.62 Consideration is also given to the time and resource commitments required and to the "general interest" of the analysis. A final important consideration, especially for non-EIA requests, is the appropriateness of the request for MEFS.63 All non-EIA clients are officially required to submit a formal, written request for analytical services, which is accepted or rejected according to the above criteria. Note that, in principle, the President's most direct access to EIA is through the Secretary of Energy, making communication links between the President and the Secretary and between the Secretary and EIA of vital importance. In practice, of course, these priorities are quite flexible.
Judgments concerning the general interest of a request and estimates of the ability to provide the necessary time and resources add leeway. Communication between the Executive Office and EIA can be much more direct than indicated, provided what EIA has to offer is of interest. EIA's knowledge of who controls its funding must inevitably influence its priorities too. Although, in principle, EIA accepts requests from the general public, in practice lack of time and resources seldom permits it to do so.64 However, EIA seldom outright rejects requests;65 in part this is because conditional normativists, with their arbitrary attitude toward values, rarely feel justified in rejecting a request. The explicitness of the priorities for requests diminishes the need for rejections since inappropriate requests tend never to be submitted. Also, the nature of agency-client interactions is such that inappropriate or undesired requests are usually withdrawn by the client rather than rejected by EIA. There is, nonetheless, a demand for analyses of topics that are not considered appropriate for MEFS. Many of these involve normative and prescriptive questions MEFS is not designed to handle. The parties requesting such analyses are forced to turn elsewhere. The responsibility for managing a system of models and providing data analyses was delegated to EIA in the Department of Energy Organization Act on a non-exclusive basis.66 The demand for analyses that do not fall within the guidelines established by EIA for MEFS has been sufficient to stimulate the development of many additional analytic units. The Office of the Assistant Secretary for Policy and Evaluation (PE), with its responsibility for developing new energy policies, has had the greatest need for analyses not supplied by the Office of Applied Analysis.67 It is within PE that governmental analyses supplemental to EIA analyses are being performed.
The development of additional analytic units, and hence the destruction of incentives for EIA to shift its focus as energy problems and information needs change, may lead to the irrelevance of EIA services. Decisions to accept or reject requests are primarily day-to-day decisions. In the longer run, decisions must also be governed by modifications and extensions or contractions of the models.

Decisions About the Development of the Models

In the longer run the models themselves can be altered either to improve their accuracy and adequacy within present guidelines or to broaden the range of requests EIA is capable of accepting. In keeping with EIA's policy of conditional normativism, changes in MEFS to broaden its range of applicability are made primarily in response to new energy decisions, including both legislative and regulatory initiatives.68 Such initiatives call for changes in the structure of EIA's starting assumptions and could call for new and different modeling capabilities. In practice, these decisions are also influenced by budget considerations and by interest generated by staff members in adding new or modifying old models. In addition to deciding when models should be modified or added, it is also necessary to decide by whom the work should be done. Minor model alterations are done in-house by the staff as time permits. In deciding who should make major alterations and model replacements or additions, EIA has primarily been concerned about the implications for its independence and objectivity.69 There has been a continuing trend toward more in-house control of models.70 In the past, greater use was made of proprietary models. This is a particularly attractive option for problem-solving agencies operating under severe time constraints. However, the use of proprietary models hindered EIA's ability to adapt the models quickly to the requirements of individual analyses and to understand and explain model outputs.
Less than full in-house control was believed to compromise the objectivity, credibility and consistency of EIA outputs. As soon as time and other resources permitted, EIA began developing its own models. Initially much model development was done in-house. This is done less and less frequently now due to the amount of time it requires. Now most model development is done by contractors. This provides EIA with access to the specialized expertise needed for one-time model development, while careful specification of EIA requirements, close interaction between staff and contractor, and full contractor documentation provide EIA with full understanding of and control over the resulting model. A second important interaction in understanding the operations of EIA is that between EIA and its clients, those who use energy information in decision making.

Interactions Between EIA and Its Clients

EIA has three main types of clients: clients who request analytical services, clients who use EIA publications, and clients who request access to EIA's models. Each will be dealt with in turn. The formal (ideal) request procedure for non-EIA clients wanting analytical services was mentioned above. Placing a request includes specifying the assumptions to be used in the analysis. Specification of the assumptions usually involves some interaction between EIA staff and the client.71 The interactions between clients and staff tend to be positive and conditionally normative in the short run and pragmatic (in the sense of resulting in learning about interdependent positive and normative knowledge, rather than necessarily in the sense of being workable) in the long run. Performing an analysis for a client is conditionally normative in the sense that the normative assumptions are ultimately the client's own choice and are accepted whether EIA agrees with them or not.72 Also, EIA documentation of requests and services provided is written in a conditionally normative manner.
However, in many instances both the client and the EIA staff benefit from discussing the appropriateness of assumptions together.73 The result of such discussion is often a decision to alter the assumptions. This interaction between client and staff is thoroughly pragmatic, with both sides learning about the relevance of the assumptions and the capabilities of the models. Thus EIA/client interactions appear conditionally normative if one examines them at a point in time or if one reads EIA published accounts of the services they provide; however, if one takes a fuller view of the process of fulfilling a request, it is much more pragmatic. Once the assumptions are specified, interaction between client and staff halts unless or until the client chooses to re-specify some assumptions. On the basis of analysis results, the client may decide on, or the EIA staff may request, alterations in the assumptions. In this case there may be further client/staff interaction. However, once an analysis is completed, there is no follow-up by EIA to find out if the analysis proved useful to the client. EIA policy is that such follow-up would compromise its independence and objectivity.74 Lack of follow-up, however, makes it difficult to maintain relevance. For example, EIA input to NEP II has reputedly been very small, largely because PE (which has prime responsibility for preparing the national energy plans) does not consider EIA's scenarios to be relevant.75 Official EIA policy is that all analyses and forecasts are published. Although the first reason for publication is to give the appearance of objectivity,76 this service is also geared toward the second category of clients, those who use EIA publications. In performing unrequested analyses, such as those required for The Annual Report, assumptions must be specified by the staff.
The staff may also specify some assumptions in performing analyses requested by clients; EIA reserves the right to specify alternative additional sets of assumptions to better represent the range of likely outcomes.77 The behavior of EIA in preparing its publications goes one step further from conditional normativism. The publications are conditionally normative only in the sense that the assumptions are explicit. The selection of appropriate assumptions by EIA cannot be based on conditional normativism but is again pragmatic. Interaction between EIA and clients wanting access to the models is less well defined and is in part a stimulus for this evaluation. The Department of Energy Organization Act, and before it the Energy Conservation and Production Act, specified that the public have access to the models.78 The meaning of access has never been agreed upon. A narrow interpretation such as has been proposed by EIA could involve documenting models fully and making them available to the public through a library with minimal direct contact between the model borrower and EIA.79 The National Energy Software Center at the Argonne National Laboratory near Chicago has been suggested as a likely facility to use as a library. TNEMP and others are pushing for a broader interpretation of model access that would involve closer cooperation between the borrower and Office of Applied Analysis staff.80 EIA access policy has brought it considerable criticism.

Interactions Between EIA and Its Critics, Especially Evaluators

This evaluation has required becoming familiar with two EIA critics, PART and TNEMP. PART is a committee composed of individuals from the General Accounting Office, the Securities and Exchange Commission, the Bureau of Labor Statistics, the Federal Trade Commission, the Bureau of the Census and the Council of Economic Advisers.81 They evaluate EIA annually.
PART members do not have technical expertise as modelers but they do have experience in the use and evaluation of information for policy making. EIA's attitude toward the PART evaluators, for the most part, is that they are not competent for the task and hence that they are nuisances who must be tolerated but from whom EIA can expect to learn nothing.82 EIA's attitude toward TNEMP too has largely been that they must be tolerated and even cooperated with, since model transfer is in the interest of objectivity, but again that EIA can expect to learn nothing from them.83 A common characteristic, however, of EIA interactions with PART and TNEMP has been that it anticipates their findings and begins to take action on them before any criticism is formally released. This enhances EIA's reputation for objectivity, which requires willingness to improve. This is also intended to enhance EIA's independence, since EIA then claims it internally generated its own constructive criticism. EIA is enabled to anticipate criticisms by the pragmatic character of its interactions with critics. PART evaluators, at least during the 1977 evaluation, interacted freely with OEIA staff and participated informally in the energy information working group of the DOE Activation Task Force which contributed to the design of DOE. EIA has also participated in TNEMP's NAB meetings.84 One recommendation made by both PART and TNEMP, but presumably anticipated by EIA, was that EIA radically improve its documentation standards and develop an internal ability to evaluate its models.85 The Office of Oversight Analysis and Access, a subdivision within the Office of Applied Analysis, was established to design and then implement procedures for on-going EIA review of its modeling activities.
The procedures subsequently developed stressed documentation (clarity) and logical consistency (coherence) with minimal mention of correspondence.86 In keeping with other EIA practices, there is no intent to examine the functions of identification and selection, and follow-up with those who have used EIA services is regarded as downright harmful.87 The section above on structural impacts concluded by posing two questions. These questions were, (1) how has EIA dealt with the conflicting shifts from one type of conditional normativism to another and from problem-solving to subject-matter work? and (2) are actual EIA practices those of the conditional normativist, as would be consistent with EIA's self-declared goal? It is now time to answer both of these questions. There are indications that the conflicts between EIA's type of conditional normativism (and positivism) and subject-matter work have meant both that EIA ignores some important problems (as evidenced by complaints about lack of EIA input to NEP II) and that problem identification capabilities are developing elsewhere, especially in PE. If the problems and information needs identified by PE could be quickly passed to EIA for good technical analysis, EIA's inability to identify relevant problems and information needs might not be a serious shortcoming. However, there is a considerable time lag in transferring tasks between these units and PE's own time constraints are too binding for it to be able to wait for technical back-up. There is a danger that EIA will do more and more ex post justification for PE's policy proposals. FEA was accused of doing ex post justifications and hence of being unobjective. What will be EIA's response to similar charges? EIA's lack of input to NEP II and its inability to identify new information needs may both be adaptations to time constraints.
As was mentioned above, severe time constraints early in OEIA/EIA's history resulted in inadequate documentation of analyses and general staff dissatisfaction. Relaxing time constraints did much to alleviate these problems; there would be great reluctance to letting time constraints become more binding again. Identification of new information needs is a very time-consuming task and hence one which would be put off readily. (Developing an identification capability would also probably require adding some staff who were, once again, primarily policy analysts and only secondarily modelers.) Furthermore, inputs to the national energy plans are not specifically required of EIA by the DOE Organization Act and can be put off or ignored in favor of more urgent and more specifically required analyses. This seems especially unfortunate since modeling is a time-consuming activity and hence is particularly useful for longer-term planning; the NEPs are planning documents required of the Administration and DOE by Title VIII of the DOE Organization Act. With regard to whether or not EIA's operations are actually conditionally normative, the answer is that in some respects they are and in other respects they are not. With respect to the selection of topics to guide the development of the models, EIA is more or less conditionally normative; the scope of the models is broadened or shifted in response to new legislative or regulatory initiatives. This lack of attention to identifying new energy problems and information needs makes it difficult for EIA to maintain relevance. With respect to the acceptance of requests for analyses, EIA is not really conditionally normative; a conditional normativist could not reject requests as EIA does, at least implicitly. EIA/client interactions in the course of analysis are conditionally normative in print but not in practice.
In practice they are either pragmatic, as in the case of responding to requests for technical services, or ill-defined, as in the case of the access issue. EIA/critic interactions, though stilted, are also pragmatic. EIA is neither thoroughly conditionally normative nor thoroughly pragmatic. EIA deficiencies from the standpoint of pragmatism include its inattention to workability, including its inability to diagnose relevant problems and information needs and its failure to seek feedback regarding the usefulness of its services.

The Four Values: Professionalism, Objectivity, Credibility, Relevance

EIA's value mandate specifies that EIA and its products be professional, objective, credible and relevant. This chapter concludes by examining the relations between these four values and EIA operations and between these four values and the definition of objectivity as developed in Chapter II. EIA seems to have succeeded well at guaranteeing professionalism. The staff of EIA, from the Administrator on down, at least have a reputation for being professionally qualified and the agency in general has developed a reputation for professional work. PART and TNEMP both agree on EIA's achievement of professionalism.88 Objectivity has been interpreted by EIA to mean two things: conditional normativism and independence. The positive or conditionally normative goal is to keep EIA research free of normative elements; objectivity means to EIA disinterested reporting. As pointed out above, EIA is conditionally normative in the sense that all assumptions are explicit whereas operating procedures are in many respects pragmatic, especially in the long run. EIA independence is carefully guarded, for example, by safeguarding the right to decide internally how to structure the system of models and which analyses to prepare as independent EIA products.
Independence from clients is achieved by explicit specification of the clients' assumptions, with EIA reserving the right to specify additional assumptions before endorsing the published analysis (which in turn contributes to EIA's long-run pragmatism). Both conditional normativism and independence, as interpreted by EIA, pose problems for relevance. They lead to lack of attention to identifying new energy problems and information needs and to the usefulness of EIA outputs. Indeed, Administrator Moses has stated that independence "... means we really are able to proceed without paying attention to how data will fit with administration policy."89 Credibility is difficult to differentiate from objectivity and professionalism. The EIA seems to have interpreted credibility as the ability to replicate results. All EIA products are subjected to technical review.90 All analyses are checked prior to publication by staff who were not involved in the original analysis. The models are also reviewed and assessed by the Office of Oversight Analysis and Access. In addition, EIA's operating procedures are reviewed annually, according to legislative mandate, by the independent PART. Clarity, coherence, correspondence and workability are the four criteria for objectivity as defined in Chapter II. A concept or set of concepts (i.e., a model) is objective if it passes these four tests. A person or institution is objective if his or its products are subjected to these four tests. The relationship between this definition of objectivity and the EIA's concept of objectivity and its value mandate is as follows. Relevance is an aspect of the test of workability for prescriptions and by derivation for predictions, descriptions and diagnoses. Credibility is the ideal result of objectivity. Professionalism sets standards for the conduct of objective investigators.
Objective conditional normativism means that all normative elements are made explicit as assumptions and that the four tests of objectivity are applied only to non-normative elements. However, since all prescriptions are based on both positive and normative knowledge, the test of workability is only meaningful if the relevance of the (normative) assumptions is questioned. This is one reason strict conditional normativism is not feasible for participants in any aspect of policy making. This does not imply, however, that policy making cannot be objective; it simply means that policy makers must adopt a philosophical position less constraining than conditional normativism. And as has been shown, EIA's operating procedures are pragmatic in some important respects -- and perhaps should be more so in others.

CHAPTER III ENDNOTES

1Legislative mandates are not commonly relied on so heavily in public policy research. This was considered to be an appropriate approach in evaluating EIA because the EIA staff interviewed repeatedly cited the legislation in explaining why EIA does what it does.

2U.S. Congress, House, Interstate and Foreign Commerce Committee, Federal Energy Administration Authorization and Extension, H. Rept. 1113 to Accompany H. R. 12169, 94th Congress, 2nd Session, 1976, p. 3.

3See Executive Orders 11615 and 11748 and ibid., p. 10.

4Federal Energy Administration Act, P.L. 93-275, May 7, 1974, sec. 2(a).

5Ibid., sec. 6, lists the functions FEA consolidated.

6Longer lists of goals are in the FEA Act, sec. 2 and 5; the particular emphasis on these four areas is found in U.S. Congress, Senate, Federal Energy Administration Act of 1974, S. Conf. Rept. 788 to Accompany H. R. 11793, 93rd Congress, 2nd Session, 1974, U.S. Code Congressional and Administrative News, 1974, Vol. 2, p. 2973.

7Equitable sharing of burdens and meeting all priority needs may, of course, be inconsistent goals during an energy shortage.
In temporary situations such as those FEA was intended to deal with, it makes some sense to speak of an equity-production or equity-efficiency trade-off; "normal" productivity standards may be sacrificed to maintain equity. In anything but very temporary situations, however, there is no simple equity-production or equity-efficiency trade-off, as changes in the distribution of ownership and power change the meanings of production and efficiency. The meaning of efficiency is derived from the meaning of equity.

8The emphasis on shortages comes from the FEA Act, sec. 22; U.S. Congress, House, Government Operations Committee, Federal Energy Administration, H. Rept. 748 to Accompany H. R. 11793, 93rd Congress, 2nd Session, 1974, U.S. Code Congressional and Administrative News, 1974, Vol. 2, p. 2941; and U.S. Congress, Senate, FEA Act of 1974, pp. 2973 and 2975.

9Lists of FEA's functions and roles are scattered throughout the legislation and its history, but see for example, U.S. Congress, House, Government Operations Committee, FEA, pp. 2940 and 2957 (and U.S. Congress, Senate, FEA Act of 1974, p. 2975 for an amendment of the discussion in the House report).

10U.S. Congress, House, Government Operations Committee, FEA, pp. 2940 and 2957.

11U.S. Government Manual (Washington, D.C.: Office of the Federal Register, National Archives and Records Service), 74/75 ed., pp. 464-465 and 75/76 ed., p. 476.

12Energy Conservation and Production Act, P.L. 94-385, August 14, 1976, sec. 142 and part B.

13U.S. Government Manual, 77/78 ed., p. 516.

14Ibid.

15Personal interviews with former OEIA staff, Washington, D.C., March 28-30, 1979.

16ECPA, sec. 113 and 31(3); and sec. 142; and part B, sec. 55.

17ECPA, sec. 142 and part B, sec. 51(a) and 54.

18ECPA, with its one year extension of FEA, was an interim step in the transition from temporary to permanent federal energy institutions. See U.S.
Congress, House, Interstate and Foreign Commerce Committee, FEA Authorization and Extension, pp. 5-7.

19Department of Energy Organization Act, P.L. 95-91, August 4, 1977, sec. 101 and 102.

20Ibid.

21Ibid., sec. 102.

22This theme pervades the legislation and its history. See especially Title VIII (Energy Planning) of the DOE Organization Act and U.S. Congress, Senate, Governmental Affairs Committee, Department of Energy Organization Act, S. Rept. 164 to Accompany S. 826, 95th Congress, 1st Session, 1977, pp. 59-63.

23U.S. Congress, Senate, Governmental Affairs Committee, DOE Organization Act, pp. 23-24.

24U.S. Government Manual, supplement to 77/78 ed., p. 150.

25DOE Organization Act, sec. 205 and U.S. Congress, Senate, Governmental Affairs Committee, DOE Organization Act, p. 24.

26Personal interviews with EIA staff, Washington, D.C., March 28-30, 1979.

27DOE Organization Act, sec. 102(7) and 205; and U.S. Congress, Senate, Governmental Affairs Committee, DOE Organization Act, pp. 24-25; and EIA, Annual Report to Congress 1978 (Washington, D.C.: Government Printing Office, April 1977), Vol. 1, "Administrator's Message."

28Personal interview with an EIA official, Washington, D.C., June 13, 1979.

29DOE Organization Act, sec. 102(11).

30For example, the legislation provides for both advisory committees (section 624) and regional energy advisory boards (section 655). EIA has not made significant use of either, although they did submit proposals for advisory committees around the time President Carter decided to restrain the formation of new advisory committees to federal agencies as a cost-cutting measure. These proposals were rejected. See PART, Activities of the Energy Information Administration (Washington, D.C.: U.S. Government Printing Office, May 1979), pp. viii and 34-35.

31This issue will be dealt with in greater detail later in this chapter.

32Personal interview with former FEA staff, Washington, D.C., March 28 and June 13, 1979.
33To repeat, these complaints, especially about cancellation of the 1977 NEO, may or may not have been well-founded. They stemmed from a general lack of credibility. Cancellation of the 1977 NEO, its adequacies and inadequacies, are discussed in the first (1977) PART report.

34"Lincoln Moses: DOE's Chief Energy Statistician," U.S. Department of Energy Energy Insider 1(16):5, May 15, 1978. The nature of the conflict between data collection for regulatory purposes and more general data analysis is based partly on fear that the time pressures for analyses required by Congress will cause data collection to be neglected and partly on the realization that incentives to be honest for those who provide EIA with information are different in the two cases. EIA's Office of Energy Data now does most EIA data collection and the Office of Applied Analysis does most data analysis. Actually the Office of Applied Analysis fills requests from both policy makers and regulators (particularly the DOE's Economic Regulatory Administration). Perhaps this is the source of a future separation even within the Office of Applied Analysis.

35Personal interview with a former FEO staff member, Austin, Texas, March 17, 1979.

36Ibid.

37Personal interview with former FEA staff, Washington, D.C., March 28, 1979.

38Statement of a NAB member, TNEMP National Advisory Board Meeting, Austin, Texas, March 17, 1979.

39Personal interview with an EIA official, Washington, D.C., March 28, 1979.

40Personal interview with an EIA official, Washington, D.C., March 29, 1979.

41Ibid.

42Personal interviews with EIA officials, Washington, D.C., March 29 and June 13, 1979.

43The historical development of PIES/MEFS follows W. W. Hogan, "The Role of Models in Energy Information Activities."

44Judgmental forecasting, which does not rely on formal models, is discussed in relation to formal modeling in Greenberger, et al., Models in the Policy Process.
45Personal interview with former FEO staff member, Austin, Texas, March 17, 1979.

46FEA, Project Independence Report (Washington, D.C.: Government Printing Office, November 1974), p. 18.

47See the preface by John Sawhill (FEA Administrator) to the PI Report.

48For a description of this process, see the Executive Summary and Chapter I of the PI Report. For an example of an FEO document using the definition of self-sufficiency, see "United States Energy Self-Sufficiency: An Assessment of Technological Potential," Washington, D.C., February 6, 1974 (mimeographed).

49FEA, PI Report, p. 1.

50Ibid., pp. 1-8.

51This description of PIES's characteristics is drawn from W. W. Hogan, "Energy Policy Models for Project Independence," Computers and Operations Research 2:251-253, 1975.

52W. W. Hogan, J. L. Sweeney and M. Wagner, "Energy Policy Models in the National Energy Outlook," Energy Policy, TIMS Studies in the Management Sciences, Vol. 10, ed. by J. S. Aronofsky, A. G. Rao and M. F. Shakun (New York: Elsevier, 1978), p. 39.

53See Hogan, "Energy Policy Models for PI," p. 262.

54Hogan, et al., "Models in the NEO," p. 61.

55Members of TNEMP's NAB, however, have noted that PIES/MEFS is not well constructed to consider environmental goals. See "Summary of the Texas National Energy Modeling Project National Advisory Board Meeting, August 11, 1978," TNEMP Records, p. 10.

56The two options were introduced above, on page 75.

57See for example, EIA's Annual Report 1978, Vol. 1, p. 24.

58Above, page 34.

59Above, page 69.

60Hogan, "Energy Policy Models for Project Independence," p. 262.

61Personal interview with an EIA official, Washington, D.C., March 29, 1979.

62See "Summary... August 11, 1978," p. 2.

63Ibid.

64Personal interview with an EIA official, Washington, D.C., March 29, 1979.

65Ibid.

66ECPA, sec. 142 and part B, sec. 51(6).

67PART, Activities of the EIA, p. iii.

68Personal interview with an EIA official, Washington, D.C., March 29 and June 13, 1979.

69Personal interview with an EIA official, Washington, D.C., June 13, 1979.
70EIA, Office of Planning and Evaluation, "Energy Applied Analysis Program Plan, Fiscal Years 1978-1981, FY '78 Version," Washington, D.C., August 1978, p. 7 (mimeographed).

71Personal interviews with EIA staff, Washington, D.C., March 28, 1979.

72EIA, Annual Report 1978, Vol. 1, p. 24.

73Personal interviews with EIA staff, Washington, D.C., March 28, 1979.

74Personal interviews with EIA officials, Washington, D.C., March 28 and 30, 1979.

75Statement of a NAB member, TNEMP National Advisory Board Meeting, Washington, D.C., June 11, 1979.

76Personal interview with an EIA official, Washington, D.C., June 13, 1979.

77EIA, Annual Report 1978, Vol. 1, p. 24.

78ECPA, sec. 113 and 32(3).

79G. M. Lady, "Model Assessment and Validation: Issues, Structure and Energy Information Administration Program Goals" (paper presented at the National Bureau of Standards Symposium on Model Validation and Assessment of Energy Models, Gaithersburg, Maryland, January 10-11, 1979), p. 12.

80See "Statement of Meeting Purpose," TNEMP Records, which was prepared for a June 12, 1979 meeting of NAB members, TEAC officials, and EIA officials in Washington, D.C.

81PART's composition and duties are discussed at the beginning of each PART report.

82Personal interview with an EIA official, Washington, D.C., March 29, 1979.

83Personal interview with an EIA official, Washington, D.C., June 13, 1979.

84EIA/TNEMP interactions are discussed more fully in Johnson and Brown, An Evaluation of the Normative and Prescriptive.

85Letter, M. L. Holloway to Lt. Governor Hobby, September 29, 1978, TNEMP Records; and PART, Activities of the OEIA, p. 48.

86EIA's evaluation criteria are outlined in Lady, "Model Assessment and Validation." George Lady was -- and perhaps still is -- Director of the Office of Oversight Analysis and Access.
87Personal interview with Office of Oversight Analysis and Access official, Washington, D.C., March 30, 1979.

88PART, Activities of the EIA, p. ix; and personal interview with M. L. Holloway, East Lansing, Michigan, June 5, 1979.

89"Lincoln Moses," ibid.

90EIA, Annual Report 1978, Vol. 1, p. 27.

CHAPTER IV

IMPLICATIONS AND CONCLUSIONS

This chapter has three sections. The first section highlights the implications of EIA's legislative mandate, and of its interpretations thereof, and suggests some institutional alternatives for improving EIA's performance in the future. The next section is directed more toward improving TEAC and especially TNEMP. It takes a second look at the FEA-DOE/Texas conflict, which was the stimulus for TNEMP, and evaluates the appropriateness of TEAC's original expectations of EIA. Implications drawn from evaluating EIA that should be considered in developing TNEMP into a more permanent institution are discussed. Contributions that a more permanent successor to TNEMP might make to DOE are suggested also. The third section is a postscript. TEAC and TNEMP were reorganized (and renamed) in the fall of 1979, after the preliminary results of this evaluation had been presented to TEAC and TNEMP, but before this final writing. The postscript describes the reorganizations and attempts to determine whether and how the recommendations made here were acted upon. In a sense it is an evaluation of this evaluation. As has been mentioned several times throughout this study, implementing some of the recommendations made here would require deeper knowledge of the powers and personalities involved in the politics of energy, both in Washington and in Texas.

About EIA--Its Strengths, Weaknesses and Possible Futures

The implications about EIA follow the order of the description of EIA in Chapter III -- implications about the legislative mandate precede implications about EIA's interpretations of and operations with respect to the mandate.
The reason behind this sequence is basically the same as that given in Chapter III; the content and consistency or inconsistency of the legislated values and goals help determine relevant issues in evaluating operating procedures. Implications of the mandate and implications of EIA's operating procedures are followed by institutional alternatives for redesigning and improving EIA.

Implications of the Legislative Mandate

EIA's legislative mandate includes prescriptions to be objective, professional, credible and relevant. The most remarkable thing about this mandate is that it consists of concepts for which there are no widely accepted definitions. Yet the issue of giving them appropriate interpretations has never been openly addressed by EIA. The methodological approach used here was designed to show that the meanings of these concepts are matters of choice and of great importance. Perhaps there are indications that DOE and EIA personnel are beginning to realize the importance of these issues, partly as a result of the TNEMP challenges. Alvin Alm, DOE Assistant Secretary for Policy and Evaluation, was quoted in Chapter I as hoping that conservative assumptions for NEP analyses would be uncontroversial.1 He never realized that even conservative assumptions hurt some and help others and that the normative-prescriptive choice of whose interests count cannot be avoided! Just one year later, Elizabeth MacRae, an EIA administrator and EIA's liaison to TNEMP, emphasized to TNEMP's NAB her understanding that every EIA forecast or prediction has political implications and that EIA cannot hope to please everyone.2 From here, it is but a small step to turning full attention to appropriate justification of EIA procedures. The concept of objectivity is the kingpin in the four mandates of objectivity, professionalism, relevance and credibility. It ties them all together and determines whether they are mutually consistent or inconsistent.
If objectivity is defined in a positivistic or conditionally normative manner, as it has been by EIA, then objectivity is inconsistent with relevance. This leads to ignoring EIA/decision maker interactions as a valuable source of feedback about the usefulness of EIA services. It also leads to lack of attention to identifying and prioritizing new energy problems and information needs. Over time, both oversights result in omissions in the scope of information provided and eventually in loss of credibility. In part, EIA attempts to achieve conditionally normative objectivity by staunchly defending its independence. Independence is another undefined notion which has not been subjected to adequate discussion. Independence was intended as a means of attaining objectivity. It has become something of an end in itself. The reason for making EIA independent was to keep it free of manipulation by powerful interest groups. However, in the absence of a focus on relevance and an openness to evaluation, independence-for-objectivity becomes independence-without-accountability. This evaluation of EIA has emphasized that the possibility and importance of objective normative information must be understood. EIA's proscription on making policy recommendations need not translate into a prescription to ignore the normative. EIA is instructed to provide information useful to decision makers. Decisions or prescriptions are necessarily based on both positive and normative information. Hence the normative will be handled, either implicitly or explicitly, and either objectively or unobjectively. Wise decisions (prescriptions) with desirable consequences are much more likely to result from the use of objective rather than unobjective positive and normative information. These issues are important not only for the objectivity of EIA's products but also for the very existence of the institution. The seriousness of EIA's omissions, if uncorrected, will likely increase over time.
Institutions without viable, relevant functions wither and die. They lose political and administrative support, are given inadequate resource allocations and over time become truly marginal activities.3 An information agency designed around conditionally normative or positivistic ideals will not develop adequate operating procedures and performance in the long run.

Implications of EIA's Operations

One recurring response to challenges to EIA's objectivity and independence has been to separate and isolate controversial activities. Policy making and information provision were first separated in OEIA and continue to be separate in EIA. Data collection and data analysis have been separated in EIA.4 Separating incompatible activities does have advantages. The requirements and time constraints of problem-solving and subject-matter research, for instance, are different; a single institution trying to perform both simultaneously may consistently shortchange one or the other. However, separating interdependent activities also has disadvantages. In particular it leads to lack of coordination. In the case of EIA and its predecessors, problem-solving preceded subject-matter research. EIA's loss of its diagnostic capabilities, without the addition of close coordination with other diagnostic units, is one source of irrelevance. Similarly, objective modeling requires empirical inputs; a modeling unit separated from and not well coordinated with a data collection unit suffers. A common response to poor coordination is duplication -- PE is developing its own models and the Office of Applied Analysis sometimes collects its own data.5 The key point to be made here is that many activities are both somewhat incompatible and somewhat interdependent; in these cases the advantages and disadvantages of separating versus not separating must be carefully weighed.
If a decision is made to separate, the necessary interactions between separate but interdependent activities will require careful structuring. The difficulty again is one of independence with accountability. It is also necessary for EIA and DOE to adopt a posture toward sources that duplicate EIA information. EIA was designed as a centralized information source to put an end to credibility problems resulting from multiple sources producing contradictory information. A second disadvantage of competition and duplication is that it is another potential source of irrelevance -- PE's problem-solving relevance takes the pressure off EIA. One advantage of competition is that alternative sources provide checks for each other, promoting objectivity. However, careful interactions with decision makers regarding the usefulness of information and thorough evaluation are substitutes for competition in this respect -- while avoiding the cost of lost credibility. It seems worthwhile to protect EIA as a single, centralized source of subject-matter energy information. Simultaneously, however, careful guidelines for credibility promotion and maintenance should be developed. Credibility is something that is built up -- earned -- over a period of time. EIA's responses to various sorts of time constraints are particularly critical over the long haul for its credibility. As has been suggested, identifying and prioritizing new energy problems and information needs is time-consuming and hence is avoided by EIA. Evaluation is time-consuming too; EIA's commitment to its internal evaluators has yet to be demonstrated. Modeling also takes time. EIA time constraints in addition to its legislative mandate have resulted in EIA modeling centered around the annual, shorter-run Annual Reports rather than the biennial, longer-run National Energy Plans.
There is a trade-off to be made between doing specific problem analyses well on a day-to-day basis and maintaining a viable, healthy information system over the long haul. The latter is not possible without capabilities for diagnosis and evaluation, and it would be promoted by using models more rather than less for long-term planning. Two particular cycles, each with different implications, have been uncovered by this examination of EIA. One cycle is that problem-solving research gave way over time to subject-matter research. This is to be expected. As the initial problems of the "energy crisis" were solved or faded in importance, and as a temporary agency was institutionalized into a permanent one, the focus shifted from solving particular problems to identifying a kind of information useful in solving a set of relevant problems around which a more permanent bureaucratic edifice could be built. One implication of this cycle, already mentioned, is the need for increased diagnostic capability to keep the subject matter relevant to an important set of problems. Another implication is a heightened need for evaluation. Particularly in the case of subject-matter research, decision makers and hence usefulness often are not -- and cannot be -- identified at the time of the research; relevance can only be maintained by ex post evaluation. The other cycle is that models were used initially for description of poorly defined problem areas and later, when problems and information needs were better defined, for prediction. This cycle has now been broken at EIA. Models are used almost exclusively for prediction. It is possible, however, that this cycle is necessary to achieve the full contributions of modeling to relevant information provision.6 Involving models early in the attempt to solve new problems -- during the descriptive phase -- provides an opportunity for the models to be developed simultaneously with understanding of the problems.
Gradual development of a model from simple data reporting to more sophisticated predictive analyses would seem to present less of a hurdle to maintaining relevance than would the quantum leap from no modeling to sophisticated modeling at some point later in the life cycle of the problems. Under present EIA structure, model development from description through prediction would again require close coordination between the Office of Energy Data and the Office of Applied Analysis. Closer interactions with decision makers also might enhance the usefulness of models. Hogan's observation that one of the most valuable results of modeling is the education of decision makers was noted in Chapter II.7 The benefit is lost if decision makers are presented only with the final results of an analysis. The development of a fairly strong EIA ideology along with a reduction in staff turnover was mentioned in Chapter III.8 According to Roberts, this implies that change will become slow and difficult in EIA.9 Furthermore, Arrow suggested that there is a trade-off to be made in institutional design between predictability and flexibility.10 However, it seems that EIA needs some of both. Long-term credibility demands stability and continuity for an information system. Furthermore, mistakes caused by poor quality information can be very expensive when decisions are irreversible and/or decision makers are unidentified. On the other hand, maintaining relevance and avoiding obsolescence -- conceptual, empiric or institutional -- demands flexibility. The complexity of the demands on EIA plus the likely difficulty of changing it only make more crucial the need to consciously design and control EIA activities.
Key items on the agenda for improving EIA performance include going beyond conditional normativism, developing a capability for identification of new energy problems and information needs, careful design of mechanisms for interactions and coordination between EIA staff and decision makers and between data collection and modeling activities, and appropriate structuring of modeling activities to include both description and long-term planning analyses in addition to shorter-term analyses. Institutional alternatives to make these suggestions realities for EIA would help produce a well-rounded, credible energy information system capable of performing the functions of identification, selection, definition, analysis, synthesis, decision making, action taking, and evaluation.

Institutional Alternatives

One obvious step for EIA to take is to employ an applied methodologist or philosopher of science to provide insights and oversights. Such a person could provide institutional analyses of EIA procedures similar to this one, help justify changes in EIA to the relevant congressional committees, and be generally valuable in helping EIA to advance beyond conditional normativism and positivism. An advantage in having such a person within EIA is that EIA is, at least at present, more receptive to internal than to external criticism. Several suggestions come to mind for improving diagnosis, evaluation, and EIA interactions with users or decision makers. EIA could employ liaisons whose responsibilities would include interacting with PE, congressional energy committees and other clientele groups. Such liaisons might be located in the Office of Oversight Analysis and Access to provide close interaction with EIA and yet not interfere too much with day-to-day modeling activities. Another alternative would be for EIA to make better use of PART's talents. PART already interacts somewhat with EIA clients but could do much more.
This could have the further advantage of relieving PART from presenting a facade of evaluating EIA's models, for which PART is not appropriate. The main problem with making better use of PART is that it would require some re-education of PART members along the same lines as that suggested here for EIA. PART evaluations tend to accept EIA's interpretations of objectivity and independence and there is a danger that PART evaluations may ask the same questions of EIA year after year, leading to irrelevant evaluations.11 A third possibility for obtaining feedback regarding relevant problems and information needs and the usefulness of EIA services would be to set up advisory groups for these purposes, thus lowering the transactions costs to clientele in providing feedback. The DOE Organization Act permits the use of advisory groups, although in the past EIA budget requests for advisory groups have been rejected. With appropriate justification, however -- provided perhaps by the EIA philosopher of science recommended above -- such requests might be more successful in the future. Bonnen, et al., have suggested that diagnosis and evaluation be centralized for all federal information systems in what they call an Office of Statistical Policy.12 This would have the advantages of coordinating information not just within but across subject matters, while not requiring EIA to give up valuable staff time for modeling per se. However, for the Office to be successful at overseeing EIA, EIA would have to be involved in the Office's research, sympathetic to it, and somehow also accountable to it. A final possibility for changing EIA's agenda is to change its mandate. For example, future EIA appropriations might include a line item for EIA participation in the preparation of the NEPs. The advantage of a line item is that it carries clout -- those activities for which there are line items definitely get done.
The drawback is that once a line item is on the books, it can be difficult to get it off even if it subsequently loses relevance. Line items should be used with great discretion.13 All of the above recommendations could also help to improve EIA's coordination of its various activities. Another tactic for improving coordination between information system functions is to increase the number of boundary roles, including developing roles that overlap functions. For example, some staff could work on both data collection and modeling. This could help to make it easier to use models for both description and prediction. Staff could be tied to a particular topic, working on both the data development and the subsequent modeling, rather than being tied to an Office of Energy Data and an Office of Applied Analysis as they are now. Similarly, a tactic for better coordination between EIA and PE might be to more actively involve EIA in the synthesis function. Some of these recommendations might require legislative initiatives, whereas others only require insightful administration. Line items obviously involve a legislative initiative. Preventing further separation of interdependent activities, preventing duplication of EIA services, and involving EIA in more synthesis or policy integration may or may not require legislative action. Carrying EIA beyond conditional normativism or positivism by restructuring interactions and re-aligning the various functions probably only demands insightful administration. In any case, the recommendations made here would be useful only if EIA could be made to understand their importance. Furthermore, the recommendations made here basically preserve EIA. By now there is considerable learning and valuable experience locked up in EIA. It seems better to use this foundation as a basis for improvement than to tear it down and start again. PE, EIA's valuable counterpart, at the moment seems to be at a crossroads.
On the one hand, it could be on the brink of repeating history. Like FEO and the early FEA, it is again a unit with simultaneous responsibilities for information provision and policy making. We have seen the difficulties in store for it if it should attempt to maintain a posture of conditional normativism. On the other hand, if it recognizes the necessity of being normative yet objective, PE could develop into a still more valuable problem-solving staff agency for the Administration. There are great advantages in PE doing the controversial staff research on the President's problems. PE, like EIA, should exploit opportunities for repeated interactions with decision makers and affected persons as an important source of information. The process of iterative interaction among investigators, decision makers and affected people is an important source of information about the positive, the normative and decision rules. When information derived from this process is exploited and the information acquired is tested with the four tests of coherence, correspondence, clarity and workability, the result is an objective, pragmatic orientation. TNEMP, too, is at a crossroads. It is on the brink of becoming a more permanent institution.

About TNEMP -- Its Strengths, Weaknesses and Possible Futures

TNEMP was originally set up to solve a problem, TEAC's problem with EIA. And if -- when -- it becomes permanent, a main task will be periodically evaluating EIA activities. Hence, an important part of evaluating TNEMP is evaluating TNEMP's and TEAC's original diagnosis of the conflict with EIA. This is done in the first subsection of this section. The second subsection extends some of the recommendations about EIA made above to TNEMP. The third subsection suggests contributions a successor to TNEMP might continue to make to DOE and EIA.

Understanding the FEA-DOE/Texas Conflict

FEA's OEIA, DOE's EIA, TEAC and TNEMP were all agencies with conditionally normative orientations.
Hence it is to be expected that the conflict and subsequent interaction among them mainly pointed out positivistic shortcomings and was characterized by misunderstandings, mishandlings and possibly avoidances of normative and prescriptive issues. This was indeed the case. Many of TEAC's initial complaints and suspicions about EIA resulted from TEAC's failure to understand the role of values and goals even in conditionally normative analysis. For example, TEAC complained that by specifying the assumptions for the NEP analyses, the Executive Office had manipulated PIES.14 Yet, such specification of assumptions is a prerequisite of conditionally normative analysis! TEAC also suspected that because the NEP recommended larger energy price increases for Texans than for most other Americans, PIES must have "biases" or non-random errors. It has been shown that information generally has normative and prescriptive implications but that the choice of the prescription to raise energy prices relatively more in Texas was probably not made by EIA personnel. It was more likely to have been made by PE or Executive Office personnel. Furthermore, TNEMP's original agenda emphasized the positivistic interpretations of clarity, coherence and correspondence. Thus, in the beginning TNEMP's agenda contained omissions (the normative and prescriptive) and perhaps should have been directed at PE rather than EIA. Of course, this would have been difficult to know without access to the models. Some of these weaknesses were at least partially corrected in mid-course by the addition of this study and the Johnson and Brown study. Other weaknesses can be corrected as TNEMP is redesigned and made more permanent.

Implications for Institutionalizing TNEMP

The history of energy information agencies reveals repeated transitions from problem-solving to subject-matter orientations. This is what has happened in the evolution of EIA and MEFS.
It is to be expected that this will happen to TNEMP as it becomes a more permanent research arm of TEAC (now TENRAC). There is nothing wrong with this transition as long as the subject matter is relevant to an important set of problems. As has been the case with EIA, the danger is that the set of problems will change unbeknownst to the staff of the information agency. Such changes are not detected well by conditionally normative or positivistic information agencies. TNEMP must insure against irrelevance if it is to attain enduring institutional status. It can do this by (1) appropriately mixing philosophic orientations among its staff, (2) engaging in rather continuous interaction with affected persons and decision makers in Texas, in TEAC, in the Texas legislature and in the governor's office, (3) assuming responsibility for normative and prescriptive investigations, (4) engaging in problem diagnosis and definition and, on occasion, solution at policy, program and project levels so that it need never be divided into the equivalents of EIA and PE, and (5) having itself evaluated at timely intervals. There are substantial advantages for TNEMP in avoiding the separation which took place between EIA and PE. Furthermore, if TNEMP's oversight of federal energy policy is broadened to include both PE and EIA, it will be to TNEMP's advantage to have both problem-solving and subject-matter capabilities. With regard to evaluation, in the past TEAC's two-years-at-a-time leases on life have provided stimuli for evaluation as TEAC went about justifying itself to the state legislature. If TEAC and TNEMP are given more open-ended mandates, they may need to more consciously select appropriate times for evaluation and appropriate evaluators.
If TNEMP were to follow a philosophic orientation more pragmatic or eclectic than conditionally normative, it would be able to address normative and prescriptive questions more objectively than EIA, PE and the Executive Office -- and also more objectively than TNEMP has to date. Such a philosophic orientation would permit it to be an information agency dealing objectively with both positive and normative information as well as a staff agency helping Texas decision makers arrive at prescriptive conclusions. If EIA and PE do not fill the void of objective normative information with respect to problems involving energy, a TNEMP which does would be a major national, as well as Texan, asset.

Potential Contributions of TNEMP to DOE

Originally, EIA and DOE viewed TEAC as a threat, an unjustified challenge to EIA's credibility which was itself not credible. As was described in Chapter I, much of this hostility eventually faded into the background. However, EIA personnel continued to reiterate that TNEMP was a test case too expensive to repeat, that EIA could not favor Texas over other states, and that EIA had basically learned little from TNEMP.15 Presumably one of TNEMP's enduring functions will be to continue independently evaluating EIA's models. It will be to TNEMP's advantage to maintain workable interactions with EIA. EIA will be more likely to cooperate with TNEMP if TNEMP is clear about what it has to offer EIA and DOE. TNEMP could become a valuable source of normative and prescriptive information for EIA and PE. Undoubtedly EIA has already learned much about which energy problems are important to Texas and the Southwest from its interactions with TNEMP. In addition, TNEMP could provide information supplemental to DOE information, helping EIA and PE to evaluate their work. PE and EIA macro information may prove compatible or incompatible with TNEMP's state and regional information, both problem-solving and subject-matter.
Admittedly, TNEMP could be more valuable in these areas if other states or regions of the country developed similar capabilities, helping EIA and PE maintain balanced -- objective -- perspectives. In the area of evaluating specific models, such as was done by the Analysis Team, multiple inputs from additional states are less crucial because of TNEMP's close ties to the academic community. The broader evaluation of overall EIA activities is in the long run more crucial for EIA's credibility, but also the one EIA -- at least at present -- is least likely to appreciate. Of course, it must be realized that it is not just TNEMP's responsibility to convince EIA of the usefulness of TNEMP's criticisms; EIA too is partly responsible for convincing itself of the usefulness of feedback from "the outside." Furthermore, a permanent TNEMP will be useful even if its suggestions for EIA go unheeded; TNEMP, as a research arm of TEAC, has a vital role to play in Texas policy making.

Postscript -- TEAC and TNEMP Become TENRAC and TEPP

Milton Holloway, executive director of TEAC and of TNEMP, had the major responsibility for redesigning and updating both in the summer and fall of 1979. By his own admission, he relied heavily on the results of the normative and prescriptive evaluations of EIA, MEFS, TEAC and TNEMP.16 This section briefly highlights how the recommendations made here were taken into consideration.17 TEAC was renamed the Texas Energy and Natural Resources Advisory Council, TENRAC. Its major goals are to maintain the economic growth of Texas, encourage a clean environment, and promote full employment to the extent that these goals are influenced by energy policy and energy development. A list of more specific TENRAC duties includes adapting and reassessing Texas energy policy and Texas natural resources policy, recommending legislation at both the national and state levels, and reviewing the impacts of federal actions on Texas. This is a subject-matter mandate.
TENRAC consists of the Council itself, advisory committees, an executive director and an Energy Analysis and Development Division, EADD, which provides the Council with analytic support and develops policy recommendations. It maintains an energy data base and econometric models, produces an annual Texas energy outlook and an assessment of EIA's Annual Report, produces energy information, policy analyses and recommendations for the Council, the legislature and other Texas decision makers, maintains an awareness of all energy research of importance to Texas, and coordinates and supports energy research, development, demonstration and information exchange. More specifically, EADD consists of a Policy Analysis Section, which identifies policy issues, and the Texas Energy Policy Project, TEPP, which integrates and develops Texas' energy investigative capacity in support of TENRAC decision making. TEPP's structure is similar to TNEMP's, consisting of a National Advisory Board, an Analysis Team, and a new Peer Review Team. There are provisions for integration of problem-solving, subject-matter and disciplinary research; incorporation of the positive, the normative and the prescriptive; interactions between all the necessary functions of an information system; and technical peer review and periodic third party overall evaluation. TENRAC staff consists of a mix of computer scientists, statisticians, economists, policy analysts, lawyers -- and even a new addition, a philosopher. The compatibility of this design with the recommendations which flow from this evaluation is obvious. Ultimately, however, it must be remembered that this evaluation is an institutional evaluation. It has suggested institutional structures which establish incentives for certain types of behavior.
Hence, a key test of the usefulness of this evaluation yet remains -- that is, whether or not TEPP's new structure ultimately results in desirable performance or consequences in terms of objective, credible energy policies.

CHAPTER IV ENDNOTES

1. Above, page 8.

2. "Summary of the Texas National Energy Modeling Project National Advisory Board Meeting, January 12-13, 1979," TNEMP Records, p. 4.

3. Bonnen discusses the obsolescence of the census of agriculture along these lines in "Assessment," pp. 389-392.

4. Beneath the surface of this separation, in addition to the conflicts between the time requirements of data collection and modeling, there is also the conflict mentioned in Chapter III between information for regulatory purposes and information for policy making. The problems posed by the conflicts between regulation and policy making would be a worthwhile topic for further study.

5. Personal interviews with EIA staff, Washington, D.C., March 28-30, 1979.

6. While this suggestion makes intuitive sense, it would be useful to find out if this cycle has been observed in other information systems and what its consequences are.

7. Above, page 34.

8. Above, page 72.

9. This hypothesis was discussed above, on page 47.

10. Above, page 47.

11. The first two PART evaluations used the same performance criteria and one suspects, perhaps due to PART's own time constraints, that this practice will continue. The problems and potentialities of PART would be another useful topic for further research.

12. Bonnen, et al., "Improving" describes in detail the need for, responsibilities of, and an institutional design for the suggested Office of Statistical Policy.

13. Another interesting topic for further research would be to explore a past EIA experience with a line item.
For example, the Energy Industry Profile, a data series designed to illustrate the competitiveness of the energy industry, was finally made a line item after Congress realized EIA was not interested in using its discretionary authority to demand data of energy suppliers. My first impression is that EIA half-heartedly did the bare minimum to develop the data series and that, perhaps as a result, DOE has recently dropped its investigations of competition.

14. M. L. Holloway, "The National Energy Plan Analyses," pp. 2, 7 and 8, and personal interview with M. L. Holloway, East Lansing, Michigan, June 5, 1979.

15. Personal interview with an EIA official, Washington, D.C., June 13, 1979.

16. Memorandum, M. L. Holloway to National Advisory Board, March 19, 1980, TNEMP Records.

17. The following description of TENRAC and TEPP summarizes TENRAC, "Texas Energy Policy Project Work Plan," Austin, Texas, March 1980 (draft copy).

BIBLIOGRAPHY

Arrow, K. J. The Limits of Organization. New York: W. W. Norton, 1974.

Bartlett, R. Economic Foundations of Political Power. New York: The Free Press, 1973.

Bauer, R. A. "Detection and Anticipation of Impact: The Nature of the Task." Social Indicators. Edited by R. A. Bauer. Cambridge: Massachusetts Institute of Technology Press, 1966.

Bonnen, J. T. "Assessment of the Current Agricultural Data Base: An Information System Approach." A Survey of Agricultural Economics Literature. Vol. II. Edited by L. R. Martin and G. J. Judge, et al. St. Paul: American Agricultural Economics Association, University of Minnesota Press, 1977.

________, et al. "Improving the Federal Statistical System: Report of the President's Reorganization Project for the Federal Statistical System." Statistical Reporter 80-8:197-212, May 1980.

Boulding, K. E. The Image: Knowledge in Life and Society. Ann Arbor: University of Michigan Press, 1956.

________. "Science: Our Common Heritage." Science 207(4433):831-836, February 22, 1980.

Brewster, J. M.
A Philosopher Among Economists. Edited by J. P. Madden and D. E. Brewster. Philadelphia: J. T. Murphy, 1970.

Carnap, R. "Formal and Factual Science." Readings in the Philosophy of Science. Edited by H. Feigl and M. Brodbeck. New York: Appleton-Century-Crofts, 1953.

Cohen, K. J. and R. M. Cyert. Theory of the Firm: Resource Allocation in a Market Economy. 2nd ed. Englewood Cliffs, N.J.: Prentice-Hall, 1975.

Department of Energy Organization Act. Public Law 95-91. 91 STAT 565. August 4, 1977.

Dewey, J. "The Continuation of Ends-Means." Ethical Theories. Edited by A. T. Melden. New York: Prentice-Hall, 1950.

________. Logic: The Theory of Inquiry. New York: Henry Holt and Company, 1938.

Energy Conservation and Production Act. Public Law 94-385, amending Public Law 93-275. 90 STAT 1125. August 14, 1976.

Energy Information Administration. Annual Report to Congress 1978. Vol. 1. Washington, D.C.: Government Printing Office, April 1979.

________. Office of Planning and Evaluation. "Energy Applied Analysis Program Plan, Fiscal Years 1978-81, FY '78 Version," Washington, D.C., August 1978. (Mimeographed.)

________. Models of the Energy Information Administration. Washington, D.C.: Energy Information Administration, May 1978.

Energy Supply and Environmental Coordination Act. Public Law 93-319. 88 STAT 246. June 22, 1974.

Executive Office of the President. Energy Policy and Planning. The National Energy Plan. Washington, D.C.: Government Printing Office, April 1977.

Federal Energy Administration. National Energy Outlook. Washington, D.C.: Government Printing Office, November 1976.

________. Project Independence Report. Washington, D.C.: Government Printing Office, November 1974.

Federal Energy Administration Act. Public Law 93-275. 88 STAT 96. May 7, 1974.

Federal Energy Office. Office of the Assistant Administrator for Economic and Data Analysis and Strategic Planning. "United States Energy Self-Sufficiency: An Assessment of Technological Potential," Washington, D.C., February 6, 1974.
(Mimeographed.)

Feigl, H. "The Scientific Outlook: Naturalism and Humanism." Readings in the Philosophy of Science. Edited by H. Feigl and M. Brodbeck. New York: Appleton-Century-Crofts, 1953.

Friedman, M. Essays in Positive Economics. Chicago: University of Chicago Press, 1953.

Gahr, W. E. "Can Models Help Congress Make Decisions." Statement to Rockefeller Foundation Conference on Large Scale Systems Analysis, System Research Center, Cleveland, Ohio, January 7, 1977.

Greenberger, M., M. A. Crenson and B. L. Crissey. Models in the Policy Process. New York: Russell Sage Foundation, 1976.

Hardin, C. M. "The Bureau of Agricultural Economics Under Fire: A Study in Valuation Conflicts." Journal of Farm Economics XXVIII(3):635-668, August 1946.

Hatry, H. P., R. E. Winnie and D. M. Fisk. Practical Program Evaluation for State and Local Government Officials. Washington, D.C.: The Urban Institute, 1973.

Hogan, W. W. "Energy Modeling: Building Understanding for Better Use." Paper presented at the Second Lawrence Symposium on the Systems and Decision Sciences, Berkeley, California, October 3, 1978.

________. "Energy Policy Models for Project Independence." Computers and Operations Research 2:251-271, 1975.

________. "The Role of Models in Energy Information Activities." Paper presented at the Stanford Workshop on Energy Information, Palo Alto, California, December 15-16, 1977.

________, J. L. Sweeney and M. Wagner. "Energy Policy Models in the National Energy Outlook." Energy Policy. TIMS Studies in the Management Sciences, Vol. 10. Edited by J. S. Aronofsky, A. G. Rao and M. F. Shakun. New York: Elsevier, 1978.

Holloway, M. L., ed. Texas National Energy Modeling Project: An Experience in Large-Scale Model Transfer and Evaluation. New York: Academic Press, 1980.

________, ed. Texas National Energy Modeling Project: An Experience in Large-Scale Model Transfer and Evaluation. Part II. Austin: Texas Energy and Natural Resources Advisory Council, 1980.

House, E. R. "Justice in Evaluation."
Evaluation Studies Review Annual. Vol. 1. Edited by G. V. Glass. Beverly Hills, California: Sage, 1976.

Johnson, G. L. "Basis for Evaluation." The Overproduction Trap in U.S. Agriculture. Edited by G. L. Johnson and C. L. Quance. Baltimore: Resources for the Future, The Johns Hopkins University Press, 1972.

________. "Economics, Ethics, Food and Energy." The Second James C. Snyder Memorial Lecture in Agricultural Economics, Department of Agricultural Economics, Purdue University, West Lafayette, Indiana, March 22, 1976.

________. "General Systems Simulation Analyses (GSSA) of the Nigerian and Korean Agricultural Sectors and Related Efforts." Agricultural Change and Economic Method. Authored by the Transatlantic Committee on Agricultural Change. The Hague: European Review of Agricultural Economics, 1976.

________. "Philosophic Foundations: Problems, Knowledge and Solutions." Agricultural Change and Economic Method. Authored by the Transatlantic Committee on Agricultural Change. The Hague: European Review of Agricultural Economics, 1976.

________. "The Quest for Relevance in Agricultural Economics." American Journal of Agricultural Economics 53(5):738-739, December 1971.

________. Review of The Entropy Law and the Economic Process, by Nicholas Georgescu-Roegen. Journal of Economic Issues VII(3):492-499, September 1973.

________. "Value Problems in Farm Management." Journal of Agricultural Economics (English) XIV(1):13-31, June 1960.

________, and J. L. Brown. An Evaluation of the Normative and Prescriptive Content of the Department of Energy Mid-Term Energy Forecasting System (MEFS) and the Texas National Energy Modeling Project (TNEMP). Texas National Energy Modeling Project, Part III. Edited by M. L. Holloway. Austin: Texas Energy and Natural Resources Advisory Council, 1981 (forthcoming).

________, and L. K. Zerby. What Economists Do About Values: Case Studies of Their Answers to Questions They Don't Dare Ask.
East Lansing: Department of Agricultural Economics, Center for Rural Manpower and Public Affairs, Michigan State University, 1973.

Lady, G. M. "Model Assessment and Validation: Issues, Structure and Energy Information Administration Program Goals." Paper presented at the National Bureau of Standards Symposium on Model Validation and Assessment of Energy Models, Gaithersburg, Maryland, January 10-11, 1979.

Lewis, C. I. The Ground and Nature of the Right. New York: Columbia University Press, 1955.

"Lincoln Moses: DOE's Chief Energy Statistician." U.S. Department of Energy Energy Insider 1(16):5, May 15, 1978.

Machlup, F. "Positive and Normative Economics." Economic Means and Social Ends. Edited by R. Heilbroner. Englewood Cliffs, N.J.: Prentice-Hall, 1969.

McMullin, E. "The History and Philosophy of Science: A Taxonomy." Historical and Philosophical Perspectives of Science. Minnesota Studies in the Philosophy of Science, Vol. V. Edited by R. H. Stuewer. Minneapolis: University of Minnesota Press, 1970.

Mitroff, I. I. and L. R. Pondy. "On the Organization of Inquiry: A Comparison of Some Radically Different Approaches to Policy Analysis." Public Administration Review 34(5):471-479, September/October 1974.

Moore, G. E. Principia Ethica. Cambridge: Cambridge University Press, 1956 (originally published 1903).

Myrdal, G. An American Dilemma. New York: Harper and Brothers, 1944.

Northrop, F. S. C. The Logic of the Sciences and the Humanities. New York: World Publishing, 1959.

Parsons, K. H. "The Logical Foundations of Economic Research." Journal of Farm Economics XXXI(4):656-686, November 1949.

. "The Value Problem in Agricultural Policy." Agricultural Adjustment Problems in a Growing Economy. Edited by E. O. Heady, et al. Ames: Iowa State University Press, 1958.

Pearson, K. Grammar of Science. London: J. M. Dent and Sons, 1937 (originally published 1892).

Popper, K. R. The Logic of Scientific Discovery. New York: Harper and Row, 1959.

Professional Audit Review Team.
Activities of the Energy Information Administration. Washington, D.C.: Government Printing Office, May 1979.

. Activities of the Office of Energy Information and Analysis. Washington, D.C.: Government Printing Office, December 1977.

Rescher, N. Introduction to Value Theory. Englewood Cliffs, N.J.: Prentice-Hall, 1969.

Robbins, L. An Essay on the Nature and Significance of Economic Science. 2nd ed. London: Macmillan, 1952.

Roberts, M. J. "An Evolutionary and Institutional View of the Behavior of Public and Private Companies." American Economic Review LXV(2):415-427, May 1975.

Rudner, R. S. Philosophy of Social Sciences. Englewood Cliffs, N.J.: Prentice-Hall, 1966.

Runes, D. C., ed. Dictionary of Philosophy. 16th ed. New York: Philosophical Library, 1960.

Salter, L. A Critical Review of Research in Land Economics. St. Paul: University of Minnesota Press, 1948.

Samuels, W. J. Review of Economic Foundations of Political Power, by R. Bartlett. Journal of Economic Issues X(1):181-185, March 1976.

Schmid, A. A. "Analytical Institutional Economics: Challenging Problems in the Economics of Resources for a New Environment." American Journal of Agricultural Economics 54(5):893-901, December 1972.

. Property, Power and Public Choice: An Inquiry Into Law and Economics. New York: Praeger, 1978.

Scriven, M. "The Methodology of Evaluation." Evaluating Action Programs: Readings in Social Action and Evaluation. Edited by C. H. Weiss. Boston: Allyn and Bacon, 1972.

Shaffer, J. D. "On the Concept of Subsector Studies." Paper presented at the Technical Seminar on Subsector Modeling of Food and Agricultural Industries, Department of Agricultural Economics, University of Florida, March 30, 1970.

, and A. A. Schmid. "Community Economics: A Framework for Analysis of Community Economics Problems." 5th ed. Materials for PAM 201. Department of Agricultural Economics, Michigan State University. (Mimeographed.)

Simon, H. A. The Sciences of the Artificial.
Cambridge: Massachusetts Institute of Technology Press, 1969.

Solo, R. A. Economic Organizations and Social Systems. Indianapolis: Bobbs-Merrill, 1967.

. The Political Authority and the Market System. Cincinnati: South-Western Publishing, 1974.

Sower, C. and P. A. Miller. "The Changing Power Structure in Agriculture: An Analysis of Negative Versus Positive Organization Power." Our Changing Rural Society: Perspectives and Trends. Edited by J. H. Copp. Ames: Iowa State University Press, 1964.

Suchman, E. A. "Action for What? A Critique of Evaluative Research." Evaluating Action Programs: Readings in Social Action and Evaluation. Edited by C. H. Weiss. Boston: Allyn and Bacon, 1972.

Texas Energy and Natural Resources Advisory Council. "Texas Energy Policy Project Work Plan," Austin, Texas, March 1980. (Draft copy.)

Texas National Energy Modeling Project Records. Austin, Texas.

"Text of Speech by Carter on His Energy Program to a Joint Session of Congress." New York Times, April 21, 1977.

U.S. Congress. House. Department of Energy Organization Act. House Conference Report 539 To Accompany S. 826, 95th Congress, 1st session, 1977.

. House. Government Operations Committee. Federal Energy Administration. House Report 748 To Accompany H. R. 11793, 93rd Congress, 2nd session, 1974. U.S. Code Congressional and Administrative News, 1974, Vol. 2.

. House. Interstate and Foreign Commerce Committee. Federal Energy Administration Authorization and Extension. House Report 1113 To Accompany H. R. 12169, 94th Congress, 2nd session, 1976.

. Senate. Energy Conservation and Production Act. Senate Conference Report 1119 To Accompany H. R. 12169, 94th Congress, 2nd session, 1976.

. Senate. Federal Energy Administration Act of 1974. Senate Conference Report 788 To Accompany H. R. 11793, 93rd Congress, 2nd session, 1974. U.S. Code Congressional and Administrative News, 1974, Vol. 2.

. Senate.
Governmental Affairs Committee. Department of Energy Organization Act. Senate Report 164 To Accompany S. 826, 95th Congress, 1st session, 1977.

U.S. Department of Energy. National Energy Plan II. Washington, D.C.: Government Printing Office, May 1979.

U.S. Government Manual. Washington, D.C.: Office of the Federal Register, National Archives and Records Service, issues for 1974-1975, 1977-1978, 1977-1978 (Supplement) and 1978-1979.

U.S. President. Executive Order 11615. "Providing for Stabilization of Prices, Rents, Wages and Salaries." Federal Register XXXVI(159):15727-15729, August 17, 1971.

. Executive Order 11748. "Federal Energy Office." Federal Register XXXVIII(234):33575-33576, December 6, 1973.

Wallich, H. C. "The American Council of Economic Advisers and the German Sachverstaendigenrat, a Study in the Economics of Advice." The Quarterly Journal of Economics LXXXII(3):349-379, August 1968.