THESIS

This is to certify that the thesis entitled

LINEAR AND NON-LINEAR EDITING SYSTEMS: CHANGES ON THE CREATIVE AND TECHNICAL ASPECTS OF VIDEO POST-PRODUCTION

presented by Ana Delia Velazquez-Cruz has been accepted towards fulfillment of the requirements for the MA degree in Telecommunication.

Major professor

Date

MSU is an Affirmative Action/Equal Opportunity Institution

LIBRARY
Michigan State University

LINEAR AND NON-LINEAR EDITING SYSTEMS: CHANGES ON THE CREATIVE AND TECHNICAL ASPECTS OF VIDEO POST-PRODUCTION

By

Ana Delia Velazquez-Cruz

A THESIS

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

MASTER OF ARTS

Department of Telecommunication

1997

ABSTRACT

LINEAR AND NON-LINEAR EDITING SYSTEMS: CHANGES ON THE CREATIVE AND TECHNICAL ASPECTS OF VIDEO POST-PRODUCTION

By Ana Delia Velazquez-Cruz

This thesis explores the changes in video post-production when using non-linear technology. The research follows a descriptive and comparative methodology along with intensive interviews with editors who use non-linear editing systems.
The descriptive sections concentrate on both linear and non-linear editing systems. The description is divided into two major areas: the technical aspects (related to the machine) and the creative aspects (related to the editor). The comparison between systems emerges from the descriptive sections and is outlined in a table listing the technical and creative differences. To summarize the results of this research, the findings reveal non-linear systems to be a welcome technology that gives editors more creative freedom by reducing their involvement with technical issues. However, the essence of the editing process and the editor's creativity and ability to make creative decisions remain unchanged.

To the memory of my dear father, Armin, who taught that dreams do come true if we pursue them, and to my mother, Delia, who encouraged me to pursue this goal.

A la memoria de mi querido papá, Armín, quien me enseñó que los sueños se hacen realidad si trabajamos para ello, y a mi mamá, Delia, quien me ofreció todo su apoyo para lograr esta meta.

ACKNOWLEDGMENTS

The realization of a thesis is a major project that requires the involvement of many people along with the researcher. The first thanks have to go to the friend who gave me the strength to follow through even when I did not acknowledge his presence. Thank you, Jesus, for lending a hand. After the Lord, my family is the most important force in my life. Two wonderful girls, Cristy and Stephanie, who challenge me to be the best role model possible. A sister-in-law, Wandy, who is always interested. My dear brother Armin, always proud of the achievements of his little sister. Last but not least, my exceptional mom, Delia, who respected and supported this project since the acceptance letter from MSU arrived. Also part of the family is Sandeep, who helped me all the way from the first idea to the completion of this thesis. Thanks for your positive criticism and unconditional love and support.
I will never forget your help transcribing those long interviews. The academic support from my thesis advisor Robert Albers and committee members Dr. McCarty and Dr. Barbatsis was crucial for the realization of this thesis. Thank you.

TABLE OF CONTENTS

CHAPTER 1
INTRODUCTION
1.0 Introduction
1.1 Statement of the Problem
1.2 Significance of the Study
1.3 Methodology
1.3.1 Information Gathering
a. Secondary Sources
b. Primary Sources
1.3.2 Technical and Creative Aspects
a. Technical Aspects
b. Creative Aspects
1.3.3 Comparative Analysis
1.4 Limitations of the Study

CHAPTER 2
LINEAR EDITING SYSTEMS
2.0 Introduction
2.1 Technical Aspects: Videotape and Linear Editing
2.1.1 The First Videotape Recorder: Split and Splice Editing
2.1.2 Electronic Editing
2.1.3 Computer Assisted Editing
2.2 Creative Aspects: Post-Production and Linear Editing
2.2.1 Stages of Post-Production
a. Planning
b. Logging
c. Off-line Editing
d. On-line Editing
2.2.2 Editing Styles: Continuity and Complexity Editing
2.2.3 Editing Language: Elements and Techniques
a. Shots and Perspectives
b. Transitional Devices
c. Special Effects and Digital Video Effects
d. Graphics
e. Pace and Rhythm

CHAPTER 3
NON-LINEAR EDITING SYSTEMS
3.0 Introduction
3.1 Technical Aspects: Non-Linear Editing
3.1.1 Platforms
a. PC
b. Macintosh
c. Commodore's Amiga
d. Silicon Graphics Inc.
3.1.2 Open vs. Closed Architecture
3.1.3 Interface
3.1.4 Digitization
3.1.5 Digital Video Compression
a. Lossless vs. Lossy Compression
b. Intraframe vs. Interframe Compression
c. Compression Standards
d. Using Compression
3.1.6 Storage
a. Storage Management
3.1.7 Editing Tools
a. Transitions
c. Graphics
3.1.8 Audio
3.2 Creative Aspects: Post-Production and Non-Linear Editing
3.2.1 Stages of Post-Production Using Non-Linear Editing Systems
a. Logging
b. Media Organization
c. Digitization
d. Off-line/On-line Editing
e. Effects, Titles and Graphics
3.2.2 Editing Styles: Continuity and Complexity Editing
3.2.3 Editing Language
a. Pace and Rhythm
3.3 Editor's Opinions
3.3.1 Advantages
a. Flexibility
b. Random Access
c. Upgradeability
d. Comfort
3.3.2 Disadvantages

CHAPTER 4
NON-LINEAR EDITING SYSTEMS: THE EDITORS AND THE INDUSTRY
4.0 Introduction
4.1 Editor's Performance
4.1.1 Learning the Systems
4.2 The Post-Production Industry
4.2.1 Industry Changes
4.2.2 Revolutionary Tool
4.2.3 Productivity

CHAPTER 5
CONCLUSIONS
5.0 Introduction
5.1 Summary
5.2 Comparison: Differences Between Linear and Non-Linear Editing Systems
5.3 Conclusions
5.4 Recommendations

Appendix A
Appendix B
Appendix C
Appendix D
List of References

LIST OF FIGURES

CHAPTER 1
Figure 1.0 Linear and Non-Linear Editing

CHAPTER 2
Figure 2.0 Helical Scan
Figure 2.1 Minimum Equipment Configuration for an On-line Editing Suite
Figure 2.2 Input/Output Control Signal Path for a Typical Computer Edit System
Figure 2.3 Essential Area of the Screen

CHAPTER 3
Figure 3.0 Basic Non-Linear Equipment
Figure 3.1 Basic Non-Linear Systems Interface
Figure 3.2 Calculations to Determine the Minimum Data Transfer Rate

LIST OF TABLES

Table 1 Differences Between Linear and Non-Linear Editing Systems

ABBREVIATIONS

CU - Close Up
DVE - Digital Video Effects
EDL - Edit Decision List
EWS - Extreme Wide Shot
I/O - Input/Output
MS - Medium Shot
SMPTE - Society of Motion Picture & Television Engineers
VTR - Video Tape Recorder
WS - Wide Shot

CHAPTER 1

INTRODUCTION

1.0 INTRODUCTION

Computers have revolutionized many industries by simplifying complex and repetitive tasks. Accounting has never been the same since the introduction of Lotus 1-2-3. Desktop publishing has made publishing accessible to small groups and has reduced printing and designing costs. Video production has not been immune to the influence of computers. Computers have been part of video production for quite some time. Computerized programs have helped to control lighting boards, generate audio and video effects and manipulate their signals. In editing suites, computers have facilitated the editing process for years. Their latest generation, non-linear editing systems, has made it possible to perform on a computer the tasks of every piece of equipment in the editing suite. The constant development of faster processors and the cost reductions of hardware and software have encouraged the application of computer technologies to the video industry. Computer technologies have provided affordable video processing equipment to non-broadcast video users.
Nowadays, video is used as a communication tool for training, business and education, in government and in medical fields, among many others. When video was exclusive to the broadcast industry it was out of reach for these users because the costs were prohibitive. Computers and non-linear editing systems have helped to change all that. The diversification of video users outside broadcasting has also encouraged the development of new technologies. Manufacturers of computer equipment have realized that there is a digital video market to explore. On the other hand, manufacturers of traditional video equipment have been forced to compete with these newcomers for a market that previously was exclusively theirs. The result has been the introduction of new digital technologies to the video production industries, particularly the non-linear editing systems.

Non-linear editing systems are computer-based. These computer systems perform off-line and on-line editing tasks, eliminating the need for traditional video equipment other than the videotape recorder. Video, audio and graphic source materials are digitized into the computer where all the editing is done. The computer contains the source materials as well as the editing tools. The quality of these systems ranges from broadcast quality to consumer level. The systems provide accurate and immediate random access to any material. However, their most relevant contribution has been transforming editing from a linear into a non-linear process that allows the editor to work in any desired order without affecting the editing sequences that follow (see Figure 1.0).
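The contrast illustrated in Figure 1.0 can be sketched as a minimal data model. This is an illustrative sketch only, not any vendor's actual implementation; the function names and the list representation are assumptions made for this example. The idea is that a non-linear program is an ordered list of clip references (in effect an edit decision list), so inserting a take is a simple list operation that leaves the takes that follow untouched, whereas on tape every take after the change point must be laid down again.

```python
# Hypothetical sketch: a program modeled as an ordered list of takes.
# Function names are illustrative, not taken from any editing system.

def linear_insert(tape, position, new_take):
    """On videotape, inserting a take mid-program invalidates the recording
    after that point: every later take must be re-recorded in order."""
    result = tape[:position] + [new_take]
    for take in tape[position:]:
        result.append(take + " (re-edited)")
    return result

def nonlinear_insert(sequence, position, new_take):
    """In a non-linear system the program is a list of clip references;
    insertion simply shifts the references that follow."""
    return sequence[:position] + [new_take] + sequence[position:]

program = ["Take 1", "Take 2", "Take 3"]
print(linear_insert(program, 2, "Take 4"))
# ['Take 1', 'Take 2', 'Take 4', 'Take 3 (re-edited)']
print(nonlinear_insert(program, 2, "Take 4"))
# ['Take 1', 'Take 2', 'Take 4', 'Take 3']
```

In the linear case Take 3 had to be re-recorded to follow the new Take 4; in the non-linear case it simply moved back one position, which is exactly the difference Figure 1.0 depicts.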
[Figure 1.0: Linear and Non-Linear Editing. Step 1 shows an edit decision; Step 2 contrasts the two processes: in linear editing, editing in Take 4 requires re-editing Take 3, which follows it, while in non-linear editing Take 3 simply moves back in the sequence. Source: prepared by the author.]

Non-linear editing technology has been in the market for some years but it is still being improved. Many editors, producers and directors are using it now as their main post-production unit. Since this technology has started to take root in the video industry there are important questions to answer:

• How do non-linear systems work?
• What changes do they bring to the traditional ways of editing?
• What are the advantages and disadvantages, if any?
• What has been the user's experience?
• Are non-linear editing systems another technological improvement or the first step towards a new generation of editing systems?

1.1 STATEMENT OF THE PROBLEM

This thesis describes and compares linear and non-linear editing systems. It explores and explains the most significant changes made to the creative and technical aspects of editing when using non-linear editing systems.

1.2 SIGNIFICANCE OF THE STUDY

A few years ago, digital editing technologies were a privilege of major post-production houses. Powerful and expensive computers were used to create complex post-production effects for commercials and films. Today, these technologies have filtered down to less sophisticated users. The technologies have been simplified to reduce costs and to perform less complex tasks than the special effects of Jurassic Park. Non-linear editing systems are probably one of the most revolutionary of those digital editing technologies. Supporters of the systems claim that non-linear editing systems simplify many technical aspects.
If the use of these systems can save time and consequently save money, productivity levels may be increased, which in turn may have important consequences for the post-production industry by helping the development of a new breed of video production professionals, more powerful and independent from the traditional post-production houses. But most important is the fact that it frees the editor's creativity from the structured linear ways of video editing. The relatively new non-linear editing systems and the diversified group of video users may be a force of change to the traditional ways of editing. These two elements cannot be ignored because their growth may cause a gradual but steady trend towards new perspectives on editing and, consequently, on video production. The number of non-broadcast video users is increasing, with a projected total of 83,000 users by 1995 (see Appendix A). The production budget for these producers was expected to reach $12.6 billion by 1995. These video users are a group powerful enough to establish trends for the use of new technologies and for the video production process.

1.3 METHODOLOGY

The methodology of this research follows two major approaches, one descriptive and the other comparative. These were the best options to work with information that was exclusively qualitative. The first part of the research uses the descriptive approach to clarify and define linear and non-linear systems. The second part uses these findings as the grounds upon which to compare and contrast linear and non-linear systems. In the first part, an extensive description defines the creative and technical aspects of both editing systems, linear and non-linear. This description establishes the framework for the research by presenting the development of both systems, their parts and properties. Chapters 2 and 3 are dedicated to linear and non-linear editing systems respectively.
To explore in depth the creative and technical aspects of the non-linear editing process, a series of intensive interviews was conducted with editors who use the systems. Chapter 3 includes the personal experiences and insights of non-linear systems editors. Chapter 4 explains two areas that came up during the research even though they were not directly related: the editor's performance and the post-production industry. These three chapters establish the base for the comparison between both systems. The concluding part of the research uses the comparative approach. Chapter 5 identifies the relations, similarities and differences between the linear and non-linear systems. It begins by exploring the changes in the creative and technical aspects of editing from one system to the other. It outlines the advantages and disadvantages of non-linear systems. Finally, the conclusions and recommendations summarize the most relevant findings of the research, present the conclusions and propose further research directions.

1.3.1 Information Gathering

For this research it was important to explore multiple and diverse sources because there was an overall lack of sources dedicated to editing. Since non-linear technology is relatively new, the information available was very limited. Most of the literature available is limited to technical specifications or user reports. The emphasis is on the technological aspects rather than on the technology's use as a creative tool and its impact on the editing process. During the research, both secondary and primary sources were consulted. The secondary sources were used mostly during the first, descriptive part. They set the foundation on which this whole research has been developed. Primary sources were intended to provide new insights about the creative and technical aspects of non-linear editing.
a. Secondary Sources

The secondary source materials consist mostly of information obtained through library research. These sources include books, magazines, newspaper articles, and special reports on the topic. Most of the information obtained through the library research was related to linear editing systems. The information related to non-linear systems provided a wealth of detail about the technical aspects. However, the discussion of the creative aspects was limited to the systems' editing features and the type of projects being edited on them. Other secondary sources included the information packages and videos of various non-linear editing systems supplied by the vendors. These latter sources were reviewed keeping in mind that they are designed to attract and convince possible buyers.

b. Primary Sources

The primary source materials were obtained from intensive structured interviews (2 to 3 hours) with five editors who use non-linear editing systems. These interviews investigated the post-production experiences of the editors while using non-linear systems. The questionnaire (see Appendix B) was designed to obtain information that would provide a better understanding of the process of non-linear editing and to answer the questions that remained unanswered after completing the review of the secondary sources.

The recruitment of the editors interviewed was divided into two stages. First, four main manufacturers of non-linear systems were contacted. They were asked for a list of their clients in the Michigan area. Avid Technology responded with a list of 30 facilities. The other manufacturers were unable to respond within the time limits imposed by this research. Given that the research topic was focused on non-linear technology and given the exploratory nature of the research, the researcher decided to interview only Avid users.
This decision did not affect the research because even though there is a wide variety of non-linear editing systems, they are all based on the same principles. The creative and technical aspects of the technology remain basically unchanged from one manufacturer's product to another. From the list provided by Avid Technology a subset of facilities using Avid 4000 and 8000 models was selected. These models process real-time video (30 frames/60 fields per second) suitable for broadcasting. This reduced the list to 15 facilities. During the second stage the directors of those facilities were contacted by phone or fax to explain the purpose of the research and to request an interview. Eight facilities agreed to the interviews, but only five were able to schedule within the research time limits. The editor of one of those facilities twice failed to attend scheduled interviews without prior notification. Unable to re-schedule, the facility was dropped from the research.

The four facilities where editors were interviewed were representative of four different types of users. There are two post-production facilities; one offers off-line services, the other on-line. The other two facilities are an academic institution and a business corporation. The backgrounds of the editors are as diversified as the facilities at which they work. The following are the profiles of the editors interviewed.

Gregg Jackson, Consultant

Mr. Jackson is a certified trainer and consultant for Avid systems. He has been a video editor for over 21 years and has been using non-linear systems for 4 years. He is also editor and consultant for Whirlpool Corporation. At Whirlpool he uses an Avid 8000 to edit a variety of video projects. The projects range from video loops for trade shows to a 28-part training series. They produce videos for sales, marketing, product launches and for corporate communications. The video masters are recorded to 1-inch tape and copied for distribution on VHS videotape. They rarely use outside post-production services, except for special graphics and effects or when their Avid is completely booked.

Flip Mulliner, Postworks

Mr. Mulliner has been an editor for 11 years on traditional linear on-line systems and has specialized in non-linear editing for the last 3 years. He is one of the owners and editors of Postworks, a post-production facility. He uses an Avid 8000 and the company is acquiring its second system. Originally they used the system as an off-line solution to substitute for a 3/4-inch system. But now they are using it as an on-line system. At Postworks most of their projects are industrial videos for training and corporate communications recorded on Betacam or film. They also do commercials for local broadcasters and corporations. They provide all the post-production services except audio "sweetening" and complex 3D animation.

Steve Koster, Calvin College

Mr. Koster is the video productions coordinator for Calvin College. He does all the video work for the college assisted by students. Mr. Koster has been involved in a wide range of video projects, from art documentaries, to training videos, to public service programming for Calvin College. The system Mr. Koster uses is an Avid Media Suite Pro, which processes real-time video. They have had the system for approximately three months, which gives a fresh perspective on the transitional process from linear systems to non-linear.

Jim Monigold and Eric Luttermoser, Harvey's Place

Mr. Monigold has been an editor for 12 years and is a senior editor specialized in film at Harvey's Place. He is a graduate of the Communications Department at Western Michigan University. Mr. Luttermoser holds a degree in Scientific and Technical Communications from Michigan Tech University. He is also an experienced film editor. At Harvey's Place they have five Avid systems and three different models, including the 8000.
Their facility has evolved along with the technology, going from film editing tables, to 3/4-inch tape-based CMX, to laserdisc-based systems and finally to non-linear systems. Harvey's Place is an off-line facility, even though recently they have started finishing projects on the Avid. Most of their off-line work is for broadcast commercials, but they also do longer-format pieces and corporate videos.

1.3.2 Technical and Creative Aspects

Consistently, throughout this research there are two clearly identifiable areas or aspects of the editing process. When an editor receives a set of tapes with raw footage to turn into a finished video, someone is expecting the editor to master the creative as well as the technical aspects of editing. A good editor masters the technology, as an artisan knows the trade tools inside out, their capabilities and limitations. It is the editor's responsibility to make good quality, clean edits. But a good editor also masters the language of editing. A good cut is not only clean but conveys the intended meaning. The success of the finished video depends on the quality of the edits as well as on its ability to reach its communication goals.

a. Technical Aspects

The editing process requires technical knowledge and understanding of the medium, in this case video and editing equipment. The editor must understand the signal processing in the system, analog or digital, as well as the procedures and options of the equipment. The technical quality of the signal on the edited master is as important as its content. The signal must be flawless, clear and pure. All the signal components must be in sync and in phase, and the signal-to-noise ratio, black level and color information must be acceptable. To ensure adequate signal quality, editors must fix problems that come directly from the raw footage, such as drop-outs and video out of sync.
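The checks just listed can be summarized in a simple quality-control sketch. The numeric tolerances below (NTSC black-level setup at 7.5 IRE, a 0.5 IRE tolerance, a 45 dB signal-to-noise floor) are illustrative assumptions for this sketch only, not figures taken from this thesis or from any particular edit suite.

```python
# Hedged sketch of the editor's signal checks; thresholds are assumptions.

def signal_problems(black_level_ire, snr_db, in_sync, in_phase):
    """Return a list of problems an editor would need to fix before mastering."""
    problems = []
    if not in_sync:
        problems.append("video and audio out of sync")
    if not in_phase:
        problems.append("chroma phase error")
    if abs(black_level_ire - 7.5) > 0.5:  # assumed NTSC setup tolerance
        problems.append(f"black level off nominal: {black_level_ire} IRE")
    if snr_db < 45:  # assumed acceptable signal-to-noise floor
        problems.append(f"signal-to-noise ratio too low: {snr_db} dB")
    return problems

print(signal_problems(7.5, 52, True, True))   # clean signal: []
print(signal_problems(10.0, 40, True, False))  # three faults reported
```

The point of the sketch is only that each requirement named in the text (sync, phase, black level, signal-to-noise ratio) is a concrete, checkable condition the editor is responsible for.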
Any problem or fault with the signal that was not intended will distract the viewer and call attention to itself. Along with the technical quality of the video signal, the editor must understand the operation and manipulation of the system. The editor should know the order and steps that his/her particular system imposes on the process of editing. The editor should master the editing tools. Technical aspects are related to the technology, the systems and their components. They refer to how the medium and the technology work. Owing to continuous technological developments, editors need to update their mastery of the technical aspects constantly. Even though for clarity purposes there is a separation between creative and technical aspects, in most real situations they overlap.

b. Creative Aspects

The creative aspects of editing are more than just collecting pieces of footage and editing them into a sequence. As Herbert Zettl (1990) explains, editing consists of selecting and sequencing those parts of an event that contribute most effectively to its clarification and intensification. The creative aspects require mastering the editing techniques to make the communication effective. The intended viewers, the content, the reason to communicate, how the message is being communicated and the creator's personal style are part of the context in which the creative process takes place. Based on this context, the projects take shape and the decisions are made: what to include, what not to include, or how to put it together. At times these artistic decisions will be made almost instantaneously. But occasionally it takes much more time to decide what is the best technique to communicate an event in a clear and intensified manner. Thus, editing is the art of communicating an effective message shaped by the editing techniques.
To make the right creative decisions, editors also need a fundamental working knowledge of camera shots, camera perspectives and the principles of pacing and continuity. For the purpose of this research the creative aspects refer to those elements related to the decision-making process through which the editor constructs and shapes the message. Creative aspects are related to the editor, his/her creativity and communication skills. Changes affecting the creative aspects of editing may have a significant effect on the video production process. Since editing is the last stage of video production, the design of the previous stages depends on how this last stage is going to be performed. Any significant change may affect the whole production process and, more important, the final video product.

1.3.3 Comparative Analysis

Before entering the stage of comparison the information had to be managed and organized. The information gathered through secondary sources about the linear post-production process was organized and classified in two categories: creative and technical aspects. Then, the information gathered about non-linear systems was organized following the same scheme. This method provided the basis for a preliminary comparison between the systems. The results of this preliminary comparison helped to clarify the relationship between the systems and to draw out their similarities and differences regarding the editing process. This information was used to assist in the design of the questionnaire for the interviews. The design of the questionnaire paid special attention to those areas where there were contradictions or a lack of information about the non-linear systems. The information gathered through the interviews was transcribed and organized by topics following the scheme used for the linear systems (technical and creative aspects). The responses were evaluated based on content and not by the questions they were answering.
Once the information was organized, it was possible to use the information gathered from the interviews to corroborate or refute the information obtained through the literature review. Most important, this information was intended to bring out new insights, ideas and perspectives about the use of non-linear editing systems.

1.4 LIMITATIONS OF THE STUDY

A set of limitations has been imposed on this research to make it manageable. Time, monetary and geographical constraints have defined the following limitations.

• This research is concerned only with video signals, either analog or digital. It does not include the editing procedures for film. Even though some of the work done in the non-linear systems is actually shot on film, it is transferred to videotape before importing it into the system.

• This research does not intend to be a technical report for engineers. The review of editing equipment and devices is limited to the technical information required by editors, producers and directors to use these tools as creative instruments.

• This research assumes the reader has at least a basic knowledge of the procedures and techniques of video editing and production.

• This research is not a comparison of the systems offered on the market. It refers to the capabilities and requirements of non-linear editing systems in general.

• Due to geographical and time limitations, the editors interviewed are all users of the Avid systems in the state of Michigan.

• Through the literature research, a wealth of information has been found regarding the technical aspects of non-linear editing systems. However, a minimal number of sources mention or discuss the creative aspects of the editing process using non-linear systems.

CHAPTER 2

LINEAR EDITING SYSTEMS

2.0 INTRODUCTION

This chapter reviews relevant literature to set the foundation of this research. It is subdivided into two major areas. Section 2.1 discusses the history of the development of videotape and editing systems.
Section 2.2 presents an overview of the traditional post-production process. It explains the four stages of post-production: planning, logging, off-line and on-line editing. Then, it describes the techniques and devices editors use to transform raw footage into a finished video program.

2.1 TECHNICAL ASPECTS: VIDEOTAPE AND LINEAR EDITING

The history of video editing is closely related to the development of videotape. During the first years of television there were no recording devices native to this new electronic medium. However, there were people editing for television on film. They adopted the tools and techniques from the film industry and at the same time started developing the creative framework for video editing. With the creative framework under development, the introduction of the first recording devices using videotape marked the transition to a new era in television production and for the post-production industry.

2.1.1 The First Videotape Recorder: Split and Splice Editing

The early days of television resembled theater production because video recording methods were not available. After rehearsal, the shows were broadcast live. Whether a show was good or full of mistakes, it was always lost after the broadcast. The only recording options available to producers and directors depended on film. Some programs were recorded and edited on film, with elevated costs and large time delays. The other alternative was a process called kinescope (Wurtzel, 1989). The shows were recorded live on 16 mm film from a video monitor. Then, the film had to be developed and sometimes edited. This process involved serious compromises of image quality, time and cost. In the search for a better recording solution, Ampex Corporation organized a team of engineers, which included Charles Anderson and Ray Dolby (Anderson, 1988).
In 1956, they introduced the first 2-inch broadcast videotape recorder (VTR) at the annual convention of the National Association of Radio and Television Broadcasters. The television industry welcomed the new technology. It allowed stations to record "live on tape" programming from the networks for later distribution and/or broadcasting. However, videotape was used only as a storage medium (Wurtzel, 1989). Producers and engineers soon envisioned new uses for the VTR machines. They planned to use them for production "as soon as someone figured out how to edit the videotape" (Anderson, 1988).

The only form of video editing known at the time was adopted from film and audio: slice and splice. The first video editing block was introduced by Ampex two years later, in 1958 (Anderson, 1988). The editing process was totally mechanical. The edit points were marked without seeing the images on the videotape, using the frame pulses as reference. The editor applied an iron particle solution over the splice area to "develop" the video tracks. Then, using a microscope, the editor was able to see the video tracks and the frame pulses, which indicated the start of each video frame. With a precision blade the editor made the necessary cuts on the videotape. Using an ultra-thin metallic adhesive tape, the editor spliced the two sections, making sure there was no gap between the two pieces of videotape. This procedure was modified by other organizations involved in the television industry. They developed their own versions of the slice-and-splice process to facilitate editing. Even though this was a cumbersome and time-consuming process, it remained the only technique to edit videotapes until the 1960s, when electronic editing devices were introduced.

2.1.2 Electronic Editing

The beginnings of basic electronic editing date back to the early 1960s. During those years electronic editing was only a transfer process.
A "source" VTR was used to play back the original footage and an "edit" VTR recorded the scenes in the desired order. The process was totally manual. The accuracy of the edits depended on the ability of the editor to press the record button at the precise moment. The innovation consisted of a smooth transition between cuts. It eliminated distortion and break-up in the videotape signal. Electronic editing had two important advantages. First, the elimination of slice and splice editing reduced videotape damage due to physical handling. Second, the original source tapes were preserved and the footage remained available for future use. Editors used various techniques trying to improve the accuracy of the edits. They made reference marks on the reverse of the tape with grease pencils to have a better idea of the location of the edit point. They calculated the time needed for both machines to reach proper speed and used the marks as a reference to perform the edits. However, editing still lacked precision.

The next relevant change came with the introduction of time code editing. In 1967, Electronic Engineering of America (EECO) introduced the EECO Time Code System. The concept was based on the idea of Dick Hill, a TV engineer. He proposed to mark each frame with a distinctive code, similar to the numbers on a film strip. The time code was displayed in hours, minutes, seconds and frames. Each video frame was "addressed" with its own unique code. A device tracked and displayed video frame numbers at the standard rate of 30 frames per second. The edit systems featured thumb-wheel counters for entering the edit in and out points. Time code simplified the retrieval of scenes from the tapes and permitted editing with reliable frame accuracy. Editing became an exact process. Many companies developed their own time code systems, none of which were compatible with each other.
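The frame-addressing idea behind time code can be illustrated with a short conversion sketch. This is a minimal example, assuming non-drop-frame code at the nominal rate of 30 frames per second described above; the function names are invented for the illustration:

```python
def timecode_to_frames(tc: str, fps: int = 30) -> int:
    """Convert an HH:MM:SS:FF time code string to an absolute frame count."""
    hours, minutes, seconds, frames = (int(part) for part in tc.split(":"))
    return ((hours * 60 + minutes) * 60 + seconds) * fps + frames

def frames_to_timecode(total: int, fps: int = 30) -> str:
    """Convert an absolute frame count back to an HH:MM:SS:FF address."""
    frames = total % fps
    seconds = (total // fps) % 60
    minutes = (total // (fps * 60)) % 60
    hours = total // (fps * 3600)
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}:{frames:02d}"

# Each frame gets a unique address, so edit points can be located exactly.
print(timecode_to_frames("01:00:10:15"))  # 108315
print(frames_to_timecode(108315))         # 01:00:10:15
```

Because every frame maps to a unique integer, two addresses can be subtracted to get an exact edit duration, which is what made frame-accurate editing possible.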
Over a four-year period the Society of Motion Picture and Television Engineers (SMPTE) held meetings with manufacturers to establish a standard. In 1973, they agreed on the standard used today, SMPTE time code. The format was also adopted by the European Broadcasting Union (EBU), making it an international standard.

By the mid-1970s the videotape itself was also changing (Wurtzel, 1989). A smaller format of 1-inch videotape was introduced. Editors needed lighter and smaller recording equipment, easier to manipulate, that would facilitate the off-line process and would be cost effective. To accommodate all the video information on a tape of smaller width, the engineers developed the helical scan. The slanted video tracks were read by one large rotating head in a spiral-like configuration (see Figure 2.0). This tape recording format provided superior picture and sound quality at a much lower cost. In addition, the helical VTRs offered features such as slow motion, fast motion, freeze-frame, and expanded editing capabilities.

Figure 2.0 Helical Scan (slanted video tracks read by a rotating head). Source: prepared by the author.

During the 1980s helical VTRs became the new industry standard. The systems offered longer recording time, multiple audio tracks, the enclosing of the open reel into cassette shells and the ability to produce broadcast-quality recordings on ever smaller tapes. A wide range of editing systems was developed for formats like 1-inch and 1/2-inch open reels, and for 3/4-inch and 1/2-inch cassettes. Producers and directors were given the option to choose the most appropriate format depending on the quality needed for their final master. The smaller tape formats meant less expensive and more portable equipment but also a decrease in quality. However, recent developments such as digital and component signal recording have made the quality of small and large formats equivalent.
2.1.3 Computer Assisted Editing

Computers came to the editing suite as off-line systems. In a joint venture, CBS and Memorex Corp. developed a system to create edit decision lists (EDL). The CMX-600 used multiple disk drives to provide random access to any information on the disks (Anderson, 1988). The editor chose the time-coded edit points for the whole project and stored them in the computer. The output of the CMX-600 was a computer-readable punch tape which had to be fed into an on-line system to perform the edits. The on-line system was called CMX EDIPRO 300. Its introduction was a success because of the frame-accurate control of the source and record VTRs. Furthermore, using an EDL in the form of a punch tape, the systems performed frame-accurate edits and special effects transitions. The introduction of computer assisted editing systems combined the computer's large memory and digital accuracy with the video production equipment. It permitted the same sophisticated and precise editing on videotape which previously had been available only to film. Computer assisted systems went as far as controlling the editing equipment in the post-production suite, but they were unable to store, edit and manipulate footage without most of the traditional black boxes and the videotape.

2.2 CREATIVE ASPECTS: POST-PRODUCTION AND LINEAR EDITING

Post-production involves all the activities necessary to transform raw video footage into a finished program. In most professional productions, those activities include:

• make original footage dubs
• log raw footage
• determine edit points and transitions
• generate a workprint
• review and modify the workprint
• generate an EDL
• assemble the final master
• add titles and graphics
• and finally, add sound effects and music.

Editing is the most important activity; it is at the heart of post-production. Technically, editing is only one step in the post-production process.
It is the step in which the selected shots, graphics and effects are electronically or digitally pieced together to create a workprint or a finished program. All the other post-production steps are a preparation for editing. The amount of post-production and editing varies greatly from one production to another. On a live-on-tape show, there is little or no need for post-production, except when some editing may be required to trim the length of the program to make it fit in a time slot, or to correct mistakes by eliminating or adding segments. Other programs require more than just trimming. The simplest way of editing is to combine segments into a sequence to form a complete program. Programs such as documentaries require a more complex editing process. In a documentary the most effective shots and transitions are chosen to create the scripted sequences that will finally become the program. This type of post-production builds the program piece by piece in the editing suite, giving absolute creative control to the producer, director and editor.

2.2.1 Stages of Post-Production

As with any other process, post-production consists of various stages. The first stage involves the careful planning and designing of the activities necessary to produce a finished video product. The second stage consists of logging the materials needed for editing, and their preparation and organization. Off-line and on-line editing are the two final stages. The off-line is basically the decision-making process through which the program's concept is developed. The on-line is the artistic execution of the edit decisions.

a. Planning

The planning of a good post-production begins during pre-production. During these early stages the producer and/or director decides the type of post-production needed to complete the program. Is it going to be live on tape? Are only the errors going to be corrected? Or is the whole project going to be assembled in post-production?
These decisions determine the most appropriate style to shoot and edit the program. The production of a program that will be assembled during post-production requires careful and organized shooting for editing. A good videographer records extra seconds of footage at the beginning and end of each scene to maintain the continuity and flow of the action. The editor needs the flexibility of that extra footage to select the most appropriate edit points. Cutaways facilitate the transitions by covering and smoothing errors in continuity. An accurate shot sheet maintains a log of every segment that has been recorded and of what still needs to be recorded. At the end of each day, the footage should be reviewed. These daily screenings verify the quality and suitability of the material recorded. In addition, the screening provides an opportunity to identify problems in the video and/or audio that may affect the editing process. Often it is easier to correct mistakes by recording a scene again rather than trying to fix it in post-production. Post-production is a time to exercise creativity. It is not the time to "fix" anything (Schubin, 1994).

As the production process unfolds, the post-production design takes shape. The design should specify the exact procedures: editing equipment to be used, video format transfers, time code considerations, transitions, graphics and audio specifications. The most important element of the whole post-production process is the team of professionals. Every person on the team needs to have a clear understanding of his/her role in the process. The pressure of deadlines adds additional tension to a strenuous job. The better the working environment, the smoother the post-production will flow. The size of the post-production team varies with the complexity of the project, from only one editor to a large group of editors, assistants, audio and special effects specialists.

b. Logging

Every tape needs to be logged prior to editing.
Properly logged footage can make the difference between a highly efficient edit session, where source materials are properly labeled and easily accessed, and one during which valuable time (more accurately, money) is wasted locating takes and other materials (Groticelly, 1994). Logging procedures vary depending on the length of the project. Logging for short projects is usually done directly from the master tapes, eliminating the need to make dubs. It also can be done manually, with paper and pencil, on a shot sheet. Complex and lengthy projects require more elaborate logging. Because the work on tape before the final edit is rather extensive, dubs are made from the original tapes. This procedure prevents damage and accidents that may occur to the master tapes. Time code can be recorded on the tape while making the dubs, or it can be recorded in the field. Usually, the time code is "burned" into the dub for quick reference. The window with the numbers should be big enough for easy reading, without being obtrusive. The original tapes are frequently dubbed onto a smaller format such as 3/4- or 1/2-inch videotape (Wurtzel, 1989). These smaller-format tapes can be reviewed and logged in any place there is a playback deck, without the need for expensive higher-format equipment.

c. Off-line Editing

Before starting the off-line editing, each person involved must have a clear understanding of the story and the communication objective(s) of the project. The decision-making process during the off-line editing should be guided by those defined objective(s). This ensures that every decision made strengthens the project. Off-line editing refers to the use of an edit system to make edit decisions concerning scene placement and timing (Smith, 1991). The purpose of off-line editing is to create a workprint of the final project and to produce an EDL. It is a preparatory step to on-line editing, where the final program is created, including special effects, graphics and audio mixes.
The final concept of the project is developed during the off-line editing. It provides the opportunity to create and piece together a collection of shots into a program. The shots are reviewed and studied to decide the editing order of the sequences, their duration, the continuity and the pace of the program. There are a number of options available for off-line editing (Wurtzel, 1989). The simplest method is to edit on paper using the dubs of the original tapes. This only requires noting the time code numbers for each edit point. The edit log produced should include the in and out points for each edit. For longer and more complex projects it is often recommended to produce a workprint or rough-cut, in which the individual takes and segments are edited in order. At this stage the only transitions used are cuts. Usually, editors leave extra frames before and after the takes, which are trimmed to the exact length later. In other cases, the rough-cut determines the exact in and out edit points that will be used for the on-line. During the off-line, the proper timing, pace, rhythm, continuity, and overall program length are established.

For sophisticated productions the off-line can be done on a computer-assisted system. The computer helps to make the rough-cut edits by providing an interface that controls all the editing equipment in the suite. It also keeps all the edit decisions in its memory. The EDL can be accessed at any time. Most systems output the EDL as a paper print-out or save it on a floppy disk. This eliminates the need to keep a hand-written edit log. When the on-line system is computer-assisted and its protocol is compatible with the off-line system, the EDL can be fed into the computer to guide the automatic assembly of the on-line master tapes. When the protocols are not compatible, most systems allow the manual entry of the EDL.
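The edit decisions an off-line session produces can be thought of as records of source reel, in and out points, record position and transition type. The sketch below is a generic illustration of such a record, not the format of any real EDL system; all field names are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class EditDecision:
    event: int          # sequential event number in the list
    source_reel: str    # which original roll the footage lives on
    source_in: str      # time code where the shot starts on the source
    source_out: str     # time code where the shot ends on the source
    record_in: str      # where the shot lands on the master tape
    transition: str     # "cut", "dissolve", "wipe", ...

# A two-event list: the blueprint an on-line session would execute.
edl = [
    EditDecision(1, "REEL01", "01:00:10:00", "01:00:15:00", "00:00:00:00", "cut"),
    EditDecision(2, "REEL03", "03:12:00:00", "03:12:04:00", "00:00:05:00", "dissolve"),
]

for d in edl:
    print(f"{d.event:03d} {d.source_reel} {d.transition:8s} "
          f"{d.source_in} {d.source_out} -> {d.record_in}")
```

Because each record carries everything needed to locate and place one shot, a list of them is sufficient for automatic assembly, which is exactly the role the EDL plays between the off-line and on-line stages.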
Regardless of the particular system or process used, the end product of every off-line editing session is the same: a blueprint on paper, disk or videotape that can be evaluated by producers and other key project officials. The approved blueprint or EDL will be used to assemble the master tape during the on-line editing (Anderson, 1988).

d. On-line Editing

On-line editing is the most technically demanding phase of the video editing process. It always refers to the compiling of scenes using an integrated editing system (Smith, 1991). It produces the final master copy that is used for broadcasting or distribution. Transitions, character-generated materials and other graphics and special effects are also mixed into the program during the on-line sessions. All the off-line work and decision making are expected to be completed before the first on-line edit session begins. Even though some changes can be introduced at this stage, it is not the time to make creative decisions but to execute them. The materials required for an on-line session are:

a. all the original rolls
b. all graphics and other materials
c. an EDL which specifies the exact location of every scene on the original rolls, its length, the in and out edit points for every scene, their order of placement in the program and the type of transition between scenes
d. new tapes to record the master.

The on-line session can also be the most stressful phase of video editing (Anderson, 1989). As the final stage of production, it receives all the pressure from previous schedule and budget overruns. The best way to deal with these extra pressures is to be prepared. Improper preparation inevitably leads to higher costs and inferior product quality. Contrary to the off-line, where only cut edits are performed, the on-line editing suite requires more sophisticated equipment to edit, record and monitor a professional-quality video.
The minimum equipment in an on-line editing suite (see Figure 2.1) should include:

• editing controller
• at least two playback VTRs with time base correctors
• one record VTR capable of assemble and insert edits
• video switcher
• waveform monitor and vectorscope
• video processing amplifier
• color bars and sync generator
• audio mixing board with equalization capability
• audio and video monitors.

Besides the equipment, the editing suite must satisfy minimal ergonomic conditions. During recent years post-production facilities have tried to make the environment more comfortable for editors as well as for clients. Taking into consideration the long and stressful hours that editors and clients spend working in the editing suite, it should have sufficient space to allow comfortable movement for at least two persons, and provide adequate work space to keep scripts and notes. The lighting should be designed to hold down glare. The equipment should be arranged to minimize the editor's stretching and reaching.

Figure 2.1 Minimum Equipment Configuration for an On-line Editing Suite. Source: Video Editing and Post-production, Gary H. Anderson

In his book Video Editing and Post-production, Gary H. Anderson describes a sample on-line session. According to Anderson, the editor should begin by checking that the equipment and editing controls are in a normal configuration. Any additional equipment that is not normally found in the editing bay should be "patched" into the editing system. All the equipment should be tested and phased. Before starting to edit, the final step should be to check that the proper audio and video levels are being fed to the record VTR. Once all the equipment has been checked, the on-line session can begin. There are a number of techniques to enter the edit points into the editing system.
The selection of the most convenient technique depends on the off-line previously performed and the type of EDL being used. If an off-line has not been done, the edit points may be entered manually by selecting and marking in and out points while the tape is played in real time. Another way of editing without a previous off-line is to set the in and out points (using the control track numbers) into the edit controller, or by typing time code numbers into the editing system's computer. If an off-line was done and an EDL produced, either hand-logged or stored on a disk, there are two basic ways of using it. The EDL can be used as a reference to quickly identify the edit points and perform the edits, or it can be entered into the computer before the on-line session for automatic assembly.

On-line computer assisted editing systems combine a data processing computer with television production equipment (Wurtzel, 1989). The computer keeps track of all the information related to the edits, helping the editors concentrate on the creative aspects of the project. These systems can be used for off-line as well as on-line editing. The editor can control the playback and record VTRs, the switcher, audio equipment, and other devices plugged into the system, directly from the computer (Figure 2.2).

Figure 2.2 Input/Output Control Signal Path for a Typical Computer Editing System. Source: Video Editing and Post-production, Gary H. Anderson

As Wurtzel explains, the computer uses SMPTE time code to control the operation of all the equipment during editing and to store in its memory every edit decision as it is made by the editor. The edit decisions are used later for the automatic assembly of the master. The original rolls are loaded into the VTRs and the master tape into the record VTR. Then, the computer is instructed to use the EDL to assemble the master.
The computer controls all the devices while editing the master, only asking for the necessary rolls of original footage and audio sources. Frequently, an assistant director coordinates the master tape editing, since all creative decisions have already been made during off-line editing.

2.2.2 Editing Styles: Continuity and Complexity Editing

A successful post-production requires a clear understanding of video editing styles. Zettl (1990) defines two major editing styles: continuity and complexity editing. Continuity editing is primarily concerned with the clarification of an event. It intends to create the illusion that the story is told in a continuous, uninterrupted manner. Continuity should be maintained through a variety of elements: lighting, background, subjects, direction and motion of the action. The transitions should not be a noticeable distraction to the viewer. The action must flow logically across consecutive shots. Complexity editing intensifies an event. According to Zettl, "complexity editing communicates the inner relationships of an event and stresses the event's principal moments and their complex interdependencies." It intends to help viewers gain a deeper insight into the event and grasp its totality, made up of the various details. The montage is a classic example of complexity editing. In a montage, different images are combined to create a strong impression. The complexity lies in the juxtaposition and contrast among the images of a common event. A montage shifts attention from the general presentation to highlight a particular idea or feeling (Smith, 1991).

2.2.3 Editing Language: Elements and Techniques

The language of editing provides various elements and techniques to help the editor make informed creative decisions. The individual judgment of each editor and his/her decisions regarding the use of these techniques define the creative aspects of editing.

a.
Shots and perspectives

At the beginning of the post-production process, all the editor has is many hours of raw footage. Scenes are probably shot several times from different angles and perspectives. The selection of the shots depends on how the story or event is going to be told. The only guideline available, besides the editor's criteria, is the understanding of the conventional meanings of the different types of shots and angles. Anderson classifies shots by image size in four categories:

• Extreme wide shots (EWS) cover the widest possible view and are usually used as establishing shots. Due to the small screen size of the TV, no specific details are visible and their impact is minimal. Therefore, the use of these shots should be brief.

• Wide shots (WS) are also used as establishing shots. They provide an accurate perception of the location, as well as of the details within the shot. WSs may be used in the middle of a sequence to show changes in the location and its content.

• Medium shots (MS) are mostly used as transitions between the establishing shot and the close-up. They bring the viewers closer to the action without being intrusive. MSs are also used to follow characters' movement. The medium two-shot is probably the most used shot in TV. It is used to follow a conversation with the performers side by side or as an over-the-shoulder shot.

• It has been said that TV is the medium for close-ups (CU). CUs range from a shot from the chest up to a detail in the face of a character. Properly framed CUs can be very powerful. Traditionally, they are used within sequences to direct and focus attention on particular details. However, sometimes CUs are used at the beginning of the sequence to slowly reveal the location or to emphasize a detail within the scene. Also, they are used to magnify objects or emotions, as cutaways, or to present the point of view of a character focused on a detail.

The three shots most used in TV are the WS, MS and CU.
Usually, they are used in that order to guide the viewer through the sequence. But editors have creative license to use any combination of shots that is effective. The editor must consider that whenever two shots are edited together, their order and relation affect the viewer's perception of the event (Wurtzel, 1989). The juxtaposition of two shots immediately establishes a relationship between them, even if their content is totally dissimilar. This technique plays an important role in the creative process of editing. For example, different editors may create different sequences with different story lines using the same footage. Each editor establishes a particular order, shot length, and set of transitions to communicate their own perception of the event.

Camera perspectives define the position of the viewer in relation to the action. An objective perspective places the viewers in the position of an unseen observer. The talent never looks directly at the camera nor acknowledges its presence. The subjective perspective presents the events from the perspective of a character or a participant. The camera becomes the eyes of a character or narrator. Finally, the point-of-view perspective combines the two previously discussed perspectives. The action is not seen directly through the eyes of a participant but from alongside him/her, like an invisible observer within the sequence.

b. Transitional devices

Whenever two shots are edited together, there is a transition between them. The most important aspect of the transitional devices is the moment of change from one shot to another (Zettl, 1990). At the junctures where every shot begins and ends, a transition occurs that carries definite communication implications (Smith, 1991). The transition establishes a relationship between the shots. It also helps to develop the structure of the program by determining the basic pace of the sequences and ultimately the rhythm of the whole program.
Transitions can be classified into four major categories: cuts, dissolves, fades and transitional effects. The cut is an instantaneous change from one shot to another. It is the most common and least obtrusive transition, because it resembles the way the human eye changes from one visual field to another (Zettl, 1990). Technically, the cut as a transition does not exist, because it occupies neither time nor space. When used properly, cuts are the least obtrusive way of manipulating space, time and event density or rhythm. Cuts manipulate space by continuing the action from one shot to another, by revealing detail, by following or changing the event, and by changing the viewpoint. A cut can establish a change in time or indicate simultaneous events. Sequences of fast cuts increase the program's pace, while slower cutting creates a calmer rhythm. Cuts are often misused when the transition between shots is unintentionally abrupt and breaks the continuity. Cutaways are commonly used to cover such jump cuts. A cutaway will smooth the transition by presenting an image related to the shots. Another misuse of the cut, the scene-on-scene cut, occurs when consecutive shots are so similar that the transition loses its meaning, looking like a jump cut. Smith (1991) classifies cuts as soft when the shots have similar content, and as hard when they are completely unrelated. The convention is to vary the shots by cutting between dissimilar viewpoints.

Dissolves are softer transitions than cuts. One shot gradually blends into another, overlapping temporarily. The duration of the dissolve can be controlled to produce a long, soft transition or a fast dissolve, often called a soft-cut. When a dissolve is stopped midway it produces a superimposition, where both images overlap. A dissolve signifies an evident change of time, space, content and/or mood, while it establishes a relationship between shots even when they are not connected at all.
Dissolves reduce the rhythmic accents, creating a softer mood while preserving the emotional intensity (Zettl, 1990).

Fades are gradual transitions from an image to black or from black to an image. Technically a fade is a dissolve, but it always involves black. A fade-in implies a beginning and a fade-out implies an end. A quick combination of both, or cross-fade, signifies the end of a sequence and the start of a new one. As Smith suggests, the use of a fade represents a complete change of time, space, content or a combination of these elements.

The introduction of transitional effects has provided an ever increasing number of transition options. Originally the options were limited to wipes: shapes that moved across the screen taking one image off while bringing another on. Today, images can be frozen, shrunk, flipped, stretched, or manipulated in almost any thinkable manner. These transitions are direct and pronounced. Zettl prefers to call them interconnecting shots rather than transitions because they occupy space and carry a strong meaning. Instead of relating the images, these special effects separate them by placing the emphasis on the transition between shots (Smith, 1991). The more pronounced the transitional effect, the more obvious the separation between one shot and the next. It emphasizes a difference in space and/or time.

c. Special effects and digital video effects

Special effects and digital video effects (DVE) are image manipulations usually created during the on-line edit session. Whenever this technology becomes available, it expands the communication potential, the creative capability and the ability to visualize the impossible (Smith, 1991). Special effects devices have become overused tools to enhance video images, becoming a goal in themselves rather than serving the communication objectives. Producers and editors should keep in mind that any effect calls attention to itself. Therefore, effects should be used only as communication tools.
Their use is justified only if they clarify, stimulate and create meaning. Special effects can be generated electronically or digitally. Electronic effects are generated through the switcher. Zettl identifies four standard effects, including the wipe previously discussed:

• Superimposition or "super" is a form of double exposure in which one image is superimposed over another. The most common super is created electronically by performing a dissolve but stopping halfway.

• Key is an electronic cut-out of some portions or areas of an image, filling them with another image. It is mostly used to add titles.

• Chroma key uses chroma and luminance for keying. It uses a specific color, usually blue, as a background over which the keying occurs. The subject in front of the background is basically cut out while the background is substituted by another image. This process is commonly used during the weather segment of a newscast, when the reporter stands in front of the weather map.

Digital video effects are generated by devices that change the analog video signal into digital information (Zettl, 1984). The DVE unit grabs a video frame and digitizes it, to manipulate it, store it and retrieve it on command. These units expand production capabilities by offering multiple effects and allowing effect combinations to create new ones. The list of DVEs is virtually infinite. Zettl classified them in four categories according to the type of image manipulation.

• Multi-image effects combine multiple images on the screen, such as split images.

• Reshaping effects change the size and shape of the image by modifying the aspect ratio and compressing or expanding the image. These effects also create mirror images by flipping an image on its x- or y-axis.

• Some effects are designed to manipulate the texture of the image. The most common effects of this type are mosaic, posterization, and negative exposure.

• Effects can be animated by adding motion.
The major motion effects involve continuous change of image size and position, such as zooms, rotations and tumbles.

d. Graphics

Graphics usually include titles, credits, diagrams and charts that supplement the content (Smith, 1991). They may be generated digitally, electronically or prepared on title cards. To integrate them into the final master, graphics are keyed over the original footage and recorded during post-production.

Designing adequate graphics is a time consuming process. There are three main technical aspects to consider. First, the designer must consider the TV aspect ratio (Anderson, 1988). All graphics must fall inside the essential area, to avoid having information close to the edges cut off (see Figure 2.3). Second, the titles must be readable. The letters must be big and bold enough to be easily read from the screen. Finally, colors must be considered. Good graphics require a careful selection of colors. The use of color must be aesthetically acceptable and must help to separate the foreground from the background. The tones must follow the style of the program and at the same time help to clarify the design.

The communication aspects of the graphics should also be considered. The graphics design must correspond to the program. Graphics should present complementary information without causing distraction. Complex graphics may require extra time to understand and read. Smith suggests that their appearance should match the tone, texture, and sensibility of the program. Decisions related to font style, size, color, edges and background must be made accordingly. Another aspect to consider is how the graphics are going to be introduced. Electronically generated graphics offer a large number of possibilities. Apart from the traditional cut, dissolve, crawl and roll options, graphics can be moved along the x-, y- and z-axis, to be positioned anywhere on the screen. Charts, graphs and other illustrations can be used as separate stand-alone images.
[Figure 2.3 Essential Area of the Screen (approximately a 25% reduction from the full screen). Source: Television Production by Alan Wurtzel]

e. Pace and rhythm

Besides determining which shots to use and in what order, the ability to lengthen or shorten shots and sequences is one of the editor's most powerful tools. These manipulations alter the pace of a sequence and the rhythm of the whole program. Wurtzel (1989) defines pacing as the viewer's perception of the speed of the sequences that form the program. It is a psychological and emotional impression totally unconnected to objective time. Pacing is determined by the length of the shots, the frequency and duration of the transitions, and the content of the shots. However, when the natural pacing is altered, the changes should not damage the flow of the sequence (Smith, 1991). There is no exact formula for creating the adequate pace. The editor must rely on his/her judgment and sensitivity to the content.

The rhythm of the program is determined by the pace of individual sequences and how they relate to one another (Zettl, 1990). The sequences that form the program have their own pace, but when edited together they must flow smoothly from one to the next. The rhythm is determined by the sum and flow of the pace of the different sequences. Smith suggests the preparation of an emotional flow chart to help identify the dynamics of the program and establish the rhythm.

CHAPTER 3

NON-LINEAR EDITING SYSTEMS

3.0 INTRODUCTION

The previous chapter has established the foundation upon which the rest of this research is built. It identified two main areas of study, the technical and creative aspects of post-production. To facilitate the comparative analysis, this chapter follows the same format to investigate and explain non-linear editing systems.
The section dedicated to the technical aspects concentrates on the technical features of non-linear editing systems that are relevant to post-production. The development of these systems picks up where the linear systems left off in the previous chapter. This section is written from the editor, producer and director perspective. There are other technical aspects, but those are not so much part of the editor's work as of engineers and computer experts. The discussion of the creative aspects of non-linear post-production includes the procedures and techniques available to editors to transform raw footage into a video piece with a communication purpose. In the last section, the interviewed editors, as non-linear editing professionals, express their opinions on the systems and identify what they consider to be their advantages and disadvantages. This chapter intertwines information gathered from secondary sources with the responses from the interviews conducted with non-linear editors.

3.1 TECHNICAL ASPECTS: NON-LINEAR EDITING

A non-linear editing system consists of a computer which basically performs all the tasks of off-line and on-line systems. The computer combines video, audio, text and graphic capabilities, all in one system. At the risk of oversimplifying, we can say that a non-linear editing system is a seamless integration of video switcher, edit controller, DVE unit, character generator, paint and animation equipment, audio board, and time base corrector in a computer-based system. Even though the same tasks that are performed in an editing suite can be performed in a non-linear editing system, the basic equipment is totally different (see Figure 3.0). Instead of having multiple black boxes, each one performing a different task, these systems have a computer that basically controls and performs all the post-production tasks. The only elements that remain from the editing suite are the VTR and the audio and video monitors.
The computer's central processing unit (CPU) processes all the information, which includes running the editing software and compression algorithms. The main difference between these computer systems and the ones editors used to have in the suites is the unavoidable need for large amounts of digital storage. A non-linear system requires media storage devices to hold the footage and other materials in digital format. The keyboard, mouse, trackball and electronic pad are some of the available interfaces used to control the systems.

The systems have four characteristics that separate them from linear systems (Ohanian, 1993). First, they provide random access. Any material can be accessed in a non-linear fashion in approximately 12 milliseconds (12/1000 sec). Second, edits are non-sequential. Scenes can be edited in any order and then organized according to the script, without any generation loss. Third, edit changes are easily made. Since edits are non-destructive, the length and position of the shots can be easily adjusted, and multiple versions of the project can be easily created. Fourth, the three previous characteristics are time savers.

The market offers such a variety of systems that an editor may use a different one every month. The single most important factor that differentiates them is the quality of the output signal, which ranges from consumer level to broadcast quality. Despite the systems' differences in output quality, the basic principles of operation are similar across all systems. The quality of the output signal and the requirements of each project determine if the system should be used to produce an EDL, an off-line or an on-line video. To obtain a better understanding of the reach and limits of non-linear systems, it is necessary to begin with a review of the technical aspects of the technology.

3.1.1 Platforms

Non-linear editing systems are based on computer platforms. Each platform has different characteristics.
The platform's speed, memory, storage, expansion capacity, and audio/video capabilities have a direct effect on the system's performance. In his article "Choosing a desktop video platform," Jeff Burger describes the four computer platforms for non-linear systems: PC, Macintosh, Commodore's Amiga and Silicon Graphics Inc.

[Figure 3.0. Basic Non-Linear Equipment: source monitor, editing monitor, media storage, audio speakers, computer (CPU), videotape recorder, trackball, keyboard and mouse, electronic pad and pen. Source: Constructed based on the descriptions from various authors]

a. PC

PC systems benefit from the immense proliferation of clones. Approximately 90% of the computers in the United States are PC compatibles. The fierce competition among manufacturers results in hard-to-beat performance-to-dollar ratios. Nonetheless, standardization of software and hardware has been limited because of the free-for-all clone war. It is important to remember that the PC was developed as a text-only computer. Improving its multimedia capabilities and installing friendly interfaces like Windows requires additional memory. The PC's processing speed has been increased with Intel's Pentium processors. Machines capable of non-linear editing may be purchased for less than $2,000. Additional memory and storage peripherals may be acquired separately. Several advances are being made to compensate for the PC's weaknesses. However, presently only a handful of PC-based professional quality non-linear editing options are available.

b. Macintosh

Graphics, audio, video and print media professionals have been using the Macintosh for more than a decade. Despite its higher prices1, the creative communities prefer the Mac. Its friendly interface, multiple windows, menus, and instinctive point-and-click commands are its trademark.
The quality of its creative tools is higher than the PC's because this platform has been developed considering the needs of these users. The introduction of QuickTime2 in 1991 popularized desktop video. Macintosh standardized digital video compression and established the foundations for new software and hardware developments. The introduction of the PowerPC chips from Motorola has increased its speed and graphic capabilities. Software written for the PowerPC runs approximately three times faster than on the traditional Macs. However, because of a limited market compared with the PC's, there are fewer third-party Macintosh-based software developers. This is expected to change with the opening of the market to clone manufacturers.

1 Currently, Macintosh price reductions compete with those of PC clones.
2 QuickTime is a digital video editing software for Macintosh computers. It standardized digital video compression in an open architecture.

c. Commodore's Amiga

The Commodore's Amiga3 has certain qualities that distinguish it from the PCs and Macs. It is the only system built around NTSC video. This feature facilitates animation of video and graphics overlays. The Amiga has more built-in graphics and sound capabilities, because it has customized chips dedicated to these tasks. A double-buffering4 technique enhances the animation smoothness. The Amiga 4000 ($2,400 approximately) displays 262,000 colors simultaneously. The NewTek Video Toaster board installed in the Amiga will virtually transform the system into a Toaster5 with internal YIQ6 format for broadcast use. For an additional fee the Amiga offers 3D animation, paint, character generator, switching, and DVE. Adding memory and storage media, the total package costs around $10,000. NewTek's Screamer performs 3D effects 40 times faster than the Toaster alone, a PC or a Mac.
However, the pattern of compression standardization has been slow, which means that non-linear editing features are not prominent. Only linear video editing software that controls traditional video transports is available, but the company has promised the introduction of non-linear capabilities in the near future.

3 The company disappeared in 1994, but some top executives have formed a new company, which is expected to continue the work on these platforms.
4 The double-buffering technique employs two image buffers that are toggled back and forth. One displays the current frame while the other draws the next frame.
5 The Toaster is a desktop video system which consists of software and hardware. It offers 3-D animation, paint, character generation, a switching unit, DVE and other features.
6 YIQ is the color space used in the NTSC color system. The Y component is the black and white portion of the image. The I and Q parts are the color components; these are no more than a "watercolor wash" placed over the black and white, or luminance, component.

d. Silicon Graphics Inc.

Silicon Graphics Inc. (SGI) systems provide the highest power and best capabilities, but at the highest prices. An SGI state-of-the-art system costs around $20,000. SGI offers a 100 MHz microprocessor, compared with the 25 MHz of the Amiga. It runs at about twice the speed of Intel's Pentium and Motorola's PowerPC. The bus for I/O operations runs at 267 MB/s, compared with the other platforms that run between 20 MB/s and 40 MB/s. The Indy, SGI's basic package ($4,995), includes a microprocessor, an IndyCam desktop communication camera, 16 MB of RAM, and a 15-inch monitor. It can display full screen at 30 frames/s and accepts S-video and composite formats. However, a stand-alone system requires extra peripherals, storage, and additional memory. The hardware buffer for the 3D effects and an extra 500 MB hard drive will triple the Indy's cost.
Besides the cost, it is important to note that some of the most powerful software for 3D effects runs on SGI systems. A recent SGI implementation of QuickTime provides file compatibility with Mac systems. It also provides network transmission, including the ability to use ISDN7.

Choosing a platform should depend on the user's present needs, even though future expansions and growth should be considered. Steve Epstein (1994a), the technical editor for Video Systems magazine, suggests first finding the best software to match the user's post-production needs, and then determining what is required to run the application.

7 Integrated Services Digital Network (ISDN) is a set of standards for operating parameters and interfaces of a network that will allow a variety of mixed digital transmission services.

3.1.2 Open vs. Closed Architecture

Non-linear editing systems may be designed as open or closed architectures. A system with an open architecture permits off-the-shelf third-party plug-in boards and software packages to work with its core application. Closed architectures eliminate the option of adding devices, such as paint software or audio mixers, which are not developed by the manufacturer. An open architecture simplifies setting up system components (Molinary, 1994). It allows users to choose and buy only what their upgrade needs require. On the other hand, closed architecture upgrades and complementary devices are designed according to the capabilities of the system, without affecting its overall performance.

3.1.3 Interface

Although all the editors interviewed recognized that they are not familiar with other non-linear editing systems' interfaces, they are very satisfied with the Avid interface. They define it as natural, intuitive and easy to use. The basic interface of the system consists of a keyboard and two monitors; however, other devices such as trackballs and electronic pads can be added (see Figure 1.0, page 3).
The source monitor contains the bins and other additional information. It can be considered the computer monitor. The other monitor, called the composer, provides the area on which the edits are performed (see Figure 3.1). It contains the time lines for video and audio, and small windows that show the media being used. Within these windows the editor can review the takes and mark the in and out edit points. If desired, this monitor can play the NTSC video signal at full screen, in which case the composer window with the time line will be automatically transferred to the source monitor.

[Figure 3.1 Basic Non-Linear Systems Interface: edit filters and time-code-marked video and audio time lines. Source: A composition based on the descriptions from various authors.]

A significant interface change from traditional linear video systems is that time code is only used during digitization. After that, the editors deal with images only. According to Mr. Mulliner, he does what he wants and the computer keeps up with him, keeping track of his numbers. An important aspect of the Avid interface is its flexibility. The keyboard is completely reconfigurable and the monitor screens can be customized. Each editor can have a different setup that accommodates his/her personal style and the needs of the facility.

Mr. Monigold points out that the Avid interface was designed following the film editing style. Most of the vocabulary and terms, such as "bins" and "clips," are adopted from the film industry. Because film editing is a non-linear process, the editing concepts and procedures are closer to film than to video editing.
He recognizes that most film editors are not technical people, but the system interface facilitates editing: "even if you don't know what you are doing, you can find your way around it."

3.1.4 Digitization

Before editing on a non-linear system, footage has to be transferred from an analog to a digital format which computers can process. Thomas Ohanian (1993) defines digitization8 as the process by which analog signals are converted to digital data. The process begins when analog video is played into the computer system. The video signal is decoded into its analog components: red, green, blue and sync. The three color components are processed by three analog-to-digital converters (ADC) on the digitization card. The digitized data is sent to the computer memory as bits9. The bytes10 can either be stored, or compressed and then stored. The method by which these signals are stored can vary, but they are usually stored as either RGB11 or YUV12 (Ohanian, 1993). Once digitized, the video frames are available for additional processing.

8 Digitization is the process of converting an analog signal into a binary representation by encoding the state of that signal at frequent, successive moments in time.
9 Bit is a contraction of binary digit. The term refers to the representation of data in binary form, as a series of on-off or 1-0 digits.
10 Byte is the number of bits used to represent a character in a computer code. The most common byte size is eight bits.

3.1.5 Digital Video Compression

Storage costs are reasonable and affordable, approximately one dollar per megabyte (MB). However, if we calculate the amount of storage needed for 100 minutes of video, it is not cost effective. Approximately 100,000 floppy disks of 1.44 MB are needed to store 100 minutes of NTSC video (Lookabaugh, 1993). NTSC digitized video has 24-bit color (8 bits for each of red, blue, and green) and a resolution of 720 x 484 pixels (McConnell). Therefore, each frame requires approximately 700 kilobytes (KB).
Real-time video requires 21 MB (700 KB x 30 frames) for each second of video. In addition to the file size, there is the issue of data transfer rate. Average computers have transfer rates that range from 2 MB/s to 16 MB/s. This is definitely slower than the minimum of 21 MB/s required for real-time NTSC video. Compressing the video signal solves both problems of storage and data transfer rate. Compression reduces the amount of data to be transferred and stored, while maintaining a predetermined level of image quality.

Compression methods use mathematical algorithms to reduce and compress video data by eliminating and/or grouping and/or averaging similar data found in the video signal (Masavage, 1994). Masavage identifies four major factors that determine the quality of the compressed video:

• frame rate: video displays 30 frames/60 fields per second. Some compression algorithms reduce the number of frames to less than 30 per second, or eliminate every other field, using the same field twice to represent a frame.
• color resolution: the color palette or colors displayed can be significantly reduced to achieve high compression ratios.
• spatial resolution: compression can affect the size of the picture. NTSC video uses a 768 x 484 pixel display.
• image quality: the quality of the image should be acceptable for its intended purpose. The higher the compression ratio, the lower the image quality.

11 RGB stands for red, green and blue, the three primary colors in the additive color family used for projected image display. It is another form of component video where the signal is broken down into the three primary colors, with accompanying sync information typically carried separately or on the green signal.
12 YUV is the color space or mathematical representation of color used by the PAL and NTSC color systems. The Y represents the luminance component and the U and V are the color-difference components.
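The storage and transfer-rate arithmetic discussed above can be verified with a short sketch. The code is purely illustrative (it is not part of any system described in this thesis) and uses the figures cited in the text: roughly 700 KB per uncompressed NTSC frame at 30 frames per second.

```python
# Storage arithmetic for uncompressed NTSC video, using the figures
# cited above: approx. 700 KB per frame, 30 frames per second.
FRAME_KB = 700
FPS = 30

rate_mb_per_s = FRAME_KB * FPS / 1000          # data rate per second of video
storage_100_min_mb = rate_mb_per_s * 60 * 100  # storage for 100 minutes
floppies = storage_100_min_mb / 1.44           # 1.44 MB floppy disks needed

print(rate_mb_per_s)       # 21.0 MB/s, matching the 21 MB figure in the text
print(storage_100_min_mb)  # 126000.0 MB, i.e. about 126 GB
print(round(floppies))     # on the order of 100,000 floppy disks
```

Against the 2 MB/s to 16 MB/s transfer rates of average computers, this 21 MB/s requirement is exactly what makes compression unavoidable.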
The type of compression used determines how the video signal is manipulated. There are multiple standards for compression. Compression can be classified into two major categories: lossless or lossy, and interframe or intraframe.

a. Lossless vs. Lossy Compression

Video compression can be designed to perform lossless or lossy compression (Lookabaugh, 1993). Lossless13 compression ensures that the digital input to the compression system will be identical to the digital reconstructed output of the decompression. Lossy14 compression does not guarantee an identical reconstruction, because it eliminates some redundant information permanently. Nonetheless, it is the more efficient way to compress digital video. It achieves compression ratios ranging from 2:1 to 100:1. Lossy compression benefits from two facts. First, particular differences between the original and decompressed video may be transparent to the human eye due to its visual limitations. Second, the bit rate of transparent lossy coding is typically one fourth that of lossless compression. When a "perfect" image is not the most important goal, transparent lossy compression may be cost beneficial. In addition, any image captured by the camera suffers an inevitable loss or distortion introduced by every component through which the signal passes. If the signal is going to have an inevitable loss, then all decisions regarding compression should be based on the tradeoff between cost and performance.

13 Lossless compression involves any scheme that compresses a file by simply rearranging or recoding the data in a more compact fashion.
14 Lossy compression involves any scheme that compresses a file by throwing away any data that the compressor deems redundant or unnecessary.

b. Intraframe vs. Interframe Compression

Compression algorithms requiring hardware-assisted15 decoding are divided into two main categories: intraframe or interframe.
Intraframe systems compress each frame individually and store the frames as discrete pictures (Masavage, 1994). Sometimes the compression is performed at the field level. This method ensures quality video and editing accurate to the frame. However, intraframe data rates are 2 to 10 times higher than interframe compression. Interframe compression compresses across video frames by saving only the differences between frames (Walker, 1994). Interframe uses a process called motion compensation, which combines the use of key frames and motion prediction to achieve high compression ratios and low data rates. Key frames are full frames saved without being compressed, used as a reference for the compressed frames that follow them. Motion prediction uses the information differences between consecutive key frames to reconstruct the frames in between.

15 Hardware-assisted means that the computer uses add-on hardware boards with dedicated and extremely fast chips. Hardware-assisted digital video displays better quality video.

c. Compression Standards

Many compression formats have been developed but only a few are widely used (McConnell, 1993). Motion JPEG (Joint Photographic Experts Group) is the most frequently used format for video editing. Originally developed for still images, Motion JPEG digitizes each frame and plays the frames back, simulating full-motion video. It uses intraframe encoding, providing random access to each frame. Motion JPEG does not include a built-in audio digitizer. However, the audio is frequently digitized separately and added at a later stage during editing.

Moving Picture Experts Group (MPEG) is mostly used for multimedia applications. MPEG compression ratios reach 200:1, but reasonable quality is only maintained up to a 100:1 ratio. MPEG combines intraframe and interframe compression by compressing each frame individually and eliminating redundant information between frames, saving only their differences.
Frames are not individual elements; they require information from proximate ones. Not having random access to individual frames makes this format unsuitable for non-linear editing.

Digital Video Interactive (DVI) is a proprietary standard with two different methods of compression. Production Level Video (PLV) is comparable to MPEG. It benefits from a wide range of compression ratios. PLV compression costs are quite high and typically the footage must be sent away to a service bureau. Real Time Video (RTV) performance is similar to PLV, but it uses low resolution images to achieve real-time compression. Even though it was commonly used for non-linear editing, it is being substituted by Motion JPEG.

Wavelet compression technology is similar to Motion JPEG. It saves the frames individually, quantifying information for each frame as a single unit. Its major drawback is a noticeable image degradation. Wavelet introduces artifacts, especially around the objects' edges.

d. Using Compression

The way the interviewed editors approach the issue of compression depends on the type of editing they do, either off-line or on-line. As Mr. Koster explains, "it is really hard to rate the (compression) quality because the quality (acceptable level) changes depending on what it is used for." The Avid 8000 has twelve levels of compression or resolution. However, of those twelve levels only two use two fields per frame, while the rest double a field to form a frame. More modest models have around three compression levels. The highest resolution available at the moment is close to true Betacam quality. Mr. Mulliner says that approximately 80% of his work is on-line and requires the highest resolution. For the remaining 20% of his work he uses AVR 3 or AVR 4. The quality of the image is considerably lower, close to consumer level. He does not like working with resolutions below these levels.
He comments that after working at this lower resolution and then going to an on-line suite, the change in quality is a big shock, like wearing a pair of glasses for the first time.

At Harvey's Place almost all the work is off-line. Mr. Luttermoser explained that they always try to use AVR 26 or the best possible quality that will allow them to store the footage they have. They also have to consider the projects that currently "live" in the machine. Artifacts like pixellation are of no concern for off-line editing. However, when using complex images with a lot of little details and combined camera movements, the images seem to strobe. In these cases Mr. Monigold prefers to check the original footage before using it, to make sure the strobing is not on the source tape.

The editors interviewed agreed that the quality is good enough for the type of work they do: off-line, corporate and industrial videos, and even some broadcast programs. The quality is very close to Betacam except on extremely complex images, where it may have break-ups. Still, the image quality is not acceptable for finishing high-end productions. Nevertheless, they understand that they are using the best quality available in the market and that these problems are temporary because new improved versions are being developed. As Mr. Mulliner explains, "You have to realize that we are on the edge of compression technology here... that is the compromise you made. It is cheaper to cut on the Avid than it is to cut in an on-line suite."

3.1.6 Storage

Once all source materials are digitized, they reside on the computer disk. Logically, these disks' storage capacity is very limited considering the vast amount of storage required by digital video. This fact introduces to the editing process the concern of storage, which with analog systems was solved by simply buying more tapes. When using non-linear systems, the capacity of the computer disks is frequently not enough to store the footage and
When using non-linear systems the capacity of the computer disks is frequently not enough to store the footage and materials from all the current projects. External storage devices are the most popular alternative for storing the source materials that have been digitized, with the most common being the hard drive. Charles Ohanian (1993) identifies the three main characteristics of storage devices: capacity, transfer rate and access time.

• Capacity refers to the amount of digital video that can be stored in the device. Capacity is determined by each disk drive and the number of drives that can be linked. It is measured in kilobytes (KB), megabytes (MB), gigabytes (GB) and terabytes (TB).
• Access time refers to the amount of time required to retrieve and display information from the storage device. It is measured in milliseconds (ms).
• Data transfer rate is defined as the amount of data that can be read or written by the storage device during a second. It is measured in KB or MB per second.

All three aspects should be considered when choosing storage. However, the most important characteristic is the data transfer rate. To determine the minimum data transfer rate required, multiply the number of bytes of a compressed frame by 30 frames per second. For example, if a compressed frame requires 10 KB and each second of video requires 30 frames, then the transfer rate should be at least 300 KB per second (see Figure 3.2).

n KB (1 frame) × 30 fps → 30n KB/s minimum transfer rate for real-time video
Example: 10 KB × 30 fps → 300 KB/s

Figure 3.2 Calculations to Determine the Minimum Data Transfer Rate.

Depending on the post-production style of non-linear editing, the storage devices can be used as primary or secondary storage (McConnell, 1993). Primary storage contains the source materials that are currently being used. Meanwhile, other materials necessary for the project but not currently in use are kept in secondary storage.
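The Figure 3.2 calculation can be expressed as a small helper function. This is an illustrative sketch in Python (the function name is hypothetical, not part of any editing system):

```python
def min_transfer_rate_kb_s(frame_size_kb, fps=30):
    """Minimum sustained transfer rate (KB/s) for real-time playback.

    frame_size_kb: size of one compressed frame in kilobytes.
    fps: frames per second (30 for NTSC video, as assumed in the text).
    """
    return frame_size_kb * fps

# The document's example: a 10 KB compressed frame at 30 fps
print(min_transfer_rate_kb_s(10))  # 300 (KB/s)
```

A storage device whose sustained transfer rate falls below this figure cannot play the compressed stream in real time, regardless of its capacity.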
SCSI¹⁶ hard drives, either fixed or removable, are commonly used as primary storage. They are fairly inexpensive ($1,000 per GB) and provide transfer rates up to 4 MB/s. Single disk drives come in sizes up to 3.5 GB, enough storage capacity for nearly 30 minutes of Betacam-quality video. Their capacity is easily upgradable by linking additional drives. For secondary storage, magnetic, magneto-optical or phase-change optical drives are practical solutions. These drives use removable diskettes, allowing editors to change projects as quickly as inserting a new diskette. Each diskette's storage capacity varies from 650 MB to 1.3 GB, with a transfer rate of approximately .5 MB/s. The storage cost for 1 GB is about $100. Diskettes are low-cost, especially considering that a gigabyte of memory stores about 30 minutes of video. On the other hand, 8 mm digital tapes have lower transfer rates but also lower costs (Ohanian, 1993). An 8 mm tape can store 3 GBs for only $20, but the transfer rate is often too slow for practical use. There are many alternatives for storage and many more being developed. However, the best alternative for each user has to be determined by his/her particular needs.

16 SCSI stands for Small Computer Systems Interface. SCSI is an interface standard used to connect the computer's CPU to small internal and/or external peripheral devices such as scanners, hard drives and printers.

a. Storage Management

Managing storage is a juggling act. To maximize the storage capacity, a high degree of organization is required during the early stages of editing. Careful organization involves selecting only the best takes to be digitized, along with the other materials to be used, such as still pictures, graphics, and music. It is necessary to calculate the amount of storage required to store the materials using different resolutions. Mr. Jackson recognizes storage as a constant concern. For tape-based systems storage is unlimited. There can be as many tapes as needed.
On non-linear systems, the storage is limited by the capacity of the hard drives, especially when working at higher resolutions, which are very demanding of storage space. At the Whirlpool facilities they have seven drives with a total capacity of 16 GB. Mr. Jackson estimates these 16 GBs can store approximately three hours using AVR 26¹⁷ resolution. For projects with large amounts of materials, he digitizes at a lower resolution that will allow him to store all his footage, and edits the program. Then, he saves the timeline, which contains the edit decisions, throws away the lower resolution footage and re-digitizes the footage at a better resolution to reconstruct the project. On the other hand, Mr. Koster at Calvin College prefers to divide the programs into sections before editing and digitize the footage for each section individually at the highest resolution. This way he avoids having to re-digitize and reconstruct the whole project at the end. At Postworks, Mr. Mulliner has four 9 GB drives for video and three 2 GB drives for audio. He backs up the projects on 1/2-inch digital tapes, but he recognizes that sometimes it is easier to re-digitize the whole program if it is necessary to make changes. For him, storage is a whole new issue that has been brought to the business. He foresees that: "there are going to be people who can make a career out of just media management. As things go more and more to the digital platforms with big file servers, and all this data that we are (moving) back and forth, somebody has to juggle that... It's kind of important." At Harvey's Place they have from 9 to 12 GB removable hard drives for each system. If more storage is required for a particular project they just take a removable drive from another suite or digitize at a lower resolution.

17 AVR 26 is one of Avid's compression algorithms. Its quality is comparable to a 3/4-inch videotape.
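Using the same arithmetic as Figure 3.2, disk capacity translates into minutes of footage. A back-of-envelope sketch (the function name and the per-frame sizes are illustrative assumptions, not Avid's published figures):

```python
def storage_minutes(capacity_gb, frame_size_kb, fps=30):
    """Approximate minutes of video a disk can hold at a given compression.

    Uses decimal units (1 GB = 1,000,000 KB) for a rough estimate.
    """
    kb_per_minute = frame_size_kb * fps * 60
    return capacity_gb * 1_000_000 / kb_per_minute

# e.g. a 16 GB array holding footage compressed to 30 KB per frame
print(round(storage_minutes(16, 30)))  # 296 (minutes)
```

At roughly 49 KB per frame the same formula yields about three hours for 16 GB, which is in the neighborhood of Mr. Jackson's estimate for AVR 26; the exact per-frame size depends on the codec and the image content.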
They try to keep only one project on the machine and when it is finished and transferred, they throw it away. The summer is their busiest time of the year and that is when, according to Mr. Monigold, they have to be selective about what to store. The rest of the year they have no problem with storage.

3.1.7 Editing Tools

There are many devices integrated in a non-linear editing system. This section covers the functions most relevant to the editing process and their main features. This is not an exhaustive listing because of frequent product modifications and introductions.

a. Transitions

The video switcher is the heart of any system. The switcher functions on the non-linear editing systems provide standard transitions such as cuts, dissolves, wipes and keying. On software-based systems, new effects can be added, including hundreds of traditional effects. Some systems offer two separate keyers, one for video and another for graphics. These two keyers allow simultaneous keying of video or luminance, and graphics titles (Chan, 1993b). On each and every edit, a transition, keying effect and/or multi-layering can be done, all with the same ease. Quality transitions should look smooth at all speeds. Furthermore, transitions and keying should work identically between all video layers. Hardware-based video effects process transitions and effects in real time, while most software-based effect banks require additional time to render the effects. New transitions can be easily added in software-based systems, but hardware-based ones are more powerful.

b. Effects

High-end systems offer a seamless integration of effects generation (Chan, 1993). DVEs include 2-D movements, scaling of video or graphics, and effects such as mosaic, strobe and posterization. Three-dimensional effects include warps, skews, rotations and page turns. Other effects are pattern position and hold, wipe modulation, variable aspect ratio, softness and border modifiers.
Multi-layered compositing is becoming a regular feature of most systems. Compositing permits each video layer to be treated independently, allowing any combination of video, background and key (McConnell, 1993). The priority of the layers is completely programmable. Layers can be cut, mixed or wiped into or out of the video composite, individually or in combination with other layers. Top-of-the-line equipment offers unlimited layering without image degradation.

c. Graphics

The process of adding titles has traditionally been performed by a dedicated character generator. Most non-linear systems include the character generator as a regular tool. Its basic motion features to look for are character scroll and crawl. Characters may be sized, colored with a color gradient or transparency, or outlined. Drop and single shadows with individual coloring or color edging can be added.

3.1.8 Audio

All the editors interviewed agreed that audio quality is better than in most linear systems. It is digital CD quality. The storage requirements for audio are approximately 10 times smaller than for video. Nonetheless, the audio tools seem to be the weakest point of the Avid system. The Avid 4000 and 8000 models offer 24 audio tracks, but only four can be played simultaneously. As Mr. Jackson explains, regardless of how many tracks he uses, they all have to come out as one stereo pair, whether two or four channels. The Avid 1000 or lower models only have two simultaneously usable tracks. Mr. Koster uses one of these models. To edit audio he uses one track for voice and another for music. If sound effects are needed, he combines tracks one and two together and adds the effects on the track that has been freed. He considers this process cumbersome and hopes for changes in future upgrades. The systems have no audio processing capabilities except for level control and 3 dB of gain. There is no audio equalization, no editing tools and no mixing capabilities.
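The "approximately 10 times smaller" figure can be sanity-checked from first principles: CD-quality audio is 44,100 samples per second, 16 bits per sample, two channels. The video rate used for comparison below (about 2 MB/s for a high-resolution compressed stream) is an illustrative assumption, not Avid's actual figure:

```python
# CD-quality audio data rate: 44,100 samples/s x 2 bytes/sample x 2 channels
audio_kb_per_sec = 44_100 * 2 * 2 / 1000      # 176.4 KB/s
video_kb_per_sec = 2000                        # assumed ~2 MB/s high-resolution video
print(round(video_kb_per_sec / audio_kb_per_sec, 1))  # 11.3
```

So a high-quality compressed video stream consumes on the order of ten times the disk space of its CD-quality soundtrack, consistent with the editors' rule of thumb.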
The editing process is similar to that of video. In the clip window the editor marks the in and out points and places the clip on the timeline. Mr. Jackson describes the timeline as "a place to put audio". He says audio "has to be what you want it to sound like going in, because that's how it's going to sound going out." Compared with audio workstations that work with 1/100 of a second as the smallest unit, the Avid falls short because it works with video frames, which are 1/30 of a second. For small projects the quality will do, but when better quality and accuracy are required, the editors recommend requesting the services of an audio facility. Devices like a mixer can be connected to the system as I/O devices¹⁸.

18 I/O devices are characterized by their ability to send information or data signals to and from a computer. I/O stands for input and output.

Many users may like to see better audio tools, especially now that the Avid is an on-line system. In terms of video the system provides all the elements necessary to finalize the video portion of a project; however, most editors are forced to send the audio portion to be "sweetened" by audio specialists who will be able to match audio quality to that of video. The editors expressed discontent about the disparity in quality between video and audio tools. They are looking forward to the introduction of audio tools comparable to broadcast quality. In the case of Mr. Luttermoser and Mr. Monigold from Harvey's Place, these drawbacks are not relevant because they do mostly off-line editing and their clients go to an audio post-house in any case. Mr. Monigold referred to a new breed of musicians who work in editing and post-production. "They are doing what they call audio design... when they put in a car crash, or a tire squeal or a baby cooing they make sure that the pitch and the sound and everything works within the music.
Sometimes you would have a music track, and we would have cut the effects track with the right effect, and the music is perfect, but somehow when you put them together it is not right. The sound designer can make sure that all is going to fit together." The audio capabilities of the system are satisfactory for the audio work they do. As Mr. Mulliner puts it, "We edit pictures, not sound." His clients also go to audio houses that use Mac audio workstations. He explains that "...these places are run by musicians, they have whole libraries and they are good at what they do. That's what we recommend." The audio from the Avid is recorded on a disk and taken to the audio post-house for sweetening. When it comes back it is imported into the Avid and re-laid onto the project.

3.2 CREATIVE ASPECTS: POST-PRODUCTION AND NON-LINEAR EDITING

The non-linear editing systems may have multiple uses during post-production, depending on the user's needs and quality requirements. These systems can be used to log, to produce EDLs, to edit off-line or to edit on-line. The discussion of the creative aspects will concentrate on the use of non-linear systems as a post-production tool to produce finished videos.

3.2.1 Stages of Post-Production using Non-Linear Editing Systems

The editing procedures on the non-linear systems vary from one editor to another and from one project to another. The flexibility of the system allows the editors to select the most effective procedure to approach each project. However, there is a basic outline for the editing process on the non-linear editing systems:

• Logging
• Materials organization and creation of bins
• Digitization and storage
• Review of the organization scheme
• Edit off-line and/or on-line
• Backup and save of the project
• Output the project to an analog format

Before starting the editing process, all the materials must be time coded and the tapes must be properly identified.
This is especially important in the non-linear environment because it is the only way in which the computer can recognize and identify the original sources of the digitized materials. Once everything is identified and time coded, the editor may proceed to log the materials.

a. Logging

Editing in non-linear systems requires some adjustments to the traditional procedures. The logging process needs to be more precise. Besides identifying the location of shots and a description of good and bad ones, logging for non-linear editing should produce an accurate selection of the shots to be used during editing. This pre-selection ensures that only good quality and needed footage will be digitized and stored. The proper management of the storage maximizes the amount of footage available during editing. The procedures of logging are similar to those for linear and traditional systems. Depending on the facilities and resources available for each project, the editors interviewed identified three different ways to log:

• The logging can be done in an off-line system with or without burn-in time code. The materials are reviewed and the circled takes¹⁹ marked on the shot sheet. Then, the time code of the circled takes can be manually entered into the non-linear system.
• Using a computer workstation with logging software, the materials can be reviewed to select the circled takes. To import the list from the workstation into the non-linear system the protocols must be compatible; otherwise it will be necessary to input the list manually into the system.
• Materials can be logged directly on the non-linear system. The producer comes into the suite with all the tapes to select the takes. The list of circled takes can be created on the system.

Independent of the method selected for logging, the result of this process should be a list of the best takes that may be used to edit the program.
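In essence, a circled-take log is a list of tape identifiers plus in/out time codes. A minimal sketch of such a list and the frame arithmetic behind it (the function and field names are hypothetical; 30 fps non-drop time code is assumed):

```python
def tc_to_frames(tc, fps=30):
    """Convert an HH:MM:SS:FF time code to an absolute frame count (non-drop)."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

# One entry per circled take: source tape and in/out points
log = [
    {"tape": "004", "in": "00:02:10:00", "out": "00:02:14:15"},
    {"tape": "004", "in": "00:05:00:10", "out": "00:05:03:10"},
]

total = sum(tc_to_frames(t["out"]) - tc_to_frames(t["in"]) for t in log)
print(total)  # 225 frames selected for digitizing
```

Multiplying the frame total by the per-frame size at the chosen resolution gives the storage the batch will consume, which is why precise logging directly conserves disk space.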
19 "Circled takes" is a term used by the interviewed editors to refer to the good takes that may be used during the editing of the project.

The limited storage capacity of the systems requires a careful selection of the materials to be digitized, to maximize use of the storage space.

b. Media Organization

The next step involves the system directly. It consists of the development of some type of organization to classify the materials that will be used for editing. The system allows the user to create multiple bins²⁰ in which to organize the materials. According to Mr. Jackson, in the early versions of the Avid systems there was no way to differentiate a clip from a sequence, an effect or a graphic, which forced editors to separate them by placing them in different bins. Nowadays, each type of media²¹ is identified by a different icon followed by a name given by the editor, but the classification using separate bins has been maintained as an effective way of organization. Depending on the project, Mr. Jackson creates different bins to store the titles, sequences, effects and source materials, a work bin for media that is in use, and a utility bin for the slates, color bars, etc. He may create other bins while editing to break down the media even further.

c. Digitization

Once the organizational structure has been devised, the source materials can be digitized. The analog video signal from the tape is played back and fed into the computer through a converter that changes the signal from analog to digital. Other materials such as graphics and titles have to be digitized too. Usually, digitization and compression are performed simultaneously. The technical aspects of the digitization process are invisible to the editors.

20 Bins are storage files used to organize the clips. The term was borrowed from the film vocabulary.
21 Once the footage, titles, graphics and other materials are digitized into the system they are referred to as media.
The editor selects the materials to be digitized using time code numbers, plays the source deck and presses the record button for the corresponding bin. The computer automatically grabs the materials from the source tape and places them in the designated bins. From the bin, the editor selects the resolution or compression ratio to be used while digitizing. The materials can be digitized directly into the different bins, or batch digitized and classified later. The editors interviewed identified four ways to digitize the source materials:

• When digitizing "on the fly", the deck that contains the source tape is played and, using the record button of the bin, the computer grabs all the materials as the deck plays. The editor uses the play, pause, and record buttons to select the materials to be digitized.
• Another form of digitizing "on the fly" is to mark the in and out points of a selected take and digitize it individually.
• One of the easiest procedures is to "batch" digitize. The editor enters into the system the list of the circled takes' time codes, manually or from a floppy disk. The computer uses the circled takes' time codes to automatically select the takes from the tape and digitize them. The editor's job is to place the tapes in the deck as the computer requests them.
• Finally, the editor can go through the entire tape marking in and out points while logging and then batch digitize the list of takes on that particular tape.

Once on the computer disk, all the materials are called clips. Each clip is represented by an icon. The type of icon may vary depending on the clip's media. A video clip icon may be a thumbnail or video frame representative of the clip. Other clips' icons only indicate the media (video, audio, graphics, effects) and the name given by the editor. Clips can be opened into windows with controls and editing tools used to manipulate the clips. All clips must be identified, cataloged and organized.
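Batch digitizing is essentially sorting the take list by source tape so that each reel is requested only once. A minimal sketch of that grouping step (the function and field names are hypothetical, not an Avid interface):

```python
from itertools import groupby

def batch_order(takes):
    """Group circled takes by source tape so each tape is loaded once."""
    takes = sorted(takes, key=lambda t: (t["tape"], t["in"]))
    return [(tape, list(group)) for tape, group in groupby(takes, key=lambda t: t["tape"])]

takes = [
    {"tape": "002", "in": "00:01:00:00", "out": "00:01:05:00"},
    {"tape": "001", "in": "00:09:10:00", "out": "00:09:12:00"},
    {"tape": "002", "in": "00:04:00:00", "out": "00:04:02:00"},
]
for tape, group in batch_order(takes):
    print(tape, len(group))   # one line per tape: its ID and number of takes
```

The system then walks this list, prompting the editor for each tape in turn and capturing every take on it before moving to the next.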
The editor decides the content and size of each clip, which can vary from a single frame or graphic to several minutes of edited video. When using a clip editor, the name, the location (based on time code), a picture icon and other information related to the clips are stored in a database (Chan, 1993).

d. Off-line/On-line Editing

With the materials digitized and organized, the editor is prepared to edit. Typically, the systems use two screens (Ohanian, 1993). One serves either as a monitor to display the edits or as a computer screen. The other, the composer or editing screen, is a graphic interface (refer to Figure 3.1, page 9) representing the editing workspace. Through this screen the editor controls audio, video, graphics and effects. The icons representing both the controls and the media can be opened into windows with graphic representations of the editing tools used to manipulate the clips. The editing process can be as simple as opening the clip, selecting the in and out points in the clip window, dragging it to the timeline and releasing it in the desired position within the sequence. The timelines are a graphic representation of the project, including all the media used. The timelines function like a storyboard to visually monitor the composition of the project. The editor can review the overall rhythm of a sequence and see how much time is allotted to each media clip and transition. The timelines can be expanded to show details of edited sections or contracted to display a total view of the project. The duration and the edit's in and out points can be controlled from the clip window. Any required trimming of the clip can be performed in the timeline as well as in the clip window. Both windows allow the editor to see the exact frames of the edit points, which is very helpful when trying to match the action between shots. The in and out points can be changed at any time without affecting the material before and after the edit points.
Using point, click and drag operations, the entire production can be edited by moving clip icons along timelines, as in a storyboard. Transitions, effects, titles, keying and multiple layers are also placed along the timelines. Besides the traditional keyboard and mouse, the user may use other control devices such as the trackball, joystick, electronic pad and controller knobs. Depending on the project, the editor may decide to use different approaches. All the editors interviewed prefer to do a rough cut first, making the scene placement and timing decisions. They lay down all the takes in order to get a sense of how the project is flowing, dealing first with the pace and rhythm of the program. This editing is comparable with the creative decision-making process during off-line editing. Then, they go back to make changes and give the finishing touches, as they would during the on-line if they were finishing on the non-linear system. As Mr. Luttermoser explains: "I try to build the spot. I don't like to slave over the first (edit) because you may slave over those first two seconds of the spot, then when you go to find scene two, you find that scene one does not work at all with scene two. I try to build the spot to get the big picture, to see how it's going to flow." When editing with a client, they prefer to edit a section or a couple of pages of the script at a time, instead of doing edit by edit. In Mr. Jackson's opinion, "a single edit does not mean much". Non-linear systems allow the editor to quickly edit multiple versions of a sequence. This allows the producer to select almost immediately the best take to use in the edit. If the producer is unable to make a decision, two or more versions of the project can be saved and shown to the client to select the final one. According to Mr. Monigold, more than half of the projects they do at Harvey's Place have four or five versions.
Also, it is possible to divide the project into sections and work with one at a time, disregarding their script order. The editor works with the available footage and then pieces the sections together. This is especially practical if the storage required for the source materials is considerably larger than the available storage capacity. The editor can digitize all the necessary footage for a section of the project, edit it and throw away the takes that were not used. The extra space can be used to digitize the source materials of the next section. For large projects Mr. Jackson suggests another approach that helps to deal with limited storage. It consists of digitizing all the necessary materials at a low resolution and then doing a complete off-line with titles, graphics and effects. Once the program is completed, the timeline, which contains the in and out points, duration and location of the takes, is saved, but the footage is thrown away. If the source materials have been properly identified and time coded, the editor just needs to select the capture mode at the desired resolution and batch digitize the program sequences. The computer, knowing the time code address of each source material, will be able to identify the corresponding tapes, digitize the adequate takes and place them following the original timeline. All the editor has to do is place the correct tapes into the source deck and the computer will reconstruct the program using higher resolution footage. However, all the effects need to be rendered again because they are created frame by frame using the new footage. As Mr. Jackson explains, the problem is not editing a 30-minute program but storing and managing the many hours of raw footage required to edit those 30 minutes. Another interesting post-production alternative is the case of one of Mr. Mulliner's clients. The client edits the program on a laptop and brings a complete EDL to the editing facility.
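The storage payoff of Mr. Jackson's low-resolution offline pass can be quantified: only the frames that actually made the cut need high-resolution storage during the final re-digitizing. A rough sketch (the function name and per-frame sizes are illustrative assumptions, not Avid's figures):

```python
def timeline_storage_mb(duration_frames, frame_size_kb):
    """Disk space needed to hold just the material in the cut at a given resolution."""
    return duration_frames * frame_size_kb / 1000

cut_frames = 30 * 60 * 30            # a 30-minute program at 30 fps
print(timeline_storage_mb(cut_frames, 8))    # offline draft at ~8 KB/frame: 432.0 MB
print(timeline_storage_mb(cut_frames, 40))   # final conform at ~40 KB/frame: 2160.0 MB
```

The hours of raw footage behind the cut never have to exist on disk at the high resolution, which is what makes the approach workable on a 16 GB array.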
They import the list and digitize the materials. The show is essentially done, only requiring the addition of effects and graphics.

e. Effects, Titles and Graphics

Usually the last stage of the editing process involves adding the effects, titles, and graphics. For projects done totally on a non-linear system this would be the on-line editing. Adding effects is relatively simple. The editor selects the effect to be used from the menu and places it on the timeline between the clips where the transition or effect will take place. The duration can be easily modified by typing the desired length in the effects window or adjusting it on the timeline. The process of adding graphics can be performed through a chroma key. However, the creation of effects and graphics inserts requires additional rendering time²². Mr. Mulliner deals with this problem by rendering during lunch time or overnight. When all the editing is completed, the timeline is saved to a floppy disk in case additional copies or changes are required later. The editor selects the type of output he/she needs from the system. All the edit decisions can be collected in an EDL, edited into an off-line workprint, or the edited final project can be exported from the computer onto a videotape. If further on-line editing is required, the producer takes it to an on-line suite. Once again, the use of the product that comes out of the non-linear system depends on its quality and the quality requirements of the project.

22 Rendering time refers to the amount of time the computer needs to create the effect using the footage. It varies from minutes to hours, depending on the complexity of the task. This computation process creates each frame individually.

3.2.2 Editing Styles: Continuity and Complexity Editing

Complexity and continuity editing styles are both enhanced when editing in a non-linear system. In describing the trimming capabilities of the system, Mr.
Mulliner explained that four different shots can be seen simultaneously on the screen: the outgoing shot of the previous edit, the in and out points of the take the editor is working on and, finally, the first shot of the following edit. For continuity as well as for complexity editing, this feature helps the editor visualize the relationship between the shot he/she is working on and the previous and next shots. Another important capability is the ease of making changes. A clip can be changed in length and position as easily as clicking and dragging. The whole process only takes about two seconds. As Mr. Jackson said, "anything that is difficult to edit would benefit from the Avid; you can be confident that you would end up with what you like because you have the opportunity to do just that." When asked what type of shows benefit most from editing in the system, the editors were inclined to respond that anything that is complex or with a lot of choices does. For example, Mr. Monigold explained the case of multi-camera editing. The takes from all the cameras can be loaded and looked at individually or simultaneously. He compared it to being in a control room and switching between cameras, with the only difference being that there is enough time to choose the best take. Another type of show that benefits is the documentary type, where there is no required chronological or natural order. Mr. Koster used the example of the video of an art exhibition, where he could start with interviews of the participants or by presenting the art works. In this case the options were unlimited and the system allowed him to explore more than one possibility and then use the one that best captured the event. On the other hand, simple shows where little editing is required and the order is basically set do not benefit greatly. Nonetheless, if the show is basically a talking head it can be cut much faster. According to Mr.
Mulliner, instead of recording and waiting two or three minutes for the next edit, the editor can simply mark the in and out points and lay the clip in the timeline without having to wait for the whole clip to be recorded.

3.2.3 Editing Language

The elements and techniques available to the editor to communicate and create meaning through the language of editing remained unchanged by the use of non-linear editing systems. The conventional meaning and use of shots, perspectives, transitions, effects and graphics remain intact. When the editor creates meaning and the viewer reads it, the use of linear or non-linear systems is invisible.

a. Pace and Rhythm

Pace and rhythm have always been a major concern for editors and producers alike. On the linear systems the attention is directed to the individual edit, to make it as good as possible, because any change after a sequence is completed means going down a generation or re-editing the program from that point on. Under these circumstances it is not easy to follow the pace and rhythm of the program. According to Mr. Koster, on the non-linear system the editor can see a visual representation (the timeline) of the edits that form the program and graph out the pacing. Also, the editor can do a rough cut to get a sense of the program's rhythm, and even after it is completely edited it is easy to make changes in the duration and content of the clips to alter the pace and rhythm.

3.3 EDITOR'S OPINIONS

During the interviews the editors were asked directly what they considered to be the advantages and disadvantages of non-linear editing systems (see Appendix B). Their answers are compiled here, sometimes enriched by comments or ideas they expressed while answering other questions. This section presents what the editors interviewed deemed relevant. It has been kept separate from the previous section because these are the impressions of professional non-linear editors who use the systems every day to make a living.
3.3.1 Advantages

Most of the advantages have already been discussed throughout the previous section. Some are related to the editing process itself. Others deal with economics and the lifestyle of the editor. The purpose of this section is to outline them and provide the reader with a clear view of the advantages of the non-linear systems as perceived by non-linear editors.

a. Flexibility

The advantage to which the editors referred the most during the interviews is the flexibility of these systems. There are two aspects to the flexibility of the system. One refers to the features that allow the editors to customize the screens and the keyboard. The other refers to the flexibility of the editing process.

• The editor can lay down all the takes and then come back to refine them.
• The editor can divide the project, edit by sections and then edit all the sections together.
• The editor can edit with the available footage whatever part he or she wants to edit, going back and forth along the project: the beginning, the middle and even the end.
• In addition, the editor is able to edit in a linear fashion.

The most remarkable feature of the flexibility is that it allows the editor to save a sequence and then explore other editing alternatives without generation loss or redoing the subsequent edits. Changes to the program can be performed at any time during the editing process without damaging the previous or following materials. Takes can be trimmed directly on the timeline. Complete sections or individual takes can be eliminated, moved or added in just seconds by selecting and dragging them to the desired position. If the new version works, he/she can replace the original sequence; if not, just keep the original one. The editors are able to produce multiple versions of a program. The only extra time required is what it takes to incorporate the changes and save it as a new version.

b.
Random Access

The random access feature has had a great impact on the editing process. It is no longer necessary to shuttle the tape up and down trying to find a take. All the takes are listed as clips in the bins. The editor selects a clip's icon and opens it to have almost instant access to the take. Mr. Jackson explained that when he wanted to compare shots on the tape-based systems, by the time he shuttled from one take to another he had, in most cases, forgotten what he was looking for. The instant of inspiration got lost during the time the tape was rewinding. Most of the time even the producers do not want to bother with looking for that perfect shot, and settle for a take that will just do the job.

Random access is just one of the many functions that save time. There is no need to rewind, fast-forward or change tapes. No time is spent preparing the switcher and the A/B/C rolls to route effects. Changes and trimmings take only seconds. There is no wait for the take to be recorded on the time line. As Mr. Luttermoser explains, "there is no one thing that saves an incredible amount of time. It is all little things that don't take long to add up to hours." Time is saved, but it is certainly used to improve the projects. According to Mr. Monigold, "it is not great if you just got out at 5:00 pm and say this will do. But when you are done at 2:00 pm you still got three hours to explore other possibilities. We are getting a better product out, we are trying more versions and we are still going home early."

c. Upgradeability

From the perspective of an owner and businessperson, Mr. Mulliner is glad he does not have to watch expensive equipment go obsolete. To upgrade his systems he just gets the new version of the software or adds hardware. There is no need to invest in a whole new system. He has a maintenance contract with the manufacturer under which he receives software upgrades free of additional cost.

d. Comfort

For Mr.
Monigold, the job has become more comfortable. He does not have to spend long hours bending over a film editing machine, dealing with chemicals and cuts from the editing blades, or searching for that little piece of film that was trashed earlier. Now he sits in a comfortable chair and his hands are healed. There is virtually no physical effort in editing on the system.

3.3.2 Disadvantages

The disadvantages discussed here refer to the difficulties the editors have encountered while editing on the system. Most of them are related to the use of a computer-based system. Fortunately, the problems that the editors have encountered have caused minimal damage. According to Mr. Luttermoser, the software is very robust, and when it breaks down the editor may lose the last couple of edits, but nothing that cannot be recovered with some extra time. Mr. Mulliner points out that problems like the computer locking up are mostly due to wrong combinations of keystrokes that the computer cannot recognize. He says it happens less frequently now, because the system has become more stable. In terms of reliability he considers linear and non-linear systems equal. Mr. Mulliner and Mr. Monigold accept that lockups, breakdowns and bugs are inherent problems of computers. For them, dealing with new versions and the cutting edge of software means that they have to deal with these problems.

Another aspect to consider is that most editors are not computer-trained people. They know how to use the system to edit, but to fix problems computer specialists are required. In Mr. Luttermoser's opinion, "as more technology is imposed on people, they are forced to learn about the technology and some people are unwilling or unable or just don't care enough." There is even a sense of frustration due to the fact that the computer might break down and send a message to the screen that is totally foreign to the editor. As Mr.
Monigold says, "sometimes it is not even English, because a screen comes up and it will go 'non-linear bit-map version.' What does that mean?" With the non-linear system, most of the time they simply do not know what has gone wrong. On the contrary, with most linear systems they were confident that at least they would know what the problem was. They could determine whether the machine was fixable or a specialist was required. The editors were unable to go into the technical details of the problems they encountered, beyond describing them as lockups, bugs or breakdowns. When such problems arise and they are unable to fix them, they use a customer service line from the manufacturer which helps them get around the problems.

When asked directly what the disadvantages of the system are, the editors' responses varied, based on the type of editing they do, off-line or on-line. Both the editors at Harvey's Place agreed that they did not need any more improvements on the system. As Mr. Monigold explains, "we can actually do more in the system now than anybody ever does here. We just don't finish (don't do on-line)." According to Mr. Luttermoser, the more features you have, the more they can detract the user from the actual creative process. He does not want to be caught up in the technical as opposed to the creative. The other editors, who finish their projects on the systems, are looking forward to improvements in the following areas:

- Title tool
- Image resolution
- Chroma key
- More DVEs
- Reduction in rendering time, or real-time effects
- Audio mixing, equalization and editing tools
- Storage capabilities

It is important to note that all the editors interviewed had difficulties coming up with disadvantages. Much of the previous list has been extracted from their responses to other questions. These are some of the comments and reactions to the question:

- "I don't really see a disadvantage.
If there is any disadvantage it is so minute compared with the advantages, that it is not really an issue for me." (Mr. Mulliner)
- "The advantages of the computer outweigh the disadvantages." (Mr. Luttermoser)
- "I don't see any disadvantages. I really don't." (Mr. Monigold)
- "What it lacks, it more than makes up for in editing flexibility and speed. It is truly a marvelous tool for an editor to have." (Mr. Jackson)
- "... does not make everything perfect, but it will." (Mr. Koster)

The major reason for this attitude may be the fact that they are certain that the disadvantages are likely to be addressed in the near future. They feel the trade-off between these disadvantages and the benefits of editing in a non-linear, random access environment makes it worth dealing with these issues.

CHAPTER 4

NON-LINEAR EDITING SYSTEMS: EDITORS AND INDUSTRY

4.0 INTRODUCTION

Besides studying the creative and technical aspects of non-linear systems, to complete this research we must explore the role of the editor and the impact of the systems on the industry. The content of this chapter comes directly from the editors interviewed for this research. It is a recollection of their experiences, ideas and opinions.

4.1 EDITORS' PERFORMANCE

The editors agreed that almost anybody can be trained to operate the system. It is relatively easy to learn to perform audio and video edits, execute DVEs, and create titles. But this does not make a good editor out of most individuals. Somebody who is already an editor can certainly become better and improve the quality of their work by using better tools. In the same way that a good painter using better brushes and pigments makes better paintings, a good editor with a non-linear system makes better videos. However, a non-linear system does not make anyone more or less creative than they really are. All the editors agreed that to make video communication effective, the person who runs the machine should be an experienced editor. Mr.
Mulliner said, "you can buy a scalpel and cut something up, but that does not make you a surgeon." All the editors agreed that non-linear systems do not make anyone an editor, but they give editors more choices, more freedom and fewer constraints. According to Mr. Mulliner, "a good editor can edit with a razor blade, on-line, off-line, it does not matter. He/she is still telling a story. He/she is still putting the pieces together, with a good art to the continuity and to the creativity that is required."

A good editor does not have to think about the process: it is ingrained. As an example, Mr. Monigold shared one of his experiences as a young editor. During his first editing projects his supervisor came regularly to check on his progress and used to ask him why he chose to do certain edits. His answers were full of explanations about the background, the movements, the colors. On one occasion he just answered that he did not know, it just felt right, and the supervisor told him, "now you are an editor." With experience the editors develop criteria that cannot be taught in a class or a seminar. He explained that he just does what feels right; the whole decision process is almost inbred. He does not think about it. It is that extra sense, which no editor can define, that makes a real editor. For Mr. Koster, the values of editing are what make an editor: "the real value is in the communication skills, the concepts, basic things like aesthetics and understanding and having an eye for aesthetics. The experience in editing teaches you how to communicate concepts, how to get the message of the program across... being able to communicate an idea visually."

4.1.1 Learning the System

Training in non-linear editing systems is almost a continuous process. With frequent releases of new systems and new software, the learning curve is steeper and longer than it was with earlier models, according to Mr. Mulliner.
The newer versions of the software are more complex than they were four years ago. Mr. Luttermoser explains: "When we first got the machines it was simple because there was only one way of doing things. Now, with the latest generation of software there are so many ways to go about the same task that people who are just learning it can get confused. They watch somebody work the machine one way, and the next day they sit and watch another editor who does the same thing but goes a different direction to do it... It can be a little overwhelming to get beyond the basics of cutting scenes together."

The training process for the Avid systems has several characteristics. Mr. Koster identified two aspects of the learning process: the navigation of the system and the editing procedures. The navigation of the system refers to the use of the computer and how to operate the machine. The editing procedures refer to the use of the system as a creative tool; they deal with the ability to use the system to effectively perform edits and create effects.

None of the editors seemed to have had major problems while learning the system. Their first contact with the system was through the tutorial program which comes with the system or through a brief introduction given by Avid personnel. The learning process consisted mostly of how to operate the system (computer) to perform the desired tasks. It took them two to three days to feel comfortable and edit for a client. It is important to note that all the editors interviewed are experienced editors who mastered the creative as well as the technical aspects of linear editing before coming in contact with non-linear systems.

Mr. Jackson, who is a certified Avid trainer, stresses the importance of studying and learning the system. He recommends talking to other editors to find out how they are using the system because "there are always other ways of doing things, sometimes much simpler. It's like proofreading your own stuff.
You will never improve it; it's always going to be only as good as you are. You have to incorporate what other people have found as well."

The most common mistake of beginners is resistance to letting go of the habits of linear editing. Mr. Jackson explains that most new editors try to approach the system thinking linearly. They slave over the first edits and forget that they can continue editing and come back later with fresh eyes.

4.2 THE POST-PRODUCTION INDUSTRY

The impact of non-linear editing systems on the post-production industry is not easy to assess. It would require formal research to obtain reliable results. However, the editors interviewed already have their own ideas, opinions and comments on how the industry is being changed by the use of non-linear systems.

4.2.1 Industry Changes

Non-linear technology has opened a door for lower-end productions. The prices of the editing tools, as well as of editing time at post-production facilities, are lower for non-linear systems. Productions with lower budgets can get professional post-production services, with an image quality that is acceptable for their projects, at an affordable cost. Many individuals and organizations that considered video an inaccessible communication tool because of prohibitive post-production costs can now use it.

Another important aspect was brought to attention by Mr. Koster. As costs go down, there has also been an increase in the channels of distribution, such as cable. A market expansion will provide new outlets for smaller productions. Consequently, there will be more work for editors as long as they keep cutting with the new, economical technologies. Mr. Monigold understands that the impact of non-linear editing systems on the high end of video production has been minimal.
He explained that "costs (of editing) have stayed pretty much the same, but the big ticket item in any type of broadcast is production, the actors, crew, permits, set construction." He does not think the high-end market is going to open up a lot more because "editing is a very small part of it (production)." Economics are changing more than the creative aspects of the business. In Mr. Monigold's opinion, creativity will not be affected by the merging of on-line and off-line editing in the non-linear systems. He says that what is changing is who does what. Their suppliers of outside services, such as on-line, are in many cases their competition, because they have Avids to provide off-line services. On the other hand, they are also competing with their suppliers by finishing some clients' projects on the Avid. To Mr. Mulliner, the introduction of a new resolution comparable to the CCIR 601 video signal (see note 23) will transform the on-line suites into high-end rooms for animation, painting, DVEs, compositing and other complex effects. For Mr. Luttermoser the future is less clear. He said it is hard to predict what the future changes will be when we look back at the changes of the past four years. The technology went from these systems not existing to being able to output broadcast-quality images. "How does that equate over the same period of time in the future?"

4.2.2 Revolutionary Tool

Some editors consider non-linear editing systems a revolutionary tool, but for others it is just another tool. For Mr. Mulliner, what is revolutionary is the freedom and flexibility it has given to the creative process; the system is just a tool. However, for Mr. Koster it is a revolutionary tool that will completely change the industry. It will expand the market by giving smaller institutions the opportunity to do much more visually interesting things, such as slow motion and reverse motion (without paying for expensive on-line services). To Mr.
Luttermoser it is just another tool, because the system is so popular now that "the people who do not use it are probably the odd ones." Mr. Monigold pointed out that "there were a lot of revolutions in the last 5 to 10 year period that did change the industry. Non-linear is a portion of it, but a small portion. No one thing revolutionized the industry, but all of it feels like a revolution because things are certainly changed."

23. CCIR 601 is a digital video standard developed and recommended by the International Radio Consultative Committee. This standard for the digitization of color video signals uses luminance and two color difference signals.

4.2.3 Productivity

None of the facilities where the interviewed editors work has done a formal study to assess productivity changes. Only Whirlpool was planning to assess changes in its productivity. Yet the overall impression of the editors is that productivity has in fact increased. They pointed out various reasons, the most common being that editing on non-linear systems is faster. Mr. Monigold and Mr. Luttermoser were quick to point out that they have the same workload, but their work day finishes approximately three or four hours earlier. Mr. Mulliner has a different perception. He accepts that in a strict sense the process is faster, but the saved time is spent exploring ways to improve the program. The productivity in his company has increased because they are doing more work and have more projects, to the point that they ordered a second system. For Mr. Koster the increase in productivity resides in "the multiple ways to approach a program. It has not saved me a lot of time yet, but it has allowed me to do better work."

CHAPTER 5

CONCLUSION

5.0 INTRODUCTION

This chapter culminates the research. It begins with a summary recapitulating the purpose and objectives of the thesis, followed by the results.
The research results are condensed in the listing of the differences between linear and non-linear editing systems. The conclusion reviews and evaluates the results of this research. Finally, the chapter closes with recommendations for further research on non-linear editing systems.

5.1 SUMMARY

The purpose of this research was to describe and compare linear and non-linear editing systems. It explored and explained the changes brought by non-linear editing systems to the post-production process. The idea was to go beyond the technical information published by the trade press and see if non-linear systems are merely another technological advancement or a step in a new direction for the editing process. The comparison between systems was built around two major areas of the editing process: the creative and technical aspects. The framework of the research was built through an extensive literature review. The information gathered was contrasted with the experiences of five non-linear editors. They are the primary sources of this research. Given the time and economic limitations of this research, it must be remembered that even though the basic principles of non-linear editing are similar across systems, the editors interviewed are Avid users within the state of Michigan.

5.2 COMPARISON: DIFFERENCES BETWEEN LINEAR AND NON-LINEAR EDITING SYSTEMS

This research used the descriptive and comparative methodology to study the linear and non-linear editing systems. Both systems have been described in Chapter 2 and Chapter 3 respectively. The description of both systems concentrates on the creative and technical aspects of post-production. Using the same descriptive method for both chapters produced a comparative format, uncovering significant differences between the systems. As the result of the comparison, Table 1 lists the differences found between linear and non-linear systems.
They are classified as differences related to the technical aspects of post-production and those more related to the creative aspects. However, this classification is not exclusive: a creative difference may have technical implications and vice versa. The technical and creative differences are interdependent. The listed differences could also be classified as advantages and disadvantages; however, that classification may not always hold true. For example, the video quality of a particular system may be a disadvantage for an on-line editor and perfectly suitable for an off-line editor. The advantage-disadvantage classification was discarded because it varies depending on the system, its intended use and the user (editor).

Table 1
Differences Between Linear and Non-Linear Editing Systems

Technical Aspects

1. Video Signal
   Non-linear: Digital.
   Linear: Analog.

2. Integration
   Non-linear: Integrates all the black boxes of the editing suite into a computer system.
   Linear: Uses multiple black boxes, each dedicated to a single task.

3. Multiple Uses
   Non-linear: The same system is used to log, produce EDLs, edit off-line and edit on-line, increasing the productivity of the equipment.
   Linear: Generally the systems are dedicated to off-line or to on-line editing.

4. User Interface
   Non-linear: The interfaces recreate traditional devices and controls to facilitate the learning process and the transition to these computer-based systems. The interfaces are closer to the user than to the machine.
   Linear: The interfaces are closer to the machine than to the user.

5. Time Code
   Non-linear: The use of time code is limited to the digitization process.
   Linear: Time code is used during the whole post-production process.

6. VTRs
   Non-linear: Uses only one VTR, to play back the footage for digitization and to record materials from the system.
   Linear: Uses one VTR to record and one for each tape source (A/B/C rolls).

Table 1 (continued)

7. No B Rolls
   Non-linear: The concept of A/B/C rolls is eliminated.
   Clips are all from the same source (the bins).
   Linear: Uses multiple videotape sources.

8. Videotape
   Non-linear: Used only during digitization.
   Linear: Used during the whole post-production, risking physical damage to the tapes.

9. Real-Time Video
   Non-linear: Most systems work with real-time video (30 frames/60 fields per second). Others use 30 frames/30 fields per second, and sometimes even lower quality video.
   Linear: All systems work with real-time video.

10. Generation Loss
   Non-linear: The quality of the video output is the same as that of the video input (after compression), because the video signal is converted to digital format.
   Linear: Every time the video signal is copied, the copy loses a generation and its quality is reduced.

11. Video Quality
   Non-linear: Many systems do not offer image quality suitable for high-end projects.
   Linear: The system's tape format defines the video quality.

12. Storage
   Non-linear: Even after compression, digital video requires massive amounts of storage. Storage has to be carefully managed and the amount of footage and other materials monitored.
   Linear: Storage is limited only by the amount of tapes available and the space to store them.

13. Audio
   Non-linear: Most systems offer 4 audio channels with CD quality, but limited mixing, equalization and audio editing tools.
   Linear: Audio capabilities are as good as the audio system accompanying the editing system.

14. Upgradeability
   Non-linear: Software upgrades keep the systems updated. Peripherals and hardware are easily added.
   Linear: Most systems are not upgradable.

Table 1 (continued)

15. Personnel
   Non-linear: Only one person is needed to operate the system. For complex post-productions it is convenient to have an assistant editor.

16. Maintenance
   Non-linear: Requires computer-specialized personnel. Maintenance contracts should be made and/or personnel should be retrained.
   Linear: Editors and video engineers are qualified to troubleshoot and maintain the equipment.

17. Multiple Video Standards
   Non-linear: The video signal can be exported in 525 and 625 lines of resolution. One system can serve multiple standards.
   Linear: Each system works with only one standard.

18.
Changing Technology
   Non-linear: Digital technology is rapidly changing. To keep up, the system must be regularly upgraded.
   Linear: The technology is established; it has been mainstream since the 1960s.

19. Software Dependence
   Non-linear: As the industry consolidates, many developers may be absorbed by bigger companies or simply disappear, leaving the users without backup.
   Linear: Systems are not dependent on software.

20. Networking
   Non-linear: The system's digital video signal can be sent over digital networks. Any system connected to an ISDN network can send video to another that is also connected.
   Linear: No networking capabilities.

21. The Viewer
   Non-linear: The technology is invisible to the viewer.
   Linear: The technology is invisible to the viewer.

Table 1 (continued)

Creative Aspects

1. Editing Order
   Non-linear: Non-linear. The editor is free to work in any desired order without affecting the edits before or after. There is no need to slave over a particular edit.
   Linear: Linear. Each edit should be final before moving to the next. The editor needs to anticipate the next edit creatively and technically.

2. Off-Line and On-Line Editing
   Non-linear: Off-line and on-line editing overlap, and even merge, when using the system.
   Linear: They are different stages of the post-production process.

3. Editing Language
   Non-linear: The conventional meanings remained unchanged.
   Linear: The editing language was adapted from film and developed through the use of linear systems.

4. Transparent Technology
   Non-linear: While editing, the editor works only with image and sound.
   Linear: Besides image and sound, the editor works with time code, routing, and other technicalities.

5. Time Lines
   Non-linear: Time lines are a graphic representation of the whole project.
   Linear: There is no graphic representation of the project.

6. Random Access
   Non-linear: All materials are accessible at the same time and almost immediately.
   Linear: Takes are located by fast-forwarding or rewinding the tape.

7. Digitization
   Non-linear: Footage has to be located and then digitized.
   Linear: The footage has to be located and simply played back.

8.
Execution Time
   Non-linear: The editor's vision can be realized more clearly because the execution time of regular edits has been reduced to seconds.
   Linear: The editor has to search through the tape, set up the edit and then wait the length of the take being edited for it to be recorded.

9. Changes
   Non-linear: Changes can be made at any time during post-production without affecting the edited sections before or after the location where the changes are made.
   Linear: Making changes in an edited sequence means destroying what has been done and replacing it with the new take(s).

10. Multiple Versions
   Non-linear: The time it takes to incorporate changes to a video, save it and export it is the time it takes to do a new version, all with the same quality.
   Linear: Multiple versions require editing the whole video again, or losing a generation by copying and editing the differences.

11. Multiple Masters
   Non-linear: The system can create a master directly from an EDL. All copies of the master have the same quality.
   Linear: Masters need to be created individually, piece by piece, or copied, losing a generation.

12. Rendering Time
   Non-linear: Effects and transitions need to be rendered, taking up computer time.
   Linear: Besides the time needed to set up the effect, the routing and the previewing, there is no rendering time.

5.3 CONCLUSION

Non-linear technology is having a serious impact on post-production. It has become a powerful instrument in the hands of those who want to use video to communicate. Users have been given the alternative of finishing their own projects without requesting the services of post-production facilities.

The essence of the editing process has not been changed. Editing is still the process of putting the pieces together to tell a story. The editors still do an off-line, where the creative decisions are made to shape the program's concept. That is when the editors lay down the takes on the time-line as a rough-cut.
When they go back to give the finishing touches to the edits, they are simply doing an on-line. What non-linear technology has changed is that the two stages overlap and merge. The system's random access and non-linear capabilities allow the editors to work on any section, at any level, at any time. The editor's creativity and ability to make creative decisions remained unchanged. Talking about creativity, Mr. Monigold says, "even when we have non-linear systems reach the highest quality standards, we will do everything the same way we always have." But now editors have the opportunity to explore and execute their ideas faster, without adding great expenses in time and cost. Creativity has been enhanced by the freedom to explore ideas. The ideas were always there, but trying them out was cumbersome, time consuming and inconvenient.

The editing process has become simpler and less technical. Editors have been given a friendly tool, with an interface that is closer to the editor than to the machine. Only during the preparation for editing do the editors have to deal with the technology. After detailed logging, the editor decides on the image quality and compression level to be used, taking into account the storage available. Once these technical issues are resolved, it is just a matter of digitizing and organizing the clips (the editing materials) before editing. Once the editors sit down to edit, they work only with images and sound and how to piece them together. All their attention is concentrated on how to communicate effectively. Non-linear systems are facilitators of the creative process.

The technology does, however, impose technical restrictions on the editing process. Its major drawback is that the image quality is mostly below broadcast level. Users have been given the tools to finish their video projects, with a trade-off in quality.
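The storage pressure behind these compression decisions can be made concrete with a back-of-the-envelope calculation. The figures below are illustrative assumptions of the author of this sketch (roughly CCIR 601-style 8-bit 4:2:2 video, 720 x 486 pixels at 30 frames per second), not the specifications of any particular system:

```python
# Illustrative storage arithmetic for digitized video. The parameters are
# assumptions (CCIR 601-style 8-bit 4:2:2 component video, NTSC-like raster),
# not measured figures from any system discussed in this research.

width, height = 720, 486        # active picture, NTSC-style raster
bytes_per_pixel = 2             # 8-bit 4:2:2 averages 2 bytes per pixel
fps = 30

frame_bytes = width * height * bytes_per_pixel   # bytes in one frame
rate = frame_bytes * fps                         # bytes per second

for ratio in (1, 10, 20):       # compression ratios to compare
    per_sec = rate / ratio
    minutes_per_gb = (10**9 / per_sec) / 60
    print(f"{ratio:>2}:1  {per_sec / 1e6:5.1f} MB/s  "
          f"{minutes_per_gb:5.1f} min per GB")
```

Uncompressed, a gigabyte of disk holds under a minute of such footage; even at 20:1 compression it holds only about a quarter of an hour, which is why the editors had to log carefully and digitize selectively.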
Editors deal with image quality, audio limitations, compression and storage as issues that will soon be resolved by new technological advancements. To summarize the results of this research, the findings reveal non-linear systems as a welcomed technology that gives editors more creative freedom by reducing their involvement with technical issues. However, we must remember that this is a technology in its infancy, and any new breakthrough might change the editors' opinion of these systems. Nonetheless, the most important finding might be the editors' unanimous statement: "I don't want to go back to edit in a linear system."

5.4 RECOMMENDATIONS

Given the fact that non-linear systems are an alternative technology being used by serious post-production professionals, they should be carefully studied. There are many areas that are unexplored and crucial to the understanding of the effects of non-linear editing systems on the post-production process. The first step should be quantitative research to define the population. We need to know how many systems are currently in use in the US, who the owners are and how the systems are being used. A growing population will justify further research on these systems, their users and their effects on the industry. This information will provide an understanding of the systems' penetration and position their users within the post-production industry. Even though in Michigan the number of users and supporters seems to be growing, that may not be the case in other parts of the country. With a better knowledge of the user population, the research may move on to investigate whether the systems save time, increase productivity levels, affect the industry and encourage creativity, among many other topics.

APPENDIX A

NON-BROADCAST VIDEO USERS

TABLE A
Non-Broadcast Video Users

Type of Organization          1980        1987        1995
Business/Industry           13,500      23,000      40,000
Education                    5,000      11,000      19,000
Medical                      5,000       8,000      13,000
Government                   3,000       5,000       7,000
Non-profit                     800       2,000       4,000

Source: Knowledge Industry Publications, Inc. (estimates)
(Figures from 1980 from W)

TABLE B
Non-Broadcast Video Expenditures By User Segment (1980, 1987 and 1995)
(In billions)

Segment (% of Total)          1980      1987      1995
Business/Industry (60%)      $0.66     $3.30     $7.50
Education (15%)               0.17      0.82      1.90
Government (12%)              0.13      0.66      1.50
Medicine (10%)                0.11      0.55      1.30
Non-profit (3%)               0.03      0.17      0.40
TOTAL                        $1.1      $5.5      $12.6

Note: Discrepancies in totals are due to rounding.
Source: Knowledge Industry Publications, Inc. (estimates)
(Figures from 1980 from W)

TABLE C
Non-Broadcast Video Expenditures For 1980, 1987 and 1995
(In billions)

Type of Expenditure               1980      1987      1995
Equipment                         $0.4      $2.0      $4.6
Programming (total)(1)            0.70      3.50      8.00
  In-house production             0.55      2.80      6.30
  Outside production services     0.09      0.40      1.00
  Packaged programming            0.06      0.30      0.70
TOTAL                             $1.1      $5.5      $12.6

1. Includes salaries, overhead, out-of-pocket expenses
Source: Knowledge Industry Publications, Inc. (estimates)
(Figures from 1980 from M)

APPENDIX B

INTERVIEW QUESTIONNAIRE

I. BACKGROUND INFORMATION

About the interviewee:
1. Name, job position, job description and relevant experience (linear & non-linear editing experience)

About the system:
1. System model and platform.
2. How long have you been using the product?
3. Previous system used (why the change, where is the system being used now).
4. Other pieces of equipment being used with the system. Has the system been upgraded in any way?
5. What compression ratios and formats are supported by the system?
6. What type of storage is used? How much storage does the system have?
7. Type of programs edited on the system (news, training, ...) and their approximate duration.
8. Medium of product distribution (videotape, computer network, disks, cable, ...)
9.
Is the system used to provide services, or for internal use only?
10. While using the system, have the services of a post-production house been required? Which services?

II. TECHNICAL ASPECTS

1. How many hours of training are required before an editor is able to use the system successfully? Please describe the training process.
2. What are the benefits of having a graphical user interface and working with picture icons instead of time-code numbers?
Probes:
* Does it make editing feel less technical?
* Does it facilitate creative decision making?
* Does it make the machinery more transparent to the editor?
3. One of the big issues in non-linear editing has been compression algorithms. How do they affect the editing process and the quality of the projects?
Probes:
* Do you use different compression ratios?
* How do you decide on an adequate compression ratio? What are the tradeoffs of increasing or decreasing the compression ratio?
* Have you encountered problems with artifacts or image blemishes?
4. When editing with a linear system, the editor can have all the raw footage accessible. The only limitation is the time it takes to change tapes and find the desired shots. Under ideal circumstances, when using non-linear systems, all the footage should be digitized, because it is hard to know which shots will be used to edit the program. But digital storage capabilities are limited. How do you manage the storage space to overcome its limitations? How have the storage limitations affected the editing process?
5. Have you come across any difficulties regarding the maintenance and service of the system? Has the system ever failed or crashed?

III. CREATIVE ASPECTS

1. Explain the steps or stages of non-linear editing, from the time the videotape recording is completed to the moment when the project is ready for distribution. What are the changes to the traditional linear editing process?
Probes:
* Preparation for editing, logging, digitization, clip management, off-line and on-line editing.
2. How do you prepare and perform edits, transitions, effects and titles on the non-linear system? How is it different from the process in a linear system?
3. Describe the audio capabilities of the system and how they compare with a linear system.
Probes:
* How many audio tracks are provided?
* What are the options for manipulating audio?
* Are you satisfied with the audio capabilities and quality? Why?
* How does compression affect audio?
4. Some editors compare non-linear editing to trying to see the forest and the trees at the same time. The editor has the option of making the creative decisions (order, sequence, length, etc.) and doing the finishing details of transitions, effects and graphics at the same time. How do you combine the use of the off-line and on-line capabilities? Do you rough out the whole program first?
Probes:
* Do you ever feel overwhelmed by so many options?
* Are the many on-line options a distraction from developing the concept of the program?
* Is there a tendency to overuse transitions, effects and graphics, or do you try to keep it simple until the on-line editing?
5. I have read that non-linear editing systems force editors to think, work and schedule their time differently. Can you explain what this expression means?
Probes:
* What is different from the way editors thought, worked and scheduled their time when they were using linear systems?
* How has non-linear editing changed the creative process?
6. Please describe what you consider the main advantages and disadvantages of non-linear editing systems.
Probes:
* Advantages: random immediate access, easy changes, multiple versions, time/cost savings, performing off-line and on-line simultaneously.
* What are the creative and technological barriers? Storage, compression?
7. What new tools or changes would you like to see in future generations of non-linear systems?
Probes:
* Which tools from traditional systems do you miss?
8. Originally, off-line editing was designed to cut the cost of expensive on-line editing time. As technology improves and its cost is reduced, it seems as if the line between off-line and on-line editing is beginning to blur.
Probes:
* What do you think about this?
* Did you consider off-line and on-line to be totally separate stages? Why?
9. What is the most common mistake of new users?
10. How do you think the systems affect editors' performance?
Probes:
* Do the systems help make better editors, or do they have no effect at all?
* Can the system help improve the quality of work?
* How do the systems affect the artistic side of editing?
* From an editor's point of view, what do you think about the systems?
11. What kind of program benefits the most from the system, if any? Why?
12. There are two basic editing styles. One tries to establish continuity between shots, making a series of shots look like a single continuous unit. The other is the montage, which is more concerned with pace and rhythm, and uses many shots that are not necessarily related to communicate an idea.
* How do you edit to create continuity using a non-linear system?
* How is editing a montage on a linear system different from editing it on a non-linear system?
13. How do non-linear systems manipulate the pace and rhythm of programs compared with linear systems?

IV. INDUSTRY

1. After using the systems, has productivity increased? Specify measures of productivity.
2. Have the types of programs been diversified?
3. Has the quantity of projects developed increased?
4. Has the cost of post-production been affected by using the systems?
5. Is it cost-effective to adopt the technology?
6. In an article I read this line: "...the voices of individual expression are challenging the once monolithic domain of prime-time television." Do you think that the post-production independence non-linear systems give to producers will encourage new types of production that may compete with the established production industry?

V. CLOSING QUESTIONS

1. Do you ever want to go back to editing on a traditional system?
2. Do you consider the non-linear system just another editing tool, or a revolutionary technology? Why?
3. Are there any secrets about non-linear editing that you would like to share?
4. What roles may non-linear systems have in the information superhighway?

APPENDIX C
INTERVIEWEES' INFORMATION

Gregg Jackson
Business Communication Services, Whirlpool Corp.
2000 M-63 North
Benton Harbor, MI 49022
Phone: 312/989-7412 or 616/923-5006
Fax: 616/923-5077

Steven J. Koster
Video Productions, Calvin College
3201 Burton Street, SE
Grand Rapids, MI 49546
Phone: 616/957-6335
Fax: 616/957-8551

Eric Luttermoser
Harvey's Place, Inc.
2330 Telegraph Road
Southfield, MI 48034
Phone: 313/353-3030
Fax: 313/353-5598

Jim Monigold
Harvey's Place, Inc.
2330 Telegraph Road
Southfield, MI 48034
Phone: 313/353-3030
Fax: 313/353-5598

Flip Mulliner
Postworks Inc.
2660 Horizon SE, Suite E
Grand Rapids, MI 49546
Phone: 616/940-4100
Fax: 616/940-1092

APPENDIX D
MANUFACTURERS AND DISTRIBUTORS

Avid
One Metropolitan Park West
Tewksbury, MA 01876

Cruse Communications
4903 Dawn Ave.
East Lansing, MI 48823

Data Translation Multimedia Group
100 Locke Drive
Marlboro, MA 01752-1192

Fast Electronics
1 Twin Dolphin Dr.
Redwood City, CA 94065

ImMix
P.O. Box 2980
Grass Valley, CA 95945

Matrox
1055 St. Regis Blvd.
Dorval, Quebec, Canada H9P 2T4

Quantel
85 Old Kings Highway N.
Darien, CT 06820

Touchvision Systems
1800 Winnemac Ave.
Chicago, IL 60640

LIST OF REFERENCES

Magazines

Barrett, David. "Digital video is here, ready or not." AV Video. March 1994, p. 170.

Brandt,
James. "[title illegible]." GMV, pp. 40-41.

Burger, Jeff. "[title illegible]." Broadcast Engineering. February 1994, pp. 20-28.

Chan, Curtis (a). "[title illegible]." Broadcast Engineering. September 1993, pp. 26-34.

Chan, Curtis (b). "[title illegible]." Broadcast Engineering. July 1993, pp. 44-53.

Chan, Curtis (c). "[title illegible]." Broadcast Engineering. August 1992, pp. 62-66.

Chan, Curtis (d). "[title illegible]." Broadcast Engineering. September 1992, pp. 48-68.

Dick, Brad. "[title illegible]." Broadcast Engineering. January 1994, pp. 62-70.

Dwyer, Ed. "[title illegible]." Broadcast Engineering. September 1992, pp. 26-33.

Ellis, Ken. "[title illegible]." Broadcast Engineering. October 1993, pp. 66-72.

Epstein, Steve (a). "[title illegible]." Video Systems. March 1994, pp. 142-143.

Epstein, Steve (b). "New solutions for Post." Video Systems. March 1994, pp. 30-36.

Flick, Randy. "[title illegible]." Videography. June 1994, pp. 16-17.

Grotticelli, Michael. "[title illegible]." Videography. August 1994, pp. 76-86.

Grunin, Lori. "[title illegible]." PC Magazine. May 11, 1993, pp. 289-333.

Guess, Michael. "[title illegible]." Videography. April 1994, pp. 76-78.

Kurz, Phil. "[title illegible]." GMV. June 1994, pp. 42-45.

Leonard, Chris. "[title illegible]." Broadcast Engineering. March 1993, pp. 38-168.

Lenttinen, Rick. "[title illegible]." AV Video. March 1994, pp. 25-126.

Levy, Don. "[title illegible]." Videography. April 1994, pp. 123-124.

Lindstrom, Bob. "[title illegible]." BYTE. September 1993, pp. 153-158.

Lookabaugh, Tom. "[title illegible]." Broadcast Engineering. February 1993, pp. 26-36.

McConnell, Maureen. "[title illegible]." Broadcast Engineering. September 1993, pp. 36-43.

McKernan, Brian. "[title illegible]." Videography. April 1994, p. 14.

McMahon, Frank. "[title illegible]." [publication illegible]. September 1994, pp. 62-64.

Molinari, John. "[title illegible]." Videography. June 1994, pp. 107-108.

Ohanian, Thomas A. "Digital [illegible] on the desktop." Video Systems. October 1993, pp. 36-40.

Poole, Lon. "QuickTime's in motion." Macworld. September 1991, pp. 154-159.

Ransom, Tom (a). "Buying a desktop [illegible] system." Broadcast Engineering. February 1994, pp. 30-36.

Ransom, Tom (b). "[title illegible]." Broadcast Engineering. April 1994, pp. 54-57.
Schubin, Mark. "[title illegible]." Videography. August 1994, pp. 18-112.

Shames, Erica L. "[title illegible]." Videography. April 1994, pp. 32-34.

Strassmer, Norman H. "[title illegible]." Broadcast Engineering. September 1993, pp. 44-49.

Turner, Bob (a). "[title illegible]." Videography. April 1994, pp. 62-66.

Turner, Bob (b). "[title illegible]." Videography. August 1994, pp. 50-59.

Turner, Bob (c). "[title illegible]." Videography. June 1994, pp. 60-68.

Vesey, Jon. "[title illegible]." Videography. April 1994, pp. 16-17.

Walker, Patrick B. "[title illegible]." Broadcast Engineering. February 1994, pp. 54-60.

Wiedemann, Steve. "[title illegible]." Videography. August 1994, pp. 24-28.

Wine, James M. "[title illegible]." Videography. April 1994, pp. 28-60.

Yager, Tom. "Making the cut." BYTE. July 1992, pp. 143-154.

Books

Anderson, Gary H. "Video Editing and Post-Production: A Professional Guide." 2nd edition. Knowledge Industry Publications, Inc., White Plains, New York. 1988.

Gayeski, Diane M. "[title illegible]." Prentice-Hall, Inc., Englewood Cliffs, New Jersey. 1983.

Jack, Keith. "Video Demystified: A Handbook for the Digital Engineer." HighText Publications, Inc., Solana Beach, California. 1993.

Marlow, Eugene. "[title illegible]." Knowledge Industry Publications, Inc., White Plains, New York. 1992.

Ohanian, Thomas A. "Digital Nonlinear Editing: New Approaches to Editing Film and Video." Focal Press, Boston, Massachusetts. 1993.

Smith, David L. "[title illegible]." Wadsworth Publishing Company, Belmont, California. 1991.

Tereno Stokes, Judith. "[title illegible]." Knowledge Industry Publications, Inc., White Plains, New York. 1988.

Wells, Michael. "[title illegible]." Knowledge Industry Publications, Inc., White Plains, New York. 1989.

Wimmer, Roger D. and Joseph R. Dominick. "Mass Media Research: An Introduction." 3rd edition. Wadsworth Publishing Company, Belmont, California. 1991.

Wurtzel, Alan and Stephen R. Acker. "Television Production." 3rd edition. McGraw-Hill Book Company, New York, New York. 1989.

Zettl, Herbert. "Sight Sound Motion: Applied Media Aesthetics." 2nd edition. Wadsworth Publishing Company, Belmont, California. 1990.

Zettl, Herbert. "Television Production Handbook." 4th edition. Wadsworth Publishing Company, Belmont, California. 1984.