This is to certify that the thesis entitled COMPUTER-ASSISTED INSTRUCTION IN EAR-TRAINING AND ITS INTEGRATION INTO UNDERGRADUATE MUSIC PROGRAMS DURING THE 1998-99 ACADEMIC YEAR, presented by Douglas Raymond Spangler, has been accepted towards fulfillment of the requirements for the Master's degree in Music.

Major Professor

MSU is an Affirmative Action/Equal Opportunity Institution.

COMPUTER-ASSISTED INSTRUCTION IN EAR-TRAINING AND ITS INTEGRATION INTO UNDERGRADUATE MUSIC PROGRAMS DURING THE 1998-99 ACADEMIC YEAR

By
Douglas Raymond Spangler

A THESIS

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

MASTER OF MUSIC

Department of Music
1999

ABSTRACT

COMPUTER-ASSISTED INSTRUCTION IN EAR-TRAINING AND ITS INTEGRATION INTO UNDERGRADUATE MUSIC PROGRAMS DURING THE 1998-99 ACADEMIC YEAR

By Douglas Raymond Spangler

As computer use has become more widespread, with both better technology and lower prices, a growing number of undergraduate institutions are integrating ear-training CAI (computer-assisted instruction) into their music theory programs. New ear-training programs are becoming available, and many older programs are being updated to include more and better features. With more than thirty commercial ear-training programs currently available, music instructors face an increasingly daunting task when asked to choose software for undergraduate ear-training. This work identifies more than sixty ear-training CAI programs and reviews thirty programs using a two-page review form. It also provides results from a survey representing 209 undergraduate institutions and their integration of ear-training CAI during the 1998-99 academic year. The thesis research and software reviews were published on the World Wide Web at http://www.msu.edu/user/spangle9. This home page was also referenced on the Society for Music Theory web site.

This work is dedicated to my parents, Doug and Carol Spangler Jr., whose love and support made this project possible.

ACKNOWLEDGEMENTS

A project such as this would be incomplete without thanking those people who helped make it possible. The professors at Michigan State University have been a wonderful source of guidance and encouragement. Dr. Gordon Sly, my thesis adviser, offered many helpful suggestions regarding the researching and writing of a master's thesis. Dr. Bruce Taggart, Instructor of Music Theory and a member of my thesis committee, offered recommendations regarding the software review form and suggestions for evaluating the survey data. Professor Deborah Moriarty, Chair of the Keyboard Area, offered many insights into piano performance, music pedagogy, and ear-training CAI. Dr. Mark Sullivan, Chair of Composition, allowed me to work in the Computer Music Lab for nearly five years and offered many suggestions on how to improve the webpage version of the thesis. Dr. Cynthia Taggart, Associate Director for Graduate Studies, was a wonderful academic adviser and offered many useful suggestions regarding the survey. My sincere thanks to Ann Blombach, a professor at Ohio State University and author of the ear-training program MacGAMUT, for providing an e-mail list of instructors who used MacGAMUT software.
Finally, my sincerest thanks go to the many software developers and distributors who provided evaluation copies of various ear-training programs, as well as the more than 209 music instructors who responded to the e-mail survey.

TABLE OF CONTENTS

LIST OF TABLES
LIST OF ABBREVIATIONS
INTRODUCTION
    BEGINNINGS AND PURPOSE
CHAPTER 1: THE CAI SURVEY
    PURPOSE OF THE CAI SURVEY
    METHODS
    THE SAMPLE SIZE AND RESPONSE RATE
    DEMOGRAPHICS
    BIASES OF THE SURVEY METHOD
    SURVEY RESULTS
CHAPTER 2: THE SOFTWARE REVIEW FORM
    PURPOSE
    FORMAT
    GENERAL INFORMATION
    AVAILABLE EXERCISES
    INSTRUCTIONAL ISSUES
    SYSTEM REQUIREMENTS AND SETUP INFORMATION
    PRICING AND PRODUCT INFORMATION
CHAPTER 3: DIRECTIONS FOR CAI DEVELOPMENT
    SOURCES OF INFORMATION
    PERSONAL OBSERVATIONS
    SURVEY COMMENTS
    THE MOST COMMON COMMENTS
    CAI USEFULNESS
    STUDENT USAGE
    LAB ACCESSIBILITY AND PLATFORM AVAILABILITY
    EXERCISES AND SOUND QUALITY
    SCORING ISSUES AND GENERAL PROGRAM BUGS
    COURSE INTEGRATION AND CAI CUSTOMIZATION
CONCLUSIONS
APPENDIX A: THE SURVEY INSTRUMENT
APPENDIX B: ALPHABETICAL LISTING OF RESPONDING INSTITUTIONS
APPENDIX C: GEOGRAPHIC DISTRIBUTION OF RESPONSES
APPENDIX D: LISTING OF PROGRAMS USED IN RESPONDING INSTITUTIONS
APPENDIX E: THIRTY-THREE PROGRAMS NOT REVIEWED
APPENDIX F: SOFTWARE REVIEW FORM
APPENDIX G: REVIEWS OF THIRTY EAR-TRAINING CAI PROGRAMS
    TABLE OF CONTENTS FOR THIRTY PROGRAM REVIEWS

LIST OF TABLES

TABLE 1   SIZE DISTRIBUTION OF INSTITUTIONS
TABLE 2   PERCENTAGE OF INSTITUTIONS USING CAI
TABLE 3   NUMBER OF PROGRAMS USED
TABLE 4   PLATFORMS USED FOR CAI
TABLE 5   PERCENTAGES OF GRADE BASED ON CAI USE
TABLE 6   METHODS OF INTEGRATING CAI
TABLE 7   ACCESS TO CAI SOFTWARE
TABLE 8   NUMBER OF COMPUTERS AVAILABLE FOR CAI
TABLE 9   INTERNET ACCESSIBILITY OF COMPUTER LABS
TABLE 10  SOFTWARE AT SCHOOLS USING ONLY ONE PROGRAM
TABLE 11  PROGRAMS USED FOR 10% OR MORE OF GRADE
TABLE 12  INSTRUCTOR RATINGS OF VARIOUS CAI SOFTWARE
TABLE 13  STUDENT RATINGS OF CAI HELPFULNESS
TABLE 14  EXCERPT FROM A BLANK REVIEW FORM
TABLE 15  EXCERPT FROM A COMPLETED REVIEW FORM
TABLE 16  GENERAL CATEGORIES OF COMMENTS
TABLE 17  "TOP TEN" COMMENTS
TABLE 18  CAI USEFULNESS
TABLE 19  STUDENT USAGE
TABLE 20  LAB ACCESSIBILITY AND PLATFORM AVAILABILITY
TABLE 21  EXERCISES AND SOUND QUALITY
TABLE 22  SCORING ISSUES AND GENERAL PROGRAM BUGS
TABLE 23  COURSE INTEGRATION AND CAI CUSTOMIZATION

LIST OF ABBREVIATIONS

ATMI             The Association for Technology in Music Instruction.
CAI              Computer-assisted instruction.
CMS              College Music Society.
K                Kilobytes.
LAN              Local Area Network.
MB               Megabytes.
MG + SMT/ATMI    The combination of all survey responses.
MG list          A list of 103 e-mail addresses of instructors using MacGAMUT ear-training software. It was provided by Ann Blombach, the developer of MacGAMUT.
MG sample        The seventy survey responses generated from the MG list.
MHz              Megahertz.
MIDI             Musical Instrument Digital Interface.
MSU              Michigan State University.
O/S              Operating System.
RAM              Random Access Memory.
SMT              The Society for Music Theory.
SMT/ATMI list    A combination of two music e-mail lists representing an estimated 550 undergraduate institutions.
SMT/ATMI sample  The 139 responses generated from the SMT/ATMI list.
URL              Uniform Resource Locator.

INTRODUCTION

BEGINNINGS AND PURPOSE

This thesis had its beginnings in the Michigan State University School of Music computer labs. The author worked from September 1995 to May 1999 as a lab monitor in the Computer Music Lab, a public lab devoted to programs for music sequencing, notation, sound editing, programming, and ear-training. The author also worked in the MSU ear-training lab from its opening in September 1997 until December 1998. This close contact with ear-training software sparked an interest in the subject and led to the decision to do a master's thesis in Music Theory on the current state of ear-training software.

The project began as a study of ear-training software used in Big Ten schools. As countless web searches were done to discover ear-training programs, the topic appeared to be a chaotic field of information ripe for research. Ear-training programs were discovered on an almost weekly basis, yet very little literature on the current use of CAI was found. This thesis attempts to bring some order to the field of ear-training CAI by addressing which programs are currently available and how they are used in undergraduate institutions. It provides reviews of thirty ear-training programs and gives results from a survey representing 209 undergraduate institutions.
It is hoped that this work will make future research into ear-training CAI more profitable and serve as a reference for music instructors seeking to integrate ear-training CAI into their classes.

CHAPTER 1
THE CAI SURVEY

PURPOSE OF THE CAI SURVEY

The survey was designed to provide general information about how ear-training CAI software was integrated into undergraduate music theory instruction during the 1998-99 academic year. The questions asked pertained to which programs were used, how CAI use was integrated, how CAI was graded, and how the instructor rated the software.

METHODS

The method chosen was a convenience sample using a ten-question e-mail survey that required approximately four minutes to complete. Eight of the ten questions were multiple choice, and one question (regarding the respondent's rating of the software) was optional. In addition, optional comments were requested at the end of the survey.

After pretests of the survey instrument were completed in the fall of 1998, it was decided to use three e-mail lists to obtain a sampling of undergraduate institutions. Two of the lists chosen were the SMT (Society for Music Theory) list and the ATMI (Association for Technology in Music Instruction) list. The third source, referred to as the MG sample, was a list of e-mail addresses graciously provided by Ann Blombach, a professor at Ohio State University and author of the ear-training software MacGAMUT. The SMT list contained 853 addresses, the ATMI list contained 282 addresses, and the MG list contained 103 addresses.

The instrument, which is shown in Appendix A, was sent to the SMT and ATMI lists on February 2, 1999 and again on February 12, 1999. It was sent to the MG sample on February 15, 1999, February 23, 1999, and finally on March 6, 1999. The SMT and ATMI samples were intended to provide a random sampling of undergraduate institutions; they generated 134 responses. The MG sample was intended to provide a closer look at institutions using MacGAMUT software; it generated seventy responses. Five additional responses arose from contact with various people while research for the thesis was being conducted; these included software developers, instructors who were consulted via e-mail, and responses to a copy of the survey posted on my webpage between February 2, 1999 and March 15, 1999. These five responses were counted towards the SMT/ATMI sample, bringing the number for the SMT/ATMI sample up to 139 responses.

THE SAMPLE SIZE AND RESPONSE RATE

The actual sample size represented by the SMT and ATMI lists was difficult to determine. Membership lists of the SMT and ATMI lists were obtained on March 4, 1999. The SMT list had 847 subscribers, with 843 of these listed as unconcealed e-mail addresses. The ATMI list had 282 subscribers. Although this indicates a population of about 1125 subscribers, the number of institutions represented in the sample is considerably lower, possibly as low as 450 undergraduate institutions.

The 1125 available e-mail addresses were combined into a database to determine a closer approximation of the sample size. This number was reduced to 1066 addresses simply by discounting for duplication between the two lists. It was further reduced to 1030 by discounting thirty-six addresses that indicated publishers or news organizations. Often, there were ten or more people from the same academic institution subscribed to one of the lists. Because only one response from each institution was counted in the survey results, multiple subscribers were removed so that each institution was represented only once by an academic e-mail address. There were 456 duplicate institutional addresses, which further reduced the sample size to 574 possible institutions. Using the SMT research profiles database, it was determined that an additional twenty-four of these addresses had no academic affiliation. Of the 550 possible academic institutions remaining in the SMT/ATMI sample, 191 of these addresses lacked any indication of academic affiliation; for instance, there were sixty-four AOL (America Online) subscribers.

Of the 550 possible institutional addresses, 134 responded. This indicates a response rate of about twenty-four percent. Even if 100 of the 191 non-academic addresses are discounted, the response rate is still very low at about thirty percent. One possible reason for such a low response rate may be that the use of the word "survey" in the subject line of the e-mail prompted many subscribers to skip the message. In addition, many persons subscribed to the SMT or ATMI lists were undoubtedly students or instructors not directly associated with any undergraduate aural skills classes.
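The sample-size reduction and response-rate figures above can be restated as a short calculation. The sketch below is only an illustrative summary; every count is taken from the preceding paragraphs, and the variable names are mine, not the author's.

```python
# Illustrative restatement of the sample-size reduction described above.
available_addresses = 843 + 282           # unconcealed SMT + ATMI addresses (1125)
after_list_dedup = 1066                   # duplicates between the two lists removed
after_publishers = after_list_dedup - 36  # publishers/news organizations removed (1030)
one_per_institution = after_publishers - 456     # one address per institution (574)
possible_institutions = one_per_institution - 24 # no academic affiliation (550)

responses = 134
print(f"Response rate: {responses / possible_institutions:.0%}")  # about 24%

# Even discounting 100 of the 191 addresses with no clear academic
# affiliation, the rate remains low:
print(f"Adjusted rate: {responses / (possible_institutions - 100):.0%}")  # about 30%
```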
The response rate of the MG sample was sixty-eight percent. Although the MG sample contained more than 150 addresses representing 145 institutions, a few of the addresses were of students or secretaries at music schools, and twenty of the institutions were already represented in the random sample. The instrument was sent to instructors at 123 institutions; however, eighteen were immediately returned as undeliverable. Two working addresses represented instructors no longer on the faculty at the indicated institutions. The sample thus represented 103 institutions and generated seventy responses. This higher response rate of sixty-eight percent may be due to the fact that the instrument was e-mailed privately to each individual. MacGAMUT users may also have been more inclined to fill out surveys and more likely to speak favorably about the ear-training program.

DEMOGRAPHICS

The survey represents seven countries in addition to the United States; forty-three of the fifty states are represented, as are Puerto Rico and the District of Columbia. The survey includes responses from 194 four-year colleges and fifteen two-year colleges. Ten of the responding institutions were not listed in the College Music Society Directory of Music Schools; however, the survey represents fifteen percent of the four-year undergraduate institutions listed in the 1999 CMS Directory, or 184 of the 1213 listed four-year schools.

Appendix B lists alphabetically the 209 institutions represented in the survey. Appendix C lists the geographic distribution of the 209 survey responses along with the subset of seventy responses generated by the MG sample. The geographic distribution of the 1817 institutions represented in the CMS directory is also listed. In order to maintain the confidentiality of the responses, the numbers for the seventy institutions in the MG sample are indicated only as a subset of the 209 responses. In cases where the survey represents only one institution from a geographic region in the U.S., the listing for the MG sample is indicated as not applicable.

The SMT/ATMI sample tends to represent larger institutions, with fifty-nine percent of the responses coming from institutions with 100 or more music students.
This may be due to a bias in the survey method that would favor larger schools. The MG sample tends to represent smaller schools and serves well in complementing the SMT/ATMI sample. The size distribution of undergraduate institutions represented in the survey is shown in Table 1 below and indicates that institutions of all sizes are well represented. The smaller number (139) represents results from the SMT/ATMI sample only, while the larger number (209) includes results from the MG sample.

TABLE 1
SIZE DISTRIBUTION OF INSTITUTIONS

Number of Music Majors    SMT/ATMI (n=139)    MG + SMT/ATMI (n=209)
0 to 19                   21 (15%)            33 (16%)
20 to 49                  16 (12%)            32 (15%)
50 to 99                  19 (14%)            33 (16%)
100 to 199                24 (17%)            42 (20%)
200 or more               59 (42%)            69 (33%)

BIASES OF THE SURVEY METHOD

The SMT and ATMI lists provided a convenience sample that represents a random sampling of schools of differing sizes, but there is a pronounced technology bias in the type of individual who was able to respond to the survey. Only instructors regularly using e-mail and subscribing to one of two e-mail lists were even likely to see the survey. There may also have been a reluctance on the part of instructors to respond at all if their institution was not using ear-training CAI. Institutions not using ear-training CAI are therefore likely to be under-represented in the survey results.

There is also a bias toward the type of undergraduate institution, whether four-year or two-year, that was likely to respond to the survey. While the CMS directory lists 1213 four-year schools and 506 two-year schools, the 209 responses represent only fifteen two-year schools. One possible reason may be the phrasing of the second question in the survey, which asked for an indication of the number of undergraduate "music majors". A second possibility is that the above-mentioned technology bias may be even more pronounced with regard to smaller two-year institutions. The technology bias, as well as the low response rate of about 24%, prevents the SMT/ATMI sample from being a truly random sample that can be used to make generalizations regarding the state of ear-training; however, a descriptive analysis of the survey results follows.

SURVEY RESULTS

Many of the tables below provide results from the SMT/ATMI sample followed by results from the SMT/ATMI and MG samples combined. This is done to provide as much information as possible and to allow for comparison between the responses from the various samples.

Table 2 shows that twelve of the 139 responses from the SMT/ATMI sample reported that they did not use ear-training CAI. Of these, one respondent reported having used CAI in the past but had since discontinued its use. Others mentioned that they were currently looking into CAI for ear-training. Considering the technology bias of the survey, one could infer that the actual percentage of undergraduate institutions using ear-training CAI is significantly less than the 91% indicated below. One response from the MG sample indicated that ear-training CAI was no longer being used.

TABLE 2
PERCENTAGE OF INSTITUTIONS USING CAI

CAI usage    SMT/ATMI (n=139)    MG + SMT/ATMI (n=209)
None         12 (9%)             13 (6%)
CAI used     127 (91%)           196 (94%)

Appendix D lists more than twenty-five commercial programs along with a number of the 22 "homegrown" CAI programs used in the responding institutions. The number of institutions that reported using each program is also indicated. Many schools reported using two or three different software programs.
While the use of multiple ear-training programs may imply a search for variety, it may also indicate a level of dissatisfaction with the software currently being used. Conversely, the use of a single program may indicate a greater level of satisfaction with the CAI software. The SMT/ATMI sample indicates that forty-six percent of institutions using CAI use two or more programs. Table 3 shows the rates at which multiple CAI programs are used in the responding institutions.

TABLE 3
NUMBER OF PROGRAMS USED

Number of programs used    SMT/ATMI (n=127)    MG + SMT/ATMI (n=196)
1 program                  69 (54%)            113 (58%)
2 programs                 32 (25%)            47 (24%)
3 programs                 16 (13%)            21 (11%)
4 or more programs         10 (8%)             15 (8%)

The Macintosh platform was by far the most widely used platform for ear-training CAI. At least eighty-seven percent of the institutions from the SMT/ATMI sample reported using Macintosh computers for ear-training. IBM-compatible computers were used for ear-training at about sixteen percent of the institutions from the SMT/ATMI sample. Table 4 shows the computer platforms used for ear-training CAI.

TABLE 4
PLATFORMS USED FOR CAI

Platforms used for CAI    SMT/ATMI (n=127)    MG + SMT/ATMI (n=196)
Macintosh                 105 (83%)           173 (88%)
IBM-compatible            16 (12%)            16 (8%)
Both Mac and IBM          5 (4%)              6 (3%)
NeXT                      1 (1%)              1 (1%)

The percentage of course grade determined by CAI use ranges from nothing to more than eighty percent. The SMT/ATMI sample shows that three percent of the schools reported using CAI for fifty percent or more of the grade, but that forty percent of the schools used CAI for ten percent or more of the grade. Similarly, forty percent of the schools used CAI only for ungraded practice. The MG sample indicates that institutions using MacGAMUT are more likely to grade the use of ear-training CAI. Table 5 shows the percentage of the grade based on CAI usage in the responding institutions.

TABLE 5
PERCENTAGES OF GRADE BASED ON CAI USE

Grade evaluation based on CAI use    SMT/ATMI (n=127)    MG + SMT/ATMI (n=196)
Ungraded practice                    51 (40%)            66 (34%)
Extra credit                         8 (6%)              11 (6%)
1% to 9% of the grade                15 (12%)            25 (13%)
10% to 19% of the grade              30 (24%)            54 (28%)
20% to 29% of the grade              12 (9%)             18 (9%)
30% to 39% of the grade              2 (2%)              10 (5%)
40% to 49% of the grade              3 (2%)              3 (1%)
50% or more of the grade             4 (3%)              4 (2%)
Other                                2 (2%)              5 (3%)

Many different methods of integrating the CAI software were reported. The most common use for CAI in both samples was some form of graded practice. The SMT/ATMI sample indicated that thirty-four percent of the responding institutions used CAI only for ungraded practice. Grades were based on passing levels or completing tests at twenty-one percent of the institutions. Recording of practice time along with completing levels accounted for CAI use in another twenty-one percent of the institutions. Approximately eleven percent of the institutions included CAI as lab work during part of a class period, and nine percent of the institutions based the grade only on the amount of time spent with CAI. Table 6 shows the various methods of integrating CAI into undergraduate aural skills classes.

TABLE 6
METHODS OF INTEGRATING CAI

Integration                      SMT/ATMI (n=127)    MG + SMT/ATMI (n=196)
Ungraded practice                43 (34%)            54 (27%)
Testing (passing levels) only    27 (21%)            40 (20%)
Testing + time                   27 (21%)            40 (20%)
Testing + lab work               3 (2%)              5 (3%)
Testing + time + lab work        6 (5%)              17 (9%)
Time only                        11 (9%)             22 (11%)
Time + lab work                  3 (2%)              5 (3%)
Lab work                         2 (2%)              5 (3%)
Extra credit practice            5 (4%)              8 (4%)

The most common method for students to access the CAI was in a single computer lab.
Nearly eighty percent of schools indicated the use of a primary ear-training lab. About eleven percent added that the CAI could be accessed through a campus network. The limited access to ear-training software in computer labs was mentioned often in the optional instructor comments.

TABLE 7
ACCESS TO CAI SOFTWARE

Software access                       SMT/ATMI (n=127)    MG + SMT/ATMI (n=196)
One lab                               102 (80%)           147 (75%)
Many labs                             23 (18%)            37 (19%)
Personal copies (at least one lab)    2 (2%)              12 (6%)
Through campus network                (14) (11%)          (16) (8%)

The limited number of computers available for students to do CAI work was also a frequent comment in the survey responses. Many schools required the students to purchase personal copies of CAI, which were not dependent on the use of a single computer lab. Other schools used a campus network to address the accessibility problem. Table 1 previously showed that more than half of the 209 schools in the survey have 100 or more music students. Table 8 below indicates that forty-six percent of the schools have nine or fewer computers in a music lab that can access ear-training CAI. Only twenty percent of the institutions have twenty or more computers available in a music lab.

TABLE 8
NUMBER OF COMPUTERS AVAILABLE FOR CAI

Computers available in music lab(s)    SMT/ATMI (n=127)    MG + SMT/ATMI (n=196)
1 to 9                                 59 (46%)            90 (46%)
10 to 19                               43 (34%)            69 (35%)
20 to 29                               16 (13%)            22 (11%)
30 or more                             9 (7%)              15 (8%)

Nearly sixty-six percent of the institutions using CAI reported that their ear-training computer labs were connected to the internet. Three instructors from the SMT/ATMI sample responded that the computer labs were intentionally not connected to the internet so that students would not waste time surfing the web. Although a few instructors in the SMT/ATMI sample did not respond to this question, Table 9 shows the breakdown of internet accessibility of the labs in the responding institutions.

TABLE 9
INTERNET ACCESSIBILITY OF COMPUTER LABS

Internet accessibility    SMT/ATMI (n=127)    MG + SMT/ATMI (n=196)
Yes                       80 (63%)            129 (66%)
No                        39 (31%)            59 (30%)
Not available             8 (6%)              8 (4%)

In order to identify some of the more useful CAI programs, the institutions reporting the use of one CAI program will be further examined. The SMT/ATMI sample contained sixty-nine institutions which reported using a single CAI program. Practica Musica led the list and was reported at forty-three percent of these institutions. MacGAMUT was second and was reported at thirty-two percent of these institutions. Most of the programs listed had been available for five to ten years; however, Auralia, which was first released in 1998, posted a relatively strong showing despite its IBM platform and recent release date. Four commercial programs are represented by a single responding institution each. These four programs are: Computerkolleg Musik, Guido, teoria, and Musique. The three "homegrown" software programs include: Audio Challenger, written by Anthony Holland, a professor at Skidmore College; Harmonic Idioms, written by Edward Chudacoff, a professor at the University of Michigan; and a set of custom MIDI sequences used for melodic dictations. Table 10 lists the programs used by the sixty-nine institutions reporting only a single CAI package.

TABLE 10
SOFTWARE AT SCHOOLS USING ONLY ONE PROGRAM

Programs reported                        SMT/ATMI (n=69)
Practica Musica                          30 (43%)
MacGAMUT                                 22 (32%)
Music Lab Melody                         4 (6%)
Auralia                                  2 (3%)
CAT                                      2 (3%)
ET Drill                                 2 (3%)
Other "homegrown" programs               3 (4%)
Other commercially available programs    4 (6%)
Of the sixty-nine institutions reporting only one CAI program, twenty-six reported using CAI for ten percent or more of the grade. Table 11 lists the programs used at institutions integrating CAI as ten percent or more of the grade and using only one CAI program. Practica Musica again tops the list, but MacGAMUT follows as a close second.

TABLE 11
PROGRAMS USED FOR 10% OR MORE OF GRADE

Software titles used for 10% or more of grade    SMT/ATMI (n=26)
Practica Musica                                  10 (38%)
MacGAMUT                                         9 (35%)
Music Lab Melody                                 3 (12%)
Curriculum for Aural Training (CAT)              2 (8%)
Computerkolleg Musik                             1 (4%)
Musique                                          1 (4%)

Nearly sixty percent of the instructors rated the various ear-training programs as good. About 20% of the instructors rated the software as only fair or poor, and two instructors discontinued using ear-training CAI altogether. A number of instructors did not rate the software. In the few cases where an instructor indicated a rating between two categories, the lower of the two categories was counted. Table 12 shows the approximate instructor ratings of various CAI packages.

TABLE 12
INSTRUCTOR RATINGS OF VARIOUS CAI SOFTWARE

Rating categories    SMT/ATMI (n=127)    MG + SMT/ATMI (n=196)
Excellent            19 (15%)            35 (18%)
Good                 76 (60%)            114 (58%)
Fair                 24 (19%)            34 (17%)
Poor                 3 (2%)              4 (2%)
Not available        5 (4%)              9 (5%)

The final question on the survey instrument asked whether the students seemed to find the CAI helpful. More than seventy percent of the instructors responded that students did find the programs helpful. Some instructors at institutions where CAI was used for ungraded practice commented that although students found the CAI helpful, only a few students actually used the programs.

TABLE 13
STUDENT RATINGS OF CAI HELPFULNESS

Did students find the CAI helpful?    SMT/ATMI (n=127)    MG + SMT/ATMI (n=196)
Yes                                   97 (72%)            155 (77%)
Indifferent                           15 (12%)            21 (11%)
Varies                                8 (6%)              11 (6%)
No                                    3 (2%)              5 (3%)
Not available                         5 (4%)              5 (3%)

BRIEF SUMMARY OF SURVEY RESULTS

Of the 209 survey responses, fifty-three percent of the schools had 100 or more music majors. The following comments refer to the 196 institutions that used ear-training CAI. Approximately forty-five percent of the institutions used two or more ear-training programs. The Macintosh platform was used at well over eighty-five percent of the institutions. CAI use was evaluated as part of the course grade at more than fifty-two percent of the institutions. At nearly fifty percent of the institutions, the most common method of integrating CAI software included testing or the passing of levels. Eighty percent of the institutions reported using only one computer lab for the ear-training CAI, and forty-six percent of the institutions had nine or fewer computers in music labs for use with ear-training software. Nearly seventy-five percent of instructors rated the software as good or excellent, and seventy-seven percent said that students seemed to find the software helpful. There were sixty-nine institutions from the SMT/ATMI sample that used only one ear-training CAI program. Practica Musica and MacGAMUT were the most used programs in this category and, when combined, were used at seventy-five percent of these institutions. These two programs were also used in seventy-four percent of the 127 institutions from the SMT/ATMI sample that used CAI. While thirty of these institutions reported using both programs, Practica Musica was used at seventy-two institutions, and MacGAMUT was used at fifty-two institutions.
CHAPTER 2
THE SOFTWARE REVIEW FORM

PURPOSE

The software reviews offered here do not provide a comparative rating or judgement of each program's design features or effectiveness in various activities. Rather, the reviews provide a brief overview of each program's available features and an indication of each program's limitations. To this end, it was decided to use a consistent form for each review but to attempt to give the form a degree of flexibility to accommodate the unique characteristics of each program. The form was designed primarily to address the needs of undergraduate music instructors, but every effort was made to make the reviews useful for elementary school teachers, college students, or parents looking for music instruction programs. Perhaps the greatest advantage of the form is the ease with which readers can identify those programs that may fit their particular needs. The reviews were also published on the World Wide Web and were designed for ease of updating by the author so that they could be kept current in the fast-changing world of computer technology.

FORMAT

The two-page review takes the form of an extended table with the first column giving the main categories in bold, capitalized lettering. The other columns list possible program features or are left blank. Blank cells may be filled in with general information or optional commentary. Listed features or options that do not apply to the software being reviewed have their text struck through with a single line. This at once indicates that the feature in question is not present in the software, and it makes that feature less readable for anyone searching quickly through the form for a program's general qualities. Optional commentary written into the blank cells on the form appears in italics. The following two examples illustrate an excerpt from a blank form and that same excerpt as it might appear in a completed review.

TABLE 14
EXCERPT FROM A BLANK REVIEW FORM

HARMONIC PROGRESSIONS:    Inversions                 +6 Chords
                          Single-click Response      Secondary Dominants

MELODIES:                 Computer-generated         Libraries of Melodies
                          Melodies Include Rhythm

TABLE 15
EXCERPT FROM A COMPLETED REVIEW FORM
(entries struck through on the original form are shown here between ~~tildes~~; optional commentary, which appears in italics on the form, is shown between *asterisks*)

~~HARMONIC PROGRESSIONS:~~    ~~Inversions~~                 ~~+6 Chords~~
                              ~~Single-click Response~~      ~~Secondary Dominants~~

MELODIES:                     Computer-generated             ~~Libraries of Melodies~~
                              Melodies Include Rhythm        *MIDI Entry of Answers*

The completed excerpt indicates that the program does not contain harmonic progression exercises but that it does contain melodic exercises. The excerpt also indicates that there are no pre-entered libraries of melodies but that the melodies are computer-generated (usually from a list of parameters chosen by the user) and include rhythm. There is optional commentary, in italics, indicating that answers can be entered using the MIDI keyboard.

The review form is divided into five main sections:

1. GENERAL INFORMATION
2. AVAILABLE EXERCISES
3. INSTRUCTIONAL ISSUES
4. SYSTEM REQUIREMENTS AND SETUP INFORMATION
5. PRICING AND PRODUCT INFORMATION

Each of these sections is discussed below in greater detail. Explanations are given for terms used on the review form, and observations are made regarding the various exercises and options available in the thirty programs reviewed.
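Because the reviews were also published on the World Wide Web, the struck-through/italic convention shown in Tables 14 and 15 maps naturally onto simple markup. The sketch below is only a hypothetical illustration of how one review row could be represented and rendered; it is not the author's actual implementation, and the data structure and function names are assumptions.

```python
# Hypothetical rendering of one review-form row (mirroring the MELODIES row
# of Table 15): absent features are struck through, commentary is italicized.
from html import escape

melodies_row = {
    "Computer-generated": True,        # feature present
    "Libraries of Melodies": False,    # feature absent -> struck through
    "Melodies Include Rhythm": True,
}
commentary = "MIDI Entry of Answers"   # optional commentary -> italics

cells = []
for name, present in melodies_row.items():
    text = escape(name)
    cells.append(text if present else f"<s>{text}</s>")
cells.append(f"<em>{escape(commentary)}</em>")

row_html = "<tr><th>MELODIES:</th>" + "".join(f"<td>{c}</td>" for c in cells) + "</tr>"
print(row_html)
```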
GENERAL INFORMATION

The first section of the review form begins with the program name appearing at the top of the form and presents basic information about the software being reviewed. Information is given regarding the version of the software being reviewed, the reviewer name, and the webpage URL. The review date is given and is followed by information about the platforms and operating systems on which the software runs.

The first section closes with information about the intended uses for the software. It indicates whether a program is intended primarily for user-directed individual practice or whether it is also designed for use in educational institutions, where tracking of student progress and instructor customization are often considered desirable features. Subcategories of individual practice indicate whether the program includes games or tutorials. Games and game-like elements are found most often in programs for younger students. Tutorials are often found in programs for self-motivated individuals wishing to learn or review basic music theory terminology in addition to aural skills. The final subcategory indicates the approximate target audience of the program as kindergarten to 6th grade, 7th to 12th grade, or college-level students.

AVAILABLE EXERCISES

Interval exercises are the most common type of exercise found in the current generation of ear-training programs. Users are often given total control in selecting the intervals to be practiced as well as the response methods. Response methods can include a single mouse-click, playing on a MIDI keyboard, clicking notes on an on-screen keyboard, notating the pitches on an on-screen staff, or singing. Melodic intervals are listed on the form as ascending and descending intervals. This is done because a few programs do not allow for practice of descending intervals. Listings for harmonic intervals and compound intervals close out the interval section of the review form.

Chord identification exercises are also a common feature of many ear-training programs, although some programs are limited to the use of chords in root position. The form addresses this issue by specifying whether or not the program includes chord inversions. Separate listings are given for triads and seventh chords. One issue that can frustrate students is the open voicing of seventh chords in some programs; the spacing in some instances places the outer voices more than two octaves apart. Many programs address this issue by allowing users to choose an option for open or closed spacing of chords or by allowing the user to specify the range of the pitches to be used for the exercise. A blank space is provided for optional features such as custom chords, which can be entered and labeled by the user. Other optional features may include extensive listings of jazz chords (9ths, 11ths, and 13ths), chord clusters, suspensions, or augmented sixth chords.

A problem with the more advanced single-chord identification exercises can be their limited usefulness when there is no harmonic context. One example of this is the identification of an isolated German augmented sixth chord, which is the enharmonic equivalent of, and therefore indistinguishable from, a dominant seventh chord. Few programs precede their single-chord identification exercises with a tonal context.
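As a concrete illustration of the chord identification exercises just described, the sketch below generates a random triad with an optional open spacing and a constrained register. It is a minimal, hypothetical example rather than the behavior of any reviewed program; the quality list, register limits, and function name are assumptions.

```python
# Minimal sketch of a single-chord identification drill (hypothetical).
# Pitches are MIDI note numbers (60 = middle C).
import random

QUALITIES = {"major": (0, 4, 7), "minor": (0, 3, 7),
             "diminished": (0, 3, 6), "augmented": (0, 4, 8)}

def make_question(open_spacing=False, low=48, high=84):
    """Choose a random triad quality and root, voiced within the given register."""
    quality, intervals = random.choice(list(QUALITIES.items()))
    root = random.randint(low, high - 16)
    chord = [root + i for i in intervals]
    if open_spacing:
        chord[1] += 12       # lift the middle voice an octave for an open voicing
    return quality, chord    # the program would play `chord` and ask for `quality`

quality, chord = make_question(open_spacing=True)
print(quality, chord)        # e.g. "major [60, 76, 67]"
```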
Harmonic progression exercises are not as widespread as the previous two exercises but are available in nearly half of the programs reviewed. Most of the programs feature only basic diatonic progressions in major and minor keys. Some programs include augmented sixth chords and secondary dominants. A few programs feature extensive libraries of jazz progressions. Augmented sixth chords, indicated by +6 on the form, are a listed feature along with inversions. Secondary dominants are also listed, and space is left for optional features. These features may include instructor customization of progressions either by direct entry, by selecting from a menu of options, or by entering progression elements which are then recombined by the program. This latter method can sometimes produce poor voice-leading and unintended chord progressions. Some programs use simple progressions that sound like an academic harmony exercise, but other programs use excerpts from classical or popular music, helping to create a connection between ear-training and music appreciation. None of the reviewed programs used actual recorded performances of musical excerpts, but some programs featured a MIDI playback of an actual performance.

The methods used for answering some progression exercises are rather tedious: the user selects a Roman numeral, then an inversion symbol, and then clicks on the box representing the chosen chord. Some programs even allow for optional notation of the bass and soprano lines. While these methods may reinforce basic music theory concepts, they may also take away from the actual amount of time spent on aural practice. The one response method listed on the review form in this category is that of answering with a single click of the mouse. Cadences or cadence formulas are a related category of exercise that sometimes feature a very quick multiple-choice response method. Very few of the reviewed programs employ a single-click answer method for harmonic progression or cadence exercises. There is still much room for improvement in this type of ear-training exercise, with more musical progressions and quicker response methods.

Melodic dictation exercises take many forms. Some programs include rhythm with the melody (although the user may not have to include the rhythm as part of the answer), while other programs merely play melodic patterns or pitch patterns that lack any rhythmic variation. There are two primary methods of creating melodies. In one method, melodies come with the program and are saved in libraries which the program can access as needed. This method often allows instructors to enter their own custom melodies. While this can be added work, it allows the computer program to become more integrated with the classroom work. In another method of melodic dictation, the user or instructor enters parameters such as melody length, range, size of largest leaps, and even rhythmic values into a dialog box, and the program generates an endless string of melodies. This "computer-generated" method of creating melodies offers ease of use and variety in melodies, but the product often sounds more like a string of random intervals than a real melody.
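The parameter-driven generation just described can be sketched in a few lines. The example below is a deliberately naive illustration, not the algorithm of any reviewed program, and the parameter names are assumptions; because each pitch is chosen nearly independently, it also suggests why such melodies can sound like a string of random intervals rather than a real melody.

```python
# Naive parameter-driven melody generator (illustrative only).
# Pitches are MIDI note numbers; rhythm is left as an even quarter-note grid.
import random

def generate_melody(length=8, low=60, high=72, max_leap=5):
    """Random walk within a range, each step limited to `max_leap` semitones."""
    melody = [random.randint(low, high)]
    while len(melody) < length:
        step = random.randint(-max_leap, max_leap)
        melody.append(min(max(melody[-1] + step, low), high))
    return melody

print(generate_melody())   # e.g. [64, 61, 63, 67, 70, 66, 65, 62]
```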
Some programs give the user an option to create customized scales or pitch sets, and many programs offer at least a modest tutorial explaining the different scales. A closely related exercise is that of scale degree recognition. In this type of exercise a tonic is established, a pitch follows, and the user indicates the scale degree of the pitch. The scale degree can be indicated either by solfege or by number. Another issue related to scales is the use of different temperaments for the ear-training exercises. While some programs are starting to include options for use of alternate tuning systems, the vast majority of programs only use equal-temperament. Rhythmic dictation is not as widespread as the previous exercises, but it is being incorporated into more and more ear-training programs. One type of rhythmic dictation exercises, referred to as “hear/notate” on the form, has the computer play the rhythm and user notate the answer on the screen. The most common method of rhythmic dictation, referred to as “hear/tap” on the form, has the computer play 24 the rhythm and the user answer by clicking the mouse or tapping a key such as the space bar. There are at least two variations of this type of response method. The rhythm can be indicated when the user presses down on the key or when the user lets up on the key. A few programs use the first method and also record the length of time for which a note is held. In another type of rhythmic exercise, the computer shows the notation, and the user taps the answer. This method brings up a subtle point about some ear-training exercises; namely, that they tend to reinforce basic music theory reading and notational skills more than aural perception. While this type of exercise may be useful, it is not listed as a review category. Yet another method for rhythmic dictation exercises has the computer play the rhythm and the user respond by selecting from a number of boxes displaying different rhythmic patterns. The rhythmic patterns or “elements” are placed in the appropriate order to provide a quick method of notating the answer. This multiple-choice method reinforces the notation of answers (basic music theory) while still focusing on the listening part of the exercise. Singing (or audio input of answers) is being incorporated into many programs—especially the newer titles. While this can be an impractical option for large school music labs where the singing would be distracting to other students, it can be a useful option for individual practice at home or in a dormitory. The level of feedback and number of different exercises varies from program to program. The most common exercises are pitch matching and the singing of simple intervals, melodies, and scales. Some programs feature an exercise in which a chord is played and the user sings one of the pitches. Most programs currently using a microphone for audio input are aimed at the analysis of vocal singing and are not 25 intended for use with acoustic instruments. The ability to use acoustic instruments to respond to questions would open the door for musical participation of the users without forcing them to sing or to learn keyboard skills in order to respond via MIDI. One program currently features a “hands off" mode where the program plays an example, waits, gives the answer, then plays another example. While there is no direct feedback given by this particular program, other programs do provide graphical feedback of the respondent’s intonation. 
As audio input response methods continue to develop, there is the potential that someday programs will offer a totally “hands free” approach to ear-training. Addressing additional exercises or features, the last section allows for descriptions of exercises or features that may not fit into the above-mentioned categories of the review form. Occasionally these three lines are used for in-depth descriptions of features already mentioned or to provide optional comments about the program in general. INSTRUCTIONAL ISSUES This section addresses the elements of record keeping and program customization as it applies to both the user and the instructor. There are three basic options for customizing or structuring exercises: 1) The user defines the settings; 2) The programmer defines the settings (as preset levels or parameters), or; 3) The instructor defines the settings. User-defined settings are found in all ear-training programs to some extent—from simple volume and tempo control to choosing the intervals or chord progressions to be practiced. One method is to allow the user to define the setup 26 of each exercise such as the materials, the method of response, and the types of feedback provided by the computer. Another method, indicated by the word, “Levels” on the form, allows the user to choose from various preset levels. This arrangement is especially useful for individuals who are learning on their own and may not know where to start. The form lists two additional categories indicating whether a user can change settings for both the practice modes and any available test modes. Instructor customization is only available in about twenty percent of the programs reviewed. Some programs have limited customization, while others allow so much room for customization that the instructor could become overwhelmed with work trying to create custom melodic and harmonic dictation exercises for various classes! Among the multitude of possible options, three are listed on the form. The first two options refer to whether the instructor can make custom tests or define various settings for different classes. The third option refers to whether the instructor can modify the scoring parameters that the program uses to determine whether a student passes a test or a level. Other options for instructor customization include keeping detailed records of each student, or the ability to create databases of student records to assist with the evaluation of student progress, as well as overall class progress. Response options vary greatly from program to program. Single mouse-click identification is often the simplest and quickest response method—although many programs require multiple mouse-clicks for each answer. On-screen keyboards are a popular and flexible response method, and they are especially handy on a computer that is not hooked up to a MIDI keyboard. Some programs offer MIDI 27 input or allow the user to Sing the answer into a microphone. Other programs Offer on-screen notation which, depending on the program, can be a rather tedious method of response. In order to save time, some programs offer the useful features of automatically checking the answer and automatically skipping to the next question. Optional methods of response may include an on-screen guitar fret-board, the use of the computer keyboard as a keyboard instrument, or the selection of numbers representing different choices of a multiple-choice question. User feedback is generally very limited in the current generation of ear- training programs. 
The user feedback most commonly seen is the positive reinforcement of correct responses with phrases like “Way to go!” accompanied by sounds such as clapping. This feedback is so common that there is not a category for it on the review form; in addition, most programs allow the accompanying sounds to be turned off. Some useful feedback can occur when the number of correct and incorrect answers are given, or when the responses are displayed as statistics—especially in a visual graph or in such a way that it creates a game-like atmosphere. Hints are few and usually limited to “Try again!”, but some programs offer more useful feedback, such as indicating whether a note was too high or too low when answering via MIDI. A number of programs allow the student to view the answer upon request. Some programs offer an analysis of the responses given by the user so that the user can discover potential weaknesses. Other programs go one step further and include the use of a diagnostic test that grades a user’s performance then suggests appropriate levels or settings for each exercise. An indirect, and as yet unmeasured, type of feedback can occur in the practice modes of some programs where the student can play along on a MIDI keyboard while an 28 exercise such as melodic dictation is being played. While this may in fact be a very useful exercise it is not utilized in many programs. One reason for this may be that the response cannot be readily analyzed and graded by the program. When sufficiently detailed records are kept and analyzed, it would be useful to know not only which questions a student answered incorrectly, but also what the student gave as the wrong answer so that patterns of incorrect answers can be established— and potentially corrected with targeted exercises. Records and the tracking of student progress are often a consideration in classroom situations. This section of the form deals with what kinds of records are maintained, and the following section of the form deals with what can be done with the records. Some programs only maintain records for the current session, and all information is lost once the program is closed. Other programs save information about the number of correct answers as well as information about completed levels or tests. Many programs give a running total of the time spent using the program. Some programs list the individual times Spent on each exercise, and a number of programs even list the day and time each exercise was completed. Optional information might include more detailed statistical data, or provisions for the instructor to combine records into large class lists to compare student scores. Records can be saved in different locations and used for various purposes. The form lists a computer hard drive, a network, or a student disk as places where the records can be automatically saved. This brings up the related issue of how the records are saved. If students must manually save records, they will likely forget and become frustrated if the program crashes—causing them to lose their unsaved scores. Secure records may be important for a number of reasons: they are often 29 tied to the program in such a way that when the program opens, a user’s records are called upon to determine which settings and tests the user can access; and, of course, they may determine a grade for the class. 
While a floppy disk is extremely convenient for students—allowing them to work in different labs or on different computers—they are not secure enough to be the only copy of the student’s records. When student information on a floppy disk is lost or corrupted, some programs provide methods to restore the records from a back-up on a local hard drive or a network. Records can often be saved in a text format to be printed or e-mailed. Some programs allow records to be put into an instructor database and used to provide information about each student’s performances. Future databases may be able to provide useful information not only to an instructor but also to the program itself—thereby allowing the program to customize itself to the perceived weaknesses of the user. Some programs allow records to be viewed in the form of a graph or Chan. SYSTEM REQUIREMENTS AND SETUP INFORMATION This section of the form provides information about the minimum system requirements to run the software. The form lists the program size (when installed on the hard drive) and provides space that can be used to list additional information such as the amount of RAM required to run the program. The hardware section specifies whether a sound card, microphone, or MIDI keyboard are required to use the program. Space is provided for optional information such as the need for a CD- ROM drive or other peripherals. The software category provides information about whether additional software is required to run the program. Some Macintosh 3O programs require the use of Hypercard or OuickTime. Other programs require additional software for the instructor to enter custom exercises or to work with databases of student records. One program currently requires additional software to use a microphone for audio input of answers. PRICING AND PRODUCT INFORMATION This section begins by listing the approximate price of the software in US. dollars. The price is given for a single copy as well as a lab-pack, and information is provided regarding whether a site license is available. Optional information might include pricing for a student access disk at an institution with a Site license. The form also indicates whether a downloadable demo of the software is available. Optional information might include whether a demo is available through the mail or whether a preview policy exists for music instructors. The webpage URL is given for the software company or the software distributor. Additional contact information includes an e-mail address, a phone number, and the company name and mailing address. 31 CHAPTER 3 DIRECTIONS FOR CAI DEVELOPMENT: ISSUES REGARDING THE INTEGRATION OF EAR-TRAINING SOFTWARE IN UNDERGRADUATE MUSIC PROGRAMS SOURCES OF INFORMATION This chapter draws upon the author’s personal experiences as computer lab monitor as well as the optional comments from instructors responding to the CAI survey. Because the survey stated that all respondent’s names would be treated with anonymity, no citations shall be given for the commentary referred to below. Instructor comments will be paraphrased and are used primarily to give an indication of the types Of problems encountered by music instructors currently using ear- training CAI. PERSONAL OBSERVATIONS Michigan State University Opened a twenty-station PowerMac ear-training lab in September 1997, that included eighteen Kurzweil P088 MIDI keyboards. This lab and the undergraduate ear-training program were supervised by Dr. Bruce Taggart. 
Practica Musica (versions 3.0 to 3.82) was used during the 1997-98 academic year, and MacGAMUT 3.8 was used during the 1998-99 academic year. The personal observations made below are based in large part on three semesters of work as a monitor in this lab. This work varied from four to eighteen hours per week and included the observation of up to four sections of freshman ear-training classes.

SURVEY COMMENTS

Of the 209 undergraduate music instructors responding to the CAI survey, 106 provided optional commentary. Many professors provided two or three comments, raising the number to 176 comments. All but seven of the comments addressed a shortcoming or limitation of the CAI software. The 176 comments were arranged into the six broad categories shown below in Table 16. Issues pertaining to each of these categories will be discussed in detail, with the anonymous comments being combined with the author's personal observations.

TABLE 16
GENERAL CATEGORIES OF COMMENTS

Categories of Instructor Comments              n = 176    %
CAI usefulness                                 40         23%
Student usage                                  30         17%
Lab accessibility and platform availability    25         14%
Exercises and sound quality                    35         20%
Scoring issues and general program bugs        24         14%
Course integration and CAI customization       22         13%

THE MOST COMMON COMMENTS

Interestingly, the three most commonly made comments did not refer to the ear-training software but rather mentioned instructor attitudes, student usage, and computer lab availability. The "Top Ten" comments are shown in Table 17 below.

TABLE 17
"TOP TEN" COMMENTS

"Top Ten" Comments                                     n = 81    %
Success depends on instructor attitudes                12        15%
Difficulty getting students to do required work        10        12%
Limited student access to computer labs                10        12%
Platform availability (needs Windows version)          9         11%
General program bugginess                              9         11%
Different learning methods among students              7         9%
CAI lacks more advanced exercises                      7         9%
Lack of flexibility for customization of exercises     7         9%
Lost student scores (floppy disk malfunction)          5         6%
CAI needs better record-keeping ability                5         6%

CAI USEFULNESS

The single most frequently made comment, occurring twelve times out of 176, was that the effectiveness of CAI use depends on the attitudes and guidance of the instructor. Many respondents were referring to the fact that some instructors at their institution supported the integration of computers in music instruction, while other instructors were against the use of computers. Many of these comments also made reference to the necessity of familiarizing the students with the operation of the software and giving them suggestions for approaching the exercises. There were four comments that CAI use was not as effective as in-class work, three comments that it was not as effective as partner practice, and two comments that it was not as effective as human mentoring. Three comments mentioned that CAI use had been discontinued due to dissatisfaction with the software, three comments mentioned unspecified limitations of existing software, and two comments stated that the initial enthusiasm of using computers wore off quickly. One comment mentioned that CAI was not cost-effective, and another that CAI was promoted because of its technological implications rather than its proven pedagogical effectiveness. Of the seven comments that mentioned successes, two stated that the sight-singing and melodic dictation abilities of the students were greatly improved by the software.
Two comments mentioned that the software saved class time from tedious drill and practice, and two comments mentioned increased student motivation and self-confidence.

TABLE 18
CAI USEFULNESS

CAI Usefulness                                        n = 40       %
Success depends on instructor attitudes                   12     30%
Not as effective as classroom instruction                  4     10%
Not as effective as partner practice                       3    7.5%
Discontinued use of CAI due to dissatisfaction             3    7.5%
CAI very limited                                           3    7.5%
Not as effective as human mentoring                        2      5%
Enthusiasm for software short-lived                        2      5%
Saves class time from tedious drills                       2      5%
Motivation and confidence are much improved                2      5%
Improved sight-singing and dictation abilities             2      5%
Various comments                                           5   12.5%

STUDENT USAGE

Even among institutions that required CAI use as part of the grade, getting the students to spend time with the ear-training software was a major difficulty reported in ten of the comments. Three instructors using CAI as ungraded practice reported that students do not realize the helpfulness of the program, and two instructors mentioned that students simply do not use the CAI. Regarding the different learning methods of individual students, four comments mentioned that CAI does not work for some students, three comments mentioned that CAI is a time-consuming hoop for some students, and two mentioned that CAI does not work well with computer-phobes.

TABLE 19
STUDENT USAGE

Student Usage                                         n = 30      %
Difficult to get the students to work with the CAI        10    33%
Different learning methods of students                     7    23%
Time-consuming hoop for some students                      3    10%
Students do levels but don't focus on learning             3    10%
Students do not realize the usefulness of CAI              3    10%
Students do not use the CAI                                2     7%
Does not work well with computer-phobes                    2     7%

LAB ACCESSIBILITY AND PLATFORM AVAILABILITY

One of the most frequently mentioned comments concerned the lack of computer lab access for students working on their ear-training assignments. Limited hours of lab operation were the primary reason given, although some instructors also mentioned that it was inconvenient for the students to come to a music lab. The ear-training lab at Michigan State University was used for many sections of ear-training classes as well as other music technology classes. Students complained that this use of the lab limited their access; however, the author observed many hours when there were very few students using the lab or when students spent hours checking their e-mail and surfing the web. The issues of lab accessibility and platform availability are tied together for two reasons. PC (IBM-compatible) computers are becoming increasingly available (even replacing Macintosh computers as the predominant computers in many campus labs), and students are increasingly likely to have a PC of their own. This trend means that instructors are currently looking at IBM-compatible CAI as one way to help solve the accessibility problem.

TABLE 20
LAB ACCESSIBILITY AND PLATFORM AVAILABILITY

Lab accessibility and platform availability           n = 25      %
Limited access to computer lab(s)                          10    40%
Limited number of computers                                 4    16%
Windows version of CAI needed                               9    36%
Various comments                                            2     8%

EXERCISES AND SOUND QUALITY

Seven instructors commented that more advanced exercises, appropriate for the sophomore level or beyond, were lacking in some programs. In a similar vein, three instructors commented that CAI worked better with simpler, less contextual exercises such as interval identification.
Four instructors commented that exercises such as melodic dictation need quicker response options, and comments by students at MSU confirmed that one of the most time-consuming parts of CAI was the on-screen notation of melodic dictation exercises. Three survey comments referred to the poor quality and limited quantity of dictation melodies, and three comments stated that the jump in difficulty between levels was too great in some exercises. Blurring the distinction between exercises and sound quality, three instructors stated that the exercises lacked musicality. Two comments indicated that the quality of computer-generated sound was a weakness, and two comments referred to the difficulty of discerning multiple voices in harmonic dictation exercises despite the use of MIDI instruments. A related observation from the MSU computer lab concerns the open spacing of isolated chords, which can increase the difficulty of identifying the chord. Harmonic progressions that have a simultaneous attack of the voices and no independent volume control for each voice can make for dictations that are both unmusical and hard to hear as independent lines. Some instructors worked around these weaknesses by recording performances of harmonic progressions on a MIDI sequencer. Regarding the types of exercises that should be included in CAI, two comments noted the lack of rhythmic exercises as a major weakness. One comment suggested the use of harmonic context for single-chord identification, and another comment suggested the use of harmonic context for scale-degree exercises. One instructor suggested a contextual approach to melodic dictation and noted that most melodic dictation exercises force a linear approach to hearing and notating the melody. Other comments noted the need for more student feedback as well as larger libraries of harmonic progressions.

TABLE 21
EXERCISES AND SOUND QUALITY

Exercises and Sound Quality                           n = 35       %
CAI lacks more advanced exercises                          7     20%
Exercises need quicker response methods                    4     11%
Dictation melodies are too few and lack quality            3    8.5%
Difficulty between levels too great                        3    8.5%
Simpler exercises (less contextual) work best              3    8.5%
Exercises lack musicality                                  3    8.5%
Sound quality (computer-generated) is lacking              2      6%
Multiple voices difficult to hear even with MIDI           2      6%
Rhythm exercises are lacking                               2      6%
Various comments                                           6     17%

SCORING ISSUES AND GENERAL PROGRAM BUGS

Nine comments expressed frustration with unspecified bugginess of the programs. Five comments made specific reference to problems with records kept on a floppy disk, and five more referred to the need for better record-keeping ability in the programs. Three comments mentioned a lack of easy access to the records, and one comment mentioned a move toward using the program only for ungraded practice due to various frustrations with the records. One instructor noted that the scoring system of some exercises was very frustrating for the student because a single error at the end of a test would dramatically decrease the student's score and force the student to practice again on material with which the student had no problems.
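To make that last complaint concrete, consider a hypothetical comparison of two scoring rules. Neither rule is taken from any specific reviewed program; the ten-question drill, the required streak length, and the function names below are illustrative assumptions only.

```python
# A minimal sketch comparing two assumed scoring rules for an ear-training drill.

def proportional_score(responses):
    """Percentage of correct answers over the whole attempt."""
    return 100 * sum(responses) / len(responses)

def streak_score(responses, required_streak=10):
    """Pass/fail rule: the level is passed only by a run of
    `required_streak` consecutive correct answers."""
    streak = 0
    for correct in responses:
        streak = streak + 1 if correct else 0
        if streak >= required_streak:
            return 100
    return 0

# A student who misses only the final question of a ten-question drill:
attempt = [True] * 9 + [False]
print(proportional_score(attempt))  # 90.0
print(streak_score(attempt))        # 0 (the level must be repeated)
```

Under the proportional rule the single slip costs ten percentage points; under the streak rule it costs the entire level, which is the situation the instructor described.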
TABLE 22
SCORING ISSUES AND GENERAL PROGRAM BUGS

Scoring issues and general program bugs               n = 24      %
General program bugginess                                  9    38%
Lost student scores (floppy disk malfunction)              5    21%
CAI needs better record-keeping ability                    5    21%
CAI lacks easy access to student records                   3    12%
Various comments                                           2     8%

COURSE INTEGRATION AND CAI CUSTOMIZATION

Two primary methods were used for integrating CAI with classroom instruction: either the instructor customized the computer program to fit into the course, or the course was built around the computer program. Many instructors observed that their approach to various aspects of ear-training often differed from the approach of the ear-training program. Although some instructors noted that customizable exercises allowed them to integrate their own material into the course, others complained that the program influenced both the materials covered and their ordering. Some features, such as the choice of a solfege system, could not be changed by the instructor. Seven comments referred to a lack of flexibility for instructor customization, and three comments referred to a lack of good accompanying textbooks. Two instructors mentioned that they employed different instructional models than the ones reflected in the design of most CAI software. Two instructors mentioned that customization of the CAI was a time-consuming operation. Table 23 lists the comments regarding course integration and instructor customization.

TABLE 23
COURSE INTEGRATION AND CAI CUSTOMIZATION

Course integration and CAI customization              n = 22      %
CAI lacks flexibility for instructor customization         7    32%
CAI lacks good accompanying textbooks                      3    14%
Instructors not utilizing CAI to its full potential        2     9%
Requires lots of time to customize CAI                     2     9%
Custom exercises using MIDI sequencer                      2     9%
CAI does not fit cognitive model for learning              2     9%
Various comments                                           4    18%

CONCLUSIONS

This study has focused on the current generation of ear-training software and its integration into undergraduate music instruction. Although numerous instructors reported success with various programs, there were many observations regarding weaknesses of the available software. The following list, based on the thirty software reviews and the 209 survey responses, identifies the aspects of CAI most in need of improvement.

1. More secure and detailed student records (scores) are needed.
2. More instructor customization options are needed to accommodate different teaching methods and approaches.
3. More useful feedback for the students is needed.
4. More advanced exercises are needed.
5. Quicker response methods are needed to keep the focus on aural skills.

There are other types of music software that contain ear-training exercises or that can be used for ear-training. These categories of software include music theory CAI, keyboard skills CAI, MIDI sequencers, notation software, and accompaniment software. CAI focusing on music theory writing skills or on keyboard skills often contains elementary ear-training exercises. MIDI sequencing software and notation software can also be used to create ear-training exercises. A number of instructors reported that they used a MIDI sequencer to create melodic or harmonic dictation exercises. One instructor reported having each student work at a computer with sequenced dictations during class periods. The students worked at their own pace, notating the answers on paper, and the instructor was free to walk around the room offering help where needed.
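As an illustration of this workflow, rather than a procedure described by any respondent, the sketch below generates a short diatonic dictation melody and writes it to a Standard MIDI File that students could play back in any sequencer. It assumes the third-party Python library mido; the scale, tempo, note lengths, and the filename dictation01.mid are arbitrary choices made for the example.

```python
# Hypothetical sketch: build a simple melodic dictation exercise as a MIDI file.
import random
import mido

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # one octave of C major (MIDI note numbers)

def make_dictation(filename="dictation01.mid", length=8, seed=None):
    random.seed(seed)
    mid = mido.MidiFile()            # default resolution: 480 ticks per beat
    track = mido.MidiTrack()
    mid.tracks.append(track)
    track.append(mido.MetaMessage("set_tempo", tempo=mido.bpm2tempo(72)))
    for _ in range(length):
        note = random.choice(C_MAJOR)
        track.append(mido.Message("note_on", note=note, velocity=80, time=0))
        track.append(mido.Message("note_off", note=note, velocity=0, time=480))  # quarter notes
    mid.save(filename)

make_dictation(seed=1)  # writes a short quarter-note melody to dictation01.mid
```

Because the result is an ordinary MIDI file, the same approach could serve for harmonic dictations by appending several note_on messages before their corresponding note_off messages.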
A number of instructors also maintained web pages with downloadable music files for their classes. These files could be accessed by students and used with the appropriate sequencing or notation program to practice ear-training. The development of accompaniment software also has great potential for ear-training. This type of software provides a "music minus one" approach and allows students to practice accompanied pieces without a human accompanist or orchestra. The software uses a microphone to detect what the student plays, and it can adjust to subtle tempo changes by the performer. As this type of software develops and becomes more widely available, its ear-training potential may increase. Students may someday be able to do their ear-training in practice rooms and use their own instruments to play the answers.

Ear-training CAI is still in an early stage of development, and more research needs to be done regarding its effectiveness. However, despite its present limitations, there are currently more than sixty ear-training CAI programs available for Macintosh and Windows computers. The distinction among different types of music programs continues to blur as many ear-training programs incorporate better notation and sequencing abilities as well as tutorials covering basic music theory. With the increasing availability of more powerful and less expensive computers, ear-training CAI will likely continue its development into an even more useful and flexible educational tool.

APPENDIX A
THE SURVEY INSTRUMENT

Dear List Subscriber,

This is a 4-MINUTE SURVEY of undergraduate music theory instructors.

PURPOSE--to evaluate the use of CAI (Computer-Assisted Instruction) in undergraduate Ear-Training during the 1998-99 academic year as part of a Master's Thesis in Music Theory.

YOUR CONSENT--you indicate your voluntary agreement to participate by completing and returning this questionnaire.

CONFIDENTIALITY--all results will be treated with strict confidence, and the respondents' names along with their institutions will remain anonymous in any report of research findings.

FORMAT--You may checkmark (with a = or some other character) the answers that apply, or you may delete the answers that do not apply.

0. SAMPLE QUESTION? ===Yes  No
SEND RESPONSE TO: spangle9@pilot.msu.edu

1. Name of Institution:

2. Approximate number of Undergraduate Music Majors?
1 to 19
20 to 49
50 to 99
100 to 199
200+

3. Which CAI Programs are used (more than one may apply)?
NONE--(PLEASE SEND RESPONSE TO: spangle9@pilot.msu.edu)
MacGAMUT
Practica Musica
OTHER (Please specify)

4. How do students access CAI Software (more than one may apply)?
In ONE computer lab
From MANY computer labs
Personal copies of CAI Software
Through a campus NETWORK
OTHER MEANS OF DISTRIBUTION (Please specify)

5. How many computers in labs have access to CAI Software?
NONE
1 to 9
10 to 19
20 to 29
30+

6. Are your lab computers connected to the internet?
YES
NO
Does not apply

7. How is CAI integrated (more than one may apply)?
PRACTICE--ONLY individual student practice
TIME--Tracking of time spent on CAI
TESTING--Students pass levels or complete exercises
LAB WORK--Students use CAI during part of a CLASS PERIOD

8. How is CAI work evaluated in the various classes of Freshman and Sophomore aural skills? (If classes or sections differ in grading policy, please elaborate)
UNGRADED PRACTICE
EXTRA CREDIT
1 to 9% of the GRADE
10% to 19% of the GRADE
20% to 29% of the GRADE
30% to 39% of the GRADE
OTHER (Please specify)

9.
How would you rate the CAI software (OPTIONAL)? Excellent --Highly successful Good --Moderately successful Fair --Workable, slight flaws Poor --Unworkable, major flaws 10. Do students seem to find the CAI helpful? Yes No Indifferent 45 OPTIONAL COMMENTS: include any additional observations. (shortcomings, problems, successes) SEND YOUR RESPONSE TO: Spangle9@pilot.msu.edu 46 APPENDIX B ALPHABETICAL LISTING OF 209 RESPONDING INSTITUTIONS INSTITUTIONS STATE/COUNTRY Adams State College Colorado Albertson College Idaho Arizona State University Arizona Arkansas State University Arkansas Augusta State University Georgia Ball State University Indiana Baylor University Texas Belmont University Tennessee Bob Jones University Bowling Green State University Bradley University Bucks County Community College Butler University, Jordan College of Fine Arts California State Polytechnic University California State University, Chico California State University, Northridge California State University, Sacramento Calvin College Capital University Conservatory of Music Carleton College Carthage College Casper College Catawba College 47 South Carolina Ohio Illinois Pennsylvania Indiana California California California California Michigan Ohio Minnesota Wisconsin Wyoming North Carolina Catholic University of America Central Michigan University Central Missouri State University Central Washington University Chapman University College of Marin College of Notre Dame College of Staten Island, CUNY Colorado College Community College of Southern Nevada Concordia University Conservatory of Music, Puerto Rico Cornell College Crane School of Music, SUNY—Potsdam Crowder College Dana College Davidson College De Anza College DePauw University Dickinson College Dordt College Drake University Duquesne University School of Music East Carolina University Eastern Kentucky University Eastern New Mexico University Eastman School of Music Elmhurst College 48 District of Columbia Michigan Missouri Washington California California California New York Colorado Nevada Canada Puerto Rico Iowa New York Missouri Nebraska North Carolina California Indiana Pennsylvania Iowa Iowa Pennsylvania North Carolina Kentucky New Mexico New York Illinois Elmira College Emory University Florida State University Franciscan University of Steubenville Harding University Heidelberg College Hillsdale College Hong Kong Academy for Performing Arts Houston Baptist University Hunter College Huntington College Idaho State University Illinois State University Indiana University James Cook University James Madison University Kellogg Community College Kennesaw State University Kent State University Kenyon College Lake Forest College Lakehead University, Ontario Lansing Community College Lawrence University Conservatory of Music Lee University Louisiana College, Alexandria Louisiana State University Loyola Marymount University 49 New York Georgia Florida Ohio Arkansas Ohio Michigan Hong Kong Texas New York Indiana Idaho Illinois Indiana Australia Virginia Michigan Georgia Ohio Ohio Illinois Canada Michigan Wisconsin Tennessee Louisiana Louisiana California Luther College Lynchburg College Macalester College Mansfield University Maranathe Baptist Bible College Mary Washington College McGill University McPherson College Memorial University of Newfoundland Michigan State University Mississippi Valley State University Montclair State University Morehead State University Morris Brown College Mount Union College Mount Vernon Nazarene College New England Conservatory 
Northern Arizona University Northern Kentucky University Northern Michigan University Northwestern University Oberlin College Conservatory of Music Ohio State University Ohio University Ohlone College Oklahoma Baptist University Oklahoma Christian University Oklahoma State University 50 Iowa Virginia Minnesota Pennsylvania Wisconsin Virginia Canada Kansas Newfoundland Michigan Mississippi New Jersey Minnesota Georgia Ohio Ohio Massachusetts Arizona Kentucky Michigan Illinois Ohio Ohio Ohio California Oklahoma Oklahoma Oklahoma Pima College, The Center for the Arts Plymouth State College Prairie Bible College Purdue University, West Lafayette Rhodes College Rice Universiy Ricks College Roanoke College Roosevelt University, Chicago Musical College Rutgers The State University, New Brunswick Saint Mary's College Salisbury State University San Jose State University Seattle Pacific University Shepherd College Siena Heights University Silver Lake College Simpson College Skidmore College Southern Oregon University Southern University, New Orleans Southwestern Oklahoma State University Southwestern University Spring Arbor College St. Cloud State University St. John's University, College of St. Benedict St. Louis University St. Mary's College of Maryland 51 Arizona New Hampshire Canada Indiana Tennessee Texas Idaho Virginia Illinois New Jersey Indiana Maryland California Washington West Virginia Michigan Wisconsin Iowa New York Oregon Louisiana Oklahoma Texas Michigan Minnesota Minnesota Missouri Maryland SUNY, Fredonia Susquehanna University Temple University, Esther Boyer College Towson University Tulane University Universite' de Paris-Sorbonne (Paris IV) University of Alabama, Birmingham University of Alabama, Huntsville University of Alaska Anchorage, Department of Music University of Arizona University of Arkansas, Fayetteville University of Arkansas, Little Rock University of British Columbia University of California, Santa Barbara (UCSB) University of Central Florida University of Central Arkansas University of Cincinnati University of Colorado, Boulder University of Dayton University of Florida, Gainesville University of Hamburg, Institute of Musicology University of Houston, Moores School of Music University of Illinois, Champaign-Urbana University of Iowa University of Kansas University of Kentucky University of Maine, Augusta University of Manitoba 52 New York Pennsylvania Pennsylvania Maryland Louisiana France Alabama Alabama Alaska Arizona Arkansas Arkansas Canada California Florida Arkansas Ohio Colorado Ohio Florida Germany Texas Illinois Iowa Kansas Kentucky Maine Canada University of Massachusetts, Amherst University of Massachusetts, Lowell University of Miami School of Music University of Michigan University of Minnesota University of Missouri, Columbia University of Nebraska, Omaha University of North Carolina, Greensburo University of North Carolina, Pembroke University of North Dakota University of North Texas University of Oklahoma University of Oregon University of Oslo, Norway University of Osnabrueck, Germany University of Richmond University of Rio Grande University of Tennessee, Chattanooga University of Tennessee, Knoxville University of Tennessee, Martin University of Texas, Arlington University of Texas, Austin University of Texas, San Antonio University of the Pacific, Conservatory of Music University of Utah, Salt Lake City University of Victoria University of Washington University of Western Ontario 53 Massachusetts Massachusetts Florida Michigan Minnesota Missouri 
Nebraska North Carolina North Carolina North Dakota Texas Oklahoma Oregon Norway Germany Virginia Ohio Tennessee Tennessee Tennessee Texas Texas Texas California Utah Canada Washington Canada University of Wisconsin, La Crosse University of Wisconsin, Madison University of Wisconsin, W.C. University of Wisconsin, Whitewater Valley City State University Vanderbilt University Virginia Tech Wartburg College West Chester University West Virginia University Western Baptist College Wichita State University Wilfrid Laurier University William Rainey Harper College Wingate University Winthrop University Yavapai College York University, Toronto 54 Wisconsin Wisconsin Wisconsin Wisconsin North Dakota Tennessee Virginia Iowa Pennsylvania West Virginia Oregon Kansas Canada Illinois North Carolina South Carolina Arizona Canada CMS Survey MG list APPENDIX C GEOGRAPHIC DISTRIBUTION OF RESPONSES = The 1830 institutions listed in the 1997-98 CMS Directory. = The 209 total responses to the Survey. = The 70 responses from MacGAMUT list. Geographic Regions CMS Survey MG list (n=70 represents a subset of n=209) n=1830 n=209 n=70 Canada 60 10 4 Alabama 67 2 1 Alaska 2 1 N/A Arizona 16 5 0 Arkansas 21 5 2 California 1 68 1 3 6 Colorado 23 3 2 Connecticut 22 0 0 Delaware 4 0 0 District of Columbia 1 N/A Florida 56 4 1 Georgia 52 4 2 Hawaii 1 1 0 0 Idaho 9 3 1 Illinois 86 8 3 Indiana 36 7 2 Iowa 43 7 2 Kansas 37 3 1 55 Kentucky Louisiana Maine Maryland Massachusetts Michigan Minnesota Mississippi Missouri Montana Nebraska Nevada New Hampshire New Jersey New Mexico New York North Carolina North Dakota Ohio Oklahoma Oregon Pennsylvania Puerto Rico Rhode Island South Carolina South Dakota Tennessee Texas 56 33 3 2 21 4 0 1 2 1 N/A 30 3 1 51 3 O 52 1O 2 41 6 3 24 1 N/A 41 4 9 0 21 2 3 1 N/A 9 1 N/A 31 2 1 1 1 1 N/A 1 02 7 2 65 6 1 9 2 1 59 14 6 30 5 3 27 3 2 80 7 2 4 1 N/A 7 O O 26 2 0 1 1 O O 41 7 2 99 9 1 Utah Vermont Virginia Washington West Virginia Wisconsin Wyoming Hong Kong Australia Newfoundland France Germany Norway 57 9 1 N/A 1 O O O 39 6 2 33 3 1 1 8 2 O 45 8 3 7 1 N/A 1 O 1 O 1 O 1 O 2 O 1 O APPENDIX D LISTING OF PROGRAMS USED IN RESPONDING INSTITUTIONS CAI used in Responding Institutions Platform If of Schools Audio Challenger (In-house CAI) NeXT 1 Auralia 2.0 Win 5 Aural Skills Trainer (ECS media) Mac/Win 1 Benward Eartraining: A Technique for Listening Mac 13 CAT. 
(Curriculum for Aural Training) Mac 4 Claire A Personal Music Coach (discontinued) Mac Computerkolleg Musik Ear-Training Win 1 CUSTOM “UNPUBLISHED” CAI Various 20 Das Ohr (discontinued) Atari 1 Dolphin Don’s Music School Win 1 EarTraining 2.5 (Lars Peters) Mac 2 ETDrill Win 3 Explorations (mostly written theory) Mac 5 Guido DOS 1 Harmonic Idioms Mac 1 Hearing Harmony (In-house CAI) Mac 1 Hearing Tonal Harmony Mac 1 HearMaster MacNVin 1 Joseph Bloom Ear Training (discontinued) DOS 1 Listen MacNVIn 9 MacGAMUT Mac 121 MiBAC Music Lessons MacNVin 10 Music Lab Melody Mac/Win 13 58 Music Lab Harmony Music Theory Tutor Musique (ECS Media) NoteWell Play it by Ear Practica Musica teoria The Music Kit (written theory) Tim Smith’s 4-part Dictation (Hypercard 2.2) Well-tempered Ear 59 Win Mac MacNVin Mac Win -L Mac Win Mac Mac Mac NNOJN APPENDIX E THIRTY-THREE PROGRAMS NOT REVIEWED A Musical Tutorial (1999) AudioChallenger (NeXT program by Anthony Holland) Aural (1994, Atari program by Mark Grimshaw) BigEarS (Web-based Java program) CAET S (1996) CALMA (Upcoming program) Chordtrainer (1996, Kjetil Eide) Curriculum for Aural Training (1994, Hypercard) Ear Challenger (ECS media) EARTEST (1995) Ear Trainer (1989, by Lawrence Gallagher) Ear Training: A Technique for Listening (1995) El' drills (1996, Quicktime drills) E-Train (1997, Free DOS melody game by Victor Grauer) GUIDO 2.1 (1989) Halves/Not Halves (1998) Hearing Tonal Harmony (Upcoming program) Ike's Ear Tuner 1.1 (1998, Jason Stracner) Just Intonation Ear Trainer (1996, Hypercard) Listenl-A Music Skills Program (ECS media) 60 Melodic Ear (May 1999, NEW freeware) Music Ace (1996, Award-winning children's program) NoteWell (Upcoming program) Patterns in Pitch (ECS media) Play it by Ear (ca.1995, now owned by Alfred Publishing) Rhythm Ace (ca.1995, now owned by Alfred Publishing) Rhythmaticity Advanced (ca.1995) Rhythmaticity Basic (ca.1995) Super Ear Challenger (ECS media) Take Note (1997) Toon Up (ECS media) Tune-it ll (ECS media) WinOye (Now teoria) 61 APPENDIX F THE SOFTWARE REVIEW FORM A copy of the two-page review form begins on the next page. The review form is discussed in the second chapter, and page nineteen contains a detailed legend for reading the review form. 62 ----(Software Title) REVIEW-«- VERSION: REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: PLATFORM - O/S: INTENDED USES: Individual Practice Educational Institutions User-directed Practice Tracking of User Progress Games Music Tutorials K - 6 7 - 12 College ----AVAILABLE EXERCISES---- INTERVAIS: Ascending Descending Harmonic Compound CHORD Triads with Inversions IDENTIFICATION: (h _ , , 7 Chords wrth Inversrons Open /Closed Spacrng HARMONIC Inversions +6 Chords PROGRESSIONS: , , , Single-click Response Secondary Dominants MELODIES: Computer-generated Libraries of Melodies Melodies Include Rhythm SCALES: Major Minor Modal RHYTHMS: Hear/Notate Hear/Tap SINGING (AUDIO IN): Pitch Matching Intervals ADDITIONAL EXERCISES OR FEATURES: #====I===l== ----INSTRUCTIONAL ISSUES---- USER-DEFINED Exercise Setup Levels Practice Test Modes SETTINGS: INSTRUCTOR- Custom Tests Settings Scoring Parameters DEFINED SETTINGS: ----(Software Title) REVIEW CONTINUED---- RESPONSE OPTIONS: Screen Notation Screen Keyboard Mouse-click I.D. 
MIDI Input Singing Auto-checking of Answers Auto-skip to next Question USER FEEDBACK: Diagnostic Testing Statistics of Responses # of Correct and Incorrect Responses Hints RECORDS KEPT FOR: Current Session Only # of Correct Answers Total Times Individual Times Levels Passed RECORDS CAN BE: Auto-saved to: Hard Drive / Network / Student Disk Printed E-mailed Backed-up Restored Viewed in a Database Viewed as a Graph ----SYSTEM REQUIREMENTS and SETUP INFORMATION---- SYSTEM MINIMUM: PROGRAM SPECS: Program Size: HARDWARE: Soundcard Microphone MIDI Keyboard (Optional) SOFTWARE: ----PRICING and PRODUCT INF ORMATION---- APPROXIMATE Single-User Copy: $ COST (in US $): Lab-pack: X for $ Site License Available DEMO: Downloadable Demo WEBPAGE: E-MAIL / PHONE: COMPANY INFO: APPENDIX G REVIEWS OF THIRTY EAR-TRAINING CAI PROGRAMS TABLE OF CONTENTS FOR THIRTY PROGRAM REVIEWS Mac = Macintosh platform Win = Windows platform Records = Scorekeeping component Mac Win Records Reviewed Software Page Number W Anvil Studio MIDI sequencer/ear—trainer ........ 67 W R Aural Skills Trainer ........................ 69 W R Auralia .................................. 71 W Chord ID ................................ 73 W R Computerkolleg Musik ...................... 75 W Dolphin Don’s Music School ................. 77 W R Ear Trainer .............................. 79 W R EarMaster School ......................... 81 W Earobics ................................ 83 W Earpower ................................ 85 M Eartraining 2.6.1 .......................... 87 W ET Drill .................................. 89 W Fanfare ................................. 91 M Four-Part Dictation 5.1 .................... 93 M R Harmonic Hearing I & II ..................... 95 M W R Harmonic Progressions ................... 97 W R HearMaster ......................... 99 M W R Inner Hearing I & |l ....................... 101 M Listen ................................. 103 65 M M W M W W M W W W M W W W MacGAMUT ............................. 105 MiBAC ................................. 107 Musianship Basics ........................ 109 Music Lab Harmony ....................... 111 Music Lab Melody ........................ 113 PET (Personal Ear Trainer) ................. 1 15 Pitch ID ................................ 117 Practica Musica .......................... 119 teoria .................................. 121 The Music Box ........................... 123 Thought Sauce Ear Training Method .......... 125 66 ----Anvil Studio—Ear-training Accessory REVIEW---- VERSION: 1999.03.02 [Full copy reviewed on Windows 95] REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: May 2, 1999 PLATFORM - 0/ S: Windows 95/98/NT INTENDED USES: Individual Practice Educational Institutions User-directed Practice Tracking-of-Userl’mgrcss Games Music-Tutorials H 7 - 12 College I: JL ----AVAILABLE EXERCISES---- INTERVALS: Ascending Descending Harmonic Gomponnd CHORD Triads with Inversions Volume Control of Bass IDENTIFICATION: u, , , , 7 Chords wrth lnversrons Open /Closed Spacmg WON-IE Inversions +6-ehords PROGRESSIONS: MELODIES: Computer-generated Rhythm not evaluated Libraries-Melodies- Melodies Include Rhythm RHYTHMS: Hear/Notate Hcan‘Fap See/Play SINGING (AUDIO IN): Pitch Matching Intervals ADDITIONAL Anvil Studio is a freeware MIDI sequencer with optional EXERCISES OR , . , FE ATURES° accessories (such as ear-training) that can be purchased separately. ==== m. 
----INSTRUCTIONAL ISSUES---- USER-DEFINED Exercise Setup bards Practice Test-Modes SETTINGS: , Parameters for Melody and Chord exerczses H’l-S‘I‘RUG'POR1 Gum-"Fests Settings Scoring-Parameters DEFINED-SETTINGS: 67 ----Anvil Studio-Ear-training Accessory REVIEW CONTINUED---- RESPONSE OPTIONS: Screen Notation Screen Keyboard Mouse-click I.D. MIDI Input Singing Auto-checking of Answers Auto-skip to next Question USER FEEDBACK: Diagnostic-Testing- Statistics of Responses #rof-{Eorrect-and-Incorrect-Responses Hints REEORBS-KEP‘FFOR: Gurmanession—Only #ofemmct-Answm 5F Hg. ll' .1 IF I IP I P' l E- .1 l E l l- R l I”. l’ E l a? l S l ----SYSTEM REQUIREMENTS and SETUP INFORMATION---- SYSTEM MINIMUM: Windows 95 (100 mHz Pentium with 16 MB of RAM) PROGRAM SPECS: Program Size:1.2 MB Disk Space: 1.9 MB HARDWARE: Soundcard Microphone MIDI Keyboard (Optional) SOFTWARE: = =3 ----PRICING and PRODUCT INF ORMATION---- APPROXIMATE Single-User Copy: $19 COST (in US $): . . . bab=pack.-X-for$ Srte-trccnscfivaflablc DEMO: Downloadable Demo WEBPAGE: http://www.anvilstudio.com E-MAIL/ PHONE: support@anvilstudio.com COMPANY INFO: Willow Software, PO. Box 60122 Shoreline, WA 98160-0122 68 ----Aural Skills Trainer REVIEW---- VERSION: 1.82 (1998) [Demo copy reviewed on Windows 95] REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: April 18, 1999 PLATFORM - O/S: Windows 3.1/95, Macintosh INTENDED USES: Individual Practice Educational Institutions User-directed Practice Tracking of User Progress a: Games Music Tutorials K - 6 7 - 12 College ----AVAILABLE EXERCISES---- INTERVALS: Ascending Descending Harmonie Compound CHORD Triads with Inversions IDENTIFICATION: 7th Chords with Inversions Open-felosed-Spaeing meme inversrons' +6-6hords PROGRESSIONS: , - , , MEEOBIES: Gomputefigenerated RH’FHIMS: HearfNotate Hea-rfFap ADDITIONAL EXERCISES OR FEATURES: = =—-—-—=¥ Fm ----INSTRUCTIONALISSUES---- USER-DEFINED Exercise Setup bevels Practice =liest-ilvh'rdes SETTINGS: INSTRUGPOR- Enstom-Tests Settings Scoring-Parameters BEHN-E-B-SE‘FHNGS: 69 ----Aural Skills Trainer REVIEW CONTINUED-«- RESPONSE OPTIONS: Screen-Notation Screen-Keyboard- Mouse-clickI.D. MIBII I S' . Auto-checking of Answers Anto=skip-to-ncxt-anstion USER FEEDBACK: Biagnostic‘l’csting- Statistics of Responses # of Correct and Incorrect Responses Hints RECORDS KEPT FOR: Enfient-Session-On’ry #cf-Gorrectfinswers $1112. Il"ll¥' I IP I % scores for first, last, and best sessions. RECORDS CAN BE: Auto-saved to: Hard Drive fNetwork-I-Student-Bisk Printed E=1naiied Backed-11p Restored :z' I . E l a l. l S I Used to see first, last and high scores in each category. E ----SYSTEM REQUIREMENTS and SETUP INFORMATION---- SYSTEM MINIMUM: IBM 486, Windows 3.1; or Macintosh System 6.0.7 PROGRAM SPECS: Program Size:187 K Disk Space: 715 K HARDWARE: Soundcard Microphone MIDI Keyboard (Optional) SOFTWARE: == ----PRICING and PRODUCT INFORMATION-«- APPROXIMATE Single-User Copy: $99 Network: $500 COST (in US $): Izab=pack:-X-for$ Site License: $700 DEMO: Downloadable Demo WEBPAGE: http://www.ecsmedia.com E-MAIL / PHONE: sales@ecsmedia.com 1-800-832-4965 COMPANY INFO: ECS (Electronic Courseware Systems, Inc.) 
1210 Lancaster Drive, Champaign, IL 61821 70 ----Auralia 2.0 REVIEW---- VERSION: 2.0.4 (1998) [Full copy reviewed on Windows 95] REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: April 24, 1999 PLATFORM - O/S: Windows 95/98/NT INTENDED USES: Individual Practice Educational Institutions User-directed Practice Tracking of User Progress Games Tutorials K - 6 7 - 12 College ----AVAILABLE EXERCISES---- INTERVALS: Ascending Descending Harmonic Compound CHORD Triads with Inversions Jazz and Cluster Chords IDENTIFICATION: u, , , , 7 Chords W1th Inversrons Open-felosed-Spactng HARMONIC Inversions +6 Chords Jazz Progressions PROGRESSIONS: . , , Smgle-chck Response Secondary Dommants MELODIES: €0mpnte1=generatcd Answer by Notation Libraries of Melodies Melodies Include Rhythm SCALES: Major Minor Modal Jazz Scales, Whole Tone RHYTHMS: Hear/Notate Hear/Tap Rhythm-element ID SINGING (AUDIO IN): Pitch Matching Intervals Chords, Melodies, Scales ADDITIONAL 26 exercises including Rhythm Styles, Meter Recognition, EXERCISES OR , . FEATURES: Interval Comparison, Cadences, Tuning, and many Singing exercises. @ ----INSTRUCTIONAL ISSUES---- USER-DEFINED Exercise Setup Levels Practice Test Modes SETTINGS: Password protected user records. IN STRUCTOR- Custom Tests Settings Scoring-Parameters DEFINED SETTINGS: Password-protected administration option including the ability to create tests and track class scores. 71 ----Auralia 2.0 REVIEW CONTINUED---- RESPONSE OPTIONS: Screen Notation Screen Keyboard Mouse-click I.D. MIDI Input Singing Mmchecknrg-offinswers !-I' 8’ USER FEEDBACK: E' '=F' Statistics of Responses # of Correct and Incorrect Responses Hints RECORDS KEPT FOR: Enrrent-Session-Only # of Correct Answers Total Times Individual Times Levels Passed RECORDS CAN BE: Auto-saved to: Hard Drive / Network / Student-Disk Printed Emailed Backed-mpRestored Viewed in a Database Viewed as a Graph Fm = ----SYSTEM REQUIREMENTS and SE TUP INFORMATION-«- SYSTEM MINIMUM: IBM 486, 66 mHz or better required for microphone input PROGRAM SPECS: Program Size: 3.2 MB Disk Space: 6.5 MB HARDWARE: Soundcard Microphone MIDI Keyboard (Optional) SOFTWARE: m ----PRICING and PRODUCT INFORMATION-«- APPROXIMATE Single-User Copy: $149 Student Copy: $49 COST (in US $): . . Lab-pack: 5 for $395 Site License: $995 DEMO: Downloadable Demo WEBPAGE: http://www.risingsoftware.com E-MAIL/ PHONE: rising@risingsoftware.com US toll free: 888-667-7839 COMPANY INFO: Rising Software, 31 Elmhurst Road, Blackburn, Victoria, Australia 3130 72 ----Chord ID REVIEW-«- VERSION: 1997 [Full copy reviewed on Windows 95] REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: May 2, 1999 PLATFORM - O/S: Windows 95/98/NT INTENDED USES: Individual Practice Educational Institutions User-directed Practice Trackingof-User-Progress Games Music Tutorials H 7 - 12 College ----AVAILABLE EXERCISES-«- W #61 El I . I i . e ‘EI I s . HARMONIC Inversions #G-Ghords 20 levels PROGRESSIONS: , , _ SIDgIC-CIICK Response Secondary-Dominants MEEOBIES: €0mpnter=generated i'l . Fllll' lIll'IIIRII RHYTHMS: Hcan‘Notatc Hearffap ADDITIONAL Chord progressions are 8 bars long—one chord per bar. EXERCISES OR . , , _ “ ” FEATURES: Features libraries of progresszons m a pop style. Users compare their response with the correct answer. 
HE: E — ----INSTRUCTIONAL ISSUES---- USER-DEFINED Exercise Setup Levels Practice Test-Modes SETTINGS: INSTRUGPORI Custom-Tests Settings ScoringParameters BEF-I-NED-SE‘FH-NGS: 73 ----Chord ID REVIEW CONTINUED---- RESPONSE OPTIONS: Screen-Notation Screen-Keyboard- Mouse-click I.D. #oonrrect-and-Incorrect-Responses Hints REEORBSiEEP‘ILFORrEorrent-Session-Oniy #-of-€orrect-Answers ----SYSTEM REQUIREMENTS and SETUP INFORMATION---- SYSTEM MINIMUM: Windows 95 (IBM 486 or better) PROGRAM SPECS: Program Size: 293 K Disk Space: 1.3 MB HARDWARE: Soundcard Microphone MiBHéey‘board-(Optionai) SOFTWARE: ----PRICING and PRODUCT INFORMATION-«- APPROXIMATE Single-User Copy: $14.95 COST (in US $): . . . bab=paek:-X-for-$ Srte-hcenseAvarlabie DEMO: Downloadable Demo WEBPAGE: http://www.musicstudy.com E-MAIL/ PHONE: htrythal@yahoo.com COMPANY INFO: Dr. Gil Trythall, KBA Software, 41 West Main St. Morgantown, WV 26505 74 ----Computerkolleg Musik REVIEW-«- VERSION: (1999) [Full German Copy reviewed on Windows 95] REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: May 10, 1999 PLATFORM - O/S: Windows 95/98/NT INTENDED USES: Individual Practice Educational Institutions User-directed Practice Tracking of User Progress L Games Music Tutorials K - 6 7 - 12 College ----AVAILABLE EXERCISES---- INTERVALS: Ascending Descending Harmonic Compound CHORD Triads with Inversions Jazz Chords IDENTIFICATION: a, , , , 7 Chords w1th Inversrons Open /Closed Spacmg HARMONIC Inversions +6-ehords Cadence Patterns PROGRESSIONS: , , , Single-click Response Secondary-Dormant: MELODIES: Gompntcr-rgcnerated Pop, Classical, 12-Tone Libraries of Melodies Melodies Include Rhythm SCALES: Major Minor Modal Pentatonic, Blues RHYTHMS: Hear/Notate Heat-Hap Rhythm Elements ADDITIONAL Includes tonality exercises and Jazz cadences as well as EXERCISES OR . _ FEATURES, many written theory exercrses. ----INSTRUCTIONAL ISSUES-m USER-DEFINED Exercise Setup Levels Practice Test Modes SETTINGS: INSTRUG'POR- €nstom-Tests Settings Scoring-Parameters 913mm 75 ----Computerkolleg Musik REVIEW CONTINUED-u— RESPONSE OPTIONS: Screen Notation Screen Keyboard Mouse-click I.D. MIDI Input Singing- Anto=chccking~of+mswers !-I' 8' USER FEEDBACK: E“ 'T' 5.. FR # of Correct and Incorrect Responses Hints RECORDS KEPT FOR: Current-Session-Orrly # of Correct Answers Total Times Individual Times Levels Passed Number of sessions worked and the age of the user. RECORDS CAN BE: Auto-saved to: Hard Drive f-thwork-f-Stndenrfiisk Printed Emailed BackedmpRestored Viewed in a Database Viewedasa-Graph : i — ======A ----SYSTEM REQ IREMENTS and SETUP INFORMATION-«- SYSTEM MINIMUM: Windows 95 (IBM 486 or better) PROGRAM SPECS: Program Size:872K Disk Space: 52 MB HARDWARE: Soundcard Microphone MIDI Keyboard (Optional) CD-ROM Drive ~=== SOFTWARE: ----PRICING and PRODUCT INF ORMATION---- APPROXIMATE Single-User Copy: $ N/A English release late 1999 COST (in US $): . . . Lab-pack: X for $ N/A Site-hcensefivadabie DEMO: Bowntoadabic-Bemo WEBPAGE: http://www.schott-music.com/ckm.htm E-MAIL / PHONE: eamdc@eamdc.com 1-610-648-0506 COMPANY INFO: Schott Music Corp. NY, c/o European American Music Distribution Corp. 
Po Box 850, Valley Forge, PA 19482 76 ----Dolphin Don’s Music School REVIEW-m VERSION: 3.0 (1998) [Full Copy reviewed on Windows 95] REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: May 10, 1999 PLATFORM - O/S: Windows 3.1/95/98/NT INTENDED USES: Individual Practice Educational Institutions User-directed Practice Tracking of User Progress Games Music Tutorials K - 6 7 - 12 College FE ----AVAILABLE EXERCISES---- INTERVALS: Ascending Descending Harmonic Eornponnd CHORD Triads with-Inversions IDENTIFICATION: 7‘h Chords mm Open /Closed Spacing HARMONiC— Inversions W MEbOB-IES: €ompnter=gcnerated RHYTHMS: HearfNotate HcarfPap ADDITIONAL Many other fun music theory games. A wonderful game EXERCISES OR , . . . FEATURES: for children. Features the talking vozce of Dolphin Don. ----INSTRUCTIONAL ISSUES---- _ USER-DEFINED Exercise Setup Levels Practice Test-Modes SETTINGS: iNS‘PRHGPGR- Eastern—”Fests Settings ScofingParamctei-s BEHN-EB—SE‘FH-NGS: 77 ----Dolphin Don’s Music School REVIEW CONTINUED-u— RESPONSE OPTIONS: Screen Notation Screen-Keyboard- Mouse-click I.D. W surging Auto—checking of Answers Auto-skip to next Question USER FEEDBACK: Diagnostic-Testing- Statistics of Responses #-of€oncct-and~incorrect-Responses Hints RECORDS KEPT FOR: Garrent-Scssion-Oniy ”Correct-Answers Tomi-Times IndividuaI-Times Levels Passed Levels of achievement. RECORDS CAN BE: Auto-saved to: Hard Drive f'thWUfk'f'Sttrdmt'Bisk Printed Emailed Backed=npRestored Viewed in a Database Viewedas-a-Graph ----SYSTEM REQUIREMENTS and SETUP INFORMATION---- SYSTEM MINIMUM: Windows 3.1 with 8 MB of RAM PROGRAM SPECS: Program Sizezl MB Disk Space: 5.2 MB HARDWARE: Soundcard Microphone MiBI-Keyboard-(Optiona-l) SOFTWARE: ----PRICING and PRODUCT INF ORMATION---- APPROXIMATE Single-User Copy: $49 COST (in US $): . . . bab=pack:-)Hor$ Site-bicenscaérvaiiable DEMO: Downloadable Demo WEBPAGE: http://www.dolphindon.com r E-MAIL/ PHONE: ddon@dolphindon.com 1-256-721-2537 / COMPANY INFO: Don Bowyer, Dolphin Don’s Music School 5041 Galaxy Way #212, Huntsville AL 35816 78 ----Eartrainer REVIEW-«- VERSION: (1997) [Full Copy reviewed on Windows 3.1] REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: May 6, 1999 PLATFORM - O/S: DOS (Windows 3.1/95/98) INTENDED USES: Individual Practice Educational Institutions User-directed Practice Tracking of User Progress an: Games Music Tutorials H 7 - 12 College ----AVAILABLE EXERCISES-«- INTERVALS: Ascending Descending Harmonic Compound I-BE-N‘PifiGA‘HON: :7“, , , , W inversrons' -r6-€hords PROGRESSIONS: , _ , , MEbODIES: €0mpnter=generated RHYTHMS: Hcan‘Notatc HearfFap ADDITIONAL EXERCISES OR FEATURES: m ----INSTRUCTIONAL—ISSUES---- USER-DEFINED Exercise Setup Levels Practice Test Modes SETTINGS: _ . , , , User cannot speczfy direction of intervals to practice INSTRUCTOR- Custom Tests Settings Scoring Parameters DEFINED SETTINGS: 79 ----Eartrainer REVIEW CONTINUED-u— RESPONSE OPTIONS: MiBI-Inpnt- Singing- Arrow Keys or Letter Keys Auto-checking of Answers Auto-skip to next Question USER FEEDBACK: Diagnostic Testing Statistics of Responses #oonrrect-and-Incorrect-Responses Hints RECORDS KEPT FOR: €urrent-Sessiorr6niy #cf-eorrectaknswers T 1‘? I 1.1 I‘F' i IP I % correct and average response time for each interval. 
RECORDS CAN BE: Auto-saved to: Hard Drive-f-Network-f-Stndent-Bisk Printed E=maiied BackcdmpRestored Viewed in a Database Viewed-asafiraph ----SYSTEM REQUIREMENTS and SETUP INFORMATION-«- SYSTEM MINIMUM: DOS 5.0 or higher, 640K of RAM PROGRAM SPECS: Program Size: 168 K Disk Space: 172 K HARDWARE: Soundcard Microphone Ward-(Optional) SOFTWARE: ----PRICING and PRODUCT INFORMATION---- APPROXIMATE COST (in US $): Single-User Copy: $9.95 DEMO: Bownioadabie—Bemo WEBPAGE: http://www.ilovemusic.com E-MAIL/ PHONE: ear@ilovemusic.com 1-415-665-8933 f [ COMPANY INFO: Forest Hill Music, 25 Balceta Ave, San Francisco, CA 94127 80 ----EarMasterSchool REVIEW-u— VERSION: 2.5 (1998) [Demo Copy reviewed on Windows 95] REVIEWER: Douglas Spangler httpz/lwww.msu.edu/user/spangle9 REVIEW DATE: April 21, 1999 PLATFORM - O/S: Windows 3.1/95/98/NT INTENDED USES: Individual Practice Educational Institutions User-directed Practice Tracking of User Progress Games Mnsic-"Fntoriais K - 6 7 - 12 College ----AVAILABLE EXERCISES---- INTERVALS: Ascending Descending Harmonic Compound CHORD Triads with Inversions Custom Chords IDENTIFICATION: 7th Chords with Inversions Open /Closed Spacing HARMONiG- inversions -r6€hords PROGRESSIONS: S' I _ l' l R S l E . MELODIES: Computer-generated SCALES: Major Minor Modal Custom Scales RHYTHMS: Hearchtatc Hear/Tap Error Detection ADDITIONAL EXERCISES OR FEATURES: m ; ----INSTRUCTIONAL ISSUES---- USER-DEFINED Exercise Setup Levels Practice Test Modes SETTINGS: INSTRUCTOR- Custom Tests Settings Scoring Parameters DEFINED SETTINGS: Detailed class records via computer network with password protection. 81 ----EarMasterSchool 2.5 REVIEW CONTINUED---- RESPONSE OPTIONS: Screen Notation Screen Keyboard Mouse-click I.D. MIDI Input Singing Auto-checking of Answers Auto-skip to next Question USER FEEDBACK: Diagnostic Testing Statistics of Responses # of Correct and Incorrect Responses Hints RECORDS KEPT FOR: Current-Session-On-ly # of Correct Answers ‘Fotai-‘Fimes Individual Times Levels Passed Dates, times worked, levels completed, percentage scores. RECORDS CAN BE: Auto-saved to: Hard Drive / Network / Student-Disk Printed E=maiied Backed-up Restored Saved to floppy disk. ----SYSTEM REQUIREMENTS and SETUP INFORMATION-«- SYSTEM MINIMUM: Windows 3.1 (IBM 486) PROGRAM SPECS: Program Size: 1.4 MB Disk Space: 3.3 MB HARDWARE: Soundcard Microphone MIDI Keyboard (Optional) SOFTWARE: ----PRICIN G and PRODUCT INFORMATION---- APPROXIMATE COST (in US 3): Single-User Copy: $118 Lab-pack: 5 for $355 Site License: $770 DEMO: Downloadable Demo WEBPAGE: http://www.miditec.com E-MAIL/ PHONE: info@miditec.com (+45) 43-6464-49 COMPANY INFO: [\i‘r MidiTec Vegavaenget 26, DK - 2620 Albertslund, Denmark 82 ----Earobics REVIEW-«- VERSION: 1.5 (1998) [Demo Copy reviewed on Windows 95] REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: April 17, 1999 PLATFORM - O/S: Windows 95/98 INTENDED USES: Individual Practice Educational Institutions Tracking-oPUser-Progress User-directed Practice H 7- 12 College m ----AVAILABLE EXERCISES-«- INTERVALS: Ascending Descending Harmonic Compound CHORD Triads with Inversions 9””, 11””, Suspensions IDENTIFICATION: th . , , 7 Chords wrth lnversrons Open-felosed-Spactng HARMONIC Inversions -t-6-€hords PROGRESSIONS: , , , Single-click Response Secondary Dominants MELODIES: Computer-generated SCALES: Major Minor Modal Whole Tone, Pentatonic RHYTHMS: Hear/Notate Hear/Tap See/ Tap ADDITIONAL Single-click chord inversion identification exercises. EXERCISES OR , . 
FEATURES: Quick and Simple screen notation entry method. ----INSTRUCTIONAL ISSUES---- USER-DEFINED Exercise Setup Levels Practice Test Modes SETTINGS: . . Savmg and loading of custom user profiles (settings). INSTRUGTORv €nstom-"Fests Settings Scoring-Parameters BEHN‘EB-SE‘FH-NGS: 83 ----Earobics REVIEW CONTINUED---- RESPONSE OPTIONS: Screen Notation Screen Keyboard Mouse-click I.D. MIDI Input Singing Auto-checking of Answers Auto-skip to next Question USER FEEDBACK: Diagnostic Testing Statistics of Responses # of Correct and Incorrect Responses Hints RECORDS KEPT FOR: Current Session Only #cf-Eorrcct-Answers T I‘F' I I"! I? I IP I User-defined profiles of settings Printed E=maiicd Backcd=np Restored II. I. E I II. I S I m ----SYSTEM REQUIREMENTS and SETUP INFORMATION---- SYSTEM MINIMUM: Windows 95 (IBM 486 or better) PROGRAM SPECS: Program Size: 882 K Disk Space: 2.7 MB HARDWARE: Soundcard Microphone MIDI Keyboard (Optional) SOFTWARE: ----PRICING and PRODUCT INFORMATION-«- APPROXIMATE Single-User Copy: $69 Sliding Price Scale COST (in US $): Lab-pack: 10 for $500 Site License Available DEMO: Downloadable Demo WEBPAGE: http://www.cope.dk E-MAIL/ PHONE: info@cope.dk (+45) 3312-0747 COMPANY INFO: Cope Media N¢ITC Sogade 25c, DK-1370 Kobenhavn K, Denmark 84 ----EarPower 3.0 REVIEW-m VERSION: 3.0 (1999) [Demo copy reviewed on Windows 95] REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: April 20, 1999 PLATFORM - O/S: Windows 3.1/95/98/NT INTENDED USES: Individual Practice Educational Institutions Trackingof-User-Progress User-directed Practice ----AVAILABLE EXERCISES---- H 7 - 12 College INTERVALS: Ascending Descending Harmonic Compound CHORD Triads with Inversions Customizable Chords IDENTIFICATION: th , . , 7 Chords wrth InverSlons Open /Closed Spacrng HARMONiC— Inversions 145-Chords PROGRESSiONS: MELODIES: Computer-generated Customizable Melodies RHYTHMS: Hear/Notate Hear/Tap Customizable Rhythms SINGING (AUDIO IN): Pitch Matching Intervals Melodies, Chords ADDITIONAL Rhythm exercises also include the option of notating the EXERCISES OR , , . . FEATURES’ answer by clicking on “rhythmic unit” boxes. Microphone input can be used to respond to all exercises. ----INSTRUCTIONAL ISSUES---- USER-DEFINED Exercise Setup Levels Practice Test Modes SETTINGS: , , User can save custom configurations and exerczses. INSTRBGPORv Custom-"Fests Settings Scoring-Parameters BEHNEB—SE—FHNGS: 85 ----EarPower 3.0 REVIEW CONTINUED---- RESPONSE OPTIONS: Screen Notation Screen Keyboard Mouse-click I.D. MIDI Input Singing Guitar F ret-board Auto-checking of Answers Auto-skip to next Question USER FEEDBACK: Diagnostic Testing Statistics of Responses # of Correct and Incorrect Responses Hints RECORDS KEPT FOR: Current Session Only # of Correct Answers ‘P 113' I l"l IT' I IP I Printed E-maried— ' Backed-np— Restored II. I . E I II. I S l ----SYSTEM REQUIREMENTS and SETUP INFORMATION-«- SYSTEM MINIMUM: IBM 386, Windows 3.1 (486 or better recommended) PROGRAM SPECS: Program Size: 478 K Disk Space: 813 K HARDWARE: Soundcard Microphone MIDI Keyboard (Optional) APPROXIMATE COST (in US $): Single-User Copy: $25 bab=paek.-)(-for$ Site License Available DEMO: Downloadable Demo WEBPAGE: http://www.earpower.com E-MAIL / PHONE: sheep13@aol.com 1—800-2424-775 x 14915 COMPANY INFO: Fast and Soft Author: Nick Baciu 402 Onderdonk Ave. #1R, Ridgewood, NY 11385 86 ----Eartraining 2.6.1 REVIEW-«- VERSION: 2.6.] 
(1998) [Demo reviewed on PowerMac, OS 8] REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: March 14, 1999 PLATFORM - O/S: Macintosh INTENDED USES: Individual Practice Educational-Institutions User-directed Practice Tracking-of-Bser-Progress a Games Tutorials H 7 - 12 College ----AVAILABLE EXERCISES-«- INTERVALS: Ascending Descending Harmonic Compound CHORD Triads with Inversions Suspensions IDENTIFICATION: 7th Chords with Inversions Open-fClosed-Spacing HARMONIC- lnversions #6‘eh01'ds MEEOBHES: Computer=gencratcd SCALES: Major Minor Modal Custom Scales RHYTHMS: Healfl‘htatc HearfPap ADDITIONAL Pitch Practice: exercises absolute pitch by playing EXERCISES OR , . FEATURES: random notes--an alternate form of interval exerczse. ----INSTRUCTIONAL ISSUES---- USER-DEFINED SETTINGS: Exercise Setup bevcls Practice Ofi’ers flexible user-defined exercises and settings. INSTRUCTOR- BEHNE—B—SE‘FPINGS: Custom-TestsScttings Scoring-Parameters 87 ----Eartraining 2.6.1 REVIEW CONTINUED---- RESPONSE OPTIONS: Screen-Notation Screen—Keyboard- Mouse-click I.D. IHEII S' . Auto—checking of Answers Auto-skip to next Question USER FEEDBACK: Diagnostic-Testing- Statistics of Responses # of Correct and Incorrect Responses Hints RECORDS KEPT FOR: Current Session Only # of Correct Answers Total-Times I I. .I I‘F' bevels-Passed RECORDS CAN BE: Printed E=mailed Backed=upRestored II. I. E I Viewed as a Graph m ----SYSTEM REQUIREMENTS and SETUP INFORMATION---- SYSTEM MINIMUM: Macintosh System 7.1.3 to OS 8 PROGRAM SPECS: Program Size: 554 K RAM: 400 K HARDWARE: SoundcardMicrophone MIDI Keyboard (Optional) SOFTWARE: E OMS 2.0 or higher required to use MIDI @ ----PRICING and PRODUCT IN FORMATION---- APPROXIMATE Single-User Copy: $20 Shareware COST (in US $): . . bab=packr96for$ Slte Llcense: $130 DEMO: Downloadable Demo WEBPAGE: http://members.aol.com/LarsPeters/ E-MAIL/ PHONE: LarsPeters@aol.com COMPANY INFO: Lars Peters Leibnizstrasse 9, 22089 Hamburg, Germany 88 ----ETDrill REVIEW---- VERSION: 3.0 (1999) [Full copy reviewed on Windows 95] REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: April 10, 1999 PLATFORM - O/S: Windows 95/98/NT (DOS version also available) INTENDED USES: Individual Practice Educational Institutions User-directed Practice Trackingcfb‘serPl-ogress Games gTutorials H i7-12§College ----AVAILABLE EXERCISES-u— INTERVALS: Ascending Descending Harmonic Compound CHORD Triads with Inversions Volume control--bass voice IDENTIFICATION: .h . , , 7 Chords wrth InverSlons Cpen-fCIosed-Spacnrg HARMONIC Inversions +6 Chords Borrowed Chords PROGRESSIONS: , , , Smgle=chck~Response Secondary Domlnants MELODIES: Computcr=gcneratcd Rhythm not evaluated Libraries of Melodies Melodies Include Rhythm RHYTHMS: llean‘Notate HcarfFap ADDITIONAL Pitch Patterns—user indicates solfege, scale degrees, or EXERCISES OR , , , FEATURES° answers Vla MIDI to melodies Without rhythm. Melodic dictations are answered exclusively via MIDI. ----INSTRUCTIONAL ISSUES---- USER-DEFINED Exercise Setup Levels Practice Test-Modes SETTINGS: . . _ , Volume control for each vaice in harmonic progresswns. INSTRUCTOR- Cnstonr-Tests Settings ScoringPararnetcrs DEFINED-SETTINGS: 89 ----ETDrill REVIEW CONTINUED---- RESPONSE OPTIONS: ScreenNotation Screen-Keyboard— Mouse-click I.D. 
MIDI Input Singing- Solfege or Scale Degree # Auto=cheekingvffinswerr Auto-skip to next Question USER FEEDBACK: E' 'T‘ Statistics of Responses # of Correct and Incorrect Responses Hints RECORDS KEPT FOR: Current Session Only # of Correct Answers cFowl-Times ----SYSTEM REQUIREMENTS and SETUP INFORMATION-«- SYSTEM MINIMUM: Windows 95 (IBM 486 or better) PROGRAM SPECS: Program Size: 544K Disk Space: 800 K HARDWARE: Soundcard Microphone MIDI Keyboard (Optional) SOFTWARE: #—=l= w ----PRICING and PRODUCT INFORMATION---- APPROXIMATE Single-User Copy: $50 Schools-sliding price scale COST (in US $): . . . Lab-pack: 11 for $440 Site-breensefivarlablc DEMO: Downloadable Demo (free) Mail order demo: $5 WEBPAGE: http://theory.music.indiana.edu/etdrill/ E-MAIL/ PHONE: etdrill@indiana.edu COMPANY INFO: Indiana University Project Directors: Eric Isaacson and Gary Wittlich 90 ----Fanfare REVIEW---- VERSION: 1.1 (1997) [Full copy reviewed on Windows 95] REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: April 23, 1999 PLATFORM - O/S: Windows 3.1/95/NT INTENDED USES: Individual Practice Educational Institutions User-directed Practice Tracking of User Progress Games cFutorials Er6 7 - 12 College f== ===l=fl=== ---AVAILABLE EXERCISES---- INTERVALS: Ascending Descending Harmonic Compound CHORD Triads with Inversions Keyboard Range IDENTIFICATION: th . , , 7 Chords wrth Inversrons Open-ICIosed-Spacmg HARMONIC Inversions +6-ehords Cadences PROGRESSIONS: , , , Single-click Response Secondary—Bonunants MELODIES: Computer-generated SCALES: Major Minor Modal Whole Tone RHYTHMS: Hear/Notate Hear/Tap ADDITIONAL Tuning exercise and general music reading exercises. EXERCISES OR FEATURES: £==========l=== ----INSTRUCTIONAL ISSUES---- USER-DEFINED Exercise Setup bevels Practice cFest-Modes SETTINGS: INSTRUCTOR- Custom-Tests Settings Scoring-Parameters DEFINED SETTINGS: Instructor has sole access to the password-protected student records. 91 ----Fanfare REVIEW CONTINUED-«- RESPONSE OPTIONS: Screen Notation Screen Keyboard Mouse-click I.D. ”"3” S' . Auto=cheeking-of-Answers- ,4. 5' USER FEEDBACK: E' 'T' Statistics of Responses # of Correct and Incorrect Responses Hints RECORDS KEPT FOR: Current Session Only # of Correct Answers Total Times Individual Times Levels Passed Text document of Names, Dates, Exercises, % Correct. RECORDS CAN BE: Auto-saved to: Hard Drive fNetwork-f-Student-Bisk Printed E=maiied Backed=upRestored n. l' E l Viewed-as-afiraph Accessed with a password (by the instructor) F===== ----SYSTEM REQUIREMENTS and SETUP INFORMATION-m SYSTEM MINIMUM: Windows 3.1 (IBM 386 or better) PROGRAM SPECS: Program Size: 2.7 MB Hard Drive: 4.5 MB HARDWARE: Soundcard Microphone MIDI Keyboard (Optional) SOFTWARE ----PRICING and PRODUCT INFORMATION---- APPROXIMATE Single-User Copy: $99 Student Price: $79 COST 6" US 3): W Site-bieense-Avafiable DEMO: Downloadable Demo WEBPAGE: http://www.stardock.com E-MAIL/ PHONE: sales@stardock.com 1-734-762-0687 COMPANY INFO: Stardock Systems, Inc. 17292 Fannington Road Livonia, MI 48152 (Author: Jerry Wyrick) 92 ----Four-Part Dictation 5.1 REVIEW---- VERSION: 5.1 (1990) [Full Copy reviewed on PowerMac, O/S 7.5.3] REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: April 18, 1999 PLATFORM - O/S: Macintosh INTENDED USES: Individual Practice Educational Institutions User-directed Practice Tracking of User Progress College mgmmmueu' 7*"va 9W HARMONIC Inversions +6 Chords Altered Dominants PROGRESSIONS: , , . 
Single-click Response Secondary Dominants MELODIES: €ornputer=generated 5 -note Melodies Libraries of Melodies Mciodies-Include-Rhythm RHYTHMS: Hezm‘Notate I-Iean‘Fap ADDITIONAL Each line of the harmonic dictations can be answered EXERCISES OR _ , , _ . _ . FEATURES: indiVidually and used as melodic dictation practice. ----INSTRUCTIONAL ISSUES---- USER-DEFINED Exercise Setup Levels Practice Test Modes SETTINGS: , . Test mode requires full version of Hypercard 2.2 INSTRUCTOR- Custom Tests Settings Scoring Parameters DEFINED SETTINGS: Instructor can enter progressions and obtain scores (shown as a percentage) of student tests. 93 ----Four-Part Dictation 5.1 REVIEW CONTINUED---- RESPONSE OPTIONS: Screen Notation Screen-Keyboard Mouse-click I.D. #of-eorrectnndincorrect-Responses Hints RECORDS KEPT FOR: Current-Session-Only #cf-Gorreet-Answers 5F H" I l' 'l I? I I? l Percentage scored on tests. RECORDS CAN BE: Auto-saved-trr-Hard-ane-ffietwork-f-Student-Brsk Printed E-mailed Backed=up Restored I l. l . E l a ,. l S l Viewed by instructor only. w ----SYSTEM REQUIREMENTS and SETUP INFORMATION-«- SYSTEM MINIMUM: Macintosh System 6.0.3 PROGRAM SPECS: Program Size: 430 K HARDWARE: Soundcard Microphone MIDI Keyboard (Optional) SOFTWARE: H ypercard 2.2 for scores (Hypercard Player 2.2) E——== ----PRICING and PRODUCT INFORMATION---- APPROXIMATE Singie=User€opyr$ Freeware COST (in US $): . . . bab=paek:-X-for-$ Site-Incensefivarlable DEMO: Downloadable Demo WEBPAGE: http://www.jan.ucc.nau.edu/~tas3/courseindex.html E-MAIL/ PHONE: tim.smith@nau.edu COMPANY INFO: Dr. Timothy Simth, 3353 S. Carol Dr. Flagstaff, AZ. 86001 94 ----Harmonic Hearing 1 & 11 REVIEW---- VERSION: Units I & II (1999) [Demo reviewed on PowerMac, OS 8] REVIEWER: Douglas Spangler http://www.msu.edu/user/spangle9 REVIEW DATE: April 11, 1999 PLATFORM - O/S: Macintosh INTENDED USES: Individual Practice Educational Institutions User-directed Practice Tracking of User Progress K-=-6 7- 12 College ----AVAILABLE EXERCISES---- {NW ! l’ E l' Harmonic€ompound CHORD- WON: 512.11} . 7618] I 'II . Opem‘Gloscd-Spacing HARMONIC PROGRESSIONS: Inversions +6 Chords Single-click Response Secondary Dominants M-EbOBI-E-S: €0mputer=gencrated Meiodierhrclude-Rhythm Minor Major RHYTHMS: Hear/Notate Included in the Melodies SINGING-(AWN): I" Hi I“ ADDITIONAL EXERCISES OR FEATURES: é=============== ----INSTRUCTIONAL ISSUES-m USER-DEFINED Exercise Setup beveis Practice Test-Modes SETTINGS: INS‘PRUGFORI Custom-Tests Settings Scoring-Parameters DEFINED-SETTINGS: 95 ----Harmonic Hearing I & II REVIEW CONTINUED---- RESPONSE OPTIONS: Screen Notation Mouse-click I.D. USER FEEDBACK: RECORDS KEPT FOR: Tomi-Times Individual Times levels Passed Class, Name, Dates, Minutes, and Scores are indicated RECORDS CAN BE: Auto-saved to: Hard Drive fNetwork-f-Studcnt-Bisk Printed E=mailcd BackedmpRestored Viewed in a Database Viewed-as-a-Graph Sorted by categories such as student name or class. ----SYSTEM REQUIREMENTS and SETUP INFORMATION-«- SYSTEM MINIMUM: Macintosh System 6.0.3 PROGRAM SPECS: Program Size: 662 K Disk Space: 1.1 MB HARDWARE: Soundcard Microphone MI-BI-Keyboard-(Gptional) SOFTWARE: Requires OMS for sound ----PRICING and PRODUCT IN FORMATION---- APPROXIMATE COST (in US $): Single-User Copy: $55 Units I & II sold separately Isab=pack.—X-for$ 5.1. ! 
DEMO: Downloadable Demo
WEBPAGE: http://www.musicalhearing.com
E-MAIL/PHONE: scott@musicalhearing.com; 1-508-643-9122
COMPANY INFO: Musical Hearing, 6 Shepard Street, Plainville, MA 02762

----Harmonic Progressions REVIEW----
VERSION: 3.0 (1999) [Demo copy reviewed on Windows 95]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: May 7, 1999
PLATFORM - O/S: Windows 3.1/95/98/NT; Macintosh
INTENDED USES: Individual Practice; Educational Institutions; User-directed Practice; Tracking of User Progress; Games; College

----AVAILABLE EXERCISES----
HARMONIC PROGRESSIONS: Inversions; Embellishing 6/4 Chords; Single-click Response
ADDITIONAL EXERCISES OR FEATURES: Cadence patterns and recognition of notated harmonic progressions. Features a summary (after each exercise) of the number of times each chord type was missed.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Exercise Setup; Practice

RESPONSE OPTIONS: Mouse-click I.D.; Auto-checking of Answers
USER FEEDBACK: Statistics of Responses; # of Correct and Incorrect Responses; Hints
RECORDS KEPT FOR: First, Last, and Best scores for each type of exercise
RECORDS CAN BE: Auto-saved to Hard Drive; Printed

----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: IBM 486, Windows 3.1; Macintosh System 6.0.3
PROGRAM SPECS: Program Size: 495 K; Disk Space: 1.5 MB
HARDWARE: Soundcard; Microphone; MIDI Keyboard (Optional)

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: $200; Network: $1000; Site License: $1400
DEMO: Downloadable Demo
WEBPAGE: http://www.ecsmedia.com
E-MAIL/PHONE: sales@ecsmedia.com; 1-800-832-4965
COMPANY INFO: ECS (Electronic Courseware Systems, Inc.), 1210 Lancaster Drive, Champaign, IL 61821

----HearMaster REVIEW----
VERSION: 2.0 (1997) [Full copy reviewed on Windows 95]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: April 17, 1999
PLATFORM - O/S: Windows 95/98; Macintosh; Atari
INTENDED USES: Individual Practice; Educational Institutions; User-directed Practice; Tracking of User Progress; 7-12; College

----AVAILABLE EXERCISES----
INTERVALS: Ascending; Descending; Harmonic; Compound
CHORD IDENTIFICATION: Triads with Inversions; 7th Chords with Inversions; Custom Chord Entry
MELODIES: Computer-generated; Short Custom Melodies
SCALES: Major; Minor; Modal; Custom Scales; Jazz
RHYTHMS: Hear/Tap
ADDITIONAL EXERCISES OR FEATURES: Extensive user manual suggesting uses and approaches. Use of MIDI notes as "remote controllers" for exercises. Can analyze any chord played on the MIDI keyboard.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Exercise Setup; Levels; Practice. Extensive customizable settings can be saved.
INSTRUCTOR-DEFINED SETTINGS: No separate instructor access, but an instructor can create and load custom lessons or exercises.

RESPONSE OPTIONS: Screen Notation; Screen Keyboard; Mouse-click I.D.; MIDI Input; Auto-checking of Answers; Auto-skip to next Question
USER FEEDBACK: Statistics of Responses; # of Correct and Incorrect Responses; Hints
RECORDS KEPT FOR: Current Session Only; # of Correct Answers; Questions Attempted; # of Repeats; Percentage Correct
RECORDS CAN BE: Printed; Saved as individual text files to Hard Drive or Floppy

----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: Windows 95; or Macintosh Plus, System 6.0.4 or higher
PROGRAM SPECS: Program Size: 1.2 MB; Disk Space: 1.6 MB
HARDWARE: Soundcard; Microphone; MIDI Keyboard (Optional)

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: MSRP $99
WEBPAGE: http://www.emagic.de
E-MAIL/PHONE: info@emagic.com
COMPANY INFO: Emagic Soft- und Hardware GmbH, Halstenbeker Weg 96, D-25462 Rellingen, Germany

----Inner Hearing I & II REVIEW----
VERSION: Units I & II (1999) [Demos reviewed on Windows 95]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: April 11, 1999
PLATFORM - O/S: Windows 95/98; Macintosh
INTENDED USES: Individual Practice; Educational Institutions; User-directed Practice; Tracking of User Progress; 7-12; College

----AVAILABLE EXERCISES----
MELODIES: Libraries of Melodies (230 different melodies); Melodies Include Rhythm
RHYTHMS: Hear/Notate; Hear/Tap; Included in the Melodies
ADDITIONAL EXERCISES OR FEATURES: Unit I contains 130 folk melodies; Unit II contains 100 melodies of Mozart, Haydn, and Beethoven. Quick method of screen notation to answer each phrase.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Exercise Setup; Practice. User can choose dictation of rhythm, melody, or both.

RESPONSE OPTIONS: Screen Notation; Mouse-click I.D.
USER FEEDBACK: Hints
RECORDS KEPT FOR: Individual Times; Levels Passed; Class, Name, Dates, Minutes, and Scores are indicated
RECORDS CAN BE: Auto-saved to Hard Drive; Printed; Viewed in a Database; Sorted by categories such as student name or class

----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: Windows 95 (IBM 486 or better); Macintosh O/S 6.0.3
PROGRAM SPECS: Program Size: 815 K; Disk Space: 978 K
HARDWARE: Soundcard; Microphone
SOFTWARE: Mac requires OMS for sound

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: $55 (Units I & II sold separately)
DEMO: Downloadable Demo
WEBPAGE: http://www.musicalhearing.com
E-MAIL/PHONE: scott@musicalhearing.com; 1-508-643-9122
COMPANY INFO: Musical Hearing, 6 Shepard Street, Plainville, MA 02762

----Listen REVIEW----
VERSION: 2.4 (1998) [Demo reviewed on PowerMac, OS 8]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: March 11, 1999
PLATFORM - O/S: Macintosh (System 6 to OS 8)
INTENDED USES: Individual Practice; Educational Institutions; User-directed Practice; Games; Tutorials; 7-12; College

----AVAILABLE EXERCISES----
INTERVALS: Ascending; Descending; Harmonic; Compound
CHORD IDENTIFICATION: Triads with Inversions; 7th Chords with Inversions; 9th, 11th, and 13th chords
MELODIES: Computer-generated
ADDITIONAL EXERCISES OR FEATURES: Setting for a "beat the timer" mode in which a specified number of seconds is allowed for each answer.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Exercise Setup; Practice. MIDI notes can be used to replay or skip questions.

RESPONSE OPTIONS: Screen Keyboard; Screen Guitar; Mouse-click I.D.; Auto-checking of Answers; Auto-skip to next Question
USER FEEDBACK: # of Correct and Incorrect Responses; Hints

----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: Mac Classic or higher (System 6 or above)
PROGRAM SPECS: Program Size: 800 K; RAM: 500 K
HARDWARE: Soundcard; Microphone; MIDI Keyboard (Optional)

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: $99; Lab-pack: 5 for $249; Site License Available
DEMO: Downloadable Demo
WEBPAGE: http://www.imaja.com/listen/index.html
E-MAIL/PHONE: software@imaja.com; 1-510-526-4621
COMPANY INFO: Listen, P.O. Box 6386, Albany, CA 94706

----MacGAMUT REVIEW----
VERSION: 3.8 (1998) [Full copy evaluated on PowerMac, OS 8]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: March 14, 1999
PLATFORM - O/S: Macintosh (System 7 or higher)
INTENDED USES: Individual Practice; Educational Institutions; User-directed Practice; Tracking of User Progress; Games; 7-12; College

----AVAILABLE EXERCISES----
INTERVALS: Ascending; Descending; Harmonic; Compound
CHORD IDENTIFICATION: Triads with Inversions; 7th Chords with Inversions
HARMONIC PROGRESSIONS: Inversions; +6 Chords; Borrowed Chords; Secondary Dominants; Single-click Response
MELODIES: Libraries of Melodies; Melodies Include Rhythm; MIDI entry of answers
SCALES: Major; Minor; Modal; Octatonic; Pentatonic
ADDITIONAL EXERCISES OR FEATURES: Includes exercises for written music theory. Music reading / keyboard drill where students sight-read notated pitches by playing them on a MIDI keyboard.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Exercise Setup; Levels; Practice; Test Modes. Students can choose exercise materials in practice mode.
INSTRUCTOR-DEFINED SETTINGS: Custom Tests; Settings; Scoring Parameters. Separate instructor disk enables the ordering of units and the entry of custom melodic and harmonic exercises.

RESPONSE OPTIONS: Screen Notation; Mouse-click I.D.; MIDI Input
USER FEEDBACK: Statistics of Responses; # of Correct and Incorrect Responses; Hints
RECORDS KEPT FOR: # of Correct Answers; Individual Times; Levels Passed; Dates and Minutes worked
RECORDS CAN BE: Auto-saved to Student Disk; Printed; E-mailed; Backed-up; Restored

----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: Macintosh System 7 or higher
PROGRAM SPECS: Program Size: 471 K; RAM: 1250 K
HARDWARE: Soundcard; Microphone; MIDI Keyboard (Optional); 1.44 MB floppy drive
SOFTWARE: QuickTime recommended if MIDI is not available

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: $35; Lab-pack: 5 for $140
DEMO: Downloadable Demo; also available by mail
WEBPAGE: http://www.macgamut.com
E-MAIL/PHONE: info@macgamut.com; 1-800-305-8731
COMPANY INFO: MacGAMUT Music Software International, 98 Brevoort Road, Columbus, OH 43124

----MiBAC 3.0 REVIEW----
VERSION: 3.0 (1996) [Full copy reviewed on PowerMac, OS 8]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: April 23, 1999
PLATFORM - O/S: Macintosh System 7 or higher (or Windows 3.1/95/NT)
INTENDED USES: Individual Practice; Educational Institutions; User-directed Practice; Tracking of User Progress; Games; Music Tutorials; 7-12; College

----AVAILABLE EXERCISES----
INTERVALS: Ascending; Descending; Harmonic; Compound
SCALES: Major; Minor; Modal; Jazz Scales
ADDITIONAL EXERCISES OR FEATURES: Features many exercises pertaining to written music theory and keyboard playing skills. Detailed records are kept.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Exercise Setup; Levels; Practice. MIDI shortcut keys are available.

RESPONSE OPTIONS: Mouse-click I.D.; Auto-checking of Answers; Auto-skip to next Question
USER FEEDBACK: Statistics of Responses; # of Correct and Incorrect Responses; Hints
RECORDS KEPT FOR: # of Correct Answers; Percentages and types of questions answered incorrectly
RECORDS CAN BE: Printed; Backed-up; Restored; Viewed in a Database; Manually saved to Hard Drive or Floppy Disk

----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: Macintosh System 7 or higher
PROGRAM SPECS: Program Size: 2.2 MB; Disk Space: 4.4 MB
HARDWARE: Soundcard; Microphone; MIDI Keyboard (Optional)
SOFTWARE: QuickTime recommended if MIDI is not available

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: $123 (IBM version 1.2: $99); Lab-pack: X for $447; Site License: $999
DEMO: Downloadable Demo
WEBPAGE: http://www.mibac.com
E-MAIL/PHONE: info@mibac.com; 1-507-654-5851
COMPANY INFO: MiBAC Music Software, P.O. Box 486, Northfield, MN 55057
----Musicianship Basics REVIEW----
VERSION: Windows 1.0.3 [Full copy reviewed on Windows 95]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: March 22, 1999
PLATFORM - O/S: Windows 3.1/95/98; Macintosh
INTENDED USES: Individual Practice; Educational Institutions; User-directed Practice; K-6

----AVAILABLE EXERCISES----
INTERVALS: Ascending; Descending; Harmonic; Compound
CHORD IDENTIFICATION: Triads; 7th Chords (all chords are arpeggiated)
MELODIES: Libraries of Melodies; Melodies Include Rhythm; Multiple-choice response
SCALES: Major; Minor; Whole Tone; Pentatonic
RHYTHMS: Hear/Notate; Hear/Tap; Multiple-choice response
ADDITIONAL EXERCISES OR FEATURES: Many useful theory and keyboard drills. Interval practice does not include minor intervals. Rhythm tapping available in the Macintosh version.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Levels; Practice. Very simple interface for young users.

RESPONSE OPTIONS: Screen Keyboard; Mouse-click I.D.; Multiple-choice; Auto-checking of Answers; Auto-skip to next Question
USER FEEDBACK: # of Correct Responses; Hints
RECORDS KEPT FOR: Current Session Only
RECORDS CAN BE: Printed

----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: Windows 3.1 or Macintosh System 6.0.3
PROGRAM SPECS: Program Size: 5.4 MB
HARDWARE: Soundcard; Microphone; uses internal speaker or headphones

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: $44; Lab-pack: 50 for $97; Site License Available
DEMO: Downloadable Demo
WEBPAGE: www.dragnet.com.au/~donovan/mb/music.html
E-MAIL/PHONE: greglewis@nexus.edu.au; 1-800-023-069
COMPANY INFO: New Horizons, P.O. Box 658, Armidale, NSW 2350, Australia

----Music Lab Harmony REVIEW----
VERSION: 3.1 (1999) [Student copy reviewed on Windows 95]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: April 12, 1999
PLATFORM - O/S: Windows 3.1/95/98/NT
INTENDED USES: Individual Practice; Educational Institutions; User-directed Practice; Tracking of User Progress; Games; 7-12; College

----AVAILABLE EXERCISES----
INTERVALS: Ascending; Descending; Harmonic; Compound
CHORD IDENTIFICATION: Triads with Inversions; 7th Chords with Inversions; MIDI Entry of Chords
HARMONIC PROGRESSIONS: Inversions; +6 Chords; Secondary Dominants; MIDI Entry of Chords
SINGING (AUDIO IN): Pitch Matching; Intervals
ADDITIONAL EXERCISES OR FEATURES: Features written theory exercises and 20 graded levels for each type of exercise.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Levels; Practice; Test Modes. Student can adjust the size of on-screen notation.
INSTRUCTOR-DEFINED SETTINGS: Instructor can set up classes, access class records, set up MIDI patches for a class, and back up student records.

RESPONSE OPTIONS: Screen Notation; Mouse-click I.D.; MIDI Input; Singing; Auto-checking of Answers; Auto-skip to next Question
USER FEEDBACK: Statistics of Responses; # of Correct Responses; Hints
RECORDS KEPT FOR: # of Correct Answers; Total Times; Individual Times; Levels Passed; Class averages for quizzes and practice time
RECORDS CAN BE: Auto-saved to Hard Drive; Printed; E-mailed; Backed-up; Restored; Viewed in a Database; Saved to lab computers via LAN

----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: IBM 486 or better
PROGRAM SPECS: Program Size: 1.2 MB; Disk Space: 2 MB
HARDWARE: Soundcard; Microphone; MIDI Keyboard (Optional)

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: $49; Lab-pack: X for $199; Site License Available
WEBPAGE: www.musicwareinc.com
E-MAIL/PHONE: sales@musicwareinc.com; 1-800-99PIANO
COMPANY INFO: Musicware, 8654 154th Avenue NE, Redmond, WA 98052

----Music Lab Melody REVIEW----
VERSION: 3.0 (1999) [Student copy reviewed on Windows 95]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: March 14, 1999
PLATFORM - O/S: Windows 3.1/95/98/NT; Macintosh
INTENDED USES: Individual Practice; Educational Institutions; User-directed Practice; Tracking of User Progress; 7-12; College

----AVAILABLE EXERCISES----
INTERVALS: Ascending; Descending; Harmonic; Compound
MELODIES: Libraries of Melodies; Melodies Include Rhythm; Response: Rhythm/Melody
RHYTHMS: Hear/Notate; Hear/Tap; See/Play
SINGING (AUDIO IN): Pitch Matching; Intervals; Melodies
ADDITIONAL EXERCISES OR FEATURES: Interval exercises are implemented by playing a tonic chord followed by the interval; the answer is given by clicking on the solfege syllables for the interval.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Levels; Practice; Test Modes. Student can adjust the size of on-screen notation.
INSTRUCTOR-DEFINED SETTINGS: Instructor can set up classes, access class records, set up MIDI patches for a class, and back up student records.

RESPONSE OPTIONS: Screen Notation; Mouse-click I.D.; Singing; Solfege "Keyboard"; Auto-checking of Answers; Auto-skip to next Question
USER FEEDBACK: Statistics of Responses; # of Correct Responses; Hints
RECORDS KEPT FOR: # of Correct Answers; Total Times; Individual Times; Levels Passed; Class averages for quizzes and practice time
RECORDS CAN BE: Auto-saved to Hard Drive; Printed; Backed-up; Restored; Viewed in a Database; Saved to lab computers via LAN
----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: Windows 3.1 (IBM 386 or higher)
PROGRAM SPECS: Program Size: 775 K; Disk Space: 1.4 MB
HARDWARE: Soundcard; Microphone; MIDI Keyboard (Optional). MIDI keyboard is not used for responding to questions.

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: $49; Lab-pack: X for $199; Site License Available
DEMO: Downloadable Demo
WEBPAGE: www.musicwareinc.com
E-MAIL/PHONE: sales@musicwareinc.com; 1-800-99PIANO
COMPANY INFO: Musicware, 8654 154th Avenue NE, Redmond, WA 98052

----PET (Personal Ear Trainer) REVIEW----
VERSION: 1.04 (1998) [Full copy reviewed on Windows 95]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: April 12, 1999
PLATFORM - O/S: Windows 95/98/NT
INTENDED USES: Individual Practice; Educational Institutions; User-directed Practice; Games; Tutorials; 7-12; College

----AVAILABLE EXERCISES----
INTERVALS: Ascending; Descending; Harmonic; Compound
CHORD IDENTIFICATION: Triads with Inversions; 7th Chords with Inversions; 9th, 11th, and Suspensions; Open/Closed Spacing
HARMONIC PROGRESSIONS: Inversions
MELODIES: Computer-generated; respond via screen piano
SCALES: Major; Minor; Modal; Jazz Scales
RHYTHMS: Hear/Notate; Rhythmic Elements Entry
ADDITIONAL EXERCISES OR FEATURES: Features a "Hands-Free" mode that plays a question, pauses, shows the answer, then proceeds to the next question. Brief music tutorials termed "Show me..." are provided.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Exercise Setup; Levels; Practice; Test Modes. User can create and save custom setting profiles.

RESPONSE OPTIONS: Screen Notation; Screen Keyboard; Mouse-click I.D.; "Hands-Free" Mode; Auto-skip to next Question
USER FEEDBACK: Statistics of Responses; Hints
RECORDS KEPT FOR: Current Session Only; User-defined custom setup of practice sessions
RECORDS CAN BE: Printed; Used to automatically launch customized user settings

----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: Windows 95 (IBM 486 or better)
PROGRAM SPECS: Program Size: 975 K; Disk Space: 1.7 MB
HARDWARE: Soundcard; Microphone; MIDI Keyboard (Optional)

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: $50; sliding price scale; Site License Available
DEMO: Downloadable Demo
WEBPAGE: http://www.janasoftware.co.uk
E-MAIL/PHONE: info@janasoftware.co.uk
COMPANY INFO: Jana Software, 31 Hall Cliffe Crescent, Horbury, Wakefield, WF4 6DG, United Kingdom

----Pitch ID REVIEW----
VERSION: 1998 [Full copy reviewed on Windows 95]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: May 2, 1999
PLATFORM - O/S: Windows 95/98/NT
INTENDED USES: Individual Practice; Educational Institutions; User-directed Practice; Games; Music Tutorials; 7-12; College

----AVAILABLE EXERCISES----
INTERVALS: Ascending; Descending; Harmonic; Compound
ADDITIONAL EXERCISES OR FEATURES: User hears a pitch and then responds using the on-screen keyboard. If correct, the pitch is repeated and a new one played. Uses pitches from major or minor scales.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Exercise Setup; Levels; Practice. User selects the key and the scale degrees to practice.

RESPONSE OPTIONS: Screen Keyboard; Auto-checking of Answers; Auto-skip to next Question

----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: Windows 95 (IBM 486 or better)
PROGRAM SPECS: Program Size: 294 K; Disk Space: 1.2 MB
HARDWARE: Soundcard; Microphone

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: $14.95
DEMO: Downloadable Demo
WEBPAGE: http://www.musicstudy.com
E-MAIL/PHONE: htrythal@yahoo.com
COMPANY INFO: Dr. Gil Trythall, KBA Software, 41 West Main St., Morgantown, WV 26505

----Practica Musica 3.92 REVIEW----
VERSION: 3.92 (1999) [Full copy reviewed using Macintosh OS 8]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: April 26, 1999
PLATFORM - O/S: Macintosh
INTENDED USES: Individual Practice; Educational Institutions; User-directed Practice; Tracking of User Progress; Games; Music Tutorials; 7-12; College

----AVAILABLE EXERCISES----
INTERVALS: Ascending; Descending; Harmonic; Compound
CHORD IDENTIFICATION: Triads with Inversions; 7th Chords with Inversions
HARMONIC PROGRESSIONS: Inversions; +6 Chords; Custom Progressions; Single-click Response; Secondary Dominants
MELODIES: Computer-generated; Custom Melodies; Libraries of Melodies; Melodies Include Rhythm
SCALES: Major; Minor; Modal; Pentatonic
RHYTHMS: Hear/Tap; See/Play
ADDITIONAL EXERCISES OR FEATURES: Many sight-reading exercises and other theory exercises. Comes with an extensive printed music theory manual which suggests learning approaches.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Exercise Setup; Levels; Practice
INSTRUCTOR-DEFINED SETTINGS: Custom Tests; Settings. Instructor can enter custom exercises.

RESPONSE OPTIONS: Screen Notation; Screen Keyboard; Mouse-click I.D.; MIDI Input; Auto-checking of Answers
USER FEEDBACK: Statistics of Responses; Hints
RECORDS KEPT FOR: Total Times; Levels Passed; First use, Last use, and total minutes logged
RECORDS CAN BE: Auto-saved to Hard Drive or Student Disk; Printed; Backed-up; Restored; Viewed as the start-up screen when the program is launched

----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: Mac Plus or better, System 6.0.7 or higher
PROGRAM SPECS: Program Size: 1.2 MB; Disk Space: 4.5 MB
HARDWARE: Soundcard; Microphone; MIDI Keyboard (Optional)
SOFTWARE: Songworks (notation program) to create custom exercises

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: $99; Student Disk: $15; Lab-pack: 4 for $140; Site License Available
WEBPAGE: http://www.ars-nova.com
E-MAIL/PHONE: info@ars-nova.com; 1-800-445-4866
COMPANY INFO: Ars Nova Software, Box 637, Kirkland, WA 98083-0637 (Developer: Jeffrey Evans)

----teoria REVIEW----
VERSION: 1.3.4 (1997) [Full version reviewed on Windows 95]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: April 11, 1999
PLATFORM - O/S: Windows 95/98/NT
INTENDED USES: Individual Practice; Educational Institutions; User-directed Practice; Tracking of User Progress; Games; Tutorials; 7-12; College

----AVAILABLE EXERCISES----
INTERVALS: Ascending; Descending; Harmonic; Compound
CHORD IDENTIFICATION: Triads with Inversions; 7th Chords with Inversions; Augmented Sixth Chords
MELODIES: Computer-generated; Melodies Include Rhythm (rhythm not evaluated)
SCALES: Major; Minor; Modal; Gregorian modes
ADDITIONAL EXERCISES OR FEATURES: Extensive tutorials included with the program cover intervals, scales, and chords. Also features many exercises which focus on written theory.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Exercise Setup; Practice; Test Modes. User can load custom user-defined presets.
INSTRUCTOR-DEFINED SETTINGS: Custom Tests; Settings; Scoring Parameters. Extensive record-tracking abilities, although there is no separate instructor access with password protection.

RESPONSE OPTIONS: Screen Keyboard; Mouse-click I.D.; Auto-checking of Answers; Auto-skip to next Question
USER FEEDBACK: Statistics of Responses
RECORDS KEPT FOR: Total Times; Individual Times; Date, Time, Minutes, # of Questions Answered, and Score
RECORDS CAN BE: Auto-saved to Hard Drive; Printed; E-mailed; Backed-up; Restored; Viewed in a Database; Viewed as a Graph. Records can be deleted by any user.

----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: Windows 95 (IBM 486 or better)
PROGRAM SPECS: Program Size: 841 K; Disk Space: 1.6 MB
HARDWARE: Soundcard; Microphone; MIDI Keyboard (Optional)

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: $32
DEMO: Downloadable Demo
WEBPAGE: http://www.teoria.com
E-MAIL/PHONE: teoria@teoria.com
COMPANY INFO: José Rodríguez Alvira, Cond. Monte Sur, 190 Ave. Hostos, Apt. B-342, San Juan, Puerto Rico 00918-4236

----The Music Box 2.6 - A Personal Ear Trainer REVIEW----
VERSION: 2.6 (1999) [Full copy reviewed on Windows 95]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: February 7, 1999
PLATFORM - O/S: Windows 95/98/NT
INTENDED USES: Individual Practice; User-directed Practice; Games; Tutorials; 7-12; College

----AVAILABLE EXERCISES----
INTERVALS: Ascending; Descending; Harmonic; Compound
CHORD IDENTIFICATION: Triads; 7th Chords
MELODIES: Computer-generated; Melodies Include Rhythm; quick response method
SCALES: Major; Minor; Modal; Whole Tone; Pentatonic
RHYTHMS: Hear/Write on Paper
ADDITIONAL EXERCISES OR FEATURES: Simulates classroom testing by giving melodic, rhythmic, interval, and chord dictation exercises which are written down on paper and then compared with the screen notation.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Exercise Setup; Practice; Test Modes. Other: mouse-click answering of melodic dictations.

RESPONSE OPTIONS: Screen Notation; Mouse-click I.D.; Auto-checking of Answers
USER FEEDBACK: # of Correct and Incorrect Responses; Hints

----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: Windows 95 (IBM 486 or better)
PROGRAM SPECS: Program Size: 632 K; Disk Space: 800 K
HARDWARE: Soundcard; Microphone

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: $26 (Shareware)
DEMO: Downloadable Demo
WEBPAGE: http://tscnet.com/pages/carner
E-MAIL/PHONE: carner@tscnet.com
COMPANY INFO: Carner Enterprises, 13298 Rocky Ridge Road, Silverdale, WA 98383

----ThoughtSauce Eartraining REVIEW----
VERSION: 1.0 (1999) [Full copy reviewed on Windows 95]
REVIEWER: Douglas Spangler, http://www.msu.edu/user/spangle9
REVIEW DATE: May 10, 1999
PLATFORM - O/S: Windows 3.1/95
INTENDED USES: Individual Practice; Educational Institutions; User-directed Practice; Tracking of User Progress; Games; Music Tutorials; 7-12; College

----AVAILABLE EXERCISES----
INTERVALS: Ascending; Descending; Harmonic; Compound
CHORD IDENTIFICATION: Triads with Inversions; 7th Chords with Inversions; Open/Closed Spacing
HARMONIC PROGRESSIONS: Inversions; Single-click Response
MELODIES: Libraries of Melodies; Melodies Include Rhythm; Melody Comparison
SCALES: Major; Minor; Whole Tone; Chromatic
RHYTHMS: Hear/Notate; Hear/Compare
ADDITIONAL EXERCISES OR FEATURES: More than 800 different lessons or topics. There are singing exercises, but there is no microphone input.

----INSTRUCTIONAL ISSUES----
USER-DEFINED SETTINGS: Exercise Setup; Levels; Practice; Test Modes. User must "sign in" if recordkeeping is desired.

RESPONSE OPTIONS: MIDI Input; Number Keys / Letter Keys; Auto-checking of Answers; Auto-skip to next Question
USER FEEDBACK: Statistics of Responses; # of Correct and Incorrect Responses; Hints
RECORDS KEPT FOR: Levels Passed; User must "sign in" if recordkeeping is desired
RECORDS CAN BE: Auto-saved to Hard Drive; Printed; Viewed in a Database; Viewed for all exercises or by individual exercise

----SYSTEM REQUIREMENTS and SETUP INFORMATION----
SYSTEM MINIMUM: Windows 3.1 (IBM 386 or better)
PROGRAM SPECS: Program Size: 420 K; Disk Space: 3 MB
HARDWARE: Soundcard; Microphone; MIDI Keyboard (Optional)

----PRICING and PRODUCT INFORMATION----
APPROXIMATE COST (in US $): Single-User Copy: $79 (release set for late 1999)
WEBPAGE: http://www.thoughtsauce.com
E-MAIL/PHONE: open-ear@thoughtsauce.com
COMPANY INFO: ThoughtSauce.com (begun in 1998 on the World Wide Web)