This is to certify that the thesis entitled

DEVELOPING A USABILITY TESTING SYSTEM FOR WEB BASED RESEARCH

presented by Kurt A. Besecker has been accepted towards fulfillment of the requirements for the Master of Arts degree in Telecommunication.

MSU is an Affirmative Action/Equal Opportunity Institution.

DEVELOPING A USABILITY TESTING SYSTEM FOR WEB BASED RESEARCH

By

Kurt A. Besecker

A THESIS

Submitted to Michigan State University in partial fulfillment of the requirements for the degree of

MASTER OF ARTS

Department of Telecommunication

1999

ABSTRACT

DEVELOPING A USABILITY TESTING SYSTEM FOR WEB BASED RESEARCH

By Kurt A. Besecker

The focus of this study was to design a usability testing system for web based research while evaluating the effects of navigation on user efficiency and satisfaction. The outcome was to determine the usability of the College of Communication Arts & Sciences web site at Michigan State University. Two versions of the College site were evaluated based on information finding tasks. Searching for information on the old and new College sites proved to be a frustrating experience for users. Numerous navigational errors were identified which affected the overall usability of the College sites.

ACKNOWLEDGMENTS

I would like to express my appreciation to all those who provided encouragement and direction throughout this project. Sincere thanks to Dr. Frank Biocca and the M.I.N.D. Lab, without whose support this would not have been possible. Special thanks to Dr. Carrie Heeter for planting the seed and allowing me to grow, and to the Comm Tech Lab for allowing me to be a part of such an incredible family, always pushing to create new and exciting experiences. Lastly I would like to thank my parents for their constant patience, love, and understanding. Thank you for instilling the importance of education and allowing me to follow my dreams.

TABLE OF CONTENTS

ACKNOWLEDGEMENTS .......... iii
TABLE OF CONTENTS .......... iv
LIST OF TABLES .......... vi
INTRODUCTION .......... 1
CHAPTER 1 .......... 2
    What is usability? .......... 2
    Learnability .......... 2
    Efficiency .......... 3
    Memorability .......... 3
    Errors .......... 4
    Satisfaction .......... 4
CHAPTER 2 .......... 5
    How do we evaluate usability in web design? .......... 5
CHAPTER 3 .......... 9
    What is needed to effectively carry out a usability study? .......... 9
CHAPTER 4 .......... 13
    Why is navigation important? .......... 13
    Learnability .......... 14
    Efficiency .......... 15
    Memorability .......... 15
    Errors .......... 16
    Satisfaction .......... 17
CHAPTER 5 .......... 19
    Methods .......... 19
    Overview .......... 19
    Design .......... 19
    Participants .......... 20
    Stimulus Materials .......... 20
    Measures .......... 21
    Procedure .......... 22
CHAPTER 6 .......... 24
    Results and Discussion .......... 24
    Demographics .......... 24
    Media Use .......... 24
    Efficiency .......... 25
    Satisfaction .......... 26
    Site Usability .......... 27
    User Observations .......... 28
        Task 1 .......... 28
        Task 2 .......... 30
        Task 3 .......... 31
        Task 4 .......... 33
        Common Problems While Performing All Tasks .......... 37
CHAPTER 7 .......... 38
    Conclusion .......... 38
APPENDIX A: Advanced Usability Lab .......... 40
APPENDIX B: Actual Usability Lab Developed .......... 41
APPENDIX C: Informed Consent Form .......... 42
APPENDIX D: Pre-test Questionnaire .......... 43
APPENDIX E: Tasks for Test Session .......... 44
APPENDIX F: Post-task Questionnaire .......... 45
APPENDIX G: Post-test Questionnaire .......... 47
APPENDIX H: Site Comparisons of Usability Ratings .......... 48
BIBLIOGRAPHY .......... 50

LIST OF TABLES

Table 1: Common Attributes of Navigation and Usability
Table 2: Participant Demographics Based on Site Version

INTRODUCTION

Web design is continually in a state of evolution. Technological developments, innovative design tools, and a better understanding of the medium itself offer promise to new media developers every day. But even with all these enhancements, how do we know we have created a usable site? Many books and articles have been written discussing in great detail concepts such as site navigation, page design, and web graphic creation (Mok 1996; Mullet, Sano 1995; Kristof, Satran 1995; Sano 1998; Siegel 1997, 1998; Lopuck 1996). Unfortunately there is not much in terms of real data supporting many of these heuristic guidelines of what makes a good web site, and little publicly available research has focused directly on the usability of World Wide Web sites (Nielsen, 1999).

The purpose of this study was to design and construct a usability testing system, design and conduct usability tests, and to examine the effects of navigation on user efficiency and satisfaction while determining the usability of the old and new College of Communication Arts & Sciences web sites at Michigan State University.

Chapter 1

WHAT IS USABILITY?

Within the development of software and web interface design, the term "usability" has often been equated with the terms "ease of use" and "user friendly," implying that if a user doesn't get it on the first pass, then it is not usable. Designing and testing for usability has more to do with the appropriateness of the design solution for a target population than it does with ease of use. It is important not to think of usability as a single property of a user interface. Traditionally, usability has been associated with the following five attributes: learnability, efficiency, memorability, errors, and satisfaction (Nielsen, 1993). To best understand what usability is, it is important to clearly understand these five attributes.

Learnability

In software design, learnability is the ease of learning and operating the system to carry out a particular task. The learning curve of such software has a steep incline in the beginning, but allows for a decent level of proficiency within a reasonable period of time (Nielsen, 1993).
This attribute also applies to web page design. In web design, learnability is the ease of learning the navigation structure of a web site. "Like most aspects of usability, navigation is invisible when it is working. But when there's a problem, users can get completely stuck. In fact, navigation problems frequently cause users to give up" (Spool et al., 1999, p. 15). With so many web sites freely available, users who have trouble learning and navigating a particular site most likely will give up and look to other sources. The main difference between a web site and installable software is that users have less tolerance for learning. A user who has spent $500 on a software tool will usually take the time to learn it, even if it is known to have a steep learning curve. The same is not true of the web (Flemming, 1998). As a result, it is of great importance that the navigational systems of web sites be easily learned.

Efficiency

Once a user has learned the navigational system of a web site, a high level of productivity should be possible. Usability evaluates this level of productivity through the concept of efficiency. Efficiency refers to the user's level of performance at the time when the learning curve flattens out. This means that once users have learned the system, task completion should have an economy of action and time (Flemming, 1998).

Memorability

If a web site has a high level of learnability and efficiency, we can then begin to look at the aspect of memorability. Memorability is the ability of users to remember how the site navigation structure works while performing a particular task. A navigational system should be easy to remember. Casual users should not have to relearn the system after some period of not having used it (Nielsen, 1993). A perfect example of this in relation to a university department web site is the occasional use by students. Students who leave for the summer or between semester breaks should be able to return and continue to use the site efficiently for information retrieval.

Errors

Web designs that minimize the number of errors or user mistakes are also important to usability. An error in usability studies is any action that does not accomplish the desired goal of the user while performing some specific task (Nielsen, 1993). For example, as a user sets out to retrieve a specific piece of information, we want to limit his or her experience of selecting links that lead them astray. We want to help users make as few errors as possible in an attempt to reduce productivity losses. Error rates are measured within usability by counting the number of instances in which users' actions do not follow their expectations while performing some specific task (Nielsen, 1993).

Satisfaction

The final attribute when evaluating usability is that of satisfaction. Satisfaction refers to how pleasant the user finds the experience of working with the system (Nielsen, 1993). Through questionnaires we can gather subjective opinions from users on such items as whether or not the system is enjoyable to use and visually appealing, and in effect assess whether or not the users like the system.

Chapter 2

HOW DO WE EVALUATE USABILITY IN WEB DESIGN?

The most fundamental method for evaluating the usability of a computer system, software application, or web page design is through user testing (Nielsen, 1993). In web development, user testing is a procedure in which we observe target users interacting with a web site while attempting to accomplish a set of goal oriented tasks.
These tasks are chosen to be representative of how the system will eventually be used in the field. When conducting a user test, a four-phase cycle is typically followed. These phases are test preparation, introduction, the actual test session, and debriefing.

During the preparation phase the entire framework is laid to help facilitate a successful testing session. It is during this phase that the experimenter prepares a test room for the experiment. This includes ensuring the proper computer hardware and software are ready for use and that all the necessary test materials, including instructions, task scenario booklets, and questionnaires, are available. Many user tests take place in specially designed facilities. Although much of this equipment is not essential, it does aid in the ability to record and evaluate user behavior. This type of setup will be discussed later.

Once all the necessary preparations have been established and the test session is ready, the introduction phase begins. This is signaled by the arrival of the test subject. First the experimenter introduces him or herself and welcomes the test subject, followed by a brief explanation of the purpose of the test session. During this period users are asked to sign a participation consent form. This document provides the experimenter with permission to use the subject's voice, verbal statements, and possibly user images for evaluation purposes. It is also at this time that subjects are made aware that participation is voluntary and that any questions they have should be asked at that time. Upon signing of the consent form, the experimenter continues with the test instructions. This includes explanation of any audio or video recording which may take place, explanation that the user is welcome to ask questions throughout the test session as they arise, and any other special instructions which may need to be given. Lastly, the experimenter asks the test user whether he or she has any questions regarding the testing procedure before beginning.

Once all questions have been answered and the test subject is ready, the actual testing phase begins. During this phase the experimenter should refrain from interacting with the user as much as possible. The experimenter does not want to express any opinions or thoughts that may trigger unnatural user actions. During the testing phase, users are supplied with a task scenario booklet. This booklet contains all the tasks the experimenter wants to observe the user attempt to accomplish. While the experimenter does want to limit his or her interaction with users while they try to accomplish their tasks, it may be necessary to ask simple questions to obtain more in-depth information on the user's actions. These types of comments help keep users motivated and are intended to clarify what the experimenter is observing. For example, if a user seems puzzled, it is common to ask the user "what are you thinking right now?" or "what did you expect to happen when you selected that item?" The main idea is that the experimenter wants to simply observe the user's behavior while gaining as much insight as possible into his or her thought process while working on a particular task. It is also at this point that the experimenter takes notes and records user actions for post-test evaluation. Items noted may include user errors, task start and end times, and user verbal comments.
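Where no logging software is available, even a small script can serve as the experimenter's note pad. The sketch below is illustrative only; the class, its methods, and the sample entries are hypothetical and not part of the system built for this study. It simply timestamps each observation so that task durations can be recovered afterwards.

```python
import time

class SessionLog:
    """Minimal timestamped observation log for one test session."""

    def __init__(self):
        self.start = time.time()
        self.events = []  # list of (seconds since session start, note)

    def note(self, text):
        """Record an observation: an error, a user quote, a task boundary."""
        self.events.append((time.time() - self.start, text))

# Example entries (all illustrative, not from this study's sessions):
log = SessionLog()
log.note("task 1 start")
log.note("error: selected a link that led away from the goal")
log.note("comment: 'I expected advisors to be listed under People'")
log.note("task 1 end")

# Task time is the difference between matching start/end entries.
stamps = {text: t for t, text in log.events}
task1_seconds = stamps["task 1 end"] - stamps["task 1 start"]
print(f"Task 1 took {task1_seconds:.1f} seconds")
```

Keeping raw timestamps rather than precomputed durations preserves the option of re-deriving task times, or measuring time between any two noted events, after the session is over.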
Once the user has completed all the necessary tasks to be assessed during the testing session, he or she is asked to fill out any subjective satisfaction questionnaires. This is done before the final debriefing phase to avoid any biasing comments from the experimenter being passed to the test subject. Finally, during the debriefing session, experimenters explain the purpose of the test in more detail. It is at this time that subjects are asked for further comments about the events of the tests that were hard for the experimenter to understand. The goal is to get clarification on the items recorded during the observation of the subject that are still unclear. Lastly, once the subject has left, the experimenter verifies that all the necessary tests and surveys have been labeled, numbered, and put together along with the experimenter's notes to generate a complete user packet.

Once all subjects have been run through the user tests, these packets can be reviewed and evaluated to generate a report of the test findings, which includes the overall user performance results. Through this type of evaluation we can best determine if we are meeting the perceived needs of our intended user audience and, as designers, are not solely relying on our best guess while making design decisions. Often this feedback from actual users will prevent a product or web site design from heading in the wrong direction.

Chapter 3

WHAT IS NEEDED TO EFFECTIVELY CARRY OUT A USABILITY STUDY?

Carrying out a usability study can be as simple or as complex as the experimenter wants it to be. Part of the purpose of this study was to learn how to set up a usability testing system, and in doing so create a system to run a set of usability tests on the College of Communication Arts & Sciences new web site at Michigan State University. From that process, we now feel that in its simplest form, a usability study can be run within a small room, with a user workstation, source materials, a stopwatch, and a notepad. This simple setup is enough to observe target users and get an inside look at how users work with a web site to perform a series of tasks.

The main difficulty with this simplistic setup is that there is often too much going on for a single researcher to properly record every detail on paper. During a test session, the researcher is responsible for recording all data, including user actions, verbal comments, errors, and task start and end times, all while facilitating the test session. The downside to this is that once a session is over, any observation that might have been missed is lost. There is no visual or audio data recorded for post-test analysis or to show developers as an example of a particular error or problem. Observation of user tests often generates large data sets to be analyzed. With a lab as simple as the one mentioned previously, opportunities for missing potentially important data arise. As a result, a much more complex system is often developed to aid in the capturing and analysis of user testing data.

For the complex setup, usability labs are normally constructed in a two-room configuration (see Appendix A for a diagram). In the subject's room there is a user workstation including the computer, source materials, video camera(s), and audio input devices. In a second soundproof room, often separated from the subject's room by a one-way mirror, sits the experimenter and his or her recording devices.
These devices include a scan converter, video recorder(s), video mixer, audio mixer, video monitors, speakers, an event logger's workstation, analysis software, and a time-code generator.

Assembling the complex usability lab is quite simple. First, the scan converter is connected to the user workstation. From the scan converter, the output video is connected to a video input on the video switcher. When considering which scan converter to purchase, it is important to know the resolution of the computer screen you intend to capture on video. Numerous scan converters are available which can be switched from 640 x 480 up to 1024 x 768 screen resolutions. After connecting the scan converter, the video and audio outputs of the video camera(s) and microphones are connected to the video and audio switcher inputs. If a camcorder is to be used as your audio/video input device, one should make sure it can 1) record video from the input jacks and 2) display the incoming video on its viewfinder/screen as it sends signal out to the video mixer. Next, the audio and video signals should be routed out of the video and audio switchers, through a time-code generator, and connected to the video recording device. The final step is to connect the recording device to the event logger's workstation. It is important that this recording device is capable of being controlled via the logging workstation and that it can properly record SMPTE time-code signal. These two features are important if you want to use observational software to analyze the recorded test sessions and at the same time control the video recorder via a computer.

This complex setup allows all audio and video source materials generated in the subject's room to be viewed, mixed, and recorded onto a single video tape in the experimenter's room. Once the lab setup is complete, the videotaped data can be analyzed in real time or in post-test sessions using the event logger's workstation. This workstation gives the experimenter the ability to control video playback via a computer in an integrated system for the collection, analysis, presentation, and management of observational data, using software such as that developed by Noldus (http://www.noldus.com) or Triangle Research Collaborative (http://www.irctech.com). Both these software tools add the benefit of statistical analysis based on user event logs. Using these systems, the recorded user session can be saved for future review by the experimenter and/or developers who need to witness a usability problem to better understand and assess how to fix it. In addition to offering observational research software, both Noldus and Triangle Research Collaborative also offer complete usability lab systems including all the necessary hardware and software needed to effectively develop a usability lab facility.

When developing the usability lab for this study, we attempted to develop an observational environment that was portable and cost effective, and that allowed for more sophisticated data collection than the simplistic usability lab permits. The final usability testing system (see Appendix B for a diagram) used for this study included a user workstation, video monitor, VCR, scan converter, video camera with built-in microphone, and a video mixer. This system allowed for all the advanced recording features of the complex setup, minus the substantial costs and the automated software analysis features.

Chapter 4

WHY IS NAVIGATION IMPORTANT?
When we design for the web, it is important for users to be able to orient themselves to the space they are in. Navigation is the user's primary form of interaction with a web site's content. It is the user action of clicking and moving throughout the structure of a web site. The purpose of navigation design is to clearly identify to users where they are, where they can go, how they got there, and how they can get back to where they once were (Flemming, 1998). If it is unclear where they can or should go to find the information they seek, tasks will likely become difficult to complete, resulting in a less usable system.

Like usability, navigation should support users' goals and behaviors. It is not as simple as including sidebars and menus in the design. When we begin to lay out the navigational structure, we are in effect laying the roads on which users will travel to accomplish set tasks and goals. Although there is no secret formula for successful navigation design in web sites, there are some guiding principles which designers should understand and consider when developing usable systems. The correlation between successful navigation and usable systems becomes more apparent as you compare the concepts in the following table:

Table 1: Common Attributes of Navigation and Usability

Successful navigation should (Flemming, 1998):
- Be easily learned (Learnability)
- Require an economy of action and time (Efficiency)
- Remain consistent (Memorability)
- Offer alternatives (Errors)
- Use clear and understandable labels (Learnability)
- Be appropriate to the site's purpose (Satisfaction)
- Support users' goals and behaviors (Satisfaction)

A usable system (Nielsen, 1993):
- Is easy to learn (Learnability)
- Is efficient to use (Efficiency)
- Is easy to remember (Memorability)
- Provides quick recovery from errors (Errors)
- Is enjoyable to use (Satisfaction)

As you can see from the above table, many of the principles of successful navigation have corresponding usability principles. It is through understanding and implementing these ideas that designers will be able to make better design choices. The principles of successful navigation will now be discussed.

Learnability

The concept of learnability in navigational systems is very similar to that in usability. The key component here is to avoid burdening your users with a high learning curve. When designing a site meant to give visitors information, you do not want to force users to spend hours trying to find the content they are looking for. You want the navigation to be transparent to the user so they can focus on the content they are looking for and not the method it takes to find it.

Efficiency

As users begin to learn a navigation system, efficiency becomes a main component, very similar to the usability principle. You want to limit the number of levels a user must go through to find a particular piece of information. A site structure that features many sub pages through which users must click to find information can easily induce "Are We There Yet?" syndrome (Flemming, 1998). This type of syndrome can quickly frustrate users and prolong the time it takes to carry out a particular information search task. This leads to the following relationship:

H1: Web sites with consistent navigation will be more efficient for information retrieval tasks.

Memorability

By designing navigation systems that are consistent, we can in effect increase the usability concept of memorability. Consistency of navigation systems includes placement, appearance, and function elements.
Many navigation techniques have become common practice in web design. This includes the use of top and/or side navigation bars, navigation elements such as buttons and text links having consistent locations throughout a site design, and the appearance of links adhering to some sort of visual scheme. Normally these appear as distinct buttons, icons, or text that stand out from the rest of the web page contents. Aside from the visual aesthetics of the buttons themselves, items usually follow some sort of grouping and labeling principle as well. As users learn what, where, and how to use the navigation system, it will quickly become transparent. Transparent navigation doesn't literally mean invisible. When a system reaches a level of transparency, it means users do not have to focus on it and it doesn't interfere with the tasks and objectives of the user. This consistency will bring order to potential confusion and allow users to focus on the information content instead of where and how to go about finding the information.

Errors

A navigation system that is learnable and consistent isn't necessarily error-proof (see Chapter 1 for a definition of errors). A system with quirky features may be consistent, but with poor labeling or no alternative choices, users may still make selection errors. All users are different, whether in the hardware they use, their learning style, or their personality traits. It is considerations like these that make it necessary to develop alternative navigation solutions. For example, some users may like to select the items they feel will directly lead them to the information they seek. Others may prefer to use a search engine or go through an overall site map. It is these types of alternative navigation methods that will provide users with multiple ways of obtaining the same information with as few errors as possible.

Satisfaction

As with usability, when creating navigation systems we must consider user satisfaction. The navigation design should be appropriately aligned with the objectives of the user. "A good match between navigation and users' goals will mean that the site's navigation reinforces the site's purpose and is integrated with the overall experience" (Flemming, 1998). Navigation that does not integrate properly with the purpose of the site can lead to user confusion, for example. This type of problem could then negatively affect the overall user experience, which would most likely lead to poor user satisfaction. Thus:

H2: Consistency of navigation will provide an enhanced feeling of user satisfaction.

Based on the common attributes of usability and navigation, the attributes of efficiency and satisfaction should affect the usability of a web site. Therefore we hypothesize:

H3: Web sites with consistent navigation will have higher measured efficiency and satisfaction, resulting in a higher usability rating.

There are no clear-cut answers to what makes navigation systems work best in all situations. In understanding why the above elements are important and how they can affect the overall usability of a web site, we can begin to better balance these principles with the needs and objectives of our user audience in an effort to create web sites which satisfy them. If we stray from these principles, we may begin designing navigational systems which do not meet the needs of the user. If these systems are not effective, the usability of our web site will most likely suffer.
By asking users from our target audience to evaluate elements on web pages that reflect these dimensions, we will be able to make educated design decisions. The key here is to avoid using our best guess as a guide to our web design decisions.

Chapter 5

METHODS

Overview

The purpose of this study was to design and construct a usability testing system, conduct a full usability test on a university web site, and evaluate the effects of navigation on user efficiency and satisfaction during information finding tasks.

Design

To evaluate the usability of the old and new College of Communication Arts & Sciences web sites, we followed the procedure developed for a research study evaluating the usability of web sites during information finding tasks (Spool et al., 1999). Participants were randomly assigned either the old or new College web site and then asked to find the answers to four types of questions aimed to study the usability of finding information on web sites. Efficiency measures were obtained by analyzing the time it took each participant to complete the information finding tasks using the web site they were randomly assigned. Upon completion of each of the four tasks, participants completed post-task questionnaires, which measured levels of fatigue and confusion. A post-test questionnaire was then completed by the participants at the end of the usability test session. The purpose of this questionnaire was to supply us with the participants' subjective satisfaction measures, including such items as logic of navigation and overall ease of use.

Participants

Participants were students and staff members at Michigan State University. Students were recruited from a Department of Telecommunication class by offering them an opportunity to earn extra credit towards their course grade. Staff members and graduate students were offered an opportunity to raise money for a wheelchair charitable fund within the College. The Media Interface and Network Design Lab donated $5 for each staff member and graduate student who participated. Participation was anonymous and voluntary. Subjects were randomly assigned one of the two site conditions (old site or new site). Seven men and eleven women completed the study, for a total of 18 subjects.

Stimulus Materials

The source materials for this study included the old and new College of Communication Arts & Sciences web sites at Michigan State University. This included the 5 departmental sites within the College. Sites were viewed on an Apple Power Macintosh with a 17-inch color monitor, using Netscape Navigator 4.0, and connected to the Internet via Ethernet.

Measures

The following measures were recorded in this study.

Media Use. Media use was a 14-item scale that measured the daily use of computers. The questionnaire included items such as the number of hours a day the participant used a computer at work and at home, and the number of web pages a participant had designed. See Appendix D for the questionnaire.

Efficiency. Efficiency was a measure of the time participants took to complete a task. It had 5 indicators. Participants completed 4 tasks. Time to complete each task was measured separately, and a total time index was created by summing the times for the four tasks, forming the fifth indicator. See Appendix E for the questionnaire.

Satisfaction. Satisfaction was a 16-item scale from Spool (1999). These items were summed together to form a scale of satisfaction (alpha=0.92). The indicators included items such as overall ease of use, logic of navigation, and overall productivity with the site. See Appendix G for the questionnaire.
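The alpha reported for the summed scale is presumably Cronbach's alpha, the standard internal-consistency coefficient for multi-item scales; the thesis does not show the calculation. As a hedged sketch, such a coefficient can be computed from the item variances of a response matrix. The ratings below are made up for illustration and are not this study's questionnaire data.

```python
import numpy as np

def cronbach_alpha(responses):
    """Cronbach's alpha for an (n_respondents x n_items) rating matrix."""
    r = np.asarray(responses, dtype=float)
    k = r.shape[1]                              # number of items
    item_variances = r.var(axis=0, ddof=1)      # variance of each item
    total_variance = r.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# Four hypothetical respondents rating three items on a 1-to-7 scale:
ratings = [[6, 7, 6],
           [4, 4, 5],
           [2, 3, 2],
           [5, 6, 6]]
print(round(cronbach_alpha(ratings), 2))  # -> 0.97 for this made-up data
```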
Usability Rating. Usability rating was a measurement derived from 3 numeric scale indicators: the participant's frustration level while working with the site, the participant's perception of how long the task took, and the participant's confidence in his or her answers (see Appendix F). These indicators have been shown to correlate with the successful completion of the tasks, users' preferences, and other factors (Spool, 1999). For comparison of the sites' usability to others, Spool's (1999) procedure was followed, as described below. For each site, the averages of these three indicators were multiplied together. Since each indicator uses a scale of 1 to 7, the highest possible rating a site could achieve was 343 (7 x 7 x 7). Each site's score was divided by 343 and multiplied by 100 to place the site's usability rating on a scale of 0 to 100.
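To make that arithmetic concrete, the sketch below implements the rating in Python. The three input values are hypothetical 1-to-7 indicator averages chosen for illustration; they are not the averages measured in this study.

```python
def usability_rating(avg_frustration, avg_speed, avg_confidence):
    """Spool-style usability rating on a 0-100 scale.

    Each argument is a site's mean on one 1-to-7 post-task indicator,
    so the product of the three means ranges from 1 to 7 ** 3 = 343.
    """
    return (avg_frustration * avg_speed * avg_confidence) / 343 * 100

# Hypothetical indicator averages for illustration only:
print(round(usability_rating(4.6, 4.8, 4.5), 1))  # -> 29.0
```

Note that because the averages are multiplied rather than summed, a site must do reasonably well on all three indicators at once to earn a high rating.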
Procedure

The following procedure was followed for each test subject in this study. Once the participant entered the testing room, they were seated at a conference table and asked to sign a consent form (see Appendix C). This document pointed out to the participant that their participation was voluntary, and that they had the ability to end their participation at any time. The consent form also assured the participant that if at any time during the study he or she decided to withdraw from the session, all documents and data related to that individual would be destroyed. Upon completion of the consent form, participants were given a pre-test questionnaire (see Appendix D). The pre-test questionnaire provided information related to previous media use experience and basic demographics of the test subjects.

Participants were then escorted to the user computer workstation. They were given a task-scenario booklet and then informed about the computer system they were to use. We then asked if they had any questions about how to use the Netscape web browser. If necessary, participants were instructed on how to use the basic features of the Netscape web browser. Users were then instructed on the "think aloud" technique to follow while completing tasks from within the task-scenario booklet. They were told to ask questions or tell us if they were having any difficulties. Once the participant was ready to begin, the video recorder was turned on to record the computer screen and the user's actions while working with the selected web site.

Participants followed the task-scenario booklet and tried to find the answers to each of the four task related questions (see Appendix E). Upon completion of each task question, users filled out a short post-task questionnaire (see Appendix F) that supplied us with data on the subjective experience the participant had while working with the web site. This questionnaire was taken from a study carried out by Jared Spool (1999), based on a method for workload studies developed at NASA. When all four tasks and post-task questionnaires had been completed, we instructed participants to fill out a final post-test questionnaire (see Appendix G). This allowed users to rate the site in different areas including ease of finding information, appearance of site, and overall productivity of site. Once completed, participants were then debriefed on the purpose of the study and thanked for their participation.

Chapter 6

RESULTS AND DISCUSSION

DEMOGRAPHICS

A total of 18 subjects participated in this study, including 7 males and 11 females. Since we were testing general-interest sites, we did not require participants in our study to have any particular skills or level of proficiency in using the web. Table 2 shows the total number of participants who used the old and new College of Communication Arts & Sciences web sites, based on sex and whether the user was a student or university staff member.

Table 2: Participant Demographics Based on Site Version

                 Student           Staff
               Male   Female    Male   Female    Total
New Site        4       3        0       2         9
Old Site        3       4        0       2         9

MEDIA USE

The amount of computer use by participants ranged from 1 hour to more than 8 hours per day. The average time of computer use at work and at home was 1-3 hours per day. The most frequent uses of computers were for word processing, spreadsheet, statistical analysis, and personal communication related tasks. Very few participants used computers for engineering simulation, information reference, and arts (graphic/music).

EFFICIENCY

The first hypothesis tested was the following:

H1: Web sites with consistent navigation will be more efficient for information retrieval tasks.

It was our expectation that the use of consistent navigation in web page design would enhance our participants' efficiency while working with web sites to complete information retrieval tasks. Hypothesis 1 was not supported. The times for all tasks were summed together to form a total time index. A t-test showed a non-significant difference between the old (M=753.78, SD=201.32) and new (M=809.78, SD=218.63) web sites [t(16)=.57, p>.05]. In fact, the mean for the new site was 7% larger than that of the old site, the opposite of our expectations.
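This comparison, like the satisfaction comparison below, is an independent-samples t-test across the two site conditions. A minimal sketch of such a test follows, using made-up total-time indices in seconds rather than the study's recorded data.

```python
from scipy import stats

# Hypothetical total-time indices (seconds), one per participant,
# nine per condition as in this study; not the actual measurements.
old_site = [540, 620, 700, 730, 760, 790, 840, 880, 924]
new_site = [560, 640, 720, 800, 830, 860, 900, 960, 1018]

t, p = stats.ttest_ind(old_site, new_site)
dof = len(old_site) + len(new_site) - 2  # 9 + 9 - 2 = 16, as reported above
print(f"t({dof}) = {t:.2f}, p = {p:.3f}")
```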
We did not find support for this hypothesis, and there was no statistically significant difference between those who were in the old site condition and those who were in the new site condition. This may be due to the small sample size used for this study, and the difference may prove significant with a larger sample. In addition, there were inconsistencies between the two sites which may have affected the time it took to complete a task. The new College site had been designed with the addition of new content that was not available in the original site. This caused information to be grouped and broken up into small pieces. As a result, there were additional navigation choices, and more pages had to load per task. This added idle time in which users were not actually using the site. For example, in task 1, users only had to make one correct decision to find the appropriate answer when using the old site. In the new version, the user had to make two navigation choices as well as digest content before choosing the second navigation choice to find the correct answer. The original site also used frames technology. This design technique allowed the College site's main navigation to always be present and gave a sense of one large site, versus the six independent sites of the new design. This may also explain why the difference between the means did not follow in the predicted direction.

SATISFACTION

The second hypothesis tested was the following:

H2: Consistency of navigation will provide an enhanced feeling of user satisfaction.

It was our expectation that the use of consistent navigation in web page design would enhance participant satisfaction while working with web sites to complete information retrieval tasks. This hypothesis was not supported. The difference between the means was in the direction predicted, with people being more satisfied with the new web site (M=87.89, SD=12.57) than with the old web site (M=84.11, SD=13.40); however, a t-test showed a non-significant difference between groups [t(16)=.61, p>.05]. The satisfaction measure had only a slight increase for the new site condition, which may be due to the small sample size, and may prove significant with a larger sample.

Procedural changes might also affect these results. Participants only used one site during their testing session. Satisfaction measures including quality of graphics, appearance of site, and fun to use had no reference point. Although users might have relied on past web use experience, if users had participated in tests for both sites, satisfaction measures might have been based on the differences between the two sites. Other satisfaction measures, including ease of finding specific information and logic of navigation, might correlate with efficiency. The lack of improvement in efficiency between the two sites may be a reason why there is no significant improvement in our satisfaction measure.

SITE USABILITY

The third hypothesis tested was the following:

H3: Web sites with consistent navigation will have higher efficiency and satisfaction resulting in a higher usability rating.

It was our expectation that the usability rating would be heavily influenced by the site's efficiency and satisfaction measures. Usability ratings were calculated and compared (see Appendix H) to the web sites measured in a study carried out by Spool (1999). The difference in usability between the old and new College of Communication Arts & Sciences web sites was not significant. The old site received a usability rating of 29; the new site received a usability rating of 31. With little variance in efficiency and satisfaction between the old and new College sites, it is possible that there is a correlation of efficiency and satisfaction with usability, but that we are not able to see it. Although the difference in site usability for the old and new College web sites is not significant, it does tell us that both sites are well below the highest possible score. It is evident from these results that there is room for improvement in the College web site.

USER OBSERVATIONS

In undertaking this part of the project, our goals were to identify usability problems with the old and new College of Communication Arts & Sciences web sites. After observing 18 test sessions, we have a good idea of how the old and new College of Communication Arts & Sciences web sites compare to each other in regard to usability. Both the old and new web sites were developed to be a portal between the College and the 5 departmental sites. The main difference between the two design solutions was the inconsistent look and feel and the use of frames in the old web site, as compared to the non-use of frames and consistent look and feel of the new College and department web sites. Below I will discuss each task individually as well as identify some common usability problems users encountered while attempting to complete the usability test.

TASK 1: Simple Fact Question

From the very beginning of our user testing sessions we began to witness some interesting results of users interacting with the old and new web sites. In task 1, users were asked to identify the undergraduate advisor for the Department of Audiology and Speech Sciences.
All 9 users of the old site successfully identified the correct answer, whereas 8 users of the new site found the correct answer, with 1 subject failing to complete the task. Of the 18 participants, only 4 used the "advising" link located directly on the home page of both the old and new sites. Originally we thought this link was missed on the old site due to its location below the fold of the computer screen. In the new site, the same "advising" link was moved above the fold and placed as the first item on the side navigation list. What we began to notice was how our users associated advisors with the people of a department as opposed to the College. Over and over we witnessed users either select the "people" link from within the College site, or select the "departments" link followed by the "audiology and speech sciences" link and then once again select "people" from the department's home page.

The design of the new web site follows the organizational structure of the College. The results of this task point out that this organization does not match the users' mental model and therefore should be adjusted. Users regularly viewed department advisors as members of the faculty and staff of a department rather than as members of the Student Affairs Office. While selecting the "people" link at the College level would get you to the appropriate information, this same action within a department site will not. If users miss the "advising" link on a department home page, they must figure out that advising is within the student resources section.

Recommended Design Changes:

- Add a text graphic link titled "advising" or "advisor" to the sub-navigation of the people section in all departmental sites. This new page should list only the department specific advisor(s).
- Remove or change the text graphic link titled "advisory board" in the sub-navigation of the "people" section of the departmental sites. Advising and advisory language tends to confuse users.

TASK 2: Comparison of Facts Question

Task 2 also proved to be quite difficult for our users. For this task users were given the following scenario: "You are having difficulty understanding the materials covered in your mathematics course and are afraid you might not pass without help. Would the Service Learning Center be a good choice for help?"

Of the 9 participants using the old site, 4 completed the task successfully, 3 provided incorrect answers, and 2 users gave up. The 9 participants using the new site did somewhat better, with 5 providing the correct answer and 4 users providing the incorrect answer. It is important to note that the 4 users who decided on the wrong answer did navigate to the location of the correct answer. They simply formulated the wrong answer based on the comparison of facts. Users of the old web site often thought the information would be found in either the department web sites or within the link to "labs and facilities". When they didn't find what they were looking for, users would return to the advising section to find the correct answer to this task.
This confusion was possibly due to the similarity of the sub-navigation labels and the lack of sub-section descriptions within the body copy of the student resources overview page. Recommended Design Changes: - Re-design the content grouping and sub-navigation labeling in the student resources section of the College and departmental sites. TASK 3: Comparison of Fact Question In task 3 test participants were asked to identify which department currently offers the most 300-400 level courses. This task proved to be the most frustrating for users. Both the old and new site had 1 user who gave up while attempting to complete the task. The remaining 8 users of the new site all successfully completed the task while 7 users of the old site completed the task and 1 user was unsuccessful. The main complaint from users was the lack of options to compare items across departments within the College site. Unfortunately the only method to 31 finding the answer to this task was to jump between department sites and count the courses one by one. One surprising observation made during this task was the excessive use of the browser back button, especially with users of the old College web site. Users repeatedly hit the back button to return to a master page containing links to all the department web sites. This same page could have been reached via the “departments” link, which was always present within the global navigation frame of the old web site. A second problem users’ faced with the old College site was the inconsistency of the department web sites. While looking for the answer to this task, users always went to the Department of Advertising web site first. Many users were unsuccessful at finding the course listings hidden in the undergraduate section of the site and would then go to a different department site to look for their course listings. Once users identified that the information was available within another department they would return to Advertising and search until they found the correct location of the course information. Users of the new site also had difficulties. Once again the major complaint was the lack of information at the College level for comparing department information. As one user said, “If I’m not familiar with a department, I want to search around the College for all the information versus searching each site one by one.” Nearly every user successfully navigated to the programs section of each department. However, numerous users first went into the bachelors and 32 graduate program sub pages before noticing the link to courses at the bottom of the sub-navigation. This task was also the first chance to witness users learn and use the consistent navigation system of the new department sites. Upon finding the course information for one department, users quickly navigated to the other four departments course information. One user even noticed the consistency of the html page locations. This user simply modified the first few letters of the URL’s to reflect a different department site location. For example, to jump from the Department of Telecommunication web site to the Department of Advertising web site the user changed the TC of http://tc.msu.edu to ADV. TASK 4: Comparison of Judgement Question Upon completion of task 3 in both the old and new College web site, users often found themselves beginning task 4 while located in the Department of Telecommunication web site. 
This tended to benefit users in terms of efficiency when attempting to answer the question in task 4, which asked the users to compare three student organizations within the Telecommunication Department and decide which would best prepare students looking for careers in television production. This was the only task that all 18 users (9 using the old site, 9 using the new site) successfully completed. Although the difference was not statistically significant, users of the old College site were slightly more efficient than users of the new site. This was most likely due to the fact that the old Department of Telecommunication site listed "student organizations" in the main navigation. This link was available at all times. The new site had this same link placed one level deep, within the student resources section of the site. Therefore users had to navigate from the course section of the new site to student resources and then to student organizations.

Upon completion of the four tasks and post-test questionnaires, users were led through a short debriefing session. Two items that came out of these sessions had to do with the search feature of the site and the content of the new site. First, of the 18 users in the study, only 4 used the search function while attempting to find answers to the tasks during the test session. All four were initially confused and overwhelmed by the information presented on the search results page. Users were unclear as to the purpose of a few items within the search results page. These items included the color bar next to the list of search result links as well as the structure of the links themselves. They also wished there were some sort of brief description of each search result. One user didn't understand why his search for "ascot" didn't work. What he discovered was that the search feature on the College of Communication Arts & Sciences home page did not search across the entire College, including all five external department sites.

Secondly, when I asked the staff members of the College how they use the College of Communication Arts & Sciences web site, they all said they really don't use it. They felt the site was more of a PR piece and a resource for students, faculty, and incoming students. The types of daily tasks staff members perform have to be done through the Michigan State University web site, which they also noted was difficult to use. Tasks staff members regularly need to complete include handling pay vouchers for visiting faculty, looking up classroom scheduling information, and looking up information at University Stores. The main complaint staff members had was that all the sites they need to access are scattered throughout the Michigan State University web site. Staff members wished there was one site that had links to all the appropriate web sites they need to access to complete their daily tasks.

Recommended Design Changes:

- Add content to the College site so it becomes more information rich and less of a portal site. If users are unfamiliar with the College or individual departments, it is difficult to carry out comparative type tasks.
- Add a "view/search all courses" text link and possibly a call to action paragraph to the body copy of the bachelor, graduate, and doctoral program sections. This should link to a page where users can view and search complete Communication Arts and Sciences course offerings using a combination of key words and pull down menu selections.
- Add a "view/search all courses" text link and possibly a call to action paragraph to the courses page. This should link to the same page mentioned above.

Common Problems While Performing All Tasks

While observing participants interact with the two sites, we repeatedly witnessed similar user actions. One interesting result of the study was that the new site scored a higher usability rating but the old site had a higher level of user efficiency. By using frames in the old web site, users always had the option to use the College's main navigation to jump to new locations. In the new site, once users navigated from the College site to a department site, they would repeatedly use the back button to navigate back to the College home page before choosing a different navigation path. Although a link that would take users directly back to the College home page was available on every page of the new site, no users selected it. When asked in the debriefing session about the link, users were unaware that the College logo was a navigation element. Most users thought it was simply a graphic and expected the home button to take them back to the College. Others thought the logo had the same action as the home link in the main navigation due to the close proximity of the two links.

Recommended Design Changes:

- Add a text graphic link to the main navigation of departmental sites which takes the user back to the College of Communication Arts and Sciences home page.
- Add text descriptions to the output of the search function. This includes the addition of column headers over the search relevance bars and links.
- Add a "view/search all courses" text link and possibly a call to action paragraph to the courses page. This should link to same page mentioned above. 35 Common Problems While Performing All Tasks While observing participants interact with the two sites we repeatedly witnessed similar user actions. One interesting result of the study was that the new site scored a higher usability rating but the old site had a higher level of user efficiency. By using frames in the old web site, users always had the option to use the College’s main navigation to jump to new locations. In the new site, once users navigated from the College site to a department site, they would repeatedly use the back button to navigate back to the College home page before choosing a different navigation path. Although a link that would take users directly back to the College home page was available on every page of the new site, no users selected it. When asked in the debriefing session about the link, users were unaware that the College logo was a navigation element. Most users thought it was simply a graphic and expected the home button to take them back to the College. Others thought the logo had the same action as the home link in the main navigation due to the close proximity of the two links. Recommended Design Changes: - Add a text graphic link to the main navigation of departmental sites which takes the user back to the College of Communication Arts and Sciences home page. - Add a text description to output results of search function. This includes the addition of column headers over search relevance bars and links. 36 - Add a short descriptive statement below each link that provides insight to information found on linked page. - Set up search in College site so it searches across both the College site and the five departmental sites. - Develop a new intranet site specific to faculty and staff goal oriented tasks. 37 Chapter 7 CONCLUSION Searching for information on the old and new College of Communication Arts Sciences web site at Michigan State University can be a frustrating experience. When we compare the two sites against the 11 sites tested in the Spool (1999) usability study (see appendix H), we find they score better than average but, fall well short of the highest possible score. Throughout our study, users repeatedly made similar navigational errors while attempting to find answers to simple and complex questions. While the results of this study are inconclusive with regards to the effects of navigation on user efficiency and satisfaction, we did identify problems which effected the overall usability of the College sites. When designing and testing for usability we are checking for the appropriateness of the design solution for a target population with a specific set of tasks and goals in mind. After observing students and staff from Michigan State university work with the College of Communication Arts and Sciences web site, we believe the our design suggestions will aid in removing the many obstacles causing users usability problems. Through usability evaluation and a constant effort to focus directly on the goals and objectives of the target users of the College of Communication Arts & Sciences web site, we can continue to improve and enhance the Colleges online presence via a successful and usable web site. 
APPENDICES

APPENDIX A

Advanced Usability Lab

Subjects Room:
- User workstation
- Scan converter
- Video camera(s)
- Audio input devices

Experimenters Room:
- Video recorder(s)
- Video mixer
- Audio mixer
- Video monitors
- Speakers
- Event logger's workstation
- Analysis software
- Time-code generator

APPENDIX B

Actual Usability Lab Developed

- User workstation
- Scan converter
- Video recorder with built-in microphone
- Video/audio mixer
- Video monitor with speakers

APPENDIX C

INFORMED CONSENT FORM

Please read this carefully. You are being asked to participate in a study sponsored by the Media Interface and Network Design (M.I.N.D.) Lab, located in the Department of Telecommunication at Michigan State University. This study is conducted by investigators and/or students associated with the M.I.N.D. Lab. Further, it will be conducted in accordance with Lab and University rules and procedures. By participating in this evaluation, you will help us improve this and other web site designs.

You are only being asked to participate in this study on this single occasion; you are not being asked to participate in multiple sessions. This single session will last approximately one hour. During this session we will observe you and record information about how you work with web sites. We will ask you to fill out questionnaires about media use and your thoughts and experiences related to using particular web sites. We will videotape all of your work. By signing this form, you give your permission to the M.I.N.D. Lab to use your voice, verbal statements, and videotaped pictures for the purposes of evaluating web sites and showing the results of these evaluations. We will not use your real name.

You are freely consenting to participate in this study. If at any time during this study you wish to discontinue, you may do so; your participation is voluntary. All results from this study will be treated with strict confidence, and your identity will remain anonymous in any report of research findings. Upon your request, and within these restrictions, results may be made available to you if you so indicate now or in the next three years. You are provided with a subject code, which you may use to request a report summarizing these results.

During the course of this study or after its completion, you may contact Kurt Besecker at 353.5497 or Dr. Frank Biocca at 353.5964 regarding any questions or concerns raised by your participation in the study. If you have questions about your rights as a participant in a research experiment, you may contact David Wright, Chair of the University Committee on Research Involving Human Subjects, at 517-355-2180.

You guarantee that you are not a minor, or that you have informed the investigator that you are a minor and must be excluded from participation in this study. You indicate your voluntary agreement to participate in this study by signing this consent form.

Please print your name:
Signature:
Date:

Please detach and keep the bottom portion of this sheet. Place the top portion with your signature in the manila envelope. Thank you for your participation in this study!

APPENDIX D

PRE-TEST QUESTIONNAIRE

Please write in, check, or circle the most appropriate answer.

Participant Information:
1. Sex: ( ) Male ( ) Female
2. Age: ____ years

Computer Experience:
3. What type of computer do you typically use?
   At home: ( ) IBM or compatible, ( ) Apple Macintosh, ( ) Other:
   At work: ( ) IBM or compatible, ( ) Apple Macintosh, ( ) Other:
4. About how many hours a day do you use a computer?
   At home: ( ) 1-3 hours, ( ) 3-8 hours, ( ) more than 8 hours
   At work: ( ) 1-3 hours, ( ) 3-8 hours, ( ) more than 8 hours

5. How frequently or rarely do you use your computer for the following purposes?
   (circle one per row: very rarely 1 2 3 4 5 6 7 very frequently)
   - Word processing
   - Spreadsheets
   - Games
   - Information reference
   - Online service
   - Personal communication
   - Arts (graphic/music)
   - Statistical analysis
   - Engineering simulation

6. On average, how many hours a day do you use the web?
   At home: ( ) 1-3 hours, ( ) 3-8 hours, ( ) more than 8 hours
   At work: ( ) 1-3 hours, ( ) 3-8 hours, ( ) more than 8 hours

7. On average, how many e-mail messages do you send and receive daily?
   ( ) 0-5  ( ) 6-10  ( ) 11-15  ( ) 16-20  ( ) over 20

8. How many web sites have you designed?
   ( ) 0-2  ( ) 3-5  ( ) 6-8  ( ) 9 or more

APPENDIX E

TASKS FOR TEST SESSION

Task 1: Who is the undergraduate advisor for the Department of Audiology and Speech Sciences?

Task 2: You are having difficulty understanding the materials covered in your mathematics course and are afraid you might not pass without help. Would the "Service Learning Center" be a good choice for help?

Task 3: Which department currently offers the most 300-400 level courses?

Task 4: Which of the following Telecommunication organizations will best prepare students looking for careers in television production?
- ASCOT
- Telecasters
- Telestate

APPENDIX F

POST-TASK QUESTIONNAIRE

1. Physically, how do you feel right now?
   exhausted 1 2 3 4 5 6 7 full of energy
2. Mentally, how do you feel right now?
   completely confused 1 2 3 4 5 6 7 everything made sense
3. While completing this task, did you feel
   completely frustrated 1 2 3 4 5 6 7 always knew what to do next
4. Compared to what you expected, did the task go
   much slower 1 2 3 4 5 6 7 much faster
5. Rate the quality of the information in this site
   unacceptable 1 2 3 4 5 6 7 excellent
6. How confident are you that you found all the relevant information?
   not at all confident 1 2 3 4 5 6 7 very confident
7. How do you feel now that this task is over?
   relieved 1 2 3 4 5 6 7 eager for more
8. How would you have answered this question if you did not have web access? (Check all that apply)
   __ Read something (what?)
   __ Call someone (who?)
   __ Gone somewhere (where?)
   __ Other:

APPENDIX G

POST-TEST QUESTIONNAIRE

Please rate your satisfaction with the site you have just finished working with. Circle the number on the scale to indicate your level of satisfaction.

1. Ease of finding specific information: very unsatisfied 1 2 3 4 5 6 7 very satisfied
2. Ease of reading data: very unsatisfied 1 2 3 4 5 6 7 very satisfied
3. Ease of concentrating on the data search (distractions): very unsatisfied 1 2 3 4 5 6 7 very satisfied
4. Logic of navigation: very unsatisfied 1 2 3 4 5 6 7 very satisfied
5. Ease of search: very unsatisfied 1 2 3 4 5 6 7 very satisfied
6. Appearance of site: very unsatisfied 1 2 3 4 5 6 7 very satisfied
7. Quality of graphics: very unsatisfied 1 2 3 4 5 6 7 very satisfied
8. Relevance of graphics to site subject: very unsatisfied 1 2 3 4 5 6 7 very satisfied
9. Speed of data display: very unsatisfied 1 2 3 4 5 6 7 very satisfied
10. Timeliness of data (is it current): very unsatisfied 1 2 3 4 5 6 7 very satisfied
11. Quality of language: very unsatisfied 1 2 3 4 5 6 7 very satisfied
12. Fun to use? very unsatisfied 1 2 3 4 5 6 7 very satisfied
13. Explanations of how to use site: very unsatisfied 1 2 3 4 5 6 7 very satisfied
14. Overall ease of use: very unsatisfied 1 2 3 4 5 6 7 very satisfied
15. Completeness with which the site's subject is treated: very unsatisfied 1 2 3 4 5 6 7 very satisfied
16. Your overall productivity with the site: very unsatisfied 1 2 3 4 5 6 7 very satisfied

APPENDIX H

Comparisons of Usability Ratings

[Figure: bar chart comparing the usability ratings of the old and new College sites against the 11 sites tested by Spool (1999).]

BIBLIOGRAPHY

Flemming, J. (1998). Web Navigation: Designing the User Experience. Sebastopol, CA: O'Reilly & Associates, Inc.

Kristof, R., & Satran, A. (1995). Interactivity by Design. Mountain View, CA: Adobe Press.

Lopuck, L. (1996). Designing Multimedia. CA: Peachpit Press.

Mok, C. (1996). Designing Business. San Jose, CA: Adobe Press.

Mullet, K., & Sano, D. (1995). Designing Visual Interfaces. CA: SunSoft Press.

Nielsen, J. (1993). Usability Engineering. Boston: AP Professional.

Nielsen, J. (1999). Web Research: Believe the Data. Available on-line.

Sano, D. (1998). Designing Large-Scale Web Sites. New York: John Wiley & Sons, Inc.

Siegel, D. (1998). Creating Killer Web Sites. Indianapolis: Hayden Books.

Siegel, D. (1997). Secrets of Successful Web Sites. Indianapolis: Hayden Books.

Spool, J., Scanlon, T., Schroeder, W., Snyder, C., & DeAngelo, T. (1999). Web Site Usability: A Designer's Guide. San Francisco: Morgan Kaufmann Publishers, Inc.