ABC Center User Testing Report

Client: ABC Center for Research
Website: http://www.abccenter.org
Contact: John Walton
Consultants: Bazler, Jae Kim, Erik Brook, Heather Beery, Lora Ramseier
Date: January 15, 2007



User Experience Group, Indiana University

Table of Contents

Executive Summary
Study Details
    Purpose of Study
    Method
    Participants
    Procedures
    Tasks
    Results
Task Performance
Observations and Recommendations
    Screen Layout/Design
    Terminology
    Content Organization
    Navigation
    Operating System and Browser Issues
Post-Test Satisfaction Ratings
    Site Satisfaction Ratings
    Satisfaction Questionnaire – Post-Test Questions
        Question 1 – What do you consider the most valuable aspect of the system?
        Question 2 – What is the biggest problem with the system?
        Question 3 – Additional comments
Appendices
    Appendix A – Testing Protocol Script
    Appendix B – Participant Consent Form
    Appendix C – Video Release Form
    Appendix D – Satisfaction Questionnaire
    Appendix E – Space Utilization
    Appendix F – Image Fonts
    Appendix G – Contrast Between Text and Background
    Appendix H – Terminology
    Appendix I – Link Format
    Appendix J – Mouse-over Effect
    Appendix K – Linking to Previous Page
    Appendix L – Operating System and Browser Issues
    Appendix M – Browser Compatibility Issues


Executive Summary

In March 2005, a usability study of the ABC Center website was conducted with fifteen participants in the Bloomington, Indiana area. Results showed that users were generally satisfied with the amount and quality of the content available on the ABC site. However, a number of usability issues were identified that hindered locating information, and the site was perceived as unprofessional in appearance.

One of the greatest usability concerns was the lack of a consistent, reliable navigation system. Users were often left without navigation options to return to the previous page or to the main menu and were forced to use the browser buttons to navigate the site. This issue was compounded by links that unexpectedly took users out of the site.

The overall unprofessional appearance of the site was another concern. When searching for scientific research, the perceived credibility of a site plays a large role in whether people are willing to use it and trust the information provided. Participants expected the images used in the site to be of higher quality, page space to be utilized more efficiently, and the content to be formatted correctly (capitalization, text alignment, etc.). Even though each of these issues may seem minor, their combined effect reduces the site's credibility in users' eyes.

Other usability issues included linking to software-specific files without warning and an unpredictable content presentation format.

This report describes the purpose and methods of the study, provides a summary of the observations made in each testing session, and makes recommendations for the ABC Center web development team to consider as they prepare for the redesign of the website.


Study Details

Purpose of Study

The ABC Center website is a resource for students to find research studies on the web. The site provides a variety of features, including a research article database, links to other online research sources, and a directory of researchers. A new site design will be introduced in July 2005, and the ABC Center web development team wants to explore opportunities to improve the website by obtaining user feedback. As part of the usability evaluation for the ABC Center, user testing was conducted in March 2005 in Bloomington, Indiana.

Method

Usability testing methodology[1] was used to evaluate the website. This approach involved having authentic users perform authentic tasks with the system while evaluators observed and recorded their actions and comments. Users were asked to follow a think-aloud protocol[2] to help the evaluators understand their behaviors and gain insight into the design of the website. Sessions were performed on an individual basis, with each session lasting approximately 1 to 1 1/2 hours. The ABC Center provided users with $50 gift certificates in return for their participation. Following the last session, qualitative and quantitative data were analyzed and recommendations for redesign were made.

Participants

Participants for the testing sessions were identified and recruited by the UXG team. An effort was made to target participants fitting a wide range of user profiles, given the heterogeneity of the site's potential users. The goal was to recruit a sample of 16 participants to achieve adequate representation across the following four demographic categories:

1. Gender – an approximately equal split between male and female
2. Age – three age ranges (20-39, 40-59, 60+)
3. Computer experience – novice (little to no computer use), intermediate (office applications, web), or expert (complex applications and/or programming)
4. Experience with research websites

A total of 15 participants were tested: 7 men and 8 women. Five participants were in the 20-39 age range, six were in the 40-59 range, and four were over 60.

Procedures

Participants were read an introductory protocol at the beginning of the testing session (see Appendix A for the script) and asked to read and sign two consent forms. The first (see Appendix B) is a standard consent form indicating the participants' understanding of what would be expected of them during the sessions and their willingness to participate. The second (see Appendix C) is a video release form indicating the participants' awareness that the sessions were videotaped and granting permission for the video to be used in the evaluation and reporting of results. Participants were given a copy of the consent forms for their records.

Each participant completed 10 tasks using the abccenter.org website. Tasks were completed one at a time, and performance on each was recorded as either 1) success with ease, 2) success with difficulty, or 3) failure to complete the task. Success with ease included tasks completed on the first or second try. Success with difficulty included tasks that participants adequately completed but that required more effort and persistence, for example trying numerous paths before finding the correct screen. Failure was recorded if the participant gave up on the task or failed to locate the correct information. In some "failure" cases, participants may have believed they completed the task, but the correct information was not located or portions of the task were left incomplete.

Following the session, users were asked to rate their satisfaction with the system using the System Usability Scale (SUS), developed at Digital Equipment Corporation (see Appendix D). Tasks were performed using Internet Explorer 6.0 and the Windows XP operating system on a Dell Pentium 4, 3.80 GHz computer.

[1] Dumas, J. S., & Redish, J. C. (1993). A practical guide to usability testing. Norwood, NJ: Ablex.
[2] Ericsson, K. A., & Simon, H. A. (1993). Protocol analysis: Verbal reports as data (Revised ed.). Cambridge, MA: MIT Press.

Tasks

The ABC Center team developed a draft list of 10 tasks for testing based on key site information and features. UXG consultants reviewed and refined the tasks in order to create appropriate scenarios for user testing. Tasks were presented in random order to prevent order effects from influencing the testing results.
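Randomizing task order per participant, as described above, can be sketched in a few lines. The per-participant seeding scheme below is an illustrative assumption, not the study's actual procedure:

```python
import random

TASKS = list(range(1, 11))  # the 10 task numbers used in this study

def task_order(participant_id):
    # Seed the generator with the participant ID so each participant
    # gets a fixed, reproducible random order (hypothetical scheme).
    order = TASKS[:]
    random.Random(participant_id).shuffle(order)
    return order

print(task_order(7))  # a permutation of 1..10
```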

Results

The results of the study are divided into three sections:

1. Task performance (e.g., successes and failures)
Tasks are listed, and the number of participants experiencing successes and failures is noted.

2. Observations and recommendations
The results of the study are presented in a table with three columns of information: Observations, Interpretations, and Recommendations.

• Observations – an objective description of participants' actions and comments during a session.
• Interpretations – a proposed explanation for participants' observed behaviors and justification for recommendations based on known design principles and rules.
• Recommendations – suggestions for maintaining aspects of the current design that work, changing aspects that are problematic, and considering alternative possibilities for presenting information to the user.

The data within this table are organized into categories that represent the major types of issues raised during the study:

• Screen Layout/Design – items concerning the placement of elements on the screen, the use of screen space, and opportunities for modifying items to increase user understanding of presented information.
• Terminology – items concerning the users' understanding of the language used on the site.
• Content Organization – items concerning the organization of information within the site.
• Navigation – items concerning the users' ability to maneuver throughout the site.
• Operating System and Browser Issues – items concerning compatibility and consistency of site presentation across major computer configurations.

3. Post-test satisfaction ratings and comments
The results of the System Usability Scale (SUS) questionnaire are presented, with a brief description of how overall SUS scores are calculated. Participants' comments are included.
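The SUS scoring rule is fixed: odd-numbered (positively worded) items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch (the responses shown are illustrative, not this study's data):

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 responses."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # items 1,3,5,7,9 are positive
        for i, r in enumerate(responses)
    )
    return total * 2.5

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best possible: 100.0
```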


Task Performance

Each task is listed with the number of participants (of 15) who succeeded with ease, succeeded with difficulty, or failed.

1. You are looking for employment at the ABC Center. Find out if there is anything open at this time that you could apply for. Are you interested in the position?
   Ease: 12 | Difficulty: 2 | Fail: 1

2. You are interested in taking an online course offered by the ABC Center. Find a course in "Referencing in APA Style."
   Ease: 11 | Difficulty: 2 | Fail: 2

3. You are writing a research paper, and you realize that you are missing the author information for one of your sources. Find the author of the research article "Intercultural Discourse."
   Ease: 7 | Difficulty: 3 | Fail: 5

4. You have been asked by your teacher/professor to contact a researcher. Find the contact information for "Ellen Miller."
   Ease: 4 | Difficulty: 7 | Fail: 4

5. You are interested in visiting the ABC Center in person. Where is the ABC Center located?
   Ease: 13 | Difficulty: 2 | Fail: 0

6. With so many online research databases to choose from, you wonder how they compare. Find the discussion forum where users rate them.
   Ease: 4 | Difficulty: 6 | Fail: 5

7. After spending some time browsing the ABC Center research database, you found some articles that you might need in the future. After logging in, find and save the research article "Benningham DNA Study" in your "Online Bookmarks" for later retrieval.
   Ease: 2 | Difficulty: 6 | Fail: 7

8. You have heard that the ABC Center offers several workshops throughout the year. What are the titles of the two offered in May 2005?
   Ease: 7 | Difficulty: 5 | Fail: 3

9. The ABC Center offers several free research-aid software packages for download. Find and download "Ace Reference Tracker 2."
   Ease: 12 | Difficulty: 1 | Fail: 2

10. Thank you for visiting the ABC Center website. Can you please use the online form to rate us?
    Ease: 10 | Difficulty: 3 | Fail: 2


A graph of these results illustrated the percentage of users who accomplished each task with ease, with difficulty, or who failed. The tasks that caused users the most difficulty were locating the contact information for researchers (task 4) and using the Online Bookmarks feature (task 7).
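The counts in the task table convert directly to the graphed percentages; a minimal sketch, with the per-task counts transcribed from the table above:

```python
# Success counts per task from the table above: (ease, difficulty, fail).
counts = {
    1: (12, 2, 1), 2: (11, 2, 2), 3: (7, 3, 5), 4: (4, 7, 4), 5: (13, 2, 0),
    6: (4, 6, 5), 7: (2, 6, 7), 8: (7, 5, 3), 9: (12, 1, 2), 10: (10, 3, 2),
}

def percentages(ease, hard, fail):
    total = ease + hard + fail  # 15 participants per task
    return tuple(round(100 * n / total) for n in (ease, hard, fail))

for task, c in sorted(counts.items()):
    print(task, percentages(*c))
```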


Observations and Recommendations

Screen Layout/Design

Observation: The copyright information on the homepage is invisible to users because it is white text on a white background.
Interpretation: Identical text and background colors provide no contrast, so the text is visible only if it is accidentally drag-selected with the mouse.
Recommendations:
1. Use black or another dark color for this text.

Observation: At least three different font and color schemes are used across the site to graphically represent the organization's name.
Interpretation: When the same information is presented in different formats, users work inefficiently because they must re-interpret it each time. Furthermore, a chance to reinforce the organization's image and identity is lost.
Recommendations:
2. Standardize the font and color format if the organization does not have a graphical logo. If it does, replace the current titles with the logo.

Observation: The organization's name is difficult to read.
Interpretation: The layout makes the name read like a list: "the center, education, research, retailing."
Recommendations:
3. Change the layout of the organization's name.

Observation: The information currently displayed on the homepage is not helpful to users. For example, the organization's name is shown twice, the picture displayed is not associated with the center, and there is no site menu or navigation on the page.
Interpretation: The homepage does not help users understand what the site is about or where they can navigate.
Recommendations:
4. Remove the current homepage and make the "ABC Center Menu" page the site's homepage.

Observation: The "Research Time Line" link is presented in a different font than the other links in the same list.
Interpretation: Content should be presented in a consistent graphical format, especially for items in the same list. Any deviation suggests that the link differs from the others in content, importance, etc.
Recommendations:
5. Change the font of the link to be consistent with the other links in the list.

Observation: Too much screen space (80+ pixels) is wasted at the top of the menu page (see Appendix E).
Interpretation: The top of a web page is valuable and draws users' attention; the current layout does not utilize this potential.
Recommendations:
6. Move the page title image up to reduce the white space.

Observation: Many images used on the site are of low quality and/or stretched from smaller originals, making the overall look of the site unprofessional.
Interpretation: Low-quality images degrade the overall look of the site, making it and its affiliated organization less credible. Images should enhance the look of a site without being distracting or overwhelming.
Recommendations:
7. Replace the existing images with higher-quality images.
8. Limit the use of images, because they increase site load time and can distract users from more important content.

Observation: Participants commented that some items were blurred and difficult to read (see Appendix F).
Interpretation: Low-quality text images and resizing (shrinking and stretching) can cause images to appear blurry.
Recommendations:
9. Replace all text images with normal text, unless specific design considerations require them (such as within a logo).
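Contrast problems like the white-on-white copyright text noted above can be checked objectively with the WCAG relative-luminance contrast ratio, which runs from 1:1 (invisible) to 21:1 (black on white). A sketch, assuming standard 8-bit sRGB values:

```python
def _channel(c):
    # Linearize an 8-bit sRGB channel per the WCAG relative-luminance formula.
    c = c / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def contrast_ratio(rgb1, rgb2):
    """WCAG contrast ratio between two RGB colors: 1.0 (invisible) to 21.0."""
    lum = lambda rgb: (0.2126 * _channel(rgb[0])
                       + 0.7152 * _channel(rgb[1])
                       + 0.0722 * _channel(rgb[2]))
    l1, l2 = sorted((lum(rgb1), lum(rgb2)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(contrast_ratio((255, 255, 255), (255, 255, 255)))  # 1.0: invisible
print(round(contrast_ratio((0, 0, 0), (255, 255, 255))))  # 21: maximum
```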

Observation: There is no consistent format for content text, links, and visited links.
Interpretation: Users have difficulty knowing what constitutes a link, a header, plain text, or an embedded link. Without a consistent format, users must explore each item to determine whether it is a link, which requires more time and energy than many users are willing to spend.
Recommendations:
10. Choose a format for page layout, titles, content text, links, and visited links, and apply it consistently across the site.

Observation: Every page begins with a title describing its contents, but the title format is not consistent across the site.
Interpretation: A clear title on each page helps users know where they are and feel more comfortable navigating the site; an inconsistent format, however, can cause confusion.
Recommendations:
11. Choose a font, color, size, and position for the title and use the same format consistently throughout the site.

Observation: The repeated frame at the left-hand side could serve as site identification; however, there is no design scheme or stylistic treatment for the content frame.
Interpretation: Site identification is important, and repetition of screen layout, font size, and color contributes to it.
Recommendations:
12. Apply a consistent format to the page layout as well as the contents.

Observation: The link in the left frame turned purple after a user clicked it and then used the browser's "Back" button to return to the page (see Appendix G). Participants were not able to read the link.
Interpretation: Purple text on a red background does not provide sufficient contrast and is therefore difficult to read.
Recommendations:
13. Choose color schemes that provide good contrast between text and background.

Observation: Links to external sites are included in what should be the site's main navigation.
Interpretation: The main purpose of site navigation is to locate information about the organization, not to link to other sites. The most frequently used links and the site navigation structure should be available on every page.
Recommendations:
14. Replace the external links with the site navigation structure.
15. Put the external links under the "resources" category or in the footer.

Observation: An empty checkbox in front of the "Return to ABC Center Menu" link causes confusion. Participants were unsure whether it was a bullet point or something requiring an action. On some pages, there is a checkbox animation in front of the link.
Interpretation: Inappropriate use of symbols or images causes confusion. Users may not know whether to click the underlined text or the box. The checkbox animation is not relevant to the link or its content and distracts the user.
Recommendations:
16. Remove the empty checkbox and the checkbox animation from the links.

Observation: The navigation image map used on "About Our Center" takes too much screen space (550 x 413 pixels).
Interpretation: Using large images for navigation is not an efficient way to organize content on a web page.
Recommendations:
17. Do not use large images unless absolutely necessary. Replace the image map with a bulleted list of links.

Observation: There are no visual cues to tell users that the four red triangles on the corners of the "About Our Center" image map contain embedded links.
Interpretation: Users can easily miss or ignore a link that does not look like a link.
Recommendations:
18. Avoid using image maps.
19. If an image map is necessary, provide visual cues (for example, underline the text on graphics) to tell the user that there is an embedded link.

Observation: Participants clicked on some text and then realized it was not a link.
Interpretation: There is no visible distinction between plain text and embedded links, nor a consistent format for embedded links.
Recommendations:
20. Choose a font, color, and style for embedded links and use the format consistently across the site.

Observation: Bulleted links do not line up on the "Programs for Students" page.
Interpretation: Proper item alignment is an important element of professional presentation.
Recommendations:
21. Make sure all items are properly aligned.

Observation: Most, but not all, link text is capitalized on the "Programs for Students" page.
Interpretation: A consistent capitalization scheme is an important element of the site's professional appearance.
Recommendations:
22. Change the link text to adhere to a consistent capitalization guideline.

Terminology

Observation: Only the homepage has a window title, and it does not describe the website clearly.
Interpretation: Every page should be given a clear, identifiable title that displays in the browser's title bar. This also helps the user bookmark the page appropriately.
Recommendations:
23. Provide an appropriate title for every web page.

Observation: The "Welcome to the ABC Center" link on the homepage is vague; it tells the user nothing about the purpose of the link.
Interpretation: The sentence reads like a greeting rather than a link. Users would not be able to tell it is a link if it were not underlined.
Recommendations:
24. Phrase all links appropriately. The link text should clearly identify the purpose of the link.

Observation: Participants expected to gain access to an application by clicking on the images under "Software Applications" (see Appendix H).
Interpretation: The manner in which the images were displayed led users to believe they could access the applications shown.
Recommendations:
25. Change the title to "Screenshots of Software Applications" to tell users that the images are only screenshots from the applications.
26. Add a brief description for each image to help users understand the content.

Observation: The names of the links to the menu page are not consistent: on some pages the link is called "ABC Center Welcome Page"; on others, "ABC Center Menu."
Interpretation: Using a consistent word or phrase to link to any given page helps users learn the structure of the site and accurately select the appropriate link no matter what screen they are on.
Recommendations:
27. Make sure the name of a link is identical on every page.

Observation: Menu items and the titles of the pages they open do not match.
Interpretation: Users are unsure where they are when they click a link and get a page whose title differs from the link they clicked.
Recommendations:
28. Make sure each link reflects the title of the target page. Page titles may be longer and more descriptive than links, but key words should be the same.

Content Organization

Observation: Contact information is buried too deep in the website and is only available after following a narrow navigation route.
Interpretation: Users should not need to browse deep into the site to find basic contact information. Many users will look for this information and become frustrated if it is not easily located.
Recommendations:
29. Consider placing basic contact information (address, e-mail, and phone number) at the bottom of each page on the website, or providing a top-level link to the contact information.

Observation: A number of links open an external web page within the content frame.
Interpretation: Users can easily become confused about which site they are visiting if an external site opens within the content frame.
Recommendations:
30. Only the website's own content should display in the content frame. An external web page should open in a new window so users know it is another site.

Observation: A number of links on the site open a PDF file, a Word document, a PowerPoint file, or even a streaming video without alerting users beforehand.
Interpretation: Users like to be informed before downloading or opening a file, and become frustrated when a click produces unexpected results.
Recommendations:
31. Provide content in a format users can view online, giving them the choice of reading the document online or downloading it.
32. Clearly label any link that starts a download or attempts to open an application (such as Acrobat Reader, Word, PowerPoint, etc.) so users know what to expect before clicking.

Observation: "Customer Interface Technology Lab" opens a Real Media file, which requires a media player that must be downloaded.
Interpretation: Users want to be informed before opening a file that requires an application they may not have.
Recommendations:
33. Warn users that the link opens a Real Media file, and provide a link to a download page where the required media player is available.
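Links that open files or applications, as flagged in the recommendations above, can be audited with a quick script. A sketch (regex-based, adequate for a rough pass rather than a full HTML parser; the sample markup and extension list are illustrative):

```python
import re

# File types that, per the recommendations above, should be clearly labeled.
LABELED_EXTS = (".pdf", ".doc", ".ppt", ".rm", ".wmv", ".zip")

def links_needing_labels(html):
    """Return href targets that open a file/application rather than a page."""
    hrefs = re.findall(r'href="([^"#?]+)', html)
    return [h for h in hrefs if h.lower().endswith(LABELED_EXTS)]

sample = ('<a href="study.pdf">Benningham DNA Study</a> '
          '<a href="about.html">About Our Center</a> '
          '<a href="lab-tour.rm">Customer Interface Technology Lab</a>')
print(links_needing_labels(sample))  # ['study.pdf', 'lab-tour.rm']
```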

Navigation

Observation: Participants spent too much time scanning the homepage with the mouse to see what links were available.
Interpretation: Users expect a clear navigation scheme that helps them determine the next page to visit. Time and effort are wasted when users must explore the page to find the available navigation options.
Recommendations:
34. Remove the current homepage from the site and establish the main menu page as the homepage.
35. If the current homepage must be kept, make sure users know there is only one navigation option by moving the "welcome" link to the center of the page.

Observation: A clear association between the top-level navigation descriptions and sub-level content was not apparent to participants.
Interpretation: The menu page presents a bulleted list of the information available in each top-level section, leading users to believe these categories will be clearly marked on their respective sub-pages. The current layout, however, does not distinguish these categories.
Recommendations:
36. Carry the bulleted list content presented on the menu page through to the sub-level pages.

Observation: A quick, obvious link back to the menu page is not provided throughout the site.
Interpretation: The current navigation scheme lets users return to the homepage from most pages. However, there should be little need for this, since the homepage only provides a link to the menu page. It is more logical that users will want to return to the menu page from their current location in order to restart their interaction with the site.
Recommendations:
37. Remove the current homepage from the site and establish the main menu page as the homepage.
38. If the current homepage must be kept, replace the universal link to the homepage with a link to the menu page.

Participants do not know where they are and where they came from when navigating the site.

The current site design does not indicate where the user is within the site. Show the user where they are by using breadcrumbs at the top of screen or by highlighting the user’s location within a persistent navigation structure. User would feel more confident if they know exactly where they are and where they came from when navigating a site.

39. Provide information about current location to facilitate navigation.

Participants relied on the back button on the browser to navigate the site. A ‘go back to the previous page’ link is provided at the bottom of some pages, but not all.

Users frequently miss the link because they would have to scroll to the bottom of the page to find it. In addition, the link is not always available on every page. Users would be frustrated if they scroll down to the bottom of a page but cannot get the link they expected.

40. Provide a sound navigation mechanism.

41. Make the go back to previous page link available at the same position on every page.

Some participants could not locate information on the center’s director as they did not know that the link was imbedded in the bullet point (see Appendix I).

Users normally click on an item's name to get more details about the item, not on the bullet in front of it. The dark red border around the red rectangular bullet point hints that it is a link, but the hint is easy to miss because the two colors are quite similar.

42. Make the name/title of an item the link instead of embedding the link in a bullet point.

43. Present links in an easy-to-recognize, consistent manner.

Participants did not realize that the image above the facility lists would change when the mouse moved over an item in the list. Participants clicked on items because the cursor turned into a finger when moving over them.

Fancy techniques are not always appropriate, as users might not know how to use them and may easily miss information they do not know how to access. Users are also frustrated when the cursor turns into a finger over an item, suggesting a link, but clicking the item does nothing.

44. Remove the mouse-over effect and allow users to get more details of a facility item by clicking on it.

On the research on retailing page, the links to two study reports have the same title. The image above the sentence "Click for a PDF copy of the study" is not a link, although it appears to be one.

Users cannot tell even the title of a report without clicking on the link. It causes confusion when the site cues users for an action that does not lead to the expected result.

45. Use the title of each study as the link, and label the link clearly so that users know what to expect when clicking it.

46. Make the image act as a link.


Participants were confused when the “shopping simulations” link and the “software applications” link took them to the same page.

Users expect different links to go to different pages. Many users will not understand that the links point to different sections on the same page.

47. Either combine the links into a single "shopping simulations & software applications" link or split the content onto two separate pages.

When the user mouses over a category in the menu, a brief description of the category pops up. Participants regarded it as a pop-up submenu and wanted to click on it, but the pop-up description disappeared once the cursor moved away from the category. Participants moved their cursors back and forth between the menu and the pop-up information before realizing that it was not a submenu (see Appendix J).

The list format and position of the pop-up information create confusion because it looks like a pop-up submenu. Users become frustrated trying to click on this image that disappears every time they move their mouse.

48. Remove the mouse-over effect and use available screen space to display the brief description right below each category.

49. Change the format to make it look like a brief description of a category instead of a list.

The link at the bottom of the page always returns the user to the previous page (see Appendix K). However, participants were confused by links that contained the title because they did not always recognize the names of the previous pages they visited.

Bad navigation mechanisms force users to recall where they came from. Users have more processing capacity for the content when less cognitive load is spent locating their position.

50. Simplify the link at the bottom of the page to "back to previous page".

51. Provide a mechanism that helps users to recognize where they are in the website instead of recalling their path.

The ‘KPMG’ link on the ‘Sponsoring Partners’ page is a dead link.

Dead links damage the credibility of the site and frustrate users who wish to visit the linked page.

52. Fix the dead link to lead to the correct, working site or if the site no longer exists, remove the link and associated text.

The ‘Marshall Field’s’ and ‘Mervyn’s’ links on the ‘Sponsoring Partners’ page do not take the user to the correct sites. Rather, both of these links lead to target.com.

Link names should always correctly refer to the site they link to. Unexpected behaviors, such as the wrong site opening, lead to user confusion and frustration.

53. Fix the ‘Marshall Field’s’ and ‘Mervyn’s’ links to lead to their respective sites. Check all links (internal and external) for accuracy and periodically verify that external links still point to published sites.
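The periodic verification in recommendation 53 can be partially automated. The sketch below is illustrative only: the URLs are hypothetical placeholders, and the page-fetching function is passed in as a parameter (in practice it would wrap something like urllib), so the audit logic itself needs no network access.

```python
from urllib.parse import urlparse

def audit_links(links, fetch):
    """Check a list of (name, url) pairs. fetch(url) must return a
    (status_code, final_url) pair, or raise OSError for a dead host.
    Returns (name, url, problem) tuples for every problematic link."""
    problems = []
    for name, url in links:
        try:
            status, final_url = fetch(url)
        except OSError as exc:
            problems.append((name, url, f"dead: {exc}"))
            continue
        if status >= 400:
            problems.append((name, url, f"HTTP {status}"))
        elif urlparse(final_url).netloc != urlparse(url).netloc:
            # Catches silent redirects to another domain, like the
            # 'Marshall Field's' link that ends up at target.com.
            problems.append((name, url, f"redirects to {final_url}"))
    return problems
```

Run on the site's external links once a month, an audit like this would have flagged both the dead ‘KPMG’ link and the sponsor links that redirect to target.com.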

On the ‘Activities and Programs’ page, various links in the same bulleted list behave differently. Some open up new browser windows, others open PowerPoint-like slides, and others go to pages deeper in the hierarchy.

Users expect similar behaviors from links that are grouped together. Furthermore, users’ mental model of the site navigation scheme is weakened when unpredictable links are present.

54. Group links together that behave in the same manner. Provide a distinction or visual separation for links that behave differently.

Clicking any item on the “Timeline” page takes the user out of the site navigation frame. Clicking on the “Home” link on the “Timeline” page also takes the user out of the site navigation frame.

The users expect the side navigation bar to be present at all times.

55. Fix the links so that the pages are displayed within the content frame in the browser window, leaving the side navigation intact.

The “Home” link at the upper-left hand corner of the “Timeline” page is confusing. Participants did not know whether they would be taken to the site’s homepage or the center timeline’s homepage.

Different links with the same name cause confusion.

56. Rename the link to avoid confusion with the link to the center’s homepage.

Operating System and Browser Issues

The red rectangle with the embedded link on the “About Our Center” page has a 1-pixel border to indicate that it is a link. The border does not display properly on a Macintosh (see Appendix I).

Some operating systems and browsers are more flexible than others in rendering code.

57. Test the most popular browsers and operating systems to make sure the site works properly.

The checkbox image in front of the “Return to Research Center Menu” link displays properly on a PC, but not on a Macintosh (see Appendix L).

Some operating systems and browsers are more flexible than others in rendering code.

58. Test the most popular browsers and operating systems to make sure the site works properly.

The mouse-over link color does not change for Netscape 7.2 on PC, Netscape 7.01 on Macintosh and Firefox 1.0 on both PC and Macintosh computers.

Some operating systems and browsers are more flexible than others in rendering code.

59. To increase the likelihood that pages will display and function properly, make sure that the markup is well formed, with proper tags (always include closing tags).

60. Test the most popular browsers and operating systems to make sure the site works properly. If all browsers and operating systems will not be supported, provide text stating which browsers the site supports.

Post-test Satisfaction Ratings

Site Satisfaction Ratings

Satisfaction User Survey (raw scores by SUS question; scale: 1 = strongly disagree, 5 = strongly agree; 15 participants)

1. I think that I would like to use this system frequently.
   Raw scores: 5 5 2 4 4 3 3 4 3 5 4 2 3 2 3 (median 3, mean 3.47)
2. I found the system more complex than necessary.
   Raw scores: 1 2 5 3 3 5 1 2 3 1 3 5 3 5 1 (median 3, mean 2.87)
3. I thought the system was easy to use.
   Raw scores: 4 5 2 2 3 3 4 5 3 5 3 2 2 2 4 (median 3, mean 3.27)
4. I think that I would need the support of an experienced person to be able to use this system.
   Raw scores: 1 1 1 3 1 1 1 2 2 1 2 1 1 1 1 (median 1, mean 1.33)
5. I found the system visually appealing.
   Raw scores: 5 5 3 4 2 3 4 4 5 5 3 5 3 3 4 (median 4, mean 3.87)
6. I thought there was too much inconsistency in this system.
   Raw scores: 1 1 3 1 3 2 1 1 3 1 3 3 4 3 1 (median 2, mean 2.07)
7. I would imagine that most people would learn to use this system very quickly.
   Raw scores: 5 5 1 4 2 2 5 4 3 5 3 2 3 1 5 (median 3, mean 3.33)
8. I found the system very cumbersome to use.
   Raw scores: 1 1 4 2 2 2 1 4 4 1 3 5 3 4 1 (median 2, mean 2.53)
9. I felt very confident using the system.
   Raw scores: 5 4 5 3 2 4 5 5 2 5 3 3 4 5 5 (median 4, mean 4.00)
10. I needed to learn a lot of things before I could get going with this system.
    Raw scores: 1 1 1 3 3 1 1 1 3 1 4 2 1 1 1 (median 1, mean 1.67)

Satisfaction User Survey (total adjusted SUS scores by participant)

SUS Score (0-100):
Participant 1: 97.5
Participant 2: 95
Participant 3: 47.5
Participant 4: 62.5
Participant 5: 52.5
Participant 6: 60
Participant 7: 90
Participant 8: 80
Participant 9: 52.5
Participant 10: 100
Participant 11: 52.5
Participant 12: 45
Participant 13: 57.5
Participant 14: 47.5
Participant 15: 90

Median: 60
Mean: 68.67

Raw SUS scores are adjusted as follows:
* Questions 1, 3, 5, 7, and 9: adjusted score = raw score - 1
* Questions 2, 4, 6, 8, and 10: adjusted score = 5 - raw score
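As a sketch, the adjustment and scaling rules above translate directly into code (the function below is a generic SUS calculator written for illustration; it is not part of the study materials):

```python
def sus_score(responses):
    """Compute a total SUS score (0-100) from ten raw responses,
    each on a 1-5 scale, ordered by question number.

    Odd-numbered questions (1, 3, 5, 7, 9) are positively worded:
    adjusted = raw - 1. Even-numbered questions (2, 4, 6, 8, 10)
    are negatively worded: adjusted = 5 - raw. The adjusted sum
    (0-40) is multiplied by 2.5 to reach the 0-100 range."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten responses")
    adjusted = [
        raw - 1 if i % 2 == 0 else 5 - raw  # i is 0-based: even i = odd question
        for i, raw in enumerate(responses)
    ]
    return sum(adjusted) * 2.5

# A respondent who strongly agrees with every positive statement and
# strongly disagrees with every negative one scores the maximum:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```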


Calculating an individual’s total SUS score for an application:
* The sum of the adjusted scores is multiplied by 2.5 to yield an overall score in the range of 0-100.

Higher total SUS scores indicate greater overall satisfaction with the website. The median and mean scores of 60 and 68.67, respectively, for the ABC Center website represent a somewhat lower level of user satisfaction than desired. Correcting the various usability issues identified in the preceding sections of this report is likely to improve user satisfaction with the site.

Satisfaction Questionnaire – Post-Test Questions

The table below shows participants’ responses to the three SUS post-test questions. Participants wrote their responses on the SUS form, and they are included here verbatim.

Question 1 – What do you consider the most valuable aspect of the system?
P1: Easy to use, straight forward.
P2: Access to rich, original research.
P3: Easy to find research based on topic of interest.
P4: Getting information.
P5: There is a lot of information.
P6: Fairly easy to find what I needed.
P7: Wealth of information.
P8: Links to cross references.
P9: Rich information.
P10: Providing easy links for additional research material.
P11: (no response)
P12: This site has many research materials that are not available anywhere else.
P13: I liked the links to other research resource sites.
P14: Contact information for the research authors is a good resource.
P15: Easy to find the research I am looking for.

Question 2 – What is the biggest problem with the system?
P1: Some of the links took me out of the site completely without warning.
P2: The images that are used are not that good in quality. It makes the site look unprofessional.
P3: Organization is not clear.
P4: Make it user friendly.
P5: There seem to be a lot of space wasted – I would rather have more content than fancy titles and graphics.
P6: Did not like the images used.
P7: It was difficult to navigate the site.
P8: The pages in the site looked too different from each other. They don’t look like they belong to the same site.
P9: Difficult to navigate.
P10: Lack of professionalism in site presentation.
P11: N/A
P12: Full screen is not utilized.
P13: Site looked unprofessional.
P14: Navigation options are sometimes unavailable.
P15: Many linked files required programs that I did not have on the computer.

Question 3 – Additional comments
P1: Nice site, lots of information.
P2: Change the images to better ones, and the site will look much nicer.
P3: (no response)
P4: About time! [do not know what the participant was referring to]

P5: (no response)
P6: It was frustrating because navigation options would disappear on me at times. Those links should be there at all times. Also, there were instances where the links that I clicked took me to an external link. I think those should open up new windows.
P7: (no response)
P8: (no response)
P9: Needs more organization.
P10: (no response)
P11: I look forward to using it in the future, nice content.
P12: Overall it was fine, but I think the images should be changed to better ones.
P13: (no response)
P14: Make the navigation bars available at all times. This will make the site much easier to use. I liked the content available on the site, lots of original research.
P15: (no response)


Appendices

Appendix A – Testing Protocol Script

Participating in Usability Sessions

Thank you for agreeing to participate in the usability study of the ABC Center website. Before we begin, I will briefly go over what a usability session consists of, what will be expected of you, and what our goals are in conducting this study.

The first thing that I want to make clear is that we are testing the application, NOT you. If you find errors or if you have any difficulties with the application, it is very likely that other people visiting the site will also experience those same difficulties. We will use this information to better understand how we might improve the site.

During the Session:

1. Tasks: You will be asked to perform a series of tasks using the ABC Center website. The tasks are structured to determine if the most important information and features of the site are easy to locate and use. We ask that you try to accomplish the tasks without assistance, as if you were at home trying to find the information on your own. This gives us a better idea of the things that work well or the difficulties people experience. Anytime during the session, if you find yourself thinking that you would quit the task, use help, contact someone for assistance, or visit another site to accomplish your goal, please let us know.

2. Test Facilitator and Observers: One member of the User Experience Group will act as the test facilitator assisting you in getting started and answering any questions you may have. Another member of the User Experience Group will observe and take notes. Representatives from the ABC Center team may also be present to observe the session and take notes.

3. Think Aloud Protocol: To help the observers understand the way you use the application, you will be asked to “think aloud” as you complete the tasks. This simply means that we ask you to talk about what you are doing, what you are looking for, clicking on, wishing you could find, etc. In other words, any task-related thought that comes into your mind we would like for you to share aloud. Some people find this easy right away, while others need to be reminded a little. It may seem a bit odd to talk about every step that you are doing, but it really helps the observers understand how you are interacting with the application and how the application is working.

After the Session:

1. User Satisfaction Questionnaire: The user satisfaction questionnaire is a short questionnaire that asks you about a few general aspects of the system. It will only take a few minutes for you to complete.

2. Questions: If you have any further questions for the facilitator or members of the ABC Center team, you will have the opportunity to ask them at that time.


Appendix B – Participant Consent Form

Participant Consent Form

The purpose of this usability study is to evaluate the design of the ABC Center website. We are interested in determining if people can accomplish common tasks and easily find information using this website. The session will not ‘test’ you or your ability; rather, the session will test the website in order to provide information on areas that might be improved. Please be advised that there are no foreseeable risks associated with participation in this session.

During this session, you will be asked to complete some tasks using the ABC Center website and fill out a user satisfaction questionnaire. As you complete the tasks, members of the User Experience Group and ABC Center will observe and take notes. In addition, the session will be captured on video for future review. The session will last no longer than one hour and fifteen minutes.

If for any reason you are uncomfortable during the session and do not want to complete a task, you may say so and we will move on to the next task. In addition, if you do not want to continue, you may end the session and leave at any time. Approximately 16 people will participate in this study. Results from all sessions will be included in a usability report to be presented to ABC Center. Your name will not be included in the report nor will your name be associated with any session data collected.

If you wish to speak with someone about your participation in this study, or if you feel you were not treated as described above, please contact the User Experience Group manager at 812-855-4499.

I, ______________________________________________, have read and fully understand the extent of the study and any risks involved. All of my questions, if any, have been answered to my satisfaction. My signature below acknowledges my understanding of the information provided in this form and indicates my willingness to participate in this user testing session. I have been given a blank copy of this consent form for my records.

Signature:______________________________ Date:________________


Appendix C – Video Release Form

Video Release Form

The signature below indicates my permission for the University Information Technology Services User Experience Group of Indiana University to use video footage recorded during the usability session conducted for the ABC Center website on _______________, 2005, in which I served as a participant. My name will not be reported in association with session results, nor will my name be included on the video footage. This video footage may be used for the following purposes:

• Analysis of research and reporting of results
• Conference presentations
• Educational presentations
• Informational presentations

I will be consulted about the use of the video recording for any purpose other than those listed above. There is no time limit on the validity of this release, nor is there any geographic specification of where these materials may be distributed. This release applies to video footage collected as part of the usability session listed on this document only. I have been given a blank copy of this release form for my records.

Name (please print): Date: / /

Signature:

Address:

Phone:

E-mail:


Appendix D – Satisfaction Questionnaire

System Usability Scale
© Digital Equipment Corporation, 1986, with revision by Usability Consulting Services, 2002.
Each item is rated on a scale from 1 (strongly disagree) to 5 (strongly agree).

1. I think that I would like to use this system frequently
2. I found the system more complex than necessary
3. I thought the system was easy to use
4. I think that I would need the support of an experienced person to be able to use this system
5. I found the system visually appealing
6. I thought there was too much inconsistency in this system
7. I would imagine that most people would learn to use this system very quickly
8. I found the system very confusing to navigate
9. I felt very confident using the system
10. I needed to learn a lot of things before I could get going with this system


Appendix E – Space Utilization

Too much screen space (80+ pixels) is wasted on the top of the menu page.

Move the location of the page title image up to reduce the white space that occupies the top of the page.


Appendix F – Image Fonts

The participants commented that some items were blurred and difficult to read. Low quality text images and/or adjusting image size (shrinking and stretching) can cause images to appear blurry.

Replace all text images with normal text, unless specific design considerations require it.


Appendix G – Contrast Between Text and Background

Figure 1 shows what a link looks like on the red frame. The link turned purple when the user clicked on it and then used the browser’s back button to return to the page (Figure 2). Purple text on a red background does not provide sufficient contrast, and many users will have difficulty reading it. Choose a color scheme that provides good contrast between the background and the text, links, and visited links.


Appendix H – Terminology

Figure 1: Screenshot of the Virtual Showcase page

Figure 2: A screenshot of an application

When clicking on an image under the “Software Applications” section (see Figure 1 above), users expect to get access to the application. Users tried to click on the image (Figure 2) to navigate the system and then realized that it was just a screenshot of the application.


Appendix I – Link Format

The image above shows the current design scheme on the ‘About Our Center’ page. Users easily miss the links since bullet points are not a common place to locate links. The dark red border may have been meant to indicate an embedded link, but the border is difficult to see due to low contrast between the colors.

The image above shows what the rectangular bullet points look like on Macintosh computers using Internet Explorer 5.2 or Safari 1.2.4. It is difficult for users to get more details, since there is no visual cue (a border around the bullet point) to tell users that the bullet points are links, and using bullets as links is not a common practice. If users notice that links are available, it will likely be because they accidentally moused over a bullet point and noticed the cursor change.

The example above uses the name/title of an item as a link instead of the bullet point. Using the name/title to link to more details of an item is a common method that users are familiar with, and so they would be less likely to miss the link. In addition, applying a consistent format (such as underlining) to all links will help users easily distinguish links from other items on the site.


Appendix J – Mouse-Over Effect

Figure 1

Figure 2

Figure 1 above shows the appearance of the center menu page. When the user moves the mouse over a category of the menu, the category is highlighted and a short description of the category pops up (see Figure 2). The short description is displayed in list format; users regarded it as a pop-up submenu and tried to click on it. They got frustrated as the “submenu” disappeared when the mouse moved away from the category.


Appendix K – Linking to Previous Page

The link at the bottom of the page always returns to the previous page. However, participants were confused by links that contained the title of the target pages (as shown in the example above) as they did not always recognize the names of the previous pages they visited.

Simplify the link at the bottom of the page to “back to previous page”.

In addition, provide a mechanism, such as the breadcrumbs shown above, that helps users to recognize where they are in a website instead of recalling their path.


Appendix L – Operating System and Browser Issues

The checkbox image does not display correctly on Macintosh computers using Internet Explorer 5.2.

The checkbox did not show up on the Macintosh computer using Safari 1.2.4.

There is a checkbox image in front of the “Return to Research Center Menu” link on Windows using Internet Explorer 6.02, Firefox 1.0.2, and Netscape.


Appendix M – Browser Compatibility Issues

Using the most recent version of Internet Explorer on both Mac and PC, the links change color on mouse-over to indicate that they can be selected, and the cursor changes to indicate a hotspot.

In Netscape 7.2 (PC), Netscape 7.01 (Mac), and Firefox 1.0 (PC and Mac), however, the mouse-over effect does not work, and the cursor stays the same.