

Powermeeting Usability Evaluation – Group 2

Anastasia Tsoutsoumpi
[email protected]

Ijeoma Udejiofor
[email protected]

University ID:

Billy Kwasi Yeboah
[email protected]

Fatima Sabiu Baba
[email protected]

Vishal Dixit
[email protected]

ABSTRACT (IJEOMA UDEJIOFOR)

It cannot be overemphasized that the usability of any software system is a key factor in determining whether the system will be adopted by its intended users. PowerMeeting is a web-based synchronous system built on Ajax technology and the Google Web Toolkit.

The aim of this paper is to measure the usability of PowerMeeting. We have done this by evaluating the user interface and the functionality of the PowerMeeting framework. We also evaluated the functionality of the GroupCalendar, a tool in the system.

Through careful and result-oriented planning, we developed user tests that combine standard methodologies effective for identifying and measuring user errors and the main usability issues.

In this paper, we have also made recommendations for improving the interface and functionality of the system. These were deduced from a qualitative analysis of the results obtained during the tests.

INTRODUCTION (ANASTASIA TSOUTSOUMPI)

Nowadays, web applications constitute an important aspect of everyday life. The popularity of Yahoo Mail, Gmail and Facebook is clear evidence of that. However, web applications are not only related to communication or social networking. In fact, they can be classified into categories according to their functionality.

Further to this classification, applications can also be characterized by the technology upon which they are built. They can thus be divided into three categories: conventional web applications, rich internet applications (RIAs) and Ajax-based web applications. Many developers and web application experts consider the latter to be synonymous with RIAs. Others, however, argue that this is not entirely the case and believe that Ajax is just one of the technologies used for the development of rich applications.

Types of Web Applications

As far as functionality is concerned, the following categories can be noted:

Financial such as personal finance and accounting.

Travel, weather information and map searching.

Books and articles searching.

Spreadsheet and word processing.

Amusement and fun sharing.

Start page modification.

Calendars and planning.

Question answering and searching.

Information exchanging and opinions sharing.

Social networking and communication.

Employment, commerce and news.

Language learning.

Health and sports.

People searching.

Web analytics.

Three of the most famous Ajax web applications are Google Maps, Google Suggest and Gmail. Other examples of Ajax applications are Google Groups, Yahoo FrontPage, Yahoo Instant Search, Windows Live, Orkut (social networking), Zimbra (email), Writely, Zoho, gOffice and AjaxOffice (online word processors), and Kiko and Calendarhub (calendars), as well as PageFlakes, Netvibes and Photopage (free start pages), Travbuddy (for creating travel journals and sharing travel experiences), Digg (technology news), Meebo (instant messaging), Amazon's A9.com (search engine) and Yahoo's Flickr (photo sharing). Finally, some examples of Ajax-based business applications are Salesforce.com, Basecamp (project management) and Backpack (organizer).

Positive User Experience

The basic advantages of Ajax-based web applications are the interactivity and richness of the web interface. Also, in contrast to traditional web applications, they are faster. The reason is that only small amounts of data are refreshed and there is no need for the whole page to be reloaded. All these aspects have led users to consider Ajax-based applications similar to desktop applications.

However, it is not only the desktop 'look and feel' of Ajax applications that makes them so popular. There are also many examples which show that such applications vastly improve the user experience. For instance, when completing web forms with data (such as an email address or credit card number), users do not experience dead time during the validation of their data. Similarly, the user experience is not slowed down when moving from the email inbox folder to the 'compose mail' option, or when performing a large number of clicks while searching for books on the Amazon website. The reason is that the requests to the server for sending and retrieving the various components of the graphical user interface are completely transparent to the users of the application.

Furthermore, animations such as fading text informing the user that an option has just been saved add to the user-friendliness of Ajax applications. Along with friendliness, loyalty to an application is clear evidence of Ajax's success. This applies mostly to e-commerce: with the aid of Ajax technology, user interfaces are presented in a way that stirs customers' emotions and unconsciously leads them to choose online purchase instead of buying goods in a shop.

Negative User Experience

Traditional web applications support both the 'Back' button and the 'Bookmarking' functionality. On the contrary, these features have been discarded in Ajax applications, which annoys or confuses many users. An additional source of confusion is the color blinking behind changes, which is frequently mentioned by users of the 'Yahoo! Finance' application. Also, being accustomed to traditional web applications, people interacting with Ajax applications rarely realize how the Ajax-based updating and refreshing process works. Further, they claim that information about whether a box is draggable or a text field is editable could prevent them from encountering usability problems.

Additional usability issues arise when users try to send their friends links to a web page they find interesting. More specifically, the recipients who follow the link can only see the default content. Similar problems exist with search engines, as clicking a link again displays only the default content. This problem is widely known as an SEO (search engine optimization) issue, and usually stems from the absence of a proper sitemap. Another issue of major importance in the performance of Ajax applications is accessibility, which focuses on users with disabilities. For instance, Ajax applications are not friendly to visually impaired users because screen readers are incompatible with Ajax technology.

As far as the security of Ajax web applications is concerned, experienced computer users argue that the level of security is no better than that of traditional web applications. In reality, security issues such as cross-site scripting are common to all types of web applications.

Finally, despite the fact that Ajax applications do not need browser plug-ins, users must have JavaScript enabled in their browsers. This constitutes a daunting issue for less experienced computer users. This category of users also cannot cope with network delay: they consider it a severe problem, unless it is clearly indicated to them - through a screen message - that what is happening is a delay rather than an internal problem of the application.

METHODS (BILLY KWASI YEBOAH AND IJEOMA UDEJIOFOR)

In the recent past, all the content of a web page had to be reloaded for every HTTP client request, a pattern described as synchronous communication (Fig. 1). This is unnecessary, since most pages of a web application contain common content, and it led to considerable delays while pages were loading, resulting in a diminished user experience. A group of client-side technologies, Asynchronous JavaScript and XML (Ajax), has been used to eliminate this effect, resulting in more interactive and dynamic applications.

This is achieved by seemingly introducing an extra layer, the Ajax engine, to the traditional communication model for web applications as shown in the diagram below. The Ajax engine is responsible for asynchronous communication.

The components of Ajax include:

XHTML, HTML and CSS, used for creating the GUI and styling the web pages

The Document Object Model (DOM), used by JavaScript code to produce interactive applications

XMLHttpRequest (or XMLHTTP), used to retrieve data from the server

JavaScript, used to create the Ajax engine

Power Meeting

PowerMeeting is a framework that makes use of the Google web Toolkit and provides a common foundation for developing a synchronous groupware application consisting of a set of plug-ins, i.e. groupware tools for specific collaborative activities.


Wang, the author of PowerMeeting, lists its key features as:

Using standard Web browsers, including Ajax-enabled browsers running on mobile devices, as the front end, making the system widely available and eliminating the need for installation.

Supporting direct manipulation of shared artifacts and direct textual and voice communication making real-time collaboration and coordination possible.

Supporting a rich user experience through a rich set of graphical widgets, fast feedback, and natural interaction means, such as tele-pointing, drag-and-drop, and gestures.

Maintaining view-data dependency and data consistency across clients and server, and maintaining data and collaborative session persistency, which is essential for any document-based real-time collaboration and for a smooth transition between synchronous and asynchronous collaboration modes.

Making the development and integration of task-specific groupware tools into the system easier.

Its implementation technologies are:

GWT GUI library and GWT RPC for the front end components.

The RemoteServiceServlet class and the continuation technology provided by Jetty web server for the back end components.

SQL database – for making objects persistent.

The general user interface layout of PowerMeeting (Figure 1 in Appendix B) has two main parts separated by a slide bar. The right side is a large working area for plug-ins such as shared gadget objects, while the left side has a menu bar which cannot be seen in the captured image (Wang, 2008). This menu has the tabs Create, Edit, Pointer, Help and Logout. Beneath the tab "Create" is a list of Agenda (agenda items) and Current users (participants), and below this is the group chat, which can be done by text or by Skype.

The PowerMeeting groupware tools include the Pincard board, the MindMap tool, the Presentation slide, the Calendar tool, etc.

Heuristic evaluation was mainly used for the user interface because it is a quick, cheap and easy way to evaluate a user interface. It is also effective with only a small set of evaluators in finding usability problems in a user interface (Nielsen and Molich, 1990; Nielsen, 1994).

The Usability Evaluation

The usability evaluation was made up of three parts:

1. The Background Information Sheet:

We used this to collect the background information of all the participants for both the user interface and Group Calendar tests. The information collected includes the users' computer literacy levels, their computer programming experience, especially with the components of Ajax technology (Cascading Style Sheets, the Javax Swing package and JavaScript), their opinions of web applications, and possible motivations to use or not use Ajax applications.

2. The User Tests:

This was made up of quantitative and qualitative tests of the functionality of PowerMeeting and the GroupCalendar. We adapted it differently for the user interface/functionality test and for the Group Calendar test. A brief description of these tests is given below:

a. The User Interface/PowerMeeting Functionality Test

We chose six functionalities randomly in order to evaluate the user interface and functionality of PowerMeeting.

Login and Logout functions

The Documentation

The Agenda creation function

The Tele-pointer

The Group chat

Two users were evaluated concurrently. One was asked to chair the meeting and the other was a normal participant. Below are the tasks used to test the chosen functions:

Function 1: Log in to the PowerMeeting

Task: Create a new task.

Function 2: Test the documentation of PowerMeeting.

Task: Read the documentation in five minutes.

Function 3: Creating an Agenda

Task: Create a new agenda item called CONTINENT

Task: Add in “categories” EUROPE


Task: Add in “Idea”, country name France.

Task: Put Ideas in their respective categories

Task: Replace “Idea” Nigeria with Kenya

Function 4: Using the Telepointer.

Task: Show and hide telepointer

Function 5: Log out from the session chair account and login as a user

Task: Log in to an already existing session, by typing the Skype name.

Function 6: Work on the group chat and on the voice conference with Skype

Task: Exchanging written messages

Task: Have a 2-minute voice conference

After the completion of the above mentioned tasks, participants were asked to respond to the questionnaire used for our assessment.

For assessing the Agenda, Tele-pointer, the voice conference tool using Skype and the textual chat, questionnaires were designed to address the usability heuristics discussed in Molich and Nielsen (1990). Participants were asked to answer 14 questions using a 1-5 SUS-type scale.
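For reference, the standard 10-item SUS is scored by rescaling each response to 0-4 (positively worded odd items score rating minus 1, negatively worded even items score 5 minus rating) and multiplying the total by 2.5 to give a 0-100 score. The 14-question variant used here would adapt the same idea; the sketch below shows only the standard procedure, and the function name is ours:

```python
def sus_score(responses):
    """Compute the standard 10-item System Usability Scale score.

    `responses` are the 1-5 agreement ratings for the ten SUS
    statements, in order; odd-numbered items are positively worded,
    even-numbered items negatively worded.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten ratings between 1 and 5")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Positive items contribute (rating - 1), negative items (5 - rating).
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # rescale the 0-40 raw total to 0-100

# A neutral respondent (all 3s) scores 50.
print(sus_score([3] * 10))  # 50.0
```

A respondent who strongly agrees with every positive item and strongly disagrees with every negative item scores 100.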

For the Login session, we used the heuristics "Recognition rather than recall" and "Flexibility and efficiency of use", and also measured performance and satisfaction. According to Dumas and Redish (1999), two aspects of documentation can be tested: whether users go to the documentation, and how well the documentation works for them when they do. We chose the latter because the former requires a large number of participants.

For the assessment of the login, logout and documentation, another set of questions was asked to obtain the data needed for the evaluation.

b. The Group Calendar User Test

After briefly introducing the Group Calendar tool orally to the users, we logged into PowerMeeting and selected the Group Calendar tool. We presented a typewritten questionnaire to the users and then asked them to carry out the following tasks without providing any documentation, user guide or assistance.

Task 1 - Add a new event

Task 2 - View and modify the description of an event created by another user. (We created this event before the test)

Task 3 - Change the time for your event

Task 4 - View the events on the calendar through the day, week and month views

Task 5 - Delete your event

On the questionnaire, they were to respond to seven statements using an SUS-type scale and to four other questions using a binary response (Yes or No). This set of questions focused on assessing the following metrics:

Ease of use

Efficiency

Main usability issues

For the next set of questions, users were to respond on the Group Calendar as a whole. These questions focused on obtaining the users' satisfaction with the tool, using four questions from the SUS. We included an extra question to test the users' satisfaction with using browser-based groupware. The responses were also given on an SUS-type scale.

We then went on to obtain qualitative data on the users' views of the usability issues they encountered in carrying out the tasks, their expectations of the tool and their recommendations.

3. The Evaluators’ Guideline and Observation Form

We understand that users' comments while responding to a questionnaire may not always give a true or exact picture of their experience of the software. We therefore decided to also use this form to record the observations we made as the users carried out the tests. We recorded information such as the main usability issues faced by users, the errors they made, their comments, facial expressions, etc. This provided a solid basis for a correct analysis of the results and conclusions.

The Participants

We chose to test students of the University of Manchester because they are truly representative of the target audience for PowerMeeting. They have varying levels of exposure to computers and web applications, and they are assigned group tasks. They face the challenge of having to meet again outside of the normal class schedule because of the varying locations where they live or work. They would therefore have to use web tools offering synchronous collaboration to communicate effectively and achieve their goals.


A total of twelve participants performed the evaluation. We increased the number of users to improve the reliability of the data, analysis and conclusions made in the test as suggested by (Insert references for Observing the User Experience). Six users evaluated the user interface/functionality and six evaluated the Group Calendar.

Users for Evaluation of PowerMeeting Framework Functionality/User Interface

User 1

Tim is a male MSc Informatics student. He describes both his computer literacy level and computer programming experiences as proficient. He has programming experience with JavaScript technology but none with CSS (Style sheets) and Javax.Swing Package.

In general the use of web applications has improved his life because they are useful for providing information and social networking. His top five web applications include Facebook, Email, Windows Live Messenger, Twitter and Web browsers which he uses for news and information.

He characterizes web applications as being user friendly, fast, well designed and secured. Though he does work in groups, he has never used any groupware tool before. He would use Ajax web applications if they were useful, easy to use and if they did not have any faults and errors. He would be discouraged from using them if they did not improve his life.

User 2

Xiaokun is a female student studying Information systems. She has an intermediate computer literacy level and is a beginner in computer programming. She has programming experience in JavaScript.

She sees a web application like Skype as being useful because it makes it easier to contact people at a distance. Her top five web applications are MSN, Skype, Google search, Email and Blackboard. She believes web applications are user friendly, fast, well designed and secure.

She has used some groupware tools like Google Groups and Skype. She is motivated to use web applications if they are easy to use and have a good user interface. She might be discouraged from using an Ajax application if it is insecure or difficult to use.

User 3

Ji is a male Informatics student. He is proficient in computer usage and intermediate in computer programming. Like User 2, he has no experience in CSS and Javax.Swing Package but has experience in JavaScript programming.

He sees some web-application-based games as tiny and interactive. His top web applications include MSN, Tencent, Blackboard and Moodle. He thinks web applications are user friendly, fast, well designed and secure.

He has used groupware tools such as Skype and Tencent before. Factors that might motivate him to use an Ajax-based web application are ease of use, speed and a good user interface. He might be discouraged from using such an application if it allows pop-ups or is insecure.

User 4

Abdullah is a male student in the HCI and User Interface class. His levels of computer usage and programming experience are intermediate. He has some computer programming experience, including some experience with the Javax.Swing package.

Web applications have affected his lifestyle by helping him communicate with the rest of the world. His top web applications are Yahoo, Google Docs, Facebook and Adobe Share. He characterizes web applications as generally being user friendly, fast, well designed and secure.

He sometimes works in groups and has used Skype and group documentation tools before. He might be motivated to use an Ajax-based web application if he finds it easy to use, with a good user interface, similar to a desktop application, and with quick and easy refreshing. He would not use an Ajax-based web application if it had security and privacy issues.

User 5

Sayeeram is a male MSc Electrical Engineering student. He describes his computer usage experience as intermediate and his computer programming level as beginner. He has no programming experience with Ajax technology.

Web applications have positively affected his lifestyle because they allow him to keep in touch with the rest of the world. His top five web applications are Facebook, MSN, Email, Twitter and Google. He characterizes web applications as being user friendly, fast, well designed and secure.

He works in groups but is yet to use an Ajax based web application. He would be happy to use one if useful and easy to use. He would be discouraged from using them if they did not improve his life.

User 6

Diwakar is a male student in MSc Healthcare Management. He is an intermediate computer user, a beginner in computer programming, and has no experience with Ajax technology.

He sees web applications as having a positive effect because they allow him to communicate with friends and family around the world. His favorite web applications are Google, Email, MSN Messenger, Skype and Facebook. He believes web applications are user friendly, fast, well designed and secure.

He has never used any groupware tool before. He would use Ajax web applications if they had a positive impact on his life and if they were not complicated to use. He would be discouraged from using them if they had limited impact on his life.

Users for Evaluation of Group Calendar

User 1

Sana is a 22 year-old female studying Informatics. She is computer literate, is a beginner programmer and has no programming experience with Ajax technology.

She uses Facebook, MSN, Skype, Blackboard and LinkedIn frequently. Groupware tools have improved her studying and her ability to carry out group work. She would be motivated to use an Ajax-based application if it were easy to use with minimal features displayed on the user interface, and discouraged from using one with a busy user interface.

User 2

Deji is 25 and is pursuing a Masters degree in Operations, Project and Supply Chain Management. He has intermediate computer literacy and is a beginner programmer with no programming experience with Ajax technology.

He mostly uses web applications for sharing information, social networking, and making job applications. He uses Facebook and recruitment web sites frequently and mostly uses Google Groups for his group work.

He thinks most web applications are user-friendly, well designed and fast, but he is uncertain of their security. He would be motivated to use Ajax applications if they were easy to use, secure and offered real-time communication, and would not be motivated to use them if they were insecure.

User 3

Kerry is a 21-year-old female student of the Human Resource Management and Industrial Relations department. She is a proficient computer user, has intermediate programming experience but none with Ajax technology.

Web applications have made life easier for her through improved means of social networking, task management and access to information. She uses Facebook, Google,

Skype, Twitter and e-mail applications such as Yahoo Mail frequently, and Skype for her group work.

For her, web applications are generally user friendly, fast, well designed and secure. Ease of use, simplicity and speed are her possible motivations to use Ajax applications, and a lack of these would discourage her from using them.

User 4

Bima is a 26-year-old male Masters student in the Construction Management Department. He is a proficient computer user and has intermediate computer programming experience, but none with Ajax technology.

Web applications have helped him in interacting with common interest group members and in resolving and working on programs or projects. He mainly interacts with group members through the UoM Blackboard system and Facebook Groups. He also uses YouTube, Facebook, Skype, Twitter, and MySpace.

He thinks most web applications are user friendly and fast, but not secure, and that their designs could be improved. His possible motivations to use Ajax applications would be user-friendliness and security, while poor design and insecurity would discourage him from using them.

User 5

Ibrahim is a 23 year-old male, postgraduate student of the Public Health department. He is a proficient computer user with intermediate programming experience but none with Ajax technology.

He uses Skype, Facebook, Gmail, Twitter, Google Maps and Google Groups daily because they help him organize his social networks and make his activities faster and easier. He mainly uses Google Groups and Google Calendar for his group work.

He thinks web applications are generally user friendly, fast, well designed and secure. These features would also motivate him to use Ajax technology. Nothing in particular would discourage him from using it.

User 6

Waheeda is a 22-year-old female studying Information Security. She is computer literate, has intermediate programming experience and has programmed with Cascading Style Sheets (CSS), one of the components of Ajax technology.

She says web applications have made her life easier and more fun but have also increased the pace of living. She uses Facebook, Google Groups, Windows Live Messenger, Gmail and Skype regularly for social networking, chatting and e-mail. For her group work, she uses Google Wave, Skype, Google Groups and Google Calendar.

She thinks that web applications are generally user friendly, fast, well designed and secure. These are her possible motivations to use Ajax applications, while an excessive display of information would discourage her from using them.

RESULTS

This section consists of the results obtained during the PowerMeeting evaluation for the user interface and the Group Calendar tool.

Results for User Interface (Vishal Dixit)

The results for the User Interface evaluation are based on three metrics: task success, errors and issues.

Task Success: Table 1 in Appendix B shows the percentage of participants who were able to complete the tasks given for the User Interface tools (Login, Documentation, Agenda, Tele-pointer, Voice Conference). To calculate the percentage of success, a binomial distribution method was used, where 1 indicates the task was a success and 0 indicates task failure. The table also shows the 95% confidence interval.

Graph 1 in the appendix shows the Tasks Success rates for all the tools in the User Interface:

All the participants were able to complete the tasks for login, documentation, agenda and tele-pointer.

The task success rate for Voice Conference is 66.66%.
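These success rates and their 95% confidence intervals can be computed along the following lines. This is a sketch: the exact interval method used for Table 1 is not stated, so the adjusted Wald interval is shown here, since it behaves better than the plain Wald interval for samples as small as six participants (with the plain formula, a 100% success rate would yield a zero-width interval):

```python
import math

def success_rate_ci(outcomes, z=1.96):
    """Success proportion and adjusted-Wald 95% CI for binary task outcomes.

    `outcomes` is a list of 1 (success) / 0 (failure) values, one per
    participant.  The adjusted Wald interval adds z^2/2 pseudo-successes
    and z^2/2 pseudo-failures before applying the usual formula, which
    keeps the interval sensible even when every participant succeeds.
    """
    n = len(outcomes)
    successes = sum(outcomes)
    p = successes / n
    # Adjusted counts for the interval.
    n_adj = n + z * z
    p_adj = (successes + z * z / 2) / n_adj
    margin = z * math.sqrt(p_adj * (1 - p_adj) / n_adj)
    low = max(0.0, p_adj - margin)
    high = min(1.0, p_adj + margin)
    return p, (low, high)

# Voice Conference: 4 of 6 participants succeeded (66.66%).
rate, (low, high) = success_rate_ci([1, 1, 1, 1, 0, 0])
```

With six participants the interval is necessarily wide, which is why the per-group comparisons later in this section rely on a significance test rather than the raw rates.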

As part of the analysis of the PowerMeeting features, the total completion time of all the tasks was measured for each user. Tables 2, 3, 4, 5 and 6 in Appendix B show a summary of the task completion times for each user.

The following were observed for the Login feature:

The average time taken for Login is 75 seconds, with 66.66% of the participants completing the tasks before the average completion time.

The standard deviation is marginally over 35 seconds.

At the 95% confidence level, the average population completion time lies within roughly 37 seconds of this average.

The following were observed for the Documentation:

The average time taken for Documentation is 180 seconds with 50% of the participants completing the tasks before the average completion time.

The standard deviation is marginally over 73 seconds.

At the 95% confidence level, the average population completion time lies within roughly 91 seconds of this average.

The following were observed for the Agenda:

The average time taken for Agenda is 292 seconds with 33.33% of the participants completing the tasks before the average completion time.

The standard deviation is approximately 149 seconds.

The confidence level at 95% shows that the average population completion time will be 292 ± 156 seconds.

The following were observed for the Tele-pointer:

The average time taken for Tele-pointer is 51 seconds with 50% of the participants completing the tasks before the average completion time.

The standard deviation is over 27 seconds.

The confidence level at 95% shows that the average population completion time will be 51 ± 28 seconds.

The following were observed for the Voice Conferencing:

The average time taken for Voice Conference is 200 seconds with 83.33% of the participants completing the tasks before the average completion time.

The standard deviation is approximately 82 seconds.

The confidence level at 95% shows that the average population completion time will be 200 ± 86 seconds.

Further, a chi-square test was conducted to see if there is any significant difference in task success between three different groups (novices, intermediates and experts). The participants were categorized into the three groups by asking them about their computer usage experience. Tables of the test results are displayed in appendix B as Tables 7, 8, 9, 10 and 11.

The results show that there is no difference in task success between the three groups for Login, Documentation, Agenda and Tele-pointer, as the task success rate is 100% for all four tools (task success rates are shown in the bar graphs and in Graph 1 above).

Page 8: Evaluation Test On Power Meeting

The result for Voice Conference, however, is different: the chi-square test across the three groups returned a value of approximately 0.61, indicating no statistically significant difference in task success.
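As a sketch of the underlying computation (assuming, as Table 11 suggests, a goodness-of-fit test over observed versus expected successes per group), the statistic and p-value can be reproduced with the closed-form chi-square survival function for 2 degrees of freedom:

```python
import math

def chi_square_p(observed, expected):
    """Chi-square goodness-of-fit p-value for three groups (df = 2).

    With 2 degrees of freedom the chi-square survival function has the
    closed form exp(-x / 2), so no external library is needed.
    """
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    return math.exp(-stat / 2)

# Voice-conference successes per group from Table 11:
# novice, intermediate, expert (expected: 2 each).
p = chi_square_p(observed=[1, 2, 1], expected=[2, 2, 2])
print(round(p, 6))  # 0.606531
```

The result matches the 0.606531 figure in Table 11; since p is far above 0.05, the difference between the groups is not statistically significant.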

Errors: This section concentrates on the errors encountered in the tasks performed by the participants. The aim here is to understand how the level of experience can affect the errors made by the participants.

Table 12 and graph 2 in appendix B shows the number of errors encountered in the tasks performed for the user interface.

Graphs 3, 4, 5 and 6 in appendix B show the relationship between the level of experience and the errors made by the participants. The scatter plots indicate a negative slope between the two variables (months of experience and average errors per day). To understand it better the correlation coefficient was calculated.

The correlation coefficient for login is -0.70, which shows that there is a strong negative relationship between the two variables: as the level of experience increases, the errors decrease.

The correlation coefficient for documentation is -0.51.

The correlation coefficient for agenda is -0.06.

The correlation coefficient is not applicable for Tele-pointer as there were no errors encountered.

The correlation coefficient for voice conference is -0.06.
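For reference, a Pearson correlation of this kind can be computed as below. The experience/error pairs are hypothetical, chosen only to illustrate a strong negative relationship; the actual study data are plotted in graphs 3 to 6 of appendix B:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical (months of experience, average errors per day) pairs.
experience = [1, 3, 6, 9, 12, 15]
errors = [1.0, 0.8, 0.9, 0.4, 0.2, 0.3]

r = pearson(experience, errors)
print(f"r = {r:.2f}")  # a value near -1 indicates a strong negative relationship
```

A coefficient near -1 (as for login, -0.70) signals a strong negative relationship; values near 0 (as for agenda and voice conference, -0.06) signal essentially none.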

Issues: This section of the results concentrates on the issues that the participants encountered while carrying out the tasks. The issues are categorized by the individual tasks done in the tools (Login, Documentation, Agenda, Tele-pointer, Voice Conference) and are further classified as having high, medium or low severity.

High severity: issues are classified as high severity if they have been faced by many participants and are likely to make a major impact on usability.

Medium severity: issues are classified as medium severity if they have been faced by many participants with a minor impact on usability, or by few participants with a large impact on usability.

Low severity: issues are classified as low severity if they have been faced by few participants and have a minor impact on usability (Tullis & Albert, 2008).
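These rules can be sketched as a small decision function; the 50% cut-off used here for "many participants" is our assumption, since the report does not state a numeric threshold:

```python
def classify_severity(fraction_affected, major_impact):
    """Severity classification following Tullis & Albert (2008).

    fraction_affected: share of participants who hit the issue (0..1).
    major_impact: whether the issue has a major impact on usability.
    The 0.5 cut-off for "many participants" is an assumed threshold.
    """
    many = fraction_affected >= 0.5
    if many and major_impact:
        return "high"
    if many or major_impact:
        return "medium"
    return "low"

print(classify_severity(0.83, True))   # high
print(classify_severity(0.17, True))   # medium
print(classify_severity(0.17, False))  # low
```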

The issues encountered are summarized in table 13 in appendix B.

Graph 7 in appendix B shows the percentage of participants who encountered issues in the user interface tools.

For login, 66.66% of the participants encountered issues classified as high severity and 33.33% medium severity.

For documentation, Tele-pointer and voice conference the percentage of high severity was 100%, 83.33% and 50% respectively.

This indicates that the issues encountered for documentation and Tele-pointer need to be addressed in the next version of PowerMeeting.

Apart from the evaluation test, the participants were asked to make some comments about the user interface tools, which are summarized in table 14 in appendix B.

Results for Group Calendar Tool (Fatima Baba)
The results from the evaluation of the tool focused mainly on the efficiency of the system and the ease of completing the tasks using the tool. The findings also identified some issues the users faced and the severity of those issues. A summary of the data gathered from the evaluation is given below.

Efficiency: this was measured in terms of the users' success in completing the evaluation tasks and how long it took them to complete the tasks.

In measuring the success and failure of each task, the binary success rate was used where successful tasks were assigned 1’s and failed tasks were assigned 0’s. Table 15 in appendix B shows the averages and 95 percent confidence intervals of the binary success data.

Graph 8 in appendix B shows that:

Tasks 1 and 2 were successfully completed by all the users.

Task 3 has a sample mean completion of 50%. Hence there is a 95 percent chance that the true mean will fall between 6 and 94 percent.

The confidence interval is rather large as a result of the small sample size used for the evaluation.

For Tasks 4 and 5 there is a 95 percent chance that the true mean will fall between 50 and 116 percent (effectively capped at 100 percent, since a success rate cannot exceed it).

As part of the analysis of the efficiency of the calendar system, the total completion time of all the tasks was measured for each of the users. Table 16 in appendix B shows a summary of the completion time for each user.

The average evaluation time was 144.5 seconds with 50 percent of the users completing the tasks before the average completion time.

Page 9: Evaluation Test On Power Meeting

The standard deviation is about 33 seconds. This can be attributed to the small sample size. Therefore estimating the true mean of the population will be less accurate if based on this sample size.

The analysis also shows that, with 95 percent confidence, it can be inferred that the average population completion time will be 144.5 ± 35 seconds.

The experience level of users was obtained during the test and was measured by assigning numerical values to different levels of experience, as shown in table 17 in appendix B. This was to enable us to test for the relationship between the users' previous experience with calendar tools and the time it took them to complete the tasks in the PowerMeeting calendar tool.

The scatter plot shown in graph 9 of appendix B indicates a negative slope between the two variables: completion time tends to decrease as the level of experience increases.

The correlation coefficient of -0.65 shows that there is a strong negative relationship between the level of experience with other calendar systems and the time taken to complete the tasks in the usability test.

Ease of Use: This was measured in terms of how easy it was for the users to complete the tasks. The users were asked to rate the ease of use of the Calendar tool by completing a questionnaire which was based on the System Usability Scale (SUS) and has been presented in the Methods section. The users' responses, which are the SUS scores (calculated by adding the rating figures and multiplying by 2.5), are presented as percentages in table 18 and graph 10 in appendix B:

Easiest task: Task 4 (an average ease-of-use rating of 66 percent, with only 33 percent of the users rating below the average).

Most difficult task: Task 3 (an average ease-of-use rating of 55 percent, with about 67 percent of the users rating below the average).

Overall, the users did not find the tasks too easy to complete: the average ease-of-use rating across all the tasks is 62 percent, with a maximum of 75 percent.
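The SUS-style scoring described above ("adding the rating figures and multiplying by 2.5") can be sketched as follows. The ratings below are hypothetical; note that the standard 10-item SUS first adjusts each rating (odd items: rating minus 1; even items: 5 minus rating) before the multiply-by-2.5 step, whereas the report describes the simpler variant:

```python
def sus_score(ratings):
    """Simplified SUS-style score as described in the report:
    sum the rating figures and multiply by 2.5.

    The standard 10-item SUS adjusts each 1-5 rating before this step
    (odd items: rating - 1; even items: 5 - rating); the report uses
    the unadjusted sum.
    """
    return sum(ratings) * 2.5

# Hypothetical 1-5 ratings for one task's ten questionnaire items.
print(sus_score([4, 3, 2, 4, 3, 3, 4, 3, 2, 3]))  # 77.5
```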

Issues: issues with the tool were identified as the evaluation was being carried out, and documentation of these issues is given in this section. The issues are categorized by individual task in addition to being classified as having high, medium or low severity. The severity classifications are the same as those given in the user interface issues.

A summary of the issues identified are given in table 19 in appendix B.

Graph 11 in appendix B shows that three issues were identified in Task 1 and that 60 percent of the users encountered the issue with the highest magnitude. It also shows that Task 3 had only one identified issue, which all the users encountered. This indicates that the issue in Task 3 is the most severe and should be given priority when the issues are addressed by the design team of the Calendar tool.

After the usability evaluation of the calendar tool, the users were asked some questions on the tool in general and were also asked to give general comments. Table 20 in the appendix summarizes the responses of the users.

DISCUSSION
This section discusses the findings of the usability test for the user interface and calendar tool.

User Interface Discussion (Anastasia Tsoutsoumpi, Vishal Dixit and Billy Kwasi Yeboah)

The user interface was evaluated on three metrics: task success, errors and issues.

The findings of the evaluation suggest that there is no significant difference in errors among users with different levels of computer experience. This could be a result of the time pressure the users were put under during the test and the fact that they were not familiar with PowerMeeting. For example, they had no reason to assume they needed to log in as administrators, as they had no prior knowledge of the fact that only administrators could create agenda items.

Calendar Tool Discussion (Fatima Baba)
Three main aspects of the calendar tool were measured during the usability evaluation: efficiency, ease of use and issues (problems).

Page 10: Evaluation Test On Power Meeting

Efficiency: The five tasks were measured for success, and the means and confidence intervals were calculated in the previous section. The confidence interval allowed us to estimate with what degree of accuracy the results from the sample can be used to generalize how successful task completion will be in the real world. The first two tasks were successfully completed by all the users, but that does not mean everyone who uses the system will be able to complete them; the small sample size has to be considered before any generalizations can be made. On the other hand, task 3 had a really low average completion success, and this points to a problem with the tool: if it is used in the real world, many people may not be able to complete such a task. Tasks 4 and 5 showed only moderate levels of difficulty in successful completion, and this should also be addressed when improving the calendar tool. The negative slope of graph 9 shows that as the computer literacy level of the user increases, the time spent completing the tasks decreases. To better understand the strength of this relationship, we calculated the correlation coefficient, which gave a value of -0.65, indicating a strong negative relationship between experience and completion time. Individuals, irrespective of their level of computer literacy, need to be able to manage group events using a calendar tool. Hence, the general features of this tool should be enhanced to encourage even beginners to adopt it as their group calendar tool.

A standard deviation of 33 was obtained and this indicates a large variance between the completion times of the different users. Although we can attribute this to the small sample size used for the test, it could also be indicative that it will take some classes of users far more time to complete tasks using the tool when compared to other classes of users. This could be as a result of several other factors such as level of computer literacy as discussed above. Again, this indicates that the design should be improved on to reduce the use time for all classes of users.

Ease of use: from the results, it is clear again that there is a problem with completing task 3. This confirms the efficiency findings, which showed task 3 as the least efficient. In general, the users did not rate the ease of use of the calendar tool highly, which suggests that the tool is not easy to use.

Issues: issues with the calendar tool were identified during the usability test. These issues are valuable insights for the developers, and in order to improve the tool they need to be resolved. Some recommendations for the improvement of the tool, gathered from the issues recognized during the test as well as from the statistical results, are outlined below.

Recommendations (Anastasia Tsoutsoumpi and Ijeoma Udejiofor)

After analyzing the results of the usability test and taking into consideration improvement ideas from the users, the following recommendations are given in order to improve the PowerMeeting system:

Connection to the voice conference should be made quicker.

The login should also be made quicker and should have a mechanism to remember passwords.

The logout button should be clearly indicated on the screen.

A search mechanism should be built into the documentation.

The recycle bin used to delete agenda items should allow users to retrieve deleted items.

The design of the group calendar tool should be enhanced so that it is possible to easily edit the time for an event by entering a new time using the blinking cursor in the pop-up dialogue box. This was a high severity issue and the general low usability results for task 3 echo this problem.

Standard alternatives should be provided for core or commonly used functions. For example, users should be able to select the "add", "edit" or "delete" event button by right-clicking the mouse. Users should also be able to use simple key combinations on the keyboard as an alternative.

Users should get a notification when an event is added or edited by another group member, along with the identity of the person who added or edited the event.

Currently in the group calendar, a deleted event cannot be restored. A means for restoring deleted items should be added as a feature of the tool as this will greatly enhance its usability.


When the mouse is placed on a clickable feature, a brief textual description of its function should be displayed to increase learnability and usability of the tool.

CONCLUSION (ANASTASIA TSOUTSOUMPI)
Firstly, the evaluation of the PowerMeeting web application gave us the opportunity to gain a good understanding of Ajax technology. Further to that, we had the opportunity to familiarize ourselves with very important human-computer interaction concepts. The experience gained through the preparation of the usability test and through the analysis of users' feedback will improve our skills in the domain of software development, mainly because we will be able to approach software design from the user's point of view.

As far as users' attitudes towards web applications are concerned, two basic points can be noted. Firstly, people are always attracted by eye-catching user interfaces; however, what motivates them to remain 'loyal' to a web application is its functionality. An additional point is that many users express concerns about the security and the speed of web applications. That is a clear indication of their demand for high quality standards in contemporary software products.

Finally, the success of an application in the 'web market' is based exclusively on users' acceptance of it. But it is the researchers and professional software developers who lead the technology one step further into the future.

REFERENCES
1. Ajax Patterns. Whats Ajax. 25 March 2010. 1 April 2010 <http://ajaxpatterns.org/Whats_Ajax>.

2. Arlekar, Sagar G. The Role of AJAX in enhancing the user experience on the Web. 1 June 2006. 6 March 2010 <http://www.roseindia.net/ajax/ajax-user-interface.shtml>.

3. Avangate. Usability Friends: Ajax. 29 October 2007. 1 March 2010 <http://www.avangate.com/articles/ajax-usability-110.htm>.

4. Brooke, J. SUS - A Quick and Dirty Usability Scale. 2009. 4 March 2010 <http://www.usabilitynet.org/trump/documents/Suschapt.doc>.

5. Bruno, Vince, Audrey Tam and James Thom. "CHARACTERISTICS OF WEB APPLICATIONS THAT AFFECT USABILITY: A REVIEW." Proceedings of OZCHI 2005. Canberra: CHISIG, 2005. 2-4.

6. Dumas, J.S. and J.C. Redish. A practical guide to Usability Testing. Exeter : Intellect Books, 1999.

7. Eernisse, Matthew. Build Your Own Ajax Web Applications. 28 June 2006. 5 March 2010 <http://articles.sitepoint.com/article/build-your-own-ajax-web-apps>.

8. Garrett, Jesse James. Ajax: A New Approach to Web Applications. 18 February 2005. 7 March 2010 <http://experiencezen.com/wp-content/uploads/2007/04/adaptive-path-ajax-a-new-approach-to-web-applications1.pdf>.

9. Gibson, Becky. Ajax Accessibility Overview. 1 April 2006. 1 April 2010 <http://www-03.ibm.com/able/resources/ajaxaccessibility.html>.

10. Giglio, Jason. "AJAX: Highly Interactive Web Applications." 2009.

11. INTERNET METHODOLOGIES JOURNAL AND NEWS. Are there Usability Issues with AJAX? 1 April 2010. 3 April 2010 <http://www.imjan.com/internet-www/are-there-usability-issues-with-ajax/>.

12. Itura. AJAX SECURITY: ARE AJAX APPLICATIONS VULNERABLE TO HACK ATTACKS? 2009. 05 March 2010 <http://www.itura.net/training/16-ajax-security-are-ajax-applications-vulnerable-to-hack-attacks.html>.

13. Keely, Pavan. Using Ajax. 18 January 2006. 2 March 2010 <http://keelypavan.blogspot.com/2006/01/using-ajax.html>.

14. Kluge, Jonas, Frank Kargl and Michael Weber. "THE EFFECTS OF THE AJAX TECHNOLOGY ON WEB APPLICATION USABILITY." WEBIST 2007 International Conference on Web Information Systems and Technologies. 2007. 289-294.

15. "Ajax." Java Jazz Up 8 April 2008: 1-79.

16. Lerner, Reuven M. Ajax Application Design. 1 December 2006. 1 April 2010 <http://www.linuxjournal.com/article/9295>.

17. MacKay, Tara. Ajax Usability Concerns. 25 December 2007. 2 April 2010 <http://www.notesondesign.net/resources/web-design/ajax-usability-concerns/>.

18. Molich, R. and J. Nielsen. "Improving a human-computer dialogue." Communications of the ACM 33. 1990. 338-348.



20. Nielsen, J. and R Molich. "Heuristic evaluation of user interfaces." Proc. ACM CHI'90 Conf. Seattle, 1990. 249-256.

21. Nielsen, J. "Finding usability problems through heuristic evaluation." Proceedings ACM CHI'92 Conference. CA: Monterey, 1992. 378-380.

22. —. Usability Inspection Methods. New York: John Wiley & Sons, 1994.

23. Osborn, A. F. Applied Imagination. New York: Scribner, 1957.


25. Sarwate, Amol. Hot or Not: Ajax Vulnerabilities. 19 September 2007. 28 March 2010 <http://www.scmagazineus.com/hot-or-not-ajax-vulnerabilities/article/35698/>.

26. Site Security Monitor. Ajax Application Attacks. 2010. 1 March 2010 <http://www.sitesecuritymonitor.com/ajax-application-attacks/>.

27. SPOOL, JARED M. Five Usability Challenges of Web-Based Applications. 4 December 2007. 8 March 2010 <http://www.uie.com/articles/usability_challenges_of_web_apps/>.

28. Tullis, Tom and Bill Albert. Measuring the User Experience. Burlington: Morgan Kaufmann, 2008.

29. Wang, W. PowerMeeting on CommonGrounds: Web based synchronous groupware with rich user experience. 2008. 20 March 2010 <http://sites.google.com/site/meetinginbrowsers/weigang-wang-s-work>.

30. Web Aim. What is AJAX? 1 March 2010. 6 March 2010 <http://www.webaim.org/techniques/ajax/>.

31. Wood, John. Usability Heuristics Explained. 18 January 2004. 28 March 2010 <http://iqcontent.com/publications/features/article_32/>.


APPENDIX A

Acronyms and Definitions

Asynchronous communication - the user’s interaction with the application happens independently of the application’s communication with the server.

AJAX – Asynchronous JavaScript and XML

CSS – Cascading Style Sheets

GUI – Graphical User Interface

GWT – Google Web Toolkit

HTTP – HyperText Transfer Protocol

RPC – Remote Procedure Call

SQL – Structured Query Language

XHTML – Extensible HyperText Markup Language

XML – Extensible Mark-up Language

Statements and Usability Metric Tested for Agenda and Tele-pointer

Statement Usability metric used

1 I have used similar tools before. Ease of use

2 I found it easy to create an agenda item. Ease of use

3 I found it easy to create a category. Ease of use

4 It was easy to create ideas. Ease of use

5 I found it easy to delete and replace ideas. Ease of use

6 The terms used were easy to understand. Ease of use

7 The various tabs were clearly visible and easy to find. Efficiency and ease of use

8 I found it easy to move from one task to another. Ease of use

9 The tasks were clearly different from each other. Efficiency

10 I made errors while navigating through the individual tasks. Ease of use

11 It was easy to find the telepointer. Ease of use

12 Telepointer navigation from one item to another was smooth. Efficiency and satisfaction

13 I find the telepointer an important tool to use. Satisfaction

14 There is a consistent icon design scheme and stylistic treatment across the system. Satisfaction


Statements and Usability Metric Tested For Voice Conference and Chat

No. Questions Usability Metric used

1. The tool is easy to use for the tasks given. Ease of use

2. The tool is efficient for voice conference and chat. Efficiency

3. The tool needs to be used several times to get accustomed with it. Ease of use

4. In relation to other tools I have used, this tool is easy. Ease of use

5. The fields/buttons are well presented and organized. Satisfaction

6. It is easy to understand the functions of the fields/buttons. Ease of use and satisfaction

9. The tools (PowerMeeting and Skype) are well integrated. Efficiency

10. I found the navigation around the tool easy. Ease of use

The following questions were then asked with expected binary response of YES/NO to gather qualitative data.

Did you encounter any problem while connecting with Skype? If you answer YES please briefly mention some of them.

Please make any comments on Power Meeting Voice conference (Skype) tool.

Would you recommend others to use this voice conference tool? Answer by (YES/NO).

Would you recommend others to use Power Meeting for group chat? Answer by (YES/NO).

Questions for Assessment of Documentation

1. Compared to other web applications that you use, how would you describe the registration process of the PowerMeeting? Choose one of the following options and put it in a circle. You may consider selecting more than one answer.

a. It is really confusing for the average user
b. Very poorly designed mechanism
c. Rather straightforward

2. Did you encounter any difficulties to log on the system and create a new session? Answer by (YES/NO)

3. If your answer is YES, describe in a short sentence the basic difficulty you encountered.

4. Would you prefer it if Power Meeting included a mechanism to remember passwords? (YES/NO)

5. Are you convinced of the security which is provided by the Power Meeting during the log in process? Please consider mostly the case where you need to log in with your Skype id. Answer by (YES/NO).

6. On any PC, it is impossible to log in on the Power Meeting by using the same Web Browser. How would you comment on that? Answer with a short sentence.

7. How would you characterize the overall design of the user guide? Your options are the following and you should put your answer in a circle.


a. very bad
b. neither bad nor good
c. good
d. good but corrections are needed
e. fascinating

8. Do you believe that the description of the sessions in the user guide was helpful to you? (YES/NO/INDIFFERENT)

9. Are you satisfied by the organization of the user guide? (YES/NO)

10. Do you believe that the content of the user guide is accurate and to the point? (YES/NO)

11. Could you manage to communicate via Skype through the Power Meeting tool without reading the section of the user guide describing the voice conference with Skype?

12. Do you believe that you would have been able to perform better in the agenda task if the user guide had included an illustrated presentation of this function? Answer by (YES/NO).

13. Suggest any improvement in a new version of Power Meeting's documentation.

Questions For Assessing the Group Calendar and Corresponding Metrics Tested

No Questions On Individual Tasks Type Of Usability Metric

1 I found the tool easy to use for this task Ease of Use

2 I found the tool efficient for this task Efficiency

3 I would need to use the tool several times before I get accustomed to performing this task. Ease of Use

4 The experience I have of previous tools increased my ability and speed of performing this task in the Groupware calendar tool. Ease of Use

5 I understood the text descriptions of buttons on the user interface of the Group calendar. Ease of Use

6 The text descriptions on the buttons aptly describe their functionality. Ease of Use

7 The steps for each task followed a natural and logical order. Efficiency

8 I felt confident and very much in control of the tool while performing this task. Ease of Use

9 I made an error on this task Ease of Use

10 I found it easy to retrace my steps when I made an error while carrying out this task. Ease of Use

11 I felt I needed to check the online user documentation for this task Ease of Use


General Questions on the Group Calendar

No General Comments on Power Meeting Group Calendar User Metric Tested For

13 I found certain features of the tool unnecessary and distracting. Ease of Use

14 I found the various functions in the Power meeting calendar tool well integrated. Ease of Use

15 I would prefer to use a browser-based groupware calendar. Ease of Use

16 I would likely use this tool frequently as my group calendar tool. Ease of use/ Satisfaction

17 I would imagine that most people would learn to use this system very quickly Ease of Use

(Questions 1, 13, 14, 16 and 17 are adapted from the System Usability Scale listed by Brooke, J. but developed at the Digital Equipment Corporation).

18. Please briefly describe any problems you encountered in carrying out the tasks

19. List some ways you think the calendar tool would help you work better in groups

20. What other features did you expect to see in the group calendar tool?

21. Please give recommendations on the improvement of this tool.

The Group Calendar Evaluators’ Guideline and Observation Form

We used this form to get our own assessment of the main user metrics we set out to test. The data was recorded for each task and for each user. The table below lists out the guidelines we followed in order to effectively record our observations.

No Guidelines Possible Data Values User Metric Tested For

1 Completion time for task Value in seconds Efficiency

2 Task success Yes or No Efficiency

3 No of unsuccessful attempts Maximum number of attempts was 3 Efficiency

4 Does user display signs of discomfort? Descriptive text of discomfort (if any) Ease of use

5 Identify User's Errors While Carrying Out Tasks (In a measurable form) Type of error Issue-based

6 Main Usability Issues Faced By this User Measured description of error faced by user Issue-based

7 Please Record User's Comments per Task Exact comments Efficiency/Ease of use/Main Issues (They confirm the other observations made above).

8 Please record any other observations made Miscellaneous Efficiency/Ease of use/Main Issues (They confirm the other observations made above).


APPENDIX B – Graphs and Tables

Figure 1 -The general user interface layout of PowerMeeting

         Login  Documentation  Agenda  Telepointer  Voice Conference
User 1     1         1           1         1              1
User 2     1         1           1         1              1
User 3     1         1           1         1              1
User 4     1         1           1         1              0
User 5     1         1           1         1              1
User 6     1         1           1         1              0
Average   100%      100%        100%      100%           67%
Confidence Interval (95%)
           0%        0%          0%        0%            41%

Table 1 Showing percentage of users who completed tasks

Graph 1 showing percentage of users task completions


User     Evaluation Time (Secs)
User 1   120
User 2    60
User 3    60
User 4   120
User 5    40
User 6    50
Average   75

Table 2 showing user completion times for login

User     Evaluation Time (Secs)
User 1   240
User 2   180
User 3    60
User 4   240
User 5   180
User 6   240
Average  180

Table 3 showing user completion times for Documentation

User Evaluation Time (Secs)

User 1 300

User 2 336

User 3 540

User 4 300

User 5 130

User 6 147

Average 292

Table 4 showing user completion times for Agenda

User     Evaluation Time (Secs)
User 1    66
User 2    90
User 3    50
User 4    60
User 5    18
User 6    22
Average   51

Table 5 showing user completion times for tele-pointer

User     Evaluation Time (Secs)
User 1   180
User 2   180
User 3   120
User 4   360
User 5   180
User 6   180
Average  200

Table 6 showing user completion times for voice conferencing

Group Observed Expected

Novice 2 2

Intermediate 2 2

Experts 2 2

TOTAL 6 6

Chi-test: 1

Table 7 Login Chi-test

Group Observed Expected

Novice 2 2

Intermediate 2 2

Experts 2 2

TOTAL 6 6

Chi-test: 1

Table 8 Documentation Chi-test


Group Observed Expected

Novice 2 2

Intermediate 2 2

Experts 2 2

TOTAL 6 6

Chi-test: 1

Table 9 Agenda Chi-test

Group Observed Expected

Novice 2 2

Intermediate 2 2

Experts 2 2

TOTAL 6 6

Chi-test: 1

Table 10 Tele-pointer Chi-test

Group Observed Expected

Novice 1 2

Intermediate 2 2

Experts 1 2

TOTAL 4 6

Chi-test: 0.606531

Table 11 Voice conference Chi-test

Tools Errors

Login 3

Documentation 2

Agenda 1

Telepointer 0

Voice Conference 1

Table 12 Number of errors

Graph 2 Percentage of errors in user interface tools: Login 43%, Documentation 29%, Agenda 14%, Telepointer 0%, Voice Conference 14%.

Graph 3 Relationship between level of experience and errors made (Login): scatter plot of average errors per day against months of experience, R² = 0.5786.

Graph 4 Relationship between level of experience and errors made (Documentation): scatter plot of average errors per day against months of experience, R² = 0.3223.

Graph 5 Relationship between level of experience and errors made (Agenda): scatter plot of average errors per day against months of experience, R² = 0.0357.

Graph 6 Relationship between level of experience and errors made (Voice Conference): scatter plot of average errors per day against months of experience, R² = 0.0357.


Table 13 Issues encountered with the User Interface tools (each classified as high, medium or low severity):

Login: Takes a long time to log in; PowerMeeting does not remember passwords.
Documentation: User guide does not illustrate the tasks for agenda.
Agenda: Cannot retrieve deleted items from the trash.
Telepointer: Cannot understand the use of tele-pointer.
Voice Conference: Takes too long to connect to Skype.

Graph 7 Percentage of participants who encountered issues, by tool (Login, Agenda, Voice Conference) and severity (High, Medium, Low)

P1  Delays while logging in when there are too many participants logged in. Can't understand how to highlight fields. Delays while sending text messages.
P2  There should be a search function to find relevant documents. The system freezes when there are too many people logged in to PowerMeeting.
P3  Would prefer PowerMeeting to remember passwords. Takes too much time to connect.
P4  More security consideration, and privilege rights assigned to certain participants.
P5  The logout process should be made easier, perhaps by making the logout button more prominent.
P6  The administrator and client login should be explained. A bit more information on how to create an agenda. Should allow more than three participants in the conference.

Table 14 Comments on the User Interface tools


                           Task 1  Task 2  Task 3  Task 4  Task 5
User 1                     1       1       1       1       1
User 2                     1       1       0       0       0
User 3                     1       1       1       1       1
User 4                     1       1       1       1       1
User 5                     1       1       0       1       1
User 6                     1       1       0       1       1
Average                    100%    100%    50%     83%     83%
Confidence Interval (95%)  ±0%     ±0%     ±44%    ±33%    ±33%

Table 15 showing success rate for calendar tasks

Graph 8 showing task success with confidence intervals
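The 95% confidence intervals in Table 15 match a half-width of z·s/√n computed from the sample standard deviation of the per-user success scores (the form produced by, e.g., Excel's CONFIDENCE function). A sketch under that assumption, using the Task 3 outcomes from Table 15:

```python
import math

def success_ci(outcomes, z=1.96):
    # 95% CI half-width: z * sample standard deviation / sqrt(n)
    n = len(outcomes)
    mean = sum(outcomes) / n
    var = sum((x - mean) ** 2 for x in outcomes) / (n - 1)  # ddof = 1
    return z * math.sqrt(var / n)

# Task 3 (Table 15): users 1, 3 and 4 succeeded
half_width = success_ci([1, 0, 1, 1, 0, 0])
# about 0.44, matching the reported 44%
```

Tasks 1 and 2, where every user succeeded, have zero sample variance and therefore a 0% interval.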

User     Evaluation time (secs)
User 1   105
User 2   186
User 3   180
User 4   120
User 5   150
User 6   126
Average  144.5

Table 16 showing task completion times for each user

Usage of calendar tool  Experience level
Daily                   10
Twice a week            8
Weekly                  6
Once in two weeks       4
Monthly                 2
Never                   0

Table 17 showing experience levels
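Table 17's frequency-to-score mapping, which supplies the experience axis used in Graph 9, can be encoded directly as a lookup; a minimal sketch:

```python
# Experience-level score assigned to each calendar-usage frequency (Table 17)
EXPERIENCE_SCORE = {
    "Daily": 10,
    "Twice a week": 8,
    "Weekly": 6,
    "Once in two weeks": 4,
    "Monthly": 2,
    "Never": 0,
}
```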


Graph 9 showing the relationship between the completion time and general computer usage experience

         Task 1  Task 2  Task 3  Task 4  Task 5
User 1   43%     35%     43%     48%     33%
User 2   73%     73%     53%     63%     75%
User 3   70%     70%     60%     70%     70%
User 4   73%     73%     68%     75%     75%
User 5   53%     63%     53%     73%     70%
User 6   58%     68%     53%     68%     68%
Average  61%     63%     55%     66%     65%

Table 18 showing data from Calendar tool ease-of-use evaluation

Graph 10 showing percentage ease of use of calendar tool

Task 1
  High severity: Difficulty adding a new event (it is not clear that the calendar date should be clicked to add a new event, and no alternative way to carry out the task is available).
  Medium severity: Cannot modify the event end time.
  Low severity: The use of the term 'Event' to refer to an activity was not clear (an event was mistaken for an agenda item).
Task 2  The event description can only be modified by double-clicking to bring up the dialogue box. It is not explicit that the user should double-click the event to see the description.
Task 3  The event time can only be changed by dragging and dropping the event to a different time slot. The alternative way to change the time is through the dialogue box, but this cannot be done because the text box containing the time is not editable.
Task 4  Cannot change the event date from the daily and monthly views.
Task 5  Cannot delete an event without double-clicking it; no alternative way of deleting an event is provided.

Table 19 showing issues identified during usability evaluation of Calendar tool


Graph 11 showing percentage of users who encountered issues

User 1
  Help in groups: Group deadlines would be easier to clarify if each person was set individual deadlines.
  Expected features: A to-do list.
  Other comments: Show which group member entered an event.
User 2
  Help in groups: Allow members of a group to share work schedules.
  Expected features: Uploading documents.
User 3
  Help in groups: Saves cost if you are away and need to meet with your group members; saves time.
  Expected features: A priority feature would help organize events better.
User 4
  Expected features: Other users' events should be shown differently on my page; the progress of an event could be incorporated.
User 5
  Help in groups: Deadlines and meetings can be easily communicated.
  Expected features: The time in the dialogue box should be made editable like the description.
  Other comments: The ease of use of the tool exceeded my expectations.
User 6
  Help in groups: Allow access to a uniform schedule of group members.
  Expected features: The textbox displaying the time should either be made editable or changed to display the time as a label.

Table 20 summary of further details gathered during usability evaluation of Calendar tool