
Chapter 12: Introducing Evaluation

The aims

• To illustrate how observation, interviews and questionnaires that you encountered in Chapters 7 and 8 are used in evaluation.

• To explain the key concepts and terms used in evaluation.

• To introduce three main evaluation approaches and key evaluation methods within the context of real evaluation studies.

Six evaluation case studies

• Evaluating early design ideas for a mobile device for rural nurses in India.

• Evaluating cell phones for different markets.

• Evaluating affective issues: challenge and engagement in a collaborative immersive game.

• Improving a design: the HutchWorld patient support system.

• Multiple methods help ensure good usability: the Olympic Messaging System (OMS).

• Evaluating a new kind of interaction: an ambient system.

Why, what, where and when to evaluate

Iterative design & evaluation is a continuous process that examines:

• Why: to check that users can use the product and that they like it.

• What: a conceptual model, early prototypes of a new system and later, more complete prototypes.

• Where: in natural and laboratory settings.

• When: throughout design; finished products can be evaluated to collect information to inform new products.

Designers need to check that they understand users’ requirements.

Bruce Tognazzini tells you why you need to evaluate:

“Iterative design, with its repeating cycle of design and testing, is the only validated methodology in existence that will consistently produce successful results. If you don’t have user-testing as an integral part of your design process you are going to throw buckets of money down the drain.”

See AskTog.com for topical discussions about design and evaluation.

The language of evaluation

• Analytical evaluation

• Controlled experiment

• Field study

• Formative evaluation

• Heuristic evaluation

• Predictive evaluation

• Summative evaluation

• Usability laboratory

• User studies

• Usability studies

• Usability testing

• User testing

Evaluation approaches

• Usability testing

• Field studies

• Analytical evaluation

• Combining approaches

• Opportunistic evaluations

Characteristics of approaches

            Usability testing    Field studies    Analytical
Users       do task              natural          not involved
Location    controlled           natural          anywhere
When        prototype            early            prototype
Data        quantitative         qualitative      problems
Feedback    measures & errors    descriptions     problems
Type        applied              naturalistic     expert

Evaluation approaches and methods

Method           Usability testing    Field studies    Analytical
Observing        x                    x
Asking users     x                    x
Asking experts                        x                x
Testing          x
Modeling                                               x

Evaluation to design a mobile record system for Indian AMWs

• A field study using observations and interviews to refine the requirements.

• It would replace a paper system.

• It had to be easy to use in rural environments.

• Basic information would be recorded: identify each household, head of house, number of members, age and medical history of members, etc. (a data-structure sketch follows).
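The bullet above implies a simple per-household record. A minimal sketch (Python) of one plausible shape for such a record; all field and class names here are illustrative assumptions, not the actual system's schema.

    # A hypothetical household record for the mobile system's data.
    # Names and fields are assumptions for illustration only.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Member:
        name: str
        age: int
        medical_history: List[str] = field(default_factory=list)

    @dataclass
    class Household:
        household_id: str      # identifies each household
        head_of_house: str
        members: List[Member] = field(default_factory=list)

        @property
        def num_members(self) -> int:
            return len(self.members)

    # Example: one household with two members.
    h = Household("H-001", "A. Devi", members=[
        Member("A. Devi", 34, ["anemia"]),
        Member("R. Devi", 3),
    ])
    print(h.num_members)  # prints: 2

A flat structure like this mirrors the paper form it replaces, which keeps data entry simple on a small device used in rural environments.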

Could these icons be used with other cultures?

For more interesting examples of mobile designs for the developing world, see Gary Marsden’s home page: http://people.cs.uct.ac.za/~gaz/research.html

Evaluating cell phones for different world markets

• An already existing product was used as a prototype for a new market.

• Observation and interviews.

• Many practical problems needed to be overcome: can you name some?

• Go to www.nokia.com and select a phone, or imagine evaluating this one in a country that Nokia serves.


Challenge & engagement in a collaborative immersive game

• Physiological measures were used.

• Players were more engaged when playing against another person than when playing against a computer.

• What were the precautionary measures that the evaluators had to take?

What does this data tell you? (Higher standard deviations indicate more variation; a sketch of how such statistics are computed follows the table.)

              Playing against computer    Playing against friend
              Mean    St. Dev.            Mean    St. Dev.
Boring        2.3     0.949               1.7     0.949
Challenging   3.6     1.08                3.9     0.994
Easy          2.7     0.823               2.5     0.850
Engaging      3.8     0.422               4.3     0.675
Exciting      3.5     0.527               4.1     0.568
Frustrating   2.8     1.14                2.5     0.850
Fun           3.9     0.738               4.6     0.699

Source: Mandryk and Inkpen (2004).
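How are numbers like these obtained? Each cell is simply the mean and sample standard deviation of participants' Likert ratings for that condition. A minimal sketch (Python), using made-up ratings rather than Mandryk and Inkpen's actual data:

    # Compute per-condition mean and standard deviation of Likert
    # ratings. The scores below are invented for illustration.
    from statistics import mean, stdev

    ratings = {
        "Fun (vs computer)": [4, 3, 4, 5, 4, 3, 4, 4],
        "Fun (vs friend)":   [5, 4, 5, 5, 4, 5, 4, 5],
    }

    for condition, scores in ratings.items():
        print(f"{condition}: mean={mean(scores):.1f}, st. dev.={stdev(scores):.3f}")

A higher mean indicates the experience was rated as more boring, fun, etc., depending on the row; a higher standard deviation means participants' ratings varied more in that condition.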

The HutchWorld patient support system

• This virtual world supports communication among cancer patients.

• Privacy, logistics, patients’ feelings, etc. had to be taken into account.

• Designers and patients speak different languages.

• Participants in this world can design their own avatar. Look at the “My appearance” slide that follows. How would you evaluate it?

My Appearance

Multiple methods to evaluate the 1984 OMS

• Early tests of printed scenarios & user guides.

• Early simulations of the telephone keypad.

• An Olympian joined the team to provide feedback.

• Interviews & demos with Olympians outside the US.

• Overseas interface tests with friends and family.

• Free coffee and donut tests.

• Usability tests with 100 participants.

• A ‘try to destroy it’ test.

• Pre-Olympic field test at an international event.

• Reliability tests of the system under heavy traffic.

Something to think about

• Why was the design of the OMS a landmark in interaction design?

• Today cell phones replace the need for the OMS. What are some of the benefits and losses of cell phones in this context? How might you compensate for the losses that you thought of?

Evaluating an ambient system

• The Hello Wall is a new kind of system that is designed to explore how people react to its presence.

• What are the challenges of evaluating systems like this?

Key points

• Evaluation & design are closely integrated in user-centered design.

• Some of the same techniques are used in evaluation as for establishing requirements, but they are used differently (e.g. observation, interviews & questionnaires).

• Three main evaluation approaches are: usability testing, field studies, and analytical evaluation.

• The main methods are: observing, asking users, asking experts, user testing, inspection, and modeling users’ task performance.

• Different evaluation approaches and methods are often combined in one study.

• Triangulation involves using a combination of techniques to gain different perspectives, or analyzing data using different techniques.

• Dealing with constraints is an important skill for evaluators to develop.

A project for you …

• “The Butterfly Ballot: Anatomy of a Disaster” is an interesting account written by Bruce Tognazzini that you can find by going to AskTog.com and looking through the 2001 columns.

• Alternatively go directly to: http://www.asktog.com/columns/042ButterflyBallot.html

A project for you … continued

• Read Tog’s account and look at the picture of the ballot card.

• Make a similar ballot card for a class election and ask 10 of your friends to vote using the card. After each person has voted, ask who they intended to vote for and whether the card was confusing. Note down their comments.

• Redesign the card and perform the same test with 10 different people.

• Report your findings (the tallying sketch below may help).
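A minimal sketch (Python) for tallying the results of this exercise; the candidate names and vote pairs below are hypothetical:

    # Tally intended vs. recorded votes from the ballot test.
    # Each tuple pairs what the voter meant with what the card recorded.
    votes = [
        ("Alice", "Alice"),
        ("Alice", "Bob"),    # a mis-cast vote caused by the layout
        ("Bob",   "Bob"),
        ("Carol", "Carol"),
    ]

    errors = sum(1 for intended, recorded in votes if intended != recorded)
    rate = 100 * errors / len(votes)
    print(f"{errors} of {len(votes)} votes mis-cast ({rate:.0f}% error rate)")

Comparing the error rate before and after the redesign gives you a simple quantitative finding to report alongside participants' comments.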
