CS5714 Usability Engineering Usability Specifications Copyright © 2003 H. Rex Hartson and Deborah Hix.


Page 1

CS5714 Usability Engineering

Usability Specifications

Copyright © 2003 H. Rex Hartson and Deborah Hix.

Page 2

Topics

• What are usability specifications?
• Usability specification tables
• Benchmark task descriptions
• User errors in usability specifications
• Usability specifications and managing the UE process
• Team exercises

Page 3

Usability Specifications

• Introducing Envision
  – Video clips to be used as examples of process activities
  – Envision: a digital library of computer science literature
  – Search results are presented in a graphical scatterplot
  – Video clip: Envision prototype

Page 4

Usability Specifications

• Quantitative usability goals against which user interaction design is measured
• Target levels for usability attributes
  – Operationally defined metric for a usable interaction design
  – Management control for usability engineering life cycle
  – Indication that development process is converging toward a successful design
  – Establish as early in process as feasible

Page 5

Usability Specifications

• Tie usability specifications to early usability goals
  – E.g., for an early goal of walk-up-and-use usability, base the usability specification on initial task performance time
• All project members should agree on usability specification attributes and values

Page 6

Usability Specification Data

• Usability specifications based on
  – Objective, observable user performance
  – Subjective user opinion and satisfaction
• Subjective preferences may reflect users’ desire to return to your Web site, but “flash and trash” soon bores and irritates
• Objective and subjective usability specifications can both be quantitative
• A way to avoid usability specifications

Page 7

Usability Specification Table

• Usability attribute – what general usability characteristic is to be measured
  – May need separate usability attributes for each user class

Credit to: [Whiteside, Bennett, & Holtzblatt, 1988]

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
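
As an added side note (not part of the original slides), one row of this table can be captured as a simple record. The sketch below is a hypothetical Python illustration of the five columns; the class and field names are assumptions made here for illustration only.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class UsabilitySpec:
        """One row of a usability specification table (hypothetical sketch)."""
        attribute: str                # e.g., "Initial performance"
        measuring_instrument: str     # e.g., a benchmark task or a questionnaire
        value_to_be_measured: str     # e.g., "Time on task"
        current_level: Optional[str]  # baseline value, when one is available
        target_level: str             # level indicating unquestioned success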

Page 8

Usability Specifications

• Some quantitative usability attributes
  – Objective
    Initial performance (on benchmark tasks)
    Longitudinal (experienced, steady state) performance
    Learnability
    Retainability
  – Subjective
    Initial impression (questionnaire score)
    Longitudinal satisfaction

Page 9

Usability Specification Table

• Usability attribute for Calendar
  – Initial performance, since we want good 'walk-up-and-use' performance w/o training or manuals

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance |  |  |  |

Page 10

Usability Specification Table

• Measuring instrument
  – Vehicle by which values are measured for the usability attribute
  – The thing that generates the data
    Benchmark task generates objective timing data
    Questionnaire generates subjective preference data
• Sample questions from QUIS satisfaction questionnaire

Page 11

Benchmark Tasks

• What tasks should be included?
  – Representative, frequently performed tasks
  – Common tasks – the 20% that account for 80% of usage
  – Critical business tasks – not frequent, but if you get it wrong, heads can roll
• Example: Schedule a meeting with Dr. Ehrich for four weeks from today at 10 AM in 133 McBryde, about the HCI research project

Page 12

Benchmark Task Descriptions

• Clear, precise, repeatable instructions
• IMPORTANT: What task to do, not how to do it
• Clear start and end points for timing (see the timing sketch below)
  – Not: “Display next week’s appointments” (end with a user action confirming end of task)
• Adapt scenarios already developed for design
  – Clearly an important task to evaluate
  – Remove information about how to do it
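
A minimal sketch (an assumption added here, not from the slides) of how clear start and end points translate into timing: the clock starts when the participant begins the benchmark task and stops on the user action that confirms the end of the task.

    import time
    from typing import Optional

    class TaskTimer:
        """Times one benchmark task between explicit start and end points."""

        def __init__(self) -> None:
            self.start_time: Optional[float] = None

        def start(self) -> None:
            # Call when the participant begins the benchmark task.
            self.start_time = time.monotonic()

        def stop(self) -> float:
            # Call on the user action that confirms the end of the task
            # (e.g., a hypothetical "Done" click); returns elapsed seconds.
            return time.monotonic() - self.start_time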

Page 13

Benchmark Task Descriptions

• Start with fairly simple tasks, then progressively increase difficulty
  – Add an appointment, then add an appointment 60 days from now, then move an appointment from one month to another, then add recurring appointments
• Avoid large amounts of typing if typing skill is not being evaluated
• Tasks should include navigation
  – Not: look at today’s appointments

Page 14

Benchmark Task Descriptions

• Task wording should be unambiguous
  – Why is this ambiguous? “Schedule a meeting with Mr. Jones for one month from today, at 8 AM.”
• Important: Don’t use words in benchmark tasks that appear specifically in the interaction design
  – Not: “Find first appointment …” when there is a button labeled “Find”
  – Instead: use “search for” or “locate”

Page 15

Benchmark Task Descriptions

• Use work-context wording, not system-oriented wording
  – “Access information about xyz” is better than “submit query”
• To evaluate error recovery, a benchmark task can begin in an error state

Page 16

Benchmark Task Descriptions

• Put each benchmark task on a separate sheet of paper
• Typical number of benchmark tasks: enough for reasonable, representative coverage
• Example for Calendar: Add an appointment with Dr. Kevorkian for 4 weeks from today at 9 AM concerning your flu shot (yeah, right)

Page 17

Usability Specification Table

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt |  |  |

Page 18

Usability Specification Table

• Value to be measured – metric for which usability data values are collected

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt |  |  |

Page 19

Usability Specification Table

• Value to be measured – metric for which usability data values are collected
  – Time to complete task
  – Number of errors
  – Frequency of help and documentation use
  – Time spent in errors and recovery
  – Number of repetitions of failed commands
  – Number of times user expresses frustration or satisfaction
  – Number of commands, mouse clicks, or other user actions to perform task(s)
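
A minimal sketch (an assumption, not from the slides) of deriving a few of these values from a simple event log of one benchmark-task session; the event names, timestamps, and log format are hypothetical.

    # Hypothetical event log: (seconds since task start, event name)
    events = [
        (0,  "task_start"),
        (5,  "error"),
        (9,  "help_opened"),
        (20, "task_end"),
    ]

    time_on_task = events[-1][0] - events[0][0]                    # time to complete task
    error_count = sum(1 for _, name in events if name == "error")  # number of errors
    help_uses = sum(1 for _, name in events if name == "help_opened")

    print(time_on_task, error_count, help_uses)  # -> 20 1 1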

Page 20

Usability Specification Table

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt | Time on task |  |

Page 21

Usability Specification Table

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt | Time on task |  |

• Current level – present value of usability attribute to be measured

No time for reqts

Page 22

Usability Specification Table

• Current level
  – Level of performance for current version of system for the measuring instrument (when available)
  – Baseline to help set target level, from:
    Automated system (existing or prior version)
    Competitor system
    Developer performance (for expert, longitudinal use)
    Try out some users on your early prototype

Page 23

Usability Specification Table

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt | Time on task | 20 secs (competitor system) |

Page 24

Usability Specification Table

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt | Time on task | 20 secs (competitor system) |

• Target level – value indicating unquestioned usability success for present version

Page 25

Usability Specification Table

• Target level
  – Minimum acceptable level of user performance
  – Determining target level values
    Usually an acceptable improvement over current level

Page 26

Usability Specification Table

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt | Time on task | 20 secs (competitor system) | 15 secs

Page 27

Usability Specification Table

• More example usability specifications

Usability attribute | Measuring instrument | Value to be measured | Current level | Target level
Initial performance | BT1: Add appt | Time on task | 20 secs (competitor system) | 15 secs
Initial performance | BT1: Add appt | Nbr of errors | 2 | 1
Initial satisfaction | Q 1, 2, 7 from questionnaire | Avg score over questions and users (out of 10) | 7 | 8.5
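
For the subjective row, the questionnaire score is just the mean of the selected items over all participants, on a 10-point scale. The sketch below is an added illustration with made-up participant ratings, not data from the course.

    # participant -> ratings for questionnaire items 1, 2, and 7 (each 1..10)
    ratings = {
        "P1": [7, 8, 6],
        "P2": [9, 7, 8],
        "P3": [6, 7, 7],
    }

    scores = [s for answers in ratings.values() for s in answers]
    avg_score = sum(scores) / len(scores)
    print(f"Average score over questions and users: {avg_score:.1f} / 10")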

Page 28

User Errors in Usability Specs.

• What constitutes a user error?
  – Deviation from any correct path to accomplish the task (except, for example, going to Help)
  – Only situations that imply usability problems
  – Do not count “oops” errors – doing the wrong thing when the user knew it was wrong

Page 29

User Errors in Usability Specs.

• Examples of errors
  – Selecting wrong menu, button, icon, etc. when the user thought it was the right one
    E.g., working on the wrong month of the calendar because they couldn't readily see the month's name
  – Double-clicking when a single click is needed, and vice versa
  – Using wrong accelerator key
  – Operating on the wrong interaction object (when the user thought it was the right one)
  – Usually not typing errors
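
A small added sketch (an assumption, not from the slides) of applying the counting rules from the previous two slides: slips (“oops” errors) and plain typing errors are excluded from the error count. In practice this classification is the evaluator's judgment, not a script's; the labels below are hypothetical.

    # Hypothetical observations logged by an evaluator during one task
    observations = [
        {"what": "selected wrong menu (thought it was right)", "oops": False, "typing": False},
        {"what": "mistyped attendee name",                     "oops": False, "typing": True},
        {"what": "clicked wrong button, noticed immediately",  "oops": True,  "typing": False},
    ]

    # Keep only deviations that imply a usability problem
    counted = [o for o in observations if not o["oops"] and not o["typing"]]
    print(len(counted))  # -> 1 countable user error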

Page 30

Creating Usability Specifications

• Usability evaluation design driven by usability goals
  – First determine usability goals
    In terms of user class, task context, special tasks, marketing needs
    Example: Reduce amount of time for a novice user to perform task X in Version 2.0
  – Be as specific as possible
    Example: currently 35 seconds to perform task X (“current level”); reduce to 25 seconds (“target level”)

Page 31

Creating Usability Specifications

• What are constraints in user or work context?
• Design for ecological validity
  – How can setting be more realistic?
  – Usability lab can be a “sterile work environment”
  – Does task require telephone or other physical props?

Page 32

Creating Usability Specifications

• Design for ecological validity
  – Does task involve more than one person or role?
  – Does task involve background noise?
  – Does task involve interference, interruption?

Page 33

Creating Usability Specifications

• Experimental design must take into account trade-offs among user groups
  – Watch out for a potential trade-off between learnability for new users and performance power for experienced users

Page 34

Usability Specifications – Connecting Back to UE Process

• Usability specifications help manage the usability engineering process
• This is the control of the usability engineering life cycle
  – Quantifiable end to the process
  – Accountability
  – Stop iterating when target-level usability specifications are met (see the sketch below)
  – Documentation
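
A minimal sketch (an added assumption, not part of the slides) of this control idea: compare observed values from the latest evaluation against each target level and keep iterating until every target is met. The spec names, directions, and observed values are hypothetical.

    # value to be measured -> (target level, whether lower or higher is better)
    targets = {
        "time_on_task_secs":  (15.0, "lower"),
        "errors_per_task":    (1.0,  "lower"),
        "satisfaction_score": (8.5,  "higher"),
    }
    observed = {"time_on_task_secs": 17.2, "errors_per_task": 1.0, "satisfaction_score": 8.7}

    def met(value: float, target: float, direction: str) -> bool:
        return value <= target if direction == "lower" else value >= target

    unmet = [name for name, (t, d) in targets.items() if not met(observed[name], t, d)]
    if unmet:
        print("Keep iterating; targets not yet met:", unmet)
    else:
        print("All target levels met; stop iterating")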

Page 35

Usability Specifications

• It’s expected that you will not meet all usability target levels on the first iteration
  – If usability target levels are met on the first iteration, they may have been too lenient
  – Point is to uncover usability problems
  – DO NOT design usability specifications with the goal of meeting them with your initial design!

Page 36

Usability Specifications

• Bottom line: This is not an exact science
• Good engineering judgment is important
  – For setting levels (especially “target” level)
  – For knowing if specifications are “reasonable”
  – Video-clip: Setting usability specifications
• You get better at it with experience

Page 37

Team Exercise – Usability Specifications

• Goal:
  – To gain experience in writing precise, measurable usability specifications using benchmark tasks
• Activities:
  – Produce three usability specifications, two based on objective measures, one based on subjective measures
  – For the objective measures, write brief but specific benchmark task descriptions (at least two different benchmark task descriptions), each on a separate sheet of paper. Have them be a little complicated; include some navigation.

Page 38

Team Exercise – Usability Specifications

  – Specifications with objective measures should be evaluatable, via benchmark tasks, in a later class exercise on formative evaluation.
  – Develop tasks that you can “implement” in your next exercise, to build a rapid prototype.
  – The specification for the subjective measure should be based on the questionnaire supplied. Select 3 or 4 items from the questionnaire.

Page 39

Team Exercise – Usability Specifications

• Cautions and hints:
  – Don’t spend any time on design in this exercise; there will be time for detailed design in the next exercise.
  – Don’t plan to give users any training.
• 3 usability specifications, in the table form, on a transparency
• Questionnaire question numbers included in the subjective specification
• Benchmark task descriptions, each on a separate sheet of paper
• Complete in about 30-40 minutes max.