
Prof. James A. Landay
University of Washington
Autumn 2004

(1) Action Analysis
(2) Automated Evaluation

December 7, 2004

CSE490jl: User Interface Design, Prototyping, & Evaluation

Hall of Fame or Hall of Shame?

• Bryce 2 – for building 3D models


Hall of Shame!

• Icons all look similar – what do they do?
• How do you exit?
• Note
  – nice visuals, but must be usable
• What if purely for entertainment?


Outline

• Review
• Action analysis
  – GOMS? What's that?
  – The G, O, M, & S of GOMS
  – How to do the analysis
• Announcements
• Automated evaluation tools


Review: Toolkit Details

• Models for images?
  – strokes, pixels, regions
  – what is good about the stroke model?
    • saves space & computation, but can't represent images well
  – what is aliasing & how do we fix it?
    • jaggies due to low resolution -> antialias (partially fill in adjacent pixels)
• Clipping?
  – drawing only regions that are visible to the user
• Windowing systems
  – special problem with networked WS? latency
• Input events, such as
  – keyboard, mouse, window, etc.
• Main event loop
  – used to dispatch events (see the sketch after this list)
• Interactor trees
  – used for figuring out where to dispatch events
• Dispatching events
  – two main ways…
• Event focus
  – determines what widget current events go to
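To make the dispatch review concrete, here is a minimal sketch in Python. All names (Event, Widget, main_event_loop) are hypothetical toy constructs, not any real toolkit's API. It shows positional dispatch for mouse events (walking the interactor tree) alongside focus-based dispatch for key events:

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    kind: str           # e.g., "mouse" or "key"
    x: int = 0
    y: int = 0
    key: str = ""

@dataclass
class Widget:
    name: str
    bounds: tuple                         # (x, y, w, h) in window coordinates
    children: list = field(default_factory=list)

    def contains(self, x, y):
        bx, by, bw, bh = self.bounds
        return bx <= x < bx + bw and by <= y < by + bh

    def pick(self, x, y):
        # positional dispatch: deepest widget under the cursor wins
        for child in self.children:
            if child.contains(x, y):
                return child.pick(x, y)
        return self

def main_event_loop(root, key_focus, event_queue):
    """Dispatch each event: mouse -> interactor tree, key -> focus widget."""
    for ev in event_queue:
        if ev.kind == "mouse":
            target = root.pick(ev.x, ev.y)   # walk the interactor tree
        else:
            target = key_focus               # event focus decides the target
        print(f"{ev.kind} event -> {target.name}")

root = Widget("window", (0, 0, 200, 100),
              children=[Widget("button", (10, 10, 80, 30))])
main_event_loop(root, key_focus=root,
                event_queue=[Event("mouse", x=20, y=20), Event("key", key="q")])
```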


Action Analysis Predicts Performance

• Cognitive model
  – models some aspect of human understanding, knowledge, intentions, or processing
  – two types
    • competence – predicts behavior sequences
    • performance – predicts performance, but limited to routine behavior
• Action analysis uses a performance model to analyze goals & tasks
  – generally done hierarchically (similar to TA)


GOMS – Most Popular Action Analysis

• Family of UI modeling techniques
  – based on the Model Human Processor
• GOMS stands for
  – Goals
  – Operators
  – Methods
  – Selection rules
• Input: detailed description of UI/task(s)
• Output: qualitative & quantitative measures


Quick Example

• Goal (the big picture)
  – go from hotel to the airport
• Methods (or subgoals)
  – walk, take bus, take taxi, rent car, take train
• Operators (or specific actions)
  – locate bus stop; wait for bus; get on the bus; ...
• Selection rules (choosing among methods; see the sketch below)
  – example: walking is cheaper, but tiring and slow
  – example: taking a bus is complicated abroad
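As a sketch of how this example could be written down for analysis, here is the goal/method/operator structure in Python, with the slide's selection rules encoded as simple predicates. The operator lists and context fields are my illustrative assumptions:

```python
# Goal: go from hotel to the airport.
# Methods map to their operator sequences; selection rules pick a method.
methods = {
    "walk":      ["get directions", "walk to airport"],
    "take bus":  ["locate bus stop", "wait for bus", "get on the bus",
                  "ride to airport stop", "get off the bus"],
    "take taxi": ["hail taxi", "state destination", "ride", "pay driver"],
}

def select_method(ctx):
    # Selection rules from the slide, as predicates over the context
    if ctx["abroad"]:                            # buses are complicated abroad
        return "take taxi"
    if ctx["budget"] == "low" and not ctx["tired"]:
        return "walk"                            # cheaper, but tiring and slow
    return "take bus"

ctx = {"abroad": False, "budget": "low", "tired": True}
chosen = select_method(ctx)
print(f"goal: hotel -> airport  |  method: {chosen}")
for op in methods[chosen]:
    print("  operator:", op)
```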


GOMS Output

• Execution time (see the sketch below)
  – add up times from operators
  – assumes experts (have mastered the tasks)
  – error-free behavior
  – very good rank ordering
  – absolute accuracy ~10-20%
• Procedure learning time (NGOMSL only)
  – accurate for relative comparison only
  – doesn't include time for learning domain knowledge
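For the execution-time output, here is a minimal sketch of the summing step using the standard Keystroke-Level Model operator times (a simplified member of the GOMS family); the task encoding at the bottom is purely illustrative:

```python
# Standard KLM operator times in seconds (Card, Moran & Newell)
KLM = {
    "K": 0.2,    # keystroke (skilled typist)
    "P": 1.1,    # point with mouse
    "B": 0.1,    # press or release mouse button
    "H": 0.4,    # home hands between keyboard and mouse
    "M": 1.35,   # mental preparation
}

def execution_time(operators):
    """Sum operator times; assumes expert, error-free behavior."""
    return sum(KLM[op] for op in operators)

# Illustrative: one mental step, then type an 8-character command + Enter
task = ["M"] + ["K"] * 9
print(f"predicted execution time: {execution_time(task):.2f} s")  # 3.15 s
```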


Using GOMS Output

• Ensure frequent goals are achieved quickly
• Building the hierarchy is often the main value
  – functionality coverage & consistency
    • does the UI contain the needed functions?
    • consistency: are similar tasks performed similarly?
  – operator sequence
    • in what order are individual operations done?


Comparative Example - DOS

• Goal: delete a file
• Method for accomplishing the goal of deleting a file
  – retrieve from long-term memory that the command verb is "del"
  – think of directory name & file name and make it the first listed parameter
  – accomplish goal of entering & executing the command
  – return with goal accomplished


Comparative Example - Mac

• Goal: delete a file
• Method for accomplishing the goal of deleting a file
  – find file icon
  – accomplish goal of dragging file to trash
  – return with goal accomplished
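As a rough illustration of comparing the two methods, the sketch below encodes each as keystroke-level operators and sums the standard times. The specific operator encodings (file name length, number of pointing steps) are my assumptions, not from the slides:

```python
# Standard KLM operator times in seconds
KLM = {"K": 0.2, "P": 1.1, "B": 0.1, "H": 0.4, "M": 1.35}

def t(ops):
    return sum(KLM[o] for o in ops)

# DOS: recall "del" (M), think of file name (M),
# then type "del report.txt" plus Enter (15 keystrokes)
dos = ["M", "M"] + ["K"] * (len("del report.txt") + 1)

# Mac: home hand on mouse (H), find the icon (M), point at it (P),
# press button (B), drag to trash (P), release (B)
mac = ["H", "M", "P", "B", "P", "B"]

print(f"DOS delete: {t(dos):.2f} s")  # 5.70 s
print(f"Mac delete: {t(mac):.2f} s")  # 4.15 s
```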


Applications of GOMS

• Compare different UI designs

• Profiling (time)

• Building a help system
  – modeling makes user tasks & goals explicit
  – can suggest questions users will ask & the answers


Tradeoffs of Using GOMS

• Advantages
  – gives qualitative & quantitative measures
  – less work than a user study
  – easy to modify when the UI is revised
• Disadvantages
  – takes lots of time, skill, & effort
    • research: tools to aid the modeling process
  – only works for goal-directed tasks
    • not problem solving or creative tasks (design)
  – assumes tasks performed by experts w/o error
  – does not address several UI issues
    • e.g., readability, memorability of icons & commands


Announcements

• Make sure your web sites are up to date
  – I scanned last night and saw lots of material missing
    • PowerPoint slides, all assignments, mailto link for team!
  – use Design Patterns to guide your design
  – make sure all links work & are on OUR disk space (we will archive)
  – we will start grading these after the final
• Write-up for user testing assignment due by 5 PM Friday (online & at Richard's or Kate's office)
• Final presentations
  – Guggenheim 217
  – 22 registered industry/UW guests
  – dress appropriately
  – bring a resume if looking for a job (summer or permanent)
  – give demos after everyone has presented
  – I'll supply lunch if you can hang around from 12-1
• Questions?


Rapid Iterative Design is the Best Practice for Creating Good UIs

[Iterative cycle: Design -> Prototyping -> Evaluation]

We have seen how computer-based tools can improve the Design (e.g., Denim) & Prototyping (e.g., VB) phases


Online, Remote Usability Testing

• Use the web to carry out usability evaluations
• Main approach is remote usability testing
  – e.g., NetRaker (now KeyNote WebEffective)
    • combines usability testing + market research techniques
    • automatic logging & some analysis of usage


Remote Usability Testing

• Move usability testing online
  – research participants access the "lab" via the web
  – answer questions & complete tasks in a "survey"
  – system records actions or screens for playback
  – can test many users & tasks -> good coverage
• Analyze data in aggregate or individually (see the sketch below)
  – find general problem areas
    • use average task times or completion rates
  – play back individual sessions
  – focus on problems w/ traditional usability testing
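A small sketch of the aggregate-analysis step, assuming session records with a per-task completion flag and duration (the log format is hypothetical):

```python
from statistics import mean

# Hypothetical logs from remote participants
sessions = [
    {"task": "find pricing", "completed": True,  "seconds": 42.0},
    {"task": "find pricing", "completed": True,  "seconds": 55.5},
    {"task": "find pricing", "completed": False, "seconds": 120.0},
    {"task": "checkout",     "completed": True,  "seconds": 88.0},
]

def summarize(sessions):
    """Per-task completion rate and mean time of successful runs."""
    by_task = {}
    for s in sessions:
        by_task.setdefault(s["task"], []).append(s)
    for task, runs in sorted(by_task.items()):
        rate = sum(r["completed"] for r in runs) / len(runs)
        done = [r["seconds"] for r in runs if r["completed"]]
        avg = f"{mean(done):.1f} s" if done else "n/a"
        print(f"{task}: {rate:.0%} completion, avg time {avg} (n={len(runs)})")

summarize(sessions)
```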


NetRaker: Web Experience Evaluation

• NetRaker Index
  – short pop-up survey shown to 1 in n visitors
  – on-going tracking & evaluation data
• NetRaker Experience Evaluator
  – surveys & task testing
  – records clickstreams as well
  – invite delivered through email, links, or pop-ups
• NetRaker Experience Recording
  – captures "video" of the remote participant's screen
  – indexed by survey data or task performance


NetRaker Index: On-going customer intelligence gathering

• Small number of rotated questions increases response rate


• Increasing these indices (e.g., retention) moderately (5%) leads to a large increase in revenue growth


NetRaker Experience Evaluator: See how customers accomplish real tasks on site


NetRaker Usability Research: See how customers accomplish real tasks on site


WebQuilt: Visual Analysis

• Goals
  – link page elements to user actions
  – identify behavior/navigation patterns
  – highlight potential problem areas
• Solution (see the sketch below)
  – interactive graph based on web content
    • nodes represent web pages
    • edges represent aggregate traffic between pages
  – designers can indicate expected paths
  – color code common usability interests
  – filtering to show only target participants
  – use zooming for analyzing data at varying granularity
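The aggregation idea behind such a graph can be sketched as follows: fold logged clickstreams into page-to-page edge counts and flag traffic that leaves the designer's expected path. This illustrates the concept only; it is not WebQuilt's actual implementation:

```python
from collections import Counter

# Hypothetical clickstreams: each is one participant's ordered page visits
clickstreams = [
    ["home", "products", "checkout"],
    ["home", "search", "products", "checkout"],
    ["home", "search", "home", "products"],
]
expected_path = ["home", "products", "checkout"]   # designer-indicated

# Nodes are pages; edge weights are aggregate traffic between pages
edges = Counter()
for stream in clickstreams:
    edges.update(zip(stream, stream[1:]))

expected_edges = set(zip(expected_path, expected_path[1:]))
for (src, dst), n in edges.most_common():
    flag = "" if (src, dst) in expected_edges else "   <- off expected path"
    print(f"{src} -> {dst}: {n} traversal(s){flag}")
```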



Advantages of Remote Usability Testing

• Fast
  – can set up research in 3-4 hours
  – get results in 24 hours
• More accurate
  – can run with large sample sizes
    • 50-200 users -> reliable bottom-line data (statistically significant; see the sketch below)
  – uses real people (customers) performing tasks
  – natural environment (home/work/machine)
• Easy to use
  – templates make setup easy for non-specialists
• Can compare with competitors
  – indexed to national norms
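To see why samples of 50-200 users yield reliable bottom-line numbers, here is a sketch of a normal-approximation 95% confidence interval for a task completion rate (the counts are made up):

```python
from math import sqrt

def completion_ci(successes, n, z=1.96):
    """Normal-approximation 95% CI for a completion rate."""
    p = successes / n
    margin = z * sqrt(p * (1 - p) / n)
    return p, max(0.0, p - margin), min(1.0, p + margin)

# e.g., 120 of 150 remote participants completed the task
p, lo, hi = completion_ci(120, 150)
print(f"completion: {p:.0%} (95% CI {lo:.0%}-{hi:.0%})")  # 80% (74%-86%)
```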


Disadvantages of Remote Usability

• Miss observational feedback
  – facial expressions
  – verbal feedback (critical incidents)
    • can replace some of this w/ phone & chat
• Need to involve human participants
  – costs money (typically $20-$50/person)


Summary

• GOMS
  – provides info about important UI properties
  – doesn't tell you everything you want to know about a UI
    • only gives performance for expert behavior
  – hard to create the model, but still easier than user testing
    • changing it later is much less work than the initial generation
• Automated usability
  – faster than traditional techniques
  – can involve more participants -> convincing data
  – easier to do comparisons across sites
  – tradeoff: lose observational data


Next Time

• Final presentations
  – Guggenheim 217
