
Prof. James A. Landay University of Washington Autumn 2004 (1) Action Analysis (2) Automated Evaluation December 7, 2004


Page 1

Prof. James A. Landay
University of Washington

Autumn 2004

(1) Action Analysis (2) Automated Evaluation

December 7, 2004

Page 2

Hall of Fame or Hall of Shame?

• Bryce 2
  – for building 3D models

Page 3

Hall of Shame!

• Icons all look similar
  – what do they do?
• How do you exit?
• Note
  – nice visuals, but it must still be usable
• What if it is purely for entertainment?

Page 4

Prof. James A. Landay
University of Washington

Autumn 2004

(1) Action Analysis (2) Automated Evaluation

December 7, 2004

Page 5

Outline

• Review

• Action analysis

• GOMS? What’s that?

• The G, O, M, & S of GOMS

• How to do the analysis

• Announcements

• Automated evaluation tools

Page 6

Review: Toolkit Details

• Models for images?
  – strokes, pixels, regions
  – what is good about the stroke model?
    • saves space & computation, but can't represent images well
  – what is aliasing & how do we fix it?
    • jaggies due to low resolution -> antialias (partially fill in adjacent pixels)
• Clipping?
  – drawing only regions that are visible to the user
• Windowing systems
  – special problem with networked window systems?
    • latency
• Input events, such as
  – keyboard, mouse, window, etc.
• Main event loop
  – used to dispatch events (see the sketch below)
• Interactor trees used for
  – figuring out where to dispatch events
• Dispatching events
  – two main ways…
• Event focus determines
  – what widget current events go to
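
To make the event-dispatch review concrete, here is a minimal sketch (not from the lecture) of a main event loop that dispatches events through an interactor tree, showing both positional and focus-based dispatch; all class and function names are hypothetical.

```python
# Minimal sketch: a main event loop dispatching input events to widgets
# in an interactor tree. Positional dispatch walks the tree by location;
# focus-based dispatch sends events to the widget holding the focus.

class Widget:
    def __init__(self, bounds, children=()):
        self.bounds = bounds              # (x, y, width, height)
        self.children = list(children)

    def contains(self, x, y):
        bx, by, bw, bh = self.bounds
        return bx <= x < bx + bw and by <= y < by + bh

    def handle(self, event):
        print(f"{self.__class__.__name__} got {event['type']}")

class Button(Widget):
    pass

def positional_dispatch(root, event):
    """Walk the interactor tree; the deepest widget under the pointer wins."""
    target = None
    def visit(widget):
        nonlocal target
        if widget.contains(event["x"], event["y"]):
            target = widget
            for child in widget.children:
                visit(child)
    visit(root)
    if target:
        target.handle(event)

def main_event_loop(root, event_queue, key_focus):
    """Dequeue events and dispatch: positional for mouse, focus for keys."""
    for event in event_queue:
        if event["type"] == "mouse":
            positional_dispatch(root, event)
        elif event["type"] == "key":
            key_focus.handle(event)       # focus-based dispatch

# Usage: a window containing one button; one click and one keystroke.
button = Button((10, 10, 80, 20))
window = Widget((0, 0, 200, 100), [button])
main_event_loop(window,
                [{"type": "mouse", "x": 15, "y": 15},
                 {"type": "key", "key": "a"}],
                key_focus=button)
```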

Page 7

Action Analysis Predicts Performance

• Cognitive model
  – models some aspect of human understanding, knowledge, intentions, or processing
  – two types
    • competence: predicts behavior sequences
    • performance: predicts performance, but limited to routine behavior
• Action analysis uses a performance model to analyze goals & tasks
  – generally done hierarchically (similar to task analysis)

Page 8

GOMS – Most Popular Action Analysis

• Family of UI modeling techniques
  – based on the Model Human Processor
• GOMS stands for
  – Goals
  – Operators
  – Methods
  – Selection rules
• Input: detailed description of the UI and task(s)
• Output: qualitative & quantitative measures

Page 9

Quick Example

• Goal (the big picture)
  – go from hotel to the airport
• Methods (or subgoals)?
  – walk, take bus, take taxi, rent car, take train
• Operators (or specific actions)
  – locate bus stop; wait for bus; get on the bus; ...
• Selection rules (choosing among methods)?
  – example: walking is cheaper, but tiring and slow
  – example: taking a bus is complicated abroad
  (a data-structure sketch of this example follows below)
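
Purely as an illustration (not part of the slides), the hotel-to-airport example can be written down as a small GOMS-like structure; the field names and the selection-rule logic below are hypothetical simplifications, not a real GOMS or NGOMSL notation.

```python
# Hypothetical sketch: the hotel-to-airport example as a GOMS-style structure.

goal = {
    "goal": "go from hotel to the airport",
    "methods": {
        "walk":      {"operators": ["get directions", "walk to airport"]},
        "take-bus":  {"operators": ["locate bus stop", "wait for bus",
                                    "get on the bus", "ride", "get off"]},
        "take-taxi": {"operators": ["hail taxi", "state destination",
                                    "ride", "pay driver"]},
    },
}

def select_method(context):
    """Selection rules: pick one method based on the situation (illustrative)."""
    if context.get("abroad") and not context.get("knows_transit"):
        return "take-taxi"   # taking a bus is complicated abroad
    if context.get("distance_km", 0) < 2:
        return "walk"        # cheaper, but tiring and slow
    return "take-bus"

method = select_method({"abroad": True, "distance_km": 8})
print(method, "->", goal["methods"][method]["operators"])
```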

Page 10

GOMS Output

• Execution time
  – add up times from operators
  – assumes experts who have mastered the tasks
  – error-free behavior
  – very good rank ordering
  – absolute accuracy ~10-20%
• Procedure learning time (NGOMSL only)
  – accurate for relative comparisons only
  – doesn't include time for learning domain knowledge

Page 11

Using GOMS Output

• Ensure frequent goals can be achieved quickly
• Building the hierarchy is often the real value
  – functionality coverage & consistency
    • does the UI contain the needed functions?
    • consistency: are similar tasks performed similarly?
  – operator sequence
    • in what order are the individual operations done?

Page 12

Comparative Example - DOS

• Goal: Delete a File
• Method for accomplishing the goal of deleting a file
  – retrieve from long-term memory that the command verb is "del"
  – think of the directory name & file name and make it the first listed parameter
  – accomplish the goal of entering & executing the command
  – return with goal accomplished

Page 13

Comparative Example - Mac

• Goal: Delete a File

• Method for accomplishing the goal of deleting a file
  – find the file icon
  – accomplish the goal of dragging the file to the trash
  – return with goal accomplished
  (a rough timing comparison of the DOS and Mac methods follows below)
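
As a rough, back-of-the-envelope comparison (not given in the slides), the two methods can be timed with the commonly cited Keystroke-Level Model operator estimates; the command string, file path, and event sequence below are illustrative assumptions.

```python
# Rough Keystroke-Level Model comparison of the two delete-file methods.
# Operator times are the commonly cited KLM estimates; the file name
# and exact event sequence are hypothetical.

K = 0.20   # keystroke (average skilled typist), seconds
P = 1.10   # point with mouse
B = 0.10   # mouse button press or release
H = 0.40   # home hands between keyboard and mouse
M = 1.35   # mental preparation

# DOS: type "del work\report.txt" and press Enter
command = "del work\\report.txt"
dos_time = M + K * (len(command) + 1)    # think, then type command + Enter

# Mac: point at the file icon, drag it to the trash, release
mac_time = M + H + P + B + P + B         # think, grab mouse, point, drag, drop

print(f"DOS: {dos_time:.1f} s, Mac: {mac_time:.1f} s")
```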

Page 14

Applications of GOMS

• Compare different UI designs

• Profiling (time)

• Building a help system
  – modeling makes user tasks & goals explicit
  – can suggest questions users will ask & the answers

Page 15

Tradeoffs of Using GOMS

• Advantages
  – gives qualitative & quantitative measures
  – less work than a user study
  – easy to modify when the UI is revised
• Disadvantages
  – takes lots of time, skill, & effort
    • research: tools to aid the modeling process
  – only works for goal-directed tasks
    • not problem solving or creative tasks (design)
  – assumes tasks are performed by experts w/o error
  – does not address several UI issues
    • e.g., readability, memorability of icons & commands

Page 16

Announcements

• Make sure your web sites are up to date
  – I scanned last night and saw lots of material missing
    • PowerPoint slides, all assignments, mailto link for the team!
  – use design patterns to guide your design
  – make sure all links work & are on OUR disk space (we will archive)
  – we will start grading these after the final
• Write-up for the user testing assignment is due by 5 PM Friday (online & at Richard's or Kate's office)
• Final presentations
  – Guggenheim 217
  – 22 registered industry/UW guests
  – dress appropriately
  – bring a resume if you are looking for a job (summer or permanent)
  – give demos after everyone has presented
  – I'll supply lunch if you can hang around from 12-1
• Questions?

Page 17

Rapid Iterative Design is the Best Practice for Creating Good UIs

Design → Prototyping → Evaluation (iterative cycle)

We have seen how computer-based tools can improve the Design (e.g., Denim) & Prototyping (e.g., VB) phases

Page 18

Online, Remote Usability Testing

• Use the web to carry out usability evaluations
• Main approach is remote usability testing
  – e.g., NetRaker (now KeyNote WebEffective)
    • combines usability testing + market research techniques
    • automatic logging & some analysis of usage

Page 19

Remote Usability Testing

• Move usability testing online
  – research participants access the "lab" via the web
  – answer questions & complete tasks in a "survey"
  – system records actions or screens for playback
  – can test many users & tasks -> good coverage
• Analyze data in aggregate or individually
  – find general problem areas
    • use average task times or completion rates (see the sketch below)
  – play back individual sessions
  – focus on problems w/ traditional usability testing
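
A minimal sketch of the aggregate analysis described above, computing a completion rate and average task time from session logs; the log format and numbers are hypothetical, not from any real tool.

```python
# Minimal sketch: aggregate analysis of remote-test session logs.
# The log format and numbers are hypothetical.

from statistics import mean

sessions = [
    {"task": "find-price", "completed": True,  "seconds": 42.0},
    {"task": "find-price", "completed": True,  "seconds": 67.5},
    {"task": "find-price", "completed": False, "seconds": 180.0},
    {"task": "checkout",   "completed": True,  "seconds": 95.0},
]

def summarize(sessions, task):
    """Completion rate and mean time-on-task for successful attempts."""
    runs = [s for s in sessions if s["task"] == task]
    done = [s for s in runs if s["completed"]]
    rate = len(done) / len(runs) if runs else 0.0
    avg = mean(s["seconds"] for s in done) if done else None
    return rate, avg

rate, avg = summarize(sessions, "find-price")
print(f"find-price: {rate:.0%} completed, avg {avg:.1f}s on success")
```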

Page 20

NetRaker: Web Experience Evaluation

• NetRaker Index
  – short pop-up survey shown to 1 in n visitors (see the sketch below)
  – on-going tracking & evaluation data
• NetRaker Experience Evaluator
  – surveys & task testing
  – records clickstreams as well
  – invite delivered through email, links, or pop-ups
• NetRaker Experience Recording
  – captures "video" of a remote participant's screen
  – indexed by survey data or task performance
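
The "1 in n visitors" idea can be illustrated with a tiny sampling sketch; this is not NetRaker's actual mechanism, just a hypothetical stand-in for how a site might decide which visitors see the pop-up survey.

```python
# Hypothetical sketch: deterministic 1-in-n sampling for survey invitations.

import hashlib

def show_survey(visitor_id: str, n: int = 20) -> bool:
    """Invite roughly 1 in n visitors, stable per visitor ID."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return int(digest, 16) % n == 0

invited = [v for v in ("u1", "u2", "u3", "u42") if show_survey(v)]
print(invited)   # most visitors are not invited
```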

Page 21

• Small number of rotated questions increases response rate

NetRaker Index: On-going customer intelligence gathering

Page 22

• Small number of rotated questions increases response rate

NetRaker Index: On-going customer intelligence gathering

Page 23

• Increasing these indices (e.g., retention) moderately (5%) leads to a large increase in revenue growth

NetRaker Index: On-going customer intelligence gathering

Page 24

NetRaker Experience Evaluator: See how customers accomplish real tasks on site

Page 25

NetRaker Usability Research: See how customers accomplish real tasks on site

Page 26

NetRaker Experience Evaluator: See how customers accomplish real tasks on site

Page 27

WebQuilt: Visual Analysis

• Goals
  – link page elements to user actions
  – identify behavior/navigation patterns
  – highlight potential problem areas
• Solution
  – interactive graph based on web content (see the sketch below)
    • nodes represent web pages
    • edges represent aggregate traffic between pages
  – designers can indicate expected paths
  – color code common usability interests
  – filtering to show only target participants
  – use zooming for analyzing data at varying granularity
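
A minimal sketch of the aggregation idea behind the graph (not the actual WebQuilt implementation): per-participant clickstreams become a graph whose nodes are pages and whose edge weights are aggregate traffic, with a designer-specified expected path for comparison; all data is hypothetical.

```python
# Minimal sketch of WebQuilt-style aggregation: edge weights count how many
# transitions occurred between pages across all participants.

from collections import Counter

clickstreams = [
    ["home", "search", "results", "product"],            # participant 1
    ["home", "specials", "product"],                      # participant 2
    ["home", "search", "results", "search", "results"],  # participant 3
]

edges = Counter()
for path in clickstreams:
    for src, dst in zip(path, path[1:]):
        edges[(src, dst)] += 1

expected = {("home", "search"), ("search", "results"), ("results", "product")}
for (src, dst), count in edges.most_common():
    marker = "" if (src, dst) in expected else "  <- off the expected path"
    print(f"{src} -> {dst}: {count}{marker}")
```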

Pages 28-30 (no text content captured)

Page 31

Advantages of Remote Usability Testing

• Fast
  – can set up research in 3-4 hours
  – get results in 24 hours
• More accurate
  – can run with large sample sizes
    • 50-200 users -> reliable, statistically significant bottom-line data
  – uses real people (customers) performing tasks
  – natural environment (home/work/machine)
• Easy to use
  – templates make setup easy for non-specialists
• Can compare with competitors
  – indexed to national norms

Page 32

Disadvantages of Remote Usability

• Miss observational feedback
  – facial expressions
  – verbal feedback (critical incidents)
    • can replace some of this w/ phone & chat
• Need to involve human participants
  – costs money (typically $20-$50/person)

Page 33

Summary

• GOMS
  – provides info about important UI properties
  – doesn't tell you everything you want to know about a UI
    • only gives performance for expert behavior
  – hard to create the model, but still easier than user testing
    • changing it later is much less work than the initial generation
• Automated usability
  – faster than traditional techniques
  – can involve more participants -> convincing data
  – easier to do comparisons across sites
  – tradeoff: losing observational data

Page 34

Next Time

• Final presentations
  – Guggenheim 217
