Evaluation: the 3rd phase of interaction design.
HOME-MESS SYSTEM Presentation III: Evaluation of Prototype I
Arundhati, Ihab, Ibrahim, Fareed, Zain
INTRODUCTION
“TESTING”: THE MOMENT OF TRUTH
Source: http://blogs.msdn.com/willy-peter_schaub
• Testing is the big unknown, often-ignored world that aims to uncover design and construction flaws in a solution.
• The earlier we start testing, and the better the testing strategy and test coverage, the sooner we can reveal flaws and address them.
• A flaw uncovered while the solution is still in its early construction phases is relatively cheap and easy to resolve.
OBJECTIVES
“Reaching a deeper understanding of the users' expectations and impressions of the system.”
Discovering strengths and weaknesses in the prototype design
Identifying barriers to successful uptake of the system's functions
Evaluating non-response (non-participation) elements in the design
Using evaluation results and outcomes to improve the system layout and architecture
OUR APPROACH
Step 1: Decide on a testing strategy
- Technical review of the solution
- Involving both developers and users
- Starting from the ‘inside’
- Define the boundaries and scope
Step 2: Prepare the ‘battle plan’
- Define the types, objectives, and target users for testing
- Define the process of task scenarios for testing
EVALUATION METHODS
TRIANGULATION
Theory: Cognitive Walkthrough
Field: User Testing
Expert: Heuristic Evaluation
Theory Based - Cognitive Walkthrough: Measures the usability aspect by collecting empirical data on task breakdown and recognizing the sequence/path taken by the user.
Field Based - User Testing: Observation of users in their home environment. A basic structure is kept as a guideline; it is a user-centric approach.
Expert Based - Heuristic Evaluation: Identifies usability problems based on established human-factors principles. The method provides recommendations for design improvements.
COGNITIVE WALKTHROUGH
[Figure: The stages in a cognitive walkthrough and the dependencies between stages (Preparation Phase, Execution Phase)]
TASK-BASED WALKTHROUGHS: APPROACH
Users identified: what knowledge, skills, and experience will they have?
Tasks identified: a set of representative tasks, and the sequence of actions needed to achieve each task
PROCESSING MODEL OF HUMAN COGNITION
1) User sets a goal to be accomplished with the system
2) User searches the interface for currently available actions
3) User selects the action that seems likely to make progress toward the goal
4) User performs the selected action and evaluates the system's feedback
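The four-stage cycle above can be sketched as a simple loop. This is a minimal illustration, not the formal walkthrough procedure; the two-screen interface, action names, and progress scores below are entirely hypothetical.

```python
# Minimal sketch of the four-stage action cycle from the processing model.
# The interface model, action names, and scores are hypothetical.

def run_cycle(goal, interface, score):
    """Select and perform actions until the goal action is executed.

    goal      -- name of the action that completes the task (stage 1)
    interface -- dict: current screen -> list of available actions
    score     -- dict: action -> estimated progress toward the goal
    """
    screen, path = "home", []
    while True:
        actions = interface[screen]            # 2) search the interface
        action = max(actions, key=score.get)   # 3) pick the most promising action
        path.append(action)                    # 4) perform it and evaluate
        if action == goal:
            return path                        # feedback: goal reached
        screen = action                        # feedback: a new screen opens

# Hypothetical two-screen interface for the task "save an event".
interface = {"home": ["open_menu", "help"], "open_menu": ["save_event", "help"]}
score = {"open_menu": 2, "help": 1, "save_event": 3}
print(run_cycle("save_event", interface, score))  # ['open_menu', 'save_event']
```

A user who repeatedly picks low-scoring actions (or gives up) would surface as one of the problem criteria listed below.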
PREDEFINED PROBLEM CRITERIA
1. User articulates a goal and cannot succeed in attaining it within 2 minutes
2. User explicitly gives up
3. User articulates a goal and has to try three or more actions to find a solution
4. User produces a result different from the task given
5. User expresses surprise
6. User expresses some negative affect or says something is a problem
7. User makes a design suggestion
KLM: KEYSTROKE LEVEL MODEL
K Press a key or button
P Point to a target on the display
H Home hands on input device
D Draw a line segment
M Mentally prepare for an action
R (system response time)
KLM Operators: Description
Operation | Operator | Time (sec)
Reach the screen | H | 0.20
Move finger to ‘Event’ (menu) | P(menu item) | 1.10
Click on ‘Event’ | K | 0.40
Move finger to highlighted date | P(field) | 0.15
Click on field | K | 0.30
Total | | 2.15
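The KLM total is simply the sum of the operator times; a small sketch reproducing the calculation above (operator times taken directly from the table, not a general KLM tool):

```python
# KLM estimate for the sample task: clicking a date field via the 'Event' menu.
# Operator times (seconds) are the ones from the table above.
steps = [
    ("H", 0.20),             # reach the screen (home hands on input device)
    ("P(menu item)", 1.10),  # move finger to 'Event' (menu)
    ("K", 0.40),             # click on 'Event'
    ("P(field)", 0.15),      # move finger to highlighted date
    ("K", 0.30),             # click on field
]
total = round(sum(t for _, t in steps), 2)
print(total)  # 2.15
```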
Chart data: users A–E across the three tasks (View Event, Leave Message, Add new task):
User | View Event | Leave Message | Add new task
A | 4.3 | 2.5 | 3.5
B | 2.4 | 4.4 | 1.8
C | 2 | 2 | 3
D | 3.2 | 2.75 | 1.4
E | 2.7 | 3.2 | 1.9
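A quick way to summarize the per-user figures above is the mean per task. This sketch assumes the three columns correspond to the three legend tasks and that the values are times in seconds; both are inferences from the chart data.

```python
from statistics import mean

# Per-user values for the three tasks, copied from the chart data above.
# Assumption: columns map to the legend tasks and values are seconds.
times = {
    "A": [4.3, 2.5, 3.5],
    "B": [2.4, 4.4, 1.8],
    "C": [2.0, 2.0, 3.0],
    "D": [3.2, 2.75, 1.4],
    "E": [2.7, 3.2, 1.9],
}
tasks = ["View Event", "Leave Message", "Add new task"]
for i, task in enumerate(tasks):
    avg = mean(row[i] for row in times.values())
    print(f"{task}: {avg:.2f}")
```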
USER TESTING
Set task scenarios for user testing
Recruit prospective users
Record each session of testing
Observe and analyse the data
Follow up with a cooperative evaluation questionnaire
THINK ALOUD PROTOCOL : APPROACH
1. “Create a new task”: to ascertain the path the user would follow and their understanding of the system layout
2. “Leave a direct message”: to test the user's intuition when no clear path has been defined
3. “Create a new event”
4. “Check for reminders”
TASK SCENARIO
USERS TESTED
"Anything that can go wrong will go wrong” - Murphy’s Law
Measures of usability should cover:
• Effectiveness (the ability of users to complete tasks using the system, and the quality of the output of those tasks)
• Efficiency (the level of resources consumed in performing tasks)
• Satisfaction (users' subjective reactions to using the system)
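These three measures can be operationalized from raw session records; a minimal sketch, assuming hypothetical session data and field names (completion flag, task time, 1–5 rating):

```python
# Sketch: computing the three usability measures from session records.
# The session data and field names below are hypothetical.
sessions = [
    {"completed": True,  "time_s": 45,  "satisfaction": 4},
    {"completed": True,  "time_s": 60,  "satisfaction": 3},
    {"completed": False, "time_s": 120, "satisfaction": 2},
]
n = len(sessions)
effectiveness = sum(s["completed"] for s in sessions) / n    # completion rate
efficiency = sum(s["time_s"] for s in sessions) / n          # mean task time (s)
satisfaction = sum(s["satisfaction"] for s in sessions) / n  # mean 1-5 rating
print(effectiveness, efficiency, satisfaction)
```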
SCALES
Usefulness (1–5): Very Useful / Useful / Somewhat Useful / Not Useful / Poor
Ease of Use (1–5): Very Easy / Easy / Somewhat Easy / Somewhat Difficult / Difficult
QUESTIONNAIRE
Scores per user (A–E) for questions 1–10:
Q: 1 2 3 4 5 6 7 8 9 10
A: 3 2 3 2 4 2 3 2 3 2
B: 5 2 4 1 4 1 1 1 4 1
C: 4 2 4 2 5 2 2 1 3 1
D: 4 2 5 1 4 2 1 2 4 1
E: 5 1 5 1 3 1 1 1 5 1
[Chart: questionnaire responses plotted on the 1–5 scale]
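The questionnaire scores above can be aggregated per question to see which items scored low across users. A minimal sketch (the mapping of questions to the two scales is not given, so this simply averages each question across the five users):

```python
from statistics import mean

# Questionnaire scores per user (A-E) across questions 1-10, from the table.
scores = {
    "A": [3, 2, 3, 2, 4, 2, 3, 2, 3, 2],
    "B": [5, 2, 4, 1, 4, 1, 1, 1, 4, 1],
    "C": [4, 2, 4, 2, 5, 2, 2, 1, 3, 1],
    "D": [4, 2, 5, 1, 4, 2, 1, 2, 4, 1],
    "E": [5, 1, 5, 1, 3, 1, 1, 1, 5, 1],
}
per_question = [mean(u[i] for u in scores.values()) for i in range(10)]
print([round(q, 1) for q in per_question])
```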
USER QUOTES
“Good white space – links are obvious – clearly labeled – browsing divided very nicely – good subcategories.”
“What is ‘camera’ icon for? It was the first choice I noticed.”
“I think the designers have done well”
“I don’t know which button to click, with the options present in more than one place on the main screen.”
HEURISTIC EVALUATION
SEVERITY RATING SCALE
0: I don't agree that this is a usability problem
1: Cosmetic problem only; need not be fixed unless extra time is available on the project
2: Minor usability problem; fixing this should be given low priority
3: Major usability problem; important to fix, so should be given high priority
4: Usability catastrophe; imperative to fix this before the product can be released
SEVERITY RATINGS
Heuristic | Avg. Severity | Std Deviation
1. Visibility of system status | 2.8 | 0.45
2. Match between system and the real world | 0.4 | 0.55
3. User control and freedom | 1.8 | 0.55
4. Consistency and standards | 1 | 0
5. Error prevention | 1 | 0
6. Recognition rather than recall | 0 | 0
7. Flexibility and efficiency of use | 1 | 0
8. Aesthetic and minimalist design | 1.4 | 0.89
9. Help users recognize, diagnose and recover from errors | 0 | 0
10. Help and documentation | N/A | N/A
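The averages and standard deviations above come from per-evaluator severity ratings. The individual ratings are not in the deck, so the set below is hypothetical; it is chosen only to show the calculation (and happens to reproduce the first row, avg 2.8, sd 0.45):

```python
from statistics import mean, stdev

# Hypothetical per-evaluator severity ratings (0-4 scale) for one heuristic.
# These specific values are illustrative, not the actual collected data.
ratings = {"Visibility of system status": [3, 3, 3, 3, 2]}
for heuristic, vals in ratings.items():
    print(f"{heuristic}: avg {mean(vals):.1f}, sd {stdev(vals):.2f}")
```

Sample standard deviation (`stdev`) is used here, matching the table's non-zero spreads for ratings on a discrete 0-4 scale.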
[Chart: average severity per heuristic, 0–3]
THEMATIC PROBLEMS IDENTIFIED
By heuristic:
1. Visibility of system status: informative system feedback
2. User control and freedom: help; lack of privacy control
3. Flexibility and efficiency of use: lack of accessibility options
4. Aesthetic and minimalist design: confusion with the 3-level navigation system
COMPILED EVALUATION ANALYSIS
IMPLEMENTING CHANGES
• Changing the appearance of menu buttons, i.e., from text to icons
• Modifying the notification bar to no longer appear as buttons
(Goal: clearly define the functionality difference of the home-screen buttons)
• Adding ‘narration’ to the sound feature
(Goal: add more functionality to the accessibility options)
• Making it illustrative, with tutorials for functions
(Goal: incorporate a better help function)
• Improve system response further
• Implement privacy options
CONCLUSION
Evaluation reveals flaws in the system
Triangulated by three methods: Cognitive Walkthrough, User Testing, Heuristic Evaluation
Formulated the data and analyzed the severity of each problem
Derived possible design and functionality solutions for the next phase of prototype development
SEE YOU WITH OUR REDESIGNED PROTOTYPE
ANY QUESTIONS?