

January 23, 2015 | Convey UX

Jennifer Romano Bergstrom, PhD UX Researcher @romanocog

Unbiased Methods to Understand the User Experience


The path that got me here

Timeline: 1998–2002 | 2003–2009 | 2008–2011 | 2011–2015 | 2014–present: uxpa2015.org


BIAS: prejudice in favor of or against one thing, person, or group compared with another, usually in a way considered to be unfair.

https://www.youtube.com/watch?v=G0ZZJXw4MTA

The way we ask questions impacts answers.


User Experience (UX) Measures

Difficulty Ratings

A. How difficult was it for you to complete the task?
B. Was the task difficult for you to complete?
C. How easy or difficult was the task to complete?
D. Please rate your difficulty in completing the task.

1. Extremely difficult
2. Very difficult
3. Moderately difficult
4. Slightly difficult
5. Not difficult at all

SELF-REPORT •  Difficulty ratings •  Satisfaction ratings •  Think-aloud protocol •  Debriefing interview •  Probing

@romanocog #ConveyUX


User Experience (UX) Measures

Think-Aloud Protocol

A. Concurrent think aloud – while completing tasks.
B. Retrospective think aloud – while watching a video replay.
C. Retrospective think aloud – without video replay.
D. A mix of concurrent and retrospective think aloud.


Olmsted-Hawala, E. L. & Romano Bergstrom, J. C. (2012). Think-aloud protocols: Does age make a difference? Proceedings from the Society for Technical Communication Summit, May 2012, Chicago, IL.



User Experience (UX) Measures

Probing

A. While the participant is completing tasks.
B. At the end of the task.
C. At the end of the session.




https://www.youtube.com/watch?v=jyMLDN9UOrE

Interrupting interferes with natural actions.


Self-report data is great…

but it is not enough.

https://www.youtube.com/watch?v=Oons6amow3I

People cannot complete dual tasks well.


User Experience (UX) Measures

Combining Measures

A. Concurrent think aloud and accuracy.
B. Concurrent think aloud and time to complete tasks.
C. Retrospective think aloud and accuracy.
D. Retrospective think aloud and time to complete tasks.


OBSERVATIONAL •  First click accuracy •  Task accuracy •  Time to complete tasks •  Click patterns •  Conversion rate

•  Olmsted-Hawala, E. L. & Romano Bergstrom, J. C. (2012). Think-aloud protocols: Does age make a difference? Proceedings from the Society for Technical Communication Summit, May 2012, Chicago, IL.

•  Olmsted-Hawala, E. L., Murphy, E. D., Hawala, S., & Ashenfelter, K. T. (2010). Think-aloud protocols: A comparison of three think-aloud protocols for use in testing data-dissemination web sites for usability. Proceedings from CHI, April 2010, Atlanta, GA.



Higher accuracy and satisfaction when moderators “coach.”


Self-report data combined with observational data is very useful…

but there is a filter.

https://www.youtube.com/watch?v=Oons6amow3I

People think they make logical decisions.


User Experience (UX) Measures


IMPLICIT •  Eye tracking •  Electrodermal activity (EDA) •  Behavioral analysis •  Verbalization analysis •  Pupil dilation

Combining Measures

A. Eye tracking with concurrent think aloud.

B. Eye tracking with retrospective think aloud.

C. Eye tracking and time to complete tasks.

D. Eye tracking alone.

•  Romano Bergstrom, J. C. & Strohl, J. (2014). Improving government websites and surveys with usability testing: A comparison of methodologies. Proceedings from the Federal Committee on Statistical Methodology (FCSM) Conference, Nov 2013, Washington, DC.

•  Romano Bergstrom, J. C. & Olmsted-Hawala, E. L. (2012). Effects of Age and Think-Aloud Protocol on Eye-Tracking Data and Usability Measures. Poster presentation at Usability Professionals Association (UPA) Conference, Las Vegas, NV, June 2012.



User Experience (UX) Measures

Triangulated Approach

•  Self-report metrics tell us why participants think they focus on certain aspects.

•  Observational metrics tell us how participants navigate and interact.

•  Eye tracking tells us what, how long, and how often participants focus on design elements.

SELF-REPORT •  Difficulty ratings •  Satisfaction ratings •  Think-aloud protocol •  Debriefing interview •  Probing

OBSERVATIONAL •  First click accuracy •  Task accuracy •  Time to complete tasks •  Click patterns •  Conversion rate

IMPLICIT •  Eye tracking •  Electrodermal activity (EDA) •  Behavioral analysis •  Verbalization analysis •  Pupil dilation



What might this look like?


Methods

•  Recruit 30 “expert” and 30 “novice” users.
•  Tasks using the web and the app.
•  Satisfaction & knowledge questionnaire.
•  Debriefing interview.

•  Recruit a mix of iOS and Android users.
•  In-person moderated sessions (or remote sessions).
•  Half of the participants will think aloud; half will work in silence.
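A recruiting plan like this can be sketched as a small assignment script. This is a hypothetical illustration: the group sizes and the think-aloud/silent split come from the slide, but the participant ID scheme, seed, and function name are assumptions.

```python
import random

def assign_participants(n_expert=30, n_novice=30, seed=7):
    """Counterbalance think-aloud vs. silent conditions within each expertise group."""
    rng = random.Random(seed)  # fixed seed so the roster is reproducible
    roster = []
    for group, n in (("expert", n_expert), ("novice", n_novice)):
        # Half of each group thinks aloud; half works in silence.
        conditions = ["think-aloud"] * (n // 2) + ["silent"] * (n - n // 2)
        rng.shuffle(conditions)
        for i, condition in enumerate(conditions, start=1):
            roster.append({
                "id": f"{group[0].upper()}{i:02d}",  # e.g. E01, N14 (hypothetical IDs)
                "group": group,
                "condition": condition,
            })
    return roster

roster = assign_participants()
print(len(roster))  # 60 participants in total
```

Shuffling within each group keeps the think-aloud/silent split balanced for experts and novices separately, so condition effects are not confounded with expertise.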



Example Tasks:
•  Download the app to your mobile device.
•  Start a message with three friends.
•  Check for new messages.
•  Delete a message.
•  Send a photo message.
•  Send a text message to a contact.
•  Change your notifications.



Example Satisfaction items:
•  How likely are you to use the app in the future? (1: not likely at all – 5: extremely likely)
•  How valuable is the app to you? (1: not valuable at all – 5: extremely valuable)

Example Knowledge items:
•  Can others see when you are online? (yes, no, not sure)



Example debriefing items:
•  You seemed to hover your mouse over here often. Can you tell me about that?
•  Tell me what you thought about downloading the app.
•  What would make this experience better for you?
•  What is the one thing holding you back from using this tool?
•  Please rate your overall satisfaction in using this app.
   1. Extremely satisfied
   2. Very satisfied
   3. Moderately satisfied
   4. Slightly satisfied
   5. Not satisfied at all



UX Data                                       Qualitative   Quantitative
Self-Report
  Satisfaction and knowledge questionnaires   YES           YES
  Verbal think aloud (half of participants)   YES           NO
  Moderator follow-up                         YES           NO
Observational
  Time on page/task                           NO            YES
  Selection/click behavior                    YES           NO
  Success/fail rate                           NO            YES
  Conversion rate                             YES           YES
Implicit
  Verbalization analysis                      YES           YES
  Eye tracking                                YES           YES
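As a minimal sketch of how the quantitative measures above could be computed, assuming hypothetical session records (the field names and sample values are illustrative, not from the talk):

```python
from statistics import mean

# Hypothetical logged sessions; the fields are assumptions for illustration.
sessions = [
    {"participant": "E01", "success": True,  "time_s": 42.0, "satisfaction": 5},
    {"participant": "E02", "success": False, "time_s": 97.5, "satisfaction": 2},
    {"participant": "N01", "success": True,  "time_s": 63.2, "satisfaction": 4},
]

def summarize(sessions):
    """Aggregate quantitative UX measures: success/fail rate, time on task, satisfaction."""
    return {
        "success_rate": sum(s["success"] for s in sessions) / len(sessions),
        "mean_time_s": mean(s["time_s"] for s in sessions),
        "mean_satisfaction": mean(s["satisfaction"] for s in sessions),
    }

print(summarize(sessions))
```

Summaries like these cover only the quantitative columns; the qualitative rows (think-aloud comments, debriefing answers) still need human coding before they can be counted.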



Other Considerations

•  Iterative testing: keep tasks and questions the same.
•  We cannot objectively test our own designs.
•  Social validation, pleasing the researcher, acing the test.
•  Age-related differences in performance.
•  Our job is not to explain the product.
•  Coaching = leading = waste of time.

•  Romano Bergstrom, J. C., Olmsted-Hawala, E. L., Chen, J. M., & Murphy, E. D. (2011). Conducting iterative usability testing on a Web site: Challenges and benefits. Journal of Usability Studies, 7, 9-30.

•  Romano Bergstrom, J. C., Olmsted-Hawala, E. L. & Bergstrom, H. C. (2014). Older adults fail to see the periphery during website navigation. Universal Access in the Information Society, in press.

•  Romano Bergstrom, J. C., Olmsted-Hawala, E. L. & Jans, M. E. (2013). Age-related differences in eye tracking and usability performance: Web site usability for older adults. International Journal of Human-Computer Interaction, 29, 541-548.



[Eye-tracking gaze plots: younger adults, middle-age adults, older adults]


Are you ready to be unbiased?

Thank you!

Jennifer Romano Bergstrom @romanocog [email protected]

#ConveyUX