DIY Mobile Usability Testing - SXSW Interactive 2012



This is our DIY Mobile Usability Testing presentation in its SXSW Interactive 2012 incarnation.

Slide transcript

  • 1. Thanks for coming!
  • 2. Bernard, packet core engineer at NSN
  • 3. Belén, interaction designer at Intel's OTC
  • 4. #SXdiymut
  • 5–6. "usability testing: a process that employs people as testing participants who are representative of the target audience to evaluate the degree to which a product meets specific usability criteria." (Handbook of Usability Testing, 2nd ed., J. Rubin and D. Chisnell)
  • 7. Please, stand up.
  • 8. Take out your cellphone.
  • 9. Sit down if you don't have a US cellphone with a data plan.
  • 10. Sit down if you don't like beer.
  • 11. Sit down if you are absolutely terrified by the idea of being our test subject.
  • 12. Why recording? A memory aid, and a powerful communication tool.
  • 13. Actions / reactions.
  • 14. dut = mut
  • 15. dut = mut, where: dut = desktop usability testing; mut = mobile usability testing
  • 16. dut = mut + afec, where: dut = desktop usability testing; mut = mobile usability testing; afec = a few extra challenges
  • 17–18. Which phone? Which context? Which connection?
  • 19–28. Web task success rates: feature phones 38%, smartphones 55%, touch phones 75%. ("Mobile Usability", J. Nielsen's Alertbox, 20 Jul 2009, http://www.useit.com/alertbox/mobile-usability-study-1.html)
  • 29. Handset usability affects test results.
  • 30. Remember: test with participants' own phones; if not possible, include training and warm-up tasks.
  • 31. Which phone? Which context? Which connection?
  • 32–33. Field vs. lab [Field 0, Lab 0] ("It's Worth the Hassle! The Added Value of Evaluating the Usability of Mobile Systems in the Field", C.M. Nielsen, M. Overgaard, M.B. Pedersen, J. Stage, S. Stenild, NordiCHI 2006)
  • 34–35. "The results show that the added value of conducting usability evaluations in the field is very little and that recreating central aspects of the use context in a laboratory setting enables the identification of the same usability problem list." [Field 0, Lab 1] ("Is it Worth the Hassle? Exploring the Added Value of Evaluating the Usability of Context-Aware Mobile Systems in the Field", J. Kjeldskov, M.B. Skov, B.S. Als, R.T. Høegh, 2004)
  • 36–37. "According to our study there was no difference in the number of problems that occurred in the two test settings. Our hypothesis that more problems would be found in the field was not supported." [Field 0, Lab 2] ("Usability Testing of Mobile Applications: A Comparison between Laboratory and Field Testing", A. Kaikkonen, T. Kallio, A. Kekäläinen, A. Kankainen, M. Cankar, Journal of Usability Studies, 2005)
  • 38–39. "Evaluations conducted in field settings can reveal problems not otherwise identified in laboratory evaluations." [Field 1, Lab 2] ("It's Worth the Hassle! The Added Value of Evaluating the Usability of Mobile Systems in the Field", C.M. Nielsen, M. Overgaard, M.B. Pedersen, J. Stage, S. Stenild, NordiCHI 2006)
  • 40–41. "The analyses of the comparison between usability testing done in two different settings revealed that there were many more types and occurrences of usability problems found in the field than in the laboratory. Those problems discovered tend to be critical issues." [Field 2, Lab 2] ("Usability Evaluation of Mobile Device: a Comparison of Laboratory and Field Tests", H.B. Duh, G.C.B. Tan, V.H. Chen, MobileHCI 2006)
  • 42. Field vs. lab ("It's Worth the Hassle!", C.M. Nielsen et al., NordiCHI 2006)
  • 43. Field vs. lab: EXPERTS DISAGREE ("It's Worth the Hassle!", C.M. Nielsen et al., NordiCHI 2006)
  • 44. ...but they all agree: "evaluations in the field (are) more complex and time-consuming" ("It's Worth the Hassle!", C.M. Nielsen et al., NordiCHI 2006)
  • 45. ...but they all agree: "testing in the field requires double the time in comparison to the laboratory" ("Usability Testing of Mobile Applications", A. Kaikkonen et al., Journal of Usability Studies, 2005)
  • 46. ...but they all agree: "field-based usability studies are not easy to conduct. They are time consuming and the added value is questionable." ("Is it Worth the Hassle?", J. Kjeldskov et al., 2004)
  • 47. Testing in the lab is better than no testing.
  • 48. Remember: for most software, lab testing is fine; if you must do field testing, do it late, plan and run pilot tests, and be prepared (like the Scouts).
  • 49. Which phone? Which context? Which connection?
  • 50. Remember: do not test over wi-fi; cover participants' data costs.
  • 51. dut = mut + afec, where: dut = desktop usability testing; mut = mobile usability testing; afec = a few extra challenges
  • 52. dut = (mut + afec) × tsdohoeaygtrtwt, where: dut = desktop usability testing; mut = mobile usability testing; afec = a few extra challenges
  • 53. ...and tsdohoeaygtrtwt = the small detail of how on earth are you going to record the whole thing
  • 54. Why recording? A memory aid, and a powerful communication tool.
  • 55. Four approaches to the small detail of how on earth are you going to record the whole thing.
  • 56–59. Approach 1: wearable equipment. ("Methods and techniques for field-based usability testing of mobile geo-applications", I. Delikostidis (2007), International Institute for Geo-Information Science and Earth Observation, Enschede, The Netherlands)
  • 60. Figure 10: Video recording with third-person view of participants and close-up view of PDA. Note that the camera focused on the device screen is turned 90 d