
Dissertation Task 2 0834060

    Department of Information Systems and Computing

    MSc Information Systems Management

    Academic Year 2009-2010

    MAKING TOUCH-BASED MOBILE PHONES ACCESSIBLE FOR THE VISUALLY IMPAIRED

    Alexander Dreyer Johnsen - 0834060

    A Dissertation submitted in partial fulfillment of the requirement for the degree of Master of Science

    Brunel University Department of Information Systems and Computing Uxbridge, Middlesex UB8 3PH United Kingdom Tel: +44 (0) 1895 203397 Fax: +44 (0) 1895 251686


    ABSTRACT

    The mobile phone enjoys increased popularity, providing new means of connectivity and

functionality. Today, most phones come equipped with touch-based screens, enabling

    the user to interact in an easier and more efficient way, compared to standard buttons.

    However, such screens require visual navigation, ruling out access for the visually

    impaired; evidently, modern phones are not designed for this user group.

    By analyzing the mobile market, available solutions, and current technology, it was

found that neither society nor the manufacturers of products and services are designing

    for the visually impaired. Thus, this group is denied access to the numerous services and

    possibilities sighted people enjoy.

    However, it is possible to operate mobile phones through the use of applications called

screen readers; still, these applications have proven to be ineffective and less than user-friendly on touch screens. Hence, this dissertation sets out to find an alternative

    approach; to construct a solution that will make touch-based mobile phones accessible

    for the visually impaired.

    Design research was chosen as the methodology for the project. Design research

    highlights the importance of developing a solution over the course of several iterations,

and of performing product evaluation using external participants. A total of five iterations

    were carried out, resulting in several artifacts and a prototype for a user interface. The

prototype was designed to replace the phone's own user interface and provide an easy and intuitive way of operating a touch-based mobile phone.

    Through the process of developing the user interface, virtual prototypes and other

    artifacts were created. The virtual prototypes turned out to be of great advantage,

    communicating the vision and potentials of the final product to the stakeholders. In

    addition, the experience from the project shows that a successful development project

    should produce several iterations, have well-documented artifacts and perform external

    user testing.

The new user interface was developed for the Android OS to replace the phone's own user interface. Operation relies on voice and haptic feedback, where the user receives

    information when tapping or dragging the finger across the screen. The proposed

solution is unique in many ways: it keeps gestures to a minimum, it does not rely on

    physical keys, and it presents a menu layout similar to most Nokia mobile phones.


    ACKNOWLEDGEMENTS

    The process of writing this dissertation has been long and exhausting, and would not

    have been possible without the involvement and support from several people; for this I

    am grateful.

    First, I would like to thank my supervisor Bendik Bygstad for his guidance and academic

    support throughout the project. It has been indispensable and a key factor for the

    accomplishment of this dissertation.

    Secondly, I would like to thank Tor Ulland at Huseby Resource Centre for the blind,

    for taking an interest in the project and for sharing his knowledge and insights on life

without vision. Ulland also recruited all of the participants for the evaluations, ensuring the completion of the project.

    Without the involvement of May-Britt Haug, at The Norwegian Association of the Blind

    and Partially Sighted, I would not have had the possibility of getting in touch with

    Huseby or Tor Ulland. Thank you for believing in the project.

    I would also like to thank Magne Gabrielsen, Gaby Groff-Jensen and Knut Beck at

    SmartPhones Telecom for believing in the project, providing both financial and

developmental resources. Thanks also go to Kim Ruben Teigen, whose development skills ensured the realization of the User Interface, a key contribution to the success of the project.

    A special thank you to my wife Ragnhild Eg, for her continued support throughout the

    dissertation. You have stayed up countless nights proofreading, and have continued to

    support me during the entire process.

    I would also like to thank my family, friends, fellow students and colleagues for the

    continued support during the process of writing the dissertation.

    Last, but not least, I would like to thank the anonymous participants for spending their

    spare time on evaluating and providing feedback on the User Interface. Not only once,

    but twice! Thank you!

TOTAL NUMBER OF WORDS: 13,031

    I certify that the work presented in the dissertation is my own unless referenced

    Signature: .........................................

    Date............31.01.2011....................


    TABLE OF CONTENTS

Chapter 1: Introduction .......... 7
  1.1 Problems that people with vision loss face .......... 7
  1.2 Back to society .......... 8
  1.3 Research aim and objectives .......... 8
  1.4 Research approach .......... 9
  1.5 Dissertation outline .......... 9
Chapter 2: Literature review .......... 10
  2.1 Equal opportunities .......... 10
  2.2 The mobile market .......... 11
  2.3 Potential problems with smartphones .......... 11
  2.4 Improving compatibility .......... 11
  2.5 Assistive technology .......... 12
  2.6 Screen reader .......... 12
  2.7 Reading and input of text .......... 13
  2.8 Haptics .......... 15
  2.9 Making a better user interface .......... 15
  2.10 Turning the mobile phone into an assistive aid .......... 16
  2.11 Framework .......... 16
Chapter 3: Research Method .......... 18
  3.1 Design research .......... 18
  3.2 Awareness of problem .......... 20
  3.3 Suggestion .......... 20
  3.4 Development .......... 21
  3.5 Evaluation .......... 22
  3.6 Conclusion .......... 24
Chapter 4: Research results .......... 25
  4.1 Results in regards to development .......... 25
  4.2 Results in regards to testing .......... 29
Chapter 5: Discussion .......... 34
  5.1 Design as an Artifact .......... 34
  5.2 Problem relevance .......... 35
  5.3 Design evaluation .......... 35
  5.4 Research contributions .......... 36
  5.5 Research rigor .......... 36
  5.6 Design as a Search process .......... 36
  5.7 Communication of research .......... 36
  5.8 Summary of discussion .......... 37
Chapter 6: Critical evaluation of research .......... 38
Chapter 7: Conclusion .......... 40
  7.1 Summary of the dissertation .......... 40
  7.2 Future research and development .......... 41
References .......... 42
Appendix A: Ethical approval .......... 52
Appendix B: About the author .......... 53
Appendix C: The mobile market .......... 54
  Appendix C.A - The new players .......... 54
Appendix D: Using the mobile phone for assistive aid .......... 55
  Appendix D.A - Object recognition .......... 55
  Appendix D.B - Navigation .......... 55
Appendix E: Interview with Tor Ulland at Huseby (Statped) .......... 57
Appendix F: Project plan .......... 60
Appendix G: Project meeting reports .......... 61
  Appendix G.A - Initial project workshop meeting, June 14, 2010 .......... 61
  Appendix G.B - Approval of functionality, June 25, 2010 .......... 63
  Appendix G.C - Layout of UI, August 30, 2010 .......... 66
  Appendix G.D - Layout of UI, September 13, 2010 .......... 68
  Appendix G.E - Layout of UI, October 10, 2010 .......... 70
  Appendix G.F - Layout of UI, October 13, 2010 .......... 73
Appendix H: Documentation of UI .......... 76
  Appendix H.A - Documentation of UI, version 0.0.5 .......... 76
  Appendix H.B - Documentation of Conceptual Design .......... 79
  Appendix H.C - Documentation of Second Prototype, round 1 of testing .......... 84
  Appendix H.D - Documentation of Third Prototype, round 2 of testing .......... 87
Appendix I: Development and activity log .......... 90
Appendix J: Information to project participants .......... 96
Appendix K: Consent form .......... 97
Appendix L: Participant survey .......... 98
  Appendix L.A: Participant survey, round 1 .......... 98
  Appendix L.B: Participant survey, round 2 .......... 100
Appendix M: Results from user testing .......... 102
  Appendix M.A: Results from round 1, Overview of all responses .......... 102
  Appendix M.B: Results from round 1, Participant 1 .......... 106
  Appendix M.C: Results from round 1, Participant 2 .......... 108
  Appendix M.D: Results from round 1, Participant 3 .......... 110
  Appendix M.E: Results from round 1, Participant 4 .......... 112
  Appendix M.F: Results from round 2, Overview of all responses .......... 114
  Appendix M.G: Results from round 2, Participant 1 .......... 118
  Appendix M.H: Results from round 2, Participant 2 .......... 120
  Appendix M.I: Results from round 2, Participant 3 .......... 122
  Appendix M.J: Results from round 2, Participant 4 .......... 124
  Appendix M.K: Results from round 2, Participant 5 .......... 126
  Appendix M.L: Results from round 1 and round 2 compared against each other .......... 128
  Appendix M.M: T-Test and Effect sizes .......... 132

    LIST OF TABLES

Table 1 - Framework for the top issues people with vision loss encounter when using mobile phones with touch screen .......... 17

    Table 2 - Design Research Guidelines by Hevner et al. (2006) .......................................... 20

    Table 3 - Tasks the participants were asked to perform ........................................................ 22

    Table 4 - Survey completed by participants .............................................................................. 23

    Table 5 - Additional tasks presented to participants ............................................................... 24

    Table 6 - Additional survey questions ........................................................................................ 24

Table 7 - Summary of survey categories .......... 30
Table 8 - Table showing subjective feedback from round two .......... 33

    Table 9 - Overview on aims and objectives .............................................................................. 38

    LIST OF FIGURES

    Figure 1- Design Research process model by Vaishnavi and Kuechler (2007) ............. 19

    Figure 2 - Some of the menus available in the UI ................................................................... 26

    Figure 3 - List navigation in the UI ............................................................................................. 26

    Figure 4 - Conceptual UI prototype for people with vision loss ......................................... 27

    Figure 5 - Flow map for navigation in the UI........................................................................... 28

    Figure 6 - Conceptual model of application .............................................................................. 28

    Figure 7 - Graph displaying mean scores from the first and second round of testing .. 30

Figure 8 - Graph displaying a comparison between results from first and second round of testing on the menu system .......... 31
Figure 9 - Graph displaying a comparison between results from first and second round of testing on the complete solution .......... 32


    CHAPTER 1: INTRODUCTION

    1.1 Problems that people with vision loss face

    According to the World Health Organization (WHO) (2009), around 314 million people are

    visually impaired, of which approximately 45 million are blind. Of the total number, 12

million are children and 82% are aged 50 or older; furthermore, women have the greatest risk of becoming visually impaired. Developing countries have the largest representation of visually impaired people, accounting for around 87% of cases. Visual impairment and blindness can be grouped under the collective term vision loss, which refers to either the loss of the ability to see or a reduction in vision. The terms low vision, partially sighted, or visually impaired can be used for a person who has not lost the capability to see, but whose vision is significantly

    reduced compared to normal vision (Colenbrander, 2002).

    Both groups deal with daily situations that vary greatly depending on the severity of their

    visual impairment. A blind person will have to rely on other senses and inputs, such as touch

    and hearing, as well as assistive aids, such as the long cane and the guide dog. On the other

    hand, an individual with low vision can be aided by visual enhancements, like large print,

    magnifiers and illumination (Colenbrander, 2002).

    Common for both groups is their encounters with situations and problems that a sighted

    person would not consider an issue. This can be seen in everyday situations, where the

    environment is designed for the sighted, not someone with vision loss. Accessing stores and

businesses, or simply crossing the street, can turn out to be challenging tasks. Much of our communication takes place through signs and signals; visual symbols that a person with

vision loss cannot see. Examples include danger signs, signs providing directions, road blocks, or

    general information on public transportation. At worst, situations like these may prevent a person

    from wanting to step outside, hence limiting contact with the surrounding world (Tjan et al.

    2005; Joseph, 2009; Maines, 2008; Appendix E).

    The same restrictions apply to other ordinary situations; TVs, radios, mobile phones, stoves and ATMs are just some examples of equipment that are becoming increasingly advanced. In the past, such items were mostly equipped with buttons; although not originally designed for people

    with vision loss, they made it possible to memorize the steps in order to access a certain feature.

With today's modern designs, where touch screens show dynamic menus, it has become impossible for a person with vision loss to memorize and use such equipment. It can be argued

    that companies do not consider this user group when designing new products (Picard, 2010;

    Appendix E).

    Another issue facing people with vision loss is the difficulty some sighted people have when

    encountering blind people, making it hard for the latter to develop new relationships. Most jobs

    are also designed for the sighted. Due to this, blind people do not meet the same expectations and

    are often relegated to specific roles. This is an undesirable situation as employers may lose out

    on valuable skills and those who lack vision are left unable to prove their potential. Society

should strive for equality and for integrating people with vision loss into the working

    environment (The National Federation of the Blind of Connecticut, n.d.).


    1.2 Back to society

    Several ideologies and assumptions about blindness and rehabilitation exist. One of the more

    accepted is the restorative approach, embraced by Father Thomas Carroll (American Printing House for the Blind, 2010). This approach works under the idea that most blind people could,

with the help of professional counseling and training, live full lives. Seven basic losses are highlighted, and the focus is on restoring these through training with the white cane or a guide dog,

    learning to read Braille and using assistive technology (R. A. Scott, 1995). With the correct

    equipment and training, a person with vision loss can master ordinary tasks like operating a

    computer, reading a book or using mobile phones (Appendix E).

People with vision loss can also participate in sports; blind soccer is a good example of how correct training in applying other senses can enable a blind person to participate in

    team activities. Players are able to play soccer in almost the same way as a sighted person by

    relying on shouted commands and specially designed soccer balls, containing ball bearings

    (Malinowski, 2010).

    1.3 Research aim and objectives

    While it is clear that the technology and the will to integrate those with vision loss into society

    are strongly present, society is not yet properly adapted. Several aspects can be improved,

    where one is the mobile phone.

The mobile phone offers several benefits, but perhaps the most important is the easy

    access to communication with friends, family and the surrounding world. However, most new

    mobile phones are designed for visual navigation, rendering the mobile phone inaccessible for

    people with vision loss. Designing a phone that is inaccessible for people with vision loss can

at worst result in loss of contact with society; thus, the author of this project believes in the

    importance of making mobile phones not only accessible, but easy to use, for people with

    vision loss.

    The aim of this dissertation is to present, develop and evaluate a new User Interface (UI) for

touch-based mobile phones, which will make this type of phone accessible to people with

    vision loss.

    Based on the aim of developing a UI, the following objectives are presented:

• Conduct a literature review on issues related to the operation of touch-based mobile phones and the available assistive aids and technologies
• Create a framework for designing the UI
• Perform an evaluation of the UI through several iterations, both internal and external
• Discuss the results from the project


    1.4 Research approach

The project uses the Design research method for developing and testing the UI for touch-based mobile phones. Design research is oriented towards development and testing,

    encouraging projects to create and deliver several versions of a solution, where major releases

    are tested on external participants to provide feedback (Hevner et al. 2006; Vaishnavi &

    Kuechler, 2007).

    Two versions of the solution were tested on external users, allowing them to provide feedback

    and comments on the interfaces. Feedback from the first version was used to improve the

second version.

1.5 Dissertation outline

    The dissertation report is organized as follows:

    Chapter 2: Presentation of literature, research and the current mobile market relating to people

    with vision loss.

    Chapter 3: The research method Design research is outlined along with documentation of the

    steps followed throughout the project.

Chapter 4: Research results are presented, analyzing the differences between the two tested

    versions.

    Chapter 5: Critical discussion comparing the findings from the literature review with the

    results from the evaluation of the UI.

    Chapter 6: The processes and the research methodology are critically evaluated.

    Chapter 7: The report concludes with a summary of the previous discussions and a highlight

    of the contributions to the existing research field, with future research suggestions.


    CHAPTER 2: LITERATURE REVIEW

The research field on vision loss encompasses several areas of interest; this dissertation focuses on how people with vision loss can operate mobile phones with touch screens. Hence, this chapter is divided into several subjects, starting with the rights of people with vision loss, followed by a description of the current mobile market, before presenting available assistive aids and technology. The chapter ends with a framework used to design the UI.

2.1 Equal opportunities

    It goes without saying that people with vision loss have the same rights as sighted people; yet

    the misconception that vision loss reduces work efficiency is a common prejudice (Wall,

    2003). A Norwegian report on job opportunities for people with disabilities concluded that

    this user group has the largest share of unemployed potential workers compared to the rest of

the population (ECON, 2003). However, a different report, which looks into people's experiences of moving from school to working life, suggests that part of the problem is based on a lack of

    knowledge and improper administration by government organizations (Berge, 2007).

Several organizations are working towards equal rights for people with vision loss: eliminating prejudice, promoting integration with society, achieving equal rights and benefits, and providing aid to countries and people struggling with diseases that cause blindness.

The World Blind Union (WBU) (n.d.) is an internationally recognized organization that represents

    160 million people with vision loss in 177 member countries; as a universal voice, it aims to

    achieve equal rights and opportunities in all aspects of society. However, WBU does not

    provide direct services such as training, guidance and access to assistive technology (AFB,

    n.d.; NABP, n.d.). These services are designated to local and national organizations, like the

    American Foundation for the Blind and The Norwegian Association of the Blind and Partially

    Sighted (NABP).

    Vision 2020 is a joint cooperation between the WHO, the International Agency for the

    Prevention of Blindness (IAPB) and professionals within the field. The main focus is to

eliminate the main causes of avoidable blindness by the year 2020 and to prevent 100 million

    people from becoming blind (Vision 2020, 2009; International Agency for the Prevention of

    Blindness, 2010).

    Organizations are not the only means for support for people with vision loss. Several

    countries have also implemented rules to protect the rights of people with vision loss. In the

    United Kingdom, the Disability Discrimination Act states that a person should not be treated

less favorably because of his or her disability. The European Union has introduced a similar

    act, which directs its member countries to introduce measures to reduce discrimination against

    people with disabilities (Wall, 2003). Norway has taken it a step further with the introduction

of a new discrimination and accessibility law, which states that all new products marketed

    towards the general public should be equally accessible to all, regardless of personal

    limitations or handicaps (Steria, 2009).


    2.2 The mobile market

    The mobile phone market enjoys increased popularity, with market shares expanding every

day (IDC, 2010); it is now considered larger than the PC market (Neset, 2010). The mobile market consists mainly of two types of mobile phones: feature phones and smartphones. Where feature phones are considered low-end phones, with limited features and computing

    power (Nusca, 2010; Wikipedia, 2010a), smartphones offer more advanced computing power

    and operating systems, as well as greater connectivity; they are essentially small mobile

    computers (Wikipedia, 2010b).

    During the first quarter of 2010, a total of 314.7 million mobile phones were sold to end users

    worldwide, with smartphones making up 54.3 million of the sales (Gartner, 2010b); this is an

increase of 17% compared to the same period in 2009. Although feature phones are still

    considered the platform with the largest sales volume, smartphones have had a positive

    increase and sales are expected to increase further in the coming years (IDC, 2010).

According to Gartner (2010), the top five OSs for smartphones are: Symbian (44.3%), Blackberry (19.4%), Apple iOS (15.4%), Google Android (9.6%) and Windows Phone (6.8%). Apple iOS and Google Android are currently the fastest growing (The Nielsen Company, 2010), and it is estimated that the Android OS will become the second largest OS worldwide in 2010 and will challenge Symbian's top-ranking position in 2014 (Gartner, 2010a). The numbers show that smartphone sales are increasing, suggesting that the market wants more advanced phones, with more functionality and better connectivity.

A more detailed view of the mobile market and its actors is available in Appendix C.

2.3 Potential problems with smartphones

    While smartphones come with a range of additional features, there are drawbacks. Studies

    have shown that the majority of users only take advantage of a small portion of the

    functionalities available (Gomns, 2005). In addition, the touch screen can make it more

    difficult to type, as no physical feedback is provided. Nevertheless, the increasing sales

    indicate that most sighted people do not consider these factors as show stoppers. However,

    this is not the case for those with vision loss.

    The majority of information on a mobile phone is presented through graphical means,

    rendering access almost impossible for a person with vision loss. A touch-based mobile phone

    becomes even harder to access with a touch-based visual design and navigation that contains

    no physical representation of where or how to press buttons and icons. With these limitations,

    it seems likely that a large user group will be unable to use these types of mobile phones,

segregating them further from the general population. This also contradicts laws set by governments (Meyers, 2008).

2.4 Improving compatibility

Studies on mobile phone accessibility for the elderly and people with disabilities all conclude

    that mobile phones are not designed for these user groups. They point out several aspects that

    could improve mobile phone accessibility. Mobile phones should be of adequate size and

    shape and have texture with good grip. The screen should be large and the buttons should

    have a logical placement for easier memorization of layout. Voice feedback should provide

    information and confirm execution of commands. The phone should also provide easy access

    to emergency numbers. Finally, the number of features should be kept to a minimum to ensure

    ease of use (Abascal & Civit, 2001; Plos & Buisine, 2006; Smith-Jackson et al. 2003; S.K.

    Kane et al. 2009). These suggestions are all in accordance with the guidelines on how mobile


phones should be designed, provided by the NABP (n.d.).

    Creators are recommended to follow good practices and standards to ensure that their

    applications or web pages are accessible to all user groups (Perea et al. 2006). However, this

    recommendation is not always followed; not for mobile phones nor for applications or web

sites. A recent study on disabled people's relationship to social media, like Facebook and Twitter,

    are visually impaired to use mobile versions of the media that are better formatted and have

    less content (Tollefsen et al. 2011; Rossen, 2011).

    Several guidelines have been presented in recent years to show developers how to design

    accessible applications and web sites. For instance, the W3C organization has formed such an

    initiative, which develops accessibility guidelines (W3C, 2008). These guidelines are in line

with those published by the NABP (n.d.). Furthermore, Fukuda et al. (2005) propose to

    introduce navigability and listenability metrics for designing and developing applications or

    web sites, ensuring that screen readers can navigate and present the information in a clear and

consistent way. Similar suggestions were presented by Calabrò et al. (2009) to ensure that e-

    books are compatible with screen readers.

    With a range of different web browsers, both on computers and mobile phones, compatibility

    cannot always be guaranteed. In light of this, The Mobile Web Initiative Device Description

    Working Group has proposed the creation of a Device Description Repository; a database

    containing information about mobile devices that can be queried by applications to present

    data in the most efficient way (K. Smith & Sanders, 2007).

    2.5 Assistive technology

    Although mobile phones are not designed for use by people with vision loss, there are solutions

    that can compensate or improve such use. These are defined by the general term assistive

    technology; they include any product, instrument, equipment or technical system designed for or

used by a person with disabilities, which prevents, compensates for, supervises, alleviates or neutralizes the effects of the disability (Perea et al. 2006). Products may range from physical and

    living objects like the long cane or the guide dog, to software in a device like a screen reader. For

    an individual who is blind, it is impossible to read the content on a screen; hence the aid of

    assistive technology is a great benefit in their personal and professional life. This section will

    look at current solutions and relevant research that addresses the issue of assistive technology

on mobile phones.

2.6 Screen reader

    It is possible to operate a mobile phone or computer without being able to see what is on the

    screen. Software applications called screen readers can convert text to speech, enabling the

    user to read and navigate the content of a screen through hearing. By listening to voice

    communication provided by the screen reader application, users are able to perceive and

    navigate the content on the screen, making it possible to perform tasks like word processing,

    e-mailing, listening to music and surfing the web (Wikipedia, 2010b).

A screen reader generally consists of two components: the application, which monitors the content on screen, and a synthesizer, which provides the spoken feedback. This is produced

    through text-to-speech, where the text input is provided by the screen reader and the synthetic

    voice is produced by the synthesizer. The synthesizer works with different languages and

    supports phonemes and grammatical rules (AFB, n.d.). For a screen reader to work efficiently,


    applications need to follow common standards so that the content can be interpreted and

    presented correctly (Babinszki, 2010).
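To make the text-to-speech side of this pipeline concrete, the sketch below uses Android's built-in TextToSpeech engine, the platform targeted by the prototype in this dissertation, to speak a menu label aloud. It is a minimal illustration only; the class name SpeechFeedback and the spoken text are assumptions rather than part of the actual solution.

import android.app.Activity;
import android.os.Bundle;
import android.speech.tts.TextToSpeech;

import java.util.Locale;

// Minimal sketch: speaking a menu label through the platform text-to-speech synthesizer.
public class SpeechFeedback extends Activity implements TextToSpeech.OnInitListener {

    private TextToSpeech tts;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        tts = new TextToSpeech(this, this); // the engine is initialised asynchronously
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.UK);
            // QUEUE_FLUSH interrupts any ongoing utterance so the feedback stays current.
            tts.speak("Messages menu", TextToSpeech.QUEUE_FLUSH, null);
        }
    }

    @Override
    protected void onDestroy() {
        if (tts != null) {
            tts.shutdown(); // release the engine when the screen is closed
        }
        super.onDestroy();
    }
}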

    Screen readers are available for both computers and mobile phones, ranging from products

that are free of charge to those that cost close to $1,000. One example of a screen reader for computers is JAWS, one of the most feature-rich products, but also the most expensive

    (Freedom Scientific, n.d.). Screen readers can also be run in a web browser, allowing a person

    to use almost any computer (Bigham et al. 2008). On mobile phones, screen readers have

    been most common for phones with physical buttons, but are becoming available for phones

    with touch screens (Babinszki, 2010).

    However, a screen reader must not be confused with the voice feedback often built into

    modern mobile phones. Although they are capable of representing menu and application

    content, these functions are less advanced when it comes to conveying accurate information.

Moreover, they are limited to phone menus, and are therefore unable to access information inside

    applications (Theofanos & Redish, 2003).

    Mobile Speak is a commercial screen reader designed for mobile phones that run the

    Windows Mobile or Symbian OS. It is considered one of the best solutions for mobile phones,

    and it supports phones both with physical buttons and with touch screens (Code Factory, n.d.).

TalkBack and Spiel are two open-source screen readers designed for the Android OS; both enable developers to incorporate screen reader functionality into their applications. Although not as advanced as the commercial versions, they provide developers with an easy and convenient way of making their applications more accessible (C. Chen & Ganov, 2009; Darilek, 2010).
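These screen readers pick up the accessibility metadata that applications attach to their views. The sketch below shows the standard Android calls for labelling a control so that TalkBack or Spiel can announce it; the class AccessibleLabels and the method labelCallButton are illustrative names, not part of the project's code.

import android.view.accessibility.AccessibilityEvent;
import android.widget.Button;

// Sketch: making a button announceable by Android screen readers such as TalkBack.
public final class AccessibleLabels {

    private AccessibleLabels() {}

    static void labelCallButton(Button callButton) {
        // A screen reader speaks the content description when the view gains focus.
        callButton.setContentDescription("Call contact");

        // The application can also ask the view to emit an accessibility event explicitly,
        // for example after its state has changed.
        callButton.sendAccessibilityEvent(AccessibilityEvent.TYPE_VIEW_FOCUSED);
    }
}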

    2.7 Reading and input of text

    Braille was created to enable blind people to read text. It is not a new language, but a text

    system with dots laid out in an organized manner enabling a blind person to read and write

    text in the same manner as a sighted person. By moving the finger across the dots, a trained

    person is able to understand the letters that the dots represent, and thus able to read the text

    just like a sighted person (AFB, n.d.). Reading text from a computer can also be achieved

    through Braille, through the use of refreshable Braille displays, Braille printers and Braille

    note takers (AFB, n.d.).

Although Braille is a great tool, its reliance on finger sensitivity can exclude the elderly, who may have reduced feeling in the fingertips (AFB, n.d.). For people with low vision, Braille might not be needed; magnifiers, loupes and special glasses are just some of the equipment that can assist a person in reading text (NABP, n.d.).

    However, Braille is not suited for mobile phone use. Instead, users will have to rely on

    different means of typing, depending on the phone in use. For a person with vision loss,

mobile phones equipped with physical keyboards are the preferred means of interaction, as they rely on the feedback provided by the buttons. In conjunction with a screen reader, operating a

    mobile phone is feasible with a keyboard (Babinszki, 2010; Buxton et al. 2008).

    Physical keyboards for mobile phones come either with a T9 (text on 9 keys) or a QWERTY

keyboard layout. A T9 layout consists of a phone pad with numbers ranging from 1 to 9 and three associated letters on each button, while a QWERTY layout represents the same keys as

    a keyboard attached to a computer. Just like a computer keyboard, it is possible to memorize


    the layout of these buttons and thus write without looking at the keyboard. In addition,

    predictive text is often used together with these layouts, enabling the phone to predict what is

    being typed, completing the words faster and correcting spelling mistakes (Wikipedia,

    2010d).

    Mobile phones equipped with touch screens do not always have a physical keyboard; instead

    text is entered on a virtual keyboard presented on the screen. This can render their use a

    challenge, as the virtual buttons do not provide any tactile feedback (Yfantidis & Evreinov,

    2005; Buxton et al. 2008). Several solutions have been presented to improve the typing of text

    on touch screens, ranging from physical equipment that work in cooperation with the mobile

    phone to software keyboards installed on the phone.

HumanWare (n.d.) has created an overlay for touch screens that covers the phone screen

    and enables the user to interact with the phone and gain access to the most common functions.

    The overlay can also communicate with a separate Braille device, making it possible for the

    user to read and write using Braille (HumanWare, n.d.). The advantage with this solution is

    that the user can carry only one device and interact directly with it; however, the downside is

    that it is only compatible with resistive touch screens (Wikipedia, 2010b); conversely, most

    modern phones come with capacitive screens that are only capable of interacting with

    conductive materials like a human finger (Wikipedia, 2010a).

    As mentioned, mobile phones with touch screens come with virtual keyboards. While they are

    handy, experience shows that these keyboards can be troublesome when typing. Hence,

    several third party software keyboards have been created. A keyboard named ThickButtons

provides a virtual QWERTY layout on the screen; ThickButtons differentiates itself from a

    normal keyboard by anticipating the coming letters and making those letters larger and others

    smaller (Wells, 2010b). SlideIT and Swype are two other solutions, where the use of a

    dictionary enables users to write words by sliding a finger along the letters without removing

    it from the screen (Wells, 2010a; Sorrel, 2010a). Yet another option called SwiftKey

    anticipates the following word; developers claim that 1/3 of all words will be correctly

    anticipated, resulting in up to 50% faster typing (TouchType, 2010).

    However, all of these keyboards require some degree of visual navigation, which may be

    suitable for people with low vision but not for blind users. A keyboard called BlindType

    (Anon, n.d.) might be suitable for blind users, although this was not its original intention.

BlindType is a QWERTY virtual keyboard, which learns the user's typing style. Assuming that the user is familiar with the QWERTY layout, the keyboard is able to recognize where and what is being typed on screen without the user having to focus on the

    screen. Essentially, a word can be typed at any place on the screen without a keyboard

    displayed. In theory, this keyboard could also work for blind users, since it does not require

    text to be entered in a defined area and it can automatically correct typing errors. Recently,

    the company behind the solution was acquired by Google (Noyes, 2010). Google has also

    released a technology called Google Scribe, which predicts what the user is planning to type

    (Lofts, 2010).

Furthermore, Yfantidis & Evreinov (2005) have designed a gesture-based keyboard specifically for blind users. It works by tapping the screen to make a square appear, which represents eight directions, where each direction holds a separate letter. By moving the finger towards one of the directions, the letter is read out loud, and lifting the finger confirms the selection for typing.
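The core of such a gesture keyboard can be sketched as a mapping from the drag direction to one of eight sectors, each holding a candidate letter, as shown below. The letter assignment and class name are illustrative assumptions; in a full implementation the selected letter would be spoken aloud and confirmed when the finger is lifted.

// Sketch of the direction-to-letter mapping behind a gesture keyboard of the kind
// described by Yfantidis & Evreinov (2005): the angle between the initial touch
// point and the current finger position selects one of eight candidate letters.
public final class EightWayGesture {

    // Illustrative letter assignment only.
    private static final char[] LETTERS = {'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h'};

    // Returns the letter for the sector the finger has moved into.
    static char letterFor(float downX, float downY, float currentX, float currentY) {
        double angle = Math.toDegrees(Math.atan2(currentY - downY, currentX - downX));
        if (angle < 0) {
            angle += 360; // normalise to the range 0..360 degrees
        }
        // Each sector spans 45 degrees; the 22.5 offset centres the sectors on the axes.
        int sector = (int) Math.floor(((angle + 22.5) % 360) / 45.0);
        return LETTERS[sector];
    }
}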


    2.8 Haptics

Haptics is a technology that provides tactile feedback, resulting in a more intuitive and less

    visually reliant interface. However, haptics can never replace hearing or vision; rather, it

    provides an additional sense. For touch-based interfaces, haptics can make a user feel and

    visualize the shape of an item without looking at the screen, or it can provide feedback when the

    finger reaches the border of an element or a button. Haptics is anticipated to become more

advanced in the future, with an expected ability to provide even more fine-grained details, such as the fur of an animal (Buxton et al. 2008; Gemperle et al. 2001; Rassmus-Gröhn, 2006).
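As a concrete, simplified example of border feedback on a touch screen, the sketch below uses Android's Vibrator service to emit a short pulse whenever the finger crosses from one cell of an imaginary grid into another. It is not the haptic design used in this project; it assumes the VIBRATE permission, and the class name BorderBuzzListener is illustrative.

import android.content.Context;
import android.os.Vibrator;
import android.view.MotionEvent;
import android.view.View;

// Sketch: a short vibration whenever the finger crosses from one menu cell into another,
// giving a tactile cue for element borders while exploring the screen.
// Requires the android.permission.VIBRATE permission in the manifest.
public class BorderBuzzListener implements View.OnTouchListener {

    private final Vibrator vibrator;
    private final int cellsPerSide;
    private int lastCell = -1;

    public BorderBuzzListener(Context context, int cellsPerSide) {
        this.vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
        this.cellsPerSide = cellsPerSide;
    }

    @Override
    public boolean onTouch(View view, MotionEvent event) {
        // Map the touch position to a cell in a simple square grid laid over the view.
        int column = (int) (event.getX() * cellsPerSide / view.getWidth());
        int row = (int) (event.getY() * cellsPerSide / view.getHeight());
        int cell = row * cellsPerSide + column;

        if (cell != lastCell) {
            if (lastCell != -1) {
                vibrator.vibrate(30); // brief pulse marks the border crossing
            }
            lastCell = cell;
        }
        return true;
    }
}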

    Yu & Brewster (2003) have developed a solution for presenting graphs and tables to users with

    vision loss. Users can explore virtual graphs and tables through a force feedback device that

    creates haptic feedback, while audio is used to present detailed and abstract information.

In a study on user reactions to haptics, Rassmus-Gröhn (2006) found that some users rely more on the audio feedback than on the haptic feedback. This suggests that, although haptics can be a valuable

    additional modality, some users depend more on the senses they are used to. Hence, solutions

    that incorporate haptics will likely be greatly improved if they also implement sound.

    2.9 Making a better user interface

Mobile phone manufacturers have noticed the inadequacies in the usability of the UIs native to the various OSs. Most manufacturers develop their own UIs for their mobile phones, replacing the OS's own UI (HTC, n.d.; Topolsky, 2008). In addition, Nokia, Intel and the University of Oulu have established a dedicated research center, which focuses on improving

    the UI on mobile devices (Darling, 2010).

    Raman and Chen (2008) argue that the UI is only a means to an end and should blend

seamlessly into the user's way of operating the mobile phone. In contrast to a computer OS, the mobile OS should provide access to what the user needs and remain unattended when not in use. According to Karampelas and Akoumianakis (2003), the layout for mobile phones should include the following: consistent presentation, alternative navigation tools, accessibility to common options and functions, and self-explanatory navigation.

    Hence, several solutions have been put forward to improve the navigation of touch screens for

    users with vision loss, essentially providing eyes-free navigation.

One such solution was presented by Karlson et al. (2005) in a study on a user interface designed for one-handed thumb use. The interface presented a 3x3 grid of squares, with each square containing a link to an application or a sub-menu. Selection was done by tapping one of the squares, while gestures were introduced for zooming and for navigating between menus. Results showed that the participants liked the way the navigation and selection of applications worked, although they were hesitant about the gestures.
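The essence of such a grid menu is a hit-test from touch coordinates to one of nine entries, sketched below with illustrative entry names; spoken or haptic feedback of the kind discussed earlier would then be attached to the entry that is returned.

// Sketch of the hit-test behind a 3x3 launcher grid of the kind evaluated by
// Karlson et al. (2005): a tap position is mapped to one of nine menu entries.
public final class GridMenu {

    // Illustrative entries; a real launcher would hold application shortcuts.
    private static final String[] ENTRIES = {
            "Phone", "Contacts", "Messages",
            "Calendar", "Music", "Camera",
            "Settings", "Clock", "More"
    };

    // Maps a tap at (x, y) on a screen of the given size to the entry underneath it.
    static String entryAt(float x, float y, int screenWidth, int screenHeight) {
        int column = Math.min(2, (int) (x * 3 / screenWidth));
        int row = Math.min(2, (int) (y * 3 / screenHeight));
        return ENTRIES[row * 3 + column];
    }

    public static void main(String[] args) {
        // A tap in the centre of a 480x800 screen lands on the middle entry.
        System.out.println(entryAt(240, 400, 480, 800)); // prints "Music"
    }
}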

Kane et al. (2008) have developed Slide Rule, a UI that operates solely through the use of

    gestures and auditory feedback. By using several combinations of gestures, Slide Rule allows

    the user to open applications and perform commands. Results revealed that the proposed

solution performed significantly faster than button-based menus. However, due to the gesture

    commands, more errors were produced.
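Slide Rule's actual gesture set is richer than this, but the sketch below illustrates the kind of basic classification such a gesture-driven UI rests on: a touch that barely moves is treated as a tap, while a fast movement over a longer distance is treated as a flick. The thresholds and class name are illustrative assumptions, not taken from the cited work.

// Sketch of a minimal gesture classifier: a touch is a tap if the finger barely moves,
// and a flick if it travels quickly over a longer distance. Thresholds are illustrative.
public final class GestureClassifier {

    enum Gesture { TAP, FLICK, UNKNOWN }

    private static final float TAP_MAX_DISTANCE_PX = 20f;
    private static final float FLICK_MIN_DISTANCE_PX = 80f;
    private static final long FLICK_MAX_DURATION_MS = 300;

    static Gesture classify(float downX, float downY, float upX, float upY, long durationMs) {
        double distance = Math.hypot(upX - downX, upY - downY);
        if (distance <= TAP_MAX_DISTANCE_PX) {
            return Gesture.TAP;
        }
        if (distance >= FLICK_MIN_DISTANCE_PX && durationMs <= FLICK_MAX_DURATION_MS) {
            return Gesture.FLICK;
        }
        return Gesture.UNKNOWN;
    }
}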


    Amar et al. (2003) built a prototype handheld device that used tactile and auditory feedback to

    convey a menu structure that provided easy access to common functions. The menu could be

navigated with one hand, where each finger would perform a specific task. Rotating

    a dial would move through various options and menus, a separate button would return to a

    previous menu, and four buttons performed navigation and application functions. Most

    participants were satisfied with the solution; however, it was noted that users had problems

    forming a conceptual model of the hierarchical menu. This was improved by adding the

    feature of pressing a button halfway down, which would state the functionality of the selected

    button.

    With their application, Strumillo et al. (2009) have abandoned the default phone menu on the

    Symbian OS and offer a new, simpler menu that provides audible feedback and access to

    common functions and features. While navigating through menus, the user is informed of

what is displayed and the possible actions to take. The solution has received good feedback; however, it was developed for an older version of Symbian, making it incompatible with today's Symbian phones.

    Eyes-free Shell is an open-source project for the Android platform created by Chen and

Raman (2009); it replaces the phone's existing UI with a new menu system, providing easy access to applications, as well as information on time and current location. Eyes-free Shell

    uses auditory feedback and gestures to navigate between menus. The user can search for

    applications by using the same type of technique presented earlier (Yfantidis & Evreinov,

    2005). The application is part of a larger open-source initiative, sponsored by Google, which

    aims to make touch-based mobile phones more accessible. However, some features require

    physical buttons, rendering the proposed solution inaccessible to certain phones. The reliance

on gestures can also be problematic, as indicated by the findings in other projects

    mentioned above (Helft, 2009).

    2.10 Turning the mobile phone into an assistive aid

    Extending on the common functionalities of the mobile phone, several solutions exist today

    that turn the phone into an assistive aid. Appendix D presents some of the functions available,

    such as object recognition and navigation.

    2.11 Framework

    The literature review has pointed out several aspects that hinder the effective use of mobile

phones with touch screens for people with vision loss. Although several factors could warrant

    further scrutiny, the aspect that stood out as the main obstacle is the operation of the touch

    screen itself.

Based on the literature, a framework pointing out the most important factors for the development of a new UI for touch-based mobile phones is summarized in Table 1.


    Table 1 - Framework for the top issues people with vision loss encounter when using mobile phones with touch screen


    CHAPTER 3: RESEARCH METHOD

    Design research was chosen as the research method for the development project in this

dissertation. This chapter outlines the selected method, along with documentation of the

    steps followed throughout the project.

    3.1 Design research

    Design research aims to provide solutions for human purposes by creating an artifact; the

artifact is tested to provide feedback on improvements (March & Smith, 1995). It is a design-based problem-solving process with several cycles of iterations, where new and improved artifacts are constantly put forward. Each artifact is evaluated and new artifacts build on the knowledge of previous ones; thus it is an approach that encourages innovation (Hevner et al. 2006; Grønli & Bygstad, 2009).

    The research field arose as a result of several failed projects; the failures highlighted the

    importance of closely studying the design of a product during development (Vaishnavi &

    Kuechler, 2008). The same approach is used in engineering, where a product is developed and

    tested to further improve the coming versions (Peffers et al. 2006; Vaishnavi & Kuechler,

    2007).

Frederick P. Brooks Jr., best known for his book The Mythical Man-Month (Brooks, 1995),

    argues that constant incremental iterations should replace the practice of building complete

    product versions. Development teams should produce quick and effective prototypes that can

    be tested by external users, which provide valuable feedback for further development (Kelly,

    2010).

According to Grønli and Bygstad (2009), design research is particularly relevant for understanding mobile services and innovation. For a project like the present one, where the developers face a previously unfamiliar condition, a best guess is not enough (Nielsen, 1993). In these situations, feedback from the intended target group becomes vital to the project's success. Hence, Design research is the process through which it is uncovered what works and what does not work (Vaishnavi & Kuechler, 2007).

    Several models for successful completion of a Design research project have been presented.

One, presented by Grønli and Bygstad (2009), builds on two widely accepted approaches

    to Design research. The method recommends Vaishnavi and Kuechler's (2007) model for the

    overall project framework and the model from Hevner et al. (2006) to evaluate the results.

Vaishnavi and Kuechler's (2007) approach is a five-step model, designed to guide a project from startup to the finished solution, as shown in Figure 1. The first step is Awareness of problem, which focuses on highlighting or improving the understanding of an existing problem. This is followed by the Suggestion step, where the problem is further analyzed using existing knowledge and theory, culminating in a tentative design. Development is the next step, where the project attempts to implement an artifact according to the suggested solution, which is then evaluated in the fourth step. The whole project is completed with a conclusion, where the project results are evaluated (Grønli & Bygstad, 2009).


    Development, evaluation and further suggestions are often performed iteratively, allowing the

    researcher to step back and evaluate and make changes as necessary. Stepping back achieves

an understanding that could only be gathered from the specific act of construction; the model refers to this process as circumscription (Vaishnavi & Kuechler, 2007).

Figure 1 - Design Research process model by Vaishnavi and Kuechler (2007)

    Vaishnavi & Kuechler's (2007) model has been criticized for being too dependent on

    traditional development methods. The suggestion step is the only stage that allows creative

    input and is also the only stage separating the model from the mentioned development

    methods. However, it is noted that the steps are only suggestions and not absolute

    requirements (Vaishnavi & Kuechler, 2007; Grønli & Bygstad, 2009), thus allowing the

    project to make changes as deemed necessary.

    Hevner et al. (2006) have developed seven guidelines, presented in table 2, to help provide

    better understanding and measurements for effective design research projects. Although quite

    general, these guidelines are implemented in the same way as Vaishnavi and Kuechler's

    (2007) model, allowing the project freedom to decide on implementation. However, it is

    advised to address each of the guidelines during the project's course (Grønli & Bygstad, 2009; Hevner et al. 2006).


    Table 2 - Design Research Guidelines by Hevner et al. (2006)

    As mentioned earlier, Vaishnavi and Kuechler's (2007) model was used for the overall

    project framework and is described in the following section. Hevner et al.'s (2006) seven

    guidelines will be used to evaluate and discuss the results of the project, as described in

    chapter 5.

    3.2 Awareness of problem

    The first stage of Vaishnavi & Kuechler's (2007) model was initiated by a general

    assumption: that the majority of today's mobile phones are touch-based and designed for those with no visual impairment. The assumption was based on the author's previous experience from the mobile industry; former projects had demonstrated that UIs are not

    always up to standard and can hinder effective use of mobile phones.

    Once the general assumptions were in place, these needed to be verified by more thorough

    research. The initial literature review focused on the mobile market and recent phone releases,

    using resources found mainly through Google and later moving on to the major market

    analytics companies. With no prior experience related to vision loss, a literature review on

    users with vision loss was needed, not only to identify existing solutions for both daily life

    and mobile phone use, but to understand more of the experience of vision loss. Through

    resources, such as Google and NABP, literature on the most common problem areas was

    collected, along with information regarding manufacturers of assistive technology.

    3.3 Suggestion

    A more thorough literature review was carried out in the suggestion phase, focusing on all

    elements and issues regarding the development of assistive technology for people with vision

    loss. The literature review was completed in four iterations, each step contributing with more

    information regarding issues facing those with vision loss, how people with vision loss use

    mobile phones, application compatibility, operating systems to develop for, screen readers

    and haptics, and implementation of such technology. The end result was a framework of

    guidelines that worked as a source of reference throughout the project.

    The work that came out of the literature review supported the earlier assumptions,

    highlighting the need for a UI for touch-based phones, designed specifically for those with

    visual impairment. The new design would require operation using touch and sound, menu

    elements placed in a logical arrangement, and well-defined categories for assistive


    functionalities and standard mobile phone functionality.

    It was important for the project to provide new functions and improved access for people with

    vision loss. From the literature review, several projects with similar solutions came to the

    author's attention. However, these projects relied mainly on gestures and hardware buttons, functionalities that could make the solutions harder to learn, and that are certainly

    incompatible with mobile phones that are not equipped with buttons. Based on the feedback

    from these projects, it became evident that the use of gestures should be kept to a minimum

    and that the dependency on hardware buttons should be removed.

    3.4 Development

    This stage involved the development of the solution in addition to documentation and

    planning of the project. The Rational Unified Process (RUP) was chosen as the framework for

    the development process due to its emphasis on multiple iterations and its clear and defined

    guidelines for developing applications (Gornik, 2004; Pollice, 2002). However, time

    constraints necessitated a lightweight implementation of RUP, including only core activities.

    Nevertheless, this approach is still in accordance with the framework (Pollice, 2002).

    RUP requires all projects to start with an inception, where the project vision is defined and

    presented to the stakeholders (Pollice, 2002). Stakeholders for the current project include

    representatives from NABP, Statped and SmartPhones Telecom; the latter is the company

    responsible for providing developer resources. To present the vision and the idea in an

    intuitive manner, a virtual prototype was created to show a possible solution for replacing a

    touch-based mobile's GUI. The virtual prototype was created in Microsoft Expression Blend 4 with SketchFlow (Microsoft Corporation, 2009), a tool for creating working prototypes of

    applications without writing any code.

    Further on, RUP dictates an elaboration on the design and a definition of the baseline,

    including functionality requirements and possible applications (Pollice, 2002). The virtual

    prototype served as a foundation for the outcome of the project and was used to communicate

    the idea and the vision to the developers. During several project meetings, discussions

    centered on what would be realistic outcomes from the allocated time. The decision was to

    create a working UI prototype for people with vision loss, while ignoring the server side

    functionality. Hence, the UI would work and behave as the final product, but all data

    presented would be static, not dynamic. An early prototype of the application was developed

    at this stage, to test core functionality. Complete logs of the project meetings are available in

    Appendix G.

    Construction is the third step in RUP; this is where the main development takes place (Pollice,

    2002). The application was developed in Java for Android, to support Android version 1.6

    onwards. Functionalities for the screen reader and haptics were not developed during this

    project. Instead, existing modules were implemented with API integration. During the

    construction phase, a total of five iterations of development were completed, with a working

    prototype coming from each one. Three iterations were for internal use, while two were tested

    externally. With each iteration new virtual prototypes, associated documentation, and added

    functionalities evolved. Guidelines and best practices for Android development, defined by

    Google, were followed throughout (Google Inc, 2010a; Google Inc, 2010b). Documentation on

    the UIs is available in Appendix H in addition to complete development logs in Appendix I.


    3.5 Evaluation

    The final phase of RUP is the transition phase, where the developed product is tested on real

    users (Pollice, 2002). Each version of the UI was tested in three different scenarios: in a

    virtual environment, by the developer on a real device, and by the author, who made sure that

    all functionalities were implemented according to specifications. In addition, evaluations of

    the prototype were performed through two iterations of user experiments, where external

    participants tested and provided feedback on the prototypes.

    The participants were recruited with assistance from Statped, a Norwegian agency, which

    provides guidance to municipalities for supporting people with learning disabilities and vision

    loss (Statped, n.d.). All participants had previous experience from volunteering for other

    projects aimed at people with vision loss; moreover, before commencing, they were informed

    of their rights and of relevant ethical regulations.

    Participants all performed the same steps, starting with a general introduction to the project

    with its goals and aims, followed by ten minutes of training and some time to become familiar

    with the UI, and finally performing a couple of exercises, providing feedback and completing

    a survey. Each test was performed on an HTC Hero (HTC, n.d.) device, with an average

    duration of one hour. Participants were monitored throughout the test. The introduction to the

    project emphasized to the participants that the UI was only a prototype and that the aim was to

    improve the navigation on touch-based mobile phones; they could therefore expect limitations

    and fewer available options.

    During the first iteration, participants were trained in operating the UI. They then had to

    choose a language for operating the system, either English or Norwegian. They were informed

    that the English version would be more advanced and sound more natural than the Norwegian

    version. All menus and feedback would reflect the chosen language.

    Table 3 summarizes the exercise that the participants were asked to perform. The exercise was

    designed to evaluate whether the participants were able to understand and navigate the UI on

    their own. Instructions were to perform one task at a time and sufficient time was allowed

    for each task.

    Table 3 - Tasks the participants were asked to perform

    During the exercise, the participants' behavior, responses, reactions and verbal feedback were documented. The verbal feedback included both comments on current functions and suggestions for improvements.


    After the exercise, participants were asked to complete a survey. This collected background

    information and presented a set of statements for participants to mark their level of agreement

    with, from 1 (strongly disagree) to 5 (strongly agree). Statements were grouped into categories according to the UI function they assessed. Table 4 presents a summary of the

    most important questions from the survey. The complete survey is available in Appendix L.

    Table 4 - Survey completed by participants

    Data gathered and analyzed in the first user experiment resulted in a new version where the

    major issues and concerns uncovered in the first iteration were improved upon. In the first

    iteration, a single tap of the finger would state the function of the selected element, while a

    second tap would select it. Although this served its purpose, the solution caused

    frustration among the participants; they pointed to difficulties with tapping the correct area

    and mistakenly opening elements. To overcome these issues, the navigation of the UI was

    changed so that a single tap would still state the function of an element, but dragging the

    finger to the right would open it.

    The second iteration also introduced numbering and made changes to the naming of elements.

    One set of numbers reflected the element's current position in a list, while another set

    presented the total number of elements in a list. A new list view, for lists with several

    elements, was also introduced. The new list view added a ribbon of letters, ranging from

    A to Z, on the right side of the screen.


    With the introduction of the new functions, the participants were asked to perform the same

    exercises and survey once more. Additionally, they were asked to test the new list view and

    answer a couple of new questions related to the added functionality, as seen in tables 5 and 6.

    Table 5 - Additional tasks presented to participants

    Table 6 - Additional survey questions

    3.6 Conclusion

    To conclude the research, the data gathered from the two user iterations were categorized and

    further reviewed. Important findings from the surveys and the evaluations were analyzed and

    documented for further development of the prototype. The results are also presented in this

    report, providing new knowledge to the research field of assistive technology and mobile

    phone user access for people with vision loss.


    CHAPTER 4: RESEARCH RESULTS

    The previous chapter presented the selected research method and the steps taken to produce

    the results. This chapter presents and evaluates the research results.

    4.1 Results in regards to development

    Following the Design research model of Vaishnavi and Kuechler (2007), the current project

    has gone through the recommended stages in developing an alternative mobile phone UI. The

    new UI provides users with a different way of interacting with the phone, compared to the

    mobile phone manufacturer's original intention. Through the UI, any function or application can be utilized in an easier and more efficient way, making the phone more

    accessible for people with vision loss. However, it is important to note that the applications

    need to be compatible with a screen reader to be useful for people with vision loss.

    Interaction with the new UI is made possible through two sensory systems, using sound and

    tactile feedback. An organized and intuitive menu system provides the user with access to

    relevant information and functionalities. The UI is designed in accordance with Nielsen's

    (1993) usability principles, and adapted based on the findings from the framework in the

    literature review. It also follows Brooks' (1995) recommendation in keeping the number of

    available functions to a minimum, avoiding potential confusion.

    Figure 2 portrays screenshots from the UI running on a mobile phone; the key words represent

    the different menus and functions available. The intended audience is people who are either

    blind or have severely reduced vision; the text is for the benefit of the latter group. The

    key words can also serve as reference points for external users, for instance if customer

    support is needed. The font size is formatted according to guidelines from NABP (n.d.).


    Figure 2 - Some of the menus available in the UI

    As mentioned, the UI is operated through sound and haptics. A single tap on the screen

    initiates the UI's screen reader, which states the function of the selected element, while the gesture "tap and drag finger" executes the element. The gesture was implemented to simplify the execution of elements, and to reduce the number of incorrect selections. Moreover,

    dragging a finger across the screen commands the UI to read out functions and at the same

    time makes the phone vibrate when a new item is available. A back element is placed at the

    bottom of every screen, with the exception of the main menu, allowing the user to easily

    navigate to the previous menu.
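
    To illustrate how this style of interaction can be handled on Android, the sketch below shows a simplified OnTouchListener written in Java. It is not the project's actual source code: the class name, the label field and the drag threshold are illustrative assumptions, and the standard Android TextToSpeech and Vibrator APIs stand in for the integrated TTS Extended and Kickback modules.

```java
import android.os.Vibrator;
import android.speech.tts.TextToSpeech;
import android.view.MotionEvent;
import android.view.View;

/**
 * Simplified sketch of the "single tap announces, drag to the right executes"
 * interaction. Class and method names are illustrative, not the project's code.
 */
public class MenuTouchListener implements View.OnTouchListener {

    private static final float DRAG_RIGHT_THRESHOLD = 120f; // pixels, arbitrary

    private final TextToSpeech tts;   // voice feedback (screen reader)
    private final Vibrator vibrator;  // haptic feedback
    private final String label;       // spoken name of this menu element
    private float downX;              // x position where the finger landed

    public MenuTouchListener(TextToSpeech tts, Vibrator vibrator, String label) {
        this.tts = tts;
        this.vibrator = vibrator;
        this.label = label;
    }

    @Override
    public boolean onTouch(View view, MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
                downX = event.getX();
                vibrator.vibrate(40); // signal that a new element is under the finger
                tts.speak(label, TextToSpeech.QUEUE_FLUSH, null); // state its function
                return true;
            case MotionEvent.ACTION_UP:
                // Dragging the finger to the right past the threshold opens the element.
                if (event.getX() - downX > DRAG_RIGHT_THRESHOLD) {
                    executeElement(view);
                }
                return true;
            default:
                return false;
        }
    }

    private void executeElement(View view) {
        // Placeholder: open the menu or start the application this element represents.
    }
}
```

    A listener of this kind could be attached to each menu element, so that touching an element announces it and dragging to the right opens it.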

    To help the user create a mental map of the different menus and elements, the UI will also

    state the level it is currently displaying and give feedback when successfully executing an

    element. Furthermore, the UI can be customized to read out the selected element's position in a list, for example, "number three of five".

    Longer lists are displayed in the same way as regular lists, only with an added ribbon on the

    screen's right side, where an alphabetical list of letters is displayed. Figure 3 shows an

    illustration of such a list. The letters refer to the initial letter of available elements, which

    could be anything from applications to contacts. Dragging a finger across the ribbon causes

    the screen reader to name the letters as the finger crosses them.

    Figure 3 - List navigation in the UI
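
    As a rough illustration of how such a ribbon could be implemented, the Java sketch below maps the finger's vertical position on the ribbon to a letter from A to Z and has the screen reader announce it. The class and its names are hypothetical, and only the standard Android TextToSpeech API is assumed.

```java
import android.speech.tts.TextToSpeech;

/**
 * Sketch of the alphabetical ribbon: the finger's vertical position on the
 * ribbon is mapped to a letter, which the screen reader then announces.
 * Class and method names are illustrative, not the project's code.
 */
public class AlphabetRibbon {

    private final TextToSpeech tts;
    private final float ribbonHeight; // height of the ribbon in pixels
    private char lastSpoken = 0;      // avoid repeating the same letter

    public AlphabetRibbon(TextToSpeech tts, float ribbonHeight) {
        this.tts = tts;
        this.ribbonHeight = ribbonHeight;
    }

    /** Map a vertical touch position to one of the 26 letters. */
    public char letterAt(float touchY) {
        int index = (int) (touchY / ribbonHeight * 26);
        index = Math.max(0, Math.min(25, index)); // clamp to A..Z
        return (char) ('A' + index);
    }

    /** Called repeatedly while the finger is dragged along the ribbon. */
    public void onRibbonTouch(float touchY) {
        char letter = letterAt(touchY);
        if (letter != lastSpoken) {
            lastSpoken = letter;
            // Name the letter as the finger crosses it, e.g. "K".
            tts.speak(String.valueOf(letter), TextToSpeech.QUEUE_FLUSH, null);
        }
    }
}
```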


    In addition to improved accessibility on touch phones, the UI is intended to work as an

    assistive aid that can provide useful information, such as time, date and the weather forecast.

    The UI is also designed to provide navigation and to analyze objects or text using the phone's camera; however, these features were not implemented in the current version.

    The UI was tailored specifically for people with vision loss and it does not depend on existing

    functionality or layout. This approach differs greatly from other solutions that typically rely

    on a screen reader and the manufacturer's UI implementation. By creating a new set of menu elements, the project did not have to focus as much on compatibility issues and could instead

    focus on creating new functionality. At the same time, this allowed freedom to customize the

    UI and optimize solutions for the visually impaired.

    Design features that were of particular importance in the development of the new UI included

    the reduction of required gestures, a minimal learning curve and no reliance on hardware

    buttons. It was decided to reduce the use of gestures, since they can be hard to master (Karlson et al. 2005; Kane et al. 2008). In addition, by organizing menu elements

    in a vertical layout, the UI operation would be similar to that of the Nokia phones, common

    among people with vision loss. Finally, the dependency on hardware buttons was

    circumvented, as this could result in compatibility issues with certain phone models.

    Figure 4 shows one of several virtual prototypes created. Developing a virtual prototype

    turned out to be of great advantage, working as an aid in communicating the vision and as a

    tool for developers in finalizing and trying out the product. The virtual prototype resided in a

    web browser and was operated with a computer mouse. It behaved like the UI developed for

    the mobile phone, with fully operating menus and sound. In addition to the virtual prototype,

    figure 5 shows a flow map that was created to visualize the different elements and the

    relations between menus.

    Figure 4 - Conceptual UI prototype for people with vision loss


    Figure 5 - Flow map for navigation in the UI

    The Android OS was chosen as the development platform for the UI, a decision based on

    several factors. The Android platform is adopted by several phone manufacturers and is

    currently the fastest growing mobile OS in the market (The Nielsen Company, 2010; Gartner,

    2010a; Comscore, 2010). Hence, products developed for the Android OS have the potential of

    reaching out to a large customer base. The Android OS was also a natural choice for the

    developers, since it allows anyone to create a new UI for the phone. Furthermore, it is

    considered a very open and flexible OS for application development (Ableson, 2009; Google

    Inc, 2010a; Google Inc, 2010b).

    Figure 6 presents a conceptual overview of the application, developed in Java for Android,

    with support for Android version 1.6 and onwards.

    Figure 6 - Conceptual model of application (the Application consists of a Base Activity with its code-behind, the external modules TTS Extended and Haptics, a set of Activities 1 to X each with its own OnTouchListener, the UI view, and language packs for English, Norwegian and other languages)


    Base Activity contains standard functionality and provides communication with the external

    modules TTS Extended (Eyes-Free Project, n.d.) and Kickback (Ganov, n.d.); both are

    modules used for creating voice feedback (the screen reader) and vibration (haptics). All feedback

    to the user, whether menus, voice or haptics, is provided through components called Activities. Each

    Activity has an OnTouchListener, which registers where and how the user is tapping on the

    screen. All menu elements reside in XML format, which can easily be translated to different

    languages. The current version supports English and Norwegian, but other languages can be

    added as appropriate.
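
    As an illustration of this structure, the sketch below shows what a base Activity of this kind might look like in Java. It substitutes the standard Android TextToSpeech and Vibrator APIs for the TTS Extended and Kickback modules that the project actually integrated, and the method name speakAndVibrate and the resource handling are illustrative assumptions rather than the project's code.

```java
import android.app.Activity;
import android.content.Context;
import android.os.Bundle;
import android.os.Vibrator;
import android.speech.tts.TextToSpeech;

/**
 * Sketch of a base Activity providing voice and haptic feedback to all menus.
 * The real project integrated the TTS Extended and Kickback modules; this
 * sketch substitutes the standard Android TextToSpeech and Vibrator APIs.
 */
public abstract class BaseActivity extends Activity
        implements TextToSpeech.OnInitListener {

    protected TextToSpeech tts;
    protected Vibrator vibrator;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        tts = new TextToSpeech(this, this);
        vibrator = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);
    }

    @Override
    public void onInit(int status) {
        // The speech engine is ready; menus can now be read out.
    }

    /** Announce a menu element and give a short vibration pulse. */
    protected void speakAndVibrate(int labelResId) {
        // Menu labels live in XML string resources, so switching between the
        // English and Norwegian language packs only swaps the resources.
        String label = getString(labelResId);
        vibrator.vibrate(40);
        tts.speak(label, TextToSpeech.QUEUE_FLUSH, null);
    }

    @Override
    protected void onDestroy() {
        if (tts != null) {
            tts.shutdown(); // release the speech engine
        }
        super.onDestroy();
    }
}
```

    Concrete Activities would extend such a base class and pass the string resource of the element under the user's finger, so that changing the language only requires swapping the XML resources.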

    4.2 Results in regards to testing

    Two iterations of user testing were carried out. A total of four people participated in the first

    iteration, three women and one man with an average age of 50 years; a total of five people

    took part in the second iteration, the same four plus one additional man, bringing the average age to

    49.4. As mentioned, statistics from the World Health Organization (2009) show that women over

    50 make up the largest share of the world's blind population; thus the project's sample is representative of this population. Of the participants, three were blind; one was diagnosed

    with very low vision, although her interaction could indicate complete vision loss; the final

    participant had low vision and could see objects in very close proximity.

    According to Nielsen and Landauer (1993), only a small number of people are required for

    usability testing; in fact, five people should be sufficient to obtain adequate results (Richtel,

    1998). Nielsen (2000) states that such a small number of users will provide better results than

    a larger number, largely because, with limited project budgets, several smaller tests yield

    more value than one large test. This notion is in accordance with Design research, which

    recommends several iterations of user testing.

    The user tests were planned according to usability guidelines from Nielsen (1993), with

    usability measured relative to certain users and certain tasks, and where every test should

    define a representative and measurable set of tasks relevant to the users. For consistent

    measurements, a survey should be completed subsequently. The mean value of the measured

    attributes should exceed a specific minimum; on a Likert scale ranging from 1 to 5, the mean

    value should be at least 4 (Nielsen & Levy, 1994) and 50% of participants should give a

    score of 5 (Nielsen, 1993). With two iterations of user testing, the ratings from the two

    surveys can be compared to evaluate whether the improvements affected usability ratings and

    if further improvements are still required (Nielsen, 1993).
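
    As a concrete illustration of these two criteria (a mean of at least 4, and at least half of the participants giving the top score of 5), the short Java sketch below checks the ratings collected for a single survey statement; the class and method names are purely illustrative.

```java
/**
 * Sketch of Nielsen's acceptance criteria for one survey statement:
 * the mean rating must be at least 4 and at least 50% of participants
 * must give the top score of 5. Illustrative only.
 */
public final class UsabilityCriteria {

    public static boolean meetsCriteria(int[] ratings) {
        int sum = 0;
        int fives = 0;
        for (int rating : ratings) {
            sum += rating;
            if (rating == 5) {
                fives++;
            }
        }
        double mean = (double) sum / ratings.length;
        double shareOfFives = (double) fives / ratings.length;
        return mean >= 4.0 && shareOfFives >= 0.5;
    }

    public static void main(String[] args) {
        // Example: four participants rate a statement 5, 5, 4 and 4.
        System.out.println(meetsCriteria(new int[] {5, 5, 4, 4})); // prints true
    }
}
```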

    Survey questions were categorized according to the features they were set to evaluate; these are listed in

    Table 7. "Menu system" and "The solution" are considered the two most important categories, as they measure how well the UI works and the success of the solution. The participants

    received the same set of questions in both rounds.


    Table 7 - Summary of survey categories

    Figure 7 presents the results from round one and two, with an overview of the mean scores for

    each functionality category. Several questions make up a category; thus the mean scores are

    averaged across questions as well as participants.

    Figure 7 - Graph displaying mean scores from the first and second round of testing

    The difference between mean scores from round one and two clearly shows responses were

    more positive in round two, where means were higher than four for all categories. The two

    categories of particular interest, "Menu system" and "The solution", both presented fairly poor feedback in the first round, with initial mean scores as low as 2.3 for "The solution" and 3.9 for "Menu system". According to Nielsen and Levy's (1994) criteria, these low scores would constitute a project failure. However, round two of testing provided a substantial increase in

    positive feedback; thus, it can be concluded that the solution tested in round two is more

    efficient and better received among the participants.

    Data underlying figure 7 ("Comparing components in the UI", average scores from 1.0 to 5.0):

                  Menu system   Phone   Navigation   Applications   Information   The solution
        Round 1   3.9           4.4     4.1          4.0            4.3           2.3
        Round 2   4.5           4.7     4.7          4.7            4.8           4.3


    Figure 8 shows the results from the individual questions in the "Menu system" category. The mean scores illustrate the changes in the participants' reactions between the first and second

    iteration.

    Figure 8 - Graph displaying a comparison between results from first and second round of testing on the menu system

    The results show an increase in scores for all but one question, indicating that the second

    menu system received a better acceptance than the first version. The reduction in scores for

    the questions relating to the vibration function was clarified through verbal feedback from

    participants. In the first round of testing, a greater reliance on haptic feedback was required for

    navigation; however, in the second round of testing, the improvements made the auditory

    feedback easier to follow, and participants chose to rely more on sound than touch. These results are consistent

    with findings from Rassmus-Gröhn (2006), who found that people tend to rely more on the

    sense they are accustomed to.

    Data underlying figure 8 ("Menu system", average scores):

        Statement                                                Round 1   Round 2
        The menu system was easy to use                          4.0       4.6
        The system made the touch phone easier to use            3.7       4.6
        The spoken options were easy to understand               4.3       4.6
        It was easy to navigate between menus and sub-menus      2.5       4.6
        The vibration made it easier to change between menus     4.3       3.6
        The categorization of sub-menus was logical              4.5       4.8


    Figure 9 presents results from the individual questions for the solution category. The mean

    scores portray the participants' acceptance of the solution for rounds one and two.

    *1 Three participants did not respond due to no prior experience with solutions for touch screens.
    *2 One participant did not respond due to no prior experience with touch screens to compare against.

    Figure 9 - Graph displaying a comparison between results from first and second round of testing on the complete solution

    The results show an increase in mean scores for all questions, indicating that the whole

    solution received a better acceptance in the second round of testing. It should be noted that

    some participants decided not to answer the second and third questions, feeling that they

    lacked the experience to give an informed response. However, all participants responded to

    the most important question, "I personally felt that the menu system enabled and improved my use of the mobile phone", with a great increase in positive feedback.

    While the Design Research method does not require statistical analyses, these were

    nevertheless carried out to shed more light on the changes in scoring that were evident

    between the two versions of the UI. Effect sizes and differences between means were

    evaluated across the categories previously outlined. Due to the small sample size, a certain

    measure of leniency with respect to level of significance was judged necessary; this was

    therefore set to 0.1. Thus mean scores, available in figure 7, were found to be significantly

    higher for the second round of testing, compared to the first round, for the navigation

    functions (t(3) = 2.45, p<0.1), for the information functions (t(3) = 2.45, p<0.1), and for the

    solution functions (t(3) = 5.44, p<0.05). Effect sizes were judged to be high for all three

    functions (r = 0.41; r = 0.40; r = 0.89; respectively). Effect sizes were similarly high for the

    menu functions (r = 0.37) and the application functions (r = 0.41), but the differences between

    means were non-significant (t(3) = 1.47, ns; t(3) = 1.48, ns; respectively). The remainder of

    the analyses is included in Appendix M.M.
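
    For reference, the paired-samples t statistic assumed to underlie these comparisons has the general form shown below; with the four participants who took part in both rounds of testing, the degrees of freedom come to n - 1 = 3, which matches the reported t(3) values. The conversion to the effect-size measure r is not reproduced here.

```latex
% Paired t-test on the difference scores d_i between round two and round one,
% computed for the n = 4 participants who completed both rounds of testing.
t = \frac{\bar{d}}{s_d / \sqrt{n}}, \qquad \mathit{df} = n - 1 = 3
```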

    Data underlying figure 9 ("Acceptance of the solution", average scores):

        Statement                                                                  Round 1   Round 2
        I personally felt that the menu system enabled and improved my use
        of the mobile phone                                                        2.3       4.6
        I prefer this solution above others *1                                     2.5       4.0
        I would use this solution in my daily life *2                              2.3       4.3


    Nielsen (1993) suggests that participants should be asked for their subjective opinions, to

    better provide an understanding of their satisfaction. Participants were asked to state their

    thoughts and suggestions to improvements, following each round this feedback was used to

    improve the application further. Feedback from round one is available in Appendix M.A,

    while a complete list of feedback from round two is available in Appendix M.F. The most

    informative comments from round two are summarized in Table 8.

    Table 8 - Subjective feedback from round two

    The positive feedback from participants shows how much they appreciated the second

    iteration of the solution; this is consistent with the survey results. The negative feedback

    mainly focuses on the new functionality introduced in the second iteration, indicating that the

    issues encountered in the first iteration had been solved and that a third iteration should aim at

    solving the new problems. Comments pointing to the inadequacy of the vibration stem from

    the overall improvements to the solution, which led participants to shift their attention to the

    voice feedback. Participants' suggestions for improvements are reasonable and well thought through, and should certainly be considered in a third

    iteration.
