MOBILE INTERACTION: AN ANDROID-BASED PROTOTYPE TO EVALUATE HOW WELL BLIND USERS CAN INTERACT WITH A TOUCH-SCREEN

First Author Name (Blank if Blind Review) Affiliation (Blank if Blind Review) Address (Blank if Blind Review)

e-mail address (Blank if Blind Review) Optional phone number (Blank if Blind Review)

Second Author Name (Blank if Blind Review) Affiliation (Blank if Blind Review) Address (Blank if Blind Review)

e-mail address (Blank if Blind Review) Optional phone number (Blank if Blind Review)

ABSTRACT

Mobile devices are increasingly becoming part of everyday life for many different uses. These devices are mainly based on touch-screens, which is challenging for people with disabilities. For visually impaired people, interacting with touch-screens can be very complex because of the lack of hardware keys or tactile references. It is therefore necessary to investigate how to design applications, accessibility supports (e.g. screen readers) and operating systems for mobile accessibility. Our aim is to investigate interaction modalities so that even people with sight problems can successfully interact with touch-screens. A crucial issue is the lack of hardware buttons on the numpad, and herein we propose a possible solution to this problem. In this work we present the results of evaluating a prototype developed for the Android platform on mobile devices. Twenty blind users were involved in the study. The results show a positive response, especially with regard to users who had never interacted with touch-screens.

Author Keywords

Mobile accessibility, mobile interfaces, blind users

ACM Classification Keywords

H.5.2 [Information Interfaces and Presentation]: User Interfaces – Input devices and strategies, Voice I/O, Haptic I/O. K.4.2. [Computers and society]: Social issues – assistive technologies for persons with disabilities.

General Terms

Design, Experimentation, Human Factors.

INTRODUCTION

Nowadays mobile devices are used more and more for a variety of purposes. This is due to the increasingly advanced features offered by smartphones, which provide additional functionalities compared to traditional phones. The interaction modality increasingly used on these devices is mainly a touch-screen display. The absence of hardware keys and of any tactile reference makes interaction with smartphones more difficult and complex for those who are blind. Interaction modalities based on gestures and taps can be a practicable solution, provided they are well designed and simple to use. Apple has already put on the market devices accessible to users with disabilities, such as the iPhone 3G, 4 and 4S (http://www.apple.com/accessibility/). At the same time there are also some active projects aimed at studying how to provide access to devices based on the Android system (http://eyes-free.googlecode.com/svn/trunk/documentation/android_access/index.html). However, all these solutions and studies are still at an early stage. It is therefore important to understand the suitability of the new interaction modalities of touch-screen devices for people with vision impairment. Our aim is to evaluate whether there are still aspects to be made more accessible and usable. In [omitted for blind review] the authors observed some usability issues encountered by blind users while interacting with the iPad tablet, although the VoiceOver support seems to be generally accessible. This implies that there are still mobile accessibility issues to be analyzed and evaluated in order to enhance blind user interaction with a touch-screen.

The study presented in this paper is part of mobile accessibility research, with particular reference to interaction with touch-screen based smartphones by blind people, especially first-time users. Devices based on the Android platform are still not particularly accessible and usable by blind people, so we selected this platform to investigate mobile interaction by blind users. The aim is to gather information, suggestions and indications on touch-screen interaction by blind users which should be considered when designing mobile applications and accessibility supports. To this end, in this work we present a prototype application designed to make the main phone features available in a way which is accessible for a blind user. The prototype was developed primarily to evaluate interaction modalities based on gestures, audio and vibro-tactile feedback. A small group of blind people was involved at an early stage of the prototype development to collect first impressions and preferences, which were considered during the design phase of the study. Subsequently a structured user test was conducted to collect qualitative and quantitative data from the blind users' point of view.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. MobileHCI 2011, Aug 30–Sept 2, 2011, Stockholm, Sweden. Copyright 2011 ACM 978-1-4503-0541-9/11/08-09....$10.00.


The paper is organized as follows: after a brief introduction to existing mobile studies, we describe the application prototype developed to investigate interaction modality with a touch-based display. Next we report on the user test conducted to evaluate the prototype. Conclusions end the work.

RELATED WORK

Several works have recently been proposed in the literature on mobile interaction by people with disabilities. With reference to visually impaired users, some studies discuss the "importance" of touch when interacting with a mobile device [1, 6]. Further studies are related to ways of combining and exploiting various interaction modalities and techniques in order to enhance blind user interaction. In fact, a multimodal approach can be a valuable way to support various interaction modes, such as speech, gesture and handwriting for input and spoken prompts. The tests described in [2] showed that user performance significantly improves when haptic stimuli are provided to alert users to unintentional operations (e.g. double clicks or slips during text insertion). However, that study mainly focuses on the advantages of exploiting the haptic channel as a complement to the visual one and is not concerned with solutions for blind users. By combining various interaction modalities, it is possible to obtain an interactive interface suitable for users with varying abilities. A well-designed multimodal application can be used by people with a wide variety of impairments. In [omitted for blind review] the authors evaluated a combination of audio and vibro-tactile feedback on a museum guide, which had a positive response from users with vision impairments. The study introduced in [8] considers the haptic channel combined with audio feedback to improve graphic perception by blind users. With regard to user interface (UI) design, the work presented in [11] suggests an important principle which should be considered when designing a product: developers should focus on ability rather than disability. This interesting concept has been considered in several pilot projects, including Slide Rule, which studied touch-screen access for blind users. In particular, Slide Rule is a prototype utilizing accuracy-relaxed multi-touch gestures, "finger reading" and screen layout schemes to enable blind people to use unmodified touch-screens [4]. A very critical aspect for a blind person is input interaction; several studies have investigated possible alternatives to address such issues [9].

In our work we intend to investigate whether gestures together with voice and vibro-tactile feedback can help blind users confirm an action on an Android-based smartphone. The Android platform includes a built-in text-to-speech engine and a screen reader so that phone manufacturers can provide accessible smartphones. Android phones can also be highly customized by downloading third-party accessibility applications that make nearly every function possible without sight, including making phone calls, text messaging, emailing and web browsing (http://www.google.com/accessibility/products/). [10] describes an example application developed for the blind on an Android-platform device. However, the work on accessibility support is still in progress, and input/output modalities need to be investigated in order to identify the most appropriate ways to interact with a touch-screen.

PROTOTYPE DESIGN

The proposed prototype is an application for the Android mobile operating system. It was tested on the Samsung Galaxy S, Samsung Nexus S and Huawei IDEOS. The application is not in any way intended to replace screen-reading software. Instead it implements the features and functionalities needed to assess and understand the most appropriate user interaction via a touch-screen. For the prototype we implemented the basic phone functionalities: contacts, calls, reading/writing an SMS (text message) and reading information in table format. We also implemented a user settings functionality that allows the user to choose how to provide and receive information (e.g. editing modality, type of vibration, etc.); this was also useful for switching from one modality to another. When designing the application prototype we considered the following aspects to be evaluated for our purposes:

• Interaction with a multimodal interface based on a touch-screen (gestures) and feedback (audio and vibration);

• Organization and consistency of the user interface (e.g., menus, buttons and labels);

• Editing content (e.g., SMS text or phone number).

Gestures

The interaction modality implemented with the prototype was mainly based on gestures: left/right and up/down flicks, taps and double taps. For instance, to move through the menus the user proceeds via right and left flicks, and a double tap confirms a menu item.

Figure 1. Menu flicks
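As a rough illustration of this interaction scheme, flick and double-tap recognition maps naturally onto Android's GestureDetector API. The following Java sketch is only a plausible reconstruction, not the prototype's actual code; MenuNavigator, moveFocus() and confirmFocused() are hypothetical names.

import android.view.GestureDetector;
import android.view.MotionEvent;

// Hypothetical collaborator that owns the menu focus.
interface MenuNavigator {
    void moveFocus(int direction);
    void confirmFocused();
}

// Sketch: horizontal flicks move the menu focus, a double tap confirms.
public class MenuGestureHandler extends GestureDetector.SimpleOnGestureListener {
    private static final float MIN_FLICK_VELOCITY = 500f; // px/s, tuning assumption
    private final MenuNavigator navigator;

    public MenuGestureHandler(MenuNavigator navigator) {
        this.navigator = navigator;
    }

    @Override
    public boolean onFling(MotionEvent e1, MotionEvent e2,
                           float velocityX, float velocityY) {
        // A fast, mostly horizontal movement is treated as a left/right flick.
        if (Math.abs(velocityX) > Math.abs(velocityY)
                && Math.abs(velocityX) > MIN_FLICK_VELOCITY) {
            navigator.moveFocus(velocityX < 0 ? +1 : -1); // left flick = next item
            return true;
        }
        return false;
    }

    @Override
    public boolean onDoubleTap(MotionEvent e) {
        navigator.confirmFocused(); // double tap selects the focused item
        return true;
    }
}

The listener would be wrapped in a GestureDetector and fed the view's MotionEvents from an OnTouchListener, so that the whole screen acts as a gesture surface.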

Feedback

Both audio and vibro-tactile feedback was used to provide UI information. In relation to the audio feedback, we considered:

Vocal messages. All the spoken messages are created by a voice synthesizer. We used a text-to-speech engine available for the Android platform (https://market.android.com/details?id=com.svox.langpack.installer&hl=it).

Short sounds. A very short beep was used to notify the beginning and end of an item list. While navigating a list of elements using left or right flicks, when the focus reaches the first or last item a short sound announces the end of the list. For lists of elements such as menus, SMSs and table cells, the focus moves in a cyclic way: using the 'next' gesture (e.g. left flick) the focus moves from the last item back to the first one, and vice versa, and a short sound is emitted to notify the end of the list.
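A minimal sketch of this cyclic navigation with an end-of-list beep is shown below; the ToneGenerator call is one plausible way to produce the short sound, and the class itself is a hypothetical simplification.

import android.media.AudioManager;
import android.media.ToneGenerator;

// Sketch: cyclic focus over a list, with a short beep when wrapping around.
public class CyclicListFocus {
    private final ToneGenerator beeper =
            new ToneGenerator(AudioManager.STREAM_MUSIC, ToneGenerator.MAX_VOLUME);
    private int focus = 0;

    // direction: +1 for the 'next' gesture (left flick), -1 for 'previous'.
    public int moveFocus(int listSize, int direction) {
        int next = focus + direction;
        if (next < 0 || next >= listSize) {
            next = (next + listSize) % listSize;                 // wrap around
            beeper.startTone(ToneGenerator.TONE_PROP_BEEP, 60);  // ~60 ms beep
        }
        focus = next;
        return focus;
    }
}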

For the vibro-tactile feedback, we exploited the vibration functionality available on the phone for:

Action confirmation. A very short vibro-tactile response was provided when selecting a UI element (e.g. a menu item or button) with a double tap. This feedback complements the voice-tone change that confirms an action.

Button and key detection. To allow the user to quickly detect a button or a key, a short vibro-tactile response can be perceived when touching the item. This solution was applied only to some buttons and keys, in order to assess its benefit compared with items announced only by voice.
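Both feedback channels correspond to standard Android services. The fragment below is an illustrative sketch (FeedbackManager is a hypothetical wrapper name) using the TextToSpeech and Vibrator APIs available on the platform:

import java.util.Locale;

import android.content.Context;
import android.os.Vibrator;
import android.speech.tts.TextToSpeech;

// Illustrative wrapper around the two feedback channels described above.
public class FeedbackManager implements TextToSpeech.OnInitListener {
    private final TextToSpeech tts;
    private final Vibrator vibrator;

    public FeedbackManager(Context context) {
        tts = new TextToSpeech(context, this);
        vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
    }

    @Override
    public void onInit(int status) {
        if (status == TextToSpeech.SUCCESS) {
            tts.setLanguage(Locale.getDefault()); // voice synthesizer language
        }
    }

    // Vocal message: speak a UI label, flushing anything queued before it.
    public void announce(String text) {
        tts.speak(text, TextToSpeech.QUEUE_FLUSH, null);
    }

    // Action confirmation: a very short vibro-tactile pulse on selection.
    public void confirmAction() {
        vibrator.vibrate(30); // 30 ms is a tuning assumption
    }
}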

User Interface

When designing the prototype, certain aspects concerning the organization and arrangement of the interface, such as the best structure and the most appropriate position and size of the items, were analysed. This was to ensure that identification of the UI elements was simple, and to exploit prior knowledge of the usual positions of keys. The main UI elements considered in the prototype can be summarized as follows.

Menus and sub-menus. All available functionalities have been grouped into menus and sub-menus according to macro topics.

Set-corner buttons. In order to facilitate some main actions, four buttons were placed at the four display corners:

• (1 – top left) ‘exit/back’ to go to the previous step,

• (2 – top right) ‘Position’ to know the current position status (e.g. editing numpad),

• (3 – bottom left) 'Repeat' for reading the edited content (e.g. phone number or text) in order to check it for errors, or for reading the whole written message (e.g. an SMS message),

• (4 – bottom right) ‘Done’ to be used as an ‘OK’ button to confirm the current action.

Buttons (2) and (3) are particularly useful for a blind person, who can thus learn the current status and position at any time without having to explore the whole screen, as described in another study [omitted for blind review] related to focus issues.

Figure 2. Four buttons in the screen corners
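As an illustration, pinning four buttons to the screen corners is straightforward with a FrameLayout and gravity flags; the sketch below is an assumption about one plausible layout approach, not the prototype's actual implementation.

import android.content.Context;
import android.view.Gravity;
import android.widget.Button;
import android.widget.FrameLayout;

// Sketch: one action button anchored in each corner of the display.
public class CornerButtons {
    public static void addTo(Context ctx, FrameLayout root) {
        int[] gravities = {
                Gravity.TOP | Gravity.LEFT,     // (1) exit/back
                Gravity.TOP | Gravity.RIGHT,    // (2) Position
                Gravity.BOTTOM | Gravity.LEFT,  // (3) Repeat
                Gravity.BOTTOM | Gravity.RIGHT  // (4) Done
        };
        String[] labels = {"Back", "Position", "Repeat", "Done"};
        for (int i = 0; i < labels.length; i++) {
            Button button = new Button(ctx);
            button.setText(labels[i]);
            root.addView(button, new FrameLayout.LayoutParams(
                    FrameLayout.LayoutParams.WRAP_CONTENT,
                    FrameLayout.LayoutParams.WRAP_CONTENT,
                    gravities[i]));
        }
    }
}

Fixed corner positions give blind users stable tactile anchors: the bezel itself acts as the reference point.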

Editing Content

Editing via a touch-screen is a challenge for blind people due to the lack of hardware keys: typing either a phone number or a text becomes particularly difficult. The main issue is detecting keys in a simple and reliable way. Thus for the editing activity we considered: the numeric keypad (numpad), the keyboard and the editing modality.

Numeric Keypad Detection

Several possible solutions were considered to make the numpad keys easier to detect. In the prototype the numpad keys were identified using (1) number vocalization and (2) key vibration. In a first prototype version, vibration was used to mark only the number '5', as is common on hardware numpads. Based on preliminary comments received from end users during the prototype development, the solution implemented for the user test provides differentiated vibro-tactile feedback:

• Single vibration for the even numbers (2, 4, 6, 8 and 0);


• Double vibration for the odd numbers except '5' (1, 3, 7 and 9);

• Triple vibration for the number '5'.

Figure 3. Numeric keypad

The different vibration patterns were designed to help the user detect the number being touched. For example, the user can recognize the number '5' from its triple vibration; sliding the finger to the right, a single vibration (even number) indicates the key '6', whereas a double vibration (odd number) means the finger has shifted slightly upwards (to '3') or downwards (to '9'). This should support number detection, especially in a noisy environment.
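This digit-to-pattern mapping is easy to express with Android's Vibrator patterns. The sketch below illustrates the scheme; the pulse and pause durations are illustrative assumptions, not values from the paper.

import android.os.Vibrator;

// Sketch of the vibration scheme: one pulse for even digits,
// two for odd digits, three for '5'. Pattern arrays alternate
// wait/vibrate durations in milliseconds.
public class NumpadVibration {
    private static final long[] SINGLE = {0, 40};                 // 0, 2, 4, 6, 8
    private static final long[] DOUBLE = {0, 40, 80, 40};         // 1, 3, 7, 9
    private static final long[] TRIPLE = {0, 40, 80, 40, 80, 40}; // 5

    public static void vibrateForDigit(Vibrator vibrator, char digit) {
        long[] pattern;
        if (digit == '5') {
            pattern = TRIPLE;
        } else if ((digit - '0') % 2 == 0) {
            pattern = SINGLE;
        } else {
            pattern = DOUBLE;
        }
        vibrator.vibrate(pattern, -1); // -1: play once, do not repeat
    }
}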

Keyboard

For text editing, a virtual qwerty keyboard was implemented in the prototype. No vibration support was provided; each key is announced vocally when touched. In this case we focused especially on the editing modality, i.e. single or double tap to confirm the touched letter (see the next section).

Figure 4. Software keyboard

Editing Modality

Two editing modalities were implemented to select a number or letter: (1) single tap and (2) double tap. With the single-tap modality the user explores the screen (i.e. the keyboard or numpad) with a finger without lifting it from the screen; when the finger is over the desired letter/number, raising it (i.e. the up event) confirms (i.e. edits) the last key touched. With the double-tap modality the user is free to explore; once the desired letter/number is identified, a double tap selects (i.e. edits) it.
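For clarity, the single-tap modality can be sketched as a touch listener that announces keys as the finger slides and commits the last one on lift-off; keyAt(), announce() and commit() are hypothetical helpers standing in for the prototype's hit-testing, speech and editing logic.

import android.view.MotionEvent;
import android.view.View;

// Sketch of the single-tap editing modality: each key under the moving
// finger is announced; lifting the finger confirms the last key touched.
public class SingleTapEditor implements View.OnTouchListener {
    private char lastKey = '\0';

    @Override
    public boolean onTouch(View v, MotionEvent event) {
        switch (event.getAction()) {
            case MotionEvent.ACTION_DOWN:
            case MotionEvent.ACTION_MOVE:
                char key = keyAt(event.getX(), event.getY());
                if (key != lastKey) {
                    lastKey = key;
                    announce(key);   // speak the key under the finger
                }
                return true;
            case MotionEvent.ACTION_UP:
                commit(lastKey);     // the up event confirms the key
                return true;
        }
        return false;
    }

    // Hypothetical helpers (hit-testing, TTS feedback, text insertion).
    private char keyAt(float x, float y) { return lastKey; }
    private void announce(char key) { }
    private void commit(char key) { }
}

In the double-tap modality the ACTION_UP branch would do nothing, and a GestureDetector double-tap callback would call commit() instead.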

USER TEST

Overview

In order to evaluate the effectiveness and efficiency of the interaction modalities available in the prototype, a structured test with end users was conducted to collect objective and precise information. The main goal was to understand whether the interaction modality implemented via gestures and vibration could be suitable for a blind user. We also planned to collect information on the UI design in relation to menus, labels, buttons and audio/vibro-tactile feedback. The evaluation was targeted at answering the following questions: (1) Is the proposed gesture-based interaction appropriate for a blind person? (2) Is the arrangement of the UI elements suitable for quickly detecting the interactive elements? (3) How easy is the application to learn and to use? We were particularly interested in recording perceived difficulties and critical issues when editing through a single or double tap, as well as comments on using the numpad and keyboard. Vibro-tactile feedback was also considered in our evaluation.

Method

Participants

Twenty totally blind users (7 female and 13 male) were involved in the user testing. The participants were recruited in collaboration with the Italian Association for the Blind. Their ages ranged from 22 to 70 years. All of them use a computer with a screen reader in a Windows environment on a daily basis. Five had no experience with smartphones and touch-screens, thirteen had intermediate experience, and two had very good knowledge of the iPhone.

Test protocol

Four meetings were held at four local offices of the Italian Association for the Blind in different cities, with five users involved in each. At the beginning of each meeting a general presentation of the test purpose was given, highlighting the importance of the user's role in the design and development cycle. The experimental protocol was divided into three phases:

Preliminary phase: participants were provided with an overall description of the prototype as well as with a list summarising the most important gestures and UI elements;

Training phase: each user was allowed to explore the application for 20 minutes in order to gain confidence with the smartphone and the gesture-based interaction;

Testing phase: an individual test session was carried out for each user. Each participant was asked to perform a set of tasks, and the execution time was recorded with a chronometer. The users were observed while carrying out the tasks, and we applied the "thinking aloud" method to collect information as the user interacted with the prototype.

The training phase was designed to avoid ability bias: discrepancies in interaction ability among users, associated with different degrees of individual practice, can affect the results of a test. The training allowed the participants to start the testing procedure with similar basic skills, especially regarding knowledge of the gestures.

Through this test procedure both subjective and objective data were gathered for each user. With regard to objective data we recorded, for each task: (1) the time spent performing the assigned task, (2) task accomplishment (success/failure), and (3) the errors made in performing the task. Regarding subjective data, we collected comments and suggestions during both the training and test sessions by observing the users while they were using the application. Specific questions and interviews allowed us to collect further useful information.

Tasks

To evaluate the interaction modality developed with the Android-based prototype we designed six tasks to be performed by each user. The type of task was selected according to the interaction modality to be evaluated. The six tasks assigned to each user during the test session are listed in the table below.

Task  Description                                          Goal
T1    Reading an SMS message                               Gesture interaction with the menus and main buttons
T2    Making a call (single tap)                           Editing a number
T3    Making a call (double tap)                           Editing a number
T4    Sending an SMS message (qwerty keyboard)             Editing a text
T5    Searching for a flight time in a timetable           Exploring a table (rows and columns)
T6    Making a call (vibro-tactile numpad and single tap)  Editing a number

As we were especially interested in comparing the single-tap editing method with the double tap, we assigned two tasks requiring the same action, i.e. editing a number (T2 and T3). In order to avoid the potential bias created by the learning effect in using the numpad in tasks T2 and T3, we counterbalanced the order: half of the users performed T2 before T3 and the other half the reverse. We applied this procedure in all four sessions with the 20 users. T6 was introduced to evaluate the vibro-based numpad, in order to understand whether vibro-tactile support is a feasible solution for detecting keys more easily. For this specific solution we plan to investigate further, so as to collect additional information on whether this support can improve interaction when using a numpad.

Post questionnaire

After performing the test, participants were asked to fill in a questionnaire composed of 22 questions. This made it possible to collect information about the mobile devices and smartphones used by the participants, and to obtain qualitative data not obtainable through observation. Subjective information was also considered: for example, users could express opinions and ideas about the usefulness of audio and vibro-tactile feedback, labels, the keyboard and numpad, etc. Indications about the level of difficulty of editing were also gathered. The questions regard the following topics:

• General information about the user

• Prior knowledge and experience of using mobiles, and user expectations of a smartphone

• Suggestions/opinions regarding multi-modal mobile interaction

• Prototype evaluation

For the first three topics the user chooses a response from a set of options. For the last topic the user rates specific prototype features on a scale from 1 (lowest value) to 5 (highest value). For all topics the user could provide comments and suggestions.

EVALUATION RESULTS

As mentioned, both objective and subjective data were gathered during the test procedure.

Objective data


Table 1 reports the average (M) and standard deviation (SD) of the time (mm:ss.s) spent by users performing the assigned tasks.

Task   M        SD
T1     01:41.4  01:01.8
T2     02:04.4  01:07.4
T3     01:47.1  00:45.2
T4     01:44.4  00:43.7
T5     01:25.7  01:02.3
T6     01:54.7  00:45.0

Table 1. Time spent

Only successfully completed performances are taken into account.

Table 2 reports the average success and failure rates for task accomplishment.

Task   Success  Failure
T1     0.81     0.18
T2     0.85     0.14
T3     0.85     0.14
T4     0.55     0.45
T5     0.85     0.14
T6     1.00     0.00

Table 2. Task accomplishment

The results show that most of the users are able to successfully complete the tasks. As expected, text editing has the largest failure rate.

Table 3 reports the average number of errors per task.

Task   Average errors
T1     0.05
T2     0.55
T3     0.55
T4     0.85
T5     0.15
T6     0.70

Table 3. Number of errors

The errors considered relate to the number of attempts, and not the number of successfully completed tasks.

It is worth noting that more errors occurred in task T2 when it was performed after task T3 (70%). The users performed a double tap to select a number instead of a single tap, thus copying the editing modality of task T3.

During the test, it was observed that the users with experience of smartphone and touch-screen technologies achieved the best results. This implies that the accessibility of the tasks improves with practice.

Subjective data

All users had experience with Symbian technology and use Symbian mobile phones. 55% of the users had never used a smartphone and stated that they had no knowledge of touch-screen technology. The users were also asked which features they would like to have on a smartphone: 85% are interested in traditional phone functionalities (i.e. phoning, SMSs, contacts), 71% in internet access, and 69% in email. A few users declared an interest in reading e-books and taking notes.

In the “Smartphone knowledge” section of the questionnaire, the users were asked for detailed opinions on mobile device interaction via touch-screen.

55% of the users think that a software keyboard is a valid way to provide input to the smartphone. It is worth noting that 83% of the users with this view are those who had already used a smartphone. This suggests that users who are initially reluctant to use a software keyboard can change their mind after using one.

After using the software numpad to insert a number, however, 97% of the users think it is a valid way to perform this function.

77% of the users say that it would be useful to have a speech recognizer for issuing voice commands to the smartphone.

The majority (88%) think that vibro-tactile feedback is a valuable way to obtain indications from the smartphone.

The last question in the “Smartphone knowledge” section regards the possible presence of physical points of reference on the touch screen: 83% of the users consider that it would be very helpful to have physical reference points.

The questions contained in the “Prototype evaluation” section of the questionnaire regard the evaluation of the tasks performed with the prototype application. A scale from 1 (negative) to 5 (positive) was employed when the user was asked to express a score.

The users were asked about the usefulness of flick gestures when browsing information. The average value obtained is M = 4.47. Seventeen users think that left-to-right and right-to-left gestures are appropriate for scrolling lists of elements, while 3 users think that top-to-bottom and bottom-to-top gestures would be more intuitive.

Regarding the use of the four buttons at the corners of the screen for retrieving orientation information, the users gave their view on their usefulness. The average value obtained is M = 4.84.

In particular, 15 users expressed their appreciation of the "Repeat" button for two reasons: they found it very useful for listening again to the last message or the text entered so far, and they liked the fact that they could use it at any point during the interaction. This could well be a solution to the problem observed by the authors in [omitted for blind review].

The average value obtained for vibration feedback is M = 3.84. This is the most polarizing result of those analyzed: users are divided between those who believe that the vibration does not provide any added value and those who think that it is essential in noisy environments and for reasons of privacy.

In particular, it is interesting to note that the users in favour of vibration are the same ones who clearly perceived the difference between the vibrations of different keys.

With regard to the single-tap method for selecting a key on the keyboard or numpad, the average value obtained is M = 4.44, while the approval rating for the double-tap mode is M = 4.06. The general opinion is that the double tap is better suited to novice users because it would result in more successful editing. Interestingly, the novice users who expressed this view actually interacted better when using the single-tap mode.

Both novice and experienced users agree that the single tap is in general a faster way to insert text. However, they think that a single tap is not suitable for critical actions such as "Done" or "Delete"; they say they would feel more secure with a double tap.

Furthermore, it is worth noting that one of the most common obstacles, particularly for novice users, is successfully performing a double tap. This difficulty occurs when the time between the two taps is too long, or when the second tap lands in a different position on the screen. As a result, these users made a large number of errors when selecting elements with a double tap, whereas they achieved the correct result with a single tap.

For text insertion, 3 users suggested that it would be useful to have a list of predefined messages, which could be modified to include customized information.

Eleven users think that the qwerty keyboard is difficult to use, an opinion unconnected to the editing modality chosen. For all of them this was due to the keys being positioned too close together, which prevented easy identification.

Regarding the phrases and words used by the voice (i.e. to read the UI labels), the average value obtained is M = 4.63. Some users suggested making the labels more context-dependent: for instance, the "Done" button could be replaced by "Send" when writing an SMS message, or by "Call" when making a call.

Six users think that in some cases the phrases used to introduce the activities are too long. These would probably be appropriate when first using the device, but for subsequent use it would be better to allow the user to customize the message. This suggests that the level of detail of the speech feedback might be an additional configuration parameter.

In conclusion, the global evaluation of the tested prototype is expressed by the average value M = 4.26.

Overall, users showed unexpectedly high interest in the application and were very willing to contribute their opinions and comments to the study.

A significant consequence of the study is that as many as five of the participants said they would consider buying an Android smartphone in order to install this application.

CONCLUSIONS

The study presented in this work is aimed at investigating the interaction with mobile devices by users with visual impairments. To this end, we chose Android-powered devices and we developed a prototype application to implement some targeted functionalities, which allowed us to analyse and evaluate how well the blind user is able to interact and work with a smartphone. Although the prototype developed is limited to only a few features for the Android operating system, the results obtained from blind user interaction with an Android-based device can be generalized and applied to any mobile device based on a touch-screen. Thus, the results of this work could be useful to developers of mobile operating systems and applications based on a touch-screen, in addition to those working on designing and developing assistive technologies.

Herein we have presented and discussed the results obtained through a user test conducted with 20 blind users in order to evaluate mobile interaction using the proposed prototype.

Comments from the users while interacting with the prototype, as well as the data collected while they performed the set of tasks, were encouraging. The researchers also observed positive reactions regarding intuitiveness and ease of use among those who had never used a touch-screen before. Based on the data and suggestions collected we can begin to outline certain aspects and features preferred by the users, which should be considered in user interface as well as assistive technology design. In particular we collected positive impressions and comments on the usefulness of the following UI features:

• The four action/function buttons at the corners of the touch-screen, such as "Back" and "Done/OK". This kind of feature, located in fixed places, can improve user interaction.

• Assistive technology functions for obtaining information on the current status. Examples are the "Position" button to find out the current location, or the "Repeat" button to easily re-read the focused element (especially for edit fields, to check what has been written). Specific buttons or gestures can be a worthwhile solution.

• Vibro-tactile and tactile support to improve perception of given events (e.g. to confirm an action) as well as to identify UI parts or elements (e.g. UI macro areas or the focused edit field).

• A fully perceivable numpad as an alternative or an addition to audio number vocalization. Vibro-tactile support differentiating the numbers (e.g. odd and even) with different patterns is a possible direction.

The study [5] suggests that blind subjects prefer gestures that use screen corners, edges and multi-touch (enabling quicker and easier identification), and that they favour new gestures placed in well-known spatial layouts (such as a qwerty keyboard). With regard to the four buttons placed at the corners, our study confirmed that blind users appreciate UI elements which are easy to locate or otherwise perceivable (e.g. via vibro-tactile feedback). In contrast, practical problems are encountered when editing with a qwerty keyboard, even though the layout of the keys is well known to the user. A specific user test based on the vibro-tactile numpad will make it possible to collect additional and more precise information on the use of vibration as an accessibility support to overcome the difficulties associated with a virtual numpad. More generally, the evaluation confirmed that editing is still a challenge for a blind person, as already pointed out by the task completion results, where task T4 was successfully completed by only about half of the users. Further investigation is therefore necessary to identify how to improve text editing and numpad accessibility.

ACKNOWLEDGMENTS

The authors wish to thank all the people who participated in the user testing, and the Italian Association for the Blind for its collaboration in organizing the four meetings.

REFERENCES

1. Benedito, J., Guerreiro, T., Nicolau, H., Gonçalves, D. The key role of touch in non-visual mobile interaction. In Proc. MobileHCI '10, ACM (2010), 379-380.

2. Brewster, S.A., Chohan, F., Brown, L.M. Tactile feedback for mobile interactions. In Proc. CHI '07, ACM (2007), 159-162.

3. Omitted for blind review.

4. Kane, S.K., Bigham, J.P., Wobbrock, J.O. Slide Rule: making mobile touch screens accessible to blind people using multi-touch interaction techniques. In Proc. ASSETS '08, ACM (2008), 73-80.

5. Kane, S.K., Wobbrock, J.O., Ladner, R.E. Usable gestures for blind people: understanding preference and performance. In Proc. CHI '11, ACM (2011), 413-422.

6. Koskinen, E., Kaaresoja, T., Laitinen, P. Feel-good touch: finding the most pleasant tactile feedback for a mobile touch screen button. In Proc. ICMI '08, ACM (2008), 297-304.

7. Omitted for blind review.

8. Manshad, A.S. Multimodal vision glove for touchscreens. In Proc. ASSETS '08, ACM (2008), 251-252.

9. Oliveira, J., Guerreiro, T., Nicolau, H., Jorge, J., Gonçalves, D. BrailleType: unleashing braille over touch screen mobile phones. In Proc. INTERACT 2011.

10. Shaik, A.S., Hossain, G., Yeasin, M. Design, development and performance evaluation of a reconfigured mobile Android phone for people who are blind or visually impaired. In Proc. SIGDOC '10, ACM (2010).

11. Wobbrock, J.O., Kane, S.K., Gajos, K.Z., Harada, S., Froehlich, J. Ability-based design: concept, principles and examples. ACM Trans. Access. Comput. (2011).