MOBILE INTERACTION: AN ANDROID-BASED PROTOTYPE TO EVALUATE HOW WELL BLIND USERS CAN INTERACT WITH A TOUCH-SCREEN
First Author Name (Blank if Blind Review) Second Author Name (Blank if Blind Review)
Affiliation (Blank if Blind Review) Affiliation (Blank if Blind Review)
Address (Blank if Blind Review) Address (Blank if Blind Review)
e-mail address (Blank if Blind Review) e-mail address (Blank if Blind Review)
Optional phone number (Blank if Blind Review) Optional phone number (Blank if Blind Review)
ABSTRACT
Mobile devices are increasingly becoming part of everyday life for many different uses. These devices are mainly based on touch-screens, which is challenging for people with disabilities. For visually-impaired people, interacting with touch-screens can be very complex because of the lack of hardware keys or tactile references. Thus it is necessary to investigate how to design applications, accessibility supports (e.g. screen readers) and operating systems for mobile accessibility. Our aim is to investigate interaction modalities so that even those who have sight problems can successfully interact with touch-screens. A crucial issue concerns the lack of hardware buttons for the numpad; herein we propose a possible solution to overcome this factor. In this work we present the results of evaluating a prototype developed for the Android platform used on mobile devices. Twenty blind users were involved in the study. The results have shown a positive response, especially with regard to users who had never interacted with touch-screens.

Author Keywords
Mobile accessibility, mobile interfaces, blind users

ACM Classification Keywords
H.5.2 [Information Interfaces and Presentation]: User Interfaces – Input devices and strategies, Voice I/O, Haptic I/O. K.4.2 [Computers and Society]: Social Issues – Assistive technologies for persons with disabilities.

General Terms
Design, Experimentation, Human Factors.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
MobileHCI 2011, Aug 30–Sept 2, 2011, Stockholm, Sweden.
Copyright 2011 ACM 978-1-4503-0541-9/11/08-09....$10.00.

INTRODUCTION
Nowadays mobile devices are used more and more for a variety of purposes. This is due to the increasingly advanced features offered by smartphones, which provide additional functionalities compared to traditional phones. The interaction modality increasingly used for these devices is mainly via a touch-screen display. The absence of hardware keys and of any tactile reference makes the interaction with smartphones more difficult and complex for those who are blind. Interaction modalities based on gestures and taps can be a practicable solution, provided they are well designed and simple to use. Apple has already put on the market devices accessible to users with disabilities, such as the iPhone 3G, 4 and 4S (http://www.apple.com/accessibility/). At the same time, there are also some active projects aimed at studying how to provide access to devices based on the Android system (http://eyes-free.googlecode.com/svn/trunk/documentation/android_access/index.html). However, all these solutions and studies are still at an early stage. It is therefore important to understand the suitability of the new interaction modalities with touch-screen devices for people with vision impairment. Our aim is to evaluate whether there are still aspects to be made more accessible and usable for user interaction. In [reference not found] the authors observed some usability issues encountered by blind users while interacting with the iPad tablet, although the VoiceOver support seems to be generally accessible. This implies that there are still mobile accessibility issues to be analyzed and evaluated in order to enhance blind user interaction with a touch-screen.

The study presented in this paper is part of mobile accessibility research, with particular reference to interaction with touch-screen based smartphones by blind people, especially first-time users. Devices based on the Android platform are still not particularly accessible and usable by blind people, so we selected this platform to investigate mobile interaction by blind users. The aim is to gather information, suggestions and indications on interaction with a touch-screen by blind users, which should be considered when designing mobile applications and supports as well. To this end, in this work we present a prototype application designed to make the main phone features available in a way which is accessible for a blind user. The prototype was developed firstly to evaluate interaction modalities based on gestures, audio and vibro-tactile feedback. A small group of blind people was involved at an early stage of the prototype development to collect first impressions and preferences, which were considered during the design phase of the study. Subsequently, a structured user test was conducted to collect qualitative and quantitative data from the blind users' point of view.
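The combination the prototype explores, gesture input paired with spoken, audio and vibro-tactile feedback, can be sketched in a platform-neutral way. The following is only an illustrative model, not the authors' Android code; the class and event names are invented for the example. It mirrors behaviour described later in the paper: flicks move the focus cyclically through a menu, a short beep marks the list boundary, and a double tap confirms the focused item with a short vibration plus speech.

```python
# Illustrative sketch (not the paper's implementation): mapping gesture
# events to menu navigation and multimodal feedback events.

class MenuNavigator:
    def __init__(self, items):
        self.items = items
        self.focus = 0
        self.feedback = []  # collected feedback events: speech, beep, vibration

    def _announce(self):
        # every focus change is announced by the speech synthesizer
        self.feedback.append(("speak", self.items[self.focus]))

    def on_flick(self, direction):
        # a right flick moves forward, a left flick moves back;
        # the focus wraps cyclically, and a short beep marks the wrap
        step = 1 if direction == "right" else -1
        wrapped = (self.focus == len(self.items) - 1 and step == 1) or \
                  (self.focus == 0 and step == -1)
        self.focus = (self.focus + step) % len(self.items)
        if wrapped:
            self.feedback.append(("beep", "end-of-list"))
        self._announce()

    def on_double_tap(self):
        # selection is confirmed by a short vibro-tactile response plus speech
        self.feedback.append(("vibrate", "short"))
        self.feedback.append(("speak", "selected " + self.items[self.focus]))
        return self.items[self.focus]

menu = MenuNavigator(["Contacts", "Call", "SMS", "Settings"])
menu.on_flick("right")         # focus moves to "Call"
menu.on_flick("right")         # focus moves to "SMS"
chosen = menu.on_double_tap()  # confirms "SMS" with vibration and speech
```

On Android, this logic would be wired to the platform's gesture detection, text-to-speech and vibration services; the sketch only models the event-to-feedback mapping.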
The paper is organized as follows: after a brief introduction to existing mobile studies, we describe the application prototype developed to investigate interaction modality with a touch-based display. Next we report on the user test conducted to evaluate the prototype. Conclusions end the work.

RELATED WORK
Several works have recently been proposed in the literature on mobile interaction by people with disabilities. With reference to visually-impaired users, some studies discuss the "importance" of touch when interacting with a mobile device [1, 6]. Further studies relate to ways of combining and exploiting various interaction modalities and techniques in order to enhance blind user interaction. In fact, a multimodal approach can be a valuable way to support various interaction modes, such as speech, gesture and handwriting for input, and spoken prompts. The tests described in [2] showed that user performance significantly improves when haptic stimuli are provided to alert users to unintentional operations (e.g. double clicks or slips during text insertion). However, that study mainly focuses on the advantages of exploiting the haptic channel as a complement to the visual one and is not concerned with solutions for blind users. By combining various interaction modalities, it is possible to obtain an interactive interface suitable for users with varying abilities. A well-designed multimodal application can be used by people with a wide variety of impairments. In [reference not found] the authors evaluated a combination of audio and vibro-tactile feedback on a museum guide, which had a positive response from users with vision impairments. The study introduced in [8] considers the haptic channel combined with audio feedback to improve graphic perception by blind users. With regard to user interface (UI) design, the work presented in [11] suggests an important principle which should be considered when designing a product: developers should focus on ability rather than disability. This interesting concept has been considered in several pilot projects, including Slide Rule, which studied touch-screen access for blind users. In particular, Slide Rule is a prototype utilizing accuracy-relaxed multi-touch gestures, "finger reading" and screen layout schemes to enable blind people to use unmodified touch-screens [4]. A very critical aspect for a blind person is input interaction, and several studies have investigated possible alternatives to address this issue [9].

In our work we intend to investigate whether gestures and voice and vibro-tactile feedback can be useful for the blind to confirm an action on an Android-based smartphone. The Android platform includes a built-in text-to-speech engine and a screen reader so that phone manufacturers can provide accessible smartphones. Android phones can also be highly customized by downloading third-party accessibility applications that make nearly every function possible without sight, including making phone calls, text messaging, emailing and web browsing (http://www.google.com/accessibility/products/). [10] describes an example application developed for the blind using an Android-platform device. However, the proposed work on accessibility support is still in progress, and input/output modalities need to be investigated in order to identify the most appropriate ways to interact with a touch-screen.

PROTOTYPE DESIGN
The proposed prototype is an application for the Android system for mobile devices. It was tested on the Samsung Galaxy S, Samsung Nexus S and Huawei IDEOS. The application is not in any way intended to replace screen reading software. Instead, it is designed to implement those features and functionalities that could be used to assess and understand the most appropriate user interaction via a touch-screen. So, for the prototype we implemented the basic phone functionalities (contacts, call, read/write an SMS (text message), and read information in table format). We also implemented a user settings functionality in order to allow the user to choose how to provide and receive information using the phone (e.g. editing modality, type of vibration, etc.). This was also useful to switch from one modality to another. When designing the application prototype we considered the following aspects to be evaluated for our purposes:

• Interaction with a multimodal interface based on a touch-screen (gestures) and feedback (audio and vibration);
• Organization and consistency of the user interface (e.g., menus, buttons and labels);
• Editing content (e.g., SMS text or phone number).

[Figure 1. Menu flicks]

Gestures
The interaction modality implemented with the prototype
was mainly based on gestures such as left/right and up/down flicks, taps and double taps. For instance, to go through the menus the user can proceed via right and left flicks. To confirm a menu item, a double tap is required.

Feedback
Both audio and vibro-tactile feedback were used to provide UI information. In relation to audio feedback, we considered:

Vocal messages. All the spoken messages are created by a voice synthesizer. We used a text-to-speech engine available for the Android platform (https://market.android.com/details?id=com.svox.langpack.installer&hl=it).

Short sounds. A very short beep was used to notify the beginning and end of an item list. While navigating a list of elements using left or right flicks, when the focus is over the first or last item a short sound announces the end of the list. For lists of elements, such as menus, SMSs and table cells, the focus moves in a cyclic way: using the 'next' gesture (e.g. left flick) the focus moves from the last item to the first one, and vice-versa, and a short sound is emitted to notify the end of the list.

For the vibro-tactile feedback, we exploited the vibration functionality available on the phone for:

Action confirmation. A very short vibro-tactile response was provided when selecting a UI element (e.g. menu item or button) with a double tap. This feedback is in addition to the voice tone change that provides confirmation of an action.

Button and key detection. To allow the user to quickly detect a button or a key, a short vibro-tactile response can be perceived when touching the item. This solution was applied only to some buttons and keys, in order to understand its benefit compared with those just announced by voice.

User Interface
When designing the prototype, certain aspects concerning the organization and arrangement of the interface, such as the best structure and the most appropriate position and size of the items, were analysed. This was to ensure that identification of the UI elements was simple and to exploit prior knowledge of the usual positions of keys. The main UI elements considered in the prototype can be summarized as:

Menus and sub-menus. All available functionalities have been grouped into menus and sub-menus according to macro topics.

Set-corner buttons. In order to facilitate some main actions, four buttons were placed at the four display corners:

• (1 – top left) 'Exit/Back', to go to the previous step;
• (2 – top right) 'Position', to know the current position status (e.g. editing numpad);
• (3 – bottom left) 'Repeat', for reading the edited content (e.g. phone number or text) in order to check for errors, or for reading the whole written message (e.g. an SMS message);
• (4 – bottom right) 'Done', to be used as an 'OK' button to confirm the current action.

Buttons (2) and (3) are particularly useful for a blind person, who can thus know the current status and position at any time without having to explore the whole screen, as described in another study [reference not found] related to focus issues.

[Figure 2. Four buttons in the screen corners]

Editing Content
Editing via a touch-screen is a challenge for blind people due to the lack of hardware keys. Typing both a phone number and a text becomes particularly difficult. The main issue is related to key detection in a simple and reliable way. Thus for editing activity we considered: the numeric keypad (numpad), the keyboard and the editing modality.

Numeric Keypad Detection
To support detection accessibility for the numpad, some possible solutions were considered. In the prototype the numpad keys were identified using (1) number vocalization and (2) key vibration. In a first prototype version, vibration was used to mark only the number '5', as is common on hardware numpads. Based on the preliminary comments received from the end users during prototype development, the solution implemented for the user test provides different vibro-tactile feedback:

• Single vibration for the even numbers (2, 4, 6, 8 and 0);
• Double vibration for the odd numbers except '5' (1, 3, 7 and 9);
• Triple vibration for the number '5'.

[Figure 3. Numeric keypad]

The different vibration frequencies have been designed to provide possible support for better detecting the number being touched. For example, the user can recognize the number '5' from the triple vibration. Then, sliding the finger to the right, a single vibration (even numbers) means the key is '6', whereas a double vibration (odd numbers) means that the finger has shifted slightly upwards (so it must be '3') or downwards ('9'). This should support number detection, especially in a noisy environment.

Keyboard
For text editing, a virtual qwerty keyboard was implemented in the prototype. No vibration support was provided; each key is announced vocally when touched. In this case we focused especially on the editing modality, i.e. single or double tap to confirm the touched letter (see next paragraph).

[Figure 4. Software keyboard]

Editing Modality
Two editing modalities were implemented to select either a number or a letter: (1) single tap and (2) double tap. With the single tap, the user explores the screen (i.e. keyboard or numpad) using a finger without leaving the screen; when the finger is over the desired letter/number, raising it (i.e. the up event) confirms (i.e. edits) the last key touched. With the double tap, the user is free to explore; when the desired letter/number is identified, a double tap is used to select (i.e. edit) it.

USER TEST
Overview
In order to evaluate the effectiveness and efficacy of the interaction modalities available in the prototype, a structured test with end users was conducted to collect objective and precise information. The main goal was to understand whether the interaction modality implemented via gestures and vibration could be suitable for a blind user. We also planned to collect information on UI design in relation to menus, labels, buttons and audio/vibro feedback. The evaluation was targeted at answering the following questions: (1) Is the proposed gesture-based interaction appropriate for a blind person? (2) Is the arrangement of the UI elements suitable for quickly detecting the interactive elements? (3) How easy is the application to learn and to use? We were particularly interested in recording perceived difficulties and critical issues for users when editing through a single or double tap, as well as comments on using the numpad and keyboard. Vibro-tactile feedback was also considered in our evaluation.

Method
Participants
Twenty totally blind users (7 female and 13 male) were involved in the user testing. The participants were recruited in collaboration with the Association for the Blind in Italy. Their ages ranged from 22 to 70 years. All of them use a computer with a screen reader in a Windows environment on a daily basis. Five of them had no experience with smartphones and touch-based screens, thirteen had intermediate experience, whereas two had very good knowledge of using the iPhone.

Test protocol
Four meetings were held at four local offices of the Italian Association for the Blind in different cities. Five users were involved in each test. At the beginning of each meeting a general presentation of the test purpose was made, highlighting the importance of the user's role in the design
and development cycle. The experimental protocol was divided into three phases:

Preliminary phase: participants were provided with an overall description of the prototype, as well as a list summarising the most important gestures and UI elements;

Training phase: each user was allowed to explore the application for 20 minutes in order to gain confidence with the smartphone and the gesture-based interaction;

Testing phase: an individual test session was carried out for each user. Each user was asked to perform a set of tasks, and the execution time was registered with a chronometer. The users were observed while carrying out the tasks. We applied the "thinking aloud" method to collect information while the user was interacting with the prototype.

The training phase was designed to avoid the bias of ability: discrepancies in interaction abilities among users, associated with different degrees of individual training, can affect the results of a test. The training phase allowed the participants to start the testing procedure with similar basic skills, especially regarding knowledge of the gestures.

Through this test procedure, both subjective and objective data were gathered for each user in order to collect useful information for the evaluation. With regard to objective data, we recorded for each task: (1) the time spent by users performing the assigned tasks, (2) task accomplishment (success/failure), and (3) errors made in performing the task. Regarding subjective data, we collected comments and suggestions during both the training and test sessions by observing the users while they were using the application. Specific questions and interviews also allowed us to collect further useful information.

Tasks
To evaluate the interaction modality developed with the Android-based prototype, we designed six tasks to be performed by each user. The type of task was selected according to the interaction modality to be evaluated. The six tasks assigned to each user during the test session are listed in the table below.

Task  Description                                          Goal
T1    Reading an SMS message                               Gesture interaction with the menus and main buttons
T2    Making a call (single tap)                           Editing a number
T3    Making a call (double tap)                           Editing a number
T4    Sending an SMS message (qwerty keyboard)             Editing a text
T5    Searching for a flight time in a time table          Exploring a table (rows and columns)
T6    Making a call (vibro-tactile numpad and single tap)  Editing a number

As we were especially interested in comparing the single-tap editing method with the double tap, we assigned two tasks with the same action, i.e. editing a number (T2 and T3). In order to avoid the potential bias created by the learning effect in using the numpad in tasks T2 and T3, we counterbalanced the users by carrying out the two tasks in different orders: T2 then T3, and vice-versa. We applied this arrangement in all four sessions with the 20 users. T6 was introduced to evaluate the usage of the vibration-based numpad, in order to understand whether vibro-tactile support is a feasible solution for detecting keys more easily. However, for this specific solution we plan to investigate further, so as to collect additional information on whether this support can improve interaction when using a numpad.

Post questionnaire
After performing the test, participants were asked to fill in a questionnaire composed of 22 questions. This made it possible to collect information about the mobile devices and smartphones used by the participants, and to obtain other qualitative data not obtainable during the observations. Subjective information was also considered; for example, users could express opinions and ideas about the usefulness of audio and vibro-tactile feedback, labels, keyboard, numpad, etc. Indications about the level of difficulty of editing were also taken into account. The questions regard the following topics:

• General information about the user
• Prior knowledge and experience of using mobiles, and user expectations of a smartphone
• Suggestions/opinions regarding multi-modal mobile interaction
• Prototype evaluation

For the first three topics the user chooses a response from a set of options. For the last topic the user rates specific prototype features on a scale from 1 (lowest value) to 5 (highest value). For all topics, users could provide comments and suggestions.

EVALUATION RESULTS
As said, during the test procedure objective and subjective data were gathered.

Objective data
In Table 1 the average (M) and standard deviation (SD) of the time spent by users performing the assigned tasks are provided. Only successfully completed performances are taken into account.

Task   M        SD
T1     01:41.4  01:01.8
T2     02:04.4  01:07.4
T3     01:47.1  00:45.2
T4     01:44.4  00:43.7
T5     01:25.7  01:02.3
T6     01:54.7  00:45.0
Table 1. Time spent

In Table 2 the average success and failure rates for task accomplishment are provided.

Task   Success  Failure
T1     0.81     0.18
T2     0.85     0.14
T3     0.85     0.14
T4     0.55     0.45
T5     0.85     0.14
T6     1.0      0.0
Table 2. Task accomplishment

The results show that most of the users were able to successfully complete the tasks. As expected, text editing has the largest failure rate.

In Table 3 the average number of errors per task is reported. The errors considered relate to the number of attempts, not to the number of successfully completed tasks.

Task   Errors
T1     0.05
T2     0.55
T3     0.55
T4     0.85
T5     0.15
T6     0.7
Table 3. Number of errors

It is worth noting that more errors occurred in task T2 when it was performed after task T3 (70%): the users performed a double tap to select a number instead of a single tap, thus copying the editing modality of task T3.

During the test, it was observed that the users with experience of smartphone and touch-screen technologies achieved the best results. This implies that the accessibility of the task improves with practice.

Subjective data
All users have experience of Symbian technology and use Symbian mobile phones. 55% of users had never used a smartphone and stated that they had no knowledge of touch-screen technology. Moreover, the users were questioned about the features they would like to have on a smartphone: 85% of users are interested in traditional phone functionalities (i.e. phoning, SMSs, contacts), 71% are interested in internet access, while 69% would like to have access to email. Few users declared an interest in reading e-books and taking notes.

In the "Smartphone knowledge" section of the questionnaire, the users were asked for detailed opinions on mobile device interaction via touch-screen.

55% of users think that a software keyboard is a valid way to provide input to the smartphone. It is worth noting that 83% of the users with this view are those who have already used a smartphone. This suggests that users who are initially reluctant to use a software keyboard can change their minds after using one. Instead, after using the software numpad to insert a number, 97% of the users think that it is a valid way to perform this function.

77% of the users say that it would be useful to use a speech recognizer to give some commands to the smartphone by voice. The majority (88%) think that vibro-tactile feedback is a valuable way to obtain indications from the smartphone.

The last question in the "Smartphone knowledge" section regards the possible presence of physical reference points on the touch-screen: 83% of the users consider that it would be very helpful to have physical reference points.

The questions contained in the "Prototype evaluation" section of the questionnaire regard the evaluation of the tasks performed with the prototype application. A scale from 1 (negative) to 5 (positive) was employed when the user was asked to express a score.

The users were asked about the usefulness of flick gestures when browsing information. The average value obtained is M: 4.47. Seventeen users think that the gestures from left to right and from right to left are appropriate for scrolling lists of elements, while 3 users think that gestures from top to bottom and from bottom to top would be more intuitive.

Regarding the use of the 4 buttons at the corners of the screen when retrieving orientation information, the users
7. gave their view on their usefulness. The average value replaced by "Send" when writing an SMS message, or
obtained is M:4,84. "Call" when making a call.
In particular, 15 users expressed their appreciation of the Six users think that in some cases the phrases used to
“Repeat” button for two reasons. They found it very useful introduce the activities are too long. They would probably
for listening again to the last message or the text entered so be appropriate when first using the device, but that for
far. They also liked the fact that they could use it at any subsequent use it would be better to allow the user to
point during the interaction. This could well be a solution to customize the message. This fact suggests that the level of
the problem observed by the authors in [Errore. L'origine detail of the speech feedback might be an additional
riferimento non è stata trovata.]. configuration parameter.
The average value obtained for vibration feedback is In conclusion, the global evaluation of the tested prototype
M:3,84. This data is the most subjective of those analyzed: is expressed with the average value: M:4:26.
users are divided between believing that the vibration does
Overall, users showed unexpectedly high interest in the
not provide any added value and those who think that it is
application and were very willing to contribute their
essential when in noisy environments and for reasons of
opinions and comments to the study.
privacy.
In particular it is interesting to note that the users in favour
of vibration are the same as those who clearly perceive the A significant consequence of the study is that as many as
difference between the vibration in different keys. five of the participants said they would consider buying an
Android smartphone in order to install this application.
With regard to the single tap method for selecting a key
within a keyboard or numpad the average value obtained is
CONCLUSIONS
M:4,44. Instead the approval rating for the double tap mode
The study presented in this work is aimed at investigating
is M: 4.06. The general opinion is that the double tap is
the interaction with mobile devices by users with visual
better suited for novice users because it would result in
impairments. To this end, we chose Android-powered
more successful editing. Interestingly novice users who
devices and we developed a prototype application to
express this view claim to interact better when using the
implement some targeted functionalities, which allowed us
single tap mode.
to analyse and evaluate how well the blind user is able to
Both novices and experienced users agree that the single tap interact and work with a smartphone. Although the
is a faster way to insert the text in general. However, they prototype developed is limited to only a few features for the
think that using a single tap is not suitable for critical actions such as "Done" or "Delete". They said they would feel more secure with a double tap.

Furthermore, it is worth noting that one of the most common obstacles, particularly for novice users, is successfully performing a double tap. This difficulty occurs when the time between the two taps is too long, or when the second tap is performed in a different position on the screen. As a result, these users made a large number of errors when selecting elements with a double tap, whereas they achieved the correct result with a single tap.

For text insertion, 3 users suggested that it would be useful to have a list of predefined messages, which could be modified to include customized information.

Eleven users found the qwerty keyboard difficult to use, and this opinion was unconnected to the editing modality chosen. For all of them this was due to the keys being positioned too close together, which prevented easy identification.

Regarding the phrases and words used by the voice (i.e. to read the UI labels), the average rating obtained was M = 4.63. Some users suggested making the labels more context-dependent. For instance, the "Done" button could be

Android operating system, the results obtained from blind user interaction with an Android-based device can be generalized and applied to any mobile device based on a touch-screen. Thus, the results of this work could be useful to developers of mobile operating systems and applications based on a touch-screen, in addition to those working on designing and developing assistive technologies.

Herein we have presented and discussed the results obtained through a user test conducted with 20 blind users in order to evaluate mobile interaction using the proposed prototype.

Comments from the users while interacting with the prototype, as well as the data collected while they performed a set of tasks, were encouraging. Positive feedback was also observed by the researchers regarding the intuitiveness and ease of use for those people who had never used a touch-screen before. Based on the data and suggestions collected, we can begin to outline certain aspects and features preferred by the users. These should be considered in User Interface as well as in assistive technology design. In particular, we evaluated and collected positive impressions and comments on the usefulness of the following UI features:
• The four action/function buttons at the corners of the touch-screen, such as "Back" and "Done/OK". This kind of feature, located in fixed places, can improve user interaction.

• The assistive technology functions used to obtain information on the current status. Examples are the "Position" button to find out the current location, or the "Repeat" button used to easily read the focused element (especially for the edit fields, to check what has been written). Specific buttons or gestures can be a worthwhile solution.

• Vibro-tactile and tactile support to improve perception of given events (e.g. to confirm an action) as well as to identify UI parts or elements (e.g. UI macro areas or the focused edit field).

• The fully perceivable numpad as an alternative or an addition to audio number vocalization. Vibro-tactile support to differentiate between the numbers (e.g. odd and even) with different frequencies is a possible direction.

The study [5] suggests that blind subjects prefer gestures that use screen corners, edges, and multi-touch (enabling quicker and easier identification) and identifies new gestures in well-known spatial layouts (such as a qwerty keyboard). With regard to the four buttons placed at the corners, our study confirmed that blind users appreciate these UI elements because they are easy to locate and perceivable (e.g. through vibro-tactile feedback). In contrast, practical problems are encountered when editing with a qwerty keyboard, even though the layout of the keys is well known to the user. A specific user test based on a vibro-tactile numpad will make it possible to collect additional and more precise information on vibration usage as an accessibility support, in order to overcome the difficulties associated with a virtual numpad. More generally, the evaluation confirmed that editing is still a challenge for a blind person, as already pointed out in the task completion results, where task T4 was successfully completed by only half of the users. Further investigation is therefore necessary in order to identify how to improve text editing and numpad accessibility.

ACKNOWLEDGMENTS
The authors wish to thank all the people who participated in the user testing, and the Italian Association for the Blind for their collaboration in the organization of the four meetings.

REFERENCES
1. Benedito, J., Guerreiro, T., Nicolau, H., Gonçalves, D.: The key role of touch in non-visual mobile interaction. In Proc. of MobileHCI '10, ACM, New York (2010), 379-380.
2. Brewster, S.A., Chohan, F., Brown, L.M.: Tactile feedback for mobile interactions. In Proc. of CHI '07, ACM Press (2007), 159-162.
3. Omitted for blind review.
4. Kane, S.K., Bigham, J.P., Wobbrock, J.O.: Slide Rule: making mobile touch screens accessible to blind people using multi-touch interaction techniques. In Proc. of ASSETS '08, ACM Press (2008), 73-80.
5. Kane, S.K., Wobbrock, J.O., Ladner, R.E.: Usable gestures for blind people: understanding preference and performance. In Proc. of CHI '11, ACM (2011), 413-422.
6. Koskinen, E., Kaaresoja, T., Laitinen, P.: Feel-good touch: finding the most pleasant tactile feedback for a mobile touch screen button. In Proc. of ICMI '08, ACM, New York (2008), 297-304.
7. Omitted for blind review.
8. Manshad, A.S.: Multimodal vision glove for touchscreens. In Proc. of ASSETS '08, ACM Press (2008), 251-252.
9. Oliveira, J., Guerreiro, T., Nicolau, H., Jorge, J., Gonçalves, D.: BrailleType: unleashing braille over touch screen mobile phones. In Proc. of INTERACT 2011.
10. Shaik, A.S., Hossain, G., Yeasin, M.: Design, development and performance evaluation of reconfigured mobile Android phone for people who are blind or visually impaired. In Proc. of the 28th ACM SIGDOC '10.
11. Wobbrock, J.O., Kane, S.K., Gajos, K.Z., Harada, S., Froehlich, J.: Ability-based design: concept, principles and examples. ACM Trans. Access. Comput. (2011).