Developing Multi-modal Interfaces for Visually Impaired People to Access the Internet

iFeelPixel association

Mar 29, 2007, 11:02:54 AM
to iFeelPixel
Hi all,

For your information.
The paper appended below, on multimodal interfaces and the iFeelPixel
software, has appeared on the web.


Best regards,
iFeelPixel Association


Interactive Accessibility via Tactile Sense technology
http://www.ifeelpixel.com


"Developing Multi-modal Interfaces for Visually Impaired People to
Access the internet"

Wai Yu, Graham McAllister, Emma Murphy, Ravi Kuber, Philip Strain

Queen's University of Belfast
Northern Ireland
{W.Yu, G.Mcallister, E.Murphy, R.Kuber, P.Strain}@qub.ac.uk

Keywords: visual impairments, internet, audio, interfaces, haptics,
multi-modal interface, technology, navigation, blind, first prototype,
reading, screen readers, accessibility, design

Abstract
This paper describes the work being carried out at the Queen's
University of Belfast (QUB) on improving visually impaired people's
access to information on the Internet. In particular, the project
focuses on the problems that visually impaired people have in
navigating and reading information from Web pages. These problems are
addressed using a multi-modal approach that combines visual, audio
and haptic technologies. The first prototype of these interfaces has
been developed based on the results of a user requirements capture
conducted with visually impaired people. This paper will present a
review of the current technology that assists visually impaired
people in accessing the Internet, together with users' comments on
this technology. Drawing on that user feedback, the paper will
present a prototype multi-modal interface and discuss the issues that
should be considered when designing such interfaces.


EXTRACT about iFeelPixel TactileWare:

Research has shown that advantage can be gained when the haptic
modality is used in conjunction with the visual and auditory channels
[Yu 02]. The IFeelPixel multi-modal application [IFeelPixel] enables
the user to detect structures such as edges, lines and textures via a
tactile/force feedback mouse. Multi-modal solutions have the capacity
to extend visual displays and thus make objects more realistic,
useful and engaging [Brewster 01]. Multimodal interfaces also assist
the mental mapping process, allowing the user to develop a greater
awareness of objects contained within the environment. A clearer
spatial representation created by multi-modal feedback enhances the
user's ability to orientate and navigate within their environment
[Lahav 02, Caffrey 04]. It is apparent that a multi-modal assistive
interface can provide one solution for reducing the barriers faced by
the blind and partially sighted community.

Read more from Source URL:
http://www.sarc.qub.ac.uk/~emurphy/papers/hci2005.pdf
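
To make the idea in the extract concrete, here is a minimal Python
sketch of how edge and texture structures under the cursor might be
mapped to tactile feedback. This is not iFeelPixel's actual
implementation: the HapticMouse class and its set_vibration() method
are hypothetical placeholders for a real device API.

import numpy as np

def edge_strength(gray, x, y):
    """Sobel gradient magnitude at interior pixel (x, y), scaled to [0, 1]."""
    patch = gray[y - 1:y + 2, x - 1:x + 2].astype(float)
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
    gx = np.sum(patch * kx)      # horizontal gradient
    gy = np.sum(patch * kx.T)    # vertical gradient
    return min(np.hypot(gx, gy) / (4 * 255.0), 1.0)

class HapticMouse:
    """Hypothetical wrapper around a tactile/force feedback mouse driver."""
    def set_vibration(self, level):
        print("vibration level: %.2f" % level)

def on_cursor_move(gray, x, y, mouse):
    # Stronger vibration over edges and texture boundaries, silence on flat areas.
    mouse.set_vibration(edge_strength(gray, x, y))

if __name__ == "__main__":
    img = np.zeros((10, 10), dtype=np.uint8)
    img[:, 5:] = 255                  # vertical edge between columns 4 and 5
    mouse = HapticMouse()
    on_cursor_move(img, 5, 5, mouse)  # on the edge: full vibration
    on_cursor_move(img, 2, 5, mouse)  # flat region: no vibration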

iFeelPixel association

Mar 29, 2007, 11:27:48 AM
to iFeelPixel

Another interesting extract from this paper:

- "One of the most important considerations in designing audio for a
non-visual interface is that the auditory space
overloads much faster than the visual space. Initial user testing has
highlighted the fact that too many sounds
presented simultaneously will confuse the user. Short auditory clips
are preferable to convey simple information
quickly. Continuous sounds should be used sparingly (économiquement)
and a hierarchy of significance of each sound should be
determined before it is used."
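
As a rough illustration of that guideline, the Python sketch below
caps the number of simultaneous sounds and uses a per-sound priority
to decide what plays when the auditory channel is saturated. The
AuditoryChannel class and the sound names are illustrative
assumptions, not part of the paper or of iFeelPixel.

import heapq

class AuditoryChannel:
    """Limits concurrent sounds; higher-priority sounds displace lower ones."""
    def __init__(self, max_concurrent=3):
        self.max_concurrent = max_concurrent
        self.active = []                 # min-heap of (priority, name)

    def request(self, name, priority):
        """Play a short clip if a slot is free or it outranks the least significant sound."""
        if len(self.active) < self.max_concurrent:
            heapq.heappush(self.active, (priority, name))
            return True
        lowest_priority, _ = self.active[0]
        if priority > lowest_priority:
            heapq.heapreplace(self.active, (priority, name))  # drop the least significant sound
            return True
        return False                     # channel saturated; sound suppressed

    def finished(self, name):
        """Free the slot when a clip ends."""
        self.active = [(p, n) for p, n in self.active if n != name]
        heapq.heapify(self.active)

channel = AuditoryChannel(max_concurrent=3)
channel.request("link-hover", priority=1)
channel.request("page-loaded", priority=2)
channel.request("error-alert", priority=5)
print(channel.request("ambient-hum", priority=0))  # False: everything playing outranks it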
