Welcome to the presentation of 3DHapticWebBrowser, an application that enables haptic navigation
on the web for visually impaired users.
The main advantage of the application over existing technologies (such as screen
readers) is that visually impaired users can navigate freely within a web page: no reading
sequence is imposed. Users can also perceive the structure of the web page. Moreover,
2D maps found on the web can be explored haptically.
This is the interface of the application.
The core concept is the transformation of each HTML component into a 3D object with
haptic feedback, called a "hapget".
For instance, these hapgets represent images with hyperlinks to other pages. Yellow
hapgets represent hyperlinks, while blue ones represent text components.
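The mapping from HTML components to hapgets could be sketched roughly as follows. This is a minimal illustration, not the application's actual implementation: the `Hapget` class, its fields, and the restriction to link and text nodes are assumptions made for the example.

```python
from dataclasses import dataclass
from html.parser import HTMLParser


@dataclass
class Hapget:
    kind: str    # "hyperlink" or "text"
    color: str   # yellow for hyperlinks, blue for text components
    label: str   # text spoken by the speech synthesis engine


class HapgetBuilder(HTMLParser):
    """Walk an HTML page and emit one hapget per supported component."""

    def __init__(self):
        super().__init__()
        self.hapgets = []
        self._in_link = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._in_link = True

    def handle_endtag(self, tag):
        if tag == "a":
            self._in_link = False

    def handle_data(self, data):
        text = data.strip()
        if not text:
            return
        if self._in_link:
            self.hapgets.append(Hapget("hyperlink", "yellow", text))
        else:
            self.hapgets.append(Hapget("text", "blue", text))


builder = HapgetBuilder()
builder.feed("<p>Welcome</p><a href='/news'>News</a>")
```

After parsing, `builder.hapgets` holds one blue text hapget ("Welcome") and one yellow hyperlink hapget ("News"), which would then be laid out in the 3D scene.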
As we can see, the application has mouse support for partially sighted users.
By pushing this button, we can activate a conventional web browser that has been integrated
into 3DHapticWebBrowser.
On the top of the current page there are two hyperlinks.
These two yellow hapgets are their equivalents in the 3D scene.
By pushing these two red buttons, or alternatively the UP and DOWN keys of the keyboard,
the user can scroll up and down the page.
When the "left control" key of the keyboard is pressed while the cursor is over
a hapget, the hapget's description is read aloud via a speech synthesis engine.
Let's see how visually impaired users can interact with the application using a
haptic device such as the "Phantom Omni" from Sensable Technologies.
The cursor is now 3D and can be moved to any position in the 3D scene, as well as rotated.
Using the haptic device, the user can feel the shape and the surface of each hapget.
Each hapget also has an earcon: a short audio signal that makes identifying
hapgets easier.
When the "left control" key of the keyboard is pressed while the 3D cursor is in
contact with a hapget, the hapget's description is read aloud.
Audio messages inform the user of their position in the web page.
By pressing the button of the haptic device while "touching" a hapget that represents a hyperlink,
the user navigates to the hyperlink's target page.
If the user needs guidance during haptic exploration of the 3D scene, pressing
keys 1 to 5 of the keyboard automatically moves the cursor to the center or to the corners of the
scene. The application applies a force through the haptic device that guides the cursor
to the desired position.
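Such a guidance force is commonly computed as a spring pull toward the target, capped at the device's force limit. The sketch below is an assumption about how this could work, not the application's actual controller; the stiffness and cap values are illustrative and would be tuned to the device (the Phantom Omni outputs roughly 3 N).

```python
def guidance_force(cursor, target, stiffness=0.8, max_force=3.0):
    """Spring-like force vector pulling the haptic cursor toward a target.

    Hypothetical sketch: `stiffness` (N per unit distance) and `max_force`
    (N) are illustrative values, not the application's real parameters.
    """
    # Displacement from cursor to target, per axis.
    dx = [t - c for t, c in zip(target, cursor)]
    # Proportional (spring) force.
    force = [stiffness * d for d in dx]
    # Cap the magnitude so the device's force limit is never exceeded.
    magnitude = sum(f * f for f in force) ** 0.5
    if magnitude > max_force:
        force = [f * max_force / magnitude for f in force]
    return force


# Far from the target, the force saturates at max_force.
force = guidance_force([0.0, 0.0, 0.0], [10.0, 0.0, 0.0])
```

In a real haptic loop this would be re-evaluated at the device's servo rate (typically ~1 kHz) until the cursor reaches the target.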
The user can press the space key of the keyboard to type a new URL.
Each keystroke is read aloud, and by pressing the RETURN key the user navigates to
the desired web page.
If the user identifies a hapget representing an image as a map (via its alternative text), pressing
the TAB key of the keyboard switches to map exploration mode.
The 2D map is automatically transformed into a 3D multimodal map with haptic and aural
feedback.
The road names have been recognized by an Optical Character Recognition (OCR) mechanism.
The crossroads have also been identified.
The error rate of the OCR mechanism depends mainly on the resolution of the
map, the zoom level, and the rotation of the road names.
By pressing the Escape key of the keyboard, the user can return to the web page.
There is a set of options that can be adjusted to the user's preferences, including
the use of earcons and haptic icons, the size of the hapgets, and so on.
A set of available hapget types is also provided.
Moreover, the application has UsiXML support.
UsiXML is an XML-based language that describes multimodal
user interfaces. For every visited web page, the corresponding 3D scene, including the positions
and descriptions of the hapgets, is described in an automatically generated UsiXML
document.
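Generating such a scene description could look roughly like the sketch below. The element and attribute names here are illustrative placeholders, not the actual UsiXML schema, and the flat scene/hapget structure is an assumption made for the example.

```python
import xml.etree.ElementTree as ET


def scene_to_xml(hapgets):
    """Serialize a 3D scene description to an XML string.

    Illustrative sketch only: element names ("scene", "hapget",
    "description") do not follow the real UsiXML schema.
    """
    root = ET.Element("scene")
    for h in hapgets:
        el = ET.SubElement(root, "hapget", {
            "type": h["type"],
            "x": str(h["x"]),
            "y": str(h["y"]),
            "z": str(h["z"]),
        })
        # Text spoken when the user queries the hapget.
        ET.SubElement(el, "description").text = h["description"]
    return ET.tostring(root, encoding="unicode")


doc = scene_to_xml([
    {"type": "hyperlink", "x": 0, "y": 1, "z": 0, "description": "News"},
])
```

Loading a saved document would then reverse this step, rebuilding the hapget positions and descriptions without re-parsing the original page.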
There are also load and save options for the automatically generated UsiXML file.