Welcome to the presentation of the "Open Touch/Sound Maps" mobile application.
The "Open Touch/Sound Maps" application aims to help blind and visually impaired users virtually explore any map area in the world by touch.
The visual information that cannot be perceived by blind users is conveyed instead through vibration feedback, sonification, and audio messages from a text-to-speech module.
The application uses the publicly available Google geocoding web service to transform a search criterion into coordinates,
and the publicly available OpenStreetMap web service to retrieve the road network structure and points of interest of the desired map area.
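The two-step lookup just described can be sketched as follows. This is a minimal illustration, not the app's actual code: the exact request parameters are not stated in the presentation, so the URLs below simply follow the two services' publicly documented patterns (the Google Geocoding API `address` parameter and the OpenStreetMap API 0.6 `map` call).

```python
from urllib.parse import urlencode

def geocode_url(query):
    # Google Geocoding API: turns a free-text search criterion into coordinates.
    return ("https://maps.googleapis.com/maps/api/geocode/json?"
            + urlencode({"address": query}))

def osm_map_url(lat, lon, half_side=0.005):
    # OpenStreetMap API 0.6 'map' call: fetches the OSM (XML) file for a
    # small bounding box around the geocoded point (bbox = left,bottom,right,top).
    bbox = (lon - half_side, lat - half_side, lon + half_side, lat + half_side)
    return ("https://api.openstreetmap.org/api/0.6/map?bbox="
            + ",".join("%.6f" % v for v in bbox))
```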
The "Open Touch/Sound Maps" application can run on any mobile phone or tablet running Android version 2.2 or later.
This is the interface of the application.
The user can navigate through the options by simply sliding a finger left or right over the touch screen.
An option is selected by tapping on it.
The application can work in conjunction with any text-to-speech module already installed on the phone.
Let's examine the available options one-by-one.
First of all, the application currently supports two languages: English and Greek. More languages will be supported in the future.
Let's try the Greek language.
As you can see, both the graphical interface of the application and the audio messages produced by the text-to-speech mechanism are now in Greek.
Now, let's return to English.
The user can provide a search criterion describing a location; this may be an address or a point of interest.
For typing this search criterion, a virtual keyboard has been developed from scratch; it is more accessible than the standard Android virtual keyboard.
As the user moves a finger over the virtual keyboard, the currently selected letter is spoken aloud.
The letter is actually typed when the user's finger loses contact with the touch screen.
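The keyboard behaviour just described, speaking while the finger slides and typing only on release, can be modelled as a small state machine. This is a language-agnostic sketch of that logic, not the app's actual implementation:

```python
class SpeakOnReleaseKeyboard:
    """Speaks the letter under the finger while sliding; commits it on lift-off."""

    def __init__(self, speak):
        self.speak = speak      # text-to-speech callback
        self.current = None     # letter currently under the finger
        self.typed = []         # letters committed so far

    def on_finger_move(self, letter):
        if letter != self.current:      # announce only when the selected key changes
            self.current = letter
            self.speak(letter)

    def on_finger_up(self):
        if self.current is not None:    # the letter is typed only on release
            self.typed.append(self.current)
            self.current = None
```

Sliding across E, M, and P and lifting the finger on P would speak all three letters but type only "P".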
Let's suppose the user wants to explore the map area around the Empire State Building in New York.
The search criterion is passed to the Google geocoding web service, which then returns the coordinates of the desired location.
Then, these coordinates are passed to the OpenStreetMap web service, which returns an OSM file containing the description of the specific area.
The OSM file uses XML syntax and describes the roads and points of interest contained in the map area.
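For illustration, a minimal OSM file looks roughly like the fragment embedded below, and extracting the road names from it can be sketched as follows. The element and tag names (`way`, `highway`, `name`) follow the public OSM data model; the specific values are hypothetical examples, not data from the presentation:

```python
import xml.etree.ElementTree as ET

OSM_SAMPLE = """<osm version="0.6">
  <node id="1" lat="40.748" lon="-73.985">
    <tag k="name" v="Empire State Building"/>
    <tag k="tourism" v="attraction"/>
  </node>
  <way id="10">
    <nd ref="1"/>
    <tag k="highway" v="primary"/>
    <tag k="name" v="5th Avenue"/>
  </way>
</osm>"""

def road_names(osm_xml):
    # In OSM data, ways tagged with 'highway' are roads; collect their names.
    root = ET.fromstring(osm_xml)
    names = []
    for way in root.iter("way"):
        tags = {t.get("k"): t.get("v") for t in way.iter("tag")}
        if "highway" in tags and "name" in tags:
            names.append(tags["name"])
    return names
```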
When the user's finger is on a road, vibration feedback is provided, and audio cues from a sonification module inform the user of the distance to the next crossroad.
In addition, audio messages from the text-to-speech module announce the name of the current road.
The sonification module works as follows:
As the distance to the next crossroad changes, taking the user's direction of movement into account, the pitch of a single-frequency tone changes accordingly.
When the user's finger is on a crossroad, a musical chord is heard.
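One simple way to realise the pitch mapping described above is a linear interpolation from distance to tone frequency. The presentation does not state the app's actual frequency range or mapping, so the cutoff distance and frequencies here are assumptions chosen only to make the idea concrete:

```python
def crossroad_pitch(distance_m, max_distance_m=100.0,
                    low_hz=220.0, high_hz=880.0):
    # Map distance to the next crossroad onto a tone frequency:
    # far away -> low pitch, approaching the crossroad -> rising pitch.
    d = max(0.0, min(distance_m, max_distance_m))
    t = 1.0 - d / max_distance_m    # 0 at max distance, 1 at the crossroad
    return low_hz + t * (high_hz - low_hz)
```

With these assumed values, the tone rises from 220 Hz at 100 m (or more) to 880 Hz as the finger reaches the crossroad.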
Using the four buttons at the top-right corner of the screen, the user can pan the map left, right, up, or down.
The user can return to the main menu by double-tapping the touch screen.
The user may choose to use the phone's GPS in order to explore the map area around their current location.
Brief notes on how to use the application, as well as some general information about it, are also provided.
Thank you for watching this video presentation of the "Open Touch/Sound Maps" mobile application.