
Blind reading – me : you

a concept that helps blind people recognize familiar persons on the street

The concept "me : you" helps blind and visually impaired people make contact with familiar persons on the street on their own. It consists of three components: a tactile bracelet, a camera, and a smartphone app.

To recognize people, the system first identifies and locates them; the tactile bracelet then guides the user toward them. At short range, the camera and biometric facial-recognition software let the user haptically perceive the other person's facial expression and adjust their own reactions accordingly. This gives them far greater independence in communication. The output of vibration, pressure, and a tactile display makes the system a new experience for the sense of touch and enables a new way of interacting without using the eyes.
The app "meet : you" works like a social network used to find friends nearby. It offers several features, including individually configurable profiles with user data and pictures, as well as GPS tracking.

For this project I created personas, paper prototypes, a Flash click-dummy, and models of the physical components.

2013 | Magdeburg-Stendal University of Applied Sciences


Adobe CS5, Rhinoceros


Interaction Design · Product Design

Concept

The process that takes place when a person is recognized was analyzed in detail.
Sighted people use their eyes to perceive all information in parallel. If technology is to replace sight, that information must be presented to the blind person sequentially, one piece at a time.

A camera can recognize faces at a distance of up to 12 meters. Taking into account an average walking speed of 1.38–1.49 m/s, only 6–8 seconds remain to convey information about identity, mood, and position. With GPS, the range can be extended to 500 m. Unlike at short range, this leaves more time for guidance, but no facial expressions can be captured.
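The time window above can be checked with a quick calculation. This is a minimal sketch: the 12 m camera range and the walking speeds come from the concept, while the assumption that only one party is moving is mine.

```python
# Rough time budget for conveying identity, mood and position
# before a recognized person reaches the user.
CAMERA_RANGE_M = 12.0            # maximum face-recognition distance
WALKING_SPEEDS_MS = (1.38, 1.49)  # average walking speed range

for speed in WALKING_SPEEDS_MS:
    window = CAMERA_RANGE_M / speed
    print(f"at {speed} m/s: {window:.1f} s")

# If both people walk toward each other, the closing speed doubles and
# the window roughly halves; detection and processing overhead shrink
# the usable budget further, toward the 6-8 seconds cited above.
```

Under these assumptions the raw window is about 8.1–8.7 seconds for a single walker; the concept's tighter 6–8 second budget presumably accounts for such overheads.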

If the distance is very short, the system immediately starts output for the closest friend: the camera announces the name acoustically while the device on the arm outputs the recognized facial expression. Navigation then starts with directional vibrations.

At a greater distance, there is more time for the user to find out independently whom the system has found and where that person is. For this, the device on the arm has a tactile display showing the positions of the located persons. By touching, the user learns their names; by pressing, they select a person to navigate to. Navigation works the same as at 12 meters, except that here the display can also be used to show the direction.
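The two interaction modes described above can be summarized in a short Python sketch. This is an illustration only: the function names, thresholds, and data shapes are my own; the behavior follows the description in the text.

```python
CAMERA_RANGE_M = 12.0  # short range: camera + facial recognition
GPS_RANGE_M = 500.0    # long range: GPS only, no facial expressions

def handle_recognition(friends, select_on_display):
    """Dispatch output based on the distance of the nearest friend.

    friends: list of (name, distance_m, expression) tuples.
    select_on_display: callback that lets the user pick a person
    on the tactile display (touch reads names, press selects).
    """
    name, distance, expression = min(friends, key=lambda f: f[1])

    if distance <= CAMERA_RANGE_M:
        # Short range: output starts immediately with the closest friend.
        return [
            f"announce name acoustically: {name}",
            f"output facial expression haptically: {expression}",
            "start navigation with directional vibrations",
        ]
    elif distance <= GPS_RANGE_M:
        # Long range: the user first explores positions on the display.
        target = select_on_display(friends)
        return [
            "show positions of located persons on tactile display",
            f"navigate to {target} with vibrations and display direction",
        ]
    return ["no friends in range"]
```

For example, `handle_recognition([("Anna", 8.0, "smiling")], lambda fs: fs[0][0])` takes the short-range branch and returns the acoustic name announcement, the haptic expression output, and the start of vibration navigation, in that order.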