The new app that serves as eyes for the blind...
Extraordinary use of AI in 2015:
The goal is as audacious as it is inspiring: building a technological platform that helps the visually impaired navigate the world around them as effortlessly as everyone else. The first pilot in the program is a smartphone app for iOS and Android called NavCog, which guides blind users through their surroundings either by whispering into their ears through earbuds or by creating subtle vibrations on their smartphones. (Users can set the app to either "voice mode" or "vibration mode.")
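The post doesn't describe NavCog's internals, but conceptually the feedback layer might look something like the sketch below, where FeedbackMode, deliver_cue, and the pulse-count mapping are all hypothetical stand-ins for the app's real text-to-speech and haptics calls:

```python
from enum import Enum

class FeedbackMode(Enum):
    VOICE = "voice"
    VIBRATION = "vibration"

def deliver_cue(instruction: str, mode: FeedbackMode) -> None:
    """Route a navigation cue to the user's chosen feedback channel.

    The [speak] / [vibrate] prints stand in for the platform's
    text-to-speech and haptics APIs, which this post does not describe.
    """
    if mode is FeedbackMode.VOICE:
        print(f"[speak] {instruction}")        # e.g. a text-to-speech call on the phone
    else:
        # Assumed convention: one pulse for a left turn, two for anything else.
        pulses = 1 if "left" in instruction.lower() else 2
        print(f"[vibrate] {pulses} pulse(s)")

deliver_cue("Turn left in 10 meters", FeedbackMode.VOICE)
deliver_cue("Turn right in 5 meters", FeedbackMode.VIBRATION)
```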
Much like the turn-by-turn directions offered by car GPS systems, the app provides its own turn-by-turn directions for the visually impaired. It analyzes signals from Bluetooth beacons placed along walkways, together with the smartphone's own sensors, to let users move without human assistance, whether inside campus buildings or outdoors.
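How the beacon readings become a position isn't spelled out here; one common approach for beacon-based indoor positioning is a weighted-centroid estimate over the strongest signals, roughly like this sketch (the beacon IDs, coordinates, and RSSI-to-weight conversion are illustrative assumptions, and a real system would also smooth readings over time and fuse in the phone's motion sensors):

```python
from dataclasses import dataclass

@dataclass
class Beacon:
    beacon_id: str
    x: float      # known beacon position on the mapped route (meters)
    y: float
    rssi: float   # measured signal strength in dBm (closer to 0 = nearer)

def estimate_position(scanned: list[Beacon]) -> tuple[float, float]:
    """Weighted-centroid position estimate from nearby beacon readings.

    Stronger (less negative) RSSI readings pull the estimate toward that
    beacon; this shows only the core idea, not a production algorithm.
    """
    # Convert RSSI (dBm) to a rough linear weight: -55 dBm outweighs -90 dBm.
    weights = [10 ** (b.rssi / 20.0) for b in scanned]
    total = sum(weights)
    x = sum(w * b.x for w, b in zip(weights, scanned)) / total
    y = sum(w * b.y for w, b in zip(weights, scanned)) / total
    return x, y

readings = [
    Beacon("hall-01", x=0.0,  y=0.0, rssi=-55.0),
    Beacon("hall-02", x=10.0, y=0.0, rssi=-75.0),
    Beacon("hall-03", x=20.0, y=0.0, rssi=-90.0),
]
print(estimate_position(readings))  # lands close to hall-01, the strongest beacon
```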
The magic happens when algorithms help blind users identify, in near real time, where they are, which direction they are facing, and what else is in the surrounding environment. A companion computer-vision navigation tool turns smartphone images of the surroundings into a 3-D model of the space, which is then used to issue turn-by-turn guidance.
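Once the system has a position and a facing direction, turning that pose into a spoken cue is mostly geometry. The sketch below shows one plausible way to do it; the thresholds, wording, and coordinate conventions are assumptions, not NavCog's actual logic:

```python
import math

def turn_instruction(position: tuple[float, float],
                     heading_deg: float,
                     waypoint: tuple[float, float]) -> str:
    """Turn a pose estimate (position + facing direction) into a spoken cue.

    heading_deg is the direction the user is facing, in degrees clockwise
    from the map's "north". The waypoint is the next point on the planned
    route. Thresholds and wording here are purely illustrative.
    """
    dx = waypoint[0] - position[0]
    dy = waypoint[1] - position[1]
    distance = math.hypot(dx, dy)

    bearing = math.degrees(math.atan2(dx, dy)) % 360       # bearing to waypoint
    relative = (bearing - heading_deg + 540) % 360 - 180    # -180..180, positive = right

    if abs(relative) < 20:
        return f"Continue straight for {distance:.0f} meters"
    side = "right" if relative > 0 else "left"
    return f"Turn {side} and walk {distance:.0f} meters"

# Facing east (90 degrees) with the next waypoint due north of the user:
print(turn_instruction(position=(0.0, 0.0), heading_deg=90.0, waypoint=(0.0, 8.0)))
```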
The NavCog project, a collaboration between IBM Research in Yorktown Heights, N.Y., and Carnegie Mellon, has particular meaning for one of its lead researchers, IBM Fellow and visiting CMU faculty member Chieko Asakawa, who is visually impaired herself. Soon she will be able to walk across the CMU campus with the help of the NavCog app, looking just like any other person crossing campus while listening to a smartphone through white earbuds.