Helen Keller, perhaps the most famous activist for the visually impaired, once said, “It is for us to pray not for tasks equal to our powers, but for powers equal to our tasks, to go forward with a great desire forever beating at the door of our hearts as we travel toward our distant goal.”
Empowerment of the visually impaired took another step forward this month with the presentation of Navatar, an indoor navigation system. Navatar, developed by a University of Nevada, Reno computer science and engineering team, improves on existing systems because it relies primarily on standard smartphone technology rather than on bulky, less practical sensors.
“Existing indoor navigation systems typically require the use of expensive and heavy sensors, or equipping rooms and hallways with radio-frequency tags that can be detected by a handheld reader and which are used to determine the user’s location,” said Kostas Bekris, of the UNR College of Engineering’s Robotics Research Lab. “This has often made the implementation of such systems prohibitively expensive, with few systems having been deployed.”
Working in conjunction with widely available two-dimensional digital architectural maps, the smartphone-based Navatar relies on the device’s accelerometer and compass to navigate its user. The system can guide people with visual impairments down hallways and into rooms through audible instructions similar to those given by GPS devices made for cars.
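The article does not describe the sensor processing in detail, but the general idea of pedestrian dead reckoning can be sketched briefly. The function names, threshold, and stride length below are hypothetical, chosen only to illustrate how step detection from the accelerometer and a compass heading might advance a position estimate on a 2D floor plan:

```python
import math

STEP_LENGTH_M = 0.7     # assumed average stride length (hypothetical)
STEP_THRESHOLD = 11.0   # accel magnitude (m/s^2) counted as a step peak (hypothetical)

def detect_step(accel_sample):
    """Crude step detector: flag samples whose acceleration magnitude
    rises a little above gravity."""
    ax, ay, az = accel_sample
    return math.sqrt(ax * ax + ay * ay + az * az) > STEP_THRESHOLD

def dead_reckon(position, heading_deg, accel_samples):
    """Advance an (x, y) position on a 2D floor plan by one stride
    for every detected step, along the compass heading."""
    x, y = position
    heading = math.radians(heading_deg)
    for sample in accel_samples:
        if detect_step(sample):
            x += STEP_LENGTH_M * math.cos(heading)
            y += STEP_LENGTH_M * math.sin(heading)
    return (x, y)
```

As the next paragraph notes, estimates produced this way drift, which is why the system also asks the user to confirm landmarks.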
“Nevertheless, the smartphone’s sensors, which are used to calculate how many steps the user has executed and her orientation, tend to pick up false signals,” said Eelke Folmer, who worked on the project. “To synchronize the location, our system combines probabilistic algorithms and the natural capabilities of people with visual impairments to detect landmarks in their environment through touch, such as corridor intersections, doors, stairs and elevators.”
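Folmer does not name the specific algorithm, but a common way to fuse noisy step counts with confirmed landmarks is a particle filter. The following is a minimal sketch under that assumption, with hypothetical names and parameters, showing how landmark confirmations can correct a drifting dead-reckoning estimate:

```python
import math
import random

def predict(particles, step_length, heading_rad, noise=0.3):
    """Motion update: advance every particle one noisy stride along the
    heading estimated from the compass."""
    return [(x + random.gauss(step_length, noise) * math.cos(heading_rad),
             y + random.gauss(step_length, noise) * math.sin(heading_rad))
            for x, y in particles]

def confirm_landmark(particles, landmark_xy, radius=2.0):
    """Measurement update: when the user confirms a landmark (a door,
    corridor intersection, or staircase), keep only particles near it
    and resample the rest from the survivors."""
    lx, ly = landmark_xy
    near = [(x, y) for x, y in particles
            if (x - lx) ** 2 + (y - ly) ** 2 <= radius ** 2]
    if not near:                 # no particle agrees: snap to the landmark
        near = [(lx, ly)]
    return [random.choice(near) for _ in particles]

def estimate(particles):
    """Report the mean of the particle cloud as the user's location."""
    xs, ys = zip(*particles)
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```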
Folmer explained that Navatar ‘listens’ for voice prompts or a button push on a Bluetooth-enabled device from the user to confirm the presence of these landmarks. This means the system can assist the user within their typical navigation routine, including the use of a cane.
On his website, Folmer noted that the system has a “high possibility of large-scale deployment” because it requires only a simple digital representation of an indoor environment, which can be sketched with basic drawing programs and downloaded from a building’s Wi-Fi network. The UNR team also performed a study involving 12 blindfolded and six blind users to demonstrate the feasibility of their system.
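The article does not specify what such a representation looks like; as a hypothetical illustration only, a floor plan annotated with the tactile landmarks the system asks users to confirm could be as simple as:

```python
# Hypothetical minimal floor-plan description: the tactile landmarks
# (doors, corridor intersections, stairs) a user would be asked to confirm,
# with their coordinates on the 2D architectural map.
FLOOR_PLAN = {
    "building": "Example Hall",   # placeholder name, for illustration only
    "floor": 2,
    "landmarks": [
        {"id": "door_201",    "type": "door",         "xy": (4.0, 12.5)},
        {"id": "corridor_T",  "type": "intersection", "xy": (10.0, 12.5)},
        {"id": "stairs_east", "type": "stairs",       "xy": (22.0, 12.5)},
    ],
}
```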
While the system was able to track users to within 1.85 meters of their actual location, the researchers identified several areas for improvement. Based on feedback from test subjects, the team’s report said improving Navatar’s accuracy, enabling it to repeat directions, and making it work from within a pocket are all improvements they are considering.
For their work on Navatar, Bekris and Folmer recently won a PETA Proggy Award for Leadership in Ethical Science. PETA recognized the system as an animal-friendly achievement because of its potential to decrease the reliance on guide dogs for the visually impaired.