When was the last time you used GPS services to navigate to a location? Probably just a few hours ago. From finding the fastest route to a destination to navigating public transportation options, we often rely on this type of service. Now imagine needing to find a bus stop using a GPS service alone, without the ability to see its exact location even when standing nearby. For people who are blind or have low vision, that scenario is a reality: geolocation data is often not accurate enough for the final approach, a shortfall commonly referred to as the “last 30 feet problem.” In the bus stop scenario, even that small gap in navigation can cause someone to miss their bus entirely, as the driver may not realize they intend to board.
A team at Schepens Eye Research Institute of Mass Eye and Ear, led by Associate Professor Gang Luo, has been focusing on vision assistive technology for over a decade, running research studies on technology development, intervention, evaluation, and human factors in mobility for people who are blind or low vision. While transit agencies have a mandate to improve accessibility to public transportation under the Americans with Disabilities Act, opportunities exist to improve existing technologies and further remove barriers. Developing a cost-effective tool was paramount for the team in their aim to make bus stops more accessible and easily identifiable to all. To that end, they have developed and released a free app called All Aboard, which currently supports 10 bus transit systems across the US, Canada, the UK, and Germany. Their project has been awarded a Microsoft AI for Accessibility grant.
To use the app, a user holds their mobile phone in an upright orientation near the stop. The app plays a sonar-like sound to indicate it is searching for the bus stop sign, followed by a beeping sound once the sign is identified. The beeping varies in pitch to roughly convey distance, as demonstrated in this video tutorial.
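The article doesn't specify how the app maps distance to pitch, but a distance-to-pitch sonification of this kind can be sketched with a simple linear mapping. The frequency range and maximum detection distance below are illustrative assumptions, not values from All Aboard:

```python
def beep_pitch_hz(distance_m: float,
                  min_hz: float = 400.0,
                  max_hz: float = 1600.0,
                  max_range_m: float = 10.0) -> float:
    """Map an estimated distance to a beep pitch: closer sign -> higher pitch.

    All parameter values are hypothetical; the real app only needs a
    monotonic mapping so users can judge proximity by ear.
    """
    # Clamp the estimate to the supported range.
    d = max(0.0, min(distance_m, max_range_m))
    # 1.0 when the user is at the sign, 0.0 at the edge of detection range.
    closeness = 1.0 - d / max_range_m
    return min_hz + closeness * (max_hz - min_hz)


print(beep_pitch_hz(10.0))  # far away -> lowest pitch, 400.0 Hz
print(beep_pitch_hz(5.0))   # halfway  -> 1000.0 Hz
print(beep_pitch_hz(0.0))   # at the sign -> highest pitch, 1600.0 Hz
```

In practice an app might quantize this into a few discrete pitch levels, which matches the "different levels of pitch" described above.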
To ensure the solution centered the needs of those it was developed for, the team complemented its existing expertise in computer vision and programming by running focus groups with members of the disability community. Micro-navigation was identified as a key challenge, not only for bus stops but also for other places such as subways, stores, and banks. Participants also wished for the micro-navigation functions to be integrated with other services, such as travel route planning, bus arrival announcements, and indoor navigation. While All Aboard does not have that particular capability yet, the team at Schepens Eye Research Institute is searching for industry partners interested in furthering the use of their technology.
The All Aboard app uses deep neural networks to recognize bus stop signs, with the assumption that the user knows which bus route they wish to take and is in proximity to the bus stop. By using object recognition, it can correctly identify bus signs that share a common design within a given transit system, while ignoring the exact route number on the signs. For each transit system, around 5,000 to 10,000 bus stop sign images were collected, labelled, and used to train the neural network to automatically learn the features of the signage patterns. Consequently, the neural network can differentiate bus stop signs from other objects and other types of road signs in images. To run in real time on a mobile device with limited computational power, the recognition model was built as a lightweight neural network.
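The article doesn't say how the network was made lightweight. One standard technique in mobile-oriented architectures (e.g. the MobileNet family) is replacing standard convolutions with depthwise separable ones; the sketch below, which is an illustration rather than a description of All Aboard's actual model, counts the weights to show why this shrinks a network:

```python
def conv_weights(k: int, c_in: int, c_out: int) -> int:
    """Weights in a standard k x k convolution (biases omitted).

    Every output channel mixes all input channels through a k x k filter.
    """
    return k * k * c_in * c_out


def depthwise_separable_weights(k: int, c_in: int, c_out: int) -> int:
    """Weights in a depthwise separable convolution (biases omitted).

    One k x k filter per input channel (depthwise), then a 1 x 1
    pointwise convolution to mix channels.
    """
    return k * k * c_in + c_in * c_out


# Hypothetical mid-network layer: 3x3 kernel, 128 channels in and out.
standard = conv_weights(3, 128, 128)                 # 147,456 weights
lightweight = depthwise_separable_weights(3, 128, 128)  # 17,536 weights
print(f"reduction: {standard / lightweight:.1f}x")
```

Applied across every layer, a reduction of roughly this size is what lets a recognition model process camera frames in real time on a phone.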