Apps are linking visually impaired people to sighted volunteers as assistive technology enters a new era of connectivity.
Be My Eyes has just over 35,000 visually impaired users registered for the app and more than half a million volunteers. Whenever a visually impaired user requests assistance, a sighted volunteer receives a notification and a video connection is established.
“Hello?” says an American woman, reminding me of Scarlett Johansson’s disembodied artificially intelligent character from the sci-fi film Her.
“Hey, er … can you give me a hand by reading the letter on the bus stop?” I ask.
“Sure … can you move your phone a bit more up, and to the left … Ya! It says … F.”
Result. I thank her, end the session, pull up Citymapper and navigate my way onto the 453 going to New Cross.
I have a little vision, but only enough to see movement. I am using Be My Eyes, an app that connects blind and visually impaired people to sighted volunteers via a remote video connection. Through the phone’s camera, the blind person can show the volunteer what they are looking at in the real world, allowing the volunteer to assist with any vision-related problem.
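For the curious, the basic flow can be sketched in a few lines of Python – a deliberately simplified, hypothetical illustration of the matching idea, not Be My Eyes’ actual code or infrastructure, with the names and acceptance behaviour invented for the example.

```python
# A simplified, hypothetical sketch of the volunteer-matching idea behind
# apps like Be My Eyes -- not the app's real code. A request is offered to
# volunteers until one answers, and a video session is opened between them.

import random
from dataclasses import dataclass


@dataclass
class Volunteer:
    name: str
    available: bool = True

    def answers_notification(self) -> bool:
        # Stand-in for a push notification the volunteer may or may not answer.
        return self.available and random.random() < 0.5


def request_assistance(volunteers: list[Volunteer]) -> str:
    """Offer the call to volunteers in random order until one picks up."""
    for volunteer in random.sample(volunteers, k=len(volunteers)):
        if volunteer.answers_notification():
            return f"Video session established with {volunteer.name}"
    return "No volunteer answered – please try again"


if __name__ == "__main__":
    pool = [Volunteer("Anna"), Volunteer("Ben", available=False), Volunteer("Chloe")]
    print(request_assistance(pool))
```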
I began to lose my sight in the summer of 2013 to a rare genetic mitochondrial disease called Leber’s hereditary optic neuropathy and was soon registered blind. I consequently found myself relying on an assortment of assistive technologies to do the simplest of tasks.
The app has clear practical benefits. Jose Ranola, a 55-year-old from the Philippines who works in construction and has retinitis pigmentosa, says: “I use it to help me identify medicine and read printed materials, and also to describe places and objects.” He adds: “All my experiences were good. The volunteers were very helpful.”
James Frank, a 49-year-old counselor in Minnesota, US, who has severely damaged optic nerves, is also a fan. “The response has been favorable and the volunteers are always polite,” he says. “The longest I have waited is maybe a minute.”
In the UK more than 2 million people have some form of sight loss, and an estimated 285 million people worldwide are blind or visually impaired. Technology has long played a role in improving their lives. In the mid-1970s Ray Kurzweil, a pioneer in optical character recognition (OCR) – software that can recognise printed text – founded Kurzweil Computer Products and developed the first omni-font OCR program, able to recognise almost any print style. He went on to build the Kurzweil Reading Machine, the first print-to-speech reading machine for the blind.
Now accessibility is entering a new boom, driven in part by smartphones and high-speed connectivity. Screen readers have developed to such an extent that braille is no longer necessarily taught to people who lose their sight later in life.
Companies are constantly finding new ways to improve accessibility, and Be My Eyes is not the only assistive technology company taking advantage of the real-time human element, building products around a dialogue with their users.
In May, the startup Aira, the first product out of AT&T’s Foundry for Connected Health, raised $12m in funding. Aira’s platform takes advantage of pre-existing smart glasses, such as Google Glass, and their mounted cameras. Where Be My Eyes and Aira differ is that Aira employs remote human agents through the gig economy and plans to add artificial intelligence assistance. It connects trained, paid, independent contractors with blind people to assist them with day-to-day tasks in real time. The glasses stream everything the user is looking at to an agent who, sitting in front of a dashboard, can help with everything from reading signs to shopping, navigation and the numerous other mundane tasks that sighted people take for granted. Through the glasses, the agent can talk to the user and give them detailed information about their surroundings. The hope is that, through machine learning, the agents will eventually be able to teach an AI to guide users through certain tasks. Aira has backing from venture capital firms including Jazz Venture Partners and Lux Capital, and is currently only available in the United States.
It’s not just in linking sighted people with visually impaired users that technology can help. The Sunu band, partially funded through Indiegogo, aims to improve people’s ability to perceive their surroundings. Sunu, a technology start-up based in Boston and Mexico, is creating a bracelet that uses ultrasonic sonar to detect the wearer’s surroundings and deliver haptic feedback whenever an obstacle comes into proximity. Ultrasonic waves emitted from the band’s transducer bounce off obstacles and are translated into vibrations that become more frequent the closer the user gets to the obstacle.
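In code, that distance-to-vibration mapping might look something like the sketch below – a hypothetical illustration rather than Sunu’s actual firmware, with the sensor range and pulse timings chosen purely as assumptions.

```python
# Hypothetical sketch of a sonar-to-haptics mapping, not Sunu's actual firmware.
# An echo time is converted to a distance, and the distance to a gap between
# vibration pulses: the closer the obstacle, the faster the buzzing.

SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20C
MAX_RANGE_M = 4.0           # assumed maximum detection range
MIN_INTERVAL_S = 0.05       # fastest pulsing when an obstacle is very close
MAX_INTERVAL_S = 1.0        # slowest pulsing at the edge of the range


def distance_from_echo(echo_seconds: float) -> float:
    """Convert a round-trip ultrasonic echo time into a distance in metres."""
    return (echo_seconds * SPEED_OF_SOUND_M_S) / 2.0


def pulse_interval(distance_m: float) -> float | None:
    """Return the gap between vibration pulses, or None if nothing is in range."""
    if distance_m >= MAX_RANGE_M:
        return None  # no obstacle detected, so no vibration
    fraction = distance_m / MAX_RANGE_M
    return MIN_INTERVAL_S + fraction * (MAX_INTERVAL_S - MIN_INTERVAL_S)


if __name__ == "__main__":
    for echo in (0.002, 0.010, 0.020):  # example echo times in seconds
        d = distance_from_echo(echo)
        gap = pulse_interval(d)
        print(f"{d:.2f} m away -> pulse every {gap:.2f} s" if gap else f"{d:.2f} m away -> silent")
```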
The next generation of technology could go even further to help blind people. Autonomous vehicles, if built with the kind of intuitive voice-enabled AI assistants, such as Amazon’s Alexa or Apple’s Siri, that are already helping in the home, would give blind people increased independence. It is just a matter of making these solutions integral to the vehicles’ design.
Smith tells me: “It just blows me away to the extent that gadgets have grown. I was so terrified when I got my first mobile phone, can’t even remember when it was, it was so long ago. Maybe 15 or 16 years. No speech though, had to use it by memory and hope for the best that you were turning it on and off correctly. And there was no way of texting. Then when Nokias came on the scene, then the iPhone, just unbelievable.” She adds: “It’s honestly fantastic some of the things that have been developed – although there is always room for improvement and advancement.”