by Melody Holloway
The age of AI may be the best era yet for enhancing independence, productivity, awareness, advocacy, and overall quality of life for the blind/visually impaired community. We have more options for receiving visual interpretation and information than ever before.
Mobile applications such as Be My Eyes and Artificial Intelligence Remote Assistance (AIRA) give users the option of both AI virtual assistants and human volunteers or trained agents to read, describe, take screenshots, and assist with nearly any task requiring eyesight. Apps such as VDScan, Envision, and Seeing AI, along with tabs within apps such as NFB-Newsline's KNFB Reader Basic, use smartphone cameras to provide many forms of visual interpretation, from color identification to reading documents, product labels, and currency to describing photos, environments, and faces. Still other apps interpret information for specific tasks: reading medication information, digital interface panels, food expiration dates, restaurant menus, and laundry care instructions; identifying cash; and detecting light.
Examples of these task-specific apps include Cash Reader, iNote, Vislens, Laundry Lens, Menus4All, ScripTalk, Light Detector, and Zuzanka.
WayAround is an application that lets us create our own labels and tag items and locations for quicker, more accurate, and more efficient identification tailored to individual lifestyles.
There are also global positioning system applications that assist with many aspects of wayfinding: finding directions and locations, identifying routes, detecting doors, recognizing pedestrian signals, and navigating indoors. These apps include GoodMaps, Outdoors Explore, BlindSquare, and Oko.
External visual interpretation devices are also available. Some are task-specific, such as the iBill currency identifier from the Bureau of Engraving and Printing and the ScripTalk prescription reader. Others, such as Stellar Trek, function as barcode readers and GPS units to aid us with indoor and outdoor travel.
Two cutting-edge, state-of-the-art technologies are testing the limits of providing accurate, efficient, clearly understood essential visual information to blind users: the new Biped AI, which identifies obstacles for blind pedestrians, and Glidance, whose built-in artificial intelligence robotics move over ground and floors to guide blind travelers to specific locations both indoors and out.
We have refreshable braille displays with QWERTY and six-dot braille keyboard layouts, which can be used stand-alone for reading books, word processing, editing files, reading graphs and spreadsheets, and notetaking. We can also pair these displays with mobile devices via Bluetooth or connect them to computers with USB cables for use with screen readers. Some displays have no speech output; others are speech-enabled notetakers with built-in calendars, calculators, clocks, and braille translation. Most braille displays support multiple languages.
Audio description is more widely available in theaters and across multiple platforms, including antenna television, cable providers, and streaming services, allowing blind viewers to enjoy movies, live entertainment, series, and even advertising. AD also provides accessible emergency alerts, public safety alerts, and weather warnings.
Accessibility and usability of durable medical equipment have also improved. From talking glucometers, blood pressure monitors, thermometers, and scales to continuous glucose monitors, heart monitors, and sleep equipment with accompanying mobile apps and Bluetooth capabilities, visually impaired people can, at times, manage our own health conditions and those of loved ones we care for independently, with minimal to no sighted assistance.
Many of these AI platforms and services are available only through monthly or annual subscription plans or a one-time purchase. Cost is one of the main barriers preventing access for those of us living on fixed or limited incomes.
Some of the mobile apps are free, as is the iBill, which people can receive with proof of visual impairment through a doctor or through active patron status with the National Library Service for the Blind and Print Disabled, along with the NLS eReader braille display.
Services such as Braille and Audio Reading Download (BARD) and Bookshare offer books and magazines in audio and braille formats to be read digitally with braille displays or listened to on mobile devices, on external players via thumb drives and SD cards, or, in Bookshare's case, using Echo smart speakers.
NFB-Newsline offers newspapers, magazines, TV listings, job postings, and local weather reports via an automated telephone system, on its website, through its mobile app, or on an eReader.
What do we collectively make of all this artificial-intelligence-enabled technology? AI technology enhances and increases independence and productivity, and it improves access to lifesaving instruction, healthcare, employment and education opportunities, daily living, parenting, travel, and emergency preparedness. But is the recent acceleration and integration of artificial intelligence and electronic access interfering with or minimizing certain aspects of independent living?
Is all of our personal information protected and kept confidential? If apps and devices identify our location, destination, and colors; read personal documents such as mail and financial, educational, legal, and medical paperwork; identify cash and medications; and describe pictures, faces, and scenes, can sensitive private information fall into the wrong hands? Can our daily routines be predicted and tracked?
Could Glide in particular replace white canes and guide dogs? Could app-based pedestrian signal recognition and GPS replace accessible pedestrian signals and orientation and mobility training, causing us to lose previously learned personal skills and to stop learning new methods of performing independent tasks?
Could these technologies decrease human engagement/in-person interaction, contributing to the global pandemic of loneliness and social isolation?
Could orientation and mobility instruction, certified vision rehabilitation therapy, independent living training, and guide dog instructor positions be in jeopardy?
Many of these technologies benefit people with other disabilities and medical conditions in addition to, or apart from, blindness. People with autoimmune, neurological, hearing, processing, mobility, dexterity, respiratory, sleep, food intake, executive function, speech, learning, sensory, and emotional expression impairments experience improvements in daily functioning, engagement, performance, independence, task completion, and self-confidence.
While these considerations may seem complicated, complications are not necessarily negative, as they are often viewed. When investing time, money, effort, awareness, and training to integrate AI and electronics into our lives, we must consider all positions and outcomes in order to become more self-sufficient and informed while remaining safe, stable, and connected with and supportive of our family, close friends, neighbors, employers, co-workers, and peers.
I hope that this food for thought sparks constructive discussion and adds to our ability to advocate, make informed decisions, and improve both access to and the quality of services.