Will Blind People Find the Apple Vision Pro Useful? Perhaps!

Vision Pro UI mockup showing a hand pointing at UI elements instead of using eye tracking.

Image via Apple’s developer website.

Sometimes when I’m writing alt text for my photos, I think, “Is this really necessary? Is this important or decorative? Will someone who can’t see care that the windows in the Apple Vision Pro are overlaid on a view of a scenic lake? Will they care about the Vision Pro?” But then I got to thinking… the Apple Vision Pro looks a lot like an idea that has been rumbling around in my head for some time: a way for blind people to feel more independent thanks to technology.

Oh, and to answer the question, I almost always choose to write the alt text. Blind and vision impaired people deserve a descriptive and wonderful web too, you know!

There’s an app you might want to check out, called “Be My Eyes.” It lets a blind person ask a sighted person for assistance through their smartphone. They can turn on the camera and ask questions about their surroundings: everything from the price of eggs to the cash in their wallet. Apple has introduced text readouts for smartphones, and that’s certainly helpful, but you still have to know where the text is, point your phone right at it, wait for it to recognize that there’s text in the view, and listen to the readout. It also can’t easily tell you what you’re reaching for in a store. But what if you were wearing a device that could read things from your surroundings in real time? That’s what the Vision Pro could be.

The Vision Pro won’t be perfect out of the box for the blind, but with a few adjustments, and maybe a third-party app or two, it could be a huge step forward in accessibility equipment.

Disclaimer

Now, first, a disclaimer. I do not need accessibility accommodations for vision. Accessibility is something I’ve thought about a lot as an app developer and someone who writes content for the web, but that’s it. My perspective won’t carry much weight. However, I hope to inspire Apple and other app developers to use the capabilities of the Apple Vision Pro to make something that’s both accessible to vision-impaired users and genuinely helpful as an accessibility device in its own right.

AR Everywhere

The ability to get your iPhone to read out text is certainly helpful. However, the Vision Pro’s wide field of view could make this far easier to use. There’d be no need to carefully frame items within the iPhone’s camera; the headset takes in a much larger slice of your surroundings.

An app, or even the built-in camera app, could read text aloud once it has been in view for a set period of time. It could even use AI and machine learning to identify items placed in front of you. Photo recognition is improving, and something that could read out your surroundings at all times could be incredibly helpful. It could tell you when a “Don’t Walk” sign turns to a “Walk” sign at an intersection, or warn you when you’re walking toward an object at head height that isn’t anchored to the ground where your stick could find it.
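As a rough sketch of how this might work, Apple’s Vision framework already does on-device text recognition today; a hypothetical app (assuming Apple, or a future API, exposes the headset’s camera feed, which visionOS does not currently do for third parties) could pipe camera frames through it and speak the results:

```swift
import Vision
import AVFoundation

// Hypothetical sketch: recognize text in a camera frame and read it aloud.
// `pixelBuffer` is assumed to come from a camera feed — an assumption, since
// visionOS does not expose the headset's cameras to third-party apps today.
let synthesizer = AVSpeechSynthesizer()

func readTextAloud(from pixelBuffer: CVPixelBuffer) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        for observation in observations {
            // Take the highest-confidence transcription for each text region.
            if let best = observation.topCandidates(1).first {
                synthesizer.speak(AVSpeechUtterance(string: best.string))
            }
        }
    }
    request.recognitionLevel = .accurate  // favor accuracy over speed

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```

The building blocks are shipping APIs; the missing piece is continuous, system-level access to what the headset sees.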

A stick has been a tool of the blind for eons, and for good reason. Few tools can convey both your surroundings and the texture and contents of the ground ahead of you like a stick. Readouts from a headset are unlikely to replace that. However, the Apple Vision Pro could augment it with object recognition, telling you what your stick is near or what you’re pointing at. It could inform you of intersections, of people approaching you; maybe it could even warn you before you swipe your stick through a pile of dog poo. The addition of augmented reality and readouts from VoiceOver could work with existing navigational and tactile aids to present more information to blind people in real time than ever before.

Navigating the UI

Vision Pro UI navigation is based primarily on eye tracking, responding to what you’re looking at. Obviously this doesn’t work for those with vision impairments. However, Apple already supports a number of pointing devices across its platforms. This could be something like a small handheld trackball mouse or a mouse that uses a joystick. You could perhaps even use a Nintendo Switch controller. With audio feedback, or even tactile feedback in the remote, someone could translate the spatial position of a joystick to the 3D environment the Vision Pro creates. Apple may also adapt gestures already present in VoiceOver, like swiping, to create a mobile-like experience, with users selecting items with swipes, in the order they appear, and hearing the readout.

Hand gestures could be useful for telling a user what’s around them, as well as for navigating menus and apps in the Vision Pro’s user interface. Apple has already built in a “Pointer Control” feature which allows you to use a hand or finger as your pointing device. This could let blind users “feel” their way around an interface, with audio feedback for both on-screen items and items in the real world. Still, not everyone will want to point everywhere, so a pointing device may be preferable. Whether the UI is floating in space or on a screen, existing pointing technologies, along with Apple’s new advancements to their accessibility framework, will make features accessible, as long as app developers remember to design their apps with accessibility in mind.
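None of that works unless apps actually label their interfaces. As a minimal sketch (the view and its labels here are illustrative, not from any real app), SwiftUI’s existing accessibility modifiers carry over to visionOS, so a properly labeled control can be announced by VoiceOver no matter how the user points at it:

```swift
import SwiftUI

// Minimal sketch: labeling visionOS UI elements so VoiceOver (and any
// pointer- or gesture-based navigation) can announce them by name.
// The icon-only buttons would otherwise be silent to a screen reader.
struct PlaybackControls: View {
    var body: some View {
        HStack {
            Button(action: { /* rewind */ }) {
                Image(systemName: "backward.fill")
            }
            .accessibilityLabel("Rewind")

            Button(action: { /* play */ }) {
                Image(systemName: "play.fill")
            }
            .accessibilityLabel("Play")
        }
    }
}
```

The same one-line modifiers developers already use on iOS are what will make, or break, the Vision Pro as an accessibility device.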

Endless Possibilities

The truth is, as technology gets more intelligent and wearable, the possibilities to augment everyone’s life are endless. We often think of navigation overlays, perhaps showing Yelp reviews and menus for restaurants as you look at a storefront, but AR can be so much more. It can read out signs for the blind in real time, identify objects in the refrigerated section to help a blind person find their favorite kombucha, read a menu in a restaurant, and act like an always-on assistant, a hands-free version of Be My Eyes. Hell, even the Be My Eyes app itself could use the Vision Pro headset for easy, hands-free assistance, rather than requiring users to hold a device with one hand and find items with the other.

Augmented reality in a headset that can do object and text recognition could revolutionize accessibility for the vision impaired. It’s up to all of us to ensure the world is accessible for everyone, and new tools like the Vision Pro could be the next giant leap forward in accessible technology. We just have to build it.
