Latest AI-Powered Glasses Help Blind People Navigate the World With Ease


The newest AI glasses are made to use on the go, even in crowded or noisy areas.

Credit: Envision

Author’s note, to all who came for it: After this point, there will be no further Geordi La Forge references in this article. Thank you.

Ally Solos Glasses (said with a British accent: “Soh-loss”) are here, and they bring a big step up in the sophistication of the AI feature implementation—and they look nice, too!

These frames, the result of a collaboration between eyewear company Solos and vision assistance company Envision, are currently on a pre-order sale for $399 but will soon increase to their regular price of $699. This makes them significantly more expensive than previous models from Envision, but they’re also sleeker, longer-lasting, and more powerful.

Visual assistance is one of those technology applications that directly benefits only the relatively small proportion of people who need it, but which is nonetheless of intrinsic interest to everyone. This sort of fundamental increase in a person’s mobility and agency is what technology was always supposed to pursue, and of all the potential uses for new AI technologies, helping the sight-impaired seems like one of the most obvious.

Different AI tools can now parse an image of a page in a book into readable text, structure that text, and feed it to a voice synthesis model; for those unable to see the text themselves, the effect is like having a decently literate person standing by throughout the day. As AI has advanced, the breadth of navigable situations has increased while response times and overall sluggishness have dramatically decreased.
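The page-reading flow described above is essentially a three-stage pipeline: optical character recognition, text structuring, and speech synthesis. The sketch below illustrates that flow in Python with stubbed stages; the function names and the sample text are illustrative, not from Envision's software, and a real implementation would call an actual OCR engine (such as Tesseract) and a TTS engine in place of the stubs.

```python
# Hypothetical sketch of a camera-to-speech reading pipeline.
# Each stage is a stand-in for a real component.

def recognize_text(image_bytes: bytes) -> str:
    """OCR stage: turn a photo of a page into raw text (stubbed here)."""
    return "Chapter 1. The quick brown fox jumps over the lazy dog."

def structure_text(raw: str) -> list[str]:
    """Parsing stage: split raw OCR output into speakable sentences."""
    return [s.strip() for s in raw.split(".") if s.strip()]

def synthesize(sentences: list[str]) -> list[str]:
    """TTS stage: a real device would stream audio; here we just return
    the utterances that would be spoken, in order."""
    return [f"[speaking] {s}" for s in sentences]

def describe_page(image_bytes: bytes) -> list[str]:
    """Full pipeline: camera frame -> text -> structure -> speech."""
    return synthesize(structure_text(recognize_text(image_bytes)))

utterances = describe_page(b"...")  # placeholder camera frame
print(utterances[0])  # -> "[speaking] Chapter 1"
```

Keeping the stages separate like this is what lets newer models slot in over time: a better OCR or description model replaces one stage without changing the rest of the pipeline.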

Just as obvious is where the camera taking that picture should be: on the user’s face. Not only does head position naturally aim the camera, much as sighted people aim their gaze, but it also keeps the user’s hands free.


The product design on smart glasses has come a long way since Google’s failed Google Glass project.
Credit: Loïc Le Meur

This collaboration advances the technology a good step past its cheaper predecessor, the AirGo Vision. Beyond the aesthetic design, it integrates with the new Envision app for iOS and Android and essentially functions as just another input device for that app. All the features of the glasses can be used directly via the phone.

With the new setup, users can receive a wide variety of forms of AI assistance, from simple text reading to descriptions of objects and even events. Much of it involves building an app around questions that most people wouldn’t need to ask.

The app integrates with a dizzying array of AI services, including Llama, ChatGPT, Gemini, Perplexity, and others. If that seems like it would be expensive to run, then you shouldn’t be surprised at the existence of an Ally Pro subscription that removes time caps for calls and other restrictions. The core functionality for visual assistance, though, is available for free.

We may still be a ways off from the sophistication of the visor worn by you know who, but these are still exhilarating times for people with visual impairments.
