AI for accessibility: bringing Ally to Meta Ray-Ban

Client: Envision AI
Type:
In-house full-time role
Role:
Product Designer (UX/UI)
Year:
2025/26

Ally (30k+ downloads) is an accessible AI assistant that helps people from the blind and low-vision community read texts, describe images, check their calendar, and navigate daily life more independently through voice. As a Product Designer at Envision AI, I designed the integration of Ally with Meta Ray-Ban glasses (now in private beta with 10k testers), opening a new customer acquisition channel and reinforcing Envision's mission of making AI assistance available across hardware. I also strengthened the design system by auditing components for inconsistencies and improving color contrast to reach 100% WCAG compliance.

The product: Ally's ecosystem

For blind and low-vision people, tasks like reading a label or checking the arrival board at the station require extra effort or help from others. Ally removes that friction, putting an AI assistant in their hands for exactly these kinds of tasks. The app supports hardware integrations with the Ally Solos Glasses (the newest Ally-native smart glasses model) and the Envision Glasses (an older model, born as a standalone device), allowing users to interact with it hands-free. The new Meta Ray-Ban integration meets users on hardware they already love, opening a significant new customer acquisition channel.

Screenshots of the Ally app showing four key screens: the home screen with connected Ally Solos Glasses and quick-action buttons for calling and messaging; the glasses settings screen showing connection status and options to practice onboarding, access advanced features, or disconnect; a chat conversation where the user asks Ally to describe what's in front of them and receives a detailed scene description; and the voice call screen with a "Hold to talk" button and audio waveform visualization.

Screens of the Ally app when I first arrived at Envision

A smiling woman wearing smart glasses and a dark puffer jacket holds a white cane while standing outdoors in front of a bus or tram, on an overcast day.

User wearing the Envision Glasses

A smiling man wearing black-framed smart glasses and a gray polo shirt stands in front of Rotterdam Centraal train station on a sunny day, holding a cane in his right hand.

User wearing the Ally Solos glasses

The challenge

Design an Ally integration for the widely used Meta Ray-Ban glasses, ensuring the experience is consistent with the app design and meets the high accessibility standards a blind and low-vision audience depends on.

Meta Ray-Ban: potential and early tensions

Potential
The Meta Ray-Ban glasses are already part of daily life for many people in the blind and low-vision community, making their integration with Ally a natural next step to reach new users on hardware they already trust.

Early tensions
Unlike the existing Solos and Envision Glasses integrations, connecting Meta Ray-Ban requires users to navigate through the Meta AI app multiple times during pairing, which can create unintentional error states and confusing transitions. Since the feature had been technically built prior to design intervention, I mapped the existing flow, identifying UX inconsistencies and edge cases.

A user flow diagram mapping the buggy onboarding experience for connecting Meta Ray-Ban glasses to Ally. Annotated screenshots trace the journey from the connection screen through Meta AI app redirects, unclear error messages, permission prompts, and unexpected disconnections, highlighting pain points including an "unverified app" warning, absent success confirmation, unclear errors, and sudden redirects between apps.

Overview of the flow before design intervention

Pairing and tutorial: every word is a design decision

Connecting the Meta Ray-Ban glasses to Ally requires navigating through the Meta AI app multiple times, an interaction pattern that, without context, can be very disorienting. Given that most of Ally's audience navigates the app through a screen reader, the challenge was to create clear instructions that make the flow feel intentional without relying on images. To do so, I used copy proactively, carefully balancing what screens display visually and what the screen reader reads aloud, refining it word by word based on beta tester feedback.

Redesigned onboarding flow for Meta Ray-Ban glasses showing four improved screens: a splash screen that proactively prevents errors by listing app requirements before connection; a "Before connecting" screen that breaks lengthy instructions into key tasks with supporting screenshots; a success confirmation screen with clear feedback and options to start a tutorial or call Ally; and a step-by-step tutorial screen designed from beta tester feedback.

Settings: more cohesive hardware connection screens

The Meta Ray-Ban glasses offer different options from the Solos, and the existing settings structure didn't accommodate them cleanly. Rather than forcing the new options into an existing pattern, I redesigned the settings page from scratch. The result was better aligned with the app's design system and faster to navigate, so I applied it back to the Solos as well, mindful of the differences between the two.

A before-and-after comparison of the glasses settings screen. The "before" Ally Solos screens show a minimal settings page with only three large colored buttons and a separate advanced features page. The Meta Ray-Ban reference screen and redesigned "after" Ally Solos screen show a comprehensive settings page with device status, connection info, battery life, resolution options, toggles, tap sensitivity, and action buttons in a unified layout.

Call screens: from a new feature to improved components

Three states of the video stream screen during a call with Ally, each showing a live camera view of a Rotterdam waterfront. The left screen shows the idle state with no border; the center screen shows a blue border around the video when Ally is speaking; the right screen shows a golden/amber border when the human user is speaking.
Live video stream
The Ray-Ban integration introduced a capability Ally didn't have before: a live video stream during calls. I designed a new call screen that fits the video almost full screen while still ensuring accessibility and communicating clearly when Ally is speaking or the microphone is recording.
Talking mode button
The talking mode button had been designed to accommodate a wider list of options that later proved unnecessary. The new button communicates its function more clearly and allows switching modes in one tap. Keeping in mind the needs of completely blind users, the screen reader announces different copy from what the button displays, allowing clear navigation for every user (see the sketch after the comparison below).
Before-and-after comparison of the talking mode switcher. The "before" screen requires two taps — opening a separate menu to switch between continuous and hold-to-talk modes. The "after" design replaces this with a single "Talking mode" button in the top bar that toggles between modes in one tap, with the screen reader label reading "Talking mode, button, switch to hold-to-talk."
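A minimal sketch of how that split between visible copy and spoken copy might look in SwiftUI, assuming an iOS implementation; the view and binding names are illustrative, not Ally's actual code:

```swift
import SwiftUI

// Hypothetical talking-mode button: the on-screen text stays short,
// while VoiceOver announces the action a tap will perform.
struct TalkingModeButton: View {
    @Binding var isHoldToTalk: Bool

    var body: some View {
        Button("Talking mode") {
            isHoldToTalk.toggle()
        }
        // The screen reader copy differs from the visible label:
        // label + "button" trait + hint reads roughly as
        // "Talking mode, button, switch to hold-to-talk".
        .accessibilityLabel("Talking mode")
        .accessibilityHint(isHoldToTalk
            ? "Switch to continuous"
            : "Switch to hold-to-talk")
    }
}
```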
Before-and-after comparison of the voice call screen. The "before" state shows two screens using different colored glasses icons (orange and green) to indicate connection states. The "after" redesign shows a single screen where the icon only appears when glasses are connected.
Connected glasses badge
Previously, the badge was always visible and changed color when the glasses disconnected. This color-only signal is ambiguous for colorblind users, and it creates visual clutter when multiple glasses are paired, since each pair shows its own badge simultaneously. Because users only ever wear one type of glasses at a time, I simplified it: the badge appears only when glasses are connected, and disappears entirely when they are not. At most one badge is ever visible on screen, and no information is lost.

The work behind the work: system, accessibility, consistency

Designing a new feature in an existing product is an opportunity to look at the rest of the app. Alongside the Ray-Ban integration, I audited the design system for accessibility gaps and fixed them across the whole app, reaching 100% compliance with WCAG standards in both light and dark mode (the contrast math behind the audit is sketched after the comparison below).

Color contrast audit comparing the app's before and after states across light and dark modes. The "before" values show several failing contrast ratios — as low as 1.09:1 and 2.61:1. The "after" values show all elements meeting or approaching WCAG standards, with improved ratios such as 10.29:1 and 5.74:1.
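For reference, the ratios in the audit follow the WCAG 2.x contrast formula: each color's relative luminance is computed from its linearized sRGB channels, and the ratio compares the lighter and darker value. A minimal Swift sketch of that math, not Envision's actual tooling:

```swift
import Foundation

// Relative luminance per WCAG 2.x, from sRGB components in 0...255.
func relativeLuminance(r: Double, g: Double, b: Double) -> Double {
    func channel(_ c: Double) -> Double {
        let s = c / 255.0
        return s <= 0.03928 ? s / 12.92 : pow((s + 0.055) / 1.055, 2.4)
    }
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)
}

// Contrast ratio: (lighter + 0.05) / (darker + 0.05), from 1:1 to 21:1.
func contrastRatio(_ l1: Double, _ l2: Double) -> Double {
    let (lighter, darker) = (max(l1, l2), min(l1, l2))
    return (lighter + 0.05) / (darker + 0.05)
}

// Example: white text on a mid-gray background.
let white = relativeLuminance(r: 255, g: 255, b: 255)
let gray  = relativeLuminance(r: 117, g: 117, b: 117)
print(contrastRatio(white, gray)) // ≈ 4.6, passes WCAG AA for normal text (≥ 4.5:1)
```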

Additionally, save, delete, and unpair flows were inconsistent across the app and in some cases not designed at all. I redesigned and standardized these flows, applying a coherent pattern across the whole app (modeled as a simple state machine below).

Before-and-after comparison of deletion confirmation flows. The "before" section shows inconsistent dialog designs and vague wording across Android and iOS for deleting an Ally persona or shortcut. The "after" section shows a standardized cross-platform flow with consistent confirmation dialogs, a processing state, a clear success message, and a graceful error state with a "Try again" option.
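One way to read the standardized pattern is as a small state machine. A hypothetical Swift sketch, with state names mirroring the screens described above rather than Ally's actual code:

```swift
// States of the standardized destructive-action flow (delete, unpair, etc.).
enum DestructiveActionState {
    case idle                    // nothing in progress
    case confirming              // confirmation dialog is shown
    case processing              // request in flight, spinner visible
    case succeeded               // clear success message
    case failed(message: String) // graceful error with a "Try again" option
}
```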

Beta tester feedback

The integration was shared with a group of 600 beta testers who provided feedback freely through a Slack channel. The overall response was positive: testers highlighted the smoothness of the experience and expressed enthusiasm for the integration. Where they expressed confusion, I used their input to refine the design before the wider public beta release. For example, I identified features beta testers were unaware of and added dedicated guidance about them to the tutorial.

Screenshots of beta tester feedback messages from a group chat. Testers share generally positive impressions of the Meta Ray-Ban integration — praising the pairing process and seamless experience — while noting issues like missing instructions for advanced voice commands. An annotation highlights how one piece of feedback directly led to adding advanced feature guidance to the tutorial.