Creating a user interface for an AR contact lens might be the most extreme design challenge today. How do you deliver useful visuals and controls on a “screen” that sits directly on the user’s eyeball? As companies like XPANCEO and Mojo Vision push closer to functional smart contact lenses, designers are grappling with constraints and possibilities unlike anything in conventional UX.
Field of View & Focus: A fundamental consideration is that a contact lens display typically covers only a portion of the eye’s field of view – often around 40–50° in the center. Designers can’t just fill the user’s entire vision with UI; instead, they must work within a limited circular HUD that moves with the eye. This means prioritizing essential information in the center of gaze and avoiding clutter at the edges. Moreover, virtual elements need to appear at a perceivable focal distance. Because the lens moves with the eyeball, AR content must be optically engineered to stay in focus without the user straining. Recent prototypes address this by projecting a collimated image (parallel light) onto the retina, so it’s as if you’re viewing a display several feet away.
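The “limited circular HUD” constraint above can be expressed as a simple angular check. This is a hypothetical sketch (the function name, the assumed 45° total field of view, and the coordinate convention are illustrative, not from any actual lens SDK): a UI element placed at some angular offset from the center of gaze is only renderable if its eccentricity fits within the HUD’s half-angle.

```python
import math

# Assumed: a ~45-degree circular display area, i.e. a 22.5-degree half-angle
# measured from the center of gaze. Real hardware values will differ.
HUD_HALF_ANGLE_DEG = 22.5

def in_hud(azimuth_deg: float, elevation_deg: float,
           half_angle_deg: float = HUD_HALF_ANGLE_DEG) -> bool:
    """Return True if an element's angular eccentricity fits inside the HUD."""
    # Small-angle approximation: treat azimuth/elevation as planar offsets.
    eccentricity = math.hypot(azimuth_deg, elevation_deg)
    return eccentricity <= half_angle_deg

# A glyph 10 degrees right and 5 degrees above gaze center is displayable;
# one 30 degrees off-axis falls outside the lens's display area entirely.
print(in_hud(10, 5))   # True
print(in_hud(30, 0))   # False
```

A layout engine built on a check like this would either cull off-HUD elements or pull the most important ones back toward the center of gaze, matching the “prioritize the center, avoid edge clutter” guideline.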
Brightness & Contrast: Unlike a phone screen, an AR lens interface is effectively see-through, and real-world lighting can dramatically affect visibility of digital overlays. Designers have to consider scenarios from bright outdoor sun to dim indoors. A smart lens may need dynamic brightness adjustment or even a way to tint the lens when displaying content to ensure virtual elements remain visible. XPANCEO even built a test rig to measure how virtual imagery holds up under different lighting conditions, informing guidelines like using high-contrast graphics for outdoor readability.
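One way to think about dynamic brightness adjustment is as a mapping from ambient illuminance to overlay brightness. The sketch below is purely illustrative (the function, the lux range, and the logarithmic response are assumptions, not XPANCEO’s implementation): because perceived brightness is roughly logarithmic in luminance, scaling the overlay with log-lux keeps virtual elements legible from a dim room (~10 lux) to direct sunlight (~100,000 lux).

```python
import math

def overlay_brightness(ambient_lux: float,
                       min_level: float = 0.15,
                       max_level: float = 1.0) -> float:
    """Map ambient illuminance to a display brightness level in [min, max]."""
    # Clamp to an assumed operating range: dim indoors to bright sun.
    lux = max(10.0, min(ambient_lux, 100_000.0))
    # Logarithmic response: log10 of 10..100,000 lux spans 1..5,
    # which we normalize onto 0..1 before scaling to the output range.
    t = (math.log10(lux) - 1.0) / 4.0
    return min_level + t * (max_level - min_level)

print(overlay_brightness(10))       # 0.15 (dim room: faint overlay)
print(overlay_brightness(100_000))  # 1.0  (full sun: maximum brightness)
```

A production system would likely pair this with the contrast measures the article mentions, such as electrochromic tinting behind the content region and high-contrast color palettes for outdoor use.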
Gaze-Based Interaction: Perhaps the biggest UX shift is moving from touch and click to using eye movements as the primary input. With a contact lens, there’s no mouse or touchscreen – the user’s gaze, and perhaps blinks, become the cursor and the click. This demands intuitive, low-effort interaction design. For example, an icon or menu item might be “clicked” by staring at it for a brief dwell time, or by performing a specific eye gesture (like looking up then down quickly). Early smart lens prototypes include built-in eye tracking sensors to detect these kinds of gaze gestures. Deniss Klopotovkis, Head of Engineering at XPANCEO, describes how his team iterated on gaze controls: “We learned that even a half-second dwell can feel long to a user. Too short and you trigger actions by accident; too long feels tedious.” Designers have added subtle visual feedback (a progress ring around a gaze target) to help users know an input is registering, much like in VR gaze interfaces.
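The dwell-to-select pattern described above is essentially a small state machine: accumulate gaze time on a target, expose the fraction completed (to drive a progress ring), reset on gaze movement, and fire a selection when the threshold is reached. Here is a minimal sketch, assuming a half-second dwell and a per-frame update loop; the class name and API are hypothetical.

```python
class DwellSelector:
    """Dwell-based gaze selection with progress feedback for a ring indicator."""

    def __init__(self, dwell_seconds: float = 0.5):
        self.dwell_seconds = dwell_seconds
        self.current_target = None
        self.elapsed = 0.0

    def update(self, target, dt: float):
        """Advance by dt seconds of gaze on `target` (or None if gaze is off
        all targets). Returns (progress in 0..1, selected target or None)."""
        if target != self.current_target:
            # Gaze moved to a different target: restart the dwell timer,
            # which is what prevents accidental triggers mid-saccade.
            self.current_target = target
            self.elapsed = 0.0
        if target is None:
            return 0.0, None
        self.elapsed += dt
        progress = min(self.elapsed / self.dwell_seconds, 1.0)
        if self.elapsed >= self.dwell_seconds:
            # Fire the selection and re-arm so the target isn't
            # re-triggered on the very next frame.
            self.elapsed = 0.0
            self.current_target = None
            return 1.0, target
        return progress, None

# Holding gaze on an icon across five 0.1 s frames completes the dwell;
# the intermediate progress values (0.2, 0.4, ...) would drive the ring.
sel = DwellSelector(dwell_seconds=0.5)
clicked = None
for _ in range(5):
    progress, clicked = sel.update("settings_icon", 0.1)
print(clicked)  # settings_icon
```

Tuning `dwell_seconds` is exactly the trade-off Klopotovkis describes: shorten it and stray glances start firing actions; lengthen it and every selection feels tedious.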
Minimalism & Context: Designers are adopting ultra-minimal, context-aware approaches. AR content is shown only when relevant and quickly fades to avoid distraction. Notifications might be just a brief glow or icon at the periphery instead of persistent pop-ups.
Designing an interface that literally lives on the eye requires rethinking usability norms. Visual elements must be bold yet unobtrusive, interactions must be powerful yet subtle. User testing is still in early stages, often involving simulators or AR glasses stand-ins since fully functional lenses are rare. XPANCEO’s design team built a custom AR simulator that projects lens-like visuals onto a VR headset, allowing them to conduct over 1,000 hours of user testing on interface prototypes before the actual lens hardware was ready.
The UX of smart contact lenses sits at the intersection of design and deep tech. It demands the minimalism of great watch UI, the adaptive thinking of AR designers, and the precision of optical engineering. As Klopotovkis puts it, “The best interface is one you don’t realize is there – and that’s never been more true than when the interface is in your eye.” By embracing that philosophy, designers are slowly but surely figuring out how to make digital visuals helpful, not harmful, to the human experience of sight.