Google Accelerates Its Move into Smart Glasses and Targets a Commercial Launch of Project Aura in 2026
Google is accelerating its push into smart glasses and is aiming for a commercial launch of the Project Aura lineup as early as 2026. Aura belongs to a new wave of augmented reality devices built on the Android XR platform, where the key feature is not flashy AR effects but deep integration with Gemini and a persistent, real-time AI context. The glasses see what you see, hear nearby speech through their microphones, and instantly turn that input into prompts, translation, navigation, and a kind of contextual "memory" of what matters to you.
According to published descriptions, Google is preparing two formats. The first is a display-equipped version with a camera, multiple microphones, and visual prompts rendered directly in the field of view, so short instructions, live translation, and routes appear without the need for a phone. The second is a lightweight, screenless model focused on voice and audio: you ask a question, and Gemini's responses come through speakers or earphones, preserving a hands-free experience and keeping the glasses as light as possible for everyday wear. In both cases, the bet is on Android XR and Gemini's cloud computing: an "intelligent layer" on top of reality rather than heavy hardware built into the frame.
Project Aura thus looks like a bridge between glasses and full XR devices. The product of a partnership between Google and Xreal, it has already been demonstrated to and tested by journalists, and many expect it to become one of the first prominent implementations of Android XR in a glasses form factor.