Diving into Android XR: Google Teases XREAL’s ‘Project Aura’ Smart Glasses

Updated on May 28, 2025
XREAL’s Project Aura

Android XR will debut later this year on Project Moohan, Samsung's mixed reality headset. Now Google has partnered with AR glasses maker XREAL to deliver the platform's second device: the just-revealed Project Aura.

Google disclosed during its I/O developer conference that XREAL, a Chinese company, will produce the second certified device running Android XR, the forthcoming XR operating system currently in developer preview.

Referred to as Project Aura, the device is described by the companies as a portable, tethered optical see-through (OST) headset that gives users access to their favorite Android apps, including those built for XR.

Details remain scarce for now, though XREAL says Project Aura was created in partnership with Google and chipmaker Qualcomm. The glasses will be offered to developers shortly after the release of Project Moohan, recently confirmed to launch later this year.

XREAL has not yet released detailed specifications. The company has previously paired micro-OLED displays with birdbath optics, an approach that differs from the more expensive waveguide optics used in products like Microsoft HoloLens, Magic Leap One, and Meta's Orion AR glasses prototype.

Though they usually result in bulkier designs, birdbath optics use a curved mirror system to produce brighter displays with a wider field of view (FOV) at a lower cost. Waveguides, by contrast, are typically thinner but more expensive to manufacture, enabling more wearable designs with better transparency; however, they generally offer a smaller FOV, although prototypes like Meta's Orion challenge that norm.

Like the Android XR glasses presented at Google I/O, which are under development by eyewear companies Warby Parker and Gentle Monster, XREAL's Project Aura is expected to include built-in Gemini AI, enabling real-time translation, AI assistant interactions, web searches, object recognition, and contextual information.

Aimee Pearcy

Tech Journalist
