XREAL's Project Aura moves into developer kit status with a clear architectural choice: optical see-through glasses tethered to a dedicated compute puck. The glasses handle tracking and display. The puck runs Android XR on Snapdragon XR2+ Gen 2.
The standout feature is electronic dimming that ranges from 0 to 100% opacity, letting the same hardware do full AR or full VR immersion. Input is split too: a puck-mounted trackpad plus dual spatial cameras for hand tracking. It's a bet that a tethered puck is less friction than a phone-tethered or all-in-one design.
Vision Pro and Meta Quest are integrated designs: compute, battery, and display all in one headset. XREAL is betting that splitting glasses from compute unit is the better tradeoff. The glasses stay light and cool; the puck is where the power lives. Instead of a phone in a pocket, users carry the puck on a belt or in hand. It's different, but it makes ergonomic sense if adoption is the goal: glasses don't feel heavy when the compute isn't in them.
This is the clever part. Most AR glasses are AR-only. If you want true immersion, you need a different device. Aura flips that. Full opacity dimming lets you snap into VR mode instantly. One device for both spatial computing paradigms. That's a flexibility advantage.
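To make the two-paradigms-one-device idea concrete, here is a toy model of the dimming-based mode switch. This is purely illustrative: `DimmingController`, `set_opacity`, and the threshold value are hypothetical names and assumptions, since XREAL has not published an SDK for the dimming layer.

```python
# Toy sketch of Aura's dimming-based AR/VR mode switch.
# All names and thresholds here are hypothetical, not XREAL's API.

class DimmingController:
    """Models lens opacity from 0.0 (clear, full AR) to 1.0 (opaque, full VR)."""

    VR_THRESHOLD = 1.0  # assumption: VR mode means fully opaque lenses

    def __init__(self):
        self.opacity = 0.0  # start transparent: optical see-through AR

    def set_opacity(self, level: float) -> None:
        # Clamp to the 0-100% range the hardware advertises.
        self.opacity = max(0.0, min(1.0, level))

    @property
    def mode(self) -> str:
        # The render pipeline only needs to know whether the real
        # world is still visible behind the content.
        return "VR" if self.opacity >= self.VR_THRESHOLD else "AR"


ctrl = DimmingController()
print(ctrl.mode)        # AR at startup
ctrl.set_opacity(1.0)   # "snap into VR mode"
print(ctrl.mode)
```

The design point the sketch captures: mode isn't a separate hardware path, it's a property derived from opacity, which is why one device can cover both paradigms.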
This is a real piece of developer hardware, not vaporware. Dual-camera tracking for hand input is solid. Puck-based input is familiar from phone-era AR. The Android XR foundation means devs porting OpenXR-based apps from Meta Quest shouldn't struggle too hard. Thermal management on the puck is still an open question (sustained GPU-intensive apps might cook it), but this is early stage.