During a talk at the TED Humanity Reimagined conference in Vancouver, Canada, Shahram Izadi and other Google executives demonstrated a prototype pair of AI glasses.
The smart glasses look like ordinary glasses but have a miniature display built in, Axios writes.
Nishta Bathia, product manager for Glasses & AI, demonstrated several use cases, including a feature called Memory, in which the Gemini AI assistant uses a built-in camera to track what the user sees and remind them where they've placed objects such as keys. The glasses can also record, translate, or transcribe conversations in real time, in English or other languages such as Hindi.
"These glasses work with your phone, streaming back and forth, allowing the glasses to be very lightweight and access all of your phone apps," Izadi said.
Google also demonstrated the Project Moohan mixed reality headset, which is being developed by Samsung and is set to release in 2025. Both the headset and the smart glasses are based on the Android XR operating system announced last December.