At TED2025 Google showed off sleek smart glasses with a HUD, though the company described them as "conceptual hardware".
Shahram Izadi, Google's Android XR lead, took to the TED stage earlier this month to show off both the HUD glasses and Samsung's upcoming XR headset, and the 15-minute talk is now publicly available to watch.
A supercut of the TED2025 demo.
The glasses feature a camera, microphones, and speakers, like the Ray-Ban Meta glasses, but also have a "tiny high resolution in-lens display that is full color". The display appears to be monocular, refracting light in the right lens when viewed from certain camera angles during the demo, and has a relatively small field of view.
The demo focuses on Google's Gemini multimodal conversational AI system, including the Project Astra capability which lets it remember what it sees by "continuously encoding video frames, combining the video and speech input into a timeline of events, and caching this information for efficient recall".
Here is everything Izadi and his colleague Nishtha Bhatia showcase in the demo:
- Basic Multimodal: Bhatia asks Gemini to write a haiku based on what she's seeing, while looking at the audience, and it responds "Faces all aglow. Eager minds await the words. Sparks of thought ignite"
- Rolling Contextual Memory: after looking away from a shelf, which contains objects including a book, Bhatia asks Gemini what the title is of "the white book that was on the shelf behind me", and it answers correctly. She then tries a harder query, asking simply where her "hotel keycard" is, without giving the clue about the shelf. Gemini correctly answers that it's to the right of the music record.
- Complex Multimodal: holding open a book, Bhatia asks Gemini what a diagram in it means, and Gemini answers correctly.
- Translation: Bhatia looks at a Spanish sign, and without telling Gemini what language it is, asks it to translate the sign to English. It succeeds. To prove that the demo is live, Izadi then asks the audience to pick another language, someone picks Farsi, and Gemini successfully translates the sign to Farsi too.
- Multi-Language Support: Bhatia speaks to Gemini in Hindi, without needing to change any language "mode" or "setting", and it responds instantly in Hindi.
- Taking Action (Music): as an example of how Gemini on the glasses can trigger actions on your phone, Bhatia looks at a physical album she's holding and tells Gemini to play a track from it. The song starts on her phone, streaming to the glasses via Bluetooth.
- Navigation: Bhatia asks Gemini to "navigate me to a park nearby with views of the ocean". When she's looking directly forwards, she sees a 2D turn-by-turn instruction, while when looking downwards she sees a 3D (though fixed) minimap showing the travel route.
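The "rolling contextual memory" behavior above maps onto the timeline-of-events description quoted earlier: frames and speech are encoded as timestamped events, cached, and queried later. For illustration only, here is a minimal, hypothetical Python sketch of that idea; the class, names, and keyword-based recall are assumptions for the example, not Google's actual implementation.

```python
# Hypothetical sketch of a "rolling contextual memory" timeline cache.
# NOT Google's implementation; all names and logic are illustrative.
from collections import deque
from dataclasses import dataclass
import time

@dataclass
class Event:
    timestamp: float   # when the frame or utterance was captured
    kind: str          # "video_frame" or "speech"
    description: str   # e.g. a caption of what the frame shows

class TimelineMemory:
    def __init__(self, max_events: int = 1000):
        # Bounded cache: oldest events fall off as new ones arrive.
        self.events: deque[Event] = deque(maxlen=max_events)

    def add(self, kind: str, description: str) -> None:
        self.events.append(Event(time.time(), kind, description))

    def recall(self, query: str) -> list[Event]:
        # Toy keyword recall; a real system would use learned retrieval.
        terms = query.lower().split()
        return [e for e in self.events
                if any(t in e.description.lower() for t in terms)]

memory = TimelineMemory()
memory.add("video_frame", "white book on shelf next to a music record")
memory.add("video_frame", "hotel keycard to the right of the music record")
print(memory.recall("white book"))
```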
Google Teases AI Smart Glasses With A HUD At I/O 2024
Google teased multimodal AI smart glasses with a HUD at I/O 2024.
This isn't the first time Google has shown off smart glasses with a HUD, and it isn't even the first time such a demo has focused on Gemini's Project Astra capabilities. At Google I/O 2024, almost one year ago, the company showed a short prerecorded demo of the technology.
Last year's glasses were notably bulkier than what was shown at TED2025, however, suggesting the company is actively working on miniaturization with the goal of delivering a product.
Still, Izadi describes what Google is showing as "conceptual hardware", and the company hasn't announced any specific product, nor a product timeline.
In October, The Information's Sylvia Varnham O'Regan reported that Samsung is working on a Ray-Ban Meta glasses competitor with Google Gemini AI, though it's unclear whether this product will have a HUD.
Meta HUD Glasses Price, Features & Input Device Reportedly Revealed
A new Bloomberg report details the price and features of Meta’s upcoming HUD glasses, and claims that Meta’s neural wristband will be in the box.
If it does have a HUD, it won't be alone in the marketplace. Along with the dozen or so startups which showed off prototypes at CES, Mark Zuckerberg's Meta reportedly plans to launch its own smart glasses with a HUD later this year.
Like the glasses Google showed at TED2025, Meta's glasses reportedly have a small display in the right eye and a strong focus on multimodal AI (in Meta's case, the Llama-powered Meta AI).
Unlike Google's glasses, though, which appeared to be primarily controlled by voice, Meta's HUD glasses will reportedly also be controllable via finger gestures, sensed by an included sEMG neural wristband.
Apple too is reportedly working on smart glasses, with apparent plans to launch a product in 2027.
Apple Exploring Releasing Smart Glasses In 2027
Apple seems to be exploring making smart glasses, and reportedly could ship a product in 2027.
All three companies are likely hoping to build on the initial success of the Ray-Ban Meta glasses, which recently passed 2 million units sold and will see production vastly increased.
Expect competition in smart glasses to be fierce in the coming years, as the tech giants battle for control of the AI that sees what you see and hears what you hear, and the ability to project imagery into your view at any time.