Meta glasses hackers have new AI specs that listen, respond
Briefly

Halo X is a pair of smart glasses with a heads-up display, embedded microphones, and an agentic AI that interprets ambient audio and provides contextually relevant information. The frames lack a camera; instead, they transmit audio over Bluetooth Low Energy to a paired iPhone running the Halo app. Transcribed requests are sent to a cloud platform for AI processing, while transcripts and summaries are stored locally on the paired device. Approximately 20 beta testers in Silicon Valley are using early versions built on third-party hardware. The founders claim continuous listening, real-time answers, and unsolicited feedback to enhance the wearer's knowledge.
Caine Ardayfio and AnhPhu Nguyen on Tuesday opened preorders for Halo X, a pair of smart glasses they designed that, while not equipped with a camera like Meta's Ray-Bans, do include a heads-up display. That display and the embedded microphones, combined with an agentic AI that is purportedly able to digest what the glasses hear, mean the frames can respond with information the AI deems relevant to the wearer's current circumstances.
As designed, the glasses transmit audio via a Bluetooth Low Energy connection to a paired iPhone running the Halo app, which sends transcribed requests to Halo's cloud platform for processing. While the founders told us that transcripts and summaries are stored locally on the paired device, all AI processing happens in the cloud. Ardayfio told us that using a combination…
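For illustration, here is a minimal Swift sketch of the phone-side half of that flow. The endpoint URL, payload shape, and log file name are assumptions, since Halo has not published its API, and Bluetooth audio capture and speech-to-text are omitted.

```swift
import Foundation

// Sketch of the described phone-side pipeline: post a transcribed
// request to a cloud endpoint, keep the transcript on the device.
// Endpoint, payload, and file name are hypothetical placeholders.
struct HaloStyleRelay {
    // Hypothetical cloud endpoint that processes transcribed requests.
    let endpoint = URL(string: "https://example.com/v1/assist")!

    // Send a transcribed request to the cloud and return the reply body.
    func query(transcript: String) async throws -> String {
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONEncoder().encode(["transcript": transcript])
        let (data, _) = try await URLSession.shared.data(for: request)
        return String(decoding: data, as: UTF8.self)
    }

    // Keep transcripts on the paired device, as the founders describe.
    func storeLocally(transcript: String) throws {
        let dir = try FileManager.default.url(for: .documentDirectory,
                                              in: .userDomainMask,
                                              appropriateFor: nil,
                                              create: true)
        let file = dir.appendingPathComponent("transcripts.log")
        let existing = (try? String(contentsOf: file, encoding: .utf8)) ?? ""
        try (existing + transcript + "\n").write(to: file, atomically: true, encoding: .utf8)
    }
}
```

Calling query(transcript:) and then storeLocally(transcript:) mirrors the split the founders describe: AI processing in the cloud, transcripts retained on the paired handset.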
Read at The Register