
"During its "Delivering the Future" event in San Francisco last month, Amazon unveiled its smart delivery glasses, designed to help delivery associates deliver packages more safely and efficiently, which in turn improves customer delivery experiences. The glasses can scan packages, display turn-by-turn walking directions, and capture an image of the delivery hands-free, helping drivers stay focused and avoid reaching for their phones. AI and machine learning power it all."

"The glasses use AI-powered sensing, computer vision, and a camera to create a 'heads-up display' that can show navigation details, hazards, and delivery tasks, as stated in the release. The in-lens display appears to function similarly to the in-lens displays found on Even Realities smart glasses. For example, the glasses automatically populate delivery information, such as the address and number of packages, within the driver's field of view as soon as the driver parks safely at the delivery location."
Amazon unveiled AI-powered smart delivery glasses intended to help delivery associates scan packages, display navigation, capture hands-free delivery images, and identify hazards. The glasses use sensing, computer vision, and a camera to create an in-lens heads-up display that shows navigation details, hazards, and delivery tasks. Delivery information such as address and package count populates automatically when the driver parks; the glasses help locate packages in the truck and provide alerts when identified. The device aims to keep drivers focused by reducing phone use and to improve both delivery safety and customer experiences.
Read at ZDNET