
"The smart specs combine mounted cameras with computer vision and what Amazon calls "AI-powered sensing capabilities," which project turn-by-turn directions and delivery instructions directly into the driver's field of view. The idea, Amazon said in an announcement, is to create a "hands-free experience" that reduces the need for drivers to repeatedly look between their phones, packages, and surroundings when making a drop-off. In practice, the hands-free part may be limited, given that packages still need to be carried to the customer's doorstep."
"Once a driver parks, the glasses automatically activate, helping them with everything from finding the right parcel in the van to navigating apartment buildings and confirming the drop-off. A small controller in the delivery vest houses the battery and buttons, including an emergency contact system. The glasses support prescription lenses and transitional lenses that automatically adjust to light."
Amazon is testing AI-powered smart glasses for delivery drivers that combine cameras, computer vision, and "AI-powered sensing" to project navigation and delivery instructions into the driver's field of view. The glasses aim to reduce drivers' need to look between phones, packages, and surroundings during drop-offs, though carrying parcels still limits hands-free operation. The eyewear activates when a driver parks and helps find parcels, navigate buildings, and confirm drop-offs. A small controller in the delivery vest houses the battery, buttons, and an emergency contact system. The glasses support prescription and transitional lenses, and future iterations could detect wrong-address drops, flag loose pets or trip hazards, and adjust to low light. The glasses were developed under the Delivery Service Partner program with feedback from hundreds of test drivers, and the wearable cameras and continuous tracking raise privacy concerns.
Read at The Register