Your entire iPhone screen is now searchable with new Visual Intelligence features
Briefly

Despite earlier challenges in AI deployment, Apple is revitalizing its efforts with an expanded Visual Intelligence feature set to roll out with iOS 26. The feature lets users seamlessly search both captured images and on-screen content. Craig Federighi, Apple's senior vice president of Software Engineering, highlighted the company's commitment to making AI relevant and easy to use while maintaining privacy. Developers will also gain access to the on-device model, enabling richer functionality such as adding events to the calendar directly from captured images.
Apple's AI efforts have struggled with execution, marked by delayed features and uneven performance. The new Visual Intelligence, however, aims to significantly enhance the user experience.
Craig Federighi emphasized that Apple is focusing on creating relevant, easy-to-use intelligence that respects user privacy, ensuring the models powering Apple Intelligence are efficient.
With Visual Intelligence expanding in iOS 26, users can search both captured images and on-screen content, with the option to hand queries off to ChatGPT.
Users can simply take a screenshot to get instant contextual help, streamlining everyday tasks such as adding an event to the calendar.
Read at ZDNET