Apple Visual Intelligence To Add Screen Search With Google, ChatGPT & Third-Party Apps
Briefly

At WWDC 2025, Apple highlighted the importance of Apple Intelligence and Apple Visual Intelligence, signaling ongoing enhancements. Craig Federighi, SVP of Software Engineering, said Siri's more personal features need more time to meet Apple's quality bar, and announced new language support in iOS 26. The new 'Foundation Models framework' will let app developers tap into Apple's on-device models without cloud costs or privacy concerns. The session also showcased features like image search from screenshots, though Apple's offerings were noted to lag behind competitors like Google in some areas.
On Siri, Federighi said: "As we shared, we continue our work to deliver the features that make Siri even more personal. This work needed more time to reach our high quality bar, and we look forward to sharing more about it in the coming year."
The new 'Foundation Models framework' lets app developers tap into Apple's on-device models. Because inference runs entirely on the device rather than in the cloud, there is no per-request cost, no data leaves the phone, and it works offline.
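
As a rough illustration, here is a minimal Swift sketch of how an app might prompt the on-device model. It assumes the session-based API (LanguageModelSession and respond(to:)) that Apple previewed for the framework at WWDC 2025, so treat the exact names and signatures as provisional.

```swift
import FoundationModels

// Minimal sketch: ask the on-device model for a one-sentence summary.
// Assumes the LanguageModelSession API previewed at WWDC 2025; the
// shipping SDK's types and signatures may differ.
func summarize(_ text: String) async throws -> String {
    // A session holds instructions and conversational context
    // across multiple prompts.
    let session = LanguageModelSession(
        instructions: "You are a concise assistant that summarizes text."
    )
    // Inference runs entirely on device: no network call, no API fee.
    let response = try await session.respond(
        to: "Summarize in one sentence: \(text)"
    )
    return response.content
}
```

Because the model runs locally, a call like this can be made from any screen of an app without a network check or an API key, which is the cost and privacy advantage Apple emphasized.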
When you take a screenshot, 'Ask' and 'Image Search' buttons appear at the bottom of the screen. Tapping 'Image Search' finds matching images via Google or supported third-party apps.
Google's Live Search is much more impressive right now.
Read at Search Engine Roundtable