
""I think one of the core constructs there [is] looking at the interface of Windows evolving, looking at the interface itself becoming more multimodal and more capable based on new interaction technologies that come to life," he said. "Whether that's mouse and keyboard moving to pen and touch and so on. I think one of the transformational things that's also happening for us is the fact that we now have these AI models that can run on-device.""
""For us, the biggest priority, of course, is to be able to make sure as we evolve Copilot Plus PCs with these new capabilities, we're able to have the Copilot in Windows, have the M365 Copilot, take advantage of these capabilities," he added. "We see value in the Windows operating system itself having intrinsic understanding of these new AI models that are running in them and taking advantage of new capabilities.""
Windows is expected to evolve into a multimodal operating system that uses on-device AI models to change how users interact with their devices. Interaction methods will expand beyond mouse and keyboard to pen, touch, voice, and other emerging input technologies. Increasingly capable local AI models will bring new functionality and agentic behavior to the device itself, enabling better tool orchestration at the edge. Integration will focus on connecting Copilot in Windows and M365 Copilot to those on-device models, with the operating system gaining an intrinsic understanding of local AI so it can coordinate features and deliver more interactive, context-aware experiences.
Read at IT Pro