Microsoft will bring Phi Silica to the Windows runtime this quarter as part of Copilot, according to Microsoft’s head of Windows devices, who presented on Monday at CES 2025 in Las Vegas.
Microsoft debuted Phi Silica at its Build conference in Seattle last May, showing off the Small Language Model (SLM) meant to complement the Large Language Models (LLMs) that run in the cloud. Phi Silica paves the way for a local version of Copilot to run on Windows PCs.
Typically, LLMs are faster and more accurate than SLMs, but they need to run in the cloud and can require expensive subscriptions for full access. SLMs, on the other hand, can power AI chatbots and other AI-driven applications directly on a local PC. They're less sophisticated, and they require an NPU to handle the local AI processing, but keeping everything on-device helps ensure privacy and prevents information from leaking to the cloud.
Microsoft has said that Windows Recall and other AI features will eventually depend on these sorts of SLMs. Phi Silica is a 3.3-billion-parameter model that Microsoft has fine-tuned for both accuracy and speed despite its smaller size.
Pavan Davuluri, Microsoft’s corporate vice president for Windows and devices, appeared on stage at the Intel CES 2025 presentation to make the announcement.