Until now, Microsoft has been a big believer in large language models running in the cloud. But at Microsoft Build 2024, the company launched Phi Silica, a small language model designed to run specifically on the NPUs in new Copilot+ PCs.
In April, Microsoft announced Phi-3-mini, a model small enough to run on a local PC. Phi Silica is a derivative of Phi-3-mini, designed specifically to run on the Copilot+ PCs the company announced Monday.
Most interactions with AI take place in the cloud; Microsoft's existing Copilot service, even on your PC, talks to a remote Microsoft server. A cloud-based service like Copilot is powered by a large language model (LLM), with billions of parameters that increase the accuracy of Copilot's answers.