In 2023, Microsoft was a big believer in large language models running in the cloud. At Microsoft Build, however, the company launched Phi Silica, a small language model designed to run specifically on the NPUs in new Copilot+ PCs.

In April, Microsoft announced Phi-3-mini, a model small enough to run on a local PC. Phi Silica is a derivative of Phi-3-mini, tuned specifically for the Copilot+ PCs that Microsoft announced Monday.
Most interactions with AI take place in the cloud; even on your PC, Microsoft's existing Copilot service talks to a remote Microsoft server. A cloud-based service like Copilot is powered by a large language model (LLM), with billions of parameters that improve the accuracy of its answers.