Microsoft has announced that, starting in early 2025, the AI assistant Copilot will be able to run locally on Windows PCs.
At CES 2025 in Las Vegas, Pavan Davuluri, corporate vice president of Windows and Surface, announced that the new Phi Silica Small Language Model (SLM) will soon be integrated into Windows.
No more need for the cloud
Phi Silica, announced in May 2024, is a local alternative to the powerful Large Language Models (LLMs) that currently run in the cloud. While LLMs are faster and more accurate, they require expensive subscriptions and depend on an Internet connection. SLMs such as Phi Silica, by contrast, run entirely on the device, which better protects privacy and keeps sensitive information out of the cloud.
The Phi Silica model, with 3.3 billion parameters, is designed to balance speed and accuracy. However, it requires a PC with a Neural Processing Unit (NPU), the dedicated hardware that accelerates on-device AI workloads. Phi Silica provides the basis for running Copilot locally on Windows PCs.
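Phi Silica itself ships only as part of Windows, but Microsoft's openly released Phi-3 models give a rough idea of what fully local inference looks like. The sketch below is a minimal illustration using the Hugging Face transformers library on an ordinary CPU or GPU; the model name and generation settings are assumptions for illustration, and this is not the NPU-accelerated path Copilot will actually use.

```python
# Minimal sketch of on-device text generation with a small language model.
# Phi Silica is not available as a standalone download, so this uses the
# openly published Phi-3-mini checkpoint as an assumed stand-in.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed stand-in for Phi Silica

# The model is downloaded once; after that, inference involves no cloud calls.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Explain in two sentences why on-device AI helps protect privacy."
inputs = tokenizer(prompt, return_tensors="pt")

# Generation runs entirely on the local machine: the prompt and the
# response never leave the device.
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

On NPU-equipped PCs, the same idea runs on dedicated AI hardware instead of the CPU or GPU, which is what makes an always-available, offline Copilot practical.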
Microsoft stresses that features such as Windows Recall, along with other AI tools, will build on this technology in the future. Perhaps other companies will follow Microsoft’s lead in making AI more accessible and less dependent on the cloud.