Microsoft Copilot AI will soon run locally on PCs  

  • According to Intel, Microsoft’s Copilot AI service will run locally on PCs.
  • Next-gen AI PCs would require built-in neural processing units (NPUs) delivering over 40 trillion operations per second (TOPS), beyond the capabilities of any consumer processor currently on the market.
  • Intel mentioned that these AI PCs would be equipped to handle “more elements of Copilot” locally.

Microsoft’s Copilot AI could soon run locally on PCs rather than relying on the cloud, Intel has confirmed.

Some updates

According to Intel, the chatbot could run on future AI-enabled PCs, which would need to incorporate neural processing units (NPUs) delivering over 40 trillion operations per second (TOPS), beyond the capabilities of any consumer processor currently on the market.

Intel also mentioned AI PCs would be able to run “more elements of Copilot” locally.

Currently, Copilot runs nearly everything in the cloud, even small requests. That creates a fair amount of lag, which is acceptable for larger jobs but not ideal for smaller ones. Adding local compute capability would reduce that lag while potentially improving performance and privacy as well.


Copilot on your PC

As previously reported, using Copilot on Windows 11, ChatGPT, Adobe Firefly, or similar generative AI tools doesn’t process tasks on your computer, but rather in a remote data centre, which can consume a lot of resources and power.

While it is possible to run some text-to-image models and language models locally, PCs with substantial processing power, especially high-speed GPUs, are usually required to obtain high-quality results.

Local AI trends aren’t limited to PCs. For example, Google’s Pixel 8 and Pixel 8 Pro smartphones are equipped with the Tensor G3 chip, which Google claims lays the groundwork for generative AI on the devices.

However, despite these advances, such hardware is currently not capable of running large AI models like Google’s Bard AI, Copilot, or ChatGPT locally.

Unfortunately for Intel, the first NPU suitable for powering Copilot locally may come from Qualcomm.

The company’s upcoming Snapdragon X processors, long seen as the Windows ecosystem’s answer to Apple’s M-series Mac chips, promise up to 45 TOPS.

Rumors suggest that Microsoft will shift the consumer version of its Surface tablet to Qualcomm’s chips after a few years of offering both Intel and Qualcomm options.

Microsoft announced a Surface Pro update with Intel’s Meteor Lake chips last week but is only selling it to businesses.

Jennifer Yu

Jennifer Yu is an intern reporter at BTW Media covering artificial intelligence and internet governance. She graduated from The University of Hong Kong with a Master’s degree in Journalism. Send tips to j.yu@btw.media.
