• AI-powered PCs will run AI workloads locally rather than in the cloud.
  • The rise of local AI processing extends beyond PCs to mobile devices.

In the near future, Microsoft’s Copilot AI will operate locally on PCs instead of relying on the cloud. The increase in local computing power is expected to enhance performance and privacy.

Hardware advances pave the way for AI PCs

Intel is working to enable Microsoft’s Copilot AI to run locally on PCs, rather than relying solely on cloud-based processing.

Also read: Intel aims for over 100 million AI PCs globally by 2025

Copilot, like most generative AI tools, currently processes tasks in remote data centres, which requires significant resources and electricity. The integration of neural processing units (NPUs) is expected to improve AI processing efficiency and allow generative AI tasks to run on the device itself.

This shift towards local processing aims to reduce delays, enhance performance, and bolster privacy by minimizing reliance on cloud-based infrastructure.

Empowering on-device intelligence

PCs are not the only devices involved in the local AI boom. Google’s Pixel 8 and Pixel 8 Pro smartphones, featuring the Tensor G3 chip, have laid the foundation for on-device generative AI.

While these devices support on-device generative AI for tasks like audio summarization and intelligent response generation, they currently lack the capacity to run extensive AI models locally.

Enhancing cybersecurity

“We’ve seen enough third-party breaches of cloud services to know that even with promises, the data can be lost,” said John Bambenek, a cybersecurity consultant.

As Bambenek noted, local AI processing offers potential cybersecurity benefits: it allows organizations to maintain control over their data, reducing the risk of third-party breaches associated with cloud-based AI services.