Intel develops the largest neuromorphic computer system

  • Intel develops the world’s largest neuromorphic computer system, named Hala Point.
  • Hala Point contains 1,152 Loihi 2 processors, supporting up to 1.15 billion neurons and 128 billion synapses.
  • The system runs up to 50 times faster than traditional GPU-based systems while using 100 times less energy.

Intel has unveiled Hala Point, the world’s largest neuromorphic computer system. This innovative hardware stack promises groundbreaking advancements in AI capabilities and computational efficiency.

Unprecedented scale and design

Hala Point integrates 1,152 Loihi 2 processors, boasting a remarkable capacity to support up to 1.15 billion neurons and 128 billion synapses.

The system’s architecture, comprising 140,544 neuromorphic processing cores, reflects Intel’s commitment to pushing the boundaries of AI research and development.
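A quick back-of-the-envelope calculation puts these headline totals in per-processor terms. The totals are the figures reported above; the divisions (and the resulting per-chip numbers) are illustrative arithmetic, not Intel specifications.

```python
# Per-processor breakdown of the reported Hala Point totals.
# Totals come from the article; the derived figures are simple
# divisions for scale, not official Loihi 2 specifications.

processors = 1_152
neurons = 1_150_000_000        # up to 1.15 billion neurons
synapses = 128_000_000_000     # 128 billion synapses
cores = 140_544                # neuromorphic processing cores

print(f"cores per processor:            {cores // processors}")
print(f"neurons per processor (approx): {neurons // processors:,}")
print(f"synapses per neuron (approx):   {synapses // neurons}")
```

Dividing through, each Loihi 2 processor accounts for 122 of the neuromorphic cores and roughly a million neurons, with on the order of a hundred synapses per neuron.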


Exceptional performance and efficiency

Leveraging its neuromorphic architecture, Hala Point achieves up to 20 quadrillion operations per second, surpassing the computational capabilities of traditional GPU-based systems.
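To put that throughput figure in context, it can be restated in petaops and spread across the reported neuron count. These derived rates are illustrative arithmetic from the article's numbers, not Intel benchmarks.

```python
# Restate the reported throughput: 20 quadrillion operations per
# second, divided over the reported 1.15 billion neurons.
# Derived per-neuron rate is illustrative, not an Intel benchmark.

ops_per_second = 20 * 10**15   # 20 quadrillion ops/s
neurons = 1_150_000_000

print(f"throughput: {ops_per_second / 10**15:.0f} petaops/s")
print(f"ops per neuron per second: {ops_per_second / neurons:,.0f}")
```

That works out to 20 petaops per second overall, or on the order of seventeen million operations per neuron per second.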

Despite this performance, Hala Point consumes only a fraction of the energy required by conventional GPU-based hardware, marking a significant leap in computational efficiency.


Potential for advancement and impact

Intel envisions Hala Point as a catalyst for future AI innovation. Intel said Hala Point “could enable future real-time continuous learning for AI applications” like AI agents, large language models and smart city infrastructure management.

“We hope that research with Hala Point will advance the efficiency and adaptability of large-scale AI technology,” said Mike Davies, director of the Neuromorphic Computing Lab at Intel Labs.

Intel is not alone in pursuing brain-inspired computing. Google DeepMind is exploring NeuroAI, which emphasizes AI that learns from memory rather than sheer data accumulation, while IBM’s NorthPole semiconductors mimic human brain processing on a single chip.


Lydia Luo

Lydia Luo is an intern reporter at BTW Media covering IT infrastructure. She graduated from Shanghai University of International Business and Economics.
