- AMD outlined a “yotta-scale” AI compute vision and a new rack-scale platform for future data centres at CES 2026.
- The company also pledged US$150 million to support AI education and community programmes.
What happened: AMD unveiled its Yotta-Scale AI roadmap at CES 2026
At the Consumer Electronics Show (CES) 2026 in Las Vegas, AMD used its opening keynote to advance an ambitious artificial intelligence strategy built around what it describes as “yotta-scale” computing — orders of magnitude beyond today’s AI infrastructure.
Chair and CEO Dr Lisa Su described it as the next step in global compute growth, citing rapid expansion in AI computing capacity. Central to the vision is Helios, a modular, rack-scale platform designed for next-generation AI data centres. A single Helios rack delivers up to 3 AI exaflops, combining AMD accelerators, CPUs and networking under the company's ROCm software stack.
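For a rough sense of the gap the "yotta-scale" label implies, the SI prefixes put a yottaFLOPS at 10^24 operations per second versus 10^18 for an exaFLOPS. The back-of-the-envelope calculation below is purely illustrative (it is not an AMD figure, and it assumes the rack-level "AI exaflops" and a hypothetical yotta-scale target are counted in the same numerical precision):

```latex
% Illustrative arithmetic only: how many 3-exaflop racks would notionally
% add up to one yottaFLOPS, assuming a like-for-like precision comparison.
\[
\frac{10^{24}\ \text{FLOP/s (one yottaFLOPS)}}{3 \times 10^{18}\ \text{FLOP/s per Helios rack}}
\approx 3.3 \times 10^{5}\ \text{racks}
\]
```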
AMD also expanded its hardware portfolio with Instinct MI440X GPUs for enterprise AI workloads, a preview of the MI500 Series GPUs planned for 2027, new Ryzen AI 400 and Ryzen AI PRO 400 Series PCs featuring 60 TOPS NPUs, plus Ryzen AI Halo systems and embedded processors designed for edge AI use cases.
In parallel with its silicon announcements, AMD pledged US$150 million to support AI education initiatives in classrooms and local communities, aligning with broader industry efforts to expand hands-on AI learning opportunities.
Why it’s important
AMD’s yotta-scale roadmap underscores how computing demands for AI continue to surge. Meeting this new scale of performance will require not just faster chips, but modular, scalable infrastructure capable of bringing thousands of processors together efficiently — and that’s exactly the role Helios is designed to play.
The educational pledge is also notable: as governments and companies emphasise AI literacy and workforce development, AMD is positioning itself not only as a hardware vendor but as a contributor to the broader ecosystem that will power and use future AI systems.
By pushing both cutting-edge infrastructure and community support, AMD aims to influence how AI is built, scaled and adopted — from supercomputers to the devices in people’s hands. This dual focus may become increasingly important as global competition in AI hardware intensifies and demand for AI skills continues to grow.
