- Tesla, under Elon Musk, and the AI-focused company xAI have accumulated a significant number of Nvidia H100 series chips to tackle autonomous driving and advance AI research.
- Meta has stockpiled the highest number of H100 GPUs globally, with around 350,000 units.
- Despite the substantial investments and acquisitions of H100 GPUs, the rapid evolution of AI technology, exemplified by Nvidia’s GB200 Grace Blackwell superchip, poses a risk of rapid obsolescence for these high-end chips.
OUR TAKE
The rapid evolution of AI poses a risk of obsolescence for even the highest-end chips. In March this year, Nvidia unveiled the GB200 Grace Blackwell superchip, which pairs an Arm-based Grace CPU with two Blackwell B200 GPUs. The system can support AI models with up to 27 trillion parameters and delivers up to 30 times faster processing in tasks such as chatbot responses.
— Chloe CHEN, BTW Media reporter
Under Elon Musk’s leadership, Tesla and his AI-focused company xAI have stockpiled a large number of Nvidia H100 series chips. Tesla aims to use this computing power to crack the ultimate challenge of autonomous driving, full L5 autonomy, while xAI carries Musk’s vision of building an “ultimate truth” AI.
Also read: Musk’s xAI to launch an improved version of the chatbot Grok
Also read: xAI’s chatbot Grok will be available to all premium subscribers
Meta has hoarded the most H100 GPUs globally
Recently, a user on the X platform, “The Technology Brother,” revealed that Meta has hoarded the most H100 GPUs globally, with a staggering 350,000 units. Musk, however, took issue with the ranking, which placed Tesla and xAI far down the list at 10,000 units, stating that “if calculated correctly, Tesla should be second, and xAI third.”
This statement suggests that Tesla may currently hold between 30,000 and 350,000 H100 GPUs, while xAI has around 26,000 to 30,000 units. In January this year, Musk confirmed an additional $500 million investment in Tesla’s Dojo supercomputer, equivalent to around 10,000 H100 GPUs, emphasising that “Tesla will invest more funds in Nvidia hardware this year” because “staying competitive in AI requires billions of dollars in investment annually.”
xAI is also actively stockpiling computing power
In 2023, Musk recruited top AI talent from DeepMind, OpenAI, Google Research, Microsoft Research, Tesla, and the University of Toronto to build xAI. At the time, xAI reportedly purchased around 10,000 Nvidia GPUs, likely from the A100 series. Judging from Musk’s recent statement, however, xAI has since amassed a considerable number of H100 GPUs as well.