Meta debuts an ‘all-rounder’ MTIA chip 3 times faster than its predecessor

  • Meta promises its next-generation custom AI chip will be more powerful and able to train its ranking models faster.
  • The new Meta Training and Inference Accelerator (MTIA) chip is part of a broad custom silicon effort at the company that includes looking at other hardware systems too.
  • Meta also spends billions of dollars on Nvidia and other AI chips.
  • As the AI race intensifies, high-performance AI chips are becoming more and more in demand.

Meta Platforms unveiled details of its next-generation in-house AI accelerator chip on Wednesday. The new chip will be more powerful and able to train the company’s ranking models faster.

What’s special about this new AI chip

Compared to Meta Training and Inference Accelerator (MTIA) v1, Meta’s first-generation AI inference accelerator that was officially announced in May last year, the latest version of the chip offers significant performance improvements.

Meta said the new MTIA chip “is fundamentally focused on providing the right balance of compute, memory bandwidth, and memory capacity.”

The chip will have 256MB of on-chip memory running at 1.3GHz, up from 128MB at 800MHz for v1.

Meta’s early tests show that the new chip performs three times better than the first-generation version across the four models the company evaluated.
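As a rough illustration of the generational jump, the reported memory figures can be compared directly. This is simple arithmetic on the numbers above, not a performance model; real-world speedups depend on workload, memory bandwidth, and software.

```python
# Reported on-chip memory specs for Meta's MTIA chips (figures from the article).
v1 = {"sram_mb": 128, "clock_ghz": 0.8}  # MTIA v1: 128MB at 800MHz
v2 = {"sram_mb": 256, "clock_ghz": 1.3}  # next-gen MTIA: 256MB at 1.3GHz

# Ratios of the new chip's specs to v1's.
capacity_ratio = v2["sram_mb"] / v1["sram_mb"]    # 2x the on-chip memory
clock_ratio = v2["clock_ghz"] / v1["clock_ghz"]   # 1.625x the clock speed

print(f"On-chip memory: {capacity_ratio:.2f}x, clock speed: {clock_ratio:.3f}x")
```

These ratios alone don’t explain the claimed 3x result, which Meta attributes to the overall balance of compute, memory bandwidth, and memory capacity rather than any single spec.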

The new MTIA chip is part of a broad custom silicon effort at the company that includes looking at other hardware systems too. In addition to manufacturing chips and hardware, Meta has invested heavily in developing software to best utilise the power of its infrastructure.

Also read: Meta to integrate AI into Ray-Ban smart glasses

The AI race among tech companies

As the AI race intensifies, high-performance AI chips are becoming more and more in demand.

On 18 January this year, Meta’s CEO Mark Zuckerberg made a high-profile announcement that Meta was pursuing artificial general intelligence (AGI) and planned to spend billions on Nvidia and other AI chips, including acquiring roughly 350,000 of Nvidia’s flagship H100 chips this year.

Of course, Meta isn’t the only tech giant turning its attention to homegrown chips.

Just a few days ago, Google announced Axion, a custom CPU based on the Arm architecture that it plans to use to support services such as YouTube ads on Google Cloud, with availability later in 2024.

Microsoft and Amazon have also begun work on custom chips capable of handling AI tasks.

Jennifer Yu

Jennifer Yu is an intern reporter at BTW Media covering artificial intelligence and products. She graduated from The University of Hong Kong. Send tips to j.yu@btw.media.
