- Broadcom will develop custom AI chips for Google under long-term agreement
- Google advances in-house compute capabilities to compete in generative AI
What happened
Deal extends TPU collaboration to support AI infrastructure scaling
Broadcom has signed a long-term agreement with Google to develop custom artificial intelligence (AI) chips, reinforcing a partnership built around application-specific integrated circuits (ASICs). According to the announcement, the deal deepens Broadcom’s role in designing silicon tailored to Google’s data centre and AI workloads.
The collaboration builds on Broadcom’s involvement in multiple generations of Google’s Tensor Processing Units (TPUs), which are optimised for machine learning tasks. By extending this relationship, Google aims to further customise its hardware stack, improving efficiency, performance and cost control as demand for generative AI compute continues to rise.
The agreement also reflects a broader trend among hyperscale cloud providers to invest in proprietary chip design, reducing dependence on external suppliers and aligning hardware more closely with software ecosystems.
Why it’s important
The partnership signals a decisive shift towards vertically integrated AI infrastructure, where control over silicon becomes a competitive advantage. By working closely with Broadcom, Google can accelerate its custom chip roadmap and better support large-scale generative AI models.
For Broadcom, the deal secures long-term exposure to hyperscale demand and positions the company at the centre of AI-driven data centre expansion. As more cloud providers pursue bespoke silicon, the competitive landscape may tilt towards those with tightly coupled hardware and software ecosystems.
Also read: Nvidia invests $2bn in Marvell to power AI chips
Also read: South Korea’s Rebellions raises $400m to scale AI chip ambitions
Also read: Alibaba advances AI chip strategy as agent race intensifies