• Nvidia has agreed to pay about $20 billion for assets and technology from AI chip startup Groq, marking one of the largest transactions in the AI hardware sector.
• The agreement includes a non-exclusive licence to Groq’s technology and the hiring of key executives, but Groq will continue to operate independently.
What happened: Agreement to secure Groq assets and talent
Nvidia Corporation is set to acquire assets and technology from AI chip startup Groq Inc. for approximately $20 billion in cash, according to a CNBC report, making it one of the largest deals in the company’s history. The arrangement, as described by Groq in a public statement and confirmed by people familiar with the matter, involves Nvidia taking a non-exclusive licence to Groq’s inference chip technology and hiring Groq’s founder and chief executive officer, Jonathan Ross, along with president Sunny Madra and members of the startup’s engineering team.
Groq’s cloud business, known as GroqCloud, is excluded from the transaction and will continue operating independently under the leadership of Simon Edwards, who has been appointed the company’s new chief executive officer. The startup, founded in 2016 by engineers who previously worked on machine learning hardware at Alphabet’s Google, was valued at about $6.9 billion following a September funding round in which it raised $750 million.
Under the terms of the deal, Nvidia gains access to Groq’s inference technology, which is designed to optimise how AI models respond to user requests and has become increasingly important as demand grows for real-time artificial intelligence applications. While Nvidia has not publicly disclosed detailed financials, the $20 billion figure is drawn from the CNBC report and market commentary and would represent a substantial premium over Groq’s most recent valuation.
Why it’s important
Nvidia’s move comes amid intense competition in the semiconductor industry, particularly for hardware that supports artificial intelligence workloads. Nvidia’s graphics processing units (GPUs) have long dominated training and inference in data centres, but startups like Groq have built specialised inference architectures around alternative designs, such as on-chip SRAM, that aim to improve energy efficiency and task-specific performance.
By securing Groq’s technology and engineering talent, Nvidia may broaden its technical capabilities and accelerate integration of high-performance inference processors into its AI platforms. However, because the deal licenses technology rather than acquiring the company outright, Groq remains operational as a separate entity, a structure that may help address potential antitrust concerns in jurisdictions scrutinising concentration in semiconductor markets.
Analysts have noted that the broader trend in the AI industry favours securing specialised hardware and expertise that can complement existing offerings, particularly as demand for inference performance grows alongside training requirements. Firms such as Advanced Micro Devices and Cerebras Systems are also vying for market share in this segment, and Nvidia’s move could be seen as a defensive response to preserve its leadership position.
Nevertheless, questions remain about how the integration of Groq technology—and the absorption of its senior engineers into Nvidia—will affect competition and innovation. Critics argue that large deals in the AI hardware space risk consolidating too much capability within a few dominant players, potentially limiting the diversity of technical approaches and slowing the pace of independent innovation.
The transaction illustrates both the economic scale of strategic AI investments and the complex balance between competition policy, technological leadership and market consolidation in advanced computing.
