- Ciena’s global survey finds that data center operators expect DCI bandwidth demand to grow sixfold over the next five years.
- Respondents estimate that 43% of new data center facilities will be dedicated to AI workloads, reshaping the networking landscape.
What happened: AI workloads reshaping data center infrastructure
A recent global survey commissioned by Ciena (NYSE: CIEN) has revealed that the rapid expansion of artificial intelligence (AI) workloads is driving a significant transformation in data center infrastructure. The survey, conducted in partnership with Censuswide, gathered insights from over 1,300 data center decision-makers across 13 countries, who collectively anticipate a dramatic surge in data center interconnect (DCI) bandwidth demand over the next five years.
The study found that more than half (53%) of respondents expect AI to be the largest contributor to data center network strain over the next two to three years, surpassing traditional drivers such as cloud computing (51%) and big data analytics (44%). To accommodate this shift, respondents estimate that 43% of newly built data center facilities will be dedicated solely to AI workloads. This shift underscores the need for enhanced networking solutions to sustain AI model training and inference, both of which require substantial data movement.
Additionally, the survey indicated that 87% of respondents foresee needing at least 800 Gb/s per wavelength to support AI-driven traffic. Network traffic has historically grown at 20-30% per year, a rate now expected to accelerate significantly as AI adoption expands. Jürgen Hatheier, Ciena’s International Chief Technology Officer, emphasized that AI is reshaping network architecture and necessitating a more sustainable, scalable approach to connectivity.
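For context, a sixfold increase over five years implies a compound annual growth rate of roughly 43%, well above the 20-30% historical range cited above. The snippet below is a back-of-the-envelope illustration of that arithmetic, not a figure from Ciena’s survey.

```python
# Back-of-the-envelope check: what annual growth rate does a sixfold
# increase over five years imply? (Illustrative only; not survey data.)
growth_multiple = 6   # projected DCI bandwidth growth over the period
years = 5             # survey horizon

implied_cagr = growth_multiple ** (1 / years) - 1
print(f"Implied compound annual growth rate: {implied_cagr:.1%}")
# ~43.1% per year, versus the 20-30% historical range cited in the survey
```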
The findings also highlighted a growing trend towards adopting pluggable optics to optimize bandwidth consumption while mitigating power and space constraints. A staggering 98% of data center professionals acknowledged the importance of pluggable optics in reducing energy usage and enhancing the efficiency of network infrastructure.
Also read: Cerebras expands AI data centers
Also read: Nvidia and Cisco partner to revolutionize networking with AI
Why it’s important
The anticipated surge in AI workloads presents significant implications for global data center infrastructure and networking capabilities. With a projected sixfold increase in DCI bandwidth demand, data centers will need to rapidly scale their interconnectivity solutions to sustain AI’s evolving computational needs.
One of the primary concerns highlighted in Ciena’s survey is the shift towards more distributed AI training. Around 81% of respondents expect Large Language Models (LLMs) to be trained across multiple data centers rather than in single, centralized facilities. This necessitates robust inter-data center connectivity and increases reliance on high-capacity offerings such as Managed Optical Fiber Networks (MOFN), which 67% of respondents favored over traditional dark fiber solutions.
Additionally, data center operators are prioritizing AI resource utilization, latency reduction, and data sovereignty when determining where to deploy AI inference computing. These factors reinforce the demand for scalable and efficient interconnectivity solutions, ensuring that AI applications can function seamlessly across multiple geographies.
The growing reliance on pluggable optics further highlights the industry’s focus on sustainability. Given that AI workloads consume vast amounts of power, optimizing energy efficiency in data center interconnects will be crucial to mitigating environmental impact. Industry leaders will need to rethink infrastructure strategies to balance the rapid expansion of AI with their sustainability goals.