AI Worsening Supply Chain Issues For Data Center Users And Developers

Supply chain challenges have plagued the data center industry for more than four years, but now artificial intelligence is creating a host of new headaches for data center providers and end users as they try to build the capacity needed to meet surging demand. 

Constraints that first struck data centers during the early years of the pandemic have largely worsened. Lead times for key equipment like generators and cooling gear are routinely measured in years, not months, with significant uncertainty as to whether manufacturers will actually hit projected delivery dates.

Data center development timelines have ballooned as a result, forcing end users to project their capacity needs years in advance so they can procure equipment and build out the data center infrastructure to support it.

The high-performance processors and other critical equipment that support AI computing have added new supply pinch points for the industry to contend with. Speaking at Bisnow’s DICE Pacific Northwest event last month at the Hilton Motif Seattle, leaders of some of the largest data center tenants and construction management firms said AI demand is forcing them to fundamentally change how they procure equipment and plan future deployments.

“For people who think generator sets take a long time, try to get an Nvidia GPU,” Sarah Keller, Uber’s head of global technical sourcing and supply chain, said at DICE Northwest. “It's the Wild West in some ways. So many things are not how they used to be.”

For major data center tenants, projecting specific computing, equipment and facility needs years in advance is a critical element of being able to navigate supply chain constraints and ensure information technology deployments and related capital projects happen on schedule. AI is making this much more difficult.

Until recently, most large enterprise tenants and cloud providers used fairly uniform IT equipment across their portfolios and saw few significant changes in the design requirements for the data centers needed to support them. Firms could reliably plan on a predictable cycle of IT gear and facilities upgrades — typically at four- and 10-year intervals — and develop accurate models for predicting how much additional capacity they would need. 

Even as record demand strained firms’ ability to expand their capacity, major tenants had a good grasp of exactly what their computing infrastructure needs were and could take steps to ensure they or their data center partners were able to procure the materials needed to deploy that capacity, Keller said.

Those days are over. 

“The forecast models for AI and typical cloud computing are not even close to being the same,” said Paul Vaccaro, vice president for mission critical at JE Dunn Construction. “That divergence is changing the landscape.”

Uber’s Sarah Keller talks supply chain at Bisnow’s DICE Pacific Northwest, held April 23 at the Hilton Motif Seattle, joined by Oracle’s Joe Graves, JE Dunn Construction’s Paul Vaccaro and Trangistics’ Joey Hougham.

AI has introduced several shifting variables into the logistics and supply chain equation.

Instead of dealing with standardized IT gear, firms must now contend with a rapidly evolving mix of technology, blending older equipment for more traditional workloads with new equipment that can support graphics processing units but also requires fundamentally different supporting infrastructure.

These AI deployments often support specific business use cases that have emerged quickly, making planning timelines less predictable, panelists said. They also require a broader range of equipment to be procured, dramatically increasing the complexity of, and investment in, the systems and strategies firms have developed to navigate supply chain constraints. Simply put, there are more suppliers to build relationships with, more partnerships to forge to gain access to long-lead items on short notice, and more equipment to stockpile ahead of capital projects.

Keller said the shift is forcing players across the industry to rewrite their playbooks for planning and managing procurement to meet their capacity needs.

“We’re learning that our modeling is broken,” Keller said. “So now we have to take all these multiple input points and try to bring it all together to make sure we can commit to capacity for two years.”

Another variable is the expanding geography of the data center map, panelists said. New facilities are increasingly being built outside of major data center markets, a trend driven in part by AI computing’s hunger for power and its less stringent latency requirements.

JE Dunn’s Vaccaro said a shortage of qualified construction labor in these new markets is exacerbating a problem that has long plagued the industry. 

But Joe Graves, Oracle's senior principal for edge, metro and backbone planning, said he sees another shortage emerging as AI decentralizes the data center map: fiber. As data centers flood into secondary markets and rural areas, connectivity build-out isn't keeping up. The result can be just as damaging as the supply constraints delaying construction of the data centers themselves, he said.

“It’s of grave concern,” Graves said. “It's wonderful to have a 100-megawatt facility where you can build a GPU farm, but if you have no connectivity to it, that's just a really expensive warehouse.”