
Nvidia's Record Data Center Sales Don't Calm Wall Street's AI Anxiety

Nvidia reported record data center revenue, but the numbers weren't enough to alleviate Wall Street’s lingering concerns about the company as the chipmaker's shares slid Thursday.

Nvidia CEO Jensen Huang at a conference in 2016

The chip giant reported $115.2B in data center revenue for last year, a 142% increase from the prior year, with the segment accounting for the bulk of the firm’s overall earnings. Its fourth-quarter data center revenue was $35.6B, up 16% from the previous quarter and up 93% year-over-year. 

For the company as a whole, Nvidia’s full-year revenue totaled $130.5B, up 114%. Its Q4 revenue rose 78% year-over-year to $39.3B.  

Despite the record revenue and profit numbers, which exceeded analysts’ expectations, Nvidia’s growth slowed for the fourth consecutive quarter, a trend Nvidia’s own forecast predicts will continue through at least the end of this quarter. This stands in stark contrast to the firm’s growth outlook just one year ago, when Nvidia posted its third consecutive quarter of revenue growth north of 200%. 

Shares of Nvidia fell sharply Thursday, down more than 7% as of 3:35 p.m. ET in a sell-off some analysts attributed to the firm’s comparatively sluggish growth. 

“Nvidia's earnings were good but not like the blockbuster earnings that they've been delivering for a while,” Certuity Chief Investment Officer Scott Welch said, according to Reuters.

The chipmaker's numbers also came amid lingering concerns over long-term demand for artificial intelligence computing that emerged last month in the wake of an AI model released by Chinese firm DeepSeek.

The model's low price tag and training on fewer, more primitive chips raised the possibility that the world’s largest tech companies will require far fewer chips than previously anticipated to develop the more advanced AI models that will lead to commercially viable products. Nvidia’s share price fell close to 17% the day after DeepSeek’s announcement.

In the weeks since, leaders throughout the tech sector have countered that cheaper, more efficient AI training will speed up AI adoption and drive a net increase in demand for AI computing. They say that demand for computing power will simply shift away from AI training and toward inference, the computing through which end users interact with an AI model.

Nvidia CEO Jensen Huang leaned into that thesis on the company's earnings call Wednesday.

He called DeepSeek “an excellent innovation” that still requires 100 times more compute power than previous generations of AI models, highlighting the growing role inference plays in Nvidia’s demand pipeline. He also emphasized that the firm’s newest Blackwell line of AI processors is designed specifically with inference computing in mind.  

“The vast majority of our compute today is actually inference, and Blackwell takes all of that to a new level,” Huang said on the call. 

Still, not all investors are convinced, with analysts pointing to the potential for efficiency gains in inference computing to hurt Nvidia’s long-term demand outlook, as well as what some say is the firm’s growing vulnerability to lower-cost chipmakers.

“While we think the data center capex growth for the training market will continue to benefit NVDA, we believe the lower computing power requirement for inference, while not evident today, will undoubtedly have a negative impact to NVDA’s financial performance,” Kinngai Chan, an analyst at Summit Insights Group, wrote in a note to investors Thursday.