Nvidia’s strong Q3 earnings could quell AI-bubble anxiety

Nvidia reported earnings after bearing the brunt of a month-long AI panic and came away with a $57 billion quarter that made the whole sell-off look like a case of jitters. The data center business has become a $51 billion engine on its own, the kind of gravitational force that is reorganizing the rest of the industry around it.
It didn’t take long for the market to recalibrate: shares rose as much as 6.4% after hours. The print is a reminder that AI’s biggest constraint remains supply, not sentiment, and traders reacted as they usually do when a thesis survives contact with reality. Wedbush’s Dan Ives called this a “champagne” moment for a reason; the screens tilted green once the numbers arrived.
Wall Street was braced for something positive, but not necessarily something this loud. Analysts were expecting about $55.4 billion in revenue and $1.26 in earnings per share. Instead, Nvidia cleared both bars without breaking stride. Revenue reached $57 billion (up 62% from last year and 22% from the previous quarter), earnings came in at $1.30 per share, and the growth rate brought the whole conversation back toward scale rather than saturation. Even the margins – 73.4% on a GAAP basis – held firm, a quiet but critical signal for a company scaling this quickly.
But perhaps the most important sign for Wall Street came from the outlook. Nvidia told investors it expected revenue of about $65 billion next quarter, a figure that far exceeded the Street’s consensus of $61.7 billion and landed as a statement. If demand were cooling, Nvidia would not be guiding into the quarter with targets of this size or gross margin expectations approaching 75% on a non-GAAP basis.
The pace and guidance have effectively set a new benchmark for what “normal” demand looks like in this phase of the cycle. And the rest of the print told the same story.
Demand for Nvidia’s Blackwell series chips has outrun available supply for months, and early commentary on the call suggests the pace hasn’t slowed. Cloud providers have reserved compute and networking capacity to get ahead of Rubin’s ramp next year, turning the upgrade path itself into a catalyst. Nvidia CEO Jensen Huang said in the earnings release that “Blackwell sales are off the charts and cloud GPUs are sold out.”
Compute and networking within the data center segment became a $51 billion mainstay, with networking alone growing more than 160% year-over-year as hyperscalers continued to assemble larger and larger AI clusters. Inventories have reached $19.8 billion because Nvidia is locking in supply as quickly as manufacturers can produce it. Even cloud services deals – in which Nvidia effectively leases its hardware back through partners – have doubled to $26 billion, a sign that customers are willing to lock in capacity before the next wave of model training clogs the queue. For all the talk of AI cooling, hyperscaler spending was moving in the opposite direction.
As Nvidia’s results approached, some investors were nervous. Big holders such as SoftBank and Peter Thiel’s fund trimmed their stakes earlier this month, and some analysts framed Nvidia’s results as a key test of whether the AI boom still had legs.
Wedbush called it a “mind-blowing” guide that resets investor sentiment; the firm’s early readouts of the call flagged a bullish tone around Blackwell demand and the Rubin ramp, perhaps enough to calm the AI bears who spent November arguing that the upgrade cycle was already faltering.
China remains in the background, a glaring absence. Nvidia still can’t ship its high-end parts to the country, and the quarter reflects none of that demand. Any thaw in trade negotiations next year could reopen a significant revenue stream, which means Nvidia is guiding toward $65 billion without one of its historically important markets in the mix.
The physical limits of growth have drawn just as much attention. Power shortages, land constraints and grid bottlenecks are recurring concerns for hyperscalers, and analysts have pointed to these chokepoints as the next hard governor of AI spending. Nvidia hasn’t dismissed the problem, but the scale of the quarter showed what happens when customers keep building anyway.
Inventories have grown because the company locks in supply months in advance, and long-term cloud commitments have doubled because customers want guaranteed access before the next wave of model training arrives. The whole setup read like a market racing against infrastructure rather than demand.
