
Nvidia delivered another strong quarterly performance, bringing in $46.74 billion in revenue and $1.05 in adjusted EPS, both exceeding analyst estimates. While total revenue grew 56% year over year, the critical data centre segment (88% of total revenue) generated $41.1 billion, slightly below the $41.3 billion forecast. This marked the second consecutive quarter of data centre revenue falling short of projections, and it sharpened concerns over extreme customer concentration: two unnamed "direct customers" (likely OEMs such as Foxconn or Dell) accounted for 39% of revenue (23% and 16% individually), up from 25% a year ago. Though Nvidia cited steady demand from "neoclouds", enterprises, and sovereign AI initiatives (projecting $20 billion in government-related revenue), heavy reliance on a handful of clients leaves the company exposed to their capex fluctuations. Moreover, China restrictions cost Nvidia between $2 billion and $5 billion in prospective H20 chip sales, though recent U.S. policy shifts may unlock a partial recovery, provided the administration's profit-sharing terms are tolerated.
Nvidia's numbers were far from bad, but they were not groundbreaking either, and they may have failed to allay concerns about overvaluation, particularly around the sustainability of its growth. While the revenue and EPS beats were expected given its explosive rise, the data centre miss and modest Q3 guidance ($54 billion, just above estimates but excluding China revenue) pointed to slowing momentum. Annual growth decelerated to 56% from prior rates above 100%, and hyperscaler capex trends showed early signs of moderation, with HSBC noting "limited room for near-term earnings upside". The initial 3% post-earnings dip in the share price likely reflected market wariness: investors have priced in perfection after a 10x rally over 2.5 years, and Nvidia has yet to demonstrate mass AI infrastructure adoption (the premise behind CEO Jensen Huang’s $3–$4 trillion 2030 forecast). By the company’s own admission, "We have experienced periods where revenue comes from limited customers, and this trend may continue," underscoring its dependence on cloud providers and OEMs even as it diversifies into robotics and sovereign AI.
Nvidia’s current position extends beyond its business model into a high-stakes equilibrium between U.S. regulators, Chinese authorities, and hyperscalers. In the U.S.-China chip game, both sides face a prisoner’s dilemma: the U.S. seeks to restrict China’s AI advancement but risks forgoing $56 billion in future revenue (per analyst projections), while China needs Nvidia’s chips (50% of global AI researchers are Chinese) but demands security concessions. The recent 15% royalty deal on H20 sales represents a Nash equilibrium: suboptimal for Nvidia, but mutually tolerable. Continued regulatory brinkmanship is to be expected, with Nvidia leveraging its near-irreplaceable position (70% of AI data centre costs) to extract concessions, while China incrementally approves "crippled" chips (e.g., Rubin variants with 30–50% less capability). Meanwhile, a coordination game is unfolding among hyperscalers: cloud giants (Microsoft, Amazon, etc.) collectively benefit from Nvidia’s innovation but individually seek pricing leverage. As "neoclouds" and sovereign AI programs fragment demand, Nvidia’s dominant strategy is ecosystem lock-in via CUDA and Rubin’s 2026 rollout, forcing competitors into a prisoner’s dilemma in which defecting to custom chips risks compatibility losses. Short-term volatility is inevitable, but Nvidia’s control of the AI infrastructure "critical path" should preserve its bargaining power until 2026, when Rubin’s scale could reset the equilibrium. Long term, sovereign AI investments ($20 billion) and robotics (69% automotive revenue growth) could diversify its customer base and mitigate concentration risk, but only if Nvidia navigates the U.S.-China standoff without triggering a full decoupling.
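The game-theoretic framing above can be made concrete with a toy 2x2 game. The payoff numbers below are purely illustrative assumptions (nothing in the reporting assigns values); the point is the mechanism: a profile is a pure-strategy Nash equilibrium when neither side can improve its own payoff by unilaterally switching moves, which is how a "suboptimal but mutually tolerable" outcome like the royalty deal can nonetheless be stable.

```python
# Toy sketch of the U.S.-China chip standoff as a 2x2 game.
# Payoff values are hypothetical, chosen only to illustrate the structure:
# each tuple is (U.S. payoff, China payoff).
payoffs = {
    ("restrict", "retaliate"): (1, 1),  # full decoupling: both sides lose
    ("restrict", "tolerate"):  (3, 2),  # royalty deal: suboptimal but tolerable
    ("open",     "retaliate"): (0, 3),  # one-sided concession by the U.S.
    ("open",     "tolerate"):  (2, 4),  # unrestricted trade
}

US_MOVES = ("restrict", "open")
CN_MOVES = ("retaliate", "tolerate")

def is_nash(us: str, cn: str) -> bool:
    """A profile is a pure Nash equilibrium if neither player can raise
    its own payoff by unilaterally deviating to another move."""
    u, c = payoffs[(us, cn)]
    us_best = all(payoffs[(alt, cn)][0] <= u for alt in US_MOVES)
    cn_best = all(payoffs[(us, alt)][1] <= c for alt in CN_MOVES)
    return us_best and cn_best

equilibria = [(us, cn) for us in US_MOVES for cn in CN_MOVES if is_nash(us, cn)]
print(equilibria)  # → [('restrict', 'tolerate')]
```

Under these assumed payoffs, the unique equilibrium is (restrict, tolerate): the U.S. keeps restrictions (softened by the royalty arrangement) and China tolerates them, even though (open, tolerate) would give China more; neither side can do better by deviating alone.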
Sources: Reuters, CNBC, NYTimes, Business Insider
Photos: Unsplash