The AI arms race just got a formal contract. Meta and Nvidia announced a multi-year, multi-generational strategic partnership on Tuesday, making official what everyone in the industry already suspected: Mark Zuckerberg is going to spend an almost incomprehensible amount of money on Nvidia hardware, and Nvidia is going to help him do it.
Nvidia's stock rose 1.5% in after-hours trading on the news, a relatively muted reaction suggesting Wall Street had already priced the deal in. But the details deserve more attention than the stock move.
What's actually in the deal
According to the Nvidia press release, Meta will deploy millions of Nvidia's Blackwell and Rubin GPUs across hyperscale data centers optimized for both AI training and inference workloads. The deal also involves Nvidia's Grace CPUs — the Arm-based chips Nvidia has been pushing as an alternative to Intel and AMD in AI server configurations — along with Nvidia Spectrum-X Ethernet networking switches integrated into Meta's existing infrastructure.
The headline technology is Nvidia's upcoming Vera Rubin platform, which Zuckerberg specifically called out in his statement: "We're excited to expand our partnership with NVIDIA to build leading-edge clusters using their Vera Rubin platform to deliver personal superintelligence to everyone in the world."
Jensen Huang, Nvidia's founder and CEO, was characteristically effusive: "No one deploys AI at Meta's scale — integrating frontier research with industrial-scale infrastructure to power the world's largest personalization and recommendation systems for billions of users. Through deep co-design across CPUs, GPUs, networking, and software, we are bringing the full NVIDIA platform to Meta's researchers and engineers as they build the foundation for the next AI frontier."
This is also the first large-scale production deployment of Nvidia's Grace CPU architecture, which matters for Nvidia beyond the revenue: it validates the chip in real-world conditions and gives Nvidia an enormous reference customer when pitching other hyperscalers on moving away from Intel and AMD.