While everyone's obsessing over Nvidia, there's another chip story happening that could be just as big - and most investors are completely missing it.
Broadcom just reported earnings that should make you rethink how the AI chip race actually works. The headline number: $100 billion in AI chip revenue by 2027. That's not total revenue. That's just AI chips. And it's not GPUs - it's something most people haven't even heard of.
Let me break down what's actually happening here, because the Wall Street spin is missing the forest for the trees.
The Custom Silicon Story Nobody's Talking About
Broadcom reported $19.3 billion in revenue for Q1 fiscal 2026, with $8.4 billion from AI semiconductors - up 106% year-over-year. That's impressive, but here's the part that matters: almost all of that AI revenue is custom chips, not off-the-shelf GPUs like Nvidia sells.
What does that mean in plain English? Companies like Google, Meta, and Anthropic are designing their own chips for their specific AI workloads, and Broadcom is building them. Google's TPU (Tensor Processing Unit)? That's Broadcom. Meta's MTIA accelerator? Broadcom. Anthropic's massive compute infrastructure? Also Broadcom.
This is a completely different business model than Nvidia. Nvidia sells you a general-purpose GPU that can run any AI workload. Broadcom helps you design a chip that runs your specific workload better and cheaper than a generic GPU ever could.
The Numbers That Should Get Your Attention
CEO Hock Tan said on the earnings call that Anthropic alone is projected to use 1 gigawatt of compute in 2026, ramping to 3 gigawatts the following year. That's a 3x increase in a single year for one customer.
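To put those gigawatt figures in rough perspective, here's a back-of-envelope sketch of what that power footprint could mean in accelerator counts. The ~1 kW per accelerator figure (chip plus cooling and overhead) is an illustrative assumption, not something from Broadcom's call:

```python
GW = 1_000_000_000  # watts per gigawatt

def accelerators(gigawatts, watts_per_accelerator=1_000):
    """Rough accelerator count for a given power draw, assuming ~1 kW each."""
    return int(gigawatts * GW / watts_per_accelerator)

compute_2026_gw = 1                              # stated on the earnings call
compute_2027_gw = compute_2026_gw * 3            # the "3x increase in a single year"

print(accelerators(compute_2026_gw))  # ~1,000,000 accelerators at 1 GW
print(accelerators(compute_2027_gw))  # ~3,000,000 accelerators at 3 GW
```

Even if the per-chip wattage assumption is off by 2x in either direction, the takeaway holds: a single customer's ramp implies accelerator deployments in the millions.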
