In 1987, economist Robert Solow quipped that "you can see the computer age everywhere but in the productivity statistics." Nearly four decades later, a landmark survey of 6,000 executives across the United States, United Kingdom, Germany, and Australia suggests we may be living through the same paradox — this time with artificial intelligence.
The study, reported by Fortune, found that roughly 90% of firms saw zero measurable impact on employment or productivity from AI over the past three years, despite corporations having poured more than $250 billion into AI infrastructure in 2024 alone. Nearly a quarter of respondents are not using AI at all.
Let that sink in for boards and investors: a quarter-trillion dollars deployed, and the macroeconomic needle has not moved.
Among firms that have adopted AI, average employee usage clocks in at a modest 1.5 hours per week. The forward projections are hardly more encouraging: executives surveyed forecast just a 1.4% productivity increase and a 0.8% output gain over the next three years, alongside a 0.7% reduction in employment. These are rounding errors, not the productivity revolution the industry has been selling.
The capital allocation problem
This is not primarily a technology story. It is a capital allocation story — and boards need to start treating it as one.
The mismatch between AI investment and AI returns mirrors, with uncomfortable precision, the original Solow Paradox. When computing investment surged in the 1970s and 1980s, productivity growth actually slowed, dropping from 2.9% annually during the 1948–1973 postwar boom to less than half that rate in the years that followed. The productivity payoff from computing eventually arrived, but it took roughly two decades and required businesses to fundamentally restructure workflows, not merely bolt new tools onto existing processes.
