In an internal memo, Sam Altman told OpenAI employees that the military's operational decisions using its technology are "up to the government, not the company." This marks a significant shift from the organization's earlier stance on military applications—and raises uncomfortable questions about corporate responsibility in the age of AI.
According to CNBC's reporting, Altman's message to staff effectively washes OpenAI's hands of how the military uses its models. The memo comes as the company deepens its relationships with defense contractors and government agencies.
Let's track the evolution here. OpenAI started with lofty mission statements about beneficial AI and safety. It had policies against military use. Then those policies were quietly updated. Now we're at "not our problem how they use it."
From a business perspective, I understand the logic. The U.S. government is a massive customer with deep pockets and long-term contracts. Defense applications of AI are inevitable—if OpenAI doesn't sell to the military, someone else will. Probably Anthropic, or Google, or whoever else is willing to take the money.
But here's the thing about abdicating responsibility: you can't claim to be building safe, beneficial AI while simultaneously saying you have no control over how it's used. Those positions are fundamentally incompatible.
The "operational decisions are up to the government" framing is clever. It sounds reasonable. The military does make its own operational decisions—that's how civilian-military relations work in democracies. But it's also a dodge. If you build the weapon, you're responsible for what the weapon does. That's been true for arms manufacturers for centuries.
What makes this particularly thorny is that AI isn't a gun or a bomb. It's a general-purpose technology that can be used for logistics, intelligence analysis, autonomous weapons, propaganda, cyber operations—the list goes on. Some of those applications are defensible. Some are terrifying. And once you hand over the model, you lose the ability to control which is which.
