Microsoft's VS Code started automatically adding GitHub Copilot as a co-author on git commits without user permission or notification.
Developers are understandably upset. This isn't just annoying: it has real legal and attribution implications.
Here's what happened: VS Code users with Copilot enabled started noticing that their git commit messages included a "Co-authored-by" line crediting GitHub Copilot. They didn't add this. They didn't consent to it. It just appeared.
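For context, a "Co-authored-by" line is just a trailer at the end of a commit message; GitHub parses it to credit additional authors on the commit. Here is a sketch of what an affected commit looks like and how to spot the trailer in recent history, using a throwaway demo repo (the exact attribution string and email address Copilot used are placeholders, not the verbatim text VS Code inserted):

```shell
# Demo in a throwaway repo: create a commit carrying a Copilot-style
# co-author trailer, then list trailers the way you would in a real repo.
cd "$(mktemp -d)" && git init -q
git config user.email dev@example.com && git config user.name dev
echo demo > file.txt && git add file.txt
printf 'Fix parser bug\n\nCo-authored-by: GitHub Copilot <copilot@example.com>\n' \
  | git commit -q -F -
# The check itself: show recent commits with any co-author trailer values.
git log -n 20 --format='%h %s%n%(trailers:key=Co-authored-by,valueonly)'
```

Run the final `git log` command in any real repository to see whether your own recent commits carry the trailer.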
In open source projects and enterprise environments, commit attribution matters. It affects licensing, copyright, liability, and credit for work. Having an AI automatically listed as co-author raises fundamental questions:
Who owns the code? If Copilot is listed as co-author, does that affect the copyright status? Could it be used as evidence that AI "wrote" the code in future copyright disputes?
What about licensing? Many open source licenses require proper attribution. If an AI is listed as co-author, does that create licensing complications? Can an AI even hold copyright or license rights?
What about liability? In enterprise settings, commit history is used to track who wrote what for security audits and liability purposes. If Copilot is listed as co-author, does that muddy those waters?
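If an audit does need to quantify the issue, the trailers are at least machine-readable. A minimal sketch, again in a throwaway repo and assuming the attribution always used the standard Co-authored-by trailer key (the email address is a placeholder):

```shell
# Demo in a throwaway repo: two commits, one with a Copilot trailer,
# then count how many commits across the history attribute Copilot.
cd "$(mktemp -d)" && git init -q
git config user.email dev@example.com && git config user.name dev
echo a > a.txt && git add a.txt && git commit -q -m 'Plain commit'
echo b > b.txt && git add b.txt
printf 'Refactor\n\nCo-authored-by: GitHub Copilot <copilot@example.com>\n' \
  | git commit -q -F -
# The audit itself: count commits whose co-author trailers mention Copilot.
git log --format='%(trailers:key=Co-authored-by,valueonly)' | grep -ci copilot
```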
One developer commented on Reddit: "This could cost people their jobs. If management looks at commits and sees Copilot co-authored everything, they might think the developer isn't actually writing code."
That's not paranoid. In the current tech job market, with companies looking for excuses to cut costs and replace workers with AI, having your commits say "co-authored by AI" could genuinely be used against you.
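For developers affected before the rollback, the trailer can be stripped from the most recent commit as long as it hasn't been pushed; rewriting already-shared history instead requires a tool like git filter-repo and coordination with collaborators. A minimal sketch in a throwaway repo (the attribution string is a placeholder):

```shell
# Demo in a throwaway repo: commit with an unwanted Copilot trailer,
# then amend the commit message to strip it (safe only before pushing).
cd "$(mktemp -d)" && git init -q
git config user.email dev@example.com && git config user.name dev
echo x > f.txt && git add f.txt
printf 'Fix bug\n\nCo-authored-by: GitHub Copilot <copilot@example.com>\n' \
  | git commit -q -F -
# The fix itself: rewrite the last message minus any Co-authored-by lines.
git commit -q --amend -m "$(git log -1 --format=%B | grep -v '^Co-authored-by:')"
git log -1 --format=%B   # subject preserved, trailer gone
```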
Microsoft has since rolled back the change after significant backlash, but the fact that it shipped at all is concerning. This suggests:
1. No one at Microsoft thought through the legal implications, or
2. They did and shipped it anyway, and
3. There wasn't sufficient review of a feature that affects every Copilot user's git history.
GitHub Copilot is a useful tool. I've talked to engineers who swear by it for autocomplete and boilerplate generation. But a useful tool is very different from a co-author. A compiler optimizes my code; we don't list it as co-author. An IDE suggests completions; we don't credit it in commits.
The incident reveals a broader tension in Microsoft's strategy. They want to promote Copilot as an essential development tool that writes code alongside you. But they also want developers to feel like they're still in control and doing the actual work.
You can't have it both ways. Either Copilot is a glorified autocomplete (useful but not a co-author), or it's genuinely writing significant code (in which case the attribution and copyright questions are serious).
Microsoft needs to pick a lane. Shipping features that automatically claim AI co-authorship without user consent isn't it.
