Ask Silicon Valley developers what they believe about human rights and digital ethics, and you'll get one answer. Ask what they'd actually do when their boss tells them to build something that restricts those rights, and you'll get a very different answer.
A new peer-reviewed study published in Information, Communication & Society reveals that 74% of developers would implement features restricting human rights if pressured by corporate demands, even when they personally disagree with those features.
This isn't about a few bad actors. It's about a systematic gap between what engineers believe and what they'll actually build when the pressure is on.
The research focused on developers in Silicon Valley and examined their willingness to create what the study calls a "slop economy" — low-quality AI content and features that prioritize corporate interests over user welfare and information quality. Think engagement algorithms designed to maximize screen time regardless of mental health impact, or AI systems that generate plausible-sounding misinformation at scale.
The findings suggest that corporate incentives — meeting deadlines, hitting targets, keeping your job — routinely override personal ethical beliefs. Developers might privately think a feature is harmful, but when it's in the sprint backlog and the product manager is asking for an ETA, most will build it anyway.
This creates what researchers describe as a "quality gap" in information systems. The features that get built aren't necessarily the ones that serve users best; they're the ones that serve corporate metrics best. And because engineers are the gatekeepers of what's technically possible, their willingness to implement ethically questionable features directly shapes what technology becomes.
The study argues this isn't primarily a problem of individual moral failure. It's a structural problem. When your job security, career advancement, and team relationships all depend on delivering what's asked of you, the cost of saying "no" on ethical grounds is high — especially in an industry where layoffs are common and competition for roles is fierce.
There's also the question of incremental compromise. It's rare for anyone to be asked to build something obviously evil. More often, it's a series of small decisions, each of which seems reasonable in isolation:


