Three-quarters of resumes submitted to job postings never reach a human recruiter. They're killed by AI screening systems before anyone with actual hiring authority sees them.
That's not a typo. 75%. Rejected by algorithms.
According to recent reporting from Fortune, AI-powered applicant tracking systems have become the default gatekeepers of the American job market. And they're terrible at it.
These systems scan resumes for keywords, parse formatting, and score candidates based on pattern matching against job descriptions. The problem? They're optimizing for the wrong thing. A great candidate who phrases their experience differently than the job posting gets filtered out. Someone who worked at a less-famous company doing identical work gets downranked. Resumes with non-standard formatting get mangled by parsing errors.
I've talked to engineers who got auto-rejected from jobs they were overqualified for because their resume said "JavaScript" and the system was looking for "JS." I've talked to people with a decade of relevant experience who never got past the filter because they didn't have the exact certification listed in the posting.
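The "JavaScript" vs. "JS" failure is what exact keyword matching looks like in practice. Here's a minimal sketch in Python - a toy illustration, not any vendor's actual algorithm - of how a naive filter rejects a qualified candidate over a synonym:

```python
# Toy keyword screener: requires every listed keyword to appear
# verbatim in the resume text. This is the failure mode described
# above, not a real ATS implementation.

def passes_screen(resume_text: str, required_keywords: list[str]) -> bool:
    """Return True only if every required keyword appears verbatim."""
    text = resume_text.lower()
    return all(kw.lower() in text for kw in required_keywords)

resume = "Senior engineer, 10 years of JavaScript and TypeScript."

print(passes_screen(resume, ["JS"]))          # False - synonym miss
print(passes_screen(resume, ["JavaScript"]))  # True - exact match
```

The candidate's qualifications never change between the two calls; only the recruiter's choice of abbreviation does. That's the entire difference between an interview and an auto-rejection.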
The stated goal was to make hiring more efficient and reduce bias. The reality is that these systems have created a new kind of bias: bias toward gaming the system. Job seekers now spend hours running their resumes through AI optimization tools, stuffing them with keywords, and reformatting for parsing compatibility. It's resume SEO, and it has nothing to do with whether you can actually do the job.
The real efficiency gain is for companies, not candidates. Instead of hiring enough recruiters to review applications properly, they can deploy software to auto-reject most applicants and only review the survivors. It's cheaper. It's faster. And it's filtering out qualified people at scale.
What makes this particularly frustrating is that better approaches exist. Companies could use AI to assist human recruiters rather than replace their judgment entirely. They could test their screening systems against known outcomes to see if they're actually finding good candidates. They could provide feedback when someone gets filtered out.
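Testing a screen against known outcomes doesn't require anything exotic. A toy sketch - with entirely made-up data, just to show the shape of the check - of backtesting a filter on past applicants whose hiring outcomes are already known:

```python
# Hypothetical backtest: run the screen over past applicants and
# measure how many people who turned out to be good hires the
# filter would have rejected. The data below is invented for
# illustration only.

past_applicants = [
    # (passed the automated screen?, turned out to be a good hire?)
    (True,  True),
    (False, True),   # good candidate the filter rejected
    (True,  False),
    (False, True),   # another false negative
    (False, False),
]

rejected_good = sum(1 for screened, good in past_applicants
                    if good and not screened)
total_good = sum(1 for _, good in past_applicants if good)

false_negative_rate = rejected_good / total_good
print(f"Qualified candidates filtered out: {false_negative_rate:.0%}")
```

A company that ran even this crude a check would know whether its screen is finding talent or discarding it. Most apparently don't.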
But that would require treating hiring as something more important than a cost center to be optimized with software.
The labor market has an AI problem, and workers are paying the price. Until companies start caring whether their screening systems actually work - not just whether they're cheap - expect more of the same: qualified people getting algorithmically rejected while companies complain they can't find talent.
