Tinder has agreed to pay $60.5 million to settle a lawsuit alleging the dating app charged users different prices based on age, with older users paying significantly more for premium features. The settlement is one of the largest payouts in a price discrimination case, and it sets a precedent that could force other apps to open their algorithmic pricing practices to scrutiny.
Let me tell you why this matters: this is algorithmic discrimination in its purest form. Tinder used an AI system to analyze your profile - your age, your behavior, your engagement patterns - and then decided how much to charge you. Older users routinely paid 2-3x more for Tinder Plus and Tinder Gold than younger users for the exact same features.
Tinder's defense was essentially "everyone does dynamic pricing." Which is true! Airlines charge different prices based on demand. Hotels adjust rates seasonally. Amazon experiments with personalized pricing. But here's the critical distinction: when you price discriminate based on protected characteristics like age, that's illegal under California law and many other jurisdictions.
The class action lawsuit argued that Tinder violated the Unruh Civil Rights Act, which prohibits businesses from discriminating based on age, race, gender, and other protected categories. Tinder argued that older users "valued the service more" and were "willing to pay more," therefore charging them more was just good business.
That argument is economically rational and legally absurd. Yes, older users might have more disposable income. Yes, they might be more serious about dating and less price-sensitive. But "this demographic will pay more" isn't a defense when the demographic is defined by a protected characteristic.
Here's what's particularly insidious about algorithmic pricing: it obscures the discrimination. If Tinder had published a price list that openly charged older users more, everyone would recognize it as age discrimination. But when an algorithm makes thousands of micro-decisions based on dozens of factors, the discrimination is much harder to spot.
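To see how this works in practice, consider a toy pricing function (entirely hypothetical, not Tinder's actual code or pricing): age is just one feature buried among several behavioral signals, so no single line announces an "age surcharge," yet the output systematically charges older users more.

```python
# Hypothetical sketch of multi-factor algorithmic pricing.
# All feature names and weights are invented for illustration.

def quoted_price(user: dict) -> float:
    """Return a monthly price built from many small adjustments."""
    price = 9.99  # base price
    # Each factor nudges the price a little; no single line says
    # "charge older users more"...
    price += 0.10 * user["sessions_per_week"]
    price += 2.00 if user["completed_profile"] else 0.00
    price += 0.25 * (user["age"] - 18)  # ...but this term quietly does it.
    return round(price, 2)

# Two users identical in every respect except age:
younger = {"age": 25, "sessions_per_week": 10, "completed_profile": True}
older   = {"age": 45, "sessions_per_week": 10, "completed_profile": True}

print(quoted_price(younger))  # 14.74
print(quoted_price(older))    # 19.74
```

Auditing a single quoted price tells you nothing; only by comparing otherwise-identical users across the age variable does the discriminatory term become visible, which is exactly why plaintiffs in these cases rely on systematic price comparisons rather than individual receipts.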
