Apple Card Algorithm Accused of Gender Discrimination

Users of Apple’s new credit card are accusing the company of using lending algorithms that discriminate against women. Financial regulators are now investigating.

Software developer David Heinemeier Hansson made his complaints public on Twitter on Nov. 7, saying Apple offered him a credit line 20 times higher than the one it offered his wife. Hansson said the two file joint tax returns and that his credit score is lower than hers.

“The @AppleCard is such a f—— sexist program,” he tweeted. “My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does. No appeals work.”

Hansson also said that even when his wife pays off her balance in full, the card will not let her spend until the next billing period.

“Women apparently aren’t good credit risks even when they pay off the f—— balance in advance and in full,” he said.

Hansson said that although customer service was responsive, representatives were unable to give his wife answers. Apple eventually agreed to raise his wife’s credit limit but maintained that it could not change what the algorithm ultimately decides.

He also said this algorithm — and the process that went into creating it — is largely a black box, and customer service representatives were unable to explain how it worked.

He also noted that while every employee he spoke to defended the algorithm, none acknowledged that the algorithm itself could be flawed or biased.

Hansson’s complaints went viral, with others chiming in to recount similar experiences. Apple co-founder Steve Wozniak said he had a similar experience: he was offered 10 times the credit limit his wife was offered.

Black box algorithms, like the one used for the Apple Card, are indeed capable of discrimination. They may not require human intelligence to operate, but they are created by humans. Algorithms are often assumed to be objective because they are automated, but the evidence shows they are not. In an interview with Bloomberg, Wozniak said the lack of transparency about what goes into these algorithms is a problem.

“Algos obviously have flaws,” Wozniak said. “A huge number of people would say, ‘We love our technology but we are no longer in control.’ I think that’s the case.”

The New York State Department of Financial Services (DFS) said it would investigate the card, which is issued by Goldman Sachs. An algorithm that discriminates, whether intentionally or not, is illegal, the DFS said. Goldman Sachs has denied that its algorithm could discriminate based on gender, but the Apple Card’s would not be the first algorithm to face accusations of discrimination.

The tech world has known about algorithms’ flaws for years. In 2015, The New York Times reported on a Harvard University study that found ads for arrest records were significantly more likely to appear in searches for distinctively black names. The Federal Trade Commission has reported that algorithms allow advertisers to target people in low-income neighborhoods with high-interest loans.

On Twitter, many responded to Hansson’s complaints by saying discrimination was simply the way of the financial world. Hansson countered that a tech giant upholding that norm still amounts to discrimination.

“Did the iPhone launch pledging to please carriers and the status quo as its modus operandi? No,” he said.
