Users of Apple’s new credit card are accusing the company of using lending algorithms that discriminate against women. Financial regulators are now investigating.
Software developer David Heinemeier Hansson made his complaints public on Twitter on Nov. 7, saying the credit line Apple offered him was 20 times higher than what it offered his wife. Hansson said the two file joint tax returns and that his credit score is lower than hers.
“The @AppleCard is such a f—— sexist program,” he tweeted. “My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does. No appeals work.”
Hansson also said that even when his wife pays off her balance in full, the card will not let her spend again until the next billing period.
“Women apparently aren’t good credit risks even when they pay off the f—— balance in advance and in full,” he said.
Hansson said that though customer service was responsive, representatives were unable to offer his wife answers. Apple eventually agreed to raise his wife’s credit limit but maintained that it could not change what the algorithm ultimately decides.
He also said this algorithm — and the process that went into creating it — is largely a black box, and customer service representatives were unable to explain how it worked.
So let’s recap here: Apple offers a credit card that bases its credit assessment on a black-box algorithm that 6 different reps across Apple and GS have no visibility into. Even several layers of management. An internal investigation. IT’S JUST THE ALGORITHM!
— DHH (@dhh) November 8, 2019
He pointed out that while all of the employees he spoke to defended the algorithm, none would acknowledge that the algorithm itself could be flawed and biased.
So nobody understands THE ALGORITHM. Nobody has the power to examine or check THE ALGORITHM. Yet everyone we’ve talked to from both Apple and GS are SO SURE that THE ALGORITHM isn’t biased and discriminating in any way. That’s some grade-A management of cognitive dissonance.
— DHH (@dhh) November 8, 2019
Apple has handed the customer experience and their reputation as an inclusive organization over to a biased, sexist algorithm it does not understand, cannot reason with, and is unable to control. When a trillion-dollar company simply accepts the algorithmic overlord like this…
— DHH (@dhh) November 8, 2019
What’s even worse is how complete and unquestioned the faith of these Apple reps were in the wisdom of THE ALGORITHM. To the point of essentially credit shaming my wife, assuming her score must have been lower than mine, and roping us into a TransUnion shakedown to check.
— DHH (@dhh) November 8, 2019
Hansson’s complaints went viral, with others chiming in to recount similar experiences. Apple co-founder Steve Wozniak said he had a similar experience: he was offered 10 times the credit limit his wife was offered.
The same thing happened to us. I got 10x the credit limit. We have no separate bank or credit card accounts or any separate assets. Hard to get to a human for a correction though. It's big tech in 2019.
— Steve Wozniak (@stevewoz) November 10, 2019
Black box algorithms, like the one Apple is using, are indeed capable of discrimination. They may not require human intelligence to operate, but they are created by humans and trained on human-generated data. Because they are automated, algorithms are often assumed to be objective, but evidence shows they can inherit the biases of the people and data behind them. In an interview with Bloomberg, Wozniak said the lack of transparency about what goes into these algorithms is problematic.
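To see how an automated system can skew by gender without ever being told an applicant’s gender, consider a purely illustrative toy scorer (this is not Apple’s or Goldman Sachs’s actual model; the feature names and weights are invented for this sketch). If the formula rewards a feature that correlates with gender in practice, such as years of credit held in one’s own name, spouses with identical shared finances can still receive very different limits:

```python
def credit_limit(applicant):
    """Toy limit formula for illustration only.

    'years_individual_credit' is a hypothetical input: a spouse whose
    household accounts are held under the partner's name scores lower
    on it even when the couple's actual finances are identical.
    """
    base = applicant["income"] * 0.2
    history_bonus = applicant["years_individual_credit"] * 500
    return base + history_bonus

# Same household, same income, same joint finances...
spouse_a = {"income": 100_000, "years_individual_credit": 20}
spouse_b = {"income": 100_000, "years_individual_credit": 2}

print(credit_limit(spouse_a))  # 30000.0
print(credit_limit(spouse_b))  # 21000.0
```

The scorer never sees gender, yet the outcome differs, which is why removing a protected attribute from a model’s inputs does not by itself prevent discriminatory results.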
“Algos obviously have flaws,” Wozniak said. “A huge number of people would say, ‘We love our technology but we are no longer in control.’ I think that’s the case.”
The New York State Department of Financial Services (DFS) said it would be investigating the card, which is issued by Goldman Sachs. An algorithm that is discriminatory — whether intentionally or not — is illegal, the DFS said. Goldman Sachs has denied its algorithm could be discriminating based on gender, but the Apple Card’s algorithm would not be the first to have questions of discrimination raised against it.
The tech world has known about algorithms’ flaws for years. In 2015, the New York Times reported on a Harvard University study that found ads for arrest records were significantly more likely to show up on searches for distinctively black names. The Federal Trade Commission reported algorithms allow advertisers to target people who live in low-income neighborhoods with high-interest loans.
On Twitter, in response to Hansson’s complaints, many simply said discrimination was the way of the financial world. Hansson countered that a tech giant upholding that norm still amounts to discrimination.
“Did the iPhone launch pledging to please carriers and the status quo as its modus operandi? No,” he said.