Apple co-founder Wozniak echoes Apple Card sexism allegations


An algorithm used by Goldman Sachs to determine credit limits for Apple Card customers will be probed by US regulators over alleged discrimination against women.

Apple co-founder Steve Wozniak has supported claims made by David Heinemeier Hansson on Twitter, which ultimately went viral, alleging that his wife's Apple Card credit limit was 20 times lower than his own, despite her having a better credit score.


Hansson's scathing Twitter tirade spanned multiple threads, in which he added that even when his wife paid off her "ridiculously low" limit in full and in advance, the card wouldn't permit her to spend on it until the next billing period.

Wozniak, who helped found Apple alongside Steve Jobs in 1976, backed up Hansson's claims, replying to his original tweet with a similar story: his own credit limit was ten times higher than his wife's.


IT Pro contacted Goldman Sachs but had not received a reply at the time of publication.

An Apple spokesperson confirmed to IT Pro that the company has little input in the running of the card, saying "all credit decisioning and operational management of the card is done by the bank".

When Hansson raised the issue with Apple's customer service, he says representatives couldn't explain why the limits were different, telling him "it's just the algorithm". Following further discussions, his wife's limit was raised to match his own.

"The department will be conducting an investigation to determine whether New York law was violated and ensure all consumers are treated equally regardless of sex," said the New York Department of Financial Services in a statement to Bloomberg.

"Any algorithm that intentionally or not results in discriminatory treatment of women or any other protected class violates New York law."

The Apple Card is Goldman Sachs' first credit card and accompanies its push into more consumer-focused products such as personal loans and savings accounts.

"Our credit decisions are based on a customer's creditworthiness and not on factors like gender, race, age, sexual orientation or any other basis prohibited by law," said the investment bank to Bloomberg.

It's currently unclear whether the algorithm used to determine credit limits was developed by Apple or Goldman Sachs. Whichever company developed it, the algorithm is unlikely to be used with credit cards from other providers. It's also unclear whether Apple knowingly approved the algorithm for use on its product.

Calls to regulate the use of AI and algorithms have been heard for years, and this isn't the first time automated decision-making tools have been accused of discrimination.

A history of bias

Algorithms are used prevalently in the recruitment industry to quickly filter out applicants, sparing recruiters the lengthy process of manually analysing CVs and covering letters. Such technology has been widely scrutinised, particularly in the UK, where specific cases have drawn complaints.

In March 2019, the UK government launched an investigation into possible bias in algorithms. One example cited was the Harm Assessment Risk Tool used in Durham to determine the likelihood of criminals reoffending, which was found to unfairly target individuals based on their income.

Candidates who make it through to the interview stage may also be met with a camera that analyses their facial expressions and voice to determine whether they would be right for the job, an implementation that has been condemned by privacy advocates.

The Home Office is also the subject of a legal challenge over its "secretive" algorithm used to make important immigration decisions. The software is said to judge applicants unfairly, relying on simple criteria such as age or nationality where a human review might weigh an applicant's merits more fairly.

Connor Jones
News and Analysis Editor

Connor Jones has been at the forefront of global cyber security news coverage for the past few years, breaking developments on major stories such as LockBit’s ransomware attack on Royal Mail International, and many others. He has also made sporadic appearances on the ITPro Podcast discussing topics from home desk setups all the way to hacking systems using prosthetic limbs. He has a master’s degree in Magazine Journalism from the University of Sheffield, and has previously written for the likes of Red Bull Esports and UNILAD tech during his career that started in 2015.