Twitter to make AI algorithm open source to scour for biases

Users highlight racial bias in the way Twitter’s automated image-cropping tool selects portions for preview

Twitter has committed to open-sourcing its image-cropping machine learning algorithm after users identified potential racial bias.

The social media platform has confirmed it has “more analysis to do” on the algorithm that determines which parts of an image are shown in previews, after users posted examples suggesting racial bias in how the tool behaves.

Cryptography and infrastructure engineer Tony Arcieri conducted an experiment on the platform to highlight how Twitter’s automated image-cropping for previews might prefer white faces over black faces.

In the experiment, Arcieri posted various combinations of pictures showing the faces of former US president Barack Obama and senator Mitch McConnell. Regardless of positioning, or other potentially interfering factors such as the colour of each individual’s tie, the algorithm consistently chose to show Mitch McConnell’s face in the cropped preview.

Twitter spokesperson Liz Kelley said, in response to the experiment, that the company’s own testing prior to shipping the model found no evidence of racial or gender bias, though she admitted further analysis was needed.

Kelley added that the firm will open source its machine learning algorithm “so others can review and replicate” the results of Arcieri’s experiment and get to the bottom of the issue.

Concerns about bias in artificial intelligence (AI) systems, and especially machine learning, which can often be seen as a black box, are rife, with many arguing that technology companies haven’t prioritised eradicating discrimination in their systems.


The escalation of Black Lives Matter protests earlier in the year forced a number of tech companies to reflect on the potential intrinsic bias present in many of their systems, especially facial recognition technologies.

A wave of organisations, including IBM and Amazon, rushed to either suspend or discontinue facial recognition systems and their use in law enforcement as a result of the movement, for instance.

The number of newly launched AI-powered systems shown to exhibit racial bias suggests that tech companies, on the whole, have much more work to do in stamping it out. Microsoft’s AI news editor on MSN, for example, wrongly identified a member of Little Mix in a story about band member Jade Thirlwall’s personal reflections on racism.

