Google looks to shake up the way the tech industry classifies skin tones
The tech giant is pursuing better ways to test for racial bias in tech products
Google is developing a new alternative to the tech industry’s standard method for classifying skin tones, which some critics say is inadequate for determining if tech products are biased against people of color, according to Reuters.
Many tech companies use a standard six-color scale known as Fitzpatrick Skin Type (FST), which dermatologists have used since the 1970s. Tech companies use it to test whether products, like facial recognition systems or smartwatch heart-rate sensors, perform equally well for various skin tones.
Google has grappled with these issues before. Last year, the search giant announced an app that it said could recognize over 200 skin conditions, but the app drew criticism from dermatologists, who accused the company of a “cavalier attitude” for not publishing a bias and accuracy study.
Google later published an accuracy study in Nature, but the study categorized subjects by ethnicity rather than skin color in its training dataset. As a result, subjects with brown skin were underrepresented in the dataset, and dark brown skin was absent entirely.
As for FST, some tech researchers and even dermatologists are now saying it disregards diversity among people of color. For example, FST has four categories for “white” skin and one apiece for “black” and “brown.”
Last year, the Department of Homeland Security (DHS) recommended organizations stop using it to evaluate facial recognition systems because it doesn’t accurately represent the color range in diverse populations.
Google told Reuters that it’s pursuing better alternatives.
“We are working on alternative, more inclusive, measures that could be useful in the development of our products, and will collaborate with scientific and medical experts, as well as groups working with communities of color,” the company said.
Google recently came under fire from a group of senators who accused it of dealing poorly with racial issues.
Earlier this month, Senate Democrats Cory Booker, Edward Markey, Mark Warner, and Ron Wyden wrote to Alphabet and its subsidiaries Google and YouTube, urging them to commission racial equity audits. The lawmakers warned that the companies are falling behind on racial inclusion both in their workplaces and in the technologies they develop.
Addressing Alphabet and Google CEO Sundar Pichai, YouTube CEO Susan Wojcicki, and Google’s chief marketing and legal officers, the senators’ letter warned that a racial equality audit was “long overdue.”
“We are concerned, after hearing reports about your company and its products, about harmful bias at Alphabet,” the letter said, warning of ethical issues with its use of AI. “Issues with Google search algorithms returning non-diverse image sets for basic searches and the more recent dermatology diagnosis algorithm that was not trained on dark skin tones are troubling.”