Google unveils two new tools designed to fight skin color bias

Google announced a set of initiatives on Wednesday aimed at creating a more equitable product experience for people across the skin-tone spectrum. At its Google I/O conference, the company unveiled a new 10-point skin tone scale, a set of 10 representative human skin tones that people can match to their own or to skin tones shown in photographs. The scale was developed with Ellis Monk, an associate professor of sociology at Harvard University known for his research into skin tone and colorism. In a Google-led study, a diverse set of research participants found the new Monk Skin Tone Scale to be more representative of a greater number of skin tones, and the company is releasing the set of colors and information about its research so that others can use the scale or suggest improvements. Google envisions the scale as a standardized way for people in the tech industry to build and test products across the range of human skin tones, and as a uniform way to discuss which ranges of colors are or are not well served by a particular product. “This really is about creating an industry conversation,” says Tulsee Doshi, head of product for responsible AI at Google.

The new scale will likely be used internally at Google to evaluate search tools, facial detection algorithms, and other automated systems to ensure that they function well across a wide range of human skin tones, she says. Historically, technologies ranging from AI systems (including some of Google’s) to cameras have been faulted for handling images of white people better than images of people of color. As Google pushes to move its tools away from that era, the company will likely annotate images based on which skin tones from the Monk scale they depict, so that it can deliberately train and test AI and other technology on a diverse set of skin colors.
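The core matching step, assigning a sampled color the nearest of the scale's 10 points, can be sketched as a nearest-neighbor lookup. This is a minimal illustration, not Google's method: the hex values below are placeholders rather than the published Monk Skin Tone Scale colors, and a production system would compare colors in a perceptual space such as CIELAB rather than raw RGB.

```python
# Placeholder palette: ten tones from light to dark. These are NOT the
# official Monk Skin Tone Scale hex values.
PLACEHOLDER_TONES = {
    1: "#f6ede4", 2: "#f3e7db", 3: "#f7ead0", 4: "#eadaba", 5: "#d7bd96",
    6: "#a07e56", 7: "#825c43", 8: "#604134", 9: "#3a312a", 10: "#292420",
}

def hex_to_rgb(h):
    """Parse '#rrggbb' into an (r, g, b) tuple of ints."""
    h = h.lstrip("#")
    return tuple(int(h[i:i + 2], 16) for i in (0, 2, 4))

def nearest_tone(sample_hex):
    """Return the scale point whose placeholder color is closest to the
    sample, by squared Euclidean distance in RGB space."""
    sample = hex_to_rgb(sample_hex)
    def squared_distance(item):
        tone_rgb = hex_to_rgb(item[1])
        return sum((a - b) ** 2 for a, b in zip(sample, tone_rgb))
    return min(PLACEHOLDER_TONES.items(), key=squared_distance)[0]

print(nearest_tone("#e8d5c0"))  # → 4 with these placeholder values
```

An annotation pipeline could run this over sampled skin regions of an image and store the resulting scale points as labels for training and evaluation.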
In general, Google is looking to ensure that search results for skin-tone-neutral queries like “cute baby” aren’t biased toward particular colors, Doshi says. “We’re trying to identify when queries are homogeneous, so really only showing a small number of skin tones, and actually improving the diversity of the results,” she says. The company is also rolling out new filters for Google Photos as part of its existing Real Tone system, which is designed to help generate high-quality photos for a wide array of skin tones.
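The two steps Doshi describes, detecting a homogeneous result set and then broadening it, can be sketched in a few lines. This is a hypothetical illustration assuming each result already carries a skin-tone label; it is not Google's ranking logic.

```python
from collections import defaultdict, deque

def is_homogeneous(tone_labels, min_distinct=3):
    """Flag a result set whose images span too few distinct tone buckets.
    The threshold of 3 is an arbitrary choice for illustration."""
    return len(set(tone_labels)) < min_distinct

def diversify(results, tone_of, top_k=10):
    """Round-robin across tone buckets so the top results are not
    dominated by a single bucket; order within a bucket is preserved."""
    buckets = defaultdict(deque)
    for r in results:
        buckets[tone_of(r)].append(r)
    out = []
    while buckets and len(out) < top_k:
        for tone in list(buckets):
            out.append(buckets[tone].popleft())
            if not buckets[tone]:
                del buckets[tone]
            if len(out) == top_k:
                break
    return out

results = [("img1", 2), ("img2", 2), ("img3", 2), ("img4", 5), ("img5", 8)]
print(is_homogeneous([tone for _, tone in results[:3]]))     # True: top 3 all tone 2
print(diversify(results, tone_of=lambda r: r[1], top_k=3))   # one result per bucket
```

A real system would balance diversity against relevance rather than round-robin blindly, but the sketch shows the basic shape of the intervention.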

Having a standardized skin tone scale will also help people within the company, and potentially the industry, quickly communicate about skin color-related issues, Doshi says. Google is also interested in developing ways for publishers to annotate content to indicate which skin tones are present where that’s relevant, similar to how recipe publishers can now add metadata that helps search engines and their users find cooking instructions with certain features, she says.

Google also unveiled a skin tone search refinement tool that it’s adding to certain Google Image Search results, starting with those related to makeup, letting people filter for images that look like them or like someone they may be shopping for. That tool might not exactly reflect the Monk scale: Doshi says the company will continue to refine it based on what people find useful.

In general, the Monk scale will likely be used behind the scenes at Google (and at any other organization that adopts it) rather than as a tool everyday internet and phone users employ to describe their own skin tones. But, Doshi says, she anticipates that using it in product development and testing will improve Google’s products for a wide array of users. “Doing that kind of work is not something that will be user-visible necessarily, but it is something that will just make our products better,” she says.
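The recipe-metadata analogy suggests what publisher annotations might eventually look like. No such schema has been published, so the snippet below is purely hypothetical: it is modeled loosely on recipe structured data, and the property name "tonesDepicted" is invented for illustration and is not part of schema.org.

```python
import json

# Hypothetical annotation a publisher might attach to an image, naming
# the scale points it depicts. "tonesDepicted" is an invented property.
annotation = {
    "@context": "https://schema.org",
    "@type": "ImageObject",
    "contentUrl": "https://example.com/foundation-shade-7.jpg",
    "tonesDepicted": [6, 7],
}
print(json.dumps(annotation, indent=2))
```

If a convention like this existed, search engines could use it the way they use recipe metadata today: to power filters such as the makeup refinement tool without having to infer skin tones from the pixels alone.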

https://www.fastcompany.com/90750575/google-unveils-two-new-tools-designed-to-fight-skin-color-bias

Created 3y ago | 11.05.2022, 18:21:57


