
Google drops gender labels from image recognition to reduce bias
Feb 20, 2020


Every individual will now be classified as a "person" in Google Cloud’s Vision API

Google will no longer identify people by gender in its image recognition AI, removing labels such as “man” and “woman” from photos of people. Instead, every individual will be classified simply as a “person,” according to a company email seen by Business Insider.

The changes apply to Google Cloud’s Vision API, which developers use to attach labels to images and classify them into predefined categories.
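In practical terms, the change means the API stops returning gendered person labels. The sketch below is purely illustrative (the function name, label strings, and mapping are assumptions for demonstration, not Google's actual server-side logic or the API's exact output); it shows the effect the email describes, with both gendered labels collapsing into the neutral “person”:

```python
# Hypothetical sketch of the label change described above: gendered
# person labels are no longer returned; both map to "person".
# Label strings here are illustrative, not the Vision API's exact output.
GENDERED_LABELS = {"man", "woman"}

def neutralize_labels(labels):
    """Replace gendered person labels with the neutral 'person'."""
    return ["person" if label.lower() in GENDERED_LABELS else label
            for label in labels]

# Before the change, a photo might have been labeled ["Woman", "Smile",
# "Kitchen"]; afterwards the same photo yields a neutral person label.
print(neutralize_labels(["Woman", "Smile", "Kitchen"]))
# ['person', 'Smile', 'Kitchen']
```

Non-person labels (objects, scenes, expressions) are unaffected; only the gendered person categories are collapsed.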

In an email to developers, Google cited two reasons for the change: a person’s gender cannot be inferred from their appearance, and attempting to do so risks perpetuating unfair biases.

Journalist Sriram Sharma shared a screenshot of the email:

Got an email from Google saying Cloud Vision API will not return gendered labels such as 'man' and 'woman' after February 19, 2020. pic.twitter.com/9XjdgQQwNe

— Sriram Sharma (@SriramVSharma) February 20, 2020

Google added that removing the labels aligns with the second of its AI Principles: avoid creating or reinforcing unfair bias.

Image recognition systems are particularly prone to this kind of bias.

In one study, researchers found that algorithms trained on a deliberately biased dataset of cooking-related images, in which women were 33% more likely to appear, became 68% more likely to predict that a woman was cooking, even when the image showed a balding man in a kitchen. Image recognition systems also regularly misgender trans and non-binary people.

[Read: Automated facial recognition breaches GDPR, says EU digital chief]

Not everyone will agree with Google’s decision to remove gendered labels from images. Business Insider notes that one developer accused Google of prioritizing political correctness over product quality.

But the move will at least reduce one area of AI bias.

As linguist and programmer Angus B. Grieve-Smith explained on Twitter: “Anytime you automatically classify people, whether that’s their gender, or their sexual orientation, you need to decide on which categories you use in the first place — and this comes with lots of assumptions.”

You’re here because you want to learn more about artificial intelligence. So do we. So this summer, we’re bringing Neural to TNW Conference 2020, where we will host a vibrant program dedicated exclusively to AI. With keynotes by experts from companies like Spotify and RSA, our Neural track will take a deep dive into new innovations, ethical problems, and how AI can transform businesses. Get your early bird ticket and check out the full Neural track.

Published February 20, 2020 — 17:34 UTC

