
Sigh, Even Photo Recognition Software is Sexist

August 24, 2017

By Sharanya Gopinathan

Image courtesy Pexels

Whenever you read reports about technology behaving in racist or sexist ways, like when Google's search algorithm recently supplied pictures of half-dressed actresses for the query "South Indian masala", you immediately realise that it's hardly the technology itself that is to blame for being sexist, but the creators and users of the technology who have passed their biases on to it.

Last year, during his research with image recognition software, a University of Virginia computer science professor named Vicente Ordóñez noticed some strange sexist patterns: that the software would associate pictures of kitchens with women, not men, for example.

He decided to test two huge collections of labelled photos that are used to "train" image recognition software, including one supported by Microsoft and Facebook. The tests found that these large image collections came with in-built gender biases: a proclivity to link images of shopping and cleaning with women, and images of coaching and shooting with men.
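
To make that concrete, here is a minimal sketch in Python, using entirely made-up annotations rather than anything from the actual Microsoft- and Facebook-backed collections, of how that kind of skew can be measured: simply count how often each activity label co-occurs with a "woman" annotation.

```python
from collections import Counter

# Entirely hypothetical annotations, standing in for the kind of labelled
# photo collections the researchers examined: each photo gets an activity
# label and a gender label.
training_labels = [
    ("cooking", "woman"), ("cooking", "woman"), ("cooking", "man"),
    ("shopping", "woman"), ("shopping", "woman"), ("shopping", "man"),
    ("coaching", "man"), ("coaching", "man"), ("coaching", "woman"),
]

def share_labelled_woman(labelled_photos):
    """For each activity, return the fraction of photos annotated 'woman'."""
    totals, women = Counter(), Counter()
    for activity, gender in labelled_photos:
        totals[activity] += 1
        if gender == "woman":
            women[activity] += 1
    return {activity: women[activity] / totals[activity] for activity in totals}

print(share_labelled_woman(training_labels))
# cooking and shopping come out around 0.67 (skewed towards women),
# coaching around 0.33 (skewed towards men)
```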

The research also found that the software wasn't just reflecting the biases it was trained on, but actually magnifying them: an image recognition system "trained" on a set of pictures that carries these biases ends up making even stronger associations than the pictures themselves do.
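
A rough way to picture that magnifying effect, again with hypothetical numbers rather than the study's real figures, is to compare the skew in the training data with the skew in the trained model's own predictions, as in this sketch:

```python
# Made-up skew figures, not the study's actual numbers: the fraction of
# images of each activity involving women, first in the training data and
# then in the trained model's predictions.
training_skew = {"cooking": 0.66, "coaching": 0.33}
predicted_skew = {"cooking": 0.84, "coaching": 0.20}

for activity in training_skew:
    amplification = predicted_skew[activity] - training_skew[activity]
    direction = "towards women" if amplification > 0 else "towards men"
    print(f"{activity}: skew shifts by {abs(amplification):.2f} {direction}")

# A model that merely reflected its training data would show shifts near
# zero; the researchers found predictions drifting further in the direction
# of the existing stereotype instead.
```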

While it's annoying in and of itself that new technologies are being trained and programmed to mirror the real-world biases we're trying so hard to remove, this could also have larger consequences. As Wired points out, if these biases are replicated inside tech companies, they could affect widely used photo recognition software and interfere with tools that analyse social media photos and activity in order to glean customer preferences.

We already have enough reason to understand that this is a problem: back in 2015, Google Photos' image recognition tagged two black people as gorillas, presumably because the images it was trained on overwhelmingly linked white people to the "human" label, leaving black people and others to…be assigned to animals, I suppose.

In all seriousness though, this is something we should be careful about. In the early days of the Internet, the virtual world was hailed as a new, neutral space where gender and racial biases could be erased, a sort of opportunity to start over. Now, seeing how easy it is to use the Internet to troll, harass and attack women, and looking at the clearly gendered ways that technology itself is learning to behave, it's clear that we can't just sit back and take it for granted that the virtual world will be any less sexist than the real one. In fact, given how much of our daily lives is shifting to the virtual world, we should be particularly vigilant in fighting the sexism we encounter here.

Tags: biases, collections, Facebook, facial recognition software, microsoft, photo, sexist, social media, university of virginia, Vicente Ordóñez
