
Dear Friends, 

My first assignment as a journalist was to report on the lives of Africans in Delhi, who at the time had been accused of running a drug and sex racket. A pamphlet distributed in the neighbourhood read: “All landlords are requested not to let their properties to Nigerians or other such disruptive elements.”


The basis of this profiling was deep-seated racism – going back to brown Indians feeling inferior about their complexion due to their history of colonisation by the British. They were now trying to feel superior by believing they were better than black people. Many Indians who internalised the colourism they received from their colonisers passed the discrimination on to anyone darker than themselves, so that they could have the solace of not being on the lowermost rung of this conveniently concocted hierarchy.


In the past few years, humans have made huge advancements in technology while transferring their biases to it. We outsourced our prejudice and called technology neutral because of its non-human form. The lenses were supposed to be new, but those holding the camera – and their vantage point – had not changed much.
 

https://unbiasthenews.org/no-facing-away-why-indias-facial-recognition-system-is-bad-news-for-minorities/

Facial recognition technology has since proved this bias transference many times. From wrongful arrests to death threats, the technology has led to the targeting of innocent people, especially those with vulnerable social identities. Such concerns have sparked resistance against this identification system in many countries. In Europe, Belgium and Luxembourg have taken an official stand against the technology.

The latest addition to the Unbias the News repository of stories – written by Aishwarya Jagani and illustrated by Victoria Shibaeva – examines the use of facial recognition technology in India and how it particularly threatens minorities. The system's errors and inaccuracies do not only lead to unfair persecution; they also end up excluding people from state-sponsored benefits when welfare programmes tie their distribution structures to such identification systems.

On the other hand, law enforcement in various parts of the world continues to argue in favour of the technology. But given all its drawbacks, how does one know for sure that it won't add to the long trail of “collaterals” historically created by the criminal justice system? This wouldn't be the first time a technology hailed as revolutionary in its early days led to unjust convictions and was ultimately deemed fallacious.

Technology might be biased, but we will keep trying to unbias the news, and our team deeply appreciates your contribution in helping us do that. If the story resonates with you, please share it in your circles. If you have feedback, critique or suggestions to offer, please send them our way. We always love to hear from our readers and carefully consider your responses.

In solidarity,


Ankita Anand - Story Editor

Let's Unbias the News!
Your all-women team at Unbias the News
