Netflix’s “Coded Bias” exposes the racism of the algorithm

Netflix's "coded bias" exposes the racism of the algorithm

Recently, some content creators on the internet replaced their profile photos with photos of white people and pointed to the algorithm’s potentially racist behavior based on the engagement their posts received. Since then, my interest in the subject has grown. After all, besides being a Black woman, I am a Black woman who produces content for the internet.

It is nothing new that searches in search engines return results that are violent toward Black people. Even with the actions and statements of different organizations, it still happens, for example, that you need to add the word “Black” for the image of a Black family to appear when searching for the word “family”. Isn’t a Black family a “family”?

It is no different in the fields of aesthetics and fashion. Searching for references on what is beautiful or what is trending will mostly return white productions from European countries or the United States. Why does this happen? If technology is so “smart”, why does it behave this way?

Well, the question we need to ask is: who develops the technology? Who programs it? When we pose questions to digital platforms, who develops the algorithm?

Dayana Morais da Cruz, who works in HR recruitment, questions the use of the images we share in apps and on social networks. Image: personal collection

In recent days, the subject struck me again, this time with the release of the documentary “Coded Bias” on Netflix and with an email from my follower Dayana Morais da Cruz, who suggested the topic for this column. She is a graduate student in legislative, territory, and city management and works with diversity and inclusion at an HR company.

The field she works in is a good example for laying out a line of reasoning. It is well known that companies’ recruitment and hiring practices involve biased procedures. They start when résumés are filtered by specific universities, leaving out thousands of talented candidates; they continue when fluency in other languages is required, even though we live in a country where only 5% of the population speaks English… and so on.

The “filter” that has long organized corporate logic offline is also reflected in software that builds profiles of professionals for companies to hire. “Okay, but what do I have to do with that, Xongani?”, you must be asking yourself.

I can begin answering that question with a single word: filter. You, like me, who use social networks all the time, must have already applied a filter to a photo or used apps that “improve the skin”, “age”, or “rejuvenate” the face. So I ask two questions. First, what does “improve” mean here? Does it have to do with “slimming the features”? With changing characteristics that are yours to bring them closer to the beauty standard? Second, where does this mass of images we are producing go? Where does the photo of my face end up?

“To sign up for the federal government’s Meu gov.br app, for example, you need to register a photo to validate facial recognition. We cannot be sure where this information goes or how it is used,” comments Dayana.

She also shared references to many professionals, Brazilian and foreign, who are pointing out the problems with these technologies, such as Tarcízio Silva, a researcher with a master’s in communication and culture; Silvana Bahia, coordinator of PretaLab; and Joy Buolamwini, from MIT, whose research is the basis of the Netflix documentary and who has also been responsible for changes in legislation governing facial recognition practices.

The point is: algorithms work from what they learn from the humans who develop them. “Scientists point out that a much larger mass of data from white people than from Black people has been used in training. This causes the system to fail to recognize relevant characteristics in Black people,” wrote data scientist Lucas Santana.

In the absence of professionals, especially Black professionals, actually working to change the biases in the “education” of machines, these systems will continue to be built from models that do not reflect reality. And that, as Lucas points out, generates a high number of errors.
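To make the mechanism Lucas describes concrete, here is a minimal, hypothetical Python sketch (the data below is invented purely for illustration, not taken from any real audit) of the kind of per-group error-rate check that audits such as Joy Buolamwini’s apply to facial recognition systems:

```python
# Minimal illustrative sketch: measuring a face-recognition model's
# error rate separately for each demographic group.
# NOTE: the records below are hypothetical, invented only to show
# the pattern described in the column.

from collections import defaultdict

# Each record: (group, whether the model's prediction was correct)
results = [
    ("lighter-skinned", True), ("lighter-skinned", True),
    ("lighter-skinned", True), ("lighter-skinned", False),
    ("darker-skinned", True), ("darker-skinned", False),
    ("darker-skinned", False), ("darker-skinned", False),
]

totals = defaultdict(int)
errors = defaultdict(int)
for group, correct in results:
    totals[group] += 1
    if not correct:
        errors[group] += 1

# A model trained mostly on one group tends to show a much higher
# error rate on the under-represented group.
for group in totals:
    print(f"{group}: error rate {errors[group] / totals[group]:.0%}")
```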

Dayana also pointed out that these algorithms play an important role in organizing behavior on the networks, since they funnel users by interest and thus accentuate bubbles and polarization.

There is a lot of information, and I know the subject is complex, but we need to get closer to it, since we share personal information on the networks and spend hours on them every day. We need to understand what is happening so that these technologies no longer reproduce or deepen inequalities. As a starting point, I recommend taking a look at the work of all the people I have mentioned here.
