
Under growing criticism, facial recognition reaches 20 states in Brazil

RIO DE JANEIRO, BRAZIL – A few minutes is all it takes for a facial recognition algorithm to identify a person by matching their face against millions of faces in a database. If the person is wanted by the law, or merely resembles a suspect closely enough, they can be approached by the police.

Twenty states across Brazil's five regions use or are implementing facial recognition technology in their public security forces. Another three are studying its implementation, and only four states do not use the system, have had no contact with it, and have no plans to adopt it. The data were collected for this report through the State Secretariats of Security and the Civil and Military Police.

Algorithms: researchers accuse technology of intensifying structural racism. (Photo internet reproduction)

The interstate agency announced a system that will collect, store and cross-reference personal data on 50.2 million Brazilians. To do this, the program will use facial recognition and fingerprint records and unify data from the state security departments. The technology will also be used by the Federal Police.

In the fight against crime, the technology is said to be used to capture fugitives and wanted persons, but local governments also use it to search for missing persons. In most cases, the police run the operation 24 hours a day; in others, only during major events such as Carnaval.

In reality, of course, the system can be used for anything once it is installed, depending on who is manipulating the levers of power at the time.


In São Paulo, the system was implemented in 2020 to be used to support police investigations. There are about 30 million faces registered, which, according to the State Department of Public Safety, “allows greater speed, reliability, and processing capacity of the daily demands.”

The department did not explain how the police proceed when someone is recognized. It said the faces used by the system come from the Civil Police database but did not state where that database's information originates.

Bahia has been using facial recognition since 2018. The system is fed by the National Bank of Arrest Warrants, which lists about 333,500 fugitives and persons wanted by the police. As of June 16, 209 people on the list had been arrested with the help of the technology.

No algorithm developed so far offers 100% accuracy; all are subject to errors. For the black population, the chances of false matches are even greater.

In 2018, the Gender Shades project, developed by artificial intelligence researchers Joy Buolamwini and Timnit Gebru, checked the accuracy of facial recognition in four different categories: darker-skinned women, darker-skinned men, lighter-skinned women, and lighter-skinned men.

Technologies developed by Microsoft, IBM, and China’s Megvii were evaluated. The researchers used 1,270 faces from three African countries and three Nordic countries. The algorithm that performed best had 100% accuracy for lighter-skinned men and 79.2% for darker-skinned women.

In the researchers' opinion, applying these systems to Brazilian public security could worsen existing problems. It would not make the police more effective, because the speed of the technology would tie up resources as officers respond to false positives. On the other hand, it could escalate violence in police approaches toward people who have had no contact with the police.
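The false-positive concern the researchers raise is, at bottom, a base-rate problem. A minimal arithmetic sketch, using hypothetical scan volumes and error rates (only the 50.2 million database figure and the 333,500 warrant count come from this article), illustrates why even a low error rate can generate many mistaken alerts:

```python
# Base-rate sketch with illustrative numbers (not from the article):
# when crowds are compared against a large database, even a small
# false-match rate produces many false positives.

daily_scans = 100_000        # hypothetical faces scanned per day
false_match_rate = 0.001     # hypothetical 0.1% false-match rate

# Share of scanned people actually wanted (figures from the article:
# 333,500 warrants out of 50.2 million registered persons).
wanted_fraction = 333_500 / 50_200_000

# Innocent people incorrectly flagged as matches each day.
false_positives = daily_scans * (1 - wanted_fraction) * false_match_rate
print(round(false_positives))  # about 99 mistaken alerts per day under these assumptions
```

Under these assumed rates, police would field roughly a hundred mistaken alerts per day, each requiring an officer's time, which is the resource drain the researchers describe.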

The worst-performing system recognized lighter-skinned men with 99.7% accuracy, but darker-skinned women with only 65.3%.

For a face to be recognized with the highest possible level of similarity, the algorithm needs to be “trained” with many faces of people of different skin tones, gender, and age. Most systems developed so far have been trained with white people, making it difficult to recognize other races.

ALGORITHMS: RESEARCHERS ACCUSE TECHNOLOGY OF INTENSIFYING STRUCTURAL RACISM

A black person is automatically labeled a gorilla on one digital photo platform. On one social network, the automatic cropping of preview images privileges white people's faces. On another network, a black woman sees her post reach increase by 6,000% when she posts photos of white women.

These examples are not isolated and have been the target of criticism and reflection by internet users and researchers. How could mathematical models, the so-called algorithms, be racist?

Researcher Tarcizio Silva, a Ph.D. student in Human and Social Sciences at the Federal University of ABC (UFABC), explains that it is necessary to ask how these systems are used in a way that allows "the maintenance, intensification, and concealment of structural racism". Silva has developed a timeline documenting cases, data, and reactions.

One of the cases with the greatest repercussion occurred recently on Twitter, with the automatic cropping of photos favoring white faces. Thousands of users adopted the hashtag #AlgoritmoRacista on the network itself to question the automation that exposed racism. Silva explains that this discovery involved algorithms based on neural networks, a technique that finds regions of interest in an image using data gathered through gaze tracking.

“An accumulation of data and biased research that privileged white aesthetics resulted in the system that Twitter used and failed to even explain properly where the origin of the issue was,” the researcher said. At the time, the platform pledged to revise the mechanism: “We should have done a better job of foreseeing this possibility when we were designing and building this product.”

“This is how algorithmic racism works, through the accumulation of using poorly explained and poorly tuned technologies that at first optimize some technical aspect, but actually mutilate the user experience,” the researcher adds.

Outside social networks, the damage from algorithmic racism can be even greater. Data collected by the Network of Security Observatories show that from March to October 2019, 151 people were wrongly arrested in four states (Bahia, Rio de Janeiro, Santa Catarina, and Paraíba) through the use of facial recognition technology.

Among records that included information on race and color, or images of the people approached (42 cases), 90.5% of those targeted were black. "The main motivations for the approaches and arrests were drug trafficking and robbery," the report points out.
