(New York) – Argentina’s Justice and Human Rights Ministry is regularly publishing online the personal data of children with open arrest warrants, Human Rights Watch said today. The Buenos Aires city government has then been loading the images and identities of these children into a facial recognition system used at the city’s train stations, despite significant errors in the national government’s database and the technology’s higher risk of false matches for children.
Human Rights Watch has sent letters to Argentine President Alberto Fernández and Horacio Rodríguez Larreta, the mayor of Buenos Aires City, saying that the policies and practices violate international obligations to respect children’s privacy in criminal proceedings.
“By publishing the personal data of these children online, the national government is putting their access to education, work, and housing at risk for the rest of their lives,” said José Miguel Vivanco, Americas director at Human Rights Watch. “Making matters worse, the Buenos Aires city government uses this database, which contains serious errors, to feed into a facial recognition system with few protections, despite its foreseeable errors in identifying children and its adverse impacts on them.”
Since 2009, Argentina’s Justice and Human Rights Ministry has maintained a national database of people with outstanding arrest warrants, known as the National Register of Fugitives and Arrests (Consulta Nacional de Rebeldías y Capturas, CONARC). This database, which the ministry makes publicly available online, contains suspects’ names, ages, national ID numbers, the alleged offense, and the location and authority issuing the warrant, among other details. Children, most of them 16 or 17 years old, are included; entries have also listed people identified as being as young as 1 year old.
Under international human rights law, every child alleged to have committed a crime is guaranteed to have their privacy fully respected at all stages of the proceedings. International standards provide that no information should be published that may lead to identification of the child. On May 17, 2019, the United Nations special rapporteur on the right to privacy concluded his fact-finding mission in Argentina by warning the government that its use of CONARC was violating children’s rights. The online database contained personally identifiable information on 61 children at the time.
At least 25 more children have been added since his warning. Human Rights Watch reviewed 28 versions of the database published between May 2017 and May 2020, as archived by the Internet Wayback Machine, and found that over this three-year period, at least 166 children have been added to CONARC. Even children suspected of minor crimes are included. The most common crime that these children are accused of is theft – 63 children, or 37.5 percent.
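For readers who want to reproduce this kind of review, the sketch below shows how child entries could be tallied across archived snapshots of a CSV-style register. The folder name, column names (“nombre”, “dni”, “edad”, “delito”), and parsing details are illustrative assumptions, not CONARC’s actual schema.

```python
# Hypothetical sketch: counting distinct children across archived register
# snapshots. Column names ("nombre", "dni", "edad", "delito") and the file
# layout are assumptions for illustration, not CONARC's actual format.
import csv
import glob
from collections import Counter

children_seen = set()   # distinct child entries across all snapshots
offenses = Counter()    # alleged offense -> number of distinct children

for path in sorted(glob.glob("conarc_snapshots/*.csv")):
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            try:
                age = int(row["edad"])
            except (KeyError, ValueError, TypeError):
                continue  # skip rows with a missing or malformed age
            if age < 18:
                key = (row.get("nombre", ""), row.get("dni", ""))
                if key not in children_seen:
                    children_seen.add(key)
                    offenses[row.get("delito", "unknown")] += 1

print(f"distinct children across snapshots: {len(children_seen)}")
print("most common alleged offenses:", offenses.most_common(5))
```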
The database contains obvious errors and discrepancies. Some children appear multiple times. There are blatant typographical errors, conflicting details, and multiple national ID numbers assigned to single individuals, raising the risk of mistaken matches. In one example, a 3-year-old is listed as being wanted for aggravated robbery. These persistent errors in CONARC, which is updated every morning at 7 a.m., indicate that the system lacks basic safeguards to minimize data entry errors, which can have serious consequences for a child’s reputation and safety.
Since April 24, 2019, the Buenos Aires city government has fed this data into its facial recognition system, the Facial Recognition System for Fugitives (Sistema de Reconocimiento Facial de Prófugos, SRFP). The technology scrutinizes live video feeds of people catching a subway train or walking through or near a subway station, and flags possible matches between the captured faces and the identities in CONARC. Because CONARC does not include photos of those charged with a crime, reference photos are pulled from the country’s population registry.
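To make the data flow concrete, here is a minimal, hypothetical sketch of a watchlist-style matching loop of the kind described above. Every name in it, including the face detector, the embedding model, and the similarity threshold, is an assumption for illustration; it does not describe the actual design of the SRFP, UltraIP, or NtechLab’s software.

```python
# Hypothetical sketch of a watchlist-style matching loop; not the actual
# SRFP/UltraIP implementation. The face detector, embedding model, and
# similarity threshold are stand-ins supplied by the caller.
from typing import Callable, Dict, List
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def scan_frame(
    frame: np.ndarray,
    watchlist: Dict[str, np.ndarray],  # CONARC identity -> embedding of registry photo
    detect_faces: Callable[[np.ndarray], List[np.ndarray]],
    embed: Callable[[np.ndarray], np.ndarray],
    threshold: float = 0.8,            # illustrative value only
) -> List[str]:
    """Flag every watchlist identity whose reference embedding is similar
    enough to a face detected in the live video frame."""
    alerts = []
    for face in detect_faces(frame):
        probe = embed(face)
        for person_id, reference in watchlist.items():
            if cosine_similarity(probe, reference) >= threshold:
                alerts.append(person_id)   # flagged for police review
    return alerts
```

In any system of this kind, a lower threshold catches more genuine matches but also produces more false alerts, which is the trade-off at the heart of the error rates discussed below.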
The system was contracted to the Buenos Aires-based company Danaide S.A. and is based on its software, UltraIP. The facial recognition component is reported to have been developed by the Russian company NtechLab, which specializes in facial recognition technology and has publicly acknowledged its partnership with Danaide S.A. by listing UltraIP on a partner list published on NtechLab’s website. NtechLab also confirmed the partnership to the news outlet OneZero, though it declined to provide further details, citing a nondisclosure agreement.
Facial recognition technology has considerably higher error rates for children, in part because most algorithms have been trained, tested, and tuned only on adult faces.
In tests conducted in a controlled lab setting, using photos posed under ideal lighting conditions, the US Department of Commerce’s National Institute of Standards and Technology (NIST), which evaluates the accuracy of facial recognition algorithms worldwide, found that the three algorithms that NtechLab submitted for testing produced a higher rate of mistaken matches among children than adults.
Based on the NIST results, Human Rights Watch calculates that, in a controlled setting, these NtechLab algorithms falsely identify a child between the ages of 10 and 16 six times more often than an adult between the ages of 24 and 40.
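The “six times” figure is a ratio of false match rates. The numbers in the arithmetic below are placeholders chosen only to illustrate the calculation; they are not NIST’s published measurements for the NtechLab algorithms.

```python
# Illustrative only: these false match rates are made-up placeholders,
# not NIST's published results for the NtechLab algorithms.
fmr_children_10_to_16 = 0.0006   # hypothetical false match rate, ages 10-16
fmr_adults_24_to_40 = 0.0001     # hypothetical false match rate, ages 24-40

ratio = fmr_children_10_to_16 / fmr_adults_24_to_40
print(f"children falsely matched {ratio:.0f}x as often as adults")  # prints 6x
```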
Since children experience rapid and drastic changes in their facial features as they age, facial recognition algorithms also often fail to identify a child who is a year or two older than in a reference photo. Because the facial recognition system matches live video with identity card photos collected by the country’s population registry, which are not guaranteed to be recent, it may be making comparisons with outdated images of children, further increasing the error rate. Argentine children are required to renew their identity card only when they are between the ages of 5 and 8, and again between the ages of 14 and 15.
According to official documents obtained by the Civil Rights Association (Asociación por los Derechos Civiles, an Argentine organization), prior to its deployment the system was tested only on the adult faces of employees of the city’s police department and the Justice and Security Ministry. The city government did not require the companies providing this technology to perform tests to minimize bias against children. Danaide S.A. and NtechLab did not respond to multiple requests for comment.
Error rates for facial recognition technology also increase substantially when it is deployed in public spaces, where the images captured by video surveillance cameras are unposed, blurred, and shot in uncontrolled conditions. Deploying this technology in Buenos Aires’ subway system, with a daily ridership of over 1.4 million people and countless more passing through or near its stations, will result in people being wrongly flagged as suspects for arrest.
Buenos Aires is currently deploying the system on a small scale, for budgetary reasons. The European Union has estimated that when live facial recognition is used in places visited by millions of people, like subway systems, even a relatively small error rate like 0.01 percent may result in hundreds of people being incorrectly identified once the system is scaled up to its full capacity. In public statements, the Buenos Aires city government has cited error rates of 3 percent or higher and has given widely varying numbers of mistaken identifications, making it difficult to quantify the system’s impact so far.
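The scale effect is straightforward arithmetic: a small false match rate multiplied by millions of daily face comparisons still produces a large absolute number of people wrongly flagged. The back-of-envelope calculation below combines the ridership figure cited above with the EU’s illustrative 0.01 percent rate; it assumes, for simplicity, one comparison per rider per day.

```python
# Back-of-envelope estimate, assuming one face comparison per rider per day.
daily_riders = 1_400_000    # Buenos Aires subway daily ridership (from the text)
false_match_rate = 0.0001   # 0.01 percent, the EU's illustrative figure

wrongly_flagged_per_day = daily_riders * false_match_rate
print(f"~{wrongly_flagged_per_day:.0f} people wrongly flagged per day")  # ~140
```

At that rate, mistaken identifications would reach the hundreds within a few days of full-scale operation.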
Official documents obtained by the Argentine Computer Law Observatory (Observatorio de Derecho Informático Argentino) reveal that the Buenos Aires police are stopping and detaining people solely on the basis of automated alerts generated by the facial recognition system. Adults have been mistakenly detained and arrested.
The city government has defended its use of the system, blaming mistaken matches on errors in CONARC. The city’s Justice and Security Ministry has denied that the facial recognition system identifies children, as “CONARC does not contain the data of minors,” which is false.
Argentina’s Justice and Human Rights Ministry should immediately remove all children under 18 from the CONARC database, which is published daily, Human Rights Watch said. The Buenos Aires city government should immediately suspend its facial recognition system, conduct a privacy and human rights impact assessment, and publish verifiable statistics on the system’s performance. The city government should also invite public engagement to assess the necessity, proportionality, and legality of the use of facial recognition surveillance, with special consideration for its implications for children.
“Authorities from the national and Buenos Aires city governments should coordinate efforts to shield children from harm caused by unreliable surveillance technologies and practices that violate their right to privacy,” Vivanco said.