History Shows Why Police Use of Facial Recognition Tech Can Threaten Rights

Past Discrimination Shows Potential for Harm

A live demonstration of artificial intelligence and facial recognition technology for tracking people in dense crowds at the Horizon Robotics exhibit at the Las Vegas Convention Center during CES 2019 in Las Vegas, January 10, 2019. © 2019 David McNew/AFP/Getty Images

Debates in the US about whether police should use facial recognition technology are intensifying. Amazon shareholders recently rejected a proposal to stop, at least temporarily, sales of the company’s Rekognition software to government entities. Washington state, home to Amazon and Microsoft, has seen a push for regulation. And last month, San Francisco banned the technology’s use by local agencies.

Human Rights Watch and other organizations have called on US companies not to sell the technology to governments due to rights concerns. Research suggests facial recognition may be inaccurate, especially for people of color and women. Inaccuracy is not the only potential rights problem, however. Since the technology can track where we go and with whom, its use could discourage people from freely expressing themselves and associating with others. It also creates a pool of information that could be misused in a discriminatory manner.

These last two concerns are highly significant for some Seattle activists who pointed to lessons of the past in interviews with Human Rights Watch. During World War II, the US government applied then-advanced punch-card reading (or “Hollerith”) machines to non-public census data to target west coast Japanese American communities for incarceration.

“We are a community that has been surveilled in the past, and we’re also a community where the latest technological advancements [were] used in our oppression,” said Geoff Froh, deputy director of Densho, which documents the mass incarceration of Japanese Americans. “Instead of Hollerith machines, we now have facial recognition technology.”

In 2000, the Census Bureau acknowledged that its use of census data facilitated the World War II-era abuses.

Facial recognition software can scan photographs or videos to identify people and record their locations – not unlike the earlier census data that was mined and misused.

Stanley Shikuma of Seattle’s Japanese American Citizens League noted that today’s monitoring technology is even easier to deploy, and highlighted the rights threats to anyone who might face bias or police harassment.

“If you think that the police are watching you … when you go to the bank, or you go to the doctor’s office, or you go to the church or the synagogue or the mosque, you’ll be less likely to exercise those freedoms,” Shikuma said. “I don’t think that’s the kind of society you want, where people are looking over their shoulder because they don’t know which government agency might be watching them.”

Human Rights Watch agrees. Congress should impose strong, enforceable limits on this new technology – and in the meantime, companies shouldn’t put it in government hands.
