On Friday, the Identity-matching Services Bill will be discussed at a hearing of the Parliamentary Joint Committee on Intelligence and Security. It has serious implications for human rights.

Should the government be able to track your every move when you walk down the street, join a protest, or enter your psychiatrist’s building? Facial recognition technology may make that a reality for Australians. Parliament should refuse to expand its use until the government can demonstrate it won’t be used to violate human rights or turn us all into criminal suspects.

The bill would create a nationwide database of people’s physical characteristics and identities, linking facial images and data from states and territories and integrating them with a facial recognition system.

The system would initially enable centralised access to passport, visa, citizenship, and driver's licence images, though states and territories may also link other records, such as marine licences or proof-of-age cards. Government agencies and some private companies would then be allowed to submit images to verify someone's identity, and government agencies would also be able to use the system to identify unknown individuals. The Department of Home Affairs would manage the system.

Prime Minister Malcolm Turnbull describes the proposal as a “modernisation” and “automation” of existing data-sharing practices between law enforcement agencies, making facial recognition “available in as near as possible real time.” But the proposal is too broad, allows facial recognition to be used for purposes far beyond fighting serious crime, and leaves significant details to departmental discretion or future interpretation. The lack of safeguards, combined with the centralisation of a massive amount of information, raises the potential for abuse and ever-expanding mission creep.

For example, the bill contains insufficient limits on how officials might use information shared through the system. Home Affairs would also have broad powers to define new kinds of “identity matching services” and information sharing, perhaps extending to fingerprints and iris scans.

The stated purposes for the system are either too minor to justify such a serious intrusion on liberty or so broad in addressing law enforcement and national security that they may cast a wide net affecting many innocent people.

The bill raises immediate alarms about privacy and other rights. With scant limits on future data collection and use, the amount of data is likely to grow over time. It also obliterates notions of consent, since information people disclose for one purpose, such as obtaining a fishing licence, could easily be used for entirely different ones, like targeting “jaywalkers or litterers.”

Proponents contend that the system will not involve “surveillance” or direct integration with CCTV cameras. Nonetheless, the bill has the potential to facilitate broad tracking and profiling, especially when images are combined with other data. Imagine the chilling effect if officials ran photos taken by surveillance cameras at a demonstration or outside a union hall through the system. Or the assumptions that could be made if you were caught on camera outside a drug treatment centre, abortion clinic, or marriage counsellor’s office.

Notably, the proposal doesn’t require law enforcement agencies to get a warrant before using the system to identify someone, a safeguard that is critical to preventing abuse. And what would prevent the government from integrating it with CCTV once the technologies are in place?

Facial recognition technology is far from perfect. Independent studies have found these systems often have a racial or ethnic bias. Yet the government has not disclosed enough information about the accuracy of the system it intends to use. What are its error rates, and are they higher for racial and ethnic minorities? This is not a trivial issue: when a system searches millions of images, even a small error rate can produce a large number of false matches. False positives mean people are wrongly accused or placed under unwarranted suspicion. False negatives mean criminals may continue to walk free.

Errors shift the burden onto individuals to show they are not who the system says they are, undermining the presumption of innocence. And this may disproportionately impact already vulnerable communities if the system misidentifies them at higher rates. Indigenous Australians are already significantly overrepresented in the criminal justice system. And what recourse would a person have if a bank denied them services because the system failed to verify their identity correctly?

Even setting errors aside, facial recognition raises significant human rights concerns. Combined with other data, facial images can be used to draw (potentially flawed) conclusions about who you are, what you believe, what you have done, and what you might do in the future.

The next generation of artificial-intelligence-driven facial recognition systems may be used in even more pernicious ways, from inferring your sexual orientation, IQ, or political beliefs, to predicting your propensity to commit crime or automatically detecting and punishing trivial infractions. This is already happening in China.

The lack of explicit safeguards in the bill means that information could be abused by government officials, police officers, or even private companies against people in unpredictable ways. Australia’s patchwork of data protection laws provides insufficient protection against these risks.

The extraordinary intrusiveness of facial recognition should not be underestimated. Parliament should scrap the bill until the government fully addresses the threats the system poses to a free society and provides real safeguards for people’s rights.
