A man is pictured at the venue of China International Big Data Industry Expo in Guiyang, Guizhou province May 27, 2017. © 2017 Reuters/Stringer

As a researcher into China’s mass surveillance systems, I have watched the revelations about the murky practices of the data marketing firm Cambridge Analytica and the debates about Facebook’s privacy policies with interest. In many ways, China highlights the dangers of gathering data without sufficient controls – not a road the US or any other country should follow.

In China, we are not shocked by social media companies giving user information to those who might put it to dubious, rights-violating uses. Social media companies in China are required to store all personal data inside the country so the security agencies can conveniently access it when needed – such as when they are investigating anyone who speaks out against the ruling Chinese Communist Party.

While there is growing concern about an emerging Artificial Intelligence race between China and the US, there is too little focus on how this competition is becoming a race to the bottom when it comes to protecting basic rights. Some observers worry that China will overtake the US in AI development for the simple reason that Chinese developers can access large amounts of personal data with far fewer privacy restraints than even the relatively weak ones found in the US.

In China, of course, there is no need to question whether data marketing companies may have “meddled” in elections: there are no meaningful elections to meddle with. And we do not have to wonder which political party companies may have helped to catapult to power: there is only one party. In fact, the companies’ support for the Communist Party is so embedded in Chinese corporate culture that a giant like Tencent, which runs WeChat, a Facebook equivalent many people in China can’t live without, has a sign that says “Follow the Party” in front of its headquarters.

Those of us in China also don’t need to worry about whether big data analytics may be exploited to manipulate the population; we already know they are. Indeed, many of the data analysis systems in China are explicitly designed for social control. In the US, the use of credit scores, predictive policing, bail-related “risk assessment” software, and other tools relying on problematic algorithms already marginalizes or poses rights risks to poor people and minority communities. In China, the consequences of data analytics without rights safeguards are even more alarming. A new national big data system, Police Cloud, which Human Rights Watch uncovered in November 2017, is designed to monitor and track categories of people called “focus personnel.” These include people with mental health problems, people who have complained about the government, and the Uyghurs, a targeted Muslim minority who live in the northwestern region of Xinjiang.

Another big data system we found is also designed to monitor Uyghurs. This system, labeled the “Integrated Joint Operations Platform,” detects deviations from “normalcy,” such as the failure to pay phone bills, and treats them as indicators that a person may be politically “untrustworthy.” The platform then flags such people for the authorities, who throw some of them into extralegal and indefinite detention to correct their thoughts.

The Chinese government is putting in place yet another national big data system, the Social Credit System, to engineer a problem-free society by rating citizens on a range of behaviors from their shopping habits to online speech. Those with low scores could face obstacles in everything from getting government jobs to placing their children in desired schools.

But unlike with the Cambridge Analytica scandal, don’t expect much publicly expressed outrage in China about either the government’s high-tech surveillance and data analysis or corporate invasions of privacy. While the Chinese media have reported on private company data breaches, such discussions are carefully contained to ensure they do not veer into criticism of the government or anything considered political. Inside China, reporting about state surveillance is blocked or scrubbed clean. Within hours after a netizen posted our Police Cloud report on a popular Chinese forum, censors removed the entire discussion.

The Chinese government’s exploitation of personal data to fuel its mass surveillance and AI projects is a prime example of how a lack of regulation, coupled with authoritarianism, can lead to dangerous consequences for human rights. And even in democracies, surveillance powers can erode democratic institutions and give governments a sinister degree of power over their citizens.

We need more, and appropriate, rules to protect privacy before data mining and AI usher us into a world in which we do not wish to live. The debate surrounding Cambridge Analytica provides an opportunity to call for such reforms.

