Along with a third of the world’s population, many human rights researchers are in lockdown during the Covid-19 pandemic, unable to leave home to conduct their work. In response, they are relying even more on publicly available information found online.
During country lockdowns, content posted to social media has helped expose violations, public records requests have been filed to understand how states decide who receives Covid-19 tests, and public search engines are being used to highlight security flaws in the video conferencing platform Zoom. But as digital research becomes more important, some governments are restricting the public’s access to content by censoring information, limiting freedom of information requests, imposing internet shutdowns, and propagating disinformation campaigns.
Private companies are also making decisions that will likely limit access to human rights material. Last month, the major social media platforms put thousands of third-party content moderators on paid leave, saying they could not perform their sensitive work remotely. In their place, untested and opaque artificial intelligence technologies now have an outsized role in determining what content stays online and what gets removed, and users have a drastically reduced ability to appeal decisions.
Social media companies acknowledge that the lack of human moderators increases the chance of mistakes. Human reviewers may understand the context and nuances of material in a way that automated systems do not, and without them, there is a heightened risk that videos and images of human rights violations will be erroneously removed. Given the secrecy of these systems, we will likely never know how they are programmed to remove content, what types of errors they make, or how they are being refined.
These are extraordinary times. But now more than ever, we need access to information and transparency so the public can understand what is going on. Governments should refrain from undue censorship and companies should better communicate how they are automating and refining their content moderation policies as well as their appeals and review processes. Otherwise, human rights abuses can happen in the dark and restrictive measures can get entrenched and harder to retract after the pandemic has passed.