Katie Hill, a US Congress member elected to the House of Representatives less than a year ago, has just resigned, days after nude photos of her – which she says were released without her consent – were posted online by media outlets. Hill, 32, is also alleged to have violated House rules by engaging in a sexual relationship with a staff member. She denies that allegation. But she stepped down anyway, citing the “private photos of personal moments” that had been “weaponized” against her as the reason, and adding that she was “fearful of what might come next.”
Many people – 90 percent of them women and girls – have lived through some form of sexual violence online, including what Hill describes. Compromising photos have been used against people for as long as cameras have existed. But today’s online media – where we spend so much of our lives and everything is a screenshot away from permanence – present opportunities for abuse that never existed before.
Abusers have been quick to seize these opportunities – to humiliate, to destroy careers, reputations, and relationships, and even to drive victims to suicide or trigger so-called “honor” violence in societies where sex outside of marriage is seen as bringing shame on a family. They have also found ways to monetize their abuse: in South Korea, where Human Rights Watch is currently researching this issue, platforms charge viewers to watch “spycam” footage of women and girls filmed unknowingly in bathrooms and changing rooms.
From a victim’s perspective, once an image is posted, the harm is permanent. Images cannot be unseen, and once shared, removal becomes a game of whack-a-mole – even if the original is taken down, screenshots and other copies can reappear at any time.
Internet companies often fail to grasp the impact and urgency of these cases. Too often they decline to remove nonconsensual images, impose slow and complex procedures on victims who request removal, and act too slowly when images reappear.
Worse yet is the response by governments. In many countries, laws have not kept pace, and nonconsensual sharing of intimate images may not even be illegal. Where it is a crime, police often lack the expertise, tools, and compassion to investigate cases and support victims.
Technology is bringing us new forms of violence against women. Governments – and companies – need to act like they care.