It will soon be illegal to create sexual digital forgeries – or ‘deepfakes’ – of other people without their consent, according to new plans laid out by the government.
A Ministry of Justice spokesperson told GLAMOUR, “Sexually explicit deepfakes are degrading, harmful, and, more often than not – misogynistic.
“We refuse to tolerate the violence against women and girls that stains our society, which is why we’re looking at options to ban their creation as quickly as possible.”
Deepfake abuse refers to when a real person’s likeness is artificially mapped onto an image of another real person’s nude body, often engaged in a sexual act. In 98% of cases, neither of these people consent to the digital forgery being made. And in 99% of cases, the victims are women. No wonder a GLAMOUR survey of over 3,000 people found that 91% thought deepfake technology poses a threat to women’s safety.
Over the past year, GLAMOUR has been calling for the government to act on the scourge of deepfake abuse by making it a criminal offence to create – or ask someone else to create – a sexual digital forgery of someone without their consent.
But we didn’t stop there; we partnered with Jodie*, a survivor of deepfake abuse, the End Violence Against Women Coalition, Not Your Porn, and Professor Clare McGlynn to demand the introduction of a comprehensive Image-Based Abuse Law covering all forms of image-based abuse. Our petition, which you can sign here, already has 66k signatures – and counting.
Jodie, who has previously shared her experience of being deepfaked with GLAMOUR, says, “I welcome the news that the government plans to introduce a creation offence for deepfake abuse.
“However, for this legislation to truly protect victims, it must be comprehensive. This means including provisions for solicitation, forced deletion of abusive content, future-proofed language to account for evolving technology, and a consent-based approach so survivors aren’t burdened with proving intent to harm.