Victims of Deepfake Abuse Demand Stricter Protection as New Law Takes Effect
Over 73,000 people have signed a petition delivered to Downing Street by campaign group Stop Image-Based Abuse, calling for tougher measures against deepfake image abuse. The new law, which came into effect on Friday, makes creating explicit non-consensual images a criminal offense.
Jodie, a victim of deepfake abuse who discovered her intimate images had been shared online without consent, hailed the law's arrival as a "momentous day". She expressed frustration with the delayed implementation, saying it had allowed millions more women to become victims and denied them justice.
Campaigners are urging the government to introduce civil routes to justice such as takedown orders for abusive imagery on platforms and devices. They also want improved relationships and sex education, as well as adequate funding for specialist services like the Revenge Porn Helpline.
The law was introduced as an amendment to the Data (Use and Access) Act 2025, following years of campaigning by Stop Image-Based Abuse. However, campaigners argue that the current legislation falls short in protecting sex workers from intimate image abuse.
Madeline Thomas, a sex worker and founder of tech forensics company Image Angel, said that while the law was a step forward, it did not provide sufficient support for victims like her whose images have been shared online without consent. "When commercial sexual images are misused, they're only seen as a copyright breach," she said. "However, the proportion of available responses doesn't match the harm that occurs when you experience it."
The issue of deepfake abuse is a growing concern, with one in three women in the UK experiencing online abuse. The Ministry of Justice has vowed to take action against companies behind "nudification" apps and is designating the creation of non-consensual sexual deepfakes a priority offense under the Online Safety Act.
However, many campaigners remain skeptical about the government's commitment to addressing this issue, saying that more needs to be done to protect victims like Jodie.