AI Deepfake Abuse Law Takes Effect Amid Calls for Stronger Protections
In a landmark moment for digital rights and victim advocacy, a new UK law criminalizing the creation of non-consensual intimate deepfake images has officially come into force, marking a significant step forward in the fight against AI-generated abuse. However, campaigners are urging the government to go further, warning that the law, while crucial, still leaves many victims without adequate recourse.
The legislation, introduced through the Data (Use and Access) Act 2025, makes it a criminal offence to create explicit deepfake images without the subject’s consent. The change follows years of advocacy from survivors, legal experts, and organizations such as Stop Image-Based Abuse, who have long called for stronger protections in the digital age.
“A Momentous Day” for Survivors
For Jodie, a survivor of deepfake abuse who uses a pseudonym, the law’s enactment is both a victory and a reminder of the long road ahead. “Today’s a really momentous day,” she said. “We’re really pleased the government has put these amendments into law that will definitely protect more women and girls. They were hard-fought victories by campaigners, particularly the consent-based element of it.”
Jodie’s story is a harrowing example of the harm caused by deepfake technology. In 2021, she discovered that images of her were being used to create non-consensual deepfake pornography. Alongside 15 other women, she testified against the perpetrator, Alex Woolf, who received a 20-week sentence for taking women’s images from social media and posting manipulated versions of them to porn websites. “I had a really difficult route to getting justice because there simply wasn’t a law that really covered what I felt had been done to me,” Jodie explained.
Campaigners Demand More
While the new law is a step in the right direction, campaigners argue it doesn’t go far enough. Stop Image-Based Abuse, a coalition of organizations including the End Violence Against Women Coalition, the victim campaign group #NotYourPorn, Glamour UK, and Durham University law professor Clare McGlynn, delivered a petition with over 73,000 signatures to Downing Street. The petition calls for additional measures, including:
- Civil routes to justice, such as takedown orders for abusive imagery on platforms and devices.
- Improved relationships and sex education to address the root causes of image-based abuse.
- Adequate funding for specialist services, like the Revenge Porn Helpline, which supports victims of intimate image abuse.
Jodie also criticized delays in implementing the law, which received royal assent in June 2025 but only came into force this February. “We had these amendments ready to go with royal assent before Christmas,” she said. “They should have brought them in immediately. The delay has caused millions more women to become victims, and they won’t be able to get the justice they desperately want.”
Gaps in Protection for Sex Workers
For sex workers, the law’s protections are even more limited. Madelaine Thomas, a sex worker and founder of tech forensics company Image Angel, who has waived her right to anonymity, described the day as “very emotional” but emphasized that the law falls short for her community. “When commercial sexual images are misused, they’re only seen as a copyright breach. I respect that,” Thomas said. “However, the proportion of available responses doesn’t match the harm that occurs when you experience it. By discounting commercialised intimate image abuse, you are not giving people who are going through absolute hell the opportunity to get the help they need.”
Thomas has been a victim of intimate image abuse for the past seven years, with her images shared without consent almost daily. “When I first found out that my intimate images were shared, I felt suicidal, frankly, and it took a long time to recover from that,” she revealed.
The Scale of the Problem
The issue of online abuse is widespread. According to domestic abuse organization Refuge, one in three women in the UK have experienced online abuse. The rise of AI-generated deepfakes has only exacerbated the problem, with tools like Grok AI being used to create explicit images without consent. In January, Leicestershire police opened an investigation into a case involving sexually explicit deepfake images created by Grok AI.
Government Response
A Ministry of Justice spokesperson acknowledged the severity of the issue, stating, “Weaponising technology to target and exploit people is completely abhorrent. It’s already illegal to share intimate deepfakes – and as of yesterday, creating them is a criminal offence too.”
The government has also announced plans to go further, including banning “nudification” apps outright and making the creation of non-consensual sexual deepfakes a priority offence under the Online Safety Act. This would place additional duties on platforms to proactively prevent such content from appearing.
A Long Road Ahead
While the new law is a significant milestone, it is clear that the fight against deepfake abuse is far from over. Survivors, campaigners, and legal experts are calling for continued action to ensure that all victims – including those in marginalized communities – have access to justice and support.
As Jodie put it, “This is just the beginning. We need to keep pushing for stronger protections and ensure that no one else has to go through what we’ve been through.”
Tags: deepfake abuse, AI-generated images, non-consensual intimate images, digital rights, victim advocacy, Stop Image-Based Abuse, revenge porn, online safety, Grok AI, nudification apps, Online Safety Act, Ministry of Justice, End Violence Against Women Coalition, #NotYourPorn, Image Angel, Clare McGlynn, Madelaine Thomas, Alex Woolf case, Leicestershire police, Revenge Porn Helpline, Refuge, Durham University, Data (Use and Access) Act 2025


