King’s Speech - Debate (6th Day)

Part of the debate – in the House of Lords at 9:16 pm on 24 July 2024.


Baroness Owen of Alderley Edge (Conservative) 9:16 pm, 24 July 2024

My Lords, it is a great pleasure to speak in the debate on the humble Address and I welcome the new Ministers to their place. I was pleased to see this Government’s commitment to halving violence against women and girls. However, I am keen to understand whether the renewed focus on VAWG will include tech-facilitated abuse, as I was disappointed that no reference was made to the growing crisis of image-based abuse. Over the last few years, we have seen a piecemeal approach to legislating on this issue: up-skirting, cyber-flashing and the sharing of intimate images are now illegal, but the non-consensual taking of sexually explicit images, as well as the solicitation to create, and the creation of, sexually explicit deepfakes, remain gaping omissions in our patchwork of law in this area.

I was pleased to see the Labour Party manifesto make a commitment to legislate on the creation of non-consensual sexually explicit deepfakes. Ninety-nine per cent of all sexually explicit deepfake videos feature women. If the Government are to succeed in their plan to tackle VAWG, they must not treat online violence in isolation. It can often form part of a much wider picture of abuse. Every day that we delay introducing this legislation is another day when women have to live under the ever-present threat that someone will steal their picture to create sexually explicit images or pornographic videos of them. Every woman should have the right to choose who owns a naked image of her.

I have been privileged in my work in this area to meet Mariano Janin, who understands that sexually explicit deepfake videos were used to bully his beautiful 14-year-old daughter, Mia, leading to her tragically taking her own life. Sadly, this is not an isolated incident. I have also been entrusted with the story of “Jodie”, whose case may be familiar, as she was brave enough to speak to the BBC about the trauma of being deepfaked by someone she counted as her best friend. Jodie discovered that pictures had been taken off her private Instagram page, overlaid on to pornographic images and posted on Reddit and other online forums, with comments asking people to rate her body. Jodie endured this abuse for five years, finding hundreds of pictures of herself, her friends and many other young women.

While it is illegal in the UK to share sexually explicit deepfaked images, it is still not illegal to create them. In Jodie’s case, the perpetrator was soliciting the creation of images from others. It is of the utmost importance that solicitation becomes an offence in itself, to prevent deepfakes being solicited from jurisdictions that may not yet have legislated. We must not underestimate the real impact this digital content has on those such as Jodie whose image has been stolen. The content is often used to bully, harass and even extort money. It is not a one-off experience. Survivors often have to manage the trauma of this digital content trending, or of being subject to further digital abuse, at any given moment.

We must become more agile in our response by ensuring that we view tech-facilitated abuse as a cohesive whole. We must work to find the balance between Parliament having legislative oversight and a regulator having the power to act quickly, not only to remove harms but to anticipate and future-proof against them.

I am determined that we should close the gaps on the taking of non-consensual intimate images, as well as the creation of, and solicitation to create, non-consensual sexually explicit deepfakes. My Private Member’s Bill, being introduced on 6 September, seeks to address this. Urgent legislation is required as part of this new Government’s VAWG strategy to ensure the safety of women and girls online. It is not enough to react to this abuse; we must prevent it happening in the first place.