Love Island star reveals "shocking" naked deepfake ordeal


Love Island star Cally Jane Beech has opened up about her "shocking" ordeal as the victim of sexually explicit, nude deepfakes.

The reality star, who appeared on the first season of the ITV2 show in 2015, spoke about discovering that images of her had been altered by AI to remove her clothes and posted online.

"It was just shocking but I have a daughter, so that's when I started to think, 'OK, hold on a minute, this could happen to her'," Cally said on Good Morning Britain today (January 7).


"And that's when other people said to me it has been happening to children and paedophiles have been using this AI technology for their own access as well and there was nothing to stop them doing it.


"I found that out when I contacted the police and it was such a grey area," she added. "They just said, 'There's not a lot we can do because it's not a real image of you'."

Channel 4 News presenter Cathy Newman, who has also been a victim of deepfake pornography, appeared alongside Cally, noting that you "worry that if you speak about it you'll attract more of the wrong kind of attention" and "generate more deepfakes ultimately".

Cathy added that the experience of uncovering the images during her own investigation "was haunting and the worst thing about it is not knowing who created this video and why".


Speaking about new laws set to make the creation of sexually explicit deepfakes a criminal offence, Cally said: "I hope it will have an effect, I guess time will tell, it's good that it's being noticed, so that's a positive for now. But a key word that we were speaking about was consent.


"When the legislation is written out, it cannot state if they [the creators] were trying to cause harm or intent to harass that person.

"But there's so many loopholes within that, so that's why it really needs to be based around completely being an offence."

A post made on the Ministry of Justice website on Tuesday (January 7) states: "The government will introduce a new offence meaning perpetrators could be charged for both creating and sharing these images, not only marking a crackdown on this abhorrent behaviour but making it clear there is no excuse for creating a sexually explicit deepfake of someone without their consent."
