Love Island’s Cally Jane Beech has opened up on her ‘shocking’ and ‘scary’ ordeal, after she discovered digitally faked nude photos of herself circulating online.
The reality star, 33, who appeared on the ITV dating show in 2015, has become a prominent activist, raising awareness of the disturbing rise of deepfake porn and calling for the Government to clamp down on the vile abuse.
In recent months, a number of high profile figures have been victims of deepfake pornography, including pop star Taylor Swift and US congresswoman Alexandria Ocasio-Cortez.
Fake images and videos are made to look realistic, and victims are usually unaware and unable to consent to being sexualised in such a way.
In March last year, a Channel 4 investigation found that more than 250 British celebrities had been victims of deepfake pornography, including newsreader Cathy Newman.
The probe analysed the five most visited deepfake websites and found thousands of high profile figures had their faces superimposed onto pornographic videos using artificial intelligence, with all but two of the almost 4,000 famous individuals being women.
Cally appeared on Good Morning Britain earlier this month, where she recalled finding a photo of herself from a lingerie photoshoot being circulated on the internet with her underwear digitally removed using AI.
She explained: ‘I was just contacted out of the blue. Someone said there’s an explicit image of you and I said that can’t be true because I knew I’d never taken anything like that.
‘I asked them to send me a link, opened it and to my surprise there was an image of me. I knew it wasn’t real because I had the original photo from an underwear campaign, but the underwear wasn’t there.
‘I then followed through the site to see where it was being hosted, and it was on a website advertising to put pictures of females into the site and be able to strip their clothes, and it was advertising other apps for your smartphone to do this to anybody.
‘And that’s when I found out where the real issue was and took to my social media to speak out about it.
‘A lot of people came rushing forward and said this is a major problem, but it was never something I realised was an issue until that point – how big it was and how many people contacted me.
‘And then it struck me that it was really bad when I was contacted by the family member of a 14-year-old schoolgirl who took her own life because images were created of her and sent around school as a bullying tactic.’
Describing how she felt seeing the deepfaked image of herself for the first time, she said: ‘It was mixed emotions, because at first I was in shock and wasn’t quite sure what to feel.
‘My instant response was to go on social media and say, “Guys I’ve just seen this image of me online, I’m not really sure what’s going on here. Is it a laughing matter?”
‘And then I realised no this is actually a really serious thing. It was just shocking but I have a daughter, so that’s when I started to think, “OK, hold on a minute, this could happen to her”.
‘And that’s when other people said to me it has been happening to children, and paedophiles have been using this AI technology for their own access as well, and there was nothing to stop them doing it.
‘I found that out when I contacted the police and it was such a grey area. They just said, “There’s not a lot we can do because it’s not a real image of you”. Even though I tried to argue that it technically is, it’s my head, my arms, my torso, just superimposed parts, it was still like, “We’re not really sure what we can do”.’
Cally first spoke about her discovery of AI photos of herself online in January last year, sharing a video to her Instagram in the hopes of raising awareness.
Describing her ‘scary’ ordeal, she recalled how she had been left ‘shook and in tears’ after first seeing the edited picture.
She admitted: ‘First thing I did was panic. I knew there was such a thing as AI but I didn’t realise this kind of thing was happening.
‘I was so shook last night, I wanted to cry, the next minute me and [her fiancé] DJ were laughing, but I knew it was so serious.’
Captioning the clip, Cally continued: ‘Honestly this has shook me to my core. Someone has brought to my attention that there is so called nude pictures of me on the internet.
‘I said there couldn’t be as I’ve never sent a nude to anyone or posted anything explicit. EVER. When they sent me the picture they was shown, I instantly knew it was a deep fake, coz for one I know what my body looks like and for two I have a tattoo near my nunny lol, but that is not there.
‘I looked into it and I have found there are sites that allow you to remove peoples clothes/underwear appearing them to be naked. I was sent this last night and instantly had to speak on it and raise awareness coz this day and age is scary what AI can do.’
Issuing a warning to her followers and voicing her determination to do something about the problem, she added: ‘Be mindful what you see on the internet or someone may send you is not real and can be very soul destroying or affect someone hugely.
‘Lucky me and DJ have laughed this off but this could have had much much more impactful affect on my mental health and I’m now left thinking what the hell I’m even meant to do from here. I will get to the bottom of this. Please be aware guys. The internet is a scary, scary place.’
Cally received a flood of messages from her followers who had also been victims, which inspired her to campaign heavily for more stringent laws making non-consensual deepfake pornography a criminal offence.
Speaking to Glamour in October, she revealed how she feared for her career after the photos emerged and said: ‘It makes you feel like you should be ashamed.
‘And really, you shouldn’t have to feel like that – even if you have taken a nude photo of yourself and shared it with a partner. That doesn’t make it OK for them to share it without your consent.’
She added: ‘We all share our pictures on holidays, pictures of us in bikinis or with partners etc, and the fact someone can just take these and do what they see fit with it and there’s little that can be done is awful.
‘It just feels violating because it looks really realistic. If you didn’t know that wasn’t my body, you’d assume I’d put that video out there.’
Despite her hellish ordeal, Cally says she sees herself as ‘lucky’ as she and her management team succeeded in getting the deepfake images removed.
But she explained her relief was short-lived, saying: ‘However, there was no justice for me. There was no follow-up, there was no IP address tracing for who created this image, and there’s nothing to stop them from taking another image off my social media and doing it again.
‘I was like “What now?” How will we stop it from happening again to me or anybody else? I was told by the police, “[The image is] not you – it’s just your head”.
‘It was really distressing to have that conversation and realise that, if the image were real, I probably would’ve got more justice. That’s where the system is flawed.’
For her activism, she was awarded Glamour’s Woman of the Year Activist Award in October and has opened up about her experiences at parliamentary roundtables to raise awareness.
Detailing her motivation, Cally said: ‘This is bigger than me just being a bit hurt by someone creating an image.
‘This is happening to children and school girls – and at that young age, you want to fit in and be happy and not have that stress of embarrassment. It’s a difficult time being a teenager as it is.
‘That’s when I thought, “No, I need to get involved and do something about it.” The person who created the images of me would never in a million years have thought I was capable of this.
‘I think they thought the images would make me crumble, be very sad, be ashamed, and damage my career, which they know I worked so hard for… but I will always try and change a negative into a positive. And if it can help other people, then there’s no question about it.’
This month saw the Government crack down on explicit deepfakes, introducing new legislation to prosecute offenders that could potentially lead to longer sentences.
The Online Safety Act 2023 criminalised the vile practice for the first time, making it a criminal offence to share, or threaten to share, a manufactured or deepfake intimate image or video of another person without his or her consent.
However, it was criticised for not going far enough as it was not intended to criminalise the creation of such deepfake content.
But the new law will mean anyone who makes sexually explicit ‘deepfake’ images of adults maliciously and without consent will face prosecution.
It also strengthens the existing offence, as creating and then sharing the image could lead to a person being charged with two offences, facing a criminal record and an unlimited fine.
In a statement on the new legislation, Victims Minister Alex Davies-Jones said: ‘It is unacceptable that one in three women have been victims of online abuse.
‘This demeaning and disgusting form of chauvinism must not become normalised, and as part of our Plan for Change we are bearing down on violence against women – whatever form it takes.
‘These new offences will help prevent people being victimised online. We are putting offenders on notice – they will face the full force of the law.’
Meanwhile, Technology Minister Baroness Jones said: ‘The rise of intimate image abuse is a horrifying trend that exploits victims and perpetuates a toxic online culture.
‘These acts are not just cowardly, they are deeply damaging, particularly for women and girls who are disproportionately targeted.
‘With these new measures, we’re sending an unequivocal message: creating or sharing these vile images is not only unacceptable but criminal.
‘Tech companies need to step up too – platforms hosting this content will face tougher scrutiny and significant penalties.’
Addressing the new legislation, Cally told GMB she was glad to see change being made, but pointed out the new law still allowed loopholes.
She said: ‘I hope it will have an effect, I guess time will tell. It’s good that it’s being noticed, so that’s a positive for now.
‘But a key word that we were speaking about was consent. When the legislation is written out, it cannot state whether they [the creators] were trying to cause harm or intended to harass that person.
‘But there’s so many loopholes within that, so that’s why it really needs to be based around completely being an offence.’
She added: ‘I hope this law gives the police and CPS the powers to prosecute these men. But you still have a problem that many of the sites hosting the images are outside the jurisdiction of British police. It’s up to the platforms to sort that now.’
Campaigners are calling on social media platforms to ban advertisements for websites and apps that allow people to create sexually explicit deepfakes.
They also want the new legal measures to close the loophole that could allow offenders to avoid prosecution by claiming they had not meant to cause distress.
Some have even called for perpetrators to be placed on the sex offenders register to deter others from committing the crime.
Voicing her thoughts on the idea, Cally said: ‘It would be a huge deterrent if these men were put on the list. It would make people think twice about doing this.
‘I don’t think people will really take this seriously until there are some big prosecutions.’
Meanwhile, Vicky Pattison has created her own deepfake porn video to properly understand the violation of image-based abuse for her upcoming Channel 4 documentary.
My Deepfake Sex Tape is set to air next Tuesday and will see the former Geordie Shore star, 37, fully immerse herself in the world of this rapidly evolving violation of privacy.
Vicky has directed and produced the deepfake porn footage of herself and will release the video on social media to understand the way the content spreads online and how these images and videos are taken down.
To ensure the production was fully consensual, her likeness in the video is portrayed by an actor, with Vicky’s face superimposed on top using AI technology.
The video appears to show Vicky engaged in a sex act with a man while first wearing a bra and then later, topless.
Speaking about the decision to make her own explicit deepfake, Vicky said: ‘I am hugely passionate about women’s issues, and have found myself increasingly disturbed by how prevalent the problem of deepfake porn is becoming.
‘As part of the documentary, I have made the challenging decision to release my own deepfake sex tape online, which I directed and produced with actors to ensure the process was fully consensual from start to finish.
‘I wrestled with this decision for a long time, mulling over the permanence of it, and ultimately coming to accept the fact that this content may live online forever.
‘Whilst I know this doesn’t compare to the distress and horror actual victims feel when they discover this content of themselves, I hope it will give some insight into what they go through.
‘I want this documentary to bring attention to the imbalance of power and encourage society, lawmakers, and tech companies to provide stronger protections and support for those who are affected.
‘My goal is to foster empathy and drive action and to contribute to a larger movement for justice and change, so victims feel supported, understood, and empowered to reclaim their voices and control.’
Revenge Porn Helpline provides advice, guidance and support to victims of intimate image-based abuse over the age of 18 who live in the UK and can be called on 0345 6000 459.