Are deepfakes the new revenge porn?

New face-swap technology means this is a growing issue
Guidance: contains adult themes
"Emma Watson" is naked, kneeling on a white sofa. Nearby, "Maisie Williams" is sat legs akimbo, masturbating. "Gal Gadot" is riding cowgirl. "Jennifer Lawrence" is doing it doggy style.
Where are we? Not in the murky depths of a horny teenager's fantasy, but on the site Pornhub, where the only differences between the Hollywood film star Emma Watson and the fake porn star "Emma Watson" are a few pixels, several layers of clothing, and a total lack of consent. Welcome to the world of deepfakes: a new kind of X-rated identity theft.
Deepfakes are porn videos doctored using AI face-swap technology, so the adult performer's face is replaced with somebody else's. Only six months ago, making sexy switcheroos required mind-numbingly complex coding and a hell of a lot of free time.
But, now, a "deepfake porn" internet search brings up almost 700,000 results. The tech has exploded so quickly, and become so easy to use, that pretty much anyone with the urge can build face-customised porn in around 12 hours. That's less time than it takes to arrange a real-life hook-up on a dating app.
Celebrities are the most popular victims. This kind of face-fakery actually began in Hollywood. "The technology to superimpose celebrity faces onto other people's has been available to movie special effects departments for years," says Evgeny Chereshnev, CEO of security technology company BiolinkTech.

Emma Watson has been targeted by deepfakes
Advanced visual effects (VFX) are what enabled new scenes with Paul Walker to be "filmed" for Fast and Furious 7, after he had died. The tech was also used to resurrect the late Peter Cushing as Grand Moff Tarkin in Rogue One.
Doctored photos aren't anything new, but the availability and level of realism of modern deepfakes is.
Deepfakes got their name from a Reddit user. Last year, "deepfakes" perfected a complex algorithm that created creepy videos appearing to feature Gal Gadot, Taylor Swift and Scarlett Johansson performing pornographic acts. Even if your brain knows that Taylor couldn't possibly be the person doing that, your eyes are pretty convinced.
Then, in January, another Redditor created a free app called FakeApp with an in-built algorithm to do the face-swapping for you. You'd need a high-powered computer (the kind a filmmaker would have), a graphics processing unit (GPU) and enough images of your target - which, due to social media, YouTube and even LinkedIn, isn't that hard - but, theoretically, anyone could turn a porn star into someone else.
Now, basic versions are available to ordinary people: for £20 a month, Adobe can supply the tools to create a digital copy of someone - though expensive professional software would be needed to take it to the next level. Adobe VoCo, a kind of Photoshop for audio, even enables the user to recreate someone's voice after only 20 minutes of listening to it. It's still in the research stages, but other companies like Lyrebird have more basic software already available to use.
People can now replace porn actors' faces with those of their long-term crush, say. Or maybe they want vengeance on an ex, and they do it to sabotage the ex's career or new relationship. Or, indeed, to sabotage the career or relationship of anyone they're angry with.

Celebrities have an army of lawyers to tackle issues like this
Suddenly, it isn't just celebrities (with their armies of powerful lawyers) who could find "themselves" on people's laptops. It's you.
The lawyer Ann Olivarius, who has been working with victims of revenge porn since it was made a criminal offence in 2015, says she has received calls from clients saying they have been victims of deepfaking.
"It's a great concern to us," she says. "It's really devastating for these women because with new technology, it can look like the real thing. The intent behind it is always to hurt and to degrade."
She believes that deepfaking is part of the same problem as revenge porn. "There are so many different types of revenge porn out there," she says. "It's a growing problem, and it keeps manifesting itself in new ways."
While celebrities can call on expensive lawyers, and can potentially use defamation law to prosecute deepfakers (providing they can prove the image has caused, or is likely to cause, serious harm to their reputation), it can be a lot harder for ordinary people to take action.
In January, an Australian man was sentenced to 12 months in jail after Photoshopping his teenage stepdaughter's face onto women engaged in sex acts, including bestiality, and there have been other similar cases in the UK.
But, says Luke Patel, a specialist in privacy law at Blacks solicitors, "The influx of fast-paced developments in technology makes it very difficult for laws to keep up and adequately support victims."
The law currently makes no explicit reference to deepfakes, though on 25 May the General Data Protection Regulation (GDPR) will be implemented. It includes two new tools under the "Right of Erasure" and the "Right to Be Forgotten", which Luke believes could help "enable an individual to request the permanent removal or deletion of their personal data (including images) when there is no good reason for its continued publication" - though each case will be decided on an individual basis.
"It's not an absolute right, but the case is stronger if the image is unwarranted and causes substantial distress. Although," he continues, "they are still only tools that can be deployed when damage has already occurred." They won't stop it from happening in the first place.

Deepfaking could become the new revenge porn
If internet platforms stopped hosting deepfakes, that could stem the rising tide. Reddit has banned deepfakes, calling them an unacceptable form of "involuntary pornography".
Pornhub claimed in February to be following suit, but if you search for "porn" and "deepfakes" the top results are all on Pornhub. BBC Three contacted Pornhub for an explanation, and received this statement from its VP, Corey Price: "Nonconsensual content is not permitted on our platform as it violates our Terms of Service."
I flagged at least 20 deepfake videos on the site, including those featuring celebrities - which we could safely assume were not consensual. Pornhub responded that they have a Content Removal Page, where people can "request the removal of nonconsensual material. As soon as they make a request, we work to promptly remove the content. We also rely on our community and/or content owners to flag inappropriate content."
Creating deepfakes is becoming so easy it could become a party game: bring photos, booze, and, instead of watching YouTube videos, sit around and create stolen-identity porn.
âIn a couple of years, you could be able to go to a porn app store and buy VR sex experiences with anyone you want,â says Evgeny Chereshnev, of BiolinkTech.
You can already buy incredibly realistic sex dolls. âSoon,â says Evgeny, âtechnology could allow someone to steal your identity and order a sex doll with your face on it.â
The thought of a future where âyouâ could exist on someoneâs laptop, or where images of âyouâ could be created solely to be maliciously circulated - or where 'you' could even be sitting in sex doll form in the corner of someoneâs bedroom - is deeply disturbing. It may already be here.
This article was originally published on 24 April 2018.