They call them “deepfakes.”
It’s the term for pornography made using artificial intelligence-assisted technology to superimpose a person’s face onto another performer’s body – essentially allowing the producer to create fake porn featuring celebrities, politicians or even average, everyday people.
In a report published Wednesday, Motherboard recounted how it discovered a Reddit user responsible for producing convincing porn videos featuring celebrities like Gal Gadot, Maisie Williams and Taylor Swift.
Pretty soon, the technology used to create “deepfakes” will be widely available enough to be used by extortionists and criminals with only a cursory understanding of how the software works. Another redditor discovered by Motherboard has even created an app specifically designed to let users without a computer science background create AI-assisted fake porn. All the tools one needs to make these videos are free, readily available and accompanied by instructions that walk novices through the process.
Two months ago, the first redditor mentioned above created a subreddit dedicated to the practice.
In that short time, the subreddit has already amassed more than 15,000 subscribers. Within the community, “deepfake” is now a noun for the kind of neural-network-generated fake video pioneered by the redditor of the same name, according to Motherboard.
Another “deepfake” auteur created FakeApp, a user-friendly application that lets anyone recreate these videos with their own datasets. The app is based on deepfakes’ algorithm, but the user behind it, who goes by deepfakeapp, built it without the original deepfakes’ help. While none of them divulged their identities to Motherboard, deepfakeapp said in a direct message that his goal in creating FakeApp was to make deepfakes’ technology available to people without a technical background or programming experience.
“I think the current version of the app is a good start, but I hope to streamline it even more in the coming days and weeks,” he said. “Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.”
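For readers wondering how a one-button swap like the one deepfakeapp describes can work at all, the approach widely attributed to these tools is an autoencoder with a single shared encoder and a separate decoder for each face: both identities are compressed into the same latent space, and the “swap” amounts to decoding one person’s frames with the other person’s decoder. The sketch below only illustrates that general idea; it is not FakeApp’s or deepfakes’ actual code, and the framework (PyTorch), image size and layer sizes are assumptions chosen for brevity.

```python
# Illustrative sketch of the shared-encoder / per-face-decoder idea behind
# face-swap "deepfakes". All architecture details here are assumptions, not
# the actual FakeApp or deepfakes implementation.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compress an aligned 64x64 face crop into a shared latent vector."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64x64 -> 32x32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32x32 -> 16x16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16x16 -> 8x8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstruct one specific identity from the shared latent space."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8x8 -> 16x16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16x16 -> 32x32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32x32 -> 64x64
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One encoder shared by both identities; one decoder per identity.
encoder = Encoder()
decoder_a = Decoder()  # would be trained to reconstruct face A
decoder_b = Decoder()  # would be trained to reconstruct face B

# Training (not shown) reconstructs each person's frames through their own
# decoder. The "swap" step: encode a frame of face B, then decode it with
# face A's decoder, yielding face A in B's pose and lighting.
frame_b = torch.rand(1, 3, 64, 64)      # stand-in for an aligned face crop of B
swapped = decoder_a(encoder(frame_b))   # face A rendered onto B's frame
print(swapped.shape)                    # torch.Size([1, 3, 64, 64])
```

In this framing, the “neural network correlated to a certain face” that deepfakeapp envisions users downloading would roughly correspond to a pre-trained decoder for that face; the hard, compute-heavy part is the training, while the swap itself is cheap.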
Peter Eckersley, chief computer scientist at the Electronic Frontier Foundation, fears the technology will soon reach the point where fakes are virtually indistinguishable from authentic videos.
Fakes posted in the subreddit have already been pitched as real on other websites; a deepfake of Emma Watson taking a shower was uploaded by CelebJihad, a celebrity porn site that regularly posts hacked celebrity nudes, as a “never-before-seen video” purportedly from the user’s “private collection.”
Other redditors have trained the software on video pulled from celebrities’ Instagram accounts and used the results to convincingly fake Snapchat messages.
“Deepfakes” are hardly a new phenomenon. Last July, we reported on a project conducted by Stanford’s Matthias Niessner that managed to create several faked videos of former US President Barack Obama.
Soon, this technology could create problems for everybody, from governments to corporations to the news media – which will now find it even more difficult to distinguish genuine “fake news” from reality.