A CREEPY video shows the Mona Lisa painting coming to life and talking in a chilling glimpse into the future of fake news.
Samsung’s artificial intelligence lab in Russia has turned Leonardo Da Vinci’s masterpiece and dead celebs such as Marilyn Monroe and Albert Einstein into moving images.
While terrifying fake videos are nothing new, what the computer geeks in Moscow have managed to do is create them using only one image.
Previously, researchers would take multiple pictures of a celeb before “mapping” them onto the moving features of another person to create the so-called ‘deepfake’ in a technique known as “puppeteering”.
And while using several images creates a more convincing fake, the prospect of hackers being able to create bogus videos using a single photograph is frightening.
In the Mona Lisa example, the team shows how the animated painting looks slightly different depending on the person whose face is used behind the famous image.
BRINGING OUT THE DEAD
The authors said: “We show that such an approach is able to learn highly realistic and personalized talking head models of new people and even portrait paintings.”
But there are concerns about how the technology could be used in the wrong hands, particularly in the murky world of political propaganda and fake news.
Last year, lawmakers in Washington warned that such bogus videos could be a threat to national security.
Already there have been videos created featuring former US President Barack Obama appearing to speak like Russian leader Vladimir Putin.
Female celebs are already suffering because of the creepy technology.
Last year, a dark piece of software called FakeApp emerged allowing users to superimpose the face of their favourite star onto an actress in a porn film.
Its origins can be traced back to a single Reddit user going by the moniker “deepfakes”, who edited Wonder Woman star Gal Gadot’s face onto a porn star’s.
What frightened the online community was that he did not seem to be a computer whizz, but rather a Joe Bloggs using publicly available, Google-developed software.
Shortly after Motherboard outed him, he created a subreddit which quickly amassed tens of thousands of subscribers.
Another Reddit user pounced on this growing momentum by creating FakeApp, dubbed a “user-friendly” system which allows those with no computer knowledge to do exactly the same.
The videos have now reportedly spread to Pornhub and YouPorn.
Is “deepfaking” illegal?
Making such videos is certainly a violation of a person’s privacy, but the apps behind them could also be illegal, according to Andrew Murray, Professor of Law at the London School of Economics.
He told The Sun the actresses could sue for defamation should they be viewed “less favourably by members of society” as a result.
Murray added: “More likely such images could be viewed as forms of harassment, which they could report to the police.”
Simon Miles, a partner at intellectual property specialists Edwin Coe, said such acts could also amount to “unlawful intrusion into the privacy of the particular celebrity”.