
Deepfakes: Hollywood’s quest to create the perfect digital human

Every Hollywood actor is desperate to cling to their youth. Now, Will Smith, the star of Independence Day and Men in Black, can be 23 forever. But unlike his Botoxed peers, the secret of Mr Smith’s fresh face is a new breed of digital doppelgänger, offering unprecedented realism.

In Gemini Man, his latest blockbuster, which is released on Friday, the 51-year-old actor plays a retired assassin whose younger clone is sent to kill him.

The 23-year-old Smith clone, known in the movie as Junior, is not the real actor hidden under layers of make-up or prosthetics. Instead, he is a completely digital recreation, constructed from his skeleton to the tips of his eyelashes by New Zealand-based visual effects studio Weta Digital.

Hollywood insiders estimate that the Junior character alone cost tens of millions of dollars to make — perhaps twice as much as hiring the real Will Smith.

Yet just a few weeks before Gemini Man’s premiere, another, far cheaper, digital clone of Will Smith appeared online in scenes from 1999’s hit science fiction movie, The Matrix. In a two-minute YouTube video, Mr Smith took the place of Keanu Reeves to play Neo, taking the red pill and pausing bullets in mid-air.

The clip was made without Gemini Man’s $138m budget. Instead its creator, a YouTuber known only as Sham00k, employed free software called DeepFaceLab to superimpose Mr Smith’s face on to Mr Reeves’ within The Matrix footage. So-called “deepfakes” like these have been used to turn comedian Jordan Peele into Barack Obama or actor Bill Hader into Tom Cruise, with each clip more believable than the last.
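DeepFaceLab itself trains neural networks on thousands of frames of both faces before blending the result back into the footage. The short Python sketch below is only a toy illustration of the underlying idea of superimposing one detected face onto another, using OpenCV’s bundled face detector; the file names and the crude hard-paste approach are illustrative assumptions, not how Sham00k’s clip or DeepFaceLab actually work.

# Toy sketch of the face-superimposition idea (not DeepFaceLab's neural
# approach): detect the largest face in a source image and in a target
# frame with OpenCV's bundled Haar cascade, then paste the resized source
# face over the target's. File names below are hypothetical examples.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def largest_face(image):
    """Return (x, y, w, h) of the biggest detected face, or None."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    return max(faces, key=lambda f: f[2] * f[3])

def naive_face_swap(source_path, target_path, out_path):
    source, target = cv2.imread(source_path), cv2.imread(target_path)
    src, dst = largest_face(source), largest_face(target)
    if src is None or dst is None:
        raise ValueError("could not find a face in one of the images")
    sx, sy, sw, sh = src
    dx, dy, dw, dh = dst
    face = cv2.resize(source[sy:sy + sh, sx:sx + sw], (dw, dh))
    target[dy:dy + dh, dx:dx + dw] = face  # crude hard paste, no blending
    cv2.imwrite(out_path, target)

# Example with hypothetical file names: swap a face into one movie frame.
# naive_face_swap("smith.jpg", "matrix_frame.jpg", "swapped_frame.jpg")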

Deepfakes and the high-end effects seen in Gemini Man offer two alternative paths to manipulating people in videos. But as the two techniques converge, the cost of a fully digital human is plummeting. The “uncanny valley” is finally being bridged, prompting some in Silicon Valley to wonder when virtual assistants, such as Alexa, will no longer be just a disembodied voice.

“The price of realism has dropped dramatically in the last 20 years,” says Paul Franklin, cofounder and creative director at award-winning visual effects studio DNEG. “Things that were the domain of companies like DNEG can now be done with off-the-shelf software. It’s inevitable [that] the kinds of techniques in Gemini Man will be stock-in-trade in the next 10 years.”

The fact that it has never been easier for Weta wannabes to insert people into short videos has led to warnings from politicians, privacy activists and Hollywood itself. Convincing fake videos could be used to manipulate electorates, defraud companies or bully individuals — even if, for now, deepfakers’ principal hobby is to insert unwitting celebrities into pornography.

A report in September from Deeptrace Labs, a cyber security start-up whose technology detects manipulated videos, found that the number of deepfakes posted online had almost doubled in the past six months to 14,678. Of those, 96 per cent were classified as porn.

“It’s definitely evolving very fast,” says Katja Bego, a data scientist who is researching deepfakes at Nesta, a tech-focused non-profit organisation. Facebook, Google and Microsoft have driven efforts to improve deepfake detection, hoping to prevent misleading videos from spreading across their networks.

Creating realistic digital people the traditional Hollywood way is still a daunting task. Bringing Junior to life took “hundreds of hours of painstaking animators’ and modellers’ time”, says Stuart Adcock, head of facial motion at Weta, which was founded by Lord of the Rings director Peter Jackson. “At times it felt more like we were making a real human from the ground up than a visual effect.”

But with advances in machine learning and processing power available on smartphones and cloud computing systems, some predict that Gemini Man-style effects could one day become as accessible as selfie-retouching smartphone apps like Facetune are today.

“Deepfakes are the next step in a long chain of the democratisation of media production,” says Peter Rojas, a venture capital investor at Betaworks Ventures. “Deepfakes are the democratisation of CGI. It’s not that different to what blogging did for publishing.”

Deepfakes are barely two years old, but the biggest change in recent months is how little input data is required to create a convincing video. In September, Chinese app Zao caused a viral sensation by allowing users to trade places with Leonardo DiCaprio in a selection of scenes from movies such as Titanic. Because Zao’s range of clips is limited and pre-selected, the process takes just a few seconds and requires only a single photograph of the face-swapper.

“Before, it was easy to do this for celebrities and politicians because you have a tonne of moving footage for them [on the internet],” Ms Bego says. “Now you just need one picture of a normal person.”

Despite the pace of deepfakes’ progress, traditional Hollywood effects studios such as Weta see little application for the technology in today’s blockbusters.

Deepfakes may be popping up on smartphones in YouTube clips and Facebook feeds but in Gemini Man, Junior’s digital face is shown in lingering close-ups across a vast Imax screen. While the effect is more convincing in scenes set in dark catacombs than in bright sunlight, it has nonetheless been hailed as a breakthrough for human realism. The difference is obvious even from the effects of two or three years ago, such as Princess Leia’s brief appearance in the 2016 Star Wars spin-off, Rogue One.
