How To... Use Tech & VFX Creatively
Jonathon Tustain is well-versed in using VFX realistically, but he's excited about the future possibilities of the tech.
Synthespians, virtual actors, cyberstars, or "silicentric" actors - these terms wouldn't be out of place in a sci-fi novel, but they're increasingly becoming part of the post-production glossary.
It's a concept first explored in the 1981 satirical film Looker, which imagined Cindy, a virtual actor, in a story about a perfection-seeking advertising firm that replaced its models with digital copies. The motives there were dark, but in today's commercials the creative opportunities are far less sinister.
Virtual actors have appeared frequently in movies, sometimes in unfortunate circumstances. After Paul Walker died in a car crash during the production of 2015's Furious 7, his character was able to remain on screen thanks to the help of his brothers as stand-ins, CGI, careful lighting and clever camera angles.
A similar digital reincarnation was applied to Brandon Lee, who died in 1993 while shooting The Crow. To complete the film, his stunt double Chad Stahelski served as a stand-in and special effects were used to recreate Lee's face.
Digital recreation serves as more than just a sticky plaster for tragedy; some studios see the technology as a tool for convenience, cost saving and creativity. James Cameron saved a fortune on catering alone in the production of his 1997 epic Titanic by populating the decks of the ill-fated ship with digital actors.
Even the baby in 2006's sci-fi Children of Men was entirely computer-generated. Framestore modelled, rigged and textured a baby based on a prosthetic prop before compositing the animated object into the final birthing sequence. The results are incredibly realistic and marked a milestone in VFX history.
Framestore also revived the late Audrey Hepburn for a Galaxy ad - not an easy task, according to the studio's CCO and co-founder Mike McGee.
“We noticed we could make a still of Audrey Hepburn quite quickly with Photoshop and the skill of an artistic painter,” he says. “However, it's a different story when they start to move. There are so many subtleties in the way the face moves, the behaviour of the skin and the way it reflects and refracts light, the way it stretches and folds. We are so used to seeing ourselves in the mirror [that] we notice anything that isn’t real, which is where the expression ‘uncanny valley’ came from. I think one day, digi-doubles will become as viable as hiring real actors, but for now we are restricted by the time-consuming, highly-skilled process of hand animating an emotional, nuanced human performance.”
But what about today's virtual actors, built from living performers rather than from archive footage of stars from the past?
Photogrammetry is one technique that can generate extremely lifelike human characters, which can then be made to act through realistic animation.
Brighton-based Metapixel houses one of the most advanced photogrammetry rigs in the world, and co-founder Robbie Cooper looks forward to giving movie studios the chance to extend the celebrity value of popular actors.
“A library of scan data could become part of an actor's portfolio, playing themselves at different ages," he says. "This could potentially fragment the job of being a performer into different disciplines. It is already happening to an extent, but what if the visual and charismatic dimension of being a star truly becomes the only real 'must-have' for the job? Because if you can't act, someone else can do it for you. There are actors alive today that are able to sell their future image rights for films they are going to be in when they are dead, adding an inheritance value to their estate.”
The likes of Kate Mulgrew, Michelle Pfeiffer, Denzel Washington and Tom Cruise have already been digitally ‘preserved’, but how is this type of immortality achieved?
In effect, it is like taking a 'digital photocopy' of a real actor. Metapixel's rig houses 105x36 cameras, which simultaneously photograph an actor from every conceivable angle. Within just ten minutes, that data is rendered and stitched together into a model that can be placed within a game, commercial, movie or virtual reality experience. The final result is a representation that is indistinguishable from the real thing.
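The capture logic above can be sketched in miniature. This toy Python example (not Metapixel's actual software - the camera count and ring layout are illustrative assumptions; a real rig arranges cameras over a full sphere) shows why so many simultaneous cameras matter: with enough of them, no angle of the subject goes unseen.

```python
import math

def camera_positions(n_cameras, radius=2.0):
    """Evenly distribute camera positions on a ring around the subject."""
    positions = []
    for i in range(n_cameras):
        angle = 2 * math.pi * i / n_cameras
        positions.append((radius * math.cos(angle), radius * math.sin(angle)))
    return positions

def capture_all(positions):
    """'Fire' every camera at once, recording each viewing angle.
    Simultaneity matters: the subject must not move between exposures."""
    return [{"angle_deg": math.degrees(math.atan2(y, x)) % 360}
            for x, y in positions]

def max_angular_gap(shots):
    """Largest unseen arc between neighbouring cameras, in degrees."""
    angles = sorted(s["angle_deg"] for s in shots)
    gaps = [angles[i + 1] - angles[i] for i in range(len(angles) - 1)]
    gaps.append(360 - angles[-1] + angles[0])
    return max(gaps)

shots = capture_all(camera_positions(105))
print(round(max_angular_gap(shots), 2))  # with 105 cameras, no gap exceeds ~3.43 degrees
```

The real pipeline then solves for each camera's pose and triangulates millions of surface points from the overlapping photographs - the "stitching" step the article describes.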
That scan data can then be driven by motion data captured from actors wearing motion capture suits.
"The first stage on a digi-double is you get their scan and textures," says Phil Stilgoe, founder of motion capture studio Centroid. "It is all rigged up and retopologized (the level of detail is reduced to enable animation). You use the same actor to create the full range of movements in a motion capture suit and that can then apply very cleanly onto the scan. They are the basic components of a digi-double pipeline."
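Stilgoe's pipeline - scan, retopologize, then drive with mocap - can be sketched as a toy Python example. This is a deliberately crude illustration, not a production tool: real retopology rebuilds the mesh's edge flow rather than thinning vertices, and real rigs use skeletal skinning rather than a single rigid offset per frame.

```python
def retopologize(mesh, keep_every=4):
    """Reduce the scan's vertex count so the mesh is light enough
    to animate -- a crude stand-in for real retopology tools."""
    return mesh[::keep_every]

def apply_mocap(mesh, frame):
    """Move every vertex by one frame of motion-capture data,
    here simplified to a single rigid (dx, dy, dz) offset."""
    dx, dy, dz = frame
    return [(x + dx, y + dy, z + dz) for x, y, z in mesh]

# Raw scan: a dense cloud of vertices straight from photogrammetry.
scan = [(i * 0.1, 0.0, 0.0) for i in range(1000)]
rigged = retopologize(scan)                      # 1000 -> 250 vertices
animated = apply_mocap(rigged, (0.0, 1.5, 0.0))  # one mocap frame
print(len(rigged), animated[0])
```

Because the same actor supplies both the scan and the motion, the captured movement maps onto the model's proportions with minimal cleanup - which is why the transfer applies "very cleanly", as Stilgoe puts it.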
The technology can also make the ‘living’ achieve the impossible, a prime example being Hahn Brewer’s Never Settle TV commercial (below). As people jump from scene to scene, what the viewer sees is mainly animated scans, which Ogilvy Sydney used to illustrate the brief ‘Get out of your comfort zone’. A similar effect was achieved in the Bond movie Spectre, where scans of Daniel Craig’s head were composited onto stuntmen.
It's not just people that can be given the digi-double treatment. To mark the release of Cinema 4D R18, Maxon commissioned design and motion studio ManvsMachine with a brief to showcase its new tools.
They chose to 3D scan taxidermy animals, which they incorporated into a breathtaking sequence that can be seen in VERSES.
What does the future hold?
A visual representation of an actor is one thing, but acting is more than just looks - what about personality? This is where the next phase of synthetic acting is heading: personality agents.
The goal is to make real-time actors more autonomous, able to act by themselves and interact with others. The cost benefits to both actor and studio are huge - an actor could license their personality and synthetic avatar for a fee, without the 18-hour days.
“I can see a time when avatars can react to you based on your facial performance captured by a webcam,” says McGee. “The avatar in the scene can react to your emotion accordingly, based on the narrative or emotion the storyteller wants you to feel. In the future we can imagine having a cup of tea with Audrey Hepburn and asking her about her life in VR. Her AI profile can then access a database of facts about her life and instantly answer back.”
Major brands and agencies are exploring virtual reality as a way to capture the attention of new audiences, and it is here that digi-doubles could thrive, according to Cooper: “VR has focused very much on 360 degree video. It has been a good stepping stone but I think most people now realize that it is actually more limited than regular video. What is happening now is that people are diving into the world of what’s possible by animating photogrammetry models of people, and driving that with AI, as in a video game. It’s an amazing way to give the audience emotional engagement and agency in this type of content.”