Epic Games CTO talks virtual production, digital humans, and what’s next

Jun 06, 2019 at 10:00 am by Submitted

Epic Games CTO Kim Libreri

Kim Libreri, CTO at Epic Games since 2014, is something of a legend in the visual effects industry, with a career spanning more than two decades and two Sci-Tech Awards among his many accolades. His film credits include The Matrix—where he helped develop the iconic bullet-time effect—the Oscar-winning What Dreams May Come, and more recently, Poseidon, Speed Racer, and Super 8.

In this podcast, fxguide's Mike Seymour talks with Libreri about how virtual production techniques, many of which Libreri has pioneered, are changing the face of VFX pipelines. They also discuss how Libreri's team has advanced Unreal Engine technology by pushing it beyond its limits on special projects such as A Boy and His Kite, Hellblade: Senua's Sacrifice, Meet Mike, Siren, and the new Chaos demo, and talk about the importance of Real-Time Live! To wrap things up, Seymour asks Libreri to reveal what's next for Epic Games and real-time technology.

You can listen to the full podcast below, or read on for an overview.

Libreri joined Epic from Lucasfilm, where he'd been exploring the convergence of film and game technology as he worked on the groundbreaking Star Wars 1313 demo, which Seymour calls "the greatest game never released."

"I thought that real-time technology was going to change the way that we make entertainment of all forms, whether it be classic video games, linear entertainment, or interactive VR experiences," says Libreri. "There was an interesting melting pot of all sorts of cool new ideas; computer graphics—because it could be interactive—could do all sorts of amazing new things."

It's the interactive aspect—as opposed to just real-time playback—that makes game engine technology so interesting in virtual production, Libreri explains, citing the example of making a car chase. Instead of following premade storyboards and animatics, you can set up a simulated car and actually drive it with a steering wheel or joystick, improvise on the action, and then reproduce that with real cameras.

"It brings back the spontaneity and the experimentation that you have in the physical world because you're not thinking of this as just classic premeditated visual effects," he says, adding that one of the benefits of virtual production is seeing characters, props, environments, and shooting scenarios much earlier in the process. "You have time to iterate and not have this very sort of removed abstracted process that we've had for the last 20 years. So that's one of the big things that excites me—it's bringing a more immersive process to the directors and the filmmakers."

[Image: A Boy and His Kite]

Since arriving at Epic, Libreri and his team have worked on several projects in a bid to push Unreal Engine technology forward. The first of these was A Boy and His Kite, often referred to as simply Kite, which came out in 2015. At the time, its open-world concept and natural character animation style were very different from anything that Epic, then best known for first-person-shooter games, had produced before. From there, work began with 3Lateral (now part of Epic Games) on interactive digital humans, including Hellblade: Senua's Sacrifice in 2016, Meet Mike in 2017, and Siren at GDC 2018, demonstrating an incredible evolution from Kite in just two-and-a-half years.

[Image: Siren]

Libreri and his team have also worked on The Human Race with The Mill, the Reflections real-time ray tracing demo, and the demo for Unreal Engine's new Chaos physics engine.

Partly as a result of these special projects, Unreal Engine has also evolved, gaining new material systems for automotive rendering, the built-in compositing module Composure, the Sequencer Recorder that enables live performances to be recorded and edited on a timeline, and real-time ray tracing. The Chaos physics engine is due to be released in the next version, Unreal Engine 4.23.

[Image: Chaos physics demo]

Sequencer Recorder and other Unreal Engine technology components—such as ARKit support for iPhone X and improved skin, hair, and eye shaders—were pivotal to the Bebylon project, created by Kite and Lightning. This project was a winning entry in SIGGRAPH 2018's Real-Time Live! competition for the best in real-time graphics and interactivity, beating Epic's own Reflections entry in the process. Libreri, who had previously been part of the winning teams with A Boy and His Kite in 2015, Hellblade: Senua's Sacrifice in 2016, and The Human Race in 2017, was magnanimous in defeat to Kite and Lightning's co-founder, Cory Strassburger.

[Image: Kite and Lightning's Bebylon]

"To see Cory pull all this off with a brand new version of Unreal Engine—we deserved to be beaten on Real-Time Live!, for sure," he says. "Real-Time Live! has been such a catalyst for people trying to push the envelope in real-time entertainment."

So what's next for Epic? Libreri talks about extending the physics system to vehicles and characters, making digital humans more accessible to game developers and film companies, and continuing to evolve ray tracing and lighting. 

"It is inevitable that real-time rendering will reach the point where it matches movie quality sometime in the next decade," he says. "We're going to keep extrapolating."

This podcast interview with Kim Libreri is part of our Visual Disruptors series. Visit our Virtual Production hub for more podcasts, videos, articles, and insights.
