- Currents
- Season 1
- Episode 35
VFX Artist Breaks Down This Year's Best Visual Effects Nominees
Released on 02/06/2020
My name's Kevin Baillie,
and I'm a visual effects artist.
I got my start in the industry when I was 18 years old,
on Star Wars: The Phantom Menace,
and have since worked on movies
like Harry Potter and the Goblet of Fire,
Transformers: Age of Extinction,
and a whole bunch of movies with Robert Zemeckis.
We're here today to talk about
the five visual effects Academy Award nominees,
and what makes them so groundbreaking.
What's really exciting to me as a visual effects artist
about this year's field of nominees
is how diverse they are.
Some years you have three or four really similar films,
and one kind of stands above the rest,
and it's an easy guess.
This year, it's everything from digital characters,
to de-aging, to invisible effects, virtual production.
Every film stands for something different.
[upbeat music]
First, let's look at Avengers: Endgame.
Avengers: Endgame, which ties up a whole arc
of the Marvel Cinematic Universe,
really built upon successes of previous Marvel films
with their digital characters.
Thanos was an amazing performing digital character
in Infinity War that made use of machine learning
and all kinds of cool tech
to bring him to life,
and they just elevated that technology
to a whole new level.
And not only was there Thanos,
in this film we also had smart Hulk, right?
So we had multiple digital characters
that were performing, emoting,
that the audience could really connect with.
In previous Marvel movies,
the Hulk has always been this sort of raging,
stupid embodiment of anger,
and he didn't really have to connect with the audience
on any other level than that.
With smart Hulk in Avengers: Endgame,
he actually has got his wits about him.
Maybe smash a few things along the way.
I think it's gratuitous, but whatever.
[growls]
[Kevin] And he's sitting there performing
alongside the live-action actors,
and needs to be up to the very same level as them
with every nuance of performance that he gives.
And now look at me.
Best of both worlds.
There was some pretty amazing technology
that went into bringing Thanos
and smart Hulk to life in Avengers: Endgame.
In the past there have been, you know,
techniques used where there are thousands
of dots on an actor's face,
and every single one of them is trying
to drive like a little piece
of a digital character's performance,
but there's always stuff that happens
in between the dots that we miss, right,
and we end up with this kind of uncanny valley effect
with digital characters.
Well, by using machine learning,
they spent a lot of time actually teaching the computer
how these faces should move,
and then when the actor goes to perform
with these head mounted cameras on,
they're actually not capturing that much data.
They're just sort of getting the gist
of what the actor's doing.
When I had the gauntlet, the stones,
I really tried to bring her back.
And they feed that to the machine learning algorithm,
and it analyzes what the face is doing,
and it effectively fills in the blanks,
and translates that to smart Hulk's performance.
So they actually use a lot of different sets
of input data, what we call training data,
to learn how an actor's face should move.
So there is a very, very high resolution scan
of the actor's face, captured with hundreds of cameras
and multiple lights, that tells us not only
how their face is shaped,
but how their skin reacts to different kinds
of lighting hitting it.
And then we also put them through
what we call a facial range of motion, right,
which is them going through every expression
you can possibly imagine.
It's actually very tiring going through this,
but what it allows us to do is to see
down to a pore level of detail
what their skin does in three dimensions,
as they go from a smile to a frown, for example.
Terabytes of data go into training
the machine learning algorithms,
and that's what allows us to capture
very little data on set.
Then we can correlate that small amount of data
with this intense dataset that we have for training,
to create an incredibly rich performance
of a fully digital character.
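To make that idea concrete, here is a minimal sketch of the general approach, not Marvel's actual pipeline: assume the high-resolution scans and range-of-motion sessions provide paired training data, and a simple regression model (with purely synthetic numbers for illustration) fills in a full rig pose from the sparse head-mounted-camera signal.

```python
# Illustrative sketch only (not the studio's real system): learn a mapping from
# a sparse set of head-mounted-camera measurements to a dense facial rig, using
# a library of high-resolution "range of motion" captures as training data.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_scans = 2000          # frames from the high-resolution facial range of motion
n_sparse = 60           # coarse measurements available from the on-set camera
n_blendshapes = 300     # dense rig controls that drive the digital character

# Pretend training data: dense rig poses and the sparse signals they would produce.
dense_poses = rng.normal(size=(n_scans, n_blendshapes))
projection = rng.normal(size=(n_blendshapes, n_sparse)) / np.sqrt(n_blendshapes)
sparse_signals = dense_poses @ projection + 0.01 * rng.normal(size=(n_scans, n_sparse))

# "Teach the computer how the face should move": fit sparse input -> dense pose.
model = Ridge(alpha=1.0).fit(sparse_signals, dense_poses)

# On set we only capture the sparse signal; the trained model fills in the blanks.
on_set_frame = rng.normal(size=(1, n_sparse))
full_performance = model.predict(on_set_frame)   # dense rig pose for the shot
print(full_performance.shape)                    # (1, 300)
```

The real systems use far more sophisticated learned models, but the shape of the problem is the same: sparse capture in, dense performance out.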
[dramatic music]
In addition to these stunning digital characters
in Avengers: Endgame, there are entire worlds
that are designed from scratch,
just sort of out of people's imagination,
built down to every blade of grass.
This film had 2,500 individual shots
that had visual effects in them,
and keeping that consistent across the whole film
is a huge, huge task at this scale,
and they really nailed it.
[Rey] People keep telling me they know me.
No one does.
[Ben] But I do.
Star Wars: The Rise of Skywalker
was a film that really played heavily on nostalgia
from a visual effects perspective,
and a storytelling perspective.
Not only did the visual effects team
have to kind of build on this Star Wars universe
that we've become so used to seeing,
and have it feel really authentic,
and physical, and gritty, and grounded,
but they had to bring multitudes
of digital and actual live-action puppets to life.
Star Wars has always been groundbreaking
from a technology perspective,
but it's also always been grounded in reality, right?
The world just feels tactile,
and in this film they went to huge lengths
to pay respect to how the original films were made.
They used tons of practical models,
and actual physical set builds,
in addition to a lot of digital techniques
that are cutting edge even for today.
That real blend of the physical with the state of the art
is part of the Star Wars DNA.
One of the biggest challenges we always have
in figuring out how we're gonna film a movie
is deciding what's gonna be real
versus what's gonna be digital.
In the scene that takes place on the sunken Death Star,
Roger Guyett, the visual effects supervisor,
and his team had to decide
what water should be real versus what should be digital,
and they ended up doing the vast majority
of the water digitally.
They actually created an entirely new ocean pipeline,
in order to be able to execute those shots.
But not all of the water was digital.
One of the really important things for us
as a visual effects team
is to make sure that the actors always have context
to help with their performances,
especially when it's something
that they need to touch.
Having something there for real,
like these waves that are crashing over them,
is critical to achieving a believable illusion.
A really touching part of Star Wars: The Rise of Skywalker
was seeing Carrie Fisher onscreen again,
playing Princess Leia posthumously.
Instead of using the digital facial techniques
that are heavily used by some
of the other contenders this year,
they opted to use Carrie Fisher's real face,
taken from footage from past films,
and augment it with digital hair
and a digital body around her,
so that she fit into her environment.
I think that that really helped Carrie's soul
come out onscreen, and it was really appropriate
for that moment, right?
So, Thanos absolutely deserves to be a digital character.
For Carrie in this film, the only way to do her
was to actually use her.
Since my days as an 18-year-old on Star Wars: Episode I,
digital effects have come so, so far.
Back then we really had to be restrictive
about how we used the effects,
so digital characters were incredibly difficult to achieve.
We had to make sure that the camera wasn't moving too much
in any one shot, otherwise it would make
the shots take a long time, and be way too expensive.
Now, we're sort of freed up to be able
to do almost anything that we can imagine,
and so what filmmakers are challenged with these days
is not to ask whether something can be done,
which is what we faced back in the days
of Star Wars: Episode I, but should it be done,
and how does it service the story?
And I think Star Wars has evolved as a franchise
into a great example of using
the right tool for the right job.
In your own time, gentlemen.
Must be something big if the General's here.
[General] They're walking into a trap.
Your orders are to deliver a message,
calling off tomorrow morning's attack.
If you don't get there in time,
[explosions]
we will lose 1600 men.
1917 is a movie that you might not look at
and say, Oh, that's a visual effects movie.
You would be sorely mistaken.
It is full of visual effects, beginning to end.
The whole movie plays as one continuous shot,
with no cuts that you can actually see
in the film,
and that required an immense amount of work
from the visual effects team.
The film was actually shot in several different locations,
some of them outdoors, some of them inside on sound stages,
over the course of several months.
And so how do you make that all look
like one seamless piece of storytelling?
Between each piece of footage,
shot on different days in different places,
digital effects artists had to seamlessly blend
from one to the next,
in a way that the audience can't actually perceive it.
One of the more dramatic examples
of one of these digital blends is a scene
where our hero character is running out of a village
that's on fire, away from gunmen,
jumps off a bridge and into a river,
and then floats down the river.
As the actor came around and jumped off the bridge,
he was transitioned to a fully digital actor
until he landed in the water,
and then he became that same actor,
but on a different day, in a different place.
So, just that one example is thousands
of man hours of visual effects time,
to bring that to life.
In addition to all of the digital blends,
what a lot of people don't realize
is how much digital work is done
to actually plus out this World War I period world.
The tanks that are stuck in giant craters
are actually fully digital.
They were never built for real.
The big fields that have tons
of spent ammunition strewn all over,
very little of that was actually real.
And it all felt incredibly grounded.
It felt like it was really there,
and to me those kinds of effects,
what we call invisible visual effects,
that are there to support the story,
rather than be the story,
are some of my favorite kinds of effects.
It was like the army.
You followed orders,
you did the right thing,
you got rewarded.
The Irishman featured one specific technique,
and that's what got it nominated for an Oscar.
That is the de-aging of some of the most well-known
and well-loved actors of our time,
and to do that for the entire film,
without falling into the uncanny valley,
is a massive challenge.
What the visual effects supervisor, Pablo Helman,
and his team did on that film
was to actually use the actors' facial expressions
on a movie set as ground truth,
and they made, effectively, a digital mask
that went over each actor's face.
To make these digital masks move perfectly,
they filmed every actor with not one camera, but three.
There was the main camera, the normal camera
that you would use to shoot a movie,
and then two of what we call witness cameras,
one on either side.
And between those three cameras,
and a special piece of software
that Industrial Light and Magic wrote called Flux,
they were able to actually analyze every movement
of an actor's face.
Flux was actually able to figure out
exactly what each actor was doing,
create the younger version of them,
and then that would be superimposed
on the older version of the actor,
and that became the final result.
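As a rough illustration of what multiple calibrated cameras buy you, here is a toy triangulation sketch. It is not ILM's Flux; it only shows how a single 3D point on a face can be recovered once the same point is seen by the main camera and two witness cameras with known positions (all camera parameters below are invented for the example).

```python
# Conceptual sketch only (not ILM's Flux): recover a 3D point on the face from
# its 2D position in several calibrated cameras, via linear triangulation.
import numpy as np

def triangulate(projections, pixels):
    """projections: list of 3x4 camera matrices; pixels: list of (u, v)."""
    rows = []
    for P, (u, v) in zip(projections, pixels):
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    _, _, vt = np.linalg.svd(np.asarray(rows))
    X = vt[-1]
    return X[:3] / X[3]          # homogeneous coordinates -> 3D point

# Three cameras: the main camera plus two witness cameras offset to the sides.
def simple_camera(tx):
    K = np.array([[1000.0, 0, 640], [0, 1000.0, 360], [0, 0, 1]])  # intrinsics
    Rt = np.hstack([np.eye(3), np.array([[tx], [0.0], [0.0]])])    # pose
    return K @ Rt

cams = [simple_camera(tx) for tx in (-0.2, 0.0, 0.2)]
point = np.array([0.05, -0.02, 2.0, 1.0])          # a point on the actor's face
pix = [((P @ point)[0] / (P @ point)[2],
        (P @ point)[1] / (P @ point)[2]) for P in cams]
print(triangulate(cams, pix))                       # ~ [0.05, -0.02, 2.0]
```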
It was really important to Martin Scorsese
to be able to shoot this like a normal movie,
without motion capture, right?
He didn't want a bunch of technology getting
in the way of the process.
So the two cameras, the witness cameras
that are on either side of the main camera,
actually shoot infrared footage,
because that lets the visual effects team
light the scene in a way that still allows those cameras
to actually see the actor.
We can't see infrared light,
so we can just pump infrared light into the set
and see really clearly what the actors are doing,
even though Scorsese wants to light it
as a really dark, moody scene.
So, what The Irishman visual effects team did
was actually brilliant:
they figured out how to get out of the way
of the filmmaking process of this, you know,
genius in Scorsese, and still get all the data
that they needed.
A really important part of bringing these younger versions
of these actors to life was actually using thousands
of images and video clips
from each actor in their younger days,
so that they could not only build a face
that looks kind of like what we remember them
looking like in the past,
but also help train algorithms
within the Flux software
to make the faces move authentically.
Where are we gonna go but up?
When watching The Irishman
for all 3 1/2 hours of the movie,
I actually felt like they were really successful
in bringing these younger versions
of the actors to life.
There was some criticism that, you know,
the posture of the actors
was a little more like an old man's
than their younger selves',
but to be honest, I think the sign of success
with any effect is whether it helps
to engage you in the movie
or bumps you out of the movie,
and for the duration of The Irishman,
I was just fully engaged in those characters.
So the little technical flaws,
in this instance posture,
just didn't stand out to me.
I've heard a lot of actors talk about The Irishman
as giving them a new lease on life, right?
The fact that these actors could perform
as younger versions of themselves,
and do so so convincingly,
I think has pretty big ramifications
for storytellers moving forward.
We can now look at casting actors
based on their personality and their fit for a role,
and less for their age.
Some people talk about the ethical concerns
of being able to create a digital version
of any actor out there,
and ask, oh, are we gonna replace actors one day?
And I think The Irishman is proof positive
that that's just not gonna happen; it's nonsensical.
There is no amount of digital wizardry,
I promise you this, that is gonna bring a performance
to the level that a De Niro is gonna bring
to the screen, right?
And that foundation, that soul of the actor,
that's what we're responding to
from an emotional perspective.
And as The Irishman proved to us, the digital wizardry,
rather than replacing an actor's performance,
is complementing it.
It's actually helping to highlight it
and build upon it,
and as a storyteller, that's
an incredibly exciting prospect to leverage
more and more into the future.
[Circle of Life]
In The Lion King, Rob Legato,
the visual effects supervisor, and his team,
created an entirely digital world
for the story to take place in.
In fact, there's only one shot in the movie,
the opening shot, that was filmed in Africa.
Everything else is completely virtual,
but it feels like real life.
It feels like you're watching
a National Geographic documentary.
Now the danger of having a completely virtual world
that you can do anything you want in,
is that it could end up looking like a video game.
No matter how realistic the grass looks,
if the camera's not moving correctly,
and if the lighting isn't cinematic,
it's just not gonna work.
So, what Rob Legato and his visual effects team did
on the film is they designed virtual production tools
that allowed real camera teams,
with real camera cranes, dollies,
and Steadicams, to puppeteer
these digital cameras in the virtual world.
And so what you end up with is a feel to the movie
that is every bit as naturalistic
as if it had been shot on the African plains.
The thing that's always a dead giveaway
with digital effects is when things are too perfect, right?
The natural world has a certain amount of chaos
that is completely unavoidable,
and by using virtual production
and real camera equipment
to design these digital camera moves,
that imperfection that we're so used
to seeing in cinema is translated
onto these digital African plains,
so that it just looks like something
that we're used to seeing.
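Here is a toy sketch of that idea, not The Lion King's actual virtual production tools: a perfectly smooth virtual dolly move compared with the same move carrying the small drift and jitter a human operator would naturally add (all numbers are invented for illustration).

```python
# Toy illustration (not the film's real toolset): a mathematically perfect
# virtual dolly move versus the same move with hand-operated imperfections.
import numpy as np

frames = 240                                   # a 10-second move at 24 fps
t = np.linspace(0.0, 1.0, frames)

# Perfect virtual dolly: smooth ease-in/ease-out along one axis, in metres.
perfect_x = 3.0 * (3 * t**2 - 2 * t**3)

# Hand-operated feel: low-frequency drift plus tiny high-frequency jitter,
# the kind of "chaos" a crane or Steadicam operator naturally introduces.
rng = np.random.default_rng(7)
drift = 0.02 * np.sin(2 * np.pi * 0.7 * t + rng.uniform(0, 2 * np.pi))
jitter = np.convolve(rng.normal(scale=0.004, size=frames),
                     np.ones(5) / 5, mode="same")
operated_x = perfect_x + drift + jitter

# A few centimetres of deviation is all it takes to make the move feel "alive".
print(float(np.abs(operated_x - perfect_x).max()))
```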
Predicting who's gonna take home the Oscar this year,
and I'm not copping out here,
is just totally impossible.
All of the films that are nominated are so different,
and I really think that the winner is probably gonna be
the story that ended up connecting
with audiences the most.
[upbeat music]
Director: Ryan Loughlin