
Director Ang Lee explains why he built a digital Will Smith in ‘Gemini Man’

Before showing “Gemini Man” to a group of reporters last week, director Ang Lee described the movie as a “leap of faith.” Then, to illustrate how nervous he was, he pretended to bite his nails.

Was Lee just being self-effacing? Maybe. But afterwards, when we got a chance to grill him about the production, he had a single question in return: “Did you believe in Junior?” When we answered yes, his relief was palpable.

That’s because Lee is doing something — several things — genuinely new here.

Will Smith plays two characters in “Gemini Man”: a middle-aged government assassin named Henry Brogan, and his younger clone, Junior, who’s sent to kill his older self. Stuntmen stood in for Junior during many of the action sequences, and Smith contributed to the character through performance capture, but ultimately, Junior is a computer-generated creation from the team at effects house Weta Digital.

Lee contrasted Weta’s approach with the way other movies have experimented with using visual effects to de-age stars — he described them as just brushing away actors’ wrinkles: “When you do that, you take away all the details … Aging is much [more] complicated, it’s life.”


Will Smith as “Junior” in Gemini Man from Paramount Pictures, Skydance and Jerry Bruckheimer Films

Where other movies have limited this process to a handful of scenes (think Robert Downey, Jr. briefly playing a younger version of himself in “Captain America: Civil War”), Lee noted that in “Gemini Man,” Junior is one of two lead characters. That meant he needed to be more than a “gimmick” — and it would have been prohibitively expensive to apply that “handcrafted brushing” to so many shots.

Lee made things even harder for himself by shooting the movie in 3D, at 120 frames per second. In that format, everything looks clearer and more detailed than in traditional film, so an unconvincing effect would be even more obvious.

“You see through people like light,” Lee said. “With that requirement, I just don’t think something that erases age will do. You have to create it from zero.”

Apparently, that creation process took two years. And while I wouldn’t describe the results on-screen as completely photo-real, I thought they worked: I never forgot that Junior was an effect, but I also believed in him as a living, breathing character.

Lee added, “One of the hardest things, if not the hardest thing, in animation is: How do you get the secret of him getting paid the big bucks?” In other words, how do you capture Will Smith’s charm?

In his younger days as a director, Lee said he would have only been concerned with making Junior a convincing character, but now, “I’ve made movies long enough to learn to respect that a movie star is not just an actor, it’s something else. He has a contract with people.”

Lee recalled that during rehearsal, Smith was “very generous about sharing what makes Will Smith Will Smith.” Still, he argued, “You cannot retrieve [that charm] from his old movies. You can use that as reference, but what drives it, what final touches [make it work]?”


Will Smith in Gemini Man from Paramount Pictures, Skydance and Jerry Bruckheimer Films

The challenge of capturing that, he said, is one of the main reasons he wanted to make the film: “When you do the digital face and the body, it’s like a microscopic study of what drama is, what moving is, how does it connect with emotion … and what age does to you, cell-by-cell.”

Lee said there were two other big things that “absorbed” him in making “Gemini Man.” First, there was his aim of creating a more real, more “messy” style of shooting and staging action; he argued that in other films, the action is so heavily choreographed that it’s basically “dancing.” (And this is coming from the director of “Crouching Tiger, Hidden Dragon.”)

Secondly, he wanted to explore “the beauty of this kind of media, digital cinema.” That’s why he shot “Gemini Man” at the aforementioned 120 frames per second. He’s clearly enamored with the format, having shot his last movie “Billy Lynn’s Long Halftime Walk” at a high frame rate as well, but he acknowledged that it’s something audiences still have to get used to. (When it doesn’t work, it can be hard to distinguish from bad TV.)

Lee’s “dream” is that one day, this approach will no longer be called “high frame rate” — instead, it’s the standard 24 frames per second that should be called “low frame rate,” because the default will have changed.

“You don’t call it color film, right?” he said. “You say silent film, you say black-and-white.”

And if “Gemini Man” is commercially successful, Lee is hoping other filmmakers will join him to “explore this new world” and further develop the technology. In the process, they might give audiences a reason to come back to theaters.

“People think 3D … or anything high-tech is the opposite of art and soul, and I don’t buy that,” Lee said. “I have to deliver action and spectacle — I’m delighted to do it — but I think the biggest gain [is] studying the human face close up.”

That, in turn, could lead to a different style of acting: “[In] this media, you read through people. They cannot fake it; they have to fake it differently, rather. They have to upgrade their skills.”

“Gemini Man” opens in theaters on October 11. Before then, you can watch Lee and Smith discuss the movie next week at Disrupt SF.



from TechCrunch https://ift.tt/2lHfcqQ
via IFTTT
