
Filmic’s DoubleTake app brings simultaneous camera shooting to the iPhone 11

Filmic had a solid cameo at the iPhone launch event in Cupertino last September. Such an appearance is always a vote of confidence from Apple. In this particular case, Apple was most interested in the ways the pro-focused camera app maker planned to harness the iPhone 11 Pro's triple-camera setup.

That work arrives on the App Store today in the form of DoubleTake. It's launching as an iOS exclusive tailored specifically to the imaging capabilities of the iPhone 11, 11 Pro and 11 Pro Max. In fact, it will only work on those devices, owing to their multi-camera capture capabilities.

Pros continue to be a primary focus for the company — as evidenced by the presentation back in September. Over at the developer’s blog, you can find a wide range of works shot using the company’s Pro app, ranging from short films to music videos. With DoubleTake, the company’s broadening its capabilities by allowing shooters to grab multiple focal lengths at once with the different cameras.

The most visually compelling use here, however, is Shot/Reverse Shot, which takes video from both the rear-facing and front-facing cameras at once. Obviously there’s going to be a gulf in image quality between the front and back, but the ability to do both simultaneously opens up some pretty fascinating possibilities.
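Filmic hasn't detailed how DoubleTake is built, but simultaneous capture like this is exposed to developers on supported iPhones through Apple's AVCaptureMultiCamSession API in AVFoundation. As a rough sketch only (assuming the built-in wide-angle camera on each side and a hypothetical helper named makeDualCameraSession), the setup looks something like this:

```swift
import AVFoundation

// Minimal sketch: build a capture session that pulls video from the rear- and
// front-facing wide-angle cameras at the same time. Returns nil on hardware
// that doesn't support multi-cam capture.
func makeDualCameraSession() -> AVCaptureMultiCamSession? {
    // Simultaneous multi-camera capture is limited to recent devices
    // (e.g. the iPhone 11 family and later).
    guard AVCaptureMultiCamSession.isMultiCamSupported else { return nil }

    let session = AVCaptureMultiCamSession()
    session.beginConfiguration()
    defer { session.commitConfiguration() }

    // Attach both the rear-facing and front-facing wide-angle cameras.
    for position in [AVCaptureDevice.Position.back, .front] {
        guard
            let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                 for: .video,
                                                 position: position),
            let input = try? AVCaptureDeviceInput(device: device),
            session.canAddInput(input)
        else { return nil }
        session.addInput(input)
    }
    return session
}
```

From there, an app would attach outputs and preview layers to each camera and record or composite the two streams; how DoubleTake actually handles that internally is Filmic's own work.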

In a press release, Filmic points to the ability to shoot two actors in conversation, eliminating the need for multiple takes. That's certainly interesting as far as capturing genuine, organic reactions, but I think what's most promising here is what it opens up beyond such scripted takes. You can, say, shoot a two-way podcast conversation by putting an iPhone in the middle and using the Split-Screen mode.

Or there's the Picture-in-Picture (PiP) window, which opens up some dynamic possibilities for vloggers, allowing them to insert themselves into what they're shooting on the fly. For newer filmmakers not beholden to more traditional aesthetic constraints, it's easy to see the lines blurring between these formats. Shooting on a mobile device opens up some tremendous possibilities.

In the case of something like PiP, that editing is actually happening on the fly, in real time. You can always opt to do all of that in post-production, but there is, perhaps, something to be said for the sort of decision making that happens with that kind of live editing; it's akin to a live TV multi-camera switcher. I suspect broadcast journalists looking to pare down equipment to the bare mobile minimum will find something to like from that perspective.

DoubleTake is available starting today as a free download.



from TechCrunch https://ift.tt/2O2mj7V
via IFTTT
