NASA completes in-flight testing of supersonic plane X-59’s virtual cockpit window

The X-59, NASA’s supersonic research airplane that aims to pioneer quiet supersonic flight for eventual use in commercial aviation, is one step closer to reality after testing of the system that will give the aircraft’s pilot a fully virtual view of the skies. The eXternal Visibility System (or XVS, because NASA loves ‘X’ in its acronyms) pairs a front-facing camera with a cockpit display and enhances the pilot’s view with augmented reality, overlaying information such as guidance to destination airports, warnings and alerts when other aircraft enter the surrounding airspace, and additional key cues during takeoff and landing approach.

Data from the aircraft’s sensors and a 4K camera feed are combined and shown on a 4K monitor mounted inside the cockpit, facing the pilot. There’s also an additional retractable camera underneath the aircraft that provides a key second view during lower-speed flight, such as when approaching an airport for landing.
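To make the idea concrete, here is a minimal sketch, in Python with OpenCV, of what an augmented-reality overlay loop of this general kind might look like: take a camera frame, project known traffic into screen coordinates, and draw symbology on top. Everything here (the pinhole projection model, the traffic fields, the `draw_overlay` helper) is a hypothetical illustration, not the actual XVS software.

```python
# Conceptual sketch only: a toy augmented-reality overlay loop in the spirit
# of the XVS, not NASA's actual implementation. The frame source, traffic
# feed, and the simple pinhole projection below are hypothetical stand-ins.
import cv2
import numpy as np

def project_to_screen(bearing_deg, elevation_deg, frame_w, frame_h, fov_deg=40.0):
    """Map a target's bearing/elevation (relative to the nose) to pixel
    coordinates, assuming a simple pinhole camera with the given field of view."""
    px_per_deg = frame_w / fov_deg
    x = frame_w / 2 + bearing_deg * px_per_deg
    y = frame_h / 2 - elevation_deg * px_per_deg
    return int(x), int(y)

def draw_overlay(frame, traffic, waypoint_label):
    h, w = frame.shape[:2]
    # Highlight each piece of traffic with a circle and a range readout.
    for target in traffic:
        x, y = project_to_screen(target["bearing"], target["elevation"], w, h)
        if 0 <= x < w and 0 <= y < h:
            cv2.circle(frame, (x, y), 18, (0, 255, 255), 2)
            cv2.putText(frame, f'{target["id"]} {target["range_nm"]:.1f}nm',
                        (x + 24, y), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 255), 1)
    # Destination guidance pinned to the top of the display.
    cv2.putText(frame, f"DEST: {waypoint_label}", (20, 40),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    return frame

# Example: one synthetic frame and one fake traffic target.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)  # stand-in for the camera feed
traffic = [{"id": "N123AB", "bearing": 5.0, "elevation": 1.5, "range_nm": 3.2}]
cv2.imshow("XVS-style display", draw_overlay(frame, traffic, "KEDW"))
cv2.waitKey(0)
```

In a real system the projection would come from calibrated camera geometry and the traffic from onboard sensors, but the structure is the same: fuse, project, annotate, display.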

The tests involved mounting the XVS on a Beechcraft King Air UC-12B test plane, then flying with pilots on board and gauging their ability to spot other aircraft through the display. That included situations where it’s traditionally very difficult for pilots to see other planes, such as when two aircraft are converging with a slight offset in their relative paths rather than truly head-on.

The XVS is necessary in the first place because the X-59 will feature a totally new design, far more contoured than current commercial aircraft, which is how it aims to fly quietly and avoid producing loud sonic booms when it goes supersonic. That elongated, tapered nose leaves no room for a conventional forward-facing cockpit window. Quiet supersonic flight is the key development target: the whole point of the aircraft is to demonstrate that it’s possible to fly at supersonic speeds without loud booms at ground level, in order to show regulators that commercial supersonic aircraft could operate over land and populated areas. The Concorde, which provided supersonic flight commercially from 1976 until 2003, was barred from flying supersonic over land for exactly this reason.

Despite lacking a front-facing window, the X-59 will have a transparent canopy, and test pilots say that even if the XVS were to somehow fail, they could still fly the aircraft using that external view plus information from the aircraft’s sensors and avionics alone.

The X-59, which is being built by Lockheed Martin, is currently under construction and aims to take its first flight sometime in 2021.



from TechCrunch https://ift.tt/2Ugh1I3
via IFTTT
