
NASA astronauts successfully pilot SpaceX’s Crew Dragon spacecraft manually for the first time

NASA astronaut Doug Hurley took over manual control of the SpaceX Crew Dragon spacecraft on Saturday, shortly after the vehicle’s historic first launch from Cape Canaveral in Florida. Crew Dragon is designed to fly entirely autonomously throughout the full duration of its missions, including automated docking, de-orbit and landing procedures, but it has manual control systems in case anything should go wrong and the astronauts have to take over. This test marked the first time the manual controls have been used in space, and is a key part of certifying Crew Dragon for regular operational human flight.

Astronaut Bob Behnken and Hurley removed their fashionable SpaceX space suits just before Hurley completed the manual maneuvers, which was also part of the plan. They’re able to go without the suits in the pressurized cabin during the transit to the ISS, only needing to put them back on for space station docking, and the interior of Crew Dragon actually provides them a fair amount of room to move around in. This also makes it easier for them to operate the spacecraft controls.

The manual maneuver testing included Hurley going through the process of using the spacecraft’s touchscreen controls to put the capsule into what’s called LVLH (local vertical, local horizontal) attitude, using Earth as a reference navigation point. That basically means putting Dragon in the same orientation as an airplane flying over Earth, with the planet located ‘underneath’ the Dragon as it flies. The test involves instructing the flight computer not to take over while Hurley conducts the maneuvers, but it doesn’t involve actually finalizing the control orders by sending them to the flight computer, since the computer is what will actually complete the automated flight and docking process.
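To make the LVLH idea concrete: the frame is conventionally built from the spacecraft’s Earth-centered position and velocity vectors, with one axis pointing “down” toward Earth’s center and another along the direction of flight. This is a minimal illustrative sketch of that construction, not SpaceX’s flight software; the function name and the specific axis convention (z toward Earth, y opposite the orbital angular momentum, x completing the right-handed triad) are one common textbook choice.

```python
import numpy as np

def lvlh_basis(r, v):
    """Return the LVLH (local vertical, local horizontal) unit vectors
    for an Earth-centered position r [m] and velocity v [m/s].

    Convention used here (one common textbook choice):
      z_hat: "down," toward Earth's center
      y_hat: opposite the orbital angular momentum (cross-track)
      x_hat: completes the right-handed triad (roughly along-track)
    """
    r = np.asarray(r, dtype=float)
    v = np.asarray(v, dtype=float)
    z_hat = -r / np.linalg.norm(r)       # local vertical, pointing down
    h = np.cross(r, v)                   # orbital angular momentum direction
    y_hat = -h / np.linalg.norm(h)       # cross-track axis
    x_hat = np.cross(y_hat, z_hat)       # along-track axis
    return x_hat, y_hat, z_hat

# Hypothetical example: a roughly circular orbit near ISS altitude (~400 km)
r = [6_778_000.0, 0.0, 0.0]   # position, meters from Earth's center
v = [0.0, 7_660.0, 0.0]       # velocity, m/s, prograde
x_hat, y_hat, z_hat = lvlh_basis(r, v)
```

With these inputs, `x_hat` comes out along the velocity direction and `z_hat` points back toward Earth, matching the airplane-over-Earth analogy in the article.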

Hurley will conduct two such tests during the mission: the one he just completed, called a “far-field” flight test because it takes place far away from the ISS, and a “near-field” test that will be conducted when the spacecraft is closer to the station.

You can actually try out the manual control system that Behnken and Hurley used yourself – no spaceship required. All you need is a browser, and this ISS Docking Simulator created and released by SpaceX. It’s a bit tricky, but not as hard as you might think thanks to an intuitive control interface design.

from TechCrunch https://ift.tt/2Mfh83y
via IFTTT
