
Feast your eyes on the space rocks Japan’s Hayabusa 2 mission harvested from asteroid Ryugu

Japan’s ambitious second asteroid return mission, Hayabusa 2, has collected a wealth of material from its destination, Ryugu, which astronomers and other interested parties are almost certainly champing at the bit to play with. Though they may look like ordinary bits of charcoal, they’re genuine asteroid surface material — and a little something shiny, too.

Hayabusa 2 launched in 2014 and arrived at the asteroid Ryugu in 2018, where it deployed a pair of landers to test surface conditions. The spacecraft itself touched down the following year, firing a projectile into the surface so that it could collect not just surface gravel but material from beneath it. After the long trip home, the sample return capsule reentered the atmosphere on December 5 and was recovered in the Australian desert.

Although everything worked perfectly, the team could never be certain it had truly secured the samples it hoped for until the sample collection containers were opened in a sealed room back at headquarters. The material inside had been teased in a few tweets, but today JAXA posted all of the public images along with some new explanations and discoveries.

For one thing, the “sample catcher” itself held grains of sediment from Ryugu. Because this material was exposed to different conditions than the material sealed in the containers, it may prove different when analyzed.

Image Credits: JAXA

For another, sample container C appears to have an “artificial object” in it! But don’t get excited — as the team writes on their blog, “the origin is under investigation, but a probable source is aluminium scraped off the spacecraft sampler horn as the projectile was fired to stir up material during touchdown.”

In other words, it’s probably a bit of the probe that came off during the not-so-gentle process of shooting the asteroid and crashing into it.

GIF of the Hayabusa 2 probe making contact with the asteroid Ryugu.

Image Credits: JAXA

But the most important haul is the rocks collected as planned. As you can see from the scale bar, these are little more than pebbles, but they’re large enough to show evidence of the many processes that led to their particular shape and makeup. There’s also plenty of finer dirt and dust from below the surface, which scientists hope could show signs of organic materials and water, the building blocks of life as we know it.

The success of the mission is worth celebrating, and the team has only just begun studying the materials brought back from Ryugu — so we can expect more findings soon as researchers perform the painstaking work of analyzing these priceless samples. The Hayabusa 2 Twitter account is probably the best way to stay up to date.



from TechCrunch https://ift.tt/3hqKr1v
via IFTTT

