
Prototype prosthesis proffers proper proprioceptive properties

Researchers have created a prosthetic hand that offers its users the ability to feel where it is and how the fingers are positioned — a sense known as proprioception. The headline may be in jest, but the advance is real and may help amputees more effectively and naturally use their prostheses.

Prosthesis rejection is a real problem for amputees, and many choose to simply live without these devices, electronic or mechanical, as they can complicate as much as they simplify. Part of that is the simple fact that, unlike their natural limbs, artificial ones have no real sensation — or if there is any, it’s nowhere near the level someone had before.

Touch and temperature detection are important, of course, but what’s even more critical to ordinary use is simply knowing where your limb is and what it’s doing. If you close your eyes, you can tell where each digit is, how many you’re holding up, whether they’re gripping a small or large object and so on. That’s currently impossible with a prosthesis, even one that’s been integrated with the nervous system to provide feedback — meaning users have to watch what they’re doing at all times. (That is, if the arm isn’t watching for you.)

This prosthesis, built by Swiss, Italian and German neurologists and engineers, is described in a recent issue of Science Robotics. It takes the existing concept of sending touch information to the brain through electrodes patched into the nerves of the arm, and adapts it to provide real-time proprioceptive feedback.

“Our study shows that sensory substitution based on intraneural stimulation can deliver both position feedback and tactile feedback simultaneously and in real time. The brain has no problem combining this information, and patients can process both types in real time with excellent results,” explained Silvestro Micera, of the École Polytechnique Fédérale de Lausanne, in a news release.

It’s been the work of a decade to engineer and demonstrate this possibility, which could be of enormous benefit. Having a natural, intuitive understanding of the position of your hand, arm or leg would likely make prostheses much more useful and comfortable for their users.

Essentially, the robotic hand relays its telemetry to the brain through the nerve pathways that would normally carry touch from that area. Unfortunately, it's rather difficult to actually recreate the proprioceptive pathways themselves, so the team used what's called sensory substitution instead. This repurposes one sensory channel, here ordinary touch, to carry information from a different sense.

(Diagram modified from original to better fit, and to remove some rather bloody imagery.)

A simple example would be a machine that touches your arm in a different location depending on where your hand is. In the case of this research the mapping is much finer, but it still essentially presents position data as touch data. It sounds weird, but our brains are actually really good at adapting to this kind of thing.
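To make the idea concrete, here is a minimal sketch of that kind of position-to-touch remapping in Python. It is purely illustrative and not the encoding used in the study: the finger list, angle range, current window and the linear mapping are all assumptions made up for this example.

# Illustrative sketch of sensory substitution: each finger's joint angle is
# remapped to a stimulation amplitude on a nerve channel that normally carries
# touch. The finger names, ranges and linear mapping below are assumptions for
# demonstration only, not the encoding used in the study.

FINGERS = ["thumb", "index"]        # hypothetical: fingers instrumented on the hand
ANGLE_RANGE = (0.0, 90.0)           # hypothetical flexion range in degrees (open to closed)
CURRENT_RANGE_UA = (20.0, 80.0)     # hypothetical safe stimulation window in microamps

def encode_position(joint_angles_deg: dict) -> dict:
    """Map each finger's flexion angle to a stimulation current on its touch channel."""
    lo_a, hi_a = ANGLE_RANGE
    lo_c, hi_c = CURRENT_RANGE_UA
    currents = {}
    for finger in FINGERS:
        angle = min(max(joint_angles_deg[finger], lo_a), hi_a)  # clamp to the valid range
        frac = (angle - lo_a) / (hi_a - lo_a)                   # 0.0 = open, 1.0 = fully flexed
        currents[finger] = lo_c + frac * (hi_c - lo_c)          # linear remap to a current level
    return currents

# Example: a nearly open thumb and a half-closed index finger
print(encode_position({"thumb": 10.0, "index": 45.0}))
# {'thumb': 26.66..., 'index': 50.0}: position rendered as graded "touch" intensity

In the actual system the feedback is delivered through intraneural electrodes rather than an external tapper, but the principle is the same: position is rendered as a graded touch signal that the brain learns to reinterpret.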

As evidence: after some training, two amputees using the system were able to tell apart four differently shaped objects as they grasped them, eyes closed, with 75 percent accuracy. Chance would be 25 percent, of course, meaning the sense of what shape they were holding came through loud and clear, or at least clear enough for a prototype. Amazingly, the team was able to add actual touch feedback to the existing pathways, and the users were not overly confused by it. So there's precedent now for multi-modal sensory feedback from an artificial limb.

The study has well-defined limitations, such as the number and type of fingers it was able to relay information from, and the granularity and type of that data. And the “installation” process is still very invasive. But it’s pioneering work nevertheless: this type of research is very iterative and global, progressing by small steps until, all of a sudden, prosthetics as a science has made huge strides. And the people who use prosthetic limbs will be making strides, as well.



from TechCrunch https://ift.tt/2Eyx9ye
