
Google launches Ripple, an open standard that could bring tiny radars to Ford cars and more

Google soli radar project
Image Credit: Google

Google has been publicly building tiny radar chips since 2015. They can tell you how well you sleep, control a smartwatch, count sheets of paper, and let you play the world's smallest violin. But Soli hasn't seen much commercial success; its most prominent outing was in an ill-fated Pixel phone. Now Google has launched an open source API standard called Ripple that could bring the technology to devices beyond Google's own, possibly even a car: Ford is one of the participants in the new standard.



Additionally, the GitHub Ripple project is filled with references to Google, including multiple instances of "Copyright 2021 Google LLC," and contributors must sign a Google open source license agreement to participate. (One commit points out that the project was updated "to include CTA.") Ripple appears to be a rebranding of Google's "Standard Radar API," which it quietly proposed one year ago (PDF).

None of that makes it any less exciting that Soli might find new life, though, and there may be something to the idea that radar has privacy benefits. It’s a technology that can easily detect whether someone’s present, nearby, and/or telling their device to do something without requiring a microphone or camera.


Ford, for its part, tells The Verge that indoor radar might become part of its driver-assistance technologies. Right now, the automaker says it’s using “advanced exterior radars” to research those features instead (which sounds expensive to me). Here’s a statement from Ford’s Jim Buczkowski, who’s currently heading up the company’s Research and Advanced Engineering team:

We are investigating how to use indoor radar as a source of sensors to improve various customer experiences in addition to our Ford CoPilot360 driver assistance technologies, which now use advanced exterior radars. A standard API, with input from the semiconductor industry, will allow us to develop hardware-independent software and give software teams the freedom to innovate across multiple radar platforms.

Other companies are also exploring radar: Amazon is investigating whether it could help track your sleep patterns, this smart dog collar uses miniature radar to monitor vital signs even if your dog is very furry, and this bulb does the same for humans.



source https://techncruncher.blogspot.com/2022/01/google-launches-ripple-open-standard.html
