
Nvidia CEO Jensen Huang clarifies Uber is not using its Drive platform

While Uber makes use of Nvidia hardware in its own self-driving automotive technology, it does not employ Nvidia’s Drive autonomous computing platform, which includes the GPU maker’s own real-time sensor fusion, HD mapping and path planning. Nvidia CEO Jensen Huang shared this information today during a Q&A session attended by reporters at the company’s GPU Technology Conference in San Jose.

“Uber does not use Nvidia’s Drive technology,” Huang said. “Uber develops their own sensing and drive technology.”

Huang also reiterated comments from an earlier Q&A that Nvidia had ceased its own testing on public roads (its fleet comprises only five or six vehicles at any given time, the company notes) out of an abundance of caution, and because it’s simply good engineering practice to pause and reflect when a new variable is discovered in any engineering problem.

The Nvidia CEO also clarified that Nvidia stopped its testing “almost a day or two” after the accident occurred, as soon as “the news became clear to us,” and not only earlier this week when news broke publicly that its testing program had been suspended.

“If there’s an incident that happened that is a new piece of information that you can learn from, you should pause and learn from it,” Huang added. “I think everybody in the industry should – there’s no question in my mind, that everyone in the industry should pause to look at the situation, learn from it. Pause, just take a pause.”

Others running self-driving test programs on public roads, including Toyota Research Institute, have also paused. Some, however, including Waymo and Intel, have instead publicly declared that their own systems wouldn’t have failed where Uber’s did, and have continued their testing programs on public roads.



from TechCrunch https://ift.tt/2GElJuP
via IFTTT
