
BMW continues to bet on the (Azure) cloud

Earlier this week, at MWC Barcelona, BMW announced its newest in-car AI initiative: BMW Natural Interaction. The idea here is to use cameras, microphones and other sensors in the car to allow you to have more natural interactions with the car, either through voice or gestures. The marquee feature here is the ability to point at something outside the car and get more information about it or, if it’s a restaurant, have the BMW Intelligent Personal Assistant (IPA) make a reservation for you. These systems will work by combining in-car AI with cloud technologies — and for those, BMW continues to bet on Microsoft’s Azure cloud.

After the announcement, I sat down with Christoph Grote, BMW Group’s senior VP for electronics. I admit that a lot of what I saw in the demo felt a bit futuristic, but Grote noted that everything he showed off during his presentation is more or less production-ready. “I don’t think I would’ve dared to stand up there if any of the things I showed today were a utopia,” he told me. “All of this is in series production and some of it is already available as part of the BMW OS 7 release. But the major work we are doing, looking ahead to the iNext [electric SUV], is about gaze, head pose and gesture tracking and combining those with the other modalities. But everything we showed today is going to go into production.”

In practice, this means that BMW will use two cameras: a wide-angle camera behind the rear-view mirror that tracks the gestures of both the driver and the front-seat passenger, and a second camera behind the dashboard that looks only at the driver through the steering wheel, detecting eye blinks, gaze direction and head pose.

As Grote noted, figuring out where you are looking is not exactly easy. The camera sees your hands in relation to the car. That’s pretty straightforward. But the car, too, is situated somewhere in space, and for this to work, that localization has to be very precise, and the digital map has to be very detailed, too. “GPS isn’t enough for this,” Grote said, and noted that the company plans to use the car’s forward-facing camera to gather additional information that helps localize the car in space based on comparing the image to the digital map. The AI smarts that power these mapping features run right in the car — and in many ways, these features also lay the groundwork for self-driving cars, which obviously need highly detailed maps, too.
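The pieces described above — a precisely localized car, a detailed digital map and a pointing direction from the in-car cameras — can be combined with some simple geometry. Here is a minimal, illustrative sketch (not BMW's implementation; all names, thresholds and the flat-earth approximation are assumptions for the example) of resolving which map point of interest a passenger is pointing at:

```python
import math

EARTH_RADIUS_M = 6_371_000


def to_local_xy(car_lat, car_lon, lat, lon):
    """Project a lat/lon to metres east/north of the car (small-area approximation)."""
    dx = math.radians(lon - car_lon) * EARTH_RADIUS_M * math.cos(math.radians(car_lat))
    dy = math.radians(lat - car_lat) * EARTH_RADIUS_M
    return dx, dy


def pointed_poi(car_lat, car_lon, heading_deg, point_azimuth_deg, pois,
                max_angle_deg=10.0, max_range_m=200.0):
    """Return the name of the POI closest to the pointing ray, or None.

    heading_deg:       car heading, clockwise from north
    point_azimuth_deg: pointing direction relative to the car's nose
    pois:              iterable of (name, lat, lon) tuples from the map
    """
    ray = math.radians(heading_deg + point_azimuth_deg)
    best_name, best_offset = None, None
    for name, lat, lon in pois:
        dx, dy = to_local_xy(car_lat, car_lon, lat, lon)
        dist = math.hypot(dx, dy)
        if dist == 0 or dist > max_range_m:
            continue  # too far away to be a plausible pointing target
        bearing = math.atan2(dx, dy)  # clockwise from north, like the heading
        # smallest signed angle between pointing ray and POI bearing
        offset = abs((bearing - ray + math.pi) % (2 * math.pi) - math.pi)
        if math.degrees(offset) <= max_angle_deg and (
                best_offset is None or offset < best_offset):
            best_name, best_offset = name, offset
    return best_name
```

Note how the result is only as good as the car's own pose estimate: an error of a few degrees in heading, or a few metres in position, is enough to select the wrong building — which is why Grote stresses that GPS alone isn't sufficient.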

In many ways, this work is a continuation of BMW’s work on its IPA in-car assistant. “There, we use Azure Cognitive Services and we plan to integrate these new modalities (like gaze and gesture tracking) with the same technology. And that’s important for these multi-modal systems. […] We have a great partnership with Microsoft and we expect that’ll continue.”

Grote also noted that BMW has a long history of working in the cloud, thanks to many years of experience in offering its connected car services. “We don’t think of the car as an isolated client that connects to some service in the cloud, but that we also see these connected cars as a swarm that has collective intelligence.”

Vehicle-to-everything (V2X) connectivity is one of the hot topics in the car industry right now — especially given the advent of 5G with its low-latency connectivity — and BMW does have its own point of view here. For Grote, V2X systems that use the cellular network and connect to the cloud have major advantages over those that try to connect cars directly. These cloud-connected systems, he argues, are easier to maintain and they are able to translate between different standards or — in the long run — integrate different generations of this system to ensure that cars from different manufacturers can talk to each other.
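The translation role Grote describes can be sketched as a cloud broker that normalizes messages from different format generations into one common schema. The message formats and field names below are invented for illustration — they are not any real V2X standard:

```python
import json


def normalize(raw: str) -> dict:
    """Normalize a hazard warning from different (hypothetical) message
    generations into one common schema: event, lat, lon, severity."""
    msg = json.loads(raw)
    version = msg.get("v", 1)
    if version == 1:
        # Invented legacy format: flat fields, coarse urgency flag
        return {"event": msg["type"], "lat": msg["lat"], "lon": msg["lon"],
                "severity": "high" if msg.get("urgent") else "low"}
    if version == 2:
        # Invented newer format: nested position, graded severity
        pos = msg["position"]
        return {"event": msg["event"], "lat": pos["lat"], "lon": pos["lon"],
                "severity": msg["severity"]}
    raise ValueError(f"unknown V2X message version: {version}")
```

The point of the sketch is the maintenance argument: this mapping lives in the cloud, so it can be updated centrally as new message generations appear, instead of requiring every car on the road to understand every format its neighbors might speak.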

“A cellular-based system is forward-looking, maintainable, secure and the better foundation that guarantees future development efforts versus a standard that’s 20 years old, from a time when the carriers were not interested in machine-to-machine traffic at all.”

BMW continues to bet on the cloud for many of its newest tech developments. Among car manufacturers, it’s obviously not alone here. Daimler recently announced that it has moved its big data platform to the cloud, for example. And in many ways, that move makes sense. Running online services isn’t a core competency for many of these companies, and even if they are experienced at running their own data centers by now, this isn’t what allows them to differentiate their cars in a highly competitive market. That energy is better spent on building applications, not managing them. The large cloud providers also offer global coverage, and redundancies are hard and expensive to build.



from TechCrunch https://ift.tt/2tIHbGJ
via IFTTT
