
Mapbox’s new SDK helps developers build smart AR navigation apps

Mapbox, the open source mapping service that competes directly with Google’s Maps Platform, today announced a new software development kit (SDK) that will make it easier for developers to build applications that provide AR navigation. That by itself would be cool, but because it builds on ARM’s Project Trillium AI platform, the Mapbox Vision SDK can also recognize other vehicles, pedestrians, speed limit signs, construction signs, crosswalks and more, all without developers having to train their own machine learning models.
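
To make that concrete, a detection from such an SDK might surface to app code as a small event object. The sketch below is purely illustrative: the class name and fields are assumptions, not the Vision SDK’s actual types.

```python
from dataclasses import dataclass
from typing import Tuple

# Hypothetical event shape -- not the Vision SDK's real API -- illustrating
# the kinds of detections the article lists.
@dataclass
class DetectionEvent:
    object_class: str                        # e.g. "vehicle", "pedestrian", "speed_limit_sign"
    confidence: float                        # model confidence, 0.0 to 1.0
    bbox: Tuple[float, float, float, float]  # normalized (x, y, width, height) in the frame
    timestamp_ms: int                        # capture time of the source video frame
```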

It’s easy to see how this could be useful for navigation apps, but Mapbox is going a step further by also integrating its service deeply with Microsoft’s Azure IoT platform. Indeed, the company has built the open source Azure IoT Edge runtime into its SDK to let developers easily push events the Vision SDK detects into the cloud. Thanks to this, you could easily crowdsource data about roadside construction, for example, or how busy a given intersection currently is. And in the context of a navigation app, the driver could get the same info in real time, too (just in case you missed that construction sign…).
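
In practice, pushing such an event to Azure looks roughly like the following. This is a minimal sketch using Microsoft’s azure-iot-device Python package and a plain IoT Hub device connection; the integration the article describes actually runs through the Azure IoT Edge runtime inside the SDK, and the connection string and payload here are placeholders.

```python
import json
from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder -- in a real deployment this comes from your Azure IoT Hub.
CONNECTION_STRING = "HostName=...;DeviceId=...;SharedAccessKey=..."

def publish_detection(event: dict) -> None:
    """Send one detected event (e.g. a construction sign) to the cloud."""
    client = IoTHubDeviceClient.create_from_connection_string(CONNECTION_STRING)
    try:
        msg = Message(json.dumps(event))
        msg.content_type = "application/json"
        msg.content_encoding = "utf-8"
        client.send_message(msg)
    finally:
        client.shutdown()

publish_detection({"object_class": "construction_sign",
                   "confidence": 0.93,
                   "timestamp_ms": 1528300800000})
```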

“The future of location is building live maps in real-time from distributed sensor networks embedded in vehicles and mobile devices at scale,” said Eric Gundersen, CEO of Mapbox, in today’s announcement. “Every vehicle and mobile device utilizing the Vision SDK creates a better map, and this same data is streamed back to Microsoft Azure for further processing. The Vision SDK not only runs in real-time to improve the driving experience in the vehicle, but also generates data for the back end to update the map based on changing conditions, powering larger solutions for smart cities or insurance companies.”

Using ARM’s Project Trillium platform, the SDK can make use of a mobile device’s onboard CPU, GPU and AI chip (if available) to perform the necessary object recognition. Once new phones launch with ARM’s new ML and object detection processors, the SDK will be able to perform all of these functions even faster, but for now it can extract features from the video feed at a rate of about ten times per second.
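
A simple way to picture that ten-per-second cadence is a throttled frame loop, as in the sketch below. Here `camera`, `detect_objects` and `publish` are hypothetical stand-ins for the device camera, the on-device model and an uplink like the one above; nothing here reflects the SDK’s real interfaces.

```python
import time

FRAME_INTERVAL = 0.1  # 100 ms budget, i.e. roughly ten extractions per second

def run_detection_loop(camera, detect_objects, publish):
    # camera.read(), detect_objects() and publish() are hypothetical
    # stand-ins, not Vision SDK calls.
    while True:
        start = time.monotonic()
        frame = camera.read()                # grab the latest video frame
        for event in detect_objects(frame):  # on-device inference (CPU/GPU/AI chip)
            publish(event)                   # e.g. hand off to the Azure uplink
        # Sleep off whatever remains of the 100 ms budget before the next frame.
        elapsed = time.monotonic() - start
        if elapsed < FRAME_INTERVAL:
            time.sleep(FRAME_INTERVAL - elapsed)
```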

Mapbox notes that its SDK will work on iOS and Android, but developers could also use it directly in a car that uses ARM’s embedded automotive chipsets.

The company says that it currently has about 1.1 million developers on its platform. It uses a variety of sources for its mapping data, but the core of its data comes from the OpenStreetMap project.

from TechCrunch https://ift.tt/2IZlVmh
via IFTTT
