
Uber adds 911 assistance to rider app

Uber has officially launched its 911 in-app calling feature, after first announcing it in April. Riders can tap the safety icon in the bottom right corner of the app to call 911. Once on the line with the 911 dispatcher, you can easily communicate your location, since Uber clearly shows it in the app. In seven test markets, Uber is integrating with RapidSOS to enable automatic location sharing with 911 dispatchers. Tapping the safety icon also brings up an option to share trip details with trusted contacts.

Uber is testing automatic location sharing with 911 dispatchers in seven cities: Denver, Colo.; Charleston, S.C.; Nashville, Chattanooga and the Tri-Cities, Tenn.; Naples, Fla.; and Louisville, Ky. As for how Uber decided where to deploy automatic location sharing, Uber Director of Product Management Sachin Kansal told TechCrunch it came down to the “readiness of cities” and “how fast some of them were able to move in terms of training agents and testing functionality.”

Uber is already in discussions with several other cities about launching the feature, Kansal said. He added that the goal is to make it available everywhere.

As Uber started developing the 911 assistance feature, Kansal said, the team was looking for helpful ways to contribute. From the company’s research, Uber “found out that accurate location of the caller is one of the biggest problems. Whenever you call 911, the first question is often, ‘What is your location?’”

Depending on the situation, using the Uber app to call 911 may not make sense. But if you already have the app open, it may be the fastest way to get emergency help.

“At the end of the day, when a user is in an emergency, we want them to use whatever will be the fastest mechanism for them in that moment in time,” Kansal said. “If they’re already in a phone dialer, call 911 from there.”

He added that if someone happens to be in the Uber app, the company wants to make it “extremely easy for them to dial, as well as to receive information at their fingertips.”

Because some situations may involve a problematic driver, drivers are not notified when a rider seeks 911 assistance through the app. That said, the driver would obviously hear the passenger on the phone.

“Most of the scenarios that we see happening in an Uber are generally related to road accidents,” Kansal said.

However, Uber will only know the nature of the call if the rider explicitly tells the company. Uber will know you dialed 911 through the app, but it won’t know what you said on the call. Afterward, Kansal said, Uber will send the rider a message to see if everything is OK and whether there’s anything the company can do to help.

The feature will be available to all riders in the U.S. This summer, Uber plans to launch similar functionality for drivers in the U.S. and for riders in international markets. Specifically, Uber plans to add the safety center icon abroad, along with the ability to call a local emergency number. As for bringing automatic location sharing to those markets, there’s “nothing to announce,” Kansal said.



from TechCrunch https://ift.tt/2GZM8PM
via IFTTT
