
Google builds its cross-platform multiplayer AR tech into a doodling app

At Google I/O earlier this month, the company announced Cloud Anchors, a tool that uploads 3D spatial data captured by a user’s smartphone to the cloud and matches it against data from another user, creating a shared AR experience in which each person’s phone sees the same virtual objects in the same physical places.
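The flow described above boils down to two operations: one device "hosts" an anchor (uploading its spatial data and getting back a shareable ID), and a second device "resolves" that ID to recover the same pose. ARCore exposes this as `hostCloudAnchor` and `resolveCloudAnchor` calls, but the sketch below is just a toy in-memory model of the idea in plain Python; the class and method names are illustrative, not Google's API.

```python
import uuid

class CloudAnchorService:
    """Toy stand-in for the Cloud Anchors backend: hosting stores an
    anchor's pose under a generated ID; resolving returns that pose
    to any peer who has the ID."""

    def __init__(self):
        self._anchors = {}

    def host_anchor(self, pose):
        # Device A hosts an anchor; the service returns a shareable ID.
        anchor_id = str(uuid.uuid4())
        self._anchors[anchor_id] = pose
        return anchor_id

    def resolve_anchor(self, anchor_id):
        # Device B resolves the ID into the same pose, so both devices
        # place virtual content at the same physical location.
        return self._anchors[anchor_id]

service = CloudAnchorService()
anchor_id = service.host_anchor((1.0, 0.0, -2.0))  # Device A draws a line
pose = service.resolve_anchor(anchor_id)           # Device B joins and sees it
```

In the real system, of course, the hard part is the matching step this sketch skips: the service compares the visual features each phone has captured to align the two devices' coordinate frames before the pose can be shared.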

Today, Google is rolling out Cloud Anchor functionality in Just a Line, the AR drawing app it released a couple of months ago. Just a Line is hardly a breakout hit for Google, but the simple app, which lets users paint the 3D world with a white line, makes a fitting testbed for AR functionality that’s just as experimental.

What will likely differentiate Google’s offering from whatever Apple ends up shipping is that Cloud Anchors is cross-platform. Just a Line is available on both Android and iOS, and with today’s update, users on both platforms will be able to collaborate and view drawings in a shared space.

The general challenge for multiplayer AR experiences is making it simple enough for users to sync their spatial maps so they see the same digital objects in the same physical locations. What Google has built seems a bit cumbersome: users need to stand next to each other to pair their environments, and the functionality appears limited to two people at the moment.

Just a Line isn’t the most high-stakes place for Google to be dropping this feature, so there is clearly room for the company to keep updating what it has built as it sees what early usage looks like.



from TechCrunch https://ift.tt/2srZKxZ
via IFTTT
