
Apple buys Denver startup building waveguide lenses for AR glasses

Apple has acquired Akonia Holographics, a Denver-based startup that manufactures waveguide lenses for augmented reality displays. Apple confirmed the acquisition to Reuters, which first reported the news.

An Apple spokesperson gave TechCrunch the company’s standard statement: “Apple buys smaller technology companies from time to time, and we generally don’t discuss our purpose or plans.”

This acquisition offers the clearest confirmation yet from Apple that it is investing resources into technologies that support the development of a lightweight augmented reality headset. There have been a number of reports over the years that Apple is planning to release consumer AR glasses within the next few years.

In late 2017, we reported that Apple had acquired Vrvana, a mixed-reality headset company with a device that offered users pass-through augmented reality experiences on a conventional opaque display. This latest acquisition seems to offer a much clearer guide to where Apple’s consumer ambitions may take it for a head-worn augmented reality device.

Waveguide displays have become the de facto optic technology for augmented reality headsets. They come in a few different flavors, but all of them essentially involve an image being beamed into the side of a piece of glass, bouncing between etchings (or other irregularities) in the lens, and eventually being directed out to the user’s eyes. Waveguide lenses are currently used in AR headsets sold by Magic Leap and Microsoft, among many others.

A reflective waveguide display built by Lumus.

They’re popular because they allow for thin, largely transparent designs, though they often have issues with color reproduction, and the displays can only grow so large before the images become distorted. Akonia’s marketing materials claim its “HoloMirror” technology can “display vibrant, full-color, wide field-of-view images.”

The startup had raised $11.6 million in funding, according to Crunchbase.

While many of Apple’s largest technology competitors have already experimented with AR headsets, Apple has directed the majority of its early consumer-facing efforts to phone-based AR technologies that track the geometry of spaces and can “project” digital objects onto surfaces.

Apple ARKit

The biggest open question regarding Apple’s rumored work on AR glasses is whether the company intends to ship a higher-powered device akin to Magic Leap’s that would track a user’s environment and be built upon Apple’s interactive ARKit tech, or whether its first release will be more conservative, approaching AR glasses as a head-worn Apple Watch of sorts that presents a user’s notifications and enables light interactions.

Moving forward with waveguide displays would certainly leave both options open for the company, though given the narrow field of view that even today’s widest waveguides offer, I expect Apple may opt for the latter, barring a major technical breakthrough or a heavily delayed release.



from TechCrunch https://ift.tt/2okEzwj
