
I tried meditating with a brain-sensing headband — and I didn’t hate it

I’m bad at meditating. The act of sitting still with my thoughts is a terrifying, anxiety-inducing thing. I’ve been on and off meditation kicks for years now, so I’m willing to try just about any quick fix that promises to keep me on track.

I’ve had limited results with various mindfulness apps — Calm and Insight Timer have proven the most useful thus far. But those are only as good as your own practice. It’s a bit like going to the gym, really. You start to slip. One missed day becomes two, two become a week, and the next thing you know, you haven’t been mindful in a month.

The idea of hacking one’s mindfulness practice is pretty appealing, honestly — even if it means wearing a dumb-looking headband while you sit alone in silence. The second generation of Interaxon’s Muse headband features a slimmer profile than its predecessor and the ability to read four activity metrics: body movements, breath, heart rate and “mental activity.”

It all looks and feels a little silly — and frankly, I’m glad we opted not to do a video this time out. It was just me and a judgmental rabbit staring at me from the corner of a darkened room. The setup, too, is a pain. You have to sit through a couple of guided videos and then fiddle with the placement of the headset so that all of the headband’s electrodes make contact with your skin. Getting it just right is tricky.

Once everything is up and running, it’s a distinctly different experience from the various meditation apps I’ve tried. The real-time feedback from the headset brings an interesting level of gamification to the product. Rather than music, guided meditation or ambient sounds, the audio interacts with your session.

The app defaults to a storm sound, with the weather getting worse as you grow more restless. Focus, breathe and calm down, and the storm subsides. Do it for a while and you start to hear the occasional bird. If nothing else, it gives you something to focus on beyond the reliving of past events and running down of checklists we often obsess over when attempting to quiet our minds.
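
For the curious, here’s a minimal sketch of how that kind of biofeedback loop could be wired up, assuming the headset exposes a rolling “calm” score between 0 and 1. The names (Soundscape, update_soundscape) are hypothetical illustrations, not Muse’s actual algorithm or API.

```python
# Hypothetical sketch of the storm/bird feedback loop described above.
# Assumes the headset reports a per-second "calm" score in the range 0.0-1.0.

from dataclasses import dataclass


@dataclass
class Soundscape:
    storm_intensity: float  # 0.0 = quiet, 1.0 = full storm
    bird_chance: float      # likelihood of a bird call this second


def update_soundscape(calm_scores: list[float], window: int = 10) -> Soundscape:
    """Map recent calm scores to storm volume and bird frequency."""
    recent = calm_scores[-window:] or [0.0]
    avg_calm = sum(recent) / len(recent)

    # More restlessness (low calm) means a louder storm; sustained calm earns birds.
    storm = 1.0 - avg_calm
    birds = max(0.0, avg_calm - 0.7) / 0.3  # birds only appear above ~70 percent calm
    return Soundscape(storm_intensity=storm, bird_chance=birds)


if __name__ == "__main__":
    restless = [0.2, 0.3, 0.25, 0.2]
    settled = [0.8, 0.85, 0.9, 0.95]
    print(update_soundscape(restless))  # loud storm, no birds
    print(update_soundscape(settled))   # quiet storm, birds start to appear
```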

I was able to shift things a good deal between my first and second sessions. The first, at five minutes, rated 26 percent calm. The second, at 10 minutes, was 89 percent calm. That bumped me from two birds in the first session to 90 in the second, which no doubt left me drenched in bird crap by the end.

I’m sure there’s a fair bit of placebo effect in all of this, but the Muse 2 has given an extra little jolt of energy to my admittedly wobbly meditation practice. And having a timeline of meditation metrics akin to an Apple or Fitbit app is certainly appealing. Actually sticking with the practice, on the other hand, is another question entirely. Ask me again in a few weeks whether I think it’s worth the $249.



from TechCrunch https://ift.tt/2qgRLD8
via IFTTT
