
Microsoft’s Edge browser now lets you mute tabs

Unlike other browser vendors, Microsoft remains committed to a relatively slow update cycle that sees the company launch major new versions of Edge in sync with the twice-yearly Windows 10 updates. It doesn’t look like that’s changing anytime soon, even as the Edge mobile browser now sees far more regular updates. Since today is the day Microsoft announced the Windows 10 April 2018 Update, which will be available for download on Monday, the company is announcing a new version of Edge as well. No surprise there.

Like Windows 10, the Edge update mostly focuses on helping users be more productive, Divya Kumar, a senior product marketing manager on the Edge team, told me. Edge has always had some of those features, including its surprisingly useful Cortana integration, so this is mostly a continuation of an existing theme. The update also includes the usual reliability, power efficiency, speed and security tweaks.

The marquee feature of this update, though, is the addition of autofill for payment info. A couple of other browsers already do this, including Edge’s archrival Google Chrome, so it’s no major surprise that Microsoft is now adding this feature, too. It’s pretty straightforward, and if you live in the Microsoft ecosystem, it’ll definitely save you the aggravation of having to regularly punch in the same credit card numbers.

The feature that many users will likely be even happier to see, though, is the ability to mute tabs. Given how many sites now love to autoplay videos, you never know which tab will randomly start talking to you. Now, tabs that play audio show a speaker icon, and a right-click on a tab lets you mute it. Again, this is nothing new in the browser world, but it’s much appreciated. It would be even better, though, if Microsoft went one step further and let you mute sites by default, too.

If you still print articles from the browser, this update now also allows you to print just the text without all the ads and other widgets around it. This is essentially an extension of the distraction-free reading view Edge has long featured. Not essential, but nice to have, I guess.

There are a few other new reading features in the Edge update, too. There’s now full-screen reading for PDFs, books and pages opened in Reading View. There’s also a new grammar tool in Edge that’s mostly interesting for students: it lets you break down words by syllables and can highlight the parts of speech in a sentence.

As Microsoft also announced earlier this month, the company is rededicating itself to building better developer tools for Edge, too, with the launch of the Edge DevTools Preview app. This allows developers to test the latest Edge features without having to install the latest Windows 10 Insider builds.



from TechCrunch https://ift.tt/2I1Wjpa
via IFTTT
