Researchers developed a sensing system to constantly track the performance of workers

Researchers have come up with a mobile-sensing system that can track and rate the performance of workers by combining a smartphone, a fitness bracelet and a custom app.

The mobile-sensing system, as the researchers call it, classifies workers as high or low performers. The team used it to track 750 U.S. workers for one year, and the system distinguished high performers from low performers with 80% accuracy.

The aim, the researchers say, is to give employees insight into their physical, emotional and behavioral well-being. But that constant flow of data also has a downside: if abused, it can put employees under constant surveillance by the companies they work for.

The researchers, including Dartmouth College computer science professor Andrew Campbell, whose earlier work on a student-monitoring app provided the underlying technology for this system, see it as a positive gateway to improving worker productivity.

“This is a radically new approach to evaluating workplace performance using passive sensing data from phones and wearables,” said Campbell. “Mobile sensing and machine learning might be the key to unlocking the best from every employee.”

The researchers argue that the technology can provide a more objective measure of performance than self-evaluations and interviews, which they say can be unreliable.

The mobile-sensing system developed by the researchers has three distinct pieces. A smartphone tracks physical activity, location, phone use and ambient light. The fitness tracker monitors heart function, sleep and stress, along with measurements like weight and calorie consumption. Meanwhile, location beacons placed in the home and office provide information on time at work and breaks from the desk.
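As a rough illustration of how those three streams might be combined, here is a minimal Python sketch that merges one day of phone, wearable and beacon readings into a single feature record per worker. The field names (phone_unlocks, deep_sleep_min and so on) and the merge_sources helper are assumptions made for illustration, not the study's actual schema.

```python
# Hypothetical sketch: merge one day of readings from the three data sources
# described in the article into a single per-worker feature record.
# All field names are illustrative assumptions, not the researchers' schema.
from dataclasses import dataclass


@dataclass
class DailyFeatures:
    worker_id: str
    phone_unlocks: int          # smartphone: phone use
    screen_time_min: float      # smartphone: phone use
    resting_heart_rate: float   # fitness tracker: heart function
    deep_sleep_min: float       # fitness tracker: sleep
    time_at_desk_min: float     # location beacons: time at work
    desk_breaks: int            # location beacons: breaks from the desk


def merge_sources(worker_id: str, phone: dict, wearable: dict, beacons: dict) -> DailyFeatures:
    """Combine one day of phone, wearable and beacon readings."""
    return DailyFeatures(
        worker_id=worker_id,
        phone_unlocks=phone["unlocks"],
        screen_time_min=phone["screen_time_min"],
        resting_heart_rate=wearable["resting_hr"],
        deep_sleep_min=wearable["deep_sleep_min"],
        time_at_desk_min=beacons["desk_min"],
        desk_breaks=beacons["breaks"],
    )


# Example usage with made-up values.
record = merge_sources(
    worker_id="w001",
    phone={"unlocks": 42, "screen_time_min": 210.0},
    wearable={"resting_hr": 62.0, "deep_sleep_min": 85.0},
    beacons={"desk_min": 410.0, "breaks": 6},
)
print(record)
```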

From here, cloud-based machine learning algorithms are used to classify workers by performance level.

The study found that higher performers typically had lower rates of phone usage, longer periods of deep sleep and higher levels of physical activity.
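To make the classification step concrete, the sketch below trains a standard supervised model on the kind of daily features the study highlights (phone use, deep sleep, physical activity) and reports cross-validated accuracy. The synthetic data, feature choices and gradient-boosting model are assumptions for illustration, not the paper's actual pipeline; on random labels the accuracy will hover near chance rather than the 80% reported for the real cohort.

```python
# Hedged sketch of a high/low-performer classifier on daily sensing features.
# Synthetic data and model choice are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_workers = 750  # size of the study's cohort

# Stand-in daily features: screen time, deep sleep, step count.
X = np.column_stack([
    rng.normal(180, 60, n_workers),    # daily screen time (minutes)
    rng.normal(90, 30, n_workers),     # deep sleep (minutes)
    rng.normal(7000, 2500, n_workers), # step count
])
y = rng.integers(0, 2, n_workers)      # 1 = high performer, 0 = low performer

model = GradientBoostingClassifier()
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"cross-validated accuracy: {scores.mean():.2f}")
```

In the system described here, models of this kind run in the cloud on features uploaded from the phone, wearable and beacons rather than on the device itself.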

Privacy experts and labor advocates have long raised concerns about the practice of tracking employees. That hasn’t stopped companies from incentivizing employees to wear fitness trackers in exchange for savings on insurance or other benefits. Startups have popped up to offer even more ways to track employees.

For instance, in February WeWork acquired Euclid, a data platform that tracks the identity and behavior of people in the physical world. Shiva Rajaraman, WeWork’s chief product officer, told TechCrunch at the time that the Euclid platform and its team would be integrated into a software analytics package that WeWork plans to sell to companies that aren’t renting WeWork space but want to WeWork-ify their own offices.

Meanwhile, the team of researchers suggests that while its system of continuous monitoring via wearables and other devices is not yet available, it could arrive in the next few years. It’s unclear whether the team is making a calculated guess or has designs on launching the system as a product.

The team, led by Dartmouth College, included researchers from the University of Notre Dame, the Georgia Institute of Technology, the University of Washington, the University of Colorado Boulder, the University of California, Irvine, Ohio State University, the University of Texas at Austin and Carnegie Mellon University.

A paper describing the study will be published in the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.



from TechCrunch https://ift.tt/2NnBx9P
via IFTTT
