What is it about?

This work focuses on improving how wearable devices such as smartwatches and fitness trackers understand user actions, even when the same model is applied to different people, devices, or environments. Many current systems struggle when conditions change, for example when switching from one user to another, or when only a small amount of labeled training data is available. To address this, we developed a new approach called ContrastSense. By leveraging contextual information, such as when the data is collected and who collects it, ContrastSense achieves consistent performance across diverse scenarios. We evaluated ContrastSense on tasks including activity recognition and gesture recognition, where it showed significant improvements over existing methods.
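The sketch below illustrates, in simplified PyTorch form, one way contextual labels such as user IDs could steer contrastive learning toward domain-invariant features. The function name, its parameters (e.g., `cross_domain_weight`), and the weighting scheme are illustrative assumptions made for this summary, not the exact ContrastSense formulation described in the paper.

```python
# Minimal sketch (an assumption for illustration, not the paper's exact method):
# an InfoNCE-style contrastive loss in which each sensor window is pulled toward
# its own augmented view, while negatives from a *different* domain (e.g., another
# user) are down-weighted so the encoder is not rewarded for separating samples
# merely by who collected them.
import torch
import torch.nn.functional as F


def domain_aware_info_nce(z1, z2, domains, temperature=0.1, cross_domain_weight=0.3):
    """z1, z2: (N, D) embeddings of two augmented views of the same N windows.
    domains: (N,) integer context labels, e.g., user IDs or collection sessions.
    """
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                      # (2N, D)
    d = torch.cat([domains, domains], dim=0)            # (2N,)
    n2 = z.shape[0]
    sim = z @ z.t() / temperature                       # pairwise similarities

    # The positive for each embedding is the other augmented view of its window.
    idx = torch.arange(n2, device=z.device)
    pos_idx = idx.roll(n2 // 2)

    # Negative weights: full weight for same-domain pairs, reduced weight for
    # cross-domain pairs, zero for self-pairs and for the positive itself.
    same_domain = d.unsqueeze(0) == d.unsqueeze(1)
    weights = torch.where(same_domain,
                          torch.ones_like(sim),
                          torch.full_like(sim, cross_domain_weight))
    weights[idx, idx] = 0.0
    weights[idx, pos_idx] = 0.0

    exp_sim = torch.exp(sim)
    pos = exp_sim[idx, pos_idx]
    denom = pos + (weights * exp_sim).sum(dim=1)
    return -torch.log(pos / denom).mean()


# Toy usage: 8 windows from 3 hypothetical users, 32-dimensional embeddings.
torch.manual_seed(0)
view1, view2 = torch.randn(8, 32), torch.randn(8, 32)
user_ids = torch.tensor([0, 0, 1, 1, 1, 2, 2, 2])
print(domain_aware_info_nce(view1, view2, user_ids).item())
```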

Why is it important?

Wearable technology plays an increasingly vital role in health monitoring, fitness tracking, and gaming. However, existing systems often struggle to deliver consistent performance when faced with diverse users, devices, or environments. ContrastSense tackles two critical challenges: adapting to variations across users and devices, and overcoming the scarcity of labeled data, which is costly and time-consuming to collect. By enhancing the generalizability of wearable systems, ContrastSense reduces the reliance on extensive data collection, paving the way for smarter, more adaptable devices that cater to a broader and more diverse range of users.

Perspectives

Writing this paper has been a deeply rewarding experience, especially as it marks my first publication during my PhD journey. Developing ContrastSense allowed me to dive into the fascinating intersection of wearable technology and advanced artificial intelligence, exploring solutions to real-world challenges like generalizability and label scarcity. In addition, collaborating with my co-authors has been an inspiring and enriching process, and I am excited about the potential impact this work could have on improving wearable systems for diverse applications, from healthcare to gaming. This paper represents not just a technical contribution but also a personal milestone in my academic career, and I hope it sparks meaningful discussions and inspires further innovation in the field.

Gaole DAI
Nanyang Technological University

Read the Original

This page is a summary of: ContrastSense: Domain-invariant Contrastive Learning for In-the-Wild Wearable Sensing, Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, November 2024, ACM (Association for Computing Machinery).
DOI: 10.1145/3699744.
