What is it about?

The paper presents the OCOsense™ smart glasses system, which recognizes and monitors facial gestures and expressions using novel non-contact optomyographic OCO™ sensors and an IMU (a motion sensor similar to those found in smartphones) placed inside the frames of the glasses. The glasses stream the sensor data via Bluetooth to a mobile device, where data-fusion algorithms recognize facial gestures and expressions in real time. The recognized gestures and expressions are then used as input to interact with the mobile device.
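As a rough illustration of that pipeline (a minimal sketch, not the authors' implementation), the Python snippet below shows how streamed optomyographic and IMU samples might be windowed, fused, and classified into gesture events on the mobile device. All names here (Sample, fuse, run, the classifier interface, the window size) are assumptions for illustration; the paper does not publish its algorithms.

```python
# Minimal sketch: stream -> fuse -> recognize -> act.
# All names and parameters are hypothetical, not from the paper.
from dataclasses import dataclass
from collections import deque

WINDOW = 50  # samples per sliding window (assumed)

@dataclass
class Sample:
    omg: tuple[float, ...]  # optomyographic channel readings
    imu: tuple[float, ...]  # accelerometer/gyroscope readings

def fuse(window):
    """Concatenate OMG and IMU readings over the window (placeholder fusion)."""
    feats = []
    for s in window:
        feats.extend(s.omg)
        feats.extend(s.imu)
    return feats

def run(stream, classifier, on_gesture):
    """Consume streamed samples, classify sliding windows, emit gesture events."""
    window = deque(maxlen=WINDOW)
    for sample in stream:
        window.append(sample)
        if len(window) == WINDOW:
            label = classifier.predict(fuse(window))
            if label != "neutral":
                on_gesture(label)  # e.g. map a smile to a "select" command
```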

Why is it important?

We believe that the OCOsense™ glasses are the next generation of wearables: they will enable a better understanding of the user's context and emotional state, and open up numerous ways to interact with smart devices and computer systems, including within Augmented and Extended Reality environments. Future versions of the system could be used in a variety of domains, including affective computing, remote mental-health monitoring, and hands-free human-computer interaction, improving the accessibility and inclusivity of future technologies.

Perspectives

Monitoring facial reactivity and expressivity over time gives us the opportunity to explore, for example, symptom severity in mental health conditions, and to provide training and feedback on how our faces behave in different settings. In this work we detect facial gestures and use them as input in basic human-computer interaction examples, as sketched below. The user is in control of when the device is used, simply by putting it on, and the system can be programmed for various other applications. The novelty of this approach to facial tracking is that the sensors target strategic locations on the face and provide high-resolution output under different lighting conditions, overcoming some of the challenges that current computer-vision tools face.
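For instance, a recognized gesture could drive a device action through a simple lookup table. The mapping below is purely illustrative; the gesture labels and actions are assumptions, not the paper's actual command set.

```python
# Illustrative gesture-to-action mapping (assumed labels and actions).
ACTIONS = {
    "smile": "confirm",
    "frown": "cancel",
    "eyebrow_raise": "open_menu",
    "head_nod": "scroll_down",
}

def on_gesture(label: str) -> None:
    """Translate a recognized gesture label into a UI action."""
    action = ACTIONS.get(label)
    if action is not None:
        print(f"Triggering UI action: {action}")
```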

Ifigeneia Mavridou
Emteq Labs

Read the Original

This page is a summary of: OCOsense Glasses – Monitoring Facial Gestures and Expressions for Augmented Human-Computer Interaction, April 2023, ACM (Association for Computing Machinery). DOI: 10.1145/3544549.3583918.
You can read the full text via the DOI above.
