
Virtual Fusion With Contrastive Learning for Single-Sensor-Based Activity Recognition

Published in: IEEE Sensors Journal (Volume: 24, Issue: 15, August 2024)
Authors: Duc-Anh Nguyen, Cuong Pham, Nhien-An Le-Khac
DOI: https://doi.org/10.1109/JSEN.2024.3412397
Summary Contributed by: Nhien-An Le-Khac (Author)

Human activity recognition (HAR) relies on sensory data to identify human activities, with applications in healthcare, exercise monitoring, sports performance analysis, human-machine interaction, gaming, sign language translation, and more. Sensors such as cameras and wearables each have unique advantages and limitations; wearable sensors are often preferred for their mobility, affordability, and privacy protection. Because different sensors can complement one another, combining them can improve accuracy. However, single-sensor systems often struggle with limited perspectives, while sensor fusion, though more accurate, requires significant resources, increasing cost, compromising privacy, and creating usability issues.

This paper proposes a novel "Virtual Fusion with Contrastive Learning (VFCL)" method for single-sensor-based HAR. Virtual fusion aims to exploit the correlation between multiple sensors during training while requiring only a single sensor for inference. Experiments demonstrate its effectiveness in improving activity recognition accuracy across diverse datasets. The approach also helps address challenges such as limited labeled sensor data and data variability, providing the benefits of sensor fusion without the drawbacks of deploying multiple sensors.

Despite their different characteristics, all sensors describe the same human activity at any given timestamp. Hence, the researchers train the model to find common patterns among these sensors. The key idea is to leverage unlabeled multi-sensor data and labeled single-sensor data to learn richer feature representations through contrastive learning. Specifically, the model learns to map data samples from different sensors at the same timestamp to similar feature vectors while pushing apart features from different timestamps. This allows the model to transfer knowledge from the unlabeled multi-sensor data and makes the learning more consistent and robust to noisy data.
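
To make the mechanism concrete, here is a minimal PyTorch sketch of this cross-sensor contrastive idea. It is an illustration under stated assumptions, not the authors' exact implementation: the encoder architectures, window sizes, sensor names (accelerometer and skeleton), and temperature value are hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F

def cross_sensor_contrastive_loss(feat_a, feat_b, temperature=0.1):
    # InfoNCE-style loss: windows from two sensors recorded at the same
    # timestamp are pulled together; windows from different timestamps in
    # the batch are pushed apart.
    feat_a = F.normalize(feat_a, dim=1)           # (batch, dim)
    feat_b = F.normalize(feat_b, dim=1)           # (batch, dim)
    logits = feat_a @ feat_b.t() / temperature    # pairwise cosine similarities
    targets = torch.arange(feat_a.size(0), device=feat_a.device)
    # Row i should peak at column i (the matching timestamp), symmetrically
    # for both sensors.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

# Hypothetical per-sensor encoders mapping raw windows to a shared feature space.
accel_encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 128, 256), nn.ReLU(), nn.Linear(256, 128))
skeleton_encoder = nn.Sequential(nn.Flatten(), nn.Linear(34 * 128, 256), nn.ReLU(), nn.Linear(256, 128))
classifier = nn.Linear(128, 10)                   # classification head for the deployed sensor

accel = torch.randn(32, 3, 128)                   # synchronized accelerometer windows
skeleton = torch.randn(32, 34, 128)               # synchronized skeleton windows
labels = torch.randint(0, 10, (16,))              # only part of the batch is labeled

z_accel, z_skel = accel_encoder(accel), skeleton_encoder(skeleton)
loss = cross_sensor_contrastive_loss(z_accel, z_skel)            # unlabeled multi-sensor pairs
loss = loss + F.cross_entropy(classifier(z_accel[:16]), labels)  # supervised term on the target sensor
loss.backward()
# At inference time, only accel_encoder and classifier are needed.

Only the target sensor's encoder and classifier are kept at deployment, which is what makes the fusion "virtual."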

Furthermore, the researchers extend virtual fusion to "Actual Fusion within Virtual Fusion" (AFVF), which enables inference using a subset of the training sensors. This sensor subset is fused and treated as an independent sensor, providing flexibility in sensor selection for deployment based on factors like cost, privacy, or environmental constraints. The study also points out that the fused feature vector of a sensor subset, after passing through a dimension-projection layer, should be included in the contrastive loss function alongside its constituent sensors.
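
The sketch below hints at how AFVF could extend the previous example: a chosen sensor subset is fused (here simply by concatenation), projected back to the common feature dimension, and contrasted alongside its constituent sensors. The fusion operator, layer sizes, and sensor names are illustrative assumptions, not the paper's exact design.

import torch
import torch.nn as nn
import torch.nn.functional as F

def cross_sensor_contrastive_loss(feat_a, feat_b, temperature=0.1):
    # Same InfoNCE-style loss as in the previous sketch.
    feat_a, feat_b = F.normalize(feat_a, dim=1), F.normalize(feat_b, dim=1)
    logits = feat_a @ feat_b.t() / temperature
    targets = torch.arange(feat_a.size(0), device=feat_a.device)
    return 0.5 * (F.cross_entropy(logits, targets) + F.cross_entropy(logits.t(), targets))

feat_dim = 128
fusion_projection = nn.Linear(2 * feat_dim, feat_dim)   # projects the concatenated pair back to feat_dim

def fuse_subset(z_accel, z_gyro):
    # Treat the {accelerometer, gyroscope} subset as one extra "virtual" sensor.
    return fusion_projection(torch.cat([z_accel, z_gyro], dim=1))

# Per-sensor features of shape (batch, feat_dim) from the corresponding encoders.
z_accel, z_gyro, z_skeleton = (torch.randn(32, feat_dim) for _ in range(3))
z_fused = fuse_subset(z_accel, z_gyro)

# Every branch, including the fused one, is contrasted against the others
# captured at the same timestamps.
branches = [z_accel, z_gyro, z_skeleton, z_fused]
total_loss = sum(cross_sensor_contrastive_loss(a, b)
                 for i, a in enumerate(branches)
                 for b in branches[i + 1:])
total_loss.backward()
# At deployment, only the chosen subset (accelerometer + gyroscope) and
# fusion_projection are needed for inference.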

Extensive experiments were conducted on several public HAR datasets, comparing virtual fusion and AFVF to single-sensor baselines and actual sensor fusion approaches. The results show that virtual fusion consistently outperforms single-sensor models and, in some cases, even surpasses the accuracy of multi-sensor fusion. AFVF also achieves state-of-the-art performance on benchmark datasets, demonstrating the effectiveness of the proposed techniques.

Key contributions of the work include:

  1. Leveraging unlabeled multi-sensor data through contrastive learning to enhance single-sensor classification.
  2. Enabling sensor subset selection for inference through AFVF, providing flexibility in deployment.
  3. Demonstrating that contrasting the fused and original modalities benefits performance.

Overall, this work presents a promising direction for single-sensor HAR that can provide the advantages of sensor fusion without the associated deployment and maintenance costs.
