
A Wearable Multisensor Fusion System for Neuroprosthetic Hand

Published in: IEEE Sensors Journal (Volume: 25, Issue: 8, April 2025)
Authors: Liu Honghai, Meng Jianjun, Ding Han, Guo Weichao, Shi Shang, Yang Xingchen, Yin Zongtian
DOI: https://doi.org/10.1109/JSEN.2025.3546214
Summary Contributed by: Liu Honghai (Author)

Prosthetic arms and hands have advanced considerably in mechanics and design. However, achieving control of a prosthetic hand that feels natural, reliable, and intuitive remains a major challenge. Most commercial prosthetics rely on surface electromyography (sEMG), which detects electrical activity from muscles. While effective in many cases, these systems often misinterpret gestures, require frequent recalibration, and become unreliable when the user's arm changes position.

To address these limitations, researchers have developed a compact, wearable multi-sensor fusion system that integrates three complementary sensing technologies:

  • A-mode Ultrasound – Captures real-time muscle shape changes, including deep muscle activity by using high-frequency sound pulses.
  • Surface EMG (sEMG) – Measures the electrical signals generated by muscle contractions.
  • Inertial Measurement Unit (IMU) – Tracks arm orientation, acceleration, and movement patterns.

By combining structural (ultrasound), electrical (sEMG), and motion (IMU) data, the system builds a more complete, stable, and accurate understanding of user intent, enabling smoother control of prosthetic limbs—even during complex, dynamic motions.
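The idea of combining the three modalities can be illustrated with a feature-level fusion sketch. This is not the authors' pipeline; the feature choices, channel counts, and frame sizes below are hypothetical, chosen only to show how structural, electrical, and motion features could be concatenated into one vector for a gesture classifier.

```python
import numpy as np

# Hypothetical per-frame feature extraction for each modality.
# Channel counts and feature definitions are illustrative, not from the paper.

def ultrasound_features(echo):
    """Mean echo magnitude per probe as a crude muscle-deformation feature."""
    return np.mean(np.abs(echo), axis=1)          # shape: (4 probes,)

def semg_features(emg):
    """Root-mean-square amplitude per sEMG channel."""
    return np.sqrt(np.mean(emg ** 2, axis=1))     # shape: (4 channels,)

def imu_features(imu):
    """Mean reading per IMU axis over the frame."""
    return np.mean(imu, axis=1)                   # shape: (6 axes,)

def fused_feature_vector(echo, emg, imu):
    """Concatenate all modality features into one input for a classifier."""
    return np.concatenate([
        ultrasound_features(echo),
        semg_features(emg),
        imu_features(imu),
    ])

# Example frame: 4 ultrasound probes x 1000 samples, 4 sEMG channels x 200
# samples, 6 IMU axes x 50 samples (all sizes assumed for illustration).
rng = np.random.default_rng(0)
x = fused_feature_vector(rng.normal(size=(4, 1000)),
                         rng.normal(size=(4, 200)),
                         rng.normal(size=(6, 50)))
print(x.shape)  # (14,)
```

A downstream gesture model would then be trained on such fused vectors, letting each modality contribute the information it captures best.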

Key Features and Advantages:

  • Compact and Integration-Ready: The fusion system includes four small sensor probes (38.5 × 20.5 × 13.5 mm) and a compact processing unit (50 × 40 × 15 mm), all small enough to fit seamlessly within a standard prosthetic socket.
  • High Gesture Recognition Accuracy: In tests involving 16 healthy participants and 4 amputees, the system achieved 94.8% ± 1.1% accuracy for healthy users and 96.9% ± 1.3% for amputees across six common hand gestures.
  • Real-Time, On-Chip Processing: Data is processed directly on an embedded microcontroller unit (MCU). A-mode ultrasound is triggered by high-voltage pulses, echo signals are digitized, and sEMG is amplified and filtered—all in real time, with no need for external devices.
  • Improved Stability in Real-World Tasks: A novel control strategy maps each sensor’s strength to specific functions: sEMG helps detect resting states, ultrasound and EMG together inform gesture models, and the IMU filters unintended motion such as shaking. This enhances stability and reduces fatigue during everyday tasks.
  • Consistent Performance in Changing Arm Positions: Unlike traditional EMG-only systems, this solution maintains accuracy even when users raise, lower, or rotate their arms, which is critical for natural, everyday movement.
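The role-per-sensor control strategy described above can be sketched as a simple decision rule. The thresholds and function names here are assumptions for illustration; real values would be calibrated per user, and the actual controller is more sophisticated.

```python
# Illustrative thresholds; actual values would be calibrated per user.
REST_EMG_RMS = 0.05     # below this, treat the muscles as relaxed
SHAKE_GYRO_STD = 2.0    # above this, treat the arm as shaking

def control_decision(emg_rms, gyro_std, gesture_from_model):
    """Map each sensor's strength to its role: sEMG gates resting states,
    the IMU vetoes unintended motion (e.g. shaking), and the fused
    gesture model drives the hand otherwise."""
    if emg_rms < REST_EMG_RMS:
        return "rest"               # muscles relaxed: hold current grip
    if gyro_std > SHAKE_GYRO_STD:
        return "hold"               # arm shaking: suppress spurious commands
    return gesture_from_model       # stable and active: execute the gesture

print(control_decision(0.01, 0.5, "pinch"))   # rest
print(control_decision(0.30, 5.0, "pinch"))   # hold
print(control_decision(0.30, 0.5, "pinch"))   # pinch
```

Gating commands this way is what reduces false activations: a classifier output is only acted on when both the electrical and motion channels agree the user intends a gesture.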

Why This Matters for Prosthetic Users

For prosthetic users, this system offers more than just better performance; it provides confidence. It reduces false activations, adapts to real-world arm movements, and makes gesture-based control feel more intuitive and dependable. Daily tasks like gripping a cup, typing, or opening a door become more effortless, fluid, and controlled.

Future efforts should focus on expanding gesture recognition for richer interaction and miniaturizing the device for enhanced energy efficiency.

By fusing ultrasound, electrical, and motion sensing in a compact system, this work represents a significant step forward in prosthetic technology, bringing us closer to prosthetic limbs that respond as naturally as their biological counterparts.
