Event-Based Estimation of Hand Forces From High-Density Surface EMG on a Parallel Ultralow-Power Microcontroller
Modeling hand kinematics and dynamics is essential for building effective human–machine interfaces (HMIs). While deep learning (DL) has achieved high accuracy in gesture classification, regression-based approaches offer more natural and flexible control by predicting continuous hand movements and forces. Surface electromyography (sEMG) plays a key role in this context, capturing continuous muscle activity that enables intuitive control across wearable applications such as prosthetics, robotics, and augmented reality.
Despite its promise, much of the current research overlooks a critical challenge: efficient execution on resource-constrained platforms—an essential requirement for embedded wearable systems. This limitation is particularly evident in DL-based methods, which often prioritize low regression error at the cost of increased memory and computational demands.
To address this gap, this work presents a lightweight, bioinspired framework for event-based encoding of high-density sEMG (HD-sEMG) signals to estimate multifinger forces in real time. Implemented on an ultralow-power microcontroller, the approach emphasizes both accuracy and execution efficiency, making it suitable for embedded wearable systems. Spiking neural networks (SNNs), used here for their energy-efficient, event-driven nature, process sparse spike trains that mimic biological neurons more closely than traditional DL models do. Each neuron integrates input spikes over time, decays in the absence of input, and fires an output spike upon crossing a threshold, enabling sparse, low-latency, asynchronous computation.
By extracting spike trains via a leaky integrate-and-fire (LIF) encoding, the solution estimates finger force with a mean absolute error of just 8.42% ± 2.80% of maximum voluntary contraction, on par with state-of-the-art methods that were evaluated under far simpler experimental conditions.
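The leaky integrate-and-fire encoding described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the decay constant and threshold are placeholder values, and the actual firmware operates per-channel on streaming samples rather than on a full array.

```python
import numpy as np

def lif_encode(emg, decay=0.9, threshold=1.0):
    """Encode rectified sEMG (samples x channels) into binary spike trains,
    one leaky integrate-and-fire neuron per channel.

    Each neuron leaks its membrane by `decay` every step, integrates the
    new input, and emits a spike (resetting to zero) when the membrane
    crosses `threshold`. Both parameters are illustrative assumptions.
    """
    n_samples, n_channels = emg.shape
    membrane = np.zeros(n_channels)
    spikes = np.zeros((n_samples, n_channels), dtype=np.uint8)
    for t in range(n_samples):
        membrane = decay * membrane + np.abs(emg[t])  # leak, then integrate
        fired = membrane >= threshold
        spikes[t, fired] = 1
        membrane[fired] = 0.0  # reset after firing
    return spikes
```

Because each neuron only emits a spike when enough input has accumulated, a stronger contraction produces a denser spike train, which is what makes the downstream regression on spike activity possible.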
Prior studies have focused on fixed tasks or single-day sessions. This study instead evaluated the most challenging "RANDOM" subset of the High-densitY Surface Electromyogram Recording (HYSER) dataset, which includes multifinger, multiday data without predefined force patterns. Through L₁ regularization, the regression model prunes redundant channels so that only 42–84 of the original 256 retain significant weights. This data-driven sparsification not only identifies minimal sensor configurations for reliable decoding but also reduces computational and power demands.
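The channel-pruning effect of L₁ regularization can be reproduced in miniature with an off-the-shelf Lasso regressor. The data below are synthetic stand-ins (random features, a hypothetical set of informative channels), and the regularization strength is an illustrative choice, not the paper's setting.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Hypothetical spike-rate features: 1000 windows x 256 HD-sEMG channels,
# where only a subset of channels actually carries force information.
X = rng.standard_normal((1000, 256))
true_w = np.zeros(256)
true_w[:60] = rng.standard_normal(60)        # 60 informative channels
y = X @ true_w + 0.1 * rng.standard_normal(1000)

# L1 regularization drives the weights of redundant channels exactly to
# zero; alpha trades sparsity against accuracy (value is illustrative).
model = Lasso(alpha=0.05).fit(X, y)
active = np.flatnonzero(model.coef_)
print(f"{active.size} of 256 channels retain nonzero weights")
```

The indices in `active` play the role of the minimal sensor configuration: channels with zero weight can simply be dropped from acquisition and computation.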
The entire processing pipeline, from event-based encoding to linear regression, runs on an ultralow-power microcontroller featuring an eight-core RISC-V cluster optimized for parallel computation. By parallelizing neuron updates and weight multiplications across the cluster, the system processes all channels in 276 µs while consuming just 6.5 µJ per sample, a 2.8x to 11x energy-efficiency gain over single-core implementations.
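A minimal sketch of the parallelization scheme, assuming a static, contiguous split of the 256 channels across the eight cores (the actual firmware is C on the RISC-V cluster; the function and variable names here are illustrative, as is the choice of five force outputs):

```python
import numpy as np

N_CORES = 8
N_CHANNELS = 256

def core_partial(core_id, spikes, weights):
    """Work assigned to one core: the spike-weight products for its slice
    of channels. On the actual cluster each core would also run the LIF
    updates for its slice; here only the regression reduction is modeled."""
    chunk = N_CHANNELS // N_CORES
    lo, hi = core_id * chunk, (core_id + 1) * chunk
    return spikes[lo:hi] @ weights[lo:hi]

rng = np.random.default_rng(1)
spikes = rng.integers(0, 2, N_CHANNELS).astype(float)   # one encoded sample
weights = rng.standard_normal((N_CHANNELS, 5))          # 5 force outputs

partials = [core_partial(c, spikes, weights) for c in range(N_CORES)]
force = np.sum(partials, axis=0)   # final reduction across the cores
assert np.allclose(force, spikes @ weights)
```

Each core touches only its own 32-channel slice, so the per-sample work shrinks roughly eightfold before the final reduction. For scale, 6.5 µJ spent over 276 µs corresponds to an average cluster power of roughly 24 mW.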
The complete solution was evaluated in an end-to-end, online experiment—streaming HD-sEMG data from a computer to the microcontroller unit and returning real-time force estimates—demonstrating stable cross-day performance under realistic conditions. All code for the event-based encoder and regression training has been open-sourced, encouraging further community development.
This work represents a significant advancement as the first real-time, energy-efficient multifinger force estimator on a microcontroller. It paves the way for more natural and fine-grained human–machine interfaces in next-generation prosthetics, robotics, and augmented reality systems by achieving strong accuracy and adaptability within the strict power and latency constraints of embedded devices, even in non-laboratory settings.