
Machine learning-enhanced gesture recognition through impedance signal analysis

Open Access | Jun 2024

Figures & Tables

Figure 1:

An overview of the gesture recognition system.

Figure 2:

Evaluation of five hand gestures designed for holding, grasping, and relaxing activities.

Figure 3:

The precision impedance analyzer WK6632.

Figure 4:

The electrode for gesture recognition data acquisition.

Figure 5:

Electrode configuration for impedance data collection: four electrodes (E1-E4) yielding six measurement pairs (E1-E2, E1-E3, E1-E4, E2-E3, E2-E4, and E3-E4).
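For clarity, four electrodes give C(4, 2) = 6 unordered measurement pairs. A minimal sketch of that enumeration (the electrode names come from the caption; everything else is illustrative):

```python
from itertools import combinations

# Four electrodes, as in Figure 5; measuring impedance between every
# unordered pair yields C(4, 2) = 6 channels.
electrodes = ["E1", "E2", "E3", "E4"]

pairs = list(combinations(electrodes, 2))
print(pairs)
# [('E1', 'E2'), ('E1', 'E3'), ('E1', 'E4'), ('E2', 'E3'), ('E2', 'E4'), ('E3', 'E4')]
```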

Figure 6:

Comparative SNR performance for the five hand gestures (A, B, C, D, and E); each gesture was measured as 208 values across 4 electrodes, with 200 repetitions.
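The exact SNR formula is not given in this listing; one common choice is 20*log10(mean/std) over repeated measurements, sketched below with synthetic data standing in for the 200 repetitions per gesture.

```python
import numpy as np

def snr_db(measurements: np.ndarray) -> np.ndarray:
    """Per-channel SNR in dB from repeated impedance measurements.

    measurements: array of shape (repetitions, channels). The mean/std
    ratio used here is one common SNR definition; the paper's exact
    formula may differ.
    """
    mean = measurements.mean(axis=0)
    std = measurements.std(axis=0)
    return 20.0 * np.log10(np.abs(mean) / std)

# Example with synthetic data: 200 repetitions, 4 channels.
rng = np.random.default_rng(0)
fake = rng.normal(loc=100.0, scale=2.0, size=(200, 4))
print(snr_db(fake))
```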

Figure 7:

Measurements obtained from the electrode pairs across the five gestures (A, B, C, D, and E) for three volunteers: (a) Person 1, (b) Person 2, (c) Person 3.

Figure 8:

Investigating PCA for gesture recognition with two dimensions.

Figure 9:

Investigating PCA for gesture recognition with three dimensions.
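A minimal sketch of the projection step behind Figures 8 and 9, assuming scikit-learn's PCA; the feature matrix and labels are placeholders for the real impedance data.

```python
import numpy as np
from sklearn.decomposition import PCA

# Project impedance feature vectors onto 2 and 3 principal components
# for visualisation. X and labels are placeholders, not the study's data.
rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 208))        # e.g. 208 impedance values per sample
labels = rng.integers(0, 5, size=1000)  # gestures A-E encoded as 0-4

X_2d = PCA(n_components=2).fit_transform(X)
X_3d = PCA(n_components=3).fit_transform(X)
print(X_2d.shape, X_3d.shape)  # (1000, 2) (1000, 3)
```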

Figure 10:

Standard deviation values of the sample data from three volunteers (Person 1, Person 2, and Person 3), with gestures labeled A, B, C, D, and E.
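A short sketch of how the per-gesture standard deviations in Figure 10 could be computed, with placeholder arrays standing in for one volunteer's samples.

```python
import numpy as np

# Group samples by gesture label and report the spread of each group.
# X and labels are placeholders for one volunteer's recorded data.
rng = np.random.default_rng(2)
X = rng.normal(size=(500, 208))
labels = rng.choice(list("ABCDE"), size=500)

for gesture in "ABCDE":
    std = X[labels == gesture].std()
    print(f"Gesture {gesture}: std = {std:.3f}")
```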

Figure 11:

Confusion matrices demonstrating the classification accuracy of machine learning models for hand gesture recognition.
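A minimal sketch of how each panel in Figure 11 can be produced with scikit-learn; the label vectors below are placeholders, not the study's data.

```python
from sklearn.metrics import confusion_matrix

# Compare predicted gestures against true labels to obtain one
# confusion matrix per model (placeholder labels for illustration).
y_true = list("AABBCCDDEE")
y_pred = list("AABBCCDDEA")

print(confusion_matrix(y_true, y_pred, labels=list("ABCDE")))
```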

Performance evaluation of machine learning models.

Model           Fold    Accuracy  Precision  Recall  F1-Score
KNN             Fold-1  0.70      0.72       0.70    0.72
                Fold-2  0.93      0.96       0.93    0.92
                Fold-3  0.91      0.92       0.91    0.90
                Fold-4  0.94      0.96       0.94    0.94
                Fold-5  0.81      0.87       0.81    0.79
                Mean    0.86      0.89       0.86    0.85
GBM             Fold-1  0.94      0.96       0.94    0.94
                Fold-2  0.93      0.94       0.93    0.92
                Fold-3  0.76      0.63       0.76    0.68
                Fold-4  0.95      0.95       0.95    0.95
                Fold-5  0.70      0.79       0.70    0.69
                Mean    0.86      0.86       0.86    0.84
Naive Bayes     Fold-1  0.72      0.73       0.72    0.71
                Fold-2  0.93      0.96       0.93    0.92
                Fold-3  0.96      0.97       0.96    0.96
                Fold-4  0.91      0.95       0.91    0.89
                Fold-5  0.70      0.74       0.70    0.67
                Mean    0.84      0.87       0.84    0.83
LR              Fold-1  0.80      0.79       0.80    0.78
                Fold-2  0.94      0.96       0.94    0.94
                Fold-3  0.93      0.94       0.93    0.92
                Fold-4  0.93      0.95       0.93    0.92
                Fold-5  0.85      0.89       0.85    0.83
                Mean    0.89      0.90       0.89    0.88
Random Forest   Fold-1  0.74      0.74       0.74    0.73
                Fold-2  0.93      0.96       0.93    0.92
                Fold-3  0.96      0.97       0.96    0.96
                Fold-4  0.93      0.96       0.93    0.92
                Fold-5  0.81      0.89       0.81    0.79
                Mean    0.87      0.90       0.87    0.86
SVM             Fold-1  0.72      0.73       0.72    0.71
                Fold-2  0.94      0.96       0.94    0.94
                Fold-3  0.94      0.95       0.94    0.94
                Fold-4  0.94      0.96       0.94    0.94
                Fold-5  0.80      0.86       0.80    0.78
                Mean    0.87      0.89       0.87    0.86
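A hedged sketch of the evaluation protocol implied by the table: 5-fold cross-validation over the six classifiers with accuracy, weighted precision, recall, and F1. The feature matrix, labels, and hyperparameters below are placeholders; the paper's exact settings may differ.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

# Placeholder impedance features and gesture labels (A-E encoded as 0-4).
rng = np.random.default_rng(3)
X = rng.normal(size=(1000, 24))
y = rng.integers(0, 5, size=1000)

models = {
    "KNN": KNeighborsClassifier(),
    "GBM": GradientBoostingClassifier(),
    "Naive Bayes": GaussianNB(),
    "LR": LogisticRegression(max_iter=1000),
    "Random Forest": RandomForestClassifier(),
    "SVM": SVC(),
}

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
for name, model in models.items():
    scores = []
    for train_idx, test_idx in cv.split(X, y):
        model.fit(X[train_idx], y[train_idx])
        pred = model.predict(X[test_idx])
        scores.append([
            accuracy_score(y[test_idx], pred),
            precision_score(y[test_idx], pred, average="weighted", zero_division=0),
            recall_score(y[test_idx], pred, average="weighted"),
            f1_score(y[test_idx], pred, average="weighted"),
        ])
    # Mean accuracy, precision, recall, F1 across the five folds.
    print(name, np.round(np.mean(scores, axis=0), 2))
```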
Language: English
Page range: 63 - 74
Submitted on: Mar 31, 2024
Published on: Jun 11, 2024
Published by: University of Oslo
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2024 Hoang Nhut Huynh, Quoc Tuan Nguyen Diep, Minh Quan Cao Dinh, Anh Tu Tran, Nguyen Chau Dang, Thien Luan Phan, Trung Nghia Tran, Congo Tak Shing Ching, published by University of Oslo
This work is licensed under the Creative Commons Attribution 4.0 License.