
Biomechanical sensor signal analysis based on machine learning for human gait classification

By: Hacer Kuduz and Fırat Kaçar
Open Access
|Dec 2024

Abstract

The present study investigates the effect of wearable sensor placement and the use of various machine learning (ML) algorithms for human gait pattern recognition, classifying temporal gait speeds from wearable multichannel sensor data. Human gait can thus be classified from features extracted from biomechanical sensor signals, and the contribution of each sensor to gait biomechanics can be evaluated, using a machine learning approach. In this study, IMU (Inertial Measurement Unit) and goniometer (GON) sensor features were first extracted with the sliding-window method and then fed to ML classifiers to classify human gait speed. Our experimental findings show that multi-sensor and fusion sensor models provide superior classification performance compared to single-sensor models, with the highest accuracy achieved by the ‘FUS09’ fusion sensor model and an SVM-based classifier. The classification accuracies of the FUS10 fusion sensor model, in which all sensor parameters are combined, and of the FUS09 model, in which the GON_ANK and IMU_Ft parameters are excluded, are 0.895 and 0.901, respectively. Consequently, wearable biomechanical sensor data combined with a machine learning approach can readily be applied to human activity recognition tasks.
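The pipeline described in the abstract (sliding-window feature extraction from sensor signals, followed by an SVM classifier) can be sketched roughly as follows. This is a minimal illustration only: the synthetic two-speed signals, window length, step size, and the particular statistical features are assumptions for demonstration, not the paper's actual data, sensors, or parameters.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

def sliding_window_features(signal, win=100, step=50):
    """Extract simple statistical features (mean, std, min, max)
    from overlapping windows of a 1-D sensor signal.
    Window length and step are illustrative choices."""
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        feats.append([w.mean(), w.std(), w.min(), w.max()])
    return np.array(feats)

rng = np.random.default_rng(0)
t = np.arange(0, 60, 0.01)  # 60 s of a hypothetical sensor channel at 100 Hz

# Synthetic stand-in for two gait-speed classes: the "fast" class has a
# higher oscillation frequency and amplitude (purely illustrative).
X_parts, y_parts = [], []
for label, freq, amp in [(0, 1.0, 1.0), (1, 2.0, 1.8)]:
    sig = amp * np.sin(2 * np.pi * freq * t) + 0.1 * rng.standard_normal(len(t))
    f = sliding_window_features(sig)
    X_parts.append(f)
    y_parts.append(np.full(len(f), label))

X = np.vstack(X_parts)
y = np.concatenate(y_parts)

# Standardize features, then train and evaluate an SVM classifier
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
scaler = StandardScaler().fit(X_tr)
clf = SVC(kernel="rbf").fit(scaler.transform(X_tr), y_tr)
acc = clf.score(scaler.transform(X_te), y_te)
print(f"test accuracy: {acc:.3f}")
```

In the study itself, the feature vectors would come from the IMU and GON channels of the chosen sensor model (e.g. the FUS09 combination) rather than from a single synthetic signal.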

DOI: https://doi.org/10.2478/jee-2024-0059 | Journal eISSN: 1339-309X | Journal ISSN: 1335-3632
Language: English
Page range: 513 - 521
Submitted on: Sep 12, 2024 | Published on: Dec 6, 2024
In partnership with: Paradigm Publishing Services
Publication frequency: 6 issues per year

© 2024 Hacer Kuduz, Fırat Kaçar, published by Slovak University of Technology in Bratislava
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.