
Improved Competitive Neural Network for Classification of Human Postures Based on Data from RGB-D Sensors

Open Access | Mar 2024

Abstract

The cognitive goal of this paper is to assess whether marker-less motion capture systems provide sufficient data to recognize human postures in the side view. The research goal is to develop a new posture classification method that allows for analysing human activities using data recorded by RGB-D sensors. The method is insensitive to the duration of the recorded activity and gives satisfactory results for the sagittal plane. An improved competitive neural network (cNN) was used. The method of preprocessing the data is discussed first. Then, a method for classifying human postures is presented. Finally, the classification quality obtained using various distance metrics is assessed. Data sets covering a selection of human activities were created, and the postures typical of these activities were identified using the classifying neural network. The classification quality obtained using the proposed cNN and two other popular neural networks was compared. The results confirmed the advantage of the cNN. The developed method makes it possible to recognize human postures by observing movement in the sagittal plane.
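For readers unfamiliar with competitive learning, the sketch below illustrates the general idea of a winner-take-all (competitive) classifier applied to posture feature vectors, with classification quality depending on the chosen distance metric. This is a minimal illustration under assumed settings (feature dimension, number of prototypes, learning-rate schedule, and the synthetic data are all illustrative), not the implementation or parameters used in the paper.

```python
# Minimal sketch of a competitive (winner-take-all) classifier for posture
# feature vectors. All parameters and data below are illustrative assumptions.
import numpy as np

def euclidean(a, b):
    # Euclidean distance between prototype rows and an input vector.
    return np.linalg.norm(a - b, axis=-1)

def cosine(a, b):
    # Cosine distance (1 - cosine similarity), an alternative metric.
    num = np.sum(a * b, axis=-1)
    den = np.linalg.norm(a, axis=-1) * np.linalg.norm(b, axis=-1) + 1e-12
    return 1.0 - num / den

class CompetitiveNN:
    def __init__(self, n_prototypes, n_features, metric=euclidean, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(size=(n_prototypes, n_features))  # prototype vectors
        self.metric = metric

    def winner(self, x):
        # Index of the prototype closest to the input under the chosen metric.
        return int(np.argmin(self.metric(self.w, x)))

    def fit(self, X, epochs=50, lr0=0.1):
        # Standard competitive learning rule: move the winning prototype
        # toward each sample, with a decaying learning rate.
        for epoch in range(epochs):
            lr = lr0 * (1.0 - epoch / epochs)
            for x in np.random.default_rng(epoch).permutation(X):
                k = self.winner(x)
                self.w[k] += lr * (x - self.w[k])
        return self

    def predict(self, X):
        return np.array([self.winner(x) for x in X])

if __name__ == "__main__":
    # Synthetic 3-D "posture feature" vectors drawn around 4 cluster centres.
    rng = np.random.default_rng(1)
    centers = rng.normal(scale=3.0, size=(4, 3))
    X = np.vstack([c + rng.normal(scale=0.3, size=(50, 3)) for c in centers])
    net = CompetitiveNN(n_prototypes=4, n_features=3, metric=euclidean).fit(X)
    print(net.predict(X[:5]))
```

Swapping `metric=euclidean` for `metric=cosine` in the constructor shows how the choice of distance metric changes which prototype wins, which is the kind of comparison the abstract refers to when assessing classification quality under different metrics.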

DOI: https://doi.org/10.14313/jamris/3-2023/19 | Journal eISSN: 2080-2145 | Journal ISSN: 1897-8649
Language: English
Page range: 15 - 28
Submitted on: Jan 5, 2023
Accepted on: Apr 18, 2023
Published on: Mar 4, 2024
Published by: Łukasiewicz Research Network – Industrial Research Institute for Automation and Measurements PIAP
In partnership with: Paradigm Publishing Services
Publication frequency: 4 issues per year

© 2024 Vibekananda Dutta, Jakub Cydejko, Teresa Zielińska, published by Łukasiewicz Research Network – Industrial Research Institute for Automation and Measurements PIAP
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.