Deep Learning Transformer Model for Human Activity Recognition

References

  1. Cao K., Wang M., Human behavior recognition based on sparse transformer with channel attention mechanism, Frontiers in Physiology, 2023, 14, 1239453.
  2. Demrozi F., Pravadelli G., Bihorac A., Rashidi P., Human Activity Recognition Using Inertial, Physiological and Environmental Sensors: A Comprehensive Survey, IEEE Access, 2020, 8, 210816-210836.
  3. Ek S., Portet F., Lalanda P., Transformer-based models to deal with heterogeneous environments in Human Activity Recognition, Personal and Ubiquitous Computing, 2023, 27, 1-14.
  4. Meng W., Liu Z., Li B., Cui W., Zhou J.T., Zhang L., GrapHAR: A Lightweight Human Activity Recognition Model by Exploring the Sub-Carrier Correlations, IEEE Transactions on Wireless Communications, 2024, 23, 2755-2770.
  5. Ordóñez F.J., Roggen D., Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition, Sensors, 2016, 16.
  6. Shavit Y., Klein I., Boosting Inertial-Based Human Activity Recognition With Transformers, IEEE Access, 2021, 9, 53540-53547.
  7. Wang M., A Comprehensive Survey on Human Activity Recognition Using Sensing Technology, Highlights in Science, Engineering and Technology, 2022, 376-389.
  8. Yang J.-B., Nguyen M.N., San P.P., Li X.L., Krishnaswamy S., Deep Convolutional Neural Networks on Multichannel Time Series for Human Activity Recognition, In Proceedings of the 24th International Joint Conference on Artificial Intelligence (IJCAI), 2015, 3995-4001.
DOI: https://doi.org/10.2478/bipie-2024-0011 | Journal eISSN: 2537-2726 | Journal ISSN: 1223-8139
Language: English
Page range: 87-100
Submitted on: Jun 22, 2025 | Accepted on: Jul 12, 2025 | Published on: Feb 21, 2026
In partnership with: Paradigm Publishing Services
Publication frequency: 4 issues per year

© 2026 Ionuţ-Adrian Iftode, Cristian-Ioan Foşalău, published by Gheorghe Asachi Technical University of Iasi
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.