A Short Review of Deep Learning Methods in Visual Servoing Systems

Abstract

This survey explores the evolution and applications of Visual Servoing Systems in robotics, emphasizing the transition from traditional image processing techniques to neural networks for feature extraction and control. Robotic systems, integral to manufacturing, surveillance, and healthcare, increasingly rely on Visual Servoing for enhanced interaction with their work environments. While early work relied on auxiliary visual sensors to improve robustness and accuracy, recent advances have shifted toward deep learning methods for direct control and feature extraction. The survey covers the differences between classical Visual Servoing architectures and novel Deep Learning-based methods, highlighting their respective advantages and limitations regarding stability, precision, and real-time applicability. Innovative approaches, such as Direct Visual Servoing and the use of Siamese networks for camera pose estimation, demonstrate significant progress in overcoming the challenges of traditional Visual Servoing. Through a detailed examination of leading research, the survey highlights the potential of neural networks to advance this domain by enhancing feature extraction, reducing reliance on precise calibration, and improving control laws for complex robotic tasks.

DOI: https://doi.org/10.2478/bipie-2023-0018 | Journal eISSN: 2537-2726 | Journal ISSN: 1223-8139
Language: English
Page range: 113 - 136
Submitted on: Jul 31, 2024
Accepted on: Oct 14, 2024
Published on: Nov 9, 2024
Published by: Gheorghe Asachi Technical University of Iasi
In partnership with: Paradigm Publishing Services
Publication frequency: 4 issues per year

© 2024 Adrian-Paul Botezatu, Adrian Burlacu, published by Gheorghe Asachi Technical University of Iasi
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.