
Intelligent Guide Robots for People who are Blind or have Low Vision: A Review

Open Access | Dec 2022

References

  1. Ackerman, E., 2021. Available at: https://spectrum.ieee.org/tencents-new-wheeled-robot-flicks-its-tail-to-do-backflips [Accessed 15 June 2021].
  2. Adagale, V. & Mahajan, S., 2015. Robot Guide System for Blind People Using GPS and GSM. International Journal of Electrical and Electronic Engineering & Telecommunications, 4(2), pp. 16-21.
  3. Antoun, S. M. & McKerrow, P. J., 2010. Mimicking a blind person navigating a corridor using a K-Sonar with a mobile robot. s.l., Proceedings of the 3rd International Symposium on Practical Cognitive Agents and Robots, pp. 1-8.
  4. Anybotics, 2022. anybotics.com. Available at: https://www.anybotics.com/anymal-autonomous-legged-robot/ [Accessed 02 January 2022].
  5. Barra, J. et al., 2019. Localization system in GPS-denied environments using radar and IMU measurements: application to a smart white cane. Napoli, Italy, 18th European Control Conference (ECC).
  6. Boston Dynamics, 2021. Available at: https://www.bostondynamics.com/products/spot [Accessed 28 December 2021].
  7. Bruno, D. R., De Assis, M. H. & Osório, F. S., 2019. Development of a Mobile Robot: Robotic Guide Dog for Aid of Visual Disabilities in Urban Environments. Rio Grande, Brazil, 2019 Latin American Robotics Symposium (LARS), 2019 Brazilian Symposium on Robotics (SBR) and 2019 Workshop on Robotics in Education (WRE), pp. 104-108.
  8. Capi, G. & Toda, H., 2011. A new robotic system to assist visually impaired people. Atlanta, USA, 2011 RO-MAN, pp. 259-263.
  9. Cardillo, E. et al., 2018. An Electromagnetic Sensor Prototype to Assist Visually Impaired and Blind People in Autonomous Walking. IEEE Sensors Journal, 18(6), pp. 2568-2576.
  10. Chen, K.-Y. & Tsui, C.-Y., 2021. The Fuzzy Control Approach for a Quadruped Robot Guide Dog. International Journal of Fuzzy Systems, Volume 23, pp. 1789-1796.
  11. Chuang, T.-K. et al., 2018. Deep Trail-Following Robotic Guide Dog in Pedestrian Environments for People who are Blind and Visually Impaired - Learning from Virtual and Real Worlds. Brisbane, Australia, 2018 IEEE International Conference on Robotics and Automation (ICRA), pp. 5849-5855.
  12. Deep Robotics, 2022. Available at: https://www.deeprobotics.cn/en/products_jy_303.html [Accessed 02 January 2022].
  13. Deverell, L. et al., 2020. Use of technology by orientation and mobility professionals in Australia and Malaysia before COVID-19. Disability and Rehabilitation: Assistive Technology.
  14. Endo, Y., Sato, K., Yamashita, A. & Matsubayashi, K., 2017. Indoor positioning and obstacle detection for visually impaired navigation system based on LSD-SLAM. Kyoto, Japan, 2017 International Conference on Biometrics and Kansei Engineering (ICBAKE), pp. 158-162.
  15. Fankhauser, P. et al., 2016. Free Gait – An Architecture for the Versatile Control of Legged Robots. s.l., IEEE-RAS International Conference on Humanoid Robots, pp. 1052-1058.
  16. Fankhauser, P. et al., 2018. Robust Rough-Terrain Locomotion with a Quadrupedal Robot. Brisbane, Australia, 2018 IEEE International Conference on Robotics and Automation (ICRA), pp. 5761-5768.
  17. Gomez, J. V. & Sandnes, F. E., 2012. RoboGuideDog: guiding blind users through physical environments with laser range scanners. Procedia Computer Science, Vol 14, pp. 218-225.
  18. Guerreiro, J. et al., 2019. CaBot: Designing and Evaluating an Autonomous Navigation Robot for Blind People. s.l., The 21st International ACM SIGACCESS Conference on Computers and Accessibility, pp. 68-92.
  19. Guerrero, J. C., Quezada-V, C. & Chacón-Troya, D., 2018. Design and Implementation of an Intelligent Cane, with Proximity Sensors, GPS Localization and GSM Feedback. s.l., 2018 IEEE Canadian Conference on Electrical & Computer Engineering (CCECE), pp. 1-4.
  20. Gunethilake, W., 2020. Blind Navigation using Deep Learning-Based Obstacle Detection, Colombo: University of Colombo.
  21. Hill, J. & Black, J., 2003. The Miniguide: A New Electronic Travel Device. Journal of Visual Impairment & Blindness, 97(10), pp. 1-6.
  22. Holmes, N. & Prentice, K., 2015. iPhone video link FaceTime as an orientation tool: Remote O&M for people with vision impairment. International Journal of Orientation & Mobility, 7(1), pp. 60-67.
  23. Kalpana, S., Rajagopalan, S., Ranjith, R. & Gomathi, R., 2020. Voice Recognition Based Multi Robot for Blind People Using Lidar Sensor. s.l., 2020 International Conference on System, Computation, Automation and Networking (ICSCAN), pp. 1-6.
  24. Kayukawa, S. et al., 2019. BBeep: A Sonic Collision Avoidance System for Blind Travellers and Nearby Pedestrians. s.l., Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, pp. 1-12.
  25. Khandelwal, P. & Stone, P., 2014. Leading the Way: An Efficient Multi-Robot Guidance System. s.l., 2014 AAAI Fall Symposium Series.
  26. Kulyukin, V. et al., 1999. A robotic wayfinding system for the visually impaired. s.l., Proceedings of the National Conference on Artificial Intelligence, pp. 864-869.
  27. Lacey, G. & Dawson-Howe, K. M., 1998. The application of robotics to a mobility aid for the elderly blind. Robotics and Autonomous Systems, 23(4), pp. 245-252.
  28. Lacey, G. & Dawson-Howe, K. M., 1997. Evaluation of Robot Mobility Aid for the Elderly Blind. s.l., Proceedings of the Fifth International Symposium on Intelligent Robotic Systems.
  29. Lloyd, J., Budge, C. & Stafford, K., 2021. Handlers’ Expectations and Perceived Compatibility regarding the Partnership with Their First Guide Dogs. Animals, 11(10), p. 2765.
  30. Long, N. et al., 2019. Low power millimeter wave radar system for the visually impaired. The Journal of Engineering, Volume 19, pp. 6034-6038.
  31. Long, N. et al., 2019. Unifying obstacle detection, recognition, and fusion based on millimeter wave radar and RGB-depth sensors for the visually impaired. Review of Scientific Instruments, 90(4), p. 044102.
  32. Megalingam, R. K., Vishnu, S., Sasikumar, V. & Sreekumar, S., 2019. Autonomous Path Guiding Robot for Visually Impaired People. In: P. Mallick, V. Balas, A. Bhoi & A. Zobaa, eds. Cognitive Informatics and Soft Computing. s.l.: Springer, pp. 257-266.
  33. Miyazaki, Y., Ohya, A. & Yuta, S., 2000. Obstacle avoidance behavior of autonomous mobile robot using fiber grating vision sensor. Nagoya, Japan, 2000 26th Annual Conference of the IEEE Industrial Electronics Society (IECON 2000), pp. 1925-1930.
  34. NSK Ltd, 2015. NSK Improves LIGHBOT™ (Guide Robot with Indoor Navigation and Obstacle Avoidance). Available at: https://www.nsk.com/company/news/2015/press1202d.html [Accessed 28 December 2021].
  35. Ozioko, O., Nassar, H. & Dahiya, R., 2021. 3D Printed Interdigitated Capacitor Based Tilt Sensor. IEEE Sensors Journal, 21(23), pp. 26252-26259.
  36. Petsiuk, A. L. & Pearce, J. M., 2019. Low-Cost Open Source Ultrasound-Sensing Based Navigational Support for the Visually Impaired. Sensors, 19(17), pp. 1-14.
  37. Rahman, S. U., Ullah, S. & Ullah, S., 2018. Obstacle Detection in Indoor Environment for Visually Impaired Using Mobile Camera. Journal of Physics: Conference Series, 960(1), p. 012046.
  38. Raibert, M., Blankespoor, K., Nelson, G. & Playter, R., 2008. BigDog, the Rough-Terrain Quadruped Robot. Seoul, Proceedings of the 17th World Congress, The International Federation of Automatic Control, pp. 10822-10825.
  39. Rumipamba L., J. A., Pérez C., A. R., Flores A., C. E. & Romero S., J. M., 2019. Prototype of guide robot using marks in dynamic environments for visually impaired people. Shanghai, China, Proceedings Volume 11321, 2019 International Conference on Image and Video Processing, and Artificial Intelligence, p. 1132109.
  40. Santos, A. D. P. D., Medola, F. O., Cinelli, M. J. & Ramirez, A. R. G., 2020. Are electronic white canes better than traditional canes? A comparative study with blind and blindfolded participants. Universal Access in the Information Society, Volume 20, pp. 93-103.
  41. Seamless Vision, 2022. Available at: https://www.seamless-vision.com [Accessed 02 January 2022].
  42. Shao, Y. et al., 2021. Learning Free Gait Transition for Quadruped Robots via Phase-Guided Controller. IEEE Robotics and Automation Letters.
  43. Sharifi, N., Ali, M., Holmes, G. & Chen, Y., 2020. Blind Obstacle Avoidance Using Taxicab Geometry for Nanorobot-Assisted Direct Drug Targeting. Montreal, Canada, 42nd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), pp. 4700-4703.
  44. Shiizu, Y., Hirahara, Y., Yanashima, K. & Magatani, K., 2007. The development of a white cane which navigates the visually impaired. s.l., 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 5005-5008.
  45. Shoval, S., Borenstein, J. & Koren, Y., 1994. Mobile robot obstacle avoidance in a computerized travel aid for the blind. San Diego, USA, Proceedings of the 1994 IEEE International Conference on Robotics and Automation.
  46. Shoval, S., Borenstein, J. & Koren, Y., 1998. The NavBelt - a computerized travel aid for the blind based on mobile robotics technology. IEEE Transactions on Biomedical Engineering, 45(11), pp. 1376-1386.
  47. Ulrich, I. & Borenstein, J., 1998. VFH+: Reliable Obstacle Avoidance for Fast Mobile Robots. Leuven, Belgium, Proceedings of the 1998 IEEE International Conference on Robotics and Automation.
  48. Unitree Robotics, 2022. Available at: https://www.unitree.com/products/aliengo [Accessed 02 January 2022].
  49. Vorapatratorn, S., Suchato, A. & Punyabukkana, P., 2021. Fast obstacle detection system for the blind using depth image and machine learning. Engineering and Applied Science Research, 48(5), pp. 593-603.
  50. Wang, S. et al., 2021. Balance Control of a Novel Wheel-legged Robot: Design and Experiments. Xi'an, China, IEEE International Conference on Robotics and Automation (ICRA), pp. 6782-6788.
  51. Weilan, 2022. Available at: http://www.weilan.com [Accessed 03 January 2022].
  52. Wei, Y., Kou, X. & Lee, M. C., 2014. A New Vision and Navigation Research for a Guide-dog Robot System in Urban System. Besançon, France, 2014 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), pp. 1290-1295.
  53. Wong, C., Wee, D., Murray, I. & Dias, T., 2001. A novel design of integrated proximity sensors for the white cane. Perth, Australia, The Seventh Australian and New Zealand Intelligent Information Systems Conference, pp. 197-201.
  54. World Health Organisation, 2021. Available at: https://www.who.int/news-room/fact-sheets/detail/blindness-and-visual-impairment#:~:text=Globally%2C%20at%20least%202.2%20billion,uncorrected%20refractive%20errors%20and%20cataracts.
  55. Xiao, A. et al., 2021. Robotic Guide Dog: Leading a Human with Leash-Guided Hybrid Physical Interaction. Xi'an, China, IEEE International Conference on Robotics and Automation (ICRA), pp. 11470-11476.
  56. Yelamarthi, K., Haas, D., Nielsen, D. & Mothersell, S., 2010. RFID and GPS integrated navigation system for the visually impaired. Seattle, USA, 2010 53rd IEEE International Midwest Symposium on Circuits and Systems, pp. 1149-1152.
  57. Yupa, F., Pozo, N. & Ayala, E., 2021. Smart White Cane for Unevenness Detection Using an Optical Device for Visually Impaired People Assistance. In: Á. Rocha, P. López-López & J. Salgado-Guerrero, eds. Communication, Smart Technologies and Innovation for Society. Singapore: Springer, pp. 659-671.
  58. Zhang, H., Jin, L. & Ye, C., 2021. An RGB-D Camera Based Visual Positioning System for Assistive Navigation by a Robotic Navigation Aid. IEEE/CAA Journal of Automatica Sinica, 8(8), pp. 1389-1400.
  59. Zhang, Z. et al., 2021. Efficient Motion Planning Based on Kinodynamic Model for Quadruped Robots Following Persons in Confined Spaces. IEEE/ASME Transactions on Mechatronics, 26(4), pp. 1997-2006.
Language: English
Page range: 1 - 15
Accepted on: Oct 11, 2022
Published on: Dec 27, 2022
Published by: Guide Dogs NSW/ACT
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2022 Karthick Thiyagarajan, Sarath Kodagoda, Mark Luu, Taliah Duggan-Harper, Doug Ritchie, Kelly Prentice, Jodi Martin, published by Guide Dogs NSW/ACT
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.