Evaluation of Finger Direction Recognition Method for Behavior Control of Robot

Open Access | Dec 2013

Language: English
Page range: 2308 - 2333
Submitted on: Nov 20, 2013
Accepted on: Dec 15, 2013
Published on: Dec 31, 2013
Published by: Professor Subhas Chandra Mukhopadhyay
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2013 T. Ikai, M. Ohka, S. Kamiya, H. Yussof, S. C. Abdullah, published by Professor Subhas Chandra Mukhopadhyay
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.