
Pattern Adaptive and Finger Image-guided Keypad Interface for In-vehicle Information Systems

Open Access
Dec 2017

References

  1. National Police Agency of Japan, Statistics of Traffic Accidents in 2007 (in Japanese), http://www.npa.go.jp/toukei/koutuu48/H19.All.pdf.
  2. National Police Agency of Japan, Statistics of Traffic Accidents Caused by Using Car Navigation and Cellular Phone (in Japanese), http://www.npa.go.jp/comment/result/koutsuukikaku2/honbun/betu06.pdf.
  3. J. B. F. Van Erp, H. A. H. C. Van Veen, "Vibrotactile In-vehicle Navigation System", Transportation Research Part F, Vol.7, 2004, pp. 247-256. DOI: 10.1016/j.trf.2004.09.003
  4. G. Costagliola, S. Marino, F. Ferrucci, G. Oliviero, U. Montemuro, A. Paliotti, "Handy – A New Interaction Device for Vehicular Information Systems", Mobile Human-Computer Interaction – Mobile HCI 2004, Proceedings, Vol.3160, 2004, pp. 264-275.
  5. T. Kumagai, M. Akamatsu, "Prediction of Human Driving Behavior Using Dynamic Bayesian Networks", IEICE Transactions on Information and Systems, Vol.E89-D, No.2, 2006, pp. 857-860. DOI: 10.1093/ietisy/e89-d.2.857
  6. A. Marcus, "Vehicle User Interface: the next revolution", Interactions, Vol.1, 2004, pp. 40-47. DOI: 10.1145/962342.962358
  7. Y. Shimizu, Actuators for a Tactile Display (in Japanese), http://ushiku2.se.uec.ac.jp/shimizu/research/kaken/ds33.html#ds335.
  8. S. Shimada, M. Shinohara, Y. Shimizu, M. Shimojo, "An Approach for Direct Manipulation by Tactile Modality for Blind Computer Users: Development of the Second Trial Production", Computers Helping People with Special Needs, Proceedings, Vol.4061, 2006, pp. 1036-1046.
  9. M. Shinohara, Y. Shimizu, M. Mochizuki, "Three-Dimensional Tactile Display for the Blind", IEEE Transactions on Rehabilitation Engineering, Vol.6, No.3, 1998, pp. 249-256. DOI: 10.1109/86.712218
  10. K. Kyung, D. Kwon, G. Yang, "A Novel Interactive Mouse System for Holistic Haptic Display in a Human-Computer Interface", International Journal of Human-Computer Interaction, Vol.20, Issue 3, 2006, pp. 247-270. DOI: 10.1207/s15327590ijhc2003_5
  11. M. Jung, T. Matsuno, S. Kim, T. Fukuda, T. Arai, "Effect of Tactile Display in Visually Guiding Input Device", IEEE/RSJ 2006 International Conference on Intelligent Robots and Systems, Proceedings, 2006, pp. 5046-5051. DOI: 10.1109/IROS.2006.282534
  12. S. Kim, K. Sekiyama, T. Fukuda, K. Tanaka, K. Itoigawa, "Development of Dynamically Reformable Input Device in Tactile and Visual Interaction", IEEE 2007 International Symposium on Micro-Nano Mechatronics and Human Science, Proceedings, 2007, pp. 544-549. DOI: 10.1109/MHS.2007.4420914
  13. L. Rabiner, B. Juang, Fundamentals of Speech Recognition, Prentice Hall, 1993, Chap. 6.
  14. KGS Co., Ltd., Tactile Display Cells (SC5), http://www.kgs-jpn.co.jp/epiezo.html.
  15. M. Shimojo, M. Shinohara, Y. Fukui, "Human Shape Recognition Performance for 3-D Tactile Display", IEEE Transactions on Systems, Man and Cybernetics – Part A: Systems and Humans, Vol.29, No.6, 1999, pp. 637-644. DOI: 10.1109/3468.798067
Language: English
Page range: 572 - 591
Published on: Dec 13, 2017
Published by: Professor Subhas Chandra Mukhopadhyay
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2017 Sang-Ho Kim, Kosuke Sekiyama, Toshio Fukuda, published by Professor Subhas Chandra Mukhopadhyay
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.