
References

  1. Augestad, K., Sneve, A. and Lindsetmo, R. (2020). Telemedicine in postoperative follow-up of stoma patients: A randomized clinical trial (the STOMPA trial), British Journal of Surgery 107(5): 509–518, DOI: 10.1002/bjs.11491.
  2. Boraii, S. (2017). A descriptive study to assess quality of life in Egyptian patients with a stoma, Ostomy Wound Manage 63(7): 28–33, PMID: 28759426.
  3. Buckley, K., Adelson, L. and Hess, C. (2005). Get the picture! Developing a wound photography competency for home care nurses, Journal of Wound, Ostomy and Continence Nursing 32(3): 171–177, DOI: 10.1097/00152192-200505000-00005.
  4. Bulkley, J., McMullen, C., Grant, M., Wendel, C., Hornbrook, M. and Krouse, R. (2018). Ongoing ostomy self-care challenges of long-term rectal cancer survivors, Support Care Cancer 26(11): 3933–3939, DOI: 10.1007/s00520-018-4268-0.
  5. Ciecierski, K.A. (2020). Mathematical methods of signal analysis applied in medical diagnostic, International Journal of Applied Mathematics and Computer Science 30(3): 449–462, DOI: 10.34768/amcs-2020-0033.
  6. Ciążyński, K.A. and Fabijańska, A. (2015). Detection of QR-codes in digital images based on histogram similarity, Image Processing and Communications 20(2): 41–48, DOI: 10.1515/ipc-2015-0033.
  7. Dinuzzi, V., Palomba, G., Minischetti, M., Amendola, A., Aprea, P., Luglio, G., De Palma, G. and Aprea, G. (2021). Telemedicine in patients with an ostomy during the COVID-19 pandemic: A retrospective observational study, Wound Management & Prevention 67(1): 12–17, DOI: 10.25270/wmp.2021.1.1217.
  8. Fiala, M. (2005). Comparing ARTag and ARToolkit Plus fiducial marker systems, IEEE International Workshop on Haptic Audio Visual Environments and their Applications, Ottawa, Canada, pp. 147–152.
  9. Garrido-Jurado, S., Muñoz-Salinas, R., Madrid-Cuevas, F. and Medina-Carnicer, R. (2016). Generation of fiducial marker dictionaries using mixed integer linear programming, Pattern Recognition 51: 481–491, DOI: 10.1016/j.patcog.2015.09.023.
  10. Hansen, D.K., Nasrollahi, K., Rasmussen, C.B. and Moeslund, T.B. (2017). Real-time barcode detection and classification using deep learning, International Joint Conference on Computational Intelligence, Funchal, Madeira, Portugal, pp. 321–327.
  11. de Heide, J., Vroegh, C.J., Szili-Torok, T., Gobbens, R.J., Zijlstra, F., Takens-Lameijer, M., Lenzen, M.J., Yap, S.C. and Scholte Op Reimer, W.J.M. (2017). A pilot feasibility study of telemedical wound assessment using a mobile phone in cardiology patients, Journal of Cardiovascular Nursing 32(2): E9–E15, DOI: 10.1097/JCN.0000000000000377.
  12. Hoon, L., Chi Sally, C. and Hong-Gu, H. (2013). Effect of psychosocial interventions on outcomes of patients with colorectal cancer: A review of the literature, European Journal of Oncology Nursing 17(6): 883–891, DOI: 10.1016/j.ejon.2013.05.001.
  13. Houben, S., Droeschel, D. and Behnke, S. (2016). Joint 3D laser and visual fiducial marker based SLAM for a micro aerial vehicle, 2016 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), Baden-Baden, Germany, pp. 609–614.
  14. Kaltenbrunner, M. and Bencina, R. (2007). reacTIVision: A computer-vision framework for table-based tangible interaction, 1st International Conference on Tangible and Embedded Interaction, TEI’07, Baton Rouge, USA, pp. 69–74, DOI: 10.1145/1226969.1226983.
  15. Kato, H., Billinghurst, M., Poupyrev, I., Imamoto, K. and Tachibana, K. (2000). Virtual object manipulation on a table-top AR environment, IEEE and ACM International Symposium on Augmented Reality (ISAR 2000), Munich, Germany, pp. 111–119, http://www.hitl.washington.edu/artoolkit/.
  16. Kowalski, M., Kaczmarek, P., Kabaciński, R., Matuszczak, M., Tranbowicz, K. and Sobkowiak, R. (2014). A simultaneous localization and tracking method for a worm tracking system, International Journal of Applied Mathematics and Computer Science 24(3): 599–609, DOI: 10.2478/amcs-2014-0043.
  17. Li, M., Howard, D. and King, R. (2019). A picture tells a thousand words: Smartphone-based secure clinical image transfer improves compliance in open fracture management, Injury 50(7): 1284–1287, DOI: 10.1016/j.injury.2019.05.010.
  18. Li, W., Nee, A. and Ong, S. (2017). A state-of-the-art review of augmented reality in engineering analysis and simulation, Multimodal Technologies and Interaction 1(3): 17, DOI: 10.3390/mti1030017.
  19. Lim, H. and Lee, Y. (2009). Real-time single camera SLAM using fiducial markers, 2009 ICCAS-SICE, Fukuoka, Japan, pp. 177–182.
  20. Muñoz-Salinas, R., Marín-Jimenez, M. and Medina-Carnicer, R. (2019). SPM-SLAM: Simultaneous localization and mapping with squared planar markers, Pattern Recognition 86: 156–171, DOI: 10.1016/j.patcog.2018.09.003.
  21. Nolan, C. and Forde, E. (2016). A review of the use of fiducial markers for image-guided bladder radiotherapy, Acta Oncologica 55(5): 533–538, DOI: 10.3109/0284186X.2015.1110250.
  22. Pata, F., Bondurri, A., Ferrara, F., Parini, D., Rizzo, G. and MISSTO (2020). Enteral stoma care during the COVID-19 pandemic: Practical advice, Colorectal Disease 22(9): 985–992, DOI: 10.1111/codi.15279.
  23. Romero-Ramirez, F., Muñoz-Salinas, R. and Medina-Carnicer, R. (2018). Speeded up detection of squared fiducial markers, Image and Vision Computing 76: 38–47, DOI: 10.1016/j.imavis.2018.05.004.
  24. Romero-Ramirez, F., Muñoz-Salinas, R. and Medina-Carnicer, R. (2019). Fractal markers: A new approach for long-range marker pose estimation under occlusion, IEEE Access 7: 169908–169919, DOI: 10.1109/ACCESS.2019.2951204.
  25. Rufli, M., Scaramuzza, D. and Siegwart, R. (2008). Automatic detection of checkerboards on blurred and distorted images, 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems, Nice, France, pp. 3121–3126.
  26. Sani, M.F. and Karimian, G. (2017). Automatic navigation and landing of an indoor AR drone quadrotor using ArUco marker and inertial sensors, 2017 International Conference on Computer and Drone Applications (IConDA), Kuching, Malaysia, pp. 102–107.
  27. Szwoch, M., Zawiślak, R., Mik, M., Mik-Wojtczak, J. and Granosik, G. (2021). Stoma-Alert-Final—Development of IT system model supporting the process of diagnosing, treating and rehabilitating patients with stoma (image part), Project no. POIR. 04.01.04-00-0107/19, Technical report, PHIN Consulting and Lodz University of Technology, Lodz.
  28. Tändl, M., Stark, T., Erol, N.E., Löer, F. and Kecskeméthy, A. (2009). An object-oriented approach to simulating human gait motion based on motion tracking, International Journal of Applied Mathematics and Computer Science 19(3): 469–483, DOI: 10.2478/v10006-009-0038-y.
  29. Todt, E. and Torras, C. (2000). Detection of natural landmarks through multiscale opponent features, 15th International Conference on Pattern Recognition, ICPR-2000, Barcelona, Spain, Vol. 3, pp. 976–979.
  30. Wang, Q., Zhao, J., Huo, X., Wu, L., Yang, L.F., Li, J. and Wang, J. (2018). Effects of a home care mobile app on the outcomes of discharged patients with a stoma: A randomised controlled trial, Journal of Clinical Nursing 27(19–20): 3592–3602, DOI: 10.1111/jocn.14515.
  31. Zawiślak, R., Mik, M., Mik-Wojtczak, J. and Granosik, G. (2020). Stoma-Alert—Development of IT system model supporting the process of diagnosing, treating and rehabilitating patients with stoma—Project no. POIR. 04.01.01-00-0066/18-01, Technical report, PHIN Consulting and Lodz University of Technology, Lodz.
DOI: https://doi.org/10.34768/amcs-2022-0012 | Journal eISSN: 2083-8492 | Journal ISSN: 1641-876X
Language: English
Page range: 147 - 160
Submitted on: Apr 15, 2021
Accepted on: Oct 30, 2021
Published on: Mar 31, 2022
Published by: University of Zielona Góra
In partnership with: Paradigm Publishing Services
Publication frequency: 4 times per year

© 2022 Mariusz Szwoch, Rafał Zawiślak, Grzegorz Granosik, Joanna Mik-Wojtczak, Michał Mik, published by University of Zielona Góra
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License.