References
- P. Koehn et al., “Moses: open source toolkit for statistical machine translation,” in Proc. 45th Annu. Meet. ACL, Interact. Poster Demonstr. Sess. (ACL ’07), p. 177, 2007, doi: 10.3115/1557769.1557821.
- A. Banerjee et al., “Bengali-English relevant cross lingual information access using finite automata,” 2010, pp. 595–599, doi: 10.1063/1.3516373.
- D. Bahdanau, K. H. Cho, and Y. Bengio, “Neural machine translation by jointly learning to align and translate,” 3rd Int. Conf. Learn. Represent. ICLR 2015 - Conf. Track Proc., pp. 1–15, 2015.
- M. R. Costa-jussà, “From feature to paradigm: Deep learning in machine translation,” IJCAI Int. Jt. Conf. Artif. Intell., pp. 5583–5587, 2018.
- K. Papineni, S. Roukos, T. Ward, and W.-J. Zhu, “Bleu: a Method for Automatic Evaluation of Machine Translation,” in Proceedings of the 40th Annual Meeting of the Association for Computational Linguistics, Jul. 2002, pp. 311–318, doi: 10.3115/1073083.1073135.
- P. Koehn, “Moses: Statistical Machine Translation System User Manual and Code Guide,” Univ. Edinburgh, pp. 1–267, 2012. [Online]. Available: http://www.statmt.org/moses/
- J. Kong, Y. Yang, L. Wang, X. Zhou, T. Jiang, and X. Li, “Filtering Reordering Table Using a Novel Recursive Autoencoder Model for Statistical Machine Translation,” Math. Probl. Eng., vol. 2017, 2017, doi: 10.1155/2017/3492587.
- E. Satir and H. Bulut, “Preventing translation quality deterioration caused by beam search decoding in neural machine translation using statistical machine translation,” Inf. Sci., vol. 581, pp. 791–807, 2021, doi: 10.1016/j.ins.2021.10.006.
- P. Martínez-Gómez, G. Sanchis-Trilles, and F. Casacuberta, “Online adaptation strategies for statistical machine translation in post-editing scenarios,” Pattern Recognit., vol. 45, no. 9, pp. 3193–3203, 2012, doi: 10.1016/j.patcog.2012.01.011.
- D. Xiong, F. Meng, and Q. Liu, “Topic-based term translation models for statistical machine translation,” Artif. Intell., vol. 232, pp. 54–75, 2016, doi: 10.1016/j.artint.2015.12.002.
- Y. Zhao, M. Komachi, T. Kajiwara, and C. Chu, “Region-attentive multimodal neural machine translation,” Neurocomputing, vol. 476, pp. 1–13, 2022, doi: 10.1016/j.neucom.2021.12.076.
- S. A. Mohamed, A. A. Elsayed, Y. F. Hassan, and M. A. Abdou, “Neural machine translation: past, present, and future,” Neural Comput. Appl., vol. 33, no. 23, pp. 15919–15931, 2021, doi: 10.1007/s00521-021-06268-0.
- H. Wang, H. Wu, Z. He, L. Huang, and K. W. Church, “Progress in Machine Translation,” Engineering, vol. 18, pp. 143–153, 2022, doi: 10.1016/j.eng.2021.03.023.
- S. Chauhan, S. Saxena, and P. Daniel, “Improved Unsupervised Neural Machine Translation with Semantically Weighted Back Translation for Morphologically Rich and Low Resource Languages,” Neural Process. Lett., vol. 54, no. 3, pp. 1707–1726, 2022, doi: 10.1007/s11063-021-10702-8.
- X. Liu, W. Wang, W. Liang, and Y. Li, “Speed Up the Training of Neural Machine Translation,” Neural Process. Lett., vol. 51, no. 1, pp. 231–249, 2020, doi: 10.1007/s11063-019-10084-y.
- I. Sutskever, O. Vinyals, and Q. V. Le, “Sequence to sequence learning with neural networks,” Adv. Neural Inf. Process. Syst., vol. 27, pp. 3104–3112, 2014.
- B. Ahmadnia and B. J. Dorr, “Low-resource multi-domain machine translation for Spanish-Farsi: Neural or statistical?,” Procedia Comput. Sci., vol. 177, pp. 575–580, 2020, doi: 10.1016/j.procs.2020.10.081.
- A. Pathak and P. Pakray, “Neural machine translation for Indian languages,” J. Intell. Syst., vol. 28, no. 3, pp. 465–477, 2019, doi: 10.1515/jisys-2018-0065.
- S. Jp, V. K. Menon, S. Kp, R. S, and A. Wolk, “Generation of cross-lingual word vectors for low-resourced languages using deep learning and topological metrics in a data-efficient way,” Electronics, vol. 10, no. 12, 2021, doi: 10.3390/electronics10121372.
- R. Rubino, B. Marie, R. Dabre, A. Fujita, M. Utiyama, and E. Sumita, “Extremely low-resource neural machine translation for Asian languages,” Mach. Transl., vol. 34, no. 4, 2020.
- A. Vaswani et al., “Attention is all you need,” Adv. Neural Inf. Process. Syst., vol. 30, pp. 5999–6009, 2017.
- J. Devlin, M.-W. Chang, K. Lee, and K. Toutanova, “BERT: Pre-training of deep bidirectional transformers for language understanding,” in Proc. NAACL-HLT 2019, vol. 1, pp. 4171–4186, 2019.
- R. Sennrich and B. Haddow, “Linguistic input features improve neural machine translation,” in Proc. 1st Conf. Machine Translation (WMT 2016), vol. 1, pp. 83–91, 2016, doi: 10.18653/v1/w16-2209.
- R. Sennrich, B. Haddow, and A. Birch, “Neural machine translation of rare words with subword units,” in Proc. 54th Annu. Meet. Assoc. Comput. Linguist. (ACL 2016), pp. 1715–1725, 2016. [Online]. Available: http://www.aclweb.org/anthology/P16-1162
- C. K. Wu, C. C. Shih, Y. C. Wang, and R. T. H. Tsai, “Improving low-resource machine transliteration by using 3-way transfer learning,” Comput. Speech Lang., vol. 72, p. 101283, 2022, doi: 10.1016/j.csl.2021.101283.
- C. Raffel et al., “Exploring the limits of transfer learning with a unified text-to-text transformer,” J. Mach. Learn. Res., vol. 21, pp. 1–67, 2020.
- B. Zoph, D. Yuret, J. May, and K. Knight, “Transfer learning for low-resource neural machine translation,” EMNLP 2016 - Conf. Empir. Methods Nat. Lang. Process. Proc., pp. 1568–1575, 2016, doi: 10.18653/v1/d16-1163.
- S. K. Mahata, A. Garain, D. Das, and S. Bandyopadhyay, “Simplification of English and Bengali Sentences for Improving Quality of Machine Translation,” Neural Process. Lett., 2022, doi: 10.1007/s11063-022-10755-3.
- I. Goodfellow et al., “Generative adversarial networks,” Commun. ACM, vol. 63, no. 11, pp. 139–144, 2020, doi: 10.1145/3422622.
- Z. Zhang, S. Liu, M. Li, M. Zhou, and E. Chen, “Bidirectional generative adversarial networks for neural machine translation,” CoNLL 2018 - 22nd Conf. Comput. Nat. Lang. Learn. Proc., no. CoNLL, pp. 190–199, 2018, doi: 10.18653/v1/k18-1019.
- Z. Yang, W. Chen, F. Wang, and B. Xu, “Generative adversarial training for neural machine translation,” Neurocomputing, vol. 321, pp. 146–155, 2018, doi: 10.1016/j.neucom.2018.09.006.
- A. Das, P. Yerra, K. Kumar, and S. Sarkar, “A study of attention-based neural machine translation model on Indian languages,” Proc. 6th Work. South Southeast Asian Nat. Lang. Process., pp. 163–172, 2016, [Online]. Available: http://aclweb.org/anthology/W16-3717.
- R. Haque, M. Hasanuzzaman, and A. Way, “Terminology translation in low-resource scenarios,” Information, vol. 10, no. 9, pp. 1–28, 2019, doi: 10.3390/info10090273.
- L. Li, K. Jamieson, G. DeSalvo, A. Rostamizadeh, and A. Talwalkar, “Hyperband: A novel bandit-based approach to hyperparameter optimization,” J. Mach. Learn. Res., vol. 18, pp. 1–52, 2018.
- J. Bergstra and Y. Bengio, “Random search for hyper-parameter optimization,” J. Mach. Learn. Res., vol. 13, pp. 281–305, 2012.
- S. Agnihotri, “Hyperparameter Optimization on Neural Machine Translation,” Creat. Components, vol. 124, 2019, [Online]. Available: https://lib.dr.iastate.edu/creativecomponents/124/.
- C. M. Bishop, Neural Networks for Pattern Recognition. USA: Oxford University Press, Inc., 1995.
- J. Tiedemann, “Parallel data, tools and interfaces in OPUS,” Proc. 8th Int. Conf. Lang. Resour. Eval. Lr. 2012, pp. 2214–2218, 2012.
- D. P. Kingma and J. L. Ba, “Adam: A method for stochastic optimization,” 3rd Int. Conf. Learn. Represent. ICLR 2015 - Conf. Track Proc., pp. 1–15, 2015.