Figure 2: Encoder–decoder model with attention [3].
Figure 3: Transformer model [21].
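The core operation behind both the attention mechanism of Figure 2 and the transformer of Figure 3 is computing a context vector as a similarity-weighted sum of encoder states. A minimal sketch of scaled dot-product attention in plain Python (illustrative only; the function names and single-query simplification are ours, not from the surveyed systems):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for a single query vector.

    query: decoder state of dimension d; keys/values: one vector per
    encoder position. Returns (context vector, attention weights).
    """
    d = len(query)
    # Similarity of the query to every key, scaled by sqrt(d)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    # Context = weighted sum of value vectors
    context = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
    return context, weights
```

In a transformer, this computation runs in parallel over all query positions and multiple heads; the sketch keeps a single query to expose the weighting step.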
Attention-based NMT outperforms SMT for the Bengali–Hindi language pair (Das et al. [32])
| Translation model | BLEU score | Iterations |
|---|---|---|
| Attention-based NMT model | 20.41 | 25 |
| MOSES (SMT) | 14.35 | - |
NMT outperformed SMT when combined with transfer learning, ensembling, and further processing of the data (Zoph et al.)
| Language | SBMT | NMT | Transfer | Final |
|---|---|---|---|---|
| Hausa | 23.7 | 16.8 | 21.3 | 24.0 |
| Turkish | 20.4 | 11.4 | 17.0 | 18.7 |
| Uzbek | 17.9 | 10.7 | 14.4 | 16.8 |
| Urdu | 17.9 | 5.2 | 13.8 | 14.5 |
An NMT system with a transformer model and BPE outperformed phrase-based SMT for the English–Hindi and Hindi–English language pairs (Haque et al. [33])
| MT model | BLEU | METEOR | TER |
|---|---|---|---|
| Eng.–Hindi PBSMT | 28.8 | 30.2 | 53.4 |
| Eng.–Hindi NMT | 36.6 | 33.5 | 46.3 |
| Hindi–Eng. PBSMT | 34.1 | 36.6 | 50.0 |
| Hindi–Eng. NMT | 39.9 | 38.5 | 42.0 |
English–Hindi translation BLEU scores using different optimizers
| Language pair | Optimizer | BLEU-4 score | NMT model | No. of epochs |
|---|---|---|---|---|
| Eng.–Hindi | Adam | 12.25 | NMT with attention | 14 |
| Eng.–Hindi | SGD | 11.50 | NMT with attention | 14 |
| Eng.–Hindi | – | 16.64 | MOSES (SMT) | – |
English–Bengali translation BLEU scores using different optimizers
| Language pairs | Optimizer | BLEU-4 score | MT model | No. of epochs |
|---|---|---|---|---|
| Eng.–Beng. | Adam | 10.78 | NMT with attention | 12 |
| Eng.–Beng. | SGD | 11.17 | NMT with attention | 12 |
| Eng.–Beng. | – | 14.58 | MOSES (SMT) | – |
BLEU-1, -2, and -3 scores are summarized for the Eng.–Beng. and Eng.–Hindi language pairs using the Adam and SGD optimizers
| BLEU | Eng.–Beng. NMT (Adam) | Eng.–Beng. NMT (SGD) | Eng.–Hindi NMT (Adam) | Eng.–Hindi NMT (SGD) |
|---|---|---|---|---|
| BLEU-1 | 14.15 | 13.91 | 15.77 | 14.18 |
| BLEU-2 | 12.65 | 13.11 | 14.12 | 13.33 |
| BLEU-3 | 11.83 | 12.17 | 13.95 | 12.19 |
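The BLEU-n scores above are modified n-gram precisions combined into a single measure. A minimal, unsmoothed sentence-level sketch in plain Python (illustrative only; production evaluations use tools such as sacreBLEU with corpus-level statistics and smoothing):

```python
import math
from collections import Counter

def ngram_counts(tokens, n):
    """Multiset of n-grams occurring in a token sequence."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    """Unsmoothed sentence-level BLEU with uniform n-gram weights.

    candidate/reference: lists of tokens. Returns a score in [0, 1];
    without smoothing the score is 0 whenever any n-gram order has
    no overlap (common for short sentences).
    """
    precisions = []
    for n in range(1, max_n + 1):
        cand = ngram_counts(candidate, n)
        ref = ngram_counts(reference, n)
        # Clip candidate counts by reference counts (modified precision)
        overlap = sum(min(c, ref[g]) for g, c in cand.items())
        total = max(sum(cand.values()), 1)
        precisions.append(overlap / total)
    if min(precisions) == 0:
        return 0.0
    log_avg = sum(math.log(p) for p in precisions) / max_n
    # Brevity penalty discourages overly short candidates
    bp = min(1.0, math.exp(1 - len(reference) / len(candidate)))
    return bp * math.exp(log_avg)
```

BLEU-1 uses `max_n=1` (unigram precision only), BLEU-4 uses `max_n=4`; the tables report these scores scaled by 100.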
For various low-resource corpora, SMT outperformed NMT (Ahmadnia et al. [17])
| Corpus | SMT | NMT | NMT* | NMT** |
|---|---|---|---|---|
| Gnome | 20.54 | 15.49 | 17.26 | 18.76 |
| KDE4 | 15.64 | 13.36 | 14.29 | 15.71 |
| Subtitles | 18.82 | 18.62 | 19.51 | 22.54 |
| Ubuntu | 16.76 | 14.27 | 15.14 | 15.87 |
| Tanzil | 17.69 | 15.14 | 16.53 | 17.72 |
| Overall | 17.06 | 15.25 | 16.67 | 17.32 |