
Distractor Generation for Lexical Questions Using Learner Corpus Data

By: Nikita Login  
Open Access | Dec 2023

References

  1. Bird, S., Klein, E., and Loper, E. (2009). Natural language processing with Python: analyzing text with the natural language toolkit. Sebastopol, CA: O’Reilly Media, Inc., 509 p.
  2. Chen, T., and Guestrin, C. (2016). XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 785–794, San Francisco, CA. Accessible at: http://doi.acm.org/10.1145/2939672.2939785.
  3. Devlin, J., Chang, M. W., Lee, K., and Toutanova, K. (2019). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 4171–4186, Minneapolis, Minnesota. Accessible at: http://dx.doi.org/10.18653/v1/N19-1423.
  4. Dorogush, A., Ershov, V., and Gulin, A. (2018). CatBoost: gradient boosting with categorical features support. arXiv preprint, 7 p. Accessible at: https://arxiv.org/abs/1810.11363.
  5. Fares, M., Kutuzov, A., Oepen, S., and Velldal, E. (2017). Word vectors, reuse, and replicability: Towards a community repository of large-text resources. In Proceedings of the 21st Nordic Conference on Computational Linguistics, pages 271–276, Gothenburg, Sweden. Accessible at: https://aclanthology.org/W17-0237/.
  6. Francis, W., and Kucera, H. (1979). Brown Corpus. Providence, Rhode Island: Department of Linguistics, Brown University. Accessible at: http://korpus.uib.no/icame/manuals/BROWN/INDEX.HTM.
  7. Granger, S. (2008). Learner Corpora. In A. Lüdeling – M. Kytö (eds.): Corpus Linguistics. An International Handbook. Volume 1. Berlin: Walter de Gruyter, pages 259–275.
  8. Jiang, S., and Lee, J. (2017). Distractor Generation for Chinese Fill-in-the-blank Items. In Proceedings of the 12th Workshop on Innovative Use of NLP for Building Educational Applications, pages 143–148, Copenhagen, Denmark. Accessible at: http://dx.doi.org/10.18653/v1/W17-5015.
  9. Ke, G., Meng, Q., Finley, T., Wang, T., Chen, W., Ma, W., Ye, Q., and Liu, T. Y. (2017). LightGBM: A Highly Efficient Gradient Boosting Decision Tree. In Proceedings of the 31st International Conference on Neural Information Processing Systems (NIPS 2017), pages 3149–3157, Long Beach, CA. Accessible at: https://dl.acm.org/doi/10.5555/3294996.3295074.
  10. Kumar, G., Banchs, R., and D’Haro, L. (2015). Automatic fill-the-blank question generator for student self-assessment. In Proceedings of 2015 IEEE Frontiers in Education Conference (FIE), pages 1–3, El Paso, TX. Accessible at: https://doi.org/10.1109/FIE.2015.7344291.
  11. Kurdi, G. (2020). A Systematic Review of Automatic Question Generation for Educational Purposes. International Journal of Artificial Intelligence in Education, 30(1), pages 121–204.
  12. Liu, M., Rus, V., and Liu, L. (2018). Automatic Chinese Multiple Choice Question Generation Using Mixed Similarity Strategy. IEEE Transactions on Learning Technologies, 11(2), pages 193–202.
  13. Mikolov, T., Chen, K., Corrado, G., and Dean, J. (2013). Efficient Estimation of Word Representations in Vector Space. arXiv preprint, 12 p. Accessible at: https://arxiv.org/abs/1301.3781.
  14. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., and Duchesnay, E. (2011). Scikit-learn: Machine Learning in Python. Journal of Machine Learning Research, 12, pages 2825–2830.
  15. Řehůřek, R., and Sojka, P. (2010). Software Framework for Topic Modelling with Large Corpora. In Proceedings of the LREC 2010 Workshop on New Challenges for NLP Frameworks, pages 46–50, Malta. Accessible at: http://dx.doi.org/10.13140/2.1.2393.1847.
  16. Sakaguchi, K., Arase, Y., and Komachi, M. (2013). Discriminative Approach to Fill-in-the-Blank Quiz Generation for Language Learners. In Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 238–242, Sofia, Bulgaria. Accessible at: https://aclanthology.org/P13-2043/.
  17. Stenetorp, P., Pyysalo, S., Topić, G., Ohta, T., Ananiadou, S., and Tsujii, J. (2012). brat: a Web-based Tool for NLP-Assisted Text Annotation. In Proceedings of the Demonstrations at the 13th Conference of the European Chapter of the Association for Computational Linguistics, pages 102–107, Avignon, France. Accessible at: https://aclanthology.org/E12-2021/.
  18. Vinogradova, O. (2019). To automated generation of test questions on the basis of error annotations in EFL essays: A time-saving tool? In S. Götz – J. Mukherjee (eds.): Learner Corpora and Language Teaching. Volume 29. Amsterdam, Netherlands: John Benjamins, pages 29–48.
  19. Vinogradova, O., and Lyashevskaya, O. (2022). Review of Practices of Collecting and Annotating Texts in the Learner Corpus REALEC. In P. Sojka – A. Horák – I. Kopeček – K. Pala (eds.): Text, Speech and Dialogue. 25th International Conference, TSD 2022, Brno, Czech Republic, September 6–9, 2022, Proceedings. Cham, Switzerland: Springer Nature Switzerland AG, pages 77–88.
DOI: https://doi.org/10.2478/jazcas-2023-0051 | Journal eISSN: 1338-4287 | Journal ISSN: 0021-5597
Language: English
Page range: 345 - 356
Published on: Dec 25, 2023
Published by: Slovak Academy of Sciences, Mathematical Institute
In partnership with: Paradigm Publishing Services
Publication frequency: 2 issues per year

© 2023 Nikita Login, published by Slovak Academy of Sciences, Mathematical Institute
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.