Sentiment Analysis Using BERT Model

References

  1. Devlin, J., Chang, M. W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of deep bidirectional transformers for language understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (pp. 4171-4186).
  2. LeCun, Y., Bengio, Y., & Hinton, G. (2015). Deep learning. Nature, 521(7553), 436-444.
  3. Pang, B., & Lee, L. (2008). Opinion mining and sentiment analysis. Foundations and Trends in Information Retrieval, 2(1-2), 1-135.
  4. Socher, R., Perelygin, A., Wu, J. Y., Chuang, J., Manning, C. D., Ng, A. Y., & Potts, C. (2013). Recursive deep models for semantic compositionality over a sentiment treebank. In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP) (pp. 1631-1642).
  5. Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, Ł., & Polosukhin, I. (2017). Attention is all you need. Advances in Neural Information Processing Systems, 30.
  6. https://towardsdatascience.com/bert-explained-state-of-the-art-language-model-for-nlp-f8b21a9b6270 (last visited December 2023)
  7. https://www.kaggle.com/datasets/rajugc/kaggle-dataset (last visited December 2023)
  8. Doan, A., Halevy, A., & Ives, Z. (2012). Principles of Data Integration. Morgan Kaufmann. ISBN 9780124160446.
DOI: https://doi.org/10.2478/ijasitels-2023-0007 | Journal eISSN: 2559-365X | Journal ISSN: 2067-354X
Language: English
Page range: 59 - 66
Published on: Dec 19, 2023

© 2023 Dorca Manuel-Ilie, Pitic Antoniu Gabriel, Crețulescu Radu George, published by Lucian Blaga University of Sibiu
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License.