References
1. Akiki, C., & Burghardt, M. (2020). Toward a Musical Sentiment (MuSe) Dataset for Affective Distant Hearing. Proceedings of the 1st Workshop on Computational Humanities Research (CHR), 225–235. http://ceur-ws.org/Vol-2723/short26.pdf
2. Delbouys, R., Hennequin, R., Piccoli, F., Royo-Letelier, J., & Moussallam, M. (2018). Music Mood Detection Based on Audio and Lyrics with Deep Neural Net. Proceedings of the 19th International Society for Music Information Retrieval Conference (ISMIR), 370–375.
3. Hu, X., Downie, J. S., & Ehmann, A. F. (2009). Lyric Text Mining in Music Mood Classification. Proceedings of the 10th International Society for Music Information Retrieval Conference (ISMIR), 411–416.
4. Kim, E., & Klinger, R. (2019). A Survey on Sentiment and Emotion Analysis for Computational Literary Studies. Zeitschrift für digitale Geisteswissenschaft, 4. DOI: 10.17175/2019_008
5. Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39(6), 1161–1178. DOI: 10.1037/h0077714
6. Scherer, K. R. (2004). Which emotions can be induced by music? What are the underlying mechanisms? And how can we measure them? Journal of New Music Research, 33(3), 239–251. DOI: 10.1080/0929821042000317822
7. Strapparava, C., & Valitutti, A. (2004). WordNet-Affect: An Affective Extension of WordNet. Proceedings of the Fourth International Conference on Language Resources and Evaluation (LREC), 1083–1086.
8. Warriner, A. B., Kuperman, V., & Brysbaert, M. (2013). Norms of valence, arousal, and dominance for 13,915 English lemmas. Behavior Research Methods, 45(4), 1191–1207. DOI: 10.3758/s13428-012-0314-x
