
The IsoVAT Corpus: Parameterization of Musical Features for Affective Composition

Open Access | Nov 2022

References

  1. Aljanaki, A., Wiering, F., and Veltkamp, R. (2014). Collecting annotations for induced musical emotion via online game with a purpose Emotify.
  2. Balkwill, L.-L. and Thompson, W. F. (1999). A cross-cultural investigation of the perception of emotion in music: Psychophysical and cultural cues. Music Perception, 17(1):43–64. DOI: 10.2307/40285811
  3. Dinculescu, M., Engel, J., and Roberts, A. (2019). MidiMe: Personalizing a MusicVAE model with user data. In Workshop on Machine Learning for Creativity and Design, NeurIPS.
  4. Eerola, T. and Vuoskoski, J. K. (2011). A comparison of the discrete and dimensional models of emotion in music. Psychology of Music, 39(1):18–49. DOI: 10.1177/0305735610362821
  5. Eerola, T. and Vuoskoski, J. K. (2012). A review of music and emotion studies: Approaches, emotion models, and stimuli. Music Perception: An Interdisciplinary Journal, 30(3):307–340. DOI: 10.1525/mp.2012.30.3.307
  6. Ens, J. and Pasquier, P. (2020). MMM: Exploring conditional multi-track music generation with the transformer. arXiv preprint arXiv:2008.06048.
  7. Fritz, T., Jentschke, S., Gosselin, N., Sammler, D., Peretz, I., Turner, R., Friederici, A. D., and Koelsch, S. (2009). Universal recognition of three basic emotions in music. Current Biology, 19(7):573–576. DOI: 10.1016/j.cub.2009.02.058
  8. Gabrielsson, A. and Lindstrom, E. (2012). The role of structure in the musical expression of emotions. In Juslin, P. N. and Sloboda, J. A., editors, Handbook of Music and Emotion: Theory, Research, Applications, pages 367–400. Oxford University Press. DOI: 10.1093/acprof:oso/9780199230143.003.0014
  9. Gerhard, D. and Hepting, D. H. (2004). Cross-modal parametric composition. In Proceedings of the International Computer Music Conference.
  10. Hadjeres, G. and Crestel, L. (2021). The piano inpainting application. CoRR, abs/2107.05944.
  11. Hasegawa, R. (2020). Creating with constraints. In Donin, N., editor, The Oxford Handbook of the Creative Process in Music. Oxford University Press. DOI: 10.1093/oxfordhb/9780190636197.013.17
  12. Hernandez-Olivan, C. and Beltran, J. R. (2021). Music composition with deep learning: A review. CoRR, abs/2108.12290.
  13. Hoffmann, H., Scheck, A., Schuster, T., Walter, S., Limbrecht, K., Traue, H. C., and Kessler, H. (2012). Mapping discrete emotions into the dimensional space: An empirical approach. In 2012 IEEE International Conference on Systems, Man, and Cybernetics (SMC), pages 3316–3320. IEEE. DOI: 10.1109/ICSMC.2012.6378303
  14. Hung, H.-T., Ching, J., Doh, S., Kim, N., Nam, J., and Yang, Y.-H. (2021). EMOPIA: A multi-modal pop piano dataset for emotion recognition and emotion-based music generation. arXiv preprint arXiv:2108.01374.
  15. Juslin, P. N. and Sloboda, J. A., editors (2011). Handbook of Music and Emotion: Theory, Research, Applications. Oxford University Press.
  16. Kim, Y. E., Schmidt, E. M., Migneco, R., Morton, B. G., Richardson, P., Scott, J., Speck, J. A., and Turnbull, D. (2010). Music emotion recognition: A state of the art review. In Proceedings of the International Society for Music Information Retrieval Conference, pages 937–952.
  17. Livingstone, S. R., Muhlberger, R., Brown, A. R., and Thompson, W. F. (2010). Changing musical emotion: A computational rule system for modifying score and performance. Computer Music Journal, 34(1):41–64. DOI: 10.1162/comj.2010.34.1.41
  18. Mehrabian, A. (1996). Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in temperament. Current Psychology, 14(4):261–292. DOI: 10.1007/BF02686918
  19. Ortony, A., Clore, G. L., and Collins, A. (1990). The Cognitive Structure of Emotions. Cambridge University Press.
  20. Panda, R., Malheiro, R., and Paiva, R. P. (2018). Novel audio features for music emotion recognition. IEEE Transactions on Affective Computing, 11(4):614–626. DOI: 10.1109/TAFFC.2018.2820691
  21. Panda, R. E. S., Malheiro, R., Rocha, B., Oliveira, A. P., and Paiva, R. P. (2013). Multi-modal music emotion recognition: A new dataset, methodology and comparative analysis. In 10th International Symposium on Computer Music Multidisciplinary Research (CMMR 2013), pages 570–582.
  22. Pasquier, P., Eigenfeldt, A., Bown, O., and Dubnov, S. (2017). An introduction to musical metacreation. Computers in Entertainment, 14(2):1–14. DOI: 10.1145/2930672
  23. Pati, A., Lerch, A., and Hadjeres, G. (2019). Learning to traverse latent spaces for musical score inpainting. CoRR, abs/1907.01164.
  24. Paz, I., Nebot, A., Mugica, F., and Romero, E. (2018). Modeling perceptual categories of parametric musical systems. Pattern Recognition Letters, 105:217–225. DOI: 10.1016/j.patrec.2017.07.005
  25. Reisenzein, R. (2000). Wundt’s three-dimensional theory of emotion. In Balzer, W., Sneed, J. D., and Moulines, C. U., editors, Structuralist Knowledge Representation, pages 219–250. Brill.
  26. Rich, A. (1998). Harmony before the common practice period. In Encyclopedia Britannica. Encyclopedia Britannica, Inc. https://www.britannica.com/art/harmony-music/Harmony-before-the-common-practice-period.
  27. Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39(6):1161–1178. DOI: 10.1037/h0077714
  28. Schimmack, U. and Grob, A. (2000). Dimensional models of core affect: A quantitative comparison by means of structural equation modeling. European Journal of Personality, 14(4):325–345. DOI: 10.1002/1099-0984(200007/08)14:4<325::AID-PER380>3.0.CO;2-I
  29. Schubert, E. (1996). Continuous response to music using a two dimensional emotion space. In Proceedings of the 4th International Conference on Music Perception and Cognition, pages 263–268.
  30. Schubert, E. (1999). Measuring emotion continuously: Validity and reliability of the two-dimensional emotion-space. Australian Journal of Psychology, 51(3):154–165. DOI: 10.1080/00049539908255353
  31. Thompson, W. F. and Robitaille, B. (1992). Can composers express emotions through music? Empirical Studies of the Arts, 10(1):79–89. DOI: 10.2190/NBNY-AKDK-GW58-MTEL
  32. Vieillard, S., Peretz, I., Gosselin, N., Khalfa, S., Gagnon, L., and Bouchard, B. (2008). Happy, sad, scary and peaceful musical excerpts for research on emotions. Cognition and Emotion, 22(4):720–752. DOI: 10.1080/02699930701503567
  33. Warrenburg, L. A. (2020). Choosing the right tune: A review of music stimuli used in emotion research. Music Perception, 37(3):240–258. DOI: 10.1525/mp.2020.37.3.240
  34. Wu, H. and Leung, S.-O. (2017). Can Likert scales be treated as interval scales? A simulation study. Journal of Social Service Research, 43(4):527–532. DOI: 10.1080/01488376.2017.1329775
  35. Wundt, W. M. and Judd, C. H. (1902). Outlines of Psychology. W. Engelmann.
  36. Yang, Y.-H. and Chen, H. H. (2011). Music Emotion Recognition. CRC Press. DOI: 10.1201/b10731
  37. Zhang, K., Zhang, H., Li, S., Yang, C., and Sun, L. (2018). The PMEmo dataset for music emotion recognition. In Proceedings of the 2018 ACM International Conference on Multimedia Retrieval, pages 135–142, New York, NY, USA. Association for Computing Machinery. DOI: 10.1145/3206025.3206037
DOI: https://doi.org/10.5334/tismir.120 | Journal eISSN: 2514-3298
Language: English
Submitted on: Oct 14, 2021
Accepted on: Jun 16, 2022
Published on: Nov 14, 2022
Published by: Ubiquity Press
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2022 Cale Plut, Philippe Pasquier, Jeff Ens, Renaud Tchemeube, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.