Learning Sonata Form Structure on Mozart’s String Quartets

References

  1. Bagan, G., Giraud, M., Groult, R., & Leguy, E. (2015). Modélisation et visualisation de schémas d’analyse musicale avec music21. In Journées d’Informatique Musicale (JIM 2015).
  2. Baratè, A., Haus, G., & Ludovico, L. A. (2005). Music analysis and modeling through Petri nets. In International Symposium on Computer Music Modeling and Retrieval (CMMR 2005), pages 201–218. DOI: 10.1007/11751069_19
  3. Bigo, L., Giraud, M., Groult, R., Guiomard-Kagan, N., & Levé, F. (2017). Sketching sonata form structure in selected classical string quartets. In 18th International Society for Music Information Retrieval Conference (ISMIR 2017), pages 752–759.
  4. Cambouropoulos, E. (2001). The local boundary detection model (LBDM) and its application in the study of expressive timing. In International Computer Music Conference (ICMC 2001).
  5. Caplin, W. E. (1998). Classical Form: A Theory of Formal Functions for the Instrumental Music of Haydn, Mozart, and Beethoven. Oxford University Press.
  6. Caplin, W. E. (2001). The classical sonata exposition: Cadential goals and form-functional plans. Tijdschrift voor Muziektheorie, 6(3), 195–209.
  7. Caplin, W. E., Hepokoski, J., & Webster, J. (2009). Musical Form, Forms & Formenlehre – Three Methodological Reflections. Leuven University Press. DOI: 10.2307/j.ctt9qf01v
  8. Chen, H.-C., Lin, C.-H., & Chen, A. L. P. (2004). Music segmentation by rhythmic features and melodic shapes. In IEEE International Conference on Multimedia and Expo (ICME 2004), pages 1643–1646.
  9. Conklin, D., & Anagnostopoulou, C. (2001). Representation and discovery of multiple viewpoint patterns. In International Computer Music Conference (ICMC 2001), pages 479–485.
  10. Cuthbert, M. S., & Ariza, C. (2010). music21: A toolkit for computer-aided musicology and symbolic music data. In 11th International Society for Music Information Retrieval Conference (ISMIR 2010), pages 637–642.
  11. Czerny, C. (1848). School of Practical Composition. R. Cocks, London.
  12. Farbood, M. (2010). A global model of musical tension. In International Conference on Music Perception and Cognition (ICMPC 2010).
  13. Flothuis, M. (1998). Mozarts Streichquartette: Ein musikalischer Werkführer. C. H. Beck.
  14. Giraud, M., Groult, R., & Leguy, E. (2018). Dezrann, a web framework to share music analysis. In International Conference on Technologies for Music Notation and Representation (TENOR 2018), pages 104–110.
  15. Giraud, M., Groult, R., Leguy, E., & Levé, F. (2015). Computational fugue analysis. Computer Music Journal, 39(2). DOI: 10.1162/COMJ_a_00300
  16. Giraud, M., Groult, R., & Levé, F. (2012). Detecting episodes with harmonic sequences for fugue analysis. In 13th International Society for Music Information Retrieval Conference (ISMIR 2012), pages 457–462.
  17. Gjerdingen, R. O. (2007). Music in the Galant Style. Oxford University Press.
  18. Greenberg, Y. (2017). Of beginnings and ends: A corpus-based inquiry into the rise of the recapitulation. Journal of Music Theory, 61(2), 171–200. DOI: 10.1215/00222909-4149546
  19. Hamanaka, M., Hirata, K., & Tojo, S. (2016). Implementing methods for analysing music based on Lerdahl and Jackendoff’s Generative Theory of Tonal Music. In Meredith, D., editor, Computational Music Analysis, pages 221–249. Springer, Cham. DOI: 10.1007/978-3-319-25931-4_9
  20. Hepokoski, J. (2002). Beyond the sonata principle. Journal of the American Musicological Society, 55(1), 91–154. DOI: 10.1525/jams.2002.55.1.91
  21. Hepokoski, J., & Darcy, W. (1997). The medial caesura and its role in the eighteenth-century sonata exposition. Music Theory Spectrum, 19(2), 115–154. DOI: 10.1525/mts.1997.19.2.02a00010
  22. Hepokoski, J., & Darcy, W. (2006). Elements of Sonata Theory: Norms, Types, and Deformations in the Late-Eighteenth-Century Sonata. Oxford University Press. DOI: 10.1093/acprof:oso/9780195146400.001.0001
  23. Herremans, D., & Chew, E. (2017). MorpheuS: Generating structured music with constrained patterns and tension. IEEE Transactions on Affective Computing. Early access. DOI: 10.1109/TAFFC.2017.2737984
  24. Hsu, J. L., Liu, C. C., & Chen, A. (1998). Efficient repeating pattern finding in music databases. In International Conference on Information and Knowledge Management (CIKM 1998), pages 281–288. DOI: 10.1145/288627.288668
  25. Huron, D. (2002). Music information processing using the Humdrum toolkit: Concepts, examples, and lessons. Computer Music Journal, 26(2), 11–26. DOI: 10.1162/014892602760137158
  26. Jiang, N., & Müller, M. (2013). Automated methods for analyzing music recordings in sonata form. In 14th International Society for Music Information Retrieval Conference (ISMIR 2013), pages 595–600.
  27. King, A. H. (1968). La Musique de chambre de Mozart. Actes Sud, Arles.
  28. Krumhansl, C. L., & Kessler, E. J. (1982). Tracing the dynamic changes in perceived tonal organisation in a spatial representation of musical keys. Psychological Review, 89(4), 334–368. DOI: 10.1037//0033-295X.89.4.334
  29. Larson, S. (2003). Recapitulation recomposition in the sonata-form first movements of Haydn’s string quartets: Style change and compositional technique. Music Analysis, 22(1–2), 139–177. DOI: 10.1111/j.0262-5245.2003.00178.x
  30. Lerdahl, F., & Jackendoff, R. (1983). A Generative Theory of Tonal Music. MIT Press.
  31. Lerdahl, F., & Krumhansl, C. L. (2007). Modeling tonal tension. Music Perception, 24(4), 329–366. DOI: 10.1525/mp.2007.24.4.329
  32. Marx, A. B. (1838, 1845). Die Lehre von der musikalischen Komposition (volumes 2 and 3). Breitkopf & Härtel, Leipzig.
  33. Medeot, G., Cherla, S., Kosta, K., McVicar, M., Abdalla, S., Selvi, M., Rex, E., & Webster, K. (2018). StructureNet: Inducing structure in generated melodies. In 19th International Society for Music Information Retrieval Conference (ISMIR 2018), pages 725–731.
  34. Miyake, J. (2004). The Role of Multiple New-key Themes in Selected Sonata-form Exposition. PhD thesis, University of New York.
  35. Mongeau, M., & Sankoff, D. (1990). Comparison of musical sequences. Computers and the Humanities, 24(3), 161–175. DOI: 10.1007/BF00117340
  36. Nika, J., Chemillier, M., & Assayag, G. (2016). ImproteK: Introducing scenarios into human-computer music improvisation. Computers in Entertainment (CIE), 14(2), 4. DOI: 10.1145/3022635
  37. Rabiner, L. R. (1989). A tutorial on hidden Markov models and selected applications in speech recognition. Proceedings of the IEEE, 77(2), 257–286. DOI: 10.1109/5.18626
  38. Rafael, B., & Oertl, S. M. (2010). MTSSM – A framework for multi-track segmentation of symbolic music. International Journal of Computer, Electrical, Automation, Control and Information Engineering, 4(1), 7–13. DOI: 10.2316/P.2010.674-008
  39. Ratner, L. (1980). Classical Music: Expression, Form, and Style. Schirmer.
  40. Reicha, A. (1824). Traité de haute composition musicale. A. Diabelli.
  41. Rosen, C. (1980). Sonata Forms. W. W. Norton.
  42. Schenker, H. (1935). Der freie Satz. Universal Edition.
  43. Temperley, D. (1999). What’s key for key? The Krumhansl-Schmuckler key-finding algorithm reconsidered. Music Perception, 17(1), 65–100. DOI: 10.2307/40285812
  44. Viterbi, A. (1967). Error bounds for convolutional codes and an asymptotically optimum decoding algorithm. IEEE Transactions on Information Theory, 13(2), 260–269. DOI: 10.1109/TIT.1967.1054010
  45. Weiß, C., & Müller, M. (2014). Quantifying and visualizing tonal complexity. In Conference on Interdisciplinary Musicology (CIM 2014), pages 184–188.
DOI: https://doi.org/10.5334/tismir.27 | Journal eISSN: 2514-3298
Submitted on: Dec 28, 2018
Accepted on: Oct 31, 2019
Published on: Dec 17, 2019
Published by: Ubiquity Press

© 2019 Pierre Allegraud, Louis Bigo, Laurent Feisthauer, Mathieu Giraud, Richard Groult, Emmanuel Leguy, Florence Levé, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.