Research Output
Building an Embodied Musicking Dataset for co-creative music-making
  In this paper, we present our findings from the design, development, and deployment of a proof-of-concept dataset that captures some of the physiological, musicological, and psychological aspects of embodied musicking. After outlining the conceptual elements of this research, we explain the design of the dataset and the process of capturing the data. We then introduce two tests we used to evaluate the dataset: a) using data science techniques and b) a practice-based application in an AI-robot digital score. The results from these tests are conflicting: from a data science perspective the dataset could be considered garbage, but when applied to a real-world musicking situation, performers reported that it was transformative and felt 'co-creative'. We discuss this duality and pose some important questions for future study. Nevertheless, we feel that the dataset contains a set of relationships that are worth exploring in the creation of music.

  • Date:

    29 March 2024

  • Publication Status:

    Published

  • DOI:

    10.1007/978-3-031-56992-0_24

  • Funders:

    Engineering and Physical Sciences Research Council; The University of Nottingham; European Commission

Citation

Vear, C., Poltronieri, F., Di Donato, B., Zhang, Y., Benerradi, J., Hutchinson, S., Turowski, P., Shell, J., & Malekmohamadi, H. (2024, April). Building an Embodied Musicking Dataset for co-creative music-making. Presented at Evostar 2024: The Leading European Event on Bio‑Inspired Computation, Aberystwyth, Wales, United Kingdom.

Keywords

dataset, music performance, embodied AI