

Poster in Workshop: XAI in Action: Past, Present, and Future Applications

Explainable AI in Music Performance: Case Studies from Live Coding and Sound Spatialisation

Jack Armitage · Nicola Privato · Victor Shepardson · Celeste Betancur Gutierrez

[ Project Page ]
Sat 16 Dec 12:01 p.m. PST — 1 p.m. PST

Abstract:

Explainable Artificial Intelligence (XAI) has emerged as a significant area of research, with diverse applications across various fields. In the realm of arts, the application and implications of XAI remain largely unexplored. This paper investigates how artist-researchers address and navigate explainability in their systems during creative AI/ML practices, focusing on music performance. We present two case studies: live coding of AI/ML models and sound spatialisation performance. In the first case, we explore the inherent explainability in live coding and how the integration of interactive and on-the-fly machine learning processes can enhance this explainability. In the second case, we investigate how sound spatialisation can serve as a powerful tool for understanding and navigating the latent dimensions of autoencoders. Our autoethnographic reflections reveal the complexities and nuances of applying XAI in the arts, and underscore the need for further research in this area. We conclude that the exploration of XAI in the arts, particularly in music performance, opens up new avenues for understanding and improving the interaction between artists and AI/ML systems. This research contributes to the broader discussion on the diverse applications of XAI, with the ultimate goal of extending the frontiers of applied XAI.
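To make the second case study more concrete, the sketch below illustrates one plausible way sound spatialisation could be coupled to an autoencoder's latent space: a 2D source position drives two chosen latent dimensions, so physically moving the source sweeps a trajectory through latent space. This is a hypothetical illustration, not the authors' implementation; the function names, the choice of two active dimensions, and the stereo panning law are all assumptions made for the example.

```python
# Hypothetical sketch: coupling a 2D spatial position to an autoencoder's
# latent vector, so that moving a sound source explores the latent space.
import numpy as np

def position_to_latent(x, y, latent_dim=8, active_dims=(0, 1), scale=2.0):
    """Map a normalised 2D position (x, y in [-1, 1]) to a latent vector
    by driving two chosen latent dimensions; the rest stay at zero."""
    z = np.zeros(latent_dim)
    z[active_dims[0]] = scale * x
    z[active_dims[1]] = scale * y
    return z

def stereo_pan_gains(x):
    """Equal-power stereo panning gains for a horizontal position x in [-1, 1]."""
    theta = (x + 1.0) * np.pi / 4.0  # maps [-1, 1] to [0, pi/2]
    return np.cos(theta), np.sin(theta)  # (left gain, right gain)

# Example: a circular spatial trajectory traces a closed path in latent space.
for t in np.linspace(0, 2 * np.pi, 4, endpoint=False):
    x, y = np.cos(t), np.sin(t)
    z = position_to_latent(x, y)
    left, right = stereo_pan_gains(x)
    print(f"pos=({x:+.2f},{y:+.2f}) latents={z[:2]} gains=({left:.2f},{right:.2f})")
```

In a live setting, the latent vector would be fed to the autoencoder's decoder and the panning gains to the spatialisation engine; hearing how timbre changes as the source moves is what makes the latent dimensions legible to the performer.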
