Abstract
In this research, we introduce extensions of the ImproVision framework for multimodal musical human–machine communication. ImproVision Equilibrium integrates real‑time pitch detection, consonant chord determination, and visual cues to guide an ensemble from dissonance to harmony. Complementing Equilibrium, ImproVision Gestured Improvisation is a mode in which musicians use body gestures to guide generative machine improvisation. Together, these systems demonstrate a spectrum of human–machine interaction dynamics. We evaluate the ImproVision framework using the Standardized Procedure for Evaluating Creative Systems (SPECS) methodology, assessing its capacity for co‑creation, communication, and adaptability. Potential applications span from ensemble rehearsal aids to interactive performance tools, opening new avenues for intelligent, responsive, and multimodal machine participation in the arts.
