CIM – Controlling Interactive Music

Sean Foran & CIM duet at NIME 2016

Controlling Interactive Music (CIM) is an interactive music system for human-computer duets. Designed as a creativity support system, it explores the metaphor of human-machine symbiosis, where the phenomenological experience of interacting with CIM has both a degree of instrumentality and a sense of partnership. Building on François Pachet’s (2006) notion of reflexivity, John Young’s (2009) explorations of conversational interaction protocols, and Ian Whalley’s (2012) experiments in networked human-computer music interaction, as well as my own previous work in interactive music systems (Gifford & Brown 2011), CIM applies an activity/relationality/prominence-based model of musical duet interaction. Andrew R. Brown, Toby Gifford, and Bradley Voltz developed various iterations of the CIM software from 2012 to 2017.

Andrew improvising with CIM software using two MIDI-controlled Disklavier pianos in 2017.

The CIM system is an interactive music system for use in human-machine creative partnerships. It is designed to sit at a mid-point on the autonomy spectrum described by Rowe’s continuum between the instrument paradigm and the player paradigm. CIM accepts MIDI input from a human performer and improvises musical accompaniment. CIM’s behaviour is directed by a model of duet interaction, which utilises various conversational, contrapuntal, and accompaniment metaphors to determine appropriate musical behaviour. An important facet of this duet model is the notion of turn-taking – where the system and the human swap roles as the musical initiator – a simple sketch of which appears below.
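CIM’s actual decision logic is more elaborate than can be shown here, but the turn-taking idea can be illustrated with a short sketch. In the Python fragment below, the class name, activity window, and thresholds are illustrative assumptions rather than CIM’s real parameters: the machine takes the initiator role when the human’s recent note activity drops away, and hands it back when the human becomes active again.

```python
import time

class TurnTaker:
    """Illustrative turn-taking logic: swap the 'initiator' role based on
    how actively the human performer has been playing recently.
    The window length and thresholds are assumptions, not CIM's values."""

    def __init__(self, window=2.0, quiet_threshold=1, busy_threshold=6):
        self.window = window                # seconds of history to consider
        self.quiet_threshold = quiet_threshold
        self.busy_threshold = busy_threshold
        self.note_times = []                # onset times of recent human notes
        self.initiator = "human"            # who currently leads the duet

    def note_on(self, now=None):
        """Record a human note onset."""
        self.note_times.append(time.time() if now is None else now)

    def update(self, now=None):
        """Re-evaluate who should lead, given recent activity."""
        now = time.time() if now is None else now
        self.note_times = [t for t in self.note_times if now - t <= self.window]
        density = len(self.note_times)
        if self.initiator == "human" and density <= self.quiet_threshold:
            self.initiator = "machine"      # human has gone quiet: machine leads
        elif self.initiator == "machine" and density >= self.busy_threshold:
            self.initiator = "human"        # human is active again: follow them
        return self.initiator
```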

Gerardo Dirié plays acoustic instruments with CIM controlling software instruments via MIDI

CIM listens to and generates MIDI data. In doing so, CIM responds to the performance of the musician while simultaneously performing its own output. Typically, CIM performances involve two digitally enabled acoustic pianos (Yamaha Disklaviers), which, in this case, turn the performance gestures (live and MIDI-based) into sound. Building on this use of performance data as a method of interaction, the musician’s piano pedals are used to control aspects of CIM’s behaviour, as sketched below. Extending Di Scipio’s turn of phrase describing sound as a user interface, this method of musician-machine interaction can be termed ‘performance as interface’.
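As an illustration of how pedal data might steer the system, the following sketch uses the Python mido library to read the performer’s MIDI stream and treat pedal control-change messages as mode switches. The port name, the pedal-to-behaviour mapping, and the mode names are assumptions made for the example, not CIM’s actual configuration.

```python
import mido

SOSTENUTO_CC = 66   # middle pedal on most MIDI pianos
SOFT_CC = 67        # una corda / soft pedal

def listen(port_name="Disklavier"):
    """Read the performer's MIDI stream, treating pedals as control input."""
    mode = "accompany"
    with mido.open_input(port_name) as port:
        for msg in port:
            if msg.type == "control_change":
                if msg.control == SOSTENUTO_CC and msg.value > 64:
                    mode = "imitate"       # hypothetical mapping: middle pedal
                elif msg.control == SOFT_CC and msg.value > 64:
                    mode = "contrast"      # hypothetical mapping: soft pedal
            elif msg.type == "note_on" and msg.velocity > 0:
                handle_note(msg.note, msg.velocity, mode)

def handle_note(note, velocity, mode):
    """Placeholder for the generative response (not shown here)."""
    print(note, velocity, mode)
```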

Sean Foran was one of the most experienced performers with CIM during its development.

Improvisation is generative, it is relational, and it is temporal (Ingold and Hallam 2007). Accordingly, the CIM software creates a relational connection with a musician by using the material played by the human performer as the basis for its generated performance. This use of imitation and interpretation forms the basis for describing the system as reflexive. I agree with Ingold and Hallam when they argue that imitation is not simple repetition ‘but entails a complex and ongoing alignment of observation of the model with action in the world’ (Ingold and Hallam 2007, 5). Computational recording and playback might be more simple-minded than human imitation, but experience designing CIM reveals that while playback itself is straightforward, the decisions about what and when to play are certainly not. They require adherence to cultural expectations, such as maintaining stylistic conventions and tracking the dynamic development of the human musician’s performance.
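A toy example can suggest what even the simplest reflexive variation looks like in code. The variation strategy below – random transposition plus occasional duration stretching – is an assumption made purely for illustration; CIM’s own generative decisions are considerably richer than this.

```python
import random

def reflexive_response(phrase, transpose_range=4):
    """Echo the performer's phrase with small variations.
    `phrase` is a list of (MIDI pitch, duration in beats) tuples captured
    from the human. The variation strategy is illustrative only."""
    shift = random.randint(-transpose_range, transpose_range)
    varied = []
    for pitch, duration in phrase:
        # Occasionally stretch or compress a note to avoid literal repetition.
        factor = random.choice([0.5, 1.0, 1.0, 2.0])
        varied.append((pitch + shift, duration * factor))
    return varied

# Example: respond to a short captured phrase
captured = [(60, 1.0), (62, 0.5), (64, 0.5), (67, 2.0)]
print(reflexive_response(captured))
```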

Louise performs a duet with CIM as part of the Two Piano Festival in Brisbane in 2014