CrossTalk

Interactive performance duet for a pianist and a reactive second piano.

  • Dates: 2007
  • Location: San Francisco, CA
  • Role: Developer
  • Collaborators: Jay Cloidt (composer)

CrossTalk, by composer Jay Cloidt, is a “duet for pianist and piano”: the live pianist’s performance is analyzed by a virtual pianist, which recognizes melodic cues and replies with piano phrases of its own.

The score was written for a dance work: Crosstalk embraces a bold movement vocabulary and daring partnering for four dancers. A piece about relationships, Crosstalk takes a non-linear approach to friendship and community, looking at how individuals react when new circumstances are introduced, and raising questions of how we are in the world alone as individuals, and how we are in the presence of another.

Jay Cloidt describes the piece:

Crosstalk was an attempt to compose a “duet for pianist and piano”. That is, I wanted to write a piece for piano where the capabilities of the pianist were greatly expanded by creating an “intelligent piano” with which the performer could interact. Having dealt with the problem of combining instruments with recorded sounds in many formats and venues over the years, and being unsatisfied with the rigidity that results from performers playing to a click track or sound engineers starting and stopping tapes and CDs, I wanted to create a second piano part which would respond to the live pianist in a flexible manner.

To do this, I worked with programmer Barry Threw to create a system in Max which looked at the MIDI output from the pianist (playing a high-quality software sampler piano instrument on a Mac laptop), and responded with phrases of varying length (on an identical software instrument running on the same computer). The pianist would then play together with the Max-piano until completing a phrase, at which point the Max instrument would go back into “listening” mode for the next cue. So at some points the tempo was controlled by the live pianist and at others by the software-controlled piano.
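The original responder was a Max patch driving a software sampler, and that patch is not reproduced here. As a rough illustration of the listen/respond cycle described above, the following is a minimal Python sketch using the mido library; the port names, cue data, tempos, and phrase format are invented placeholders, not the piece’s actual material.

```python
# Hypothetical Python/mido analogue of the "listening" / "playing" cycle:
# watch incoming notes from the pianist, and when the current cue's trigger
# melody appears, play back a response phrase at a fixed tempo, then return
# to listening for the next cue.
import time
import mido

# Each cue: the pitch sequence to listen for, plus a response phrase given
# as (MIDI note, duration in beats) pairs and a tempo. All values invented.
CUES = [
    {
        "listen_for": [60, 62, 64, 65],
        "response": [(72, 0.5), (71, 0.5), (69, 1.0)],
        "bpm": 96,
    },
]

def matches(history, pattern):
    """True once the most recently played notes equal the cue's trigger melody."""
    return history[-len(pattern):] == pattern

def play_phrase(out_port, phrase, bpm):
    """'Playing' mode: send the response phrase at the cue's tempo."""
    beat = 60.0 / bpm
    for note, beats in phrase:
        out_port.send(mido.Message("note_on", note=note, velocity=80))
        time.sleep(beats * beat)
        out_port.send(mido.Message("note_off", note=note))

def run(in_name="Pianist In", out_name="Sampler Out"):
    history = []
    cue_index = 0
    with mido.open_input(in_name) as inp, mido.open_output(out_name) as out:
        for msg in inp:  # "listening" mode: block until the pianist plays
            if msg.type != "note_on" or msg.velocity == 0:
                continue
            history.append(msg.note)
            cue = CUES[cue_index]
            if matches(history, cue["listen_for"]):
                play_phrase(out, cue["response"], cue["bpm"])
                cue_index = (cue_index + 1) % len(CUES)
                history.clear()  # back to listening for the next cue

if __name__ == "__main__":
    run()
```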

The laptop running the piece sat on the piano in view of the performer. It included a display which indicated the currently running cue (if any), the next cue to be played, and a metronome “flasher” which blinked at the tempo of the upcoming cue, as an aid to the pianist in entering at the correct tempo. Additionally, there were pulldown menus for routing incoming and outgoing MIDI, controls for manually incrementing and decrementing the current cue, and other controls. For rehearsal and testing purposes, a “play” button triggered playback of a MIDI version of the “human” part, which in turn triggered the Max patch’s responses, in lieu of having to play all the cues by hand to test them. Please see the screen shots referenced elsewhere on this page for more detail.
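The performer-facing readout lends itself to a small sketch. The snippet below is a hypothetical console analogue of that display, blinking a marker at the next cue’s tempo alongside the current and next cue numbers; the cue-to-tempo table and function names are invented, and the real interface lived inside the Max patch.

```python
# Hypothetical console stand-in for the on-laptop display: show the current
# and next cue, and blink a marker at the upcoming cue's tempo so the pianist
# can enter in time.
import sys
import time

CUE_TEMPOS = {1: 96, 2: 120, 3: 80}  # illustrative cue -> BPM table

def flash_display(current_cue, next_cue, flashes=8):
    """Blink once per beat of the next cue's tempo while showing cue status."""
    bpm = CUE_TEMPOS[next_cue]
    beat = 60.0 / bpm
    for tick in range(flashes):
        marker = "*" if tick % 2 == 0 else " "
        sys.stdout.write(
            f"\rcue {current_cue} playing | next: cue {next_cue} @ {bpm} BPM  {marker}"
        )
        sys.stdout.flush()
        time.sleep(beat / 2)  # on for half a beat, off for half a beat
    print()

if __name__ == "__main__":
    flash_display(current_cue=1, next_cue=2)
```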

This system, after some tinkering, worked quite well, and we were able to tune the software’s response to the extent that miscues were practically eliminated. The technical execution of the piece was quite dependable in live performance.

One unexpected result: because this was a score for dance, it was important that the cues be played at strict tempi so as not to ruin the dancers’ timing. (As a choreographer once said to me, “your body falls at the same rate no matter how fast the music is playing”). So, even though the piece allowed for quite a lot of flexibility of expression and tempo on the part of the pianist, this was to some extent limited by the exigencies of a dance score. A concert performance of the piece would allow for more flexibility and expression.

Performances:

  • July 29, 2007. WestWave Dance Festival, Project Artaud Theater, San Francisco, CA.