REAL-TIME TEMPO CANONS

In 2012, I returned to IRCAM for a six-month Musical Research Residency with the MuTant team to tackle the problem of predicting the precise point of convergence between a recorded sound and a live instrument, using the score follower Antescofo~.

My primary goal was to create converging tempo canons in real time, in which a live voice and a playback line must coincide on a given attack. The idea was to create a more flexible system, using a score follower to track tempo shifts and to constantly recalculate playback speed, gauging the distance to a given arrival point. Music that deals with convergence has traditionally been highly metrical and rhythmically rigid: Nancarrow-style layers of multiple speeds that arrive together at a designated downbeat. But with Antescofo~ recording the real-time data of a passage’s execution and measuring it against an idealized version of the score, a new degree of rhythmic flexibility and vitality could be introduced. Creating a tempo canon in which a single line played by an instrument “catches up” with a faster version of itself requires information about where the line is heading; since the score follower supplies exactly that anticipatory data, the idea seemed feasible, perhaps for the first time.
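To make the recalculation concrete, here is a minimal sketch of the underlying arithmetic in Python, not the actual Antescofo~ score or the algorithm from the paper; the function name, the beat counts, and the tempo value are all hypothetical. On each tracked event, the follower’s current tempo estimate gives the time the player should need to reach the convergence point, and the playback speed is rederived so that its remaining material fills exactly that span.

```
# Illustrative sketch only: not the actual Antescofo~ implementation or the
# algorithm from the paper. All names and numbers here are hypothetical.

def playback_rate(remaining_playback_beats: float,
                  remaining_live_beats: float,
                  live_tempo_bpm: float) -> float:
    """Speed multiplier, relative to the live player's current tempo, that
    lets the playback voice reach the convergence point on the same attack."""
    # Minutes the player is predicted to need to reach the arrival point,
    # at the tempo currently estimated by the score follower.
    minutes_to_arrival = remaining_live_beats / live_tempo_bpm
    # Tempo the playback must adopt to cover its own remaining beats
    # in exactly that span of time.
    required_playback_tempo = remaining_playback_beats / minutes_to_arrival
    # 1.0 means "same speed as the player"; greater than 1.0 means faster.
    return required_playback_tempo / live_tempo_bpm

# Re-evaluated on every tracked event: the playback still has 24 beats to go,
# the player has 16 beats left and is currently tracked at 90 BPM, so the
# playback must run 1.5 times as fast to arrive on the same attack.
print(playback_rate(24.0, 16.0, 90.0))  # -> 1.5
```

Because the live tempo estimate changes from one tracked event to the next, the resulting rate drifts with the performer rather than following a precomputed curve, which is where the added rhythmic flexibility comes from.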

Over the course of the residency I wrote several sketches for clarinet and tested them with the score follower on a monthly basis, refining the calculations and building towards the final “esquisse canonique,” performed in this video by Jérôme Comte of Ensemble Intercontemporain. The piece is a work-in-progress demo, laying the groundwork for a larger work for clarinet and electronics to come.

The results of our research were published in Real-Time Canons with Antescofo, a paper I co-authored with José Echeveste and presented at the 2014 International Computer Music Conference, where it was awarded Best Presentation.