quartet (2021)
for quartet ensemble and video processing

Musicologist Jacob Hart has written a nice article about this work. Read it here.

created with the [Switch~ Ensemble]:
Zach Sheets, flute
T.J. Borden, cello
Wei-Han Wu, piano
Megan Arns, percussion
quartet is a remote collaboration between the [Switch~ Ensemble] and myself, designed to engage with the added technological mediation at play during the pandemic. The sonic source material of quartet is about two minutes of eurorack synthesizer recordings, transcribed for the [Switch~ Ensemble] to record. These recordings were then analyzed with audio descriptors and machine learning algorithms using FluCoMa. Approaching these acoustic and electronic sounds as data for comparison and manipulation offered me new strategies for combining the material in expressive ways, finding form, and creating sonic relationships and audio-visual objects that I wouldn't otherwise consider.
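The core of this kind of descriptor-based matching can be sketched in plain C++. This is not the FluCoMa code used in the piece; it is a minimal illustration of the underlying idea, with hypothetical descriptor values (spectral centroid, loudness, pitch) and a brute-force nearest-neighbour lookup standing in for FluCoMa's analysis and KD-tree tools:

```cpp
#include <array>
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical descriptor vector for one sound:
// {spectral centroid (Hz), loudness (dB), pitch (MIDI note)}.
using Descriptor = std::array<double, 3>;

// Euclidean distance between two descriptor vectors.
double distance(const Descriptor& a, const Descriptor& b) {
    double sum = 0.0;
    for (std::size_t i = 0; i < a.size(); ++i) {
        double d = a[i] - b[i];
        sum += d * d;
    }
    return std::sqrt(sum);
}

// Return the index of the corpus sound whose descriptors are closest
// to the query -- e.g. finding the electronic sound that best matches
// a given acoustic recording.
std::size_t nearest(const Descriptor& query,
                    const std::vector<Descriptor>& corpus) {
    std::size_t best = 0;
    double bestDist = distance(query, corpus[0]);
    for (std::size_t i = 1; i < corpus.size(); ++i) {
        double d = distance(query, corpus[i]);
        if (d < bestDist) {
            bestDist = d;
            best = i;
        }
    }
    return best;
}
```

In practice the descriptor dimensions would be normalized first (here the Hz axis would dominate the distance), which is one reason toolkits like FluCoMa provide scaling and standardization stages before the neighbour search.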
The audio manipulation and playback were all done in SuperCollider, which sent OSC messages about the currently playing sound to custom software created with openFrameworks in C++ that looked up which video frame to display on the screen. All the sections were recorded live using a Syphon server and later edited together.
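The frame-lookup step on the openFrameworks side can be sketched as follows. This is a minimal, self-contained sketch rather than the actual project code: the message fields (a sound id plus a playback position in seconds), the `Clip` table, and the 30 fps frame rate are all assumptions for illustration, and the real software would receive these values via an OSC listener such as ofxOsc:

```cpp
#include <cmath>
#include <map>
#include <string>

// Hypothetical mapping: each sound id corresponds to a clip that
// begins at some frame of the master video.
struct Clip {
    int startFrame;   // first frame of this sound's clip
    int frameCount;   // clip length in frames
};

constexpr double kFps = 30.0;  // assumed video frame rate

// Translate an OSC-style message ("/playing", soundId, positionSeconds)
// into the video frame the renderer should display.
int frameForMessage(const std::map<std::string, Clip>& clips,
                    const std::string& soundId,
                    double positionSeconds) {
    const Clip& clip = clips.at(soundId);
    int offset = static_cast<int>(std::floor(positionSeconds * kFps));
    if (offset >= clip.frameCount) {
        offset = clip.frameCount - 1;  // clamp to the clip's last frame
    }
    return clip.startFrame + offset;
}
```

Driving the video from the audio engine this way keeps the picture sample-accurate to the sound: the renderer never advances on its own clock, it only displays whatever frame the most recent message asks for.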
tags: intermedia, video, machine learning