Experiments in intermedia storytelling
An audio-visual work exploring meaningful ways of mapping accelerometer data from my wireless audio-visual violin bow interface.
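As a rough illustration of what mapping accelerometer data to visuals can mean, here is a minimal p5.js-style sketch. It is not the actual system: the sensor input is simulated with noise, and the scaling values are assumptions.

```javascript
// Minimal p5.js sketch: map one accelerometer axis from the bow
// interface to the size and brightness of an on-screen ellipse.
// The accel value here is simulated with noise(); in practice it
// would arrive over OSC or serial from the wireless bow (hypothetical setup).

let accelX = 0; // latest accelerometer reading, assumed in the range -1..1

function setup() {
  createCanvas(640, 480);
  noStroke();
}

function draw() {
  background(10);

  // Stand-in for incoming sensor data: smooth pseudo-random motion.
  accelX = map(noise(frameCount * 0.01), 0, 1, -1, 1);

  // Map acceleration to visual parameters.
  const size = map(abs(accelX), 0, 1, 20, 300); // gesture intensity -> size
  const bright = map(accelX, -1, 1, 60, 255);   // direction -> brightness

  fill(bright, bright, 255);
  ellipse(width / 2, height / 2, size, size);
}
```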
A collaboration with Macquarie University research fellow Dr. Richard Savery. This piece explores interaction between a human and a robotic musician, incorporating interactive visuals with simple collision behaviours and a reactive particle system. Data from all the sensors in the audio-visual interface is used, both to communicate with Kierzo, the robotic musician, and to drive the parameters of the visual content.
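For a sense of what simple collision behaviours and a reactive particle system can look like in code, the hypothetical p5.js sketch below uses a mouse-driven "energy" value in place of the bow's sensor data to agitate particles that bounce off the canvas edges. It is an illustrative sketch, not the patch used in the piece.

```javascript
// Minimal p5.js sketch of a reactive particle system with edge collisions.
// `energy` stands in for a sensor-derived control value (hypothetical);
// in the piece it would come from the bow interface rather than the mouse.

let particles = [];

function setup() {
  createCanvas(640, 480);
  for (let i = 0; i < 100; i++) {
    particles.push({
      pos: createVector(random(width), random(height)),
      vel: p5.Vector.random2D(),
    });
  }
}

function draw() {
  background(0, 40);

  // Sensor stand-in: mouse height sets how agitated the particles are.
  const energy = map(mouseY, 0, height, 0.5, 5);

  stroke(255);
  strokeWeight(3);
  for (const p of particles) {
    p.pos.add(p5.Vector.mult(p.vel, energy));

    // Simple collision behaviour: bounce off the canvas edges.
    if (p.pos.x < 0 || p.pos.x > width) p.vel.x *= -1;
    if (p.pos.y < 0 || p.pos.y > height) p.vel.y *= -1;

    point(p.pos.x, p.pos.y);
  }
}
```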
Demonstration video of the audio-visual bow interface, the culmination of the first few months of my PhD research at the University of Technology Sydney. This research was published and presented at the ACMC conference in October 2023. You can read the full paper here.
A data visualization project built in p5.js, using sleep data collected from an Amazon Halo wristband. The accompanying solo violin improvisation was recorded and mixed in Logic.
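As a hedged sketch of the visualization approach (the real Halo export uses a different schema, and the file and column names here are assumptions), a p5.js bar chart of nightly sleep durations could look like this:

```javascript
// Minimal p5.js sketch: plot nightly sleep duration as a bar chart.
// "sleep.csv" with columns `date` and `hours` is a hypothetical export;
// the actual Amazon Halo data is structured differently.

let table;

function preload() {
  table = loadTable('sleep.csv', 'csv', 'header');
}

function setup() {
  createCanvas(800, 400);
  noLoop();
}

function draw() {
  background(250);
  const n = table.getRowCount();
  const barW = width / n;

  for (let i = 0; i < n; i++) {
    const hours = table.getNum(i, 'hours');
    const h = map(hours, 0, 12, 0, height - 40); // scale 0-12 h to bar height
    fill(80, 120, 200);
    noStroke();
    rect(i * barW, height - h, barW - 2, h);
  }
}
```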
A visual installation of a family holiday in Sydney and Perth, Australia. Images were manipulated using Max/Jitter, and sound was manipulated through the PyPadberg system and Max/MSP.
You can view the published paper on the PyPadberg system, which I presented at NIME 2019 in Brazil, here.
First piece from my MFA in Integrated Composition, Improvisation and Technology capstone recital, Fragments of a Heartbeat.
Performed at Winifred Smith Hall, University of California, Irvine. This piece explores reactivity between live audio and video animation. All processing was done in Max/MSP and Jitter.
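As an analogy for this kind of audio reactivity, rendered in p5.js rather than the Max/MSP and Jitter patch actually used, the sketch below maps live microphone amplitude to a pulsing circle:

```javascript
// Minimal p5.js + p5.sound sketch of audio-reactive animation:
// the live microphone level drives the radius of a pulsing circle.
// An illustrative analogy only, not the patch used in the piece.

let mic, amp;

function setup() {
  createCanvas(640, 480);
  mic = new p5.AudioIn();
  mic.start(); // the browser will ask for microphone permission
  amp = new p5.Amplitude();
  amp.setInput(mic);
}

function mousePressed() {
  userStartAudio(); // browsers require a user gesture before audio starts
}

function draw() {
  background(0, 30);
  const level = amp.getLevel(); // roughly 0.0 to 1.0
  const radius = map(level, 0, 0.3, 10, height, true);
  noFill();
  stroke(255);
  circle(width / 2, height / 2, radius);
}
```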
You can read my thesis here.
Third piece in Fragments of a Heartbeat. A collaboration with Gunta Liepiņa-Miller, using live and pre-recorded audio and video sampling. All processing was done in Max/MSP and Jitter.