By Alexander Refsum Jensenius (Department of Musicology, University of Oslo) & Victoria Johnson (Norwegian Academy of Music)
We report on the development of a video-based analysis system that controls concatenative sound synthesis and sound spatialisation in real time during concerts.
The system has been used in several performances, most recently Transformation for electric violin and live electronics, in which the performer controls the sound synthesis and spatialisation while moving on stage.
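A common first step in this kind of video-based movement analysis is a frame-differencing measure of how much the image changes between frames, often called quantity of motion, which can then be mapped to synthesis or spatialisation parameters. The following is a minimal sketch of that idea, not the authors' actual system; it assumes NumPy, and the frame sizes, threshold, and function names are illustrative:

```python
import numpy as np

def quantity_of_motion(prev_frame, curr_frame, threshold=0.1):
    """Frame-differencing motion feature: the fraction of pixels whose
    grayscale intensity (0..1 range) changed by more than `threshold`."""
    diff = np.abs(curr_frame.astype(float) - prev_frame.astype(float))
    motion_mask = diff > threshold
    return float(motion_mask.mean())

# Illustrative use: a bright region "appears" between two frames,
# and the resulting scalar could drive a synthesis parameter.
prev = np.zeros((120, 160))
curr = np.zeros((120, 160))
curr[40:80, 60:100] = 1.0
qom = quantity_of_motion(prev, curr)  # fraction of changed pixels
```

In a live setting such a value would be computed per video frame and smoothed before being sent on (e.g. via OSC) to the synthesis engine.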
By Knut Guettler (Norwegian Academy of Music), Hans Wilmers (Norwegian Centre for Technology, Acoustics, and Music (NOTAM)) & Victoria Johnson (Norwegian Academy of Music)
This paper gives a glimpse into the ongoing process of equipping a violin bow (as well as the violin itself) with electronics adequate for real-time manipulation of the sound.
The project involves several sound sources:
- the violin sound, picked up by the built-in microphones of the electric violin,
- a number of pre-recorded everyday sounds, cued by the performer during the performance, and
- several pre-recorded sequences of counting in the performer's own voice.
Controlled by bow gestures, these sounds are filtered through one or more Max/MSP patches and played back through a quadraphonic speaker system. From time to time, sound objects are permuted between the speakers, complemented by the performer's own movement on stage.
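The quadraphonic routing and permutation described above can be sketched in simplified form. The following is not the authors' Max/MSP patch but a hedged illustration of two common building blocks: equal-power panning between the two nearest of four speakers, and rotation of a sound image from one speaker to the next; all names and the speaker ordering are assumptions:

```python
import math

# Assumed speaker order around the audience, counted clockwise.
SPEAKERS = ["front-left", "front-right", "rear-right", "rear-left"]

def quad_gains(azimuth):
    """Equal-power gains for four speakers on a circle.
    `azimuth` in [0, 1): 0 = front-left, 0.25 = front-right, etc.
    The source is panned between the two nearest speakers so that
    the summed power (sum of squared gains) stays constant."""
    pos = (azimuth % 1.0) * 4
    i = int(pos)              # index of the nearer speaker
    frac = pos - i            # fraction of the way to the next speaker
    gains = [0.0] * 4
    gains[i] = math.cos(frac * math.pi / 2)
    gains[(i + 1) % 4] = math.sin(frac * math.pi / 2)
    return gains

def permute(gains, steps=1):
    """Rotate the sound image `steps` speakers clockwise, a simple
    form of permuting objects between speakers."""
    steps %= 4
    return gains[-steps:] + gains[:-steps]
```

For example, `quad_gains(0.125)` places a source midway between front-left and front-right with equal gains, and `permute(quad_gains(0.0))` moves a front-left source to front-right.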