
Publication details
Publisher: Springer
Place: Berlin
Year: 2014
Pages: 372-392
Series: Lecture Notes in Computer Science
ISBN (Hardback): 9783319129754
Full citation:
"Interactive sound texture synthesis through semi-automatic user annotations", in: Sound, music, and motion, Berlin, Springer, 2014


Interactive sound texture synthesis through semi-automatic user annotations
pp. 372-392
in: Mitsuko Aramaki, Olivier Derrien, Richard Kronland-Martinet, Sølvi Ystad (eds), Sound, music, and motion, Berlin, Springer, 2014

Abstract
We present a way to make environmental recordings controllable again through continuous annotations of the high-level semantic parameter one wishes to control, e.g. wind strength or crowd excitation level. A partial annotation can be propagated to cover the entire recording via cross-modal analysis between gesture and sound using canonical time warping (CTW). The annotations then serve as a descriptor for lookup in corpus-based concatenative synthesis, inverting the sound/annotation relationship. The workflow has been evaluated in a preliminary subject test: results from canonical correlation analysis (CCA) show high consistency between annotations, and a small set of audio descriptors is well correlated with them. An experiment on the propagation of annotations shows the superior performance of CTW over CCA with as little as 20 s of annotated material.
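The cross-modal analysis step described in the abstract relates a multivariate annotation signal to frame-wise audio descriptors. As a rough illustration of that idea (not the authors' implementation), the following is a minimal NumPy sketch of canonical correlation analysis between two such signals; the variable names and the regularization term are assumptions for this example.

```python
import numpy as np

def cca(X, Y, reg=1e-6):
    """Canonical correlation analysis between two multivariate signals.

    X: (n_samples, dx) array, e.g. continuous user annotations
    Y: (n_samples, dy) array, e.g. frame-wise audio descriptors
    Returns canonical correlations (descending) and the two projection bases.
    """
    # Center both signals
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]
    # Regularized covariance and cross-covariance matrices
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n
    # Whitening transforms: W = L^{-T} with C = L L^T (Cholesky),
    # so that W.T @ C @ W = I
    def inv_sqrt(C):
        return np.linalg.inv(np.linalg.cholesky(C)).T
    Wx, Wy = inv_sqrt(Cxx), inv_sqrt(Cyy)
    # SVD of the whitened cross-covariance yields the canonical correlations
    U, s, Vt = np.linalg.svd(Wx.T @ Cxy @ Wy)
    return s, Wx @ U, Wy @ Vt.T
```

A high leading canonical correlation would indicate, as the abstract reports, that a small number of descriptor combinations track the annotation closely; CTW extends this idea by additionally aligning the two signals in time.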