Designing Sound Collaboratively - Perceptually Motivated Audio Synthesis

Niklas Klügel, Timo Becker, Georg Groh

Research output: Contribution to journal › Conference article › peer-review

Abstract

In this contribution, we discuss a prototype that allows a group of users to design sound collaboratively in real time using a multi-touch tabletop. We use a machine learning method to generate a mapping from perceptual audio features to synthesis parameters; this mapping is then used for visualization and interaction. Finally, we discuss the results of a comparative evaluation study.
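
The abstract describes learning a mapping from perceptual audio features to synthesis parameters that then drives visualization and interaction. The sketch below illustrates one way such an inverse mapping could be built; the toy FM synthesizer, the choice of features (spectral centroid and RMS), and the k-nearest-neighbour regressor are assumptions made for illustration and are not the method used in the paper.

```python
"""Illustrative sketch only: learn an inverse mapping from perceptual audio
features to synthesis parameters by sampling a (toy) synthesizer, analysing
each rendered patch, and fitting a regressor from features to parameters."""
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

SR = 22050   # sample rate (Hz)
DUR = 0.5    # patch duration (s)

def render(params, sr=SR, dur=DUR):
    """Toy FM synth: params = (carrier_hz, mod_ratio, mod_index, amplitude)."""
    carrier, ratio, index, amp = params
    t = np.linspace(0.0, dur, int(sr * dur), endpoint=False)
    return amp * np.sin(2 * np.pi * carrier * t
                        + index * np.sin(2 * np.pi * carrier * ratio * t))

def perceptual_features(y, sr=SR):
    """Crude stand-ins for perceptual features: spectral centroid and RMS."""
    spec = np.abs(np.fft.rfft(y))
    freqs = np.fft.rfftfreq(len(y), 1.0 / sr)
    centroid = float(np.sum(freqs * spec) / (np.sum(spec) + 1e-12))
    rms = float(np.sqrt(np.mean(y ** 2)))
    return np.array([centroid, rms])

# 1) Sample the synthesis parameter space and analyse each rendered patch.
rng = np.random.default_rng(0)
params = np.column_stack([
    rng.uniform(100.0, 1000.0, 500),  # carrier frequency (Hz)
    rng.uniform(0.5, 4.0, 500),       # modulator ratio
    rng.uniform(0.0, 8.0, 500),       # modulation index
    rng.uniform(0.1, 1.0, 500),       # amplitude
])
features = np.array([perceptual_features(render(p)) for p in params])

# 2) Learn the inverse mapping: perceptual features -> synthesis parameters.
mapping = KNeighborsRegressor(n_neighbors=5).fit(features, params)

# 3) An interaction on the tabletop could be translated into a feature target;
#    the mapping then yields concrete parameters for the synthesizer.
target = np.array([[1500.0, 0.4]])  # desired spectral centroid (Hz) and RMS
print("suggested synth parameters:", mapping.predict(target)[0])
```

In a collaborative setting, each user's touch input could be mapped to a point in such a perceptual feature space, with the learned model translating it into synthesis parameters; this is only one plausible realisation of the idea outlined in the abstract.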

Original language: English
Pages (from-to): 327-330
Number of pages: 4
Journal: Proceedings of the International Conference on New Interfaces for Musical Expression
State: Published - 2014
Event: 14th International Conference on New Interfaces for Musical Expression, NIME 2014 - London, United Kingdom
Duration: 30 Jun 2014 - 4 Jul 2014

Keywords

  • Collaborative Music Making
  • Sound Design
