High-level control of sound synthesis for sonification processes

Author: Richard Kronland-Martinet

Publisher: Springer Publishing Company

ISSN: 0951-5666

Source: AI & SOCIETY, Vol.27, Iss.2, 2012-05, pp. : 245-255



Abstract

Methods of sonification based on the design and control of sound synthesis are presented in this paper. The semiotics of isolated sounds was investigated in fundamental studies combining acoustical analysis with brain imaging (event-related potentials). The perceptual cues (known as invariants) responsible for the evocations elicited by sounds generated by impacts, moving sound sources, dynamic events and vehicles (car-door closing and car engine noise) were then identified on the basis of physical and perceptual considerations. Lastly, some examples of the high-level control of a synthesis process simulating immersive 3-D auditory scenes, interacting objects and evoked dynamics are presented.
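To illustrate the general idea of high-level control of sound synthesis mentioned in the abstract, the sketch below maps intuitive descriptors (perceived material and object size) onto the low-level parameters of a simple damped-sinusoid impact-sound synthesizer. The parameter names and mappings are illustrative assumptions, not the authors' model.

```python
import numpy as np

# Illustrative high-level-to-low-level mapping (assumed, not taken from the paper):
# "material" controls modal damping, "size" scales the modal frequencies.
MATERIAL_DAMPING = {"wood": 40.0, "metal": 4.0, "glass": 10.0}  # decay rates in 1/s

def impact_sound(material="metal", size=1.0, n_modes=5, sr=44100, duration=1.0):
    """Synthesize a simple impact sound as a sum of damped sinusoids."""
    t = np.arange(int(sr * duration)) / sr
    base_freq = 440.0 / size              # larger objects -> lower modal frequencies
    damping = MATERIAL_DAMPING[material]  # material category -> decay rate
    signal = np.zeros_like(t)
    for k in range(1, n_modes + 1):
        freq = base_freq * k * (1 + 0.002 * k**2)   # slightly inharmonic partials
        amp = 1.0 / k                                # decreasing modal amplitudes
        signal += amp * np.exp(-damping * k * t) * np.sin(2 * np.pi * freq * t)
    return signal / np.max(np.abs(signal))

# Example: a small metallic impact versus a large wooden one
metal_hit = impact_sound("metal", size=0.5)
wood_hit = impact_sound("wood", size=2.0)
```

The point of such a mapping is that a sonification designer manipulates evocative descriptors (material, size) rather than signal-level parameters (decay rates, partial frequencies), which is the kind of perceptually grounded control the paper argues for.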