A novel approach to robotic monaural sound localization

F. Keyrouz, A. Bou Saleh, K. Diepold

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The paper presents a novel monaural 3D sound localization technique that robustly estimates a sound source direction within a 2.5° azimuth deviation and a 5° elevation deviation. The proposed system, an upgrade of monaural-based localization techniques, uses two microphones: one inserted within the ear canal of a humanoid head equipped with an artificial ear, and a second held outside the ear, 5 cm away from the inner microphone. The outer microphone is small enough that it introduces only minimal reflections, which could otherwise contribute to localization errors. The system exploits the spectral information of the signals from the two microphones in such a way that a simple correlation mechanism, using a generic set of Head Related Transfer Functions (HRTFs), suffices to localize the sound sources. The low computational cost provides a basis for real-time robotic applications. The technique was tested through extensive simulations of a noisy reverberant room and further through an experimental setup. Both sets of results demonstrated the capability of the monaural system to localize sound sources in a three-dimensional environment with high accuracy, even in the presence of strong noise and distortion.
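
The abstract describes correlating the spectral information of the two microphone signals against a generic HRTF set to pick the source direction. The sketch below is one plausible reading of that step, not the paper's exact formulation: the spectral-ratio estimate, the log-magnitude correlation measure, and the `hrtf_set` structure (a mapping from (azimuth, elevation) to an HRTF sampled on the same frequency grid) are illustrative assumptions.

```python
import numpy as np

def localize(inner_sig, outer_sig, hrtf_set, n_fft=512):
    """Estimate (azimuth, elevation) by correlating the measured spectral
    ratio of the two microphones with a generic HRTF set.

    Assumption: each entry of `hrtf_set` is a complex (or magnitude) HRTF
    sampled on the same rfft grid of n_fft // 2 + 1 bins.
    """
    # Spectra of the in-ear and outer (reference) microphone signals.
    inner_spec = np.fft.rfft(inner_sig, n_fft)
    outer_spec = np.fft.rfft(outer_sig, n_fft)

    # The outer microphone approximates the sound field before pinna
    # filtering, so the magnitude ratio estimates the HRTF magnitude
    # for the unknown direction.
    eps = 1e-12
    measured = np.abs(inner_spec) / (np.abs(outer_spec) + eps)
    measured_db = 20.0 * np.log10(measured + eps)

    best_dir, best_corr = None, -np.inf
    for direction, hrtf in hrtf_set.items():
        ref_db = 20.0 * np.log10(np.abs(hrtf) + eps)
        # Normalized correlation of log-magnitude spectra; the direction
        # whose HRTF best matches the measured ratio wins.
        corr = np.corrcoef(measured_db, ref_db)[0, 1]
        if corr > best_corr:
            best_dir, best_corr = direction, corr
    return best_dir, best_corr
```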

Original language: English
Title of host publication: Audio Engineering Society - 122nd Audio Engineering Society Convention 2007
Pages: 708-712
Number of pages: 5
State: Published - 2007
Event: 122nd Audio Engineering Society Convention 2007 - Vienna, Austria
Duration: 5 May 2007 - 8 May 2007

Publication series

Name: Audio Engineering Society - 122nd Audio Engineering Society Convention 2007
Volume: 2

Conference

Conference: 122nd Audio Engineering Society Convention 2007
Country/Territory: Austria
City: Vienna
Period: 5/05/07 - 8/05/07
