Academic Article

Free-Field Localization Performance With a Head-Tracked Virtual Auditory Display
Document Type
Periodical
Source
IEEE Journal of Selected Topics in Signal Processing, 9(5):943-954, Aug. 2015
Subject
Signal Processing and Analysis
Loudspeakers
Headphones
Microphones
Ear
Frequency response
Frequency measurement
Discrete Fourier transforms
Head-related transfer functions (HRTFs)
localization
Language
English
ISSN
1932-4553
1941-0484
Abstract
Virtual auditory displays are systems that use signal processing techniques to manipulate the apparent spatial locations of sounds presented to listeners over headphones. When the virtual auditory display is limited to the presentation of stationary sounds at a finite number of source locations, it is possible to produce virtual sounds that are essentially indistinguishable from sounds presented by real loudspeakers in the free field. However, when the display is required to reproduce sound sources at arbitrary locations and respond in real time to the head motions of the listener, it becomes much more difficult to maintain localization performance equivalent to the free field. The purpose of this paper is to present the results of a study that used a virtual synthesis technique to produce head-tracked virtual sounds comparable in localization performance to real sound sources. The technique relied on an in-situ measurement and reproduction procedure that made it possible to switch between the head-related transfer function measurement and the psychoacoustic validation without removing the headset from the listener. The results demonstrate the feasibility of using head-tracked virtual auditory displays to generate both short and long virtual sounds with localization performance comparable to what can be achieved in the free field.
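The core rendering step the abstract alludes to can be illustrated in a few lines: a mono source signal is convolved with a left-ear and a right-ear head-related impulse response (HRIR, the time-domain counterpart of the HRTF) for the desired direction, yielding the two-channel headphone signal. The sketch below is an illustration only, not the paper's method; the HRIR values are invented toy filters, and a real head-tracked display would interpolate between measured HRIRs and update them as head-orientation data arrives.

```python
# Minimal sketch of binaural rendering with HRIRs (hypothetical toy data).
# A mono signal convolved with per-ear impulse responses produces the
# left/right headphone channels that place the sound at a virtual location.

def convolve(x, h):
    """Plain FIR convolution: y[n] = sum_k h[k] * x[n - k]."""
    y = [0.0] * (len(x) + len(h) - 1)
    for n in range(len(y)):
        for k in range(len(h)):
            if 0 <= n - k < len(x):
                y[n] += h[k] * x[n - k]
    return y

# Toy HRIR pair mimicking a source on the listener's left: the right ear
# receives the sound two samples later and attenuated (values illustrative).
hrir_left = [1.0, 0.3]
hrir_right = [0.0, 0.0, 0.6, 0.2]

mono = [1.0, 0.0, 0.0, 0.0]          # unit impulse as the source signal
left = convolve(mono, hrir_left)     # [1.0, 0.3, 0.0, 0.0, 0.0]
right = convolve(mono, hrir_right)   # delayed, quieter copy at the right ear
```

In a head-tracked system this filter pair would be swapped (or cross-faded) whenever the tracker reports a new head orientation, so the virtual source stays fixed in the room rather than rotating with the head.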