J. Lamas, C.C.L. Silva, M. Silva, S. Mouta, J.C. Campos, J.A. Santos
Measuring end-to-end delay in real-time auralisation systems
In Euronoise - 10th European Congress and Exposition on Noise Control Engineering, pages 791-796. EAA-NAG-ABAV. 2015.

Abstract

One of the major challenges in the development of an immersive system is handling the delay between the tracking of the user's head position and the updated projection of a 3D image or auralised sound, also known as end-to-end delay. Excessive end-to-end delay can result in a general decrease of the “feeling of presence”, the occurrence of motion sickness, and poor performance in perception-action tasks. These latencies must be known in order to inform the technological (hardware/software optimization) or psychophysical (recalibration sessions) strategies used to deal with them. Our goal was to develop a new method for measuring end-to-end delay that is both precise and easily replicated. We used a Head and Torso Simulator (HATS) as an auditory signal sensor, a fast-response photo-sensor to detect a visual stimulus response from a motion capture system, and a voltage input trigger as the real-time event. The HATS was mounted on a turntable, which allowed us to precisely change the 3D sound relative to the head position. When the virtual sound source was at 90° azimuth, the corresponding HRTF set all intensity values to zero; at the same time, a trigger registered the real-time event of turning the HATS to 90° azimuth. Furthermore, with the HATS turned 90° to the left, the motion capture marker visualization fell exactly on the photo-sensor receptor. This method allowed us to precisely measure the delay from tracking to display. Moreover, our results show that the tracking method, its tracking frequency, and the rendering of sound reflections are the main predictors of end-to-end delay.
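The delay estimate itself reduces to locating three events on a common timeline: the trigger edge, the moment the 90° HRTF silences the HATS channel, and the moment the photo-sensor fires. The Python sketch below illustrates that offline computation on recorded channels; the function, channel names, thresholds, and the 10 ms silence window are illustrative assumptions, not values taken from the paper.

import numpy as np

def onset(signal, threshold):
    # Index of the first sample exceeding the threshold, or None.
    idx = np.flatnonzero(signal > threshold)
    return int(idx[0]) if idx.size else None

def end_to_end_delays(trigger, audio_left, photo, fs):
    # trigger    -- voltage trigger channel; its rising edge marks the real-time
    #               event of the turntable reaching 90 degrees azimuth
    # audio_left -- left-ear HATS recording; the 90-degree HRTF drives this
    #               channel to (near) zero intensity
    # photo      -- photo-sensor channel; rises when the rendered marker image
    #               falls on the sensor
    # fs         -- common sampling rate of all channels, in Hz
    t0 = onset(trigger, 0.5 * np.max(trigger))

    # Auditory event: the left-ear envelope stays below 5 % of its pre-trigger
    # level for at least 10 ms, so isolated zero crossings are not counted.
    env = np.abs(audio_left)
    win = int(0.010 * fs)
    quiet = env < 0.05 * np.max(env[:t0])
    sustained = np.convolve(quiet.astype(float), np.ones(win), "valid") >= win
    t_audio = onset(sustained[t0:].astype(float), 0.5)

    # Visual event: photo-sensor voltage crosses half of its maximum.
    t_photo = onset(photo[t0:], 0.5 * np.max(photo))

    to_ms = lambda n: None if n is None else 1000.0 * n / fs
    return {"auditory_ms": to_ms(t_audio), "visual_ms": to_ms(t_photo)}

In such a setup all three channels would be captured by the same acquisition device so that they share a single clock; otherwise the event timestamps would not be directly comparable.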


@inproceedings{Lamas:2015,
 author = {J. Lamas and C.C.L. Silva and M. Silva and S. Mouta and J.C. Campos and J.A. Santos},
 title = {Measuring end-to-end delay in real-time auralisation systems},
 booktitle = {Euronoise - 10th European Congress and Exposition on Noise Control Engineering},
 year = {2015},
 pages = {791--796},
 abstract = {One of the major challenges in the development of an immersive system is handling the delay between the tracking of the user's head position and the updated projection of a 3D image or auralised sound, also known as end-to-end delay. Excessive end-to-end delay can result in a general decrease of the “feeling of presence”, the occurrence of motion sickness, and poor performance in perception-action tasks. These latencies must be known in order to inform the technological (hardware/software optimization) or psychophysical (recalibration sessions) strategies used to deal with them. Our goal was to develop a new method for measuring end-to-end delay that is both precise and easily replicated. We used a Head and Torso Simulator (HATS) as an auditory signal sensor, a fast-response photo-sensor to detect a visual stimulus response from a motion capture system, and a voltage input trigger as the real-time event. The HATS was mounted on a turntable, which allowed us to precisely change the 3D sound relative to the head position. When the virtual sound source was at 90° azimuth, the corresponding HRTF set all intensity values to zero; at the same time, a trigger registered the real-time event of turning the HATS to 90° azimuth. Furthermore, with the HATS turned 90° to the left, the motion capture marker visualization fell exactly on the photo-sensor receptor. This method allowed us to precisely measure the delay from tracking to display. Moreover, our results show that the tracking method, its tracking frequency, and the rendering of sound reflections are the main predictors of end-to-end delay.},
 publisher = {EAA-NAG-ABAV},
 hdl = {1822/39322},
 paperurl = {http://www.conforg.fr/euronoise2015/output_directory/data/articles/000529.pdf},
 month = {May-June}
}
