Guided-spatio-temporal filtering for extracting sound from optically measured images containing occluding objects

Citation Author(s):
Risako Tanigawa, Kohei Yatabe, Yasuhiro Oikawa
Submitted by:
Risako Tanigawa
Last updated:
9 May 2019 - 2:30am
Document Type:
Poster
Document Year:
2019
Event:
Presenters:
Risako Tanigawa
Paper Code:
AASP-P15.6
Recent developments in optical interferometry enable the measurement of sound without placing any device inside the sound field. In particular, parallel phase-shifting interferometry (PPSI) has realized advanced measurement of the refractive index of air. A novel application investigated very recently is the simultaneous visualization of flow and sound, which had been difficult until PPSI enabled high-speed and accurate measurement several years ago. However, understanding aerodynamic sound requires separating the air flow from the sound, since the two are mixed in the observed video. In this paper, guided-spatio-temporal filtering is proposed to separate sound from the optically measured images. Guided filtering is combined with a physical-model-based spatio-temporal filterbank to extract sound-related information without the undesired effects caused by the image boundary or occluding objects. Such image boundaries and occluding objects are typical difficulties arising in signal processing of an optically measured sound field.
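The physical model behind the spatio-temporal filterbank can be illustrated with a minimal sketch: sound obeys the wave equation, so its energy in the wavenumber-frequency domain concentrates on the cone |ω| = c|k|, whereas slowly convecting flow stays near ω ≈ 0. The NumPy code below keeps only the components near that acoustic cone. It is an illustrative assumption, not the authors' method: the function name, the relative-bandwidth parameter, and the binary cone mask are inventions for this sketch, and the guided-filtering stage that suppresses boundary and occlusion effects is omitted entirely.

```python
import numpy as np

def sound_cone_filter(video, dx, dt, c=343.0, rel_bw=0.3):
    """Keep spatio-temporal frequency components near the acoustic
    cone |omega| = c * |k|.

    video  : ndarray, shape (T, H, W), stack of measured frames
    dx     : spatial sampling interval [m] (same for both axes)
    dt     : frame interval [s]
    c      : speed of sound [m/s]
    rel_bw : relative half-width of the pass band around the cone
    """
    T, H, W = video.shape
    V = np.fft.fftn(video)
    # Angular frequency and wavenumber grids matching the FFT bins.
    w  = 2 * np.pi * np.fft.fftfreq(T, d=dt)
    ky = 2 * np.pi * np.fft.fftfreq(H, d=dx)
    kx = 2 * np.pi * np.fft.fftfreq(W, d=dx)
    Wg, KY, KX = np.meshgrid(w, ky, kx, indexing="ij")
    k = np.sqrt(KX**2 + KY**2)
    # Pass components whose temporal frequency matches c*|k| within
    # the chosen relative bandwidth; flow near omega ~ 0 is rejected.
    mask = np.abs(np.abs(Wg) - c * k) <= rel_bw * c * np.maximum(k, 1e-12)
    return np.real(np.fft.ifftn(V * mask))
```

A plane wave travelling at the speed of sound passes this filter almost unchanged, while a slow spatially uniform drift is strongly attenuated, which is the qualitative behaviour the abstract relies on for separating sound from flow.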
