RGB-D DATA FUSION IN COMPLEX SPACE
- Submitted by: Ziyun Cai
- Last updated: 20 September 2017 - 1:13pm
- Document Type: Presentation Slides
- Document Year: 2017
- Presenters: Ziyun Cai
- Paper Code: ICIP-2414
Most RGB-D fusion methods extract features from RGB data and depth data separately and then simply concatenate or encode the two kinds of features. Such frameworks cannot exploit the correlation between RGB pixels and their corresponding depth pixels. Motivated by the physical analogy that range data correspond to phase and color information corresponds to intensity, we first project raw RGB-D data into a complex space and then jointly extract features from the fused RGB-D images. Consequently, the correlated and individual parts of the RGB-D information are well combined in the new feature space. Experimental results on two RGB-D datasets, using SIFT features and CNNs trained on the fused images, show that the proposed RGB-D fusion method achieves competitive performance against classical fusion methods.
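
As a rough illustration of the projection described in the abstract, the sketch below fuses an RGB image and a depth map into a single complex-valued image, using color intensity as the magnitude and normalized depth as the phase. The function name, normalization, and phase range are assumptions for illustration only; the paper's exact mapping may differ.

```python
import numpy as np

def fuse_rgbd_complex(rgb, depth, depth_max=None):
    """Fuse an RGB image and a depth map into one complex-valued image.

    Illustrative sketch: color intensity -> magnitude, normalized depth -> phase,
    following the phase/intensity analogy in the abstract (not the paper's exact code).

    rgb:   H x W x 3 array (uint8 or float)
    depth: H x W depth map aligned with rgb
    """
    rgb = rgb.astype(np.float64) / 255.0            # intensity in [0, 1]
    depth = depth.astype(np.float64)
    if depth_max is None:
        depth_max = depth.max() if depth.max() > 0 else 1.0
    phase = (depth / depth_max) * np.pi             # map depth to phase in [0, pi]
    # Broadcast the per-pixel phase across the three color channels.
    fused = rgb * np.exp(1j * phase[..., None])     # H x W x 3 complex array
    return fused
```

Features (e.g. SIFT descriptors or CNN inputs) would then be extracted jointly from the fused complex-valued image rather than from the RGB and depth channels separately.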