
Quality Assessment

VR IQA NET: Deep Virtual Reality Image Quality Assessment using Adversarial Learning


In this paper, we propose a novel virtual reality image quality assessment (VR IQA) method with adversarial learning for omnidirectional images. To take the characteristics of omnidirectional images into account, we devise deep networks consisting of a novel quality score predictor and a human perception guider. The proposed quality score predictor automatically predicts the quality score of a distorted image using latent spatial and position features.
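
To make the adversarial setup concrete, the sketch below pairs a score-regressing predictor with a guider that tries to distinguish predicted scores from human opinion scores, so the predictor learns both to match the subjective scores and to fool the guider. This is a minimal PyTorch sketch: the module names, layer sizes, and loss weighting are illustrative assumptions and do not reproduce the authors' actual VR IQA NET architecture.

import torch
import torch.nn as nn

class QualityScorePredictor(nn.Module):
    """Regresses a quality score from a distorted omnidirectional image patch."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.score = nn.Linear(64, 1)

    def forward(self, x):
        z = self.features(x).flatten(1)   # latent feature of the distorted input
        return self.score(z)              # scalar quality score per image

class HumanPerceptionGuider(nn.Module):
    """Adversary that separates predicted scores from human (subjective) scores."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))

    def forward(self, s):
        return self.net(s)                # logit: "human" vs. "predicted"

predictor, guider = QualityScorePredictor(), HumanPerceptionGuider()
opt_p = torch.optim.Adam(predictor.parameters(), lr=1e-4)
opt_g = torch.optim.Adam(guider.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(distorted, human_score):
    # 1) Guider update: learn to tell human scores from predicted scores.
    pred = predictor(distorted).detach()
    loss_g = bce(guider(human_score), torch.ones_like(human_score)) \
           + bce(guider(pred), torch.zeros_like(pred))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

    # 2) Predictor update: match the subjective score and fool the guider.
    pred = predictor(distorted)
    loss_p = nn.functional.mse_loss(pred, human_score) \
           + bce(guider(pred), torch.ones_like(pred))
    opt_p.zero_grad(); loss_p.backward(); opt_p.step()

# Example: one step on a random batch of 4 image patches and MOS-like scores.
train_step(torch.randn(4, 3, 64, 64), torch.rand(4, 1))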

Paper Details

Authors:
Heoun-taek Lim, Hak Gu Kim, and Yong Man Ro
Submitted On:
20 April 2018 - 8:00am

Document Files

VR IQA NET-ICASSP2018



[1] Heoun-taek Lim, Hak Gu Kim, and Yong Man Ro, "VR IQA NET: Deep Virtual Reality Image Quality Assessment using Adversarial Learning", IEEE SigPort, 2018. [Online]. Available: http://sigport.org/3102. Accessed: Aug. 19, 2018.

No-reference weighting factor selection for bimodal tomography

Paper Details

Authors:
Yan Guo, Bernd Rieger
Submitted On:
12 April 2018 - 10:58am

Document Files

ICASSP-Yan-No-Animation.pdf



[1] Yan Guo, Bernd Rieger, "No-reference weighting factor selection for bimodal tomography", IEEE SigPort, 2018. [Online]. Available: http://sigport.org/2374. Accessed: Aug. 19, 2018.

Variational Fusion of Time-of-Flight and Stereo Data Using Edge Selective Joint Filtering


In this paper, we propose variational fusion of time-of-flight (TOF) and stereo data using edge selective joint filtering (ESJF). We utilize ESJF to up-sample the low-resolution (LR) depth captured by the TOF camera and produce high-resolution (HR) depth maps with accurate edge information. First, we measure the confidence of the two sensors, which have different reliabilities, in order to fuse them. Then, we up-sample the TOF depth map using ESJF to generate discontinuity maps and preserve depth edges. Finally, we perform variational fusion of the TOF and stereo depth data based on total variation (TV), guided by the discontinuity maps.
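
The pipeline above can be read as minimizing a confidence-weighted data term plus an edge-aware total-variation smoothness term. The NumPy sketch below assumes the ESJF stage has already produced an up-sampled TOF depth map and a discontinuity map in [0, 1]; the energy, weights, and gradient-descent solver are illustrative assumptions rather than the paper's actual formulation.

import numpy as np

def fuse_depth(d_tof, d_stereo, c_tof, c_stereo, discontinuity,
               lam=0.1, iters=200, step=0.2, eps=1e-6):
    """Fuse two depth maps by gradient descent on
    E(d) = c_tof*(d - d_tof)^2 + c_stereo*(d - d_stereo)^2 + lam * g * |grad d|,
    where g = 1 - discontinuity suppresses smoothing across detected depth edges."""
    # Confidence-weighted average as the initial estimate.
    d = (c_tof * d_tof + c_stereo * d_stereo) / (c_tof + c_stereo + eps)
    g = 1.0 - discontinuity
    for _ in range(iters):
        # Data terms: pull the estimate toward each sensor, weighted by confidence.
        grad = 2 * c_tof * (d - d_tof) + 2 * c_stereo * (d - d_stereo)
        # Edge-aware TV term: divergence of the normalized, edge-weighted gradient.
        dy, dx = np.gradient(d)
        mag = np.sqrt(dx**2 + dy**2 + eps)
        div = np.gradient(g * dy / mag, axis=0) + np.gradient(g * dx / mag, axis=1)
        grad -= lam * div
        d = d - step * grad
    return d

# Example: fuse two noisy synthetic depth maps with a step edge in the middle.
h, w = 64, 64
truth = np.where(np.arange(w) < w // 2, 1.0, 2.0) * np.ones((h, w))
d_tof = truth + 0.05 * np.random.randn(h, w)
d_stereo = truth + 0.15 * np.random.randn(h, w)
disc = np.zeros((h, w)); disc[:, w // 2 - 1 : w // 2 + 1] = 1.0
fused = fuse_depth(d_tof, d_stereo, np.full((h, w), 0.7), np.full((h, w), 0.3), disc)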

Paper Details

Authors:
Cheolkon Jung, Zhendong Zhang
Submitted On:
10 September 2017 - 11:11pm

Document Files

ICIP2017_Fusion_slides.



[1] Cheolkon Jung, Zhendong Zhang, "Variational Fusion of Time-of-Flight and Stereo Data Using Edge Selective Joint Filtering", IEEE SigPort, 2017. [Online]. Available: http://sigport.org/1877. Accessed: Aug. 19, 2018.