360-degree Video Stitching for Dual-fisheye Lens Cameras Based on Rigid Moving Least Squares

Citation Author(s):
Tuan Ho, Ioannis Schizas, K. R. Rao, Madhukar Budagavi
Submitted by:
Tuan Ho
Last updated:
16 September 2017 - 5:42pm
Document Type:
Presentation Slides
Document Year:
2017
Event:
Presenters:
Tuan Ho
Paper Code:
2811
 

Dual-fisheye lens cameras are becoming popular for 360-degree video capture, especially for user-generated content (UGC), since they are affordable and portable. Images generated by dual-fisheye cameras have limited overlap and hence require non-conventional stitching techniques to produce high-quality 360x180-degree panoramas. This paper introduces a novel method to align these images using interpolation grids based on rigid moving least squares. Furthermore, jitter is a critical issue that arises when image-based stitching algorithms are applied to video; it stems from the unconstrained movement of the stitching boundary from one frame to another. We therefore also propose a new algorithm that maintains the temporal coherence of the stitching boundary to provide jitter-free 360-degree videos. Results show that the method proposed in this paper produces higher-quality stitched images and videos than prior work.
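To illustrate the alignment idea named in the abstract, below is a minimal sketch of rigid moving least squares (RMLS) deformation evaluated on a coarse interpolation grid, in the spirit of Schaefer et al., "Image Deformation Using Moving Least Squares". The control-point correspondences, weight exponent, and grid spacing are assumptions for illustration and are not taken from this paper.

```python
import numpy as np

def rigid_mls(v, p, q, alpha=1.0, eps=1e-8):
    """Map a single point v (2,) given control points p -> q, both shaped (n, 2)."""
    d2 = np.sum((p - v) ** 2, axis=1)
    if np.any(d2 < eps):                      # v coincides with a control point
        return q[np.argmin(d2)].astype(float)
    w = 1.0 / d2 ** alpha                     # MLS weights
    p_star = w @ p / w.sum()                  # weighted centroids
    q_star = w @ q / w.sum()
    p_hat, q_hat = p - p_star, q - q_star
    # Best weighted rigid rotation: angle from summed cross/dot products.
    cross = np.sum(w * (p_hat[:, 0] * q_hat[:, 1] - p_hat[:, 1] * q_hat[:, 0]))
    dot = np.sum(w * (p_hat[:, 0] * q_hat[:, 0] + p_hat[:, 1] * q_hat[:, 1]))
    theta = np.arctan2(cross, dot)
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return R @ (v - p_star) + q_star

def deform_grid(width, height, p, q, step=16):
    """Evaluate RMLS on a coarse interpolation grid (every `step` pixels)."""
    xs = np.arange(0, width, step)
    ys = np.arange(0, height, step)
    out = np.zeros((len(ys), len(xs), 2))
    for iy, y in enumerate(ys):
        for ix, x in enumerate(xs):
            out[iy, ix] = rigid_mls(np.array([x, y], float), p, q)
    return out  # a dense per-pixel map can be obtained by bilinear interpolation

# Hypothetical control-point correspondences in the overlap region of the two
# unwarped fisheye images (e.g. from feature matching); values are made up.
p = np.array([[10.0, 20.0], [40.0, 25.0], [15.0, 60.0], [45.0, 70.0]])
q = np.array([[12.0, 21.0], [41.0, 27.0], [16.0, 58.0], [47.0, 71.0]])
warp = deform_grid(64, 96, p, q)
print(warp.shape)  # (6, 4, 2): target coordinates at the grid nodes
```

Evaluating the deformation only at grid nodes and interpolating between them keeps the per-frame cost low, which is why grid-based RMLS warping is attractive for video.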
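The abstract does not detail the temporal-coherence algorithm itself, so the following is only a generic illustration (not the paper's method) of how a stitching boundary can be kept coherent across frames: the per-row seam positions estimated for the current frame are blended with the previous frame's seam, which prevents the boundary from jumping freely between frames and causing jitter. All names and the smoothing constant are assumptions.

```python
import numpy as np

def temporally_smooth_seam(prev_seam, cur_seam, beta=0.8):
    """Exponential smoothing of per-row seam x-coordinates; beta is an assumed constant."""
    if prev_seam is None:
        return cur_seam
    return beta * prev_seam + (1.0 - beta) * cur_seam

# Hypothetical usage over a video; find_seam() and blend_along_seam() stand in
# for whatever per-frame boundary estimation and blending are used.
# smoothed = None
# for frame in frames:
#     seam = find_seam(frame)                       # (H,) x-coordinate per row
#     smoothed = temporally_smooth_seam(smoothed, seam)
#     pano = blend_along_seam(frame, smoothed)
```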
