Cross-Language Speech Dependent Lip-Synchronization

Citation Author(s):
Abhishek Jha, Vikram Voleti, Vinay Namboodiri, C. V. Jawahar
Submitted by:
Vikram Voleti
Last updated:
7 May 2019 - 1:43pm
Document Type:
Poster
Document Year:
2019
Event:
Presenters:
Vikram Voleti
Paper Code:
2898

Videos of people speaking are difficult to follow across international borders, since audiences from different demographics do not understand the language. Such speech videos are often supplemented with language subtitles, but these hamper the viewing experience because the viewer's attention is split between the subtitles and the speaker. Simple audio dubbing in a different language makes the video appear unnatural due to unsynchronized lip motion. In this paper, we propose a system for automated cross-language lip synchronization of re-dubbed videos. Our model generates photorealistic lip synchronization over the original video that is superior to the current re-dubbing method. With the help of a user-based study, we verify that our method is preferred over unsynchronized videos.
