Supplementary Material: CHUG - Crowdsourced User-Generated HDR Video Quality
- Submitted by: shreshth saini
- Last updated: 6 February 2025 - 12:49am
- Document Type: Description of Database/Benchmark
High Dynamic Range (HDR) videos enhance visual experiences with superior brightness, contrast, and color depth. The surge of User-Generated Content (UGC) on platforms like YouTube and TikTok introduces unique challenges for HDR video quality assessment (VQA) due to diverse capture conditions, editing artifacts, and compression distortions. Existing HDR-VQA datasets focus primarily on professionally generated content (PGC), leaving a gap in understanding real-world UGC-HDR degradations. To address this, we introduce **CHUG** (Crowdsourced User-Generated HDR Video Quality Dataset), the first large-scale subjective study of UGC-HDR quality. CHUG comprises 856 UGC-HDR source videos, each transcoded across multiple resolutions and bitrates to simulate real-world viewing scenarios, for a total of 5,992 videos. A large-scale study on Amazon Mechanical Turk collected 211,848 perceptual ratings. CHUG provides a benchmark for analyzing UGC-specific distortions in HDR videos, and we anticipate it will advance No-Reference (NR) HDR-VQA research by offering a large-scale, diverse, real-world UGC dataset.
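Since each of the 856 sources yields 5,992 / 856 = 7 transcoded variants, the distortion space amounts to a fixed resolution-bitrate ladder applied per source. The exact ladder and encoder settings are not specified on this page, so the sketch below is illustrative only: the `LADDER` values, the libx265 settings, and the file layout are assumptions, not the authors' actual pipeline.

```python
# Illustrative UGC-HDR transcoding ladder (NOT the published CHUG pipeline).
import subprocess
from pathlib import Path

# Hypothetical ladder: 7 (height, bitrate) variants per source,
# matching the dataset's count of 856 x 7 = 5,992 transcoded videos.
LADDER = [
    (2160, "15M"), (2160, "8M"),
    (1080, "6M"),  (1080, "3M"),
    (720,  "2M"),  (720,  "1M"),
    (540,  "800k"),
]

def transcode(src: Path, out_dir: Path) -> None:
    """Encode one HDR source into every (height, bitrate) variant."""
    out_dir.mkdir(parents=True, exist_ok=True)
    for height, bitrate in LADDER:
        out = out_dir / f"{src.stem}_{height}p_{bitrate}.mp4"
        subprocess.run(
            [
                "ffmpeg", "-y", "-i", str(src),
                "-vf", f"scale=-2:{height}",         # downscale, keep aspect ratio
                "-c:v", "libx265", "-b:v", bitrate,  # HEVC at the target bitrate
                # Re-assert HDR10 signaling (BT.2020 / PQ) so the output
                # is still flagged as HDR rather than SDR.
                "-color_primaries", "bt2020",
                "-color_trc", "smpte2084",
                "-colorspace", "bt2020nc",
                "-an",                               # video-only clips (assumption)
                str(out),
            ],
            check=True,
        )

if __name__ == "__main__":
    for src in sorted(Path("sources").glob("*.mp4")):  # hypothetical layout
        transcode(src, Path("transcodes"))
```

Explicitly signaling BT.2020 primaries and the SMPTE 2084 (PQ) transfer function keeps players treating the transcodes as HDR10 rather than tone-mapping them as SDR, which matters for a subjective HDR study.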