Crowdsourced Pairwise-Comparison for Source Separation Evaluation

Abstract: 

Automated objective methods for audio source separation evaluation are fast, cheap, and require little effort from the investigator. However, their output often correlates poorly with human quality assessments, and they typically require ground-truth (perfectly separated) signals to evaluate algorithm performance. Subjective multi-stimulus human ratings of audio quality (e.g., MUSHRA) are the gold standard for many tasks, but they are slow and require considerable effort to recruit participants and run listening tests. Recent work has shown that a crowdsourced multi-stimulus listening test can yield results comparable to those of lab-based multi-stimulus tests. While these results are encouraging, MUSHRA multi-stimulus tests are limited to evaluating 12 or fewer stimuli, and they require ground-truth stimuli for reference. In this work, we evaluate a web-based pairwise-comparison listening approach that promises to speed up and simplify listening tests while also addressing some of the shortcomings of multi-stimulus tests. Using audio source separation quality as our evaluation task, we compare our web-based pairwise-comparison listening test to both web-based and lab-based multi-stimulus tests. We find that pairwise-comparison listening tests perform comparably to multi-stimulus tests, but without many of their shortcomings.
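Pairwise-comparison tests produce raw preference counts rather than per-stimulus ratings, so the counts must be aggregated into quality scores. One common way to do this (an illustrative choice; the abstract does not state which aggregation model the paper uses) is the Bradley-Terry model, fit here with the standard Zermelo/MM iteration. The function name and win-matrix layout below are our own assumptions:

```python
def bradley_terry(wins, n_items, iters=200):
    """Estimate per-stimulus quality scores from pairwise preferences.

    wins[i][j] = number of times stimulus i was preferred over stimulus j.
    Returns scores normalized to sum to 1; higher means preferred more often.
    """
    p = [1.0] * n_items
    for _ in range(iters):
        new_p = []
        for i in range(n_items):
            # Total wins for item i (numerator of the MM update).
            num = sum(wins[i][j] for j in range(n_items) if j != i)
            # Comparison counts weighted by current score estimates.
            den = sum((wins[i][j] + wins[j][i]) / (p[i] + p[j])
                      for j in range(n_items) if j != i)
            new_p.append(num / den if den > 0 else p[i])
        total = sum(new_p)
        p = [x / total for x in new_p]  # normalize each pass
    return p


# Example: three stimuli, where 0 is usually preferred over 1 and 2.
wins = [[0, 8, 9],
        [2, 0, 7],
        [1, 3, 0]]
scores = bradley_terry(wins, 3)
```

The fitted scores then induce a full ranking over arbitrarily many stimuli, which is how a pairwise protocol sidesteps MUSHRA's 12-stimulus limit.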


Paper Details

Authors:
Mark Cartwright, Bryan Pardo, Gautham Mysore
Submitted On:
14 April 2018 - 5:27pm
Type:
Poster
Presenter's Name:
Mark Cartwright
Paper Code:
AASP-P9.4
Document Year:
2018
Cite

Document Files

cartwright_caqe_icassp_2018_poster.pdf


[1] Mark Cartwright, Bryan Pardo, Gautham Mysore, "Crowdsourced Pairwise-Comparison for Source Separation Evaluation", IEEE SigPort, 2018. [Online]. Available: http://sigport.org/2854. Accessed: Mar. 20, 2019.