
Improving ASR Contextual Biasing using Guided Attention Loss

DOI:
10.60864/bj21-pe57
Citation Author(s):
Submitted by:
Jiyang Tang
Last updated:
15 April 2024 - 11:31am
Document Type:
Presentation Slides
Document Year:
2024
Event:
Presenters:
Suwon Shon
Paper Code:
8039
 

In this paper, we propose a Guided Attention (GA) auxiliary training loss, which improves the effectiveness and robustness of automatic speech recognition (ASR) contextual biasing without introducing additional parameters. A common challenge in previous literature is that the word error rate (WER) reduction brought by contextual biasing diminishes as the number of bias phrases increases. To address this challenge, we employ a GA loss as an additional training objective besides the Transducer loss. The proposed GA loss aims to teach the cross attention how to align bias phrases with text tokens or audio frames. Compared to studies with similar motivations, the proposed loss operates directly on the cross attention weights and is easier to implement. Through extensive experiments based on Conformer Transducer with Contextual Adapter, we demonstrate that the proposed method not only leads to a lower WER but also retains its effectiveness as the number of bias phrases increases. Specifically, the GA loss decreases the WER of rare vocabularies by up to 19.2% on LibriSpeech compared to the contextual biasing baseline, and up to 49.3% compared to a vanilla Transducer.
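To illustrate the idea of a loss that acts directly on cross-attention weights, here is a minimal sketch in PyTorch. The function name, tensor shapes, and exact formulation (a negative log-likelihood over the attention weight assigned to the correct bias entry, skipping tokens with no matching phrase) are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def guided_attention_loss(attn_weights, bias_targets, ignore_index=-1):
    """Sketch of a guided-attention auxiliary loss on bias cross-attention.

    attn_weights: (batch, num_tokens, num_bias) softmax-normalized
                  cross-attention weights over the bias-phrase entries.
    bias_targets: (batch, num_tokens) index of the bias phrase each text
                  token should attend to, or ignore_index when none applies.
    """
    log_attn = torch.log(attn_weights.clamp_min(1e-8))  # guard against log(0)
    return F.nll_loss(
        log_attn.transpose(1, 2),  # nll_loss expects (batch, classes, positions)
        bias_targets,
        ignore_index=ignore_index,
    )

# Toy usage: 2 utterances, 5 text tokens, 4 bias-list entries.
attn = torch.softmax(torch.randn(2, 5, 4), dim=-1)
targets = torch.tensor([[0, 0, -1, 2, -1],
                        [3, -1, -1, 1, 1]])
ga = guided_attention_loss(attn, targets)
# The auxiliary term would be combined with the main objective, e.g.
# total_loss = transducer_loss + lambda_ga * ga, with lambda_ga a tuning weight.
```

Because the loss reads the existing attention weights rather than adding new layers, it introduces no extra model parameters, consistent with the claim above.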
