Exploring Meta Information for Audio-based Zero-Shot Bird Classification
- Submitted by:
- Alexander Gebhard
- Last updated:
- 3 April 2024 - 5:49pm
- Document Type:
- Poster
- Document Year:
- 2024
- Presenters:
- Alexander Gebhard
- Paper Code:
- AASP-P3.9
Advances in passive acoustic monitoring and machine learning have led to the procurement of vast datasets for computational bioacoustic research. Nevertheless, data scarcity is still an issue for rare and underrepresented species. This study investigates how meta-information can improve zero-shot audio classification, using bird species as a case study due to the availability of rich and diverse metadata. We investigate three different sources of metadata: textual bird sound descriptions encoded via (S)BERT, functional traits (AVONET), and bird life-history (BLH) characteristics. As audio features, we extract audio spectrogram transformer (AST) embeddings and project them to the dimension of the auxiliary information with a single linear layer. We then employ the dot product as the compatibility function and a standard zero-shot learning ranking hinge loss to determine the correct class. The best results are achieved by concatenating the AVONET and BLH features, attaining a mean unweighted F1-score of 0.233 over five different test sets containing 8 to 10 classes each.
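The pipeline described above (a linear projection of AST embeddings into the attribute space, a dot-product compatibility score, and a ranking hinge loss) can be illustrated with a minimal PyTorch sketch. This is not the authors' implementation; all class names, dimensions, and the exact loss formulation are assumptions chosen for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ZeroShotCompatibility(nn.Module):
    """Projects audio embeddings into the attribute (meta-information) space
    and scores each class via a dot-product compatibility function."""

    def __init__(self, audio_dim: int, attr_dim: int):
        super().__init__()
        # Single linear layer mapping AST embeddings to the dimension
        # of the auxiliary information (e.g. AVONET + BLH vectors).
        self.proj = nn.Linear(audio_dim, attr_dim)

    def forward(self, audio_emb: torch.Tensor, class_attrs: torch.Tensor) -> torch.Tensor:
        # audio_emb: (batch, audio_dim); class_attrs: (num_classes, attr_dim)
        projected = self.proj(audio_emb)          # (batch, attr_dim)
        return projected @ class_attrs.T          # (batch, num_classes) compatibility scores


def ranking_hinge_loss(scores: torch.Tensor, targets: torch.Tensor, margin: float = 1.0) -> torch.Tensor:
    """One common form of the zero-shot ranking hinge loss: penalise wrong
    classes whose compatibility comes within `margin` of the correct class."""
    correct = scores.gather(1, targets.unsqueeze(1))        # (batch, 1) score of the true class
    hinge = F.relu(margin + scores - correct)               # (batch, num_classes)
    mask = F.one_hot(targets, scores.size(1)).bool()
    hinge = hinge.masked_fill(mask, 0.0)                    # exclude the correct-class term
    return hinge.sum(dim=1).mean()


if __name__ == "__main__":
    # Illustrative dimensions only: 768-d AST embeddings, 32-d attribute vectors.
    audio_dim, attr_dim, num_classes, batch = 768, 32, 10, 4
    model = ZeroShotCompatibility(audio_dim, attr_dim)
    ast_embeddings = torch.randn(batch, audio_dim)          # stand-in for extracted AST features
    attributes = torch.randn(num_classes, attr_dim)         # stand-in for concatenated AVONET + BLH features
    labels = torch.randint(0, num_classes, (batch,))
    scores = model(ast_embeddings, attributes)
    print(ranking_hinge_loss(scores, labels))
```

At test time, the predicted class for an audio clip would simply be the one with the highest compatibility score, i.e. `scores.argmax(dim=1)`, computed against the attribute vectors of the unseen classes.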