Self-supervised representation learning from electroencephalography signals

Citation Author(s):
Hubert Banville, Isabela Albuquerque, Aapo Hyvärinen, Graeme Moffat, Denis-Alexander Engemann, Alexandre Gramfort
Submitted by:
Hubert Banville
Last updated:
13 October 2019 - 8:58pm
Document Type:
Poster
Document Year:
2019
Event:
Presenters Name:
Hubert Banville
Paper Code:
129

Abstract 

The supervised learning paradigm is limited by the cost, and sometimes the impracticality, of data collection and labeling in many domains. Self-supervised learning, a paradigm that exploits the structure of unlabeled data to create learning problems solvable with standard supervised approaches, has shown great promise as a pretraining or feature learning approach in fields like computer vision and time series processing. In this work, we present self-supervision strategies that can be used to learn informative representations from multivariate time series. One successful approach relies on predicting whether time windows are sampled from the same temporal context or not. As demonstrated on a clinically relevant task (sleep scoring) and on two electroencephalography datasets, our approach outperforms a purely supervised approach in low-data regimes, while capturing important physiological information without any access to labels.
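The pretext task described above (predicting whether two windows come from the same temporal context) can be sketched as a pair-sampling procedure. The sketch below is illustrative only; the function name, the window length `win`, and the context thresholds `tau_pos`/`tau_neg` are assumptions, not the authors' exact implementation.

```python
import numpy as np

def sample_rp_pair(ts, win, tau_pos, tau_neg, rng):
    """Sample one relative-positioning pair from a multivariate time
    series `ts` of shape (n_channels, n_times).

    Returns two windows of shape (n_channels, win) and a binary label:
    1 if the windows start within tau_pos samples of each other (same
    temporal context), 0 if they start at least tau_neg samples apart.
    Thresholds and window length are illustrative hyperparameters.
    """
    n_times = ts.shape[1]
    last_start = n_times - win  # last valid window start index
    anchor = rng.integers(0, last_start + 1)
    if rng.random() < 0.5:
        # Positive pair: second window sampled near the anchor.
        low = max(0, anchor - tau_pos)
        high = min(last_start, anchor + tau_pos)
        other = rng.integers(low, high + 1)
        label = 1
    else:
        # Negative pair: second window sampled far from the anchor.
        candidates = [i for i in range(last_start + 1)
                      if abs(i - anchor) >= tau_neg]
        other = candidates[rng.integers(len(candidates))]
        label = 0
    return ts[:, anchor:anchor + win], ts[:, other:other + win], label

rng = np.random.default_rng(0)
ts = rng.standard_normal((4, 3000))  # e.g. 4 EEG channels, 3000 samples
x1, x2, y = sample_rp_pair(ts, win=100, tau_pos=200, tau_neg=1000, rng=rng)
```

Pairs produced this way can be fed to a standard binary classifier (e.g. a siamese network over the two windows), turning unlabeled recordings into supervised training data.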


Dataset Files

mlsp2019_poster_hjb_final.pdf
