Building Recurrent Networks by Unfolding Iterative Thresholding for Sequential Sparse Recovery
- Submitted by: Scott Wisdom
- Last updated: 8 March 2017, 9:22am
- Document Type: Poster
- Document Year: 2017
- Presenters: Scott Wisdom
- Paper Code: SPTM-P7.6
Historically, sparse methods and neural networks, particularly modern deep learning methods, have been relatively disparate areas. Sparse methods are typically used for signal enhancement, compression, and recovery, usually in an unsupervised framework, while neural networks commonly rely on a supervised training set. In this paper, we use the specific problem of sequential sparse recovery, which models a sequence of observations over time using a sequence of sparse coefficients, to show how algorithms for sparse modeling can be combined with supervised deep learning to improve sparse recovery. Specifically, we show that the iterative soft-thresholding algorithm (ISTA) for sequential sparse recovery corresponds to a stacked recurrent neural network (RNN) under specific architecture and parameter constraints. Then we demonstrate the benefit of training this RNN with backpropagation using supervised data for the task of column-wise compressive sensing of images. This training corresponds to adaptation of the original iterative thresholding algorithm and its parameters. Thus, we show by example that sparse modeling can provide a rich source of principled and structured deep network architectures that can be trained to improve performance on specific tasks.
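The ISTA-to-RNN correspondence that the abstract describes can be sketched in a few lines. Below is a minimal illustration for the standard single-frame Lasso objective, min_h 0.5·||y − Ah||² + λ·||h||₁; the names `soft_threshold`, `W`, `S`, and `L` are illustrative assumptions, not taken from the poster, and the poster's sequential variant additionally passes state across time steps of the observation sequence.

```python
import numpy as np

def soft_threshold(x, tau):
    """Elementwise soft-thresholding: sign(x) * max(|x| - tau, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista_as_rnn(y, A, lam, n_iters=100):
    """Sketch of ISTA for min_h 0.5*||y - A h||^2 + lam*||h||_1.

    Each iteration has the form of an RNN cell,
        h <- soft_threshold(W @ y + S @ h, lam / L),
    with tied weights W = A.T / L and S = I - A.T @ A / L,
    so unrolling n_iters iterations yields a stacked RNN.
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    W = A.T / L                            # plays the role of the input weights
    S = np.eye(A.shape[1]) - A.T @ A / L   # plays the role of the recurrent weights
    h = np.zeros(A.shape[1])               # initial sparse-coefficient estimate
    for _ in range(n_iters):
        h = soft_threshold(W @ y + S @ h, lam / L)
    return h
```

Untying `W`, `S`, and the threshold across layers and training them by backpropagation on supervised pairs of observations and ground-truth signals is what turns this fixed iterative algorithm into a trainable network, which is the adaptation the abstract refers to.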