Differentiable Branching in Deep Networks for Fast Inference
- Submitted by:
- Danilo Comminiello
- Last updated:
- 5 June 2020 - 4:28am
- Document Type:
- Presentation Slides
- Document Year:
- 2020
- Presenters:
- Danilo Comminiello
- Paper Code:
- 1402
In this paper, we consider the design of deep neural networks augmented with multiple auxiliary classifiers branching off the main (backbone) network. These classifiers allow early exit from the network at intermediate layers, making such architectures attractive for energy-constrained applications such as IoT, embedded devices, or Fog computing. However, designing an optimized early-exit strategy is a difficult task, generally requiring a large amount of manual fine-tuning. We propose to jointly optimize this strategy together with the branches, providing an end-to-end trainable algorithm for this emerging class of neural networks. We achieve this by replacing the original (hard) exit decision of the branches with a ‘soft’, differentiable approximation. In addition, we propose a regularization approach to trade off the computational efficiency of the early-exit strategy against the overall classification accuracy. We evaluate the proposed design approach on a set of image classification benchmarks, showing significant gains in accuracy and inference time.
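To make the idea concrete, the following is a minimal sketch (not the paper's actual implementation) of how a hard early-exit decision can be relaxed into a soft, differentiable one. Each branch produces class probabilities plus a raw gate score; a sigmoid turns the score into a soft exit probability, and the network's output becomes the expected prediction over all exit points. All function and variable names here are illustrative assumptions; the paper's formulation and the regularizer weighting per-branch compute cost may differ in detail.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of logits.
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    s = sum(exps)
    return [v / s for v in exps]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def soft_early_exit(branch_logits, gate_scores):
    """Combine per-branch predictions with soft, differentiable exit gates.

    branch_logits: one logit vector per auxiliary classifier (last = backbone).
    gate_scores:   one raw gate score per branch; sigmoid(score) is the soft
                   probability of exiting at that branch. The last branch
                   absorbs whatever probability mass remains.
    Returns (combined class probabilities, per-branch exit probabilities).
    """
    probs = [softmax(l) for l in branch_logits]
    gates = [sigmoid(g) for g in gate_scores]
    # P(exit at branch i) = gate_i * prod_{j<i} (1 - gate_j): a soft version
    # of "exit at the first sufficiently confident branch".
    exit_probs, remaining = [], 1.0
    for i, g in enumerate(gates):
        if i == len(gates) - 1:
            exit_probs.append(remaining)  # backbone takes the leftover mass
        else:
            exit_probs.append(remaining * g)
            remaining *= (1.0 - g)
    # Expected output: a convex combination of all branch predictions.
    combined = [sum(p * br[k] for p, br in zip(exit_probs, probs))
                for k in range(len(probs[0]))]
    return combined, exit_probs

def expected_cost(exit_probs, branch_costs):
    """Hypothetical regularizer: expected compute under the soft exit policy.

    Adding this term to the task loss penalizes routing probability mass to
    deeper (more expensive) exits, trading accuracy for inference speed.
    """
    return sum(p * c for p, c in zip(exit_probs, branch_costs))
```

Because every operation above is differentiable, the gate scores (and hence the exit strategy) can be trained jointly with the branch classifiers by ordinary backpropagation; at inference time the soft gates can be thresholded to recover fast, hard early exits.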