
AN UPPER BOUND ON THE REQUIRED SIZE OF A NEURAL NETWORK CLASSIFIER

Abstract: 

There is growing interest in understanding the impact of architectural parameters such as depth, width, and the type of activation function on the performance of a neural network. We provide an upper bound on the number of free parameters a ReLU-type neural network needs to exactly fit the training data. Whether a net of this size generalizes to test data will be governed by the fidelity of the training data and the applicability of the principle of Occam's Razor. We introduce the concept of s-separability and show that for the special case of (c-1)-separable training data with c classes, a neural network with (d+2c) parameters can achieve 100% training classification accuracy, where d is the dimension of the data. It is also shown that if the number of free parameters is at least (d+2p), where p is the size of the training set, the neural network can memorize each training example. Finally, a framework is introduced for finding a neural network that achieves a given training error, subject to an upper bound on layer width.


Paper Details

Authors:
Hossein Valavi, Peter J. Ramadge
Submitted On:
19 April 2018 - 7:01pm
Type:
Presentation Slides
Presenter's Name:
Peter J. Ramadge
Paper Code:
3232
Document Year:
2018


[1] Hossein Valavi, Peter J. Ramadge, "AN UPPER BOUND ON THE REQUIRED SIZE OF A NEURAL NETWORK CLASSIFIER", IEEE SigPort, 2018. [Online]. Available: http://sigport.org/2843. Accessed: Jul. 23, 2018.