Multi Layer Multi Objective Extreme Learning Machine
- Submitted by:
- Chamara Liyanaa...
- Last updated:
- 12 September 2017 - 11:22pm
- Document Type:
- Presentation Slides
- Document Year:
- 2017
- Presenters:
- Cui Dongshun
- Paper Code:
- 1990
Fully connected multi-layer neural networks such as Deep Boltzmann Machines (DBM) perform better than fully connected single-layer neural networks in image classification tasks, and they require fewer hidden-layer neurons than Extreme Learning Machine (ELM) based fully connected multi-layer neural networks such as Multi-Layer ELM (ML-ELM) and Hierarchical ELM (H-ELM). However, ML-ELM and H-ELM have a smaller training time than DBM. This paper introduces a fully connected multi-layer neural network referred to as Multi-Layer Multi-Objective Extreme Learning Machine (MLMO-ELM), which uses a multi-objective formulation to pass the label and non-linear information in order to learn a network model that has a similar number of hidden-layer parameters to DBM and a smaller training time than DBM. The experimental results show that MLMO-ELM outperforms DBM, ML-ELM, and H-ELM on the OCR and NORB datasets.
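To make the ELM building block concrete, here is a minimal single-layer ELM sketch in NumPy. This is not the paper's MLMO-ELM (whose multi-objective formulation is not reproduced here); it only illustrates the standard ELM idea that the abstract builds on: hidden-layer weights are random and fixed, and only the output weights are solved in closed form via a least-squares fit. All names (`elm_train`, `elm_predict`, `n_hidden`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, Y, n_hidden=50):
    """Train a basic ELM: random hidden layer, least-squares output weights."""
    # Random input-to-hidden weights and biases (fixed, never trained).
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)            # hidden-layer activations
    # Output weights via Moore-Penrose pseudoinverse (closed-form solution).
    beta = np.linalg.pinv(H) @ Y
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Tiny usage example on synthetic data: 100 samples, 5 features, 3 classes.
X = rng.standard_normal((100, 5))
Y = np.eye(3)[rng.integers(0, 3, 100)]  # one-hot labels
W, b, beta = elm_train(X, Y)
pred = elm_predict(X, W, b, beta)
```

Because only `beta` is solved (in one linear-algebra step) rather than learned by gradient descent, training is fast; this is the source of the training-time advantage of ELM-based networks that the abstract contrasts with DBM.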