Authors
Sossa Azuela Juan Humberto
Title Dendrite morphological neurons trained by stochastic gradient descent.
Type Conference
Sub-type Proceedings paper
Description 2016 IEEE Symposium Series on Computational Intelligence, SSCI 2016
Abstract Dendrite morphological neurons are a type of artificial neural network that works with min and max operators instead of algebraic products. These morphological operators allow each dendrite to build a hyper-box in the N-dimensional classification space. In contrast with classical perceptrons, these simple geometrical representations, hyper-boxes, allow training methods based on heuristics to be proposed without using an optimisation method. In the literature, it has been claimed that these heuristics-based training methods have advantages: there are no convergence problems, perfect classification can always be reached, and training is performed in only one epoch. This paper shows that these assumed advantages come at a cost: the heuristics increase classification errors on the test set because they are not optimal, and learning generalisation is poor. To solve these problems, we introduce a novel method to train dendrite morphological neurons with stochastic gradient descent for classification tasks, using the heuristics only to initialise the learning parameters. We add a softmax layer to the neural architecture to calculate gradients, and we also propose and evaluate four different methods to initialise the dendrite parameters. Experiments are performed on several real and synthetic datasets. Results show that testing accuracy improves in comparison with purely heuristics-based training methods, and the approach reaches competitive performance with respect to other popular machine learning algorithms. Our code, developed in Matlab, is available online. © 2016 IEEE.
Notes DOI 10.1109/SSCI.2016.7849933
Location Athens
Country Greece
No. of pages Article number 7849933
Vol. / Chap.
Start date 2016-12-06
End date 2016-12-09
ISBN/ISSN 9781509042401
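
The abstract above describes dendrites that build hyper-boxes with min and max operators and a softmax layer attached so the model can be trained by stochastic gradient descent. As a rough illustration of that forward pass, here is a minimal NumPy sketch; the function names, the box representation as (w_low, w_high) bound pairs, and the toy data are assumptions for illustration, not the authors' Matlab implementation.

import numpy as np

def dendrite_activations(x, w_low, w_high):
    # x: (n_features,); w_low, w_high: (n_dendrites, n_features).
    # Each dendrite responds with the smallest signed distance from x to the
    # faces of its hyper-box; the value is non-negative only inside the box.
    return np.minimum(x - w_low, w_high - x).min(axis=1)

def class_scores(x, boxes):
    # boxes: list of (w_low, w_high) pairs, one entry per class.
    # The class score is the max response over that class's dendrites.
    return np.array([dendrite_activations(x, lo, hi).max() for lo, hi in boxes])

def softmax(z):
    z = z - z.max()              # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Toy example: two classes, one dendrite (hyper-box) each, 2-D inputs.
boxes = [
    (np.array([[0.0, 0.0]]), np.array([[1.0, 1.0]])),  # class 0 box
    (np.array([[2.0, 2.0]]), np.array([[3.0, 3.0]])),  # class 1 box
]
x = np.array([0.4, 0.6])
print(softmax(class_scores(x, boxes)))  # probability mass concentrates on class 0

Because every operation involved (subtraction, min, max, softmax) is piecewise differentiable, the box bounds can be updated by stochastic gradient descent in an automatic-differentiation framework, which is the training strategy the abstract describes; the heuristic constructions mentioned there would only supply the initial values of w_low and w_high.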