European Symposium on Artificial Neural Networks 2002

Queirolo, Filippo; Valle, Maurizio
2002-01-01

Abstract

Gradient descent learning algorithms, notably Back Propagation (BP), can significantly improve the classification performance of Multi-Layer Perceptrons when they adopt a local and adaptive learning rate management approach. In this paper, we compare the performance of two BP algorithms, one with a fixed and one with an adaptive learning rate, on hand-written character classification. The results show that both the validation error and the average number of learning iterations are lower for the adaptive learning rate BP algorithm.
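The paper itself is not reproduced here, but the fixed-versus-adaptive comparison described in the abstract can be sketched as follows. This is an illustrative assumption, not the authors' implementation: a delta-bar-delta-style rule in which each weight keeps its own ("local") learning rate, with XOR standing in for the hand-written character data and all network sizes and update constants chosen for the sketch.

```python
# Illustrative sketch only (assumed setup, not the authors' algorithm):
# a one-hidden-layer MLP trained by batch Back Propagation, where each
# weight keeps its own local learning rate that grows while successive
# gradients agree in sign and shrinks when the sign flips.
import numpy as np

X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # toy inputs
T = np.array([[0.], [1.], [1.], [0.]])                   # XOR targets

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_params():
    rng = np.random.default_rng(0)  # identical start for both runs
    return [rng.normal(0, 1, (2, 4)), np.zeros(4),
            rng.normal(0, 1, (4, 1)), np.zeros(1)]

def gradients(P):
    """One forward/backward pass; returns (MSE loss, gradient list)."""
    W1, b1, W2, b2 = P
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    E = Y - T
    loss = 0.5 * np.mean(np.sum(E ** 2, axis=1))
    dY = E * Y * (1 - Y) / len(X)
    dH = (dY @ W2.T) * H * (1 - H)
    return loss, [X.T @ dH, dH.sum(0), H.T @ dY, dY.sum(0)]

def train_fixed(epochs=3000, eta=1.0):
    """Plain BP: one global, fixed learning rate for all weights."""
    P = init_params()
    for _ in range(epochs):
        _, G = gradients(P)
        for w, g in zip(P, G):
            w -= eta * g
    return gradients(P)[0]

def train_adaptive(epochs=3000, eta0=1.0, up=1.1, down=0.5):
    """BP with a per-weight (local) adaptive learning rate."""
    P = init_params()
    etas = [np.full_like(w, eta0) for w in P]
    prev = [np.zeros_like(w) for w in P]
    for _ in range(epochs):
        _, G = gradients(P)
        for w, g, eta, p in zip(P, G, etas, prev):
            # grow the local rate where the gradient sign persists,
            # shrink it where the sign flips (oscillation detected)
            eta *= np.where(g * p > 0, up, np.where(g * p < 0, down, 1.0))
            np.clip(eta, 1e-3, 5.0, out=eta)  # keep rates bounded
            w -= eta * g
            p[...] = g
    return gradients(P)[0]
```

The sign-based rule is what makes the rates both local (one per weight) and adaptive (adjusted during training), which is the mechanism the abstract credits for the lower validation error and fewer learning iterations.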
ISBN: 2-930307-02-1

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11567/848291