Evolution of Stacked Autoencoders

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research › peer-reviewed

Choosing the best hyperparameters for neural networks remains a significant challenge. This paper proposes a method that automatically initializes and adjusts hyperparameters during the training process of stacked autoencoders. A population of autoencoders is trained with gradient-descent-based weight updates, while hyperparameters are mutated and weights are inherited in a Lamarckian manner. Training is conducted layer-wise, with each new layer initiating a new neuroevolutionary optimization process. The fitness function of the evolutionary approach employs a dimensionality-reduction quality measure. Experiments show the contribution of the most significant hyperparameters and analyze their lineage during the training process. The results confirm that the proposed method outperforms a baseline approach on MNIST, FashionMNIST, and the Year Prediction Million Song Database.
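The core loop described in the abstract can be illustrated with a minimal single-layer sketch (the paper applies this layer-wise and uses a dimensionality-reduction quality measure as fitness; here, for simplicity, fitness is negative reconstruction error, and all names, population sizes, and mutation scales are illustrative assumptions, not the authors' settings). Each individual carries both its hyperparameters and its trained weights; offspring inherit the trained weights (Lamarckian inheritance) while the hyperparameters are mutated:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_autoencoder(X, W1, W2, lr, steps=50):
    """Plain gradient descent on a linear autoencoder's reconstruction loss."""
    for _ in range(steps):
        H = X @ W1                      # encode
        R = H @ W2                      # decode
        E = R - X                       # reconstruction error
        gW2 = H.T @ E / len(X)
        gW1 = X.T @ (E @ W2.T) / len(X)
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1, W2

def fitness(X, W1, W2):
    """Negative mean squared reconstruction error (higher is better)."""
    R = (X @ W1) @ W2
    return -np.mean((R - X) ** 2)

def evolve(X, pop_size=6, generations=5, hidden=4):
    d = X.shape[1]
    # Each individual: one evolved hyperparameter (learning rate) plus weights.
    pop = [{"lr": 10 ** rng.uniform(-3, -1),
            "W1": rng.normal(0, 0.1, (d, hidden)),
            "W2": rng.normal(0, 0.1, (hidden, d))} for _ in range(pop_size)]
    for _ in range(generations):
        # Gradient-based weight updates for every individual.
        for ind in pop:
            ind["W1"], ind["W2"] = train_autoencoder(X, ind["W1"], ind["W2"], ind["lr"])
        # Select the fitter half as parents.
        pop.sort(key=lambda ind: fitness(X, ind["W1"], ind["W2"]), reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        for p in parents:
            children.append({
                "lr": p["lr"] * 10 ** rng.normal(0, 0.2),  # mutate hyperparameter
                "W1": p["W1"].copy(),                       # Lamarckian: inherit trained weights
                "W2": p["W2"].copy(),
            })
        pop = parents + children
    return pop[0]

X = rng.normal(size=(64, 8))
best = evolve(X)
```

In the stacked setting, once one layer's evolution finishes, its encoder output becomes the input `X` for a fresh neuroevolutionary run on the next layer.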

Original language: English
Title: 2019 IEEE Congress on Evolutionary Computation, CEC 2019 - Proceedings
Number of pages: 8
Publisher: Institute of Electrical and Electronics Engineers Inc.
Publication date: 2019
Pages: 823-830
Article number: 8790182
ISBN (electronic): 9781728121536
DOI
Status: Published - 2019
Event: 2019 IEEE Congress on Evolutionary Computation, CEC 2019 - Wellington, New Zealand
Duration: 10 Jun 2019 to 13 Jun 2019

Conference

Conference: 2019 IEEE Congress on Evolutionary Computation, CEC 2019
Country: New Zealand
City: Wellington
Period: 10/06/2019 to 13/06/2019
Sponsors: Facebook, IEEE, IEEE CIS, Tourism New Zealand, Victoria University of Wellington, et al.
