Publications

Project Acronym: DNAD
Title: Distributed Methods For Neural Architecture Design
Affiliation: University of Macedonia
PI: George Kyriakides
Research Field: Mathematics and Computer Sciences

Comparison of Neural Network Optimizers for Relative Ranking Retention Between Neural Architectures
by Kyriakides, George and Margaritis, Konstantinos
Abstract:
Autonomous design and optimization of neural networks is gaining increasing attention from the research community. The main barrier is the computational resources required to conduct experimental and production projects. Although most researchers focus on new design methodologies, the main computational cost remains the evaluation of candidate architectures. In this paper we investigate the feasibility of reduced-epoch training by measuring the rank correlation coefficients between sets of optimizers, given a fixed number of training epochs. We discover ranking correlations of more than 0.75, and up to 0.964, between Adam with 50 training epochs, stochastic gradient descent with Nesterov momentum with 10 training epochs, and Adam with 20 training epochs. Moreover, we show the ability of genetic algorithms to find high-quality solutions to a function by searching in a perturbed search space, given that certain correlation criteria are met.
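
As an illustration of the ranking-retention idea described in the abstract, the following minimal sketch (not code from the paper) computes rank correlation coefficients between validation accuracies of the same candidate architectures evaluated under a full training budget and a reduced one. The accuracy values and budget labels are hypothetical placeholders, and SciPy's kendalltau/spearmanr functions are assumed to be available.

# Illustrative sketch only: how strongly a cheap training budget preserves
# the relative ranking of candidate architectures.
from scipy.stats import kendalltau, spearmanr

# Hypothetical validation accuracies of the same five candidate architectures
# under a full budget (e.g. Adam, 50 epochs) and a reduced budget
# (e.g. SGD with Nesterov momentum, 10 epochs).
acc_full_budget = [0.91, 0.88, 0.93, 0.85, 0.90]
acc_reduced_budget = [0.82, 0.79, 0.86, 0.74, 0.80]

tau, tau_p = kendalltau(acc_full_budget, acc_reduced_budget)
rho, rho_p = spearmanr(acc_full_budget, acc_reduced_budget)

# A high rank correlation suggests candidate architectures can be compared
# under the reduced budget instead of paying for full training.
print(f"Kendall tau: {tau:.3f} (p={tau_p:.3f})")
print(f"Spearman rho: {rho:.3f} (p={rho_p:.3f})")
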
Reference:
Comparison of Neural Network Optimizers for Relative Ranking Retention Between Neural Architectures (Kyriakides, George and Margaritis, Konstantinos), In Artificial Intelligence Applications and Innovations (MacIntyre, John, Maglogiannis, Ilias, Iliadis, Lazaros, Pimenidis, Elias, eds.), Springer International Publishing, 2019.
Bibtex Entry:
@inproceedings{10.1007-978-3-030-19823-7_22,
 author = {Kyriakides, George and Margaritis, Konstantinos},
 editor = {MacIntyre, John and Maglogiannis, Ilias and Iliadis, Lazaros and Pimenidis, Elias},
 title = {Comparison of Neural Network Optimizers for Relative Ranking Retention Between Neural Architectures},
 booktitle = {Artificial Intelligence Applications and Innovations},
 year = {2019},
 bibyear = {2019},
 publisher = {Springer International Publishing},
 address = {Cham},
 pages = {272--281},
 abstract = {Autonomous design and optimization of neural networks is gaining increasing attention from the research community. The main barrier is the computational resources required to conduct experimental and production projects. Although most researchers focus on new design methodologies, the main computational cost remains the evaluation of candidate architectures. In this paper we investigate the feasibility of reduced-epoch training by measuring the rank correlation coefficients between sets of optimizers, given a fixed number of training epochs. We discover ranking correlations of more than 0.75, and up to 0.964, between Adam with 50 training epochs, stochastic gradient descent with Nesterov momentum with 10 training epochs, and Adam with 20 training epochs. Moreover, we show the ability of genetic algorithms to find high-quality solutions to a function by searching in a perturbed search space, given that certain correlation criteria are met.},
 isbn = {978-3-030-19823-7},
 doi = {10.1007/978-3-030-19823-7_22},
 url = {https://doi.org/10.1007/978-3-030-19823-7_22},
}