Publications

Project Acronym: DNAD
Title: Distributed Methods for Neural Architecture Design
Affiliation: University of Macedonia
PI: George Kyriakides
Research Field: Mathematics and Computer Science

Regularized Evolution for Macro Neural Architecture Search
by George Kyriakides, Konstantinos Margaritis
Abstract:
Neural Architecture Search is becoming an increasingly popular research field and method to design deep learning architectures. Most research focuses on searching for small blocks of deep learning operations, or micro-search. This method yields satisfactory results but demands prior knowledge of the macro architecture’s structure. Generally, methods that do not utilize macro structure knowledge perform worse but are able to be applied to datasets of completely new domains. In this paper, we propose a macro NAS methodology which utilizes concepts of Regularized Evolution and Macro Neural Architecture Search (DeepNEAT), and apply it to the Fashion-MNIST dataset. By utilizing our method, we are able to produce networks that outperform other macro NAS methods on the dataset, when the same post-search inference methods are used. Furthermore, we are able to achieve 94.46% test accuracy, while requiring considerably fewer epochs to fully train our network.
Reference:
Regularized Evolution for Macro Neural Architecture Search (George Kyriakides, Konstantinos Margaritis), In Maglogiannis I., Iliadis L., Pimenidis E. (eds) Artificial Intelligence Applications and Innovations. AIAI 2020. IFIP Advances in Information and Communication Technology, volume 584, 2020.
Bibtex Entry:
@inproceedings{doi:10.1007-978-3-030-49186-4_10,
 author = {Kyriakides, George and Margaritis, Konstantinos},
 doi = {10.1007/978-3-030-49186-4_10},
 url = {https://doi.org/10.1007/978-3-030-49186-4_10},
 year = {2020},
 editor = {Maglogiannis, I. and Iliadis, L. and Pimenidis, E.},
 booktitle = {Artificial Intelligence Applications and Innovations (AIAI 2020)},
 series = {IFIP Advances in Information and Communication Technology},
 volume = {584},
 pages = {111--122},
 title = {Regularized Evolution for Macro Neural Architecture Search},
 abstract = {Neural Architecture Search is becoming an increasingly popular research field and method to design deep learning architectures. Most research focuses on searching for small blocks of deep learning operations, or micro-search. This method yields satisfactory results but demands prior knowledge of the macro architecture’s structure. Generally, methods that do not utilize macro structure knowledge perform worse but are able to be applied to datasets of completely new domains. In this paper, we propose a macro NAS methodology which utilizes concepts of Regularized Evolution and Macro Neural Architecture Search (DeepNEAT), and apply it to the Fashion-MNIST dataset. By utilizing our method, we are able to produce networks that outperform other macro NAS methods on the dataset, when the same post-search inference methods are used. Furthermore, we are able to achieve 94.46% test accuracy, while requiring considerably fewer epochs to fully train our network.},
}
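
For readers unfamiliar with the search strategy named in the abstract, the core loop of Regularized Evolution (also called aging evolution) can be sketched as below. This is a minimal illustrative sketch, not the paper's implementation: the `fitness`, `mutate`, and `random_arch` callables and all parameter values are placeholder assumptions.

```python
import random
from collections import deque

def regularized_evolution(fitness, mutate, random_arch,
                          population_size=20, sample_size=5, cycles=200):
    """Aging (regularized) evolution: at each cycle the OLDEST individual
    is removed, regardless of fitness, which regularizes the search by
    forcing architectures to be periodically re-discovered."""
    population = deque()   # ordered by age: left = oldest
    history = []           # every individual ever evaluated
    # Seed the population with random architectures.
    while len(population) < population_size:
        arch = random_arch()
        population.append((arch, fitness(arch)))
        history.append(population[-1])
    # Evolve: tournament-select a parent, mutate it, age out the oldest.
    for _ in range(cycles):
        sample = random.sample(list(population), sample_size)
        parent = max(sample, key=lambda pair: pair[1])
        child = mutate(parent[0])
        population.append((child, fitness(child)))
        history.append(population[-1])
        population.popleft()  # remove the oldest, not the worst
    return max(history, key=lambda pair: pair[1])
```

In the paper's setting, `fitness` would correspond to the validation accuracy of a trained candidate network and `mutate` would edit the macro architecture graph; the toy callables above only demonstrate the selection-and-aging loop itself.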