Efficient Machine Learning through Evolving Combined Deep Neural Networks

Published in Genetic and Evolutionary Computation Conference, 2020

Artificial Neural Networks (ANNs) with a fixed topology are increasingly used in everyday applications. However, for some problems it is difficult to design an ANN by hand. Genetic algorithms such as NeuroEvolution of Augmenting Topologies (NEAT) have therefore been developed to find both topologies and weights. The downside of NEAT is that it often generates large, inefficient ANNs.

In this paper, we introduce an approach called Turbo NEAT, which combines divide-and-conquer methods with NEAT to enable a symbiosis of smaller, specialized ANNs. In addition, we optimize the weights of the candidate ANNs through backpropagation in order to compare their topologies more fairly. Experiments on several problems show that these approaches allow the handling of complex problems and lead to efficient ANNs.
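The combination of evolutionary topology search with backpropagation-based weight refinement can be illustrated with a toy sketch. The code below is a hypothetical, simplified illustration and not the paper's implementation: a small population of networks with differing hidden-layer sizes (standing in for evolved topologies) is refined with a few backpropagation steps before fitness evaluation, so that each topology is compared at reasonably tuned weights; the fittest individuals then survive and produce mutated offspring.

```python
import numpy as np

# Hypothetical sketch, not the paper's Turbo NEAT implementation:
# evolve small 2-hidden-1 networks on XOR, refining each candidate's
# weights with backpropagation before measuring its fitness.

rng = np.random.default_rng(0)

def forward(w1, w2, x):
    h = np.tanh(x @ w1)                      # hidden activations
    return 1.0 / (1.0 + np.exp(-(h @ w2)))   # sigmoid output

def backprop_step(w1, w2, x, y, lr=0.1):
    """One full-batch gradient step (cross-entropy loss, sigmoid output)."""
    h = np.tanh(x @ w1)
    out = 1.0 / (1.0 + np.exp(-(h @ w2)))
    err = out - y                            # dLoss/dLogit for cross-entropy
    grad_w2 = h.T @ err
    grad_h = (err @ w2.T) * (1.0 - h ** 2)   # tanh derivative
    grad_w1 = x.T @ grad_h
    return w1 - lr * grad_w1, w2 - lr * grad_w2

def fitness(w1, w2, x, y):
    """Negative mean squared error; higher is better."""
    return -float(np.mean((forward(w1, w2, x) - y) ** 2))

# XOR as a toy task
x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Population with varying hidden sizes, a stand-in for topology search
population = [(rng.normal(size=(2, h)), rng.normal(size=(h, 1)))
              for h in (2, 3, 4, 5)]

for gen in range(30):
    # Lamarckian refinement: backprop before evaluation
    refined = []
    for w1, w2 in population:
        for _ in range(100):
            w1, w2 = backprop_step(w1, w2, x, y)
        refined.append((w1, w2))
    # Select the fittest half, refill with mutated copies
    refined.sort(key=lambda p: fitness(*p, x, y), reverse=True)
    parents = refined[:2]
    children = [(w1 + 0.1 * rng.normal(size=w1.shape),
                 w2 + 0.1 * rng.normal(size=w2.shape))
                for w1, w2 in parents]
    population = parents + children

best = max(population, key=lambda p: fitness(*p, x, y))
print("best MSE on XOR:", round(-fitness(*best, x, y), 4))
```

The key design choice sketched here is Lamarckian evolution: the weights learned by backpropagation are written back into the individual before selection, so the genetic search is free to focus on topology rather than on fine-tuning weights.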

Download here