A GA Based Approach to Optimizing Weights and More

Tanmay Khandait

Ahmedabad, Gujarat


In this project, I am training a neural network using a genetic algorithm instead of back-propagation-based methods. The main objective is to study the convergence of this method on a simple MLP, followed by more complex networks.

Project status: Under Development

Artificial Intelligence


Overview / Usage

I am using this research to study how well evolutionary algorithms perform when training a basic MLP. The initial task is to develop code that finds optimal weights for classifying the Iris data set. During this initial study, it will be important to understand the landscape of the loss function and avoid poor minima that can lead to over-fitting. The initial study should also uncover the problems that arise when scaling up to more complex neural network architectures.
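As a minimal sketch of the setup described above, the snippet below treats a flat weight vector as a candidate solution and scores it by the classification accuracy of a one-hidden-layer MLP. The layer sizes (4-8-3) match the Iris problem (4 features, 3 classes), but the data here is a synthetic stand-in, and the `unpack`/`fitness` names are illustrative assumptions, not the project's actual code.

```python
import numpy as np

# Layer sizes matching Iris: 4 inputs, one hidden layer of 8, 3 classes.
N_IN, N_HID, N_OUT = 4, 8, 3
N_WEIGHTS = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT

def unpack(w):
    """Split a flat chromosome into weight matrices and bias vectors."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    return W1, b1, W2, b2

def fitness(w, X, y):
    """Classification accuracy of the MLP encoded by chromosome w."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)      # hidden layer activations
    logits = h @ W2 + b2          # output layer scores
    return np.mean(np.argmax(logits, axis=1) == y)

# Synthetic stand-in for the Iris features and labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, N_IN))
y = rng.integers(0, N_OUT, size=30)
w = rng.normal(size=N_WEIGHTS)
acc = fitness(w, X, y)
```

A fitness function of this shape is what the GA would maximize in place of a back-propagated loss gradient.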

Later on, based on these initial developments, I will work on using GAs to train more complex networks such as RNNs and LSTMs. One interesting study would be comparing the convergence of the back-propagation algorithm and the GA over various iterations.

Methodology / Approach

I have started developing the code for the basic GA. The important steps in a GA are:

  1. Selection
  2. Crossover; and
  3. Mutation

Initially, the weights are randomly initialized for a population, and the fitness function evaluates and sorts the candidate weight vectors. Crossover and mutation are performed using standard real-encoded GA operators. The weights of every layer are treated as a chromosome, and the real-encoded GA operations are then applied. After my initial analysis of the proposed methods, I intend to develop more optimized algorithms.
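The three steps above can be sketched as one generation of a real-encoded GA over flat weight vectors. The operator choices here (truncation selection with elitism, blend crossover, Gaussian mutation) and the quadratic placeholder objective are assumptions for illustration, not the project's exact operators; in the project, the fitness would score an MLP on Iris.

```python
import numpy as np

rng = np.random.default_rng(1)
POP, DIM, ELITE, MUT_SIGMA = 20, 51, 4, 0.1

def fitness(w):
    # Placeholder objective to maximize; the real one would be
    # the MLP's classification accuracy on the Iris data.
    return -np.sum(w ** 2)

def evolve(pop):
    # 1. Selection: sort by fitness, keep the best as parents.
    pop = pop[np.argsort([-fitness(w) for w in pop])]
    parents = pop[:ELITE]
    children = [parents[0].copy()]          # elitism: keep the best
    while len(children) < POP:
        p1, p2 = parents[rng.choice(ELITE, 2, replace=False)]
        # 2. Crossover: blend (arithmetic) crossover, gene by gene.
        alpha = rng.uniform(size=DIM)
        child = alpha * p1 + (1 - alpha) * p2
        # 3. Mutation: small Gaussian perturbation of each gene.
        child += rng.normal(scale=MUT_SIGMA, size=DIM)
        children.append(child)
    return np.array(children)

pop = rng.normal(size=(POP, DIM))
for _ in range(30):
    pop = evolve(pop)
best = max(pop, key=fitness)
```

Because the elite individual is carried over unchanged, the best fitness in the population is non-decreasing across generations, which makes the convergence curves directly comparable against back-propagation.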

Technologies Used

  1. Python

Repository

https://github.com/DaitTan/geneticNN
