Genetic Algorithms (Specifically with Keras)
I can't get my deep genetic algorithm snake game to work and I can't figure out why. At this point, I think either my crossover_rate/mutation_rate values or the actual crossover code itself must be wrong. I'm hoping someone here can help me figure out which.
So, here's my understanding of deep genetic algorithms:
You have a pool of agents. They're randomly generated. You have each of them run, tracking their fitness up until they die. When all agents in the pool are dead, you select some number of the fittest of them.
You then take those models (the parents). You grab two parents and use one of them as the base, swapping in some weight and bias values from the other parent. That's crossover. You then go through those weights and bias values and randomly increase or decrease some of them by just a bit. That's mutation.
As far as I can tell, that understanding is correct. But then again, maybe it isn't. Something is broken here, so maybe I just don't have a clue how a DGA actually works.
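To make that concrete, here's a toy numpy sketch of my mental model of one generation (everything here, like `make_child` and the stand-in fitness, is just for illustration and not my actual code):

```
import numpy as np

rng = np.random.default_rng(0)

def make_child(base, other, crossover_rate=0.10, mutation_rate=0.10, degree=0.50):
    """Start from `base`, swap in some of `other` (crossover), then nudge some values (mutation)."""
    child = base.copy()
    swap = rng.random(child.shape) < crossover_rate
    child[swap] = other[swap]
    nudge = rng.random(child.shape) < mutation_rate
    child[nudge] += child[nudge] * rng.uniform(-degree, degree, nudge.sum())
    return child

# One toy generation: evaluate everyone, keep the fittest, breed replacements.
population = [rng.normal(size=100) for _ in range(20)]
fitness = [agent.sum() for agent in population]           # stand-in for "play the game"
ranked = sorted(zip(fitness, population), key=lambda fa: -fa[0])
parents = [agent for _, agent in ranked[:4]]              # the fittest become parents
population = [make_child(parents[i % 4], parents[(i + 1) % 4]) for i in range(20)]
```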
But assuming that my understanding is correct, here's my code:
It's initialized with my chosen fitness threshold (a percentage if it's < 1, a number if it's > 1), crossover rate, mutation rate, and the degree to which I mutate a value. It also has a legacy pool where I store the absolute best agents I've ever gotten, so I can use them as the parents for the new generation.
(FYI 1: The models have an input size of 11, two Dense layers of 128 units each, and an output layer of 3.)
(FYI 2: The legacy_pool and new_generation lists are lists of agents, where each agent is a two-item list: [0] is the agent's model and [1] is the agent's fitness.)
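For reference, a model with that shape would look something like this (the activations are my guess; the model-building code isn't what I'm asking about):

```
from tensorflow import keras

def build_agent_model():
    # 11 inputs -> two Dense(128) hidden layers -> 3 outputs
    return keras.Sequential([
        keras.Input(shape=(11,)),
        keras.layers.Dense(128, activation="relu"),
        keras.layers.Dense(128, activation="relu"),
        keras.layers.Dense(3, activation="softmax"),
    ])
```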
```
from random import random, uniform

# (AgentGA is my agent-container class and `population` is an AgentGA instance;
# both are defined elsewhere in my project.)

class GeneticAlgorithm():
    def __init__(self):
        #self.fitness_threshold = 0.10
        self.fitness_threshold = 2
        self.crossover_rate = 0.10
        self.mutation_rate = 0.10
        self.mutation_degree = 0.50

        # Pool of previous parents so we can use the fittest of all time
        self.legacy_pool = None

    def _improvement_check(self, new_generation):
        ''' Only allow the parents to be the absolute fittest of all generations. '''
        # For the first time, we just set it to the first generation of parents
        if self.legacy_pool is None:
            self.legacy_pool = new_generation
        # For every other generation, we actually check for improvements
        else:
            # Reverse the lists to increase accuracy
            new_generation.reverse()
            self.legacy_pool.reverse()

            # Check for improvements
            for i in range(len(new_generation)):
                for j in range(len(self.legacy_pool)):
                    if new_generation[i][1] > self.legacy_pool[j][1]:
                        self.legacy_pool[j] = new_generation[i]
                        break  # so we only add a new agent once

            # Resort the legacy pool
            self.legacy_pool.sort(key=lambda a: a[1], reverse=True)

    def breed_population(self, population):
        '''
        Crossover the weights and biases of the fittest members of the population,
        then randomly mutate weights and biases.
        '''
        # Get the new generation and the number of children each pair needs to have
        new_generation, num_children = population.get_parents(self.fitness_threshold)

        # Update the legacy pool of agents to include any members of the new
        # generation that are better than the old generations
        self._improvement_check(new_generation)

        # Get the parent models
        parents = [agent[0] for agent in self.legacy_pool]
        #shuffle(parents)  # Shuffle the parents into a random order

        # Initialize children
        children = AgentGA(population.population_size)

        # Crossover and mutate to get the children
        for c in range(num_children):
            for i in range(1, len(parents) - 1, 2):
                children.agents[i * c][0] = self.crossover(
                    children.agents[i * c][0], parents[i], parents[i + 1])
        return children

    def crossover(self, child, parent_one, parent_two):
        ''' Apply crossover and mutation between two parents in order to get a child. '''
        # Crossover and mutate each layer
        for i in range(len(child.layers)):
            # Get weights and biases of the parents
            # p1 acts as the base for the child
            child_data = parent_one.layers[i].get_weights()
            p2_data = parent_two.layers[i].get_weights()

            # Handle the weights
            for x in range(child_data[0].shape[0]):
                for y in range(child_data[0].shape[1]):
                    # Check to see if crossover should occur
                    if random() < self.crossover_rate:
                        child_data[0][x][y] = p2_data[0][x][y]
                    # Check to see if mutation should occur
                    if random() < self.mutation_rate:
                        child_data[0][x][y] += child_data[0][x][y] * uniform(
                            -self.mutation_degree, self.mutation_degree)

            # Handle the biases
            for x in range(child_data[1].shape[0]):
                # Check to see if crossover should occur
                if random() < self.crossover_rate:
                    child_data[1][x] = p2_data[1][x]
                # Check to see if mutation should occur
                if random() < self.mutation_rate:
                    child_data[1][x] += child_data[1][x] * uniform(
                        -self.mutation_degree, self.mutation_degree)

            # Set weights and biases in child
            child.layers[i].build(input_shape=child_data[0].shape[0])
            child.layers[i].set_weights(child_data)
        return child
```
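For what it's worth, here's the back-of-the-envelope check I'd run on the rates themselves, mimicking the per-weight loop from crossover() on a fake (11, 128) kernel (standalone, numpy only, nothing here is from my actual project):

```
import numpy as np
from random import random, uniform

crossover_rate, mutation_rate, degree = 0.10, 0.10, 0.50
p1 = np.random.normal(size=(11, 128))
p2 = np.random.normal(size=(11, 128))
child = p1.copy()
for x in range(child.shape[0]):
    for y in range(child.shape[1]):
        if random() < crossover_rate:
            child[x][y] = p2[x][y]
        if random() < mutation_rate:
            child[x][y] += child[x][y] * uniform(-degree, degree)

print("unchanged from p1:", np.isclose(child, p1).mean())  # expect ~0.81 (0.9 * 0.9)
print("taken from p2:", np.isclose(child, p2).mean())      # expect ~0.09 (0.1 * 0.9)
```

If those fractions come out near 0.81 and 0.09, the rates are doing what they should, which would suggest the problem is elsewhere (e.g., in how breed_population assigns children).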