Genetic Algorithms (GAs) can find the minimum of a quadratic equation given a range. While not the fastest or most precise method, this is a great way to become familiar with how to set up GAs and how they work.
Continuing the discussion from Part 1, which details what GAs are and when they are helpful, this section shows one way to create a GA to find the minimum of a quadratic. The point is not to make the fastest algorithm that will find the minimum of a quadratic – that can be easily calculated using known quadratic formulas. The point is to show how a GA could be used and to explore how the algorithm works.
Preparing the Search
So to recap, these are the main steps that make up a genetic algorithm:
- Generate the initial population
- Sort the population based on performance
- Have individuals mate, with each one's probability of mating based on its performance.
- Mate until the next generation is the same size as the current one. This becomes the generation to work with.
- Each offspring goes through a crossover phase using the genes of its two parents, and a mutation phase on its own.
- Repeat the sorting and mating steps for a given number of generations.
- Sort the population one last time and take the best.
This has been translated into the GeneticAlgorithmSearch class seen below. It lets specific problems define how to generate populations, how gene crossover and mutation occur, and how to evaluate the population.
Evaluation and sorting are quite important. After ranking individuals by performance, the search calls __create_probabilistic_population_for_pick() to build a list in which each member appears as many times as the index of its rank. Essentially, the better an individual did in the evaluation, the more lottery tickets it receives to help procreate the next generation. In a population of size 10, the worst-performing individual receives no tickets, the second-to-last receives one, and first place gets nine.
With these lottery tickets in hand, the population is called to procreate at random until the next generation is filled. It’s entirely possible that some individuals are chosen more than once or not at all.
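To make the ticket idea concrete, here is a tiny standalone sketch of the same weighted-pick logic (the toy population and names are mine, not from the class below):

```python
# Toy illustration of the lottery-ticket pick (population sorted worst to best, best last)
population = ['worst', 'middle', 'best']
tickets = []
for position, member in enumerate(population):
    tickets.extend([member] * position)   # rank index = number of tickets
print(tickets)  # ['middle', 'best', 'best'] -> better members are drawn more often
```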
This particular search also includes the ability to carry the best individual (or solution) to the next generation. This helps the algorithm stay greedy, but it can be turned off if desired.
```python
import random
from abc import abstractmethod


class GeneticAlgorithmSearch:
    def __init__(self, num_generations: int = 100):
        self.keep_best = True
        self.num_generations = num_generations
        self.verbose_print_every = 10  # Display where we are every 10%
        self.verbose = True
        self._current_population = []
        self._best_so_far = None
        self.crossover_rate = 50  # 50%
        self.mutation_rate = 10   # 10%
        self.reverse_sort = False

    @abstractmethod
    def _generate_initial_population(self):
        raise NotImplementedError

    @abstractmethod
    def _handle_crossover_between(self, chromosome1, chromosome2):
        raise NotImplementedError

    @abstractmethod
    def _handle_mutation_in(self, chromosome):
        raise NotImplementedError

    @abstractmethod
    def _should_exclude(self, chromosome):
        raise NotImplementedError

    @abstractmethod
    def _evaluate_chromosome(self, chromosome):
        raise NotImplementedError

    def __create_probabilistic_population_for_pick(self):
        """Assumes population is already sorted by performance - best last"""
        to_return = []
        for position, chromosome in enumerate(self._current_population):
            to_return.extend([chromosome] * position)
        return to_return

    def __verbose(self, *to_print):
        if self.verbose:
            print(*to_print, sep=" ")

    def run_search(self):
        self._current_population = self._generate_initial_population()
        for i in range(self.num_generations):
            self.__verbose('Starting generation {}'.format(i))

            # Evaluate
            self._current_population.sort(key=self._evaluate_chromosome, reverse=self.reverse_sort)
            self._best_so_far = self._current_population[-1]
            self.__verbose('\tBest score so far = {}'.format(self._evaluate_chromosome(self._best_so_far)))

            # Creating new population
            new_population = []

            # Copy best over if needed
            if self.keep_best:
                new_population.append(self._best_so_far)

            # Filling the rest
            probabilistic_population_for_mating = self.__create_probabilistic_population_for_pick()
            while len(new_population) < len(self._current_population):
                parent1 = random.choice(probabilistic_population_for_mating)
                parent2 = random.choice(probabilistic_population_for_mating)

                # Performing crossover
                child = self._handle_crossover_between(parent1, parent2)

                # Performing mutation
                child = self._handle_mutation_in(child)

                # Ensuring child is good
                if self._should_exclude(child):
                    continue

                new_population.append(child)

            self._current_population = new_population

    def get_result(self):
        return self._best_so_far
```
Searching for the Quadratic Minimum
Here, I extended the class above to specifically search for the minimum of a quadratic. To keep things simple, this class expects that the solution will be an integer within the given boundaries. Additionally, the quadratic equation and any search options are passed in when the class is constructed.
```python
class SmallMinQuadraticGA(GeneticAlgorithmSearch):
    def __init__(self, equation, boundary: (int, int), num_generations=5, population_size=4):
        GeneticAlgorithmSearch.__init__(self, num_generations=num_generations)
        self._quadratic_eq = equation
        self._lower_bound, self._upper_bound = boundary
        self._population_size = population_size
        self.reverse_sort = True
        assert self._lower_bound < self._upper_bound
```
I had to reverse the sort because Python sorts numbers in ascending order, but we want the minimum of the quadratic and the search expects the best member of the population to be at the last index.
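A quick illustration of that convention, using made-up scores:

```python
scores = [176, -4, 124, -20]
scores.sort(reverse=True)  # descending, like reverse_sort=True
print(scores)              # [176, 124, -4, -20]
print(scores[-1])          # -20 -> the minimum (the "best" here) sits at the last index
```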
Data Representation
This is one way to store integers and convert them to and from a binary string, which will represent the chromosome. It could be optimized, but it is easy to read for a first encounter:
```python
class SmallNumber:
    def __init__(self, decimal_number: int):
        self.decimal_number = decimal_number

    @classmethod
    def from_binary_str(cls, binary: str):
        return SmallNumber(int(binary, 2))

    def __str__(self):
        binary = bin(self.decimal_number)[2:]
        while len(binary) < 6:
            binary = '0' + binary
        return binary

    def __repr__(self):
        return str(self.decimal_number)
```
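A quick round-trip check of the representation (my own snippet, not part of the original class):

```python
n = SmallNumber(5)
print(str(n))                               # '000101' -> the 6-bit chromosome
back = SmallNumber.from_binary_str(str(n))
print(back.decimal_number)                  # 5 -> converts back losslessly
```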
Initial Population
This is the simplest part; an individual is only a random number within bounds:
```python
def _generate_initial_population(self) -> [SmallNumber]:
    return [SmallNumber(random.randint(self._lower_bound, self._upper_bound))
            for _ in range(self._population_size)]
```
Crossovers
There are a plethora of ways a crossover could be implemented for this example. Generally, crossovers are where most of the algorithm's diversity comes from. The main goal of any crossover is that some portion of the new individual's genes comes from each parent, and how much comes from either parent is tunable via the crossover rate.
Here, I’m going to employ the idea that each gene (or bit) can come from either parent. So if one parent is 000000 and another is 111111, it could make children like 101010 or 110010 or 000111. Each bit will be subject to the probability of crossover:
```python
def _handle_crossover_between(self, chromosome1: SmallNumber, chromosome2: SmallNumber) -> SmallNumber:
    offspring_genes = ''.join(gene1 if random.randint(0, 100) < self.crossover_rate else gene2
                              for gene1, gene2 in zip(str(chromosome1), str(chromosome2)))
    return SmallNumber.from_binary_str(offspring_genes)
For this example, the closer the crossover rate is to 50%, the more evenly the genes will come from both parents. If the rate is lower, more genes are likely to come from the second parent; if it is higher, more genes are likely to come from the first.
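If you want to sanity-check that bias, a rough standalone simulation of the same coin flip (names and trial counts are mine) looks like this:

```python
import random

def fraction_from_parent1(rate, trials=10_000, length=6):
    """Estimate the fraction of genes an offspring inherits from the first parent."""
    hits = sum(1 for _ in range(trials * length) if random.randint(0, 100) < rate)
    return hits / (trials * length)

for rate in (10, 50, 90):
    print(rate, round(fraction_from_parent1(rate), 2))
# Roughly 0.10, 0.50, 0.89 -> a higher rate pulls more genes from the first parent
```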
Mutation
Again, there are plenty of mutation strategies that could be employed. This implementation flips each bit with a probability given by the mutation rate. The lower the mutation rate, the less likely the population is to diverge from a well-formed solution:
```python
def _handle_mutation_in(self, chromosome: SmallNumber) -> SmallNumber:
    flip_bit = lambda x: '0' if x == '1' else '1'
    binary_generator = (flip_bit(bit) if random.randint(0, 100) < self.mutation_rate else bit
                        for bit in str(chromosome))
    return SmallNumber.from_binary_str(''.join(binary_generator))
```
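As a back-of-the-envelope check (my own, not from the post): with a 10% mutation rate and 6-bit chromosomes, a child flips about 0.1 × 6 ≈ 0.6 bits on average, so most offspring keep their inherited genes intact.

```python
import random

# Estimate the average number of flipped bits per 6-bit chromosome at a 10% rate
trials = 10_000
flips = sum(1 for _ in range(trials * 6) if random.randint(0, 100) < 10) / trials
print(round(flips, 2))  # ~0.59 flips per child on average
```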
Evaluation
This piece is also quite simple: take the chromosome's decimal value and plug it into the quadratic equation.
```python
def _evaluate_chromosome(self, chromosome: SmallNumber):
    return self._quadratic_eq(chromosome.decimal_number)
```
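One detail the post doesn't show: the base class also calls _should_exclude() on every child, so the subclass needs to implement it before the search will run. A minimal version, assuming we simply reject values outside the boundary, might look like this (my sketch, not necessarily what the original repository does):

```python
def _should_exclude(self, chromosome: SmallNumber) -> bool:
    # Reject children whose decoded value falls outside the allowed range
    return not (self._lower_bound <= chromosome.decimal_number <= self._upper_bound)
```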
Putting it all together
GAs are great because a developer only needs to loosely orchestrate how the search should be conducted. Generating the population, defining how to crossover and mutate, and evaluating individuals are usually simple to write. I've often found it even easier than iterating over all possibilities for some problems.
Now to give this a whirl! For this run, I'll use y = x² - 10x + 5, with x restricted to integers between 0 and 63. The minimum should be at x = 5, where y = -20.
I set the crossover rate to 50%, the mutation rate to 10%, the population size to 4, and let it run for 5 generations. That means at most 20 numbers were evaluated (4 individuals × 5 generations) instead of the full 64.
Here’s how to set a run up:
```python
if __name__ == '__main__':
    ga = SmallMinQuadraticGA(equation=lambda x: (x**2 - 10*x + 5), boundary=(0, 63))
    ga.run_search()
    best_estimate = ga.get_result()
    print('Best Estimated Minimum {}'.format(best_estimate.decimal_number))
```
Here is the output from the first run:
```
Starting generation 0
    Best score so far = 176
Starting generation 1
    Best score so far = 124
Starting generation 2
    Best score so far = -20
Starting generation 3
    Best score so far = -20
Starting generation 4
Best Estimated Minimum 5
```
Hey! It found the right answer! And on the 3rd generation. Not bad! It seems to have started pretty far to the right on the X-axis, but shifted back quickly.
Here's what happened when I ran it again.
```
Starting generation 0
    Best score so far = 1276
Starting generation 1
    Best score so far = 1276
Starting generation 2
    Best score so far = -4
Starting generation 3
    Best score so far = -20
Starting generation 4
    Best score so far = -20
Best Estimated Minimum 5
```
It did just as well that time. The initial population seems to have been worse than in the last run, but by the third generation it had adjusted in the right direction, and the fourth generation found the answer.
Alright, one more time:
```
Starting generation 0
    Best score so far = -4
Starting generation 1
    Best score so far = -4
Starting generation 2
    Best score so far = -16
Starting generation 3
    Best score so far = -19
Starting generation 4
    Best score so far = -19
Best Estimated Minimum 4
```
Well, it didn't find the exact answer like it did the last two times, which shows it isn't brute-forcing every possible answer, but it did land pretty close: x = 4.
What’s Next?
Now with a solid example under our belt, we can get into the fun stuff. In part 3, I’ll take data regarding fantasy-football players and find some of the best lineups to use for FanDuel competitions. Better grab a beer!
Until then, see the full example in GitHub.