Genetic Algorithms

Python SELF EN
Level 60, Lesson 4

9.1 Introduction to Genetic Algorithms

Genetic algorithms (GA) are optimization and search methods inspired by natural selection: they mimic the mechanisms of biological evolution.

Genetic algorithms are used for solving complex problems where traditional methods might be inefficient. These algorithms use selection, crossover, and mutation mechanisms to evolve solutions.

Working Principles:

1. Initialization:

An initial population of possible solutions (chromosomes) is created. Each solution is encoded as a string (bit string, character string, or other structures).

2. Fitness Function:

Evaluates the quality of each solution in the population.

3. Evaluation:

Each solution (individual) is evaluated using the fitness function, which determines how well the solution addresses the task.

4. Selection:

Solutions with better fitness have a higher chance of being selected for reproduction. Methods like roulette wheel selection, tournament selection, and rank selection are often used.

5. Crossover:

Selected solutions (parents) are combined to create new solutions (offspring). The crossover can be single-point, two-point, or multi-point.

6. Mutation:

Random changes in the new solutions (mutations) are applied to introduce diversity. This helps the algorithm avoid local minima.

7. Replacement:

The new population replaces the old one, and the process repeats until a stopping condition is met (e.g., reaching a certain number of generations or achieving a certain fitness level).
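The steps above can be sketched with classic bit-string chromosomes. This is a minimal illustration, not library code: the helper names (`roulette_select`, `single_point_crossover`, `bit_flip_mutate`) and the "OneMax" fitness (count of 1-bits) are chosen here just for the example.

```python
import random

# Step 1: encoding — each chromosome is a fixed-length bit string
def random_chromosome(length):
    return [random.randint(0, 1) for _ in range(length)]

# Step 4: roulette wheel selection — the chance of being picked is
# proportional to fitness (assumes non-negative fitness values)
def roulette_select(population, fitnesses):
    pick = random.uniform(0, sum(fitnesses))
    running = 0.0
    for chromosome, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return chromosome
    return population[-1]

# Step 5: single-point crossover — cut both parents at a random point
# and swap the tails
def single_point_crossover(parent1, parent2):
    point = random.randint(1, len(parent1) - 1)
    return parent1[:point] + parent2[point:], parent2[:point] + parent1[point:]

# Step 6: bit-flip mutation — each gene flips with a small probability
def bit_flip_mutate(chromosome, rate=0.01):
    return [1 - gene if random.random() < rate else gene for gene in chromosome]

# One reproduction step for the "OneMax" task (maximize the number of 1-bits)
population = [random_chromosome(16) for _ in range(20)]
fitnesses = [sum(c) for c in population]
parent1 = roulette_select(population, fitnesses)
parent2 = roulette_select(population, fitnesses)
child1, child2 = single_point_crossover(parent1, parent2)
child1 = bit_flip_mutate(child1)
print(len(child1))  # offspring keeps the chromosome length: 16
```

Repeating this reproduction step until the new population is full, then swapping it in for the old one, is exactly the replacement loop from step 7.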

Advantages and Disadvantages

Advantages:

  • Wide range of applications: Can be used for solving various tasks, including those where analytical methods do not work.
  • Global optimization: Capable of finding global optima in multi-dimensional and complex spaces.
  • Flexibility: Can be used with any fitness function.

Disadvantages:

  • High computational cost: Requires significant computational resources, especially for large populations and complex tasks.
  • Difficulty in parameter tuning: Setting parameters (population size, mutation and crossover probability) can be challenging and highly affect performance.
  • No guarantee of optimal solution: No guarantee that the found solution will be globally optimal.

In fact, a genetic algorithm is a highly effective heuristic. It doesn't guarantee finding the best solution, even with access to all the data. However, for complex problems and massive amounts of data, it quickly produces a near-optimal solution.

9.2 Application Example

Let's consider an example of using a genetic algorithm for function optimization.


import random

# Genetic algorithm parameters
POPULATION_SIZE = 100
GENERATIONS = 1000
MUTATION_RATE = 0.01
TOURNAMENT_SIZE = 5
            
# Define the fitness function
def fitness_function(x):
    return -x**2 + 10*x
            
# Initialize population
def initialize_population(size):
    return [random.uniform(-10, 10) for _ in range(size)]
            
# Selection (tournament selection)
def tournament_selection(population):
    tournament = random.sample(population, TOURNAMENT_SIZE)
    return max(tournament, key=fitness_function)
            
# Crossover (arithmetic blend of two real-valued parents)
def crossover(parent1, parent2):
    alpha = random.random()
    return alpha * parent1 + (1 - alpha) * parent2
            
# Mutation
def mutate(individual):
    if random.random() < MUTATION_RATE:
        return individual + random.gauss(0, 1)
    return individual
            
# Main genetic algorithm loop
population = initialize_population(POPULATION_SIZE)
            
for generation in range(GENERATIONS):
    new_population = []
    for _ in range(POPULATION_SIZE):
        parent1 = tournament_selection(population)
        parent2 = tournament_selection(population)
        offspring = crossover(parent1, parent2)
        offspring = mutate(offspring)
        new_population.append(offspring)
    population = new_population
            
# Output the best solution
best_individual = max(population, key=fitness_function)
print("Best solution:", best_individual)
print("Function value:", fitness_function(best_individual))
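For this particular fitness function the optimum can be checked by hand, so it's easy to verify how close the GA gets: f(x) = -x² + 10x peaks where f'(x) = -2x + 10 = 0, i.e. at x = 5.

```python
# Analytic optimum of f(x) = -x**2 + 10*x:
# f'(x) = -2x + 10 = 0  ->  x = 5
x_opt = 5.0
f_opt = -x_opt**2 + 10 * x_opt
print(x_opt, f_opt)  # 5.0 25.0
```

A healthy run of the algorithm above should report a best solution near x = 5 with a function value near 25.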
            
        


9.3 Multi-dimensional Function Optimization

Task:

Find the minimum value of a multi-dimensional function.

Solution:

Let's adapt the previous template:


import numpy as np

def fitness_function(x):
    return np.sum(x ** 2)
            
def create_individual(dim):
    return np.random.uniform(-10, 10, dim)
            
def create_population(pop_size, dim):
    return np.array([create_individual(dim) for _ in range(pop_size)])
            
def select_individuals(population, fitness, num_parents):
    # Pick the num_parents individuals with the lowest fitness (minimization).
    # Each selected entry is overwritten with inf so it isn't picked twice;
    # note that this mutates the caller's fitness array.
    parents = np.empty((num_parents, population.shape[1]))
    for parent_num in range(num_parents):
        min_fitness_idx = np.argmin(fitness)
        parents[parent_num, :] = population[min_fitness_idx, :]
        fitness[min_fitness_idx] = float('inf')
    return parents
            
def crossover(parents, offspring_size):
    # Single-point crossover: the first half of the genes comes from one
    # parent, the second half from the next one
    offspring = np.empty(offspring_size)
    crossover_point = offspring_size[1] // 2
    for k in range(offspring_size[0]):
        parent1_idx = k % parents.shape[0]
        parent2_idx = (k + 1) % parents.shape[0]
        offspring[k, 0:crossover_point] = parents[parent1_idx, 0:crossover_point]
        offspring[k, crossover_point:] = parents[parent2_idx, crossover_point:]
    return offspring
            
def mutate(offspring, mutation_rate):
    # Perturb a randomly chosen gene rather than a fixed position
    for idx in range(offspring.shape[0]):
        if np.random.rand() < mutation_rate:
            gene_idx = np.random.randint(offspring.shape[1])
            offspring[idx, gene_idx] += np.random.uniform(-1.0, 1.0)
    return offspring
            
def genetic_algorithm(dim, pop_size, num_parents, mutation_rate, num_generations):
    population = create_population(pop_size, dim)
    for generation in range(num_generations):
        fitness = np.array([fitness_function(ind) for ind in population])
        parents = select_individuals(population, fitness, num_parents)
        offspring_crossover = crossover(parents, (pop_size - parents.shape[0], dim))
        offspring_mutation = mutate(offspring_crossover, mutation_rate)
        population[0:parents.shape[0], :] = parents
        population[parents.shape[0]:, :] = offspring_mutation
    # Recompute fitness: select_individuals overwrote the selected entries with inf
    fitness = np.array([fitness_function(ind) for ind in population])
    best_solution = population[np.argmin(fitness)]
    return best_solution

# Example usage:
dim = 10
pop_size = 100
num_parents = 20
mutation_rate = 0.01
num_generations = 1000
best_solution = genetic_algorithm(dim, pop_size, num_parents, mutation_rate, num_generations)
print(f"Best solution: {best_solution}")

        