moo-1.2: Genetic algorithm library

Safe Haskell: None
Language: Haskell98

Moo.GeneticAlgorithm.Continuous

Description

Continuous (real-coded) genetic algorithms. Candidate solutions are represented as lists of real variables.

Types

Initialization

getRandomGenomes Source #

Arguments

:: (Random a, Ord a) 
=> Int

n, how many genomes to generate

-> [(a, a)]

ranges for individual genome elements

-> Rand [Genome a]

random genomes

Generate n uniform random genomes with individual genome elements bounded by ranges. This corresponds to uniform random sampling of points (genomes) from a hyperrectangle whose bounding box is given by ranges.
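
As an illustration of the sampling described above (outside of moo's Rand monad), the hypothetical helper below draws each genome element independently with System.Random:

    import Control.Monad (replicateM)
    import System.Random (randomRIO)

    -- Sample n genomes uniformly from the hyperrectangle given by ranges:
    -- one independent uniform draw per genome element.
    randomGenomesIO :: Int -> [(Double, Double)] -> IO [[Double]]
    randomGenomesIO n ranges = replicateM n (mapM randomRIO ranges)

For example, randomGenomesIO 10 [(-5, 5), (0, 1)] yields 10 two-dimensional genomes inside the box [-5, 5] x [0, 1].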

uniformGenomes :: Int -> [(Double, Double)] -> [Genome Double] Source #

Generate at most popsize genomes uniformly distributed in ranges.

Selection

rouletteSelect :: Int -> SelectionOp a Source #

Objective-proportionate (roulette wheel) selection: select n random items, with each item's chance of being selected proportional to its objective function value (fitness). The objective function should be non-negative.
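
The mechanism can be sketched as sampling from the cumulative sum of objective values (a standalone illustration, not moo's implementation):

    import Control.Monad (replicateM)
    import System.Random (randomRIO)

    -- Pick n items, each with probability proportional to its non-negative
    -- objective value, by spinning a cumulative-sum "wheel" n times.
    rouletteIO :: Int -> [(g, Double)] -> IO [(g, Double)]
    rouletteIO n pop = replicateM n spin
      where
        total = sum (map snd pop)
        spin = do
          r <- randomRIO (0, total)
          return (pick r pop)
        pick r ((g, f) : rest)
          | r <= f || null rest = (g, f)
          | otherwise           = pick (r - f) rest
        pick _ [] = error "rouletteIO: empty population"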

stochasticUniversalSampling Source #

Arguments

:: Int

how many genomes to select

-> SelectionOp a 

Stochastic universal sampling (SUS) is a selection technique similar to roulette wheel selection. It gives weaker members a fair chance of being selected, proportional to their fitness. The objective function should be non-negative.
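
A standalone sketch of SUS (not moo's implementation): n equally spaced pointers with a single random offset walk the cumulative objective values, which gives fitness-proportional selection with lower variance than roulette selection.

    import System.Random (randomRIO)

    susIO :: Int -> [(g, Double)] -> IO [(g, Double)]
    susIO n pop = do
      let total = sum (map snd pop)
          step  = total / fromIntegral n
      offset <- randomRIO (0, step)
      let pointers = [offset + step * fromIntegral i | i <- [0 .. n - 1]]
      return (walk pointers 0 pop)
      where
        -- Advance through the population, selecting an individual once
        -- for every pointer that falls inside its objective-value slice.
        walk [] _ _ = []
        walk _ _ [] = []
        walk ps@(p : rest) acc ((g, f) : others)
          | p < acc + f = (g, f) : walk rest acc ((g, f) : others)
          | otherwise   = walk ps (acc + f) others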

tournamentSelect Source #

Arguments

:: ProblemType

type of the optimization problem

-> Int

size of the tournament group

-> Int

how many tournaments to run

-> SelectionOp a 

Performs tournament selection among size individuals and returns the winner; this is repeated n times.
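
A standalone sketch of one way tournament selection can work (the Bool flag stands in for moo's ProblemType):

    import Control.Monad (replicateM)
    import Data.List (maximumBy, minimumBy)
    import Data.Ord (comparing)
    import System.Random (randomRIO)

    -- Draw 'size' members at random (with replacement) and keep the best;
    -- repeat n times.
    tournamentIO :: Bool -> Int -> Int -> [(g, Double)] -> IO [(g, Double)]
    tournamentIO maximizing size n pop = replicateM n tournament
      where
        best = if maximizing
                 then maximumBy (comparing snd)
                 else minimumBy (comparing snd)
        tournament = do
          indices <- replicateM size (randomRIO (0, length pop - 1))
          return (best (map (pop !!) indices))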

Scaling and niching

withPopulationTransform :: (Population a -> Population a) -> SelectionOp a -> SelectionOp a Source #

Apply the given scaling or other transform to the population before selection.

withScale :: (Objective -> Objective) -> SelectionOp a -> SelectionOp a Source #

Transform objective function values before selection.

rankScale :: ProblemType -> Population a -> Population a Source #

Replace objective function values in the population with their ranks. For a population of size n, a genome with the best value of objective function has rank n' <= n, and a genome with the worst value of objective function gets rank 1.

rankScale may be useful to avoid domination by a few super-genomes in rouletteSelect, or to apply rouletteSelect when the objective function is not necessarily positive.
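
A possible composition of these functions (a usage sketch; Minimizing is assumed to be one of ProblemType's constructors):

    import Moo.GeneticAlgorithm.Continuous

    -- Roulette selection over ranks rather than raw objective values;
    -- usable even when the objective function may be negative.
    rankRoulette :: SelectionOp a
    rankRoulette = withPopulationTransform (rankScale Minimizing) (rouletteSelect 100)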

withFitnessSharing Source #

Arguments

:: (Phenotype a -> Phenotype a -> Double)

distance function

-> Double

niche radius

-> Double

niche compression exponent alpha (usually 1)

-> ProblemType

type of the optimization problem

-> SelectionOp a -> SelectionOp a 

A popular niching method proposed by D. Goldberg and J. Richardson in 1987. The shared fitness of an individual is inversely proportional to its niche count. The method expects the objective function to be non-negative.

An extension for minimization problems is implemented by making the fitness proportional to the niche count (rather than inversely proportional to it).

Reference: Chen, J. H., Goldberg, D. E., Ho, S. Y., & Sastry, K. (2002, July). Fitness inheritance in multiobjective optimization. In Proceedings of the Genetic and Evolutionary Computation Conference (pp. 319-326). Morgan Kaufmann Publishers Inc.
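
The sharing rule can be sketched as follows for a maximization problem (an illustration of the idea, not moo's implementation):

    -- Each objective value is divided by the genome's niche count, which
    -- sums a sharing kernel over all pairwise distances within the niche
    -- radius; sh 0 = 1, so the niche count is always at least 1.
    sharedFitness
      :: ([Double] -> [Double] -> Double)   -- distance function
      -> Double                             -- niche radius
      -> Double                             -- compression exponent alpha
      -> [([Double], Double)]               -- population: (genome, objective)
      -> [([Double], Double)]
    sharedFitness dist radius alpha pop =
      [ (g, f / nicheCount g) | (g, f) <- pop ]
      where
        sh d | d < radius = 1 - (d / radius) ** alpha
             | otherwise  = 0
        nicheCount g = sum [ sh (dist g g') | (g', _) <- pop ]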

distance1 :: Num a => [a] -> [a] -> a Source #

1-norm distance: sum |x_i - y_i|.

distance2 :: Floating a => [a] -> [a] -> a Source #

2-norm distance: (sum (x_i - y_i)^2)^(1/2).

distanceInf :: Real a => [a] -> [a] -> a Source #

Infinity norm distance: max |x_i - y_i|.
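
A quick worked example of the three norms (the values follow directly from the formulas above):

    -- x = [0, 0], y = [3, 4]
    -- distance1   x y  =  |3| + |4|      =  7
    -- distance2   x y  =  sqrt (9 + 16)  =  5
    -- distanceInf x y  =  max |3| |4|    =  4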

Sorting

bestFirst :: ProblemType -> Population a -> Population a Source #

Reorders a list of individual solutions, putting the best at the head of the list.

Crossover

Neighborhood-based operators

blendCrossover Source #

Arguments

:: Double

alpha, range expansion parameter

-> CrossoverOp Double 

Blend crossover (BLX-alpha) for continuous genetic algorithms. For each component let x and y be its values in the first and the second parent respectively. Choose corresponding component values of the children independently from the uniform distribution in the range (L,U), where L = min (x,y) - alpha * d, U = max (x,y) + alpha * d, and d = abs (x - y). alpha is usually 0.5. Takahashi in [10.1109/CEC.2001.934452] suggests 0.366.
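
The per-component rule can be sketched as follows (a standalone illustration, not moo's implementation):

    import Control.Monad (zipWithM)
    import System.Random (randomRIO)

    -- One child component is drawn uniformly from the expanded range
    -- around the two parent values.
    blxComponent :: Double -> Double -> Double -> IO Double
    blxComponent alpha x y = randomRIO (lo, hi)
      where
        d  = abs (x - y)
        lo = min x y - alpha * d
        hi = max x y + alpha * d

    -- One child genome: every component is drawn independently.
    blxChild :: Double -> [Double] -> [Double] -> IO [Double]
    blxChild alpha = zipWithM (blxComponent alpha)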

unimodalCrossover Source #

Arguments

:: Double

sigma_xi, the standard deviation of the mix between two principal parents

-> Double

sigma_eta, the standard deviation of the single orthogonal component

-> CrossoverOp Double 

Unimodal normal distributed crossover (UNDX) for continuous genetic algorithms. Recommended parameters according to [ISBN 978-3-540-43330-9] are sigma_xi = 0.5, sigma_eta = 0.35/sqrt(n), where n is the number of variables (dimensionality of the search space). UNDX mixes three parents, producing normally distributed children along the line between first two parents, and using the third parent to build a supplementary orthogonal correction component.

UNDX preserves the mean of the offspring, and also the covariance matrix of the offspring if sigma_xi^2 = 0.25. By preserving the distribution of the offspring, the UNDX can efficiently search along the valleys where parents are distributed in functions with strong epistasis among parameters (idem).
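
Putting the recommended parameters together for an n-dimensional search space (a usage sketch; unimodalCrossoverRP below packages the same defaults):

    import Moo.GeneticAlgorithm.Continuous

    undxFor :: Int -> CrossoverOp Double
    undxFor n = unimodalCrossover 0.5 (0.35 / sqrt (fromIntegral n))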

unimodalCrossoverRP :: CrossoverOp Double Source #

Run unimodalCrossover with default recommended parameters.

simulatedBinaryCrossover Source #

Arguments

:: Double

non-negative distribution parameter n, usually in the range from 2 to 5; for small values of n, children far away from the parents are more likely to be chosen.

-> CrossoverOp Double 

Simulated binary crossover (SBX) operator for continuous genetic algorithms. SBX preserves the average of the parents and has a spread factor distribution similar to the single-point crossover of binary genetic algorithms. If n > 0, then the highest probability density is assigned to the same distance between children as between their parents.

The performance of a real-coded genetic algorithm with SBX is similar to that of a binary GA with single-point crossover. For details, see Simulated Binary Crossover for Continuous Search Space (1995) by Agrawal et al.
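
The textbook per-component SBX rule looks as follows (a sketch of the technique, not necessarily moo's exact implementation):

    import Control.Monad (zipWithM)
    import System.Random (randomRIO)

    -- The spread factor beta is drawn so that its density peaks at
    -- beta = 1, i.e. children spaced like their parents.
    sbxComponent :: Double -> Double -> Double -> IO (Double, Double)
    sbxComponent n x y = do
      u <- randomRIO (0, 1)
      let beta | u <= 0.5  = (2 * u) ** (1 / (n + 1))
               | otherwise = (1 / (2 * (1 - u))) ** (1 / (n + 1))
          c1 = 0.5 * ((1 + beta) * x + (1 - beta) * y)
          c2 = 0.5 * ((1 - beta) * x + (1 + beta) * y)
      return (c1, c2)

    -- Two children from two parent genomes, applied component-wise.
    sbxChildren :: Double -> [Double] -> [Double] -> IO ([Double], [Double])
    sbxChildren n xs ys = fmap unzip (zipWithM (sbxComponent n) xs ys)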

Discrete operators

onePointCrossover :: Double -> CrossoverOp a Source #

Select a random point in two genomes, and swap their parts beyond this point. Apply with probability p.

twoPointCrossover :: Double -> CrossoverOp a Source #

Select two random points in two genomes, and swap everything in between. Apply with probability p.

uniformCrossover :: Double -> CrossoverOp a Source #

Swap individual bits of two genomes with probability p.
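
A standalone sketch of uniform crossover (not moo's implementation):

    import Control.Monad (zipWithM)
    import System.Random (randomRIO)

    -- Each position is swapped between the two parents independently
    -- with probability p.
    uniformCrossoverIO :: Double -> [a] -> [a] -> IO ([a], [a])
    uniformCrossoverIO p xs ys = fmap unzip (zipWithM swapMaybe xs ys)
      where
        swapMaybe x y = do
          roll <- randomRIO (0, 1 :: Double)
          return (if roll < p then (y, x) else (x, y))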

noCrossover :: CrossoverOp a Source #

Don't crossover.

Application

doCrossovers :: [Genome a] -> CrossoverOp a -> Rand [Genome a] Source #

Crossover all available parents. Parents are not repeated.

doNCrossovers Source #

Arguments

:: Int

n, the number of offspring to generate

-> [Genome a]

parents' genomes

-> CrossoverOp a

crossover operator

-> Rand [Genome a] 

Produce exactly n offspring by repeatedly running the crossover operator on randomly selected parents (the same parent may be selected more than once).
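
A usage sketch (assuming Genome and Rand are in scope from this module, as in the signatures above):

    import Moo.GeneticAlgorithm.Continuous

    -- Breed 100 offspring from the selected parents' genomes with BLX-0.5.
    breed :: [Genome Double] -> Rand [Genome Double]
    breed parents = doNCrossovers 100 parents (blendCrossover 0.5)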

Mutation

gaussianMutate Source #

Arguments

:: Double

probability p

-> Double

sigma

-> MutationOp Double

For every variable in the genome, with probability p replace its value v with v + sigma*N(0,1), where N(0,1) is a normally distributed random variable with mean 0 and variance 1. With probability (1 - p)^n, where n is the number of variables, the genome remains unaffected.
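
The described behaviour can be sketched outside of moo's Rand monad as follows (a standalone illustration, using a Box-Muller transform for the normal sample):

    import System.Random (randomRIO)

    -- A standard normal sample N(0,1) via the Box-Muller transform.
    standardNormalIO :: IO Double
    standardNormalIO = do
      u1 <- randomRIO (1e-12, 1)   -- avoid log 0
      u2 <- randomRIO (0, 1)
      return (sqrt (-2 * log u1) * cos (2 * pi * u2))

    -- Perturb each variable with probability p by sigma * N(0,1).
    gaussianMutateIO :: Double -> Double -> [Double] -> IO [Double]
    gaussianMutateIO p sigma = mapM mutate
      where
        mutate v = do
          roll <- randomRIO (0, 1)
          if roll < p
            then do
              z <- standardNormalIO
              return (v + sigma * z)
            else return v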

Control