Initializing a genetic algorithm with mixed seeding


Rachel

Jan 16, 2021, 3:06:40 PM
to deap-users
Hi,

How could I create the first generation by mixing randomly generated individuals with pre-defined ones?
I have a set of good individuals and I would like to use them together with random individuals when initializing the GA, and check how that contributes (or not) to its convergence.

Here is what I tried on the OneMax problem. When I did that, ind_1 was added every time an individual was created. How can I add these good individuals only once, in the first generation?

def initFunc(icls):
    ind = [random.randint(0, 1) for _ in range(100)]
    ind_1 = [1] * 100
    first_pop = [ind, ind_1]
    print(first_pop)
    return icls(first_pop)


creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

Thanks! 

Derek Tishler

Jan 16, 2021, 4:19:19 PM
to deap-users
In terms of effects (always test on your own problem, of course): if you want to read about the topic, there are a few papers and posts on 'population initialization methods'. In this case you appear to be aiming for a hybrid of heuristic (known) and random initialization.
https://www.researchgate.net/post/How_beneficial_is_it_to_initialize_a_genetic_algorithm_problem_with_a_know_best
https://medium.com/datadriveninvestor/population-initialization-in-genetic-algorithms-ddb037da6773

One method is to save a DEAP population object (see checkpointing in the docs) and then load and append that population list to a new random population. If you only have raw values rather than Individual objects, see the example below on how to separate the population-creation process and combine the lists (you may also want to shuffle them once done).
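For the checkpoint route, here is a minimal sketch of the save/load half. It assumes a simple pickle file; the file name and population size are just examples, and the creator.create(...) calls must already have run before loading, since the pickled individuals reference those classes.

import pickle
import random

# End of a previous run: save the evolved population to disk
with open("checkpoint_pop.pkl", "wb") as f:
    pickle.dump(population, f)

# New run: load the saved individuals (creator classes must exist first)
with open("checkpoint_pop.pkl", "rb") as f:
    seeded_pop = pickle.load(f)

pop = toolbox.population(n=200)   # the random part of the first generation
pop += seeded_pop                 # append the known good individuals
random.shuffle(pop)               # optional: mix them together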

In DEAP, the population is usually just a list. If you have an existing population and are also using the toolbox to create N random individuals as a population, you can combine these lists before starting the evolution.

Here is a runnable example of the code below, based on the onemax_short demo:
https://gist.github.com/DMTSource/8005e8e345ed03438e3fca655ff57792

    # How we normally make a pop
    pop = toolbox.population(n=200)
    print(len(pop)) #200

    # You can combine different init methods and append them like so, since populations are just lists
    # (a second toolbox.pop2 method could also be registered)
    pop += toolbox.population(n=100) 
    print(len(pop)) #300

    # What if we have a known individual, but not yet an Individual object?
    known_but_not_an_ind = [
        [1,2,3,4,5],
        [2,3,4,5,6],
        [3,4,5,6,7],
    ]
    # Can manually add items too
    pop += [creator.Individual(new_ind) for new_ind in known_but_not_an_ind]
    print(len(pop)) #303


Rachel

Jan 24, 2021, 2:06:23 PM
to deap-users
Many thanks for your help, Derek.
I am using an individual that is a list of lists. Based on your example, I created a OneMax version with heuristic initialization for individuals that are lists of lists. If someone is interested, the code is available here: https://gist.github.com/rcampioni1/ecdc2db2b539d2731299418c058d5925

When I start the population with good individuals, the GA tends to return as the solution the best individual from the first generation (one of the known individuals I added). I was expecting the GA to reach even better results after heuristic initialization... Does anyone have a suggestion on how to balance exploration and exploitation when using heuristic initialization?
Should I use good individuals, but not ones very close to the best?

Also, would it be possible to implement GA operators that evolve through the generations? 

Derek Tishler

Jan 24, 2021, 3:06:48 PM
to deap-users
I'm glad it was useful!

"When I start the population with good individuals, it tends to return as the solution the best individual from the first generation"
That is more or less the expected outcome, as mentioned in the papers/blogs: random individuals are usually pretty terrible (in fitness), so a heuristic approach is very prone to premature convergence due to the disparity in fitness, assuming the high-fitness individuals are not randomly lost in selection. I also noticed you add 5X copies of the known individual in the gist example code. This helps such individuals avoid being randomly lost to selection, but it also speeds up convergence, since with each generation the number of copies can grow until they saturate the population (they win every tournament they enter).

I see you are using a low selection pressure (tournament size of 3) in the gist code. That is already pretty low, and lowering selection pressure was going to be my first suggestion for this situation. A larger random population can be a band-aid (but see the note on eaMuPlusLambda below), or perhaps you can inject the good individuals at the generation where the population's mean fitness is on par with the known solutions. This may help with the disparity issue in selection.

Also, using eaMuPlusLambda + varOr may help with more exploration of the solution space (based on the gist code, you may already be doing this in practice).
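For reference, a minimal sketch of swapping in eaMuPlusLambda; the mu/lambda_/probability values below are only placeholders, and the usual OneMax-style toolbox registrations are assumed.

from deap import algorithms, tools

pop = toolbox.population(n=300)
hof = tools.HallOfFame(1)

# mu parents are kept each generation and lambda_ offspring are produced
# through varOr, where each child comes from crossover OR mutation OR
# plain reproduction; cxpb + mutpb must sum to <= 1.0 for varOr
pop, logbook = algorithms.eaMuPlusLambda(
    pop, toolbox,
    mu=300, lambda_=600,
    cxpb=0.5, mutpb=0.3,
    ngen=50,
    halloffame=hof,
    verbose=True,
)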


"Also, would it be possible to implement GA operators that evolve through the generations?"
One way to do this would be to work with the algorithm locally (for example, copy eaSimple and varAnd into your script). You could then pass the current generation integer along to the mate and mutate operators, and create custom mate and mutate operators that take their normal inputs plus the gen integer. This can then be used to guide the behaviour of these operations.
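As a rough illustration of the idea (the operator name, the linear schedule and NGEN below are hypothetical, not part of DEAP):

from deap import tools

def mutFlipBitScheduled(individual, gen, ngen, indpb_start=0.10, indpb_end=0.01):
    """Hypothetical generation-aware mutation: the per-gene flip
    probability decays linearly from indpb_start to indpb_end."""
    frac = gen / float(ngen)
    indpb = indpb_start + (indpb_end - indpb_start) * frac
    return tools.mutFlipBit(individual, indpb=indpb)

# Register it with the total number of generations fixed:
#     toolbox.register("mutate", mutFlipBitScheduled, ngen=NGEN)
#
# Then, in your local copy of varAnd (called from your local eaSimple),
# forward the current generation when mutating:
#     offspring[i], = toolbox.mutate(offspring[i], gen=gen)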

Rachel

Feb 21, 2021, 9:33:22 PM
to deap-users
Hi Derek,

I was using good individuals obtained from another optimization method, so their fitness was much lower (close to the global optimum) than the fitness of the randomly generated individuals. I think this big difference led to rapid convergence, with no solutions having fitness values better than those of the injected individuals.

I also tried injecting individuals that are not as good but that respect most of the hard constraints. The evolution behaviour is better, but the convergence values obtained are still higher than desired (when compared with the other optimization method).

"perhaps you can inject these good individuals into the population at the generation where the pop fitness mean is on par with the known solutions? This may help with the disparity issue in selection."  How could I implement it? 

I worked on the implementation of adaptive genetic operators. Here is a gist with part of my code (only varAnd and varOr): https://gist.github.com/rcampioni1/0645fb44537885197aef86797dae3f34. I am still getting solutions with bad fitness values. I would like to access the mutation and crossover probabilities that the algorithm assumes at each iteration, to check whether they are really changing according to the fitness values. Would it be possible to access these values?

Derek Tishler

Feb 24, 2021, 8:44:56 AM
to deap-users
"How could I implement it?"
- One way is to work with the algorithm locally. In the main for loop over generations, you can check for a generation where the evolution's mean fitness is on par with your 'target injection individuals', or simply add them at a specific generation number. You can append them to the population at this spot, after working with the population but before recording the statistics:
https://github.com/DEAP/deap/blob/master/deap/algorithms.py#L182
doOnce = True  # set this once, before the generational loop

# inside the loop (around the linked spot), assuming numpy is imported as np
# and N, injection_target_mean and injection_pop are defined by you
if doOnce and (gen > N or np.mean([ind.fitness.values[0] for ind in population]) <= injection_target_mean):
    doOnce = False
    population += injection_pop

"Would it be possible to access these values?" 
- One method, like above, would be to copy varAnd/varOr and the operators locally, and have your mutation/crossover functions and varAnd/varOr return offspring, proba_mut, proba_cross instead of just the offspring. Then you can log these values in the record, or use them in any way you prefer:
https://github.com/DEAP/deap/blob/master/deap/algorithms.py#L168
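A minimal sketch of that approach; the adaptation rule and the name varAndAdaptive are placeholders (your gist's rule would go there), the point is only the extra returned values.

import random

def varAndAdaptive(population, toolbox, cxpb, mutpb):
    """Local copy of algorithms.varAnd that also returns the
    probabilities it actually used, so they can be logged per generation."""
    # Placeholder adaptation rule: replace with the rule from your gist
    fits = [ind.fitness.values[0] for ind in population if ind.fitness.valid]
    scale = 1.0 if len(fits) < 2 else min(1.0, (max(fits) - min(fits)) / (abs(max(fits)) + 1e-12))
    cxpb_used, mutpb_used = cxpb * scale, mutpb * scale

    offspring = [toolbox.clone(ind) for ind in population]
    for i in range(1, len(offspring), 2):
        if random.random() < cxpb_used:
            offspring[i - 1], offspring[i] = toolbox.mate(offspring[i - 1], offspring[i])
            del offspring[i - 1].fitness.values, offspring[i].fitness.values
    for i in range(len(offspring)):
        if random.random() < mutpb_used:
            offspring[i], = toolbox.mutate(offspring[i])
            del offspring[i].fitness.values
    return offspring, cxpb_used, mutpb_used

In the local eaSimple copy you can then call offspring, cxpb_used, mutpb_used = varAndAdaptive(offspring, toolbox, cxpb, mutpb) and pass the two probabilities as extra keyword columns to logbook.record alongside the usual stats.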

Rachel

Mar 10, 2021, 3:32:33 PM
to deap-users
Thanks, Derek. It worked as needed!