Is there an example optimization with mixed variables (continuous and discrete variables)?


Bas van Dam

Apr 28, 2020, 8:47:15 AM
to Inspyred
Dear all,

I am new to the inspyred Python library and would like to know how I can set up an optimization with mixed variables, both continuous and discrete. I see that for problems with continuous variables a generator is used. For problems with discrete variables there is mention of a constructor in combination with an ant-colony-based search algorithm, but I cannot seem to find an example of a constructor. Furthermore, for a discrete optimization I see that the following variators can be used:

algorithm.variator = [inspyred.ec.variators.partially_matched_crossover, inspyred.ec.variators.random_reset_mutation]  #  for discrete variables

How does the optimization cope with multiple variators, as in the example above? How can you use multiple variators in a problem with both discrete and continuous variables, e.g. partially_matched_crossover and gaussian_mutation? Furthermore, how can you set multiple discrete bounders, i.e. a separate one for each variable? And how could you use a normal bounder for a continuous variable of the design vector in combination with a discrete bounder for another variable in the same design vector / optimization?


Could you please send me some examples? I would very much appreciate any help.


Best regards,


B. van Dam

Aaron Garrett

Apr 28, 2020, 6:27:06 PM
to Inspyred
Here's a silly example that maybe sheds some light for you:



import copy
import random
import inspyred


def my_bounder(candidate, args):
    # The candidate is really 3 different types of things:
    # * a discrete value from 21, 42, or 63
    # * a continuous value v where 0 <= v <= 100
    # * a discrete value between 1 and 10, inclusive
   
    db1 = inspyred.ec.DiscreteBounder([21, 42, 63])
    rb  = inspyred.ec.Bounder(0, 100)
    db2 = inspyred.ec.DiscreteBounder(list(range(1, 11)))
   
    # If there were multiple components that used the same bounder,
    # you could just call the relevant function again for that component.
    # You don't have to make a new bounding function for each component,
    # only for those that have different bounds.
   
    return [db1([candidate[0]], args)[0],
            rb([candidate[1]], args)[0],
            db2([candidate[2]], args)[0]]
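
# For example, since DiscreteBounder snaps a value to the closest allowed
# value and Bounder clips to the given range, my_bounder([30, 150, 0], {})
# should return [21, 100, 1].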
   

def my_generator(random, args):
    return [random.choice([21, 42, 63]),
            random.uniform(0, 100),
            random.randint(1, 10)]


def my_evaluator(candidates, args):
    # This is all just made up. It has no sense or meaning.
    # It's just an example.
    fitness = []
    for cs in candidates:
        multiplier = {21: 0.5, 42: 1.5, 63: 2}
        discount = [0.9, 0.8, 0.7, 0.65, 0.6, 0.55, 0.5, 0.45, 0.35, 0.25, 0.1]
        fit = multiplier[cs[0]] * cs[1] * discount[cs[2] - 1]
        fitness.append(fit)
    return fitness

def my_mutation(random, candidates, args):
    rate = args.setdefault('mutation_rate', 0.1)
    mutants = []
    for cs in candidates:
        mutant = copy.copy(cs)
        if random.random() < rate:
            dv1 = list(set([21, 42, 63]) - set([cs[0]]))
            mutant[0] = random.choice(dv1)
        if random.random() < rate:
            mutant[1] += random.gauss(0, 5)
        if random.random() < rate:
            dv2 = list(set([1, 2, 3, 4, 5, 6, 7, 8, 9, 10]) - set([cs[2]]))
            mutant[2] = random.choice(dv2)
        mutant = my_bounder(mutant, args)
        mutants.append(mutant)
    return mutants


prng = random.Random()
prng.seed(12345)
ec = inspyred.ec.EvolutionaryComputation(prng)
ec.terminator = inspyred.ec.terminators.evaluation_termination
ec.variator = [my_mutation]
ec.selector = inspyred.ec.selectors.tournament_selection
ec.replacer = inspyred.ec.replacers.generational_replacement
ec.observer = [inspyred.ec.observers.stats_observer]
final_pop = ec.evolve(generator=my_generator,
                      evaluator=my_evaluator,
                      pop_size=10,
                      maximize=True,
                      bounder=my_bounder,
                      num_selected=10,
                      num_elites=1,
                      max_evaluations=200,
                      mutation_rate=0.1)

final_pop.sort(reverse=True)
print(final_pop[0])
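
One note on your question about multiple variators: inspyred applies the variators in the list in order, feeding the offspring of one into the next, so you can chain a crossover with a custom mutation. Something like the line below should work with this example, since the built-in uniform_crossover just swaps components between parents and so doesn't care whether a component is discrete or continuous:

ec.variator = [inspyred.ec.variators.uniform_crossover, my_mutation]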





--
Aaron Garrett




Bas van Dam

May 6, 2020, 7:00:32 AM
to Inspyred
Thank you very much for the quick reply! I will have a look to see whether this works for me.

Bas van Dam

May 29, 2020, 3:35:38 AM
to Inspyred
Dear Aaron Garrett,

Thank you once more for this example. This works great. I have one more question, though. Can you also have crossover in such an optimization? I see that the standard variators that perform crossover can deal with real or discrete variables, but not with both. Is there a custom function that would, for example, split the design vector into one set with only real variables and another with only discrete variables, apply e.g. blend_crossover and partially_matched_crossover to these sets, and then combine the two sets again? Or any other custom crossover that can do something comparable? An example would help a lot.

Any help would be very much appreciated!

Best regards,

B. van Dam


Aaron Garrett

Jun 1, 2020, 12:29:28 PM
to Inspyred
Here's some untested code that should at least get you close:


def my_crossover(random, candidates, args):
    # Suppose the components were at the index values below.
    real_components = [1, 2, 7, 9]
    disc_components = [0, 3, 4, 5, 6, 8]
    moms = candidates[::2]
    dads = candidates[1::2]
    children = []
    for mom, dad in zip(moms, dads):
        mom_real = [mom[i] for i in real_components]
        mom_disc = [mom[i] for i in disc_components]
        dad_real = [dad[i] for i in real_components]
        dad_disc = [dad[i] for i in disc_components]
        bro_real, sis_real = inspyred.ec.variators.blend_crossover(random, [mom_real, dad_real], args)
        bro_disc, sis_disc = inspyred.ec.variators.partially_matched_crossover(random, [mom_disc, dad_disc], args)
        bro = copy.copy(dad)
        sis = copy.copy(mom)
        for i, bval, sval in zip(real_components, bro_real, sis_real):
            bro[i] = bval
            sis[i] = sval
        for i, bval, sval in zip(disc_components, bro_disc, sis_disc):
            bro[i] = bval
            sis[i] = sval
        bro = my_bounder(bro, args)
        sis = my_bounder(sis, args)
        children.append(bro)
        children.append(sis)
    return children
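
To use this together with the earlier mutation, I would expect something like the line below, assuming my_bounder here is a bounding function written for this ten-component candidate (analogous to the three-component one above). One caveat: partially_matched_crossover assumes the discrete components form a permutation, so if your discrete components are arbitrary values, a crossover such as uniform_crossover on that slice may be a better fit.

ec.variator = [my_crossover, my_mutation]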



--
Aaron Garrett


