Customized GA

Igor Markiewicz

Oct 15, 2017, 7:07:44 AM
to deap-users
Hi!
I have two systems: the first does prediction and the second is DEAP (its task is to optimize). I would like to ask whether it is possible to do something like this:
1) I feed a batch of data vectors into the first system and get its expected accuracy (e.g. 20%).
2) For the genetic algorithm in DEAP, the individuals are the parameters of the first algorithm to optimize (e.g. [1.6, 3.8, 4.4]), and the fitness function (which we want to maximize) is the predicted accuracy (the 20% mentioned above).
3) Then I run the GA, take the parameter vector it proposes (here of size 3), plug it back into the predictive algorithm, and see how it affects the accuracy.
4) The whole process is repeated for a while.

In particular, how should the selection of these three best parameters be done?

François-Michel De Rainville

Oct 16, 2017, 9:29:25 AM
to deap-users
Just use the accuracy as fitness and let DEAP do selection, e.g. tournament or roulette or best...
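Something along these lines could work as a minimal sketch. Everything here is an assumption about your setup: predict_accuracy() is only a placeholder for a call into your first system, and the [0, 5] parameter range is made up.

import random
from deap import algorithms, base, creator, tools

# Placeholder for the external prediction system: it should return the
# accuracy obtained with the given parameter vector.
def predict_accuracy(params):
    return random.random()

creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

toolbox = base.Toolbox()
toolbox.register("attr_float", random.uniform, 0.0, 5.0)   # assumed parameter range
toolbox.register("individual", tools.initRepeat, creator.Individual,
                 toolbox.attr_float, n=3)                   # 3 parameters per individual
toolbox.register("population", tools.initRepeat, list, toolbox.individual)

def evaluate(individual):
    # The fitness is simply the accuracy reported by the first system.
    return (predict_accuracy(individual),)

toolbox.register("evaluate", evaluate)
toolbox.register("mate", tools.cxBlend, alpha=0.5)
toolbox.register("mutate", tools.mutGaussian, mu=0.0, sigma=0.5, indpb=0.2)
toolbox.register("select", tools.selTournament, tournsize=3)

pop = toolbox.population(n=50)
algorithms.eaSimple(pop, toolbox, cxpb=0.5, mutpb=0.2, ngen=40, verbose=False)
best = tools.selBest(pop, k=1)[0]
print(best, best.fitness.values)

The "select" entry (tournament selection here) is what decides which parameter vectors make it into the next generation, so there is nothing special to implement for picking the best parameters yourself.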

François-Michel De Rainville

Oct 16, 2017, 9:31:25 AM
to deap-users
You can also look at http://github.com/aiworx-labs/Chocolate for easy hyperparameter optimisation. However, it is much more restrictive on the control you have over the optimization algorithm itself. On the other hand, it does a lot more things for you.

Igor Markiewicz

Oct 16, 2017, 12:07:57 PM
to deap-users

Thank you for your answer. I really appreciate it and will definitely read the link.
Getting to the bottom of it, I think my problem is simply to find the maximum of a multi-parameter function. I imagine it like this:
1) Suppose I have a function of three variables, f(a, b, c), which I want to maximize.
2) I draw a real number from some interval [min, max] and convert it to binary form.
3) The drawn number is treated as a concatenation of the binary representations of a, b and c.
4) I run the GA and evaluate each solution (the chromosome has to be split back into a, b and c for evaluation).
5) Repeat the process until the results are satisfactory.

So now my question is: how does DEAP turn a real-number representation into binary and vice versa? Does this relate to min and max, or to IEEE 754 (or something like that)? Does DEAP have automatic conversion functions that I could use?

When you talked about selection, I assumed you meant the situation above. Does making a separate chromosome for each variable make sense?

François-Michel De Rainville

Oct 16, 2017, 12:33:23 PM
to deap-users
I think you are referring to binary-representation genetic algorithms. DEAP can work directly on real-valued representations. It has been shown that the closer the genotype representation is to the phenotype, the better the chances of success. Binary representation is now rarely used to optimize floats or integers; it is most appropriate for feature selection and other binary tasks. Examples of real-valued crossover and mutation operators are cxSimulatedBinary() and mutGaussian(), respectively.

Selection is the principle of selecting the individuals that will make up the next generation's population (the parents).

Yes, you should have one chromosome per variable. You can follow examples/es/cma_minfct.py directly to use CMA-ES on real-valued genotypes, or the example in the overview section of the documentation, which minimizes the sum of all attributes of an individual made of floats.
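As a rough sketch of the CMA-ES route, condensed from that example (the sphere benchmark is only a stand-in for your own evaluation function, and the starting point, step size and population size are arbitrary):

from deap import algorithms, base, benchmarks, cma, creator, tools

creator.create("FitnessMin", base.Fitness, weights=(-1.0,))
creator.create("Individual", list, fitness=creator.FitnessMin)

toolbox = base.Toolbox()
# benchmarks.sphere is a stand-in objective; plug in your own evaluation here.
toolbox.register("evaluate", benchmarks.sphere)

N = 3  # three real-valued parameters
strategy = cma.Strategy(centroid=[0.0] * N, sigma=1.0, lambda_=20)
toolbox.register("generate", strategy.generate, creator.Individual)
toolbox.register("update", strategy.update)

hof = tools.HallOfFame(1)
algorithms.eaGenerateUpdate(toolbox, ngen=50, halloffame=hof, verbose=False)
print(hof[0], hof[0].fitness.values)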

Regards, 

Igor Markiewicz

Oct 18, 2017, 7:38:45 AM
to deap-users
Thank you very much for your help. I have read the implementations of these algorithms and understand what you mean. I have one last question, a little unrelated to this topic. Suppose I have several intervals from which I want to draw values:
[1, 2]
[4, 7.2]
and so on

I want to create an individual consisting of several numbers from each interval, in the following order:
num_1 from [1, 2]
num_2 from [1, 2]
num_3 from [1, 2]
num_4 from [4, 7.2]
num_5 from [4, 7.2]
num_6 from [4, 7.2]

How can this be achieved using DEAP?

François-Michel De Rainville

Oct 18, 2017, 8:40:30 AM
to deap-users
You have to create your own initialization function. I think if you dig a little on the mailing list you will find something suitable. cxSimulatedBinary and mutPolynomial have bounded versions (cxSimulatedBinaryBounded and mutPolynomialBounded) where you can provide a list of bounds for each attribute of the genotype.
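For instance, a minimal sketch of such an initialization could look like this (the bounds are the ones from your message; the eta and indpb values are arbitrary):

import random
from deap import base, creator, tools

# Per-gene bounds: three genes in [1, 2] followed by three in [4, 7.2].
LOW = [1.0, 1.0, 1.0, 4.0, 4.0, 4.0]
UP  = [2.0, 2.0, 2.0, 7.2, 7.2, 7.2]

creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

def init_individual(icls, low, up):
    # Draw each attribute uniformly from its own interval.
    return icls(random.uniform(l, u) for l, u in zip(low, up))

toolbox = base.Toolbox()
toolbox.register("individual", init_individual, creator.Individual, LOW, UP)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)

# The bounded variation operators accept per-attribute bounds as sequences,
# so offspring stay inside the same intervals.
toolbox.register("mate", tools.cxSimulatedBinaryBounded, eta=20.0, low=LOW, up=UP)
toolbox.register("mutate", tools.mutPolynomialBounded, eta=20.0, low=LOW, up=UP, indpb=0.2)

pop = toolbox.population(n=10)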
