Hypervolume calculation fails for 212 individuals with 14 objectives


Hemant Sharma

Nov 17, 2020, 12:11:12 PM
to deap-users
Hello deap users,

Thank you, developers and community, for this great package! It is helping me a lot in my research.

After two days I finally found out why my optimisation loop wasn't working. I am using NSGA-III for a 14-objective optimisation, with 4 parameters to optimise in each individual. The population size is 212 individuals.

The problem comes when I want to calculate the hypervolume of my population. There is no problem when the population size is less than 20, but the calculation becomes slower as the size increases, and when the size is 212 the kernel crashes. I am attaching the fitness values as a text file in case anyone is interested. You can already see at 'i+31' in the code below that the calculation becomes a bit slow; if you keep increasing that number, it becomes slower and slower and eventually crashes. Does anyone have an idea of a way around this problem?

Here is the code (sorry about the formatting; I tried pastebin and Notepad++ but neither kept the correct format):

import numpy as np

try:
    # try importing the C version
    from deap.tools._hypervolume import hv
except ImportError:
    # fall back on the pure Python version
    print("no C")
    from deap.tools._hypervolume import pyhv as hv

ttt = np.loadtxt('fitness_values.txt')

ref_hyp = np.array([19.685825, 962.5723419, 5.69326178, 0.043162976,
                    11.61159696, 0.01699155, 0.018175168, 223965.1889,
                    2.61386E-06, 0.016530917, 1.9237E-06, 2.82397E-06,
                    4472.93138, 4.81537E-06])

for i in range(len(ttt)):
    print('iteration number ', i)
    # hypervolume of a window of up to 31 points from the front
    value_hypervolume = hv.hypervolume(ttt[i:i+31], ref_hyp)
    print('value_hypervolume is ', value_hypervolume)
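Since exact hypervolume algorithms are known to scale very badly with the number of objectives, one possible way around the crash for 14 objectives is a Monte Carlo approximation instead of an exact computation. Below is a minimal sketch of that idea; it is not part of DEAP's API, the function name `hypervolume_mc` and the chunked sampling are my own illustration, and it assumes minimisation with the reference point as the upper bound (as in `hv.hypervolume`):

```python
import numpy as np

def hypervolume_mc(points, ref, n_samples=100_000, chunk=10_000, seed=0):
    """Monte Carlo estimate of the hypervolume (minimisation) dominated
    by `points` and bounded above by the reference point `ref`."""
    rng = np.random.default_rng(seed)
    points = np.asarray(points, dtype=float)
    ref = np.asarray(ref, dtype=float)
    lo = points.min(axis=0)              # lower corner of the sampling box
    box_vol = float(np.prod(ref - lo))   # volume of the box [lo, ref]
    hits = 0
    # sample in chunks so the (chunk, n_points, n_obj) mask stays small
    for start in range(0, n_samples, chunk):
        n = min(chunk, n_samples - start)
        s = rng.uniform(lo, ref, size=(n, points.shape[1]))
        # a sample is dominated if some front point is <= it in every objective
        hits += (points[None, :, :] <= s[:, None, :]).all(axis=2).any(axis=1).sum()
    return hits / n_samples * box_vol
```

The cost is linear in the number of points and objectives (per sample), at the price of sampling noise, which may be acceptable if the hypervolume is only used to monitor convergence.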


Have a good day !

Hemant

fitness_values.txt

Derek Tishler

Nov 25, 2020, 11:01:55 PM
to deap-users
Are you working from the NSGA-III example/explanation, perhaps with scaled reference points to address the higher dimensionality? Dealing with 14 objectives is an expensive task in NSGA-III selection:
https://deap.readthedocs.io/en/master/examples/nsga3.html#higher-dimensional-objective-space
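To give a sense of how quickly 14 objectives gets expensive in NSGA-III: the number of Das-Dennis reference points on one layer is C(p + M - 1, M - 1) for M objectives and p divisions, which is the count produced by `tools.uniform_reference_points`. A quick back-of-the-envelope check (the helper name `n_reference_points` is my own):

```python
from math import comb

def n_reference_points(n_obj, p):
    """Das-Dennis reference point count for one layer with `p`
    divisions per objective: C(p + n_obj - 1, n_obj - 1)."""
    return comb(p + n_obj - 1, n_obj - 1)

# the classic 3-objective NSGA-III setting with p=12 divisions
print(n_reference_points(3, 12))   # 91

# with 14 objectives, even tiny values of p explode
for p in (1, 2, 3, 4):
    print(p, n_reference_points(14, p))  # 14, 105, 560, 2380
```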

Is this an issue or expected behavior of the current hypervolume calculations? 

The C-based hypervolume calculation appears to be version 4, if I read correctly:
https://github.com/DEAP/deap/blob/master/deap/tools/_hypervolume/_hv.c

We can loosely compare its compute times to the original paper, but only up to nobj=8:
http://lopez-ibanez.eu/doc/FonPaqLop06-hypervolume.pdf

I updated your code above to plot and compare a few different n-objective optimizations against population size, with a cutoff time of 10 seconds, to get the trend:
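A sketch of the kind of timing harness described above (the function name, random fronts, and cutoff handling are my own reconstruction, not the exact code used for the plot):

```python
import time
import numpy as np

def time_hypervolume(hv_func, n_obj, pop_sizes, cutoff=10.0, seed=0):
    """Time hv_func(front, ref) on random minimisation fronts of
    increasing size; stop once a single call exceeds `cutoff` seconds."""
    rng = np.random.default_rng(seed)
    ref = np.ones(n_obj)  # points are sampled in [0, 1)^n_obj, so 1s dominate
    timings = {}
    for n in pop_sizes:
        front = rng.uniform(0.0, 1.0, size=(n, n_obj))
        t0 = time.perf_counter()
        hv_func(front, ref)
        timings[n] = time.perf_counter() - t0
        if timings[n] > cutoff:
            break  # larger fronts would only be slower
    return timings
```

Passing `hv.hypervolume` as `hv_func` and sweeping `n_obj` from, say, 2 up to 14 reproduces the trend measured for the summary below.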

This gives the following summary. Loosely comparing to the paper, I don't see any obviously odd behavior for 7 objectives when looking at 50 vs 100 points.
hv_summary.png
