ANN: BlackBoxOptim.jl - blackbox optimization / meta-heuristics in Julia


Robert Feldt

Oct 29, 2013, 7:13:25 PM
to julia...@googlegroups.com
I found Julia last week and I'm just loving it. Flexible, fast and fun. Thanks to everyone contributing. I hope to be able to give back.

Since I have dabbled quite a lot with evolutionary algorithms etc. in my research, I decided to start with black-box optimization of a meta-heuristic ilk. I know there has been a lot of good work in Optim.jl and the JuMP-related packages, but I couldn't find any Differential Evolution (DE) or other non-gradient-based optimizers. Maybe they are already out there; if so, please enlighten me. I'm also sure there are many things I can improve in my design and Julia code, and only 5 different DE variants and a random searcher are implemented yet, but things will improve over time. So here goes:


or just install it from your Julia REPL.
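
For example, something like this should do it (assuming the package is registered; if not, cloning the repository into your package directory works too):

Pkg.add("BlackBoxOptim")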


and then try it with for example:

using BlackBoxOptim

# Classic 2D Rosenbrock function; global minimum of 0.0 at (1.0, 1.0).
function rosenbrock2d(x)
  return (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
end

# Search the box [-5.0, 5.0] in each of the 2 dimensions.
bboptimize(rosenbrock2d, (-5.0, 5.0); dimensions = 2)

Feedback appreciated!

Cheers,

Robert Feldt

Iain Dunning

Oct 29, 2013, 9:59:21 PM
to julia...@googlegroups.com
Very cool! We should talk about bringing this under the "JuliaOpt" (http://juliaopt.org) umbrella to increase visibility. It looks like you've got great testing in place already and the documentation looks solid. I'd be tempted to shorten the name to BlackBox, that'd be really snappy :D

I'm not very familiar with this area - can you handle arbitrary constraints, or are box constraints on the variables more common?

Adding syntax to provide multiple objectives with JuMP is definitely possible. We are working on integrating non-linear objective and constraint support, which might be useful too, but it may be overkill here since we were envisaging only differentiable functions (we would calculate the derivatives ourselves).

Thanks,
Iain

Robert Feldt

Oct 30, 2013, 5:53:36 AM
to julia...@googlegroups.com
Thanks Iain.

Yes, I'd be happy to get it in under the JuliaOpt umbrella and I'm open to name change suggestions. How do we go about the former? 

And is BlackBox descriptive enough? I like it (it sure is snappy!) but I'm afraid it does not say much. Is there some guidance on how to name Julia packages?

Yes, in principle arbitrary constraints can be handled if they can be mapped to objectives; we can then leverage the fact that (some of) these methods are good at multi-objective optimization.

Example: say that you want to optimize rosenbrock2d with the additional constraint that x[1] < x[2] + 1. If we transform the constraint into the function f(x) = x[1] - x[2] - 1, then we can do a multi-objective optimization on the same box with two objectives that should both be minimized:

f1(x) = rosenbrock2d(x)
f2(x) = x[1] - x[2] - 1

We can also transform in other ways if we want to weight them etc. Is this (this type of transformation) something that JuMP will/can support?
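
For instance, one way to fold the constraint into a single objective is a penalty term. Just a sketch; the weight w and the max(0, ...) clamping are illustrative choices of mine, not anything built into BlackBoxOptim:

using BlackBoxOptim

rosenbrock2d(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Positive only when the constraint x[1] < x[2] + 1 is violated.
constraint_violation(x) = max(0.0, x[1] - x[2] - 1.0)

# Penalty weight, chosen arbitrarily for illustration.
w = 1000.0
penalized(x) = rosenbrock2d(x) + w * constraint_violation(x)

bboptimize(penalized, (-5.0, 5.0); dimensions = 2)

The multi-objective route above avoids having to pick w, at the cost of getting a Pareto front rather than a single solution.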

Cheers,

Robert

Billou Bielour

Oct 30, 2013, 6:38:07 AM
to julia...@googlegroups.com
I made a minimal CMA-ES port here:


It's a mix of the original minimal MATLAB implementation (purecmaes.m) and the full one (cmaes.m).

Robert Feldt

Oct 30, 2013, 8:24:09 AM
to julia...@googlegroups.com
Interesting Billou, thanks for the link.

Which license is this under? I want to be sure it is as open as possible before I even look at it. I really do not want any non-MIT/BSD code in BlackBoxOptim, and I know many CMA-ES implementations have quite strict licenses. My overall plan is rather to implement things from scratch from the latest research papers and thus be free from future licensing problems.

If I remember correctly, Nikolaus Hansen has made one of the Python versions of CMA-ES available in the public domain. Do you know?

Thanks,

Robert Feldt

Billou Bielour

Oct 30, 2013, 9:00:47 AM
to julia...@googlegroups.com
Most implementations seem to be under GPL 2/3. The minimal Python implementation is in the public domain, yes:

Robert Feldt

Oct 30, 2013, 10:36:54 AM
to julia...@googlegroups.com
Thanks, I'll base it on that and the latest research papers.

Cheers,

Robert

Steven G. Johnson

Oct 31, 2013, 10:47:08 AM
to julia...@googlegroups.com
(There are also several derivative-free optimization algorithms in NLopt.)
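
For example, a minimal sketch of using one of the derivative-free algorithms through NLopt.jl might look like this (Nelder-Mead on the same Rosenbrock function; treat the exact calls as approximate):

using NLopt

opt = Opt(:LN_NELDERMEAD, 2)          # gradient-free local algorithm, 2 variables
lower_bounds!(opt, [-5.0, -5.0])
upper_bounds!(opt, [5.0, 5.0])
xtol_rel!(opt, 1e-6)

# NLopt objectives take (x, grad); grad is ignored by derivative-free algorithms.
min_objective!(opt, (x, grad) -> (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2)

(minf, minx, ret) = optimize(opt, [0.0, 0.0])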