CMA-ES implementation


Billou Bielour

Sep 3, 2013, 11:52:24 AM
to julia...@googlegroups.com
Hey, I did a Julia port of my favorite optimization algorithm. I've used it quite a bit on real problems and I find it pretty good overall.

It's a minimal implementation, lots of features are missing, but on the few tests I did it seems to work. Code and example here:

https://github.com/Staross/JuliaCMAES/
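For readers who want to see the shape of the algorithm being discussed, here is a minimal NumPy sketch of the (mu/mu_w, lambda)-CMA-ES loop, following Hansen's publicly available "pure CMA-ES" sketch. This is not the Julia code from the repository above; parameter settings are the standard defaults from that sketch.

```python
import numpy as np

def pure_cmaes(f, x0, sigma, max_evals=5000, seed=0):
    """Minimal (mu/mu_w, lambda)-CMA-ES, after Hansen's purecmaes sketch."""
    rng = np.random.default_rng(seed)
    n = len(x0)
    xmean = np.array(x0, dtype=float)

    # Strategy parameters: selection
    lam = 4 + int(3 * np.log(n))                   # population size
    mu = lam // 2                                   # number of parents
    weights = np.log(mu + 0.5) - np.log(np.arange(1, mu + 1))
    weights /= weights.sum()
    mueff = 1.0 / np.sum(weights ** 2)              # variance-effective selection mass

    # Strategy parameters: adaptation
    cc = (4 + mueff / n) / (n + 4 + 2 * mueff / n)
    cs = (mueff + 2) / (n + mueff + 5)
    c1 = 2 / ((n + 1.3) ** 2 + mueff)
    cmu = min(1 - c1, 2 * (mueff - 2 + 1 / mueff) / ((n + 2) ** 2 + mueff))
    damps = 1 + 2 * max(0, np.sqrt((mueff - 1) / (n + 1)) - 1) + cs

    # Dynamic state
    pc, ps = np.zeros(n), np.zeros(n)               # evolution paths
    C = np.eye(n)                                   # covariance matrix
    chiN = np.sqrt(n) * (1 - 1 / (4 * n) + 1 / (21 * n ** 2))

    evals = 0
    while evals < max_evals:
        # Eigendecomposition of C (every generation, for simplicity)
        D2, B = np.linalg.eigh(C)
        D = np.sqrt(np.maximum(D2, 1e-20))
        invsqrtC = B @ np.diag(1 / D) @ B.T

        # Sample and evaluate lambda offspring: x = xmean + sigma * B*D*z
        z = rng.standard_normal((lam, n))
        y = z @ np.diag(D) @ B.T                    # rows ~ N(0, C)
        xs = xmean + sigma * y
        fit = np.array([f(x) for x in xs]); evals += lam
        order = np.argsort(fit)

        # Recombination: weighted mean of the mu best
        ybest = y[order[:mu]]
        ymean = weights @ ybest
        xmean = xmean + sigma * ymean

        # Cumulation: update evolution paths
        ps = (1 - cs) * ps + np.sqrt(cs * (2 - cs) * mueff) * (invsqrtC @ ymean)
        hsig = (np.linalg.norm(ps)
                / np.sqrt(1 - (1 - cs) ** (2 * evals / lam)) / chiN
                < 1.4 + 2 / (n + 1))
        pc = (1 - cc) * pc + hsig * np.sqrt(cc * (2 - cc) * mueff) * ymean

        # Covariance matrix adaptation (rank-one plus rank-mu update)
        C = ((1 - c1 - cmu) * C
             + c1 * (np.outer(pc, pc) + (1 - hsig) * cc * (2 - cc) * C)
             + cmu * (ybest.T * weights) @ ybest)

        # Step-size control
        sigma *= np.exp((cs / damps) * (np.linalg.norm(ps) / chiN - 1))

        if fit[order[0]] < 1e-10:
            break
    return xs[order[0]], fit[order[0]]
```

For example, `pure_cmaes(lambda x: sum(xi**2 for xi in x), [1.0]*5, 0.5)` minimizes the 5-dimensional sphere function.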


John Myles White

Sep 3, 2013, 12:03:11 PM
to julia...@googlegroups.com
Very cool. If this has no licensing issues, would you be interested in pulling it into Optim.jl?

-- John

Billou Bielour

Sep 4, 2013, 4:42:40 AM
to julia...@googlegroups.com
Yeah, I guess it has its place next to simulated annealing. There's no mention of a license
on the official page, but the author already lists all kinds of implementations (R, Python, etc.),
so it should be fine.


One thing I should do is test it on different functions to make sure it works correctly;
there's a section about that on that page ("Testing your own source code").
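As a sketch of what such testing might look like, here are a few classic benchmark functions with known minima (these particular functions are the editor's illustration, not taken from the page mentioned above); checking that an optimizer recovers their known minima is a reasonable smoke test.

```python
import numpy as np

# Classic benchmark functions with known global minima; useful as a
# smoke test for any black-box optimizer.
def sphere(x):
    return float(np.sum(np.asarray(x, dtype=float) ** 2))   # min 0 at x = 0

def rosenbrock(x):
    x = np.asarray(x, dtype=float)                           # min 0 at x = (1, ..., 1)
    return float(np.sum(100 * (x[1:] - x[:-1] ** 2) ** 2 + (1 - x[:-1]) ** 2))

def rastrigin(x):
    x = np.asarray(x, dtype=float)                           # min 0 at x = 0, highly multimodal
    return float(10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))
```

An optimizer that reaches values near zero on all three, from random starts, passes the most basic sanity check; Rastrigin in particular probes whether it escapes local minima.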

Stefan Karpinski

Sep 4, 2013, 9:43:19 AM
to Julia Users
No explicit license is the same as all rights reserved, so that's not ok, regardless of the existence of other implementations. If this implementation is not based on other ones, then it is ok; or if it's based on another implementation that has an MIT or BSD license, then that might also be ok (although if that license was not legally applied in the first place, then it's not ok for us either). License stuff is annoying and nasty, but we have to be careful.

Billou Bielour

Sep 4, 2013, 10:20:00 AM
to julia...@googlegroups.com
Actually I missed it, but it's under the GNU General Public License.

Isaiah Norton

Sep 4, 2013, 10:29:28 AM
to julia...@googlegroups.com
Maybe this is less useful, but the basic/demo version (by the author himself) is public domain:

And the Apache Commons Java version claims to be Apache-licensed, though it is explicitly derived from the GPL Matlab code.

John Myles White

Sep 4, 2013, 11:07:13 AM
to julia...@googlegroups.com
If someone feels up to it, it would be great to ask the author why the Python version is public domain and the Matlab version is GPL.

 -- John

Nikolaus Hansen

Apr 29, 2014, 6:36:34 AM
to julia...@googlegroups.com
Good question; the Matlab version is now also released into the public domain. The code is in any case also given on Wikipedia.

-- Niko

Nikolaus Hansen

Apr 29, 2014, 7:09:27 AM
to julia...@googlegroups.com
On Wednesday, 4 September 2013 16:29:28 UTC+2, Isaiah wrote:

Maybe this is less useful, but the basic/demo version (by the author himself) is public domain:

And the Apache Commons Java version claims to be Apache-licensed, though it is explicitly derived from the GPL Matlab code.

The claim is correct; the Apache license has been granted by the author of the Matlab code.

Hans W Borchers

Apr 29, 2014, 1:17:50 PM
to julia...@googlegroups.com
In R there are implementations of both the minimal CMA-ES code (also called bareCMAES or pureCMAES, the Wikipedia version) and a partial adaptation of the Matlab code. We have done lots of tests with these implementations. It turned out that in too many cases the implementations are unstable and may stop with errors. This is also described in a survey article on global optimization to be published soon.

On the other hand, the full Matlab implementation is very useful and helped me to solve some difficult real-world applications (where other implementations stopped running). Therefore, I believe it would be necessary, and indeed worthwhile, to re-implement the procedure from the original articles _and_ to take a look at the available implementations at times.

As Arnold Neumaier once wrote:

    I found CMA-ES quite robust in dimensions up to 50.
    (It gets very slow though when the dimension is large.)

Hans W Borchers

May 1, 2014, 4:55:59 AM
to julia...@googlegroups.com
@Billou  Thanks for implementing this in Julia.

I checked your JuliaCMAES code with Julia 0.3.0-prerelease and got an error and some warnings. The error occurs when loading the function:

    ERROR: syntax: incomplete: unterminated multi-line comment #= ... =#

which happens on l.140. Actually, it was new to me, too, that Julia now has multi-line comments. That is nice, though I like the comment format in, e.g., Lua more, because placing or deleting a single character toggles the multi-line comment. In this case, adding a space between '#' and '=' helps.

When calling the function, we get the following warnings:

    WARNING: min(x) is deprecated, use minimum(x) instead.
     in max at deprecated.jl:26
     in cmaes at /home/hwb/cmaes_orig.jl:32

and in lines 40, 71, 92, and 179 as well. Correcting this, of course, is easy, though only the one in line 32 actually needs to be replaced.
(In my own version I assume that a user might want to supply a vector of sigmas, one per direction. This may require some more changes.)

    WARNING: x::Number - A::Array is deprecated, use x .- A instead.
     in - at deprecated.jl:26
     in cmaes at /home/hwb/cmaes_orig.jl:71

which can be corrected just as easily.

May I add that your CMA-ES implementation should allow for lower and upper bounds. Most often, in global optimization problems, and in stochastic optimization in general, one is looking for minima/optima in a certain region, while the objective function may take on lower values outside of it. -- This functionality is offered in the full Matlab version, too.

This can easily be done after line 127 (that's about the place where I introduced reducing to the boundaries in my own implementation).
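One simple way to realize the bound handling suggested here is to repair out-of-bounds samples by projecting them onto the box and penalizing the repair distance. The sketch below is the editor's illustration of that idea in Python, not the mechanism of the full Matlab version (which uses a more elaborate adaptive penalty scheme), and the `penalty` constant is a hypothetical choice.

```python
import numpy as np

def boxed_objective(f, lower, upper, penalty=1e6):
    """Wrap objective f so that samples outside [lower, upper] are
    repaired (clipped to the box) and penalized by the squared
    distance to the repaired point. The optimizer itself stays
    unchanged and only sees the wrapped objective."""
    def g(x):
        x = np.asarray(x, dtype=float)
        xr = np.clip(x, lower, upper)           # project onto the box
        return f(xr) + penalty * float(np.sum((x - xr) ** 2))
    return g
```

Inside the box the wrapper is transparent (`g(x) == f(x)`); outside, the quadratic penalty steers the search back toward the feasible region.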

Billou Bielour

May 1, 2014, 5:43:03 AM
to julia...@googlegroups.com
Thanks Hans, I fixed these issues. I don't have much time right now because I'm (trying to) finish my PhD, but maybe I'll implement
some additional features once I'm done.

Nikolaus Hansen

May 2, 2014, 5:22:07 PM
to julia...@googlegroups.com
On Tuesday, 29 April 2014 19:17:50 UTC+2, Hans W Borchers wrote:
[...]
As Arnold Neumaier once wrote:

    I found CMA-ES quite robust in dimensions up to 50.
    (It gets very slow though when the dimension is large.)

we can make this more specific: with n the search space dimension, the internal effort is O(n^2) per function evaluation (about 1e-8 * n^2 seconds for the C implementation on a 2 GHz processor), and the number of function evaluations typically scales between O(n) and O(n^2). I admit that many years ago I abandoned my R code, because it just seemed to be too slow.
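These figures can be turned into a back-of-envelope cost model. In the sketch below, the per-evaluation constant `k = 1e-8` is the figure quoted above, while the evaluation-count constant `c` is a hypothetical placeholder, not a number from the thread.

```python
def internal_seconds_per_eval(n, k=1e-8):
    """Internal CMA-ES effort per function evaluation, using the quoted
    figure of about 1e-8 * n^2 seconds (C implementation, 2 GHz CPU)."""
    return k * n ** 2

def internal_seconds_total(n, c=100):
    """Rough total internal cost, if the number of function evaluations
    scales between c*n and c*n^2 (c is a hypothetical constant)."""
    low = c * n * internal_seconds_per_eval(n)       # O(n) evaluations
    high = c * n ** 2 * internal_seconds_per_eval(n)  # O(n^2) evaluations
    return low, high
```

For n = 50, this gives about 2.5e-5 seconds of internal effort per evaluation, so the overhead only becomes noticeable when dimensions grow large or the objective is very cheap.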