Finding multiple local minima in Julia


Charles Martineau

Jul 25, 2014, 3:05:05 PM
to julia...@googlegroups.com
Dear Julia developers and users,

I am currently using the MultiStart algorithm in Matlab (http://www.mathworks.com/help/gads/multistart-class.html) to find multiple local minima of an MLE objective function.
I use MultiStart in a parallel setup as well.

Can I do something similar in Julia using parallel programming?

Thank you

Charles

Iain Dunning

Jul 25, 2014, 5:04:08 PM
to julia...@googlegroups.com
I'm not familiar with that particular package, but the Julia way to do it would be to use the Optim.jl package: create a random set of starting points and do a parallel map over that set of starting points. That should work quite well. Trickier (maybe) would be to give each processor a different random seed and generate the starting points on each processor.
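For example, here is a minimal sketch of that pmap approach. It is written against a current Julia and Optim.jl, so treat the details as assumptions (in particular, `using Distributed` was not needed in the Julia of this thread); `f` is a toy stand-in for the MLE objective, and the 4 workers and 20 starts are arbitrary:

    using Distributed
    addprocs(4)                        # one worker per core, say

    @everywhere using Optim

    # Toy objective (Rosenbrock); swap in the negative log-likelihood.
    @everywhere f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

    # Random starting points, generated on the master process.
    starts = [4.0 .* rand(2) .- 2.0 for _ in 1:20]

    # pmap farms each optimize call out to whichever worker is free.
    results = pmap(x0 -> optimize(f, x0, NelderMead()), starts)

    # Keep the best of the local solutions found.
    best = results[argmin([Optim.minimum(r) for r in results])]
    @show Optim.minimizer(best) Optim.minimum(best)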

Charles Martineau

Jul 25, 2014, 8:00:40 PM
to julia...@googlegroups.com
Thank you for your answer. So I would loop over, say, 20 random sets of starting points, and within the loop use the Optim package to minimize my MLE function from each starting set. Where online is the documentation that shows how to specify that the call
Optim.optimize(my function, etc.) should be parallelized? Sorry for my ignorance, I am new to Julia!

Iain Dunning

Jul 26, 2014, 12:22:07 PM
to julia...@googlegroups.com
The idea is to call the optimize function multiple times in parallel, not to call it once and let it do parallel multistart.

Check out the "parallel map and loops" section of the parallel programming chapter in the Julia manual; I think it'll be clearer there.
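For reference, the two patterns in that manual section look roughly like this in today's syntax; `f` and `starts` are toy placeholders, and in the Julia of this thread the loop macro was spelled @parallel rather than @distributed:

    using Distributed
    addprocs(2)
    @everywhere using Optim
    @everywhere f(x) = sum(abs2, x)    # toy placeholder objective
    starts = [randn(2) for _ in 1:8]

    # pmap: best when each call does substantial work, as optimize does.
    results = pmap(x0 -> optimize(f, x0), starts)

    # A parallel for-loop with a (+) reduction: better suited to many
    # small iterations.
    total = @distributed (+) for i in 1:10_000
        sin(i)^2
    end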

Michael Prentiss

Jul 26, 2014, 1:08:20 PM
to julia...@googlegroups.com
What you are doing makes sense. Starting from multiple starting points is important.

I am curious why you don't just run 20 different one-processor jobs instead of bothering with the parallelism?

Charles Martineau

Jul 27, 2014, 12:13:28 AM
to julia...@googlegroups.com
Yes, I could do that, but it is simpler (I think) to execute the code in parallel instead of submitting 20 separate jobs to the cluster.

Ken B

Jul 27, 2014, 2:26:31 AM
to julia...@googlegroups.com
Hi Charles,

You can have a look at the MinFinder algorithm for which I've just created a pull request to Optim.jl (talk about a coincidence!):
https://github.com/JuliaOpt/Optim.jl/pull/72

I'd like to add the possibility of running each optimization in parallel, but I have no experience with these things, although I have time to learn :). Would you like to collaborate on this?

Does anyone know of some parallel sample code to have a look at? Basically it amounts to sending each optimization problem to a separate worker and collecting the results, taking into account that some optimizations may take much longer than others.
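For what it's worth, pmap already does the dynamic scheduling described above: each worker pulls the next problem as soon as it finishes its current one, so a long optimization doesn't hold up the rest. A toy sketch (the sleep stands in for optimizations of very different lengths, and `solve` is made up for illustration):

    using Distributed
    addprocs(4)

    @everywhere function solve(i)
        sleep(rand() * 2)    # simulate wildly varying run times
        return (i, i^2)      # placeholder for an optimization result
    end

    # Problems are handed out one at a time, so the load stays balanced
    # even when some take much longer than others.
    results = pmap(solve, 1:20)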

Cheers,
Ken

Charles Martineau

Jul 27, 2014, 3:55:37 AM
to julia...@googlegroups.com
Hi Ken

Interesting code you have there; I will have to take a closer look at it. Yes, I would be happy to collaborate. But let me first try my problem out in Julia... I am new to Julia, and I am currently debating whether the code I want to run will be faster in Python using mpi4py or in Julia in parallel. I am definitely more familiar with Python. Keep in touch.

Charles

Hans W Borchers

Jul 27, 2014, 9:25:28 AM
to julia...@googlegroups.com
Ken:

(1) Thanks for pointing out this approach and for implementing it.
Unfortunately, I was not able to locate your code on GitHub. I would certainly like to try it out on some of my global optimization examples.

(2) Did you include (or do you plan to include) the improvements of MinFinder,
as discussed in "MinFinder 2.0: An improved version of MinFinder" by Tsoulos and Lagaris?

(3) This article also contains examples of functions with many local minima.
Most of these are standard test functions for global optimization procedures. Did you test your implementation on these examples?

I have implemented some of these functions for my own purposes.
I wonder whether it would be useful to have a dedicated Julia package that collects optimization test functions.

(4) Are you sure, or is it guaranteed, that MinFinder will reliably find all local minima?
This is a difficult problem; see, for example, the long discussion by Stan Wagon in Chapter 4 of "The SIAM 100-Digit Challenge" of all the preventive measures needed to guarantee finding all local minima -- and thus also the one global minimum.

John Myles White

Jul 27, 2014, 11:41:35 AM
to julia...@googlegroups.com
Re. (3), I’d like to move the optimization problems we use for testing out of Optim into a separate package. Having a nice test suite would be a big gain for JuliaOpt.

 — John

Tim Holy

Jul 27, 2014, 11:49:26 AM
to julia...@googlegroups.com
A package of test functions sounds worthwhile. There's also CUTEst.jl:
https://github.com/lpoo/CUTEst.jl

--Tim

John Myles White

Jul 27, 2014, 11:51:56 AM
to julia...@googlegroups.com
Is CUTEst.jl easier to get working these days? The issue I opened in March seems to still be open.

— John

Tim Holy

Jul 27, 2014, 11:55:58 AM
to julia...@googlegroups.com
I haven't tested. I should do so.

--Tim

Tim Holy

Jul 27, 2014, 12:31:41 PM
to julia...@googlegroups.com
Nope.

One could write a SIF parser from scratch, but it would take some time.

--Tim

Peter Simon

Jul 27, 2014, 1:12:50 PM
to julia...@googlegroups.com
There is a large number of optimization test functions (for derivative-free, black-box optimization) implemented in pure Julia in Robert Feldt's BlackBoxOptim.jl. From the source code comments in single_objective.jl, the functions are taken from the following sources:

*  CEC 2013 competition on large-scale optimization
*  "Test Suite for the Special Issue of Soft Computing on Scalability of Evolutionary Algorithms and other Metaheuristics for Large Scale Continuous Optimization"

--Peter

Ken B

Jul 28, 2014, 12:06:34 AM
to julia...@googlegroups.com
Hi Hans,

1) You're welcome :) The code is in the PR: https://github.com/JuliaOpt/Optim.jl/pull/73 and if all goes well it could be merged soon. Do try it out and share your experience.

2) MinFinder 2.0 introduces two new stopping rules and an extra validation rule for sample points. If there is interest, I could add these as options to the current code.

3) The package tests use the rosenbrock, camel, rastrigin and shekel(5,7,10) examples from the paper; they can be found in the `problems` folder (a sketch of rastrigin is given below). Some more are described at http://www2.compute.dtu.dk/~kajm/Test_ex_forms/test_ex.html. I would be happy to add the ones you have already implemented; let me know where the code is.

4) Good question. As far as I know, these optimization procedures only look at specific function values (and sometimes their derivatives) and then hop around, so it is impossible to be certain of finding all minima in finite time :) I guess a program could solve the derivative equations symbolically if analytical solutions exist. If not, then it would seem to me you can never be sure.
The MinFinder algorithm has a parameter EXHAUSTIVE (`p` in the paper) that controls how exhaustively the search space is explored. You can tune that parameter to your problem.
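As promised under 3), here is a sketch of the rastrigin test function. This is my own illustration of the standard formula, not the package's code:

    # Rastrigin: f(x) = 10n + sum(x_i^2 - 10 cos(2*pi*x_i)).
    # Global minimum 0 at the origin, with a local minimum near every
    # integer lattice point, which makes it a standard stress test for
    # multistart methods.
    rastrigin(x) = 10 * length(x) + sum(xi^2 - 10 * cos(2 * pi * xi) for xi in x)

    rastrigin(zeros(2))    # 0.0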

Cheers,
Ken

Ken B

Jul 30, 2014, 3:22:45 AM
to julia...@googlegroups.com
Short update: merging this PR into Optim.jl has been postponed. Meanwhile, it is available as a separate package: Pkg.clone("https://github.com/Ken-B/MinFinder.jl.git"). Feel free to try it out and, as usual, all feedback is welcome :)

Ken