Hi Hans,
1) You're welcome :) The code is in the PR:
https://github.com/JuliaOpt/Optim.jl/pull/73 and if all goes well it could be merged soon. Do try it out and share your experience.
2) MinFinder 2.0 introduces two new stopping rules and an extra validation rule for sample points. If there is interest, I could add these as options to the current code.
3) The package tests use the Rosenbrock, camel, Rastrigin and Shekel(5,7,10) examples from the paper; they can be found in the `problems` folder. Some more are described at
http://www2.compute.dtu.dk/~kajm/Test_ex_forms/test_ex.html. I would be happy to add the ones you have already implemented; let me know where the code is.
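For reference, here is a minimal Julia sketch of three of those test problems in their standard textbook forms; the exact definitions, scalings and names in the `problems` folder may differ slightly:

```julia
# Standard textbook forms of three classic test problems
# (the `problems` folder in the PR may use slightly different versions).

# Rosenbrock (2D): global minimum f = 0 at (1, 1)
rosenbrock(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

# Rastrigin (n-D): highly multimodal, global minimum f = 0 at the origin
rastrigin(x) = 10 * length(x) + sum(xi^2 - 10 * cos(2π * xi) for xi in x)

# Six-hump camel (2D): two global minima with f ≈ -1.0316,
# at roughly (±0.0898, ∓0.7126)
camel(x) = (4 - 2.1 * x[1]^2 + x[1]^4 / 3) * x[1]^2 + x[1] * x[2] +
           (-4 + 4 * x[2]^2) * x[2]^2
```

Functions like Rastrigin and the camel are the interesting ones for a multistart method, since they have several local minima to collect.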
4) Good question. As far as I know, these optimization procedures only look at function values at specific points (and sometimes their derivatives) and then hop around, so it is impossible to guarantee finding all minima in finite time :) I suppose a program could solve the first-order conditions (setting the derivatives to zero) symbolically, if analytical solutions exist. If not, however, it seems to me you can never be sure.
The MinFinder algorithm has a parameter EXHAUSTIVE (`p` in the paper) that controls how exhaustively the search space is explored. You can tune that parameter to your problem.
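To illustrate the trade-off, here is a toy random-multistart sketch in Julia. To be clear, this is not the MinFinder algorithm from the PR and not its API: it only shows how a single exhaustiveness-style parameter (also called `p` here, purely by analogy) can scale the number of start points, and it uses a crude finite-difference gradient descent instead of a proper local solver:

```julia
using Random

# Central-difference numerical gradient (crude stand-in for real derivatives)
function numgrad(f, x; h = 1e-6)
    g = similar(x)
    for i in eachindex(x)
        e = zeros(length(x)); e[i] = h
        g[i] = (f(x .+ e) - f(x .- e)) / (2h)
    end
    return g
end

# Plain gradient descent with Armijo backtracking (toy local solver)
function localmin(f, x0; iters = 500)
    x = copy(x0)
    for _ in 1:iters
        g = numgrad(f, x)
        t = 1.0
        while f(x .- t .* g) > f(x) - 0.5t * sum(abs2, g) && t > 1e-12
            t /= 2
        end
        x = x .- t .* g
    end
    return x
end

# Random multistart over the box [lo, hi]^n. A larger p means more start
# points, i.e. a more exhaustive (and more expensive) search.
function multistart(f, lo, hi, n; p = 0.5, base = 20)
    N = round(Int, base * (1 + p))
    found = Vector{Vector{Float64}}()
    for _ in 1:N
        x0 = lo .+ rand(n) .* (hi - lo)
        x = localmin(f, x0)
        # keep only minima not already found (distance tolerance 1e-2)
        if all(m -> sum(abs2, m .- x) > 1e-4, found)
            push!(found, x)
        end
    end
    return found
end

# Double-well example: two minima, at x = -1 and x = +1
Random.seed!(1)
doublewell(x) = (x[1]^2 - 1)^2
minima = sort([m[1] for m in multistart(doublewell, -2.0, 2.0, 1)])
```

Even in this toy version you can see the trade-off: raising `p` makes it more likely that every basin gets a start point, at the cost of more local searches. MinFinder's `p` plays a similar role, but with the sampling, validation and stopping rules from the paper.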
Cheers,
Ken