Nonlinear optimization, unknown gradient


Baruch

Feb 13, 2014, 5:38:38 AM
to accor...@googlegroups.com
Hi,

I am new to accord so please forgive me if this is a trivial question.

I am trying to find the minimum of a 4-variable function (given a parameter range and an initial guess).

BrentSearch would be a good fit, but it only works with one parameter.

Any suggestions?

Thanks
Baruch

Anders Gustafsson Cureos AB

Feb 14, 2014, 5:43:51 AM
to accor...@googlegroups.com
There are a number of multivariable nonlinear optimizers in the Math.Optimization namespace, although they all seem to require that the gradient can be computed. These algorithms include, for example, the BFGS/L-BFGS and Conjugate Gradient (CG) methods.

If you are unable, or unwilling :-), to calculate the gradient, you could make use of one of the following non-Accord codes: ports of COBYLA2, LINCOA and BOBYQA. All of these algorithms were developed by Prof. Michael J.D. Powell and were originally written in Fortran 77. I have recently ported them to C#, and you can retrieve them if you follow the links below:

I hope this information is of some help.

Best regards,
Anders @ Cureos

César

Feb 14, 2014, 6:25:44 AM
to accor...@googlegroups.com
Hello all!

Anders, many thanks for helping answer this question! I would also like to add that there is a way to approximate unknown gradients in the framework, through the FiniteDifferences class. However, I still need to revise those algorithms, because sometimes they don't converge to a proper answer, especially when using the L-BFGS algorithm.
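For example, hooking FiniteDifferences up to the L-BFGS optimizer could look roughly like the sketch below (untested and written from memory; member names such as Compute, Function, Gradient, Minimize, Solution and Value may differ slightly between framework versions, and the objective function is just a placeholder):

// Rough sketch: approximate the gradient numerically with FiniteDifferences
// and hand it to a gradient-based optimizer such as L-BFGS.
using System;
using Accord.Math.Differentiation;
using Accord.Math.Optimization;

class GradientFreeLbfgsSketch
{
    static void Main()
    {
        // Example 4-variable objective (replace with your own function).
        Func<double[], double> f = x =>
            Math.Pow(x[0] - 1, 2) + Math.Pow(x[1] + 2, 2) +
            Math.Pow(x[2], 2) + Math.Pow(x[3] - 0.5, 2);

        // Numerical gradient via finite differences.
        var gradient = new FiniteDifferences(4, f);

        var lbfgs = new BroydenFletcherGoldfarbShanno(4)
        {
            Function = f,
            Gradient = x => gradient.Compute(x)
        };

        lbfgs.Minimize(new double[] { 0, 0, 0, 0 });  // initial guess

        double[] solution = lbfgs.Solution;  // estimated minimizer
        double minimum = lbfgs.Value;        // objective value at the minimizer
    }
}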

Now, Anders, I also found it very interesting that you have ported all of these algorithms to the .NET world. From what I am seeing, they all have compatible open source licenses. Do you think we could integrate them into the Accord.NET Framework sometime in the future? Of course, your contributions will all be properly credited!

Best regards,
Cesar

Anders Gustafsson Cureos AB

Feb 14, 2014, 7:33:16 AM
to accor...@googlegroups.com
Thanks for the additional information on FiniteDifferences, Cesar.

I would be more than happy to see the C# ports of COBYLA2, LINCOA and BOBYQA incorporated into Accord.NET! To facilitate inclusion, it is also fine with me if the Accord.NET variants of these algorithms are LGPL-licensed.

Unfortunately, I am a little short of time to do the integration myself right now. If you perform the integration, I would be more than happy to help out if you need assistance. Otherwise, I could attempt the integration and then open a GitHub pull request for it, but it will take some time before I can take on this task wholeheartedly.

Best regards,
Anders

César

Feb 14, 2014, 8:08:21 AM
to accor...@googlegroups.com
No worries! I will do it soon and add the proper copyright notices. Thanks!

Best regards,
Cesar

Markus Stöger

Feb 14, 2014, 3:52:28 PM
to accor...@googlegroups.com
Hi Cesar,

Could you give some details on "However, I still need to revise those algorithms because sometimes they don't converge to a proper answer"? How stable/usable are these algorithms right now?

I had some problems with BFGS (and CG as well) a few days ago; they seem to fail 2 out of 3 times, and I'm not sure if it's my fault or if there's something else wrong. See https://groups.google.com/forum/#!topic/accord-net/bokH2DytURQ

Max

Ryan

Feb 25, 2014, 1:24:49 PM
to accor...@googlegroups.com
Hello Anders,

I was wondering if you would be able to explain how COBYLA2 or the other algorithms might be used in curve fitting. I understand basic curve fitting and basic nonlinear minimization/maximization, but I'm having trouble connecting how they work together from a programming standpoint.

Anders Gustafsson Cureos AB

Feb 26, 2014, 2:47:17 AM
to accor...@googlegroups.com
Hello Ryan,

At least the way I see it, you would normally formulate an objective function representing the fitting procedure (for example, the least-squares difference between observed and calculated data) and let COBYLA2 or another optimizer minimize that objective function.
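As an illustration, the objective for a least-squares fit could be built along the lines of the sketch below (just an illustrative sketch; the model, the data and all names are made up):

// Rough sketch: the objective is the sum of squared differences between
// observed and calculated values, expressed as a function of the model
// parameters only. This delegate is what you would hand to COBYLA2 (or any
// other derivative-free minimizer) to find the best-fitting parameters.
using System;

class CurveFitSketch
{
    // Hypothetical model: y = p0 * exp(p1 * x), with parameters p = {p0, p1}.
    static double Model(double x, double[] p)
    {
        return p[0] * Math.Exp(p[1] * x);
    }

    static void Main()
    {
        double[] xData = { 0.0, 1.0, 2.0, 3.0 };   // observed x values
        double[] yData = { 1.0, 2.7, 7.4, 20.1 };  // observed y values

        // Sum of squared residuals as a function of the parameters p.
        Func<double[], double> objective = p =>
        {
            double sum = 0.0;
            for (int i = 0; i < xData.Length; i++)
            {
                double r = Model(xData[i], p) - yData[i];
                sum += r * r;
            }
            return sum;
        };

        // Pass 'objective' to the derivative-free optimizer of your choice.
    }
}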

Regards,
Anders

Ryan

Feb 28, 2014, 10:12:43 AM
to accor...@googlegroups.com
Hi Anders, thank you for the response. Just so I am clear: if I have a function, MyNonLinearFunction, and I have a bunch of (x, y) data points, my actual objective function is

(MyNonLinearFunction(x) - ActualValueOf_Y)^2, correct?

If so, that makes sense to me, as I was getting confused about how the squared error and my equation were related. If not, then I guess I'm still lost :)

Anders Gustafsson Cureos AB

Feb 28, 2014, 10:57:04 AM
to accor...@googlegroups.com
Hi again Ryan,

If you sum these squared errors over all data points into one objective function, then yes, that is what I meant.

Best regards,
Anders

Ryan

Feb 28, 2014, 11:01:05 AM
to accor...@googlegroups.com
Great, thanks!