
fmincon vs fminunc


Rogelio

Apr 1, 2011, 9:02:04 AM
Hi,

I have seen in some posts here that it is always advised to use a constrained optimizer instead of an unconstrained one.
Can someone explain this a bit more? For example, if I am trying to estimate the parameters of a model for a given variance equation (where at least one of the parameters must be positive), what are the potential problems of using an unconstrained optimizer, and what are the advantages of using fmincon?

Regards,
Rogelio

Matt J

Apr 1, 2011, 9:22:04 AM
"Rogelio " <rogelioa....@math.uio.no> wrote in message <in4icc$i4a$1...@fred.mathworks.com>...

> Hi,
>
> I have seen in some posts here that it is always advised to use a constrained optimizer instead of an unconstrained one.
==================

I'm not sure what you've read...

When a problem contains a constrained variable, there is sometimes no obvious alternative to a constrained optimizer. How in general do you expect an unconstrained optimizer to account for constraints?

In cases like yours where you have simple positivity constraints, it is possible to reformulate the problem by transforming the variables. For example, to solve the simple 1D problem

min f(x) = x^2
s.t. x>=0

You can make the change of variables x=y^2 and transform the problem to the unconstrained one

min. g(y) = y^4

It is not clear, however, whether this will be good or bad. On the one hand, constrained solvers require more computational effort per iteration, so transforming the problem to an unconstrained one can seem attractive. On the other hand, the transformed problem can have a Hessian that is singular at the solution, as in the example above (g''(y) = 12*y^2, which vanishes at the solution y = 0). Algorithms tend to converge a lot more slowly on problems with singular Hessians.

It's never clear which of these trade-offs is going to dominate the computation time...
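
For this toy problem, a minimal sketch of the two routes in MATLAB might look as follows (it assumes the Optimization Toolbox is available; the starting point x0 is arbitrary):

f = @(x) x.^2;                 % original objective, minimized over x >= 0
g = @(y) y.^4;                 % objective after the substitution x = y^2
x0 = 3;                        % arbitrary starting point

% Constrained formulation: the bound x >= 0 is passed directly as lb = 0
[xc,~,~,outc] = fmincon(f, x0, [], [], [], [], 0, []);

% Unconstrained reformulation: optimize over y, then recover x = y^2
[y,~,~,outu] = fminunc(g, x0);
xu = y^2;

fprintf('fmincon: x = %g after %d iterations\n', xc, outc.iterations);
fprintf('fminunc: x = %g after %d iterations\n', xu, outu.iterations);

Comparing the iteration counts (and timings) on your own problem is the most reliable way to see which trade-off dominates.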

Bruno Luong

Apr 1, 2011, 2:40:05 PM
"Matt J" wrote in message <in4jhs$8fl$1...@fred.mathworks.com>...

> "Rogelio " <rogelioa....@math.uio.no> wrote in message
>
> You can make the change of variables x=y^2 and transform the problem to the unconstrained one
>
> min. g(y) = y^4
>
> It is not clear, however, whether this will be good or bad.

To me, the transformation is a bad idea.

Bruno

Bruno Luong

Apr 1, 2011, 2:43:04 PM
"Rogelio " <rogelioa....@math.uio.no> wrote in message <in4icc$i4a$1...@fred.mathworks.com>...

> Hi,
>
> I have seen in some posts here that it is always advised to use a constrained optimizer instead of an unconstrained one.

I don't have this impression.

If the problem is unconstrained then just use fminunc.

Bruno

Rogelio

Apr 6, 2011, 2:00:21 AM
Thanks for your comments!
I once saw a discussion here about fminunc vs fmincon. Someone suggested always using fmincon even when the problem is not constrained. He/she argued that you always know "something" about the parameters to be estimated, which could serve as the constraint, and pointed out heuristically that fmincon behaves more stably.
Now I am modeling a variance function, where there are obvious constraints. I want to know whether it is true (and why) that a serious analysis uses fmincon rather than fminunc, even though fminunc gives me coherent parameters.
Thanks

"Bruno Luong" <b.l...@fogale.findmycountry> wrote in message <in56bo$q8c$1...@fred.mathworks.com>...

Matt J

Apr 6, 2011, 2:37:05 AM
"Rogelio " <rogelioa....@math.uio.no> wrote in message <ingvhl$n6b$1...@fred.mathworks.com>...

> Thanks for your comments!
> I once saw a discussion here about fminunc vs fmincon. Someone suggested always using fmincon even when the problem is not constrained. He/she argued that you always know "something" about the parameters to be estimated, which could serve as the constraint, and pointed out heuristically that fmincon behaves more stably.
> Now I am modeling a variance function, where there are obvious constraints. I want to know whether it is true (and why) that a serious analysis uses fmincon rather than fminunc, even though fminunc gives me coherent parameters.
=========================

Adding information via constraints can improve the conditioning of the problem.
As a simpler example, consider the 2x2 system of linear equations:

A=[1 1+1000*eps; 1 1];
x=[pi;exp(1)];
y=A*x;

The matrix A is poorly conditioned,

>> cond(A)

ans =

1.8017e+013


and therefore I get a large error when I solve the system for x (i.e., minimize norm(A*x - y)),

>> norm(A\y-x)

ans =

0.0032

Now, however, suppose I add a 3rd equation that explicitly informs the problem
that x(1)=pi in the true solution. This is similar to what happens when you add constraints,

>> A(3,:)=[1 0]; y(3)=pi;

The condition number is now much lower (i.e., better),


>> cond(A)

ans =

3.2255


And accordingly, I get much lower errors:


>> norm(A\y-x)

ans =

9.9301e-016
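
Coming back to the original variance question: with fmincon the positivity constraint goes in directly as a lower bound, while with fminunc you either trust the iterates to stay positive or reparameterize as discussed above. Below is a minimal sketch with a toy Gaussian likelihood (hypothetical data and model, not Rogelio's actual variance equation, which isn't shown in the thread):

rng(0);                                  % reproducible toy data
data = 2 + sqrt(3)*randn(200,1);         % hypothetical sample, mean 2, variance 3

% negative log-likelihood; p = [mu; sigma2], with sigma2 > 0 required
nll = @(p) 0.5*numel(data)*log(2*pi*p(2)) + sum((data - p(1)).^2)/(2*p(2));
p0  = [0; 1];                            % starting guess [mu; sigma2]

% fmincon: state the positivity constraint explicitly via a lower bound
lb = [-Inf; eps];
pc = fmincon(nll, p0, [], [], [], [], lb, []);

% fminunc: no way to state sigma2 > 0 directly; reparameterize sigma2 = exp(s)
pu = fminunc(@(q) nll([q(1); exp(q(2))]), [0; 0]);
pu(2) = exp(pu(2));                      % recover sigma2

disp([pc pu])                            % both columns should be close to [2; 3]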
