
Errors and issues with NLINFIT


Bendik Mjaaland
May 13, 2009, 10:24:01 AM

Hi!

I have been working on a prediction task for some time, and now I have decided to use nonlinear regression. I have little experience with MATLAB, so bear with me.

So, I have a set of data points YDATA, about 60 values.
I define the x-axis as
X1 = 1:60;

I have two models:
y = a + b*log(1 + c/x)
y = a + b*exp(c/x)
implemented in the following functions
function [F] = funlog(a,data)
F=a(1)+a(2)*log(1 + a(3)./data);
function [F] = funexp(a,data)
F=a(1) + a(2)*exp(a(3)./data);

The initial values are beta0 = [10 5 5];
To generate the beta values:
[beta,r,J,COVB,mse] = nlinfit(X1,YDATA,@funexp,beta0);
and the same for @funlog.

Issue 1:
Both functions give me this warning:
Warning: The Jacobian at the solution is ill-conditioned, and some
model parameters may not be estimated well (they are not identifiable).
- why does this happen?
- how can I make sure my results are valid?

Issue 2:
The exp-function also gives me this warning:
Warning: Iteration limit exceeded. Returning results from final iteration.
> In nlinfit at 193
- Is this something I can modify in the options? How? Unlike LSQCURVEFIT, there is no options parameter.

Issue 3:
The log-function returns complex values. Not being a statistician, I hardly understand what these complex results really mean. I understand the cause - the logarithm of something negative produces an imaginary number - but as far as I know I do not need anything this sophisticated for an answer. I am interested in the plots, and plotting the log results was really weird; the graph was not continuous.
- Can I merely ignore the complex part, so that if a = 10.424 + 1.2325i, I can use a = 10.424?
- Can I tell MATLAB to return only real numbers?

And final question
- The documentation says something about outliers, which I would like to ignore. But I cannot figure out how this works.


Please help me if anyone can, and please make the answers precise and complete, because some posts I've seen here speak a MATLAB language I have trouble understanding :).

Thank you!
Bendik

Bendik Mjaaland
May 13, 2009, 12:06:02 PM

OK, I solved Issues 1 and 2 by changing the initial values :).

However, the complex result still bothers me. How can I avoid it?

It ruins the characteristics of my LOG-model. Y = a + b*log(1+c/x) will converge to a as x -> inf; with the complex results it just looks really odd.

Peter Perkins
May 13, 2009, 2:46:58 PM

Bendik Mjaaland wrote:
> [...]
> Issue 1:
> Both functions give me this warning:
> Warning: The Jacobian at the solution is ill-conditioned, and some
> model parameters may not be estimated well (they are not identifiable).
> - why does this happen?
> - how can I make sure my results are valid?

In general this means one of two things:

1) two or more parameters in the model are aliased, e.g., b and c in y = a + b*exp(c+x), or

2) the solver has wandered off to a region in the parameter space that is far from the "true" MLE, and the log-likelihood surface has a very bad shape. It sounds from your followup post like that might be the case. Good starting values are pretty important in non-linear least squares, and in general it's impossible to pick them automatically.
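
For what it's worth, here is one rough way you might pick starting values by hand for the exponential model, using the X1 and YDATA from your post. It's only a sketch; the even split of the tail level between a and b is an arbitrary guess, not a recipe:

% For large x, exp(c/x) is close to 1, so the data level off near a + b.
yTail = mean(YDATA(end-9:end));          % level of the last 10 points
a0 = yTail/2;                            % arbitrary split of that level between a and b
b0 = yTail - a0;
c0 = log(max((YDATA(1) - a0)/b0, eps));  % invert y = a + b*exp(c/x) at x = 1
beta0 = [a0 b0 c0];
[beta,r,J,COVB,mse] = nlinfit(X1,YDATA,@funexp,beta0);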

> Issue 2:
> The exp-function also gives me this warning:
> Warning: Iteration limit exceeded. Returning results from final iteration.
>> In nlinfit at 193
> - Is this something I can modify in the options? How? Unlike LSQCURVEFIT, there is no options parameter.

It isn't, but if NLINFIT doesn't converge in 100 iterations, that's a pretty good indication that something has gone wrong. You can restart at the last value if you want, but given the above, it sounds like that would be pointless.
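
If you did want to try it, restarting is just a matter of feeding the estimates from the last run back in as the new starting values, e.g.:

beta = nlinfit(X1,YDATA,@funexp,beta0);   % stops at the iteration limit
beta = nlinfit(X1,YDATA,@funexp,beta);    % continue from where the last run stopped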

> Issue 3:
> The log-function returns complex values. Not being a statistician, I hardly understand what these complex results really mean. I understand the cause - the logarithm of something negative produces an imaginary number - but as far as I know I do not need anything this sophisticated for an answer. I am interested in the plots, and plotting the log results was really weird; the graph was not continuous.
> - Can I merely ignore the complex part, so that if a = 10.424 + 1.2325i, I can use a = 10.424?
> - Can I tell MATLAB to return only real numbers?

No, it's not something to ignore. It's kind of hard to diagnose this without specifics. What are the 60 values?

> And final question
> - The documentation says something about outliers, which I would like to ignore. But I cannot figure out how this works.

You probably want to look at using the 'Robust' option. First call statset('robust','on'), then pass the resulting options structure into NLINFIT.
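
Something along these lines, using the names from your earlier post:

opts = statset('robust','on');                               % robust fitting downweights outliers
[beta,r,J,COVB,mse] = nlinfit(X1,YDATA,@funexp,beta0,opts);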

Hope this helps.

Amar
Apr 16, 2010, 11:20:24 AM

Hello Peter,
I tried using statset('robust','on') in my code, but it gave me an error

I used the following code:

options = statset('FunValCheck','off','DerivStep',10^-12,'robust','on');

beta = nlinfit([parameter_values(3,:);parameter_values(4,:)],Y,@NTCPmodel, beta_seed,options);

And this is the error:

??? Error using ==> times
Matrix dimensions must agree.

Error in ==> nlinfit>nlrobustfit at 417
radj = r .* adjfactor;

Error in ==> nlinfit at 188
[beta,J,sig,cause] = nlrobustfit(X,y,beta,model,J,ols_s,options,verbose,maxiter);

Any pointers as to what's going on there?

Thanks
Amar

Peter Perkins <Peter....@MathRemoveThisWorks.com> wrote in message <guf4j2$hsr$1...@fred.mathworks.com>...

Tom Lane
Apr 19, 2010, 6:31:26 PM

> options = statset('FunValCheck','off','DerivStep',10^-12,'robust','on');
>
> beta = nlinfit([parameter_values(3,:);parameter_values(4,:)],Y,@NTCPmodel,
> beta_seed,options);
>
> And this the error:
>
> ??? Error using ==> times
> Matrix dimensions must agree.

Amar, this is hard to figure out without the data. If you are comfortable
debugging, you may want to set a breakpoint at the location of the error and
see what's going on.

Also, is there a reason you turn off FunValCheck? That might help here. As best I can tell from looking at the code, the function is finding a NaN in an unexpected place, removing it, and then getting an array of the wrong size.
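
A quick thing to check, using the names from your post, is whether NaNs are getting in through the data or through the model at your starting values:

X = [parameter_values(3,:); parameter_values(4,:)];
any(isnan(Y(:)))                     % NaNs in the response?
any(isnan(X(:)))                     % NaNs in the predictors?
f0 = NTCPmodel(beta_seed, X);
any(~isfinite(f0))                   % NaN or Inf in the model at the starting values?

Leaving FunValCheck at its default of 'on' would also make NLINFIT complain right away if the model function returns NaN or Inf.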

-- Tom


EWi Wimmer
Jan 18, 2012, 3:31:08 AM

"Tom Lane" <tl...@mathworks.com> wrote in message <hqilju$de4$1...@fred.mathworks.com>...
Hello, I have the same error.

If I do the nonlinear fit without switching robust on, everything works perfectly, but as soon as I switch robust on I get into trouble!

-- Erich