Not much difference, really.
Lsqnonlin just has you return a vector of residuals whose
sum of squares it will try to minimize. Lsqcurvefit has you
provide a set of target values. Your objective function
must return the model's prediction at each point; the code
then subtracts the predictions from the targets, then
minimizes the sum of squares of the differences.
Both codes allow no more than bound constraints on
the parameters. I seriously doubt that there is any
difference in the computational engines.
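As a rough sketch of the two interfaces, using a made-up
exponential decay model (the data, variable names, and start
point below are purely for illustration):

% toy data for a model y = a*exp(b*x), with a and b unknown
xdata = (0:0.1:2)';
ydata = 3*exp(-1.5*xdata) + 0.05*randn(size(xdata));
p0 = [1; -1];

% lsqnonlin: your function returns the residual vector itself
resfun = @(p) p(1)*exp(p(2)*xdata) - ydata;
p_nonlin = lsqnonlin(resfun, p0);

% lsqcurvefit: your function returns predictions; the target
% values (ydata) are passed separately and the residuals are
% formed for you
predfun = @(p,xdata) p(1)*exp(p(2)*xdata);
p_curvefit = lsqcurvefit(predfun, p0, xdata, ydata);

Both calls should return essentially the same parameter
estimates.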
Are there reasons why one would choose one code
over the other? Yes. For example, a recent poster
asked how to do a weighted nonlinear regression.
This ends up being slightly easier to do using
lsqnonlin, since there you subtract the targets
from the predictions yourself before applying the weights.
Yes, it could also have been solved using lsqcurvefit.
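For instance, continuing the toy example above with some
made-up weights w, the lsqnonlin objective just becomes the
weighted residual vector:

w = 1./(1 + xdata);    % purely illustrative weights
wresfun = @(p) w.*(p(1)*exp(p(2)*xdata) - ydata);
p_weighted = lsqnonlin(wresfun, p0);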
HTH,
John
I need your experience on another topic.
Can I email you directly?
Many do so. However it is often true that others will
benefit from your question or any response. As well,
someone else may be better able than am I to answer
some questions posed on this forum.
John
My problem is related to indirect curve fitting
I have the following data:
mdata, Cin(t): experimental data of the same length (both raw
arrays)
f(t): transport function
m(t)=conv(Cin(t),f(t)); conv means convolution
f(t)=fr.*exp(-k*(t-ta-Tm));
fr, k, ta, Tm: constants that need to be found
The main goal is to find the constants of f(t) by doing indirect
curve fitting between the experimental data mdata and the m(t)
that results from the above convolution.
How can I do the curve fitting between m(t) and mdata?
How can I do the deconvolution between mdata and Cin(t) to get f(t)?
If the problem isn't clear, please tell me.
Looking forward to hearing from you.
In the related thread:
http://www.mathworks.com/matlabcentral/newsreader/view_thread/156711
I give examples of the use of both lsqnonlin
and lsqcurvefit to solve for a set of parameters
using nonlinear regression.
Your problem is no different. You have a vector
t, and a function f(t). Find the set of parameters
which gives the best fit to your data in mdata.
Try it. If you don't try, you will never learn. If
you get stuck, then post what you've done so
far.
You will have one problem though. It will not
be possible to find independent estimates for
both ta and Tm, because they enter the model
only through their sum:
exp(-k*(t-ta-Tm)) = exp(k*(ta+Tm)).*exp(-k*t).
The best that you can do is estimate that sum,
i.e., (ta+Tm).
John
function [FittedFRT,model] = ConvFun(time,mdata,Cin)
% Fit fr, k, ta, Tm so that conv(Cin, f(t)) reproduces mdata.
start_point = [0.15; 0.341; 0.28; 4];
model = @objFun;
% lsqnonlin expects the objective to return the residual vector;
% it minimizes the sum of squares of those residuals itself.
% (lsqcurvefit would instead need the predictions plus separate
% xdata/ydata arguments.)
FittedFRT = lsqnonlin(model, start_point);

    function ErrorVector = objFun(para)
        fr = para(1);
        k  = para(2);
        ta = para(3);
        Tm = para(4);
        FRT = fr.*exp(-k*(time - ta - Tm));
        % The full convolution has length(Cin)+length(FRT)-1 points;
        % keep only the first length(mdata) of them.
        Fitted_m = conv(Cin, FRT);
        Fitted_mcut = Fitted_m(1:length(mdata));
        % force columns so the subtraction is well defined
        ErrorVector = Fitted_mcut(:) - mdata(:);
    end
end
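A hypothetical call, assuming time, mdata, and Cin have already
been loaded as vectors of measured data:

FittedFRT = ConvFun(time, mdata, Cin);
% FittedFRT contains the estimates of [fr; k; ta; Tm]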