# Re: FindRoot repeatedly evaluating function


### DrMajorBob

Aug 26, 2011, 5:26:47 AM
FindRoot makes several evaluations near each guess to estimate the
derivative numerically, and uses that to compute another guess. There's
nothing odd or surprising about it.
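One way to see those probe evaluations directly is to Sow every point at which the objective is called (a toy illustration with a simple equation, not the NDSolve problem):

```mathematica
(* collect every y at which the objective is evaluated *)
pts = Reap[FindRoot[(Sow[y]; Cos[y] - y), {y, 1}]][[2, 1]]
(* the list shows closely spaced pairs: the extra evaluations are
   finite-difference probes used to estimate the derivative *)
```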

To avoid all that (where possible), use symbolic solvers instead:

Solve[x[10] == 0 /.
   First@DSolve[{x''[S] - x'[S] + x[S] == 0, x[0] == 1, x'[0] == a}, x,
    S], a]

{{a -> (Csc[5 Sqrt[3]] (-3 Cos[5 Sqrt[3]] + Sqrt[3] Sin[5 Sqrt[3]]))/(
   2 Sqrt[3])}}

Bobby

On Thu, 25 Aug 2011 06:05:19 -0500, Simon Pearce
<Simon....@nottingham.ac.uk> wrote:

> Hi Mathgroup,
>
> When I use FindRoot[f[y],y] I find that the inner function f is
> evaluated 3 or 4 times at each value of y (or at least very similar
> values), even if y is far from the root. This has obvious implications
> to the speed of my code.
> Can anyone explain why this is the case, and tell me any way to stop it
> from repeatedly evaluating f? If I use f[a]:=f[a]=... then it uses the
> stored result, but I don't want to store thousands of such real valued
> expressions.
>
> The following simple code shows the essence of the problem, using Print
> to show where the function is evaluated and its value there.
>
> f[a_?NumericQ]:=Module[{sol},
> sol=NDSolve[{x''[S]-x'[S]+x[S]==0,x[0]==1,x'[0]==a},x,{S,0,10}][[1]];
> Print[{a,x[10]/.sol}]; x[10]/.sol ]
> FindRoot[f[y],{y,6}]
>
> Thanks,
> Simon Pearce
> Postdoctoral Researcher
> The Centre for Plant Integrative Biology
> School of Biosciences
> University of Nottingham
>
>
> This message and any attachment are intended solely for the addressee
> and may contain confidential information. If you have received this
> message in error, please send it back to me, and immediately delete
> it. Please do not use, copy or disclose the information contained in
> this message or in any attachment. Any views or opinions expressed by
> the author of this email do not necessarily reflect the views of the
> University of Nottingham. This message has been checked for viruses but
> the contents of an attachment may still contain software viruses which
> could damage your computer system: you are advised to perform your own
> checks. Email communications with the University of Nottingham may be
> monitored as permitted by UK legislation.

### Simon Pearce

Aug 27, 2011, 8:17:10 AM
Hi Bobby,

Thanks for your quick response. Why then, if I store the result using f[a_]:=f[a]=..., does it only evaluate f twice? The first two evaluations are at the same value (at least to machine precision, I assume), with a third very slightly away.

My actual problem is not amenable to symbolic solving.

Thanks,
Simon


### Oliver Ruebenkoenig

Aug 27, 2011, 8:22:50 AM

Hello Simon,

On Thu, 25 Aug 2011, Simon Pearce wrote:

> Hi Mathgroup,
>
> When I use FindRoot[f[y],y] I find that the inner function f is evaluated 3 or 4 times at each value of y (or at least very similar values), even if y is far from the root. This has obvious implications to the speed of my code.
> Can anyone explain why this is the case, and tell me any way to stop it from repeatedly evaluating f? If I use f[a]:=f[a]=... then it uses the stored result, but I don't want to store thousands of such real valued expressions.
>
> The following simple code shows the essence of the problem, using Print to show where the function is evaluated and its value there.
>
> f[a_?NumericQ]:=Module[{sol},
> sol=NDSolve[{x''[S]-x'[S]+x[S]==0,x[0]==1,x'[0]==a},x,{S,0,10}][[1]];
> Print[{a,x[10]/.sol}]; x[10]/.sol ]
> FindRoot[f[y],{y,6}]
>
> Thanks,
> Simon Pearce
> Postdoctoral Researcher
> The Centre for Plant Integrative Biology
> School of Biosciences
> University of Nottingham
>
>

here are some ideas you may find useful.

Here is a slight modification of your example:

f[a_?NumericQ] :=
Module[{sol},
sol = NDSolve[{x''[S] - x'[S] + x[S] == 0, x[0] == 1, x'[0] == a},
x, {S, 0, 10}][[1]];
Print[InputForm[{a, x[10] /. sol}]]; x[10] /. sol]

e = s = 0;
FindRoot[f[y], {y, 6}
, StepMonitor :> s++, EvaluationMonitor :> e++
]

on my machine this gives e==17 and s==5.

Sometimes the Jacobian does not need to be updated at every step.

s = e = 0;
FindRoot[f[y], {y, 6}, Method -> {"Newton", "UpdateJacobian" -> 2}
, StepMonitor :> s++, EvaluationMonitor :> e++]

This gives me e==11 and s==4.

Another thing to try is to use a finite difference approximation to the
Jacobian that is of lower order.

s = e = 0;
FindRoot[f[y], {y, 6}
, Jacobian -> {"FiniteDifference", "DifferenceOrder" -> 2}
, StepMonitor :> s++, EvaluationMonitor :> e++]

This gives me e==10 and s==3.

Combining the two last approaches:

s = e = 0;
FindRoot[f[y], {y, 6}
, Method -> {"Newton", "UpdateJacobian" -> 2}
, Jacobian -> {"FiniteDifference", "DifferenceOrder" -> 2}
, StepMonitor :> s++, EvaluationMonitor :> e++]

e==9 and s==3.

Since you seem concerned about performance in both speed and memory, here
is another idea.

First you could call NDSolve like this

if = First[
x /. NDSolve[{x''[S] - x'[S] + x[S] == 0, x[0] == 1, x'[0] == 1},
x, {S, 10, 10}]]

{S,10,10} stores just the last value in the interpolation function.

if["ValuesOnGrid"]

Compare that to {S,0,10}.

Next, you could separate the equation processing from the solution like
so:

nds = First@
NDSolve`ProcessEquations[{x''[S] - x'[S] + x[S] == 0, x[0] == 1,
x'[0] == 0}, {}, {S, 10, 10}];

ClearAll[ff]
With[{state = nds},
ff[a_?NumericQ] := (Module[{sol, tmp, tnds},
tnds = First@
NDSolve`Reinitialize[state, {x[0] == 1, x'[0] == a}];
NDSolve`Iterate[tnds];
sol = NDSolve`ProcessSolutions[tnds, 1];
tmp = x[10.] /. sol;
Print[InputForm[{a, tmp}]]; tmp])]

s = 0; e = 0
FindRoot[ff[y], {y, 6}
, StepMonitor :> s++, EvaluationMonitor :> e++]

gives e==17 and s==5

If you allow caching (May I ask why you do not?)

ClearAll[ff]
With[{state = nds},
ff[a_?NumericQ] := (ff[a] =
Module[{sol, tmp, tnds},
tnds = First@NDSolve`Reinitialize[state, {x[0] == 1, x'[0] == a}];
NDSolve`Iterate[tnds];
(*Print["sv: ",tnds@"SolutionVector"];*)
sol = NDSolve`ProcessSolutions[tnds, 1];
tmp = x[10.] /. sol;
Print[InputForm[{a, tmp}]]; tmp])]

s = 0; e = 0
FindRoot[ff[y], {y, 6}
, StepMonitor :> s++, EvaluationMonitor :> e++]

s and e stay the same, but as the Print output shows, the cached values
are reused. To be quite honest, I have not looked at the code and do not
know exactly how the algorithm works.

Another possible speed improvement is to skip constructing the
interpolation function and to use the value from the tnds@"SolutionVector"
directly.
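An untested sketch of what that might look like, reusing state and nds from the code above; extracting the value with First@Flatten is my assumption about the layout of the solution vector:

```mathematica
(* hypothetical sketch: skip ProcessSolutions / the InterpolatingFunction
   and read the raw state data; First@Flatten[...] is an assumed way
   to pull out the value of x at the end of the integration *)
ClearAll[ff2]
With[{state = nds},
 ff2[a_?NumericQ] := Module[{tnds},
   tnds = First@NDSolve`Reinitialize[state, {x[0] == 1, x'[0] == a}];
   NDSolve`Iterate[tnds];
   First@Flatten[{tnds@"SolutionVector"}]]]
```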

A colleague of mine came up with this reformulation. This is very clever
since then you have access to the Jacobian.

eqns = {Derivative[2, 0][x][S, p] - Derivative[1, 0][x][S, p] +
x[S, p] == 0, x[0, p] == 1, Derivative[1, 0][x][0, p] == p};
seqns = {eqns, D[eqns, p]};
svars = {x[S, p], Derivative[0, 1][x][S, p]};

s = 0; e = 0;
FindRoot[f[y], {y, 6}, Jacobian -> J[y]
, StepMonitor :> s++, EvaluationMonitor :> e++]

gives

s==3 and e==4

So, if you have the Jacobian, it is good to use it. Even if you do not
have the Jacobian, you can sometimes say something about its structure and
then use a SparseArray to convey that to FindRoot.

Here are some links to the documentation

(* FindRoot *)
(*tutorial/UnconstrainedOptimizationNewtonsMethodRoot*)
(*tutorial/UnconstrainedOptimizationIntroductionNonlinearEquations*)
(*tutorial/UnconstrainedOptimizationSpecifyingDerivatives*)

(* NDSolve`XY *)
(* tutorial/NDSolveStateData *)

Hope this helps.

Oliver

### DrMajorBob

Aug 27, 2011, 8:24:22 AM
Storing the result that way is called "memoization" or sometimes "dynamic
programming". (I don't know why it's not called "memorization", since
that's exactly what it is.)

If it makes things faster, do it. If it doesn't, don't.
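A toy illustration of memoization, and of discarding the cache afterwards (g and calls are names invented for this example):

```mathematica
ClearAll[g]; calls = 0;
g[a_?NumericQ] := g[a] = (calls++; N[Cos[a] - a])  (* memoized definition *)
g[1.5]; g[1.5];
calls  (* -> 1: the second call reuses the stored value *)
ClearAll[g]  (* drop the stored values once memory matters *)
```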

I don't know why it would call ONLY twice near the same value if you do
that but three times if you don't; possibly because two of the values are
the same to machine precision. That may lead to a less-exact estimate of
the derivative.

Bobby

On Fri, 26 Aug 2011 04:00:14 -0500, Simon Pearce
<Simon....@nottingham.ac.uk> wrote:

> Hi Bobby,
>
> Thanks for your quick response. Why then if I store the result using
> f[a_]:=f[a] does it only do it twice? The first two results are being
> done at the same value (at least to machine precision I assume), with a
> third being done very slightly away.
>
> My actual problem is not amenable to symbolic solving.
>
> Thanks,
> Simon
>
### Oliver Ruebenkoenig

Aug 27, 2011, 8:24:52 AM
On Thu, 25 Aug 2011, Simon Pearce wrote:

> Hi Mathgroup,
>
> When I use FindRoot[f[y],y] I find that the inner function f is
> evaluated 3 or 4 times at each value of y (or at least very similar
> values), even if y is far from the root. This has obvious implications
> to the speed of my code.
> Can anyone explain why this is the case, and tell me any way to stop it
> from repeatedly evaluating f? If I use f[a]:=f[a]=... then it uses the
> stored result, but I don't want to store thousands of such real valued
> expressions.

Simon, here is the answer to the remaining question:

This is the case because the Jacobian and FindRoot both need to evaluate
f at those points:

ClearAll[f]
With[{eqns = seqns, vars = svars}, Clear[f, J, ysol];
f[y_?NumericQ] := Part[ysol[y], 1];
J[y_?NumericQ] := (Print["Jackpot"]; {{Part[ysol[y], 2]}});
ysol[y_?NumericQ] := Module[{sol, yvars}, yvars = vars /. p -> y;
sol = First[NDSolve[eqns /. p -> y, yvars, {S, 10, 10}]];
sol = (yvars /. sol) /. S -> 10;
Print[InputForm[{y, sol}]];
sol]]

s = 0; e = 0; j = 0;
FindRoot[f[y], {y, 6}, Jacobian -> {J[y], EvaluationMonitor :> j++}
, StepMonitor :> s++, EvaluationMonitor :> e++]

but j == 3; this is why the caching is useful.

Oliver

### DrMajorBob

Aug 28, 2011, 4:10:51 AM
> A colleague of mine came up with this reformulation. This is very clever
> since then you have access to the Jacobian.
>
> eqns = {Derivative[2, 0][x][S, p] - Derivative[1, 0][x][S, p] +
> x[S, p] == 0, x[0, p] == 1, Derivative[1, 0][x][0, p] == p};
> seqns = {eqns, D[eqns, p]};
> svars = {x[S, p], Derivative[0, 1][x][S, p]};
>
> s = 0; e = 0;
> FindRoot[f[y], {y, 6}, Jacobian -> J[y]
> , StepMonitor :> s++, EvaluationMonitor :> e++]

The first half of that DOES look clever, but the second has nothing to do
with it.

What did you leave out?

Bobby

### Oliver Ruebenkoenig

Aug 28, 2011, 4:11:21 AM

Apparently it was not entirely obvious what to do here, so here is the
code:

ClearAll[f]
With[{eqns = seqns, vars = svars}, Clear[f, J, ysol];
f[y_?NumericQ] := Part[ysol[y], 1];
J[y_?NumericQ] := (Print["Jackpot"]; {{Part[ysol[y], 2]}});
ysol[y_?NumericQ] := Module[{sol, yvars}, yvars = vars /. p -> y;
sol = First[NDSolve[eqns /. p -> y, yvars, {S, 10, 10}]];
sol = (yvars /. sol) /. S -> 10;
Print[InputForm[{y, sol}]];
sol]]

;-)

Oliver

### DrMajorBob

Aug 28, 2011, 4:12:53 AM

Bobby
