
Variational Problem with Derivative Constraint


Daryl McCullough

Oct 11, 2001, 10:19:54 AM
Is there a general method for solving variational problems
with constraints on the derivatives? For example (the
suspended cable problem)

Choose functions x(s) and y(s) to minimize the integral
from s=0 to s=L of K y(s) ds subject to the constraint
(dx/ds)^2 + (dy/ds)^2 = 1.

For this problem, K y(s) ds is the potential energy for a
section of cable of length ds at height y(s). The constraint
on derivatives ensures that ds is actually the length
of the section of cable.

I know how to solve this problem by switching independent
variables from s to x. Then the problem becomes

Choose a function y(x) to minimize the integral
from x=0 to x=W of K y(x) square-root(1 + (dy/dx)^2) dx
subject to the constraint that the integral from x=0 to
x=W of square-root(1 + (dy/dx)^2) dx is equal to L.

This problem, with an integral constraint, can be solved by the
method of Lagrange multipliers. But is there a way to use Lagrange
multipliers to solve the original problem without changing variables?
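For reference, the change-of-variables route just described runs as
follows (a standard computation; mu denotes the constant multiplier
on the length constraint, and c_1, x_0 are integration constants):

```latex
% Minimize \int_0^W (K y - \mu)\,\sqrt{1 + (y')^2}\, dx .
% The integrand f has no explicit x-dependence, so the Beltrami
% identity  f - y' \,\partial f / \partial y' = c_1  reduces to
\frac{K y - \mu}{\sqrt{1 + (y')^2}} = c_1 ,
% and solving this first-order equation gives the catenary
K y - \mu = c_1 \cosh\!\left( \frac{K (x - x_0)}{c_1} \right) .
```

Here mu, c_1, and x_0 are fixed by the two endpoint conditions and
the length condition; three conditions for three constants.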

--
Daryl McCullough
CoGenTex, Inc.
Ithaca, NY

David Ziskind

Oct 12, 2001, 11:45:22 AM
Daryl McCullough <da...@cogentex.com> wrote in article
<9q49q...@drn.newsguy.com>...

> Is there a general method for solving variational problems
> with constraints on the derivatives? For example (the
> suspended cable problem)
> [text deleted]

The brief answer is: Yes, typically, the Lagrange multiplier method extends
to the situation of “non-holonomic” constraints (as postulated in the base
post). Specifically:

IF

{
(X, Y) is a minimum path for the functional
(Xv, Yv) into Int[a to b: G(Xv, Xv', Yv, Yv', I)] subject to the
constraints
(1) F(Xv, Xv', Yv, Yv', I) = 0, and
(2) (a,b) end-points of Xv, and Yv are fixed;
}

THEN

{
letting H = G + L*F, then:

if [(a) both P4(F) and P2(F) evaluated at F=Fv are uniformly zero
throughout some neighborhood of (X, Y),
OR
(b) either P4(F) or P2(F) evaluated at F=Fv is uniformly non-zero
throughout some neighborhood of (X, Y)], then
[(1) [P2(H)]' - P1(H) = 0, and
(2) [P4(H)]' - P3(H) = 0, and
(3) F = 0]
} .

Notes:

1) P2(function) means: partial with respect to the 2nd argument of
function, etc.
2) P2(H) abbreviates: (P2(H))(X,X',Y,Y',I), etc.
3) function I is defined by: I(t) = t .
4) Xv, Yv represent comparison functions which lie within a certain
neighborhood of the minimizing functions: X, Y .
5) X' means: derivative of X, etc .
6) L is a real into real mapping.

In short, the Lagrange multiplier method is valid provided P4(F) does not
touch zero, etc., etc.

For a careful exposition of variational calculus and the non-holonomic
situation, see Section 11.6-2 and precedents in Korn, G., & Korn, T.,
“Mathematical Handbook for Scientists and Engineers”, Dover, 2000,
ISBN 0-486-41147-8.

David Ziskind
zis...@ntplx.net

Daryl McCullough

Oct 12, 2001, 11:45:06 AM
da...@cogentex.com says...

>
>Is there a general method for solving variational problems
>with constraints on the derivatives? For example (the
>suspended cable problem)
>
> Choose functions x(s) and y(s) to minimize the integral
> from s=0 to s=L of K y(s) ds subject to the constraint
> (dx/ds)^2 + (dy/ds)^2 = 1.

Well, I didn't get any replies to this, so I'll just
talk to myself a little more. Even though I don't know
a way to directly impose this differential constraint,
I do know of a limiting procedure that imposes it in
the limit.

What we can do is relax the constraint that
(dx/ds)^2 + (dy/ds)^2 be exactly 1. Instead, we
can put a "penalty" on deviations from 1. Then
in the limit as the size of the penalty goes to
infinity, the solution will approach the solution
with the exact constraint.

In other words, replace the problem of
minimizing

Int ds K y(s)

subject to the constraint

(v - 1) = 0

(where v = square-root((dx/ds)^2 + (dy/ds)^2))

by the problem of minimizing

Int ds (K y(s) + A (v - 1)^2)

In the limit as A --> infinity, the solutions of this
problem should approach the solutions of the original
problem.
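To make the limiting procedure concrete, here is a small numerical
sketch (everything in it, including the parameter values K, A, L, W,
N, is an illustrative choice, not something from this thread):
discretize x(s) and y(s) at N+1 points, penalize deviations of the
local speed v from 1, and minimize by crude finite-difference
gradient descent.

```python
import math

# Discretized penalty formulation: minimize the sum over segments of
# (K * y_mid + A * (v - 1)^2) * ds, where v approximates
# sqrt((dx/ds)^2 + (dy/ds)^2).  All constants are illustrative.
K = 1.0      # weight per unit length
A = 50.0     # penalty strength (the exact constraint is A -> infinity)
L = 1.5      # total cable length
W = 1.0      # horizontal span between the two supports
N = 8        # number of segments
ds = L / N

def energy(pts):
    """Discretized Int ds (K y + A (v - 1)^2) over the polyline pts."""
    E = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        v = math.hypot(x1 - x0, y1 - y0) / ds   # local speed |r'(s)|
        E += (K * 0.5 * (y0 + y1) + A * (v - 1.0) ** 2) * ds
    return E

def minimize(pts, steps=8000, h=1e-6, lr=5e-4):
    """Gradient descent on the interior points; endpoints stay fixed."""
    pts = [list(p) for p in pts]
    for _ in range(steps):
        base = energy(pts)
        grad = [[0.0, 0.0] for _ in pts]
        for i in range(1, N):            # skip the fixed endpoints
            for c in (0, 1):
                pts[i][c] += h
                grad[i][c] = (energy(pts) - base) / h
                pts[i][c] -= h
        for i in range(1, N):
            for c in (0, 1):
                pts[i][c] -= lr * grad[i][c]
    return pts

# Start from a straight line between the supports (too short: v = W/L).
init = [[W * i / N, 0.0] for i in range(N + 1)]
sol = minimize(init)
# The minimizer sags below the supports and the speeds drift toward 1.
```

With a moderate A the speed constraint is only approximately
restored; pushing A higher (with a correspondingly smaller step
size) tightens it, in line with the A --> infinity limit above.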

Daryl McCullough

Oct 12, 2001, 1:30:43 PM
David says...

Thanks. But in the particular example that I gave, this
doesn't seem to work, and I don't see exactly which condition
fails.

The particular example of interest is

G(Xv, Xv', Yv, Yv', I) = K Yv
F(Xv, Xv', Yv, Yv', I) = (Xv')^2 + (Yv')^2 - 1

In this case, P2(F) = 2 Xv' which should be nonzero
throughout.

So it is a problem of minimizing the integral
of

K Yv + L((Xv')^2 + (Yv')^2 - 1)

The usual Euler-Lagrange equations for this give:

2L d/dt (Yv') = K
2L d/dt (Xv') = 0

which has the solution

Xv = A + Bt
Yv = K/(4L) t^2 + C t + D

But this is not the usual catenary solution.

Daryl McCullough

Oct 12, 2001, 6:08:45 PM
David says...

>6) L is a real into real mapping.

Doh! I missed this on my first reading. I thought that
the Lagrange multiplier L had to be a constant, rather
than a function of time. That explains my mistake.

Thanks!
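Carrying the method through on the cable example with a multiplier
function confirms this (a standard computation; lambda(s) is written
for the multiplier L, and b, c, x_0, y_0 are integration constants):

```latex
% Euler-Lagrange equations for
% \int [\,K y + \lambda(s)\,((x')^2 + (y')^2 - 1)\,]\, ds :
\frac{d}{ds}\big( 2\lambda x' \big) = 0 , \qquad
\frac{d}{ds}\big( 2\lambda y' \big) = K .
% First integrals:
2\lambda x' = c , \qquad 2\lambda y' = K s + b .
% The constraint (x')^2 + (y')^2 = 1 then fixes the multiplier,
(2\lambda)^2 = c^2 + (K s + b)^2 ,
% so that
\frac{dx}{ds} = \frac{c}{\sqrt{c^2 + (K s + b)^2}}
\quad\Longrightarrow\quad
K s + b = c \,\sinh\!\left( \frac{K (x - x_0)}{c} \right) ,
% and then dy/dx = y'/x' = (K s + b)/c = \sinh(K (x - x_0)/c) gives
y = \frac{c}{K}\,\cosh\!\left( \frac{K (x - x_0)}{c} \right) + y_0 ,
% the catenary.
```

Forcing lambda to be constant collapses the first integrals and
produces the parabola found above instead of the catenary.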

default

Oct 17, 2001, 3:32:00 PM
Daryl McCullough wrote:

> Is there a general method for solving variational problems
> with constraints on the derivatives? For example (the
> suspended cable problem)
>
> Choose functions x(s) and y(s) to minimize the integral
> from s=0 to s=L of K y(s) ds subject to the constraint
> (dx/ds)^2 + (dy/ds)^2 = 1.

This is worked out in detail near the beginning
of Gelfand and Fomin's book on calculus of variations.


David Ziskind

Nov 9, 2001, 12:28:34 AM
This is a follow-up to my reply article posted Oct. 12, 2001, with
identifiers <zis...@ntplx.net> and
<01c15333$ca5b40a0$04bcd5cc@default>.

In that reply, there were various snags in the statement of the
non-holonomic theorem; the following revision (including revised notes)
corrects those problems.

REVISION:

IF

{
(X, Y) is a minimum path for the functional
(Xv, Yv) into Int[a to b: G(Xv, Xv', Yv, Yv', I)] subject to the
constraints
(1) F(Xv, Xv', Yv, Yv', I) = 0, and
(2) (a,b) end-points of Xv, and Yv are fixed;
}

THEN

there exists an L such that:
{
if [(a) both (Xv,Yv) -> P4(F)(Xv,Xv',Yv,Yv',I) and
(Xv,Yv) -> P2(F)(Xv,Xv',Yv,Yv',I) are uniformly zero
throughout some neighborhood of (X, Y),
OR
(b) either (Xv,Yv) -> P4(F)(Xv,Xv',Yv,Yv',I) or
(Xv,Yv) -> P2(F)(Xv,Xv',Yv,Yv',I) is uniformly non-zero
throughout some neighborhood of (X, Y)],
then
[(1) (d/dt)[P2(G+L(t)*F)(X(t),X'(t),Y(t),Y'(t),t)]
- P1(G+L(t)*F)(X(t),X'(t),Y(t),Y'(t),t) = 0, and
(2) (d/dt)[P4(G+L(t)*F)(X(t),X'(t),Y(t),Y'(t),t)]
- P3(G+L(t)*F)(X(t),X'(t),Y(t),Y'(t),t) = 0, and
(3) F(X,X',Y,Y',I) = 0]
}

Notes:

1) P2(function) means: partial with respect to the 2nd argument of
function, etc.
2) function I is defined by: I(t) = t.
3) Xv, Yv represent comparison functions which lie within a certain
neighborhood of the minimizing functions: X, Y.
4) X' means: derivative of X, etc.
5) L is a real into real mapping.
6) In an expression such as P1(G+L(t)*F), L(t) is a number, not a
function, and thus the partial is equal to P1(G) + L(t)*P1(F).
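As a quick numerical sanity check of the revised equations on the
thread's cable example, take K = 1 with integration constants c = 1
and b = 0 (illustrative choices): the catenary in arc-length form
has x'(s) = 1/sqrt(1+s^2), y'(s) = s/sqrt(1+s^2), and the multiplier
function 2*L(s) = sqrt(1+s^2) satisfies the equations.

```python
import math

# Checks, at sample points, that for the arc-length catenary the
# revised conditions hold:
#   (d/ds)[2 L(s) x'(s)] = 0,   (d/ds)[2 L(s) y'(s)] = K,
# together with the constraint (x')^2 + (y')^2 = 1.
K = 1.0

def xp(s):  # x'(s)
    return 1.0 / math.sqrt(1.0 + s * s)

def yp(s):  # y'(s)
    return s / math.sqrt(1.0 + s * s)

def two_lam(s):  # 2*L(s), the multiplier function
    return math.sqrt(1.0 + s * s)

def ddt(f, s, h=1e-5):
    """Central finite-difference derivative."""
    return (f(s + h) - f(s - h)) / (2.0 * h)

for s in (-2.0, -0.5, 0.0, 1.0, 3.0):
    assert abs(xp(s) ** 2 + yp(s) ** 2 - 1.0) < 1e-12          # constraint
    assert abs(ddt(lambda t: two_lam(t) * xp(t), s)) < 1e-8    # x equation
    assert abs(ddt(lambda t: two_lam(t) * yp(t), s) - K) < 1e-8  # y equation
```

Here 2 L(s) x'(s) = 1 identically, so the x equation holds exactly,
and 2 L(s) y'(s) = s, so the y equation reduces to d/ds (s) = K = 1.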

David Ziskind
zis...@ntplx.net
