
Course code: MTH 441

Course title: CALCULUS OF VARIATION

Fundamental problems of calculus of variation


In this course, we shall focus on:
1. Shortest path problems
2. Minimal surface of revolution
3. Brachistochrone problem and
4. Isoperimetric problems
SHORTEST PATH PROBLEMS
Here, we are interested in tracing the path followed by a particle and determining the shortest distance between two points in the plane. This can be illustrated diagrammatically as in Fig. 1.

[Fig 1: A diagram showing the path from A(a, y_a) to B(b, y_b), with arc element ds and increments dx and dy, by basic geometry.]
(ds)² = (dx)² + (dy)²
     = (dx)²[1 + (dy)²/(dx)²]
(ds)² = (dx)²[1 + (dy/dx)²].

Taking square roots of both sides,

ds = √{(dx)²[1 + (dy/dx)²]}
ds = √(dx)² · √(1 + (dy/dx)²)
ds = dx √(1 + (y')²),

where
y' = dy/dx.

To find the shortest distance S, we integrate from A to B, i.e.

J(y) = ∫_a^b ds = ∫_a^b √(1 + (y')²) dx.

To determine the shortest path, we minimize

J(y) = ∫_a^b √(1 + (y')²) dx.

MINIMAL SURFACE OF REVOLUTION


Here, we are interested in determining the curve passing through two points in the plane with coordinates (x₁, y₁) and (x₂, y₂) respectively, which, when rotated about the x-axis, gives a minimal surface.

[Figure: a curve y(x) joining (x₁, y₁) and (x₂, y₂), with arc element ds, revolved about the x-axis.]

Let ds be a small strip on the curve y.

The area of the surface generated by ds when revolved is 2πy ds. Hence the total surface area covered by revolving y(x) is given by

J(y) = ∫_{x₁}^{x₂} 2πy ds.  (i)

But, from the shortest path problem,
ds = √(1 + (y')²) dx,
hence (i) becomes

J(y) = ∫_{x₁}^{x₂} 2πy √(1 + (y')²) dx

J(y) = 2π ∫_{x₁}^{x₂} y √(1 + (y')²) dx,  (ii)

where y' = dy/dx.
The minimal surface will be determined by finding the critical point of the functional in (ii).

BRACHISTOCHRONE PROBLEM
This is the oldest problem of the C.O.V, and it arises from physics. Here, we consider a bead sliding down from a point A under the influence of gravity, and we are interested in finding the path along which a particle, in the absence of friction, will slide from point A to point B in the shortest time under the action of gravity.
Diagrammatically, we have:

[Figure: the bead starts at the origin and slides under gravity g along a curve in the (x, y)-plane, with y measured downward.]

The total energy is conserved, i.e.

K.E. = P.E.
½ m v² = mgh
v² = 2gh
v = √(2gh) = √(2gy).  (3)

Recall that
velocity v = ds/dt ⟹ ds = v dt,
and acceleration a = dv/dt = d/dt(ds/dt).

From ds = v dt,
dt = ds/v.  (4)

The time taken to slide down the plane is given by
J(y) = ∫_0^T dt.  (5)

Substituting (3) and (4) into (5), we have

J(y) = ∫_0^T ds/v = ∫_0^x √(1 + (y')²)/√(2gy) dx

J(y) = (1/√(2g)) ∫_0^x √((1 + (y')²)/y) dx.
Necessary condition for Maximum/ Minimum when f is differentiable.
Let f : S → ℝ be a differentiable function, let x₀ be an interior point of S, and let x₀ be either a maximum or a minimum point of f.
Then the first derivative of f vanishes at x₀, i.e.
f'(x₀) = 0.

This condition is a necessary condition but not a sufficient condition.

An interior point x₀ ∈ D ⊆ ℝ is said to be a stationary point if f'(x₀) = 0.
A stationary point x₀ need not be a max/min point; however, if f''(x₀) > 0 then x₀ is a minimum point, and if f''(x₀) < 0 then x₀ is a maximum point.
Example
The function f(x) = x² has f'(x) = 2x and f''(x) = 2 > 0; hence f(x) = x² has a minimum (at the stationary point x = 0).

Similarly, for multivariable functions, let f : ℝⁿ → ℝ be a real-valued function of n variables defined on ℝⁿ, and suppose f has partial derivatives at x₀ ∈ ℝⁿ.

If x₀ is a maximum/minimum point of the function f(x), then

∂f/∂x₁ |_{x=x₀} = 0, ∂f/∂x₂ |_{x=x₀} = 0, …, ∂f/∂xₙ |_{x=x₀} = 0.

A stationary point x₀ is a maximum point if the matrix (the Hessian)

M = ⎡ ∂²f/∂x₁²     ∂²f/∂x₁∂x₂   ⋯  ∂²f/∂x₁∂xₙ ⎤
    ⎢ ∂²f/∂x₂∂x₁   ∂²f/∂x₂²     ⋯  ∂²f/∂x₂∂xₙ ⎥
    ⎢     ⋮             ⋮        ⋱      ⋮     ⎥
    ⎣ ∂²f/∂xₙ∂x₁   ∂²f/∂xₙ∂x₂   ⋯  ∂²f/∂xₙ²   ⎦  evaluated at x = x₀

is negative definite, and x₀ is a minimum point if M is positive definite.
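As a quick illustration of this second-derivative test, the following Python/SymPy sketch (a minimal example; SymPy is assumed to be available, and the function f and its variable names are chosen purely for illustration, not taken from these notes) finds the stationary point of a two-variable function, builds its Hessian and queries its definiteness.

import sympy as sp

x1, x2 = sp.symbols('x1 x2')
f = x1**2 + x1*x2 + x2**2          # illustrative function, not from the notes

# Stationary point: solve grad f = 0
grad = [sp.diff(f, v) for v in (x1, x2)]
crit = sp.solve(grad, (x1, x2))    # {x1: 0, x2: 0}

# Hessian matrix M evaluated at the stationary point
M = sp.hessian(f, (x1, x2)).subs(crit)
print(crit, M.is_positive_definite)  # True, so the stationary point is a minimum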

Fundamental Theorem of Calculus of Variation.


Lemma 1/ {Fundamental lemma of C.O.V}

If f(x) is a continuous function defined on [a, b] and ∫_a^b f(x) g(x) dx = 0 for every function g(x) ∈ C(a, b) such that g(a) = g(b) = 0, then f(x) = 0 for all x ∈ [a, b].

Proof
Suppose, to the contrary, that f(x) ≠ 0 at some point c ∈ (a, b). Without loss of generality, let us assume that f(c) > 0. By the continuity of f, we then have f(x) > 0 on some interval [x₁, x₂] ⊂ [a, b] that contains the point c.
Let
g(x) = (x − x₁)(x₂ − x) for x ∈ [x₁, x₂], and g(x) = 0 outside [x₁, x₂].
Note that (x − x₁)(x₂ − x) is positive for x ∈ (x₁, x₂), and that g(a) = g(b) = 0.

Now, consider
∫_a^b f(x) g(x) dx = ∫_a^{x₁} f(x) g(x) dx + ∫_{x₁}^{x₂} f(x) g(x) dx + ∫_{x₂}^b f(x) g(x) dx

⟹ ∫_a^b f(x) g(x) dx = ∫_{x₁}^{x₂} f(x) g(x) dx   (since g = 0 outside [x₁, x₂])

⟹ ∫_a^b f(x) g(x) dx = ∫_{x₁}^{x₂} f(x)(x − x₁)(x₂ − x) dx > 0.

Thus, we get a contradiction to what is given in the lemma. This implies that f(x) = 0 on [a, b].

Theorem 1. (Necessary condition for an extremum.)

If y(x) is an extremizing function for the problem

Minimize / Maximize J(y) = ∫_{x₁}^{x₂} F(x, y, y') dx  (2.1)

with end conditions y(x₁) = y₁ and y(x₂) = y₂, then y(x) satisfies the B.V.P.

∂F/∂y − d/dx(∂F/∂y') = 0  (2.2)

y(x₁) = y₁ and y(x₂) = y₂.

Equation (2.2) is known as the Euler-Lagrange equation.

Proof
Let y(x) be an extremizing function for the functional J(y) in equation (2.1). Consider the diagram:

[Figure: the extremal y(x) joining (x₁, y₁) and (x₂, y₂), together with the varied curve y(x) + εη(x), where η(x₁) = η(x₂) = 0.]

Let ȳ = y(x) + εη(x) be a variation of y(x), where η(x) is a continuously differentiable function with η(x₁) = 0 = η(x₂) and ε is a small constant.

Hence J(ε) along the path ȳ = y(x) + εη(x) is given by

J(ε) = ∫_{x₁}^{x₂} F(x, ȳ, ȳ') dx.  (∗)

Recall that
ȳ = y(x) + εη(x),  ȳ' = y'(x) + εη'(x).  (∗∗)

Substituting (∗∗) into (∗), we write

J(ε) = ∫_{x₁}^{x₂} F(x, y(x) + εη(x), y'(x) + εη'(x)) dx.

Since y(x) is an extremizing function, J(ε) has an extremum when ε = 0; thus, by classical calculus,

dJ/dε |_{ε=0} = 0.

Now

dJ/dε = ∫_{x₁}^{x₂} d/dε [F(x, ȳ, ȳ')] dx
      = ∫_{x₁}^{x₂} [∂F/∂x · dx/dε + ∂F/∂ȳ · dȳ/dε + ∂F/∂ȳ' · dȳ'/dε] dx.

But dx/dε = 0, since x does not depend on ε; also, from
ȳ = y(x) + εη(x) ⟹ dȳ/dε = η(x),
and
ȳ' = y'(x) + εη'(x) ⟹ dȳ'/dε = η'(x).

Substituting these into the expression above (and evaluating at ε = 0, i.e. along y(x)), we write

dJ/dε = ∫_{x₁}^{x₂} [∂F/∂x · 0 + ∂F/∂y η(x) + ∂F/∂y' η'(x)] dx
      = ∫_{x₁}^{x₂} [∂F/∂y η(x) + ∂F/∂y' η'(x)] dx
      = ∫_{x₁}^{x₂} ∂F/∂y η(x) dx + ∫_{x₁}^{x₂} ∂F/∂y' η'(x) dx.

Applying integration by parts to the second integral, we get

∫_{x₁}^{x₂} ∂F/∂y' η'(x) dx = [∂F/∂y' η(x)]_{x₁}^{x₂} − ∫_{x₁}^{x₂} d/dx(∂F/∂y') η(x) dx.

Using η(x₁) = η(x₂) = 0, we have

∫_{x₁}^{x₂} ∂F/∂y' η'(x) dx = 0 − ∫_{x₁}^{x₂} d/dx(∂F/∂y') η(x) dx
                            = −∫_{x₁}^{x₂} d/dx(∂F/∂y') η(x) dx.

Hence

dJ/dε |_{ε=0} = ∫_{x₁}^{x₂} ∂F/∂y η(x) dx − ∫_{x₁}^{x₂} d/dx(∂F/∂y') η(x) dx = 0
             = ∫_{x₁}^{x₂} [∂F/∂y − d/dx(∂F/∂y')] η(x) dx = 0.

Since η(x) is an arbitrary function (not identically zero), the fundamental lemma gives

∂F/∂y − d/dx(∂F/∂y') = 0.  (∗∗∗)
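As a small symbolic illustration of (∗∗∗), the following Python/SymPy sketch (assuming SymPy is installed) forms ∂F/∂y − d/dx(∂F/∂y') for the shortest-path integrand F = √(1 + (y')²) treated in the next example; the result is a multiple of y'', so the extremals are the curves with y'' = 0.

import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')
yp = y(x).diff(x)

F = sp.sqrt(1 + yp**2)                              # shortest-path integrand
EL = sp.diff(F, y(x)) - sp.diff(sp.diff(F, yp), x)  # dF/dy - d/dx(dF/dy')
print(sp.simplify(EL))   # -y''/(1 + (y')**2)**(3/2), which vanishes iff y'' = 0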

Example
Verify the Euler-Lagrange equation for the fundamental problems of C.O.V
Solution
Here, we are interested in the
1. Shortest path problems
2. Minimal surface of revolution
3. Brachistochrone problem
4. Isoperimetric problems

(A) For the Shortest Path Problem.


J(y) = ∫_{x₁}^{x₂} √(1 + (y')²) dx.

Suppose F = √(1 + (y')²) = [1 + (y')²]^{1/2}. Then

∂F/∂y = 0,  (i)

since F does not contain y. For ∂F/∂y', let u = 1 + (y')², so that F = u^{1/2}. Then

dF/du = ½ u^{-1/2},  du/dy' = 2y',

so that

∂F/∂y' = dF/du · du/dy' = ½ u^{-1/2} · 2y' = 2y'/(2[1 + (y')²]^{1/2})

∂F/∂y' = y'/√(1 + (y')²).  (ii)

Using
∂F/∂y − d/dx(∂F/∂y') = 0  (iii)
and substituting (i) and (ii) into (iii), we have

0 − d/dx(y'/√(1 + (y')²)) = 0
∫ d(y'/√(1 + (y')²)) = ∫ 0 dx
⟹ y'/√(1 + (y')²) = 0 + C
y' = C√(1 + (y')²).

Squaring both sides yields

(y')² = C²[1 + (y')²]
(y')² = C² + C²(y')²
(y')²(1 − C²) = C²
y' = √(C²/(1 − C²))
dy/dx = √(C²/(1 − C²))
∫ dy = ∫ √(C²/(1 − C²)) dx
y = √(C²/(1 − C²)) x + A
y(x) = mx + A,

where m = √(C²/(1 − C²)).

Recall the end conditions for
J(y) = ∫_{x₁}^{x₂} √(1 + (y')²) dx:

y(x₁) = y₁ ⟹ y₁ = m x₁ + A
y(x₂) = y₂ ⟹ y₂ = m x₂ + A.

Subtracting,
y₁ − y₂ = m(x₁ − x₂)
m = (y₁ − y₂)/(x₁ − x₂).

From y₁ = m x₁ + A,
⟹ y₁ = [(y₁ − y₂)/(x₁ − x₂)] x₁ + A
⟹ A = y₁ − [(y₁ − y₂)/(x₁ − x₂)] x₁,

so that
y(x) = [(y₁ − y₂)/(x₁ − x₂)] (x − x₁) + y₁.

Thus, the curve passing through the two fixed points is a straight line.
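A short SymPy check (a sketch, under the assumption that SymPy is available and that dsolve accepts symbolic end points in its ics argument) recovers the same straight line by solving the reduced Euler-Lagrange equation y'' = 0 with the two end conditions.

import sympy as sp

x, x1, x2, y1, y2 = sp.symbols('x x1 x2 y1 y2')
y = sp.Function('y')

# y'' = 0 is the Euler-Lagrange equation of the arc-length functional
sol = sp.dsolve(y(x).diff(x, 2), y(x), ics={y(x1): y1, y(x2): y2})
print(sp.simplify(sol.rhs))   # simplifies to the chord through (x1, y1) and (x2, y2)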

(B) Minimal Surface of Revolution


Here, we consider

J(y) = ∫_{x₁}^{x₂} 2πy √(1 + (y')²) dx

J(y) = 2π ∫_{x₁}^{x₂} y √(1 + (y')²) dx.

We set

F = y √(1 + (y')²).

Then

∂F/∂y = ∂/∂y [y √(1 + (y')²)] = √(1 + (y')²).

Also,

∂F/∂y' = ∂/∂y' [y √(1 + (y')²)] = y · ½[1 + (y')²]^{-1/2} · 2y'

⟹ ∂F/∂y' = 2yy'/(2√(1 + (y')²)) = yy'/√(1 + (y')²).
Using
∂F/∂y − d/dx(∂F/∂y') = 0
gives

√(1 + (y')²) − d/dx(yy'/√(1 + (y')²)) = 0.

Since F = y√(1 + (y')²) does not contain x explicitly, it is easier to use the first integral F − y' ∂F/∂y' = C (see Case 2 of the special cases below):

y√(1 + (y')²) − y' · yy'/√(1 + (y')²) = C
y[1 + (y')² − (y')²]/√(1 + (y')²) = C

or
y/√(1 + (y')²) = C

y = C√(1 + (y')²)
y² = C²[1 + (y')²]
y² = C² + C²(y')²
y' = √((y² − C²)/C²)
dy/dx = √(y² − C²)/C.

Using separation of variables, we have

⟹ dy/√(y² − C²) = dx/C
∫ dy/√(y² − C²) = ∫ dx/C

cosh⁻¹(y/C) = x/C + C'

y/C = cosh((x + CC')/C) = cosh((x + A)/C), where A = CC'

⟹ y = C cosh((x + A)/C),

which is a catenary; the constants A and C are determined from the points (x₁, y₁) and (x₂, y₂).
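The catenary can be checked symbolically. The sketch below (assuming SymPy) verifies that y = C cosh((x + A)/C) satisfies the first integral in the squared form y² = C²[1 + (y')²], using cosh² − sinh² = 1.

import sympy as sp

x, C, A = sp.symbols('x C A', positive=True)
y = C*sp.cosh((x + A)/C)                     # candidate catenary
residual = y**2 - C**2*(1 + sp.diff(y, x)**2)
print(sp.simplify(residual))                 # 0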

(C) The Brachistochrone Problem


Here,

J(y) = (1/√(2g)) ∫_{x₁}^{x₂} √((1 + (y')²)/y) dx.

Prior to solving this, we consider the special cases of the Euler-Lagrange equation.

CASE 1
∂F/∂y = 0

⟹ F = F(x, y'), and hence y is missing. The Euler-Lagrange equation is

∂F/∂y − d/dx(∂F/∂y') = 0
0 − d/dx(∂F/∂y') = 0
d/dx(∂F/∂y') = 0

∂F/∂y' = C.
CASE 2
∂F/∂x = 0,

and so F = F(y, y'); hence x is missing. For any differentiable F(y, y') we have, by the chain rule,

dF/dx = ∂F/∂x + ∂F/∂y · y' + ∂F/∂y' · y''.

Along extremals, ∂F/∂y = d/dx(∂F/∂y'), so

dF/dx = ∂F/∂x + y' d/dx(∂F/∂y') + ∂F/∂y' · y'' = ∂F/∂x + d/dx(y' ∂F/∂y'),

that is,

d/dx(F − y' ∂F/∂y') = ∂F/∂x.

Along extremals, when ∂F/∂x = 0, this gives

d/dx(F − y' ∂F/∂y') = 0

⟹ F − y' ∂F/∂y' = C.
CASE 3
∂F/∂y' = 0,

and hence F = F(x, y), i.e. y' is missing. The Euler-Lagrange equation becomes

0 = ∂F/∂y − d/dx(∂F/∂y') = ∂F/∂y

⟹ ∂F/∂y = 0.
For the brachistochrone problem

F = √((1 + (y')²)/y).

Since x is missing, we employ
F − y' ∂F/∂y' = C.
We write

∂F/∂y' = ∂/∂y' {(1/√y)[1 + (y')²]^{1/2}} = (1/√y) · ½[1 + (y')²]^{-1/2} · 2y'

∂F/∂y' = 2y'/(2√y √(1 + (y')²)) = y'/(√y √(1 + (y')²)).

⟹ √(1 + (y')²)/√y − y' · y'/(√y √(1 + (y')²)) = C

[1 + (y')² − (y')²]/(√y √(1 + (y')²)) = C

1/(√y √(1 + (y')²)) = C

⟹ √y √(1 + (y')²) = 1/C
y[1 + (y')²] = 1/C² = a

1 + (y')² = a/y
(y')² = (a − y)/y
y' = √((a − y)/y)

dy/dx = √((a − y)/y).
Separating the variables, we write
√(y/(a − y)) dy = dx

∫_0^x dx = ∫_0^y √(y/(a − y)) dy.

Since (0, 0) is a point on the curve, the constant of integration is zero.

Let y = a sin²θ, so that dy = 2a sinθ cosθ dθ. Then

x = ∫_0^θ √(a sin²θ/(a − a sin²θ)) · 2a sinθ cosθ dθ
  = ∫_0^θ (sinθ/cosθ) · 2a sinθ cosθ dθ
  = a ∫_0^θ 2 sin²θ dθ
  = a ∫_0^θ (1 − cos 2θ) dθ
  = (a/2)[2θ − sin 2θ].

Let a/2 = b and 2θ = φ. Then
⟹ x = b(φ − sin φ)
⟹ y = a sin²θ = (a/2)(1 − cos 2θ) = b(1 − cos φ),
which is a cycloid.
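The cycloid can likewise be verified parametrically. With x = b(φ − sin φ) and y = b(1 − cos φ), the sketch below (assuming SymPy) checks that y[1 + (dy/dx)²] equals the constant a = 2b, as required by the first integral above.

import sympy as sp

phi, b = sp.symbols('phi b', positive=True)
x = b*(phi - sp.sin(phi))                    # cycloid parametrization
y = b*(1 - sp.cos(phi))
yp = sp.diff(y, phi)/sp.diff(x, phi)         # dy/dx along the curve
print(sp.simplify(y*(1 + yp**2)))            # 2*b, i.e. the constant a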

Example
Find the extremals of the functional
J(y) = ∫_{x₁}^{x₂} (y')²/x³ dx.

Solution
F = (y')²/x³

∂F/∂y = 0,  ∂F/∂y' = 2y'/x³.

Since y is missing, the Euler-Lagrange equation becomes

d/dx(∂F/∂y') = 0
d/dx(2y'/x³) = 0.

Applying the quotient rule with u = 2y', v = x³:

du/dx = 2y'',  dv/dx = 3x²

(v du/dx − u dv/dx)/v² = (2x³y'' − 6x²y')/x⁶ = 0

⟹ x y'' − 3y' = 0

∫ (y''/y') dx = ∫ (3/x) dx = 3 ∫ dx/x

ln y' = 3 ln x + C
ln y' = ln x³ + C
y' = e^{ln x³ + C} = e^C · e^{ln x³}
y' = C₁ x³
dy/dx = C₁ x³

∫ dy = ∫ C₁ x³ dx

y = C₁ x⁴/4 + C₂

⟹ y = A x⁴ + B.
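The reduced equation x y'' − 3y' = 0 can also be handed to an ODE solver. The sketch below (assuming SymPy) reproduces the two-parameter family y = A x⁴ + B.

import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

ode = sp.Eq(x*y(x).diff(x, 2) - 3*y(x).diff(x), 0)   # x*y'' - 3*y' = 0
print(sp.dsolve(ode, y(x)))                          # y(x) = C1 + C2*x**4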

Example
Extremize
J(y) = ∫_{x₁}^{x₂} (1 + (y')²) dx,  y(x₁) = y(x₂) = 0.

Solution
Here F = 1 + (y')².

∂F/∂y = 0,  ∂F/∂y' = 2y'.

Using
∂F/∂y − d/dx(∂F/∂y') = 0:

0 − d/dx(2y') = 0
⟹ 2y'' = 0, i.e. y'' = 0.

Integrating directly twice (and absorbing the factors of 2 into the constants), y' = C and then

y(x) = Cx + D.
With the stated end conditions,

y(x₁) = C x₁ + D = 0  (i)

y(x₂) = C x₂ + D = 0  (ii)

From (i), D = −C x₁  (iii).

Substituting (iii) into (ii), we have

C x₂ − C x₁ = 0,

so C = 0 (since x₁ ≠ x₂), D = 0, and the extremal is y ≡ 0.

More generally, with end conditions y(x₁) = y₁ and y(x₂) = y₂, we let

y(x₁) = y₁ = C x₁ + D  (iv)

y(x₂) = y₂ = C x₂ + D  (v)

Subtracting (v) from (iv), we have

y₁ − y₂ = C(x₁ − x₂)
⟹ C = (y₁ − y₂)/(x₁ − x₂),

so that
y = [(y₁ − y₂)/(x₁ − x₂)] x + D.

Example
Find the curve y which extremizes the functional

∫_0^1 [(y')² + 12xy] dx,  y(0) = 0, y(1) = 1.
Solution
F = (y')² + 12xy

∂F/∂y = 12x,  ∂F/∂y' = 2y'.

Here, the Euler-Lagrange equation becomes

12x − d/dx(2y') = 0
⟹ 12x − 2y'' = 0
y'' = 6x.

Taking direct integration (twice):
y' = 3x² + C

∫ dy = ∫ 3x² dx + ∫ C dx

y = x³ + Cx + C'
y(x) = x³ + Cx + C'.

Applying the conditions:

y(0) = 0³ + C(0) + C' = 0 ⟹ C' = 0
y(1) = 1³ + C(1) + C' = 1 ⟹ C = 0

y = x³ + 0x + 0 = x³

⟹ y = x³.
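A quick SymPy check (a sketch, assuming SymPy) solves the Euler-Lagrange equation y'' = 6x directly with the given end conditions and returns the same curve.

import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

sol = sp.dsolve(sp.Eq(y(x).diff(x, 2), 6*x), y(x), ics={y(0): 0, y(1): 1})
print(sol)    # Eq(y(x), x**3)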

ISOPERIMETRIC PROBLEM
In certain problems of the calculus of variations, while extremizing a given functional J(y) with the end conditions y(x₁) = y₁ and y(x₂) = y₂, the extremizing function also has to satisfy an additional integral constraint, as in the following Dido's problem.
Dido's problem: Find the plane curve of fixed perimeter which encloses the maximum area above the x-axis.

[Figure: a curve y(x) above the x-axis joining (x₁, y₁) and (x₂, y₂).]

The perimeter and the area under the curve are given by
Perimeter = arc length = ∫_{x₁}^{x₂} √(1 + (y')²) dx

Area under the curve: A = ∫_{x₁}^{x₂} y(x) dx.

Solution
Here, we are required to maximize J(y) = ∫_{x₁}^{x₂} y(x) dx subject to ∫_{x₁}^{x₂} √(1 + (y')²) dx = l (the given perimeter), with end conditions y(x₁) = y₁ and y(x₂) = y₂.

Recall the Lagrange multiplier technique: we form

H(x, y, y') = F(x, y, y') + λG(x, y, y')

and optimize ∫_{x₁}^{x₂} H(x, y, y') dx without constraints, that is, optimize

J(y) = ∫_{x₁}^{x₂} [F(x, y, y') + λG(x, y, y')] dx

with end conditions y(x₁) = y₁ and y(x₂) = y₂.

The problem is solved by solving the Euler-Lagrange equation

∂H/∂y − d/dx(∂H/∂y') = 0,  y(x₁) = y₁ and y(x₂) = y₂.

Then,

F(x, y, y') = y(x) = y

G(x, y, y') = √(1 + (y')²)

H(x, y, y') = y + λ√(1 + (y')²),

and the problem becomes:

Maximize

J(y) = ∫_{x₁}^{x₂} [y + λ√(1 + (y')²)] dx.

∂H/∂y = 1,

∂H/∂y' = ∂/∂y'(y) + ∂/∂y'[λ√(1 + (y')²)] = 0 + λ · ½[1 + (y')²]^{-1/2} · 2y'

∂H/∂y' = λy'/√(1 + (y')²).

The Euler-Lagrange equation
∂H/∂y − d/dx(∂H/∂y') = 0
gives

⟹ 1 − d/dx(λy'/√(1 + (y')²)) = 0.

Integrating, we have

∫ d(λy'/√(1 + (y')²)) = ∫ dx

λy'/√(1 + (y')²) = x + C

λy' = (x + C)√(1 + (y')²).

Squaring both sides, we have

λ²(y')² = (x + C)²[1 + (y')²] = (x + C)² + (y')²(x + C)²

[λ² − (x + C)²](y')² = (x + C)²

⟹ (y')² = (x + C)²/[λ² − (x + C)²]

dy/dx = (x + C)/√(λ² − (x + C)²).

Integrating (note that d/dx[−√(λ² − (x + C)²)] = (x + C)/√(λ² − (x + C)²)):

y = −√(λ² − (x + C)²) + b.

Rearranging and squaring both sides, we have

(y − b)² = λ² − (x + a)², where a = C,

⟹ (x + a)² + (y − b)² = λ²,

which is (an arc of) a circle of radius λ.
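The circular arc can be verified against the first integral in its squared form λ²(y')² = (x + a)²[1 + (y')²]. The sketch below (assuming SymPy) shows that the residual vanishes identically.

import sympy as sp

x, lam, a, b = sp.symbols('x lam a b', positive=True)
y = b - sp.sqrt(lam**2 - (x + a)**2)       # lower arc of (x+a)^2 + (y-b)^2 = lam^2
yp = sp.diff(y, x)

residual = lam**2*yp**2 - (x + a)**2*(1 + yp**2)
print(sp.simplify(residual))               # 0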
PROBLEMS WITH HIGHER ORDER DERIVATIVES.

Extremize
J(y) = ∫_{x₁}^{x₂} F(x, y, y', y'') dx

y(x₁) = y₁,  y(x₂) = y₂
y'(x₁) = y₁',  y'(x₂) = y₂'.

The Euler-Poisson equation is

∂F/∂y − d/dx(∂F/∂y') + d²/dx²(∂F/∂y'') = 0.

Example
Extremize
J(y) = ∫_{x₁}^{x₂} [y² − (y'')²] dx

with end conditions

y(x₁) = y₁,  y(x₂) = y₂
y'(x₁) = y₁',  y'(x₂) = y₂'.

Solution
Here F = y² − (y'')²,

∂F/∂y = 2y,  ∂F/∂y' = 0,  ∂F/∂y'' = −2y''.

Using the Euler-Poisson equation, we write

∂F/∂y − d/dx(∂F/∂y') + d²/dx²(∂F/∂y'') = 0

⟹ 2y − d/dx(0) + d²/dx²(−2y'') = 0
2y − 2y⁗ = 0
⟹ y⁗ = y
y⁗ − y = 0,

which can be solved (its general solution is y = C₁eˣ + C₂e⁻ˣ + C₃ cos x + C₄ sin x) subject to the end conditions

y(x₁) = y₁,  y(x₂) = y₂
y'(x₁) = y₁',  y'(x₂) = y₂'.
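The fourth-order equation y⁗ − y = 0 is linear with constant coefficients; the sketch below (assuming SymPy) returns its general solution, whose four constants are then fixed by the four end conditions.

import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')

sol = sp.dsolve(sp.Eq(y(x).diff(x, 4) - y(x), 0), y(x))
print(sol)   # y(x) = C1*exp(-x) + C2*exp(x) + C3*sin(x) + C4*cos(x)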

Problems with several unknowns.

Let u and v be the unknown functions which extremize the functional J(u, v).
Extremize

J(u, v) = ∫_{x₁}^{x₂} F(x, u, v, u', v') dx

u(x₁) = u₁,  u(x₂) = u₂

v(x₁) = v₁,  v(x₂) = v₂.

The Euler-Lagrange equations are

∂F/∂u − d/dx(∂F/∂u') = 0

∂F/∂v − d/dx(∂F/∂v') = 0.

VARIATIONAL NOTATION
When the independent variable changes from x to x + δx, the resulting change in the value of y(x) defines the derivative y'(x). In variational calculus, by contrast, y(x) is changed to a new function y(x) + εη(x), where ε is a constant and η(x) is a continuously differentiable function. The change εη(x) in y(x) as a function is called the variation of y and is denoted by δy; that is,
δy = εη(x).
Similarly, δy' = εη'(x). In F(x, y, y'), a fixed change in y from y to y + εη makes F change to F(x, y + εη, y' + εη').

Thus, the change in F, denoted by ΔF, is given by

ΔF = F(x, y + εη, y' + εη') − F(x, y, y').

Expanding F(x, y + εη, y' + εη') about (x, y, y') using Taylor's series, we have

ΔF = (∂F/∂y)(εη) + (∂F/∂y')(εη') + ½[(∂²F/∂y²)(εη)² + 2(∂²F/∂y∂y')(εη)(εη') + (∂²F/∂y'²)(εη')²]

+ higher order terms.


First variation: δF = (∂F/∂y) δy + (∂F/∂y') δy'.

Second variation: δ²F = ½[(∂²F/∂y²)(δy)² + 2(∂²F/∂y∂y') δy δy' + (∂²F/∂y'²)(δy')²].

Note:
Variation is analogous to differentiation in ordinary calculus. Properties of variation:

i. δ(f₁ ± f₂) = δf₁ ± δf₂
ii. δ(f₁f₂) = f₁δf₂ + f₂δf₁
iii. δ(f₁/f₂) = (f₂δf₁ − f₁δf₂)/f₂²
iv. δx = 0 (the independent variable is not varied)

Example
If
J(Y) = ∫_{x₁}^{x₂} F(x, y, y') dx,

find the variation δJ of J.

Solution

J(Y) = ∫_{x₁}^{x₂} F(x, y, y') dx

δJ(Y) = δ(∫_{x₁}^{x₂} F(x, y, y') dx)

= ∫_{x₁}^{x₂} δF(x, y, y') dx

= ∫_{x₁}^{x₂} [(∂F/∂y) δy + (∂F/∂y') δy'] dx.

GEODESIC PROBLEM
This is one of the finite-constraint problems. Suppose that we want to minimize or maximize the functional
J(Y) = ∫_a^b F(x, y(x), y'(x)) dx
subject to the constraint g(x, y) = 0; such a problem is said to be one with a finite constraint.
For the geodesic problem, we consider a space curve (x(t), y(t), z(t)) defined for a ≤ t ≤ b; it lies on the surface described by G(x, y, z) = 0 if G(x(t), y(t), z(t)) = 0 for all t ∈ [a, b].
The geodesic problem is to find the curve of shortest length lying on the surface and connecting the points A = (a₁, a₂, a₃) and B = (b₁, b₂, b₃); the functional to be minimized is the arc length
J = ∫_a^b √(ẋ² + ẏ² + ż²) dt,  (1)
where ẋ = dx/dt, and similarly for ẏ and ż.
We assume that the equation G(x, y, z) = 0 can be written as z = g(x, y), where g has continuous partial derivatives. We may not be able to do this for the entire surface, as the equation of a sphere
G(x, y, z) = x² + y² + z² − r² = 0
illustrates, but we can usually solve for z, or one of the other variables, on part of the surface, for example on the upper or lower hemisphere. We then have
ż = g_x ẋ + g_y ẏ = g_x(x, y) ẋ(t) + g_y(x, y) ẏ(t),
where g_x = ∂g/∂x.

Lemma:
In the setting of the geodesic problem,
∂ż/∂x = d(g_x)/dt and ∂ż/∂y = d(g_y)/dt.  (2)
Proof
∂ż/∂x = ∂/∂x(g_x ẋ + g_y ẏ) = g_xx ẋ + g_yx ẏ.
We also have
d/dt(g_x) = d/dt[g_x(x, y)] = g_xx ẋ + g_xy ẏ.
Since g_xy = g_yx, the first assertion of the lemma follows; the identity ∂ż/∂y = d(g_y)/dt is obtained in the same way.
J(Y) = ∫_a^b √(ẋ² + ẏ² + ż²) dt  (1)
Substituting for ż in equation (1), we see that the problem is now to minimize the functional
J(Y) = ∫_a^b √(ẋ² + ẏ² + (g_x ẋ + g_y ẏ)²) dt,  (3)
which we write as
J(Y) = ∫_a^b F(x, ẋ, y, ẏ) dt.  (4)

Note that
the only place where x and y themselves occur is in the g_x and g_y terms. The Euler-Lagrange equations are
∂F/∂x − d/dt(∂F/∂ẋ) = 0
and
∂F/∂y − d/dt(∂F/∂ẏ) = 0.
Writing f(ẋ, ẏ, ż) = √(ẋ² + ẏ² + ż²), so that F(x, ẋ, y, ẏ) = f(ẋ, ẏ, g_xẋ + g_yẏ), and using the lemma,
∂F/∂x = (∂f/∂ż)(∂ż/∂x) = (∂f/∂ż) d(g_x)/dt
and
∂F/∂y = (∂f/∂ż) d(g_y)/dt,
we can rewrite the Euler-Lagrange equations as
d/dt(∂f/∂ẋ) + g_x d/dt(∂f/∂ż) = 0
and
d/dt(∂f/∂ẏ) + g_y d/dt(∂f/∂ż) = 0.

To see this, we reason as follows.

First,
∂F/∂ẋ = ∂f/∂ẋ + (∂f/∂ż)(∂ż/∂ẋ) = ∂f/∂ẋ + g_x ∂f/∂ż,
so
d/dt(∂F/∂ẋ) = d/dt(∂f/∂ẋ) + d/dt(g_x ∂f/∂ż)
= d/dt(∂f/∂ẋ) + g_x d/dt(∂f/∂ż) + (∂f/∂ż) d(g_x)/dt.
Subtracting this from ∂F/∂x = (∂f/∂ż) d(g_x)/dt, the terms (∂f/∂ż) d(g_x)/dt cancel, which gives the first rewritten equation above; the second follows in the same way.

Now, let the function λ(t) be defined by

d/dt(∂f/∂ż) = λ(t) G_z,

and note that g_x = −G_x/G_z and g_y = −G_y/G_z (obtained by differentiating G(x, y, g(x, y)) = 0). Then the equations become
d/dt(∂f/∂ẋ) = λ(t) G_x
and
d/dt(∂f/∂ẏ) = λ(t) G_y.

Eliminating λ(t) and extending the result to include z as well, we have

[d/dt(∂f/∂ẋ)]/G_x = [d/dt(∂f/∂ẏ)]/G_y = [d/dt(∂f/∂ż)]/G_z.  (5)

Example
Let the surface be a sphere with equation
0 = G(x, y, z) = x² + y² + z² − r²,
so that G_x = 2x, G_y = 2y, G_z = 2z. For f = √(ẋ² + ẏ² + ż²) we have ∂f/∂ẋ = ẋ/f, so d/dt(∂f/∂ẋ) = (fẍ − ẋḟ)/f², and equation (5) becomes
(fẍ − ẋḟ)/(2x f²) = (fÿ − ẏḟ)/(2y f²) = (fz̈ − żḟ)/(2z f²).
We can rewrite these equations as
(ẍy − xÿ)/(ẋy − xẏ) = (ÿz − yz̈)/(ẏz − yż) = ḟ/f.
In each ratio the numerator is the t-derivative of the denominator, which leads to

ln|xẏ − yẋ| = ln|yż − zẏ| + C.

Therefore,

xẏ − yẋ = C₁(yż − zẏ).

Rewriting, we obtain
(ẋ + C₁ż)/(x + C₁z) = ẏ/y,

and integrating once more,
x + C₁z = C₂ y,

which is a plane through the origin. The geodesic lies in the intersection of this plane with the sphere, i.e. it is an arc of a great circle.


HARMILTON’S PRINCIPLE
Let T be the kinetic energy and V be the potential energy of a particle in motion.
Let L=T – V be the kinetic potential or the Lagrangian function.
Let
t2

A=∫ Ldt
t1

( Actionintegral)
a=0 ( priciple of least action )

(∫ ) (∫ )
t2 t2

δ L dt =δ T – V dt =0
t1 t1

Hamilton’s principle state that the motion is such that the integral of the difference between
kinetic and potential energies is stationary for the true path.
Over a sufficiently small-time interval, the integral is a minimal. That is, nature tends to
equalize the kinetic and potential energy motion.
Hence,
δA =0 along the path.
Example
Reduce the B.V.P
y'' − y + x = 0,  y(0) = y(1) = 0
into a variational problem.
Solution
Consider
y'' − y + x = 0  (1)
y(0) = y(1) = 0.
Multiply both sides of (1) by δy and integrate over (0,1)

∫_0^1 (y'' − y + x) δy dx = 0

∫_0^1 y'' δy dx − ∫_0^1 y δy dx + ∫_0^1 x δy dx = 0.

Applying integration by parts to the first integral, we have

[y' δy]_0^1 − ∫_0^1 y' δy' dx − ∫_0^1 y δy dx + ∫_0^1 x δy dx = 0,

and the boundary term vanishes because δy(0) = δy(1) = 0.

But

δ[(y')²] = 2y' δy',  δ(y²) = 2y δy,

and
δ(xy) = x δy.

Hence

−∫_0^1 ½ δ[(y')²] dx − ∫_0^1 ½ δ(y²) dx + ∫_0^1 δ(xy) dx = 0

⟹ δ(∫_0^1 [−½(y')² − ½y² + xy] dx) = 0

⟹ δ(∫_0^1 [(y')² + y² − 2xy] dx) = 0,

which is of the form δJ(Y) = 0.


Thus, the corresponding variational problem is:
Extremize

J(Y) = ∫_0^1 [(y')² + y² − 2xy] dx,  y(0) = 0, y(1) = 0.

If we form the Euler-Lagrange equation of the above variational functional, we have

F = (y')² + y² − 2xy

∂F/∂y = 2y − 2x

∂F/∂y' = 2y'

∂F/∂y − d/dx(∂F/∂y') = 0

⟹ 2y − 2x − d/dx(2y') = 0
2y − 2x − 2y'' = 0.

Dividing through by −2, we have
−y + x + y'' = 0
⟹ y'' − y + x = 0.
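The same verification can be done symbolically. The sketch below (assuming SymPy) builds ∂F/∂y − d/dx(∂F/∂y') for F = (y')² + y² − 2xy and recovers a multiple of y'' − y + x.

import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')
yp = y(x).diff(x)

F = yp**2 + y(x)**2 - 2*x*y(x)
EL = sp.diff(F, y(x)) - sp.diff(sp.diff(F, yp), x)
print(sp.simplify(EL))   # 2*y(x) - 2*x - 2*y''(x); dividing by -2 gives y'' - y + x = 0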
Example
Reduce the B.V.P
d/dx(x dy/dx) + y = x

y(0) = 0,  y(1) = 1
into a variational problem.
Solution
Here
d/dx(x dy/dx) + y = x.

Multiplying by δy and integrating over (0, 1), we have


∫_0^1 [d/dx(x y')] δy dx + ∫_0^1 y δy dx − ∫_0^1 x δy dx = 0.

Integrating the first term by parts, and using δy(0) = δy(1) = 0 so that the boundary term [x y' δy]_0^1 vanishes, we get

−∫_0^1 x y' δy' dx + ∫_0^1 y δy dx − ∫_0^1 x δy dx = 0.

Using x y' δy' = ½ δ[x(y')²] (x is not varied), y δy = ½ δ(y²) and x δy = δ(xy):

∫_0^1 δ[−½ x(y')² + ½ y² − xy] dx = 0

δ(∫_0^1 [−x(y')² + y² − 2xy] dx) = 0

⟹ δJ(Y) = 0,

where

J(Y) = ∫_0^1 [−x(y')² + y² − 2xy] dx

y(0) = 0,  y(1) = 1.
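As with the previous example, the reduction can be checked in reverse. The sketch below (assuming SymPy) forms the Euler-Lagrange expression of F = −x(y')² + y² − 2xy and recovers the original equation d/dx(x y') + y − x = 0 up to a factor of 2.

import sympy as sp

x = sp.symbols('x')
y = sp.Function('y')
yp = y(x).diff(x)

F = -x*yp**2 + y(x)**2 - 2*x*y(x)
EL = sp.diff(F, y(x)) - sp.diff(sp.diff(F, yp), x)
print(sp.expand(EL))   # 2*x*y'' + 2*y' + 2*y - 2*x = 2*[d/dx(x*y') + y - x]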
