50 Problems Reformatted
Divergence of a product: Given that $\varphi$ is a scalar field and $\mathbf v$ a vector field, show that
$$\mathrm{div}(\varphi\mathbf v) = (\mathrm{grad}\,\varphi)\cdot\mathbf v + \varphi\,\mathrm{div}\,\mathbf v$$
$$\mathrm{grad}(\varphi\mathbf v) = (\varphi v^i)_{,j}\,\mathbf g_i\otimes\mathbf g^j = \varphi_{,j}v^i\,\mathbf g_i\otimes\mathbf g^j + \varphi\,v^i{}_{,j}\,\mathbf g_i\otimes\mathbf g^j = \mathbf v\otimes(\mathrm{grad}\,\varphi) + \varphi\,\mathrm{grad}\,\mathbf v$$
Now, $\mathrm{div}(\varphi\mathbf v) = \mathrm{tr}\bigl(\mathrm{grad}(\varphi\mathbf v)\bigr)$. Taking the trace of the above, we have:
$$\mathrm{div}(\varphi\mathbf v) = \mathbf v\cdot(\mathrm{grad}\,\varphi) + \varphi\,\mathrm{div}\,\mathbf v$$
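A minimal symbolic sanity check of this identity, assuming Cartesian coordinates (so covariant derivatives reduce to partial derivatives) and arbitrary illustrative fields, can be run with sympy:

```python
# Symbolic check of div(phi*v) = grad(phi).v + phi*div(v) in Cartesian coordinates.
import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)

phi = x**2*y + sp.sin(z)                      # arbitrary scalar field (assumed example)
v = sp.Matrix([y*z, x**3, x*y*z])             # arbitrary vector field (assumed example)

div = lambda w: sum(sp.diff(w[i], X[i]) for i in range(3))
grad_phi = sp.Matrix([sp.diff(phi, s) for s in X])

lhs = div(phi*v)
rhs = grad_phi.dot(v) + phi*div(v)
print(sp.simplify(lhs - rhs))                 # 0 confirms the identity for this example
```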
5. Given a scalar point function $\phi$ and a vector field $\mathbf v$, show that $\mathrm{curl}(\phi\mathbf v) = \phi\,\mathrm{curl}\,\mathbf v + (\mathrm{grad}\,\phi)\times\mathbf v$.
$$\mathrm{curl}(\phi\mathbf v) = \epsilon^{ijk}(\phi v_k)_{,j}\,\mathbf g_i = \epsilon^{ijk}(\phi_{,j}v_k + \phi v_{k,j})\,\mathbf g_i = \epsilon^{ijk}\phi_{,j}v_k\,\mathbf g_i + \epsilon^{ijk}\phi v_{k,j}\,\mathbf g_i = (\nabla\phi)\times\mathbf v + \phi\,\mathrm{curl}\,\mathbf v$$
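The same kind of quick check works for the curl identity; again the fields below are arbitrary assumed examples in Cartesian coordinates:

```python
# Symbolic check of curl(phi*v) = grad(phi) x v + phi*curl(v) in Cartesian coordinates.
import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)

phi = x*y*z                                   # arbitrary scalar field (assumed example)
v = sp.Matrix([y**2, z*x, sp.cos(x*y)])       # arbitrary vector field (assumed example)

def curl(w):
    return sp.Matrix([sp.diff(w[2], y) - sp.diff(w[1], z),
                      sp.diff(w[0], z) - sp.diff(w[2], x),
                      sp.diff(w[1], x) - sp.diff(w[0], y)])

grad_phi = sp.Matrix([sp.diff(phi, s) for s in X])
lhs = curl(phi*v)
rhs = grad_phi.cross(v) + phi*curl(v)
print(sp.simplify(lhs - rhs).T)               # zero vector confirms the identity
```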
7. For a scalar field $\phi$ and a tensor field $\mathbf T$, show that $\mathrm{grad}(\phi\mathbf T) = \phi\,\mathrm{grad}\,\mathbf T + \mathbf T\otimes\mathrm{grad}\,\phi$. Also show that $\mathrm{div}(\phi\mathbf T) = \phi\,\mathrm{div}\,\mathbf T + \mathbf T\,\mathrm{grad}\,\phi$.
$$\mathrm{grad}(\phi\mathbf T) = (\phi T^{ij})_{,k}\,\mathbf g_i\otimes\mathbf g_j\otimes\mathbf g^k = (\phi_{,k}T^{ij} + \phi T^{ij}{}_{,k})\,\mathbf g_i\otimes\mathbf g_j\otimes\mathbf g^k = \mathbf T\otimes\mathrm{grad}\,\phi + \phi\,\mathrm{grad}\,\mathbf T$$
Furthermore, we can contract the last two bases and obtain
$$\mathrm{div}(\phi\mathbf T) = (\phi_{,k}T^{ij} + \phi T^{ij}{}_{,k})\,\mathbf g_i\,(\mathbf g_j\cdot\mathbf g^k) = (\phi_{,k}T^{ij} + \phi T^{ij}{}_{,k})\,\mathbf g_i\,\delta_j^k = T^{ik}\phi_{,k}\,\mathbf g_i + \phi T^{ik}{}_{,k}\,\mathbf g_i = \mathbf T\,\mathrm{grad}\,\phi + \phi\,\mathrm{div}\,\mathbf T$$
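A sketch of a symbolic check, under the assumption that the divergence of a tensor is the contraction on its second index, $(\mathrm{div}\,\mathbf T)_i = \partial T_{ij}/\partial x_j$, in Cartesian coordinates; the scalar and tensor fields are arbitrary examples:

```python
# Symbolic check of div(phi*T) = phi*div(T) + T*grad(phi) in Cartesian coordinates,
# with div(T) taken as the contraction on the second index: (div T)_i = d T_ij / d x_j.
import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)

phi = x*y + z**2                                        # arbitrary scalar field (assumed)
T = sp.Matrix([[x*y, z, y**2],
               [sp.sin(x), x*z, y],
               [z**2, x + y, x*y*z]])                   # arbitrary tensor field (assumed)

div_T = lambda A: sp.Matrix([sum(sp.diff(A[i, j], X[j]) for j in range(3)) for i in range(3)])
grad_phi = sp.Matrix([sp.diff(phi, s) for s in X])

lhs = div_T(phi*T)
rhs = phi*div_T(T) + T*grad_phi
print(sp.simplify(lhs - rhs).T)                         # zero vector confirms the identity
```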
9. For a vector field $\mathbf u$, show that $\mathrm{grad}(\mathbf u\times)$ is a third-order tensor. Hence or otherwise show that $\mathrm{div}(\mathbf u\times) = -\mathrm{curl}\,\mathbf u$.
The second-order tensor $(\mathbf u\times)$ is defined as $\epsilon^{ijk}u_j\,\mathbf g_i\otimes\mathbf g_k$. Taking the covariant derivative with an independent base, we have
$$\mathrm{grad}(\mathbf u\times) = \epsilon^{ijk}u_{j,l}\,\mathbf g_i\otimes\mathbf g_k\otimes\mathbf g^l$$
This gives a third-order tensor, as we have seen. Contracting on the last two bases,
$$\mathrm{div}(\mathbf u\times) = \epsilon^{ijk}u_{j,l}\,\mathbf g_i\,(\mathbf g_k\cdot\mathbf g^l) = \epsilon^{ijk}u_{j,l}\,\mathbf g_i\,\delta_k^l = \epsilon^{ijk}u_{j,k}\,\mathbf g_i = -\mathrm{curl}\,\mathbf u$$
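A quick Cartesian check that the divergence of the tensor $(\mathbf u\times)$ is the negative curl of $\mathbf u$, with an arbitrary assumed field:

```python
# Symbolic check that div(u x), with (u x)_ik = eps_ijk u_j, equals -curl(u) in Cartesian coordinates.
import sympy as sp
from sympy import LeviCivita

x, y, z = sp.symbols('x y z')
X = (x, y, z)
u = sp.Matrix([y*z**2, sp.sin(x*y), x + z])   # arbitrary vector field (assumed example)

# (u x) as a matrix: W[i,k] = eps_ijk u_j
W = sp.Matrix(3, 3, lambda i, k: sum(LeviCivita(i, j, k)*u[j] for j in range(3)))

div_W = sp.Matrix([sum(sp.diff(W[i, k], X[k]) for k in range(3)) for i in range(3)])
curl_u = sp.Matrix([sum(LeviCivita(i, j, k)*sp.diff(u[k], X[j]) for j in range(3) for k in range(3))
                    for i in range(3)])
print(sp.simplify(div_W + curl_u).T)          # zero vector confirms div(u x) = -curl u
```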
14. Given a scalar point function $\phi$ and a vector field $\mathbf v$, show that $\mathrm{curl}(\phi\mathbf v) = \phi\,\mathrm{curl}\,\mathbf v + (\nabla\phi)\times\mathbf v$.
$$\mathrm{curl}(\phi\mathbf v) = \epsilon^{ijk}(\phi v_k)_{,j}\,\mathbf g_i = \epsilon^{ijk}(\phi_{,j}v_k + \phi v_{k,j})\,\mathbf g_i = \epsilon^{ijk}\phi_{,j}v_k\,\mathbf g_i + \epsilon^{ijk}\phi v_{k,j}\,\mathbf g_i = (\nabla\phi)\times\mathbf v + \phi\,\mathrm{curl}\,\mathbf v$$
20. Show that $g_{\gamma i}\,\epsilon^{\alpha\beta\gamma}\epsilon^{ijk} = g^{\alpha j}g^{\beta k} - g^{\alpha k}g^{\beta j}$.
Note that
$$g_{\gamma i}\,\epsilon^{\alpha\beta\gamma}\epsilon^{ijk} = g_{\gamma i}\begin{vmatrix} g^{i\alpha} & g^{i\beta} & g^{i\gamma}\\ g^{j\alpha} & g^{j\beta} & g^{j\gamma}\\ g^{k\alpha} & g^{k\beta} & g^{k\gamma}\end{vmatrix} = \begin{vmatrix} g_{\gamma i}g^{i\alpha} & g_{\gamma i}g^{i\beta} & g_{\gamma i}g^{i\gamma}\\ g^{j\alpha} & g^{j\beta} & g^{j\gamma}\\ g^{k\alpha} & g^{k\beta} & g^{k\gamma}\end{vmatrix} = \begin{vmatrix} \delta^{\alpha}_{\gamma} & \delta^{\beta}_{\gamma} & \delta^{\gamma}_{\gamma}\\ g^{j\alpha} & g^{j\beta} & g^{j\gamma}\\ g^{k\alpha} & g^{k\beta} & g^{k\gamma}\end{vmatrix}$$
Expanding along the first row (the dummy index $\gamma$ is summed),
$$= \delta^{\alpha}_{\gamma}\begin{vmatrix} g^{j\beta} & g^{j\gamma}\\ g^{k\beta} & g^{k\gamma}\end{vmatrix} - \delta^{\beta}_{\gamma}\begin{vmatrix} g^{j\alpha} & g^{j\gamma}\\ g^{k\alpha} & g^{k\gamma}\end{vmatrix} + \delta^{\gamma}_{\gamma}\begin{vmatrix} g^{j\alpha} & g^{j\beta}\\ g^{k\alpha} & g^{k\beta}\end{vmatrix}$$
$$= \begin{vmatrix} g^{j\beta} & g^{j\alpha}\\ g^{k\beta} & g^{k\alpha}\end{vmatrix} - \begin{vmatrix} g^{j\alpha} & g^{j\beta}\\ g^{k\alpha} & g^{k\beta}\end{vmatrix} + 3\begin{vmatrix} g^{j\alpha} & g^{j\beta}\\ g^{k\alpha} & g^{k\beta}\end{vmatrix} = \begin{vmatrix} g^{j\alpha} & g^{j\beta}\\ g^{k\alpha} & g^{k\beta}\end{vmatrix} = g^{\alpha j}g^{\beta k} - g^{\alpha k}g^{\beta j}$$
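A numerical spot check of this identity for a randomly generated metric, assuming the usual convention $\epsilon^{ijk} = e_{ijk}/\sqrt{\det g}$ for the contravariant Levi-Civita tensor ($e$ being the permutation symbol):

```python
# Numerical check of g_{gamma i} eps^{alpha beta gamma} eps^{ijk} = g^{aj} g^{bk} - g^{ak} g^{bj}
# for a random symmetric positive definite covariant metric.
import numpy as np
import itertools

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
g_lo = A @ A.T + 3*np.eye(3)                 # covariant metric g_ij (SPD by construction)
g_hi = np.linalg.inv(g_lo)                   # contravariant metric g^ij
sqrtg = np.sqrt(np.linalg.det(g_lo))

e = np.zeros((3, 3, 3))                      # permutation symbol e_ijk
for p in itertools.permutations(range(3)):
    e[p] = np.linalg.det(np.eye(3)[list(p)])
eps_hi = e / sqrtg                           # eps^{ijk}

lhs = np.einsum('ci,abc,ijk->abjk', g_lo, eps_hi, eps_hi)
rhs = np.einsum('aj,bk->abjk', g_hi, g_hi) - np.einsum('ak,bj->abjk', g_hi, g_hi)
print(np.allclose(lhs, rhs))                 # True confirms the identity numerically
```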
21. Given that $\varphi(t) = \lvert\mathbf A(t)\rvert$, show that $\dot{\varphi}(t) = \dfrac{\mathbf A(t)}{\lvert\mathbf A(t)\rvert}:\dot{\mathbf A}$.
$$\varphi^2 \equiv \mathbf A:\mathbf A$$
Now,
$$\frac{d}{dt}(\varphi^2) = 2\varphi\frac{d\varphi}{dt} = \frac{d\mathbf A}{dt}:\mathbf A + \mathbf A:\frac{d\mathbf A}{dt} = 2\mathbf A:\frac{d\mathbf A}{dt}$$
as the inner product is commutative. We can therefore write that
$$\frac{d\varphi}{dt} = \frac{\mathbf A}{\varphi}:\frac{d\mathbf A}{dt} = \frac{\mathbf A}{\lvert\mathbf A(t)\rvert}:\dot{\mathbf A}$$
as required.
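A finite-difference sketch of this result for an arbitrarily chosen matrix-valued function $\mathbf A(t)$:

```python
# Finite-difference check that d|A(t)|/dt = (A/|A|) : dA/dt for an assumed example A(t).
import numpy as np

def A(t):
    return np.array([[np.sin(t), t**2, 1.0],
                     [t, np.cos(t), 2*t],
                     [0.5, t**3, np.exp(-t)]])

def Adot(t, h=1e-6):                       # numerical dA/dt by central difference
    return (A(t + h) - A(t - h)) / (2*h)

t0, h = 0.7, 1e-6
norm = lambda M: np.sqrt(np.trace(M @ M.T))

lhs = (norm(A(t0 + h)) - norm(A(t0 - h))) / (2*h)          # phi_dot by central difference
rhs = np.sum(A(t0)/norm(A(t0)) * Adot(t0))                 # (A/|A|) : A_dot
print(abs(lhs - rhs))                                      # near zero, up to finite-difference error
```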
22. Given a tensor field $\mathbf T$, obtain the vector $\mathbf w \equiv \mathbf T^{\mathrm T}\mathbf v$ and show that its divergence is $\mathbf T:(\nabla\mathbf v) + \mathbf v\cdot\mathrm{div}\,\mathbf T$.
The gradient of $\mathbf w$ is the tensor $(T_{ji}v^j)_{,k}\,\mathbf g^i\otimes\mathbf g^k$. Therefore the divergence of $\mathbf w$ (the trace of the gradient) is the scalar sum $T_{ji}v^j{}_{,k}\,g^{ik} + T_{ji,k}v^jg^{ik}$. Expanding, we obtain
$$\mathrm{div}(\mathbf T^{\mathrm T}\mathbf v) = T_{ji}v^j{}_{,k}\,g^{ik} + T_{ji,k}v^jg^{ik} = T_j{}^k{}_{,k}\,v^j + T_j{}^k\,v^j{}_{,k} = (\mathrm{div}\,\mathbf T)\cdot\mathbf v + \mathrm{tr}\bigl(\mathbf T^{\mathrm T}\,\mathrm{grad}\,\mathbf v\bigr) = (\mathrm{div}\,\mathbf T)\cdot\mathbf v + \mathbf T:(\mathrm{grad}\,\mathbf v)$$
Recall that the scalar product of two vectors is commutative, so that
$$\mathrm{div}(\mathbf T^{\mathrm T}\mathbf v) = \mathbf T:(\mathrm{grad}\,\mathbf v) + \mathbf v\cdot\mathrm{div}\,\mathbf T$$
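A symbolic check in Cartesian coordinates, assuming the conventions $(\mathrm{grad}\,\mathbf v)_{ij} = \partial v_i/\partial x_j$ and $(\mathrm{div}\,\mathbf T)_i = \partial T_{ij}/\partial x_j$ used above; the fields are arbitrary examples:

```python
# Symbolic check of div(T^T v) = T : grad(v) + v . div(T) in Cartesian coordinates.
import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)
v = sp.Matrix([x*y, z**2, sp.sin(y)])                  # arbitrary vector field (assumed)
T = sp.Matrix([[x, y*z, z],
               [x**2, y, sp.cos(z)],
               [x + z, y**2, x*y]])                    # arbitrary tensor field (assumed)

grad_v = sp.Matrix(3, 3, lambda i, j: sp.diff(v[i], X[j]))
div_T = sp.Matrix([sum(sp.diff(T[i, j], X[j]) for j in range(3)) for i in range(3)])
div_vec = lambda w: sum(sp.diff(w[i], X[i]) for i in range(3))

lhs = div_vec(T.T*v)
rhs = sum(T.multiply_elementwise(grad_v)) + v.dot(div_T)
print(sp.simplify(lhs - rhs))                          # 0 confirms the identity
```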
23. For a second-order tensor $\mathbf T$, define $\mathrm{curl}\,\mathbf T \equiv \epsilon^{ijk}T_{\alpha k,j}\,\mathbf g_i\otimes\mathbf g^{\alpha}$. Show that for any constant vector $\mathbf a$, $(\mathrm{curl}\,\mathbf T)\,\mathbf a = \mathrm{curl}(\mathbf T^{\mathrm T}\mathbf a)$.
Express the vector $\mathbf a$ in invariant form with contravariant components as $\mathbf a = a^{\beta}\mathbf g_{\beta}$. It follows that
$$(\mathrm{curl}\,\mathbf T)\,\mathbf a = \epsilon^{ijk}T_{\alpha k,j}(\mathbf g_i\otimes\mathbf g^{\alpha})\mathbf a = \epsilon^{ijk}T_{\alpha k,j}\,a^{\beta}(\mathbf g_i\otimes\mathbf g^{\alpha})\mathbf g_{\beta} = \epsilon^{ijk}T_{\alpha k,j}\,a^{\beta}\,\mathbf g_i\,\delta^{\alpha}_{\beta} = \epsilon^{ijk}T_{\alpha k,j}\,a^{\alpha}\,\mathbf g_i = \epsilon^{ijk}(T_{\alpha k}a^{\alpha})_{,j}\,\mathbf g_i$$
the last equality resulting from the fact that $\mathbf a$ is a constant vector. Since $T_{\alpha k}a^{\alpha}$ are the covariant components of $\mathbf T^{\mathrm T}\mathbf a$, clearly
$$(\mathrm{curl}\,\mathbf T)\,\mathbf a = \mathrm{curl}(\mathbf T^{\mathrm T}\mathbf a)$$
24. For any two vectors $\mathbf u$ and $\mathbf v$, show that $\mathrm{curl}(\mathbf u\otimes\mathbf v) = [(\mathrm{grad}\,\mathbf u)(\mathbf v\times)]^{\mathrm T} + (\mathrm{curl}\,\mathbf v)\otimes\mathbf u$, where $\mathbf v\times$ is the skew tensor $\epsilon^{ikj}v_k\,\mathbf g_i\otimes\mathbf g_j$.
Recall that the curl of a tensor $\mathbf T$ is defined by $\mathrm{curl}\,\mathbf T \equiv \epsilon^{ijk}T_{\alpha k,j}\,\mathbf g_i\otimes\mathbf g^{\alpha}$. Clearly, therefore,
$$\mathrm{curl}(\mathbf u\otimes\mathbf v) = \epsilon^{ijk}(u_{\alpha}v_k)_{,j}\,\mathbf g_i\otimes\mathbf g^{\alpha} = \epsilon^{ijk}(u_{\alpha,j}v_k + u_{\alpha}v_{k,j})\,\mathbf g_i\otimes\mathbf g^{\alpha}$$
$$= \epsilon^{ijk}u_{\alpha,j}v_k\,\mathbf g_i\otimes\mathbf g^{\alpha} + \epsilon^{ijk}u_{\alpha}v_{k,j}\,\mathbf g_i\otimes\mathbf g^{\alpha} = (\epsilon^{ijk}v_k\,\mathbf g_i\otimes\mathbf g_j)(u_{\alpha,\beta}\,\mathbf g^{\beta}\otimes\mathbf g^{\alpha}) + (\epsilon^{ijk}v_{k,j}\,\mathbf g_i)\otimes(u_{\alpha}\mathbf g^{\alpha})$$
$$= -(\mathbf v\times)(\mathrm{grad}\,\mathbf u)^{\mathrm T} + (\mathrm{curl}\,\mathbf v)\otimes\mathbf u = [(\mathrm{grad}\,\mathbf u)(\mathbf v\times)]^{\mathrm T} + (\mathrm{curl}\,\mathbf v)\otimes\mathbf u$$
upon noting that the vector cross is a skew tensor.
26. Given a scalar point function $\phi$ and a second-order tensor field $\mathbf T$, show that $\mathrm{curl}(\phi\mathbf T) = \phi\,\mathrm{curl}\,\mathbf T + ((\nabla\phi)\times)\,\mathbf T^{\mathrm T}$, where $[(\nabla\phi)\times]$ is the skew tensor $\epsilon^{ijk}\phi_{,j}\,\mathbf g_i\otimes\mathbf g_k$.
$$\mathrm{curl}(\phi\mathbf T) \equiv \epsilon^{ijk}(\phi T_{\alpha k})_{,j}\,\mathbf g_i\otimes\mathbf g^{\alpha} = \epsilon^{ijk}(\phi_{,j}T_{\alpha k} + \phi T_{\alpha k,j})\,\mathbf g_i\otimes\mathbf g^{\alpha}$$
$$= \epsilon^{ijk}\phi_{,j}T_{\alpha k}\,\mathbf g_i\otimes\mathbf g^{\alpha} + \phi\,\epsilon^{ijk}T_{\alpha k,j}\,\mathbf g_i\otimes\mathbf g^{\alpha} = (\epsilon^{ijk}\phi_{,j}\,\mathbf g_i\otimes\mathbf g_k)(T_{\alpha\beta}\,\mathbf g^{\beta}\otimes\mathbf g^{\alpha}) + \phi\,\epsilon^{ijk}T_{\alpha k,j}\,\mathbf g_i\otimes\mathbf g^{\alpha}$$
$$= \phi\,\mathrm{curl}\,\mathbf T + ((\nabla\phi)\times)\,\mathbf T^{\mathrm T}$$
28. Show that if $\varphi$ is defined in the space spanned by orthonormal coordinates $x^i$, then
$$\nabla^2(x^i\varphi) = 2\frac{\partial\varphi}{\partial x^i} + x^i\nabla^2\varphi.$$
By definition, $\nabla^2(x^i\varphi) = g^{jk}(x^i\varphi)_{,jk}$. Expanding, we have
$$g^{jk}(x^i\varphi)_{,jk} = g^{jk}\bigl(x^i{}_{,j}\varphi + x^i\varphi_{,j}\bigr)_{,k} = g^{jk}\bigl(\delta^i_j\varphi + x^i\varphi_{,j}\bigr)_{,k} = g^{jk}\bigl(\delta^i_j\varphi_{,k} + x^i{}_{,k}\varphi_{,j} + x^i\varphi_{,jk}\bigr) = g^{jk}\bigl(\delta^i_j\varphi_{,k} + \delta^i_k\varphi_{,j} + x^i\varphi_{,jk}\bigr) = g^{ik}\varphi_{,k} + g^{ij}\varphi_{,j} + x^ig^{jk}\varphi_{,jk}$$
When the coordinates are orthogonal this becomes
$$\frac{2}{(h_i)^2}\frac{\partial\varphi}{\partial x^i} + x^i\nabla^2\varphi$$
where the summation rule is suspended on $i$ and $h_i$ is the square root of the appropriate metric tensor component. For orthonormal coordinates $h_i = 1$, and the stated result follows.
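A quick Cartesian (orthonormal, so every $h_i = 1$) check with an arbitrary scalar field:

```python
# Symbolic check of Laplacian(x_i * phi) = 2*d(phi)/dx_i + x_i*Laplacian(phi) in Cartesian coordinates.
import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)
phi = sp.exp(x)*sp.sin(y*z) + x**2*z          # arbitrary scalar field (assumed example)

lap = lambda f: sum(sp.diff(f, s, 2) for s in X)

for i, xi in enumerate(X):
    lhs = lap(xi*phi)
    rhs = 2*sp.diff(phi, xi) + xi*lap(phi)
    print(i, sp.simplify(lhs - rhs))          # each difference is 0
```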
29. In Cartesian coordinates, if the volume $V$ is enclosed by the surface $S$, the position vector is $\mathbf r = x^i\mathbf g_i$ and $\mathbf n$ is the external unit normal to each surface element, show that $\frac16\int_S\nabla(\mathbf r\cdot\mathbf r)\cdot\mathbf n\,dS$ equals the volume contained in $V$.
$$\mathbf r\cdot\mathbf r = x^ix^j\,\mathbf g_i\cdot\mathbf g_j = x^ix^jg_{ij}$$
By the Divergence Theorem,
$$\int_S\nabla(\mathbf r\cdot\mathbf r)\cdot\mathbf n\,dS = \int_V \mathrm{div}\bigl[\nabla(\mathbf r\cdot\mathbf r)\bigr]\,dV = \int_V \partial_l\bigl[g_{ij}\bigl(x^i{}_{,k}x^j + x^ix^j{}_{,k}\bigr)\bigr]\,\mathbf g^l\cdot\mathbf g^k\,dV$$
$$= \int_V g_{ij}g^{lk}\bigl(\delta^i_kx^j + x^i\delta^j_k\bigr)_{,l}\,dV = \int_V 2g_{ik}g^{lk}x^i{}_{,l}\,dV = \int_V 2\delta^l_i\delta^i_l\,dV = 6\int_V dV$$
Dividing through by 6, $\frac16\int_S\nabla(\mathbf r\cdot\mathbf r)\cdot\mathbf n\,dS = \int_V dV = V$, as required.
30. For any Euclidean coordinate system, show that $\mathrm{div}(\mathbf u\times\mathbf v) = \mathbf v\cdot\mathrm{curl}\,\mathbf u - \mathbf u\cdot\mathrm{curl}\,\mathbf v$.
Given the contravariant vectors $u^i$ and $v^i$ with their associated (covariant) components $u_i$ and $v_i$, the contravariant component of the above cross product is $\epsilon^{ijk}u_jv_k$. The required divergence is simply the contraction of the covariant $x^i$ derivative of this quantity:
$$(\epsilon^{ijk}u_jv_k)_{,i} = \epsilon^{ijk}u_{j,i}v_k + \epsilon^{ijk}u_jv_{k,i}$$
where we have treated the tensor $\epsilon^{ijk}$ as a constant under the covariant derivative. Cyclically rearranging the RHS, we obtain
$$(\epsilon^{ijk}u_jv_k)_{,i} = v_k\,\epsilon^{kij}u_{j,i} + u_j\,\epsilon^{jki}v_{k,i} = v_k\,\epsilon^{kij}u_{j,i} - u_j\,\epsilon^{jik}v_{k,i}$$
where we have used the antisymmetry of the tensor $\epsilon^{ijk}$. The last expression shows clearly that
$$\mathrm{div}(\mathbf u\times\mathbf v) = \mathbf v\cdot\mathrm{curl}\,\mathbf u - \mathbf u\cdot\mathrm{curl}\,\mathbf v$$
as required.
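A symbolic Cartesian check with arbitrary assumed fields:

```python
# Symbolic check of div(u x v) = v . curl(u) - u . curl(v) in Cartesian coordinates.
import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)
u = sp.Matrix([x*y, y*z, z*x])                # arbitrary vector fields (assumed examples)
v = sp.Matrix([sp.sin(y), x**2, z**3])

def curl(w):
    return sp.Matrix([sp.diff(w[2], y) - sp.diff(w[1], z),
                      sp.diff(w[0], z) - sp.diff(w[2], x),
                      sp.diff(w[1], x) - sp.diff(w[0], y)])

div = lambda w: sum(sp.diff(w[i], X[i]) for i in range(3))

lhs = div(u.cross(v))
rhs = v.dot(curl(u)) - u.dot(curl(v))
print(sp.simplify(lhs - rhs))                 # 0 confirms the identity
```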
31. For a general tensor field $\mathbf T$ show that
$$\mathrm{curl}(\mathrm{curl}\,\mathbf T) = \bigl[\nabla^2(\mathrm{tr}\,\mathbf T) - \mathrm{div}(\mathrm{div}\,\mathbf T)\bigr]\mathbf I + \mathrm{grad}(\mathrm{div}\,\mathbf T) + \bigl(\mathrm{grad}(\mathrm{div}\,\mathbf T)\bigr)^{\mathrm T} - \mathrm{grad}\bigl(\mathrm{grad}(\mathrm{tr}\,\mathbf T)\bigr) - \nabla^2\mathbf T^{\mathrm T}$$
$$\mathrm{curl}\,\mathbf T = \epsilon^{\alpha st}T_{\beta t,s}\,\mathbf g_{\alpha}\otimes\mathbf g^{\beta} = S^{\alpha}{}_{\beta}\,\mathbf g_{\alpha}\otimes\mathbf g^{\beta},\qquad \mathrm{curl}\,\mathbf S = \epsilon^{ijk}S^{\alpha}{}_{k,j}\,\mathbf g_i\otimes\mathbf g_{\alpha}$$
so that
$$\mathrm{curl}\,\mathbf S = \mathrm{curl}(\mathrm{curl}\,\mathbf T) = \epsilon^{ijk}\epsilon^{\alpha st}T_{kt,sj}\,\mathbf g_i\otimes\mathbf g_{\alpha} = \begin{vmatrix} g^{i\alpha} & g^{is} & g^{it}\\ g^{j\alpha} & g^{js} & g^{jt}\\ g^{k\alpha} & g^{ks} & g^{kt}\end{vmatrix} T_{kt,sj}\,\mathbf g_i\otimes\mathbf g_{\alpha}$$
$$= \bigl[g^{i\alpha}(g^{js}g^{kt} - g^{jt}g^{ks}) + g^{is}(g^{jt}g^{k\alpha} - g^{j\alpha}g^{kt}) + g^{it}(g^{j\alpha}g^{ks} - g^{js}g^{k\alpha})\bigr]T_{kt,sj}\,\mathbf g_i\otimes\mathbf g_{\alpha}$$
$$= \bigl[g^{js}T^{t}{}_{t,sj} - T^{sj}{}_{,sj}\bigr](\mathbf g^{\alpha}\otimes\mathbf g_{\alpha}) + \bigl[T^{\alpha j}{}_{,sj} - g^{j\alpha}T^{t}{}_{t,sj}\bigr](\mathbf g^{s}\otimes\mathbf g_{\alpha}) + \bigl[g^{j\alpha}T^{s}{}_{t,sj} - g^{js}T^{\alpha}{}_{t,sj}\bigr](\mathbf g^{t}\otimes\mathbf g_{\alpha})$$
$$= \bigl[\nabla^2(\mathrm{tr}\,\mathbf T) - \mathrm{div}(\mathrm{div}\,\mathbf T)\bigr]\mathbf I + \bigl(\mathrm{grad}(\mathrm{div}\,\mathbf T)\bigr)^{\mathrm T} - \mathrm{grad}\bigl(\mathrm{grad}(\mathrm{tr}\,\mathbf T)\bigr) + \mathrm{grad}(\mathrm{div}\,\mathbf T) - \nabla^2\mathbf T^{\mathrm T}$$
33. For a scalar function $\Phi$ and a vector field $v^i$, show that the divergence of the vector $v^i\Phi$ is equal to $\mathbf v\cdot\nabla\Phi + \Phi\,\mathrm{div}\,\mathbf v$.
$$(v^i\Phi)_{,i} = \Phi\,v^i{}_{,i} + v^i\,\Phi_{,i}$$
Hence the result.
34. Show that $\mathrm{curl}(\mathbf u\times\mathbf v) = (\mathbf v\cdot\nabla)\mathbf u + \mathbf u\,\mathrm{div}\,\mathbf v - \mathbf v\,\mathrm{div}\,\mathbf u - (\mathbf u\cdot\nabla)\mathbf v$.
Taking the associated (covariant) vector of the expression for the cross product in the last example, it is straightforward to see that the LHS in indicial notation is
$$\epsilon^{lmi}(\epsilon_{ijk}u^jv^k)_{,m}$$
Expanding in the usual way, noting the relation between the alternating tensors and the Kronecker deltas,
$$\epsilon^{lmi}(\epsilon_{ijk}u^jv^k)_{,m} = \epsilon^{ilm}\epsilon_{ijk}\bigl(u^j{}_{,m}v^k + u^jv^k{}_{,m}\bigr) = \bigl(\delta^l_j\delta^m_k - \delta^l_k\delta^m_j\bigr)\bigl(u^j{}_{,m}v^k + u^jv^k{}_{,m}\bigr) = u^l{}_{,m}v^m + u^lv^m{}_{,m} - u^m{}_{,m}v^l - u^mv^l{}_{,m}$$
which is the result we seek in indicial notation: the four terms are, in order, $(\mathbf v\cdot\nabla)\mathbf u$, $\mathbf u\,\mathrm{div}\,\mathbf v$, $-\mathbf v\,\mathrm{div}\,\mathbf u$ and $-(\mathbf u\cdot\nabla)\mathbf v$.
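The vector form of this result can be checked symbolically in Cartesian coordinates with arbitrary assumed fields:

```python
# Symbolic check of curl(u x v) = (v.grad)u + u div(v) - v div(u) - (u.grad)v in Cartesian coordinates.
import sympy as sp

x, y, z = sp.symbols('x y z')
X = (x, y, z)
u = sp.Matrix([y**2*z, x*z, sp.sin(x*y)])     # arbitrary vector fields (assumed examples)
v = sp.Matrix([x + z, y*z**2, x**2*y])

def curl(w):
    return sp.Matrix([sp.diff(w[2], y) - sp.diff(w[1], z),
                      sp.diff(w[0], z) - sp.diff(w[2], x),
                      sp.diff(w[1], x) - sp.diff(w[0], y)])

div = lambda w: sum(sp.diff(w[i], X[i]) for i in range(3))
ddir = lambda a, w: sp.Matrix([sum(a[j]*sp.diff(w[i], X[j]) for j in range(3)) for i in range(3)])  # (a.grad)w

lhs = curl(u.cross(v))
rhs = ddir(v, u) + u*div(v) - v*div(u) - ddir(u, v)
print(sp.simplify(lhs - rhs).T)               # zero vector confirms the identity
```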
35. In Cartesian coordinates let $x$ denote the magnitude of the position vector $\mathbf r = x_i\mathbf e_i$. Show that (a) $x_{,j} = \dfrac{x_j}{x}$, (b) $x_{,ij} = \dfrac{1}{x}\delta_{ij} - \dfrac{x_ix_j}{x^3}$, (c) $x_{,ii} = \dfrac{2}{x}$, (d) if $U = \dfrac{1}{x}$, then $U_{,ij} = -\dfrac{\delta_{ij}}{x^3} + \dfrac{3x_ix_j}{x^5}$, $U_{,ii} = 0$ and $\mathrm{div}\Bigl(\dfrac{\mathbf r}{x}\Bigr) = \dfrac{2}{x}$.
(a) $x = \sqrt{x_ix_i}$, so
$$x_{,j} = \frac{\partial\sqrt{x_ix_i}}{\partial x_j} = \frac{\partial\sqrt{x_ix_i}}{\partial(x_ix_i)}\times\frac{\partial(x_ix_i)}{\partial x_j} = \frac{1}{2\sqrt{x_ix_i}}\bigl[x_i\delta_{ij} + x_i\delta_{ij}\bigr] = \frac{x_j}{x}.$$
(b)
$$x_{,ij} = \frac{\partial}{\partial x_j}\Bigl(\frac{\partial x}{\partial x_i}\Bigr) = \frac{\partial}{\partial x_j}\Bigl(\frac{x_i}{x}\Bigr) = \frac{x\,\dfrac{\partial x_i}{\partial x_j} - x_i\,\dfrac{\partial x}{\partial x_j}}{x^2} = \frac{x\,\delta_{ij} - \dfrac{x_ix_j}{x}}{x^2} = \frac{1}{x}\delta_{ij} - \frac{x_ix_j}{x^3}$$
(c)
$$x_{,ii} = \frac{1}{x}\delta_{ii} - \frac{x_ix_i}{x^3} = \frac{3}{x} - \frac{x^2}{x^3} = \frac{2}{x}.$$
(d) $U = \dfrac{1}{x}$ so that
$$U_{,j} = \frac{\partial\frac{1}{x}}{\partial x_j} = \frac{\partial\frac{1}{x}}{\partial x}\times\frac{\partial x}{\partial x_j} = -\frac{1}{x^2}\,\frac{x_j}{x} = -\frac{x_j}{x^3}$$
Consequently,
$$U_{,ij} = \frac{\partial}{\partial x_j}\bigl(U_{,i}\bigr) = -\frac{\partial}{\partial x_j}\Bigl(\frac{x_i}{x^3}\Bigr) = -\frac{x^3\,\dfrac{\partial x_i}{\partial x_j} - x_i\,\dfrac{\partial(x^3)}{\partial x_j}}{x^6} = \frac{-x^3\delta_{ij} + x_i\bigl(3x^2\,\frac{x_j}{x}\bigr)}{x^6} = -\frac{\delta_{ij}}{x^3} + \frac{3x_ix_j}{x^5}$$
$$U_{,ii} = -\frac{\delta_{ii}}{x^3} + \frac{3x_ix_i}{x^5} = -\frac{3}{x^3} + \frac{3x^2}{x^5} = 0.$$
$$\mathrm{div}\Bigl(\frac{\mathbf r}{x}\Bigr) = \Bigl(\frac{x_j}{x}\Bigr)_{,j} = \frac{1}{x}x_{j,j} + x_j\Bigl(\frac{1}{x}\Bigr)_{,j} = \frac{3}{x} + x_j\Bigl(\frac{\partial}{\partial x}\Bigl(\frac{1}{x}\Bigr)\frac{\partial x}{\partial x_j}\Bigr) = \frac{3}{x} + x_j\Bigl[-\frac{1}{x^2}\,\frac{x_j}{x}\Bigr] = \frac{3}{x} - \frac{x_jx_j}{x^3} = \frac{3}{x} - \frac{1}{x} = \frac{2}{x}$$
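All of parts (a)–(d) can be confirmed symbolically; the sketch below works directly with $x = \sqrt{x_kx_k}$ and arbitrary index pairs:

```python
# Symbolic check of the derivatives of x = |r| and of U = 1/x in Cartesian coordinates.
import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3', positive=True)
X = (x1, x2, x3)
r = sp.sqrt(sum(s**2 for s in X))             # the magnitude x of the position vector
U = 1/r
d = sp.KroneckerDelta

checks = []
for i in range(3):
    checks.append(sp.diff(r, X[i]) - X[i]/r)                                        # (a)
    for j in range(3):
        checks.append(sp.diff(r, X[i], X[j]) - (d(i, j)/r - X[i]*X[j]/r**3))        # (b)
        checks.append(sp.diff(U, X[i], X[j]) - (-d(i, j)/r**3 + 3*X[i]*X[j]/r**5))  # (d)
checks.append(sum(sp.diff(r, s, 2) for s in X) - 2/r)                               # (c)
checks.append(sum(sp.diff(U, s, 2) for s in X))                                     # Laplacian(1/x) = 0
checks.append(sum(sp.diff(X[j]/r, X[j]) for j in range(3)) - 2/r)                   # div(r/x) = 2/x

print(all(sp.simplify(c) == 0 for c in checks))   # True
```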
36. For vectors $\mathbf u$, $\mathbf v$ and $\mathbf w$, show that $(\mathbf u\times)(\mathbf v\times)(\mathbf w\times) = \mathbf u\otimes(\mathbf v\times\mathbf w) - (\mathbf u\cdot\mathbf v)\,\mathbf w\times$.
The tensor $(\mathbf u\times) = -\epsilon_{lmn}u^n\,\mathbf g^l\otimes\mathbf g^m$; similarly, $(\mathbf v\times) = -\epsilon^{\alpha\beta\gamma}v_{\gamma}\,\mathbf g_{\alpha}\otimes\mathbf g_{\beta}$ and $(\mathbf w\times) = -\epsilon^{ijk}w_k\,\mathbf g_i\otimes\mathbf g_j$. Clearly,
$$(\mathbf u\times)(\mathbf v\times)(\mathbf w\times) = -\epsilon_{lmn}\epsilon^{\alpha\beta\gamma}\epsilon^{ijk}u^nv_{\gamma}w_k\,(\mathbf g_{\alpha}\otimes\mathbf g_{\beta})(\mathbf g^l\otimes\mathbf g^m)(\mathbf g_i\otimes\mathbf g_j) = -\epsilon^{\alpha\beta\gamma}\epsilon_{lmn}\epsilon^{ijk}u^nv_{\gamma}w_k\,(\mathbf g_{\alpha}\otimes\mathbf g_j)\,\delta^l_{\beta}\delta^m_i$$
$$= -\epsilon^{\alpha l\gamma}\epsilon_{lin}\epsilon^{ijk}u^nv_{\gamma}w_k\,(\mathbf g_{\alpha}\otimes\mathbf g_j) = -\epsilon^{l\alpha\gamma}\epsilon_{lni}\epsilon^{ijk}u^nv_{\gamma}w_k\,(\mathbf g_{\alpha}\otimes\mathbf g_j) = -\bigl(\delta^{\alpha}_n\delta^{\gamma}_i - \delta^{\alpha}_i\delta^{\gamma}_n\bigr)\epsilon^{ijk}u^nv_{\gamma}w_k\,(\mathbf g_{\alpha}\otimes\mathbf g_j)$$
$$= -\epsilon^{ijk}u^{\alpha}v_iw_k\,(\mathbf g_{\alpha}\otimes\mathbf g_j) + \epsilon^{ijk}u^{\gamma}v_{\gamma}w_k\,(\mathbf g_i\otimes\mathbf g_j) = \mathbf u\otimes(\mathbf v\times\mathbf w) - (\mathbf u\cdot\mathbf v)\,\mathbf w\times$$
38. Show that $(\mathbf u\times)(\mathbf v\times) = (\mathbf u\cdot\mathbf v)\mathbf 1 - \mathbf u\otimes\mathbf v$ and that $\mathrm{tr}[(\mathbf u\times)(\mathbf v\times)] = 2(\mathbf u\cdot\mathbf v)$.
$$(\mathbf u\times)(\mathbf v\times) = -\epsilon_{lmn}\epsilon^{\alpha\beta\gamma}u^nv_{\gamma}\,(\mathbf g_{\alpha}\otimes\mathbf g_{\beta})(\mathbf g^l\otimes\mathbf g^m) = -\epsilon_{lmn}\epsilon^{\alpha\beta\gamma}u^nv_{\gamma}\,(\mathbf g_{\alpha}\otimes\mathbf g^m)\,\delta^l_{\beta} = -\epsilon_{\beta mn}\epsilon^{\beta\gamma\alpha}u^nv_{\gamma}\,(\mathbf g_{\alpha}\otimes\mathbf g^m)$$
$$= \bigl(\delta^{\gamma}_n\delta^{\alpha}_m - \delta^{\gamma}_m\delta^{\alpha}_n\bigr)u^nv_{\gamma}\,(\mathbf g_{\alpha}\otimes\mathbf g^m) = u^nv_n\,(\mathbf g_{\alpha}\otimes\mathbf g^{\alpha}) - u^nv_m\,(\mathbf g_n\otimes\mathbf g^m) = (\mathbf u\cdot\mathbf v)\mathbf 1 - \mathbf u\otimes\mathbf v$$
Obviously, the trace of this tensor is $2(\mathbf u\cdot\mathbf v)$.
39. The position vector in the above example is $\mathbf r = x_i\mathbf e_i$. Show that (a) $\mathrm{grad}\,\mathbf r = \mathbf 1$, (b) $\mathrm{div}\,\mathbf r = 3$, (c) $\mathrm{div}(\mathbf r\otimes\mathbf r) = 4\mathbf r$, and (d) $\mathrm{curl}(\mathbf r\otimes\mathbf r) = -\mathbf r\times$.
$$\mathrm{grad}\,\mathbf r = x_{i,j}\,\mathbf e_i\otimes\mathbf e_j = \delta_{ij}\,\mathbf e_i\otimes\mathbf e_j = \mathbf 1$$
$$\mathrm{div}\,\mathbf r = x_{i,j}\,\mathbf e_i\cdot\mathbf e_j = \delta_{ij}\delta_{ij} = \delta_{jj} = 3$$
With $\mathbf r\otimes\mathbf r = x_i\mathbf e_i\otimes x_j\mathbf e_j = x_ix_j\,\mathbf e_i\otimes\mathbf e_j$,
$$\mathrm{grad}(\mathbf r\otimes\mathbf r) = (x_ix_j)_{,k}\,\mathbf e_i\otimes\mathbf e_j\otimes\mathbf e_k$$
and contracting the last two bases,
$$\mathrm{div}(\mathbf r\otimes\mathbf r) = (x_{i,k}x_j + x_ix_{j,k})(\mathbf e_j\cdot\mathbf e_k)\,\mathbf e_i = (\delta_{ik}x_j + x_i\delta_{jk})\delta_{jk}\,\mathbf e_i = (\delta_{ik}x_k + x_i\delta_{jj})\,\mathbf e_i = 4x_i\,\mathbf e_i = 4\mathbf r$$
$$\mathrm{curl}(\mathbf r\otimes\mathbf r) = \epsilon_{\alpha\beta\gamma}(x_ix_{\gamma})_{,\beta}\,\mathbf e_{\alpha}\otimes\mathbf e_i = \epsilon_{\alpha\beta\gamma}(\delta_{i\beta}x_{\gamma} + x_i\delta_{\gamma\beta})\,\mathbf e_{\alpha}\otimes\mathbf e_i = \epsilon_{\alpha i\gamma}x_{\gamma}\,\mathbf e_{\alpha}\otimes\mathbf e_i + \epsilon_{\alpha\beta\beta}x_i\,\mathbf e_{\alpha}\otimes\mathbf e_i = -\epsilon_{\alpha\gamma i}x_{\gamma}\,\mathbf e_{\alpha}\otimes\mathbf e_i = -\mathbf r\times$$
40. Define the magnitude of a tensor $\mathbf A$ as $\lvert\mathbf A\rvert = \sqrt{\mathrm{tr}(\mathbf A\mathbf A^{\mathrm T})}$. Show that $\dfrac{\partial\lvert\mathbf A\rvert}{\partial\mathbf A} = \dfrac{\mathbf A}{\lvert\mathbf A\rvert}$.
By definition, given a scalar $\alpha$, the derivative of a scalar function of a tensor $f(\mathbf A)$ is
$$\frac{\partial f(\mathbf A)}{\partial\mathbf A}:\mathbf B = \lim_{\alpha\to 0}\frac{\partial}{\partial\alpha}f(\mathbf A + \alpha\mathbf B)$$
for any arbitrary tensor $\mathbf B$. In the case of $f(\mathbf A) = \lvert\mathbf A\rvert$,
$$\frac{\partial\lvert\mathbf A\rvert}{\partial\mathbf A}:\mathbf B = \lim_{\alpha\to 0}\frac{\partial}{\partial\alpha}\lvert\mathbf A + \alpha\mathbf B\rvert = \lim_{\alpha\to 0}\frac{\partial}{\partial\alpha}\sqrt{\mathrm{tr}\bigl[(\mathbf A + \alpha\mathbf B)(\mathbf A + \alpha\mathbf B)^{\mathrm T}\bigr]} = \lim_{\alpha\to 0}\frac{\mathrm{tr}(\mathbf A\mathbf B^{\mathrm T}) + \mathrm{tr}(\mathbf B\mathbf A^{\mathrm T}) + 2\alpha\,\mathrm{tr}(\mathbf B\mathbf B^{\mathrm T})}{2\lvert\mathbf A + \alpha\mathbf B\rvert} = \frac{\mathbf A:\mathbf B}{\lvert\mathbf A\rvert}$$
so that
$$\frac{\partial\lvert\mathbf A\rvert}{\partial\mathbf A} = \frac{\mathbf A}{\lvert\mathbf A\rvert}$$
$$\cdots + A_{23}T_{j3}\bigr)T_{k3} + T_{i1}T_{j2}\bigl(A_{31}T_{k1} + A_{32}T_{k2} + A_{33}T_{k3}\bigr)\Bigr]$$
All the boxed terms in the above equation vanish on account of the contraction of a symmetric tensor with an antisymmetric one. (For example, the first boxed term yields $\epsilon^{ijk}A_{12}T_{i2}T_{j2}T_{k3}$, which is symmetric as well as antisymmetric in $i$ and $j$; it therefore vanishes. The same is true of all other such terms.) Hence
$$\frac{d}{d\alpha}\det(\mathbf T) = \epsilon^{ijk}\bigl[(A_{11}T_{i1})T_{j2}T_{k3} + T_{i1}(A_{22}T_{j2})T_{k3} + T_{i1}T_{j2}(A_{33}T_{k3})\bigr] = A_{mm}\,\epsilon^{ijk}T_{i1}T_{j2}T_{k3} = \mathrm{tr}(\dot{\mathbf T}\mathbf T^{-1})\det(\mathbf T)$$
as required.
43. Without breaking down into components, establish the fact that $\dfrac{\partial\det(\mathbf T)}{\partial\mathbf T} = \mathbf T^{\mathrm c}$.
Start from Liouville's Theorem: given a scalar parameter $\alpha$ such that $\mathbf T = \mathbf T(\alpha)$,
$$\frac{\partial}{\partial\alpha}\bigl(\det(\mathbf T)\bigr) = \det(\mathbf T)\,\mathrm{tr}\Bigl[\Bigl(\frac{\partial\mathbf T}{\partial\alpha}\Bigr)\mathbf T^{-1}\Bigr] = \bigl[\det(\mathbf T)\,\mathbf T^{-\mathrm T}\bigr]:\Bigl(\frac{\partial\mathbf T}{\partial\alpha}\Bigr)$$
By the chain rule,
$$\frac{\partial}{\partial\alpha}\bigl(\det(\mathbf T)\bigr) = \Bigl[\frac{\partial}{\partial\mathbf T}\bigl(\det(\mathbf T)\bigr)\Bigr]:\Bigl(\frac{\partial\mathbf T}{\partial\alpha}\Bigr)$$
It therefore follows that
$$\Bigl[\frac{\partial}{\partial\mathbf T}\bigl(\det(\mathbf T)\bigr) - \det(\mathbf T)\,\mathbf T^{-\mathrm T}\Bigr]:\Bigl(\frac{\partial\mathbf T}{\partial\alpha}\Bigr) = 0$$
Hence
$$\frac{\partial}{\partial\mathbf T}\bigl(\det(\mathbf T)\bigr) = \det(\mathbf T)\,\mathbf T^{-\mathrm T} = \mathbf T^{\mathrm c}$$
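A finite-difference sketch of this result for a randomly generated (invertible) tensor, using $\mathbf T^{\mathrm c} = \det(\mathbf T)\,\mathbf T^{-\mathrm T}$:

```python
# Finite-difference check that d(det T)/dT = det(T) T^{-T} (the cofactor tensor) for a random T.
import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 3)) + 3*np.eye(3)     # a generically invertible random tensor
cof = np.linalg.det(T) * np.linalg.inv(T).T       # T^c = det(T) T^{-T}

h = 1e-6
num = np.zeros((3, 3))
for i in range(3):
    for j in range(3):
        E = np.zeros((3, 3)); E[i, j] = h
        num[i, j] = (np.linalg.det(T + E) - np.linalg.det(T - E)) / (2*h)

print(np.max(np.abs(num - cof)))                  # small (finite-difference error only)
```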
44. [Gurtin 3.4.2a] If $\mathbf T$ is invertible, show that $\dfrac{\partial}{\partial\mathbf T}\bigl(\log\det(\mathbf T)\bigr) = \mathbf T^{-\mathrm T}$.
$$\frac{\partial}{\partial\mathbf T}\bigl(\log\det(\mathbf T)\bigr) = \frac{\partial\bigl(\log\det(\mathbf T)\bigr)}{\partial\det(\mathbf T)}\,\frac{\partial\det(\mathbf T)}{\partial\mathbf T} = \frac{1}{\det(\mathbf T)}\,\mathbf T^{\mathrm c} = \frac{1}{\det(\mathbf T)}\det(\mathbf T)\,\mathbf T^{-\mathrm T} = \mathbf T^{-\mathrm T}$$
45. [Gurtin 3.4.2a] If $\mathbf T$ is invertible, show that $\dfrac{\partial}{\partial\mathbf T}\bigl(\log\det(\mathbf T^{-1})\bigr) = -\mathbf T^{-\mathrm T}$.
Since $\det(\mathbf T^{-1}) = 1/\det(\mathbf T)$, we have $\log\det(\mathbf T^{-1}) = -\log\det(\mathbf T)$. Using the previous result,
$$\frac{\partial}{\partial\mathbf T}\bigl(\log\det(\mathbf T^{-1})\bigr) = -\frac{\partial}{\partial\mathbf T}\bigl(\log\det(\mathbf T)\bigr) = -\mathbf T^{-\mathrm T}$$
46. Given that $\mathbf A$ is a constant tensor, show that $\dfrac{\partial}{\partial\mathbf S}\mathrm{tr}(\mathbf A\mathbf S) = \mathbf A^{\mathrm T}$.
In invariant component terms, let $\mathbf A = A^{ij}\,\mathbf g_i\otimes\mathbf g_j$ and let $\mathbf S = S_{\alpha\beta}\,\mathbf g^{\alpha}\otimes\mathbf g^{\beta}$. Then
$$\mathbf A\mathbf S = A^{ij}S_{\alpha\beta}(\mathbf g_i\otimes\mathbf g_j)(\mathbf g^{\alpha}\otimes\mathbf g^{\beta}) = A^{ij}S_{\alpha\beta}(\mathbf g_i\otimes\mathbf g^{\beta})\,\delta_j^{\alpha} = A^{ij}S_{j\beta}\,(\mathbf g_i\otimes\mathbf g^{\beta})$$
$$\mathrm{tr}(\mathbf A\mathbf S) = A^{ij}S_{j\beta}\,(\mathbf g_i\cdot\mathbf g^{\beta}) = A^{ij}S_{j\beta}\,\delta_i^{\beta} = A^{ij}S_{ji}$$
$$\frac{\partial}{\partial\mathbf S}\mathrm{tr}(\mathbf A\mathbf S) = \frac{\partial\,\mathrm{tr}(\mathbf A\mathbf S)}{\partial S_{\alpha\beta}}\,\mathbf g_{\alpha}\otimes\mathbf g_{\beta} = \frac{\partial(A^{ij}S_{ji})}{\partial S_{\alpha\beta}}\,\mathbf g_{\alpha}\otimes\mathbf g_{\beta} = A^{ij}\delta_j^{\alpha}\delta_i^{\beta}\,\mathbf g_{\alpha}\otimes\mathbf g_{\beta} = A^{ij}\,\mathbf g_j\otimes\mathbf g_i = \mathbf A^{\mathrm T}$$
as required.
47. Given that $\mathbf A$ and $\mathbf B$ are constant tensors, show that $\dfrac{\partial}{\partial\mathbf S}\mathrm{tr}(\mathbf A\mathbf S\mathbf B^{\mathrm T}) = \mathbf A^{\mathrm T}\mathbf B$.
First observe that $\mathrm{tr}(\mathbf A\mathbf S\mathbf B^{\mathrm T}) = \mathrm{tr}(\mathbf B^{\mathrm T}\mathbf A\mathbf S)$. If we write $\mathbf C \equiv \mathbf B^{\mathrm T}\mathbf A$, it is obvious from the above that $\dfrac{\partial}{\partial\mathbf S}\mathrm{tr}(\mathbf C\mathbf S) = \mathbf C^{\mathrm T}$. Therefore,
$$\frac{\partial}{\partial\mathbf S}\mathrm{tr}(\mathbf A\mathbf S\mathbf B^{\mathrm T}) = (\mathbf B^{\mathrm T}\mathbf A)^{\mathrm T} = \mathbf A^{\mathrm T}\mathbf B$$
48. Given that $\mathbf A$ and $\mathbf B$ are constant tensors, show that $\dfrac{\partial}{\partial\mathbf S}\mathrm{tr}(\mathbf A\mathbf S^{\mathrm T}\mathbf B^{\mathrm T}) = \mathbf B^{\mathrm T}\mathbf A$.
Observe that $\mathrm{tr}(\mathbf A\mathbf S^{\mathrm T}\mathbf B^{\mathrm T}) = \mathrm{tr}(\mathbf B^{\mathrm T}\mathbf A\mathbf S^{\mathrm T}) = \mathrm{tr}\bigl[\mathbf S(\mathbf B^{\mathrm T}\mathbf A)^{\mathrm T}\bigr] = \mathrm{tr}\bigl[(\mathbf B^{\mathrm T}\mathbf A)^{\mathrm T}\mathbf S\bigr]$.
[The transposition does not alter the trace; neither does a cyclic permutation. Ensure you understand why each equality here is true.] Consequently,
$$\frac{\partial}{\partial\mathbf S}\mathrm{tr}(\mathbf A\mathbf S^{\mathrm T}\mathbf B^{\mathrm T}) = \frac{\partial}{\partial\mathbf S}\mathrm{tr}\bigl[(\mathbf B^{\mathrm T}\mathbf A)^{\mathrm T}\mathbf S\bigr] = \bigl[(\mathbf B^{\mathrm T}\mathbf A)^{\mathrm T}\bigr]^{\mathrm T} = \mathbf B^{\mathrm T}\mathbf A$$
49. Let $\mathbf S$ be a symmetric and positive definite tensor and let $I_1(\mathbf S)$, $I_2(\mathbf S)$ and $I_3(\mathbf S)$ be the three principal invariants of $\mathbf S$. Show that (a) $\dfrac{\partial I_1(\mathbf S)}{\partial\mathbf S} = \mathbf 1$, the identity tensor, (b) $\dfrac{\partial I_2(\mathbf S)}{\partial\mathbf S} = I_1(\mathbf S)\mathbf 1 - \mathbf S$ and (c) $\dfrac{\partial I_3(\mathbf S)}{\partial\mathbf S} = I_3(\mathbf S)\,\mathbf S^{-1}$.
(a) $\dfrac{\partial I_1(\mathbf S)}{\partial\mathbf S}$ can be written in invariant component form as
$$\frac{\partial I_1(\mathbf S)}{\partial\mathbf S} = \frac{\partial I_1(\mathbf S)}{\partial S^i{}_j}\,\mathbf g_i\otimes\mathbf g^j$$
Recall that $I_1(\mathbf S) = \mathrm{tr}\,\mathbf S = S^{\alpha}{}_{\alpha}$, hence
$$\frac{\partial I_1(\mathbf S)}{\partial\mathbf S} = \frac{\partial S^{\alpha}{}_{\alpha}}{\partial S^i{}_j}\,\mathbf g_i\otimes\mathbf g^j = \delta^{\alpha}_i\delta^j_{\alpha}\,\mathbf g_i\otimes\mathbf g^j = \delta^j_i\,\mathbf g_i\otimes\mathbf g^j = \mathbf 1$$
which is the identity tensor, as expected.
(b) $\dfrac{\partial I_2(\mathbf S)}{\partial\mathbf S}$ can similarly be written in invariant component form as
$$\frac{\partial I_2(\mathbf S)}{\partial\mathbf S} = \frac12\,\frac{\partial}{\partial S^i{}_j}\bigl[S^{\alpha}{}_{\alpha}S^{\beta}{}_{\beta} - S^{\alpha}{}_{\beta}S^{\beta}{}_{\alpha}\bigr]\mathbf g_i\otimes\mathbf g^j$$
where we have utilized the fact that $I_2(\mathbf S) = \frac12\bigl[\mathrm{tr}^2(\mathbf S) - \mathrm{tr}(\mathbf S^2)\bigr]$. Consequently,
$$\frac{\partial I_2(\mathbf S)}{\partial\mathbf S} = \frac12\bigl[\delta^{\alpha}_i\delta^j_{\alpha}S^{\beta}{}_{\beta} + \delta^{\beta}_i\delta^j_{\beta}S^{\alpha}{}_{\alpha} - \delta^{\alpha}_i\delta^j_{\beta}S^{\beta}{}_{\alpha} - \delta^{\beta}_i\delta^j_{\alpha}S^{\alpha}{}_{\beta}\bigr]\mathbf g_i\otimes\mathbf g^j = \frac12\bigl[\delta^j_iS^{\beta}{}_{\beta} + \delta^j_iS^{\alpha}{}_{\alpha} - S^j{}_i - S^j{}_i\bigr]\mathbf g_i\otimes\mathbf g^j = \bigl(\delta^j_iS^{\alpha}{}_{\alpha} - S^j{}_i\bigr)\mathbf g_i\otimes\mathbf g^j = I_1(\mathbf S)\mathbf 1 - \mathbf S$$
(c) In terms of the covariant components,
$$\det(\mathbf S) \equiv \lvert\mathbf S\rvert \equiv S = \frac{1}{3!}\epsilon^{ijk}\epsilon^{rst}S_{ir}S_{js}S_{kt}$$
Differentiating with respect to $S_{\alpha\beta}$, we obtain
$$\frac{\partial S}{\partial S_{\alpha\beta}}\,\mathbf g_{\alpha}\otimes\mathbf g_{\beta} = \frac{1}{3!}\epsilon^{ijk}\epsilon^{rst}\Bigl[\frac{\partial S_{ir}}{\partial S_{\alpha\beta}}S_{js}S_{kt} + S_{ir}\frac{\partial S_{js}}{\partial S_{\alpha\beta}}S_{kt} + S_{ir}S_{js}\frac{\partial S_{kt}}{\partial S_{\alpha\beta}}\Bigr]\mathbf g_{\alpha}\otimes\mathbf g_{\beta}$$
$$= \frac{1}{3!}\epsilon^{ijk}\epsilon^{rst}\bigl[\delta^{\alpha}_i\delta^{\beta}_rS_{js}S_{kt} + S_{ir}\delta^{\alpha}_j\delta^{\beta}_sS_{kt} + S_{ir}S_{js}\delta^{\alpha}_k\delta^{\beta}_t\bigr]\mathbf g_{\alpha}\otimes\mathbf g_{\beta} = \frac{1}{3!}\epsilon^{\alpha jk}\epsilon^{\beta st}\bigl[S_{js}S_{kt} + S_{js}S_{kt} + S_{js}S_{kt}\bigr]\mathbf g_{\alpha}\otimes\mathbf g_{\beta}$$
$$= \frac{1}{2!}\epsilon^{\alpha jk}\epsilon^{\beta st}S_{js}S_{kt}\,\mathbf g_{\alpha}\otimes\mathbf g_{\beta} \equiv [S^{\mathrm c}]^{\alpha\beta}\,\mathbf g_{\alpha}\otimes\mathbf g_{\beta}$$
which is the cofactor of $[S_{\alpha\beta}]$, that is, $\mathbf S^{\mathrm c}$. Since $\mathbf S^{\mathrm c} = \det(\mathbf S)\,\mathbf S^{-\mathrm T}$ and $\mathbf S$ is symmetric, it follows that $\dfrac{\partial I_3(\mathbf S)}{\partial\mathbf S} = I_3(\mathbf S)\,\mathbf S^{-1}$.
50. For a tensor field $\boldsymbol\Xi$, the volume integral in the region $\Omega\subset\mathcal E$ satisfies $\int_{\Omega}(\mathrm{grad}\,\boldsymbol\Xi)\,dv = \int_{\partial\Omega}\boldsymbol\Xi\otimes\mathbf n\,ds$, where $\mathbf n$ is the outward drawn normal to $\partial\Omega$, the boundary of $\Omega$. Show that for a vector field $\mathbf f$,
$$\int_{\Omega}(\mathrm{div}\,\mathbf f)\,dv = \int_{\partial\Omega}\mathbf f\cdot\mathbf n\,ds$$
Replacing $\boldsymbol\Xi$ by the vector field $\mathbf f$, we have
$$\int_{\Omega}(\mathrm{grad}\,\mathbf f)\,dv = \int_{\partial\Omega}\mathbf f\otimes\mathbf n\,ds$$
Taking the trace of both sides and noting that both the trace and the integral are linear operations, we have
$$\int_{\Omega}(\mathrm{div}\,\mathbf f)\,dv = \int_{\Omega}\mathrm{tr}(\mathrm{grad}\,\mathbf f)\,dv = \int_{\partial\Omega}\mathrm{tr}(\mathbf f\otimes\mathbf n)\,ds = \int_{\partial\Omega}\mathbf f\cdot\mathbf n\,ds$$
51. Show that for a scalar function $\phi$ the divergence theorem becomes $\int_{\Omega}(\mathrm{grad}\,\phi)\,dv = \int_{\partial\Omega}\phi\,\mathbf n\,ds$.
Recall that for a vector field $\mathbf f$,
$$\int_{\Omega}(\mathrm{div}\,\mathbf f)\,dv = \int_{\partial\Omega}\mathbf f\cdot\mathbf n\,ds$$
If we write $\mathbf f = \phi\mathbf a$, where $\mathbf a$ is an arbitrary constant vector, we have
$$\int_{\Omega}\mathrm{div}[\phi\mathbf a]\,dv = \int_{\partial\Omega}\phi\,\mathbf a\cdot\mathbf n\,ds = \mathbf a\cdot\int_{\partial\Omega}\phi\,\mathbf n\,ds$$
For the LHS, note that $\mathrm{div}[\phi\mathbf a] = \mathrm{tr}\bigl(\mathrm{grad}[\phi\mathbf a]\bigr)$ and
$$\mathrm{grad}[\phi\mathbf a] = (\phi a^i)_{,j}\,\mathbf g_i\otimes\mathbf g^j = a^i\phi_{,j}\,\mathbf g_i\otimes\mathbf g^j$$
the trace of which is
$$a^i\phi_{,j}\,\mathbf g_i\cdot\mathbf g^j = a^i\phi_{,j}\,\delta_i^j = a^i\phi_{,i} = \mathbf a\cdot\mathrm{grad}\,\phi$$
For the arbitrary constant vector $\mathbf a$ we therefore have
$$\int_{\Omega}\mathrm{div}[\phi\mathbf a]\,dv = \mathbf a\cdot\int_{\Omega}\mathrm{grad}\,\phi\,dv = \mathbf a\cdot\int_{\partial\Omega}\phi\,\mathbf n\,ds$$
and, since $\mathbf a$ is arbitrary,
$$\int_{\Omega}\mathrm{grad}\,\phi\,dv = \int_{\partial\Omega}\phi\,\mathbf n\,ds$$
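As a closing numerical illustration, the scalar form of the theorem can be confirmed on the unit cube with an arbitrary assumed scalar field, integrating the six faces explicitly:

```python
# Check of the gradient form of the divergence theorem on the unit cube [0,1]^3:
# the volume integral of grad(phi) equals the surface integral of phi*n, face by face.
import sympy as sp

x, y, z = sp.symbols('x y z')
phi = x**2*y + sp.sin(sp.pi*z)                        # arbitrary scalar field (assumed example)

vol = sp.Matrix([sp.integrate(sp.diff(phi, s), (x, 0, 1), (y, 0, 1), (z, 0, 1)) for s in (x, y, z)])

def face(var, value, normal):
    others = [s for s in (x, y, z) if s != var]
    val = sp.integrate(phi.subs(var, value), (others[0], 0, 1), (others[1], 0, 1))
    return val * sp.Matrix(normal)

surf = (face(x, 1, (1, 0, 0)) + face(x, 0, (-1, 0, 0)) +
        face(y, 1, (0, 1, 0)) + face(y, 0, (0, -1, 0)) +
        face(z, 1, (0, 0, 1)) + face(z, 0, (0, 0, -1)))

print(sp.simplify(vol - surf).T)                      # zero vector confirms the theorem for this phi
```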