12 LU Decomposition Lecture


LU Decomposition

LU Decomposition is another method to solve a set of simultaneous linear equations.

Which is better, Gauss Elimination or LU Decomposition? To answer this, a closer look at LU Decomposition is needed.
LU Decomposition Method
For any non-singular matrix [A] on which the forward elimination steps of Naïve Gauss Elimination can be conducted, one can always write it as

[A] = [L][U]

where
[L] = lower triangular matrix
[U] = upper triangular matrix
How does LU Decomposition work?
To solve a set of linear equations [A][X] = [C]:
If [A] = [L][U], then [L][U][X] = [C].
Multiplying both sides by [L]^-1 gives [L]^-1[L][U][X] = [L]^-1[C].
Since [L]^-1[L] = [I], this leads to [I][U][X] = [L]^-1[C].
Since [I][U] = [U], we have [U][X] = [L]^-1[C].
Now let [L]^-1[C] = [Z], which gives the two triangular systems
[L][Z] = [C]   (1)
[U][X] = [Z]   (2)
LU Decomposition
How can this be used?

Given [A][X] = [C]:
1. Decompose [A] into [L] and [U]
2. Solve [L][Z] = [C] for [Z] by forward substitution
3. Solve [U][X] = [Z] for [X] by back substitution
These three steps are sketched in code below.
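
As a quick illustration of these three steps, here is a minimal Python sketch (not part of the original slides) using SciPy's LU routines. Note that scipy.linalg.lu_factor applies partial pivoting, so it factors a row-permuted [A] rather than the naïve [L][U] derived in this lecture, but the solution [X] is identical.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# [A][X] = [C]: the example system solved later in the lecture
A = np.array([[25.0, 5.0, 1.0],
              [64.0, 8.0, 1.0],
              [144.0, 12.0, 1.0]])
C = np.array([106.8, 177.2, 279.2])

# Step 1: decompose [A] (SciPy stores [L] and [U] compactly, with pivot indices)
lu, piv = lu_factor(A)

# Steps 2 and 3: forward substitution for [Z], then back substitution for [X]
X = lu_solve((lu, piv), C)
print(X)   # approximately [0.2905, 19.690, 1.0857]
```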
Method: Decomposing [A] into [L] and [U]
\[
[A] = [L][U] =
\begin{bmatrix} 1 & 0 & 0 \\ \ell_{21} & 1 & 0 \\ \ell_{31} & \ell_{32} & 1 \end{bmatrix}
\begin{bmatrix} u_{11} & u_{12} & u_{13} \\ 0 & u_{22} & u_{23} \\ 0 & 0 & u_{33} \end{bmatrix}
\]

[U] is the same as the coefficient matrix at the end of the forward elimination steps.
[L] is obtained using the multipliers that were used in the forward elimination process.
Finding the [U] Matrix
Using the Forward Elimination Procedure of Gauss Elimination on
\[
[A] = \begin{bmatrix} 25 & 5 & 1 \\ 64 & 8 & 1 \\ 144 & 12 & 1 \end{bmatrix}
\]

Step 1:
\[
\frac{64}{25} = 2.56; \qquad \text{Row 2} - 2.56\,(\text{Row 1}) = \begin{bmatrix} 0 & -4.8 & -1.56 \end{bmatrix}
\]
\[
\frac{144}{25} = 5.76; \qquad \text{Row 3} - 5.76\,(\text{Row 1}) = \begin{bmatrix} 0 & -16.8 & -4.76 \end{bmatrix}
\]
Finding the [U] Matrix
Matrix after Step 1:
\[
\begin{bmatrix} 25 & 5 & 1 \\ 0 & -4.8 & -1.56 \\ 0 & -16.8 & -4.76 \end{bmatrix}
\]

Step 2:
\[
\frac{-16.8}{-4.8} = 3.5; \qquad \text{Row 3} - 3.5\,(\text{Row 2}) = \begin{bmatrix} 0 & 0 & 0.7 \end{bmatrix}
\]

\[
[U] = \begin{bmatrix} 25 & 5 & 1 \\ 0 & -4.8 & -1.56 \\ 0 & 0 & 0.7 \end{bmatrix}
\]
Finding the [L] Matrix
\[
[L] = \begin{bmatrix} 1 & 0 & 0 \\ \ell_{21} & 1 & 0 \\ \ell_{31} & \ell_{32} & 1 \end{bmatrix}
\]
Using the multipliers used during the Forward Elimination Procedure:

From the first step of forward elimination,
\[
\begin{bmatrix} 25 & 5 & 1 \\ 64 & 8 & 1 \\ 144 & 12 & 1 \end{bmatrix};
\qquad
\ell_{21} = \frac{a_{21}}{a_{11}} = \frac{64}{25} = 2.56,
\qquad
\ell_{31} = \frac{a_{31}}{a_{11}} = \frac{144}{25} = 5.76
\]
Finding the [L] Matrix
From the second step of forward elimination,
\[
\begin{bmatrix} 25 & 5 & 1 \\ 0 & -4.8 & -1.56 \\ 0 & -16.8 & -4.76 \end{bmatrix};
\qquad
\ell_{32} = \frac{a_{32}}{a_{22}} = \frac{-16.8}{-4.8} = 3.5
\]

\[
[L] = \begin{bmatrix} 1 & 0 & 0 \\ 2.56 & 1 & 0 \\ 5.76 & 3.5 & 1 \end{bmatrix}
\]
Does [L][U] = [A]?
\[
[L][U] = \begin{bmatrix} 1 & 0 & 0 \\ 2.56 & 1 & 0 \\ 5.76 & 3.5 & 1 \end{bmatrix}
\begin{bmatrix} 25 & 5 & 1 \\ 0 & -4.8 & -1.56 \\ 0 & 0 & 0.7 \end{bmatrix} = \;?
\]
Carrying out the multiplication recovers the original matrix [A], confirming the decomposition.
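
Extending the same elimination loop to record the multipliers gives the complete decomposition. This sketch (again an illustration, with a helper function written for this example under the slides' no-pivoting assumption) builds [L] and [U] together and confirms numerically that their product recovers [A].

```python
import numpy as np

def naive_lu(A):
    """Doolittle LU decomposition without pivoting, as in the slides."""
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]    # store the multiplier l_ik
            U[i, k:] -= L[i, k] * U[k, k:]
    return L, U

A = np.array([[25.0, 5.0, 1.0],
              [64.0, 8.0, 1.0],
              [144.0, 12.0, 1.0]])
L, U = naive_lu(A)
print(L)                       # [[1, 0, 0], [2.56, 1, 0], [5.76, 3.5, 1]]
print(np.allclose(L @ U, A))   # True: [L][U] = [A]
```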
Using LU Decomposition to Solve SLEs
Solve the following set of linear equations using LU Decomposition:
\[
\begin{bmatrix} 25 & 5 & 1 \\ 64 & 8 & 1 \\ 144 & 12 & 1 \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}
= \begin{bmatrix} 106.8 \\ 177.2 \\ 279.2 \end{bmatrix}
\]

Using the procedure for finding the [L] and [U] matrices:
\[
[A] = [L][U] = \begin{bmatrix} 1 & 0 & 0 \\ 2.56 & 1 & 0 \\ 5.76 & 3.5 & 1 \end{bmatrix}
\begin{bmatrix} 25 & 5 & 1 \\ 0 & -4.8 & -1.56 \\ 0 & 0 & 0.7 \end{bmatrix}
\]
Example
Set [L][Z] = [C]:
\[
\begin{bmatrix} 1 & 0 & 0 \\ 2.56 & 1 & 0 \\ 5.76 & 3.5 & 1 \end{bmatrix}
\begin{bmatrix} z_1 \\ z_2 \\ z_3 \end{bmatrix}
= \begin{bmatrix} 106.8 \\ 177.2 \\ 279.2 \end{bmatrix}
\]

Solve for [Z]:
\[
\begin{aligned}
z_1 &= 106.8 \\
2.56 z_1 + z_2 &= 177.2 \\
5.76 z_1 + 3.5 z_2 + z_3 &= 279.2
\end{aligned}
\]
Example
Complete the forward substitution to solve for [Z]:
\[
\begin{aligned}
z_1 &= 106.8 \\
z_2 &= 177.2 - 2.56 z_1 = 177.2 - 2.56(106.8) = -96.208 \\
z_3 &= 279.2 - 5.76 z_1 - 3.5 z_2 = 279.2 - 5.76(106.8) - 3.5(-96.208) = 0.76
\end{aligned}
\]
\[
[Z] = \begin{bmatrix} z_1 \\ z_2 \\ z_3 \end{bmatrix} = \begin{bmatrix} 106.8 \\ -96.208 \\ 0.76 \end{bmatrix}
\]
Example
Set [U][X] = [Z]:
\[
\begin{bmatrix} 25 & 5 & 1 \\ 0 & -4.8 & -1.56 \\ 0 & 0 & 0.7 \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}
= \begin{bmatrix} 106.8 \\ -96.208 \\ 0.76 \end{bmatrix}
\]

Solve for [X]. The three equations become
\[
\begin{aligned}
25 x_1 + 5 x_2 + x_3 &= 106.8 \\
-4.8 x_2 - 1.56 x_3 &= -96.208 \\
0.7 x_3 &= 0.76
\end{aligned}
\]
Example
From the third equation:
\[
0.7 x_3 = 0.76 \quad\Rightarrow\quad x_3 = \frac{0.76}{0.7} = 1.0857
\]
Substituting x3 into the second equation:
\[
-4.8 x_2 - 1.56 x_3 = -96.208 \quad\Rightarrow\quad x_2 = \frac{-96.208 + 1.56(1.0857)}{-4.8} = 19.690
\]
Example
Substituting x3 and x2 into the first equation:
\[
25 x_1 + 5 x_2 + x_3 = 106.8
\quad\Rightarrow\quad
x_1 = \frac{106.8 - 5 x_2 - x_3}{25} = \frac{106.8 - 5(19.690) - 1.0857}{25} = 0.2905
\]
Hence the solution vector is
\[
[X] = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0.2905 \\ 19.690 \\ 1.0857 \end{bmatrix}
\]
Finding the inverse of a square matrix

The inverse [B] of a square matrix [A] is defined as

[A][B] = [I] = [B][A]


Finding the inverse of a square matrix
How can LU Decomposition be used to find the inverse?
Assume the first column of [B] to be [b11 b21 ... bn1]^T.
Using this and the definition of matrix multiplication:

First column of [B]:
\[
[A]\begin{bmatrix} b_{11} \\ b_{21} \\ \vdots \\ b_{n1} \end{bmatrix}
= \begin{bmatrix} 1 \\ 0 \\ \vdots \\ 0 \end{bmatrix}
\]
Second column of [B]:
\[
[A]\begin{bmatrix} b_{12} \\ b_{22} \\ \vdots \\ b_{n2} \end{bmatrix}
= \begin{bmatrix} 0 \\ 1 \\ \vdots \\ 0 \end{bmatrix}
\]

The remaining columns of [B] can be found in the same manner.
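
In code, this amounts to solving [A][X] = [C] once for each column of the identity matrix, reusing the same decomposition every time. A minimal sketch, again assuming SciPy's pivoted LU routines (the resulting inverse is the same as with the naïve decomposition):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

A = np.array([[25.0, 5.0, 1.0],
              [64.0, 8.0, 1.0],
              [144.0, 12.0, 1.0]])

lu, piv = lu_factor(A)        # decompose [A] once
n = A.shape[0]
B = np.zeros((n, n))
for j in range(n):
    e = np.zeros(n)
    e[j] = 1.0                             # j-th column of [I]
    B[:, j] = lu_solve((lu, piv), e)       # j-th column of the inverse

print(B)                                   # [B] = [A]^-1
print(np.allclose(A @ B, np.eye(n)))       # True: [A][B] = [I]
```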


Example: Inverse of a Matrix
Find the inverse of the square matrix
\[
[A] = \begin{bmatrix} 25 & 5 & 1 \\ 64 & 8 & 1 \\ 144 & 12 & 1 \end{bmatrix}
\]

Using the decomposition procedure, the [L] and [U] matrices are found to be
\[
[A] = [L][U] = \begin{bmatrix} 1 & 0 & 0 \\ 2.56 & 1 & 0 \\ 5.76 & 3.5 & 1 \end{bmatrix}
\begin{bmatrix} 25 & 5 & 1 \\ 0 & -4.8 & -1.56 \\ 0 & 0 & 0.7 \end{bmatrix}
\]
Example: Inverse of a Matrix
Solving for each column of [B] requires two steps:
1) Solve [L][Z] = [C] for [Z]
2) Solve [U][X] = [Z] for [X]

Step 1:
\[
[L][Z] = [C] \;\Rightarrow\;
\begin{bmatrix} 1 & 0 & 0 \\ 2.56 & 1 & 0 \\ 5.76 & 3.5 & 1 \end{bmatrix}
\begin{bmatrix} z_1 \\ z_2 \\ z_3 \end{bmatrix}
= \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}
\]
This generates the equations:
\[
\begin{aligned}
z_1 &= 1 \\
2.56 z_1 + z_2 &= 0 \\
5.76 z_1 + 3.5 z_2 + z_3 &= 0
\end{aligned}
\]
Example: Inverse of a Matrix
Solving for [Z]:
\[
\begin{aligned}
z_1 &= 1 \\
z_2 &= 0 - 2.56 z_1 = 0 - 2.56(1) = -2.56 \\
z_3 &= 0 - 5.76 z_1 - 3.5 z_2 = 0 - 5.76(1) - 3.5(-2.56) = 3.2
\end{aligned}
\]
\[
[Z] = \begin{bmatrix} z_1 \\ z_2 \\ z_3 \end{bmatrix} = \begin{bmatrix} 1 \\ -2.56 \\ 3.2 \end{bmatrix}
\]
Example: Inverse of a Matrix
Solving [U][X] = [Z] for [X]:
\[
\begin{bmatrix} 25 & 5 & 1 \\ 0 & -4.8 & -1.56 \\ 0 & 0 & 0.7 \end{bmatrix}
\begin{bmatrix} b_{11} \\ b_{21} \\ b_{31} \end{bmatrix}
= \begin{bmatrix} 1 \\ -2.56 \\ 3.2 \end{bmatrix}
\]
\[
\begin{aligned}
25 b_{11} + 5 b_{21} + b_{31} &= 1 \\
-4.8 b_{21} - 1.56 b_{31} &= -2.56 \\
0.7 b_{31} &= 3.2
\end{aligned}
\]
Example: Inverse of a Matrix
Using backward substitution:
\[
\begin{aligned}
b_{31} &= \frac{3.2}{0.7} = 4.571 \\
b_{21} &= \frac{2.56 - 1.56 b_{31}}{4.8} = \frac{2.56 - 1.56(4.571)}{4.8} = -0.9524 \\
b_{11} &= \frac{1 - 5 b_{21} - b_{31}}{25} = \frac{1 - 5(-0.9524) - 4.571}{25} = 0.04762
\end{aligned}
\]
So the first column of the inverse of [A] is
\[
\begin{bmatrix} b_{11} \\ b_{21} \\ b_{31} \end{bmatrix} = \begin{bmatrix} 0.04762 \\ -0.9524 \\ 4.571 \end{bmatrix}
\]
Example: Inverse of a Matrix
Repeating for the second and third columns of the inverse:

Second column:
\[
\begin{bmatrix} 25 & 5 & 1 \\ 64 & 8 & 1 \\ 144 & 12 & 1 \end{bmatrix}
\begin{bmatrix} b_{12} \\ b_{22} \\ b_{32} \end{bmatrix}
= \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}
\quad\Rightarrow\quad
\begin{bmatrix} b_{12} \\ b_{22} \\ b_{32} \end{bmatrix}
= \begin{bmatrix} -0.08333 \\ 1.417 \\ -5.000 \end{bmatrix}
\]
Third column:
\[
\begin{bmatrix} 25 & 5 & 1 \\ 64 & 8 & 1 \\ 144 & 12 & 1 \end{bmatrix}
\begin{bmatrix} b_{13} \\ b_{23} \\ b_{33} \end{bmatrix}
= \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}
\quad\Rightarrow\quad
\begin{bmatrix} b_{13} \\ b_{23} \\ b_{33} \end{bmatrix}
= \begin{bmatrix} 0.03571 \\ -0.4643 \\ 1.429 \end{bmatrix}
\]
Example: Inverse of a Matrix
The inverse of [A] is
\[
[A]^{-1} = \begin{bmatrix} 0.04762 & -0.08333 & 0.03571 \\ -0.9524 & 1.417 & -0.4643 \\ 4.571 & -5.000 & 1.429 \end{bmatrix}
\]

To check your work, perform the following operation:
\[
[A][A]^{-1} = [I] = [A]^{-1}[A]
\]


When is LU Decomposition better than Gaussian Elimination?

To solve [A][X] = [B], the time taken by the two methods is:

Gaussian Elimination:
\[
T\left(\frac{8n^3}{3} + 12n^2 + \frac{4n}{3}\right)
\]
LU Decomposition:
\[
T\left(\frac{8n^3}{3} + 12n^2 + \frac{4n}{3}\right)
\]

where T = clock cycle time and n = size of the matrix.

So, for solving a single set of equations, both methods are equally efficient.


To find the inverse of [A]:

Time taken by Gaussian Elimination (the full elimination is repeated for each of the n columns):
\[
n \left( CT|_{FE} + CT|_{BS} \right) = T\left(\frac{8n^4}{3} + 12n^3 + \frac{4n^2}{3}\right)
\]
Time taken by LU Decomposition (one decomposition, then n forward substitutions and n back substitutions):
\[
CT|_{LU} + n \cdot CT|_{FS} + n \cdot CT|_{BS} = T\left(\frac{32n^3}{3} + 12n^2 + \frac{20n}{3}\right)
\]
Table 1. Comparing computational times of finding the inverse of a matrix using LU decomposition and Gaussian elimination.

n                                  10      100     1000    10000
CT|inverse GE / CT|inverse LU      3.28    25.83   250.8   2501
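
The ratio in Table 1 can be reproduced directly from the two expressions above. A small sketch (the helper names are mine; the small-n entries are sensitive to the low-order terms, so the n = 10 value differs slightly from the table):

```python
# Computational time of finding the inverse: Gaussian Elimination vs LU Decomposition
def ct_inverse_ge(n):
    return 8 * n**4 / 3 + 12 * n**3 + 4 * n**2 / 3

def ct_inverse_lu(n):
    return 32 * n**3 / 3 + 12 * n**2 + 20 * n / 3

for n in (10, 100, 1000, 10000):
    print(n, round(ct_inverse_ge(n) / ct_inverse_lu(n), 1))
# roughly 3.3, 25.8, 250.8, 2500.8 - the advantage of LU grows linearly with n
```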
Summary

THE END
