RANDOM VECTORS:
A random vector is a column vector whose elements are random variables. For example,

X = (X1, X2, X3, X4, X5)'

is a random vector with five components, since it is a 5×1 column vector.
In general, let X be a random vector with p components,

X = (X1, X2, ..., Xi, ..., Xp)'

Its joint probability distribution satisfies the following properties:

(a) 0 ≤ P[X1 ≤ x1, X2 ≤ x2, ..., Xi ≤ xi, ..., Xp ≤ xp] ≤ 1.

(b) In the discrete case,

∑_{x1} ∑_{x2} ··· ∑_{xp} P[X1 = x1, X2 = x2, ..., Xp = xp] = 1,

where we are summing over all possible values that each random variable can take on (p summations). In the continuous case the joint p.d.f. f(x1, x2, ..., xi, ..., xp) satisfies

∫ ∫ ··· ∫ f(x1, x2, ..., xp) dx1 dx2 ··· dxp = 1,

where we have p integrals over all possible values the p variables can take on.

(c) P[a1 ≤ X1 ≤ b1, ..., ai ≤ Xi ≤ bi, ..., ap ≤ Xp ≤ bp] = ∫_{a1}^{b1} ··· ∫_{ai}^{bi} ··· ∫_{ap}^{bp} f(x1, x2, ..., xi, ..., xp) dx1 dx2 ··· dxp.
Using the joint p.d.f. we can obtain the marginal distributions and conditional distributions of the components of the random vector.
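As a quick numerical illustration of properties (b) and (c), here is a minimal Python/SciPy sketch. It uses a hypothetical two-component density f(x1, x2) = x1 + x2 on the unit square; this density and the rectangle probability computed are chosen purely for illustration and are not part of the notes.

from scipy import integrate

def f(x2, x1):                      # dblquad passes the inner variable first
    return x1 + x2                  # hypothetical joint p.d.f. on 0 < x1 < 1, 0 < x2 < 1

# (b): integrating the joint p.d.f. over all possible values gives 1
total, _ = integrate.dblquad(f, 0, 1, lambda x1: 0.0, lambda x1: 1.0)
print(total)                        # ~ 1.0

# (c): P[0 <= X1 <= 0.5, 0 <= X2 <= 0.5] is the integral over that rectangle
prob, _ = integrate.dblquad(f, 0, 0.5, lambda x1: 0.0, lambda x1: 0.5)
print(prob)                         # ~ 0.125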
MEAN VECTOR AND VARIANCE-COVARIANCE MATRIX OF A RANDOM VECTOR:
Since each component of the random vector is a random variable, it is described by its own marginal probability distribution. Using these marginal p.d.f.s we can obtain the mean and variance of each random variable Xi, i = 1, 2, ..., p:
μi = E[Xi],  i = 1, 2, ..., p

σi² = Var(Xi) = E[(Xi - μi)²],  i = 1, 2, ..., p

Also, we can obtain the joint marginal distribution of any two random variables Xi and Xj, and use it to define the covariance between the two variables:

σij = Cov(Xi, Xj) = E[(Xi - μi)(Xj - μj)]
We will use the mean, variance and covariance measures of the components of the random vector (i.e. the random variables) to define the expected value and variance of the vector.
Definition 1: The expected value of X is given by

E[X] = E[(X1, X2, ..., Xi, ..., Xp)'] = (E(X1), E(X2), ..., E(Xi), ..., E(Xp))' = (μ1, μ2, ..., μp)' = μ

E[X] = μ is referred to as the mean vector. Its elements are the expected values of the individual random variables.
Definition 2: The variance-covariance matrix of X is given by

Σ = Var(X) = E[(X - μ)(X - μ)']

(X - μ)(X - μ)' is the product of a p×1 column vector and its transpose, which gives a p×p matrix. Taking expectations element by element,

Σ = (σij) =
[ σ1²  σ12  ...  σ1p ]
[ σ21  σ2²  ...  σ2p ]
[  ⋮    ⋮          ⋮ ]
[ σp1  σp2  ...  σp² ]

It is a symmetric matrix with measures of variance along the diagonal and measures of covariance off the diagonal.
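In practice the mean vector and variance-covariance matrix are estimated from a sample of observations on the random vector. A minimal Python/NumPy sketch (the sample values below are made up purely for illustration):

import numpy as np

# Hypothetical sample: n = 5 observations on a random vector with p = 3 components,
# one observation per row (the numbers are made up for illustration).
data = np.array([
    [2.0, 4.1, 1.0],
    [3.5, 5.0, 0.5],
    [2.8, 4.4, 1.2],
    [3.1, 5.3, 0.8],
    [2.6, 4.7, 1.1],
])

mean_vector = data.mean(axis=0)          # estimate of the mean vector (mu1, mu2, mu3)'
cov_matrix = np.cov(data, rowvar=False)  # p x p estimate of Sigma (unbiased, divides by n - 1)

print(mean_vector)   # shape (3,)
print(cov_matrix)    # shape (3, 3): symmetric, variances on the diagonal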
Correlation matrix: The correlation between Xi and Xj is

ρij = σij / √(σi² σj²)

and the correlation matrix of X is

ρ = (ρij) =
[ 1    ρ12  ...  ρ1p ]
[ ρ21  1    ...  ρ2p ]
[  ⋮    ⋮          ⋮ ]
[ ρp1  ρp2  ...  1   ]

It is symmetric, with 1's along the diagonal.
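The correlation matrix can be obtained directly from the variance-covariance matrix by scaling with the standard deviations. A minimal Python/NumPy sketch, using a hypothetical Σ chosen only for illustration:

import numpy as np

# Hypothetical 3 x 3 variance-covariance matrix (illustrative values only).
Sigma = np.array([
    [4.0, 1.2, 0.6],
    [1.2, 9.0, 2.1],
    [0.6, 2.1, 1.0],
])

# rho_ij = sigma_ij / sqrt(sigma_i^2 * sigma_j^2), i.e. D^{-1/2} Sigma D^{-1/2}
d_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(Sigma)))
rho = d_inv_sqrt @ Sigma @ d_inv_sqrt

print(rho)            # symmetric, with ones on the diagonal
print(np.diag(rho))   # [1. 1. 1.]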
For a vector of constants a = (a1, a2, ..., ap)', the linear combination Z = a'X = a1X1 + a2X2 + ... + apXp has

E(Z) = a'E[X] = a'μ  and  Var(Z) = Var(a'X) = a'Σa

More generally, if Y = AX, where A = (aij) is an m×p matrix of constants, then Y = (Y1, Y2, ..., Ym)' is an m×1 random vector:

[ Y1 ]   [ a11  a12  ...  a1p ] [ X1 ]
[ Y2 ] = [ a21  a22  ...  a2p ] [ X2 ]
[ ⋮  ]   [  ⋮    ⋮          ⋮ ] [ ⋮  ]
[ Ym ]   [ am1  am2  ...  amp ] [ Xp ]

and

E[Y] = A E[X] = Aμ
Var[Y] = A Var[X] A' = AΣA'
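These results can be checked numerically. Below is a minimal Python/NumPy sketch; the mean vector, covariance matrix, coefficient vector a and matrix A are all hypothetical values chosen for illustration:

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population mean vector and covariance matrix (illustrative values).
mu = np.array([1.0, 2.0, 3.0])
Sigma = np.array([
    [4.0, 1.2, 0.6],
    [1.2, 9.0, 2.1],
    [0.6, 2.1, 1.0],
])

a = np.array([1.0, -2.0, 0.5])          # coefficients of a linear combination Z = a'X
A = np.array([[1.0, 1.0, 1.0],
              [2.0, 0.0, -1.0]])        # a 2 x 3 matrix of constants, Y = AX

print(a @ mu, a @ Sigma @ a)            # E(Z) = a'mu,  Var(Z) = a' Sigma a
print(A @ mu)                           # E[Y]  = A mu
print(A @ Sigma @ A.T)                  # Var[Y] = A Sigma A'

# Simulation check: sample moments of Y = AX approach the values above.
X = rng.multivariate_normal(mu, Sigma, size=200_000)   # rows are observations on X
Y = X @ A.T
print(Y.mean(axis=0))                   # ~ A mu
print(np.cov(Y, rowvar=False))          # ~ A Sigma A'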
Examples:
(1) Let X = [X1, X2, X3]' have joint probability density function

f(x1, x2, x3) = (1/3)(x1 + 2x2 + 3x3),   0 < x1 < 1, 0 < x2 < 1, 0 < x3 < 1,

and f(x1, x2, x3) = 0 elsewhere. Find
(i) the joint marginal distribution of X1 and X2,
(ii) the conditional distribution of X3 given X1 = x1 and X2 = x2,
(iii) P(X3 < 1/2 | X1 = 1/4, X2 = 1/4).
Solution:
(i) The joint marginal distribution of X1 and X2 is obtained by integrating the joint p.d.f. over x3:

g(x1, x2) = ∫_{0}^{1} (1/3)(x1 + 2x2 + 3x3) dx3
          = (1/3)[x1 x3 + 2x2 x3 + (3/2)x3²]_{0}^{1}
          = (1/3)(x1 + 2x2 + 3/2),   0 < x1 < 1, 0 < x2 < 1,

and g(x1, x2) = 0 elsewhere.

(ii) The conditional distribution of X3 given X1 = x1 and X2 = x2 is

h(x3 | x1, x2) = f(x1, x2, x3) / g(x1, x2) = (x1 + 2x2 + 3x3) / (x1 + 2x2 + 3/2),   0 < x3 < 1.

At x1 = 1/4 and x2 = 1/4,

h(x3 | 1/4, 1/4) = (3/4 + 3x3) / (3/4 + 3/2) = (1/3)(1 + 4x3).

(iii) Therefore

P(X3 < 1/2 | X1 = 1/4, X2 = 1/4) = ∫_{0}^{1/2} (1/3)(1 + 4x3) dx3 = (1/3)[x3 + 2x3²]_{0}^{1/2} = (1/3)(1/2 + 1/2) = 1/3.
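A quick numerical cross-check of parts (i) and (iii) with SciPy quadrature (the density is the one given in the example; the point (1/4, 1/4) is the conditioning point from part (iii)):

from scipy import integrate

def f(x1, x2, x3):
    # joint p.d.f. from Example (1), defined on the unit cube
    return (x1 + 2 * x2 + 3 * x3) / 3.0

def g(x1, x2):
    # joint marginal of (X1, X2): integrate the joint p.d.f. over x3 in (0, 1)
    value, _ = integrate.quad(lambda x3: f(x1, x2, x3), 0, 1)
    return value

print(g(0.25, 0.25))          # ~ (1/3)(1/4 + 1/2 + 3/2) = 0.75

# P(X3 < 1/2 | X1 = 1/4, X2 = 1/4) = (integral of f over x3 in (0, 1/2)) / g(1/4, 1/4)
num, _ = integrate.quad(lambda x3: f(0.25, 0.25, x3), 0, 0.5)
print(num / g(0.25, 0.25))    # ~ 0.3333 = 1/3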
(2) Let Y be a random vector with three components, having mean vector μ = (μ1, μ2, μ3)' and variance-covariance matrix

Σ =
[ 5  3  1   ]
[ 3  7  2   ]
[ 1  2  σ33 ]

Find the mean and variance of a linear combination Z = a'Y of its components, and the variance-covariance matrix and correlation matrix of X = (X1, X2)' = AY, where a is a 3×1 vector and A a 2×3 matrix of constants.

Solution:
(i) For Z = a'Y,

E(Z) = a'E[Y] = a'μ  and  Var(Z) = a'Var[Y]a = a'Σa,

which for the coefficients used here gives Var(Z) = 53.

(ii) For X = (X1, X2)' = AY,

E[X] = Aμ  and  Var[X] = A Σ A'.

Here Var[X] has Var(X1) = 213, Var(X2) = 16 and Cov(X1, X2) = 47, so the correlation between X1 and X2 is

ρ12 = 47 / √(213 × 16) = 0.8051,

and the correlation matrix of X is

[ 1       0.8051 ]
[ 0.8051  1      ]
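The correlation in part (ii) can be checked directly from the reported variance-covariance matrix of X. A short Python/NumPy sketch, taking the values 213, 16 and 47 above as given:

import numpy as np

# Variance-covariance matrix of X = (X1, X2)' as reported in part (ii).
var_X = np.array([
    [213.0,  47.0],
    [ 47.0,  16.0],
])

sd = np.sqrt(np.diag(var_X))        # standard deviations of X1 and X2
rho = var_X / np.outer(sd, sd)      # rho_ij = sigma_ij / (sigma_i * sigma_j)
print(rho[0, 1])                    # ~ 0.8051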