ECE 534 RANDOM PROCESSES FALL 2011

PROBLEM SET 3 Due Tuesday, October 4


3. Random Vectors and Minimum Mean Squared Error Estimation
Assigned Reading: Chapter 3 and the section on matrices in the Appendix, in the course notes.
Problems to be handed in:
1 Comparison of MMSE estimators for an example
Let X = U³, where U is uniformly distributed over the interval [−1, 1].
(a) Find E[X|U] and calculate the MSE, E[(X − E[X|U])²].
(b) Find Ê[X|U] and calculate the MSE, E[(X − Ê[X|U])²].
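The following is an optional Monte Carlo sketch (in Python, not part of the original assignment) for checking the two MSE values in Problem 1 numerically; the sample size, the seed, and the least-squares fit used as a stand-in for the best affine estimator are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
U = rng.uniform(-1.0, 1.0, 10**6)          # U uniform on [-1, 1]
X = U**3

def mse(estimate):
    """Empirical E[(X - estimate)^2] for an estimator that is a function of U."""
    return np.mean((X - estimate) ** 2)

# (a) Plug the hand-derived E[X|U] (some function g of U) into mse(g(U)).

# (b) The best affine estimator a*U + b can also be fit numerically by least
#     squares; its coefficients and MSE should match the hand calculation.
A = np.column_stack([U, np.ones_like(U)])
(a, b), *_ = np.linalg.lstsq(A, X, rcond=None)
print(f"least-squares fit: a ~ {a:.3f}, b ~ {b:.3f}, MSE ~ {mse(a * U + b):.4f}")
```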
2 Estimation with jointly Gaussian random variables
Suppose X and Y are jointly Gaussian random variables with E[X] = 2, E[Y] = 4, Var(X) = 9,
Var(Y) = 25, and ρ = 0.2. (ρ is the correlation coefficient.) Let W = X + 2Y + 3.
(a) Find E[W] and Var(W).
(b) Calculate the numerical value of P{W ≥ 20}.
(c) Find the unconstrained estimator g*(W) of Y based on W with the minimum MSE, and find
the resulting MSE.
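An optional simulation check for Problem 2, assuming only the given moments; the sample size and seed are arbitrary, and the printed values are meant to be compared against the hand calculations for parts (a) and (b).

```python
import numpy as np

rng = np.random.default_rng(0)
sx, sy, rho = 3.0, 5.0, 0.2                          # std devs and correlation
mean = [2.0, 4.0]                                    # E[X] = 2, E[Y] = 4
cov = [[sx**2, rho * sx * sy],
       [rho * sx * sy, sy**2]]                       # Var(X) = 9, Var(Y) = 25

X, Y = rng.multivariate_normal(mean, cov, size=10**6).T
W = X + 2 * Y + 3

print("E[W]       ~", W.mean())                      # compare with part (a)
print("Var(W)     ~", W.var())                       # compare with part (a)
print("P{W >= 20} ~", np.mean(W >= 20))              # compare with part (b)
```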
3 Projections onto nested linear subspaces
(a) Use the Orthogonality Principle to prove the following statement: Suppose V₀ and V₁ are
two closed linear spaces of second order random variables, such that V₀ ⊃ V₁, and suppose X
is a random variable with finite second moment. Let Z*ᵢ be the random variable in Vᵢ with the
minimum mean square distance from X. Then Z*₁ is the variable in V₁ with the minimum mean
square distance from Z*₀.
(b) Suppose that X, Y₁, and Y₂ are random variables with finite second moments. For each of the
following three statements, identify the choice of subspaces V₀ and V₁ such that the statement
follows from part (a):
(i) Ê[X|Y₁] = Ê[ Ê[X|Y₁, Y₂] | Y₁ ].
(ii) E[X|Y₁] = E[ E[X|Y₁, Y₂] | Y₁ ]. (Sometimes called the tower property.)
(iii) E[X] = E[ Ê[X|Y₁] ]. (Think of the expectation of a random variable as the constant closest to
the random variable, in the m.s. sense.)
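As an optional numerical illustration of statement (ii), the sketch below verifies the tower property empirically for a made-up discrete example; the joint distribution of X, Y₁, Y₂ is hypothetical and chosen only so that the conditional means are easy to compute by grouping.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
Y1 = rng.integers(0, 2, n)                 # Y1 takes values in {0, 1}
Y2 = rng.integers(0, 3, n)                 # Y2 takes values in {0, 1, 2}
X = Y1 + Y2 + rng.normal(size=n)           # X depends on both (hypothetical model)

def cond_mean(values, *conds):
    """Empirical E[values | conds] for discrete conditioning variables,
    returned as an array aligned with the sample."""
    out = np.empty(n)
    keys = np.stack(conds, axis=1)
    for key in np.unique(keys, axis=0):
        mask = np.all(keys == key, axis=1)
        out[mask] = values[mask].mean()
    return out

lhs = cond_mean(X, Y1)                     # E[X | Y1]
rhs = cond_mean(cond_mean(X, Y1, Y2), Y1)  # E[ E[X | Y1, Y2] | Y1 ]
print(np.max(np.abs(lhs - rhs)))           # essentially zero, as (ii) predicts
```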
4 Conditional third moment for jointly Gaussian variables
(a) Suppose Z is a N(μ, σ²) random variable. Express E[Z³] in terms of μ and σ².
(b) Suppose (X, Y)ᵀ is a N( (0, 0)ᵀ, [ 1 ρ ; ρ 1 ] ) random vector, where |ρ| < 1. Express E[X³|Y]
in terms of ρ and Y.
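An optional Monte Carlo check for part (a): it prints empirical estimates of E[Z³] for a few arbitrary (μ, σ²) pairs, which can be compared against whatever closed-form expression is derived by hand.

```python
import numpy as np

rng = np.random.default_rng(0)
for mu, sigma in [(0.0, 1.0), (1.0, 2.0), (-2.0, 0.5)]:
    Z = rng.normal(mu, sigma, 10**7)       # Z ~ N(mu, sigma^2)
    print(f"mu = {mu}, sigma^2 = {sigma**2}:  empirical E[Z^3] ~ {np.mean(Z**3):.3f}")
```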
5 Some identities for estimators, version 3
Let X, Y, and Z be random variables with finite second moments and suppose X is to be estimated.
For each of the following, if true, give a brief explanation. (True means the statement is true for
any choice of X, Y, Z.) If false, give a counterexample.
(a) E[(X − E[X|Y])²] ≥ E[(X − E[X|Y, Y²])²].
(b) E[(X − Ê[X|Y])²] ≥ E[(X − Ê[X|Y, Y²])²].
(c) E[(X − Ê[X|Y])²] = E[(X − Ê[X|Y, Y²])²] if X and Y are jointly Gaussian.
(d) E[(X − E[X|Y])²] ≤ E[(X − E[ E[X|Z] | Y ])²].
(e) If E[(X − E[X|Y])²] = Var(X), then E[X|Y] = Ê[X|Y].
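An optional numerical probe, with the caveat that a single example can only refute a statement, never prove it: the sketch below evaluates both sides of statement (d) for one made-up discrete choice of (X, Y, Z).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
Y = rng.integers(0, 2, n)                  # hypothetical discrete Y
Z = rng.integers(0, 3, n)                  # hypothetical discrete Z
X = Y + Z + rng.normal(size=n)             # X depends on both (hypothetical model)

def cond_mean(values, cond):
    """Empirical E[values | cond] for a discrete conditioning variable."""
    out = np.empty(n)
    for c in np.unique(cond):
        out[cond == c] = values[cond == c].mean()
    return out

lhs = np.mean((X - cond_mean(X, Y)) ** 2)                  # E[(X - E[X|Y])^2]
rhs = np.mean((X - cond_mean(cond_mean(X, Z), Y)) ** 2)    # E[(X - E[E[X|Z]|Y])^2]
print(lhs, rhs)
```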
6 Steady state gains for one-dimensional Kalman filter
The Kalman filter equations for the following state and observation model, where σ² > 0, f is a
real constant, and x₀ is a N(0, σ²) random variable:
(state)        x_{k+1} = f x_k + w_k
(observation)  y_k = x_k + v_k
where w₁, w₂, . . . ; v₁, v₂, . . . are mutually independent N(0, 1) random variables, are given by
x̂_{k+1|k} = f x̂_{k|k−1} + K_k ( y_k − x̂_{k|k−1} )
σ²_{k+1} = σ²_k f² / (1 + σ²_k) + 1
K_k = f σ²_k / (1 + σ²_k).
(a) Show that lim_{k→∞} σ²_k exists. (Here, σ²_k is short for the conditional variance σ²_{k|k−1} used in the
Kalman filter equations.)
(b) Express the limit, σ²_∞, in terms of f.
(c) Explain why σ²_∞ = 1 if f = 0.
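An optional numerical experiment for Problem 6: iterating the given variance recursion from an arbitrary starting value gives a feel for whether σ²_k settles down, and to what value, before proving parts (a)-(c). The values of f and the starting variance below are arbitrary.

```python
# Variance recursion from the problem statement:
#   sigma_{k+1}^2 = sigma_k^2 * f^2 / (1 + sigma_k^2) + 1.
for f in [0.0, 0.5, 0.99, 2.0]:            # illustrative values of f
    s = 4.0                                # arbitrary starting value sigma^2 > 0
    for _ in range(200):
        s = s * f**2 / (1 + s) + 1
    print(f"f = {f:4}:  sigma_k^2 settles near {s:.6f}")
```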
7 Kalman filter for a rotating state with 2D observations
Consider the Kalman state and observation equations for the following matrices, where θₒ = 2π/10
and f = 0.99 (the matrices don't depend on time, so the subscript k is omitted):
F = f [ cos(θₒ)  −sin(θₒ)
        sin(θₒ)   cos(θₒ) ]        H = Q = R = I,
where I is the 2 × 2 identity matrix.
(a) Explain in words what successive iterates Fⁿxₒ are like, for a nonzero initial state xₒ (this is
the same as the state equation, but with the random term w_k left off).
(b) Write out the Kalman filter equations for this example, simplifying as much as possible (but
no more than possible! The equations don't simplify all that much.)
(c) Describe how the equations simplify further if the covariance matrix of the initial state, P₀,
satisfies P₀ = σ₀² I, for some constant σ₀² > 0. (Hint: Do you see any simplification in the P_k's,
where P_k = Cov(x_k)?)
(d) Identify the steady state values, Σ_∞ = lim_{k→∞} Σ_{k+1|k} and K_∞ = lim_{k→∞} K_k.
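An optional numerical sketch for Problem 7(d): it iterates a standard one-step-prediction covariance and gain recursion (written here in the same form as the scalar equations of Problem 6, with H = Q = R = I) from an arbitrary positive-definite starting covariance, and the steady-state values it prints can be compared with what the equations from part (b) predict. The recursion form and the starting covariance are assumptions made for this sketch, not taken from the problem statement.

```python
import numpy as np

theta_o, f = 2 * np.pi / 10, 0.99
F = f * np.array([[np.cos(theta_o), -np.sin(theta_o)],
                  [np.sin(theta_o),  np.cos(theta_o)]])
I = np.eye(2)                              # H = Q = R = I

Sigma = np.diag([4.0, 1.0])                # arbitrary positive-definite start
for _ in range(500):
    K = F @ Sigma @ np.linalg.inv(Sigma + I)        # gain, same form as Problem 6
    Sigma = F @ (Sigma - Sigma @ np.linalg.inv(Sigma + I) @ Sigma) @ F.T + I

print("steady-state Sigma:\n", Sigma)
print("steady-state K:\n", K)
```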