Ps 3
$Z_i$ be the random variable in $V_i$ with the minimum mean square distance from $X$. Then $Z_1$ is the variable in $V_1$ with the minimum mean square distance from $Z_0$.

(b) Suppose that $X$, $Y_1$, and $Y_2$ are random variables with finite second moments. For each of the following three statements, identify the choice of subspaces $V_0$ and $V_1$ such that the statement follows from part (a):
(i) $\widehat{E}[X \mid Y_1] = \widehat{E}\big[\, \widehat{E}[X \mid Y_1, Y_2] \,\big|\, Y_1 \big]$.

(ii) $E[X \mid Y_1] = E\big[\, E[X \mid Y_1, Y_2] \,\big|\, Y_1 \big]$. (Sometimes called the tower property.)
(iii) $E[X] = E\big[\, E[X \mid Y_1] \,\big]$. (Think of the expectation of a random variable as the constant closest to the random variable, in the m.s. sense.)
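Statement (ii) can be sanity-checked on a small finite probability space. The outcomes below are a made-up uniform example, not from the problem set, and `cond_exp_x` is a hypothetical helper for computing exact conditional expectations:

```python
# A made-up uniform distribution over (x, y1, y2) triples, for illustration
# only; any finite joint distribution would do.
outcomes = [(1, 0, 0), (3, 0, 1), (2, 1, 0), (6, 1, 1), (4, 0, 1), (0, 1, 0)]

def cond_exp_x(given):
    """E[X | the coordinates listed in `given` take the stated values]."""
    match = [o for o in outcomes if all(o[i] == v for i, v in given.items())]
    return sum(o[0] for o in match) / len(match)

# Left side of (ii): E[X | Y1 = 0].
lhs = cond_exp_x({1: 0})

# Right side: E[ E[X | Y1, Y2] | Y1 = 0 ] -- average the inner conditional
# expectation over the outcomes consistent with Y1 = 0 (uniform weights).
consistent = [o for o in outcomes if o[1] == 0]
rhs = sum(cond_exp_x({1: 0, 2: o[2]}) for o in consistent) / len(consistent)

print(lhs, rhs)  # both equal 8/3, as the tower property predicts
```

The same computation with `{1: 1}` in place of `{1: 0}` checks the other value of $Y_1$.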
4 Conditional third moment for jointly Gaussian variables
(a) Suppose $Z$ is a $N(\mu, \sigma^2)$ random variable. Express $E[Z^3]$ in terms of $\mu$ and $\sigma^2$.
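A candidate closed form for part (a) can be cross-checked by numerically integrating $z^3$ against the normal density. The comparison value $\mu^3 + 3\mu\sigma^2$ used below is the standard Gaussian third-moment identity, offered only as a sanity check, and the test point $(\mu, \sigma) = (1.5, 2.0)$ is arbitrary:

```python
import math

def third_moment(mu, sigma, n=200000, width=10.0):
    """Midpoint-rule approximation of E[Z^3] for Z ~ N(mu, sigma^2)."""
    lo, hi = mu - width * sigma, mu + width * sigma  # tails beyond are negligible
    dz = (hi - lo) / n
    total = 0.0
    for i in range(n):
        z = lo + (i + 0.5) * dz
        pdf = math.exp(-((z - mu) ** 2) / (2 * sigma**2)) / (sigma * math.sqrt(2 * math.pi))
        total += z**3 * pdf * dz
    return total

mu, sigma = 1.5, 2.0                       # arbitrary test point
numeric = third_moment(mu, sigma)
closed_form = mu**3 + 3 * mu * sigma**2    # standard identity, for comparison
print(numeric, closed_form)
```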
(b) Suppose $\begin{pmatrix} X \\ Y \end{pmatrix}$ is a $N\left( \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \begin{pmatrix} 1 & \rho \\ \rho & 1 \end{pmatrix} \right)$ random vector.
$$\sigma^2_{k+1} = \frac{\sigma^2_k f^2}{1 + \sigma^2_k} + 1, \qquad K_k = f \left( \frac{\sigma^2_k}{1 + \sigma^2_k} \right).$$
(a) Show that $\lim_{k \to \infty} \sigma^2_k$ exists. (Here, $\sigma^2_k$ is short for the conditional variance $\sigma^2_{k|k-1}$ used in the Kalman filter equations.)

(b) Express the limit, $\sigma^2_\infty$, in terms of $f$.

(c) Explain why $\sigma^2_\infty = 1$ if $f = 0$.
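The behavior asked about in parts (a)-(c) can be explored numerically. A minimal sketch, assuming the recursion $\sigma^2_{k+1} = f^2 \sigma^2_k / (1 + \sigma^2_k) + 1$ as written above, with $f = 0.99$ and the starting points chosen arbitrarily:

```python
def riccati_limit(f, sigma2_0, iters=2000):
    """Iterate sigma^2_{k+1} = f^2 sigma^2_k / (1 + sigma^2_k) + 1."""
    s = sigma2_0
    for _ in range(iters):
        s = f * f * s / (1.0 + s) + 1.0
    return s

f = 0.99
a = riccati_limit(f, 0.5)    # arbitrary small start
b = riccati_limit(f, 50.0)   # arbitrary large start
print(a, b)  # both runs reach the same value, suggesting the limit in (a)

# When f = 0 the map sends every sigma^2_k to 1 in a single step, matching (c).
print(riccati_limit(0.0, 7.0))  # 1.0
```

Solving $s = f^2 s/(1+s) + 1$ for the fixed point gives $s^2 - f^2 s - 1 = 0$, which the iterates above agree with numerically.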
7 Kalman filter for a rotating state with 2D observations
Consider the Kalman state and observation equations for the following matrices, where $\theta_o = 2\pi/10$ and $f = 0.99$ (the matrices don't depend on time, so the subscript $k$ is omitted):

$$F = f \begin{pmatrix} \cos(\theta_o) & -\sin(\theta_o) \\ \sin(\theta_o) & \cos(\theta_o) \end{pmatrix}, \qquad H = Q = R = I,$$
where $I$ is the $2 \times 2$ identity matrix.

(a) Explain in words what successive iterates $F^n x_o$ are like, for a nonzero initial state $x_o$ (this is the same as the state equation, but with the random term $w_k$ left off).
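For intuition on part (a), a minimal numerical sketch; the initial state $[1, 0]^T$ is an arbitrary nonzero choice:

```python
import math

# F scales by f = 0.99 and rotates by theta_o = 2*pi/10 per step, so the
# iterates F^n x_o trace a slowly shrinking spiral, one revolution per 10 steps.
f, theta = 0.99, 2 * math.pi / 10
F = [[f * math.cos(theta), -f * math.sin(theta)],
     [f * math.sin(theta),  f * math.cos(theta)]]

def mat_vec(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

x = [1.0, 0.0]  # arbitrary nonzero initial state
for n in range(10):
    x = mat_vec(F, x)
print(x)  # approximately [0.99**10, 0]: back to the start direction, shrunk
```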
(b) Write out the Kalman filter equations for this example, simplifying as much as possible (but no more than possible! The equations don't simplify all that much.)
(c) Describe how the equations simplify further if the covariance matrix of the initial state, $P_0$, satisfies $P_0 = \sigma^2_0 I$ for some constant $\sigma^2_0 > 0$. (Hint: Do you see any simplification in the $P_k$'s, where $P_k = \mathrm{Cov}(x_k)$?)
(d) Identify the steady state values, $\Sigma_\infty = \lim_{k \to \infty} \Sigma_{k+1|k}$ and $K_\infty = \lim_{k \to \infty} K_k$.
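The limits in part (d) can be estimated numerically. This sketch assumes the standard one-step-predictor Riccati update $\Sigma' = F(\Sigma - \Sigma(\Sigma + I)^{-1}\Sigma)F^T + Q$ and gain $K = F\Sigma(\Sigma + I)^{-1}$ with $H = Q = R = I$, and the start $\Sigma_0 = I$ is an arbitrary choice, not from the problem statement:

```python
import numpy as np

f, theta = 0.99, 2 * np.pi / 10
F = f * np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
I = np.eye(2)

Sigma = I.copy()  # arbitrary initial prediction covariance
for _ in range(2000):
    gain_core = Sigma @ np.linalg.inv(Sigma + I)   # Sigma (Sigma + I)^{-1}
    K = F @ gain_core                              # Kalman gain K_k
    Sigma = F @ (Sigma - gain_core @ Sigma) @ F.T + I

print(np.round(Sigma, 6))  # numerically a scalar multiple of I
print(np.round(K, 6))      # numerically a scaled rotation matrix
```

Because $F F^T = f^2 I$, an isotropic $\Sigma$ stays isotropic, so the scalar factor obeys the same one-dimensional recursion as in the previous problem.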