Ps 6
$\mid X_0]$ and the resulting MSE, for $\theta \ge 0$.
(b) Find the value of $\theta$ that minimizes the MSE found in part (a).
3 Prediction of future integral of a Gaussian Markov process
Suppose $(X_t : t \in \mathbb{R})$ is a mean zero Gaussian Markov process with $R_X(\tau) = e^{-|\tau|}$. Let $J = \int_0^\infty e^{-t} X_t \, dt$, so that $J$ is the weighted time average of $(X_t : t \ge 0)$, computed using the exponential density $e^{-t}$, which has mean $1$.
(a) Describe the probability distribution of J.
(b) Find $E[J \mid X_0]$, and the resulting mean square error, $\mathrm{MSE} = E[(J - E[J \mid X_0])^2]$.
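A mean zero Gaussian Markov process with $R_X(\tau) = e^{-|\tau|}$ is the stationary Ornstein-Uhlenbeck process, so the answers to parts (a) and (b) can be sanity-checked by simulation. The sketch below is only a numerical check, not a substitute for the derivation; the cutoff $t_{\max} = 15$, step size $h = 0.01$, and number of paths are illustrative discretization choices, not part of the problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# A mean-zero Gaussian Markov process with R_X(tau) = e^{-|tau|} is the
# stationary Ornstein-Uhlenbeck process.  Exact one-step update:
#   X_{t+h} = e^{-h} X_t + sqrt(1 - e^{-2h}) Z,   Z ~ N(0, 1).
h, t_max, n_paths = 0.01, 15.0, 20000   # discretization choices (assumptions)
n_steps = int(t_max / h)

x = rng.standard_normal(n_paths)        # stationary start: X_0 ~ N(0, 1)
x0 = x.copy()
a, b = np.exp(-h), np.sqrt(1.0 - np.exp(-2.0 * h))
J = np.zeros(n_paths)
for i in range(n_steps):
    J += np.exp(-i * h) * x * h         # Riemann sum for J = int e^{-t} X_t dt
    x = a * x + b * rng.standard_normal(n_paths)

print("Var(J)  ~", np.var(J))           # compare with the variance found in (a)
print("Cov(J,X0) ~", np.mean(J * x0))   # determines the slope of E[J | X_0]
```

The empirical variance of $J$ and covariance with $X_0$ can be compared against the closed-form answers from parts (a) and (b).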
4 A two-state stationary Markov process
Suppose X is a stationary Markov process with mean zero, state space $\{-1, 1\}$, and transition rate
matrix Q =
$\varphi_0(t) = 1/\sqrt{T}$, $\varphi_{2k}(t) = \sqrt{\tfrac{2}{T}} \cos\!\left(\tfrac{2\pi k t}{T}\right)$, and $\varphi_{2k+1}(t) = \sqrt{\tfrac{2}{T}} \sin\!\left(\tfrac{2\pi k t}{T}\right)$ for $k \ge 1$.
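Orthonormality of this Fourier basis in $L^2[0, T]$ can be checked numerically. In the sketch below, the constant member $\varphi_0(t) = 1/\sqrt{T}$, the value $T = 2$, and the midpoint-quadrature grid are all illustrative assumptions, not taken from the problem statement.

```python
import numpy as np

T = 2.0                                  # illustrative value of T (assumption)
n = 4000
dt = T / n
t = (np.arange(n) + 0.5) * dt            # midpoint grid on [0, T]

def phi(m, t):
    """Fourier basis on [0, T]; phi_0 = 1/sqrt(T) assumed to be the constant."""
    if m == 0:
        return np.full_like(t, 1.0 / np.sqrt(T))
    k = m // 2 if m % 2 == 0 else (m - 1) // 2   # phi_{2k} = cos, phi_{2k+1} = sin
    trig = np.cos if m % 2 == 0 else np.sin
    return np.sqrt(2.0 / T) * trig(2.0 * np.pi * k * t / T)

idx = [0, 2, 3, 4, 5, 6, 7]              # index 1 unused: pairs start at k = 1
G = np.array([[np.sum(phi(i, t) * phi(j, t)) * dt for j in idx] for i in idx])
print(np.max(np.abs(G - np.eye(len(idx)))))   # deviation from identity: tiny
```

Midpoint quadrature is essentially exact here because every product of basis functions is a trigonometric polynomial over a whole number of periods of $[0, T]$.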
(b) Given $N \ge 1$, let $f^{(N)}$ be the function minimizing the $L^2$ norm of the approximation error, $\|f - f^{(N)}\|$, over all functions $f^{(N)}$ with $N$ or fewer nonzero coordinates relative to the basis in part (a). Describe $f^{(N)}$ and find $N$ such that $\|f - f^{(N)}\|^2 \le (0.01)\|f\|^2$. (You need not find the smallest such $N$, but try to get close; using an integral bound for a series can help.)
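The hinted technique, bounding a series tail by an integral, works as follows: for a decreasing positive function $g$, $\sum_{k=N+1}^\infty g(k) \le \int_N^\infty g(x)\,dx$. The snippet below illustrates this with $g(x) = 1/x^2$, which is only a generic example, not the coefficient sequence of the particular $f$ in this problem.

```python
# Integral bound for a series tail: since 1/x^2 is decreasing,
#   sum_{k = N+1}^{inf} 1/k^2  <=  int_N^inf dx / x^2  =  1 / N.
# (1/k^2 is an illustrative series, not the coefficients of this problem's f.)
N = 10
tail = sum(1.0 / k**2 for k in range(N + 1, 10**6))   # numerically truncated tail
bound = 1.0 / N                                        # integral upper bound
print(tail, "<=", bound)
```

The matching lower bound $\int_{N+1}^\infty dx/x^2 = 1/(N+1)$ shows the integral estimate is tight to within $1/(N(N+1))$.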
(c) Given $N \ge 1$, let $W^{(N)}$ be the best approximation to the Brownian motion process $W = (W_t : 0 \le t \le T)$ on the interval $[0, T]$ (in the sense of minimizing the expected energy of the error: $E[\|W - W^{(N)}\|^2]$) over all random processes $W^{(N)}$ expressible as a random linear combination of $N$ functions on the interval $[0, T]$. Describe $W^{(N)}$ and find $N$ such that $E[\|W - W^{(N)}\|^2] \le (0.01) E[\|W\|^2]$. (Hint: Use the basis functions $(\varphi_n : n \ge 0)$ associated with the KL expansion of $W$, given in the notes.)
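Assuming the notes use the standard KL expansion of Brownian motion on $[0, T]$, with eigenfunctions $\sqrt{2/T}\,\sin((n + \tfrac12)\pi t / T)$ and eigenvalues $\lambda_n = \big(T / ((n + \tfrac12)\pi)\big)^2$ for $n \ge 0$, the required $N$ can be found numerically. The sketch below relies on two standard facts: $E[\|W\|^2] = \int_0^T t\,dt = T^2/2 = \sum_n \lambda_n$, and the common factor $T^2$ cancels from both sides of the threshold condition.

```python
import numpy as np

# KL eigenvalues of Brownian motion on [0, T]: lambda_n = (T / ((n + 1/2) pi))^2.
# Work with mu_n = lambda_n / T^2; their total sum is exactly 1/2 = E[||W||^2]/T^2,
# so the 1% condition is independent of T.
mu = lambda n: 1.0 / ((n + 0.5) * np.pi) ** 2
total = 0.5
target = 0.01 * total

partial, N = 0.0, 0
while total - partial > target:     # tail sum = E[||W - W^(N)||^2] / T^2
    partial += mu(N)
    N += 1
print(N)                            # smallest N meeting the 1% energy bound
```

This finds the exact smallest $N$; the integral-bound estimate from part (b) lands close to the same value.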
6 First order differential equation driven by Gaussian white noise
Let X be the solution of the ordinary differential equation X