
University of Cyprus

Department of Electrical and Computer Engineering


ECE 621: Random Processes
Fall 2016
Take Home Exam
Basic Probability Concepts
Issued: Friday, 02 December 2016

Due: 20:00, Friday, 23 December 2016

Reading: Any of the references listed in the course syllabus.

Problem 2.1
(Sequential continuity of probability distributions and the Borel–Cantelli lemma)
Let (Ω, F, P) denote a probability space and {A_n}_{n=1}^∞ be a sequence of events in F.
Show the following.

(a) If {A_n}_{n=1}^∞ is a decreasing sequence, then P(∩_{n=1}^∞ A_n) = lim_{n→∞} P(A_n).

(b) ω ∈ ∩_{n=1}^∞ ∪_{k≥n} A_k is equivalent to: ω belongs to an infinite number of the A_n's. Moreover, if Σ_{i=1}^∞ P(A_i) < ∞, then P(lim_{n→∞} sup A_n) = 0.

(c) ω ∈ ∪_{n=1}^∞ ∩_{k≥n} A_k is equivalent to: ω belongs to all but a finite number of the A_n's. Moreover, if Σ_{i=1}^∞ P(A_i) = ∞ and {A_n}_{n=1}^∞ are independent, then P(lim_{n→∞} sup A_n) = 1.

Problem 2.2
(Sequential continuity)
1. Suppose the collection of sets F is such that i) Ω ∈ F, and ii) if A, B ∈ F then A \ B ∈ F. Show that F is an algebra.

2. Let (Ω, F, P) ≜ (R, B(R), ℓ_R), where R = (−∞, ∞) and ℓ_R is the Lebesgue measure, that is, ℓ_R([a, b]) = b − a, a ≤ b.
Determine the following (justifying your answers):

P( ∩_{n=1}^∞ [0, 1 + 1/n] ),   P( ∪_{n=1}^∞ [1/(n+1), 1] ).

3. Show that any probability distribution function on R = (−∞, ∞) is right-continuous, that is,

lim_{n→∞} F_X(x + ε_n) = F_X(x),

where ε_n > 0 and lim_{n→∞} ε_n = 0.

Problem 2.3
(RVs taking finite values, also called simple RVs)
Let (Ω = (0, 1], F = B((0, 1]), P((a, b]) ≜ b − a, 0 ≤ a ≤ b ≤ 1) be a probability space, A1 = (0, 1/2], A2 = (1/2, 3/4], A3 = (3/4, 1] a partition of Ω, {x_j}_{j=1}^3 arbitrary real numbers, and I_{A_i} the indicator of the event A_i, that is, I_{A_i}(ω) = 1 if ω ∈ A_i and I_{A_i}(ω) = 0 if ω ∈ A_i^c. We define the RV

X = Σ_{j=1}^3 x_j I_{A_j} : Ω → (−∞, ∞).

Find
(a) the expected value E[X],
(b) the second moment E[X²],
(c) the variance σ_X²,
(d) the characteristic function of X.
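Since P(A1) = 1/2 and P(A2) = P(A3) = 1/4, every quantity in parts (a)–(d) reduces to a finite sum over the partition. A minimal numerical sketch, assuming illustrative values x1 = 1, x2 = 2, x3 = 3 (the problem leaves the x_j arbitrary):

```python
# Numerical sketch for Problem 2.3 with assumed x_j values; the probabilities
# come directly from the lengths of the partition intervals of (0, 1].
import cmath

p = [0.5, 0.25, 0.25]          # P(A1), P(A2), P(A3)
x = [1.0, 2.0, 3.0]            # illustrative x_j (arbitrary in the problem)

mean = sum(pi * xi for pi, xi in zip(p, x))
second_moment = sum(pi * xi ** 2 for pi, xi in zip(p, x))
variance = second_moment - mean ** 2

def char_fn(h):
    # phi_X(h) = E[e^{ihX}] = sum_j P(A_j) e^{i h x_j}
    return sum(pi * cmath.exp(1j * h * xi) for pi, xi in zip(p, x))

print(mean, second_moment, variance)   # 1.75 3.75 0.6875
print(char_fn(0.0))                    # (1+0j), since phi_X(0) = 1
```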

Problem 2.4
(Functions of two independent RVs)
Let X1, X2 be real-valued RVs having joint density given by

f_X(x1, x2) = (1/(2π)) exp{ −(1/2)(x1² + x2²) }.

Let Y = √(X1² + X2²). Find the conditional expectation E(X1 | Y).
Hint: Introduce the RV Θ so that X1 = Y cos(Θ), X2 = Y sin(Θ).
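The joint density is rotationally symmetric, so the angle Θ of the hint is uniform and independent of Y, which forces E(X1 | Y) = Y·E[cos Θ] = 0. A Monte Carlo sanity check (not a proof), conditioning on a thin shell of assumed width around an illustrative value y0 = 1:

```python
# Monte Carlo sketch for Problem 2.4: estimate E[X1 | Y ~ y0] by averaging X1
# over samples whose radius falls in a thin shell around y0 (shell width assumed).
import math
import random

random.seed(0)
y0, eps = 1.0, 0.05
acc, cnt = 0.0, 0
for _ in range(200_000):
    x1, x2 = random.gauss(0, 1), random.gauss(0, 1)
    if abs(math.hypot(x1, x2) - y0) < eps:
        acc += x1
        cnt += 1
cond_mean = acc / cnt
print(cond_mean)  # close to 0
```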
Problem 2.5
(Cramér's theorem)
Let X, Y and Z be Independent and Identically Distributed (IID) nonnegative random variables with probability density function f(ξ) ≜ c e^{−ξ} for ξ ≥ 0.

1. Determine the value of c for f(·) to be a valid probability density function.

2. Show the following non-trivial upper bound: if α > E[X], then

P( (X + Y + Z)/3 ≥ α ) ≤ e^{−3θ(α)},

and identify θ(α).
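Assuming the density normalizes to the Exp(1) law (c = 1), the Chernoff/Cramér rate works out to θ(α) = α − 1 − ln α for α > E[X] = 1. A simulation sketch comparing the empirical tail of the sample mean of three draws against the bound, at an illustrative threshold α = 2:

```python
# Numerical sketch for Problem 2.5 under the Exp(1) assumption: the empirical
# probability that the mean of three IID Exp(1) draws exceeds a should sit
# below the Chernoff bound exp(-3*theta(a)) with theta(a) = a - 1 - ln(a).
import math
import random

random.seed(1)
a = 2.0                                   # illustrative threshold, a > E[X] = 1
bound = math.exp(-3 * (a - 1 - math.log(a)))
n = 100_000
hits = 0
for _ in range(n):
    s = random.expovariate(1) + random.expovariate(1) + random.expovariate(1)
    if s / 3 >= a:
        hits += 1
emp = hits / n
print(emp, "<=", bound)
```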

Problem 2.6
(Sequence of independent RVs)
Let X, Y be real-valued independent and identically distributed RVs with E[|X|] < ∞.
(i) Show that

E[X | X + Y] = E[Y | X + Y] = (X + Y)/2.

(ii) Suppose {X_i}_{i≥1} is a sequence of independent and identically distributed RVs. Using part (i), establish that

E[X1 | S_n] = S_n / n,  where S_n = Σ_{i=1}^n X_i.

Problem 2.7
(Mixed RVs)
Let X and Y be independent RVs with the following distributions:
1. X is N(0, σ²);
2. Y takes the value 1 with probability p and −1 with probability q = 1 − p.
Define

Z ≜ X + Y;  W ≜ XY.

Then do the following.
(a) Find the probability density of Z and its characteristic function.
(b) Find the conditional density of Z given Y.
(c) Show that W is Gaussian. Define U ≜ W + X and show that U is not Gaussian. Find E[W] and Var(W). Are W and X correlated? Independent?
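For part (c), note Cov(W, X) = E[X²Y] = σ²(2p − 1), so W and X are correlated whenever p ≠ 1/2, even though W is itself Gaussian (the sign flip is invisible because X is symmetric). A Monte Carlo sketch with illustrative values p = 0.7, σ = 1:

```python
# Monte Carlo sketch for Problem 2.7(c): estimate Cov(W, X) with W = XY,
# using assumed p = 0.7 and sigma = 1; E[W] = E[X] = 0, so the raw average
# of W*X estimates the covariance directly.
import random

random.seed(2)
p, n = 0.7, 200_000
acc = 0.0
for _ in range(n):
    x = random.gauss(0, 1)
    y = 1 if random.random() < p else -1
    acc += (x * y) * x
cov_wx = acc / n
print(cov_wx)  # close to sigma^2 * (2p - 1) = 0.4
```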

Problem 2.8
(Markov property of sequences of events)
Suppose {A_n}_{n=1}^∞ is a sequence of Markovian events, that is, they satisfy

P(A_n | A_1, A_2, ..., A_{n−1}) = P(A_n | A_{n−1}),  n = 2, 3, ...

Show the following.

(a) The sequence {A_n}_{n=1}^∞ is Markovian backwards, that is,

P(A_n | A_{n+1}, A_{n+2}, ...) = P(A_n | A_{n+1}),  n = 1, 2, ...

(b) The sequence {A_n}_{n=1}^∞ satisfies conditional independence:

P(A_{n+1} ∩ A_{n−1} | A_n) = P(A_{n+1} | A_n) P(A_{n−1} | A_n),  n = 2, 3, ...

Problem 2.9
(Limits of sequences of RVs and measurability)
Let (Ω, F, P) be a probability space and let X_n : Ω → (−∞, ∞), n = 1, 2, ..., be a sequence of measurable RVs.
(a) Show that the following functions are measurable.
(i) sup_n X_n.
(ii) inf_n X_n.
(iii) lim_{n→∞} sup X_n.
(iv) lim_{n→∞} inf X_n.
(b) Let X = Σ_{j=1}^n α_j I_{A_j} : Ω → (−∞, ∞), where {A_j}_{j=1}^n is a partition of Ω and the α_j are real numbers.
Show that X is F-measurable.

Problem 2.10
(Moments of Gaussian RVs)
Let X : Ω → (−∞, ∞) be a Gaussian RV, N(μ, σ²), σ² > 0.
Show the following.
(a) The mean E[X] satisfies

E[X] = ∫_{−∞}^{∞} x (1/√(2πσ²)) e^{−(x−μ)²/(2σ²)} dx = μ.

(b) The variance E[X − E[X]]² satisfies

E[X − E[X]]² = ∫_{−∞}^{∞} (x − μ)² (1/√(2πσ²)) e^{−(x−μ)²/(2σ²)} dx = σ².
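Both integrals can be checked by simple quadrature before attempting the change-of-variables proof. A sketch with illustrative parameters μ = 1, σ = 2 (any values work):

```python
# Quadrature sketch for Problem 2.10: verify the two Gaussian integrals
# numerically with assumed mu = 1.0, sigma = 2.0, using the midpoint rule
# over mu +/- 10 sigma (the truncated tails are negligible).
import math

mu, sigma = 1.0, 2.0

def pdf(x):
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

dx = 0.001
m = int(20 * sigma / dx)
mean = var = 0.0
for k in range(m):
    x = mu - 10 * sigma + (k + 0.5) * dx
    w = pdf(x) * dx
    mean += x * w
    var += (x - mu) ** 2 * w
print(mean, var)  # approx mu = 1.0 and sigma^2 = 4.0
```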

Problem 2.11
(Characteristic function of RVs and properties)
Given a continuous RV X : Ω → (−∞, ∞), the characteristic function is defined by

φ_X(h) ≜ E[e^{ihX(ω)}],  i² = −1,  h ∈ (−∞, ∞).

The above expression is the Fourier transform of the probability density function f_X(x) of the RV X, when it exists. Often, we work with the Laplace transform of the probability density function, called the moment generating function, defined by

M_X(h) ≜ E[e^{hX(ω)}],  h ∈ (−∞, ∞).

In addition, we also work with another related function, called the cumulant generating function of the RV X, defined by

Λ_X(h) ≜ log E[e^{hX(ω)}],  h ∈ (−∞, ∞).

(a) Show that the functions M(h) and Λ(h) are convex functions of h ∈ (−∞, ∞).
(b) For a Gaussian RV X ~ N(μ, σ²), compute the functions φ(h), M(h), Λ(h).
(c) Give the analogs of the functions φ(h), M(h), Λ(h) for a discrete RV X taking only countably many values. Compute φ(h), M(h), Λ(h) for a Poisson RV X with parameter λ > 0.
Problem 2.12
(Functions of two RVs)
Let X and Y be two random variables with zero means, unit variances and correlation coefficient ρ.
Show that

E[max(X, Y)²] ≤ 1 + √(1 − ρ²).
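A Monte Carlo sketch of the bound, taking X, Y jointly Gaussian with an illustrative ρ = 0.5 (the bound itself requires no Gaussian assumption). For a Gaussian pair, X + Y and X − Y are uncorrelated hence independent, which makes E[max(X,Y)²] exactly 1, comfortably inside the bound:

```python
# Monte Carlo sketch for Problem 2.12 with an assumed jointly Gaussian pair,
# rho = 0.5: estimate E[max(X, Y)^2] and compare with 1 + sqrt(1 - rho^2).
import math
import random

random.seed(3)
rho, n = 0.5, 100_000
acc = 0.0
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x, y = z1, rho * z1 + math.sqrt(1 - rho ** 2) * z2
    acc += max(x, y) ** 2
emp = acc / n
bound = 1 + math.sqrt(1 - rho ** 2)
print(emp, "<=", bound)  # for a Gaussian pair the left side is near 1
```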

Problem 2.13
(Mean Square Error Estimation (MSEE))
Let random variables X and Y have joint density

f(x, y) = x + y, if x, y ∈ [0, 1],
          0,     else.

(a) Find the linear MMSE (LMMSE) estimator of X given Y.
(b) What is the mean squared error of the LMMSE estimator you computed in part (a)?
(c) Find the (possibly nonlinear) MMSE estimator of X given Y.
(d) What is the mean squared error of the MMSE estimator you computed in part (c)?
(e) Compute the ratio of the mean squared errors, i.e., divide your answer in part (d) (the MMSE squared error) by your answer in part (b) (the LMMSE squared error).
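The moments needed for the LMMSE coefficients in part (a) are low-order polynomial integrals over the unit square, so they are easy to cross-check numerically. A quadrature sketch (the closed-form answers it should reproduce are slope −1/11 and intercept 7/11):

```python
# Numerical sketch for Problem 2.13(a): midpoint quadrature of the moments of
# f(x, y) = x + y on [0,1]^2, then the LMMSE coefficients a (slope) and b
# (intercept) of the estimator Xhat = a*Y + b.
m = 400
h = 1.0 / m
pts = [(k + 0.5) * h for k in range(m)]
EX = EY = EXY = EY2 = 0.0
for x in pts:
    for y in pts:
        w = (x + y) * h * h
        EX += x * w
        EY += y * w
        EXY += x * y * w
        EY2 += y * y * w
a = (EXY - EX * EY) / (EY2 - EY * EY)   # Cov(X,Y) / Var(Y) = -1/11
b = EX - a * EY                          # = 7/11
print(a, b)  # approx -0.0909 and 0.6364
```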

Problem 2.14
(MSEE of Gaussian RV)
Repeat parts (a) through (e) of the previous problem for the case when Y is a N(0, 1) random variable
and X = |Y |.

Problem 2.15
(MSEE of Gaussian RVs)
Show that for jointly Gaussian random vectors ~X and ~Y, the linear MMSE estimate of ~X and the (possibly non-linear) MMSE estimate of ~X based on the observations ~Y = ~y are identical.

Problem 2.16
(MSEE of Gaussian RVs)
Consider a 3-dimensional Gaussian random vector ~x with zero mean and covariance matrix

    [ 2  1  1 ]
Σ = [ 1  2  0 ]
    [ 1  0  2 ]

(a) Find a matrix A such that ~x = A~z, where ~z is another 3-dimensional Gaussian random vector with zero mean and covariance the identity matrix.
(b) Determine E[X1 | X2, X3], where X_i is the i-th component of ~x.
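One valid choice of A in part (a) is the lower-triangular Cholesky factor of Σ (so A Aᵀ = Σ), and because Cov(X2, X3) = 0, the conditional mean in part (b) reduces to Σ₁₂ Σ₂₂⁻¹ [X2, X3]ᵀ = (X2 + X3)/2. A small sketch computing both:

```python
# Sketch for Problem 2.16: element-by-element Cholesky factorization of Sigma,
# plus the coefficients c2, c3 of E[X1|X2,X3] = c2*X2 + c3*X3 (Sigma_22 is
# diagonal here, so each coefficient is a simple ratio).
import math

Sigma = [[2, 1, 1], [1, 2, 0], [1, 0, 2]]

A = [[0.0] * 3 for _ in range(3)]
for i in range(3):
    for j in range(i + 1):
        s = Sigma[i][j] - sum(A[i][k] * A[j][k] for k in range(j))
        A[i][j] = math.sqrt(s) if i == j else s / A[j][j]

c2 = Sigma[0][1] / Sigma[1][1]   # = 1/2
c3 = Sigma[0][2] / Sigma[2][2]   # = 1/2
print(A)
print(c2, c3)  # E[X1|X2,X3] = (X2 + X3)/2
```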

Problem 2.17
(Additive noise communication channels)

Consider the following communication problem, in which a length-n vector ~x of bits is to be transmitted over a noisy channel to a receiver. The relationship between the transmitted data ~x and the received length-m vector ~y is given by ~y = H~x + ~w, where H is a known m × n matrix representing the effects of the LTI channel (i.e., it is a convolution matrix), and the noise ~w is N(0, σ²I). Note that n need not equal m due to edge effects from the channel. The data is transmitted using BPSK, i.e., X1, X2, ..., Xn are i.i.d. and P(x_i = 1) = P(x_i = −1) = 1/2 for i = 1, ..., n.
(a) Determine the joint density of ~x and ~y.
(b) For an estimator of the form x̂(~y) = A~y + ~b, determine the values of the matrix A and vector ~b such that the error ε = E[||~x − x̂(~y)||²] is minimized.
(c) If the estimator x̂(~y) is not restricted to be of the form given in part (b), determine the x̂(~y) that minimizes the error ε = E[(x − x̂(y))²]. For this part of the problem (and this part only) assume that x, y and H are scalars, i.e., m = n = 1.
(d) Determine an upper bound on ε from part (c).

Problem 2.18
(Vector of Gaussian RVs)
Let the random vector [X Y Z]^T be a Gaussian random vector with mean [1 2 3]^T and covariance matrix

    [ 4  1  1 ]
Σ = [ 1  2  0 ]
    [ 1  0  1 ]

(a) Suppose that X is estimated by an estimator of the form X̂ = aY² + bY + c. Determine numerical values of a, b and c such that ε = E((X − X̂)²) is minimized.
(b) Determine the resulting numerical value of the mean squared error ε from part (a).
(c) Suppose that X is estimated by an estimator of the form X̂ = aY². Determine the numerical value of a such that ε = E((X − X̂)²) is minimized.
(Hint: The characteristic function of a Gaussian vector x with distribution N(m, K) is Φ(u) = exp(−(1/2) u^T K u + j u^T m).)
Problem 2.19
(Statistics of random processes)
(a) Let X[n] be a discrete-time random process defined for n = 1, 2, 3, ... . The samples X[1], X[2], X[3], ..., are independent, identically distributed (i.i.d.) random variables with Pr(X[n] = 0) = Pr(X[n] = 1) = 1/2. Let the random process Y[n] be defined by

Y[n] = |X[n] − X[n−1]|.

Find E[Y[n]] and Var[Y[n]].
(b) Let X(t) be a random process defined by X(t) = At + B.
(i) If A is constant and B is a random variable that is uniformly distributed in [0, 2], sketch a
sample function of this process and find E[X(t)].
(ii) If A, B are independent, identically distributed (i.i.d.) Gaussian random variables with mean 0 and variance σ², find the joint pdf f_{X(t1),X(t2)}(x1, x2).
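Part (a) above can be tabulated exactly: X[n] and X[n−1] are i.i.d. uniform on {0, 1}, so Y[n] = |X[n] − X[n−1]| takes its values over four equally likely pairs. A one-screen enumeration sketch:

```python
# Enumeration sketch for Problem 2.19(a): the marginal law of
# Y[n] = |X[n] - X[n-1]| from the four equally likely (X[n-1], X[n]) pairs.
from itertools import product

vals = [abs(a - b) for a, b in product([0, 1], repeat=2)]
EY = sum(vals) / len(vals)
VarY = sum(v ** 2 for v in vals) / len(vals) - EY ** 2
print(EY, VarY)  # 0.5 and 0.25
```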
Problem 2.20
(Statistics of random processes)
Let B[n] be a Bernoulli random sequence equally likely to take the values ±1 (independently between different time steps). Define the random process

X(t) = p sin( 2π f0 t + B[n] π/2 )  for nT ≤ t < (n + 1)T, for all n,

where p and f0 are real numbers.

(a) Determine the mean function μ_X(t).
(b) Determine the covariance function K_XX(t1, t2).

Problem 2.21
(Nonuniform Poisson process)
A nonuniform Poisson counting process N(t) with time-varying rate λ(t) (λ(t) ≥ 0 for all t ≥ 0) is defined for t ≥ 0 as follows:
(i) N(0) = 0;
(ii) N(t) has independent increments;
(iii) For all t2 ≥ t1,

P[N(t2) − N(t1) = k] = (u^k / k!) exp(−u),  for k ≥ 0,

where u = ∫_{t1}^{t2} λ(v) dv.

Answer the following:

(a) Find μ_N(t).
(b) Find R_N(t1, t2).
(c) Repeat parts (a) and (b) for the case when λ(t) = 1 + 2t.
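For part (c), the mean function is μ_N(t) = ∫₀ᵗ (1 + 2v) dv = t + t². A simulation sketch using Lewis–Shedler thinning to check this at t = 1 (where the mean should be 2):

```python
# Simulation sketch for Problem 2.21(c): sample the nonuniform Poisson process
# with lambda(t) = 1 + 2t on [0, 1] by thinning a rate-3 homogeneous process
# (3 dominates 1 + 2t there), then compare the empirical mean with t + t^2.
import random

random.seed(4)

def sample_N(t_end, lam_max):
    # propose arrivals at rate lam_max; keep each with prob lambda(t)/lam_max
    t, count = 0.0, 0
    while True:
        t += random.expovariate(lam_max)
        if t > t_end:
            return count
        if random.random() < (1 + 2 * t) / lam_max:
            count += 1

n = 50_000
avg = sum(sample_N(1.0, 3.0) for _ in range(n)) / n
print(avg)  # close to 1 + 1^2 = 2
```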

Problem 2.22
(Random Telegraph signal)
A random telegraph signal is a random process X(t) defined for t ≥ 0 as follows: X(0) = +1 and X(t) switches between +1 and −1 according to a Poisson random arrival sequence T[n], i.e.,

X(t) = +1,  0 ≤ t < T[1],
       −1,  T[1] ≤ t < T[2],
       +1,  T[2] ≤ t < T[3],
       ...

Assume that the rate parameter λ of the Poisson random arrival time sequence is known.
(a) Argue that X(t) is a Markov process and draw and label its state transition diagram.
(b) Find the steady-state probability that X(t) = +1 in terms of the rate parameter λ.
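By the symmetry of the two states, the steady-state probability of +1 is 1/2 for any rate λ. A simulation sketch with an illustrative λ = 1, sampling the process at a time large enough for the transient (which decays like e^{−2λt}) to be negligible:

```python
# Simulation sketch for Problem 2.22(b): the telegraph signal flips sign at
# each Poisson arrival; estimate P(X(t) = +1) at t = 20 with assumed lam = 1.
import random

random.seed(5)

def telegraph_at(t_end, lam):
    t, x = 0.0, 1                 # X(0) = +1
    while True:
        t += random.expovariate(lam)
        if t > t_end:
            return x
        x = -x                    # switch at each arrival

n = 20_000
frac = sum(telegraph_at(20.0, 1.0) == 1 for _ in range(n)) / n
print(frac)  # close to the steady-state value 1/2
```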
