Problem Set 3
Problem 3.1
Suppose x and y are the random variables from Problem Set 2, problem 2.4. Their
joint density, depicted again below for convenience, is constant in the shaded area
and 0 elsewhere.
[Figure: the joint density $p_{x,y}(x, y)$, constant on the shaded region of the $(x, y)$ plane.]
(a) In the $(P_D, P_F)$ plane, sketch the operating characteristic of the likelihood ratio
test for this problem. Also, indicate on this plot the region consisting of every
$(P_D, P_F)$ value that can be achieved using some decision rule.
(b) Is the point corresponding to $P_D = 2/3$, $P_F = 5/6$ achievable? If so, specify a
test that achieves this value. If not, explain.
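As an aside that may help with part (b): randomizing between two decision rules achieves every point on the chord between their operating points, so the region achievable with randomized rules is convex. A minimal Python sketch of this standard fact; the two rules used in the example are the trivial always-decide rules, which are always available.

    def randomize(point_a, point_b, lam):
        # Use rule A with probability lam, rule B otherwise; the resulting
        # operating point is the convex combination of the two (P_D, P_F) pairs.
        (pd_a, pf_a), (pd_b, pf_b) = point_a, point_b
        return (lam * pd_a + (1 - lam) * pd_b,
                lam * pf_a + (1 - lam) * pf_b)

    # Mixing "always decide H1", (P_D, P_F) = (1, 1), with "always decide H0",
    # (0, 0), traces out the chance line:
    print(randomize((1.0, 1.0), (0.0, 0.0), 0.5))   # -> (0.5, 0.5)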
Problem 3.2
We observe a random variable y and have two hypotheses, $H_0$ and $H_1$, for its
probability density. In particular, the probability densities for y under each of these
two hypotheses are given below:
$$p_{y|H_1}(y|H_1) = 3y^2, \qquad 0 \le y \le 1$$
$$p_{y|H_0}(y|H_0) = 1, \qquad 0 \le y \le 1$$
(a) Find the decision rule that maximizes $P_D$ subject to the constraint that $P_F \le 1/4$.
(b) Determine the value of $P_D$ for the decision rule specified in part (a).
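Since the likelihood ratio $L(y) = 3y^2$ is monotonically increasing on $[0, 1]$, the optimal rule in part (a) reduces to comparing $y$ against a threshold. The Python sketch below is a numerical aid, not a solution: the threshold value shown is only an example, and the Monte Carlo check validates the closed-form expressions.

    import numpy as np

    def operating_point(gamma):
        # Decide H1 when y > gamma.
        # Under H0, y is uniform on [0, 1], so P_F = 1 - gamma.
        # Under H1, the CDF of y is y**3, so P_D = 1 - gamma**3.
        return 1 - gamma**3, 1 - gamma

    def monte_carlo(gamma, n=10**6, seed=0):
        rng = np.random.default_rng(seed)
        y0 = rng.uniform(size=n)             # samples under H0 (uniform)
        y1 = rng.uniform(size=n) ** (1 / 3)  # inverse-CDF samples under H1
        return (y1 > gamma).mean(), (y0 > gamma).mean()

    gamma = 0.75                      # example threshold
    print(operating_point(gamma))     # exact (P_D, P_F)
    print(monte_carlo(gamma))         # simulated (P_D, P_F)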
Problem 3.3
Let $k$ denote the uptime of a communications link in days. Given that the link is
functioning at the beginning of a particular day, there is probability $q$ that it will go
down that day. Thus, the uptime of the link $k$ (in days) obeys a geometric distribution.
It is known that $q$ is one of two values: $q_0$ or $q_1$. By observing the actual uptime of
the link in an experiment, we would like to determine which of the two values of $q$
applies. The hypotheses are then:
$$H_0:\ p_{k|H}[k|H_0] = \Pr[k = k \mid H = H_0] = q_0 (1 - q_0)^k, \qquad k = 0, 1, 2, \ldots$$
$$H_1:\ p_{k|H}[k|H_1] = \Pr[k = k \mid H = H_1] = q_1 (1 - q_1)^k, \qquad k = 0, 1, 2, \ldots$$
Suppose that $q_0 = 1/2$, $q_1 = 1/4$, and we observe $k = k$ in our experiment.
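As a numerical aside, the likelihood ratio of the two geometric pmfs can be tabulated directly; with $q_0 = 1/2$ and $q_1 = 1/4$ it equals $(1/2)(3/2)^k$, so it is monotonically increasing in $k$, and longer observed uptimes favor $H_1$. A short Python check:

    from fractions import Fraction

    def geom_pmf(q, k):
        # Pr[k = k] = q * (1 - q)**k,  k = 0, 1, 2, ...
        return q * (1 - q) ** k

    q0, q1 = Fraction(1, 2), Fraction(1, 4)
    for k in range(6):
        L = geom_pmf(q1, k) / geom_pmf(q0, k)  # likelihood ratio p[k|H1] / p[k|H0]
        print(k, L)                            # equals (1/2) * (3/2)**k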
Problem 3.4
Under the three hypotheses $H_1$, $H_2$, and $H_3$, the observed vector is, respectively,
$$y = m_1 + w,$$
$$y = m_2 + w,$$
$$y = m_3 + w,$$
where
$$y = \begin{bmatrix} y_1 \\ y_2 \\ y_3 \end{bmatrix}, \qquad
m_1 = \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \qquad
m_2 = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \qquad
m_3 = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}.$$
Let $\pi(y)$ denote the vector of posterior probabilities,
$$\pi(y) = \begin{bmatrix} \Pr[H = H_1 \mid y = y] \\ \Pr[H = H_2 \mid y = y] \\ \Pr[H = H_3 \mid y = y] \end{bmatrix}
= \begin{bmatrix} \pi_1(y) \\ \pi_2(y) \\ \pi_3(y) \end{bmatrix}.$$
(a) Suppose that $C_{12} = C_{21} = 1$.
(i) Specify the optimum decision rule in terms of $\pi_1(y)$, $\pi_2(y)$, and $\pi_3(y)$.
(ii) Recalling that $\pi_1 + \pi_2 + \pi_3 = 1$, express this rule completely in terms of
$\pi_1$ and $\pi_2$, and sketch the decision regions in the $(\pi_1, \pi_2)$ plane.
(b) Suppose that the three hypotheses are equally likely a priori and that the Bayes
costs are
$$C_{ij} = 1 - \delta_{ij} = \begin{cases} 1, & i \ne j \\ 0, & i = j. \end{cases}$$
Show that the optimum decision rule can be specified in terms of the pair of
sufficient statistics
$$\ell_2(y) = y_2 - y_1, \qquad \ell_3(y) = y_3 - y_1.$$
Hint: To begin, see if you can specify the optimum decision rule in terms of
$$L_i(y) = \frac{p_{y|H}(y|H_i)}{p_{y|H}(y|H_1)}, \qquad i = 2, 3.$$
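The surviving text does not specify the distribution of $w$; assuming it is i.i.d. zero-mean Gaussian noise (an assumption here, though consistent with the suggested statistics), the equal-prior rule of part (b) picks the hypothesis whose mean $m_i$ is closest to $y$, and that comparison depends on $y$ only through $\ell_2(y)$ and $\ell_3(y)$. A minimal Python sketch under that assumption:

    def decide(y):
        # Minimum-distance rule: choose the hypothesis whose mean vector m_i
        # is nearest to y (optimal for equal priors and i.i.d. Gaussian w).
        l2 = y[1] - y[0]           # l2(y) = y2 - y1
        l3 = y[2] - y[0]           # l3(y) = y3 - y1
        if l2 <= 0 and l3 <= 0:    # y1 is the largest component
            return 1
        return 2 if l2 >= l3 else 3

    print(decide([0.9, 0.2, 0.1]))   # -> 1
    print(decide([0.1, 0.8, 0.3]))   # -> 2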
Problem 3.5
A former 6.432 student, skilled in the methods of hypothesis testing, came upon a
professional gambler who specialized in betting on flips of a coin. Our friend knew
that there were three possibilities for the coin the gambler chose:
* The coin is fair. That is, each flip of the coin is independent of all other tosses
and has an equal probability of coming up heads or tails.
* The coin is biased towards heads. Specifically, while successive flips of the coin
are independent, the probability that any individual flip comes up heads is 3/4.
* The successive tosses of the coin are not independent. Specifically, while the
marginal probability of heads (or tails) on any given flip is 1/2, the probability
that the next flip yields the same result as the preceding one is only 1/4. For
example, the probability that the next flip comes up heads given that the current
flip is a head is 1/4.
What our friend plans to do is observe two flips of the gambler's coin and then make
a decision among these three possibilities regarding the type of coin the gambler
is using.
(a) Assume that our 6.432 expert's prior assessment is that each of these three
possibilities is equally likely and that she views as equally bad any error she
might make (i.e., she simply wants to minimize the probability of error). Determine
the best decision she can make for each of the possible outcomes of the
gambler's pair of coin flips.
(b) Determine the probability of error associated with the decision rule determined
in part (a).
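As a cross-check for parts (a) and (b), one can enumerate the four possible two-flip outcomes, compute the probability of each outcome under the three coin models, and apply the equal-prior minimum-error rule. The Python sketch below does exactly that; the model names are just labels.

    from itertools import product

    # Probability of the outcome (a, b), each 'H' or 'T', under each model.
    def fair(a, b):
        return 0.25
    def biased(a, b):
        p = {'H': 0.75, 'T': 0.25}
        return p[a] * p[b]
    def dependent(a, b):
        # First flip fair; the second repeats the first with probability 1/4.
        return 0.5 * (0.25 if a == b else 0.75)

    models = {'fair': fair, 'biased': biased, 'dependent': dependent}
    p_error = 0.0
    for a, b in product('HT', repeat=2):
        likes = {name: f(a, b) for name, f in models.items()}
        choice = max(likes, key=likes.get)   # MAP = max likelihood (equal priors)
        # An error occurs whenever the true model is one of the two not chosen.
        p_error += sum(v / 3 for name, v in likes.items() if name != choice)
        print(a + b, '->', choice)
    print('P(error) =', p_error)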
Problem 3.6
Suppose x and y are random variables. Their joint density, depicted below, is
constant in the shaded area and zero elsewhere.
[Figure: shaded support region of the joint density in the $(x, y)$ plane; both axes show tick marks at $-1$ and $1$.]
(c) Suppose the cost criterion is
$$C(\hat{x}, x) = \begin{cases} (\hat{x} - x)^2, & \hat{x} - x < 0 \\ K(\hat{x} - x)^2, & \hat{x} - x > 0, \end{cases}$$
where $K > 1$ is a constant. Determine $\hat{x}_{\mathrm{MLS}}(y)$, the associated Bayes estimate of $x$ for this cost criterion.
(d) Give a brief intuitive explanation for why your answers to (a) and (c) are either
the same or different.
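The Bayes estimate for this asymmetric cost can also be found numerically: given samples from the posterior density of $x$ given $y$ (left as a placeholder below, since it depends on the shaded region in the figure), grid-minimize the conditional expected cost. A minimal Python sketch; x_samples is a hypothetical array of posterior samples:

    import numpy as np

    def bayes_estimate(x_samples, K, n_grid=2001):
        # Minimize E[C(a, x) | y] over candidate estimates a, where the cost is
        # (a - x)**2 when a - x < 0 and K * (a - x)**2 when a - x > 0.
        grid = np.linspace(x_samples.min(), x_samples.max(), n_grid)
        def expected_cost(a):
            e = a - x_samples
            return np.mean(np.where(e < 0, e**2, K * e**2))
        return min(grid, key=expected_cost)

Since $K > 1$ penalizes overestimates more heavily, the minimizer sits at or below the conditional mean, which is the $K = 1$ (least-squares) answer.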
Problem 3.7
Let $y$ be a discrete-valued random variable taking on nonnegative integer values,
$y = 0, 1, 2, \ldots$, and let $x$ be a continuous-valued random variable. Suppose that the
conditional probability mass function for $y$ given $x$ is the Poisson distribution
$$\Pr[y = y \mid x = x] = \frac{x^y e^{-x}}{y!}, \qquad y = 0, 1, 2, \ldots,$$
and that the a priori density for $x$ is
$$p_x(x) = \begin{cases} 2e^{-2x}, & x > 0 \\ 0, & x < 0. \end{cases}$$
Compute $\hat{x}_{\mathrm{MAP}}(y)$, the MAP estimate of $x$ based on $y$.
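As a numerical cross-check, the MAP estimate maximizes the unnormalized log-posterior $\ln\!\left[x^y e^{-x} \cdot 2 e^{-2x}\right] = y \ln x - 3x + \text{const}$ over $x > 0$; the Python sketch below does this on a grid for a few observed counts.

    import numpy as np

    def x_map(y, x_max=50.0, n=200_001):
        # Unnormalized log-posterior: y * log(x) - 3 * x  (constants dropped)
        x = np.linspace(1e-9, x_max, n)
        return x[np.argmax(y * np.log(x) - 3 * x)]

    for y in range(5):
        print(y, x_map(y))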
$$y = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_L \end{bmatrix}$$
$$\hat{v}_{\mathrm{BLS}}(y) = \frac{1}{L} \sum_{i=1}^{L} y_i + 1.$$