ECO113 Practice Problems


Practice Problems

1. The Beta distribution with parameters a = 3, b = 2 has PDF

f(x) = 12x²(1 − x), for 0 < x < 1.

Let X have this distribution.

(a) Find the CDF of X.

(b) Find P(0 < X < 1/2).

(c) Find the mean and variance of X (without quoting results about the Beta
distribution).

Solution:

(a) The CDF F is given by

F(t) = ∫₀ᵗ 12x²(1 − x) dx = 4t³ − 3t⁴,

for 0 < t < 1 (and the CDF is 0 for t ≤ 0 and 1 for t ≥ 1).

(b) By (a),
P(0 < X < 1/2) = F(1/2) − F(0) = 4(1/2)³ − 3(1/2)⁴ = 5/16 = 0.3125.

(c) We have

E(X) = ∫₀¹ x · 12x²(1 − x) dx = 12(1/4 − 1/5) = 3/5,

E(X²) = ∫₀¹ x² · 12x²(1 − x) dx = 12(1/5 − 1/6) = 2/5,

so Var(X) = E(X²) − (E(X))² = 2/5 − 9/25 = 1/25.
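As a sanity check (not part of the original solution), the answers above can be verified numerically with a midpoint Riemann sum; the grid size below is an arbitrary choice:

```python
# Numeric check of the Beta(3,2) answers: F(1/2) = 5/16, E(X) = 3/5, Var(X) = 1/25.
N = 100_000
dx = 1.0 / N

def f(x):
    # PDF from the problem statement
    return 12 * x**2 * (1 - x)

xs = [(i + 0.5) * dx for i in range(N)]          # midpoints of the grid
total = sum(f(x) for x in xs) * dx               # should be ~1 (valid PDF)
half = sum(f(x) for x in xs if x < 0.5) * dx     # ~F(1/2) = 0.3125
mean = sum(x * f(x) for x in xs) * dx            # ~3/5
second = sum(x**2 * f(x) for x in xs) * dx       # ~2/5
var = second - mean**2                           # ~1/25

print(total, half, mean, var)
```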

2. A stick of length 1 is broken at a uniformly random point, yielding two pieces. Let X and Y
be the lengths of the shorter and longer pieces, respectively, and let R = X/Y be the ratio of
the lengths X and Y.

(a) Find the CDF and PDF of R.

(b) Find the expected value of R (if it exists).

(c) Find the expected value of 1/R (if it exists).

Solution:

(a) Let U ~ Unif(0,1) be the breakpoint, so X = min(U, 1 − U) and Y = 1 − X. For r ∈ (0,1),

P(R ≤ r) = P(X/(1 − X) ≤ r) = P(X ≤ r/(1 + r)).

This is the CDF of X, evaluated at r/(1 + r), so we just need to find the CDF of X:

P(X ≤ x) = 1 − P(X > x) = 1 − P(U > x, 1 − U > x) = 1 − P(x < U < 1 − x) = 1 − (1 − 2x) = 2x,

for 0 ≤ x ≤ 1/2, and the CDF is 0 for x < 0 and 1 for x > 1/2. So X ~ Unif(0,1/2), and the CDF
of R is

P(R ≤ r) = P(X ≤ r/(1 + r)) = 2r/(1 + r)

for r ∈ (0,1), and it is 0 for r ≤ 0 and 1 for r ≥ 1. Alternatively, we can note that
the events U ≤ r/(1 + r) and 1 − U ≤ r/(1 + r) are disjoint for r ∈ (0,1), since the latter is
equivalent to U ≥ 1/(1 + r). Thus, again for r ∈ (0,1), we have

P(R ≤ r) = P(U ≤ r/(1 + r)) + P(1 − U ≤ r/(1 + r)) = r/(1 + r) + r/(1 + r) = 2r/(1 + r).

The PDF of R is 0 if r is not in (0,1), and for r ∈ (0,1) it is

f_R(r) = d/dr (2r/(1 + r)) = 2/(1 + r)².

(b) We have

E(R) = ∫₀¹ r · 2/(1 + r)² dr = 2 ∫₀¹ (1/(1 + r) − 1/(1 + r)²) dr = 2(ln 2 − 1/2) = 2 ln 2 − 1.

(c) This expected value does not exist, since

E(1/R) = ∫₀¹ (1/r) · 2/(1 + r)² dr

diverges. To show this, note that ∫₀¹ dr/r diverges and 2/(1 + r)² ≥ 1/2 for r ∈ (0,1).
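As a sanity check (not part of the original solution), the PDF and E(R) above can be verified numerically with a midpoint Riemann sum; the grid size is arbitrary:

```python
import math

# Check that f_R(r) = 2/(1+r)^2 integrates to 1 on (0,1), that E(R) matches
# 2*ln(2) - 1, and that the CDF at r = 1/3 matches 2r/(1+r) = 1/2.
N = 100_000
dr = 1.0 / N
rs = [(i + 0.5) * dr for i in range(N)]          # midpoints of the grid

pdf_mass = sum(2 / (1 + r)**2 for r in rs) * dr            # ~1
mean_R = sum(r * 2 / (1 + r)**2 for r in rs) * dr          # ~2 ln 2 - 1
cdf_third = sum(2 / (1 + r)**2 for r in rs if r < 1/3) * dr  # ~1/2

print(pdf_mass, mean_R, cdf_third)
```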

3. Alice is trying to transmit to Bob the answer to a yes-no question, using a noisy channel. She
encodes “yes” as 1 and “no” as 0 and sends the appropriate value. However, the channel adds
noise; specifically, Bob receives what Alice sends plus a 𝒩(0, σ²) noise term (the noise is
independent of what Alice sends). If Bob receives a value greater than 1/2 he interprets it
as “yes”; otherwise, he interprets it as “no”.

(a) Find the probability that Bob understands Alice correctly.

(b) What happens to the result from (a) if σ is very small? What about if σ is very
large? Explain intuitively why the results in these extreme cases make sense.

Solution:

(a) Let a be the value that Alice sends and ϵ be the noise, so B = a + ϵ is what Bob
receives. If 𝑎 = 1, then Bob will understand correctly if and only if 𝜖 > −1/2. If 𝑎 =
0, then Bob will understand correctly if and only if 𝜖 ≤ 1/2. By symmetry of the
Normal, 𝑃(ϵ > −1/2) = 𝑃(ϵ ≤ 1/2), so the probability that Bob understands does
not depend on a. This probability is

P(ϵ > −1/2) = P(ϵ/σ > −1/(2σ)) = 1 − Φ(−1/(2σ)) = Φ(1/(2σ)).

(b) If σ is very small, then

Φ(1/(2σ)) ≈ 1,

since Φ(x) (like any CDF) goes to 1 as x → ∞. This makes sense intuitively: if there is
very little noise, then it's easy for Bob to understand Alice. If σ is very large, then

Φ(1/(2σ)) ≈ Φ(0) = 1/2.

Again this makes sense intuitively: if there is a huge amount of noise, then Alice's
message will get drowned out (the noise dominates the signal).
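As a sanity check (not part of the original solution), the channel can be simulated and the empirical accuracy compared with Φ(1/(2σ)); the value of σ, the seed, and the trial count below are arbitrary:

```python
import math
import random

# Simulate the noisy channel and compare with Phi(1/(2*sigma)),
# using Phi(x) = (1 + erf(x/sqrt(2)))/2.
random.seed(42)
sigma = 0.8
trials = 200_000
correct = 0
for _ in range(trials):
    a = random.choice([0, 1])          # Alice's bit
    b = a + random.gauss(0, sigma)     # Bob receives signal plus noise
    decoded = 1 if b > 0.5 else 0
    correct += (decoded == a)

empirical = correct / trials
theoretical = 0.5 * (1 + math.erf((1 / (2 * sigma)) / math.sqrt(2)))
print(empirical, theoretical)
```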

4. Let X and Y be independent r.v.s. Show that

Var(XY) = Var(X)Var(Y) + Var(X)(EY)² + Var(Y)(EX)².

Hint: It is often useful when working with a second moment E(T²) to write it as Var(T) + (ET)².
Solution: Note that X and Y are uncorrelated (since they are independent) and X² and Y²
are uncorrelated (since they are independent, as functions of independent r.v.s are independent). Then

Var(XY) = E(X²Y²) − (E(XY))²
= E(X²)E(Y²) − (EX)²(EY)²
= (Var(X) + (EX)²)(Var(Y) + (EY)²) − (EX)²(EY)²
= Var(X)Var(Y) + Var(X)(EY)² + Var(Y)(EX)².
5. Let 𝑈 ∼ 𝑈𝑛𝑖𝑓(−1,1) and 𝑉 = 2|𝑈| − 1.

(a) Find the distribution of 𝑉 (give the PDF and, if it is a named distribution we have
studied, its name and parameters).

Hint: Find the support of 𝑉, and then find the CDF of 𝑉 by reducing 𝑃 (𝑉 ≤ 𝑣) to
probability calculations about 𝑈.

(b) Show that 𝑈 and 𝑉 are uncorrelated, but not independent. This is also another example
illustrating the fact that knowing the marginal distributions of two r.v.s does not
determine the joint distribution.

Solution:

(a) The support of V is (−1, 1). The CDF of V is

P(V ≤ v) = P(2|U| − 1 ≤ v) = P(|U| ≤ (v + 1)/2) = (v + 1)/2,

for −1 < v < 1, since for a Uniform, probability is proportional to length. The PDF of
V is f(v) = 1/2 for v ∈ (−1,1) (and 0 otherwise). Therefore, V ∼ Unif(−1,1).

(b) By (a), U and V are both Unif(−1,1), with mean 0. They are uncorrelated since

Cov(U, V) = E(UV) − E(U)E(V) = E(U(2|U| − 1)) = 2E(U|U|) − E(U) = 0,

because U|U| is an odd function of the symmetric r.v. U, so E(U|U|) = 0. But they are
extremely dependent, since in fact V is a deterministic function of U.
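As a sanity check (not part of the original solution), a quick simulation shows V behaving like Unif(−1,1), a near-zero covariance with U, and blatant dependence; the seed and sample size are arbitrary:

```python
import random

# Simulate U ~ Unif(-1,1) and V = 2|U| - 1.
random.seed(0)
n = 200_000
us = [random.uniform(-1, 1) for _ in range(n)]
vs = [2 * abs(u) - 1 for u in us]

mean_v = sum(vs) / n                          # ~0
cdf_at_0 = sum(v <= 0 for v in vs) / n        # ~1/2 if V ~ Unif(-1,1)
cov_uv = sum(u * v for u, v in zip(us, vs)) / n - (sum(us) / n) * mean_v  # ~0

# Dependence: if U > 0.9 then V = 2U - 1 > 0.8, always.
prob_v_big_given_u_big = (sum(1 for u, v in zip(us, vs) if u > 0.9 and v > 0.8)
                          / sum(1 for u in us if u > 0.9))
print(mean_v, cdf_at_0, cov_uv, prob_v_big_given_u_big)
```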

6. (a) Let X and Y be Bernoulli r.v.s, possibly with different parameters. Show that if X
and Y are uncorrelated, then they are independent.

(b) Give an example of three Bernoulli r.v.s such that each pair of them is uncorrelated, yet
the three r.v.s are dependent.

Solution:

(a) Let X ∼ Bern(p₁) and Y ∼ Bern(p₂) be uncorrelated. By the fundamental bridge,

P(X = 1, Y = 1) = E(XY) = E(X)E(Y) = P(X = 1)P(Y = 1).

Since

P(X = 1, Y = 1) + P(X = 1, Y = 0) = P(X = 1),

we have

P(X = 1, Y = 0) = P(X = 1) − P(X = 1, Y = 1) = p₁ − p₁p₂ = p₁(1 − p₂) = P(X = 1)P(Y = 0).

By the same argument (swapping the roles of X and Y),

P(X = 0, Y = 1) = (1 − p₁)p₂ = P(X = 0)P(Y = 1).

Then

P(X = 0, Y = 0) = P(X = 0) − P(X = 0, Y = 1) = (1 − p₁) − (1 − p₁)p₂ = (1 − p₁)(1 − p₂) = P(X = 0)P(Y = 0).

Therefore, X and Y are independent.
(b) Pairwise independence of events doesn’t imply independence of the events, as shown
in Example 2.5.5. By taking the indicator r.v.s for events that are pairwise independent
but not independent, we can construct r.v.s that are pairwise independent (and hence
uncorrelated) but not independent.
As in Example 2.5.5, consider two fair, independent coin tosses, and let A be the event
that the first toss is Heads, B be the event that the second toss is Heads, and C be the
event that the tosses have the same result. Then the corresponding indicator r.v.s
𝐼(𝐴), 𝐼(𝐵), 𝐼(𝐶) are uncorrelated but dependent.
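As a sanity check (not part of the original solution), the four equally likely outcomes can be enumerated exactly to confirm that the three indicators are pairwise uncorrelated yet dependent:

```python
from itertools import product

# Two fair coin tosses; indicators for A (first Heads), B (second Heads),
# C (tosses match). Each outcome has probability 1/4.
outcomes = list(product([0, 1], repeat=2))
p = 1 / len(outcomes)

IA = [t1 for t1, t2 in outcomes]             # first toss Heads
IB = [t2 for t1, t2 in outcomes]             # second toss Heads
IC = [int(t1 == t2) for t1, t2 in outcomes]  # tosses have the same result

def E(xs):
    return sum(x * p for x in xs)

def cov(xs, ys):
    return E([x * y for x, y in zip(xs, ys)]) - E(xs) * E(ys)

pairwise = [cov(IA, IB), cov(IA, IC), cov(IB, IC)]        # all 0
triple = E([a * b * c for a, b, c in zip(IA, IB, IC)])    # P(A and B and C) = 1/4
product_of_marginals = E(IA) * E(IB) * E(IC)              # 1/8, so dependent
print(pairwise, triple, product_of_marginals)
```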

7. Let X and Y have joint PDF

f_{X,Y}(x, y) = x + y, for 0 < x < 1 and 0 < y < 1

(and 0 otherwise).
(a) Check that this is a valid joint PDF.


(b) Are X and Y independent?
(c) Find the marginal PDFs of X and Y.
(d) Find the conditional PDF of Y given 𝑋 = 𝑥.

Solution:

(a) The function f_{X,Y} is nonnegative for 0 < x < 1 and 0 < y < 1 (and 0 otherwise), so
we just have to check that it integrates to 1:

∫₀¹ ∫₀¹ (x + y) dx dy = ∫₀¹ (1/2 + y) dy = 1/2 + 1/2 = 1.
(b) They are not independent, since 𝑥 + 𝑦 does not factor as a function of 𝑥 times a
function of 𝑦.

(c) The marginal PDF of X is

f_X(x) = ∫₀¹ (x + y) dy = x + 1/2,

for 0 < x < 1. By an analogous calculation or by symmetry, the marginal PDF of Y is

f_Y(y) = y + 1/2,

for 0 < y < 1.

(d) Let 0 < x < 1. The conditional PDF of Y given X = x is

f_{Y|X}(y|x) = f_{X,Y}(x, y)/f_X(x) = (x + y)/(x + 1/2),

for 0 < y < 1.
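As a sanity check (not part of the original solution), the normalization and the marginal can be verified numerically; since the integrand is linear, a midpoint sum is essentially exact:

```python
# Check that f(x,y) = x + y integrates to 1 over the unit square, and that the
# marginal and conditional PDFs match x + 1/2 and (x + y)/(x + 1/2).
N = 1_000
d = 1.0 / N
grid = [(i + 0.5) * d for i in range(N)]      # midpoints of the grid

total = sum((x + y) * d * d for x in grid for y in grid)   # ~1

x0 = 0.25                                     # arbitrary evaluation point
marginal_at_x0 = sum((x0 + y) * d for y in grid)           # ~x0 + 1/2 = 0.75

# Conditional PDF of Y given X = x0, evaluated at y = 1/2:
cond = (x0 + 0.5) / marginal_at_x0            # ~(x0 + 1/2)/(x0 + 1/2) = 1
print(total, marginal_at_x0, cond)
```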

8. A researcher studying crime is interested in how often people have gotten arrested. Let
𝑋 ∼ 𝑃𝑜𝑖𝑠(𝜆) be the number of times that a random person got arrested in the last 10 years.
However, data from police records are being used for the researcher’s study, and people who
were never arrested in the last 10 years do not appear in the records. In other words, the
police records have a selection bias: they only contain information on people who have been
arrested in the last 10 years.
So averaging the numbers of arrests for people in the police records does not directly
estimate 𝐸(𝑋); it makes more sense to think of the police records as giving us information
about the conditional distribution of how many times a person was arrested, given that the
person was arrested at least once in the last 10 years. The conditional distribution of X,
given that 𝑋 ≥ 1, is called a truncated Poisson distribution.

(a) Find 𝐸(𝑋|𝑋 ≥ 1).


(b) Find 𝑉𝑎𝑟(𝑋|𝑋 ≥ 1).

Solution:

(a) The conditional PMF of X given X ≥ 1 is

P(X = k | X ≥ 1) = P(X = k)/P(X ≥ 1) = (e^(−λ) λ^k / k!) / (1 − e^(−λ)),

for k = 1, 2, …. Therefore,

E(X | X ≥ 1) = (1/(1 − e^(−λ))) Σ_{k=1}^∞ k e^(−λ) λ^k / k!.

We have

Σ_{k=1}^∞ k e^(−λ) λ^k / k! = E(X) = λ,

so

E(X | X ≥ 1) = λ/(1 − e^(−λ)).

(b) We have

E(X² | X ≥ 1) = E(X²)/(1 − e^(−λ)) = (Var(X) + (EX)²)/(1 − e^(−λ)) = (λ + λ²)/(1 − e^(−λ)),

since the k = 0 term contributes nothing to E(X²). So

Var(X | X ≥ 1) = E(X² | X ≥ 1) − (E(X | X ≥ 1))² = (λ + λ²)/(1 − e^(−λ)) − λ²/(1 − e^(−λ))².
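As a sanity check (not part of the original solution), both formulas can be verified by summing the truncated PMF directly for an arbitrary λ; the tail cutoff at k = 150 is far beyond any non-negligible mass for this λ:

```python
import math

# Check E(X | X >= 1) = lambda/(1 - e^(-lambda)) and the variance formula.
lam = 2.0                            # arbitrary choice of lambda
z = 1 - math.exp(-lam)               # P(X >= 1)

qs = []                              # truncated PMF values for k = 1..150
p = math.exp(-lam)                   # P(X = 0)
for k in range(1, 151):
    p = p * lam / k                  # P(X = k), built iteratively (avoids huge factorials)
    qs.append((k, p / z))

mass = sum(q for _, q in qs)                     # ~1
mean_trunc = sum(k * q for k, q in qs)           # ~lambda/(1 - e^(-lambda))
second = sum(k * k * q for k, q in qs)
var_trunc = second - mean_trunc**2

mean_formula = lam / z
var_formula = (lam + lam**2) / z - (lam / z)**2
print(mass, mean_trunc, mean_formula, var_trunc, var_formula)
```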

9. There are two envelopes, each of which has a check for a 𝑈𝑛𝑖𝑓(0,1) amount of money,
measured in thousands of dollars. The amounts in the two envelopes are independent. You
get to choose an envelope and open it, and then you can either keep that amount or switch
to the other envelope and get whatever amount is in that envelope.
Suppose that you use the following strategy: choose an envelope and open it. If you observe
U, then stick with that envelope with probability U, and switch to the other envelope with
probability 1 − 𝑈.

(a) Find the probability that you get the larger of the two amounts.

(b) Find the expected value of what you will receive.

Solution:

(a) Let U ∼ Unif(0,1) be the amount in the envelope you open and V ∼ Unif(0,1) be the
amount in the other envelope. Let B be the event that you get the larger of the two
amounts and I_B be the indicator r.v. for B. Given that U = u, you stick (and win if
u > V, which has probability u) with probability u, and you switch (and win if V > u,
which has probability 1 − u) with probability 1 − u. So the probability of B is

P(B | U = u) = u · u + (1 − u)(1 − u) = u² + (1 − u)²,

and therefore

P(B) = E(E(I_B | U)) = ∫₀¹ (u² + (1 − u)²) du = 1/3 + 1/3 = 2/3.
(b) Let U and V be as in (a), I be the indicator of the event that you stick with your initial
choice of envelope, and X be the amount you receive. Then I|U ∼ Bern(U) and

X = UI + V(1 − I).

So

E(X|U) = U · U + (1/2)(1 − U) = U² − U/2 + 1/2,

which shows that

E(X) = E(E(X|U)) = 1/3 − 1/4 + 1/2 = 7/12.

Thus, the expected value is 7/12 of a thousand dollars, which is about $583.33.
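As a sanity check (not part of the original solution), the strategy can be simulated directly; the seed and trial count are arbitrary:

```python
import random

# Simulate the two-envelope strategy: open one, stick with probability U.
# Compare with P(get the larger amount) = 2/3 and E(amount) = 7/12.
random.seed(1)
n = 200_000
wins = 0
total = 0.0
for _ in range(n):
    u = random.random()              # amount in the opened envelope
    v = random.random()              # amount in the other envelope
    stick = random.random() < u      # stick with probability u
    x = u if stick else v
    wins += (x == max(u, v))
    total += x

print(wins / n, total / n)
```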

10. For i.i.d. r.v.s X₁, …, Xₙ with mean µ and variance σ², give a value of n (as a specific
number) that will ensure that there is at least a 99% chance that the sample mean will be
within 2 standard deviations of the true mean µ.
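The document gives no solution for this problem. A standard approach uses Chebyshev's inequality: P(|X̄ₙ − µ| ≥ 2σ) ≤ Var(X̄ₙ)/(2σ)² = (σ²/n)/(4σ²) = 1/(4n), so 1/(4n) ≤ 0.01 forces n ≥ 25. A sketch checking this numerically (the Exponential(1) distribution and seed below are arbitrary test choices):

```python
import math
import random

# Smallest n with Chebyshev bound 1/(4n) <= prob.
def smallest_n(prob=0.01):
    return math.ceil(1 / (4 * prob))

n = smallest_n()   # 25

# Empirical check with Exponential(1) samples (mu = sigma = 1): the chance that
# the sample mean falls 2 or more sigma from mu should be at most 1%.
random.seed(7)
trials = 50_000
bad = 0
for _ in range(trials):
    xbar = sum(random.expovariate(1) for _ in range(n)) / n
    bad += abs(xbar - 1) >= 2        # |Xbar - mu| >= 2 sigma
print(n, bad / trials)
```

Chebyshev is typically very conservative, so the empirical frequency is usually far below the 1% bound.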
