3 Marks PYQ


THEOREM 11

If A and B are independent, then A and B' are also independent.

Proof: Since A = (A ∩ B) ∪ (A ∩ B'), as the reader was asked to show in part (a) of
Exercise 1, A ∩ B and A ∩ B' are mutually exclusive, and A and B are independent
by assumption, we have

P(A) = P[(A ∩ B) ∪ (A ∩ B')]
     = P(A ∩ B) + P(A ∩ B')
     = P(A) · P(B) + P(A ∩ B')

It follows that

P(A ∩ B') = P(A) − P(A) · P(B)
          = P(A) · [1 − P(B)]
          = P(A) · P(B')

and hence that A and B' are independent.
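
As a quick numerical sanity check of this identity (an added sketch, not part of the original text, using assumed values P(A) = 0.6 and P(B) = 0.3), the following Python lines confirm that P(A ∩ B') equals P(A) · P(B') when A and B are independent.

# Sanity check with assumed probabilities: if A and B are independent,
# then P(A ∩ B') should equal P(A) · P(B').
p_a, p_b = 0.6, 0.3               # assumed values for P(A) and P(B)

p_a_and_b = p_a * p_b             # independence: P(A ∩ B) = P(A) · P(B)
p_a_and_not_b = p_a - p_a_and_b   # P(A ∩ B') = P(A) − P(A ∩ B)

print(p_a_and_not_b)              # 0.42
print(p_a * (1 - p_b))            # P(A) · P(B') = 0.42, the same value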
EXAMPLE 1
If X has the probability density

f(x) = k·e^(−3x)  for x > 0
f(x) = 0          elsewhere

find k and P(0.5 ≤ X ≤ 1).



Solution
To satisfy the second condition for a probability density, we must have

∫_0^∞ k·e^(−3x) dx = k/3 = 1

and it follows that k = 3. For the probability we get

P(0.5 ≤ X ≤ 1) = ∫_0.5^1 3·e^(−3x) dx = e^(−1.5) − e^(−3) ≈ 0.173
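
A short numerical check of this result (an added sketch using scipy, not part of the original solution) integrates the density with k = 3 to confirm the normalization and the probability of about 0.173.

from scipy.integrate import quad
import numpy as np

# Density f(x) = k * exp(-3x) for x > 0; normalization forces k = 3.
k = 3.0
f = lambda x: k * np.exp(-3 * x)

total, _ = quad(f, 0, np.inf)      # should be 1.0
prob, _ = quad(f, 0.5, 1.0)        # P(0.5 <= X <= 1)

print(round(total, 4))             # 1.0
print(round(prob, 4))              # approximately 0.1733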


EXAMPLE 11
If the joint probability distribution of three discrete random variables X, Y, and Z
is given by

f(x, y, z) = (x + y)·z / 63  for x = 1, 2; y = 1, 2, 3; z = 1, 2

find P(X = 2, Y + Z ≤ 3).


Solution
P(X = 2, Y + Z ≤ 3) = f(2, 1, 1) + f(2, 1, 2) + f(2, 2, 1)
                    = 3/63 + 6/63 + 4/63
                    = 13/63
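
The same probability can be checked by brute-force enumeration; the sketch below (an added illustration, assuming the distribution written above) sums f(x, y, z) over the event X = 2, Y + Z ≤ 3.

from fractions import Fraction

# Joint probability distribution assumed above: f(x, y, z) = (x + y) * z / 63
def f(x, y, z):
    return Fraction((x + y) * z, 63)

# Check that the probabilities sum to 1 over the whole support.
total = sum(f(x, y, z) for x in (1, 2) for y in (1, 2, 3) for z in (1, 2))

# P(X = 2, Y + Z <= 3): sum over the triples in the event.
prob = sum(f(2, y, z) for y in (1, 2, 3) for z in (1, 2) if y + z <= 3)

print(total)   # 1
print(prob)    # 13/63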
THEOREM 3. CENTRAL LIMIT THEOREM. If X1, X2, ..., Xn constitute a random
sample from an infinite population with the mean μ, the variance σ², and the
moment-generating function M_X(t), then the limiting distribution of

Z = (X̄ − μ) / (σ/√n)

as n → ∞ is the standard normal distribution.
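
As an illustration of the theorem (an added simulation sketch, not part of the original text), the code below draws many samples of size n = 30 from an exponential population and standardizes their means; the resulting values behave approximately like a standard normal variable.

import numpy as np

rng = np.random.default_rng(0)

# Assumed population: exponential with mean mu = 2, so sigma = 2 as well.
mu, sigma, n, reps = 2.0, 2.0, 30, 100_000

samples = rng.exponential(scale=mu, size=(reps, n))
z = (samples.mean(axis=1) - mu) / (sigma / np.sqrt(n))

# For a standard normal variable the mean is 0, the variance is 1,
# and about 95% of the values fall between -1.96 and 1.96.
print(round(z.mean(), 3), round(z.var(), 3))
print(round(np.mean(np.abs(z) < 1.96), 3))   # close to 0.95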
EXAMPLE
If the probability density of X is given by

f(x) = 2(1 − x)  for 0 < x < 1
f(x) = 0         elsewhere

(a) show that

E(X^r) = 2 / [(r + 1)(r + 2)]

(b) use this result to evaluate

Solution
(a) E(X^r) = ∫_0^1 x^r · 2(1 − x) dx = 2·[1/(r + 1) − 1/(r + 2)] = 2 / [(r + 1)(r + 2)]
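
A symbolic check of part (a) (an added sketch using sympy, not part of the original text) integrates x^r · 2(1 − x) over (0, 1) and compares the result with 2 / [(r + 1)(r + 2)].

import sympy as sp

x = sp.symbols('x', positive=True)
r = sp.symbols('r', positive=True)

# E(X^r) for the density f(x) = 2(1 - x) on 0 < x < 1.
moment = sp.integrate(x**r * 2 * (1 - x), (x, 0, 1))

print(sp.simplify(moment))                        # an expression equal to 2/((r + 1)*(r + 2))
print(sp.simplify(moment - 2/((r + 1)*(r + 2))))  # 0, confirming the identity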
EXAMPLE 12
If the probability density of X is given by

f(x) = 630x^4(1 − x)^4  for 0 < x < 1
f(x) = 0                elsewhere

find the probability that it will take on a value within two standard deviations of the
mean and compare this probability with the lower bound provided by Chebyshev's
theorem.

Solution
Straightforward integration shows that μ = 1/2 and σ² = 1/44, so that σ = √(1/44),
or approximately 0.15. Thus, the probability that X will take on a value within two
standard deviations of the mean is the probability that it will take on a value between
0.20 and 0.80, that is,

P(0.20 < X < 0.80) = ∫_0.20^0.80 630x^4(1 − x)^4 dx ≈ 0.96

Observe that the statement "the probability is 0.96" is a much stronger statement
than "the probability is at least 0.75," which is provided by Chebyshev's theorem.
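
The sketch below (an added numerical check using scipy, not part of the original solution) recomputes the mean, the variance, and the two-standard-deviation probability for this density, and prints the Chebyshev lower bound 1 − 1/k² with k = 2 for comparison.

from scipy.integrate import quad
import numpy as np

f = lambda x: 630 * x**4 * (1 - x)**4                # density on (0, 1)

mu = quad(lambda x: x * f(x), 0, 1)[0]               # mean, 0.5
var = quad(lambda x: (x - mu)**2 * f(x), 0, 1)[0]    # variance, 1/44
sigma = np.sqrt(var)

prob = quad(f, mu - 2 * sigma, mu + 2 * sigma)[0]    # about 0.96

print(round(mu, 4), round(var, 4), round(sigma, 4))
print(round(prob, 4))
print(1 - 1 / 2**2)                                  # Chebyshev's lower bound, 0.75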
EXAMPLE 12
The average number of trucks arriving on any one day at a truck depot in a certain
city is known to be 12. What is the probability that on a given day fewer than 9 trucks
will arrive at this depot?

Solution
Let X be the number of trucks arriving on a given day. Then, using Table II of
"Statistical Tables" with λ = 12, we get

P(X < 9) = Σ_{x=0}^{8} p(x; 12) = 0.1550
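
For readers without the table at hand, the same value can be obtained from the Poisson cumulative distribution function; the line below (an added check, not part of the original solution) uses scipy.

from scipy.stats import poisson

# P(X < 9) = P(X <= 8) for a Poisson random variable with lambda = 12.
print(round(poisson.cdf(8, mu=12), 4))   # approximately 0.1550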
EXAMPLE 1
If the probability density of X is given by

f(x) = 6x(1 − x)  for 0 < x < 1
f(x) = 0          elsewhere

find the probability density of Y = X^3.
Solution
Letting G(y) denote the value of the distribution function of Y at y, we can write

G(y) = P(Y ≤ y)
     = P(X^3 ≤ y)
     = P(X ≤ y^(1/3))
     = ∫_0^(y^(1/3)) 6x(1 − x) dx
     = 3y^(2/3) − 2y

and hence

g(y) = 2(y^(−1/3) − 1)

for 0 < y < 1; elsewhere, g(y) = 0. In Exercise 15 the reader will be asked to verify
this result by a different technique.
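
A symbolic check of this transformation (an added sketch using sympy, not part of the original text) recovers G(y) by integration and g(y) by differentiation.

import sympy as sp

x, y = sp.symbols('x y', positive=True)

f = 6 * x * (1 - x)                                  # density of X on (0, 1)
G = sp.integrate(f, (x, 0, y**sp.Rational(1, 3)))    # G(y) = P(X <= y^(1/3))
g = sp.simplify(sp.diff(G, y))                       # density of Y = X^3 on (0, 1)

print(sp.simplify(G))   # equal to 3*y**(2/3) - 2*y
print(g)                # equal to 2*(y**(-1/3) - 1)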
DEFINITION 2. RANDOM SAMPLE. If X1, X2, ..., Xn are independent and identically
distributed random variables, we say that they constitute a random sample from
the infinite population given by their common distribution.

If f(x1, x2, ..., xn) is the value of the joint distribution of such a set of random vari-
ables at (x1, x2, ..., xn), by virtue of independence we can write

f(x1, x2, ..., xn) = Π_{i=1}^{n} f(xi)

where f(xi) is the value of the population distribution at xi. Observe that Definition 2
and the subsequent discussion apply also to sampling with replacement from finite
populations; sampling without replacement from finite populations is discussed in
Section 3.
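
As a small numerical illustration of this product rule (an added sketch, not part of the original text, using a standard normal population as an assumed example), the joint density of an i.i.d. sample evaluated at a point equals the product of the marginal densities.

import numpy as np
from scipy.stats import norm

# Assumed population: standard normal. For an i.i.d. sample, the joint
# density at (x1, ..., xn) is the product of the marginal densities.
x = np.array([0.3, -1.2, 0.7, 2.1])

joint = np.prod(norm.pdf(x))          # product of f(x_i)
log_joint = np.sum(norm.logpdf(x))    # same value on the log scale

print(joint, np.exp(log_joint))       # identical values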
Statistical inferences are usually based on statistics, that is, on random variables
that are functions of a set of random variables X1, X2, ..., Xn constituting a random
sample. Typical of what we mean by "statistic" are the sample mean and the sample
variance.
EXAMPLE 3
If the joint density of X1, X2, and X3 is given by

f(x1, x2, x3) = (x1 + x2)·e^(−x3)  for 0 < x1 < 1, 0 < x2 < 1, x3 > 0
f(x1, x2, x3) = 0                  elsewhere

find the regression equation of X2 on X1 and X3.

Solution
The joint marginal density of X1 and X3 is given by

m(x1, x3) = (x1 + 1/2)·e^(−x3)  for 0 < x1 < 1, x3 > 0
m(x1, x3) = 0                   elsewhere

Therefore,
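
The computation can be carried further symbolically; the sketch below (an added illustration using sympy, assuming the density written above) derives the joint marginal density of X1 and X3, the conditional density of X2, and the regression equation E(X2 | x1, x3).

import sympy as sp

x1, x2, x3 = sp.symbols('x1 x2 x3', positive=True)

f = (x1 + x2) * sp.exp(-x3)        # assumed joint density on the support above

m = sp.integrate(f, (x2, 0, 1))    # joint marginal density of X1 and X3
cond = sp.simplify(f / m)          # conditional density of X2 given x1 and x3
regression = sp.simplify(sp.integrate(x2 * cond, (x2, 0, 1)))

print(sp.simplify(m))   # equal to (x1 + 1/2)*exp(-x3)
print(regression)       # equal to (3*x1 + 2)/(6*x1 + 3); it does not involve x3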
