BAHIR DAR UNIVERSITY
FACULTY OF BUSINESS AND ECONOMICS
DEPARTMENT OF ECONOMICS

STATISTICS FOR ECONOMISTS (Econ 351) MODULE

Prepared by Tesfaye Melaku
December, 2006

Table of Contents
CHAPTER I: BASIC PROBABILITY THEORY
CHAPTER II: SOME SPECIAL PROBABILITY DISTRIBUTIONS AND DENSITIES
CHAPTER IV: JOINT AND CONDITIONAL PROBABILITY DISTRIBUTIONS
CHAPTER V: SAMPLING AND SAMPLING DISTRIBUTIONS
CHAPTER VI: ESTIMATION OF PARAMETERS
CHAPTER VII: HYPOTHESIS TESTING

Chapter I: Basic Probability Theory

Content
1.1 Introduction
1.2 Sample space, sample point and events
1.3 Definitions of probability
1.4 Axioms of probability
1.5 Counting procedures
1.6 Conditional probability and independence
1.7 Bayes' Theorem

1.1 Introduction

A large number of happenings, in nature and in the realm of human activity, are associated with uncertainty. Though the rising of the sun the next day can be taken to be certain, the appearance of clouds in the sky the next morning is not as certain. The sex of a baby to be born some months hence is again not known for certain. Each such happening is necessarily associated with two or more outcomes, because if a happening has only one possible outcome there cannot be any uncertainty about it.

From the point of view of planned human activity, a feature we commonly associate with the scientific method is experimentation. If an experiment is repeated under essentially similar conditions, we generally come across two types of situations:
a) The result or outcome is unique or certain; such an experiment is called deterministic.
b) The result is not unique but may be any one of several possible outcomes; such an experiment is called unpredictable, non-deterministic or probabilistic.

An experiment (model) related to a non-deterministic phenomenon is called a random experiment or a non-deterministic model. In this course we deal with random experiments, that is, with non-deterministic models. Examples of 'random', 'non-deterministic' or 'stochastic' experiments are:

E1: Toss a die and observe the number that shows up.
E2: Toss a coin n times and observe the total number of heads obtained.
E3: Manufacture items on a production line and count the number of defective items produced during a 24-hour period.
E4: A light bulb is manufactured. It is then tested for its life length by inserting it into a socket, and the time elapsed (in hours) until it burns out is recorded.
E5: Items are manufactured until 10 defective items are produced. The total number of manufactured items is counted.
E6: The tensile strength of a steel beam is measured.
E7: A thermograph records temperature continuously over a 24-hour period. At a specified locality and on a specified date, the thermograph is 'read'.
E8: Soybeans are grown on an acre of land under a well defined set of actions, and the yield in bushels is recorded.

Important common features of the above random experiments are:
a) Each experiment is capable of being repeated indefinitely under essentially unchanged conditions.
b) Although we are in general not able to state what a particular outcome will be, we are able to describe the set of all possible outcomes of the experiment.
c) As the experiment is performed repeatedly, the individual outcomes seem to occur in a haphazard manner. However, as the experiment is repeated a large number of times, a definite pattern of regularity appears; there is a predictable long-term outcome.
It is this regularity which makes it possible to construct a precise mathematical model with which to analyze the experiment. The main aim of inferential statistics (the science of statistics) is to arrive at conclusions regarding such unpredictable happenings, taking into account the variation among the outcomes and the magnitude of the uncertainties associated with them. For a scientific investigation of uncertain outcomes it is first necessary to have a measure of the degree of uncertainty. Since the outcomes of a random experiment are associated with uncertainty, a measure of the degree of uncertainty associated with an event (to be defined later on) is known as a probability measure.

Dear distance student! Before we deal directly with the three kinds (interpretations, definitions) of probability, let us first look at the definitions of the basic terminologies that we frequently encounter.

1.2 Sample Space, Sample Point, Events and Event Space

As we have said earlier, we cannot predict with certainty the outcome of a random experiment in advance of (prior to) the completion of the experiment; we can only list the total possible outcomes.

Definitions:
1. Sample space. The set of all possible outcomes of an experiment is called the sample space, denoted by S. The word "sample" is included as a reminder of the random nature of the experiment: its outcome is uncertain, so a given outcome is one sample of several possible outcomes. That is, each individual outcome occurs as a sample from the many possible outcomes of the experiment. Other symbols used in other texts to denote the sample space are Ω and X.
2. Sample point. Each element of the sample space S is called a sample point and is usually denoted by e.
3. Events. Events are subsets of the sample space S, e.g. A, B. The class or collection of all events associated with a given experiment is defined to be the event space. We use E to denote the event space.

Dear colleague! Now let us see some examples which will help us understand the above definitions.

Example 1. Let the experiment be the rolling of a well balanced single die. The sample space of the experiment is S = {1, 2, 3, 4, 5, 6}, which contains all the possible outcomes of the experiment. Each outcome is a sample point, and this sample space has a finite number of elements. Let A = {2, 4, 6}. Then A is a subset of S and defines the event of obtaining an even outcome. Let B = {4}. Then B defines the event of obtaining a 4. Both A and B are subsets of the sample space and thus are events. The size of this sample space S is finite.

We say that an event A occurs if a trial produces one of its elements, i.e. when 2 or 4 or 6 occurs. If a 4 is turned up, we say events A and B both occur. Since in this case B ⊂ A, when B occurs then A occurs. We may be interested in the following events:
a) the outcome is the number 1;
b) the outcome is even but less than 3;
c) the outcome is not even.
When an event contains only one element of S, like B above, it is called an elementary event. If we collect all subsets of S in one set and denote it by E, this new set is the event space.

Example 2. The experiment is to record the number of traffic deaths on the eve of the next new year in Addis Ababa. Any non-negative integer is a conceivable outcome of this experiment. The sample space for this experiment is S = {0, 1, 2, ...}. There is a countably infinite number of points in the sample space, hence the size of this sample space is countably infinite. Since each point is itself an elementary event, there is an infinite number of events. Another possible event is A = fewer than 500 deaths, i.e. A = {0, 1, 2, ..., 499}.
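The set language of this section maps directly onto code. The following sketch (an illustration, not part of the original module) represents the sample space of Example 1 and the events A and B as Python sets and checks the relations discussed above.

    # Sample space for rolling a single balanced die (Example 1)
    S = {1, 2, 3, 4, 5, 6}

    A = {2, 4, 6}   # event: the outcome is even
    B = {4}         # elementary event: the outcome is 4

    print(A.issubset(S), B.issubset(S))   # both events are subsets of S -> True True
    print(B.issubset(A))                  # B is contained in A, so whenever B occurs A occurs -> True

    outcome = 4                           # suppose a trial turns up 4
    print(outcome in A, outcome in B)     # both A and B occur -> True True

    # The event space E collects all subsets of S (2**6 = 64 events for this die).
    from itertools import combinations
    E = [set(c) for r in range(len(S) + 1) for c in combinations(S, r)]
    print(len(E))                         # 64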
Example 3. Consider the agricultural experiment of growing wheat on an acre of land under a well defined set of conditions, then recording the yield in quintals (x). Any non-negative number of quintals could be an outcome of this experiment. The sample space here is S = {x : x ≥ 0}. In this case the sample space has an uncountably infinite number of elements, since the outcome takes values in an interval of the real number line. We may be interested in the following events:
i) Event A is a yield less than or equal to 100 quintals: A = {x : 0 ≤ x ≤ 100};
ii) Event B is obtaining a yield between 20 and 50 quintals: B = {x : 20 ≤ x ≤ 50}.

1.3.2 The Relative Frequency Definition

If N(A) is the number of times that event A occurs in N trials, the ratio N(A)/N appears to converge to a constant limit as N increases indefinitely. We can take the ultimate value of this ratio as P(A). The relative frequency definition is then given by

P(A) = lim (as N → ∞) N(A)/N ... (2)

That is, assume that a series of experiments can be made keeping the initial conditions as equal as possible. An observation of a random experiment is made, then the experiment is repeated and another observation is taken. When the experiment is repeated a sufficiently large number of times, in many cases the observations fall into certain classes wherein the relative frequencies are quite stable. This stable relative frequency can be taken as the (approximate) probability of the event.

Example 9. Consider tossing a coin (balanced or unbalanced) 1000 times. The following result might be obtained from this experiment.

Outcome   Observed frequency   Observed relative frequency   Long-run expected relative frequency
H         540                  0.54                          0.50
T         460                  0.46                          0.50
Total     1000                 1.00                          1.00

The probability of obtaining a head is estimated as P(H) = 540/1000 = 0.54 ≈ 0.5, and similarly P(T) = 460/1000 = 0.46 ≈ 0.5.

Example 10. If we find, upon examination of large records, that about 51% of births in one locality are male, it might be reasonable to postulate that the probability of a male birth in this locality is equal to p with p ≈ 0.51, i.e. P(male birth in that locality) ≈ 0.51.

Example 11. A study of 8,000 economics graduates of A.A.U. was conducted. The study revealed that 400 of these students were not employed in their major areas of study. What is the probability that a particular economics graduate will be employed in an area other than his/her field of study?

P(a graduate is employed in an area other than his/her major) = 400/8000 = 1/20 = 0.05.

What we learn from the above examples is that the probability of an event happening in the long run is determined by observing what fraction of the time similar events happened in the past:

Probability of event happening = (number of times the event occurred in the past) / (total number of observations).

Dear distance student! Please notice that the relative frequency definition does not
a) require events to be equally likely, or
b) necessarily require the objects of the experiment to be unbiased.
When the objects of the experiment are biased (not balanced), it is the relative frequency definition that must be used to determine the probability of events.

In general the important thing is that it is possible to conceive of a series of observations or experiments under identical conditions. Then a number p can be postulated as the probability of the event A happening, and p can be approximated by the relative frequency of the event A in a series of experiments, i.e.

P(A) ≈ N(A)/N ... (3)
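The stabilisation of N(A)/N described above is easy to see in a short simulation. The sketch below (an illustration, not part of the module; it assumes a fair coin) tosses a simulated coin increasingly many times and prints the relative frequency of heads, which settles near the long-run value 0.5 of Example 9.

    import random

    random.seed(1)                      # fixed seed so the run is reproducible

    def relative_frequency(n_tosses):
        """Toss a fair coin n_tosses times and return the relative frequency of heads."""
        heads = sum(random.random() < 0.5 for _ in range(n_tosses))
        return heads / n_tosses

    for n in (10, 100, 1_000, 10_000, 100_000):
        print(n, round(relative_frequency(n), 4))
    # As n grows the printed ratios settle near 0.5, the long-run expected
    # relative frequency shown in the table of Example 9.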
1.3.3 Subjective Definition

There are many situations, however, where the relative frequency definition is difficult to apply. If there is little or no past experience on which to base a probability, it may be guessed subjectively. Essentially, this means evaluating the available information and then estimating the probability of an event. This probability is known as subjective probability.

Example 12.
i) Estimating the probability that a new product of a firm will be successful in the market.
ii) Estimating the probability that a distance student will score an A in the course; this depends on an individual's evaluation of the performance of the student in other subjects.
iii) When a forecaster says that the chance of rain tomorrow is, say, 70%, he is expressing his personal degree of belief.

Dear distance student! You may have noticed from the above examples that subjective probabilities are based on any and all information available to each person about the uncertainties related to the outcome of the event, and they will differ from person to person. The concept of subjective probability is the foundation for the Bayesian approach to statistics, which we do not deal with in this course.

1.4 Axioms of Probability: the Rules of Probability

The modern theory of probability is based on the axiomatic approach introduced by the Russian mathematician A. N. Kolmogorov. In the axiomatic approach some concepts are laid down and certain properties, commonly known as axioms, are defined; from these axioms the entire theory is derived by logical deduction. The axiomatic definition of probability includes the above definitions of probability but is free from their drawbacks. Discussion of the axiomatic definition of probability is conveniently couched in the language of set theory. Hence, before giving the axiomatic definition of probability, we present below a summary of the facts of set theory that are relevant to our interest.

7. DeMorgan's law: (A∪B)' = A'∩B' and (A∩B)' = A'∪B', where A' denotes the complement of A. These identities are easily verified with Venn diagrams.
8. Set difference law: A − B = A∩B'; A − B = A − (A∩B) = (A∪B) − B; A − (B∩C) = (A−B) ∪ (A−C).
9. Mutually exclusive or disjoint subsets: subsets A and B of S are defined to be disjoint or mutually exclusive if A∩B = ∅. Similarly, A1, A2, ... are mutually exclusive if Ai∩Aj = ∅ for every i ≠ j.
10. If A and B are subsets of S, then a) A = AB ∪ AB'; b) AB ∩ AB' = ∅, so the subsets AB and AB' are disjoint.
11. If A ⊂ B, then A∩B = A and A∪B = B.

1.4.1 Axiomatic Definition of Probability

Given the sample space S and the event space E (assumed to be an algebra of events), a probability function P(A) is defined as a set function with domain E (the algebra of events) and counterdomain [0, 1] which satisfies the following axioms:

Axiom (i): P(A) ≥ 0 for every A ∈ E.
Axiom (ii): P(S) = 1.
Axiom (iii): If A1, A2, ... is a sequence of (finitely or infinitely many) mutually exclusive events in E (i.e. Ai∩Aj = ∅ for i ≠ j) and if A1 ∪ A2 ∪ ... = ∪i Ai ∈ E, then P(∪i Ai) = Σi P(Ai).

Note that when we say an algebra of events we mean that the events in E satisfy the algebra of subsets summarised above; this is because events are subsets of a sample space. From the above three axioms we are able to deduce a number of laws (properties) of probability. Since E is an algebra of events, if we assume that A and B ∈ E we know that A', A∪B, AB, A'B', etc. are also elements of E, and hence it makes sense to talk about P(A'), P(A∪B), P(AB), P(A'B'), etc.
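For a finite sample space the axioms can be checked mechanically. The sketch below (an illustration under the assumption of equally likely outcomes, not part of the module) assigns probability 1/6 to each face of a fair die, computes event probabilities as sums of point masses, and verifies axioms (i) to (iii) for a pair of disjoint events.

    from fractions import Fraction

    S = {1, 2, 3, 4, 5, 6}
    p = {s: Fraction(1, 6) for s in S}      # point masses for a balanced die

    def P(event):
        """Probability of an event (a subset of S) as the sum of its point masses."""
        return sum(p[s] for s in event)

    A, B = {2, 4, 6}, {1, 3}                # two disjoint events

    assert P(A) >= 0 and P(B) >= 0          # axiom (i): non-negativity
    assert P(S) == 1                        # axiom (ii): P(S) = 1
    assert P(A | B) == P(A) + P(B)          # axiom (iii): additivity for disjoint events
    print(P(A), P(B), P(A | B))             # 1/2 1/3 5/6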
1.4.2 Rules (Properties) of Probability

For each of the following rules assume that S and E are given and that P is a probability function with domain E.

1. The probability of the impossible event is zero: P(∅) = 0.
Proof. Since S ∩ ∅ = ∅, the events S and ∅ are disjoint, and S ∪ ∅ = S. Hence P(S ∪ ∅) = P(S), and by axiom (iii) P(S) + P(∅) = P(S). Therefore P(∅) = P(S) − P(S) = 1 − 1 = 0.

2. Probability of the complementary event. If A is an event in E, then P(A') = 1 − P(A) ... (4)
Proof. A ∩ A' = ∅, so A and A' are disjoint events, and A ∪ A' = S. Hence P(A ∪ A') = P(S), and by axioms (iii) and (ii) P(A) + P(A') = 1, so P(A') = 1 − P(A).

3. Law of addition of probabilities
a) General law of addition
i) If A and B are any two events in E, the probability of occurrence of at least one of the two events is given by
P(A∪B) = P(A) + P(B) − P(AB) ... (5)
The proof of this law is facilitated by a Venn diagram and is given below.
Proof. A∪B = A ∪ A'B, and A'B = B − A = B − AB. Since A ∩ A'B = ∅, A and A'B are disjoint, so
P(A∪B) = P(A) + P(A'B).
But P(A'B) = P(B) − P(AB). Then P(A∪B) = P(A) + P(B) − P(AB).
We can arrive at the same result by the relative frequency approach:
P(A∪B) = N(A∪B)/N(S), where N(S) is the size of the sample space, and N(A∪B) = N(A) + N(B) − N(AB). Hence
P(A∪B) = N(A)/N(S) + N(B)/N(S) − N(AB)/N(S) = P(A) + P(B) − P(AB).

ii) For three events A, B, C in E,
P(A∪B∪C) = P(A) + P(B) + P(C) − P(AB) − P(BC) − P(AC) + P(ABC) ... (6)
Proof. P(A∪B∪C) = N(A∪B∪C)/N(S), and
N(A∪B∪C) = N(A) + N(B) + N(C) − N(AB) − N(AC) − N(BC) + N(ABC).
After dividing each term by N(S) we get
P(A∪B∪C) = P(A) + P(B) + P(C) − P(AB) − P(AC) − P(BC) + P(ABC).

iii) Extending the general law of addition to n events A1, A2, ..., An ∈ E, we have
P(A1∪A2∪...∪An) = Σi P(Ai) − Σ(i<j) P(AiAj) + Σ(i<j<k) P(AiAjAk) − ... + (−1)^(n+1) P(A1A2...An) ... (7)
A rule for remembering the algebraic signs is that probabilities involving even numbers of events occur with a negative sign, the others with a positive sign.

Corollary. P(A−B) = P(A) − P(AB) ... (8)
We know that A−B = AB' = A − AB. Then P(A−B) = P(AB') = P(A) − P(AB).

b) Mutually exclusive events
If events are mutually exclusive (disjoint), we simply apply axiom (iii) to obtain the probability of occurrence of at least one of them.
i) If A and B are mutually exclusive events, the probability of occurrence of A or B is the sum of the individual probabilities:
P(A or B) = P(A∪B) = P(A) + P(B) ... (9)
ii) For three mutually exclusive events A, B, C ∈ E, the probability of occurrence of A or B or C is the sum of their individual probabilities:
P(A∪B∪C) = P(A) + P(B) + P(C) ... (10)
iii) If we partition (classify) the sample space into n mutually exclusive groups, say A1, A2, ..., An, then S = A1∪A2∪...∪An = ∪i Ai, and
P(S) = P(∪i Ai) = P(A1) + P(A2) + ... + P(An) = Σi P(Ai) = 1 ... (11)
Since P(S) = 1, we have 1 = P(A1) + P(A2) + ... + P(An).

Corollary. If A and B are elements of E, then P(A) = P(AB) + P(AB').
Proof. We know that A = A(B ∪ B') = A∩S = A, and that AB ∩ AB' = ∅, so AB and AB' are mutually exclusive events. Therefore P(A) = P(AB ∪ AB') = P(AB) + P(AB').

Generalization. If A1, A2, A3, ... is a sequence of mutually exclusive events in E and ∪i Ai ∈ E, then
P(A1∪A2∪A3∪...) = Σi P(Ai).

4. Some laws (theorems) on inequalities of probabilities
A) If A and B ∈ E and A ⊂ B, then P(A) ≤ P(B) ... (12)
Proof. B = BA ∪ BA', but BA = A, therefore B = A ∪ BA'. Since A ∩ BA' = ∅, A and BA' are disjoint events. Hence P(B) = P(A ∪ BA') = P(A) + P(BA'). Since P(BA') ≥ 0 by axiom (i), it follows that P(A) ≤ P(B).
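The rules above lend themselves to a quick check by enumeration on a small finite sample space. The sketch below (an illustration assuming equally likely outcomes on a fair die, not part of the module) verifies the complement rule (4), the general addition law (5) and the monotonicity property (12).

    from fractions import Fraction

    S = {1, 2, 3, 4, 5, 6}
    P = lambda E: Fraction(len(E), len(S))        # equally likely outcomes on a fair die

    A = {2, 4, 6}          # even outcome
    B = {4, 5, 6}          # outcome greater than 3

    print(P(A | B) == P(A) + P(B) - P(A & B))     # rule (5), general addition law -> True
    print(P(S - A) == 1 - P(A))                   # rule (4), complement rule -> True
    print(P({4}) <= P(B))                         # rule (12), {4} is a subset of B -> True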
3. A chartered accountant applies for a job in two firms X and Y. He estimates that the probability of his being selected in firm X is 0.7, the probability of being rejected at Y is 0.5, and the probability that at least one of his applications is rejected is 0.6. What is the probability that he will be selected in one of the firms?

Solution. Let A and B denote the events that the chartered accountant is selected in firms X and Y respectively. We are given
P(A) = 0.7, P(B') = 0.5, P(A'∪B') = 0.6.
We are asked to obtain P(A∪B). To calculate P(A∪B) we need P(B) and P(AB).
P(B) = 1 − P(B') = 1 − 0.5 = 0.5.
P(AB) is obtained by applying DeMorgan's law: A'∪B' = (AB)', so
P(AB) = 1 − P(A'∪B') = 1 − 0.6 = 0.4.
The probability that the chartered accountant will be selected in one of the two firms X or Y is
P(A∪B) = P(A) + P(B) − P(AB) = 0.7 + 0.5 − 0.4 = 0.8.

4. The probability that a contractor will get a plumbing contract is 2/3, and the probability that he will not get an electric contract is 5/9. The probability of getting at least one contract is 4/5. What is the probability that he will get both contracts?

Solution. Let A be the event that the contractor will get a plumbing contract and B the event that he will get an electric contract. We are given
P(A) = 2/3, P(B') = 5/9, P(A∪B) = 4/5.
The question is to find P(AB). First we need P(B):
P(B) = 1 − P(B') = 1 − 5/9 = 4/9.
From P(A∪B) = P(A) + P(B) − P(AB) we get
P(AB) = P(A) + P(B) − P(A∪B) = 2/3 + 4/9 − 4/5 = 14/45.
Hence the probability that the contractor will get both contracts is 14/45.

1.6 Conditional Probability and Independence

Dear learner! Understanding conditional probability and the probability of independent events will be easier after first learning the concepts of joint and marginal probabilities.

Joint and marginal probabilities. In considering events which can be classified by two or more criteria, it is often helpful to portray the situation in one of the following tables.

Table 1. Classification by two criteria

                  B          B'          Marginal total
A                 P(AB)      P(AB')      P(A)
A'                P(A'B)     P(A'B')     P(A')
Marginal total    P(B)       P(B')       1

The probabilities inside the body of the table are joint probabilities, i.e. P(AB), P(AB'), P(A'B) and P(A'B'). In other words, they are probabilities of the joint occurrence of the two events. The probabilities given in the last column and the last row are marginal probabilities. Note that marginal probabilities are the usual probabilities and are obtained by summing the joint probabilities in a given row or column. For instance P(A) = P(AB) + P(AB') and P(B') = P(AB') + P(A'B').

Table 2. Classification by more than two criteria

                  Y1           Y2           Y3           ...   Yk           Marginal total
X1                P(X1,Y1)     P(X1,Y2)     P(X1,Y3)     ...   P(X1,Yk)     P(X1)
X2                P(X2,Y1)     P(X2,Y2)     P(X2,Y3)     ...   P(X2,Yk)     P(X2)
...
Xn                P(Xn,Y1)     P(Xn,Y2)     P(Xn,Y3)     ...   P(Xn,Yk)     P(Xn)
Marginal total    P(Y1)        P(Y2)        P(Y3)        ...   P(Yk)        1

Dear distance student! Here again the P(Xi, Yj) are joint probabilities, i.e. the likelihood of the joint occurrence of Xi and Yj. In the last column and the last row are the marginal probabilities. In this case
P(Xi) = Σj P(Xi, Yj) (row-wise) and P(Yj) = Σi P(Xi, Yj) (column-wise) ... (22)
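Tables 1 and 2 suggest a simple data structure: store the joint probabilities and recover the marginals by summing across a row or down a column. A small sketch follows (the numbers are made up purely for demonstration and are not taken from the module).

    # Joint probabilities laid out as in Table 1 (rows = A, A'; columns = B, B').
    joint = {
        ("A",  "B"): 0.20, ("A",  "B'"): 0.30,
        ("A'", "B"): 0.15, ("A'", "B'"): 0.35,
    }

    def marginal_row(x):
        """P(x): sum the joint probabilities across the row for x."""
        return sum(v for (xi, _), v in joint.items() if xi == x)

    def marginal_col(y):
        """P(y): sum the joint probabilities down the column for y."""
        return sum(v for (_, yj), v in joint.items() if yj == y)

    print(round(marginal_row("A"), 2))     # P(A)  = P(AB) + P(AB')  = 0.50
    print(round(marginal_col("B"), 2))     # P(B)  = P(AB) + P(A'B)  = 0.35
    print(round(sum(joint.values()), 2))   # all joint probabilities sum to 1.0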
Now we will discuss conditional probability. In practice, anyone who conducts random experiments is usually confronted with questions of the type: such and such has happened, now what is the probability that something else will happen? For instance, in an experiment of sampling from a box containing, say, 100 oranges of which 5 are defective, what is the probability that the third draw results in a defective orange given that the first two draws resulted in defective oranges? Probability questions of this kind are treated in the framework of conditional probability.

1.6.1 Conditional Probability

An experiment is repeated N times, and on each occasion we observe the occurrence or non-occurrence of two events, say A and B. Given these two events, we are interested in the probability of event A given that event B has occurred. This probability is known as the conditional probability of event A given that B has occurred, denoted by P(A/B). Similarly, the conditional probability of event B given that A has occurred is written P(B/A).

Definition. Let A and B be two events in E (the event space) of the given probability space. If P(B) > 0, the conditional probability that event A occurs given that B has occurred is defined by
P(A/B) = P(A∩B)/P(B) = P(AB)/P(B) = (joint probability)/(marginal probability of B) ... (23)
Similarly, P(B/A) = P(AB)/P(A), if P(A) > 0 ... (24)

Example 22. Two fair dice are thrown. Given that the first shows 3, what is the probability that the total exceeds 6?
Solution. Clearly S = {1,2,3,4,5,6} × {1,2,3,4,5,6}, so N(S) = 36. Let B be the event that the first die shows 3, and A the event that the total exceeds 6. In the usual notation
B = {(3, b) : 1 ≤ b ≤ 6}, and AB = {(3,4), (3,5), (3,6)} (the joint occurrence).
Hence P(A/B) = P(AB)/P(B) = (N(AB)/N)/(N(B)/N) = (3/36)/(6/36) = 3/6 = 1/2.

Example 23. A family has two children. What is the probability that both are boys, given that at least one is a boy?
Solution. The older and younger child may each be male or female, so there are four possible combinations of sexes, which we assume to be equally likely. Hence the sample space is S = {GG, GB, BG, BB}, with P(GG) = P(GB) = P(BG) = P(BB) = 1/4. The question is to obtain
P(BB / at least one boy) = P(BB / GB∪BG∪BB) = P(BB ∩ (GB∪BG∪BB)) / P(GB∪BG∪BB) = P(BB)/P(GB∪BG∪BB) = (1/4)/(3/4) = 1/3.

From the definition of conditional probability we can see that P(AB) = P(A/B)P(B) = P(B/A)P(A).

Dear colleague! An important effect of conditioning is that it reduces the sample space. To make this idea clear, let us revisit the formula P(A/B) = P(AB)/P(B). Using the frequency approach, P(AB) = N(AB)/N, where N(AB) is the number of occurrences of AB in the N repetitions, and P(B) = N(B)/N. Hence
P(A/B) = (N(AB)/N)/(N(B)/N) = N(AB)/N(B).
From this we can see that the given event B acts as our new sample space, whose size is smaller than the size of the actual sample space S.

In passing, we would like to tell you that actually all probabilities are conditional, because probabilities cannot be computed except on the basis of the probabilities of basic events. Frequently the conditioning statement in a probability statement is omitted, either because it is obvious in the context of the problem or because it is universally accepted; i.e. P(A) = P(A/S).
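Example 22 can be checked by brute-force enumeration of the 36 equally likely outcomes. The sketch below (an illustration, not part of the module) computes P(A/B) = N(AB)/N(B), exactly as in the reduced-sample-space argument above.

    from fractions import Fraction
    from itertools import product

    S = list(product(range(1, 7), repeat=2))                     # 36 equally likely outcomes

    B  = [(d1, d2) for d1, d2 in S if d1 == 3]                   # first die shows 3
    AB = [(d1, d2) for d1, d2 in S if d1 == 3 and d1 + d2 > 6]   # ...and the total exceeds 6

    P_A_given_B = Fraction(len(AB), len(B))   # N(AB)/N(B): B acts as the reduced sample space
    print(P_A_given_B)                        # 1/2, as in Example 22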
The next question to be investigated is whether the conditional probability satisfies the three axioms and the various properties of probability. The answer is yes.
i) P(A/B) = P(AB)/P(B) ≥ 0 for every A ∈ E, i.e. 0 ≤ P(A/B) ≤ 1.
ii) P(S/B) = P(SB)/P(B) = 1.
Proof. Since B ⊂ S, then S∩B = B. Hence P(S/B) = P(SB)/P(B) = P(B)/P(B) = 1.
iii) If A1, A2, ... is a sequence of mutually exclusive events in E with ∪i Ai ∈ E and Ai∩Aj = ∅ for i ≠ j, then
P(∪i Ai / B) = P((∪i Ai)∩B)/P(B) = P(∪i AiB)/P(B) = Σi P(AiB)/P(B) = P(A1/B) + P(A2/B) + P(A3/B) + ... = Σi P(Ai/B).

The conditional probabilities also satisfy the following rules (properties) of probability. Assume that the probability space is given and let B ∈ E with P(B) > 0.

1. P(∅/B) = 0, since P(∅/B) = P(∅B)/P(B) = 0 [because ∅B = ∅ and P(∅) = 0].

2. If A is an event in E (the event space), then P(A'/B) = 1 − P(A/B).
Proof. We know that A ∪ A' = S and A ∩ A' = ∅. Using this result we have
P(A ∪ A' / B) = P(A/B) + P(A'/B) = P(S/B) = 1 (by axiom iii),
so P(A/B) + P(A'/B) = 1, i.e. P(A'/B) = 1 − P(A/B).

3. If A1 and A2 ∈ E, then P(A1/B) = P(A1A2/B) + P(A1A2'/B).
Proof. We can write A1 = A1A2 ∪ A1A2', and we know that A1A2 ∩ A1A2' = A1(A2 ∩ A2') = ∅. Using this result we have
P(A1/B) = P(A1A2 ∪ A1A2' / B) = P(A1A2/B) + P(A1A2'/B).

4. For every two events A1 and A2 ∈ E, P(A1∪A2 / B) = P(A1/B) + P(A2/B) − P(A1A2/B).

1.6.2 Theorem of Total Probability

Let B1, B2, ..., Bn be a partition (a collection of mutually exclusive events) of the sample space S satisfying S = ∪j Bj, and let P(Bj) > 0 for j = 1, ..., n. Then for every A in E (the event space),
P(A) = Σj P(A/Bj) P(Bj) ... (25)
Proof. We know that A = A∩S = A(B1 ∪ B2 ∪ ... ∪ Bn) = AB1 ∪ AB2 ∪ ... ∪ ABn = ∪j ABj. Since the ABj are mutually exclusive,
P(A) = Σj P(ABj) = P(A/B1)P(B1) + P(A/B2)P(B2) + ... + P(A/Bn)P(Bn) = Σj P(A/Bj)P(Bj).

The total probability theorem helps us calculate the probability of a single event A using conditional probabilities and the concept of the simultaneous occurrence of two events. The formation of a partition of S means that when the experiment is performed, exactly one of the events Bj will occur. The result P(A) = Σj P(A/Bj)P(Bj) is an extremely useful relationship, because often when P(A) is required it may be difficult to compute it directly. But with the additional information that Bj has occurred, together with the joint probability P(ABj), we may be able to evaluate P(A/Bj) and thus compute P(A).

Example 24. A certain item is manufactured by three factories, say 1, 2 and 3. It is known that factory 1 produces twice as many items as factory 2, and that factories 2 and 3 produce the same number of items in a given production period. It is also known that 2% of the items produced by factories 1 and 2 are defective, while 4% of the items produced by factory 3 are defective. All the items produced are put into one stockpile, and then one item is chosen at random. What is the probability that this item is defective?

Solution. Let A = {the item is defective}, B1 = {the item came from factory 1}, B2 = {the item came from factory 2} and B3 = {the item came from factory 3}. Then P(B1) = 1/2, while P(B2) = P(B3) = 1/4. The probabilities that an item from each factory is defective are
P(A/B1) = 2% = 0.02, P(A/B2) = 2% = 0.02, P(A/B3) = 4% = 0.04.
Therefore
P(A) = P(A/B1)P(B1) + P(A/B2)P(B2) + P(A/B3)P(B3) = (0.02)(1/2) + (0.02)(1/4) + (0.04)(1/4) = 0.025.

1.6.3 Independence

In general, the occurrence of some event B changes the probability that another event A occurs, so that P(A) is replaced by P(A/B). If the probability remains unchanged, that is if P(A/B) = P(A), then we say that event A is independent of event B. In other words, the occurrence of B has no bearing on the occurrence of A.

Definition. Let A and B be two events in E of the given probability space. Events A and B are called independent if and only if any one of the following conditions is satisfied:
(i) P(AB) = P(A)P(B);
(ii) P(A/B) = P(A), if P(B) > 0; ... (26)
(iii) P(B/A) = P(B), if P(A) > 0.

Example 25. Suppose that a fair die is tossed twice. Let A = {the first die shows an even number} and B = {the second die shows 5 or 6}. Intuitively, events A and B are totally unrelated: knowing that B occurred provides no information about the occurrence of A. Here N(S) = 36 and the outcomes are equally likely, so
P(A) = 18/36 = 1/2, P(B) = 12/36 = 1/3, P(AB) = 6/36 = 1/6.
Hence P(AB) = 1/6 = (1/2)(1/3) = P(A)P(B), and also P(A/B) = P(AB)/P(B) = (1/6)/(1/3) = 1/2 = P(A), confirming that A and B are independent.
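Example 25 can likewise be verified by enumeration. The sketch below (an illustration, not part of the module) counts the outcomes in A, B and AB and confirms that P(AB) = P(A)P(B) and P(A/B) = P(A).

    from fractions import Fraction
    from itertools import product

    S = list(product(range(1, 7), repeat=2))
    N = len(S)                                          # 36 equally likely outcomes

    A  = [o for o in S if o[0] % 2 == 0]                # first die shows an even number
    B  = [o for o in S if o[1] in (5, 6)]               # second die shows 5 or 6
    AB = [o for o in S if o in A and o in B]

    P = lambda E: Fraction(len(E), N)
    print(P(A), P(B), P(AB))                 # 1/2 1/3 1/6
    print(P(AB) == P(A) * P(B))              # True: A and B are independent
    print(P(AB) / P(B) == P(A))              # True: P(A/B) = P(A)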
Independence of A and B implies independence of other events as well. If A and B are two independent events in E, then A and B', A' and B, and A' and B' are all pairs of independent events. Let us see some of these.

A) P(AB') = P(A)P(B')
Proof. We know that AB' = A − AB, so
P(AB') = P(A) − P(AB) = P(A) − P(A)P(B) = P(A)[1 − P(B)] = P(A)P(B') [because 1 − P(B) = P(B')].

B) P(A'B) = P(A')P(B)
Proof. We know that A'B = B − AB, hence
P(A'B) = P(B) − P(AB) = P(B) − P(B)P(A) = P(B)[1 − P(A)] = P(B)P(A') [because 1 − P(A) = P(A')].

C) P(A'B') = P(A')P(B').

The notion of independent events may be extended to more than two events.

Independence of several events. Let A1, A2, ..., An be a family of events in E. The events A1, A2, ..., An are called independent if and only if
P(AiAj) = P(Ai)P(Aj) for i ≠ j,
P(AiAjAk) = P(Ai)P(Aj)P(Ak) for i ≠ j, j ≠ k, i ≠ k,
and more generally
P(∩i Ai) = Πi P(Ai) [Π is the sign of multiplication] ... (27)
If the family {A1, A2, ..., An} has only the property that P(AiAj) = P(Ai)P(Aj) for i ≠ j, then it is called pairwise independent. Pairwise independence does not necessarily imply full independence; showing this is beyond the scope of this module.

The definition of independence of events is used not only to check whether two events are independent but also to model some experiments. For example, for a given experiment the nature of the events A and B might be such that we are willing to assume that A and B are independent; then the definition of independence gives the probability of the event AB in terms of P(A) and P(B).

Example 26. Consider the experiment of sampling with replacement from an urn containing M balls, of which K are black and M−K are white. Since balls are replaced after each draw, it is reasonable to assume that the outcome of the second draw is independent of the outcome of the first. Then
P(2 blacks in the first two draws) = P(black on 1st draw) × P(black on 2nd draw) = (K/M)(K/M) = (K/M)².

Dear distance student! Next we will see the multiplication rule of probability.

Multiplication rule of probability. Let A1, ..., An be events in E for which P(A1A2...An−1) > 0. Then
P(A1A2...An) = P(A1) P(A2/A1) P(A3/A1A2) ... P(An/A1A2...An−1) ... (28)
In other words, suppose we are interested in P(A1A2A3), and let the joint occurrence of A2A3 be denoted by B. We know from conditional probability that P(A1B) = P(A1/B)P(B), i.e. P(A1A2A3) = P(A1/A2A3)P(A2A3). We also know that P(A2A3) = P(A2/A3)P(A3). Substituting this in the above formula we get
P(A1A2A3) = P(A1/A2A3) P(A2/A3) P(A3).
It can be written in several other ways by permuting the subscripts, such as
P(A1A2A3) = P(A3/A1A2) P(A2/A1) P(A1).
Extending this chain rule to n events gives rule (28). If A1, A2, ..., An are independent, then
P(A1A2...An) = P(A1)P(A2)...P(An) = Πi P(Ai) ... (29)
The multiplication rule is primarily useful for experiments defined in terms of stages.
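The multiplication (chain) rule is easy to sanity-check on a staged experiment. The sketch below (an illustration with made-up numbers, not part of the module: an urn with 3 black and 7 white balls, sampled twice without replacement) computes P(B1B2) = P(B1)P(B2/B1) with the chain rule and confirms the value by enumerating all ordered pairs of draws.

    from fractions import Fraction
    from itertools import permutations

    balls = ["b"] * 3 + ["w"] * 7            # 3 black and 7 white balls (illustrative numbers)

    # Chain rule: P(B1 B2) = P(B1) * P(B2/B1), drawing twice without replacement.
    chain = Fraction(3, 10) * Fraction(2, 9)

    # Brute force: enumerate all ordered pairs of distinct balls.
    pairs = list(permutations(range(10), 2))
    both_black = [p for p in pairs if balls[p[0]] == "b" and balls[p[1]] == "b"]
    brute = Fraction(len(both_black), len(pairs))

    print(chain, brute, chain == brute)      # 1/15 1/15 True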
1.7 Bayes' Theorem

Many of the important problems of economics, science, engineering, etc. are concerned with problems of causation. If P(A/B) = 1 and B occurs, then obviously A occurs; under these circumstances it would be said that A is because of B, provided B happens before A. Suppose that the occurrence of one of n mutually exclusive events A1, A2, ..., An is necessary for the occurrence of B. Given that B has occurred, we wish to know which of the Ai preceded it, so we ask for the probability that Ai has occurred, given that B occurred. This is the problem which Bayes' theorem solves. The formal statement of the theorem is as follows.

Let S be the sample space of some experiment, and let the disjoint events A1, A2, ..., An be a partition of S satisfying S = ∪i Ai and P(Ai) > 0 for i = 1, ..., n. Then for every B ∈ E for which P(B) > 0,
P(Ai/B) = P(B/Ai)P(Ai) / Σj P(B/Aj)P(Aj) ... (30)
Proof. P(Ai/B) = P(AiB)/P(B), and P(AiB) = P(B/Ai)P(Ai), so P(Ai/B) = P(B/Ai)P(Ai)/P(B). But
P(B) = P(A1B) + P(A2B) + ... + P(AnB) = Σj P(B/Aj)P(Aj) (from the theorem of total probability).
Therefore P(Ai/B) = P(B/Ai)P(Ai) / Σj P(B/Aj)P(Aj).
This is known as Bayes' theorem because Bayes (1763) was one of the first to consider this problem.

Example 27. Three different machines are used to produce chocolate chip cookies by Lovely Company, which promises to have at least six chips in every cookie. Suppose machine No. 1 produces 20% of Lovely's cookies, No. 2 produces 30%, and No. 3 produces 50%. Also suppose that the machines represent different vintages of capital, so that 1% of the cookies produced by machine No. 1 are defective (in the sense that they have fewer than six chips), 2% of those produced by machine No. 2 are defective, and 3% of those produced by No. 3 are defective. If one cookie is chosen at random and observed to be defective, what is the probability that it was produced by machine No. 2?

Solution. Let Ai be the event that the randomly chosen cookie is produced by machine No. i. Thus P(A1) = 0.2, P(A2) = 0.3 and P(A3) = 0.5. If B is the event that a randomly drawn cookie is defective, then P(B/A1) = 0.01, P(B/A2) = 0.02 and P(B/A3) = 0.03. Using Bayes' theorem we have
P(A2/B) = P(B/A2)P(A2) / Σj P(B/Aj)P(Aj) = (0.3)(0.02) / [(0.2)(0.01) + (0.3)(0.02) + (0.5)(0.03)] = 0.006/0.023 ≈ 0.26.

Bayes' theorem has an interesting interpretation. The probabilities P(Ai) can be called "prior" probabilities, since they represent the probabilities that the different machines in the above example produced a randomly selected cookie before that cookie is checked to see whether it is defective. The conditional probability P(Ai/B) is called a "posterior" probability, since it represents our assignment of probabilities after the sample evidence of the defective cookie is obtained. Thus, stated verbally, Bayes' theorem says that the posterior probability of an event Ai is proportional to the probability of the sample evidence given Ai times the prior probability of Ai.

Activity 1.2
1. In your opinion, is the use of the term probability function, instead of probability, appropriate? Justify.
2. Explain the difference between mutual exclusiveness and independence of events.
3. Do you accept our treatment of probability as a function? Why?

Worked Examples

1. Nearly all manufacturers now submit their finished produce to lot acceptance sampling, where the probability of accepting the lot depends upon the quality of the lot. Suppose that lots of equal size are submitted to acceptance sampling. The percentage defective in a lot is either 1%, 2%, 3% or 4%, and the probabilities of acceptance in each case are given by:

% defective    Probability of acceptance
1              0.95
2              0.70
3              0.30
4              0.10

Suppose it is known from long experience that P(A1) = 0.2, P(A2) = 0.3, P(A3) = 0.3 and P(A4) = 0.2. Given that a lot has been rejected, what is the probability that it is 1% defective?

Solution. We will use Bayes' theorem to solve this problem. Let Ai denote the event that the lot is i% defective (i = 1, 2, 3, 4), and let B be the event that the lot is rejected. Then
P(B/A1) = 0.05, P(B/A2) = 0.30, P(B/A3) = 0.70, P(B/A4) = 0.90.
The question is to compute P(A1/B). Using Bayes' rule,
P(A1/B) = P(B/A1)P(A1) / Σj P(B/Aj)P(Aj) = (0.05)(0.2) / [(0.05)(0.2) + (0.30)(0.3) + (0.70)(0.3) + (0.90)(0.2)] = 0.01/0.49 ≈ 0.02.
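Bayes' theorem lends itself to a small reusable function. The sketch below (an illustration, not part of the module) recomputes the worked example above: priors 0.2, 0.3, 0.3, 0.2 for the four defect levels and rejection probabilities 0.05, 0.30, 0.70, 0.90.

    def bayes(priors, likelihoods):
        """Posterior P(A_i/B) for each i, given priors P(A_i) and likelihoods P(B/A_i)."""
        joint = [p * l for p, l in zip(priors, likelihoods)]
        total = sum(joint)                     # P(B), by the theorem of total probability
        return [j / total for j in joint]

    priors      = [0.2, 0.3, 0.3, 0.2]         # P(A1), ..., P(A4): shares of the four lot types
    likelihoods = [0.05, 0.30, 0.70, 0.90]     # P(B/Ai): probability a lot of each type is rejected

    posteriors = bayes(priors, likelihoods)
    print(round(posteriors[0], 2))             # P(A1/B) = 0.01/0.49, about 0.02 as computed above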
2. Consider a large lot of items, say 10,000. Suppose that 10% of these items are defective and 90% are non-defective. Two items are chosen. What is the probability that both items are non-defective?

Solution. Let A = {the first item is non-defective} and B = {the second item is non-defective}.
i) If sampling is done with replacement, A and B may be assumed to be independent. Hence
P(AB) = P(A)P(B) = 0.9 × 0.9 = 0.81.
ii) If sampling is done without replacement,
P(AB) = P(B/A)P(A), with P(A) = 0.9 and P(B/A) = 8999/9999 ≈ 0.9, so P(AB) = (8999/9999)(0.9) ≈ 0.81.

3. There are 5 urns, numbered 1 to 5. Each urn contains 10 balls. Urn i has i defective balls and 10−i non-defective balls, i = 1, 2, ..., 5. Consider the following experiment: first an urn is selected at random, and then a ball is selected at random from the selected urn.
i) What is the probability that a defective ball will be selected?
ii) Given that the selected ball is defective, what is the probability that it came from urn 5?

Solution. Let A be the event that a defective ball is selected, and Bi the event that urn i is selected. We have P(Bi) = 1/5 for all i, and P(A/Bi) = i/10 for i = 1, ..., 5.
i) The question asks for P(A). Using the theorem of total probability,
P(A) = Σi P(A/Bi)P(Bi) = (1/5)(1/10 + 2/10 + 3/10 + 4/10 + 5/10) = (1/5)(15/10) = 3/10.
ii) The question asks for P(B5/A). Using Bayes' formula,
P(B5/A) = P(A/B5)P(B5) / Σi P(A/Bi)P(Bi) = (5/10)(1/5) / (3/10) = (1/10)/(3/10) = 1/3.

4. An urn contains 10 balls, of which 3 are black and 7 are white. The following game is played: at each trial a ball is selected at random, its colour is noted, and it is replaced along with two additional balls of the same colour. What is the probability that a black ball is selected in each of the first 3 trials?

Solution. Let Bi be the event that a black ball is selected on the i-th trial. The question is to find P(B1B2B3). Applying the multiplication rule, we have
P(B1B2B3) = P(B1) P(B2/B1) P(B3/B1B2).
P(B1) = 3/10, and P(B2/B1) = 5/12, because the selected ball is replaced together with two additional balls of the same colour, so there will be 5 black balls and the total number of balls will be 12. Similarly P(B3/B1B2) = 7/14 = 1/2. Hence
P(B1B2B3) = (3/10)(5/12)(7/14) = 105/1680 = 1/16.
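Worked example 4 can be double-checked with a short simulation of the reinforcement scheme (a sketch, not part of the module): each trial draws a ball, notes its colour, and puts it back together with two more balls of the same colour.

    import random
    from fractions import Fraction

    def three_blacks_once(rng):
        """One play of the game; True if the first three draws are all black."""
        black, white = 3, 7
        for _ in range(3):
            if rng.random() < black / (black + white):
                black += 2                      # ball returned plus two more of the same colour
            else:
                return False
        return True

    rng = random.Random(0)
    trials = 200_000
    estimate = sum(three_blacks_once(rng) for _ in range(trials)) / trials
    exact = Fraction(3, 10) * Fraction(5, 12) * Fraction(7, 14)
    print(round(estimate, 3), exact)            # roughly 0.06 versus the exact value 1/16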
