Week 3 Slides
We now introduce probability for discrete sample spaces (recall that these are
sample spaces with only a finite set of outcomes). Restricting attention to these
sample spaces lets us simplify the concepts and the presentation without
excessive mathematics.
Probability is used to quantify the likelihood, or chance, that an outcome of a
random experiment will occur. “The chance of rain today is 30%” is a
statement that quantifies our feeling about the possibility of rain.
The likelihood of an outcome is quantified by assigning it a number from the
interval [0, 1] (or a percentage from 0% to 100%).
Higher numbers indicate a more likely outcome.
A probability of 0 indicates an outcome will not occur. A probability of 1
indicates an outcome will occur with certainty.
Axioms of Probability
The probability of an outcome can be interpreted as our subjective probability, or
degree of belief, that the outcome will occur. Different individuals may assign different
probabilities to the same outcomes.
Another interpretation of probability is based on the conceptual model of repeated
replications of the random experiment. The probability of an outcome is interpreted as
the limiting value of the proportion of times the outcome occurs in n repetitions of the
random experiment as n increases beyond all bounds. For example, if we assign
probability 0.2 to the outcome that there is a corrupted pulse in a digital signal, we
might interpret this assignment as implying that, if we analyze many pulses,
approximately 20% of them will be corrupted. This example provides a relative
frequency interpretation of probability.
The proportion, or relative frequency, of replications of the experiment that result in
the outcome is 0.2. Probabilities are chosen so that the sum of the probabilities of all
outcomes in an experiment adds up to 1. This convention facilitates the relative
frequency interpretation of probability.
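The relative frequency interpretation can be illustrated with a short simulation (a sketch, not part of the slides; it assumes each pulse is corrupted independently with the probability 0.2 from the example above):

```python
# Sketch: estimating a probability by relative frequency.  Each pulse is
# assumed corrupted independently with probability 0.2; the observed
# proportion approaches 0.2 as the number of simulated pulses n grows.
import random

random.seed(42)  # reproducible run

def relative_frequency(p, n):
    """Proportion of n simulated pulses that come out corrupted."""
    corrupted = sum(1 for _ in range(n) if random.random() < p)
    return corrupted / n

for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(0.2, n))
```

As n increases, the printed proportions settle near 0.2, which is exactly the limiting-value interpretation described above.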
Probabilities for a random experiment are often assigned on the
basis of a reasonable model of the system under study.
One approach is to base probability assignments on the simple
concept of equally likely outcomes.
For example, suppose that we will select one laser diode randomly
from a batch of 100. Randomly implies that it is reasonable to
assume that each diode in the batch has an equal chance of being
selected. Because the sum of the probabilities must equal 1, the
probability model for this experiment assigns probability of 0.01 to
each of the 100 outcomes. We can interpret the probability by
imagining many replications of the experiment.
Equally Likely Outcomes: Whenever a sample space consists of N
possible outcomes that are equally likely, the probability of each
outcome is 1/N.
Probability of an Event: For a discrete sample space, the probability
of an event E, denoted P(E), equals the sum of the probabilities
of the outcomes in E.
Example:
A random experiment can result in one of the outcomes {a, b, c, d}
with probabilities 0.1, 0.3, 0.5, and 0.1, respectively. Let A denote
the event {a, b}, B the event {b, c, d}, and C the event {d}.Then,
P(A) = 0.1 + 0.3 = 0.4
P(B) = 0.3 + 0.5 + 0.1 = 0.9
P(C) = 0.1
If E1, E2, E3, …, Ek are mutually exclusive events (that is, each and every
pair of them has empty intersection), then
P(E1 ∪ E2 ∪ E3 ∪ … ∪ Ek) = P(E1) + P(E2) + P(E3) + … + P(Ek)
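The event-probability rule and the additivity of mutually exclusive events can be checked in a few lines (a sketch using the outcome probabilities from the example above):

```python
# Sketch: outcome probabilities from the example {a, b, c, d}.
# P(event) is the sum of the probabilities of its outcomes.
f = {"a": 0.1, "b": 0.3, "c": 0.5, "d": 0.1}

def prob(event):
    """P(E) = sum of the probabilities of the outcomes in E."""
    return sum(f[o] for o in event)

A = {"a", "b"}
B = {"b", "c", "d"}
C = {"d"}
print(prob(A), prob(B), prob(C))

# A and C are mutually exclusive, so P(A ∪ C) = P(A) + P(C)
assert abs(prob(A | C) - (prob(A) + prob(C))) < 1e-9
```

This reproduces P(A) = 0.4, P(B) = 0.9, and P(C) = 0.1 (up to floating-point rounding).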
Conditional Probability
Definition:
Sometimes probabilities need to be reevaluated as additional information
becomes available.
A useful way to incorporate additional information into a probability model is to
assume that the outcome that will be generated is a member of a given event.
This event, say A, defines the conditions that the outcome is known to satisfy.
Then probabilities can be revised to include this knowledge.
The probability of an event B, given the knowledge that the outcome will be in
event A, is called the conditional probability of B given A. It is written
P(B | A) and calculated as
P(B | A) = P(A ∩ B) / P(A), where P(A) > 0
Conditional Probability
Example:
A day’s production of 850 manufactured parts contains 50 parts
that do not meet customer requirements. Two parts are selected
randomly without replacement from the batch. What is the
probability that the second part is defective given that the first part
is defective? Let A denote the event that the first part selected is
defective, and let B denote the event that the second part selected
is defective. The probability needed can be expressed as P(B | A). If
the first part is defective, then prior to selecting the second part, the
batch contains 849 parts, of which 49 are defective; therefore,
P(B | A) = 49 / 849
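The counting behind this answer can be verified with exact arithmetic (a sketch; 850 parts with 50 defective, as in the example):

```python
# Sketch: 850 parts, 50 defective; two drawn without replacement.
from fractions import Fraction

total, defective = 850, 50

# P(A): first part defective
p_A = Fraction(defective, total)
# P(A ∩ B): both parts defective
p_A_and_B = Fraction(defective, total) * Fraction(defective - 1, total - 1)
# Conditional probability P(B | A) = P(A ∩ B) / P(A)
p_B_given_A = p_A_and_B / p_A

print(p_B_given_A)  # 49/849
```

Using `Fraction` keeps the arithmetic exact, so the result matches 49/849 with no rounding.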
Multiplication and Total Probability Rules
Multiplication Rule
P(B | A) = P(A ∩ B) / P(A)  ⇒  P(A ∩ B) = P(A) · P(B | A) = P(B) · P(A | B)
Total Probability Rule
P(B) = P(B ∩ A) + P(B ∩ A′) = P(A) · P(B | A) + P(A′) · P(B | A′)
Mutually Exclusive and Exhaustive Events: If two sets are mutually exclusive,
i.e. they have no intersection, and together they form the sample space, we call
them Mutually Exclusive and Exhaustive Events.
From the multiplication rule we can write P(A) · P(B | A) = P(B) · P(A | B), which yields
P(A | B) = P(B | A) · P(A) / P(B)
This is a very useful result that enables us to solve for P(A | B) in terms of P(B | A).
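A small numeric sketch of both rules (the probabilities below are illustrative, not from the slides):

```python
# Sketch with made-up numbers: suppose P(A) = 0.4, P(B|A) = 0.25, and
# P(B|A') = 0.05.  The total probability rule gives P(B); the rearranged
# multiplication rule then recovers P(A|B).
p_A = 0.4
p_B_given_A = 0.25
p_B_given_notA = 0.05

# Total probability: P(B) = P(A)·P(B|A) + P(A')·P(B|A')
p_B = p_A * p_B_given_A + (1 - p_A) * p_B_given_notA

# Rearranged multiplication rule: P(A|B) = P(B|A)·P(A) / P(B)
p_A_given_B = p_B_given_A * p_A / p_B

print(p_B, p_A_given_B)
```

Here P(B) = 0.4·0.25 + 0.6·0.05 = 0.13, and P(A | B) = 0.1/0.13 ≈ 0.77.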
Bayes’ Theorem
Definition:
If E1, E2, E3, …, Ek are k mutually exclusive and exhaustive events
and B is any event with P(B) > 0, then
P(E1 | B) = P(B | E1) · P(E1) / [P(B | E1) · P(E1) + P(B | E2) · P(E2) + … + P(B | Ek) · P(Ek)]
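Bayes’ theorem for k events can be sketched as a short function (the priors and conditional probabilities below are illustrative numbers, not from the slides):

```python
# Sketch of Bayes' theorem for k mutually exclusive and exhaustive
# events E1..Ek.  The priors and likelihoods are made-up example values.
def bayes(priors, likelihoods, i):
    """P(E_i | B) given priors P(E_j) and likelihoods P(B | E_j)."""
    numerator = likelihoods[i] * priors[i]
    denominator = sum(l * p for l, p in zip(likelihoods, priors))
    return numerator / denominator

priors = [0.5, 0.3, 0.2]          # P(E1), P(E2), P(E3); must sum to 1
likelihoods = [0.02, 0.05, 0.10]  # P(B | E1), P(B | E2), P(B | E3)

print(bayes(priors, likelihoods, 0))
```

The denominator is exactly the total probability rule applied over the exhaustive events, which is why the Ej must cover the whole sample space.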
Example: a random variable X assigns the values below to six equally likely
outcomes; f(x) is the probability mass function and F(x) = P(X ≤ x) is the
cumulative distribution function.

Outcome   a     b     c     d     e     f
x         0     0     1.5   1.5   2     3
f(x)      1/6   1/6   1/6   1/6   1/6   1/6

x         0        1.5        2        3
F(x)      2/6      4/6        5/6      6/6
P(X ≤ x)  P(X≤0)   P(X≤1.5)   P(X≤2)   P(X≤3)
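The CDF column can be built mechanically from the pmf (a sketch using the six equally likely outcomes of the table above):

```python
# Sketch: building F(x) = P(X ≤ x) from the pmf of the example above
# (six equally likely outcomes mapped to x = 0, 0, 1.5, 1.5, 2, 3).
from collections import Counter
from fractions import Fraction

outcomes_x = [0, 0, 1.5, 1.5, 2, 3]
pmf = {x: Fraction(n, len(outcomes_x))
       for x, n in Counter(outcomes_x).items()}

# Cumulative sums over the sorted support give F(x)
cdf = {}
running = Fraction(0)
for x in sorted(pmf):
    running += pmf[x]
    cdf[x] = running

print(pmf)
print(cdf)
```

Note that f(0) = 2/6 because two outcomes (a and b) both map to x = 0, and the last CDF value is always 1.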
Mean and the Variance of a Discrete Random Variable
Example:
Let’s find the Mean and the Variance of our previous example
Mean:
µ = E(X) = Σ x · f(x) = 0 · (1/6) + 0 · (1/6) + 1.5 · (1/6) + 1.5 · (1/6) + 2 · (1/6) + 3 · (1/6) = 8/6 ≈ 1.33
Variance:
σ² = Σ x² · f(x) − µ² = 17.5/6 − (8/6)² ≈ 2.92 − 1.78 = 1.14
Standard Deviation:
σ = √1.14 ≈ 1.07
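These quantities can be computed directly from the pmf (a sketch for the same six equally likely outcomes):

```python
# Sketch: E(X), Var(X), and the standard deviation for the example
# above, computed directly from the pmf (each outcome has f(x) = 1/6).
import math

xs = [0, 0, 1.5, 1.5, 2, 3]    # value of X for each outcome
p = 1 / len(xs)                # probability of each outcome

mean = sum(x * p for x in xs)                   # E(X) = Σ x·f(x)
variance = sum(x**2 * p for x in xs) - mean**2  # Var(X) = Σ x²·f(x) − µ²
stdev = math.sqrt(variance)

print(mean, variance, stdev)
```

This gives µ = 8/6 ≈ 1.33, σ² ≈ 1.14, and σ ≈ 1.07.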
Mean and the Variance of a function of RV
Recall that Mean = E(X) = µ. More generally, for any function h of the random variable X,
E(h(X)) = Σₓ h(x) · f(x)
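The formula E(h(X)) = Σ h(x)·f(x) can be sketched for the same example; h(x) = x² is chosen here as an illustration (it also gives the E(X²) term used in the variance above):

```python
# Sketch: E(h(X)) = Σ h(x)·f(x) for the six equally likely outcomes of
# the running example.
xs = [0, 0, 1.5, 1.5, 2, 3]
p = 1 / len(xs)

def expectation(h):
    """E(h(X)) = Σ h(x)·f(x) summed over all outcomes."""
    return sum(h(x) * p for x in xs)

print(expectation(lambda x: x))      # E(X)
print(expectation(lambda x: x**2))   # E(X²)
```

With h(x) = x this recovers the mean 8/6; with h(x) = x² it gives 17.5/6, the first term of the variance formula.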