
EU Business School Munich

Quantitative Business Methods


Lecturer: Hashem Zarafat
Email: hashem.zarafat@euruni.edu

QUANTITATIVE BUSINESS METHODS


UNIT 1 – PROBABILITY THEORIES
Learning Objectives

➢ Introduction to random events, sample spaces and events.
➢ Basic probability calculations.
➢ Introduction to conditional probabilities.
➢ Bayes’ theorem (for revising probabilities).
➢ The total probability theorem.
Uncertainties

Managers often base their decisions on an analysis of uncertainties such as the following:

What are the chances that sales will decrease if we increase prices?

What is the likelihood a new assembly method will increase productivity?

What are the odds that a new investment will be profitable?
Probability

Probability is a numerical measure of the likelihood that an event will occur.

Probability values are always assigned on a scale from 0 to 1.

A probability near zero indicates an event is quite unlikely to occur.

A probability near one indicates an event is almost certain to occur.
Probability as a Numerical Measure
of the Likelihood of Occurrence

Increasing likelihood of occurrence along the probability scale from 0 to 1:

Probability 0  - the event is very unlikely to occur.
Probability .5 - the occurrence of the event is just as likely as it is unlikely.
Probability 1  - the event is almost certain to occur.
Assigning Probabilities
◼ Basic Requirements for Assigning Probabilities

1. The probability assigned to each experimental outcome must be between 0 and 1, inclusively.

0 ≤ P(Ei) ≤ 1 for all i

where:
Ei (event) is the ith experimental outcome and P(Ei) is its probability
Assigning Probabilities
◼ Basic Requirements for Assigning Probabilities

2. The sum of the probabilities for all experimental outcomes must equal 1.

P(E1) + P(E2) + . . . + P(En) = 1

where:
n is the number of experimental outcomes
Assessing Probability

There are three approaches to assessing the probability of an uncertain event:

1. a priori (Classical) - based on prior knowledge of the process. Assuming all outcomes are equally likely:

   probability of occurrence = X / T = number of ways the event can occur / total number of elementary outcomes

2. empirical (Relative Frequency) - based on observed data:

   probability of occurrence = number of ways the event can occur / total number of elementary outcomes

3. subjective probability - based on a combination of an individual’s past experience, personal opinion, and analysis of a particular situation.
Example of a priori probability
When randomly selecting a day from the year 2010, what is the probability the day is in January?

Probability of day in January = X / T = number of days in January / total number of days in 2010

= 31 days in January / 365 days in 2010 = 31/365
Example of a priori probability
Consider a standard deck of cards that has 26 red cards and 26 black cards. The probability of selecting a black card is

Probability of selecting a black card = X / T = number of black cards / total number of cards = 26/52 = 0.50
Another example of a priori/ Classical Method

◼ Rolling a Die
If an experiment has n possible outcomes, the
classical method would assign a probability of 1/n
to each outcome.

Experiment: Rolling a die
Sample Space: S = {1, 2, 3, 4, 5, 6}
Probabilities: Each sample point has a 1/6 chance of occurring.
Empirical/ Relative Frequency Method

◼ Example: Lucas Tool Rental

Lucas Tool Rental would like to assign probabilities to the number of car polishers it rents each day. Office records show the following frequencies of daily rentals for the last 40 days.

Number of Polishers Rented    Number of Days
            0                        4
            1                        6
            2                       18
            3                       10
            4                        2
Empirical/ Relative Frequency Method (2)
◼ Example: Lucas Tool Rental
Each probability assignment is given by dividing
the frequency (number of days) by the total frequency
(total number of days).

Number of Polishers Rented    Number of Days    Probability
            0                        4             .10   (= 4/40)
            1                        6             .15
            2                       18             .45
            3                       10             .25
            4                        2             .05
                                    40            1.00
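
A minimal Python sketch (added here, not part of the original slides; the variable names are illustrative) that reproduces these relative-frequency assignments from the rental records:

    # Relative frequency method: probability = frequency / total frequency
    rentals = {0: 4, 1: 6, 2: 18, 3: 10, 4: 2}    # polishers rented -> number of days
    total_days = sum(rentals.values())            # 40
    probabilities = {k: days / total_days for k, days in rentals.items()}
    print(probabilities)                          # {0: 0.1, 1: 0.15, 2: 0.45, 3: 0.25, 4: 0.05}
    print(round(sum(probabilities.values()), 4))  # 1.0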
Empirical/ Relative Frequency Method

• Surveys are used often to generate empirical probabilities.

• For example, if you take a survey of students, and 60% state that they have part-time jobs, then there is a 0.60 probability that an individual student has a part-time job.
Subjective Method

◼ When economic conditions and a company’s circumstances change rapidly, it might be inappropriate to assign probabilities based solely on historical data.
◼ We can use any data available as well as our
experience and intuition, but ultimately a probability
value should express our degree of belief that the
experimental outcome will occur.
◼ The best probability estimates often are obtained by
combining the estimates from the classical or relative
frequency approach with the subjective estimate.
Events and Their Probabilities

◼ Example: Rolling a Die

Event E = Getting an even number when rolling a die
E = {2, 4, 6}
P(E) = P(2) + P(4) + P(6)
     = 1/6 + 1/6 + 1/6
     = 3/6 = .5
Sample Space – My students

The Sample Space is the collection of all possible outcomes.

e.g. All 6 faces of a die

e.g. All 52 cards of a bridge deck
Complement of an Event

The complement of event A is defined to be the event consisting of all sample points that are not in A.

The complement of A is denoted by Ac.

[Venn diagram: event A and its complement Ac within sample space S]
Union of Two Events

The union of events A and B is the event containing all sample points that are in A or B or both.

The union of events A and B is denoted by A ∪ B.

[Venn diagram: events A and B within sample space S]
Intersection of Two Events

The intersection of events A and B is the set of all sample points that are in both A and B.

The intersection of events A and B is denoted by A ∩ B.

[Venn diagram: overlapping events A and B within sample space S; the overlap is the intersection of A and B]
Addition Law
The addition law provides a way to compute the
probability of the union of two events (event A, or B,
or both A and B occurring).

The law is written as:

P(A ∪ B) = P(A) + P(B) - P(A ∩ B)


Example of Union of Two Events

In a survey of 1000 households:
• 250 said that they actually purchased big-screen televisions.
• 300 said that they planned to purchase.
• 200 said that they planned to purchase and actually purchased.
What is the probability of those who planned to purchase or actually purchased?
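
A quick check of this survey question (an added sketch, not part of the original slides) using the addition law from the previous slide:

    # P(planned or purchased) = P(planned) + P(purchased) - P(planned and purchased)
    n = 1000
    p_purchased = 250 / n
    p_planned = 300 / n
    p_both = 200 / n
    print(round(p_planned + p_purchased - p_both, 2))   # 0.35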
General Addition Rule Example

P(Jan. or Wed.) = P(Jan.) + P(Wed.) - P(Jan. and Wed.)
                = 31/365 + 52/365 - 4/365 = 79/365

Don’t count the four Wednesdays in January twice!

            Jan.    Not Jan.    Total
Wed.          4        48         52
Not Wed.     27       286        313
Total        31       334        365
Mutually Exclusive Events

Two events are said to be mutually exclusive if the events have no sample points in common.

Two events are mutually exclusive if, when one event occurs, the other cannot occur.

[Venn diagram: disjoint events A and B within sample space S]
Mutually Exclusive Events

If events A and B are mutually exclusive, P(A ∩ B) = 0.

The addition law for mutually exclusive events is:

P(A ∪ B) = P(A) + P(B)

There is no need to include “- P(A ∩ B)”.
Conditional Probability

The probability of an event given that another event has occurred is called a conditional probability.

The conditional probability of A given B is denoted by P(A|B).

A conditional probability is computed as follows:

P(A|B) = P(A ∩ B) / P(B)
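
A small added illustration of this formula (not in the original slides), reusing the numbers from the earlier household survey: of the 30% who planned to purchase, the share that actually purchased is the conditional probability.

    # P(purchased | planned) = P(purchased and planned) / P(planned)
    p_planned = 300 / 1000
    p_planned_and_purchased = 200 / 1000
    print(round(p_planned_and_purchased / p_planned, 4))   # 0.6667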
Using Decision Trees
Multiplication Law

The multiplication law provides a way to compute the probability of the intersection of two events.

The law is written as:

P(A ∩ B) = P(B) P(A|B)
Conditional Probability and Conjunction Fallacy

What is more probable?

1. Student A will get straight As.
2. Student A will party every day and get straight As.

1. Student B will get very lucky with his/her career.
2. Student B will get very lucky with career and marriage.
Independent Events (no correlation)

If the probability of event A is not changed by the existence of event B, we would say that events A and B are independent.

Two events A and B are independent if:

P(A|B) = P(A) or P(B|A) = P(B)


Multiplication Law
for Independent Events
The multiplication law also can be used as a test to see
if two events are independent.

The law is written as:

P(A ∩ B) = P(A) P(B)
Independent Events and Gambler’s Fallacy

Example: Multiple-choice questions


Mutual Exclusiveness and Independence

Do not confuse the notion of mutually exclusive events with that of independent events.

Two events with nonzero probabilities cannot be both mutually exclusive and independent.

If one mutually exclusive event is known to occur, the other cannot occur; thus, the probability of the other event occurring is reduced to zero (and they are therefore dependent).

Two events that are not mutually exclusive might or might not be independent.
Mutual Exclusiveness and Independence (2)

If two events A and B are independent:


Consider a fair coin and a fair six-sided die. Let event A be obtaining heads, and event B
be rolling a 6. Then we can reasonably assume that events A and B are independent,
because the outcome of one does not affect the outcome of the other. The probability that
both A and B occur is
P(A and B) = P(A)P(B) = (1/2)(1/6) = 1/12.

An example of a mutually exclusive event:


Consider a fair six-sided die as before, only in addition to the numbers 1 through 6 on
each face, we have the property that the even-numbered faces are colored red, and the
odd-numbered faces are colored green. Let event A be rolling a green face, and event B be
rolling a 6. Then
P(A) = ½ ; P(B) = 1/6

as in our previous example. But it is obvious that events A and B cannot simultaneously
occur, since rolling a 6 means the face is red, and rolling a green face means the number
showing is odd. Therefore
• P(A and B) = 0.
Bayes’ Theorem

◼ Often we begin probability analysis with initial or prior probabilities.
◼ Then, from a sample, special report, or a product test we obtain some additional information.
◼ Given this information, we calculate revised or posterior probabilities.
◼ Bayes’ theorem provides the means for revising the prior probabilities.

Application of Bayes’ Theorem:  Prior Probabilities → New Information → Posterior Probabilities
Bayes’ Theorem
◼ Example: Quality of Purchased Parts
We can apply Bayes’ theorem to a manufacturing
firm that receives shipments of parts from two
different suppliers. Let A1 denote the event that a part
is from supplier 1 and A2 denote the event that a part
is from supplier 2.
Currently, 65% of the parts purchased by the
company are from supplier 1, and the remaining 35%
are from supplier 2. Thus, if a part is selected at
random, we would assign the prior probabilities P(A1)
= 0.65 and P(A2) = 0.35.
Bayes’ Theorem
◼ Example: Quality of Purchased Parts
The quality of the purchased parts varies with the
source of supply. We will let G denote the event that a
part is good and B denote the event that a part is bad.
Based on historical data, the conditional probabilities
of receiving good and bad parts from the two
suppliers are:
P(G | A1) = 0.98 and P(B | A1) = 0.02
P(G | A2) = 0.95 and P(B | A2) = 0.05
Tree Diagram
◼ Example: Quality of Purchased Parts

Supplier            Part Quality           Experimental Outcomes

P(A1) = .65    →    P(G|A1) = .98    →    P(A1 ∩ G) = .6370
               →    P(B|A1) = .02    →    P(A1 ∩ B) = .0130

P(A2) = .35    →    P(G|A2) = .95    →    P(A2 ∩ G) = .3325
               →    P(B|A2) = .05    →    P(A2 ∩ B) = .0175
New Information
◼ Example: Quality of Purchased Parts
Now suppose that the parts from the two suppliers
are used in the firm’s manufacturing process and
that a bad part causes a machine to break down.
What is the probability that the bad part came from
supplier 1 and what is the probability that it came
from supplier 2?
With the information in the probability tree, we
can use Bayes’ theorem to answer these questions.
Bayes’ Theorem
◼ To find the posterior probability that event Ai will
occur given that event B has occurred, we apply
Bayes’ theorem.

P(Ai|B) = P(Ai) P(B|Ai) / [ P(A1) P(B|A1) + P(A2) P(B|A2) + ... + P(An) P(B|An) ]

◼ Bayes’ theorem is applicable when the events for which we want to compute posterior probabilities are mutually exclusive and their union is the entire sample space.
Posterior Probabilities
◼ Example: Quality of Purchased Parts
Given that the part received was bad, we revise
the prior probabilities as follows:

P(A1|B) = P(A1) P(B|A1) / [ P(A1) P(B|A1) + P(A2) P(B|A2) ]
        = (.65)(.02) / [ (.65)(.02) + (.35)(.05) ]
        = .4262
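
A minimal sketch (added, not part of the original slides) that reproduces this revision for both suppliers:

    # Bayes' theorem for the purchased-parts example
    p_a1, p_a2 = 0.65, 0.35                      # prior probabilities
    p_bad_given_a1, p_bad_given_a2 = 0.02, 0.05  # P(bad part | supplier)
    p_bad = p_a1 * p_bad_given_a1 + p_a2 * p_bad_given_a2   # total probability of a bad part
    print(round(p_bad, 4))                           # 0.0305
    print(round(p_a1 * p_bad_given_a1 / p_bad, 4))   # 0.4262 = P(A1 | B)
    print(round(p_a2 * p_bad_given_a2 / p_bad, 4))   # 0.5738 = P(A2 | B)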
Bayes’ Theorem: Tabular Approach
◼ Example: Quality of Purchased Parts
• Step 1
Prepare the following three columns:
Column 1 - The mutually exclusive events for
which posterior probabilities are desired.
Column 2 - The prior probabilities for the events.
Column 3 - The conditional probabilities of the
new information given each event.
Bayes’ Theorem: Tabular Approach
◼ Example: Quality of Purchased Parts
• Step 1
(1)        (2)              (3)              (4)    (5)
Events     Prior            Conditional
  Ai       Probabilities    Probabilities
           P(Ai)            P(B|Ai)
  A1         .65              .02
  A2         .35              .05
            1.00
Bayes’ Theorem: Tabular Approach
◼ Example: Quality of Purchased Parts
• Step 2
Prepare the fourth column:
Column 4
Compute the joint probabilities for each event and
the new information B by using the multiplication
law.
Multiply the prior probabilities in column 2 by the
corresponding conditional probabilities in column 3.
That is, P(Ai ∩ B) = P(Ai) P(B | Ai).
Bayes’ Theorem: Tabular Approach
◼ Example: Quality of Purchased Parts
• Step 2
(1)        (2)              (3)              (4)              (5)
Events     Prior            Conditional      Joint
  Ai       Probabilities    Probabilities    Probabilities
           P(Ai)            P(B|Ai)          P(Ai ∩ B)
  A1         .65              .02              .0130   (= .65 × .02)
  A2         .35              .05              .0175
            1.00
Bayes’ Theorem: Tabular Approach
◼ Example: Quality of Purchased Parts
• Step 2 (continued)
We see that there is a .0130 probability that the part came from supplier 1 and is bad.
We see that there is a .0175 probability that the part came from supplier 2 and is bad.
Bayes’ Theorem: Tabular Approach
◼ Example: Quality of Purchased Parts
• Step 3

Sum the joint probabilities in Column 4. The sum is the probability of the new information, P(B). The sum .0130 + .0175 shows an overall probability of .0305 of a bad part being received.
Bayes’ Theorem: Tabular Approach
◼ Example: Quality of Purchased Parts
• Step 3
(1)        (2)              (3)              (4)              (5)
Events     Prior            Conditional      Joint
  Ai       Probabilities    Probabilities    Probabilities
           P(Ai)            P(B|Ai)          P(Ai ∩ B)
  A1         .65              .02              .0130
  A2         .35              .05              .0175
            1.00                       P(B) = .0305
Bayes’ Theorem: Tabular Approach
◼ Example: Quality of Purchased Parts
• Step 4
Prepare the fifth column:
Column 5
Compute the posterior probabilities using the
basic relationship of conditional probability.
P(Ai|B) = P(Ai ∩ B) / P(B)

The joint probabilities P(Ai ∩ B) are in column 4 and the probability P(B) is the sum of column 4.
Bayes’ Theorem: Tabular Approach
◼ Example: Quality of Purchased Parts
• Step 4
(1)        (2)              (3)              (4)              (5)
Events     Prior            Conditional      Joint            Posterior
  Ai       Probabilities    Probabilities    Probabilities    Probabilities
           P(Ai)            P(B|Ai)          P(Ai ∩ B)        P(Ai|B)
  A1         .65              .02              .0130            .4262   (= .0130/.0305)
  A2         .35              .05              .0175            .5738
            1.00                       P(B) = .0305            1.0000
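
The tabular procedure generalizes directly. Below is a sketch of a small helper function (a hypothetical illustration, not something from the slides) that builds columns 4 and 5 for any set of mutually exclusive events:

    def bayes_table(priors, conditionals):
        """Return joint probabilities, P(B) and posterior probabilities for
        mutually exclusive events with priors P(Ai) and conditionals P(B|Ai)."""
        joints = [p * c for p, c in zip(priors, conditionals)]   # column 4
        p_b = sum(joints)                                        # sum of column 4
        posteriors = [j / p_b for j in joints]                   # column 5
        return joints, p_b, posteriors

    joints, p_b, posteriors = bayes_table([0.65, 0.35], [0.02, 0.05])
    print([round(j, 4) for j in joints])        # [0.013, 0.0175]
    print(round(p_b, 4))                        # 0.0305
    print([round(p, 4) for p in posteriors])    # [0.4262, 0.5738]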
Total Probability Theorem

P(B) = Σ P(Ai) P(B|Ai),  summed over i = 1, ..., n

In a country, 60% of registered voters are Republicans, 30% are Democrats, and 10% are independents. A survey asked these voters about military spending, and the results show that 40% of the Republicans, 65% of the Democrats, and 55% of the independents opposed it. What is the probability that a randomly selected voter in this country opposes increased military spending?
Total Probability Theorem - Example

Sample Space = Ω = registered voters in the country
• R = registered Republicans, Pr(R) = 0.6
• D = registered Democrats, Pr(D) = 0.3
• I = registered independents, Pr(I) = 0.1
• O = registered voters opposing increased military spending,
  Pr(O|R) = 0.4, Pr(O|D) = 0.65, Pr(O|I) = 0.55

By the total probability theorem:

Pr(O) = Pr(O|R) Pr(R) + Pr(O|D) Pr(D) + Pr(O|I) Pr(I)
      = (0.4 × 0.6) + (0.65 × 0.3) + (0.55 × 0.1) = 0.49
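
A quick added check (not in the original slides) of this total-probability calculation:

    # Total probability theorem for the voter example
    priors = {"R": 0.6, "D": 0.3, "I": 0.1}        # P(party affiliation)
    oppose = {"R": 0.40, "D": 0.65, "I": 0.55}     # P(opposes | party affiliation)
    p_oppose = sum(priors[k] * oppose[k] for k in priors)
    print(round(p_oppose, 2))                      # 0.49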
Class Exercise - 1

The probability that a person has a certain disease is 0.03. Medical diagnostic tests are available to determine whether the person actually has the disease. If the disease is actually present, the probability that the medical diagnostic test will give a positive result (indicating that the disease is present) is 0.90. If the disease is not actually present, the probability of a positive test result (indicating that the disease is present) is 0.02. Suppose that the medical diagnostic test has given a positive result (indicating that the disease is present). What is the probability that the disease is actually present? What is the probability of a positive test result?

Event Di                      Prior          Conditional     Joint              Revised
                              Probability    Probability     Probability        Probability
                              P(Di)          P(T|Di)         P(T|Di) P(Di)      P(Di|T) = P(T|Di) P(Di) / [ P(T|D) P(D) + P(T|D') P(D') ]

D  = has disease              0.03           0.90            0.0270             P(D|T)  = 0.0270 / 0.0464 = 0.582
D' = does not have disease    0.97           0.02            0.0194             P(D'|T) = 0.0194 / 0.0464 = 0.418
                                                             P(T) = 0.0464
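
A short added check (not part of the original slides) confirming the figures in the table:

    # Bayes' theorem for the diagnostic-test exercise
    p_d, p_no_d = 0.03, 0.97
    p_pos_given_d, p_pos_given_no_d = 0.90, 0.02
    p_pos = p_d * p_pos_given_d + p_no_d * p_pos_given_no_d   # P(positive test result)
    print(round(p_pos, 4))                          # 0.0464
    print(round(p_d * p_pos_given_d / p_pos, 3))    # 0.582 = P(disease | positive)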
Class Exercise - 2

Laid-off workers who become entrepreneurs because they cannot find meaningful employment with another company are known as entrepreneurs by necessity. The Wall Street Journal reports that these entrepreneurs by necessity are less likely to grow into large businesses than entrepreneurs by choice (J. Bailey, “Desire -- More Than Need -- Build a Business,” Wall Street Journal, May 21, 2001, B4). This article states that 89% of entrepreneurs in the United States are entrepreneurs by choice and 11% are entrepreneurs by necessity. Only 2% of entrepreneurs by necessity expect their new business to employ 20 or more people within five years, whereas 14% of entrepreneurs by choice expect to employ at least 20 people within five years.

If an entrepreneur is selected at random and that individual expects that his or her new business will employ 20 or more people within five years, what is the probability that this individual is an entrepreneur by choice?
Event (Ei)                         Prior          Conditional      Joint                Revised
                                   Probability    Probability      Probability          Probability
                                   P(Ei)          P(GE20|Ei)       P(Ei) P(GE20|Ei)     P(Ei|GE20)

Entrepreneurs by choice = EC       0.89           0.14             0.1246               0.1246 / 0.1268 = 0.98265
Entrepreneurs by necessity = EN    0.11           0.02             0.0022               0.0022 / 0.1268 = 0.01735
                                                                   P(GE20) = 0.1268
Solution

EC = entrepreneurs by choice    EN = entrepreneurs by necessity
GE20 = employ 20 or more people within five years
P(EC) = .89   P(EN) = .11   P(GE20 | EN) = .02   P(GE20 | EC) = .14

(a)
P(EC | GE20) = P(GE20 | EC) P(EC) / [ P(GE20 | EC) P(EC) + P(GE20 | EN) P(EN) ]
             = (.14)(.89) / [ (.14)(.89) + (.02)(.11) ]
             = 0.9826
Probability of getting an A

o Dataset: 57, 99 (A), 78, 72, 84, 95 (A)
o This is our prior probability (2 of the 6 grades are As).
o New information: the new teacher is so strict that she only lets 10% of the class get an A (grading distribution).
o Posterior probability of “A”?!
So, now we can…

• Provide definitions based on probabilities. Miracle? Accident? etc.
• Understand events and plan ahead better (e.g. the conjunction fallacy).
• Make more informed life choices.
The Question:
What is a reasonable probability…
1. … for a businessman to bet on?
2. … of winning a lottery?
❖ 7.15112384e-8 or ≈ 0.00000007
❖ cf. 0.7
3. … of winning in a roulette game?
❖ 0.027
❖ Red/ Black: 0.486 (0.236 for two bets…)
❖ Probability of winning 10 bets (independent events)?
❖ 2.0796141e-16 or ≈ 0.0000000000000002
❖ The negative expected value game
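
The roulette figures above are consistent with a single-zero (European) wheel; a quick sketch under that assumption (added, not part of the original slides):

    # Assumption: single-zero (European) roulette wheel with 37 pockets
    p_single_number = 1 / 37         # ≈ 0.027, one winning single-number bet
    p_red_or_black = 18 / 37         # ≈ 0.486, one winning red/black bet
    print(p_red_or_black ** 2)       # ≈ 0.2367, two winning red/black bets in a row
    print(p_single_number ** 10)     # ≈ 2.0796e-16, ten winning single-number bets in a row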
Computing Normal Probabilities

• Characteristics of the normal distribution
• The transformation formula
• Using the cumulative standardized normal distribution
• Example of download time for a website with average download time of 7 and standard deviation of 2
• P(X < 9)
• P(X < 7 or X > 9)
• P(5 < X < 9)
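
A sketch of these three probabilities using Python’s standard library (added, not part of the original slides), assuming download time X is normally distributed with mean 7 and standard deviation 2 as stated:

    from statistics import NormalDist

    x = NormalDist(mu=7, sigma=2)                # download time distribution
    print(round(x.cdf(9), 4))                    # P(X < 9)          = 0.8413
    print(round(x.cdf(7) + (1 - x.cdf(9)), 4))   # P(X < 7 or X > 9) = 0.6587
    print(round(x.cdf(9) - x.cdf(5), 4))         # P(5 < X < 9)      = 0.6827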
Finding an X Value Associated with a Known Probability

• What are the lower and upper values of X, symmetrically distributed around the mean, that include 95% of the download times for a video at the OURCAMPUS! website? (X ~ N(µ = 7, σ = 2))
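
A sketch using the inverse of the cumulative normal distribution (added, not part of the original slides) to find the symmetric interval that covers 95% of download times:

    from statistics import NormalDist

    x = NormalDist(mu=7, sigma=2)
    lower = x.inv_cdf(0.025)     # cuts off the lowest 2.5% of download times
    upper = x.inv_cdf(0.975)     # cuts off the highest 2.5% of download times
    print(round(lower, 2), round(upper, 2))   # 3.08 10.92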
