OSCM OptDSM M7 Decision Analysis-Part1
Seokjun Youn
(syoun@arizona.edu)
Today’s Plan
Lecture 7: Decision Analysis
• Reading: OptDSM_Anderson_Ch13_Decision Analysis.pdf
1. Problem Formulation
2. Decision Making without Probabilities
3. Decision Making with Probabilities
4. Risk Analysis and Sensitivity Analysis
5. Decision Analysis with Sample Information
6. Computing Branch Probabilities with Bayes’ Theorem
Next Class
Lecture 7: Decision Analysis (cont’d)
Decision Analysis Examples
Managers often must make decisions in environments that are fraught with uncertainty.
Some Examples:
• A manufacturer introducing a new product into the marketplace
o What will be the reaction of potential customers?
o How much should be produced?
o Should the product be test-marketed?
o How much advertising is needed?
Decision Analysis Examples
Some Examples (cont’d)
• A government contractor bidding on a new contract.
o What will be the actual costs of the project?
o Which other companies might be bidding?
o What are their likely bids?
• An agricultural firm selecting the mix of crops and livestock for the season.
o What will be the weather conditions?
o Where are prices headed?
o What will costs be?
Decision Analysis
Decision analysis can be used to develop an optimal strategy when a decision
maker is faced with several decision alternatives and an uncertain or risk-
filled pattern of future events.
Even when a careful decision analysis has been conducted, the uncertain
future events make the final consequence uncertain.
The risk associated with any decision alternative is a direct result of the
uncertainty associated with the final consequence.
Problem Formulation
A decision problem is characterized by decision alternatives, states of nature,
and resulting payoffs.
The decision alternatives are the different possible strategies the decision
maker can employ.
The states of nature refer to future events, not under the control of the
decision maker, which may occur.
• States of nature should be defined so that they are mutually exclusive and
collectively exhaustive.
Example: Pittsburgh Development Corp.
The financial success of the project depends upon the size of the
condominium complex and the chance event concerning the demand for
the condominiums.
The statement of the PDC decision problem is to select the size of the new
complex that will lead to the largest profit given the uncertainty
concerning the demand for the condominiums.
Influence Diagrams
An influence diagram is a graphical device showing the relationships among
the decisions, the chance events, and the consequences.
Squares or rectangles depict decision nodes.
Circles or ovals depict chance nodes.
Diamonds depict consequence nodes.
Lines or arcs connecting the nodes show the direction of influence.
Payoff Tables
The consequence resulting from a specific combination of a decision
alternative and a state of nature is a payoff.
A table showing payoffs for all combinations of decision alternatives and states
of nature is a payoff table.
Payoffs can be expressed in terms of profit, cost, time, distance or any other
appropriate measure.
Decision Making without Probabilities
Decision Making without Probabilities
Three commonly used criteria for decision making when probability
information regarding the likelihood of the states of nature is unavailable are:
The Optimistic Approach (Maximax Criterion) – choose the decision with the largest possible payoff (or, when the entries are costs, the lowest possible cost).
The Conservative Approach (Maximin Criterion) – for each decision, list the minimum payoff; choose the decision corresponding to the maximum of these minimum payoffs. (For costs, list each decision’s maximum cost and choose the decision with the minimum of these.)
The Minimax Regret Approach.
Example: Optimistic Approach
An optimistic decision maker would use the optimistic (maximax)
approach. We choose the decision that has the largest single value in the
payoff table.
Decision    Maximum Payoff
d1          8
d2          14
d3          20    ← maximax decision (maximax payoff = 20)
Example: Conservative Approach
A conservative decision maker would use the conservative (maximin)
approach. List the minimum payoff for each decision. Choose the decision
with the maximum of these minimum payoffs.
Decision    Minimum Payoff
d1          7     ← maximin decision (maximin payoff = 7)
d2          5
d3          −9
Minimax Regret Approach
The minimax regret approach requires the construction of a regret table or an
opportunity loss table.
This is done by calculating for each state of nature the difference between
each payoff and the largest payoff for that state of nature.
Then, using this regret table, the maximum regret for each possible decision is
listed.
Example: Minimax Regret Approach
For the minimax regret approach, first compute a regret table by subtracting
each payoff in a column from the largest payoff in that column. In this
example, in the first column subtract 8, 14, and 20 from 20; etc.
For each decision list the maximum regret. Choose the decision with the
minimum of these values.
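As a concrete illustration, here is a minimal Python sketch of all three criteria, using the payoff table implied by these slides (maximum payoffs 8, 14, 20 and minimum payoffs 7, 5, −9 for d1–d3 over two states of nature); the small/medium/large size labels follow the PDC example and are otherwise an assumption.

```python
# Payoff table implied by the slides: rows are decisions, columns are the
# two states of nature (s1 = strong demand, s2 = weak demand), in $ millions.
payoffs = {
    "d1 (small)":  [8, 7],
    "d2 (medium)": [14, 5],
    "d3 (large)":  [20, -9],
}
n_states = 2

# Optimistic (maximax): choose the decision with the largest single payoff.
maximax = max(payoffs, key=lambda d: max(payoffs[d]))

# Conservative (maximin): choose the decision whose worst payoff is best.
maximin = max(payoffs, key=lambda d: min(payoffs[d]))

# Minimax regret: regret = (best payoff in the column) - (payoff received);
# choose the decision whose maximum regret is smallest.
col_best = [max(payoffs[d][j] for d in payoffs) for j in range(n_states)]
regret = {d: [col_best[j] - payoffs[d][j] for j in range(n_states)] for d in payoffs}
minimax_regret = min(regret, key=lambda d: max(regret[d]))

print(maximax, maximin, minimax_regret)  # d3 (large), d1 (small), d2 (medium)
```

Note that the three criteria select three different decisions from the same payoff table.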
Decision Making with Probabilities
Decision Making with Probabilities
Expected Value Approach
If probabilistic information regarding the states of nature is available, one may
use the expected value (EV) approach.
Here the expected return for each decision is calculated by summing the
products of the payoff under each state of nature and the probability of the
respective state of nature occurring.
The decision yielding the best expected return is chosen.
Expected Value of a Decision Alternative
The expected value of a decision alternative is the sum of weighted payoffs
for the decision alternative.
The expected value (EV) of decision alternative $d_i$ is defined as:

$$\mathrm{EV}(d_i) = \sum_{j=1}^{N} P(s_j)\,V_{ij}$$

where:
$N$ = the number of states of nature
$P(s_j)$ = the probability of state of nature $s_j$
$V_{ij}$ = the payoff corresponding to decision alternative $d_i$ and state of nature $s_j$
Decision Tree
A decision tree is a chronological representation of the decision problem.
A decision tree consists of nodes and branches.
• A decision node, represented by a square, indicates a decision to be made. The
branches represent the possible decision alternatives.
• An event node, represented by a circle, indicates a random event (i.e., states of
nature). The branches represent the possible outcomes of the random event.
At the end of each limb of a tree are the payoffs attained from the series of
branches making up that limb.
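As a minimal sketch of how such a tree is evaluated (the tree is "rolled back" from the payoffs toward the root), the Python snippet below takes an expected value at each event node and the best branch at each decision node. The node encoding is illustrative, and the payoffs and priors are the PDC-style numbers used elsewhere in this deck.

```python
# Roll back a decision tree: leaves are payoffs, event nodes average their
# branches by probability, decision nodes take the best branch.
def rollback(node):
    if isinstance(node, (int, float)):     # leaf: a payoff
        return node
    kind, branches = node
    if kind == "decision":                 # square node: choose the best branch
        return max(rollback(child) for child in branches)
    return sum(p * rollback(child) for p, child in branches)  # circle node

# PDC-style tree without sample information: one decision node whose three
# alternatives each lead to a chance node over strong (0.8) / weak (0.2) demand.
tree = ("decision", [
    ("event", [(0.8, 8), (0.2, 7)]),     # small complex
    ("event", [(0.8, 14), (0.2, 5)]),    # medium complex
    ("event", [(0.8, 20), (0.2, -9)]),   # large complex
])
print(rollback(tree))  # 14.2 -> the large complex has the best expected value
```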
Expected Value for Each Decision
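Under the PDC payoffs and the priors P(s1) = 0.8 and P(s2) = 0.2 used later in this deck, the expected values work out as follows:

EV(d1) = 0.8(8) + 0.2(7) = 7.8
EV(d2) = 0.8(14) + 0.2(5) = 12.2
EV(d3) = 0.8(20) + 0.2(−9) = 14.2   ← largest expected value, so d3 is chosen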
Expected Value of Perfect Information
Frequently, information is available which can improve the probability
estimates for the states of nature.
Expected Value of Perfect Information
Expected Value with Perfect Information (EVwPI): the expected value of the optimal decision if the state of nature were known before the decision is made; for each state of nature, the best payoff is weighted by that state’s probability.
The expected value of perfect information (EVPI) is the difference EVPI = |EVwPI − EVwoPI|, where EVwoPI is the best expected value without perfect information.
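A worked computation under the same PDC assumptions: with perfect information, PDC would build the large complex when demand is strong (payoff 20) and the small complex when demand is weak (payoff 7), so

EVwPI = 0.8(20) + 0.2(7) = $17.4 million
EVPI = EVwPI − EVwoPI = 17.4 − 14.2 = $3.2 million

which is the EVPI figure used in the efficiency calculation later in this deck.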
Risk Analysis and Sensitivity Analysis
Risk Analysis
Risk analysis helps the decision maker recognize the difference between:
the expected value of a decision alternative, and
the payoff that might actually occur
The risk profile for a decision alternative shows the possible payoffs for
the decision alternative along with their associated probabilities.
Example: Large Complex Decision Alternative
• Risk profile for the large complex decision alternative for a condominium project:
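Under the same priors, this risk profile shows a payoff of $20 million with probability 0.8 and a loss of $9 million with probability 0.2: the large complex has the best expected value (14.2) but is also the only alternative with a possible loss.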
Sensitivity Analysis
Sensitivity analysis can be used to determine how changes to the
following inputs affect the recommended decision alternative:
probabilities for the states of nature
values of the payoffs
If a small change in the value of one of the inputs causes a change in the
recommended decision alternative, extra effort and care should be taken
in estimating the input value.
Agenda
Announcement
Homework 4, Due: Apr 2, 11:59 pm
Today’s Plan
Lecture 7: Decision Analysis
• Reading: OptDSM_Anderson_Ch13_Decision Analysis.pdf
Decision Analysis with Sample Information
Computing Branch Probabilities with Bayes’ Theorem
• Decision Trees using Excel TreePlan Add-in
Next Class
Lecture 7: Decision Analysis (cont’d)
Decision Analysis with Sample Information
Decision Analysis with Sample Information
Frequently, decision makers have preliminary or prior probability assessments
for the states of nature that are the best probability values available at that time.
To make the best possible decision, the decision maker may want to seek
additional information about the states of nature.
This new information, often obtained through sampling, can be used to revise
the prior probabilities so that the final decision is based on more accurate
probabilities for the states of nature.
Example: Pittsburgh Development Corp.
Pittsburgh Development Corp. (PDC)’s management team is considering
a 6-month market research study designed to learn more about potential
market acceptance of the PDC condominium project.
Management anticipates that the market research study will provide one
of the following two results:
1. Favorable report: A significant number of the individuals contacted express
interest in purchasing a PDC condominium.
2. Unfavorable report: Very few of the individuals contacted express interest in
purchasing a PDC condominium.
Influence Diagram:
Sample Information
PDC has developed the following branch probabilities.
If the market research study is undertaken:
P(Favorable report) = P(F) = 0.77, P(Unfavorable report) = P(U) = 0.23
If the market research report is favorable: P(s1 | F) = 0.94, P(s2 | F) = 0.06
Sample Information (cont’d)
If the market research report is unfavorable: P(s1 | U) = 0.35, P(s2 | U) = 0.65
If the market research study is not undertaken, the prior probabilities are applicable: P(s1) = 0.8, P(s2) = 0.2
Decision Tree for the PDC Condominium Project
Decision Strategy
A decision strategy is a sequence of decisions and chance outcomes
where the decisions chosen depend on the yet-to-be-determined
outcomes of chance events.
Decision Tree for the PDC Condominium Project (cont’d)
For example, the probability of the $20 million payoff is (0.77)(0.94) = 0.72
Expected Value of Sample Information
The expected value of sample information (EVSI) is the additional expected
profit possible through knowledge of the sample or survey information.
• In the PDC problem, the market research study is the sample information used to
determine the optimal decision strategy.
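A worked computation, using the branch probabilities computed later in this deck and the PDC payoffs: after a favorable report (probability 0.77), the best alternative is the large complex, with EV = 0.94(20) + 0.06(−9) = 18.26; after an unfavorable report (probability 0.23), the best alternative is the medium complex, with EV = 0.35(14) + 0.65(5) = 8.15. Hence

EVwSI = 0.77(18.26) + 0.23(8.15) ≈ 15.93
EVSI = EVwSI − EVwoSI = 15.93 − 14.20 = $1.73 million

which is the figure used in the efficiency calculation on the next slide.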
Efficiency of Sample Information
Efficiency of sample information is the ratio of EVSI to EVPI, expressed as a percentage.
Because the EVPI provides an upper bound for the EVSI, efficiency is always between 0% and 100%.

E = (EVSI / EVPI) × 100 = ($1.73 million / $3.20 million) × 100 = 54.1%
The information from the market research study is 54.1% as efficient as perfect
information.
Computing Branch Probabilities with Bayes’ Theorem
Bayes’ Theorem (a.k.a. Bayes’ Decision Rule)
Bayes’ theorem describes the probability of an event based on prior knowledge
of the conditions that might be relevant to the event.
$$P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)} = \frac{P(B \mid A)\,P(A)}{P(B \mid A)\,P(A) + P(B \mid A^c)\,P(A^c)}$$

Note that $P(B \mid A)\,P(A) = P(A, B)$.
If A and B are independent, $P(B \mid A) = P(B)$, so $P(B)\,P(A) = P(A, B)$.
Example: Suppose 60% of the companies that increased their share price by more than 5% over a given period replaced their CEOs during that period. At the same time, only 35% of the companies that did not increase their share price by more than 5% in the same period replaced their CEOs.
Knowing that the probability that share prices grow by more than 5% is 4%, find the probability that the shares of a company that replaces its CEO will increase by more than 5%.
With A = "share price grows by more than 5%" and B = "CEO is replaced":
P(A | B) = (0.60)(0.04) / [(0.60)(0.04) + (0.35)(0.96)] = 0.024 / 0.36 ≈ 6.67%
Thus, the probability that the shares of a company that replaces its CEO will grow by more than 5% is 6.67%.
PDC Decision Tree
Now, returning back to the PDC Example, let
Computing Branch Probabilities
We will need conditional probabilities for all sample outcomes given all states
of nature, that is, P(F | s1), P(F | s2), P(U | s1), and P(U | s2).
In the PDC problem we assume that the following assessments are available for
these conditional probabilities:
                         Market Research
State of Nature          Favorable, F          Unfavorable, U
Strong demand, s1        P(F | s1) = 0.90      P(U | s1) = 0.10
Weak demand, s2          P(F | s2) = 0.25      P(U | s2) = 0.75
Computing Branch Probabilities (cont’d)
Branch (Posterior) Probabilities Calculation
Step 1:
For each state of nature, multiply the prior probability by its conditional
probability for the indicator - this gives the joint probabilities for the states
and indicator.
Step 2:
Sum these joint probabilities over all states - this gives the marginal
probability for the indicator.
Step 3:
For each state, divide its joint probability by the marginal probability for the
indicator - this gives the posterior probability distribution.
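A minimal Python sketch of these three steps for the favorable report, using the PDC priors and the conditional probabilities from the previous slide:

```python
# Step-by-step posterior calculation for the "favorable report" indicator.
priors = {"s1": 0.8, "s2": 0.2}       # prior P(sj)
cond_f = {"s1": 0.90, "s2": 0.25}     # conditional P(F | sj)

# Step 1: joint probabilities P(F, sj) = P(F | sj) * P(sj)
joint = {s: cond_f[s] * priors[s] for s in priors}   # {'s1': 0.72, 's2': 0.05}

# Step 2: marginal probability of the indicator, P(F)
p_f = sum(joint.values())                            # 0.77

# Step 3: posterior probabilities P(sj | F)
posterior = {s: joint[s] / p_f for s in priors}      # s1: ~0.94, s2: ~0.06
print(posterior)
```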
Computing Branch Probabilities (cont’d)
Recall that $P(A, B) = P(B \mid A)\,P(A) = P(A \mid B)\,P(B)$ and
$P(B) = P(B \mid A)\,P(A) + P(B \mid A^c)\,P(A^c)$.

Favorable report:
State of    Prior          Conditional      Joint            Posterior
Nature sj   P(sj)          P(F | sj)        P(F, sj)         P(sj | F)
s1          0.8            0.90             0.72             0.94 (= 0.72/0.77)
s2          0.2            0.25             0.05             0.06
                           P(favorable) = P(F) = 0.77        1.00

Unfavorable report:
State of    Prior          Conditional      Joint            Posterior
Nature sj   P(sj)          P(U | sj)        P(U, sj)         P(sj | U)
s1          0.8            0.10             0.08             0.35
s2          0.2            0.75             0.15             0.65
                           P(unfavorable) = P(U) = 0.23      1.00
Bayes’ Theorem and Posterior Probabilities
Knowledge of sample (survey) information can be used to revise the
probability estimates for the states of nature.
Prior to obtaining this information, the probability estimates for the states of
nature are called prior probabilities; the revised estimates computed after the
sample information is obtained are called posterior probabilities.
(Optional) Bayes’ Decision Rule Overview
Naive Bayes Classifier
“Conditional independence is assumed to simplify the classification decision”
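Stated as a formula (with generic features $x_1, \dots, x_n$, which this slide does not define), the naive Bayes assumption factorizes the class-conditional probability:

$$p(\mathbf{x} \mid y = c) = p(x_1, \dots, x_n \mid y = c) = \prod_{j=1}^{n} p(x_j \mid y = c)$$

so each feature contributes an independent factor given the class, and classification reduces to choosing the class $c$ that maximizes $p(y = c)\,\prod_j p(x_j \mid y = c)$.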
(Optional) Bayes’ Decision Rule Overview (cont’d)
Naive Bayes Classifier Example
Conditioned on class c
(Optional) Bayes’ Decision Rule Overview (cont’d)
Naive Bayes Classifier Example
Let’s find the probability of occurrence of dataset $\mathbf{x}$ given class $c$: $p(\mathbf{x} \mid y = c)$
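Since the slide’s example dataset is not reproduced here, the following is a minimal Python sketch with made-up categorical data, showing how class-conditional probabilities are estimated from counts and multiplied under the independence assumption:

```python
from collections import Counter, defaultdict

# Hypothetical categorical dataset: two features per observation, binary class.
X = [("sunny", "hot"), ("sunny", "cool"), ("rainy", "cool"), ("rainy", "hot")]
y = ["no", "yes", "yes", "no"]

prior = Counter(y)            # class counts
cond = defaultdict(Counter)   # cond[(feature_index, class)][value] -> count
vocab = defaultdict(set)      # distinct values seen for each feature
for xs, c in zip(X, y):
    for j, v in enumerate(xs):
        cond[(j, c)][v] += 1
        vocab[j].add(v)

def score(xs, c):
    # p(y=c) * prod_j p(x_j | y=c), with add-one (Laplace) smoothing
    s = prior[c] / len(y)
    for j, v in enumerate(xs):
        counts = cond[(j, c)]
        s *= (counts[v] + 1) / (sum(counts.values()) + len(vocab[j]))
    return s

print(max(prior, key=lambda c: score(("sunny", "cool"), c)))  # predicted class
```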
(Optional) Bayes’ Decision Rule Overview (cont’d)
Naive Bayes Classifier Example
(Optional) Bayes’ Decision Rule
Features of Bayes’ Decision Rule
It accounts for all the states of nature and their probabilities.
The expected payoff can be interpreted as the average payoff that would result if
the same situation were repeated many times. Therefore, repeatedly applying
Bayes’ decision rule will, in the long run, lead to larger average payoffs than
any other criterion.