
Lecture 7: Decision Analysis – Part 1

OSCM 471/571: Optimization and Decision Support Modeling for Business

Seokjun Youn
(syoun@arizona.edu)

Eller College of Management


University of Arizona
Agenda
Announcement
 Homework 4, Due: Apr 2, 11:59 pm

Today’s Plan
 Lecture 7: Decision Analysis
• Reading: OptDSM_Anderson_Ch13_Decision Analysis.pdf
1. Problem Formulation
2. Decision Making without Probabilities
3. Decision Making with Probabilities
4. Risk Analysis and Sensitivity Analysis
5. Decision Analysis with Sample Information
6. Computing Branch Probabilities with Bayes’ Theorem

Next Class
 Lecture 7: Decision Analysis (cont’d)
Decision Analysis Examples
Managers often must make decisions in environments that are fraught with uncertainty.

Some Examples:
• A manufacturer introducing a new product into the marketplace
o What will be the reaction of potential customers?
o How much should be produced?
o Should the product be test-marketed?
o How much advertising is needed?

• A financial firm investing in securities
o Which are the market sectors and individual securities with the best prospects?
o Where is the economy headed?
o How about interest rates?
o How should these factors affect the investment decisions?

Decision Analysis Examples
Some Examples (cont’d)
• A government contractor bidding on a new contract.
o What will be the actual costs of the project?
o Which other companies might be bidding?
o What are their likely bids?

• An agricultural firm selecting the mix of crops and livestock for the season.
o What will be the weather conditions?
o Where are prices headed?
o What will costs be?

• An oil company deciding whether to drill for oil in a particular location.
o How likely is there to be oil in that location?
o How much?
o How deep will they need to drill?
o Should geologists investigate the site further before drilling?
Problem Formulation

Decision Analysis
 Decision analysis can be used to develop an optimal strategy when a decision
maker is faced with several decision alternatives and an uncertain or risk-
filled pattern of future events.

 Even when a careful decision analysis has been conducted, the uncertain
future events make the final consequence uncertain.

 The risk associated with any decision alternative is a direct result of the
uncertainty associated with the final consequence.

 Good decision analysis includes risk analysis that provides probability information about the favorable as well as the unfavorable consequences that may occur.
Problem Formulation
 A decision problem is characterized by decision alternatives, states of nature,
and resulting payoffs.

 The decision alternatives are the different possible strategies the decision
maker can employ.

 The states of nature refer to future events, not under the control of the
decision maker, which may occur.
• States of nature should be defined so that they are mutually exclusive and
collectively exhaustive.

Example: Pittsburgh Development Corp.

Pittsburgh Development Corporation (PDC) purchased land that will be the site of a new luxury condominium complex. PDC commissioned preliminary architectural drawings for three different projects: one with 30, one with 60, and one with 90 condominiums.

The financial success of the project depends upon the size of the
condominium complex and the chance event concerning the demand for
the condominiums.

The statement of the PDC decision problem is to select the size of the new
complex that will lead to the largest profit given the uncertainty
concerning the demand for the condominiums.

Influence Diagrams
 An influence diagram is a graphical device showing the relationships among
the decisions, the chance events, and the consequences.
 Squares or rectangles depict decision nodes.
 Circles or ovals depict chance nodes.
 Diamonds depict consequence nodes.
 Lines or arcs connecting the nodes show the direction of influence.

Payoff Tables
 The consequence resulting from a specific combination of a decision
alternative and a state of nature is a payoff.
 A table showing payoffs for all combinations of decision alternatives and states
of nature is a payoff table.
 Payoffs can be expressed in terms of profit, cost, time, distance or any other
appropriate measure.

 Example: Pittsburgh Development Corp.


 Consider the following problem with three decision alternatives and two states
of nature with the following payoff table representing profits (in $Millions):

                         State of Nature
Decision Alternative     Strong Demand, s1    Weak Demand, s2
Small complex, d1                8                    7
Medium complex, d2              14                    5
Large complex, d3               20                   –9
Decision Making without Probabilities

Decision Making without Probabilities
Three commonly used criteria for decision making when probability
information regarding the likelihood of the states of nature is unavailable are:

 The Optimistic Approach (Maximax Criterion) – the decision with the largest payoff
or lowest cost is chosen.
 The Conservative Approach (Maximin Criterion) – for each decision the minimum
payoff is listed and the decision corresponding to the maximum of these payoffs is
selected. Or the maximum costs are determined and the minimum of those is
selected.
 The Minimax Regret Approach.

Example: Optimistic Approach
An optimistic decision maker would use the optimistic (maximax)
approach. We choose the decision that has the largest single value in the
payoff table.

             Maximum
Decision      Payoff
  d1             8
  d2            14
  d3            20    ← Maximax decision; maximax payoff
Example: Conservative Approach
A conservative decision maker would use the conservative (maximin)
approach. List the minimum payoff for each decision. Choose the decision
with the maximum of these minimum payoffs.

             Minimum
Decision      Payoff
  d1             7    ← Maximin decision; maximin payoff
  d2             5
  d3            –9
Minimax Regret Approach
 The minimax regret approach requires the construction of a regret table or an
opportunity loss table.

 This is done by calculating for each state of nature the difference between
each payoff and the largest payoff for that state of nature.

 Then, using this regret table, the maximum regret for each possible decision is
listed.

 The decision chosen is the one corresponding to the minimum of the maximum regrets.
Example: Minimax Regret Approach
For the minimax regret approach, first compute a regret table by subtracting
each payoff in a column from the largest payoff in that column. In this
example, in the first column subtract 8, 14, and 20 from 20; etc.

For each decision list the maximum regret. Choose the decision with the minimum of these values.

                Regret (Opportunity Loss)
Decision   Strong Demand, s1   Weak Demand, s2   Maximum Regret
  d1              12                  0                12
  d2               6                  2                 6   ← Minimax regret decision
  d3               0                 16                16
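The three criteria are mechanical enough to compute directly. Below is a minimal Python sketch using the PDC payoff table above; the dictionary layout and labels are my own, not from the lecture.

    # Payoff table (profit, $ millions): rows = decision alternatives, columns = states of nature
    payoffs = {
        "d1 (small)":  {"s1 (strong)": 8,  "s2 (weak)": 7},
        "d2 (medium)": {"s1 (strong)": 14, "s2 (weak)": 5},
        "d3 (large)":  {"s1 (strong)": 20, "s2 (weak)": -9},
    }

    # Optimistic (maximax): choose the decision with the largest possible payoff
    maximax = max(payoffs, key=lambda d: max(payoffs[d].values()))

    # Conservative (maximin): choose the decision whose worst payoff is best
    maximin = max(payoffs, key=lambda d: min(payoffs[d].values()))

    # Minimax regret: regret = best payoff in that state minus the payoff received
    states = list(next(iter(payoffs.values())))
    best_in_state = {s: max(payoffs[d][s] for d in payoffs) for s in states}
    max_regret = {d: max(best_in_state[s] - payoffs[d][s] for s in states) for d in payoffs}
    minimax_regret = min(max_regret, key=max_regret.get)

    print(maximax, maximin, minimax_regret)  # d3 (large), d1 (small), d2 (medium)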
Decision Making with Probabilities

Decision Making with Probabilities
Expected Value Approach
 If probabilistic information regarding the states of nature is available, one may
use the expected value (EV) approach.
 Here the expected return for each decision is calculated by summing the
products of the payoff under each state of nature and the probability of the
respective state of nature occurring.
 The decision yielding the best expected return is chosen.

Expected Value of a Decision Alternative
 The expected value of a decision alternative is the sum of weighted payoffs
for the decision alternative.
 The expected value (EV) of decision alternative di is defined as:

EV(di) = Σ_{j=1}^{N} P(sj) Vij

where:
N = the number of states of nature
P(sj) = the probability of state of nature sj
Vij = the payoff corresponding to decision alternative di and state of nature sj
Decision Tree
 A decision tree is a chronological representation of the decision problem.
 A decision tree consists of nodes and branches.
• A decision node, represented by a square, indicates a decision to be made. The
branches represent the possible decision alternatives.
• An event node, represented by a circle, indicates a random event (i.e., states of
nature). The branches represent the possible outcomes of the random event.
 At the end of each limb of a tree are the payoffs attained from the series of
branches making up that limb.

Expected Value for Each Decision

Using the prior probabilities P(s1) = 0.8 and P(s2) = 0.2:

EV(d1) = 0.8(8) + 0.2(7) = $7.8 mil
EV(d2) = 0.8(14) + 0.2(5) = $12.2 mil
EV(d3) = 0.8(20) + 0.2(–9) = $14.2 mil

 Choose the decision alternative with the largest EV. So, build the large complex.
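A small Python sketch of this expected value computation (labels are mine; priors and payoffs as in the lecture):

    # EV(di) = sum_j P(sj) * Vij with prior probabilities P(s1) = 0.8, P(s2) = 0.2
    priors = {"s1": 0.8, "s2": 0.2}
    payoffs = {"d1": {"s1": 8, "s2": 7},
               "d2": {"s1": 14, "s2": 5},
               "d3": {"s1": 20, "s2": -9}}

    ev = {d: sum(priors[s] * v for s, v in row.items()) for d, row in payoffs.items()}
    print(ev)                   # {'d1': 7.8, 'd2': 12.2, 'd3': 14.2}
    print(max(ev, key=ev.get))  # d3 -> build the large complex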
Expected Value of Perfect Information
 Frequently, information is available which can improve the probability
estimates for the states of nature.

 The expected value of perfect information (EVPI) is the increase in the expected profit that would result if one knew with certainty which state of nature would occur.
 The EVPI provides an upper bound on the expected value of any sample or
survey information.

EVPI = |EVwPI – EVwoPI|

Expected Value of Perfect Information
 Expected Value with Perfect Information (EVwPI)

EVwPI = 0.8(20 mil) + 0.2(7 mil) = $17.4 mil

 Expected Value without Perfect Information (EVwoPI)

EVwoPI = 0.8(20 mil) + 0.2(–9 mil) = $14.2 mil

 Expected Value of Perfect Information (EVPI)

EVPI = |EVwPI – EVwoPI| = |17.4 – 14.2| = $3.2 mil
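The same numbers fall out of a short, self-contained Python sketch (names are mine):

    # EVwPI weights the best payoff in each state by that state's prior probability;
    # EVPI = |EVwPI - EVwoPI|
    priors = {"s1": 0.8, "s2": 0.2}
    payoffs = {"d1": {"s1": 8, "s2": 7},
               "d2": {"s1": 14, "s2": 5},
               "d3": {"s1": 20, "s2": -9}}

    evwpi = sum(priors[s] * max(row[s] for row in payoffs.values()) for s in priors)
    evwopi = max(sum(priors[s] * v for s, v in row.items()) for row in payoffs.values())
    print(evwpi, evwopi, abs(evwpi - evwopi))  # 17.4 14.2 ~3.2 ($ millions)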
Risk Analysis and Sensitivity Analysis

Risk Analysis
Risk analysis helps the decision maker recognize the difference between:
 the expected value of a decision alternative, and
 the payoff that might actually occur

The risk profile for a decision alternative shows the possible payoffs for
the decision alternative along with their associated probabilities.
 Example: Large Complex Decision Alternative
• Risk profile for the large complex decision alternative for a condominium project:

[Figure: risk profile for the large complex decision alternative]
Sensitivity Analysis
Sensitivity analysis can be used to determine how changes to the
following inputs affect the recommended decision alternative:
 probabilities for the states of nature
 values of the payoffs

If a small change in the value of one of the inputs causes a change in the
recommended decision alternative, extra effort and care should be taken
in estimating the input value.
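One way to see this is a simple sweep over P(s1), sketched below with the same (assumed) PDC payoff table; with these numbers the recommendation shifts from d1 to d2 near P(s1) = 0.25 and from d2 to d3 near P(s1) = 0.70, so priors near those break-even points deserve extra care.

    # Sensitivity sweep: vary P(strong demand) and record the EV-best alternative
    payoffs = {"d1": {"s1": 8, "s2": 7},
               "d2": {"s1": 14, "s2": 5},
               "d3": {"s1": 20, "s2": -9}}

    for i in range(21):
        p = i / 20  # P(s1) from 0.00 to 1.00
        ev = {d: p * row["s1"] + (1 - p) * row["s2"] for d, row in payoffs.items()}
        print(f"P(s1)={p:.2f}  best={max(ev, key=ev.get)}")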

Agenda
Announcement
 Homework 4, Due: Apr 2, 11:59 pm

Today’s Plan
 Lecture 7: Decision Analysis
• Reading: OptDSM_Anderson_Ch13_Decision Analysis.pdf
5. Decision Analysis with Sample Information
6. Computing Branch Probabilities with Bayes’ Theorem
• Decision Trees using Excel TreePlan Add-in

Next Class
 Lecture 7: Decision Analysis (cont’d)
Decision Analysis with Sample Information

Decision Analysis with Sample Information
 Frequently, decision makers have preliminary or prior probability assessments
for the states of nature that are the best probability values available at that time.

 To make the best possible decision, the decision maker may want to seek
additional information about the states of nature.

 This new information, often obtained through sampling, can be used to revise
the prior probabilities so that the final decision is based on more accurate
probabilities for the states of nature.

 These revised probabilities are called posterior probabilities.

Example: Pittsburgh Development Corp.
Pittsburgh Development Corp. (PDC)’s management team is considering
a 6-month market research study designed to learn more about potential
market acceptance of the PDC condominium project.

Management anticipates that the market research study will provide one
of the following two results:
1. Favorable report: A significant number of the individuals contacted express
interest in purchasing a PDC condominium.
2. Unfavorable report: Very few of the individuals contacted express interest in
purchasing a PDC condominium.

 Influence Diagram:

Sample Information
PDC has developed the following branch probabilities.
 If the market research study is undertaken:

P(Favorable report) = P(F) = 0.77
P(Unfavorable report) = P(U) = 0.23

 If the market research report is favorable:

P(Strong demand | favorable report) = P(s1|F) = 0.94
P(Weak demand | favorable report) = P(s2|F) = 0.06

Sample Information (cont’d)
 If the market research report is unfavorable:

P(Strong demand | unfavorable report) = P(s1|U) = 0.35
P(Weak demand | unfavorable report) = P(s2|U) = 0.65

 If the market research study is not undertaken, the prior probabilities are
applicable:

P(Strong demand) = P(s1) = 0.80
P(Weak demand) = P(s2) = 0.20

Decision Tree for the PDC Condominium Project

Decision Strategy
A decision strategy is a sequence of decisions and chance outcomes
where the decisions chosen depend on the yet-to-be-determined
outcomes of chance events.

The approach used to determine the optimal decision strategy is based on a backward pass through the decision tree using the following steps (a minimal code sketch follows the steps below):
 At chance nodes, compute the expected value by multiplying the payoff at the
end of each branch by the corresponding branch probabilities.
 At decision nodes, select the decision branch that leads to the best expected
value. This expected value becomes the expected value at the decision node.
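A minimal rollback sketch in Python, using the branch probabilities given earlier (P(F) = 0.77, P(U) = 0.23, P(s1|F) = 0.94, P(s1|U) = 0.35); the helper function and labels are mine, not the lecture's node numbering:

    # Backward pass: EV at each chance node, then the best decision per report
    payoffs = {"small": (8, 7), "medium": (14, 5), "large": (20, -9)}  # (strong, weak), $M

    def ev(decision, p_strong):
        strong, weak = payoffs[decision]
        return p_strong * strong + (1 - p_strong) * weak

    best_f = max(payoffs, key=lambda d: ev(d, 0.94))  # favorable report
    best_u = max(payoffs, key=lambda d: ev(d, 0.35))  # unfavorable report

    # Roll back to the chance node for the market research outcome
    ev_with_study = 0.77 * ev(best_f, 0.94) + 0.23 * ev(best_u, 0.35)
    print(best_f, best_u, round(ev_with_study, 2))    # large medium 15.93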

Decision Tree for the PDC Condominium Project (cont’d)

[Figure: decision tree after computing expected values at chance nodes 6 to 14]


Decision Tree for the PDC Condominium Project (cont’d)

[Figure: tree reduced to two decision branches after choosing the best decisions at nodes 3, 4, and 5]
PDC’s Decision Strategy
PDC’s optimal decision strategy is:
 Conduct the market research study.
 If the market research report is favorable, construct the large condominium complex.
 If the market research report is unfavorable, construct the medium condominium
complex.

[Figure: PDC decision tree showing only branches associated with the optimal decision strategy]
Risk Profile
PDC’s Risk Profile

Under the optimal decision strategy, the probability of each payoff is the product of the branch probabilities on the path leading to it. For example, the probability of the $20 million payoff is (0.77)(0.94) = 0.72.
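A short sketch of the risk-profile arithmetic under the optimal strategy (probabilities from the tree above; the dictionary layout is mine):

    # Each payoff's probability = product of branch probabilities along its path
    profile = {
        20: 0.77 * 0.94,  # favorable report -> build large -> strong demand
        -9: 0.77 * 0.06,  # favorable report -> build large -> weak demand
        14: 0.23 * 0.35,  # unfavorable report -> build medium -> strong demand
        5:  0.23 * 0.65,  # unfavorable report -> build medium -> weak demand
    }
    for payoff, p in sorted(profile.items()):
        print(f"${payoff}M with probability {p:.4f}")
    print(sum(profile.values()))  # 1.0 -- the profile is a full distribution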
Expected Value of Sample Information
 The expected value of sample information (EVSI) is the additional expected
profit possible through knowledge of the sample or survey information.
• In the PDC problem, the market research study is the sample information used to
determine the optimal decision strategy.

 The expected value associated with the market research study is $15.93 million.
 The best expected value if the market research study is not undertaken is $14.20 million.
 We can conclude that the difference, $15.93M – $14.20M = $1.73M, is the expected value of sample information.
 Conducting the market research study adds $1.73 million to the PDC expected value.

Efficiency of Sample Information
 Efficiency of sample information is the ratio of EVSI to EVPI.
 As the EVPI provides an upper bound for the EVSI, efficiency is always a
number between 0 and 1.

 The efficiency of the survey:

E = (EVSI/EVPI) X 100
= [($1.73 mil)/($3.20 mil)] X 100
= 54.1%

The information from the market research study is 54.1% as efficient as perfect
information.

Computing Branch Probabilities with
Bayes’ Theorem

Bayes’ Theorem (a.k.a. Bayes’ Decision Rule)
 Bayes’ theorem describes the probability of an event based on prior knowledge
of the conditions that might be relevant to the event.

P(A|B) = P(B|A) P(A) / P(B) = P(B|A) P(A) / [P(B|A) P(A) + P(B|Ac) P(Ac)]

Note that P(B|A) P(A) = P(A, B).
If A and B are independent, P(B|A) = P(B), so P(B) P(A) = P(A, B).

• English statistician Thomas Bayes derived the formula; it was published posthumously in 1763.
• It is considered the foundation of the statistical inference approach known as Bayesian inference.
Bayes’ Theorem Example
 Imagine you are a financial analyst at an investment bank.

 According to your research of publicly traded companies, 60% of the companies that increased their share price by more than 5% in the last three years replaced their CEOs during the period.

 At the same time, only 35% of the companies that did not increase their share
price by more than 5% in the same period replaced their CEOs.

 Knowing that the probability that the stock prices grow by more than 5% is 4%,
find the probability that the shares of a company that fires its CEO will increase
by more than 5%.

Let’s define two events:

A = stock price increases by more than 5%
B = CEO is replaced
Bayes’ Theorem Example
Before finding the probabilities, you must first define the notation of the
probabilities.
 P(A) = 0.04: the probability that the stock price increases by more than 5%.
 P(B): the probability that the CEO is replaced.
 P(A|B): the probability that the stock price increases by more than 5% given that the CEO has been replaced. → We are interested in this.
 P(B|A) = 0.60: the probability of CEO replacement given the stock price has increased by more than 5%.
 P(B|Ac) = 0.35: the probability of CEO replacement given the stock price has not increased by more than 5%.

Using Bayes’ theorem, we can find the required probability:

P(A|B) = P(B|A) P(A) / [P(B|A) P(A) + P(B|Ac) P(Ac)]
       = (0.60)(0.04) / [(0.60)(0.04) + (0.35)(0.96)]
       = 0.024 / 0.360 = 0.0667

 Thus, the probability that the shares of a company that replaces its CEO will grow by more than 5% is 6.67%.
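The same calculation as a short Python sketch (variable names are mine):

    # P(A|B) = P(B|A) P(A) / (P(B|A) P(A) + P(B|Ac) P(Ac))
    p_a = 0.04              # stock grows by more than 5%
    p_b_given_a = 0.60      # CEO replaced, given growth > 5%
    p_b_given_not_a = 0.35  # CEO replaced, given growth <= 5%

    p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)  # law of total probability
    print(round(p_b_given_a * p_a / p_b, 4))               # 0.0667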
PDC Decision Tree
Now, returning to the PDC example: P(s1|F) and P(s2|F) are referred to as
• posterior probabilities, because they are conditional probabilities based on the outcome of the sample information, or
• branch probabilities, because they are located at the ends of branches in a decision tree.

Computing Branch Probabilities
 We will need conditional probabilities for all sample outcomes given all states
of nature, that is, P(F | s1), P(F | s2), P(U | s1), and P(U | s2).

 In the PDC problem we assume that the following assessments are available for
these conditional probabilities:

                       Market Research
State of Nature        Favorable, F        Unfavorable, U
Strong demand, s1      P(F|s1) = 0.90      P(U|s1) = 0.10
Weak demand, s2        P(F|s2) = 0.25      P(U|s2) = 0.75

Computing Branch Probabilities (cont’d)
Branch (Posterior) Probabilities Calculation

 Step 1:
For each state of nature, multiply the prior probability by its conditional
probability for the indicator - this gives the joint probabilities for the states
and indicator.
 Step 2:
Sum these joint probabilities over all states - this gives the marginal
probability for the indicator.
 Step 3:
For each state, divide its joint probability by the marginal probability for the
indicator - this gives the posterior probability distribution.

Computing Branch Probabilities (cont’d)

Recall that P(A, B) = P(B|A) P(A) = P(A|B) P(B), and P(B) = P(B|A) P(A) + P(B|Ac) P(Ac).

Favorable report:

State of   Prior         Conditional   Joint         Posterior
Nature     Probability   Probability   Probability   Probability
sj         P(sj)         P(F|sj)       P(F, sj)      P(sj|F)
s1         0.8           0.90          0.72          0.94  (= 0.72/0.77)
s2         0.2           0.25          0.05          0.06
                         P(favorable) = P(F) = 0.77   1.00

Unfavorable report:

State of   Prior         Conditional   Joint         Posterior
Nature     Probability   Probability   Probability   Probability
sj         P(sj)         P(U|sj)       P(U, sj)      P(sj|U)
s1         0.8           0.10          0.08          0.35
s2         0.2           0.75          0.15          0.65
                         P(unfavorable) = P(U) = 0.23  1.00
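The same three steps in a short Python sketch (dictionary names are mine):

    # Step 1: joint = prior * conditional; Step 2: marginal = sum of joints;
    # Step 3: posterior = joint / marginal
    priors = {"s1": 0.8, "s2": 0.2}
    cond = {"F": {"s1": 0.90, "s2": 0.25},   # P(report | state)
            "U": {"s1": 0.10, "s2": 0.75}}

    for report, likelihood in cond.items():
        joint = {s: priors[s] * likelihood[s] for s in priors}
        marginal = sum(joint.values())
        posterior = {s: round(j / marginal, 2) for s, j in joint.items()}
        print(report, round(marginal, 2), posterior)
    # F 0.77 {'s1': 0.94, 's2': 0.06}
    # U 0.23 {'s1': 0.35, 's2': 0.65}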
Bayes’ Theorem and Posterior Probabilities
 Knowledge of sample (survey) information can be used to revise the
probability estimates for the states of nature.

 Prior to obtaining this information, the probability estimates for the states of
nature are called prior probabilities.

 With knowledge of conditional probabilities for the outcomes or indicators of the sample or survey information, these prior probabilities can be revised by employing Bayes’ theorem.

 The outcomes of this analysis are called posterior probabilities or branch probabilities for decision trees.

(Optional) Bayes’ Decision Rule Overview
Naive Bayes Classifier
 “Conditional independence is assumed to simplify the classification decision”

 Bayes’ theorem is based on conditional probability
• P(x|y, z) is the conditional probability that x occurs given that y and z occurred earlier
• If x is independent of z, then P(x|y, z) → P(x|y)

 Everything is dependent on everything else
• But the relations are too complex to fully analyze

 In order to simplify the computation process, the Naive Bayes model “naively” assumes that events are independent
• Pros: provides fast and easy-to-compute results
• Cons: accuracy and reliability are sacrificed
• Used when the resulting accuracy is sufficient for the intended purpose

(Optional) Bayes’ Decision Rule Overview (cont’d)
Naive Bayes Classifier Example
 Conditioned on class c

1. If it is “assumed” that the probability distributions of the input variables x1, …, xD are independent (which they actually are not), then…

2. the class-conditional probability density can be written as a simple multiplication (product) of one-dimensional probability density functions.
(Optional) Bayes’ Decision Rule Overview (cont’d)
Naive Bayes Classifier Example
 Let’s find the probability of occurrence of data point x given class c:

p(x | y = c)

 “Naively” assume conditional independence between the features:

p(x | y = c) = ∏_{d=1}^{D} p(xd | y = c)

• D is the number of features
(Optional) Bayes’ Decision Rule Overview (cont’d)
Naive Bayes Classifier Example

 Multi-dimensional probability is easily obtained from multiplications (products) of many one-dimensional densities.
 This model is called “naive” since in reality these features are not independent
 But it often works very well.
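A minimal Gaussian naive Bayes sketch of this product-of-one-dimensional-densities idea; the toy data and every name here are illustrative assumptions, not from the lecture:

    import math

    # Toy training data: two classes, two features each
    data = {"c0": [(1.0, 2.1), (0.8, 1.9), (1.2, 2.0)],
            "c1": [(3.0, 0.9), (3.2, 1.1), (2.8, 1.0)]}

    def gaussian(x, mu, var):
        return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

    def fit(rows):
        # Per-feature mean and variance: these define the one-dimensional densities
        stats = []
        for col in zip(*rows):
            mu = sum(col) / len(col)
            var = sum((v - mu) ** 2 for v in col) / len(col) + 1e-6  # avoid zero variance
            stats.append((mu, var))
        return stats

    params = {c: fit(rows) for c, rows in data.items()}

    def class_likelihood(x, c):
        # "Naive" step: p(x|y=c) = product over features of one-dimensional densities
        score = 1.0
        for xd, (mu, var) in zip(x, params[c]):
            score *= gaussian(xd, mu, var)
        return score  # multiply by the class prior in general; uniform here

    print(max(params, key=lambda c: class_likelihood((2.9, 1.0), c)))  # c1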

(Optional) Bayes’ Decision Rule
Features of Bayes’ Decision Rule
 It accounts for all the states of nature and their probabilities.
 The expected payoff can be interpreted as what the average payoff would
become if the same situation were repeated many times. Therefore, on
average, repeatedly applying Bayes’ decision rule to make decisions will lead to
larger payoffs in the long run than any other criterion.

Criticisms of Bayes’ Decision Rule
 There usually is considerable uncertainty involved in assigning values to the
prior probabilities.
 Prior probabilities inherently are at least largely subjective in nature, whereas
sound decision making should be based on objective data and procedures.
 It ignores typical aversion to risk. By focusing on average outcomes, expected
(monetary) payoffs ignore the effect that the amount of variability in the
possible outcomes should have on decision making.

