Tools For Decision Analysis
Variance and standard deviation provide the same information, and each can always be obtained from the other; indeed, computing the standard deviation
always involves computing the variance first. Since the standard deviation is the square root of the variance, it is always expressed in the same units as the
expected value.
For dynamic decision processes, volatility is a risk measure that incorporates the time period over which the standard deviation is computed: volatility is
defined as the standard deviation divided by the square root of the time duration.
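As an illustration, here is a minimal Python sketch of this definition; the function name is illustrative, and the sample figures are the Bonds payoffs and probabilities from the table below, over a hypothetical 2-year horizon:

    import math

    def volatility(outcomes, probabilities, t):
        # Standard deviation of the outcome distribution, divided by the
        # square root of the time duration t, per the definition above.
        mean = sum(p * x for p, x in zip(probabilities, outcomes))
        variance = sum(p * (x - mean) ** 2 for p, x in zip(probabilities, outcomes))
        return math.sqrt(variance) / math.sqrt(t)

    # Example: the Bonds alternative from the table below, over 2 years.
    print(volatility([12, 8, 7, 3], [0.4, 0.3, 0.2, 0.1], t=2))   # about 2.04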
What should you do if the course of action with the larger expected outcome also has a much higher risk? In such cases, using another measure of risk
known as the Coefficient of Variation is appropriate.
The Coefficient of Variation (CV) is the relative risk with respect to the expected value: the absolute deviation relative to the expected value (provided the
expected value is not zero), expressed as a percentage:
CV = 100 |S / Expected Value| %
Notice that the CV is independent of the unit of measurement of the expected value. The coefficient of variation demonstrates the relationship between
standard deviation and expected value by expressing the risk as a percentage of the (non-zero) expected value. The inverse of the CV (namely 1/CV) is
called the Signal-to-Noise Ratio.
The quality of your decision may be computed by using Measuring Risk.
The following table shows the risk measurements computed for the Investment Decision Example:
Risk Assessment
              G(0.4)   MG(0.3)   NC(0.2)   L(0.1)   Exp. Value   St. Dev.   C.V.
Bonds (B)       12        8         7        3         8.9          2.9     32% **
Stocks (S)      15        9         5       -2         9.5 *        5.4     57%
Deposit (D)      7        7         7        7         7.0          0.0      0%
The Risk Assessment columns in the above table indicate that bonds are much less risky than stocks, while their return is lower. Clearly, deposits are
risk-free.
Now, the final question is: Given all this relevant information, what action do you take? It is all up to you.
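The figures in the table above can be reproduced directly; the following minimal Python sketch (function and variable names are illustrative, not from the original applets) computes the expected value, standard deviation, and C.V. for each alternative:

    import math

    def risk_measures(payoffs, probs):
        """Return (expected value, standard deviation, CV in percent)."""
        ev = sum(p * x for p, x in zip(probs, payoffs))
        sd = math.sqrt(sum(p * (x - ev) ** 2 for p, x in zip(probs, payoffs)))
        cv = 100 * abs(sd / ev)   # CV = 100 |S / Expected Value| %
        return ev, sd, cv

    probs = [0.4, 0.3, 0.2, 0.1]   # P(G), P(MG), P(NC), P(L)
    alternatives = {"Bonds": [12, 8, 7, 3],
                    "Stocks": [15, 9, 5, -2],
                    "Deposit": [7, 7, 7, 7]}

    for name, payoffs in alternatives.items():
        ev, sd, cv = risk_measures(payoffs, probs)
        print(f"{name}: EV = {ev:.2f}, SD = {sd:.2f}, CV = {cv:.0f}%")
    # Bonds:   EV = 8.90, SD = 2.88, CV = 32%
    # Stocks:  EV = 9.50, SD = 5.43, CV = 57%
    # Deposit: EV = 7.00, SD = 0.00, CV = 0%

Setting all four probabilities to 0.25 reproduces the pure-uncertainty table that follows.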
The following table shows the risk measurements computed for the Investment Decision under pure uncertainty (i.e., the Laplace equal likelihood
principle):
Risk Assessment
              G(0.25)  MG(0.25)  NC(0.25)  L(0.25)  Exp. Value   St. Dev.   C.V.
Bonds (B)       12        8         7         3        7.5         3.20 *   43% **
Stocks (S)      15        9         5        -2        6.75        6.18     92%
Deposit (D)      7        7         7         7        7.0         0.0       0%
The Risk Assessment columns in the above table indicate that bonds are much less risky than stocks. Clearly, deposits are risk-free.
Again, the final question is: Given all this relevant information, what action do you take? It is all up to you.
Ranking Process for Preference among Alternatives: Referring to the Bonds and Stocks alternatives in our numerical example, we notice that, based on
mean-variance analysis, the Bonds alternative dominates the Stocks alternative. However, this is not always the case.
For example, consider two independent investment alternatives: Investment I and Investment II with the characteristics outlined in the following table:
Two Investments Portfolios
       Investment I            Investment II
     Payoff %   Prob.       Payoff %   Prob.
         1      0.25            3      0.33
         7      0.50            5      0.33
        12      0.25            8      0.34
Performance of Two Investments
To rank these two investments under the Standard Dominance Approach in finance, first we must compute the mean and standard deviation, and then
analyze the results. Using the above applet for calculation, we notice that Investment I has mean = 6.75% and standard deviation = 3.9%, while the
second investment has mean = 5.36% and standard deviation = 2.06%. First observe that under the usual mean-variance analysis these two investments
cannot be ranked: the first investment has the greater mean, but it also has the greater standard deviation. Therefore, the Standard Dominance
Approach is not a useful tool here, and we must resort to the coefficient of variation as a systematic basis of comparison. The C.V. for Investment I is
57.74% and for Investment II it is 38.43%. Therefore, Investment II has preference over the other one. Clearly, this approach can be used to rank any
number of alternative investments.
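These figures can be checked with the same kind of computation as before; a minimal Python sketch (names illustrative):

    import math

    def summarize(payoffs, probs):
        """Mean, standard deviation, and C.V. (percent) of a discrete payoff."""
        ev = sum(p * x for p, x in zip(probs, payoffs))
        sd = math.sqrt(sum(p * (x - ev) ** 2 for p, x in zip(probs, payoffs)))
        return ev, sd, 100 * sd / ev

    print(summarize([1, 7, 12], [0.25, 0.50, 0.25]))   # Investment I:  (6.75, 3.90, 57.7)
    print(summarize([3, 5, 8], [0.33, 0.33, 0.34]))    # Investment II: (5.36, 2.06, 38.5)

The small gap between the printed 38.5% and the text's 38.43% is rounding: the text divides the rounded standard deviation, 2.06, by the mean, 5.36.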
Application of Signal-to-Noise Ratio in Investment Decisions: Suppose you have several portfolios that are almost uncorrelated (i.e., all pairwise
covariances are almost equal to zero); then one may distribute the total capital among all portfolios in proportion to their signal-to-noise ratios.
For negatively correlated portfolios you may use the Beta Ratio, or the Bivariate Discrete Distributions JavaScript.
Consider the above two independent investments with the given probabilistic rates of return. Given that you wish to invest $12,000 over a period of one
year, how do you invest for the optimal strategy?
The C.V. for Investment I is 57.74% and for Investment II it is 38.43%; therefore the signal-to-noise ratios are 1/57.74 = 0.0173 and 1/38.43 = 0.0260,
respectively.
Now, one may distribute the total capital ($12,000) in proportion to the signal-to-noise ratios:
Sum of signal-to-noise ratios = 0.0173 + 0.0260 = 0.0433
Y1 = 12000 (0.0173 / 0.0433) = 12000(0.3996) = $4795, allocated to Investment I
Y2 = 12000 (0.0260 / 0.0433) = 12000(0.6004) = $7205, allocated to Investment II
That is, the optimal strategic decision based upon the signal-to-noise ratio criterion is: allocate $4795 and $7205 to Investment I and Investment II,
respectively.
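A minimal Python sketch of this allocation rule, using the C.V. figures from the example (variable names are illustrative):

    cv = {"Investment I": 57.74, "Investment II": 38.43}   # C.V. in percent
    snr = {name: 1 / v for name, v in cv.items()}          # signal-to-noise = 1/CV
    total = sum(snr.values())
    capital = 12_000

    for name, s in snr.items():
        # Allocate the capital in proportion to each signal-to-noise ratio.
        print(f"{name}: ${capital * s / total:,.0f}")
    # Investment I: $4,795
    # Investment II: $7,205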
These kinds of mixed strategies are known as diversification, which aims at reducing your risk.
The quality of your decision may be computed by using Performance Measures for Portfolios.
Coping with Risk
Risk avoidance is refusing to undertake an activity where the risk seems too costly.
Risk prevention (loss control) is using various methods to reduce the possibility of a loss occurring.
Risk transfer is shifting a risk to someone outside your company.
Risk assumption or self-insurance is setting aside funds to meet losses that are uncertain in size and frequency.
Risk reduction through, for example, diversification.
Further Readings:
Crouhy M., R. Mark, and D. Galai, Risk Management, McGraw-Hill, 2002.
Koller G., Risk Modeling for Determining Value and Decision Making, Chapman & Hall/CRC, 2000.
Moore P., The Business of Risk, Cambridge University Press, 1984.
Morgan M., and M. Henrion, Uncertainty: A Guide to Dealing with Uncertainty in Quantitative Risk and Policy Analysis, Cambridge University Press, 1998.
Shapira Z., Risk Taking: A Managerial Perspective, Russell Sage Foundation, 1997.
Vose D., Risk Analysis: A Quantitative Guide, John Wiley & Sons, 2000.
Wahlstrom B., Models, Modeling and Modellers: An Application to Risk Analysis, European Journal of Operational Research, Vol. 75, No. 3, 477-487, 1994.
Decision Factors' Prioritization & Stability Analysis
Introduction: Sensitivity analysis is a technique for determining how much an expected payoff will change in response to a given change in an input variable
(all other things remaining unchanged).
Steps in Sensitivity Analysis (a worked sketch follows the list):
1. Begin with consideration of a nominal base-case situation, using the expected values for each input.
2. Calculate the base-case output.
3. Consider a series of "what-if" questions, to determine by how much the output would deviate from this nominal level if input values deviated from their
expected values.
4. Each input is changed by several percentage points above and below its expected value, and the expected payoff is recalculated.
5. The set of expected payoffs is plotted against the variable that was changed.
6. The steeper the slope (i.e., derivative) of the resulting line, the more sensitive the expected payoff is to a change in the variable.
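A concrete illustration of these steps in Python; the profit model and the percentage grid are hypothetical choices, not part of the original example:

    # One-at-a-time sensitivity analysis on a hypothetical profit model:
    # payoff = (price - unit_cost) * volume.
    base = {"price": 10.0, "volume": 1000.0, "unit_cost": 6.0}

    def payoff(inputs):
        return (inputs["price"] - inputs["unit_cost"]) * inputs["volume"]

    base_case = payoff(base)   # step 2: base-case output (4000 here)

    for name in base:          # steps 3-4: vary one input, hold the others
        for pct in (-10, -5, 5, 10):
            trial = dict(base)
            trial[name] = base[name] * (1 + pct / 100)
            print(f"{name} {pct:+d}%: payoff deviates by {payoff(trial) - base_case:+.0f}")
    # Plotting these deviations against each varied input (steps 5-6) shows
    # that the steepest line, and hence the most sensitive input, is price.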
Scenario Analysis: Scenario analysis is a risk analysis technique that considers both the sensitivity of the expected payoff to changes in key variables and
the likely range of variable values. The worst and best "reasonable" sets of circumstances are considered, the expected payoff for each is calculated, and
the results are compared to the expected, or base-case, output, as in the sketch below.
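A minimal scenario-analysis sketch, reusing the hypothetical profit model from the sensitivity sketch above; the worst/best input sets are illustrative assumptions:

    def payoff(price, volume, unit_cost):
        # Same hypothetical model: profit = (price - unit cost) * volume.
        return (price - unit_cost) * volume

    scenarios = {
        "worst case": dict(price=9.0, volume=800.0, unit_cost=6.5),
        "base case":  dict(price=10.0, volume=1000.0, unit_cost=6.0),
        "best case":  dict(price=11.0, volume=1200.0, unit_cost=5.5),
    }

    for name, inputs in scenarios.items():
        # Each "reasonable" set of circumstances is evaluated as a whole,
        # rather than one variable at a time as in sensitivity analysis.
        print(f"{name}: payoff = {payoff(**inputs):,.0f}")
    # worst case: 2,000 / base case: 4,000 / best case: 6,600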
Scenario analysis also includes chance events: rare or novel events with potentially significant consequences for decision-making in some domain. The
main issues in studying chance events are the following:
Chance Discovery: How may we predict, identify, or explain chance events and their consequences?
Chance Management: How may we assess, prepare for, or manage them?
Clearly, both scenario and sensitivity analysis can be carried out using computerized algorithms.
How Stable is Your Decision? Stability analysis compares the outcome of each of your scenarios with its chance events. Computer packages such as
WinQSB are necessary and useful tools; they can be used to examine the decision for stability and sensitivity whenever there is uncertainty in the payoffs
and/or in the probabilities assigned in the decision analysis.
Prioritization of Uncontrollable Factors: Stability analysis also identifies the critical model inputs. The simplest test for sensitivity is whether or not the
optimal decision changes when an uncontrollable factor is set to its extreme value while all other variables are held unchanged. If the decision does not
change, that factor can be regarded as relatively less important than the others. Sensitivity analysis focuses attention on the factors with the greatest
impact, thus helping to prioritize data gathering while increasing the reliability of information. A minimal version of this test is sketched below.
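The sketch uses the bonds/stocks/deposit payoffs from the earlier example; the "extreme" probability vector is an illustrative assumption:

    def best_action(alternatives, probs):
        """Alternative with the largest expected payoff."""
        return max(alternatives,
                   key=lambda a: sum(p * x for p, x in zip(probs, alternatives[a])))

    alternatives = {"Bonds": [12, 8, 7, 3],
                    "Stocks": [15, 9, 5, -2],
                    "Deposit": [7, 7, 7, 7]}

    print(best_action(alternatives, [0.4, 0.3, 0.2, 0.1]))    # Stocks
    # Push one uncontrollable factor, here P(Loss), to an extreme value
    # (the remaining probabilities are adjusted so the vector still sums to 1).
    print(best_action(alternatives, [0.2, 0.15, 0.15, 0.5]))  # Deposit
    # The optimal action changes, so the decision is sensitive to P(Loss).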
Optimal Decision Making Process
Mathematical optimization is the branch of computational science that seeks to answer the question "What is best?" for problems in which the quality of any
answer can be expressed as a numerical value. Such problems arise in all areas of business and management. The range of techniques available to solve
them is nearly as wide, and includes Linear Optimization, Integer Programming, and Non-linear Optimization.
A mathematical optimization model consists of an objective function and a set of constraints expressed in the form of a system of equations or inequalities.
Optimization models are used extensively in almost all areas of decision-making such as financial portfolio selection.
Integer Linear Optimization Application: Suppose you invest in project i by buying an integral number of shares in that project, with each share costing
Ci and returning Ri. If we let Xi denote the number of shares of project i that are purchased, then the decision problem --- when one can invest at most M
in the n projects --- is to find nonnegative integer decision variables X1, X2, ..., Xn that:

Maximize Σ Ri Xi

Subject to:
Σ Ci Xi ≤ M
Application: Suppose you have $25 to invest among three projects whose estimated cost per share and estimated return per share are as follows:
Project Cost Return
1 5 7
2 9 12
3 15 22
Maximize 7X1 + 12X2 + 22X3

Subject to:
5X1 + 9X2 + 15X3 ≤ 25
Using any linear integer programming software package, the optimal strategy is X1 = 2, X2 = 0, and X3 = 1, with $36 as its optimal return.
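Since the budget bounds each Xi, the optimum can also be verified by brute-force enumeration; a minimal Python sketch (an illustration, not the specialized software the text refers to):

    from itertools import product

    costs, returns, budget = [5, 9, 15], [7, 12, 22], 25

    # Each X_i is a nonnegative integer, bounded above by budget // cost_i.
    ranges = [range(budget // c + 1) for c in costs]

    best = max((x for x in product(*ranges)
                if sum(c * xi for c, xi in zip(costs, x)) <= budget),
               key=lambda x: sum(r * xi for r, xi in zip(returns, x)))
    print(best, sum(r * xi for r, xi in zip(returns, best)))   # (2, 0, 1) 36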
JavaScript E-labs Learning Objects
This section is a part of the JavaScript E-labs learning technologies for decision making.
Each JavaScript in this collection is designed to assist you in performing numerical experimentation for at least a couple of hours, as students do in, e.g.,
physics labs. These learning objects are your statistics e-labs. They serve as learning tools for a deeper understanding of the fundamental statistical
concepts and techniques, by asking "what-if" questions.
Technical Details and Applications: At the end of each JavaScript you will find a link under "For Technical Details and Applications Back to:".
Decision Making in Economics and Finance:
ABC Inventory Classification -- an analysis of a range of items, such as finished products or customers, into three "importance" categories: A, B, and C,
as a basis for a control scheme. This page constructs an empirical cumulative distribution function (ECDF) as a measuring tool and decision procedure
for ABC inventory classification.
Inventory Control Models -- Given the costs of holding stock, placing an order, and running short of stock, this page optimizes decision parameters
(order point, order quantity, etc.) using four models: Classical, Shortages Permitted, Production & Consumption, and Production & Consumption with
Shortages.
Optimal Age for Replacement -- Given yearly figures for resale value and running costs, this page calculates the optimal replacement age and average
cost.
Single-period Inventory Analysis -- computes the optimal inventory level over a single cycle, from up to 28 pairs (number of possible items to sell and
their associated non-zero probabilities), together with the "not sold unit batch cost" and the "net profit of a batch sold".
Probabilistic Modeling:
Bayes' Revised Probability -- computes the posterior probabilities to "sharpen" your uncertainties by incorporating an expert judgement's reliability
matrix with your prior probability vector. Can accommodate up to nine states of nature.
Decision Making Under Uncertainty -- Enter up-to-6x6 payoff matrix of decision alternatives (choices) by states of nature, along with a coefficient of
optimism; the page will calculate Action & Payoff for Pessimism, Optimism, Middle-of-the-Road, Minimize Regret, and Insufficient Reason.
Determination of Utility Function -- Takes two monetary values and their known utility, and calculates the utility of another amount, under two different
strategies: certain & uncertain.
Making Risky Decisions -- Enter up-to-6x6 payoff matrix of decision alternatives (choices) by states of nature, along with subjective estimates of the
occurrence probability for each state of nature; the page will calculate action & payoff (expected, and for the most likely event), minimum expected regret,
return of perfect information, value of perfect information, and efficiency.
Multinomial Distributions -- for up to 36 probabilities and associated outcomes, calculates expected value, variance, SD, and CV.
Revising the Mean and the Variance -- to combine subjectivity and evidence-based estimates. Takes up to 14 pairs of means and variances; calculates
combined estimates of mean, variance, and CV.
Subjective Assessment of Estimates -- (relative precision as a measuring tool for inaccuracy assessment among estimates) tests the claim that at least
one estimate deviates from the parameter by more than r times (i.e., a relative precision), where r is a subjective positive number less than one. Takes
up to 10 sample estimates and a subjective relative precision (r < 1); the page indicates whether at least one measurement is unacceptable.
Subjectivity in Hypothesis Testing -- Takes the profit/loss measure of various correct or incorrect conclusions regarding the hypothesis, along with
probabilities of Type I and II errors (alpha & beta), total sampling cost, and subjective estimate of probability that null hypothesis is true; returns the
expected net profit.
Time Series Analysis and Forecasting
Autoregressive Time Series -- tools for identification, estimation, and forecasting, based on the autoregressive order obtained from a time series.
Detecting Trend & Autocorrelation in Time Series -- Given a set of numbers, this page tests for trend by the Sign Test, and for autocorrelation by the
Durbin-Watson test.
Plot of a Time Series -- generates a graph of a time series with up to 144 points.
Seasonal Index -- Calculates a set of seasonal index values from a set of values forming a time series. A related page performs a Test for Seasonality
on the index values.
Forecasting by Smoothing -- Given a set of numbers forming a time series, this page estimates the next number using Moving Average & Exponential
Smoothing, Weighted Moving Average, Double & Triple Exponential Smoothing, and Holt's method.
Runs Test for Random Fluctuations -- in a time series.
Test for Stationary Time Series -- Given a set of numbers forming a time series, this page calculates the mean & variance of the first & second half,
and calculates one-lag-apart & two-lag-apart autocorrelations. A related page: Time Series' Statistics calculates these statistics, and also the overall
mean & variance, and the first & second partial autocorrelations.
A Critical Panoramic View of Classical Decision Analysis
The coverage of decision analysis in almost all textbooks and published papers has the following limitations:
1. The decision maker facing a decision under pure uncertainty has to select at least one, and at most one, option from all the possible options.
This certainly limits its scope and its applications. You have already learned both decision analysis and linear programming. Now is the time to use
game theory concepts to link together these two seemingly different types of models and widen their scope in solving more realistic decision-making
problems.
2. The decision maker facing a risky decision has to rely on the expected value alone, which is not a good indicator of a quality decision. The variance
must also be known so that an educated decision can be made.
For example, in investment portfolio selection it is also necessary to compare the "risk" between alternative courses of action. A measure of risk is
generally reported in finance textbooks as the variance, or its square root, called the standard deviation. The variance and standard deviation are
numerical values that indicate the variability inherent in your decision. For risk, smaller values indicate that what you expect is likely to be what you
get. Therefore, risk must also be used in the decision analysis process.
To combine the expected values and the associated risk, one may use the Coefficient of Variation (CV) as a measuring tool and decision process in
decision analysis. As you know well, the CV is the absolute relative deviation with respect to the expected value (provided the expected value is not
zero), expressed as a percentage:
CV = 100 |S / Expected Value| %
Notice that the CV is independent of the unit of measurement of the expected value. The coefficient of variation demonstrates the relationship between
standard deviation and expected value by expressing the risk as a percentage of the (non-zero) expected value. This dimensionless property of the
C.V. enables decision makers to compare and decide when facing several independent decisions whose payoff matrices are measured in different units
(such as dollars, yens, etc.).
3. Analytical Hierarchy Process: One may question whether the analytical hierarchy process can truly handle real-life situations, given the "theoretical"
difficulties in using eigenvectors (versus, for example, geometric means), the related issue of whether one can meaningfully pairwise-compare more
than 10 alternatives, and the question of whether any person can set the nine-point scale without being biased, let alone without becoming exhausted,
when there are 15 options/alternatives to consider, with 20-30 measures and 10 people sitting in a room.
A selection of:
|Armed Forces Network| Associação Portuguesa de Investigação Operacional| Association for Facilities Engineering (AFE)| BUBL Catalogue | Business
Majors| Decision Modeling| Decision Sciences Institute| Economics LTSN| International MCDM Society| MathForum| NEEDS| MERLOT| Penn State U.|
Production & Operations Management Society|
Scottish Further Education| Scout Report| Society for Judgment and Decision Making | Society for Medical Decision Making| US Air Force| Virtual Library|
The Copyright Statement: The fair use, according to the 1996 Fair Use Guidelines for Educational Multimedia, of materials presented on this Web site is
permitted for non-commercial and classroom purposes only.
This site may be mirrored intact (including these notices), on any server with public access. All files are available at
http://www.mirrorservice.org/sites/home.ubalt.edu/ntsbarsh/Business-stat for mirroring.
Kindly e-mail me your comments, suggestions, and concerns. Thank you.
Professor Hossein Arsham
This site was launched on 2/25/1994, and its intellectual materials have been thoroughly revised on a yearly basis. The current version is the 9th Edition. All
external links are checked once a month.
Back to:
Dr Arsham's Home Page
EOF: 1994-2011.