Risk Adjusted Discount Rate: Sensitivity Analysis
Sensitivity analysis (SA) is the study of how the variation (uncertainty) in the output of a mathematical model can be apportioned, qualitatively or quantitatively, to different sources of variation in the model's input [1]. Put another way, it is a technique for systematically changing parameters in a model to determine the effects of such changes.
In more general terms, uncertainty and sensitivity analysis investigate the robustness of a study when the study includes some form of mathematical modelling. Sensitivity analysis can be useful to computer modellers for a range of purposes [2], including:
- supporting decision making or the development of recommendations for decision makers (e.g. testing the robustness of a result);
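As a concrete illustration of "systematically changing parameters", here is a minimal one-at-a-time (OAT) sensitivity sketch. The NPV model, the parameter names, and the ±10% perturbation are all assumptions chosen for the example, not taken from the text.

```python
# One-at-a-time (OAT) sensitivity analysis of a simple NPV model.
# The model and all baseline values below are illustrative assumptions.

def npv(cash_flow, rate, years=5):
    """Net present value of a level annual cash flow over `years` years."""
    return sum(cash_flow / (1 + rate) ** t for t in range(1, years + 1))

baseline = {"cash_flow": 100.0, "rate": 0.10}

def oat_sensitivity(model, baseline, perturbation=0.10):
    """Vary each input +/- `perturbation` while holding the others fixed,
    and report the resulting swing in the model output."""
    base_out = model(**baseline)
    swings = {}
    for name, value in baseline.items():
        low = dict(baseline, **{name: value * (1 - perturbation)})
        high = dict(baseline, **{name: value * (1 + perturbation)})
        swings[name] = model(**high) - model(**low)
    return base_out, swings

base_out, swings = oat_sensitivity(npv, baseline)
# Ranking inputs by |swing| shows which source of input variation
# the output uncertainty is mostly apportioned to.
ranking = sorted(swings, key=lambda k: -abs(swings[k]))
```

Sorting the swings by absolute size is the qualitative step: it ranks the inputs by how much of the output variation each one drives.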
A risk-adjusted discount rate is the risk-free interest rate (such as the rate on a government security) plus a risk premium appropriate to the level of risk.
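The definition translates directly into code. The 3% risk-free rate, 7% premium, cash flow, and horizon below are hypothetical numbers chosen for illustration.

```python
# Risk-adjusted discount rate: risk-free rate plus a risk premium.
# All rates and cash flows below are hypothetical.

def risk_adjusted_rate(risk_free, risk_premium):
    return risk_free + risk_premium

def present_value(cash_flow, rate, year):
    """Discount a single future cash flow back to today."""
    return cash_flow / (1 + rate) ** year

rate = risk_adjusted_rate(0.03, 0.07)   # 3% government yield + 7% premium
pv = present_value(1000.0, rate, 3)     # $1,000 received in 3 years
```

Discounting a risky cash flow at this higher rate lowers its present value relative to a risk-free one, which is the point of the adjustment.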
A decision tree is a decision support tool that uses a tree-like graph or model of decisions and their possible consequences, including chance event outcomes, resource costs, and utility. It is one way to display an algorithm. Decision trees are commonly used in operations research, specifically in decision analysis, to help identify a strategy most likely to reach a goal. Another use of decision trees is as a descriptive means for calculating conditional probabilities. When the decisions or consequences are modelled by computational verbs, the decision tree is called a computational verb decision tree [1].
General
In decision analysis, a "decision tree" (and the closely related influence diagram) is used as a visual and analytical decision support tool, where the expected values (or expected utility) of competing alternatives are calculated.
Decision trees have traditionally been created manually, as the following example shows:
Analysis can take into account the decision maker's (e.g., the company's) preference or utility
function, for example:
The basic interpretation in this situation is that the company prefers B's risk and payoffs under realistic risk preference coefficients (greater than $400K; in that range of risk aversion, the company would need to model a third strategy, "Neither A nor B").
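The expected-value rollback behind this comparison can be sketched in a few lines. The payoffs and probabilities for strategies "A" and "B" below are invented for illustration; they are not the figures from the article's example.

```python
# Expected monetary value (EMV) of two competing strategies in a
# decision tree. Each chance node is a list of (probability, payoff)
# branches; all numbers are hypothetical.

def expected_value(branches):
    assert abs(sum(p for p, _ in branches) - 1.0) < 1e-9
    return sum(p * payoff for p, payoff in branches)

strategies = {
    "A": [(0.4, 1_000_000), (0.6, -200_000)],  # high risk, high payoff
    "B": [(0.8, 300_000), (0.2, 100_000)],     # low risk, modest payoff
}

emv = {name: expected_value(b) for name, b in strategies.items()}
best = max(emv, key=emv.get)
```

A risk-neutral decision maker simply picks the higher EMV. Modelling risk aversion means applying a concave utility function to each payoff before taking the expectation, which is how a strategy like B can be preferred despite a lower expected monetary value.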
Influence diagram
A decision tree can be represented more compactly as an influence diagram, focusing attention on the issues and relationships between events.
Uses in teaching
Decision trees, influence diagrams, utility functions, and other decision analysis tools and methods are taught to undergraduate students in schools of business, health economics, and public health, and are examples of operations research or management science methods.
Advantages
Amongst decision support tools, decision trees (and influence diagrams) have several advantages:
Decision trees:
- Are simple to understand and interpret. People are able to understand decision tree models after a brief explanation.
- Have value even with little hard data. Important insights can be generated based on experts describing a situation (its alternatives, probabilities, and costs) and their preferences for outcomes.
- Use a white box model. If a given result is provided by a model, the explanation for the result is easily replicated by simple math.
- Can be combined with other decision techniques. The following example uses Net Present Value calculations, PERT 3-point estimations (decision #1) and a linear distribution of expected outcomes (decision #2):
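The worked example referred to above is a figure that is not reproduced here. As a minimal sketch of the combination it describes, the snippet below feeds a PERT 3-point estimate into a Net Present Value calculation; the cash flows, discount rate, and horizon are invented numbers, not those from the original figure.

```python
# PERT three-point estimate feeding a Net Present Value calculation.
# All figures below are invented for illustration.

def pert_estimate(optimistic, most_likely, pessimistic):
    """Classic PERT weighted mean: (O + 4M + P) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

def npv(initial_outlay, annual_cash_flow, rate, years):
    pv = sum(annual_cash_flow / (1 + rate) ** t for t in range(1, years + 1))
    return pv - initial_outlay

# Decision #1: annual cash flow estimated from three expert scenarios.
cash_flow = pert_estimate(optimistic=150.0, most_likely=100.0, pessimistic=50.0)
project_npv = npv(initial_outlay=300.0, annual_cash_flow=cash_flow,
                  rate=0.10, years=5)
```

The PERT mean condenses the experts' optimistic, most likely, and pessimistic scenarios into a single cash-flow estimate, which the decision tree then discounts like any other branch payoff.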