
BEHAVIORAL ECONOMICS

ECON F345

Dushyant Kumar
BITS Pilani, Hyderabad Campus
Decision Making Under Uncertainty


Risk & Uncertainty

I Risk: agreed upon probability distribution.


I Uncertainty: subjective probability distribution.
I Many a time, even in the literature, these terms are used interchangeably.
I We will be confining ourselves to the risk part only here.
I Some basics first- a recap..
Basics: Recap

I Mutually exclusive events

P(A ∪ B) = P(A) + P(B)

P(A ∩ B) = 0
I Exhaustive events

P(A) + P(B) = 1

I Independent events

P(A&B) = P(A) × P(B)


Basics: Recap

I Conditional probability

P(A/B) = P(A&B) / P(B)

P(A&B) = P(A/B) × P(B) = P(B/A) × P(A).


For independent events,

P(A/B) = P(A), P(B/A) = P(B), P(A&B) = P(A)×P(B).
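A quick numerical check of these identities (a minimal Python sketch; the probabilities 0.5 and 0.4 are made up for illustration, not taken from the slides):

```python
# Check the conditional-probability identities with made-up numbers.
# Assume P(A) = 0.5 and P(B) = 0.4, with A and B independent.
p_a, p_b = 0.5, 0.4

p_a_and_b = p_a * p_b            # independence: P(A&B) = P(A) x P(B)
p_a_given_b = p_a_and_b / p_b    # definition: P(A/B) = P(A&B) / P(B)
p_b_given_a = p_a_and_b / p_a    # definition: P(B/A) = P(A&B) / P(A)

print(p_a_given_b, p_a)          # both 0.5: P(A/B) = P(A) under independence
print(p_b_given_a, p_b)          # both 0.4: P(B/A) = P(B) under independence
```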


Basics: Recap

I Bayes’ Rule
I Suppose there are two procedures for producing a particular good- A & B.
I A good from procedure A is defective with probability P(D/A); similarly P(D/B) for procedure B.

P(D) = P(D/A)P(A) + P(D/B)P(B).

I If a good is defective, what is the probability that it was


produced through A?
Basics: Recap

P(A/D) = P(A&D) / P(D) = P(D/A)P(A) / [P(D/A)P(A) + P(D/B)P(B)].
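The same rule can be written as a small helper function. The split between the two procedures (60/40) and the defect rates (2 percent and 5 percent) below are illustrative assumptions, not values from the slides:

```python
# Bayes' rule for the defective-good example, as a small function.
# The numbers used in the call are illustrative assumptions only.

def posterior_A_given_D(p_A, p_B, p_D_given_A, p_D_given_B):
    """P(A/D) = P(D/A)P(A) / [P(D/A)P(A) + P(D/B)P(B)]."""
    p_D = p_D_given_A * p_A + p_D_given_B * p_B   # total probability of a defect
    return p_D_given_A * p_A / p_D

print(posterior_A_given_D(p_A=0.6, p_B=0.4, p_D_given_A=0.02, p_D_given_B=0.05))
# -> 0.375: here most defects actually come from procedure B
```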

I Example: Upon the tests and diagnosis, suppose the doctor infers that a patient has either type A cancer (with prob. 1/3) or type B cancer (with prob. 2/3). (mutually exclusive and exhaustive events)
type A is deadly- 4 out of 5 patients die within a year,
type B is relatively less deadly- 1 out of 5 patients die within a year.
Bayesian Updation: Example Cont...
I What is the probability of a patient dying within a year?

(4/5) × (1/3) + (1/5) × (2/3) = 2/5.
I Prior probability and posterior probability..
I Now suppose the patient dies within a year; what is the posterior probability that the patient was suffering from type A cancer?

P(A/D) = P(D/A)P(A) / [P(D/A)P(A) + P(D/B)P(B)]
= (4/5 × 1/3) / (4/5 × 1/3 + 1/5 × 2/3) = 2/3.

favorable event prob ↑; unfavorable event prob ↓


P(A/D) > P(A), P(B/D) < P(B)..
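A short script verifying the numbers in this example, using exact fractions from Python's standard library:

```python
# Bayesian updating in the cancer example: verify P(death) = 2/5 and P(A/D) = 2/3.
from fractions import Fraction

p_A, p_B = Fraction(1, 3), Fraction(2, 3)                  # prior: type A vs type B
p_D_given_A, p_D_given_B = Fraction(4, 5), Fraction(1, 5)  # death rates within a year

p_D = p_D_given_A * p_A + p_D_given_B * p_B        # total probability of death
p_A_given_D = p_D_given_A * p_A / p_D              # posterior of type A given death

print(p_D)           # 2/5
print(p_A_given_D)   # 2/3  (> prior of 1/3, as expected)
```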
Bayesian Updation

I Processing new information-


Prior probability −→ posterior probability..
I Example: unfair coin toss..
prior belief- the coin is fair with probability 0.30 and unfair with probability 0.70..
unfair in the sense that it has heads on both sides.
consider a game where you get Rs. 100 if it's H, 0 otherwise.
I What is the (max) money that you are ready to pay as entry
fee?

0.30 × (0.50 × 100 + 0.50 × 0) + 0.70 × (1 × 100) = 85.


Bayesian Updation: Example Cont..

I Now suppose you observe that in the first toss, it's H.

Now what is the (max) money that you are ready to pay as entry fee?
Notation-
U − coin being unfair.
F − coin being fair.
G − getting a head in the first toss.

P(G /U) = 1, P(G /F ) = 0.50

P(U/G) = P(G/U)P(U) / [P(G/U)P(U) + P(G/F)P(F)] ≈ 0.82
Bayesian Updation: Example Cont..

I So the expected value after observing G -

0.82 × 100 + (1 − 0.82) × 50 ≈ 91

I What if, instead, T had been observed in the first toss? (see the sketch below)
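A small sketch of the whole coin example: the prior entry fee of 85, the Bayesian update after observing H, the resulting fee of about 91, and the answer for observing T (the two-headed coin is then ruled out):

```python
# Unfair-coin example: entry fee before and after observing the first toss.
# Payoff: Rs. 100 if the next toss is H, 0 otherwise (risk-neutral valuation).

p_fair, p_unfair = 0.30, 0.70
p_H_fair, p_H_unfair = 0.50, 1.0     # the unfair coin has heads on both sides

def fee(p_unfair_belief):
    """Expected payoff of one toss given the current belief that the coin is unfair."""
    p_fair_belief = 1 - p_unfair_belief
    return p_unfair_belief * (p_H_unfair * 100) + p_fair_belief * (p_H_fair * 100)

print(fee(p_unfair))                                 # 85.0  (prior willingness to pay)

# Observe H on the first toss: update the belief via Bayes' rule.
p_unfair_given_H = (p_H_unfair * p_unfair) / (p_H_unfair * p_unfair + p_H_fair * p_fair)
print(round(p_unfair_given_H, 2))                    # 0.82
print(round(fee(p_unfair_given_H)))                  # ~91

# Observe T instead: the two-headed coin is ruled out, so the coin must be fair.
print(fee(0.0))                                      # 50.0
```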

I Different biases....
Conjunction and Disjunction Biases

I Conjunction fallacy: Example


I Consider a Boeing 747-400 aircraft...
it has roughly 6 million parts!
I Each part is extremely reliable: it fails with probability 0.000001, i.e., works with probability 0.999999.
I Suppose the design is such that even if one part fails, the aircraft won't work! (not robust)
I What is the probability of the aircraft not working?
1 − Prob(all parts working) = 1 − (0.999999)^6,000,000 ≈ 1 − e^(−6) ≈ 0.998
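A quick check of this calculation (the tiny per-part failure probability compounds across 6 million independent parts):

```python
# Aircraft example: probability that at least one of 6 million parts fails,
# when each part independently fails with probability 1e-6.
import math

n_parts = 6_000_000
p_fail = 1e-6

p_all_work = (1 - p_fail) ** n_parts
print(1 - p_all_work)            # ~0.9975: failure of the aircraft is almost certain
print(1 - math.exp(-6))          # ~0.9975: the 1 - e^(-6) approximation
```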
Conjunction and Disjunction Biases

I The conjunction fallacy is also often referred to as the planning fallacy.
I While planning mega projects, governments typically make this kind of mistake..
delays in completion of government projects..
I It is basically the tendency to overestimate the probability of the event A ∩ B.
I Disjunction fallacy: the tendency to underestimate P(A ∪ B).
I Why is it so tough to guard against terrorist attacks?
I Cricket: a game biased against the bowlers?
Conjunction and Disjunction Biases

I Example: Birthday problem- Suppose there are 30 students in a class. What is the probability that no two students have the same birthday?
assume births are randomly distributed over the year, non-leap year..

(365/365) × (364/365) × · · · × (336/365) ≈ 0.294, i.e., about 29.4 percent.
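A short computation of the birthday probability for 30 students:

```python
# Birthday problem: probability that no two of 30 students share a birthday
# (365-day year, birthdays uniform and independent).
p_no_match = 1.0
for k in range(30):
    p_no_match *= (365 - k) / 365   # the k-th student avoids the k birthdays already taken

print(round(p_no_match, 3))          # ~0.294, i.e., about 29.4 percent
```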

I Next- base rate neglect..


Base Rate Neglect

I Let's consider an example. Suppose around 1 percent of the population have a certain disease.
There is a really good (highly accurate) test available- 90 percent accuracy.
Suppose someone is diagnosed positive in the test; what is the probability that he indeed has the disease?

P(D/+ve) = P(+ve/D)P(D) / [P(+ve/D)P(D) + P(+ve/ND)P(ND)] ≈ 0.08
I Arguments for random testing for Covid during the very early stage..
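A quick check of the base-rate calculation; "90 percent accuracy" is interpreted here as a 90 percent detection rate together with a 10 percent false-positive rate, which is the reading the 0.08 figure implies:

```python
# Base rate neglect: P(disease | positive test) with a 1% base rate
# and a 90%-accurate test (90% sensitivity, 10% false-positive rate assumed).
p_D = 0.01
p_pos_given_D = 0.90
p_pos_given_ND = 0.10

p_D_given_pos = (p_pos_given_D * p_D) / (p_pos_given_D * p_D + p_pos_given_ND * (1 - p_D))
print(round(p_D_given_pos, 3))   # ~0.083: still unlikely despite the positive test
```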
Decision Theories Under Uncertainty

I Till now we have covered ways to estimate probabilities and some common mistakes that we make in the process.
I Now we move on to cover decision making using these
probabilities!
I Various criteria/approaches..
1. Maximin, Maximax, Minimax-risk, Expected Value..
2. Expected Utility Theory
3. Prospect Theory
Maximin Criterion

I First see what is the minimum payoff/outcome attached to


every action..
Choose the action which has the maximum of these min values- the maximum guaranteed payoff.
I This approach captures an extreme risk-averse attitude..
I Consider the following example-
Maximin Criterion

I So as per the Maximin principle, carrying the umbrella is optimal..
I Notice the risk aversion, an extreme one in fact..
I What if it's a pleasant sunny morning- Feb on the BITSH campus- chances of rain very close to zero..
The analysis and prescription remain the same..
I What if the (Don't Carry, No Rain) payoff is insanely high?
Again, the analysis and prescription remain the same..
Maximin Criterion

I So, the maximin principle is good and insightful to start with, but it has significant drawbacks-
1. It doesn't use a significant portion of the available information..
2. No diversity in terms of risk preferences- all individuals using the maximin approach are exactly the same in terms of risk preferences or attitude..
I Maximax Criterion
I Now first see what is the maximum payoff/outcome attached to every action..
Choose the action which has the maximum of these max values- a shot at the best possible payoff.
I This approach captures a risk-taking attitude.. an extreme one..
Maximax Criterion

I So as per the Maximax principle, not carrying the umbrella is optimal..
I Notice the risk-taking behaviour, an extreme one in fact..
I What if it's extremely likely to rain, say with probability very close to 1?
The analysis and prescription remain the same..
I What if the (Don't Carry, Rain) payoff is insanely low?
Again, the analysis and prescription remain the same..
Maximax Criterion

I So once again, the maximax principle is good and insightful to start with, but it has significant drawbacks-
1. It doesn't use a significant portion of the available information..
2. No diversity in terms of risk preferences- all individuals using the maximax approach are exactly the same in terms of risk preferences or attitude..
I Minimax-risk Criterion
I Here we choose the action with the lowest (potential) ‘regret’.
Minimax-risk Criterion

I So here carrying umbrella is optimal..


I Once again, we are not considering probabilities and the magnitudes of payoffs in many states..
I Do the minimax-risk criterion and the maximin criterion always recommend the same action?
in this example, change the numbers such that the two give different recommendations.
I Let's consider another example to check these three solution concepts.
Minimax-risk Criterion

I Consider a bicycle shop owner who needs to stock up before the season; demand is uncertain..
I Cost side: bicycles can be procured in multiples of 20 units- 20, 40, 60,....
I The per-unit cost decreases with an increase in the number of units procured:
$70 for 20 units, $67 for 40 units,
$65 for 60 units, $64 for 80 units.
I Further, suppose demand can be 10 units (with prob. 0.20), 30 units (with prob. 0.40), 50 units (with prob. 0.30) and 70 units (with prob. 0.10).
Minimax-risk Criterion: Example Contd..

I Further, the retail price of a bicycle is $100.
Leftover stock can be returned to the wholesaler at $45 each..
I In case of a stock shortage, a customer who has to be turned away hurts the shop's reputation.. value this at a loss of $5 per unserved customer..
I How much should the shop owner stock as per maximin,
maximax, and minimax-risk criterion?
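A sketch that reproduces the answers reported on the next slides (20, 80, and 60 units). The profit model below is an assumption reconstructed from the stated numbers, since the payoff table itself is not included in the text:

```python
# Bicycle-shop example: payoff table and the three criteria.
# Assumed profit model: revenue 100 per bicycle sold, leftovers returned at 45 each,
# a goodwill loss of 5 per unserved customer, and a per-unit cost depending on order size.

unit_cost = {20: 70, 40: 67, 60: 65, 80: 64}
demands = [10, 30, 50, 70]    # probabilities 0.2, 0.4, 0.3, 0.1 are not needed for these criteria

def profit(q, d):
    sold = min(q, d)
    leftover = max(q - d, 0)
    short = max(d - q, 0)
    return 100 * sold + 45 * leftover - 5 * short - unit_cost[q] * q

table = {q: [profit(q, d) for d in demands] for q in unit_cost}

maximin = max(table, key=lambda q: min(table[q]))   # best worst-case payoff
maximax = max(table, key=lambda q: max(table[q]))   # best best-case payoff

best_in_state = [max(table[q][i] for q in table) for i in range(len(demands))]
regret = {q: max(best_in_state[i] - table[q][i] for i in range(len(demands))) for q in table}
minimax_regret = min(regret, key=regret.get)        # smallest worst-case regret

print(maximin, maximax, minimax_regret)   # 20 80 60, matching the slides
```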
Minimax-risk Criterion: Example Contd..

I Maximin- stock just 20 units.


Maximax- stock 80 units.
Minimax-risk- stock 60 units.
I The minimax-risk criterion provides a middle or balanced path..
although it also suffers from essentially similar weaknesses-
1. probabilities are not considered at all.
2. magnitudes of payoffs in many states are ignored..
I Overall, in all three approaches a lot of information is not taken into account..
Maximin Criterion: Harsanyi Challenge

Next stop: Expected value approach..


Expected Value Approach

I Set of potential outcomes: {x1 , x2 , · · · , xn }.


associated probability distribution: {p1 , p2 , · · · , pn }.
Lottery- L = {x1 , p1 ; x2 , p2 ; · · · ; xn , pn }.
I Expected value-

EV(L) = Σi xi pi

notice that here both magnitudes and probabilities are accommodated..
I The expected value approach is quite widely used-
law and economics- magnitude of punishment times the probability of getting punished..
expected profit..
insurance- actuarially fair premiums..
Expected Value Approach

I Biggest strength of expected value approach- simplicity!


I Extremely desirable characteristic for any decision theory..
I However, notice that we still don't have diversity of risk attitudes here..
I The expected value is linear in probabilities and outcomes..
linear in outcomes- it caters only to monetary outcomes; even for monetary outcomes, it's possible to have non-linearity in satisfaction/value/utility..
I Other issue- dealing with extreme probabilities..
Expected Value Approach

I Consider the following examples-


A if you hit a particular point in a circle, you get Rs. 10 lakhs;
zero otherwise..
B if you hit a particular point in a circle, you have to pay a
penalty of Rs. 10 lakhs; zero otherwise..
I Which one do you prefer?
I The expected value approach suggests indifference..
I Suggested approach: dominance..
I Tail risk: Consider an event with an extremely low probability but a catastrophic effect.. difficult to deal with in the expected value approach.. such events should be properly hedged..
Expected Value Approach: St. Petersburg Paradox

I A fair coin is tossed-
if it takes n tosses to get the first head (H), you get Rs. 2^n.
I What is the max entry fee that you are ready to pay for this
game?

I What is the expected value of this game?
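A minimal sketch of why the expected value blows up: every term of the sum contributes exactly 1, so the partial sums grow without bound:

```python
# St. Petersburg game: each term of the expected value contributes
# P(first H on toss n) x prize = (1/2)^n x 2^n = 1, so the sum diverges.
def partial_expected_value(n_terms):
    return sum((0.5 ** n) * (2 ** n) for n in range(1, n_terms + 1))

for n in (10, 100, 1000):
    print(n, partial_expected_value(n))   # grows without bound (equals n)
```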


Expected Value Approach: St. Petersburg Paradox

I The St. Petersburg paradox thought experiment- why such a huge divergence between your willingness to pay and the expected value?
I The expected value approach also does not cater for diversity in risk attitudes.
I What are risk attitudes? We have not defined them yet!
I Before moving forward, let's first discuss risk attitudes briefly.
I Different types of risk attitudes- risk averse, risk neutral, risk
taker..
I Lets define these formally..
I Consider the lottery L and expected value EV (L) as
described earlier..
Risk Attitudes

I Would you like to play the lottery, i.e., take a chance with the lottery, or rather settle for EV(L)?
I Risk averse- EV(L) ≻ L
I Risk neutral- EV(L) ∼ L
I Risk taker- EV(L) ≺ L.
I So the expected value approach covers just the risk-neutrality part. We need a theory that can explain or accommodate all three behaviours..
I Risk premium- Consider a risk averse individual. The individual strictly prefers EV(L) to playing the lottery and taking the risk.
Risk Attitudes

I What if the individual is offered a bit less than the expected value, EV(L) − δ?
I Notice that there exists an amount such that the individual prefers to play the lottery instead of taking that amount with certainty- e.g., the worst outcome xn..
I So if we keep on reducing from EV(L), there exists an amount at which the individual is just indifferent between playing the lottery and taking that amount with certainty- the certainty equivalent (CE)..

L ∼ CE
Risk Attitudes

I What is the amount that this risk averse individual is


sacrificing in order to avoid the risk?

EV (L) − CE

this is known as the risk premium (RP)..


I For a risk averse individual, CE < EV (L) and RP > 0,
For a risk neutral individual, CE = EV (L) and RP = 0
For a risk taker individual, CE > EV (L) and RP < 0

I Next Stop: Expected Utility Theory..


Expected Utility Theory

I John von Neumann and Oskar Morgenstern- Theory of Games


and Economic Behavior (1944)
I Consider the lottery described earlier-

L = {x1 , p1 ; x2 , p2 ; · · · ; xn , pn }

I First look for a functional representation of individual’s


preference over the set of outcomes- u(xi ).
I Existence of such representation?
I Define the expected utility function (U : L −→ R) as

U = Σi pi u(xi)
Expected Utility Theory

I First observe that expected utility theory allows for the


non-linearity of satisfaction from outcomes (u(xi )).
This ensures that now an individual can be risk averse, risk
taker or, risk neutral..
I Example: Consider the lottery- [A.] flip a (fair) coin- if it is heads, you get Rs. 100; if it is tails, you get nothing.

L = {100, 0.50; 0, 0.50}

suppose u(xi) = xi^(1/2), i.e., √xi.
EV(L) = 50
Expected Utility Theory: Example

EU(L) = 0.50 × 100^(1/2) + 0.50 × 0 = 5
Utility that the individual would get from receiving the expected value, i.e., 50 with certainty-
50^(1/2) = 7.07 (approx.)

So the individual here prefers to have 50 with certainty rather


than playing the lottery- risk averse..
What if the utility function is

u(xi) = xi^2

Verify- risk taker..


Expected Utility Theory: Example

I Let's continue with u(xi) = xi^(1/2). What are CE and RP?
u(CE) = CE^(1/2) = EU(L) = 5
or, CE = 5^2 = 25.
RP = EV(L) − CE = 50 − 25 = 25.
I CE and RP for u = x^2 and u = kx, k > 0.. (see the sketch after this slide)
I The expected utility framework is a significant development
over the earlier approaches- significantly richer setup..
I What are we assuming when we are working with EUT?
Axiomatic foundation..
crucial for further developments..
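A numerical check of the CE and RP calculations above, including the suggested variants u = x^2 and u = kx (k = 3 is an arbitrary illustrative choice):

```python
# CE and RP for L = {100, 0.50; 0, 0.50} under three utility functions.
lottery = [(100, 0.50), (0, 0.50)]
EV = sum(p * x for x, p in lottery)                       # 50

def analyse(u, u_inverse):
    EU = sum(p * u(x) for x, p in lottery)                # expected utility
    CE = u_inverse(EU)                                    # certainty equivalent: u(CE) = EU(L)
    return CE, EV - CE                                    # (CE, risk premium)

print(analyse(lambda x: x ** 0.5, lambda v: v ** 2))      # (25.0, 25.0): risk averse
print(analyse(lambda x: x ** 2,  lambda v: v ** 0.5))     # (~70.7, ~-20.7): risk taker
print(analyse(lambda x: 3 * x,   lambda v: v / 3))        # (50.0, 0.0): risk neutral (u = kx, k = 3)
```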
Expected Utility Theory: Axiomatic Foundation

I Completeness- Any two lotteries can be compared.. no indecisiveness
I Transitivity- If L1 ≽ L2 and L2 ≽ L3 then L1 ≽ L3.
** Best and worst lottery- if x1 is the best outcome and xn is the
worst outcome then L1 is the best lottery and Ln is the worst
lottery,
where L1 = {x1 , 1; x2 , 0; · · · ; xn , 0} and
Ln = {x1 , 0; x2 , 0; · · · ; xn , 1}
I Continuity- For every lottery Li there exists a p ∈ [0, 1] such that
Li ∼ {x1 , p; x2 , 0; · · · ; xn , 1 − p}
or,
Li ∼ {x1 , p; xn , 1 − p}
Expected Utility Theory: Axiomatic Foundation

I Reduction of Compound Lotteries- the individual is indifferent


between a compound lottery and its reduced simple lottery.
I Independence- Suppose L1 ≽ L2 , then

{L1 , p; L3 , 1 − p} ≽ {L2 , p; L3 , 1 − p}

I Theorem- Let preferences satisfy the above axioms; then there exists a function U : L −→ R such that L1 ≽ L2 if and only if U(L1) ≥ U(L2), where

U = Σi pi u(xi).
Expected Utility Theory: Violations

Violations of the independence axiom


I Example:
b1 = (x0 , 0; x0 + 300000, 1)
b2 = (x0 , 0.2; x0 + 400000, 0.8)
b3 = (x0 , 0.75; x0 + 300000, 0.25)
b4 = (x0 , 0.8; x0 + 400000, 0.2)
for illustration, say x0 = 0.

I b1 vs. b2 ?
b1
I b3 vs. b4 ?
b4
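A sketch of why this choice pattern is inconsistent with expected utility: for any utility function u, EU(b3) − EU(b4) = 0.25 × [EU(b1) − EU(b2)], so the two comparisons must go the same way. The square-root utility below is just one illustrative choice, with x0 = 0 as on the slide:

```python
# Under expected utility, the b1-vs-b2 ranking must agree with the b3-vs-b4 ranking:
# EU(b3) - EU(b4) = 0.25 x [EU(b1) - EU(b2)] for ANY utility u.
# The common pattern (choosing b1 and b4) therefore cannot be rationalised by EU.

u = lambda x: x ** 0.5               # one illustrative (concave) utility

def EU(lottery):                     # lottery = list of (outcome, probability)
    return sum(p * u(x) for x, p in lottery)

b1 = [(300000, 1.0)]
b2 = [(0, 0.2), (400000, 0.8)]
b3 = [(0, 0.75), (300000, 0.25)]
b4 = [(0, 0.8), (400000, 0.2)]

print(EU(b1) - EU(b2))               # ~41.8
print(EU(b3) - EU(b4))               # ~10.4 = 0.25 x 41.8: same sign, same ranking
```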
Violations of Independence Axioms

I It seems that while comparing b3 and b4, many of us treat the small difference in probabilities (0.05 here) as insignificant, and hence prefer b4;
but while comparing b1 and b2, we often treat the difference in probabilities (0.20 here) as significant, and hence prefer b1.
I Non-linearity in probability..
? Near certainty- why take even a slight ‘unnecessary’ risk?
? In deep uncertainty (away from shore)- a bit more risk does
not matter..
Expected Utility Theory: Violations

I Later it was found that the Allais paradox is part of a larger class of paradoxes- common ratio violations and common consequence violations.
I Common ratio violation- Consider three outcomes x0 < x1 < x2 and the following lotteries-
b1 = (x1 , p1 ; x0 , 1 − p1 ) and b2 = (x2 , p2 ; x0 , 1 − p2 )
b3 = (x1 , αp1 ; x0 , 1 − αp1 ) and b4 = (x2 , αp2 ; x0 , 1 − αp2 )
I Notice that in both cases the ratio of the winning probabilities is the same- the common ratio p1/p2.
I Now if someone prefers b1 over b2, and b4 over b3, this is referred to as a ‘common ratio violation’.
Expected Utility Theory: Violations

I In the previous example,

p1 = 1, x1 = x0 + 300000, p2 = 0.8, x2 = x0 + 400000, α = 0.25.
I Proposition: The common ratio violation is a violation of the
independence axiom, or the reduction axiom, or of both
axioms.
I Common consequence violation- Let A, B, C, D be lotteries
and α ∈ [0, 1]. Suppose that we have the following four
compound lotteries-
a1 = (A, α; D, 1 − α), and a2 = (B, α; D, 1 − α)
a3 = (A, α; C , 1 − α), and a4 = (B, α; C , 1 − α)
Expected Utility Theory: Violations

I A common consequence violation occurs if someone prefers a1 over a2 and a4 over a3 (or, a2 over a1 and a3 over a4).
I Various modifications of expected utility theory have been proposed to relax the independence axiom.
I EU is still quite popular! It remains the default concept when dealing with risk and uncertainty..
I Other violations- we are going to briefly discuss a couple of other violations in the context of EU theory.
I Frame independence- our decision should be independent of
framing.. example-
Expected Utility Theory: Violations

I Framing effect

Most people choose A in the case of positive framing and B in the case of negative framing..
Expected Utility Theory: Violations

I As per the expected utility theory, the framing should have no


role to play in our decisions.
I Framing and communication- often they play a huge role..
I Reduction axiom- many a time, we don't treat a compound lottery and its ‘equivalent’ simple lottery in exactly the same way- compound risk premium- Abdellaoui et al. (2014)
I Rabin's paradox- Rabin (2000)- under expected utility, risk aversion over small stakes implies unreasonably high risk aversion over large stakes..
I Fair bet- a risk averse individual always rejects a fair bet, a
risk taker always takes it!
Expected Utility Theory: Violations- Rabin (2000)
