Measure, Integral and Probability

2. Measure

2.6 Probability
The ideas which led to Lebesgue measure may be adapted to construct measures on arbitrary sets: any set Ω carrying an outer measure (i.e. a monotone, countably sub-additive mapping from P (Ω) to [0, ∞]) can be equipped with a measure µ defined on an appropriate σ-field F of its subsets. The resulting triple (Ω, F, µ) is then called a measure space, as observed in Remark 2.1. Note that in the construction of Lebesgue measure we only used the properties, not the particular form, of the outer measure.

For the present, however, we shall be content with noting simply how to
restrict Lebesgue measure to any Lebesgue measurable subset B of R with
m(B) > 0:
Given Lebesgue measure m on the Lebesgue σ-field M let

MB = {A ∩ B : A ∈ M}

and for A ∈ MB write


mB (A) = m(A).

Proposition 2.20
(B, MB , mB ) is a complete measure space.
Hint ⋃i (Ai ∩ B) = (⋃i Ai ) ∩ B and (A1 ∩ B) \ (A2 ∩ B) = (A1 \ A2 ) ∩ B.
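The two identities in the hint are easy to sanity-check on small finite sets (the sets below are arbitrary illustrative choices; Python's `&`, `|` and `-` stand for ∩, ∪ and \):

```python
A1, A2, B = {1, 2, 3, 4}, {3, 4, 5}, {2, 3, 4, 6}

# Union identity (with two sets): (A1 ∩ B) ∪ (A2 ∩ B) = (A1 ∪ A2) ∩ B
assert (A1 & B) | (A2 & B) == (A1 | A2) & B == {2, 3, 4}

# Difference identity: (A1 ∩ B) \ (A2 ∩ B) = (A1 \ A2) ∩ B
assert (A1 & B) - (A2 & B) == (A1 - A2) & B == {2}
```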

We can finally state precisely what we mean by ‘selecting a number from [0,1] at random’: restrict Lebesgue measure m to the interval B = [0, 1] and consider the σ-field M[0,1] of measurable subsets of [0, 1]. Then m[0,1] is a measure on M[0,1] with ‘total mass’ 1. Since all subintervals of [0,1] with the same length have equal measure, the ‘mass’ of m[0,1] is spread uniformly over [0,1], so that, for example, the ‘probability’ of choosing a number from [0, 1/10) is the same as that of choosing a number from [6/10, 7/10), namely 1/10. Thus all
numerals are equally likely to appear as first digits of the decimal expansion
of the chosen number. On the other hand, with this measure, the probability
that the chosen number will be rational is 0, as is the probability of drawing
an element of the Cantor set C.
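The uniform spread of m[0,1] can be illustrated with a short Monte Carlo sketch (an illustration only, not part of the text; Python's `random.random()` plays the role of ‘selecting a number from [0, 1) at random’):

```python
import random

random.seed(0)          # fixed seed so the run is reproducible
N = 100_000

# Draw N numbers 'at random' from [0, 1) and tally the first decimal
# digit of each, i.e. which interval [d/10, (d+1)/10) the number
# falls into.
counts = [0] * 10
for _ in range(N):
    x = random.random()
    counts[int(10 * x)] += 1

# Each digit d = 0, ..., 9 occurs with empirical frequency close to
# m_[0,1]([d/10, (d+1)/10)) = 1/10.
freqs = [c / N for c in counts]
print(freqs)
```

The empirical frequencies all hover near 1/10, in line with the uniformity argument above; no simulation can exhibit the measure-zero events (rationals, Cantor set), which is exactly what probability 0 predicts.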

We now have the basis for some probability theory, although a general
development still requires the extension of the concept of measure from R to
abstract sets. Nonetheless the building blocks are already evident in the detailed
development of the example of Lebesgue measure. The main idea in providing a mathematical foundation for probability theory is to use the concept of measure to provide the mathematical model of the intuitive notion of probability. The
distinguishing feature of probability is the concept of independence, which we
introduce below. We begin by defining the general framework.

2.6.1 Probability space

Definition 2.6
A probability space is a triple (Ω, F, P ) where Ω is an arbitrary set, F is a
σ-field of subsets of Ω, and P is a measure on F such that

P (Ω) = 1,

called a probability measure or, briefly, a probability.

Remark 2.6
The original definition, given by Kolmogorov in 1932, is a variant of the above
(see Theorem 2.14): (Ω, F, P ) is a probability space if (Ω, F) are given as
above, and P is a finitely additive set function with P (Ø) = 0 and P (Ω) = 1
such that P (Bn ) ↘ 0 whenever (Bn ) in F decreases to Ø.

Example 2.2
We see at once that Lebesgue measure restricted to [0, 1] is a probability measure. More generally: suppose we are given an arbitrary Lebesgue measurable set Ω ⊂ R with m(Ω) > 0. Then P = c·mΩ , where c = 1/m(Ω) and mΩ denotes the restriction of Lebesgue measure to measurable subsets of Ω, provides a probability measure on Ω, since P is complete and P (Ω) = 1.
For example, if Ω = [a, b], we obtain c = 1/(b − a), and P becomes the ‘uniform distribution’ over [a, b]. However, we can also use less familiar sets for our base space; for example, Ω = [a, b] ∩ (R \ Q) with c = 1/(b − a) gives the same distribution over the irrationals in [a, b].
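For Ω = [a, b] the resulting probability of a subinterval is just its relative length; a one-line helper (an illustrative sketch, the function name is ours) makes this concrete:

```python
def uniform_prob(a, b, c, d):
    """P([c, d]) under the uniform distribution on [a, b]:
    the relative length (d - c) / (b - a). Illustrative helper."""
    assert a <= c <= d <= b, "[c, d] must lie inside [a, b]"
    return (d - c) / (b - a)

# Probability that a number drawn uniformly from [0, 4] lies in [1, 2]:
print(uniform_prob(0, 4, 1, 2))  # 0.25
```

The same numbers apply verbatim when Ω is replaced by the irrationals in [a, b], since the rationals removed have measure zero.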

2.6.2 Events: conditioning and independence

The word ‘event’ is used to indicate that something is happening. In probability a typical experiment consists in drawing an element from a set, and an event is then concerned with the outcome belonging to a particular subset. So, as described above, if Ω = [0, 1] we may be interested in the fact that a number drawn at random from [0, 1] belongs to some A ⊂ [0, 1]. We want to estimate the probability of
this happening, and in the mathematical setup this is the number P (A), here
m[0,1] (A). So it is natural to require that A should belong to M[0,1] , since these
are the sets we may measure. By a slight abuse of language, probabilists
tend to identify the actual ‘event’ with the set A which features in the event.
The next definition simply confirms this abuse of language.

Definition 2.7
Given a probability space (Ω, F, P ) we say that the elements of F are events.

Suppose next that a number has been drawn from [0, 1] but has not yet been revealed. We would like to bet on it being in [0, 1/4] and we get a tip that it certainly belongs to [0, 1/2]. Clearly, given this ‘inside information’, the probability of success is now 1/2 rather than 1/4. This motivates the following general definition.

Definition 2.8
Suppose that P (B) > 0. Then the number

P (A|B) = P (A ∩ B) / P (B)

is called the conditional probability of A given B.
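For the betting example above, the computation reads as follows (a worked check of Definition 2.8 with P given by Lebesgue measure on [0, 1], so probabilities are interval lengths):

```python
# A = [0, 1/4] (the bet), B = [0, 1/2] (the tip).
P_B = 0.5
P_A_and_B = 0.25       # A ∩ B = [0, 1/4], since A ⊂ B

# Definition 2.8: P(A|B) = P(A ∩ B) / P(B).
P_A_given_B = P_A_and_B / P_B
print(P_A_given_B)     # 0.5, up from the unconditional P(A) = 0.25
```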

Proposition 2.21
The mapping A 7→ P (A|B) is countably additive on the σ-field FB .

Hint Use the fact that A 7→ P (A ∩ B) is countably additive on F.

A classical application of conditional probability is the total probability formula, which enables the computation of the probability of an event by means of conditional probabilities given some disjoint hypotheses:

Exercise 2.9
Prove that if Hi are pairwise disjoint events such that ⋃∞i=1 Hi = Ω and P (Hi ) ≠ 0 for each i, then

P (A) = Σ∞i=1 P (A|Hi ) P (Hi ).
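The formula is easy to check on a toy discrete example (the numbers below are our own illustration, not from the text): one of two hypotheses H1, H2 is chosen with probabilities 1/3 and 2/3, and P(A|H1) = 1/2, P(A|H2) = 1/4.

```python
from fractions import Fraction as F

P_H = [F(1, 3), F(2, 3)]            # P(H1), P(H2): disjoint, sum to 1
P_A_given_H = [F(1, 2), F(1, 4)]    # P(A|H1), P(A|H2)

# Total probability formula: P(A) = Σ P(A|Hi) P(Hi).
P_A = sum(p * h for p, h in zip(P_A_given_H, P_H))
print(P_A)  # 1/3
```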

It is natural to say that the event A is independent of B if the fact that B takes place has no influence on the chances of A, i.e.
P (A|B) = P (A).
By definition of P (A|B) this immediately implies the relation
P (A ∩ B) = P (A) · P (B)
which is usually taken as the definition of independence. The advantage of this practice is that we may dispense with the assumption P (B) > 0.

Definition 2.9
The events A, B are independent if
P (A ∩ B) = P (A) · P (B).

Exercise 2.10
Suppose that A and B are independent events. Show that Ac and B are
also independent.
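A quick numerical check of the exercise, with an illustrative pair of intervals in [0, 1] (probabilities given by length): since P(Ac ∩ B) = P(B) − P(A ∩ B), independence of A and B forces P(Ac ∩ B) = (1 − P(A)) P(B).

```python
# A = [0, 1/4], B = [1/8, 5/8] in [0, 1]; probabilities are lengths.
P_A, P_B = 0.25, 0.5
P_A_and_B = 0.125                    # m([1/8, 1/4])
assert P_A_and_B == P_A * P_B        # A and B are independent

# Ac ∩ B = (1/4, 5/8], and P(Ac ∩ B) = P(B) - P(A ∩ B).
P_Ac = 1 - P_A
P_Ac_and_B = P_B - P_A_and_B         # = 0.375 = m((1/4, 5/8])
assert P_Ac_and_B == P_Ac * P_B      # so Ac and B are independent
```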

The Exercise indicates that if A and B are independent events, then all
elements of the σ-fields they generate are mutually independent, since these σ-
fields are simply the collections FA = {Ø, A, Ac , Ω} and FB = {Ø, B, B c , Ω}
respectively. This leads us to a natural extension of the definition: two σ-fields
F1 and F2 are independent if for any choice of sets A1 ∈ F1 and A2 ∈ F2 we
have P (A1 ∩ A2 ) = P (A1 )P (A2 ).
However, the extension of these definitions to three or more events (or
several σ-fields) needs a little care, as the following simple examples show:

Example 2.3
Let Ω = [0, 1], A = [0, 1/4] as before; then A is independent of B = [1/8, 5/8] and of C = [1/8, 3/8] ∪ [3/4, 1]. In addition, B and C are independent. However,

P (A ∩ B ∩ C) ≠ P (A) · P (B) · P (C).
Thus, given three events, the independence of each of the three possible pairs does not suffice for the extension of ‘independence’ to all three events.
On the other hand, with A = [0, 1/4], B = C = [0, 1/16] ∪ [1/4, 11/16] (or alternatively with C = [0, 1/16] ∪ [9/16, 1])

P (A ∩ B ∩ C) = P (A) · P (B) · P (C)   (2.21)

but none of the pairs make independent events.
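Both halves of the example are easy to verify numerically, since every probability involved is an interval length; here is a check of the first one (pairwise independent, yet not independent as a triple):

```python
# A = [0, 1/4], B = [1/8, 5/8], C = [1/8, 3/8] ∪ [3/4, 1] in [0, 1].
P_A, P_B, P_C = 1/4, 1/2, 1/2

P_AB = 1/8    # m(A ∩ B) = m([1/8, 1/4])
P_AC = 1/8    # m(A ∩ C) = m([1/8, 1/4])
P_BC = 1/4    # m(B ∩ C) = m([1/8, 3/8])
P_ABC = 1/8   # m(A ∩ B ∩ C) = m([1/8, 1/4])

# Each pair factorises ...
assert P_AB == P_A * P_B
assert P_AC == P_A * P_C
assert P_BC == P_B * P_C
# ... but the triple intersection does not: 1/8 ≠ 1/16.
assert P_ABC != P_A * P_B * P_C
```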

This confirms further that we need to demand rather more if we wish to extend the above definition – pairwise independence is not enough, nor is (2.21); therefore we need to require both conditions to be satisfied together. Extending this to n events leads to:

Definition 2.10
The events A1 , . . . , An are independent if, for each k ≤ n and each choice of k of these events, the probability of their intersection is the product of their probabilities.
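Definition 2.10 translates directly into a small checker (an illustrative sketch; the function and its calling convention are ours). Here `inter_prob` maps a tuple of indices to the probability of the corresponding intersection:

```python
from itertools import combinations
from math import prod, isclose

def independent(probs, inter_prob):
    """Check Definition 2.10: probs[i] = P(Ai), and inter_prob(S)
    returns the probability of the intersection of the events
    indexed by the tuple S."""
    n = len(probs)
    # Every subset of size k = 2, ..., n must factorise.
    return all(
        isclose(inter_prob(S), prod(probs[i] for i in S))
        for k in range(2, n + 1)
        for S in combinations(range(n), k)
    )

# The data of Example 2.3: all pairwise products match, but the
# triple intersection has probability 1/8 instead of 1/16.
P = [1/4, 1/2, 1/2]
inter = {(0, 1): 1/8, (0, 2): 1/8, (1, 2): 1/4, (0, 1, 2): 1/8}
print(independent(P, inter.__getitem__))  # False
```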

Again there is a powerful counterpart for σ-fields (which can be extended to sequences, and even arbitrary families):

Definition 2.11
The σ-fields F1 , F2 , . . . , Fn defined on a given probability space (Ω, F, P ) are independent if, for all choices of distinct indices i1 , i2 , . . . , ik from {1, 2, . . . , n} and all choices of sets Fij ∈ Fij (j = 1, . . . , k), we have

P (Fi1 ∩ Fi2 ∩ · · · ∩ Fik ) = P (Fi1 ) · P (Fi2 ) · · · P (Fik ).

The issue of independence will be revisited in subsequent chapters, where we develop some more tools to calculate probabilities.

2.6.3 Applications to mathematical finance

As indicated in the Preface, we will explore briefly how the ideas developed
in each chapter can be applied in the rapidly growing field of mathematical
finance. This is not intended as an introduction to this subject, but hope-
fully it will demonstrate how a consistent mathematical formulation can help
to clarify ideas central to many disciplines. Readers who are unfamiliar with
mathematical finance should consult texts such as [4], [5], [7] for definitions and
a discussion of the main ideas of the subject.
Probabilistic modelling in finance centres on the analysis of models for the
evolution of the value of traded assets, such as stocks or bonds, and seeks
to identify trends in their future behaviour. Much of the modern theory is
concerned with evaluating derivative securities such as options, whose value is
determined by the (random) future values of some underlying security, such as
a stock.
