Chapter 1: Background
Definitions
Outcome – a particular result of an experiment.
Sample Space – the set of all possible outcomes. Outcomes cannot overlap, and all outcomes must be represented. A sample space can be found by:
1. A list
2. A grid
3. A tree diagram
Event – any subset of outcomes.
SAMPLE SPACES
Sample Space Diagrams
Imagine one die is red (Dice 1) and one is blue (Dice 2). Adding the two scores gives the sample space diagram below; there are 36 equally likely outcomes.

Total score (Dice 1 across, Dice 2 down):
 +  | 1   2   3   4   5   6
 1  | 2   3   4   5   6   7
 2  | 3   4   5   6   7   8
 3  | 4   5   6   7   8   9
 4  | 5   6   7   8   9  10
 5  | 6   7   8   9  10  11
 6  | 7   8   9  10  11  12

How many ways can we get a 7? There are 6, so P(7) = 6/36 = 1/6.
Similarly, P(12) = 1/36 and P(6) = 5/36.
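These counts can be checked by enumerating the 36 outcomes directly; a minimal Python sketch (not part of the original slides):

    # Enumerate the 36 equally likely outcomes of two fair dice
    from itertools import product

    totals = [d1 + d2 for d1, d2 in product(range(1, 7), range(1, 7))]

    for target in (7, 12, 6):
        ways = totals.count(target)
        print(f"P({target}) = {ways}/36")
    # P(7) = 6/36, P(12) = 1/36, P(6) = 5/36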
EXAMPLE:
When we toss a coin 3 times and record the results without regard to the order in which they occur, the sample space is
{H, H, H}, {H, H, T}, {H, T, T}, {T, T, T}.
Note that {H, H, H} corresponds to one of the ordered outcomes, {H, H, T} to three, {H, T, T} to three, and {T, T, T} to one.
Thus {H, H, H} and {T, T, T} each occur with probability 1/8, while {H, H, T} and {H, T, T} each occur with probability 3/8.
Events
Independent / Dependent
Definition: Two events E and F are independent if the occurrence of event E in a probability experiment does not affect the probability of event F.
Examples
Independent: Draw a card from a deck of cards. Replace it. Draw another card.
Dependent: Draw a card from a deck. Do NOT replace it. Draw another card.
The Algebra of Events
Complement of a Set
If U is a universal set and X is any subset of U, then the complement of X, written Xᶜ, is the set of all elements of U apart from the elements of X.
Venn diagram: the complement of X is the region of U lying outside X.
Union of Sets
If two sets A and B are given, then the union of A and B is the set that contains all the elements present in set A together with all the elements present in set B. This operation is represented as:
A ∪ B = { x : x ∈ A or x ∈ B }
Intersection of Sets
If two sets A and B are given, then the intersection of A and B is the subset of the universal set U which consists of the elements common to both A and B. It is denoted by the symbol '∩'. This operation is represented by:
A ∩ B = { x : x ∈ A and x ∈ B }
Venn diagram of the intersection of two sets; Venn diagram of the intersection of three sets.
Difference of Sets
The difference of two sets A and B is the set which consists of the elements present in A but not in B:
A − B = A ∩ Bᶜ
Example: If A = {1,2,3,4,5,6,7} and B = {6,7} are two sets, find A − B.
Solution: The difference of set A and set B is given by
A − B = {1,2,3,4,5}
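These set operations map directly onto Python's built-in set type; a small sketch using the example sets above together with an assumed universal set U (the choice of U is illustrative, not from the slides):

    # Set algebra with Python sets
    U = set(range(1, 11))              # assumed universal set {1, ..., 10}
    A = {1, 2, 3, 4, 5, 6, 7}
    B = {6, 7}

    print(A | B)                       # union: {1, 2, 3, 4, 5, 6, 7}
    print(A & B)                       # intersection: {6, 7}
    print(A - B)                       # difference: {1, 2, 3, 4, 5}
    print(U - A)                       # complement of A in U: {8, 9, 10}
    print(A - B == A & (U - B))        # True, since A − B = A ∩ Bᶜ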
The Algebra of Events
We write E ⊂ F if E is a subset of F.
For events we write Eᶜ instead of Ē for the complement, E ∪ F for the union, and E ∩ F for the intersection.
If the sample space S is finite, then we typically allow any subset of S to be an event.
EXAMPLE: For a finite sample space S, the possible events are all subsets of S, including the empty set ∅ and S itself.
General Definition of Probability
A probability function P assigns a real number (the probability of E) to every event E in a sample space S.
P(·) must satisfy the following basic properties:
1. 0 ≤ P(E) ≤ 1 for every event E,
2. P(S) = 1,
3. P(E ∪ F) = P(E) + P(F) whenever E and F are mutually exclusive (E ∩ F = ∅).
Basic Theorems of Probability
THEOREM 1: Complementation Rule
P(Eᶜ) = 1 − P(E).
Thus the probability of an event can be obtained from the probability of its complement.
EXAMPLE:
What is the probability of at least one "H" in four tosses of a coin?
SOLUTION: The sample space S has 16 equally likely outcomes, and only one of them (TTTT) contains no "H". Hence
P(at least one H) = 1 − P(no H) = 1 − 1/16 = 15/16.
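This result can also be checked by brute-force enumeration; a minimal Python sketch:

    # Verify P(at least one H in 4 tosses) = 15/16 by enumeration
    from itertools import product

    tosses = list(product("HT", repeat=4))            # 16 equally likely outcomes
    favorable = [s for s in tosses if "H" in s]
    print(len(favorable), "/", len(tosses))           # 15 / 16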
THEOREM 2: Addition Rule for Mutually Exclusive Events
If A1, A2, . . . , Am are mutually exclusive events, then
P(A1 ∪ A2 ∪ . . . ∪ Am) = P(A1) + P(A2) + . . . + P(Am).
Example: If the probability that on any workday a garage will get less than 10, 10–20, 21–30, 31–40, or over 40 cars to service is 0.08, 0.20, 0.35, 0.25, and 0.12, respectively, what is the probability that on a given workday the garage gets at least 21 cars to service?
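Solution sketch: the events "21–30 cars", "31–40 cars", and "over 40 cars" are mutually exclusive, so by the addition rule
P(at least 21 cars) = 0.35 + 0.25 + 0.12 = 0.72.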
Example 3.4: In tossing a fair die, what is the probability of getting an odd number or a number less than 4?
Solution.
S = {1, 2, 3, 4, 5, 6}
Let A be the event "Odd number" and B the event "Number less than 4,"
where A = {1, 3, 5}, B = {1, 2, 3}, and A ∩ B = {1, 3}.
Then
P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 3/6 + 3/6 − 2/6 = 4/6 = 2/3.
Conditional Probability
Definition
It is the probability of an event B under the condition that an event A occurs. This probability is called the conditional probability of B given A and is denoted by P(B|A):
P(B|A) = P(A ∩ B) / P(A),   provided P(A) ≠ 0.
Example 3.4: In tossing a fair die, what is the probability of getting an odd number under the condition that the number is less than 4?
Solution.
S = {1, 2, 3, 4, 5, 6}
Let A be the event "Odd number" and B the event "Number less than 4."
Now A ∩ B = {1, 3}, so P(A ∩ B) = 2/6, and P(B) = 3/6.
Then
P(A|B) = P(A ∩ B) / P(B) = (2/6) / (3/6) = 2/3.
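The same answer can be obtained by direct enumeration; a minimal Python sketch:

    # Conditional probability on a fair die: P(odd | less than 4)
    S = {1, 2, 3, 4, 5, 6}
    A = {s for s in S if s % 2 == 1}      # odd numbers
    B = {s for s in S if s < 4}           # numbers less than 4

    p_B = len(B) / len(S)
    p_A_and_B = len(A & B) / len(S)
    print(p_A_and_B / p_B)                # 0.666... = 2/3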
An Example
The Academy Awards (Oscars) show is soon to be broadcast.
For a specific married couple the probability that the husband watches the show is 80%, the probability that his wife watches the show is 65%, and the probability that they both watch the show is 60%.
If the husband is watching the show, what is the probability that his wife is also watching the show?
Solution:
Let B = the event that the husband watches the show: P[B] = 0.80.
Let A = the event that his wife watches the show: P[A] = 0.65.
Also P[A ∩ B] = 0.60.
Then
P[A|B] = P[A ∩ B] / P[B] = 0.60 / 0.80 = 0.75.
THEOREM 4: Multiplication Rule
P(A ∩ B) = P(A) P(B|A) = P(B) P(A|B).
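For instance (an illustration, not from the original slides): the probability of drawing two aces in a row from a well-shuffled 52-card deck without replacement is
P(ace, then ace) = P(first ace) · P(second ace | first ace) = (4/52)(3/51) = 1/221.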
Independence
Definition
Two events A and B are called independent if
P(A ∩ B) = P(A) P(B).
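For instance (an illustration, not from the original slides): in one roll of a fair die, let A = "even number" = {2, 4, 6} and B = "number at most 4" = {1, 2, 3, 4}. Then P(A) = 1/2, P(B) = 2/3, and P(A ∩ B) = P({2, 4}) = 1/3 = P(A) P(B), so A and B are independent.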
DISCRETE RANDOM VARIABLES
DEFINITION: A discrete random variable is a function X(·): S → R from a finite or countably infinite sample space S to the real numbers.
EXAMPLE: Toss a coin 3 times in sequence. The sample space is
S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}.
With X(s) = the number of Heads in s, the value X = 2 corresponds to the event {HHT, HTH, THH}, and we have
P(0 < X ≤ 2) = P(X = 1) + P(X = 2) = 3/8 + 3/8 = 6/8.
Graphical representation of X:
E0 = {TTT} → X = 0
E1 = {HTT, THT, TTH} → X = 1
E2 = {HHT, HTH, THH} → X = 2
E3 = {HHH} → X = 3
With X(s) = the number of Heads, we have
f(0) ≡ P(X = 0) = 1/8
f(1) ≡ P(X = 1) = 3/8
f(2) ≡ P(X = 2) = 3/8
f(3) ≡ P(X = 3) = 1/8
where f(x) ≡ P(X = x).
The graph of f(x).
DEFINITION:
f(x) ≡ P(X = x)
is called the probability function.
DEFINITION:
F(x) ≡ P(X ≤ x) ≡ Σ_{xj ≤ x} f(xj)
is called the (cumulative) probability distribution function.
PROPERTIES:
f(x) ≥ 0 and Σ_x f(x) = 1; F(x) is non-decreasing, with F(x) → 0 as x → −∞ and F(x) → 1 as x → ∞.
EXAMPLE: With X(s) = the number of Heads, and
S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT},
p(0) = 1/8, p(1) = 3/8, p(2) = 3/8, p(3) = 1/8,
we have the probability distribution function
F(x) ≡ P(X ≤ x) = 0   for x < 0
F(0) ≡ P(X ≤ 0) = 1/8
F(1) ≡ P(X ≤ 1) = 4/8
F(2) ≡ P(X ≤ 2) = 7/8
F(3) ≡ P(X ≤ 3) = 1
F(x) ≡ P(X ≤ x) = 1   for x ≥ 3
The graph of the probability distribution function.
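Both f(x) and F(x) above can be reproduced by enumerating the sample space; a minimal Python sketch:

    # pmf f(x) and CDF F(x) for X = number of Heads in 3 coin tosses
    from itertools import product
    from fractions import Fraction

    S = list(product("HT", repeat=3))                           # 8 equally likely outcomes
    X = [s.count("H") for s in S]

    f = {x: Fraction(X.count(x), len(S)) for x in range(4)}     # f(x) = P(X = x)
    F = {x: sum(f[k] for k in range(x + 1)) for x in range(4)}  # F(x) = P(X <= x)

    print(f)   # {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}
    print(F)   # {0: 1/8, 1: 1/2, 2: 7/8, 3: 1}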
Mean, Energy, and Variance of a Random Variable
The mean of a discrete random variable X is a weighted average of the possible values that the random variable can take. The mean is also known as the expected value of X, E(X).
The mean at time n, μx(n), is calculated by
μx(n) = E[x(n)] = Σ_i xi f(xi; n),
where the sum is over the possible values xi of x(n).
We can also use the ensemble of realizations to obtain the mean value using the frequency interpretation formula:
μx(n) ≈ (1/N) Σ_{i=1}^{N} xi(n),
where xi(n) is the outcome at sample index n (or time n) of the i'th realization and N is the number of realizations.
Depending on the type of the rv (random variable), the mean value may or may not vary with time.
For an ergodic process, we find the sample mean (an estimator of the mean) using the time-average formula
m̂x = (1/N) Σ_{n=1}^{N} x(n).
The energy (mean-square value) is calculated by
E[x²(n)] = Σ_i xi² f(xi; n),
and the variance is σx²(n) = E[x²(n)] − μx(n)².
Example: Determine the values of the statistical mean, energy, and variance of a random variable X characterized by the discrete uniform PDF f(xi) = 1/M, with xi = 1, 2, . . . , M.
Solution:
Mean:     E[X] = (1/M) Σ_{i=1}^{M} i = (M + 1)/2
Energy:   E[X²] = (1/M) Σ_{i=1}^{M} i² = (M + 1)(2M + 1)/6
Variance: σx² = E[X²] − (E[X])² = (M² − 1)/12
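A quick numerical check of these formulas for a particular M (here M = 6, a fair die; the value is an arbitrary choice):

    # Check mean, energy, and variance of the discrete uniform pdf on {1, ..., M}
    M = 6
    xs = range(1, M + 1)

    mean = sum(xs) / M                       # matches (M + 1)/2          = 3.5
    energy = sum(x * x for x in xs) / M      # matches (M + 1)(2M + 1)/6  = 15.1666...
    var = energy - mean ** 2                 # matches (M**2 - 1)/12      = 2.9166...

    print(mean, (M + 1) / 2)
    print(energy, (M + 1) * (2 * M + 1) / 6)
    print(var, (M ** 2 - 1) / 12)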
Correlation
The cross-correlation between two random sequences x(n) and y(n) is defined by
rxy(n1, n2) = E[ x(n1) y(n2) ].
If the mean value is zero, then the covariance and the correlation function are identical.
Independent and uncorrelated rv's
If two rv's are statistically independent, then the joint pdf can be separated into the product of two marginal pdf's:
fxy(x, y) = fx(x) fy(y).
Ensemble of Sample Functions
The set of all possible sample functions is called the ENSEMBLE.
Random Processes
• A general random or stochastic process can be described as:
  – a collection of time functions (signals) corresponding to various outcomes of random experiments;
  – a collection of random variables observed at different times.
• Examples of random processes in communications:
  – channel noise,
  – information generated by a source,
  – interference.
Random Process
• A random process X(w, t) is a time-varying function of t and w, where t is usually time and w is an element of the sample space Ω. Therefore, there are two ways to look at a random process.
• Case 1: t (time) is fixed and w is changing: for each fixed t, X(w, t) is a random variable.
• Case 2: w is fixed and t is changing: for each fixed w, X(w, t) is a single time function, called a sample function or realization.
• Intuition: Consider the random process as a bag of time-domain signals (functions, realizations). Every time you put your hand in the bag you grab one of the signals, but you do not know which one you are taking (that is where the randomness comes from).
Stationarity
• Definition: A random process is STATIONARY to the order N if, for any t1, t2, . . . , tN, the joint pdf satisfies
  fx(x1, . . . , xN; t1, . . . , tN) = fx(x1, . . . , xN; t1 + t0, . . . , tN + t0) for any time shift t0.
Example of First-Order Stationarity
Ergodic Processes
• Definition: A random process is ERGODIC if all time averages of any sample function are equal to the corresponding ensemble averages (expectations).
• For example, for ergodic processes we can use ensemble statistics to compute DC values and RMS values:
  ⟨x(t)⟩ = E[x] = ∫ x f(x) dx = mx   (ensemble average)
  xRMS = √(⟨x²(t)⟩) = √(σx² + mx²)
• Ergodic processes are always stationary; stationary processes are not necessarily ergodic.
  Ergodic ⇒ Stationary
Example: Ergodic Process
The RANDOM PROCESS is x(t) = A sin(ω0 t + θ0)
• A and ω0 are constants; θ0 is a uniformly distributed RV on [−π, π); t is time.
• Mean (ensemble statistics):
  mx = E[x] = ∫ x f(θ) dθ = (1/2π) ∫_{−π}^{π} A sin(ω0 t + θ) dθ = 0
• Variance:
  σx² = (1/2π) ∫_{−π}^{π} A² sin²(ω0 t + θ) dθ = A²/2
Example: Ergodic Process
• Mean (time average), with T large:
  ⟨x(t)⟩ = lim_{T→∞} (1/T) ∫_0^T A sin(ω0 t + θ0) dt = 0
• Variance (time average):
  ⟨x²(t)⟩ = lim_{T→∞} (1/T) ∫_0^T A² sin²(ω0 t + θ0) dt = A²/2
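The agreement between ensemble and time averages for this process can be illustrated numerically; a simulation sketch (the values of A, ω0, the time grid, and the number of realizations are arbitrary choices, not from the slides):

    # Compare ensemble and time averages for x(t) = A*sin(w0*t + theta0),
    # where theta0 is uniform on [-pi, pi)
    import numpy as np

    rng = np.random.default_rng(0)
    A, w0 = 2.0, 2 * np.pi              # arbitrary amplitude and angular frequency
    t = np.linspace(0, 100, 20000)      # long time grid for time averaging

    # Ensemble statistics at a fixed time t = 1.0, over many realizations
    thetas = rng.uniform(-np.pi, np.pi, 100000)
    x_fixed_t = A * np.sin(w0 * 1.0 + thetas)
    print(x_fixed_t.mean())             # ~ 0
    print((x_fixed_t ** 2).mean())      # ~ A**2 / 2 = 2.0

    # Time statistics of a single realization (one random theta0)
    theta0 = rng.uniform(-np.pi, np.pi)
    x_one = A * np.sin(w0 * t + theta0)
    print(x_one.mean())                 # ~ 0
    print((x_one ** 2).mean())          # ~ A**2 / 2 = 2.0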
EXERCISE
• Write down the definition of:
  – Wide-sense stationary
  – Ergodic processes
• How do these concepts relate to each other?
• Consider x(t) = K, where K is uniformly distributed on [−1, 1]:
  – WSS?
  – Ergodic?
Autocorrelation of a Random Process
• The autocorrelation function of a real random process x(t) at two times t1 and t2 is:
  Rx(t1, t2) = E[ x(t1) x(t2) ] = ∫∫ x1 x2 fx(x1, x2) dx1 dx2
Wide-Sense Stationary
• A random process that is stationary to order 2 or greater is wide-sense stationary.
• A random process is wide-sense stationary if:
  E[x(t)] = constant
  Rx(t1, t2) = Rx(τ)
• Usually, t1 = t and t2 = t + τ, so that t2 − t1 = τ.
• A wide-sense stationary process does not DRIFT with time.
• The autocorrelation depends only on the time gap τ, not on where the time difference is taken.
• The autocorrelation gives an idea about the frequency content of the RP.
Autocorrelation Function of a RP
• Properties of the autocorrelation function of wide-sense stationary processes:
  Rx(0) = E[x²(t)]     (second moment)
  Rx(−τ) = Rx(τ)       (symmetric)
  |Rx(τ)| ≤ Rx(0)      (maximum value at τ = 0)
Cross-Correlations of RPs
• The cross-correlation of two RPs x(t) and y(t) is defined similarly as:
  Rxy(t1, t2) = E[ x(t1) y(t2) ] = ∫∫ x1 y2 fxy(x1, y2) dx1 dy2
2. Gaussian process
• The pdf of a Gaussian rv x(n) at time n is given by
  f(x) = (1 / √(2π σx²)) exp( −(x − mx)² / (2σx²) )
4. Lognormal distribution
5. Chi-Square distribution
Wiener–Khintchin relations
• For a WSS process, the power spectrum (power spectral density) is the Fourier transform of the autocorrelation function:
  Sx(ω) = ∫_{−∞}^{∞} Rx(τ) e^{−jωτ} dτ,
  and conversely Rx(τ) = (1/2π) ∫_{−∞}^{∞} Sx(ω) e^{jωτ} dω.
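In practice these quantities are estimated from data; a minimal numpy sketch that estimates the autocorrelation of a zero-mean white-noise sequence and its power spectrum (the signal, its length, and the normalization are illustrative choices, not from the slides):

    # Estimate the autocorrelation and power spectral density of a WSS sequence
    import numpy as np

    rng = np.random.default_rng(1)
    N = 4096
    x = rng.standard_normal(N)                 # zero-mean white noise, variance ~ 1

    # Biased autocorrelation estimate R[k] = (1/N) * sum_n x[n] x[n+k], k >= 0
    R = np.correlate(x, x, mode="full")[N - 1:] / N
    print(R[0])                                # ~ variance = 1

    # Periodogram PSD estimate; by the Wiener-Khintchin relation it is the
    # Fourier transform of the (biased) autocorrelation estimate
    S = np.abs(np.fft.rfft(x)) ** 2 / N
    print(S.mean())                            # ~ flat level of 1 for unit-variance white noise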