
CHAPTER 1

Background

Dr. SAAD MUHI FALIH

Definitions

• Outcome – a particular result of an experiment.

• Sample Space – the set of all possible outcomes. Outcomes
cannot overlap, and all outcomes must be represented. A sample
space can be found by:
1. A List
2. A Grid
3. A Tree Diagram

• Event – any subset of outcomes.
SAMPLE SPACES

EXAMPLE : When we flip a coin, the sample space is

S = { H , T } ,

where H denotes that the coin lands "Heads up"
and T denotes that the coin lands "Tails up".

For a "fair coin" we expect H and T to have the same
"chance" of occurring, i.e., if we flip the coin many times
then about 50 % of the outcomes will be H.

We say that the probability of H occurring is 0.5 (or 50 %).
The probability of T occurring is then also 0.5.
Sample Space Diagrams

EXAMPLE : Imagine one die is red and one is blue. Recording the
total score gives 36 equally likely outcomes:

  +  | 1  2  3  4  5  6   (Die 1)
 ----+-------------------
  1  | 2  3  4  5  6  7
  2  | 3  4  5  6  7  8
  3  | 4  5  6  7  8  9
  4  | 5  6  7  8  9 10
  5  | 6  7  8  9 10 11
  6  | 7  8  9 10 11 12
 (Die 2)

How many ways can we get a 7? There are 6, so

P(7) = 6/36 = 1/6 ,   P(12) = 1/36 ,   P(6) = 5/36 .
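These probabilities can be checked by brute-force enumeration; a minimal Python sketch (not part of the original slides):

from itertools import product

# Enumerate all 36 equally likely outcomes of two fair dice.
outcomes = [d1 + d2 for d1, d2 in product(range(1, 7), repeat=2)]

for total in (7, 12, 6):
    count = outcomes.count(total)
    print(f"P({total}) = {count}/36 = {count / 36:.4f}")
# P(7) = 6/36, P(12) = 1/36, P(6) = 5/36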
EXAMPLE :
When we toss a coin 3 times and record the results in the
sequence that they occur, then the sample space is

S = { HHH , HHT , HTH , HTT , THH , THT , TTH , TTT } .

Elements of S are "vectors", "sequences", or "ordered outcomes".

We may expect each of the 8 outcomes to be equally likely.
Thus the probability of the sequence HTT is 1/8.

The probability that a sequence contains precisely two
Heads is

P(HHT) + P(HTH) + P(THH) = 1/8 + 1/8 + 1/8 = 3/8 .
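The same count can be verified by enumerating the 8 sequences; a sketch (not from the original slides):

from itertools import product

seqs = list(product("HT", repeat=3))           # all 8 ordered outcomes
two_heads = [s for s in seqs if s.count("H") == 2]
print(len(two_heads), "/", len(seqs))          # 3 / 8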
EXAMPLE : When we toss a coin 3 times and record the results
without paying attention to the order in which they occur,
e.g., if we only record the number of Heads, then the sample
space is

S = { {H,H,H} , {H,H,T} , {H,T,T} , {T,T,T} } .

The outcomes in S are now sets ; i.e., order is not
important. Recall that the ordered outcomes are

{ HHH , HHT , HTH , HTT , THH , THT , TTH , TTT } .

Note that

{H,H,H}  corresponds to  one    of the ordered outcomes,
{H,H,T}       ,,         three        ,,
{H,T,T}       ,,         three        ,,
{T,T,T}       ,,         one          ,,

Thus {H,H,H} and {T,T,T} each occur with probability 1/8,
while {H,H,T} and {H,T,T} each occur with probability 3/8.
Events

In Probability Theory subsets of the sample space are called
events.

EXAMPLE : The set of basic outcomes of rolling a die once is

S = { 1 , 2 , 3 , 4 , 5 , 6 } ,

so the subset E = { 2 , 4 , 6 } is an example of an event.

If a die is rolled once and it lands with a 2 or a 4 or a 6 up,
then we say that the event E has occurred.

We have already seen that the probability that E occurs is

P(E) = 1/6 + 1/6 + 1/6 = 1/2 .
Independent / Dependent
Definition

• Two events E and F are independent if the
occurrence of event E in a probability
experiment does not affect the probability of
event F.

• Two events E and F are dependent if the
occurrence of event E in a probability experiment
affects the probability of event F.
Examples

Independent:                        Dependent:
Draw a card from a deck of cards.   Draw a card from a deck.
Replace it.                         Do NOT replace it.
Draw another card.                  Draw another card.

Roll a die and get a 6.             Eating too many calories.
Roll a 2nd die and get a 3.         Putting on weight.
The Algebra of Events
Complement of a Set
If U is a universal set and X is any subset of U, then the
complement of X is the set of all elements of U apart
from the elements of X:

X' = { x : x ∈ U and x ∉ X } .

(Venn diagram of the complement omitted.)
Union of Sets
If two sets A and B are given, then the union of A and B is the
set that contains all the elements present in set A or set B.
This operation can be represented as

A ∪ B = { x : x ∈ A or x ∈ B } ,

where x denotes an element of A, of B, or of both.

Example: If A = {1,2,3,4} and B = {6,7},
Solution: Union of sets, A ∪ B = {1,2,3,4,6,7}.

(Venn diagram of the union of sets omitted.)
Intersection of Sets
If two sets A and B are given, then the intersection of A and B
is the subset of the universal set U which consists of the
elements common to both A and B. It is denoted by the symbol '∩'.
This operation is represented by

A ∩ B = { x : x ∈ A and x ∈ B } ,

where x is an element common to both sets A and B.

Example: Let A = {1,2,3} and B = {3,4,5}.

Solution: A ∩ B = {3}, because 3 is the only element common to both sets.
(Venn diagrams of the intersection of two sets and of three sets omitted.)

Difference of Sets
The difference of two sets A and B is the set which
consists of the elements present in A but not in B:

A − B = A ∩ B^c .

Example: If A = {1,2,3,4,5,6,7} and B = {6,7} are two sets,
Solution: the difference of set A and set B is given by

A − B = {1,2,3,4,5} .
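All of these operations map directly onto Python's built-in set type; a quick sketch using the examples above (not part of the original slides):

# Examples from the slides above
print({1, 2, 3, 4} | {6, 7})            # union: {1, 2, 3, 4, 6, 7}
print({1, 2, 3} & {3, 4, 5})            # intersection: {3}
print({1, 2, 3, 4, 5, 6, 7} - {6, 7})   # difference: {1, 2, 3, 4, 5}

U = {1, 2, 3, 4, 5}                      # hypothetical universal set
X = {2, 4}
print(U - X)                             # complement of X in U: {1, 3, 5}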
The Algebra of Events

We write E ⊂ F if E is a subset of F.

Example: If A = {1,2,3,4,5} and B = {2,3},

then B ⊂ A.

REMARK : In Probability Theory we use

E^c    instead of   E̅ ,
EF     instead of   E ∩ F ,
E ⊂ F  instead of   E ⊆ F .
If the sample space S is finite then we typically allow any
subset of S to be an event.

EXAMPLE : If we randomly draw one character from a box
containing the characters a, b, and c, then the sample
space is

S = { a , b , c } ,

and there are 8 possible events, namely, those in the set of
events

{ ∅ , {a} , {b} , {c} , {a,b} , {a,c} , {b,c} , {a,b,c} } .

If the outcomes a, b, and c are equally likely to occur, then

P(∅) = 0 ,  P({a}) = P({b}) = P({c}) = 1/3 ,
P({a,b}) = P({a,c}) = P({b,c}) = 2/3 ,  P(S) = 1 .
EXAMPLE : If the sample space is

S = { s1 , s2 , . . . , sn } ,

then there are 2^n possible events (all subsets of S).
General Definition of Probability
A probability function P assigns a real number P(E) (the
probability of E) to every event E in a sample space S.

P(·) must satisfy the following basic properties:

• 0 ≤ P(E) ≤ 1 for every event E ,
• P(S) = 1 ,
• P(E1 ∪ E2 ∪ · · ·) = P(E1) + P(E2) + · · · for mutually
  exclusive events E1, E2, . . .
Basic Theorems of Probability

THEOREM 1 : Complementation Rule

P(E^c) = 1 − P(E) .

Thus P(E) = 1 − P(E^c) .

EXAMPLE :
What is the probability of at least one "H" in four tosses of a
coin?

SOLUTION : The sample space S will have 16 outcomes:

S = { HHHH, HHHT, HHTH, HHTT, HTHH, HTHT, HTTH, HTTT,
      THHH, THHT, THTH, THTT, TTHH, TTHT, TTTH, TTTT } .

P(at least one H) = 1 − P(no H) = 1 − 1/16 = 15/16 .
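The complementation rule is easy to check by enumeration; a small sketch, assuming a fair coin and independent tosses:

from itertools import product

seqs = list(product("HT", repeat=4))                # 16 outcomes
at_least_one_h = sum(1 for s in seqs if "H" in s)
print(at_least_one_h, "/", len(seqs))               # 15 / 16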
THEOREM 2 : Addition Rule for Mutually Exclusive Events

• For any disjoint events (mutually exclusive events) A1, A2, . . .
we have

P(A1 ∪ A2 ∪ · · ·) = P(A1) + P(A2) + · · · ,

• where Ai ∩ Aj = ∅ for all i ≠ j .

THEOREM 3 : Addition Rule for Arbitrary Events

For any arbitrary events A and B we have

P(A ∪ B) = P(A) + P(B) − P(A ∩ B) .
Example : If the probability that on any workday a
garage will get less than 10, 10–20, 21–30, 31–40,
or over 40 cars to service is 0.08, 0.20, 0.35,
0.25, and 0.12, respectively, what is the
probability that on a given workday the garage
gets at least 21 cars to service?

Solution. Since these are mutually exclusive events,
Theorem 2 gives the answer

P(gets at least 21 cars) = 0.35 + 0.25 + 0.12 = 0.72 .

Check this by the complementation rule:
1 − (0.08 + 0.20) = 0.72 .
Example 3.4 : In tossing a fair die, what is the probability of
getting an odd number or a number less than 4?

Solution.
S = { 1, 2, 3, 4, 5, 6 }
Let A be the event "Odd number"
and B the event "Number less than 4,"

where A = {1, 3, 5} and B = {1, 2, 3}, so A ∩ B = {1, 3} .

Then

P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 3/6 + 3/6 − 2/6 = 4/6 = 2/3 .
Conditional Probability

Definition
It is the probability of an event A under the
condition that an event B occurs.
This probability is called the conditional
probability of A given B and is denoted by

P(A | B) = P(A ∩ B) / P(B) ,   if P(B) ≠ 0 .
Example 3.4 : In tossing a fair die, what is the
probability of getting an odd number under the
condition that the number is less than 4?

Solution.
S = { 1, 2, 3, 4, 5, 6 }
Let A be the event "Odd number"
and B the event "Number less than 4."

Now A ∩ B = {1, 3} , so P(A ∩ B) = 2/6 = 1/3 ,

and P(B) = 3/6 = 1/2 .

Then

P(A | B) = P(A ∩ B) / P(B) = (1/3) / (1/2) = 2/3 .
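Over a finite, equally likely sample space, conditional probability reduces to counting; a sketch of this example (not from the original slides):

from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {s for s in S if s % 2 == 1}    # odd number
B = {s for s in S if s < 4}         # number less than 4

print(Fraction(len(A & B), len(B))) # P(A|B) = 2/3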
An Example
The Academy Awards (Oscars) show is soon to be shown.
For a specific married couple the probability
that the husband watches the show is 80%, the
probability that his wife watches the show is
65%, while the probability that they both watch
the show is 60%.
If the husband is watching the show, what is the
probability that his wife is also watching the show?

Solution:
Let B = the event that the husband watches the show,
P[B] = 0.80 .
Let A = the event that his wife watches the show,
P[A] = 0.65 ,
and
P[A ∩ B] = 0.60 .

P(A | B) = P(A ∩ B) / P(B) = 0.60 / 0.80 = 0.75 .
THEOREM 4 : Multiplication Rule

If A and B are events in a sample space S and P(A) ≠ 0, P(B) ≠ 0,
then

P(A ∩ B) = P(A | B) P(B) = P(B | A) P(A) .
Independence

Definition
Two events A and B are called independent if

P(A ∩ B) = P(A) P(B) .

Note: if P(B) ≠ 0 and P(A) ≠ 0, then

P(A | B) = P(A ∩ B) / P(B) = P(A) P(B) / P(B) = P(A) ,

and

P(B | A) = P(A ∩ B) / P(A) = P(A) P(B) / P(A) = P(B) .

Thus in the case of independence the conditional probability of
an event is not affected by the knowledge of the other event.
Difference between independence and mutually exclusive events

Two mutually exclusive events are independent only in
the special case where P(A) = 0 or P(B) = 0
(since then P(A ∩ B) = 0 = P(A) P(B)).
Mutually exclusive events are highly dependent otherwise:
A and B cannot occur simultaneously, so if one event
occurs the other event does not occur.

Please solve Problem Set 24.3 on page 1024.
Example 1.10 : A box contains 10 screws, three of which are
defective. Two screws are drawn at random. Find the probability
that neither of the two screws is defective (a) with replacement
and (b) without replacement.

Solution. We consider the events
A: First drawn screw nondefective.
B: Second drawn screw nondefective.

(a) With Replacement
P(A) = P(B) = 7/10 , and

P(A ∩ B) = P(A) P(B) = (7/10)(7/10) = 0.49 .

(b) Without Replacement
P(A) = 7/10 and P(B | A) = 6/9 = 2/3 , so

P(A ∩ B) = P(A) P(B | A) = (7/10)(2/3) = 7/15 ≈ 0.47 .
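A Monte Carlo check of both cases; a sketch (the sample size of 100000 trials is an arbitrary choice, not from the original slides):

import random

screws = ["good"] * 7 + ["bad"] * 3
N = 100_000

# (a) with replacement: two independent draws
with_repl = sum(
    all(random.choice(screws) == "good" for _ in range(2))
    for _ in range(N)
) / N
# (b) without replacement: two distinct screws
without_repl = sum(
    all(s == "good" for s in random.sample(screws, 2))
    for _ in range(N)
) / N

print(with_repl)      # ~0.49
print(without_repl)   # ~0.467 (= 7/15)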
Random Variables
The quantity that we observe in an experiment
will be denoted by X and called a random
variable (or stochastic variable). ("Stochastic"
means related to chance.)
DISCRETE RANDOM VARIABLES

DEFINITION : A discrete random variable is a function

X(·) : S → R

from a finite or countably infinite sample space S to
the real numbers.

EXAMPLE : Toss a coin 3 times in sequence. The sample
space is

S = { HHH , HHT , HTH , HTT , THH , THT , TTH , TTT } ,

and examples of random variables are

• X(s) = the number of Heads in the sequence ; e.g.,
  X(HTH) = 2 ,

• Y(s) = the index of the first H ; e.g., Y(TTH) = 3 ,
  and Y(s) = 0 if the sequence has no H , i.e., Y(TTT) = 0 .
Value-ranges of a random variable correspond to events in S.

EXAMPLE : For the sample space

S = { HHH , HHT , HTH , HTT , THH , THT , TTH , TTT } ,

with X(s) = the number of Heads, the value-range

0 < X ≤ 2

corresponds to the event

{ HHT , HTH , HTT , THH , THT , TTH } ,

and the values are 1 and 2.

NOTATION : If it is clear what S is, then we often just write
X instead of X(s).
Value-ranges of a random variable correspond to events in S,
and events in S have a probability.

Thus value-ranges of a random variable have a probability.

EXAMPLE : For the sample space

S = { HHH , HHT , HTH , HTT , THH , THT , TTH , TTT } ,

with X(s) = the number of Heads,

we have P(0 < X ≤ 2) = 6/8 .

QUESTION : What are the values of

P(X ≤ −1) , P(X ≤ 0) , P(X ≤ 1) , P(X ≤ 2) , P(X ≤ 3) ?
(Figure: graphical representation of X, mapping the events
E0 = {TTT}, E1 = {HTT, THT, TTH}, E2 = {HHT, HTH, THH},
E3 = {HHH} to the values 0, 1, 2, 3.)

The events E0, E1, E2, E3 are disjoint since X(s) is a
function! (X : S → R must be defined for all s ∈ S and must
be single-valued.)
Probability Function

We will write f(x) to denote P(X = x).

EXAMPLE : For the sample space

S = { HHH , HHT , HTH , HTT , THH , THT , TTH , TTT } ,

with
X(s) = the number of Heads,
we have

f(0) ≡ P(X = 0) = 1/8
f(1) ≡ P(X = 1) = 3/8
f(2) ≡ P(X = 2) = 3/8
f(3) ≡ P(X = 3) = 1/8

where f(x) = 0 for all other values of x.

(Figure: the graph of f(x) omitted.)
DEFINITION :

f(x) ≡ P(X = x)

is called the probability function.

DEFINITION :

F(x) ≡ P(X ≤ x) ≡ Σ_{xj ≤ x} f(xj)

is called the (cumulative) probability distribution function.

PROPERTIES :

• F(x) is a non-decreasing function of x .

• lim_{x→−∞} F(x) = 0 and lim_{x→∞} F(x) = 1 .
EXAMPLE : With X(s) = the number of Heads, and

S = { HHH , HHT , HTH , HTT , THH , THT , TTH , TTT } ,

p(0) = 1/8 , p(1) = 3/8 , p(2) = 3/8 , p(3) = 1/8 ,

we have the probability distribution function

F(−1) ≡ P(X ≤ −1) = 0
F(0)  ≡ P(X ≤ 0)  = 1/8
F(1)  ≡ P(X ≤ 1)  = 4/8
F(2)  ≡ P(X ≤ 2)  = 7/8
F(3)  ≡ P(X ≤ 3)  = 1
F(4)  ≡ P(X ≤ 4)  = 1

We see, for example, that

P(0 < X ≤ 2) = F(2) − F(0) = 7/8 − 1/8 = 6/8 .
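Both f(x) and F(x) for this example can be tabulated directly; a sketch (not part of the original slides):

from fractions import Fraction
from itertools import product

seqs = list(product("HT", repeat=3))
pmf = {x: Fraction(sum(1 for s in seqs if s.count("H") == x), 8)
       for x in range(4)}

cdf, running = {}, Fraction(0)
for x in range(4):
    running += pmf[x]
    cdf[x] = running

print(pmf)   # f: 1/8, 3/8, 3/8, 1/8 for x = 0..3
print(cdf)   # F: 1/8, 4/8, 7/8, 1   for x = 0..3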
(Figure: the graph of the probability distribution function F(x) omitted.)
Mean, Energy, and Variance of a Random Variable

The mean of a discrete random variable X is a weighted
average of the possible values that the random variable
can take. The mean is also known as the expected
value of X, E(X).

The mean at time n, μx(n), is calculated by

μx(n) = E[x(n)] = Σ_i xi f(xi) .

We can also use the ensemble of realizations to obtain the mean
value using the frequency interpretation formula:

μx(n) = lim_{N→∞} (1/N) Σ_{i=1}^{N} xi(n) ,

where xi(n) is the outcome at sample index n (or time n) of the
i-th realization.

• Depending on the type of the rv (random variable), the mean value
may or may not vary with time.

• For an ergodic process, we find the sample mean (estimator of the
mean) using the time-average formula

m̂x = (1/N) Σ_{n=1}^{N} x(n) .
The energy is calculated by

E[X²] = Σ_i xi² f(xi) .

The variance σ² of a discrete random variable X measures
the spread, or variability, of the distribution, and is defined by

σ² = E[(X − μ)²] = Σ_i (xi − μ)² f(xi) = E[X²] − μ² .

The standard deviation σ is the square root of the variance.
Example : Determine the values of the statistical mean, energy, and
variance of a random variable X characterized by the discrete
uniform PDF f(xi) = 1/M with xi = 1, 2, . . . , M.

Solution:

Mean:     μ = (1/M) Σ_{i=1}^{M} i = (M + 1)/2

Energy:   E[X²] = (1/M) Σ_{i=1}^{M} i² = (M + 1)(2M + 1)/6

Variance: σ² = E[X²] − μ² = (M² − 1)/12
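These closed forms are easy to sanity-check numerically; a sketch for M = 6 (an arbitrary choice, not from the original slides):

import numpy as np

M = 6
x = np.arange(1, M + 1)
p = np.full(M, 1 / M)        # discrete uniform PDF

mean = np.sum(x * p)         # (M+1)/2 = 3.5
energy = np.sum(x**2 * p)    # (M+1)(2M+1)/6 ≈ 15.17
var = energy - mean**2       # (M^2-1)/12 ≈ 2.92
print(mean, energy, var)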
Correlation
• The cross-correlation between two random sequences is defined by

rxy(m, n) = E[ x(m) y(n) ] .

• If x(n) = y(n), the correlation is known as the autocorrelation.

• Having an ensemble of realizations, the frequency interpretation of
the autocorrelation function is found using the formula

rx(m, n) = lim_{N→∞} (1/N) Σ_{i=1}^{N} xi(m) xi(n) .

Covariance
• The covariance of a random sequence is defined by

cx(m, n) = E[ (x(m) − μx(m)) (x(n) − μx(n)) ] .

The variance is found by setting m = n:

σx²(n) = cx(n, n) .

If the mean value is zero, then the covariance and the correlation
function are identical.
Independent and uncorrelated rv's
If two rv's are statistically independent, then the joint pdf can be
separated into the product of the two marginal pdf's:

f( x(m), x(n) ) = f( x(m) ) f( x(n) ) .

• The condition

E[ x(m) x(n) ] = E[ x(m) ] E[ x(n) ]

is a necessary and sufficient condition for the two random
variables x(m), x(n) to be uncorrelated.
• Note that independent random variables are always uncorrelated.
• However, the reverse is not necessarily true.
• If the mean value of any two uncorrelated rv's is zero, then the
random variables are called orthogonal.
• In general, two rv's are called orthogonal if their correlation is
zero.
Random Processes
• A RANDOM VARIABLE X is a rule for
assigning to every outcome, w, of an
experiment a number X(w).
– Note: X denotes a random variable and X(w) denotes
a particular value.
• A RANDOM PROCESS X(t) is a rule for
assigning to every w a function X(t,w).
– Note: for notational simplicity we often omit the
dependence on w.
Ensemble of Sample Functions
The set of all possible sample functions is called the
ENSEMBLE.
Random Processes
• A general Random or Stochastic Process can be described as:
– A collection of time functions (signals) corresponding to
various outcomes of random experiments.
– A collection of random variables observed at different times.
• Examples of random processes in communications:
– Channel noise,
– Information generated by a source,
– Interference.
Random Process
• A random process X(w,t) is a time-varying function of t and w,
where t is usually time and w is an element of the sample space
(Ω). Therefore, there are two ways to look at a random process.

• Case 1: t (time) is fixed and w is changing:
for a fixed time ti, X(w,ti) is a random variable.

• Case 2: w is fixed and t is changing:
• If w is fixed to wi, then we get a time-domain signal, or a
realization. Therefore a random process is a deterministic
time-domain function for a fixed w.
• The randomness in a random process is due to w, not t. Hence, w
is usually omitted in the definition of a random process; thus, we
sometimes refer to the random process X(w,t) as X(t).

• Intuition: Consider the random process as a bag of time-domain
signals (functions, realizations). Every time you put your hand in
the bag you grab one of the signals, but you do not know which one
you are taking (that is where the randomness comes from).
Stationarity
• Definition: A random process is STATIONARY to the
order N if for any t1, t2, . . . , tN and any t0,

fx{x(t1), x(t2), . . . , x(tN)} = fx{x(t1+t0), x(t2+t0), . . . , x(tN+t0)} .

• This means that the process behaves similarly (follows
the same PDF) regardless of when you measure it.

• A random process is said to be STRICTLY
STATIONARY if it is stationary to the order of N → ∞.

• Is the random process from the coin tossing experiment
stationary?
For a wide-sense (or weakly) stationary process, the cdf satisfies
the relationship

Fx( x(m), x(n) ) = Fx( x(m+k), x(n+k) )

for any m, n, and k.

• If the above relationship is true for any number of rv's of the
time series, then the process is known as a strictly stationary
process.

• The basic properties of a wide-sense stationary process are a
constant mean, μx(n) = μx, and an autocorrelation that depends only
on the lag, rx(m, n) = rx(m − n).
Example of First-Order Stationarity

RANDOM PROCESS : x(t) = A sin(ω0 t + θ0)

• Assume that A and ω0 are constants; θ0 is a
uniformly distributed RV on [−π, π); t is time.
• The PDF of x(t):

f_x(x) = 1 / ( π √(A² − x²) )   for |x| < A ,
f_x(x) = 0                      elsewhere.

• Note: there is NO dependence on time; the PDF
is not a function of t.
• The RP is STATIONARY.
Non-Stationary Example

RANDOM PROCESS : x(t) = A sin(ω0 t + θ0)

• Now assume that A, θ0, and ω0 are all constants; t
is time.
• The value of x(t) is then known for any time
with a probability of 1. Thus the first-order
PDF of x(t) is

f(x) = δ( x − A sin(ω0 t + θ0) ) .

• Note: the PDF depends on time, so the process is
NONSTATIONARY.
Autocorrelation Matrix
• If x = [x(0), x(1), . . . , x(N−1)]ᵀ is a vector representing a
finite random sequence, then the autocorrelation matrix is given by

Rx = E[ x xᵀ ] ,   with entries (Rx)mn = rx(m, n) .
Ergodic Processes
• Definition: A random process is ERGODIC if all time averages of
any sample function are equal to the corresponding ensemble
averages (expectations).
• For example, for ergodic processes we can use ensemble statistics
to compute DC values and RMS values:

x_DC = ⟨x(t)⟩ = E[x(t)] = m_x

⟨x(t)⟩ = lim_{T→∞} (1/T) ∫₀ᵀ x(t) dt          (time average)

E[x(t)] = ∫ x f(x) dx = m_x                    (ensemble average)

x_RMS = √⟨x²(t)⟩ = √E[x²] = √(σ² + m_x²)

• Ergodic processes are always stationary; stationary processes are
not necessarily ergodic.

Ergodic ⊂ Stationary
Example: Ergodic Process

RANDOM PROCESS : x(t) = A sin(ω0 t + θ0)

• A and ω0 are constants; θ0 is a uniformly
distributed RV on [−π, π); t is time.
• Mean (ensemble statistics):

m_x = E[x] = ∫ x f(θ) dθ = (1/2π) ∫_{−π}^{π} A sin(ω0 t + θ) dθ = 0

• Variance:

σ_x² = (1/2π) ∫_{−π}^{π} A² sin²(ω0 t + θ) dθ = A²/2
Example: Ergodic Process
• Mean (time average), T large:

⟨x(t)⟩ = lim_{T→∞} (1/T) ∫₀ᵀ A sin(ω0 t + θ) dt = 0

• Variance:

⟨x²(t)⟩ = lim_{T→∞} (1/T) ∫₀ᵀ A² sin²(ω0 t + θ) dt = A²/2

• The ensemble and time averages are the same, so the
process is ERGODIC.
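A numerical comparison of the two kinds of average for this sinusoid; a sketch (amplitude, frequency, and sample counts are arbitrary choices, not from the original slides):

import numpy as np

rng = np.random.default_rng(0)
A, w0 = 2.0, 2 * np.pi * 5
t = np.linspace(0, 10, 10_000)

# Ensemble average: many realizations, one fixed time t[0]
thetas = rng.uniform(-np.pi, np.pi, 10_000)
ens = A * np.sin(w0 * t[0] + thetas)
print(ens.mean(), (ens**2).mean())   # ~0 and ~A^2/2 = 2

# Time average: one realization, averaged over t
theta = rng.uniform(-np.pi, np.pi)
x = A * np.sin(w0 * t + theta)
print(x.mean(), (x**2).mean())       # ~0 and ~A^2/2 = 2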
EXERCISE
• Write down the definition of :
– Wide sense stationary
– Ergodic processes
• How do these concepts relate to each other?
• Consider: x(t) = K; K is uniformly distributed
between [-1, 1]
– WSS?
– Ergodic?

Autocorrelation of a Random Process
• The autocorrelation function of a real random
process x(t) at two times is:

Rx(t1, t2) = E[ x(t1) x(t2) ] = ∫∫ x1 x2 f_x(x1, x2) dx1 dx2
Wide-Sense Stationary
• A random process that is stationary to order 2 or greater is
Wide-Sense Stationary.
• A random process is Wide-Sense Stationary if:

⟨x(t)⟩ = E[x(t)] = constant ,
Rx(t1, t2) = Rx(τ) .

• Usually t1 = t and t2 = t + τ, so that t2 − t1 = τ.
• A wide-sense stationary process does not DRIFT with time.
• The autocorrelation depends only on the time gap τ, not on where
the time difference is taken.
• The autocorrelation gives an idea about the frequency response of
the RP.
Autocorrelation Function of a RP
• Properties of the autocorrelation function of wide-sense
stationary processes:

Rx(0) = E[x²(t)]        (second moment)
Rx(τ) = Rx(−τ)          (symmetric)
Rx(0) ≥ |Rx(τ)|         (maximum value at 0)

(Figure: autocorrelation of slowly and rapidly fluctuating random
processes omitted.)
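These properties can be observed on an estimated autocorrelation sequence; a sketch using the biased sample estimate (the filter length and lags are arbitrary choices):

import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(4096)
# Slowly fluctuating version: low-pass by a moving average
y = np.convolve(x, np.ones(32) / 32, mode="same")

def autocorr(z, max_lag):
    # Biased estimate (divide by n), which guarantees |r(k)| <= r(0)
    z = z - z.mean()
    n = len(z)
    return np.array([np.sum(z[:n - k] * z[k:]) / n for k in range(max_lag)])

Ry = autocorr(y, 64)
print(Ry[0] >= np.abs(Ry).max() - 1e-12)  # True: maximum at lag 0
# Ry decays slowly for the low-passed (slowly fluctuating) process.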
Cross-Correlations of RPs
• The cross-correlation of two RPs x(t) and y(t) is
defined similarly as:

Rxy(t1, t2) = E[ x(t1) y(t2) ] = ∫∫ x1 y2 f_xy(x1, y2) dx1 dy2

• If x(t) and y(t) are jointly stationary processes,

Rxy(t1, t2) = Rxy(t2 − t1) = Rxy(τ) ,   τ = t2 − t1 .

• If the RPs are jointly ERGODIC,

Rxy(τ) = ⟨x(t) y(t + τ)⟩ = E[ x(t) y(t + τ) ] .
Cross-Correlation Properties of Jointly Stationary RPs
• Some properties of cross-correlation functions are

Rxy(τ) = Ryx(−τ) ,
|Rxy(τ)| ≤ √( Rx(0) Ry(0) ) ,
|Rxy(τ)| ≤ (1/2) [ Rx(0) + Ry(0) ] .

• Uncorrelated:

Rxy(τ) = E[ x(t) y(t + τ) ] = m_x m_y .

• Orthogonal:

Rxy(τ) = 0 .

• Independent: if x(t1) and y(t2) are independent (the joint
distribution is the product of the individual distributions).
Moments of a Stochastic Process
• We can describe a stochastic process via its moments, i.e.,
E[Xt], E[Xt²], E[Xt Xs], etc. We often use the first two moments.

• The mean function of the process is E[Xt] = μt .

• The variance function of the process is Var[Xt] = σt² .

• The covariance function between Xt and Xs is

Cov[Xt, Xs] = E[ (Xt − μt)(Xs − μs) ] .

• The correlation function between Xt and Xs is

ρ(Xt, Xs) = Cov[Xt, Xs] / √( σt² σs² ) .

• These moments are often functions of time.


Special random signals and probability density functions

1. White Noise Processes
• A WSS discrete random sequence that satisfies the relation

rx(k) = σx² δ(k)

is a pure random sequence whose elements x(n) are statistically
independent and identically distributed (iid).
• Therefore, the zero-mean iid sequence has the following
correlation function:

rx(m, n) = E[ x(m) x(n) ] = σx² δ(m − n) .

• A random process consisting of a sequence of uncorrelated
Gaussian rv's is a white noise process referred to as white
Gaussian noise (WGN).
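A quick numerical check that white Gaussian noise has an impulse-like autocorrelation; a sketch (sigma and the sample count are arbitrary choices):

import numpy as np

rng = np.random.default_rng(2)
sigma = 1.5
x = rng.normal(0.0, sigma, 100_000)    # zero-mean WGN

r0 = np.mean(x * x)                     # lag 0: ~sigma^2 = 2.25
r1 = np.mean(x[:-1] * x[1:])            # lag 1: ~0
print(r0, r1)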
2. Gaussian Process
• The pdf of a Gaussian rv x(n) at time n is given by

f( x(n) ) = 1 / ( σ √(2π) ) · exp( −( x(n) − μ )² / ( 2σ² ) ) .

• A stochastic process is said to be a normal or Gaussian process
if its joint probability distribution is normal.
• A Gaussian process that is weakly stationary is also strictly
stationary, because the normal distribution is uniquely
characterized by its first two moments.
• The processes we will discuss are assumed to be Gaussian unless
mentioned otherwise.
• As in other areas of statistics, most time series results are
established for Gaussian processes.
3. Exponential Distribution

f(x) = λ e^(−λx) ,   x ≥ 0 .

4. Lognormal Distribution

f(x) = 1 / ( x σ √(2π) ) · exp( −( ln x − μ )² / ( 2σ² ) ) ,   x > 0 .

5. Chi-Square Distribution

f(x) = x^(k/2 − 1) e^(−x/2) / ( 2^(k/2) Γ(k/2) ) ,   x > 0 .
Wiener–Khintchin Relations
• For a WSS process, the power spectrum is given by

Px(ω) = Σ_{k=−∞}^{∞} rx(k) e^(−jkω) .

• This function is periodic with period 2π
(exp(−jk(ω + 2π)) = exp(−jkω)).

• The autocorrelation sequence is given by the inverse relation

rx(k) = (1/2π) ∫_{−π}^{π} Px(ω) e^(jkω) dω .

• For a real process, rx(k) = rx(−k) (a symmetric function), and as
a consequence the power spectrum is an even function.
• Furthermore, the power spectrum of a WSS process is nonnegative.
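The discrete pair can be illustrated numerically: the FFT of an estimated, symmetrized autocorrelation sequence approximates the power spectrum. A sketch using white noise, for which Px(ω) ≈ 1 (lag count and sample size are arbitrary choices):

import numpy as np

rng = np.random.default_rng(3)
x = rng.standard_normal(8192)                    # white, so Px(w) ~ 1
x = x - x.mean()

lags = 64
r = np.array([np.mean(x[:len(x) - k] * x[k:]) for k in range(lags)])
r_full = np.concatenate([r[:0:-1], r])           # enforce rx(k) = rx(-k)

Px = np.fft.fft(np.fft.ifftshift(r_full)).real   # real, even -> real spectrum
print(r[0], Px.mean())                           # both ~1 for white noise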
