Lecture 3: Discrete Random Variables

The document provides an overview of discrete random variables, including definitions, probability distributions, and key concepts such as the Probability Mass Function (PMF) and Cumulative Distribution Function (CDF). It covers specific distributions like Bernoulli, Binomial, and Poisson, detailing their properties, formulas, and examples. Additionally, it discusses the expected value, variance, and skewness associated with these distributions.


DISCRETE RANDOM VARIABLES

• A random variable is an uncertain quantity whose value depends on chance.
• A random variable may be discrete or continuous.
• We usually denote a random variable by X.
• A random variable is discrete if it takes only a countable number of values, for example the number obtained on tossing a die or the number of heads in a sequence of coin tosses.
• The probability distribution of a random variable is a rule that assigns probabilities to the different values of the random variable.
• For a discrete random variable X taking the values x1, x2, x3, x4, ..., the function PX(xi) = P(X = xi) is called the Probability Mass Function (PMF).
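As a minimal illustration (a sketch in Python, using a fair six-sided die as the example random variable), a PMF can be stored as a table of values and their probabilities:

from fractions import Fraction

# PMF of X = number shown on a fair six-sided die
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

print(pmf[3])                  # P(X = 3) = 1/6
print(sum(pmf.values()))       # the probabilities sum to 1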
Cumulative Distribution Function

• The Cumulative Distribution Function (CDF) of a discrete random variable X, denoted by FX(x), is the function defined by
FX(a) = P(X ≤ a) for −∞ < a < ∞
• Properties of the CDF
(i) P(a < X ≤ b) = FX(b) − FX(a)
(ii) FX(b) ≥ FX(a) if b > a (the CDF is non-decreasing)
(iii) FX(−∞) = 0
(iv) FX(∞) = 1
(v) 0 ≤ FX(a) ≤ 1
Cumulative Distribution Function

• Example: The cumulative distribution function of a discrete random variable X is FX(b) = b²/64.
Find the following probabilities: (a) P(X ≤ 2) (b) P(X > 4) (c) P(2 < X ≤ 4)
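A short sketch of the computation in Python, using only the CDF values FX(2) = 4/64 and FX(4) = 16/64 together with the properties above:

from fractions import Fraction

def F(b):
    # Given CDF: F_X(b) = b^2 / 64
    return Fraction(b * b, 64)

print(F(2))            # (a) P(X <= 2) = 4/64 = 1/16
print(1 - F(4))        # (b) P(X > 4) = 1 - 16/64 = 3/4
print(F(4) - F(2))     # (c) P(2 < X <= 4) = 12/64 = 3/16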
The Principle of Inclusion and Exclusion

• The Principle of Inclusion and Exclusion gives a formula for the size of the union of n finite sets.
• Usually the universe is finite too.
• It is a generalization of the familiar formulas
|A ∪ B| = |A| + |B| − |A ∩ B| and
|A ∪ B ∪ C| = |A| + |B| + |C| − |A ∩ B| − |A ∩ C| − |B ∩ C| + |A ∩ B ∩ C|.
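A quick Python check of the three-set formula on small sets (the sets are chosen arbitrarily for illustration):

A = {1, 2, 3, 4}
B = {3, 4, 5}
C = {4, 5, 6}

lhs = len(A | B | C)
rhs = (len(A) + len(B) + len(C)
       - len(A & B) - len(A & C) - len(B & C)
       + len(A & B & C))
print(lhs, rhs)        # both equal 6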
Expected Value and Variance of a Discrete Random Variable

• For a discrete random variable X with PMF PX(x), the expected value (mean) is E[X] = Σ x · PX(x)
• The variance is Var(X) = E[(X − E[X])²] = E[X²] − (E[X])², where E[X²] = Σ x² · PX(x)

DISCRETE PROBABILITY DISTRIBUTION

• Example: For the probability distribution given below, find the value of k, then calculate the mean and the variance and draw its graph.
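The original table is not reproduced here. As an illustration only, the Python sketch below assumes a hypothetical PMF of the form P(X = x) = kx for x = 1, 2, 3, 4 and shows how k, the mean and the variance would then be found:

from fractions import Fraction

# Hypothetical PMF: P(X = x) = k*x for x = 1, 2, 3, 4 (illustrative only)
xs = [1, 2, 3, 4]
k = Fraction(1, sum(xs))                                 # probabilities must sum to 1, so k = 1/10
pmf = {x: k * x for x in xs}

mean = sum(x * p for x, p in pmf.items())                # E[X] = 3
var = sum(x**2 * p for x, p in pmf.items()) - mean**2    # Var(X) = E[X^2] - (E[X])^2 = 1
print(k, mean, var)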
Bernoulli Distribution

• A random variable with two possible values, 0 and 1, is called a Bernoulli variable, and its distribution is the Bernoulli distribution.
• We write X ~ Bernoulli(p) or X ~ Ber(p).
• Ber(p) is a Bernoulli distribution with parameter p, the probability of success, where 0 ≤ p ≤ 1, and
p(1) = P(X = 1) = p
p(0) = P(X = 0) = 1 − p
E[X] = p, Var(X) = p(1 − p)
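A minimal Python check of these formulas (assuming scipy is available; p = 0.6 is just an illustrative value):

from scipy.stats import bernoulli

p = 0.6                     # illustrative probability of success
X = bernoulli(p)
print(X.pmf(1), X.pmf(0))   # P(X = 1) = 0.6, P(X = 0) = 0.4
print(X.mean(), X.var())    # E[X] = p = 0.6, Var(X) = p(1 - p) = 0.24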
Bernoulli Distribution

• Example: A basketball player can shoot a ball into the basket with probability 0.6. What is the probability that he misses the shot?
THE BINOMIAL DISTRIBUTION

• Let a random experiment be performed repeatedly, and let the occurrence of an event in a trial be called a success and its non-occurrence a failure.
• Consider a set of n independent trials (n being finite), in which the probability p of success in any trial is constant from trial to trial. Then q = 1 − p is the probability of failure in any trial.
• The random variable that counts the number of successes in n independent trials, with the same probability of success in each trial, is called a Binomial random variable.
THE BINOMIAL DISTRIBUTION

• Conditions for a Binomial Random Variable
1. There are only two mutually exclusive outcomes in each trial, i.e. S = {success, failure}
2. In repeated trials of the experiment, the probabilities of these outcomes remain constant
3. The outcomes of the trials are independent of one another
4. The number of trials n is finite
• The probability distribution of a Binomial random variable is called the Binomial distribution
BINOMIAL PROBABILITY FUNCTION

• To describe the distribution of a Binomial random variable we need two parameters, n and p. We write X ~ B(n, p) to indicate that X is Binomially distributed with n independent trials and probability of success p in each trial. The letter B stands for Binomial.
BINOMIAL PROBABILITY FUNCTION

• We know that there are nCx ways of getting x successes out of n trials.
• We also observe that each of these nCx possibilities has probability p^x (1 − p)^(n−x), corresponding to x successes and (n − x) failures. Therefore,
• P(X = x) = nCx p^x (1 − p)^(n−x) for x = 0, 1, 2, ..., n
• This equation is the Binomial probability formula.
BINOMIAL PROBABILITY FUNCTION

• If we denote the probability of failure by q, where q = 1 − p, then the Binomial probability formula is
• P(X = x) = nCx p^x q^(n−x) for x = 0, 1, 2, ..., n
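As a minimal sketch (parameter values chosen only for illustration), the formula can be evaluated directly and checked against scipy:

from math import comb
from scipy.stats import binom

n, p = 5, 0.5                                    # illustrative parameters
x = 2
manual = comb(n, x) * p**x * (1 - p)**(n - x)    # nCx * p^x * q^(n-x)
print(manual)                                    # 0.3125 = 10/32
print(binom.pmf(x, n, p))                        # same value from scipy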
BINOMIAL PROBABILITY FUNCTION

1. Expected Value or Mean
• The mean of a Binomial distribution is μ = E[X] = np
2. Variance
• The variance of a Binomial distribution is σ² = Var(X) = npq = np(1 − p)

BINOMIAL PROBABILITY FUNCTION

3. Skewness
• To bring out the skewness of a Binomial distribution we can calculate the coefficient of skewness, γ1
• If p is the probability of success and q the probability of failure, the coefficient of skewness of a Binomial distribution is given by
γ1 = (q − p) / √(npq)
BINOMIAL PROBABILITY FUNCTION

• From the formula above, we note that:
- The Binomial distribution is skewed to the right, i.e. has positive skewness, when γ1 > 0, which is so when p < q
- The Binomial distribution is skewed to the left, i.e. has negative skewness, when γ1 < 0, which is so when p > q
- The Binomial distribution is symmetrical, i.e. has no skewness, when γ1 = 0, which is so when p = q
- Thus, n being the same, the degree of skewness in a Binomial distribution tends to vanish as p approaches ½, i.e. as p → ½
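A small Python sketch of how the sign of γ1 tracks p relative to q (n = 10 and the three values of p are illustrative):

from math import sqrt

def binomial_skewness(n, p):
    # Coefficient of skewness gamma_1 = (q - p) / sqrt(n*p*q)
    q = 1 - p
    return (q - p) / sqrt(n * p * q)

for p in (0.2, 0.5, 0.8):                 # p < q, p = q, p > q
    print(p, binomial_skewness(10, p))    # positive, zero, negative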
BINOMIAL DISTRIBUTION

• Example: Assuming the probability of a male birth is ½, find the probability distribution of the number of boys out of 5 births.
(a) Find the probability that a family of 5 children has
(i) at least one boy
(ii) at most 3 boys
(b) Out of 960 families with 5 children each, find the expected number of families with (i) and (ii) above
BINOMIAL DISTRIBUTION

• X ~ B(5, ½)
• P(X = x) = 5Cx (½)^x (½)^(5−x), x = 0, 1, 2, 3, 4, 5
BINOMIAL DISTRIBUTION

• The probability distribution of the number of boys is
x : 0 1 2 3 4 5
P(X = x) : 1/32 5/32 10/32 10/32 5/32 1/32

BINOMIAL DISTRIBUTION

(a) The required probabilities are
(i) P(X ≥ 1) = 1 − P(X = 0)
= 1 − 1/32
= 31/32
(ii) P(X ≤ 3) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3)
= 1/32 + 5/32 + 10/32 + 10/32
= 26/32
BINOMIAL DISTRIBUTION

(b) Out of 960 families with 5 children each, the expected number of families with
(i) at least one boy = 960 × P(X ≥ 1)
= 960 × 31/32
= 930
(ii) at most 3 boys = 960 × P(X ≤ 3)
= 960 × 26/32
= 780
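A short Python check of these figures (a sketch using scipy's Binomial distribution):

from scipy.stats import binom

n, p = 5, 0.5
p_at_least_one = 1 - binom.pmf(0, n, p)               # P(X >= 1) = 31/32
p_at_most_three = binom.cdf(3, n, p)                  # P(X <= 3) = 26/32
print(p_at_least_one, p_at_most_three)
print(960 * p_at_least_one, 960 * p_at_most_three)    # 930.0, 780.0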
BINOMIAL DISTRIBUTION

• Example: A coin is tossed ten times. Find the probability of getting
(a) at least seven heads
(b) at least nine heads
(c) at most eight heads
(d) at least two heads
(e) at most ten heads
(f) less than seven heads
• Example: Teams A and B play a game in which their chances of winning are in the ratio 3 : 2. Find their chances of winning at least three games out of the five games played.
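A hedged Python sketch for these two exercises; it assumes a fair coin for the first and win probabilities of 3/5 and 2/5 for the second, as stated:

from scipy.stats import binom

# Coin tossed 10 times, p = 1/2
coin = binom(10, 0.5)
print(1 - coin.cdf(6))      # (a) P(X >= 7)
print(1 - coin.cdf(8))      # (b) P(X >= 9)
print(coin.cdf(8))          # (c) P(X <= 8)
print(1 - coin.cdf(1))      # (d) P(X >= 2)
print(coin.cdf(10))         # (e) P(X <= 10) = 1
print(coin.cdf(6))          # (f) P(X < 7)

# Five games; team A wins each game with probability 3/5, team B with 2/5
print(binom(5, 3/5).sf(2))  # P(A wins at least 3 games); sf(2) = P(X > 2)
print(binom(5, 2/5).sf(2))  # P(B wins at least 3 games)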
BINOMIAL DISTRIBUTION

• Example: Comment on the following statement:
"The mean of a binomial distribution is 3 and its variance is 4."
• Example: The mean and variance of a binomial distribution are 4 and 4/3 respectively. Find
(i) P(X ≥ 1)
(ii) E(X²)
(iii) the coefficient of skewness of the binomial distribution
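For the first statement, recall that the variance npq of a Binomial distribution is always smaller than its mean np, so a variance larger than the mean is not possible. For the second, a minimal Python sketch recovers n, p and q from the given mean np = 4 and variance npq = 4/3 and then applies the formulas above:

from math import sqrt
from scipy.stats import binom

mean, var = 4, 4/3
q = var / mean                      # npq / np = q = 1/3
p = 1 - q                           # p = 2/3
n = round(mean / p)                 # n = 6

print(1 - binom.pmf(0, n, p))       # (i)  P(X >= 1) = 1 - (1/3)^6
print(var + mean**2)                # (ii) E[X^2] = Var(X) + (E[X])^2 = 52/3
print((q - p) / sqrt(n * p * q))    # (iii) gamma_1 = (q - p)/sqrt(npq)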
THE POISSON DISTRIBUTION

• Poisson process: a very large population of independent events, where each event has a very small probability of occurring and the average number of occurrences in a given range is roughly stable.
THE POISSON DISTRIBUTION

• A Poisson distribution is valid under the following conditions:
1. The number of trials n is infinitely large, i.e. n → ∞
2. The constant probability of success p for each trial is infinitely small, i.e. p → 0 (obviously q → 1)
3. np = μ is finite
• We can develop the Poisson probability rule from the Binomial probability rule under the above conditions.
THE POISSON DISTRIBUTION

• In this limit the Binomial probability formula tends to the Poisson probability formula
P(X = x) = e^(−μ) μ^x / x! for x = 0, 1, 2, ...
where μ = np is the mean number of occurrences.
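A small Python sketch of this limit (the values of μ and n are illustrative): the Binomial PMF with np = μ held fixed is compared with the Poisson PMF as n grows:

from scipy.stats import binom, poisson

mu = 2.0
for n in (10, 100, 10_000):
    p = mu / n                                                # keep np = mu fixed
    print(n, [round(binom.pmf(x, n, p), 4) for x in range(5)])
print('Poisson', [round(poisson.pmf(x, mu), 4) for x in range(5)])
# the Binomial probabilities approach the Poisson probabilities as n grows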

• A Poisson distribution may be expected in situations where the chance of occurrence of any event is small, and we are interested in the occurrence of the event and not in its non-occurrence.
• For example, the number of road accidents, or the number of deaths in a flood, from snakebite or from a rare disease, etc.
• In these situations we know how many times the event occurs, even though its probability is very small, but we do not know how many times it does not occur.
Characteristics of a Poisson Distribution

• Expected Value or Mean
The expected value or mean of a Poisson distribution is its parameter μ, i.e. mean = μ
• Variance
The variance, denoted by σ², of a Poisson distribution is also equal to μ, i.e. σ² = μ
• Skewness
The coefficient of skewness γ1 of a Poisson distribution is the reciprocal of the square root of the mean μ, i.e.
γ1 = μ^(−1/2) = 1/√μ
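A brief Python check of these three characteristics (μ = 2 is only an illustrative value):

from scipy.stats import poisson

mu = 2.0
mean, var, skew = poisson.stats(mu, moments='mvs')
print(mean, var, skew)    # 2.0, 2.0, 1/sqrt(2) ≈ 0.7071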
Characteristics of a Poisson Distribution

• Evaluating γ1 = μ^(−1/2), we note that a Poisson distribution is always skewed to the right, i.e. it has positive skewness.
• The degree of skewness in a Poisson distribution decreases as the value of μ increases.
Poisson Distribution

• Example: At a parking place the average number of car arrivals during a specified period of 15 minutes is 2. If the arrival process is well described by a Poisson process, find the probability that during a given period of 15 minutes
(a) no car will arrive
(b) at least two cars will arrive
(c) at most three cars will arrive
(d) between 1 and 3 cars will arrive, inclusive
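A minimal Python sketch for this example, taking μ = 2 arrivals per 15-minute period as stated:

from scipy.stats import poisson

mu = 2                                            # average arrivals per 15 minutes
print(poisson.pmf(0, mu))                         # (a) P(X = 0) = e^-2 ≈ 0.1353
print(1 - poisson.cdf(1, mu))                     # (b) P(X >= 2) ≈ 0.5940
print(poisson.cdf(3, mu))                         # (c) P(X <= 3) ≈ 0.8571
print(poisson.cdf(3, mu) - poisson.pmf(0, mu))    # (d) P(1 <= X <= 3) ≈ 0.7218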
Poisson Distribution

• Example: Six coins are tossed 6,400 times. Using the Poisson distribution, find the coefficient of skewness and the probability of getting six heads r times.
• Example: The mean and variance of a binomial distribution are 2 and 1.5 respectively. Find the probability of (a) 2 successes (b) at least 2 successes (c) at most 2 successes.
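A short Python sketch of these last two exercises; for the first, six heads on six fair coins has probability (½)^6 = 1/64, so μ = 6400/64 = 100 (the value of r is illustrative); the second recovers n and p from the given mean and variance:

from math import sqrt
from scipy.stats import poisson, binom

# Six coins tossed 6,400 times: "six heads" is a rare event with probability 1/64
mu = 6400 * (1 / 2) ** 6            # mu = 100
print(1 / sqrt(mu))                 # coefficient of skewness = 0.1
r = 95                              # illustrative value of r
print(poisson.pmf(r, mu))           # P(six heads occurs exactly r times)

# Binomial with mean np = 2 and variance npq = 1.5
q = 1.5 / 2                         # q = 0.75
p = 1 - q                           # p = 0.25
n = round(2 / p)                    # n = 8
print(binom.pmf(2, n, p))           # (a) P(X = 2)
print(1 - binom.cdf(1, n, p))       # (b) P(X >= 2)
print(binom.cdf(2, n, p))           # (c) P(X <= 2)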
