Probability and Statistics
Lecture 1
Rating plan for the discipline "Additional chapters of mathematics"

Type of classes        Quantity   Unit cost   Total score
Theoretical tests      4          5           20
Individual homework    4          5           20
Experiments, Outcomes, Events
An experiment is a process of measurement or observation,
in a laboratory, in a factory, on the street, in nature, or
wherever; so “experiment” is used in a rather general sense.
Our interest is in experiments that involve randomness,
chance effects, so that we cannot predict a result exactly.
A trial is a single performance of an experiment. Its result is
called an outcome or a sample point. n trials then give a
sample of size n consisting of n sample points.
The sample space S of an experiment is the set of all possible
outcomes. The subsets of S are called events and the
outcomes simple events. If, in a trial, an outcome a happens
and a ∈ A (a is an element of A), we say that A happens.
For instance, if a die turns up a 3, the event A: Odd number happens.
Unions, Intersections, Complements of Events
The union of A and B consists of all points in A or B or both.
The intersection of A and B consists of all points that are in
both A and B.
If A and B have no points in common, we write

A ∩ B = ∅,

where ∅ is the empty set (the set with no elements); A and B are then
called mutually exclusive (or disjoint) events.
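These set operations on events can be sketched directly with Python's built-in set type; the events A and B below are illustrative choices for one roll of a die.

```python
# Sample space for one roll of a die, and two events as Python sets.
S = {1, 2, 3, 4, 5, 6}
A = {1, 3, 5}          # A: odd number
B = {4, 5, 6}          # B: number greater than 3

union = A | B          # all points in A or B or both
intersection = A & B   # all points in both A and B
complement_A = S - A   # all points of S not in A

print(union)           # {1, 3, 4, 5, 6}
print(intersection)    # {5}
print(complement_A)    # {2, 4, 6}

# A and its complement have no points in common: mutually exclusive.
print(A & complement_A == set())   # True
```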
Unions, Intersections, Complements of Events
The complement Aᶜ of A consists of all the points of S not in A.
The union A₁ ∪ A₂ ∪ ⋯ ∪ Aₘ consists of all the points that are in at
least one of the Aⱼ; the intersection A₁ ∩ A₂ ∩ ⋯ ∩ Aₘ consists of the
points that are in every Aⱼ.
Working with events can be illustrated and facilitated by Venn
diagrams showing unions, intersections, and complements, as in the
typical examples in the figures.
Definition of Probability
If the sample space S of an experiment consists of finitely
many outcomes (points) that are equally likely, then the
probability of an event A is

P(A) = (Number of points in A) / (Number of points in S).
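A minimal sketch of this counting definition, assuming a fair die so that the six outcomes are equally likely:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # equally likely outcomes of a fair die
A = {1, 3, 5}            # event A: odd number

# P(A) = (number of points in A) / (number of points in S)
P_A = Fraction(len(A), len(S))
print(P_A)               # 1/2
```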
Axioms of probability
Given a sample space S, with each event A of S (subset of S)
there is associated a number called the probability of A, such
that the following axioms of probability are satisfied.
1. For every A in S,
   0 ≤ P(A) ≤ 1.
2. The entire sample space S has probability
   P(S) = 1.
3. For mutually exclusive events A and B (A ∩ B = ∅),
   P(A ∪ B) = P(A) + P(B).
Basic Theorems of Probability
THEOREM 1. Complementation Rule
For an event A and its complement Aᶜ in a sample space S,

P(Aᶜ) = 1 − P(A).
Conditional Probability
Often it is required to find the probability of an event B under
the condition that an event A occurs. This probability is called
the conditional probability of B given A and is denoted by
P(B|A). In this case A serves as a new (reduced) sample
space, and that probability is the fraction of P(A) that
corresponds to A ∩ B. Thus

P(B|A) = P(A ∩ B) / P(A)   (P(A) ≠ 0).
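A minimal sketch of this formula, again assuming a fair die; the events A and B are illustrative.

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {4, 5, 6}            # A: number greater than 3 (the reduced sample space)
B = {2, 4, 6}            # B: even number

def P(E):
    """Classical probability of an event E on the finite sample space S."""
    return Fraction(len(E), len(S))

# P(B|A) = P(A ∩ B) / P(A)
P_B_given_A = P(A & B) / P(A)
print(P_B_given_A)       # 2/3
```

Of the three outcomes in the reduced sample space A, two are even, which is exactly the 2/3 the formula produces.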
Independent Events
Independent Events. If events A and B are such that

P(A ∩ B) = P(A) P(B),

they are called independent events. Then, if P(A) ≠ 0 and P(B) ≠ 0,
P(B|A) = P(B) and P(A|B) = P(A), so the occurrence of one event does
not change the probability of the other.
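Independence can be checked by direct enumeration; a minimal sketch for two rolls of a fair die, with the two illustrative events "first roll even" and "second roll even":

```python
from fractions import Fraction
from itertools import product

# Two rolls of a die: 36 equally likely outcome pairs.
S = set(product(range(1, 7), repeat=2))
A = {(i, j) for (i, j) in S if i % 2 == 0}   # first roll even
B = {(i, j) for (i, j) in S if j % 2 == 0}   # second roll even

def P(E):
    """Classical probability of an event E on the finite sample space S."""
    return Fraction(len(E), len(S))

# Independence: P(A ∩ B) = P(A) P(B)
print(P(A & B) == P(A) * P(B))   # True
```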
Random Variables. Probability Distributions
A probability distribution or, briefly, a distribution, shows the
probabilities of events in an experiment. The quantity that we
observe in an experiment will be denoted by X and called a
random variable (or stochastic variable) because the value it
will assume in the next trial depends on chance, on
randomness.
If you roll a die, you get one of the numbers from 1 to 6, but you don’t
know which one will show up next. Thus Number a die turns up is a
random variable.
Random Variables. Probability Distributions
If we count (cars on a road, defective screws in a production,
tosses until a die shows the first Six), we have a discrete
random variable and distribution.
If we measure (electric voltage, rainfall, hardness of steel), we
have a continuous random variable and distribution. Precise
definitions follow. In both cases the distribution of X is
determined by the distribution function (or cumulative
distribution function)

F(x) = P(X ≤ x).
Random Variable
A random variable X is a function defined on the sample
space S of an experiment. Its values are real numbers. For
every number a the probability P(X = a) with which X assumes a
is defined. Similarly, for any interval I the probability P(X ∈ I)
with which X assumes any value in I is defined.
Discrete Random Variables and Distributions
A random variable X and its distribution are discrete if X assumes
only finitely many or at most countably many values x₁, x₂, x₃, …,
with positive probabilities p₁ = P(X = x₁), p₂ = P(X = x₂), ….
Discrete Random Variables and Distributions
Clearly, the discrete distribution of X is also determined by
the probability function f(x) of X, defined by

f(x) = pⱼ if x = xⱼ (j = 1, 2, …), and f(x) = 0 otherwise.

From this we get the values of the distribution function

F(x) = Σ f(xⱼ), summed over all xⱼ ≤ x,

where for any given x we sum all the probabilities pⱼ for which xⱼ
is smaller than or equal to x. This is a step function
with upward jumps of size pⱼ at the possible values of X and
constant in between.
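A minimal sketch of this step function for a fair die, where f(x) = 1/6 at x = 1, …, 6:

```python
from fractions import Fraction

# Probability function of a fair die: f(x) = 1/6 for x = 1, ..., 6, else 0.
f = {x: Fraction(1, 6) for x in range(1, 7)}

def F(x):
    """Distribution function F(x) = P(X <= x): sum f(x_j) over all x_j <= x."""
    return sum((p for xj, p in f.items() if xj <= x), Fraction(0))

print(F(0.5))   # 0    (constant below the smallest possible value)
print(F(3))     # 1/2  (jump of size 1/6 at each of 1, 2, 3)
print(F(3.7))   # 1/2  (constant between possible values)
print(F(6))     # 1
```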
Discrete Random Variables and Distributions
For the probability corresponding to intervals we have

P(a < X ≤ b) = F(b) − F(a).
Continuous Random Variables and Distributions
Discrete random variables appear in experiments in which we
count (defectives in a production, days of sunshine in
Chicago, customers standing in a line, etc.). Continuous
random variables appear in experiments in which we
measure (lengths of screws, voltage in a power line, Brinell
hardness of steel, etc.). By definition, a random variable X and
its distribution are of continuous type or, briefly, continuous,
if its distribution function can be given by an integral

F(x) = ∫ f(v) dv, integrated from −∞ to x,

whose integrand f, called the density of the distribution, is
nonnegative.
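A minimal numerical sketch of this integral, assuming the uniform distribution on [0, 2] with density f(v) = 1/2 there and 0 elsewhere; the Riemann-sum approximation of F is a made-up illustration, not a library routine.

```python
# Uniform distribution on [0, 2]: density f(v) = 1/2 there, 0 elsewhere.
def f(v):
    return 0.5 if 0 <= v <= 2 else 0.0

def F(x, n=100_000):
    """F(x) = integral of f(v) dv from -inf to x, approximated by a
    midpoint Riemann sum on [0, x], since f vanishes below 0."""
    if x <= 0:
        return 0.0
    h = x / n
    return sum(f((k + 0.5) * h) for k in range(n)) * h

print(round(F(1.0), 6))   # 0.5, i.e. P(X <= 1) = 1/2
```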
Mean and Variance of a Distribution
The mean μ characterizes the central location and the
variance σ² the spread (the variability) of the distribution.
The mean μ (mu) is defined by

μ = Σ xⱼ f(xⱼ)   (discrete distribution),
μ = ∫ x f(x) dx, integrated over (−∞, ∞)   (continuous distribution),

and the variance σ² (sigma squared) by

σ² = Σ (xⱼ − μ)² f(xⱼ)   (discrete distribution),
σ² = ∫ (x − μ)² f(x) dx, integrated over (−∞, ∞)   (continuous distribution).
Mean and Variance of a Distribution
σ (the positive square root of σ²) is called the standard
deviation of X and its distribution.
The mean μ is also denoted by E(X) and is called the
expectation of X because it gives the average value of X to be
expected in many trials. Quantities such as μ and σ² that
measure certain properties of a distribution are called
parameters. μ and σ² are the two most important ones. Note that
σ² > 0 (except for a discrete "distribution" with only one possible
value, so that σ² = 0). We assume that μ and σ² exist (are
finite), as is the case for practically all distributions that are
useful in applications.
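A minimal sketch of the discrete formulas for μ and σ², assuming a fair die:

```python
from fractions import Fraction

# Fair die: f(x) = 1/6 for x = 1, ..., 6.
f = {x: Fraction(1, 6) for x in range(1, 7)}

# Mean: mu = sum of x_j * f(x_j)
mu = sum(x * p for x, p in f.items())

# Variance: sigma^2 = sum of (x_j - mu)^2 * f(x_j)
var = sum((x - mu) ** 2 * p for x, p in f.items())

print(mu)    # 7/2
print(var)   # 35/12
```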
Theorems of Mean and Variance
THEOREM 1. Mean of a Symmetric Distribution
If a distribution is symmetric with respect to x = c, that is,
f(c − x) = f(c + x), then μ = c.
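Theorem 1 can be checked by direct computation; a minimal sketch with a made-up discrete distribution symmetric about c = 3:

```python
from fractions import Fraction

# A discrete distribution symmetric about c = 3: f(c - x) = f(c + x).
f = {1: Fraction(1, 8), 2: Fraction(1, 4), 3: Fraction(1, 4),
     4: Fraction(1, 4), 5: Fraction(1, 8)}

c = 3
# Verify the symmetry assumption of Theorem 1.
assert all(f.get(c - d, 0) == f.get(c + d, 0) for d in range(0, 3))

# The mean equals the center of symmetry.
mu = sum(x * p for x, p in f.items())
print(mu)   # 3
```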
Expectation, Moments
More generally, for a function g(X) of a random variable X the
expectation is

E(g(X)) = Σ g(xⱼ) f(xⱼ)   (discrete distribution),
E(g(X)) = ∫ g(x) f(x) dx, integrated over (−∞, ∞)   (continuous distribution).

In particular, E(X) = μ is the mean, E(Xᵏ) is called the k-th moment
of X, and E((X − μ)ᵏ) the k-th central moment; thus σ² = E((X − μ)²).
END OF LECTURE-1